An asset viewer for the LLLL game. For research purposes only.
Use at your own risk; I take no responsibility. Do not host this publicly as a service.
This project is still in development. Contributions are very welcome. Linux is supported.
If you are new to web development, please follow the steps below.
Your final directory layout should look like this:
llll/
  llll-tools/
    bin/
      ffmpeg
    vgmstream/
      build/
        cli/
          vgmstream-cli
    AssetStudio/
      AssetStudioCLI/
        bin/
          Release/
            net8.0/
              AssetStudioModCLI
    UsmToolkit/
      UsmToolkit
  llll-view/
    ...
  inspix-hailstorm/
    ...
After cloning, check backend/.env to understand required tool names and paths.
I recommend using proto to manage Node.js and pnpm versions.
# Ubuntu / Debian (packages proto needs to download and unpack tools)
sudo apt-get install git unzip gzip xz-utils
Then install proto:
bash <(curl -fsSL https://moonrepo.dev/install/proto.sh)
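Once proto is installed, you can use it to install Node.js and pnpm. A minimal sketch (version choices are up to you; check the proto documentation if these commands differ in your proto version, and note you may need to restart your shell so the proto shims are on your PATH):
# install Node.js and pnpm via proto
proto install node
proto install pnpm
# verify both are available
node --version
pnpm --version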
This project requires many dependencies. First, decide on a root project directory (for example, ~/llll) and clone this repository into it:
git clone (this repository url)
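For example, assuming ~/llll as the root directory (matching the layout above):
# create the root directory and clone this repository into it
mkdir -p ~/llll
cd ~/llll
git clone (this repository url)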
Install node dependencies using pnpm:
cd llll-view
pnpm install
The tools below are required to process LLLL assets. Place them under llll/llll-tools.
Some of these tools depend on the .NET runtime. Please be careful to match the .NET version each tool expects (the layout above assumes a net8.0 build of AssetStudioModCLI).
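If you are unsure which .NET versions you have, the dotnet CLI can list them (assuming dotnet is already on your PATH):
# list installed .NET SDKs and runtimes
dotnet --list-sdks
dotnet --list-runtimes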
All of these tools are available as open-source projects.
However, due to licensing issues, official prebuilt ffmpeg binaries do not include a high-quality AAC (m4a) encoder.
If you want high-quality m4a output, build ffmpeg yourself and place it at llll-tools/bin/ffmpeg.
If you do not mind m4a quality, use an official static build instead.
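If you do decide to build ffmpeg yourself, a minimal sketch follows. It assumes the encoder you want is libfdk_aac (a common non-free AAC encoder; this is my assumption, not something this project mandates) and that its development package is already installed; adjust the flags for your system.
# minimal sketch: build ffmpeg with the non-free libfdk_aac AAC encoder (assumed choice)
cd ~/llll/llll-tools
git clone https://git.ffmpeg.org/ffmpeg.git ffmpeg-src
cd ffmpeg-src
./configure --enable-libfdk-aac --enable-nonfree
make -j"$(nproc)"
# copy the resulting binary to the location this project expects
mkdir -p ../bin
cp ffmpeg ../bin/ffmpeg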
Some thumbnail images are converted to WebP format, so the cwebp command is required:
# Ubuntu/Debian
sudo apt-get install webp
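To confirm cwebp is available on your PATH after installing (the filenames in the second command are just placeholders):
# print the cwebp version to confirm the install
cwebp -version
# example conversion with placeholder filenames
cwebp input.png -o output.webp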
Copy frontend/.env.example to frontend/.env and adjust variables if necessary.
CORS_ORIGIN refers to the backend server's origin. It is required so that the browser is allowed to access the backend server.
Copy backend/.env.example to backend/.env and adjust variables if necessary.
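For reference, the copy steps are just the following, assuming frontend/ and backend/ sit inside the llll-view repository. The CORS_ORIGIN line is only an illustrative placeholder; use whatever origin your backend actually runs on.
# copy the example env files, then edit them for your environment
cp frontend/.env.example frontend/.env
cp backend/.env.example backend/.env
# illustrative placeholder only; set this in frontend/.env to your backend origin
# CORS_ORIGIN=http://localhost:3000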
Thanks to inspix-hailstorm, you can handle assets in the same way the official client does.
Clone the repository under the llll/ directory:
cd llll
git clone https://github.com/vertesan/inspix-hailstorm
If you are not sure whether your setup is correct, check backend/.env for the required tool names and paths, then adjust them to match your environment.
Google Chrome on a PC is required for some features, such as transcription and translation. The official requirements are listed here: https://developer.chrome.com/docs/ai/prompt-api
Set chrome://flags/#translation-api to Enabled and use an HTTPS connection to use the translation feature.
Set the following flags to Enabled:
chrome://flags/#optimization-guide-on-device-model
chrome://flags/#prompt-api-for-gemini-nano
chrome://flags/#prompt-api-for-gemini-nano-multimodal-input
I tested with an RTX 5070 Ti. Speed is pretty good, but quality is quite low. Whisper might be better, but it adds heavy dependencies on the server or client side.
Please check the help page in the app first.
If you run into issues during setup, use an AI chat to troubleshoot.
I do not hold any rights to the LLLL assets.