libbytebeat

A bytebeat playback library (no affiliation with the linked site). Can be embedded in a C application via the C API or in a JavaScript one by vendoring src/bblib.js. Features:

  • seems to work
  • custom metadata format
  • automatically fixes DC offset with a high-pass filter
  • two-phase multithreaded playback (does not work with some songs, but they can be adapted to the two-phase system). The first phase is multithreaded: it generates samples and, optionally, a numbers array of auxiliary data. The second phase is single-threaded and linear: it processes the generated samples further. The first phase is expected to be a pure function and should not depend on side effects from previous samples (except the 0th one); the second phase may do so, which makes it useful for reverbs and other filters. A sketch of this model follows the list.

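To make the split concrete, here is a small illustrative sketch of the two-phase model in JavaScript. It is not the library's actual implementation, and the phase1 expression is just a placeholder bytebeat; it only shows why the first phase can be spread across threads while the second cannot.

// Illustrative sketch only, not the actual libbytebeat implementation.
const sampleRate = 8000;

// Phase 1: a pure function of t. Sample N does not depend on sample N-1,
// so these calls can be split across worker threads in any order.
function phase1(t) {
  return (((t * (42 & (t >> 10))) & 255) / 127.5) - 1; // placeholder bytebeat, normalised to [-1, 1]
}

// Phase 2: runs once, in order, over the phase-1 output. It may keep state
// between samples (here a crude one-pole filter), which is why it must stay
// single-threaded and linear.
function phase2(samples) {
  let state = 0;
  return samples.map((s) => (state = state * 0.5 + s * 0.5));
}

const raw = Array.from({ length: sampleRate }, (_, t) => phase1(t)); // parallelisable part
const out = phase2(raw);                                             // sequential part
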
See example-song.js for an example of (and documentation on) how to write libbytebeat metadata in a bytebeat song.

As an end user, the best way to use libbytebeat is through the FFmpeg fork ffmpeg-bb with its libbytebeat-based demuxer. With this modified FFmpeg you can convert .js files containing bytebeat code and libbytebeat userscript-style metadata as if they were regular audio files. If you install it system-wide, media players based on FFmpeg (like mpv) will be able to play bytebeat music directly, without having to pre-render it first, even if the song is infinite in length.

As well as producing audio, the demuxer can also turn console output from bytebeat songs into ASS subtitle-based graphics. Beware of jank in media players! It's pretty cursed and they aren't built to process streamed subtitles like this :D (if you convert to mkv first it should be fine, although ffmpeg doesn't compress subtitles, so you get a big chonky file; there are tools that can compress them, forgot which, or you can just burn the subtitles into the video, which also makes it work where ASS isn't supported, like in web browsers)

A bare-bones command-line wrapper (bytebeat-gen) is also included (only in in-place builds, not make install), in case building FFmpeg is too difficult. bytebeat-gen does not accept any arguments or display a help text; it consumes JavaScript code from standard input and produces a bespoke audio format on standard output. The format goes as follows (a small reader sketch follows the list):

  • a "header" consisting of 3 plaintext lines ending with a LF character (newline):
    • the rawaudio format used (floatle or floatbe)
    • sample rate (e.g. 48000)
    • number of channels (e.g. 1 or 2)
  • raw audio in the format specified by the header lines

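For illustration, here is a minimal Node.js sketch that reads this format from standard input. It assumes floatle/floatbe mean 32-bit floats in the usual ffmpeg/mpv rawaudio sense, and it buffers everything, so it is only useful on finite output; it is not a replacement for parse.sh, just a demonstration of the header layout.

// Sketch only: read bytebeat-gen output from stdin and report what it contains.
// Assumes "floatle"/"floatbe" mean 32-bit floats (ffmpeg/mpv rawaudio naming).
// Buffers all of stdin, so only feed it finite output.
const chunks = [];
process.stdin.on("data", (c) => chunks.push(c));
process.stdin.on("end", () => {
  const buf = Buffer.concat(chunks);
  let pos = 0;
  const readLine = () => {            // one LF-terminated plaintext header line
    const nl = buf.indexOf(0x0a, pos);
    const line = buf.toString("utf8", pos, nl);
    pos = nl + 1;
    return line;
  };
  const format = readLine();               // "floatle" or "floatbe"
  const sampleRate = Number(readLine());   // e.g. 48000
  const channels = Number(readLine());     // e.g. 1 or 2
  const readFloat = format === "floatbe"
    ? (o) => buf.readFloatBE(o)
    : (o) => buf.readFloatLE(o);
  const frames = Math.floor((buf.length - pos) / (4 * channels));
  console.error(`${format}, ${sampleRate} Hz, ${channels} ch, ${frames} frames`);
  console.error("first frame:",
    Array.from({ length: channels }, (_, ch) => readFloat(pos + ch * 4)));
});

Saved as, say, read-header.js (a hypothetical filename), it could be run as ./bytebeat-gen < example-song.js | node read-header.js.
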
If you use a unix-like system with mpv installed, you can use shell pipes to direct the output of bytebeat-gen into the parse.sh script to play bytebeat music in mpv:

./bytebeat-gen < example-song.js | ./parse.sh

Without compiling anything, if you have Node.js and mpv installed, you can run the parse.js script, which accepts a path to a JavaScript file as an argument and nothing else:

node parse.js example-song.js

However, I do not recommend using parse.js. It gives bytebeat song code full access to your computer, and it doesn't support multi-threading either. Do not use parse.js.

Compiling

These instructions are aimed towards Unix systems, in particular GNU/Linux.

If you use Microsoft Windows, MSYS2 should provide you with a sufficiently unix-like setup, but I am not personally familiar with how things work over there and I have not tested whether the code even works on Windows. If you successfully compile this, and especially ffmpeg-bb, on Windows, do tell! I have email: root@kimapr.net

Both building and running libbytebeat require pkg-config, JavaScriptCoreGTK (usually distributed as part of WebKitGTK) and GLib (pulled in by JavaScriptCoreGTK, since it needs GLib too). The build process additionally requires a C compiler toolchain, make, and possibly other things.

First, if you haven't already, download the source code to your computer. This can be done either using git:

git clone https://git.kimapr.net/kimapr/libbytebeat &&
cd libbytebeat

Or by downloading and extracting the release tarball:

curl https://git.kimapr.net/kimapr/libbytebeat/releases/download/1.2.3/libbytebeat-1.2.3.tar.gz |
gzip -d | tar xv && cd libbytebeat-1.2.3

Your current directory should now be at the root of the source repository. If you obtained the source code via git, you need to initialize the GNU Build System with GNU Autotools like so:

autoreconf --install

If you aim to build ffmpeg-bb, skip to the next section. The next step (regardless of download method) is to run configure:

./configure

Then compile it with make:

make

bytebeat-gen will now be available in the current directory, and the libbytebeat library itself can be found in .libs.

ffmpeg-bb

The ffmpeg-bb repository is organised into branches, one per FFmpeg release it is based on; the branch for the current latest release is the default. You can download it like so (--depth 1 is optional but recommended for efficiency, since FFmpeg is large):

git clone https://git.kimapr.net/kimapr/ffmpeg-bb --depth 1 &&
cd ffmpeg-bb

Because libbytebeat is a compile- and run-time dependency of ffmpeg-bb's libbytebeat demuxer but is not packaged by any distro, a few extra steps are needed. We will designate some directory on your filesystem as the "prefix", and libbytebeat will be installed into it. The exact path doesn't matter; let's presume the path is stored in the shell variable PREFIX (either set it for real in your shell or mentally substitute it every time it's used in these instructions).

Let's go back to the libbytebeat repository, and configure it, this time a bit differently:

./configure --prefix="$PREFIX"

Then compile it and install it into the prefix:

make install

Now, back in the ffmpeg-bb directory, we need to set a variable for the build process. This will allow the FFmpeg build system to find where libbytebeat is located:

export PKG_CONFIG_PATH="$PREFIX/lib/pkgconfig${PKG_CONFIG_PATH:+:}$PKG_CONFIG_PATH"

The weird self-reference in here is so that, if PKG_CONFIG_PATH is already set, it gets extended instead of overwritten (the ${PKG_CONFIG_PATH:+:} part expands to a colon only when the variable is already non-empty, so you don't end up with a stray trailing colon). The next step is to configure the FFmpeg build system. Its configure script accepts 1 grillion options, but the most important one for us here is of course --enable-libbytebeat. The sussy rpath one lets the ffmpeg binary find the libbytebeat shared library at run time even though it's in a non-standard path, so it's also important. --prefix isn't strictly necessary; it's just there so that ffmpeg can get installed into the same prefix, comfy together with libbytebeat.

./configure --prefix="$PREFIX" --extra-ldflags=-Wl,-rpath="$PREFIX/lib" --enable-libbytebeat

You might also want to pass --enable-libfontconfig and --enable-libfreetype (the libbytebeat demuxer uses those when rendering bytebeats with console output; they are optional, but console output quality will be degraded without them), as well as 1 grillion other enable options for all the muxers and stuffs. Do beware though that more options means more dependencies to install! The likely simplest way to fulfill them all is to just learn how your own distribution compiles and packages ffmpeg, copy all their --enable-* flags, and install the dependencies they use in their build. I use this combination of options myself:

./configure --prefix="$PREFIX" --extra-ldflags=-Wl,-rpath="$PREFIX/lib" --enable-gpl --enable-shared --enable-frei0r --enable-fontconfig --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libcaca --enable-libcdio --enable-libdav1d --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libpulse --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libsvtav1 --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libxvid --enable-libx264 --enable-libx265 --enable-openal --enable-opengl --enable-libdrm --enable-vaapi --enable-runtime-cpudetect --disable-htmlpages --disable-static --disable-stripping --disable-mips32r2 --disable-mipsdsp --disable-mipsdspr2 --disable-mipsfpu --enable-libbytebeat --enable-libopenmpt

Most of these are just stolen from the GNU Guix package.

Anyway, next, compile ffmpeg:

make -j$(nproc)

Since ffmpeg is very heavy to compile, using more cores will speed things up a lot; however, with -j$(nproc) all your CPU cores will be maxed out for a pretty long time. If this is a concern, replace the $(nproc) part with a number that's smaller than the number of your cores (that's what $(nproc) resolves to, since nproc is a command that prints how many cores you have) but still big enough to offer some concurrency. Once ffmpeg compiles, if you kept --prefix in the ffmpeg configure options (remember, it was optional), you can install it into the prefix with

make install

The ffmpeg tools should now be available in the ffmpeg source directory, and also in "$PREFIX/bin" if you did make install.

C API

See src/bytebeat.h, and src/bytebeat-gen.c for an example usage.

Exotic ByteBeat APIs

p2f(function(t, sampleRate, p1Sample, p2s_numbers) { ... }): Register a function for the second phase.

p2s(type: "u8"|"u16"|...|"i8"|...|"f32"|"f64", ...numbers): Pass data to the function registered with p2f.
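
For illustration only, here is a hypothetical fragment of a song using both calls. The return-value semantics of the p2f callback and the way p2s values arrive in p2s_numbers are my assumptions based on the parameter names and the two-phase description above; example-song.js is the authoritative reference.

// Hypothetical usage sketch; see example-song.js for the authoritative form.
// Assumed: p2s hands auxiliary numbers from the first phase to the second.
p2s("f32", 0.25);

// Assumed: the registered function runs in the second (single-threaded) phase,
// receives the first-phase sample in p1Sample, and returns the processed sample.
// Because phase 2 is sequential it may keep state, e.g. for a crude echo tail.
let tail = 0;
p2f(function (t, sampleRate, p1Sample, p2s_numbers) {
  const wet = p2s_numbers ? p2s_numbers[0] : 0.25; // assumed delivery of p2s data
  tail = tail * 0.6 + p1Sample * 0.4;
  return p1Sample * (1 - wet) + tail * wet;
});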