You created them, you gave them life, you know exactly what they look like. But have you ever wondered what your code sounds like?
Now it's time to bring them out and let them sing for you!
By a spell I learned from Fizban...
Each line of code represents a period of time, so the whole file can be treated as a music sheet if we transform line content into notes and read the file from top to bottom.
The pipeline is:
[file] --> (composer)
--> [raw values] --> (instrument)
--> [notes] --> (sound engine)
--> [sound]
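Sketched in TypeScript, that pipeline might look roughly like this; every type and function name below (`Composer`, `Instrument`, `playFile`, ...) is just an illustration, not the project's actual API:

```ts
// All type and function names here are illustrative, not the project's API.
// One raw value is extracted per line, read from top to bottom.
type RawValue = { line: number; value: number };
type Note = { start: number; length: number; pitch: number }; // in grid units / semitones

type Composer = (fileContent: string) => RawValue[]; // [file] --> [raw values]
type Instrument = (values: RawValue[]) => Note[];    // [raw values] --> [notes]
type SoundEngine = (notes: Note[]) => void;          // [notes] --> [sound]

function playFile(
  fileContent: string,
  compose: Composer,
  instrument: Instrument,
  engine: SoundEngine
): void {
  engine(instrument(compose(fileContent)));
}
```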
Currently we have two composers in town:
simpleComposer
: he turns code into notes with this algorithm: count how many times a bracket character (`{}()[]`) appears in the line. If that count is n, the note length is `n % gridDivision` units and the start offset is `gridDivision - (n % gridDivision)` units (see the sketch after this list).

bassComposer
: he turns code into notes with this algorithm:
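Here is a minimal sketch of the simpleComposer rule above. Only `gridDivision` and the two formulas come from the description; the function name, signature, and example input are assumptions:

```ts
// Sketch of the simpleComposer rule: count bracket characters in a line,
// then derive note length and start offset from that count.
const BRACKETS = new Set(["{", "}", "(", ")", "[", "]"]);

function simpleCompose(line: string, gridDivision: number) {
  // n = how many times any of {}()[] appears in the line
  const n = [...line].filter((ch) => BRACKETS.has(ch)).length;
  return {
    length: n % gridDivision,                       // note length, in grid units
    startOffset: gridDivision - (n % gridDivision), // start offset, in grid units
  };
}

// Example: this line contains 6 brackets, so with gridDivision = 8
// it becomes a note of length 6 starting 2 units into its slot.
simpleCompose("if (x) { y(); }", 8); // => { length: 6, startOffset: 2 }
```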
So with the current config, we have a track composed by the `bass-composer` with a `triangle` waveshape oscillator playing on the left side, and a track composed by the `simple-composer` with a `square` waveshape oscillator playing on the right side.
Finally, a simple reverb is added on the mix bus.
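As a rough illustration of that routing, here is how it could be wired with the Web Audio API; the node layout and names are assumptions, not necessarily how the project builds its graph:

```ts
// Sketch of the output routing: two panned oscillator tracks into a mix bus,
// with a simple convolver reverb on the bus. The impulse response is omitted.
const ctx = new AudioContext();

// Mix bus with a simple reverb feeding the speakers.
const mixBus = ctx.createGain();
const reverb = ctx.createConvolver();
// reverb.buffer = impulseResponse;              // load an IR AudioBuffer in a real setup
mixBus.connect(reverb).connect(ctx.destination); // wet path
mixBus.connect(ctx.destination);                 // dry path

// Left track: triangle oscillator, driven by the bass-composer's notes.
const bassOsc = ctx.createOscillator();
bassOsc.type = "triangle";
bassOsc.connect(new StereoPannerNode(ctx, { pan: -1 })).connect(mixBus);

// Right track: square oscillator, driven by the simple-composer's notes.
const simpleOsc = ctx.createOscillator();
simpleOsc.type = "square";
simpleOsc.connect(new StereoPannerNode(ctx, { pan: 1 })).connect(mixBus);

bassOsc.start();
simpleOsc.start();
```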