B and B week 6/8 - 6/15

This week I focused on preprocessing the song's rhythm and melody data before the song even starts. The key change that makes this possible is Unity's AudioClip.GetData function, which gives me access to every sample of the song at once. It populates one array with samples all the way from the beginning of the song to the end, as opposed to my previous approach, where the same smallish array was repopulated over and over with only the current frame's data.
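The difference between the two approaches can be sketched in Python/NumPy for illustration (the actual project is in C#; the sample rate and chunk size here are assumptions, not values from the project). With the whole song in one array, we can slice it into fixed-size analysis windows ahead of time rather than reading one frame's worth per game update:

```python
import numpy as np

SAMPLE_RATE = 44100   # assumed sample rate
CHUNK_SIZE = 1024     # assumed samples per analysis window

# Stand-in for AudioClip.GetData: the full song as one array of samples.
samples = np.random.uniform(-1.0, 1.0, SAMPLE_RATE * 3)  # 3 s of fake audio

# Walk the entire song in fixed-size chunks before the game begins,
# instead of polling the current frame's audio every update.
chunks = [samples[i:i + CHUNK_SIZE]
          for i in range(0, len(samples) - CHUNK_SIZE + 1, CHUNK_SIZE)]
print(len(chunks))  # number of analysis windows available up front
```

Each chunk can then be run through the FFT independently, so all frequency data exists before the first platform is ever placed.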

However, GetData returns only the raw samples of the song. Previously I was using GetSpectrumData, which applies a Fast Fourier Transform (FFT) for you, converting amplitude over time into the frequency data I use for platform generation. Since my project is about generating playable levels from audio sources, I need to compute that frequency data myself. To do so I will be using a C# audio library called DSPLib, whose FFT output is very similar to Unity's.

Once again using a Hanning FFT window to minimize leakage from one frequency bucket to another, I pass the samples into the new FFT algorithm to acquire the frequency data needed to generate platforms. However, because of the way this FFT works, I needed to convert its output of complex numbers into an array of doubles. The result is very similar to GetSpectrumData's output, so I can feed it into the spectrum analyzer algorithm from the previous two weeks.
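The window-then-FFT-then-magnitude step looks roughly like this, again sketched in Python/NumPy rather than C#/DSPLib (the 440 Hz test tone and chunk size are made up for the example; taking the magnitude of each complex value is the standard way to get an array of doubles out of an FFT):

```python
import numpy as np

CHUNK_SIZE = 1024

# One chunk of raw samples: a 440 Hz sine at 44.1 kHz, for illustration.
t = np.arange(CHUNK_SIZE) / 44100.0
chunk = np.sin(2 * np.pi * 440.0 * t)

# Apply a Hanning window to taper the chunk's edges, reducing spectral
# leakage between neighboring frequency buckets.
windowed = chunk * np.hanning(CHUNK_SIZE)

# The FFT of real input yields complex numbers; taking their magnitudes
# converts them into a plain array of doubles, one per frequency bucket.
spectrum = np.abs(np.fft.rfft(windowed))

# The loudest bucket should sit near 440 Hz.
peak_bin = int(np.argmax(spectrum))
peak_hz = peak_bin * 44100.0 / CHUNK_SIZE
```

With this shape of output, the rest of the pipeline never needs to know whether the spectrum came from GetSpectrumData or from an offline FFT.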

Now that all of the data required to create platforms is available before the game even begins, I have created a very basic prototype of how the game could look once the platforms are fully synced with the music. This is the output of the algorithm, where each platform's y value is based on the frequency content of the period of time it was calculated from.
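A frequency-to-height mapping along these lines could be sketched as follows. To be clear, the function name and scaling constants here are hypothetical, not taken from the project; this just shows one simple way a spectrum could decide a platform's y value:

```python
import numpy as np

def platform_y(spectrum, max_height=10.0):
    """Hypothetical mapping: the dominant frequency bucket of a chunk's
    spectrum decides how high that chunk's platform sits."""
    peak_bin = int(np.argmax(spectrum))
    # Normalize the bucket index into [0, 1], then scale to world units.
    return max_height * peak_bin / (len(spectrum) - 1)

# Energy concentrated in a low bucket yields a low platform;
# energy in a high bucket yields a high one.
low = np.zeros(513);  low[10] = 1.0
high = np.zeros(513); high[400] = 1.0
print(platform_y(low), platform_y(high))
```

Any monotonic mapping would do; the point is that every platform's height is fixed before playback starts, so the level can be laid out end to end in advance.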

Now that the frequencies are preprocessed, next week I will begin honing my algorithm to create fewer platforms, based on the more important parts of the song. The goal is to really feel the impact of each note, rather than a more sensitive, more frequent output.
