For this week’s prototype, I changed my original plan because of hardware limitations, and instead experimented with new MIDI keytracking applications and rediscovered Magenta, Google’s free machine-learning melody-generation software. Originally, I planned a more in-depth demonstration with a MIDI percussion instrument, but my MacBook was not powerful enough for a real-time, latency-free demonstration, so I changed course.
Instead, I built a music visualizer system in which each note in one octave of a synthesizer triggers a unique video effect. I also ran the system with a low-frequency oscillator (LFO) applied to the synth. I used Magenta to generate new melodies and to extend my own, quickly creating new musical ideas, and because every note in these melodies is linked to a different visual, the result is essentially AI-generated visuals.
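The note-to-effect mapping itself is simple enough to sketch. Below is a minimal Python sketch of the keytracking idea, assuming the `mido` library for MIDI input; `trigger_effect` and the effect names are hypothetical stand-ins for whatever hooks the visual engine actually exposes, not my real setup:

```python
# Minimal sketch: map each note of one octave to a unique video effect.
# Assumes the mido library (pip install mido python-rtmidi).
import mido

# One placeholder effect name per pitch class (C through B), twelve total.
EFFECTS = [
    "kaleidoscope", "strobe", "ripple", "zoom", "blur", "invert",
    "pixelate", "feedback", "tunnel", "wave", "glitch", "bloom",
]

def trigger_effect(name: str, velocity: int) -> None:
    """Hypothetical hook into the visual engine; here it just prints."""
    print(f"effect={name} intensity={velocity / 127:.2f}")

def run() -> None:
    with mido.open_input() as port:  # default MIDI input port
        for msg in port:
            if msg.type == "note_on" and msg.velocity > 0:
                # Fold any incoming note into a single octave with modulo 12,
                # so the mapping holds no matter where on the keyboard you play.
                trigger_effect(EFFECTS[msg.note % 12], msg.velocity)

if __name__ == "__main__":
    run()
```

Because Magenta’s output is just more MIDI notes, the same loop handles both live playing and AI-generated melodies without any extra work.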
I hoped to learn what others thought of the new visual system, with each note linked to a unique effect, and what they thought of the melodies Magenta generated. To gather this feedback, I let people play notes on the synth so they could get a feel for how the system worked, and I played both premade and Magenta-generated melodies to show how Magenta works and how each kind of melody controls the visuals. I then asked what they thought of how the visuals looked, how well the system worked, and what they made of the AI-generated melodies.
The feedback covered a wide range of ideas. Several people commented on how much was going on in each visual effect. One cool idea was to give each note its own color. Another was to have the visuals correspond to the panning of the audio, for example, a visual on the left side of the screen linked to audio panned left (see the sketch after this paragraph). We also discussed user-friendliness, to help others experiment with the audio-visual system on the MIDI controller I present. Finally, for live drumming, it was suggested that drums I do not hit as often could trigger the visuals, giving me more precise control of the visualizer while playing, with the option to switch the triggers to drums I hit more often when I want more movement in the visuals.
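To make the panning suggestion concrete, here is a small sketch, assuming pan arrives as MIDI CC 10 (the standard pan controller, 0 to 127 with 64 as center) and assuming a display width; the function name and screen width are illustrative, not part of my current system:

```python
# Sketch of the pan-to-position idea: map MIDI CC 10 (pan) linearly to a
# horizontal screen coordinate, so hard-left audio puts the visual at the
# left edge and hard-right audio at the right edge.
SCREEN_WIDTH = 1920  # assumed display width in pixels

def pan_to_x(pan_cc: int, width: int = SCREEN_WIDTH) -> int:
    """Map a pan value (0 = hard left, 127 = hard right) to an x pixel."""
    return round(pan_cc / 127 * (width - 1))

assert pan_to_x(0) == 0                   # hard left -> left edge
assert pan_to_x(127) == SCREEN_WIDTH - 1  # hard right -> right edge
print(pan_to_x(64))                       # center pan -> roughly mid-screen
```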
Overall, I really liked the suggestions and feedback I received, and I will do my best to apply as much of it as possible.