Calming constellation visuals with music using p5.js.
Hsiao Jui Lin
This project is an animated visual representation of constellations. Its goal is to show, at a glance, what the constellation 'Leo' looks like while creating a relaxing, calming atmosphere for users. The project is not yet complete; the finished version would ideally contain all 12 zodiac constellations, a separate song for each constellation, and more animation for the stars. The animation's deeper meaning is to convey the philosophical idea that the entire universe is an unending cycle, which is why the animation itself loops. On the technical side, the project draws on key concepts from Nature of Code, including objects, forces, oscillation, and autonomous agents.
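The oscillation idea can be sketched with a simple star object whose brightness follows a sine wave. This is a minimal illustration, not the project's actual code; the positions, frequency constant, and update scheme are assumptions, and in a real p5.js sketch `update()` would be called each frame from `draw()`.

```javascript
// A star whose brightness oscillates with a sine wave (Nature of Code style).
// The phase offset keeps multiple stars from twinkling in sync.
class Star {
  constructor(x, y, phase) {
    this.x = x;
    this.y = y;
    this.phase = phase;
    this.brightness = 0;
  }
  // Oscillate brightness between 0 and 255.
  update(frame) {
    this.brightness = 127.5 * (1 + Math.sin(0.05 * frame + this.phase));
  }
}

const star = new Star(100, 50, 0);
star.update(0);
console.log(star.brightness); // 127.5 at frame 0, phase 0
```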
An algorithmically generated music visualization in classical Chinese style
In my project, I used springs to simulate instrument strings, water ripples, water sleeves, curtains, and more. I also combined flow fields, Perlin noise, and several other concepts we learned in class. I think the visual aesthetic that springs create fits the classical Chinese style very well: solidness within softness and softness within solidness.
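The spring behavior described here can be sketched as a damped Hooke's-law update. The constants `k` and `damping` and the rest position are illustrative assumptions, not values from the project; in a p5.js sketch this step would run once per frame for each point on a string, ripple, or sleeve.

```javascript
// One step of a damped spring: a point is pulled back toward its rest
// position, and the damping bleeds off energy so the motion settles.
function springStep(pos, vel, rest, k = 0.1, damping = 0.95) {
  const force = -k * (pos - rest); // Hooke's law: pull toward rest
  vel = (vel + force) * damping;   // integrate velocity, then damp it
  return [pos + vel, vel];
}

// Pull a point away from rest and let it oscillate back and settle.
let [pos, vel] = [100, 0];
for (let i = 0; i < 500; i++) {
  [pos, vel] = springStep(pos, vel, 0);
}
console.log(Math.abs(pos) < 1); // true: the spring has settled near rest
```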
Expression is a p5 sketch that lets you control musical timbre with body position, along with a built-in music-synced visual for a song I produced for the Software Music Production course at Steinhardt. Tone.js enables music-synced animations and lyric subtitles, as well as the ability to assign different stem tracks to their own effect plugins before routing them all into the master track. PoseNet, from the ml5 machine-learning library, provides access to body positions, which are then used to manipulate the cutoff values of three lowpass filters, each corresponding to a specific track: vocals, drums, or lead. This project originated from a previous project in which I used the ml5 model to create a rhythm game, but I added Tone.js and took it in a more musical direction. Musicians can use it for production or for live performance; there are countless possibilities for interacting naturally with this program and expressing yourself in an electronic setting.
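The position-to-cutoff idea can be sketched as a simple clamped linear mapping. The frequency range and orientation here are assumptions, not the project's actual values; in the real sketch the `y` input would come from an ml5 PoseNet keypoint and the result would be written to a Tone.js lowpass filter (e.g. `filter.frequency.value = cutoff`).

```javascript
// Map a vertical body position on the canvas to a lowpass cutoff in Hz.
// Higher on screen (smaller y) -> brighter sound (higher cutoff).
function positionToCutoff(y, canvasHeight, minHz = 200, maxHz = 8000) {
  const t = Math.min(Math.max(y / canvasHeight, 0), 1); // clamp to [0, 1]
  return maxHz - t * (maxHz - minHz);
}

console.log(positionToCutoff(0, 480));   // 8000: top of canvas, filter open
console.log(positionToCutoff(480, 480)); // 200: bottom, heavily filtered
```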
The project shows a leaping dream process: visitors design their own dreams by clicking the “dreamer’s words” to enter a special dream of their own. The project was inspired by The Interpretation of Dreams, in which Sigmund Freud made the groundbreaking argument that the elements appearing in a dream influence the dreaming process.
Because different elements can carry special meanings, as in a dream dictionary, different words create distinct atmospheres for visitors. By arranging the order of the words, visitors enter a “dream” of their own design and are encouraged to use the concept of montage to help them leap from one word to another. Every word is connected to a video or audio clip, and the sequence is reorganized as visitors click on the words.
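The word-sequencing mechanic can be sketched as a click-ordered playlist. The word-to-media mapping below is a made-up placeholder; the actual project attaches its own video and audio clips to each word.

```javascript
// Each clickable word maps to a media clip; clicking builds the "dream"
// in the order the visitor chooses.
const media = { moon: "moon.mp4", sea: "sea.mp3", door: "door.mp4" };
const dream = [];

function clickWord(word) {
  if (word in media) dream.push(media[word]); // click order = dream order
}

clickWord("sea");
clickWord("moon");
console.log(dream); // ["sea.mp3", "moon.mp4"]
```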
To make the project more immersive, the VR technique was required in the future for further development.
This is a sound visualizer made for the final project. The main idea is a hollow ring rotating in the center: everything can be absorbed into a small core, but can also expand into many particles; everything is in an eternal cycle, from 0 to 1 and from 1 back to 0. The main technique is a flow field, which lays flows across the canvas to control the direction of all the particles. Synchronization between the music and the animation is achieved with p5.sound.js, a library for analyzing a music input and returning various parameters. I made a simple beat detection based on volume: every time a beat is detected, the particles expand or are absorbed depending on the current mode, and the beat's volume affects the speed of the expansion or absorption. The blocks in the background are there for better visuals; their height is controlled by the volume of different frequencies, and their color by the energy of different frequency ranges, from bass to treble.
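A volume-based beat detection like the one described can be sketched as a threshold with a cooldown. The threshold and cooldown values are illustrative assumptions; in the real sketch the level would come from p5.sound (e.g. `amplitude.getLevel()`) once per frame.

```javascript
// Returns a per-frame detector: fires when the level crosses the
// threshold, then ignores the decay tail of the same beat for a while.
function makeBeatDetector(threshold = 0.3, cooldownFrames = 10) {
  let cooldown = 0;
  return function (level) {
    if (cooldown > 0) cooldown--;
    if (level > threshold && cooldown === 0) {
      cooldown = cooldownFrames; // don't re-trigger on the same beat
      return true;
    }
    return false;
  };
}

const onBeat = makeBeatDetector();
const levels = [0.1, 0.5, 0.45, 0.1]; // a spike followed by its decay
console.log(levels.map(onBeat)); // [false, true, false, false]
```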
Project 華韵 aims to pair an animation of growing vines and flowers (red spider lilies) with input music. The idea is to use music to drive an animation of nature; I love the artistic expression of flowers with vines, so I decided to build the animation around this concept. The project mainly involves OOP to build vines with strings and flowers on them, and synchronization between the music and the animation is achieved by using p5.sound.js to analyze the music and feed the resulting parameters into the animation. I developed a simple beat detection based on volume: each time a beat is detected, a flower is generated, and its size is based on the volume of that beat. Five vines grow at the same time because each one represents a frequency range, and the energy of that range controls the opacity and thickness of the corresponding vine.
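The band-to-vine mapping can be sketched as below. The band names follow p5.sound's `fft.getEnergy()` presets, and energies are on that function's 0–255 scale, but the scaling constants are assumptions rather than the project's actual values.

```javascript
// One vine per frequency band, matching p5.sound's getEnergy() presets.
const bands = ["bass", "lowMid", "mid", "highMid", "treble"];

// energy: 0-255, as returned by p5.sound's FFT for a band.
function vineStyle(energy) {
  return {
    opacity: energy,                   // louder band -> more visible vine
    thickness: 1 + (energy / 255) * 4, // 1 to 5 pixels
  };
}

// In draw(), vine i would use vineStyle(fft.getEnergy(bands[i])).
console.log(vineStyle(255)); // { opacity: 255, thickness: 5 }
```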
To challenge the idea that the screen limits the powerful effect of a concert, I use multiple screens and mirrors to create a space that the audience can walk into to experience the powerful sound and visuals of concerts against the backdrop of the pandemic.
This exhibition is called “concerts in pandemic”.
It aims to create a space where people can experience the heart-shaking audio and visual feelings that concerts provide. Based on interview responses from my friends, one of the biggest reasons they go to concerts is to “break the screen” and feel the music in person. I want to challenge this idea that the screen is a barrier. I use several screens and mirrors to create a space people can walk into and feel the strong visuals, audio, and emotion delivered through screens. With screens or without, concerts and music itself have great power, especially in this pandemic period, and I place my work in that historical context. During the pandemic, we cannot experience concerts in person, yet with so many online concerts going on, we still feel connected by music. Music has great power.
Visualize chords on the complex plane using Fourier transform at root frequency.
(Click “Learn More” to play!) There are many interesting visualization methods for pitches and chords. For example, you can put sand on a vibrating plate and patterns will form; you can shine a laser on a vibrating balloon and the reflection will draw cool shapes on a wall. These methods, however, are not mathematically unique. If you do the experiment with different equipment you will get different results. My approach, however, is free of arbitrary parameters. I want to show the one true shape of chords.
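The underlying idea can be sketched as follows: demodulate the chord signal at its root frequency f0, so each sample becomes a point z(t) = s(t) · e^(−2πi·f0·t) on the complex plane. The 4:5:6 major-chord ratios and the sample count below are illustrative choices, not necessarily the project's; because every chord tone is a whole-number multiple of f0, the resulting curve closes on itself after one root period, giving a shape with no arbitrary parameters.

```javascript
// Trace the complex-plane curve of a chord built from integer multiples
// of the root frequency f0, sampled over one root-frequency period.
function chordCurve(ratios, f0 = 1, samples = 400) {
  const pts = [];
  for (let k = 0; k < samples; k++) {
    const t = k / samples / f0; // t sweeps one period [0, 1/f0)
    // The chord signal: a sum of sinusoids at integer multiples of f0.
    const s = ratios.reduce(
      (sum, n) => sum + Math.sin(2 * Math.PI * n * f0 * t), 0);
    // Multiply by e^{-2*pi*i*f0*t}: rotate the sample onto the plane.
    const angle = -2 * Math.PI * f0 * t;
    pts.push([s * Math.cos(angle), s * Math.sin(angle)]); // [re, im]
  }
  return pts;
}

// A major chord in just intonation has frequency ratios 4:5:6.
const pts = chordCurve([4, 5, 6]);
```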
Click mouse or press keyboard to interact with the game.
“Music Interaction Game” is a music game combining keyboard interaction with background music. By pressing any key on the keyboard, users can hit notes on the screen and earn points. To start the game, users click the “START” button in the upper-left corner, which triggers the background music. Once the music starts playing, notes float along the staves at the bottom. Unlike other music games, this game is purely for entertainment and relaxation, which means it is much easier for users to attain a high score. I hope this game can relieve your stress and provide you with a pleasant experience.
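The note-hit scoring in a game like this can be sketched as timing windows around each note. The window sizes and point values below are assumptions for illustration, not the game's actual tuning.

```javascript
// Score a keypress by how close it lands to a note's scheduled time.
function scoreHit(noteTimeMs, pressTimeMs) {
  const diff = Math.abs(noteTimeMs - pressTimeMs);
  if (diff <= 50) return 100; // perfect: within 50 ms
  if (diff <= 150) return 50; // good: within 150 ms
  return 0;                   // miss
}

console.log(scoreHit(1000, 1030)); // 100
console.log(scoreHit(1000, 1120)); // 50
console.log(scoreHit(1000, 1400)); // 0
```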