Our inspiration for this project was the Nike Air Mag, a shoe popularized by its appearance in the movie Back to the Future and brought to life by Nike in 2015, with Michael J. Fox being the first to get a pair.
These shoes gained popularity for their unique, futuristic, out-of-this-world look and their unbelievable price tag.
What I love about the shoes is how the lights flow with the shoe, giving it a hollow, glowing feel; they are integrated into the design rather than tacked on. This is something that Christshon and I strive for. We hope to make our project interactive by giving the user the ability to choose the colors displayed on their feet, hopefully through a button or potentiometer.
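If we go the potentiometer route, one common approach is to map the 10-bit analog reading onto a simple hue wheel and convert that to RGB for the LEDs. Here is a minimal sketch of that mapping logic, in Python for clarity; the helper name and ranges are our assumptions, and on an Arduino this would become `map()` plus the same segment math:

```python
def pot_to_rgb(reading, max_reading=1023):
    """Map a 10-bit potentiometer reading (0-1023) to an RGB color on a hue wheel."""
    if not 0 <= reading <= max_reading:
        raise ValueError("reading out of range")
    # Scale the reading onto a 0-767 hue ramp (three segments of 256 steps).
    hue = reading * 767 // max_reading
    seg, offset = divmod(hue, 256)
    if seg == 0:        # red -> green
        return (255 - offset, offset, 0)
    elif seg == 1:      # green -> blue
        return (0, 255 - offset, offset)
    else:               # blue -> red
        return (offset, 0, 255 - offset)
```

Turning the knob would then sweep the shoe's LEDs smoothly through the whole color wheel, with both ends of the knob meeting at red.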
There are many other artists who have created projects similar to ours. Since our idea is an interactive 3D solar system, there are many existing examples that can be drawn from. However, this also means that to make this project stand out from the crowd of other interactive solar systems, we had to research them and see what we could do that they have not already done.
The difference between our project and these others is that we present a profile view of the universe, without the planets constantly rotating around the sun, so the planets are always easily in view, from a perspective different from what is normally given. The interface we create will also be different: we plan to design the website so that it zooms in on a planet when clicked and is easy to use. We are also including physical computing, which sets us apart from all of these purely online models.
My primary inspiration for this project was the Nike Air Mag, a shoe popularized by its appearance in Back to the Future and later brought to life by Nike in 2015, with Michael J. Fox being the first to get a pair. The shoes gained popularity for their unique space-age look and unbelievable price tag. What I enjoy about them is that they succeeded in creating a “light up” shoe where the lights don’t seem like a frivolous addition; they are integrated into the design in a way that can be taken seriously. This is one of the goals of my project. However, unlike the Air Mags, I plan on making my project interactive by giving the user the ability to choose the colors displayed on their feet.
My next inspiration also came from Nike and, much like the Air Mags, has its origins in an idea popularized by Back to the Future. In the movie, the shoes worn by Marty McFly not only lit up but also tied themselves. That is the idea behind the Nike HyperAdapt 1.0, and I plan on following a similar design as well.
There are two major categories to examine when considering prior art: physical interface inspiration and course search inspiration.
For the course search itself, I’m taking organizational cues from Rensselaer Polytechnic Institute’s YACS student-designed course search as well as the existing NYU Class Search.
For physical filter interfaces and buttons, there is a more diverse field of prior art available. One of the most direct inspirations for the concept of a physical interface for digital search was the ITP project “Search Divides Us” by Asha Veeraswamy, Anthony Bui, and Keerthana Pareddy. It has a higher-quality physical construction than I can hope to achieve with my current skill set, but the core concept is the same. I noticed it at Maker Faire while volunteering a month ago.
A stereotypical old man who’s super bitter about everything and is very mad at humans picking up his friends because it reminds him of his impending doom.
An annoying teenager who wants to do nothing but talk about her boyfriends and doesn’t care if you take them away; she is always too preoccupied with her current boyfriend. Once she leaves them, she starts wishing for her old boyfriends back, but it’s too late because they’ve already been eaten, and now she has to deal with the harsh reality of life and loneliness.
A fun, optimistic, and very passive person. Even when you take his family away, he doesn’t seem to mind that much. The more you take, the sadder he becomes, but he won’t show it because he’s very passive and prefers to see you happy rather than be sad or angry himself.
“Tellart’s Terraform table enables users to ‘play God’. Located at London’s V&A Museum, projection mapping turns the giant sandpit into a rugged landscape, with mountains, valleys and lakes. Here’s the cool bit: thanks to a machine learning algorithm, the Table is able to read the height of the sand and respond to any changes. In short, this means you can dig a hole to form a lake, raise a hill to create a snowy peak, or smooth a river over to expand a forest.”
This is related to our project because it is an example of the user input and interactivity that we want to incorporate into our project as well. Although this piece uses projection mapping onto sand rather than the kind of video mapping we plan to use, the interactivity is the same.
“Using two walls, a treadmill, and some nifty projection, director Filip Sterckx creates a virtual world for the musician Willow’s music video. As with most projection mapping projects, it’s the technique that charms here.
Singer Pieter-Jan Van Den Troost gropes at doors that aren’t really there, trots on the spot down imaginary stairs, and kneels pretending to be paddling in the sea. It’s all surprisingly lo-tech, and all the better for it.”
This is another example of immersive interactivity. While we might not go to that extreme, the interactive portion of our project that triggers the video-mapped animation is a vital part of making this project more interesting. And this one uses video mapping too! It’s really cool to have examples like this one to motivate us to do great work!
My final project with Sama and James is quite similar to Dalziel & Pow’s interaction piece.
The similarity between our project and Dalziel & Pow’s interaction piece is that we both incorporate video mapping into our storytelling. What’s more, we both need sensors to trigger different animations. Dalziel & Pow used force-sensitive resistors in their piece; in our project, we decided to use light sensors as our main trigger.
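Since light sensors will be our main trigger, the trigger logic matters: a raw threshold would fire over and over while a hand hovers above the sensor. Here is a minimal one-shot trigger with hysteresis, sketched in Python; the class name and the 300/400 thresholds are placeholder assumptions, not values from our build:

```python
class LightTrigger:
    """Fire an animation trigger once when a light reading drops below a
    threshold (e.g. a hand shadowing the sensor). Hysteresis (two separate
    thresholds) prevents re-firing until the shadow is clearly gone."""

    def __init__(self, on_below=300, off_above=400):
        self.on_below = on_below    # darker than this arms the trigger
        self.off_above = off_above  # brighter than this re-arms it
        self.active = False

    def update(self, reading):
        """Return True exactly once per shadow event."""
        if not self.active and reading < self.on_below:
            self.active = True
            return True
        if self.active and reading > self.off_above:
            self.active = False
        return False
```

In the sketch's main loop, `update()` would be called once per frame with the latest sensor reading, and a `True` result would start the video-mapped animation.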
The interactive artist Scott Snibbe made an interactive artwork in 2005 called “Blow Up.” There is a small array of 12 fans on a desk, and a huge version. Once the viewer blows on the small version, the huge version also begins spinning. What I am doing is similar, but I will use a pressure sensor as the input, and the output is a fan on the screen (or projected onto a wall). I think my idea breaks the border between the 2D and 3D worlds; it’s as if what you do actually affects another dimension.
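One detail in mapping breath to an on-screen fan is that raw pressure readings jitter, while a real fan spins up and coasts down gradually. Here is a minimal Python sketch of that mapping with exponential smoothing; the class name and all constants are hypothetical, assuming a 10-bit sensor reading:

```python
class FanSpeed:
    """Turn a noisy pressure/breath reading into a stable fan RPM for the projection."""

    def __init__(self, max_reading=1023, max_rpm=600, alpha=0.2):
        self.max_reading = max_reading
        self.max_rpm = max_rpm
        self.alpha = alpha   # smoothing factor: lower = slower spin-up/coast-down
        self.rpm = 0.0

    def update(self, reading):
        """Nudge the current RPM toward the target implied by the reading."""
        clamped = min(max(reading, 0), self.max_reading)
        target = clamped / self.max_reading * self.max_rpm
        self.rpm += self.alpha * (target - self.rpm)
        return self.rpm
```

Calling `update()` every frame makes the projected fan lag naturally behind the breath: blowing hard ramps it toward `max_rpm`, and it coasts back down when the viewer stops.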
However, I noticed that most artists tend not to depict anxiety with literal, human-related imagery. That makes me a bit nervous, because I wonder if there is a reason so many people choose to avoid representing anxiety in a more literal sense.
Unfortunately, I am running a bit behind on my project, but I found DC fans that I think I can use to inflate the lungs. Also, as I noted above, there are a surprising number of tutorials and blog posts on creating lungs with Arduinos. I have also ordered red latex balloons to build the lungs. These should arrive Monday night, and I will build a mini prototype of the lungs to test the DC fans with when they arrive.
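One simple way to drive the inflate/deflate rhythm of the balloon lungs is a timed breathing cycle: the fan is on during the inhale and off during the exhale. A minimal sketch of that timing, assuming the fan is toggled from elapsed time (the 2 s / 3 s durations are placeholder guesses, not tested values):

```python
def fan_on(t, inhale=2.0, exhale=3.0):
    """Given elapsed time in seconds, return whether the inflation fan
    should be on. One breath = `inhale` seconds of fan-on (balloon fills)
    followed by `exhale` seconds of fan-off (balloon deflates)."""
    phase = t % (inhale + exhale)
    return phase < inhale
```

On the Arduino side the same logic would read `millis()` each loop and switch the fan's transistor pin high or low accordingly, which avoids any blocking `delay()` calls.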
Most of these projects and collectives deal mainly with visual art, and I thought these visuals would serve as great inspiration for the animation aspect of my piece. They make me think about how specific I can be with the details of my experience in my community and what style I’m going for.
These are some spoken word pieces I found that talk about gentrification. I found the most inspiration in the first one. I want to incorporate that performance aspect into the voice as much as I can so the effect on the listener is greater.
Florence is an interactive story video game developed by the Australian studio Mountains and published by Annapurna Interactive. The game was released on February 14, 2018 (Valentine’s Day) for iOS and on March 14, 2018 for Android. In Florence, the player follows the story of 25-year-old Florence Yeoh as she lives her daily routine and meets Krish, a cellist she sees in the park. The game features little written dialogue and tells Florence’s story through a series of brief chapters, using basic puzzles to reinforce pieces of Florence’s life as players progress through her story.
Lead designer Ken Wong wanted to create games that eschewed violence and was inspired by his work on Monument Valley to build a more narrative-focused experience that included puzzles. The game received favorable reviews praising its art style, music, and narrative structure.
The video mapping book:
I could not find many examples; I found one on Vimeo.
Finding inspiration for my project was actually quite easy. Initially, I was going to take an easy approach and just make my game with buttons pressed at the right times, but the proposal that my controller be based on an actual existing instrument got me thinking through all of my experience with musical instruments. For a game that teaches rhythm, I decided it should be an instrument that dictates rhythm in music (the “rhythm section”). Instruments in this category include guitars, bass guitars, percussion, drum set, and voice. The closest to actual button presses, and the most straightforward for teaching, is definitely percussion, so after looking through all the instruments, I figured that both hands could be used on bongos. This could be a good way to teach rhythm!
Back on the main topic at hand, inspiration for this project was drawn from previous projects and inventions. The works that have most inspired me are rhythm games, and upon being introduced to the types of classes at IMA, I was curious to see whether I could figure out how to make this sort of thing here. Controllers with gameplay similar to my project are those of Guitar Hero and Rock Band, which use a guitar, drum set, and microphone. Additionally, I learned that there is an arcade game developed by Namco in Japan with a very similar concept to my controller idea, except it uses taiko drums, a traditional Japanese instrument, and revolves around hitting the drums at the correct time, which is similar to the bongos in a way.
In terms of gameplay, the video game Rhythm Heaven (developed by Nintendo) actually inspired me. The game is based around very simple inputs that interact with a song in various silly scenarios. It’s kind of weird to explain, so I think the best way to demonstrate its simplicity is to provide a video or two.
As you can see, the game really only gives the player two inputs, and it makes good use of them by building a song around them to give the player different musical challenges. I plan to do something similar, except I will explain to the user what they are learning. It perhaps won’t be as quirky, but I hope they come away with a better sense of rhythm once they finish.
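The core of teaching rhythm this way is judging each bongo hit against the beat with timing windows, as rhythm games like Rhythm Heaven (and Namco’s taiko game) do. Here is a minimal Python sketch of that judgment; the window sizes and names are placeholder assumptions, not tuned values:

```python
def judge_hit(hit_time, beat_time, perfect=0.05, good=0.12):
    """Classify a drum hit against its target beat, rhythm-game style.
    `perfect` and `good` are window half-widths in seconds on either
    side of the beat; anything outside the `good` window is a miss."""
    error = abs(hit_time - beat_time)
    if error <= perfect:
        return "perfect"
    elif error <= good:
        return "good"
    return "miss"
```

In the game loop, each bongo sensor hit would be compared against the nearest upcoming or just-passed beat, and the returned grade would drive both the score and the on-screen feedback that teaches the player how early or late they were.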