Net-Natyam

Net-Natyam is a hybrid training system and performance platform that explores the relationships between music, machine learning, and movement through electronic sound composition, pose estimation techniques, and classical Indian dance choreography.

Ami Mehta, David Currie

https://davidalexandercurrie.github.io/net-natyam/

Description

Bharatanatyam is a form of classical Indian dance that uses complex footwork, hand gestures, and facial expressions to tell stories. The dance is traditionally accompanied by Carnatic music and an orchestra consisting of a mridangam drum, a flute, cymbals, and other instruments. Net-Natyam uses three ml5.js machine learning models (PoseNet, Handpose, and Facemesh) and a webcam to detect the movements of a Bharatanatyam dancer and trigger a corresponding sequence of electronically composed sounds.
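
The mapping from detected movement to triggered sound could be sketched as below. This is an illustrative assumption, not the project's actual code: the zone layout, threshold, and function names are invented, and in a real sketch `nextTrigger` would be called from ml5's PoseNet `on('pose', …)` callback with `fire` gating a sample player.

```javascript
// Divide the camera frame into vertical zones; each zone triggers one sample.
function zoneForWrist(x, frameWidth, numZones) {
  const clamped = Math.min(Math.max(x, 0), frameWidth - 1);
  return Math.floor((clamped / frameWidth) * numZones);
}

// Only fire when confidence is high enough and the zone actually changed,
// so a held pose does not retrigger the same sound every frame.
function nextTrigger(keypoint, prevZone, frameWidth, numZones, minScore = 0.5) {
  if (keypoint.score < minScore) return { zone: prevZone, fire: false };
  const zone = zoneForWrist(keypoint.position.x, frameWidth, numZones);
  return { zone, fire: zone !== prevZone };
}
```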

ITPG-GT.2994.001, ITPG-GT.2048.00006
The Neural Aesthetic (Online), ICM – Media
Performance,Machine Learning

Mr. Scribbles: Dancing Drawing Robot

Get weird, dance and draw on your canvas!

Yona Ngo, Stuti Mohgaonkar

https://vimeo.com/491766360

Description

The Mr. Scribbles Dancing Drawing Robot was created to help people feel more comfortable with their bodies, with their movements – with being weird sometimes. It is a drawing robot controlled by dance poses.

ITPG-GT.2301.00006, ITPG-GT.2048.00002
Intro to Phys. Comp., ICM – Media
Machine Learning,Play/Games

To 100 (Non-existent) Boys I've Dated Before

A Tinder clone that tells my frustrating dating experiences with 100 non-existent men, whose dating profiles and stories are generated by machine learning.

Tianxu Zhou, Zeyao Li

https://www.loom.com/share/a8455c5fdfe04fbf9f9adf49d628296c

Description

Men all have a well-framed dating app bio. However, behind each one there is a weird and frustrating dating story. No matter what happened on our date, there's only one ending – we never talked again.

The project explores using existing datasets on the Internet and raises the audience's awareness of dating data privacy.

Swipe! Use it like regular Tinder. If you swipe right, you will get my dating story with the man you matched. The chat stores all the men you've connected with and the dating stories behind them.

ITPG-GT.2536.00001
Programming from A to Z (Online)
Machine Learning,Narrative/Storytelling

liminal mind

Liminal mind is an exploration of what AI perceives about our experience in liminal spaces.

Julia Ann Myers

https://youtu.be/tiaEudZCxIs

Description

Liminal spaces are undefined, transitional spaces, often devoid of spatial cues and context. What has AI learned about our experiences in liminal spaces, and how do AI-generated spaces reflect our conversations and images of liminality? Using media generated entirely by AI and machine learning programs, liminal mind is a WebVR experience composed of three liminal spaces featuring soundscapes, a voiceover generated by a neural voice, and equirectangular photos created from GAN images.

IMALR-GT.202
Critical Experiences
VR/AR,Machine Learning

WordEater

WordEater is a mini game where you can use your webcam to gobble up words in order to generate a sentence.

Jeeyoon Hyun

https://youtu.be/CZt9pIUUvpk

Description

https://jeeyoonhyun.github.io/WordEater/

Ever felt confused by all the words floating around the Internet?

WordEater is a browser-based game that lets you gobble up a bunch of meaningless words in order to make another meaningless sentence, eventually removing all the words you see on the screen.

It doesn't matter if you don't understand what the words or sentences are trying to say – after all, they are going to be swallowed and eaten anyway. All you need to do is get some peace of mind by consuming all the disturbing, shattered pieces of information that make complete nonsense. The goal of the game is to make your web browser cleaner by scavenging fragmented data with your mouth. After all, your web browser also needs some refreshment from the gibberish it encounters every day!

WordEater uses the Facemesh API in ml5.js to detect your mouth in your webcam. You can play the mouse version if you can't use your webcam – for example, if you are wearing a mask.
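
Mouth detection with Facemesh might look something like the sketch below. This is a hedged illustration, not WordEater's actual code: ml5's Facemesh returns an array of [x, y, z] landmarks per face, but the specific landmarks chosen and the threshold here are assumptions.

```javascript
// Euclidean distance between two [x, y, z] landmarks (z ignored here).
function dist2d(a, b) {
  return Math.hypot(a[0] - b[0], a[1] - b[1]);
}

// Compare the lip gap to the overall face height so the measure is
// independent of how close the player sits to the webcam.
function isMouthOpen(upperLip, lowerLip, chin, forehead, threshold = 0.08) {
  const gap = dist2d(upperLip, lowerLip);
  const faceHeight = dist2d(chin, forehead);
  return faceHeight > 0 && gap / faceHeight > threshold;
}
```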

ITPG-GT.2233.00005, ITPG-GT.2465.001
ICM, Machine Learning for the Web (Online)
Machine Learning,Play/Games

Weather Journals

Weather Journals uses data visualization and machine learning to put people in touch with their surroundings and with each other.

Schuyler DeVos

https://youtu.be/gm49Qnp8RBg

Description

Weather Journals is an attempt to put people in touch with their surroundings and with each other. The whole thing is powered by OpenAI's GPT-2, a machine learning model that generates human-like text.

First, submit a reflection on the weather where you are in the box at the top left. This can be whatever you want: What do the clouds look like? How does the weather make you feel? Does it remind you of another time?

After you've put in your reflection, adjust the length and the creativity of the text you'd like the weather to write you in return. Then hit the button and wait.

At the end of the day, everything that everyone has written for that day is used to train the machine learning model. What that means is that every day into the future, the model reflects all the reflections of all the days before it, growing and evolving with the weather.
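
A "creativity" slider on a text generator like GPT-2 usually maps to a sampling temperature. The sketch below shows that mechanism in isolation; it is an illustration of the general technique, not Weather Journals' actual code.

```javascript
// Scale the model's raw scores by temperature, then softmax into probabilities.
function softmaxWithTemperature(logits, temperature) {
  const scaled = logits.map((l) => l / temperature);
  const max = Math.max(...scaled); // subtract max for numerical stability
  const exps = scaled.map((s) => Math.exp(s - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

// Low temperature sharpens the distribution (predictable text); high
// temperature flattens it (more "creative", riskier text).
function sampleIndex(probs, rand = Math.random) {
  let r = rand();
  for (let i = 0; i < probs.length; i++) {
    r -= probs[i];
    if (r <= 0) return i;
  }
  return probs.length - 1;
}
```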

Art,Machine Learning

Mindful Breathing

The competitive meditation game that yells at you if you are not being mindful enough.

Michael Morran

https://vimeo.com/489113320

Description

Inspired by a tendency to take meditation way too seriously, Mindful Breathing challenges users to accumulate breaths, add upgrades, and wager their progress on the journey to transcendence. Using ml5 and PoseNet, participants' bodies are tracked in order to log breaths and transform what is, at first, a simple interface into a claustrophobic cacophony of 'mindfulness enhancers.' After a certain amount of progress, players are able to measure their success against the self-actualization of others and bet their breaths for the chance to surpass the competition.
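
Logging breaths from PoseNet data could be sketched as counting rise-and-fall cycles in shoulder height, as below. This is a hedged guess at the general approach; the project's actual tracking logic, smoothing, and threshold are unknown, and the values here are invented.

```javascript
// Count one breath per full rise-and-fall of the average shoulder y-position.
// In image coordinates, y shrinks as the chest rises and grows as it falls.
function countBreaths(shoulderYs, minDelta = 5) {
  let breaths = 0;
  let direction = 0; // -1 rising, +1 falling, 0 unknown
  let extreme = shoulderYs.length ? shoulderYs[0] : 0;
  for (const y of shoulderYs) {
    if (direction <= 0 && y > extreme + minDelta) {
      // Chest dropped after a rise: count the completed cycle as one breath.
      if (direction === -1) breaths++;
      direction = 1;
      extreme = y;
    } else if (direction >= 0 && y < extreme - minDelta) {
      direction = -1;
      extreme = y;
    } else if ((direction === -1 && y < extreme) || (direction === 1 && y > extreme)) {
      extreme = y; // extend the current rise or fall
    }
  }
  return breaths;
}
```

The `minDelta` dead-band keeps PoseNet's frame-to-frame jitter from registering as breathing.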

This project is inspired by Universal Paperclips by Frank Lantz. Big thanks to Mathura, Craig, Ellen, Lisa Jamhoury, Lisa Sokolov, and all my classmates for their help!

IMALR-GT.201
Connections Lab
Machine Learning,Play/Games

Pixel Topographies

Pixel Topographies uses machine learning to generate elevation maps based on Connecticut topography, then creates a tangible 3D representation of that data.

Philip Cadoux

https://vimeo.com/488813688

Description

My family sold the house I grew up in this year, which was very sudden, but for the best. I had never really been that connected to my home state, but when I discovered I may never go back there, I realized that I had come to really appreciate it as a place to grow up. We ended up renting a house not too far from where I grew up to get through the pandemic, but it made me think about what it was that caused me to become so nostalgic. What kept popping into my head was that it is a beautiful place. It has lovely forests, beautiful colors, coastal towns, and even a few mountains. Then, I stumbled across a site which shows the elevation in CT using colors.

This inspired me: I created an ElevationGAN and used Runway's hosted model feature to grab the generated images. I then used p5.js to downsample the images into a pixel grid. From there, I used serial communication to send the elevation data to an Arduino, which actuated pixels in and out to reflect these values. This is a proof-of-concept piece that could be scaled up to create 1:1 representations of elevation maps – ultimately creating wooden topographies. I have plans to elaborate on this ML model and try new things.
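
The downsampling step could be sketched as below: average a grayscale image into a coarse grid, then map each cell's brightness to an actuator travel value to send over serial. The grid size, travel range, and function names are assumptions for illustration, not the project's code.

```javascript
// `pixels` is a flat, row-major grayscale array of size width * height.
function toElevationGrid(pixels, width, height, cols, rows) {
  const cellW = Math.floor(width / cols);
  const cellH = Math.floor(height / rows);
  const grid = [];
  for (let r = 0; r < rows; r++) {
    const row = [];
    for (let c = 0; c < cols; c++) {
      let sum = 0;
      for (let y = r * cellH; y < (r + 1) * cellH; y++) {
        for (let x = c * cellW; x < (c + 1) * cellW; x++) {
          sum += pixels[y * width + x];
        }
      }
      row.push(sum / (cellW * cellH)); // average brightness of the cell
    }
    grid.push(row);
  }
  return grid;
}

// Map 0..255 brightness to actuator steps, as the Arduino side might expect.
function toSteps(brightness, maxSteps = 100) {
  return Math.round((brightness / 255) * maxSteps);
}
```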

ITPG-GT.2301.00002, ITPG-GT.2465.001
Intro to Phys. Comp., Machine Learning for the Web (Online)
Art,Machine Learning

Ripple Inside

Tell me your feeling, I hear, and I feel you.

Eden Chinn, Rui Shang

https://youtu.be/AsxloEstQ4U

Description

As humans, our existence is defined by different emotional states. When we feel an emotional impulse, it's like a ripple is dropped inside of us. This ripple flows outward and is reflected in how we perceive the world around us, as well as how we act within it.

For this project, we wanted to visualize emotional states using colors, shapes, and sounds in a poetic way.

The first thing we did was divide all emotion words into six classes: happy, content, sad, angry, shocked, and afraid. We then used p5.speech to recognize words, rather than training them ourselves in Teachable Machine, because it is far more accurate; for now, the project can recognize over 110 emotion words.

We create a flowing 3D object and use the sin() function to generate a beautiful ripple. More importantly, we generate multiple filters for one song in response to different emotions, and the amplitude of the song affects the frequency of the ripple. For the visual part, we believe that matching colors and custom shapes to different emotion words, based on color and shape psychology, gives people an immersive experience.
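
A sin()-based ripple of the kind described can be sketched as a radial sine displacement driven by per-emotion parameters. The formula and every numeric value below are illustrative assumptions, not the project's actual mapping.

```javascript
// Hypothetical parameters for the six emotion classes named above.
const EMOTION_PARAMS = {
  happy:   { amplitude: 30, frequency: 0.15 },
  content: { amplitude: 15, frequency: 0.05 },
  sad:     { amplitude: 10, frequency: 0.03 },
  angry:   { amplitude: 40, frequency: 0.25 },
  shocked: { amplitude: 35, frequency: 0.3 },
  afraid:  { amplitude: 25, frequency: 0.2 },
};

// Vertical offset of a surface point at distance d from the ripple's center;
// the song's amplitude would modulate `frequency` frame by frame.
function rippleOffset(d, time, { amplitude, frequency }) {
  return amplitude * Math.sin(d * frequency - time);
}
```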

Tell me your feeling with one word.

I hear you, I feel you.

ITPG-GT.2048.00007
ICM – Media
Machine Learning,Music

Blur Clock

A clock that gets blurry when you are not looking at it, representing the idea that time exists only because of humanity's blurred perspective of reality.

Aidan Fowler, Youngmin Choi

https://vimeo.com/489005572

Description

This clock is inspired by The Order Of Time by Carlo Rovelli, who theorizes that time as we usually imagine it only exists because of a “blurred macroscopic perspective of the world that we encounter as human beings [and that] the distinction between past and future is tied to this blurring and would disappear if we were able to see the microscopic molecular activity of the world.” Along with the thought that humans create time itself, we are tying in quantum mechanics and the idea that nothing exists in a determinate state until an interaction occurs or a measurement is taken. In our clock, the time is indeterminate and blurred until we measure it (by looking at the clock), which causes an exact time to become visible.
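
The "measurement" could be sketched as below: some face or gaze detector yields a confidence score, and the clock's blur fades out as the score rises. The score source, the pixel range, and the linear mapping are all assumptions; the project's actual detection method is not described here.

```javascript
// Map a watcher-detection score in [0, 1] to a CSS blur radius in pixels:
// fully sharp when someone is definitely looking, fully blurred when not.
function blurForAttention(score, maxBlurPx = 20) {
  const s = Math.min(Math.max(score, 0), 1); // clamp to [0, 1]
  return (1 - s) * maxBlurPx;
}
```

The result might then be applied as `filter: blur(${blurForAttention(score)}px)` on the clock element.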

ITPG-GT.2040.001
Time
Art,Machine Learning