Education – Winter Show 2020 /shows/winter2020/ ITP/IMA/IMA Low Res Thu, 17 Dec 2020 23:35:18 +0000 en-US hourly 1 https://wordpress.org/?v=6.5.5 /shows/winter2020/wp-content/uploads/sites/53/2020/11/cropped-web_plain_2000_1400-1-32x32.png Education – Winter Show 2020 /shows/winter2020/ 32 32 You and I Don't See The Same World /shows/winter2020/you-and-i-dont-see-the-same-world/ /shows/winter2020/you-and-i-dont-see-the-same-world/#respond Mon, 14 Dec 2020 18:50:13 +0000 https://itp.nyu.edu/shows/winter2020/2020/12/14/you-and-i-dont-see-the-same-world/ Continue reading "You and I Don't See The Same World"

]]>
We each see the world through our own filter – literally.

Wendy Wang

You and I don't see the same world
https://youtu.be/mM-xqNZBNcU

Description

So much of how we view the world depends on what we see. But is the world really an objective truth that everyone will eventually see the same way? That is something for the audience to think about while exploring the diversity of animal vision. From finding out how your pet sees you to learning the extraordinary ways other animals perceive signals we cannot observe, it is a fun experience in discovering how we all see the world a little differently.

*This project aims to show the diversity of color vision. However, since some animals perceive aspects of light that we cannot see, and may perceive color with a higher dimensionality, these visualizations are my speculation of what could be.
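One way such a speculative filter can be sketched in code (the project itself is built in p5; the blend weights below are illustrative assumptions, not physiological data) is to collapse the red and green channels that a dichromat, such as a dog, cannot tell apart:

```javascript
// Approximate a dichromat's view of an RGB pixel by merging the red
// and green channels into a single yellow-ish value, keeping blue.
// The 0.55/0.45 weights are illustrative, not physiological data.
function dichromatFilter([r, g, b]) {
  const yellow = Math.round(0.55 * r + 0.45 * g); // red/green collapse
  return [yellow, yellow, b]; // blue channel is preserved
}

// Apply the filter across a flat [r,g,b, r,g,b, ...] array, as you
// might in a p5 pixels[] loop before calling updatePixels().
function filterImage(pixels) {
  const out = [];
  for (let i = 0; i < pixels.length; i += 3) {
    out.push(...dichromatFilter([pixels[i], pixels[i + 1], pixels[i + 2]]));
  }
  return out;
}
```

Pure red and pure green map to the same hue under this filter, which is the point: two colors we distinguish become one for the simulated viewer.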

ITPG-GT.2233.00007
ICM
Education,Narrative/Storytelling
]]>
/shows/winter2020/you-and-i-dont-see-the-same-world/feed/ 0
VIBROTACTILE /shows/winter2020/vibrotactile/ /shows/winter2020/vibrotactile/#respond Mon, 14 Dec 2020 18:49:53 +0000 https://itp.nyu.edu/shows/winter2020/2020/12/14/vibrotactile/ Continue reading "VIBROTACTILE"

]]>
An instrument that lets the hearing impaired feel music through vibrations

Keunjung Bae

a hand placed on strings to feel music through vibration; an instrument (top); an instrument (side)
https://vimeo.com/491830031

Description

Music is generally perceived as an art of sound, so we assume that someone with a hearing problem cannot experience it. Beethoven, a composer who became deaf in his later years, can be viewed with pity, but his story can also be seen as a human victory over adversity.

But just because people cannot hear music with their ears, does that mean they cannot play music at all? The key insight is that the source of sound is vibration. Even if we cannot sense that vibration with our ears, it should be possible to feel it with other parts of our body.

When listening to a low sound through a loudspeaker in a place like a club, we can feel the vibrations throughout our body. This stimulus can be sensed by a person with a hearing problem because it reaches other sensory organs besides the ear.

The purpose of this 'VIBROTACTILE' project is a device that makes it possible to recognize and feel music through tactile, vibration-based experiences, designed for the hearing impaired.

What would the world of music be like when the sound of strings is transmitted to the fingers through several vibration motors?

VIBROTACTILE lets users feel music through touch: vibration motors attached to the strings vibrate automatically as a song plays in a p5 sketch. Each motor is linked to a key on the p5 sketch's on-screen keyboard, so by feeling which string vibrates, users can tell which note is being played. Even without hearing the piano sound, the notes can be identified by vibration alone.
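The note-to-motor routing could be sketched roughly as follows; the motor count, starting note, and PWM range here are illustrative assumptions, not the project's actual wiring:

```javascript
// Route a note event from a p5 sketch to a vibration motor.
// Assumes 8 motors covering an octave starting at MIDI note 60
// (middle C) — an assumption for illustration only.
const FIRST_NOTE = 60;
const MOTOR_COUNT = 8;

// Returns the motor index for a MIDI note, or -1 if out of range.
function motorForNote(midiNote) {
  const idx = midiNote - FIRST_NOTE;
  return idx >= 0 && idx < MOTOR_COUNT ? idx : -1;
}

// Map note velocity (0-127) to a PWM duty cycle (0-255) so louder
// notes produce stronger vibration on the string.
function vibrationStrength(velocity) {
  return Math.round((velocity / 127) * 255);
}
```

In use, each played note would be translated to `{motor, strength}` and sent to the microcontroller driving the motors.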

Project website: http://www.keunjungbae.com/2020/12/09/pcom-week-14/

ITPG-GT.2301.00002
Intro to Phys. Comp.
Education,Accessible
]]>
/shows/winter2020/vibrotactile/feed/ 0
Smart Tiles /shows/winter2020/smart-tiles/ /shows/winter2020/smart-tiles/#respond Mon, 14 Dec 2020 18:49:45 +0000 https://itp.nyu.edu/shows/winter2020/2020/12/14/smart-tiles/ Continue reading "Smart Tiles"

]]>
A letter block board with a text-to-speech engine, designed for tangible learning.

Brandon Roots

Render of Smart Tile Board (angle); render of Smart Tiles main image; Smart Tiles CNC milled blocks and PCB
https://vimeo.com/489237628

Description

Smart Tiles is a block board that recognizes wooden letter tiles to generate speech and play games. It operates entirely offline. The project was designed with my 4-year-old niece and nephew in mind. They both enjoy playing with educational smartphone apps, but I had a hard time finding any really compelling tangible educational toys for learning letters and words. This is my attempt to fill that gap. The letter tiles are designed to be familiar, like the wooden alphabet blocks I had growing up, and are CNC milled from eco-friendly Green-T Birch plywood. I have pushed to make the technology as “invisible” as possible to bring some magic to the user experience. Both English Braille and print characters are engraved on each block to be inclusive of tangible learners.
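The tile-to-speech step might be sketched like this; the tile ID scheme and letter table are hypothetical stand-ins for the actual hardware protocol:

```javascript
// Each board slot reports the ID of the tile it holds (or null when
// empty); IDs map to letters. This table is a hypothetical example,
// not the real tile encoding.
const TILE_LETTERS = { 1: "C", 2: "A", 3: "T", 4: "D", 5: "O", 6: "G" };

// Read slots left to right, joining contiguous tiles into a word and
// stopping at the first empty or unknown slot. The returned string is
// what would be handed to the offline text-to-speech engine.
function wordFromSlots(slotIds) {
  let word = "";
  for (const id of slotIds) {
    if (id === null || !(id in TILE_LETTERS)) break;
    word += TILE_LETTERS[id];
  }
  return word;
}
```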

ITPG-GT.2845.001, ITPG-GT.2301.00005, ITPG-GT.2048.00002, ITPG-GT.2536.00001
Prototyping Electronic Devices, Intro to Phys. Comp., ICM – Media, Programming from A to Z (Online)
Education,Play/Games
]]>
/shows/winter2020/smart-tiles/feed/ 0
Text2Video /shows/winter2020/text2video/ /shows/winter2020/text2video/#respond Mon, 14 Dec 2020 18:49:34 +0000 https://itp.nyu.edu/shows/winter2020/2020/12/14/text2video/ Continue reading "Text2Video"

]]>
Text2Video is a software tool that converts text to video for a more engaging learning experience.

Zack Lee

Screenshots
https://vimeo.com/489223504

Description

Text2Video is a software tool that converts text to video for a more engaging learning experience.

I started this project because I was given many reading assignments this semester and felt frustrated reading long texts. For me, learning something through reading was very time- and energy-consuming. So I imagined: what if there were a tool that turned text into something more engaging, such as a video? Wouldn't that improve my learning experience?

I did some research and found a number of articles and studies suggesting that video can be a more effective learning medium than text for many people, including the following data:

– The human brain can process visuals 60,000 times faster than text.

– Viewers retain 95% of a video’s message compared to 10% when reading text.

– 65% of people consider themselves to be visual learners.

I created a prototype web application that takes text as an input and generates a video as an output.

I plan to keep working on the project, targeting college students aged 18 to 23, since a survey I found suggests they tend to prefer learning through videos over books.

The technologies I used for the project are HTML, CSS, Javascript, Node.js, CCapture.js, ffmpegserver.js, Amazon Polly, Python, Flask, gevent, spaCy, and Pixabay API.
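As a rough illustration of one early stage of such a pipeline (scene planning), the sketch below splits input text into sentences and estimates how long each narration segment should last; the 150 words-per-minute rate is an assumed average, not a figure from the project:

```javascript
// Plan one video "scene" per sentence: split the text naively and
// estimate narration duration from word count. 150 wpm is a commonly
// cited average speaking rate, assumed here for illustration.
const WORDS_PER_MINUTE = 150;

function planScenes(text) {
  // Naive split on ., !, ? — a real pipeline might use an NLP
  // sentence segmenter (the project lists spaCy among its tools).
  const sentences = text.match(/[^.!?]+[.!?]+/g) || [text];
  return sentences.map((s) => {
    const words = s.trim().split(/\s+/).length;
    return {
      caption: s.trim(),
      seconds: (words / WORDS_PER_MINUTE) * 60, // narration estimate
    };
  });
}
```

Each scene's caption could then drive an image search and a text-to-speech request, with the estimated duration setting how long the image stays on screen.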

Application link: https://text-to-video.herokuapp.com/

Demo Video: https://vimeo.com/489223504

Github repository: https://github.com/cuinjune/text2video

ITPG-GT.2988.001, ITPG-GT.2075.001, ITPG-GT.2778.00001
Hello, Computer: Unconventional Uses of Voice Technology, Imagination and Distributed Learning (Online), Reading and Writing Electronic Text (Online)
Education,Tool/Service
]]>
/shows/winter2020/text2video/feed/ 0
Historia /shows/winter2020/historia/ /shows/winter2020/historia/#respond Mon, 14 Dec 2020 18:49:29 +0000 https://itp.nyu.edu/shows/winter2020/2020/12/14/historia/ Continue reading "Historia"

]]>
A world-scale augmented reality historical walking tour that sheds light on the history of Downtown Brooklyn through archival photographs, ambient audio, and informational signs.

Ethan Printz

App icon and screenshot over historical photo; concept and inspiration for app; flowchart of app; 3D models for app; breakdown of app's interface
https://www.youtube.com/watch?v=l8aZH3OoNCo

Description

Historia is a fully realized augmented reality walking tour that uses Apple's ARKit and its newly released ARGeoTracking API to create a world-scale experience on the streets of Downtown Brooklyn. It has a 3D diegetic interface that integrates with the physical surroundings of the sidewalk to provide educational information about the neighborhood's rich history.

Stands pop up across the neighborhood showing photos, soundscapes, and videos sourced from historical archives at the New York City Historical Society, New York Public Library, New York Transit Museum, and Museum of the City of New York.

In addition to the information available by looking at and tapping on the 3D photo/audio/video markers (date, source, and 2–3 sentences of context), there is also in-depth research presented on info boards next to the neighborhood's most notable landmarks, like the demolished Albee Theater and the old City Fire Headquarters.

Through all this media and information, the walking tour attempts to shed light on the past of a neighborhood that has in recent years experienced rapid change and massive redevelopments.
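The core geo-anchoring idea can be sketched as follows: compute each landmark's distance from the user and reveal only nearby markers. The haversine formula is standard; the 75-meter radius and the coordinates in the test are illustrative assumptions, not values from the app:

```javascript
// Great-circle distance between two lat/lon points in meters,
// used to decide which AR markers are within walking distance.
const EARTH_RADIUS_M = 6371000;

function haversineMeters(lat1, lon1, lat2, lon2) {
  const toRad = (d) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
}

// Return the landmarks close enough to the user to show their markers.
// The 75 m default radius is an assumption for illustration.
function visibleMarkers(user, landmarks, radiusM = 75) {
  return landmarks.filter(
    (lm) => haversineMeters(user.lat, user.lon, lm.lat, lm.lon) <= radiusM
  );
}
```

In ARKit, ARGeoTracking handles the anchoring itself; a distance check like this is one way an app could decide which anchors to populate at all.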

IMNY-UT.220.001
Topics in Computation and Data: New Realities (UG)
VR/AR,Education
]]>
/shows/winter2020/historia/feed/ 0
Smart Knitting Needles /shows/winter2020/smart-knitting-needles/ /shows/winter2020/smart-knitting-needles/#respond Mon, 14 Dec 2020 18:47:49 +0000 https://itp.nyu.edu/shows/winter2020/2020/12/14/smart-knitting-needles/ Continue reading "Smart Knitting Needles"

]]>
When crafts meet technology

Liraz Primo

Smart Knitting needles
https://youtu.be/rqeLSeH5xfQ

Description

When the pandemic started, no one imagined that we would have to switch to an online studying system. The limitations of learning through a screen stand out even more when you are learning physical techniques. As a former textile student and a knitting instructor, it was clear to me that we need a system that integrates the craft with technology.

This project explores the connection between three-dimensional fabric and the two-dimensional notation used to read it. The needles sense the fabric as you create it, and a technical drawing of it appears on the computer screen.
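One way the stitch-to-drawing step could be sketched: group sensed stitch events into rows and map each stitch type to a chart symbol. The event format and symbols below are assumptions for illustration, not the project's actual data model:

```javascript
// Map each sensed stitch type to a chart symbol; knitting charts
// conventionally use distinct marks per stitch, assumed here as
// "_" for knit and "-" for purl.
const SYMBOLS = { knit: "_", purl: "-" };

// events: [{type: "knit"|"purl", row: number}, ...] in sensed order.
// Returns one string of chart symbols per row — the 2D "technical
// drawing" of the 3D fabric.
function chartFromStitches(events) {
  const rows = [];
  for (const e of events) {
    if (rows[e.row] === undefined) rows[e.row] = "";
    rows[e.row] += SYMBOLS[e.type];
  }
  return rows;
}
```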

In these difficult times, and given how visual we are, this could be an excellent way to develop our learning methods and help students and instructors communicate better.

ITPG-GT.2301.00002
Intro to Phys. Comp.
Education,Product Design
]]>
/shows/winter2020/smart-knitting-needles/feed/ 0
I want to see the world /shows/winter2020/i-want-to-see-the-world/ /shows/winter2020/i-want-to-see-the-world/#respond Mon, 14 Dec 2020 18:47:44 +0000 https://itp.nyu.edu/shows/winter2020/2020/12/14/i-want-to-see-the-world/ Continue reading "I want to see the world"

]]>
An interactive experience that shows how we see the world through biological processes.

Tiange Hou

colorful image; brain with lights; brain with LEDs
https://vimeo.com/488284023

Description

We perceive through different sensory systems. But do you know what happens inside our bodies that leads us to perceive the world? When light enters our eyes, how does it become vision?

“I want to see the world” is an interactive experience in which users help the person (with eyes, hand, and brain) to see. This project is an overview of how we see the world through biological processes. When light enters the eyes in darkness, a critical cell-signaling pathway called phototransduction starts in cells of our eyes: sodium channels close so that Na+ can no longer enter the cell, and everything else that helps us see can then occur. Information is sent from the eyes to the brain via nerves, and further visual processing happens in the brain, indicated by LED lights and drawings. The computational drawings simulate a biological imaging technique, fluorescence imaging.

Hands are responsible for all the interactions. Users make hand gestures toward the “hand,” implying that when we cannot see, other sensory systems, such as touch, help us. First, users wave at the “hand” to let the person know they are nearby. Second, users move their hands up and down to signal that they wish to help. Third, users follow the instructions and shine a light on the “eyes.” Last, users wave again to let the eyes see the final image.
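The four-step interaction reads naturally as a small state machine that advances only when the expected gesture arrives; the gesture names below are stand-ins for the actual sensor events:

```javascript
// The expected gesture sequence: wave, move hand up and down,
// shine light on the eyes, wave again. Names are illustrative
// labels, not the project's actual event identifiers.
const STEPS = ["wave", "raise-lower", "shine-light", "wave"];

function makeExperience() {
  let step = 0;
  return {
    // Feed a detected gesture; unexpected gestures are ignored.
    // Returns true once the whole sequence has been completed.
    onGesture(gesture) {
      if (step < STEPS.length && gesture === STEPS[step]) step += 1;
      return step === STEPS.length;
    },
    currentStep: () => step,
  };
}
```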

ITPG-GT.2233.00003, ITPG-GT.2301.00007
ICM, Intro to Phys. Comp.
Education,Art
]]>
/shows/winter2020/i-want-to-see-the-world/feed/ 0