Phan (Pan) Visutyothapibal

visualizing heartbeat

Live visualization of a user's heartbeat.

Boxes and Lines for Rods and Cones: Video Programming in Max, The Nature of Code

The user's heartbeat is captured through a stethoscope and analyzed to extract two factors: pulse rate and amplitude. A visualization driven by these two factors is projected for the user to experience.

1) The pulse rate of the user drives the speed of the animation.

2) The amplitude (the volume of the captured heartbeat) determines the brightness of the visualization.
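The two mappings above can be sketched outside of Max as plain functions. This is a minimal illustration, not the actual Jitter patch; the scaling constants (`floor`, `gain`, `fps`) are assumptions chosen for clarity.

```python
def brightness(amplitude, floor=0.05, gain=4.0):
    """Map a 0..1 amplitude envelope to 0..1 brightness.

    A small floor keeps the visuals faintly visible between beats;
    the result is clipped so loud beats never exceed full brightness.
    """
    return max(0.0, min(1.0, floor + gain * amplitude))


def frames_per_beat(bpm, fps=30.0):
    """Animation speed: how many frames one heartbeat cycle spans.

    A faster pulse yields fewer frames per cycle, i.e. a faster animation.
    """
    return fps * 60.0 / bpm
```

For example, a resting pulse of 60 bpm at 30 fps gives a 30-frame cycle, while 120 bpm halves it to 15 frames.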

I'm inspired by live sound and animation. Prior to this I used sound that I designed or composed, but for this project I wanted live sound that reflects a slice of the user's life.

Research: Using the concepts taught in The Nature of Code, I explored Max/Jitter to learn how the live amplitude of a sound can manipulate 3D animation in a way that represents the nature and depth of that sound.

I would like to think that this has a generic appeal for everyone vaguely interested in how the sound of their heartbeat can translate into animation... or in other words: almost everyone.

User Scenario

I intend to use a stethoscope with a built-in microphone to pick up the pulse rate and amplitude of the heartbeat. Users will then hear their own heartbeat and see the visualization it produces.
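One way the pulse rate could be derived from the microphone signal is simple peak detection on the amplitude envelope. This is a hypothetical sketch, not the Max patch itself; the `threshold` and `refractory` values are illustrative assumptions that would need tuning to the actual stethoscope signal.

```python
def estimate_bpm(envelope, sample_rate, threshold=0.5, refractory=0.25):
    """Estimate beats per minute from an amplitude envelope.

    envelope:    sequence of amplitude samples in 0..1
    sample_rate: envelope samples per second
    threshold:   level a sample must exceed to count as a beat onset
    refractory:  minimum seconds between detected beats, so one
                 "lub-dub" is not counted as two beats
    """
    beat_times = []
    last_beat = -refractory
    for i, a in enumerate(envelope):
        t = i / sample_rate
        if a >= threshold and t - last_beat >= refractory:
            beat_times.append(t)
            last_beat = t
    if len(beat_times) < 2:
        return None  # not enough beats to measure an interval
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    return 60.0 / (sum(intervals) / len(intervals))
```

In Max this kind of onset detection would typically be done with an envelope follower and a threshold object, but the averaging idea is the same: the mean interval between detected beats gives the pulse rate.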

To make the experience border on medical art, I will be dressed as a doctor and will treat users as patients.

Equipment: Stethoscope, microphone, Mac with Max/Jitter, projector or LCD screen.

Additional Setup: Table, chair, and lollipops for successful visualizations.

Learn: More about Jitter, and how to import 2D objects into OpenGL 3D space.

Discover: Simpler concepts are better.

Break: A stethoscope, to fit a microphone inside.