Accessibility · Health · Machine learning · Product design · Tool/Service · Wearables

Breath Cycle: A Website That Follows Your Breath

Breath Cycle is a web-based, contactless breath biofeedback system that uses computer vision to monitor breathing patterns and give feedback on them, making breath awareness more accessible. In this interactive breath awareness experience, users can visualize their breathing pattern over time, use GPT to generate personalized meditations, and practice physiological coherence by engaging with feedback from a BLE HRV monitor.


Pedro Sodre


Despina Papadopoulos


Breath Cycle is a webcam-based breath biofeedback system that can be accessed in any browser. In an effort to make breath-centered meditation more accessible to all, the system's default mode syncs dynamic text, shapes, and colors to the user's breathing. At the center of the screen, quotes from Thich Nhat Hanh's commentary on the Anapanasati Sutra invite the user to follow their inhalations and exhalations while procedurally generated graphics coincide with this movement. Along an outer circle, a radial graph grows complete over the duration of the session, displaying the user's breathing waveform over time. The contour of the graph and its coloring are modulated by the breath signal, forming a unique visualization every time. The resulting shape can be saved as a .jpeg at the end of the session. In addition to the default mode, the system includes two other modes. In the Breath Affirmation AI mode, users can type in any intention and have a GPT-generated meditation unfold as they breathe in and out. In the Coherence Trainer mode, the user can connect any Bluetooth Low Energy-compliant HRV monitor to see a breath pacer and a coherence percentage at the center of the screen.


Technical Details

To monitor the user's breathing pattern, the system uses a computer vision algorithm based on the scientific paper “Prospective validation of smartphone-based heart rate and respiratory rate measurement algorithms”. Pyodide runs the Python code locally within the web browser, and the OpenCV library analyzes the vertical movement of the torso: the system estimates a torso region of interest from ML5 face mesh landmarks, then computes a dense optical flow over that area to detect micro-movements in the user's torso, achieving contactless breath monitoring.

When the user engages with the Breath Affirmation AI, the meditation instructions are generated with OpenAI's davinci-002 model using the following prompt: "Generate 12 sequential goal-oriented motivational affirmations. Each should be 5-word to achieve <users intention> and mentally prepare for it."

The Coherence Trainer mode provides the user with a breath pacer and a coherence percentage, which is calculated by identifying a single-peak sine wave in the 0.04–0.26 Hz range of the power spectrum of the R-R intervals. The project aims to release the computer vision breath monitoring tool and the BLE heart-rate monitor analysis/Web Bluetooth connection as a mindfulness library for p5.js in the near future.

Generating a personalized meditation
Chaski uses the AI feature to generate a sleep-focused breath meditation
Showing physiological coherence and web-bluetooth