In Progress
Actively Recruiting

Remote detection of vital signs (respiration, HR, HRV) for remote symptoms analysis and tracking

Vital sign detection app for remote symptoms analysis using smartphone cameras and audio with AI as sensors.

Created over 1 year ago
152 Volunteers

About the Project


Please see our new website (launched April 12, 2020):

Detecting COVID-19 symptoms currently requires doctor and hospital visits, or trips to test sites outside the home. This creates opportunities for COVID-19 transmission and infection, and increases the number of doctor visits, straining available medical resources.

Prognosis, diagnosis and screening for patients infected with COVID-19 has been suggested to be based on breathing characteristics (Wang et al. 2020), as well as on other vital signs such as blood oxygen saturation/hemoglobin levels.
These vital signs can be detected via facial cameras and machine learning: breathing patterns (Chen et al. 2019), hemoglobin levels, HRV and others.
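As a minimal sketch of how camera-based heart-rate detection can work (remote photoplethysmography), one can average the green channel of a face region over time, remove slow drift, and find the dominant frequency in the normal heart-rate range. The function name, the 0.7-3.0 Hz band and the linear detrending below are illustrative assumptions, not the project's actual pipeline:

```python
import numpy as np

def estimate_heart_rate(frames, fps):
    """Estimate heart rate (BPM) from face-region video frames.

    frames: array of shape (n_frames, height, width, 3), RGB.
    fps: camera frame rate in Hz.
    """
    # Mean green-channel intensity per frame: blood volume changes
    # modulate skin color most strongly in the green channel.
    signal = frames[:, :, :, 1].mean(axis=(1, 2))
    # Remove slow baseline drift (lighting changes, small head motion)
    # with a simple linear detrend.
    t = np.arange(len(signal))
    signal = signal - np.polyval(np.polyfit(t, signal, 1), t)
    # Find the dominant frequency in a plausible heart-rate band,
    # here assumed to be 0.7-3.0 Hz (42-180 BPM).
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 0.7) & (freqs <= 3.0)
    peak = freqs[band][np.argmax(power[band])]
    return peak * 60.0  # beats per minute
```

In practice a real pipeline would add face tracking, skin-region selection and bandpass filtering, but the core signal path is this simple.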

The proposed project is an app that can be installed on any smartphone or laptop to detect early symptoms of COVID-19 remotely, without leaving the home, and to monitor them over time (healthcare and medical institutions have already expressed interest in the monitoring use case).
It detects vital signs and breathing characteristics with technology accessible to everybody at home, the cameras of smartphones and laptops, combined with machine learning, and pairs this with the detection of early symptoms.
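Breathing characteristics can be sketched the same way once a chest-motion signal is available; here it is assumed to be extracted upstream (e.g. via optical flow or landmark tracking). A simple breaths-per-minute estimate counts the upward zero crossings of the centered motion signal; this is an illustrative approximation, not the project's method:

```python
import numpy as np

def respiration_rate(motion, fps):
    """Estimate breaths per minute from a 1-D chest-motion signal.

    motion: per-frame vertical displacement of a chest/shoulder
    region, assumed to be extracted upstream (e.g. optical flow).
    fps: sampling rate of the signal in Hz.
    """
    x = np.asarray(motion, dtype=float)
    x = x - x.mean()  # center the signal so breaths cross zero
    # One breath cycle produces one upward zero crossing.
    upward = np.count_nonzero((x[:-1] < 0) & (x[1:] >= 0))
    minutes = len(x) / fps / 60.0
    return upward / minutes
```

Beyond the rate itself, the shape of this signal (depth and regularity of breaths) is what the cited work uses for symptom analysis.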
The app user will also be able to use therapeutic exercises such as aerobic exercise, mindfulness and belly breathing. Often, such exercises require additional apps, and are therefore less accessible and less widely used among elderly people, vulnerable groups and under-resourced communities. This app will enable inclusive, accessible use in combination with vital sign detection: for the therapeutic part of the app, the vital signs provide a measurement of progress, for instance by showing a reduction in average heart rate, improved hemoglobin levels or deeper breathing patterns. Vital signs can also offer a quantification of stress levels and of health progress, for motivation.
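For the progress measurement described above, a standard time-domain HRV summary is RMSSD over the inter-beat intervals; higher values generally indicate more parasympathetic (rest-and-recovery) activity. A minimal sketch, with the function name and millisecond units as illustrative assumptions:

```python
import numpy as np

def rmssd(ibi_ms):
    """RMSSD: root mean square of successive differences between
    inter-beat intervals (in milliseconds), a common HRV measure."""
    diffs = np.diff(np.asarray(ibi_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))
```

Tracked across sessions, a rising RMSSD alongside a falling average heart rate is one way an app like this could show quantified progress from the breathing exercises.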

The app can also include information on how to reach the nearest hospital, doctor and pharmacy, as well as resources for communal support and connections to the nearest online groups, help with grocery delivery, etc.

How far along is it

+ Project and product management teams formed, with a tentative team structure so far:
+ Teams and subteams as in graphs
+ Research on the latest COVID-19 papers for symptoms and diagnosis, and on vital sign detection, both summarized over the past two weeks
+ Product road mapping after research for app with features as in drawings
+ Regular daily (sub)team meetings and development started

Breathing.AI has been using the laptop or smartphone camera with machine learning to detect vital signs such as heart rate in our prototypes, developed at MIT Media Lab and NEW INC since 2018. The app would use some of our existing research and development (such as the online prototype), with the researchers collaborating to improve accuracy and testing opportunities through NYU resources. Atashzar and his lab add machine learning capabilities and testing for symptom analysis, Hegde and his team work on the computer vision problems, Goldstein advises Breathing.AI and consults on regulatory compliance for the data, and Ben van Buren and his lab provide additional research and development capacity.
The app would be developed via remote research and development, and could be tested via remote downloads by people at home, as well as by medical professionals at the NYU Langone Health hospitals the NYU researchers are connected to. The app offers remote detection, and the data would be shared with doctors to evaluate whether treatment and in-person visits are necessary.

For the progress of this project so far, please also see links and team above.

Help Needed

Tasks that need to get done

+ Data scientists and developers to help with the breathing detection using webcam and AI (see papers).
+ App developers to build iOS and Android apps, and possibly a desktop app and browser plug-in.
+ Medical professionals to help validate accuracy and develop tests.
+ Grant writers, or links to funding resources.
+ Anybody willing to help with tasks such as outreach, writing, etc.
+ Project and product managers

Project details

Who is already working on this


UPDATE (April 5, 2020): 120+ volunteering experts (tech, science, medical) were added (some outside HWC).

PI Heidi Boisvert, City Tech CUNY, and founder

The project will be developed by the currently forming team of project and product managers:
+ Ishita Sharan
+ Travis McCauley
+ Ellie Shin
+ Helen Lemma, MPH (strategy)
+ Shain Bonnowsky, The Johns Hopkins University

Healthcare practitioners (among many others):
+ Sabrina Curi
+ Martha Karran
+ Ali Tahir

Advisors (among others):
+ Nimay Parekh
+ Ed Gelman

And the app will potentially be developed in partnership with Assistant Professors S. Farokh Atashzar and Chinmay Hegde (both NYU), Lynn Goldstein, Assistant Professor Ben van Buren (The New School) and me (see resume) with my startup Breathing.AI.
+ Hannes Bend (initiated project)
+ S. Farokh Atashzar, Assistant Professor of Electrical and Computer Engineering, as well as Mechanical and Aerospace Engineering at New York University (NYU). Atashzar leads the Medical Robotics and Interactive Intelligent Technologies (MERIIT) Laboratory to develop and implement artificial intelligence, advanced control systems, signal processing algorithms, and transparent human-robot interaction systems.
+ Chinmay Hegde, Assistant Professor at NYU; Research Interests: Machine Learning, Algorithms, Big Data, Signal and Image Processing.
+ Lynn A. Goldstein is the Founder of GDPRsimple, a tool to implement the EU General Data Protection Regulation (GDPR), a Senior Strategist for the Information Accountability Foundation, a global information policy think tank, and the Founder of a privacy consulting firm. Lynn is a member of the U.S. Department of Homeland Security’s Data Privacy and Integrity Advisory Committee and is an arbitrator for the EU-U.S. Privacy Shield Framework Binding Arbitration Program.
+ Ben van Buren, Assistant Professor of Psychology at The New School.

Helpful links

New website (4/12/2020):

First graphs of app design created by project management team and volunteering UX expert Jay Vidyarthi:

First research on COVID-19 papers and vital sign detection, summarized by our project team so far:

Internal survey form for team and sub-team formation of 100+ volunteers:

How to get in touch: text/WhatsApp +1 646 474 3062