Remote detection of vital signs (respiration, HR, HRV) for remote symptoms analysis and tracking
A vital-sign detection app for remote symptom analysis, using smartphone cameras and audio as sensors combined with AI.
About the Project
Please see our new website (launched April 12, 2020): https://www.vitalsign.ai/
Detecting COVID-19 symptoms currently requires visits to doctors, hospitals, or test sites outside the home. These visits risk transmission of COVID-19 and increase the number of doctor visits, straining the medical resources available.
Prognosis, diagnosis and screening of patients infected with COVID-19 have been proposed based on breathing characteristics (Wang et al. 2020, https://arxiv.org/abs/2002.05534) as well as other vital signs such as blood oxygen saturation and hemoglobin levels (https://www.sciencedaily.com/releases/2020/03/200330152135.htm).
The detection of vital signs can be achieved via facial cameras and machine learning for breathing patterns (Chen et al. 2019 https://arxiv.org/pdf/1909.03503.pdf), hemoglobin levels (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6371334/), HRV and others (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6396075/).
The proposed project is an app that can be installed on any smartphone or laptop to detect early symptoms of COVID-19 remotely, without leaving home, and to monitor patients over time (healthcare and medical institutions have already expressed interest).
It detects vital signs and breathing characteristics with machine learning, using the cameras of smartphones and laptops, devices accessible to everybody at home, and combines this with the detection of early symptoms.
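As a rough illustration of the camera-based approach cited above (a sketch, not the project's actual code), heart rate can be recovered from the subtle color changes of the skin: average the green channel over a face region in each video frame, then find the dominant frequency in the plausible heart-rate band. The function name, parameters and synthetic demo signal below are all illustrative assumptions.

```python
# Hypothetical sketch of camera-based heart-rate estimation (rPPG).
# Assumes per-frame mean green-channel values have already been
# extracted from a face region of interest in the video.
import numpy as np

def estimate_heart_rate(green_means, fps):
    """Estimate heart rate (BPM) from a 1-D series of per-frame
    mean green-channel intensities sampled at `fps` frames/second."""
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()              # remove DC offset
    windowed = signal * np.hanning(len(signal))  # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(windowed), d=1.0 / fps)
    # Restrict to a plausible human heart-rate band: 0.7-4 Hz (42-240 BPM).
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0                      # Hz -> beats per minute

# Demo with a synthetic 1.2 Hz (72 BPM) pulse plus sensor noise:
np.random.seed(0)
t = np.arange(0, 30, 1 / 30.0)                   # 30 s of video at 30 fps
fake_signal = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.randn(len(t))
print(round(estimate_heart_rate(fake_signal, fps=30)))  # ~72
```

In practice, papers such as Chen et al. 2019 use learned models rather than a plain FFT peak, precisely to handle motion, lighting changes and noise that this simple spectral method cannot.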
The app user will also be able to follow therapeutic exercises such as aerobic exercise, mindfulness and belly breathing (https://www.nytimes.com/2020/03/20/style/self-care/isolation-exercise-meditation-coronavirus.html). Many such exercises require additional apps, and are therefore less accessible and less widely used among the elderly, the vulnerable and people in under-resourced communities. This app will enable inclusive and accessible use by combining the exercises with vital sign detection. For the therapeutic part of the app, the vital signs provide a measurement of progress, for instance a reduction in average heart rate, improved hemoglobin levels or deeper breathing patterns. Vital signs can also quantify stress levels and health progress, providing motivation.
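The "measurement of progress" described above can be made concrete with simple time-domain statistics over detected beat-to-beat (RR) intervals. The sketch below is illustrative only (function names and the example data are assumptions, not the project's code): it computes average heart rate and RMSSD, a standard time-domain heart rate variability measure.

```python
# Illustrative progress metrics from beat-to-beat (RR) intervals in ms.
import math

def average_heart_rate(rr_ms):
    """Mean heart rate in BPM from a list of RR intervals (milliseconds)."""
    mean_rr = sum(rr_ms) / len(rr_ms)
    return 60000.0 / mean_rr

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (ms).
    Higher RMSSD generally reflects stronger parasympathetic activity,
    so an increase over time can indicate reduced stress."""
    diffs = [(b - a) ** 2 for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(diffs) / len(diffs))

rr = [820, 810, 835, 790, 805, 830]          # example RR series (ms)
print(round(average_heart_rate(rr), 1))      # mean RR 815 ms -> 73.6 BPM
print(round(rmssd(rr), 1))                   # 26.8 ms
```

Tracked across breathing sessions, a falling average heart rate or a rising RMSSD would give the user a quantitative sign of progress.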
The app can also include information on how to reach the nearest hospital, doctor and pharmacy, as well as resources for communal support: connections to nearby online groups, help with grocery delivery, etc.
+ Project and product management teams formed, with a tentative team structure so far: https://drive.google.com/file/d/1_1DxHUMnjy83pdLpVQzttohg6iEmiqyJ/view?usp=sharing
+ Teams and subteams as in graphs
+ Research on the latest COVID-19 papers for symptoms and diagnosis
+ Research on vital sign detection (both summarized over the past two weeks)
+ Product road mapping after research for app with features as in drawings
+ Regular daily (sub)team meetings and development started
Breathing.AI has been using laptop and smartphone cameras with machine learning to detect vital signs such as heart rate in prototypes developed at MIT Media Lab and NEW INC since 2018 (https://youtu.be/0ypXPvQXWtk). The app would build on this existing research and development (such as the online prototype www.coloring.ai), in collaboration with the researchers to improve accuracy and open testing opportunities through NYU resources. Atashzar and his lab add machine learning capabilities and testing for symptom analysis, Hegde and his team work on the computer vision problems, Goldstein and her team advise Breathing.AI and consult on regulatory compliance for the data, and Ben van Buren and his lab provide additional research and development capacity.
The app would be developed through remote research and development, and could be tested via remote downloads by people at home as well as by medical professionals at the NYU Langone Health hospitals to which the NYU researchers are connected. The app offers remote detection, and the data would be shared with doctors to evaluate whether treatment and in-person visits are necessary.
For the progress of this project so far, please also see links and team above.
+ Data scientists and developers to help with the breathing detection using webcam and AI (see papers).
+ App developers to build iOS and Android apps, and possibly a desktop app and browser plug-in.
+ Medical professionals to help validate accuracy and develop tests.
+ Any grant writers or links to resources.
+ Anybody willing to help with tasks such as reaching out, writing etc.
+ Project and product managers