Vital Sign AI’s free app detects vital signs remotely and non-invasively

With over 14 billion devices in use as of 2020, smartphones are an accessible but underutilized technology during pandemics. Our mission is to equip smartphones to fill the COVID-19 testing gap. Our app also provides critical support for the continuous health monitoring needed during recovery.

Vital Sign AI’s app processes smartphone camera and audio data with machine learning (ML) to detect vital signs including heart rate and respiration.
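To illustrate the idea, here is a minimal sketch of camera-based heart rate estimation: subtle color changes in skin pixels track the pulse, so the dominant frequency of the per-frame brightness signal falls at the heart rate. This is not the app's actual model; the function name, frequency band, and synthetic test signal are illustrative assumptions.

```python
import numpy as np

def estimate_heart_rate(signal, fps, lo_hz=0.7, hi_hz=4.0):
    """Estimate heart rate (bpm) from a 1-D brightness signal.

    `signal` is assumed to be the per-frame mean green-channel
    intensity of a skin region; the dominant frequency in the
    0.7-4.0 Hz band (42-240 bpm) is taken as the pulse.
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                          # remove the DC offset
    spectrum = np.abs(np.fft.rfft(x))         # magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0                        # Hz -> beats per minute

# Synthetic 10 s clip at 30 fps with a 1.2 Hz (72 bpm) pulse plus noise.
fps = 30
t = np.arange(0, 10, 1.0 / fps)
rng = np.random.default_rng(0)
sig = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.05 * rng.standard_normal(t.size)
print(estimate_heart_rate(sig, fps))  # ~72 bpm
```

A production system would add face/skin tracking, motion compensation, and learned models on top of this basic spectral estimate.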

This data supports early-symptom analysis of COVID-19 and helps guide therapeutic methods and treatments.

Users can give consent to share data with their healthcare provider for symptom evaluation, remote 1:1 telehealth support, or intake recommendations.

We designed the app to be accessible to diverse populations, including the elderly, regardless of their familiarity with technology. Users are guided step by step through sharing video and audio, and the app does the rest.

The documented ML models, code, and R&D process of this open-source project have the potential to help detect other diseases in the future.

Our team uses machine learning models to detect, for instance, breathing patterns from audio.

Shortness of breath, a critical symptom of COVID-19, can be analyzed with machine learning from audio recorded over time, an input nearly every smartphone can capture.
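As a simple illustration of how breathing can be read from audio: inhaling and exhaling modulate the loudness of a microphone recording, so the respiration rate appears as the dominant slow frequency of the amplitude envelope. This is a minimal sketch, not the project's actual model; the function name, smoothing window, frequency band, and synthetic signal are all illustrative assumptions.

```python
import numpy as np

def estimate_respiration_rate(audio, sr, lo_hz=0.1, hi_hz=0.7):
    """Estimate respiration rate (breaths/min) from raw audio samples.

    Breathing modulates loudness, so we track the amplitude envelope
    and take its dominant frequency in the 0.1-0.7 Hz band
    (6-42 breaths per minute).
    """
    env = np.abs(np.asarray(audio, dtype=float))
    win = max(1, int(0.5 * sr))                   # ~0.5 s moving average
    env = np.convolve(env, np.ones(win) / win, mode="same")
    env = env - env.mean()                        # remove the DC offset
    spectrum = np.abs(np.fft.rfft(env))
    freqs = np.fft.rfftfreq(len(env), d=1.0 / sr)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0                            # Hz -> breaths per minute

# Synthetic 60 s clip: a 100 Hz tone whose loudness rises and falls
# at 0.25 Hz, i.e. 15 breaths per minute.
sr = 800
t = np.arange(0, 60, 1.0 / sr)
audio = (1.0 + 0.8 * np.sin(2 * np.pi * 0.25 * t)) * np.sin(2 * np.pi * 100 * t)
print(estimate_respiration_rate(audio, sr))  # ~15 breaths/min
```

Real breath sounds are far noisier than this synthetic tone, which is where learned models add value over a plain spectral peak.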

The Vital Sign AI team is developing our API so that it can integrate into other applications for remote vital sign detection. 

Prognosis, monitoring, and screening for patients infected with COVID-19 are informed by analysis of vital sign data, including respiration and blood oxygen saturation. ML methods have produced convincing results in vital sign detection and promise to expand in capability and application.

Our application is especially helpful for people living in remote areas, the elderly, and the immunocompromised, as well as those living in under-resourced communities.

Vital Sign AI was launched on March 22, 2020, via the website. The project currently has 170+ volunteer healthcare practitioners, data scientists, engineers, and researchers from Stanford, Cornell Tech, and around the globe, as well as institutional support.

Interested? Contact our team to get the app, volunteer, or get involved with this project.

The app facilitates non-invasive, remote detection of vital signs such as heart rate variability, breathing patterns, and hemoglobin levels. We utilize the high-resolution audio and video capture capabilities of smartphones and laptops. By feeding this data stream through state-of-the-art machine learning and deep learning algorithms, we can draw inferences and make predictions that aid efforts to contain the pandemic and minimize its impact.

The app can also encourage compliance with suggested at-home therapeutic exercises and support ongoing mental and physical health monitoring. Vital sign detection shows progress and offers a motivating way to participate in aerobic and mindfulness exercises as therapeutic methods.

A well-documented research and development pipeline provides long-term value to the medical, scientific, and technical fields, enabling detection of other diseases and therapeutic support at home.