Google is venturing into a groundbreaking approach to healthcare with its latest project: using artificial intelligence to detect early signs of diseases through the sounds people make. The tech giant is developing a new technology called HeAR (Health Acoustic Representations) that could transform how we monitor and diagnose health conditions by analyzing everyday sounds like coughing, sniffling, and labored breathing.
Google has trained HeAR on a dataset of 300 million audio recordings, including sounds associated with illnesses such as tuberculosis. The AI model is designed to pick up subtle acoustic indicators of health issues, potentially enabling earlier diagnosis and intervention.
To bring this technology to a broader audience, Google has teamed up with Salcit Technologies, an Indian startup specializing in AI-driven solutions for respiratory disease. Together, they plan to integrate HeAR into smartphones, making it accessible to people in high-risk regions with limited healthcare resources.
This initiative is part of Google’s broader efforts to translate human senses into digital formats. Previously, the company supported ventures exploring AI’s capability to detect diseases through smell, underscoring its commitment to advancing health technology through novel and accessible means.
With HeAR, Google aims to reshape the intersection of AI and healthcare, pointing toward a future where early disease detection is just a sound away.