
Google is training AI to ‘hear’ when you’re sick. Here’s how it works.

Google’s AI arm is reportedly tapping into “bioacoustics,” a field that blends biology and acoustics and, in part, helps researchers understand how the presence of pathogens changes the sounds the human body makes. As it turns out, our sounds convey tell-tale information about our well-being.

According to a Bloomberg report, the search-engine giant built an AI model that uses sound signals to “predict early signs of disease.” In places where quality healthcare is hard to access, this technology could step in as an alternative: users would need nothing but their smartphone’s microphone.

How does Google’s bioacoustics AI work?

Google’s bioacoustics-based AI model is called HeAR (Health Acoustic Representations). It was trained on 300 million two-second audio samples that include coughs, sniffles, sneezes, and breathing patterns. These audio clips were pulled from non-copyrighted, publicly available content on platforms like YouTube.
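Google has not published HeAR's internals, but models trained on short audio clips like these typically start by converting each clip into a spectrogram. The sketch below is a hypothetical illustration (not Google's pipeline) of how a two-second recording might be framed and turned into a log-power spectrogram; the sample rate and window sizes are assumptions chosen as common defaults in audio processing.

```python
# Hypothetical sketch only: HeAR's actual preprocessing is not public.
# Shows one common way to turn a two-second clip (like the cough/breathing
# samples described above) into a log-power spectrogram for an audio model.
import numpy as np

SAMPLE_RATE = 16_000   # assumed; Google has not disclosed HeAR's sample rate
CLIP_SECONDS = 2       # matches the two-second samples in the report
FRAME_LEN = 400        # 25 ms analysis window (assumed)
HOP_LEN = 160          # 10 ms hop between windows (assumed)

def log_power_spectrogram(waveform: np.ndarray) -> np.ndarray:
    """Slice the waveform into overlapping frames, window each frame,
    and take the log of the power spectrum of each frame."""
    n_frames = 1 + (len(waveform) - FRAME_LEN) // HOP_LEN
    idx = np.arange(FRAME_LEN)[None, :] + HOP_LEN * np.arange(n_frames)[:, None]
    frames = waveform[idx] * np.hanning(FRAME_LEN)    # windowed frames
    power = np.abs(np.fft.rfft(frames, axis=1)) ** 2  # per-frame power spectrum
    return np.log(power + 1e-10)                      # log compression

# Synthetic noise as a stand-in for a real two-second cough recording.
rng = np.random.default_rng(0)
clip = rng.standard_normal(SAMPLE_RATE * CLIP_SECONDS)
spec = log_power_spectrogram(clip)
print(spec.shape)  # (198, 201): 198 frames x 201 frequency bins
```

A model would then learn representations from millions of such spectrograms, so that coughs or breathing patterns associated with disease cluster together.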

One example of such content is a video that recorded sounds of patients in a …
