A smartphone app called MoodCapture can tell if someone is depressed

Researchers at Dartmouth say they have built the first smartphone app that combines AI with facial-image-processing software to reliably spot the onset of depression before the person concerned even knows something is wrong.

The MoodCapture app uses the phone's front-facing camera to regularly capture a person's facial expressions and surroundings, then analyzes the images for clinical cues associated with depression. In a study of 177 people diagnosed with major depressive disorder, the app correctly identified early symptoms of depression 75 percent of the time.

The paper’s lead author, Andrew Campbell, said this is the first time that pictures taken “in the wild” have been used to predict depression. MoodCapture combines deep learning and AI hardware with facial recognition software. “All someone has to do is unlock their phone, and MoodCapture knows how their depression works and can tell them to get help,” he said.

MoodCapture is designed to analyze a burst of photos in real time each time a user unlocks their phone. The AI model links facial cues such as eye contact and changes in expression, along with a person’s surroundings, to indicators of how depressed someone is.

Co-author Nicholas Jacobson, who runs the AIM HIGH Laboratory for AI and Mental Health, said, “Our goal is to record the changes in symptoms that people with depression experience in their daily lives.”

Jacobson says that an AI program like MoodCapture should not simply tell people they are depressed. Instead, it should point them toward something constructive, like calling a friend or going for a walk outside.

Jacobson leads a grant from the National Institutes of Mental Health that funded this work, which explores how deep learning and passive data collection can detect depressive symptoms in real time. The project also builds on a 2012 study by Campbell’s lab that used automatic, passive data from the phones of participating Dartmouth students to gauge their mental health.

About the Author: Daniel
