New Delhi, Jan. 17 -- The USC Signal Analysis and Interpretation Lab (SAIL), in a new collaboration with UCLA, has found that AI can detect changes in patients' clinical states from speech as accurately as physicians. The study appears in PLOS One.

SAIL, which has long applied AI and machine learning to identify and classify video, audio and physiological data, partnered with researchers at UCLA to analyze voice data from patients being treated for serious mental illnesses, including bipolar disorder, schizophrenia and major depressive disorder. These individuals and their treating clinicians used the MyCoachConnect interactive voice and mobile tool, created and hosted on the Chorus platform at UCLA, to provide voice diaries related to their men...