- Virtual 'therapists' screen for emotional distress with a high level of accuracy.
- A response to the NIMH's call for objective measures for mental health diagnosis.
- Non-judgmental and neutral, virtual interviewers ease patient anxiety and allow comparison between interviews.
- The U.S. Department of Veterans Affairs is employing the virtual agents to treat PTSD.
When we feel ill, it’s commonplace for physicians to take biological samples to diagnose our condition. Louis-Philippe Morency thinks interaction samples might be a key to treating emotional disorders.
Morency and his colleagues at the University of Southern California’s Institute for Creative Technologies (ICT) have answered the call from the National Institute of Mental Health to find more objective measures to diagnose mental health. (See, for instance, the NIMH Strategic Plan for Research.)
In response, Morency’s team has developed Ellie, a virtual agent who interviews patients to locate verbal and nonverbal signals of mental illness. Ellie is an artificially intelligent computer simulation of a therapist that can aid human therapists in their diagnostic work by interacting with emotionally distressed patients.
"Our goal is not to replace the doctor in making that diagnosis," Morency is quick to point out. "In fact, I would be extremely scared to have a doctor replaced by a computer. Rather, we want to enhance the doctor’s ability to heal."
The ICT has created virtual humans for a variety of scenarios, including disaster recovery training, job interview coaching, virtual reality gaming, motion picture special effects, and aids for medical diagnoses.
Ellie is an example of the latter. Now an assistant professor in the Language Technologies Institute at Carnegie Mellon University (CMU), Morency has been working on a Defense Advanced Research Projects Agency (DARPA)-funded, multi-institution effort to identify biomarkers for subtypes of mental illness. Virtual agents can detect signals of emotional distress more accurately than human counselors can.
The creation of these virtual interviewers is a three-part process, says Stefan Scherer, research assistant professor in the Department of Computer Science at the University of Southern California (USC). The first step toward identifying emotional distress was to detect facial landmarks on actual humans. ICT researchers recorded and analyzed face-to-face interactions of applicants solicited through a Craigslist advertisement. From these recordings, they were able to construct prototypes and map the behaviors onto virtual characters, who were then controlled by hidden humans in the SimSensei project.
The final step was to move to fully automated virtual agents like Ellie. She receives human emotional cues via a webcam and microphone that track features such as smile duration, gaze direction, and vowel frequencies. Based on self-assessment questionnaires from hundreds of subjects with a variety of emotional conditions, researchers were able to learn the common behaviors of people with and without emotional distress.
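To make the idea of a tracked feature concrete, here is a minimal sketch of how a measure like smile duration might be computed from per-frame detector output. The function names, the 0.5 probability threshold, and the example data are illustrative assumptions, not ICT's actual pipeline.

```python
# Sketch: average smile duration from per-frame smile probabilities.
# Threshold and names are hypothetical, not the SimSensei/Ellie code.

def smile_durations(probs, fps, threshold=0.5):
    """Return the length in seconds of each contiguous smile segment."""
    durations = []
    run = 0  # frames in the current smile segment
    for p in probs:
        if p >= threshold:
            run += 1
        elif run:
            durations.append(run / fps)
            run = 0
    if run:  # close a smile that runs to the end of the clip
        durations.append(run / fps)
    return durations

def mean_smile_duration(probs, fps):
    segs = smile_durations(probs, fps)
    return sum(segs) / len(segs) if segs else 0.0

# Example: a 30 fps clip containing two smiles, 60 and 30 frames long.
probs = [0.9] * 60 + [0.1] * 15 + [0.8] * 30 + [0.2] * 10
print(mean_smile_duration(probs, fps=30))  # 1.5 seconds on average
```

A per-session mean like this is the kind of scalar biomarker that can then be compared against population baselines or tracked over time.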
While many of us think we can read another person and divine their emotional state, when it comes to diagnosing serious emotional conditions, a more precise approach is called for.
To read human emotion, we listen to verbal cues and watch for non-verbal hints to what a person might be feeling. Like a dance, we adjust our behavior in response to our conversational partners. But Ellie sees these traits with much greater accuracy. For instance, while we might suspect something is amiss, Ellie knows the smile of a non-distressed person should last 3.43 seconds on average.
"Yes, humans can sense emotional distress," Morency admits, "but if you ask them ‘was there something different?’ they might know something was different, but they would not be able to tell you exactly what it was."
By opting for a virtual therapist, researchers get richer samples of speech than they would if patients interacted with live human interviewers. Ellie also asks questions the same way each time, a neutrality that allows comparisons between interviews. What's more, Ellie seems to increase patients’ willingness to talk: research shows people are more comfortable revealing personal information to a non-judgmental, autonomous interviewer, and they lower emotional barriers more readily because the fear of judgment is reduced. This is especially helpful in the post-traumatic stress disorder (PTSD) therapy Morency’s group has been doing with the U.S. Department of Veterans Affairs.
Ellie is now installed at Emory University Hospital in Atlanta, in use in a clinical PTSD trial under the supervision of Barbara Rothbaum. "We want to show the technology is mature enough to give objective results and enable clinicians to make more objective assessments," says Scherer.
Using Ellie, individual soldiers are interviewed before, during, and after their treatment. Ellie records the behavioral biomarkers, and her screenings will pick up changes in each soldier. The longitudinal feedback will inform her future ability to identify emotional distress. As she learns and becomes more intelligent, her ability to help patients improves.
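The longitudinal screening described above can be sketched as comparing a biomarker across sessions against the pre-treatment baseline. The biomarker choice (a mean smile duration, in seconds) and the session values below are illustrative assumptions, not clinical data from the trial.

```python
# Sketch: change in a hypothetical behavioral biomarker across the
# before/during/after sessions of a treatment, relative to baseline.

def change_from_baseline(sessions):
    """Given {session_label: biomarker_value}, return each later
    session's difference from the 'before' baseline."""
    baseline = sessions["before"]
    return {label: round(value - baseline, 2)
            for label, value in sessions.items()
            if label != "before"}

# Illustrative values only: mean smile duration in seconds per session.
sessions = {"before": 1.2, "during": 1.9, "after": 3.1}
print(change_from_baseline(sessions))  # {'during': 0.7, 'after': 1.9}
```

A rising trend toward the non-distressed baseline is the kind of objective change such screenings are meant to surface for clinicians.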
Scherer sees a lot of promise in the artificially intelligent diagnostic tool. "We’re not saving lives yet, but we’re striving to get the technology set up to collect information about things like suicide risk to warn health care providers early enough. The goal is to ultimately save lives — we’re far off but that’s really what we’re after."