ARTIFICIAL INTELLIGENCE


SPECIAL SECTION

Ellie, a virtual health agent, monitors patients’ expressions, gestures, and voice. (Photo: USC Institute for Creative Technologies)

The synthetic therapist

Some people prefer to bare their souls to computers rather than to fellow humans

By John Bohannon

People have always noticed Yrsa Sverrisdottir. First it was ballet, which she performed intensively while growing up in Iceland. Then it was science, which she excelled at and which brought her to the stage at conferences. And starting in 2010, when she moved to the University of Oxford in the United Kingdom to study the neurophysiology of the heart, it was her appearance. With her Nordic features framed by radiant blonde hair, “I just stand out here,” she says. “I can’t help it.”

After she arrived in the United Kingdom, she found that she no longer enjoyed the attention. She began to feel uncomfortable in crowds. Her relationships suffered. There had been some obvious stressors, such as the deaths of both of her parents. But the unease didn’t let up. By 2012, she says, “I felt like I was losing control.”

Then she met Fjola Helgadottir, one of the few other Icelanders in town. Helgadottir, a clinical psychology researcher at Oxford, had created a computer program to help people identify and manage psychological problems on their own. Sverrisdottir decided to give it a try.

The program, based on a technique called cognitive behavioral therapy (CBT) and dubbed CBTpsych, begins with several days of interactive questioning. “It was exhausting,” Sverrisdottir says. The interrogation started easily enough with basic personal details, but then began to probe more deeply. Months of back and forth followed as the program forced her to examine her anxieties and to identify distressing thoughts. CBTpsych diagnosed her as having social anxiety, and the insight rang true. Deep down, Sverrisdottir realized, “I didn’t want people to see me.”

Then the program assumed the role of full-fledged therapist, guiding her through a regimen of real-world exercises for taking control. It sounds like a typical success story for clinical psychology. But no human psychologist was involved.

CBTpsych is far from the only computerized psychotherapy tool available, nor is it the most sophisticated. Ellie, a system built at the University of Southern California (USC) in Los Angeles, uses artificial intelligence (AI) and virtual reality to break down barriers between computers and humans. Originally funded by the U.S. military, Ellie focuses on diagnosing and treating psychological trauma. Because patients interact with a digital system, the project is generating a rare trove of data about psychotherapy itself. The aim, says Albert “Skip” Rizzo, the USC psychologist who leads the effort, is nothing short of “dragging clinical psychology kicking and screaming into the 21st century.”

A 19 June editorial in The New York Times deemed computerized psychotherapy “effective against an astonishing variety of disorders.” The penetration of the Internet into far-flung communities could also bring mental health treatment to vast numbers of people who otherwise have no access. But whether clinical psychologists will accept AI into their practice is uncertain. Nor is it clear that the tools of AI can carry computerized psychotherapy beyond its so-far limited capacity, says Selmer Bringsjord, a cognitive scientist and AI researcher at Rensselaer Polytechnic Institute in Troy, New York. “It is incredibly ambitious.”

ALL OF TODAY’S VIRTUAL psychologists trace their origins to ELIZA, a computer program created half a century ago. Named after the young woman in Pygmalion who rapidly acquires sophisticated language, ELIZA was nothing more than a few thousand lines of code written by Joseph Weizenbaum and other computer scientists at the Massachusetts Institute of Technology (MIT) in the early 1960s to study human-computer interaction. ELIZA followed rules that determined how to respond during a dialogue. The most convincing results came from a rule set called DOCTOR that simulated a psychotherapist: By turning patients’ statements around as questions, the program coaxed them to do most of the talking. For instance, in response to a patient saying, “I feel helpless,” the computer might respond, “Why do you think you feel that way?” (You can talk to ELIZA yourself at http://psych.fullerton.edu/mbirnbaum/psych101/Eliza.htm.)
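The turn-the-statement-around trick is simple enough to sketch in a few lines of code. The following Python fragment is illustrative only; the patterns, word lists, and function names are invented for this article, not taken from Weizenbaum’s original program:

    import random
    import re

    # Pronoun reflections: "I feel helpless" -> "you feel helpless".
    REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are"}

    # DOCTOR-style rules: a regex plus response templates; the captured
    # fragment is reflected and spliced back in as a question.
    RULES = [
        (re.compile(r"i feel (.*)", re.I),
         ["Why do you think you feel {0}?", "How long have you felt {0}?"]),
        (re.compile(r"i am (.*)", re.I),
         ["Why do you say you are {0}?"]),
        (re.compile(r"(.*)", re.I),
         ["Please tell me more.", "How does that make you feel?"]),
    ]

    def reflect(fragment):
        """Swap first-person words so the statement points back at the speaker."""
        return " ".join(REFLECTIONS.get(word, word)
                        for word in fragment.lower().split())

    def respond(statement):
        """Turn a patient's statement around as a question, ELIZA-style."""
        for pattern, templates in RULES:
            match = pattern.match(statement.strip().rstrip("."))
            if match:
                return random.choice(templates).format(reflect(match.group(1)))

    print(respond("I feel helpless"))  # e.g. "Why do you think you feel helpless?"

The real program added keyword rankings and a small memory of earlier statements, but this reflect-and-ask loop is the heart of what patients responded to.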

People engaged readily with ELIZA, perhaps more for its novelty than its conversational skills, but AI researchers were unimpressed. “The idea that you could make a convincing AI system that didn’t really have any intelligence was seen as cheating,” says Terry Winograd, a computer scientist at Stanford University in Palo Alto, California, who was a Ph.D. student down the hall from Weizenbaum. This was a wildly optimistic time for the field, with many researchers anticipating computers with human-level general intelligence right around the corner. But work on artificial general intelligence didn’t pan out, and funding and interest dried up in what has come to be known as the “AI winter.” It wasn’t until the turn of the new millennium that mainstream interest in AI resurged, driven by advances in “narrow AI,” focusing on specific problems such as voice recognition and machine vision.

Conversational “chatbots” such as ELIZA are still viewed as a parlor trick by most computer scientists (Science, 9 January, p. 116). But the chatbots are finding a new niche in clinical psychology. Their success may hinge on the very thing that AI researchers eschew: the ability of an unintelligent computer to trick people into believing that they are talking to an intelligent, empathetic person.

THAT ISN’T EASY, as Rizzo is keenly aware. What most often breaks the spell for a patient conversing with Ellie isn’t the content of the conversation, because the computer hews closely to a script that Rizzo’s team based on traditional clinical therapy sessions. “The problem is entrainment,” he says, referring to the way that humans subconsciously track and mirror each other’s emotions during a conversation. For example, a patient might say to Ellie, “Today was not the best day,” but the voice recognition software misses the “not.” So Ellie smiles and exclaims, “That’s great!” For an AI system striving to bond with a human patient and earn trust, Rizzo says, “that’s a disaster.”
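Why a single dropped word is so damaging becomes clear from a toy sketch of keyword-based valence scoring. This Python example is a guess at the failure mode, not Ellie’s actual pipeline, and the word lists are invented:

    # Toy valence scorer: count positive words, flip the sign on negation.
    POSITIVE = {"best", "good", "great", "happy"}
    NEGATORS = {"not", "never", "no"}

    def valence(transcript):
        """Score an utterance as positive (>0) or negative (<0)."""
        words = transcript.lower().split()
        score = sum(1 for word in words if word in POSITIVE)
        if any(word in NEGATORS for word in words):
            score = -score  # crude negation handling
        return score

    said = "today was not the best day"
    heard = "today was the best day"   # the recognizer dropped the "not"

    print(valence(said))   # -1: respond with sympathy
    print(valence(heard))  #  1: "That's great!" and the spell breaks

However sophisticated the downstream response logic, one transcription error at this stage inverts the emotional signal the system acts on.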

To improve entrainment, a camera and microphone track a patient’s nonverbal signals: facial expression, posture, hand movement, and voice dynamics. Ellie crunches those data in an attempt to gauge emotional state.

The patterns can be subtle, says Louis-Philippe Morency, a computer scientist at USC who has led the development of the AI that underlies Ellie. For instance, he says, a person’s voice may shift “from breathy to tense.” The team devised algorithms to match patterns to a likely emotional state. It’s imperfect, he says, but “our experiments showed strong correlation with [a patient’s] psychological distress level.”

Other patterns unfold over multiple sessions. For instance, the team’s work with U.S. veterans suffering from post-traumatic stress disorder (PTSD) revealed that “smile dynamics” are a strong predictor of depression. The pattern is so subtle that it took a computer to detect it: Smiling frequency remained the same in depressed patients, on average, but the duration and intensity of their smiles were reduced.
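The team’s code is not public, but the features it describes are easy to state precisely. Here is a minimal Python sketch, assuming a hypothetical per-frame smile intensity score between 0 and 1 from a facial-expression tracker; the threshold and frame rate are arbitrary choices for illustration:

    # Smile dynamics from per-frame smile intensity scores in [0, 1].
    # A "smile" is a maximal run of frames at or above THRESHOLD.
    THRESHOLD = 0.5  # arbitrary cutoff for this sketch

    def smile_dynamics(signal, fps=30.0):
        """Count smiles and measure their mean duration (seconds) and intensity."""
        smiles = []  # (frame_count, mean_intensity) for each detected smile
        run = []     # intensities in the current above-threshold run
        for value in signal + [0.0]:  # trailing 0 closes any open run
            if value >= THRESHOLD:
                run.append(value)
            elif run:
                smiles.append((len(run), sum(run) / len(run)))
                run = []
        n = len(smiles)
        return {
            "frequency": n,
            "mean_duration_s": sum(f for f, _ in smiles) / n / fps if n else 0.0,
            "mean_intensity": sum(i for _, i in smiles) / n if n else 0.0,
        }

    # Two sessions with the same smile frequency; in the second, every
    # smile is shorter and weaker.
    print(smile_dynamics([0.9, 0.9, 0.9, 0.0, 0.8, 0.8, 0.8, 0.0]))
    print(smile_dynamics([0.6, 0.0, 0.0, 0.0, 0.55, 0.0, 0.0, 0.0]))

Both calls report the same frequency, but the second returns much smaller duration and intensity values, matching the depression pattern the team describes.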

Even if Ellie were to achieve perfect entrainment, Rizzo says, it “is really just an enhanced ELIZA.” The AI under the hood can only sustain about a 20-minute conversation before the spell breaks, which limits the system’s usefulness for diagnosis and treatment of most psychological problems. Without sophisticated natural language processing and semantic knowledge, Ellie will never fool people into believing that they are talking to a human. But that’s okay, Rizzo says: Becoming too humanlike might backfire.

One counterintuitive finding from Rizzo’s lab came from telling some patients that Ellie is a puppet controlled by a human while telling others she is fully autonomous. The patients told there was a puppeteer were less engaged and less willing to open up during therapy. That’s no surprise to AI researchers like Winograd. “This goes right back to ELIZA,” he says. “If you don’t feel judged, you open up.”

Ethical and privacy issues may loom if AI therapy goes mainstream. Winograd worries that online services may not be forthcoming about whether there is a human in the loop. “There is a place for deceiving people for their own good, such as using placebos in medicine,” he says. But when it comes to AI psychology, “you have to make it clear to people that they are talking to a machine and not a human.”

If patients readily open up to a machine, will clinicians be needed at all? Rizzo is adamant that a human must always be involved because machines cannot genuinely empathize with patients. And Ellie, he points out, has a long way to go before being ready for prime time: The program does not yet have the ability to learn from individual patients. Rizzo envisions AI systems as a way to gather baseline data, providing psychologists with the equivalent of a standard battery of blood tests. “The goal isn’t to replace people,” he says, “but to create tools for human caregivers.”

Helgadottir has a bolder vision. Although computers are not going to replace therapists anytime soon, she says, “I do believe that in some circumstances computerized therapy can be successful with no human intervention … in many ways people are not well suited to be therapists.” A computer may be more probing and objective.

Sverrisdottir’s experience suggests that CBTpsych, at least, can make a difference. Under the program’s tutelage, she says, “very slowly, I started to analyze myself when I’m amongst other people.” She identified a pattern of “negative thoughts about people judging me.” She might have got there with a human therapist, she says. But in the years since she first started talking to a computer about the trouble swirling in her mind, Sverrisdottir says, “I have been able to change it.” ■
