Virtual Reality Check for Medical Students

March 2, 2015

Jack has a backache. He’s nervous, sitting there in the doctor’s exam room, and his eyes quickly dart back and forth. He carefully answers the medical student’s questions, grimacing from the pain as he describes where it hurts, and how badly.

The student continues to probe with questions, hoping to home in on the problem while trying to keep Jack calm and talkative.

Jack is an avatar: a simulated, virtual patient. He’s actually one of several avatars built by researchers at The Ohio State University’s Advanced Computing Center for the Arts and Design (ACCAD) for a collaborative project with the Wexner Medical Center’s Doug Danforth, associate professor of obstetrics and gynecology.

The avatars in the Virtual Patient Project give medical students life-like practice in interviewing patients, delving into health histories and narrowing possible diagnoses. The ACCAD research team included Alan Price, associate professor of design, and Kellen Maicher, a former MFA student in design who is now an interactive media consultant for Wexner Medical Center.

“Dr. Danforth came to ACCAD to see if we could build an avatar that would create a rich, high-fidelity experience for the medical students,” explained Price. “The virtual patients are designed so students can practice interviewing, asking the right questions and developing effective communications skills.”

The goal was to create an immersive environment in which medical students could practice their clinical skills on a believable virtual patient.

“The avatar appears life-sized on a large computer screen in a room set up like a doctor’s office in our Clinical Skills Center,” Danforth said. “The student will ask the avatar questions, and will get responses, and emotions, back. They’ll be able to practice and test their skills one on one with a life-like patient. It will allow them to do all of the ‘detective work’ on a new patient and get immediate feedback from that patient.”

Most medical schools, including Ohio State’s, use “standardized patients” to teach these clinical skills. Standardized patients are members of the local community who are trained to act like real patients. That approach has limitations, including variability in the standardized patients’ skills and relatively high cost.

This led Danforth to consider using a virtual patient instead. Use of virtual patients isn’t new in medical schools, but most virtual patient programs are computer-based, clinical-reasoning scenarios, leading medical students through a pre-defined list of problems and diagnoses.

This program is different.

“Nobody to my knowledge is using avatars, artificial intelligence and virtual reality that combine communication and emotion,” Danforth said.

Plus, the program is not housed only in the Clinical Skills Center. A web-based version of the program allows medical students to practice communicating with the virtual patients anywhere — from laptops or home computers.

“Learning how to take an accurate and thorough medical history is one of the most essential skills that a medical student needs to learn,” Danforth said. “Developing innovative approaches to help them practice and get feedback on that aspect of their training will eventually help make them better doctors.”

The avatars, representing various ages and genders, complain of such common ailments as backache, headache, abdominal pain, cough and chest pain.

So how do you build a believable Jack? Maicher’s MFA thesis was based on developing “affect” in the virtual character, which he built in a computer animation studio and imported into a gaming engine. Communication is achieved via an extensive database built by Danforth.
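The article doesn’t detail how Danforth’s response database works, but the core idea of matching a student’s free-text question to a stored patient response can be sketched in a few lines. The question phrasings, keywords, responses and matching strategy below are illustrative assumptions, not the project’s actual implementation.

```python
# Hypothetical sketch: match a student's question against a small response
# database by counting keyword hits. Real systems would use far richer
# natural-language matching; this only illustrates the idea.

RESPONSE_DB = {
    ("where", "hurt"): "It's my lower back, mostly on the right side.",
    ("how long",): "It started about two weeks ago.",
    ("scale", "pain"): "I'd say it's about a seven out of ten.",
}

def match_response(question: str) -> str:
    """Return the stored response whose keywords best match the question."""
    q = question.lower()
    best, best_hits = None, 0
    for keywords, response in RESPONSE_DB.items():
        hits = sum(1 for kw in keywords if kw in q)
        if hits > best_hits:
            best, best_hits = response, hits
    # Fall back to a clarifying reply when nothing matches.
    return best or "I'm sorry, could you rephrase that?"

print(match_response("Can you tell me where it hurts?"))
```

In practice, keyword lookup like this is brittle; the appeal of a large hand-built database is that it can cover many phrasings of the same clinical question.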

The virtual character was brought to life by combining procedurally generated lip-sync animation for the character’s real-time responses with nonlinear playback of keyframed facial and gestural expressions, blended with motion-capture files produced in ACCAD’s motion capture lab.
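Blending separately authored animation channels, as described above, typically comes down to a weighted mix of pose values. The sketch below is a minimal, assumed illustration of that idea; the pose parameter names and the linear blend are not taken from the project itself.

```python
# Illustrative linear blend of two facial poses, each a mapping from a
# pose parameter (e.g. how far the jaw is open) to a 0..1 value.

def blend_pose(lip_sync: dict, expression: dict, w: float) -> dict:
    """Blend a lip-sync pose with an expression pose; w is the
    expression weight in [0, 1]."""
    keys = set(lip_sync) | set(expression)
    return {
        k: (1 - w) * lip_sync.get(k, 0.0) + w * expression.get(k, 0.0)
        for k in keys
    }

# Halfway between a talking pose and a grimace-style expression pose.
pose = blend_pose({"jaw_open": 0.8}, {"jaw_open": 0.2, "brow_raise": 1.0}, 0.5)
```

A game engine layering real-time lip sync over keyframed or motion-captured expressions performs essentially this mix every frame, with the weight driven by the current emotional state.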

“We want the avatar to elicit emotions, rather than just being robotic and cold,” Maicher said. “The emotional element helps students develop their clinical skills in the exam room.”

He adds that the project focuses on the personal exchange between doctor and patient.

“Not only do we want to have the lip sync correct, but we are building in seven different facial emotion tags to create any number of different emotional responses. That adds another level to the doctor/patient relationship.”