Research News

GSE developing innovative ‘avatar-guided’ teacher training


The new avatar-guided technology is much more versatile and flexible than the current technology, and can be used with a cellphone, iPad and computer.

By CHARLES ANZALONE

Published May 13, 2019

“This avatar is designed to provide a level of interaction, whether a student is wearing virtual reality goggles or using a cellphone.”
Richard Lamb, director
Neurocognition Science Laboratory

“Hi there,” the human-like form on the smartphone says. Dressed in a conservative blue suit and equally neat white shirt, “Kevin” sets out to make immediate contact, even though he is a computer-generated person.

“I’m Kevin,” he says through the screen image. “And welcome to the practice interview. Let’s start with the obvious. I am not a real human. However, I am well-trained to converse with humans in this limited context. I can ask you questions. Record your answers. And answer some of your questions.

“This is very simple. You can speak to me the moment I stop talking. When you are done, simply push the microphone down below. So let’s give it a try.

“Are you as excited as I am to be here?”

Meet Kevin, one of a variety of digital guides being developed and soon to be used by the Graduate School of Education to enhance student learning using the most current and appropriately adapted technology — what GSE researchers call intelligent avatars.

The difference that this enhanced, integrated avatar brings to learning — whether it be training would-be teachers going for advanced degrees in education or preparing patients for medical procedures — is significant, according to Richard L. Lamb, associate professor and director of the Neurocognition Science Laboratory, where research on the use of these teaching avatars is taking place.

Kevin, our avatar of the moment, is ready to help advanced-degree teaching students — candidates the Graduate School of Education calls pre-service teachers — earn their degrees. This capability changes the game of training in several ways, Lamb explains. It allows pre-service teachers easier access to the training because the system is available on everyday technology, not just in a lab with elaborate and costly equipment. It also reduces the cost of using the system and makes it easier for universities to make these programs available to more students.

"Kevin," an avatar guide developed by the Graduate School of Education to enhance student learning.

"Kevin," one of the avatar guides in the prototype, can be programmed to suit the situation, whether it be how he looks, sounds or reacts to the problem he is being  presented with. Photo: Crosswater Digital Media

As a result, many more students can take advantage of these training programs more quickly and at dramatically lower cost.

“This avatar is designed to provide a level of interaction, whether a student is wearing virtual reality goggles or using a cellphone,” says Lamb, whose Neurocognition Science Laboratory has been involved in numerous projects educating children with virtual reality technology, including several programs in cooperation with Buffalo’s Enterprise Charter School.

“It’s completely interactive without having another human there.”

Existing educational technology programs include avatars controlled by a person behind the scenes who is directing the responses of the digital avatar, Lamb says.

“This avatar is completely independent,” Lamb says. “It doesn’t need a person. It does its interaction without any person behind the scenes telling it what to do or say.”

And the avatar in this system actually learns about the person it interacts with, Lamb says.

“So as we interact with it, it learns how to respond and how to make better responses to what we’re saying.

“That’s a step up from the existing technology,” he says. “Right now, if we want to go and use particular learning systems around avatars, we have to schedule two or three weeks out. We have to pay $350 an hour. We have to provide them with materials for the person behind the scenes working the avatar. And so you can’t just pick up the phone and start working with it.”

Scheduling these sessions ahead of time is problematic and expensive, Lamb notes.

“This allows me to interact any time I want,” he says. “You saw me. Right on my cellphone. I can use the system on an iPad. I can put it on a computer. I can put it on a website. You name it. I can go on the virtual reality material. It’s just much more versatile and flexible.

“And it’s a great example of how much more impressive and better oriented learning can be when you combine it with educational technology.”

UB researchers are developing the technology, called Human Interface Agent, in conjunction with the University of Southern California and Crosswater Digital Media. The system is currently in what they call the prototype stage, and the researchers plan to introduce “Kevin” and similar interactive avatars at UB in the fall semester.

The avatars can be used for teaching, by patients who need answers to questions about an upcoming medical procedure, and by counselors looking for tools when dealing with troubled children or adults.

“It could be any type of interview,” says Lamb. “For example, if I am a pre-service teacher and I want to practice telling a parent about a child who may need an individual education plan, or if I need to practice de-escalating a situation in a classroom between two students, or if I just want to practice teaching particular topics to students.

“In current systems used by most colleges of education, there is a flat screen like the large TV screen you can see in the lab, and they have avatars on that screen,” he says, “but those avatars are actually all controlled by a person behind the scenes, pulling the strings like a puppet.”

As with other virtual reality applications, the possibilities are close to endless. For example, avatars can be programmed to suit the situation, whether it be how they look, sound or react to the problem they are presented with.

“We can make this avatar look like whatever we want,” says Lamb. “It can wear a suit. It can look like a student. It can look like a patient. You could have an avatar acting as if it were traumatized or abused. And the pre-service teacher can work with that electronic avatar to identify signs and symptoms of abuse.

“The idea is to model the complexity of the classroom and break it down in a way that allows our pre-service teachers opportunities to experience this in a safe environment.

“There is this saying we use, ‘Educators will be educating the AI,’ the artificial intelligence,” Lamb adds. “So there is a place for the human to really teach the artificial intelligence how to be appropriate.”