
Researchers refine virtual reality for teachers

Eye-tracking technology is giving teachers virtual eyes in the backs of their heads.

Measuring eye movement with tiny cameras embedded in virtual reality headsets lets educators figure out which students are paying attention – and which ones aren’t – while they are immersed in virtual reality environments.

“Eye tracking allows teachers in virtual reality classrooms with a lot of students to pick up on wavering attention more quickly and to monitor every student,” said Dr. Christoph Borst, an associate professor in UL Lafayette’s School of Computing and Informatics.
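The article doesn’t describe how the team’s software works internally, but the basic idea – flagging a student whose gaze drifts away from the lesson for too long – can be sketched as a simple dwell-time check. Everything below (names, thresholds, the notion of an “on target” gaze sample) is hypothetical, for illustration only.

```python
from dataclasses import dataclass

# Illustrative sketch of dwell-time attention monitoring.
# All names and thresholds are hypothetical, not taken from the project.

OFF_TARGET_LIMIT = 2.0  # seconds of off-target gaze before flagging a lapse

@dataclass
class GazeSample:
    timestamp: float   # seconds since the lesson started
    on_target: bool    # did the headset's eye tracker land on the lesson object?

def find_attention_lapses(samples: list[GazeSample]) -> list[float]:
    """Return the start times of gaze lapses longer than OFF_TARGET_LIMIT."""
    lapses = []
    lapse_start = None
    for s in samples:
        if not s.on_target:
            if lapse_start is None:
                lapse_start = s.timestamp
            elif s.timestamp - lapse_start >= OFF_TARGET_LIMIT:
                lapses.append(lapse_start)
                lapse_start = None  # report each lapse only once
        else:
            lapse_start = None
    return lapses
```

A teacher-facing dashboard could surface these lapse times per student, which is the kind of “immediate insight” the researchers describe.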

Virtual reality – an interactive experience that happens within a simulated, computer-generated environment – is an emerging educational tool. Students wear specialized headsets with high-definition screens to view three-dimensional images.

Borst said that while education-based VR will not replace the need for reading from a textbook or putting pen to paper, “it’s effective for teaching material that is more visual than abstract.”

Geography students can take a “field trip” to the Grand Canyon without getting anywhere near the landmark, for example. Biology students can study anatomy during a dissection performed by a virtual teacher in an artificial laboratory.

“Benefits over traditional teaching methods include, in many cases, increased student engagement and motivation, and better retention of subject matter,” Borst said.

But there’s always room for improvement.

Borst is principal investigator for “Enhancing Educational Virtual Reality with Headset-based Eye Tracking.” The three-year project is being funded with a $515,813 grant from the National Science Foundation. Dr. Arun Kulshreshth, an assistant professor in the School of Computing and Informatics, is co-principal investigator.

The eye-tracking project is being conducted at the University’s Center for Advanced Computer Studies, the research arm of the School of Computing and Informatics.

Researchers will examine eye-movement patterns among a cross section of high school and University students.

Borst said findings of the research project will “help educators determine what sorts of teaching methods work best for different students.” Interactive lessons could be “rewound” for students who missed a component or section, for instance.

Another benefit of VR? It lessens the need for teachers to interrupt an entire class to get the attention of one student.

Consider computer science doctoral student Andrew Yoshimura’s assignment as part of the eye-tracking research. He’s using software to create about 10 virtual “cues” – tiny arrows or spiraling cones of concentric circles, for example – that will pop up on a virtual screen to draw a student’s attention back to the subject at hand.

Nicholas Lipari, an instructor in the School of Computing and Informatics, said the team is also “working on ways teachers could remove objects simply by pointing at them.”

Removing virtual objects from view, which could also be done with a few keystrokes or a click of a mouse, is intended to enhance learning, not detract from it.

Borst cited a virtual oil derrick as an example. If a lesson is intended to teach students about a maze of pipes – and a student’s gaze often drifts to a forklift sitting on the platform – the forklift could be removed from the immersive environment.

Conversely, objects could be added for students who don’t have issues paying attention or who are able to absorb large amounts of information rapidly. For those students, “you could have all sorts of things going on at once,” Borst said.

Borst is confident the eye-tracking project will help advance educational virtual reality.

“It will develop techniques for using sensor data that gives teachers immediate insight into student activities and behavior patterns,” he said. “This will help teachers improve ways they lead virtual reality lessons, and that will only enhance the learning experience for students.”

This article first appeared in the Fall 2019 issue of La Louisiane, The Magazine of the University of Louisiana at Lafayette.

Top photo: Jason Woodworth, a computer science doctoral student, wears a headset that enables him to take a virtual tour of the University’s Cleco Alternative Energy Center. The tour is led by computer science doctoral student Andrew Yoshimura, who appears in the virtual reality environment shown in the background (University of Louisiana at Lafayette/Doug Dugas).
