“This VR system is a really enabling technology,” she says. “I don’t want to claim too much, but with more careful research, it could help oceanographers with impaired vision, hearing, or mobility become more involved with ship-based operations.”
But how could untrained scientists realistically pilot an advanced underwater vehicle like Jason? Last summer, Phung tested a group of trained ROV pilots and a mixed group of students and scientists to see whether they could pick up a block from a sandbox with a robot manipulator arm, using VR headsets and topside controllers. Pilots and novices both performed the task faster with the VR interface, while novices had a hard time with the normal piloting controls. Phung then made the test more challenging, reducing the camera frame rate so the image refreshed only once every 10 seconds. Even the experienced pilots struggled with the lack of visual information, but with the VR assist, they were able to perform the tasks without errors or wasted time.
Phung’s tests show that VR could be helpful in poor-bandwidth situations and could also be applied to control untethered vehicles, thus increasing their range. But an even more pressing need is to perform tasks in low-visibility areas. One complaint Phung hears from ROV pilots is that when they disturb seafloor sediment, they have to wait up to several minutes for the cloud to dissipate before moving again. In future studies, Phung hopes to use acoustics to supplement or even replace the visual data coming in from ROV cameras. Not only would that reduce the power load required to explore the dark depths, but it would also enable robots to differentiate between rocks and sand—even in pitch-black conditions.
Phung has been testing the concept of automating ocean robot manipulator arms, like the one shown here, using VR headsets. (Photo by Daniel Hentz, © Woods Hole Oceanographic Institution)
One day in the not-too-distant future, Phung envisions a fleet of “autonomous intervention vehicles” touring the world ocean on a mission to answer science questions—and perhaps pick up on trends that humans would miss entirely.
“I think in the next ten years, people won’t have to specify the low-level objectives, like, ‘Okay, take a sample here, take a sample there.’ I think we’re going to move closer to research questions, types of queries like, ‘Okay, I don’t care what rock you sample, I just want to know, is this type of bacteria present in this environment, yes or no?’” says Phung. “They could make their own decisions about what’s interesting to sample, or maybe they just do the measurement right there.”
These capabilities could help scientists target their research and pick up the pace of science, not only on Earth, but on other planets.
“If we can get robots to the level where they’re independent enough to make their own decisions without human support, that makes it possible to explore oceans that are not on Earth, like on Europa and Enceladus,” she says.
For now, Phung’s virtual reality vision is bringing us all one step closer to the frontiers of exploration on our own ocean planet.
Phung’s research is supported by the National Science Foundation Graduate Research Fellowship and the Link Foundation Ocean Engineering & Instrumentation Ph.D. Fellowship. This particular project was supported by NSF NRI Grants IIS-1830500 and IIS-1830660, and NASA PSTAR Grant NNX16AL0G.