
York-designed wireless controller frees underwater robot

A waterproof controller designed and built by York University researchers is allowing an underwater robot to go “wireless” in a unique way.

AQUA, an amphibious, otter-like robot, is small and nimble, with flippers rather than propellers, designed for intricate data collection from shipwrecks and reefs.

The robot, a joint project of York, McGill and Dalhousie universities, can now be controlled wirelessly using a waterproof tablet built at York. While underwater, divers can program the tablet to display tags onscreen, similar to bar codes read by smartphones. The robot’s on-board camera then scans these two-dimensional tags to receive and carry out commands.
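To make the idea concrete, here is a minimal, hypothetical sketch of the tag-to-command scheme described above: the tablet displays a two-dimensional tag, the robot's camera decodes it to an integer ID, and that ID selects a motion command. Tag detection itself is simulated with a fixed ID sequence, and the names, command set and units are illustrative assumptions rather than details of the AQUA software.

```python
# Hypothetical sketch: map decoded tag IDs to robot commands.
from dataclasses import dataclass

@dataclass
class Command:
    name: str
    speed: float = 0.0      # forward speed (assumed units: m/s)
    yaw_rate: float = 0.0   # turn rate (assumed units: rad/s)

# Each tag ID the tablet can display maps to one robot behaviour.
TAG_COMMANDS = {
    0: Command("stop"),
    1: Command("swim_forward", speed=0.3),
    2: Command("turn_left", yaw_rate=0.2),
    3: Command("turn_right", yaw_rate=-0.2),
    4: Command("hold_depth"),
}

def dispatch(tag_id):
    """Look up the command for a decoded tag ID; unknown tags are ignored."""
    return TAG_COMMANDS.get(tag_id)

if __name__ == "__main__":
    # Stand-in for the camera pipeline: pretend these tag IDs were decoded
    # from successive video frames by a fiducial-marker detector.
    for tag_id in [1, 1, 2, 0]:
        cmd = dispatch(tag_id)
        if cmd is not None:
            print(f"tag {tag_id} -> {cmd.name} "
                  f"(speed={cmd.speed}, yaw_rate={cmd.yaw_rate})")
```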

Cutting the cord on underwater robots has been a long-standing challenge for scientists; water attenuates radio signals, so conventional wireless modems are of little use below the surface. Tethered communication, meanwhile, is cumbersome and can create safety issues for divers.

“Having a robot tethered to a vehicle above water creates a scenario where communication between the diver, robot, and surface operator becomes quite complicated,” says Michael Jenkin, professor in York’s Department of Computer Science & Engineering in the Faculty of Science & Engineering and co-author of the forthcoming paper, “Swimming with Robots: Human Robot Communication at Depth”.

“Investigating a shipwreck, for example, is a very delicate operation and the diver and robot need to be able to react quickly to changes in the environment. An error or a lag in communication could be dangerous,” Jenkin says.

Realizing there was no device on the market that fit the bill, Jenkin and his team at York’s Centre for Vision Research, including the paper’s lead author, computer science master’s student Bart Verzijlenberg, set to work constructing a prototype. The resulting device, fittingly dubbed AQUATablet, is watertight to a depth of 60 feet. Aluminum housing with a clear acrylic cover protects the tablet computer, which can be controlled by a diver using toggle switches and on-screen prompts.

“A diver at 60 feet can actually teleoperate AQUA 30 to 40 feet deeper. Needless to say, this is much easier on the diver physically, and much safer,” Jenkin says.

The tablet also allows divers to command the robot much as if they were using a video-game joystick: turn the tablet right and AQUA turns right, too. In this mode, the robot is connected to the tablet by a slim length of optical cable, circumventing many of the issues of a robot-to-surface tether, and the cable also lets AQUA send video feedback from its camera to the operator. In a fully wireless mode, the robot acknowledges prompts by flashing its on-board light. Its cameras can be used to build 3-D models of the environment, which can then guide the robot to particular tasks.
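A rough sketch, under stated assumptions, of that "tilt the tablet, turn the robot" mode follows: the tablet's measured tilt angles are scaled into turn- and pitch-rate commands, with a small dead zone so the diver can hold the tablet roughly level without commanding drift. The angles, gains and limits here are illustrative; the article does not describe the actual AQUATablet control law.

```python
# Hypothetical tilt-to-steer mapping for a tablet-based teleoperation mode.
def tilt_to_command(roll_deg, pitch_deg, gain=0.02, dead_zone_deg=5.0,
                    max_rate=0.5):
    """Map tablet tilt (degrees) to robot yaw/pitch rates (rad/s, assumed)."""
    def shape(angle):
        if abs(angle) < dead_zone_deg:   # ignore small, unintentional tilts
            return 0.0
        rate = gain * angle
        return max(-max_rate, min(max_rate, rate))  # clamp to a safe limit
    return {"yaw_rate": shape(roll_deg), "pitch_rate": shape(pitch_deg)}

# Example: tilting the tablet 20 degrees to one side commands a turn that way.
print(tilt_to_command(roll_deg=20.0, pitch_deg=0.0))
```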

“This is a huge improvement on [a robot] having to travel to the surface to communicate with its operators,” Jenkin says.

In the past, divers have used laminated flashcards to visually communicate with robots while underwater. However, these limit the diver to a pre-set sequence of commands.

“It’s impossible to anticipate everything you’re going to want the robot to do once you get underwater. We wanted to develop a system where we could create commands on the fly, in response to the environment,” he says.

Jenkin and Verzijlenberg’s paper will be presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) in Taiwan.

Jenkin and Verzijlenberg are two of the researchers based in York’s new state-of-the-art Sherman Health Science Research Centre, which officially opened on Sept. 14. Jenkin leads the Canadian Centre for Field Robotics, which is based on the building’s main level. The centre is supported by a grant from the Canada Foundation for Innovation. The AQUA project is funded in part by the Natural Sciences & Engineering Research Council of Canada. York's Centre for Vision Research is part of the Faculty of Health.
