Alapetite, A., Hansen, J. P., & MacKenzie, I. S. (2012). Demo of gaze controlled flying. Proceedings of the 7th Nordic Conference on Human-Computer Interaction – NordiCHI 2012, pp. 773-774. New York: ACM. doi:10.1145/2399016.2399140

Demo of Gaze Controlled Flying

Alexandre Alapetite,1 John Paulin Hansen,2 & I. Scott MacKenzie3

1Technical University of Denmark
DK-2800 Kongens Lyngby
alal@dtu.dk

2IT University of Copenhagen
Rued Langgaards Vej 7
DK-2300 Copenhagen
paulin@itu.dk

3York University
4700 Keele St.
Toronto, Canada
mack@cse.yorku.ca

Abstract. Development of a control paradigm for unmanned aerial vehicles (UAVs) is a new challenge for HCI. The demo explores how to use gaze as input for locomotion in 3D. A low-cost drone will be controlled by tracking the user's point of regard (gaze) on a live video stream from the UAV.

Keywords: Gaze, control, input, robot, mobile, drone, unmanned aerial vehicle (UAV), target acquisition.

ACM Classification Keywords
H.5 [INFORMATION INTERFACES AND PRESENTATION]: H.5.2 User Interfaces – Input devices and strategies

SYSTEM DESCRIPTION

The possible areas of application for drone technology are numerous. It is known, for example, that "commercial companies and civil government bodies are taking an increasing interest in nano-UAVs, because of their unique surveillance capabilities" (http://www.bbc.co.uk/news/uk-18633664). UAVs can be useful for reconnaissance missions, such as search-and-rescue tasks, and for surveying large areas of terrain that may be difficult to traverse. They could also be used for video documentation of, e.g., building construction work.

Natural control of UAVs is likely to emerge from research on fusing several human input and output modalities. The fusion may occur between sensor data retrieved from, for example, eye, hand, head, and facial-muscle movements. The intent of this demo is to present a proof-of-concept for an "eye in the sky" that can be controlled and manipulated intuitively by gaze. The demo uses a low-cost AR.drone unmanned aerial vehicle (UAV) (www.ardrone.com). See Figure 1. The drone weighs only 420 grams and includes four small motors and propellers for locomotion. Importantly, the drone includes a camera that provides a view of the scene ahead. Communication with the base station is via WiFi. Commands are transmitted to the robot every 100 milliseconds, thus continuously updating the navigation instructions.

Figure 1. AR.drone (www.ardrone.com). The scene ahead is transmitted from the nose camera to a control monitor.
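
As a concrete illustration of this command link, the following Python sketch re-sends a navigation instruction every 100 ms over UDP using the AT*PCMD message of the publicly documented AR.Drone 1.0 protocol. It is our own simplification for this write-up, not the demo's code; the helper names and example values are assumptions.

import socket
import struct
import time

DRONE_IP = "192.168.1.1"   # default address of the AR.drone's own WiFi access point
AT_PORT = 5556             # UDP port for AT commands (AR.Drone 1.0 SDK)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
seq = 1                    # every AT command carries an increasing sequence number


def f2i(value):
    """Encode a float as the signed 32-bit integer representation the protocol expects."""
    return struct.unpack("<i", struct.pack("<f", value))[0]


def send_pcmd(roll, pitch, gaz, yaw):
    """Send one progressive movement command; all arguments are in -1.0 .. 1.0."""
    global seq
    msg = "AT*PCMD={},1,{},{},{},{}\r".format(
        seq, f2i(roll), f2i(pitch), f2i(gaz), f2i(yaw))
    sock.sendto(msg.encode("ascii"), (DRONE_IP, AT_PORT))
    seq += 1


# Re-issue the current navigation instruction every 100 ms, as described above.
while True:
    send_pcmd(roll=0.0, pitch=-0.1, gaz=0.0, yaw=0.0)  # gentle forward tilt
    time.sleep(0.1)
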

The display in the control room allows the user to perceive visual information acquired by the embedded camera on the drone. Below the display monitor there is a gaze tracking unit. With this, a 'fly-where-you-look' control principle is investigated.

Our approach relies on a direct feedback loop with no visible interface components displayed. We utilize the point of regard on the screen directly: as the user, who is situated in a control room, observes the streaming video, gaze continuously adjusts the locomotion of the UAV. See Figure 2. The UAV is located in terrain outside the control room building. See Figure 3. Direction and speed are modulated by the distance of the gaze point from the centre of the monitor. Since the gaze tracker provides input in only two dimensions {x, y}, a simplified mapping to the drone's 3D world is needed. Although several options are possible, we have chosen automatic altitude control to simplify the interaction. The y-axis controls the pitch (i.e., forward/backward inclination), which directly determines the longitudinal speed. The x-axis can be mapped to either yaw or roll: yaw (i.e., rotating left/right) changes the orientation while staying on the same spot, whereas roll (i.e., left/right inclination) induces a lateral displacement (like stepping left/right).

Figure 2. Control room. The point of regard on a video stream is used to control the drone. The eye tracker is positioned below the system display.

Figure 3. UAV terrain. The drone is passing through a circular target outside the control room building.
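
The gaze-to-command mapping described above can be summarised in a few lines. The sketch below is again an illustration rather than the demo's implementation: it assumes normalised gaze coordinates in 0..1, applies a small dead zone around the screen centre, lets the offset from the centre modulate speed, keeps gaz at zero so the drone's automatic altitude control takes over, and switches between the yaw and roll mappings of the x-axis. The send_pcmd() helper is the hypothetical sender from the previous sketch, and the gains and sign conventions are assumptions.

def gaze_to_command(gaze_x, gaze_y, use_yaw=True, gain=0.5, dead_zone=0.05):
    """Map a normalised point of regard (0..1, origin top-left) to
    (roll, pitch, gaz, yaw) values in -1.0 .. 1.0."""
    # Offsets from the screen centre; their magnitude modulates direction and speed.
    dx = gaze_x - 0.5
    dy = gaze_y - 0.5

    # A small dead zone around the centre lets the drone hover in place.
    if abs(dx) < dead_zone:
        dx = 0.0
    if abs(dy) < dead_zone:
        dy = 0.0

    # y-axis -> pitch (forward/backward inclination), which sets the
    # longitudinal speed; the sign convention here is an assumption.
    pitch = max(-1.0, min(1.0, dy * 2.0 * gain))

    # x-axis -> either yaw (turn on the spot) or roll (sidestep left/right).
    lateral = max(-1.0, min(1.0, dx * 2.0 * gain))
    yaw = lateral if use_yaw else 0.0
    roll = 0.0 if use_yaw else lateral

    return roll, pitch, 0.0, yaw   # gaz = 0.0: automatic altitude control


# Example: gaze slightly above and to the right of the screen centre.
roll, pitch, gaz, yaw = gaze_to_command(0.7, 0.3)
# send_pcmd(roll, pitch, gaz, yaw)   # re-sent every 100 ms by the loop above
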

Performance data from the demo participants will be analysed to determine the best mapping approach for a future controlled experiment. The task used in the demo involves manoeuvring the drone over a short distance through a target.

RELATED WORK

Gaze control of locomotion has been investigated in computer games [1] and in virtual environments [2]. Remote cameras were driven by gaze in work by Zhu et al. [5]. A driving robot was successfully controlled by gaze in an experiment reported by Tall et al. [3]. Finally, Wästlund et al. [4] tested a gaze-driven wheelchair.

REFERENCES

1. Nielsen, A. M., Petersen, A. L. and Hansen, J. P. 2012. Gaming with gaze and losing with a smile. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '12), Stephen N. Spencer (Ed.). ACM, New York, NY, USA, 365-368.

2. Stellmach, S. and Dachselt, R. 2012. Designing gaze-based user interfaces for steering in virtual environments. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '12), Stephen N. Spencer (Ed.). ACM, New York, NY, USA, 131-138.

3. Tall, M., Alapetite, A., Agustin, J. S., Skovsgaard, H. T. H., Hansen, J. P., Hansen, D. W. and Møllenbach, E. 2009. Gaze-controlled driving. In Proceedings of the 27th International Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '09). ACM, New York, NY, USA, 4387-4392.

4. Wästlund, E., Sponseller, K. and Pettersson, O. 2010. What you see is where you go: Testing a gaze-driven power wheelchair for individuals with severe multiple disabilities. In Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications (ETRA '10). ACM, New York, NY, USA, 133-136.

5. Zhu, D., Gedeon, T. and Taylor, K. 2011. "Moving to the centre": A gaze-driven remote camera control for teleoperation. Interact. Comput. 23, 1 (January 2011), 85-95.