Human factors issues in UAV platform and sensor control
conference paper
This paper presents an overview of several studies conducted for the Royal Netherlands Navy on the human-machine interface for steering Unmanned Aerial Vehicles (UAVs) and controlling their remote cameras. Although manual control is preferable for specific tasks, the operator misses critical sensory information, such as proprioceptive feedback on the camera viewing direction. Furthermore, the information presented about the remote environment, namely the payload images, is of degraded quality due to the restricted data-link. This may result in camera images with low temporal and spatial resolution and a small field of view. The studies focused mainly on the negative effects of the degraded visual information (including low update rates, transmission delays, and zoomed-in camera images) and on the possibilities of compensating for these effects through innovative human-machine interface design. An important point of departure was that the improvements should not place additional demands on the data-link. The techniques applied included graphical overlays, ecological interface design, head-coupled control, and prediction techniques. The results show that carefully designed human-machine interfaces can partially compensate for specific image degradations.
An overview is given of the UAV studies conducted into problems in steering a UAV and operating its sensor.
TNO Identifier
9497
Source title
Proceedings of the EURO UAV Scandinavia Conference, 1999