Robot control interface research


Physiological data collection for the project: a ROTC participant wearing an EEG headset. (image via Alex Proaps)

Collaborators:

Advanced Anti-Terror Technologies (A2T2) served as the main design and development company. A2T2 specializes in military contracts, creating new technology or extending existing technology to assist Warfighters, veterans, and citizens. Faculty and students from Old Dominion University also participated in the research and development.

Problem:

Interface evaluations frequently rely on subjective measures and, to a lesser degree, operational performance to judge quality and usability. Yet “performance” is multidimensional, particularly for unmanned vehicle (UxV) operation. A more holistic view of operator performance must consider both cognitive and perceptual performance (e.g., status monitoring, decision making, problem solving, categorization, vigilance, situation awareness, judgment, workload, attention, and error rates). To augment traditional subjective and behavioral assessments, physiological indices can also be leveraged to reflect aspects of operator performance, with the advantage of real-time assessment.

Purpose:

  • Conduct heuristic evaluations to make design recommendations for two competing, proprietary small unmanned ground and aerial vehicle control interfaces.
  • Create small unmanned ground and aerial vehicle training scenarios using a virtual environment.
  • Conduct user research and employ performance and physiological measures of usability to evaluate two competing, proprietary multi-robot control interfaces with ROTC students and enlisted personnel. Our usability testing method combined subjective (i.e., NASA-TLX and SUMI questionnaires), physiological (i.e., eye tracking, EEG, and heart rate variability), and performance assessments of error rates, situation awareness, workload, stress, attention, ease of use, and learnability (see the scoring sketch after this list).
  • Conduct user research to evaluate two control methods (a handheld game device and the LEAP Motion controller) for robot navigation and robot arm gripper use (see the mapping sketch below the images).
  • Develop physiological (heart rate, EEG, and eye tracking) and performance data logging and after action review software to assist researchers and trainers during multi-robot control scenario-based training.
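
As context for the NASA-TLX workload measure mentioned above, here is a minimal sketch of how TLX responses are typically scored. The six subscales and the 15 pairwise-comparison weighting follow the standard instrument; the example ratings and weights are invented for illustration and are not project data.

```python
# Sketch of standard NASA-TLX scoring. Example values are invented.

SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def raw_tlx(ratings):
    """Raw (unweighted) TLX: the mean of the six 0-100 subscale ratings."""
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

def weighted_tlx(ratings, weights):
    """Weighted TLX: each weight is how often that subscale was chosen
    across the 15 pairwise comparisons, so the weights sum to 15."""
    assert sum(weights.values()) == 15
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15

ratings = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 40, "effort": 65, "frustration": 50}
weights = {"mental": 5, "physical": 0, "temporal": 3,
           "performance": 2, "effort": 4, "frustration": 1}
print(raw_tlx(ratings))               # 50.0
print(weighted_tlx(ratings, weights)) # ~60.3
```
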
Participant using a LEAP gesture controller to control a robot in a virtual environment

Participant using a LEAP gesture controller to control a virtual robotic arm
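
To make the gesture-control comparison concrete, the sketch below shows one plausible way a palm reading could be mapped to robot drive and gripper commands. It does not use the actual LEAP SDK; the palm offsets, deadband, and scaling are invented for illustration, not the project's tuning.

```python
# Illustration only: a hypothetical mapping from a gesture controller's
# palm reading to robot commands. Not the LEAP SDK; values are invented.

DEADBAND_MM = 15.0   # ignore small hand tremor around the neutral pose

def palm_to_command(palm_x_mm, palm_z_mm, grab_strength):
    """Map palm offset from the neutral pose (mm) to drive commands, and
    grab strength (0..1) to the gripper."""
    def axis(offset):
        if abs(offset) < DEADBAND_MM:
            return 0.0
        return max(-1.0, min(1.0, offset / 100.0))  # saturate at +/-100 mm

    return {
        "forward": axis(-palm_z_mm),   # push the hand forward to drive forward
        "turn": axis(palm_x_mm),       # move the hand sideways to turn
        "gripper_closed": grab_strength > 0.8,
    }

print(palm_to_command(palm_x_mm=40.0, palm_z_mm=-120.0, grab_strength=0.9))
# {'forward': 1.0, 'turn': 0.4, 'gripper_closed': True}
```
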

Process:

Assisted the project leads and graduate students in the research and data collection process by:

  • contributing to the SBIR proposal literature review process
  • conducting task analyses for multi-robot control
  • analyzing data to establish usability criteria and to create requirements to compare competing interfaces
  • working with the software development team to troubleshoot programming issues while developing the data logging software (see the logging sketch after this list)
  • developing scenarios for a future virtual environment test bed for multi-robot control interface training
  • conducting heuristic evaluation to compare competing interfaces
  • collecting physiological (eye tracking, heart rate, and EEG), performance (button presses, situation awareness, workload), and subjective opinion data from undergraduate students, ROTC students, and enlisted military personnel
  • creating training materials for future experimenters, including recording voice-overs
  • editing promotional videos for the final data logging software
  • troubleshooting programming issues during each iteration of the software
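
The logging sketch referenced above: the core idea behind the data logging software was writing every sample, whatever its source, against a shared clock so streams recorded at different rates could be aligned during review. The sensor names, channels, and CSV schema below are hypothetical stand-ins, not the project's actual format.

```python
# Illustrative time-stamped logging: all streams share one clock.
# Sources, channels, and schema are hypothetical.

import csv
import time

class SessionLogger:
    def __init__(self, path):
        self._file = open(path, "w", newline="")
        self._writer = csv.writer(self._file)
        self._writer.writerow(["timestamp", "source", "channel", "value"])
        self._t0 = time.monotonic()  # one shared clock for every stream

    def log(self, source, channel, value):
        self._writer.writerow([time.monotonic() - self._t0, source, channel, value])

    def close(self):
        self._file.close()

logger = SessionLogger("session_001.csv")
logger.log("eye_tracker", "gaze_x", 512.4)
logger.log("heart_rate_monitor", "bpm", 74)
logger.log("eeg", "engagement_index", 0.61)
logger.log("ui", "button_press", "camera_toggle")
logger.close()
```
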

Eye gaze fixations while using a handheld device to navigate a virtual robot

Challenges:

This research and development team was distributed across multiple organizations and geographic locations, and the workload was spread across individuals with varying levels and domains of expertise. The main challenge for this project, as in any distributed work, was therefore communication. Through this process, we were pushed to learn more about each other’s roles and domain areas: I picked up certain aspects of computer science at a rapid rate, while others learned more about experimental design. Learning to “speak each other’s language” is always a rewarding part of working on multidisciplinary teams. We also learned to employ different modes of communication; at times we relied less on email and more on phone conversations and remote desktop troubleshooting.

Findings:

Data collection with undergraduate Psychology students, ROTC undergraduate students, and enlisted military personnel is still in progress. Although the data are unclassified, full images of the interfaces and the final data logging software cannot be disclosed.

Differences in visit count across comparable interface areas of interest between expert robot controllers and novices

Eye tracking visit count within interface areas
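
For readers unfamiliar with the metric shown above: a “visit” to an area of interest (AOI) is a run of consecutive fixations that land inside it, and the visit count is the number of such runs. A minimal sketch, using an invented fixation sequence:

```python
# Computing AOI visit counts from a time-ordered fixation sequence.
# The fixation data below is invented for illustration.

def visit_counts(fixation_aois):
    """Count visits per AOI from the sequence of AOI labels, one label
    per fixation (None means the fixation hit no AOI)."""
    counts = {}
    previous = None
    for aoi in fixation_aois:
        if aoi is not None and aoi != previous:
            counts[aoi] = counts.get(aoi, 0) + 1  # a new visit begins
        previous = aoi
    return counts

# Three fixations on the map, a glance at the video feed, then back to
# the map: two separate visits to "map", one to "video_feed".
print(visit_counts(["map", "map", "map", "video_feed", None, "map"]))
# {'map': 2, 'video_feed': 1}
```
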

Product:

The final product, the Fused-Reality Assessment Module (FRAM), successfully integrates time-stamped data from a screen recorder, EEG, eye tracker, and heart rate monitor. FRAM allows for real-time assessment and visualization as well as after action review and debriefing. The interface provides EEG outputs based on Emotiv’s emotion indicators, heart rate accelerations and decelerations, and eye tracking data. Coding of critical events is facilitated through IntelliMarks, which allows users to label, sort, and organize important events within the task and interface (e.g., a steering error that leads to increased frustration) for playback.
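
To make the IntelliMarks concept concrete, here is a minimal sketch of time-stamped event marking against a shared session clock. It illustrates the idea only; it is not FRAM's implementation, and every name in it is hypothetical.

```python
# Hypothetical event marking for after action review: labeled marks on
# the shared session clock can later seek the synchronized recordings.

from dataclasses import dataclass, field

@dataclass(order=True)
class EventMark:
    timestamp: float                              # seconds on the session clock
    label: str = field(compare=False)             # e.g. "steering error"
    note: str = field(compare=False, default="")

class ReviewSession:
    def __init__(self):
        self.marks = []

    def mark(self, timestamp, label, note=""):
        self.marks.append(EventMark(timestamp, label, note))

    def marks_by_label(self, label):
        """All marks with a given label, in playback order."""
        return sorted(m for m in self.marks if m.label == label)

session = ReviewSession()
session.mark(132.5, "steering error", "clipped doorway; frustration rose on EEG")
session.mark(301.2, "steering error")
session.mark(215.0, "target identified")
for m in session.marks_by_label("steering error"):
    print(f"{m.timestamp:7.1f}s  {m.label}  {m.note}")
```
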

Example of after action review playback

Press:

The company has recently received press coverage for its unique approach to using LEAP Motion controllers to disarm a virtual bomb.

If you are interested in learning more about my other applied research projects, visit my Publications.