Controlling Machine Systems by Human Mind in Natural Environments

Non-invasive brain-computer interface (BCI) technologies have translated neural information into commands capable of controlling machine systems such as robot arms and drones. Although noninvasive BCI techniques have improved significantly, their practical real-world use has yet to be realized. This is because current noninvasive BCIs still operate over a limited frequency range and are usually restricted to binary commands; in addition, their information transfer rate is too low for fluent communication. To overcome these problems and improve practical applicability, the hybrid BCI paradigm, which combines a brain activity-based interface with another multimodal control device, has become a popular topic of investigation. The hybrid BCI paradigm is expected to be a good choice for practical implementation with feasible hardware. Among the various control devices that can be hybridized with BCI, eye-tracking devices are of particular interest because of their omnidirectional controllability in 2D space.

Figure 1. (a) Interface system with a subject during a task. (b) Pointing and selecting task procedure. (c) Task parameters.

Recently, inexpensive products for measuring EEG signals non-invasively have been commercialized, offering the potential to extend BCI to practical HCI applications. Commercialization is an important aspect of BCI technology, and in this respect the currently available devices need to be tested. One research direction for BCI is to become a part of daily life as a convenient way of interfacing with the environment. We evaluated the promising hybrid interface scheme of eye movement and noninvasive BCI, in particular with an inexpensively built and comfortable system. Instead of focusing on advancing the performance of this hybridization, our work attempted to evaluate its feasibility as a potential approach to real-world applications through a quantitative performance comparison of the developed interface with other pointing interfaces, using Fitts's law.1
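For readers unfamiliar with the metric used in such comparisons: Fitts's law relates the difficulty of a pointing task to target distance and width, and throughput (bits/s) normalizes performance across interfaces. A minimal sketch of the standard Shannon formulation (the specific targets and times below are illustrative, not from our study):

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts's index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time):
    """Throughput in bits/s: index of difficulty over movement time."""
    return index_of_difficulty(distance, width) / movement_time

# Illustrative pointing task: a 32-px target 256 px away, reached in 1.2 s.
id_bits = index_of_difficulty(256, 32)   # log2(9) ≈ 3.17 bits
tp = throughput(256, 32, 1.2)            # ≈ 2.64 bits/s
```

Comparing mean throughput across interfaces (e.g. mouse vs. the hybrid interface) on the same set of targets is what allows a device-independent performance comparison.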

Although the performance of the hybrid interface tended to be less consistent across participants, the evaluation has important implications for developing hybrid interface devices, especially those built from low-cost equipment. Building on this result, we further developed a low-cost and easy-to-use hybrid interface that interprets eye movements and mental activity to allow real-time control of applications in a 3D environment. As an alternative to existing work, the interface addressed the limitations of previous systems within a single system.2

Figure 2. The overview of our proposed hybrid interface.

  • Easy to learn and easy to use: hybridizing eye tracking with EEG-based classification allows users to complete their tasks easily, with various commands, in 3D physical space.
  • Low-cost and conveniently wearable: people can control their flight naturally in everyday life.
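The division of labor in such a hybrid scheme can be sketched as a simple control loop: the eye tracker supplies continuous 2D pointing, while the EEG classifier supplies a discrete command (e.g. "select") only when its confidence is high. The function names, the probability threshold, and the data layout below are illustrative assumptions, not our system's actual API:

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float  # normalized horizontal gaze position, 0..1
    y: float  # normalized vertical gaze position, 0..1

def hybrid_step(gaze: GazeSample, eeg_select_prob: float, threshold: float = 0.8):
    """One step of a hypothetical hybrid loop: gaze drives the cursor,
    and a selection fires only when the (stand-in) EEG classifier's
    confidence for the 'select' mental state exceeds the threshold."""
    cursor = (gaze.x, gaze.y)
    select = eeg_select_prob >= threshold
    return cursor, select

# Gaze at (0.4, 0.7) with a confident EEG 'select' triggers a selection;
# the same gaze with low confidence only moves the cursor.
cursor, select = hybrid_step(GazeSample(0.4, 0.7), eeg_select_prob=0.9)
```

Thresholding the classifier output is one simple way to keep unintended EEG activations from firing commands while the user is merely looking around; real systems typically add dwell times or smoothing on top.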

Figure 3. Experimental set-up.

The proposed interface enables both disabled and fully capable people to complete their tasks with various commands in 3D physical space. Through this low-cost, easily wearable device, users can control their flight naturally and easily in everyday life. Our study has also successfully demonstrated the potential of hybrid control for hands-free applications in both populations.

Byung Hyung Kim
Research Assistant Professor

Ph.D. in Computer Science at KAIST. Current research interests include algorithmic transparency, interpretability in affective intelligence, computational emotional dynamics, cerebral asymmetry and the effects of emotion on brain structure for affective computing, brain-computer interface, and assistive and rehabilitative technology.