NeuroFly

The NeuroFly simulator is a pilot project between AFormX and the DDTLab Research Laboratory, carried out as part of the Network of Research Arts and Culture Centers (RUK) project. The aim of the project is to combine our VR flight simulator with a brain-computer interface (BCI), which lets the end user communicate with the simulator directly through brain activity and thus steer the aircraft in the simulation. It is a hands-off flight simulator.

DESIGN PRINCIPLES

The NeuroFly simulator consists of several individual components, which together provide an immersive and unique user experience. The user’s first contact with the NeuroFly simulator is a chair with a comfortable seat that was designed with several goals in mind:

  • it is mobile (and can be easily transported in a van);
  • it is functional and comfortable to sit in;
  • it has all the VR technology built in;
  • it offers insulation from noise and other disturbances present at fairs and other exhibitions;
  • it has a futuristic look and feel that complements the R&D nature of the project.

The chair is designed in a way that helps to isolate the user from the outside world and helps him/her focus on the flight experience. With the VR goggles on, the immersion is full and complete.

NeuroFly Chair offers the user full and complete immersion.

The user operates the aircraft via the Unicorn Brain Computer Interface (Unicorn BCI), manufactured by the Austrian company g.tec, which specializes in invasive and non-invasive brain-computer interfaces and neurotechnology.

The Unicorn BCI device contains 8 electrodes that measure the electrical activity of the brain on the surface of the head using the EEG method. EEG, or electroencephalography, is a method for measuring the activity of large groups of cortical neurons.

In addition to the electrodes, the Unicorn BCI device also includes a set of computer programs with which we can monitor the measured signals and select elements on the displayed keyboard with the help of the P300 signal.
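The selection relies on the P300 response, a positive deflection in the EEG roughly 300 ms after a stimulus the user is attending to. As a rough illustration of the general technique (not g.tec's actual algorithm), a minimal P300-based selection could average the EEG epochs recorded after each symbol's flashes and pick the symbol with the strongest response in the expected time window; the sampling rate, window, and scoring below are our own assumptions:

```python
# Minimal sketch of P300-based selection (illustrative, not g.tec's algorithm):
# epochs time-locked to each symbol's flashes are averaged, and the symbol
# whose average shows the strongest deflection around 300 ms wins.
import numpy as np

FS = 250                     # assumed sampling rate in Hz
P300_WINDOW = (0.25, 0.45)   # seconds after the flash where the P300 is expected


def pick_symbol(epochs_per_symbol):
    """epochs_per_symbol maps a symbol to an array of shape
    (n_flashes, n_channels, n_samples): EEG epochs time-locked to that
    symbol's flashes."""
    start = int(P300_WINDOW[0] * FS)
    stop = int(P300_WINDOW[1] * FS)
    scores = {}
    for symbol, epochs in epochs_per_symbol.items():
        avg = epochs.mean(axis=0)                    # average over flashes
        scores[symbol] = avg[:, start:stop].mean()   # mean amplitude in the window
    return max(scores, key=scores.get)
```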

The NeuroFly simulator uses two of these programs:
– Unicorn Recorder shows the time course of the signal on each individual electrode. It is started when the electrode cap is placed on the user’s head, and a measurement is taken to make sure that the signal on the electrodes is good.
– Unicorn Speller is used during the simulation. The program includes a keyboard with flashing characters and algorithms that analyze the brain signals and identify which symbol the user has chosen in his/her mind. The selected symbol is then sent to another application via the UDP protocol, as sketched below.
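As an illustration of the receiving side, the sketch below shows how an application such as the simulator could listen for the symbols sent by Unicorn Speller over UDP. The port number and the assumption that each datagram carries one UTF-8 encoded symbol are ours, not documented Unicorn Speller settings:

```python
# Hedged sketch of a simulator-side UDP listener for speller output.
import socket

UDP_IP = "127.0.0.1"
UDP_PORT = 1000   # placeholder; use whatever port the speller is configured to send to

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind((UDP_IP, UDP_PORT))

while True:
    data, _addr = sock.recvfrom(1024)
    symbol = data.decode("utf-8").strip()
    print("Speller selected:", symbol)   # hand the symbol to the flight model here
```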

NeuroFly user experience mixes BCI and VR technologies

For the needs of the NeuroFly project, we designed our own keyboard, which contains characters that can be used to control the flight of an aircraft in the simulation. Due to the specifics of the P300 signal, the characters on the keyboard must flash; when a character flashes, the NeuroFly keyboard displays the face of one of the NeuroFly team members in its place. Showing human faces is a trick that helps improve selection using only thoughts, as the human brain is very good at recognizing faces and even has a brain center dedicated to just that task.
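The actual characters on the NeuroFly keyboard are not listed here, so the symbols and command names in the sketch below are purely illustrative of how a received symbol could be translated into a flight command:

```python
# Illustrative only: these symbols and commands are placeholders, not the
# real NeuroFly keyboard layout.
FLIGHT_COMMANDS = {
    "U": "pitch_up",
    "D": "pitch_down",
    "L": "bank_left",
    "R": "bank_right",
    "T": "throttle_up",
    "B": "throttle_down",
}


def to_command(symbol):
    """Translate a symbol received from the speller into a flight command,
    ignoring anything that is not on the keyboard."""
    return FLIGHT_COMMANDS.get(symbol)
```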

NeuroFly board

The user operates the NeuroFly aircraft via the brain-computer interface using only his/her own brain activity, while simultaneously observing the progress of the flight through the VR goggles. The operation of BCI devices is based on the interaction between two adaptive controllers: the user, who must learn to intentionally elicit the brain signals that trigger a command, and the BCI system, which must translate these signals into commands and execute them. Operating a brain interface is therefore a skill that both the user and the system must learn by constantly adapting to each other.
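As a final, purely illustrative sketch of the "execute them" step, the low-rate symbols coming from the BCI could be turned into continuous control by letting each command nudge a target attitude and throttle that the flight model then holds until the next selection arrives; the state variables and step sizes are our own assumptions:

```python
# Sketch of turning discrete BCI selections into continuous control.
# Step sizes are arbitrary choices, not values from the NeuroFly project.
from dataclasses import dataclass


@dataclass
class ControlState:
    pitch_deg: float = 0.0
    bank_deg: float = 0.0
    throttle: float = 0.5   # range 0.0 .. 1.0


def apply_command(state: ControlState, command: str) -> ControlState:
    if command == "pitch_up":
        state.pitch_deg += 2.0
    elif command == "pitch_down":
        state.pitch_deg -= 2.0
    elif command == "bank_left":
        state.bank_deg -= 5.0
    elif command == "bank_right":
        state.bank_deg += 5.0
    elif command == "throttle_up":
        state.throttle = min(1.0, state.throttle + 0.1)
    elif command == "throttle_down":
        state.throttle = max(0.0, state.throttle - 0.1)
    return state
```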