An efferent visual motion-onset (motion away from the display center) BCI paradigm was developed by Jair PEREIRA Junior, allowing for single-trial operation of an eight-command application (here a simple 1-8 digit speller).
An afferent visual motion-onset (motion towards the display center) BCI paradigm was developed by Caio TEIXEIRA, allowing for single-trial operation of a six-command application (here a simple 1-6 digit speller).
This video is about our latest development by Daiki AMINAKA: a cVEP BCI with 16 commands and a 40 Hz carrier frequency (getting closer to the magic 50 Hz).
This video is about direct brain control of two robots with visual cVEP BCIs. The project is a collaboration with the Cichocki Lab at RIKEN BSI.
This video is about direct brain control of two robots with auditory and tactile BCIs. The project is a collaboration with the Cichocki Lab at RIKEN BSI.
This video is about robot control with eight commands using a cVEP BCI. In the demo the user controls the robot with the cVEP BCI. The project is a collaboration with the Cichocki Lab at RIKEN BSI.
These videos are about the control of two robots with cVEP and tactile BCIs. In each demo video the left user controls his robot with a tactile BCI (note the tactile push-pin stimulator under his right palm), while the right user controls his robot with the cVEP BCI (see the second frame on the floor with cVEP patterns). The project is a collaboration with the Cichocki Lab at RIKEN BSI, and there is definitely more to come soon!!! Stay tuned ;)
This video presents our first attempt to control the humanoid robot NAO using a tactile-pressure BCI developed by Kensuke SHIMIZU & Tomek. The paradigm still relies on averaging seven oddball sequences per command, which is why it is still rather slow. The click noises are "side effects" of the tactile-pressure stimulus generator (this is also to be improved soon). The project is a collaboration with the Cichocki Lab at RIKEN BSI. More to come soon!!! Stay tuned ;)
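As a rough illustration of why sequence averaging slows the interface down, here is a minimal Python sketch (not the project code; channel counts, sample counts, and function names are assumptions): each command decision has to wait for all oddball sequences before their epochs can be averaged into a cleaner evoked response.

import numpy as np

def average_oddball_sequences(epochs):
    # epochs: array of shape (n_sequences, n_channels, n_samples),
    # one evoked-response epoch per oddball sequence
    return epochs.mean(axis=0)

# Seven sequences per decision (as in the demo above) means the classifier
# only sees the averaged response after all seven have been presented,
# so fewer sequences would make the BCI faster but noisier.
rng = np.random.default_rng(0)
epochs = rng.normal(size=(7, 16, 256))   # 7 sequences, 16 EEG channels, 256 samples
averaged_response = average_oddball_sequences(epochs)
print(averaged_response.shape)           # -> (16, 256)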
This video is about chromatic SSVEP BCI-based NAO robot control. It is the result of a collaboration between the BCI-lab-group at the University of Tsukuba (Daiki AMINAKA, Kensuke SHIMIZU & Dr. Tomasz "Tomek" M. RUTKOWSKI) and the Cichocki Lab at RIKEN BSI (Peter JURICA and Dr. Andrzej CICHOCKI). This time we managed to control the NAO robot online using the chromatic SSVEP BCI developed by Daiki AMINAKA and Tomasz M. RUTKOWSKI.
This video is about tactile-body BCI-based NAO robot control. It is the result of a collaboration between the BCI-lab-group at the University of Tsukuba (Takumi KODAMA, Kensuke SHIMIZU & Dr. Tomasz "Tomek" M. RUTKOWSKI) and the Cichocki Lab at RIKEN BSI (Peter JURICA and Dr. Andrzej CICHOCKI). This time we managed to control the NAO robot online using the tactile-body BCI developed by Takumi KODAMA, under the supervision and advice of Dr. Tomasz M. RUTKOWSKI, with only four trials (sequences) averaged per command, which matched the robot's command execution.
This video is about a spatial auditory BCI speller developed by Moonjeong CHANG as her graduation project. We present a proof of concept of a Japanese kana (45 characters) speller developed in the form of a two-step-input spatial auditory BCI.
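To illustrate the two-step idea, here is a minimal Python sketch (the grouping below is only an assumed layout, not Moonjeong CHANG's actual design): the first BCI selection picks a kana group and the second picks a character within it, so a large alphabet is covered by two small-command-count decisions instead of one 45-way decision.

# Assumed illustrative grouping of kana into rows; the real speller layout may differ.
KANA_GROUPS = [
    list("あいうえお"), list("かきくけこ"), list("さしすせそ"),
    list("たちつてと"), list("なにぬねの"), list("はひふへほ"),
    list("まみむめも"), list("やゆよ"), list("らりるれろ"),
    list("わをん"),
]

def spell(group_choice, character_choice):
    # Two consecutive BCI selections: first the group, then the character.
    return KANA_GROUPS[group_choice][character_choice]

print(spell(1, 2))  # step 1: the "ka" row, step 2: its third item -> く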
This video presents the autdBCI controlling a small robot. We present the results of a study in which contactless, airborne ultrasonic tactile display (AUTD) stimuli delivered to the user's palms serve as a platform for a brain-computer interface (autdBCI) paradigm. Six palm positions are used to evoke somatosensory brain responses in order to define a novel contactless tactile autdBCI. The autdBCI won The BCI Research Award 2014 (http://www.bci-award.com/).
This demo was made in collaboration with the Cichocki Lab at RIKEN Brain Science Institute, Japan.
In this video we show a preliminary trial of a research project in progress in which chromatic and higher-frequency SSVEP stimuli are used.
"Spatial Tactile Brain-Computer Interface Paradigm Applying Vibration Stimuli to Large Areas of User's Back" - demo video accompanying a BCI Conference 2014 paper available at http://arxiv.org/abs/1404.4226. The video demonstrates the online tbBCI application with healthy users.
The second collaboration output in our research group: Waldir's virtual reality (neurogaming) app combined with Hiromu's tbaBCI and Kensuke's tfBCI paradigms, under Tomek's project leadership.
A perfect example of collaboration: Waldir's virtual reality (neurogaming) app combined with Hiromu's tbaBCI paradigm, developed in the BCI-lab-group at the University of Tsukuba, Japan, under Tomek's project leadership.
A tactile-force BCI demo with a small vehicular robot by Shota & Tomek (please note that it is the joystick that moves to produce the tactile-force stimulus, not the hand, as can be seen at the end of the demo movie).
Chest Tactile BCI (tBCI) for Vehicle Robot Navigation by Hiromu & Tomek (please note the vibrotactile transducers attached to the user's chest, which also generate acoustic noise as a side effect).