We used special mobile devices to capture EEG data from the participants' frontal lobes. The backend software analyzes the raw data and converts it into two meaningful signals: "Focus" and "Meditation". These values are sent to the visualizer software, which creates real-time particle-movement visuals. This project was developed as a stage performance, so everything ran in real time: every device streamed its data continuously, and the visualization was generated simultaneously.
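As a rough illustration of the data flow, each reading could be serialized and pushed to the visualizer as a small datagram. This is only a sketch: the field names, the JSON-over-UDP transport, and the `make_packet` helper are assumptions for illustration, not the project's actual protocol.

```python
import json
import socket
import time

def make_packet(device_id, focus, meditation):
    """Serialize one headset reading as a JSON payload (hypothetical format)."""
    return json.dumps({
        "device": device_id,
        "focus": focus,          # 0-100 style value
        "meditation": meditation,
        "ts": time.time(),
    }).encode("utf-8")

def send_reading(sock, addr, device_id, focus, meditation):
    """Fire one reading at the visualizer (assumed to listen for UDP datagrams)."""
    sock.sendto(make_packet(device_id, focus, meditation), addr)

# Example usage:
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_reading(sock, ("127.0.0.1", 9000), 1, 72, 40)
```

UDP fits this kind of live performance because a lost packet is simply replaced by the next reading a moment later, with no blocking on retransmission.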
I used NeuroSky Mindwave Mobile 2 devices to capture data from the participants' frontal lobes. These devices transmit their signals over a Bluetooth connection. I prepared six circuit boards with Bluetooth modules and microcontrollers, all of which were connected to the server PC. The server PC ran the backend software, which analyzed the data and smoothed it with low-pass filter algorithms. The backend software then sent all of the processed data to the PC running the visualizer software.
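One common way to smooth a noisy real-time signal is an exponential moving average, a simple low-pass filter. The project's actual filtering algorithm isn't specified, so the snippet below is a minimal sketch of that general technique, with a hypothetical `alpha` smoothing factor:

```python
def smooth(samples, alpha=0.2):
    """Low-pass filter a sequence of values with an exponential moving average.

    alpha in (0, 1]: smaller values give heavier smoothing (more lag),
    larger values track the raw signal more closely.
    """
    out = []
    y = None
    for x in samples:
        # First sample initializes the filter; afterwards blend new and old.
        y = x if y is None else alpha * x + (1 - alpha) * y
        out.append(y)
    return out
```

In a live setup this would run sample-by-sample as readings arrive, keeping only the previous output value as state, so the particle visuals respond to trends rather than jitter.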
Hi there, we are looking to visualize some EEG data in a particle animation for an installation at an event and came across your work for Roche. Did the headsets need to be configured for each person, or could you switch people easily? How long would it take to set up the app to do something similar based on tracking people's emotions when they perform certain actions? Thanks, Ash