Category: Programming

Real-Time Brain Wave Analyzer

The EEG (electroencephalogram) is a neurological test which can reveal abnormalities in a person's brain waves. The EEG device is traditionally found only in medical facilities, and most people will take an EEG test at least once in their lives. EEG devices have a few dozen electrical sensors which read brain activity and record it for later analysis.

Traditional EEG device

Some years ago, a few portable, consumer-oriented EEG devices appeared, one of them being the Emotiv Epoc, a 14-channel wireless EEG device which, although not comparable to a professional EEG, allows for some interesting brain wave experiments and visualizations. Interestingly, unlike traditional EEG devices, which only record brain waves for medical analysis of brain health, the portable device also provides some basic facilities for coarse “mind reading”: through some clever real-time analysis of the user's brain activity, it can most of the time, with some effort, detect a few limited “thoughts” like push, pull and move. So the user can, again in a very limited way, effectively control the computer with their mind. It even provides an SDK for advanced users and programmers to develop their own applications.

That is all cool, but it was not really what I was looking for. I wanted more low-level device access: direct to the metal, raw sensor readings for research purposes. I posted a new YouTube video showing the first prototype of a real-time brain wave analyzer that I just started to develop for personal AI research purposes.

At the time of recording, I was wearing an Emotiv Epoc and the waves were real, raw EEG data being read from my own brain. I used a hacked low-level driver (on Linux) to get complete access to the raw sensors of the device, instead of using its built-in software, which provides only limited access to its sensor readings. The hacked driver was not written by me, though: when searching for low-level Epoc protocol info, I found Emokit-c, which already opened the full access that I wanted. From there I connected the device data stream to the 3D engine and built the first prototype over the weekend.
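
For readers curious about the general shape of that pipeline, here is a minimal Python sketch. It only illustrates the idea of buffering per-channel raw values before handing them to a renderer; the `read_raw_sample()` function and the way the driver is accessed are hypothetical and are not the actual Emokit-c API.

```python
import collections
import random  # stands in for the real device stream in this sketch

# Channel list mirroring the 14-channel Epoc layout; in the real prototype
# the names and ordering come from the device protocol itself.
CHANNELS = ["AF3", "F7", "F3", "FC5", "T7", "P7", "O1",
            "O2", "P8", "T8", "FC6", "F4", "F8", "AF4"]

BUFFER_LEN = 512  # samples kept per channel for the on-screen wave

# One ring buffer per channel so the renderer always sees the latest window.
buffers = {ch: collections.deque(maxlen=BUFFER_LEN) for ch in CHANNELS}

def read_raw_sample():
    """Placeholder for one raw sample per channel from the low-level driver.

    In the real prototype this would come from the hacked Emokit-c driver;
    here we just fake values so the sketch runs standalone.
    """
    return {ch: random.randint(0, 16383) for ch in CHANNELS}

def pump_once():
    """Read one sample and append it to every channel's ring buffer."""
    sample = read_raw_sample()
    for ch, value in sample.items():
        buffers[ch].append(value)

if __name__ == "__main__":
    for _ in range(BUFFER_LEN):
        pump_once()
    # A real renderer (the 3D engine in my case) would draw these buffers
    # as waveforms every frame; here we just print the latest values.
    print({ch: buffers[ch][-1] for ch in CHANNELS[:3]})
```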

For now the prototype is still rough, just showing raw waves with no further processing. In the near future, I plan to have a Neural Network connected to this raw EEG analyzer, learning patterns from thoughts and emotions and doing more useful (possibly serious, medical-related) things.

Although the prototype looks like just a graphics demo, as said above it is not just fancy rendering: it is talking to a real device and getting real raw EEG data from its sensors, which will later be processed by a Neural Network for serious purposes.

More about this in the future, when time permits.

Mobile Game and Neural Networks

At the moment I am only working on Etherea² in my spare time, on weekends. On normal work days I am programming a new (and completely unrelated) commercial mobile game, and I'm honestly happy to have paid work, so things keep going nicely. The game is relatively simple and nice, a real-time strategy game, but I won't give any further details as it is not my property and I normally don't comment on commercial work.

But back to weekend programming: I've been creating my own Deep Neural Network library. It is not tied to or dependent on any particular engine, so it can be used both in database-related systems (which I worked on in the past and have been contemplating again) and in VR/games.

Neural Networks are awesome, as they try to emulate how real neurons work. The artificial ones also have dendrites (which I simplify to input/output ports) and synapses (which I simplify to connections between different neurons), and from that simple structure some really interesting things can be created. As you probably know already, they can be used in many different domains: vision, voice and general cognition are some of them.
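
To make that structure concrete, here is a tiny Python sketch of a single artificial neuron; this is only an illustration of the concept, not code from my library, and the class and parameter names are my own for this example.

```python
import math

class Neuron:
    """Toy neuron: inputs play the role of dendrites, weighted links of synapses."""

    def __init__(self, weights, bias):
        self.weights = weights  # one weight per incoming connection (synapse)
        self.bias = bias

    def activate(self, inputs):
        # Weighted sum of the inputs followed by a sigmoid squashing function.
        total = sum(w * x for w, x in zip(self.weights, inputs)) + self.bias
        return 1.0 / (1.0 + math.exp(-total))

# Two input values feeding one neuron through two synapses.
n = Neuron(weights=[0.8, -0.5], bias=0.1)
print(n.activate([1.0, 0.3]))  # a single scalar output in (0, 1)
```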

For now, my implementation is entirely CPU based. When it becomes rock solid and fully featured, I will consider moving some or maybe all parts to a GPU implementation using Compute Shaders. Right now I'm still satisfied with CPU performance, so there is no urgency to move to the GPU yet.

The following screenshot is taken from a small network visualized in Unity3D, so I can quickly test and confirm that the actual topologies and synapses are being created correctly.

The above network is a feedforward one, showing an extremely simple (but correct and useful) neural net in this case; however, the underlying structure can automatically build and interconnect a neural net of any size (limited only by memory), using either Perceptrons or Sigmoid neurons.
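
As a rough illustration of that idea (and not the actual library code), here is a small Python sketch that builds a fully connected feedforward net of arbitrary layer sizes and runs a forward pass with either step (perceptron-style) or sigmoid activations; the layer sizes and function names are made up for the example.

```python
import math
import random

def step(x):
    """Perceptron-style activation: hard threshold."""
    return 1.0 if x >= 0.0 else 0.0

def sigmoid(x):
    """Sigmoid activation: smooth, differentiable squashing."""
    return 1.0 / (1.0 + math.exp(-x))

def build_network(layer_sizes):
    """Create random weights/biases for a fully connected feedforward net.

    layer_sizes like [14, 8, 3] means 14 inputs, one hidden layer of 8 neurons,
    and 3 outputs; every neuron connects to all neurons in the next layer.
    """
    layers = []
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        weights = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]
        biases = [random.uniform(-1, 1) for _ in range(n_out)]
        layers.append((weights, biases))
    return layers

def forward(layers, inputs, activation=sigmoid):
    """Propagate the inputs through each layer in turn (the feedforward pass)."""
    values = inputs
    for weights, biases in layers:
        values = [activation(sum(w * v for w, v in zip(row, values)) + b)
                  for row, b in zip(weights, biases)]
    return values

net = build_network([14, 8, 3])        # e.g. 14 inputs in, 3 outputs
print(forward(net, [0.5] * 14))        # sigmoid neurons
print(forward(net, [0.5] * 14, step))  # perceptron-style neurons
```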

As for usage, I have a list of plans for it. One of them is life and behavior simulation in Etherea²; the other is a robot which is now waiting for a second iteration of development.

More on this later. Thanks for reading.