At the moment I am only working on Etherea² in my spare time, at weekends. On normal workdays I am programming a new (and completely unrelated) commercial mobile game, and I’m honestly happy to have paid work, so things keep going nicely. The game is relatively simple and nice — a real-time strategy game — but I won’t give any further details, as it is not my property and I normally don’t comment on commercial work.

But back to weekend programming: I’ve been creating my own Deep Neural Network library. It is not tied to or dependent on any particular engine, so it can be used both in database-related systems (which I worked on in the past and have been contemplating again) and in VR/Games.

Neural Networks are awesome, as they try to emulate how real neurons work. The artificial ones also have dendrites (which I reduce to input/output ports) and synapses (which I reduce to connections between different neurons), and from that simple structure some really interesting things can be created. As you probably know already, they can be used in many different domains: vision, speech and general cognition are some of them.
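To make the analogy concrete, here is a minimal sketch of a single artificial neuron, the way I think about it: weighted connections (the "synapses") feed into a summed input, a bias shifts it, and an activation function produces the output. This is illustrative only and not the library's actual code; the names and structure are my own assumptions.

```python
import math

def sigmoid(x):
    """Squash the weighted sum into the (0, 1) range."""
    return 1.0 / (1.0 + math.exp(-x))

class Neuron:
    """One artificial neuron: weighted inputs ("synapses"), a bias,
    and an activation function producing a single output."""
    def __init__(self, weights, bias, activation=sigmoid):
        self.weights = weights
        self.bias = bias
        self.activation = activation

    def fire(self, inputs):
        # Weighted sum of inputs plus bias, then pass through the activation.
        total = sum(w * x for w, x in zip(self.weights, inputs)) + self.bias
        return self.activation(total)

# Example: a neuron with two input "dendrites".
n = Neuron(weights=[0.5, -1.0], bias=0.1)
print(n.fire([1.0, 0.5]))
```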

For now, my implementation is all CPU based. When it becomes rock solid and fully featured, I will consider moving some or maybe all parts to a GPU implementation using Compute Shaders. Right now I’m still satisfied with CPU performance, so there is no urgency for a GPU port yet.

The following screenshot is taken from a small network visualized in Unity3D — so I can quickly confirm that the topologies and synapses are being created correctly.

The above network is a feedforward one, showing an extremely simple (but correct and useful) neural net in this case. However, the underlying structure can automatically build and interconnect a neural net of any size (limited only by memory), using either Perceptron or Sigmoid neurons.
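For readers curious what "build and interconnect a net of any size, with either neuron type" might look like, here is a hedged sketch of a fully connected feedforward builder. It is not the library's actual API; the class name, layer layout and random initialization are assumptions made for illustration.

```python
import math
import random

def step(x):
    """Perceptron activation: fires 1 when the weighted sum crosses the threshold."""
    return 1.0 if x >= 0.0 else 0.0

def sigmoid(x):
    """Sigmoid activation: a smooth, graded alternative to the hard step."""
    return 1.0 / (1.0 + math.exp(-x))

class FeedforwardNetwork:
    """Fully connected feedforward net; layer_sizes like [2, 3, 1]."""
    def __init__(self, layer_sizes, activation=sigmoid):
        self.activation = activation
        # One weight matrix and bias vector per pair of adjacent layers.
        self.layers = []
        for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
            weights = [[random.uniform(-1, 1) for _ in range(n_in)]
                       for _ in range(n_out)]
            biases = [random.uniform(-1, 1) for _ in range(n_out)]
            self.layers.append((weights, biases))

    def forward(self, inputs):
        # Propagate the signal layer by layer through every synapse.
        signal = list(inputs)
        for weights, biases in self.layers:
            signal = [self.activation(
                          sum(w * x for w, x in zip(row, signal)) + b)
                      for row, b in zip(weights, biases)]
        return signal

# A tiny 2-3-1 sigmoid network; pass activation=step for Perceptron-style neurons.
net = FeedforwardNetwork([2, 3, 1], activation=sigmoid)
print(net.forward([1.0, 0.0]))
```

The only real design point here is that the neuron type is just the activation function, so the same topology code serves both Perceptrons and Sigmoid neurons.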

As for usage, I have a list of plans for it. One of them is life and behavior simulation in Etherea²; another is a robot that is now waiting for a second iteration of development.

More on this later. Thanks for reading.