Author: imerso

Neural Networks

I have been building a Deep Neural Network library on weekends. It is not tied to any particular engine, so it can be used both in database-related systems and in VR/games.

Neural Networks are awesome, as they try to emulate how real neurons work. The artificial ones also have dendrites (which I reduce to input/output ports) and synapses (which I reduce to connections between different neurons), and from that simple structure some really interesting things can be created. As you probably know already, they can be used in many different domains: vision, voice and general cognition are some of them.
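
The library itself is not shown here, but as a rough C++ sketch of the idea (weighted input connections plus an activation function; the names and structure below are illustrative, not my actual API), a single artificial neuron could look like this:

    #include <cmath>
    #include <iostream>
    #include <vector>

    // Minimal sketch of one artificial neuron (illustrative only).
    // Each incoming connection ("synapse") has a weight; the neuron sums its
    // weighted inputs plus a bias and passes the result through an activation.
    struct Neuron
    {
        std::vector<double> weights;  // one weight per input connection
        double bias = 0.0;

        // Sigmoid activation: squashes the weighted sum into (0, 1).
        double activate(const std::vector<double>& inputs) const
        {
            double sum = bias;
            for (size_t i = 0; i < inputs.size(); ++i)
                sum += weights[i] * inputs[i];
            return 1.0 / (1.0 + std::exp(-sum));
        }
    };

    int main()
    {
        Neuron n{{0.8, -0.4}, 0.1};                  // two inputs, hand-picked weights
        std::cout << n.activate({1.0, 0.5}) << "\n"; // output of a single forward pass
    }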

For now, my implementation is all CPU based. When it becomes rock solid and fully featured, I will consider moving some or maybe all parts to the GPU using Compute Shaders. Right now I am still satisfied with CPU performance, so there is no urgency to port it to the GPU yet.

The following screenshot is taken from a small network visualized in Unity3D — so I can quickly test/confirm that the actual topologies and synapses are being created correctly.

The above network is a feedforward one, an extremely simple (but correct and useful) neural net in this case. The underlying structure, however, can automatically build and interconnect a neural net of any size (limited only by memory), using either perceptron or sigmoid neurons.
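
To give an idea of what "any size, perceptron or sigmoid" means in practice, here is a small self-contained sketch (again illustrative, not the library's real structure) that assembles fully connected feedforward layers from a list of layer sizes and runs a forward pass:

    #include <cmath>
    #include <cstdlib>
    #include <iostream>
    #include <vector>

    // Sketch only: a fully connected feedforward network built from a list of
    // layer sizes, e.g. {3, 5, 2}. Each neuron in a layer connects to every
    // neuron of the previous layer. Activation can be a hard step (perceptron)
    // or a sigmoid, selected per network.
    enum class Activation { Perceptron, Sigmoid };

    struct Layer
    {
        // weights[j][i] = weight of the synapse from input i to neuron j
        std::vector<std::vector<double>> weights;
        std::vector<double> biases;
    };

    struct Network
    {
        std::vector<Layer> layers;
        Activation act;

        Network(const std::vector<int>& sizes, Activation a) : act(a)
        {
            for (size_t l = 1; l < sizes.size(); ++l)
            {
                Layer layer;
                layer.biases.assign(sizes[l], 0.0);
                layer.weights.assign(sizes[l], std::vector<double>(sizes[l - 1]));
                for (auto& row : layer.weights)
                    for (double& w : row)
                        w = (std::rand() / (double)RAND_MAX) * 2.0 - 1.0; // random [-1, 1]
                layers.push_back(layer);
            }
        }

        std::vector<double> forward(std::vector<double> x) const
        {
            for (const Layer& layer : layers)
            {
                std::vector<double> out(layer.biases.size());
                for (size_t j = 0; j < out.size(); ++j)
                {
                    double sum = layer.biases[j];
                    for (size_t i = 0; i < x.size(); ++i)
                        sum += layer.weights[j][i] * x[i];
                    out[j] = (act == Activation::Sigmoid)
                                 ? 1.0 / (1.0 + std::exp(-sum))
                                 : (sum >= 0.0 ? 1.0 : 0.0); // perceptron step
                }
                x = out;
            }
            return x;
        }
    };

    int main()
    {
        Network net({3, 5, 2}, Activation::Sigmoid); // 3 inputs, 5 hidden, 2 outputs
        for (double v : net.forward({0.2, 0.7, -0.1}))
            std::cout << v << " ";
        std::cout << "\n";
    }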

More on this later. Thanks for reading.


Dev Log opened

Hi there. So, after a few years, I have decided to reopen a public log. Here I will be posting about my personal progress in general, mostly about programming and robotics, and sometimes about real-life subjects. Welcome, and feel free to leave me a comment. Thanks.

Work on Etherea²

Etherea² for Unity3D can potentially handle a universe trillions of trillions of trillions of kilometers across, with millimeter accuracy everywhere, while Unity3D (and almost every standard 3D engine out there) handles scenes of about 64,000 meters out of the box. That size limit cannot fit a single Etherea² planet, much less its virtually infinite number of planets. So Etherea² effectively extends Unity3D's limits to handle more than the size of the observable universe.
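
How Etherea² does this internally is not covered here, but a common way to get past 32-bit float limits is to keep absolute positions in a higher-precision representation (for example a coarse sector index plus a local offset) and hand the renderer only small camera-relative offsets. A minimal sketch of that general idea, with made-up types and an arbitrary sector size:

    #include <cstdint>
    #include <cstdio>

    // Sketch of one common way past float (and even double) precision limits:
    // split a position into a coarse integer sector plus a local double offset,
    // and only convert the camera-relative difference to 32-bit floats for
    // rendering. Generic illustration, not Etherea²'s actual internals.
    static const double kSectorSize = 1.0e9; // 1 million km per sector, in meters

    struct UniversePos
    {
        int64_t sx, sy, sz;   // sector indices (coarse grid over the universe)
        double  lx, ly, lz;   // local offset inside the sector, in meters
    };

    struct Vec3f { float x, y, z; };

    // Camera-relative offset: the sector difference is tiny near the camera,
    // so the subtraction stays well within double precision, and the final
    // narrowing to float is safe to feed to the renderer.
    Vec3f relativeToCamera(const UniversePos& p, const UniversePos& cam)
    {
        return { (float)((p.sx - cam.sx) * kSectorSize + (p.lx - cam.lx)),
                 (float)((p.sy - cam.sy) * kSectorSize + (p.ly - cam.ly)),
                 (float)((p.sz - cam.sz) * kSectorSize + (p.lz - cam.lz)) };
    }

    int main()
    {
        UniversePos camera = { 1234567, 42, -7,  5.0e8, 1.0, 2.0 };
        UniversePos rock   = { 1234567, 42, -7,  5.0e8 + 1.25, 1.0, 2.003 };

        // Subtracting absolute float positions this far from the origin would
        // lose the 1.25 m entirely; the relative offset keeps millimeter detail.
        Vec3f rel = relativeToCamera(rock, camera);
        std::printf("camera-relative: %.4f %.4f %.4f\n", rel.x, rel.y, rel.z);
    }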

But it does not stop there: the size of the universe is just one of the many challenges that Etherea² solves. It can render a 1 cm ant very close to the camera while also rendering a 13,000 km wide planet immediately behind it, plus another planet a million kilometers behind both, without any z-buffer fighting. You can also leave an object on one planet, fly to a different star system light-years away, explore another planet, then come back to the first and find the same object in the exact same place, without any interruptions or loading screens. And you don't need a supercomputer for that; in fact you could be doing it on your Android phone, at a reduced resolution and quality compared to the desktop version, but it does run on high-end mobiles (S6 level and above).
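
Drawing a 1 cm ant and planets thousands or millions of kilometers away in the same frame is largely a depth-precision problem. One common technique (whether or not it is what Etherea² actually uses) is to split the view distance into a few depth slices with a bounded far/near ratio, render them back to front, and clear the depth buffer between slices. The sketch below only computes such a partition:

    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Sketch: partition an enormous view distance into depth slices whose
    // far/near ratio stays small enough for a standard z-buffer. Each slice
    // would be rendered back to front with its own near/far planes, clearing
    // the depth buffer in between. Generic technique, not Etherea² internals.
    struct Slice { double nearPlane, farPlane; };

    std::vector<Slice> partitionDepth(double nearPlane, double farPlane, double maxRatio)
    {
        // Number of slices needed so that each slice's far/near <= maxRatio.
        int count = (int)std::ceil(std::log(farPlane / nearPlane) / std::log(maxRatio));
        std::vector<Slice> slices;
        double start = nearPlane;
        for (int i = 0; i < count; ++i)
        {
            double end = (i == count - 1) ? farPlane : start * maxRatio;
            slices.push_back({ start, end });
            start = end;
        }
        return slices;
    }

    int main()
    {
        // From 1 cm in front of the camera out to a planet 1e9 km (1e12 m) away,
        // keeping far/near <= 10^4 per slice (comfortable for a 24-bit z-buffer).
        for (const Slice& s : partitionDepth(0.01, 1.0e12, 1.0e4))
            std::printf("slice: near=%.2e  far=%.2e\n", s.nearPlane, s.farPlane);
    }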

It also supports Virtual Reality modes on Google Cardboard, GearVR and Oculus Rift.

As a reminder, here is a reference video from the very first Etherea¹. This old 64 KB tech demo was available for download in 2010-2011 and was written entirely in C++ and OpenGL:

Out of curiosity: even with texture and geometry compression, a single planet done by hand would take something like 4 GB of disk space. Etherea¹, however, was 100% procedural, and the entire tech demo, with 8 full planets and background music, used only 64 KB of disk space.
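
That size difference is the whole point of procedural generation: the content is computed on demand from a seed and a handful of parameters instead of being stored. As a generic illustration (not Etherea's actual generator), a few octaves of value noise are already enough to produce terrain heights anywhere on a planet:

    #include <cmath>
    #include <cstdint>
    #include <cstdio>

    // Sketch of why procedural content is so small on disk: terrain heights are
    // computed on demand from a seed and a few parameters, so nothing needs to
    // be stored. Generic fractal value noise, not Etherea's actual generator.
    static double hash2(int32_t x, int32_t y, uint32_t seed)
    {
        uint32_t h = seed;
        h ^= (uint32_t)x * 0x8da6b343u;
        h ^= (uint32_t)y * 0xd8163841u;
        h = (h ^ (h >> 13)) * 0x9e3779b1u;
        return (h & 0xffffffu) / (double)0xffffff; // pseudo-random value in [0, 1]
    }

    // Smoothly interpolated value noise at a continuous coordinate.
    static double valueNoise(double x, double y, uint32_t seed)
    {
        int32_t xi = (int32_t)std::floor(x), yi = (int32_t)std::floor(y);
        double fx = x - xi, fy = y - yi;
        double sx = fx * fx * (3.0 - 2.0 * fx);   // smoothstep weights
        double sy = fy * fy * (3.0 - 2.0 * fy);
        double a = hash2(xi, yi, seed),     b = hash2(xi + 1, yi, seed);
        double c = hash2(xi, yi + 1, seed), d = hash2(xi + 1, yi + 1, seed);
        double top    = a + (b - a) * sx;
        double bottom = c + (d - c) * sx;
        return top + (bottom - top) * sy;
    }

    // Fractal Brownian motion: several octaves of noise at increasing frequency.
    double terrainHeight(double x, double y, uint32_t seed)
    {
        double height = 0.0, amplitude = 1.0, frequency = 1.0;
        for (int octave = 0; octave < 6; ++octave)
        {
            height += amplitude * valueNoise(x * frequency, y * frequency, seed + octave);
            amplitude *= 0.5;
            frequency *= 2.0;
        }
        return height;
    }

    int main()
    {
        // The whole "planet" here is the seed plus this function: heights are
        // generated on demand instead of being read from multi-gigabyte assets.
        std::printf("%f\n", terrainHeight(12.34, 56.78, 1337));
    }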

Anyway, I ended up stopping all work on Etherea¹ (all versions, including the Unity3D one) around 2013. My day job consumed me entirely and I could not keep up the side work on Etherea, so it stayed frozen for a few years.

I have been working on Etherea² on and off for a few months now. Initially I was writing it in pure JavaScript/WebGL 2 (demo here: https://imersiva.com/demos/eo), but then I decided to jump back to Unity3D for its productivity, and also because it is commercially more interesting for me.

Here are two preliminary Etherea² videos:


Post-processing (shadows, HDR, bloom, reflections) was disabled in the video above; it will be fully enabled in the future.


This one has post-processing enabled.

Thanks for reading.