EDeN Update: Memory patches and speed improvements
Testing continues on the framework, which has led to:
- Memory leak fixes.
- Various speed improvements.
- Internal fallbacks to protect against failures caused by invalid hyperparameter options (a sketch of the idea follows this list).
- A much faster, optimised randomisation utility.
- Clearer display of Functome data (the entity's 'DNA', of sorts).
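A minimal, hypothetical Python sketch of that fallback idea follows; `SAFE_DEFAULTS`, `resolve_hyperparams`, and the option names are illustrative assumptions, not EDeN's actual interface.

```python
# Hypothetical sketch of an internal hyperparameter fallback: invalid or
# unknown user options revert to safe defaults instead of crashing a run.
SAFE_DEFAULTS = {
    "decay_rate": 0.05,      # per-step energy decay (illustrative)
    "branch_factor": 2,      # fractal expression branching (illustrative)
    "mutation_sigma": 0.1,   # stochastic tuning spread (illustrative)
}

def resolve_hyperparams(user_opts: dict) -> dict:
    """Merge user options over safe defaults, reverting any value that
    fails a basic sanity check rather than failing the whole run."""
    resolved = dict(SAFE_DEFAULTS)
    for key, value in user_opts.items():
        if key not in SAFE_DEFAULTS:
            continue  # unknown option: ignore rather than fail
        try:
            value = type(SAFE_DEFAULTS[key])(value)  # coerce to expected type
            if value <= 0:
                raise ValueError
            resolved[key] = value
        except (TypeError, ValueError):
            pass  # invalid option: keep the internal fallback default
    return resolved

# Usage: a bad decay_rate silently falls back to the default.
print(resolve_hyperparams({"decay_rate": -1, "branch_factor": 3}))
```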
EDeN Update: Unity volumetric connectome rendering
Significant enhancements to the backend DLLs, rendering methods, and other interfaces have now allowed for the following:
- Jeffbot – is now created by the user from the Python interface client-side (or any other asset, via the 'TrainingMetaData' settings); see the sketch after this list.
- Neuron renderer: now rendering 10,000 neurons, each with up to 20,000 axons/dendrites; this will improve drastically once again with compute-shader processing! That's up to 134,217,728 structures rendered on the active entity! Special thanks to Joel Rowney from Mjolnir Software for his shader mastery!
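As a rough illustration of that client-side workflow, here is a hedged Python sketch; `EdenClient`, the `TrainingMetaData` fields, and the JSON-over-socket protocol are all assumptions for illustration, not EDeN's real API.

```python
import json
import socket

class TrainingMetaData:
    """Illustrative settings bundle naming the asset to spawn as the
    entity's body and how many neural probes to attach to it."""
    def __init__(self, asset_name: str, probe_count: int = 8):
        self.asset_name = asset_name
        self.probe_count = probe_count

    def to_json(self) -> str:
        return json.dumps({"asset": self.asset_name,
                           "probes": self.probe_count})

class EdenClient:
    """Minimal socket client asking the Unity side to spawn an entity."""
    def __init__(self, host: str = "127.0.0.1", port: int = 9000):
        self.addr = (host, port)

    def spawn_entity(self, meta: TrainingMetaData) -> None:
        with socket.create_connection(self.addr) as sock:
            sock.sendall(meta.to_json().encode("utf-8"))

# Usage: spawn a Jeffbot-style entity from the Python side.
client = EdenClient()
client.spawn_entity(TrainingMetaData("Jeffbot", probe_count=16))
```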
EDeN Update: Unity engine rendering
Work progresses well in developing an entity training environment.
In the following GIF, we see:
- Jeffbot – a representation of the test entities, to which input and output neural probes are attached
- A training environment that spawns different challenges and all essential stimuli, either in the form of food or direct neural inputs (a sketch of this loop follows the list)
- Neuron renderer: rendering a few hundred neurons just starting to grow from initial conditions.
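The environment loop described above might look something like the toy sketch below; `environment_step` and its stimulus shapes are hypothetical stand-ins, not EDeN's actual implementation.

```python
import random

def environment_step(world: list, input_probes: list) -> None:
    """Each tick, either spawn food into the world or drive a signal
    directly into one of the entity's input probes (illustrative only)."""
    if random.random() < 0.5:
        # Food stimulus: place an energy source at a random position.
        world.append({"kind": "food",
                      "pos": (random.uniform(-10, 10),
                              random.uniform(-10, 10))})
    else:
        # Direct neural input: nudge a random probe with a small signal.
        probe = random.randrange(len(input_probes))
        input_probes[probe] += random.uniform(0.0, 1.0)

world, probes = [], [0.0] * 8
for _ in range(100):
    environment_step(world, probes)
print(len(world), "food items;", probes)
```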
Energy Decay Network (EDeN)
This paper and accompanying Python/C++ framework are the product of the author's perceived problems with narrow (discrimination-based) AI (Artificial Intelligence). The framework attempts to develop a genetic transfer of experience through potential structural expressions, using a common regulation/exchange value ('energy') to create a model whereby neural architecture and all unit processes are co-dependently developed. These expressions are born from fractal definition, stochastically tuned, and managed by genetic experience; successful routes are maintained through global rules: stability of signal propagation/function across cross-functional states (external state, immediate internal state, and a genetic bias towards selection of previous expressions). These principles are aimed at creating a diverse and robust network, hopefully reducing the need for transfer learning and computationally expensive translations as demand on compute increases.
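To make the shared 'energy' mechanic concrete, here is a minimal toy sketch assuming each unit holds a scalar energy that is exchanged along its connections and decays every step; the constants and class names are illustrative and not taken from the paper.

```python
import random

DECAY = 0.05       # fraction of energy lost per step (illustrative)
TRANSFER = 0.10    # fraction of energy pushed per downstream unit

class Unit:
    def __init__(self, energy: float = 0.0):
        self.energy = energy
        self.out = []  # downstream units

def step(units):
    # Compute exchanges from current energies, apply them, then decay,
    # so propagation and loss happen together each step.
    flows = [(u, v, u.energy * TRANSFER) for u in units for v in u.out]
    for u, v, amount in flows:
        u.energy -= amount
        v.energy += amount
    for u in units:
        u.energy *= (1.0 - DECAY)

# Two stimulated units feeding a third: energy propagates downstream
# while the total monotonically decays without fresh stimulus.
a, b, c = Unit(random.random()), Unit(random.random()), Unit(0.0)
a.out.append(c)
b.out.append(c)
for _ in range(10):
    step([a, b, c])
print(a.energy, b.energy, c.energy)
```

Under this rule, only routes that keep receiving stimulus remain energetic, which is one way to read the 'successful routes are maintained' criterion above.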