
EDeN Update: Dataset Texture processing overview and next steps

In the video below, I demonstrate the latest update; the project has been on hiatus for three months due to personal circumstances (moving home and caring for family):

  • The client interface can now request a root folder on the server for sourced datasets; this folder is automatically loaded into memory and selected for training.
  • A new command structure can set the position of the entity and the training object for stimulating input links.
  • Improved GUI allows inspecting the connectome separately, whilst providing a ‘mini-map’ style view.
  • Various bug fixes and speed-up improvements in the backend. Not shown in the video is a 60x speed-up in connectome rendering, especially useful for large-brained entities.

The results of this will then:

  • Be used as evidence in the next paper: Multi-agent learning (working title).
  • And, if stable: release to AWS Marketplace!
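The dataset-loading flow from the first bullet might look roughly like this from the client side. This is a minimal sketch only: `EdenClient`, its methods, and the folder path are hypothetical placeholder names, not the actual EDeN client API.

```python
# Hypothetical sketch of the client requesting a server-side dataset root.
# EdenClient, its methods, and the paths are illustrative placeholders,
# not the real EDeN interface.
from pathlib import Path


class EdenClient:
    """Toy stand-in for a client that asks the server to source a dataset root."""

    def __init__(self, host: str):
        self.host = host
        self.dataset_root = None

    def request_dataset_root(self, server_path: str) -> None:
        # In the real system this would be a network call to the server;
        # here we just record the server-side folder the client asked for.
        self.dataset_root = Path(server_path)

    def select_for_training(self) -> str:
        # The server would load the folder's textures into memory and mark
        # them as the active training source; we return a status string.
        if self.dataset_root is None:
            raise RuntimeError("no dataset root requested yet")
        return f"training source: {self.dataset_root}"


client = EdenClient("eden-server.local")
client.request_dataset_root("/datasets/textures")
print(client.select_for_training())
```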

Texture dataset processing overview

Read More

EDeN Update: Unity volumetric connectome rendering

Significant enhancements to the backend DLLs, rendering methods, and other interfaces now allow for the following:

  • Jeffbot – is now created by the user from the Python interface, client side (or any other asset, via the ‘TrainingMetaData’ settings).
  • Neuron renderer: Rendering 10,000 neurons, each with up to 20,000 axons/dendrites; this will improve drastically once again given compute shader processing!
    That’s up to 134,217,728 neurons rendered on the active entity! Special thanks to Joel Rowney from Mjolnir Software for his shader mastery!
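Client-side entity creation as described above might be sketched like this. All of the names here (`TrainingMetaData`, `spawn_entity`, the probe names) are assumptions for illustration, not the project's real interface:

```python
# Illustrative sketch of spawning an entity (e.g. Jeffbot) from training
# metadata via a Python client. TrainingMetaData and spawn_entity are
# placeholder names, not the real EDeN API.
from dataclasses import dataclass


@dataclass
class TrainingMetaData:
    asset_name: str       # which asset to instantiate, e.g. "Jeffbot"
    neuron_budget: int    # upper bound on neurons the renderer will draw
    probes: list          # names of input/output neural probes to attach


def spawn_entity(meta: TrainingMetaData) -> dict:
    """Pretend to send the metadata to the engine and return the new entity."""
    return {
        "asset": meta.asset_name,
        "neurons": 0,                 # the network grows from initial conditions
        "probes": list(meta.probes),
    }


jeff = spawn_entity(TrainingMetaData("Jeffbot", 10_000, ["vision_in", "motor_out"]))
print(jeff["asset"], jeff["probes"])
```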

Volumetric rendering preview

Read More

Update: EDeN Unity engine rendering updates

Work progresses well in developing an entity training environment.
Included in the following GIF, we see:

  • Jeffbot – a representation of test entities where input and output neural probes are attached
  • Training environment, which spawns different challenges and all essential stimuli, either in the form of food or direct neural inputs
  • Neuron renderer: rendering a few hundred neurons just starting to grow from initial conditions.
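The environment loop described above could be sketched as follows. This is a toy illustration under my own assumptions: the function names, probe name, and stimulus encoding are invented for the example, not taken from EDeN.

```python
# Toy sketch of the training-environment loop: each round the environment
# spawns a challenge and delivers stimulus either as "food" placed in the
# world or as a direct neural input. All names here are illustrative only.
import random


def spawn_challenge(rng: random.Random) -> dict:
    """Pick and describe a stimulus for this round, as the environment might."""
    kind = rng.choice(["food", "neural_input"])
    if kind == "food":
        # Food is placed at a random position for the entity to seek out.
        return {"kind": kind, "position": (rng.uniform(-1, 1), rng.uniform(-1, 1))}
    # Otherwise drive an input probe directly with a random activation.
    return {"kind": kind, "probe": "vision_in", "value": rng.random()}


rng = random.Random(0)
for step in range(3):
    challenge = spawn_challenge(rng)
    print(step, challenge["kind"])
```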
Read More

Energy Decay Network (EDeN)

This paper and accompanying Python/C++ framework are the product of the author’s perceived problems with narrow (discrimination-based) AI (Artificial Intelligence). The framework attempts to develop a genetic transfer of experience through potential structural expressions, using a common regulation/exchange value (‘energy’) to create a model whereby neural architecture and all unit processes are co-dependently developed. These expressions are born from fractal definition, stochastically tuned, and managed by genetic experience; successful routes are maintained through a global rule: stability of signal propagation/function across cross-functional factors (external state, internal immediate state, and genetic bias towards selection of previous expressions). These principles are aimed at creating a diverse and robust network, hopefully reducing the need for transfer learning and computationally expensive translations as demand on compute increases.
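The ‘energy’ exchange idea in the abstract can be illustrated with a toy update rule: units spend a shared energy currency to fire, decay passively, and are reinforced only when their signals propagate stably. This is my own minimal conceptual sketch with assumed constants, not the paper's actual model.

```python
# Minimal toy illustration of a shared "energy" currency regulating units:
# a unit spends energy to fire and decays otherwise, so only structures
# whose signals propagate stably retain energy over time. Conceptual
# sketch only; the constants and update rule are assumptions, not EDeN's.

DECAY = 0.05   # passive energy decay per step (assumed value)
COST = 0.2     # energy spent when a unit fires (assumed value)
REWARD = 0.3   # energy granted when firing propagates a signal (assumed value)


def step(energy: float, fired: bool, propagated: bool) -> float:
    """One energy update for a single unit."""
    energy -= DECAY * energy          # everything decays a little
    if fired:
        energy -= COST                # firing has a cost
        if propagated:
            energy += REWARD          # stable propagation is reinforced
    return max(energy, 0.0)           # a unit cannot go into energy debt


e = 1.0
for _ in range(10):
    e = step(e, fired=True, propagated=True)   # a stable route keeps its energy
print(round(e, 3))
```

Under these assumed constants a stably propagating unit converges toward a fixed energy level, while a unit that fires without propagating drains to zero, which is the intuition behind energy-maintained routes.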

Read on arXiv

Read on engrXiv

Read More