Notes from the field

New AI method increases the power of artificial neural networks

Although I was aware of the benefits of sparsely connected neural networks, this paper highlights an additional, slightly counter-intuitive property – scale-freeness – achieved through a method the authors call "Sparse Evolutionary Training" (SET). Starting from a sparse network, the model repeatedly adds new connections at random and drops the weakest existing ones, evolving into a topology that is harder to define but ultimately simpler, with fewer dominant "hubs" (scale-freeness). This less densely connected end state is quicker to stabilise (as it requires fewer calculations), and cheaper to train and maintain, than a traditional, statically connected network.
https://rob.al/2Ifd8eM
An international team of scientists from Eindhoven University of Technology, University of Texas at Austin, and University of Derby, has developed a revolutionary method that quadratically…
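The prune-and-regrow cycle described above can be sketched roughly as follows. This is a minimal illustration of the idea, not the authors' implementation: the function name, the regrowth initialisation, and the `zeta` fraction are all assumptions for the sketch.

```python
import numpy as np

def set_step(weights, mask, zeta=0.3, rng=None):
    """One prune-and-regrow step in the spirit of Sparse Evolutionary
    Training: drop the weakest fraction `zeta` of active connections,
    then regrow the same number at random empty positions, keeping the
    overall sparsity level constant.

    weights: dense array of layer weights; entries where mask == 0 are
             treated as absent connections.
    mask:    0/1 array of the same shape marking active connections.
    """
    if rng is None:
        rng = np.random.default_rng()

    active = np.flatnonzero(mask)
    n_drop = int(zeta * active.size)

    # Prune: remove the smallest-magnitude active connections.
    magnitudes = np.abs(weights.ravel()[active])
    drop = active[np.argsort(magnitudes)[:n_drop]]
    mask.ravel()[drop] = 0
    weights.ravel()[drop] = 0.0

    # Regrow: add the same number of connections at random empty
    # positions, with small random initial weights (an assumption here).
    empty = np.flatnonzero(mask.ravel() == 0)
    grow = rng.choice(empty, size=n_drop, replace=False)
    mask.ravel()[grow] = 1
    weights.ravel()[grow] = rng.normal(scale=0.01, size=n_drop)
    return weights, mask
```

Because each step drops and regrows the same number of connections, the network explores new topologies without ever becoming dense – which is what keeps training cheap.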

2018-06-22

linkedin cross-post


The standard disclaimer…

The views, thoughts, and opinions expressed in the text belong solely to me, and not necessarily to my employer, organisation, committee, or any other group that I belong to or am associated with.

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
© 2023 Rob Aleck, licensed under CC BY-NC 4.0