New AI method increases the power of artificial neural networks

Although I was aware of the benefits of sparsely connected neural networks, this paper highlights an additional, slightly counter-intuitive property, scale-freeness, achieved through a method the authors call "Sparse Evolutionary Training" (SET). Starting from a sparse network, the model repeatedly drops its weakest connections and randomly adds new ones, evolving into a topology that is harder to define up front but ultimately simpler, with only a few highly connected "hubs" (the scale-free property). This less densely connected end state is quicker to stabilise, since it requires fewer calculations, and cheaper to maintain and train than a traditional, statically connected network.
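The drop-and-regrow cycle described above can be sketched as follows. This is a minimal illustration in NumPy, not the authors' implementation: the function name, the `zeta` drop fraction, and the re-initialisation scale are all assumptions made for the example.

```python
import numpy as np

def prune_and_regrow(weights, mask, zeta=0.3, rng=None):
    """One evolution step in the spirit of Sparse Evolutionary Training:
    drop the fraction `zeta` of active weights with the smallest magnitude,
    then regrow the same number of connections at random empty positions.
    Modifies `weights` and `mask` in place and returns them."""
    if rng is None:
        rng = np.random.default_rng()
    flat_w = weights.ravel()
    flat_m = mask.ravel()
    active = np.flatnonzero(flat_m)
    n_drop = int(zeta * active.size)
    if n_drop == 0:
        return weights, mask
    # Drop the weakest active connections (smallest absolute weight).
    weakest = active[np.argsort(np.abs(flat_w[active]))[:n_drop]]
    flat_m[weakest] = False
    flat_w[weakest] = 0.0
    # Regrow the same number of connections at random inactive positions,
    # so the overall sparsity level stays constant across steps.
    inactive = np.flatnonzero(~flat_m)
    new = rng.choice(inactive, size=n_drop, replace=False)
    flat_m[new] = True
    flat_w[new] = rng.normal(scale=0.01, size=n_drop)  # small re-init
    return weights, mask
```

Repeating this step between training epochs keeps the parameter count fixed while letting the connectivity pattern itself adapt, which is what drives the emergence of the sparse, hub-dominated topology described above.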
https://rob.al/2Ifd8eM
An international team of scientists from Eindhoven University of Technology, University of Texas at Austin, and University of Derby has developed a revolutionary method that quadratically…