10. Biomimetic process versus deep learning
Deep learning was initiated in the 1990s, with Yann LeCun at the forefront. In 2012, gains in computational performance made it possible to exploit very large quantities of data and thus to extract common features by cross-correlation and supervised learning, adjusting the connection strengths between local elements, called "synaptic weights", in a Convolutional Neural Network (CNN). This process requires computation at several scales; rotation is handled very poorly, and invariance is obtained only through a multiplicity of computational layers. Information storage is distributed, medium-dependent and varies with the elements learned; memory is therefore not transferable between two different media. This process is mainly used by companies wishing to process data at very large scale.
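The weight-sharing idea at the heart of a CNN can be sketched minimally: a single small set of "synaptic weights" (the kernel) is slid across the whole input, so the same local feature detector is reused at every position. The sketch below is a pure-NumPy illustration of one convolutional layer's core operation, not any framework's actual implementation; the image and edge-detecting kernel are made up for the example.

```python
import numpy as np

def conv2d(image, kernel):
    """Slide one shared weight kernel over the image.
    Weight sharing: the same kernel values (the 'synaptic weights')
    are applied at every spatial position of the input."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Cross-correlation: elementwise product of the local
            # patch with the kernel, summed to one response value.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy input: a 4x4 image whose left half is dark (0) and right half bright (1).
image = np.array([[0, 0, 1, 1]] * 4, dtype=float)

# Hypothetical hand-crafted kernel that responds to vertical edges;
# in a trained CNN these weights would be learned by supervised learning.
kernel = np.array([[1, -1],
                   [1, -1]], dtype=float)

response = conv2d(image, kernel)
# The response is strongest (in magnitude) exactly where the edge lies.
```

Note that this translation invariance comes for free from sliding the kernel, whereas rotation invariance does not: a rotated edge needs a different kernel, which is one reason CNNs stack many layers and many kernels.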
For everyday applications, the biomimetic process will prove far more...