Seminar of 2020

7 February 2020
In this talk I will sketch a self-consistent scenario for information processing in shallow neural networks: I will present a minimal reference framework where what is learnt (e.g. via contrastive divergence on restricted Boltzmann machines) is then retrieved (e.g. via standard Hebbian mechanisms à la Hopfield). Then I will generalize this scheme by discussing three variations on this theme: the tradeoff between dilution and multitasking capabilities, the tradeoff between storage and resolution, and the mechanism of "sleeping and dreaming".
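To make the retrieval side of this scheme concrete, here is a minimal sketch of Hebbian storage and recall in a Hopfield network (the network size, number of patterns, and random pattern data are purely illustrative assumptions, not taken from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100   # number of neurons (illustrative)
P = 5     # number of stored patterns (illustrative)

# Random binary (+1/-1) patterns to store
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, no self-coupling
J = (patterns.T @ patterns) / N
np.fill_diagonal(J, 0.0)

def recall(state, steps=10):
    """Asynchronous zero-temperature dynamics: align each spin with its local field."""
    s = state.copy()
    for _ in range(steps):
        for i in rng.permutation(N):
            h = J[i] @ s                  # local field on neuron i
            s[i] = 1 if h >= 0 else -1    # deterministic update
    return s

# Probe the network with a corrupted copy of the first pattern (30% of spins flipped)
probe = patterns[0].copy()
flip = rng.choice(N, size=30, replace=False)
probe[flip] *= -1

retrieved = recall(probe)
overlap = retrieved @ patterns[0] / N     # Mattis overlap with the stored pattern
print(f"overlap with stored pattern: {overlap:.2f}")
```

An overlap close to 1 indicates that the corrupted probe has been attracted back to the stored pattern, which is the Hebbian retrieval mechanism the abstract refers to.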
