Hebbian learning of recurrent connections: a geometrical perspective
File(s)
NC12.pdf (2.74 MB)
Published version
Author(s)
Galtier, Mathieu N
Faugeras, Olivier D
Bressloff, Paul C
Type
Journal Article
Abstract
We show how a Hopfield network with modifiable recurrent connections undergoing slow Hebbian learning can extract the underlying geometry of an input space. First, we use a slow/fast analysis to derive an averaged system whose dynamics derive from an energy function and therefore always converge to equilibrium points. The equilibria reflect the correlation structure of the inputs, a global object extracted through local recurrent interactions only. Second, we use numerical methods to illustrate how learning extracts the hidden geometrical structure of the inputs. Indeed, multidimensional scaling methods make it possible to project the final connectivity matrix onto a Euclidean distance matrix in a high-dimensional space, with the neurons labeled by spatial position within this space. The resulting network structure turns out to be roughly convolutional. The residual of the projection defines the nonconvolutional part of the connectivity, which is minimized in the process. Third, we show how restricting the dimension of the space where the neurons live gives rise to patterns similar to cortical maps. We motivate this using an energy efficiency argument based on wire length minimization. Finally, we show how this approach leads to the emergence of ocular dominance or orientation columns in primary visual cortex via the self-organization of recurrent rather than feedforward connections. In addition, we establish that the nonconvolutional (or long-range) connectivity is patchy and co-aligned in the case of orientation learning.
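The mechanism summarized in the abstract can be sketched in a few lines. Everything below is an illustrative assumption, not the paper's actual model: a hypothetical Gaussian-bump input ensemble on a 1-D line stands in for the structured inputs, a plain Hebbian rule with linear decay stands in for the paper's averaged learning dynamics, and a crude surrogate maps the learned connectivity to squared distances before a classical MDS embedding.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20  # number of neurons

# Hypothetical structured inputs (not the paper's): each neuron samples a
# Gaussian bump on a 1-D line, so nearby neurons receive correlated input.
pos = np.arange(n, dtype=float)
G = np.exp(-(pos[:, None] - pos[None, :]) ** 2 / 8.0)  # bump feature map
C_true = G @ G.T                                       # input covariance E[x x^T]

# Slow Hebbian learning with linear decay: W <- W + eta * (x x^T - W).
# Averaged over the fast input fluctuations, W relaxes toward C_true,
# so the recurrent weights end up encoding the input correlation structure.
W = np.zeros((n, n))
eta = 0.002
for _ in range(50_000):
    x = G @ rng.standard_normal(n)   # input sample with covariance G G^T
    W += eta * (np.outer(x, x) - W)  # Hebbian term plus decay

# Classical MDS on the learned connectivity: treat (max(W) - W) as a
# squared-distance surrogate (strong weights = nearby neurons),
# double-center it, and embed the neurons in two dimensions.
D2 = W.max() - W
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ D2 @ J
vals, vecs = np.linalg.eigh(B)  # eigenvalues in ascending order
coords = vecs[:, -2:] * np.sqrt(np.maximum(vals[-2:], 0.0))

rel_err = np.linalg.norm(W - C_true) / np.linalg.norm(C_true)
print(f"relative error of W vs. input correlations: {rel_err:.3f}")
```

With a small learning rate and many samples, the learned matrix approximates the input covariance, and the MDS step recovers coordinates for the neurons; the residual of that projection corresponds to the "nonconvolutional" part discussed in the abstract.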
Date Issued
2012-09
Date Acceptance
2012-03-01
Citation
Neural Computation, 2012, 24 (9), pp. 2346-2383
URI
http://hdl.handle.net/10044/1/106984
URL
http://dx.doi.org/10.1162/neco_a_00322
DOI
https://doi.org/10.1162/neco_a_00322
ISSN
0899-7667
Publisher
Massachusetts Institute of Technology Press
Start Page
2346
End Page
2383
Journal / Book Title
Neural Computation
Volume
24
Issue
9
Copyright Statement
© 2012 Massachusetts Institute of Technology
Identifier
http://dx.doi.org/10.1162/neco_a_00322
Publication Status
Published
Date Publish Online
2012-09-01