Infomax






From Wikipedia, the free encyclopedia
 


Infomax is an optimization principle for artificial neural networks and other information processing systems. It prescribes that a function that maps a set of input values I to a set of output values O should be chosen or learned so as to maximize the average Shannon mutual information between I and O, subject to a set of specified constraints and/or noise processes. Infomax algorithms are learning algorithms that perform this optimization process. The principle was described by Linsker in 1988.[1]
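Stated as an optimization problem (a schematic formulation; the symbols f and \mathcal{C} for the mapping and its constraint set are illustrative rather than taken from Linsker's paper), the principle selects

    \[
      f^{\ast} \;=\; \operatorname*{arg\,max}_{f \,\in\, \mathcal{C}}
      \bigl[\, H(O) - H(O \mid I) \,\bigr],
      \qquad O = f(I) \ \text{(possibly corrupted by noise)},
    \]

where H denotes Shannon entropy, the bracketed quantity is the mutual information between I and O, and \mathcal{C} is the set of mappings satisfying the specified constraints. In the noiseless, deterministic case H(O \mid I) does not depend on the choice of f, so maximizing the mutual information amounts to maximizing the output entropy H(O).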

Infomax, in its zero-noise limit, is related to the principle of redundancy reduction proposed for biological sensory processing by Horace Barlow in 1961,[2] and applied quantitatively to retinal processing by Atick and Redlich.[3]

One application of infomax is an independent component analysis (ICA) algorithm that recovers statistically independent signals by maximizing output entropy. Infomax-based ICA was described by Bell and Sejnowski, and by Nadal and Parga, in 1995.[4][5]
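As a concrete illustration, the following is a minimal sketch of entropy-maximization ICA in the spirit of the Bell–Sejnowski algorithm, using the natural-gradient form of the update and a logistic nonlinearity; the function name, learning rate, iteration count, and toy mixing setup are illustrative choices, not details taken from the cited papers.

    import numpy as np

    def infomax_ica(X, n_iter=500, lr=0.01):
        """Entropy-maximization ICA sketch (Bell–Sejnowski style).

        X: observed mixtures, shape (n_signals, n_samples).
        Returns an unmixing matrix W such that W @ X approximates the sources.
        Uses the natural-gradient update dW = lr * (I + (1 - 2*y) @ u.T / m) @ W
        with a logistic nonlinearity, suited to super-Gaussian sources.
        """
        n, m = X.shape
        X = X - X.mean(axis=1, keepdims=True)   # center the data
        W = np.eye(n)                           # unmixing matrix, start at identity
        for _ in range(n_iter):
            u = W @ X                           # current source estimates
            y = 1.0 / (1.0 + np.exp(-u))        # logistic nonlinearity g(u)
            # Natural-gradient step that increases the joint entropy of g(W @ X)
            dW = (np.eye(n) + (1.0 - 2.0 * y) @ u.T / m) @ W
            W += lr * dW
        return W

    # Toy usage: unmix two linearly mixed, independent Laplacian sources.
    rng = np.random.default_rng(0)
    S = rng.laplace(size=(2, 5000))             # independent super-Gaussian sources
    A = np.array([[1.0, 0.6], [0.4, 1.0]])      # mixing matrix
    X = A @ S                                   # observed mixtures
    W = infomax_ica(X)
    print(W @ A)  # should approach a scaled permutation of the identity

Practical implementations typically also whiten the data before unmixing and adapt the nonlinearity to the source statistics, but the update above conveys the core idea of adjusting W to maximize the entropy of the transformed outputs.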

See also

References

  1. Linsker R (1988). "Self-organization in a perceptual network". IEEE Computer. 21 (3): 105–117. doi:10.1109/2.36. S2CID 1527671.
  2. Barlow H (1961). "Possible principles underlying the transformations of sensory messages". In Rosenblith W (ed.). Sensory Communication. Cambridge, MA: MIT Press. pp. 217–234.
  3. Atick JJ, Redlich AN (1992). "What does the retina know about natural scenes?". Neural Computation. 4 (2): 196–210. doi:10.1162/neco.1992.4.2.196. S2CID 17515861.
  4. Bell AJ, Sejnowski TJ (November 1995). "An information-maximization approach to blind separation and blind deconvolution". Neural Computation. 7 (6): 1129–1159. CiteSeerX 10.1.1.36.6605. doi:10.1162/neco.1995.7.6.1129. PMID 7584893. S2CID 1701422.
  5. Nadal JP, Parga N (1999). "Sensory coding: information maximization and redundancy reduction". In Burdet G, Combe P, Parodi O (eds.). Neural Information Processing. World Scientific Series in Mathematical Biology and Medicine. Vol. 7. Singapore: World Scientific. pp. 164–171.
