Information theoretic learning : Renyi's entropy and kernel perspectives

Author: J C Príncipe
Publisher: New York : Springer, ©2010.
Series: Information science and statistics.
Edition/Format: Print book : English
Summary:
"This book presents the first cohesive treatment of Information Theoretic Learning (ITL) algorithms to adapt linear or nonlinear learning machines both in supervised or unsupervised paradigms. ITL is a framework where the conventional concepts of second order statistics (covariance, L2 distances, correlation functions) are substituted by scalars and functions with information theoretic underpinnings, respectively  Read more...

Details

Material Type: Internet resource
Document Type: Book, Internet Resource
All Authors / Contributors: J C Príncipe
ISBN: 9781441915696 1441915699 1441915702 9781441915702
OCLC Number: 502034127
Description: xxii, 526 pages : illustrations ; 24 cm.
Contents: Information Theory, Machine Learning, and Reproducing Kernel Hilbert Spaces --
Renyi's Entropy, Divergence and Their Nonparametric Estimators --
Adaptive Information Filtering with Error Entropy and Error Correntropy Criteria --
Algorithms for Entropy and Correntropy Adaptation with Applications to Linear Systems --
Nonlinear Adaptive Filtering with MEE, MCC, and Applications --
Classification with EEC, Divergence Measures, and Error Bounds --
Clustering with ITL Principles --
Self-Organizing ITL Principles for Unsupervised Learning --
A Reproducing Kernel Hilbert Space Framework for ITL --
Correntropy for Random Variables: Properties and Applications in Statistical Inference --
Correntropy for Random Processes: Properties and Applications in Signal Processing.
Series Title: Information science and statistics.
Responsibility: José C. Principe.

Abstract:

This book is the first cohesive treatment of ITL algorithms to adapt linear or nonlinear learning machines in both supervised and unsupervised paradigms. It compares the performance of ITL algorithms with their second-order counterparts in many engineering and machine learning applications.
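
As an illustration of the correntropy idea that the later chapters develop, the sketch below (assumed, not the book's code) estimates sample correntropy with a Gaussian kernel and adapts a linear filter by stochastic gradient ascent on the correntropy of the error, in the spirit of the maximum correntropy criterion (MCC); the kernel width sigma, step size mu, and all names are illustrative choices.

# Illustrative sketch: sample correntropy and an MCC-style update for a
# linear filter (a robust alternative to the LMS / mean-square-error update).
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample correntropy V_sigma(X, Y) = mean_i G_sigma(x_i - y_i)."""
    e = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return np.mean(np.exp(-e**2 / (2.0 * sigma**2))) / (np.sqrt(2.0 * np.pi) * sigma)

def mcc_linear_filter(X, d, sigma=1.0, mu=0.05, epochs=5):
    """Adapt weights w of y = X @ w by stochastic gradient ascent on the
    correntropy of the error e = d - y; the factor exp(-e^2 / 2 sigma^2)
    down-weights large (outlier) errors, unlike a plain LMS update.
    """
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(epochs):
        for i in range(n):
            e = d[i] - X[i] @ w
            # gradient of G_sigma(e) w.r.t. w is proportional to exp(-e^2/2sigma^2) * e * x_i
            w += mu * np.exp(-e**2 / (2.0 * sigma**2)) * e * X[i]
    return w

# Example: recover weights from data corrupted by heavy-tailed noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
d = X @ w_true + 0.1 * rng.standard_t(df=1.5, size=200)
w_hat = mcc_linear_filter(X, d, sigma=1.0, mu=0.05)
print(w_hat, correntropy(d, X @ w_hat, sigma=1.0))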

Reviews

Editorial reviews

Publisher Synopsis

From the book reviews: "The book is remarkable in various ways in the information it presents on the concept and use of entropy functions and their applications in signal processing and solution of …"

 


Linked Data


Primary Entity

<http://www.worldcat.org/oclc/502034127> # Information theoretic learning : Renyi's entropy and kernel perspectives
    a schema:CreativeWork, schema:Book ;
    library:oclcnum "502034127" ;
    library:placeOfPublication <http://dbpedia.org/resource/New_York_City> ; # New York
    library:placeOfPublication <http://id.loc.gov/vocabulary/countries/nyu> ;
    schema:about <http://id.worldcat.org/fast/1004795> ; # Machine learning
    schema:about <http://id.worldcat.org/fast/1012127> ; # Mathematical statistics
    schema:about <http://dewey.info/class/006.31/e22/> ;
    schema:about <http://experiment.worldcat.org/entity/work/data/375210334#Topic/information_science_and_statistics> ; # Information science and statistics
    schema:about <http://experiment.worldcat.org/entity/work/data/375210334#Topic/algorithms> ; # Algorithms
    schema:about <http://experiment.worldcat.org/entity/work/data/375210334#Topic/machine_learning> ; # Machine learning
    schema:about <http://experiment.worldcat.org/entity/work/data/375210334#Topic/mathematical_statistics> ; # Mathematical statistics
    schema:about <http://id.worldcat.org/fast/805020> ; # Algorithms
    schema:bookFormat bgn:PrintBook ;
    schema:copyrightYear "2010" ;
    schema:creator <http://viaf.org/viaf/89881565> ; # José C. Príncipe
    schema:datePublished "2010" ;
    schema:description ""This book presents the first cohesive treatment of Information Theoretic Learning (ITL) algorithms to adapt linear or nonlinear learning machines both in supervised or unsupervised paradigms. ITL is a framework where the conventional concepts of second order statistics (covariance, L2 distances, correlation functions) are substituted by scalars and functions with information theoretic underpinnings, respectively entropy, mutual information and correntropy. ITL quantifies the stochastic structure of the data beyond second order statistics for improved performance without using full-blown Bayesian approaches that require a much larger computational cost. This is possible because of a non-parametric estimator of Renyi's quadratic entropy that is only a function of pairwise differences between samples. The book compares the performance of ITL algorithms with the second order counterparts in many engineering and machine learning applications. Students, practitioners and researchers interested in statistical signal processing, computational intelligence, and machine learning will find in this book the theory to understand the basics, the algorithms to implement applications, and exciting but still unexplored leads that will provide fertile ground for future research. José C. Principe is Distinguished Professor of Electrical and Biomedical Engineering, and BellSouth Professor at the University of Florida, and the Founder and Director of the Computational NeuroEngineering Laboratory. He is an IEEE and AIMBE Fellow, Past President of the International Neural Network Society, Past Editor-in-Chief of the IEEE Trans. on Biomedical Engineering and the Founder Editor-in-Chief of the IEEE Reviews on Biomedical Engineering. He has written an interactive electronic book on Neural Networks, a book on Brain Machine Interface Engineering and more recently a book on Kernel Adaptive Filtering, and was awarded the 2011 IEEE Neural Network Pioneer Award."--Publisher's website."@en ;
    schema:description "Information Theory, Machine Learning, and Reproducing Kernel Hilbert Spaces -- Renyi's Entropy, Divergence and Their Nonparametric Estimators -- Adaptive Information Filtering with Error Entropy and Error Correntropy Criteria -- Algorithms for Entropy and Correntropy Adaptation with Applications to Linear Systems -- Nonlinear Adaptive Filtering with MEE, MCC, and Applications -- Classification with EEC, Divergence Measures, and Error Bounds -- Clustering with ITL Principles -- Self-Organizing ITL Principles for Unsupervised Learning -- A Reproducing Kernel Hilbert Space Framework for ITL -- Correntropy for Random Variables: Properties and Applications in Statistical Inference -- Correntropy for Random Processes: Properties and Applications in Signal Processing."@en ;
    schema:exampleOfWork <http://worldcat.org/entity/work/id/375210334> ;
    schema:inLanguage "en" ;
    schema:isPartOf <http://experiment.worldcat.org/entity/work/data/375210334#Series/information_science_and_statistics> ; # Information science and statistics.
    schema:name "Information theoretic learning : Renyi's entropy and kernel perspectives"@en ;
    schema:productID "502034127" ;
    schema:publication <http://www.worldcat.org/title/-/oclc/502034127#PublicationEvent/new_york_springer_2010> ;
    schema:publisher <http://experiment.worldcat.org/entity/work/data/375210334#Agent/springer> ; # Springer
    schema:url <http://catdir.loc.gov/catdir/enhancements/fy1316/2010924811-t.html> ;
    schema:workExample <http://worldcat.org/isbn/9781441915696> ;
    schema:workExample <http://worldcat.org/isbn/9781441915702> ;
    umbel:isLike <http://bnb.data.bl.uk/id/resource/GBB009586> ;
    wdrs:describedby <http://www.worldcat.org/title/-/oclc/502034127> ;
    .


Related Entities

<http://dbpedia.org/resource/New_York_City> # New York
    a schema:Place ;
    schema:name "New York" ;
    .

<http://experiment.worldcat.org/entity/work/data/375210334#Series/information_science_and_statistics> # Information science and statistics.
    a bgn:PublicationSeries ;
    schema:hasPart <http://www.worldcat.org/oclc/502034127> ; # Information theoretic learning : Renyi's entropy and kernel perspectives
    schema:name "Information science and statistics." ;
    schema:name "Information science and statistics" ;
    .

<http://experiment.worldcat.org/entity/work/data/375210334#Topic/information_science_and_statistics> # Information science and statistics
    a schema:Intangible ;
    schema:name "Information science and statistics"@en ;
    .

<http://experiment.worldcat.org/entity/work/data/375210334#Topic/mathematical_statistics> # Mathematical statistics
    a schema:Intangible ;
    schema:name "Mathematical statistics"@en ;
    .

<http://id.worldcat.org/fast/1004795> # Machine learning
    a schema:Intangible ;
    schema:name "Machine learning"@en ;
    .

<http://id.worldcat.org/fast/1012127> # Mathematical statistics
    a schema:Intangible ;
    schema:name "Mathematical statistics"@en ;
    .

<http://id.worldcat.org/fast/805020> # Algorithms
    a schema:Intangible ;
    schema:name "Algorithms"@en ;
    .

<http://viaf.org/viaf/89881565> # José C. Príncipe
    a schema:Person ;
    schema:familyName "Príncipe" ;
    schema:givenName "José C." ;
    schema:givenName "J. C." ;
    schema:name "José C. Príncipe" ;
    .

<http://worldcat.org/isbn/9781441915696>
    a schema:ProductModel ;
    schema:isbn "1441915699" ;
    schema:isbn "9781441915696" ;
    .

<http://worldcat.org/isbn/9781441915702>
    a schema:ProductModel ;
    schema:isbn "1441915702" ;
    schema:isbn "9781441915702" ;
    .

