Information Science and Statistics

Information Theoretic Learning

Renyi's Entropy and Kernel Perspectives

    • €164.99

Publisher Description

This book presents the first cohesive treatment of Information Theoretic Learning (ITL) algorithms for adapting linear or nonlinear learning machines in both supervised and unsupervised paradigms. ITL is a framework in which the conventional concepts of second-order statistics (covariance, L2 distances, correlation functions) are replaced by scalars and functions with information theoretic underpinnings: entropy, mutual information, and correntropy, respectively. A minimal sketch of the correntropy estimate is given after the next paragraph.
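As a hedged illustration of the correntropy idea mentioned above (not code from the book): correntropy between two random variables is commonly defined as V(X, Y) = E[κ_σ(X − Y)] for a kernel κ_σ, and its sample estimate averages the kernel over paired samples. The function name and default bandwidth below are illustrative assumptions.

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample estimate of correntropy V(X, Y) = E[kappa_sigma(X - Y)]
    with a Gaussian kernel; a nonlinear generalization of correlation
    that captures higher-order statistics of the error X - Y."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    d = x - y
    return np.mean(np.exp(-d ** 2 / (2.0 * sigma ** 2)))
```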

ITL quantifies the stochastic structure of the data beyond second-order statistics for improved performance, without resorting to full-blown Bayesian approaches that carry a much larger computational cost. This is possible because of a non-parametric estimator of Renyi's quadratic entropy that is only a function of pairwise differences between samples. The book compares the performance of ITL algorithms with their second-order counterparts in many engineering and machine learning applications.
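A minimal sketch of such a pairwise-difference estimator, assuming Gaussian Parzen windows of bandwidth sigma (the function name and defaults are illustrative, not taken from the book page): Renyi's quadratic entropy is estimated as Ĥ2(X) = −log of the mean Gaussian kernel evaluated over all pairwise sample differences (the "information potential").

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Non-parametric estimate of Renyi's quadratic entropy H2 from 1-D samples.

    Uses a Gaussian Parzen window of width sigma; the estimate depends only
    on pairwise differences x_i - x_j. Convolving two Gaussian kernels of
    variance sigma^2 gives an effective variance of 2*sigma^2.
    """
    x = np.asarray(x, dtype=float)
    diffs = x[:, None] - x[None, :]                    # all pairwise differences
    s2 = 2.0 * sigma ** 2                              # variance of convolved kernel
    kernel = np.exp(-diffs ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    information_potential = kernel.mean()              # (1/N^2) * sum over all pairs
    return -np.log(information_potential)              # H2 = -log(IP)

# Example usage (illustrative): entropy estimate for 500 standard-normal samples
# h2 = renyi_quadratic_entropy(np.random.randn(500), sigma=0.5)
```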

Students, practitioners and researchers interested in statistical signal processing, computational intelligence, and machine learning will find in this book the theory to understand the basics, the algorithms to implement applications, and exciting but still unexplored leads that will provide fertile ground for future research.

José C. Principe is Distinguished Professor of Electrical and Biomedical Engineering and BellSouth Professor at the University of Florida, and the Founder and Director of the Computational NeuroEngineering Laboratory. He is an IEEE and AIMBE Fellow, Past President of the International Neural Network Society, Past Editor-in-Chief of the IEEE Transactions on Biomedical Engineering, and the Founding Editor-in-Chief of the IEEE Reviews in Biomedical Engineering. He has written an interactive electronic book on Neural Networks, a book on Brain Machine Interface Engineering, and more recently a book on Kernel Adaptive Filtering, and was awarded the 2011 IEEE Neural Networks Pioneer Award.

GENRE
Computing & Internet
RELEASED
6 April 2010
LANGUAGE
English
LENGTH
462 pages
PUBLISHER
Springer New York
PROVIDER INFO
Springer Science & Business Media LLC
SIZE
9.5 MB
Handbook of Computational Statistics (2012)
Information Theory in Computer Vision and Pattern Recognition (2009)
Principles and Theory for Data Mining and Machine Learning (2009)
Empirical Inference (2013)
Machine Learning and Knowledge Discovery in Databases (2023)
Monte Carlo and Quasi-Monte Carlo Methods (2018)
Kalman Filtering Under Information Theoretic Criteria (2023)
Theory of Information and its Value (2020)
Brain-Computer Interfaces (2008)
Kernel Adaptive Filtering (2011)
Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis (2007)
Novelty, Information and Surprise (2023)
Information and Complexity in Statistical Modeling (2007)
Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis (2012)
Probabilistic Conditional Independence Structures (2006)
Support Vector Machines (2008)