
Information Theory and Statistical Learning

Published 14 November 2008, 1st edition 2009
€106.99
(incl. VAT)

Bibliographic Data
ISBN/EAN: 9780387848150
Language: English
Extent: x, 439 pp.
Binding: hardcover

Product Safety Regulation

Manufacturer:
Springer Verlag GmbH
juergen.hartmann@springer.com
Tiergartenstr. 17
DE 69121 Heidelberg

Contents

Algorithmic Probability: Theory and Applications
Model Selection and Testing by the MDL Principle
Normalized Information Distance
The Application of Data Compression-Based Distances to Biological Sequences
MIC: Mutual Information Based Hierarchical Clustering
A Hybrid Genetic Algorithm for Feature Selection Based on Mutual Information
Information Approach to Blind Source Separation and Deconvolution
Causality in Time Series: Its Detection and Quantification by Means of Information Theory
Information Theoretic Learning and Kernel Methods
Information-Theoretic Causal Power
Information Flows in Complex Networks
Models of Information Processing in the Sensorimotor Loop
Information Divergence Geometry and the Application to Statistical Machine Learning
Model Selection and Information Criterion
Extreme Physical Information (EPI) as a Principle of Universal Stability
Entropy and Cloning Methods for Combinatorial Optimization, Sampling and Counting Using the Gibbs Sampler