
Statistical learning theory and stochastic optimization : Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001

Author: Olivier Catoni; Jean Picard; LINK (Online service)
Publisher: Berlin : Springer-Verlag, ©2004.
Series: Lecture notes in mathematics (Springer-Verlag), 1851.
Edition/Format: eBook : Document : Conference publication : English
Database: WorldCat
Summary:
Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea to use, as is often done in practice, a notoriously "wrong" (i.e. over-simplified) model to predict, estimate or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of stochastic optimization algorithms of common use in computing estimators. The author focuses on non-asymptotic bounds of the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results.
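The summary singles out entropy and Gibbs measures as the book's two pervasive objects. As a gloss on that terminology only, the following are standard textbook formulations, not statements taken from this volume; the notation (empirical risk r, true risk R, prior \pi, inverse temperature \beta, sample size n) is generic, not the book's.

% Gibbs posterior: reweight a prior \pi by the exponentiated empirical risk.
\[
  \rho_\beta(d\theta) \;=\;
  \frac{\exp\{-\beta\, r(\theta)\}\,\pi(d\theta)}
       {\displaystyle\int \exp\{-\beta\, r(\theta')\}\,\pi(d\theta')}
\]
% Variational link between the two objects: \rho_\beta minimizes
% \beta\,\mathbb{E}_{\theta\sim\rho}\, r(\theta) + \mathrm{KL}(\rho \,\|\, \pi)
% over all posteriors \rho, where KL is the relative entropy.
%
% A PAC-Bayesian risk bound of the usual shape (bounded loss, n samples,
% constants and logarithmic factors omitted): with probability at least
% 1 - \epsilon, simultaneously for all posteriors \rho,
\[
  \mathbb{E}_{\theta\sim\rho}\, R(\theta) \;\le\;
  \mathbb{E}_{\theta\sim\rho}\, r(\theta)
  \;+\; \sqrt{\frac{\mathrm{KL}(\rho\,\|\,\pi) + \log(1/\epsilon)}{2n}}
\]
% Here R is the true (out-of-sample) risk, r its empirical counterpart, and
% the KL term is the non-asymptotic complexity price referred to in the summary.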


Details

Genre/Form: Electronic books; Conference proceedings; Congresses
Additional physical format: Print version:
Catoni, Olivier.
Statistical learning theory and stochastic optimization.
Berlin : Springer-Verlag, ©2004
(DLC) 2004109143
(OCoLC)56714791
Material type: Conference publication, Document, Internet resource
Document type: Internet Resource, Computer File
All authors/contributors: Olivier Catoni; Jean Picard; LINK (Online service)
ISBN: 9783540445074 3540445072
OCLC number: 56508135
Notes: " ... 31st Probability Summer School in Saint-Flour (July 8-25, 2001) ..."--Preface.
Description: 1 online resource (viii, 272 pages) : illustrations.
Contents: Universal Lossless Data Compression --
Links Between Data Compression and Statistical Estimation --
Non Cumulated Mean Risk --
Gibbs Estimators --
Randomized Estimators and Empirical Complexity --
Deviation Inequalities --
Markov Chains with Exponential Transitions --
References --
Index.
Series title: Lecture notes in mathematics (Springer-Verlag), 1851.
Other titles: Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001
Responsibility: Olivier Catoni ; editor, Jean Picard.

Reviews

Editorial review

Nielsen BookData

From the reviews: "This book is based on a course of lectures given by the author on a circle of ideas lying at the interface of information theory, statistical learning theory and statistical …"

 

Linked Data

<http://www.worldcat.org/oclc/56508135>
rdf:type schema:Book ; schema:bookFormat schema:EBook
library:oclcnum "56508135" ; owl:sameAs <info:oclcnum/56508135>
schema:name "Statistical learning theory and stochastic optimization Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001"@en ; also "Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001"@en
schema:about <http://id.worldcat.org/fast/1012127> (schema:name "Mathematical statistics"@en)
schema:genre "Conference proceedings"@en ; "Electronic books"@en
schema:inLanguage "en"
schema:datePublished "2004" ; schema:copyrightYear "2004"
schema:description: the Summary and Contents given above, repeated verbatim
schema:exampleOfWork <http://worldcat.org/entity/work/id/14473764>
schema:url <http://dx.doi.org/10.1007/b99352>
schema:url <http://rave.ohiolink.edu/ebooks/ebc/11305972>
schema:url <http://www.springerlink.com/openurl.asp?genre=issue&issn=0075-8434&volume=1851>
schema:url <http://springerlink.metapress.com/link.asp?id=MT30WFT73522>
