
Statistical learning theory and stochastic optimization : Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001

Author: Olivier Catoni; Jean Picard; LINK (Online service)
Publisher: Berlin : Springer-Verlag, ©2004.
Series: Lecture notes in mathematics (Springer-Verlag), 1851.
Edition/Format: eBook : Document : Conference publication : English (eng)
Database: WorldCat
Summary:
Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea, as is often done in practice, to use a notoriously "wrong" (i.e. over-simplified) model to predict, estimate or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of stochastic optimization algorithms of common use in computing estimators. The author focuses on non-asymptotic bounds of the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results.
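The two objects the summary singles out, entropy (as a Kullback–Leibler term) and Gibbs measures, already combine in the generic PAC-Bayesian machinery. The sketch below is the standard textbook form with prior $\pi$, posterior $\rho$, empirical risk $r_n$, and true risk $R$; it is illustrative only, not one of the book's own theorems:

```latex
% Gibbs posterior at inverse temperature \beta: the minimizer over \rho of
% \beta\,\mathbb{E}_\rho[r_n] + \mathrm{KL}(\rho\,\|\,\pi)
\rho_\beta(d\theta) \;=\;
\frac{\exp\{-\beta\, r_n(\theta)\}\,\pi(d\theta)}
     {\int \exp\{-\beta\, r_n(\theta')\}\,\pi(d\theta')}

% Generic (McAllester-style) PAC-Bayesian bound: with probability at least
% 1-\epsilon, simultaneously for every posterior \rho,
\mathbb{E}_{\theta\sim\rho}\, R(\theta) \;\le\;
\mathbb{E}_{\theta\sim\rho}\, r_n(\theta)
\;+\; \sqrt{\frac{\mathrm{KL}(\rho\,\|\,\pi) + \ln\!\big(2\sqrt{n}/\epsilon\big)}{2n}}
```

Because the KL penalty is what controls the bound, the Gibbs posterior, which trades empirical risk against that penalty, is the natural estimator to study, which is why it recurs throughout the table of contents.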
Details

Genre/Form: Electronic books; Conference proceedings; Congresses
Additional Physical Format: Print version:
Catoni, Olivier.
Statistical learning theory and stochastic optimization.
Berlin : Springer-Verlag, ©2004
(DLC) 2004109143
(OCoLC)56714791
Material Type: Conference publication, Document, Internet resource
Document Type: Internet resource, Computer file
All Authors / Contributors: Olivier Catoni; Jean Picard; LINK (Online service)
ISBN: 9783540445074 3540445072
OCLC Number: 56508135
Notes: " ... 31st Probability Summer School in Saint-Flour (July 8-25, 2001) ..."--Preface.
Description: 1 online resource (viii, 272 pages) : illustrations.
Contents: Universal Lossless Data Compression --
Links Between Data Compression and Statistical Estimation --
Non Cumulated Mean Risk --
Gibbs Estimators --
Randomized Estimators and Empirical Complexity --
Deviation Inequalities --
Markov Chains with Exponential Transitions --
References --
Index.
Series Title: Lecture notes in mathematics (Springer-Verlag), 1851.
Other Titles: Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001
Responsibility: Olivier Catoni ; editor, Jean Picard.
Reviews

Editorial reviews

Publisher synopsis

From the reviews: "This book is based on a course of lectures given by the author on a circle of ideas lying at the interface of information theory, statistical learning theory and statistical …"

 

Linked Data


<http://www.worldcat.org/oclc/56508135>
library:oclcnum "56508135"
library:placeOfPublication
library:placeOfPublication
owl:sameAs <info:oclcnum/56508135>
rdf:type schema:Book
schema:about
schema:about
schema:about
schema:about
schema:about
<http://id.worldcat.org/fast/1012127>
rdf:type schema:Intangible
schema:name "Mathematical statistics"@en
schema:name "Mathematical statistics."@en
schema:about
schema:about
schema:bookFormat schema:EBook
schema:contributor
schema:contributor
schema:contributor
schema:copyrightYear "2004"
schema:creator
schema:datePublished "2004"
schema:description "Universal Lossless Data Compression -- Links Between Data Compression and Statistical Estimation -- Non Cumulated Mean Risk -- Gibbs Estimators -- Randomized Estimators and Empirical Complexity -- Deviation Inequalities -- Markov Chains with Exponential Transitions -- References -- Index."@en
schema:description "Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea, to use as is often done in practice a notoriously "wrong'' (i.e. over-simplified) model to predict, estimate or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of stochastic optimization algorithms of common use in computing estimators. The author focuses on non-asymptotic bounds of the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools, that will stimulate further studies and results."@en
schema:exampleOfWork <http://worldcat.org/entity/work/id/14473764>
schema:genre "Conference proceedings"@en
schema:genre "Conference proceedings."@en
schema:genre "Electronic books."@en
schema:inLanguage "en"
schema:name "Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001"@en
schema:name "Statistical learning theory and stochastic optimization Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001"@en
schema:publisher
schema:url <http://dx.doi.org/10.1007/b99352>
schema:url <http://rave.ohiolink.edu/ebooks/ebc/11305972>
schema:url <http://www.springerlink.com/openurl.asp?genre=issue&issn=0075-8434&volume=1851>
schema:url
schema:url <http://springerlink.metapress.com/link.asp?id=MT30WFT73522>
schema:workExample

Content-negotiable representations
