
Statistical learning theory and stochastic optimization : Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001

Author: Olivier Catoni; Jean Picard; LINK (Online service)
Publisher: Berlin : Springer-Verlag, ©2004.
Series: Lecture notes in mathematics (Springer-Verlag), 1851.
Edition/Format: eBook : Document : Conference publication : English
Database: WorldCat
Summary:
Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea to use, as is often done in practice, a notoriously "wrong" (i.e. over-simplified) model to predict, estimate or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of stochastic optimization algorithms of common use in computing estimators. The author focuses on non-asymptotic bounds of the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results.

Details

Genre/Form: Electronic books
Conference proceedings
Congresses
Additional Physical Format: Print version:
Catoni, Olivier.
Statistical learning theory and stochastic optimization.
Berlin : Springer-Verlag, ©2004
(DLC) 2004109143
(OCoLC)56714791
Material Type: Conference publication, Document, Internet resource
Document Type: Internet Resource, Computer File
All Authors / Contributors: Olivier Catoni; Jean Picard; LINK (Online service)
ISBN: 9783540445074 3540445072
OCLC Number: 56508135
Notes: " ... 31st Probability Summer School in Saint-Flour (July 8-25, 2001) ..."--Preface.
Description: 1 online resource (viii, 272 pages) : illustrations.
Contents: Universal Lossless Data Compression --
Links Between Data Compression and Statistical Estimation --
Non Cumulated Mean Risk --
Gibbs Estimators --
Randomized Estimators and Empirical Complexity --
Deviation Inequalities --
Markov Chains with Exponential Transitions --
References --
Index.
Series Title: Lecture notes in mathematics (Springer-Verlag), 1851.
Other Titles: Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001
Responsibility: Olivier Catoni ; editor, Jean Picard.

Reviews

Editorial reviews

Publisher Synopsis

From the reviews: "This book is based on a course of lectures given by the author on a circle of ideas lying at the interface of information theory, statistical learning theory and statistical ...

 


Linked Data


<http://www.worldcat.org/oclc/56508135>
library:oclcnum "56508135"
library:placeOfPublication
library:placeOfPublication
owl:sameAs <info:oclcnum/56508135>
rdf:type schema:Book
schema:about
schema:about
schema:about
schema:about
schema:about
<http://id.worldcat.org/fast/1012127>
rdf:type schema:Intangible
schema:name "Mathematical statistics"@en
schema:name "Mathematical statistics."@en
schema:about
schema:about
schema:bookFormat schema:EBook
schema:contributor
schema:contributor
schema:contributor
schema:copyrightYear "2004"
schema:creator
schema:datePublished "2004"
schema:description "Universal Lossless Data Compression -- Links Between Data Compression and Statistical Estimation -- Non Cumulated Mean Risk -- Gibbs Estimators -- Randomized Estimators and Empirical Complexity -- Deviation Inequalities -- Markov Chains with Exponential Transitions -- References -- Index."@en
schema:description "Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea, to use as is often done in practice a notoriously "wrong'' (i.e. over-simplified) model to predict, estimate or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of stochastic optimization algorithms of common use in computing estimators. The author focuses on non-asymptotic bounds of the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools, that will stimulate further studies and results."@en
schema:exampleOfWork <http://worldcat.org/entity/work/id/14473764>
schema:genre "Conference proceedings"@en
schema:genre "Conference proceedings."@en
schema:genre "Electronic books."@en
schema:inLanguage "en"
schema:name "Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001"@en
schema:name "Statistical learning theory and stochastic optimization Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001"@en
schema:publisher
schema:url <http://dx.doi.org/10.1007/b99352>
schema:url <http://rave.ohiolink.edu/ebooks/ebc/11305972>
schema:url <http://www.springerlink.com/openurl.asp?genre=issue&issn=0075-8434&volume=1851>
schema:url
schema:url <http://springerlink.metapress.com/link.asp?id=MT30WFT73522>
schema:workExample

Content-negotiable representations
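The flattened triples above can be hard to read. As a minimal, illustrative sketch (not WorldCat's actual API output), the same record can be expressed as schema.org JSON-LD, one of the machine-readable representations such content negotiation typically serves; only fields listed above are used:

```python
import json

# Illustrative JSON-LD rendering of a subset of the triples shown above,
# using the schema.org vocabulary. Field values are copied from the record.
record = {
    "@context": "http://schema.org",
    "@id": "http://www.worldcat.org/oclc/56508135",
    "@type": "Book",
    "name": ("Statistical learning theory and stochastic optimization "
             "Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001"),
    "bookFormat": "http://schema.org/EBook",
    "datePublished": "2004",
    "inLanguage": "en",
    "genre": ["Conference proceedings", "Electronic books"],
    "url": "http://dx.doi.org/10.1007/b99352",
}

# Serialize to a JSON-LD document string.
print(json.dumps(record, ensure_ascii=False, indent=2))
```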
