
Statistical learning theory and stochastic optimization : Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001

Author: Olivier Catoni; Jean Picard; LINK (Online service)
Publisher: Berlin : Springer-Verlag, ©2004.
Series: Lecture notes in mathematics (Springer-Verlag), 1851.
Edition/Format: eBook : Document : Conference publication : English
Database: WorldCat
Summary:
Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea to use, as is often done in practice, a notoriously "wrong" (i.e. over-simplified) model to predict, estimate or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of stochastic optimization algorithms of common use in computing estimators. The author focuses on non-asymptotic bounds of the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results.
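The Gibbs measures named in the summary admit a compact sketch in standard PAC-Bayesian notation (an illustrative formulation, not necessarily the book's own; the prior \(\pi\), empirical risk \(r\), and inverse temperature \(\beta\) are assumed symbols):

```latex
% Gibbs posterior: a standard PAC-Bayesian construction.
% Given a prior \pi on a parameter set \Theta, an empirical risk
% r(\theta), and an inverse temperature \beta > 0, the Gibbs
% (posterior) measure is
\[
  \hat{\rho}_{\beta}(d\theta)
  = \frac{\exp\{-\beta\, r(\theta)\}\,\pi(d\theta)}
         {\int_{\Theta} \exp\{-\beta\, r(\theta')\}\,\pi(d\theta')} .
\]
% \hat{\rho}_{\beta} minimizes the free-energy criterion
% \beta\,\mathbb{E}_{\rho}[r(\theta)] + \mathrm{KL}(\rho \,\|\, \pi)
% over all posteriors \rho, which is how entropy (the
% Kullback--Leibler term) and Gibbs measures enter non-asymptotic
% risk bounds together.
```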


Details

Genre/Form: Electronic books
Conference proceedings
Congresses
Additional Physical Format: Print version:
Catoni, Olivier.
Statistical learning theory and stochastic optimization.
Berlin : Springer-Verlag, ©2004
(DLC) 2004109143
(OCoLC)56714791
Material Type: Conference publication, Document, Internet resource
Document Type: Internet Resource, Computer File
All Authors / Contributors: Olivier Catoni; Jean Picard; LINK (Online service)
ISBN: 9783540445074 3540445072
OCLC Number: 56508135
Notes: " ... 31st Probability Summer School in Saint-Flour (July 8-25, 2001) ..."--Preface.
Description: 1 online resource (viii, 272 pages) : illustrations.
Contents: Universal Lossless Data Compression --
Links Between Data Compression and Statistical Estimation --
Non Cumulated Mean Risk --
Gibbs Estimators --
Randomized Estimators and Empirical Complexity --
Deviation Inequalities --
Markov Chains with Exponential Transitions --
References --
Index.
Series Title: Lecture notes in mathematics (Springer-Verlag), 1851.
Other Titles: Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001
Responsibility: Olivier Catoni ; editor, Jean Picard.


Reviews

Editorial reviews

Publisher Synopsis

From the reviews: "This book is based on a course of lectures given by the author on a circle of ideas lying at the interface of information theory, statistical learning theory and statistical ..."

 

Linked Data

<http://www.worldcat.org/oclc/56508135>
library:oclcnum "56508135"
rdf:type schema:MediaObject, schema:Book
schema:alternateName "Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001"@en
schema:bookFormat schema:EBook
schema:copyrightYear "2004"
schema:datePublished "2004"
schema:exampleOfWork <http://worldcat.org/entity/work/id/14473764>
schema:genre "Conference proceedings"@en
schema:genre "Electronic books"@en
schema:inLanguage "en"
schema:name "Statistical learning theory and stochastic optimization Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001"@en
schema:url <http://dx.doi.org/10.1007/b99352>
schema:url <http://rave.ohiolink.edu/ebooks/ebc/11305972>
schema:url <http://www.springerlink.com/openurl.asp?genre=issue&issn=0075-8434&volume=1851>
schema:url <http://springerlink.metapress.com/link.asp?id=MT30WFT73522>
