
Statistical learning theory and stochastic optimization : Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001

Author: Olivier Catoni; Jean Picard; LINK (Online service)
Publisher: Berlin : Springer-Verlag, ©2004.
Series: Lecture notes in mathematics (Springer-Verlag), 1851.
Edition/Format: eBook : Document : Conference publication : English
Database: WorldCat
Abstract:
Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea to use, as is often done in practice, a notoriously "wrong" (i.e. over-simplified) model to predict, estimate or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of stochastic optimization algorithms of common use in computing estimators. The author focuses on non-asymptotic bounds of the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results.
Details

Genre/Form: Electronic books; Conference proceedings; Congresses
Other formats: Print version:
Catoni, Olivier.
Statistical learning theory and stochastic optimization.
Berlin : Springer-Verlag, ©2004
(DLC) 2004109143
(OCoLC)56714791
Material type: Conference publication, Document, Internet resource
Document type: Internet resource, Computer file
All authors/contributors: Olivier Catoni; Jean Picard; LINK (Online service)
ISBN: 9783540445074; 3540445072
OCLC No.: 56508135
Notes: " ... 31st Probability Summer School in Saint-Flour (July 8-25, 2001) ..."--Preface.
Description: 1 online resource (viii, 272 pages) : illustrations.
Contents: Universal Lossless Data Compression --
Links Between Data Compression and Statistical Estimation --
Non Cumulated Mean Risk --
Gibbs Estimators --
Randomized Estimators and Empirical Complexity --
Deviation Inequalities --
Markov Chains with Exponential Transitions --
References --
Index.
Series title: Lecture notes in mathematics (Springer-Verlag), 1851.
Other titles: Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001
Responsibility: Olivier Catoni ; editor, Jean Picard.

Reviews

Publisher synopsis

From the reviews: "This book is based on a course of lectures given by the author on a circle of ideas lying at the interface of information theory, statistical learning theory and statistical ..."

 
Linked Data

<http://www.worldcat.org/oclc/56508135>
library:oclcnum "56508135"
owl:sameAs <info:oclcnum/56508135>
rdf:type schema:Book
schema:bookFormat schema:EBook
schema:contributor
    rdf:type schema:Organization
    schema:name "Ecole d'été de probabilités de Saint-Flour 2001)"
schema:datePublished "2004"
schema:description "Universal Lossless Data Compression -- Links Between Data Compression and Statistical Estimation -- Non Cumulated Mean Risk -- Gibbs Estimators -- Randomized Estimators and Empirical Complexity -- Deviation Inequalities -- Markov Chains with Exponential Transitions -- References -- Index."
schema:description "Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea to use, as is often done in practice, a notoriously 'wrong' (i.e. over-simplified) model to predict, estimate or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of stochastic optimization algorithms of common use in computing estimators. The author focuses on non-asymptotic bounds of the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results."
schema:exampleOfWork <http://worldcat.org/entity/work/id/14473764>
schema:genre "Conference proceedings"
schema:inLanguage "en"
schema:name "Statistical learning theory and stochastic optimization Ecole d'Eté de Probabilités de Saint-Flour XXXI-2001"
schema:url <http://dx.doi.org/10.1007/b99352>
schema:url <http://rave.ohiolink.edu/ebooks/ebc/11305972>
schema:url <http://www.springerlink.com/openurl.asp?genre=issue&issn=0075-8434&volume=1851>
schema:url <http://springerlink.metapress.com/link.asp?id=MT30WFT73522>
