Boosting and maximum likelihood for exponential models

Author: Guy Lebanon; John D. Lafferty
Publisher: Pittsburgh, Pa. : School of Computer Science, Carnegie Mellon University, [2001]
Series: Research paper (Carnegie Mellon University. School of Computer Science), CMU-CS-01-144.
Edition/Format: Print book : English
Database: WorldCat

Subjects: Machine learning; Logistic regression analysis.
Details

Document Type: Book
All Authors / Contributors: Guy Lebanon; John D. Lafferty
OCLC Number: 48184928
Notes: "October 6, 2001."
Description: 17 pages : illustrations ; 28 cm.
Series Title: Research paper (Carnegie Mellon University. School of Computer Science), CMU-CS-01-144.
Responsibility: Guy Lebanon, John Lafferty.

Abstract:

Abstract: "Recent research has considered the relationship between boosting and more standard statistical methods, such as logistic regression, concluding that AdaBoost is similar but somehow still very different from statistical methods in that it minimizes a different loss function. In this paper we derive an equivalence between AdaBoost and the dual of a convex optimization problem. In this setting, it is seen that the only difference between minimizing the exponential loss used by AdaBoost and maximum likelihood for exponential models is that the latter requires the model to be normalized to form a conditional probability distribution over labels; the two methods minimize the same Kullback-Leibler divergence objective function subject to identical feature constraints. In addition to establishing a simple and easily understood connection between the two methods, this framework enables us to derive new regularization procedures for boosting that directly correspond to penalized maximum likelihood. Experiments on UCI datasets, comparing exponential loss and maximum likelihood for parallel and sequential update algorithms, confirm our theoretical analysis, indicating that AdaBoost and maximum likelihood typically yield identical results as the number of features increases to allow the models to fit the training data."

Linked Data


Primary Entity

<http://www.worldcat.org/oclc/48184928> # Boosting and maximum likelihood for exponential models
    a schema:CreativeWork, schema:Book ;
   library:oclcnum "48184928" ;
   library:placeOfPublication <http://experiment.worldcat.org/entity/work/data/37655589#Place/pittsburgh_pa> ; # Pittsburgh, Pa.
   library:placeOfPublication <http://id.loc.gov/vocabulary/countries/pau> ;
   schema:about <http://dewey.info/class/510.7808/> ;
   schema:about <http://id.worldcat.org/fast/1004795> ; # Machine learning
   schema:about <http://id.worldcat.org/fast/1002083> ; # Logistic regression analysis
   schema:bookFormat bgn:PrintBook ;
   schema:contributor <http://viaf.org/viaf/69171210> ; # John D. Lafferty
   schema:creator <http://experiment.worldcat.org/entity/work/data/37655589#Person/lebanon_guy> ; # Guy Lebanon
   schema:datePublished "2001" ;
   schema:description "Abstract: "Recent research has considered the relationship between boosting and more standard statistical methods, such as logistic regression, concluding that AdaBoost is similar but somehow still very different from statistical methods in that it minimizes a different loss function. In this paper we derive an equivalence between AdaBoost and the dual of a convex optimization problem. In this setting, it is seen that the only difference between minimizing the exponential loss used by AdaBoost and maximum likelihood for exponential models is that the latter requires the model to be normalized to form a conditional probability distribution over labels; the two methods minimize the same Kullback-Leibler divergence objective function subject to identical feature constraints. In addition to establishing a simple and easily understood connection between the two methods, this framework enables us to derive new regularization procedures for boosting that directly correspond to penalized maximum likelihood. Experiments on UCI datasets, comparing exponential loss and maximum likelihood for parallel and sequential update algorithms, confirm our theoretical analysis, indicating that AdaBoost and maximum likelihood typically yield identical results as the number of features increases to allow the models to fit the training data.""@en ;
   schema:exampleOfWork <http://worldcat.org/entity/work/id/37655589> ;
   schema:inLanguage "en" ;
   schema:isPartOf <http://experiment.worldcat.org/entity/work/data/37655589#Series/research_paper_carnegie_mellon_university_school_of_computer_science> ; # Research paper (Carnegie Mellon University. School of Computer Science) ;
   schema:name "Boosting and maximum likelihood for exponential models"@en ;
   schema:productID "48184928" ;
   schema:publication <http://www.worldcat.org/title/-/oclc/48184928#PublicationEvent/pittsburgh_pa_school_of_computer_science_carnegie_mellon_university_2001> ;
   schema:publisher <http://experiment.worldcat.org/entity/work/data/37655589#Agent/school_of_computer_science_carnegie_mellon_university> ; # School of Computer Science, Carnegie Mellon University
   wdrs:describedby <http://www.worldcat.org/title/-/oclc/48184928> ;
    .


Related Entities

<http://experiment.worldcat.org/entity/work/data/37655589#Agent/school_of_computer_science_carnegie_mellon_university> # School of Computer Science, Carnegie Mellon University
    a bgn:Agent ;
   schema:name "School of Computer Science, Carnegie Mellon University" ;
    .

<http://experiment.worldcat.org/entity/work/data/37655589#Person/lebanon_guy> # Guy Lebanon
    a schema:Person ;
   schema:familyName "Lebanon" ;
   schema:givenName "Guy" ;
   schema:name "Guy Lebanon" ;
    .

<http://experiment.worldcat.org/entity/work/data/37655589#Place/pittsburgh_pa> # Pittsburgh, Pa.
    a schema:Place ;
   schema:name "Pittsburgh, Pa." ;
    .

<http://experiment.worldcat.org/entity/work/data/37655589#Series/research_paper_carnegie_mellon_university_school_of_computer_science> # Research paper (Carnegie Mellon University. School of Computer Science) ;
    a bgn:PublicationSeries ;
   schema:hasPart <http://www.worldcat.org/oclc/48184928> ; # Boosting and maximum likelihood for exponential models
   schema:name "Research paper (Carnegie Mellon University. School of Computer Science) ;" ;
   schema:name "[Research paper] / Carnegie Mellon University. School of Computer Science ;" ;
    .

<http://id.worldcat.org/fast/1002083> # Logistic regression analysis
    a schema:Intangible ;
   schema:name "Logistic regression analysis"@en ;
    .

<http://id.worldcat.org/fast/1004795> # Machine learning
    a schema:Intangible ;
   schema:name "Machine learning"@en ;
    .

<http://viaf.org/viaf/69171210> # John D. Lafferty
    a schema:Person ;
   schema:familyName "Lafferty" ;
   schema:givenName "John D." ;
   schema:name "John D. Lafferty" ;
    .
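
The linked data above is the record's schema.org description rendered as Turtle-style triples. As a convenience, here is a small sketch showing how the title and contributors could be read back with rdflib, under the assumption that the triples have been saved locally as record.ttl with the usual @prefix declarations added; the file name and setup are hypothetical, not something the catalog supplies.

    from rdflib import Graph, Namespace, URIRef

    SCHEMA = Namespace("http://schema.org/")
    work = URIRef("http://www.worldcat.org/oclc/48184928")

    # Hypothetical local copy of the triples above, saved as record.ttl with
    # @prefix lines added for schema:, library:, bgn:, and wdrs:.
    g = Graph()
    g.parse("record.ttl", format="turtle")

    print("Title:", g.value(work, SCHEMA.name))
    for person in g.objects(work, SCHEMA.creator):
        print("Creator:", g.value(person, SCHEMA.name))
    for person in g.objects(work, SCHEMA.contributor):
        print("Contributor:", g.value(person, SCHEMA.name))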

