Optimization for machine learning

Author: Suvrit Sra; Sebastian Nowozin; Stephen J. Wright
Publisher: Cambridge, Mass. : MIT Press, ©2012.
Series: Neural information processing series.
Edition/Format: eBook : Document : English
Database:WorldCat

Details

Genre/Form: Electronic books
Additional Physical Format: Print version:
Optimization for machine learning.
Cambridge, Mass. : MIT Press, c2012
(DLC) 2011002059
(OCoLC)701493361
Material Type: Document, Internet resource
Document Type: Internet Resource, Computer File
All Authors / Contributors: Suvrit Sra; Sebastian Nowozin; Stephen J. Wright
ISBN: 9780262298773; 0262298775
OCLC Number: 758384972
Description: 1 online resource (ix, 494 p.) : ill.
Contents: Introduction : Optimization and machine learning / S. Sra, S. Nowozin, and S.J. Wright --
Convex optimization with sparsity-inducing norms / F. Bach, R. Jenatton, J. Mairal, and G. Obozinski --
Interior-point methods for large-scale cone programming / M. Andersen, J. Dahl, Z. Liu, and L. Vandenberghe --
Incremental gradient, subgradient, and proximal methods for convex optimization : a survey / D. P. Bertsekas --
First-order methods for nonsmooth convex large-scale optimization, I : general purpose methods / A. Juditsky and A. Nemirovski --
First-order methods for nonsmooth convex large-scale optimization, II : utilizing problem's structure / A. Juditsky and A. Nemirovski --
Cutting-plane methods in machine learning / V. Franc, S. Sonnenburg, and T. Werner --
Introduction to dual decomposition for inference / D. Sontag, A. Globerson, and T. Jaakkola --
Augmented Lagrangian methods for learning, selecting, and combining features / R. Tomioka, T. Suzuki, and M. Sugiyama --
The convex optimization approach to regret minimization / E. Hazan --
Projected Newton-type methods in machine learning / M. Schmidt, D. Kim, and S. Sra --
Interior-point methods in machine learning / J. Gondzio --
The tradeoffs of large-scale learning / L. Bottou and O. Bousquet --
Robust optimization in machine learning / C. Caramanis, S. Mannor, and H. Xu --
Improving first and second-order methods by modeling uncertainty / N. Le Roux, Y. Bengio, and A. Fitzgibbon --
Bandit view on noisy optimization / J.-Y. Audibert, S. Bubeck, and R. Munos --
Optimization methods for sparse inverse covariance selection / K. Scheinberg and S. Ma --
A pathwise algorithm for covariance selection / V. Krishnamurthy, S. D. Ahipasaoglu, and A. d'Aspremont.
Series Title: Neural information processing series.
Responsibility: edited by Suvrit Sra, Sebastian Nowozin, and Stephen J. Wright.

Linked Data


<http://www.worldcat.org/oclc/758384972>
library:oclcnum "758384972"
library:placeOfPublication
library:placeOfPublication
rdf:type schema:Book
rdf:type schema:MediaObject
schema:about
schema:about
schema:about
schema:about
schema:about
schema:about
schema:about
schema:about
schema:about
schema:bookFormat schema:EBook
schema:contributor
schema:contributor
schema:contributor
schema:copyrightYear "2012"
schema:datePublished "2012"
schema:description "Introduction : Optimization and machine learning / S. Sra, S. Nowozin, and S.J. Wright -- Convex optimization with sparsity-inducing norms / F. Bach, R. Jenatton, J. Mairal, and G. Obozinski -- Interior-point methods for large-scale cone programming / M. Andersen, J. Dahl, Z. Liu, and L. Vandenberghe -- Incremental gradient, subgradient, and proximal methods for convex optimization : a survey / D. P. Bertsekas -- First-order methods for nonsmooth convex large-scale optimization, I : general purpose methods / A. Juditsky and A. Nemirovski -- First-order methods for nonsmooth convex large-scale optimization, II : utilizing problem's structure / A. Juditsky and A. Nemirovski -- Cutting-plane methods in machine learning / V. Franc, S. Sonnenburg, and T. Werner -- Introduction to dual decomposition for inference / D. Sontag, A. Globerson, and T. Jaakkola -- Augmented Lagrangian methods for learning, selecting, and combining features / R. Tomioka, T. Suzuki, and M. Sugiyama -- The convex optimization approach to regret minimization / E. Hazan -- Projected Newton-type methods in machine learning / M. Schmidt, D. Kim, and S. Sra -- Interior-point methods in machine learning / J. Gondzio -- The tradeoffs of large-scale learning / L. Bottou and O. Bousquet -- Robust optimization in machine learning / C. Caramanis, S. Mannor, and H. Xu -- Improving first and second-order methods by modeling uncertainty / N. Le Roux, Y. Bengio, and A. Fitzgibbon -- Bandit view on noisy optimization / J.-Y. Audibert, S. Bubeck, and R. Munos -- Optimization methods for sparse inverse covariance selection / K. Scheinberg and S. Ma -- A pathwise algorithm for covariance selection / V. Krishnamurthy, S. D. Ahipasaoglu, and A. d'Aspremont."@en
schema:exampleOfWork <http://worldcat.org/entity/work/id/785795896>
schema:genre "Electronic books"@en
schema:inLanguage "en"
schema:isPartOf
schema:name "Optimization for machine learning"@en
schema:numberOfPages "494"
schema:publication
schema:publisher
schema:url <http://www.jstor.org/stable/10.2307/j.ctt5hhgpg>
schema:url <http://www.myilibrary.com?id=330284>
schema:url <http://site.ebrary.com/lib/alltitles/Doc?id=10504740>
schema:url <http://site.ebrary.com/id/10504740>
schema:url <http://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&db=nlabk&AN=399078>
schema:workExample
wdrs:describedby
