
Local Loss Optimization in Operator Models: A New Insight into Spectral Learning

Author: Balle, Borja; Quattoni, Ariadna; Carreras, Xavier
Published: 2012-06-27
Edition/Format: Downloadable archival material
Subjects: Computer Science - Machine Learning; Statistics - Machine Learning

Find a copy online

Links to this item: http://arxiv.org/abs/1206.6393

Details

Genre/Form: text
Material Type: Internet resource
Document Type: Internet Resource, Archival Material
All Authors / Contributors: Balle, Borja; Quattoni, Ariadna; Carreras, Xavier
OCLC Number: 815865081

Abstract:

This paper revisits the spectral method for learning latent variable models defined in terms of observable operators. We give a new perspective on the method, showing that operators can be recovered by minimizing a loss defined on a finite subset of the domain. A non-convex optimization similar to the spectral method is derived. We also propose a regularized convex relaxation of this optimization. We show that in practice the availability of a continuous regularization parameter (in contrast with the discrete number of states in the original method) allows a better trade-off between accuracy and model complexity. We also prove that in general, a randomized strategy for choosing the local loss will succeed with high probability.

Comment: Appears in Proceedings of the 29th International Conference on Machine Learning (ICML 2012)
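
The abstract refers to the spectral method, whose standard form recovers the observable operators from a truncated SVD of an empirical Hankel matrix. As a rough point of reference only (this is the classical recovery step the paper revisits, not its local-loss or convex-relaxation variant), a minimal NumPy sketch might look as follows; the function names, argument layout, and choice of prefix/suffix basis are illustrative assumptions, not anything specified by this record or by the paper.

import numpy as np

def spectral_operators(hankel, hankel_sigma, h_prefix, h_suffix, n_states):
    # Illustrative sketch of the classical Hankel-SVD recovery step.
    # hankel       : |P| x |S| empirical Hankel matrix over a prefix/suffix basis
    # hankel_sigma : dict mapping each symbol sigma to its shifted Hankel H_sigma
    # h_prefix     : length-|P| column of hankel for the empty suffix
    # h_suffix     : length-|S| row of hankel for the empty prefix
    # n_states     : number of states kept in the truncated SVD
    _, _, vt = np.linalg.svd(hankel, full_matrices=False)
    v = vt[:n_states].T                      # |S| x n, top right-singular vectors

    hv_pinv = np.linalg.pinv(hankel @ v)     # n x |P|
    alpha_1 = h_suffix @ v                   # initial weights, shape (n,)
    alpha_inf = hv_pinv @ h_prefix           # final weights, shape (n,)
    # One observable operator A_sigma per symbol, each n x n.
    operators = {s: hv_pinv @ h_s @ v for s, h_s in hankel_sigma.items()}
    return alpha_1, operators, alpha_inf

def score(word, alpha_1, operators, alpha_inf):
    # Value assigned to a string by the learned operator model:
    # f(x_1 ... x_t) = alpha_1^T A_{x_1} ... A_{x_t} alpha_inf
    state = alpha_1
    for symbol in word:
        state = state @ operators[symbol]
    return float(state @ alpha_inf)

Against this baseline, the paper's point (per the abstract) is that the operators can equivalently be obtained by minimizing a loss defined on a finite subset of the domain, and that a regularized convex relaxation replaces the discrete choice of n_states above with a continuous regularization parameter.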


Linked Data


Primary Entity

<http://www.worldcat.org/oclc/815865081> # Local Loss Optimization in Operator Models: A New Insight into Spectral Learning
    a schema:CreativeWork, schema:MediaObject, library:ArchiveMaterial ;
    library:oclcnum "815865081" ;
    schema:about <http://experiment.worldcat.org/entity/work/data/1176768075#Thing/computer_science_machine_learning> ; # Computer Science - Machine Learning
    schema:about <http://experiment.worldcat.org/entity/work/data/1176768075#Thing/statistics_machine_learning> ; # Statistics - Machine Learning
    schema:creator <http://experiment.worldcat.org/entity/work/data/1176768075#Agent/quattoni_ariadna> ; # Quattoni, Ariadna
    schema:creator <http://experiment.worldcat.org/entity/work/data/1176768075#Agent/carreras_xavier> ; # Carreras, Xavier
    schema:creator <http://experiment.worldcat.org/entity/work/data/1176768075#Agent/balle_borja> ; # Balle, Borja
    schema:datePublished "2012/06/27" ;
    schema:datePublished "2012" ;
    schema:description "This paper re-visits the spectral method for learning latent variable models defined in terms of observable operators. We give a new perspective on the method, showing that operators can be recovered by minimizing a loss defined on a finite subset of the domain. A non-convex optimization similar to the spectral method is derived. We also propose a regularized convex relaxation of this optimization. We show that in practice the availabilty of a continuous regularization parameter (in contrast with the discrete number of states in the original method) allows a better trade-off between accuracy and model complexity. We also prove that in general, a randomized strategy for choosing the local loss will succeed with high probability." ;
    schema:description "Comment: Appears in Proceedings of the 29th International Conference on Machine Learning (ICML 2012)" ;
    schema:exampleOfWork <http://worldcat.org/entity/work/id/1176768075> ;
    schema:genre "text" ;
    schema:name "Local Loss Optimization in Operator Models: A New Insight into Spectral Learning" ;
    schema:productID "815865081" ;
    schema:publication <http://www.worldcat.org/title/-/oclc/815865081#PublicationEvent/2012_06_27> ;
    schema:url <http://arxiv.org/abs/1206.6393> ;
    wdrs:describedby <http://www.worldcat.org/title/-/oclc/815865081> .

Related Entities

<http://experiment.worldcat.org/entity/work/data/1176768075#Agent/balle_borja> # Balle, Borja
    a bgn:Agent ;
    schema:name "Balle, Borja" .

<http://experiment.worldcat.org/entity/work/data/1176768075#Agent/carreras_xavier> # Carreras, Xavier
    a bgn:Agent ;
    schema:name "Carreras, Xavier" .

<http://experiment.worldcat.org/entity/work/data/1176768075#Agent/quattoni_ariadna> # Quattoni, Ariadna
    a bgn:Agent ;
    schema:name "Quattoni, Ariadna" .

<http://experiment.worldcat.org/entity/work/data/1176768075#Thing/computer_science_machine_learning> # Computer Science - Machine Learning
    a schema:Thing ;
    schema:name "Computer Science - Machine Learning" .

<http://experiment.worldcat.org/entity/work/data/1176768075#Thing/statistics_machine_learning> # Statistics - Machine Learning
    a schema:Thing ;
    schema:name "Statistics - Machine Learning" .

<http://www.worldcat.org/title/-/oclc/815865081>
    a genont:InformationResource, genont:ContentTypeGenericResource ;
    schema:about <http://www.worldcat.org/oclc/815865081> ; # Local Loss Optimization in Operator Models: A New Insight into Spectral Learning
    schema:dateModified "2019-06-29" ;
    void:inDataset <http://purl.oclc.org/dataset/WorldCat> ;
    void:inDataset <http://purl.oclc.org/dataset/WorldCat/DigitalCollectionGateway> .