Kearns, Michael J.
Overview
Works:  11 works in 47 publications in 1 language and 1,662 library holdings 

Genres:  Conference proceedings 
Roles:  Author, Editor 
Classifications:  Q325.5, 006.3 
Most widely held works by Michael J Kearns
An introduction to computational learning theory by Michael J Kearns (Book)
18 editions published between 1994 and 1997 in English and held by 382 WorldCat member libraries worldwide
Emphasizing issues of computational efficiency, Michael Kearns and Umesh Vazirani introduce a number of central topics in computational learning theory for researchers and students in artificial intelligence, neural networks, theoretical computer science, and statistics. Computational learning theory is a new and rapidly expanding area of research that examines formal models of induction with the goals of discovering the common methods underlying efficient learning algorithms and identifying the computational impediments to learning. Each topic in the book has been chosen to elucidate a general principle, which is explored in a precise formal setting. Intuition has been emphasized in the presentation to make the material accessible to the nontheoretician while still providing precise arguments for the specialist. This balance is the result of new proofs of established theorems and new presentations of the standard proofs. The topics covered include the motivation, definitions, and fundamental results, both positive and negative, for the widely studied L. G. Valiant model of Probably Approximately Correct learning; Occam's Razor, which formalizes a relationship between learning and data compression; the Vapnik-Chervonenkis dimension; the equivalence of weak and strong learning; efficient learning in the presence of noise by the method of statistical queries; relationships between learning and cryptography, and the resulting computational limitations on efficient learning; reducibility between learning problems; and algorithms for learning finite automata from active experimentation.
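The blurb's central object, the Probably Approximately Correct model, comes with a standard sample-complexity bound for finite hypothesis classes (an Occam-style bound, not a result quoted from the book itself): a learner that outputs a hypothesis consistent with m >= (1/epsilon) * (ln|H| + ln(1/delta)) examples is (epsilon, delta)-PAC. A minimal sketch, instantiated for the illustrative case of Boolean conjunctions over n variables, where |H| = 3^n:

```python
import math

def pac_sample_bound(hyp_count, epsilon, delta):
    """Standard PAC bound for a finite hypothesis class H: a consistent
    learner needs m >= (1/epsilon) * (ln|H| + ln(1/delta)) examples to
    output, with probability >= 1 - delta, a hypothesis of error <= epsilon."""
    return math.ceil((1.0 / epsilon) * (math.log(hyp_count) + math.log(1.0 / delta)))

# Boolean conjunctions over n variables: each variable appears positively,
# negatively, or not at all, so |H| = 3^n.
n = 10
m = pac_sample_bound(3 ** n, epsilon=0.1, delta=0.05)
print(m)  # 140 examples suffice for this (epsilon, delta) pair
```

The bound grows only logarithmically in |H|, which is why even exponentially large hypothesis classes like conjunctions remain sample-efficient to learn.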
The computational complexity of machine learning by Michael J Kearns (Book)
10 editions published between 1989 and 1990 in English and held by 284 WorldCat member libraries worldwide
We also give algorithms for learning powerful concept classes under the uniform distribution, and give equivalences between natural models of efficient learnability. This thesis also includes detailed definitions and motivation for the distribution-free model, a chapter discussing past research in this model and related models, and a short list of important open problems.
Advances in neural information processing systems 10 : proceedings of the 1997 conference by IEEE Conference on Neural Information Processing Systems: Natural and Synthetic (Book)
11 editions published in 1998 in English and held by 74 WorldCat member libraries worldwide
Learning in the presence of malicious errors by Michael J Kearns (Book)
1 edition published in 1987 in English and held by 5 WorldCat member libraries worldwide
Learning Boolean formulae or finite automata is as hard as factoring by Michael J Kearns (Book)
1 edition published in 1988 in English and held by 4 WorldCat member libraries worldwide
Advances in neural information processing systems 10 (Book)
1 edition published in 1998 in English and held by 1 WorldCat member library worldwide
Exact identification of read-once formulas using fixed points of amplification functions by Sally A Goldman (Book)
1 edition published in 1991 in English and held by 1 WorldCat member library worldwide
Abstract: "In this paper we describe a new technique for exactly identifying certain classes of read-once Boolean formulas. The method is based on sampling the input-output behavior of the target formula on a probability distribution which is determined by the fixed point of the formula's amplification function (defined as the probability that a 1 is output by the formula when each input bit is 1 independently with probability p). By performing various statistical tests on easily sampled variants of the fixed-point distribution, we are able to efficiently infer all structural information about any logarithmic-depth formula (with high probability). We apply our results to prove the existence of short universal identification sequences for large classes of formulas."
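The amplification function defined in the abstract is straightforward to compute exactly for a given read-once formula. A minimal illustrative sketch (the tree encoding and the bisection-based fixed-point search are my own, not taken from the paper): for the depth-2 alternating tree AND(OR(x1,x2), OR(x3,x4)), the amplification function is A(p) = (1 - (1-p)^2)^2, whose nontrivial fixed point is (3 - sqrt(5))/2, approximately 0.382.

```python
# A read-once formula is a nested structure: 'x' for an input leaf,
# or ('AND', children) / ('OR', children). Each leaf is an independent
# input bit that is 1 with probability p.

def amplification(gate, p):
    """Exact probability that the formula outputs 1 on p-biased inputs."""
    if gate == 'x':
        return p
    op, children = gate
    probs = [amplification(c, p) for c in children]
    out = 1.0
    if op == 'AND':
        for q in probs:          # all children must be 1
            out *= q
        return out
    for q in probs:              # OR: complement of all-children-0
        out *= 1.0 - q
    return 1.0 - out

def fixed_point(gate, lo=1e-9, hi=1.0 - 1e-9, iters=100):
    """Bisection on A(p) - p to locate a nontrivial fixed point in (0, 1)."""
    f = lambda p: amplification(gate, p) - p
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Depth-2 alternating tree: AND of two ORs over four distinct inputs.
tree = ('AND', [('OR', ['x', 'x']), ('OR', ['x', 'x'])])
p_star = fixed_point(tree)  # approx (3 - sqrt(5)) / 2 ~ 0.381966
```

Sampling the formula's input-output behavior at this p* is what makes the fixed-point distribution special: the output bit is 1 with exactly the same probability as each input bit.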
Advances in neural information processing systems : proceedings of the 1997 conference (Book)
1 edition published in 1998 in English and held by 1 WorldCat member library worldwide
Nature of the porcine intestinal K88 receptor by Michael J Kearns
1 edition published in 1979 in English and held by 1 WorldCat member library worldwide
Advances in neural information processing systems 10
1 edition published in 1998 in English and held by 1 WorldCat member library worldwide
Public affairs : a study and proposal for the Illinois Department of Transportation by Michael J Kearns (Book)
1 edition published in 1971 in English and held by 0 WorldCat member libraries worldwide
Associated Subjects
Algebra, Boolean; Algorithms; Artificial intelligence; Computational complexity; Computational learning theory; Errors; Illinois; Illinois. Division of Highways; Information storage and retrieval systems; Machine learning; Neural circuitry; Neural computers; Neural networks (Computer science); Neurosciences; Probability learning; Public relations; Sequential machine theory; Transportation--Public relations