Kearns, Michael J.
Overview
Works:  10 works in 58 publications in 1 language and 1,882 library holdings 

Genres:  Conference papers and proceedings; Periodicals 
Roles:  Author, Editor 
Classifications:  Q325.5, 006.3 
Most widely held works by Michael J Kearns
An introduction to computational learning theory by Michael J Kearns (Book)
24 editions published between 1994 and 1997 in English and held by 388 WorldCat member libraries worldwide
Emphasizing issues of computational efficiency, Michael Kearns and Umesh Vazirani introduce a number of central topics in computational learning theory for researchers and students in artificial intelligence, neural networks, theoretical computer science, and statistics. Computational learning theory is a new and rapidly expanding area of research that examines formal models of induction with the goals of discovering the common methods underlying efficient learning algorithms and identifying the computational impediments to learning. Each topic in the book has been chosen to elucidate a general principle, which is explored in a precise formal setting. Intuition has been emphasized in the presentation to make the material accessible to the nontheoretician while still providing precise arguments for the specialist. This balance is the result of new proofs of established theorems, and new presentations of the standard proofs. The topics covered include the motivation, definitions, and fundamental results, both positive and negative, for the widely studied L. G. Valiant model of Probably Approximately Correct learning; Occam's Razor, which formalizes a relationship between learning and data compression; the Vapnik-Chervonenkis dimension; the equivalence of weak and strong learning; efficient learning in the presence of noise by the method of statistical queries; relationships between learning and cryptography, and the resulting computational limitations on efficient learning; reducibility between learning problems; and algorithms for learning finite automata from active experimentation.
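The blurb above mentions Valiant's PAC model. As an illustrative aside (not taken from the book itself), the classic elimination algorithm for PAC-learning monotone conjunctions fits in a few lines of Python; the function name `pac_learn_conjunction` and the toy target are hypothetical names for this sketch:

```python
import random

def pac_learn_conjunction(examples):
    """Classic elimination algorithm for monotone conjunctions:
    start with the conjunction of all variables and delete any
    variable that is 0 in some positive example."""
    n = len(examples[0][0])
    hypothesis = set(range(n))
    for x, label in examples:
        if label == 1:
            hypothesis -= {i for i in hypothesis if x[i] == 0}
    return hypothesis

# Toy target concept: x0 AND x2 over 5 Boolean variables,
# with examples drawn uniformly at random.
target = {0, 2}
random.seed(0)
sample = []
for _ in range(200):
    x = [random.randint(0, 1) for _ in range(5)]
    sample.append((x, int(all(x[i] == 1 for i in target))))

print(sorted(pac_learn_conjunction(sample)))  # recovers the target w.h.p.
```

The hypothesis only shrinks on positive examples, so it never misclassifies a positive; with enough samples each spurious variable is eliminated with high probability, which is the shape of the PAC guarantee the book formalizes.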
The computational complexity of machine learning by Michael J Kearns (Book)
12 editions published between 1989 and 1990 in English and held by 289 WorldCat member libraries worldwide
We are interested in the phenomenon of efficient learning in the distribution-free model, in the standard polynomial-time sense. Our results include general tools for determining the polynomial-time learnability of a concept class, an extensive study of efficient learning when errors are present in the examples, and lower bounds on the number of examples required for learning in our model. A centerpiece of the thesis is a series of results demonstrating the computational difficulty of learning a number of well-studied concept classes. These results are obtained by reducing some apparently hard number-theoretic problems from cryptography to the learning problems. The hard-to-learn concept classes include the sets represented by Boolean formulae, deterministic finite automata, and a simplified form of neural networks.
Advances in neural information processing systems 9 : proceedings of the 1996 conference by NIPS (Book)
15 editions published between 1998 and 1999 in English and held by 82 WorldCat member libraries worldwide
Contains the entire proceedings of the 12 Neural Information Processing Systems conferences from 1988 to 1999
An Introduction to Computational Learning Theory by Michael J Kearns
in English and held by 13 WorldCat member libraries worldwide
Learning Boolean formulae or finite automata is as hard as factoring by Michael J Kearns (Book)
1 edition published in 1988 in English and held by 4 WorldCat member libraries worldwide
Exact identification of read-once formulas using fixed points of amplification functions by Sally A Goldman (Book)
1 edition published in 1991 in English and held by 1 WorldCat member library worldwide
Abstract: "In this paper we describe a new technique for exactly identifying certain classes of read-once Boolean formulas. The method is based on sampling the input-output behavior of the target formula on a probability distribution which is determined by the fixed point of the formula's amplification function (defined as the probability that a 1 is output by the formula when each input bit is 1 independently with probability p). By performing various statistical tests on easily sampled variants of the fixed-point distribution, we are able to efficiently infer all structural information about any logarithmic-depth formula (with high probability). We apply our results to prove the existence of short universal identification sequences for large classes of formulas."
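The amplification function defined in the abstract is easy to make concrete. As a hypothetical illustration (this particular gadget is assumed, not taken from the paper), consider the depth-2 read-once formula AND(OR(x1, x2), OR(x3, x4)); its amplification function and nontrivial fixed point can be computed numerically:

```python
import math

def amplification(p):
    """A(p) for the read-once formula AND(OR(x1, x2), OR(x3, x4)):
    the probability the formula outputs 1 when each input bit is 1
    independently with probability p."""
    p_or = 1 - (1 - p) ** 2      # each OR gate maps p to 1 - (1-p)^2
    return p_or ** 2             # the AND gate multiplies the two ORs

# Locate the nontrivial fixed point A(p) = p by bisection:
# A(p) - p is negative just above 0 and positive just below 1.
lo, hi = 0.01, 0.99
for _ in range(100):
    mid = (lo + hi) / 2
    if amplification(mid) < mid:
        lo = mid                 # fixed point lies above mid
    else:
        hi = mid                 # fixed point lies at or below mid
fixed_point = (lo + hi) / 2
print(round(fixed_point, 6))     # ≈ (3 - sqrt(5)) / 2
```

The paper's method samples the formula's behavior at (variants of) such fixed-point distributions; this snippet only shows that the fixed point itself is straightforward to compute for a given gadget.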
Advances in neural information processing systems 10
1 edition published in 1998 in English and held by 1 WorldCat member library worldwide
Nature of the porcine intestinal K88 receptor by Michael J Kearns
1 edition published in 1979 in English and held by 1 WorldCat member library worldwide
Advances in neural information processing systems 10 (Book)
1 edition published in 1998 in English and held by 1 WorldCat member library worldwide
Public affairs : a study and proposal for the Illinois Department of Transportation by Michael J Kearns (Book)
1 edition published in 1971 in English and held by 0 WorldCat member libraries worldwide
Associated Subjects
Algebra, Boolean; Algorithms; Artificial intelligence; Computational complexity; Computational learning theory; Digital computer simulation; Illinois; Illinois. Division of Highways; Information storage and retrieval systems; Machine learning; Neural circuitry; Neural computers; Neural networks (Computer science); Neurosciences; Probability learning; Public relations; Sequential machine theory; Transportation--Public relations