Hinton, Geoffrey E.

Overview
Works: 36 works in 104 publications in 2 languages and 2,668 library holdings
Genres: Conference proceedings 
Roles: Editor
Classifications: QP408, 612.82
Most widely held works by Geoffrey E Hinton
Unsupervised learning foundations of neural computation by G Hinton( Computer File )
10 editions published between 1998 and 1999 in English and held by 1,374 libraries worldwide
This volume, on unsupervised learning algorithms, focuses on neural network learning algorithms that do not require an explicit teacher. The goal of unsupervised learning is to extract an efficient internal representation of the statistical structure implicit in the inputs. These algorithms provide insights into the development of the cerebral cortex and implicit learning in humans. They are also of interest to engineers working in areas such as computer vision and speech recognition who seek efficient representations of raw input data
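As a concrete illustration of extracting statistical structure without a teacher, here is a minimal sketch of one classic unsupervised rule (Oja's rule); it is chosen for illustration only and is not drawn from the volume itself. A single linear unit's weight vector converges toward the principal direction of variance in its inputs.

```python
import numpy as np

# Illustrative sketch (not from the volume): Oja's rule drives a single
# linear unit's weight vector toward the first principal component of the
# input distribution, with no teacher signal.
rng = np.random.default_rng(0)

# Correlated 2-D inputs: most variance lies near the [1, 0.9] direction.
x = rng.normal(size=(2000, 2))
x[:, 1] = 0.9 * x[:, 0] + 0.1 * x[:, 1]

w = rng.normal(size=2)
eta = 0.01
for xi in x:
    y = w @ xi                      # unit's output
    w += eta * y * (xi - y * w)     # Hebbian growth term plus a decay term

w /= np.linalg.norm(w)
print(w)  # approximately aligned (up to sign) with the principal direction
```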
Parallel models of associative memory by Geoffrey E Hinton( Book )
27 editions published between 1981 and 2014 in English and held by 856 libraries worldwide
This update of the 1981 classic on neural networks includes new commentaries by the authors that show how the original ideas are related to subsequent developments. As researchers continue to uncover ways of applying the complex information processing abilities of neural networks, they give these models an exciting future which may well involve revolutionary developments in understanding the brain and the mind -- developments that may allow researchers to build adaptive intelligent machines. The original chapters show where the ideas came from and the new commentaries show where they are going
Connectionist symbol processing by Geoffrey E Hinton( Book )
13 editions published between 1990 and 1991 in English and held by 182 libraries worldwide
Proceedings of the 1988 Connectionist Models Summer School by Connectionist Models Summer School( Book )
6 editions published between 1988 and 1989 in English and held by 167 libraries worldwide
Boltzmann machines : constraint satisfaction networks that learn by Geoffrey E Hinton( Book )
1 edition published in 1984 in English and held by 10 libraries worldwide
Connectionist learning procedures by Geoffrey E Hinton( Book )
5 editions published between 1987 and 1990 in English and Undetermined and held by 9 libraries worldwide
A time-delay neural network architecture for speech recognition by Kevin J Lang( Book )
2 editions published in 1988 in English and held by 9 libraries worldwide
The time-delay architecture was developed on a subset of the alphabetic E-set, a task which is difficult because the distinguishing sounds are low in energy and short in duration. A system can only achieve good performance on the task by learning to ignore meaningless variations in the vowel and the background noise which are major constituents of the input patterns. The time-delay network learned to isolate and analyze the short consonant releases in the patterns without being told that these events were useful, or even where they were located
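The core mechanism that lets such a network find a short event wherever it occurs can be sketched as follows; this is an illustration of the shared-weight time-delay idea, not the paper's actual architecture, and all names are hypothetical.

```python
import numpy as np

# Illustrative sketch: a time-delay layer applies one shared weight kernel
# at every position along the time axis, so a short acoustic event produces
# the same feature activation wherever it occurs in the utterance.
def time_delay_layer(frames, kernel):
    """frames: (T, F) spectrogram-like input; kernel: (D, F) shared weights
    spanning D consecutive frames. Returns one activation per time window."""
    T, _ = frames.shape
    D = kernel.shape[0]
    return np.array([np.sum(frames[t:t + D] * kernel) for t in range(T - D + 1)])

rng = np.random.default_rng(1)
frames = rng.normal(size=(10, 3))
kernel = rng.normal(size=(4, 3))
acts = time_delay_layer(frames, kernel)

# Shifting the input in time shifts the activations; it does not change them.
shifted = time_delay_layer(np.roll(frames, 2, axis=0), kernel)
```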
Special issue on connectionist symbol processing ( Book )
1 edition published in 1990 in English and held by 7 libraries worldwide
A distributed connectionist production system by David S Touretzky( Book )
4 editions published between 1986 and 1987 in English and Undetermined and held by 7 libraries worldwide
DCPS is a connectionist production system interpreter that uses distributed representations. As a connectionist model it consists of many simple, richly interconnected neuron like computing units that cooperate to solve problems in parallel. One motivation for constructing DCPS was to demonstrate that connectionist models are capable of representing and using explicit rules. A second motivation was to show how coarse coding or distributed representations can be used to construct a working memory that requires far fewer units than the number of different facts that can potentially be stored. The simulation we present is intended as a detailed demonstration of the feasibility of certain ideas and should not be viewed as a full implementation of production systems. Our current model only has a few of the many interesting emergent properties that we eventually hope to demonstrate: it is damage resistant, it performs matching and variable binding by massively parallel constraint satisfaction, and the capacity of its working memory is dependent on the similarity of the items being stored
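The coarse-coding idea behind that working memory can be sketched in a few lines; this is an illustration of the general technique, not the DCPS model itself, and the pool sizes and fact strings are made up.

```python
import random

# Illustrative sketch of coarse coding: each fact activates a small random
# subset of a shared pool of units, and working memory is the union of the
# active subsets. Far more distinct facts can be represented than there are
# units, at the cost of some crosstalk between similar stored items.
POOL = 200      # total units in the pool
CODE = 15       # units activated per fact

def code_for(fact):
    rnd = random.Random(fact)                 # deterministic subset per fact
    return frozenset(rnd.sample(range(POOL), CODE))

memory = set()
for fact in ["(goal A B)", "(on B C)", "(clear A)"]:
    memory |= code_for(fact)                  # store a fact: turn its units on

def present(fact, threshold=0.9):
    """A fact counts as stored if nearly all of its units are active."""
    return len(code_for(fact) & memory) / CODE >= threshold

print(present("(on B C)"))      # stored fact: all of its units are on
print(present("(on C A)"))      # unstored fact: only chance overlap
```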
Experiments on learning by back propagation by David C Plaut( Book )
2 editions published between 1986 and 1987 in English and held by 5 libraries worldwide
Rumelhart, Hinton and Williams (Rumelhart 86) describe a learning procedure for layered networks of deterministic, neuron-like units. This paper describes further research on the learning procedure. We start by describing the units, the way they are connected, the learning procedure, and the extension to iterative nets. We then give an example in which a network learns a set of filters that enable it to discriminate formant-like patterns in the presence of noise. The speed of learning is strongly dependent on the shape of the surface formed by the error measure in weight space. We give examples of the shape of the error surface for a typical task and illustrate how an acceleration method speeds up descent in weight space. The main drawback of the learning procedure is the way it scales as the size of the task and the network increases. We give some preliminary results on scaling and show how the magnitude of the optimal weight changes depends on the fan-in of the units. Additional results illustrate the effects on learning speed of the amount of interaction between the weights. A variation of the learning procedure that back-propagates desired state information rather than error gradients is developed and compared with the standard procedure. Finally, we discuss the relationship between our iterative networks and the analog networks described by Hopfield and Tank (Hopfield 85). The learning procedure can discover appropriate weights in their kind of network, as well as determine an optimal schedule for varying the nonlinearity of the units during a search
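The abstract does not name its acceleration method, but one standard choice is momentum, which can be sketched as follows; the quadratic "ravine" error surface and all parameter values here are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of momentum as an acceleration method. On an error
# surface shaped like a long narrow ravine, plain gradient descent creeps
# along the shallow direction; momentum accumulates velocity along it and
# reaches a much lower error in the same number of steps.
def descend(eta, alpha, steps=100):
    w = np.array([5.0, 1.0])            # E(w) = 0.5*(w0^2 + 50*w1^2): a ravine
    dw = np.zeros(2)
    for _ in range(steps):
        grad = np.array([w[0], 50.0 * w[1]])
        dw = -eta * grad + alpha * dw   # alpha = 0 gives plain gradient descent
        w = w + dw
    return 0.5 * (w[0] ** 2 + 50.0 * w[1] ** 2)

plain = descend(eta=0.01, alpha=0.0)    # no acceleration
accel = descend(eta=0.01, alpha=0.9)    # with momentum
print(plain, accel)
```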
Neural networks for real-world problems Sunday, July 14, 1991, 2:00 pm-6 pm by Geoffrey E Hinton( Computer File )
2 editions published in 1991 in English and held by 4 libraries worldwide
Learning internal representations by error propagation by David E Rumelhart( Book )
2 editions published in 1985 in English and held by 4 libraries worldwide
This paper presents a generalization of the perceptron learning procedure for learning the correct sets of connections for arbitrary networks. The rule, called the generalized delta rule, is a simple scheme for implementing a gradient descent method for finding weights that minimize the sum squared error of the system's performance. The major theoretical contribution of the work is the procedure called error propagation, whereby the gradient can be determined by individual units of the network based only on locally available information. The major empirical contribution of the work is to show that the problem of local minima is not serious in this application of gradient descent. Keywords: Learning; Networks; Perceptrons; Adaptive systems; Learning machines; and Back propagation
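The generalized delta rule can be sketched in a few lines of numpy; the task (XOR), network size, learning rate, and iteration count below are illustrative choices, not taken from the paper.

```python
import numpy as np

# Minimal sketch of the generalized delta rule: gradient descent on the sum
# squared error of a two-layer sigmoid network, with the gradient obtained
# by propagating errors backwards through the net.
rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)      # XOR targets

W1, b1 = rng.normal(size=(2, 3)), np.zeros(3)        # 2 inputs -> 3 hidden
W2, b2 = rng.normal(size=(3, 1)), np.zeros(1)        # 3 hidden -> 1 output

def sse():
    return 0.5 * np.sum((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - T) ** 2)

before = sse()
eta = 0.5
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)                         # forward pass
    y = sigmoid(h @ W2 + b2)
    d2 = (y - T) * y * (1 - y)                       # delta at the output layer
    d1 = (d2 @ W2.T) * h * (1 - h)                   # error propagated backwards
    W2 -= eta * h.T @ d2; b2 -= eta * d2.sum(0)      # gradient descent step
    W1 -= eta * X.T @ d1; b1 -= eta * d1.sum(0)
after = sse()
print(before, after)
```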
Neural network architectures for artificial intelligence by Geoffrey E Hinton( Book )
2 editions published in 1988 in English and held by 4 libraries worldwide
Distributed representations by Geoffrey E Hinton( Book )
2 editions published in 1984 in English and held by 4 libraries worldwide
Le rappresentazioni distribuite [Distributed representations] by Geoffrey E Hinton( Article )
in Italian and held by 3 libraries worldwide
Relaxation and its role in vision by Geoffrey E Hinton( Book )
2 editions published between 1977 and 1987 in English and held by 2 libraries worldwide
Connectionist models : Summer school : Selected revised papers ( Book )
2 editions published between 1988 and 1989 in English and held by 2 libraries worldwide
Connectionist models 1988 : proceedings of the 1988 ... Summer School, Carnegie Mellon University, June 17-26, 1988 by David S Touretzky( Book )
1 edition published in 1989 in English and held by 2 libraries worldwide
The Development of the Time-Delayed Neural Network Architecture ( Book )
1 edition published in 1990 in English and held by 1 library worldwide
Currently, one of the most powerful connectionist learning procedures is back-propagation, which repeatedly adjusts the weights in a network so as to minimize a measure of the difference between the actual output vector of the network and a desired output vector given the current input vector. The simple weight adjusting rule is derived by propagating partial derivatives of the error backwards through the net using the chain rule. Experiments have shown that back-propagation has most of the properties desired by connectionists. As with any worthwhile learning rule, it can learn non-linear black box functions and make fine distinctions between input patterns in the presence of noise. Moreover, starting from random initial states, back-propagation networks can learn to use their hidden (intermediate layer) units to efficiently represent the structure that is inherent in their input data, often discovering intuitively pleasing features. The fact that back-propagation can discover features and distinguish between similar patterns in the presence of noise makes it a natural candidate as a speech recognition method. Another reason for expecting back-propagation to be good at speech is the success that hidden Markov models have enjoyed in speech recognition, which shows that a technique can be useful when there is a rigorous automatic method for tuning its parameters. (KR)
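Since the weight-adjusting rule is just the chain rule applied to the error, a standard sanity check (illustrative, not from the report) is to compare the back-propagated derivative against a finite-difference estimate:

```python
import numpy as np

# Illustrative check: for a single sigmoid unit, the chain-rule (back-
# propagated) partial derivative dE/dw should match a central-difference
# numerical estimate of the same derivative.
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

x, t = np.array([0.5, -1.0]), 1.0      # one input pattern and its target
w = np.array([0.3, 0.8])

def E(w):
    y = sigmoid(w @ x)
    return 0.5 * (y - t) ** 2

# Chain rule: dE/dw = (y - t) * y * (1 - y) * x
y = sigmoid(w @ x)
analytic = (y - t) * y * (1 - y) * x

eps = 1e-6
numeric = np.array([(E(w + eps * np.eye(2)[i]) - E(w - eps * np.eye(2)[i])) / (2 * eps)
                    for i in range(2)])
print(analytic, numeric)   # the two estimates agree to several decimal places
```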
How learning can guide evolution by Geoffrey E Hinton( Book )
1 edition published in 1986 in English and held by 1 library worldwide
 
Alternative Names
Hinton, G. E. (Geoffrey E.)
Hinton, Geoffrey
Languages
English (86)
Italian (1)