Fahlman, Scott E.
Most widely held works by Scott E Fahlman
NETL, a system for representing and using real-world knowledge by Scott E Fahlman ( Book )
10 editions published between 1979 and 1987 in English and held by 390 libraries worldwide
Common LISP : the language by Guy L Steele ( Book )
1 edition published in 1984 in English and held by 23 libraries worldwide
A system for representing and using real-world knowledge by Scott E Fahlman ( Book )
5 editions published between 1977 and 1979 in English and held by 10 libraries worldwide
Common Lisp by Guy L Steele ( Book )
1 edition published in 1984 in Undetermined and held by 9 libraries worldwide
A planning system for robot construction tasks by Scott E Fahlman ( Book )
3 editions published in 1973 in English and held by 9 libraries worldwide
An empirical study of learning speed in back-propagation networks by Scott E Fahlman ( Book )
2 editions published in 1988 in English and held by 8 libraries worldwide
The hashnet interconnection scheme by Scott E Fahlman ( Book )
2 editions published in 1980 in English and held by 8 libraries worldwide
Internal design of CMU Common Lisp on the IBM RT PC by David B McDonald ( Book )
2 editions published between 1987 and 1988 in English and held by 7 libraries worldwide
Thesis progress report : a system for representing and using real-world knowledge by Scott E Fahlman ( Book )
1 edition published in 1975 in English and held by 7 libraries worldwide
The Cascade-Correlation learning architecture by Scott E Fahlman ( Book )
2 editions published in 1990 in English and held by 6 libraries worldwide
Abstract: "Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature-detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network."
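The training loop this abstract describes (train the output weights, then train and freeze a new hidden unit whose output correlates with the residual error, repeat) can be sketched in miniature. The following is an illustrative toy reconstruction, not Fahlman and Lebiere's implementation: the candidate-pool size, learning rates, logistic output unit, and XOR task are all arbitrary choices made for brevity.

```python
import numpy as np

np.random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_outputs(H, y, epochs=500, lr=0.5):
    # Train only the output weights (logistic unit, delta rule);
    # in Cascade-Correlation the hidden units' input weights stay frozen.
    w = np.zeros(H.shape[1])
    for _ in range(epochs):
        w += lr * H.T @ (y - sigmoid(H @ w)) / len(y)
    return w

def train_candidate(H, residual, epochs=500, lr=0.5):
    # Train one candidate unit by gradient ascent to maximize the
    # magnitude of the covariance between its output and the residual.
    v = np.random.randn(H.shape[1])
    r = residual - residual.mean()
    for _ in range(epochs):
        a = np.tanh(H @ v)
        v += lr * np.sign(np.mean(a * r)) * H.T @ ((1 - a**2) * r) / len(r)
    return v, abs(np.mean(np.tanh(H @ v) * r))

def cascade_correlation(X, y, max_hidden=3, n_candidates=8, tol=0.2):
    H = np.hstack([X, np.ones((len(X), 1))])      # inputs plus a bias column
    for _ in range(max_hidden):
        w = train_outputs(H, y)
        residual = y - sigmoid(H @ w)
        if np.max(np.abs(residual)) < tol:
            break
        # Train a pool of candidates, install the best, freeze its weights.
        v, _ = max((train_candidate(H, residual) for _ in range(n_candidates)),
                   key=lambda c: c[1])
        H = np.hstack([H, np.tanh(H @ v)[:, None]])  # new unit sees all earlier units
    return H, train_outputs(H, y)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)           # XOR
H, w = cascade_correlation(X, y)
pred = (sigmoid(H @ w) > 0.5).astype(float)
```

Because XOR is not linearly separable, the initial output layer cannot reduce the residual below the tolerance, so at least one hidden unit is always installed; each new unit receives connections from the inputs and every previously frozen unit, which is what gives the network its cascade structure.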
Information processing research by Allen Newell ( Book )
1 edition published in 1992 in English and held by 6 libraries worldwide
Abstract: "This report documents a broad program of basic and applied information processing research conducted by Carnegie Mellon's School of Computer Science during the period 15 July 1987 through 14 July 1990, and extended through 31 December 1990. We present in detail our seven major research areas: Artificial Intelligence, Image Understanding, Reliable Distributed Systems, Programming Environments, Reasoning About Programs, Uniform Workstation Interfaces, and Very Large Scale Integration. Sections in each chapter present the area's general research context, the specific problems we addressed, our contributions and their significance, and a bibliography for each chapter."
Learning with limited numerical precision using the cascade-correlation algorithm by Markus Hoehfeld ( Book )
1 edition published in 1991 in English and held by 6 libraries worldwide
Abstract: "A key question in the design of specialized hardware for simulation of neural networks is whether fixed-point arithmetic of limited numerical precision can be used with existing learning algorithms. We present an empirical study of the effects of limited precision in Cascade-Correlation networks on three different learning problems. We show that learning can fail abruptly as the precision of network weights or weight-update calculations is reduced below 12 bits. We introduce techniques for dynamic rescaling and probabilistic rounding that allow reliable convergence down to 6 bits of precision, with only a gradual reduction in the quality of the solutions."
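The "probabilistic rounding" the abstract mentions can be sketched as a generic unbiased stochastic-rounding routine: round up to the next fixed-point grid value with probability equal to the fractional remainder, so the quantized value is correct in expectation. This is a standard illustration of the idea, not the authors' exact procedure; the bit width and test value below are arbitrary.

```python
import numpy as np

def probabilistic_round(x, frac_bits, rng):
    # Quantize x to a fixed-point grid with `frac_bits` fractional bits.
    # Instead of always taking the nearest grid point, round up with
    # probability equal to the fractional remainder, so repeated
    # quantizations are unbiased in expectation.
    scale = 2.0 ** frac_bits
    scaled = x * scale
    lo = np.floor(scaled)
    round_up = rng.random(np.shape(x)) < (scaled - lo)
    return (lo + round_up) / scale

rng = np.random.default_rng(0)
# 0.3 lies between the 2-bit grid points 0.25 and 0.5; the rounded
# values land on those two points with a mean of about 0.3.
q = probabilistic_round(np.full(100_000, 0.3), frac_bits=2, rng=rng)
```

Deterministic round-to-nearest would map every 0.3 to 0.25, introducing a systematic bias that accumulates across many small weight updates; the stochastic version trades that bias for zero-mean noise, which is the property that lets training survive at low precision.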
Reducing network depth in the cascade-correlation learning architecture by Shumeet Baluja ( Book )
1 edition published in 1994 in English and held by 5 libraries worldwide
Abstract: "The Cascade-Correlation learning algorithm constructs a multi-layer artificial neural network as it learns to perform a given task. The resulting network's size and topology are chosen specifically for this task. In the resulting 'cascade' networks, each new hidden unit receives incoming connections from all input and pre-existing hidden units. In effect, each new unit adds a new layer to the network. This allows Cascade-Correlation to create complex feature detectors, but it typically results in a network that is deeper, in terms of the longest path from input to output, than is necessary to solve the problem efficiently. In this paper we investigate a simple variation of Cascade-Correlation that will build deep nets if necessary, but that is biased toward minimizing network depth. We demonstrate empirically, across a range of problems, that this simple technique can reduce network depth, often dramatically. However, we show that this technique does not, in general, reduce the total number of weights or improve the generalization ability of the resulting networks."
The Recurrent Cascade-Correlation architecture by Scott E Fahlman ( Book )
1 edition published in 1991 in English and held by 5 libraries worldwide
Abstract: "Recurrent Cascade-Correlation (RCC) is a recurrent version of the Cascade-Correlation learning architecture of Fahlman and Lebiere [Fahlman, 1990]. RCC can learn from examples to map a sequence of inputs into a desired sequence of outputs. New hidden units with recurrent connections are added to the network one at a time, as they are needed during training. In effect, the network builds up a finite-state machine tailored specifically for the current problem. RCC retains the advantages of Cascade-Correlation: fast learning, good generalization, automatic construction of a near-minimal multi-layered network, and the ability to learn complex behaviors through a sequence of simple lessons."
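The distinguishing feature of an RCC hidden unit is a single self-recurrent connection that carries state across time steps, which is what lets the network behave like a finite-state machine. The forward pass of one such unit can be sketched as follows; the weights here are hand-picked for illustration (they make the unit latch high once it sees an input of 1, like a one-bit state machine) and are not taken from the paper.

```python
import numpy as np

def rcc_unit_forward(xs, w_in, w_self, b):
    # Forward pass of one Recurrent Cascade-Correlation hidden unit:
    # a Cascade-Correlation unit plus one trainable self-recurrent
    # weight, so the unit's previous output feeds back into its input.
    h, outputs = 0.0, []
    for x in xs:
        h = np.tanh(np.dot(w_in, x) + w_self * h + b)
        outputs.append(h)
    return np.array(outputs)

# The unit stays near 0 on zero inputs, saturates high when it sees a 1,
# and the self-connection then holds it high on later zero inputs.
seq = [np.array([0.0]), np.array([1.0]), np.array([0.0]), np.array([0.0])]
out = rcc_unit_forward(seq, w_in=np.array([4.0]), w_self=4.0, b=0.0)
```

During training, RCC learns the input weights and the self-weight of each candidate unit together, then freezes both when the unit is installed, exactly as Cascade-Correlation freezes its feed-forward input weights.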
A Planning System for Robot Construction Tasks ( Book )
2 editions published in 1973 in Undetermined and English and held by 3 libraries worldwide
The paper describes a system which plans the construction of specified structures out of simple objects such as toy blocks. The planning is done using a 3-D model of the work space. A powerful control structure allows the use of such techniques as sub-assembly, temporary scaffolding, and counterweights in the construction. (Author).
NETL : a system for representing and using real-world knowledge by Scott E Fahlman ( Book )
1 edition published in 1982 in English and held by 3 libraries worldwide
NETL : a system for representing and using real-world knowledge ( Book )
1 edition published in 1985 in English and held by 3 libraries worldwide
Common LISP : the language by Guy L Steele ( Book )
1 edition published in 1990 in English and held by 3 libraries worldwide
The Hashnet Interconnection Scheme ( Book )
1 edition published in 1980 in Undetermined and held by 2 libraries worldwide
A System for Representing and Using Real-World Knowledge ( Book )
2 editions published between 1975 and 1977 in English and held by 1 library worldwide
This report describes a knowledge-base system in which the information is stored in a network of small parallel processing elements--node and link units--which are controlled by an external serial computer. Discussed is NETL, a language for storing real-world information in such a network. A simulator for the parallel network system has been implemented in MACLISP, and an experimental version of NETL is running on this simulator. A number of test-case results and simulated timings will be presented. (Author).
Artificial intelligence
Artificial intelligence--Data processing
COMMON LISP (Computer program language)
Computer architecture
Computer networks
Computer programming
Computer vision
Heuristic programming
Integrated circuits--Very large scale integration
Learning
Learning--Mathematical models
Machine learning
Multiprocessors
NETL (Computer system)
Neural circuitry
Neural networks (Computer science)
Parallel processing (Electronic computers)
Programming languages (Electronic computers)
Robots--Programming
Switching theory
Thought and thinking
Virtual computer systems