Index

Chaostician-- someone who makes use of Chaos theory.

Determinism-- a philosophy which asserts that actions have predictable, determined causes and effects; an identical action in an identical situation should produce identical effects. Hard determinism asserts that randomness does not exist: if we had enough knowledge about the nature of the forces and entities involved, anything in our world would be predictable.

Hebbian Rule-- a learning rule which asserts that synaptic efficacy increases with use; i.e., neurons that fire together, wire together; compare lateral inhibition; named after Donald Hebb, an early mathematical modeler of "wet" neural networks.
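
As an illustration, here is a minimal sketch of the Hebbian rule in Python; the function name, learning rate, and activation values are illustrative assumptions, not taken from the tutorial's programs:

    # Hebbian rule: the weight between two units grows in proportion
    # to the product of their activations ("fire together, wire together").
    def hebbian_update(weight, pre_activation, post_activation, eta=0.1):
        """Return the new weight after one Hebbian update."""
        return weight + eta * pre_activation * post_activation

    # Two units active together strengthen their connection...
    w = hebbian_update(0.5, pre_activation=1.0, post_activation=1.0)  # -> 0.6
    # ...while an inactive partner leaves the weight unchanged.
    w = hebbian_update(0.5, pre_activation=0.0, post_activation=1.0)  # -> 0.5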

hypertext-- text containing embedded links to other documents; the document format (e.g., HTML) used by "Net Surfers" all around the globe.

inhibition-- literally, suppression; in neurology, refers to an agent's ability to suppress the activation level of a neuron, as opposed to excitation, which refers to a neuron's ability to increase the activation of another neuron.
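
As a minimal sketch of the distinction in Python (all weight and input values here are made up for illustration):

    # A unit's net activation is a weighted sum of its inputs:
    # positive weights excite, negative weights inhibit.
    inputs = [1.0, 1.0]
    weights = [0.8, -0.6]   # the first connection excites, the second inhibits
    activation = sum(w * x for w, x in zip(weights, inputs))  # -> 0.2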

iteration-- refers to a process in which the output of one cycle becomes the input for the next (e.g., a situation in which the current position of a moving object is determined by its previous position); in mathematics, specifically refers to taking the output from an equation and using that as the next input.
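
Here is a minimal sketch of iteration in Python, using the logistic map f(x) = r*x*(1 - x) as an illustrative equation (the particular values of r and x0 are arbitrary):

    # Iteration: apply a function to its own output, over and over.
    def iterate(f, x0, n):
        """Apply f to its own output n times, collecting each result."""
        results = []
        x = x0
        for _ in range(n):
            x = f(x)
            results.append(x)
        return results

    # Ten iterations of the logistic map with r = 3.7.
    trajectory = iterate(lambda x: 3.7 * x * (1 - x), x0=0.5, n=10)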

lateral inhibition-- describes an arrangement in which active units suppress the activation of neighboring units, sharpening the contrast between strongly and weakly active units. In PDP, generally refers to competition within a pool of units: each unit's activity drives down the activation of the other units in the same pool, so that the most active units come to dominate.
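
A minimal sketch of this kind of within-pool competition in Python (the inhibition strength and activation values are illustrative assumptions):

    # Lateral inhibition: each unit's activation is reduced by a
    # fraction of its competitors' total activity.
    def laterally_inhibit(activations, inhibition=0.2):
        """Subtract a fraction of the competitors' activity from each unit."""
        total = sum(activations)
        return [max(0.0, a - inhibition * (total - a)) for a in activations]

    pool = [0.9, 0.5, 0.1]
    print(laterally_inhibit(pool))  # the strongest unit suppresses the rest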

learning rule-- In PDP, refers to any method whereby patterns of connectivity between processing units are modified by experience. One of the more popular learning rules is backpropagation.
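
Backpropagation itself is too long to sketch here, but the simple error-correcting rule it generalizes (the delta rule) fits in a few lines of Python; the function name, learning rate, and example values are illustrative assumptions:

    # Delta rule: nudge each weight to reduce the error on one example.
    def delta_update(weights, inputs, target, eta=0.1):
        """Return new weights after one error-correcting update."""
        output = sum(w * x for w, x in zip(weights, inputs))
        error = target - output
        return [w + eta * error * x for w, x in zip(weights, inputs)]

    w = delta_update([0.0, 0.0], inputs=[1.0, 1.0], target=1.0)  # -> [0.1, 0.1]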

linear-- literally, along a line; typically refers to data in which the underlying order is a direct relationship, with one variable changing in accordance with the equation for a straight line: y = bx + c (where b is the slope and c is a constant). Linear models are mathematical models which attempt to fit data to such a line; they deal only with relationships in which a change in one variable affects another variable in accordance with y = bx + c. Any situation in which this is not the case is beyond the scope of linear models.
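
As a minimal sketch, here is an ordinary least-squares fit of y = bx + c in Python (the data points are made up for illustration):

    # Fit a straight line y = b*x + c to some data.
    xs = [0.0, 1.0, 2.0, 3.0]
    ys = [1.1, 2.9, 5.2, 6.8]   # roughly y = 2x + 1

    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    c = mean_y - b * mean_x     # slope b comes out near 2, intercept c near 1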

neural networks-- In math and computer science, describes a mathematical construct intended to model the functioning of a large interconnected "network" of "neurons." Neural networks make use of Parallel Distributed Processing, and as such, have the same general framework.

Neurology-- Literally, the study of neurons. In practice, tends to refer to anything involving the nervous system.

nodes-- In PDP, refers to clusters of interconnected neurons; also called pools.

non-linear-- literally, anything not in a straight line; typically refers to data which appears random, but which is suspected to have an underlying order that is not strictly linear. Non-linear models are any models which make use of functions that are not merely straight lines. Generally speaking, these tend to be deterministic models which make use of some combination of iteration and/or non-linear functions (like a Green's function, S-curve, sinusoidal function, etc.).

output function-- In PDP models, refers to the function used to determine the actual output of a neuron or unit, given a particular level of activation. In other words: does the neuron "fire" only above a particular level of activation, or is its response graded along some kind of curve; and if graded, is that curve an S-curve, a Green's function, some other non-linear function, or merely a straight line?
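
Here is a minimal sketch of two common output functions in Python, a hard threshold and a graded S-curve (the logistic sigmoid); the threshold value is an illustrative assumption:

    import math

    def threshold_output(activation, threshold=0.5):
        """The unit 'fires' (outputs 1) only above the threshold."""
        return 1.0 if activation > threshold else 0.0

    def sigmoid_output(activation):
        """The unit's output is graded along an S-curve between 0 and 1."""
        return 1.0 / (1.0 + math.exp(-activation))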

Parallel Distributed Processing-- term coined by Rumelhart and McClelland (1986) to denote the entire set of models which emphasize brain function as a parallel process rather than a sequential one. Specific information on the general framework of PDP is available.

synaptic efficacy-- refers to the strength of the connection between neurons; can best be expressed in probabilistic form, correlating the firing of neuron A with the likelihood of neuron B firing. High synaptic efficacy denotes a high likelihood that the firing of one neuron will cause the second to fire. Likewise, low synaptic efficacy indicates that the firings of the two neurons are virtually uncorrelated.

"wet" modeling-- kind of mathematical modeling which insists on using strict biological findings and parameters when doing models. In essence, taking a known biological entity, describing it in mathematical terms, and then exploring the implications of the model which results. By virtue of computing speed and the boundries of biological knowledge, these models tend toward the lower end of the systems levels (aka: at the single neuron or otherwise very simple, well known networks); eg. describing the results seen from single unit recordings of an inhibitory neuron, or modeling of the centipede ambulatory network, etc.

XOR-- In logic, refers to a two-part expression which evaluates to "true" only if one or the other of the parts is true, but not when both are true. For example, "Yesterday, I was awake at eight, or I wasn't" (both can't be true). The XOR expression has a particular significance in the history of neural networks because Minsky and Papert (1969) demonstrated that networks with a single layer could never solve the XOR problem. This "proof" delayed further research on all types of neural networks for many years afterward. It wasn't until the early eighties that anyone realized that just because networks with one layer couldn't solve the XOR problem didn't mean that networks with more than one layer couldn't solve it. In fact, the Dartnet program included with this tutorial has a two-layered solution to the XOR problem.
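
To see that a two-layered network can indeed solve XOR, here is a minimal sketch in Python with hand-picked weights; these particular weights are illustrative and are not the solution Dartnet arrives at:

    # Each unit uses a simple threshold output function.
    def step(x):
        return 1 if x > 0 else 0

    def xor_net(a, b):
        h1 = step(a + b - 0.5)        # hidden unit 1 computes a OR b
        h2 = step(a + b - 1.5)        # hidden unit 2 computes a AND b
        return step(h1 - h2 - 0.5)    # output: OR but not AND, i.e., XOR

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, xor_net(a, b))   # prints 0, 1, 1, 0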


Works Cited

Allman, W. F. (1989). Apprentices of Wonder: Inside the Neural Network Revolution. New York: Bantam Books.

Hinton, G. E. (1984). Parallel computations for controlling an arm. Journal of Motor Behavior, 16, 171-194.

McClelland, J. L., & Rumelhart, D. E. (1981). An interactive activation model of context effects in letter perception: Part 1. An account of basic findings. Psychological Review, 88, 375-407.

McClelland, J. L., & Rumelhart, D. E. (1989). Explorations in Parallel Distributed Processing: A Handbook of Models, Programs, and Exercises. London: The MIT Press.

Minsky, M. & Papert, S. (1969). Perceptrons. Cambridge, MA: MIT Press.

Rumelhart, D. E., McClelland, J. L., & The PDP Research Group (1986). Parallel Distributed Processing: Explorations in the Microstructure of Cognition. London: The MIT Press.

Rumelhart, D. E., & Norman, D. A. (1982). Simulating a skilled typist: A study of skilled cognitive-motor performance. Cognitive Science, 6, 1-36.

Schank, R. C. (1980). Language and memory. Cognitive Science, 4, 243-284.

Stewart, I. (1989). Does God Play Dice? The Mathematics of Chaos. Cambridge: Basil Blackwell.

Weisberg, R. W. (1993). Creativity: Beyond the myth of genius. New York: W. H. Freeman and Company.


Back to Table of Contents.
Last Modified: Sept 11, 1995.

George Hollich
Temple University
ghollich@astro.ocis.temple.edu