Neuropsychological Data

Agnosia
• Term coined by Freud
• From the Greek word for "lack of knowledge"
• The inability to recognize objects when using a given sense (e.g., vision), even though that sense is basically intact (Nolte, 1999)

Agnosia
• Usually involves damage to the occipito-parietal pathway

Patient GS
• Sensory abilities intact
• Language normal
• Unable to name objects

Agnosia
• Apperceptive
  – Object recognition failure due to a deficit in perceptual processing
  – Difficulty recognizing pictures w/deleted segments
  – Unable to utilize top-down information for pattern recognition
• Associative
  – Perceptual processing intact, but subject cannot use the information to recognize objects
  – Can draw objects but not say what they are
  – Language otherwise intact
  – Often don't know other things about the object (how it's used, etc.)

Are Faces Special?
• Specific inability to recognize faces
• Are faces and other objects in the world represented in fundamentally different ways in the brain?
• Does face memory depend on fundamentally different systems?

Are Faces Special?
• Subjects presented with a face and asked to recognize a face part
• Subjects presented with a house and asked to recognize a house part

Are Faces Special?
• Houses: similar performance for parts & wholes
• Faces: whole-object advantage

Are Faces Special?
• Objects represented in parts and holistically
• Faces represented holistically

Models of Pattern Recognition
• Template Models
• Feature Models
• Prototype Models
• Neural Network Models

Word Superiority Effect
• A letter is identified more accurately in the context of a word (the K in WORK) than alone
[Demo: stimulus "WORK" or "K"; forced-choice response: C/K]

IAC Model

Connectionist (PDP) Models
• Information evaluated in parallel and distributed throughout the network
• Neurally inspired
  – Neural Networks
• Motivated by problems w/classical architecture

Neural Network Models
• Nodes
• Connections
• Activation Rules
• Activation
• Output function
• Learning rule

Problems w/Classical Architecture
• Slow
• Brittle
• Inflexible

Slow
• 100-step Constraint
  – Active neurons spike approximately once every 1–3 ms
  – People can perform cognitive tasks in 100–300 ms
  – Cognitive tasks must therefore be performed w/~100 serial operations
  – Or, cognitive tasks are performed in parallel

Brittle
• 1 damaged symbol halts computation
• 1 missing line of code could be fatal
• In contrast, effects of brain damage (BD) are local
  – BD often hard to notice

Inflexible
• Digital
• Hard Constraints
• No Soft Constraints

Connectionism, NNs
• Hardware Matters
• Neurally-inspired cognitive modeling

Perceptron
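The perceptron named above can be illustrated with a minimal sketch. This is my own toy example (the learning rate, the AND task, and all names are illustrative, not from the lecture): a single unit with a threshold activation, trained with the classic perceptron rule w += lr * (target - output) * x.

```python
# Minimal perceptron sketch (illustrative, not from the slides).

def perceptron_train(samples, lr=0.1, epochs=20):
    """samples: list of (input_vector, target) pairs with 0/1 targets."""
    n = len(samples[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, target in samples:
            # Threshold activation: fire (1) if the weighted sum exceeds 0.
            out = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = target - out
            # Perceptron learning rule: nudge weights toward the target.
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Learn logical AND, a linearly separable pattern.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = perceptron_train(data)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
         for x, _ in data]
print(preds)  # → [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees the rule settles on a correct weight vector.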

Biological Nets vs. Connectionist Nets
• Biological:
  – 10^10–10^11 neurons
  – ~10^5 interconnections per neuron
  – Neuron is the basic processing unit
  – Excitatory & Inhibitory
  – Learning done by modifying synapses
• Connectionist:
  – Highly interconnected
  – Excitatory & Inhibitory
  – Learning involves changing strength of connections

Basics
• Set of processing units
• Input function
• State of activation
• Output function for each unit
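The components in the Basics list map directly onto code. A minimal sketch of one processing unit, where the decay parameter and the logistic output function are my own illustrative choices:

```python
# One connectionist unit, using the slide's vocabulary (parameter values
# are illustrative assumptions, not from the lecture).
import math

def net_input(weights, inputs):
    # Input function: weighted sum of incoming activations.
    return sum(w * a for w, a in zip(weights, inputs))

def activation(state, net, decay=0.1):
    # Activation rule: new state = decayed old state + net input.
    return (1 - decay) * state + net

def output(state):
    # Output function: squash the state into (0, 1) with a logistic.
    return 1 / (1 + math.exp(-state))

state = 0.0
for _ in range(3):                               # three update cycles
    net = net_input([0.5, -0.3], [1.0, 1.0])     # two incoming connections
    state = activation(state, net)
print(round(output(state), 3))  # → 0.632
```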

More Basics
• Pattern of Connectivity
• Propagation Rule
• Activation Rule
• Learning Rule
  – Hebb Rule
  – Widrow-Hoff Rule
  – Generalized Delta Rule (back-prop)

The Environment
• Specifying the environment specifies the sorts of problems a particular network addresses
  – Specify Inputs
  – Specify Outputs
• Networks used to model many things
  – Stock Market Predictions
  – Diagnose Heart Attacks
  – Model Cognition
    • Pattern Recognition, Categorization, Memory, Learning
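The first two learning rules listed above can be sketched in a few lines, assuming their simple rate-based forms (Hebb: Δw = η · pre · post; Widrow-Hoff: Δw = η · (target − output) · x). All parameter values here are my own illustrative choices:

```python
# Hebb rule sketch: weights grow when pre- and post-synaptic units
# are active together (simple rate-based form assumed).
def hebb_update(w, pre, post, lr=0.5):
    return [wi + lr * p * post for wi, p in zip(w, pre)]

# Widrow-Hoff (delta) rule sketch: move weights to reduce output error.
def widrow_hoff_update(w, x, target, lr=0.1):
    y = sum(wi * xi for wi, xi in zip(w, x))
    return [wi + lr * (target - y) * xi for wi, xi in zip(w, x)]

w = [0.0, 0.0]
# Repeatedly pair input pattern [1, 0] with an active output unit (post = 1).
for _ in range(4):
    w = hebb_update(w, pre=[1, 0], post=1)
print(w)  # only the co-active connection strengthens: [2.0, 0.0]
```

Note the key difference: Hebbian learning is unsupervised correlation, while Widrow-Hoff is error-driven, which is what the generalized delta rule (back-prop) extends to multi-layer nets.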

Appeal of Neural Nets
• Parallel
  – Not subject to 100-step constraint
• Distributed Representations
  – Less Brittle
• Graded Rules
  – More Flexible

Buzzwords
• Spontaneous Generalization
• Graceful Degradation
• Mutual Constraint Satisfaction
• Capacity for Learning and Self-Organization
• Biologically Plausible (?)

Example
• 3-layer
• Input Layer to Hidden Layer, fully interconnected
• Hidden Layer to Output Layer, fully interconnected
• Train with BackProp

Possibilities
• Input
  – Home Team: Quarterback Rating, Rushing Yards, Receiving Yards, Field Goals Made, Field Goals Attempted
  – Away Team
• Output
  – Home Team: Points
  – Away Team: Points
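The 3-layer network above can be sketched end to end. This is a minimal pure-Python version of the architecture the slide describes; the "game" statistics and point totals are invented placeholder numbers, not real data:

```python
# 3-layer net (input -> hidden -> output), trained with backprop.
# All data values are made up for illustration.
import math
import random

random.seed(0)

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

class ThreeLayerNet:
    def __init__(self, n_in, n_hid, n_out, lr=0.5):
        rnd = lambda: random.uniform(-0.5, 0.5)
        self.w1 = [[rnd() for _ in range(n_in)] for _ in range(n_hid)]
        self.w2 = [[rnd() for _ in range(n_hid)] for _ in range(n_out)]
        self.lr = lr

    def forward(self, x):
        self.x = x
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in self.w1]
        self.y = [sigmoid(sum(w * hi for w, hi in zip(row, self.h))) for row in self.w2]
        return self.y

    def backward(self, target):
        # Output deltas (sigmoid derivative is y * (1 - y)).
        d_out = [(t - y) * y * (1 - y) for t, y in zip(target, self.y)]
        # Hidden deltas: error propagated back through w2.
        d_hid = [h * (1 - h) * sum(d * self.w2[k][j] for k, d in enumerate(d_out))
                 for j, h in enumerate(self.h)]
        for k, d in enumerate(d_out):
            for j in range(len(self.h)):
                self.w2[k][j] += self.lr * d * self.h[j]
        for j, d in enumerate(d_hid):
            for i in range(len(self.x)):
                self.w1[j][i] += self.lr * d * self.x[i]

# Toy "games": normalized stats in, normalized point totals out (invented).
games = [([0.9, 0.7, 0.3], [0.8, 0.2]),
         ([0.2, 0.4, 0.8], [0.3, 0.7])]

net = ThreeLayerNet(3, 4, 2)

def loss():
    return sum((t - y) ** 2 for x, ts in games
               for t, y in zip(ts, net.forward(x)))

before = loss()
for _ in range(500):
    for x, t in games:
        net.forward(x)
        net.backward(t)
print(loss() < before)  # training reduces the squared error
```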

Features
• Incorporates information about context
• Doesn't assume variables are independent
• Captures higher-order statistical regularities

Tri-Level Hypothesis
• Not symbolic
  – Yes, sub-symbolic
• Yes, information processing
• "…dissimilarities do not imply that brains are not computers, but only that brains are not serial, digital computers." – Churchland, Koch, Sejnowski

Computational Level
• What problem is the network solving?
• Use of formal methods to
  – Determine limits of networks
  – Determine limits of learning rules

Algorithmic Level
• How do they do those crazy things they do?
• Neural networks blur the structure/process distinction
• Architecture/Program

Implementation Level
• Biological Plausibility
  – Only capture computationally relevant properties (cognitive modelling)
  – Closely model particular brain systems (computational neuroscience)

IAC Model

Words versus Letters

Word Superiority Effect
• WORD and WORK both begin to get activated because of WOR
• Activations feed back and activate the K and D nodes
• But only K gets bottom-up activation
• D gets top-down activation, which gets inhibited
• Letters presented alone don't get top-down activation from the word level!
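The feedback story above can be sketched as a toy interactive-activation simulation. The parameters and update rule here are my own drastic simplification; the actual IAC model of McClelland & Rumelhart has feature-level units, resting levels, and decay that this sketch omits:

```python
# Toy interactive-activation sketch of the word superiority effect
# (simplified; all rates and values are illustrative assumptions).

def simulate(context_wor, steps=10, rate=0.2):
    # Bottom-up evidence: a degraded final letter weakly supports K only.
    bottom_up = {"K": 0.3, "D": 0.0}
    letters = {"K": 0.0, "D": 0.0}
    words = {"WORD": 0.0, "WORK": 0.0}
    for _ in range(steps):
        if context_wor:
            # Word units get support from the WOR context plus their own
            # final letter; WORK pulls ahead as K's activation grows.
            words["WORD"] += rate * (1.0 + letters["D"])
            words["WORK"] += rate * (1.0 + letters["K"])
        # Top-down feedback from the word level; the competing word inhibits.
        letters["K"] += rate * (bottom_up["K"] + words["WORK"] - words["WORD"])
        letters["D"] += rate * (bottom_up["D"] + words["WORD"] - words["WORK"])
    return letters

alone = simulate(context_wor=False)   # letter presented by itself
in_word = simulate(context_wor=True)  # letter presented after WOR
print(in_word["K"] > alone["K"])      # word context boosts K
```

With no word context, K gets only its bottom-up support; with WOR present, WORK's top-down feedback amplifies K while WORD's support for D is cancelled, mirroring the slide's account.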

IAC Model Evaluation
• Illustrates important principles
• Doesn't hold up to 21st-century standards
  – Scope too narrow
  – No learning mechanism
  – Too many feature units

Template Theories of Pattern Recognition
• Template theories: a miniature copy or template of each known pattern is stored in long-term memory
  – straightforward template theory
  – normalized template theory
• Problems
  – not adaptable
  – impose large storage requirements
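A straightforward template theory is easy to sketch, which also makes its problems concrete. The 3×3 "letter" bitmaps below are invented for illustration:

```python
# Straightforward template matching: store one bitmap per known pattern
# and pick the best pixel-wise match (toy 3x3 "letters", invented).
TEMPLATES = {
    "L": ["X..",
          "X..",
          "XXX"],
    "T": ["XXX",
          ".X.",
          ".X."],
}

def match_score(pattern, template):
    # Count matching cells; a strict template tolerates no shift or scaling.
    return sum(p == t for prow, trow in zip(pattern, template)
               for p, t in zip(prow, trow))

def recognize(pattern):
    return max(TEMPLATES, key=lambda name: match_score(pattern, TEMPLATES[name]))

print(recognize(["X..", "X..", "XXX"]))  # → L
```

Shift or rescale the input and this matcher fails, which is what motivates the normalized template variant, and storing a template per known pattern per normalization is where the large storage requirement comes from.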

Feature Theories of Pattern Recognition
• Feature theories: patterns consist of a set of specific features or attributes
• Advantages:
  – elementary features can combine to form multiple objects
  – features can be used to describe novel objects
• Problems:
  – recognition can take place even when features are occluded

Pattern Recognition
• Pattern Recognition requires Feature Analysis
  – Break down pattern into primitives
  – Recognize combination of features
• Integration of bottom-up and top-down processing
  – context effects in recognition
