75% Papers (25% participation, 25% outlines/questions, 25% leading discussion)
25% Cognitive modeling project including presentation
Students may miss up to three classes without penalty (the three lowest participation/outline/questions grades will be dropped).
2 credits: everything except the modeling project
3 credits (the default): everything including the modeling project
4 credits: everything including a much more elaborate modeling project (to be arranged with the instructor)
The best way to learn about modeling is by doing it, so I strongly encourage students to try the project.
Because this seminar requires permission of the instructor, you need an override form to register. My understanding is that psychology graduate students can fill these out themselves, but that others will need my signature. I'll bring a stack of overrides with me to class. The overrides need to be turned in at the undergraduate psychology office (East Hall 1044).
2. Leading discussion. Discussion leadership will rotate among members of the class. Leaders should come prepared with points to be made and issues to be raised about the assigned readings as well as questions designed to draw those points/issues out of the class.
3. Cognitive modeling project. Students will work in groups of two or three students on a small computational modeling project. Every attempt will be made to put students with little computational experience in a group with more experienced students and to put together students with similar interests. A number of example projects are listed below. Feel free to choose one of the example projects or to come up with a different project that more closely matches the interests of the group. Unless you have previous experience with computational modeling, you should either choose one of the projects below or propose to re-implement a computational model that already exists and for which you have a detailed description (or perhaps a minor variant of such a model). Keep in mind that symbolic architectures such as Soar, ACT*, and EPIC are quite difficult to learn, so unless your group is very interested in that architecture and is willing to work hard to learn it, you should plan on using Nexus (a user-friendly PDP system) or a conventional programming language such as Lisp or C.
For groups with little experience in their chosen architecture or language, I recommend developing the project incrementally throughout the semester in the following small parts (with a suggested timeline). Individual groups can decide how to divide the work among the group members (e.g., letting the most computationally experienced group member work on 3.5-3.7 while the less experienced members write up the proposal and prepare the presentation). Students are allowed and encouraged to consult with the instructor and local experts about their projects throughout the term.
3.1. Initial Proposal. Should describe the proposed model and the task it
addresses, specify the architecture/language to be used for implementation,
and specify how the project will be broken down into manageable parts
(3.3-3.7). (~Feb 3)
3.2. Proposal Revisions. A description of revisions that address issues raised
with the initial proposal (if any). (~Feb 10)
3.3. Printout of output from existing model in chosen architecture/language
(~Feb 19, recommended for those with little experience, optional for others)
3.4. Printout of output from "Hello, world" style system in
architecture/language (~Feb 26, recommended for those with little experience, optional for others)
3.5. Printout of output from part 1 of model (~March 17)
3.6. Printout of output from parts 1 & 2 of model (~March 31)
3.7. Printout of output from complete model (~April 14)
3.8. Presentation. A 20-30 minute in-class oral presentation describing the
model, its major strengths and weaknesses, and anything else you learned about
the model/underlying theory and/or architecture. (April 16, 21)
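Step 3.4's "Hello, world" milestone just verifies that you can build and run something, however small, in your chosen architecture/language before real modeling begins. In a conventional programming language (Python is used here purely for illustration; your group may instead be using Nexus, Lisp, or C), it can be as simple as:

```python
def hello():
    # The smallest possible "system": proves the toolchain runs
    # and produces output you can print out and hand in.
    return "Hello, world"

print(hello())
```

The point of turning in this printout is to catch environment and setup problems early, while there is still time to switch architectures.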
Use ACT-R or Soar to construct a model that solves some well-defined toy problem like multi-column subtraction, the Tower of Hanoi, monkeys and bananas, blocks world, etc.
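For a sense of the scale involved: the Tower of Hanoi is small enough that a plain recursive solver fits in a few lines. An ACT-R or Soar model would instead decompose the same task into condition-action production rules, which is precisely what makes it an instructive first modeling exercise. A minimal sketch in Python (illustrative only, not a cognitive model):

```python
def hanoi(n, source, target, spare):
    """Return the list of (from_peg, to_peg) moves solving an n-disk puzzle."""
    if n == 0:
        return []
    # Move n-1 disks out of the way, move the largest disk, then restack.
    return (hanoi(n - 1, source, spare, target)
            + [(source, target)]
            + hanoi(n - 1, spare, target, source))

moves = hanoi(3, "A", "C", "B")
print(len(moves))  # a 3-disk puzzle takes 2**3 - 1 = 7 moves
```

A production-system version of your group's model would replace the recursion with rules matching the current goal and tower state, which is where the architectural commitments of ACT-R or Soar become visible.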
2. Jan 13: Computing in Cognitive Science
Pylyshyn, Z. (1989), "Computing in cognitive science", in Posner, M.
(Ed.), Foundations of cognitive science, MIT Press: Cambridge, MA,
pp. 51-91.
Rumelhart, D. (1989), "The architecture of mind: A connectionist approach", in Posner, M. (Ed.), Foundations of cognitive science, MIT Press: Cambridge, MA, pp. 133-159.
(Jan 20: Martin Luther King Day, no class)
4. Jan 22: Classifier systems and genetic algorithms
Holland, J. (1995), Hidden order: How adaptation builds complexity,
Addison-Wesley: Reading, MA, chapter 1.
Holland, J., Holyoak, K., Nisbett, R., and Thagard, P. (1987), Induction: Processes of inference, learning, and discovery, MIT Press: Cambridge, pp. 102-126.
5. Jan 27: Soar and Unified Theories of Cognition
Newell, A. (1992), "Unified theories of cognition and the role of
Soar", in Michon, J. and Akyurek, A. (Eds.), Soar: A cognitive
architecture in perspective, Kluwer Academic Publishers: Netherlands,
pp. 25-79.
6. Jan 29: Soar and Unified Theories of Cognition
Cooper, R. and Shallice, T. (1995), "Soar and the case for unified
theories of cognition", Cognition, 55(2):115-149.
7. Feb 3: EPIC
Meyer, D. & Kieras, D. (in press), "A computational theory of
executive cognitive processes and multiple-task performance: Part 1.
Basic mechanisms," Psychological Review, pp. 1-2, 19-41,
63-77, 86-88.
8. Feb 5: ACT-R and rational analysis
Anderson, J. (1996), "ACT: A simple theory of complex cognition",
American Psychologist, 51(4):355-365.
Anderson, J. (1991), "The place of cognitive architecture in a rational analysis", in VanLehn, K. (Ed.), Architectures for intelligence, LEA: Hillsdale, NJ, pp. 1-24.
9. Feb 10: Symbolic vs. subsymbolic approaches
Fodor, J. and Pylyshyn, Z. (1988), "Connectionism and cognitive
architecture: A critical analysis", Cognition, 28(1-2): 3-71.
10. Feb 12: Alternative views
Simon, H. (1991), "Cognitive architecture and rational analysis:
Comment", in VanLehn, K. (Ed.), Architectures for intelligence, LEA:
Hillsdale, pp. 25-41.
Chalmers, D. (1993), "Connectionism and compositionality: Why Fodor and Pylyshyn were wrong", Philosophical Psychology, 6(3):305-319.
12. Feb 19: Case-based reasoning
Seifert, C. (1994), "Case-based learning: Predictive features in
indexing," Machine Learning, 16:37-56.
13. Feb 24: Deductive reasoning
Polk, T. and Newell, A. (1995), "Deduction as verbal reasoning",
Psychological Review, 102(3):533-566.
14. Feb 26: Skill acquisition
Anderson, J. (1987), "Skill acquisition: Compilation of weak-method
problem solutions," Psychological Review, 94(2):192-210.
(Mar 3: Spring break, no class)
(Mar 5: Spring break, no class)
15. Mar 10: Language acquisition
Rumelhart, D. and McClelland, J. (1986), "On learning the past tenses
of English verbs," in McClelland, J. and Rumelhart, D. (Eds.),
Parallel Distributed Processing: Explorations in the Microstructure
of Cognition. Volume 2: Psychological and Biological Models,
pp. 216-271.
Prince, A. and Pinker, S. (1988), "Rules and connections in human language," Trends in Neurosciences, 11(5):195-202.
16. Mar 12: Semantic memory
Farah, M. and McClelland, J. (1991), "A computational model of semantic
memory impairment: Modality specificity and emergent category
specificity," Journal of Experimental Psychology: General,
120(4):339-357.
17. Mar 17: Categorization
Kruschke, J. (1992), "ALCOVE: An exemplar-based connectionist model of
category learning," Psychological Review, 99(1):22-44.
18. Mar 19: Attention
Cohen, J., Dunbar, K., and McClelland, J. (1990), "On the control of
automatic processes: A parallel distributed processing account of
the Stroop effect," Psychological Review, 97(3):332-361.
Cohen, J., Romero, R., Servan-Schreiber, D., and Farah, M. (1994), "Mechanisms of spatial attention: The relation of macrostructure to microstructure in parietal neglect," Journal of Cognitive Neuroscience, 6(4):377-387.
(Mar 24: Away at Cognitive Neuroscience conference, no class)
19. Mar 26: Working Memory
Zipser, D. (1991), "Recurrent network model of the neural mechanism of
short-term active memory," Neural Computation, 3:179-193.
Burgess, N. and Hitch, G. (1992), "Toward a network model of the articulatory loop," Journal of Memory and Language, 31(4):429-460.
20. Mar 31: Explicit learning & memory
McClelland, J., McNaughton, B. and O'Reilly, R. (1995), "Why there
are complementary learning systems in the hippocampus and neocortex:
Insights from the successes and failures of connectionist models of
learning and memory," Psychological Review, 102(3):419-457.
21. Apr 2: Vision
Hildreth, E. and Ullman, S. (1989), "The computational study of
vision," in Posner, M. (Ed.), Foundations of Cognitive Science, MIT
Press: Cambridge, MA, pp. 581-630.
22. Apr 7: Visual word recognition
Plaut, D., McClelland, J., Seidenberg, M. and Patterson, K. (1996),
"Understanding normal and impaired word reading: Computational
principles in quasi-regular domains," Psychological Review,
103(1):56-115.
23. Apr 9: Neural organization
Zhang, J. (1991), "Dynamics and formation of self-organizing maps,"
Neural Computation, 3(1):54-66.
Polk, T. and Farah, M. (1995), "Brain localization for arbitrary stimulus categories: A simple account based on Hebbian learning," Proceedings of the National Academy of Sciences, USA, 92:12370-12373.
Miller, K., Keller, J., and Stryker, M. (1989), "Ocular dominance column development: Analysis and simulation," Science, 245:605-615.
25. Apr 16: Presentations
26. Apr 21: Presentations