6.863J/9.611J Natural Language Processing: Fall 2012
 
 
Course home
 

Staff
Prof. Robert C. Berwick
berwick@csail.mit.edu
32-D728, x3-8918
Office hours: Weds. 1-3pm

Course Support
Lynne Dell
lynne@mit.edu
32-D724, 617-324-1543
TA: Geza Kovacs
gkovacs@mit.edu
Office hrs: Thu, 3:30-4:30pm
32G, 7th floor lounge

Course Time & Place
Lectures: M, W 11 AM-12:30 PM
Room: 32-124,  map

Level & Prerequisites
Undergrad/Graduate; 6.034 or permission of instructor

Policies
Textbooks & readings
Grading marks guide
Style guide

Course Description

A laboratory-oriented course in the theory and practice of building computer systems for human language processing, with an emphasis on how human knowledge of language can be integrated into natural language processing.


This subject qualifies as an Artificial Intelligence and Applications concentration subject, Grad H level credit.
This term it also qualifies as a course 6 AUS subject.

Textbooks required for purchase or reference:
Jurafsky, D. and Martin, J.H. (JM), Speech and Language Processing, 2nd edition, Prentice-Hall, 2008 (on library reserve, Barker P98.J87 2009).
Some of the chapters from the revised edition may be posted in pdf form, as per the schedule shown on the homepage.
Additional textbook readings available FREE online: Manning & Schütze (MS), Foundations of Statistical Natural Language Processing, from MIT CogNet here.

Announcements:
• Week 9: Project topics and sources described here, and listed here.
• Week 7: Assignment 5 posted here.
• Week 6: Assignment 4 posted here.
• CGW handout now posted here.
• Lecture 8 notes posted here.
• Week 3-4: We will NOT be assigning a Competitive Grammar Writing (CGW) 'checkpoint' exercise, but you should read the handout when it's posted and make sure you know how to run the programs on Athena.
• Week 3: Lecture notes for lectures 6 & 7 here.
• Week 3: Lecture notes for lectures 2 & 3 here; notes for Lecture 4 here. Notes for lecture 5 here.
(but please see revisions in lecture 6-7 notes)
• Week 2: Assignment 2, context-free grammars, available here; zip files for assignment here; LaTeX answer template zip file here

• Week 1: Reading & response 1, available here, turn in ONLY after Monday in-class discussion.

• Week 1: Fun NLP link of the week: Postmodernist paper generator. Try 'writing' a new paper by following this link.
• Week 1: And then, if you think the 'hard' sciences are immune, you can follow this link.


[Calendar, September-December 2012: class days were marked in blue, holidays in green, and registration add/drop dates in orange on the original page.]
Course schedule at a glance
Each entry below lists: Date | Topic | Slides & Reference Readings | Laboratory/Assignments

9/5
Weds

Introduction: walking the walk, talking the talk
Lecture 1 pdf slides; Jurafsky & Martin (JM), ch. 1.
If you don't know Python, read the NLTK book, chs. 1-3; otherwise, skim chs. 2-3.
Background Reading (for RR 1): Jurafsky & Martin ch. 4 on ngrams (pp. 83-94, 114-116)
Background Reading (for RR 1): Abney on statistics and language.
Background Reading (for RR 1): Chomsky, Extract on grammaticality, 1955.
(Optional) Idealization in science, by Cartwright
Background chapters on NLP from Russell & Norvig, ch. 22.
Assignment 1, Reading & response (RR #1) OUT
(Ngrams; NLTK Python warmup)
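
For students new to NLTK, here is a minimal sketch of the kind of ngram counting the Python warmup involves (the corpus and the probability queried are illustrative, not the assignment's):

    # Count bigrams in the Brown corpus and estimate an MLE bigram probability.
    # (Illustrative sketch only; not the assignment itself.)
    import nltk
    from collections import Counter
    from nltk.corpus import brown
    from nltk.util import bigrams

    nltk.download('brown', quiet=True)  # fetch the corpus if not already present

    words = [w.lower() for w in brown.words(categories='news')]
    unigram_counts = Counter(words)
    bigram_counts = Counter(bigrams(words))

    # Maximum-likelihood estimate: P(w2 | w1) = C(w1 w2) / C(w1)
    print(bigram_counts[('of', 'the')] / unigram_counts['of'])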

9/10
Mon
RR1 discussion; Language models & ngrams
• Lecture 2: see lecture notes here.
• Manning & Schütze (MS), ch. 2 (Review of probability); JM ch. 4 (ngrams)

Reading & response #1
TURN IN AFTER DISCUSSION MON
Assignment 2, Context-free grammars, OUT; zip file here

9/12
Weds
Language models:
part of speech tagging

• Lecture 3: see lecture notes here
• JM, ch. 5 (sec. 5.5); MS, ch. 10, sec. 10.1
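
For orientation, a minimal sketch of off-the-shelf POS tagging in NLTK (the pretrained tagger used here is an assumption; the lecture covers the underlying sequence models):

    # Tag a sentence with NLTK's default pretrained tagger.
    import nltk
    nltk.download('punkt', quiet=True)                       # tokenizer model
    nltk.download('averaged_perceptron_tagger', quiet=True)  # tagger model

    tokens = nltk.word_tokenize('Time flies like an arrow')
    print(nltk.pos_tag(tokens))   # list of (word, Penn Treebank tag) pairs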



9/17
Mon

The Chomsky hierarchy

Lecture 4: see lecture notes here.



9/19
Weds
Grammars & Context-free parsing I
Lecture 5 notes
JM, ch. 13 (parsing), pp. 427-435
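
As a concrete anchor for the parsing reading, a toy context-free grammar and chart parse in NLTK (the grammar is illustrative, not the one from Assignment 2):

    # Parse a sentence with a toy CFG using NLTK's chart parser.
    import nltk

    grammar = nltk.CFG.fromstring('''
    S -> NP VP
    NP -> Det N
    VP -> V NP
    Det -> 'the'
    N -> 'dog' | 'cat'
    V -> 'chased'
    ''')

    parser = nltk.ChartParser(grammar)
    for tree in parser.parse('the dog chased the cat'.split()):
        print(tree)   # prints the single parse tree for this sentence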



9/24
Mon
Context-free parsing:
statistical CF parsing
Lecture 6 pdf slides; pdf 2-up
• JM, ch. 14, pp. 459-467; MS ch. 12.
 
9/26
Weds
Competitive grammar writing (CGW) intro
Lecture 7 pdf slides; Lecture 7 slides 2-up; Competitive Grammar Writing slides, pdf; pdf 4-up;
Lecture 7 notes.
Read CGW handout
Assignment 2, Context-free grammars, DUE FRI
Assignment 3, Competitive Grammar Writing handout
10/1
Mon
Competitive grammar writing I
Bring notebook computer to class (at least 1 per team)


Competitive Grammar Writing


10/3
Weds
Competitive grammar writing II
Bring notebook computer to class (at least 1 per team)


Competitive Grammar Writing: Grammars FROZEN
10/10
Weds
Competitive Grammar Evaluation & Wrap-up, Grammy Awards
Grammy Awards & Discussion of results; can you beat a computer?


10/15
Mon
Language models, Treebank Parsers I
Lecture 8 notes; Lecture 8 pdf slides; pdf 2-up

 
10/17
Weds
EM method


10/22
Mon
Statistical Parsing I
 
10/24
Weds
Statistical Parsing II

10/29
Mon
Windy City
• MIT off: Hurricane Sandy


Assignment 5, Statistical Parsers & Treebanks, OUT

10/31
Weds
Treebank parsers
Lecture 12 pdf slides; pdf 2-up
Assignment 4, Language Models, DUE
11/5
Mon
Psycholinguistic Interlude I
Lecture 13 pdf slides; pdf 2-up
• JM ch. 18



11/7
Weds
Semantics: The lambda calculus view
Lecture 14 pdf slides; pdf 2-up

Project proposal checkpoint: paragraph on team & final project proposal
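
A minimal sketch of lambda-calculus-style composition, using Python lambdas over a tiny hand-built model (the model and denotations are illustrative):

    # Compositional semantics with Python lambdas over a two-individual model.
    likes = {('john', 'mary'), ('mary', 'john')}   # who likes whom

    # Transitive verb denotation: \y. \x. likes(x, y)
    likes_sem = lambda y: lambda x: (x, y) in likes

    # Build the VP by applying the verb to its object, then apply to the subject.
    vp = likes_sem('mary')   # \x. likes(x, 'mary')
    print(vp('john'))        # True:  "John likes Mary" holds in the model
    print(vp('mary'))        # False: "Mary likes Mary" does not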

11/14
Weds
Semantics & SQL; QA system
Lecture 15 pdf slides; pdf 2-up;
• JM ch. 19


Assignment 5, Statistical Parsers & Treebanks, DUE
Assignment 6, Lexical semantics, OUT
11/19
Mon
Lexical Semantics I: how to learn word meanings
Lecture 16 pdf slides; pdf 2-up



11/21
Weds
Word sense disambiguation & synonyms

Lecture 17 pdf slides; pdf 2-up

 
11/26
Mon
WordNet & FrameNet
• Lecture 18 pdf slides; pdf 2-up
 
11/28
Weds
Language Learning: poverty of the stimulus

Lecture 19 pdf slides; pdf 2-up
Background Reading: Niyogi & Berwick, A language learning model for finite parameter spaces

Assignment 6, Lexical Semantics DUE THURS
12/3
Mon
Poverty of the stimulus discussion
• Lecture 20 pdf slides; pdf 2-up
 
12/5
Weds
Language Learning & Language Change
• Lecture 21 pdf slides; pdf 2-up;
• Background Reading: Niyogi & Berwick, A dynamical systems model for language change
12/10
Mon
Evolution of language I
• Lecture 22 pdf slides; pdf 2-up
• Background Reading: Hauser, Chomsky, Fitch, The faculty of language
• Berwick & Chomsky, Biolinguistics
 
12/12
Weds
Evolution of language II
• Lecture 23 pdf slides; pdf 2-up
Final Projects DUE FRIDAY