CSE 4392 Special Topics (Natural Language Processing)

Course Summary

Natural language processing (NLP) is the ability of a computer program to understand and generate human language as it is spoken and written -- referred to as natural language. It is a key component of artificial intelligence (AI) and is considered one of AI's grand challenges. NLP has existed for more than 50 years and has roots in the field of linguistics. This course introduces both classical and contemporary concepts in NLP, with an emphasis on statistical and machine learning approaches. It aims to provide students with a basic understanding and appreciation of key NLP theories such as lexicons, grammars, parsing, and language modeling, as well as emerging NLP applications including text classification, information retrieval, machine translation, text summarization, question answering, and dialogue systems. Students will apply the knowledge acquired in this course through a team project that solves one NLP problem of their choice.
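
For a flavor of the statistical approach mentioned above, here is a minimal sketch (illustrative only, not part of the course materials; the function name and toy corpus are made up) of the kind of model covered in the Language Models week: a bigram language model estimated from raw text by maximum likelihood.

    from collections import Counter, defaultdict

    def train_bigram_lm(corpus):
        """Estimate P(next word | previous word) by maximum likelihood (illustrative sketch)."""
        counts = defaultdict(Counter)
        for sentence in corpus:
            # Pad each sentence with start/end markers so boundaries are modeled too.
            tokens = ["<s>"] + sentence.lower().split() + ["</s>"]
            for prev, nxt in zip(tokens, tokens[1:]):
                counts[prev][nxt] += 1
        # Normalize each row of counts into a conditional probability distribution.
        return {prev: {w: c / sum(row.values()) for w, c in row.items()}
                for prev, row in counts.items()}

    toy_corpus = ["the cat sat", "the dog sat", "the cat ran"]
    lm = train_bigram_lm(toy_corpus)
    print(lm["the"])  # approximately {'cat': 0.667, 'dog': 0.333}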

Latest News and Announcements


Administrative Information

Lectures: Mon/Wed 2:30-3:50 PM, ERB-129.

Instructor: Kenny Zhu - Office: ERB-535, Phone: 3420-4592, Email: kenny[dot]zhu@uta[dot]edu
Office hours: Wed 4-5PM

Teaching Assistant: Sinong (Theron) Wang
Email: sinong.wang[at]uta[dot]edu
Office hours: Mon 4-6PM @ ERB-316

Reference Textbooks:

  1. Speech and Language Processing (3rd ed.) by Dan Jurafsky and James Martin, Prentice Hall.
  2. Foundations of Statistical Natural Language Processing by Christopher Manning and Hinrich Schütze, MIT Press.
  3. Introduction to Information Retrieval by Christopher D. Manning, Prabhakar Raghavan, and Hinrich Schütze, Cambridge University Press.

Assessment:

  1. In-class quizzes: 10%
  2. Tutorial participation: 5% bonus
  3. Assignments: 30%
  4. Project: 30%
  5. Final Exam: 30%

Schedule

Week | Date       | Topic                                 | Slides               | Resources                    | Homework
-----+------------+---------------------------------------+----------------------+------------------------------+----------------------------------------
1    | 01/22/2024 | Introduction                          | [lecture] [tutorial] | MS Ch 3, JM Ch 17            | hw1 (pdf), hw1 (tex)
2    | 01/29/2024 | Language Models                       | [lecture] [tutorial] | JM Ch 3                      | hw2 (pdf), hw2 (tex)
3    | 01/31/2024 | Text Classification                   | [lecture] [tutorial] | JM Ch 4                      | hw3 (pdf), hw3 (tex)
4    | 02/12/2024 | Loglinear Models                      | [lecture] [tutorial] | JM Ch 5                      | hw4 (pdf), hw4 (tex)
5    | 02/14/2024 | Research Proposal                     | N/A                  | N/A                          | N/A
6    | 02/27/2024 | Word Embeddings                       | [lecture] [tutorial] | JM Ch 6, papers in slides    | hw5 (pdf), hw5 (tex)
7    | 03/06/2024 | Neural Network Basics                 | [lecture] [tutorial] | JM Ch 7                      | hw6 (pdf), hw6 (figure png), hw6 (tex)
8    | 03/20/2024 | Sequence Models                       | [lecture] [tutorial] | JM Ch 8                      | hw7 (pdf), hw7 (tex)
9    | 03/25/2024 | Expectation Maximization              | [lecture] [tutorial] | Notes by Michael Collins     | hw8 (pdf), hw8 (tex)
10   | 04/01/2024 | Recurrent Neural Networks             | [lecture] [tutorial] | JM Ch 9, Understanding LSTM  | hw9 (pdf), hw9 (tex), fig (lstm)
11   | 04/08/2024 | Transformer and Large Language Model  | [lecture] [tutorial] | JM Ch 10, 11, 12             |
12   | 04/17/2024 | Information Retrieval                 | [lecture] [tutorial] | MRS Ch 1-6                   | hw10 (pdf), hw10 (tex)
13   | 04/22/2024 | Dialogue Systems                      | [lecture] [tutorial] | JM Ch 15                     |

(JM = Jurafsky & Martin; MS = Manning & Schütze; MRS = Manning, Raghavan & Schütze; see Reference Textbooks above.)

Copyright (c) Kenny Q. Zhu, 2023-2024.