CMSC 475/675: Neural Networks


Instructor: Tejas Gokhale (OH: WED 1430--1530; ITE 214)
Teaching Assistant: Ziwei Zhang (OH: MON 1300--1400 and TUE 1430--1530; ITE 334)
Time: MON and WED 1600--1715
Location: ITE 231



Course Description

This class offers a comprehensive overview of neural network architectures and deep learning algorithms. Deep learning has been a highly successful research field over the last 20 years across a range of domains (vision, language, audio, robotics; "AI" in general) and has also translated into significant commercial success. The class will focus on the core principles of extracting meaningful representations from high-dimensional data, a fundamental requirement for many applications in autonomous decision making. Lectures will cover fundamental topics such as network design, training and optimization, and evaluation. Homework assignments will give students the opportunity to implement algorithms learned in class for applications in visual recognition, language understanding, and other domains. In the term project, students will construct a research hypothesis, propose new techniques and solutions, interpret results, and communicate key findings.

Prerequisites: We will assume that you have a solid grounding in linear algebra, geometry, probability, and Python programming. Recommended classes at UMBC are MATH 221 (Linear Algebra), STAT 355 or CMPE 320 (Probability and Statistics), and MATH 151 (Calculus and Analytic Geometry). If you are unfamiliar with linear algebra or calculus, you should consider taking those courses first: without these tools, you are likely to struggle in this course. Although we will provide brief refreshers of the necessary math, CMSC 475/675 should not be your first introduction to these topics.
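
As a rough self-check on these prerequisites, you should be able to read a short NumPy sketch like the one below without difficulty. The snippet is illustrative, not course material: it runs gradient descent on a least-squares problem, touching linear algebra, calculus, and Python at roughly the level we will assume.

    import numpy as np

    # Illustrative self-check: gradient descent on least squares,
    # minimizing the mean squared error (1/n) * ||X w - y||^2 over w.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))               # 100 samples, 5 features
    w_true = rng.normal(size=5)                 # ground-truth weights
    y = X @ w_true + 0.1 * rng.normal(size=100) # noisy targets

    w = np.zeros(5)                             # initial guess
    lr = 0.01                                   # learning rate
    for _ in range(500):
        grad = 2.0 * X.T @ (X @ w - y) / len(y) # gradient of the MSE
        w -= lr * grad                          # descent step

    print("recovery error:", np.linalg.norm(w - w_true))

If every line here makes sense to you (matrix products, array shapes, the gradient computation), you meet the bar; if not, consider reviewing the material above before enrolling.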


Schedule

The schedule is tentative and subject to change.

#      Topic                              Resources               Optional Reading
0      Introduction                       [slides]
1      Learning                           [slides]                DL Book Ch. 5, Sec. 5.1
2, 3   MLP and Gradient Descent           [slides_A] [slides_B]   DL Book Ch. 6
4      Convolutional Neural Networks      [slides]                DL Book Ch. 9; Bishop DL Book Ch. 10
5      Training NNs                       [slides]
6      NN Generalization
7      Representation Learning
8      Generative Models
9      Transformers
10     Robustness
11     Uncertainty and Calibration
12     Implicit Neural Representations
13     Neural Operators

Grading

Please consult the syllabus for details. The class has a mix of PhD, MS, and BS students. We believe that anyone with the above prerequisites and a will to learn will do well. Work hard, engage and participate in class, learn how to read, write, and present research articles, be creative in your projects, and seek help when needed!

Late Submission Policy

Everyone in this course has 10 late days to use as needed for personal reasons and emergencies. Do not use them as an excuse to procrastinate -- start working on your assignments early. See the syllabus for details.

Projects

Projects will be judged on the basis of relative growth (from where you start to where you end). Graduate projects should have an original and unique research hypothesis with potential for publication. Undergraduate students may also propose an original research hypothesis, but are allowed instead to work on an idea provided by the instructors (i.e., you get to skip "ideation"), an innovative application, or a novel combination of existing work.

Academic Integrity

Please read UMBC's policy on Academic Integrity. I take academic integrity seriously. I hope that we will never have to deal with violations -- they are never pleasant for anyone involved. Please read the policies stated in the syllabus.