Neural Networks with Applications to Vision and Language

Graduate Course, 6 credits, Autumn 2016.
Examiners: Michael Felsberg (ISY), Marco Kuhlmann (IDA)

Over the past few years, neural networks have enjoyed a major resurgence in machine learning, and today yield state-of-the-art results in various fields. This course provides an introduction to neural network models, and surveys some of the applications of these models in computer vision and natural language processing. The course covers feedforward networks, convolutional networks, recurrent and recursive networks, as well as general topics such as input encoding and training techniques. The course also provides acquaintance with some of the software libraries available for building and training neural networks.

Syllabus

Intended learning outcomes

On successful completion of the course you will be able to:

Contents

Basics of machine learning (regression, classification, numerical optimisation). Feedforward networks. Loss functions. Backpropagation training. Regularisation. Convolutional networks. Recurrent and recursive networks. Processing sequences, images, and hierarchical structures. Applications of neural networks in natural language processing and computer vision. Current areas of research.

Course literature

Primary literature

Secondary literature

Organisation

Lectures

The lectures present basic concepts and methods in neural networks, and survey applications in computer vision and natural language processing. Each lecture connects to one or several chapters in the book by Goodfellow, Bengio, and Courville (GBC). The course schedule indicates which chapters you should have read before each lecture and which chapters you are suggested to read after it.

Lab sessions

The lab sessions give you practical experience in implementing, training, and evaluating neural networks using existing software libraries. There are a total of four labs:

Lab 0 is a preparatory lab that you do on-site. For each of the remaining labs there is an on-site introduction by an instructor, but most of the actual work is self-scheduled.

In order to do labs 1–3, you will need to bring your own computer and install the following software: Python 3, Jupyter Notebook, NumPy, Theano, Keras. If you have some spare time, you can try to install this software before the start of the course; otherwise, we will help you with the installation during lab 0.
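
If you want to check that your installation works before lab 0, the following minimal sketch imports the libraries and trains a tiny feedforward network in Keras on synthetic data. It is not part of the lab material: the data, network size, and hyperparameters are made up purely for illustration, and the argument names follow the current Keras API, which may differ slightly from the version installed during the course.

    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense

    # Synthetic binary classification data: 1000 examples with 20 features each.
    # The label is 1 when the features sum to a positive value, 0 otherwise.
    rng = np.random.RandomState(0)
    x_train = rng.randn(1000, 20)
    y_train = (x_train.sum(axis=1) > 0).astype("float32")

    # A small feedforward network with one hidden layer and a sigmoid output.
    model = Sequential()
    model.add(Dense(32, activation="relu", input_dim=20))
    model.add(Dense(1, activation="sigmoid"))
    model.compile(optimizer="sgd", loss="binary_crossentropy", metrics=["accuracy"])

    # Train for a few epochs and print the loss and accuracy on the training data.
    model.fit(x_train, y_train, epochs=5, batch_size=32, verbose=1)
    print(model.evaluate(x_train, y_train, verbose=0))

If the script runs to completion and the accuracy increases over the epochs, your Python, NumPy, and Keras installation is working.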

Research seminars

In the seminar sessions, you and your fellow students present and discuss articles reporting current research on neural networks and their applications. Instructions for presenters and the seminar schedule are provided separately.

Individual project

The final part of the course is an individual project that you choose based on your own research interests. We expect that a typical project will apply neural networks to a concrete problem related to your thesis work. Here is how it works:

Examination

The examination for the course consists of three parts:

You can choose to be examined on any combination of these parts. To be examined on the project part, you will have to submit a project plan (deadline: see below). If you do not submit a project plan, we will assume that you only want to be examined on the lab assignments and the seminars.

Deadlines for the examination:

Schedule

The lectures, lab sessions, and seminars take place in the B-building and the E-building at Campus Valla.

Unless indicated otherwise, times given in the format XY refer to XY.15 (for instance, 10 means 10.15). When we mean the full hour, we write it out as XY.00.

Day        | Before session               | Session                                                                      | After session
12/9 13–17 |                              | Lab session 0. Location: Systemet                                            |
13/9 8–10  | Read GBC, chapters 1–5       | Lecture 1: Basics of Machine Learning. Location: Visionen                    |
13/9 10–12 | Read GBC, chapter 6          | Lecture 2: Feedforward Networks. Location: Visionen                          | Read GBC, chapters 7–8
13/9 13–15 | Read instructions for lab 1  | Lab session 1. Location: Systemet, Allen Newell                              |
14/9 8–10  | Read GBC, chapter 9          | Lecture 3: Convolutional Networks. Location: Visionen                        |
14/9 10–12 |                              | Lecture 4: Applications in Computer Vision. Location: Alan Turing            |
14/9 13–15 | Read instructions for lab 2  | Lab session 2. Location: Systemet, Allen Newell                              |
15/9 8–10  | Read GBC, chapter 10         | Lecture 5: Recurrent and Recursive Networks. Location: Visionen              | Read GBC, chapters 11–12
15/9 10–12 | Skim Goldberg                | Lecture 6: Applications in NLP. Location: Visionen                           |
15/9 13–15 |                              | Seminar sessions S1A and S1B. Locations: Systemet (S1A), Allen Newell (S1B)  |
15/9 15–17 | Read instructions for lab 3  | Lab session 3. Location: Systemet, Allen Newell                              |
16/9 8–10  |                              | Seminar sessions S2A and S2B. Locations: Systemet (S2A), Allen Newell (S2B)  |
16/9 10–12 |                              | Seminar sessions S3A and S3B. Locations: Systemet (S3A), Allen Newell (S3B)  |