
Neural Networks and Deep Learning

Term 2018HT (autumn 2018)
Status Active. Full course - only reserve registrations
School National Graduate School in Computer Science (CUGS)
Division NLPLAB
Owner Marco Kuhlmann
Homepage http://www.ida.liu.se/divisions/hcs/nlplab/courses/nn/

Course plan

Lectures

Basics of Machine Learning. Deep Feed-Forward Networks. Convolutional Networks. Recurrent and Recursive Networks. Practical Methodology. Applications: Classification and Regression on Spatial Data and Sequential Data.

Recommended for

Doctoral students with an interest in neural networks and deep learning, and in how to apply techniques from these areas to their own research.

The course was last given

A previous version of this course was given as a PhD course at IDA and ISY in Autumn 2016, and as a PhD Autumn School for the Swedish Society for Automated Image Analysis.

Goals

On successful completion of the course the student will be able to:

* explain different network architectures and how these are used in current applications,
* implement, train, and evaluate neural networks using existing software libraries,
* present and critically assess current research on neural networks and their applications,
* relate the concepts and techniques introduced in the course to the student's own research,
* plan and carry out a research project on neural networks within given time limits.

Prerequisites

* basic calculus (derivatives)
* basic linear algebra (matrices, vectors)
* basic probability and statistics
* programming experience in Python, Java or MATLAB

Gaps in these prerequisites may be filled by teacher-assisted self-study before the start of the course; contact the examiner for details.

Contents

Basics of machine learning (regression, classification, numerical optimisation). Feed-forward networks. Loss functions. Back-propagation training. Regularisation. Convolutional networks. Recurrent and recursive networks. Processing sequences, images, and hierarchical structures. Applications of neural networks in natural language processing and computer vision. Current areas of research.

Organization

The course is organised in three parts. The first part consists of lectures presenting basic concepts and methods in deep learning, as well as applications from two areas where deep learning has been particularly successful. This part also includes a number of lab sessions that will give students practical experience in implementing, training, and evaluating deep learning architectures using existing software libraries. The second part of the course is a series of seminars where students present and discuss one or more research articles. The third part is an individual project that the students choose based on their own research interests.
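
For illustration only (this is not taken from the course material): the sketch below shows the kind of workflow the lab sessions refer to, i.e. implementing, training, and evaluating a small feed-forward classifier with an existing software library. TensorFlow/Keras is assumed here purely as an example; the data, architecture, and hyperparameters are placeholders.

# Illustrative sketch, not part of the official lab material.
# Assumes TensorFlow/Keras as one possible "existing software library";
# the data, architecture, and hyperparameters are placeholders.
import numpy as np
from tensorflow import keras

# Toy data: 1000 samples, 20 features, 3 classes.
rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 20)).astype("float32")
y = rng.integers(0, 3, size=1000)

# Implement: a small feed-forward network with one hidden layer.
model = keras.Sequential([
    keras.Input(shape=(20,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Train and evaluate.
model.fit(x, y, epochs=5, batch_size=32, validation_split=0.2)
loss, accuracy = model.evaluate(x, y)
print(f"accuracy: {accuracy:.2f}")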

Literature

The main book for the course is: Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning. MIT Press, 2016.

Additional reading consists of excerpts from the following books:
* Christopher M. Bishop. Neural Networks for Pattern Recognition. Oxford University Press, 1996.
* Simon O. Haykin. Neural Networks and Learning Machines. Third edition. Prentice Hall, 2008.
* Yoav Goldberg. Neural Network Methods in Natural Language Processing. Morgan & Claypool, 2017.

Lecturers

* Marco Kuhlmann (IDA)
* Michael Felsberg (ISY)

Examiner

Marco Kuhlmann (IDA)

Examination

* lab assignments (1.5 credits)
* active participation in the research seminar (1.5 credits)
* individual project (3 credits)

Credit

6 credits

Organized by

Natural Language Processing Laboratory (IDA), Computer Vision Laboratory (ISY)

Comments

This proposal has also been submitted to the Faculty for Science and Engineering for consideration as a faculty-level course.

