1. Course Identity
Course title: Machine Learning
Hours per week: 3
ECTS Units: 6
2. Learning goals
The goal of the course is to offer a broad view of the field of Machine Learning by studying the major models and methods of both the supervised and unsupervised learning paradigms. Learning theory is also covered, so that students understand the capabilities and limits of these models and what is feasible, in general, through any machine learning process.
3. Course content
The subjects covered are:
- Basic concepts of learning.
- Mathematical background, optimization methods (e.g., gradient descent), the LMS algorithm.
- Decision functions, linear separability, the Perceptron algorithm.
- Gaussian discriminant analysis. Bayesian methods.
- Support vector machines.
- Model choice.
- Feature selection.
- Improving performance: bagging, boosting, committee machines.
- Evaluating and debugging learning algorithms.
- Clustering (e.g., the K-means algorithm).
- Mixtures of Gaussians, the EM algorithm.
- Signal analysis: Factor Analysis, Principal Component Analysis (PCA), Independent Component Analysis (ICA).
Elements of Machine Learning theory:
- The Vapnik-Chervonenkis dimension
- Using machine learning algorithms
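To give a flavor of the kind of algorithm studied in the course, here is a minimal sketch of K-means clustering, one of the topics listed above. The course itself uses MATLAB; this sketch is in Python purely for illustration, and all names and parameters in it are hypothetical, not part of the official course material.

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Minimal K-means sketch: X is an (n, d) data matrix.
    Returns the final centroids and a cluster label per point."""
    rng = np.random.default_rng(seed)
    # Initialize centroids at k randomly chosen data points.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assignment step: each point goes to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each centroid becomes the mean of its points
        # (an empty cluster keeps its old centroid).
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break  # converged
        centroids = new
    return centroids, labels

# Toy data: two well-separated Gaussian blobs, one near 0 and one near 5.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)),
               rng.normal(5.0, 0.1, (20, 2))])
cents, labels = kmeans(X, k=2)
```

On data this well separated, the two recovered centroids land near the true blob centers and each blob receives a single label.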
4. Teaching method
The course is taught in weekly lectures. An important part of the student workload is the weekly homework assignments, each covering a specific part of the course. In addition, there is a final project in which the student implements a Machine Learning algorithm for pattern recognition using the MATLAB software package.
5. Student evaluation
Student evaluation is based on the grades of the weekly homework assignments and the grade of the final project.
6. Software-hardware requirements
The MATLAB package is required, with at least the following toolboxes: Statistics, Optimization, Neural Networks, Signal Processing.
7. Bibliography
- (in Greek) Κ. Διαμαντάρας, Τεχνητά Νευρωνικά Δίκτυα [Artificial Neural Networks], Kleidarithmos, 2007.
- C. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.
- S. Haykin, Neural Networks and Learning Machines (3rd Edition), Prentice Hall, 2008.
- J. Shawe-Taylor and N. Cristianini, Kernel Methods for Pattern Analysis, Cambridge University Press, 2004.
- B. Schölkopf and A. J. Smola, Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, MIT Press, 2001.
- R. O. Duda, P. E. Hart, and D. G. Stork, Pattern Classification (2nd Edition), Wiley Interscience, 2000.
- V. Vapnik, Statistical Learning Theory, Wiley Interscience, 1998.
- A. Hyvärinen, J. Karhunen, and E. Oja, Independent Component Analysis, Wiley Interscience, 2001.
- K. Diamantaras and S. Y. Kung, Principal Component Neural Networks: Theory and Applications, Wiley Interscience, 1996.