
Introduction to Machine Learning

Welcome! This is the course webpage for the ML course under ACA Summer School 2016. I shall keep posting relevant information here regarding timings, resources, general course content, and announcements.

I tentatively plan to cover what I can in about 10 lectures. There are plenty of resources that I will be talking about in class, both for learning the theoretical aspects of Machine Learning and for applying it to problems - from simple ones to increasingly complex and interesting ones.

For a programming environment, I have mostly used Python, a bit of R, and a bit of GNU Octave - not Matlab, though, since it is proprietary software. You may use these, or whatever else you like. I possibly won't be able to help much with Java, for I have little experience with it, but I assure you that help will be provided with any programming environment, by me or someone else. The environment does not matter; the concepts do. The results don't matter (okay, they do, but oh well); the understanding of the results does.

As for the language used in class and for communication, the academics will be in English. I will say this in class repeatedly, but I'll say it again here: please do tell me if you have trouble understanding not only the concepts, but also my language. I will improve as I get feedback. I am also comfortable speaking Hindi, Bengali and a bit of Odia, so if you can put your questions better in these languages, please do not hesitate to do so.

A tentative schedule, along with resources, is given below. It can change, however, by an algorithm that I have in my brain, as I get feedback from you. Whether it will be an online algorithm or not will depend on both of us, though.

Lectures :

Lecture 1 : Introduction to (Introduction to Machine Learning) : We introduced the field along with a bit of its history and spoke about the general way of solving problems in Machine Learning. Slides are available here. The code that I used to show the OCR (Optical Character Recognition) demo can be found here.
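
For anyone curious before opening the linked code: below is a minimal sketch of the same idea, not the OCR code from class. It uses scikit-learn's small built-in digits dataset instead of real scanned characters, and a plain logistic regression classifier - learn from labelled images, then check on images the model has never seen.

```python
# Minimal digit-recognition sketch (NOT the code shown in class; just an
# illustration of the same idea with scikit-learn's built-in digits dataset).
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

digits = load_digits()                      # 8x8 grayscale images of digits 0-9
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=1000)     # a simple linear classifier
clf.fit(X_train, y_train)                   # learn from the training images
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```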

Lecture 2 : Recap of Linear Algebra, Probability Theory and other Mathematical Stuff : We spoke of probability and linear algebra in general. There were no slides for this one. Some resources on probability are available here. We also introduced the problem of regression and gradient descent, and spoke of loss functions. A good lecture series on linear algebra is posted here. You may also look at lecture notes I scribed myself for a different course, available here.
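
To make gradient descent concrete, here is a minimal sketch of my own (a toy example, not something from the lecture): fit a straight line to noisy data by repeatedly stepping opposite the gradient of the squared-error loss.

```python
import numpy as np

# Toy data: y is roughly 3*x + 2 plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 5, size=50)
y = 3 * x + 2 + rng.normal(scale=0.5, size=50)

w, b = 0.0, 0.0          # parameters of the line y_hat = w*x + b
lr = 0.01                # learning rate (step size)

for step in range(2000):
    y_hat = w * x + b
    error = y_hat - y
    grad_w = 2 * np.mean(error * x)     # d(loss)/dw for the squared-error loss
    grad_b = 2 * np.mean(error)         # d(loss)/db
    w -= lr * grad_w                    # step opposite the gradient
    b -= lr * grad_b

print(w, b)   # should end up close to 3 and 2
```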

Lecture 3 : Supervised Learning I : Regression : We spoke of regression as a machine learning problem. We introduced the various aspects of regression and gradient descent, looked at a few problems (I attach the regression code here), solved the optimization problem analytically, and introduced classification. The UCI Machine Learning Repository as well as Kaggle were shown. I encourage everyone to also look here for data pertaining to India. As it is, data is very Euro-America-centric.
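
The analytic solution mentioned above is the usual least-squares one, theta = (X^T X)^(-1) X^T y. Here is a small sketch of it (again, only an illustration, not the attached regression code), run on the same kind of toy data as the gradient-descent sketch from Lecture 2:

```python
import numpy as np

# Same toy data as in the gradient-descent sketch: y is roughly 3*x + 2.
rng = np.random.default_rng(0)
x = rng.uniform(0, 5, size=50)
y = 3 * x + 2 + rng.normal(scale=0.5, size=50)

# Design matrix with a column of ones for the intercept term.
X = np.column_stack([np.ones_like(x), x])

# Normal equation: solve (X^T X) theta = X^T y rather than inverting explicitly.
theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta)   # approximately [2, 3]: intercept first, then slope
```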

Lecture 4 : Supervised Learning II : Classification : We had a quiz today, and after wrapping up the discussion on regression, we spoke of classification in general.
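
As a tiny illustration of what classification looks like in code (my own toy example, unrelated to the quiz): predict a discrete label - here, which of two clusters a point belongs to - rather than a continuous value as in regression.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
# Two Gaussian blobs: class 0 centred near (0, 0), class 1 centred near (3, 3).
X = np.vstack([rng.normal(0, 1, size=(50, 2)),
               rng.normal(3, 1, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

clf = KNeighborsClassifier(n_neighbors=5)   # label a point by its 5 nearest neighbours
clf.fit(X, y)
print(clf.predict([[0.5, 0.2], [2.8, 3.1]]))   # expected output: [0 1]
```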

Lecture 5 : Supervised Learning III : Classification : Decision Trees : We discussed decision trees in the general framework of machine learning. I referred to the slides of Prof. Harish Karnick, available here.
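
A minimal decision-tree sketch, using scikit-learn rather than anything from Prof. Karnick's slides: it fits a small tree on the Iris dataset and prints the if-then rules it learned.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.3, random_state=0)

tree = DecisionTreeClassifier(max_depth=3, random_state=0)   # keep the tree small
tree.fit(X_train, y_train)
print(export_text(tree, feature_names=iris.feature_names))   # the learned splits
print("test accuracy:", tree.score(X_test, y_test))
```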

Lecture 6 : Supervised Learning wrap-up and Unsupervised Learning I

Lecture 7 : Unsupervised Learning II

Lecture 8 : Introduction to Non-linear Decision Boundaries : SVMs

Lecture 9 : Introduction to Neural Networks : What's the hype about?

Lecture 10: Wrapping up and possible future work