| Session | Topics/Activities |
| --- | --- |
| 1 | Introduction to Machine Learning – I. Example: Curve Fitting; II. Probability Refresher; III. Model Selection; IV. The Curse of Dimensionality; V. Decision and Information Theory |
| 2 | Probability Distributions – I. Discrete and Continuous Distributions; II. The Exponential Family; III. Nonparametric Methods |
| 3-4 | Linear Models for Regression – I. Concept of Basis Functions; II. Linear Basis Function Models; III. Bias-Variance Decomposition; IV. Bayesian Linear Regression; V. Model Comparison; VI. Evidence Approximation; VII. Limitations of Fixed Basis Functions |
| 5-6 | Linear Models for Classification – I. Discriminant Functions; II. Generative Models; III. Discriminative Models; IV. Laplace Approximation; V. Bayesian Logistic Regression |
| 7-8 | Neural Networks – I. Feed-Forward Mechanism; II. Training of ANNs Using Backpropagation; III. Regularization; IV. Bayesian Neural Networks |
| 9-10 | Kernel Methods – I. Dual Representation; II. Constructing Kernels; III. RBF Networks; IV. Gaussian Processes |
| 11-12 | Sparse Kernel Machines – I. Maximum Margin Classifiers; II. Relevance Vector Machines |
| 13-14 | Mixture Models and the EM Algorithm – I. k-means Clustering; II. Mixtures of Gaussians; III. An Alternative View of EM |
| 15-16 | Approximate Inference – I. Variational Methods; II. Variational Linear Regression; III. Exponential Family Distributions; IV. Local Variational Methods; V. Variational Logistic Regression |
| 17-18 | Sampling Methods – I. Basic Sampling Algorithms; II. Markov Chain Monte Carlo; III. Gibbs Sampling; IV. Slice Sampling; V. Hybrid Monte Carlo |
| 19 | Combining Models – I. Bayesian Model Averaging; II. Committees; III. Boosting; IV. Tree-Based Models; V. Conditional Mixture Models |
| 20 | Doubt Clearing and Wrap-Up |