ML Syllabus

MACHINE LEARNING

(Program Elective – I)

Prerequisites

1. A course on "Data Structures"

2. Knowledge of statistical methods

Objectives

1. To explain machine learning techniques such as decision tree learning and Bayesian learning.

2. To understand computational learning theory.

3. To study pattern comparison techniques.

Outcomes

1. Understand the concepts of computational intelligence such as machine learning.

2. Apply machine learning techniques to address real-time problems in different areas.

3. Understand neural networks and their usage in machine learning applications.

UNIT – I

Introduction

Well-posed learning problems, designing a learning system, perspectives and issues in machine learning.

Concept Learning and the General-to-Specific Ordering

Introduction, A Concept Learning Task, Concept Learning as Search, Find-S: Finding a Maximally Specific Hypothesis, Version Spaces and the Candidate Elimination Algorithm, Remarks on Version Spaces and Candidate Elimination, Inductive Bias.
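
As an illustrative aside, a minimal Python sketch of Find-S over attribute-value data; the attribute names and training examples are invented for the illustration, loosely following Mitchell's EnjoySport task:

```python
# Find-S: start from the most specific hypothesis and minimally
# generalize it on each positive example; negative examples are ignored.
# The attributes and data below are illustrative, not prescribed.

def find_s(examples):
    """examples: list of (attribute_tuple, label) pairs."""
    h = None                           # most specific: rejects everything
    for attrs, label in examples:
        if not label:                  # Find-S skips negative examples
            continue
        if h is None:                  # first positive example: copy it
            h = list(attrs)
        else:
            for i, value in enumerate(attrs):
                if h[i] != value:      # generalize conflicting attributes
                    h[i] = "?"         # "?" matches any value
    return h

# Toy data: (Sky, AirTemp, Humidity) -> EnjoySport?
data = [
    (("Sunny", "Warm", "Normal"), True),
    (("Sunny", "Warm", "High"), True),
    (("Rainy", "Cold", "High"), False),
]
print(find_s(data))  # ['Sunny', 'Warm', '?']
```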

Decision Tree Learning

Introduction, Decision Tree Representation, Appropriate Problems for Decision Tree Learning, The Basic Decision Tree Learning Algorithm, Hypothesis Space Search in Decision Tree Learning, Inductive Bias in Decision Tree Learning, Issues in Decision Tree Learning.
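
A short sketch of the entropy and information-gain computations at the heart of the basic (ID3-style) decision tree algorithm; the toy rows and labels are illustrative:

```python
import math
from collections import Counter

def entropy(labels):
    """Entropy(S) = -sum_i p_i * log2(p_i) over class proportions."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Gain(S, A) = Entropy(S) - sum_v (|S_v| / |S|) * Entropy(S_v)."""
    total = len(labels)
    remainder = 0.0
    for value in {row[attr_index] for row in rows}:
        subset = [lab for row, lab in zip(rows, labels)
                  if row[attr_index] == value]
        remainder += (len(subset) / total) * entropy(subset)
    return entropy(labels) - remainder

# Illustrative rows of (Outlook, Wind) with PlayTennis labels.
rows = [("Sunny", "Weak"), ("Sunny", "Strong"), ("Rain", "Weak"), ("Rain", "Strong")]
labels = ["No", "No", "Yes", "Yes"]
print(information_gain(rows, labels, 0))  # 1.0: Outlook perfectly splits the labels
```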

UNIT – II


Artificial Neural Networks

Introduction, Neural Network Representation, Appropriate Problems for Neural Network Learning, Perceptrons, Multilayer Networks and the Backpropagation Algorithm, Discussion on the Backpropagation Algorithm, An Illustrative Example: Face Recognition.
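
A minimal sketch of the perceptron training rule from this unit, w <- w + eta * (t - o) * x, trained here on the linearly separable Boolean AND function (the learning rate and epoch count are arbitrary illustrative choices):

```python
def train_perceptron(samples, eta=0.1, epochs=20):
    """samples: list of (input_tuple, target) pairs with targets in {0, 1}."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, t in samples:
            # threshold unit: output 1 iff w . x + b > 0
            o = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            # perceptron training rule, applied per example
            w = [wi + eta * (t - o) * xi for wi, xi in zip(w, x)]
            b += eta * (t - o)
    return w, b

# Boolean AND is linearly separable, so the rule converges.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
print(train_perceptron(and_data))
```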

Evaluating Hypotheses

Motivation, Estimating Hypothesis Accuracy, Basics of Sampling Theory, A General Approach for Deriving Confidence Intervals, Difference in Error of Two Hypotheses, Comparing Learning Algorithms.
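
A worked example of the approximate N% two-sided confidence interval for true error covered in this unit, error_S(h) +/- z_N * sqrt(error_S(h) * (1 - error_S(h)) / n), which holds for a test sample of n >= 30 examples drawn independently of h; the sample numbers are illustrative:

```python
import math

Z = {0.90: 1.64, 0.95: 1.96, 0.99: 2.58}  # two-sided z_N values

def error_confidence_interval(sample_error, n, level=0.95):
    """Approximate interval for the true error given the sample error."""
    margin = Z[level] * math.sqrt(sample_error * (1 - sample_error) / n)
    return sample_error - margin, sample_error + margin

# 12 mistakes on 40 held-out examples: sample error 0.30.
print(error_confidence_interval(12 / 40, n=40))  # ~(0.158, 0.442) at 95%
```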

UNIT – III

Bayesian Learning

Introduction, Bayes Theorem, Bayes Theorem and Concept Learning, Maximum Likelihood and Least Squared Error Hypotheses, Maximum Likelihood Hypotheses for Predicting Probabilities, Minimum Description Length Principle, Bayes Optimal Classifier, Gibbs Algorithm, Naïve Bayes Classifier, An Example: Learning to Classify Text, Bayesian Belief Networks, EM Algorithm.
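
A minimal sketch of the naïve Bayes classifier, v_NB = argmax_v P(v) * prod_i P(a_i | v), with add-one (Laplace) smoothing; the discrete weather-style data and the smoothing scheme are illustrative choices:

```python
import math
from collections import Counter, defaultdict

def train_nb(rows, labels):
    priors = Counter(labels)            # class counts for P(v)
    values = defaultdict(set)           # attr index -> observed value set
    cond = defaultdict(Counter)         # (attr index, label) -> value counts
    for row, lab in zip(rows, labels):
        for i, v in enumerate(row):
            values[i].add(v)
            cond[(i, lab)][v] += 1
    return priors, cond, values, len(labels)

def classify_nb(x, priors, cond, values, n):
    best, best_lp = None, -math.inf
    for lab, count in priors.items():
        lp = math.log(count / n)        # log prior
        for i, v in enumerate(x):
            # add-one smoothing over the attribute's value set
            lp += math.log((cond[(i, lab)][v] + 1) / (count + len(values[i])))
        if lp > best_lp:
            best, best_lp = lab, lp
    return best

rows = [("Sunny", "Hot"), ("Sunny", "Mild"), ("Rain", "Mild"), ("Rain", "Cool")]
labels = ["No", "No", "Yes", "Yes"]
model = train_nb(rows, labels)
print(classify_nb(("Sunny", "Cool"), *model))  # "No"
```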

Computational Learning Theory

Introduction, Probably Learning an Approximately Correct Hypothesis, Sample Complexity for Finite Hypothesis Spaces, Sample Complexity for Infinite Hypothesis Spaces, The Mistake Bound Model of Learning.

 

Instance-Based Learning

Introduction, k-Nearest Neighbor Learning, Locally Weighted Regression, Radial Basis Functions, Case-Based Reasoning, Remarks on Lazy and Eager Learning.
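
A minimal k-nearest-neighbor sketch using Euclidean distance and majority vote, in the lazy-learning spirit of this unit; the points and the choice k=3 are illustrative:

```python
import math
from collections import Counter

def knn_classify(query, examples, k=3):
    """examples: list of (point_tuple, label); vote among the k nearest."""
    nearest = sorted(examples, key=lambda e: math.dist(query, e[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [((1.0, 1.0), "A"), ((1.2, 0.8), "A"),
         ((4.0, 4.2), "B"), ((3.8, 4.0), "B")]
print(knn_classify((1.1, 0.9), train))  # "A": two of the three nearest are A
```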

UNIT – IV

Pattern Comparison Techniques

Temporal Patterns, Dynamic Time Warping Methods, Clustering, Codebook Generation, Vector Quantization.
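
A compact sketch of the classic dynamic time warping recurrence, D(i, j) = cost(i, j) + min(D(i-1, j), D(i, j-1), D(i-1, j-1)), which aligns two temporal patterns of different lengths; the one-dimensional sequences are illustrative:

```python
def dtw(a, b, dist=lambda x, y: abs(x - y)):
    """Minimum cumulative alignment cost between sequences a and b."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(a[i - 1], b[j - 1])
            # extend the cheapest of the three allowed predecessor cells
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# Same shape with a timing offset: DTW cost is 0 despite unequal lengths.
print(dtw([0, 1, 2, 3, 2, 0], [0, 0, 1, 2, 3, 2, 0]))  # 0.0
```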

Pattern Classification

Introduction to HMMs, Training and Testing of Discrete Hidden Markov Models and Continuous Hidden Markov Models, Viterbi Algorithm, Different Case Studies in Speech Recognition and Image Processing.
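
A minimal Viterbi decoder for a discrete HMM, recovering the most likely hidden state sequence for an observation sequence; the two-state toy model below is illustrative, not taken from the cited case studies:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most probable state path for the observation list obs."""
    # V[t][s] = (best probability of a path ending in s at time t, predecessor)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p][0] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            V[t][s] = (prob, prev)
    # backtrack from the best final state
    path = [max(states, key=lambda s: V[-1][s][0])]
    for t in range(len(obs) - 1, 0, -1):
        path.append(V[t][path[-1]][1])
    return list(reversed(path))

states = ("Rainy", "Sunny")
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
print(viterbi(["walk", "shop", "clean"], states, start_p, trans_p, emit_p))
# ['Sunny', 'Rainy', 'Rainy']
```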

UNIT – V

Analytical Learning

Introduction, Learning with Perfect Domain Theories: PROLOG-EBG, Remarks on Explanation-Based Learning, Explanation-Based Learning of Search Control Knowledge, Using Prior Knowledge to Alter the Search Objective, Using Prior Knowledge to Augment Search Operators.

Combining Inductive and Analytical Learning

Motivation, Inductive-Analytical Approaches to Learning, Using Prior Knowledge to Initialize the Hypothesis.

Textbooks:

1. Machine Learning, Tom M. Mitchell, McGraw-Hill.

2. Fundamentals of Speech Recognition, Lawrence Rabiner and Biing-Hwang Juang.

References

1. Machine Learning: An Algorithmic Perspective, Stephen Marsland, Taylor & Francis.
