Advanced Machine Learning Program

The Advanced Machine Learning Program is the continuation of the 3-month Applied Machine Learning Program. In the second part of the program we cover the fundamentals of scalable machine learning, advanced Bayesian models, data streams, and IoT data processing. You will learn the applications of machine learning in recommender systems and social network analysis in detail, and work on techniques for mining huge datasets in multimedia and web information retrieval. We will also cover the application of machine learning in natural language processing and text mining.

Program Duration: 6 months

The Outcome

Upon graduating, you will be comfortable designing and implementing models in scalable machine learning and advanced Bayesian modeling, and you will know the fundamentals of advanced machine learning and its applications in recommender systems, social network analysis, text mining, natural language processing, and data streams. You will also apply massive data mining principles in an end-to-end IoT project.

The Details

The Advanced Machine Learning Program runs for 24 weeks and is subdivided into multiple courses.

  • Includes 500 hours of in-class instruction and hands-on sessions: 360 hours of in-person classes and 140 hours of webcast classes with TAs
  • Four 4-day and four 2-day in-person immersive sessions will be held during the program
  • In-person classes will be held one day every weekend
  • Webcast classes will be held for 4 hours on weekdays

Applied Machine Learning Program Curriculum

Application Projects

Phase I

  • Network Intrusion detection
  • Predictive Text Generation
  • Churn Prediction
  • Weather Forecasting

Phase II

  • Customer Lifetime Modelling
  • Speech synthesis
  • Named Entity Extraction
  • Diagnosis

Phase III

  • Driving for Fuel Efficiency
  • Car navigation
  • Viral Marketing

Advanced Machine Learning Program Curriculum

Application Projects

Phase I

  • IoT Data Processing
  • Spam Detector
  • Dialogue Systems
  • News Recommendation
  • Q & A Systems

Phase II

  • Sentiment Analysis
  • Machine translation
  • Text Summarization
  • Natural Language Processing (caption generation, Word2Vec)
  • Building Intelligent Recommender System

Phase III

  • Conversational Recommender System
  • Building Intelligent Information System
  • Mining Massive Multimedia Data Sets
  • Analyzing Social Networks at Scale

Professor Bhiksha Raj, Fellow IEEE

Language Technologies Institute, School of Computer Science, Carnegie Mellon University

Professor Bhiksha Raj is an expert in the area of Deep Learning and Speech Recognition and has two decades of experience. He was named to the 2017 class of IEEE Fellows for his "contributions to speech recognition," according to IEEE. He is the main instructor of 11-785, Carnegie Mellon University's official Deep Learning course, which is followed by thousands of researchers worldwide.


Dr. Sarabjot Singh Anand

Co-Founder and Chief Data Scientist at Tatras Data

Dr. Sarabjot Singh Anand is a Data Geek. He has been involved in the field of data mining since the early 1990s and has derived immense pleasure in developing algorithms, applying them to real-world problems and training a host of data analysts in the capacity of being an academic and data analytics consultant.


Dr. Vikas Agrawal

Senior Principal Data Scientist @ Oracle Analytics Cloud

Vikas Agrawal works as a Senior Principal Data Scientist in Cognitive Computing for Oracle Analytics Cloud. His current interests are in automated discovery, adaptive anomaly detection in streaming data, intelligent context-aware systems, and explaining black-box model predictions.


Mr. Mukesh Jain

Analytics, AI, ML & DL Leader (ex-Microsoft, ex-Jio)

Mukesh Jain has been a practitioner and leader in Analytics, AI, ML & DL since 1995.

He is a technologist, techno-biz leader, data scientist, author, coach, and teacher.


Professor Joao Gama

University of Porto, Director LIAAD

Joao Gama is an Associate Professor at the Faculty of Economics, University of Porto. He is a researcher and the Director of LIAAD, a group belonging to INESC TEC. He received his PhD from the University of Porto in 2000. He has worked on projects and authored papers in areas related to machine learning, data streams, and adaptive learning systems, and is a member of the editorial boards of international journals in his area of expertise.


Professor Ashish Ghosh

Indian Statistical Institute, Kolkata

Ashish Ghosh is a Professor in the Machine Intelligence Unit and the In-charge of the Center for Soft Computing Research at the Indian Statistical Institute, Kolkata.


Professor Jaime Carbonell

Director LTI, Carnegie Mellon University, USA

Jaime is Director and Founder of the Language Technologies Institute and Allen Newell Professor of Computer Science at Carnegie Mellon University. He is a world-renowned expert in the areas of information retrieval, data mining and machine translation. Jaime co-founded and took public Carnegie Group, a company in the IT services market employing advanced artificial-intelligence techniques.


Dr. Derick Jose

Co-founder, Flutura Decision Sciences & Analytics

Derick is the co-founder of Flutura Decision Sciences, a niche AI & IIoT company focused on impacting outcomes for the Engineering and Energy industries. Flutura has been rated by Bloomberg as one of the fastest-growing machine intelligence companies, and its AI platform Cerebra has been certified to work with Halliburton's and Hitachi's platforms.


Mr. Joy Mustafi, Director and Principal Researcher at Salesforce

Visiting Scientist, Innosential

Joy is a winner of the Zinnov Award 2017 for Technical Role Model in Emerging Technologies (Senior Level). He has collaborated with the ecosystem by visiting around twenty-five leading universities in India as visiting faculty, guest speaker, advisor, mentor, project supervisor, panelist, academic board member, curriculum moderator, paper setter and evaluator, and judge of events such as hackathons. He holds more than twenty-five patents and has fifteen publications on artificial intelligence in recent years.


Dr. Vijay Gabale

Co-founder and CTO Infilect

Deep-learning-enabled computer vision forms the core competence of Infilect's products. Prior to co-founding Infilect, Vijay was a research scientist with IBM Research. Vijay obtained his Ph.D. in Computer Science from IIT Bombay in 2012. He has worked extensively on intelligent networks and systems, applying machine learning and deep learning techniques, and has published research papers in top-tier conferences such as SIGCOMM and KDD and holds several patents.


Mr. Dipanjan Sarkar

Intel AI

Dipanjan (DJ) holds a master of technology degree with specializations in Data Science and Software Engineering. He is also an avid supporter of self-learning and massive open online courses. He plans to venture soon into the world of open-source products to improve the productivity of developers across the world.


Mr. Ajit Jaokar

Director of the Data Science Program, University of Oxford

Ajit Jaokar's work is based on identifying and researching cross-domain technology trends in Telecoms, Mobile and the Internet.

Ajit conducts a course at Oxford University on Big Data and Telecoms and also teaches at City Sciences (Technical University of Madrid) on Big Data Algorithms for Future Cities / Internet of Things.


Dr. Pratibha Moogi

ex-Samsung R&D

Dr. Pratibha Moogi holds a PhD from the OGI School of Engineering, OHSU, Portland, and a Master's from IIT Kanpur. She has worked at SRI International and in many R&D groups, including Texas Instruments, Nokia, and Samsung. Currently she serves as a Director in the Data Science Group (DSG) at [24]7.ai, a leading B2B customer operations and journey analytics company.


Applied Machine Learning Program

Preparatory Course: Foundations of Learning AI


1. Mathematical Foundations of Data Science:

A. Linear Algebra:

  1. Vectors, Matrices
  2. Tensors
  3. Matrix Operations
  4. Projections
  5. Eigenvalue decomposition of a matrix
  6. LU Decomposition
  7. QR Decomposition/Factorization
  8. Symmetric Matrices
  9. Orthogonalization & Orthonormalization
  10. Real and Complex Analysis (Sets and Sequences, Topology, Metric Spaces, Single-Valued and Continuous Functions, Limits, Cauchy Kernel, Fourier Transforms)
  11. Information Theory (Entropy, Information Gain)
  12. Function Spaces and Manifolds
  13. Relational Algebra and SQL
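
To make a few of the linear algebra topics above concrete, here is a minimal sketch (assuming NumPy; the matrix and vector are made up for demonstration) of eigenvalue decomposition, QR factorization, and projection:

```python
# A minimal sketch of eigenvalue decomposition, QR factorization, and projection.
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])          # symmetric matrix

# Eigenvalue decomposition: for symmetric A, A = V diag(w) V^T
w, V = np.linalg.eigh(A)
print("eigenvalues:", w)

# QR factorization: A = Q R with Q orthogonal, R upper triangular
Q, R = np.linalg.qr(A)
print("Q^T Q ~ I:", np.allclose(Q.T @ Q, np.eye(2)))

# Projection of b onto the column space of A: p = A (A^T A)^{-1} A^T b
b = np.array([1.0, 2.0])
p = A @ np.linalg.solve(A.T @ A, A.T @ b)
print("projection:", p)
```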

B. Multivariate Calculus

  1. Differential and Integral Calculus
  2. Partial Derivatives
  3. Vector-Valued Functions
  4. Directional Gradient
  5. Hessian
  6. Jacobian
  7. Laplacian and Lagrangian
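
As a small, hedged illustration of partial derivatives, gradients, and the Hessian, the sketch below approximates them with central finite differences in plain NumPy; the function f is an arbitrary example chosen only for the demo:

```python
# A minimal sketch: numerical gradient and Hessian via central finite differences.
import numpy as np

def f(x):                      # f(x, y) = x^2 * y + sin(y)
    return x[0] ** 2 * x[1] + np.sin(x[1])

def gradient(f, x, h=1e-5):
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x); e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)   # central difference per coordinate
    return g

def hessian(f, x, h=1e-4):
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        e = np.zeros_like(x); e[i] = h
        H[:, i] = (gradient(f, x + e, h) - gradient(f, x - e, h)) / (2 * h)
    return H

x0 = np.array([1.0, 2.0])
print(gradient(f, x0))   # analytic gradient is [2xy, x^2 + cos(y)] = [4, 1 + cos(2)]
print(hessian(f, x0))
```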

2: Probability for Data Scientists

  1. Probability Theory and Statistics
  2. Combinatorics
  3. Random Variables
  4. Probability Rules & Axioms
  5. Bayes' Theorem
  6. Variance and Expectation
  7. Conditional and Joint Distributions
  8. Standard Distributions (Bernoulli, Binomial, Multinomial, Uniform and Gaussian)
  9. Moment Generating Functions
  10. Maximum Likelihood Estimation (MLE)
  11. Prior and Posterior
  12. Maximum a Posteriori Estimation (MAP) and Sampling Methods
  13. Descriptive Statistics
  14. Hypothesis Testing
  15. Goodness of Fit
  16. Analysis of Variance
  17. Correlation
  18. Chi-squared test
  19. Design of Experiments
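
As a minimal, illustrative sketch of maximum likelihood, priors, and posteriors (items 10-12 above), the NumPy snippet below estimates a Bernoulli parameter by MLE and by a conjugate Beta prior update; all numbers are made up:

```python
# A minimal sketch: MLE and a conjugate Beta-Bernoulli posterior update.
import numpy as np

rng = np.random.default_rng(0)
data = rng.binomial(1, 0.7, size=100)        # simulated coin flips, true p = 0.7

# MLE for a Bernoulli parameter is simply the sample mean
p_mle = data.mean()

# Bayesian update: a Beta(a, b) prior is conjugate to the Bernoulli likelihood,
# so the posterior is Beta(a + #heads, b + #tails)
a, b = 2.0, 2.0
a_post = a + data.sum()
b_post = b + len(data) - data.sum()
p_map = (a_post - 1) / (a_post + b_post - 2)  # MAP estimate (posterior mode)

print(p_mle, p_map)
```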

3: Algorithms and Data Structures:

A. Graph Theory: Basic Concepts and Algorithms
B. Algorithmic Complexity

  1. Algorithm Analysis
  2. Greedy Algorithms
  3. Divide and Conquer and Dynamic Programming

C. Data Structures

  1. Arrays, Lists, Hashing, Binary Trees, Heaps, Stacks, etc.
  2. Dynamic Programming
  3. Randomized & Sublinear Algorithms
  4. Graphs
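
To illustrate the greedy and dynamic-programming items above, here is a minimal sketch on the coin-change problem; the denominations are chosen only to show where a greedy strategy fails:

```python
# A minimal sketch contrasting a greedy strategy with memoized dynamic programming.
from functools import lru_cache

COINS = (1, 3, 4)

def greedy(amount):
    """Largest-coin-first; not always optimal (fails for amount = 6)."""
    count = 0
    for c in sorted(COINS, reverse=True):
        count += amount // c
        amount %= c
    return count

@lru_cache(maxsize=None)
def dp(amount):
    """Minimum number of coins via memoized recursion (optimal)."""
    if amount == 0:
        return 0
    return 1 + min(dp(amount - c) for c in COINS if c <= amount)

print(greedy(6), dp(6))   # greedy gives 3 (4+1+1), DP gives 2 (3+3)
```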

4: R and Python

A. R PROGRAMMING LANGUAGE

  1. Vectors
  2. Matrices
  3. Lists
  4. Data frame
  5. Basic Syntax
  6. Basic Statistics
  7. Data Manipulation (dplyr)
  8. Visualization (ggplot2)
  9. Connecting to databases (RJDBC)

B. Python Programming Language

  1. Python language fundamentals
  2. Data Structures
  3. Beautiful Soup
  4. Regular Expressions
  5. JSON
  6. Restful Web Services (Flask)
  7. NumPy
  8. Plots in matplotlib, seaborn
  9. Pandas
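
Here is a minimal sketch of the NumPy/pandas/matplotlib workflow listed above, using a small made-up DataFrame:

```python
# A minimal sketch: build a DataFrame, aggregate it, and draw a basic plot.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

df = pd.DataFrame({
    "city": ["Bengaluru", "Pune", "Bengaluru", "Pune"],
    "sales": [120, 90, 150, 80],
})

summary = df.groupby("city")["sales"].agg(["mean", "sum"])   # aggregation
print(summary)

df["sales"].plot(kind="bar")          # basic visualization
plt.title("Sales by record")
plt.show()
```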

5: Numerical and Combinatorial Optimization

Conjugate Gradient Methods, Quasi-Newton Methods, Constrained Optimization, Linear Programming, Nonlinear Constrained Optimization, Quadratic Programming, Integer Programming, Knapsack Problem, Travelling Salesman, Vehicle Routing, Job-Shop Scheduling, Gradient/Stochastic Descent and Primal-Dual Methods.
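
As a hedged illustration of the gradient-descent item above, the sketch below minimizes a small quadratic with plain NumPy and compares the result against the closed-form solution; the matrix and learning rate are arbitrary:

```python
# A minimal sketch of gradient descent on a quadratic objective.
import numpy as np

# Minimize f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])
b = np.array([1.0, 2.0])

x = np.zeros(2)
lr = 0.1
for _ in range(500):
    grad = A @ x - b
    x -= lr * grad

print("gradient descent solution:", x)
print("closed-form solution     :", np.linalg.solve(A, b))
```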

Course I: Introduction to AI & Nature of Intelligence


In this course, Danko Nikolic, brain scientist and AI inventor, explains the fundamentals of intelligence needed by everyone interested in creating ambitious AI solutions. What do you do when things get tough? In the course, you will learn the differences between machine intelligence and human intelligence. You will understand why and when AI fails. AI does not have a narrowly limited working memory (a.k.a. short-term memory), but we humans do. How does our working memory make us more intelligent than machines? Why do we understand the world when machines don't? You will also learn fundamental theorems for machine learning and see how they apply to machine intelligence and human intelligence. Having learned that, you will be able to judge whether an ML project is too ambitious or is likely to succeed. You will be able to identify the fundamental problems that plagued some of the ambitious AI projects of the past. You will understand why it is nearly impossible for machines to reach human levels of intelligence. You will also learn why some of the tricks in machine learning work sometimes and not at other times, and why it is so difficult to build self-driving cars.

The course offers fundamentals that you cannot find in any other course or book. These fundamentals will be invaluable for your future work in ML and AI.

  1. This course explores the nature of intelligence, ranging from machines to the biological brain. Information provided in the course is useful when undergoing ambitious projects in machine learning and AI. It will help you avoid pitfalls in those projects.
  2. What are the differences between the real brain and machine intelligence, and how can you use this knowledge to prevent failures in your work? What are the limits of today's AI technology? How can you assess early in your AI project whether it has a chance of success?
  3. What are the most fundamental mathematical theorems in machine learning, and how are they relevant to your everyday work?

Course II: Exploratory Data Analysis and Feature Engineering


1. Data Exploration And Preprocessing

  • Basic Plotting of Data
  • Outlier Detection
  • Dimensionality Reduction: Principal Component Analysis, Multidimensional Scaling
  • Data Transformation
  • Dealing with Missing Values
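
As a hedged illustration of the outlier-detection and dimensionality-reduction items above, here is a minimal sketch (scikit-learn assumed; the data is synthetic) combining a z-score check with PCA:

```python
# A minimal sketch: z-score outlier flagging and PCA projection on synthetic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
X[0] += 10                      # inject one obvious outlier

X_std = StandardScaler().fit_transform(X)

# Flag rows whose maximum absolute z-score exceeds 3
outliers = np.where(np.abs(X_std).max(axis=1) > 3)[0]
print("outlier rows:", outliers)

# Project to 2 principal components
X_2d = PCA(n_components=2).fit_transform(X_std)
print("projected shape:", X_2d.shape)
```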

2. Feature Engineering

  • Feature extraction and feature engineering
  • Feature transformation
  • Feature selection
  • Grid search
  • Automatically create features
  • Aggregations and transformations
  • Introduction to Featuretools
  • Introduction to Entities & EntitySets, Table Relationships, Feature Primitives, and Deep Feature Synthesis
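
Here is a minimal scikit-learn sketch tying together feature transformation, feature selection, and grid search; the pipeline and parameter grid are illustrative rather than prescriptive:

```python
# A minimal sketch: feature creation, scaling, selection, and grid search in one pipeline.
from sklearn.datasets import load_breast_cancer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = load_breast_cancer(return_X_y=True)

pipe = Pipeline([
    ("poly", PolynomialFeatures(degree=2, include_bias=False)),  # create features
    ("scale", StandardScaler()),                                 # transform
    ("select", SelectKBest(f_classif)),                          # select
    ("clf", LogisticRegression(max_iter=5000)),
])

grid = GridSearchCV(pipe, {"select__k": [10, 30, 60], "clf__C": [0.1, 1.0]}, cv=3)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```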

Course III: Introduction to Machine Learning


1. Learning from Data:

  • The appeal of learning from examples, Motivational Case Studies, A formal definition of learning, Key Components of Learning, Population vs. Sample, Decision Boundary, Types of data, Typical Issues with Data, Types of Learning
  • Learning as search: Instance and Hypothesis Space, Introductions to Search Algorithms, Cost Functions
  • Version Spaces/Perceptron/Linear Regression /Nearest Neighbor
  • Overfitting/Regularization, Worst case performance: VC dimension, Bias Variance Tradeoff, Non Linear Embedding, Outlier Detection, Minimum description length
  • Estimating Accuracy: Train/test split, Cross Validation, Bootstrap Hypothesis testing, Confusion Matrix, Sensitivity and Specificity, Precision and Recall, ROC curves and AUC, MAPE, Kappa Statistic, AIC, BIC
  • Data Science Process
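
To ground the accuracy-estimation bullet above, here is a minimal sketch (scikit-learn assumed; the dataset is a stock example) of train/test split, cross-validation, a confusion matrix, and ROC-AUC:

```python
# A minimal sketch: train/test split, cross-validation, confusion matrix, ROC-AUC.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, roc_auc_score

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = LogisticRegression(max_iter=5000).fit(X_tr, y_tr)

print("5-fold CV accuracy:", cross_val_score(clf, X_tr, y_tr, cv=5).mean())
print("confusion matrix:\n", confusion_matrix(y_te, clf.predict(X_te)))
print("ROC-AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```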

Introduction to Bayesian Learning:

  • Probability review
  • Bayes rule
  • Conjugate priors
  • Bayesian Inference I (coin flipping)
  • Bayesian Inference II (hypothesis testing and summarizing distributions)
  • Bayesian Inference III (decision theory)
  • Bayesian linear regression
  • Bayes classifiers
  • Statistical Estimation: Maximum Likelihood Estimation, Bayes Error Rate, Curse of Dimensionality
  • Laplace approximation, Naïve Bayes, Introduction to Bayesian Belief Networks, Introduction to Sampling, Gibbs sampling, Logistic regression
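
As a hedged sketch of the Bayesian linear regression item above, the NumPy snippet below computes the closed-form posterior over weights under a Gaussian prior; the prior and noise precisions are made up:

```python
# A minimal sketch: conjugate Bayesian linear regression (posterior mean and covariance).
import numpy as np

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.uniform(-3, 3, 50)])   # bias + one feature
true_w = np.array([1.0, 2.0])
y = X @ true_w + rng.normal(scale=0.5, size=50)

alpha, beta = 1.0, 1.0 / 0.25     # prior precision, noise precision (1/sigma^2)

# Posterior over weights: N(m, S) with S^{-1} = alpha*I + beta*X^T X and m = beta*S X^T y
S_inv = alpha * np.eye(2) + beta * X.T @ X
S = np.linalg.inv(S_inv)
m = beta * S @ X.T @ y

print("posterior mean weights:", m)       # should be close to [1, 2]
print("posterior covariance:\n", S)
```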

Course IV: Supervised Machine Learning


  1. Classification Algorithms: Decision Trees, Rule Induction, SVM
  2. Dealing with Skewed Class Distribution and Cost-based Classification: Resampling
  3. Regression: Lasso and Ridge Regression, MARS, OLS, PLS, GLM
  4. Survival Analysis: Cox’s Regression, Weibull Distribution, Parametric Survival Models
  5. Ensemble Models: Bagging, Boosting, Stacking, Random Forests, XGBoost, GBDT
  6. Neural Networks and Introduction to Deep Learning: Multi-Layer Perceptron, Backpropagation, Convolutional Neural Networks, Encoder-Decoders, Recurrent Neural Networks, Long Short-Term Memory (LSTM)
  7. Time Series Forecasting: Time Series Decomposition, Holt-Winters, ARIMA, Intermittent Models, Time-frequency domain, Fourier transforms, wavelet transforms, Dynamic Regression Models, Neural Networks, Demand Forecasting
  1. Case Study: Network Intrusion detection
  2. Case Study: Weather Forecasting
  3. Case Study: Image Classification
  4. Case Study: Predictive Text Generation
  5. Case Study: Customer Lifetime Modelling
  6. Case Study: Churn Prediction
  7. Case Study: Speech synthesis
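
As a hedged illustration of the ensemble-models item in the course outline above, here is a minimal scikit-learn sketch comparing bagging (random forest) with boosting (gradient-boosted trees) on a stock dataset:

```python
# A minimal sketch: bagging vs. boosting with cross-validated accuracy.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

rf = RandomForestClassifier(n_estimators=200, random_state=0)
gb = GradientBoostingClassifier(random_state=0)

print("random forest CV accuracy    :", cross_val_score(rf, X, y, cv=5).mean())
print("gradient boosting CV accuracy:", cross_val_score(gb, X, y, cv=5).mean())
```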

Course V: Kaggle Competitions (What Happens at Kaggle Is Seen by the Whole World)


In this course, you will learn to analyse and solve predictive modelling tasks competitively. When you finish this class, you will understand how to solve predictive modelling competitions efficiently and know which of the skills you acquire are applicable to real-world tasks. You will learn to:

  1. Preprocess the data and generate new features from various sources, such as text and images.
  2. Apply advanced feature engineering techniques: generating mean encodings, aggregated statistical measures, and nearest neighbours as a means to improve your predictions (see the encoding sketch after this list).
  3. Use cross-validation methodologies to benchmark your solutions.
  4. Analyse and interpret the data: inconsistencies, high noise levels, errors, and other data-related issues such as leakages.
  5. Efficiently tune the hyperparameters of algorithms and achieve top performance.
  6. Master the art of combining different machine learning models and learn how to ensemble.
  7. Get exposed to past (winning) solutions and code, and learn how to read them.
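
Here is the out-of-fold mean-encoding sketch referenced in item 2 above (pandas and scikit-learn assumed; the column names and data are made up):

```python
# A minimal sketch: out-of-fold mean/target encoding to avoid label leakage.
import pandas as pd
from sklearn.model_selection import KFold

df = pd.DataFrame({
    "city":   ["A", "A", "B", "B", "B", "C", "C", "A"],
    "target": [1,   0,   1,   1,   0,   0,   1,   1],
})

df["city_mean_enc"] = 0.0
global_mean = df["target"].mean()

# Encode each row using only target values from the *other* folds,
# so a row's own label never leaks into its feature.
for tr_idx, val_idx in KFold(n_splits=4, shuffle=True, random_state=0).split(df):
    fold_means = df.iloc[tr_idx].groupby("city")["target"].mean()
    df.loc[df.index[val_idx], "city_mean_enc"] = (
        df.iloc[val_idx]["city"].map(fold_means).fillna(global_mean).values
    )

print(df)
```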

Course VI: Unsupervised Machine Learning


  1. Clustering Methods: Expectation Maximization, K-means, k-medoids, Agglomerative and Divisive Hierarchical clustering, Birch, DBScan, Spectral Clustering, Self-organizing maps
  2. Community detection in graphs
  3. Association Rules and Sequence Pattern Discovery
  4. Deviation Detection
  5. Semi-supervised and Active Learning
  6. Sequential Data Models: Markov Models and Hidden Markov Models, Kalman Filters
  7. Model Selection: Model Comparisons, Analysis Considerations
  1. Case Study: Viral Marketing
  2. Case Study: Driving for Fuel Efficiency
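
As a hedged illustration of the clustering methods listed above, here is a minimal scikit-learn sketch contrasting K-means with density-based DBSCAN on a toy two-moons dataset:

```python
# A minimal sketch: K-means vs. DBSCAN on non-convex clusters.
from sklearn.datasets import make_moons
from sklearn.cluster import KMeans, DBSCAN

X, _ = make_moons(n_samples=300, noise=0.05, random_state=0)

kmeans_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
dbscan_labels = DBSCAN(eps=0.2, min_samples=5).fit_predict(X)

# K-means assumes roughly convex clusters and splits the moons incorrectly,
# while density-based DBSCAN can recover the two crescents.
print("k-means clusters:", set(kmeans_labels))
print("DBSCAN clusters :", set(dbscan_labels))
```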

Course VII: Bayesian Machine Learning


  1. Expectation Maximization algorithm
  2. Probit regression
  3. Expectation Maximization to variational inference
  4. Variational inference
  5. Finding optimal distributions
  6. Exponential families
  7. Conjugate exponential family models
  8. Scalable inference
  9. Bayesian nonparametric clustering
  10. Markov Models, Hidden Markov models
  11. Conditional Random Fields
  12. Monte Carlo, Sampling, Rejection Sampling
  13. Poisson matrix factorization
  14. Decision Networks
  15. Bayesian Optimization
  16. Bayesian State-Space Models and Kalman Filtering
  17. Probabilistic Numerics and Bayesian Quadrature
  1. Case Study: Named Entity Extraction
  2. Case Study: Car navigation
  3. Case Study: Diagnosis

Advanced Machine Learning Program

Preparatory Course I: Foundations of Scalable Data Science


1. Big Data Stack

  • The Map reduce paradigm
  • Hadoop: HDFS, Hive (SQL), Pig, Sqoop, Flume, Avro
  • NoSQL: Big Table, HBase, Document stores, Graph stores, Key-Value stores
  • Spark and Introduction to PySpark

2. Functional Programming in Scala
Functions, Data and Abstractions, Collections, Pattern Matching and Functions, Lazy Evaluation, Functions and State, Observer Pattern, Reactive Programming, Parallel Programming, Data Parallelism, Data Structures for Parallel Computing, RDDs, Pair RDDs, Reduction Operations, Partitioning and Shuffling, DataFrames, Spark SQL
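
To ground the MapReduce and pair-RDD ideas above, here is a minimal PySpark sketch (assuming PySpark is installed locally) of the classic word count via flatMap / map / reduceByKey:

```python
# A minimal sketch: word count with pair RDDs in PySpark.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount-sketch").getOrCreate()
sc = spark.sparkContext

lines = sc.parallelize([
    "scalable machine learning",
    "machine learning on data streams",
])

counts = (lines.flatMap(lambda line: line.split())       # map phase: emit words
               .map(lambda word: (word, 1))              # pair RDD of (word, 1)
               .reduceByKey(lambda a, b: a + b))         # reduce phase: sum counts

print(counts.collect())
spark.stop()
```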

3. Design Thinking and Storytelling
Preparing your mind for innovation, Idea Generation and Experimentation

Preparatory Course II: Statistics of NLP


  1. Statistical Language Modeling
  2. Computational Linguistics
  3. Statistical Decision Making and the Source-Channel Paradigm
  4. Sparseness; Smoothing
  5. Measuring Success: Information Theory, Entropy and Perplexity; Maximum Entropy Models, Whole-Sentence Models, Semantic Modeling
  6. EM for sound separation
  7. Probabilistic Context Free Grammars (PCFG), the Inside-Outside Algorithm
  8. Syntactic Language Models
  9. Decision Tree Language Models
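
As a hedged illustration of the entropy and perplexity items above, here is a minimal pure-Python sketch that scores a toy test sentence with an add-one-smoothed unigram language model:

```python
# A minimal sketch: cross-entropy and perplexity of a smoothed unigram language model.
import math
from collections import Counter

train = "the cat sat on the mat".split()
test = "the dog sat on the mat".split()

counts = Counter(train)
vocab = set(train) | set(test)
V, N = len(vocab), len(train)

def prob(w):
    # add-one (Laplace) smoothing handles words unseen in training (e.g. "dog")
    return (counts[w] + 1) / (N + V)

log_likelihood = sum(math.log2(prob(w)) for w in test)
cross_entropy = -log_likelihood / len(test)
perplexity = 2 ** cross_entropy
print(f"cross-entropy = {cross_entropy:.2f} bits, perplexity = {perplexity:.2f}")
```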

Course I: Scalable Machine Learning


  1. Multitask Learning
  2. Asynchronous Gradient Descent: Hogwild!, Momentum, Gibbs Sampling, Cyclades
  3. Hyperparameter Optimization
  4. Low-Precision Training
  5. Matrix Completion and Approximation
  6. Tensor Factorization
  7. On-device Inference
  8. Performance Optimizers
  9. Local Learning
  10. Feature Selection
  11. Reinforcement Learning Optimization
  12. Distributed Reinforcement Learning
  13. Scalable Variational Inference
  14. Interpretable ML
  15. ADMM and its connections to belief propagation
  16. The nonconvex landscape of neural networks
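
As a hedged sketch of the matrix completion and approximation item above, the NumPy snippet below fits a low-rank factorization by stochastic gradient descent on the observed entries only; sizes, rank, and learning rates are made up:

```python
# A minimal sketch: low-rank matrix completion via SGD on observed entries.
import numpy as np

rng = np.random.default_rng(0)
U_true = rng.normal(size=(30, 3))
V_true = rng.normal(size=(20, 3))
M = U_true @ V_true.T                        # true low-rank matrix
mask = rng.random(M.shape) < 0.4             # only ~40% of entries observed

U = 0.1 * rng.normal(size=(30, 3))
V = 0.1 * rng.normal(size=(20, 3))
lr, reg = 0.02, 0.01
obs = list(zip(*np.nonzero(mask)))

for epoch in range(200):
    for i, j in obs:
        err = M[i, j] - U[i] @ V[j]
        U[i] += lr * (err * V[j] - reg * U[i])   # SGD step on one observed entry
        V[j] += lr * (err * U[i] - reg * V[j])

rmse_missing = np.sqrt(np.mean((M[~mask] - (U @ V.T)[~mask]) ** 2))
print("RMSE on unobserved entries:", round(rmse_missing, 3))
```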

Course II: Data Streams


  1. Data Streams Computational Model
  2. Change Detection
  3. Classification
  4. Regression
  5. Clustering Data Streams
  6. Frequent Pattern Mining
  1. Case Study: IoT Data Processing
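
To make the change-detection item above concrete, here is a minimal pure-Python sketch of a CUSUM-style detector running in one pass over a simulated stream; the drift and threshold values are illustrative only:

```python
# A minimal sketch: one-pass CUSUM-style change detection on a data stream.
import random

random.seed(0)
stream = [random.gauss(0.0, 1.0) for _ in range(200)] + \
         [random.gauss(2.0, 1.0) for _ in range(200)]      # mean shifts at t = 200

mean, n = 0.0, 0
g_pos, drift, threshold = 0.0, 0.5, 8.0

for t, x in enumerate(stream):
    n += 1
    mean += (x - mean) / n                  # running mean (O(1) memory)
    g_pos = max(0.0, g_pos + (x - mean) - drift)
    if g_pos > threshold:
        print(f"change detected at t={t}")
        break
```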

Course III: Natural Language Processing


1. Deep Neural Networks for Natural Language Processing (8 Hours)

A. Introduction

  • Introduction to Neural Networks
  • Example Tasks and Their Difficulties
  • What Neural Nets Can Do To Help

B. Predicting the Next Word in a Sentence

  • Computational Graphs
  • Feed-forward Neural Network Language Models
  • Measuring Model Performance: Likelihood and Perplexity

C. Distributional Semantics and Word Vectors

  • Describing a word by the company that it keeps
  • Counting and predicting
  • Skip-grams and CBOW
  • Evaluating/Visualizing Word Vectors
  • Advanced Methods for Word Vectors
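
As a hedged sketch of the count-based side of distributional semantics above ("a word is known by the company it keeps"), the NumPy snippet below builds a co-occurrence matrix from a toy corpus and factors it with SVD to get dense word vectors:

```python
# A minimal sketch: co-occurrence counts + SVD as count-based word vectors.
import numpy as np

corpus = ["the cat sat on the mat", "the dog sat on the rug"]
tokens = [s.split() for s in corpus]
vocab = sorted({w for sent in tokens for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

window = 2
C = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                C[idx[w], idx[sent[j]]] += 1      # count each context word in the window

# Truncated SVD gives dense low-dimensional word vectors
U, S, Vt = np.linalg.svd(C)
word_vectors = U[:, :2] * S[:2]
print({w: np.round(word_vectors[idx[w]], 2) for w in ["cat", "dog"]})
```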

D. Why is word2vec So Fast?: Speed Tricks for Neural Nets

  • Softmax Approximations: Negative Sampling, Hierarchical Softmax
  • Parallel Training
  • Tips for Training on GPUs

E. Convolutional Networks for Text

  • Bag of Words, Bag of n-grams, and Convolution
  • Applications of Convolution: Context Windows and Sentence Modeling
  • Stacked and Dilated Convolutions
  • Structured Convolution
  • Convolutional Models of Sentence Pairs
  • Visualization for CNNs

F. Recurrent Networks for Sentence or Language Modeling

  • Recurrent Networks
  • Vanishing Gradient and LSTMs
  • Strengths and Weaknesses of Recurrence in Sentence Modeling
  • Pre-training for RNNs

G. Using/Evaluating Sentence Representations

  • Sentence Similarity
  • Textual Entailment
  • Paraphrase Identification
  • Retrieval

H. Conditioned Generation

  • Encoder-Decoder Models
  • Conditional Generation and Search
  • Ensembling
  • Evaluation
  • Types of Data to Condition On

I. Attention

  • Attention
  • What do We Attend To?
  • Improvements to Attention
  • Specialized Attention Varieties
  1. A Case Study: "Attention is All You Need"
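
To ground the attention topics and the "Attention is All You Need" case study above, here is a minimal NumPy sketch of scaled dot-product attention; the query/key/value shapes are arbitrary:

```python
# A minimal sketch: scaled dot-product attention.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)       # similarity of each query to each key
    weights = softmax(scores, axis=-1)    # attention distribution over positions
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))   # 3 query positions, dimension 4
K = rng.normal(size=(5, 4))   # 5 key/value positions
V = rng.normal(size=(5, 4))

out, w = attention(Q, K, V)
print(out.shape, w.sum(axis=-1))          # (3, 4) and rows summing to 1
```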

2. Foundations of Natural Language Processing

  • Word embedding
  • Named entity recognition
  • Parts-of-Speech tagging
  • Language modeling
  • Segmentation
  • Paraphrasing
  • Machine translation
  • Information Extraction
  • Text Summarization
  • Conditional Random Fields
  • Dimensionality Reduction: Matrix Factorization, Topic Models
  1. Case Study: Q & A Systems
  2. Case Study: Spam Detector

3. Text Classification

  • Tokenization
  • Lemmatization
  • Vectorization
  • Bag of Words representation
  • Language Models
  • TF-IDF
  • Singular Value Decomposition
  • Topic Models
  • Discourse Modelling
  • Coreference Resolution
  • Question Answering Systems
  • Visualizing complex and high dimensional data
  • Sentiment Analysis
  1. Case Study: Web Personalization
  2. Case Study: Text Classification
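
As a hedged illustration of the vectorization and TF-IDF items above (and the spam-detector flavour of the case studies), here is a minimal scikit-learn text-classification pipeline on made-up toy data:

```python
# A minimal sketch: TF-IDF vectorization plus a linear classifier for text.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["win a free prize now", "meeting moved to monday",
         "free lottery winner", "project review on friday"]
labels = [1, 0, 1, 0]                      # 1 = spam, 0 = ham (toy data)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["free prize meeting"]))
```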

Course IV: Recommender Systems


  1. Content Based Filtering
  2. User and Item based Collaborative Filtering
  3. New Item Problem
  4. ALS
  5. Conversational Recommenders
  6. Diversity in Recommendation
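
To make the item-based collaborative filtering topic above concrete, here is a minimal NumPy/scikit-learn sketch using cosine similarity; the ratings matrix is tiny and made up:

```python
# A minimal sketch: item-based collaborative filtering with cosine similarity.
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

# rows = users, columns = items, 0 = not rated
R = np.array([[5, 4, 0, 1],
              [4, 5, 1, 0],
              [1, 0, 5, 4],
              [0, 1, 4, 5]], dtype=float)

item_sim = cosine_similarity(R.T)                 # item-item similarity
np.fill_diagonal(item_sim, 0)

# Predicted score = similarity-weighted average of the user's known ratings
scores = R @ item_sim / (np.abs(item_sim).sum(axis=0) + 1e-9)
scores[R > 0] = -np.inf                           # do not re-recommend rated items

print("recommended item per user:", scores.argmax(axis=1))
```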

Course V: Analysis of Large Data Sets


  1. Frequent Itemset Mining
  2. Locality-Sensitive Hashing
  3. Dimensionality Reduction
  4. Algorithms on Large Graphs
  5. Large-Scale Machine Learning
  6. Computational Advertising
  7. Learning through Experimentation
  8. Optimizing Submodular Functions
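
As a hedged sketch of the locality-sensitive hashing item above, here is a minimal pure-Python MinHash estimator of Jaccard similarity between two small documents; the hash family is illustrative:

```python
# A minimal sketch: MinHash signatures for estimating Jaccard similarity.
import random

random.seed(42)
NUM_HASHES = 100
PRIME = 2_147_483_647
hash_params = [(random.randrange(1, PRIME), random.randrange(0, PRIME))
               for _ in range(NUM_HASHES)]

def minhash_signature(items):
    ids = [hash(x) % PRIME for x in items]
    return [min((a * i + b) % PRIME for i in ids) for a, b in hash_params]

def estimated_jaccard(sig1, sig2):
    return sum(a == b for a, b in zip(sig1, sig2)) / len(sig1)

doc1 = set("the quick brown fox jumps over the lazy dog".split())
doc2 = set("the quick brown fox leaps over a lazy dog".split())

sig1, sig2 = minhash_signature(doc1), minhash_signature(doc2)
true_j = len(doc1 & doc2) / len(doc1 | doc2)
print(f"true Jaccard = {true_j:.2f}, MinHash estimate = {estimated_jaccard(sig1, sig2):.2f}")
```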

Course VI: Social Network Analysis


  1. Introduction and Structure of Graphs
  2. Web as a Graph and the Random Graph Model
  3. The Small World Phenomena
  4. Decentralized search in small-world and P2P networks
  5. Applications of Social Network Analysis
  6. Networks with Signed Edges
  7. Cascading Behavior: Decision Based Models of Cascades
  8. Cascading Behavior: Probabilistic Models of Information Flow
  9. Influence Maximization
  10. Outbreak Detection
  11. Power-laws and Preferential attachment
  12. Models of evolving networks
  13. Kronecker graphs
  14. Link Analysis: HITS and PageRank
  15. Strength of weak ties and Community structure in networks
  16. Network community detection: Spectral Clustering
  17. Biological networks, Overlapping communities in networks
  18. Representation Learning on Graphs
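
To ground the link-analysis item above, here is a minimal NumPy sketch of PageRank computed by power iteration on a tiny made-up directed graph:

```python
# A minimal sketch: PageRank by power iteration.
import numpy as np

# adjacency: A[i, j] = 1 if page i links to page j
A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

out_degree = A.sum(axis=1, keepdims=True)
M = A / out_degree                      # row-stochastic transition matrix

n, d = A.shape[0], 0.85                 # number of pages, damping factor
rank = np.full(n, 1.0 / n)
for _ in range(100):
    rank = (1 - d) / n + d * (M.T @ rank)

print("PageRank scores:", np.round(rank, 3))
```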

Our Address :

2/3, 2nd Floor, 80 Feet Road, Barleyz Junction, Sony World Crossing, Above KFC, Koramangala, Venkappa Garden, Ejipura, Bengaluru, Karnataka 560034

Phone Number :

+91 9582510786

Email Address :

info@innosential.com
