Repository: diefimov/MTH594_MachineLearning

URL: https://github.com/diefimov/MTH594_MachineLearning

Language: Jupyter Notebook (100.0%)

MTH 594 Advanced data mining: theory and applications

The materials for the course MTH 594 Advanced data mining: theory and applications, taught by Dmitry Efimov at the American University of Sharjah, UAE, in the Spring 2016 semester. The course program can be downloaded from the syllabus folder.

To compose these lectures, I mainly used ideas from three sources:

  1. Stanford lectures by Andrew Ng on YouTube: https://www.youtube.com/watch?v=UzxYlbK2c7E&list=PLA89DCFA6ADACE599
  2. The book "The Elements of Statistical Learning" by T. Hastie, R. Tibshirani, and J. Friedman: http://statweb.stanford.edu/~tibs/ElemStatLearn
  3. Lectures by Andrew Ng on Coursera: https://www.coursera.org/learn/machine-learning

All uploaded PDF lectures are adapted to help students understand the material.

The supplementary files in the ipython folder teach students how to use built-in methods to train the models in Python 2.7.
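As a flavor of what those notebooks cover, here is a minimal sketch (mine, not code from the repository) of training a model with a built-in scikit-learn method; it assumes numpy and scikit-learn are installed, and the toy data is invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Invented toy data: y = 2x + 1 plus a little noise
rng = np.random.RandomState(0)
X = rng.rand(50, 1)
y = 2 * X.ravel() + 1 + 0.05 * rng.randn(50)

model = LinearRegression()
model.fit(X, y)                       # the built-in training routine
print(model.coef_, model.intercept_)  # close to [2.] and 1.0
```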

If you find any mistakes or typos, please email me at [email protected]; this course is new for me, and there are probably a few :)

The content of the lectures:

Supervised learning

- Linear and logistic regressions, perceptrons
  - Linear regression
  - Analytical minimization: normal equations
  - Statistical interpretation
  - Logistic regression
  - Perceptron
  - Bayesian interpretation and regularization
  - Python implementation (see the sketch below)
    - Linear regression
    - Logistic regression
    - Perceptron
    - Regularization
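For instance, the analytical minimization above reduces to solving the normal equations, theta = (X^T X)^{-1} X^T y. A minimal numpy sketch of that single step, with invented data (an illustration, not the repository's notebook code):

```python
import numpy as np

# Invented data: y ≈ 1 + 3x with noise; first column of X is the intercept
rng = np.random.RandomState(0)
x = rng.rand(100)
X = np.column_stack([np.ones_like(x), x])
y = 1 + 3 * x + 0.1 * rng.randn(100)

# Normal equations: solve (X^T X) theta = X^T y
theta = np.linalg.solve(X.T.dot(X), X.T.dot(y))
print(theta)  # approximately [1, 3]
```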

- Methods of optimization
  - Gradient descent
  - Examples of gradient descent
  - Newton's method
  - Python implementation (see the sketch below)
    - Batch gradient descent
    - Stochastic gradient descent
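To contrast the two variants: batch gradient descent uses the gradient over the whole sample at each step, while stochastic gradient descent updates on one random example at a time. A rough sketch with invented data and learning rates chosen only for this example:

```python
import numpy as np

rng = np.random.RandomState(0)
X = np.column_stack([np.ones(100), rng.rand(100)])
y = X.dot(np.array([1.0, 3.0])) + 0.1 * rng.randn(100)

# Batch gradient descent: each step averages the gradient over all examples
theta = np.zeros(2)
for _ in range(1000):
    theta -= 0.1 * X.T.dot(X.dot(theta) - y) / len(y)
print(theta)  # approximately [1, 3]

# Stochastic gradient descent: each step uses a single random example
theta_sgd = np.zeros(2)
for _ in range(10000):
    i = rng.randint(len(y))
    theta_sgd -= 0.01 * (X[i].dot(theta_sgd) - y[i]) * X[i]
print(theta_sgd)  # also approximately [1, 3]
```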

- Generalized linear models (GLM)
  - Exponential family
  - Generalized Linear Models (GLM)
  - Python implementation (see the sketch below)
    - Softmax regression
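To make the softmax item concrete, here is a small sketch of softmax regression trained by gradient ascent on the log-likelihood; the three-class toy data and the learning rate are invented for illustration:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Invented 3-class problem: the class is the index of the largest feature
rng = np.random.RandomState(0)
X = rng.randn(300, 3)
y = X.argmax(axis=1)
Y = np.eye(3)[y]                     # one-hot targets

W = np.zeros((3, 3))
for _ in range(500):                 # gradient ascent on the log-likelihood
    P = softmax(X.dot(W))
    W += 0.1 * X.T.dot(Y - P) / len(y)

print((softmax(X.dot(W)).argmax(axis=1) == y).mean())  # training accuracy
```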

- Generative learning algorithms
  - General idea of generative algorithms
  - Gaussians
  - Gaussian discriminant analysis
  - Generative vs. discriminative comparison
  - Naive Bayes
  - Laplace smoothing
  - Event models
  - Python implementation (see the sketch below)
    - Gaussians
    - Gaussian discriminant analysis
    - Naive Bayes
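As one example from this section, here is a hand-rolled multinomial Naive Bayes with Laplace (add-one) smoothing; the word counts and labels are invented toy data, not anything from the course materials:

```python
import numpy as np

# Invented bag-of-words data: rows are word-count vectors over 3 words
X = np.array([[2, 1, 0], [1, 2, 0], [0, 0, 3], [0, 1, 2]])
y = np.array([1, 1, 0, 0])           # 1 = spam, 0 = ham

# Class priors and Laplace-smoothed word probabilities per class
priors = np.array([(y == c).mean() for c in (0, 1)])
counts = np.array([X[y == c].sum(axis=0) for c in (0, 1)])
phi = (counts + 1.0) / (counts.sum(axis=1, keepdims=True) + X.shape[1])

def predict(x):
    # Log-posterior up to a constant: log prior + sum of count * log phi
    log_post = np.log(priors) + (np.log(phi) * x).sum(axis=1)
    return log_post.argmax()

print(predict(np.array([1, 1, 0])))  # -> 1 (spam-like counts)
```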

- Neural networks
  - Definition
  - Backpropagation
  - Python implementation (see the sketch below)
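A compact illustration of backpropagation, assuming only numpy: a two-layer sigmoid network trained on XOR. The architecture, learning rate, and iteration count are my choices for this sketch, not the course's:

```python
import numpy as np

rng = np.random.RandomState(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def add_bias(a):
    return np.hstack([a, np.ones((a.shape[0], 1))])

W1 = rng.randn(3, 4)   # 2 inputs + bias -> 4 hidden units
W2 = rng.randn(5, 1)   # 4 hidden + bias -> 1 output

for _ in range(10000):
    h = sigmoid(add_bias(X).dot(W1))             # forward pass
    out = sigmoid(add_bias(h).dot(W2))
    d_out = (out - y) * out * (1 - out)          # backprop: output layer
    d_h = d_out.dot(W2[:-1].T) * h * (1 - h)     # backprop: hidden layer
    W2 -= add_bias(h).T.dot(d_out)               # gradient steps (lr = 1)
    W1 -= add_bias(X).T.dot(d_h)

print(out.round().ravel())  # typically prints [0. 1. 1. 0.]
```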

- Support vector machines
  - Support vector machines: intuition
  - Primal/dual optimization problem and KKT
  - SVM dual
  - Kernels
  - Kernel examples
  - Kernel testing
  - SVM with kernels
  - Soft margin
  - SMO algorithm
  - Python implementation (see the sketch below)
    - Coordinate ascent
    - SVM
    - SMO algorithm
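For a quick taste of SVMs with kernels, the sketch below fits scikit-learn's SVC (backed by libsvm, which solves the dual with an SMO-type algorithm) on invented data that is not linearly separable; the dataset and parameters are illustrative only:

```python
import numpy as np
from sklearn.svm import SVC

# Invented two-class data: points inside vs. outside the unit circle.
# No linear separator exists, but an RBF kernel handles it easily.
rng = np.random.RandomState(0)
X = rng.randn(200, 2)
y = ((X ** 2).sum(axis=1) > 1.0).astype(int)

clf = SVC(kernel='rbf', C=1.0, gamma='scale')  # soft margin via C
clf.fit(X, y)
print(clf.score(X, y))   # training accuracy, close to 1.0
print(len(clf.support_)) # number of support vectors
```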

- Nonparametric methods
  - Locally weighted regression
  - Generalized additive models (GAM)
  - GAM for regression
  - GAM for classification
  - Tree-based methods
  - Regression trees
  - Classification trees
  - Boosting
    - Exponential loss
    - AdaBoost
    - Gradient boosting
    - Gradient tree boosting
  - Python implementation (see the sketch below)
    - Locally weighted regression
    - GAM for regression
    - GAM for classification
    - Regression decision trees
    - Classification decision trees
    - Gradient tree boosting
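As an example from this section, gradient tree boosting fits an ensemble of shallow regression trees, each one trained on the gradient of the loss of the ensemble so far. A minimal scikit-learn sketch on a standard synthetic benchmark (my illustration, not the repository's notebook):

```python
from sklearn.datasets import make_friedman1
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Synthetic regression benchmark data
X, y = make_friedman1(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Each of the 200 depth-3 trees corrects the residual errors of the
# current ensemble, scaled by the learning rate
gbm = GradientBoostingRegressor(n_estimators=200, max_depth=3,
                                learning_rate=0.1, random_state=0)
gbm.fit(X_tr, y_tr)
print(gbm.score(X_te, y_te))  # R^2 on held-out data
```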

- Learning theory
  - Bias / variance
  - Empirical risk minimization (ERM)
  - Union bound / Hoeffding inequality
  - Uniform convergence
  - VC dimension
  - Model selection
  - Feature selection
  - Python implementation (a cross-validation sketch follows this list)
    - Cross validation
  - Online learning
  - Advice on applying ML algorithms
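Cross validation estimates out-of-sample performance by rotating which fold is held out: fit on k-1 folds, score on the remaining one, and average. A minimal scikit-learn sketch (the model and dataset are my choices for illustration):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# 5-fold cross validation: train on 4/5 of the data, score on the
# held-out fifth, rotate through all five folds, then average
X, y = load_iris(return_X_y=True)
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print(scores.mean(), scores.std())
```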

Unsupervised learning

- Clustering
  - K-means
  - Python implementation (see the sketch below)
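K-means alternates two steps: assign each point to its nearest center, then move each center to the mean of its assigned points. A from-scratch numpy sketch on invented, well-separated data (not the repository's code):

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    rng = np.random.RandomState(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        # Assignment step: nearest center for each point
        labels = np.linalg.norm(X[:, None] - centers, axis=2).argmin(axis=1)
        # Update step: each center moves to the mean of its points
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

# Invented data: two well-separated blobs
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(50, 2), rng.randn(50, 2) + 5])
centers, labels = kmeans(X, 2)
print(centers.round(1))  # one center near (0, 0), one near (5, 5)
```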

- Mixture of Gaussians and EM algorithm
  - Mixture of Gaussians
  - Jensen's inequality
  - General EM algorithm
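Fitting a mixture of Gaussians is the classic application of EM: the E-step computes soft cluster assignments, the M-step re-estimates means, covariances, and mixing weights. A tiny sketch using scikit-learn's GaussianMixture, which fits by EM; the data is invented for illustration:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Invented data: two Gaussian blobs, offset by (4, 4)
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(200, 2), rng.randn(200, 2) + 4])

gmm = GaussianMixture(n_components=2, random_state=0)
gmm.fit(X)                    # runs EM until convergence
print(gmm.means_.round(1))    # one mean near (0, 0), one near (4, 4)
print(gmm.weights_.round(2))  # roughly [0.5, 0.5]
```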

