https://www.class-central.com/course/edx-machine-learning-7231
This is Columbia University's Machine Learning course on edX.
Week 1: maximum likelihood estimation, linear regression, least squares (a least-squares sketch follows this list)
Week 2: ridge regression, bias-variance, Bayes rule, maximum a posteriori inference
Week 3: Bayesian linear regression, sparsity, subset selection for linear regression
Week 4: nearest neighbor classification, Bayes classifiers, linear classifiers, perceptron
Week 5: logistic regression, Laplace approximation, kernel methods, Gaussian processes
Week 6: maximum margin, support vector machines, trees, random forests, boosting
Week 7: clustering, k-means, EM algorithm, missing data (a k-means sketch follows this list)
Week 8: mixtures of Gaussians, matrix factorization
Week 9: non-negative matrix factorization, latent factor models, PCA and variations
Week 10: Markov models, hidden Markov models
Week 11: continuous state-space models, association analysis
Week 12: model selection, next steps
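The Week 1 topics fit together in a few lines of code: under a Gaussian noise model, the maximum likelihood estimate of the regression weights coincides with the ordinary least-squares solution. A minimal sketch (not the course's own code; the toy data, dimensions, and noise level are assumptions for illustration):

import numpy as np

# Toy data: y = X @ true_w + Gaussian noise (all values assumed for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)

# Maximum likelihood under Gaussian noise = ordinary least squares:
# w_hat solves the normal equations (X^T X) w = X^T y.
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w_hat)  # should land close to true_w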
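The Week 7 clustering material is just as easy to prototype: k-means alternates an assignment step (each point goes to its nearest center) and an update step (each center moves to the mean of its assigned points). A rough sketch under the same caveat (illustrative only, toy data assumed, not the course's implementation):

import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    # Pick k distinct data points as the initial centers.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: label each point with its nearest center.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each center to the mean of its assigned points
        # (keep the old center if a cluster ends up empty).
        new_centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels

# Two well-separated toy clusters; the recovered centers should sit near (0, 0) and (5, 5).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.5, size=(50, 2)), rng.normal(5.0, 0.5, size=(50, 2))])
centers, labels = kmeans(X, k=2)
print(centers)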