Worth Bookmarking: A Roundup of 200 Machine Learning Tutorials!
Source: 云栖社区 (Yunqi Community)
This article is packed with resources; you may want to bookmark it for later reading.
It is a hand-picked selection of the best tutorials on machine learning, NLP, Python, and mathematics.
[ Editor's note ] No hype: this is one of the most complete collections of machine learning study material around. It gathers what are widely regarded as the best tutorials on each topic. It is by no means an exhaustive list of every ML tutorial on the web; it is a careful selection, because not everything online is worth your time. The goal of this collection is to complement my forthcoming book by bringing together the best tutorials I have found in machine learning and NLP.
Having these tutorials in one place lets me quickly find what I need, instead of wading through broad book chapters or dense research papers, which, as you may know, can be hard to get through without a solid math background. Why not just buy a book? No single author is an expert in everything, and when you are trying to learn a specific topic or want a different perspective, a good tutorial can be extremely helpful.
I have divided this article into four sections:
- Machine Learning
- NLP
- Python
- Math
Each section covers a number of topics, but machine learning is such a broad field that I cannot possibly cover every one of them.
If you know of good tutorials I have missed, please let me know, and I will keep refining this list. When picking links, I tried to ensure that each one offers material the others do not, presents the information in a different way (for example, code versus slides), or approaches the topic from a different angle.
Machine Learning
- Start Here with Machine Learning
- https://machinelearningmastery.com/start-here/
- Machine Learning is Fun!
- https://medium.com/@ageitgey/machine-learning-is-fun-80ea3ec3c471
- Rules of Machine Learning: Best Practices for ML Engineering
- http://martin.zinkevich.org/rules_of_ml/rules_of_ml.pdf
- Machine Learning Crash Course: Part 1, Part 2, Part 3 (Machine Learning at Berkeley)
- https://ml.berkeley.edu/blog/2016/11/06/tutorial-1/
- https://ml.berkeley.edu/blog/2016/12/24/tutorial-2/
- https://ml.berkeley.edu/blog/2017/02/04/tutorial-3/
- An Introduction to Machine Learning Theory and Its Applications: A Visual Tutorial with Examples
- https://www.toptal.com/machine-learning/machine-learning-theory-an-introductory-primer
- A Gentle Guide to Machine Learning
- https://monkeylearn.com/blog/a-gentle-guide-to-machine-learning/
- Which machine learning algorithm should I use?
- https://blogs.sas.com/content/subconsciousmusings/2017/04/12/machine-learning-algorithm-use/
- Machine Learning Primer
- https://www.sas.com/content/dam/SAS/en_us/doc/whitepaper1/machine-learning-primer-108796.pdf
- Machine Learning Tutorial for Beginners
- https://www.kaggle.com/kanncaa1/machine-learning-tutorial-for-beginners
Activation and Loss Functions (a small code sketch follows the links below)
- Sigmoid neurons
- http://neuralnetworksanddeeplearning.com/chap1.html
- What is the role of the activation function in a neural network?
- https://www.quora.com/What-is-the-role-of-the-activation-function-in-a-neural-network
- Comprehensive list of activation functions in neural networks with pros/cons
- https://stats.stackexchange.com/questions/115258/comprehensive-list-of-activation-functions-in-neural-networks-with-pros-cons
- Activation functions and their types: which is better?
- https://medium.com/towards-data-science/activation-functions-and-its-types-which-is-better-a9a5310cc8f
- Making Sense of Logarithmic Loss
- http://www.exegetic.biz/blog/2015/12/making-sense-logarithmic-loss/
- Loss Functions (Stanford CS231n)
- http://cs231n.github.io/neural-networks-2/
- L1 vs. L2 Loss Functions
- http://rishy.github.io/ml/2015/07/28/l1-vs-l2-loss/
- The cross-entropy cost function
- http://neuralnetworksanddeeplearning.com/chap3.html
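To make the vocabulary above concrete, here is a small numpy sketch (my own illustration, not code from any of the linked tutorials) of three common activation functions and the binary cross-entropy (log) loss:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    return np.maximum(0.0, z)

def tanh(z):
    return np.tanh(z)

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Binary cross-entropy (log loss), averaged over samples."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)  # avoid log(0)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z), relu(z), tanh(z))
print(cross_entropy(np.array([1, 0, 1]), sigmoid(z)))
```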
Bias
- Role of Bias in Neural Networks
- https://stackoverflow.com/questions/2480650/role-of-bias-in-neural-networks/2499936
- Bias Nodes in Neural Networks
- http://makeyourownneuralnetwork.blogspot.com/2016/06/bias-nodes-in-neural-networks.html
- What is bias in an artificial neural network?
- https://www.quora.com/What-is-bias-in-artificial-neural-network
Perceptrons (a from-scratch sketch follows these links)
- Perceptrons
- http://neuralnetworksanddeeplearning.com/chap1.html
- The Perceptron
- http://natureofcode.com/book/chapter-10-neural-networks/
- Single-layer Neural Networks (Perceptrons)
- http://computing.dcu.ie/~humphrys/Notes/Neural/single.neural.html
- From Perceptrons to Deep Networks
- https://www.toptal.com/machine-learning/an-introduction-to-deep-learning-from-perceptrons-to-deep-networks
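As a companion to these perceptron links, here is a minimal sketch of the classic perceptron learning rule, written from scratch in numpy on an invented AND-style toy problem:

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=20):
    """Perceptron learning rule: w += lr * (target - prediction) * x."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            update = lr * (target - pred)
            w += update * xi
            b += update
    return w, b

# Toy example: learn the (linearly separable) AND function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print([(1 if xi @ w + b > 0 else 0) for xi in X])  # expect [0, 0, 0, 1]
```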
Regression (a short numpy example follows these links)
- Introduction to Linear Regression Analysis
- http://people.duke.edu/~rnau/regintro.htm
- Linear Regression
- http://ufldl.stanford.edu/tutorial/supervised/LinearRegression/
- Linear Regression
- http://ml-cheatsheet.readthedocs.io/en/latest/linear_regression.html
- Logistic Regression
- http://ml-cheatsheet.readthedocs.io/en/latest/logistic_regression.html
- Simple Linear Regression Tutorial for Machine Learning
- http://machinelearningmastery.com/simple-linear-regression-tutorial-for-machine-learning/
- Logistic Regression Tutorial for Machine Learning
- http://machinelearningmastery.com/logistic-regression-tutorial-for-machine-learning/
- Softmax Regression
- http://ufldl.stanford.edu/tutorial/supervised/SoftmaxRegression/
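The tutorials above work through the math in detail; as a quick reference, this is a sketch (with made-up toy data) of ordinary least-squares linear regression solved with the normal equation in numpy:

```python
import numpy as np

# Toy data: y = 2x + 1 plus a little noise
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(scale=0.5, size=x.size)

# Design matrix with a bias column, then the normal equation: w = (X^T X)^(-1) X^T y
X = np.column_stack([np.ones_like(x), x])
w = np.linalg.solve(X.T @ X, X.T @ y)
print("intercept, slope:", w)  # close to (1, 2)
```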
Gradient Descent (a minimal sketch follows these links)
- Learning with gradient descent
- http://neuralnetworksanddeeplearning.com/chap1.html
- Gradient Descent
- http://iamtrask.github.io/2015/07/27/python-network-part2/
- How to understand the gradient descent algorithm
- http://www.kdnuggets.com/2017/04/simple-understand-gradient-descent-algorithm.html
- An overview of gradient descent optimization algorithms
- http://sebastianruder.com/optimizing-gradient-descent/
- Optimization: Stochastic Gradient Descent (Stanford CS231n)
- http://cs231n.github.io/optimization-1/
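To complement these overviews, here is a minimal batch gradient descent loop for the same linear-regression setting; it is an illustration written for this list, not code from the linked posts:

```python
import numpy as np

def gradient_descent(X, y, lr=0.01, steps=5000):
    """Minimize the mean squared error of Xw against y by batch gradient descent."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / n  # gradient of the MSE loss
        w -= lr * grad
    return w

# Fit y = 2x + 1 on a toy dataset
x = np.linspace(0, 10, 50)
y = 2 * x + 1
X = np.column_stack([np.ones_like(x), x])
print(gradient_descent(X, y))  # approaches (1, 2)
```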
Generative Learning
- Generative Learning Algorithms (Stanford CS229)
- http://cs229.stanford.edu/notes/cs229-notes2.pdf
- A practical explanation of a Naive Bayes classifier
- https://monkeylearn.com/blog/practical-explanation-naive-bayes-classifier/
Support Vector Machines
- An introduction to Support Vector Machines (SVM)
- https://monkeylearn.com/blog/introduction-to-support-vector-machines-svm/
- Support Vector Machines (Stanford CS229)
- http://cs229.stanford.edu/notes/cs229-notes3.pdf
- Linear classification: Support Vector Machine, Softmax (Stanford CS231n)
- http://cs231n.github.io/linear-classify/
Backpropagation (a from-scratch sketch follows these links)
- Yes you should understand backprop
- medium.com/@karpathy
- Can you give a visual explanation of the backpropagation algorithm for neural networks?
- https://github.com/rasbt/python-machine-learning-book/blob/master/faq/visual-backpropagation.md
- How the backpropagation algorithm works
- http://neuralnetworksanddeeplearning.com/chap2.html
- Backpropagation Through Time and Vanishing Gradients
- http://www.wildml.com/2015/10/recurrent-neural-networks-tutorial-part-3-backpropagation-through-time-and-vanishing-gradients/
- A Gentle Introduction to Backpropagation Through Time
- http://machinelearningmastery.com/gentle-introduction-backpropagation-time/
- Backpropagation, Intuitions (Stanford CS231n)
- http://cs231n.github.io/optimization-2/
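In the spirit of the short derivations linked above (and of the "11 lines of Python" network listed later in the Python section), this is a from-scratch sketch of backpropagation for a tiny two-layer sigmoid network; the XOR data, layer sizes, and learning rate are my own toy choices:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR, with a constant 1 appended to each input to act as a bias feature
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1 = rng.normal(size=(3, 8))   # input -> hidden
W2 = rng.normal(size=(8, 1))   # hidden -> output

for _ in range(20000):
    # Forward pass
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    # Backward pass: chain rule through the sigmoid of each layer (squared-error loss)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient step
    W2 -= 0.5 * h.T @ d_out
    W1 -= 0.5 * X.T @ d_h

print(out.round(2).ravel())  # typically converges close to [0, 1, 1, 0]
```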
Deep Learning
- A Guide to Deep Learning by YN²
- http://yerevann.com/a-guide-to-deep-learning/
- Deep Learning Papers Reading Roadmap
- https://github.com/floodsung/Deep-Learning-Papers-Reading-Roadmap
- Deep Learning in a Nutshell
- http://nikhilbuduma.com/2014/12/29/deep-learning-in-a-nutshell/
- A Tutorial on Deep Learning
- http://ai.stanford.edu/~quocle/tutorial1.pdf
- What is Deep Learning?
- http://machinelearningmastery.com/what-is-deep-learning/
- What's the Difference Between Artificial Intelligence, Machine Learning, and Deep Learning?
- https://blogs.nvidia.com/blog/2016/07/29/whats-difference-artificial-intelligence-machine-learning-deep-learning-ai/
- Deep Learning: A Gentle Introduction
- https://gluon.mxnet.io/
Optimization and Dimensionality Reduction
- Seven Techniques for Data Dimensionality Reduction
- https://www.knime.org/blog/seven-techniques-for-data-dimensionality-reduction
- Principal Component Analysis (Stanford CS229)
- http://cs229.stanford.edu/notes/cs229-notes10.pdf
- Dropout: A simple way to improve neural networks
- http://videolectures.net/site/normal_dl/tag=741100/nips2012_hinton_networks_01.pdf
- How to train your Deep Neural Network
- http://rishy.github.io/ml/2017/01/05/how-to-train-your-dnn/
Long Short-Term Memory (LSTM), with a single-step cell sketch after the links
- A Gentle Introduction to Long Short-Term Memory Networks
- http://machinelearningmastery.com/gentle-introduction-long-short-term-memory-networks-experts/
- Understanding LSTM Networks
- http://colah.github.io/posts/2015-08-Understanding-LSTMs/
- Exploring LSTMs
- http://blog.echen.me/2017/05/30/exploring-lstms/
- Anyone Can Learn To Code an LSTM-RNN in Python
- http://iamtrask.github.io/2015/11/15/anyone-can-code-lstm/
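For readers who prefer the equations in code form, here is a single-step LSTM cell in numpy that follows the standard forget/input/output gate formulation described in the posts above; the stacked weight layout is just one possible convention:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step. W has shape (4*H, D+H); the gates are stacked as [f, i, o, g]."""
    H = h_prev.size
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0:H])          # forget gate
    i = sigmoid(z[H:2 * H])      # input gate
    o = sigmoid(z[2 * H:3 * H])  # output gate
    g = np.tanh(z[3 * H:])       # candidate cell state
    c = f * c_prev + i * g       # new cell state
    h = o * np.tanh(c)           # new hidden state
    return h, c

# Run a random 5-step sequence through the cell
D, H = 4, 3
rng = np.random.default_rng(0)
W, b = rng.normal(size=(4 * H, D + H)), np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(5):
    h, c = lstm_step(rng.normal(size=D), h, c, W, b)
print(h)
```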
Convolutional Neural Networks (CNNs), with a small convolution sketch after the links
- Introducing convolutional networks
- http://neuralnetworksanddeeplearning.com/chap6.html
- Deep Learning and Convolutional Neural Networks
- https://medium.com/@ageitgey/machine-learning-is-fun-part-3-deep-learning-and-convolutional-neural-networks-f40359318721
- Conv Nets: A Modular Perspective
- http://colah.github.io/posts/2014-07-Conv-Nets-Modular/
- Understanding Convolutions
- http://colah.github.io/posts/2014-07-Understanding-Convolutions/
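Before the library-based CNN tutorials, it can help to see the core operation by itself; this sketch implements a "valid" 2D convolution (strictly speaking, cross-correlation, as deep learning libraries use the term) in plain numpy with an illustrative edge-detection kernel:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D cross-correlation: slide the kernel over the image and sum elementwise products."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.random.rand(8, 8)
kernel = np.array([[-1, -1, -1],
                   [-1,  8, -1],
                   [-1, -1, -1]], dtype=float)
print(conv2d(image, kernel).shape)  # (6, 6)
```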
Recurrent Neural Networks (RNNs)
- Recurrent Neural Networks Tutorial
- http://www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-1-introduction-to-rnns/
- Attention and Augmented Recurrent Neural Networks
- http://distill.pub/2016/augmented-rnns/
- The Unreasonable Effectiveness of Recurrent Neural Networks
- http://karpathy.github.io/2015/05/21/rnn-effectiveness/
- A Deep Dive into Recurrent Neural Nets
- http://nikhilbuduma.com/2015/01/11/a-deep-dive-into-recurrent-neural-networks/
Reinforcement Learning
- Simple Beginner's Guide to Reinforcement Learning and Its Implementation
- https://www.analyticsvidhya.com/blog/2017/01/introduction-to-reinforcement-learning-implementation/
- A Tutorial for Reinforcement Learning
- https://web.mst.edu/~gosavia/tutorial.pdf
- Learning Reinforcement Learning
- http://www.wildml.com/2016/10/learning-reinforcement-learning/
- Deep Reinforcement Learning: Pong from Pixels
- http://karpathy.github.io/2016/05/31/rl/
Generative Adversarial Networks (GANs)
- An Introduction to Adversarial Machine Learning
- https://aaai18adversarial.github.io/slides/AML.pptx
- What's a Generative Adversarial Network?
- https://blogs.nvidia.com/blog/2017/05/17/generative-adversarial-network/
- Abusing Generative Adversarial Networks to Make 8-bit Pixel Art
- https://medium.com/@ageitgey/abusing-generative-adversarial-networks-to-make-8-bit-pixel-art-e45d9b96cee7
- An Introduction to Generative Adversarial Networks (with code in TensorFlow)
- http://blog.aylien.com/introduction-generative-adversarial-networks-code-tensorflow/
- Generative Adversarial Networks for Beginners
- https://www.oreilly.com/learning/generative-adversarial-networks-for-beginners
Multi-task Learning
- An Overview of Multi-Task Learning in Deep Neural Networks
- http://sebastianruder.com/multi-task/index.html
NLP
- Natural Language Processing is Fun!
- https://medium.com/@ageitgey/natural-language-processing-is-fun-9a0bff37854e
- A Primer on Neural Network Models for Natural Language Processing
- http://u.cs.biu.ac.il/~yogo/nnlp.pdf
- The Definitive Guide to Natural Language Processing
- https://monkeylearn.com/blog/the-definitive-guide-to-natural-language-processing/
- Introduction to Natural Language Processing
- https://blog.algorithmia.com/introduction-natural-language-processing-nlp/
- Natural Language Processing Tutorial
- http://www.vikparuchuri.com/blog/natural-language-processing-tutorial/
- Natural Language Processing (Almost) from Scratch
- https://arxiv.org/pdf/1103.0398.pdf
Deep Learning and NLP
- Deep Learning Applied to NLP
- https://arxiv.org/pdf/1703.03091.pdf
- Deep Learning for NLP (without Magic)
- https://nlp.stanford.edu/courses/NAACL2013/NAACL2013-Socher-Manning-DeepLearning.pdf
- Understanding Convolutional Neural Networks for NLP
- http://www.wildml.com/2015/11/understanding-convolutional-neural-networks-for-nlp/
- Deep Learning, NLP, and Representations
- http://colah.github.io/posts/2014-07-NLP-RNNs-Representations/
- Embed, encode, attend, predict: the new deep learning formula for state-of-the-art NLP models
- https://explosion.ai/blog/deep-learning-formula-nlp
- Understanding Natural Language with Deep Neural Networks Using Torch
- https://devblogs.nvidia.com/parallelforall/understanding-natural-language-deep-neural-networks-using-torch/
- Deep Learning for NLP with PyTorch
- http://pytorch.org/tutorials/beginner/deep_learning_nlp_tutorial.html
Word Vectors (a cosine-similarity sketch follows these links)
- Classifying movie reviews with a bag-of-words model
- https://www.kaggle.com/c/word2vec-nlp-tutorial
- On Word Embeddings: Part 1, Part 2, Part 3
- http://sebastianruder.com/word-embeddings-1/index.html
- http://sebastianruder.com/word-embeddings-softmax/index.html
- http://sebastianruder.com/secret-word2vec/index.html
- The amazing power of word vectors
- https://blog.acolyer.org/2016/04/21/the-amazing-power-of-word-vectors/
- word2vec Parameter Learning Explained
- https://arxiv.org/pdf/1411.2738.pdf
- Word2Vec Tutorial: the Skip-Gram Model; Negative Sampling
- http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/
- http://mccormickml.com/2017/01/11/word2vec-tutorial-part-2-negative-sampling/
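Whatever tool produces the embeddings, most of the tutorials above end up comparing words by cosine similarity; the sketch below uses tiny made-up vectors purely to illustrate the similarity and analogy computations:

```python
import numpy as np

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# Made-up 4-dimensional "word vectors", for illustration only
vectors = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.9, 0.1, 0.8, 0.2]),
    "man":   np.array([0.5, 0.9, 0.0, 0.1]),
    "woman": np.array([0.5, 0.1, 0.9, 0.1]),
}

print(cosine(vectors["king"], vectors["queen"]))

# The classic analogy test: king - man + woman should land near queen
target = vectors["king"] - vectors["man"] + vectors["woman"]
best = max(vectors, key=lambda w: cosine(vectors[w], target))
print(best)  # "queen" with these toy vectors
```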
Encoder-Decoder
- Attention and Memory in Deep Learning and NLP
- http://www.wildml.com/2016/01/attention-and-memory-in-deep-learning-and-nlp/
- Sequence Models
- tensorflow.org
- Sequence to Sequence Learning with Neural Networks
- https://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks.pdf
- Machine Learning is Fun Part 5: Language Translation with Deep Learning and the Magic of Sequences
- https://medium.com/@ageitgey/machine-learning-is-fun-part-5-language-translation-with-deep-learning-and-the-magic-of-sequences-2ace0acca0aa
- How to use an Encoder-Decoder LSTM to Echo Sequences of Random Integers
- http://machinelearningmastery.com/how-to-use-an-encoder-decoder-lstm-to-echo-sequences-of-random-integers/
- tf-seq2seq
- https://google.github.io/seq2seq/
Python
- Machine Learning Crash Course
- https://developers.google.com/machine-learning/crash-course/
- Awesome Machine Learning
- https://github.com/josephmisiti/awesome-machine-learning
- 7 Steps to Mastering Machine Learning With Python
- http://www.kdnuggets.com/2015/11/seven-steps-machine-learning-python.html
- An example machine learning notebook
- http://nbviewer.jupyter.org/github/rhiever/Data-Analysis-and-Machine-Learning-Projects/blob/master/example-data-science-notebook/Example%20Machine%20Learning%20Notebook.ipynb
- Machine Learning with Python
- https://www.tutorialspoint.com/machine_learning_with_python/machine_learning_with_python_quick_guide.htm
Hands-on Examples (a small kNN sketch follows these links)
- How To Implement the Perceptron Algorithm From Scratch in Python
- http://machinelearningmastery.com/implement-perceptron-algorithm-scratch-python/
- Implementing a Neural Network from Scratch in Python
- http://www.wildml.com/2015/09/implementing-a-neural-network-from-scratch/
- A Neural Network in 11 Lines of Python
- http://iamtrask.github.io/2015/07/12/basic-python-network/
- Implementing Your Own k-Nearest Neighbour Algorithm Using Python
- http://www.kdnuggets.com/2016/01/implementing-your-own-knn-using-python.html
- ML From Scratch
- https://github.com/eriklindernoren/ML-From-Scratch
- Python Machine Learning (2nd Edition) Code Repository
- https://github.com/rasbt/python-machine-learning-book-2nd-edition
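In the same from-scratch spirit as the links above, here is a tiny k-nearest-neighbour classifier in plain numpy; the two-cluster toy data is invented for illustration:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points (Euclidean distance)."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Two small clusters as toy training data
X_train = np.array([[1.0, 1.1], [1.2, 0.9], [0.9, 1.0],
                    [5.0, 5.2], [5.1, 4.9], [4.8, 5.0]])
y_train = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X_train, y_train, np.array([1.1, 1.0])))  # 0
print(knn_predict(X_train, y_train, np.array([5.0, 5.0])))  # 1
```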
SciPy and NumPy
- Scipy Lecture Notes
- http://www.scipy-lectures.org/
- Python NumPy Tutorial
- http://cs231n.github.io/python-numpy-tutorial/
- An Introduction to NumPy and SciPy
- https://engineering.ucsb.edu/~shell/che210d/numpy.pdf
- A Crash Course in Python for Scientists
- http://nbviewer.jupyter.org/gist/rpmuller/5920182
scikit-learn (a minimal usage example follows these links)
- PyCon scikit-learn Tutorial Index
- http://nbviewer.jupyter.org/github/jakevdp/sklearn_pycon2015/blob/master/notebooks/Index.ipynb
- scikit-learn Classification Algorithms
- https://github.com/mmmayo13/scikit-learn-classifiers/blob/master/sklearn-classifiers-tutorial.ipynb
- scikit-learn Tutorials
- http://scikit-learn.org/stable/tutorial/index.html
- Short scikit-learn Tutorials for Beginners
- https://github.com/mmmayo13/scikit-learn-beginners-tutorials
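Most of these tutorials build towards the same basic workflow: load a dataset, split it, fit an estimator, and score it. The sketch below shows that workflow with scikit-learn's bundled iris data; it is a generic example, not code from any single linked tutorial:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Load a small built-in dataset and hold out a test set
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=1000)  # higher max_iter avoids convergence warnings
clf.fit(X_train, y_train)
print(accuracy_score(y_test, clf.predict(X_test)))  # typically well above 0.9 on iris
```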
TensorFlow (a minimal Keras example follows these links)
- TensorFlow Tutorials
- https://www.tensorflow.org/tutorials/
- Introduction to TensorFlow: CPU vs GPU
- https://medium.com/@erikhallstrm/hello-world-tensorflow-649b15aed18c
- TensorFlow: A Primer
- https://blog.metaflow.fr/tensorflow-a-primer-4b3fa0978be3
- RNNs in TensorFlow
- http://www.wildml.com/2016/08/rnns-in-tensorflow-a-practical-guide-and-undocumented-features/
- Implementing a CNN for Text Classification in TensorFlow
- http://www.wildml.com/2015/12/implementing-a-cnn-for-text-classification-in-tensorflow/
- How to Run Text Summarization with TensorFlow
- http://pavel.surmenok.com/2016/10/15/how-to-run-text-summarization-with-tensorflow/
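Before diving into the longer tutorials, it may help to see the overall shape of a modern TensorFlow program; the sketch below assumes TensorFlow 2.x with its bundled Keras API and MNIST dataset:

```python
import tensorflow as tf

# Load MNIST and scale pixel values to [0, 1]
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small fully connected classifier
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=2, batch_size=64, validation_split=0.1)
print(model.evaluate(x_test, y_test, verbose=0))  # [test loss, test accuracy]
```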
PyTorch (a minimal training-loop example follows these links)
- PyTorch Tutorials
- http://pytorch.org/tutorials/
- A Gentle Intro to PyTorch
- http://blog.gaurav.im/2017/04/24/a-gentle-intro-to-pytorch/
- Tutorial: Deep Learning in PyTorch
- https://iamtrask.github.io/2017/01/15/pytorch-tutorial/
- PyTorch Examples
- https://github.com/jcjohnson/pytorch-examples
- PyTorch Tutorial
- https://github.com/MorvanZhou/PyTorch-Tutorial
- PyTorch Tutorial for Deep Learning Researchers
- https://github.com/yunjey/pytorch-tutorial
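Likewise for PyTorch, the tutorials above all revolve around the same training loop; here is a minimal, self-contained version that fits a small regression model on synthetic data (the data and model are invented for illustration):

```python
import torch
from torch import nn

# Synthetic data: y = 3x + 2 plus a little noise
torch.manual_seed(0)
x = torch.linspace(0, 1, 100).unsqueeze(1)
y = 3 * x + 2 + 0.05 * torch.randn_like(x)

model = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

for epoch in range(500):
    optimizer.zero_grad()        # reset gradients from the previous step
    loss = loss_fn(model(x), y)  # forward pass
    loss.backward()              # backpropagate
    optimizer.step()             # update parameters

print(loss.item())  # should end up close to the noise floor (~0.0025)
```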
Math
- Math for Machine Learning
- https://people.ucsc.edu/~praman1/static/pub/math-for-ml.pdf
- Math for Machine Learning
- http://www.umiacs.umd.edu/~hal/courses/2013S_ML/math4ml.pdf
Linear Algebra (a short numpy sketch follows these links)
- An Intuitive Guide to Linear Algebra
- https://betterexplained.com/articles/linear-algebra-guide/
- A Programmer's Intuition for Matrix Multiplication
- https://betterexplained.com/articles/matrix-multiplication/
- Understanding the Cross Product
- https://betterexplained.com/articles/cross-product/
- Understanding the Dot Product
- https://betterexplained.com/articles/vector-calculus-understanding-the-dot-product/
- Linear Algebra for Machine Learning (University at Buffalo CSE574)
- http://www.cedar.buffalo.edu/~srihari/CSE574/Chap1/LinearAlgebra.pdf
- Linear Algebra Cheat Sheet for Deep Learning
- https://medium.com/towards-data-science/linear-algebra-cheat-sheet-for-deep-learning-cd67aba4526c
- Linear Algebra Review and Reference (Stanford CS229)
- http://cs229.stanford.edu/section/cs229-linalg.pdf
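The dot-product and cross-product articles above pair nicely with a few lines of numpy; this snippet simply restates the definitions numerically:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

print(np.dot(a, b))    # 32.0: sum of elementwise products
print(np.cross(a, b))  # [-3, 6, -3]: a vector perpendicular to both a and b
print(np.allclose(np.dot(np.cross(a, b), a), 0.0))  # True: the cross product is orthogonal to a

# Matrix multiplication as "rows dot columns"
A = np.arange(6).reshape(2, 3)
B = np.arange(6).reshape(3, 2)
print(A @ B)
```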
Probability
- Understanding Bayes Theorem With Ratios
- https://betterexplained.com/articles/understanding-bayes-theorem-with-ratios/
- Review of Probability Theory (Stanford CS229)
- http://cs229.stanford.edu/section/cs229-prob.pdf
- Probability Theory Review for Machine Learning
- https://see.stanford.edu/materials/aimlcs229/cs229-prob.pdf
- Probability Theory (University at Buffalo CSE574)
- http://www.cedar.buffalo.edu/~srihari/CSE574/Chap1/Probability-Theory.pdf
- Probability Theory for Machine Learning (University of Toronto CSC411)
- http://www.cs.toronto.edu/~urtasun/courses/CSC411_Fall16/tutorial1.pdf
Calculus (a numerical gradient check follows these links)
- How To Understand Derivatives: The Quotient Rule, Exponents, and Logarithms
- https://betterexplained.com/articles/how-to-understand-derivatives-the-quotient-rule-exponents-and-logarithms/
- How To Understand Derivatives: The Product, Power and Chain Rules
- https://betterexplained.com/articles/derivatives-product-power-chain/
- Vector Calculus: Understanding the Gradient
- https://betterexplained.com/articles/vector-calculus-understanding-the-gradient/
- Differential Calculus (Stanford CS224n)
- http://web.stanford.edu/class/cs224n/lecture_notes/cs224n-2017-review-differential-calculus.pdf
- Calculus Overview
- http://ml-cheatsheet.readthedocs.io/en/latest/calculus.html
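A useful habit when working through these derivative tutorials is to check results numerically; the sketch below compares an analytic gradient with a central-difference approximation on an arbitrary example function:

```python
import numpy as np

def f(w):
    # Example function: f(w) = w0^2 + 3*w0*w1 + sin(w1)
    return w[0] ** 2 + 3 * w[0] * w[1] + np.sin(w[1])

def analytic_grad(w):
    return np.array([2 * w[0] + 3 * w[1], 3 * w[0] + np.cos(w[1])])

def numerical_grad(f, w, eps=1e-5):
    """Central differences: (f(w + eps*e_i) - f(w - eps*e_i)) / (2*eps) for each coordinate."""
    grad = np.zeros_like(w)
    for i in range(w.size):
        step = np.zeros_like(w)
        step[i] = eps
        grad[i] = (f(w + step) - f(w - step)) / (2 * eps)
    return grad

w = np.array([1.0, 2.0])
print(analytic_grad(w))
print(numerical_grad(f, w))  # should agree to several decimal places
```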