Deep Learning Tutorials
Deep Learning is a new area of Machine Learning research, which has been introduced with the objective of moving Machine Learning closer to one of its original goals: Artificial Intelligence. See these course notes for a brief introduction to Machine Learning for AI and an introduction to Deep Learning algorithms.
Deep Learning is about learning multiple levels of representation and abstraction that help to make sense of data such as images, sound, and text. For more about deep learning algorithms, see for example:
- The monograph or review paper Learning Deep Architectures for AI (Foundations & Trends in Machine Learning, 2009).
- The ICML 2009 Workshop on Learning Feature Hierarchies webpage has a list of references.
- The LISA public wiki has a reading list and a bibliography.
- Geoff Hinton has readings from 2009’s NIPS tutorial.
The tutorials presented here will introduce you to some of the most important deep learning algorithms and will also show you how to run them using Theano. Theano is a Python library that makes writing deep learning models easy, and gives the option of training them on a GPU.
The algorithm tutorials have some prerequisites. You should know some Python and be familiar with NumPy. Since this tutorial is about using Theano, you should read over the Theano basic tutorial first. Once you’ve done that, read through our Getting Started chapter – it introduces the notation and downloadable datasets used in the algorithm tutorials, and the way we do optimization by stochastic gradient descent.
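The tutorials all train by stochastic gradient descent: instead of computing the gradient on the whole training set, each update uses a single example (or a small minibatch). As a rough illustration of the idea, here is a minimal NumPy sketch of per-example SGD fitting a one-variable linear model; the toy data, learning rate, and epoch count are illustrative choices, not taken from the tutorials:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic data from y = 3x + 1 plus a little noise (toy example)
X = rng.uniform(-1, 1, size=(200, 1))
y = 3.0 * X[:, 0] + 1.0 + 0.1 * rng.standard_normal(200)

w, b = 0.0, 0.0      # parameters to learn
lr = 0.1             # learning rate
for epoch in range(50):
    for i in rng.permutation(len(X)):   # visit examples in random order
        err = (w * X[i, 0] + b) - y[i]  # prediction error on one example
        w -= lr * err * X[i, 0]         # gradient step for squared loss
        b -= lr * err
# w and b should now be close to the true values 3 and 1
```

The same loop structure — shuffle, then one small gradient step per example — is what the Theano tutorials compile into GPU-friendly update functions.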
The purely supervised learning algorithms are meant to be read in order:
- Logistic Regression - using Theano for something simple
- Multilayer perceptron - introduction to layers
- Deep Convolutional Network - a simplified version of LeNet5
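To give a feel for the first step of this supervised thread, here is a NumPy sketch of binary logistic regression trained by batch gradient descent. This is not the Theano code from the tutorial — the toy two-blob dataset, learning rate, and variable names are my own illustrative choices:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
# toy data: two well-separated Gaussian blobs in 2-D
X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
t = np.concatenate([np.zeros(100), np.ones(100)])  # class labels

W, b = np.zeros(2), 0.0
lr = 0.5
for _ in range(200):
    p = sigmoid(X @ W + b)       # P(class = 1 | x)
    grad_z = (p - t) / len(X)    # gradient of mean cross-entropy loss
    W -= lr * (X.T @ grad_z)
    b -= lr * grad_z.sum()

acc = ((sigmoid(X @ W + b) > 0.5) == t).mean()  # training accuracy
```

The multilayer perceptron tutorial then stacks a hidden layer in front of exactly this kind of output layer.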
The unsupervised and semi-supervised learning algorithms can be read in any order (the auto-encoders can be read independently of the RBM/DBN thread):
- Auto Encoders, Denoising Autoencoders - description of autoencoders
- Stacked Denoising Auto-Encoders - easy steps into unsupervised pre-training for deep nets
- Restricted Boltzmann Machines - single layer generative RBM model
- Deep Belief Networks - unsupervised generative pre-training of stacked RBMs followed by supervised fine-tuning
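The core trick of the denoising autoencoder is to corrupt the input, then train the network to reconstruct the clean version. A minimal NumPy sketch of that training loop, with tied encoder/decoder weights, follows; the toy prototype data, layer sizes, and hyperparameters are illustrative assumptions, not the tutorial's settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# toy data: copies of four 8-bit prototype patterns
prototypes = (rng.random((4, 8)) > 0.5).astype(float)
X = prototypes[rng.integers(0, 4, size=500)]

n_vis, n_hid = 8, 5
W = 0.1 * rng.standard_normal((n_vis, n_hid))   # tied weights
b_h, b_v = np.zeros(n_hid), np.zeros(n_vis)
lr, corruption = 0.5, 0.3

def reconstruction_loss(X):
    """Cross-entropy between clean input and its reconstruction."""
    h = sigmoid(X @ W + b_h)
    z = sigmoid(h @ W.T + b_v)
    return -np.mean(X * np.log(z + 1e-9) + (1 - X) * np.log(1 - z + 1e-9))

loss_before = reconstruction_loss(X)
for epoch in range(200):
    mask = rng.random(X.shape) > corruption  # zero out ~30% of each input
    x_tilde = X * mask
    h = sigmoid(x_tilde @ W + b_h)           # encode the corrupted input
    z = sigmoid(h @ W.T + b_v)               # decode with tied weights
    dz = (z - X) / len(X)                    # grad wrt decoder pre-activation
    dh = (dz @ W) * h * (1 - h)              # backprop through the encoder
    W -= lr * (x_tilde.T @ dh + dz.T @ h)    # both uses of the tied W
    b_v -= lr * dz.sum(axis=0)
    b_h -= lr * dh.sum(axis=0)
loss_after = reconstruction_loss(X)          # should be lower than before
```

Stacking several such layers and training them one at a time is the pre-training scheme of the stacked denoising autoencoder tutorial.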
Building towards including the mcRBM model, we have a new tutorial on sampling from energy models:
- HMC Sampling - hybrid (aka Hamiltonian) Monte-Carlo sampling with scan()
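HMC proposes new states by simulating Hamiltonian dynamics with a leapfrog integrator, then accepts or rejects with a Metropolis test on the total energy. As a rough sketch of the mechanics (in plain NumPy rather than the tutorial's scan()-based Theano code), here is a sampler hard-coded for a 1-D standard normal target, where the potential energy is U(x) = x²/2; step size, trajectory length, and sample count are illustrative:

```python
import numpy as np

def hmc_standard_normal(n_samples=3000, eps=0.2, n_steps=15, seed=0):
    """HMC for a 1-D standard normal: U(x) = x**2/2, so dU/dx = x."""
    rng = np.random.default_rng(seed)
    x, samples = 0.0, []
    for _ in range(n_samples):
        p = rng.standard_normal()        # resample momentum
        x_new, p_new = x, p
        # leapfrog integration of Hamiltonian dynamics
        p_new -= 0.5 * eps * x_new       # initial half momentum step
        for i in range(n_steps):
            x_new += eps * p_new         # full position step
            if i < n_steps - 1:
                p_new -= eps * x_new     # full momentum step
        p_new -= 0.5 * eps * x_new       # final half momentum step
        # Metropolis accept/reject on the total energy H = U + kinetic
        h_old = 0.5 * x**2 + 0.5 * p**2
        h_new = 0.5 * x_new**2 + 0.5 * p_new**2
        if rng.random() < np.exp(h_old - h_new):
            x = x_new
        samples.append(x)
    return np.array(samples)

samples = hmc_standard_normal()[500:]    # drop burn-in
```

The resulting samples should have mean near 0 and variance near 1; the tutorial expresses the same leapfrog loop symbolically with Theano's scan().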
Building towards including the Contractive auto-encoders tutorial, we have the code for now:
- Contractive auto-encoders code - there is some basic doc in the code.
Further tutorials cover:
- Recurrent neural networks with word embeddings and context window
- LSTM network for sentiment analysis
- Energy-based recurrent neural network (RNN-RBM)
Note that the tutorials here are all compatible with Python 2 and 3, with the exception of Modeling and generating sequences of polyphonic music with the RNN-RBM which is only available for Python 2.
from: http://deeplearning.net/tutorial/