Deep, Big, Simple Neural Nets for Handwritten Digit Recognition

Neural Computation (2010)

Cited 696
Abstract
Good old online backpropagation for plain multilayer perceptrons yields a very low 0.35% error rate on the MNIST handwritten digits benchmark. All we need to achieve this best result so far are many hidden layers, many neurons per layer, numerous deformed training images to avoid overfitting, and graphics cards to greatly speed up learning.
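The recipe the abstract describes — a plain deep MLP trained with per-example (online) backpropagation — can be sketched in a few dozen lines. This is an illustrative toy, not the paper's setup: it uses small synthetic data in place of MNIST, two modest hidden layers instead of the paper's much larger nets, and omits the elastic image deformations and GPU acceleration; all sizes and learning rates are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def one_hot(y, k):
    out = np.zeros((y.size, k))
    out[np.arange(y.size), y] = 1.0
    return out

# Tiny synthetic stand-in for MNIST: 10 classes, 64-dim inputs.
X = rng.normal(size=(500, 64))
W_true = rng.normal(size=(64, 10))
y = np.argmax(X @ W_true, axis=1)
T = one_hot(y, 10)

# "Deep, big, simple": plain fully connected layers, tanh hidden
# units, softmax output. Layer sizes here are illustrative only.
sizes = [64, 100, 100, 10]
Ws = [rng.normal(scale=1 / np.sqrt(m), size=(m, n))
      for m, n in zip(sizes[:-1], sizes[1:])]
bs = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    """Return the list of layer activations for one input vector."""
    acts = [x]
    for W, b in zip(Ws[:-1], bs[:-1]):
        acts.append(np.tanh(acts[-1] @ W + b))
    z = acts[-1] @ Ws[-1] + bs[-1]
    e = np.exp(z - z.max())          # stable softmax
    acts.append(e / e.sum())
    return acts

def train_epoch(lr=0.01):
    """One pass of online backpropagation: update after every example."""
    total = 0.0
    for x, t in zip(X, T):
        acts = forward(x)
        total -= np.log(acts[-1][np.argmax(t)] + 1e-12)
        delta = acts[-1] - t         # softmax + cross-entropy gradient
        for i in range(len(Ws) - 1, -1, -1):
            gW = np.outer(acts[i], delta)
            gb = delta
            if i > 0:                # backpropagate through tanh layer
                delta = (delta @ Ws[i].T) * (1.0 - acts[i] ** 2)
            Ws[i] -= lr * gW
            bs[i] -= lr * gb
    return total / len(X)            # mean cross-entropy loss

loss0 = train_epoch()
loss1 = train_epoch()
```

Each additional ingredient from the paper (more and wider layers, deformed copies of the training images generated on the fly, and GPU kernels for the matrix products) slots into this loop without changing its structure.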
Keywords
training set deformations, GPU (graphics processing unit), BP (back-propagation), MLP (multilayer perceptron), MNIST, NN (neural network), error rate