Korean J. Math. Vol. 26 No. 2 (2018) pp.337-348
DOI: https://doi.org/10.11568/kjm.2018.26.2.337

Beginner's guide to neural networks for the MNIST dataset using MATLAB


Bitna Kim
Young Ho Park

Abstract

The MNIST dataset is a database of images of handwritten digits, each labeled with an integer from 0 to 9. It is widely used to benchmark the performance of machine learning algorithms. Neural networks for MNIST are regarded as a starting point for the study of machine learning algorithms. However, it is not easy to get started with the actual programming. In this expository article, we give step-by-step instructions for building neural networks for the MNIST dataset using MATLAB.
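As a preview of the workflow the article walks through, the following is a minimal MATLAB sketch for training a small network on MNIST. It assumes the Neural Network Toolbox (see [9]) and the loadMNISTImages/loadMNISTLabels helper functions from [22] are on the path; the hidden-layer size and file names are illustrative choices, not the exact configuration used in the paper.

% Minimal MNIST classification sketch (illustrative, not the paper's
% exact setup). Requires the Neural Network Toolbox and the mnistHelper
% functions loadMNISTImages/loadMNISTLabels from [22].
images = loadMNISTImages('train-images-idx3-ubyte');  % 784 x 60000, doubles in [0,1]
labels = loadMNISTLabels('train-labels-idx1-ubyte');  % 60000 x 1, digits 0..9

targets = full(ind2vec(labels' + 1));   % one-hot encoding: 10 x 60000 (shift 0..9 to 1..10)

net = patternnet(100);                  % feedforward net, one hidden layer of 100 neurons
net = train(net, images, targets);      % trains with scaled conjugate gradient by default

pred = vec2ind(net(images)) - 1;        % map network outputs back to digit labels 0..9
fprintf('Training accuracy: %.2f%%\n', 100 * mean(pred == labels'));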



References

[1] G. Cybenko, Approximation by superpositions of a sigmoidal function, Mathematics of Control, Signals, and Systems 2 (4) (1989), 303–314.

[2] M.T. Hagan, M.H. Beale, H.B. Demuth and O. De Jesús, Neural Network Design, 2nd ed.

[3] M.T. Hagan, Neural Network Design, free book available at http://hagan.okstate.edu/NNDesign.pdf

[4] J.J. Hopfield, Neural networks and physical systems with emergent collective computational abilities, Proceedings of the National Academy of Sciences 79 (1982), 2554–2558.

[5] K. Hornik, Approximation capabilities of multilayer feedforward networks, Neural Networks 4 (2) (1991), 251–257.

[6] Bitna Kim, Handwritten digits classification by neural networks with small data, Master's thesis, Kangwon National University, 2018.

[7] P. Kim, MATLAB Deep Learning, Apress, 2017.

[8] T. Kohonen, Correlation matrix memories, IEEE Transactions on Computers 21 (1972), 353–359.

[9] MathWorks, MATLAB documentation, MATLAB version R2016a, 2016.

[10] W.S. McCulloch and W.H. Pitts, A logical calculus of the ideas immanent in nervous activity, Bull. Math. Biophysics 5 (1943), 115–133.

[11] M. Minsky and S. Papert, Perceptrons: An Introduction to Computational Geometry, M.I.T. Press, Cambridge, 1969.

[12] MNIST, http://yann.lecun.com/exdb/mnist/

[13] A. Ng, Course on machine learning, Coursera, https://www.coursera.org/learn/machine-learning

[14] A. Ng, CS229 lecture notes, http://cs229.stanford.edu

[15] M. Nielsen, Neural Networks and Deep Learning, http://neuralnetworksanddeeplearning.com

[16] UFLDL, Using the MNIST dataset, http://ufldl.stanford.edu/wiki/index.php/Using_the_MNIST_Dataset

[17] T. Rashid, Make Your Own Neural Network, CreateSpace, 2016.

[18] F. Rosenblatt, The perceptron: A probabilistic model for information storage and organization in the brain, Psychological Review 65 (1958), 386–408.

[19] D.E. Rumelhart and J.L. McClelland, eds., Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1, MIT Press, Cambridge, MA, 1986.

[20] J.R. Shewchuk, An introduction to the conjugate gradient method without the agonizing pain, Technical report, Carnegie Mellon University, 1994.

[21] UFLDL tutorial, Unsupervised feature learning and deep learning, http://deeplearning.stanford.edu/wiki/index.php/UFLDL_Tutorial

[22] mnistHelper, http://ufldl.stanford.edu/wiki/resources/mnistHelper.zip

[23] Wikipedia, https://en.wikipedia.org/