Korean J. Math. Vol. 30 No. 1 (2022) pp.131-146
DOI: https://doi.org/10.11568/kjm.2022.30.1.131

Gradients in a deep neural network and their Python implementations

Young Ho Park

Abstract

This is an expository article about gradients in deep neural networks. It is hard to find a place where the gradients in a deep neural network are treated in detail in a systematic, mathematical way. We review and compute the gradients and Jacobians to derive the formulas for the gradients that appear in backpropagation, and we implement them in vectorized form in Python.
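To illustrate the kind of vectorized gradient computation the article discusses, here is a minimal NumPy sketch (not the article's own code) of the forward pass and backpropagated gradients for a single dense layer with a sigmoid activation and squared-error loss; the shapes and notation are assumptions made for this example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_backward(X, Y, W, b):
    """One dense layer: Z = XW + b, A = sigmoid(Z), L = sum((A-Y)^2)/(2m).

    Shapes (assumed for this sketch): X is (m, n_in), W is (n_in, n_out),
    b is (n_out,), Y is (m, n_out). Returns the loss and the gradients
    dL/dW and dL/db, computed in vectorized form via the chain rule.
    """
    m = X.shape[0]
    Z = X @ W + b                      # pre-activation, shape (m, n_out)
    A = sigmoid(Z)                     # activation
    loss = np.sum((A - Y) ** 2) / (2 * m)
    dZ = (A - Y) * A * (1 - A) / m     # chain rule through loss and sigmoid
    dW = X.T @ dZ                      # gradient w.r.t. W, same shape as W
    db = dZ.sum(axis=0)                # gradient w.r.t. b, same shape as b
    return loss, dW, db
```

A quick finite-difference check (perturbing one weight and comparing the loss change to the analytic gradient) is a standard way to verify such formulas, in the spirit of the gradient verification referenced by the article.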


