TY - CONF
TI - Dynamic point stochastic rounding algorithm for limited precision arithmetic in Deep Belief Network training
A1 - Essam, M.
A1 - Tang, T.B.
A1 - Ho, E.T.W.
A1 - Chen, H.
Y1 - 2017///
N2 - This paper reports how to train a Deep Belief Network (DBN) using only 8-bit fixed-point parameters. We propose a dynamic-point stochastic rounding algorithm that improves on existing stochastic rounding. We show that a variable scaling factor enhances the fixed-point parameter updates. To be more hardware-amenable, we further propose the use of a common scaling factor at each layer of the DBN. Using the publicly available MNIST database, we show that the proposed algorithm can train a 3-layer DBN with an average accuracy of 98.49%, a drop of 0.08% from the double-precision floating-point average accuracy. © 2017 IEEE.
N1 - cited By 7; Conference of 8th International IEEE EMBS Conference on Neural Engineering, NER 2017; Conference Date: 25 May 2017 through 28 May 2017; Conference Code: 129986
KW - Digital arithmetic
KW - Deep belief network (DBN)
KW - Deep belief networks
KW - Fixed points
KW - Floating points
KW - MNIST database
KW - Precision arithmetic
KW - Rounding algorithm
KW - Scaling factors
KW - Stochastic systems
PB - IEEE Computer Society
SN - 19483546
SP - 629
EP - 632
UR - https://www.scopus.com/inward/record.uri?eid=2-s2.0-85028585216&doi=10.1109%2fNER.2017.8008430&partnerID=40&md5=3d565633d6b0ee148b8349a9b15f1d76
AV - none
ID - scholars8456
ER -