Trigonometric Inference Providing Learning in Deep Neural Networks
Cai, Jingyong; Takemoto, Masashi; Qiu, Yuming; Nakajo, Hironori
2021-08-01
Abstract: Despite being heavily used in the training of deep neural networks (DNNs), multipliers are resource-intensive and in short supply in many hardware scenarios. Previous work has demonstrated the advantages of computing activation functions, such as the sigmoid, with shift-and-add operations, although such approaches fail to remove multiplications from training altogether. In this paper, we propose an approach that converts all multiplications in the forward and backward inferences of DNNs into shift-and-add operations. Because the model parameters and backpropagated errors of a large DNN model are typically clustered around zero, these values can be approximated by their sine values. Multiplications between the weights and error signals are thereby transformed into multiplications of their sine values, which can be replaced with simpler operations via the product-to-sum formula. In addition, a rectified sine activation function converts layer inputs into sine values as well. In this way, the original multiplication-intensive operations can be computed through simple shift-and-add operations. This trigonometric approximation method provides an efficient training and inference alternative for devices without sufficient hardware multipliers. Experimental results demonstrate that the method achieves performance close to that of classical training algorithms. The proposed approach sheds new light on future hardware customization research for machine learning.
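A minimal sketch of the identity the abstract invokes (an illustrative assumption, not the authors' implementation: trig_mul is a hypothetical helper, and np.cos stands in for the shift-and-add cosine routine the paper targets):

```python
import numpy as np

# For values clustered near zero, x ~ sin(x), so a weight-error product
# w * e can be approximated through the product-to-sum formula:
#   sin(w) * sin(e) = (cos(w - e) - cos(w + e)) / 2
# On multiplier-free hardware, the cosines would come from shift-and-add
# routines (e.g., CORDIC) or small lookup tables; np.cos stands in here.

def trig_mul(w, e):
    """Approximate w * e for small w and e via the product-to-sum formula."""
    return 0.5 * (np.cos(w - e) - np.cos(w + e))

w, e = 0.05, -0.02
print(w * e)           # exact product: -0.001
print(trig_mul(w, e))  # ~ -0.0009995, close because w and e are near zero
```

Note that the remaining factor of 0.5 is a single bit-shift in fixed-point arithmetic, which is why the identity removes the need for a general-purpose multiplier.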
Keywords: backpropagation; neural network training; trigonometric inference
DOI: 10.3390/app11156704
Journal: APPLIED SCIENCES-BASEL
Volume: 11; Issue: 15; Pages: 15
Corresponding Author: Nakajo, Hironori (nakajo@cc.tuat.ac.jp)
Indexed By: SCI
WOS ID: WOS:000681990600001
Language: English