ComPreEND: Computation Pruning through Predictive Early Negative Detection for ReLU in a Deep Neural Network Accelerator

IEEE Transactions on Computers (2022)

Abstract
A vast number of DNN activation values are zero due to ReLU (Rectified Linear Unit), one of the most common activation functions in modern neural networks. Since ReLU outputs zero for every negative input, the inputs to ReLU need not be computed exactly as long as they are known to be negative. However, most accelerators do not exploit this property of DNNs, forgoing substantial opportunities for speedups and energy savings. To exploit these opportunities, we propose early negative detection (END), a computation pruning technique that detects negative results at an early stage. The key to early negative detection is the adoption of an inverted two's complement representation for filter parameters, which ensures that as soon as an intermediate result becomes negative, the final result is guaranteed to be negative. Upon detection, the remaining computation can be skipped and the subsequent ReLU output simply set to zero. We also propose a DNN accelerator architecture (ComPreEND) that takes advantage of this skipping. Our evaluation shows that ComPreEND with END significantly improves both energy efficiency and performance: compared to the baseline, it achieves speedups of 20.5 and 29.3 percent in accurate and predictive modes, and energy savings of 28.4 and 41.4 percent, respectively.
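The pruning mechanism described in the abstract can be illustrated in software. Below is a minimal sketch, not the paper's hardware datapath: it assumes non-negative input activations (true after a preceding ReLU layer) and 8-bit weights, processes weight bit planes MSB-first in an inverted two's complement encoding, and prunes the remaining planes once the running partial sum turns negative. All function and variable names here are illustrative.

```python
# Minimal software sketch of early negative detection (END). This is an
# illustration, not the paper's hardware datapath. Assumptions: input
# activations are non-negative (true after a preceding ReLU), and weights
# fit the inverted two's complement range [-(2**(nbits-1) - 1), 2**(nbits-1)].

def inverted_twos_complement_bits(w, nbits=8):
    """Return the bits of w in inverted two's complement, MSB first.

    The encoding stores the ordinary two's complement bits of -w; read
    with the inverted convention (MSB weighted positively, all lower
    bits negatively), those bits evaluate back to w.
    """
    t = (-w) & ((1 << nbits) - 1)
    return [(t >> i) & 1 for i in range(nbits - 1, -1, -1)]


def relu_dot_with_end(activations, weights, nbits=8):
    """Bit-serial dot product over weight bit planes, MSB first.

    With inverted two's complement weights and activations >= 0, only the
    MSB plane adds a positive contribution; every later plane subtracts.
    The running partial sum is therefore an upper bound on the final
    pre-activation, so the first time it goes negative the remaining
    planes can be pruned and the ReLU output set to zero.
    """
    bit_planes = [inverted_twos_complement_bits(w, nbits) for w in weights]
    partial = 0
    for plane in range(nbits):
        magnitude = 1 << (nbits - 1 - plane)
        sign = 1 if plane == 0 else -1  # MSB positive, the rest negative
        for a, bits in zip(activations, bit_planes):
            partial += sign * a * bits[plane] * magnitude
        if partial < 0:        # upper bound already negative:
            return 0           # skip remaining planes, ReLU -> 0
    return max(partial, 0)     # ReLU on the exact result
```

For instance, relu_dot_with_end([1, 2], [3, -5]) observes a negative partial sum after six of the eight bit planes and returns 0 without ever evaluating the exact pre-activation (-7).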
Keywords
Early negative detection, prediction, computation pruning, deep neural network, accelerator