{"id":131995,"date":"2022-07-25T18:45:10","date_gmt":"2022-07-25T09:45:10","guid":{"rendered":"http:\/\/192.249.19.202\/?post_type=ai-in-circuit&#038;p=131995"},"modified":"2026-04-05T18:04:07","modified_gmt":"2026-04-05T09:04:07","slug":"quantization-error-robust-deep-neural-network-for-embedded-accelerators","status":"publish","type":"ai-in-circuit","link":"http:\/\/ee.presscat.kr\/en\/ai-in-circuit\/quantization-error-robust-deep-neural-network-for-embedded-accelerators\/","title":{"rendered":"Quantization-Error-Robust Deep Neural Network for Embedded Accelerators"},"content":{"rendered":"<p>Title : Quantization-Error-Robust Deep Neural Network for Embedded Accelerators<\/p>\n<p>Author: Youngbeom Jung, Hyeonuk Kim, Yeongjae Choi, Lee-Sup Kim<\/p>\n<p>&nbsp;<\/p>\n<p>Journal : <a href=\"https:\/\/ieeexplore.ieee.org\/xpl\/RecentIssue.jsp?punumber=8920\" rel=\"noopener\">IEEE Transactions on Circuits and Systems II: Express Briefs<\/a><\/p>\n<p>&nbsp;<\/p>\n<p>Abstract : Quantization with low precision has become an essential technique for deploying deep neural networks on energy- and memory-constrained devices. However, reducing precision is ultimately limited by the inevitable loss of accuracy caused by quantization error. To overcome this obstacle, we propose methods for reforming and quantizing a network that achieve high accuracy even at low precision, without any runtime overhead on embedded accelerators. Our proposal consists of two analytical approaches: 1) network optimization to find the most error-resilient equivalent network under the precision constraint and 2) quantization exploiting adaptive rounding offset control. The experimental results show accuracies of up to 98.31% and 99.96% of the floating-point results for 6-bit and 8-bit quantized networks, respectively. 
Moreover, our methods enable a lower-precision accelerator design, reducing energy consumption by 8.5%.<\/p>\n<p><img fetchpriority=\"high\" decoding=\"async\" class=\"alignnone size-full wp-image-131996\" src=\"http:\/\/ee.presscat.kr\/wp-content\/uploads\/2022\/07\/\uae40\uc774\uc12d1.png\" alt=\"\" width=\"1496\" height=\"890\" title=\"\" srcset=\"http:\/\/ee.presscat.kr\/wp-content\/uploads\/2022\/07\/\uae40\uc774\uc12d1.png 1496w, http:\/\/ee.presscat.kr\/wp-content\/uploads\/2022\/07\/\uae40\uc774\uc12d1-300x178.png 300w, http:\/\/ee.presscat.kr\/wp-content\/uploads\/2022\/07\/\uae40\uc774\uc12d1-1024x609.png 1024w, http:\/\/ee.presscat.kr\/wp-content\/uploads\/2022\/07\/\uae40\uc774\uc12d1-768x457.png 768w\" sizes=\"(max-width: 1496px) 100vw, 1496px\" \/><\/p>\n","protected":false},"excerpt":{"rendered":"<p>825<\/p>\n","protected":false},"featured_media":0,"template":"","class_list":["post-131995","ai-in-circuit","type-ai-in-circuit","status-publish","hentry"],"acf":[],"_links":{"self":[{"href":"http:\/\/ee.presscat.kr\/en\/wp-json\/wp\/v2\/ai-in-circuit\/131995","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/ee.presscat.kr\/en\/wp-json\/wp\/v2\/ai-in-circuit"}],"about":[{"href":"http:\/\/ee.presscat.kr\/en\/wp-json\/wp\/v2\/types\/ai-in-circuit"}],"wp:attachment":[{"href":"http:\/\/ee.presscat.kr\/en\/wp-json\/wp\/v2\/media?parent=131995"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}