Applying Piecewise Linear Approximation for DNN Non-Linear Activation Functions to Bfloat16 MACs

Seok Young Kim, Chang Hyun Kim, Seon Wook Kim

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

Abstract

Efficient inference engines for DNN (Deep Neural Network) execution require low power consumption and a small area. However, implementing the activation functions is challenging because their non-linearity demands considerable computing resources. In this paper, we study the non-linearity of these functions and show that a single MAC execution using our PLA (Piecewise Linear Approximation) scheme is sufficient to guarantee accuracy in bfloat16. For evaluation, we applied our proposal to our in-house bfloat16 MACs; on average, the results differed from the ideal values by less than 1 LSB.
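The core dataflow the abstract describes, one table lookup followed by a single MAC (y = a*x + b) per activation, can be illustrated with a short sketch. The following Python snippet is not the authors' implementation: the choice of sigmoid, the 16 uniform segments over [-8, 8], the chord (endpoint) fit, and the emulation of bfloat16 by truncating float32 values are all assumptions made for illustration, and this naive segmentation will not reach the paper's sub-1-LSB accuracy.

```python
import numpy as np

def to_bfloat16(x):
    """Emulate bfloat16 by zeroing the low 16 bits of each float32 value
    (truncation; real hardware typically rounds, so this is approximate)."""
    x = np.asarray(x, dtype=np.float32)
    return (x.view(np.uint32) & np.uint32(0xFFFF0000)).view(np.float32)

# Illustrative PLA table: 16 uniform segments over [-8, 8]. Both choices are
# assumptions; the paper derives its segmentation from the functions'
# non-linearity.
SEGMENTS = 16
LO, HI = -8.0, 8.0
edges = np.linspace(LO, HI, SEGMENTS + 1, dtype=np.float32)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-np.asarray(x, dtype=np.float32)))

def fit_chord_table(f):
    """Per segment, fit the chord through its endpoints: y = a*x + b."""
    a = (f(edges[1:]) - f(edges[:-1])) / (edges[1:] - edges[:-1])
    b = f(edges[:-1]) - a * edges[:-1]
    return to_bfloat16(a), to_bfloat16(b)

A, B = fit_chord_table(sigmoid)

def pla_sigmoid(x):
    """One table lookup plus a single MAC (y = a*x + b) per input."""
    x = to_bfloat16(np.clip(x, LO, HI))
    i = np.clip(((x - LO) / (HI - LO) * SEGMENTS).astype(np.int32),
                0, SEGMENTS - 1)          # select the segment
    return to_bfloat16(A[i] * x + B[i])   # the single multiply-accumulate

xs = np.linspace(-8.0, 8.0, 1001, dtype=np.float32)
print("max abs error:", np.abs(pla_sigmoid(xs) - sigmoid(xs)).max())
```

The point of the sketch is the evaluation path: after the slope/intercept table is built offline, each activation costs one index computation and one bfloat16 multiply-accumulate, which is what lets the scheme reuse an existing MAC unit.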

Original language: English
Title of host publication: 2021 International Conference on Electronics, Information, and Communication, ICEIC 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728191614
Publication status: Published - 2021 Jan 31
Event: 2021 International Conference on Electronics, Information, and Communication, ICEIC 2021 - Jeju, Korea, Republic of
Duration: 2021 Jan 31 - 2021 Feb 3

Publication series

Name: 2021 International Conference on Electronics, Information, and Communication, ICEIC 2021

Conference

Conference: 2021 International Conference on Electronics, Information, and Communication, ICEIC 2021
Country: Korea, Republic of
City: Jeju
Period: 21/1/31 - 21/2/3

Keywords

  • Activation function
  • Bfloat16
  • Deep Neural Networks
  • Piecewise Linear Approximation

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Networks and Communications
  • Information Systems
  • Information Systems and Management
  • Electrical and Electronic Engineering
  • Instrumentation
