ASNet: Introducing Approximate Hardware to High-Level Synthesis of Neural Networks

  • Lucas Klemmer (Speaker)

Activity: Talk or presentation, contributed talk (science-to-science)

Description

Approximate Computing is a design paradigm that exploits the error tolerance inherent to many applications in order to trade off accuracy for performance. One classic example of such an application is machine learning with Neural Networks (NNs). Recently, LeFlow, a High-Level Synthesis (HLS) flow for mapping TensorFlow NNs into hardware, has been proposed. The main steps of LeFlow are to compile the TensorFlow models into the LLVM Intermediate Representation (IR), perform several transformations, and feed the result into an HLS tool. In this work, we take HLS-based NN synthesis one step further by integrating hardware approximation. To achieve this goal, we extend LeFlow such that (a) the user can specify hardware approximations, and (b) the user can analyze the impact of hardware approximation already at the software level. Based on the exploration results that satisfy the NN quality expectations, we import the chosen approximate hardware components into an extended version of the HLS tool to finally synthesize the NN to Verilog. The experimental evaluation demonstrates the advantages of our proposed ASNet for several NNs: significant area reductions as well as improvements in operating frequency are achieved.
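The abstract does not name the specific approximate hardware components used, but the underlying trade-off can be illustrated with a classic example from the approximate-computing literature: the Lower-part OR Adder (LOA), which replaces the carry chain of the low-order bits with a plain bitwise OR. The sketch below is purely illustrative (the function name and bit widths are hypothetical, not from the paper) and shows how accuracy can be traded for simpler hardware at the software level:

```python
def loa_add(a: int, b: int, k: int, width: int = 8) -> int:
    """Illustrative Lower-part OR Adder (LOA), a classic approximate adder.

    The low k bits are approximated with a bitwise OR (no carry
    propagation), while the upper bits are added exactly. In hardware,
    dropping the low-part carry chain saves area and shortens the
    critical path, at the cost of a bounded arithmetic error.
    """
    mask = (1 << k) - 1
    low = (a & mask) | (b & mask)        # approximate low part: OR instead of add
    high = ((a >> k) + (b >> k)) << k    # exact high part
    return (high | low) & ((1 << width) - 1)

# Software-level error analysis against the exact adder,
# as enabled by the proposed flow before committing to hardware:
for a, b in [(23, 45), (100, 27), (15, 15)]:
    exact = (a + b) & 0xFF
    approx = loa_add(a, b, k=3)
    print(f"{a}+{b}: exact={exact}, approx={approx}, error={exact - approx}")
```

With k = 0 the adder degenerates to an exact adder, so sweeping k gives a simple quality/cost exploration knob of the kind the flow evaluates at the software level before synthesis.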
Period: 09 Nov 2020
Event title: IEEE International Symposium on Multiple-Valued Logic (ISMVL) 2020
Event type: Conference
Location: Austria

Fields of science

  • 202017 Embedded systems
  • 202005 Computer architecture
  • 102005 Computer aided design (CAD)
  • 102 Computer Sciences
  • 102011 Formal languages

JKU Focus areas

  • Digital Transformation