
Title
  • en A study on a low power optimization algorithm for an edge-AI device
Creator
    • en Kaneko, Tatsuya
    • en Orimo, Kentaro
    • en Hida, Itaru
    • en Takamaeda, Shinya
    • NRID 1000060738897
    • Alias en Yamazaki, Shinya
    • en Motomura, Masato
Access Rights metadata only access
Rights
  • en Copyright ©2019 The Institute of Electronics, Information and Communication Engineers
  • https://search.ieice.org/
Subject
  • Other en machine learning
  • Other en edge AI
  • Other en training algorithm
  • Other en backpropagation
  • Other en quantization
  • Other en low power
  • NDC 547
Description
  • Abstract en Although research on the inference phase of edge artificial intelligence (AI) has made considerable progress, the required training phase remains an unsolved problem. Neural network (NN) processing has two phases: inference and training. In the training phase, an NN incurs a high calculation cost, and the required bitwidth in the training phase is several orders of magnitude larger than that in the inference phase. Training algorithms optimized for software are not appropriate for training hardware-oriented NNs. Therefore, we propose a new training algorithm for edge AI: backpropagation (BP) using a ternarized gradient. This ternarized backpropagation (TBP) provides a balance between calculation cost and performance. Empirical results demonstrate that in a two-class classification task, TBP works well in practice and compares favorably with 16-bit BP (Fixed-BP).
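  • Note: The abstract describes backpropagation with a ternarized gradient but does not spell out the ternarization rule; the sketch below is only a minimal illustration of the general idea, assuming a simple fixed-threshold mapping of gradient elements to {-1, 0, +1} and a plain SGD update. The function name, threshold value, and update loop are illustrative assumptions, not the algorithm defined in the linked article.

        # Hypothetical sketch of gradient ternarization (illustration only);
        # the actual TBP algorithm is specified in the article, not here.
        import numpy as np

        def ternarize(grad: np.ndarray, threshold: float = 0.05) -> np.ndarray:
            """Map each gradient element to {-1, 0, +1} with an assumed fixed threshold.

            Elements whose magnitude is below `threshold` become 0; the rest
            keep only their sign, so the backward pass needs no wide multipliers.
            """
            out = np.zeros_like(grad)
            out[grad > threshold] = 1.0
            out[grad < -threshold] = -1.0
            return out

        # Toy usage: drive a plain SGD step with the ternary gradient.
        rng = np.random.default_rng(0)
        w = rng.normal(size=(4, 3))                # toy weight matrix
        g = rng.normal(scale=0.1, size=w.shape)    # toy full-precision gradient
        lr = 0.01
        w -= lr * ternarize(g)                     # update uses only {-1, 0, +1}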
Publisher en The Institute of Electronics, Information and Communication Engineers (IEICE)
Date
    Issued 2019-10
Language
  • eng
Resource Type journal article
Version Type NA
Identifier HDL http://hdl.handle.net/2115/76016
Relation
  • isIdenticalTo DOI https://doi.org/10.1587/nolta.10.373
Journal
    • PISSN 2185-4106
      • en Nonlinear theory and its applications, IEICE
      • Volume 10, Issue 4, Pages 373-389
OAI Date 2023-07-26