FPGA Implementation of Range Addressable Activation Function for Lattice-Ladder Neuron

Authors

  • Tomyslav Sledevic
  • Dalius Navakauskas

DOI:

https://doi.org/10.5755/j01.eie.22.2.14598

Keywords:

Lattice-ladder neuron, nonlinear activation function, transfer function, high-level synthesis, fixed-point arithmetic, FPGA implementation.

Abstract

FPGA implementation of the hyperbolic tangent activation function for multilayer perceptron structures seems attractive; however, preliminary results on the choice of memory size are lacking, particularly when the LUT of the function is stored in dedicated on-chip block RAM. The aim of this investigation was to gain insight into the distortions of the selected neuron model output by evaluating the transfer function RMS error and the mean and maximum errors of the neuron output signal while changing the gain and the memory size of the activation function. To this end, the range addressable activation function for the second-order normalized lattice-ladder neuron was implemented in an Artix-7 FPGA. Various gain and memory constraints were investigated. Increasing the LUT memory size and the gain yielded a smaller output signal error and a reduced nonlinear influence on the transfer function. 2 kB of BRAM is sufficient to achieve a tolerable maximum error of less than 0.4 % while utilizing only 0.36 % of the total on-chip block memory.
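The LUT-based activation described above can be illustrated with a minimal C sketch, written as it might appear in an HLS flow. A 2 kB BRAM holding 1024 16-bit samples matches the memory figure quoted in the abstract; the addressable input range, the gain value, and the Q1.15 output format are illustrative assumptions, not the exact parameters of the paper.

```c
/* Minimal sketch of a range addressable tanh LUT, assuming a 2 kB
 * BRAM storing 1024 x 16-bit fixed-point samples over an assumed
 * input range [-4, 4). Gain, range, and word lengths are
 * illustrative, not the paper's exact parameters. */
#include <math.h>
#include <stdint.h>
#include <stdio.h>

#define LUT_SIZE   1024          /* 1024 x 16 bit = 2 kB of BRAM    */
#define IN_MIN    (-4.0)         /* assumed addressable input range */
#define IN_MAX     (4.0)
#define FRAC_BITS  15            /* Q1.15 output format             */

static int16_t tanh_lut[LUT_SIZE];

/* Fill the LUT once; on the FPGA this table would be a BRAM
 * initialised from the bitstream. The gain scales the argument of
 * tanh, as in the gain sweep investigated in the paper. */
static void init_tanh_lut(double gain)
{
    for (int i = 0; i < LUT_SIZE; i++) {
        double x = IN_MIN + (IN_MAX - IN_MIN) * i / LUT_SIZE;
        tanh_lut[i] =
            (int16_t)lrint(tanh(gain * x) * ((1 << FRAC_BITS) - 1));
    }
}

/* Range addressable lookup: inputs outside [IN_MIN, IN_MAX) saturate
 * to the first/last stored sample, mimicking tanh saturation. */
static int16_t tanh_ra(double x)
{
    int idx = (int)((x - IN_MIN) * LUT_SIZE / (IN_MAX - IN_MIN));
    if (idx < 0)         idx = 0;
    if (idx >= LUT_SIZE) idx = LUT_SIZE - 1;
    return tanh_lut[idx];
}

int main(void)
{
    init_tanh_lut(1.0);  /* unit gain, for illustration only */
    printf("tanh(0.5) ~ %f\n",
           tanh_ra(0.5) / (double)(1 << FRAC_BITS));
    return 0;
}
```

Comparing `tanh_ra(x)` against a double-precision `tanh(x)` over a test signal gives the kind of RMS, mean, and maximum error figures that the paper reports as functions of gain and LUT memory size.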



Published

2016-04-05

How to Cite

Sledevic, T., & Navakauskas, D. (2016). FPGA Implementation of Range Addressable Activation Function for Lattice-Ladder Neuron. Elektronika Ir Elektrotechnika, 22(2), 92-95. https://doi.org/10.5755/j01.eie.22.2.14598

Issue

Vol. 22 No. 2 (2016)

Section

SYSTEM ENGINEERING, COMPUTER TECHNOLOGY