Design and Implementation of Neural Network Neurons with RadBas, LogSig, and TanSig Activation Functions on FPGA
Abstract. Artificial Neural Networks (ANNs) are utilized in several key areas such as prediction, classification, and motor control. When high performance is needed, FPGA realizations of ANNs are preferred. In this study, we designed and implemented a total of 18 different FPGA-based neurons: 2-, 4-, and 6-input, in biased and non-biased variants, each with three different activation functions requiring the computation of e^x. Our purpose was to show the feasibility of implementing neural networks with exponential activation functions on current FPGAs and to measure the performance of the neurons. The results showed that up to 10 neurons can fit into the smallest Virtex-6 and that the network can be clocked at up to 405 MHz. Ill. 6, bibl. 11, tabl. 2 (in English; abstracts in English and Lithuanian).
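For reference, the three activation functions named in the title can be sketched as floating-point reference models below; the definitions follow the standard (MATLAB-style) forms of radbas, logsig, and tansig, and are only a software baseline, not the paper's fixed-point FPGA implementation:

```python
import math

def radbas(x: float) -> float:
    # Radial basis function: exp(-x^2), peaks at 1 when x = 0
    return math.exp(-x * x)

def logsig(x: float) -> float:
    # Logistic sigmoid: 1 / (1 + exp(-x)), output in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tansig(x: float) -> float:
    # Hyperbolic tangent sigmoid: 2 / (1 + exp(-2x)) - 1,
    # mathematically identical to tanh(x), output in (-1, 1)
    return 2.0 / (1.0 + math.exp(-2.0 * x)) - 1.0
```

All three depend on evaluating e^x, which is why an exponential unit is the core of each neuron's activation stage.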