Design and Implementation of Neural Network Neurons with RadBas, LogSig, and TanSig Activation Functions on FPGA
Abstract. Artificial Neural Networks (ANNs) are used in several key areas such as prediction, classification, and motor control. When high performance is needed, FPGA realizations of ANNs are preferred. In this study, we designed and implemented a total of 18 different FPGA-based neurons: 2-, 4-, and 6-input, biased and non-biased, each with three different activation functions requiring the calculation of e^x. Our purpose was to show the feasibility of implementing neural networks with exponential activation functions on current FPGAs and to measure the performance of the neurons. The results showed that up to 10 neurons can fit into the smallest Virtex-6 and that the network can be clocked at up to 405 MHz. Ill. 6, bibl. 11, tabl. 2 (in English; abstracts in English and Lithuanian).
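For reference, the three activation functions named in the title (the names follow MATLAB's Neural Network Toolbox conventions) can be sketched as a simple software model. This is only an illustrative floating-point reference, not the fixed-point FPGA implementation described in the paper:

```python
import math

def radbas(x: float) -> float:
    """Radial basis activation: e^(-x^2)."""
    return math.exp(-x * x)

def logsig(x: float) -> float:
    """Logistic sigmoid activation: 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def tansig(x: float) -> float:
    """Hyperbolic tangent sigmoid: (e^x - e^-x) / (e^x + e^-x)."""
    return math.tanh(x)

# A neuron computes its activation over the weighted sum of its inputs,
# optionally plus a bias term (the paper's biased/non-biased variants).
def neuron(inputs, weights, bias=0.0, activation=logsig):
    s = sum(i * w for i, w in zip(inputs, weights)) + bias
    return activation(s)
```

All three functions involve e^x, which is why the hardware cost of the exponential dominates the neuron implementations compared in the paper.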
The copyright for the paper in this journal is retained by the author(s) with the first publication right granted to the journal. The authors agree to the Creative Commons Attribution 4.0 (CC BY 4.0) agreement under which the paper in the Journal is licensed.
By virtue of their appearance in this open access journal, papers may be used freely, with proper attribution, in educational and other non-commercial settings, provided the initial publication in the journal is acknowledged.