Modified Backpropagation Algorithm with Multiplicative Calculus in Neural Networks

Authors

  • Serkan Ozbay, Department of Electrical and Electronics Engineering, Gaziantep University, Gaziantep, Turkey

DOI:

https://doi.org/10.5755/j02.eie.34105

Keywords:

Backpropagation, Local minima problem, Multiplicative calculus, Neural networks

Abstract

Backpropagation is one of the most widely used algorithms for training feedforward deep neural networks. The algorithm requires a differentiable activation function, and it computes the gradient by proceeding backwards through the network from the last layer to the first. To calculate the gradient at a specific layer, the gradients of the subsequent layers are combined via the chain rule of calculus. One of the biggest disadvantages of backpropagation is that it requires a large amount of training time. To overcome this issue, this paper proposes a modified backpropagation algorithm based on multiplicative calculus. Multiplicative calculus provides an alternative to classical calculus: it defines new kinds of derivatives and integrals in multiplicative form, rather than in terms of addition and subtraction. The performance analyses are discussed in various case studies, and the results are compared with those of the classical backpropagation algorithm. It is found that the proposed modified backpropagation algorithm converges to the solution in less time, and thus provides fast training in the given case studies. It is also shown that the proposed algorithm avoids the local minima problem.
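
The abstract does not reproduce the modified update rule itself. Purely as an illustration of the underlying idea, the sketch below contrasts the classical additive gradient step with a multiplicative step built from the standard definition of the multiplicative (geometric) derivative, f*(w) = exp(f'(w)/f(w)); the toy loss, the step size eta, and all function names are illustrative assumptions, not the paper's notation or method.

    import numpy as np

    def classical_update(w, grad, eta=0.01):
        # Classical backpropagation step: subtract a scaled gradient.
        return w - eta * grad

    def multiplicative_update(w, loss, grad, eta=0.01):
        # Multiplicative (geometric) derivative of the loss with respect
        # to w: L*(w) = exp(L'(w) / L(w)), defined for positive loss values.
        geo_deriv = np.exp(grad / loss)
        # Multiplicative descent: rescale the weight by the geometric
        # derivative raised to the step size, instead of subtracting a term.
        return w / geo_deriv**eta

    # Toy run on the strictly positive loss L(w) = (w - 3)^2 + 1.
    w = np.array([10.0])
    for _ in range(200):
        loss = (w - 3.0)**2 + 1.0
        grad = 2.0 * (w - 3.0)
        w = multiplicative_update(w, loss, grad, eta=0.05)
    print(w)  # approaches the minimiser w = 3

Note that the multiplicative step rescales each weight by a ratio rather than subtracting a term, so it preserves the sign of the weight and requires a positive loss; this is why the toy loss above is kept strictly positive and the iterate starts on the same side of zero as the minimiser.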

Published

2023-06-27

How to Cite

Ozbay, S. (2023). Modified Backpropagation Algorithm with Multiplicative Calculus in Neural Networks. Elektronika Ir Elektrotechnika, 29(3), 55-61. https://doi.org/10.5755/j02.eie.34105

Issue

Vol. 29 No. 3 (2023)

Section

SYSTEM ENGINEERING, COMPUTER TECHNOLOGY