Cascade-Correlation Algorithm with Trainable Activation Functions


  •  Fanjun Li    
  •  Ying Li    

Abstract

Exploiting the property that the higher-order derivatives of certain base functions can be expressed in terms of the primitive function and its lower-order derivatives, this paper proposes a cascade-correlation algorithm with tunable activation functions. The base functions and their higher-order derivatives are used to construct the tunable activation functions in the cascade-correlation algorithm, and both parallel and series schemes for constructing the activation functions are introduced. The model can simplify the neural network architecture, speed up the convergence rate, and improve generalization. Its efficiency is demonstrated on the two-spiral classification and Mackey-Glass time-series prediction problems.
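The key property the abstract relies on can be illustrated with the logistic sigmoid: every higher-order derivative of σ(x) is a polynomial in σ(x) itself (e.g. σ′ = σ(1 − σ)), so a tunable activation can be evaluated from the primitive function alone. The sketch below is illustrative, not the paper's exact construction: it builds those polynomial coefficients and forms a parallel-style tunable activation f(x) = Σᵢ wᵢ σ⁽ⁱ⁾(x), where the weights wᵢ stand in for the trainable activation parameters.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_poly_coeffs(order):
    """coeffs[n][k] = coefficient of s**k in the n-th derivative of sigmoid,
    where s = sigmoid(x). Uses d/dx s**k = k * (s**k - s**(k+1))."""
    coeffs = [[0.0, 1.0]]  # sigma itself is 1*s
    for _ in range(order):
        prev = coeffs[-1]
        nxt = [0.0] * (len(prev) + 1)
        for k, c in enumerate(prev):
            if c:
                nxt[k] += k * c       # from k * s**k
                nxt[k + 1] -= k * c   # from -k * s**(k+1)
        coeffs.append(nxt)
    return coeffs

def tunable_activation(x, weights):
    """Parallel-style tunable activation (illustrative): a trainable linear
    combination of the sigmoid and its higher-order derivatives,
    f(x) = sum_i weights[i] * sigma^(i)(x), evaluated via sigmoid alone.
    The actual parallel/series schemes in the paper may differ."""
    s = sigmoid(x)
    coeffs = sigmoid_poly_coeffs(len(weights) - 1)
    out = np.zeros_like(x, dtype=float)
    for w, poly in zip(weights, coeffs):
        out += w * sum(c * s**k for k, c in enumerate(poly))
    return out
```

In a cascade-correlation setting, the weights of such a combination would be trained alongside each candidate hidden unit's input weights, which is what makes the activation function itself tunable rather than fixed.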



This work is licensed under a Creative Commons Attribution 4.0 License.
  • ISSN(Print): 1913-8989
  • ISSN(Online): 1913-8997
  • Started: 2008
  • Frequency: quarterly

