Cascade-Correlation Algorithm with Trainable Activation Functions
- Fanjun Li
- Ying Li
Abstract
Exploiting the fact that higher-order derivatives of certain base functions can be expressed in terms of the primitive functions and their lower-order derivatives, a cascade-correlation algorithm with tunable activation functions is proposed in this paper. The base functions and their higher-order derivatives are used to construct the tunable activation functions in the cascade-correlation algorithm. Parallel and series schemes for constructing the activation functions are introduced. The model can simplify the neural network architecture, speed up convergence, and improve generalization. Its efficiency is demonstrated on the two-spiral classification and Mackey-Glass time series prediction problems.
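The idea of a tunable activation built from a base function and its higher-order derivatives can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the choice of sin(x) as the base function, and the interpretation of the parallel scheme as a trainable weighted sum are all assumptions made for exposition.

```python
import numpy as np

def tunable_activation(x, w):
    """Hypothetical parallel-scheme sketch: a trainable linear combination
    of the base function sin(x) and its first three derivatives.
    The derivatives of sin cycle through cos, -sin, -cos, so each one is
    expressible via the primitive function, as the abstract describes."""
    basis = [np.sin(x), np.cos(x), -np.sin(x), -np.cos(x)]
    return sum(wi * b for wi, b in zip(w, basis))

# With weights [1, 0, 0, 0] this reduces to the plain base function sin(x);
# during training the weights w would be adjusted along with the network weights.
x = np.linspace(-np.pi, np.pi, 5)
w = np.array([0.5, 0.3, 0.1, 0.1])
y = tunable_activation(x, w)
```

A series scheme would instead compose such units rather than sum them; the abstract introduces both variants but does not fix their exact form here.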
- Full Text: PDF
- DOI:10.5539/cis.v4n6p28
Journal Metrics
WJCI (2022): 0.636
Impact Factor 2022 (by WJCI): 0.419
h-index (January 2024): 43
i10-index (January 2024): 193
h5-index (January 2024): N/A
h5-median(January 2024): N/A
( The data was calculated based on Google Scholar Citations. )
Index
- Academic Journals Database
- BASE (Bielefeld Academic Search Engine)
- CiteFactor
- CNKI Scholar
- COPAC
- CrossRef
- DBLP (2008-2019)
- EBSCOhost
- EuroPub Database
- Excellence in Research for Australia (ERA)
- Genamics JournalSeek
- Google Scholar
- Harvard Library
- Infotrieve
- LOCKSS
- Mendeley
- PKP Open Archives Harvester
- Publons
- ResearchGate
- Scilit
- SHERPA/RoMEO
- Standard Periodical Directory
- The Index of Information Systems Journals
- The Keepers Registry
- UCR Library
- Universe Digital Library
- WJCI Report
- WorldCat
Contact
- Chris Lee, Editorial Assistant
- cis@ccsenet.org