A New Approach for Determining Hyperparameters in Artificial Neural Networks: Enhanced Black Hole Optimization Algorithm