Document Type
Research Article
Abstract
Much work has been done to reduce complexity in terms of time and memory space. Feature selection is one such strategy: it reduces system complexity and can be defined as the process of selecting the most important features from the feature space, so that the most useful features are kept and the less useful ones are eliminated. In the field of fault classification and diagnosis, feature selection plays an important role in reducing dimensionality and can sometimes lead to a higher classification rate. This paper presents a comprehensive review of the feature selection process and how it can be carried out. Its primary goal is to examine the strategies that have been used in the selection process, including filter, wrapper, embedded, and metaheuristic methods. Particular attention is given to nature-inspired algorithms that have been applied to feature selection, such as the Particle Swarm, Grey Wolf, Bat, Genetic, Whale, and Ant Colony algorithms. The overall results confirm that feature selection is important in reducing the complexity of any model based on a machine learning algorithm and may sometimes improve the performance of the resulting model.
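As a concrete illustration of the filter approach discussed above, the sketch below ranks each feature by the absolute Pearson correlation with the target and keeps the top k. This is a minimal, hypothetical example (pure Python, not code from the reviewed works); the data and the helper names `pearson` and `filter_select` are assumptions for illustration only.

```python
# Minimal filter-style feature selection: score each feature independently
# of any learning model, then keep the k highest-scoring features.

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def filter_select(X, y, k):
    """Return indices of the k features most correlated with the target y.

    X is a list of samples, each sample a list of feature values.
    """
    cols = list(zip(*X))                       # transpose to per-feature columns
    scores = [abs(pearson(col, y)) for col in cols]
    ranked = sorted(range(len(cols)), key=lambda i: -scores[i])
    return sorted(ranked[:k])                  # indices of the kept features

# Toy data: feature 0 tracks the target exactly, the others are noisier.
X = [[1, 5, 0], [2, 3, 0], [3, 8, 1], [4, 1, 0]]
y = [1, 2, 3, 4]
print(filter_select(X, y, 2))  # → [0, 1]
```

A wrapper method would instead evaluate candidate subsets by training a classifier on each, which is more expensive but accounts for feature interactions; the metaheuristics reviewed in the paper are typically used to search that subset space efficiently.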
Keywords
Feature selection, Filter process, Wrapper, Embedded, Metaheuristic algorithm
How to Cite This Article
Hamad, Zana O.
(2023)
"Review Of Feature Selection Methods Using Optimization Algorithm (Review Paper For Optimization Algorithm),"
Polytechnic Journal: Vol. 12:
Iss.
2, Article 24.
DOI: https://doi.org/10.25156/ptj.v12n2y2022.pp203-214
Publication Date
2-1-2023