ATTRIBUTE SELECTION MODEL FOR OPTIMAL LOCAL SEARCH AND GLOBAL SEARCH

Authors

  • Mohammad Aizat Basir, School of Informatics and Applied Mathematics (PPIMG), Universiti Malaysia Terengganu, 21030 Kuala Terengganu, Terengganu, Malaysia
  • Faudziah Ahmad, UUM College of Arts and Sciences, Universiti Utara Malaysia, 06010 Sintok, Kedah, Malaysia

DOI:

https://doi.org/10.11113/jt.v78.8487

Keywords:

Attribute selection, reduction algorithm, search methods, classification

Abstract

Attribute selection, also known as feature selection, is an essential process for data sets that contain a large number of input attributes. However, finding the combination of algorithms that produces a good set of attributes remains a challenging task. The aim of this paper is to produce a list of optimal combinations of search methods and reduction algorithms for attribute selection. The research process involves two phases, culminating in that list of combinations; each combination is referred to as a model. Results are reported as the percentage of classification accuracy and the number of selected attributes. Six (6) datasets were used in the experiments. The final output is a list of optimal combinations of search methods and reduction algorithms. The experimental results, obtained on public real-world datasets, reveal that the models consistently perform well on the classification task for the selected datasets. Significant improvement in accuracy and an optimal number of selected attributes were achieved with the listed combinations of algorithms.
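The pairing the abstract describes, a search method combined with an evaluator that scores candidate attribute subsets, can be illustrated with a minimal sketch. This is not the authors' exact model: it pairs one search method (greedy forward selection) with one wrapper evaluator (cross-validated k-NN accuracy) on a public dataset, then reports the two result measures used in the paper, accuracy and number of selected attributes. The dataset, classifier, and target subset size are assumptions chosen for illustration.

```python
# Sketch only: one search method (forward selection) paired with one
# wrapper evaluator (cross-validated k-NN accuracy). The paper evaluates
# many such search/reduction combinations; this shows a single pairing.
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = KNeighborsClassifier()
full_acc = clf.fit(X_tr, y_tr).score(X_te, y_te)  # accuracy on all attributes

# Forward search: grow the subset one attribute at a time, keeping the
# candidate that maximizes 5-fold cross-validated accuracy (the wrapper
# evaluation). Subset size 5 is an arbitrary choice for the sketch.
selector = SequentialFeatureSelector(
    clf, n_features_to_select=5, direction="forward", cv=5)
selector.fit(X_tr, y_tr)

reduced_acc = clf.fit(selector.transform(X_tr), y_tr).score(
    selector.transform(X_te), y_te)

n_selected = int(selector.get_support().sum())
print(f"attributes: {X.shape[1]} -> {n_selected}")
print(f"accuracy:   {full_acc:.3f} -> {reduced_acc:.3f}")
```

Swapping in a different search method (e.g. a genetic or tabu search) or a different evaluator (e.g. a correlation-based filter) yields another "model" in the paper's sense, and the same two measures allow the combinations to be compared.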

References

Jensen, R., and Shen, Q. 2003. Finding Rough Set Reducts with Ant Colony Optimization. Proceedings of the 2003 UK Workshop on Computational Intelligence. 1(2): 15-22.

Suguna, N., and Thanushkodi, K. 2010. A Novel Rough Set Reduct Algorithm for Medical Domain Based on Bee Colony. 2(6): 49-54.

Yue, B., Yao, W., Abraham, A., and Liu, H. 2007. A New Rough Set Reduct Algorithm Based on Particle Swarm Optimization. 397-406.

Chandrashekar, G. and Sahin, F. 2014. A Survey On Feature Selection Methods. Comput. Electr. Eng. 40(1): 16-28.

Hall, M. A. 1999. Correlation-based Feature Selection for Machine Learning. Methodology. 2(1): 1-5.

Gütlein, M., Frank, E., Hall, M., and Karwath, A. 2009. Large-scale Attribute Selection Using Wrappers. IEEE Symposium on Computational Intelligence and Data Mining, CIDM 2009 - Proceedings. 332-339.

Hamdani, T. M., Won, J.-M., Alimi, A. M., and Karray, F. 2011. Hierarchical Genetic Algorithm with New Evaluation Function and Bi-Coded Representation for the Selection of Features Considering Their Confidence Rate. Applied Soft Computing Journal. 11(2): 2501-2509.

Goldberg, D. E. 1989. Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley.

Battiti, R. 1994. Using Mutual Information For Selecting Features In Supervised Neural Net Learning. IEEE Trans. Neural Networks. 5(4): 537-550.

Hedar, A. R., Wang, J., and Fukushima, M. 2008. Tabu Search For Attribute Reduction In Rough Set Theory. Soft Comput. 12(9): 909-918.

Kohavi, R. and John, G. 1997. Wrappers For Feature Subset Selection. Artif. Intell. 97(1): 273-324.

Yu, L., and Liu, H. 2003. Feature Selection for High-Dimensional Data: A Fast Correlation-Based Filter Solution. Int. Conf. Mach. Learn. 1-8.

Kononenko, I. 1994. Estimating attributes: Analysis and extensions of RELIEF. Machine Learning: ECML-94. 784: 171-182.

Kira, K., and Rendell, L. A. 1992. A Practical Approach to Feature Selection. Machine Learning Proceedings 1992. 249-256.

Liu, H., and Setiono, R. 1995. Chi2: Feature Selection and Discretization of Numeric Attributes. Proc. 7th IEEE Int. Conf. Tools with Artif. Intell.

Seymour, K., You, H., and Dongarra, J. 2008. A Comparison of Search Heuristics for Empirical Code Optimization. Proceedings - IEEE International Conference on Cluster Computing. 421-429.

Kisuki, T., Knijnenburg, P. M. W., and O'Boyle, M. F. P. 2000. Combined Selection of Tile Sizes and Unroll Factors Using Iterative Compilation. Proc. 2000 Int. Conf. Parallel Architectures and Compilation Techniques. 237.

Norris, B., Hartono, A., and Gropp, W. 2007. Annotations for Productivity and Performance Portability. Petascale Computing: Algorithms and Applications. 443-462.

Whaley, R. C., and Dongarra, J. J. 1998. Automatically Tuned Linear Algebra Software. Proc. ACM/IEEE Conf. Supercomput. 1-27.

Qasem, A., Kennedy, K., and Mellor-Crummey, J. 2006. Automatic Tuning of Whole Applications Using Direct Search and a Performance-Based Transformation System. J. Supercomput. 36(2): 183-196.

Tabatabaee, V., Tiwari, A., and Hollingsworth, J. K. 2005. Parallel Parameter Tuning for Applications with Performance Variability. Proceedings of the ACM/IEEE 2005 Supercomputing Conference.

Tiwari, A., Chen, C., Chame, J., Hall, M., and Hollingsworth, J. K. 2009. A Scalable Auto-Tuning Framework for Compiler Optimization. Proceedings of the 2009 IEEE International Parallel and Distributed Processing Symposium.

Ye, D., Chen, Z., and Liao, J. 2007. A New Algorithm for Minimum Attribute Reduction Based on Binary Particle Swarm Optimization with Vaccination. PAKDD. 1029-1036.

Yu, H., Wang, G., and Lan, F. 2008. Rough Sets and Current Trends in Computing. Berlin, Heidelberg: Springer Berlin Heidelberg.

Jiang, Y. and Liu, Y. 2006. An Attribute Reduction Method Based on Ant Colony Optimization. 6th World Congress on Intelligent Control and Automation. 1: 3542-3546.

Suguna, N., and Thanushkodi, K. G. 2011. An Independent Rough Set Approach Hybrid with Artificial Bee Colony Algorithm for Dimensionality Reduction. 8(3): 261-266.

Chouchoulas, A. and Shen, Q. 2001. Rough Set-Aided Keyword Reduction For Text Categorization. Applied Artificial Intelligence. 15: 843-873.

Zhang, M., Shao, C., Li, F., Gan, Y., and Sun, J. 2006. Evolving Neural Network Classifiers and Feature Subset Using Artificial Fish Swarm. 2006 IEEE International Conference on Mechatronics and Automation, ICMA. 1598-1602.

Narendra, P. M. and Fukunaga, K. 1977. A Branch and Bound Algorithm for Feature Subset Selection. IEEE Trans. Comput. 26(9): 917-922.

Namsrai, E., Munkhdalai, T., Li, M., Shin, J.-H., Namsrai, O.-E., and Ryu, K. H. 2013. A Feature Selection-based Ensemble Method for Arrhythmia Classification. J Inf Process Syst. 9(1): 31-40.

Montazeri, M., Naji, H. R., and Faraahi, A. 2013. A Novel Memetic Feature Selection Algorithm. The 5th Conference on Information and Knowledge Technology. 295-300.

Aha, D., Murphy, P., Merz, C., Keogh, E., Blake, C., Hettich, S., and Newman, D. 1987. UCI Machine Learning Repository. University of California, Irvine.

Witten, I. H. and Frank, E. 2005. Data Mining: Practical Machine Learning Tools And Techniques.

Published

2016-09-29

Issue

Section

Science and Engineering

How to Cite

ATTRIBUTE SELECTION MODEL FOR OPTIMAL LOCAL SEARCH AND GLOBAL SEARCH. (2016). Jurnal Teknologi (Sciences & Engineering), 78(10). https://doi.org/10.11113/jt.v78.8487