Graph-theoretic methods for unsupervised feature selection

Authors (IJNCPS):

  • Seyed Enayatallah Alavi, Shahid Chamran University of Ahvaz
Keywords: Dimensionality reduction, feature selection, filter methods, graph clustering, community detection, central node

Abstract

Developments in data collection and storage technologies over the last decade have driven rapid growth in high-dimensional datasets. Feature sets in many domains contain irrelevant and redundant features, which degrade the performance of classification algorithms. A feature selection method is therefore proposed to reduce the dimensionality of the problem and improve classification efficiency. This study combines graph clustering with central-node analysis to obtain a graph-theoretic filter feature selection method that can select an appropriate feature subset in an unsupervised setting. The proposed method is compared with well-known and recent feature selection methods, with performance measured using an SVM classifier.
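To make the pipeline described in the abstract concrete, the sketch below builds a weighted feature-similarity graph, clusters it into communities, and keeps the most central feature of each community. The specific choices here (absolute Pearson correlation as the edge weight, an edge threshold of 0.3, Louvain community detection, and weighted degree as the centrality measure) are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np
import networkx as nx

def select_features(X, threshold=0.3, seed=0):
    """X: (n_samples, n_features) array; returns sorted indices of kept features."""
    # Feature-feature similarity: absolute Pearson correlation between columns.
    corr = np.abs(np.corrcoef(X, rowvar=False))
    n_features = corr.shape[0]

    # Build a weighted feature graph; connect features that are similar enough.
    G = nx.Graph()
    G.add_nodes_from(range(n_features))
    for i in range(n_features):
        for j in range(i + 1, n_features):
            if corr[i, j] >= threshold:
                G.add_edge(i, j, weight=float(corr[i, j]))

    # Cluster the feature graph into communities (Louvain, NetworkX >= 2.8).
    communities = nx.community.louvain_communities(G, weight="weight", seed=seed)

    # From each community keep its most central node, here the node with the
    # largest weighted degree; other centrality measures could be swapped in.
    selected = [max(c, key=lambda v: G.degree(v, weight="weight")) for c in communities]
    return sorted(selected)
```

The selected columns can then be passed to a classifier such as an SVM, mirroring the evaluation setup described in the abstract.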

Published
2019-06-01