An algorithm for independent component analysis using a general class of copula-based dependence criteria

Document Type: Research Paper

Authors

Department of Statistics, Yazd University, Yazd, Iran

Abstract

    The efficiency of Independent Component Analysis ($\rm ICA$) algorithms depends heavily on the choice of objective function and optimization algorithm. Designing objective functions for $\rm ICA$ requires a foundation of suitable dependence criteria. This paper investigates a general class of dependence criteria based on the copula density function; these criteria characterize independence between two random variables, and their properties are studied. The paper then introduces a novel $\rm ICA$ algorithm based on estimators of the proposed criteria. The performance of the algorithm is compared with existing methods through a Monte Carlo simulation study, which shows significant improvements in the quality of the recovered components. Finally, the algorithm is applied to a set of time series on the international tourism receipts index, where it serves as a pre-processing step within a hybrid clustering procedure built on ${\tt PAM}$. The results show that this pre-processing improves the clustering of countries by their international tourism receipts index.
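    To make the idea of a copula-based criterion concrete, the following is a minimal illustrative sketch, not the estimator or algorithm proposed in the paper: it scores each candidate unmixing by a Cramér-von Mises-type distance between the empirical copula of the two outputs and the independence copula $\Pi(u,v)=uv$, and minimizes that score over rotations of two whitened mixtures. The specific criterion, the rotation-search optimizer, and all names (pseudo_obs, cvm_dependence, ica_two_sources) are assumptions made for illustration only.

    import numpy as np
    from scipy.stats import rankdata

    def pseudo_obs(x):
        # Normalized ranks (pseudo-observations) in (0, 1).
        return rankdata(x) / (len(x) + 1)

    def cvm_dependence(x, y):
        # Cramer-von Mises-type distance between the empirical copula C_n and
        # the independence copula u*v, evaluated at the pseudo-observations.
        # Close to zero (up to sampling noise) when x and y are independent.
        u, v = pseudo_obs(x), pseudo_obs(y)
        C = np.mean((u[None, :] <= u[:, None]) & (v[None, :] <= v[:, None]), axis=1)
        return np.mean((C - u * v) ** 2)

    def rotation(theta):
        # 2-D rotation matrix for angle theta.
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s], [s, c]])

    def ica_two_sources(X, grid=90):
        # Whiten the 2 x n mixture, then choose the rotation angle whose
        # outputs minimize the copula-based dependence criterion.
        Xc = X - X.mean(axis=1, keepdims=True)
        d, E = np.linalg.eigh(np.cov(Xc))
        Z = E @ np.diag(d ** -0.5) @ E.T @ Xc                            # whitened mixtures
        angles = np.linspace(0.0, np.pi / 2, grid, endpoint=False)
        best = min(angles, key=lambda t: cvm_dependence(*(rotation(t) @ Z)))
        return rotation(best) @ Z                                        # estimated sources

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        S = np.vstack([rng.uniform(-1, 1, 500), rng.laplace(size=500)])  # independent sources
        A = np.array([[1.0, 0.6], [0.4, 1.0]])                           # mixing matrix
        S_hat = ica_two_sources(A @ S)                                   # recovered up to order, sign, scale

    In the tourism application described above, components recovered in this way would then feed a ${\tt PAM}$ (k-medoids) clustering of the countries; that stage is omitted from the sketch.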


[1] Abayomi, K., Lall, U., & de la Pena, V. (2009). Copula based independent component analysis. SSRN. http://ssrn.com/abstract=1028822
[2] Amari, S., Cichocki, A., & Yang, H. (1996). A new learning algorithm for blind source separation. Advances in Neural Information Processing Systems, 8, 757-763. https://proceedings.neurips.cc/paper/1995/hash/e19347e1c3ca0c0b97de5fb3b690855a-Abstract.html
[3] Antony, M.J., Sankaralingam, B.P., Mahendran, R.K., Gardezi, A.A., Shafiq, M., Choi, J.G., & Hamam, H. (2022). Classification of EEG using adaptive SVM classifier with CSP and online recursive independent component analysis. Sensors (Basel, Switzerland), 22(19), 7596. https://doi.org/10.3390/s22197596
[4] Bach, F.R., & Jordan, M.I. (2002). Kernel independent component analysis. Journal of Machine Learning Research, 3, 1-48. https://www.jmlr.org/papers/volume3/bach02a/bach02a.pdf
[5] Bouezmarni, T., & Rolin, J.M. (2003). Consistency of the beta kernel density function estimator. The Canadian Journal of Statistics, 31(1), 89-98. https://doi.org/10.2307/3315905
[6] Cardoso, J.F. (1999). High-order contrasts for independent component analysis. Neural Computation, 11(1), 157-192. https://doi.org/10.1162/089976699300016863
[7] Charpentier, A., Fermanian, J.D., & Scaillet, O. (2007). The estimation of copulas: Theory and practice. In: Rank, J. (Ed.), Copulas: From Theory to Application in Finance, 35-64. Risk Books, London.
[8] Chen, S.X. (1999). Beta kernel estimators for density functions. Computational Statistics & Data Analysis, 31(2), 131-145. https://doi.org/10.1016/S0167-9473(99)00010-9
[9] Chen, R.B., Guo, M., Hardle, W.K., & Huang, S.F. (2007). Independent component analysis via copula techniques. SFB 649 Discussion Paper 2008-004, SSRN. https://ssrn.com/abstract=2894312
[10] Comon, P. (1994). Independent component analysis, a new concept?. Signal Processing, 36(3), 287-314. https://doi.org/10.1016/0165-1684(94)90029-9
[11] Guo, C., Jia, H., & Zhang, N. (2008). Time series clustering based on ICA for stock data analysis. In: 2008 4th International Conference on Wireless Communications, Networking and Mobile Computing, IEEE, 1-4. https://doi.org/10.1109/WiCom.2008.2534
[12] Hyvarinen, A. (1999). Fast and robust fixed-point algorithms for independent component analysis. IEEE Transactions on Neural Networks, 10(3), 626-634. https://doi.org/10.1109/72.761722
[13] Hyvarinen, A., & Oja, E. (2000). Independent component analysis: algorithms and applications. Neural Networks, 13(4-5), 411-430. https://doi.org/10.1016/S0893-6080(00)00026-5
[14] Jayabal, V., Badier, J.M., Pizzo, F., Villalon, S.M., Papageorgakis, C., Lopez-Madrona, V., ... & Benar, C.G. (2022). Virtual MEG sensors based on beamformer and independent component analysis can reconstruct epileptic activity as measured on simultaneous intracerebral recordings. NeuroImage, 264, 1-17. https://doi.org/10.1016/j.neuroimage.2022.119681
[15] Keziou, A., Fenniri, H., Ghazdali, A., & Moreau, E. (2014). New blind source separation method of independent/dependent sources. Signal Processing, 104, 319-324. https://doi.org/10.1016/j.sigpro.2014.04.017
[16] Langlois, D., Chartier, S., & Gosselin, D. (2010). An introduction to independent component analysis: Infomax and FASTICA algorithms. Tutorials in Quantitative Methods for Psychology, 6(1), 31-38. https://doi.org/10.20982/tqmp.06.1.p031
[17] Lassance, N., DeMiguel, V., & Vrins, F. (2022). Optimal portfolio diversification via independent component analysis. Operations Research, 70(1), 55-72. https://doi.org/10.1287/opre.2021.2140
[18] Learned-Miller, E.G., & Fisher III, J.W. (2003). ICA using spacings estimates of entropy. Journal of Machine Learning Research, 4, 1271-1295. https://www.jmlr.org/papers/volume4/learned-miller03a/learned-miller03a.pdf
[19] Lee, T.W., Girolami, M., & Sejnowski, T.J. (1999). Independent component analysis using an extended infomax algorithm for mixed subgaussian and supergaussian sources. Neural Computation, 11(2), 417-441. https://doi.org/10.1162/089976699300016719
[20] Lipshutz, D., Pehlevan, C., & Chklovskii, D.B. (2022). Biologically plausible single-layer networks for nonnegative independent component analysis. Biological Cybernetics, 116(5), 557-568. https://doi.org/10.1007/s00422-022-00943-8
[21] Lyu, Q., & Fu, X. (2022). On finite-sample identifiability of contrastive learning-based nonlinear independent component analysis. In: Proceedings of the 39th International Conference on Machine Learning, 162, 14582-14600. Baltimore, Maryland, USA, PMLR. https://proceedings.mlr.press/v162/lyu22a.html
[22] Meng, X., Iraji, A., Fu, Z., Kochunov, P., Belger, A., Ford, J., ... & Calhoun, V.D. (2022). Multimodel order independent component analysis: a data-driven method for evaluating brain functional network connectivity within and between multiple spatial scales. Brain Connectivity, 12(7), 617-628. https://doi.org/10.1089/brain.2021.0079
[23] Moneta, A., & Pallante, G. (2022). Identification of structural VAR models via independent component analysis: a performance evaluation study. Journal of Economic Dynamics and Control, 144, 1-37. https://doi.org/10.1016/j.jedc.2022.104530
[24] Nagler, T. (2014). Kernel methods for Vine copula estimation. Master's Thesis, Technische Universitat Munchen. https://mediatum.ub.tum.de/doc/1231221/1231221.pdf
[25] Nelsen, R.B. (2007). An introduction to copulas, Springer Science & Business Media, New York.
[26] Rahmanishamsi, J., & Dolati, A. (2018). Rank based least-squares independent component analysis. Journal of Statistical Research of Iran, 14(2), 247-266. https://doi.org/10.29252/jsri.14.2.247
[27] Rahmanishamsi, J., Dolati, A., & Aghabozorgi, M.R. (2018). A copula based ICA algorithm and its application to time series clustering. Journal of Classification, 35(2), 230-249. https://doi.org/10.1007/s00357-018-9258-x
[28] Kaufman, L., & Rousseeuw, P.J. (1987). Clustering by means of medoids. In: Proceedings of the Statistical Data Analysis Based on the L1 Norm Conference, 31, 405-416. Neuchatel, Switzerland.
[29] Rincourt, S.L., Michiels, S., & Drubay, D. (2022). Complex disease individual molecular characterization using infinite sparse graphical independent component analysis. Cancer Informatics, 21, 1-16. https://doi.org/10.1177/11769351221105776
[30] Rousseeuw, P.J. (1987). Silhouettes: a graphical aid to the interpretation and validation of cluster analysis. Journal of Computational and Applied Mathematics, 20, 53-65. https://doi.org/10.1016/0377-0427(87)90125-7
[31] Shahina, K., & Pradeep Kumar, T.S. (2022). Similarity-based clustering and data aggregation with independent component analysis in wireless sensor networks. Transactions on Emerging Telecommunications Technologies, 33(7), 1-15. https://doi.org/10.1002/ett.4462
[32] Shang, L., Zhang, Y., & Sun, Z.L. (2022). Palmprint feature extraction utilizing WTAICA in contourlet domain. In: International Conference on Intelligent Computing, 464-471. Xi'an, China. https://doi.org/10.1007/978-3-031-13870-6_39
[33] Sheikh, M.S., & Regan, A. (2022). A complex network analysis approach for estimation and detection of traffic incidents based on independent component analysis. Physica A: Statistical Mechanics and its Applications, 586, 126504. https://doi.org/10.1016/j.physa.2021.126504
[34] Suzuki, T., & Sugiyama, M. (2011). Least-squares independent component analysis. Neural Computation, 23(1), 284-301. https://doi.org/10.1162/NECO_a_00062
[35] Tabanfar, Z., Ghassemi, F., & Moradi, M.H. (2022). Estimating brain periodic sources activities in steady-state visual evoked potential using local Fourier independent component analysis. Biomedical Signal Processing and Control, 71, 103162. https://doi.org/10.1016/j.bspc.2021.103162
[36] Wand, M.P., & Jones, M.C. (1995). Kernel smoothing. Chapman & Hall, Boca Raton.
[37] Zanghaei, A., Abolhassani, M., Ahmadian, A., Ay, M.R., & Saberi, H. (2013). A new method to enhance the clustering algorithm. International Journal of Computer and Electrical Engineering, 5(1), 120-122. https://doi.org/10.7763/IJCEE.2013.V5.677
[38] Zhang, S., & Karunamuni, R.J. (2010). Boundary performance of the beta kernel estimators. Journal of Nonparametric Statistics, 22(1), 81-104. https://doi.org/10.1080/10485250903124984
[39] Zhou, R., Han, J., Li, T., & Guo, Z. (2022). Fast independent component analysis denoising for magnetotelluric data based on a correlation coefficient and fast iterative shrinkage threshold algorithm. IEEE Transactions on Geoscience and Remote Sensing, 60, 1-15. https://doi.org/10.1109/TGRS.2022.3182504