Statistical Learning for High-Dimensional Data: A Comprehensive Approach to Dimensionality Reduction in Machine Learning
DOI: https://doi.org/10.63278/mme.vi.1698

Keywords: Dimensionality Reduction, PCA, LDA, Statistical Analysis, t-SNE

Abstract
Dimensionality reduction is a crucial process in machine learning, particularly when dealing with high-dimensional data. As the number of features increases, models often suffer from overfitting, increased computational cost, and reduced interpretability. This paper explores statistical methods for dimensionality reduction, focusing on Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and t-SNE. These methods aim to preserve the underlying structure of the data while reducing its dimensionality for better model performance. By analyzing the mathematical foundations of these techniques, we evaluate their application across various machine learning models, demonstrating their utility in improving model efficiency and interpretability. Experimental results validate the effectiveness of these statistical methods in practical machine learning tasks.
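The paper's full methodology is not reproduced on this page, but the core idea behind PCA — projecting centered data onto the directions of maximal variance — can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation; the synthetic data and the `pca` helper are assumptions for demonstration only.

```python
import numpy as np

def pca(X, n_components):
    """Project X onto its top principal components (illustrative sketch)."""
    # Center the data: principal directions are eigenvectors of the covariance matrix
    Xc = X - X.mean(axis=0)
    # SVD of the centered matrix yields the principal directions as rows of Vt,
    # ordered by decreasing singular value (i.e., decreasing explained variance)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    # Project onto the leading directions to obtain the reduced representation
    return Xc @ Vt[:n_components].T

# Synthetic example: 200 samples in 5 dimensions, with most variance along one axis
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) @ np.diag([5.0, 1.0, 0.5, 0.1, 0.05])
Z = pca(X, 2)
print(Z.shape)  # (200, 2)
```

Because the singular values are sorted in decreasing order, the first retained component captures at least as much variance as the second, which is the property that makes PCA useful for compressing high-dimensional data.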
License
Copyright (c) 2025 Irsa Sajjad, Sumaira Sharif, Maria Malik, Aysha Qayyum, Sharqa Hashmi

This work is licensed under a Creative Commons Attribution 4.0 International License.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this journal.
- Authors are permitted and encouraged to post their published articles online (e.g., in institutional repositories or on their website, social networks like ResearchGate or Academia), as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).

