Dimensionality reduction is the process of reducing the number of random variables under consideration. This study plan runs for 10 weeks, with two meetings per week of about an hour each.
## Introduction to Dimensionality Reduction (0.5 Weeks)
- Motivations for dimensionality reduction
- Advantages of dimensionality reduction
- Disadvantages of dimensionality reduction
## Feature Selection (3 Weeks)
Feature selection is the process of selecting a subset of relevant features. The central premise of this technique is that many features are either redundant or irrelevant and can therefore be removed without much loss of information.
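To make that premise concrete, here is a minimal sketch of filter-style selection, assuming NumPy is available; the function name `select_features`, the thresholds, and the toy data are all invented for illustration. It drops near-constant (irrelevant) columns and then one column from each highly correlated (redundant) pair:

```python
import numpy as np

def select_features(X, var_threshold=1e-3, corr_threshold=0.95):
    """Toy filter-style selection: drop near-constant columns, then drop one
    column from every highly correlated pair. Thresholds are illustrative."""
    X = np.asarray(X, dtype=float)
    keep = np.where(X.var(axis=0) > var_threshold)[0]     # remove (near-)constant features
    corr = np.abs(np.corrcoef(X[:, keep], rowvar=False))  # pairwise |correlation|
    drop = set()
    for i in range(len(keep)):
        if i in drop:
            continue
        for j in range(i + 1, len(keep)):
            if j not in drop and corr[i, j] > corr_threshold:
                drop.add(j)                                # j is redundant given i
    return keep[[i for i in range(len(keep)) if i not in drop]]

# Toy data: column 2 is constant (irrelevant) and column 3 duplicates column 0 (redundant).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:, 2] = 1.0
X[:, 3] = X[:, 0]
print(select_features(X))   # indices of the retained columns: [0 1 4]
```

In practice the thresholds would be chosen by validation rather than by hand.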
### Feature Subset Search Techniques (0.5 Weeks)
- Stepwise Regression (a forward-selection sketch follows this list)
- Branch and Bound
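As a rough sketch of the stepwise idea above (forward direction only; the helper `forward_stepwise`, its training-error criterion, and the synthetic data are assumptions made here for illustration, since classical stepwise regression adds or removes terms using a statistical test):

```python
import numpy as np

def forward_stepwise(X, y, max_features=2):
    """Greedy forward selection: repeatedly add the feature that most reduces
    the least-squares fitting error. Illustrative criterion only."""
    n, d = X.shape
    selected = []
    for _ in range(max_features):
        best_j, best_err = None, np.inf
        for j in range(d):
            if j in selected:
                continue
            A = np.column_stack([np.ones(n), X[:, selected + [j]]])  # intercept + candidate subset
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            err = np.sum((y - A @ beta) ** 2)
            if err < best_err:
                best_j, best_err = j, err
        selected.append(best_j)
    return selected

# Toy data: only features 0 and 3 actually drive the response.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.normal(size=300)
print(forward_stepwise(X, y))   # [0, 3]
```

Branch and bound, by contrast, searches the space of feature subsets exhaustively but prunes branches that provably cannot beat the best subset found so far.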
## Feature Extraction (6 Weeks)
Feature extraction transforms data from a high-dimensional space into a space of fewer dimensions. As a result, far fewer resources are needed to describe the same set of data.
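A rough illustration of that resource argument, with made-up sizes and an arbitrary (unlearned) linear map `W` standing in for whatever a real method would produce:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 10_000, 1_000, 50      # 10k samples, 1,000 original features, 50 extracted features
X = rng.normal(size=(n, d))      # data in the high-dimensional space
W = rng.normal(size=(d, k))      # placeholder linear map; a real method would learn this from X
Z = X @ W                        # the extracted features live in the lower-dimensional space

original = X.size                # n * d = 10,000,000 stored values
reduced = Z.size + W.size        # n * k + d * k = 550,000 stored values
print(f"{original:,} -> {reduced:,} values ({reduced / original:.1%} of the original)")
```

The methods in the next two sub-sections differ in how this map is chosen and in whether it is linear at all.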
### Linear Dimensionality Reduction (3 Weeks)
- Principal Component Analysis (PCA), illustrated in the sketch after this list
- Singular Value Decomposition (SVD)
- Non-Negative Matrix Factorization
- Linear Discriminant Analysis (LDA)
- Multidimensional Scaling (MDS)
- Canonical Correlation Analysis (CCA) [If Time Permits]
- Linear Independent Component Analysis [If Time Permits]
- Factor Analysis [If Time Permits]
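Here is a minimal sketch of how the first two items relate, assuming NumPy is available; `pca_via_svd` and the toy data are constructions for this outline, not library code. PCA can be obtained from the SVD of the centred data matrix, and projecting onto the leading components gives the reduced representation:

```python
import numpy as np

def pca_via_svd(X, n_components):
    """Project X (n_samples x n_features) onto its top principal components.
    Returns the reduced coordinates, the components, and the variance kept."""
    Xc = X - X.mean(axis=0)                            # centre each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)  # SVD of the centred data
    components = Vt[:n_components]                     # principal directions (rows)
    scores = Xc @ components.T                         # coordinates in the reduced space
    explained = (S[:n_components] ** 2).sum() / (S ** 2).sum()
    return scores, components, explained

# Toy data: 3 observed features, but the third is almost a mix of the first two.
rng = np.random.default_rng(0)
A = rng.normal(size=(500, 2))
X = np.column_stack([A, A @ [0.5, -1.2] + 0.01 * rng.normal(size=500)])
scores, components, explained = pca_via_svd(X, n_components=2)
print(scores.shape, f"variance kept: {explained:.4f}")   # (500, 2), nearly all of the variance
```

The other linear methods on the list also produce a linear map, but choose it under different criteria or constraints (class separation for LDA, non-negativity for NMF, and so on).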
### Non-Linear Dimensionality Reduction (3 Weeks)
One common approach to this simplification is to assume that the data of interest lie on a non-linear manifold embedded within the higher-dimensional space.
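A toy illustration of that assumption, with data generated here purely for demonstration: the three observed coordinates are all smooth functions of one hidden parameter, so the points trace out a one-dimensional curve (a helix) embedded in 3-D.

```python
import numpy as np

# One intrinsic degree of freedom t, observed through three non-linear coordinates.
rng = np.random.default_rng(0)
t = rng.uniform(0, 4 * np.pi, size=1000)              # hidden manifold coordinate
X = np.column_stack([np.cos(t), np.sin(t), 0.2 * t])  # a helix embedded in 3-D
X += 0.01 * rng.normal(size=X.shape)                  # small measurement noise

print(X.shape)   # (1000, 3): three observed features, but only one intrinsic dimension
```

Non-linear methods aim to recover a low-dimensional coordinate like `t` from the observed `X` alone.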