
Dimensionality Reduction Methods

10 Flashcards

Linear Discriminant Analysis (LDA)

LDA is a supervised method that models the differences between the classes of data. It finds the axes that maximize the separation between multiple classes.
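As a rough illustration (assuming scikit-learn is available; the Iris dataset and variable names are illustrative), LDA used as a supervised dimensionality reducer might look like:

```python
# A minimal sketch of LDA for dimensionality reduction with scikit-learn.
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)      # 150 samples, 4 features, 3 classes
# LDA can produce at most (n_classes - 1) components, here 2.
lda = LinearDiscriminantAnalysis(n_components=2)
X_lda = lda.fit_transform(X, y)        # supervised: uses the labels y
print(X_lda.shape)                     # (150, 2)
```

Unlike PCA, LDA needs class labels, which is why `fit_transform` receives `y`.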

Multidimensional Scaling (MDS)

MDS is a technique for analyzing similarity or dissimilarity data. It aims to place each object in N-dimensional space such that the between-object distances are preserved as well as possible.
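A minimal sketch of metric MDS (assuming scikit-learn; the random data and parameters are illustrative): embed points in 2-D so that pairwise distances are preserved as well as possible.

```python
# Metric MDS: find a 2-D layout whose pairwise distances approximate
# the distances in the original 5-D space.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))           # 60 points in 5-D
mds = MDS(n_components=2, random_state=0)
X_mds = mds.fit_transform(X)
print(X_mds.shape)                     # (60, 2)
```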

Principal Component Analysis (PCA)

PCA works by calculating the eigenvectors and eigenvalues of the covariance matrix to find the principal components. Benefits include noise reduction and finding the underlying structure of the data.
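The eigendecomposition route described above can be sketched "by hand" and cross-checked against scikit-learn's PCA (the data and variable names here are illustrative):

```python
# PCA via the covariance matrix's eigendecomposition, checked against sklearn.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) @ rng.normal(size=(3, 3))  # correlated features

# Manual route: eigenvectors of the covariance matrix are the principal axes.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(eigvals)[::-1]      # sort by explained variance, descending
X_manual = Xc @ eigvecs[:, order[:2]]  # project onto the top 2 components

X_sklearn = PCA(n_components=2).fit_transform(X)
# The two projections agree up to a sign flip per component.
agree = bool(np.allclose(np.abs(X_manual), np.abs(X_sklearn), atol=1e-6))
print(agree)                           # True
```

The sign ambiguity is expected: an eigenvector and its negation span the same principal axis.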

Autoencoders

Autoencoders are neural networks designed to learn an encoding for the data. They work by compressing the input into a latent-space representation and then reconstructing the output from this representation.
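A toy sketch of the compress-then-reconstruct idea in plain NumPy: a linear encoder/decoder pair trained by gradient descent through a 2-D bottleneck (all names and hyperparameters are illustrative; real autoencoders add nonlinearities and use a deep-learning framework).

```python
# A linear autoencoder trained to reconstruct 6-D data through a 2-D bottleneck.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
X = X - X.mean(axis=0)

d, k, lr = X.shape[1], 2, 0.01
W_enc = rng.normal(scale=0.1, size=(d, k))   # encoder weights
W_dec = rng.normal(scale=0.1, size=(k, d))   # decoder weights
baseline_mse = float(np.mean(X ** 2))        # error of predicting all zeros

for _ in range(500):
    Z = X @ W_enc                  # compress into the latent space
    X_hat = Z @ W_dec              # reconstruct from the latent code
    err = X_hat - X
    # Gradient descent on the mean squared reconstruction error.
    g_dec = Z.T @ err / len(X)
    g_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

final_mse = float(np.mean((X @ W_enc @ W_dec - X) ** 2))
print(final_mse < baseline_mse)    # training reduced the reconstruction error
```

With a purely linear encoder/decoder this learns the same subspace as PCA; nonlinear activations are what let autoencoders go beyond it.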

Truncated Singular Value Decomposition (SVD)

Truncated SVD, also known as Latent Semantic Analysis in text processing, reduces dimensionality by transforming data to a lower-dimensional space, preserving only the most significant singular values.
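A minimal sketch with scikit-learn's `TruncatedSVD` (the random matrix stands in for, say, a term-document matrix; names are illustrative):

```python
# Truncated SVD: keep only the components for the largest singular values.
import numpy as np
from sklearn.decomposition import TruncatedSVD

rng = np.random.default_rng(0)
X = rng.random(size=(80, 20))          # e.g. a small term-document-like matrix
svd = TruncatedSVD(n_components=5, random_state=0)
X_red = svd.fit_transform(X)           # project onto the top 5 singular vectors
print(X_red.shape)                     # (80, 5)
```

Unlike PCA, TruncatedSVD does not center the data first, which is why it works well on large sparse matrices.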

t-Distributed Stochastic Neighbor Embedding (t-SNE)

t-SNE converts similarities between data points to joint probabilities and minimizes the Kullback-Leibler divergence between the joint probabilities of the low-dimensional embedding and the high-dimensional data.
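A minimal sketch (assuming scikit-learn; the digits subset and parameters are illustrative):

```python
# t-SNE embedding of a small sample of the digits dataset into 2-D.
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE

X, _ = load_digits(return_X_y=True)
X = X[:200]                            # t-SNE is expensive; keep the sample small
tsne = TSNE(n_components=2, perplexity=30, random_state=0)
X_2d = tsne.fit_transform(X)
print(X_2d.shape)                      # (200, 2)
```

`perplexity` roughly controls the effective number of neighbors each point considers and must be smaller than the number of samples.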

Uniform Manifold Approximation and Projection (UMAP)

UMAP is a manifold learning technique that assumes the data lies on a low-dimensional manifold and approximates its structure. It scales well to large datasets and tends to preserve more of the global data structure than t-SNE.

Isomap

Isomap extends classical MDS to non-linear dimension reduction by using geodesic distances among all points on the manifold instead of straight-line Euclidean distances.
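A minimal sketch on the classic Swiss-roll example (assuming scikit-learn; parameters are illustrative):

```python
# Isomap unrolls the Swiss roll by combining a k-NN graph, shortest-path
# (geodesic) distances, and classical MDS.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

X, _ = make_swiss_roll(n_samples=300, random_state=0)
iso = Isomap(n_neighbors=10, n_components=2)
X_iso = iso.fit_transform(X)
print(X_iso.shape)                     # (300, 2)
```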

Independent Component Analysis (ICA)

ICA aims to represent a multivariate signal as a combination of independent non-Gaussian signals. It is commonly used for blind source separation.
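A minimal blind-source-separation sketch (assuming scikit-learn; the synthetic sources and mixing matrix are illustrative):

```python
# FastICA recovering two mixed non-Gaussian sources, up to order and scale.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * t)                     # source 1: sinusoid
s2 = np.sign(np.cos(3 * t))            # source 2: square wave (non-Gaussian)
S = np.c_[s1, s2]
A = np.array([[1.0, 0.5], [0.5, 1.0]]) # mixing matrix
X = S @ A.T                            # observed mixed signals

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)           # estimated independent sources
print(S_est.shape)                     # (2000, 2)
```

The non-Gaussianity assumption is essential: ICA cannot separate mixtures of purely Gaussian sources.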

Factor Analysis

Factor analysis is a model-based technique that attempts to describe variables as influenced by several latent factors. It reduces dimensionality by modeling the observed variables as linear combinations of potential factors plus noise.
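A minimal sketch of that generative model (assuming scikit-learn; the synthetic factors, loadings, and noise level are illustrative):

```python
# Factor analysis: observed data generated as latent factors times loadings
# plus noise, then re-estimated with sklearn's FactorAnalysis.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
F = rng.normal(size=(300, 2))                  # two latent factors
W = rng.normal(size=(2, 6))                    # factor loadings
X = F @ W + 0.1 * rng.normal(size=(300, 6))    # observed = factors @ loadings + noise

fa = FactorAnalysis(n_components=2, random_state=0)
X_fa = fa.fit_transform(X)                     # estimated factor scores
print(fa.components_.shape)                    # (2, 6) estimated loadings
```

Unlike PCA, factor analysis models per-variable noise variances explicitly, which is why it is described as model-based.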


© Hypatia.Tech. 2024 All rights reserved.