

Data Dimensionality Reduction Techniques

6 Flashcards


Principal Component Analysis (PCA)

PCA is a statistical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components. Its purpose is to reduce the dimensionality of the dataset, enhancing interpretability while minimizing information loss.
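As a minimal sketch (not part of the original card), here is PCA applied to synthetic correlated data with scikit-learn; the toy dataset and parameters are illustrative assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA

# Toy data: 100 samples of 5 correlated features generated from a 2-D latent signal
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 2))
X = latent @ rng.normal(size=(2, 5)) + 0.01 * rng.normal(size=(100, 5))

# Orthogonal transformation onto the top 2 principal components
pca = PCA(n_components=2)
Z = pca.fit_transform(X)

print(Z.shape)                              # (100, 2)
print(pca.explained_variance_ratio_.sum())  # close to 1.0: little information lost
```

Because the data truly lie near a 2-D subspace, two components retain almost all of the variance, matching the card's point about minimizing information loss.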

Isomap (Isometric Mapping)

Isomap is a global dimensionality reduction method that computes a lower-dimensional embedding preserving the geodesic distances between all points. It extends classical MDS by replacing straight-line Euclidean distances with geodesic distances estimated along a nearest-neighbor graph, capturing the intrinsic geometry of the data rather than only local similarities.
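A brief illustration (an assumption-laden sketch, not part of the card): unrolling scikit-learn's swiss-roll dataset, where geodesic distance along the sheet differs sharply from straight-line distance:

```python
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# Swiss roll: 3-D points lying on a rolled-up 2-D sheet
X, _ = make_swiss_roll(n_samples=500, random_state=0)

# Geodesic distances are approximated along a 10-nearest-neighbor graph,
# then preserved in a 2-D embedding (the unrolled sheet)
iso = Isomap(n_neighbors=10, n_components=2)
Z = iso.fit_transform(X)
print(Z.shape)  # (500, 2)
```

The neighborhood size (`n_neighbors=10`) is an illustrative choice; too small a value can disconnect the graph, too large a value short-circuits the manifold.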

t-Distributed Stochastic Neighbor Embedding (t-SNE)

t-SNE is a non-linear dimensionality reduction technique well suited to embedding high-dimensional data for visualization in a low-dimensional space of two or three dimensions. It converts similarities between data points into joint probabilities and minimizes the Kullback-Leibler divergence between the joint probabilities of the high-dimensional data and those of the low-dimensional embedding.
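A hedged example (the dataset and perplexity are illustrative assumptions): embedding a subset of scikit-learn's handwritten-digit images into two dimensions for plotting:

```python
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE

# First 300 digit images, 64 pixel features each (subset kept small for speed)
X, y = load_digits(return_X_y=True)
X = X[:300]

# Perplexity sets the effective neighborhood size used when converting
# pairwise similarities to joint probabilities
tsne = TSNE(n_components=2, perplexity=30, init="pca", random_state=0)
Z = tsne.fit_transform(X)
print(Z.shape)  # (300, 2)
```

Scattering `Z` colored by `y` would show the digit classes as separated clusters; note that t-SNE is for visualization and has no `transform` for new, unseen points.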

Multidimensional Scaling (MDS)

MDS is a means of visualizing the level of similarity of individual cases of a dataset. It attempts to model similarity or dissimilarity of data by representing them as distances in a geometric space. The principal purpose is to detect meaningful underlying dimensions that explain observed similarities or dissimilarities.
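A minimal sketch under illustrative assumptions (random data, Euclidean dissimilarities): MDS consumes a precomputed dissimilarity matrix and lays the cases out in a geometric space whose distances approximate it:

```python
import numpy as np
from sklearn.manifold import MDS
from sklearn.metrics import pairwise_distances

# Illustrative data: 50 cases described by 10 variables
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))

# Any dissimilarity measure works; here, pairwise Euclidean distances
D = pairwise_distances(X)

# Find a 2-D configuration whose inter-point distances approximate D
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
Z = mds.fit_transform(D)
print(Z.shape)  # (50, 2)
```

The residual mismatch between the embedding distances and `D` is reported as `mds.stress_`; lower stress means the 2-D layout better explains the observed dissimilarities.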

Autoencoders

Autoencoders are a type of artificial neural network used to learn efficient codings of unlabeled data. The network consists of an encoder that compresses the data and a decoder that reconstructs it. The purpose is to minimize the difference between the input and the reconstructed output, thereby reducing dimensionality.
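A toy sketch of the encode/decode idea in plain NumPy (a linear autoencoder trained by gradient descent; real autoencoders add nonlinearities and use a deep-learning framework — everything here is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
X = X - X.mean(axis=0)                        # center the data

d, k = 8, 2                                   # input dim, bottleneck ("code") dim
W_enc = rng.normal(scale=0.1, size=(d, k))    # encoder weights
W_dec = rng.normal(scale=0.1, size=(k, d))    # decoder weights
lr = 0.05

mse_start = ((X @ W_enc @ W_dec - X) ** 2).mean()

for _ in range(2000):
    Z = X @ W_enc                             # encode: compress 8 dims -> 2
    X_hat = Z @ W_dec                         # decode: reconstruct 8 dims
    err = X_hat - X                           # reconstruction error
    # Gradient steps on the mean squared reconstruction error
    W_dec -= lr * (Z.T @ err) / len(X)
    W_enc -= lr * (X.T @ (err @ W_dec.T)) / len(X)

mse_end = ((X @ W_enc @ W_dec - X) ** 2).mean()
codes = X @ W_enc                             # 2-D code for each sample
print(codes.shape)                            # (200, 2)
```

Training drives `mse_end` below `mse_start`, i.e. the bottleneck codes retain as much of the input as two dimensions allow; with linear layers this recovers the same subspace PCA finds.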

Linear Discriminant Analysis (LDA)

LDA is a supervised dimensionality reduction technique used in machine learning. Unlike PCA, it uses class labels: it projects the features from the higher-dimensional space onto a lower-dimensional space chosen to separate the classes as well as possible, maximizing between-class scatter relative to within-class scatter.
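A short scikit-learn sketch on the Iris dataset (an illustrative choice, not part of the card); note that LDA can produce at most `n_classes - 1` discriminant axes:

```python
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Iris: 150 samples, 4 features, 3 classes
X, y = load_iris(return_X_y=True)

# Supervised projection: at most n_classes - 1 = 2 components here
lda = LinearDiscriminantAnalysis(n_components=2)
Z = lda.fit_transform(X, y)    # labels y guide the projection
print(Z.shape)                 # (150, 2)
```

The contrast with PCA is the use of `y` in `fit_transform`: the axes are chosen for class separability, not for maximum variance.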

© Hypatia.Tech. 2024 All rights reserved.