Machine Learning And AI
Hi all, and welcome! Join our channel for jobs, the latest programming blogs, and machine learning blogs.
If you have any doubts regarding ML/Data Science, please reach out to me at @ved1104 and subscribe to my channel:
https://youtube.com/@geekycodesin?si=JzJo3WS5E_VFmD1k
๐‡๐จ๐ฐ ๐๐จ๐ž๐ฌ ๐๐‚๐€ ๐ฆ๐š๐ง๐ฎ๐š๐ฅ๐ฅ๐ฒ ๐œ๐จ๐ฆ๐ฉ๐ฎ๐ญ๐ž? ๐€๐ง๐ ๐–๐ก๐ฒ ๐’๐ก๐จ๐ฎ๐ฅ๐ ๐–๐ž ๐Š๐ง๐จ๐ฐ ๐ˆ๐ญ?

In data science, machine learning, and statistics, Principal Component Analysis (PCA) is a dimensionality-reduction method: it transforms a large set of variables into a smaller one that still contains most of the information in the original set.

Reducing the number of variables in a data set naturally comes at the expense of accuracy, but the trick in dimensionality reduction is to trade a little accuracy for simplicity. Smaller data sets are easier to explore and visualize, and machine learning algorithms can analyze them faster when there are no extraneous variables to process.

PCA finds the directions of maximal variance in the data, and it finds them subject to one global constraint: the directions must be mutually orthogonal. In that sense PCA is a global algorithm; every direction, and every new feature it defines, must be orthogonal to all the others.
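
As a quick sanity check, here is a minimal NumPy sketch (toy random data and variable names of my choosing) that computes the principal directions as eigenvectors of the covariance matrix and verifies that they are mutually orthogonal:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))           # toy data: 200 samples, 3 features
Xc = X - X.mean(axis=0)                 # center each feature

# The principal directions are the eigenvectors of the covariance matrix.
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: symmetric input -> orthonormal eigenvectors

# Mutual orthogonality: dot products between distinct directions are ~0,
# so the Gram matrix is the identity up to floating-point error.
print(np.round(eigvecs.T @ eigvecs, 10))
```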

Let's see how we can manually compute PCA given some random table of values (see the illustration); a worked NumPy sketch follows the list of steps below.

๐‘บ๐’•๐’†๐’‘ 1: Standardize the dataset.
๐‘บ๐’•๐’†๐’‘ 2: Calculate the covariance matrix for the features in the dataset.
๐‘บ๐’•๐’†๐’‘ 3: Calculate the eigenvalues and eigenvectors for the covariance matrix.
๐‘บ๐’•๐’†๐’‘ 4: Sort eigenvalues and their corresponding eigenvectors.
๐‘บ๐’•๐’†๐’‘ 5: Calculate eigenvector for each eigenvalue using Cramerโ€™s rule
๐‘บ๐’•๐’†๐’‘ 6: Build eigenvectors matrix
๐‘บ๐’•๐’†๐’‘ 7: Pick k eigenvalues and form a matrix of eigenvectors.
๐‘บ๐’•๐’†๐’‘ 8: Transform the original matrix.

๐Š๐ง๐จ๐ฐ๐ข๐ง๐  ๐ก๐จ๐ฐ ๐ญ๐จ ๐œ๐จ๐ฆ๐ฉ๐ฎ๐ญ๐ž ๐๐‚๐€ ๐ฆ๐š๐ง๐ฎ๐š๐ฅ๐ฅ๐ฒ ๐œ๐š๐ง ๐›๐ž ๐ž๐ฌ๐ฌ๐ž๐ง๐ญ๐ข๐š๐ฅ ๐Ÿ๐จ๐ซ ๐ฌ๐ž๐ฏ๐ž๐ซ๐š๐ฅ:

▸ Conceptual understanding enhances your grasp of the underlying mathematical principles.

▸ Sometimes, we may need to customize the PCA process to suit specific requirements or constraints. Manual computation enables us to adapt PCA and adjust it to our needs as necessary.

▸ Understanding the inner workings of PCA through manual computation can enhance our problem-solving skills in data analysis and dimensionality reduction. We will be better equipped to tackle complex data-related challenges.

▸ A solid grasp of manual PCA can be a foundation for understanding more advanced dimensionality reduction techniques and related machine learning and data analysis methods.

▸ Manual computation can be a valuable educational tool if we teach or learn about PCA. It allows instructors and students to see how PCA works from a foundational perspective.
๐Ÿ‘1
Join my channel for more technical blogs and daily updates
โค1
Hi All,
Welcome to GeekyCodes. Thank you for 500 members in this group. Join our channel for the latest programming blogs, job openings at various organizations, and machine learning blogs.
If you have any doubts regarding ML/Data Science, please reach out to me at @ved1104.
๐Ÿ‘1