Coding Projects
61.1K subscribers
760 photos
1 video
277 files
362 links
Channel specialized in advanced concepts and projects to master:
* Python programming
* Web development
* Java programming
* Artificial Intelligence
* Machine Learning

Managed by: @love_data
50 Projects In 50 Days - HTML, CSS & JavaScript.zip.001
2 GB
50 Projects In 50 Days - HTML, CSS & JavaScript.zip.002
2 GB
50 Projects In 50 Days - HTML, CSS & JavaScript.zip.003
2 GB
50 Projects In 50 Days - HTML, CSS & JavaScript.zip.004
1.3 GB
Building_Chatbots_with_Python_Using_Natural_Language_Processing.pdf
5.2 MB
Building Chatbots with Python
Algorithms-Leetcode-Javascript

Webpack questions/answers you can use to prepare for interviews or test your knowledge.

Creator: Stepan V
Stars ⭐️: 178
Forks: 60
GitHub Repo: https://github.com/styopdev/webpack-interview-questions
john-c-shovic-raspberry-pi-iot-projects-prototyping-2021.epub
5.9 MB
Raspberry Pi IoT Projects
John C. Shovic, 2021
Managing Machine Learning Projects .pdf
9.4 MB
Managing Machine Learning Projects
Simon Thompson, 2022
Feature scaling is one of the most useful and necessary transformations to perform on a training dataset: with very few exceptions, ML algorithms do not perform well when the attributes have very different scales.

Let's talk about it 🧵

There are two very effective techniques to transform all the attributes of a dataset to the same scale:
▪️ Normalization
▪️ Standardization

The two techniques perform the same task but in different ways, and each has its own strengths and weaknesses.

Normalization (min-max scaling) is very simple: values are shifted and rescaled so that they end up in the range 0 to 1.

This is achieved by subtracting the min value from each value and dividing the result by the difference between the max and min values.
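
A quick sketch of that formula in plain NumPy (toy values assumed):

import numpy as np

x = np.array([2.0, 5.0, 10.0, 20.0])          # toy feature values (assumed)
x_norm = (x - x.min()) / (x.max() - x.min())  # shift by the min, divide by the range
print(x_norm)                                  # [0.     0.1667 0.4444 1.    ]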

In contrast, Standardization first subtracts the mean value (so that the values always have zero mean) and then divides the result by the standard deviation (so that the resulting distribution has unit variance).
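
The same toy values, standardized by hand (a sketch, not a full pipeline):

import numpy as np

x = np.array([2.0, 5.0, 10.0, 20.0])   # same toy values (assumed)
x_std = (x - x.mean()) / x.std()        # subtract the mean, divide by the std dev
print(x_std.mean(), x_std.std())        # ~0.0 and 1.0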

More about them:
▪️ Standardization does not bound values to a fixed range such as 0-1, which some algorithms expect.
▪️ Standardization is much less affected by outliers.
▪️ Normalization is sensitive to outliers: a single very large value can squash all the other values into a narrow range such as 0.0-0.2.

Both techniques are implemented in the Scikit-learn Python library and are very easy to use. Check the Google Colab notebook below for a toy example showing how each technique works.

https://colab.research.google.com/drive/1DsvTezhnwfS7bPAeHHHHLHzcZTvjBzLc?usp=sharing
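
If you prefer to try it locally first, here is a minimal Scikit-learn sketch (toy single-feature array assumed; the Colab above has the complete example):

import numpy as np
from sklearn.preprocessing import MinMaxScaler, StandardScaler

X = np.array([[1.0], [4.0], [10.0], [100.0]])   # toy column with one large outlier (assumed)

X_minmax = MinMaxScaler().fit_transform(X)      # rescaled to the range 0-1
X_standard = StandardScaler().fit_transform(X)  # zero mean, unit variance

print(X_minmax.ravel())    # the outlier squashes the other values near 0
print(X_standard.ravel())

In a real project, fit the scaler on the training set only and reuse it to transform the validation/test sets.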

Check the spreadsheet below for another step-by-step example of how to normalize and standardize your data.

https://docs.google.com/spreadsheets/d/14GsqJxrulv2CBW_XyNUGoA-f9l-6iKuZLJMcc2_5tZM/edit?usp=drivesdk

The real benefit of feature scaling shows up when you train a model on a dataset with many features (e.g., m > 10) whose scales differ by orders of magnitude. For neural networks this preprocessing is key, and it also enables gradient descent to converge faster.
Here is the bubble shooter game in Python
Bubble Shooter Game.zip
7.8 MB
Machine Learning Projects 👇