Python | Machine Learning | Coding | R
64K subscribers
1.15K photos
72 videos
145 files
808 links
Help and ads: @hussein_sheikho

Discover powerful insights with Python, Machine Learning, Coding, and R: your essential toolkit for data-driven solutions and smart algorithms.

List of our channels:
https://t.me/addlist/8_rRW2scgfRhOTc0

https://telega.io/?r=nikapsOH
Create an Audiobook in Python
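Here's a minimal sketch of one way to do it with the pyttsx3 text-to-speech library (the library choice, file names, and voice settings are assumptions, not necessarily what the original post used):

```python
import pyttsx3

# Read a plain-text book and render it to an audio file.
# "book.txt" is a placeholder for your source text.
with open("book.txt", encoding="utf-8") as f:
    text = f.read()

engine = pyttsx3.init()
engine.setProperty("rate", 150)             # speaking speed in words per minute
engine.save_to_file(text, "audiobook.wav")  # queue the whole text for synthesis
engine.runAndWait()                         # blocks until the file is written
```

pyttsx3 works offline; for more natural-sounding voices you could swap in a cloud TTS service instead.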

More ♥️♥️ = more posts

@CodeProgrammer ♥️
Speech to Text using Python
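A minimal sketch with the SpeechRecognition package (the library choice and input file name are assumptions):

```python
import speech_recognition as sr

recognizer = sr.Recognizer()

# Load an audio file and capture its full contents.
with sr.AudioFile("speech.wav") as source:
    audio = recognizer.record(source)

# Transcribe with Google's free web API (requires an internet connection).
print(recognizer.recognize_google(audio))
```

For long recordings or offline use, an engine such as Whisper or Vosk is a common alternative.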

More ♥️♥️ = more posts

@CodeProgrammer ♥️
This is all you need to train a typical image classifier using TensorFlow! 🚀

Let's break it down step-by-step and see what's happening!
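As a rough sketch, a typical Keras training script looks like this (CIFAR-10 and this exact architecture are assumptions for illustration, not necessarily the code shown in the post):

```python
import tensorflow as tf

# 1) Load a labeled image dataset and scale pixels to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# 2) Stack convolution + pooling blocks, then a small dense classifier.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(32, 32, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10),   # one logit per class
])

# 3) Pick an optimizer and a loss, then train.
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
model.fit(x_train, y_train, epochs=10, validation_data=(x_test, y_test))
```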

More ♥️♥️ = more posts

@CodeProgrammer ♥️
Building a Convolutional Neural Network in PyTorch

https://machinelearningmastery.com/building-a-convolutional-neural-network-in-pytorch/
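In the same spirit as the article, a minimal PyTorch CNN can be sketched like this (the layer sizes and CIFAR-style 32×32 input are assumptions, not the article's exact model):

```python
import torch
import torch.nn as nn

class SimpleCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),               # 32x32 -> 16x16
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),               # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        x = torch.flatten(x, 1)            # keep the batch dimension
        return self.classifier(x)

model = SimpleCNN()
logits = model(torch.randn(1, 3, 32, 32)) # -> shape (1, 10)
```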

More ♥️♥️ = more posts

@CodeProgrammer ♥️

Don't forget to join our other channel:
https://t.me/DataScienceT
How do Transformers work?

All the Transformer models mentioned above (GPT, BERT, BART, T5, etc.) have been trained as language models. This means they have been trained on large amounts of raw text in a self-supervised fashion. Self-supervised learning is a type of training in which the objective is automatically computed from the inputs of the model. That means that humans are not needed to label the data!

This type of model develops a statistical understanding of the language it has been trained on, but it's not very useful for specific practical tasks. Because of this, the general pretrained model then goes through a process called transfer learning. During this process, the model is fine-tuned in a supervised way (that is, using human-annotated labels) on a given task.
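Both stages are easy to see with the 🤗 Transformers library; a quick sketch (the checkpoint and label count are arbitrary choices for illustration):

```python
from transformers import pipeline, AutoModelForSequenceClassification

# 1) The self-supervised pretrained model only has a statistical feel
#    for language, e.g. it can guess a masked word:
unmasker = pipeline("fill-mask", model="bert-base-uncased")
print(unmasker("Transformers are trained on large amounts of [MASK] text."))

# 2) Transfer learning: reuse the same checkpoint with a fresh
#    classification head, ready for supervised fine-tuning on labels.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)
```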

🔗 Read More

🌸 https://t.me/DataScienceT