Machine Learning
39.3K subscribers
3.87K photos
32 videos
42 files
1.31K links
Machine learning insights, practical tutorials, and clear explanations for beginners and aspiring data scientists. Follow the channel for models, algorithms, coding guides, and real-world ML applications.

Admin: @HusseinSheikho || @Hussein_Sheikho
πŸ“š Introduction to Transformers for NLP (2022)

πŸ”— Download Link: https://file.lu/d/1afC

πŸ’¬ Tags: #Transformers #NLP

⛔️ βž• More interaction = βž• more books

βœ… Click here πŸ‘‰: Surprise 🎁
❀‍πŸ”₯2πŸ‘1
πŸ“š Transformers for Machine Learning (2022)

1⃣ Join the download channel:
https://t.me/+MhmkscCzIYQ2MmM8

2⃣ Download Book: https://t.me/c/1854405158/46

πŸ’¬ Tags: #Transformers

USEFUL CHANNELS FOR YOU
❀‍πŸ”₯5πŸ‘3❀1
πŸ“š Transformers for Natural Language Processing (2022)

1⃣ Join the download channel:
https://t.me/+MhmkscCzIYQ2MmM8

2⃣ Download Book: https://t.me/c/1854405158/265

πŸ’¬ Tags: #Transformers #NLP

πŸ‘7❀2
πŸ“š Natural Language Processing with Transformers (2022)

1⃣ Join the download channel:
https://t.me/+MhmkscCzIYQ2MmM8

2⃣ Download Book: https://t.me/c/1854405158/325

πŸ’¬ Tags: #Transformers #NLP

πŸ‘7❀5πŸ’―2
πŸ“š Transformers (2024)

1⃣ Join the download channel:
https://t.me/+MhmkscCzIYQ2MmM8

2⃣ Download Book: https://t.me/c/1854405158/1588

πŸ’¬ Tags: #Transformers

πŸ‘‰ BEST DATA SCIENCE CHANNELS ON TELEGRAM πŸ‘ˆ
πŸ‘2❀1
Discover an incredible LLM course designed to deepen your understanding of the transformer architecture and its role in building powerful Large Language Models (LLMs). This course breaks down complex concepts into easy-to-grasp modules, making it suitable for both beginners and advanced learners. Dive into the mechanics of attention mechanisms, encoder-decoder processing, and much more. Elevate your AI knowledge and stay ahead in the world of machine learning!

Enroll Free: https://www.deeplearning.ai/short-courses/how-transformer-llms-work/

#LLMCourse #Transformers #MachineLearning #AIeducation #DeepLearning #TechSkills #ArtificialIntelligence

https://t.me/DataScienceM
πŸ‘5
Last week we introduced how transformer LLMs work; this week we go deeper into one of their key elements, the attention mechanism, in a new #OpenSourceAI course: Attention in Transformers: Concepts and #Code in #PyTorch
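As a taste of what the course covers, the scaled dot-product attention at the heart of transformers can be sketched in a few lines of NumPy (a minimal illustration; the course itself works in PyTorch):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights      # each output row is a weighted mix of value rows

# Toy example: 2 queries attending over 3 keys/values of dimension 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4)
```

The attention weights in each row sum to 1, so every output is a convex combination of the values.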

Enroll Free: https://www.deeplearning.ai/short-courses/attention-in-transformers-concepts-and-code-in-pytorch/

#LLMCourse #Transformers #MachineLearning #AIeducation #DeepLearning #TechSkills #ArtificialIntelligence

https://t.me/DataScienceM
❀4πŸ‘3
# Learning rate scheduler for transformers ("Attention Is All You Need" schedule)
def lr_schedule(step, d_model=512, warmup_steps=4000):
    # step must be >= 1 to avoid division by zero
    arg1 = step ** -0.5                    # decay term, dominates after warmup
    arg2 = step * (warmup_steps ** -1.5)   # linear warmup term, dominates early
    return (d_model ** -0.5) * min(arg1, arg2)
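A quick sanity check of the schedule's shape: the learning rate rises linearly during warmup, peaks at `warmup_steps`, then decays proportionally to the inverse square root of the step count.

```python
# Transformer LR schedule: linear warmup, then inverse-sqrt decay
def lr_schedule(step, d_model=512, warmup_steps=4000):
    return (d_model ** -0.5) * min(step ** -0.5, step * warmup_steps ** -1.5)

early = lr_schedule(100)     # mid-warmup
peak = lr_schedule(4000)     # at warmup_steps, both terms are equal
late = lr_schedule(40000)    # long after warmup
print(early < peak)  # True
print(late < peak)   # True
```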


---

### **πŸ“Œ What's Next?**
In **Part 5**, we'll cover:
➑️ Generative Models (GANs, VAEs)
➑️ Reinforcement Learning with PyTorch
➑️ Model Optimization & Deployment
➑️ PyTorch Lightning Best Practices

#PyTorch #DeepLearning #NLP #Transformers πŸš€

Practice Exercises:
1. Implement a character-level language model with LSTM
2. Add attention visualization to a sentiment analysis model
3. Build a transformer from scratch for machine translation
4. Compare teacher forcing ratios in seq2seq training
5. Implement beam search for decoder inference

# Character-level LSTM starter
import torch.nn as nn

class CharLSTM(nn.Module):
    def __init__(self, vocab_size, hidden_size, n_layers):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.lstm = nn.LSTM(hidden_size, hidden_size, n_layers, batch_first=True)
        self.fc = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, hidden=None):
        x = self.embed(x)                   # (batch, seq) -> (batch, seq, hidden)
        out, hidden = self.lstm(x, hidden)
        return self.fc(out), hidden         # logits over the vocabulary per position
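For exercise 5, the core of beam search is independent of the model: at each step, expand every partial sequence by every token and keep only the `beam_width` highest-scoring candidates. A minimal sketch over a hypothetical `step_fn` that returns per-token log-probabilities (in practice this would be a decoder forward pass):

```python
import math

def beam_search(step_fn, vocab_size, beam_width=3, max_len=5):
    """step_fn(seq) -> list of log-probs, one per token (illustrative stand-in for a decoder)."""
    beams = [([], 0.0)]  # (token sequence, cumulative log-prob)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            log_probs = step_fn(seq)
            for tok in range(vocab_size):
                candidates.append((seq + [tok], score + log_probs[tok]))
        # Keep only the beam_width best partial sequences
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams[0][0]  # highest-scoring sequence

# Toy "model" that always favors token (len(seq) % 3)
def toy_step(seq):
    favored = len(seq) % 3
    return [math.log(0.8) if t == favored else math.log(0.1) for t in range(3)]

print(beam_search(toy_step, vocab_size=3))  # [0, 1, 2, 0, 1]
```

A real implementation would also track end-of-sequence tokens and length-normalize scores, but the expand-score-prune loop above is the essential mechanism.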
🌟 Vision Transformer (ViT) Tutorial – Part 1: From CNNs to Transformers – The Revolution in Computer Vision

Let's start: https://hackmd.io/@husseinsheikho/vit-1

#VisionTransformer #ViT #DeepLearning #ComputerVision #Transformers #AI #MachineLearning #NeuralNetworks #ImageClassification #AttentionIsAllYouNeed

βœ‰οΈ Our Telegram channels: https://t.me/addlist/0f6vfFbEMdAwODBk

πŸ“± Our WhatsApp channel: https://whatsapp.com/channel/0029VaC7Weq29753hpcggW2A
❀3πŸ‘1
🌟 Vision Transformer (ViT) Tutorial – Part 2: Implementing ViT from Scratch in PyTorch

Let's start: https://hackmd.io/@husseinsheikho/vit-2

#VisionTransformer #ViTFromScratch #PyTorch #DeepLearning #ComputerVision #Transformers #AI #MachineLearning #CodingTutorial #AttentionIsAllYouNeed


βœ‰οΈ Our Telegram channels: https://t.me/addlist/0f6vfFbEMdAwODBk

πŸ“± Our WhatsApp channel: https://whatsapp.com/channel/0029VaC7Weq29753hpcggW2A
🌟 Vision Transformer (ViT) Tutorial – Part 3: Pretraining, Transfer Learning & Real-World Applications

Let's start: https://hackmd.io/@husseinsheikho/vit-3

#VisionTransformer #TransferLearning #HuggingFace #ImageNet #FineTuning #AI #DeepLearning #ComputerVision #Transformers #ModelZoo


βœ‰οΈ Our Telegram channels: https://t.me/addlist/0f6vfFbEMdAwODBk
🌟 Vision Transformer (ViT) Tutorial – Part 5: Efficient Vision Transformers – MobileViT, TinyViT & Edge Deployment

Read lesson: https://hackmd.io/@husseinsheikho/vit-5

#MobileViT #TinyViT #EfficientViT #EdgeAI #ModelOptimization #ONNX #TensorRT #TorchServe #DeepLearning #ComputerVision #Transformers

βœ‰οΈ Our Telegram channels: https://t.me/addlist/0f6vfFbEMdAwODBk

πŸ“± Our WhatsApp channel: https://whatsapp.com/channel/0029VaC7Weq29753hpcggW2A