https://github.com/python/peps/blob/3fe4784290a8461dea98efc8fc338662b2992371/pep-0695.rst
PEP 695 proposes a new syntax for declaring type parameters for generic classes, functions, and type aliases.
This will be a big win. The current state of affairs is a recurring source of confusion and is quite unpleasant.
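A quick sketch of what PEP 695 cleans up. The `Stack` class below is an illustrative example (not taken from the PEP itself): it shows today's `TypeVar`/`Generic` boilerplate, with the proposed syntax in comments since that only runs on Python 3.12+.

```python
from typing import Generic, TypeVar

T = TypeVar("T")

# Today's syntax: the type parameter is declared separately with TypeVar,
# then threaded through Generic[T] -- the recurring source of confusion.
class Stack(Generic[T]):
    def __init__(self) -> None:
        self.items: list[T] = []

    def push(self, item: T) -> None:
        self.items.append(item)

    def pop(self) -> T:
        return self.items.pop()

# Under PEP 695 (Python 3.12+), the same class would read:
#
#     class Stack[T]:
#         def __init__(self) -> None:
#             self.items: list[T] = []
#         ...
#
# and a generic type alias would become:
#
#     type IntStack = Stack[int]

s: Stack[int] = Stack()
s.push(42)
print(s.pop())  # -> 42
```

The type parameter lives right on the class header, so there is no separate `TypeVar` declaration to keep in sync.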
Amazing Google AI libraries:
TF: 🤖🧠
TFLite: 🤖🧠+📱
TF.js: 🤖🧠+🕸️
JAX: 🧮⚡️
Flax:🤖🧠+🧮⚡️
TF-Agents: 🤖🧠+🕹️♟️
TFX: 🤖🧠+👔🏭
TFHub: 🤖🧠+🧠🧠
TFDF: 🤖🧠+🌳🌲
TensorBoard: 🤖🧠+📊+(🚫🪲)
TF Recommenders: 🤖🧠+(📚➡️📙)
TF Privacy: 🤖🧠+🔐
TF Probability: 🤖🧠+🎲
Magenta: 🤖🧠+🎨🎶
There are three types of transformer models
1. Encoder Only
2. Decoder Only
3. Encoder-Decoder
👉 Encoder Only
BERT leverages the encoder architecture of the transformer. BERT takes a text sequence as input, and the encoder produces BERT embeddings, which can be used for downstream tasks like text classification or named entity recognition.
👉 Decoder Only
GPT models leverage the decoder architecture of the transformer. Given an input sequence as a prompt, GPT starts generating a response. Therefore, GPT models are best suited for text or sequence generation.
👉 Encoder-Decoder
T5 is a model that uses both the encoder and the decoder architecture. It treats every task as text-to-text, i.e., sequence-to-sequence. E.g., in text classification, the encoder takes text as input, and the decoder generates the text labels instead of predicting a class.
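A concrete way to see the difference between the three families is the attention mask each one uses. A minimal pure-Python sketch (illustrative only, no framework assumed):

```python
def encoder_mask(n: int) -> list[list[int]]:
    # Encoder-only (BERT-style): full bidirectional attention --
    # every token may attend to every other token.
    return [[1] * n for _ in range(n)]

def decoder_mask(n: int) -> list[list[int]]:
    # Decoder-only (GPT-style): causal attention -- token i may
    # attend only to tokens 0..i, so generation can't peek ahead.
    return [[1 if j <= i else 0 for j in range(n)] for i in range(n)]

print(encoder_mask(3))  # [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
print(decoder_mask(3))  # [[1, 0, 0], [1, 1, 0], [1, 1, 1]]
```

An encoder-decoder model like T5 combines both: a full mask over the input on the encoder side, and a causal mask on the decoder side plus cross-attention into the encoder outputs.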
✍️✍️Latest - Work from home is now a LEGAL RIGHT in the Netherlands! 🇳🇱
Dutch employers now MUST consider employee requests to work remotely, as long as their professions allow it...
It's one of the first countries to do so, congrats! 🙌
✍️✍️A new platform launched by Vodafone and Google called AI Booster aims to handle thousands of ML models a day across 18+ countries. AI Booster is the result of 18 months of development and is built upon Google’s Vertex AI and integrates with Vodafone’s Neuron platform.
✍️✍️ Meta, formerly known as Facebook, announced that it has built and open-sourced 'No Language Left Behind' (NLLB-200), a single AI model that is the first to translate across 200 different languages, including 55 African languages, with state-of-the-art results.
Last week AI News
✍️✍️Google no longer accepts deepfake projects on Colab
✍️✍️Zoom receives backlash for emotion-detecting AI
✍️✍️ Apple’s former ML director reportedly joins Google DeepMind
10 VScode extensions every data scientist should have💻🤖
1. Python
2. Pylance
3. Python Indent
4. Jupyter
5. Jupyter notebook renderers
6. DVC - (ML model experiment tracking)
7. Gitlens
8. Todo MD
9. Excel viewer
10. Markdown preview GitHub styling
1. Python
2. Pylance
3. Python Indent
4. Jupyter
5. Jupyter notebook renderers
6. DVC - (ML model experiment tracking)
7. Gitlens
8. Todo MD
9. Excel viewer
10. Markdown preview GitHub styling
Hugging Face released the 🌸 BLOOM 🌸 model today, the world’s largest open multilingual language model!
With its 176 billion parameters, BLOOM is able to generate text in 46 natural languages and 13 programming languages.
This release is the culmination of more than a year of work involving over 1000 researchers from 70 countries and 250 institutions, leading to a final run of 11 weeks of training on the Jean Zay supercomputer in the south of Paris, France.
Researchers can now download, run and study 🌸 BLOOM 🌸 to investigate the performance and behaviour of recently developed large language models down to their deepest internal operations.
For more information
https://bigscience.huggingface.co/blog/bloom
✍️✍️ 77 per cent of the devices used today have some sort of AI-enabled feature. The global market is projected to reach USD 120 billion by 2025. A post-pandemic world only accelerates that demand. AI is both a challenge and an opportunity.
✍️✍️ Yesterday, admins of the PyPI registry announced they were in the process of introducing a two-factor authentication (2FA) requirement for projects deemed "critical."
https://pypi.org/security-key-giveaway/
✍️✍️ An AI bot wrote a scientific paper on itself in 2 hours.
✍️✍️ DeepMind AI learns simple physics like a baby. The software model, named Physics Learning through Auto-encoding and Tracking Objects (PLATO), was fed the raw images from the videos, as well as versions that highlighted each object in the scene. PLATO was designed to develop an internal representation of the physical properties of the objects, such as their positions and velocities.
✍️✍️ Google and Microsoft call 'emotion AI' risky but only limit its usage.
✍️✍️ The Indian Defence Minister launched 75 AI-powered defence products on Monday, July 11th. These products were showcased at the first-ever 'Artificial Intelligence in Defence' (AIDef) symposium.
✍️✍️ Indian Railways to install AI-enabled CCTV cameras by Jan 2023. "All these CCTV cameras under the VSS (Video Surveillance System) at railway stations, which are major hubs of transportation, would be Internet Protocol (IP) based, covering the waiting halls, reservation counters, parking areas, main entrance and exit, platforms, foot over bridges and booking offices," said the railways.
Failure Use Case in AI
✍️✍️ In 2013, the Dutch government deployed artificial intelligence to handle childcare benefits applications and, as you might guess, it did not go well. Ethnic minorities were disproportionately denied benefits and charged with fraud, and the entire imbroglio culminated in the cabinet resigning in January 2021. Now, it seems, it may not have been a fault of the technology as much as of the human beings, or policymakers, operating it.
Today, Python can be used to create native applications for mobile, web, and desktop environments 🐍📱 In @anaconda Inc's next webinar, learn the basics of building & deploying Python anywhere using PyBeeWare.
Register here! 👉bit.ly/3RBE2SU
Native Application Development with Python
Wednesday, July 20, 2022 at 01:00 PM Central Daylight Time.
If anyone wants to join the WhatsApp group for quick updates:
Kindly share the group below with people interested in AI and Data Science; I have been running it for a long time.
https://chat.whatsapp.com/ECDLgJ50bZx7cw9IJ4GoOS
Here are some useful extensions for VS Code when you are a Python Developer 🐍
▶️Python
▶️Python Indent
▶️AREPL for Python
▶️autoDocstring
▶️Kite AutoComplete
▶️TabOut
▶️CodeSnap
▶️advanced-new-file
Introducing PyTorchLive, an easy to use library of tools for creating on-device ML demos on Android and iOS. With Live, you can build a working mobile app ML demo in minutes. https://pytorch.org/blog/introducing-the-playtorch-app/?utm_source=twitter&utm_medium=organic_social&utm_campaign=blog&content=playtorchv0.2launch
Python performance tip:
Specialized functions with simpler signatures tend to beat their more generalized counterparts.
log2(x) and log10(x) are almost twice as fast as log(x) and three times as fast as log(x, 2) or log(x, 2.0)
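A quick way to check this tip on your own machine with `timeit` (the exact ratios vary by platform and Python version, so treat the "twice as fast" figure as a ballpark):

```python
import math
import timeit

x = 123456.789

# Time the specialized functions against the general log();
# print the raw timings so the ratios can be compared directly.
for stmt in ("math.log2(x)", "math.log10(x)", "math.log(x)", "math.log(x, 2)"):
    t = timeit.timeit(stmt, globals={"math": math, "x": x}, number=200_000)
    print(f"{stmt:16s} {t:.4f}s")

# All variants agree to within floating-point error:
print(math.isclose(math.log2(x), math.log(x, 2)))  # True
```

`log(x, 2)` is the slowest because it is computed as `log(x) / log(2)`, paying for two calls plus a division, while `log2` is a single specialized C function.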
⌛⌛When you understand underscores in Python, you will know the following concepts as well.
◌Name Mangling
◌Private variables (there are no true private variables)
◌Dunder or Special or Magic methods
◌Usage of _ (underscore) for variable
✅Single Underscore: _
✏Variable Name and Temp Variable
✅Single Leading Underscore: _x
✏ Naming Convention and internal use
✅Single Trailing Underscore: x_
✏Naming convention to avoid name conflicts with Python keywords like class, def, etc.
✅Double Leading Underscore: __x
✏Name Mangling when used in class context
✅Double Leading and Trailing Underscore: __x__
✏Special methods or Dunder methods or Magic methods
For a video tutorial with examples, if you wish to explore and learn:
https://youtu.be/M8-aCSeYzkc
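A minimal sketch tying these conventions together; the `Account` class and its attributes are made-up names for illustration:

```python
class Account:
    def __init__(self, balance):
        self._balance = balance    # single leading underscore: "internal use" convention only
        self.__secret = "pin1234"  # double leading underscore: name-mangled to _Account__secret

    def __repr__(self):            # dunder (double leading AND trailing): special method hook
        return f"Account(balance={self._balance})"

acct = Account(100)
print(acct._balance)          # 100 -- nothing actually stops access
print(acct._Account__secret)  # pin1234 -- mangling renames, it doesn't hide
print(acct)                   # Account(balance=100), via the __repr__ dunder

# And the throwaway single underscore for an unused variable:
for _ in range(3):
    pass  # loop index intentionally ignored
```

Note that `acct.__secret` raises `AttributeError` from outside the class: mangling rewrites the name only inside the class body, which is why Python is said to have no true private variables.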