#NLP #News (by Sebastian Ruder):
* 2020 NLP wish lists
* #HuggingFace + #fastai
* #NeurIPS 2019
* #GPT2 things
* #ML Interviews
blog post: http://newsletter.ruder.io/archive/211277
Data Science by ODS.ai
YouTokenToMe, a new text tokenization tool from the VK team. Meet an enhanced tokenization tool on steroids: it works 7-10 times faster than alternatives on alphabetic languages and 40-50 times faster on logographic languages. Under the hood (see the source)…
New Rust tokenization library from #HuggingFace
Tokenization is the process of converting strings into model input tensors. The library provides BPE, Byte-Level BPE, WordPiece, and SentencePiece tokenization, and computes an exhaustive set of outputs (offset mappings, attention masks, special token masks).
The library has Python and Node.js bindings.
The quoted post above covers another fast #tokenization implementation; we look forward to a speed comparison.
Install:
pip install tokenizers
Github: https://github.com/huggingface/tokenizers/tree/master/tokenizers
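A minimal usage sketch (hedged: the local vocab file path is an assumption, and the API follows the library's BertWordPieceTokenizer class):

# Load a WordPiece tokenizer from a local BERT vocab file (hypothetical path)
# and inspect the exhaustive outputs mentioned above.
from tokenizers import BertWordPieceTokenizer

tokenizer = BertWordPieceTokenizer("bert-base-uncased-vocab.txt")
encoding = tokenizer.encode("Hello, y'all! How are you?")
print(encoding.tokens)               # wordpiece tokens
print(encoding.ids)                  # token ids for model input
print(encoding.offsets)              # (start, end) character offsets per token
print(encoding.attention_mask)       # attention mask
print(encoding.special_tokens_mask)  # marks special tokens like [CLS]/[SEP]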
#NLU #NLP #Transformers #Rust #NotOnlyPython
The latest news from Hugging Face 🤗
[0] Helsinki-NLP
With v2.9.1, 1,008 machine translation models trained with marian-nmt were released, covering 140 different languages.
link to models: https://huggingface.co/models?search=Helsinki-NLP%2Fopus-mt
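To use one of these checkpoints, here is a hedged sketch (the en-de model name is an example picked from the hub, not prescribed by the post; assumes a recent transformers version where tokenizers are callable):

# Translate English to German with a Helsinki-NLP Marian checkpoint.
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-en-de"  # assumed example; 1,000+ pairs exist
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

batch = tokenizer(["Machine translation is fun."], return_tensors="pt")
generated = model.generate(**batch)
print(tokenizer.decode(generated[0], skip_special_tokens=True))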
[1] updated colab notebook with the new Trainer
colab: https://t.co/nGQxwqwwZu?amp=1
[2] NLP — a library to easily share & load data/metrics, already providing access to 99+ datasets!
features:
✓ get them all: built-in interoperability with PyTorch, TensorFlow, pandas, and NumPy
✓ simple, transparent, pythonic API
✓ thrive on large datasets: nlp frees you from RAM limits
✓ smart cache: process once, reuse forever
✓ add your own dataset
colab: https://t.co/37pfogRWIZ?amp=1
github: https://github.com/huggingface/nlp
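A hedged sketch of the API described above ("squad" is an example dataset name, assumed from the library's docs):

# Load a dataset and a metric with the nlp library; data is memory-mapped
# and cached, so large datasets do not need to fit in RAM.
from nlp import load_dataset, load_metric

dataset = load_dataset("squad")   # downloaded once, then reused from the smart cache
metric = load_metric("squad")     # metrics load the same way
print(dataset["train"][0])        # a single example, as a plain dict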
#nlp #huggingface #helsinki #marian #trainer #data #metrics
The Reformer — Pushing the limits of language modeling
Patrick von Platen @ huggingface
The Reformer model was introduced by Kitaev, Kaiser et al. (2020). It is one of the most memory-efficient transformer models for long sequence modeling to date.
The goal of the blog post is to give an in-depth understanding of each of the following four Reformer features:
[0] Reformer self-attention layer — how to efficiently implement self-attention without being restricted to a local context?
[1] Chunked feed forward layers — how to get a better time-memory trade-off for large feed forward layers?
[2] Reversible residual layers — how to drastically reduce memory consumption in training via a smart residual architecture?
[3] Axial positional encodings — how to make positional encodings usable for extremely large input sequences?
This long blog post will give you a better understanding of how the model works and how to set its configuration correctly.
blog post: https://huggingface.co/blog/reformer
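For a quick start, a hedged sketch of running a pretrained Reformer with transformers (the crime-and-punishment checkpoint is an example from the hub, not prescribed by the post):

# Generate text with a pretrained Reformer language model.
from transformers import ReformerModelWithLMHead, ReformerTokenizer

model_id = "google/reformer-crime-and-punishment"  # assumed example checkpoint
tokenizer = ReformerTokenizer.from_pretrained(model_id)
model = ReformerModelWithLMHead.from_pretrained(model_id)

inputs = tokenizer("A few months later", return_tensors="pt")
outputs = model.generate(inputs["input_ids"], max_length=100, do_sample=True)
print(tokenizer.decode(outputs[0]))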
#nlp #reformer #huggingface #transformers
Perceiver IO: a scalable, fully-attentional model that works on any modality
#HuggingFace added Perceiver IO, a neural network capable of working on all kinds of modalities (text, images, audio, video, coordinates, etc.), to the transformers library.
Blog: https://huggingface.co/blog/perceiver
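A hedged sketch of the language variant (the deepmind/language-perceiver checkpoint is an assumption based on the hub; the tokenizer works on raw UTF-8 bytes):

# Run Perceiver IO as a masked language model over bytes.
from transformers import PerceiverTokenizer, PerceiverForMaskedLM

model_id = "deepmind/language-perceiver"  # assumed example checkpoint
tokenizer = PerceiverTokenizer.from_pretrained(model_id)
model = PerceiverForMaskedLM.from_pretrained(model_id)

inputs = tokenizer("Perceiver IO works on any modality.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # one logit vector per input byte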
Hi!
We are the first Telegram Data Science channel.
The channel started as a collection of notable papers, news, and releases shared with members of the Open Data Science (ODS) community. Over the years it has grown into an independent online media outlet supporting the principles of free and open access to information related to Data Science.
Ultimate Posts
* Where to start learning more about Data Science. https://github.com/open-data-science/ultimate_posts/tree/master/where_to_start
* @opendatascience channel audience research. https://github.com/open-data-science/ods_channel_stats_eda
Open Data Science
ODS.ai is an international community of people in any way related to Data Science.
Website: https://ods.ai
Hashtags
Through the years we accumulated a big collection of materials, most of them accompanied by hashtags.
#deeplearning #DL — posts about deep neural networks (>1 layer)
#cv — posts related to Computer Vision: pictures and videos
#nlp #nlu — Natural Language Processing and Natural Language Understanding: texts and sequences
#audiolearning #speechrecognition — audio information processing
#ar — augmented reality content
#rl — Reinforcement Learning (agents, bots, and neural networks capable of playing games)
#gan #generation #generativeart #neuralart — neural art and image generation
#transformer #vqgan #vae #bert #clip #StyleGAN2 #Unet #resnet #keras #Pytorch #GPT3 #GPT2 — specific architectures or frameworks
#coding #CS — software engineering content
#OpenAI #microsoft #Github #DeepMind #Yandex #Google #Facebook #huggingface — specific companies
#productionml #sota #recommendation #embeddings #selfdriving #dataset #opensource #analytics #statistics #attention #machine #translation #visualization
Chats
- Data Science Chat https://t.me/datascience_chat
- ODS Slack (via the invite form on the website)
ODS resources
* Main website: https://ods.ai
* ODS Community Telegram Channel (in Russian): @ods_ru
* ML trainings Telegram Channel: @mltrainings
* ODS Community Twitter: https://twitter.com/ods_ai
Feedback and Contacts
You are welcome to reach administration through telegram bot: @opendatasciencebot
Data Science by ODS.ai
Some stats to put the development of #dalle in perspective: «Used 1000 prompts in Dalle over the last 2 days, about 9 hours each day. Of those, saved ~300. 50 I like enough to share w/ socials. 12 enough to rework for future projects. 3 were perfect,…»
Tips & Tricks on Image Generation
Generating images with AI tools is a skill that can be practiced and improved, so here are a couple of articles covering tips & tricks for generating better images with #midjourney. The most interesting one is the #huggingface prompt generator, which uses an #NLP model to generate sample prompts.
As an example, we tried to reproduce and improve our group avatar following the ideas in the articles. The prompt for the illustration to this post was generated with the query:
ferrofluids in form of a brain, beautiful connections chaos, swirling black network --ar 3:4 --iw 9 --q 2 --s 1250
Midjourney Prompt Generator: https://huggingface.co/spaces/doevent/prompt-generator
List of Midjourney prompts: https://www.followchain.org/midjourney-prompts/
An advanced guide to writing prompts for Midjourney (text-to-image): https://medium.com/mlearning-ai/an-advanced-guide-to-writing-prompts-for-midjourney-text-to-image-aa12a1e33b6
#visualization #gan #generation #generativeart #aiart #artgentips
Forwarded from Machinelearning
1. A distillation guide from OpenAI
The guide contains a detailed description of the process of transferring knowledge from a larger model to a compact one while preserving high model performance.
The main aspects covered in the guide:
- Storing the large model's outputs: building a dataset of the large model's predictions to be used for training the smaller model.
- Evaluating model performance: a comparative analysis of the accuracy and efficiency of both the large and the compact model across various metrics.
- Creating training data for the compact model: using the large model's predictions to generate a training set that enables effective training of the smaller model.
- Evaluating the distilled compact model: checking the performance and accuracy of the compact model after distillation to confirm it meets the requirements.
2. A knowledge distillation tutorial from PyTorch
A guide from PyTorch containing a practical introduction to knowledge transfer for deploying models on devices with limited computational resources.
Main aspects of the guide:
- Extracting hidden representations: the guide shows how to obtain intermediate representations from a trained model for further use.
- Modifying training loops in PyTorch: covers integrating additional loss terms into standard training loops for effective knowledge transfer (a minimal sketch of such a loss follows below).
- A worked example shows how to train a compact model using the predictions of a more complex model as a reference.
The guide provides step-by-step instructions and code examples, making it a valuable resource if you want to learn how to optimize your models for resource-constrained environments.
▪️ Link
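Since the PyTorch tutorial centers on adding a distillation term to the training loop, here is a minimal hedged sketch of such a loss (plain PyTorch, not the tutorial's exact code; the T and alpha values are illustrative):

# Knowledge-distillation loss: blend hard-label cross-entropy with the
# KL divergence to the teacher's temperature-softened distribution.
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # scale by T^2 to keep gradient magnitudes comparable
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard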
3. Jetson Introduction to Knowledge Distillation from Nvidia
This guide walks through transferring knowledge from an OpenCLIP (vision-language) model to a ResNet18 model for classification on the STL10 dataset.
Special attention is paid to how the choice of data, the distillation method, and the model architecture affect the final accuracy.
It also discusses profiling and optimizing models for deployment on NVIDIA Jetson Orin Nano devices.
4. A knowledge distillation tutorial from Keras
Describes in detail the concept of knowledge distillation and its application to medical image processing.
5. A distillation guide from huggingface 🤗
Shows how to perform knowledge distillation step by step on a concrete example.
6. Knowledge distillation for computer vision tasks from huggingface
Covers how to distill a fine-tuned ViT model into a MobileNet using the Trainer API from Transformers.
#KnowledgeDistillation #Distillation #openai #keras #tutorial #course #freecourses #huggingface #Nvidia #pytorch