AI, Python, Cognitive Neuroscience
2_5203986206391534542.pdf
1.5 MB
Sarbazi, M., Sadeghzadeh, M., & Mir Abedini, S. J. (2019). Improving resource allocation in software-defined networks using clustering. Cluster Computing.
doi:10.1007/s10586-019-02985-3

❇️ @AI_Python_EN
If you've just published a paper, let us inform other members.
@ai_python_en
Google researchers just released #ALBERT, which has beaten previous models across various NLP benchmarks.

Also, did you know that the top NLP models on these benchmarks now achieve performance that outpaces the average human baseline?
——————————————————
ALBERT uses parameter-reduction techniques to lower memory consumption and increase the training speed of BERT (a sketch of the main trick follows below).

1. They topped GLUE (https://lnkd.in/dkWNRVk) — 89.4

2. They topped the SQuAD 2.0 leaderboard (https://lnkd.in/d_Xrba8) — 92.2 F1

3. RACE - they came third with their ensemble model (https://lnkd.in/d2yWbtC) — 89.4%
——————————————————
Paper at openreview: https://lnkd.in/dzRvWYS
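
The two parameter-reduction tricks in the paper are a factorized embedding parameterization and cross-layer parameter sharing. Here is a minimal PyTorch sketch of the first trick; the sizes match ALBERT-base, but the code is only an illustration, not the authors' implementation:

```python
import torch.nn as nn

vocab_size, embed_size, hidden_size = 30000, 128, 768  # V, E, H

# BERT-style embedding table: V x H parameters.
bert_embedding = nn.Embedding(vocab_size, hidden_size)

# ALBERT factorizes it into V x E + E x H parameters.
albert_embedding = nn.Sequential(
    nn.Embedding(vocab_size, embed_size),
    nn.Linear(embed_size, hidden_size, bias=False),
)

def n_params(module):
    return sum(p.numel() for p in module.parameters())

print(n_params(bert_embedding))    # 30000*768 = 23,040,000
print(n_params(albert_embedding))  # 30000*128 + 128*768 = 3,938,304
```

Roughly 6x fewer embedding parameters, and the saving grows with vocabulary size.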
#deeplearning #machinelearning #NLU #NLG #artificialintelligence #ai
Artificial Design: Modeling Artificial Super Intelligence with Extended General Relativity and Universal Darwinism via Geometrization for Universal Design Automation

https://openreview.net/forum?id=SyxQ_TEFwS
Self-Paced Learning:
- a supervised learning method from #NIPS 2010
- idea: start learning from the easiest samples and only then move on to the difficult ones
- distinct from curriculum learning, where samples are pre-classified as easy/hard: here we have to decide the order on our own
- one measure of easiness: the likelihood of the sample in a latent model (outliers will be the hardest)
- a better measure (!): how good the initial predictions for the sample are (samples far away from the decision boundary are the easiest)
- for #classification, samples are only easy in the context of other samples!
- the set of easy samples is iteratively enlarged (see the sketch after the link below)
- results: outperforms CCCP in #DNA motif finding, handwritten digit recognition and other problems
- link: https://papers.nips.cc/paper/3923-self-paced-learning-for-latent-variable-models
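
A minimal sketch of that loop (not the paper's exact latent-variable/CCCP formulation; the data, threshold and schedule below are made up for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy two-class data with label noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = (X[:, 0] + 0.3 * rng.normal(size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)  # initial fit, used to score "easiness"

lam = 0.3  # per-sample log-loss threshold: below it, a sample counts as easy
for _ in range(5):
    p_true = model.predict_proba(X)[np.arange(len(y)), y]
    losses = -np.log(np.clip(p_true, 1e-12, None))
    easy = losses < lam               # far from the boundary -> low loss -> easy
    if len(np.unique(y[easy])) == 2:  # refit only if both classes are present
        model = LogisticRegression().fit(X[easy], y[easy])
    lam *= 1.5                        # enlarge the easy set each round
```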
GitHub has just launched a new NLP/information-retrieval challenge: the CodeSearchNet challenge. The goal of code search is to retrieve relevant code given a natural-language query. Along with the challenge they released a huge dataset:
- 6m functions across 6 programming languages (Go, Java, Python etc.)
- 2m of those 6m functions have associated documentation (docstrings, JavaDoc etc.)
- some metadata (line numbers and more)
They also included some baseline models (e.g. a BERT-like self-attention model) to help people get started. Check it out, and see the toy sketch after the links below! #deeplearning #machinelearning


📝 Article: https://lnkd.in/dezzhs9
🔤 Code: https://lnkd.in/dXhRqpE
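
To get a feel for the task, here is a toy retrieval sketch (not the official baseline): rank functions by TF-IDF similarity between their docstrings and a natural-language query. The docstrings below are made up:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Mini "corpus" of function docstrings standing in for a real code index.
docstrings = [
    "Open a file and return its contents as a string.",
    "Sort a list of integers in ascending order.",
    "Send an HTTP GET request and return the response body.",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(docstrings)

query = "read a file into a string"
scores = cosine_similarity(vectorizer.transform([query]), doc_matrix)[0]
print(docstrings[scores.argmax()])  # -> the file-reading function wins
```

The challenge baselines replace TF-IDF with learned joint embeddings of code and text, but the ranking setup is the same.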


✴️ @AI_PYTHON_EN
Microsoft open-sourced scripts and notebooks to pre-train and fine-tune the BERT natural language model on domain-specific texts.

Github: https://github.com/microsoft/AzureML-BERT


#Bert #Microsoft #NLP #dl

✴️ @AI_PYTHON_EN
Transformers: State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch

https://huggingface.co/transformers
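
A minimal taste of the library (BERT here, but the same from_pretrained pattern works for the other architectures; return types changed slightly in later versions):

```python
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

input_ids = tokenizer.encode(
    "Transformers works with TensorFlow 2.0 and PyTorch", return_tensors="pt"
)
with torch.no_grad():
    last_hidden_state = model(input_ids)[0]  # (batch, seq_len, hidden)
print(last_hidden_state.shape)
```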

✴️ @AI_PYTHON_EN
OpenAI’s GPT-2 Text Generator: Wise As a Scholar

https://www.youtube.com/watch?v=0OtZ8dUFxXA

OpenAI's post:
https://openai.com/blog/gpt-2-6-month-follow-up/

✴️ @AI_Python_en
Free ebooks on Deep Learning

PDF and EPUB books on Deep Learning. Make sure you comply with copyright: use this repository only to get familiar with the content, and purchase a legal copy afterwards!

Also, save this link somewhere by forwarding the message to your Saved Messages (just long-tap / click on the message, then type 'Saved Messages' in the dialogue search) or to a fellow group, because the repo might get shut down for copyright violation.

Link: https://github.com/ontiyonke/Free-Deep-Learning-Books/tree/master/book

#library #ebook

❇️ @AI_PYTHON_en