Data Science by ODS.ai 🦜
First Telegram Data Science channel, covering all technical and popular stuff about anything related to Data Science: AI, Big Data, Machine Learning, Statistics, general Math, and the applications of the former. To reach the editors, contact: @haarrp
TabNine showed a deep learning code autocomplete tool based on the GPT-2 architecture.

The video demonstrates the concept. Hopefully, it will let us write code with fewer bugs, not more.

Link: https://tabnine.com/blog/deep
Something relatively similar by Microsoft: https://visualstudio.microsoft.com/ru/services/intellicode
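
TabNine's actual model is proprietary, but the interaction pattern is the same in any completer: rank candidate continuations for the code the user has typed so far. As a toy illustration of that ranking step (a frequency-based bigram completer, not the GPT-2 model), one might sketch:

```python
from collections import Counter

class ToyCompleter:
    """Ranks completions for a code token by how often each
    token followed it in a training corpus."""

    def __init__(self, corpus_tokens):
        # Count (token -> next_token) bigram frequencies.
        self.bigrams = Counter(zip(corpus_tokens, corpus_tokens[1:]))

    def complete(self, prefix_token, top_k=3):
        # Return the top_k tokens most often seen after prefix_token.
        candidates = Counter()
        for (prev, nxt), count in self.bigrams.items():
            if prev == prefix_token:
                candidates[nxt] += count
        return [tok for tok, _ in candidates.most_common(top_k)]

# Usage: "train" on a tiny token stream and ask for completions.
tokens = "for i in range ( n ) : print ( i )".split()
completer = ToyCompleter(tokens)
print(completer.complete("("))  # tokens that followed "(" in the corpus
```

A neural completer like TabNine replaces the bigram counts with a language model's next-token probabilities, which lets it condition on the whole file rather than one preceding token.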

#GPT2 #TabNine #autocomplete #product #NLP #NLU #codegeneration
GPT-3 application for website form generation

Turns out the #GPT3 model is capable of generating #JSX code (an HTML-like syntax extension for #React) given a description of the required blocks.

The author reports that there are failure cases, partly due to the model's current output limit of 512 tokens.

Why this is important: one might suppose that in the future programmers will just write specifications and tests, and an AI will generate the code. Given the speed of progress, that wouldn't be surprising at all.

More sophisticated models will probably learn to work within (or around) such hard output limits when generating code, but that is still an area of active research.

A more realistic assessment is that upcoming code generation tools will simply allow more people to build products, in line with the #nocode movement.

Twitter thread: https://twitter.com/sharifshameem/status/1282676454690451457
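
The demo works by showing the model a few description→JSX pairs and letting it continue the pattern for a new description. The exact prompt isn't public; the sketch below is a hypothetical few-shot prompt builder, and the example pairs in `FEW_SHOT` are assumptions, not the ones from the demo:

```python
# Hypothetical few-shot prompt builder for description -> JSX generation.
# The example pairs below are illustrative, not taken from the demo.
FEW_SHOT = [
    ("a button that says Submit",
     '<button type="submit">Submit</button>'),
    ("a red heading that says Hello",
     '<h1 style={{color: "red"}}>Hello</h1>'),
]

def build_prompt(description: str) -> str:
    """Concatenate description/JSX example pairs, then the new request,
    so a language model completes the missing JSX."""
    parts = []
    for desc, jsx in FEW_SHOT:
        parts.append(f"description: {desc}\nJSX: {jsx}\n")
    parts.append(f"description: {description}\nJSX:")
    return "\n".join(parts)

prompt = build_prompt("a form with an email input")
print(prompt)
```

The prompt ends mid-pattern (after "JSX:"), which is what nudges the model to emit only the JSX for the new description.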

#codegeneration #NLU
Applying GPT-3 to generate neural network code

Matt Shumer used GPT-3 to generate code for a machine learning model, just by describing the dataset and required output.

#GPT3 #inception #codegeneration #NLU #NLP
Astrologers proclaimed the week of #codegeneration. The number of articles on the subject has doubled.
Deep learning to translate between programming languages

#FacebookAI released TransCoder, an entirely self-supervised neural transcompiler that is claimed to make code migration easier and more efficient.

ArXiV: https://arxiv.org/pdf/2006.03511.pdf
Github: https://github.com/facebookresearch/TransCoder/
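
The paper evaluates translations by "computational accuracy": a translation counts as correct if it returns the same outputs as the source function on a set of unit-test inputs. A minimal sketch of that check, with toy hand-written functions standing in for real model output:

```python
def computational_accuracy(src_fn, translated_fn, test_inputs):
    """Fraction of test inputs on which the translated function
    reproduces the source function's output."""
    matches = sum(
        1 for args in test_inputs
        if translated_fn(*args) == src_fn(*args)
    )
    return matches / len(test_inputs)

# Toy stand-ins: a "source" function and a candidate "translation".
def src_max(a, b):
    return a if a > b else b

def translated_max(a, b):
    return max(a, b)

inputs = [(1, 2), (5, 3), (0, 0), (-4, -7)]
score = computational_accuracy(src_max, translated_max, inputs)
print(score)  # 1.0: the translation agrees on every test input
```

This metric is more forgiving than exact string match: a translation can be phrased differently from the reference and still count as correct, as long as it computes the same thing.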

#NLU #codegeneration #NLP
There is a claim that #ChatGPT is capable of writing code based on a text prompt.

Why does it matter: it can potentially lower the barrier to programming and allow more tools for efficient software development to emerge.

Source: tweet

#GPT3 #NLU #NLP #codegeneration