TabNine released a deep-learning code autocompletion tool based on the GPT-2 architecture.
The video demonstrates the concept. Hopefully, it will allow us to write code with fewer bugs, not more.
Link: https://tabnine.com/blog/deep
Something relatively similar by Microsoft: https://visualstudio.microsoft.com/ru/services/intellicode
#GPT2 #TabNine #autocomplete #product #NLP #NLU #codegeneration
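TabNine's actual model is a proprietary GPT-2-style Transformer, but the interface idea is simple: given a prefix of tokens, predict likely continuations. A toy, runnable sketch of that idea using a token-bigram frequency model as a stand-in for the neural network (the corpus and function names are illustrative, not TabNine's):

```python
from collections import Counter, defaultdict

# Toy stand-in for a neural language model: bigram counts over a tiny
# "corpus" of code. The real system is a Transformer, but the contract is
# the same: prefix in, likely next tokens out.
corpus = [
    "def load ( path ) : return open ( path ) . read ( )",
    "def save ( path , data ) : open ( path , 'w' ) . write ( data )",
    "for item in items : print ( item )",
]

bigrams = defaultdict(Counter)
for line in corpus:
    toks = line.split()
    for a, b in zip(toks, toks[1:]):
        bigrams[a][b] += 1

def complete(prefix_tokens, max_new=5):
    """Greedily extend a token prefix, like an autocomplete popup would."""
    out = list(prefix_tokens)
    for _ in range(max_new):
        candidates = bigrams.get(out[-1])
        if not candidates:
            break
        out.append(candidates.most_common(1)[0][0])
    return out

print(" ".join(complete(["def", "load"])))
```

The greedy loop here corresponds to picking the top suggestion at every step; a real autocomplete UI would instead surface the top-k candidates and let the user choose.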
GPT-3 application for website form generation
It turns out the #GPT3 model is capable of generating #JSX code (the HTML-like layout syntax for #React ) from a description of the required blocks.
The author reports exceptions caused by the model's current output limit of 512 tokens.
Why this is important: one might suppose that in the future programmers will just write specifications and tests, and the AI will generate the code. Given the speed of progress, that won't be surprising at all.
More sophisticated models will probably be able to work within a hard output limit when generating code, but that is still an area of active research.
A more realistic assessment is that upcoming code-generation tools will simply allow more people to build products, following the #nocode movement.
Twitter thread: https://twitter.com/sharifshameem/status/1282676454690451457
#codegeneration #NLU
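The demo works by few-shot prompting: a handful of description-to-JSX pairs, then the new description, and the model continues with JSX. A sketch of the prompt assembly (the example pairs below are assumptions for illustration, not Shameem's actual prompt):

```python
# Few-shot prompt of the kind used in such demos: description -> JSX pairs,
# then the new request. The pairs are illustrative placeholders.
EXAMPLES = [
    ("a button that says Submit",
     '<button type="submit">Submit</button>'),
    ("an input for the user's email",
     '<input type="email" placeholder="Email" />'),
]

def build_prompt(description: str) -> str:
    parts = []
    for desc, jsx in EXAMPLES:
        parts.append(f"description: {desc}\njsx: {jsx}")
    parts.append(f"description: {description}\njsx:")
    return "\n\n".join(parts)

prompt = build_prompt("a form with a name field and a Submit button")
print(prompt)
```

The assembled prompt would then be sent to the GPT-3 completion endpoint; the model's continuation is the generated JSX, truncated whenever it exceeds the 512-token output limit mentioned above.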
Sharif Shameem improved the original app, which is now capable of generating real applications, as he demonstrates with a simple ToDo app.
#GPT3 #codegeneration
Applying GPT-3 to generate neural network code
Matt Shumer used GPT-3 to generate code for a machine learning model, just by describing the dataset and required output.
#GPT3 #inception #codegeneration #NLU #NLP
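For context, the output of such a prompt is ordinary model-definition boilerplate. A hypothetical example of what "predict price from area and number of rooms" might yield, written here as a plain least-squares fit (illustrative of the kind of code GPT-3 emits, not Shumer's actual output):

```python
import numpy as np

# Hypothetical "generated" code for: dataset of (area, rooms) -> price.
X = np.array([[50.0, 1], [80, 2], [120, 3], [200, 4]])  # area (m^2), rooms
y = np.array([100_000.0, 160_000, 240_000, 400_000])    # price

A = np.hstack([X, np.ones((len(X), 1))])        # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # ordinary least squares

def predict(area, rooms):
    return coef @ [area, rooms, 1.0]

print(round(predict(100, 2)))
```

The interesting part is not the code itself but that a natural-language description of the dataset and target is enough for the model to produce it.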
Astrologers proclaim the week of #codegeneration. The number of articles on the subject has doubled.
Deep learning to translate between programming languages
#FacebookAI released TransCoder, an entirely self-supervised neural transcompiler system that is claimed to make code migration easier and more efficient.
ArXiV: https://arxiv.org/pdf/2006.03511.pdf
Github: https://github.com/facebookresearch/TransCoder/
#NLU #codegeneration #NLP
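Per the paper, TransCoder is trained fully self-supervised with cross-lingual masked-LM pretraining, denoising auto-encoding, and back-translation, so it needs no parallel Python/C++/Java corpora. A toy, runnable illustration of the back-translation idea, with a stub word-mapping standing in for the real seq2seq Transformer (the token tables are illustrative only):

```python
# Stub "model": word-level Python<->C++ mappings standing in for a
# trained seq2seq network, so the loop below is runnable.
PY2CPP = {"print": "std::cout <<", "True": "true"}
CPP2PY = {v: k for k, v in PY2CPP.items()}

def translate(tokens, table):
    return [table.get(t, t) for t in tokens]

def back_translation_pair(py_tokens):
    """Build a synthetic (C++ -> Python) training pair from unlabeled
    Python code: translate Python to (possibly noisy) C++ with the current
    forward model, then train the reverse direction to reconstruct the
    original Python."""
    synthetic_cpp = translate(py_tokens, PY2CPP)   # forward pass, no labels
    return synthetic_cpp, py_tokens                # (source, target) pair

src, tgt = back_translation_pair(["print", "(", "True", ")"])
print(src, "->", tgt)
```

In the real system both translation directions improve jointly: as the forward model gets better, the synthetic pairs get cleaner, which in turn trains a better reverse model.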