2019 is the year of Artificial Intelligence.
This is the year that we replace one buzzword with another.
This will allow incompetent organizations that have failed to successfully leverage machine learning and data science in the past to make excuses and get another shot at the whole process...
Unfortunately, these companies that have failed to make the proper investment in data science (and failed to hire at the leadership level first) will just be wasting their money and getting burned again.
If you have the option, try to avoid these buzzword-laden hack shops.
If you're looking for a job, look for companies that have been investing in data science for years and building over time... not constantly rebranding their failing departments and products to look like they have something "fresh."
The companies and individuals that make the long-term investments are the ones that will win out in the end.
Invest in yourself and find a company that invests in data science.
#datascience
@AI_Python
@AI_Python_Arxiv
@AI_Python_EN
The Evolved Transformer
Paper by So et al.: https://lnkd.in/eNZ6ije
#artificialintelligence #MachineLearning #NeuralComputing #EvolutionaryComputing #research
Invertible Residual Networks
Paper by Behrmann et al.: https://lnkd.in/dDnrmhr
#MachineLearning #ArtificialIntelligence #ComputerVision #PatternRecognition
This is a super cool resource: Papers With Code now includes 950+ ML tasks, 500+ evaluation tables (including SOTA results) and 8500+ papers with code. Probably the largest collection of NLP tasks I've seen including 140+ tasks and 100 datasets. https://paperswithcode.com/sota
YOLOv3 still has the best introduction of any paper I've read so far: https://pjreddie.com/media/files/papers/YOLOv3.pdf
I have a fully funded four-year PhD position in applied #NLP (in the #IR context) available at Delft University of Technology. Get in touch if you are interested!
https://chauff.github.io/
I have an opening for a four-year PhD position in my ERC Consolidator project DREAM ("distributed dynamic representations for dialogue management") at #ILLC in Amsterdam. Deadline: 25 Feb 2019. More details: http://www.illc.uva.nl/NewsandEvents/News/Positions/newsitem/10538/
Please spread the word! #NLProc AmsterdamNLP
I have funding for a Ph.D student to work in the general area of multimodal machine learning from images, videos, audio, and multilingual text. Please get in touch if you are interested.
elliottd.github.io
The Language in Interaction research consortium invites applications for a postdoctoral position in Linguistics! We are looking for a candidate with a background in theoretical and/or computational linguistics. More information can be found here: https://www.mpi.nl/people/vacancies/postdoc-position-in-linguistics-for-research-consortium-language-in-interaction
PhD position available
https://lsri.info/2019/02/01/phd-position-available/
PhD position in a nice lab on nice topics.
A PhD position is available in my group, starting as soon as possible, to study the structure, regulation, and functioning of intercellular nanotubes in bacteria. This is a project I am very excited about. Please RT or contact me if you are interested. https://www.uni-osnabrueck.de/universitaet/stellenangebote/stellenangebote_detail/1_fb_5_sfb_research_assistant.html
DASK CHEATSHEET - FOR PARALLEL COMPUTING IN DATA SCIENCE
You will need Dask when your data is too big to fit in memory.
This is the guide from Analytics Vidhya: https://lnkd.in/fKVBFhE
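The core idea behind Dask is chunked, out-of-core computation: process data one block at a time and combine partial results, instead of loading everything at once. A minimal stdlib-only sketch of that idea (the `chunked_mean` helper and the toy data are illustrative, not Dask's API; real Dask builds a task graph and can run blocks in parallel):

```python
def chunked_mean(chunks):
    """Mean over an iterable of blocks without materializing all data."""
    total, count = 0.0, 0
    for block in chunks:          # each block fits comfortably in memory
        total += sum(block)
        count += len(block)
    return total / count

# Simulate a dataset too big to hold at once: 0..999, read in blocks of 100.
blocks = ([float(x) for x in range(i, i + 100)] for i in range(0, 1000, 100))
print(chunked_mean(blocks))  # 499.5
```

With Dask, the equivalent would be a one-liner on a `dask.array` or `dask.dataframe`, with the chunking handled for you.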
#datascience #pydata #pandas
#datascientist
What is the curse of dimensionality?
The curse of dimensionality refers to problems that occur when we try to use statistical methods in high-dimensional space.
As the number of features (dimensionality) increases, the data becomes relatively more sparse and often exponentially more samples are needed to make statistically significant predictions.
Imagine going from a 10x10 grid to a 10x10x10 grid... if we want one sample in each "1x1 square", then the addition of the third parameter requires us to have 10 times as many samples (1000) as we needed when we had 2 parameters (100).
In short, some models become much less accurate in high-dimensional space and may behave erratically. Examples include linear models with no feature selection or regularization, kNN, and Bayesian models.
Models that are less affected by the curse of dimensionality: regularized models, random forests, some neural networks, and stochastic models (e.g., Monte Carlo simulations).
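Both points above can be checked in a few lines of stdlib Python: the grid example shows the exponential sample requirement, and a small simulation shows distance concentration, the effect that hurts kNN (the helper names and the 200-point sample size are my own choices for illustration):

```python
import math
import random

# The grid example: one sample per unit cell needs 10**d samples,
# so each added dimension multiplies the requirement by 10.
def samples_for_unit_cells(dims, cells_per_axis=10):
    return cells_per_axis ** dims

print(samples_for_unit_cells(2))  # 100
print(samples_for_unit_cells(3))  # 1000

# Distance concentration: in high dimensions the nearest and farthest
# random points sit at similar distances from a query, so "nearest"
# neighbours carry less information.
random.seed(0)

def distance_spread(dims, n=200):
    pts = [[random.random() for _ in range(dims)] for _ in range(n)]
    q = [0.5] * dims  # query at the centre of the unit hypercube
    d = [math.dist(q, p) for p in pts]
    return (max(d) - min(d)) / min(d)  # relative gap, nearest vs farthest

print(distance_spread(2) > distance_spread(100))  # True: the gap shrinks
```

In 2D the nearest point is typically far closer than the farthest; in 100D all distances cluster around the same value, which is exactly why kNN degrades.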
#datascience #dsdj #QandA
#machinelearning
For more free info, sign up here -> https://lnkd.in/g7AYg72
Stanford University's ML Group has released a Python package called StanfordNLP, built on PyTorch.
The best feature of this package is that it comes with pre-trained neural models for 53 human languages, likely the largest number of pre-trained models in any popular NLP package.
You can find more details here:
https://lnkd.in/f5yaJFK
#datascience #nlp #machinelearning
Deep Unsupervised Learning Course Spring 2019
UC Berkeley
Instructors: Pieter Abbeel, Peter Chen, Jonathan Ho, Aravind Srinivas
https://sites.google.com/view/berkeley-cs294-158-sp19/
Course: Machine Learning for Health
University of Toronto: Spring 2019
Instructor: Dr. Marzyeh Ghassemi
https://cs2541-ml4h2019.github.io/
The only notebook on the planet (that I know of) that shows you how to install TensorRT on Google Colab and then run an optimized VGG graph:
https://lnkd.in/e_rP5dU
https://lnkd.in/estbghA
AAAI Conference Analytics
Citation distribution by the top 20 AAAI authors, year by year: https://lnkd.in/eV3YA5h
#artificialintelligence #deeplearning
#machinelearning
Machine Learning is Much More Than Just Deep Learning
Deep learning is the best-known machine learning technique, but far from the only one. If you don't have a lot of data, techniques like linear regression work well. If there is more data, or the data is likely to be non-linear, I'd recommend decision trees or decision forests.
Deep learning works very well with massive sets of images or similar data, but it comes with serious challenges in cost, time, and complexity. Labeling the massive dataset required can be time-consuming or expensive, especially if you need to pay others to label your data. Deep learning also requires considerable time to train, both for training on a large dataset and for the considerable parameter optimization.
A recent paper used deep learning to predict age from blood. The authors included a comparison of their deep learning algorithm to other machine learning techniques and discovered that a simpler technique achieved similar accuracy. But I've seen other papers where deep learning is the only technique used. When I see this, I think of the saying, "When all you have is a hammer, the world looks like a nail."
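The "little data" point is concrete: ordinary least squares has a closed-form solution and fits a trend from a handful of points, with no training loop at all. A minimal stdlib-only sketch for one feature (the `fit_line` helper and the toy data are invented for illustration):

```python
# Ordinary least squares for a single feature, solved in closed form:
# slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x).
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    slope = cov / var
    return slope, my - slope * mx

# Five points on the line y = 2x + 1: tiny data, exact recovery.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]
slope, intercept = fit_line(xs, ys)
print(slope, intercept)  # 2.0 1.0
```

No labeled image corpus, no GPU, no hyperparameter search: when the relationship really is linear, this is hard to beat.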
What do you think? Did I miss your favorite machine learning technique? #machinelearning #ai #datascience
Nice overview of unsupervised pre-trained language models
https://lilianweng.github.io/lil-log/2019/01/31/generalized-language-models.html
Generalized Language Models
Blog by Lilian Weng: https://lnkd.in/eJPgKWm
Share us with your friends!
#artificialintelligence #NLP #unsupervisedlearning