Cool self-supervised #machinelearning #deeplearning #neuralnetwork from #tensorflow
🌎 Link
✴️ @AI_Python_EN
Andriy Burkov
I often receive questions from people in my network about what they should learn and master to become a data scientist. While I personally think that the term "data scientist" is very unfortunate and lacks a clear definition, this is what a good modern #dataanalyst has to master:
#DataScience
– Data structures (local and distributed)
– Data indexing
– Data privacy and anonymization
– Data lifecycle management
– Data transformation (deduplication, handling outliers and missing values, dimensionality reduction; see the sketch after this list)
– Data analysis (experiment design, classification, regression, unsupervised methods)
– #Machinelearning methods (feature engineering, regularization, hyperparameter tuning, ensemble methods, and #neuralnetworks)
– Computer and database programming, numerical optimization
– Distributed data processing
– Real-time and high-frequency data processing
– Linux (my personal bias)
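To make the data-transformation item concrete, here is a minimal sketch assuming pandas; the toy columns and thresholds are invented for illustration: deduplication, outlier capping, and missing-value imputation.
```python
import numpy as np
import pandas as pd

# Toy data with a duplicate row, an extreme outlier, and a missing value
# (column names are invented for illustration)
df = pd.DataFrame({
    "user_id": [1, 1, 2, 3, 4],
    "spend":   [10.0, 10.0, 12.5, np.nan, 9999.0],
})

df = df.drop_duplicates()                                # deduplication
low, high = df["spend"].quantile([0.01, 0.99])
df["spend"] = df["spend"].clip(low, high)                # cap outliers
df["spend"] = df["spend"].fillna(df["spend"].median())   # impute missing values

print(df)
```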
A modern data analyst also has to be a good popularizer of complex ideas. Having a Ph.D. is not a requirement, but it is a big plus: it strengthens that popularizing skill and teaches a scientific approach to problem-solving.
✴️ @AI_Python_EN
A Convolutional Neural Network Tutorial in Keras and TensorFlow 2
https://medium.com/@isakbosman/a-convolutional-neural-network-tutorial-in-keras-and-tensorflow-2-2bff79f477c0
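For context, here is a minimal CNN of the kind such tutorials walk through, written against the standard tf.keras API; the architecture is illustrative and not taken from the linked article.
```python
import tensorflow as tf

# Load MNIST, add a channel axis, and scale pixels to [0, 1]
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0
x_test = x_test[..., None].astype("float32") / 255.0

# A small convolutional network: two conv/pool stages, then a classifier head
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, validation_data=(x_test, y_test))
```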
#Keras #neuralnetwork #TensorFlow #ConvolutionalNeuralNetwork
✴️ @AI_Python_EN
Which doodles are human-drawn and which are AI-generated? Berkeley researchers Forrest Huang et al. created a #neuralnetwork that can generate sketches based on text descriptions:
http://bit.ly/2LZSHJN
✴️ @AI_Python_EN
Machine Learning for Artists
#DeepLearning #MachineLearning #ArtificialIntelligence #neuralnetwork #gan
http://ml4a.github.io/
✴️ @AI_Python_EN
--Activation Functions in CNNs--
-Image Analysis-
#imageanalysis #machinelearning #clustering #datascientists #kmeans #deeplearning #neuralnetwork #underfitting
A gentle overview of Deep Learning and Machine Learning
Deep Learning is a subarea of Machine Learning that makes use of deep neural networks (with many layers) and specific novel algorithms for data pre-processing and model regularization: word embeddings, dropout, data augmentation. Deep Learning takes inspiration from neuroscience, since neural networks are a model of the neuronal network within the brain. Unlike the biological brain, where any neuron can connect to any other within some physical constraints, Artificial Neural Networks (ANNs) have a finite number of layers and connections and a fixed direction of data propagation. For a long time, ANNs were ignored by the research and business community; the problem was their computational cost.
Between 2006 and 2012, the research group led by Geoffrey Hinton at the University of Toronto finally adapted ANN algorithms to parallel architectures. The main breakthrough is the increased number of layers, neurons, and model parameters (even over 10 million), allowing massive amounts of data to be fed through the system to train it.
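As a concrete illustration of the regularization tricks named above, here is a minimal sketch assuming the standard tf.keras API; the architecture and layer sizes are arbitrary, chosen only for illustration.
```python
import tensorflow as tf

# Two of the regularization tricks named above, in tf.keras form:
# data augmentation (random flips/rotations, active only during training)
# and dropout (randomly zeroing activations to discourage co-adaptation)
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32, 32, 3)),     # arbitrary image size
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.1),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.5),                 # drop 50% of units per step
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.summary()
```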
https://lnkd.in/dq87iFy
#neuralnetwork
#deeplearning
#machinelearning
✴️ @AI_Python_EN
The vanishing/exploding gradients problem is a common issue, especially when training big networks, so visualizing gradients is a must when training neural networks. Below is the gradient flow of a small network on the MNIST dataset. A detailed article explaining many things in deep learning is on the way.
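A minimal sketch of how such a gradient-flow plot can be produced, assuming tf.keras and matplotlib; the small dense network below is illustrative, not the one from the original post.
```python
import matplotlib.pyplot as plt
import tensorflow as tf

# A small dense network on a batch of MNIST digits
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x = (x_train[:256].reshape(-1, 784) / 255.0).astype("float32")
y = y_train[:256]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(784,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy()

# One forward/backward pass, recording the gradient of the loss
# with respect to every trainable weight
with tf.GradientTape() as tape:
    loss = loss_fn(y, model(x))
grads = tape.gradient(loss, model.trainable_variables)

# Plot mean |gradient| per parameter tensor: a sharp decay toward the
# early layers is the signature of vanishing gradients
means = [tf.reduce_mean(tf.abs(g)).numpy() for g in grads]
plt.bar(range(len(means)), means)
plt.xticks(range(len(means)),
           [v.name for v in model.trainable_variables],
           rotation=45, ha="right")
plt.ylabel("mean |gradient|")
plt.tight_layout()
plt.show()
```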
#machinelearning #deeplearning #artificialintelligence #computervision #neuralnetwork
❇️ @AI_Python_EN
#GraphNeuralNetworks for Natural Language Processing
#neuralnetwork #NLP
https://bit.ly/33oprRc
❇️ @AI_Python_EN