AI, Python, Cognitive Neuroscience
πŸ“šπŸ“– Python Machine Learning Tutorial πŸ“–πŸ“š

➑️ Python Machine Learning – Tasks and Applications ( https://lnkd.in/fZcs-xE)
➑️ Python Machine Learning Environment Setup – Installation Process (https://lnkd.in/fJHwbjr)
➑️ Data Preprocessing, Analysis & Visualization (https://lnkd.in/fVz58kJ)
➑️ Train and Test Set (https://lnkd.in/fq_GXjn)
➑️ Machine Learning Techniques with Python (https://lnkd.in/fjdsQzd)
➑️ Top Applications of Machine Learning (https://lnkd.in/f-CNyK2)
➑️ Machine Learning Algorithms in Python – You Must Learn (https://lnkd.in/fTxCA23)

#python #machinelearning #datascience #data #dataanalysis #artificialintelligence #ai #visualization #algorithms

✴️ @AI_Python_EN
Convolutional #NeuralNetworks (CNN) for Image Classification β€” a step-by-step illustrated tutorial: https://dy.si/hMqCH
BigData #AI #MachineLearning #ComputerVision #DataScientists #DataScience #DeepLearning #Algorithms

✴️ @AI_Python_EN
This is the reference implementation of Diff2Vec from "Fast Sequence Based Embedding With Diffusion Graphs" (CompleNet 2018). Diff2Vec is a node embedding algorithm that scales to networks with millions of nodes. It can be used for node classification, node-level regression, latent-space community detection, and link prediction. Enjoy!

https://lnkd.in/dXiy5-U
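The core idea is to grow a diffusion subgraph around each node and linearize it into a node sequence, which is then fed to a word2vec-style model. Below is a simplified pure-Python sketch of the diffusion-sequence step only; the function and variable names are illustrative and the paper's actual procedure uses Euler tours over the diffusion subgraphs:

```python
import random

def diffusion_sequence(adjacency, start, size, seed=42):
    """Grow a diffusion subgraph from `start` and return a node
    sequence recording the traversed edges (a simplified stand-in
    for the Euler-tour linearization used in the paper)."""
    rng = random.Random(seed)
    subgraph_nodes = {start}
    sequence = [start]
    frontier = [start]
    while len(subgraph_nodes) < size and frontier:
        node = rng.choice(frontier)
        candidates = [n for n in adjacency[node] if n not in subgraph_nodes]
        if not candidates:
            frontier.remove(node)  # this node can no longer expand
            continue
        new_node = rng.choice(candidates)
        subgraph_nodes.add(new_node)
        frontier.append(new_node)
        sequence.extend([node, new_node])  # record the traversed edge
    return sequence

# Toy graph: a ring of 6 nodes.
graph = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
seq = diffusion_sequence(graph, start=0, size=4)
```

The sequences generated from every node would then be passed to a skip-gram model (as in word2vec) to learn the actual embeddings.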

#technology #machinelearning #datamining #datascience #deeplearning #neuralnetworks #pytorch #tensorflow #diffusion #Algorithms

✴️ @AI_Python_EN
Have you heard of the "R-Transformer", a Recurrent Neural Network Enhanced Transformer?

Recurrent Neural Networks have long been the dominant choice for sequence modeling. However, they suffer from two serious issues: they struggle to capture very long-term dependencies, and their sequential computation cannot be parallelized.

Therefore, many non-recurrent sequence models that are built on convolution and attention operations have been proposed recently.

Here the authors propose the R-Transformer, which enjoys the advantages of both RNNs and the multi-head attention mechanism while avoiding their respective drawbacks.

The proposed model effectively captures both local structures and global long-term dependencies in sequences without using any position embeddings. The authors evaluated R-Transformer in extensive experiments on data from a wide range of domains; the empirical results show that R-Transformer outperforms state-of-the-art methods by a large margin on most tasks.
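The combination can be illustrated in a few lines of NumPy: a small RNN run over short local windows (so word order is captured without position embeddings), followed by global self-attention. This is a conceptual sketch with made-up dimensions and single-head attention, not the authors' implementation:

```python
import numpy as np

def local_rnn(x, W_h, W_x, window=3):
    """Run a tiny vanilla RNN independently over each sliding window,
    keeping the last hidden state per position (the "localRNN" idea:
    local order is modeled without position embeddings)."""
    T, d = x.shape
    out = np.zeros_like(x)
    for t in range(T):
        h = np.zeros(d)
        for s in range(max(0, t - window + 1), t + 1):
            h = np.tanh(W_h @ h + W_x @ x[s])
        out[t] = h
    return out

def self_attention(x):
    """Single-head scaled dot-product attention over the whole
    sequence, capturing global dependencies."""
    d = x.shape[1]
    scores = x @ x.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ x

rng = np.random.default_rng(0)
T, d = 8, 4
x = rng.standard_normal((T, d))
W_h = rng.standard_normal((d, d)) * 0.1
W_x = rng.standard_normal((d, d)) * 0.1
y = self_attention(local_rnn(x, W_h, W_x))  # shape (8, 4)
```

The real model stacks several such layers with multi-head attention, feed-forward sublayers, and residual connections.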

GitHub code: https://lnkd.in/dpFckix

#research #algorithms #machinelearning #deeplearning #rnn

✴️ @AI_Python_EN
Awesome victory for #DeepLearning πŸ‘πŸ»

GE Healthcare wins FDA clearance for #algorithms to spot type of collapsed lung!

Here’s how the AI algorithm works
β€”β€”β€”β€”β€”β€”β€”β€”β€”β€”β€”β€”β€”β€”β€”β€”
1. A patient image scanned on the device is automatically searched for pneumothorax.
2. If pneumothorax is suspected, an alert, along with the original chest X-ray, is sent to the radiologist for review.
3. The technologist also receives an on-device notification highlighting prioritized cases.
4. Algorithms then analyze and flag protocol and field-of-view errors and auto-rotate images on the device.
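The triage flow in steps 1–3 can be sketched as a small pipeline. Everything here is hypothetical (function names, payload fields, and callbacks are illustrative, not GE's actual API):

```python
def triage_scan(scan, detect_pneumothorax, notify):
    """Hypothetical sketch of the on-device triage flow."""
    suspected = detect_pneumothorax(scan)  # step 1: automatic search
    if suspected:
        # step 2: alert the radiologist with the original image
        notify("radiologist", {"alert": "pneumothorax suspected",
                               "image": scan["xray"]})
        # step 3: on-device notification to prioritize the case
        notify("technologist", {"prioritized_case": scan["id"]})
    return suspected

# Toy usage with stub callbacks:
events = []
result = triage_scan(
    {"id": "case-001", "xray": "chest_xray.png"},
    detect_pneumothorax=lambda s: True,
    notify=lambda who, payload: events.append((who, payload)),
)
```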

Article is here:
https://lnkd.in/daNYHfP

#machinelearning
Mish is now supported in the YOLOv3 backend as well. I couldn't be more elated with how rewarding this project has been. Link to the repository:

https://github.com/digantamisra98/Mish
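For reference, the Mish activation itself is simply x · tanh(softplus(x)). A minimal standalone implementation (using a numerically stable softplus):

```python
import math

def softplus(x):
    # Numerically stable softplus: ln(1 + e^x)
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def mish(x):
    """Mish activation: x * tanh(softplus(x)).
    Smooth, non-monotonic, and approximately the identity
    for large positive x."""
    return x * math.tanh(softplus(x))

print(mish(0.0))  # β†’ 0.0
print(mish(2.0))  # β‰ˆ 1.94
```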

#neuralnetworks #mathematics #algorithms #deeplearning #machinelearning

❇️ @AI_Python_EN