AI, Python, Cognitive Neuroscience
Research Guide: Advanced Loss Functions for Machine Learning Models

http://bit.ly/36HBefu

#DataScience #MachineLearning #ArtificialIntelligence

❇️ @AI_Python_EN
Machine ignoring = underfitting
Machine learning = optimal fitting
Machine memorization = overfitting

#datascience #machinelearning

❇️ @AI_Python_EN
Grid search vs randomized search?
💡 What are the pros and cons of grid search?
Pros:
• Grid search works well when you need to fine-tune hyperparameters over a small search space automatically.
• For example, if you have 100 different datasets that you expect to be similar (e.g. solving the same problem repeatedly with different populations), you can use grid search to fine-tune the hyperparameters for each model automatically.
Cons:
• Grid search is computationally expensive and inefficient: it often searches regions of parameter space that have little chance of being useful, which makes it extremely slow. It is especially slow when the search space is large, since its cost grows exponentially with the number of hyperparameters being optimized.
💡 What are the pros and cons of randomized search?
Pros:
• Randomized search finds near-optimal hyperparameters over a very large search space relatively quickly, and it does not suffer from grid search's exponential scaling problem.
Cons:
• Randomized search does not fine-tune the results as finely as grid search, since it typically does not test every possible combination of parameters.
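The trade-off above can be sketched with scikit-learn's `GridSearchCV` and `RandomizedSearchCV`. This is a minimal illustration, not a tuning recipe: the dataset is synthetic, and the model and parameter ranges are arbitrary choices for the example. Note how the grid's cost is fixed by the grid size (3 × 3 = 9 candidates), while the randomized search samples a budget you choose (`n_iter`) no matter how large the distributions are.

```python
# Sketch: grid search vs randomized search with scikit-learn.
# Assumptions: synthetic data, RandomForest, and arbitrary example parameter ranges.
import numpy as np
from scipy.stats import randint
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = make_classification(n_samples=200, random_state=0)

# Grid search: exhaustively tries every combination (3 x 3 = 9 candidates),
# so the cost grows multiplicatively with each added hyperparameter.
grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [10, 50, 100], "max_depth": [2, 4, 8]},
    cv=3,
)
grid.fit(X, y)

# Randomized search: samples a fixed number of candidates from distributions,
# so the budget stays constant even over a much larger search space.
rand = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions={"n_estimators": randint(10, 200),
                         "max_depth": randint(2, 16)},
    n_iter=9,  # same budget as the grid above, but over a far bigger space
    cv=3,
    random_state=0,
)
rand.fit(X, y)

print("grid best:", grid.best_params_)
print("random best:", rand.best_params_)
```

With equal budgets, the grid can only test the 9 points you enumerated, while the randomized search explores 9 draws from continuous-style ranges — which is why it tends to win on large spaces and lose on fine-grained tuning.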
#datascience
👉 Free training -> http://bit.ly/dsdj-webinar


❇️ @AI_Python_EN
Machine learning w.r.t. a meditation routine:
Machine before meditation = underfitting
Machine after meditation = optimal fitting
Planning of meditation = overfitting
#datascience

❇️ @AI_Python_EN