Anyone learning deep learning, or artificial intelligence in general, knows that there are ultimately two paths they can take:
1. Computer vision
2. Natural language processing.
I have outlined a roadmap for computer vision that I believe many beginners will find helpful.
Artificial Intelligence
Python Code to Remove an Image Background
from rembg import remove
from PIL import Image

image_path = 'input.png'     # ---> change to your image file name
output_path = 'output.png'   # ---> change to the name for the new image

input_image = Image.open(image_path)
output_image = remove(input_image)
output_image.save(output_path)
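Note: this snippet assumes the rembg and Pillow packages are already installed (for example via pip install rembg pillow); the file names above are placeholders you should replace with your own.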
Web Data Mining with Python (2023)
Some useful PYTHON libraries for data science
NumPy stands for Numerical Python. The most powerful feature of NumPy is the n-dimensional array. The library also contains basic linear algebra functions, Fourier transforms, advanced random number capabilities and tools for integration with low-level languages like Fortran, C and C++.
SciPy stands for Scientific Python. SciPy is built on NumPy. It is one of the most useful libraries for a variety of high-level science and engineering modules such as the discrete Fourier transform, linear algebra, optimization and sparse matrices.
Matplotlib for plotting a vast variety of graphs, from histograms to line plots to heat maps. You can use the pylab feature in the IPython notebook (ipython notebook --pylab inline) to use these plotting features inline. If you ignore the inline option, pylab converts the IPython environment into one very similar to MATLAB. You can also use LaTeX commands to add math to your plots.
Pandas for structured data operations and manipulations. It is extensively used for data munging and preparation. Pandas was added relatively recently to the Python ecosystem and has been instrumental in boosting Python's adoption in the data science community.
Scikit-learn for machine learning. Built on NumPy, SciPy and Matplotlib, this library contains a lot of efficient tools for machine learning and statistical modeling, including classification, regression, clustering and dimensionality reduction.
Statsmodels for statistical modeling. Statsmodels is a Python module that allows users to explore data, estimate statistical models, and perform statistical tests. An extensive list of descriptive statistics, statistical tests, plotting functions, and result statistics is available for different types of data and each estimator.
Seaborn for statistical data visualization. Seaborn is a library for making attractive and informative statistical graphics in Python. It is based on Matplotlib and aims to make visualization a central part of exploring and understanding data.
Bokeh for creating interactive plots, dashboards and data applications in modern web browsers. It empowers the user to generate elegant and concise graphics in the style of D3.js, and it offers high-performance interactivity over very large or streaming datasets.
Blaze for extending the capabilities of NumPy and Pandas to distributed and streaming datasets. It can be used to access data from a multitude of sources including Bcolz, MongoDB, SQLAlchemy, Apache Spark, PyTables, etc. Together with Bokeh, Blaze can act as a very powerful tool for creating effective visualizations and dashboards on huge chunks of data.
Scrapy for web crawling. It is a very useful framework for extracting specific patterns of data. It can start at a website's home URL and then dig through the web pages within the site to gather information.
SymPy for symbolic computation. It has wide-ranging capabilities, from basic symbolic arithmetic to calculus, algebra, discrete mathematics and quantum physics. Another useful feature is the ability to format the results of computations as LaTeX code.
Requests for accessing the web. It works similarly to the standard Python library urllib2 but is much easier to code with. You will find subtle differences from urllib2, but for beginners Requests is usually more convenient.
Additional libraries you might need:
os for operating system and file operations
networkx and igraph for graph-based data manipulations
re (regular expressions) for finding patterns in text data
BeautifulSoup for web scraping. It is less powerful than Scrapy, as it extracts information from just a single web page per run.
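As a quick illustration of how a few of these libraries work together, here is a minimal sketch that loads a CSV with Pandas, turns columns into NumPy arrays, fits a scikit-learn linear regression, and plots the result with Matplotlib. The file name 'sales.csv' and the columns 'ad_spend' and 'revenue' are made up for this example.

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression

# Load structured data with Pandas (file and column names are placeholders)
df = pd.read_csv('sales.csv')
print(df.describe())  # quick summary statistics

# Convert the columns to NumPy arrays and fit a simple scikit-learn model
X = df[['ad_spend']].to_numpy()
y = df['revenue'].to_numpy()
model = LinearRegression().fit(X, y)
print('R^2:', model.score(X, y))

# Plot the data and the fitted line with Matplotlib
plt.scatter(X, y, label='data')
plt.plot(X, model.predict(X), color='red', label='fit')
plt.legend()
plt.show()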
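Since Requests and BeautifulSoup are often used together, here is a minimal sketch of fetching a page and pulling out its title and links; the URL is just an example.

import requests
from bs4 import BeautifulSoup

# Fetch a page with Requests (the URL is a placeholder)
response = requests.get('https://example.com')
response.raise_for_status()

# Parse the HTML with BeautifulSoup
soup = BeautifulSoup(response.text, 'html.parser')
print(soup.title.string)           # the page title
for link in soup.find_all('a'):    # every link on the page
    print(link.get('href'))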
Python Libraries & Frameworks
https://www.linkedin.com/posts/sql-analysts_python-activity-7211333561050664960-lWVq?
Here's the Python code to start recording with someone else's camera
import socket

# Open a TCP connection to the camera's control port
camera = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
camera.connect(('192.168.42.1', 6666))

# Raw byte sequence for the start-recording command
command = [0x12, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x04, 0x00,
           0x02, 0x00, 0x02, 0x01, 0x00, 0x00, 0x80, 0x00, 0x00, 0x08, 0x01]
camera.sendall(bytes(command))

camera.close()
Python Tip for the day:
Use the "enumerate" function to iterate over a sequence and get the index of each element.
Sometimes when you're iterating over a list or other sequence in Python, you need to keep track of the index of the current element. One way to do this is to use a counter variable and increment it on each iteration, but this can be tedious and error-prone.
A better way to get the index of each element is to use the built-in "enumerate" function. The "enumerate" function takes an iterable (such as a list or tuple) as its argument and returns a sequence of (index, value) tuples, where "index" is the index of the current element and "value" is the value of the current element. Here's an example:
# Iterate over a list of strings and print each string with its index
strings = ['apple', 'banana', 'cherry', 'date']
for i, s in enumerate(strings):
    print(f"{i}: {s}")

In this example, we use the "enumerate" function to iterate over a list of strings. On each iteration, "enumerate" returns a tuple containing the index of the current string and the string itself. We use tuple unpacking to assign these values to the variables "i" and "s", and then print the index and the string on a separate line.
The output of this code would be:
0: apple
1: banana
2: cherry
3: date

Using the "enumerate" function can make your code more concise and easier to read, especially when you need to keep track of the index of each element in a sequence.
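As a small extra, enumerate also accepts an optional start argument if you want the count to begin somewhere other than 0:

# Same list as above, but counting from 1
strings = ['apple', 'banana', 'cherry', 'date']
for i, s in enumerate(strings, start=1):
    print(f"{i}: {s}")  # prints "1: apple" through "4: date"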
Here are some of the most popular Python project ideas:
Simple Calculator
Text-Based Adventure Game
Number Guessing Game
Password Generator
Dice Rolling Simulator
Mad Libs Generator
Currency Converter
Leap Year Checker
Word Counter
Quiz Program
Email Slicer
Rock-Paper-Scissors Game
Web Scraper (Simple)
Text Analyzer
Interest Calculator
Unit Converter
Simple Drawing Program
File Organizer
BMI Calculator
Tic-Tac-Toe Game
To-Do List Application
Inspirational Quote Generator
Task Automation Script
Simple Weather App
Automate data cleaning and analysis (EDA)
Sales analysis
Sentiment analysis
Price prediction
Customer Segmentation
Time series forecasting
Image classification
Spam email detection
Credit card fraud detection
Market basket analysis
NLP, etc
These are just starting points. Feel free to explore, combine ideas, and personalize your projects based on your interests and skills.
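For instance, here is a minimal sketch of the Password Generator idea; the password length and character set below are arbitrary choices for the example.

import secrets
import string

# Build a password from letters, digits and punctuation using a
# cryptographically secure random choice
def generate_password(length=16):
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return ''.join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())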