🌟 Exciting Announcement! 🌟
Hey there, Python enthusiasts! 🐍✨
I wanted to share a quick update on our Python Advanced Course. Although it didn't quite go as expected, don't worry, because we've got something special planned for you! Before we kick off our highly anticipated Django course, I'm hosting one last class this coming Monday. 📚🎉
But here's the twist: I'd love for all of you to contribute by suggesting project titles or ideas! 🤔💡 This way, we can challenge ourselves and take our skills to new heights as we work on a collaborative Django project together. It's the perfect opportunity to dive deeper into web development and have a blast while doing it! 🚀💻
So, if you have any creative ideas or suggestions to enhance our learning experience, please feel free to share them. Your input is invaluable! Let's make this journey one to remember. 😊
Stay tuned for more details and get ready to embark on an exciting Django adventure soon!
Happy coding!
#Python #Django #LearningTogether #CodingCommunity
You have a chance to win a prize if you fill out the Python Developers Survey 2023! Fill it out today to get your name in the hat 🪄🎩🎁
p.s. if you think snakes are cute: image search "snakes wearing hats" 🥹 you're welcome! #python
https://survey.alchemer.com/s3/7554174/python-developers-survey-2023
Source: https://twitter.com/ThePSF/status/1749707292814061651?t=8Fmw9yJ_o8Bg3Epea0TeWA&s=19
Threading is a very interesting topic in Python, and also one of Python's most famous weaknesses (there is a solution for it too).
By default, Python locks thread execution to a single thread at a time (this is the Global Interpreter Lock, or GIL), which means you can't run Python code in parallel. But there is a concept called multi-threading. What does it mean? If Python doesn't execute multiple threads at the same time, then what is the use of multi-threading?
So here's a simple example:
import threading
import time

def waiter(n=3):
    time.sleep(n)
    print(f"I am done waiting {n} seconds")

t1 = threading.Thread(target=waiter, args=(3,))
t2 = threading.Thread(target=waiter, args=(2,))
t1.start()
t2.start()
t1.join()  # wait for both threads to finish before moving on
t2.join()
Now if you run this code, you will see all the output in about 3 seconds, even though in total it was supposed to wait 5 seconds; we got it down.
In reality, what happens is that once the first thread starts sleeping, Python leaves it to sleep and starts running the second waiter, so both threads wait at the same time and the total is just the longest sleep.
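You can see the overlap by timing it yourself; a quick sketch (sleep times scaled down so it runs fast):

```python
import threading
import time

def waiter(n):
    time.sleep(n)  # a blocking sleep releases the GIL, so the other thread can run

start = time.perf_counter()
t1 = threading.Thread(target=waiter, args=(0.3,))
t2 = threading.Thread(target=waiter, args=(0.2,))
t1.start()
t2.start()
t1.join()
t2.join()
elapsed = time.perf_counter() - start
print(f"elapsed: {elapsed:.2f}s")  # roughly 0.3s (the longest sleep), not 0.5s
```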
But there are multiple types of task people will execute. For example, take a look at this un-optimized Fibonacci code:
def fib(n):
    if n <= 1:
        return n
    else:
        return fib(n - 1) + fib(n - 2)
If you execute fib(35) and fib(36), it might take a few seconds. But what if you want to improve the speed? The tasks are independent, right? So can we run them in two threads and get the result faster?
import threading
t1 = threading.Thread(target=fib, args=(35,))
t2 = threading.Thread(target=fib, args=(36,))
t1.start()
t2.start()
t1.join()
t2.join()
Now you will be surprised: you won't get any performance gain, and you might even get worse performance because of context switching (we will talk about that one day). But what happened? Shouldn't it be faster? Well, like I said, Python only executes one thread at a time, so even though you have two threads running, only one executes at any moment; the first thread consumes all the resources and the second thread barely gets a chance to run.
So threading was never designed to improve CPU-bound tasks; it was designed to improve IO-bound tasks.
Now, what the heck are those terms, you might ask? Well, let me explain.
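And about that "solution" hinted at the start: for CPU-bound work you can sidestep the single-thread limit with multiprocessing, which gives each task its own process with its own interpreter. A minimal sketch (small inputs so it finishes quickly):

```python
import multiprocessing

def fib(n):
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

if __name__ == "__main__":
    # two worker processes, so both fib calls can run on separate CPU cores
    with multiprocessing.Pool(processes=2) as pool:
        print(pool.map(fib, [20, 21]))  # → [6765, 10946]
```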
#python #thread
# IO Bound vs CPU Bound
CPU-bound tasks are tasks limited by the CPU: the code is doing heavy calculations, and that is what takes the time. For example, the Fibonacci code we saw earlier is calculating the whole time; there is no waiting for anything like reading a file, calling a database, or calling an API. It is just consuming CPU the entire time.
So tasks like this are CPU-bound tasks. The best you can do to improve them is to get more CPU power, or use a better algorithm so they don't consume that much CPU hahaha.
Some examples of CPU-bound tasks are:
- Prime number calculation
- Finding the factorial of a number
- Dijkstra's algorithm for finding the shortest path
- etc.
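As a tiny concrete illustration, a naive prime check from that list is pure computation; nothing ever waits on a file, a socket, or a database:

```python
def is_prime(n: int) -> bool:
    # trial division: the CPU is busy the whole time, never idle on IO
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

print(sum(1 for n in range(2, 100_000) if is_prime(n)))  # → 9592 primes below 100,000
```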
Now that we've got CPU-bound tasks covered, what about IO-bound tasks?
IO-bound tasks are tasks limited by input/output, like reading a file, calling a database, or calling an API. They are not doing any heavy computation; instead they are waiting for the input/output to respond.
Let's say you are searching through 10M users in a database to find the one with a specific email. You are not doing any heavy computation on YOUR machine; instead you are waiting for the database to respond.
Or take a real-world scenario: you want to download 1000 files from an AWS S3 bucket. In Python, that might look like this:
import boto3

files = ['file1', 'file2', 'file3', 'file4', 'file5']
s3 = boto3.client('s3')

for file in files:
    s3.download_file('bucket_name', file, file)  # blocks until each download finishes
Now if you take a look at this code, you are literally spending most of the time waiting for each file to download. So this is an IO-bound task.
But you can improve this in a couple of ways: using threads, using asyncio, using multiprocessing, etc.
Example using threading:
import threading
import boto3

files = ['file1', 'file2', 'file3', 'file4', 'file5']
s3 = boto3.client('s3')

def download_file(file):
    s3.download_file('bucket_name', file, file)

# start one thread per file so the downloads overlap
threads = []
for file in files:
    t = threading.Thread(target=download_file, args=(file,))
    t.start()
    threads.append(t)
for t in threads:
    t.join()
Now you will see that the files download faster than before. But this might not be the best improvement you can make; you can use asyncio or multiprocessing to push the performance even further. asyncio especially is designed for IO-bound tasks and is very good at them.
#python #asyncio
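To give a flavour of the asyncio version, here's a sketch with the downloads simulated by asyncio.sleep (boto3 itself is blocking, so real code would use aioboto3 or wrap each call in asyncio.to_thread):

```python
import asyncio
import time

async def download_file(name: str) -> str:
    await asyncio.sleep(0.2)  # stand-in for the network wait of a real download
    return name

async def main() -> list:
    files = ['file1', 'file2', 'file3', 'file4', 'file5']
    # gather starts all five coroutines and waits for them concurrently
    return await asyncio.gather(*(download_file(f) for f in files))

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
print(results, f"in {elapsed:.2f}s")  # ~0.2s total, not 1.0s
```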
🚀 Powerful Rate Limiter with FastAPI + Redis
A simple yet effective implementation of rate limiting using a fixed-window counter algorithm.
import time
import logging

from fastapi import HTTPException

logger = logging.getLogger(__name__)

class RateLimiter:
    def __init__(self, rate_limit: int = 100, time_window: int = 3600):
        self.redis = None
        self.rate_limit = rate_limit    # requests per window
        self.time_window = time_window  # window size in seconds

    async def check_rate_limit(self, api_key: str) -> bool:
        if not self.redis:
            self.redis = await get_redis_connection()  # app-specific helper
        window_key = f"ratelimit:{api_key}:{int(time.time()) // self.time_window}"
        try:
            # buffer both commands and send them in a single round trip
            pipeline = self.redis.pipeline()
            pipeline.incr(window_key)
            pipeline.expire(window_key, self.time_window)
            count = (await pipeline.execute())[0]
            if count > self.rate_limit:
                raise HTTPException(
                    status_code=429,
                    detail={"error": "Rate limit exceeded",
                            "limit": self.rate_limit,
                            "window": f"{self.time_window}s"},
                )
            return True
        except HTTPException:
            raise  # don't log 429s as Redis errors
        except Exception as e:
            logger.error(f"Rate limit error: {e}")
            raise

    async def get_remaining(self, api_key: str) -> int:
        count = await self.redis.get(
            f"ratelimit:{api_key}:{int(time.time()) // self.time_window}"
        )
        return max(0, self.rate_limit - (int(count) if count else 0))
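Note that the window key buckets time into fixed intervals, so strictly this is a fixed-window counter. A true sliding window keeps individual request timestamps; here's a minimal in-memory sketch of that idea (with Redis you'd typically use a sorted set instead):

```python
import time
from collections import deque

class SlidingWindowLimiter:
    """Allow at most rate_limit requests in any trailing time_window seconds."""

    def __init__(self, rate_limit: int = 100, time_window: float = 3600):
        self.rate_limit = rate_limit
        self.time_window = time_window
        self.timestamps = {}  # api_key -> deque of request times

    def allow(self, api_key: str, now: float = None) -> bool:
        now = time.monotonic() if now is None else now
        window = self.timestamps.setdefault(api_key, deque())
        # drop timestamps that have slid out of the trailing window
        while window and window[0] <= now - self.time_window:
            window.popleft()
        if len(window) >= self.rate_limit:
            return False
        window.append(now)
        return True
```

With Redis, the same idea is usually ZADD on each request, ZREMRANGEBYSCORE to trim old entries, and ZCARD to count, pipelined together.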
✨ Features:
• Fixed-window counter algorithm
• Redis for storage
• Configurable rate limit & window
• Remaining requests tracking
• Error handling & logging
📝 Usage:
limiter = RateLimiter(rate_limit=100, time_window=3600)
await limiter.check_rate_limit("api_key")
You can change the way you wire it in, e.g. as a FastAPI dependency, or use it in any Python framework.
#Python #FastAPI #Redis #RateLimit #WebDev