https://www.hellointerview.com/learn/system-design/problem-breakdowns/bitly
omg, check it. I have to create my version, because in general people propose SUCH BAD solutions for the core problem here: "generate distributed unique values".
andreyka26_se
I will explain: there are 3 solutions proposed, and among them there are 0 good ones.
First: just bad, let's skip it.
Second: a hash function. Basically, hash the input URL and base62-encode it: base62(hash(input_url))[:8], i.e. take the first 8 characters of the encoded value (sketch below). Of course there will be collisions, and the proposal is: just retry until you succeed (add a UNIQUE constraint).
Like, wtf. How many database retries will there be once we approach our non-functional requirement of 1B URLs?????
ALSO THE GUY THINKS THIS IS THE SAME AS THE SNOWFLAKE ID GENERATION PATTERN. I know he is staff at Meta, but man, Snowflake and this hashing are TWO COMPLETELY DIFFERENT approaches.
Third: a unique counter via Redis (second sketch below). They propose Redis because it is fast. It will not be fast, because you will fsync on every operation, since you don't want to lose data; otherwise "uniqueness" is not guaranteed.
Second problem: they propose default Redis replication, which is asynchronous and again introduces data loss.
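For reference, here is my reconstruction of that hashing approach as a minimal sketch (not hellointerview's actual code; md5 and the salt-on-retry are my assumptions):

```python
# Minimal sketch of the hash-then-truncate approach (my reconstruction).
import hashlib
import string

ALPHABET = string.digits + string.ascii_lowercase + string.ascii_uppercase  # 62 symbols

def base62(n: int) -> str:
    out = []
    while n:
        n, r = divmod(n, 62)
        out.append(ALPHABET[r])
    return "".join(reversed(out)) or "0"

def short_code(url: str, attempt: int = 0) -> str:
    # On a UNIQUE-constraint violation the caller retries with attempt + 1.
    digest = hashlib.md5(f"{url}:{attempt}".encode()).hexdigest()
    return base62(int(digest, 16))[:8]  # keep the first 8 base62 characters
```

Note that every collision costs a full extra database round-trip, and you only learn about it from the constraint violation.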
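And the counter approach, roughly (again my sketch, using the redis-py client; the comments spell out what durability actually costs):

```python
# Sketch of the Redis counter approach (my reconstruction). INCR is atomic,
# so it hands out unique integers; but for uniqueness to survive a crash you
# need appendonly=yes with appendfsync=always (an fsync per write, so no
# longer "fast"), and default Redis replication is async, so a failover can
# still replay a counter value and break uniqueness.
import redis

r = redis.Redis()

def next_code() -> str:
    counter = r.incr("url:counter")  # unique... until Redis loses data
    return base62(counter)           # base62 from the sketch above
```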
andreyka26_se
I mean, why it is considered "THE BEST" resource, I honestly don't know. Some of the system designs there are great, I personally learned a lot when I was prepping, but this one is very bad.
What do I consider a good approach?
1) Snowflake IDs: stateless, infinitely scalable, cheap, applied in real production (sketch below).
2) Shards with prefixes: build multiple counters that each run from 0 to N, prepend 1-2 digits of "shard id", and round-robin across them. This ensures both uniqueness and scalability (second sketch below).
These are the 2 "BEST" choices here.
You can also precompute all the IDs and then just reserve them with optimistic concurrency (+ apply sharding). But that would be so-so.
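A minimal Snowflake-style sketch, assuming the classic 41/10/12-bit layout (timestamp / worker id / sequence); the custom epoch is made up:

```python
# Snowflake-style ids: 41 bits of millisecond timestamp, 10 bits of worker
# id, 12 bits of per-millisecond sequence. Workers share nothing, so there
# is no coordination on the hot path.
import time
import threading

EPOCH_MS = 1_704_067_200_000  # hypothetical custom epoch (2024-01-01 UTC)

class Snowflake:
    def __init__(self, worker_id: int):
        assert 0 <= worker_id < 1024        # must fit in 10 bits
        self.worker_id = worker_id
        self.last_ms = -1
        self.seq = 0
        self.lock = threading.Lock()

    def next_id(self) -> int:
        with self.lock:
            now = time.time_ns() // 1_000_000 - EPOCH_MS
            if now == self.last_ms:
                self.seq = (self.seq + 1) & 0xFFF    # 12-bit sequence
                if self.seq == 0:                    # 4096 ids this ms: spin
                    while now <= self.last_ms:
                        now = time.time_ns() // 1_000_000 - EPOCH_MS
            else:
                self.seq = 0
            self.last_ms = now
            return (now << 22) | (self.worker_id << 12) | self.seq
```

base62 of the 64-bit id gives a ~11-character code; clock-skew handling and worker-id assignment are left out of the sketch.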
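And the sharded counters as a toy sketch (in-memory ints stand in for what would be per-shard database sequences; ALPHABET/base62 are from the hashing sketch above):

```python
# Sharded counters with a shard-id prefix: each shard increments its own
# independent counter, and the prefix digit guarantees that two shards can
# never emit the same code.
import itertools

NUM_SHARDS = 62                             # one base62 digit of prefix

counters = [0] * NUM_SHARDS                 # stand-ins for per-shard sequences
rr = itertools.cycle(range(NUM_SHARDS))     # round-robin over the shards

def next_code() -> str:
    shard = next(rr)
    counters[shard] += 1                    # in prod: the shard's own sequence
    return ALPHABET[shard] + base62(counters[shard])
```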
andreyka26_se
Photo
Especially the reply to the comment; this disappoints me the most....
Daily
This task was either in the top 75 or the top 100 liked, so it is a good mid question.
https://leetcode.com/problems/successful-pairs-of-spells-and-potions/description/?envType=daily-question&envId=2025-10-08
#daily
How frequently do you solve the Daily LeetCode challenge?
Anonymous Poll
(almost) everyday: 20%
couple of times per week: 15%
couple of times per month: 24%
never: 41%
andreyka26_se
Eventually exclusive content!
It is only for you, people. Instagram and TikTok won't see it.
My friend asked me "how would you scale a game matchmaking system?". This is actually a very good system design question.
I have no idea of the requirements, but this is something that I came up with in 15 mins.
A lot of stuff here is out of scope and missing.
The concept is a bit similar to the "Virtual Queue" in the Ticket Master system design: you have a shared sorted set and need to pop N players out of it (sketch below).
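Not the full design, just a toy sketch of that shared-sorted-set part, assuming Redis and the redis-py client (the queue name and team size are made up):

```python
# Matchmaking queue as a Redis sorted set: players are scored by rating
# (enqueue time would work too), and a matcher atomically pops N adjacent
# players to form a match. ZPOPMIN is atomic, so concurrent matchers never
# grab the same player.
import redis

TEAM_SIZE = 10  # hypothetical match size

r = redis.Redis()

def enqueue(player_id: str, rating: float) -> None:
    # A single shared queue; a real design would shard by region/game mode.
    r.zadd("mm:queue", {player_id: rating})

def try_form_match() -> list[str] | None:
    popped = r.zpopmin("mm:queue", TEAM_SIZE)
    if len(popped) < TEAM_SIZE:
        # Not enough players yet: put them back and wait for the next tick.
        if popped:
            r.zadd("mm:queue", {member.decode(): score for member, score in popped})
        return None
    return [member.decode() for member, score in popped]
```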
Daily
This was very hard for me; I was not even able to come up with the simulation, because, I'm not sure, my brain just cannot hold all the simulation variables in my head at the same time (ongoing potions, busy/not-busy mages).
But there is one picture that helps a lot with building the simulation (check the comments).
https://leetcode.com/problems/find-the-minimum-amount-of-time-to-brew-potions/description/?envType=daily-question&envId=2025-10-09
#daily
Daily
Today's is a solid, good, not-hard DP problem. Love it. Go and collect the point.
https://leetcode.com/problems/taking-maximum-energy-from-the-mystic-dungeon/description/?envType=daily-question&envId=2025-10-10
#daily #medium
Daily
https://leetcode.com/problems/maximum-total-damage-with-spell-casting/?envType=daily-question&envId=2025-10-11
Solid DP problem, which requires a bit of smartness.
Description understanding helper: it is NOT power[i - 1], power[i - 2], ... BUT INSTEAD power[i] - 1, power[i] - 2, i.e. damage values, not indices. It saved me a lot of time.
#daily #mid #dp
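A sketch of the DP that falls out of that observation (my own take, not an official editorial): group the spells by damage value, sort the unique values, and let value v chain with the best answer over values strictly below v - 2:

```python
# Casting damage v forbids v-1, v-2, v+1, v+2 (values, not indices), and
# all copies of the same damage value can be cast together.
from bisect import bisect_left
from collections import Counter

def maximum_total_damage(power: list[int]) -> int:
    cnt = Counter(power)
    vals = sorted(cnt)
    dp = [0] * len(vals)                     # dp[i] = best over vals[0..i]
    for i, v in enumerate(vals):
        take = v * cnt[v]                    # cast every copy of v
        j = bisect_left(vals, v - 2) - 1     # last value compatible with v
        if j >= 0:
            take += dp[j]
        dp[i] = max(dp[i - 1] if i else 0, take)
    return dp[-1] if vals else 0
```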
andreyka26_se
AHA, I got it, I should join both. And the problem is that one of them is today at 16:00 UTC+2, and the other one is at 4:00 AM UTC+2.
35 coins for waking up at 4:00 AM?? No, thanks.
King's Landing, Westeros
Daily
Today's is a trash problem, don't even fucking spend your time on it.
https://leetcode.com/problems/find-sum-of-array-product-of-magical-sequences/description/?envType=daily-question&envId=2025-10-12
#daily #hard #trash