Stackmaxxing for a recursion world record
https://www.reddit.com/r/programming/comments/1qm9ilz/stackmaxxing_for_a_recursion_world_record/
submitted by /u/Chii (https://www.reddit.com/user/Chii)
[link] (https://www.youtube.com/watch?v=WQKSyPYF0-Y) [comments] (https://www.reddit.com/r/programming/comments/1qm9ilz/stackmaxxing_for_a_recursion_world_record/)
Building a lightning-fast highly-configurable Rust-based backtesting system
https://www.reddit.com/r/programming/comments/1qmdzue/building_a_lightningfast_highlyconfigurable/
I created a very detailed technical design doc for how I built a Rust-based algorithmic trading platform. Feel free to ask me any questions below! submitted by /u/ReplacementNo598 (https://www.reddit.com/user/ReplacementNo598)
[link] (https://nexustrade.io/blog/building-a-lightning-fast-highly-configurable-rust-based-backtesting-system-20260119) [comments] (https://www.reddit.com/r/programming/comments/1qmdzue/building_a_lightningfast_highlyconfigurable/)
Anatomy of the 2024 CrowdStrike outage: a single update, global impact
https://www.reddit.com/r/programming/comments/1qmfv3y/anatomy_of_the_2024_crowdstrike_outage_a_single/
submitted by /u/Digitalunicon (https://www.reddit.com/user/Digitalunicon)
[link] (https://en.wikipedia.org/wiki/2024_CrowdStrike-related_IT_outages) [comments] (https://www.reddit.com/r/programming/comments/1qmfv3y/anatomy_of_the_2024_crowdstrike_outage_a_single/)
I got tired of manual priority weights in proxies so I used a Reverse Radix Tree instead
https://www.reddit.com/r/programming/comments/1qmhw95/i_got_tired_of_manual_priority_weights_in_proxies/
Most reverse proxies like Nginx or Traefik handle domain rules in the order you write them or by using those annoying "priority" tags. If you have overlapping wildcards, like *.myapp.test and api.myapp.test, you usually have to play "Priority Tetris" to make sure the right rule wins. I wanted something more deterministic and intuitive: a system where the most specific match always wins without me having to tinker with config weights every time I add a subdomain. I ended up building a Reverse Radix Tree. The basic idea is that domain hierarchy is actually right to left: test -> myapp -> api. By splitting the domain on the dots and reversing the segments before putting them in the tree, the data structure finally matches the way DNS actually works. To handle cases where multiple patterns might match (like api-* vs *), I added a "Literal Density" score. The resolver counts how many non-wildcard characters are in a segment and tries the "densest" (most specific) ones first. This happens naturally as you walk down the tree, so the hierarchy itself acts as a filter. I wrote a post about the logic, how the scoring works, and how I use named parameters to hydrate dynamic upstreams: https://getlode.app/blog/2026-01-25-stop-playing-priority-tetris How do you guys handle complex wildcard routing? Do you find manual weights a necessary evil, or would you prefer a hierarchical approach like this? submitted by /u/robbiedobbie (https://www.reddit.com/user/robbiedobbie)
[link] (https://getlode.app/blog/2026-01-25-stop-playing-priority-tetris) [comments] (https://www.reddit.com/r/programming/comments/1qmhw95/i_got_tired_of_manual_priority_weights_in_proxies/)
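To make the reversed-segment idea and the literal-density ordering described above concrete, here is a minimal Kotlin sketch. The names (RouteNode, Router, literalDensity) and the prefix-only wildcard matching are illustrative assumptions, not the post's actual implementation.

```kotlin
class RouteNode {
    val children = mutableMapOf<String, RouteNode>()  // segment pattern -> child node
    var upstream: String? = null                       // set on terminal (route) nodes
}

// "Literal density": count of non-wildcard characters, so "api-*" beats "*".
fun literalDensity(pattern: String): Int = pattern.count { it != '*' }

fun segmentMatches(pattern: String, segment: String): Boolean = when {
    pattern == "*" -> true
    pattern.endsWith("*") -> segment.startsWith(pattern.dropLast(1))
    else -> pattern == segment
}

class Router {
    private val root = RouteNode()

    fun add(pattern: String, upstream: String) {
        var node = root
        // Split on dots and reverse so the tree mirrors DNS hierarchy: test -> myapp -> api.
        for (seg in pattern.split('.').reversed()) {
            node = node.children.getOrPut(seg) { RouteNode() }
        }
        node.upstream = upstream
    }

    fun resolve(host: String): String? =
        resolve(root, host.split('.').reversed(), 0)

    private fun resolve(node: RouteNode, segs: List<String>, i: Int): String? {
        if (i == segs.size) return node.upstream
        // Try the "densest" (most literal) matching children first, falling back to wildcards.
        val candidates = node.children.entries
            .filter { segmentMatches(it.key, segs[i]) }
            .sortedByDescending { literalDensity(it.key) }
        for (entry in candidates) {
            resolve(entry.value, segs, i + 1)?.let { return it }
        }
        return null
    }
}

fun main() {
    val router = Router()
    router.add("*.myapp.test", "wildcard-upstream")
    router.add("api.myapp.test", "api-upstream")
    println(router.resolve("api.myapp.test"))  // api-upstream (most specific wins)
    println(router.resolve("web.myapp.test"))  // wildcard-upstream
}
```

The point of the sketch is that specificity falls out of the traversal order: adding api.myapp.test never requires re-weighting the existing *.myapp.test rule.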
Nano Queries, a state-of-the-art Query Builder
https://www.reddit.com/r/programming/comments/1qmk23f/nano_queries_a_state_of_the_art_query_builder/
submitted by /u/vitonsky (https://www.reddit.com/user/vitonsky)
[link] (https://vitonsky.net/blog/2026/01/24/nano-queries/) [comments] (https://www.reddit.com/r/programming/comments/1qmk23f/nano_queries_a_state_of_the_art_query_builder/)
Been following the metadata management space for work reasons and came across an interesting design problem that Apache Gravitino tried to solve in their 1.1 release
https://www.reddit.com/r/programming/comments/1qmkv8f/been_following_the_metadata_management_space_for/
Been following the metadata management space for work reasons and came across an interesting design problem that Apache Gravitino tried to solve in their 1.1 release. The problem: we have like 5+ different table formats now (Iceberg, Delta Lake, Hive, Hudi, now Lance for vectors) and each has its own catalog implementation, its own way of handling namespaces, and its own capability negotiation. If you want to build a unified metadata layer across all of them, you end up writing tons of boilerplate code for each new format. Their solution was to create a generic lakehouse catalog framework that abstracts away the format-specific stuff. The idea is you define a standard interface for how catalogs should negotiate capabilities and handle namespaces, then each format implementation just fills in the blanks. What caught my attention was the trade-off discussion. On one hand, abstractions add complexity and sometimes leak. On the other hand, the lakehouse ecosystem is adding new formats constantly. Without this kind of framework, every new format means rewriting similar integration code. From a software design perspective, this reminded me of the adapter pattern but at a larger scale. The challenge is figuring out what belongs in the abstract interface vs what's genuinely format-specific. Has anyone here dealt with similar unification problems? Like building a common interface across multiple storage backends or database types? Curious how you decided where to draw the abstraction boundary. Link to the release notes if anyone wants to dig into specifics: https://github.com/apache/gravitino/releases/tag/v1.1.0 submitted by /u/Agitated_Fox2640 (https://www.reddit.com/user/Agitated_Fox2640)
[link] (https://github.com/apache/gravitino/releases/tag/v1.1.0) [comments] (https://www.reddit.com/r/programming/comments/1qmkv8f/been_following_the_metadata_management_space_for/)
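As a hedged sketch of the "standard interface, fill-in-the-blanks implementations" idea described above, here is what such a catalog abstraction could look like in Kotlin. This is not Gravitino's actual API; the interface, capability names, and implementations are invented purely for illustration.

```kotlin
// Illustrative only: a shared catalog contract with capability negotiation,
// plus two format-specific implementations that fill in the blanks.

enum class Capability { TIME_TRAVEL, SCHEMA_EVOLUTION, VECTOR_INDEX }

data class TableId(val namespace: List<String>, val name: String)

interface LakehouseCatalog {
    val format: String
    fun capabilities(): Set<Capability>
    fun listTables(namespace: List<String>): List<TableId>
    fun loadTable(id: TableId): String  // stand-in for real table metadata
}

class IcebergCatalog : LakehouseCatalog {
    override val format = "iceberg"
    override fun capabilities() = setOf(Capability.TIME_TRAVEL, Capability.SCHEMA_EVOLUTION)
    override fun listTables(namespace: List<String>) =
        listOf(TableId(namespace, "orders"))        // would call the Iceberg catalog here
    override fun loadTable(id: TableId) = "iceberg metadata for ${id.name}"
}

class LanceCatalog : LakehouseCatalog {
    override val format = "lance"
    override fun capabilities() = setOf(Capability.VECTOR_INDEX)
    override fun listTables(namespace: List<String>) =
        listOf(TableId(namespace, "embeddings"))    // would list Lance datasets here
    override fun loadTable(id: TableId) = "lance metadata for ${id.name}"
}

// The unified metadata layer only talks to the interface, so supporting a new
// format means adding one implementation instead of rewriting integration code.
fun planRead(catalog: LakehouseCatalog, id: TableId): String =
    if (Capability.TIME_TRAVEL in catalog.capabilities())
        "snapshot read of ${catalog.loadTable(id)}"
    else
        "latest-version read of ${catalog.loadTable(id)}"

fun main() {
    println(planRead(IcebergCatalog(), TableId(listOf("sales"), "orders")))
    println(planRead(LanceCatalog(), TableId(listOf("ml"), "embeddings")))
}
```

Where the boundary goes, such as how namespaces are modeled or how far capability flags should reach, is exactly the trade-off the post raises.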
Hermes Proxy - Yet Another HTTP Traffic Analyzer
https://www.reddit.com/r/programming/comments/1qmleiq/hermes_proxy_yet_another_http_traffic_analyzer/
submitted by /u/--jp-- (https://www.reddit.com/user/--jp--)
[link] (https://github.com/jp/Hermes-Proxy) [comments] (https://www.reddit.com/r/programming/comments/1qmleiq/hermes_proxy_yet_another_http_traffic_analyzer/)
C++ RAII guard to detect heap allocations in scopes
https://www.reddit.com/r/programming/comments/1qmo44v/c_raii_guard_to_detect_heap_allocations_in_scopes/
Needed a lightweight way to catch heap allocations in C++ and couldn’t find anything simple, so I built this. Sharing in case it helps anyone. submitted by /u/North_Chocolate7370 (https://www.reddit.com/user/North_Chocolate7370)
[link] (https://github.com/mkslge/noalloc-cpp) [comments] (https://www.reddit.com/r/programming/comments/1qmo44v/c_raii_guard_to_detect_heap_allocations_in_scopes/)
Revision website
https://www.reddit.com/r/programming/comments/1qmsf78/revision_website/
submitted by /u/Jayden11227 (https://www.reddit.com/user/Jayden11227)
[link] (https://brainmaprevision.vercel.app/) [comments] (https://www.reddit.com/r/programming/comments/1qmsf78/revision_website/)
Failing Fast: Why Quick Failures Beat Slow Deaths
https://www.reddit.com/r/programming/comments/1qmxjft/failing_fast_why_quick_failures_beat_slow_deaths/
submitted by /u/trolleid (https://www.reddit.com/user/trolleid)
[link] (https://lukasniessen.medium.com/failing-fast-why-quick-failures-beat-slow-deaths-ffaa491fa510) [comments] (https://www.reddit.com/r/programming/comments/1qmxjft/failing_fast_why_quick_failures_beat_slow_deaths/)
I built a 2x faster lexer, then discovered I/O was the real bottleneck
https://www.reddit.com/r/programming/comments/1qmznm8/i_built_a_2x_faster_lexer_then_discovered_io_was/
submitted by /u/modulovalue (https://www.reddit.com/user/modulovalue)
[link] (https://modulovalue.com/blog/syscall-overhead-tar-gz-io-performance/) [comments] (https://www.reddit.com/r/programming/comments/1qmznm8/i_built_a_2x_faster_lexer_then_discovered_io_was/)
In humble defense of the .zip TLD
https://www.reddit.com/r/programming/comments/1qn02yc/in_humble_defense_of_the_zip_tld/
submitted by /u/yathern (https://www.reddit.com/user/yathern)
[link] (https://luke.zip/posts/zip-defense/) [comments] (https://www.reddit.com/r/programming/comments/1qn02yc/in_humble_defense_of_the_zip_tld/)
Day 5: Heartbeat Protocol – Detecting Dead Connections at Scale
https://www.reddit.com/r/programming/comments/1qn14mc/day_5_heartbeat_protocol_detecting_dead/
submitted by /u/Extra_Ear_10 (https://www.reddit.com/user/Extra_Ear_10)
[link] (https://javatsc.substack.com/p/day-5-heartbeat-protocol-detecting) [comments] (https://www.reddit.com/r/programming/comments/1qn14mc/day_5_heartbeat_protocol_detecting_dead/)
The open-source React calendar inspired by macOS Calendar – DayFlow
https://www.reddit.com/r/programming/comments/1qn1gmz/the_opensource_react_calendar_inspired_by_macos/
Hi everyone 👋 I’d like to share DayFlow, an open-source full-calendar component for the web that I’ve been building over the past year. I’m a heavy macOS Calendar user, and when I was looking for a clean, modern calendar UI on GitHub (especially one that works well with Tailwind / shadcn-ui), I couldn’t find something that fully matched my needs, so I decided to build one myself. DayFlow focuses on a clean, modern calendar UI inspired by macOS Calendar; it is built with React and designed for modern web apps; it integrates easily with shadcn-ui and other Tailwind UI libraries; its architecture is modular (views, events, and panels are customizable); and i18n support is actively in progress. The project is fully open source, and I’d really appreciate feedback on the API and architecture, feature suggestions, bug reports, or PRs if you’re interested in contributing. GitHub: https://github.com/dayflow-js/calendar Demo: https://dayflow-js.github.io/calendar/ Thanks for reading, and I’d love to hear your thoughts 🙏 submitted by /u/Cultural_Mission_482 (https://www.reddit.com/user/Cultural_Mission_482)
[link] (https://dayflow-js.github.io/calendar/) [comments] (https://www.reddit.com/r/programming/comments/1qn1gmz/the_opensource_react_calendar_inspired_by_macos/)
Enigma Machine Simulator
https://www.reddit.com/r/programming/comments/1qn2j6d/enigma_machine_simulator/
submitted by /u/thepan73 (https://www.reddit.com/user/thepan73)
[link] (https://andrewthecoder.com/blog/javascript-enigma-machine) [comments] (https://www.reddit.com/r/programming/comments/1qn2j6d/enigma_machine_simulator/)
Broken Tooling & Flaky Tests (CI/CD)
https://www.reddit.com/r/programming/comments/1qn30qx/broken_tooling_flaky_tests_cicd/
submitted by /u/Tech_News_Blog (https://www.reddit.com/user/Tech_News_Blog)
[link] (http://cosmic-ai.pages.dev/) [comments] (https://www.reddit.com/r/programming/comments/1qn30qx/broken_tooling_flaky_tests_cicd/)
Study finds many software developers feel ethical pressure to ship products that may conflict with democratic values
https://www.reddit.com/r/programming/comments/1qnaguj/study_finds_many_software_developers_feel_ethical/
submitted by /u/SentFromHeav3n (https://www.reddit.com/user/SentFromHeav3n)
[link] (https://www.tandfonline.com/doi/full/10.1080/1369118X.2025.2566814) [comments] (https://www.reddit.com/r/programming/comments/1qnaguj/study_finds_many_software_developers_feel_ethical/)
Announcing MapLibre Tile: a modern and efficient vector tile format
https://www.reddit.com/r/programming/comments/1qndbz5/announcing_maplibre_tile_a_modern_and_efficient/
submitted by /u/Dear-Economics-315 (https://www.reddit.com/user/Dear-Economics-315)
[link] (https://maplibre.org/news/2026-01-23-mlt-release/) [comments] (https://www.reddit.com/r/programming/comments/1qndbz5/announcing_maplibre_tile_a_modern_and_efficient/)
Locale-dependent case conversion bugs persist (Kotlin as a real-world example)
https://www.reddit.com/r/programming/comments/1qndjes/localedependent_case_conversion_bugs_persist/
Case-insensitive logic can fail in surprising ways when string case conversion depends on the ambient locale. Many programs assume that operations like ToLower() or ToUpper() are locale-neutral, but in reality their behavior can vary by system settings. This can lead to subtle bugs, often involving the well-known “Turkish I” casing rules, where identifiers, keys, or comparisons stop working correctly outside en-US environments. The Kotlin compiler incident linked here is a concrete, real-world example of this broader class of locale-dependent case conversion bugs. submitted by /u/BoloFan05 (https://www.reddit.com/user/BoloFan05)
[link] (https://sam-cooper.medium.com/the-country-that-broke-kotlin-84bdd0afb237) [comments] (https://www.reddit.com/r/programming/comments/1qndjes/localedependent_case_conversion_bugs_persist/)
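As a small illustration of the bug class described above (not the specific Kotlin compiler incident), here is a Kotlin/JVM sketch showing how case conversion with the default locale can silently break an ASCII comparison under Turkish casing rules, while locale-invariant conversion stays stable. The key value "IMAGE" is just an example identifier.

```kotlin
import java.util.Locale

fun main() {
    val turkish = Locale.forLanguageTag("tr-TR")
    val id = "IMAGE"

    // Locale-invariant lowercase: always "image".
    println(id.lowercase(Locale.ROOT))        // image

    // Locale-sensitive lowercase: under Turkish rules, capital I maps to
    // dotless ı (U+0131), so the result no longer matches the ASCII key.
    println(id.lowercase(turkish))            // ımage

    // The classic failure mode: code that lowercases with the *default*
    // locale (what legacy toLowerCase()/String.toLowerCase() does) silently
    // changes behavior on a machine configured for Turkish.
    Locale.setDefault(turkish)
    println(id.lowercase(Locale.getDefault()) == "image")  // false
}
```

The usual fix is to pass an explicit invariant locale (or use a locale-agnostic API) whenever the strings are protocol identifiers or keys rather than user-facing text.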
Two empty chairs: why "obvious" decisions keep breaking production
https://www.reddit.com/r/programming/comments/1qneymk/two_empty_chairs_why_obvious_decisions_keep/
submitted by /u/dmp0x7c5 (https://www.reddit.com/user/dmp0x7c5)
[link] (https://l.perspectiveship.com/re-pesh) [comments] (https://www.reddit.com/r/programming/comments/1qneymk/two_empty_chairs_why_obvious_decisions_keep/)
AI generated tests as ceremony
https://www.reddit.com/r/programming/comments/1qng7j6/ai_generated_tests_as_ceremony/
submitted by /u/toolbelt (https://www.reddit.com/user/toolbelt)
[link] (https://blog.ploeh.dk/2026/01/26/ai-generated-tests-as-ceremony/) [comments] (https://www.reddit.com/r/programming/comments/1qng7j6/ai_generated_tests_as_ceremony/)