PythonHub
News & links about Python programming.
https://pythonhub.dev/
Production-Grade Python Logging Made Easier with Loguru

This guide will walk you through building a robust logging system with Loguru. We'll directly address the patterns and pain points of the standard logging module and show how Loguru simplifies them.
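A minimal sketch of Loguru's core pattern (file name, rotation size, and messages are placeholders, not the guide's own example): one import, then one logger.add call per sink.

from loguru import logger

# One call per sink: here, a rotating file with structured (JSON) output.
logger.add("app.log", level="INFO", rotation="10 MB", serialize=True)

logger.info("user {user_id} logged in", user_id=42)

try:
    1 / 0
except ZeroDivisionError:
    logger.exception("unexpected division error")  # captures the full traceback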

https://www.dash0.com/guides/python-logging-with-loguru
Starplot

Starplot is a Python library for creating star charts and maps of the sky.

https://starplot.dev/
Python REPL Shortcuts & Features

Python 3.13 introduced a completely redesigned REPL with a much more modern feel, and many of its features are easy to overlook. Let's explore the new REPL's hidden features, with a focus on tips that are useful for everyday usage.
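For example, a few shortcuts the new REPL supports (taken from the 3.13 release notes; the article covers more):

>>> exit        # exits directly; no parentheses or Ctrl-D needed
>>> # F1 opens interactive help, F2 browses history, F3 toggles paste mode
>>> # multi-line blocks are recalled and edited as a single history entry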

https://www.pythonmorsels.com/repl-features/
Add Agents to your Web Applications with Pydantic AI and Django

The video demonstrates how to integrate Pydantic AI with Django to add agent-like features to web applications, allowing developers to enrich chatbots with app-specific context and tools such as real-time weather services and database interactions. It showcases building customizable agents that can use dependencies, execute functions, and securely manipulate database recor...
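A minimal sketch of the Pydantic AI side (the model name, dependency class, and weather tool are hypothetical stand-ins; the Django wiring shown in the video is omitted):

from dataclasses import dataclass
from pydantic_ai import Agent, RunContext

@dataclass
class Deps:
    city: str  # hypothetical app-specific context, e.g. passed in from a Django view

agent = Agent(
    "openai:gpt-4o",
    deps_type=Deps,
    system_prompt="Answer questions, using tools when they help.",
)

@agent.tool
def current_weather(ctx: RunContext[Deps]) -> str:
    # stand-in for a real weather-service call
    return f"It is sunny in {ctx.deps.city}."

result = agent.run_sync("What's the weather like?", deps=Deps(city="Berlin"))
print(result.output)  # .data on older pydantic-ai releases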

https://www.youtube.com/watch?v=Z33IBfgVbxI
IBM / mcp-context-forge

A Model Context Protocol (MCP) Gateway & Registry. Serves as a central management point for tools, resources, and prompts that can be accessed by MCP-compatible LLM applications. Converts REST API endpoints to MCP, composes virtual MCP servers with added security and observability, and converts between protocols (stdio, SSE, Streamable HTTP).

https://github.com/IBM/mcp-context-forge
TensorRT-Model-Optimizer

A unified library of state-of-the-art model optimization techniques such as quantization, pruning, distillation, and speculative decoding. It compresses deep learning models for downstream deployment frameworks like TensorRT-LLM or TensorRT to optimize inference speed.
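A rough sketch of the post-training quantization flow (the toy model and calibration loop are placeholders, and config names vary by release):

import torch
import torch.nn as nn
import modelopt.torch.quantization as mtq

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))

def forward_loop(m):
    # a few batches of representative data for calibration
    for _ in range(8):
        m(torch.randn(4, 16))

# Replaces supported modules with quantized equivalents in place.
model = mtq.quantize(model, mtq.INT8_DEFAULT_CFG, forward_loop)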

https://github.com/NVIDIA/TensorRT-Model-Optimizer
Jaxformer: Scaling Modern Transformers

This is a zero-to-one guide on scaling modern transformers with n-dimensional parallelism. Transformers have driven much of the deep learning revolution, yet no practical guide reflects SOTA architectures and the complexities of large-scale language modelling. While excellent resources such as DeepMind’s How to Scale Your Model and HuggingFace’s Ultra Scale Playbook exist, a gap remains ...
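As a taste of what n-dimensional parallelism looks like in plain JAX (the 2x4 mesh assumes 8 devices; axis names and shapes are arbitrary):

import jax
import numpy as np
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# A 2-D device mesh: one axis for data parallelism, one for tensor parallelism.
devices = np.array(jax.devices()).reshape(2, 4)
mesh = Mesh(devices, axis_names=("data", "model"))

# Shard a weight matrix across the "model" axis, replicated over "data".
w = jax.device_put(
    np.zeros((4096, 4096), dtype=np.float32),
    NamedSharding(mesh, P(None, "model")),
)
print(w.sharding)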

https://jaxformer.com/