Python Daily
Daily Python News
Question, Tips and Tricks, Best Practices on Python Programming Language
Find more reddit channels over at @r_channels
[D] Who do you all follow for genuinely substantial ML/AI content?

I've been looking for people to follow to keep up with the latest ML and AI research and releases, but I've noticed a lot of low-quality content creators crowding this space.

Who are some people you follow that you genuinely get substantial info from?

/r/MachineLearning
https://redd.it/1ko64s6
What CPython Layoffs Taught Me About the Real Value of Expertise

The layoffs of the CPython and TypeScript compiler teams have been bothering me—not because those people weren’t brilliant, but because their roles didn’t translate into enough real-world value for the businesses that employed them.

That’s the hard truth: Even deep expertise in widely-used technologies won’t protect you if your work doesn’t drive clear, measurable business outcomes.


The tools may be critical to the ecosystem, but the companies decided that further optimizations or refinements didn't materially affect their goals. In other words, "good enough" was good enough.

This has shifted how I think about technical depth. I used to believe that mastering internals made you indispensable. Now I see that you're not measured on what you understand. You're measured on what you produce, and whether it moves the needle.


The takeaway? Build enough expertise to be productive. Go deeper only when it’s necessary for the problem at hand. Focus on outcomes over architecture, and impact over elegance. CPython is essential. But understanding CPython internals isn’t essential unless it solves a problem that matters right now.

/r/Python
https://redd.it/1kok2e1
Skylos: Another dead code finder, but it's better and faster. Source: Trust me bro.

# Skylos: The Python Dead Code Finder Written in Rust

Yo peeps

Been working on a static analysis tool for Python for a while. It's designed to detect unreachable functions and unused imports in your Python codebases. I know there's already Vulture, Flake8, etc., but hear me out. This one is more accurate and faster, and because I'm slightly OCD, I like to keep my codebase a bit cleaner. I'll elaborate more down below.

# What Makes Skylos Special?

* **High Performance**: Built with Rust, making it fast
* **Better Detection**: Finds more dead code than alternatives in our benchmarks
* **Interactive Mode**: Select and remove specific items interactively
* **Dry Run Support**: Preview changes before applying them
* **Cross-module Analysis**: Tracks imports and calls across your entire project

# Benchmark Results

|Tool|Time (s)|Functions|Imports|Total|
|:-|:-|:-|:-|:-|
|Skylos|0.039|48|8|56|
|Vulture (100%)|0.040|0|3|3|
|Vulture (60%)|0.041|28|3|31|
|Vulture (0%)|0.041|28|3|31|
|Flake8|0.274|0|8|8|
|Pylint|0.285|0|6|6|
|Dead|0.035|0|0|0|

All tools were run against the same test codebase; the counts are the dead functions and unused imports each tool detected.

# How It Works

Skylos uses tree-sitter to parse Python code and employs a hybrid architecture: a Rust core for analysis and a Python CLI for the user interface. It handles Python features like decorators, chained method calls, and cross-module references.
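The repo has the authoritative implementation, but the core idea behind this kind of analysis can be sketched with the stdlib `ast` module (a deliberately simplified, hypothetical stand-in for the Rust/tree-sitter core; it ignores decorators, methods, and dynamic dispatch):

```python
import ast

def find_dead_functions(source: str) -> set[str]:
    """Toy dead-code finder: flags functions defined but never referenced.

    Builds a table of definitions and a table of references, then reports
    the difference. Skylos does this across whole projects; this sketch
    only looks at a single module's source.
    """
    tree = ast.parse(source)
    defined, used = set(), set()
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            defined.add(node.name)          # a definition site
        elif isinstance(node, ast.Name):
            used.add(node.id)               # a plain-name reference (e.g. a call)
        elif isinstance(node, ast.Attribute):
            used.add(node.attr)             # an attribute-style reference
    return defined - used
```

Cross-module analysis extends the same idea: the definition and reference tables are built across every file in the project before taking the difference.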

# Target Audience

Anyone with a **.py** file and a huge codebase that needs to kill off dead code? This ONLY

/r/Python
https://redd.it/1koi4fo
[pyfuze] Make your Python project truly cross-platform with Cosmopolitan and uv

## What My Project Does

I recently came across an interesting project called [Cosmopolitan](https://github.com/jart/cosmopolitan). In short, it can compile a C program into an [Actually Portable Executable (APE)](https://justine.lol/ape.html) which is capable of running natively on **Linux**, **macOS**, **Windows**, **FreeBSD**, **OpenBSD**, **NetBSD**, and even **BIOS**, across both **AMD64** and **ARM64** architectures.

The Cosmopolitan project already provides a Python APE (available in [cosmos.zip](https://github.com/jart/cosmopolitan/releases)), but it doesn't support running your own Python project with multiple dependencies.

Recently, I switched from Miniconda to [uv](https://github.com/astral-sh/uv), an extremely fast Python package and project manager. It occurred to me that I could bootstrap **any** Python project using uv!

That led me to create a new project called [pyfuze](https://github.com/TanixLu/pyfuze). It packages your Python project into a single zip file containing:

* `pyfuze.com` — an APE binary that prepares and runs your Python project
* `.python-version` — tells uv which Python version to install
* `requirements.txt` — lists your dependencies
* `src/` — contains all your source code
* `config.txt` — specifies the Python entry point and whether to enable Windows GUI mode (which hides console)

When you execute `pyfuze.com`, it performs the following steps:

* Installs `uv` into the `./uv` folder
* Installs Python into the `./python` folder (version taken from `.python-version`)
* Installs dependencies listed in `requirements.txt`
* Runs your Python

/r/Python
https://redd.it/1koos2n
Why does my Flask /health endpoint show nothing at http://localhost:5000/health?

Hey folks, I’m working on a Flask backend and I’m running into a weird issue.

I’ve set up a simple /health endpoint to check if the server is up. Here’s the code I’m using:

@app.route('/health', methods=['GET'])
def health_check():
    return 'OK', 200

The server runs without errors, and I can confirm that it’s listening on port 5000. But when I open http://localhost:5000/health in the browser, I get a blank page or sometimes nothing at all — no “OK” message shows up on Safari while Chrome says “access to localhost was denied”.

What I expected:
A plain "OK" message in the browser or in the response body.

What I get:
Blank screen/access to localhost was denied (but status code is still 200).

Has anyone seen this before? Could it be something to do with the way Flask handles plain text responses in browsers? Or is there something else I’m missing?
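For reference, the route itself is valid Flask. A self-contained version looks like this (binding to port 5001, since on recent macOS the AirPlay Receiver service often occupies port 5000 and produces exactly these blank-page / access-denied symptoms):

```python
from flask import Flask

app = Flask(__name__)

@app.route('/health', methods=['GET'])
def health_check():
    # A (body, status) tuple is a valid Flask return value;
    # the body is sent as text/html by default.
    return 'OK', 200

if __name__ == '__main__':
    # Port 5001 avoids the macOS AirPlay Receiver, which often sits on 5000.
    app.run(host='127.0.0.1', port=5001)
```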

Thanks in advance for any help!


/r/flask
https://redd.it/1kolnus
Should I learn FastAPI? Why? Doesn’t Django or Flask do the trick?

I’ve been building Python web apps and always used Django or Flask because they felt reliable and well-established. Recently, I stumbled on davia ai — a tool built on FastAPI that I really wanted to try. But to get the most out of it, I realized I needed to learn FastAPI first. Now I’m wondering if it’s worth the switch. If so, what teaching materials do you recommend?

/r/Python
https://redd.it/1kou6lc
Is there a module that can dynamically change all div ids and css ids on each request?

As the title says. I need this without changing any other functions in my Flask application.

if it doesn't exist and you just wanna talk bullshit then just don't reply

/r/flask
https://redd.it/1kou68z
Senior Django Developers: Do You Stick with Django for High-Concurrency Async Applications or Transition to Other Frameworks?

Hi everyone, I hope you're all doing well!

I'm exploring the feasibility of using Django for applications that need to handle a massive number of asynchronous operations—things like real-time chat systems, live dashboards, or streaming services. With Django's support for ASGI and asynchronous views, it's now possible to implement async features, but I'm wondering how well it holds up in real-world, high-concurrency environments compared to frameworks that are natively asynchronous.

Given that, I'm curious:

1️⃣ Have you successfully deployed Django in high-concurrency, async-heavy environments?

2️⃣ Did you encounter limitations that led you to consider or switch to frameworks like Node.js, ASP.NET Core, or others?

3️⃣ What strategies or tools did you use to scale Django in such scenarios?

I’m especially interested in hearing about real-world experiences, the challenges you faced, and how you decided on the best framework for your needs.

Thanks in advance for sharing your insights—looking forward to learning from you all!

Warm regards!

/r/django
https://redd.it/1koyugq
Should I take a government Data Science job that only uses SAS?

Hey all,
I’ve just been offered a Data Science position at a national finance ministry (public sector). The role sounds meaningful, and I’ve already verbally accepted, but haven’t signed the contract yet.

Here’s the thing:
I currently work in a tech-oriented role where I get to experiment with modern ML/AI tools — Python, transformers, SHAP, even LLM prototyping. In contrast, the ministry role would rely almost entirely on SAS. Python might be introduced at some point, but currently isn’t part of the tech stack.

I’m 35 now, and if I stay for 5 years, I’m worried I’ll lose touch with modern tools and limit my career flexibility. The role would be focused on structured data, traditional scoring models, and heavy audit/governance use cases.

Pros:
• Societal impact
• Work-life balance + flexibility for parental leave
• Stable government job with long-term security
• Exposure to public policy and regulated environments

Cons:
• No Python or open-source stack
• No access to cutting-edge AI tools or innovation
• Potential tech stagnation if I stay long
• May hurt my profile if I return to the private sector at 40

I’m torn between meaning and innovation.

Would love to hear from anyone who’s made a similar move or faced this kind of tradeoff.
Would you take the role and just “keep Python alive” on the side?

/r/Python
https://redd.it/1koy4vw
FRONTEND FRAMEWORK WITH DRF

Hello, I'm writing a DRF project and haven't decided what frontend to use. I've previously built traditional MVT apps, but this is my first time implementing a separate frontend with DRF. I'm thinking of using React, but learning the framework feels like a lot of stress and could take me a long time. Since I'm good with Django templates and CSS, I feel it might be a waste of time. Is it worth it?

/r/django
https://redd.it/1kp7yto
Hiding API key

Hi there, I am currently building a Python application where one of the HTML pages is an HTML/CSS/JavaScript chatbot.


This chatbot relies on an OpenAI API key. I want to hide this key as an environment variable so I can use it from JavaScript, adding it as a config var in Heroku. Is it possible to do this?
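It is, but note that any value exposed to browser JavaScript is visible to users. The usual pattern is to keep the key server-side: read it from an environment variable (your Heroku config var) and have the JavaScript call your own backend endpoint. A sketch of the server-side part, with a hypothetical `chat_endpoint` handler standing in for a real view:

```python
import os

def chat_endpoint(user_message: str) -> str:
    """Hypothetical server-side handler your JavaScript would POST to.

    Reads the key from the environment (set with
    `heroku config:set OPENAI_API_KEY=...`), so it never appears in the
    HTML/JS sent to the browser. The actual OpenAI call is stubbed out here.
    """
    key = os.environ.get("OPENAI_API_KEY")
    if key is None:
        raise RuntimeError("OPENAI_API_KEY is not set")
    return f"(model reply to: {user_message})"
```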


Thank you.

/r/django
https://redd.it/1kozxzr
[P] I built a transformer that skips layers per token based on semantic importance

I’m a high school student who’s been exploring how to make transformers/ai models more efficient, and I recently built something I’m really excited about: a transformer that routes each token through a different number of layers depending on how "important" it is.

The idea came from noticing how every token, even simple ones like “the” or “of”, gets pushed through every layer in standard transformers. But not every token needs the same amount of reasoning. So I created a lightweight scoring mechanism that estimates how semantically dense a token is, and based on that, decides how many layers it should go through.

It’s called SparseDepthTransformer, and here’s what it does:

Scores each token for semantic importance
Skips deeper layers for less important tokens using hard gating
Tracks how many layers each token actually uses
Benchmarks against a baseline transformer

In my tests, this reduced memory usage by about 15% and cut the average number of layers per token by ~40%, while keeping output quality the same. Right now it runs a bit slower because the skipping is done token by token, but batching optimization is next on my list.

Here’s the GitHub repo if you’re curious or want to give feedback:
https://github.com/Quinnybob/sparse-depth-transformer
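As I understand the description, the gating idea looks roughly like this (my simplified PyTorch reconstruction, not the project's actual code):

```python
import torch
import torch.nn as nn

class GatedBlock(nn.Module):
    """A transformer layer whose residual updates are zeroed for skipped tokens."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
        self.n1, self.n2 = nn.LayerNorm(dim), nn.LayerNorm(dim)

    def forward(self, x, active):            # active: (batch, seq) bool
        gate = active.unsqueeze(-1).to(x.dtype)
        h = self.n1(x)
        attn_out, _ = self.attn(h, h, h)
        x = x + attn_out * gate              # skipped tokens pass through unchanged
        return x + self.ff(self.n2(x)) * gate

class SparseDepthTransformer(nn.Module):
    def __init__(self, dim=64, depth=6):
        super().__init__()
        self.scorer = nn.Linear(dim, 1)      # lightweight semantic-importance score
        self.layers = nn.ModuleList(GatedBlock(dim) for _ in range(depth))
        self.depth = depth

    def forward(self, x):
        score = torch.sigmoid(self.scorer(x)).squeeze(-1)       # (batch, seq) in (0, 1)
        layers_per_token = (score * self.depth).ceil().long()   # hard gating
        for i, layer in enumerate(self.layers):
            x = layer(x, layers_per_token > i)
        return x, layers_per_token
```

A masking sketch like this passes skipped tokens through unchanged but still computes attention over the full sequence; real savings require gathering only the active tokens, which is where the batching optimization mentioned above comes in.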

Would love if you

/r/MachineLearning
https://redd.it/1kpalhd
Sunday Daily Thread: What's everyone working on this week?

# Weekly Thread: What's Everyone Working On This Week? 🛠️

Hello /r/Python! It's time to share what you've been working on! Whether it's a work-in-progress, a completed masterpiece, or just a rough idea, let us know what you're up to!

## How it Works:

1. Show & Tell: Share your current projects, completed works, or future ideas.
2. Discuss: Get feedback, find collaborators, or just chat about your project.
3. Inspire: Your project might inspire someone else, just as you might get inspired here.

## Guidelines:

Feel free to include as many details as you'd like. Code snippets, screenshots, and links are all welcome.
Whether it's your job, your hobby, or your passion project, all Python-related work is welcome here.

## Example Shares:

1. Machine Learning Model: Working on an ML model to predict stock prices. Just cracked a 90% accuracy rate!
2. Web Scraping: Built a script to scrape and analyze news articles. It's helped me understand media bias better.
3. Automation: Automated my home lighting with Python and Raspberry Pi. My life has never been easier!

Let's build and grow together! Share your journey and learn from others. Happy coding! 🌟

/r/Python
https://redd.it/1kp6wqf
FastAPI + React Front - Auth0, build from scratch?

I have a fastapi backend with a react front end. I’m trying to figure out the best way to manage my users login, credentials, permissions, etc. I keep finding myself just defaulting to building it all myself. Am I missing a different option? What are most people using?

/r/Python
https://redd.it/1kpby44
[D] Can we possibly construct an AlphaEvolve@HOME?

Today, consumer-grade graphics cards are approaching 50 teraflops of performance. If a PC owner is just browsing Reddit, or their computer sits idle all night, an RTX 50XX represents wasted computing potential.

When millions of people own a graphics card, the amount of computing potential is quite vast. Under ideal conditions, that vast ocean of computing potential could be utilized for something else.

> AlphaEvolve is a coding agent that orchestrates an autonomous pipeline of computations including queries to LLMs, and produces algorithms that address a user-specified task. At a high level, the orchestrating procedure is an evolutionary algorithm that gradually develops programs that improve the score on the automated evaluation metrics associated with the task.


DeepMind's recent AlphaEvolve agent is performing well on the discovery, or "invention", of new methods. As DeepMind describes above, AlphaEvolve uses an evolutionary algorithm in its workflow pipeline. Evolutionary algorithms are known to benefit from large-scale parallelism, so it may be possible to run AlphaEvolve across many rack servers to exploit the parallelism a data center provides.
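The appeal of parallelism is easy to see in a toy evolutionary loop: the expensive step, scoring candidates, is embarrassingly parallel. An illustrative sketch only (AlphaEvolve evolves programs scored by task-specific evaluators, not float vectors):

```python
import random
from concurrent.futures import ThreadPoolExecutor

def fitness(candidate):
    # Stand-in for an expensive automated evaluation metric
    return -sum((x - 0.5) ** 2 for x in candidate)

def mutate(candidate, rate=0.1):
    return [x + random.uniform(-rate, rate) for x in candidate]

def evolve(pop_size=32, dims=8, generations=20):
    population = [[random.random() for _ in range(dims)] for _ in range(pop_size)]
    # Threads for simplicity; evaluations are independent, so a real deployment
    # could fan them out across processes, machines, or volunteer GPUs.
    with ThreadPoolExecutor() as pool:
        for _ in range(generations):
            scores = list(pool.map(fitness, population))
            ranked = [c for _, c in sorted(zip(scores, population),
                                           key=lambda t: t[0], reverse=True)]
            elite = ranked[: pop_size // 4]          # keep the top quarter
            population = elite + [mutate(random.choice(elite))
                                  for _ in range(pop_size - len(elite))]
    return max(population, key=fitness)
```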

Or

/r/MachineLearning
https://redd.it/1kp4nxq
Let's make visualizations of 3D images in Notebooks just as simple as for 2D images

# Target Audience

Many of us who deal with image data in our everyday work and use Python to perform some kind of analysis are used to employing Jupyter Notebooks. Notebooks are great because they let us write the story of the analysis we perform: we sketch the motivation of our investigation, we write the code to load the data, we explore the data directly inside the Notebook by embedding images, we write the code for the analysis, we inspect the results (more images!), we make observations, and we draw conclusions.

Thanks to matplotlib, visualization of 2D images inside Notebooks, be it for exploration or for inspection, is absolutely trivial. For 2D image data, Notebooks are a paradise of an ecosystem. However, things get more complicated when you move to 3D.
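The 2D case really is a couple of lines (a throwaway sketch; in a notebook the backend line is unnecessary and the figure renders inline instead of being saved):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")   # non-interactive backend, only needed outside a notebook
import matplotlib.pyplot as plt

image = np.random.rand(64, 64)   # stand-in for a 2D image slice
fig, ax = plt.subplots()
ax.imshow(image, cmap="gray")
fig.savefig("slice.png")
```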

>LibCarna is an attempt to make the visualization of 3D image data in Jupyter Notebooks just as simple as it is for 2D images.

In a nutshell: If you ever wanted to visualize 3D images in Notebooks, then LibCarna might be for you.

# What My Project Does

LibCarna started off more than a decade ago (see "Scope of the Project" section below, if you're interested) and was developed with an emphasis on simplicity and flexibility. Under

/r/Python
https://redd.it/1kpfnrc