🚀 Blinter The Linter - A Cross Platform Batch Script Linter
Yes, it's 2025. Yes, people still write batch scripts. No, they shouldn't crash.
## What It Does
✅ 158 rules across Error/Warning/Style/Security/Performance
✅ Catches the nasty stuff: Command injection, path traversal, unsafe temp files
✅ Handles the weird stuff: Variable expansion, FOR loops, multilevel escaping
✅ 10MB+ files? No problem. Unicode? Got it. Thread-safe? Always.
## Get It Now

pip install Blinter

Or grab the standalone .exe from GitHub Releases.

## One Command

python -m blinter script.bat

That's it. No config needed. No ceremony. Just point it at your .bat or .cmd files.

---
The first professional-grade linter for Windows batch files.
Because your automation scripts shouldn't be held together with duct tape.
📦 PyPI • ⚙️ GitHub
What My Project Does
A cross-platform linter for batch scripts.
Target Audience
Developers, primarily Windows-based.
Comparison
There is no comparison; it's the only batch linter, so there's nothing to compare it to.
/r/Python
https://redd.it/1o4sswc
Built this Django-Unfold showcase, thinking of extending it into a CRM project
/r/django
https://redd.it/1o4psg7
Livestreaming with Django
Hello, to give you some context: in the app I am developing, there is a service called "Events and Meetings." This service has different functionalities, one of which is that the user should be able to create an online event. My question is, besides django-channels, what other package can help achieve livestreaming for more than 10 or 20 users?
I should mention that I am developing the API using Django REST Framework.
/r/django
https://redd.it/1o4wqlr
[P] Adapting Karpathy’s baby GPT into a character-level discrete diffusion model
Hi everyone,
I've been exploring how discrete diffusion models can be applied to text generation and put together a single annotated Jupyter Notebook that implements a character-level discrete diffusion GPT.
It's based on Andrej Karpathy’s baby GPT from his nanoGPT repo, but instead of generating text autoregressively (left-to-right), it learns to denoise corrupted text sequences in parallel.
Discrete diffusion model in action
The notebook walks through the math, explains what adding noise means for discrete tokens, builds a discrete diffusion model from the baby GPT, and trains it on Shakespeare's text using a score-entropy-based objective.
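To make "adding noise for discrete tokens" concrete, here is a minimal sketch of one common corruption scheme (uniform random token replacement). This is my own illustration under assumed details, not code from the notebook:

```
# Corrupt a batch of token ids by replacing each position with a uniformly
# random token with probability t (higher t = noisier sequence).
import torch

def corrupt(tokens: torch.Tensor, t: float, vocab_size: int) -> torch.Tensor:
    random_tokens = torch.randint(0, vocab_size, tokens.shape)
    replace = torch.rand(tokens.shape) < t
    return torch.where(replace, random_tokens, tokens)

x = torch.randint(0, 65, (1, 16))        # e.g. character ids for a small vocab
print(corrupt(x, t=0.5, vocab_size=65))  # roughly half the positions scrambled
```

The model is then trained to undo this corruption at all positions in parallel, which is what replaces left-to-right sampling.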
Access it on GitHub (notebook + README):
https://github.com/ash80/diffusion-gpt
or run it directly on Google Colab:
https://colab.research.google.com/github/ash80/diffusion-gpt/blob/master/The_Annotated_Discrete_Diffusion_Models.ipynb
I'd appreciate any feedback, corrections, and suggestions, especially from anyone experimenting with discrete diffusion models.
/r/MachineLearning
https://redd.it/1o4qu0h
My first Medium blog on the GIL
Hi everyone, today I made my first attempt at writing a tech blog on GIL basics: what it is and why it's needed, since the recent 3.14 GIL-removal work created a lot of buzz around it. Please give it a read; it's only a 5-minute read. Please suggest if anything is wrong or any improvements are needed.
**GIL in Python: The Lock That Makes and Breaks It**
PS: I wrote it myself based on my understanding. I only used an LLM as a proofreader, so it may appear unpolished here and there.
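For readers who want to see the behavior the post describes, here is a minimal sketch (my example, not from the blog): two CPU-bound threads take about as long as running the same work sequentially, because the GIL lets only one thread execute Python bytecode at a time.

```
# CPU-bound work: threading gives no speedup on a GIL build of CPython.
import threading
import time

def count_down(n: int) -> None:
    while n:
        n -= 1

N = 10_000_000

start = time.perf_counter()
count_down(N)
count_down(N)
print(f"sequential:  {time.perf_counter() - start:.2f}s")

start = time.perf_counter()
threads = [threading.Thread(target=count_down, args=(N,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(f"two threads: {time.perf_counter() - start:.2f}s  # roughly the same with the GIL")
```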
/r/Python
https://redd.it/1o4oozb
Need advice on simulating real-time bus movement and ETA predictions
Hello Everyone,
I'm currently studying in college, and for my semester project I have chosen a project that simulates real-time bus movement and predicts when a bus will arrive at a given destination.
What I have:
1. Bus departure time from station
2. Distance between each bus stop
3. Bus stop map coordinates
What I'm trying to achieve:
1. Simulating a bus moving on a real map
2. Variable speeds, dwell times, traffic variation
3. Estimating arrival time per stop using distance and speed (see the sketch below)
4. A live dashboard predicting when the bus will reach each stop, based on traffic flow and speed
Help I need:
1. How to simulate it on a real map (showing the bus actually moving along the route)
2. What the best tools for this project are
3. How to model traffic flow
Thanks
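Not an answer to the routing/map part, but a minimal sketch of item 3 under simple assumptions (constant average speed, fixed dwell time); every number below is made up:

```
# Per-stop ETA from segment distances, an assumed average speed, and a fixed
# dwell time at each stop. All parameters are illustrative placeholders.
from datetime import datetime, timedelta

def eta_per_stop(departure: datetime, segment_km: list[float],
                 speed_kmh: float = 25.0, dwell_s: float = 30.0) -> list[datetime]:
    etas, now = [], departure
    for km in segment_km:
        now += timedelta(hours=km / speed_kmh)  # travel time for this segment
        etas.append(now)                        # arrival at the next stop
        now += timedelta(seconds=dwell_s)       # time spent at the stop
    return etas

for i, eta in enumerate(eta_per_stop(datetime(2025, 1, 1, 8, 0), [1.2, 0.8, 2.5]), 1):
    print(f"stop {i}: {eta:%H:%M:%S}")
```

A real predictor would replace the constant speed with per-segment estimates (e.g. from recent GPS pings or a traffic model), but the bookkeeping stays the same.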
/r/Python
https://redd.it/1o5gurz
My first RTC app using Django Channels finally went live. I need your feedback.
Hey folks, I just finished the first version of my real-time chat app built with Django, Django Channels, and WebSockets. I also used React for the frontend (which I actually learned while building this project).
It's still missing some important stuff like testing, better error handling, and a few production-level optimizations, but it's functional: users can register, log in, and chat in real time, with typing indicators and live presence tracking. I'd really appreciate any backend-focused feedback.
Tech stack:
Django + Django Channels
Redis for message brokering (see the settings sketch after this list)
PostgreSQL
React (frontend)
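For readers less familiar with that stack, this is the standard way Django Channels is pointed at Redis as its channel layer in settings.py (host and port below are placeholders; this is the generic channels_redis configuration, not taken from the repo):

```
# settings.py: the channel layer lets consumers in different processes/servers
# exchange messages (chat fan-out, typing indicators, presence events).
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels_redis.core.RedisChannelLayer",
        "CONFIG": {
            "hosts": [("127.0.0.1", 6379)],  # placeholder Redis address
        },
    },
}
```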
Live demo / GitHub repos:
here are the live version, frontend and the backend
Login with these accounts to explore:
email: guest1@djgram.com, password:1234, email: guest2@djgram.com, password: 1234
I know it’s far from perfect, still no tests or CI/CD setup but I wanted to get some real feedback before adding more features.
Any feedback (even brutal honesty) is super welcome.
/r/djangolearning
https://redd.it/1o4ulh1
Trying to use Google Drive to Store Media Files, But Getting "Service Accounts do not have storage quota" error when uploading
I'm building a Django app and I'm trying to use Google Drive as storage for media files via a service account, but I'm encountering a storage quota error.
# What I've Done
Set up a project in Google Cloud Console
Created a service account and downloaded the JSON key file
Implemented a custom Django storage backend using the Google Drive API v3
Configured GOOGLE_DRIVE_ROOT_FOLDER_ID in my settings

# The Error
When trying to upload files, I get:
HttpError 403: "Service Accounts do not have storage quota. Leverage shared drives
(https://developers.google.com/workspace/drive/api/guides/about-shareddrives),
or use OAuth delegation instead."
# What I've Tried

1. Created a folder in my personal Google Drive (regular Gmail account)
2. Shared it with the service account email (the client_email from the JSON file) with Editor permissions
3. Set the folder ID as GOOGLE_DRIVE_ROOT_FOLDER_ID in my Django settings

This is the code of the storage class:
```
# The original version of the code:
# https://github.com/torre76/django-googledrive-storage/blob/master/gdstorage/storage.py
"""
Copyright (c) 2014, Gian Luca Dalla Torre
All rights reserved.
"""
import enum
import json
```
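Since the error message points at shared drives, here is a hedged sketch of that route using google-api-python-client directly (the key path, filename, and folder ID are placeholders, and this is not taken from the storage class above): put the target folder on a shared drive the service account is a member of, then pass supportsAllDrives=True when uploading.

```
# Sketch: upload a file into a shared-drive folder with a service account.
from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder path to the JSON key
    scopes=["https://www.googleapis.com/auth/drive"],
)
drive = build("drive", "v3", credentials=creds)

media = MediaFileUpload("photo.jpg", resumable=True)
created = drive.files().create(
    body={"name": "photo.jpg", "parents": ["SHARED_DRIVE_FOLDER_ID"]},  # placeholder folder
    media_body=media,
    supportsAllDrives=True,  # required when the parent folder lives on a shared drive
    fields="id",
).execute()
print(created["id"])
```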
/r/django
https://redd.it/1o5f3tr
gRPC: Client side vs Server side load balancing, which one to choose?
Hello everyone,
My setup: Two FastAPI apps calling gRPC ML services (layout analysis + table detection). Need to scale both the services.
Question: For GPU-based ML inference over gRPC, does NGINX load balancing significantly hurt performance vs client-side load balancing?
Main concerns:
* Losing HTTP/2 multiplexing benefits
* Extra latency (though probably negligible vs 2-5s processing time)
* Need priority handling for time-critical clients
Current thinking: NGINX seems simpler operationally, but want to make sure I'm not shooting myself in the foot performance-wise.
Experience with gRPC + NGINX? Client-side LB worth the complexity for this use case?
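For reference, client-side round-robin in grpcio is typically turned on per channel like this (the DNS target and the stub line are placeholders, not taken from the post):

```
# Client-side load balancing in grpcio: resolve one DNS name to all replicas
# and let the built-in round_robin policy pick a backend per call.
import grpc

channel = grpc.insecure_channel(
    "dns:///layout-analysis.internal:50051",           # placeholder target
    options=[("grpc.lb_policy_name", "round_robin")],  # default policy is pick_first
)
# stub = layout_pb2_grpc.LayoutAnalysisStub(channel)   # your generated stub here
```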
/r/Python
https://redd.it/1o5kwve
Parsegument! - Argument Parsing and function routing
Project Source code: https://github.com/RyanStudioo/Parsegument
Project Docs: https://www.ryanstudio.dev/docs/parsegument/
# What My Project Does
Parsegument allows you to easily define Command structures with Commands and CommandGroups.
Parsegument also automatically parses arguments, converts them to your desired type, then executes functions automatically, all with just one method call and a string.
# Target Audience
Parsegument is targeted at people who would like to simplify making CLIs. I started this project because I was annoyed at having to use lines and lines of switch-case statements for another project I was working on.
# Comparison
Compared to Python's built-in argparse, Parsegument has a more intuitive syntax and makes it more convenient to route and execute functions.
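For contrast, this is roughly what routing subcommands to functions looks like with the standard library's argparse alone (a generic sketch, not taken from Parsegument's docs):

```
# argparse baseline: routing is manual, via set_defaults(func=...) plus an
# explicit dispatch on the parsed namespace.
import argparse

def greet(args: argparse.Namespace) -> None:
    print(f"hello, {args.name}")

def add(args: argparse.Namespace) -> None:
    print(args.x + args.y)

parser = argparse.ArgumentParser(prog="demo")
subparsers = parser.add_subparsers(dest="command", required=True)

p_greet = subparsers.add_parser("greet")
p_greet.add_argument("name")
p_greet.set_defaults(func=greet)

p_add = subparsers.add_parser("add")
p_add.add_argument("x", type=int)
p_add.add_argument("y", type=int)
p_add.set_defaults(func=add)

args = parser.parse_args(["add", "2", "3"])
args.func(args)  # prints 5
```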
This project is still super early in development; I aim to add other features like aliases, annotations, and more suggestions from you guys!
/r/Python
https://redd.it/1o5jkb9
Use cases of Django vs NodeJS
Hey, I have only really done websites in React + Node.js with the Express package. I know very little about Django.
I was wondering about the use cases or situations where Django would be better to use than Node.js.
Also, could you explain the differences in performance between Django and Node.js?
I'd need the advice pretty soon, like within one or two days from now.
/r/djangolearning
https://redd.it/1o43b73
ChanX: The Django WebSocket Library I Wish Existed Years Ago
Django Channels is excellent for WebSocket support, but after years of using it, I found myself writing the same boilerplate patterns repeatedly: routing chains, validation logic, and documentation. ChanX is a higher-level framework built on top of Channels to handle these common patterns automatically.
# The Problem
If you've used Django Channels, you know the pain:
https://preview.redd.it/1uhc8l973xuf1.png?width=996&format=png&auto=webp&s=913508d7488031446d8c5044132d449ddd770c0f
Plus manual validation everywhere, no type safety, and zero automatic documentation. Unlike Django REST Framework, Channels leaves you building everything from scratch.
# The Solution
Here's what the same consumer looks like with ChanX:
https://preview.redd.it/ymmxw1f93xuf1.png?width=1036&format=png&auto=webp&s=9a156365fcfe57b9ebec78892454619ca47705aa
**What you get:**
* Automatic routing with Pydantic validation - no if-else chains
* Full type safety with mypy/pyright - catch errors before runtime
* Auto-generated AsyncAPI 3.0 docs - like Swagger for WebSockets
* Event broadcasting from anywhere - HTTP views, Celery tasks, etc.
* Built-in authentication with Django permissions
* Structured logging and comprehensive testing utilities
* Works with both Django Channels and FastAPI
**Comparison with other solutions:** See how ChanX compares to raw Django Channels, Broadcaster, and Socket.IO at [https://chanx.readthedocs.io/en/latest/comparison.html](https://chanx.readthedocs.io/en/latest/comparison.html)
# Tutorial for Beginners
I wrote a hands-on tutorial that builds a real chat app with AI assistants, notifications, and background tasks. It uses a Git repo with checkpoints so you can jump in anywhere or compare your code if you get stuck.
Tutorial:
/r/django
https://redd.it/1o5qvcj
Newbie question — which hosting is best for a small Django + Next.js e-commerce site?
Hi everyone, I’m a total newbie so please be kind if this is a basic question 😅
I'm currently learning Python Django from a book (I have zero coding background) and also experimenting with Claude Code. My goal is to build and deploy a small e-commerce website using Django (backend) and Next.js (frontend). (Melbourne, Australia)
Here’s my situation:
Daily users: about 500
Concurrent users: around 100
I want to deploy it for commercial use, and I’m trying to decide which hosting option would be the most suitable. I’m currently considering:
DigitalOcean
Vercel + Railway combo
Google Cloud Run
If you were me, which option would you choose and why? I’d love to hear advice from more experienced developers — especially any tips on cost, performance, or scaling. 🙏
I'm mainly considering price, ease of use with AI tools, and ease of deployment.
Thanks for reading my long post!
/r/django
https://redd.it/1o5589y
Tuesday Daily Thread: Advanced questions
# Weekly Wednesday Thread: Advanced Questions 🐍
Dive deep into Python with our Advanced Questions thread! This space is reserved for questions about more advanced Python topics, frameworks, and best practices.
## How it Works:
1. **Ask Away**: Post your advanced Python questions here.
2. **Expert Insights**: Get answers from experienced developers.
3. **Resource Pool**: Share or discover tutorials, articles, and tips.
## Guidelines:
* This thread is for **advanced questions only**. Beginner questions are welcome in our [Daily Beginner Thread](#daily-beginner-thread-link) every Thursday.
* Questions that are not advanced may be removed and redirected to the appropriate thread.
## Recommended Resources:
* If you don't receive a response, consider exploring r/LearnPython or join the [Python Discord Server](https://discord.gg/python) for quicker assistance.
## Example Questions:
1. **How can you implement a custom memory allocator in Python?**
2. **What are the best practices for optimizing Cython code for heavy numerical computations?**
3. **How do you set up a multi-threaded architecture using Python's Global Interpreter Lock (GIL)?**
4. **Can you explain the intricacies of metaclasses and how they influence object-oriented design in Python?**
5. **How would you go about implementing a distributed task queue using Celery and RabbitMQ?**
6. **What are some advanced use-cases for Python's decorators?**
7. **How can you achieve real-time data streaming in Python with WebSockets?**
8. **What are the
/r/Python
https://redd.it/1o60ghk
ChanX: Type-Safe WebSocket Framework for Django and FastAPI
# What My Project Does
ChanX is a batteries-included WebSocket framework that works with both Django Channels and FastAPI. It eliminates the boilerplate and repetitive patterns in WebSocket development by providing:
* Automatic message routing using Pydantic discriminated unions - no more if-else chains
* Type safety with full mypy/pyright support and runtime Pydantic validation
* Auto-generated AsyncAPI 3.0 documentation - like OpenAPI/Swagger but for WebSockets
* Channel layer integration for broadcasting messages across servers with Redis
* Event system to trigger WebSocket messages from anywhere in your application (HTTP views, Celery tasks, management commands)
* Built-in authentication with Django REST framework permissions support
* Comprehensive testing utilities for both frameworks
* Structured logging with automatic request/response tracing
The same decorator-based API works for both Django Channels and FastAPI:
```
from typing import Literal

from chanx.messages.base import BaseMessage
from chanx.core.decorators import wshandler, channel
from chanx.channels.websocket import AsyncJsonWebsocketConsumer  # Django
# from chanx.fastchannels.websocket import AsyncJsonWebsocketConsumer  # FastAPI


class ChatMessage(BaseMessage):
    action: Literal["chat"] = "chat"
    payload: str


@channel(name="chat")
```
/r/Python
https://redd.it/1o5ro8i
Pyrefly eats CPU like nobody's business.
So I recently tried out the pyrefly and ty typecheckers/LSPs in my ML project. While ty wasn't as useful with its errors and imports, pyrefly was great in that department. The only problem with the latter was that it sent CPU use to near 100% the whole time it ran.
This was worse than even rust-analyzer, notorious for being a heavyweight tool, which only uses a ton of CPU on startup and then runs with low CPU (but a ton of RAM).
Is there some configuration for pyrefly I was missing, or is this a bug? If it's the latter, should I report it?
Or even worse, is this intended behavior? If so, pyrefly will remain unusable for anyone without a really beefy computer, making it completely useless for me. Hopefully not though, because I can't have an LSP using over 90% CPU while it runs in the background on my laptop.
/r/Python
https://redd.it/1o66tho
I built JSONxplode, a complex JSON flattener
I built this tool in Python and I hope it will help the community.
This code flattens deep, messy, and complex JSON data into a simple tabular form without the need to provide a schema.
So all you need to do is `from jsonxplode import flatten` and then `flattened_json = flatten(messy_json_data)`.
Once this code is finished with the JSON file, none of the objects or arrays will be left unpacked.
You can install it by doing: pip install jsonxplode
Code and proper documentation can be found at:
https://github.com/ThanatosDrive/jsonxplode
https://pypi.org/project/jsonxplode/
In the post I shared in the data engineering subreddit, these were some of the questions and the answers I provided:
Why did I build this code? Because none of the current JSON flatteners properly handle deep, messy, and complex JSON files without the need to read into the JSON file and define its schema.
How does it deal with some edge-case scenarios, e.g. out-of-scope duplicate keys? There is a column key counter that increments the column name if it notices that a row has two of the same column.
How does it deal with empty values: does it use None or a blank string? Data is returned as a list of dictionaries
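A minimal usage sketch based on the description above (the input is made up, and the exact flattened column names are my assumption, not from the docs):

```
# flatten() takes arbitrarily nested JSON-like data and returns a list of flat
# dictionaries, one per row, as described in the post.
from jsonxplode import flatten

messy = {
    "user": {"name": "Ada", "tags": ["admin", "dev"]},
    "orders": [{"id": 1, "total": 9.5}, {"id": 2, "total": 3.0}],
}

rows = flatten(messy)
for row in rows:
    print(row)
```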
/r/Python
https://redd.it/1o69cvi
How to use annotate for DB optimization
Hi, I posted a popular comment to a post a couple days ago asking what some advanced Django topics to focus on are: https://www.reddit.com/r/django/comments/1o52kon/comment/nj6i2hs/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button
I mentioned annotate as being low-hanging fruit for optimization, and the top response to my comment was a question asking for details about it. It's a bit involved to respond to that question, and I figured it would get lost in the archive, so this post is a more thorough explanation of the concept that will reach more people who want to read about it.
Here is an annotate I pulled from real production code that I wrote a couple of years ago while refactoring crusty 10+ year old code from Django 1.something:
```
from django.db.models import Exists, OuterRef

def cities(self, location=None, filter_value=None):
    entity_location_lookup = {f'{self.city_field_lookup()}__id': OuterRef('pk')}
    cities = City.objects.annotate(
        has_active_entities=Exists(
            self.get_queryset().filter(**entity_location_lookup),
        ),
    ).filter(has_active_entities=True)
```
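To make the optimization concrete, here is the kind of per-row version the query above replaces (a sketch for contrast, not from the original post): one EXISTS query per city, the classic N+1 pattern.

```
# N+1 version for contrast: Python loops over cities and issues one query each.
active_cities = []
for city in City.objects.all():
    lookup = {f'{self.city_field_lookup()}__id': city.pk}
    if self.get_queryset().filter(**lookup).exists():
        active_cities.append(city)
# The annotate/Exists version pushes this check into a single query with a
# correlated EXISTS subquery, so the database does the filtering instead.
```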
/r/django
https://redd.it/1o6jepy
How to prevent TransactionTestCase from truncating all tables?
For my tests, I copy down the production database, and I use the live server test case because my frontend is an SPA, so I need Playwright to drive a browser during the tests.
The challenge is that once the live server test case is done, all my data is blown away because, as the docs tell us, "A TransactionTestCase resets the database after the test runs by truncating all tables."
That's fine for CI, but when testing locally it means I have to keep restoring my database manually. Is there any way to stop it from truncating tables? It seems needlessly annoying that it truncates *all* data!
I tried serialized_rollback=True, but this didn't work. I tried googling around for this, but most of the results I get are folks who are having trouble because their database is *not* reset after a test.
/r/django
https://redd.it/1o6ky68