Issue with a modified normalize_email and its uniqueness
I have this Custom User:
[CustomUser](https://preview.redd.it/jj337x6ojapf1.png?width=976&format=png&auto=webp&s=74588d74b758fdb65402e2de5400bf59fc905f00)
and this User Manager:
[UserManager part 1](https://preview.redd.it/2kdwjkayjapf1.png?width=1476&format=png&auto=webp&s=6c43499fad84b24768e8b7c92aa6bfc0466ebe16)
[UserManager part 2](https://preview.redd.it/rzy91my5kapf1.png?width=1250&format=png&auto=webp&s=36a4f5fa88277dbcef3bd3b528bd4dd4574c30d4)
[this is my utility function](https://preview.redd.it/v1hbpaudkapf1.png?width=1220&format=png&auto=webp&s=223af8382fdf63c2f82d1b177eb2350c21990ca4)
When I create a User, I am still somehow able to create this. What am I doing wrong?
https://preview.redd.it/p3p0dxykjapf1.png?width=2852&format=png&auto=webp&s=e94554d59922e90bedb79ce416cb0e6fee777c07
DB data:
https://preview.redd.it/9u5ddr9kmapf1.png?width=1900&format=png&auto=webp&s=a2886c82c51b298115c2a5a494397eeefea4f0c6
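The screenshots aren't reproduced in this digest, so here is a minimal sketch of the usual cause, assuming the goal is unique, case-insensitively normalized emails: normalization inside the manager only runs for objects created through `create_user`, so uniqueness still has to be enforced at the database level on the field itself. The field and utility names below are illustrative, not the OP's actual code.

```python
from django.contrib.auth.base_user import AbstractBaseUser, BaseUserManager
from django.db import models


def normalize_email(email: str) -> str:
    # Lowercase the whole address so "Foo@Bar.com" and "foo@bar.com" collide.
    return email.strip().lower()


class CustomUserManager(BaseUserManager):
    def create_user(self, email, password=None, **extra_fields):
        email = normalize_email(email)          # normalize *before* saving
        user = self.model(email=email, **extra_fields)
        user.set_password(password)
        user.save(using=self._db)
        return user


class CustomUser(AbstractBaseUser):
    # unique=True is what the database actually enforces; normalizing only in
    # the manager will not stop duplicates created via the admin, the shell,
    # serializers, or any code path that bypasses create_user.
    email = models.EmailField(unique=True)
    USERNAME_FIELD = "email"
    objects = CustomUserManager()
```

If `unique=True` (or an equivalent `UniqueConstraint` on `Lower("email")`) was added after rows already existed, the migration also has to be applied and existing duplicates cleaned up first, otherwise the DB state in the screenshot is still possible.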
/r/django
https://redd.it/1nhgk66
Random 404 errors.
I am a beginner, and my Flask app is randomly giving 404 URL not found errors. It was running perfectly, and I restarted the app, but now it is not. Last time it happened, I just closed my editor and shut my pc off, and after some time, it was working again.
I know my routes are correct and I am using url_for, yet even my index page, which I don't pass any values into, is not loading.
Has anyone else faced these issues before and figured out how to solve them?
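One quick sanity check (a sketch, assuming your Flask instance is importable as `app` from your main module) is to print the routes Flask has actually registered; a rule missing from this list usually means the module or blueprint defining it never got imported by the copy of the app that's running:

```python
# list_routes.py — run with the same interpreter/venv that serves the app
from app import app  # adjust to wherever your Flask() instance lives

for rule in app.url_map.iter_rules():
    print(rule.endpoint, "->", rule.rule)
```

If the index route does show up here but the browser still gets a 404, you are probably hitting a different (stale) server process or port than the one you just started.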
/r/flask
https://redd.it/1nhjxbx
NOW - LMS: A Flask-based learning platform
<tl-dr>
# Python >= 3.11
# Sources: https://github.com/bmosoluciones/now-lms
# License: Apache 2
python3 -m venv venv
venv/bin/pip install now_lms
venv/bin/lmsctl database init
venv/bin/lmsctl serve
# Visit `http://127.0.0.1:8080/` in your browser, default admin user and password are `lms-admin`.
</tl-dr>
Hello, this is a project I have been working on to release an online learning platform for my sister's use and my own.
NOW - LMS is designed to be simple yet powerful. Here are its key features:
* **Clean codebase**: Python and HTML5.
* **Compatible with multiple databases**: SQLite, PostgreSQL, and MySQL.
* **Complete course creation functionality**, allowing full curriculum setup.
* **Courses are organized into sections**, which group resources in a logical manner.
* **Flexible resource types** within a course section:
* YouTube videos
* PDFs
* Images
* Audio files
* Rich text content
* External HTML pages
* Slide presentations
* External resource links
* **Course types**:
* Free or paid
* Self-paced, synchronous (with tutor), or time-limited
* **Paid courses support an audit mode**, allowing limited
/r/flask
https://redd.it/1ngwce6
Tuesday Daily Thread: Advanced questions
# Weekly Wednesday Thread: Advanced Questions 🐍
Dive deep into Python with our Advanced Questions thread! This space is reserved for questions about more advanced Python topics, frameworks, and best practices.
## How it Works:
1. **Ask Away**: Post your advanced Python questions here.
2. **Expert Insights**: Get answers from experienced developers.
3. **Resource Pool**: Share or discover tutorials, articles, and tips.
## Guidelines:
* This thread is for **advanced questions only**. Beginner questions are welcome in our [Daily Beginner Thread](#daily-beginner-thread-link) every Thursday.
* Questions that are not advanced may be removed and redirected to the appropriate thread.
## Recommended Resources:
* If you don't receive a response, consider exploring r/LearnPython or joining the [Python Discord Server](https://discord.gg/python) for quicker assistance.
## Example Questions:
1. **How can you implement a custom memory allocator in Python?**
2. **What are the best practices for optimizing Cython code for heavy numerical computations?**
3. **How do you set up a multi-threaded architecture using Python's Global Interpreter Lock (GIL)?**
4. **Can you explain the intricacies of metaclasses and how they influence object-oriented design in Python?**
5. **How would you go about implementing a distributed task queue using Celery and RabbitMQ?**
6. **What are some advanced use-cases for Python's decorators?**
7. **How can you achieve real-time data streaming in Python with WebSockets?**
8. **What are the
/r/Python
https://redd.it/1ni2d07
What’s the one Python feature you wish you discovered earlier?
I’ve been coding in Python for a few years, and just recently realized how much time f-strings can save compared to old-style string formatting.
Honestly, I feel like I wasted so many lines of code before discovering this. 😅
Curious — what’s one Python trick, library, or feature that completely changed how you code once you found it?
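For anyone who hasn't made the switch yet, a quick side-by-side of the styles the post is comparing:

```python
name, score = "Ada", 97.5

# Old-style % formatting
print("Hi %s, you scored %.1f%%" % (name, score))

# str.format()
print("Hi {}, you scored {:.1f}%".format(name, score))

# f-string (Python 3.6+): the expression sits right where it is used
print(f"Hi {name}, you scored {score:.1f}%")

# Since 3.8, f"{expr=}" also prints the expression itself — handy for debugging
print(f"{score=}")
```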
/r/Python
https://redd.it/1ni9puj
Looking for a study buddy (woman)
I need an accountability buddy. I have multiple unfinished projects. My suggestion is to have study sessions together every day; it would create some external structure and responsibility. My tz is GMT+5. I am 31F.
/r/djangolearning
https://redd.it/1nhmwg9
[D] NeurIPS 2025 Decisions
Just posting this thread here in anticipation of the bloodbath due in the next 2 days.
/r/MachineLearning
https://redd.it/1nie5rl
Why do you like/hate Django?
Hello! I'd like to hear different opinions about this framework.
Why do you like it, or why do you hate it?
Everyone has a free space to share their opinions about it!
PS: you don't have to convince me why I should or shouldn't use it; I'm already using it for work. That doesn't mean I love it though 😂, so I want to read everyone's opinions!
/r/django
https://redd.it/1nil9zn
[P] I built a completely free website to help patients get a second opinion on mammograms, loading the AI model inside the browser for completely local inference without data transfer. Optional LLM-based radiology report generation if needed.
https://redd.it/1nirhbj
@pythondaily
AIWAF Flask: Drop-in Security Middleware with AI Anomaly Detection
Just launched AIWAF Flask, a lightweight yet powerful Web Application Firewall for Flask apps. It combines classic protections like IP blocking, rate limiting, honeypot timing, header validation, and UUID tampering checks with an AI powered anomaly detection system. Instead of relying only on static rules, it can learn suspicious patterns from logs and dynamically adapt to new attack vectors.
The setup is dead simple. By default, just `pip install aiwaf-flask` and wrap your Flask app with `AIWAF(app)`, and it automatically enables all seven protection layers out of the box. You can go further with decorators like `aiwaf_exempt` or `aiwaf_only` for fine-grained control, and even choose between CSV, database, or in-memory storage depending on your environment. For those who want smarter defenses, installing with `[ai]` enables anomaly detection using NumPy and scikit-learn.
AIWAF Flask also includes a CLI (`aiwaf`) for managing IP blacklists/whitelists, blocked keywords, training the AI model from logs, and analyzing traffic patterns. It's designed for developers who want stronger security in Flask without a steep learning curve or heavy dependencies.
aiwaf-flask · PyPI
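Based purely on the usage described above, the default setup would look roughly like the sketch below; the exact import path is an assumption on my part, so check the package docs before copying it.

```python
from flask import Flask
from aiwaf_flask import AIWAF  # assumed module name, matching the PyPI package

app = Flask(__name__)
AIWAF(app)  # per the post: wrapping the app enables all seven protection layers

@app.route("/")
def index():
    return "hello"

if __name__ == "__main__":
    app.run()
```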
/r/flask
https://redd.it/1niw2ii
Update: I got tired of Django project setup, so I built a tool to automate it all
/r/django
https://redd.it/1nir37r
List of 87 Programming Ideas for Beginners (with Python implementations)
https://inventwithpython.com/blog/programming-ideas-beginners-big-book-python.html
I've compiled a list of beginner-friendly programming projects, with example implementations in Python. These projects are drawn from my free Python books, but since they only use stdio text, you can implement them in any language.
I got tired of the copy-paste "1001 project" posts that were obviously copied from other posts or generated by AI, which include everything from "make a coin flip program" to "make an operating system". I've personally curated this list to be small enough for beginners. The implementations are all usually under 100 or 200 lines of code.
/r/Python
https://redd.it/1nitzoz
Do you prefer sticking to the standard library or pulling in external packages?
I’ve been writing Python for a while and I keep running into this situation. Python’s standard library is huge and covers so much, but sometimes it feels easier (or just faster) to grab a popular external package from PyPI.
For example, I’ve seen people write entire data processing scripts with just built-in modules, while others immediately bring in pandas or requests even for simple tasks.
I’m curious how you all approach this. Do you try to keep dependencies minimal and stick to the stdlib as much as possible, or do you reach for external packages early to save development time?
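As a concrete example of the trade-off, here is the same GET request both ways (the URL is just an illustrative public endpoint):

```python
import json
from urllib.request import urlopen

# Standard library only: nothing to install
with urlopen("https://api.github.com") as resp:
    data = json.load(resp)

# External package: terser and friendlier errors, but one more dependency
# to manage (pip install requests)
import requests

data = requests.get("https://api.github.com", timeout=10).json()
```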
/r/Python
https://redd.it/1nj12yr
[D] How is IEEE TIP viewed in the CV/AI/ML community?
Hi everyone,
I’m a PhD student working on video research, and I recently submitted a paper to IEEE Transactions on Image Processing (TIP). After a very long review process (almost a year), it finally reached the “AQ” stage.
Now I’m curious—how do people in the community actually see TIP these days?
Some of my colleagues say it’s still one of the top journals in vision, basically right after TPAMI. Others think it’s kind of outdated and not really read much anymore.
Also, how would you compare it to the major conferences (CVPR/ICCV/ECCV, NeurIPS, ICLR, AAAI)? Is publishing in TIP seen as on par with those, or is it considered more like the “second-tier” conferences (WACV, BMVC, etc.)?
I’m close to graduation, so maybe I’m overthinking this. I know the contribution and philosophy of the work itself matters more than the venue. But I’d still love to hear how people generally view TIP these days, both in academia and in the field.
Thanks!
/r/MachineLearning
https://redd.it/1nj38ur
Flask + gspread: multiple Google Sheets API calls (20+) per scan instead of 1
I’m building a Flask web app for a Model UN conference with around 350-400 registered delegates.
* OCs (Organizing Committee members) log in.
* They scan delegate IDs (QR codes or manual input).
* The app then fetches delegate info from a Google Sheet and logs attendance in another sheet.
All delegate, OC, and attendance data is currently stored in Google Sheets
Whenever a delegate is scanned, the app seems to make many Google Sheets API calls (sometimes 20–25 for a single scan).
I already tried to:
* Cache delegates (load once from master sheet at startup).
* Cache attendance records.
* Batch writes (`append_rows` in chunks of 50).
But I still see too many API calls, and I’m worried about hitting the Google Sheets API quota limits during the event.
After rewriting the backend, I still get around 10 API calls for a single scan, and now I'm not sure whether it's because of the backend or the frontend. I've attached an MRE of my backend below, along with the HTML for the home page.
from flask import Flask, request, render_template, redirect, url_for, session, flash
import gspread, os, json
from google.oauth2.service_account import Credentials
from datetime import datetime, timedelta
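The attached MRE is cut off in this digest, so here is a minimal sketch of the "read once, batch the writes" shape the post describes; the sheet names, column layout, and credentials path are assumptions, not the OP's code. The key point is that worksheet lookups and per-scan reads are themselves API calls, so everything except the periodic flush should happen in memory:

```python
import gspread
from google.oauth2.service_account import Credentials

SCOPES = ["https://www.googleapis.com/auth/spreadsheets"]
creds = Credentials.from_service_account_file("service_account.json", scopes=SCOPES)
gc = gspread.authorize(creds)
sheet = gc.open_by_key("SPREADSHEET_ID")

# Resolve worksheets once at startup; Spreadsheet.worksheet() hits the API too.
delegates_ws = sheet.worksheet("Delegates")
attendance_ws = sheet.worksheet("Attendance")

# One read at startup: cache every delegate row in memory, keyed by ID.
delegates = {row["delegate_id"]: row for row in delegates_ws.get_all_records()}

pending = []  # attendance rows waiting to be flushed


def record_scan(delegate_id):
    """Pure in-memory lookup: zero API calls per scan."""
    info = delegates.get(delegate_id)
    if info:
        pending.append([delegate_id, info["name"]])
    return info


def flush_attendance():
    """One API call for the whole batch, e.g. every 50 scans or on a timer."""
    if pending:
        attendance_ws.append_rows(pending)
        pending.clear()
```

With this shape a scan costs no API calls and the only recurring call is the periodic `append_rows`; if you still see ~10 calls per scan, the frontend is likely hitting an endpoint that re-opens the spreadsheet on every request.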
/r/flask
https://redd.it/1nj6d9n
Let your Python agents play an MMO: Agent-to-Agent protocol + SDK
Repo: https://github.com/Summoner-Network/summoner-agents
TL;DR: We are building Summoner, a Python SDK with a Rust server for agent-to-agent networking across machines. Early beta (beta version 1.0).
What my project does: A protocol for live agent interaction with a desktop app to track network-wide agent state (battles, collaborations, reputation), so you can build MMO-style games, simulations, and tools.
Target audience: Students, indie devs, and small teams who want to build networked multi-agent projects, simulations, or MMO-style experiments in Python.
Comparison:
LangChain and CrewAI are app frameworks and an API spec for serving agents, not an on-the-wire interop protocol;
Google A2A is an HTTP-based spec that uses JSON-RPC by default (with optional gRPC or REST);
MCP standardizes model-to-tool and data connections.
Summoner targets live, persistent agent-to-agent networking for MMO-style coordination.
Status
Our Beta 1.0 works with example agents today. Expect sharp edges.
More
Github page: https://github.com/Summoner-Network
Docs/design notes: https://github.com/Summoner-Network/summoner-docs
Core runtime: https://github.com/Summoner-Network/summoner-core
Site: https://summoner.org
/r/Python
https://redd.it/1niqudg
Python's role in the AI infrastructure stack – sharing lessons from building production AI systems
Python's dominance in AI/ML is undeniable, but after building several production AI systems, I've learned that the language choice is just the beginning. The real challenges are in architecture, deployment, and scaling.
**Current project:** Multi-agent system processing 100k+ documents daily
**Stack:** FastAPI, Celery, Redis, PostgreSQL, Docker
**Scale:** ~50 concurrent AI workflows, 1M+ API calls/month
**What's working well:**
* **FastAPI for API development** – async support handles concurrent AI calls beautifully
* **Celery for background processing** – essential for long-running AI tasks
* **Pydantic for data validation** – catches errors before they hit expensive AI models
* **Rich ecosystem** – libraries like LangChain, Transformers, and OpenAI client make development fast
**Pain points I've encountered:**
* **Memory management** – AI models are memory-hungry, garbage collection becomes critical
* **Dependency hell** – AI libraries have complex requirements that conflict frequently
* **Performance bottlenecks** – Python's GIL becomes apparent under heavy concurrent loads
* **Deployment complexity** – managing GPU dependencies and model weights in containers
**Architecture decisions that paid off:**
1. **Async everywhere** – using asyncio for all I/O operations, including AI model calls
2. **Worker pools** – separate processes for different AI tasks to isolate failures
3. **Caching layer** – Redis for expensive AI results, dramatically improved response times
4. **Health checks** – monitoring AI model availability and fallback mechanisms
**Code patterns that emerged:**
`# Context manager for AI model lifecycle`
`@asynccontextmanager`
`async def ai_model_context(model_name: str):`
`model
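The snippet above is truncated in this digest; a minimal, self-contained sketch of the pattern it starts (the `DummyModel` loader is a stand-in for whatever client or framework is actually in use) would be:

```python
import asyncio
from contextlib import asynccontextmanager


class DummyModel:
    """Stand-in for a real model client; swap in your framework's loader."""
    def __init__(self, name: str):
        self.name = name

    async def generate(self, prompt: str) -> str:
        return f"[{self.name}] echo: {prompt}"

    async def aclose(self):
        pass  # release GPU memory, connections, etc.


@asynccontextmanager
async def ai_model_context(model_name: str):
    model = DummyModel(model_name)        # load / acquire the model
    try:
        yield model                       # hand it to the caller
    finally:
        await model.aclose()              # always released, even on errors


async def main():
    async with ai_model_context("demo-model") as model:
        print(await model.generate("hello"))


asyncio.run(main())
```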
/r/Python
https://redd.it/1nj7y99