Python Daily
Daily Python News
Question, Tips and Tricks, Best Practices on Python Programming Language
Find more reddit channels over at @r_channels
I built an automation to generate coding reels like Peter and Stewie's on Instagram (got 2k followers)



**What My Project Does:**


You might’ve seen those funny Instagram reels where characters like Stewie and Peter Griffin explain coding concepts to each other while gameplay runs in the background, all with AI voices.

I built an automation that replicates that exact format.

Just send a script, and it generates a short, engaging coding reel with:

* AI voiceovers
* Character overlays
* Gameplay footage

All fully automated, no manual editing needed.

**Features:**

* Scraped and integrated AI voices
* Automated video rendering using AWS
* Optional ChatGPT support to auto-generate scripts
* Telegram interface to send scripts and receive final videos

Still improving timing and visuals, open to feedback!

**🛠️ GitHub (code + demo):** https://github.com/Traverser25/Stewie_it_v1

/r/Python
https://redd.it/1lhgllq
What would be the best way to share my flask app on GitHub so that anyone can self host it?

I’ve been working on a small side project that’s a simple flask web app.

The project is mainly a learning exercise for me but I also want to learn how to properly open source code.

It’s at a point now where I feel it’s usable, and I’ve been slowly building up a proper readme for my GitHub page.

My goal is to simplify the installation process as much as possible, so for now I’ve written two batch files that handle installation and execution. But I’m wondering if there is a better way to go about this.

Keen to hear any advice.
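One way to make the batch files unnecessary is to give the app a single cross-platform entry point, so the README can just say `pip install -r requirements.txt` followed by `python app.py`. A minimal sketch (file, route, and function names here are illustrative, not from the poster's project):

```python
# app.py -- illustrative entry point for a self-hostable Flask app
from flask import Flask

def create_app():
    # an application factory keeps setup in one place and eases testing
    app = Flask(__name__)

    @app.route("/")
    def index():
        return "Hello from a self-hosted Flask app!"

    return app

if __name__ == "__main__":
    # users on any OS can start the app with: python app.py
    create_app().run(host="0.0.0.0", port=5000)
```

A `Dockerfile` or a `pyproject.toml` console script can later reuse the same `create_app` factory without changing the code.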

/r/flask
https://redd.it/1lh3t1v
Parallel and Concurrent Programming in Python: A Practical Guide

Hey, I made a video about Parallel and Concurrent Programming in Python with threading and multiprocessing.

First we write a program that doesn't use either method, then we apply them and compare the differences in performance.

https://www.youtube.com/watch?v=IQxKjGEVteI
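The comparison the video makes can be sketched in a few lines: run an I/O-bound task sequentially, then with a thread pool, and compare wall-clock time (the `sleep` stands in for a network call; this is a generic illustration, not the video's exact code):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(_):
    time.sleep(0.2)  # simulate an I/O-bound call
    return 1

# sequential baseline: roughly 5 x 0.2s of wall time
start = time.perf_counter()
results_seq = [fetch(i) for i in range(5)]
seq_time = time.perf_counter() - start

# threaded version: the sleeps overlap, so wall time collapses to ~0.2s
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=5) as pool:
    results_thr = list(pool.map(fetch, range(5)))
thr_time = time.perf_counter() - start

print(f"sequential: {seq_time:.2f}s, threaded: {thr_time:.2f}s")
```

For CPU-bound work the GIL prevents this speedup, which is where `multiprocessing` comes in.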

/r/Python
https://redd.it/1lhgxek
Fast, lightweight parser for Securities and Exchange Commission Inline XBRL

Hi there, this is a niche package but it may help a few people. I noticed that the SEC XBRL endpoint sometimes takes hours to update and is missing a lot of data, so I wrote a fast, lightweight Inline XBRL parser to fix this.

https://github.com/john-friedman/secxbrl

# What my project does

Parses SEC Inline XBRL quickly using only the Inline XBRL HTML file, without the need for linkbases, schema files, etc.

# Target Audience

Algorithmic traders, PhD students, Quant researchers, and hobbyists.

# Comparison

Other packages such as python-xbrl, py-xbrl, and brel are focused on parsing most forms of XBRL. This package only parses SEC XBRL. This allows for dramatically faster performance as no additional files need to be downloaded, making it suitable for running on small instances such as t4g.nanos.

The readme contains links to the other packages, as they may be a better fit for your use case.

# Example

    from secxbrl import parse_inline_xbrl

    # load data
    path = '../samples/000095017022000796/tsla-20211231.htm'
    with open(path, 'rb') as f:
        content = f.read()

    # get all EarningsPerShareBasic
    basic = [{'val': item['_val'], 'date': item['_context']['context_period_enddate']} for item

/r/Python
https://redd.it/1lhdspc
FastAPI Guard v3.0 - Now with Security Decorators and AI-like Behavior Analysis

Hey r/Python!

So I've been working on my FastAPI security library (fastapi-guard) for a while now, and it's honestly grown way beyond what I thought it would become. Since my last update on r/Python (I wasn't able to post on r/FastAPI until today), I've basically rebuilt the whole thing and added some pretty cool features.

What My Project Does:

Still does all the basic stuff - IP whitelisting/blacklisting, rate limiting, penetration attempt detection, cloud provider blocking, etc. But now it's way more flexible and you can configure everything per route.

What's new:

The biggest addition is Security Decorators. You can now secure individual routes instead of just using the global middleware configuration. Want to rate limit just one endpoint? Block certain countries from accessing your admin panel? Done. No more "all or nothing" approach.

    from fastapi_guard.decorators import SecurityDecorator

    @app.get("/admin")
    @SecurityDecorator.access_control.block_countries(["CN", "RU"])
    @SecurityDecorator.rate_limiting.limit(requests=5, window=60)
    async def admin_panel():
        return {"status": "admin"}


Other stuff that got fixed:

- Had a security vulnerability in v2.0.0 with header injection through X-Forwarded-For. That's patched now
- IPv6 support was broken, fixed that too
- Made IPInfo completely optional - you can now use your own geo IP handler.
- Rate limiting is now proper sliding window instead of fixed window
- Other improvements/enhancements/optimizations...
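For context on the fixed-vs-sliding distinction above: a sliding window counts requests over the trailing `window` seconds rather than per calendar bucket, so bursts straddling a bucket boundary can't sneak through. A minimal standalone sketch (not fastapi-guard's actual implementation; the class name is made up):

```python
import time
from collections import deque

class SlidingWindowLimiter:
    """Allow at most `limit` hits per `window` seconds, counted over a true sliding window."""

    def __init__(self, limit, window):
        self.limit = limit
        self.window = window
        self.hits = deque()  # timestamps of accepted requests

    def allow(self, now=None):
        now = time.monotonic() if now is None else now
        # evict timestamps that have slid out of the window
        while self.hits and now - self.hits[0] >= self.window:
            self.hits.popleft()
        if len(self.hits) < self.limit:
            self.hits.append(now)
            return True
        return False
```

With `limit=2, window=1.0`, a third request at t=0.5s is rejected, but a request at t=1.05s is allowed again once the t=0.0s hit expires.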

Been using it in production for months

/r/Python
https://redd.it/1lhxwee
Monday Daily Thread: Project ideas!

# Weekly Thread: Project Ideas 💡

Welcome to our weekly Project Ideas thread! Whether you're a newbie looking for a first project or an expert seeking a new challenge, this is the place for you.

## How it Works:

1. **Suggest a Project**: Comment your project idea—be it beginner-friendly or advanced.
2. **Build & Share**: If you complete a project, reply to the original comment, share your experience, and attach your source code.
3. **Explore**: Looking for ideas? Check out Al Sweigart's ["The Big Book of Small Python Projects"](https://www.amazon.com/Big-Book-Small-Python-Programming/dp/1718501242) for inspiration.

## Guidelines:

* Clearly state the difficulty level.
* Provide a brief description and, if possible, outline the tech stack.
* Feel free to link to tutorials or resources that might help.

## Example Submissions:

## Project Idea: Chatbot

**Difficulty**: Intermediate

**Tech Stack**: Python, NLP, Flask/FastAPI/Litestar

**Description**: Create a chatbot that can answer FAQs for a website.

**Resources**: [Building a Chatbot with Python](https://www.youtube.com/watch?v=a37BL0stIuM)

## Project Idea: Weather Dashboard

**Difficulty**: Beginner

**Tech Stack**: HTML, CSS, JavaScript, API

**Description**: Build a dashboard that displays real-time weather information using a weather API.

**Resources**: [Weather API Tutorial](https://www.youtube.com/watch?v=9P5MY_2i7K8)

## Project Idea: File Organizer

**Difficulty**: Beginner

**Tech Stack**: Python, File I/O

**Description**: Create a script that organizes files in a directory into sub-folders based on file type.

**Resources**: [Automate the Boring Stuff: Organizing Files](https://automatetheboringstuff.com/2e/chapter9/)

Let's help each other grow. Happy coding!

/r/Python
https://redd.it/1li2gwg
Run background tasks in Django with zero external dependencies. Here's an update on my library, django-async-manager.

Hey Django community!

I've posted here before about **django-async-manager**, a library I've been developing, and I wanted to share an update on its progress and features.

**What is django-async-manager?**

It's a lightweight, database-backed task queue for Django that provides a Celery-like experience without external dependencies. Perfect for projects where you need background task processing but don't want the overhead of setting up Redis, RabbitMQ, etc.

**New Feature: Memory Management**

The latest update adds memory limit capabilities to prevent tasks from consuming too much RAM. This is especially useful for long-running tasks or when working in environments with limited resources.

# Task with Memory Limit

    @background_task(memory_limit=512)  # limit to 512 MB
    def memory_intensive_task():
        # this task will be terminated if it exceeds 512 MB
        large_data = process_large_dataset()
        return analyze_data(large_data)

# Key Features

* **Simple decorator-based API** - Just add `@background_task` to any function
* **Task prioritization** - Set tasks as low, medium, high, or critical priority
* **Multiple queues** - Route tasks to different workers
* **Task dependencies** - Chain tasks together
* **Automatic retries** - With configurable exponential backoff
* **Scheduled tasks** - Cron-like scheduling for periodic tasks
* **Timeout control** - Prevent tasks from running too long
* **Memory limits** - Stop tasks from consuming too much RAM
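For context, one common way to enforce such a memory cap on Unix (not necessarily what django-async-manager does internally) is `resource.setrlimit` applied in the worker process, so an oversized allocation raises `MemoryError`. The helper below is a hypothetical sketch:

```python
import resource
import subprocess
import sys

def run_with_memory_limit(code, megabytes):
    """Run a Python snippet in a subprocess whose address space is capped."""
    limit = megabytes * 1024 * 1024

    def cap():
        # runs in the child just before exec, so the parent process is unaffected
        resource.setrlimit(resource.RLIMIT_AS, (limit, limit))

    proc = subprocess.run(
        [sys.executable, "-c", code],
        preexec_fn=cap,
        capture_output=True,
    )
    return proc.returncode

# a tiny allocation succeeds under a 2 GB cap; a 3 GB allocation fails
ok = run_with_memory_limit("x = bytearray(10)", 2048)
too_big = run_with_memory_limit("x = bytearray(3 * 1024**3)", 2048)
```

Capping the address space is coarse (it also counts mappings, not just heap), but it reliably turns a runaway task into a catchable failure instead of an OOM-killed host.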

/r/django
https://redd.it/1lhxz7r
[P] I made a website to visualize machine learning algorithms + derive math from scratch

/r/MachineLearning
https://redd.it/1lhtkr4
Fenix: I built an algorithmic trading bot with CrewAI, Ollama, and Pandas.

Hey r/Python,

I'm excited to share a project I've been passionately working on, built entirely within the Python ecosystem: Fenix Trading Bot. The post was removed earlier for missing some sections, so here is a more structured breakdown.

GitHub Link: https://github.com/Ganador1/FenixAI_tradingBot

# What My Project Does

Fenix is an open-source framework for algorithmic cryptocurrency trading. Instead of relying on a single strategy, it uses a crew of specialized AI agents orchestrated by CrewAI to make decisions. The workflow is:

1. It scrapes data from multiple sources: news feeds, social media (Twitter/Reddit), and real-time market data.
2. It uses a Visual Agent with a vision model (LLaVA) to analyze screenshots of TradingView charts, identifying visual patterns.
3. A Technical Agent analyzes quantitative indicators (RSI, MACD, etc.).
4. A Sentiment Agent reads news/social media to gauge market sentiment.
5. The analyses are passed to Consensus and Risk Management agents that weigh the evidence, check against user-defined risk parameters, and make the final BUY, SELL, or HOLD decision. The entire AI analysis runs 100% locally using Ollama, ensuring privacy and zero API costs.
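Step 5 above can be illustrated with a toy weighted vote (purely a sketch; the function name, weights, and threshold are made up and are not Fenix's actual consensus logic):

```python
def consensus(signals, weights=None, threshold=0.3):
    """Combine per-agent scores in [-1, 1] (sell .. buy) into a decision."""
    weights = weights or {name: 1.0 for name in signals}
    total = sum(weights.get(n, 1.0) for n in signals)
    # weighted average of the agents' signals
    avg = sum(signals[n] * weights.get(n, 1.0) for n in signals) / total
    if avg > threshold:
        return "BUY"
    if avg < -threshold:
        return "SELL"
    return "HOLD"

decision = consensus({"technical": 0.9, "sentiment": 0.6, "visual": 0.8})
```

A risk-management layer would then sit on top of this, vetoing a BUY that violates position-size or drawdown limits.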

# Target Audience

This project is aimed at:

* **Python Developers & AI Enthusiasts:** who want to see a real-world, complex application of modern Python libraries like CrewAI, Ollama, Pydantic, and Selenium working together. It serves as a great case study for building multi-agent systems.
* **Algorithmic Traders & Quants:** who are looking for a flexible, open-source framework that goes beyond

/r/Python
https://redd.it/1li8id5
sodalite - an open source media downloader with a pure python backend

Made this as a passion project, hope you'll like it :) If you did, please star it! I built it as part of a hackathon and I'd appreciate the support.

What my project does
It detects a link you paste from a supported service, parses it via a network request and serves the file through a FastAPI backend.

Intended audience
Mostly anyone who's willing to self-host it; production, I guess?

Repo link
https://github.com/oterin/sodalite

/r/Python
https://redd.it/1li6ek4
pandas/python functions (pushing and calling dataframe)

Hello all,
I am fairly new to Python, so I am having some difficulty with the following.
I wanted to create a dim table in a separate file, push a few of its columns to SQL, and allow a few other columns to be pulled into another Python file, where I would merge them with that DataFrame (basically creating ID keys).
But I am having trouble doing that; it gives me a long error when I call the function in the other file: `product_table = Orders_product()`.
Could someone point me in the right direction?

Product table:

    import pandas as pd
    from MySQL import get_mysql_engine

    # getting the file
    File = r"ExcelFilePath"
    Sheet = "Orders"
    df = pd.read_excel(File, sheet_name=Sheet)
    product_columns = ["Product Category", "Product Sub-Category", "Product Container", "Product Name"]

    def Orders_product():
        # cleaning text / dropping duplicates
        df_products = df[product_columns].copy()
        for product_col in product_columns:
            df_products[product_col] = df_products[product_col].str.strip()
        df_products['ProductKeyJoin'] = df_products[product_columns].agg('|'.join, axis=1)
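One way to structure what the question describes (a hedged sketch with made-up sample data and names): have the function return a deduplicated dimension table carrying a surrogate key, so another file can import the function and merge the key back onto the fact table instead of re-deriving the columns:

```python
import pandas as pd

def build_product_dim(df, product_columns):
    """Return a deduplicated dimension table with a surrogate ProductID key."""
    dim = df[product_columns].copy()
    for col in product_columns:
        dim[col] = dim[col].str.strip()
    dim = dim.drop_duplicates().reset_index(drop=True)
    dim["ProductID"] = dim.index + 1  # surrogate key
    return dim

# toy fact table standing in for the Excel sheet
orders = pd.DataFrame({
    "Product Category": ["Office", "Office", "Office"],
    "Product Name": ["Stapler ", "Stapler", "Binder"],
})
cols = ["Product Category", "Product Name"]
dim = build_product_dim(orders, cols)

# attach the key to the fact table by merging on the cleaned columns
facts = orders.assign(**{c: orders[c].str.strip() for c in cols}).merge(dim, on=cols, how="left")
```

The dimension table can then go to SQL with `dim.to_sql(...)` via the engine, while `facts` keeps only `ProductID` in place of the text columns.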


/r/Python
https://redd.it/1lhyni4