Python Daily
Daily Python News
Questions, Tips and Tricks, Best Practices on the Python Programming Language
Find more reddit channels over at @r_channels
Working with username with custom user model

I've read that it's advisable to create my own custom user model instead of using the default provided by Django. Most of the tutorials I've watched don't add a username field and instead derive the username from the email. When I did add a username field, I could no longer create a superuser; it fails with the error "django.core.exceptions.FieldDoesNotExist: User has no field named 'accounts.User.username'". Where should I go from here?

my custom user manager

my custom user model



/r/djangolearning
https://redd.it/1pecctq
qCrawl — an async high-performance crawler framework

Site: https://github.com/crawlcore/qcrawl

What My Project Does

qCrawl is an async web crawler framework based on asyncio.

Key features

Async architecture - High-performance concurrent crawling based on asyncio
Performance optimized - Redis-backed queue with direct delivery, MessagePack serialization, connection pooling, DNS caching
Powerful parsing - CSS/XPath selectors with lxml
Middleware system - Customizable request/response processing
Flexible export - Multiple output formats including JSON, CSV, XML
Flexible queue backends - Memory or Redis-based (+disk) schedulers for different scale requirements
Item pipelines - Data transformation, validation, and processing pipeline
Pluggable downloaders - HTTP (aiohttp), Camoufox (stealth browser) for JavaScript rendering and anti-bot evasion
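The queue-driven async architecture in the feature list can be sketched in plain asyncio. This is a generic illustration of the pattern, not qCrawl's actual API; `fetch` is a stand-in for a real HTTP downloader:

```python
import asyncio


async def fetch(url: str) -> str:
    # Stand-in for a real downloader (qCrawl uses aiohttp);
    # we just simulate network latency and return a fake body.
    await asyncio.sleep(0.01)
    return f"<html>{url}</html>"


async def worker(queue: asyncio.Queue, results: list) -> None:
    # Each worker pulls URLs from the shared queue until cancelled.
    while True:
        url = await queue.get()
        try:
            results.append(await fetch(url))
        finally:
            queue.task_done()


async def crawl(urls: list, concurrency: int = 4) -> list:
    queue: asyncio.Queue = asyncio.Queue()
    for url in urls:
        queue.put_nowait(url)
    results: list = []
    workers = [asyncio.create_task(worker(queue, results))
               for _ in range(concurrency)]
    await queue.join()   # block until every queued URL has been processed
    for w in workers:
        w.cancel()       # shut the now-idle workers down
    return results


if __name__ == "__main__":
    pages = asyncio.run(crawl([f"https://example.com/{i}" for i in range(8)]))
    print(len(pages))  # 8
```

With `concurrency=4`, four fetches are in flight at once over a single event loop thread, which is the core performance win of an asyncio-based crawler over a sequential one.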

Target Audience

1. Developers building large-scale web crawlers or scrapers
2. Data engineers and data scientists who need automated data extraction
3. Companies and researchers performing continuous or scheduled crawling

Comparison

1. Compared to Scrapy: qCrawl is essentially Scrapy built on asyncio instead of Twisted, with Memory/Redis queue backends featuring direct delivery and MessagePack serialization, plus pluggable downloaders: HTTP (aiohttp) and Camoufox (stealth browser) for JavaScript rendering and anti-bot evasion.
2. Compared to Playwright/Camoufox: you can use them directly, but with qCrawl a single spider can distribute requests between aiohttp for maximum performance and Camoufox when JavaScript rendering is required.
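The per-request routing described in the second comparison point can be sketched generically. All names here (`Request.render_js`, the downloader classes) are hypothetical illustrations of the pattern, not qCrawl's real interface:

```python
import asyncio
from dataclasses import dataclass, field


@dataclass
class Request:
    url: str
    render_js: bool = False          # route to the browser downloader when True
    meta: dict = field(default_factory=dict)


class HttpDownloader:
    async def fetch(self, req: Request) -> str:
        return f"http:{req.url}"     # a real crawler would call aiohttp here


class BrowserDownloader:
    async def fetch(self, req: Request) -> str:
        return f"browser:{req.url}"  # a real crawler would drive Camoufox here


class Router:
    """Picks a downloader per request, so one spider mixes both."""

    def __init__(self) -> None:
        self.http = HttpDownloader()
        self.browser = BrowserDownloader()

    async def fetch(self, req: Request) -> str:
        downloader = self.browser if req.render_js else self.http
        return await downloader.fetch(req)


async def main():
    router = Router()
    fast = await router.fetch(Request("https://example.com/api"))
    heavy = await router.fetch(Request("https://example.com/spa", render_js=True))
    return fast, heavy
```

The design point is that the routing decision lives on the request object, so cheap API pages and JavaScript-heavy pages flow through the same spider with different downloaders.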

/r/Python
https://redd.it/1pfofmq