🚀 New issue to ag2ai/faststream by @carlodri
📝 LICENSE file is missing in source distribution (#2594)
Apache-2.0 requires publishing the LICENSE together with every distribution.
#good_first_issue #bug
sent via relator
New issue to wemake-services/django-modern-rest by @sobolevn
Write tests for "double validation" problem (#85)
We need to be sure that double validation does not happen. Ever.
See what "double validation" is: fastapi/fastapi#3021
To be sure, we need to write tests for it.
Tests would be rather easy to write:
• One for @modify
• One for @validate
• One for raw data return
Also test disabled response validation mode to do 0 validations.
See the attached issue to learn how to make sure that double validation happened.
No matter what, we need to do 1 or 0 validations.
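The call-counting idea behind these tests can be sketched like this; CountingValidator and handle_request are hypothetical stand-ins, not django-modern-rest APIs:

```python
class CountingValidator:
    """Wraps a validation callable and records how often it runs."""

    def __init__(self, validate):
        self._validate = validate
        self.calls = 0

    def __call__(self, payload):
        self.calls += 1
        return self._validate(payload)


def handle_request(payload, validator):
    # Stand-in for the framework pipeline: the body must be validated
    # exactly once on the way in, and never re-validated on the way out.
    body = validator(payload)
    return {"echo": body}


validator = CountingValidator(lambda payload: payload)
handle_request({"value": 1}, validator)
assert validator.calls == 1  # exactly 1 validation, never 2
```

The real tests would patch the serializer's validation entry point the same way and assert the counter after a full request/response cycle.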
#django_modern_rest #help_wanted #enhancement #good_first_issue
sent via relator
New issue to wemake-services/django-modern-rest by @sobolevn
Split `rest` app into several django apps (#171)
Our integration tests need some love.
Please, split this big app into smaller and more manageable ones:
• basic parsing can stay there
• openapi should go into its own app
• middleware stuff should go into other app as well
https://github.com/wemake-services/django-modern-rest/blob/master/django_test_app/server/apps/rest/views.py
Also, make sure that test files are also different in tests/test_integration.
#django_modern_rest #help_wanted #good_first_issue #enhancement
sent via relator
New issue to wemake-services/django-modern-rest by @sobolevn
Split code from `middleware.rst` into its own example file (#186)
We store our examples as independent files, so it would be easier for us to work with them: type checking, linting, etc.
Right now middleware.rst has a lot of inline code with no example files. We need to create a new example/middleware/<example_name>.py and split the code out.
#django_modern_rest #documentation #good_first_issue #help_wanted
sent via relator
New issue to wemake-services/django-modern-rest by @sobolevn
Customize `.. literalinclude` directive to automatically create links to the source files (#191)
Some examples have a big number of import statements that take around 30%-50% of the whole example. We can use :lines: NUM- to hide imports, but we still want to show the whole file when needed.
To fix this I propose adding a Sphinx custom directive called .. literalincludelinked, which will produce:
🖼️Image
Here are some basic examples of how to do something similar:
https://chatgpt.com/share/68ff5943-c07c-8009-8ae1-cc6520b452d7
PRs are very welcome!
#django_modern_rest #documentation #help_wanted #enhancement #good_first_issue
sent via relator
🚀 New issue to ag2ai/faststream by @HelgeKrueger
📝 Bug: additional "Subscribe" in docs (#2617)
Describe the bug
In the sidebar, the subscriber is called "my_subscriberSubscribe" instead of "my_subscriber"
How to reproduce
from faststream import FastStream
from faststream.rabbit import RabbitBroker

broker = RabbitBroker()

@broker.subscriber("queue", title="my_subscriber")
async def test(): ...

app = FastStream(broker)
then run
uv run faststream docs serve main:app
Screenshots
🖼️Image
#good_first_issue #bug
sent via relator
New issue to wemake-services/django-modern-rest by @sobolevn
Validate that controllers with `blueprints` can't have `component_parsers` (#217)
Otherwise we would do two parsings in a single HTTP request.
This is not good.
Right now this code is possible:
@final
class _ErrorBody(pydantic.BaseModel):
    error: int

class _ErrorHandlingBlueprint(
    Blueprint[PydanticSerializer],
    Body[_ErrorBody],  # <- first body parsing
):
    def post(self) -> str:
        return str(self.parsed_body.error)

class _ErrorHandlingController(
    Controller[PydanticSerializer],
    Body[dict[str, int]],  # <- second body parsing
):
    blueprints: ClassVar[Sequence[_BlueprintT]] = [_ErrorHandlingBlueprint]

    def get(self) -> int:
        return self.parsed_body.get('error', 0)
This must raise a validation exception in ControllerValidator.
We allow endpoints in the final composed controllers, but they must not use any parsing. Like OPTIONS, for example. If users want to have parsed data, then they should create a blueprint.
#django_modern_rest #good_first_issue #enhancement #help_wanted
sent via relator
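The validation described above could be sketched as follows; everything here apart from the ControllerValidator idea is a hypothetical stand-in, not the project's real API:

```python
class BlueprintParsingConflictError(TypeError):
    """Raised when a controller mixes blueprints with its own parsers."""


def validate_no_double_parsing(controller_cls) -> None:
    # Hypothetical rule: a controller that composes blueprints must not
    # declare its own component parsers, otherwise the request body gets
    # parsed twice (once per layer) in a single HTTP request.
    has_blueprints = bool(getattr(controller_cls, "blueprints", ()))
    has_parsers = bool(getattr(controller_cls, "component_parsers", ()))
    if has_blueprints and has_parsers:
        raise BlueprintParsingConflictError(
            f"{controller_cls.__name__} declares both blueprints and "
            "component_parsers; move all parsing into the blueprints."
        )
```

In the real project this check would live next to the other ControllerValidator rules and run at class-definition time.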
New issue to wemake-services/django-modern-rest by @sobolevn
Validate that we can't set `Set-Cookie` header and should use `cookies=` instead (#259)
When using @validate and @modify we must check that no ResponseSpec.headers contain a Set-Cookie header description / modification.
If so, we need to raise an error that the cookies= parameter must be used instead.
#django_modern_rest #help_wanted #enhancement #good_first_issue
sent via relator
🚀 New issue to ag2ai/faststream by @jsonvot
📝 Bug: The coexistence issue between URL and virtualhost (#2652)
Describe the bug
import asyncio

from faststream.rabbit import RabbitBroker

async def pub():
    broker = RabbitBroker('amqp://guest:guest@localhost:5672/', virtualhost='/domestic-aed')  # 1
    # broker = RabbitBroker('amqp://guest:guest@localhost:5672', virtualhost='//domestic-aed')  # 2
    # broker = RabbitBroker('amqp://guest:guest@localhost:5672/', virtualhost='//domestic-aed')  # 3
    # broker = RabbitBroker('amqp://guest:guest@localhost:5672//domestic-aed')  # 4
    # broker = RabbitBroker('amqp://guest:guest@localhost:5672', virtualhost='/domestic-aed')  # 5
    async with broker:
        await broker.publish(
            "Hi!",
            queue="test-queue",
            exchange="test-exchange",
        )

asyncio.run(pub())
In version v0.5.33, the first method works properly, and the trailing slash (/) at the end of the URL cannot be omitted. However, in versions >=0.5.34, due to additional handling of virtualhost, only parameter-based formats like 2, 3, 4 and 5 are supported.
I believe method 5 is the most intuitive and should be handled correctly, but it is currently treated as an invalid format. Using this method will result in the following error:
🖼️Image
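For context on why the trailing slash matters, here is what the URL itself carries in each style. This sketch only shows standard urllib.parse behavior; that the broker derives the default vhost from the URL path is an assumption:

```python
from urllib.parse import urlparse

# Variants 1/3: trailing slash -> path "/" (the default vhost).
print(urlparse("amqp://guest:guest@localhost:5672/").path)   # "/"

# Variants 2/5: no slash -> empty path, so the vhost must come
# entirely from the virtualhost= parameter.
print(urlparse("amqp://guest:guest@localhost:5672").path)    # ""

# Variant 4: vhost embedded in the URL itself.
print(urlparse("amqp://guest:guest@localhost:5672//domestic-aed").path)  # "//domestic-aed"
```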
Environment
faststream[rabbit]>=0.5.34
#bug #good_first_issue #faststream #ag2ai
sent via relator
🚀 New issue to ag2ai/faststream by @Sehat1137
📝 Skip relator notification from the dependabot (#2665)
Improve pipeline so that it is not triggered by events from dependabot.
https://github.com/ag2ai/faststream/blob/main/.github/workflows/relator.yaml#L19
#good_first_issue #github_actions #faststream #ag2ai
sent via relator
Upgrade Python versions
https://github.com/pomponchik/instld/issues/21
#good_first_issue #github_actions #super_simple
Upgrade Python versions · Issue #21 · pomponchik/instld
We need to prepare a new version of the library for Python 3.8-3.14 (including 3.14t). All of these Python versions should be displayed as meta package data and used in CI.
🚀 New issue to ag2ai/faststream by @GrigoriyKuzevanov
📝 Bug: min_idle_time ignored when group and consumer are specified (#2678)
Describe the bug
When a StreamSub has both 'group' and 'consumer', and 'min_idle_time' specified, FastStream uses 'XREADGROUP' instead of 'XAUTOCLAIM'.
How to reproduce
Include source code:
from faststream import FastStream
from faststream.redis import RedisBroker, StreamSub

broker = RedisBroker("redis://localhost:6379")

@broker.subscriber(
    stream=StreamSub(
        "orders",
        group="processors",
        consumer="claimer",
        min_idle_time=10000,  # Should trigger XAUTOCLAIM
    )
)
async def claiming_handler(msg):
    print("Should use XAUTOCLAIM, but uses XREADGROUP")

app = FastStream(broker)
Redis MONITOR output shows:
XREADGROUP GROUP processors claimer BLOCK 100 STREAMS orders >
Expected behavior
XAUTOCLAIM orders processors claimer 10000 0-0 COUNT 1
Observed behavior
I suppose that the root cause is in faststream/redis/subscriber/use_cases/stream_subscriber, method _StreamHandlerMixin.start():
if stream.group and stream.consumer:  # ← Checked FIRST
    # Uses XREADGROUP
    ...
elif self.stream_sub.min_idle_time is None:
    # Uses XREAD
    ...
else:
    # Uses XAUTOCLAIM ← Never reached when group is set!
    ...
Or maybe I just misunderstand the logic.
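If that diagnosis is right, the fix may be as simple as testing min_idle_time before the plain group read. A sketch of the corrected branch ordering (a hypothetical helper, not FastStream's actual code):

```python
def pick_read_command(group, consumer, min_idle_time):
    """Choose the Redis stream command for a subscriber.

    Sketch of the corrected priority: XAUTOCLAIM must win whenever
    min_idle_time is set, even if group/consumer are also present.
    """
    if group and consumer and min_idle_time is not None:
        return "XAUTOCLAIM"
    if group and consumer:
        return "XREADGROUP"
    return "XREAD"


print(pick_read_command("processors", "claimer", 10000))  # XAUTOCLAIM
```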
Environment
Running FastStream 0.6.3 with CPython 3.12.3 on Linux
#bug #good_first_issue #faststream #ag2ai
sent via relator
🚀 New issue to ag2ai/faststream by @gaby
📝 bug: Usage of custom logger results in no logs (#2677)
Is your feature request related to a problem? Please describe.
The built-in logger is configured to always add colors, even when passing a logger to faststream. This is hardcoded here: https://github.com/ag2ai/faststream/blob/main/faststream/_internal/logger/logging.py#L80
This affects systems collecting logs from faststream hosts: it makes logs generated by faststream show up in raw text as "\033[36mDEBUG\033[0m" instead of DEBUG.
Describe the solution you'd like
Make the use_colors param configurable instead of a hardcoded value.
Describe alternatives you've considered
Writing a custom log parser.
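Until use_colors is configurable, a collector-side workaround is to strip the ANSI color sequences from each log line. This sketch is independent of faststream:

```python
import re

# Matches ANSI SGR sequences such as "\033[36m" and "\033[0m".
ANSI_SGR = re.compile(r"\x1b\[[0-9;]*m")


def strip_ansi(line: str) -> str:
    """Return the log line with ANSI color codes removed."""
    return ANSI_SGR.sub("", line)


print(strip_ansi("\033[36mDEBUG\033[0m"))  # DEBUG
```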
#enhancement #good_first_issue #faststream #ag2ai
sent via relator
🚀 New issues to the taskiq-python project
Hello everyone interested in contributing to open-source projects! We appreciate your intentions and ask for your help.
The taskiq-python project aims to migrate from Poetry to UV and drop support for Python 3.9. Since Taskiq has many different repositories, we would greatly appreciate your assistance. Here's a list of issues:
- https://github.com/taskiq-python/taskiq-fastapi/issues/24
- https://github.com/taskiq-python/taskiq-redis/issues/108
- https://github.com/taskiq-python/taskiq-psqlpy/issues/10
- https://github.com/taskiq-python/taskiq-valkey/issues/3
- https://github.com/taskiq-python/taskiq-pipelines/issues/28
- https://github.com/taskiq-python/taskiq-litestar/issues/3
- https://github.com/taskiq-python/taskiq-aiogram/issues/4
- https://github.com/taskiq-python/taskiq-aiohttp/issues/7
#good_first_issue #taskiq #enhancement
🚀 New issue to ag2ai/faststream by @nectarindev
📝 feat: merge `Broker(context=...)` and `FastStream(context=...)` at broker-level (#2693)
I recently migrated my project to faststream 0.6. I was very interested in how I could add my dependencies to the context. Prior to version 0.6, I did something like this:
from faststream.annotations import ContextRepo
from faststream.kafka import KafkaBroker
from faststream.utils.context import context

broker = KafkaBroker()

@broker.subscriber("my_topic", group_id="my_group")
async def handle(
    context: ContextRepo,
):
    print("dependency: ", context.get("dependency"))  # 42

async def lifespan(*args, **kwargs):
    context.set_global("dependency", 42)
    await broker.start()
    try:
        yield
    finally:
        await broker.stop()
I launched the broker as part of my application in lifespan without using the FastStream class.
For version 0.6, I saw examples where it was suggested to pass the context to FastStream, but that solution did not suit me. I discovered that the broker also accepts context, and that solves my problem:
broker = KafkaBroker(context=ContextRepo({"dependency": 42}))
...

async def lifespan(*args, **kwargs):
    await broker.start()
    try:
        yield
    finally:
        await broker.stop()
But I also discovered that if I create a FastStream instance, its context will be used, even though I didn't use it to start the broker.
from fastapi import FastAPI
from faststream import ContextRepo, FastStream
from faststream.kafka import KafkaBroker

broker = KafkaBroker(context=ContextRepo({"broker_dependency": 2}))
app = FastStream(broker, context=ContextRepo({"application_dependency": 1}))

@broker.subscriber("my_topic", group_id="my_group")
async def handle(
    context: ContextRepo,
):
    print("broker_dependency: ", context.get("broker_dependency"))  # None
    print("application_dependency: ", context.get("application_dependency"))  # 1

async def lifespan(*args, **kwargs):
    await broker.start()
    try:
        yield
    finally:
        await broker.stop()

asgi = FastAPI(lifespan=lifespan)
I'm not sure that's normal behavior. It would make much more sense if only the broker's dependency were available.
⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯
Running FastStream 0.6.3 with CPython 3.12.4 on Linux
#enhancement #good_first_issue #core #faststream #ag2ai
sent via relator
🚀 New issue to wemake-services/django-modern-rest by @sobolevn
📝 Unskip `test_msgspec` files on 3.14 with new released version (#334)
msgspec@0.20.0 is out and updated in #333. Now all tests in test_plugins/test_msgspec can be executed on 3.14 as well.
We need a PR with the fix that removes all pytest.skip directives for 3.14 from these files.
#enhancement #good_first_issue #help_wanted #django_modern_rest
sent via relator
🚀 New issue to wemake-services/wemake-python-styleguide by @luminoso
📝 False positive for WPS457 when using asyncio to control loops (#3573)
What's wrong
For the code:
import asyncio

async def infinite_loop() -> None:
    try:
        while True:
            await asyncio.sleep(1)
            print("I'm alive. And doing work.")
    except asyncio.CancelledError:
        print("Loop cancelled")

t = asyncio.create_task(infinite_loop())
await asyncio.sleep(5)
t.cancel()
"WPS457: Found an infinite while loop" is raised. But the infinite loop is being handled, just not within the while loop itself.
How it should be
Not to raise WPS457.
Not 100% sure here if it is the best practice or not.
Also, probably the solution is too complex and it is just easier to add a noqa in the code.
Flake8 version and plugins
{
    "platform": {
        "python_implementation": "CPython",
        "python_version": "3.13.9",
        "system": "Linux"
    },
    "plugins": [
        {"plugin": "mccabe", "version": "0.7.0"},
        {"plugin": "pycodestyle", "version": "2.14.0"},
        {"plugin": "pyflakes", "version": "3.4.0"},
        {"plugin": "wemake-python-styleguide", "version": "1.4.0"}
    ],
    "version": "7.3.0"
}
pip information
pip 25.3 from /var/home/luminoso/.local/lib/python3.14/site-packages/pip (python 3.14)
anyio==4.11.0
archspec==0.2.5
argcomplete==3.6.3
asttokens==3.0.0
attrs==25.4.0
boto3==1.42.4
botocore==1.42.4
Brlapi==0.8.7
Brotli==1.1.0
certifi==2025.7.9
charset-normalizer==3.4.3
click==8.1.7
conda==25.11.0
conda-package-handling==2.4.0
conda_package_streaming==0.11.0
cupshelpers==1.0
dasbus==1.7
dbus-python==1.4.0
decorator==5.2.1
distro==1.9.0
executing==2.2.1
fedora-third-party==0.10
file-magic==0.4.0
frozendict==2.4.6
gbinder-python==1.1.2
h11==0.16.0
httpcore==1.0.9
httpx==0.28.1
idna==3.10
ipython==8.37.0
jedi==0.19.2
jmespath==1.0.1
jsonpatch==1.33
jsonpointer==2.4
jsonschema==4.23.0
jsonschema-specifications==2024.10.1
langtable==0.0.69
louis==3.33.0
matplotlib-inline==0.1.7
menuinst==2.3.1
mercurial==7.1.1
mutagen==1.47.0
nftables==0.1
olefile==0.47
packaging==25.0
parso==0.8.5
pexpect==4.9.0
pillow==11.3.0
platformdirs==4.2.2
pluggy==1.6.0
progressbar2==4.5.0
prompt_toolkit==3.0.41
psutil==7.0.0
ptyprocess==0.7.0
pure_eval==0.2.3
PyAudio==0.2.13
pycairo==1.28.0
pyclip==0.7.0
pycosat==0.6.6
pycryptodomex==3.23.0
pycups==2.0.4
pyenchant==3.2.2
pygdbmi==0.11.0.0
Pygments==2.19.1
PyGObject==3.54.5
pyinotify==0.9.6
PySocks==1.7.1
python-dateutil==2.9.0.post0
python-linux-procfs==0.7.3
python-utils==3.9.1
pyudev==0.24.3
pyxdg==0.27
PyYAML==6.0.2
pyynl @ file:///builddir/build/BUILD/kernel-6.17.10-build/kernel-6.17.10/linux-6.17.10-300.fc43.x86_64/tools/net/ynl
RapidFuzz==3.12.2
referencing==0.36.2
regex==2025.10.23
requests==2.32.5
rpds-py==0.27.0
rpm==6.0.0
rpmautospec==0.8.3
rpmautospec-core==0.1.5
ruamel.yaml==0.18.16
ruamel.yaml.clib==0.2.12
s3transfer==0.16.0
selinux @ file:///builddir/build/BUILD/libselinux-3.9-build/libselinux-3.9/src
sentry-sdk==2.35.0
sepolicy @ file:///builddir/build/BUILD/policycoreutils-3.9-build/selinux-3.9/python/sepolicy
setools==4.6.0
setuptools==78.1.1
six==1.17.0
sniffio==1.3.1
sos==4.10.1
stack_data==0.6.3
tqdm==4.67.1
traitlets==5.14.3
typing_extensions==4.15.0
urllib3==2.5.0
wcwidth==0.2.13
websockets==15.0.1
yt-dlp==2025.10.22
zstandard==0.25.0
OS information
$ lsb_release -a
LSB Version: n/a
Distributor ID: Fedora
Description: Fedora Linux 43.20251209.0 (Kinoite)
Release: 43
Codename: n/a
#bug #help_wanted #levelstarter #good_first_issue #wemake_python_styleguide #wps
sent via relator
🚀 New issue to ag2ai/faststream by @esinevgeny
📝 Bug: ValueError while calling redis client.xautoclaim (#2736)
Describe the bug
ValueError while calling xautoclaim
How to reproduce
Use the following route and launch FastStream; if there are messages with PENDING status in the stream, the issue will be observed:
@broker.subscriber(
    stream=StreamSub("test:test", group="workers",
                     consumer="worker",
                     min_idle_time=5000)
)
async def worker(msg: RedisStreamMessage, redis: Redis):
    logger.error(f"Claim {msg.correlation_id}")
    await msg.ack(redis=redis, group="workers")
Actual
Traceback (most recent call last):
File "/venv/lib64/python3.12/site-packages/faststream/redis/subscriber/usecases/basic.py", line 91, in _consume
await self._get_msgs(*args)
File "/venv/lib64/python3.12/site-packages/faststream/redis/subscriber/usecases/stream_subscriber.py", line 341, in _get_msgs
for stream_name, msgs in await read(self.last_id):
^^^^^^^^^^^^^^^^^^^^^^^^
File "/venv/lib64/python3.12/site-packages/faststream/redis/subscriber/usecases/stream_subscriber.py", line 140, in read
(next_id, messages, _) = stream_message
^^^^^^^^^^^^^^^^^^^^^^
ValueError: not enough values to unpack (expected 3, got 2)
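One plausible explanation: XAUTOCLAIM replies with two elements on Redis 6.2 but three on Redis 7.0+ (the extra one lists entry IDs deleted from the pending list), so unconditional three-way unpacking breaks against older servers. A defensive unpack could look like this (a sketch, not the project's actual fix):

```python
def unpack_xautoclaim(reply):
    """Normalize an XAUTOCLAIM reply across Redis server versions.

    Redis 6.2 returns (next_id, messages); Redis 7.0+ appends a third
    element listing the IDs of entries deleted from the pending list.
    """
    if len(reply) == 3:
        next_id, messages, _deleted = reply
    else:
        next_id, messages = reply
    return next_id, messages


print(unpack_xautoclaim(("0-0", [("1-1", {"k": "v"})])))
```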
Environment
Running FastStream 0.6.5 with CPython 3.12.11 on Linux
redis 7.1.0
Also checked on redis 5.3.0
#bug #good_first_issue #faststream #ag2ai
sent via relator
🚀 New issue to wemake-services/django-modern-rest by @sobolevn
📝 Support `pyrefly` (#367)
https://pyrefly.org is already quite stable.
We use it in https://github.com/typeddjango/django-stubs/blob/master/pyrefly.toml with no major complaints.
I really want to add this in CI as soon as possible.
#enhancement #good_first_issue #help_wanted #django_modern_rest
sent via relator
🚀 New issue to wemake-services/django-modern-rest by @sobolevn
📝 Create `json_dumps` internal helper (#383)
We have to dump JSON in some internal places of our code:
django-modern-rest/django_modern_rest/openapi/renderers/base.py
Lines 16 to 31 in 9e6132e
But right now we don't use msgspec there if it is available. But we should :)
So, let's create a new internal/json file and make a helper there. Then use it in base.py.
#enhancement #good_first_issue #help_wanted #django_modern_rest
sent via relator
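A minimal sketch of what such a helper could look like; the exact module layout and fallback behavior are assumptions taken from the issue text:

```python
import json
from typing import Any

try:
    import msgspec.json as _msgspec_json
except ImportError:  # msgspec is an optional dependency
    _msgspec_json = None


def json_dumps(data: Any) -> str:
    """Serialize data to a JSON string, preferring msgspec when installed."""
    if _msgspec_json is not None:
        # msgspec encodes to bytes on its fast path.
        return _msgspec_json.encode(data).decode("utf-8")
    return json.dumps(data)
```

The renderer in base.py would then call json_dumps instead of touching json or msgspec directly.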