initial commit
Binary file not shown.
@@ -0,0 +1 @@
|
||||
pip
|
||||
@@ -0,0 +1,318 @@
|
||||
Metadata-Version: 2.3
|
||||
Name: aiofiles
|
||||
Version: 24.1.0
|
||||
Summary: File support for asyncio.
|
||||
Project-URL: Changelog, https://github.com/Tinche/aiofiles#history
|
||||
Project-URL: Bug Tracker, https://github.com/Tinche/aiofiles/issues
|
||||
Project-URL: repository, https://github.com/Tinche/aiofiles
|
||||
Author-email: Tin Tvrtkovic <tinchester@gmail.com>
|
||||
License: Apache-2.0
|
||||
License-File: LICENSE
|
||||
License-File: NOTICE
|
||||
Classifier: Development Status :: 5 - Production/Stable
|
||||
Classifier: Framework :: AsyncIO
|
||||
Classifier: License :: OSI Approved :: Apache Software License
|
||||
Classifier: Operating System :: OS Independent
|
||||
Classifier: Programming Language :: Python :: 3.8
|
||||
Classifier: Programming Language :: Python :: 3.9
|
||||
Classifier: Programming Language :: Python :: 3.10
|
||||
Classifier: Programming Language :: Python :: 3.11
|
||||
Classifier: Programming Language :: Python :: 3.12
|
||||
Classifier: Programming Language :: Python :: 3.13
|
||||
Classifier: Programming Language :: Python :: Implementation :: CPython
|
||||
Classifier: Programming Language :: Python :: Implementation :: PyPy
|
||||
Requires-Python: >=3.8
|
||||
Description-Content-Type: text/markdown
|
||||
|
||||
# aiofiles: file support for asyncio
|
||||
|
||||
[aiofiles on PyPI](https://pypi.python.org/pypi/aiofiles)
[GitHub Actions](https://github.com/Tinche/aiofiles/actions)
[CI workflow](https://github.com/Tinche/aiofiles/actions/workflows/main.yml)
[GitHub repository](https://github.com/Tinche/aiofiles)
[Code style: black](https://github.com/psf/black)
|
||||
|
||||
**aiofiles** is an Apache2 licensed library, written in Python, for handling local
|
||||
disk files in asyncio applications.
|
||||
|
||||
Ordinary local file IO is blocking, and cannot easily and portably be made
|
||||
asynchronous. This means doing file IO may interfere with asyncio applications,
|
||||
which shouldn't block the executing thread. aiofiles helps with this by
|
||||
introducing asynchronous versions of files that support delegating operations to
|
||||
a separate thread pool.
|
||||
|
||||
```python
|
||||
async with aiofiles.open('filename', mode='r') as f:
    contents = await f.read()
print(contents)
'My file contents'
|
||||
```
|
||||
|
||||
Asynchronous iteration is also supported.
|
||||
|
||||
```python
|
||||
async with aiofiles.open('filename') as f:
    async for line in f:
        ...
|
||||
```
|
||||
|
||||
An asynchronous interface to the `tempfile` module is also provided.
|
||||
|
||||
```python
|
||||
async with aiofiles.tempfile.TemporaryFile('wb') as f:
    await f.write(b'Hello, World!')
|
||||
```
|
||||
|
||||
## Features
|
||||
|
||||
- a file API very similar to Python's standard, blocking API
|
||||
- support for buffered and unbuffered binary files, and buffered text files
|
||||
- support for `async`/`await` ([PEP 492](https://peps.python.org/pep-0492/)) constructs
|
||||
- async interface to tempfile module
|
||||
|
||||
## Installation
|
||||
|
||||
To install aiofiles, simply:
|
||||
|
||||
```bash
|
||||
$ pip install aiofiles
|
||||
```
|
||||
|
||||
## Usage
|
||||
|
||||
Files are opened using the `aiofiles.open()` coroutine, which in addition to
|
||||
mirroring the builtin `open` accepts optional `loop` and `executor`
|
||||
arguments. If `loop` is absent, the default loop will be used, as per the
|
||||
set asyncio policy. If `executor` is not specified, the default event loop
|
||||
executor will be used.
|
||||
|
||||
In case of success, an asynchronous file object is returned with an
|
||||
API identical to an ordinary file, except the following methods are coroutines
|
||||
and delegate to an executor:
|
||||
|
||||
- `close`
|
||||
- `flush`
|
||||
- `isatty`
|
||||
- `read`
|
||||
- `readall`
|
||||
- `read1`
|
||||
- `readinto`
|
||||
- `readline`
|
||||
- `readlines`
|
||||
- `seek`
|
||||
- `seekable`
|
||||
- `tell`
|
||||
- `truncate`
|
||||
- `writable`
|
||||
- `write`
|
||||
- `writelines`
|
||||
|
||||
In case of failure, one of the usual exceptions will be raised.
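
For illustration, here is a minimal sketch (not taken from the upstream docs) of passing a dedicated executor to `aiofiles.open()`; the file name and pool size are made up for the example:

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

import aiofiles


async def main():
    # A dedicated pool keeps file IO from competing with other work
    # on the default executor; passing one is entirely optional.
    pool = ThreadPoolExecutor(max_workers=4)
    async with aiofiles.open("example.txt", mode="w", executor=pool) as f:
        await f.write("hello\n")
    async with aiofiles.open("example.txt", executor=pool) as f:
        print(await f.read())


asyncio.run(main())
```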
|
||||
|
||||
`aiofiles.stdin`, `aiofiles.stdout`, `aiofiles.stderr`,
|
||||
`aiofiles.stdin_bytes`, `aiofiles.stdout_bytes`, and
|
||||
`aiofiles.stderr_bytes` provide async access to `sys.stdin`,
|
||||
`sys.stdout`, `sys.stderr`, and their corresponding `.buffer` properties.
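
As a rough sketch (not from the upstream docs) of how these wrappers are used; each call delegates to the executor like any other async file method:

```python
import asyncio

import aiofiles


async def main():
    # Echo one line from stdin back to stdout; this waits until a line
    # is entered. Bytes go through the *_bytes streams, which wrap the
    # underlying .buffer objects.
    line = await aiofiles.stdin.readline()
    await aiofiles.stdout.write(f"echo: {line}")
    await aiofiles.stderr_bytes.write(b"done\n")


asyncio.run(main())
```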
|
||||
|
||||
The `aiofiles.os` module contains executor-enabled coroutine versions of
|
||||
several useful `os` functions that deal with files (a short usage sketch follows the list):
|
||||
|
||||
- `stat`
|
||||
- `statvfs`
|
||||
- `sendfile`
|
||||
- `rename`
|
||||
- `renames`
|
||||
- `replace`
|
||||
- `remove`
|
||||
- `unlink`
|
||||
- `mkdir`
|
||||
- `makedirs`
|
||||
- `rmdir`
|
||||
- `removedirs`
|
||||
- `link`
|
||||
- `symlink`
|
||||
- `readlink`
|
||||
- `listdir`
|
||||
- `scandir`
|
||||
- `access`
|
||||
- `getcwd`
|
||||
- `path.abspath`
|
||||
- `path.exists`
|
||||
- `path.isfile`
|
||||
- `path.isdir`
|
||||
- `path.islink`
|
||||
- `path.ismount`
|
||||
- `path.getsize`
|
||||
- `path.getatime`
|
||||
- `path.getctime`
|
||||
- `path.samefile`
|
||||
- `path.sameopenfile`
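
A small sketch of the wrapped calls in action (the paths are invented for the example):

```python
import asyncio

import aiofiles.os


async def main():
    # Each call runs the blocking os / os.path function in the executor.
    await aiofiles.os.makedirs("demo/subdir", exist_ok=True)
    if await aiofiles.os.path.exists("demo/subdir"):
        info = await aiofiles.os.stat("demo/subdir")
        print(oct(info.st_mode))
    await aiofiles.os.removedirs("demo/subdir")


asyncio.run(main())
```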
|
||||
|
||||
### Tempfile
|
||||
|
||||
**aiofiles.tempfile** implements the following interfaces:
|
||||
|
||||
- TemporaryFile
|
||||
- NamedTemporaryFile
|
||||
- SpooledTemporaryFile
|
||||
- TemporaryDirectory
|
||||
|
||||
Results are returned wrapped in a context manager, allowing use with `async with` and `async for`.
|
||||
|
||||
```python
|
||||
async with aiofiles.tempfile.NamedTemporaryFile('wb+') as f:
    await f.write(b'Line1\n Line2')
    await f.seek(0)
    async for line in f:
        print(line)

async with aiofiles.tempfile.TemporaryDirectory() as d:
    filename = os.path.join(d, "file.ext")
|
||||
```
|
||||
|
||||
### Writing tests for aiofiles
|
||||
|
||||
Real file IO can be mocked by patching `aiofiles.threadpool.sync_open`
|
||||
as desired. The return type also needs to be registered with the
|
||||
`aiofiles.threadpool.wrap` dispatcher:
|
||||
|
||||
```python
|
||||
aiofiles.threadpool.wrap.register(mock.MagicMock)(
    lambda *args, **kwargs: aiofiles.threadpool.AsyncBufferedIOBase(*args, **kwargs)
)


async def test_stuff():
    write_data = 'data'
    read_file_chunks = [
        b'file chunks 1',
        b'file chunks 2',
        b'file chunks 3',
        b'',
    ]
    file_chunks_iter = iter(read_file_chunks)

    mock_file_stream = mock.MagicMock(
        read=lambda *args, **kwargs: next(file_chunks_iter)
    )

    with mock.patch('aiofiles.threadpool.sync_open', return_value=mock_file_stream) as mock_open:
        async with aiofiles.open('filename', 'w') as f:
            await f.write(write_data)
            assert await f.read() == b'file chunks 1'

        mock_file_stream.write.assert_called_once_with(write_data)
|
||||
```
|
||||
|
||||
### History
|
||||
|
||||
#### 24.1.0 (2024-06-24)
|
||||
|
||||
- Import `os.link` conditionally to fix importing on Android.
|
||||
[#175](https://github.com/Tinche/aiofiles/issues/175)
|
||||
- Remove spurious items from `aiofiles.os.__all__` when running on Windows.
|
||||
- Switch to more modern async idioms: remove `types.coroutine` and make `AiofilesContextManager` an awaitable instead of a coroutine.
|
||||
- Add `aiofiles.os.path.abspath` and `aiofiles.os.getcwd`.
|
||||
[#181](https://github.com/Tinche/aiofiles/issues/181)
|
||||
- _aiofiles_ is now tested on Python 3.13 too.
|
||||
[#184](https://github.com/Tinche/aiofiles/pull/184)
|
||||
- Dropped Python 3.7 support. If you require it, use version 23.2.1.
|
||||
|
||||
#### 23.2.1 (2023-08-09)
|
||||
|
||||
- Import `os.statvfs` conditionally to fix importing on non-UNIX systems.
|
||||
[#171](https://github.com/Tinche/aiofiles/issues/171) [#172](https://github.com/Tinche/aiofiles/pull/172)
|
||||
- aiofiles is now also tested on Windows.
|
||||
|
||||
#### 23.2.0 (2023-08-09)
|
||||
|
||||
- aiofiles is now tested on Python 3.12 too.
|
||||
[#166](https://github.com/Tinche/aiofiles/issues/166) [#168](https://github.com/Tinche/aiofiles/pull/168)
|
||||
- On Python 3.12, `aiofiles.tempfile.NamedTemporaryFile` now accepts a `delete_on_close` argument, just like the stdlib version.
|
||||
- On Python 3.12, `aiofiles.tempfile.NamedTemporaryFile` no longer exposes a `delete` attribute, just like the stdlib version.
|
||||
- Added `aiofiles.os.statvfs` and `aiofiles.os.path.ismount`.
|
||||
[#162](https://github.com/Tinche/aiofiles/pull/162)
|
||||
- Use [PDM](https://pdm.fming.dev/latest/) instead of Poetry.
|
||||
[#169](https://github.com/Tinche/aiofiles/pull/169)
|
||||
|
||||
#### 23.1.0 (2023-02-09)
|
||||
|
||||
- Added `aiofiles.os.access`.
|
||||
[#146](https://github.com/Tinche/aiofiles/pull/146)
|
||||
- Removed `aiofiles.tempfile.temptypes.AsyncSpooledTemporaryFile.softspace`.
|
||||
[#151](https://github.com/Tinche/aiofiles/pull/151)
|
||||
- Added `aiofiles.stdin`, `aiofiles.stdin_bytes`, and other stdio streams.
|
||||
[#154](https://github.com/Tinche/aiofiles/pull/154)
|
||||
- Transition to `asyncio.get_running_loop` (vs `asyncio.get_event_loop`) internally.
|
||||
|
||||
#### 22.1.0 (2022-09-04)
|
||||
|
||||
- Added `aiofiles.os.path.islink`.
|
||||
[#126](https://github.com/Tinche/aiofiles/pull/126)
|
||||
- Added `aiofiles.os.readlink`.
|
||||
[#125](https://github.com/Tinche/aiofiles/pull/125)
|
||||
- Added `aiofiles.os.symlink`.
|
||||
[#124](https://github.com/Tinche/aiofiles/pull/124)
|
||||
- Added `aiofiles.os.unlink`.
|
||||
[#123](https://github.com/Tinche/aiofiles/pull/123)
|
||||
- Added `aiofiles.os.link`.
|
||||
[#121](https://github.com/Tinche/aiofiles/pull/121)
|
||||
- Added `aiofiles.os.renames`.
|
||||
[#120](https://github.com/Tinche/aiofiles/pull/120)
|
||||
- Added `aiofiles.os.{listdir, scandir}`.
|
||||
[#143](https://github.com/Tinche/aiofiles/pull/143)
|
||||
- Switched to CalVer.
|
||||
- Dropped Python 3.6 support. If you require it, use version 0.8.0.
|
||||
- aiofiles is now tested on Python 3.11.
|
||||
|
||||
#### 0.8.0 (2021-11-27)
|
||||
|
||||
- aiofiles is now tested on Python 3.10.
|
||||
- Added `aiofiles.os.replace`.
|
||||
[#107](https://github.com/Tinche/aiofiles/pull/107)
|
||||
- Added `aiofiles.os.{makedirs, removedirs}`.
|
||||
- Added `aiofiles.os.path.{exists, isfile, isdir, getsize, getatime, getctime, samefile, sameopenfile}`.
|
||||
[#63](https://github.com/Tinche/aiofiles/pull/63)
|
||||
- Added `suffix`, `prefix`, `dir` args to `aiofiles.tempfile.TemporaryDirectory`.
|
||||
[#116](https://github.com/Tinche/aiofiles/pull/116)
|
||||
|
||||
#### 0.7.0 (2021-05-17)
|
||||
|
||||
- Added the `aiofiles.tempfile` module for async temporary files.
|
||||
[#56](https://github.com/Tinche/aiofiles/pull/56)
|
||||
- Switched to Poetry and GitHub actions.
|
||||
- Dropped 3.5 support.
|
||||
|
||||
#### 0.6.0 (2020-10-27)
|
||||
|
||||
- `aiofiles` is now tested on ppc64le.
|
||||
- Added `name` and `mode` properties to async file objects.
|
||||
[#82](https://github.com/Tinche/aiofiles/pull/82)
|
||||
- Fixed a DeprecationWarning internally.
|
||||
[#75](https://github.com/Tinche/aiofiles/pull/75)
|
||||
- Python 3.9 support and tests.
|
||||
|
||||
#### 0.5.0 (2020-04-12)
|
||||
|
||||
- Python 3.8 support. Code base modernization (using `async/await` instead of `asyncio.coroutine`/`yield from`).
|
||||
- Added `aiofiles.os.remove`, `aiofiles.os.rename`, `aiofiles.os.mkdir`, `aiofiles.os.rmdir`.
|
||||
[#62](https://github.com/Tinche/aiofiles/pull/62)
|
||||
|
||||
#### 0.4.0 (2018-08-11)
|
||||
|
||||
- Python 3.7 support.
|
||||
- Removed Python 3.3/3.4 support. If you use these versions, stick to aiofiles 0.3.x.
|
||||
|
||||
#### 0.3.2 (2017-09-23)
|
||||
|
||||
- The LICENSE is now included in the sdist.
|
||||
[#31](https://github.com/Tinche/aiofiles/pull/31)
|
||||
|
||||
#### 0.3.1 (2017-03-10)
|
||||
|
||||
- Introduced a changelog.
|
||||
- `aiofiles.os.sendfile` will now work if the standard `os` module contains a `sendfile` function.
|
||||
|
||||
### Contributing
|
||||
|
||||
Contributions are very welcome. Tests can be run with `tox`; please ensure
|
||||
the coverage at least stays the same before you submit a pull request.
|
||||
@@ -0,0 +1,26 @@
|
||||
aiofiles-24.1.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
|
||||
aiofiles-24.1.0.dist-info/METADATA,sha256=CvUJx21XclgI1Lp5Bt_4AyJesRYg0xCSx4exJZVmaSA,10708
|
||||
aiofiles-24.1.0.dist-info/RECORD,,
|
||||
aiofiles-24.1.0.dist-info/WHEEL,sha256=1yFddiXMmvYK7QYTqtRNtX66WJ0Mz8PYEiEUoOUUxRY,87
|
||||
aiofiles-24.1.0.dist-info/licenses/LICENSE,sha256=y16Ofl9KOYjhBjwULGDcLfdWBfTEZRXnduOspt-XbhQ,11325
|
||||
aiofiles-24.1.0.dist-info/licenses/NOTICE,sha256=EExY0dRQvWR0wJ2LZLwBgnM6YKw9jCU-M0zegpRSD_E,55
|
||||
aiofiles/__init__.py,sha256=1iAMJQyJtX3LGIS0AoFTJeO1aJ_RK2jpBSBhg0VoIrE,344
|
||||
aiofiles/__pycache__/__init__.cpython-312.pyc,,
|
||||
aiofiles/__pycache__/base.cpython-312.pyc,,
|
||||
aiofiles/__pycache__/os.cpython-312.pyc,,
|
||||
aiofiles/__pycache__/ospath.cpython-312.pyc,,
|
||||
aiofiles/base.py,sha256=zo0FgkCqZ5aosjvxqIvDf2t-RFg1Lc6X8P6rZ56p6fQ,1784
|
||||
aiofiles/os.py,sha256=0DrsG-eH4h7xRzglv9pIWsQuzqe7ZhVYw5FQS18fIys,1153
|
||||
aiofiles/ospath.py,sha256=WaYelz_k6ykAFRLStr4bqYIfCVQ-5GGzIqIizykbY2Q,794
|
||||
aiofiles/tempfile/__init__.py,sha256=hFSNTOjOUv371Ozdfy6FIxeln46Nm3xOVh4ZR3Q94V0,10244
|
||||
aiofiles/tempfile/__pycache__/__init__.cpython-312.pyc,,
|
||||
aiofiles/tempfile/__pycache__/temptypes.cpython-312.pyc,,
|
||||
aiofiles/tempfile/temptypes.py,sha256=ddEvNjMLVlr7WUILCe6ypTqw77yREeIonTk16Uw_NVs,2093
|
||||
aiofiles/threadpool/__init__.py,sha256=kt0hwwx3bLiYtnA1SORhW8mJ6z4W9Xr7MbY80UIJJrI,3133
|
||||
aiofiles/threadpool/__pycache__/__init__.cpython-312.pyc,,
|
||||
aiofiles/threadpool/__pycache__/binary.cpython-312.pyc,,
|
||||
aiofiles/threadpool/__pycache__/text.cpython-312.pyc,,
|
||||
aiofiles/threadpool/__pycache__/utils.cpython-312.pyc,,
|
||||
aiofiles/threadpool/binary.py,sha256=hp-km9VCRu0MLz_wAEUfbCz7OL7xtn9iGAawabpnp5U,2315
|
||||
aiofiles/threadpool/text.py,sha256=fNmpw2PEkj0BZSldipJXAgZqVGLxALcfOMiuDQ54Eas,1223
|
||||
aiofiles/threadpool/utils.py,sha256=B59dSZwO_WZs2dFFycKeA91iD2Xq2nNw1EFF8YMBI5k,1868
|
||||
@@ -0,0 +1,4 @@
|
||||
Wheel-Version: 1.0
|
||||
Generator: hatchling 1.25.0
|
||||
Root-Is-Purelib: true
|
||||
Tag: py3-none-any
|
||||
@@ -0,0 +1,202 @@
|
||||
Apache License
|
||||
Version 2.0, January 2004
|
||||
http://www.apache.org/licenses/
|
||||
|
||||
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
|
||||
|
||||
1. Definitions.
|
||||
|
||||
"License" shall mean the terms and conditions for use, reproduction,
|
||||
and distribution as defined by Sections 1 through 9 of this document.
|
||||
|
||||
"Licensor" shall mean the copyright owner or entity authorized by
|
||||
the copyright owner that is granting the License.
|
||||
|
||||
"Legal Entity" shall mean the union of the acting entity and all
|
||||
other entities that control, are controlled by, or are under common
|
||||
control with that entity. For the purposes of this definition,
|
||||
"control" means (i) the power, direct or indirect, to cause the
|
||||
direction or management of such entity, whether by contract or
|
||||
otherwise, or (ii) ownership of fifty percent (50%) or more of the
|
||||
outstanding shares, or (iii) beneficial ownership of such entity.
|
||||
|
||||
"You" (or "Your") shall mean an individual or Legal Entity
|
||||
exercising permissions granted by this License.
|
||||
|
||||
"Source" form shall mean the preferred form for making modifications,
|
||||
including but not limited to software source code, documentation
|
||||
source, and configuration files.
|
||||
|
||||
"Object" form shall mean any form resulting from mechanical
|
||||
transformation or translation of a Source form, including but
|
||||
not limited to compiled object code, generated documentation,
|
||||
and conversions to other media types.
|
||||
|
||||
"Work" shall mean the work of authorship, whether in Source or
|
||||
Object form, made available under the License, as indicated by a
|
||||
copyright notice that is included in or attached to the work
|
||||
(an example is provided in the Appendix below).
|
||||
|
||||
"Derivative Works" shall mean any work, whether in Source or Object
|
||||
form, that is based on (or derived from) the Work and for which the
|
||||
editorial revisions, annotations, elaborations, or other modifications
|
||||
represent, as a whole, an original work of authorship. For the purposes
|
||||
of this License, Derivative Works shall not include works that remain
|
||||
separable from, or merely link (or bind by name) to the interfaces of,
|
||||
the Work and Derivative Works thereof.
|
||||
|
||||
"Contribution" shall mean any work of authorship, including
|
||||
the original version of the Work and any modifications or additions
|
||||
to that Work or Derivative Works thereof, that is intentionally
|
||||
submitted to Licensor for inclusion in the Work by the copyright owner
|
||||
or by an individual or Legal Entity authorized to submit on behalf of
|
||||
the copyright owner. For the purposes of this definition, "submitted"
|
||||
means any form of electronic, verbal, or written communication sent
|
||||
to the Licensor or its representatives, including but not limited to
|
||||
communication on electronic mailing lists, source code control systems,
|
||||
and issue tracking systems that are managed by, or on behalf of, the
|
||||
Licensor for the purpose of discussing and improving the Work, but
|
||||
excluding communication that is conspicuously marked or otherwise
|
||||
designated in writing by the copyright owner as "Not a Contribution."
|
||||
|
||||
"Contributor" shall mean Licensor and any individual or Legal Entity
|
||||
on behalf of whom a Contribution has been received by Licensor and
|
||||
subsequently incorporated within the Work.
|
||||
|
||||
2. Grant of Copyright License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
copyright license to reproduce, prepare Derivative Works of,
|
||||
publicly display, publicly perform, sublicense, and distribute the
|
||||
Work and such Derivative Works in Source or Object form.
|
||||
|
||||
3. Grant of Patent License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
(except as stated in this section) patent license to make, have made,
|
||||
use, offer to sell, sell, import, and otherwise transfer the Work,
|
||||
where such license applies only to those patent claims licensable
|
||||
by such Contributor that are necessarily infringed by their
|
||||
Contribution(s) alone or by combination of their Contribution(s)
|
||||
with the Work to which such Contribution(s) was submitted. If You
|
||||
institute patent litigation against any entity (including a
|
||||
cross-claim or counterclaim in a lawsuit) alleging that the Work
|
||||
or a Contribution incorporated within the Work constitutes direct
|
||||
or contributory patent infringement, then any patent licenses
|
||||
granted to You under this License for that Work shall terminate
|
||||
as of the date such litigation is filed.
|
||||
|
||||
4. Redistribution. You may reproduce and distribute copies of the
|
||||
Work or Derivative Works thereof in any medium, with or without
|
||||
modifications, and in Source or Object form, provided that You
|
||||
meet the following conditions:
|
||||
|
||||
(a) You must give any other recipients of the Work or
|
||||
Derivative Works a copy of this License; and
|
||||
|
||||
(b) You must cause any modified files to carry prominent notices
|
||||
stating that You changed the files; and
|
||||
|
||||
(c) You must retain, in the Source form of any Derivative Works
|
||||
that You distribute, all copyright, patent, trademark, and
|
||||
attribution notices from the Source form of the Work,
|
||||
excluding those notices that do not pertain to any part of
|
||||
the Derivative Works; and
|
||||
|
||||
(d) If the Work includes a "NOTICE" text file as part of its
|
||||
distribution, then any Derivative Works that You distribute must
|
||||
include a readable copy of the attribution notices contained
|
||||
within such NOTICE file, excluding those notices that do not
|
||||
pertain to any part of the Derivative Works, in at least one
|
||||
of the following places: within a NOTICE text file distributed
|
||||
as part of the Derivative Works; within the Source form or
|
||||
documentation, if provided along with the Derivative Works; or,
|
||||
within a display generated by the Derivative Works, if and
|
||||
wherever such third-party notices normally appear. The contents
|
||||
of the NOTICE file are for informational purposes only and
|
||||
do not modify the License. You may add Your own attribution
|
||||
notices within Derivative Works that You distribute, alongside
|
||||
or as an addendum to the NOTICE text from the Work, provided
|
||||
that such additional attribution notices cannot be construed
|
||||
as modifying the License.
|
||||
|
||||
You may add Your own copyright statement to Your modifications and
|
||||
may provide additional or different license terms and conditions
|
||||
for use, reproduction, or distribution of Your modifications, or
|
||||
for any such Derivative Works as a whole, provided Your use,
|
||||
reproduction, and distribution of the Work otherwise complies with
|
||||
the conditions stated in this License.
|
||||
|
||||
5. Submission of Contributions. Unless You explicitly state otherwise,
|
||||
any Contribution intentionally submitted for inclusion in the Work
|
||||
by You to the Licensor shall be under the terms and conditions of
|
||||
this License, without any additional terms or conditions.
|
||||
Notwithstanding the above, nothing herein shall supersede or modify
|
||||
the terms of any separate license agreement you may have executed
|
||||
with Licensor regarding such Contributions.
|
||||
|
||||
6. Trademarks. This License does not grant permission to use the trade
|
||||
names, trademarks, service marks, or product names of the Licensor,
|
||||
except as required for reasonable and customary use in describing the
|
||||
origin of the Work and reproducing the content of the NOTICE file.
|
||||
|
||||
7. Disclaimer of Warranty. Unless required by applicable law or
|
||||
agreed to in writing, Licensor provides the Work (and each
|
||||
Contributor provides its Contributions) on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
implied, including, without limitation, any warranties or conditions
|
||||
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
|
||||
PARTICULAR PURPOSE. You are solely responsible for determining the
|
||||
appropriateness of using or redistributing the Work and assume any
|
||||
risks associated with Your exercise of permissions under this License.
|
||||
|
||||
8. Limitation of Liability. In no event and under no legal theory,
|
||||
whether in tort (including negligence), contract, or otherwise,
|
||||
unless required by applicable law (such as deliberate and grossly
|
||||
negligent acts) or agreed to in writing, shall any Contributor be
|
||||
liable to You for damages, including any direct, indirect, special,
|
||||
incidental, or consequential damages of any character arising as a
|
||||
result of this License or out of the use or inability to use the
|
||||
Work (including but not limited to damages for loss of goodwill,
|
||||
work stoppage, computer failure or malfunction, or any and all
|
||||
other commercial damages or losses), even if such Contributor
|
||||
has been advised of the possibility of such damages.
|
||||
|
||||
9. Accepting Warranty or Additional Liability. While redistributing
|
||||
the Work or Derivative Works thereof, You may choose to offer,
|
||||
and charge a fee for, acceptance of support, warranty, indemnity,
|
||||
or other liability obligations and/or rights consistent with this
|
||||
License. However, in accepting such obligations, You may act only
|
||||
on Your own behalf and on Your sole responsibility, not on behalf
|
||||
of any other Contributor, and only if You agree to indemnify,
|
||||
defend, and hold each Contributor harmless for any liability
|
||||
incurred by, or claims asserted against, such Contributor by reason
|
||||
of your accepting any such warranty or additional liability.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
APPENDIX: How to apply the Apache License to your work.
|
||||
|
||||
To apply the Apache License to your work, attach the following
|
||||
boilerplate notice, with the fields enclosed by brackets "{}"
|
||||
replaced with your own identifying information. (Don't include
|
||||
the brackets!) The text should be enclosed in the appropriate
|
||||
comment syntax for the file format. We also recommend that a
|
||||
file or class name and description of purpose be included on the
|
||||
same "printed page" as the copyright notice for easier
|
||||
identification within third-party archives.
|
||||
|
||||
Copyright {yyyy} {name of copyright owner}
|
||||
|
||||
Licensed under the Apache License, Version 2.0 (the "License");
|
||||
you may not use this file except in compliance with the License.
|
||||
You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing, software
|
||||
distributed under the License is distributed on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
See the License for the specific language governing permissions and
|
||||
limitations under the License.
|
||||
|
||||
@@ -0,0 +1,2 @@
|
||||
Asyncio support for files
|
||||
Copyright 2016 Tin Tvrtkovic
|
||||
22
venv/lib/python3.12/site-packages/aiofiles/__init__.py
Normal file
@@ -0,0 +1,22 @@
|
||||
"""Utilities for asyncio-friendly file handling."""
|
||||
from .threadpool import (
|
||||
open,
|
||||
stdin,
|
||||
stdout,
|
||||
stderr,
|
||||
stdin_bytes,
|
||||
stdout_bytes,
|
||||
stderr_bytes,
|
||||
)
|
||||
from . import tempfile
|
||||
|
||||
__all__ = [
|
||||
"open",
|
||||
"tempfile",
|
||||
"stdin",
|
||||
"stdout",
|
||||
"stderr",
|
||||
"stdin_bytes",
|
||||
"stdout_bytes",
|
||||
"stderr_bytes",
|
||||
]
|
||||
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
69
venv/lib/python3.12/site-packages/aiofiles/base.py
Normal file
@@ -0,0 +1,69 @@
|
||||
"""Various base classes."""
|
||||
from collections.abc import Awaitable
|
||||
from contextlib import AbstractAsyncContextManager
|
||||
from asyncio import get_running_loop
|
||||
|
||||
|
||||
class AsyncBase:
|
||||
def __init__(self, file, loop, executor):
|
||||
self._file = file
|
||||
self._executor = executor
|
||||
self._ref_loop = loop
|
||||
|
||||
@property
|
||||
def _loop(self):
|
||||
return self._ref_loop or get_running_loop()
|
||||
|
||||
def __aiter__(self):
|
||||
"""We are our own iterator."""
|
||||
return self
|
||||
|
||||
def __repr__(self):
|
||||
return super().__repr__() + " wrapping " + repr(self._file)
|
||||
|
||||
async def __anext__(self):
|
||||
"""Simulate normal file iteration."""
|
||||
line = await self.readline()
|
||||
if line:
|
||||
return line
|
||||
else:
|
||||
raise StopAsyncIteration
|
||||
|
||||
|
||||
class AsyncIndirectBase(AsyncBase):
|
||||
def __init__(self, name, loop, executor, indirect):
|
||||
self._indirect = indirect
|
||||
self._name = name
|
||||
super().__init__(None, loop, executor)
|
||||
|
||||
@property
|
||||
def _file(self):
|
||||
return self._indirect()
|
||||
|
||||
@_file.setter
|
||||
def _file(self, v):
|
||||
pass # discard writes
|
||||
|
||||
|
||||
class AiofilesContextManager(Awaitable, AbstractAsyncContextManager):
|
||||
"""An adjusted async context manager for aiofiles."""
|
||||
|
||||
__slots__ = ("_coro", "_obj")
|
||||
|
||||
def __init__(self, coro):
|
||||
self._coro = coro
|
||||
self._obj = None
|
||||
|
||||
def __await__(self):
|
||||
if self._obj is None:
|
||||
self._obj = yield from self._coro.__await__()
|
||||
return self._obj
|
||||
|
||||
async def __aenter__(self):
|
||||
return await self
|
||||
|
||||
async def __aexit__(self, exc_type, exc_val, exc_tb):
|
||||
await get_running_loop().run_in_executor(
|
||||
None, self._obj._file.__exit__, exc_type, exc_val, exc_tb
|
||||
)
|
||||
self._obj = None
|
||||
58
venv/lib/python3.12/site-packages/aiofiles/os.py
Normal file
@@ -0,0 +1,58 @@
|
||||
"""Async executor versions of file functions from the os module."""
|
||||
|
||||
import os
|
||||
|
||||
from . import ospath as path
|
||||
from .ospath import wrap
|
||||
|
||||
__all__ = [
|
||||
"path",
|
||||
"stat",
|
||||
"rename",
|
||||
"renames",
|
||||
"replace",
|
||||
"remove",
|
||||
"unlink",
|
||||
"mkdir",
|
||||
"makedirs",
|
||||
"rmdir",
|
||||
"removedirs",
|
||||
"symlink",
|
||||
"readlink",
|
||||
"listdir",
|
||||
"scandir",
|
||||
"access",
|
||||
"wrap",
|
||||
"getcwd",
|
||||
]
|
||||
if hasattr(os, "link"):
|
||||
__all__ += ["link"]
|
||||
if hasattr(os, "sendfile"):
|
||||
__all__ += ["sendfile"]
|
||||
if hasattr(os, "statvfs"):
|
||||
__all__ += ["statvfs"]
|
||||
|
||||
|
||||
stat = wrap(os.stat)
|
||||
rename = wrap(os.rename)
|
||||
renames = wrap(os.renames)
|
||||
replace = wrap(os.replace)
|
||||
remove = wrap(os.remove)
|
||||
unlink = wrap(os.unlink)
|
||||
mkdir = wrap(os.mkdir)
|
||||
makedirs = wrap(os.makedirs)
|
||||
rmdir = wrap(os.rmdir)
|
||||
removedirs = wrap(os.removedirs)
|
||||
symlink = wrap(os.symlink)
|
||||
readlink = wrap(os.readlink)
|
||||
listdir = wrap(os.listdir)
|
||||
scandir = wrap(os.scandir)
|
||||
access = wrap(os.access)
|
||||
getcwd = wrap(os.getcwd)
|
||||
|
||||
if hasattr(os, "link"):
|
||||
link = wrap(os.link)
|
||||
if hasattr(os, "sendfile"):
|
||||
sendfile = wrap(os.sendfile)
|
||||
if hasattr(os, "statvfs"):
|
||||
statvfs = wrap(os.statvfs)
|
||||
30
venv/lib/python3.12/site-packages/aiofiles/ospath.py
Normal file
@@ -0,0 +1,30 @@
|
||||
"""Async executor versions of file functions from the os.path module."""
|
||||
|
||||
import asyncio
|
||||
from functools import partial, wraps
|
||||
from os import path
|
||||
|
||||
|
||||
def wrap(func):
|
||||
@wraps(func)
|
||||
async def run(*args, loop=None, executor=None, **kwargs):
|
||||
if loop is None:
|
||||
loop = asyncio.get_running_loop()
|
||||
pfunc = partial(func, *args, **kwargs)
|
||||
return await loop.run_in_executor(executor, pfunc)
|
||||
|
||||
return run
|
||||
|
||||
|
||||
exists = wrap(path.exists)
|
||||
isfile = wrap(path.isfile)
|
||||
isdir = wrap(path.isdir)
|
||||
islink = wrap(path.islink)
|
||||
ismount = wrap(path.ismount)
|
||||
getsize = wrap(path.getsize)
|
||||
getmtime = wrap(path.getmtime)
|
||||
getatime = wrap(path.getatime)
|
||||
getctime = wrap(path.getctime)
|
||||
samefile = wrap(path.samefile)
|
||||
sameopenfile = wrap(path.sameopenfile)
|
||||
abspath = wrap(path.abspath)
|
||||
357
venv/lib/python3.12/site-packages/aiofiles/tempfile/__init__.py
Normal file
@@ -0,0 +1,357 @@
|
||||
import asyncio
|
||||
from functools import partial, singledispatch
|
||||
from io import BufferedRandom, BufferedReader, BufferedWriter, FileIO, TextIOBase
|
||||
from tempfile import NamedTemporaryFile as syncNamedTemporaryFile
|
||||
from tempfile import SpooledTemporaryFile as syncSpooledTemporaryFile
|
||||
from tempfile import TemporaryDirectory as syncTemporaryDirectory
|
||||
from tempfile import TemporaryFile as syncTemporaryFile
|
||||
from tempfile import _TemporaryFileWrapper as syncTemporaryFileWrapper
|
||||
|
||||
from ..base import AiofilesContextManager
|
||||
from ..threadpool.binary import AsyncBufferedIOBase, AsyncBufferedReader, AsyncFileIO
|
||||
from ..threadpool.text import AsyncTextIOWrapper
|
||||
from .temptypes import AsyncSpooledTemporaryFile, AsyncTemporaryDirectory
|
||||
import sys
|
||||
|
||||
__all__ = [
|
||||
"NamedTemporaryFile",
|
||||
"TemporaryFile",
|
||||
"SpooledTemporaryFile",
|
||||
"TemporaryDirectory",
|
||||
]
|
||||
|
||||
|
||||
# ================================================================
|
||||
# Public methods for async open and return of temp file/directory
|
||||
# objects with async interface
|
||||
# ================================================================
|
||||
if sys.version_info >= (3, 12):
|
||||
|
||||
def NamedTemporaryFile(
|
||||
mode="w+b",
|
||||
buffering=-1,
|
||||
encoding=None,
|
||||
newline=None,
|
||||
suffix=None,
|
||||
prefix=None,
|
||||
dir=None,
|
||||
delete=True,
|
||||
delete_on_close=True,
|
||||
loop=None,
|
||||
executor=None,
|
||||
):
|
||||
"""Async open a named temporary file"""
|
||||
return AiofilesContextManager(
|
||||
_temporary_file(
|
||||
named=True,
|
||||
mode=mode,
|
||||
buffering=buffering,
|
||||
encoding=encoding,
|
||||
newline=newline,
|
||||
suffix=suffix,
|
||||
prefix=prefix,
|
||||
dir=dir,
|
||||
delete=delete,
|
||||
delete_on_close=delete_on_close,
|
||||
loop=loop,
|
||||
executor=executor,
|
||||
)
|
||||
)
|
||||
|
||||
else:
|
||||
|
||||
def NamedTemporaryFile(
|
||||
mode="w+b",
|
||||
buffering=-1,
|
||||
encoding=None,
|
||||
newline=None,
|
||||
suffix=None,
|
||||
prefix=None,
|
||||
dir=None,
|
||||
delete=True,
|
||||
loop=None,
|
||||
executor=None,
|
||||
):
|
||||
"""Async open a named temporary file"""
|
||||
return AiofilesContextManager(
|
||||
_temporary_file(
|
||||
named=True,
|
||||
mode=mode,
|
||||
buffering=buffering,
|
||||
encoding=encoding,
|
||||
newline=newline,
|
||||
suffix=suffix,
|
||||
prefix=prefix,
|
||||
dir=dir,
|
||||
delete=delete,
|
||||
loop=loop,
|
||||
executor=executor,
|
||||
)
|
||||
)
|
||||
|
||||
|
||||
def TemporaryFile(
|
||||
mode="w+b",
|
||||
buffering=-1,
|
||||
encoding=None,
|
||||
newline=None,
|
||||
suffix=None,
|
||||
prefix=None,
|
||||
dir=None,
|
||||
loop=None,
|
||||
executor=None,
|
||||
):
|
||||
"""Async open an unnamed temporary file"""
|
||||
return AiofilesContextManager(
|
||||
_temporary_file(
|
||||
named=False,
|
||||
mode=mode,
|
||||
buffering=buffering,
|
||||
encoding=encoding,
|
||||
newline=newline,
|
||||
suffix=suffix,
|
||||
prefix=prefix,
|
||||
dir=dir,
|
||||
loop=loop,
|
||||
executor=executor,
|
||||
)
|
||||
)
|
||||
|
||||
|
||||
def SpooledTemporaryFile(
|
||||
max_size=0,
|
||||
mode="w+b",
|
||||
buffering=-1,
|
||||
encoding=None,
|
||||
newline=None,
|
||||
suffix=None,
|
||||
prefix=None,
|
||||
dir=None,
|
||||
loop=None,
|
||||
executor=None,
|
||||
):
|
||||
"""Async open a spooled temporary file"""
|
||||
return AiofilesContextManager(
|
||||
_spooled_temporary_file(
|
||||
max_size=max_size,
|
||||
mode=mode,
|
||||
buffering=buffering,
|
||||
encoding=encoding,
|
||||
newline=newline,
|
||||
suffix=suffix,
|
||||
prefix=prefix,
|
||||
dir=dir,
|
||||
loop=loop,
|
||||
executor=executor,
|
||||
)
|
||||
)
|
||||
|
||||
|
||||
def TemporaryDirectory(suffix=None, prefix=None, dir=None, loop=None, executor=None):
|
||||
"""Async open a temporary directory"""
|
||||
return AiofilesContextManagerTempDir(
|
||||
_temporary_directory(
|
||||
suffix=suffix, prefix=prefix, dir=dir, loop=loop, executor=executor
|
||||
)
|
||||
)
|
||||
|
||||
|
||||
# =========================================================
|
||||
# Internal coroutines to open new temp files/directories
|
||||
# =========================================================
|
||||
if sys.version_info >= (3, 12):
|
||||
|
||||
async def _temporary_file(
|
||||
named=True,
|
||||
mode="w+b",
|
||||
buffering=-1,
|
||||
encoding=None,
|
||||
newline=None,
|
||||
suffix=None,
|
||||
prefix=None,
|
||||
dir=None,
|
||||
delete=True,
|
||||
delete_on_close=True,
|
||||
loop=None,
|
||||
executor=None,
|
||||
max_size=0,
|
||||
):
|
||||
"""Async method to open a temporary file with async interface"""
|
||||
if loop is None:
|
||||
loop = asyncio.get_running_loop()
|
||||
|
||||
if named:
|
||||
cb = partial(
|
||||
syncNamedTemporaryFile,
|
||||
mode=mode,
|
||||
buffering=buffering,
|
||||
encoding=encoding,
|
||||
newline=newline,
|
||||
suffix=suffix,
|
||||
prefix=prefix,
|
||||
dir=dir,
|
||||
delete=delete,
|
||||
delete_on_close=delete_on_close,
|
||||
)
|
||||
else:
|
||||
cb = partial(
|
||||
syncTemporaryFile,
|
||||
mode=mode,
|
||||
buffering=buffering,
|
||||
encoding=encoding,
|
||||
newline=newline,
|
||||
suffix=suffix,
|
||||
prefix=prefix,
|
||||
dir=dir,
|
||||
)
|
||||
|
||||
f = await loop.run_in_executor(executor, cb)
|
||||
|
||||
# Wrap based on type of underlying IO object
|
||||
if type(f) is syncTemporaryFileWrapper:
|
||||
# _TemporaryFileWrapper was used (named files)
|
||||
result = wrap(f.file, f, loop=loop, executor=executor)
|
||||
result._closer = f._closer
|
||||
return result
|
||||
else:
|
||||
# IO object was returned directly without wrapper
|
||||
return wrap(f, f, loop=loop, executor=executor)
|
||||
|
||||
else:
|
||||
|
||||
async def _temporary_file(
|
||||
named=True,
|
||||
mode="w+b",
|
||||
buffering=-1,
|
||||
encoding=None,
|
||||
newline=None,
|
||||
suffix=None,
|
||||
prefix=None,
|
||||
dir=None,
|
||||
delete=True,
|
||||
loop=None,
|
||||
executor=None,
|
||||
max_size=0,
|
||||
):
|
||||
"""Async method to open a temporary file with async interface"""
|
||||
if loop is None:
|
||||
loop = asyncio.get_running_loop()
|
||||
|
||||
if named:
|
||||
cb = partial(
|
||||
syncNamedTemporaryFile,
|
||||
mode=mode,
|
||||
buffering=buffering,
|
||||
encoding=encoding,
|
||||
newline=newline,
|
||||
suffix=suffix,
|
||||
prefix=prefix,
|
||||
dir=dir,
|
||||
delete=delete,
|
||||
)
|
||||
else:
|
||||
cb = partial(
|
||||
syncTemporaryFile,
|
||||
mode=mode,
|
||||
buffering=buffering,
|
||||
encoding=encoding,
|
||||
newline=newline,
|
||||
suffix=suffix,
|
||||
prefix=prefix,
|
||||
dir=dir,
|
||||
)
|
||||
|
||||
f = await loop.run_in_executor(executor, cb)
|
||||
|
||||
# Wrap based on type of underlying IO object
|
||||
if type(f) is syncTemporaryFileWrapper:
|
||||
# _TemporaryFileWrapper was used (named files)
|
||||
result = wrap(f.file, f, loop=loop, executor=executor)
|
||||
# add delete property
|
||||
result.delete = f.delete
|
||||
return result
|
||||
else:
|
||||
# IO object was returned directly without wrapper
|
||||
return wrap(f, f, loop=loop, executor=executor)
|
||||
|
||||
|
||||
async def _spooled_temporary_file(
|
||||
max_size=0,
|
||||
mode="w+b",
|
||||
buffering=-1,
|
||||
encoding=None,
|
||||
newline=None,
|
||||
suffix=None,
|
||||
prefix=None,
|
||||
dir=None,
|
||||
loop=None,
|
||||
executor=None,
|
||||
):
|
||||
"""Open a spooled temporary file with async interface"""
|
||||
if loop is None:
|
||||
loop = asyncio.get_running_loop()
|
||||
|
||||
cb = partial(
|
||||
syncSpooledTemporaryFile,
|
||||
max_size=max_size,
|
||||
mode=mode,
|
||||
buffering=buffering,
|
||||
encoding=encoding,
|
||||
newline=newline,
|
||||
suffix=suffix,
|
||||
prefix=prefix,
|
||||
dir=dir,
|
||||
)
|
||||
|
||||
f = await loop.run_in_executor(executor, cb)
|
||||
|
||||
# Single interface provided by SpooledTemporaryFile for all modes
|
||||
return AsyncSpooledTemporaryFile(f, loop=loop, executor=executor)
|
||||
|
||||
|
||||
async def _temporary_directory(
|
||||
suffix=None, prefix=None, dir=None, loop=None, executor=None
|
||||
):
|
||||
"""Async method to open a temporary directory with async interface"""
|
||||
if loop is None:
|
||||
loop = asyncio.get_running_loop()
|
||||
|
||||
cb = partial(syncTemporaryDirectory, suffix, prefix, dir)
|
||||
f = await loop.run_in_executor(executor, cb)
|
||||
|
||||
return AsyncTemporaryDirectory(f, loop=loop, executor=executor)
|
||||
|
||||
|
||||
class AiofilesContextManagerTempDir(AiofilesContextManager):
|
||||
"""With returns the directory location, not the object (matching sync lib)"""
|
||||
|
||||
async def __aenter__(self):
|
||||
self._obj = await self._coro
|
||||
return self._obj.name
|
||||
|
||||
|
||||
@singledispatch
|
||||
def wrap(base_io_obj, file, *, loop=None, executor=None):
|
||||
"""Wrap the object with interface based on type of underlying IO"""
|
||||
raise TypeError("Unsupported IO type: {}".format(base_io_obj))
|
||||
|
||||
|
||||
@wrap.register(TextIOBase)
|
||||
def _(base_io_obj, file, *, loop=None, executor=None):
|
||||
return AsyncTextIOWrapper(file, loop=loop, executor=executor)
|
||||
|
||||
|
||||
@wrap.register(BufferedWriter)
|
||||
def _(base_io_obj, file, *, loop=None, executor=None):
|
||||
return AsyncBufferedIOBase(file, loop=loop, executor=executor)
|
||||
|
||||
|
||||
@wrap.register(BufferedReader)
|
||||
@wrap.register(BufferedRandom)
|
||||
def _(base_io_obj, file, *, loop=None, executor=None):
|
||||
return AsyncBufferedReader(file, loop=loop, executor=executor)
|
||||
|
||||
|
||||
@wrap.register(FileIO)
|
||||
def _(base_io_obj, file, *, loop=None, executor=None):
|
||||
return AsyncFileIO(file, loop=loop, executor=executor)
|
||||
Binary file not shown.
Binary file not shown.
@@ -0,0 +1,69 @@
|
||||
"""Async wrappers for spooled temp files and temp directory objects"""
|
||||
from functools import partial
|
||||
|
||||
from ..base import AsyncBase
|
||||
from ..threadpool.utils import (
|
||||
cond_delegate_to_executor,
|
||||
delegate_to_executor,
|
||||
proxy_property_directly,
|
||||
)
|
||||
|
||||
|
||||
@delegate_to_executor("fileno", "rollover")
|
||||
@cond_delegate_to_executor(
|
||||
"close",
|
||||
"flush",
|
||||
"isatty",
|
||||
"read",
|
||||
"readline",
|
||||
"readlines",
|
||||
"seek",
|
||||
"tell",
|
||||
"truncate",
|
||||
)
|
||||
@proxy_property_directly("closed", "encoding", "mode", "name", "newlines")
|
||||
class AsyncSpooledTemporaryFile(AsyncBase):
|
||||
"""Async wrapper for SpooledTemporaryFile class"""
|
||||
|
||||
async def _check(self):
|
||||
if self._file._rolled:
|
||||
return
|
||||
max_size = self._file._max_size
|
||||
if max_size and self._file.tell() > max_size:
|
||||
await self.rollover()
|
||||
|
||||
async def write(self, s):
|
||||
"""Implementation to anticipate rollover"""
|
||||
if self._file._rolled:
|
||||
cb = partial(self._file.write, s)
|
||||
return await self._loop.run_in_executor(self._executor, cb)
|
||||
else:
|
||||
file = self._file._file # reference underlying base IO object
|
||||
rv = file.write(s)
|
||||
await self._check()
|
||||
return rv
|
||||
|
||||
async def writelines(self, iterable):
|
||||
"""Implementation to anticipate rollover"""
|
||||
if self._file._rolled:
|
||||
cb = partial(self._file.writelines, iterable)
|
||||
return await self._loop.run_in_executor(self._executor, cb)
|
||||
else:
|
||||
file = self._file._file # reference underlying base IO object
|
||||
rv = file.writelines(iterable)
|
||||
await self._check()
|
||||
return rv
|
||||
|
||||
|
||||
@delegate_to_executor("cleanup")
|
||||
@proxy_property_directly("name")
|
||||
class AsyncTemporaryDirectory:
|
||||
"""Async wrapper for TemporaryDirectory class"""
|
||||
|
||||
def __init__(self, file, loop, executor):
|
||||
self._file = file
|
||||
self._loop = loop
|
||||
self._executor = executor
|
||||
|
||||
async def close(self):
|
||||
await self.cleanup()
|
||||
@@ -0,0 +1,139 @@
|
||||
"""Handle files using a thread pool executor."""
|
||||
import asyncio
|
||||
import sys
|
||||
from functools import partial, singledispatch
|
||||
from io import (
|
||||
BufferedIOBase,
|
||||
BufferedRandom,
|
||||
BufferedReader,
|
||||
BufferedWriter,
|
||||
FileIO,
|
||||
TextIOBase,
|
||||
)
|
||||
|
||||
from ..base import AiofilesContextManager
|
||||
from .binary import (
|
||||
AsyncBufferedIOBase,
|
||||
AsyncBufferedReader,
|
||||
AsyncFileIO,
|
||||
AsyncIndirectBufferedIOBase,
|
||||
)
|
||||
from .text import AsyncTextIndirectIOWrapper, AsyncTextIOWrapper
|
||||
|
||||
sync_open = open
|
||||
|
||||
__all__ = (
|
||||
"open",
|
||||
"stdin",
|
||||
"stdout",
|
||||
"stderr",
|
||||
"stdin_bytes",
|
||||
"stdout_bytes",
|
||||
"stderr_bytes",
|
||||
)
|
||||
|
||||
|
||||
def open(
|
||||
file,
|
||||
mode="r",
|
||||
buffering=-1,
|
||||
encoding=None,
|
||||
errors=None,
|
||||
newline=None,
|
||||
closefd=True,
|
||||
opener=None,
|
||||
*,
|
||||
loop=None,
|
||||
executor=None,
|
||||
):
|
||||
return AiofilesContextManager(
|
||||
_open(
|
||||
file,
|
||||
mode=mode,
|
||||
buffering=buffering,
|
||||
encoding=encoding,
|
||||
errors=errors,
|
||||
newline=newline,
|
||||
closefd=closefd,
|
||||
opener=opener,
|
||||
loop=loop,
|
||||
executor=executor,
|
||||
)
|
||||
)
|
||||
|
||||
|
||||
async def _open(
|
||||
file,
|
||||
mode="r",
|
||||
buffering=-1,
|
||||
encoding=None,
|
||||
errors=None,
|
||||
newline=None,
|
||||
closefd=True,
|
||||
opener=None,
|
||||
*,
|
||||
loop=None,
|
||||
executor=None,
|
||||
):
|
||||
"""Open an asyncio file."""
|
||||
if loop is None:
|
||||
loop = asyncio.get_running_loop()
|
||||
cb = partial(
|
||||
sync_open,
|
||||
file,
|
||||
mode=mode,
|
||||
buffering=buffering,
|
||||
encoding=encoding,
|
||||
errors=errors,
|
||||
newline=newline,
|
||||
closefd=closefd,
|
||||
opener=opener,
|
||||
)
|
||||
f = await loop.run_in_executor(executor, cb)
|
||||
|
||||
return wrap(f, loop=loop, executor=executor)
|
||||
|
||||
|
||||
@singledispatch
|
||||
def wrap(file, *, loop=None, executor=None):
|
||||
raise TypeError("Unsupported io type: {}.".format(file))
|
||||
|
||||
|
||||
@wrap.register(TextIOBase)
|
||||
def _(file, *, loop=None, executor=None):
|
||||
return AsyncTextIOWrapper(file, loop=loop, executor=executor)
|
||||
|
||||
|
||||
@wrap.register(BufferedWriter)
|
||||
@wrap.register(BufferedIOBase)
|
||||
def _(file, *, loop=None, executor=None):
|
||||
return AsyncBufferedIOBase(file, loop=loop, executor=executor)
|
||||
|
||||
|
||||
@wrap.register(BufferedReader)
|
||||
@wrap.register(BufferedRandom)
|
||||
def _(file, *, loop=None, executor=None):
|
||||
return AsyncBufferedReader(file, loop=loop, executor=executor)
|
||||
|
||||
|
||||
@wrap.register(FileIO)
|
||||
def _(file, *, loop=None, executor=None):
|
||||
return AsyncFileIO(file, loop=loop, executor=executor)
|
||||
|
||||
|
||||
stdin = AsyncTextIndirectIOWrapper("sys.stdin", None, None, indirect=lambda: sys.stdin)
|
||||
stdout = AsyncTextIndirectIOWrapper(
|
||||
"sys.stdout", None, None, indirect=lambda: sys.stdout
|
||||
)
|
||||
stderr = AsyncTextIndirectIOWrapper(
|
||||
"sys.stderr", None, None, indirect=lambda: sys.stderr
|
||||
)
|
||||
stdin_bytes = AsyncIndirectBufferedIOBase(
|
||||
"sys.stdin.buffer", None, None, indirect=lambda: sys.stdin.buffer
|
||||
)
|
||||
stdout_bytes = AsyncIndirectBufferedIOBase(
|
||||
"sys.stdout.buffer", None, None, indirect=lambda: sys.stdout.buffer
|
||||
)
|
||||
stderr_bytes = AsyncIndirectBufferedIOBase(
|
||||
"sys.stderr.buffer", None, None, indirect=lambda: sys.stderr.buffer
|
||||
)
|
||||
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
104
venv/lib/python3.12/site-packages/aiofiles/threadpool/binary.py
Normal file
@@ -0,0 +1,104 @@
|
||||
from ..base import AsyncBase, AsyncIndirectBase
|
||||
from .utils import delegate_to_executor, proxy_method_directly, proxy_property_directly
|
||||
|
||||
|
||||
@delegate_to_executor(
|
||||
"close",
|
||||
"flush",
|
||||
"isatty",
|
||||
"read",
|
||||
"read1",
|
||||
"readinto",
|
||||
"readline",
|
||||
"readlines",
|
||||
"seek",
|
||||
"seekable",
|
||||
"tell",
|
||||
"truncate",
|
||||
"writable",
|
||||
"write",
|
||||
"writelines",
|
||||
)
|
||||
@proxy_method_directly("detach", "fileno", "readable")
|
||||
@proxy_property_directly("closed", "raw", "name", "mode")
|
||||
class AsyncBufferedIOBase(AsyncBase):
|
||||
"""The asyncio executor version of io.BufferedWriter and BufferedIOBase."""
|
||||
|
||||
|
||||
@delegate_to_executor("peek")
|
||||
class AsyncBufferedReader(AsyncBufferedIOBase):
|
||||
"""The asyncio executor version of io.BufferedReader and Random."""
|
||||
|
||||
|
||||
@delegate_to_executor(
|
||||
"close",
|
||||
"flush",
|
||||
"isatty",
|
||||
"read",
|
||||
"readall",
|
||||
"readinto",
|
||||
"readline",
|
||||
"readlines",
|
||||
"seek",
|
||||
"seekable",
|
||||
"tell",
|
||||
"truncate",
|
||||
"writable",
|
||||
"write",
|
||||
"writelines",
|
||||
)
|
||||
@proxy_method_directly("fileno", "readable")
|
||||
@proxy_property_directly("closed", "name", "mode")
|
||||
class AsyncFileIO(AsyncBase):
|
||||
"""The asyncio executor version of io.FileIO."""
|
||||
|
||||
|
||||
@delegate_to_executor(
|
||||
"close",
|
||||
"flush",
|
||||
"isatty",
|
||||
"read",
|
||||
"read1",
|
||||
"readinto",
|
||||
"readline",
|
||||
"readlines",
|
||||
"seek",
|
||||
"seekable",
|
||||
"tell",
|
||||
"truncate",
|
||||
"writable",
|
||||
"write",
|
||||
"writelines",
|
||||
)
|
||||
@proxy_method_directly("detach", "fileno", "readable")
|
||||
@proxy_property_directly("closed", "raw", "name", "mode")
|
||||
class AsyncIndirectBufferedIOBase(AsyncIndirectBase):
|
||||
"""The indirect asyncio executor version of io.BufferedWriter and BufferedIOBase."""
|
||||
|
||||
|
||||
@delegate_to_executor("peek")
|
||||
class AsyncIndirectBufferedReader(AsyncIndirectBufferedIOBase):
|
||||
"""The indirect asyncio executor version of io.BufferedReader and Random."""
|
||||
|
||||
|
||||
@delegate_to_executor(
|
||||
"close",
|
||||
"flush",
|
||||
"isatty",
|
||||
"read",
|
||||
"readall",
|
||||
"readinto",
|
||||
"readline",
|
||||
"readlines",
|
||||
"seek",
|
||||
"seekable",
|
||||
"tell",
|
||||
"truncate",
|
||||
"writable",
|
||||
"write",
|
||||
"writelines",
|
||||
)
|
||||
@proxy_method_directly("fileno", "readable")
|
||||
@proxy_property_directly("closed", "name", "mode")
|
||||
class AsyncIndirectFileIO(AsyncIndirectBase):
|
||||
"""The indirect asyncio executor version of io.FileIO."""
|
||||
@@ -0,0 +1,64 @@
|
||||
from ..base import AsyncBase, AsyncIndirectBase
|
||||
from .utils import delegate_to_executor, proxy_method_directly, proxy_property_directly
|
||||
|
||||
|
||||
@delegate_to_executor(
|
||||
"close",
|
||||
"flush",
|
||||
"isatty",
|
||||
"read",
|
||||
"readable",
|
||||
"readline",
|
||||
"readlines",
|
||||
"seek",
|
||||
"seekable",
|
||||
"tell",
|
||||
"truncate",
|
||||
"write",
|
||||
"writable",
|
||||
"writelines",
|
||||
)
|
||||
@proxy_method_directly("detach", "fileno", "readable")
|
||||
@proxy_property_directly(
|
||||
"buffer",
|
||||
"closed",
|
||||
"encoding",
|
||||
"errors",
|
||||
"line_buffering",
|
||||
"newlines",
|
||||
"name",
|
||||
"mode",
|
||||
)
|
||||
class AsyncTextIOWrapper(AsyncBase):
|
||||
"""The asyncio executor version of io.TextIOWrapper."""
|
||||
|
||||
|
||||
@delegate_to_executor(
|
||||
"close",
|
||||
"flush",
|
||||
"isatty",
|
||||
"read",
|
||||
"readable",
|
||||
"readline",
|
||||
"readlines",
|
||||
"seek",
|
||||
"seekable",
|
||||
"tell",
|
||||
"truncate",
|
||||
"write",
|
||||
"writable",
|
||||
"writelines",
|
||||
)
|
||||
@proxy_method_directly("detach", "fileno", "readable")
|
||||
@proxy_property_directly(
|
||||
"buffer",
|
||||
"closed",
|
||||
"encoding",
|
||||
"errors",
|
||||
"line_buffering",
|
||||
"newlines",
|
||||
"name",
|
||||
"mode",
|
||||
)
|
||||
class AsyncTextIndirectIOWrapper(AsyncIndirectBase):
|
||||
"""The indirect asyncio executor version of io.TextIOWrapper."""
|
||||
@@ -0,0 +1,72 @@
|
||||
import functools
|
||||
|
||||
|
||||
def delegate_to_executor(*attrs):
|
||||
def cls_builder(cls):
|
||||
for attr_name in attrs:
|
||||
setattr(cls, attr_name, _make_delegate_method(attr_name))
|
||||
return cls
|
||||
|
||||
return cls_builder
|
||||
|
||||
|
||||
def proxy_method_directly(*attrs):
|
||||
def cls_builder(cls):
|
||||
for attr_name in attrs:
|
||||
setattr(cls, attr_name, _make_proxy_method(attr_name))
|
||||
return cls
|
||||
|
||||
return cls_builder
|
||||
|
||||
|
||||
def proxy_property_directly(*attrs):
|
||||
def cls_builder(cls):
|
||||
for attr_name in attrs:
|
||||
setattr(cls, attr_name, _make_proxy_property(attr_name))
|
||||
return cls
|
||||
|
||||
return cls_builder
|
||||
|
||||
|
||||
def cond_delegate_to_executor(*attrs):
|
||||
def cls_builder(cls):
|
||||
for attr_name in attrs:
|
||||
setattr(cls, attr_name, _make_cond_delegate_method(attr_name))
|
||||
return cls
|
||||
|
||||
return cls_builder
|
||||
|
||||
|
||||
def _make_delegate_method(attr_name):
|
||||
async def method(self, *args, **kwargs):
|
||||
cb = functools.partial(getattr(self._file, attr_name), *args, **kwargs)
|
||||
return await self._loop.run_in_executor(self._executor, cb)
|
||||
|
||||
return method
|
||||
|
||||
|
||||
def _make_proxy_method(attr_name):
|
||||
def method(self, *args, **kwargs):
|
||||
return getattr(self._file, attr_name)(*args, **kwargs)
|
||||
|
||||
return method
|
||||
|
||||
|
||||
def _make_proxy_property(attr_name):
|
||||
def proxy_property(self):
|
||||
return getattr(self._file, attr_name)
|
||||
|
||||
return property(proxy_property)
|
||||
|
||||
|
||||
def _make_cond_delegate_method(attr_name):
|
||||
"""For spooled temp files, delegate only if rolled to file object"""
|
||||
|
||||
async def method(self, *args, **kwargs):
|
||||
if self._file._rolled:
|
||||
cb = functools.partial(getattr(self._file, attr_name), *args, **kwargs)
|
||||
return await self._loop.run_in_executor(self._executor, cb)
|
||||
else:
|
||||
return getattr(self._file, attr_name)(*args, **kwargs)
|
||||
|
||||
return method
|
||||
@@ -0,0 +1 @@
|
||||
pip
|
||||
@@ -0,0 +1,167 @@
|
||||
Metadata-Version: 2.4
|
||||
Name: aiogram
|
||||
Version: 3.22.0
|
||||
Summary: Modern and fully asynchronous framework for Telegram Bot API
|
||||
Project-URL: Homepage, https://aiogram.dev/
|
||||
Project-URL: Documentation, https://docs.aiogram.dev/
|
||||
Project-URL: Repository, https://github.com/aiogram/aiogram/
|
||||
Author-email: Alex Root Junior <jroot.junior@gmail.com>
|
||||
Maintainer-email: Alex Root Junior <jroot.junior@gmail.com>
|
||||
License-Expression: MIT
|
||||
License-File: LICENSE
|
||||
Keywords: api,asyncio,bot,framework,telegram,wrapper
|
||||
Classifier: Development Status :: 5 - Production/Stable
|
||||
Classifier: Environment :: Console
|
||||
Classifier: Framework :: AsyncIO
|
||||
Classifier: Intended Audience :: Developers
|
||||
Classifier: Intended Audience :: System Administrators
|
||||
Classifier: License :: OSI Approved :: MIT License
|
||||
Classifier: Programming Language :: Python :: 3.9
|
||||
Classifier: Programming Language :: Python :: 3.10
|
||||
Classifier: Programming Language :: Python :: 3.11
|
||||
Classifier: Programming Language :: Python :: 3.12
|
||||
Classifier: Programming Language :: Python :: 3.13
|
||||
Classifier: Programming Language :: Python :: Implementation :: PyPy
|
||||
Classifier: Topic :: Communications :: Chat
|
||||
Classifier: Topic :: Software Development :: Libraries :: Application Frameworks
|
||||
Classifier: Topic :: Software Development :: Libraries :: Python Modules
|
||||
Classifier: Typing :: Typed
|
||||
Requires-Python: >=3.9
|
||||
Requires-Dist: aiofiles<24.2,>=23.2.1
|
||||
Requires-Dist: aiohttp<3.13,>=3.9.0
|
||||
Requires-Dist: certifi>=2023.7.22
|
||||
Requires-Dist: magic-filter<1.1,>=1.0.12
|
||||
Requires-Dist: pydantic<2.12,>=2.4.1
|
||||
Requires-Dist: typing-extensions<=5.0,>=4.7.0
|
||||
Provides-Extra: cli
|
||||
Requires-Dist: aiogram-cli<2.0.0,>=1.1.0; extra == 'cli'
|
||||
Provides-Extra: dev
|
||||
Requires-Dist: black~=24.4.2; extra == 'dev'
|
||||
Requires-Dist: isort~=5.13.2; extra == 'dev'
|
||||
Requires-Dist: motor-types~=1.0.0b4; extra == 'dev'
|
||||
Requires-Dist: mypy~=1.10.0; extra == 'dev'
|
||||
Requires-Dist: packaging~=24.1; extra == 'dev'
|
||||
Requires-Dist: pre-commit~=3.5; extra == 'dev'
|
||||
Requires-Dist: ruff~=0.5.1; extra == 'dev'
|
||||
Requires-Dist: toml~=0.10.2; extra == 'dev'
|
||||
Provides-Extra: docs
|
||||
Requires-Dist: furo~=2024.8.6; extra == 'docs'
|
||||
Requires-Dist: markdown-include~=0.8.1; extra == 'docs'
|
||||
Requires-Dist: pygments~=2.18.0; extra == 'docs'
|
||||
Requires-Dist: pymdown-extensions~=10.3; extra == 'docs'
|
||||
Requires-Dist: sphinx-autobuild~=2024.9.3; extra == 'docs'
|
||||
Requires-Dist: sphinx-copybutton~=0.5.2; extra == 'docs'
|
||||
Requires-Dist: sphinx-intl~=2.2.0; extra == 'docs'
|
||||
Requires-Dist: sphinx-substitution-extensions~=2024.8.6; extra == 'docs'
|
||||
Requires-Dist: sphinxcontrib-towncrier~=0.4.0a0; extra == 'docs'
|
||||
Requires-Dist: sphinx~=8.0.2; extra == 'docs'
|
||||
Requires-Dist: towncrier~=24.8.0; extra == 'docs'
|
||||
Provides-Extra: fast
|
||||
Requires-Dist: aiodns>=3.0.0; extra == 'fast'
|
||||
Requires-Dist: uvloop>=0.17.0; ((sys_platform == 'darwin' or sys_platform == 'linux') and platform_python_implementation != 'PyPy' and python_version < '3.13') and extra == 'fast'
|
||||
Requires-Dist: uvloop>=0.21.0; ((sys_platform == 'darwin' or sys_platform == 'linux') and platform_python_implementation != 'PyPy' and python_version >= '3.13') and extra == 'fast'
|
||||
Provides-Extra: i18n
|
||||
Requires-Dist: babel<3,>=2.13.0; extra == 'i18n'
|
||||
Provides-Extra: mongo
|
||||
Requires-Dist: motor<3.7.0,>=3.3.2; extra == 'mongo'
|
||||
Requires-Dist: pymongo<4.11,>4.5; extra == 'mongo'
|
||||
Provides-Extra: proxy
|
||||
Requires-Dist: aiohttp-socks~=0.8.3; extra == 'proxy'
|
||||
Provides-Extra: redis
|
||||
Requires-Dist: redis[hiredis]<5.3.0,>=5.0.1; extra == 'redis'
|
||||
Provides-Extra: signature
|
||||
Requires-Dist: cryptography>=43.0.0; extra == 'signature'
|
||||
Provides-Extra: test
|
||||
Requires-Dist: aresponses~=2.1.6; extra == 'test'
|
||||
Requires-Dist: pycryptodomex~=3.19.0; extra == 'test'
|
||||
Requires-Dist: pytest-aiohttp~=1.0.5; extra == 'test'
|
||||
Requires-Dist: pytest-asyncio~=0.21.1; extra == 'test'
|
||||
Requires-Dist: pytest-cov~=4.1.0; extra == 'test'
|
||||
Requires-Dist: pytest-html~=4.0.2; extra == 'test'
|
||||
Requires-Dist: pytest-lazy-fixture~=0.6.3; extra == 'test'
|
||||
Requires-Dist: pytest-mock~=3.12.0; extra == 'test'
|
||||
Requires-Dist: pytest-mypy~=0.10.3; extra == 'test'
|
||||
Requires-Dist: pytest~=7.4.2; extra == 'test'
|
||||
Requires-Dist: pytz~=2023.3; extra == 'test'
|
||||
Description-Content-Type: text/x-rst
|
||||
|
||||
#######
aiogram
#######

.. image:: https://img.shields.io/pypi/l/aiogram.svg?style=flat-square
    :target: https://opensource.org/licenses/MIT
    :alt: MIT License

.. image:: https://img.shields.io/pypi/status/aiogram.svg?style=flat-square
    :target: https://pypi.python.org/pypi/aiogram
    :alt: PyPi status

.. image:: https://img.shields.io/pypi/v/aiogram.svg?style=flat-square
    :target: https://pypi.python.org/pypi/aiogram
    :alt: PyPi Package Version

.. image:: https://img.shields.io/pypi/dm/aiogram.svg?style=flat-square
    :target: https://pypi.python.org/pypi/aiogram
    :alt: Downloads

.. image:: https://img.shields.io/pypi/pyversions/aiogram.svg?style=flat-square
    :target: https://pypi.python.org/pypi/aiogram
    :alt: Supported python versions

.. image:: https://img.shields.io/badge/dynamic/json?color=blue&logo=telegram&label=Telegram%20Bot%20API&query=%24.api.version&url=https%3A%2F%2Fraw.githubusercontent.com%2Faiogram%2Faiogram%2Fdev-3.x%2F.butcher%2Fschema%2Fschema.json&style=flat-square
    :target: https://core.telegram.org/bots/api
    :alt: Telegram Bot API

.. image:: https://img.shields.io/github/actions/workflow/status/aiogram/aiogram/tests.yml?branch=dev-3.x&style=flat-square
    :target: https://github.com/aiogram/aiogram/actions
    :alt: Tests

.. image:: https://img.shields.io/codecov/c/github/aiogram/aiogram?style=flat-square
    :target: https://app.codecov.io/gh/aiogram/aiogram
    :alt: Codecov
**aiogram** is a modern and fully asynchronous framework for the
`Telegram Bot API <https://core.telegram.org/bots/api>`_ written in Python 3.9+ using
`asyncio <https://docs.python.org/3/library/asyncio.html>`_ and
`aiohttp <https://github.com/aio-libs/aiohttp>`_.

Make your bots faster and more powerful!

Documentation:

- 🇺🇸 `English <https://docs.aiogram.dev/en/dev-3.x/>`_
- 🇺🇦 `Ukrainian <https://docs.aiogram.dev/uk_UA/dev-3.x/>`_
Features
========

- Asynchronous (`asyncio docs <https://docs.python.org/3/library/asyncio.html>`_, :pep:`492`)
- Has type hints (:pep:`484`) and can be used with `mypy <http://mypy-lang.org/>`_
- Supports `PyPy <https://www.pypy.org/>`_
- Supports `Telegram Bot API 9.2 <https://core.telegram.org/bots/api>`_ and is quickly updated to the latest versions of the Bot API
- Telegram Bot API integration code is `autogenerated <https://github.com/aiogram/tg-codegen>`_ and can easily be re-generated when the API is updated
- Updates router (Blueprints)
- Has a Finite State Machine
- Uses powerful `magic filters <https://docs.aiogram.dev/en/latest/dispatcher/filters/magic_filters.html#magic-filters>`_
- Middlewares (incoming updates and API calls)
- Provides `Replies into Webhook <https://core.telegram.org/bots/faq#how-can-i-make-requests-in-response-to-updates>`_
- Integrated I18n/L10n support with GNU Gettext (or Fluent)
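To make the list above concrete, here is a minimal long-polling sketch (the bot token is a placeholder; error handling and configuration are intentionally omitted):

.. code-block:: python

    import asyncio

    from aiogram import Bot, Dispatcher, F
    from aiogram.filters import CommandStart
    from aiogram.types import Message

    dp = Dispatcher()


    @dp.message(CommandStart())
    async def on_start(message: Message) -> None:
        await message.answer("Hello!")


    @dp.message(F.text)
    async def echo(message: Message) -> None:
        # Echo any plain-text message back to the user
        await message.answer(message.text)


    async def main() -> None:
        bot = Bot(token="123456:TOKEN")  # placeholder token
        await dp.start_polling(bot)


    if __name__ == "__main__":
        asyncio.run(main())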
.. warning::

    It is strongly advised that you have prior experience working
    with `asyncio <https://docs.python.org/3/library/asyncio.html>`_
    before beginning to use **aiogram**.

If you have any questions, you can visit our community chats on Telegram:

- 🇺🇸 `@aiogram <https://t.me/aiogram>`_
- 🇺🇦 `@aiogramua <https://t.me/aiogramua>`_
- 🇺🇿 `@aiogram_uz <https://t.me/aiogram_uz>`_
- 🇰🇿 `@aiogram_kz <https://t.me/aiogram_kz>`_
- 🇷🇺 `@aiogram_ru <https://t.me/aiogram_ru>`_
- 🇮🇷 `@aiogram_fa <https://t.me/aiogram_fa>`_
- 🇮🇹 `@aiogram_it <https://t.me/aiogram_it>`_
- 🇧🇷 `@aiogram_br <https://t.me/aiogram_br>`_
venv/lib/python3.12/site-packages/aiogram-3.22.0.dist-info/RECORD (1213 lines; diff suppressed because it is too large)
@@ -0,0 +1,4 @@
|
||||
Wheel-Version: 1.0
|
||||
Generator: hatchling 1.27.0
|
||||
Root-Is-Purelib: true
|
||||
Tag: py3-none-any
|
||||
@@ -0,0 +1,18 @@
|
||||
Copyright (c) 2017 - present Alex Root Junior
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy of this
|
||||
software and associated documentation files (the "Software"), to deal in the Software
|
||||
without restriction, including without limitation the rights to use, copy, modify,
|
||||
merge, publish, distribute, sublicense, and/or sell copies of the Software,
|
||||
and to permit persons to whom the Software is furnished to do so, subject to the
|
||||
following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all copies
|
||||
or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,
|
||||
INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR
|
||||
PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS
|
||||
BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
|
||||
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE
|
||||
OR OTHER DEALINGS IN THE SOFTWARE.
|
||||
venv/lib/python3.12/site-packages/aiogram/__init__.py (41 lines)
@@ -0,0 +1,41 @@
|
||||
import asyncio as _asyncio
from contextlib import suppress

from aiogram.dispatcher.flags import FlagGenerator

from . import enums, methods, types
from .__meta__ import __api_version__, __version__
from .client import session
from .client.bot import Bot
from .dispatcher.dispatcher import Dispatcher
from .dispatcher.middlewares.base import BaseMiddleware
from .dispatcher.router import Router
from .utils.magic_filter import MagicFilter
from .utils.text_decorations import html_decoration as html
from .utils.text_decorations import markdown_decoration as md

with suppress(ImportError):
    import uvloop as _uvloop

    _asyncio.set_event_loop_policy(_uvloop.EventLoopPolicy())


F = MagicFilter()
flags = FlagGenerator()

__all__ = (
    "__api_version__",
    "__version__",
    "types",
    "methods",
    "enums",
    "Bot",
    "session",
    "Dispatcher",
    "Router",
    "BaseMiddleware",
    "F",
    "html",
    "md",
    "flags",
)
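The module-level `F` object is the shared MagicFilter instance used in handler filters, and `flags` generates handler flags. A minimal illustrative sketch of how `F` is typically used (the router and handler names below are examples, not part of this file):

```python
from aiogram import F, Router
from aiogram.types import Message

router = Router()


@router.message(F.text == "ping")
async def pong(message: Message) -> None:
    # Matches only messages whose text is exactly "ping"
    await message.answer("pong")


@router.message(F.photo)
async def got_photo(message: Message) -> None:
    # Matches any message carrying a photo
    await message.answer("Nice photo!")
```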
venv/lib/python3.12/site-packages/aiogram/__meta__.py (2 lines)
@@ -0,0 +1,2 @@
__version__ = "3.22.0"
__api_version__ = "9.2"
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
venv/lib/python3.12/site-packages/aiogram/client/bot.py (5685 lines; diff suppressed because it is too large)
@@ -0,0 +1,33 @@
|
||||
from typing import TYPE_CHECKING, Any, Optional
|
||||
|
||||
from pydantic import BaseModel, PrivateAttr
|
||||
from typing_extensions import Self
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from aiogram.client.bot import Bot
|
||||
|
||||
|
||||
class BotContextController(BaseModel):
|
||||
_bot: Optional["Bot"] = PrivateAttr()
|
||||
|
||||
def model_post_init(self, __context: Any) -> None:
|
||||
self._bot = __context.get("bot") if __context else None
|
||||
|
||||
def as_(self, bot: Optional["Bot"]) -> Self:
|
||||
"""
|
||||
Bind object to a bot instance.
|
||||
|
||||
:param bot: Bot instance
|
||||
:return: self
|
||||
"""
|
||||
self._bot = bot
|
||||
return self
|
||||
|
||||
@property
|
||||
def bot(self) -> Optional["Bot"]:
|
||||
"""
|
||||
Get bot instance.
|
||||
|
||||
:return: Bot instance
|
||||
"""
|
||||
return self._bot
|
||||
venv/lib/python3.12/site-packages/aiogram/client/default.py (80 lines)
@@ -0,0 +1,80 @@
|
||||
from __future__ import annotations
|
||||
|
||||
from dataclasses import dataclass
|
||||
from typing import TYPE_CHECKING, Any, Optional
|
||||
|
||||
from aiogram.utils.dataclass import dataclass_kwargs
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from aiogram.types import LinkPreviewOptions
|
||||
|
||||
|
||||
# @dataclass ??
|
||||
class Default:
|
||||
# Is not a dataclass because of JSON serialization.
|
||||
|
||||
__slots__ = ("_name",)
|
||||
|
||||
def __init__(self, name: str) -> None:
|
||||
self._name = name
|
||||
|
||||
@property
|
||||
def name(self) -> str:
|
||||
return self._name
|
||||
|
||||
def __str__(self) -> str:
|
||||
return f"Default({self._name!r})"
|
||||
|
||||
def __repr__(self) -> str:
|
||||
return f"<{self}>"
|
||||
|
||||
|
||||
@dataclass(**dataclass_kwargs(slots=True, kw_only=True))
|
||||
class DefaultBotProperties:
|
||||
"""
|
||||
Default bot properties.
|
||||
"""
|
||||
|
||||
parse_mode: Optional[str] = None
|
||||
"""Default parse mode for messages."""
|
||||
disable_notification: Optional[bool] = None
|
||||
"""Sends the message silently. Users will receive a notification with no sound."""
|
||||
protect_content: Optional[bool] = None
|
||||
"""Protects content from copying."""
|
||||
allow_sending_without_reply: Optional[bool] = None
|
||||
"""Allows to send messages without reply."""
|
||||
link_preview: Optional[LinkPreviewOptions] = None
|
||||
"""Link preview settings."""
|
||||
link_preview_is_disabled: Optional[bool] = None
|
||||
"""Disables link preview."""
|
||||
link_preview_prefer_small_media: Optional[bool] = None
|
||||
"""Prefer small media in link preview."""
|
||||
link_preview_prefer_large_media: Optional[bool] = None
|
||||
"""Prefer large media in link preview."""
|
||||
link_preview_show_above_text: Optional[bool] = None
|
||||
"""Show link preview above text."""
|
||||
show_caption_above_media: Optional[bool] = None
|
||||
"""Show caption above media."""
|
||||
|
||||
def __post_init__(self) -> None:
|
||||
has_any_link_preview_option = any(
|
||||
(
|
||||
self.link_preview_is_disabled,
|
||||
self.link_preview_prefer_small_media,
|
||||
self.link_preview_prefer_large_media,
|
||||
self.link_preview_show_above_text,
|
||||
)
|
||||
)
|
||||
|
||||
if has_any_link_preview_option and self.link_preview is None:
|
||||
from ..types import LinkPreviewOptions
|
||||
|
||||
self.link_preview = LinkPreviewOptions(
|
||||
is_disabled=self.link_preview_is_disabled,
|
||||
prefer_small_media=self.link_preview_prefer_small_media,
|
||||
prefer_large_media=self.link_preview_prefer_large_media,
|
||||
show_above_text=self.link_preview_show_above_text,
|
||||
)
|
||||
|
||||
def __getitem__(self, item: str) -> Any:
|
||||
return getattr(self, item, None)
|
||||
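`DefaultBotProperties` collects per-bot defaults (parse mode, link preview behaviour, and so on) that are applied when a request does not set the corresponding field explicitly. A small sketch of wiring it into a `Bot` instance (the token is a placeholder):

```python
from aiogram import Bot
from aiogram.client.default import DefaultBotProperties
from aiogram.enums import ParseMode

bot = Bot(
    token="123456:TOKEN",  # placeholder token
    default=DefaultBotProperties(
        parse_mode=ParseMode.HTML,          # render messages as HTML by default
        link_preview_is_disabled=True,      # disable link previews by default
    ),
)
```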
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -0,0 +1,211 @@
|
||||
from __future__ import annotations
|
||||
|
||||
import asyncio
|
||||
import ssl
|
||||
from typing import (
|
||||
TYPE_CHECKING,
|
||||
Any,
|
||||
AsyncGenerator,
|
||||
Dict,
|
||||
Iterable,
|
||||
List,
|
||||
Optional,
|
||||
Tuple,
|
||||
Type,
|
||||
Union,
|
||||
cast,
|
||||
)
|
||||
|
||||
import certifi
|
||||
from aiohttp import BasicAuth, ClientError, ClientSession, FormData, TCPConnector
|
||||
from aiohttp.hdrs import USER_AGENT
|
||||
from aiohttp.http import SERVER_SOFTWARE
|
||||
|
||||
from aiogram.__meta__ import __version__
|
||||
from aiogram.methods import TelegramMethod
|
||||
|
||||
from ...exceptions import TelegramNetworkError
|
||||
from ...methods.base import TelegramType
|
||||
from ...types import InputFile
|
||||
from .base import BaseSession
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from ..bot import Bot
|
||||
|
||||
_ProxyBasic = Union[str, Tuple[str, BasicAuth]]
|
||||
_ProxyChain = Iterable[_ProxyBasic]
|
||||
_ProxyType = Union[_ProxyChain, _ProxyBasic]
|
||||
|
||||
|
||||
def _retrieve_basic(basic: _ProxyBasic) -> Dict[str, Any]:
|
||||
from aiohttp_socks.utils import parse_proxy_url
|
||||
|
||||
proxy_auth: Optional[BasicAuth] = None
|
||||
|
||||
if isinstance(basic, str):
|
||||
proxy_url = basic
|
||||
else:
|
||||
proxy_url, proxy_auth = basic
|
||||
|
||||
proxy_type, host, port, username, password = parse_proxy_url(proxy_url)
|
||||
if isinstance(proxy_auth, BasicAuth):
|
||||
username = proxy_auth.login
|
||||
password = proxy_auth.password
|
||||
|
||||
return {
|
||||
"proxy_type": proxy_type,
|
||||
"host": host,
|
||||
"port": port,
|
||||
"username": username,
|
||||
"password": password,
|
||||
"rdns": True,
|
||||
}
|
||||
|
||||
|
||||
def _prepare_connector(chain_or_plain: _ProxyType) -> Tuple[Type["TCPConnector"], Dict[str, Any]]:
|
||||
from aiohttp_socks import ChainProxyConnector, ProxyConnector, ProxyInfo
|
||||
|
||||
# since tuple is Iterable(compatible with _ProxyChain) object, we assume that
|
||||
# user wants chained proxies if tuple is a pair of string(url) and BasicAuth
|
||||
if isinstance(chain_or_plain, str) or (
|
||||
isinstance(chain_or_plain, tuple) and len(chain_or_plain) == 2
|
||||
):
|
||||
chain_or_plain = cast(_ProxyBasic, chain_or_plain)
|
||||
return ProxyConnector, _retrieve_basic(chain_or_plain)
|
||||
|
||||
chain_or_plain = cast(_ProxyChain, chain_or_plain)
|
||||
infos: List[ProxyInfo] = []
|
||||
for basic in chain_or_plain:
|
||||
infos.append(ProxyInfo(**_retrieve_basic(basic)))
|
||||
|
||||
return ChainProxyConnector, {"proxy_infos": infos}
|
||||
|
||||
|
||||
class AiohttpSession(BaseSession):
|
||||
def __init__(
|
||||
self, proxy: Optional[_ProxyType] = None, limit: int = 100, **kwargs: Any
|
||||
) -> None:
|
||||
"""
|
||||
Client session based on aiohttp.
|
||||
|
||||
:param proxy: The proxy to be used for requests. Default is None.
|
||||
:param limit: The total number of simultaneous connections. Default is 100.
|
||||
:param kwargs: Additional keyword arguments.
|
||||
"""
|
||||
super().__init__(**kwargs)
|
||||
|
||||
self._session: Optional[ClientSession] = None
|
||||
self._connector_type: Type[TCPConnector] = TCPConnector
|
||||
self._connector_init: Dict[str, Any] = {
|
||||
"ssl": ssl.create_default_context(cafile=certifi.where()),
|
||||
"limit": limit,
|
||||
"ttl_dns_cache": 3600, # Workaround for https://github.com/aiogram/aiogram/issues/1500
|
||||
}
|
||||
self._should_reset_connector = True # flag determines connector state
|
||||
self._proxy: Optional[_ProxyType] = None
|
||||
|
||||
if proxy is not None:
|
||||
try:
|
||||
self._setup_proxy_connector(proxy)
|
||||
except ImportError as exc: # pragma: no cover
|
||||
raise RuntimeError(
|
||||
"In order to use aiohttp client for proxy requests, install "
|
||||
"https://pypi.org/project/aiohttp-socks/"
|
||||
) from exc
|
||||
|
||||
def _setup_proxy_connector(self, proxy: _ProxyType) -> None:
|
||||
self._connector_type, self._connector_init = _prepare_connector(proxy)
|
||||
self._proxy = proxy
|
||||
|
||||
@property
|
||||
def proxy(self) -> Optional[_ProxyType]:
|
||||
return self._proxy
|
||||
|
||||
@proxy.setter
|
||||
def proxy(self, proxy: _ProxyType) -> None:
|
||||
self._setup_proxy_connector(proxy)
|
||||
self._should_reset_connector = True
|
||||
|
||||
async def create_session(self) -> ClientSession:
|
||||
if self._should_reset_connector:
|
||||
await self.close()
|
||||
|
||||
if self._session is None or self._session.closed:
|
||||
self._session = ClientSession(
|
||||
connector=self._connector_type(**self._connector_init),
|
||||
headers={
|
||||
USER_AGENT: f"{SERVER_SOFTWARE} aiogram/{__version__}",
|
||||
},
|
||||
)
|
||||
self._should_reset_connector = False
|
||||
|
||||
return self._session
|
||||
|
||||
async def close(self) -> None:
|
||||
if self._session is not None and not self._session.closed:
|
||||
await self._session.close()
|
||||
|
||||
# Wait 250 ms for the underlying SSL connections to close
|
||||
# https://docs.aiohttp.org/en/stable/client_advanced.html#graceful-shutdown
|
||||
await asyncio.sleep(0.25)
|
||||
|
||||
def build_form_data(self, bot: Bot, method: TelegramMethod[TelegramType]) -> FormData:
|
||||
form = FormData(quote_fields=False)
|
||||
files: Dict[str, InputFile] = {}
|
||||
for key, value in method.model_dump(warnings=False).items():
|
||||
value = self.prepare_value(value, bot=bot, files=files)
|
||||
if not value:
|
||||
continue
|
||||
form.add_field(key, value)
|
||||
for key, value in files.items():
|
||||
form.add_field(
|
||||
key,
|
||||
value.read(bot),
|
||||
filename=value.filename or key,
|
||||
)
|
||||
return form
|
||||
|
||||
async def make_request(
|
||||
self, bot: Bot, method: TelegramMethod[TelegramType], timeout: Optional[int] = None
|
||||
) -> TelegramType:
|
||||
session = await self.create_session()
|
||||
|
||||
url = self.api.api_url(token=bot.token, method=method.__api_method__)
|
||||
form = self.build_form_data(bot=bot, method=method)
|
||||
|
||||
try:
|
||||
async with session.post(
|
||||
url, data=form, timeout=self.timeout if timeout is None else timeout
|
||||
) as resp:
|
||||
raw_result = await resp.text()
|
||||
except asyncio.TimeoutError:
|
||||
raise TelegramNetworkError(method=method, message="Request timeout error")
|
||||
except ClientError as e:
|
||||
raise TelegramNetworkError(method=method, message=f"{type(e).__name__}: {e}")
|
||||
response = self.check_response(
|
||||
bot=bot, method=method, status_code=resp.status, content=raw_result
|
||||
)
|
||||
return cast(TelegramType, response.result)
|
||||
|
||||
async def stream_content(
|
||||
self,
|
||||
url: str,
|
||||
headers: Optional[Dict[str, Any]] = None,
|
||||
timeout: int = 30,
|
||||
chunk_size: int = 65536,
|
||||
raise_for_status: bool = True,
|
||||
) -> AsyncGenerator[bytes, None]:
|
||||
if headers is None:
|
||||
headers = {}
|
||||
|
||||
session = await self.create_session()
|
||||
|
||||
async with session.get(
|
||||
url, timeout=timeout, headers=headers, raise_for_status=raise_for_status
|
||||
) as resp:
|
||||
async for chunk in resp.content.iter_chunked(chunk_size):
|
||||
yield chunk
|
||||
|
||||
async def __aenter__(self) -> AiohttpSession:
|
||||
await self.create_session()
|
||||
return self
|
||||
venv/lib/python3.12/site-packages/aiogram/client/session/base.py (265 lines)
@@ -0,0 +1,265 @@
|
||||
from __future__ import annotations
|
||||
|
||||
import abc
|
||||
import datetime
|
||||
import json
|
||||
import secrets
|
||||
from enum import Enum
|
||||
from http import HTTPStatus
|
||||
from types import TracebackType
|
||||
from typing import (
|
||||
TYPE_CHECKING,
|
||||
Any,
|
||||
AsyncGenerator,
|
||||
Callable,
|
||||
Dict,
|
||||
Final,
|
||||
Optional,
|
||||
Type,
|
||||
cast,
|
||||
)
|
||||
|
||||
from pydantic import ValidationError
|
||||
|
||||
from aiogram.exceptions import (
|
||||
ClientDecodeError,
|
||||
RestartingTelegram,
|
||||
TelegramAPIError,
|
||||
TelegramBadRequest,
|
||||
TelegramConflictError,
|
||||
TelegramEntityTooLarge,
|
||||
TelegramForbiddenError,
|
||||
TelegramMigrateToChat,
|
||||
TelegramNotFound,
|
||||
TelegramRetryAfter,
|
||||
TelegramServerError,
|
||||
TelegramUnauthorizedError,
|
||||
)
|
||||
|
||||
from ...methods import Response, TelegramMethod
|
||||
from ...methods.base import TelegramType
|
||||
from ...types import InputFile, TelegramObject
|
||||
from ..default import Default
|
||||
from ..telegram import PRODUCTION, TelegramAPIServer
|
||||
from .middlewares.manager import RequestMiddlewareManager
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from ..bot import Bot
|
||||
|
||||
_JsonLoads = Callable[..., Any]
|
||||
_JsonDumps = Callable[..., str]
|
||||
|
||||
DEFAULT_TIMEOUT: Final[float] = 60.0
|
||||
|
||||
|
||||
class BaseSession(abc.ABC):
|
||||
"""
|
||||
This is base class for all HTTP sessions in aiogram.
|
||||
|
||||
If you want to create your own session, you must inherit from this class.
|
||||
"""
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
api: TelegramAPIServer = PRODUCTION,
|
||||
json_loads: _JsonLoads = json.loads,
|
||||
json_dumps: _JsonDumps = json.dumps,
|
||||
timeout: float = DEFAULT_TIMEOUT,
|
||||
) -> None:
|
||||
"""
|
||||
|
||||
:param api: Telegram Bot API URL patterns
|
||||
:param json_loads: JSON loader
|
||||
:param json_dumps: JSON dumper
|
||||
:param timeout: Session scope request timeout
|
||||
"""
|
||||
self.api = api
|
||||
self.json_loads = json_loads
|
||||
self.json_dumps = json_dumps
|
||||
self.timeout = timeout
|
||||
|
||||
self.middleware = RequestMiddlewareManager()
|
||||
|
||||
def check_response(
|
||||
self, bot: Bot, method: TelegramMethod[TelegramType], status_code: int, content: str
|
||||
) -> Response[TelegramType]:
|
||||
"""
|
||||
Check response status
|
||||
"""
|
||||
try:
|
||||
json_data = self.json_loads(content)
|
||||
except Exception as e:
|
||||
# The error can't be classified as a specific type because the
# decoder can be customized and may raise any exception
|
||||
|
||||
raise ClientDecodeError("Failed to decode object", e, content)
|
||||
|
||||
try:
|
||||
response_type = Response[method.__returning__] # type: ignore
|
||||
response = response_type.model_validate(json_data, context={"bot": bot})
|
||||
except ValidationError as e:
|
||||
raise ClientDecodeError("Failed to deserialize object", e, json_data)
|
||||
|
||||
if HTTPStatus.OK <= status_code <= HTTPStatus.IM_USED and response.ok:
|
||||
return response
|
||||
|
||||
description = cast(str, response.description)
|
||||
|
||||
if parameters := response.parameters:
|
||||
if parameters.retry_after:
|
||||
raise TelegramRetryAfter(
|
||||
method=method, message=description, retry_after=parameters.retry_after
|
||||
)
|
||||
if parameters.migrate_to_chat_id:
|
||||
raise TelegramMigrateToChat(
|
||||
method=method,
|
||||
message=description,
|
||||
migrate_to_chat_id=parameters.migrate_to_chat_id,
|
||||
)
|
||||
if status_code == HTTPStatus.BAD_REQUEST:
|
||||
raise TelegramBadRequest(method=method, message=description)
|
||||
if status_code == HTTPStatus.NOT_FOUND:
|
||||
raise TelegramNotFound(method=method, message=description)
|
||||
if status_code == HTTPStatus.CONFLICT:
|
||||
raise TelegramConflictError(method=method, message=description)
|
||||
if status_code == HTTPStatus.UNAUTHORIZED:
|
||||
raise TelegramUnauthorizedError(method=method, message=description)
|
||||
if status_code == HTTPStatus.FORBIDDEN:
|
||||
raise TelegramForbiddenError(method=method, message=description)
|
||||
if status_code == HTTPStatus.REQUEST_ENTITY_TOO_LARGE:
|
||||
raise TelegramEntityTooLarge(method=method, message=description)
|
||||
if status_code >= HTTPStatus.INTERNAL_SERVER_ERROR:
|
||||
if "restart" in description:
|
||||
raise RestartingTelegram(method=method, message=description)
|
||||
raise TelegramServerError(method=method, message=description)
|
||||
|
||||
raise TelegramAPIError(
|
||||
method=method,
|
||||
message=description,
|
||||
)
|
||||
|
||||
@abc.abstractmethod
|
||||
async def close(self) -> None: # pragma: no cover
|
||||
"""
|
||||
Close client session
|
||||
"""
|
||||
pass
|
||||
|
||||
@abc.abstractmethod
|
||||
async def make_request(
|
||||
self,
|
||||
bot: Bot,
|
||||
method: TelegramMethod[TelegramType],
|
||||
timeout: Optional[int] = None,
|
||||
) -> TelegramType: # pragma: no cover
|
||||
"""
|
||||
Make request to Telegram Bot API
|
||||
|
||||
:param bot: Bot instance
|
||||
:param method: Method instance
|
||||
:param timeout: Request timeout
|
||||
:return:
|
||||
:raise TelegramApiError:
|
||||
"""
|
||||
pass
|
||||
|
||||
@abc.abstractmethod
|
||||
async def stream_content(
|
||||
self,
|
||||
url: str,
|
||||
headers: Optional[Dict[str, Any]] = None,
|
||||
timeout: int = 30,
|
||||
chunk_size: int = 65536,
|
||||
raise_for_status: bool = True,
|
||||
) -> AsyncGenerator[bytes, None]: # pragma: no cover
|
||||
"""
|
||||
Stream reader
|
||||
"""
|
||||
yield b""
|
||||
|
||||
def prepare_value(
|
||||
self,
|
||||
value: Any,
|
||||
bot: Bot,
|
||||
files: Dict[str, Any],
|
||||
_dumps_json: bool = True,
|
||||
) -> Any:
|
||||
"""
|
||||
Prepare value before send
|
||||
"""
|
||||
if value is None:
|
||||
return None
|
||||
if isinstance(value, str):
|
||||
return value
|
||||
if isinstance(value, Default):
|
||||
default_value = bot.default[value.name]
|
||||
return self.prepare_value(default_value, bot=bot, files=files, _dumps_json=_dumps_json)
|
||||
if isinstance(value, InputFile):
|
||||
key = secrets.token_urlsafe(10)
|
||||
files[key] = value
|
||||
return f"attach://{key}"
|
||||
if isinstance(value, dict):
|
||||
value = {
|
||||
key: prepared_item
|
||||
for key, item in value.items()
|
||||
if (
|
||||
prepared_item := self.prepare_value(
|
||||
item, bot=bot, files=files, _dumps_json=False
|
||||
)
|
||||
)
|
||||
is not None
|
||||
}
|
||||
if _dumps_json:
|
||||
return self.json_dumps(value)
|
||||
return value
|
||||
if isinstance(value, list):
|
||||
value = [
|
||||
prepared_item
|
||||
for item in value
|
||||
if (
|
||||
prepared_item := self.prepare_value(
|
||||
item, bot=bot, files=files, _dumps_json=False
|
||||
)
|
||||
)
|
||||
is not None
|
||||
]
|
||||
if _dumps_json:
|
||||
return self.json_dumps(value)
|
||||
return value
|
||||
if isinstance(value, datetime.timedelta):
|
||||
now = datetime.datetime.now()
|
||||
return str(round((now + value).timestamp()))
|
||||
if isinstance(value, datetime.datetime):
|
||||
return str(round(value.timestamp()))
|
||||
if isinstance(value, Enum):
|
||||
return self.prepare_value(value.value, bot=bot, files=files)
|
||||
if isinstance(value, TelegramObject):
|
||||
return self.prepare_value(
|
||||
value.model_dump(warnings=False),
|
||||
bot=bot,
|
||||
files=files,
|
||||
_dumps_json=_dumps_json,
|
||||
)
|
||||
if _dumps_json:
|
||||
return self.json_dumps(value)
|
||||
return value
|
||||
|
||||
async def __call__(
|
||||
self,
|
||||
bot: Bot,
|
||||
method: TelegramMethod[TelegramType],
|
||||
timeout: Optional[int] = None,
|
||||
) -> TelegramType:
|
||||
middleware = self.middleware.wrap_middlewares(self.make_request, timeout=timeout)
|
||||
return cast(TelegramType, await middleware(bot, method))
|
||||
|
||||
async def __aenter__(self) -> BaseSession:
|
||||
return self
|
||||
|
||||
async def __aexit__(
|
||||
self,
|
||||
exc_type: Optional[Type[BaseException]],
|
||||
exc_value: Optional[BaseException],
|
||||
traceback: Optional[TracebackType],
|
||||
) -> None:
|
||||
await self.close()
|
||||
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -0,0 +1,53 @@
|
||||
from __future__ import annotations
|
||||
|
||||
from abc import ABC, abstractmethod
|
||||
from typing import TYPE_CHECKING, Protocol
|
||||
|
||||
from aiogram.methods import Response, TelegramMethod
|
||||
from aiogram.methods.base import TelegramType
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from ...bot import Bot
|
||||
|
||||
|
||||
class NextRequestMiddlewareType(Protocol[TelegramType]): # pragma: no cover
|
||||
async def __call__(
|
||||
self,
|
||||
bot: "Bot",
|
||||
method: TelegramMethod[TelegramType],
|
||||
) -> Response[TelegramType]:
|
||||
pass
|
||||
|
||||
|
||||
class RequestMiddlewareType(Protocol): # pragma: no cover
|
||||
async def __call__(
|
||||
self,
|
||||
make_request: NextRequestMiddlewareType[TelegramType],
|
||||
bot: "Bot",
|
||||
method: TelegramMethod[TelegramType],
|
||||
) -> Response[TelegramType]:
|
||||
pass
|
||||
|
||||
|
||||
class BaseRequestMiddleware(ABC):
|
||||
"""
|
||||
Generic middleware class
|
||||
"""
|
||||
|
||||
@abstractmethod
|
||||
async def __call__(
|
||||
self,
|
||||
make_request: NextRequestMiddlewareType[TelegramType],
|
||||
bot: "Bot",
|
||||
method: TelegramMethod[TelegramType],
|
||||
) -> Response[TelegramType]:
|
||||
"""
|
||||
Execute middleware
|
||||
|
||||
:param make_request: Wrapped make_request in middlewares chain
|
||||
:param bot: bot for request making
|
||||
:param method: Request method (Subclass of :class:`aiogram.methods.base.TelegramMethod`)
|
||||
|
||||
:return: :class:`aiogram.methods.Response`
|
||||
"""
|
||||
pass
|
||||
@@ -0,0 +1,62 @@
|
||||
from __future__ import annotations
|
||||
|
||||
from functools import partial
|
||||
from typing import Any, Callable, List, Optional, Sequence, Union, cast, overload
|
||||
|
||||
from aiogram.client.session.middlewares.base import (
|
||||
NextRequestMiddlewareType,
|
||||
RequestMiddlewareType,
|
||||
)
|
||||
from aiogram.methods.base import TelegramType
|
||||
|
||||
|
||||
class RequestMiddlewareManager(Sequence[RequestMiddlewareType]):
|
||||
def __init__(self) -> None:
|
||||
self._middlewares: List[RequestMiddlewareType] = []
|
||||
|
||||
def register(
|
||||
self,
|
||||
middleware: RequestMiddlewareType,
|
||||
) -> RequestMiddlewareType:
|
||||
self._middlewares.append(middleware)
|
||||
return middleware
|
||||
|
||||
def unregister(self, middleware: RequestMiddlewareType) -> None:
|
||||
self._middlewares.remove(middleware)
|
||||
|
||||
def __call__(
|
||||
self,
|
||||
middleware: Optional[RequestMiddlewareType] = None,
|
||||
) -> Union[
|
||||
Callable[[RequestMiddlewareType], RequestMiddlewareType],
|
||||
RequestMiddlewareType,
|
||||
]:
|
||||
if middleware is None:
|
||||
return self.register
|
||||
return self.register(middleware)
|
||||
|
||||
@overload
|
||||
def __getitem__(self, item: int) -> RequestMiddlewareType:
|
||||
pass
|
||||
|
||||
@overload
|
||||
def __getitem__(self, item: slice) -> Sequence[RequestMiddlewareType]:
|
||||
pass
|
||||
|
||||
def __getitem__(
|
||||
self, item: Union[int, slice]
|
||||
) -> Union[RequestMiddlewareType, Sequence[RequestMiddlewareType]]:
|
||||
return self._middlewares[item]
|
||||
|
||||
def __len__(self) -> int:
|
||||
return len(self._middlewares)
|
||||
|
||||
def wrap_middlewares(
|
||||
self,
|
||||
callback: NextRequestMiddlewareType[TelegramType],
|
||||
**kwargs: Any,
|
||||
) -> NextRequestMiddlewareType[TelegramType]:
|
||||
middleware = partial(callback, **kwargs)
|
||||
for m in reversed(self._middlewares):
|
||||
middleware = partial(m, middleware)
|
||||
return cast(NextRequestMiddlewareType[TelegramType], middleware)
|
||||
@@ -0,0 +1,37 @@
|
||||
import logging
|
||||
from typing import TYPE_CHECKING, Any, List, Optional, Type
|
||||
|
||||
from aiogram import loggers
|
||||
from aiogram.methods import TelegramMethod
|
||||
from aiogram.methods.base import Response, TelegramType
|
||||
|
||||
from .base import BaseRequestMiddleware, NextRequestMiddlewareType
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from ...bot import Bot
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class RequestLogging(BaseRequestMiddleware):
|
||||
def __init__(self, ignore_methods: Optional[List[Type[TelegramMethod[Any]]]] = None):
|
||||
"""
|
||||
Middleware for logging outgoing requests
|
||||
|
||||
:param ignore_methods: methods to ignore in logging middleware
|
||||
"""
|
||||
self.ignore_methods = ignore_methods if ignore_methods else []
|
||||
|
||||
async def __call__(
|
||||
self,
|
||||
make_request: NextRequestMiddlewareType[TelegramType],
|
||||
bot: "Bot",
|
||||
method: TelegramMethod[TelegramType],
|
||||
) -> Response[TelegramType]:
|
||||
if type(method) not in self.ignore_methods:
|
||||
loggers.middlewares.info(
|
||||
"Make request with method=%r by bot id=%d",
|
||||
type(method).__name__,
|
||||
bot.id,
|
||||
)
|
||||
return await make_request(bot, method)
|
||||
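This middleware is registered on the bot session's middleware manager shown above. A brief usage sketch (the import path is assumed from this file's location in the package; the token is a placeholder):

```python
from aiogram import Bot
from aiogram.client.session.middlewares.request_logging import RequestLogging
from aiogram.methods import GetUpdates

bot = Bot(token="123456:TOKEN")  # placeholder token
# Log outgoing API requests, but skip the noisy long-polling method
bot.session.middleware(RequestLogging(ignore_methods=[GetUpdates]))
```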
venv/lib/python3.12/site-packages/aiogram/client/telegram.py (103 lines)
@@ -0,0 +1,103 @@
|
||||
from abc import ABC, abstractmethod
|
||||
from dataclasses import dataclass
|
||||
from pathlib import Path
|
||||
from typing import Any, Union
|
||||
|
||||
|
||||
class FilesPathWrapper(ABC):
|
||||
@abstractmethod
|
||||
def to_local(self, path: Union[Path, str]) -> Union[Path, str]:
|
||||
pass
|
||||
|
||||
@abstractmethod
|
||||
def to_server(self, path: Union[Path, str]) -> Union[Path, str]:
|
||||
pass
|
||||
|
||||
|
||||
class BareFilesPathWrapper(FilesPathWrapper):
|
||||
def to_local(self, path: Union[Path, str]) -> Union[Path, str]:
|
||||
return path
|
||||
|
||||
def to_server(self, path: Union[Path, str]) -> Union[Path, str]:
|
||||
return path
|
||||
|
||||
|
||||
class SimpleFilesPathWrapper(FilesPathWrapper):
|
||||
def __init__(self, server_path: Path, local_path: Path) -> None:
|
||||
self.server_path = server_path
|
||||
self.local_path = local_path
|
||||
|
||||
@classmethod
|
||||
def _resolve(
|
||||
cls, base1: Union[Path, str], base2: Union[Path, str], value: Union[Path, str]
|
||||
) -> Path:
|
||||
relative = Path(value).relative_to(base1)
|
||||
return base2 / relative
|
||||
|
||||
def to_local(self, path: Union[Path, str]) -> Union[Path, str]:
|
||||
return self._resolve(base1=self.server_path, base2=self.local_path, value=path)
|
||||
|
||||
def to_server(self, path: Union[Path, str]) -> Union[Path, str]:
|
||||
return self._resolve(base1=self.local_path, base2=self.server_path, value=path)
|
||||
|
||||
|
||||
@dataclass(frozen=True)
|
||||
class TelegramAPIServer:
|
||||
"""
|
||||
Base config for API Endpoints
|
||||
"""
|
||||
|
||||
base: str
|
||||
"""Base URL"""
|
||||
file: str
|
||||
"""Files URL"""
|
||||
is_local: bool = False
|
||||
"""Mark this server is
|
||||
in `local mode <https://core.telegram.org/bots/api#using-a-local-bot-api-server>`_."""
|
||||
wrap_local_file: FilesPathWrapper = BareFilesPathWrapper()
|
||||
"""Callback to wrap files path in local mode"""
|
||||
|
||||
def api_url(self, token: str, method: str) -> str:
|
||||
"""
|
||||
Generate URL for API methods
|
||||
|
||||
:param token: Bot token
|
||||
:param method: API method name (case insensitive)
|
||||
:return: URL
|
||||
"""
|
||||
return self.base.format(token=token, method=method)
|
||||
|
||||
def file_url(self, token: str, path: Union[str, Path]) -> str:
|
||||
"""
|
||||
Generate URL for downloading files
|
||||
|
||||
:param token: Bot token
|
||||
:param path: file path
|
||||
:return: URL
|
||||
"""
|
||||
return self.file.format(token=token, path=path)
|
||||
|
||||
@classmethod
|
||||
def from_base(cls, base: str, **kwargs: Any) -> "TelegramAPIServer":
|
||||
"""
|
||||
Use this method to auto-generate TelegramAPIServer instance from base URL
|
||||
|
||||
:param base: Base URL
|
||||
:return: instance of :class:`TelegramAPIServer`
|
||||
"""
|
||||
base = base.rstrip("/")
|
||||
return cls(
|
||||
base=f"{base}/bot{{token}}/{{method}}",
|
||||
file=f"{base}/file/bot{{token}}/{{path}}",
|
||||
**kwargs,
|
||||
)
|
||||
|
||||
|
||||
PRODUCTION = TelegramAPIServer(
|
||||
base="https://api.telegram.org/bot{token}/{method}",
|
||||
file="https://api.telegram.org/file/bot{token}/{path}",
|
||||
)
|
||||
TEST = TelegramAPIServer(
|
||||
base="https://api.telegram.org/bot{token}/test/{method}",
|
||||
file="https://api.telegram.org/file/bot{token}/test/{path}",
|
||||
)
|
||||
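`TelegramAPIServer.from_base` builds the URL templates for a non-default Bot API endpoint, for example a self-hosted server running in local mode. A minimal sketch (the local URL and token are placeholders):

```python
from aiogram import Bot
from aiogram.client.session.aiohttp import AiohttpSession
from aiogram.client.telegram import TelegramAPIServer

# Point the bot at a self-hosted Bot API server instead of api.telegram.org
local_server = TelegramAPIServer.from_base("http://localhost:8081", is_local=True)
session = AiohttpSession(api=local_server)
bot = Bot(token="123456:TOKEN", session=session)  # placeholder token
```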
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -0,0 +1,642 @@
|
||||
from __future__ import annotations
|
||||
|
||||
import asyncio
|
||||
import contextvars
|
||||
import signal
|
||||
import warnings
|
||||
from asyncio import CancelledError, Event, Future, Lock
|
||||
from contextlib import suppress
|
||||
from typing import Any, AsyncGenerator, Awaitable, Dict, List, Optional, Set, Union
|
||||
|
||||
from .. import loggers
|
||||
from ..client.bot import Bot
|
||||
from ..exceptions import TelegramAPIError
|
||||
from ..fsm.middleware import FSMContextMiddleware
|
||||
from ..fsm.storage.base import BaseEventIsolation, BaseStorage
|
||||
from ..fsm.storage.memory import DisabledEventIsolation, MemoryStorage
|
||||
from ..fsm.strategy import FSMStrategy
|
||||
from ..methods import GetUpdates, TelegramMethod
|
||||
from ..methods.base import TelegramType
|
||||
from ..types import Update, User
|
||||
from ..types.base import UNSET, UNSET_TYPE
|
||||
from ..types.update import UpdateTypeLookupError
|
||||
from ..utils.backoff import Backoff, BackoffConfig
|
||||
from .event.bases import UNHANDLED, SkipHandler
|
||||
from .event.telegram import TelegramEventObserver
|
||||
from .middlewares.error import ErrorsMiddleware
|
||||
from .middlewares.user_context import UserContextMiddleware
|
||||
from .router import Router
|
||||
|
||||
DEFAULT_BACKOFF_CONFIG = BackoffConfig(min_delay=1.0, max_delay=5.0, factor=1.3, jitter=0.1)
|
||||
|
||||
|
||||
class Dispatcher(Router):
|
||||
"""
|
||||
Root router
|
||||
"""
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
*, # * - Preventing to pass instance of Bot to the FSM storage
|
||||
storage: Optional[BaseStorage] = None,
|
||||
fsm_strategy: FSMStrategy = FSMStrategy.USER_IN_CHAT,
|
||||
events_isolation: Optional[BaseEventIsolation] = None,
|
||||
disable_fsm: bool = False,
|
||||
name: Optional[str] = None,
|
||||
**kwargs: Any,
|
||||
) -> None:
|
||||
"""
|
||||
Root router
|
||||
|
||||
:param storage: Storage for FSM
|
||||
:param fsm_strategy: FSM strategy
|
||||
:param events_isolation: Events isolation
|
||||
:param disable_fsm: Disable FSM, note that if you disable FSM
|
||||
then you should not use storage and events isolation
|
||||
:param kwargs: Other arguments, will be passed as keyword arguments to handlers
|
||||
"""
|
||||
super(Dispatcher, self).__init__(name=name)
|
||||
|
||||
if storage and not isinstance(storage, BaseStorage):
|
||||
raise TypeError(
|
||||
f"FSM storage should be instance of 'BaseStorage' not {type(storage).__name__}"
|
||||
)
|
||||
|
||||
# Telegram API provides originally only one event type - Update
|
||||
# For making easily interactions with events here is registered handler which helps
|
||||
# to separate Update to different event types like Message, CallbackQuery etc.
|
||||
self.update = self.observers["update"] = TelegramEventObserver(
|
||||
router=self, event_name="update"
|
||||
)
|
||||
self.update.register(self._listen_update)
|
||||
|
||||
# Error handlers should run outside of all other functions
# and should be registered before all other middlewares
|
||||
self.update.outer_middleware(ErrorsMiddleware(self))
|
||||
|
||||
# User context middleware makes small optimization for all other builtin
|
||||
# middlewares via caching the user and chat instances in the event context
|
||||
self.update.outer_middleware(UserContextMiddleware())
|
||||
|
||||
# FSM middleware should always be registered after User context middleware
|
||||
# because here is used context from previous step
|
||||
self.fsm = FSMContextMiddleware(
|
||||
storage=storage or MemoryStorage(),
|
||||
strategy=fsm_strategy,
|
||||
events_isolation=events_isolation or DisabledEventIsolation(),
|
||||
)
|
||||
if not disable_fsm:
|
||||
# Note that when FSM middleware is disabled, the event isolation is also disabled
|
||||
# Because the isolation mechanism is a part of the FSM
|
||||
self.update.outer_middleware(self.fsm)
|
||||
self.shutdown.register(self.fsm.close)
|
||||
|
||||
self.workflow_data: Dict[str, Any] = kwargs
|
||||
self._running_lock = Lock()
|
||||
self._stop_signal: Optional[Event] = None
|
||||
self._stopped_signal: Optional[Event] = None
|
||||
self._handle_update_tasks: Set[asyncio.Task[Any]] = set()
|
||||
|
||||
def __getitem__(self, item: str) -> Any:
|
||||
return self.workflow_data[item]
|
||||
|
||||
def __setitem__(self, key: str, value: Any) -> None:
|
||||
self.workflow_data[key] = value
|
||||
|
||||
def __delitem__(self, key: str) -> None:
|
||||
del self.workflow_data[key]
|
||||
|
||||
def get(self, key: str, /, default: Optional[Any] = None) -> Optional[Any]:
|
||||
return self.workflow_data.get(key, default)
|
||||
|
||||
@property
|
||||
def storage(self) -> BaseStorage:
|
||||
return self.fsm.storage
|
||||
|
||||
@property
|
||||
def parent_router(self) -> Optional[Router]:
|
||||
"""
|
||||
Dispatcher has no parent router and can't be included to any other routers or dispatchers
|
||||
|
||||
:return:
|
||||
"""
|
||||
return None # noqa: RET501
|
||||
|
||||
@parent_router.setter
|
||||
def parent_router(self, value: Router) -> None:
|
||||
"""
|
||||
Dispatcher is root Router then configuring parent router is not allowed
|
||||
|
||||
:param value:
|
||||
:return:
|
||||
"""
|
||||
raise RuntimeError("Dispatcher can not be attached to another Router.")
|
||||
|
||||
async def feed_update(self, bot: Bot, update: Update, **kwargs: Any) -> Any:
|
||||
"""
|
||||
Main entry point for incoming updates
|
||||
Response of this method can be used as Webhook response
|
||||
|
||||
:param bot:
|
||||
:param update:
|
||||
"""
|
||||
loop = asyncio.get_running_loop()
|
||||
handled = False
|
||||
start_time = loop.time()
|
||||
|
||||
if update.bot != bot:
|
||||
# Re-mounting update to the current bot instance for making possible to
|
||||
# use it in shortcuts.
|
||||
# Here is update is re-created because we need to propagate context to
|
||||
# all nested objects and attributes of the Update, but it
|
||||
# is impossible without roundtrip to JSON :(
|
||||
# The preferred way is that pass already mounted Bot instance to this update
|
||||
# before call feed_update method
|
||||
update = Update.model_validate(update.model_dump(), context={"bot": bot})
|
||||
|
||||
try:
|
||||
response = await self.update.wrap_outer_middleware(
|
||||
self.update.trigger,
|
||||
update,
|
||||
{
|
||||
**self.workflow_data,
|
||||
**kwargs,
|
||||
"bot": bot,
|
||||
},
|
||||
)
|
||||
handled = response is not UNHANDLED
|
||||
return response
|
||||
finally:
|
||||
finish_time = loop.time()
|
||||
duration = (finish_time - start_time) * 1000
|
||||
loggers.event.info(
|
||||
"Update id=%s is %s. Duration %d ms by bot id=%d",
|
||||
update.update_id,
|
||||
"handled" if handled else "not handled",
|
||||
duration,
|
||||
bot.id,
|
||||
)
|
||||
|
||||
async def feed_raw_update(self, bot: Bot, update: Dict[str, Any], **kwargs: Any) -> Any:
|
||||
"""
|
||||
Main entry point for incoming updates with automatic Dict->Update serializer
|
||||
|
||||
:param bot:
|
||||
:param update:
|
||||
:param kwargs:
|
||||
"""
|
||||
parsed_update = Update.model_validate(update, context={"bot": bot})
|
||||
return await self._feed_webhook_update(bot=bot, update=parsed_update, **kwargs)
|
||||
|
||||
@classmethod
|
||||
async def _listen_updates(
|
||||
cls,
|
||||
bot: Bot,
|
||||
polling_timeout: int = 30,
|
||||
backoff_config: BackoffConfig = DEFAULT_BACKOFF_CONFIG,
|
||||
allowed_updates: Optional[List[str]] = None,
|
||||
) -> AsyncGenerator[Update, None]:
|
||||
"""
|
||||
Endless updates reader with correctly handling any server-side or connection errors.
|
||||
|
||||
So you may not worry that the polling will stop working.
|
||||
"""
|
||||
backoff = Backoff(config=backoff_config)
|
||||
get_updates = GetUpdates(timeout=polling_timeout, allowed_updates=allowed_updates)
|
||||
kwargs = {}
|
||||
if bot.session.timeout:
|
||||
# Request timeout can be lower than session timeout and that's OK.
|
||||
# To prevent false-positive TimeoutError we should wait longer than polling timeout
|
||||
kwargs["request_timeout"] = int(bot.session.timeout + polling_timeout)
|
||||
failed = False
|
||||
while True:
|
||||
try:
|
||||
updates = await bot(get_updates, **kwargs)
|
||||
except Exception as e:
|
||||
failed = True
|
||||
# In cases when Telegram Bot API was inaccessible don't need to stop polling
|
||||
# process because some developers can't make auto-restarting of the script
|
||||
loggers.dispatcher.error("Failed to fetch updates - %s: %s", type(e).__name__, e)
|
||||
# And also backoff timeout is best practice to retry any network activity
|
||||
loggers.dispatcher.warning(
|
||||
"Sleep for %f seconds and try again... (tryings = %d, bot id = %d)",
|
||||
backoff.next_delay,
|
||||
backoff.counter,
|
||||
bot.id,
|
||||
)
|
||||
await backoff.asleep()
|
||||
continue
|
||||
|
||||
# In case when network connection was fixed let's reset the backoff
|
||||
# to initial value and then process updates
|
||||
if failed:
|
||||
loggers.dispatcher.info(
|
||||
"Connection established (tryings = %d, bot id = %d)",
|
||||
backoff.counter,
|
||||
bot.id,
|
||||
)
|
||||
backoff.reset()
|
||||
failed = False
|
||||
|
||||
for update in updates:
|
||||
yield update
|
||||
# The getUpdates method returns the earliest 100 unconfirmed updates.
|
||||
# To confirm an update, use the offset parameter when calling getUpdates
|
||||
# All updates with update_id less than or equal to offset will be marked
|
||||
# as confirmed on the server and will no longer be returned.
|
||||
get_updates.offset = update.update_id + 1
|
||||
|
||||
async def _listen_update(self, update: Update, **kwargs: Any) -> Any:
|
||||
"""
|
||||
Main updates listener
|
||||
|
||||
Workflow:
|
||||
- Detect content type and propagate to observers in current router
|
||||
- If no one filter is pass - propagate update to child routers as Update
|
||||
|
||||
:param update:
|
||||
:param kwargs:
|
||||
:return:
|
||||
"""
|
||||
try:
|
||||
update_type = update.event_type
|
||||
event = update.event
|
||||
except UpdateTypeLookupError as e:
|
||||
warnings.warn(
|
||||
"Detected unknown update type.\n"
|
||||
"Seems like Telegram Bot API was updated and you have "
|
||||
"installed not latest version of aiogram framework"
|
||||
f"\nUpdate: {update.model_dump_json(exclude_unset=True)}",
|
||||
RuntimeWarning,
|
||||
)
|
||||
raise SkipHandler() from e
|
||||
|
||||
kwargs.update(event_update=update)
|
||||
|
||||
return await self.propagate_event(update_type=update_type, event=event, **kwargs)
|
||||
|
||||
@classmethod
|
||||
async def silent_call_request(cls, bot: Bot, result: TelegramMethod[Any]) -> None:
|
||||
"""
|
||||
Simulate answer into WebHook
|
||||
|
||||
:param bot:
|
||||
:param result:
|
||||
:return:
|
||||
"""
|
||||
try:
|
||||
await bot(result)
|
||||
except TelegramAPIError as e:
|
||||
# The webhook mechanism doesn't allow getting a response for
# requests made while answering a webhook request.
|
||||
# Need to skip unsuccessful responses.
|
||||
# For debugging here is added logging.
|
||||
loggers.event.error("Failed to make answer: %s: %s", e.__class__.__name__, e)
|
||||
|
||||
async def _process_update(
|
||||
self, bot: Bot, update: Update, call_answer: bool = True, **kwargs: Any
|
||||
) -> bool:
|
||||
"""
|
||||
Propagate update to event listeners
|
||||
|
||||
:param bot: instance of Bot
|
||||
:param update: instance of Update
|
||||
:param call_answer: need to execute response as Telegram method (like answer into webhook)
|
||||
:param kwargs: contextual data for middlewares, filters and handlers
|
||||
:return: status
|
||||
"""
|
||||
try:
|
||||
response = await self.feed_update(bot, update, **kwargs)
|
||||
if call_answer and isinstance(response, TelegramMethod):
|
||||
await self.silent_call_request(bot=bot, result=response)
|
||||
return response is not UNHANDLED
|
||||
|
||||
except Exception as e:
|
||||
loggers.event.exception(
|
||||
"Cause exception while process update id=%d by bot id=%d\n%s: %s",
|
||||
update.update_id,
|
||||
bot.id,
|
||||
e.__class__.__name__,
|
||||
e,
|
||||
)
|
||||
return True # because update was processed but unsuccessful
|
||||
|
||||
async def _process_with_semaphore(
|
||||
self, handle_update: Awaitable[bool], semaphore: asyncio.Semaphore
|
||||
) -> bool:
|
||||
"""
|
||||
Process update with semaphore to limit concurrent tasks
|
||||
|
||||
:param handle_update: Coroutine that processes the update
|
||||
:param semaphore: Semaphore to limit concurrent tasks
|
||||
:return: bool indicating the result of the update processing
|
||||
"""
|
||||
try:
|
||||
return await handle_update
|
||||
finally:
|
||||
semaphore.release()
|
||||
|
||||
async def _polling(
|
||||
self,
|
||||
bot: Bot,
|
||||
polling_timeout: int = 30,
|
||||
handle_as_tasks: bool = True,
|
||||
backoff_config: BackoffConfig = DEFAULT_BACKOFF_CONFIG,
|
||||
allowed_updates: Optional[List[str]] = None,
|
||||
tasks_concurrency_limit: Optional[int] = None,
|
||||
**kwargs: Any,
|
||||
) -> None:
|
||||
"""
|
||||
Internal polling process
|
||||
|
||||
:param bot:
|
||||
:param polling_timeout: Long-polling wait time
|
||||
:param handle_as_tasks: Run task for each event and no wait result
|
||||
:param backoff_config: backoff-retry config
|
||||
:param allowed_updates: List of the update types you want your bot to receive
|
||||
:param tasks_concurrency_limit: Maximum number of concurrent updates to process
|
||||
(None = no limit), used only if handle_as_tasks is True
|
||||
:param kwargs:
|
||||
:return:
|
||||
"""
|
||||
user: User = await bot.me()
|
||||
loggers.dispatcher.info(
|
||||
"Run polling for bot @%s id=%d - %r", user.username, bot.id, user.full_name
|
||||
)
|
||||
|
||||
# Create semaphore if tasks_concurrency_limit is specified
|
||||
semaphore = None
|
||||
if tasks_concurrency_limit is not None and handle_as_tasks:
|
||||
semaphore = asyncio.Semaphore(tasks_concurrency_limit)
|
||||
|
||||
try:
|
||||
async for update in self._listen_updates(
|
||||
bot,
|
||||
polling_timeout=polling_timeout,
|
||||
backoff_config=backoff_config,
|
||||
allowed_updates=allowed_updates,
|
||||
):
|
||||
handle_update = self._process_update(bot=bot, update=update, **kwargs)
|
||||
if handle_as_tasks:
|
||||
if semaphore:
|
||||
# Use semaphore to limit concurrent tasks
|
||||
await semaphore.acquire()
|
||||
handle_update_task = asyncio.create_task(
|
||||
self._process_with_semaphore(handle_update, semaphore)
|
||||
)
|
||||
else:
|
||||
handle_update_task = asyncio.create_task(handle_update)
|
||||
|
||||
self._handle_update_tasks.add(handle_update_task)
|
||||
handle_update_task.add_done_callback(self._handle_update_tasks.discard)
|
||||
else:
|
||||
await handle_update
|
||||
finally:
|
||||
loggers.dispatcher.info(
|
||||
"Polling stopped for bot @%s id=%d - %r", user.username, bot.id, user.full_name
|
||||
)
|
||||
|
||||
async def _feed_webhook_update(self, bot: Bot, update: Update, **kwargs: Any) -> Any:
|
||||
"""
|
||||
The same with `Dispatcher.process_update()` but returns real response instead of bool
|
||||
"""
|
||||
try:
|
||||
return await self.feed_update(bot, update, **kwargs)
|
||||
except Exception as e:
|
||||
loggers.event.exception(
|
||||
"Cause exception while process update id=%d by bot id=%d\n%s: %s",
|
||||
update.update_id,
|
||||
bot.id,
|
||||
e.__class__.__name__,
|
||||
e,
|
||||
)
|
||||
raise
|
||||
|
||||
async def feed_webhook_update(
|
||||
self, bot: Bot, update: Union[Update, Dict[str, Any]], _timeout: float = 55, **kwargs: Any
|
||||
) -> Optional[TelegramMethod[TelegramType]]:
|
||||
if not isinstance(update, Update): # Allow to use raw updates
|
||||
update = Update.model_validate(update, context={"bot": bot})
|
||||
|
||||
ctx = contextvars.copy_context()
|
||||
loop = asyncio.get_running_loop()
|
||||
waiter = loop.create_future()
|
||||
|
||||
def release_waiter(*_: Any) -> None:
|
||||
if not waiter.done():
|
||||
waiter.set_result(None)
|
||||
|
||||
timeout_handle = loop.call_later(_timeout, release_waiter)
|
||||
|
||||
process_updates: Future[Any] = asyncio.ensure_future(
|
||||
self._feed_webhook_update(bot=bot, update=update, **kwargs)
|
||||
)
|
||||
process_updates.add_done_callback(release_waiter, context=ctx)
|
||||
|
||||
def process_response(task: Future[Any]) -> None:
|
||||
warnings.warn(
|
||||
"Detected slow response into webhook.\n"
|
||||
"Telegram is waiting for response only first 60 seconds and then re-send update.\n"
|
||||
"For preventing this situation response into webhook returned immediately "
|
||||
"and handler is moved to background and still processing update.",
|
||||
RuntimeWarning,
|
||||
)
|
||||
try:
|
||||
result = task.result()
|
||||
except Exception as e:
|
||||
raise e
|
||||
if isinstance(result, TelegramMethod):
|
||||
asyncio.ensure_future(self.silent_call_request(bot=bot, result=result))
|
||||
|
||||
try:
|
||||
try:
|
||||
await waiter
|
||||
except CancelledError: # pragma: no cover
|
||||
process_updates.remove_done_callback(release_waiter)
|
||||
process_updates.cancel()
|
||||
raise
|
||||
|
||||
if process_updates.done():
|
||||
# TODO: handle exceptions
|
||||
response: Any = process_updates.result()
|
||||
if isinstance(response, TelegramMethod):
|
||||
return response
|
||||
|
||||
else:
|
||||
process_updates.remove_done_callback(release_waiter)
|
||||
process_updates.add_done_callback(process_response, context=ctx)
|
||||
|
||||
finally:
|
||||
timeout_handle.cancel()
|
||||
|
||||
return None
|
||||
|
||||
async def stop_polling(self) -> None:
|
||||
"""
|
||||
Execute this method if you want to stop polling programmatically
|
||||
|
||||
:return:
|
||||
"""
|
||||
if not self._running_lock.locked():
|
||||
raise RuntimeError("Polling is not started")
|
||||
if not self._stop_signal or not self._stopped_signal:
|
||||
return
|
||||
self._stop_signal.set()
|
||||
await self._stopped_signal.wait()
|
||||
|
||||
def _signal_stop_polling(self, sig: signal.Signals) -> None:
|
||||
if not self._running_lock.locked():
|
||||
return
|
||||
|
||||
loggers.dispatcher.warning("Received %s signal", sig.name)
|
||||
if not self._stop_signal:
|
||||
return
|
||||
self._stop_signal.set()
|
||||
|
||||
async def start_polling(
|
||||
self,
|
||||
*bots: Bot,
|
||||
polling_timeout: int = 10,
|
||||
handle_as_tasks: bool = True,
|
||||
backoff_config: BackoffConfig = DEFAULT_BACKOFF_CONFIG,
|
||||
allowed_updates: Optional[Union[List[str], UNSET_TYPE]] = UNSET,
|
||||
handle_signals: bool = True,
|
||||
close_bot_session: bool = True,
|
||||
tasks_concurrency_limit: Optional[int] = None,
|
||||
**kwargs: Any,
|
||||
) -> None:
|
||||
"""
|
||||
Polling runner
|
||||
|
||||
:param bots: Bot instances (one or more)
|
||||
:param polling_timeout: Long-polling wait time
|
||||
:param handle_as_tasks: Run task for each event and no wait result
|
||||
:param backoff_config: backoff-retry config
|
||||
:param allowed_updates: List of the update types you want your bot to receive
|
||||
By default, all used update types are enabled (resolved from handlers)
|
||||
:param handle_signals: handle signals (SIGINT/SIGTERM)
|
||||
:param close_bot_session: close bot sessions on shutdown
|
||||
:param tasks_concurrency_limit: Maximum number of concurrent updates to process
|
||||
(None = no limit), used only if handle_as_tasks is True
|
||||
:param kwargs: contextual data
|
||||
:return:
|
||||
"""
|
||||
if not bots:
|
||||
raise ValueError("At least one bot instance is required to start polling")
|
||||
if "bot" in kwargs:
|
||||
raise ValueError(
|
||||
"Keyword argument 'bot' is not acceptable, "
|
||||
"the bot instance should be passed as positional argument"
|
||||
)
|
||||
|
||||
async with self._running_lock:  # Prevent this method from running twice at once
|
||||
if self._stop_signal is None:
|
||||
self._stop_signal = Event()
|
||||
if self._stopped_signal is None:
|
||||
self._stopped_signal = Event()
|
||||
|
||||
if allowed_updates is UNSET:
|
||||
allowed_updates = self.resolve_used_update_types()
|
||||
|
||||
self._stop_signal.clear()
|
||||
self._stopped_signal.clear()
|
||||
|
||||
if handle_signals:
|
||||
loop = asyncio.get_running_loop()
|
||||
with suppress(NotImplementedError): # pragma: no cover
|
||||
# Signals handling is not supported on Windows
|
||||
# It also can't be covered on Windows
|
||||
loop.add_signal_handler(
|
||||
signal.SIGTERM, self._signal_stop_polling, signal.SIGTERM
|
||||
)
|
||||
loop.add_signal_handler(
|
||||
signal.SIGINT, self._signal_stop_polling, signal.SIGINT
|
||||
)
|
||||
|
||||
workflow_data = {
|
||||
"dispatcher": self,
|
||||
"bots": bots,
|
||||
**self.workflow_data,
|
||||
**kwargs,
|
||||
}
|
||||
if "bot" in workflow_data:
|
||||
workflow_data.pop("bot")
|
||||
|
||||
await self.emit_startup(bot=bots[-1], **workflow_data)
|
||||
loggers.dispatcher.info("Start polling")
|
||||
try:
|
||||
tasks: List[asyncio.Task[Any]] = [
|
||||
asyncio.create_task(
|
||||
self._polling(
|
||||
bot=bot,
|
||||
handle_as_tasks=handle_as_tasks,
|
||||
polling_timeout=polling_timeout,
|
||||
backoff_config=backoff_config,
|
||||
allowed_updates=allowed_updates,
|
||||
tasks_concurrency_limit=tasks_concurrency_limit,
|
||||
**workflow_data,
|
||||
)
|
||||
)
|
||||
for bot in bots
|
||||
]
|
||||
tasks.append(asyncio.create_task(self._stop_signal.wait()))
|
||||
done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
|
||||
|
||||
for task in pending:
|
||||
# (mostly) Gracefully shut down unfinished tasks
|
||||
task.cancel()
|
||||
with suppress(CancelledError):
|
||||
await task
|
||||
# Wait finished tasks to propagate unhandled exceptions
|
||||
await asyncio.gather(*done)
|
||||
|
||||
finally:
|
||||
loggers.dispatcher.info("Polling stopped")
|
||||
try:
|
||||
await self.emit_shutdown(bot=bots[-1], **workflow_data)
|
||||
finally:
|
||||
if close_bot_session:
|
||||
await asyncio.gather(*(bot.session.close() for bot in bots))
|
||||
self._stopped_signal.set()
|
||||
|
||||
def run_polling(
|
||||
self,
|
||||
*bots: Bot,
|
||||
polling_timeout: int = 10,
|
||||
handle_as_tasks: bool = True,
|
||||
backoff_config: BackoffConfig = DEFAULT_BACKOFF_CONFIG,
|
||||
allowed_updates: Optional[Union[List[str], UNSET_TYPE]] = UNSET,
|
||||
handle_signals: bool = True,
|
||||
close_bot_session: bool = True,
|
||||
tasks_concurrency_limit: Optional[int] = None,
|
||||
**kwargs: Any,
|
||||
) -> None:
|
||||
"""
|
||||
Run many bots with polling
|
||||
|
||||
:param bots: Bot instances (one or more)
|
||||
:param polling_timeout: Long-polling wait time
|
||||
:param handle_as_tasks: Run a task for each event without awaiting the result
|
||||
:param backoff_config: backoff-retry config
|
||||
:param allowed_updates: List of the update types you want your bot to receive
|
||||
:param handle_signals: handle signals (SIGINT/SIGTERM)
|
||||
:param close_bot_session: close bot sessions on shutdown
|
||||
:param tasks_concurrency_limit: Maximum number of concurrent updates to process
|
||||
(None = no limit), used only if handle_as_tasks is True
|
||||
:param kwargs: contextual data
|
||||
:return:
|
||||
"""
|
||||
with suppress(KeyboardInterrupt):
|
||||
return asyncio.run(
|
||||
self.start_polling(
|
||||
*bots,
|
||||
**kwargs,
|
||||
polling_timeout=polling_timeout,
|
||||
handle_as_tasks=handle_as_tasks,
|
||||
backoff_config=backoff_config,
|
||||
allowed_updates=allowed_updates,
|
||||
handle_signals=handle_signals,
|
||||
close_bot_session=close_bot_session,
|
||||
tasks_concurrency_limit=tasks_concurrency_limit,
|
||||
)
|
||||
)
|
||||
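The polling entry points above (`start_polling` / `run_polling`) are easiest to see in a minimal bot script. Here is a hedged sketch; the token and the echo handler are placeholders and not part of this commit:

```python
# Minimal sketch of driving the Dispatcher with run_polling.
# The token value and handler below are illustrative assumptions.
from aiogram import Bot, Dispatcher
from aiogram.types import Message

dp = Dispatcher()


@dp.message()
async def echo(message: Message) -> None:
    # Reply with the same text, or a stub when the message has no text
    await message.answer(message.text or "(no text)")


def main() -> None:
    bot = Bot(token="123456:PLACEHOLDER-TOKEN")  # hypothetical token
    # run_polling wraps start_polling in asyncio.run and suppresses KeyboardInterrupt
    dp.run_polling(bot, polling_timeout=10, handle_signals=True)


if __name__ == "__main__":
    main()
```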
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -0,0 +1,35 @@
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import Any, Awaitable, Callable, Dict, NoReturn, Optional, TypeVar, Union
|
||||
from unittest.mock import sentinel
|
||||
|
||||
from ...types import TelegramObject
|
||||
from ..middlewares.base import BaseMiddleware
|
||||
|
||||
MiddlewareEventType = TypeVar("MiddlewareEventType", bound=TelegramObject)
|
||||
NextMiddlewareType = Callable[[MiddlewareEventType, Dict[str, Any]], Awaitable[Any]]
|
||||
MiddlewareType = Union[
|
||||
BaseMiddleware,
|
||||
Callable[
|
||||
[NextMiddlewareType[MiddlewareEventType], MiddlewareEventType, Dict[str, Any]],
|
||||
Awaitable[Any],
|
||||
],
|
||||
]
|
||||
|
||||
UNHANDLED = sentinel.UNHANDLED
|
||||
REJECTED = sentinel.REJECTED
|
||||
|
||||
|
||||
class SkipHandler(Exception):
|
||||
pass
|
||||
|
||||
|
||||
class CancelHandler(Exception):
|
||||
pass
|
||||
|
||||
|
||||
def skip(message: Optional[str] = None) -> NoReturn:
|
||||
"""
|
||||
Raise a SkipHandler
|
||||
"""
|
||||
raise SkipHandler(message or "Event skipped")
|
||||
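To make the `SkipHandler` / `skip()` mechanics above concrete, here is a small hedged sketch of a handler that passes control to the next registered handler; the router and filtering logic are assumptions for illustration:

```python
# Sketch: skipping to the next handler with skip() (raises SkipHandler).
from aiogram import Router
from aiogram.dispatcher.event.bases import skip
from aiogram.types import Message

router = Router()


@router.message()
async def commands_only(message: Message) -> None:
    if not (message.text or "").startswith("!"):
        # Let the next registered handler try this event instead
        skip("Not a command-like message")
    await message.answer("Handled by the first handler")


@router.message()
async def fallback(message: Message) -> None:
    await message.answer("Handled by the fallback handler")
```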
@@ -0,0 +1,53 @@
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import Any, Callable, List
|
||||
|
||||
from .handler import CallbackType, HandlerObject
|
||||
|
||||
|
||||
class EventObserver:
|
||||
"""
|
||||
Simple events observer
|
||||
|
||||
Is used for managing events that are not related to Telegram
|
||||
(for example, startup/shutdown processes)
|
||||
|
||||
Handlers can be registered via decorator or method
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
<observer>.register(my_handler)
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
@<observer>()
|
||||
async def my_handler(*args, **kwargs): ...
|
||||
"""
|
||||
|
||||
def __init__(self) -> None:
|
||||
self.handlers: List[HandlerObject] = []
|
||||
|
||||
def register(self, callback: CallbackType) -> None:
|
||||
"""
|
||||
Register callback with filters
|
||||
"""
|
||||
self.handlers.append(HandlerObject(callback=callback))
|
||||
|
||||
async def trigger(self, *args: Any, **kwargs: Any) -> None:
|
||||
"""
|
||||
Propagate event to handlers.
|
||||
A handler will be called when all of its filters pass.
|
||||
"""
|
||||
for handler in self.handlers:
|
||||
await handler.call(*args, **kwargs)
|
||||
|
||||
def __call__(self) -> Callable[[CallbackType], CallbackType]:
|
||||
"""
|
||||
Decorator for registering event handlers
|
||||
"""
|
||||
|
||||
def wrapper(callback: CallbackType) -> CallbackType:
|
||||
self.register(callback)
|
||||
return callback
|
||||
|
||||
return wrapper
|
||||
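Since `EventObserver` backs the `startup`/`shutdown` observers on routers, a brief hedged sketch of both registration styles (decorator and `register`) may help; the callback names are assumptions:

```python
# Sketch: registering lifecycle callbacks on the startup/shutdown observers.
from aiogram import Dispatcher

dp = Dispatcher()


@dp.startup()
async def on_startup(**kwargs) -> None:
    print("Bot is starting up")


async def on_shutdown(**kwargs) -> None:
    print("Bot is shutting down")


dp.shutdown.register(on_shutdown)
```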
@@ -0,0 +1,95 @@
|
||||
import asyncio
|
||||
import contextvars
|
||||
import inspect
|
||||
import warnings
|
||||
from dataclasses import dataclass, field
|
||||
from functools import partial
|
||||
from typing import Any, Callable, Dict, List, Optional, Set, Tuple
|
||||
|
||||
from magic_filter.magic import MagicFilter as OriginalMagicFilter
|
||||
|
||||
from aiogram.dispatcher.flags import extract_flags_from_object
|
||||
from aiogram.filters.base import Filter
|
||||
from aiogram.handlers import BaseHandler
|
||||
from aiogram.utils.magic_filter import MagicFilter
|
||||
from aiogram.utils.warnings import Recommendation
|
||||
|
||||
CallbackType = Callable[..., Any]
|
||||
|
||||
|
||||
@dataclass
|
||||
class CallableObject:
|
||||
callback: CallbackType
|
||||
awaitable: bool = field(init=False)
|
||||
params: Set[str] = field(init=False)
|
||||
varkw: bool = field(init=False)
|
||||
|
||||
def __post_init__(self) -> None:
|
||||
callback = inspect.unwrap(self.callback)
|
||||
self.awaitable = inspect.isawaitable(callback) or inspect.iscoroutinefunction(callback)
|
||||
spec = inspect.getfullargspec(callback)
|
||||
self.params = {*spec.args, *spec.kwonlyargs}
|
||||
self.varkw = spec.varkw is not None
|
||||
|
||||
def _prepare_kwargs(self, kwargs: Dict[str, Any]) -> Dict[str, Any]:
|
||||
if self.varkw:
|
||||
return kwargs
|
||||
|
||||
return {k: kwargs[k] for k in self.params if k in kwargs}
|
||||
|
||||
async def call(self, *args: Any, **kwargs: Any) -> Any:
|
||||
wrapped = partial(self.callback, *args, **self._prepare_kwargs(kwargs))
|
||||
if self.awaitable:
|
||||
return await wrapped()
|
||||
return await asyncio.to_thread(wrapped)
|
||||
|
||||
|
||||
@dataclass
|
||||
class FilterObject(CallableObject):
|
||||
magic: Optional[MagicFilter] = None
|
||||
|
||||
def __post_init__(self) -> None:
|
||||
if isinstance(self.callback, OriginalMagicFilter):
|
||||
# MagicFilter instance is callable but generates
|
||||
# only "CallOperation" instead of applying the filter
|
||||
self.magic = self.callback
|
||||
self.callback = self.callback.resolve
|
||||
if not isinstance(self.magic, MagicFilter):
|
||||
# Issue: https://github.com/aiogram/aiogram/issues/990
|
||||
warnings.warn(
|
||||
category=Recommendation,
|
||||
message="You are using F provided by magic_filter package directly, "
|
||||
"but it lacks `.as_()` extension."
|
||||
"\n Please change the import statement: from `from magic_filter import F` "
|
||||
"to `from aiogram import F` to silence this warning.",
|
||||
stacklevel=6,
|
||||
)
|
||||
|
||||
super(FilterObject, self).__post_init__()
|
||||
|
||||
if isinstance(self.callback, Filter):
|
||||
self.awaitable = True
|
||||
|
||||
|
||||
@dataclass
|
||||
class HandlerObject(CallableObject):
|
||||
filters: Optional[List[FilterObject]] = None
|
||||
flags: Dict[str, Any] = field(default_factory=dict)
|
||||
|
||||
def __post_init__(self) -> None:
|
||||
super(HandlerObject, self).__post_init__()
|
||||
callback = inspect.unwrap(self.callback)
|
||||
if inspect.isclass(callback) and issubclass(callback, BaseHandler):
|
||||
self.awaitable = True
|
||||
self.flags.update(extract_flags_from_object(callback))
|
||||
|
||||
async def check(self, *args: Any, **kwargs: Any) -> Tuple[bool, Dict[str, Any]]:
|
||||
if not self.filters:
|
||||
return True, kwargs
|
||||
for event_filter in self.filters:
|
||||
check = await event_filter.call(*args, **kwargs)
|
||||
if not check:
|
||||
return False, kwargs
|
||||
if isinstance(check, dict):
|
||||
kwargs.update(check)
|
||||
return True, kwargs
|
||||
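The `CallableObject`/`HandlerObject` machinery above filters keyword arguments down to what the callback accepts and runs synchronous callables in a thread. A hedged, self-contained sketch follows; the callback and filter are made up for illustration:

```python
# Sketch: HandlerObject checks filters, then calls the callback with only
# the keyword arguments its signature accepts.
import asyncio

from aiogram.dispatcher.event.handler import FilterObject, HandlerObject


async def greet(value: int, extra: str = "hi") -> str:
    return f"{extra}: {value}"


handler = HandlerObject(
    callback=greet,
    filters=[FilterObject(lambda value: value > 0)],  # sync filter runs via asyncio.to_thread
)


async def demo() -> None:
    ok, kwargs = await handler.check(10, extra="hello", unused="dropped")
    if ok:
        print(await handler.call(10, **kwargs))  # -> "hello: 10"


asyncio.run(demo())
```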
@@ -0,0 +1,141 @@
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import TYPE_CHECKING, Any, Callable, Dict, List, Optional
|
||||
|
||||
from aiogram.dispatcher.middlewares.manager import MiddlewareManager
|
||||
|
||||
from ...exceptions import UnsupportedKeywordArgument
|
||||
from ...filters.base import Filter
|
||||
from ...types import TelegramObject
|
||||
from .bases import UNHANDLED, MiddlewareType, SkipHandler
|
||||
from .handler import CallbackType, FilterObject, HandlerObject
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from aiogram.dispatcher.router import Router
|
||||
|
||||
|
||||
class TelegramEventObserver:
|
||||
"""
|
||||
Event observer for Telegram events
|
||||
|
||||
Here you can register handlers with filters.
|
||||
This observer stops event propagation when the first matching handler passes.
|
||||
"""
|
||||
|
||||
def __init__(self, router: Router, event_name: str) -> None:
|
||||
self.router: Router = router
|
||||
self.event_name: str = event_name
|
||||
|
||||
self.handlers: List[HandlerObject] = []
|
||||
|
||||
self.middleware = MiddlewareManager()
|
||||
self.outer_middleware = MiddlewareManager()
|
||||
|
||||
# Reuse the filter check method from the already implemented handler object
|
||||
# with a dummy callback that will never be used
|
||||
self._handler = HandlerObject(callback=lambda: True, filters=[])
|
||||
|
||||
def filter(self, *filters: CallbackType) -> None:
|
||||
"""
|
||||
Register filter for all handlers of this event observer
|
||||
|
||||
:param filters: positional filters
|
||||
"""
|
||||
if self._handler.filters is None:
|
||||
self._handler.filters = []
|
||||
self._handler.filters.extend([FilterObject(filter_) for filter_ in filters])
|
||||
|
||||
def _resolve_middlewares(self) -> List[MiddlewareType[TelegramObject]]:
|
||||
middlewares: List[MiddlewareType[TelegramObject]] = []
|
||||
for router in reversed(tuple(self.router.chain_head)):
|
||||
observer = router.observers.get(self.event_name)
|
||||
if observer:
|
||||
middlewares.extend(observer.middleware)
|
||||
|
||||
return middlewares
|
||||
|
||||
def register(
|
||||
self,
|
||||
callback: CallbackType,
|
||||
*filters: CallbackType,
|
||||
flags: Optional[Dict[str, Any]] = None,
|
||||
**kwargs: Any,
|
||||
) -> CallbackType:
|
||||
"""
|
||||
Register event handler
|
||||
"""
|
||||
if kwargs:
|
||||
raise UnsupportedKeywordArgument(
|
||||
"Passing any additional keyword arguments to the registrar method "
|
||||
"is not supported.\n"
|
||||
"This error may be caused when you are trying to register filters like in 2.x "
|
||||
"version of this framework, if it's true just look at correspoding "
|
||||
"documentation pages.\n"
|
||||
f"Please remove the {set(kwargs.keys())} arguments from this call.\n"
|
||||
)
|
||||
|
||||
if flags is None:
|
||||
flags = {}
|
||||
|
||||
for item in filters:
|
||||
if isinstance(item, Filter):
|
||||
item.update_handler_flags(flags=flags)
|
||||
|
||||
self.handlers.append(
|
||||
HandlerObject(
|
||||
callback=callback,
|
||||
filters=[FilterObject(filter_) for filter_ in filters],
|
||||
flags=flags,
|
||||
)
|
||||
)
|
||||
|
||||
return callback
|
||||
|
||||
def wrap_outer_middleware(
|
||||
self, callback: Any, event: TelegramObject, data: Dict[str, Any]
|
||||
) -> Any:
|
||||
wrapped_outer = self.middleware.wrap_middlewares(
|
||||
self.outer_middleware,
|
||||
callback,
|
||||
)
|
||||
return wrapped_outer(event, data)
|
||||
|
||||
def check_root_filters(self, event: TelegramObject, **kwargs: Any) -> Any:
|
||||
return self._handler.check(event, **kwargs)
|
||||
|
||||
async def trigger(self, event: TelegramObject, **kwargs: Any) -> Any:
|
||||
"""
|
||||
Propagate event to handlers and stop propagation on the first match.
|
||||
A handler will be called when all of its filters pass.
|
||||
"""
|
||||
for handler in self.handlers:
|
||||
kwargs["handler"] = handler
|
||||
result, data = await handler.check(event, **kwargs)
|
||||
if result:
|
||||
kwargs.update(data)
|
||||
try:
|
||||
wrapped_inner = self.outer_middleware.wrap_middlewares(
|
||||
self._resolve_middlewares(),
|
||||
handler.call,
|
||||
)
|
||||
return await wrapped_inner(event, kwargs)
|
||||
except SkipHandler:
|
||||
continue
|
||||
|
||||
return UNHANDLED
|
||||
|
||||
def __call__(
|
||||
self,
|
||||
*filters: CallbackType,
|
||||
flags: Optional[Dict[str, Any]] = None,
|
||||
**kwargs: Any,
|
||||
) -> Callable[[CallbackType], CallbackType]:
|
||||
"""
|
||||
Decorator for registering event handlers
|
||||
"""
|
||||
|
||||
def wrapper(callback: CallbackType) -> CallbackType:
|
||||
self.register(callback, *filters, flags=flags, **kwargs)
|
||||
return callback
|
||||
|
||||
return wrapper
|
||||
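For the `TelegramEventObserver` above, here is a hedged sketch of the two registration paths plus an observer-wide root filter; the filters and handlers are illustrative assumptions:

```python
# Sketch: an observer-level filter applies to every handler on that observer.
from aiogram import F, Router
from aiogram.types import Message

router = Router(name="private-only")

# Root filter: only private chats reach the handlers registered below
router.message.filter(F.chat.type == "private")


@router.message(F.text == "/start")
async def start(message: Message) -> None:
    await message.answer("Welcome!")


async def help_handler(message: Message) -> None:
    await message.answer("Help text")


router.message.register(help_handler, F.text == "/help")
```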
127
venv/lib/python3.12/site-packages/aiogram/dispatcher/flags.py
Normal file
@@ -0,0 +1,127 @@
|
||||
from dataclasses import dataclass
|
||||
from typing import TYPE_CHECKING, Any, Callable, Dict, Optional, Union, cast, overload
|
||||
|
||||
from magic_filter import AttrDict, MagicFilter
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from aiogram.dispatcher.event.handler import HandlerObject
|
||||
|
||||
|
||||
@dataclass(frozen=True)
|
||||
class Flag:
|
||||
name: str
|
||||
value: Any
|
||||
|
||||
|
||||
@dataclass(frozen=True)
|
||||
class FlagDecorator:
|
||||
flag: Flag
|
||||
|
||||
@classmethod
|
||||
def _with_flag(cls, flag: Flag) -> "FlagDecorator":
|
||||
return cls(flag)
|
||||
|
||||
def _with_value(self, value: Any) -> "FlagDecorator":
|
||||
new_flag = Flag(self.flag.name, value)
|
||||
return self._with_flag(new_flag)
|
||||
|
||||
@overload
|
||||
def __call__(self, value: Callable[..., Any], /) -> Callable[..., Any]: # type: ignore
|
||||
pass
|
||||
|
||||
@overload
|
||||
def __call__(self, value: Any, /) -> "FlagDecorator":
|
||||
pass
|
||||
|
||||
@overload
|
||||
def __call__(self, **kwargs: Any) -> "FlagDecorator":
|
||||
pass
|
||||
|
||||
def __call__(
|
||||
self,
|
||||
value: Optional[Any] = None,
|
||||
**kwargs: Any,
|
||||
) -> Union[Callable[..., Any], "FlagDecorator"]:
|
||||
if value and kwargs:
|
||||
raise ValueError("The arguments `value` and **kwargs can not be used together")
|
||||
|
||||
if value is not None and callable(value):
|
||||
value.aiogram_flag = {
|
||||
**extract_flags_from_object(value),
|
||||
self.flag.name: self.flag.value,
|
||||
}
|
||||
return cast(Callable[..., Any], value)
|
||||
return self._with_value(AttrDict(kwargs) if value is None else value)
|
||||
|
||||
|
||||
if TYPE_CHECKING:
|
||||
|
||||
class _ChatActionFlagProtocol(FlagDecorator):
|
||||
def __call__( # type: ignore[override]
|
||||
self,
|
||||
action: str = ...,
|
||||
interval: float = ...,
|
||||
initial_sleep: float = ...,
|
||||
**kwargs: Any,
|
||||
) -> FlagDecorator:
|
||||
pass
|
||||
|
||||
|
||||
class FlagGenerator:
|
||||
def __getattr__(self, name: str) -> FlagDecorator:
|
||||
if name[0] == "_":
|
||||
raise AttributeError("Flag name must NOT start with underscore")
|
||||
return FlagDecorator(Flag(name, True))
|
||||
|
||||
if TYPE_CHECKING:
|
||||
chat_action: _ChatActionFlagProtocol
|
||||
|
||||
|
||||
def extract_flags_from_object(obj: Any) -> Dict[str, Any]:
|
||||
if not hasattr(obj, "aiogram_flag"):
|
||||
return {}
|
||||
return cast(Dict[str, Any], obj.aiogram_flag)
|
||||
|
||||
|
||||
def extract_flags(handler: Union["HandlerObject", Dict[str, Any]]) -> Dict[str, Any]:
|
||||
"""
|
||||
Extract flags from handler or middleware context data
|
||||
|
||||
:param handler: handler object or data
|
||||
:return: dictionary with all handler flags
|
||||
"""
|
||||
if isinstance(handler, dict) and "handler" in handler:
|
||||
handler = handler["handler"]
|
||||
if hasattr(handler, "flags"):
|
||||
return handler.flags
|
||||
return {}
|
||||
|
||||
|
||||
def get_flag(
|
||||
handler: Union["HandlerObject", Dict[str, Any]],
|
||||
name: str,
|
||||
*,
|
||||
default: Optional[Any] = None,
|
||||
) -> Any:
|
||||
"""
|
||||
Get flag by name
|
||||
|
||||
:param handler: handler object or data
|
||||
:param name: name of the flag
|
||||
:param default: default value (None)
|
||||
:return: value of the flag or default
|
||||
"""
|
||||
flags = extract_flags(handler)
|
||||
return flags.get(name, default)
|
||||
|
||||
|
||||
def check_flags(handler: Union["HandlerObject", Dict[str, Any]], magic: MagicFilter) -> Any:
|
||||
"""
|
||||
Check flags via magic filter
|
||||
|
||||
:param handler: handler object or data
|
||||
:param magic: instance of the magic
|
||||
:return: the result of magic filter check
|
||||
"""
|
||||
flags = extract_flags(handler)
|
||||
return magic.resolve(AttrDict(flags))
|
||||
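A hedged sketch of the flag workflow defined above: a `FlagGenerator` attaches a flag to a handler, and an inner middleware reads it back with `get_flag`. The middleware body and flag name are illustrative assumptions; the generator is constructed directly here for clarity:

```python
# Sketch: attaching a flag to a handler and reading it from an inner middleware.
from typing import Any, Awaitable, Callable, Dict

from aiogram import Router
from aiogram.dispatcher.flags import FlagGenerator, get_flag
from aiogram.types import Message, TelegramObject

flags = FlagGenerator()  # constructed directly here for illustration
router = Router()


@router.message.middleware()
async def chat_action_middleware(
    handler: Callable[[TelegramObject, Dict[str, Any]], Awaitable[Any]],
    event: TelegramObject,
    data: Dict[str, Any],
) -> Any:
    action = get_flag(data, "chat_action", default=None)
    if action:
        print(f"Handler requested chat action: {action}")
    return await handler(event, data)


@router.message()
@flags.chat_action("typing")  # stored on the function as `aiogram_flag`
async def long_handler(message: Message) -> None:
    await message.answer("Done")
```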
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -0,0 +1,29 @@
|
||||
from abc import ABC, abstractmethod
|
||||
from typing import Any, Awaitable, Callable, Dict, TypeVar
|
||||
|
||||
from aiogram.types import TelegramObject
|
||||
|
||||
T = TypeVar("T")
|
||||
|
||||
|
||||
class BaseMiddleware(ABC):
|
||||
"""
|
||||
Generic middleware class
|
||||
"""
|
||||
|
||||
@abstractmethod
|
||||
async def __call__(
|
||||
self,
|
||||
handler: Callable[[TelegramObject, Dict[str, Any]], Awaitable[Any]],
|
||||
event: TelegramObject,
|
||||
data: Dict[str, Any],
|
||||
) -> Any: # pragma: no cover
|
||||
"""
|
||||
Execute middleware
|
||||
|
||||
:param handler: Wrapped handler in middlewares chain
|
||||
:param event: Incoming event (Subclass of :class:`aiogram.types.base.TelegramObject`)
|
||||
:param data: Contextual data. Will be mapped to handler arguments
|
||||
:return: :class:`Any`
|
||||
"""
|
||||
pass
|
||||
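Here is a hedged sketch of a concrete subclass of `BaseMiddleware`; the timing logic and the `processing_time` key are assumptions for illustration:

```python
# Sketch: a concrete middleware measuring handler execution time.
import time
from typing import Any, Awaitable, Callable, Dict

from aiogram import Router
from aiogram.dispatcher.middlewares.base import BaseMiddleware
from aiogram.types import TelegramObject


class TimingMiddleware(BaseMiddleware):
    async def __call__(
        self,
        handler: Callable[[TelegramObject, Dict[str, Any]], Awaitable[Any]],
        event: TelegramObject,
        data: Dict[str, Any],
    ) -> Any:
        started = time.monotonic()
        try:
            return await handler(event, data)
        finally:
            # Hypothetical key: exposed to anything later in the chain
            data["processing_time"] = time.monotonic() - started


router = Router()
router.message.middleware(TimingMiddleware())
```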
@@ -0,0 +1,97 @@
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import TYPE_CHECKING, TypedDict
|
||||
|
||||
from typing_extensions import NotRequired
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from aiogram import Bot, Dispatcher, Router
|
||||
from aiogram.dispatcher.event.handler import HandlerObject
|
||||
from aiogram.dispatcher.middlewares.user_context import EventContext
|
||||
from aiogram.fsm.context import FSMContext
|
||||
from aiogram.fsm.storage.base import BaseStorage
|
||||
from aiogram.types import Chat, Update, User
|
||||
from aiogram.utils.i18n import I18n, I18nMiddleware
|
||||
|
||||
|
||||
class DispatcherData(TypedDict, total=False):
|
||||
"""
|
||||
Dispatcher and bot related data.
|
||||
"""
|
||||
|
||||
dispatcher: Dispatcher
|
||||
"""Instance of the Dispatcher from which the handler was called."""
|
||||
bot: Bot
|
||||
"""Bot that received the update."""
|
||||
bots: NotRequired[list[Bot]]
|
||||
"""List of all bots in the Dispatcher. Used only in polling mode."""
|
||||
event_update: Update
|
||||
"""Update object that triggered the handler."""
|
||||
event_router: Router
|
||||
"""Router that was used to find the handler."""
|
||||
handler: NotRequired[HandlerObject]
|
||||
"""Handler object that was called.
|
||||
Available only in the handler itself and inner middlewares."""
|
||||
|
||||
|
||||
class UserContextData(TypedDict, total=False):
|
||||
"""
|
||||
Event context related data about user and chat.
|
||||
"""
|
||||
|
||||
event_context: EventContext
|
||||
"""Event context object that contains user and chat data."""
|
||||
event_from_user: NotRequired[User]
|
||||
"""User object that triggered the handler."""
|
||||
event_chat: NotRequired[Chat]
|
||||
"""Chat object that triggered the handler.
|
||||
.. deprecated:: 3.5.0
|
||||
Use :attr:`event_context.chat` instead."""
|
||||
event_thread_id: NotRequired[int]
|
||||
"""Thread ID of the chat that triggered the handler.
|
||||
.. deprecated:: 3.5.0
|
||||
Use :attr:`event_context.thread_id` instead."""
|
||||
event_business_connection_id: NotRequired[str]
|
||||
"""Business connection ID of the chat that triggered the handler.
|
||||
.. deprecated:: 3.5.0
|
||||
Use :attr:`event_context.business_connection_id` instead."""
|
||||
|
||||
|
||||
class FSMData(TypedDict, total=False):
|
||||
"""
|
||||
FSM related data.
|
||||
"""
|
||||
|
||||
fsm_storage: BaseStorage
|
||||
"""Storage used for FSM."""
|
||||
state: NotRequired[FSMContext]
|
||||
"""Current state of the FSM."""
|
||||
raw_state: NotRequired[str | None]
|
||||
"""Raw state of the FSM."""
|
||||
|
||||
|
||||
class I18nData(TypedDict, total=False):
|
||||
"""
|
||||
I18n related data.
|
||||
|
||||
Not included by default; add it to your own Data class if you need it.
|
||||
"""
|
||||
|
||||
i18n: I18n
|
||||
"""I18n object."""
|
||||
i18n_middleware: I18nMiddleware
|
||||
"""I18n middleware."""
|
||||
|
||||
|
||||
class MiddlewareData(
|
||||
DispatcherData,
|
||||
UserContextData,
|
||||
FSMData,
|
||||
# I18nData,  # Disabled by default; add it to your own Data class if you need it.
|
||||
total=False,
|
||||
):
|
||||
"""
|
||||
Data passed to the handler by the middlewares.
|
||||
|
||||
You can add your own data by extending this class.
|
||||
"""
|
||||
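Because the TypedDicts above are meant to be extended, here is a hedged sketch of adding a custom key; the import path and the `config` key are assumptions based on where this file appears to live:

```python
# Sketch: extending MiddlewareData with a project-specific key.
from typing import Any, Dict

# Assumed module path for the TypedDicts defined above
from aiogram.dispatcher.middlewares.data import MiddlewareData


class MyMiddlewareData(MiddlewareData, total=False):
    config: Dict[str, Any]
    """Custom key injected via workflow data or handler kwargs (assumption)."""
```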
@@ -0,0 +1,36 @@
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import TYPE_CHECKING, Any, Awaitable, Callable, Dict, cast
|
||||
|
||||
from ...types import TelegramObject, Update
|
||||
from ...types.error_event import ErrorEvent
|
||||
from ..event.bases import UNHANDLED, CancelHandler, SkipHandler
|
||||
from .base import BaseMiddleware
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from ..router import Router
|
||||
|
||||
|
||||
class ErrorsMiddleware(BaseMiddleware):
|
||||
def __init__(self, router: Router):
|
||||
self.router = router
|
||||
|
||||
async def __call__(
|
||||
self,
|
||||
handler: Callable[[TelegramObject, Dict[str, Any]], Awaitable[Any]],
|
||||
event: TelegramObject,
|
||||
data: Dict[str, Any],
|
||||
) -> Any:
|
||||
try:
|
||||
return await handler(event, data)
|
||||
except (SkipHandler, CancelHandler): # pragma: no cover
|
||||
raise
|
||||
except Exception as e:
|
||||
response = await self.router.propagate_event(
|
||||
update_type="error",
|
||||
event=ErrorEvent(update=cast(Update, event), exception=e),
|
||||
**data,
|
||||
)
|
||||
if response is not UNHANDLED:
|
||||
return response
|
||||
raise
|
||||
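`ErrorsMiddleware` above routes unhandled exceptions into the `error` observer, so a hedged sketch of a matching error handler may be useful; the handler body is an assumption:

```python
# Sketch: an error handler that ErrorsMiddleware will propagate exceptions to.
from aiogram import Router
from aiogram.types import ErrorEvent

router = Router()


@router.error()
async def on_error(event: ErrorEvent) -> None:
    # event.update is the original update, event.exception is the raised error
    print(f"Update {event.update.update_id} failed: {event.exception!r}")
```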
@@ -0,0 +1,65 @@
|
||||
import functools
|
||||
from typing import Any, Callable, Dict, List, Optional, Sequence, Union, overload
|
||||
|
||||
from aiogram.dispatcher.event.bases import (
|
||||
MiddlewareEventType,
|
||||
MiddlewareType,
|
||||
NextMiddlewareType,
|
||||
)
|
||||
from aiogram.dispatcher.event.handler import CallbackType
|
||||
from aiogram.types import TelegramObject
|
||||
|
||||
|
||||
class MiddlewareManager(Sequence[MiddlewareType[TelegramObject]]):
|
||||
def __init__(self) -> None:
|
||||
self._middlewares: List[MiddlewareType[TelegramObject]] = []
|
||||
|
||||
def register(
|
||||
self,
|
||||
middleware: MiddlewareType[TelegramObject],
|
||||
) -> MiddlewareType[TelegramObject]:
|
||||
self._middlewares.append(middleware)
|
||||
return middleware
|
||||
|
||||
def unregister(self, middleware: MiddlewareType[TelegramObject]) -> None:
|
||||
self._middlewares.remove(middleware)
|
||||
|
||||
def __call__(
|
||||
self,
|
||||
middleware: Optional[MiddlewareType[TelegramObject]] = None,
|
||||
) -> Union[
|
||||
Callable[[MiddlewareType[TelegramObject]], MiddlewareType[TelegramObject]],
|
||||
MiddlewareType[TelegramObject],
|
||||
]:
|
||||
if middleware is None:
|
||||
return self.register
|
||||
return self.register(middleware)
|
||||
|
||||
@overload
|
||||
def __getitem__(self, item: int) -> MiddlewareType[TelegramObject]:
|
||||
pass
|
||||
|
||||
@overload
|
||||
def __getitem__(self, item: slice) -> Sequence[MiddlewareType[TelegramObject]]:
|
||||
pass
|
||||
|
||||
def __getitem__(
|
||||
self, item: Union[int, slice]
|
||||
) -> Union[MiddlewareType[TelegramObject], Sequence[MiddlewareType[TelegramObject]]]:
|
||||
return self._middlewares[item]
|
||||
|
||||
def __len__(self) -> int:
|
||||
return len(self._middlewares)
|
||||
|
||||
@staticmethod
|
||||
def wrap_middlewares(
|
||||
middlewares: Sequence[MiddlewareType[MiddlewareEventType]], handler: CallbackType
|
||||
) -> NextMiddlewareType[MiddlewareEventType]:
|
||||
@functools.wraps(handler)
|
||||
def handler_wrapper(event: TelegramObject, kwargs: Dict[str, Any]) -> Any:
|
||||
return handler(event, **kwargs)
|
||||
|
||||
middleware = handler_wrapper
|
||||
for m in reversed(middlewares):
|
||||
middleware = functools.partial(m, middleware) # type: ignore[assignment]
|
||||
return middleware
|
||||
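`MiddlewareManager` above accepts middlewares either by a direct call or as a decorator (calling it without arguments returns `register`). A hedged sketch of both forms follows; the middleware bodies are assumptions:

```python
# Sketch: the two registration forms supported by MiddlewareManager.
from typing import Any, Awaitable, Callable, Dict

from aiogram import Router
from aiogram.types import TelegramObject

router = Router()


# 1) Direct call
async def log_event(
    handler: Callable[[TelegramObject, Dict[str, Any]], Awaitable[Any]],
    event: TelegramObject,
    data: Dict[str, Any],
) -> Any:
    print(f"Processing {type(event).__name__}")
    return await handler(event, data)


router.message.middleware(log_event)


# 2) Decorator form: calling the manager with no argument returns `register`
@router.message.outer_middleware()
async def inject_marker(
    handler: Callable[[TelegramObject, Dict[str, Any]], Awaitable[Any]],
    event: TelegramObject,
    data: Dict[str, Any],
) -> Any:
    data["marker"] = "outer"  # hypothetical key for downstream handlers
    return await handler(event, data)
```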
@@ -0,0 +1,182 @@
|
||||
from dataclasses import dataclass
|
||||
from typing import Any, Awaitable, Callable, Dict, Optional
|
||||
|
||||
from aiogram.dispatcher.middlewares.base import BaseMiddleware
|
||||
from aiogram.types import (
|
||||
Chat,
|
||||
ChatBoostSourcePremium,
|
||||
InaccessibleMessage,
|
||||
TelegramObject,
|
||||
Update,
|
||||
User,
|
||||
)
|
||||
|
||||
EVENT_CONTEXT_KEY = "event_context"
|
||||
|
||||
EVENT_FROM_USER_KEY = "event_from_user"
|
||||
EVENT_CHAT_KEY = "event_chat"
|
||||
EVENT_THREAD_ID_KEY = "event_thread_id"
|
||||
|
||||
|
||||
@dataclass(frozen=True)
|
||||
class EventContext:
|
||||
chat: Optional[Chat] = None
|
||||
user: Optional[User] = None
|
||||
thread_id: Optional[int] = None
|
||||
business_connection_id: Optional[str] = None
|
||||
|
||||
@property
|
||||
def user_id(self) -> Optional[int]:
|
||||
return self.user.id if self.user else None
|
||||
|
||||
@property
|
||||
def chat_id(self) -> Optional[int]:
|
||||
return self.chat.id if self.chat else None
|
||||
|
||||
|
||||
class UserContextMiddleware(BaseMiddleware):
|
||||
async def __call__(
|
||||
self,
|
||||
handler: Callable[[TelegramObject, Dict[str, Any]], Awaitable[Any]],
|
||||
event: TelegramObject,
|
||||
data: Dict[str, Any],
|
||||
) -> Any:
|
||||
if not isinstance(event, Update):
|
||||
raise RuntimeError("UserContextMiddleware got an unexpected event type!")
|
||||
event_context = data[EVENT_CONTEXT_KEY] = self.resolve_event_context(event=event)
|
||||
|
||||
# Backward compatibility
|
||||
if event_context.user is not None:
|
||||
data[EVENT_FROM_USER_KEY] = event_context.user
|
||||
if event_context.chat is not None:
|
||||
data[EVENT_CHAT_KEY] = event_context.chat
|
||||
if event_context.thread_id is not None:
|
||||
data[EVENT_THREAD_ID_KEY] = event_context.thread_id
|
||||
|
||||
return await handler(event, data)
|
||||
|
||||
@classmethod
|
||||
def resolve_event_context(cls, event: Update) -> EventContext:
|
||||
"""
|
||||
Resolve chat and user instances from the Update object
|
||||
"""
|
||||
if event.message:
|
||||
return EventContext(
|
||||
chat=event.message.chat,
|
||||
user=event.message.from_user,
|
||||
thread_id=(
|
||||
event.message.message_thread_id if event.message.is_topic_message else None
|
||||
),
|
||||
)
|
||||
if event.edited_message:
|
||||
return EventContext(
|
||||
chat=event.edited_message.chat,
|
||||
user=event.edited_message.from_user,
|
||||
thread_id=(
|
||||
event.edited_message.message_thread_id
|
||||
if event.edited_message.is_topic_message
|
||||
else None
|
||||
),
|
||||
)
|
||||
if event.channel_post:
|
||||
return EventContext(chat=event.channel_post.chat)
|
||||
if event.edited_channel_post:
|
||||
return EventContext(chat=event.edited_channel_post.chat)
|
||||
if event.inline_query:
|
||||
return EventContext(user=event.inline_query.from_user)
|
||||
if event.chosen_inline_result:
|
||||
return EventContext(user=event.chosen_inline_result.from_user)
|
||||
if event.callback_query:
|
||||
callback_query_message = event.callback_query.message
|
||||
if callback_query_message:
|
||||
return EventContext(
|
||||
chat=callback_query_message.chat,
|
||||
user=event.callback_query.from_user,
|
||||
thread_id=(
|
||||
callback_query_message.message_thread_id
|
||||
if not isinstance(callback_query_message, InaccessibleMessage)
|
||||
and callback_query_message.is_topic_message
|
||||
else None
|
||||
),
|
||||
business_connection_id=(
|
||||
callback_query_message.business_connection_id
|
||||
if not isinstance(callback_query_message, InaccessibleMessage)
|
||||
else None
|
||||
),
|
||||
)
|
||||
return EventContext(user=event.callback_query.from_user)
|
||||
if event.shipping_query:
|
||||
return EventContext(user=event.shipping_query.from_user)
|
||||
if event.pre_checkout_query:
|
||||
return EventContext(user=event.pre_checkout_query.from_user)
|
||||
if event.poll_answer:
|
||||
return EventContext(
|
||||
chat=event.poll_answer.voter_chat,
|
||||
user=event.poll_answer.user,
|
||||
)
|
||||
if event.my_chat_member:
|
||||
return EventContext(
|
||||
chat=event.my_chat_member.chat, user=event.my_chat_member.from_user
|
||||
)
|
||||
if event.chat_member:
|
||||
return EventContext(chat=event.chat_member.chat, user=event.chat_member.from_user)
|
||||
if event.chat_join_request:
|
||||
return EventContext(
|
||||
chat=event.chat_join_request.chat, user=event.chat_join_request.from_user
|
||||
)
|
||||
if event.message_reaction:
|
||||
return EventContext(
|
||||
chat=event.message_reaction.chat,
|
||||
user=event.message_reaction.user,
|
||||
)
|
||||
if event.message_reaction_count:
|
||||
return EventContext(chat=event.message_reaction_count.chat)
|
||||
if event.chat_boost:
|
||||
# We only check the premium source because only it has a sender user;
|
||||
# other sources have a user as well, but that user is the recipient, not the sender
|
||||
if isinstance(event.chat_boost.boost.source, ChatBoostSourcePremium):
|
||||
return EventContext(
|
||||
chat=event.chat_boost.chat,
|
||||
user=event.chat_boost.boost.source.user,
|
||||
)
|
||||
|
||||
return EventContext(chat=event.chat_boost.chat)
|
||||
if event.removed_chat_boost:
|
||||
return EventContext(chat=event.removed_chat_boost.chat)
|
||||
if event.deleted_business_messages:
|
||||
return EventContext(
|
||||
chat=event.deleted_business_messages.chat,
|
||||
business_connection_id=event.deleted_business_messages.business_connection_id,
|
||||
)
|
||||
if event.business_connection:
|
||||
return EventContext(
|
||||
user=event.business_connection.user,
|
||||
business_connection_id=event.business_connection.id,
|
||||
)
|
||||
if event.business_message:
|
||||
return EventContext(
|
||||
chat=event.business_message.chat,
|
||||
user=event.business_message.from_user,
|
||||
thread_id=(
|
||||
event.business_message.message_thread_id
|
||||
if event.business_message.is_topic_message
|
||||
else None
|
||||
),
|
||||
business_connection_id=event.business_message.business_connection_id,
|
||||
)
|
||||
if event.edited_business_message:
|
||||
return EventContext(
|
||||
chat=event.edited_business_message.chat,
|
||||
user=event.edited_business_message.from_user,
|
||||
thread_id=(
|
||||
event.edited_business_message.message_thread_id
|
||||
if event.edited_business_message.is_topic_message
|
||||
else None
|
||||
),
|
||||
business_connection_id=event.edited_business_message.business_connection_id,
|
||||
)
|
||||
if event.purchased_paid_media:
|
||||
return EventContext(
|
||||
user=event.purchased_paid_media.from_user,
|
||||
)
|
||||
return EventContext()
|
||||
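`UserContextMiddleware` above injects the resolved `EventContext` into handler data under the `event_context` key, so handlers can request it by name. A hedged sketch (the handler itself is an assumption):

```python
# Sketch: consuming the injected EventContext from handler arguments.
from aiogram import Router
from aiogram.dispatcher.middlewares.user_context import EventContext
from aiogram.types import Message

router = Router()


@router.message()
async def whoami(message: Message, event_context: EventContext) -> None:
    await message.answer(
        f"user_id={event_context.user_id}, "
        f"chat_id={event_context.chat_id}, "
        f"thread_id={event_context.thread_id}"
    )
```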
275
venv/lib/python3.12/site-packages/aiogram/dispatcher/router.py
Normal file
@@ -0,0 +1,275 @@
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import Any, Dict, Final, Generator, List, Optional, Set
|
||||
|
||||
from ..types import TelegramObject
|
||||
from .event.bases import REJECTED, UNHANDLED
|
||||
from .event.event import EventObserver
|
||||
from .event.telegram import TelegramEventObserver
|
||||
|
||||
INTERNAL_UPDATE_TYPES: Final[frozenset[str]] = frozenset({"update", "error"})
|
||||
|
||||
|
||||
class Router:
|
||||
"""
|
||||
Router can route updates and their nested update types, such as messages, callback queries,
|
||||
polls, and all other event types.
|
||||
|
||||
Event handlers can be registered in an observer in two ways:
|
||||
|
||||
- By observer method - :obj:`router.<event_type>.register(handler, <filters, ...>)`
|
||||
- By decorator - :obj:`@router.<event_type>(<filters, ...>)`
|
||||
"""
|
||||
|
||||
def __init__(self, *, name: Optional[str] = None) -> None:
|
||||
"""
|
||||
:param name: Optional router name, can be useful for debugging
|
||||
"""
|
||||
|
||||
self.name = name or hex(id(self))
|
||||
|
||||
self._parent_router: Optional[Router] = None
|
||||
self.sub_routers: List[Router] = []
|
||||
|
||||
# Observers
|
||||
self.message = TelegramEventObserver(router=self, event_name="message")
|
||||
self.edited_message = TelegramEventObserver(router=self, event_name="edited_message")
|
||||
self.channel_post = TelegramEventObserver(router=self, event_name="channel_post")
|
||||
self.edited_channel_post = TelegramEventObserver(
|
||||
router=self, event_name="edited_channel_post"
|
||||
)
|
||||
self.inline_query = TelegramEventObserver(router=self, event_name="inline_query")
|
||||
self.chosen_inline_result = TelegramEventObserver(
|
||||
router=self, event_name="chosen_inline_result"
|
||||
)
|
||||
self.callback_query = TelegramEventObserver(router=self, event_name="callback_query")
|
||||
self.shipping_query = TelegramEventObserver(router=self, event_name="shipping_query")
|
||||
self.pre_checkout_query = TelegramEventObserver(
|
||||
router=self, event_name="pre_checkout_query"
|
||||
)
|
||||
self.poll = TelegramEventObserver(router=self, event_name="poll")
|
||||
self.poll_answer = TelegramEventObserver(router=self, event_name="poll_answer")
|
||||
self.my_chat_member = TelegramEventObserver(router=self, event_name="my_chat_member")
|
||||
self.chat_member = TelegramEventObserver(router=self, event_name="chat_member")
|
||||
self.chat_join_request = TelegramEventObserver(router=self, event_name="chat_join_request")
|
||||
self.message_reaction = TelegramEventObserver(router=self, event_name="message_reaction")
|
||||
self.message_reaction_count = TelegramEventObserver(
|
||||
router=self, event_name="message_reaction_count"
|
||||
)
|
||||
self.chat_boost = TelegramEventObserver(router=self, event_name="chat_boost")
|
||||
self.removed_chat_boost = TelegramEventObserver(
|
||||
router=self, event_name="removed_chat_boost"
|
||||
)
|
||||
self.deleted_business_messages = TelegramEventObserver(
|
||||
router=self, event_name="deleted_business_messages"
|
||||
)
|
||||
self.business_connection = TelegramEventObserver(
|
||||
router=self, event_name="business_connection"
|
||||
)
|
||||
self.edited_business_message = TelegramEventObserver(
|
||||
router=self, event_name="edited_business_message"
|
||||
)
|
||||
self.business_message = TelegramEventObserver(router=self, event_name="business_message")
|
||||
self.purchased_paid_media = TelegramEventObserver(
|
||||
router=self, event_name="purchased_paid_media"
|
||||
)
|
||||
|
||||
self.errors = self.error = TelegramEventObserver(router=self, event_name="error")
|
||||
|
||||
self.startup = EventObserver()
|
||||
self.shutdown = EventObserver()
|
||||
|
||||
self.observers: Dict[str, TelegramEventObserver] = {
|
||||
"message": self.message,
|
||||
"edited_message": self.edited_message,
|
||||
"channel_post": self.channel_post,
|
||||
"edited_channel_post": self.edited_channel_post,
|
||||
"inline_query": self.inline_query,
|
||||
"chosen_inline_result": self.chosen_inline_result,
|
||||
"callback_query": self.callback_query,
|
||||
"shipping_query": self.shipping_query,
|
||||
"pre_checkout_query": self.pre_checkout_query,
|
||||
"poll": self.poll,
|
||||
"poll_answer": self.poll_answer,
|
||||
"my_chat_member": self.my_chat_member,
|
||||
"chat_member": self.chat_member,
|
||||
"chat_join_request": self.chat_join_request,
|
||||
"message_reaction": self.message_reaction,
|
||||
"message_reaction_count": self.message_reaction_count,
|
||||
"chat_boost": self.chat_boost,
|
||||
"removed_chat_boost": self.removed_chat_boost,
|
||||
"deleted_business_messages": self.deleted_business_messages,
|
||||
"business_connection": self.business_connection,
|
||||
"edited_business_message": self.edited_business_message,
|
||||
"business_message": self.business_message,
|
||||
"purchased_paid_media": self.purchased_paid_media,
|
||||
"error": self.errors,
|
||||
}
|
||||
|
||||
def __str__(self) -> str:
|
||||
return f"{type(self).__name__} {self.name!r}"
|
||||
|
||||
def __repr__(self) -> str:
|
||||
return f"<{self}>"
|
||||
|
||||
def resolve_used_update_types(self, skip_events: Optional[Set[str]] = None) -> List[str]:
|
||||
"""
|
||||
Resolve registered event names
|
||||
|
||||
Useful for getting updates only for registered event types.
|
||||
|
||||
:param skip_events: skip specified event names
|
||||
:return: sorted list of registered event names
|
||||
"""
|
||||
handlers_in_use: Set[str] = set()
|
||||
if skip_events is None:
|
||||
skip_events = set()
|
||||
skip_events = {*skip_events, *INTERNAL_UPDATE_TYPES}
|
||||
|
||||
for router in self.chain_tail:
|
||||
for update_name, observer in router.observers.items():
|
||||
if observer.handlers and update_name not in skip_events:
|
||||
handlers_in_use.add(update_name)
|
||||
|
||||
return list(sorted(handlers_in_use)) # NOQA: C413
|
||||
|
||||
async def propagate_event(self, update_type: str, event: TelegramObject, **kwargs: Any) -> Any:
|
||||
kwargs.update(event_router=self)
|
||||
observer = self.observers.get(update_type)
|
||||
|
||||
async def _wrapped(telegram_event: TelegramObject, **data: Any) -> Any:
|
||||
return await self._propagate_event(
|
||||
observer=observer, update_type=update_type, event=telegram_event, **data
|
||||
)
|
||||
|
||||
if observer:
|
||||
return await observer.wrap_outer_middleware(_wrapped, event=event, data=kwargs)
|
||||
return await _wrapped(event, **kwargs)
|
||||
|
||||
async def _propagate_event(
|
||||
self,
|
||||
observer: Optional[TelegramEventObserver],
|
||||
update_type: str,
|
||||
event: TelegramObject,
|
||||
**kwargs: Any,
|
||||
) -> Any:
|
||||
response = UNHANDLED
|
||||
if observer:
|
||||
# Check globally defined filters before any other handler is checked.
|
||||
# This check is placed here instead of in the `trigger` method to make it possible
|
||||
# to pass context to handlers from global filters.
|
||||
result, data = await observer.check_root_filters(event, **kwargs)
|
||||
if not result:
|
||||
return UNHANDLED
|
||||
kwargs.update(data)
|
||||
|
||||
response = await observer.trigger(event, **kwargs)
|
||||
if response is REJECTED: # pragma: no cover
|
||||
# Possible only if some handler returns REJECTED
|
||||
return UNHANDLED
|
||||
if response is not UNHANDLED:
|
||||
return response
|
||||
|
||||
for router in self.sub_routers:
|
||||
response = await router.propagate_event(update_type=update_type, event=event, **kwargs)
|
||||
if response is not UNHANDLED:
|
||||
break
|
||||
|
||||
return response
|
||||
|
||||
@property
|
||||
def chain_head(self) -> Generator[Router, None, None]:
|
||||
router: Optional[Router] = self
|
||||
while router:
|
||||
yield router
|
||||
router = router.parent_router
|
||||
|
||||
@property
|
||||
def chain_tail(self) -> Generator[Router, None, None]:
|
||||
yield self
|
||||
for router in self.sub_routers:
|
||||
yield from router.chain_tail
|
||||
|
||||
@property
|
||||
def parent_router(self) -> Optional[Router]:
|
||||
return self._parent_router
|
||||
|
||||
@parent_router.setter
|
||||
def parent_router(self, router: Router) -> None:
|
||||
"""
|
||||
Internal property setter of the parent router for this router.
|
||||
Do not use this method in your own code.
|
||||
All routers should be included via `include_router` method.
|
||||
|
||||
Self-referencing and circular referencing are not allowed here
|
||||
|
||||
:param router:
|
||||
"""
|
||||
if not isinstance(router, Router):
|
||||
raise ValueError(f"router should be instance of Router not {type(router).__name__!r}")
|
||||
if self._parent_router:
|
||||
raise RuntimeError(f"Router is already attached to {self._parent_router!r}")
|
||||
if self == router:
|
||||
raise RuntimeError("Self-referencing routers is not allowed")
|
||||
|
||||
parent: Optional[Router] = router
|
||||
while parent is not None:
|
||||
if parent == self:
|
||||
raise RuntimeError("Circular referencing of Router is not allowed")
|
||||
|
||||
parent = parent.parent_router
|
||||
|
||||
self._parent_router = router
|
||||
router.sub_routers.append(self)
|
||||
|
||||
def include_routers(self, *routers: Router) -> None:
|
||||
"""
|
||||
Attach multiple routers.
|
||||
|
||||
:param routers:
|
||||
:return:
|
||||
"""
|
||||
if not routers:
|
||||
raise ValueError("At least one router must be provided")
|
||||
for router in routers:
|
||||
self.include_router(router)
|
||||
|
||||
def include_router(self, router: Router) -> Router:
|
||||
"""
|
||||
Attach another router.
|
||||
|
||||
:param router:
|
||||
:return:
|
||||
"""
|
||||
if not isinstance(router, Router):
|
||||
raise ValueError(
|
||||
f"router should be instance of Router not {type(router).__class__.__name__}"
|
||||
)
|
||||
router.parent_router = self
|
||||
return router
|
||||
|
||||
async def emit_startup(self, *args: Any, **kwargs: Any) -> None:
|
||||
"""
|
||||
Recursively call startup callbacks
|
||||
|
||||
:param args:
|
||||
:param kwargs:
|
||||
:return:
|
||||
"""
|
||||
kwargs.update(router=self)
|
||||
await self.startup.trigger(*args, **kwargs)
|
||||
for router in self.sub_routers:
|
||||
await router.emit_startup(*args, **kwargs)
|
||||
|
||||
async def emit_shutdown(self, *args: Any, **kwargs: Any) -> None:
|
||||
"""
|
||||
Recursively call shutdown callbacks to graceful shutdown
|
||||
|
||||
:param args:
|
||||
:param kwargs:
|
||||
:return:
|
||||
"""
|
||||
kwargs.update(router=self)
|
||||
await self.shutdown.trigger(*args, **kwargs)
|
||||
for router in self.sub_routers:
|
||||
await router.emit_shutdown(*args, **kwargs)
|
||||
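To round off the `Router` section, here is a hedged sketch of composing routers into a tree and attaching it to a `Dispatcher`; the router names are assumptions:

```python
# Sketch: composing routers into a tree; events propagate depth-first
# until some handler returns something other than UNHANDLED.
from aiogram import Dispatcher, Router

admin_router = Router(name="admin")
user_router = Router(name="user")

root_router = Router(name="root")
root_router.include_routers(admin_router, user_router)

dp = Dispatcher()
dp.include_router(root_router)
```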
71
venv/lib/python3.12/site-packages/aiogram/enums/__init__.py
Normal file
@@ -0,0 +1,71 @@
|
||||
from .bot_command_scope_type import BotCommandScopeType
|
||||
from .chat_action import ChatAction
|
||||
from .chat_boost_source_type import ChatBoostSourceType
|
||||
from .chat_member_status import ChatMemberStatus
|
||||
from .chat_type import ChatType
|
||||
from .content_type import ContentType
|
||||
from .currency import Currency
|
||||
from .dice_emoji import DiceEmoji
|
||||
from .encrypted_passport_element import EncryptedPassportElement
|
||||
from .inline_query_result_type import InlineQueryResultType
|
||||
from .input_media_type import InputMediaType
|
||||
from .input_paid_media_type import InputPaidMediaType
|
||||
from .input_profile_photo_type import InputProfilePhotoType
|
||||
from .input_story_content_type import InputStoryContentType
|
||||
from .keyboard_button_poll_type_type import KeyboardButtonPollTypeType
|
||||
from .mask_position_point import MaskPositionPoint
|
||||
from .menu_button_type import MenuButtonType
|
||||
from .message_entity_type import MessageEntityType
|
||||
from .message_origin_type import MessageOriginType
|
||||
from .owned_gift_type import OwnedGiftType
|
||||
from .paid_media_type import PaidMediaType
|
||||
from .parse_mode import ParseMode
|
||||
from .passport_element_error_type import PassportElementErrorType
|
||||
from .poll_type import PollType
|
||||
from .reaction_type_type import ReactionTypeType
|
||||
from .revenue_withdrawal_state_type import RevenueWithdrawalStateType
|
||||
from .sticker_format import StickerFormat
|
||||
from .sticker_type import StickerType
|
||||
from .story_area_type_type import StoryAreaTypeType
|
||||
from .topic_icon_color import TopicIconColor
|
||||
from .transaction_partner_type import TransactionPartnerType
|
||||
from .transaction_partner_user_transaction_type_enum import (
|
||||
TransactionPartnerUserTransactionTypeEnum,
|
||||
)
|
||||
from .update_type import UpdateType
|
||||
|
||||
__all__ = (
|
||||
"BotCommandScopeType",
|
||||
"ChatAction",
|
||||
"ChatBoostSourceType",
|
||||
"ChatMemberStatus",
|
||||
"ChatType",
|
||||
"ContentType",
|
||||
"Currency",
|
||||
"DiceEmoji",
|
||||
"EncryptedPassportElement",
|
||||
"InlineQueryResultType",
|
||||
"InputMediaType",
|
||||
"InputPaidMediaType",
|
||||
"InputProfilePhotoType",
|
||||
"InputStoryContentType",
|
||||
"KeyboardButtonPollTypeType",
|
||||
"MaskPositionPoint",
|
||||
"MenuButtonType",
|
||||
"MessageEntityType",
|
||||
"MessageOriginType",
|
||||
"OwnedGiftType",
|
||||
"PaidMediaType",
|
||||
"ParseMode",
|
||||
"PassportElementErrorType",
|
||||
"PollType",
|
||||
"ReactionTypeType",
|
||||
"RevenueWithdrawalStateType",
|
||||
"StickerFormat",
|
||||
"StickerType",
|
||||
"StoryAreaTypeType",
|
||||
"TopicIconColor",
|
||||
"TransactionPartnerType",
|
||||
"TransactionPartnerUserTransactionTypeEnum",
|
||||
"UpdateType",
|
||||
)
|
||||
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Some files were not shown because too many files have changed in this diff.