Compare commits


405 commits

Author SHA1 Message Date
Leonidas Spyropoulos
8ca61eded2
chore(release): prepare for 6.2.16 2025-01-13 15:52:13 +00:00
Leonidas Spyropoulos
a9bf714dae
fix: bump deps for python 3.13 and vulnerability
* pygit2 and watchfiles for precompiled wheels
* greenlet for python 3.13 compatibility
* python-multipart for security vulnerability

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2025-01-12 20:39:02 +00:00
Leonidas Spyropoulos
3e3173b5c9
chore: avoid cache for new pacman 7
Pacman 7 introduced sandboxing, which breaks the cache in containers due to permission issues

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2025-01-12 20:39:02 +00:00
Leonidas Spyropoulos
eca8bbf515
chore(release): prepare for 6.2.15
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2024-09-15 12:03:17 +03:00
Jelle van der Waa
edc1ab949a perf(captcha): simplify count() query for user ids
Using .count() isn't great as it runs a count query on a subquery which
selects all fields in the Users table. This rewrites it into a simple
SELECT count(ID) FROM Users query.
2024-09-12 12:29:46 +00:00
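The rewrite this commit describes can be sketched with SQLAlchemy; the model and data below are hypothetical stand-ins for aurweb's Users table, not the actual code:

```python
# Sketch of the captcha count() rewrite: Query.count() wraps the full SELECT
# in a subquery, while func.count() issues SELECT count(ID) directly.
from sqlalchemy import Column, Integer, String, create_engine, func
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class User(Base):  # minimal stand-in for aurweb's Users model
    __tablename__ = "Users"
    ID = Column(Integer, primary_key=True)
    Username = Column(String(32))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add_all([User(Username=f"u{i}") for i in range(3)])
    session.commit()

    # Before: SELECT count(*) FROM (SELECT Users.ID, Users.Username FROM Users)
    slow = session.query(User).count()

    # After: SELECT count(Users.ID) FROM Users
    fast = session.query(func.count(User.ID)).scalar()

    assert slow == fast == 3
```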
Muflone
97cc6196eb fix: reduce the number of subqueries against Packages by preloading the existing dependencies names from AUR 2024-08-21 01:36:15 +02:00
Muflone
77ef87c882 housekeep: code re-formatted by black for lint pipeline 2024-08-20 21:00:46 +00:00
Muflone
a40283cdb2 fix: reduce the number of subqueries against User by loading eagerly the Users from PackageComaintainer 2024-08-20 21:00:46 +00:00
Levente Polyak
4f68532ee2
chore(mariadb): fix mysql deprecation warnings by using mariadb commands
MariaDB has scheduled the removal of the deprecated mysql drop-in interface.
Let's adapt, which also removes a lot of warnings while spinning up the
service.
2024-08-19 15:26:36 +02:00
Levente Polyak
439ccd4aa3
feat(docker): add full grafana, prometheus, tempo setup for local dev
This is a very useful stack for local development as well, allowing
easy access to a local grafana instance to look at the accessed
endpoints, query usage, durations, etc.
As a nice side effect this also makes sure we have an easy way to
actually test any changes to the opentelemetry integration in an actual
environment instead of just listening to a raw socket.
2024-08-19 15:26:29 +02:00
Levente Polyak
8dcf0b2d97
fix(docker): fix compose race conditions on mariadb_init
We want the dependent services to wait until the initialization service
of mariadb finishes, but also properly handle the case where it already
exited before a leaf service gets picked up and put into the created
state. By using the service_completed_successfully signal, we can ensure
precisely this, without being racy and leaving services unbooted.

While at it, remove the compose version identifiers as docker-compose
deprecated them and always warned about when running docker-compose.
2024-08-19 15:26:21 +02:00
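The pattern this commit describes can be sketched in compose syntax; service names and commands here are illustrative, not the actual aurweb compose file:

```yaml
services:
  mariadb_init:
    image: mariadb:latest
    command: ["mariadb-install-db"]   # one-shot initialization job (illustrative)
  fastapi:
    image: aurweb:latest
    depends_on:
      mariadb_init:
        # succeeds even if the init container already exited successfully,
        # avoiding the race on services created after init finished
        condition: service_completed_successfully
```

Note the absence of a top-level `version:` key, matching the commit's removal of the deprecated compose version identifiers.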
Leonidas Spyropoulos
88e8db4404
chore(release): prepare version 6.2.14 2024-08-17 17:28:26 +01:00
Sven-Hendrik Haase
b730f6447d
feat: Add opentelemetry-based tracing
This adds tracing to fastapi, redis, and sqlalchemy. It uses the
recommended OTLP exporter to send the tracing data.
2024-08-17 11:27:26 +01:00
Leonidas Spyropoulos
92f5bbd37f
housekeep: reformat asgi.py 2024-08-17 01:31:43 +01:00
Jelle van der Waa
6c6ecd3971
perf(aurweb): create a context with what is required
The pkgbase/util.py `make_context` helper performs a lot of unrelated,
expensive queries which are not required by any of the templates. Only
the 404 template shows git_clone_uri_* and pkgbase.
2024-08-16 21:32:22 +02:00
Leonidas Spyropoulos
9b12eaf2b9
chore(release): prepare version 6.2.13 2024-08-16 16:03:40 +01:00
Jelle van der Waa
d1a66a743e
perf(aurweb/pkgbase): use exists() to avoid fetching a row
The previous approach fetched the matching row; by using `exists()`,
SQLAlchemy changes the query to a `SELECT 1`.
2024-08-09 16:07:17 +02:00
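The exists() pattern can be sketched as follows; the model is a hypothetical stand-in for aurweb's PackageBases table:

```python
# Sketch of the exists() rewrite: query(...).first() pulls a full row back,
# while query(q.exists()).scalar() lets the database answer with SELECT 1.
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class PackageBase(Base):  # minimal stand-in for aurweb's PackageBases model
    __tablename__ = "PackageBases"
    ID = Column(Integer, primary_key=True)
    Name = Column(String(255))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(PackageBase(Name="somepkg"))
    session.commit()

    q = session.query(PackageBase).filter(PackageBase.Name == "somepkg")

    # Before: fetches the whole matching row just to test for presence.
    found_slow = q.first() is not None

    # After: SELECT EXISTS (SELECT 1 ...) -- no row data transferred.
    found_fast = session.query(q.exists()).scalar()

    assert found_slow and found_fast
```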
Jelle van der Waa
b65d6c5e3a
perf(aurweb/pkgbase): only relevant queries when logged in
Don't query for notify, requests and vote information when the user is
not logged in as this information is not shown.
2024-08-09 16:07:17 +02:00
Jelle van der Waa
d393ed2352
fix(templates): hide non-actionable links when not logged in
A non-logged-in user cannot vote, enable notifications, or submit a
request, so hide these links.
2024-08-09 16:07:17 +02:00
Leonidas Spyropoulos
a16fac9b95
fix: revert mysqlclient to 2.2.3 2024-08-09 11:02:13 +01:00
renovate
5dd65846d1
chore(deps): update dependency coverage to v7.6.1 2024-08-05 11:25:17 +00:00
renovate
a1b2d231c3
fix(deps): update dependency aiofiles to v24 2024-08-04 20:25:21 +00:00
renovate
f306b6df7a
fix(deps): update dependency fastapi to ^0.112.0 2024-08-04 12:25:03 +00:00
renovate
0d17895647
fix(deps): update dependency gunicorn to v22 2024-08-04 10:24:33 +00:00
renovate
36a56e9d3c
fix(deps): update all non-major dependencies 2024-08-04 09:24:29 +00:00
Diego Viola
80d3e5f7b6 housekeep: update .editorconfig url
Signed-off-by: Diego Viola <diego.viola@gmail.com>
2024-08-03 11:58:58 +00:00
Leonidas Spyropoulos
2df5a2d5a8
chore(release): prepare version 6.2.12 2024-08-03 10:46:29 +01:00
Leonidas Spyropoulos
a54b6935a1
housekeep: reformat files with pre-hooks
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2024-08-03 08:15:56 +01:00
Levente Polyak
4d5909256f
fix: add missing indices on PackageBase ordered columns
Signed-off-by: Levente Polyak <anthraxx@archlinux.org>
2024-08-03 04:45:31 +02:00
Levente Polyak
a5b94a47f3
feat: cache rss feedgen for 5 minutes
The RSS feed should be perfectly fine even when caching them for 5
minutes. This should massively reduce the response times on the
endpoint.

Signed-off-by: Levente Polyak <anthraxx@archlinux.org>
2024-08-03 04:45:24 +02:00
moson
33d31d4117
style: Indicate deleted accounts on requests page
Show "(deleted)" on requests page for user accounts that were removed.

Fixes #505

Signed-off-by: moson <moson@archlinux.org>
2024-06-24 16:35:21 +02:00
Leonidas Spyropoulos
ed878c8c5e
chore(release): prepare for 6.2.11 2024-06-10 11:49:00 +01:00
Leonidas Spyropoulos
77e4979f79
fix: remove the extra spaces in requests textarea
fixes: #503
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2024-06-10 11:41:19 +01:00
Leonidas Spyropoulos
85af7d6f04
fix: revert Set reply-to header for notifications to ML
The change broke the initial emails to the ML. Not sure why, but reverting this for now; might look at it later.

This reverts commit 783422369e.

fixes: #502
2024-06-10 11:40:36 +01:00
Leonidas Spyropoulos
ef0619dc2f
chore(release): prepare for 6.2.10 2024-05-18 20:46:17 +01:00
moson
43b322e739
fix(CI): lint job - fix for python 3.12
Signed-off-by: moson <moson@archlinux.org>
2024-04-28 17:49:08 +02:00
moson
afb7af3e27
housekeep: replace deprecated datetime functions
tests show warnings for deprecated utc functions with python 3.12

Signed-off-by: moson <moson@archlinux.org>
2024-04-25 18:24:16 +02:00
moson
ffddf63975
housekeep: poetry - include python version 3.12
Signed-off-by: moson <moson@archlinux.org>
2024-04-25 07:46:39 +02:00
moson
c6a530f24f
chore(deps): bump pre-commit tools/libs
Prep for python 3.12
Reformat files with latest pre-commit tools

Signed-off-by: moson <moson@archlinux.org>
2024-04-25 07:25:39 +02:00
moson
3220cf886e
fix(CI): Remove "fast-single-thread" tag
Signed-off-by: moson <moson@archlinux.org>
2024-04-08 08:37:41 +02:00
moson
21e2ef5ecb
fix(test): Fix "TestClient"
TestClient changes were reverted with 0.37.2:

https://github.com/encode/starlette/pull/2525
https://github.com/encode/starlette/releases/tag/0.37.2
Signed-off-by: moson <moson@archlinux.org>
2024-04-08 08:37:41 +02:00
moson
6ba06801f7
chore(deps): update dependencies
- Updating pycparser (2.21 -> 2.22)
  - Updating sniffio (1.3.0 -> 1.3.1)
  - Updating typing-extensions (4.8.0 -> 4.11.0)
  - Updating anyio (3.7.1 -> 4.3.0)
  - Updating certifi (2023.11.17 -> 2024.2.2)
  - Updating greenlet (3.0.1 -> 3.0.3)
  - Updating markupsafe (2.1.3 -> 2.1.5)
  - Updating packaging (23.2 -> 24.0)
  - Updating pluggy (1.3.0 -> 1.4.0)
  - Updating pydantic-core (2.14.5 -> 2.16.3)
  - Updating coverage (7.4.0 -> 7.4.4)
  - Updating cryptography (41.0.5 -> 42.0.5)
  - Updating dnspython (2.4.2 -> 2.6.1)
  - Updating execnet (2.0.2 -> 2.1.0)
  - Updating httpcore (1.0.2 -> 1.0.5)
  - Updating lxml (5.1.0 -> 5.2.1)
  - Updating mako (1.3.0 -> 1.3.2)
  - Updating parse (1.20.0 -> 1.20.1)
  - Updating prometheus-client (0.19.0 -> 0.20.0)
  - Updating pydantic (2.5.2 -> 2.6.4)
  - Updating pytest (7.4.4 -> 8.1.1)
  - Updating python-dateutil (2.8.2 -> 2.9.0.post0)
  - Updating redis (5.0.1 -> 5.0.3)
  - Updating urllib3 (2.1.0 -> 2.2.1)
  - Updating asgiref (3.7.2 -> 3.8.1)
  - Updating email-validator (2.1.0.post1 -> 2.1.1)
  - Updating fakeredis (2.20.1 -> 2.21.3)
  - Updating fastapi (0.109.0 -> 0.110.1)
  - Updating filelock (3.13.1 -> 3.13.3)
  - Updating markdown (3.5.2 -> 3.6)
  - Updating mysqlclient (2.2.1 -> 2.2.4)
  - Updating orjson (3.9.12 -> 3.10.0)
  - Updating prometheus-fastapi-instrumentator (6.1.0 -> 7.0.0)
  - Updating protobuf (4.25.2 -> 5.26.1)
  - Updating pygit2 (1.13.3 -> 1.14.1)
  - Updating pytest-asyncio (0.23.3 -> 0.23.6)
  - Updating pytest-cov (4.1.0 -> 5.0.0)
  - Updating tomlkit (0.12.3 -> 0.12.4)
  - Updating uvicorn (0.27.0 -> 0.27.1)
  - Updating werkzeug (3.0.1 -> 3.0.2)
  - Updating starlette (0.35.0 -> 0.37.2)
  - Updating httpx (0.26.0 -> 0.27.0)
  - Updating python-multipart (0.0.6 -> 0.0.9)
  - Updating uvicorn (0.27.1 -> 0.29.0)
  - Updating sqlalchemy (1.4.50 -> 1.4.52)

Signed-off-by: moson <moson@archlinux.org>
2024-04-08 08:37:41 +02:00
moson
21a23c9abe
feat: Limit comment length
Limit the amount of characters that can be entered for a comment.

Signed-off-by: moson <moson@archlinux.org>
2024-02-25 10:46:47 +01:00
moson
d050b626db
feat: Add blacklist check for pkgbase
Also check "pkgbase" against our blacklist.

Signed-off-by: moson <moson@archlinux.org>
2024-02-17 15:55:46 +01:00
moson
057685f304
fix: Fix package info for 404 errors
We try to find packages when a user enters a URL like /somepkg
or accidentally opens /somepkg.git in the browser.

However, it currently also does this for URLs like /pkgbase/doesnotexist
and falsely interprets the "pkgbase" part as a package or pkgbase name.
This, in combination with a pkgbase that is named "pkgbase", generates
a misleading 404 message for URLs like /pkgbase/doesnotexist.

That being said, we should probably add pkgbase to the blacklist check
as well (we already do this for pkgname) and add things like
"pkgbase" to the blacklist -> Will be picked up in another commit.

Signed-off-by: moson <moson@archlinux.org>
2024-02-17 14:12:09 +01:00
renovate
319c565cb9
fix(deps): update all non-major dependencies 2024-01-23 22:24:28 +00:00
renovate
db6bba8bc8
fix(deps): update dependency feedgen to v1 2024-01-23 21:24:53 +00:00
renovate
a37b9685de
fix(deps): update dependency lxml to v5 2024-01-21 14:24:22 +00:00
moson
6e32cf4275
fix(i18n): Adjust transifex host URL
Fix URL, otherwise the API token won't be picked up from ~/.transifexrc

Signed-off-by: moson <moson@archlinux.org>
2024-01-21 11:40:14 +01:00
moson
76b6971267
chore(deps): Ignore python upgrades with Renovate
Stop Renovate from trying to bump the python version.

Signed-off-by: moson <moson@archlinux.org>
2024-01-21 10:43:12 +01:00
Robin Candau
9818c3f48c chore(i18n): Replace [community] leftover mentions to [extra] 2024-01-21 10:27:57 +01:00
moson
f967c3565a
chore(i18n): Update translations
Pull in updated translations from Transifex: 2023-01-18

Signed-off-by: moson <moson@archlinux.org>
2024-01-21 09:59:05 +01:00
moson
2fcd793a58
fix(test): Fixes for "TestClient" changes
Seems that client is optional according to the ASGI spec.
https://asgi.readthedocs.io/en/latest/specs/www.html

With Starlette 0.35 the TestClient connection scope is None for "client".
https://github.com/encode/starlette/pull/2377

Signed-off-by: moson <moson@archlinux.org>
2024-01-19 16:37:42 +01:00
renovate
22e1577324
fix(deps): update dependency fastapi to ^0.109.0 2024-01-19 10:26:02 +01:00
moson
baf97bd159
fix(test): FastAPI 0.104.1 - Fix warnings
FastAPI events are deprecated. Use "Lifespan" function instead.

Signed-off-by: moson <moson@archlinux.org>
2023-12-08 14:15:18 +01:00
moson
a0b2e826be
feat: Parse markdown within html block elements
By default, markdown within an HTML block element is not parsed.
Add markdown extension to support markdown text within block
elements.

With this we can annotate our element with a "markdown" attribute:
E.g. <details markdown>*Markdown*</details>
And thus indicate that the content should be parsed.

Signed-off-by: moson <moson@archlinux.org>
2023-12-08 14:14:24 +01:00
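A minimal sketch of the behavior this commit describes, using Python-Markdown's md_in_html extension (assuming that is the extension in question):

```python
# The "markdown" attribute on an HTML block element tells the parser to
# process the markdown text inside it instead of passing it through raw.
import markdown

out = markdown.markdown(
    "<details markdown>*Markdown*</details>",
    extensions=["md_in_html"],
)
assert "<em>Markdown</em>" in out
```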
moson
1ba9e6eb44
fix: change git-cliff "tag_pattern" option to regex
Changed with v1.4.0
See: https://github.com/orhun/git-cliff/pull/318

Signed-off-by: moson <moson@archlinux.org>
2023-12-08 14:12:48 +01:00
Rafael Fontenelle
1b82887cd6
docs: Change i18n.txt to markdown format 2023-12-08 14:10:32 +01:00
moson
783422369e
feat: Set reply-to header for notifications to ML
We can set the "reply-to" header to the "to" address for any mails
that go out to the aur-requests mailing list.

Signed-off-by: moson <moson@archlinux.org>
2023-11-28 09:33:07 +01:00
moson
4637b2edba
fix(tests): Fix test case for Prometheus metrics
Disable prometheus multiprocess mode in tests to avoid global state:
Depending on the workers which are processing a testfile,
we might run into race issues where tests might influence each other.

We also need to make sure to clear any previously collected values
in case the same worker/process is executing different tests which
evaluate prometheus values.

Signed-off-by: moson <moson@archlinux.org>
2023-11-27 13:21:37 +01:00
moson
027dfbd970
chore(release): prepare for 6.2.9
Signed-off-by: moson <moson@archlinux.org>
2023-11-25 20:30:29 +01:00
moson
8b234c580d
chore(deps): update dependencies
* Updating idna (3.4 -> 3.6)
* Updating annotated-types (0.5.0 -> 0.6.0)
* Updating pydantic-core (2.10.1 -> 2.14.5)
* Updating certifi (2023.7.22 -> 2023.11.17)
* Updating greenlet (3.0.0 -> 3.0.1)
* Updating pydantic (2.4.2 -> 2.5.2)
* Updating charset-normalizer (3.3.0 -> 3.3.2)
* Updating cryptography (41.0.4 -> 41.0.5)
* Updating fastapi (0.103.2 -> 0.104.1)
* Updating mako (1.2.4 -> 1.3.0)
* Updating parse (1.19.1 -> 1.20.0)
* Updating prometheus-client (0.17.1 -> 0.19.0)
* Updating urllib3 (2.0.6 -> 2.1.0)

Fix type annotation for new test function

Signed-off-by: moson <moson@archlinux.org>
2023-11-25 20:23:56 +01:00
renovate
9bf0c61051
fix(deps): update all non-major dependencies 2023-11-25 18:25:05 +00:00
moson
9d5b9c4795
feat: Add "groups" to package details page
Signed-off-by: moson <moson@archlinux.org>
2023-11-25 18:59:43 +01:00
moson
765f989b7d
feat: Allow <del> and <details/summary> tags in comments
* Allow additional html tags: <del> and <details/summary>
* Convert markdown double-tilde (~~) to <del> tags

Signed-off-by: moson <moson@archlinux.org>
2023-11-25 18:41:28 +01:00
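The double-tilde conversion can be sketched with a regex; this is a simplification, as the real change hooks into the markdown rendering pipeline:

```python
import re

# Convert markdown double-tilde strikethrough to <del> tags:
# ~~text~~ -> <del>text</del>
def strikethrough(text: str) -> str:
    return re.sub(r"~~(.+?)~~", r"<del>\1</del>", text)

assert strikethrough("drop ~~this~~ word") == "drop <del>this</del> word"
```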
Jelle van der Waa
029ce3b418
templates: update Gitlab navbar to point to Arch namespace
Instead of showing your own projects, show the Arch Linux namespace
where all our bugs/projects are.
2023-11-24 18:20:25 +01:00
Jelle van der Waa
3241391af0
templates: update bugs navbar entry to GitLab
Flyspray is no more and all projects are now on our own GitLab instance.
2023-11-12 16:02:16 +01:00
moson
5d302ae00c
feat: Support timezone and language query params
Support setting the timezone as well as the language via query params:
The timezone parameter previously only worked on certain pages.
While we're at it, let's also add the language as a param.
Refactor code for timezone and language functions.
Remove unused AURTZ cookie.

Signed-off-by: moson <moson@archlinux.org>
2023-10-21 10:41:44 +02:00
moson
933654fcbb
fix: Restrict context var override on the package page
Users can (accidentally) override context vars with query params.
This may lead to issues when rendering templates (e.g. "comments=").

Signed-off-by: moson <moson@archlinux.org>
2023-10-21 10:41:43 +02:00
moson
40c1d3e8ee
fix(ci): Don't create error reports from sandbox
We should not try to create issue reports for internal server errors
from a sandbox/review-app environment.

Signed-off-by: moson <moson@archlinux.org>
2023-10-20 15:45:58 +02:00
Hanabishi
2b8c8fc92a fix: make dependency source use superscript tag
Avoid using special characters and use the '<sup>' HTML tag instead,
so as not to rely on the user's fonts' Unicode coverage.

Closes: #490
Signed-off-by: Hanabishi <1722-hanabishi@users.noreply.gitlab.archlinux.org>
2023-10-18 16:19:58 +00:00
moson
27c51430fb
chore(release): prepare for 6.2.8
Signed-off-by: moson <moson@archlinux.org>
2023-10-15 20:52:57 +02:00
moson
27cd533654
fix: Skip setting existing context values
When setting up a context with user provided variables,
we should not override any existing values previously set.

Signed-off-by: moson <moson@archlinux.org>
2023-10-12 18:09:07 +02:00
moson
2166426d4c
fix(deps): update dependencies
* Updating typing-extensions (4.5.0 -> 4.8.0)
* Installing annotated-types (0.5.0)
* Updating anyio (3.6.2 -> 3.7.1)
* Installing pydantic-core (2.10.1)
* Updating certifi (2023.5.7 -> 2023.7.22)
* Updating cffi (1.15.1 -> 1.16.0)
* Updating greenlet (2.0.2 -> 3.0.0)
* Updating markupsafe (2.1.2 -> 2.1.3)
* Updating packaging (23.1 -> 23.2)
* Updating pluggy (1.0.0 -> 1.3.0)
* Updating pydantic (1.10.7 -> 2.4.2)
* Updating charset-normalizer (3.1.0 -> 3.3.0)
* Updating click (8.1.3 -> 8.1.7)
* Updating coverage (7.2.7 -> 7.3.2)
* Updating cryptography (40.0.2 -> 41.0.4)
* Updating dnspython (2.3.0 -> 2.4.2)
* Updating execnet (1.9.0 -> 2.0.2)
* Updating fastapi (0.100.1 -> 0.103.2)
* Updating httpcore (0.17.0 -> 0.17.3)
* Updating parse (1.19.0 -> 1.19.1)
* Updating prometheus-client (0.16.0 -> 0.17.1)
* Updating pytest (7.4.0 -> 7.4.2)
* Updating redis (4.6.0 -> 5.0.1)
* Updating urllib3 (2.0.2 -> 2.0.6)
* Updating aiofiles (23.1.0 -> 23.2.1)
* Updating alembic (1.11.2 -> 1.12.0)
* Updating fakeredis (2.17.0 -> 2.19.0)
* Updating filelock (3.12.2 -> 3.12.4)
* Updating orjson (3.9.2 -> 3.9.7)
* Updating protobuf (4.23.4 -> 4.24.4)
* Updating pygit2 (1.12.2 -> 1.13.1)
* Updating werkzeug (2.3.6 -> 3.0.0)

Signed-off-by: moson <moson@archlinux.org>
2023-10-05 17:59:14 +02:00
moson
fd3022ff6c
fix: Correct password length message.
Wrong config option was used to display the minimum length error msg.
(username_min_len instead of passwd_min_len)

Signed-off-by: moson <moson@archlinux.org>
2023-10-02 13:47:38 +02:00
moson
9e9ba15813
housekeep: TU rename - Misc
Fix some more test functions

Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:05 +02:00
moson
d2d47254b4
housekeep: TU rename - Table/Column names, scripts
TU_VoteInfo -> VoteInfo
TU_Votes -> Votes
TU_VoteInfo.ActiveTUs -> VoteInfo.ActiveUsers

script: tuvotereminder -> votereminder
Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:05 +02:00
moson
87f6791ea8
housekeep: TU rename - Comments
Changes to comments, function descriptions, etc.

Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:05 +02:00
moson
61f1e5b399
housekeep: TU rename - Test suite
Rename tests: Function names, variables, etc.

Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:05 +02:00
moson
148c882501
housekeep: TU rename - /tu routes
Change /tu to /package-maintainer

Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:04 +02:00
moson
f540c79580
housekeep: TU rename - UI elements
Rename all UI elements and translations.

Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:04 +02:00
moson
1702075875
housekeep: TU rename - code changes
Renaming of symbols. Functions, variables, values, DB values, etc.
Basically everything that is not user-facing.

This only covers "Trusted User" things:
tests, comments, etc. will be covered in a following commit.
2023-09-30 16:45:04 +02:00
moson
7466e96449
fix(ci): Exclude review-app jobs for renovate MR's
Signed-off-by: moson <moson@archlinux.org>
2023-09-26 13:47:03 +02:00
moson
0a7b02956f
feat: Indicate dependency source
Dependencies might reside in the AUR or official repositories.
Add "AUR" as superscript letters to indicate if a package/provider
is present in the AUR.

Signed-off-by: moson <moson@archlinux.org>
2023-09-03 14:17:11 +02:00
moson
1433553c05
fix(test): Clear previous prometheus data for test
It could happen that test data is already generated by a previous test.
(running in the same worker)

Make sure we clear everything before performing our checks.

Signed-off-by: moson <moson@archlinux.org>
2023-09-01 22:51:55 +02:00
moson
5699e9bb41
fix(test): Remove file locking and semaphore
All tests within a file run in the same worker and our test DB names
are unique per file as well. We don't really need a locking
mechanism here.

The same holds for the test emails. The only potential issue is that it
might try to create the same directory multiple times and thus run
into an error. However, that can be covered by specifying
"exist_ok=True" with os.makedirs such that those errors are ignored.

Signed-off-by: moson <moson@archlinux.org>
2023-09-01 22:51:55 +02:00
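The exist_ok behavior the message relies on, as a minimal sketch:

```python
# Concurrent workers can all "create" the same directory without racing
# on FileExistsError when exist_ok=True is passed.
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "test-emails")
os.makedirs(path, exist_ok=True)  # first worker creates it
os.makedirs(path, exist_ok=True)  # later workers: a no-op instead of an error
assert os.path.isdir(path)
```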
moson
9eda6a42c6
feat: Add ansible provisioning step for review-app
Clone infrastructure repository and run playbook to provision our VM
with aurweb.

Signed-off-by: moson <moson@archlinux.org>
2023-08-27 13:54:39 +02:00
Kristian Klausen
6c610b26a3
feat: Add terraform config for review-app[1]
Also removed the logic for deploying to the long-gone aur-dev box.

Ansible will be added in an upcoming commit for configuring and
deploying aurweb on the VM.

[1] https://docs.gitlab.com/ee/ci/review_apps/
2023-08-27 12:05:52 +02:00
moson
3005e82f60
fix: Cleanup prometheus metrics for dead workers
The current "cleanup" function that is removing orphan prometheus files
is actually never invoked.
We move this to a default gunicorn config file to register our hook(s).

https://docs.gunicorn.org/en/stable/configure.html
https://docs.gunicorn.org/en/stable/settings.html#child-exit
Signed-off-by: moson <moson@archlinux.org>
2023-08-18 22:04:55 +02:00
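The gunicorn hook mechanism this commit refers to can be sketched as a gunicorn.conf.py fragment; this is a sketch assuming the standard prometheus_client multiprocess API, not the actual aurweb config:

```python
# gunicorn.conf.py (sketch): child_exit fires in the master process when a
# worker dies, a natural place to clean up that worker's multiprocess files.
from prometheus_client import multiprocess

def child_exit(server, worker):
    # remove metric files left behind by the dead worker process
    multiprocess.mark_process_dead(worker.pid)
```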
Leonidas Spyropoulos
f05f1dbac7
chore(release): prepare for 6.2.7
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-08-04 19:18:38 +03:00
renovate
8ad03522de
fix(deps): update all non-major dependencies 2023-08-04 14:25:22 +00:00
moson
94b62d2949
fix: Check if user exists when editing account
We should check if a user (target) exists before validating permissions.
Otherwise things crash when a TU is trying to edit an account that
does not exist.

Fixes: aurweb-errors#529
Signed-off-by: moson <moson@archlinux.org>
2023-08-04 14:12:50 +02:00
renovate
7a44f37968
fix(deps): update dependency fastapi to v0.100.1 2023-07-27 19:24:28 +00:00
renovate
969b84afe4
fix(deps): update all non-major dependencies 2023-07-25 11:24:30 +00:00
renovate
f74f94b501
fix(deps): update dependency gunicorn to v21 2023-07-24 11:24:26 +00:00
moson
375895f080
feat: Add Prometheus metrics for requests
Adds gauge for requests by type and status

Signed-off-by: moson <moson@archlinux.org>
2023-07-23 22:46:44 +02:00
moson
e45878a058
fix: Fix issue with requests totals
The problem is that we join with PackageBase; thus we are missing
requests for packages that were deleted.

Fixes: #483
Signed-off-by: moson <moson@archlinux.org>
2023-07-23 18:53:58 +02:00
moson
6cd70a5c9f
test: Add tests for user/package statistics
Signed-off-by: moson <moson@archlinux.org>
2023-07-23 13:58:51 +02:00
moson
8699457917
feat: Separate cache expiry for stats and search
Allows us to set different cache eviction timespans for search queries
and statistics.

Stats, and especially "last package updates", should probably be refreshed
more often, whereas we might want to cache search results for a bit
longer.

So this gives us a bit more flexibility for playing around with different
settings and tweaking things.

Signed-off-by: moson <moson@archlinux.org>
2023-07-23 13:58:50 +02:00
moson
44c158b8c2
feat: Implement statistics class & additional metrics
The new module/class helps us construct queries and count records to
expose various statistics on the homepage. We also utilize it for some new
prometheus metrics (package and user gauges).
Record counts are cached with Redis.

Signed-off-by: moson <moson@archlinux.org>
2023-07-23 13:58:50 +02:00
moson
347c2ce721
change: Change order of commit validation routine
We currently validate all commits going from latest -> oldest.

It would be nicer to go oldest -> latest so that, in case of errors,
we would indicate which commit "introduced" the problem.

Signed-off-by: moson <moson@archlinux.org>
2023-07-22 10:45:08 +02:00
moson
bc03d8b8f2
fix: Fix middleware checking for accepted terms
The current query is a bit mixed up. The intention was to return the
number of unaccepted records, but it also counts records that were
accepted by some other user.

Let's check the total number of terms vs. the number of accepted
records (by our user) instead.

Signed-off-by: moson <moson@archlinux.org>
2023-07-20 18:21:05 +02:00
moson
5729d6787f
fix: git links in comments for multiple OIDs
The chance of finding multiple object IDs when performing lookups with
a shortened SHA1 hash (7 digits) seems to be quite high.

In those cases pygit2 will throw an error.
Let's catch those exceptions and gracefully handle them.

Fixes: aurweb-errors#496 (and alike)
Signed-off-by: moson <moson@archlinux.org>
2023-07-17 12:45:16 +02:00
renovate
862221f5ce
fix(deps): update all non-major dependencies 2023-07-15 20:27:12 +00:00
moson
27819b4465
fix: /rss lazy load issue & perf improvements
Some fixes for the /rss endpoints

* Load all data in one go:
Previously data was lazy loaded, making several sub-queries per
package (> 200 queries to compose the rss data for a single request).
Now we perform a single SQL query.
(request time improvement: 550ms -> 130ms)
This also fixes aurweb-errors#510 and the like.

* Remove some "dead code":
The fields "source", "author", and "link" were never included in the rss output
(wrong or insufficient data was passed to the different entry.xyz functions).
Nobody seems to miss them anyway, so let's remove them.

* Remove the "Last-Modified" header:
Obsolete, since nginx can/will only handle "If-Modified-Since" requests
in its current configuration. All requests are passed to fastapi anyway.

Signed-off-by: moson <moson@archlinux.org>
2023-07-13 18:27:02 +02:00
moson
fa1212f2de
fix: translations not containing string formatting
In some translations we might be missing replacement placeholders (%).
This turns out to be problematic when calling the format function.

Wrap the jinja2 format function and just return the string unformatted
when % is missing.

Fixes: #341
Signed-off-by: moson <moson@archlinux.org>
2023-07-10 18:02:20 +02:00
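The wrapper idea can be sketched as follows; the function name and exact behavior are assumptions for illustration, not the actual aurweb code:

```python
# Sketch: wrap the jinja2 "format" filter so translations that lost their
# "%" placeholders are returned unformatted instead of raising an error.
def tr_format(value: str, *args) -> str:
    if "%" not in value:
        # translator dropped the placeholder: return the string as-is
        return value
    return value % args

assert tr_format("Hello %s!", "world") == "Hello world!"
assert tr_format("Bonjour!", "world") == "Bonjour!"  # no placeholder, no crash
```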
moson
c0bbe21d81
fix(test): correct test for ssh-key parsing
Our set of keys returned by "util.parse_ssh_keys" is unordered so we
have to adapt our test to not rely on a specific order for multiple keys.

Fixes: 5ccfa7c0fd ("fix: same ssh key entered multiple times")
Signed-off-by: moson <moson@archlinux.org>
2023-07-09 16:13:02 +02:00
moson
5ccfa7c0fd
fix: same ssh key entered multiple times
Users might accidentally paste their ssh key multiple times
when they try to register or edit their account.

Convert our list of keys to a set, removing any duplicate keys.

Signed-off-by: moson <moson@archlinux.org>
2023-07-09 14:52:15 +02:00
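A minimal sketch of the set-based deduplication idea; the function name mirrors aurweb's util.parse_ssh_keys, but this body is a hypothetical simplification:

```python
# Parse a textarea paste into unique (prefix, key) pairs: a set silently
# collapses the same key pasted twice into one entry.
def parse_ssh_keys(raw: str) -> set[tuple[str, str]]:
    keys = set()
    for line in raw.strip().splitlines():
        parts = line.split()
        if len(parts) >= 2:
            keys.add((parts[0], parts[1]))
    return keys

pasted = "ssh-ed25519 AAAAC3...x user@host\nssh-ed25519 AAAAC3...x user@host"
keys = parse_ssh_keys(pasted)
assert len(keys) == 1  # duplicate line collapsed
```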
Leonidas Spyropoulos
225ce23761
chore(release): prepare for 6.2.6
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-07-08 12:54:43 +01:00
moson
4821fc1312
fix: show placeholder for deleted user in comments
Show "<deleted-account>" in comment headers in case a user
has deleted their account.

Signed-off-by: moson <moson@archlinux.org>
2023-07-08 13:44:24 +02:00
Leonidas Spyropoulos
1f40f6c5a0
housekeep: set current maintainers
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-07-08 12:38:19 +01:00
renovate
81d29b4c66
fix(deps): update dependency fastapi to ^0.100.0 2023-07-08 11:24:29 +00:00
renovate
7cde1ca560
fix(deps): update all non-major dependencies 2023-07-08 09:25:09 +00:00
moson-mo
f3f8c0a871
fix: add recipients to BCC when email is hidden
Package requests are sent to the ML as well as users (CC).
For those who chose to hide their mail address,
we should add them to the BCC list instead.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-07-08 11:19:02 +02:00
moson
9fe8d524ff
fix(test): MariaDB 11 upgrade, query result order
Fix order of recipients for "FlagNotification" test.
Apply sorting to the recipients query.
(only relevant for tests, but who knows when they change things again)

MariaDB 11 includes some changes related to the
query optimizer. Turns out that this might have effects
on how records are ordered for certain queries.
(in case no ORDER BY clause was specified)

https://mariadb.com/kb/en/mariadb-11-0-0-release-notes/
Signed-off-by: moson <moson@archlinux.org>
2023-07-08 10:32:26 +02:00
moson-mo
814ccf6b04
feat: add Prometheus metrics for Redis cache
Adding a Prometheus counter to be able to monitor cache hits/misses
for search queries

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-07-04 11:57:56 +02:00
moson-mo
3acfb08a0f
feat: cache package search results with Redis
The queries being done on the package search page are quite costly.
(Especially the default one ordered by "Popularity" when navigating to /packages)

Let's add the search results to the Redis cache:
Every result of a search query is pushed to Redis until we hit our maximum of 50k entries.
An entry expires 3 minutes after it is cached and is then evicted.
Lifetime and max values are configurable.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-07-04 11:57:56 +02:00
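The expiry and size-cap behavior can be sketched with a dict-backed stand-in; the real implementation uses Redis, and the names and structure here are hypothetical:

```python
import time

class SearchCache:
    """Sketch: entries expire after max_age seconds; the cache stops
    accepting new entries once max_entries is reached."""

    def __init__(self, max_entries: int = 50_000, max_age: float = 180.0):
        self.store: dict[str, tuple[object, float]] = {}
        self.max_entries = max_entries
        self.max_age = max_age

    def get_or_set(self, key: str, compute):
        now = time.monotonic()
        hit = self.store.get(key)
        if hit is not None and now - hit[1] < self.max_age:
            return hit[0]                      # cache hit, still fresh
        value = compute()                      # miss or expired: recompute
        if len(self.store) < self.max_entries:
            self.store[key] = (value, now)
        return value

cache = SearchCache()
assert cache.get_or_set("q", lambda: [1, 2]) == [1, 2]  # miss: computed
assert cache.get_or_set("q", lambda: []) == [1, 2]      # hit: cached value
```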
moson-mo
7c8b9ba6bc
perf: add index to tweak our default search query
Adds an index on PackageBases.Popularity and PackageBases.Name to
improve performance of our default search query sorted by "Popularity"

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-07-02 13:55:21 +02:00
moson-mo
c41f2e854a
perf: tweak some search queries
We are currently sorting on two columns in different tables, which is quite
expensive in terms of performance:
MariaDB first merges the data into a temporary table to apply the
sorting and record limiting.

We can tweak a couple of these queries by changing the "order by" clause
such that they refer to columns within the same table (PackageBases).
So instead of performing the second sort on "Packages.Name", we do
it on "PackageBases.Name".
This should still be "good enough" to produce properly sorted results.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-07-02 13:21:11 +02:00
Leonidas Spyropoulos
e2c113caee
chore(release): prepare for 6.2.5
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-06-22 19:22:56 +01:00
moson-mo
143575c9de
fix: restore command, remove premature creation of pkgbase
We're currently creating a "PackageBases" record when the "restore" command is executed.

This is problematic for pkgbases that never existed before.
In those cases it will create the record but fail in the update.py script.
Thus it leaves an orphan "PackageBases" record in the DB
(which does not have any related "Packages" record(s))

Navigating to such a package's /pkgbase/... URL will result in a crash
since it is not foreseen to have "orphan" pkgbase records.

We can safely remove the early creation of that record because
it'll be taken care of in the update.py script that is being called.

We'll also fix some tests. Before, they executed a dummy script
instead of "update.py", which might be a bit misleading
since it did not check the real outcome of our "restore" action.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-16 14:22:22 +02:00
moson-mo
c6c81f0789
housekeep: Amend .gitignore and .dockerignore
Prevent some files/dirs from ending up in the repo / docker image:
* directories typically used for python virtualenvs
* files that are being generated by running tests

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-16 13:33:39 +02:00
moson-mo
32461f28ea
fix(docker): Suppress error PEP-668
When using docker (compose), we don't create a venv and just install
python packages system-wide.

With python 3.11 (PEP 668) we need to explicitly tell pip to allow this.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-15 14:19:02 +02:00
moson-mo
58158505b0
fix: browser hints for password fields
Co-authored-by: eNV25 <env252525@gmail.com>
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-11 21:04:35 +02:00
moson-mo
ed17486da6
change(git): allow keys/pgp subdir with .asc files
This allows migration of git history for packages dropped from a repo to AUR
in case they contain PGP key material

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-11 12:20:02 +02:00
moson-mo
1c11c901a2
feat: switch requests filter for pkgname to "contains"
Use "contains" filtering instead of an exact match
when a package name filter is given.

This makes it easier to find requests for a "group" of packages.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-10 09:40:35 +02:00
Christian Heusel
26b2566b3f
change: print the user name if connecting via ssh
this is similar to the message that gitlab produces:

$ ssh -T aur.archlinux.org
Welcome to AUR, gromit! Interactive shell is disabled.
Try `ssh ssh://aur@aur.archlinux.org help` for a list of commands.

$ ssh -T gitlab.archlinux.org
Welcome to GitLab, @gromit!

Signed-off-by: Christian Heusel <christian@heusel.eu>
2023-06-08 12:47:27 +02:00
Christian Heusel
e9cc2fb437
change: only require .SRCINFO in the latest revision
This is done in order to relax the constraints so that dropping packages
from the official repos can be done with preserving their history.

It's sufficient to have this present in the latest commit of a push.

Signed-off-by: Christian Heusel <christian@heusel.eu>
2023-06-07 18:54:31 +02:00
Leonidas Spyropoulos
ed2f85ad04
chore(release): prepare for 6.2.4
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-05-27 14:28:48 +01:00
renovate
2709585a70
fix(deps): update dependency fastapi to v0.95.2 2023-05-27 11:24:46 +00:00
renovate
d1a3fee9fe fix(deps): update all non-major dependencies 2023-05-26 21:12:13 +00:00
moson-mo
49e98d64f4
chore: increase default session/cookie timeout
change from 2 to 4 hours.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 23:03:38 +02:00
moson-mo
a7882c7533
refactor: remove session_time from user.login
The parameter is not used, we can remove it and adapt the callers.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 23:02:38 +02:00
moson-mo
22fe4a988a
fix: make AURSID a session cookie if "remember me" is not checked
This should match more closely the expectation of a user.
A session cookie should vanish on browser close,
and thus the user needs to authenticate again.

There is no need to bump the expiration of AURSID either,
so we can remove that part.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 22:57:47 +02:00
moson-mo
0807ae6b7c
test: add tests for cookie handling
add a bunch of test cases to ensure our cookies work properly

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 22:57:46 +02:00
moson-mo
d366377231
fix: make AURREMEMBER cookie a permanent one
If it's a session cookie it poses issues for users
whose browsers wipe session cookies after close.
They'd be logged out early even if they chose
the "remember me" option when they log in.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 22:57:46 +02:00
moson-mo
57c154a72c
fix: increase expiry for AURLANG cookie; only set when needed
We add a new config option for cookies with a 400 day lifetime.
AURLANG should survive longer for unauthenticated users.
Today they have to set this again after each browser restart.
(for users whose browsers wipe session cookies on close)

Authenticated users don't need this cookie
since the setting is saved to the DB.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 22:57:46 +02:00
moson-mo
638ca7b1d0
chore: remove setting AURLANG and AURTZ on login
We don't need to set these cookies when logging in.
These settings are saved to the DB anyways.
(and they are picked up from there as well for any web requests,
when no cookies are given)

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 22:57:46 +02:00
moson-mo
edc4ac332d
chore: remove setting AURLANG and AURTZ on account edit
We don't need to set these cookies when an account is edited.
These settings are saved to the DB anyways.
(and they are picked up from there as well for any web requests,
when no cookies are given)

Setting these cookies can even be counter-productive:
Imagine a TU/Dev editing another user's account.
They would overwrite their own cookies with the other user's TZ/LANG settings.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 22:57:46 +02:00
moson-mo
2eacc84cd0
fix: properly evaluate AURREMEMBER cookie
Whenever the AURREMEMBER cookie was defined, regardless of its value,
"remember_me" was always set to True.

The get method of a dict returns a string,
and converting the str "False" into a bool yields True.

We have to check AURREMEMBER's value instead.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 22:57:46 +02:00
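A minimal sketch of the bug described above (plain Python, not aurweb's actual code): cookie values arrive as strings, so any non-empty value is truthy.

```python
# Cookies come in as strings; bool("False") is True because the string
# is non-empty -- the value must be compared, not cast.
cookies = {"AURREMEMBER": "False"}

buggy_remember = bool(cookies.get("AURREMEMBER", False))        # truthy str -> True
fixed_remember = cookies.get("AURREMEMBER", "False") == "True"  # compare the value
```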
Daniel M. Capella
5fe375bdc3
feat: add link to MergeBaseName in requests.html 2023-05-26 13:26:41 -04:00
renovate
1b41e8572a
fix(deps): update all non-major dependencies 2023-05-26 02:24:39 +00:00
moson-mo
7a88aeb673
chore: update .gitignore for test-emails
emails generated when running tests are stored in test-emails/ dir

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-25 11:18:08 +02:00
moson-mo
f24fae0ce6
feat: Add "Requests" filter option for package name
- Add package name textbox for filtering requests (with auto-suggest)
- Make "x pending requests" a link for TU/Dev on the package details page

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-25 11:18:08 +02:00
Leonidas Spyropoulos
acdb2864de
Merge branch 'aurblup-update-repo' into 'master'
fix: update repo information with aurblup script / git packaging repo changes

See merge request archlinux/aurweb!710
2023-05-25 10:06:44 +01:00
moson-mo
146943b3b6
housekeep: support new default repos after git migration
community is merged into extra
testing -> core-testing & extra-testing

Announcement: https://archlinux.org/news/git-migration-announcement/

We list "testing" repos first:
See d0b0e4d88b

Co-authored-by: artafinde <artafinde@archlinux.org>
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-18 13:06:21 +02:00
moson-mo
d0b0e4d88b
fix: update repo information with aurblup script
Currently, the "Repo" column in "OfficialProviders" is not updated
when a package is moved from one repository to another.

Note that we only save a package/provides combination once;
hence if a package is available in core and testing at the same time,
only one record is put into the OfficialProviders table.

We iterate through the repos one by one and the last value
is kept for mapping a (package/provides) combination to a repo.
Due to that, the repos listed in the "sync-db" config setting
should be ordered such that the "testing" repos are listed first.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-17 18:22:53 +02:00
moson-mo
3253a6ad29
fix(deps): remove urllib3 from dependency list
We previously pinned urllib3 to v1.x. This is not needed, though:
the incompatibility of v2.x is with poetry itself, not with aurweb.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-07 09:58:17 +02:00
Daniel M. Capella
d2e8fa0249
chore(deps): "Group all minor and patch updates together"
Treat FastAPI separately due to regular breakage.

Co-authored-by: moson-mo <mo-son@mailbox.org>
2023-05-06 18:03:05 -04:00
Leonidas Spyropoulos
1d627edbe7
chore(release): prepare for 6.2.3
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-05-06 20:34:54 +01:00
moson-mo
b115aedf97
chore(deps): update several dependencies
- Removing rfc3986 (1.5.0)
- Updating coverage (7.2.4 -> 7.2.5)
- Updating fastapi (0.94.1 -> 0.95.1)
- Updating httpcore (0.16.3 -> 0.17.0)
- Updating sqlalchemy (1.4.47 -> 1.4.48)
- Updating httpx (0.23.3 -> 0.24.0)
- Updating prometheus-fastapi-instrumentator (5.11.2 -> 6.0.0)
- Updating protobuf (4.22.3 -> 4.22.4)
- Updating pytest-asyncio (0.20.3 -> 0.21.0)
- Updating requests (2.29.0 -> 2.30.0)
- Updating uvicorn (0.21.1 -> 0.22.0)
- Updating watchfiles (0.18.1 -> 0.19.0)
- Updating werkzeug (2.3.2 -> 2.3.3)

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-06 20:29:05 +02:00
Christian Heusel
af4239bcac
replace reference to AUR TU Guidelines with AUR Submission Guidelines
Signed-off-by: Christian Heusel <christian@heusel.eu>
2023-05-06 19:47:01 +02:00
Leonidas Spyropoulos
a8d14e0194
housekeep: remove unused templates and rework existing ones
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-05-01 10:44:45 +01:00
moson-mo
8c5b85db5c
housekeep: remove fix for poetry installer
The problems with the "modern installer" got fixed.
We don't need this workaround anymore.

https://github.com/python-poetry/poetry/issues/7572
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-01 10:23:34 +02:00
moson-mo
b3fcfb7679
doc: improve instructions for setting up a dev/test env
Provide more detailed information on how to get started with a dev/test env.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-01 10:23:34 +02:00
Leonidas Spyropoulos
e896edaccc
chore: support for python 3.11 and poetry.lock update
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-04-30 10:12:09 +01:00
moson-mo
bab17a9d26
doc: amend INSTALL instructions
change path for metadata archive files

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-04-29 09:59:34 +02:00
moson-mo
ad61c443f4
fix: restore & move cgit html files
restore files accidentally deleted with PHP cleanup.

1325c71712/web/template/cgit
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-04-29 09:55:54 +02:00
moson-mo
8ca63075e9
housekeep: remove PHP implementation
removal of the PHP codebase

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-04-28 16:10:32 +02:00
moson-mo
97d0eac303
housekeep: copy static files
We copy static files used by the PHP and Python versions into /static.

Preparation work for the removal of the PHP version.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-04-28 10:53:22 +02:00
Leonidas Spyropoulos
1325c71712
chore: update poetry.lock
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-04-24 09:13:38 +01:00
Leonidas Spyropoulos
6ede837b4f
feat: allow users to hide deleted comments
Closes: #435

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-04-24 09:13:38 +01:00
Leonidas Spyropoulos
174af5f025
chore(release): prepare for 6.2.2
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-03-15 12:09:47 +00:00
moson-mo
993a044680
fix(poetry): use classic installer
The "install" module (v0.6.0) which is being used by poetry 1.4.0
has problems installing certain packages.

Disable the modern installer for now, until things are fixed.

https://github.com/python-poetry/poetry/issues/7572
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-03-14 17:57:57 +01:00
moson-mo
bf0d4a2be7
fix(deps): bump dependencies
bump all deps except sqlalchemy.

- Updating exceptiongroup (1.1.0 -> 1.1.1)
- Updating pydantic (1.10.5 -> 1.10.6)
- Updating starlette (0.25.0 -> 0.26.1)
- Updating charset-normalizer (3.0.1 -> 3.1.0)
- Updating fastapi (0.92.0 -> 0.94.1)
- Updating setuptools (67.4.0 -> 67.6.0)
- Updating urllib3 (1.26.14 -> 1.26.15)
- Updating alembic (1.9.4 -> 1.10.2)
- Updating fakeredis (2.9.2 -> 2.10.0)
- Updating prometheus-fastapi-instrumentator (5.10.0 -> 5.11.1)
- Updating protobuf (4.22.0 -> 4.22.1)
- Updating pytest-xdist (3.2.0 -> 3.2.1)
- Updating uvicorn (0.20.0 -> 0.21.0)
- Updating filelock (3.9.0 -> 3.9.1)

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-03-14 17:57:56 +01:00
moson-mo
b9df7541b3
fix: add comments in email for direct deletion/merge
TUs and Devs can delete and merge packages directly.
Currently, the comments they enter don't end up in the ML notification.

Include the comment in the notifications for direct deletion / merge

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-03-14 11:17:45 +01:00
moson-mo
7d1827ffc5
feat: cancel button for comment editing
Adds button that allows cancellation while editing a comment

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-03-09 21:48:58 +01:00
moson-mo
52c962a590
fix(deps): fastapi 0.92.0 upgrade
middleware must be added before startup:

fixes: "RuntimeError: Cannot add middleware after an application has started"

https://fastapi.tiangolo.com/release-notes/#0910
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-03-04 10:29:54 +01:00
moson-mo
c0390240bc
housekeep(deps): bump dependencies
update all poetry deps to the latest version (except of sqlalchemy)

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-03-04 10:27:57 +01:00
moson-mo
7d06c9ab97
fix: encode package name in URL for source files
Package(Base) names might include characters that require url-encoding

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-03-01 18:04:20 +01:00
moson-mo
8aac842307
fix(test): use single-quotes for strings in sql statements
Currently, in the sharness test suites, we use double-quotes
for string literals in SQL statements passed to sqlite3.

With sqlite version 3.41, the usage of double-quotes for string literals
is deactivated by default, so we'll need to switch to single-quotes
in our tests.

Ref: Section 6.f. at https://www.sqlite.org/releaselog/3_41_0.html
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-02-24 10:11:33 +01:00
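The quoting rule at play, shown with Python's sqlite3 module (the sharness tests invoke the sqlite3 CLI, but the semantics are the same): single quotes always denote string literals, while double quotes denote identifiers.

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Single quotes are always a string literal -- safe on every SQLite version.
literal = con.execute("SELECT 'foo'").fetchone()[0]
# Double quotes name an identifier; the legacy fallback that treats an
# unknown identifier like "foo" as a string literal is disabled by default
# in newer builds, producing a "no such column" error instead.
```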
moson-mo
0c5b4721d6
fix: include package data without "Last Packager"
Data for packages that do not have a "Last Packager"
(e.g. because the user account was deleted)
should still be available from the /rpc and metadata archives.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-02-21 11:19:02 +01:00
moson-mo
8d2e176c2f
housekeep: stop "pkgmaint" script (cron job)
With the removal of the "setup-repo" command this script becomes obsolete,
because it is not possible to reserve a repo anymore.
Hence we don't need cleanup.

We've also seen issues in case the last packager's user account is removed,
leading to the deletion of a Package.

Let's deactivate this for now.

Issue report: #425

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-02-21 11:19:02 +01:00
moson-mo
b1a9efd552
housekeep(git): remove deprecated "setup-repo" command
Marked as deprecated about 6 years ago.
Time to bury it.

Issue report: #428

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-02-21 11:19:02 +01:00
Leonidas Spyropoulos
68813abcf0
fix(RTL): make RTL layout properly displayed
Closes: #290

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-02-19 02:14:57 +00:00
Leonidas Spyropoulos
45218c4ce7
fix: per-page needs to be non zero
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-02-08 15:13:21 +00:00
Leonidas Spyropoulos
cb16f42a27
fix: validate timezone before use
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-02-06 16:40:43 +00:00
moson-mo
f9a5188fb7
chore(lint): reformatting after black update
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-02-06 09:15:18 +01:00
moson-mo
2373bdf400
fix(deps): bump pre-commit hooks
Bump hooks with "pre-commit autoupdate".

There is an issue with the latest poetry version and the "isort" hook module
https://github.com/PyCQA/isort/issues/2077

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-02-06 09:12:00 +01:00
Leonidas Spyropoulos
8b25d11a3a
chore(release): prepare for 6.2.1
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-01-27 18:08:54 +00:00
Leonidas Spyropoulos
ef2baad7b3
feat: expand on update.py tests and show on Gitlab UI
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-01-27 17:16:25 +00:00
moson-mo
137ed04d34
test: add tests .SRCINFO parsing and git update routine
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-27 15:40:25 +01:00
moson-mo
97e1f07f71
fix(deps): update srcinfo to 0.1.2
Fixes issue parsing .SRCINFO files

Issue report: #422

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-27 14:04:55 +01:00
Leonidas Spyropoulos
2b76b90885
chore(release): prepare for 6.2.0
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-01-26 23:19:04 +00:00
moson-mo
7f9ac28f6e
feat(deps): add watchfiles
When running aurweb with hot-reloading, the CPU consumption is quite high.
This is because it is using "StatReload" for detecting modified files.
(which seems to be rather inefficient)

When "watchfiles" is installed, it'll automatically use that instead and
CPU load goes down to 1%.
watchfiles uses filesystem events for detecting changes and is way more efficient.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-26 12:59:40 +01:00
Leonidas Spyropoulos
255cdcf667
fix:(revert): fix: only try to show dependencies if object exists
This reverts commit 0e44687ab1.

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-01-25 22:17:33 +00:00
moson-mo
ec239ceeb3
feat: add "Last Updated" column to search results
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-25 22:39:36 +01:00
moson-mo
becce1aac4
fix: occasional errors when loading package details
Fixes errors that might occur when loading the package details page.

Problem:
We are querying a list of "Required by" packages.
This list is loaded with all details for a "PackageDependency" record.

Now we also have a reference to some attributes from the
related package (PackageDependency.Package.xxx)

This will effectively trigger the ORM to run another query (lazyload),
to fetch the missing Package data (for each PackageDependency record).

At that point it might have happened that a referenced package
got deleted / updated so that we can't retrieve this data anymore and
our dep.Package object is "None"

Fix:
We can force our query to include Package data right away.
Thus we can avoid running a separate query (per "required by"...)

As a side-effect we get better performance.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-25 22:34:19 +01:00
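A minimal sketch of the eager-loading fix described above (assuming SQLAlchemy; the model names are illustrative stand-ins, not aurweb's actual schema): joinedload pulls the related Package in the same SELECT, so no per-row lazy load can race against a concurrent delete.

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base, joinedload, relationship

Base = declarative_base()

class Package(Base):
    __tablename__ = "Packages"
    ID = Column(Integer, primary_key=True)
    Name = Column(String)

class PackageDependency(Base):
    __tablename__ = "PackageDepends"
    ID = Column(Integer, primary_key=True)
    PackageID = Column(Integer, ForeignKey("Packages.ID"))
    Package = relationship(Package)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(PackageDependency(Package=Package(Name="foo")))
    session.commit()
    # joinedload emits ONE query with a LEFT OUTER JOIN instead of a
    # separate lazy SELECT per PackageDependency row.
    deps = (
        session.query(PackageDependency)
        .options(joinedload(PackageDependency.Package))
        .all()
    )
    required_by_name = deps[0].Package.Name
```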
Leonidas Spyropoulos
6c9be9eb97
fix(deps): update dependencies from renovate
fastapi ^0.89.0
coverage v7
srcinfo ^0.1.0

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-01-25 21:17:50 +00:00
Leonidas Spyropoulos
c176b2b611
feature: increase mandatory coverage to 95%
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-01-25 19:34:52 +00:00
moson-mo
ff0123b54a
fix: save notification state for unchanged comments
When we edit a comment we can enable notifications (if not yet enabled).

We should also do this when the comment text is not changed.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-25 09:42:20 +01:00
moson-mo
36fd58d7a6
fix: show notification box when adding a comment
Currently, the "Enable notifications" checkbox
is only shown when editing a comment.

We should also show it when a new comment is about to be added.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-25 09:42:19 +01:00
moson-mo
65ba735f18
fix: bleach upgrade 6.0
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-23 23:50:04 +01:00
renovate
a2487c20d8
fix(deps): update dependency bleach to v6 2023-01-23 17:24:53 +00:00
Christian Heusel
f41f090ed7 simplify the docker development setup instructions
use `docker compose exec` instead of `docker ps` and `docker exec`

Signed-off-by: Christian Heusel <christian@heusel.eu>
2023-01-15 09:25:22 +00:00
Leonidas Spyropoulos
0e44687ab1 fix: only try to show dependencies if object exists
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-01-14 21:08:34 +00:00
Leonidas Spyropoulos
4d0a982c51 fix: assert offset and per_page are positive
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-01-14 20:57:11 +00:00
moson-mo
f6c4891415
feat: add Support section to Dashboard
Adds the "Support" section (displayed on "Home") to the "Dashboard" page as well.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-14 13:12:33 +01:00
moson-mo
2150f8bc19
fix(docker): nginx health check
nginx health check always results in "unhealthy":

There is no such option "--no-verify" for curl.
We can use "-k" or "--insecure" for disabling SSL checks.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-13 10:26:43 +01:00
moson-mo
ff44eb02de
feat: add link to mailing list article on requests page
Provides a convenient way to check for responses on the
mailing list prior to Accepting/Rejecting requests.

We compute the Message-ID hash that can be used to
link back to the article in the mailing list archive.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-12 12:06:28 +01:00
Kevin Morris
154bb239bf
update-zh_TW translations 2023-01-11 12:25:54 -08:00
Kevin Morris
65d364fe90
update-zh_CN translations 2023-01-11 12:25:53 -08:00
Kevin Morris
ef0e3b9f35
update-zh translations 2023-01-11 12:25:53 -08:00
Kevin Morris
2770952dfb
update-vi translations 2023-01-11 12:25:53 -08:00
Kevin Morris
4cff1e500b
update-uk translations 2023-01-11 12:25:53 -08:00
Kevin Morris
b36cbd526b
update-tr translations 2023-01-11 12:25:52 -08:00
Kevin Morris
5609ddf791
update-sv_SE translations 2023-01-11 12:25:52 -08:00
Kevin Morris
8592bada16
update-sr_RS translations 2023-01-11 12:25:52 -08:00
Kevin Morris
46c925bc82
update-sr translations 2023-01-11 12:25:52 -08:00
Kevin Morris
8ee843b7b1
update-sk translations 2023-01-11 12:25:51 -08:00
Kevin Morris
ebae0d4304
update-ru translations 2023-01-11 12:25:51 -08:00
Kevin Morris
fa20a3b5d8
update-ro translations 2023-01-11 12:25:51 -08:00
Kevin Morris
e7bcf2fc97
update-pt_PT translations 2023-01-11 12:25:51 -08:00
Kevin Morris
bb00a4ecfd
update-pt_BR translations 2023-01-11 12:25:50 -08:00
Kevin Morris
6ee7598211
update-pt translations 2023-01-11 12:25:50 -08:00
Kevin Morris
e572b86fd3
update-pl translations 2023-01-11 12:25:50 -08:00
Kevin Morris
05c6266986
update-nl translations 2023-01-11 12:25:50 -08:00
Kevin Morris
57a2b4b516
update-nb_NO translations 2023-01-11 12:25:49 -08:00
Kevin Morris
d20dbbcf74
update-nb translations 2023-01-11 12:25:49 -08:00
Kevin Morris
e5137e0c42
update-lt translations 2023-01-11 12:25:49 -08:00
Kevin Morris
e6d36101d9
update-ko translations 2023-01-11 12:25:49 -08:00
Kevin Morris
08af8cad8d
update-ja translations 2023-01-11 12:25:49 -08:00
Kevin Morris
a12dbd191a
update-it translations 2023-01-11 12:25:48 -08:00
Kevin Morris
0d950a0c9f
update-is translations 2023-01-11 12:25:48 -08:00
Kevin Morris
3a460faa6e
update-id_ID translations 2023-01-11 12:25:48 -08:00
Kevin Morris
28e8b31211
update-id translations 2023-01-11 12:25:48 -08:00
Kevin Morris
5f71e58db1
update-hu translations 2023-01-11 12:25:47 -08:00
Kevin Morris
bf348fa572
update-hr translations 2023-01-11 12:25:47 -08:00
Kevin Morris
b209cd962c
update-hi_IN translations 2023-01-11 12:25:47 -08:00
Kevin Morris
9385c14f77
update-he translations 2023-01-11 12:25:47 -08:00
Kevin Morris
ff01947f3d
update-fr translations 2023-01-11 12:25:47 -08:00
Kevin Morris
3fa9047864
update-fi_FI translations 2023-01-11 12:25:46 -08:00
Kevin Morris
bce9bedaf4
update-fi translations 2023-01-11 12:25:46 -08:00
Kevin Morris
076245e061
update-et translations 2023-01-11 12:25:46 -08:00
Kevin Morris
aeb38b599d
update-es translations 2023-01-11 12:25:46 -08:00
Kevin Morris
6bf408775c
update-el translations 2023-01-11 12:25:46 -08:00
Kevin Morris
791e715aee
update-de translations 2023-01-11 12:25:45 -08:00
Kevin Morris
5a7a9c2c9f
update-da translations 2023-01-11 12:25:45 -08:00
Kevin Morris
da458ae70a
update-cs translations 2023-01-11 12:25:45 -08:00
Kevin Morris
618a382e6c
update-ca_ES translations 2023-01-11 12:25:45 -08:00
Kevin Morris
d6661403aa
update-ca translations 2023-01-11 12:25:45 -08:00
Kevin Morris
9229220e21
update-bg translations 2023-01-11 12:25:44 -08:00
Kevin Morris
b89fe9eb13
update-az_AZ translations 2023-01-11 12:25:44 -08:00
Kevin Morris
3a13eeb744
update-az translations 2023-01-11 12:25:44 -08:00
Kevin Morris
65266d752b
update-ar translations 2023-01-11 03:09:09 -08:00
Kevin Morris
413de914ca
fix: remove trailing whitespace lint check for ./po
Signed-off-by: Kevin Morris <kevr@0cost.org>
2023-01-10 14:36:31 -08:00
moson-mo
7a9448a3e5
perf: improve packages search-query
Improves performance for queries with large result sets.

The "group by" clause can be removed for all search types but the keywords.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-05 22:00:32 +01:00
moson-mo
d8e91d058c
fix(rpc): provides search should return name match
We need to return packages matching on the name as well.
(A package always provides itself)

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-03 15:58:45 +01:00
moson-mo
2b8dedb3a2
feat: add pagination element below comments
Other pages, like the "package search", have this as well.

Issue report: #390

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-28 17:01:44 +01:00
moson-mo
8027ff936c
fix: alignment of pagination element
pagination for comments should appear on the right instead of center

Issue report: #390

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-28 16:57:27 +01:00
Leonidas Spyropoulos
c74772cb36
chore: bump to v6.1.9
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-27 10:34:07 +00:00
moson-mo
7864ac6dfe
fix: search-by parameter for keyword links
Fixes:
Keyword-links on the package page pass the wrong query-parameter.
Thus a name/description search is performed instead of a keywords search.

Issue report: #397

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-27 10:33:58 +01:00
moson-mo
a08681ba23
fix: Add "Show more..." link for "Required by"
Fix glitch on the package page:
"Show more..." not displayed for the "Required by" list

Fix test case:
The function name does not start with "test", hence it was never executed during test runs.

Issue report: #363

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-25 12:24:04 +01:00
moson-mo
a832b3cddb
fix(test): FastAPI 0.87.0 - warning fixes
FastAPI 0.87.0 switched to the httpx library for their TestClient

* cookies need to be defined on the request instance instead of method calls

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-24 22:43:31 +01:00
moson-mo
1216399d53
fix(test): FastAPI 0.87.0 - error fixes
FastAPI 0.87.0 switched to the httpx library for their TestClient

* allow_redirects is deprecated and replaced by follow_redirects

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-24 22:23:37 +01:00
renovate
512ba02389
fix(deps): update dependency fastapi to ^0.87.0 2022-11-23 00:25:31 +00:00
Leonidas Spyropoulos
6b0978b9a5
fix(deps): update dependencies from renovate
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-22 21:51:15 +00:00
moson-mo
d5e102e3f4
feat: add "Submitter" field to /rpc info request
Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-22 18:46:57 +01:00
Leonidas Spyropoulos
ff92e95f7a
fix: delete associated ssh public keys with account deletion
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-22 16:51:09 +00:00
Leonidas Spyropoulos
bce5b81acd
feat: allow filtering requests from maintainers
These are usually easy for TUs to handle, so allow filtering for them.

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-22 16:39:11 +00:00
Leonidas Spyropoulos
500d6b403b
feat: add co-maintainers to RPC
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-22 16:32:51 +00:00
moson-mo
bcd808ddc1
feat(rpc): add "by" parameter - comaintainers
Add "by" parameter: comaintainers

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-11 11:32:39 +01:00
moson-mo
efd20ed2c7
feat(rpc): add "by" parameter - keywords
Add "by" parameter: keywords

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-11 11:32:31 +01:00
moson-mo
5484e68b42
feat(rpc): add "by" parameter - submitter
Add "by" parameter: submitter

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-11 11:32:19 +01:00
moson-mo
0583f30a53
feat(rpc): add "by" parameter - groups
Adding "by" parameter to search by "groups"

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-11 11:32:01 +01:00
moson-mo
50287cb066
feat(rpc): add "by" parameters - package relations
This adds new "by" search-parameters: provides, conflicts and replaces

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-11 11:30:44 +01:00
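For illustration, a search using one of the new "by" parameters against the v5 RPC interface could be requested like this (URL construction only; no request is made here):

```python
from urllib.parse import urlencode

# Illustrative query: find packages that "provide" bash.
params = {"v": 5, "type": "search", "by": "provides", "arg": "bash"}
rpc_url = "https://aur.archlinux.org/rpc?" + urlencode(params)
```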
Leonidas Spyropoulos
73f0bddf0b
fix: handle default requests when using pages
The default page shows the pending requests, which was working OK if one
used the Filters button. This fixes the case where someone submits by
using the pager (Next, Last, etc.).

Closes: #405

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-08 13:14:42 +00:00
moson-mo
c248a74f80
chore: fix mailing-list URL on passreset page
small addition to the patch provided in #404

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-07 14:36:34 +01:00
Lex Black
4f56a01662
chore: fix mailing-lists urls
Those changed after the migration to mailman3

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-04 14:17:08 +00:00
Leonidas Spyropoulos
c0e806072e
chore: bump to v6.1.8
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-01 18:31:37 +00:00
Leonidas Spyropoulos
d00371f444
housekeep: bump renovate dependencies
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-01 17:24:13 +00:00
Leonidas Spyropoulos
f10c1a0505
perf: add PackageKeywords.PackageBaseID index
This is used by the export for package-meta.v1.gz generation.

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-01 17:24:13 +00:00
moson-mo
5669821b29
perf: tweak some queries in mkpkglists
We can omit the "distinct" from some queries
because constraints in the DB ensure uniqueness:

* Groups sub-query
PackageGroup: Primary key makes "PackageID" + "GroupID" unique
Groups: Unique index on "Name" column
-> Technically we can't have a package with the same group-name twice

* Licenses sub-query:
PackageLicense -> Primary key makes "PackageID" + "LicenseID" unique
Licenses -> Unique index on "Name" column
-> Technically we can't have a package with the same license-name twice

* Keywords sub-query:
PackageKeywords -> Primary key makes "PackageBaseID" + "KeywordID" unique
(And a Package can only have one PackageBase)
Keywords -> Unique index on "Name" column
-> Technically we can't have a package with the same Keyword twice

* Packages main-query:
We join PackageBases and Users on their primary key columns
(which are guaranteed to be unique)
-> There is no way we could end up with more than one record for a Package

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-01 18:18:06 +01:00
Leonidas Spyropoulos
286834bab1
fix: regression on gzipped filenames from 3dcbee5a
With 3dcbee5a, the filenames inside the .gz archives ended in .tmp.
This fixes those by using the GzipFile class constructor instead of
the gzip.open method.

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-31 14:43:31 +00:00
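The fix can be sketched as follows: gzip.open(path) records that path in the gzip FNAME header, so writing to "&lt;name&gt;.tmp" baked ".tmp" into the archive. Passing the final name to the GzipFile constructor while still writing into the temporary file keeps the rename atomic and the embedded filename correct (paths here are illustrative):

```python
import gzip
import os
import tempfile

workdir = tempfile.mkdtemp()
final = os.path.join(workdir, "packages-meta-v1.json.gz")
tmp = final + ".tmp"

# Write into the .tmp file, but record the *final* name in the gzip header.
with open(tmp, "wb") as raw:
    with gzip.GzipFile(filename=final, mode="wb", fileobj=raw) as gz:
        gz.write(b"{}")
os.rename(tmp, final)  # atomic replacement of the published archive

with gzip.open(final, "rb") as f:
    roundtrip = f.read()
```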
Mario Oenning
6ee34ab3cb feat: add field "CoMaintainers" to metadata-archives 2022-10-31 09:42:56 +00:00
Mario Oenning
333051ab1f feat: add field "Submitter" to metadata-archives 2022-10-28 16:55:16 +00:00
Leonidas Spyropoulos
48e5dc6763
feat: remove empty lines from ssh_keys text area, and show helpful message
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-28 13:43:32 +01:00
Leonidas Spyropoulos
7e06823e58
refactor: remove redundant parentheses when returning a tuple
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-28 13:43:32 +01:00
Leonidas Spyropoulos
d793193fdf
style: make logging easier to read
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-28 13:43:32 +01:00
Mario Oenning
3dcbee5a4f fix: make overwriting of archive files atomic 2022-10-28 12:42:50 +00:00
Leonidas Spyropoulos
524334409a
fix: add production logging.prod.conf to be less verbose
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-22 21:58:30 +01:00
Leonidas Spyropoulos
0417603499
housekeep: bump renovate dependencies
email-validator:  1.2.1 -> ^1.3.0
uvicorn:          ^0.18.0 -> ^0.19.0
fastapi:          ^0.83.0 -> ^0.85.0
pytest-asyncio:   ^0.19.0 -> ^0.20.1
pytest-cov:       ^3.0.0 -> ^4.0.0

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-22 21:48:40 +01:00
Leonidas Spyropoulos
8555e232ae
docs: fix mailing list after migration to mailman3
Closes: #396

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-22 20:15:46 +01:00
Leonidas Spyropoulos
9c0f8f053e
chore: rename logging.py and redis.py to avoid circular imports
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-22 18:51:38 +01:00
Leonidas Spyropoulos
b757e66997 feature: add filters and stats for requests
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-15 15:26:53 +03:00
Kevin Morris
da5a646a73
upgrade: bump to v6.1.7
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-10-11 15:04:25 -07:00
Kevin Morris
18f5e142b9
fix: include orphaned packages in metadata output
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-10-11 14:50:09 -07:00
Kevin Morris
3ae6323a7c
upgrade: bump to v6.1.6
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-30 05:19:58 -07:00
Kevin Morris
8657fd336e
feat: GET|POST /account/{name}/delete
Closes #348

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-30 05:08:50 -07:00
Kevin Morris
1180565d0c
Merge branch 'upd-metadata-doc' 2022-09-26 01:39:09 -07:00
Kevin Morris
eb0c5605e4
upgrade: bump version to v6.1.5
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-26 01:28:38 -07:00
Kevin Morris
e00b0059f7
doc: remove --spec popularity from cron recommendations
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-26 01:27:37 -07:00
Leonidas Spyropoulos
0dddaeeb98
fix: remove sessions of suspended users
Fixes: #394

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-09-26 08:59:44 +01:00
moson-mo
137644e919
docs: suggest shallow clone in git-archive.md
We should suggest making a shallow clone to reduce
the amount of data transferred initially.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-09-25 10:03:05 +02:00
Kevin Morris
30e72d2db5 feat: archive git repository (experimental)
See doc/git-archive.md for general Git archive specifications
See doc/repos/metadata-repo.md for info and direction related to the new Git metadata archive
2022-09-24 16:51:25 +00:00
Kevin Morris
ec3152014b
fix: retry transactions who fail due to deadlocks
In my opinion, this kind of handling of transactions is pretty ugly.
That being said, we have issues with running into deadlocks on aur.al,
so this commit works against that immediate bug.

An ideal solution would be to deal with retrying transactions through
the `db.begin()` scope, so we wouldn't have to explicitly annotate
functions as "retry functions," which is what this commit does.

Closes #376

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-13 12:54:08 -07:00
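A retry wrapper of the kind described can be sketched as follows. The names are hypothetical; the real implementation annotates functions around aurweb's `db.begin()` scope:

```python
import functools
import time


class DeadlockError(Exception):
    """Stand-in for the driver's deadlock exception (e.g. MySQL error 1213)."""


def retry_deadlock(attempts=3, backoff=0.0):
    """Re-run the decorated function if it raises DeadlockError."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(attempts):
                try:
                    return func(*args, **kwargs)
                except DeadlockError:
                    if attempt == attempts - 1:
                        raise  # out of attempts; surface the deadlock
                    time.sleep(backoff)
        return wrapper
    return decorator


calls = []

@retry_deadlock(attempts=3)
def flaky_transaction():
    calls.append(1)
    if len(calls) < 3:
        raise DeadlockError  # simulate two deadlocks before success
    return "committed"

result = flaky_transaction()
```

As the message notes, handling retries inside the transaction scope itself would avoid having to annotate each function explicitly.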
Kevin Morris
f450b5dfc7
upgrade: bump to version v6.1.4
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-12 12:29:57 -07:00
Kevin Morris
adc3a21863
fix: add 'unsafe-inline' to script-src CSP
swagger-ui uses inline javascript to bootstrap itself, so we need to
allow unsafe inline because we can't give swagger-ui a nonce to embed.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-12 12:28:42 -07:00
Kevin Morris
37c7dee099
fix: produce DeleteNotification a line before handle_request
With this on a single line, the argument ordering and class/function
execution order was too unpredictable, causing exceptions to be thrown
when producing a notification based on a deleted pkgbase object.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-12 10:36:50 -07:00
Kevin Morris
624954042b
doc(rpc): include route doc at the top of aurweb.routers.rpc
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-12 06:59:52 -07:00
Kevin Morris
17f2c05fd3
feat(rpc): add GET /rpc/v5/suggest/{arg} openapi route
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-12 06:50:26 -07:00
Kevin Morris
8e8b746a5b
feat(rpc): add GET /rpc/v5/search/{arg} openapi route
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-12 06:50:19 -07:00
Kevin Morris
5e75a00c17
upgrade: bump to version v6.1.3
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-11 19:59:16 -07:00
Kevin Morris
9faa7b801d
feat: add cdn.jsdelivr.net to script/style CSP
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-11 19:56:29 -07:00
Kevin Morris
df0a4a2be2
feat(rpc): add /rpc/v5/{type} openapi-compatible routes
We will be modeling future RPC implementations on an OpenAPI spec.
While this commit does not completely cohere to OpenAPI in terms
of response data, this is a good start and will allow us to cleanly
document these OpenAPI routes now and in the future.

This commit brings in the new RPC routes:
- GET /rpc/v5/info/{pkgname}
- GET /rpc/v5/info?arg[]=pkg1&arg[]=pkg2
- POST /rpc/v5/info with JSON data `{"arg": ["pkg1", "pkg2"]}`
- GET /rpc/v5/search?arg=keywords&by=valid-by-value
- POST /rpc/v5/search with JSON data `{"by": "valid-by-value", "arg": "keywords"}`

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-11 19:11:18 -07:00
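For the GET variants listed above, the `arg[]` parameters are ordinary repeated query parameters, so a client can build the URLs with the standard library alone. This is a sketch, not an official client, and `by=name` is only an example of a valid `by` value:

```python
from urllib.parse import urlencode

base = "https://aur.archlinux.org/rpc/v5"

# GET /rpc/v5/info?arg[]=pkg1&arg[]=pkg2 — repeat the key once per package.
info_qs = urlencode([("arg[]", "pkg1"), ("arg[]", "pkg2")])
info_url = f"{base}/info?{info_qs}"

# GET /rpc/v5/search?arg=keywords&by=valid-by-value
search_qs = urlencode({"arg": "keywords", "by": "name"})
search_url = f"{base}/search?{search_qs}"
```

The POST variants carry the same arguments as a JSON body instead.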
renovate
bb6e602e13 fix(deps): update dependency fastapi to ^0.83.0 2022-09-12 01:42:09 +00:00
Kevin Morris
4e0618469d
fix(test): JSONResponse() requires a content argument with fastapi 0.83.0
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-11 18:40:31 -07:00
Kevin Morris
b3853e01b8
fix(pre-commit): include migrations in fixes/checks
We want all Python files related to the project to be checked. Most
of them already are, but migrations are a core part of FastAPI aurweb
and should be included as well.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-11 18:07:54 -07:00
Kevin Morris
03776c4663
fix(docker): cache & install pre-commit deps during image build
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-11 18:00:11 -07:00
Kevin Morris
a2d08e441e
fix(docker): run pre-commit run -a once
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-11 17:59:45 -07:00
Kevin Morris
6ad24fc950
Merge branch 'fix-docker-test' 2022-09-11 15:57:08 -07:00
renovate
69d6724749
fix(deps): update dependency redis to v4 2022-09-10 05:25:06 +00:00
renovate
307d944cf1
fix(deps): update dependency protobuf to v4 2022-09-10 03:25:08 +00:00
renovate
3de17311cf
fix(deps): update dependency bleach to v5 2022-09-10 00:25:02 +00:00
renovate
7ad22d8143
fix(deps): update dependency bcrypt to v4 2022-09-07 14:24:55 +00:00
renovate
6ab9663b76
fix(deps): update dependency authlib to v1 2022-09-07 06:25:25 +00:00
renovate
486f8bd61c
fix(deps): update dependency aiofiles to v22 2022-09-07 04:24:53 +00:00
renovate
a39f34d695
chore(deps): update dependency pytest to v7 2022-09-07 03:25:30 +00:00
renovate
bb310bdf65
fix(deps): update dependency uvicorn to ^0.18.0 2022-09-07 02:24:55 +00:00
renovate
a73af3e76d
fix(deps): update dependency hypercorn to ^0.14.0 2022-09-07 01:25:03 +00:00
renovate
a981ae4052
fix(deps): update dependency httpx to ^0.23.0 2022-09-07 00:25:32 +00:00
renovate
cdc7bd618c
fix(deps): update dependency email-validator to v1.2.1 2022-09-06 23:24:49 +00:00
renovate
b38e765dfe
fix(deps): update dependency aiofiles to ^0.8.0 2022-09-06 22:24:52 +00:00
renovate
655402a509
chore(deps): update dependency pytest-asyncio to ^0.19.0 2022-09-06 10:25:02 +00:00
renovate
a84d115fa1
chore(deps): add renovate.json 2022-09-06 08:24:03 +00:00
Leonidas Spyropoulos
310c469ba8
fix: run pre-commit checks instead of flake8 and isort
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-09-06 08:07:05 +01:00
Kevin Morris
25e05830a6
test: test that /packages/{name} produces the package's description
This commit fixes two of our tests in test_templates.py to go along
with our new template modifications, as well as a new test in
test_packages_routes.py which constructs two packages belonging
to the same package base, then tests that viewing their pages
produces their independent descriptions.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-05 19:50:41 -07:00
Kevin Morris
0388b12896
fix: package description on /packages/{name} view
...What in the world happened here. We were literally just populating
`pkg` based on `pkgbase.packages.first()`. We should have been focusing
on the package passed by the context, which is always available when
`show_package_details` is true.

Closes #384

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-05 19:25:32 -07:00
Kevin Morris
83ddbd220f
test: get /requests displays all requests, including those without a User
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-05 02:56:48 -07:00
Kevin Morris
a629098b92
fix: conditional display on Request's 'Filed by' field
Since we support requests which have no associated user, we must
support the case where we are displaying such a request.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-05 02:55:20 -07:00
Kevin Morris
7fed5742b8
fix: display requests for TUs which no longer have an associated User
Closes #387

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-05 02:43:23 -07:00
Kevin Morris
6435c2b1f1
upgrade: bump to version v6.1.2
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-02 15:28:02 -07:00
Kevin Morris
b8a4ce4ceb
fix: include maint/comaint state in pkgbase post's error context
Closes #386

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-02 15:12:41 -07:00
Kevin Morris
8a3a7e31ac
upgrade: bump version to v6.1.1
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-31 22:01:54 -07:00
Kevin Morris
929bb756a8
ci(lint): add .pre-commit cache for pre-commit
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-23 02:32:35 -07:00
Kevin Morris
fbb3e052fe
ci: use cache/virtualenv for test dependencies
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-23 02:19:16 -07:00
Kevin Morris
57c0409958
style: set flake8's max-line-length=88
In accordance with black's defined style, we now expect a maximum
of 88 columns for any one particular line.

This change fixes remaining violations of 88 columns in the codebase
(not many), and introduces the modified flake8 configuration.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-22 23:44:56 -07:00
Joakim Saario
ce5dbf0eeb
docs(contributing): Update Coding Style 2022-08-22 22:42:10 +02:00
Joakim Saario
de5538a40f
ci(lint): Use pre-commit 2022-08-22 22:42:10 +02:00
Joakim Saario
505eb90479
chore: Add .git-blame-ignore-revs file
The idea is to exclude commits that only contain formatting changes, so
that it's easier to trace actual code changes with `git blame`.
2022-08-22 22:41:58 +02:00
Joakim Saario
9c6c13b78a
style: Run pre-commit 2022-08-22 22:40:45 +02:00
Joakim Saario
b47882b114
chore(pre-commit) Use hooks from official repositories
The reason behind this is to make checking and formatting consistent between
contributors and CI. It is also easier to incorporate new hooks, since many
tools already provide pre-commit hooks.

In addition this commit also adds `black` and `autoflake` along with a few
other useful hooks from the `pre-commit-hooks` repository.
2022-08-22 22:37:32 +02:00
Kevin Morris
08d485206c
feature: allow co-maintainers to disown their pkg
Derived off of original work done by Leonidas Spyropoulos
at https://gitlab.archlinux.org/archlinux/aurweb/-/merge_requests/503

This revision of that original work finishes off the inconsistencies
mentioned in the original MR and adds a small bit of testing for more
regression checks.

Fixes: #360

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-19 18:00:29 -07:00
Kevin Morris
ab2956eef7
feat: add pytest unit of independent user unflagging
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-18 16:02:03 -07:00
Kevin Morris
93b4cec932
Merge branch 'show-unflag-link-to-flagger' 2022-08-18 16:01:38 -07:00
Kevin Morris
fd4aaed208
fix: use max-age for all cookie expirations
in addition, remove cookie expiration for AURREMEMBER --
we don't really care about a session time for this cookie, it merely
acts as a flag given out on login to remember what the user selected

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-18 15:15:40 -07:00
Kevin Morris
8e43932aa6
fix(doc): re-add Max-Age to list of secure cookie attributes
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-18 14:57:42 -07:00
Kevin Morris
4303086c0e
Merged branch 'sameorigin-lax'
Closes #351

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-18 14:47:24 -07:00
Joakim Saario
f10732960c
fix: Use SameSite=Lax on cookies 2022-08-18 23:42:33 +02:00
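Taken together with the Max-Age change above, the resulting cookie attributes can be sketched with the standard library. The cookie name and values here are illustrative only:

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["AURSID"] = "example-session-id"
morsel = cookie["AURSID"]
morsel["samesite"] = "Lax"        # sent on top-level navigation, withheld cross-site
morsel["max-age"] = 60 * 60 * 24  # expire via Max-Age instead of Expires
morsel["secure"] = True
morsel["httponly"] = True

header = morsel.OutputString()
```

`OutputString()` yields the attribute string a `Set-Cookie` header would carry, e.g. including `Max-Age=86400; Secure; HttpOnly; SameSite=Lax`.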
Kevin Morris
fb1fb2ef3b
feat: documentation for web authentication (login, verification)
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-17 09:59:56 -07:00
Leon Möller
33bf5df236 fix: show unflag link to flagger
While the flagger is allowed to unflag a package, the link to do so is
hidden from them. Fix by adding the flagger to the unflag list.

Fix #380
2022-08-16 13:19:15 +00:00
Kevin Morris
15d016eb70
fix: secure access to comment edits to user who owns the comment
Found along with the previous commit to be a security hole in our
implementation. This commit resolves an issue regarding comment editing.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-15 23:30:34 -07:00
Kevin Morris
7a52da5587
fix: guard POST keywords & allow co-maintainers to see keyword form
This addresses a severe security issue, which is omitted from this
git message for obscurity purposes.

Otherwise, it allows co-maintainers to see the keyword form when
viewing a package they co-maintain.

Closes #378

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-15 23:30:15 -07:00
Kevin Morris
7b047578fd
fix: correct kwarg name for approved users of creds.has_credential
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-15 19:34:18 -07:00
Kevin Morris
801df832e5
fix(rpc): correct URLPath in package results
This was incorrectly using the particular Package record's name
to format options.snapshot_uri in order to produce URLPath.

It should, instead, use the PackageBase record's name, which
this commit resolves.

Bug reported by thomy2000

Closes #382

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-15 10:06:44 -07:00
Kevin Morris
edacde48e5
Merge branch 'paginate-comments' 2022-08-14 19:50:21 -07:00
Kevin Morris
b4e0aea2b7
Merged bugfixes
Brings in: 9497f6e671
Closes #512

Thanks, jelle!

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-14 19:25:49 -07:00
Jelle van der Waa
9497f6e671
fix(aurweb): resolve exception in ratelimit
Redis's get() method can return None which makes an RPC request error
out:

  File "/srv/http/aurweb/aurweb/ratelimit.py", line 103, in check_ratelimit
    requests = int(requests.decode())
AttributeError: 'NoneType' object has no attribute 'decode'
2022-08-14 15:43:13 +02:00
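The fix boils down to guarding the `None` that redis-py's `get()` returns for a missing key before calling `.decode()`. A sketch with a stand-in client, so no real Redis is needed; the key format is hypothetical:

```python
class FakeRedis:
    """Minimal stand-in for redis-py: get() returns bytes or None."""

    def __init__(self, data=None):
        self.data = data or {}

    def get(self, key):
        return self.data.get(key)  # None when the key does not exist


def request_count(redis, key):
    raw = redis.get(key)
    if raw is None:  # missing key: no requests recorded yet
        return 0
    return int(raw.decode())


r = FakeRedis({"ratelimit:1.2.3.4": b"42"})
hit = request_count(r, "ratelimit:1.2.3.4")
miss = request_count(r, "ratelimit:5.6.7.8")
```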
Kevin Morris
4565aa38cf
update: Swedish translations
Pulled from Transifex on 08/12/2022 - 08/13/2022.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-13 23:51:49 -07:00
Kevin Morris
a82d552e1b
update: migrate new transifex client configuration
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-13 23:49:47 -07:00
Kevin Morris
d63615a994
fix(docker): fix ca entrypoint logic and healthcheck
With this commit, it is advised to `rm ./data/root_ca.crt ./data/*.pem`,
as new certificates and a root CA will be generated while utilizing the
step volume.

Closes #367

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-13 23:43:04 -07:00
Kevin Morris
6f7ac33166
Revert "feat(db): add an index for SSHPubKeys.PubKey (#2)"
This reverts commit 6c7e274968.

Once again, this does actually cause issues with foreign keys.
Removing it for now and will revisit this.
2022-08-13 23:28:31 -07:00
Kevin Morris
829a8b4b81
Revert "fix(docker): apply chown each time sshd is started"
This reverts commit 952c24783b.

The issue found was actually:
- If `./aur.git` exists within the aurweb repository locally,
  it also ends up in the destination, stopping the aurweb_git_data
  volume from being mounted properly.
2022-08-13 20:56:43 -07:00
Kevin Morris
952c24783b
fix(docker): apply chown each time sshd is started
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-13 20:13:07 -07:00
Kevin Morris
6c7e274968
feat(db): add an index for SSHPubKeys.PubKey (#2)
Speeds up SSHPubKeys.PubKey searches in a larger database.

Fixed form of the original commit which was reverted,
1a7f6e1fa9

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-13 19:52:50 -07:00
Kevin Morris
5abd5db313
Revert "feat(db): add an index for SSHPubKeys.PubKey"
This reverts commit 1a7f6e1fa9.

This commit broke account creation in some way. We'd still like to
do this, but we need to ensure it does not intrude on other facets.

Extra: We should really work out how this even passed tests; it
should not have.
2022-08-13 19:23:19 -07:00
Kevin Morris
b3d09a4b77
Merge branch 'dummy-data-instructions' 2022-08-13 16:31:47 -07:00
Kevin Morris
1a7f6e1fa9
feat(db): add an index for SSHPubKeys.PubKey
Speeds up SSHPubKeys.PubKey searches in a larger database.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-12 22:26:26 -07:00
Kevin Morris
913ce8a4f0
fix(performance): lazily load expensive modules within aurweb.db
Closes #374

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-12 22:26:26 -07:00
Jelle van der Waa
0e82916b0a fix(python): don't show maintainer link for non logged in users
Show plain maintainer text for non-logged-in users, as is done for the
submitter and last packager.

Closes #373
2022-08-10 19:04:59 +00:00
Kevin Morris
9648628a2c
update: requests dependency
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-09 16:43:27 -07:00
Leonidas Spyropoulos
2c080b2ea9
feature: add pagination on comments
Fixes: #354

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-08-02 20:27:47 +03:00
Leonidas Spyropoulos
1d6335363c fix: strip whitespace when parsing package keywords
Remove all extra whitespace when parsing Keywords to ensure we don't add
empty keywords in the DB.

Closes: #332

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-08-02 17:06:36 +03:00
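`str.split()` with no arguments already collapses runs of whitespace and never yields empty fields, which is one straightforward way to implement the cleanup described (sketch; the real parsing code may differ):

```python
def parse_keywords(raw: str) -> list[str]:
    # split() without a separator strips leading/trailing whitespace and
    # drops empty strings, so "  a   b  " -> ["a", "b"] with no "" entries.
    return raw.split()


keywords = parse_keywords("  linux   kernel\t aur  ")
```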
Jelle van der Waa
a509e40474 fix(python): use standard dict/list type annotation
Since Python 3.9, list/dict can be used as type hints.
2022-08-02 12:06:58 +00:00
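Since Python 3.9 (PEP 585), the built-in container types are subscriptable in annotations, so the `typing.List`/`typing.Dict` imports can be dropped. A small sketch:

```python
# Pre-3.9 style required:
#   from typing import Dict, List
#   def tally(names: List[str]) -> Dict[str, int]: ...

def tally(names: list[str]) -> dict[str, int]:
    """Count occurrences using the built-in generics directly."""
    counts: dict[str, int] = {}
    for name in names:
        counts[name] = counts.get(name, 0) + 1
    return counts


result = tally(["a", "b", "a"])
```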
Hugo Osvaldo Barrera
d6fa4ec5a8 Explain how to populate dummy data for TESTING
Signed-off-by: Hugo Osvaldo Barrera <hugo@whynothugo.nl>
2022-07-19 18:55:42 +02:00
Leonidas Spyropoulos
28970ccc91
fix: align text on left
Closes: #368

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-07-17 19:41:19 +01:00
Leonidas Spyropoulos
034e47bc28
fix: hide Unflag package from non-maintainers
Closes: #364
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-07-17 19:37:00 +01:00
Jelle van der Waa
0b03a6871e
fix(docker): document runtime deps 2022-07-04 21:35:41 +02:00
Jelle van der Waa
4a58e1349c
fix(docker): fix typo scheme -> schema 2022-07-04 21:35:06 +02:00
Jelle van der Waa
edef6cc6ac chore(css): drop old vendor prefixes
All of these vendor prefixes are already supported by all browsers for
quite a while.
2022-06-30 21:57:52 +02:00
Jelle van der Waa
ade624c215 doc(README): update contributing guidelines 2022-06-29 10:57:12 +00:00
Jelle van der Waa
98f55879d3 fix(docker): don't run redis with protected mode
For our development setup we run a redis container without a
username/password. Redis recently set protected mode by default which
disallows this, turn it off as it has no security implication.
2022-06-28 22:14:01 +02:00
Jelle van der Waa
8598ea6f74
fix(gitlab-ci): update coverage reporting in CI
Gitlab 14.10 introduced a coverage_report key which obsoletes the old
way of reporting coverage data.
2022-06-27 21:05:05 +02:00
Kristian Klausen
4ddd1dec9c
upgrade: bump to v6.0.28 2022-05-13 00:41:22 +02:00
Leonidas Spyropoulos
0b54488563
fix(poetry): remove mysql-connector dependency
Reverting a8287921

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-05-12 23:26:57 +01:00
Leonidas Spyropoulos
02d114d575
fix: hide email when account's email hidden is set
Fixes: #362
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-05-12 22:51:22 +01:00
Kevin Morris
7a525d7693
change: remove poetry-dynamic-versioning
We've not been using this as it is, and it's now warning us
about strtobool deprecation changes. Removing it for now.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-03-31 20:59:16 -07:00
Kevin Morris
a553d5d95a
fix: replace distutils.util.strtobool with our own
Reference from
github.com/PostHog/posthog/pull/4631/commits/341c28da0f6d33d6fb12fe443766a2d822ff0097

This fixes a deprecation warning regarding distutil's strtobool.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-03-31 20:59:05 -07:00
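A drop-in replacement of the kind referenced can be sketched from distutils' documented behavior (truthy: y/yes/t/true/on/1; falsy: n/no/f/false/off/0). This is an illustration, not necessarily aurweb's exact implementation, and it returns a bool rather than distutils' 1/0:

```python
def strtobool(val: str) -> bool:
    """Interpret a string as a boolean, mirroring distutils.util.strtobool."""
    val = val.strip().lower()
    if val in ("y", "yes", "t", "true", "on", "1"):
        return True
    if val in ("n", "no", "f", "false", "off", "0"):
        return False
    raise ValueError(f"invalid truth value {val!r}")
```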
Kevin Morris
cf4295a13e
upgrade: bump to v6.0.27
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-03-31 17:45:39 -07:00
Kevin Morris
ed41a4fe19
feat: add paging to package depends & required by
This patch does not include a JavaScript implementation, but
provides a pure HTML/HTTP method of paging through these lists.

Also fixes erroneous limiting. We now use a hardcoded limit of 20
by default.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-03-30 17:07:40 -07:00
Kevin Morris
d8564e446b
upgrade: bump to v6.0.26
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-03-30 12:30:21 -07:00
Kevin Morris
afd25c248f
fix: remove HEAD and OPTIONS handling from metrics
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-03-14 06:26:37 -07:00
Kevin Morris
790ca4194a
fix: coherenace -> coherence
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-03-14 05:57:06 -07:00
Kevin Morris
7ddce6bb2d
doc: update CONTRIBUTING.md
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-03-14 05:55:19 -07:00
Kevin Morris
c149afb1f1
Merge remote-tracking branch 'fosskers/colin/prework-reformatting' 2022-03-14 05:14:59 -07:00
Kevin Morris
d7cb04b93d
upgrade: bump to v6.0.25
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-03-08 20:35:21 -08:00
Kevin Morris
49c5a3facf
feat: display stats about total & active TUs on proposals
This patch brings in two new features:
- when viewing proposal listings, there is a new Statistics section,
  containing the total and active number of Trusted Users found in the
  database.
- when viewing a proposal directly, the number of active trusted users
  assigned when the proposal was added is now displayed in the details
  section.

Closes #323

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-03-08 20:28:09 -08:00
Kevin Morris
0afa07ed3b
upgrade: bump to v6.0.24
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-03-08 19:16:02 -08:00
Kevin Morris
a1a88ea872
fix(rpc): suggestions should only suggest based on <keyword>%
Previously, Python code was looking for suggestions based on
`%<keyword>%`. This was inconsistent with PHP's suggestion
implementation and caused more records to be bundled with a suggestion,
along with supplying misleading suggestions.

Closes #343

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-03-08 19:00:19 -08:00
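The difference is the LIKE pattern: `<keyword>%` anchors the match at the start of the name, while `%<keyword>%` matches anywhere in it. Expressed in plain Python over a hypothetical package list:

```python
names = ["firefox", "firefox-beta", "profile-sync-daemon", "libfirefox-sync"]


def suggest_prefix(keyword: str) -> list[str]:
    # Equivalent of: WHERE Name LIKE 'keyword%' (the fixed behavior)
    return [n for n in names if n.startswith(keyword)]


def suggest_substring(keyword: str) -> list[str]:
    # Equivalent of: WHERE Name LIKE '%keyword%' (the old, broader behavior)
    return [n for n in names if keyword in n]


prefix_hits = suggest_prefix("firefox")
substring_hits = suggest_substring("firefox")
```

The substring form pulls in entries like `libfirefox-sync` that make poor suggestions for a user typing "firefox".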
Kevin Morris
9791704632
Merge branch 'fix-none-path' 2022-03-08 18:34:38 -08:00
Kevin Morris
2a393f95fa
upgrade: bump to v6.0.23
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-03-08 17:59:00 -08:00
Kevin Morris
e00cf5f124
test: use smtplib.SMTP[_SSL] timeout = notifications.smtp-timeout
A new option has been added for configuration of SMTP timeout:
- notifications.smtp-timeout

During tests, we can change this timeout to be small, so we aren't
depending on hardware-based RNG to pass the timeout.

Without a timeout, users can run into a long-running test for no
particular reason.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-03-08 17:53:31 -08:00
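smtplib's `SMTP` and `SMTP_SSL` constructors accept a `timeout` keyword (in seconds) that applies to the underlying socket. With no host given, no connection is attempted, so the setting can be shown in isolation; the `notifications.smtp-timeout` option name is aurweb's own configuration key:

```python
import smtplib

smtp_timeout = 5  # would be read from the notifications.smtp-timeout option

# Passing no host defers the connection; a later connect() uses this timeout,
# so a short value keeps tests from hanging on unreachable servers.
client = smtplib.SMTP(timeout=smtp_timeout)
```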
Kevin Morris
13217be939
fix: don't check suspension for ownership changes
People can change comaintainer ownership to suspended users if they
want to.

Suspended users cannot login, so there is no breach of security
here. It does make sense to allow ownership to be changed, imo.

Closes #339

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-03-08 17:51:25 -08:00
Colin Woodbury
3aa8d523f5
change(rpc): search module reformatting 2022-02-21 16:56:10 -08:00
Leonidas Spyropoulos
6e837e0c02
fix: always provide a path
891efcd142
2022-02-21 10:25:01 +00:00
468 changed files with 22642 additions and 25250 deletions


@@ -3,7 +3,7 @@ disable_warnings = already-imported
[report]
include = aurweb/*
fail_under = 85
fail_under = 95
exclude_lines =
if __name__ == .__main__.:
pragma: no cover


@@ -1,6 +1,23 @@
*/*.mo
# Config files
conf/config
conf/config.sqlite
conf/config.sqlite.defaults
conf/docker
conf/docker.defaults
# Compiled translation files
**/*.mo
# Typical virtualenv directories
env/
venv/
.venv/
# Test output
htmlcov/
test-emails/
test/__pycache__
test/test-results
test/trash_directory*
.coverage
.pytest_cache


@@ -1,5 +1,5 @@
# EditorConfig configuration for aurweb
# https://EditorConfig.org
# https://editorconfig.org
# Top-most EditorConfig file
root = true
@@ -8,6 +8,3 @@ root = true
end_of_line = lf
insert_final_newline = true
charset = utf-8
[*.{php,t}]
indent_style = tab

1
.env

@@ -1,7 +1,6 @@
FASTAPI_BACKEND="uvicorn"
FASTAPI_WORKERS=2
MARIADB_SOCKET_DIR="/var/run/mysqld/"
AURWEB_PHP_PREFIX=https://localhost:8443
AURWEB_FASTAPI_PREFIX=https://localhost:8444
AURWEB_SSHD_PREFIX=ssh://aur@localhost:2222
GIT_DATA_DIR="./aur.git/"

2
.git-blame-ignore-revs Normal file

@@ -0,0 +1,2 @@
# style: Run pre-commit
9c6c13b78a30cb9d800043410799e29631f803d2

22
.gitignore vendored

@@ -1,3 +1,4 @@
/data/
__pycache__/
*.py[cod]
.vim/
@@ -23,7 +24,6 @@ conf/docker
conf/docker.defaults
data.sql
dummy-data.sql*
env/
fastapi_aw/
htmlcov/
po/*.mo
@@ -31,7 +31,7 @@ po/*.po~
po/POTFILES
schema/aur-schema-sqlite.sql
test/test-results/
test/trash directory*
test/trash_directory*
web/locale/*/
web/html/*.gz
@@ -43,3 +43,21 @@ doc/rpc.html
# Ignore .python-version file from Pyenv
.python-version
# Ignore coverage report
coverage.xml
# Ignore pytest report
report.xml
# Ignore test emails
test-emails/
# Ignore typical virtualenv directories
env/
venv/
.venv/
# Ignore some terraform files
/ci/tf/.terraform
/ci/tf/terraform.tfstate*


@@ -4,6 +4,8 @@ cache:
paths:
# For some reason Gitlab CI only supports storing cache/artifacts in a path relative to the build directory
- .pkg-cache
- .venv
- .pre-commit
variables:
AUR_CONFIG: conf/config # Default MySQL config setup in before_script.
@@ -11,28 +13,27 @@ variables:
TEST_RECURSION_LIMIT: 10000
CURRENT_DIR: "$(pwd)"
LOG_CONFIG: logging.test.conf
DEV_FQDN: aurweb-$CI_COMMIT_REF_SLUG.sandbox.archlinux.page
INFRASTRUCTURE_REPO: https://gitlab.archlinux.org/archlinux/infrastructure.git
lint:
variables:
# Space-separated list of directories that should be linted.
REQUIRES_LINT: "aurweb test migrations"
stage: .pre
before_script:
- pacman -Sy --noconfirm --noprogressbar --cachedir .pkg-cache
- pacman -Sy --noconfirm --noprogressbar
archlinux-keyring
- pacman -Syu --noconfirm --noprogressbar --cachedir .pkg-cache
python python-isort flake8
- pacman -Syu --noconfirm --noprogressbar
git python python-pre-commit
script:
- bash -c 'flake8 --count $(echo "$REQUIRES_LINT" | xargs); exit $?'
- bash -c 'isort --check-only $(echo "$REQUIRES_LINT" | xargs); exit $?'
- export XDG_CACHE_HOME=.pre-commit
- pre-commit run -a
test:
stage: test
tags:
- fast-single-thread
before_script:
- export PATH="$HOME/.poetry/bin:${PATH}"
- ./docker/scripts/install-deps.sh
- virtualenv -p python3 .venv
- source .venv/bin/activate # Enable our virtualenv cache
- ./docker/scripts/install-python-deps.sh
- useradd -U -d /aurweb -c 'AUR User' aur
- ./docker/mariadb-entrypoint.sh
@@ -48,42 +49,113 @@ test:
# Run sharness.
- make -C test sh
# Run pytest.
- pytest
- pytest --junitxml="pytest-report.xml"
- make -C test coverage # Produce coverage reports.
coverage: '/TOTAL.*\s+(\d+\%)/'
coverage: '/(?i)total.*? (100(?:\.0+)?\%|[1-9]?\d(?:\.\d+)?\%)$/'
artifacts:
reports:
cobertura: coverage.xml
junit: pytest-report.xml
coverage_report:
coverage_format: cobertura
path: coverage.xml
deploy:
.init_tf: &init_tf
- pacman -Syu --needed --noconfirm terraform
- export TF_VAR_name="aurweb-${CI_COMMIT_REF_SLUG}"
- TF_ADDRESS="${CI_API_V4_URL}/projects/${TF_STATE_PROJECT}/terraform/state/${CI_COMMIT_REF_SLUG}"
- cd ci/tf
- >
terraform init \
-backend-config="address=${TF_ADDRESS}" \
-backend-config="lock_address=${TF_ADDRESS}/lock" \
-backend-config="unlock_address=${TF_ADDRESS}/lock" \
-backend-config="username=x-access-token" \
-backend-config="password=${TF_STATE_GITLAB_ACCESS_TOKEN}" \
-backend-config="lock_method=POST" \
-backend-config="unlock_method=DELETE" \
-backend-config="retry_wait_min=5"
deploy_review:
stage: deploy
tags:
- secure
rules:
- if: $CI_COMMIT_BRANCH == "pu"
when: manual
variables:
FASTAPI_BACKEND: gunicorn
FASTAPI_WORKERS: 5
AURWEB_PHP_PREFIX: https://aur-dev.archlinux.org
AURWEB_FASTAPI_PREFIX: https://aur-dev.archlinux.org
AURWEB_SSHD_PREFIX: ssh://aur@aur-dev.archlinux.org:2222
COMMIT_HASH: $CI_COMMIT_SHA
GIT_DATA_DIR: git_data
script:
- pacman -Syu --noconfirm docker docker-compose socat openssh
- chmod 600 ${SSH_KEY}
- socat "UNIX-LISTEN:/tmp/docker.sock,reuseaddr,fork" EXEC:"ssh -o UserKnownHostsFile=${SSH_KNOWN_HOSTS} -Ti ${SSH_KEY} ${SSH_USER}@${SSH_HOST}" &
- export DOCKER_HOST="unix:///tmp/docker.sock"
# Set secure login config for aurweb.
- sed -ri "s/^(disable_http_login).*$/\1 = 1/" conf/config.dev
- docker-compose build
- docker-compose -f docker-compose.yml -f docker-compose.aur-dev.yml down --remove-orphans
- docker-compose -f docker-compose.yml -f docker-compose.aur-dev.yml up -d
- docker image prune -f
- docker container prune -f
- docker volume prune -f
- *init_tf
- terraform apply -auto-approve
environment:
name: development
url: https://aur-dev.archlinux.org
name: review/$CI_COMMIT_REF_NAME
url: https://$DEV_FQDN
on_stop: stop_review
auto_stop_in: 1 week
rules:
- if: $CI_COMMIT_REF_NAME =~ /^renovate\//
when: never
- if: $CI_MERGE_REQUEST_ID && $CI_PROJECT_PATH == "archlinux/aurweb"
when: manual
provision_review:
stage: deploy
needs:
- deploy_review
script:
- *init_tf
- pacman -Syu --noconfirm --needed ansible git openssh jq
# Get ssh key from terraform state file
- mkdir -p ~/.ssh
- chmod 700 ~/.ssh
- terraform show -json |
jq -r '.values.root_module.resources[] |
select(.address == "tls_private_key.this") |
.values.private_key_openssh' > ~/.ssh/id_ed25519
- chmod 400 ~/.ssh/id_ed25519
# Clone infra repo
- git clone $INFRASTRUCTURE_REPO
- cd infrastructure
# Remove vault files
- rm $(git grep -l 'ANSIBLE_VAULT;1.1;AES256$')
# Remove vault config
- sed -i '/^vault/d' ansible.cfg
# Add host config
- mkdir -p host_vars/$DEV_FQDN
- 'echo "filesystem: btrfs" > host_vars/$DEV_FQDN/misc'
# Add host
- echo "$DEV_FQDN" > hosts
# Add our pubkey and hostkeys
- ssh-keyscan $DEV_FQDN >> ~/.ssh/known_hosts
- ssh-keygen -f ~/.ssh/id_ed25519 -y > pubkeys/aurweb-dev.pub
# Run our ansible playbook
- >
ansible-playbook playbooks/aur-dev.archlinux.org.yml \
-e "aurdev_fqdn=$DEV_FQDN" \
-e "aurweb_repository=$CI_REPOSITORY_URL" \
-e "aurweb_version=$CI_COMMIT_SHA" \
-e "{\"vault_mariadb_users\":{\"root\":\"aur\"}}" \
-e "vault_aurweb_db_password=aur" \
-e "vault_aurweb_gitlab_instance=https://does.not.exist" \
-e "vault_aurweb_error_project=set-me" \
-e "vault_aurweb_error_token=set-me" \
-e "vault_aurweb_secret=aur" \
-e "vault_goaurrpc_metrics_token=aur" \
-e '{"root_additional_keys": ["moson.pub", "aurweb-dev.pub"]}'
environment:
name: review/$CI_COMMIT_REF_NAME
action: access
rules:
- if: $CI_COMMIT_REF_NAME =~ /^renovate\//
when: never
- if: $CI_MERGE_REQUEST_ID && $CI_PROJECT_PATH == "archlinux/aurweb"
stop_review:
stage: deploy
needs:
- deploy_review
script:
- *init_tf
- terraform destroy -auto-approve
- 'curl --silent --show-error --fail --header "Private-Token: ${TF_STATE_GITLAB_ACCESS_TOKEN}" --request DELETE "${CI_API_V4_URL}/projects/${TF_STATE_PROJECT}/terraform/state/${CI_COMMIT_REF_SLUG}"'
environment:
name: review/$CI_COMMIT_REF_NAME
action: stop
rules:
- if: $CI_COMMIT_REF_NAME =~ /^renovate\//
when: never
- if: $CI_MERGE_REQUEST_ID && $CI_PROJECT_PATH == "archlinux/aurweb"
when: manual


@ -1,14 +0,0 @@
## Checklist
- [ ] I have set a Username in the Details section
- [ ] I have set an Email in the Details section
- [ ] I have set a valid Account Type in the Details section
## Details
- Instance: aur-dev.archlinux.org
- Username: the_username_you_want
- Email: valid@email.org
- Account Type: (User|Trusted User)
/label account-request


@ -1,12 +1,24 @@
<!--
This template is used to report potential bugs with the AURweb website.
NOTE: All comment sections with a MODIFY note need to be edited. All checkboxes
in the "Checklist" section need to be checked by the owner of the issue.
-->
/label ~bug ~unconfirmed
/title [BUG] <!-- MODIFY: add subject -->
<!--
Please do not remove the above quick actions, which automatically label the
issue and assign relevant users.
-->
### Checklist
This bug template is meant to provide bug issues for code existing in
the aurweb repository. This bug template is **not meant** to handle
bugs with user-uploaded packages.
**NOTE:** This bug template is meant to provide bug issues for code existing in
the aurweb repository.
To work out a bug you have found in a user-uploaded package, contact
the package's maintainer first. If you receive no response, file the
relevant package request against it so TUs can deal with cleanup.
**This bug template is not meant to handle bugs with user-uploaded packages.**
To report issues you might have found in a user-uploaded package, contact
the package's maintainer in comments.
- [ ] I confirm that this is an issue with aurweb's code and not a
user-uploaded package.
@ -29,7 +41,7 @@ this bug.
### Logs
If you have any logs relevent to the bug, include them here in
If you have any logs relevant to the bug, include them here in
quoted or code blocks.
### Version(s)


@ -1,3 +1,25 @@
<!--
This template is used to request a feature for the AURweb website.
NOTE: All comment sections with a MODIFY note need to be edited. All checkboxes
in the "Checklist" section need to be checked by the owner of the issue.
-->
/label ~feature ~unconfirmed
/title [FEATURE] <!-- MODIFY: add subject -->
<!--
Please do not remove the above quick actions, which automatically label the
issue and assign relevant users.
-->
### Checklist
**NOTE:** This bug template is meant to provide bug issues for code existing in
the aurweb repository.
**This bug template is not meant to handle bugs with user-uploaded packages.**
To report issues you might have found in a user-uploaded package, contact
the package's maintainer in comments.
- [ ] I have summed up the feature in concise words in the [Summary](#summary) section.
- [ ] I have completely described the feature in the [Description](#description) section.
- [ ] I have completed the [Blockers](#blockers) section.
@ -28,5 +50,3 @@ Example:
- [Feature] Do not allow users to be Tyrants
- \<(issue|merge_request)_link\>
/label feature unconsidered


@ -1,58 +0,0 @@
**NOTE:** This issue template is only applicable to FastAPI implementations
in the code-base, which only exists within the `pu` branch. If you wish to
file an issue for the current PHP implementation of aurweb, please file a
standard issue prefixed with `[Bug]` or `[Feature]`.
**Checklist**
- [ ] I have prefixed the issue title with `[Feedback]` along with a message
pointing to the route or feature tested.
- Example: `[Feedback] /packages/{name}`
- [ ] I have completed the [Changes](#changes) section.
- [ ] I have completed the [Bugs](#bugs) section.
- [ ] I have completed the [Improvements](#improvements) section.
- [ ] I have completed the [Summary](#summary) section.
### Changes
Please describe changes in user experience when compared to the PHP
implementation. This section can actually hold a lot of info if you
are up for it -- changes in routes, HTML rendering, back-end behavior,
etc.
If you cannot see any changes from your standpoint, include a short
statement about that fact.
### Bugs
Please describe any bugs you've experienced while testing the route
pertaining to this issue. A "perfect" bug report would include your
specific experience, what you expected to occur, and what happened
otherwise. If you can, please include output of `docker-compose logs fastapi`
with your report; especially if any unintended exceptions occurred.
### Improvements
If you've experienced improvements in the route when compared to PHP,
please do include those here. We'd like to know if users are noticing
these improvements and how they feel about them.
There are multiple routes with no improvements. For these, just include
a short sentence about the fact that you've experienced none.
### Summary
First: If you've gotten here and completed the [Changes](#changes),
[Bugs](#bugs), and [Improvements](#improvements) sections, we'd like
to thank you very much for your contribution and willingness to test.
We are not a company, and we are not a large team; any bit of assistance
here helps the project astronomically and moves us closer toward a
new release.
That being said: please include an overall summary of your experience
and how you felt about the current implementation which you're testing
in comparison with PHP (current aur.archlinux.org, or https://localhost:8443
through docker).
/label feedback


@ -1,24 +1,36 @@
hooks:
- &base
language: python
types: [python]
require_serial: true
exclude: ^migrations/versions
- &flake8
id: flake8
name: flake8
entry: flake8
<<: *base
- &isort
id: isort
name: isort
entry: isort
<<: *base
repos:
- repo: local
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.5.0
hooks:
- <<: *flake8
- <<: *isort
args: ['--check-only', '--diff']
- id: check-added-large-files
- id: check-case-conflict
- id: check-merge-conflict
- id: check-toml
- id: end-of-file-fixer
- id: trailing-whitespace
exclude: ^po/
- id: debug-statements
- repo: https://github.com/myint/autoflake
rev: v2.3.1
hooks:
- id: autoflake
args:
- --in-place
- --remove-all-unused-imports
- --ignore-init-module-imports
- repo: https://github.com/pycqa/isort
rev: 5.13.2
hooks:
- id: isort
- repo: https://github.com/psf/black
rev: 24.4.1
hooks:
- id: black
- repo: https://github.com/PyCQA/flake8
rev: 7.0.0
hooks:
- id: flake8


@ -1,7 +1,7 @@
[main]
host = https://www.transifex.com
host = https://app.transifex.com
[aurweb.aurwebpot]
[o:lfleischer:p:aurweb:r:aurwebpot]
file_filter = po/<lang>.po
source_file = po/aurweb.pot
source_lang = en


@ -8,8 +8,8 @@ Before sending patches, you are recommended to run `flake8` and `isort`.
You can add a git hook to do this by installing `python-pre-commit` and running
`pre-commit install`.
[1]: https://lists.archlinux.org/listinfo/aur-dev
[2]: https://gitlab.archlinunx.org/archlinux/aurweb
[1]: https://lists.archlinux.org/mailman3/lists/aur-dev.lists.archlinux.org/
[2]: https://gitlab.archlinux.org/archlinux/aurweb
### Coding Guidelines
@ -23,6 +23,83 @@ development.
3. Use four space indentation
4. Use [conventional commits](https://www.conventionalcommits.org/en/v1.0.0/)
5. DRY: Don't Repeat Yourself
6. All code should be tested for good _and_ bad cases
6. All code should be tested for good _and_ bad cases (see [test/README.md][3])
[3]: https://gitlab.archlinux.org/archlinux/aurweb/-/blob/master/test/README.md
Test patches that increase coverage in the codebase are always welcome.
### Coding Style
We use `autoflake`, `isort`, `black` and `flake8` to enforce coding style in a
PEP-8 compliant way. These tools run in GitLab CI using `pre-commit` to verify
that any pushed code changes comply with this.
To enable the `pre-commit` git hook, install the `pre-commit` package either
with `pacman` or `pip` and then run `pre-commit install --install-hooks`. This
will ensure formatting is done before any code is committed to the git
repository.
There are plugins for editors or IDEs which automate this process. Some
example plugins:
- [tenfyzhong/autoflake.vim](https://github.com/tenfyzhong/autoflake.vim)
- [fisadev/vim-isort](https://github.com/fisadev/vim-isort)
- [psf/black](https://github.com/psf/black)
- [nvie/vim-flake8](https://github.com/nvie/vim-flake8)
- [prabirshrestha/vim-lsp](https://github.com/prabirshrestha/vim-lsp)
- [dense-analysis/ale](https://github.com/dense-analysis/ale)
See `setup.cfg`, `pyproject.toml` and `.pre-commit-config.yaml` for tool
specific configurations.
### Development Environment
To get started with local development, an instance of aurweb must be
brought up. This can be done using the following sections:
- [Using Docker](#using-docker)
- [Using INSTALL](#using-install)
There are a number of services aurweb employs to run the application
in its entirety:
- ssh
- cron jobs
- starlette/fastapi asgi server
Project structure:
- `./aurweb`: `aurweb` Python package
- `./templates`: Jinja2 templates
- `./docker`: Docker scripts and configuration files
#### Using Docker
Using Docker, we can run the entire infrastructure in two steps:
# Build the aurweb:latest image
$ docker-compose build
# Start all services in the background
$ docker-compose up -d nginx
`docker-compose` services will generate a locally signed root certificate
at `./data/root_ca.crt`. Users can import this into ca-certificates or their
browser if desired.
Accessible services (on the host):
- https://localhost:8444 (python via nginx)
- localhost:13306 (mariadb)
- localhost:16379 (redis)
Docker services, by default, are set up to hot-reload when source code
is changed.
For detailed setup instructions have a look at [TESTING](TESTING)
#### Using INSTALL
The [INSTALL](INSTALL) file describes steps to install the application on
bare-metal systems.


@ -2,10 +2,12 @@ FROM archlinux:base-devel
VOLUME /root/.cache/pypoetry/cache
VOLUME /root/.cache/pypoetry/artifacts
VOLUME /root/.cache/pre-commit
ENV PATH="/root/.poetry/bin:${PATH}"
ENV PYTHONPATH=/aurweb
ENV AUR_CONFIG=conf/config
ENV COMPOSE=1
# Install system-wide dependencies.
COPY ./docker/scripts/install-deps.sh /install-deps.sh
@ -27,7 +29,7 @@ RUN cp -vf conf/config.dev conf/config
RUN sed -i "s;YOUR_AUR_ROOT;/aurweb;g" conf/config
# Install Python dependencies.
RUN /docker/scripts/install-python-deps.sh
RUN /docker/scripts/install-python-deps.sh compose
# Compile asciidocs.
RUN make -C doc
@ -40,3 +42,6 @@ RUN ln -sf /usr/share/zoneinfo/UTC /etc/localtime
# Install translations.
RUN make -C po all install
# Install pre-commit repositories and run lint check.
RUN pre-commit run -a

INSTALL

@ -14,8 +14,7 @@ read the instructions below.
$ cd aurweb
$ poetry install
2) Setup a web server with PHP and MySQL. Configure the web server to redirect
all URLs to /index.php/foo/bar/. The following block can be used with nginx:
2) Setup a web server with MySQL. The following block can be used with nginx:
server {
# https is preferred and can be done easily with LetsEncrypt
@ -31,14 +30,6 @@ read the instructions below.
ssl_certificate /etc/ssl/certs/aur.cert.pem;
ssl_certificate_key /etc/ssl/private/aur.key.pem;
# Asset root. This is used to match against gzip archives.
root /srv/http/aurweb/web/html;
# TU Bylaws redirect.
location = /trusted-user/TUbylaws.html {
return 301 https://tu-bylaws.aur.archlinux.org;
}
# smartgit location.
location ~ "^/([a-z0-9][a-z0-9.+_-]*?)(\.git)?/(git-(receive|upload)-pack|HEAD|info/refs|objects/(info/(http-)?alternates|packs)|[0-9a-f]{2}/[0-9a-f]{38}|pack/pack-[0-9a-f]{40}\.(pack|idx))$" {
include uwsgi_params;
@ -63,6 +54,9 @@ read the instructions below.
# Static archive assets.
location ~ \.gz$ {
# Asset root. This is used to match against gzip archives.
root /srv/http/aurweb/archives;
types { application/gzip text/plain }
default_type text/plain;
add_header Content-Encoding gzip;
@ -126,7 +120,7 @@ interval:
*/2 * * * * bash -c 'poetry run aurweb-pkgmaint'
*/2 * * * * bash -c 'poetry run aurweb-usermaint'
*/2 * * * * bash -c 'poetry run aurweb-popupdate'
*/12 * * * * bash -c 'poetry run aurweb-tuvotereminder'
*/12 * * * * bash -c 'poetry run aurweb-votereminder'
7) Create a new database and a user and import the aurweb SQL schema:


@ -11,8 +11,8 @@ The aurweb project includes
* A web interface to search for packaging scripts and display package details.
* An SSH/Git interface to submit and update packages and package meta data.
* Community features such as comments, votes, package flagging and requests.
* Editing/deletion of packages and accounts by Trusted Users and Developers.
* Area for Trusted Users to post AUR-related proposals and vote on them.
* Editing/deletion of packages and accounts by Package Maintainers and Developers.
* Area for Package Maintainers to post AUR-related proposals and vote on them.
Directory Layout
----------------
@ -26,7 +26,6 @@ Directory Layout
* `schema`: schema for the SQL database
* `test`: test suite and test cases
* `upgrading`: instructions for upgrading setups from one release to another
* `web`: PHP-based web interface for the AUR
Documentation
-------------
@ -44,7 +43,7 @@ Links
-----
* The repository is hosted at https://gitlab.archlinux.org/archlinux/aurweb
-- see doc/CodingGuidelines for information on the patch submission process.
-- see [CONTRIBUTING.md](./CONTRIBUTING.md) for information on the patch submission process.
* Bugs can (and should) be submitted to the aurweb bug tracker:
https://gitlab.archlinux.org/archlinux/aurweb/-/issues/new?issuable_template=Bug
@ -57,7 +56,7 @@ Translations
------------
Translations are welcome via our Transifex project at
https://www.transifex.com/lfleischer/aurweb; see `doc/i18n.txt` for details.
https://www.transifex.com/lfleischer/aurweb; see [doc/i18n.md](./doc/i18n.md) for details.
![Transifex](https://www.transifex.com/projects/p/aurweb/chart/image_png)

TESTING

@ -1,50 +1,130 @@
Setup Testing Environment
=========================
The quickest way to start hacking on aurweb is to use Docker.
If you prefer to run it bare-metal, see the instructions further below.
Containerized environment
-------------------------
1) Clone the aurweb project:
$ git clone https://gitlab.archlinux.org/archlinux/aurweb.git
$ cd aurweb
2) Install the necessary packages:
# pacman -S --needed docker docker-compose
3) Build the aurweb:latest image:
# systemctl start docker
# docker compose build
4) Run local Docker development instance:
# docker compose up -d
5) Browse to local aurweb development server.
https://localhost:8444/
6) [Optionally] populate the database with dummy data:
# docker compose exec mariadb /bin/bash
# pacman -S --noconfirm words fortune-mod
# poetry run schema/gendummydata.py dummy_data.sql
# mariadb -uaur -paur aurweb < dummy_data.sql
# exit
Inspect `dummy_data.sql` for test credentials.
Passwords match usernames.
We now have a fully set up environment, which we can start and stop with:
# docker compose start
# docker compose stop
Proceed with topic "Setup for running tests"
Bare Metal installation
-----------------------
Note that this setup is only to test the web interface. If you need to have a
full aurweb instance with cgit, ssh interface, etc, follow the directions in
INSTALL.
docker-compose
--------------
1) Clone the aurweb project:
$ git clone https://gitlab.archlinux.org/archlinux/aurweb.git
2) Install the necessary packages:
# pacman -S docker-compose
2) Build the aurweb:latest image:
$ cd /path/to/aurweb/
$ docker-compose build
3) Run local Docker development instance:
$ cd /path/to/aurweb/
$ docker-compose up -d nginx
4) Browse to local aurweb development server.
Python: https://localhost:8444/
PHP: https://localhost:8443/
Bare Metal
----------
1) Clone the aurweb project:
$ git clone git://git.archlinux.org/aurweb.git
$ cd aurweb
2) Install the necessary packages:
# pacman -S python-poetry
# pacman -S --needed python-poetry mariadb words fortune-mod nginx
4) Install the package/dependencies via `poetry`:
3) Install the package/dependencies via `poetry`:
$ poetry install
4) Copy conf/config.dev to conf/config and replace YOUR_AUR_ROOT by the absolute
path to the root of your aurweb clone. sed can do both tasks for you:
$ sed -e "s;YOUR_AUR_ROOT;$PWD;g" conf/config.dev > conf/config
Note that when the upstream config.dev is updated, you should compare it to
your conf/config, or regenerate your configuration with the command above.
5) Set up mariadb:
# mariadb-install-db --user=mysql --basedir=/usr --datadir=/var/lib/mysql
# systemctl start mariadb
# mariadb -u root
> CREATE USER 'aur'@'localhost' IDENTIFIED BY 'aur';
> GRANT ALL ON *.* TO 'aur'@'localhost' WITH GRANT OPTION;
> CREATE DATABASE aurweb;
> exit
6) Prepare a database and insert dummy data:
$ AUR_CONFIG=conf/config poetry run python -m aurweb.initdb
$ poetry run schema/gendummydata.py dummy_data.sql
$ mariadb -uaur -paur aurweb < dummy_data.sql
7) Run the test server:
## set AUR_CONFIG to our locally created config
$ export AUR_CONFIG=conf/config
## with aurweb.spawn
$ poetry run python -m aurweb.spawn
## with systemd service
$ sudo install -m644 examples/aurweb.service /etc/systemd/system/
# systemctl enable --now aurweb.service
Setup for running tests
-----------------------
If you've set up a docker environment, you can run the full test-suite with:
# docker compose run test
You can collect code-coverage data with:
$ ./util/fix-coverage data/.coverage
See information further below on how to visualize the data.
For running individual tests, we need to perform a couple of additional steps.
In case you did the bare-metal install, steps 2, 3, 4 and 5 should be skipped.
1) Install the necessary packages:
# pacman -S --needed python-poetry mariadb-libs asciidoc openssh
2) Install the package/dependencies via `poetry`:
$ cd /path/to/aurweb/
$ poetry install
3) Copy conf/config.dev to conf/config and replace YOUR_AUR_ROOT by the absolute
@ -55,23 +135,51 @@ Bare Metal
Note that when the upstream config.dev is updated, you should compare it to
your conf/config, or regenerate your configuration with the command above.
4) Prepare a database:
4) Edit the config file conf/config and change the mysql/mariadb portion
$ cd /path/to/aurweb/
We can make use of our mariadb docker container instead of having to install
mariadb. Change the config as follows:
$ AUR_CONFIG=conf/config poetry run python -m aurweb.initdb
---------------------------------------------------------------------
; MySQL database information. User defaults to root for containerized
; testing with mysqldb. This should be set to a non-root user.
user = root
password = aur
host = 127.0.0.1
port = 13306
;socket = /var/run/mysqld/mysqld.sock
---------------------------------------------------------------------
$ poetry run schema/gendummydata.py dummy_data.sql
$ mysql -uaur -paur aurweb < dummy_data.sql
5) Start our mariadb docker container
5) Run the test server:
# docker compose start mariadb
## set AUR_CONFIG to our locally created config
$ export AUR_CONFIG=conf/config
6) Set environment variables
## with aurweb.spawn
$ poetry run python -m aurweb.spawn
$ export AUR_CONFIG=conf/config
$ export LOG_CONFIG=logging.test.conf
## with systemd service
$ sudo install -m644 examples/aurweb.service /etc/systemd/system/
$ systemctl enable --now aurweb.service
7) Compile translation & doc files
$ make -C po install
$ make -C doc
Now we can run our python test-suite or individual tests with:
$ poetry run pytest test/
$ poetry run pytest test/test_whatever.py
To run Sharness tests:
$ poetry run make -C test sh
E-mails that have been generated can be found in test-emails/.
After test runs, code-coverage reports can be created with:
## CLI report
$ coverage report
## HTML version stored at htmlcov/
$ coverage html
More information about tests can be found at test/README.md


@ -0,0 +1 @@
# aurweb.archives


@ -0,0 +1 @@
# aurweb.archives.spec


@ -0,0 +1,77 @@
from pathlib import Path
from typing import Any, Dict, Iterable, List, Set
class GitInfo:
"""Information about a Git repository."""
""" Path to Git repository. """
path: str
""" Local Git repository configuration. """
config: Dict[str, Any]
def __init__(self, path: str, config: Dict[str, Any] = dict()) -> "GitInfo":
self.path = Path(path)
self.config = config
class SpecOutput:
"""Class used for git_archive.py output details."""
""" Filename relative to the Git repository root. """
filename: Path
""" Git repository information. """
git_info: GitInfo
""" Bytes bound for `SpecOutput.filename`. """
data: bytes
def __init__(self, filename: str, git_info: GitInfo, data: bytes) -> "SpecOutput":
self.filename = filename
self.git_info = git_info
self.data = data
class SpecBase:
"""
Base for Spec classes defined in git_archive.py --spec modules.
All supported --spec modules must contain the following classes:
- Spec(SpecBase)
"""
""" A list of SpecOutputs, each of which contain output file data. """
outputs: List[SpecOutput] = list()
""" A set of repositories to commit changes to. """
repos: Set[str] = set()
def generate(self) -> Iterable[SpecOutput]:
"""
"Pure virtual" output generator.
`SpecBase.outputs` and `SpecBase.repos` should be populated within an
overridden version of this function in SpecBase derivatives.
"""
raise NotImplementedError()
def add_output(self, filename: str, git_info: GitInfo, data: bytes) -> None:
"""
Add a SpecOutput instance to the set of outputs.
:param filename: Filename relative to the git repository root
:param git_info: GitInfo instance
:param data: Binary data bound for `filename`
"""
if git_info.path not in self.repos:
self.repos.add(git_info.path)
self.outputs.append(
SpecOutput(
filename,
git_info,
data,
)
)
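A minimal sketch of how a `--spec` module plugs into this base. `HelloSpec` is a hypothetical example and the classes are self-contained stand-ins mirroring the ones above; note this sketch keeps `outputs`/`repos` as instance state in `__init__`, whereas the file above declares them at class level:

```python
from pathlib import Path
from typing import Iterable, List, Set


class GitInfo:
    """Stand-in mirroring the GitInfo class above."""

    def __init__(self, path: str):
        self.path = Path(path)


class SpecOutput:
    """Stand-in mirroring SpecOutput: one output file bound for a repo."""

    def __init__(self, filename: str, git_info: GitInfo, data: bytes):
        self.filename = filename
        self.git_info = git_info
        self.data = data


class SpecBase:
    def __init__(self):
        # Instance-level state avoids sharing between Spec instances.
        self.outputs: List[SpecOutput] = []
        self.repos: Set[Path] = set()

    def generate(self) -> Iterable[SpecOutput]:
        # "Pure virtual": derivatives populate outputs/repos here.
        raise NotImplementedError()

    def add_output(self, filename: str, git_info: GitInfo, data: bytes) -> None:
        if git_info.path not in self.repos:
            self.repos.add(git_info.path)
        self.outputs.append(SpecOutput(filename, git_info, data))


class HelloSpec(SpecBase):
    """Hypothetical Spec: each --spec module exposes a Spec(SpecBase) class."""

    def __init__(self):
        super().__init__()
        self.repo = GitInfo("/tmp/archive-repo")

    def generate(self) -> Iterable[SpecOutput]:
        self.add_output("hello.json", self.repo, b'{"msg": "hello"}')
        return self.outputs


outputs = HelloSpec().generate()
```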


@ -0,0 +1,85 @@
from typing import Iterable
import orjson
from aurweb import config, db
from aurweb.models import Package, PackageBase, User
from aurweb.rpc import RPC
from .base import GitInfo, SpecBase, SpecOutput
ORJSON_OPTS = orjson.OPT_SORT_KEYS | orjson.OPT_INDENT_2
class Spec(SpecBase):
def __init__(self) -> "Spec":
self.metadata_repo = GitInfo(
config.get("git-archive", "metadata-repo"),
)
def generate(self) -> Iterable[SpecOutput]:
# Base query used by the RPC.
base_query = (
db.query(Package)
.join(PackageBase)
.join(User, PackageBase.MaintainerUID == User.ID, isouter=True)
)
# Create an instance of RPC, use it to get entities from
# our query and perform a metadata subquery for all packages.
rpc = RPC(version=5, type="info")
print("performing package database query")
packages = rpc.entities(base_query).all()
print("performing package database subqueries")
rpc.subquery({pkg.ID for pkg in packages})
pkgbases, pkgnames = dict(), dict()
for package in packages:
# Produce RPC type=info data for `package`
data = rpc.get_info_json_data(package)
pkgbase_name = data.get("PackageBase")
pkgbase_data = {
"ID": data.pop("PackageBaseID"),
"URLPath": data.pop("URLPath"),
"FirstSubmitted": data.pop("FirstSubmitted"),
"LastModified": data.pop("LastModified"),
"OutOfDate": data.pop("OutOfDate"),
"Maintainer": data.pop("Maintainer"),
"Keywords": data.pop("Keywords"),
"NumVotes": data.pop("NumVotes"),
"Popularity": data.pop("Popularity"),
"PopularityUpdated": package.PopularityUpdated.timestamp(),
}
# Store the data in `pkgbases` dict. We do this so we only
# end up processing a single `pkgbase` if repeated after
# this loop
pkgbases[pkgbase_name] = pkgbase_data
# Remove Popularity and NumVotes from package data.
# These fields change quite often which causes git data
# modification to explode.
# data.pop("NumVotes")
# data.pop("Popularity")
# Remove the ID key from package json.
data.pop("ID")
# Add the `package`.Name to the pkgnames set
name = data.get("Name")
pkgnames[name] = data
# Add metadata outputs
self.add_output(
"pkgname.json",
self.metadata_repo,
orjson.dumps(pkgnames, option=ORJSON_OPTS),
)
self.add_output(
"pkgbase.json",
self.metadata_repo,
orjson.dumps(pkgbases, option=ORJSON_OPTS),
)
return self.outputs
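The pkgbase/pkgname split performed in the loop above can be sketched on a toy record. `split_info` is a hypothetical helper and the field list is abridged for illustration; the real code pops a few more keys:

```python
def split_info(data: dict) -> tuple[str, dict, dict]:
    """Split one RPC type=info record into pkgbase data and package data."""
    pkgbase_name = data.get("PackageBase")
    # Fields that belong to the package base get popped off the package record.
    pkgbase_data = {
        key: data.pop(key)
        for key in ("PackageBaseID", "FirstSubmitted", "LastModified",
                    "OutOfDate", "Maintainer", "NumVotes", "Popularity")
        if key in data
    }
    data.pop("ID", None)  # the internal package ID is not archived
    return pkgbase_name, pkgbase_data, data


record = {
    "ID": 42, "Name": "example-pkg", "PackageBase": "example",
    "PackageBaseID": 7, "FirstSubmitted": 1, "LastModified": 2,
    "OutOfDate": None, "Maintainer": "someone",
    "NumVotes": 3, "Popularity": 0.5,
}
base_name, base_data, pkg_data = split_info(record)
```

Repeated packages of the same pkgbase then overwrite the same `pkgbases[base_name]` entry, so each pkgbase is stored once.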


@ -0,0 +1,26 @@
from typing import Iterable
import orjson
from aurweb import config, db
from aurweb.models import PackageBase
from .base import GitInfo, SpecBase, SpecOutput
ORJSON_OPTS = orjson.OPT_SORT_KEYS | orjson.OPT_INDENT_2
class Spec(SpecBase):
def __init__(self) -> "Spec":
self.pkgbases_repo = GitInfo(config.get("git-archive", "pkgbases-repo"))
def generate(self) -> Iterable[SpecOutput]:
query = db.query(PackageBase.Name).order_by(PackageBase.Name.asc()).all()
pkgbases = [pkgbase.Name for pkgbase in query]
self.add_output(
"pkgbase.json",
self.pkgbases_repo,
orjson.dumps(pkgbases, option=ORJSON_OPTS),
)
return self.outputs


@ -0,0 +1,31 @@
from typing import Iterable
import orjson
from aurweb import config, db
from aurweb.models import Package, PackageBase
from .base import GitInfo, SpecBase, SpecOutput
ORJSON_OPTS = orjson.OPT_SORT_KEYS | orjson.OPT_INDENT_2
class Spec(SpecBase):
def __init__(self) -> "Spec":
self.pkgnames_repo = GitInfo(config.get("git-archive", "pkgnames-repo"))
def generate(self) -> Iterable[SpecOutput]:
query = (
db.query(Package.Name)
.join(PackageBase, PackageBase.ID == Package.PackageBaseID)
.order_by(Package.Name.asc())
.all()
)
pkgnames = [pkg.Name for pkg in query]
self.add_output(
"pkgname.json",
self.pkgnames_repo,
orjson.dumps(pkgnames, option=ORJSON_OPTS),
)
return self.outputs


@ -0,0 +1,26 @@
from typing import Iterable
import orjson
from aurweb import config, db
from aurweb.models import User
from .base import GitInfo, SpecBase, SpecOutput
ORJSON_OPTS = orjson.OPT_SORT_KEYS | orjson.OPT_INDENT_2
class Spec(SpecBase):
def __init__(self) -> "Spec":
self.users_repo = GitInfo(config.get("git-archive", "users-repo"))
def generate(self) -> Iterable[SpecOutput]:
query = db.query(User.Username).order_by(User.Username.asc()).all()
users = [user.Username for user in query]
self.add_output(
"users.json",
self.users_repo,
orjson.dumps(users, option=ORJSON_OPTS),
)
return self.outputs
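The `ORJSON_OPTS` used in these specs (sorted keys, 2-space indent) has a rough stdlib analogue, shown here with `json` since `orjson` may not be installed everywhere; the sample `pkgbases` dict is made up:

```python
import json

# A toy analogue of the archive payloads built above.
pkgbases = {"zlib-ng": {"ID": 2}, "aurweb": {"ID": 1}}

# orjson.dumps(pkgbases, option=OPT_SORT_KEYS | OPT_INDENT_2) roughly equals:
payload = json.dumps(pkgbases, sort_keys=True, indent=2)
```

Sorted keys keep the archived JSON deterministic, which minimizes spurious git diffs between archive runs.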


@ -6,17 +6,21 @@ import re
import sys
import traceback
import typing
from contextlib import asynccontextmanager
from urllib.parse import quote_plus
import requests
from fastapi import FastAPI, HTTPException, Request, Response
from fastapi.responses import RedirectResponse
from fastapi.staticfiles import StaticFiles
from jinja2 import TemplateNotFound
from prometheus_client import multiprocess
from sqlalchemy import and_, or_
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.instrumentation.fastapi import FastAPIInstrumentor
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from sqlalchemy import and_
from starlette.exceptions import HTTPException as StarletteHTTPException
from starlette.middleware.authentication import AuthenticationMiddleware
from starlette.middleware.sessions import SessionMiddleware
@ -24,23 +28,29 @@ from starlette.middleware.sessions import SessionMiddleware
import aurweb.captcha # noqa: F401
import aurweb.config
import aurweb.filters # noqa: F401
import aurweb.logging
import aurweb.pkgbase.util as pkgbaseutil
from aurweb import logging, prometheus, util
from aurweb import aur_logging, prometheus, util
from aurweb.aur_redis import redis_connection
from aurweb.auth import BasicAuthBackend
from aurweb.db import get_engine, query
from aurweb.models import AcceptedTerm, Term
from aurweb.packages.util import get_pkg_or_base
from aurweb.prometheus import instrumentator
from aurweb.redis import redis_connection
from aurweb.routers import APP_ROUTES
from aurweb.templates import make_context, render_template
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
session_secret = aurweb.config.get("fastapi", "session_secret")
@asynccontextmanager
async def lifespan(app: FastAPI):
await app_startup()
yield
# Setup the FastAPI app.
app = FastAPI()
app = FastAPI(lifespan=lifespan)
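The lifespan context manager above replaces FastAPI's deprecated `@app.on_event("startup")` hook; its control flow can be sketched without FastAPI itself (the `events` list and `serve` coroutine are illustrative only):

```python
import asyncio
from contextlib import asynccontextmanager

events = []


@asynccontextmanager
async def lifespan(app):
    # Runs once before the app serves requests (replaces on_event("startup")).
    events.append("startup")
    yield
    # Runs once on shutdown, after the yield.
    events.append("shutdown")


async def serve():
    # The framework drives the context manager around the app's lifetime.
    async with lifespan(object()):
        events.append("serving")


asyncio.run(serve())
```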
# Instrument routes with the prometheus-fastapi-instrumentator
# library with custom collectors and expose /metrics.
@ -49,7 +59,17 @@ instrumentator().add(prometheus.http_requests_total())
instrumentator().instrument(app)
@app.on_event("startup")
# Instrument FastAPI for tracing
FastAPIInstrumentor.instrument_app(app)
resource = Resource(attributes={"service.name": "aurweb"})
otlp_endpoint = aurweb.config.get("tracing", "otlp_endpoint")
otlp_exporter = OTLPSpanExporter(endpoint=otlp_endpoint)
span_processor = BatchSpanProcessor(otlp_exporter)
trace.set_tracer_provider(TracerProvider(resource=resource))
trace.get_tracer_provider().add_span_processor(span_processor)
async def app_startup():
# https://stackoverflow.com/questions/67054759/about-the-maximum-recursion-error-in-fastapi
# Test failures have been observed by internal starlette code when
@ -60,53 +80,39 @@ async def app_startup():
# provided by the user. Docker uses .env's TEST_RECURSION_LIMIT
# when running test suites.
# TODO: Find a proper fix to this issue.
recursion_limit = int(os.environ.get(
"TEST_RECURSION_LIMIT", sys.getrecursionlimit() + 1000))
recursion_limit = int(
os.environ.get("TEST_RECURSION_LIMIT", sys.getrecursionlimit() + 1000)
)
sys.setrecursionlimit(recursion_limit)
backend = aurweb.config.get("database", "backend")
if backend not in aurweb.db.DRIVERS:
raise ValueError(
f"The configured database backend ({backend}) is unsupported. "
f"Supported backends: {str(aurweb.db.DRIVERS.keys())}")
f"Supported backends: {str(aurweb.db.DRIVERS.keys())}"
)
session_secret = aurweb.config.get("fastapi", "session_secret")
if not session_secret:
raise Exception("[fastapi] session_secret must not be empty")
if not os.environ.get("PROMETHEUS_MULTIPROC_DIR", None):
logger.warning("$PROMETHEUS_MULTIPROC_DIR is not set, the /metrics "
"endpoint is disabled.")
logger.warning(
"$PROMETHEUS_MULTIPROC_DIR is not set, the /metrics "
"endpoint is disabled."
)
app.mount("/static/css",
StaticFiles(directory="web/html/css"),
name="static_css")
app.mount("/static/js",
StaticFiles(directory="web/html/js"),
name="static_js")
app.mount("/static/images",
StaticFiles(directory="web/html/images"),
name="static_images")
# Add application middlewares.
app.add_middleware(AuthenticationMiddleware, backend=BasicAuthBackend())
app.add_middleware(SessionMiddleware, secret_key=session_secret)
app.mount("/static", StaticFiles(directory="static"), name="static_files")
# Add application routes.
def add_router(module):
app.include_router(module.router)
util.apply_all(APP_ROUTES, add_router)
# Initialize the database engine and ORM.
get_engine()
def child_exit(server, worker): # pragma: no cover
""" This function is required for gunicorn customization
of prometheus multiprocessing. """
multiprocess.mark_process_dead(worker.pid)
async def internal_server_error(request: Request, exc: Exception) -> Response:
"""
Catch all uncaught Exceptions thrown in a route.
@ -177,9 +183,7 @@ async def internal_server_error(request: Request, exc: Exception) -> Response:
else:
# post
form_data = str(dict(request.state.form_data))
desc = desc + [
f"- Data: `{form_data}`"
] + ["", f"```{tb}```"]
desc = desc + [f"- Data: `{form_data}`"] + ["", f"```{tb}```"]
headers = {"Authorization": f"Bearer {token}"}
data = {
@ -191,11 +195,12 @@ async def internal_server_error(request: Request, exc: Exception) -> Response:
logger.info(endp)
resp = requests.post(endp, json=data, headers=headers)
if resp.status_code != http.HTTPStatus.CREATED:
logger.error(
f"Unable to report exception to {repo}: {resp.text}")
logger.error(f"Unable to report exception to {repo}: {resp.text}")
else:
logger.warning("Unable to report an exception found due to "
"unset notifications.error-{{project,token}}")
logger.warning(
"Unable to report an exception found due to "
"unset notifications.error-{{project,token}}"
)
# Log details about the exception traceback.
logger.error(f"FATAL[{tb_id}]: An unexpected exception has occurred.")
@ -203,14 +208,17 @@ async def internal_server_error(request: Request, exc: Exception) -> Response:
else:
retval = retval.decode()
return render_template(
request,
"errors/500.html",
context,
status_code=http.HTTPStatus.INTERNAL_SERVER_ERROR,
)
@app.exception_handler(StarletteHTTPException)
async def http_exception_handler(request: Request, exc: HTTPException) -> Response:
"""Handle an HTTPException thrown in a route."""
phrase = http.HTTPStatus(exc.status_code).phrase
context = make_context(request, phrase)
context["exc"] = exc
@ -220,24 +228,30 @@ async def http_exception_handler(request: Request, exc: HTTPException) \
if exc.status_code == http.HTTPStatus.NOT_FOUND:
tokens = request.url.path.split("/")
matches = re.match("^([a-z0-9][a-z0-9.+_-]*?)(\\.git)?$", tokens[1])
if matches and len(tokens) == 2:
try:
pkgbase = get_pkg_or_base(matches.group(1))
context = pkgbaseutil.make_context(request, pkgbase)
context["pkgbase"] = pkgbase
context["git_clone_uri_anon"] = aurweb.config.get(
"options", "git_clone_uri_anon"
)
context["git_clone_uri_priv"] = aurweb.config.get(
"options", "git_clone_uri_priv"
)
except HTTPException:
pass
try:
return render_template(
request, f"errors/{exc.status_code}.html", context, exc.status_code
)
except TemplateNotFound:
return render_template(request, "errors/detail.html", context, exc.status_code)
@app.middleware("http")
async def add_security_headers(request: Request, call_next: typing.Callable):
"""This middleware adds the CSP, XCTO, XFO and RP security
headers to the HTTP response associated with request.
CSP: Content-Security-Policy
@ -253,10 +267,16 @@ async def add_security_headers(request: Request, call_next: typing.Callable):
# Add CSP header.
nonce = request.user.nonce
csp = "default-src 'self'; "
# swagger-ui needs access to cdn.jsdelivr.net javascript
script_hosts = ["cdn.jsdelivr.net"]
csp += f"script-src 'self' 'unsafe-inline' 'nonce-{nonce}' " + " ".join(
script_hosts
)
# swagger-ui needs access to cdn.jsdelivr.net css
css_hosts = ["cdn.jsdelivr.net"]
csp += "; style-src 'self' 'unsafe-inline' " + " ".join(css_hosts)
response.headers["Content-Security-Policy"] = csp
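The CSP assembly above can be sketched as a standalone helper; this is an illustration of the string-building pattern, not the exact middleware code:

```python
def build_csp(nonce: str, script_hosts: list, css_hosts: list) -> str:
    # Assemble a Content-Security-Policy header value the way the
    # middleware above does: default-src, then script-src with a
    # per-request nonce, then style-src.
    csp = "default-src 'self'; "
    csp += f"script-src 'self' 'unsafe-inline' 'nonce-{nonce}' " + " ".join(script_hosts)
    csp += "; style-src 'self' 'unsafe-inline' " + " ".join(css_hosts)
    return csp


header = build_csp("d3adb33f", ["cdn.jsdelivr.net"], ["cdn.jsdelivr.net"])
```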
# Add XTCO header.
@ -276,17 +296,22 @@ async def add_security_headers(request: Request, call_next: typing.Callable):
@app.middleware("http")
async def check_terms_of_service(request: Request, call_next: typing.Callable):
"""This middleware function redirects authenticated users if they
have any outstanding Terms to agree to."""
if request.user.is_authenticated() and request.url.path != "/tos":
accepted = (
query(Term)
.join(AcceptedTerm)
.filter(
and_(
AcceptedTerm.UsersID == request.user.ID,
AcceptedTerm.TermsID == Term.ID,
AcceptedTerm.Revision >= Term.Revision,
),
)
)
if query(Term).count() - accepted.count() > 0:
return RedirectResponse("/tos", status_code=int(http.HTTPStatus.SEE_OTHER))
return await util.error_or_result(call_next, request)
@ -301,9 +326,14 @@ async def id_redirect_middleware(request: Request, call_next: typing.Callable):
for k, v in request.query_params.items():
if k != "id":
qs.append(f"{k}={quote_plus(str(v))}")
qs = str() if not qs else "?" + "&".join(qs)
path = request.url.path.rstrip("/")
return RedirectResponse(f"{path}/{id}{qs}")
return await util.error_or_result(call_next, request)
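The query-string rebuild in `id_redirect_middleware` (drop `id`, URL-encode everything else, prepend `?` only when parameters remain) can be sketched in isolation:

```python
from urllib.parse import quote_plus


def strip_id_param(params: dict) -> str:
    # Rebuild a query string with every parameter except "id",
    # mirroring the id_redirect_middleware logic above.
    qs = [f"{k}={quote_plus(str(v))}" for k, v in params.items() if k != "id"]
    return "" if not qs else "?" + "&".join(qs)
```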
# Add application middlewares.
app.add_middleware(AuthenticationMiddleware, backend=BasicAuthBackend())
app.add_middleware(SessionMiddleware, secret_key=session_secret)


@ -15,7 +15,7 @@ logging.getLogger("root").addHandler(logging.NullHandler())
def get_logger(name: str) -> logging.Logger:
"""A logging.getLogger wrapper. Importing this function and
using it to get a module-local logger ensures that logging.conf
initialization is performed wherever loggers are used.


@ -1,17 +1,18 @@
import fakeredis
from opentelemetry.instrumentation.redis import RedisInstrumentor
from redis import ConnectionPool, Redis
import aurweb.config
from aurweb import aur_logging
logger = aur_logging.get_logger(__name__)
pool = None
RedisInstrumentor().instrument()
class FakeConnectionPool:
"""A fake ConnectionPool class which holds an internal reference
to a fakeredis handle.
We normally deal with Redis by keeping its ConnectionPool globally


@ -1,25 +1,22 @@
import functools
from http import HTTPStatus
from typing import Callable
import fastapi
from fastapi import HTTPException
from fastapi.responses import RedirectResponse
from starlette.authentication import AuthCredentials, AuthenticationBackend
from starlette.requests import HTTPConnection
import aurweb.config
from aurweb import db, filters, l10n, time, util
from aurweb.models import Session, User
from aurweb.models.account_type import ACCOUNT_TYPE_ID
class StubQuery:
"""Acts as a stubbed version of an orm.Query. Typically used
to masquerade fake records for an AnonymousUser."""
def filter(self, *args):
return StubQuery()
@ -29,19 +26,21 @@ class StubQuery:
class AnonymousUser:
"""A stubbed User class used when an unauthenticated User
makes a request against FastAPI."""
# Stub attributes used to mimic a real user.
ID = 0
Username = "N/A"
Email = "N/A"
class AccountType:
"""A stubbed AccountType static class. In here, we use an ID
and AccountType which do not exist in our constant records.
All records primary keys (AccountType.ID) should be non-zero,
so using a zero here means that we'll never match against a
real AccountType."""
ID = 0
AccountType = "Anonymous"
@ -72,7 +71,7 @@ class AnonymousUser:
return False
@staticmethod
def is_package_maintainer():
return False
@staticmethod
@ -97,6 +96,7 @@ class AnonymousUser:
class BasicAuthBackend(AuthenticationBackend):
@db.async_retry_deadlock
async def authenticate(self, conn: HTTPConnection):
unauthenticated = (None, AnonymousUser())
sid = conn.cookies.get("AURSID")
@ -104,11 +104,9 @@ class BasicAuthBackend(AuthenticationBackend):
return unauthenticated
timeout = aurweb.config.getint("options", "login_timeout")
remembered = conn.cookies.get("AURREMEMBER") == "True"
if remembered:
timeout = aurweb.config.getint("options", "persistent_cookie_timeout")
# If no session with sid and a LastUpdateTS now or later exists.
now_ts = time.utcnow()
@ -123,12 +121,11 @@ class BasicAuthBackend(AuthenticationBackend):
# At this point, we cannot have an invalid user if the record
# exists, due to ForeignKey constraints in the schema upheld
# by mysqlclient.
user = db.query(User).filter(User.ID == record.UsersID).first()
user.nonce = util.make_nonce()
user.authenticated = True
return AuthCredentials(["authenticated"]), user
def _auth_required(auth_goal: bool = True):
@ -160,40 +157,45 @@ def _auth_required(auth_goal: bool = True):
# page itself is not directly possible (e.g. submitting a form).
if request.method in ("GET", "HEAD"):
url = request.url.path
elif referer := request.headers.get("Referer"):
aur = aurweb.config.get("options", "aur_location") + "/"
if not referer.startswith(aur):
_ = l10n.get_translator_for_request(request)
raise HTTPException(
status_code=HTTPStatus.BAD_REQUEST,
detail=_("Bad Referer header."),
)
url = referer[len(aur) - 1 :]
url = "/login?" + filters.urlencode({"next": url})
return RedirectResponse(url, status_code=int(HTTPStatus.SEE_OTHER))
return wrapper
return decorator
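The Referer validation above (reject referers outside our own origin, then slice the path off the accepted URL) can be sketched as a pure function; the `aur_location` value below is a hypothetical example, not taken from any config:

```python
from typing import Optional


def next_url_from_referer(referer: str, aur_location: str) -> Optional[str]:
    # Accept only referers under our own origin; slice off everything
    # before the path, keeping the leading "/".
    aur = aur_location + "/"
    if not referer.startswith(aur):
        return None  # the real middleware raises 400 "Bad Referer header."
    return referer[len(aur) - 1:]


base = "https://aur.archlinux.org"  # hypothetical aur_location value
```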
def requires_auth(func: Callable) -> Callable:
"""Require an authenticated session for a particular route."""
@functools.wraps(func)
async def wrapper(*args, **kwargs):
return await _auth_required(True)(func)(*args, **kwargs)
return wrapper
def requires_guest(func: Callable) -> Callable:
"""Require a guest (unauthenticated) session for a particular route."""
@functools.wraps(func)
async def wrapper(*args, **kwargs):
return await _auth_required(False)(func)(*args, **kwargs)
return wrapper
def account_type_required(one_of: set):
"""A decorator that can be used on FastAPI routes to dictate
that a user belongs to one of the types defined in one_of.
This decorator should be run after an @auth_required(True) is
@ -203,7 +205,7 @@ def account_type_required(one_of: set):
@router.get('/some_route')
@auth_required(True)
@account_type_required({"Package Maintainer", "Package Maintainer & Developer"})
async def some_route(request: fastapi.Request):
return Response()
@ -211,18 +213,15 @@ def account_type_required(one_of: set):
:return: Return the FastAPI function this decorator wraps.
"""
# Convert any account type string constants to their integer IDs.
one_of = {ACCOUNT_TYPE_ID[atype] for atype in one_of if isinstance(atype, str)}
def decorator(func):
@functools.wraps(func)
async def wrapper(request: fastapi.Request, *args, **kwargs):
if request.user.AccountTypeID not in one_of:
return RedirectResponse("/", status_code=int(HTTPStatus.SEE_OTHER))
return await func(request, *args, **kwargs)
return wrapper
return decorator


@ -1,4 +1,9 @@
from aurweb.models.account_type import (
DEVELOPER_ID,
PACKAGE_MAINTAINER_AND_DEV_ID,
PACKAGE_MAINTAINER_ID,
USER_ID,
)
from aurweb.models.user import User
ACCOUNT_CHANGE_TYPE = 1
@ -25,52 +30,53 @@ PKGBASE_VOTE = 16
PKGREQ_FILE = 23
PKGREQ_CLOSE = 17
PKGREQ_LIST = 18
PM_ADD_VOTE = 19
PM_LIST_VOTES = 20
PM_VOTE = 21
PKGBASE_MERGE = 29
user_developer_or_package_maintainer = set(
[USER_ID, PACKAGE_MAINTAINER_ID, DEVELOPER_ID, PACKAGE_MAINTAINER_AND_DEV_ID]
)
package_maintainer_or_dev = set(
[PACKAGE_MAINTAINER_ID, DEVELOPER_ID, PACKAGE_MAINTAINER_AND_DEV_ID]
)
developer = set([DEVELOPER_ID, PACKAGE_MAINTAINER_AND_DEV_ID])
package_maintainer = set([PACKAGE_MAINTAINER_ID, PACKAGE_MAINTAINER_AND_DEV_ID])
cred_filters = {
PKGBASE_FLAG: user_developer_or_package_maintainer,
PKGBASE_NOTIFY: user_developer_or_package_maintainer,
PKGBASE_VOTE: user_developer_or_package_maintainer,
PKGREQ_FILE: user_developer_or_package_maintainer,
ACCOUNT_CHANGE_TYPE: package_maintainer_or_dev,
ACCOUNT_EDIT: package_maintainer_or_dev,
ACCOUNT_LAST_LOGIN: package_maintainer_or_dev,
ACCOUNT_LIST_COMMENTS: package_maintainer_or_dev,
ACCOUNT_SEARCH: package_maintainer_or_dev,
COMMENT_DELETE: package_maintainer_or_dev,
COMMENT_UNDELETE: package_maintainer_or_dev,
COMMENT_VIEW_DELETED: package_maintainer_or_dev,
COMMENT_EDIT: package_maintainer_or_dev,
COMMENT_PIN: package_maintainer_or_dev,
PKGBASE_ADOPT: package_maintainer_or_dev,
PKGBASE_SET_KEYWORDS: package_maintainer_or_dev,
PKGBASE_DELETE: package_maintainer_or_dev,
PKGBASE_EDIT_COMAINTAINERS: package_maintainer_or_dev,
PKGBASE_DISOWN: package_maintainer_or_dev,
PKGBASE_LIST_VOTERS: package_maintainer_or_dev,
PKGBASE_UNFLAG: package_maintainer_or_dev,
PKGREQ_CLOSE: package_maintainer_or_dev,
PKGREQ_LIST: package_maintainer_or_dev,
PM_ADD_VOTE: package_maintainer,
PM_LIST_VOTES: package_maintainer_or_dev,
PM_VOTE: package_maintainer,
ACCOUNT_EDIT_DEV: developer,
PKGBASE_MERGE: package_maintainer_or_dev,
}
def has_credential(user: User, credential: int, approved: list = tuple()):
if user in approved:
return True
return user.AccountTypeID in cred_filters[credential]
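The credential check above is a set-membership lookup keyed by account type. A self-contained sketch with hypothetical ID values (the real constants live in `aurweb.models.account_type`):

```python
PM_VOTE = 21
PACKAGE_MAINTAINER_ID = 2       # hypothetical IDs, for illustration only
PACKAGE_MAINTAINER_AND_DEV_ID = 4

cred_filters = {PM_VOTE: {PACKAGE_MAINTAINER_ID, PACKAGE_MAINTAINER_AND_DEV_ID}}


class StubUser:
    def __init__(self, account_type_id):
        self.AccountTypeID = account_type_id


def has_credential(user, credential, approved=()):
    # Explicitly approved users pass outright; everyone else is checked
    # against the account-type sets in cred_filters.
    if user in approved:
        return True
    return user.AccountTypeID in cred_filters[credential]


pm = StubUser(PACKAGE_MAINTAINER_ID)
regular = StubUser(99)
```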


@ -1,4 +1,4 @@
from datetime import UTC, datetime
class Benchmark:
@ -6,16 +6,16 @@ class Benchmark:
self.start()
def _timestamp(self) -> float:
"""Generate a timestamp."""
return float(datetime.now(UTC).timestamp())
def start(self) -> int:
"""Start a benchmark."""
self.current = self._timestamp()
return self.current
def end(self):
"""Return the diff between now - start()."""
n = self._timestamp() - self.current
self.current = float(0)
return n
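The Benchmark change above swaps the deprecated `datetime.utcnow()` for a timezone-aware `datetime.now(UTC)`. A runnable sketch of the same class, using `timezone.utc` (the pre-3.11 spelling of `datetime.UTC`):

```python
import time
from datetime import datetime, timezone


class Benchmark:
    def __init__(self):
        self.start()

    def _timestamp(self) -> float:
        # timezone.utc is equivalent to datetime.UTC (added in Python 3.11).
        return float(datetime.now(timezone.utc).timestamp())

    def start(self) -> float:
        self.current = self._timestamp()
        return self.current

    def end(self) -> float:
        # Diff between now and start(), then reset.
        n = self._timestamp() - self.current
        self.current = float(0)
        return n


bench = Benchmark()
time.sleep(0.01)
elapsed = bench.end()
```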


@ -1,20 +1,64 @@
import pickle
from typing import Any, Callable
from sqlalchemy import orm
from aurweb import config
from aurweb.aur_redis import redis_connection
from aurweb.prometheus import SEARCH_REQUESTS
_redis = redis_connection()
def lambda_cache(key: str, value: Callable[[], Any], expire: int = None) -> list:
"""Store and retrieve lambda results via redis cache.
:param key: Redis key
:param value: Lambda callable returning the value
:param expire: Optional expiration in seconds
:return: result of callable or cache
"""
result = _redis.get(key)
if result is not None:
return pickle.loads(result)
_redis.set(key, (pickle.dumps(result := value())), ex=expire)
return result
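`lambda_cache` is a classic cache-aside pattern: return the unpickled hit, or evaluate the callable and store the pickled result. A sketch with an in-memory stand-in for the Redis handle (illustration only, not the real `redis_connection()`):

```python
import pickle
from typing import Any, Callable


class DictRedis:
    # In-memory stand-in for the Redis handle, for illustration only.
    def __init__(self):
        self._store = {}

    def get(self, key):
        return self._store.get(key)

    def set(self, key, value, ex=None):
        self._store[key] = value


_redis = DictRedis()


def lambda_cache(key: str, value: Callable[[], Any], expire: int = None) -> Any:
    # Hit: unpickle and return. Miss: evaluate, pickle, store, return.
    result = _redis.get(key)
    if result is not None:
        return pickle.loads(result)
    _redis.set(key, pickle.dumps(result := value()), ex=expire)
    return result


calls = []


def expensive():
    calls.append(1)
    return [1, 2, 3]


first = lambda_cache("k", expensive)
second = lambda_cache("k", expensive)
```

The second call is served from the cache, so the callable runs only once.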
def db_count_cache(key: str, query: orm.Query, expire: int = None) -> int:
"""Store and retrieve a query.count() via redis cache.
:param key: Redis key
:param query: SQLAlchemy ORM query
:param expire: Optional expiration in seconds
:return: query.count()
"""
result = _redis.get(key)
if result is None:
_redis.set(key, (result := int(query.count())))
if expire:
_redis.expire(key, expire)
return int(result)
def db_query_cache(key: str, query: orm.Query, expire: int = None) -> list:
"""Store and retrieve query results via redis cache.
:param key: Redis key
:param query: SQLAlchemy ORM query
:param expire: Optional expiration in seconds
:return: query.all()
"""
result = _redis.get(key)
if result is None:
SEARCH_REQUESTS.labels(cache="miss").inc()
if _redis.dbsize() > config.getint("cache", "max_search_entries", 50000):
return query.all()
_redis.set(key, (result := pickle.dumps(query.all())))
if expire:
_redis.expire(key, expire)
else:
SEARCH_REQUESTS.labels(cache="hit").inc()
return pickle.loads(result)


@ -1,7 +1,9 @@
""" This module consists of aurweb's CAPTCHA utility functions and filters. """
import hashlib
from jinja2 import pass_context
from sqlalchemy import func
from aurweb.db import query
from aurweb.models import User
@ -9,8 +11,9 @@ from aurweb.templates import register_filter
def get_captcha_salts():
"""Produce salts based on the current user count."""
count = query(func.count(User.ID)).scalar()
salts = []
for i in range(0, 6):
salts.append(f"aurweb-{count - i}")
@ -18,19 +21,19 @@ def get_captcha_salts():
def get_captcha_token(salt):
"""Produce a token for the CAPTCHA salt."""
return hashlib.md5(salt.encode()).hexdigest()[:3]
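The token derivation above is just the first three hex digits of the salt's md5, which can be verified in isolation:

```python
import hashlib


def get_captcha_token(salt: str) -> str:
    # First three hex digits of the md5 of the salt.
    return hashlib.md5(salt.encode()).hexdigest()[:3]


token = get_captcha_token("aurweb-1000")
```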
def get_captcha_challenge(salt):
"""Get a CAPTCHA challenge string (shell command) for a salt."""
token = get_captcha_token(salt)
return f"LC_ALL=C pacman -V|sed -r 's#[0-9]+#{token}#g'|md5sum|cut -c1-6"
def get_captcha_answer(token):
"""Compute the answer via md5 of the real template text, return the
first six digits of the hexadecimal hash."""
text = r"""
.--. Pacman v%s.%s.%s - libalpm v%s.%s.%s
/ _.-' .-. .-. .-. Copyright (C) %s-%s Pacman Development Team
@ -38,14 +41,16 @@ def get_captcha_answer(token):
'--'
This program may be freely redistributed under
the terms of the GNU General Public License.
""" % tuple(
[token] * 10
)
return hashlib.md5((text + "\n").encode()).hexdigest()[:6]
@register_filter("captcha_salt")
@pass_context
def captcha_salt_filter(context):
"""Returns the most recent CAPTCHA salt in the list of salts."""
salts = get_captcha_salts()
return salts[0]
@ -53,5 +58,5 @@ def captcha_salt_filter(context):
@register_filter("captcha_cmdline")
@pass_context
def captcha_cmdline_filter(context, salt):
"""Returns a CAPTCHA challenge for a given salt."""
return get_captcha_challenge(salt)


@ -1,12 +1,8 @@
import configparser
import os
from typing import Any
import tomlkit
_parser = None
@ -15,8 +11,8 @@ def _get_parser():
global _parser
if not _parser:
path = os.environ.get("AUR_CONFIG", "/etc/aurweb/config")
defaults = os.environ.get("AUR_CONFIG_DEFAULTS", path + ".defaults")
_parser = configparser.RawConfigParser()
_parser.optionxform = lambda option: option
@ -29,7 +25,7 @@ def _get_parser():
def rehash():
"""Globally rehash the configuration parser."""
global _parser
_parser = None
_get_parser()
@ -43,6 +39,18 @@ def get(section, option):
return _get_parser().get(section, option)
def _get_project_meta():
with open(os.path.join(get("options", "aurwebdir"), "pyproject.toml")) as pyproject:
file_contents = pyproject.read()
return tomlkit.parse(file_contents)["tool"]["poetry"]
# Publicly visible version of aurweb. This is used to display
# aurweb versioning in the footer and must be maintained.
AURWEB_VERSION = str(_get_project_meta()["version"])
def getboolean(section, option):
return _get_parser().getboolean(section, option)


@ -1,68 +1,8 @@
def samesite() -> str:
"""Produce cookie SameSite value.
Currently this is hard-coded to return "lax"
:returns "lax"
"""
return "lax"


@ -1,34 +1,14 @@
# Supported database drivers.
DRIVERS = {"mysql": "mysql+mysqldb"}
def make_random_value(table: str, column: str, length: int):
"""Generate a unique, random value for a string column in a table.
:return: A unique string that is not in the database
"""
import aurweb.util
string = aurweb.util.make_random_string(length)
while query(table).filter(column == string).first():
string = aurweb.util.make_random_string(length)
@ -52,8 +32,11 @@ def test_name() -> str:
:return: Unhashed database name
"""
import os
import aurweb.config
db = os.environ.get("PYTEST_CURRENT_TEST", aurweb.config.get("database", "name"))
return db.split(":")[0]
@ -70,7 +53,11 @@ def name() -> str:
dbname = test_name()
if not dbname.startswith("test/"):
return dbname
import hashlib
sha1 = hashlib.sha1(dbname.encode()).hexdigest()
return "db" + sha1
@ -78,18 +65,20 @@ def name() -> str:
_sessions = dict()
def get_session(engine=None):
"""Return aurweb.db's global session."""
dbname = name()
global _sessions
if dbname not in _sessions:
from sqlalchemy.orm import scoped_session, sessionmaker
if not engine: # pragma: no cover
engine = get_engine()
Session = scoped_session(
sessionmaker(autocommit=True, autoflush=False, bind=engine)
)
_sessions[dbname] = Session()
return _sessions.get(dbname)
@ -106,13 +95,17 @@ def pop_session(dbname: str) -> None:
_sessions.pop(dbname)
def refresh(model):
"""
Refresh the session's knowledge of `model`.
:returns: Passed in `model`
"""
get_session().refresh(model)
return model
def query(Model, *args, **kwargs):
"""
Perform an ORM query against the database session.
@ -124,7 +117,7 @@ def query(Model: Base, *args, **kwargs) -> Query:
return get_session().query(Model).filter(*args, **kwargs)
def create(Model, *args, **kwargs):
"""
Create a record and add() it to the database session.
@ -135,7 +128,7 @@ def create(Model: Base, *args, **kwargs) -> Base:
return add(instance)
def delete(model) -> None:
"""
Delete a set of records found by Query.filter(*args, **kwargs).
@ -144,81 +137,133 @@ def delete(model: Base) -> None:
get_session().delete(model)
def delete_all(iterable) -> None:
"""Delete each instance found in `iterable`."""
import aurweb.util
session_ = get_session()
aurweb.util.apply_all(iterable, session_.delete)
def rollback() -> None:
"""Rollback the database session."""
get_session().rollback()
def add(model):
"""Add `model` to the database session."""
get_session().add(model)
return model
def begin():
"""Begin an SQLAlchemy SessionTransaction."""
return get_session().begin()
def retry_deadlock(func):
from sqlalchemy.exc import OperationalError
def wrapper(*args, _i: int = 0, **kwargs):
# Retry 10 times, then raise the exception
# If we fail before the 10th, recurse into `wrapper`
# If we fail on the 10th, continue to throw the exception
limit = 10
try:
return func(*args, **kwargs)
except OperationalError as exc:
if _i < limit and "Deadlock found" in str(exc):
# Retry on deadlock by recursing into `wrapper`
return wrapper(*args, _i=_i + 1, **kwargs)
# Otherwise, just raise the exception
raise exc
return wrapper
def async_retry_deadlock(func):
from sqlalchemy.exc import OperationalError
async def wrapper(*args, _i: int = 0, **kwargs):
# Retry 10 times, then raise the exception
# If we fail before the 10th, recurse into `wrapper`
# If we fail on the 10th, continue to throw the exception
limit = 10
try:
return await func(*args, **kwargs)
except OperationalError as exc:
if _i < limit and "Deadlock found" in str(exc):
# Retry on deadlock by recursing into `wrapper`
return await wrapper(*args, _i=_i + 1, **kwargs)
# Otherwise, just raise the exception
raise exc
return wrapper
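The retry decorators above recurse into their own wrapper with an incremented attempt counter whenever a deadlock error surfaces. A runnable sketch of the synchronous variant, with a stand-in exception class in place of `sqlalchemy.exc.OperationalError`:

```python
class FakeOperationalError(Exception):
    # Stands in for sqlalchemy.exc.OperationalError in this sketch.
    pass


def retry_deadlock(func):
    def wrapper(*args, _i: int = 0, **kwargs):
        # Retry up to 10 times on deadlock, then re-raise.
        limit = 10
        try:
            return func(*args, **kwargs)
        except FakeOperationalError as exc:
            if _i < limit and "Deadlock found" in str(exc):
                return wrapper(*args, _i=_i + 1, **kwargs)
            raise exc
    return wrapper


attempts = []


@retry_deadlock
def flaky_commit():
    attempts.append(1)
    if len(attempts) < 3:
        raise FakeOperationalError("Deadlock found when trying to get lock")
    return "committed"


result = flaky_commit()
```

The first two attempts deadlock; the third succeeds, so the caller never sees the exception.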
def get_sqlalchemy_url():
"""
Build an SQLAlchemy URL for use with create_engine.
:return: sqlalchemy.engine.url.URL
"""
import sqlalchemy
from sqlalchemy.engine.url import URL
import aurweb.config
constructor = URL
parts = sqlalchemy.__version__.split(".")
major = int(parts[0])
minor = int(parts[1])
if major == 1 and minor >= 4: # pragma: no cover
constructor = URL.create
aur_db_backend = aurweb.config.get("database", "backend")
if aur_db_backend == "mysql":
param_query = {}
port = aurweb.config.get_with_fallback("database", "port", None)
if not port:
param_query["unix_socket"] = aurweb.config.get("database", "socket")
return constructor(
DRIVERS.get(aur_db_backend),
username=aurweb.config.get("database", "user"),
password=aurweb.config.get_with_fallback(
"database", "password", fallback=None
),
host=aurweb.config.get("database", "host"),
database=name(),
port=port,
query=param_query,
)
elif aur_db_backend == "sqlite":
return constructor(
"sqlite",
database=aurweb.config.get("database", "name"),
)
else:
raise ValueError("unsupported database backend")
def sqlite_regexp(regex, item) -> bool: # pragma: no cover
"""Method which mimics SQL's REGEXP for SQLite."""
import re
return bool(re.search(regex, str(item)))
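SQLite has no built-in REGEXP; the function above supplies one. A self-contained sketch of registering it, as `setup_sqlite` below does via the engine's connect event (here with the raw `sqlite3` module):

```python
import re
import sqlite3


def sqlite_regexp(regex, item) -> bool:
    # Mimic SQL's REGEXP using Python's re module.
    return bool(re.search(regex, str(item)))


conn = sqlite3.connect(":memory:")
# SQLite rewrites `X REGEXP Y` into a call to the 2-argument
# user-defined function regexp(Y, X).
conn.create_function("REGEXP", 2, sqlite_regexp, deterministic=True)
row = conn.execute("SELECT 'aurweb' REGEXP 'aur.*'").fetchone()
```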
def setup_sqlite(engine) -> None: # pragma: no cover
"""Perform setup for an SQLite engine."""
from sqlalchemy import event
@event.listens_for(engine, "connect")
def do_begin(conn, record):
import functools
create_deterministic_function = functools.partial(
conn.create_function, deterministic=True
)
create_deterministic_function("REGEXP", 2, sqlite_regexp)
@ -227,7 +272,7 @@ def setup_sqlite(engine: Engine) -> None: # pragma: no cover
_engines = dict()
def get_engine(dbname: str = None, echo: bool = False):
"""
Return the SQLAlchemy engine for `dbname`.
@ -238,6 +283,8 @@ def get_engine(dbname: str = None, echo: bool = False) -> Engine:
:param echo: Flag passed through to sqlalchemy.create_engine
:return: SQLAlchemy Engine instance
"""
import aurweb.config
if not dbname:
dbname = name()
@ -250,11 +297,13 @@ def get_engine(dbname: str = None, echo: bool = False) -> Engine:
if is_sqlite: # pragma: no cover
connect_args["check_same_thread"] = False
kwargs = {"echo": echo, "connect_args": connect_args}
from opentelemetry.instrumentation.sqlalchemy import SQLAlchemyInstrumentor
from sqlalchemy import create_engine
engine = create_engine(get_sqlalchemy_url(), **kwargs)
SQLAlchemyInstrumentor().instrument(engine=engine)
_engines[dbname] = engine
if is_sqlite: # pragma: no cover
setup_sqlite(_engines.get(dbname))
@ -274,7 +323,7 @@ def pop_engine(dbname: str) -> None:
def kill_engine() -> None:
"""Close the current session and dispose of the engine."""
dbname = name()
session = get_session()
@ -301,12 +350,16 @@ class ConnectionExecutor:
_conn = None
_paramstyle = None
def __init__(self, conn, backend=None):
import aurweb.config
backend = backend or aurweb.config.get("database", "backend")
self._conn = conn
if backend == "mysql":
self._paramstyle = "format"
elif backend == "sqlite":
import sqlite3
self._paramstyle = sqlite3.paramstyle
def paramstyle(self):
@ -314,13 +367,13 @@ class ConnectionExecutor:
def execute(self, query, params=()): # pragma: no cover
# TODO: SQLite support has been removed in FastAPI. It remains
# here to fund its support for the Sharness testsuite.
if self._paramstyle in ("format", "pyformat"):
query = query.replace("%", "%%").replace("?", "%s")
elif self._paramstyle == "qmark":
pass
else:
raise ValueError("unsupported paramstyle")
cur = self._conn.cursor()
cur.execute(query, params)
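The placeholder translation in `execute` can be sketched standalone: queries are written with qmark-style `?` placeholders, and for MySQLdb (whose paramstyle is `format`) they are rewritten to `%s`, escaping any literal `%` first so it survives formatting. The helper name below is hypothetical, mirroring the logic rather than aurweb's API.

```python
def translate_query(query: str, paramstyle: str) -> str:
    # Rewrite qmark-style placeholders for format/pyformat drivers;
    # double literal "%" so it is not consumed by the driver's formatter.
    if paramstyle in ("format", "pyformat"):
        return query.replace("%", "%%").replace("?", "%s")
    elif paramstyle == "qmark":
        return query
    raise ValueError("unsupported paramstyle")


print(translate_query("SELECT ID FROM Users WHERE Username = ?", "format"))
# SELECT ID FROM Users WHERE Username = %s
```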
@ -339,30 +392,36 @@ class Connection:
_conn = None
def __init__(self):
import aurweb.config
aur_db_backend = aurweb.config.get("database", "backend")
if aur_db_backend == "mysql":
import MySQLdb
aur_db_host = aurweb.config.get("database", "host")
aur_db_name = name()
aur_db_user = aurweb.config.get("database", "user")
aur_db_pass = aurweb.config.get_with_fallback("database", "password", str())
aur_db_socket = aurweb.config.get("database", "socket")
self._conn = MySQLdb.connect(
host=aur_db_host,
user=aur_db_user,
passwd=aur_db_pass,
db=aur_db_name,
unix_socket=aur_db_socket,
)
elif aur_db_backend == "sqlite": # pragma: no cover
# TODO: SQLite support has been removed in FastAPI. It remains
# here to fund its support for Sharness testsuite.
import math
import sqlite3
aur_db_name = aurweb.config.get("database", "name")
self._conn = sqlite3.connect(aur_db_name)
self._conn.create_function("POWER", 2, math.pow)
else:
raise ValueError("unsupported database backend")
self._conn = ConnectionExecutor(self._conn, aur_db_backend)


@ -6,6 +6,9 @@ O = 0
# Default [P]er [P]age
PP = 50
# Default Comments Per Page
COMMENTS_PER_PAGE = 10
# A whitelist of valid PP values
PP_WHITELIST = {50, 100, 250}
@ -14,8 +17,8 @@ RPC_SEARCH_BY = "name-desc"
def fallback_pp(per_page: int) -> int:
"""If `per_page` is a valid value in PP_WHITELIST, return it.
Otherwise, return defaults.PP."""
if per_page not in PP_WHITELIST:
return PP
return per_page
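The whitelist fallback above is self-contained enough to run directly; a minimal sketch with the same constants:

```python
PP = 50
PP_WHITELIST = {50, 100, 250}


def fallback_pp(per_page: int) -> int:
    # Any requested page size outside the whitelist falls back to the default.
    if per_page not in PP_WHITELIST:
        return PP
    return per_page


print(fallback_pp(250))   # 250
print(fallback_pp(9999))  # 50
```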


@ -1,5 +1,4 @@
import functools
from typing import Any, Callable
import fastapi
@ -19,61 +18,61 @@ class BannedException(AurwebException):
class PermissionDeniedException(AurwebException):
def __init__(self, user):
msg = "permission denied: {:s}".format(user)
super(PermissionDeniedException, self).__init__(msg)
class BrokenUpdateHookException(AurwebException):
def __init__(self, cmd):
msg = "broken update hook: {:s}".format(cmd)
super(BrokenUpdateHookException, self).__init__(msg)
class InvalidUserException(AurwebException):
def __init__(self, user):
msg = "unknown user: {:s}".format(user)
super(InvalidUserException, self).__init__(msg)
class InvalidPackageBaseException(AurwebException):
def __init__(self, pkgbase):
msg = "package base not found: {:s}".format(pkgbase)
super(InvalidPackageBaseException, self).__init__(msg)
class InvalidRepositoryNameException(AurwebException):
def __init__(self, pkgbase):
msg = "invalid repository name: {:s}".format(pkgbase)
super(InvalidRepositoryNameException, self).__init__(msg)
class PackageBaseExistsException(AurwebException):
def __init__(self, pkgbase):
msg = "package base already exists: {:s}".format(pkgbase)
super(PackageBaseExistsException, self).__init__(msg)
class InvalidReasonException(AurwebException):
def __init__(self, reason):
msg = "invalid reason: {:s}".format(reason)
super(InvalidReasonException, self).__init__(msg)
class InvalidCommentException(AurwebException):
def __init__(self, comment):
msg = "comment is too short: {:s}".format(comment)
super(InvalidCommentException, self).__init__(msg)
class AlreadyVotedException(AurwebException):
def __init__(self, comment):
msg = "already voted for package base: {:s}".format(comment)
super(AlreadyVotedException, self).__init__(msg)
class NotVotedException(AurwebException):
def __init__(self, comment):
msg = "missing vote for package base: {:s}".format(comment)
super(NotVotedException, self).__init__(msg)
@ -109,4 +108,5 @@ def handle_form_exceptions(route: Callable) -> fastapi.Response:
async def wrapper(request: fastapi.Request, *args, **kwargs):
request.state.form_data = await request.form()
return await route(request, *args, **kwargs)
return wrapper


@ -1,26 +1,23 @@
import copy
import math
from datetime import UTC, datetime
from typing import Any, Union
from urllib.parse import quote_plus, urlencode
from zoneinfo import ZoneInfo
import fastapi
import paginate
from jinja2 import pass_context
from jinja2.filters import do_format
import aurweb.models
from aurweb import config, l10n
from aurweb.templates import register_filter, register_function
@register_filter("pager_nav")
@pass_context
def pager_nav(context: dict[str, Any], page: int, total: int, prefix: str) -> str:
page = int(page) # Make sure this is an int.
pp = context.get("PP", 50)
@ -43,10 +40,9 @@ def pager_nav(context: Dict[str, Any],
return f"{prefix}?{qs}"
# Use the paginate module to produce our linkage.
pager = paginate.Page(
[], page=page + 1, items_per_page=pp, item_count=total, url_maker=create_url
)
return pager.pager(
link_attr={"class": "page"},
@ -56,7 +52,8 @@ def pager_nav(context: Dict[str, Any],
symbol_first="« First",
symbol_previous=" Previous",
symbol_next="Next ",
symbol_last="Last »",
)
@register_function("config_getint")
@ -71,17 +68,16 @@ def do_round(f: float) -> int:
@register_filter("tr")
@pass_context
def tr(context: dict[str, Any], value: str):
"""A translation filter; example: {{ "Hello" | tr("de") }}."""
_ = l10n.get_translator_for_request(context.get("request"))
return _(value)
@register_filter("tn")
@pass_context
def tn(context: dict[str, Any], count: int, singular: str, plural: str) -> str:
"""A singular and plural translation filter.
Example:
{{ some_integer | tn("singular %d", "plural %d") }}
@ -98,7 +94,7 @@ def tn(context: Dict[str, Any], count: int,
@register_filter("dt")
def timestamp_to_datetime(timestamp: int):
return datetime.fromtimestamp(timestamp, UTC)
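The `dt` filter change swaps the deprecated naive conversion for an aware one. A small sketch of the difference (using `timezone.utc`, of which the `UTC` name in the diff is the Python 3.11+ alias):

```python
from datetime import datetime, timezone

# datetime.utcfromtimestamp() is deprecated since Python 3.12 and returns a
# naive datetime; fromtimestamp(ts, tz) returns a timezone-aware one.
naive = datetime.utcfromtimestamp(86400)
aware = datetime.fromtimestamp(86400, timezone.utc)
print(naive.tzinfo)       # None
print(aware.isoformat())  # 1970-01-02T00:00:00+00:00
```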
@register_filter("as_timezone")
@ -107,8 +103,8 @@ def as_timezone(dt: datetime, timezone: str):
@register_filter("extend_query")
def extend_query(query: dict[str, Any], *additions) -> dict[str, Any]:
"""Add additional key value pairs to query."""
q = copy.copy(query)
for k, v in list(additions):
q[k] = v
@ -116,26 +112,26 @@ def extend_query(query: Dict[str, Any], *additions) -> Dict[str, Any]:
@register_filter("urlencode")
def to_qs(query: dict[str, Any]) -> str:
return urlencode(query, doseq=True)
@register_filter("get_vote")
def get_vote(voteinfo, request: fastapi.Request):
from aurweb.models import Vote
return voteinfo.votes.filter(Vote.User == request.user).first()
@register_filter("number_format")
def number_format(value: float, places: int):
"""A converter function similar to PHP's number_format."""
return f"{value:.{places}f}"
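The f-string precision syntax covers the simple fixed-decimal-places case of PHP's `number_format` (no thousands separators); a quick sketch:

```python
def number_format(value: float, places: int) -> str:
    # Nested braces let the precision itself be a variable.
    return f"{value:.{places}f}"


print(number_format(3.14159, 2))  # 3.14
print(number_format(2, 3))        # 2.000
```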
@register_filter("account_url")
@pass_context
def account_url(context: dict[str, Any], user: "aurweb.models.user.User") -> str:
base = aurweb.config.get("options", "aur_location")
return f"{base}/account/{user.Username}"
@ -152,8 +148,7 @@ def ceil(*args, **kwargs) -> int:
@register_function("date_strftime")
@pass_context
def date_strftime(context: dict[str, Any], dt: Union[int, datetime], fmt: str) -> str:
if isinstance(dt, int):
dt = timestamp_to_datetime(dt)
tz = context.get("timezone")
@ -162,11 +157,25 @@ def date_strftime(context: Dict[str, Any], dt: Union[int, datetime], fmt: str) \
@register_function("date_display")
@pass_context
def date_display(context: dict[str, Any], dt: Union[int, datetime]) -> str:
return date_strftime(context, dt, "%Y-%m-%d (%Z)")
@register_function("datetime_display")
@pass_context
def datetime_display(context: dict[str, Any], dt: Union[int, datetime]) -> str:
return date_strftime(context, dt, "%Y-%m-%d %H:%M (%Z)")
@register_filter("format")
def safe_format(value: str, *args: Any, **kwargs: Any) -> str:
"""Wrapper for jinja2 format function to perform additional checks."""
# If we don't have anything to be formatted, just return the value.
# We have some translations that do not contain placeholders for replacement.
# In these cases the jinja2 function is throwing an error:
# "TypeError: not all arguments converted during string formatting"
if "%" not in value:
return value
return do_format(value, *args, **kwargs)


@ -9,12 +9,12 @@ import aurweb.db
def format_command(env_vars, command, ssh_opts, ssh_key):
environment = ""
for key, var in env_vars.items():
environment += "{}={} ".format(key, shlex.quote(var))
command = shlex.quote(command)
command = "{}{}".format(environment, command)
# The command is being substituted into an authorized_keys line below,
# so we need to escape the double quotes.
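A runnable sketch of this escaping logic, under the assumption that the result lands inside the double-quoted `command="..."` option of an authorized_keys line (the exact option layout and key below are illustrative, not aurweb's exact output):

```python
import shlex


def format_command(env_vars, command, ssh_opts, ssh_key):
    environment = ""
    for key, var in env_vars.items():
        environment += "{}={} ".format(key, shlex.quote(var))
    command = shlex.quote(command)
    command = "{}{}".format(environment, command)
    # The command is substituted into a double-quoted authorized_keys
    # option, so embedded double quotes must be escaped.
    command = command.replace('"', '\\"')
    return 'command="{}" {} {}'.format(command, ssh_opts, ssh_key)


line = format_command(
    {"AUR_USER": "alice"}, "/usr/bin/aurweb-git-serve",
    "restrict", "ssh-ed25519 AAAA... example",
)
print(line)
```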
@ -24,10 +24,10 @@ def format_command(env_vars, command, ssh_opts, ssh_key):
def main():
valid_keytypes = aurweb.config.get("auth", "valid-keytypes").split()
username_regex = aurweb.config.get("auth", "username-regex")
git_serve_cmd = aurweb.config.get("auth", "git-serve-cmd")
ssh_opts = aurweb.config.get("auth", "ssh-options")
keytype = sys.argv[1]
keytext = sys.argv[2]
@ -36,11 +36,13 @@ def main():
conn = aurweb.db.Connection()
cur = conn.execute(
"SELECT Users.Username, Users.AccountTypeID FROM Users "
"INNER JOIN SSHPubKeys ON SSHPubKeys.UserID = Users.ID "
"WHERE SSHPubKeys.PubKey = ? AND Users.Suspended = 0 "
"AND NOT Users.Passwd = ''",
(keytype + " " + keytext,),
)
row = cur.fetchone()
if not row or cur.fetchone():
@ -51,13 +53,13 @@ def main():
exit(1)
env_vars = {
"AUR_USER": user,
"AUR_PRIVILEGED": "1" if account_type > 1 else "0",
}
key = keytype + " " + keytext
print(format_command(env_vars, git_serve_cmd, ssh_opts, key))
if __name__ == "__main__":
main()


@ -11,16 +11,16 @@ import aurweb.config
import aurweb.db
import aurweb.exceptions
notify_cmd = aurweb.config.get("notifications", "notify-cmd")
repo_path = aurweb.config.get("serve", "repo-path")
repo_regex = aurweb.config.get("serve", "repo-regex")
git_shell_cmd = aurweb.config.get("serve", "git-shell-cmd")
git_update_cmd = aurweb.config.get("serve", "git-update-cmd")
ssh_cmdline = aurweb.config.get("serve", "ssh-cmdline")
enable_maintenance = aurweb.config.getboolean("options", "enable-maintenance")
maintenance_exc = aurweb.config.get("options", "maintenance-exceptions").split()
def pkgbase_from_name(pkgbase):
@ -43,14 +43,16 @@ def list_repos(user):
if userid == 0:
raise aurweb.exceptions.InvalidUserException(user)
cur = conn.execute(
"SELECT Name, PackagerUID FROM PackageBases " + "WHERE MaintainerUID = ?",
[userid],
)
for row in cur:
print((" " if row[1] else "*") + row[0])
conn.close()
def validate_pkgbase(pkgbase, user):
if not re.match(repo_regex, pkgbase):
raise aurweb.exceptions.InvalidRepositoryNameException(pkgbase)
if pkgbase_exists(pkgbase):
@ -60,23 +62,12 @@ def create_pkgbase(pkgbase, user):
cur = conn.execute("SELECT ID FROM Users WHERE Username = ?", [user])
userid = cur.fetchone()[0]
conn.close()
if userid == 0:
raise aurweb.exceptions.InvalidUserException(user)
def pkgbase_adopt(pkgbase, user, privileged):
pkgbase_id = pkgbase_from_name(pkgbase)
@ -85,8 +76,10 @@ def pkgbase_adopt(pkgbase, user, privileged):
conn = aurweb.db.Connection()
cur = conn.execute(
"SELECT ID FROM PackageBases WHERE ID = ? AND " + "MaintainerUID IS NULL",
[pkgbase_id],
)
if not privileged and not cur.fetchone():
raise aurweb.exceptions.PermissionDeniedException(user)
@ -95,19 +88,25 @@ def pkgbase_adopt(pkgbase, user, privileged):
if userid == 0:
raise aurweb.exceptions.InvalidUserException(user)
cur = conn.execute(
"UPDATE PackageBases SET MaintainerUID = ? " + "WHERE ID = ?",
[userid, pkgbase_id],
)
cur = conn.execute(
"SELECT COUNT(*) FROM PackageNotifications WHERE "
+ "PackageBaseID = ? AND UserID = ?",
[pkgbase_id, userid],
)
if cur.fetchone()[0] == 0:
cur = conn.execute(
"INSERT INTO PackageNotifications "
+ "(PackageBaseID, UserID) VALUES (?, ?)",
[pkgbase_id, userid],
)
conn.commit()
subprocess.Popen((notify_cmd, "adopt", str(userid), str(pkgbase_id)))
conn.close()
@ -115,13 +114,16 @@ def pkgbase_adopt(pkgbase, user, privileged):
def pkgbase_get_comaintainers(pkgbase):
conn = aurweb.db.Connection()
cur = conn.execute(
"SELECT UserName FROM PackageComaintainers "
+ "INNER JOIN Users "
+ "ON Users.ID = PackageComaintainers.UsersID "
+ "INNER JOIN PackageBases "
+ "ON PackageBases.ID = PackageComaintainers.PackageBaseID "
+ "WHERE PackageBases.Name = ? "
+ "ORDER BY Priority ASC",
[pkgbase],
)
return [row[0] for row in cur.fetchall()]
@ -140,8 +142,7 @@ def pkgbase_set_comaintainers(pkgbase, userlist, user, privileged):
uids_old = set()
for olduser in userlist_old:
cur = conn.execute("SELECT ID FROM Users WHERE Username = ?", [olduser])
userid = cur.fetchone()[0]
if userid == 0:
raise aurweb.exceptions.InvalidUserException(user)
@ -149,8 +150,7 @@ def pkgbase_set_comaintainers(pkgbase, userlist, user, privileged):
uids_new = set()
for newuser in userlist:
cur = conn.execute("SELECT ID FROM Users WHERE Username = ?", [newuser])
userid = cur.fetchone()[0]
if userid == 0:
raise aurweb.exceptions.InvalidUserException(user)
@ -162,24 +162,33 @@ def pkgbase_set_comaintainers(pkgbase, userlist, user, privileged):
i = 1
for userid in uids_new:
if userid in uids_add:
cur = conn.execute(
"INSERT INTO PackageComaintainers "
+ "(PackageBaseID, UsersID, Priority) "
+ "VALUES (?, ?, ?)",
[pkgbase_id, userid, i],
)
subprocess.Popen(
(notify_cmd, "comaintainer-add", str(userid), str(pkgbase_id))
)
else:
cur = conn.execute(
"UPDATE PackageComaintainers "
+ "SET Priority = ? "
+ "WHERE PackageBaseID = ? AND UsersID = ?",
[i, pkgbase_id, userid],
)
i += 1
for userid in uids_rem:
cur = conn.execute(
"DELETE FROM PackageComaintainers "
+ "WHERE PackageBaseID = ? AND UsersID = ?",
[pkgbase_id, userid],
)
subprocess.Popen(
(notify_cmd, "comaintainer-remove", str(userid), str(pkgbase_id))
)
conn.commit()
conn.close()
@ -188,18 +197,21 @@ def pkgbase_set_comaintainers(pkgbase, userlist, user, privileged):
def pkgreq_by_pkgbase(pkgbase_id, reqtype):
conn = aurweb.db.Connection()
cur = conn.execute(
"SELECT PackageRequests.ID FROM PackageRequests "
+ "INNER JOIN RequestTypes ON "
+ "RequestTypes.ID = PackageRequests.ReqTypeID "
+ "WHERE PackageRequests.Status = 0 "
+ "AND PackageRequests.PackageBaseID = ? "
+ "AND RequestTypes.Name = ?",
[pkgbase_id, reqtype],
)
return [row[0] for row in cur.fetchall()]
def pkgreq_close(reqid, user, reason, comments, autoclose=False):
statusmap = {"accepted": 2, "rejected": 3}
if reason not in statusmap:
raise aurweb.exceptions.InvalidReasonException(reason)
status = statusmap[reason]
@ -215,16 +227,20 @@ def pkgreq_close(reqid, user, reason, comments, autoclose=False):
raise aurweb.exceptions.InvalidUserException(user)
now = int(time.time())
conn.execute(
"UPDATE PackageRequests SET Status = ?, ClosedTS = ?, "
+ "ClosedUID = ?, ClosureComment = ? "
+ "WHERE ID = ?",
[status, now, userid, comments, reqid],
)
conn.commit()
conn.close()
if not userid:
userid = 0
subprocess.Popen(
(notify_cmd, "request-close", str(userid), str(reqid), reason)
).wait()
def pkgbase_disown(pkgbase, user, privileged):
@ -239,9 +255,9 @@ def pkgbase_disown(pkgbase, user, privileged):
# TODO: Support disowning package bases via package request.
# Scan through pending orphan requests and close them.
comment = "The user {:s} disowned the package.".format(user)
for reqid in pkgreq_by_pkgbase(pkgbase_id, "orphan"):
pkgreq_close(reqid, user, "accepted", comment, True)
comaintainers = []
new_maintainer_userid = None
@ -249,19 +265,22 @@ def pkgbase_disown(pkgbase, user, privileged):
conn = aurweb.db.Connection()
# Make the first co-maintainer the new maintainer, unless the action was
# enforced by a Package Maintainer.
if initialized_by_owner:
comaintainers = pkgbase_get_comaintainers(pkgbase)
if len(comaintainers) > 0:
new_maintainer = comaintainers[0]
cur = conn.execute(
"SELECT ID FROM Users WHERE Username = ?", [new_maintainer]
)
new_maintainer_userid = cur.fetchone()[0]
comaintainers.remove(new_maintainer)
pkgbase_set_comaintainers(pkgbase, comaintainers, user, privileged)
cur = conn.execute(
"UPDATE PackageBases SET MaintainerUID = ? " + "WHERE ID = ?",
[new_maintainer_userid, pkgbase_id],
)
conn.commit()
@ -270,7 +289,7 @@ def pkgbase_disown(pkgbase, user, privileged):
if userid == 0:
raise aurweb.exceptions.InvalidUserException(user)
subprocess.Popen((notify_cmd, "disown", str(userid), str(pkgbase_id)))
conn.close()
@ -290,14 +309,16 @@ def pkgbase_flag(pkgbase, user, comment):
raise aurweb.exceptions.InvalidUserException(user)
now = int(time.time())
conn.execute(
"UPDATE PackageBases SET "
+ "OutOfDateTS = ?, FlaggerUID = ?, FlaggerComment = ? "
+ "WHERE ID = ? AND OutOfDateTS IS NULL",
[now, userid, comment, pkgbase_id],
)
conn.commit()
subprocess.Popen((notify_cmd, "flag", str(userid), str(pkgbase_id)))
def pkgbase_unflag(pkgbase, user):
@ -313,12 +334,15 @@ def pkgbase_unflag(pkgbase, user):
raise aurweb.exceptions.InvalidUserException(user)
if user in pkgbase_get_comaintainers(pkgbase):
conn.execute(
"UPDATE PackageBases SET OutOfDateTS = NULL " + "WHERE ID = ?", [pkgbase_id]
)
else:
conn.execute(
"UPDATE PackageBases SET OutOfDateTS = NULL "
+ "WHERE ID = ? AND (MaintainerUID = ? OR FlaggerUID = ?)",
[pkgbase_id, userid, userid],
)
conn.commit()
@ -335,17 +359,24 @@ def pkgbase_vote(pkgbase, user):
if userid == 0:
raise aurweb.exceptions.InvalidUserException(user)
cur = conn.execute(
"SELECT COUNT(*) FROM PackageVotes "
+ "WHERE UsersID = ? AND PackageBaseID = ?",
[userid, pkgbase_id],
)
if cur.fetchone()[0] > 0:
raise aurweb.exceptions.AlreadyVotedException(pkgbase)
now = int(time.time())
conn.execute(
"INSERT INTO PackageVotes (UsersID, PackageBaseID, VoteTS) "
+ "VALUES (?, ?, ?)",
[userid, pkgbase_id, now],
)
conn.execute(
"UPDATE PackageBases SET NumVotes = NumVotes + 1 " + "WHERE ID = ?",
[pkgbase_id],
)
conn.commit()
@ -361,16 +392,22 @@ def pkgbase_unvote(pkgbase, user):
if userid == 0:
raise aurweb.exceptions.InvalidUserException(user)
cur = conn.execute(
"SELECT COUNT(*) FROM PackageVotes "
+ "WHERE UsersID = ? AND PackageBaseID = ?",
[userid, pkgbase_id],
)
if cur.fetchone()[0] == 0:
raise aurweb.exceptions.NotVotedException(pkgbase)
conn.execute(
"DELETE FROM PackageVotes WHERE UsersID = ? AND " + "PackageBaseID = ?",
[userid, pkgbase_id],
)
conn.execute(
"UPDATE PackageBases SET NumVotes = NumVotes - 1 " + "WHERE ID = ?",
[pkgbase_id],
)
conn.commit()
@ -381,11 +418,12 @@ def pkgbase_set_keywords(pkgbase, keywords):
conn = aurweb.db.Connection()
conn.execute("DELETE FROM PackageKeywords WHERE PackageBaseID = ?", [pkgbase_id])
for keyword in keywords:
conn.execute(
"INSERT INTO PackageKeywords (PackageBaseID, Keyword) " + "VALUES (?, ?)",
[pkgbase_id, keyword],
)
conn.commit()
conn.close()
@ -394,24 +432,30 @@ def pkgbase_set_keywords(pkgbase, keywords):
def pkgbase_has_write_access(pkgbase, user):
conn = aurweb.db.Connection()
cur = conn.execute(
"SELECT COUNT(*) FROM PackageBases "
+ "LEFT JOIN PackageComaintainers "
+ "ON PackageComaintainers.PackageBaseID = PackageBases.ID "
+ "INNER JOIN Users "
+ "ON Users.ID = PackageBases.MaintainerUID "
+ "OR PackageBases.MaintainerUID IS NULL "
+ "OR Users.ID = PackageComaintainers.UsersID "
+ "WHERE Name = ? AND Username = ?",
[pkgbase, user],
)
return cur.fetchone()[0] > 0
def pkgbase_has_full_access(pkgbase, user):
conn = aurweb.db.Connection()
cur = conn.execute(
"SELECT COUNT(*) FROM PackageBases "
+ "INNER JOIN Users "
+ "ON Users.ID = PackageBases.MaintainerUID "
+ "WHERE Name = ? AND Username = ?",
[pkgbase, user],
)
return cur.fetchone()[0] > 0
@ -419,9 +463,11 @@ def log_ssh_login(user, remote_addr):
conn = aurweb.db.Connection()
now = int(time.time())
conn.execute(
"UPDATE Users SET LastSSHLogin = ?, "
+ "LastSSHLoginIPAddress = ? WHERE Username = ?",
[now, remote_addr, user],
)
conn.commit()
conn.close()
@ -430,8 +476,7 @@ def log_ssh_login(user, remote_addr):
def bans_match(remote_addr):
conn = aurweb.db.Connection()
cur = conn.execute("SELECT COUNT(*) FROM Bans WHERE IPAddress = ?", [remote_addr])
return cur.fetchone()[0] > 0
@ -458,13 +503,13 @@ def usage(cmds):
def checkarg_atleast(cmdargv, *argdesc):
if len(cmdargv) - 1 < len(argdesc):
msg = "missing {:s}".format(argdesc[len(cmdargv) - 1])
raise aurweb.exceptions.InvalidArgumentsException(msg)
def checkarg_atmost(cmdargv, *argdesc):
if len(cmdargv) - 1 > len(argdesc):
raise aurweb.exceptions.InvalidArgumentsException("too many arguments")
def checkarg(cmdargv, *argdesc):
@ -480,23 +525,23 @@ def serve(action, cmdargv, user, privileged, remote_addr): # noqa: C901
raise aurweb.exceptions.BannedException
log_ssh_login(user, remote_addr)
if action == "git" and cmdargv[1] in ("upload-pack", "receive-pack"):
action = action + "-" + cmdargv[1]
del cmdargv[1]
if action == "git-upload-pack" or action == "git-receive-pack":
checkarg(cmdargv, "path")
path = cmdargv[1].rstrip("/")
if not path.startswith("/"):
path = "/" + path
if not path.endswith(".git"):
path = path + ".git"
pkgbase = path[1:-4]
if not re.match(repo_regex, pkgbase):
raise aurweb.exceptions.InvalidRepositoryNameException(pkgbase)
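The path normalization above accepts the several spellings an SSH client may send for the same repository. A runnable sketch of just that step (the helper name is hypothetical; in the source the logic is inline in `serve()`):

```python
def pkgbase_from_path(path: str) -> str:
    # "foo", "/foo", "foo.git" and "/foo.git/" all resolve to "foo".
    path = path.rstrip("/")
    if not path.startswith("/"):
        path = "/" + path
    if not path.endswith(".git"):
        path = path + ".git"
    return path[1:-4]  # strip the leading "/" and trailing ".git"


for p in ("foo", "/foo", "foo.git", "/foo.git/"):
    print(pkgbase_from_path(p))  # foo, four times
```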
if action == "git-receive-pack" and pkgbase_exists(pkgbase):
if not privileged and not pkgbase_has_write_access(pkgbase, user):
raise aurweb.exceptions.PermissionDeniedException(user)
@@ -507,65 +552,60 @@ def serve(action, cmdargv, user, privileged, remote_addr): # noqa: C901
os.environ["AUR_PKGBASE"] = pkgbase
os.environ["GIT_NAMESPACE"] = pkgbase
cmd = action + " '" + repo_path + "'"
os.execl(git_shell_cmd, git_shell_cmd, '-c', cmd)
elif action == 'set-keywords':
checkarg_atleast(cmdargv, 'repository name')
os.execl(git_shell_cmd, git_shell_cmd, "-c", cmd)
elif action == "set-keywords":
checkarg_atleast(cmdargv, "repository name")
pkgbase_set_keywords(cmdargv[1], cmdargv[2:])
elif action == 'list-repos':
elif action == "list-repos":
checkarg(cmdargv)
list_repos(user)
elif action == 'setup-repo':
checkarg(cmdargv, 'repository name')
warn('{:s} is deprecated. '
'Use `git push` to create new repositories.'.format(action))
create_pkgbase(cmdargv[1], user)
elif action == 'restore':
checkarg(cmdargv, 'repository name')
elif action == "restore":
checkarg(cmdargv, "repository name")
pkgbase = cmdargv[1]
create_pkgbase(pkgbase, user)
validate_pkgbase(pkgbase, user)
os.environ["AUR_USER"] = user
os.environ["AUR_PKGBASE"] = pkgbase
os.execl(git_update_cmd, git_update_cmd, 'restore')
elif action == 'adopt':
checkarg(cmdargv, 'repository name')
os.execl(git_update_cmd, git_update_cmd, "restore")
elif action == "adopt":
checkarg(cmdargv, "repository name")
pkgbase = cmdargv[1]
pkgbase_adopt(pkgbase, user, privileged)
elif action == 'disown':
checkarg(cmdargv, 'repository name')
elif action == "disown":
checkarg(cmdargv, "repository name")
pkgbase = cmdargv[1]
pkgbase_disown(pkgbase, user, privileged)
elif action == 'flag':
checkarg(cmdargv, 'repository name', 'comment')
elif action == "flag":
checkarg(cmdargv, "repository name", "comment")
pkgbase = cmdargv[1]
comment = cmdargv[2]
pkgbase_flag(pkgbase, user, comment)
elif action == 'unflag':
checkarg(cmdargv, 'repository name')
elif action == "unflag":
checkarg(cmdargv, "repository name")
pkgbase = cmdargv[1]
pkgbase_unflag(pkgbase, user)
elif action == 'vote':
checkarg(cmdargv, 'repository name')
elif action == "vote":
checkarg(cmdargv, "repository name")
pkgbase = cmdargv[1]
pkgbase_vote(pkgbase, user)
elif action == 'unvote':
checkarg(cmdargv, 'repository name')
elif action == "unvote":
checkarg(cmdargv, "repository name")
pkgbase = cmdargv[1]
pkgbase_unvote(pkgbase, user)
elif action == 'set-comaintainers':
checkarg_atleast(cmdargv, 'repository name')
elif action == "set-comaintainers":
checkarg_atleast(cmdargv, "repository name")
pkgbase = cmdargv[1]
userlist = cmdargv[2:]
pkgbase_set_comaintainers(pkgbase, userlist, user, privileged)
elif action == 'help':
elif action == "help":
cmds = {
"adopt <name>": "Adopt a package base.",
"disown <name>": "Disown a package base.",
@@ -575,7 +615,6 @@ def serve(action, cmdargv, user, privileged, remote_addr): # noqa: C901
"restore <name>": "Restore a deleted package base.",
"set-comaintainers <name> [...]": "Set package base co-maintainers.",
"set-keywords <name> [...]": "Change package base keywords.",
"setup-repo <name>": "Create a repository (deprecated).",
"unflag <name>": "Remove out-of-date flag from a package base.",
"unvote <name>": "Remove vote from a package base.",
"vote <name>": "Vote for a package base.",
@@ -584,21 +623,21 @@ def serve(action, cmdargv, user, privileged, remote_addr): # noqa: C901
}
usage(cmds)
else:
msg = 'invalid command: {:s}'.format(action)
msg = "invalid command: {:s}".format(action)
raise aurweb.exceptions.InvalidArgumentsException(msg)
def main():
user = os.environ.get('AUR_USER')
privileged = (os.environ.get('AUR_PRIVILEGED', '0') == '1')
ssh_cmd = os.environ.get('SSH_ORIGINAL_COMMAND')
ssh_client = os.environ.get('SSH_CLIENT')
user = os.environ.get("AUR_USER")
privileged = os.environ.get("AUR_PRIVILEGED", "0") == "1"
ssh_cmd = os.environ.get("SSH_ORIGINAL_COMMAND")
ssh_client = os.environ.get("SSH_CLIENT")
if not ssh_cmd:
die_with_help("Interactive shell is disabled.")
die_with_help(f"Welcome to AUR, {user}! Interactive shell is disabled.")
cmdargv = shlex.split(ssh_cmd)
action = cmdargv[0]
remote_addr = ssh_client.split(' ')[0] if ssh_client else None
remote_addr = ssh_client.split(" ")[0] if ssh_client else None
try:
serve(action, cmdargv, user, privileged, remote_addr)
@@ -607,10 +646,10 @@ def main():
except aurweb.exceptions.BannedException:
die("The SSH interface is disabled for your IP address.")
except aurweb.exceptions.InvalidArgumentsException as e:
die_with_help('{:s}: {}'.format(action, e))
die_with_help("{:s}: {}".format(action, e))
except aurweb.exceptions.AurwebException as e:
die('{:s}: {}'.format(action, e))
die("{:s}: {}".format(action, e))
if __name__ == '__main__':
if __name__ == "__main__":
main()


@@ -13,23 +13,23 @@ import srcinfo.utils
import aurweb.config
import aurweb.db
notify_cmd = aurweb.config.get('notifications', 'notify-cmd')
notify_cmd = aurweb.config.get("notifications", "notify-cmd")
repo_path = aurweb.config.get('serve', 'repo-path')
repo_regex = aurweb.config.get('serve', 'repo-regex')
repo_path = aurweb.config.get("serve", "repo-path")
repo_regex = aurweb.config.get("serve", "repo-regex")
max_blob_size = aurweb.config.getint('update', 'max-blob-size')
max_blob_size = aurweb.config.getint("update", "max-blob-size")
def size_humanize(num):
for unit in ['B', 'KiB', 'MiB', 'GiB', 'TiB', 'PiB', 'EiB', 'ZiB']:
for unit in ["B", "KiB", "MiB", "GiB", "TiB", "PiB", "EiB", "ZiB"]:
if abs(num) < 2048.0:
if isinstance(num, int):
return "{}{}".format(num, unit)
else:
return "{:.2f}{}".format(num, unit)
num /= 1024.0
return "{:.2f}{}".format(num, 'YiB')
return "{:.2f}{}".format(num, "YiB")
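`size_humanize` walks the binary prefixes, dividing by 1024 until the value drops below 2048 (so e.g. 1536 KiB stays in KiB rather than becoming 1.50 MiB), and keeps exact integer formatting for whole byte counts. The reformatted function, runnable on its own:

```python
def size_humanize(num):
    # Walk the binary prefixes; 2048 is the threshold, so values stay in
    # the smaller unit until they exceed 2 of the next one.
    for unit in ["B", "KiB", "MiB", "GiB", "TiB", "PiB", "EiB", "ZiB"]:
        if abs(num) < 2048.0:
            if isinstance(num, int):
                return "{}{}".format(num, unit)
            return "{:.2f}{}".format(num, unit)
        num /= 1024.0
    return "{:.2f}{}".format(num, "YiB")


print(size_humanize(512))         # 512B
print(size_humanize(250 * 1024))  # 250.00KiB
```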
def extract_arch_fields(pkginfo, field):
@@ -39,20 +39,20 @@ def extract_arch_fields(pkginfo, field):
for val in pkginfo[field]:
values.append({"value": val, "arch": None})
for arch in pkginfo['arch']:
if field + '_' + arch in pkginfo:
for val in pkginfo[field + '_' + arch]:
for arch in pkginfo["arch"]:
if field + "_" + arch in pkginfo:
for val in pkginfo[field + "_" + arch]:
values.append({"value": val, "arch": arch})
return values
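The hunk above starts mid-function; the list initialization and the guard for the architecture-independent field sit just before it. A reconstructed, self-contained sketch (the two leading lines are filled in to make it runnable; the sample `pkginfo` dict is illustrative):

```python
def extract_arch_fields(pkginfo, field):
    # Collect architecture-independent values first (arch=None), then any
    # arch-specific variants such as "source_x86_64".
    values = []
    if field in pkginfo:
        for val in pkginfo[field]:
            values.append({"value": val, "arch": None})
    for arch in pkginfo["arch"]:
        if field + "_" + arch in pkginfo:
            for val in pkginfo[field + "_" + arch]:
                values.append({"value": val, "arch": arch})
    return values


pkginfo = {
    "arch": ["x86_64", "aarch64"],
    "source": ["app.tar.gz"],
    "source_x86_64": ["blob-x86_64.bin"],
}
print(extract_arch_fields(pkginfo, "source"))
# [{'value': 'app.tar.gz', 'arch': None},
#  {'value': 'blob-x86_64.bin', 'arch': 'x86_64'}]
```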
def parse_dep(depstring):
dep, _, desc = depstring.partition(': ')
depname = re.sub(r'(<|=|>).*', '', dep)
depcond = dep[len(depname):]
dep, _, desc = depstring.partition(": ")
depname = re.sub(r"(<|=|>).*", "", dep)
depcond = dep[len(depname) :]
return (depname, desc, depcond)
return depname, desc, depcond
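`parse_dep` splits an optdepends-style string into name, optional description, and version condition; the diff only drops the redundant parentheses around the returned tuple. Standalone:

```python
import re


def parse_dep(depstring):
    # "gtk3>=3.22: for the GUI" -> name, description, version condition
    dep, _, desc = depstring.partition(": ")
    depname = re.sub(r"(<|=|>).*", "", dep)
    depcond = dep[len(depname):]
    return depname, desc, depcond


print(parse_dep("gtk3>=3.22: for the GUI"))  # ('gtk3', 'for the GUI', '>=3.22')
print(parse_dep("python"))                   # ('python', '', '')
```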
def create_pkgbase(conn, pkgbase, user):
@@ -60,15 +60,18 @@ def create_pkgbase(conn, pkgbase, user):
userid = cur.fetchone()[0]
now = int(time.time())
cur = conn.execute("INSERT INTO PackageBases (Name, SubmittedTS, " +
"ModifiedTS, SubmitterUID, MaintainerUID, " +
"FlaggerComment) VALUES (?, ?, ?, ?, ?, '')",
[pkgbase, now, now, userid, userid])
cur = conn.execute(
"INSERT INTO PackageBases (Name, SubmittedTS, "
+ "ModifiedTS, SubmitterUID, MaintainerUID, "
+ "FlaggerComment) VALUES (?, ?, ?, ?, ?, '')",
[pkgbase, now, now, userid, userid],
)
pkgbase_id = cur.lastrowid
cur = conn.execute("INSERT INTO PackageNotifications " +
"(PackageBaseID, UserID) VALUES (?, ?)",
[pkgbase_id, userid])
cur = conn.execute(
"INSERT INTO PackageNotifications " + "(PackageBaseID, UserID) VALUES (?, ?)",
[pkgbase_id, userid],
)
conn.commit()
@@ -77,9 +80,10 @@ def create_pkgbase(conn, pkgbase, user):
def save_metadata(metadata, conn, user): # noqa: C901
# Obtain package base ID and previous maintainer.
pkgbase = metadata['pkgbase']
cur = conn.execute("SELECT ID, MaintainerUID FROM PackageBases "
"WHERE Name = ?", [pkgbase])
pkgbase = metadata["pkgbase"]
cur = conn.execute(
"SELECT ID, MaintainerUID FROM PackageBases " "WHERE Name = ?", [pkgbase]
)
(pkgbase_id, maintainer_uid) = cur.fetchone()
was_orphan = not maintainer_uid
@@ -89,119 +93,142 @@ def save_metadata(metadata, conn, user): # noqa: C901
# Update package base details and delete current packages.
now = int(time.time())
conn.execute("UPDATE PackageBases SET ModifiedTS = ?, " +
"PackagerUID = ?, OutOfDateTS = NULL WHERE ID = ?",
[now, user_id, pkgbase_id])
conn.execute("UPDATE PackageBases SET MaintainerUID = ? " +
"WHERE ID = ? AND MaintainerUID IS NULL",
[user_id, pkgbase_id])
for table in ('Sources', 'Depends', 'Relations', 'Licenses', 'Groups'):
conn.execute("DELETE FROM Package" + table + " WHERE EXISTS (" +
"SELECT * FROM Packages " +
"WHERE Packages.PackageBaseID = ? AND " +
"Package" + table + ".PackageID = Packages.ID)",
[pkgbase_id])
conn.execute(
"UPDATE PackageBases SET ModifiedTS = ?, "
+ "PackagerUID = ?, OutOfDateTS = NULL WHERE ID = ?",
[now, user_id, pkgbase_id],
)
conn.execute(
"UPDATE PackageBases SET MaintainerUID = ? "
+ "WHERE ID = ? AND MaintainerUID IS NULL",
[user_id, pkgbase_id],
)
for table in ("Sources", "Depends", "Relations", "Licenses", "Groups"):
conn.execute(
"DELETE FROM Package"
+ table
+ " WHERE EXISTS ("
+ "SELECT * FROM Packages "
+ "WHERE Packages.PackageBaseID = ? AND "
+ "Package"
+ table
+ ".PackageID = Packages.ID)",
[pkgbase_id],
)
conn.execute("DELETE FROM Packages WHERE PackageBaseID = ?", [pkgbase_id])
for pkgname in srcinfo.utils.get_package_names(metadata):
pkginfo = srcinfo.utils.get_merged_package(pkgname, metadata)
if 'epoch' in pkginfo and int(pkginfo['epoch']) > 0:
ver = '{:d}:{:s}-{:s}'.format(int(pkginfo['epoch']),
pkginfo['pkgver'],
pkginfo['pkgrel'])
if "epoch" in pkginfo and int(pkginfo["epoch"]) > 0:
ver = "{:d}:{:s}-{:s}".format(
int(pkginfo["epoch"]), pkginfo["pkgver"], pkginfo["pkgrel"]
)
else:
ver = '{:s}-{:s}'.format(pkginfo['pkgver'], pkginfo['pkgrel'])
ver = "{:s}-{:s}".format(pkginfo["pkgver"], pkginfo["pkgrel"])
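The branch above builds the full pacman version string, `epoch:pkgver-pkgrel`, omitting the epoch prefix when it is zero or absent. Extracted into a hypothetical helper (`format_version` is not a name used by aurweb, only an illustration of this branch):

```python
def format_version(pkginfo):
    # A positive epoch is prefixed as "epoch:"; otherwise the version is
    # the plain "pkgver-pkgrel" pair, matching pacman's version syntax.
    if "epoch" in pkginfo and int(pkginfo["epoch"]) > 0:
        return "{:d}:{:s}-{:s}".format(
            int(pkginfo["epoch"]), pkginfo["pkgver"], pkginfo["pkgrel"]
        )
    return "{:s}-{:s}".format(pkginfo["pkgver"], pkginfo["pkgrel"])


print(format_version({"pkgver": "1.2.3", "pkgrel": "1"}))                # 1.2.3-1
print(format_version({"epoch": "2", "pkgver": "1.2.3", "pkgrel": "1"}))  # 2:1.2.3-1
```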
for field in ('pkgdesc', 'url'):
for field in ("pkgdesc", "url"):
if field not in pkginfo:
pkginfo[field] = None
# Create a new package.
cur = conn.execute("INSERT INTO Packages (PackageBaseID, Name, " +
"Version, Description, URL) " +
"VALUES (?, ?, ?, ?, ?)",
[pkgbase_id, pkginfo['pkgname'], ver,
pkginfo['pkgdesc'], pkginfo['url']])
cur = conn.execute(
"INSERT INTO Packages (PackageBaseID, Name, "
+ "Version, Description, URL) "
+ "VALUES (?, ?, ?, ?, ?)",
[pkgbase_id, pkginfo["pkgname"], ver, pkginfo["pkgdesc"], pkginfo["url"]],
)
conn.commit()
pkgid = cur.lastrowid
# Add package sources.
for source_info in extract_arch_fields(pkginfo, 'source'):
conn.execute("INSERT INTO PackageSources (PackageID, Source, " +
"SourceArch) VALUES (?, ?, ?)",
[pkgid, source_info['value'], source_info['arch']])
for source_info in extract_arch_fields(pkginfo, "source"):
conn.execute(
"INSERT INTO PackageSources (PackageID, Source, "
+ "SourceArch) VALUES (?, ?, ?)",
[pkgid, source_info["value"], source_info["arch"]],
)
# Add package dependencies.
for deptype in ('depends', 'makedepends',
'checkdepends', 'optdepends'):
cur = conn.execute("SELECT ID FROM DependencyTypes WHERE Name = ?",
[deptype])
for deptype in ("depends", "makedepends", "checkdepends", "optdepends"):
cur = conn.execute(
"SELECT ID FROM DependencyTypes WHERE Name = ?", [deptype]
)
deptypeid = cur.fetchone()[0]
for dep_info in extract_arch_fields(pkginfo, deptype):
depname, depdesc, depcond = parse_dep(dep_info['value'])
deparch = dep_info['arch']
conn.execute("INSERT INTO PackageDepends (PackageID, " +
"DepTypeID, DepName, DepDesc, DepCondition, " +
"DepArch) VALUES (?, ?, ?, ?, ?, ?)",
[pkgid, deptypeid, depname, depdesc, depcond,
deparch])
depname, depdesc, depcond = parse_dep(dep_info["value"])
deparch = dep_info["arch"]
conn.execute(
"INSERT INTO PackageDepends (PackageID, "
+ "DepTypeID, DepName, DepDesc, DepCondition, "
+ "DepArch) VALUES (?, ?, ?, ?, ?, ?)",
[pkgid, deptypeid, depname, depdesc, depcond, deparch],
)
# Add package relations (conflicts, provides, replaces).
for reltype in ('conflicts', 'provides', 'replaces'):
cur = conn.execute("SELECT ID FROM RelationTypes WHERE Name = ?",
[reltype])
for reltype in ("conflicts", "provides", "replaces"):
cur = conn.execute("SELECT ID FROM RelationTypes WHERE Name = ?", [reltype])
reltypeid = cur.fetchone()[0]
for rel_info in extract_arch_fields(pkginfo, reltype):
relname, _, relcond = parse_dep(rel_info['value'])
relarch = rel_info['arch']
conn.execute("INSERT INTO PackageRelations (PackageID, " +
"RelTypeID, RelName, RelCondition, RelArch) " +
"VALUES (?, ?, ?, ?, ?)",
[pkgid, reltypeid, relname, relcond, relarch])
relname, _, relcond = parse_dep(rel_info["value"])
relarch = rel_info["arch"]
conn.execute(
"INSERT INTO PackageRelations (PackageID, "
+ "RelTypeID, RelName, RelCondition, RelArch) "
+ "VALUES (?, ?, ?, ?, ?)",
[pkgid, reltypeid, relname, relcond, relarch],
)
# Add package licenses.
if 'license' in pkginfo:
for license in pkginfo['license']:
cur = conn.execute("SELECT ID FROM Licenses WHERE Name = ?",
[license])
if "license" in pkginfo:
for license in pkginfo["license"]:
cur = conn.execute("SELECT ID FROM Licenses WHERE Name = ?", [license])
row = cur.fetchone()
if row:
licenseid = row[0]
else:
cur = conn.execute("INSERT INTO Licenses (Name) " +
"VALUES (?)", [license])
cur = conn.execute(
"INSERT INTO Licenses (Name) " + "VALUES (?)", [license]
)
conn.commit()
licenseid = cur.lastrowid
conn.execute("INSERT INTO PackageLicenses (PackageID, " +
"LicenseID) VALUES (?, ?)",
[pkgid, licenseid])
conn.execute(
"INSERT INTO PackageLicenses (PackageID, "
+ "LicenseID) VALUES (?, ?)",
[pkgid, licenseid],
)
# Add package groups.
if 'groups' in pkginfo:
for group in pkginfo['groups']:
cur = conn.execute("SELECT ID FROM `Groups` WHERE Name = ?",
[group])
if "groups" in pkginfo:
for group in pkginfo["groups"]:
cur = conn.execute("SELECT ID FROM `Groups` WHERE Name = ?", [group])
row = cur.fetchone()
if row:
groupid = row[0]
else:
cur = conn.execute("INSERT INTO `Groups` (Name) VALUES (?)",
[group])
cur = conn.execute(
"INSERT INTO `Groups` (Name) VALUES (?)", [group]
)
conn.commit()
groupid = cur.lastrowid
conn.execute("INSERT INTO PackageGroups (PackageID, "
"GroupID) VALUES (?, ?)", [pkgid, groupid])
conn.execute(
"INSERT INTO PackageGroups (PackageID, " "GroupID) VALUES (?, ?)",
[pkgid, groupid],
)
# Add user to notification list on adoption.
if was_orphan:
cur = conn.execute("SELECT COUNT(*) FROM PackageNotifications WHERE " +
"PackageBaseID = ? AND UserID = ?",
[pkgbase_id, user_id])
cur = conn.execute(
"SELECT COUNT(*) FROM PackageNotifications WHERE "
+ "PackageBaseID = ? AND UserID = ?",
[pkgbase_id, user_id],
)
if cur.fetchone()[0] == 0:
conn.execute("INSERT INTO PackageNotifications " +
"(PackageBaseID, UserID) VALUES (?, ?)",
[pkgbase_id, user_id])
conn.execute(
"INSERT INTO PackageNotifications "
+ "(PackageBaseID, UserID) VALUES (?, ?)",
[pkgbase_id, user_id],
)
conn.commit()
@@ -212,7 +239,7 @@ def update_notify(conn, user, pkgbase_id):
user_id = int(cur.fetchone()[0])
# Execute the notification script.
subprocess.Popen((notify_cmd, 'update', str(user_id), str(pkgbase_id)))
subprocess.Popen((notify_cmd, "update", str(user_id), str(pkgbase_id)))
def die(msg):
@@ -225,28 +252,91 @@ def warn(msg):
def die_commit(msg, commit):
sys.stderr.write("error: The following error " +
"occurred when parsing commit\n")
sys.stderr.write("error: The following error " + "occurred when parsing commit\n")
sys.stderr.write("error: {:s}:\n".format(commit))
sys.stderr.write("error: {:s}\n".format(msg))
exit(1)
def validate_metadata(metadata, commit): # noqa: C901
try:
metadata_pkgbase = metadata["pkgbase"]
except KeyError:
die_commit(
"invalid .SRCINFO, does not contain a pkgbase (is the file empty?)",
str(commit.id),
)
if not re.match(repo_regex, metadata_pkgbase):
die_commit("invalid pkgbase: {:s}".format(metadata_pkgbase), str(commit.id))
if not metadata["packages"]:
die_commit("missing pkgname entry", str(commit.id))
for pkgname in set(metadata["packages"].keys()):
pkginfo = srcinfo.utils.get_merged_package(pkgname, metadata)
for field in ("pkgver", "pkgrel", "pkgname"):
if field not in pkginfo:
die_commit(
"missing mandatory field: {:s}".format(field), str(commit.id)
)
if "epoch" in pkginfo and not pkginfo["epoch"].isdigit():
die_commit("invalid epoch: {:s}".format(pkginfo["epoch"]), str(commit.id))
if not re.match(r"[a-z0-9][a-z0-9\.+_-]*$", pkginfo["pkgname"]):
die_commit(
"invalid package name: {:s}".format(pkginfo["pkgname"]),
str(commit.id),
)
max_len = {"pkgname": 255, "pkgdesc": 255, "url": 8000}
for field in max_len.keys():
if field in pkginfo and len(pkginfo[field]) > max_len[field]:
die_commit(
"{:s} field too long: {:s}".format(field, pkginfo[field]),
str(commit.id),
)
for field in ("install", "changelog"):
if field in pkginfo and not pkginfo[field] in commit.tree:
die_commit(
"missing {:s} file: {:s}".format(field, pkginfo[field]),
str(commit.id),
)
for field in extract_arch_fields(pkginfo, "source"):
fname = field["value"]
if len(fname) > 8000:
die_commit("source entry too long: {:s}".format(fname), str(commit.id))
if "://" in fname or "lp:" in fname:
continue
if fname not in commit.tree:
die_commit("missing source file: {:s}".format(fname), str(commit.id))
def validate_blob_size(blob: pygit2.Object, commit: pygit2.Commit):
if isinstance(blob, pygit2.Blob) and blob.size > max_blob_size:
die_commit(
"maximum blob size ({:s}) exceeded".format(size_humanize(max_blob_size)),
str(commit.id),
)
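The new `validate_blob_size` helper factors the size check out so it can be reused for both root-directory files and PGP key files. The check itself is a plain attribute comparison; a dependency-free sketch, with `FakeBlob` standing in for `pygit2.Blob` and an assumed limit (aurweb reads the real one from `[update] max-blob-size`):

```python
MAX_BLOB_SIZE = 256 * 1024  # assumed limit, for illustration only


class FakeBlob:
    # Stand-in for pygit2.Blob: only the .size attribute matters here.
    def __init__(self, size):
        self.size = size


def blob_too_large(blob, max_blob_size=MAX_BLOB_SIZE):
    # The isinstance guard skips tree entries that are not blobs,
    # mirroring the pygit2.Blob check in validate_blob_size.
    return isinstance(blob, FakeBlob) and blob.size > max_blob_size


print(blob_too_large(FakeBlob(1024)))             # False
print(blob_too_large(FakeBlob(MAX_BLOB_SIZE + 1)))  # True
```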
def main(): # noqa: C901
repo = pygit2.Repository(repo_path)
user = os.environ.get("AUR_USER")
pkgbase = os.environ.get("AUR_PKGBASE")
privileged = (os.environ.get("AUR_PRIVILEGED", '0') == '1')
allow_overwrite = (os.environ.get("AUR_OVERWRITE", '0') == '1') and privileged
privileged = os.environ.get("AUR_PRIVILEGED", "0") == "1"
allow_overwrite = (os.environ.get("AUR_OVERWRITE", "0") == "1") and privileged
warn_or_die = warn if privileged else die
if len(sys.argv) == 2 and sys.argv[1] == "restore":
if 'refs/heads/' + pkgbase not in repo.listall_references():
die('{:s}: repository not found: {:s}'.format(sys.argv[1],
pkgbase))
if "refs/heads/" + pkgbase not in repo.listall_references():
die("{:s}: repository not found: {:s}".format(sys.argv[1], pkgbase))
refname = "refs/heads/master"
branchref = 'refs/heads/' + pkgbase
branchref = "refs/heads/" + pkgbase
sha1_old = sha1_new = repo.lookup_reference(branchref).target
elif len(sys.argv) == 4:
refname, sha1_old, sha1_new = sys.argv[1:4]
@@ -266,137 +356,115 @@ def main(): # noqa: C901
die("denying non-fast-forward (you should pull first)")
# Prepare the walker that validates new commits.
walker = repo.walk(sha1_new, pygit2.GIT_SORT_TOPOLOGICAL)
walker = repo.walk(sha1_new, pygit2.GIT_SORT_REVERSE)
if sha1_old != "0" * 40:
walker.hide(sha1_old)
head_commit = repo[sha1_new]
if ".SRCINFO" not in head_commit.tree:
die_commit("missing .SRCINFO", str(head_commit.id))
# Read .SRCINFO from the HEAD commit.
metadata_raw = repo[head_commit.tree[".SRCINFO"].id].data.decode()
(metadata, errors) = srcinfo.parse.parse_srcinfo(metadata_raw)
if errors:
sys.stderr.write(
"error: The following errors occurred " "when parsing .SRCINFO in commit\n"
)
sys.stderr.write("error: {:s}:\n".format(str(head_commit.id)))
for error in errors:
for err in error["error"]:
sys.stderr.write("error: line {:d}: {:s}\n".format(error["line"], err))
exit(1)
# check if there is a correct .SRCINFO file in the latest revision
validate_metadata(metadata, head_commit)
# Validate all new commits.
for commit in walker:
for fname in ('.SRCINFO', 'PKGBUILD'):
if fname not in commit.tree:
die_commit("missing {:s}".format(fname), str(commit.id))
if "PKGBUILD" not in commit.tree:
die_commit("missing PKGBUILD", str(commit.id))
# Iterate over files in root dir
for treeobj in commit.tree:
blob = repo[treeobj.id]
# Don't allow any subdirs besides "keys/"
if isinstance(treeobj, pygit2.Tree) and treeobj.name != "keys":
die_commit(
"the repository must not contain subdirectories",
str(commit.id),
)
if isinstance(blob, pygit2.Tree):
die_commit("the repository must not contain subdirectories",
str(commit.id))
# Check size of files in root dir
validate_blob_size(treeobj, commit)
if not isinstance(blob, pygit2.Blob):
die_commit("not a blob object: {:s}".format(treeobj),
str(commit.id))
if blob.size > max_blob_size:
die_commit("maximum blob size ({:s}) exceeded".format(
size_humanize(max_blob_size)), str(commit.id))
metadata_raw = repo[commit.tree['.SRCINFO'].id].data.decode()
(metadata, errors) = srcinfo.parse.parse_srcinfo(metadata_raw)
if errors:
sys.stderr.write("error: The following errors occurred "
"when parsing .SRCINFO in commit\n")
sys.stderr.write("error: {:s}:\n".format(str(commit.id)))
for error in errors:
for err in error['error']:
sys.stderr.write("error: line {:d}: {:s}\n".format(
error['line'], err))
exit(1)
try:
metadata_pkgbase = metadata['pkgbase']
except KeyError:
die_commit('invalid .SRCINFO, does not contain a pkgbase (is the file empty?)',
str(commit.id))
if not re.match(repo_regex, metadata_pkgbase):
die_commit('invalid pkgbase: {:s}'.format(metadata_pkgbase),
str(commit.id))
if not metadata['packages']:
die_commit('missing pkgname entry', str(commit.id))
for pkgname in set(metadata['packages'].keys()):
pkginfo = srcinfo.utils.get_merged_package(pkgname, metadata)
for field in ('pkgver', 'pkgrel', 'pkgname'):
if field not in pkginfo:
die_commit('missing mandatory field: {:s}'.format(field),
str(commit.id))
if 'epoch' in pkginfo and not pkginfo['epoch'].isdigit():
die_commit('invalid epoch: {:s}'.format(pkginfo['epoch']),
str(commit.id))
if not re.match(r'[a-z0-9][a-z0-9\.+_-]*$', pkginfo['pkgname']):
die_commit('invalid package name: {:s}'.format(
pkginfo['pkgname']), str(commit.id))
max_len = {'pkgname': 255, 'pkgdesc': 255, 'url': 8000}
for field in max_len.keys():
if field in pkginfo and len(pkginfo[field]) > max_len[field]:
die_commit('{:s} field too long: {:s}'.format(field,
pkginfo[field]), str(commit.id))
for field in ('install', 'changelog'):
if field in pkginfo and not pkginfo[field] in commit.tree:
die_commit('missing {:s} file: {:s}'.format(field,
pkginfo[field]), str(commit.id))
for field in extract_arch_fields(pkginfo, 'source'):
fname = field['value']
if len(fname) > 8000:
die_commit('source entry too long: {:s}'.format(fname),
str(commit.id))
if "://" in fname or "lp:" in fname:
continue
if fname not in commit.tree:
die_commit('missing source file: {:s}'.format(fname),
str(commit.id))
# If we got a subdir keys/,
# make sure it only contains a pgp/ subdir with key files
if "keys" in commit.tree:
# Check for forbidden files/dirs in keys/
for keyobj in commit.tree["keys"]:
if not isinstance(keyobj, pygit2.Tree) or keyobj.name != "pgp":
die_commit(
"the keys/ subdir may only contain a pgp/ directory",
str(commit.id),
)
# Check for forbidden files in keys/pgp/
if "keys/pgp" in commit.tree:
for pgpobj in commit.tree["keys/pgp"]:
if not isinstance(pgpobj, pygit2.Blob) or not pgpobj.name.endswith(
".asc"
):
die_commit(
"the subdir may only contain .asc (PGP pub key) files",
str(commit.id),
)
# Check file size for pgp key files
validate_blob_size(pgpobj, commit)
# Display a warning if .SRCINFO is unchanged.
if sha1_old not in ("0000000000000000000000000000000000000000", sha1_new):
srcinfo_id_old = repo[sha1_old].tree['.SRCINFO'].id
srcinfo_id_new = repo[sha1_new].tree['.SRCINFO'].id
srcinfo_id_old = repo[sha1_old].tree[".SRCINFO"].id
srcinfo_id_new = repo[sha1_new].tree[".SRCINFO"].id
if srcinfo_id_old == srcinfo_id_new:
warn(".SRCINFO unchanged. "
"The package database will not be updated!")
# Read .SRCINFO from the HEAD commit.
metadata_raw = repo[repo[sha1_new].tree['.SRCINFO'].id].data.decode()
(metadata, errors) = srcinfo.parse.parse_srcinfo(metadata_raw)
warn(".SRCINFO unchanged. " "The package database will not be updated!")
# Ensure that the package base name matches the repository name.
metadata_pkgbase = metadata['pkgbase']
metadata_pkgbase = metadata["pkgbase"]
if metadata_pkgbase != pkgbase:
die('invalid pkgbase: {:s}, expected {:s}'.format(metadata_pkgbase,
pkgbase))
die("invalid pkgbase: {:s}, expected {:s}".format(metadata_pkgbase, pkgbase))
# Ensure that packages are neither blacklisted nor overwritten.
pkgbase = metadata['pkgbase']
pkgbase = metadata["pkgbase"]
cur = conn.execute("SELECT ID FROM PackageBases WHERE Name = ?", [pkgbase])
row = cur.fetchone()
pkgbase_id = row[0] if row else 0
cur = conn.execute("SELECT Name FROM PackageBlacklist")
blacklist = [row[0] for row in cur.fetchall()]
if pkgbase in blacklist:
warn_or_die("pkgbase is blacklisted: {:s}".format(pkgbase))
cur = conn.execute("SELECT Name, Repo FROM OfficialProviders")
providers = dict(cur.fetchall())
for pkgname in srcinfo.utils.get_package_names(metadata):
pkginfo = srcinfo.utils.get_merged_package(pkgname, metadata)
pkgname = pkginfo['pkgname']
pkgname = pkginfo["pkgname"]
if pkgname in blacklist:
warn_or_die('package is blacklisted: {:s}'.format(pkgname))
warn_or_die("package is blacklisted: {:s}".format(pkgname))
if pkgname in providers:
warn_or_die('package already provided by [{:s}]: {:s}'.format(
providers[pkgname], pkgname))
warn_or_die(
"package already provided by [{:s}]: {:s}".format(
providers[pkgname], pkgname
)
)
cur = conn.execute("SELECT COUNT(*) FROM Packages WHERE Name = ? " +
"AND PackageBaseID <> ?", [pkgname, pkgbase_id])
cur = conn.execute(
"SELECT COUNT(*) FROM Packages WHERE Name = ? " + "AND PackageBaseID <> ?",
[pkgname, pkgbase_id],
)
if cur.fetchone()[0] > 0:
die('cannot overwrite package: {:s}'.format(pkgname))
die("cannot overwrite package: {:s}".format(pkgname))
# Create a new package base if it does not exist yet.
if pkgbase_id == 0:
@@ -407,7 +475,7 @@ def main(): # noqa: C901
# Create (or update) a branch with the name of the package base for better
# accessibility.
branchref = 'refs/heads/' + pkgbase
branchref = "refs/heads/" + pkgbase
repo.create_reference(branchref, sha1_new, True)
# Work around a Git bug: The HEAD ref is not updated when using
@@ -415,7 +483,7 @@ def main(): # noqa: C901
# mainline. See
# http://git.661346.n2.nabble.com/PATCH-receive-pack-Create-a-HEAD-ref-for-ref-namespace-td7632149.html
# for details.
headref = 'refs/namespaces/' + pkgbase + '/HEAD'
headref = "refs/namespaces/" + pkgbase + "/HEAD"
repo.create_reference(headref, sha1_new, True)
# Send package update notifications.
@@ -426,5 +494,5 @@ def main(): # noqa: C901
conn.close()
if __name__ == '__main__':
if __name__ == "__main__":
main()


@@ -3,34 +3,46 @@ import argparse
import alembic.command
import alembic.config
import aurweb.aur_logging
import aurweb.db
import aurweb.logging
import aurweb.schema
def feed_initial_data(conn):
conn.execute(aurweb.schema.AccountTypes.insert(), [
{'ID': 1, 'AccountType': 'User'},
{'ID': 2, 'AccountType': 'Trusted User'},
{'ID': 3, 'AccountType': 'Developer'},
{'ID': 4, 'AccountType': 'Trusted User & Developer'},
])
conn.execute(aurweb.schema.DependencyTypes.insert(), [
{'ID': 1, 'Name': 'depends'},
{'ID': 2, 'Name': 'makedepends'},
{'ID': 3, 'Name': 'checkdepends'},
{'ID': 4, 'Name': 'optdepends'},
])
conn.execute(aurweb.schema.RelationTypes.insert(), [
{'ID': 1, 'Name': 'conflicts'},
{'ID': 2, 'Name': 'provides'},
{'ID': 3, 'Name': 'replaces'},
])
conn.execute(aurweb.schema.RequestTypes.insert(), [
{'ID': 1, 'Name': 'deletion'},
{'ID': 2, 'Name': 'orphan'},
{'ID': 3, 'Name': 'merge'},
])
conn.execute(
aurweb.schema.AccountTypes.insert(),
[
{"ID": 1, "AccountType": "User"},
{"ID": 2, "AccountType": "Package Maintainer"},
{"ID": 3, "AccountType": "Developer"},
{"ID": 4, "AccountType": "Package Maintainer & Developer"},
],
)
conn.execute(
aurweb.schema.DependencyTypes.insert(),
[
{"ID": 1, "Name": "depends"},
{"ID": 2, "Name": "makedepends"},
{"ID": 3, "Name": "checkdepends"},
{"ID": 4, "Name": "optdepends"},
],
)
conn.execute(
aurweb.schema.RelationTypes.insert(),
[
{"ID": 1, "Name": "conflicts"},
{"ID": 2, "Name": "provides"},
{"ID": 3, "Name": "replaces"},
],
)
conn.execute(
aurweb.schema.RequestTypes.insert(),
[
{"ID": 1, "Name": "deletion"},
{"ID": 2, "Name": "orphan"},
{"ID": 3, "Name": "merge"},
],
)
def run(args):
@@ -40,8 +52,8 @@ def run(args):
# the last step and leave the database in an inconsistent state. The
# configuration is loaded lazily, so we query it to force its loading.
if args.use_alembic:
alembic_config = alembic.config.Config('alembic.ini')
alembic_config.get_main_option('script_location')
alembic_config = alembic.config.Config("alembic.ini")
alembic_config.get_main_option("script_location")
alembic_config.attributes["configure_logger"] = False
engine = aurweb.db.get_engine(echo=(args.verbose >= 1))
@@ -51,17 +63,21 @@ def run(args):
conn.close()
if args.use_alembic:
alembic.command.stamp(alembic_config, 'head')
alembic.command.stamp(alembic_config, "head")
if __name__ == '__main__':
if __name__ == "__main__":
parser = argparse.ArgumentParser(
prog='python -m aurweb.initdb',
description='Initialize the aurweb database.')
parser.add_argument('-v', '--verbose', action='count', default=0,
help='increase verbosity')
parser.add_argument('--no-alembic',
help='disable Alembic migrations support',
dest='use_alembic', action='store_false')
prog="python -m aurweb.initdb", description="Initialize the aurweb database."
)
parser.add_argument(
"-v", "--verbose", action="count", default=0, help="increase verbosity"
)
parser.add_argument(
"--no-alembic",
help="disable Alembic migrations support",
dest="use_alembic",
action="store_false",
)
args = parser.parse_args()
run(args)


@@ -1,43 +1,44 @@
import gettext
from collections import OrderedDict
from fastapi import Request
import aurweb.config
SUPPORTED_LANGUAGES = OrderedDict({
"ar": "العربية",
"ast": "Asturianu",
"ca": "Català",
"cs": "Český",
"da": "Dansk",
"de": "Deutsch",
"el": "Ελληνικά",
"en": "English",
"es": "Español",
"es_419": "Español (Latinoamérica)",
"fi": "Suomi",
"fr": "Français",
"he": "עברית",
"hr": "Hrvatski",
"hu": "Magyar",
"it": "Italiano",
"ja": "日本語",
"nb": "Norsk",
"nl": "Nederlands",
"pl": "Polski",
"pt_BR": "Português (Brasil)",
"pt_PT": "Português (Portugal)",
"ro": "Română",
"ru": "Русский",
"sk": "Slovenčina",
"sr": "Srpski",
"tr": "Türkçe",
"uk": "Українська",
"zh_CN": "简体中文",
"zh_TW": "正體中文"
})
SUPPORTED_LANGUAGES = OrderedDict(
{
"ar": "العربية",
"ast": "Asturianu",
"ca": "Català",
"cs": "Český",
"da": "Dansk",
"de": "Deutsch",
"el": "Ελληνικά",
"en": "English",
"es": "Español",
"es_419": "Español (Latinoamérica)",
"fi": "Suomi",
"fr": "Français",
"he": "עברית",
"hr": "Hrvatski",
"hu": "Magyar",
"it": "Italiano",
"ja": "日本語",
"nb": "Norsk",
"nl": "Nederlands",
"pl": "Polski",
"pt_BR": "Português (Brasil)",
"pt_PT": "Português (Portugal)",
"ro": "Română",
"ru": "Русский",
"sk": "Slovenčina",
"sr": "Srpski",
"tr": "Türkçe",
"uk": "Українська",
"zh_CN": "简体中文",
"zh_TW": "正體中文",
}
)
RIGHT_TO_LEFT_LANGUAGES = ("he", "ar")
@@ -45,15 +46,14 @@ RIGHT_TO_LEFT_LANGUAGES = ("he", "ar")
class Translator:
def __init__(self):
self._localedir = aurweb.config.get('options', 'localedir')
self._localedir = aurweb.config.get("options", "localedir")
self._translator = {}
def get_translator(self, lang: str):
if lang not in self._translator:
self._translator[lang] = gettext.translation("aurweb",
self._localedir,
languages=[lang],
fallback=True)
self._translator[lang] = gettext.translation(
"aurweb", self._localedir, languages=[lang], fallback=True
)
return self._translator.get(lang)
def translate(self, s: str, lang: str):
@@ -64,11 +64,24 @@ class Translator:
translator = Translator()
def get_request_language(request: Request):
if request.user.is_authenticated():
def get_request_language(request: Request) -> str:
"""Get a request's language from either query param, user setting or
cookie. We use the configuration's [options] default_lang otherwise.
@param request FastAPI request
"""
request_lang = request.query_params.get("language")
cookie_lang = request.cookies.get("AURLANG")
if request_lang and request_lang in SUPPORTED_LANGUAGES:
return request_lang
elif (
request.user.is_authenticated()
and request.user.LangPreference in SUPPORTED_LANGUAGES
):
return request.user.LangPreference
default_lang = aurweb.config.get("options", "default_lang")
return request.cookies.get("AURLANG", default_lang)
elif cookie_lang and cookie_lang in SUPPORTED_LANGUAGES:
return cookie_lang
return aurweb.config.get_with_fallback("options", "default_lang", "en")
def get_raw_translator_for_request(request: Request):
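The rewritten get_request_language above resolves the language with a fixed precedence: query parameter, then an authenticated user's LangPreference, then the AURLANG cookie, then the configured default. A minimal sketch of that precedence logic, with the FastAPI request plumbing stripped out (resolve_language, SUPPORTED, and the argument names are hypothetical stand-ins, not aurweb API):

```python
# Hypothetical sketch of get_request_language's precedence:
# query param > user preference > cookie > default.
SUPPORTED = {"en", "de", "fr"}

def resolve_language(query_lang, user_lang, cookie_lang, default="en"):
    # First supported candidate wins, in priority order.
    for candidate in (query_lang, user_lang, cookie_lang):
        if candidate and candidate in SUPPORTED:
            return candidate
    return default
```

Note that an unsupported value (e.g. a stale cookie) is skipped rather than rejected, so resolution falls through to the next source.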

View file

@@ -1,4 +1,5 @@
""" Collection of all aurweb SQLAlchemy declarative models. """
from .accepted_term import AcceptedTerm # noqa: F401
from .account_type import AccountType # noqa: F401
from .api_rate_limit import ApiRateLimit # noqa: F401
@@ -26,6 +27,6 @@ from .request_type import RequestType # noqa: F401
from .session import Session # noqa: F401
from .ssh_pub_key import SSHPubKey # noqa: F401
from .term import Term # noqa: F401
from .tu_vote import TUVote # noqa: F401
from .tu_voteinfo import TUVoteInfo # noqa: F401
from .user import User # noqa: F401
from .vote import Vote # noqa: F401
from .voteinfo import VoteInfo # noqa: F401

View file

@@ -13,12 +13,16 @@ class AcceptedTerm(Base):
__mapper_args__ = {"primary_key": [__table__.c.TermsID]}
User = relationship(
_User, backref=backref("accepted_terms", lazy="dynamic"),
foreign_keys=[__table__.c.UsersID])
_User,
backref=backref("accepted_terms", lazy="dynamic"),
foreign_keys=[__table__.c.UsersID],
)
Term = relationship(
_Term, backref=backref("accepted_terms", lazy="dynamic"),
foreign_keys=[__table__.c.TermsID])
_Term,
backref=backref("accepted_terms", lazy="dynamic"),
foreign_keys=[__table__.c.TermsID],
)
def __init__(self, **kwargs):
super().__init__(**kwargs)
@@ -27,10 +31,12 @@ class AcceptedTerm(Base):
raise IntegrityError(
statement="Foreign key UsersID cannot be null.",
orig="AcceptedTerms.UserID",
params=("NULL"))
params=("NULL"),
)
if not self.Term and not self.TermsID:
raise IntegrityError(
statement="Foreign key TermID cannot be null.",
orig="AcceptedTerms.TermID",
params=("NULL"))
params=("NULL"),
)

View file

@@ -2,21 +2,21 @@ from aurweb import schema
from aurweb.models.declarative import Base
USER = "User"
TRUSTED_USER = "Trusted User"
PACKAGE_MAINTAINER = "Package Maintainer"
DEVELOPER = "Developer"
TRUSTED_USER_AND_DEV = "Trusted User & Developer"
PACKAGE_MAINTAINER_AND_DEV = "Package Maintainer & Developer"
USER_ID = 1
TRUSTED_USER_ID = 2
PACKAGE_MAINTAINER_ID = 2
DEVELOPER_ID = 3
TRUSTED_USER_AND_DEV_ID = 4
PACKAGE_MAINTAINER_AND_DEV_ID = 4
# Map string constants to integer constants.
ACCOUNT_TYPE_ID = {
USER: USER_ID,
TRUSTED_USER: TRUSTED_USER_ID,
PACKAGE_MAINTAINER: PACKAGE_MAINTAINER_ID,
DEVELOPER: DEVELOPER_ID,
TRUSTED_USER_AND_DEV: TRUSTED_USER_AND_DEV_ID
PACKAGE_MAINTAINER_AND_DEV: PACKAGE_MAINTAINER_AND_DEV_ID,
}
# Reversed ACCOUNT_TYPE_ID mapping.
@@ -24,7 +24,8 @@ ACCOUNT_TYPE_NAME = {v: k for k, v in ACCOUNT_TYPE_ID.items()}
class AccountType(Base):
""" An ORM model of a single AccountTypes record. """
"""An ORM model of a single AccountTypes record."""
__table__ = schema.AccountTypes
__tablename__ = __table__.name
__mapper_args__ = {"primary_key": [__table__.c.ID]}
@@ -36,5 +37,4 @@ class AccountType(Base):
return str(self.AccountType)
def __repr__(self):
return "<AccountType(ID='%s', AccountType='%s')>" % (
self.ID, str(self))
return "<AccountType(ID='%s', AccountType='%s')>" % (self.ID, str(self))

View file

@@ -16,10 +16,12 @@ class ApiRateLimit(Base):
raise IntegrityError(
statement="Column Requests cannot be null.",
orig="ApiRateLimit.Requests",
params=("NULL"))
params=("NULL"),
)
if self.WindowStart is None:
raise IntegrityError(
statement="Column WindowStart cannot be null.",
orig="ApiRateLimit.WindowStart",
params=("NULL"))
params=("NULL"),
)

View file

@@ -2,6 +2,7 @@ from fastapi import Request
from aurweb import db, schema
from aurweb.models.declarative import Base
from aurweb.util import get_client_ip
class Ban(Base):
@@ -14,6 +15,6 @@ class Ban(Base):
def is_banned(request: Request):
ip = request.client.host
ip = get_client_ip(request)
exists = db.query(Ban).filter(Ban.IPAddress == ip).exists()
return db.query(exists).scalar()

View file

@@ -6,26 +6,19 @@ from aurweb import util
def to_dict(model):
return {
c.name: getattr(model, c.name)
for c in model.__table__.columns
}
return {c.name: getattr(model, c.name) for c in model.__table__.columns}
def to_json(model, indent: int = None):
return json.dumps({
k: util.jsonify(v)
for k, v in to_dict(model).items()
}, indent=indent)
return json.dumps(
{k: util.jsonify(v) for k, v in to_dict(model).items()}, indent=indent
)
Base = declarative_base()
# Setup __table_args__ applicable to every table.
Base.__table_args__ = {
"autoload": False,
"extend_existing": True
}
Base.__table_args__ = {"autoload": False, "extend_existing": True}
# Setup Base.as_dict and Base.json.
#

View file

@@ -15,4 +15,5 @@ class Group(Base):
raise IntegrityError(
statement="Column Name cannot be null.",
orig="Groups.Name",
params=("NULL"))
params=("NULL"),
)

View file

@@ -16,4 +16,5 @@ class License(Base):
raise IntegrityError(
statement="Column Name cannot be null.",
orig="Licenses.Name",
params=("NULL"))
params=("NULL"),
)

View file

@@ -21,16 +21,19 @@ class OfficialProvider(Base):
raise IntegrityError(
statement="Column Name cannot be null.",
orig="OfficialProviders.Name",
params=("NULL"))
params=("NULL"),
)
if not self.Repo:
raise IntegrityError(
statement="Column Repo cannot be null.",
orig="OfficialProviders.Repo",
params=("NULL"))
params=("NULL"),
)
if not self.Provides:
raise IntegrityError(
statement="Column Provides cannot be null.",
orig="OfficialProviders.Provides",
params=("NULL"))
params=("NULL"),
)

View file

@@ -12,9 +12,10 @@ class Package(Base):
__mapper_args__ = {"primary_key": [__table__.c.ID]}
PackageBase = relationship(
_PackageBase, backref=backref("packages", lazy="dynamic",
cascade="all, delete"),
foreign_keys=[__table__.c.PackageBaseID])
_PackageBase,
backref=backref("packages", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.PackageBaseID],
)
# No Package instances are official packages.
is_official = False
@@ -26,10 +27,12 @@ class Package(Base):
raise IntegrityError(
statement="Foreign key PackageBaseID cannot be null.",
orig="Packages.PackageBaseID",
params=("NULL"))
params=("NULL"),
)
if self.Name is None:
raise IntegrityError(
statement="Column Name cannot be null.",
orig="Packages.Name",
params=("NULL"))
params=("NULL"),
)

View file

@@ -12,20 +12,28 @@ class PackageBase(Base):
__mapper_args__ = {"primary_key": [__table__.c.ID]}
Flagger = relationship(
_User, backref=backref("flagged_bases", lazy="dynamic"),
foreign_keys=[__table__.c.FlaggerUID])
_User,
backref=backref("flagged_bases", lazy="dynamic"),
foreign_keys=[__table__.c.FlaggerUID],
)
Submitter = relationship(
_User, backref=backref("submitted_bases", lazy="dynamic"),
foreign_keys=[__table__.c.SubmitterUID])
_User,
backref=backref("submitted_bases", lazy="dynamic"),
foreign_keys=[__table__.c.SubmitterUID],
)
Maintainer = relationship(
_User, backref=backref("maintained_bases", lazy="dynamic"),
foreign_keys=[__table__.c.MaintainerUID])
_User,
backref=backref("maintained_bases", lazy="dynamic"),
foreign_keys=[__table__.c.MaintainerUID],
)
Packager = relationship(
_User, backref=backref("package_bases", lazy="dynamic"),
foreign_keys=[__table__.c.PackagerUID])
_User,
backref=backref("package_bases", lazy="dynamic"),
foreign_keys=[__table__.c.PackagerUID],
)
# A set used to check for floatable values.
TO_FLOAT = {"Popularity"}
@@ -37,7 +45,8 @@ class PackageBase(Base):
raise IntegrityError(
statement="Column Name cannot be null.",
orig="PackageBases.Name",
params=("NULL"))
params=("NULL"),
)
# If no SubmittedTS/ModifiedTS is provided on creation, set them
# here to the current utc timestamp.
@@ -55,3 +64,13 @@ class PackageBase(Base):
if key in PackageBase.TO_FLOAT and not isinstance(attr, float):
return float(attr)
return attr
def popularity_decay(pkgbase: PackageBase, utcnow: int):
"""Return the delta between now and the last time popularity was updated, in days"""
return int((utcnow - pkgbase.PopularityUpdated.timestamp()) / 86400)
def popularity(pkgbase: PackageBase, utcnow: int):
"""Return up-to-date popularity"""
return float(pkgbase.Popularity) * (0.98 ** popularity_decay(pkgbase, utcnow))
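The two helpers added above decay a stored popularity score by 2% per day since the value was last updated. A standalone sketch of the same arithmetic, with the ORM object replaced by plain timestamps (the function names here are illustrative stand-ins, not aurweb's API):

```python
def popularity_decay(popularity_updated_ts: int, utcnow: int) -> int:
    """Whole days elapsed since the popularity value was last updated."""
    return int((utcnow - popularity_updated_ts) / 86400)

def effective_popularity(stored: float, popularity_updated_ts: int, utcnow: int) -> float:
    """Stored popularity decayed by 2% per elapsed day (0.98 ** days)."""
    return stored * (0.98 ** popularity_decay(popularity_updated_ts, utcnow))
```

Because the decay is computed in whole days, the displayed value only steps down once per 86400-second interval rather than continuously.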

View file

@@ -16,4 +16,5 @@ class PackageBlacklist(Base):
raise IntegrityError(
statement="Column Name cannot be null.",
orig="PackageBlacklist.Name",
params=("NULL"))
params=("NULL"),
)

View file

@@ -10,19 +10,19 @@ from aurweb.models.user import User as _User
class PackageComaintainer(Base):
__table__ = schema.PackageComaintainers
__tablename__ = __table__.name
__mapper_args__ = {
"primary_key": [__table__.c.UsersID, __table__.c.PackageBaseID]
}
__mapper_args__ = {"primary_key": [__table__.c.UsersID, __table__.c.PackageBaseID]}
User = relationship(
_User, backref=backref("comaintained", lazy="dynamic",
cascade="all, delete"),
foreign_keys=[__table__.c.UsersID])
_User,
backref=backref("comaintained", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.UsersID],
)
PackageBase = relationship(
_PackageBase, backref=backref("comaintainers", lazy="dynamic",
cascade="all, delete"),
foreign_keys=[__table__.c.PackageBaseID])
_PackageBase,
backref=backref("comaintainers", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.PackageBaseID],
)
def __init__(self, **kwargs):
super().__init__(**kwargs)
@@ -31,16 +31,19 @@ class PackageComaintainer(Base):
raise IntegrityError(
statement="Foreign key UsersID cannot be null.",
orig="PackageComaintainers.UsersID",
params=("NULL"))
params=("NULL"),
)
if not self.PackageBase and not self.PackageBaseID:
raise IntegrityError(
statement="Foreign key PackageBaseID cannot be null.",
orig="PackageComaintainers.PackageBaseID",
params=("NULL"))
params=("NULL"),
)
if not self.Priority:
raise IntegrityError(
statement="Column Priority cannot be null.",
orig="PackageComaintainers.Priority",
params=("NULL"))
params=("NULL"),
)

View file

@@ -13,21 +13,28 @@ class PackageComment(Base):
__mapper_args__ = {"primary_key": [__table__.c.ID]}
PackageBase = relationship(
_PackageBase, backref=backref("comments", lazy="dynamic",
cascade="all, delete"),
foreign_keys=[__table__.c.PackageBaseID])
_PackageBase,
backref=backref("comments", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.PackageBaseID],
)
User = relationship(
_User, backref=backref("package_comments", lazy="dynamic"),
foreign_keys=[__table__.c.UsersID])
_User,
backref=backref("package_comments", lazy="dynamic"),
foreign_keys=[__table__.c.UsersID],
)
Editor = relationship(
_User, backref=backref("edited_comments", lazy="dynamic"),
foreign_keys=[__table__.c.EditedUsersID])
_User,
backref=backref("edited_comments", lazy="dynamic"),
foreign_keys=[__table__.c.EditedUsersID],
)
Deleter = relationship(
_User, backref=backref("deleted_comments", lazy="dynamic"),
foreign_keys=[__table__.c.DelUsersID])
_User,
backref=backref("deleted_comments", lazy="dynamic"),
foreign_keys=[__table__.c.DelUsersID],
)
def __init__(self, **kwargs):
super().__init__(**kwargs)
@@ -36,27 +43,31 @@ class PackageComment(Base):
raise IntegrityError(
statement="Foreign key PackageBaseID cannot be null.",
orig="PackageComments.PackageBaseID",
params=("NULL"))
params=("NULL"),
)
if not self.User and not self.UsersID:
raise IntegrityError(
statement="Foreign key UsersID cannot be null.",
orig="PackageComments.UsersID",
params=("NULL"))
params=("NULL"),
)
if self.Comments is None:
raise IntegrityError(
statement="Column Comments cannot be null.",
orig="PackageComments.Comments",
params=("NULL"))
params=("NULL"),
)
if self.RenderedComment is None:
self.RenderedComment = str()
def maintainers(self):
return list(filter(
lambda e: e is not None,
[self.PackageBase.Maintainer] + [
c.User for c in self.PackageBase.comaintainers
]
))
return list(
filter(
lambda e: e is not None,
[self.PackageBase.Maintainer]
+ [c.User for c in self.PackageBase.comaintainers],
)
)

View file

@@ -1,5 +1,3 @@
from typing import List
from sqlalchemy import and_, literal
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship
@@ -24,14 +22,16 @@ class PackageDependency(Base):
}
Package = relationship(
_Package, backref=backref("package_dependencies", lazy="dynamic",
cascade="all, delete"),
foreign_keys=[__table__.c.PackageID])
_Package,
backref=backref("package_dependencies", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.PackageID],
)
DependencyType = relationship(
_DependencyType,
backref=backref("package_dependencies", lazy="dynamic"),
foreign_keys=[__table__.c.DepTypeID])
foreign_keys=[__table__.c.DepTypeID],
)
def __init__(self, **kwargs):
super().__init__(**kwargs)
@@ -40,43 +40,61 @@ class PackageDependency(Base):
raise IntegrityError(
statement="Foreign key PackageID cannot be null.",
orig="PackageDependencies.PackageID",
params=("NULL"))
params=("NULL"),
)
if not self.DependencyType and not self.DepTypeID:
raise IntegrityError(
statement="Foreign key DepTypeID cannot be null.",
orig="PackageDependencies.DepTypeID",
params=("NULL"))
params=("NULL"),
)
if self.DepName is None:
raise IntegrityError(
statement="Column DepName cannot be null.",
orig="PackageDependencies.DepName",
params=("NULL"))
params=("NULL"),
)
def is_aur_package(self) -> bool:
pkg = db.query(_Package).filter(_Package.Name == self.DepName).exists()
return db.query(pkg).scalar()
def is_package(self) -> bool:
pkg = db.query(_Package).filter(_Package.Name == self.DepName).exists()
official = db.query(_OfficialProvider).filter(
_OfficialProvider.Name == self.DepName).exists()
return db.query(pkg).scalar() or db.query(official).scalar()
official = (
db.query(_OfficialProvider)
.filter(_OfficialProvider.Name == self.DepName)
.exists()
)
return self.is_aur_package() or db.query(official).scalar()
def provides(self) -> List[PackageRelation]:
def provides(self) -> list[PackageRelation]:
from aurweb.models.relation_type import PROVIDES_ID
rels = db.query(PackageRelation).join(_Package).filter(
and_(PackageRelation.RelTypeID == PROVIDES_ID,
PackageRelation.RelName == self.DepName)
).with_entities(
_Package.Name,
literal(False).label("is_official")
).order_by(_Package.Name.asc())
rels = (
db.query(PackageRelation)
.join(_Package)
.filter(
and_(
PackageRelation.RelTypeID == PROVIDES_ID,
PackageRelation.RelName == self.DepName,
)
)
.with_entities(_Package.Name, literal(False).label("is_official"))
.order_by(_Package.Name.asc())
)
official_rels = db.query(_OfficialProvider).filter(
and_(_OfficialProvider.Provides == self.DepName,
_OfficialProvider.Name != self.DepName)
).with_entities(
_OfficialProvider.Name,
literal(True).label("is_official")
).order_by(_OfficialProvider.Name.asc())
official_rels = (
db.query(_OfficialProvider)
.filter(
and_(
_OfficialProvider.Provides == self.DepName,
_OfficialProvider.Name != self.DepName,
)
)
.with_entities(_OfficialProvider.Name, literal(True).label("is_official"))
.order_by(_OfficialProvider.Name.asc())
)
return rels.union(official_rels).all()

View file

@@ -10,19 +10,19 @@ from aurweb.models.package import Package as _Package
class PackageGroup(Base):
__table__ = schema.PackageGroups
__tablename__ = __table__.name
__mapper_args__ = {
"primary_key": [__table__.c.PackageID, __table__.c.GroupID]
}
__mapper_args__ = {"primary_key": [__table__.c.PackageID, __table__.c.GroupID]}
Package = relationship(
_Package, backref=backref("package_groups", lazy="dynamic",
cascade="all, delete"),
foreign_keys=[__table__.c.PackageID])
_Package,
backref=backref("package_groups", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.PackageID],
)
Group = relationship(
_Group, backref=backref("package_groups", lazy="dynamic",
cascade="all, delete"),
foreign_keys=[__table__.c.GroupID])
_Group,
backref=backref("package_groups", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.GroupID],
)
def __init__(self, **kwargs):
super().__init__(**kwargs)
@@ -31,10 +31,12 @@ class PackageGroup(Base):
raise IntegrityError(
statement="Primary key PackageID cannot be null.",
orig="PackageGroups.PackageID",
params=("NULL"))
params=("NULL"),
)
if not self.Group and not self.GroupID:
raise IntegrityError(
statement="Primary key GroupID cannot be null.",
orig="PackageGroups.GroupID",
params=("NULL"))
params=("NULL"),
)

View file

@@ -9,14 +9,13 @@ from aurweb.models.package_base import PackageBase as _PackageBase
class PackageKeyword(Base):
__table__ = schema.PackageKeywords
__tablename__ = __table__.name
__mapper_args__ = {
"primary_key": [__table__.c.PackageBaseID, __table__.c.Keyword]
}
__mapper_args__ = {"primary_key": [__table__.c.PackageBaseID, __table__.c.Keyword]}
PackageBase = relationship(
_PackageBase, backref=backref("keywords", lazy="dynamic",
cascade="all, delete"),
foreign_keys=[__table__.c.PackageBaseID])
_PackageBase,
backref=backref("keywords", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.PackageBaseID],
)
def __init__(self, **kwargs):
super().__init__(**kwargs)
@@ -25,4 +24,5 @@ class PackageKeyword(Base):
raise IntegrityError(
statement="Primary key PackageBaseID cannot be null.",
orig="PackageKeywords.PackageBaseID",
params=("NULL"))
params=("NULL"),
)

View file

@@ -10,19 +10,19 @@ from aurweb.models.package import Package as _Package
class PackageLicense(Base):
__table__ = schema.PackageLicenses
__tablename__ = __table__.name
__mapper_args__ = {
"primary_key": [__table__.c.PackageID, __table__.c.LicenseID]
}
__mapper_args__ = {"primary_key": [__table__.c.PackageID, __table__.c.LicenseID]}
Package = relationship(
_Package, backref=backref("package_licenses", lazy="dynamic",
cascade="all, delete"),
foreign_keys=[__table__.c.PackageID])
_Package,
backref=backref("package_licenses", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.PackageID],
)
License = relationship(
_License, backref=backref("package_licenses", lazy="dynamic",
cascade="all, delete"),
foreign_keys=[__table__.c.LicenseID])
_License,
backref=backref("package_licenses", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.LicenseID],
)
def __init__(self, **kwargs):
super().__init__(**kwargs)
@@ -31,10 +31,12 @@ class PackageLicense(Base):
raise IntegrityError(
statement="Primary key PackageID cannot be null.",
orig="PackageLicenses.PackageID",
params=("NULL"))
params=("NULL"),
)
if not self.License and not self.LicenseID:
raise IntegrityError(
statement="Primary key LicenseID cannot be null.",
orig="PackageLicenses.LicenseID",
params=("NULL"))
params=("NULL"),
)

View file

@@ -10,20 +10,19 @@ from aurweb.models.user import User as _User
class PackageNotification(Base):
__table__ = schema.PackageNotifications
__tablename__ = __table__.name
__mapper_args__ = {
"primary_key": [__table__.c.UserID, __table__.c.PackageBaseID]
}
__mapper_args__ = {"primary_key": [__table__.c.UserID, __table__.c.PackageBaseID]}
User = relationship(
_User, backref=backref("notifications", lazy="dynamic",
cascade="all, delete"),
foreign_keys=[__table__.c.UserID])
_User,
backref=backref("notifications", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.UserID],
)
PackageBase = relationship(
_PackageBase,
backref=backref("notifications", lazy="dynamic",
cascade="all, delete"),
foreign_keys=[__table__.c.PackageBaseID])
backref=backref("notifications", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.PackageBaseID],
)
def __init__(self, **kwargs):
super().__init__(**kwargs)
@@ -32,10 +31,12 @@ class PackageNotification(Base):
raise IntegrityError(
statement="Foreign key UserID cannot be null.",
orig="PackageNotifications.UserID",
params=("NULL"))
params=("NULL"),
)
if not self.PackageBase and not self.PackageBaseID:
raise IntegrityError(
statement="Foreign key PackageBaseID cannot be null.",
orig="PackageNotifications.PackageBaseID",
params=("NULL"))
params=("NULL"),
)

View file

@@ -19,13 +19,16 @@ class PackageRelation(Base):
}
Package = relationship(
_Package, backref=backref("package_relations", lazy="dynamic",
cascade="all, delete"),
foreign_keys=[__table__.c.PackageID])
_Package,
backref=backref("package_relations", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.PackageID],
)
RelationType = relationship(
_RelationType, backref=backref("package_relations", lazy="dynamic"),
foreign_keys=[__table__.c.RelTypeID])
_RelationType,
backref=backref("package_relations", lazy="dynamic"),
foreign_keys=[__table__.c.RelTypeID],
)
def __init__(self, **kwargs):
super().__init__(**kwargs)
@@ -34,16 +37,19 @@ class PackageRelation(Base):
raise IntegrityError(
statement="Foreign key PackageID cannot be null.",
orig="PackageRelations.PackageID",
params=("NULL"))
params=("NULL"),
)
if not self.RelationType and not self.RelTypeID:
raise IntegrityError(
statement="Foreign key RelTypeID cannot be null.",
orig="PackageRelations.RelTypeID",
params=("NULL"))
params=("NULL"),
)
if not self.RelName:
raise IntegrityError(
statement="Column RelName cannot be null.",
orig="PackageRelations.RelName",
params=("NULL"))
params=("NULL"),
)

View file

@@ -1,7 +1,10 @@
import base64
import hashlib
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship
from aurweb import schema
from aurweb import config, schema
from aurweb.models.declarative import Base
from aurweb.models.package_base import PackageBase as _PackageBase
from aurweb.models.request_type import RequestType as _RequestType
@@ -25,26 +28,34 @@ class PackageRequest(Base):
__mapper_args__ = {"primary_key": [__table__.c.ID]}
RequestType = relationship(
_RequestType, backref=backref("package_requests", lazy="dynamic"),
foreign_keys=[__table__.c.ReqTypeID])
_RequestType,
backref=backref("package_requests", lazy="dynamic"),
foreign_keys=[__table__.c.ReqTypeID],
)
User = relationship(
_User, backref=backref("package_requests", lazy="dynamic"),
foreign_keys=[__table__.c.UsersID])
_User,
backref=backref("package_requests", lazy="dynamic"),
foreign_keys=[__table__.c.UsersID],
)
PackageBase = relationship(
_PackageBase, backref=backref("requests", lazy="dynamic"),
foreign_keys=[__table__.c.PackageBaseID])
_PackageBase,
backref=backref("requests", lazy="dynamic"),
foreign_keys=[__table__.c.PackageBaseID],
)
Closer = relationship(
_User, backref=backref("closed_requests", lazy="dynamic"),
foreign_keys=[__table__.c.ClosedUID])
_User,
backref=backref("closed_requests", lazy="dynamic"),
foreign_keys=[__table__.c.ClosedUID],
)
STATUS_DISPLAY = {
PENDING_ID: PENDING,
CLOSED_ID: CLOSED,
ACCEPTED_ID: ACCEPTED,
REJECTED_ID: REJECTED
REJECTED_ID: REJECTED,
}
def __init__(self, **kwargs):
@@ -54,38 +65,57 @@ class PackageRequest(Base):
raise IntegrityError(
statement="Foreign key ReqTypeID cannot be null.",
orig="PackageRequests.ReqTypeID",
params=("NULL"))
params=("NULL"),
)
if not self.PackageBase and not self.PackageBaseID:
raise IntegrityError(
statement="Foreign key PackageBaseID cannot be null.",
orig="PackageRequests.PackageBaseID",
params=("NULL"))
params=("NULL"),
)
if not self.PackageBaseName:
raise IntegrityError(
statement="Column PackageBaseName cannot be null.",
orig="PackageRequests.PackageBaseName",
params=("NULL"))
params=("NULL"),
)
if not self.User and not self.UsersID:
raise IntegrityError(
statement="Foreign key UsersID cannot be null.",
orig="PackageRequests.UsersID",
params=("NULL"))
params=("NULL"),
)
if self.Comments is None:
raise IntegrityError(
statement="Column Comments cannot be null.",
orig="PackageRequests.Comments",
params=("NULL"))
params=("NULL"),
)
if self.ClosureComment is None:
raise IntegrityError(
statement="Column ClosureComment cannot be null.",
orig="PackageRequests.ClosureComment",
params=("NULL"))
params=("NULL"),
)
def status_display(self) -> str:
""" Return a display string for the Status column. """
"""Return a display string for the Status column."""
return self.STATUS_DISPLAY[self.Status]
def ml_message_id_hash(self) -> str:
"""Return the X-Message-ID-Hash that is used in the mailing list archive."""
# X-Message-ID-Hash is a base32 encoded SHA1 hash
msgid = f"pkg-request-{str(self.ID)}@aur.archlinux.org"
sha1 = hashlib.sha1(msgid.encode()).digest()
return base64.b32encode(sha1).decode()
def ml_message_url(self) -> str:
"""Return the mailing list URL for the request."""
url = config.get("options", "ml_thread_url") % (self.ml_message_id_hash())
return url
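As the comment in the diff notes, the mailing-list archive's X-Message-ID-Hash is the base32-encoded SHA-1 of the Message-ID. A self-contained sketch of that derivation (the free-function form is illustrative; the Message-ID format is taken from the diff). Since a SHA-1 digest is 20 bytes, i.e. 160 bits, base32 encodes it to exactly 32 characters with no padding:

```python
import base64
import hashlib

def ml_message_id_hash(request_id: int) -> str:
    """Base32-encoded SHA-1 of the request's Message-ID (X-Message-ID-Hash)."""
    msgid = f"pkg-request-{request_id}@aur.archlinux.org"
    sha1 = hashlib.sha1(msgid.encode()).digest()
    return base64.b32encode(sha1).decode()
```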

View file

@@ -9,17 +9,13 @@ from aurweb.models.package import Package as _Package
class PackageSource(Base):
__table__ = schema.PackageSources
__tablename__ = __table__.name
__mapper_args__ = {
"primary_key": [
__table__.c.PackageID,
__table__.c.Source
]
}
__mapper_args__ = {"primary_key": [__table__.c.PackageID, __table__.c.Source]}
Package = relationship(
_Package, backref=backref("package_sources", lazy="dynamic",
cascade="all, delete"),
foreign_keys=[__table__.c.PackageID])
_Package,
backref=backref("package_sources", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.PackageID],
)
def __init__(self, **kwargs):
super().__init__(**kwargs)
@@ -28,7 +24,8 @@ class PackageSource(Base):
raise IntegrityError(
statement="Foreign key PackageID cannot be null.",
orig="PackageSources.PackageID",
params=("NULL"))
params=("NULL"),
)
if not self.Source:
self.Source = "/dev/null"

View file

@@ -10,18 +10,19 @@ from aurweb.models.user import User as _User
class PackageVote(Base):
__table__ = schema.PackageVotes
__tablename__ = __table__.name
__mapper_args__ = {
"primary_key": [__table__.c.UsersID, __table__.c.PackageBaseID]
}
__mapper_args__ = {"primary_key": [__table__.c.UsersID, __table__.c.PackageBaseID]}
User = relationship(
_User, backref=backref("package_votes", lazy="dynamic"),
foreign_keys=[__table__.c.UsersID])
_User,
backref=backref("package_votes", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.UsersID],
)
PackageBase = relationship(
_PackageBase, backref=backref("package_votes", lazy="dynamic",
cascade="all, delete"),
foreign_keys=[__table__.c.PackageBaseID])
_PackageBase,
backref=backref("package_votes", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.PackageBaseID],
)
def __init__(self, **kwargs):
super().__init__(**kwargs)
@@ -30,16 +31,19 @@ class PackageVote(Base):
raise IntegrityError(
statement="Foreign key UsersID cannot be null.",
orig="PackageVotes.UsersID",
params=("NULL"))
params=("NULL"),
)
if not self.PackageBase and not self.PackageBaseID:
raise IntegrityError(
statement="Foreign key PackageBaseID cannot be null.",
orig="PackageVotes.PackageBaseID",
params=("NULL"))
params=("NULL"),
)
if not self.VoteTS:
raise IntegrityError(
statement="Column VoteTS cannot be null.",
orig="PackageVotes.VoteTS",
params=("NULL"))
params=("NULL"),
)

View file

@@ -16,5 +16,5 @@ class RequestType(Base):
__mapper_args__ = {"primary_key": [__table__.c.ID]}
def name_display(self) -> str:
""" Return the Name column with its first char capitalized. """
"""Return the Name column with its first char capitalized."""
return self.Name.title()

View file

@@ -12,8 +12,10 @@ class Session(Base):
__mapper_args__ = {"primary_key": [__table__.c.UsersID]}
User = relationship(
_User, backref=backref("session", uselist=False),
foreign_keys=[__table__.c.UsersID])
_User,
backref=backref("session", cascade="all, delete", uselist=False),
foreign_keys=[__table__.c.UsersID],
)
def __init__(self, **kwargs):
super().__init__(**kwargs)
@@ -29,10 +31,13 @@ class Session(Base):
user_exists = db.query(_User).filter(_User.ID == uid).exists()
if not db.query(user_exists).scalar():
raise IntegrityError(
statement=("Foreign key UsersID cannot be null and "
"must be a valid user's ID."),
statement=(
"Foreign key UsersID cannot be null and "
"must be a valid user's ID."
),
orig="Sessions.UsersID",
params=("NULL"))
params=("NULL"),
)
def generate_unique_sid():

View file

@@ -12,16 +12,17 @@ class SSHPubKey(Base):
__mapper_args__ = {"primary_key": [__table__.c.Fingerprint]}
User = relationship(
"User", backref=backref("ssh_pub_keys", lazy="dynamic"),
foreign_keys=[__table__.c.UserID])
"User",
backref=backref("ssh_pub_keys", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.UserID],
)
def __init__(self, **kwargs):
super().__init__(**kwargs)
def get_fingerprint(pubkey: str) -> str:
proc = Popen(["ssh-keygen", "-l", "-f", "-"], stdin=PIPE, stdout=PIPE,
stderr=PIPE)
proc = Popen(["ssh-keygen", "-l", "-f", "-"], stdin=PIPE, stdout=PIPE, stderr=PIPE)
out, _ = proc.communicate(pubkey.encode())
if proc.returncode:
raise ValueError("The SSH public key is invalid.")

View file

@@ -16,10 +16,12 @@ class Term(Base):
raise IntegrityError(
statement="Column Description cannot be null.",
orig="Terms.Description",
params=("NULL"))
params=("NULL"),
)
if not self.URL:
raise IntegrityError(
statement="Column URL cannot be null.",
orig="Terms.URL",
params=("NULL"))
params=("NULL"),
)

View file

@@ -1,9 +1,7 @@
import hashlib
from typing import List, Set
from typing import Set
import bcrypt
from fastapi import Request
from sqlalchemy import or_
from sqlalchemy.exc import IntegrityError
@@ -12,19 +10,19 @@ from sqlalchemy.orm import backref, relationship
import aurweb.config
import aurweb.models.account_type
import aurweb.schema
from aurweb import db, logging, schema, time, util
from aurweb import aur_logging, db, schema, time, util
from aurweb.models.account_type import AccountType as _AccountType
from aurweb.models.ban import is_banned
from aurweb.models.declarative import Base
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
SALT_ROUNDS_DEFAULT = 12
class User(Base):
""" An ORM model of a single Users record. """
"""An ORM model of a single Users record."""
__table__ = schema.Users
__tablename__ = __table__.name
__mapper_args__ = {"primary_key": [__table__.c.ID]}
@@ -33,7 +31,8 @@ class User(Base):
_AccountType,
backref=backref("users", lazy="dynamic"),
foreign_keys=[__table__.c.AccountTypeID],
uselist=False)
uselist=False,
)
# High-level variables used to track authentication (not in DB).
authenticated = False
@@ -41,50 +40,50 @@ class User(Base):
# Make this static to the class just in case SQLAlchemy ever
# does something to bypass our constructor.
salt_rounds = aurweb.config.getint("options", "salt_rounds",
SALT_ROUNDS_DEFAULT)
salt_rounds = aurweb.config.getint("options", "salt_rounds", SALT_ROUNDS_DEFAULT)
def __init__(self, Passwd: str = str(), **kwargs):
super().__init__(**kwargs, Passwd=str())
# Run this again in the constructor in case we rehashed config.
self.salt_rounds = aurweb.config.getint("options", "salt_rounds",
SALT_ROUNDS_DEFAULT)
self.salt_rounds = aurweb.config.getint(
"options", "salt_rounds", SALT_ROUNDS_DEFAULT
)
if Passwd:
self.update_password(Passwd)
def update_password(self, password):
self.Passwd = bcrypt.hashpw(
password.encode(),
bcrypt.gensalt(rounds=self.salt_rounds)).decode()
password.encode(), bcrypt.gensalt(rounds=self.salt_rounds)
).decode()
@staticmethod
def minimum_passwd_length():
return aurweb.config.getint("options", "passwd_min_len")
def is_authenticated(self):
""" Return internal authenticated state. """
"""Return internal authenticated state."""
return self.authenticated
def valid_password(self, password: str):
""" Check authentication against a given password. """
"""Check authentication against a given password."""
if password is None:
return False
password_is_valid = False
try:
password_is_valid = bcrypt.checkpw(password.encode(),
self.Passwd.encode())
password_is_valid = bcrypt.checkpw(password.encode(), self.Passwd.encode())
except ValueError:
pass
# If our Salt column is not empty, we're using a legacy password.
if not password_is_valid and self.Salt != str():
# Try to login with legacy method.
password_is_valid = hashlib.md5(
f"{self.Salt}{password}".encode()
).hexdigest() == self.Passwd
password_is_valid = (
hashlib.md5(f"{self.Salt}{password}".encode()).hexdigest()
== self.Passwd
)
# We got here, we passed the legacy authentication.
# Update the password to our modern hash style.
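The legacy fallback above compares md5(Salt + password) as a hex digest against the stored Passwd column, and only when the Salt column is non-empty. A minimal sketch of just that check, outside the ORM (the function name is hypothetical):

```python
import hashlib

def legacy_password_valid(salt: str, password: str, stored_hash: str) -> bool:
    """Pre-bcrypt check: md5(salt + password) hex digest vs. the stored value."""
    return hashlib.md5(f"{salt}{password}".encode()).hexdigest() == stored_hash
```

On a successful legacy match, the code in the diff then rehashes the password with bcrypt so the account migrates to the modern scheme.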
@@ -96,9 +95,8 @@ class User(Base):
def _login_approved(self, request: Request):
return not is_banned(request) and not self.Suspended
def login(self, request: Request, password: str,
session_time: int = 0) -> str:
""" Login and authenticate a request. """
def login(self, request: Request, password: str) -> str:
"""Login and authenticate a request."""
from aurweb import db
from aurweb.models.session import Session, generate_unique_sid
@@ -124,12 +122,12 @@ class User(Base):
try:
with db.begin():
self.LastLogin = now_ts
self.LastLoginIPAddress = request.client.host
self.LastLoginIPAddress = util.get_client_ip(request)
if not self.session:
sid = generate_unique_sid()
self.session = db.create(Session, User=self,
SessionID=sid,
LastUpdateTS=now_ts)
self.session = db.create(
Session, User=self, SessionID=sid, LastUpdateTS=now_ts
)
else:
last_updated = self.session.LastUpdateTS
if last_updated and last_updated < now_ts:
@@ -148,36 +146,36 @@ class User(Base):
return self.session.SessionID
def has_credential(self, credential: Set[int],
approved: List["User"] = list()):
def has_credential(self, credential: Set[int], approved: list["User"] = list()):
from aurweb.auth.creds import has_credential
return has_credential(self, credential, approved)
def logout(self, request: Request):
def logout(self, request: Request) -> None:
self.authenticated = False
if self.session:
with db.begin():
db.delete(self.session)
def is_trusted_user(self):
def is_package_maintainer(self):
return self.AccountType.ID in {
aurweb.models.account_type.TRUSTED_USER_ID,
aurweb.models.account_type.TRUSTED_USER_AND_DEV_ID
aurweb.models.account_type.PACKAGE_MAINTAINER_ID,
aurweb.models.account_type.PACKAGE_MAINTAINER_AND_DEV_ID,
}
def is_developer(self):
return self.AccountType.ID in {
aurweb.models.account_type.DEVELOPER_ID,
aurweb.models.account_type.TRUSTED_USER_AND_DEV_ID
aurweb.models.account_type.PACKAGE_MAINTAINER_AND_DEV_ID,
}
def is_elevated(self):
""" A User is 'elevated' when they have either a
Trusted User or Developer AccountType. """
"""A User is 'elevated' when they have either a
Package Maintainer or Developer AccountType."""
return self.AccountType.ID in {
aurweb.models.account_type.TRUSTED_USER_ID,
aurweb.models.account_type.PACKAGE_MAINTAINER_ID,
aurweb.models.account_type.DEVELOPER_ID,
aurweb.models.account_type.TRUSTED_USER_AND_DEV_ID,
aurweb.models.account_type.PACKAGE_MAINTAINER_AND_DEV_ID,
}
def can_edit_user(self, target: "User") -> bool:
@@ -190,24 +188,28 @@ class User(Base):
In short, a user must at least have credentials and be at least
the same account type as the target.
User < Trusted User < Developer < Trusted User & Developer
User < Package Maintainer < Developer < Package Maintainer & Developer
:param target: Target User to be edited
:return: Boolean indicating whether `self` can edit `target`
"""
from aurweb.auth import creds
has_cred = self.has_credential(creds.ACCOUNT_EDIT, approved=[target])
return has_cred and self.AccountTypeID >= target.AccountTypeID
def voted_for(self, package) -> bool:
""" Has this User voted for package? """
"""Has this User voted for package?"""
from aurweb.models.package_vote import PackageVote
return bool(package.PackageBase.package_votes.filter(
PackageVote.UsersID == self.ID
).scalar())
return bool(
package.PackageBase.package_votes.filter(
PackageVote.UsersID == self.ID
).scalar()
)
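The `voted_for` predicate above boils down to an EXISTS probe keyed on the user ID. A stdlib `sqlite3` sketch of the same shape (table and column names mirror the diff, but this is not aurweb's actual schema or ORM):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE PackageVotes (UsersID INTEGER, PackageBaseID INTEGER)")
conn.execute("INSERT INTO PackageVotes VALUES (1, 10)")

def voted_for(user_id: int, pkgbase_id: int) -> bool:
    # Mirrors bool(query.filter(...).scalar()): probe for one matching row.
    row = conn.execute(
        "SELECT EXISTS(SELECT 1 FROM PackageVotes "
        "WHERE UsersID = ? AND PackageBaseID = ?)",
        (user_id, pkgbase_id),
    ).fetchone()
    return bool(row[0])

print(voted_for(1, 10))  # True
print(voted_for(2, 10))  # False
```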
def notified(self, package) -> bool:
""" Is this User being notified about package (or package base)?
"""Is this User being notified about package (or package base)?
:param package: Package or PackageBase instance
:return: Boolean indicating state of package notification
@@ -225,12 +227,14 @@
# Run an exists() query where a pkgbase-related
# PackageNotification exists for self (a user).
return bool(db.query(
query.filter(PackageNotification.UserID == self.ID).exists()
).scalar())
return bool(
db.query(
query.filter(PackageNotification.UserID == self.ID).exists()
).scalar()
)
def packages(self):
""" Returns an ORM query to Package objects owned by this user.
"""Returns an ORM query to Package objects owned by this user.
This should really be replaced with an internal ORM join
configured for the User model. This has not been done yet
@@ -241,16 +245,24 @@
"""
from aurweb.models.package import Package
from aurweb.models.package_base import PackageBase
return db.query(Package).join(PackageBase).filter(
or_(
PackageBase.PackagerUID == self.ID,
PackageBase.MaintainerUID == self.ID
return (
db.query(Package)
.join(PackageBase)
.filter(
or_(
PackageBase.PackagerUID == self.ID,
PackageBase.MaintainerUID == self.ID,
)
)
)
def __repr__(self):
return "<User(ID='%s', AccountType='%s', Username='%s')>" % (
self.ID, str(self.AccountType), self.Username)
self.ID,
str(self.AccountType),
self.Username,
)
def __str__(self) -> str:
return self.Username


@@ -3,24 +3,26 @@ from sqlalchemy.orm import backref, relationship
from aurweb import schema
from aurweb.models.declarative import Base
from aurweb.models.tu_voteinfo import TUVoteInfo as _TUVoteInfo
from aurweb.models.user import User as _User
from aurweb.models.voteinfo import VoteInfo as _VoteInfo
class TUVote(Base):
__table__ = schema.TU_Votes
class Vote(Base):
__table__ = schema.Votes
__tablename__ = __table__.name
__mapper_args__ = {
"primary_key": [__table__.c.VoteID, __table__.c.UserID]
}
__mapper_args__ = {"primary_key": [__table__.c.VoteID, __table__.c.UserID]}
VoteInfo = relationship(
_TUVoteInfo, backref=backref("tu_votes", lazy="dynamic"),
foreign_keys=[__table__.c.VoteID])
_VoteInfo,
backref=backref("votes", lazy="dynamic"),
foreign_keys=[__table__.c.VoteID],
)
User = relationship(
_User, backref=backref("tu_votes", lazy="dynamic"),
foreign_keys=[__table__.c.UserID])
_User,
backref=backref("votes", lazy="dynamic"),
foreign_keys=[__table__.c.UserID],
)
def __init__(self, **kwargs):
super().__init__(**kwargs)
@@ -28,11 +30,13 @@ class TUVote(Base):
if not self.VoteInfo and not self.VoteID:
raise IntegrityError(
statement="Foreign key VoteID cannot be null.",
orig="TU_Votes.VoteID",
params=("NULL"))
orig="Votes.VoteID",
params=("NULL"),
)
if not self.User and not self.UserID:
raise IntegrityError(
statement="Foreign key UserID cannot be null.",
orig="TU_Votes.UserID",
params=("NULL"))
orig="Votes.UserID",
params=("NULL"),
)


@@ -8,14 +8,16 @@ from aurweb.models.declarative import Base
from aurweb.models.user import User as _User
class TUVoteInfo(Base):
__table__ = schema.TU_VoteInfo
class VoteInfo(Base):
__table__ = schema.VoteInfo
__tablename__ = __table__.name
__mapper_args__ = {"primary_key": [__table__.c.ID]}
Submitter = relationship(
_User, backref=backref("tu_voteinfo_set", lazy="dynamic"),
foreign_keys=[__table__.c.SubmitterID])
_User,
backref=backref("voteinfo_set", lazy="dynamic"),
foreign_keys=[__table__.c.SubmitterID],
)
def __init__(self, **kwargs):
# Default Quorum, Yes, No and Abstain columns to 0.
@@ -28,41 +30,46 @@ class TUVoteInfo(Base):
if self.Agenda is None:
raise IntegrityError(
statement="Column Agenda cannot be null.",
orig="TU_VoteInfo.Agenda",
params=("NULL"))
orig="VoteInfo.Agenda",
params=("NULL"),
)
if self.User is None:
raise IntegrityError(
statement="Column User cannot be null.",
orig="TU_VoteInfo.User",
params=("NULL"))
orig="VoteInfo.User",
params=("NULL"),
)
if self.Submitted is None:
raise IntegrityError(
statement="Column Submitted cannot be null.",
orig="TU_VoteInfo.Submitted",
params=("NULL"))
orig="VoteInfo.Submitted",
params=("NULL"),
)
if self.End is None:
raise IntegrityError(
statement="Column End cannot be null.",
orig="TU_VoteInfo.End",
params=("NULL"))
orig="VoteInfo.End",
params=("NULL"),
)
if not self.Submitter:
raise IntegrityError(
statement="Foreign key SubmitterID cannot be null.",
orig="TU_VoteInfo.SubmitterID",
params=("NULL"))
orig="VoteInfo.SubmitterID",
params=("NULL"),
)
def __setattr__(self, key: str, value: typing.Any):
""" Customize setattr to stringify any Quorum keys given. """
"""Customize setattr to stringify any Quorum keys given."""
if key == "Quorum":
value = str(value)
return super().__setattr__(key, value)
def __getattribute__(self, key: str):
""" Customize getattr to floatify any fetched Quorum values. """
"""Customize getattr to floatify any fetched Quorum values."""
attr = super().__getattribute__(key)
if key == "Quorum":
return float(attr)
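The `Quorum` round-trip above (stored as a string, read back as a float) can be reproduced with plain `__setattr__`/`__getattribute__` overrides; a minimal stand-in for illustration, not the ORM-backed model (the diff elides the final `return attr`, which is restored here):

```python
class QuorumBox:
    def __setattr__(self, key, value):
        # Stringify Quorum on the way in, as VoteInfo does.
        if key == "Quorum":
            value = str(value)
        super().__setattr__(key, value)

    def __getattribute__(self, key):
        attr = super().__getattribute__(key)
        # Floatify Quorum on the way out.
        if key == "Quorum":
            return float(attr)
        return attr

box = QuorumBox()
box.Quorum = 0.05
print(type(box.__dict__["Quorum"]))  # stored as a str
print(box.Quorum)                    # read back as a float
```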


@@ -1,4 +1,4 @@
from typing import List, Optional, Set
from typing import Optional, Set
from fastapi import Request
from sqlalchemy import and_, orm
@@ -7,46 +7,55 @@ from aurweb import config, db, l10n, time, util
from aurweb.exceptions import InvariantError
from aurweb.models import PackageBase, PackageRequest, User
from aurweb.models.package_request import ACCEPTED_ID, PENDING_ID, REJECTED_ID
from aurweb.models.request_type import DELETION, DELETION_ID, MERGE, MERGE_ID, ORPHAN, ORPHAN_ID
from aurweb.models.request_type import (
DELETION,
DELETION_ID,
MERGE,
MERGE_ID,
ORPHAN,
ORPHAN_ID,
)
from aurweb.scripts import notify
class ClosureFactory:
""" A factory class used to autogenerate closure comments. """
"""A factory class used to autogenerate closure comments."""
REQTYPE_NAMES = {
DELETION_ID: DELETION,
MERGE_ID: MERGE,
ORPHAN_ID: ORPHAN
}
REQTYPE_NAMES = {DELETION_ID: DELETION, MERGE_ID: MERGE, ORPHAN_ID: ORPHAN}
def _deletion_closure(self, requester: User,
pkgbase: PackageBase,
target: PackageBase = None):
return (f"[Autogenerated] Accepted deletion for {pkgbase.Name}.")
def _deletion_closure(
self, requester: User, pkgbase: PackageBase, target: PackageBase = None
):
return f"[Autogenerated] Accepted deletion for {pkgbase.Name}."
def _merge_closure(self, requester: User,
pkgbase: PackageBase,
target: PackageBase = None):
return (f"[Autogenerated] Accepted merge for {pkgbase.Name} "
f"into {target.Name}.")
def _merge_closure(
self, requester: User, pkgbase: PackageBase, target: PackageBase = None
):
return (
f"[Autogenerated] Accepted merge for {pkgbase.Name} " f"into {target.Name}."
)
def _orphan_closure(self, requester: User,
pkgbase: PackageBase,
target: PackageBase = None):
return (f"[Autogenerated] Accepted orphan for {pkgbase.Name}.")
def _orphan_closure(
self, requester: User, pkgbase: PackageBase, target: PackageBase = None
):
return f"[Autogenerated] Accepted orphan for {pkgbase.Name}."
def _rejected_merge_closure(self, requester: User,
pkgbase: PackageBase,
target: PackageBase = None):
return (f"[Autogenerated] Another request to merge {pkgbase.Name} "
f"into {target.Name} has rendered this request invalid.")
def _rejected_merge_closure(
self, requester: User, pkgbase: PackageBase, target: PackageBase = None
):
return (
f"[Autogenerated] Another request to merge {pkgbase.Name} "
f"into {target.Name} has rendered this request invalid."
)
def get_closure(self, reqtype_id: int,
requester: User,
pkgbase: PackageBase,
target: PackageBase = None,
status: int = ACCEPTED_ID) -> str:
def get_closure(
self,
reqtype_id: int,
requester: User,
pkgbase: PackageBase,
target: PackageBase = None,
status: int = ACCEPTED_ID,
) -> str:
"""
Return a closure comment handled by this class.
@@ -69,8 +78,9 @@ class ClosureFactory:
return handler(requester, pkgbase, target)
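`get_closure` is a plain dispatch table from request type to a comment-building callable. A reduced sketch under assumed constants (the IDs below are illustrative, not aurweb's real values, and the real factory also handles merge targets and rejected statuses):

```python
DELETION_ID, ORPHAN_ID = 1, 3  # illustrative IDs only

def _deletion_closure(name: str) -> str:
    return f"[Autogenerated] Accepted deletion for {name}."

def _orphan_closure(name: str) -> str:
    return f"[Autogenerated] Accepted orphan for {name}."

HANDLERS = {DELETION_ID: _deletion_closure, ORPHAN_ID: _orphan_closure}

def get_closure(reqtype_id: int, pkgbase_name: str) -> str:
    # Look up the handler for this request type and delegate to it.
    handler = HANDLERS.get(reqtype_id)
    if handler is None:
        raise NotImplementedError(f"Unhandled request type {reqtype_id}")
    return handler(pkgbase_name)

print(get_closure(DELETION_ID, "foo"))  # [Autogenerated] Accepted deletion for foo.
```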
def update_closure_comment(pkgbase: PackageBase, reqtype_id: int,
comments: str, target: PackageBase = None) -> None:
def update_closure_comment(
pkgbase: PackageBase, reqtype_id: int, comments: str, target: PackageBase = None
) -> None:
"""
Update all pending requests related to `pkgbase` with a closure comment.
@@ -90,8 +100,10 @@ def update_closure_comment(pkgbase: PackageBase, reqtype_id: int,
return
query = pkgbase.requests.filter(
and_(PackageRequest.ReqTypeID == reqtype_id,
PackageRequest.Status == PENDING_ID))
and_(
PackageRequest.ReqTypeID == reqtype_id, PackageRequest.Status == PENDING_ID
)
)
if reqtype_id == MERGE_ID:
query = query.filter(PackageRequest.MergeBaseName == target.Name)
@@ -100,9 +112,8 @@ def update_closure_comment(pkgbase: PackageBase, reqtype_id: int,
def verify_orphan_request(user: User, pkgbase: PackageBase):
""" Verify that an undue orphan request exists in `requests`. """
requests = pkgbase.requests.filter(
PackageRequest.ReqTypeID == ORPHAN_ID)
"""Verify that an undue orphan request exists in `requests`."""
requests = pkgbase.requests.filter(PackageRequest.ReqTypeID == ORPHAN_ID)
for pkgreq in requests:
idle_time = config.getint("options", "request_idle_time")
time_delta = time.utcnow() - pkgreq.RequestTS
@@ -115,9 +126,13 @@ def verify_orphan_request(user: User, pkgbase: PackageBase):
return False
def close_pkgreq(pkgreq: PackageRequest, closer: User,
pkgbase: PackageBase, target: Optional[PackageBase],
status: int) -> None:
def close_pkgreq(
pkgreq: PackageRequest,
closer: User,
pkgbase: PackageBase,
target: Optional[PackageBase],
status: int,
) -> None:
"""
Close a package request with `pkgreq`.Status == `status`.
@@ -130,16 +145,20 @@ def close_pkgreq(pkgreq: PackageRequest, closer: User,
now = time.utcnow()
pkgreq.Status = status
pkgreq.Closer = closer
pkgreq.ClosureComment = (
pkgreq.ClosureComment or ClosureFactory().get_closure(
pkgreq.ReqTypeID, closer, pkgbase, target, status)
pkgreq.ClosureComment = pkgreq.ClosureComment or ClosureFactory().get_closure(
pkgreq.ReqTypeID, closer, pkgbase, target, status
)
pkgreq.ClosedTS = now
def handle_request(request: Request, reqtype_id: int,
pkgbase: PackageBase,
target: PackageBase = None) -> List[notify.Notification]:
@db.retry_deadlock
def handle_request(
request: Request,
reqtype_id: int,
pkgbase: PackageBase,
target: PackageBase = None,
comments: str = str(),
) -> list[notify.Notification]:
"""
Handle package requests before performing an action.
@@ -158,24 +177,27 @@ def handle_request(request: Request, reqtype_id: int,
:param pkgbase: PackageBase which the request is about
:param target: Optional target to merge into
"""
notifs: List[notify.Notification] = []
notifs: list[notify.Notification] = []
# If it's an orphan request, perform further verification
# regarding existing requests.
if reqtype_id == ORPHAN_ID:
if not verify_orphan_request(request.user, pkgbase):
_ = l10n.get_translator_for_request(request)
raise InvariantError(_(
"No due existing orphan requests to accept for %s."
) % pkgbase.Name)
raise InvariantError(
_("No due existing orphan requests to accept for %s.") % pkgbase.Name
)
# Produce a base query for requests related to `pkgbase`, based
# on ReqTypeID matching `reqtype_id`, pending status and a correct
# PackageBaseName column.
query: orm.Query = pkgbase.requests.filter(
and_(PackageRequest.ReqTypeID == reqtype_id,
PackageRequest.Status == PENDING_ID,
PackageRequest.PackageBaseName == pkgbase.Name))
and_(
PackageRequest.ReqTypeID == reqtype_id,
PackageRequest.Status == PENDING_ID,
PackageRequest.PackageBaseName == pkgbase.Name,
)
)
# Build a query for records we should accept. For merge requests,
# this is specific to a matching MergeBaseName. For others, this
@@ -183,17 +205,16 @@ def handle_request(request: Request, reqtype_id: int,
accept_query: orm.Query = query
if target:
# If a `target` was supplied, filter by MergeBaseName
accept_query = query.filter(
PackageRequest.MergeBaseName == target.Name)
accept_query = query.filter(PackageRequest.MergeBaseName == target.Name)
# Build an accept list out of `accept_query`.
to_accept: List[PackageRequest] = accept_query.all()
to_accept: list[PackageRequest] = accept_query.all()
accepted_ids: Set[int] = set(p.ID for p in to_accept)
# Build a reject list out of `query` filtered by IDs not found
# in `to_accept`. That is, unmatched records of the same base
# query properties.
to_reject: List[PackageRequest] = query.filter(
to_reject: list[PackageRequest] = query.filter(
~PackageRequest.ID.in_(accepted_ids)
).all()
@@ -203,14 +224,16 @@ def handle_request(request: Request, reqtype_id: int,
if not to_accept:
utcnow = time.utcnow()
with db.begin():
pkgreq = db.create(PackageRequest,
ReqTypeID=reqtype_id,
RequestTS=utcnow,
User=request.user,
PackageBase=pkgbase,
PackageBaseName=pkgbase.Name,
Comments="Autogenerated by aurweb.",
ClosureComment=str())
pkgreq = db.create(
PackageRequest,
ReqTypeID=reqtype_id,
RequestTS=utcnow,
User=request.user,
PackageBase=pkgbase,
PackageBaseName=pkgbase.Name,
Comments="Autogenerated by aurweb.",
ClosureComment=comments,
)
# If it's a merge request, set MergeBaseName to `target`.Name.
if pkgreq.ReqTypeID == MERGE_ID:
@@ -221,16 +244,25 @@ def handle_request(request: Request, reqtype_id: int,
to_accept.append(pkgreq)
# Update requests with their new status and closures.
with db.begin():
util.apply_all(to_accept, lambda p: close_pkgreq(
p, request.user, pkgbase, target, ACCEPTED_ID))
util.apply_all(to_reject, lambda p: close_pkgreq(
p, request.user, pkgbase, target, REJECTED_ID))
@db.retry_deadlock
def retry_closures():
with db.begin():
util.apply_all(
to_accept,
lambda p: close_pkgreq(p, request.user, pkgbase, target, ACCEPTED_ID),
)
util.apply_all(
to_reject,
lambda p: close_pkgreq(p, request.user, pkgbase, target, REJECTED_ID),
)
retry_closures()
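`@db.retry_deadlock` wraps the closure loop so a serialization failure re-runs the whole transaction. A generic retry decorator of that shape (the exception type, retry count, and parameterized form are assumptions; aurweb's version inspects actual database deadlock errors):

```python
import functools

class DeadlockError(Exception):
    """Stand-in for a database deadlock/serialization failure."""

def retry_deadlock(max_attempts: int = 3):
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(max_attempts):
                try:
                    return fn(*args, **kwargs)
                except DeadlockError:
                    if attempt == max_attempts - 1:
                        raise  # give up after the final attempt
        return wrapper
    return decorator

calls = {"n": 0}

@retry_deadlock()
def flaky_commit():
    # Fails twice with a simulated deadlock, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise DeadlockError
    return "committed"

print(flaky_commit())  # committed, after two retried deadlocks
```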
# Create RequestCloseNotifications for all requests involved.
for pkgreq in (to_accept + to_reject):
for pkgreq in to_accept + to_reject:
notif = notify.RequestCloseNotification(
request.user.ID, pkgreq.ID, pkgreq.status_display())
request.user.ID, pkgreq.ID, pkgreq.status_display()
)
notifs.append(notif)
# Return notifications to the caller for sending.


@@ -3,16 +3,23 @@ from typing import Set
from sqlalchemy import and_, case, or_, orm
from aurweb import db, models
from aurweb.models import Package, PackageBase, User
from aurweb.models.dependency_type import CHECKDEPENDS_ID, DEPENDS_ID, MAKEDEPENDS_ID, OPTDEPENDS_ID
from aurweb.models import Group, Package, PackageBase, User
from aurweb.models.dependency_type import (
CHECKDEPENDS_ID,
DEPENDS_ID,
MAKEDEPENDS_ID,
OPTDEPENDS_ID,
)
from aurweb.models.package_comaintainer import PackageComaintainer
from aurweb.models.package_group import PackageGroup
from aurweb.models.package_keyword import PackageKeyword
from aurweb.models.package_notification import PackageNotification
from aurweb.models.package_vote import PackageVote
from aurweb.models.relation_type import CONFLICTS_ID, PROVIDES_ID, REPLACES_ID
class PackageSearch:
""" A Package search query builder. """
"""A Package search query builder."""
# A constant mapping of short to full name sort orderings.
FULL_SORT_ORDER = {"d": "desc", "a": "asc"}
@@ -24,14 +31,18 @@ class PackageSearch:
if self.user:
self.query = self.query.join(
PackageVote,
and_(PackageVote.PackageBaseID == PackageBase.ID,
PackageVote.UsersID == self.user.ID),
isouter=True
and_(
PackageVote.PackageBaseID == PackageBase.ID,
PackageVote.UsersID == self.user.ID,
),
isouter=True,
).join(
PackageNotification,
and_(PackageNotification.PackageBaseID == PackageBase.ID,
PackageNotification.UserID == self.user.ID),
isouter=True
and_(
PackageNotification.PackageBaseID == PackageBase.ID,
PackageNotification.UserID == self.user.ID,
),
isouter=True,
)
self.ordering = "d"
@@ -47,7 +58,7 @@
"m": self._search_by_maintainer,
"c": self._search_by_comaintainer,
"M": self._search_by_co_or_maintainer,
"s": self._search_by_submitter
"s": self._search_by_submitter,
}
# Setup SB (Sort By) callbacks.
@@ -58,7 +69,7 @@
"w": self._sort_by_voted,
"o": self._sort_by_notify,
"m": self._sort_by_maintainer,
"l": self._sort_by_last_modified
"l": self._sort_by_last_modified,
}
self._joined_user = False
@@ -66,12 +77,10 @@
self._joined_comaint = False
def _join_user(self, outer: bool = True) -> orm.Query:
""" Centralized joining of a package base's maintainer. """
"""Centralized joining of a package base's maintainer."""
if not self._joined_user:
self.query = self.query.join(
User,
User.ID == PackageBase.MaintainerUID,
isouter=outer
User, User.ID == PackageBase.MaintainerUID, isouter=outer
)
self._joined_user = True
return self.query
@@ -87,7 +96,7 @@
self.query = self.query.join(
PackageComaintainer,
PackageComaintainer.PackageBaseID == PackageBase.ID,
isouter=isouter
isouter=isouter,
)
self._joined_comaint = True
return self.query
@@ -95,8 +104,10 @@
def _search_by_namedesc(self, keywords: str) -> orm.Query:
self._join_user()
self.query = self.query.filter(
or_(Package.Name.like(f"%{keywords}%"),
Package.Description.like(f"%{keywords}%"))
or_(
Package.Name.like(f"%{keywords}%"),
Package.Description.like(f"%{keywords}%"),
)
)
return self
@@ -125,15 +136,17 @@
self._join_user()
self._join_keywords()
keywords = set(k.lower() for k in keywords)
self.query = self.query.filter(PackageKeyword.Keyword.in_(keywords))
self.query = self.query.filter(PackageKeyword.Keyword.in_(keywords)).group_by(
models.Package.Name
)
return self
def _search_by_maintainer(self, keywords: str) -> orm.Query:
self._join_user()
if keywords:
self.query = self.query.filter(
and_(User.Username == keywords,
User.ID == PackageBase.MaintainerUID)
and_(User.Username == keywords, User.ID == PackageBase.MaintainerUID)
)
else:
self.query = self.query.filter(PackageBase.MaintainerUID.is_(None))
@@ -182,13 +195,13 @@
def _sort_by_votes(self, order: str):
column = getattr(models.PackageBase.NumVotes, order)
name = getattr(models.Package.Name, order)
name = getattr(models.PackageBase.Name, order)
self.query = self.query.order_by(column(), name())
return self
def _sort_by_popularity(self, order: str):
column = getattr(models.PackageBase.Popularity, order)
name = getattr(models.Package.Name, order)
name = getattr(models.PackageBase.Name, order)
self.query = self.query.order_by(column(), name())
return self
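The sort helpers above rely on `getattr(column, order)` to pick `.asc()` or `.desc()` by the string `order`. The same trick works on any object exposing both methods; a stdlib sketch with a toy column class (names are illustrative, not SQLAlchemy's API):

```python
class Column:
    def __init__(self, name: str):
        self.name = name

    def asc(self):
        return f"{self.name} ASC"

    def desc(self):
        return f"{self.name} DESC"

def order_clause(column: Column, order: str) -> str:
    # Mirrors getattr(models.PackageBase.NumVotes, order)() in the search code:
    # fetch the asc/desc method by name, then call it.
    return getattr(column, order)()

print(order_clause(Column("NumVotes"), "desc"))  # NumVotes DESC
print(order_clause(Column("Name"), "asc"))       # Name ASC
```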
@@ -197,8 +210,7 @@
# in terms of performance. We should improve this; there's no
# reason it should take _longer_.
column = getattr(
case([(models.PackageVote.UsersID == self.user.ID, 1)], else_=0),
order
case([(models.PackageVote.UsersID == self.user.ID, 1)], else_=0), order
)
name = getattr(models.Package.Name, order)
self.query = self.query.order_by(column(), name())
@@ -209,9 +221,8 @@
# in terms of performance. We should improve this; there's no
# reason it should take _longer_.
column = getattr(
case([(models.PackageNotification.UserID == self.user.ID, 1)],
else_=0),
order
case([(models.PackageNotification.UserID == self.user.ID, 1)], else_=0),
order,
)
name = getattr(models.Package.Name, order)
self.query = self.query.order_by(column(), name())
@@ -225,7 +236,7 @@
def _sort_by_last_modified(self, order: str):
column = getattr(models.PackageBase.ModifiedTS, order)
name = getattr(models.Package.Name, order)
name = getattr(models.PackageBase.Name, order)
self.query = self.query.order_by(column(), name())
return self
@@ -239,16 +250,16 @@
return callback(ordering)
def count(self) -> int:
""" Return internal query's count. """
"""Return internal query's count."""
return self.query.count()
def results(self) -> orm.Query:
""" Return internal query. """
"""Return internal query."""
return self.query
class RPCSearch(PackageSearch):
""" A PackageSearch-derived RPC package search query builder.
"""A PackageSearch-derived RPC package search query builder.
With RPC search, we need a subset of PackageSearch's handlers,
with a few additional handlers added. So, within the RPCSearch
@@ -261,7 +272,7 @@
sanitization done for the PackageSearch `by` argument.
"""
keys_removed = ("b", "N", "B", "k", "c", "M", "s")
keys_removed = ("b", "N", "B", "M")
def __init__(self) -> "RPCSearch":
super().__init__()
@@ -270,52 +281,112 @@
# We keep: "nd", "n" and "m". We also overlay four new by params
# on top: "depends", "makedepends", "optdepends" and "checkdepends".
self.search_by_cb = {
k: v for k, v in self.search_by_cb.items()
k: v
for k, v in self.search_by_cb.items()
if k not in RPCSearch.keys_removed
}
self.search_by_cb.update({
"depends": self._search_by_depends,
"makedepends": self._search_by_makedepends,
"optdepends": self._search_by_optdepends,
"checkdepends": self._search_by_checkdepends
})
self.search_by_cb.update(
{
"depends": self._search_by_depends,
"makedepends": self._search_by_makedepends,
"optdepends": self._search_by_optdepends,
"checkdepends": self._search_by_checkdepends,
"provides": self._search_by_provides,
"conflicts": self._search_by_conflicts,
"replaces": self._search_by_replaces,
"groups": self._search_by_groups,
}
)
# We always want an optional Maintainer in the RPC.
self._join_user()
def _join_depends(self, dep_type_id: int) -> orm.Query:
""" Join Package with PackageDependency and filter results
"""Join Package with PackageDependency and filter results
based on `dep_type_id`.
:param dep_type_id: DependencyType ID
:returns: PackageDependency-joined orm.Query
"""
self.query = self.query.join(models.PackageDependency).filter(
models.PackageDependency.DepTypeID == dep_type_id)
models.PackageDependency.DepTypeID == dep_type_id
)
return self.query
def _join_relations(self, rel_type_id: int) -> orm.Query:
"""Join Package with PackageRelation and filter results
based on `rel_type_id`.
:param rel_type_id: RelationType ID
:returns: PackageRelation-joined orm.Query
"""
self.query = self.query.join(models.PackageRelation).filter(
models.PackageRelation.RelTypeID == rel_type_id
)
return self.query
def _join_groups(self) -> orm.Query:
"""Join Package with PackageGroup and Group.
:returns: PackageGroup/Group-joined orm.Query
"""
self.query = self.query.join(PackageGroup).join(Group)
return self.query
def _search_by_depends(self, keywords: str) -> "RPCSearch":
self.query = self._join_depends(DEPENDS_ID).filter(
models.PackageDependency.DepName == keywords)
models.PackageDependency.DepName == keywords
)
return self
def _search_by_makedepends(self, keywords: str) -> "RPCSearch":
self.query = self._join_depends(MAKEDEPENDS_ID).filter(
models.PackageDependency.DepName == keywords)
models.PackageDependency.DepName == keywords
)
return self
def _search_by_optdepends(self, keywords: str) -> "RPCSearch":
self.query = self._join_depends(OPTDEPENDS_ID).filter(
models.PackageDependency.DepName == keywords)
models.PackageDependency.DepName == keywords
)
return self
def _search_by_checkdepends(self, keywords: str) -> "RPCSearch":
self.query = self._join_depends(CHECKDEPENDS_ID).filter(
models.PackageDependency.DepName == keywords)
models.PackageDependency.DepName == keywords
)
return self
def _search_by_provides(self, keywords: str) -> "RPCSearch":
self.query = self._join_relations(PROVIDES_ID).filter(
models.PackageRelation.RelName == keywords
)
return self
def _search_by_conflicts(self, keywords: str) -> "RPCSearch":
self.query = self._join_relations(CONFLICTS_ID).filter(
models.PackageRelation.RelName == keywords
)
return self
def _search_by_replaces(self, keywords: str) -> "RPCSearch":
self.query = self._join_relations(REPLACES_ID).filter(
models.PackageRelation.RelName == keywords
)
return self
def _search_by_groups(self, keywords: str) -> "RPCSearch":
self._join_groups()
self.query = self.query.filter(Group.Name == keywords)
return self
def _search_by_keywords(self, keywords: str) -> "RPCSearch":
self._join_keywords()
self.query = self.query.filter(PackageKeyword.Keyword == keywords)
return self
def search_by(self, by: str, keywords: str) -> "RPCSearch":
""" Override inherited search_by. In this override, we reduce the
"""Override inherited search_by. In this override, we reduce the
scope of what we handle within this function. We do not set `by`
to a default of "nd" in the RPC, as the RPC returns an error when
incorrect `by` fields are specified.
@@ -329,6 +400,4 @@
return result
def results(self) -> orm.Query:
return self.query.filter(
models.PackageBase.PackagerUID.isnot(None)
)
return self.query


@@ -1,21 +1,21 @@
from collections import defaultdict
from http import HTTPStatus
from typing import Dict, List, Tuple, Union
from typing import Tuple, Union
from urllib.parse import quote_plus
import orjson
from fastapi import HTTPException
from sqlalchemy import orm
from aurweb import config, db, models
from aurweb.aur_redis import redis_connection
from aurweb.models import Package
from aurweb.models.official_provider import OFFICIAL_BASE, OfficialProvider
from aurweb.models.package_dependency import PackageDependency
from aurweb.models.package_relation import PackageRelation
from aurweb.redis import redis_connection
from aurweb.templates import register_filter
Providers = List[Union[PackageRelation, OfficialProvider]]
Providers = list[Union[PackageRelation, OfficialProvider]]
def dep_extra_with_arch(dep: models.PackageDependency, annotation: str) -> str:
@@ -43,10 +43,10 @@ def dep_optdepends_extra(dep: models.PackageDependency) -> str:
@register_filter("dep_extra")
def dep_extra(dep: models.PackageDependency) -> str:
""" Some dependency types have extra text added to their
"""Some dependency types have extra text added to their
display. This function provides that output. However, it
**assumes** that the dep passed is bound to a valid one
of: depends, makedepends, checkdepends or optdepends. """
of: depends, makedepends, checkdepends or optdepends."""
f = globals().get(f"dep_{dep.DependencyType.Name}_extra")
return f(dep)
@@ -61,13 +61,13 @@ def dep_extra_desc(dep: models.PackageDependency) -> str:
@register_filter("pkgname_link")
def pkgname_link(pkgname: str) -> str:
record = db.query(Package).filter(
Package.Name == pkgname).exists()
record = db.query(Package).filter(Package.Name == pkgname).exists()
if db.query(record).scalar():
return f"/packages/{pkgname}"
official = db.query(OfficialProvider).filter(
OfficialProvider.Name == pkgname).exists()
official = (
db.query(OfficialProvider).filter(OfficialProvider.Name == pkgname).exists()
)
if db.query(official).scalar():
base = "/".join([OFFICIAL_BASE, "packages"])
return f"{base}/?q={pkgname}"
@@ -83,17 +83,17 @@ def package_link(package: Union[Package, OfficialProvider]) -> str:
@register_filter("provides_markup")
def provides_markup(provides: Providers) -> str:
return ", ".join([
f'<a href="{package_link(pkg)}">{pkg.Name}</a>'
for pkg in provides
])
links = []
for pkg in provides:
aur = "<sup><small>AUR</small></sup>" if not pkg.is_official else ""
links.append(f'<a href="{package_link(pkg)}">{pkg.Name}</a>{aur}')
return ", ".join(links)
def get_pkg_or_base(
name: str,
cls: Union[models.Package, models.PackageBase] = models.PackageBase) \
-> Union[models.Package, models.PackageBase]:
""" Get a PackageBase instance by its name or raise a 404 if
name: str, cls: Union[models.Package, models.PackageBase] = models.PackageBase
) -> Union[models.Package, models.PackageBase]:
"""Get a PackageBase instance by its name or raise a 404 if
it can't be found in the database.
:param name: {Package,PackageBase}.Name
@@ -102,15 +102,13 @@ def get_pkg_or_base(
:raises HTTPException: With status code 404 if record doesn't exist
:return: {Package,PackageBase} instance
"""
with db.begin():
instance = db.query(cls).filter(cls.Name == name).first()
instance = db.query(cls).filter(cls.Name == name).first()
if not instance:
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
return instance
def get_pkgbase_comment(pkgbase: models.PackageBase, id: int) \
-> models.PackageComment:
def get_pkgbase_comment(pkgbase: models.PackageBase, id: int) -> models.PackageComment:
comment = pkgbase.comments.filter(models.PackageComment.ID == id).first()
if not comment:
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
@@ -122,9 +120,8 @@ def out_of_date(packages: orm.Query) -> orm.Query:
return packages.filter(models.PackageBase.OutOfDateTS.isnot(None))
def updated_packages(limit: int = 0,
cache_ttl: int = 600) -> List[models.Package]:
""" Return a list of valid Package objects ordered by their
def updated_packages(limit: int = 0, cache_ttl: int = 600) -> list[models.Package]:
"""Return a list of valid Package objects ordered by their
ModifiedTS column in descending order from cache, after setting
the cache when no key yet exists.
@@ -138,27 +135,26 @@ def updated_packages(limit: int = 0,
# If we already have a cache, deserialize it and return.
return orjson.loads(packages)
with db.begin():
query = db.query(models.Package).join(models.PackageBase).filter(
models.PackageBase.PackagerUID.isnot(None)
).order_by(
models.PackageBase.ModifiedTS.desc()
)
query = (
db.query(models.Package)
.join(models.PackageBase)
.order_by(models.PackageBase.ModifiedTS.desc())
)
if limit:
query = query.limit(limit)
if limit:
query = query.limit(limit)
packages = []
for pkg in query:
# For each Package returned by the query, append a dict
# containing Package columns we're interested in.
packages.append({
"Name": pkg.Name,
"Version": pkg.Version,
"PackageBase": {
"ModifiedTS": pkg.PackageBase.ModifiedTS
packages.append(
{
"Name": pkg.Name,
"Version": pkg.Version,
"PackageBase": {"ModifiedTS": pkg.PackageBase.ModifiedTS},
}
})
)
# Store the JSON serialization of the package_updates key into Redis.
redis.set("package_updates", orjson.dumps(packages))
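The `updated_packages` hunk above follows a cache-aside pattern: serve the orjson-serialized Redis value when present, otherwise run the query and store the result. A minimal runnable sketch of the same pattern, with a plain dict standing in for Redis and stdlib `json` for orjson (both substitutions are assumptions for illustration):

```python
import json

# Stand-in for the Redis connection used by updated_packages().
_cache: dict[str, str] = {}

def updated_packages_cached(fetch, cache_ttl: int = 600) -> list[dict]:
    """Cache-aside: return the deserialized cache hit, else query and store."""
    cached = _cache.get("package_updates")
    if cached is not None:
        return json.loads(cached)  # cache hit: deserialize and return
    packages = fetch()             # cache miss: run the real query
    _cache["package_updates"] = json.dumps(packages)  # TTL omitted in stand-in
    return packages

calls = []
def fetch():
    calls.append(1)
    return [{"Name": "aurweb", "Version": "6.2.16"}]

first = updated_packages_cached(fetch)
second = updated_packages_cached(fetch)  # served from cache; fetch not re-run
```

The second call never touches the query path, which is the point of storing the serialized list under a single key.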
@@ -168,9 +164,8 @@ def updated_packages(limit: int = 0,
return packages
def query_voted(query: List[models.Package],
user: models.User) -> Dict[int, bool]:
""" Produce a dictionary of package base ID keys to boolean values,
def query_voted(query: list[models.Package], user: models.User) -> dict[int, bool]:
"""Produce a dictionary of package base ID keys to boolean values,
which indicate whether or not the package base has a vote record
related to user.
@@ -180,20 +175,18 @@ def query_voted(query: List[models.Package],
"""
output = defaultdict(bool)
query_set = {pkg.PackageBaseID for pkg in query}
voted = db.query(models.PackageVote).join(
models.PackageBase,
models.PackageBase.ID.in_(query_set)
).filter(
models.PackageVote.UsersID == user.ID
voted = (
db.query(models.PackageVote)
.join(models.PackageBase, models.PackageBase.ID.in_(query_set))
.filter(models.PackageVote.UsersID == user.ID)
)
for vote in voted:
output[vote.PackageBase.ID] = True
return output
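`query_voted` above defaults every package base to `False` and flips only the IDs that have a vote record. The same shape in a self-contained sketch, with plain sets standing in for the ORM query results (the names here are illustrative, not aurweb's):

```python
from collections import defaultdict

def membership_map(page_ids: set[int], voted_ids: set[int]) -> dict[int, bool]:
    """Mimic query_voted(): default every ID to False, flip voted ones to True."""
    output: dict[int, bool] = defaultdict(bool)
    for pkgbase_id in page_ids & voted_ids:  # one pass over the intersection
        output[pkgbase_id] = True
    return output

votes = membership_map({1, 2, 3}, {2, 9})
# votes[2] is True; votes[1] and votes[3] fall back to False via defaultdict
```

Using `defaultdict(bool)` means templates can look up any ID without key errors, mirroring how the real function is consumed.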
def query_notified(query: List[models.Package],
user: models.User) -> Dict[int, bool]:
""" Produce a dictionary of package base ID keys to boolean values,
def query_notified(query: list[models.Package], user: models.User) -> dict[int, bool]:
"""Produce a dictionary of package base ID keys to boolean values,
which indicate whether or not the package base has a notification
record related to user.
@@ -203,19 +196,17 @@ def query_notified(query: List[models.Package],
"""
output = defaultdict(bool)
query_set = {pkg.PackageBaseID for pkg in query}
notified = db.query(models.PackageNotification).join(
models.PackageBase,
models.PackageBase.ID.in_(query_set)
).filter(
models.PackageNotification.UserID == user.ID
notified = (
db.query(models.PackageNotification)
.join(models.PackageBase, models.PackageBase.ID.in_(query_set))
.filter(models.PackageNotification.UserID == user.ID)
)
for notif in notified:
output[notif.PackageBase.ID] = True
return output
def pkg_required(pkgname: str, provides: List[str], limit: int) \
-> List[PackageDependency]:
def pkg_required(pkgname: str, provides: list[str]) -> list[PackageDependency]:
"""
Get dependencies that match a string in `[pkgname] + provides`.
@@ -225,10 +216,14 @@ def pkg_required(pkgname: str, provides: List[str], limit: int) \
:return: List of PackageDependency instances
"""
targets = set([pkgname] + provides)
query = db.query(PackageDependency).join(Package).filter(
PackageDependency.DepName.in_(targets)
).order_by(Package.Name.asc()).limit(limit)
return query.all()
query = (
db.query(PackageDependency)
.join(Package)
.options(orm.contains_eager(PackageDependency.Package))
.filter(PackageDependency.DepName.in_(targets))
.order_by(Package.Name.asc())
)
return query
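The `pkg_required` change adds `contains_eager` so each `PackageDependency.Package` is populated from the joined row instead of triggering one lazy query per dependency. A plain-Python illustration of the N+1 effect being avoided (stub data and hypothetical query counters, not the ORM API):

```python
# Stub rows standing in for PackageDependency joined to Package.
deps = [("dep1", 10), ("dep2", 11)]  # (DepName, PackageID)
packages = {10: "pkg-a", 11: "pkg-b"}
queries = 0

def lazy_load(pkg_id):
    """One extra query per row: the N+1 pattern."""
    global queries
    queries += 1
    return packages[pkg_id]

def eager_load(ids):
    """One batched query for the whole ID set, then a local join."""
    global queries
    queries += 1
    return {i: packages[i] for i in ids}

lazy = [lazy_load(pid) for _, pid in deps]     # two queries for two rows
queries = 0
loaded = eager_load({pid for _, pid in deps})  # a single query
eager = [loaded[pid] for _, pid in deps]
```

Both paths return the same packages; only the query count differs, which is what the commit message about reducing subqueries is after.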
@register_filter("source_uri")
@@ -247,12 +242,12 @@ def source_uri(pkgsrc: models.PackageSource) -> Tuple[str, str]:
the package base name.
:param pkgsrc: PackageSource instance
:return (text, uri) tuple
:return (text, uri) tuple
"""
if "::" in pkgsrc.Source:
return pkgsrc.Source.split("::", 1)
elif "://" in pkgsrc.Source:
return (pkgsrc.Source, pkgsrc.Source)
return pkgsrc.Source, pkgsrc.Source
path = config.get("options", "source_file_uri")
pkgbasename = pkgsrc.Package.PackageBase.Name
return (pkgsrc.Source, path % (pkgsrc.Source, pkgbasename))
pkgbasename = quote_plus(pkgsrc.Package.PackageBase.Name)
return pkgsrc.Source, path % (pkgsrc.Source, pkgbasename)


@@ -1,10 +1,8 @@
from typing import List
from fastapi import Request
from aurweb import db, logging, util
from aurweb import aur_logging, db, util
from aurweb.auth import creds
from aurweb.models import PackageBase
from aurweb.models import PackageBase, User
from aurweb.models.package_comaintainer import PackageComaintainer
from aurweb.models.package_notification import PackageNotification
from aurweb.models.request_type import DELETION_ID, MERGE_ID, ORPHAN_ID
@@ -12,19 +10,30 @@ from aurweb.packages.requests import handle_request, update_closure_comment
from aurweb.pkgbase import util as pkgbaseutil
from aurweb.scripts import notify, popupdate
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
@db.retry_deadlock
def _retry_notify(user: User, pkgbase: PackageBase) -> None:
with db.begin():
db.create(PackageNotification, PackageBase=pkgbase, User=user)
def pkgbase_notify_instance(request: Request, pkgbase: PackageBase) -> None:
notif = db.query(pkgbase.notifications.filter(
PackageNotification.UserID == request.user.ID
).exists()).scalar()
notif = db.query(
pkgbase.notifications.filter(
PackageNotification.UserID == request.user.ID
).exists()
).scalar()
has_cred = request.user.has_credential(creds.PKGBASE_NOTIFY)
if has_cred and not notif:
with db.begin():
db.create(PackageNotification,
PackageBase=pkgbase,
User=request.user)
_retry_notify(request.user, pkgbase)
@db.retry_deadlock
def _retry_unnotify(notif: PackageNotification, pkgbase: PackageBase) -> None:
with db.begin():
db.delete(notif)
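The `@db.retry_deadlock` helpers above wrap each write in a small function so the whole transaction can be re-run if the database reports a deadlock. A hedged sketch of what such a decorator can look like — `DeadlockError` and the retry count are stand-ins for illustration, not aurweb's actual implementation:

```python
import functools

class DeadlockError(Exception):
    """Stand-in for the driver's deadlock exception."""

def retry_deadlock(func, attempts: int = 3):
    """Re-run the wrapped transaction when a deadlock is reported."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        for attempt in range(attempts):
            try:
                return func(*args, **kwargs)
            except DeadlockError:
                if attempt == attempts - 1:
                    raise  # give up after the last attempt
    return wrapper

failures = [DeadlockError(), DeadlockError()]  # first two calls deadlock

@retry_deadlock
def create_notification():
    if failures:
        raise failures.pop()
    return "created"

result = create_notification()  # third attempt succeeds
```

Keeping the `db.begin()` block inside the wrapped function is what makes the retry safe: each attempt is a fresh transaction.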
def pkgbase_unnotify_instance(request: Request, pkgbase: PackageBase) -> None:
@@ -33,25 +42,38 @@ def pkgbase_unnotify_instance(request: Request, pkgbase: PackageBase) -> None:
).first()
has_cred = request.user.has_credential(creds.PKGBASE_NOTIFY)
if has_cred and notif:
with db.begin():
db.delete(notif)
_retry_unnotify(notif, pkgbase)
@db.retry_deadlock
def _retry_unflag(pkgbase: PackageBase) -> None:
with db.begin():
pkgbase.OutOfDateTS = None
pkgbase.Flagger = None
pkgbase.FlaggerComment = str()
def pkgbase_unflag_instance(request: Request, pkgbase: PackageBase) -> None:
has_cred = request.user.has_credential(
creds.PKGBASE_UNFLAG, approved=[pkgbase.Flagger, pkgbase.Maintainer])
creds.PKGBASE_UNFLAG,
approved=[pkgbase.Flagger, pkgbase.Maintainer]
+ [c.User for c in pkgbase.comaintainers],
)
if has_cred:
with db.begin():
pkgbase.OutOfDateTS = None
pkgbase.Flagger = None
pkgbase.FlaggerComment = str()
_retry_unflag(pkgbase)
def pkgbase_disown_instance(request: Request, pkgbase: PackageBase) -> None:
disowner = request.user
notifs = [notify.DisownNotification(disowner.ID, pkgbase.ID)]
@db.retry_deadlock
def _retry_disown(request: Request, pkgbase: PackageBase):
notifs: list[notify.Notification] = []
is_maint = request.user == pkgbase.Maintainer
comaint = pkgbase.comaintainers.filter(
PackageComaintainer.User == request.user
).one_or_none()
is_comaint = comaint is not None
is_maint = disowner == pkgbase.Maintainer
if is_maint:
with db.begin():
# Comaintainer with the lowest Priority value; next-in-line.
@@ -65,46 +87,61 @@ def pkgbase_disown_instance(request: Request, pkgbase: PackageBase) -> None:
else:
# Otherwise, just orphan the package completely.
pkgbase.Maintainer = None
elif is_comaint:
# This disown request is from a Comaintainer
with db.begin():
notif = pkgbaseutil.remove_comaintainer(comaint)
notifs.append(notif)
elif request.user.has_credential(creds.PKGBASE_DISOWN):
# Otherwise, the request user performing this disownage is a
# Trusted User and we treat it like a standard orphan request.
# Package Maintainer and we treat it like a standard orphan request.
notifs += handle_request(request, ORPHAN_ID, pkgbase)
with db.begin():
pkgbase.Maintainer = None
db.delete_all(pkgbase.comaintainers)
return notifs
def pkgbase_disown_instance(request: Request, pkgbase: PackageBase) -> None:
disowner = request.user
notifs = [notify.DisownNotification(disowner.ID, pkgbase.ID)]
notifs += _retry_disown(request, pkgbase)
util.apply_all(notifs, lambda n: n.send())
def pkgbase_adopt_instance(request: Request, pkgbase: PackageBase) -> None:
@db.retry_deadlock
def _retry_adopt(request: Request, pkgbase: PackageBase) -> None:
with db.begin():
pkgbase.Maintainer = request.user
def pkgbase_adopt_instance(request: Request, pkgbase: PackageBase) -> None:
_retry_adopt(request, pkgbase)
notif = notify.AdoptNotification(request.user.ID, pkgbase.ID)
notif.send()
def pkgbase_delete_instance(request: Request, pkgbase: PackageBase,
comments: str = str()) \
-> List[notify.Notification]:
notifs = handle_request(request, DELETION_ID, pkgbase) + [
notify.DeleteNotification(request.user.ID, pkgbase.ID)
]
@db.retry_deadlock
def _retry_delete(pkgbase: PackageBase, comments: str) -> None:
with db.begin():
update_closure_comment(pkgbase, DELETION_ID, comments)
db.delete(pkgbase)
def pkgbase_delete_instance(
request: Request, pkgbase: PackageBase, comments: str = str()
) -> list[notify.Notification]:
notif = notify.DeleteNotification(request.user.ID, pkgbase.ID)
notifs = handle_request(request, DELETION_ID, pkgbase, comments=comments) + [notif]
_retry_delete(pkgbase, comments)
return notifs
def pkgbase_merge_instance(request: Request, pkgbase: PackageBase,
target: PackageBase, comments: str = str()) -> None:
pkgbasename = str(pkgbase.Name)
# Create notifications.
notifs = handle_request(request, MERGE_ID, pkgbase, target)
@db.retry_deadlock
def _retry_merge(pkgbase: PackageBase, target: PackageBase) -> None:
# Target votes and notifications sets of user IDs that are
# looking to be migrated.
target_votes = set(v.UsersID for v in target.package_votes)
@@ -134,9 +171,25 @@ def pkgbase_merge_instance(request: Request, pkgbase: PackageBase,
db.delete(pkg)
db.delete(pkgbase)
def pkgbase_merge_instance(
request: Request,
pkgbase: PackageBase,
target: PackageBase,
comments: str = str(),
) -> None:
pkgbasename = str(pkgbase.Name)
# Create notifications.
notifs = handle_request(request, MERGE_ID, pkgbase, target, comments)
_retry_merge(pkgbase, target)
# Log this out for accountability purposes.
logger.info(f"Trusted User '{request.user.Username}' merged "
f"'{pkgbasename}' into '{target.Name}'.")
logger.info(
f"Package Maintainer '{request.user.Username}' merged "
f"'{pkgbasename}' into '{target.Name}'."
)
# Send notifications.
util.apply_all(notifs, lambda n: n.send())


@@ -1,10 +1,12 @@
from typing import Any, Dict, List
from typing import Any
from fastapi import Request
from sqlalchemy import and_
from sqlalchemy.orm import joinedload
from aurweb import config, db, l10n, util
from aurweb import config, db, defaults, l10n, time, util
from aurweb.models import PackageBase, User
from aurweb.models.package_base import popularity
from aurweb.models.package_comaintainer import PackageComaintainer
from aurweb.models.package_comment import PackageComment
from aurweb.models.package_request import PENDING_ID, PackageRequest
@@ -13,51 +15,88 @@ from aurweb.scripts import notify
from aurweb.templates import make_context as _make_context
def make_context(request: Request, pkgbase: PackageBase) -> Dict[str, Any]:
""" Make a basic context for package or pkgbase.
def make_context(
request: Request, pkgbase: PackageBase, context: dict[str, Any] = None
) -> dict[str, Any]:
"""Make a basic context for package or pkgbase.
:param request: FastAPI request
:param pkgbase: PackageBase instance
:return: A pkgbase context without specific differences
"""
context = _make_context(request, pkgbase.Name)
if not context:
context = _make_context(request, pkgbase.Name)
is_authenticated = request.user.is_authenticated()
# Per page and offset.
offset, per_page = util.sanitize_params(
request.query_params.get("O", defaults.O),
request.query_params.get("PP", defaults.COMMENTS_PER_PAGE),
)
context["O"] = offset
context["PP"] = per_page
context["git_clone_uri_anon"] = config.get("options", "git_clone_uri_anon")
context["git_clone_uri_priv"] = config.get("options", "git_clone_uri_priv")
context["pkgbase"] = pkgbase
context["comaintainers"] = [
c.User for c in pkgbase.comaintainers.order_by(
PackageComaintainer.Priority.asc()
).all()
c.User
for c in pkgbase.comaintainers.options(joinedload(PackageComaintainer.User))
.order_by(PackageComaintainer.Priority.asc())
.all()
]
if is_authenticated:
context["unflaggers"] = context["comaintainers"].copy()
context["unflaggers"].extend([pkgbase.Maintainer, pkgbase.Flagger])
else:
context["unflaggers"] = []
context["packages_count"] = pkgbase.packages.count()
context["keywords"] = pkgbase.keywords
context["comments"] = pkgbase.comments.order_by(
context["comments_total"] = pkgbase.comments.order_by(
PackageComment.CommentTS.desc()
).count()
context["comments"] = (
pkgbase.comments.order_by(PackageComment.CommentTS.desc())
.limit(per_page)
.offset(offset)
)
context["pinned_comments"] = pkgbase.comments.filter(
PackageComment.PinnedTS != 0
).order_by(PackageComment.CommentTS.desc())
context["is_maintainer"] = bool(request.user == pkgbase.Maintainer)
context["notified"] = request.user.notified(pkgbase)
if is_authenticated:
context["notified"] = request.user.notified(pkgbase)
else:
context["notified"] = False
context["out_of_date"] = bool(pkgbase.OutOfDateTS)
context["voted"] = request.user.package_votes.filter(
PackageVote.PackageBaseID == pkgbase.ID
).scalar()
if is_authenticated:
context["voted"] = db.query(
request.user.package_votes.filter(
PackageVote.PackageBaseID == pkgbase.ID
).exists()
).scalar()
else:
context["voted"] = False
context["requests"] = pkgbase.requests.filter(
and_(PackageRequest.Status == PENDING_ID,
PackageRequest.ClosedTS.is_(None))
).count()
if is_authenticated:
context["requests"] = pkgbase.requests.filter(
and_(PackageRequest.Status == PENDING_ID, PackageRequest.ClosedTS.is_(None))
).count()
else:
context["requests"] = []
context["popularity"] = popularity(pkgbase, time.utcnow())
return context
def remove_comaintainer(comaint: PackageComaintainer) \
-> notify.ComaintainerRemoveNotification:
def remove_comaintainer(
comaint: PackageComaintainer,
) -> notify.ComaintainerRemoveNotification:
"""
Remove a PackageComaintainer.
@@ -77,7 +116,8 @@ def remove_comaintainer(comaint: PackageComaintainer) \
return notif
def remove_comaintainers(pkgbase: PackageBase, usernames: List[str]) -> None:
@db.retry_deadlock
def remove_comaintainers(pkgbase: PackageBase, usernames: list[str]) -> None:
"""
Remove comaintainers from `pkgbase`.
@@ -86,9 +126,9 @@ def remove_comaintainers(pkgbase: PackageBase, usernames: List[str]) -> None:
"""
notifications = []
with db.begin():
comaintainers = pkgbase.comaintainers.join(User).filter(
User.Username.in_(usernames)
).all()
comaintainers = (
pkgbase.comaintainers.join(User).filter(User.Username.in_(usernames)).all()
)
notifications = [
notify.ComaintainerRemoveNotification(co.User.ID, pkgbase.ID)
for co in comaintainers
@@ -112,23 +152,24 @@ def latest_priority(pkgbase: PackageBase) -> int:
"""
# Order comaintainers related to pkgbase by Priority DESC.
record = pkgbase.comaintainers.order_by(
PackageComaintainer.Priority.desc()).first()
record = pkgbase.comaintainers.order_by(PackageComaintainer.Priority.desc()).first()
# Use Priority column if record exists, otherwise 0.
return record.Priority if record else 0
class NoopComaintainerNotification:
""" A noop notification stub used as an error-state return value. """
"""A noop notification stub used as an error-state return value."""
def send(self) -> None:
""" noop """
"""noop"""
return
def add_comaintainer(pkgbase: PackageBase, comaintainer: User) \
-> notify.ComaintainerAddNotification:
@db.retry_deadlock
def add_comaintainer(
pkgbase: PackageBase, comaintainer: User
) -> notify.ComaintainerAddNotification:
"""
Add a new comaintainer to `pkgbase`.
@@ -144,14 +185,19 @@ def add_comaintainer(pkgbase: PackageBase, comaintainer: User) \
new_prio = latest_priority(pkgbase) + 1
with db.begin():
db.create(PackageComaintainer, PackageBase=pkgbase,
User=comaintainer, Priority=new_prio)
db.create(
PackageComaintainer,
PackageBase=pkgbase,
User=comaintainer,
Priority=new_prio,
)
return notify.ComaintainerAddNotification(comaintainer.ID, pkgbase.ID)
def add_comaintainers(request: Request, pkgbase: PackageBase,
usernames: List[str]) -> None:
def add_comaintainers(
request: Request, pkgbase: PackageBase, usernames: list[str]
) -> None:
"""
Add comaintainers to `pkgbase`.
@@ -195,7 +241,6 @@ def rotate_comaintainers(pkgbase: PackageBase) -> None:
:param pkgbase: PackageBase instance
"""
comaintainers = pkgbase.comaintainers.order_by(
PackageComaintainer.Priority.asc())
comaintainers = pkgbase.comaintainers.order_by(PackageComaintainer.Priority.asc())
for i, comaint in enumerate(comaintainers):
comaint.Priority = i + 1
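`rotate_comaintainers` renumbers priorities consecutively from 1 in ascending order, closing any gaps left by removals. A standalone sketch of that renumbering (attribute names lowercased here for illustration, and a plain list stands in for the ORM query):

```python
class Comaintainer:
    def __init__(self, user: str, priority: int):
        self.user, self.priority = user, priority

def rotate(comaintainers: list[Comaintainer]) -> None:
    """Renumber priorities 1..n in the current ascending order."""
    comaintainers.sort(key=lambda c: c.priority)
    for i, comaint in enumerate(comaintainers):
        comaint.priority = i + 1

team = [Comaintainer("alice", 4), Comaintainer("bob", 9), Comaintainer("carol", 5)]
rotate(team)
# priorities are now 1, 2, 3 with the original relative order preserved
```

Relative order is preserved because the sort key is the old priority; only the absolute values are compacted.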


@@ -1,35 +1,55 @@
from typing import Any, Dict
from http import HTTPStatus
from typing import Any
from aurweb import db
from fastapi import HTTPException
from aurweb import config, db
from aurweb.exceptions import ValidationError
from aurweb.models import PackageBase
def request(pkgbase: PackageBase,
type: str, comments: str, merge_into: str,
context: Dict[str, Any]) -> None:
if not comments:
raise ValidationError(["The comment field must not be empty."])
def request(
pkgbase: PackageBase,
type: str,
comments: str,
merge_into: str,
context: dict[str, Any],
) -> None:
# validate comment
comment(comments)
if type == "merge":
# Perform merge-related checks.
if not merge_into:
# TODO: This error needs to be translated.
raise ValidationError(
['The "Merge into" field must not be empty.'])
raise ValidationError(['The "Merge into" field must not be empty.'])
target = db.query(PackageBase).filter(
PackageBase.Name == merge_into
).first()
target = db.query(PackageBase).filter(PackageBase.Name == merge_into).first()
if not target:
# TODO: This error needs to be translated.
raise ValidationError([
"The package base you want to merge into does not exist."
])
raise ValidationError(
["The package base you want to merge into does not exist."]
)
db.refresh(target)
if target.ID == pkgbase.ID:
# TODO: This error needs to be translated.
raise ValidationError([
"You cannot merge a package base into itself."
])
raise ValidationError(["You cannot merge a package base into itself."])
def comment(comment: str):
if not comment:
raise ValidationError(["The comment field must not be empty."])
if len(comment) > config.getint("options", "max_chars_comment", 5000):
raise ValidationError(["Maximum number of characters for comment exceeded."])
def comment_raise_http_ex(comments: str):
try:
comment(comments)
except ValidationError as err:
raise HTTPException(
status_code=HTTPStatus.BAD_REQUEST,
detail=err.data[0],
)
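`comment_raise_http_ex` above bridges the form-validation layer to HTTP by converting a `ValidationError` into a 400 response. A self-contained sketch with stub exception classes standing in for aurweb's and FastAPI's (the 5000-character limit mirrors the config default shown above):

```python
from http import HTTPStatus

class ValidationError(Exception):
    def __init__(self, data: list[str]):
        self.data = data

class HTTPException(Exception):  # stand-in for fastapi.HTTPException
    def __init__(self, status_code: int, detail: str):
        self.status_code, self.detail = status_code, detail

MAX_CHARS = 5000  # mirrors config.getint("options", "max_chars_comment", 5000)

def comment(text: str) -> None:
    if not text:
        raise ValidationError(["The comment field must not be empty."])
    if len(text) > MAX_CHARS:
        raise ValidationError(["Maximum number of characters for comment exceeded."])

def comment_raise_http_ex(text: str) -> None:
    """Re-raise validation failures as a 400 for route handlers."""
    try:
        comment(text)
    except ValidationError as err:
        raise HTTPException(HTTPStatus.BAD_REQUEST, err.data[0])

try:
    comment_raise_http_ex("")
except HTTPException as exc:
    status, detail = exc.status_code, exc.detail
```

Keeping `comment()` exception-based lets both the template-rendering path (catch and display) and the API path (convert to 400) reuse one validator.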


@@ -1,26 +1,49 @@
from typing import Any, Callable, Dict, List, Optional
from typing import Any, Callable, Optional
from prometheus_client import Counter
from prometheus_client import Counter, Gauge
from prometheus_fastapi_instrumentator import Instrumentator
from prometheus_fastapi_instrumentator.metrics import Info
from starlette.routing import Match, Route
from aurweb import logging
from aurweb import aur_logging
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
_instrumentator = Instrumentator()
# Custom metrics
SEARCH_REQUESTS = Counter(
"aur_search_requests", "Number of search requests by cache hit/miss", ["cache"]
)
USERS = Gauge(
"aur_users", "Number of AUR users by type", ["type"], multiprocess_mode="livemax"
)
PACKAGES = Gauge(
"aur_packages",
"Number of AUR packages by state",
["state"],
multiprocess_mode="livemax",
)
REQUESTS = Gauge(
"aur_requests",
"Number of AUR requests by type and status",
["type", "status"],
multiprocess_mode="livemax",
)
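The new `Gauge` metrics above track labeled time series, one series per label value. A toy illustration of how labeled metrics fan out into separate series (this is not the prometheus_client API, just the idea, and the sample numbers are invented):

```python
class LabeledGauge:
    """Toy version of a labeled gauge: each label tuple owns its own value."""
    def __init__(self, name: str, labelnames: tuple[str, ...]):
        self.name, self.labelnames = name, labelnames
        self.values: dict[tuple[str, ...], float] = {}

    def labels(self, *labelvalues: str) -> "LabeledGauge":
        self._current = labelvalues  # select the series to operate on
        return self

    def set(self, value: float) -> None:
        self.values[self._current] = value

USERS = LabeledGauge("aur_users", ("type",))
USERS.labels("User").set(1200)
USERS.labels("Package Maintainer").set(58)
# Each label value gets its own time series, as with the real Gauge above.
```

The `multiprocess_mode="livemax"` argument in the real metrics concerns how worker processes merge their series; this sketch only shows the label fan-out.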
def instrumentator():
return _instrumentator
# FastAPI metrics
# Taken from https://github.com/stephenhillier/starlette_exporter
# Their license is included in LICENSES/starlette_exporter.
# The code has been modified to remove child route checks
# (since we don't have any) and to stay within an 80-width limit.
def get_matching_route_path(scope: Dict[Any, Any], routes: List[Route],
route_name: Optional[str] = None) -> str:
def get_matching_route_path(
scope: dict[Any, Any], routes: list[Route], route_name: Optional[str] = None
) -> str:
"""
Find a matching route and return its original path string
@@ -34,7 +57,7 @@ def get_matching_route_path(scope: Dict[Any, Any], routes: List[Route],
if match == Match.FULL:
route_name = route.path
'''
"""
# This path exists in the original function's code, but we
# don't need it (currently), so it's been removed to avoid
# useless test coverage.
@@ -47,7 +70,7 @@ def get_matching_route_path(scope: Dict[Any, Any], routes: List[Route],
route_name = None
else:
route_name += child_route_name
'''
"""
return route_name
elif match == Match.PARTIAL and route_name is None:
@@ -55,11 +78,16 @@ def get_matching_route_path(scope: Dict[Any, Any], routes: List[Route],
def http_requests_total() -> Callable[[Info], None]:
metric = Counter("http_requests_total",
"Number of HTTP requests.",
labelnames=("method", "path", "status"))
metric = Counter(
"http_requests_total",
"Number of HTTP requests.",
labelnames=("method", "path", "status"),
)
def instrumentation(info: Info) -> None:
if info.request.method.lower() in ("head", "options"): # pragma: no cover
return
scope = info.request.scope
# Taken from https://github.com/stephenhillier/starlette_exporter
@@ -70,11 +98,19 @@ def http_requests_total() -> Callable[[Info], None]:
if not (scope.get("endpoint", None) and scope.get("router", None)):
return None
root_path = scope.get("root_path", str())
app = scope.get("app", dict())
if hasattr(app, "root_path"):
app_root_path = getattr(app, "root_path")
if root_path.startswith(app_root_path):
root_path = root_path[len(app_root_path) :]
base_scope = {
"type": scope.get("type"),
"path": scope.get("root_path", "") + scope.get("path"),
"path": root_path + scope.get("path"),
"path_params": scope.get("path_params", {}),
"method": scope.get("method")
"method": scope.get("method"),
}
method = scope.get("method")
@@ -91,9 +127,13 @@ def http_api_requests_total() -> Callable[[Info], None]:
metric = Counter(
"http_api_requests",
"Number of times an RPC API type has been requested.",
labelnames=("type", "status"))
labelnames=("type", "status"),
)
def instrumentation(info: Info) -> None:
if info.request.method.lower() in ("head", "options"): # pragma: no cover
return
if info.request.url.path.rstrip("/") == "/rpc":
type = info.request.query_params.get("type", "None")
if info.response:


@@ -1,11 +1,12 @@
from fastapi import Request
from redis.client import Pipeline
from aurweb import config, db, logging, time
from aurweb import aur_logging, config, db, time
from aurweb.aur_redis import redis_connection
from aurweb.models import ApiRateLimit
from aurweb.redis import redis_connection
from aurweb.util import get_client_ip
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
def _update_ratelimit_redis(request: Request, pipeline: Pipeline):
@@ -13,7 +14,7 @@ def _update_ratelimit_redis(request: Request, pipeline: Pipeline):
now = time.utcnow()
time_to_delete = now - window_length
host = request.client.host
host = get_client_ip(request)
window_key = f"ratelimit-ws:{host}"
requests_key = f"ratelimit:{host}"
@@ -38,27 +39,33 @@ def _update_ratelimit_db(request: Request):
now = time.utcnow()
time_to_delete = now - window_length
records = db.query(ApiRateLimit).filter(
ApiRateLimit.WindowStart < time_to_delete)
with db.begin():
db.delete_all(records)
@db.retry_deadlock
def retry_delete(records: list[ApiRateLimit]) -> None:
with db.begin():
db.delete_all(records)
host = request.client.host
records = db.query(ApiRateLimit).filter(ApiRateLimit.WindowStart < time_to_delete)
retry_delete(records)
@db.retry_deadlock
def retry_create(record: ApiRateLimit, now: int, host: str) -> ApiRateLimit:
with db.begin():
if not record:
record = db.create(ApiRateLimit, WindowStart=now, IP=host, Requests=1)
else:
record.Requests += 1
return record
host = get_client_ip(request)
record = db.query(ApiRateLimit, ApiRateLimit.IP == host).first()
with db.begin():
if not record:
record = db.create(ApiRateLimit,
WindowStart=now,
IP=host, Requests=1)
else:
record.Requests += 1
record = retry_create(record, now, host)
logger.debug(record.Requests)
return record
def update_ratelimit(request: Request, pipeline: Pipeline):
""" Update the ratelimit stored in Redis or the database depending
"""Update the ratelimit stored in Redis or the database depending
on AUR_CONFIG's [options] cache setting.
This Redis-capable function is slightly different than most. If Redis
@@ -75,7 +82,7 @@ def update_ratelimit(request: Request, pipeline: Pipeline):
def check_ratelimit(request: Request):
""" Increment and check to see if request has exceeded their rate limit.
"""Increment and check to see if request has exceeded their rate limit.
:param request: FastAPI request
:returns: True if the request host has exceeded the rate limit else False
@@ -86,7 +93,7 @@ def check_ratelimit(request: Request):
record = update_ratelimit(request, pipeline)
# Get cache value, else None.
host = request.client.host
host = get_client_ip(request)
pipeline.get(f"ratelimit:{host}")
requests = pipeline.execute()[0]
@@ -94,7 +101,7 @@ def check_ratelimit(request: Request):
# valid cache value will be returned which must be converted
# to an int. Otherwise, use the database record returned
# by update_ratelimit.
if not config.getboolean("ratelimit", "cache"):
if not config.getboolean("ratelimit", "cache") or requests is None:
# If we got nothing from pipeline.get, we did not use
# the Redis path of logic: use the DB record's count.
requests = record.Requests


@@ -3,7 +3,19 @@ API routers for FastAPI.
See https://fastapi.tiangolo.com/tutorial/bigger-applications/
"""
from . import accounts, auth, html, packages, pkgbase, requests, rpc, rss, sso, trusted_user
from . import (
accounts,
auth,
html,
package_maintainer,
packages,
pkgbase,
requests,
rpc,
rss,
sso,
)
"""
aurweb application routes. This constant can be any iterable
@@ -17,7 +29,7 @@ APP_ROUTES = [
packages,
pkgbase,
requests,
trusted_user,
package_maintainer,
rss,
rpc,
sso,


@@ -1,17 +1,15 @@
import copy
import typing
from http import HTTPStatus
from typing import Any, Dict
from typing import Any
from fastapi import APIRouter, Form, Request
from fastapi import APIRouter, Form, HTTPException, Request
from fastapi.responses import HTMLResponse, RedirectResponse
from sqlalchemy import and_, or_
import aurweb.config
from aurweb import cookies, db, l10n, logging, models, util
from aurweb.auth import account_type_required, requires_auth, requires_guest
from aurweb import aur_logging, db, l10n, models, util
from aurweb.auth import account_type_required, creds, requires_auth, requires_guest
from aurweb.captcha import get_captcha_salts
from aurweb.exceptions import ValidationError, handle_form_exceptions
from aurweb.l10n import get_translator_for_request
@@ -24,7 +22,7 @@ from aurweb.users import update, validate
from aurweb.users.util import get_user_by_name
router = APIRouter()
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
@router.get("/passreset", response_class=HTMLResponse)
@@ -34,24 +32,27 @@ async def passreset(request: Request):
return render_template(request, "passreset.html", context)
@db.async_retry_deadlock
@router.post("/passreset", response_class=HTMLResponse)
@handle_form_exceptions
@requires_guest
async def passreset_post(request: Request,
user: str = Form(...),
resetkey: str = Form(default=None),
password: str = Form(default=None),
confirm: str = Form(default=None)):
async def passreset_post(
request: Request,
user: str = Form(...),
resetkey: str = Form(default=None),
password: str = Form(default=None),
confirm: str = Form(default=None),
):
context = await make_variable_context(request, "Password Reset")
# The user parameter being required, we can match against
criteria = or_(models.User.Username == user, models.User.Email == user)
db_user = db.query(models.User,
and_(criteria, models.User.Suspended == 0)).first()
db_user = db.query(models.User, and_(criteria, models.User.Suspended == 0)).first()
if db_user is None:
context["errors"] = ["Invalid e-mail."]
return render_template(request, "passreset.html", context,
status_code=HTTPStatus.NOT_FOUND)
return render_template(
request, "passreset.html", context, status_code=HTTPStatus.NOT_FOUND
)
db.refresh(db_user)
if resetkey:
@@ -59,29 +60,34 @@ async def passreset_post(request: Request,
if not db_user.ResetKey or resetkey != db_user.ResetKey:
context["errors"] = ["Invalid e-mail."]
return render_template(request, "passreset.html", context,
status_code=HTTPStatus.NOT_FOUND)
return render_template(
request, "passreset.html", context, status_code=HTTPStatus.NOT_FOUND
)
if not user or not password:
context["errors"] = ["Missing a required field."]
return render_template(request, "passreset.html", context,
status_code=HTTPStatus.BAD_REQUEST)
return render_template(
request, "passreset.html", context, status_code=HTTPStatus.BAD_REQUEST
)
if password != confirm:
# If the provided password does not match the provided confirm.
context["errors"] = ["Password fields do not match."]
return render_template(request, "passreset.html", context,
status_code=HTTPStatus.BAD_REQUEST)
return render_template(
request, "passreset.html", context, status_code=HTTPStatus.BAD_REQUEST
)
if len(password) < models.User.minimum_passwd_length():
# Translate the error here, which simplifies error output
# in the jinja2 template.
_ = get_translator_for_request(request)
context["errors"] = [_(
"Your password must be at least %s characters.") % (
str(models.User.minimum_passwd_length()))]
return render_template(request, "passreset.html", context,
status_code=HTTPStatus.BAD_REQUEST)
context["errors"] = [
_("Your password must be at least %s characters.")
% (str(models.User.minimum_passwd_length()))
]
return render_template(
request, "passreset.html", context, status_code=HTTPStatus.BAD_REQUEST
)
# We got to this point; everything matched up. Update the password
# and remove the ResetKey.
@@ -92,8 +98,9 @@ async def passreset_post(request: Request,
db_user.update_password(password)
# Render ?step=complete.
return RedirectResponse(url="/passreset?step=complete",
status_code=HTTPStatus.SEE_OTHER)
return RedirectResponse(
url="/passreset?step=complete", status_code=HTTPStatus.SEE_OTHER
)
# If we got here, we continue with issuing a resetkey for the user.
resetkey = generate_resetkey()
@@ -103,13 +110,13 @@ async def passreset_post(request: Request,
ResetKeyNotification(db_user.ID).send()
# Render ?step=confirm.
return RedirectResponse(url="/passreset?step=confirm",
status_code=HTTPStatus.SEE_OTHER)
return RedirectResponse(
url="/passreset?step=confirm", status_code=HTTPStatus.SEE_OTHER
)
def process_account_form(request: Request, user: models.User,
args: Dict[str, Any]):
""" Process an account form. All fields are optional and only checks
def process_account_form(request: Request, user: models.User, args: dict[str, Any]):
"""Process an account form. All fields are optional and only checks
requirements in the case they are present.
```
@@ -146,23 +153,22 @@ def process_account_form(request: Request, user: models.User,
validate.username_in_use,
validate.email_in_use,
validate.invalid_account_type,
validate.invalid_captcha
validate.invalid_captcha,
]
try:
for check in checks:
check(**args, request=request, user=user, _=_)
except ValidationError as exc:
return (False, exc.data)
return False, exc.data
return (True, [])
return True, []
def make_account_form_context(context: dict,
request: Request,
user: models.User,
args: dict):
""" Modify a FastAPI context and add attributes for the account form.
def make_account_form_context(
context: dict, request: Request, user: models.User, args: dict
):
"""Modify a FastAPI context and add attributes for the account form.
:param context: FastAPI context
:param request: FastAPI request
@@ -173,15 +179,17 @@ def make_account_form_context(context: dict,
# Do not modify the original context.
context = copy.copy(context)
context["account_types"] = list(filter(
lambda e: request.user.AccountTypeID >= e[0],
[
(at.USER_ID, f"Normal {at.USER}"),
(at.TRUSTED_USER_ID, at.TRUSTED_USER),
(at.DEVELOPER_ID, at.DEVELOPER),
(at.TRUSTED_USER_AND_DEV_ID, at.TRUSTED_USER_AND_DEV)
]
))
context["account_types"] = list(
filter(
lambda e: request.user.AccountTypeID >= e[0],
[
(at.USER_ID, f"Normal {at.USER}"),
(at.PACKAGE_MAINTAINER_ID, at.PACKAGE_MAINTAINER),
(at.DEVELOPER_ID, at.DEVELOPER),
(at.PACKAGE_MAINTAINER_AND_DEV_ID, at.PACKAGE_MAINTAINER_AND_DEV),
],
)
)
if request.user.is_authenticated():
context["username"] = args.get("U", user.Username)
@@ -201,6 +209,7 @@ def make_account_form_context(context: dict,
context["cn"] = args.get("CN", user.CommentNotify)
context["un"] = args.get("UN", user.UpdateNotify)
context["on"] = args.get("ON", user.OwnershipNotify)
context["hdc"] = args.get("HDC", user.HideDeletedComments)
context["inactive"] = args.get("J", user.InactivityTS != 0)
else:
context["username"] = args.get("U", str())
@@ -219,6 +228,7 @@ def make_account_form_context(context: dict,
context["cn"] = args.get("CN", True)
context["un"] = args.get("UN", False)
context["on"] = args.get("ON", True)
context["hdc"] = args.get("HDC", False)
context["inactive"] = args.get("J", False)
context["password"] = args.get("P", str())
@@ -229,59 +239,62 @@ def make_account_form_context(context: dict,
@router.get("/register", response_class=HTMLResponse)
@requires_guest
async def account_register(request: Request,
U: str = Form(default=str()), # Username
E: str = Form(default=str()), # Email
H: str = Form(default=False), # Hide Email
BE: str = Form(default=None), # Backup Email
R: str = Form(default=None), # Real Name
HP: str = Form(default=None), # Homepage
I: str = Form(default=None), # IRC Nick
K: str = Form(default=None), # PGP Key FP
L: str = Form(default=aurweb.config.get(
"options", "default_lang")),
TZ: str = Form(default=aurweb.config.get(
"options", "default_timezone")),
PK: str = Form(default=None),
CN: bool = Form(default=False), # Comment Notify
CU: bool = Form(default=False), # Update Notify
CO: bool = Form(default=False), # Owner Notify
captcha: str = Form(default=str())):
async def account_register(
request: Request,
U: str = Form(default=str()), # Username
E: str = Form(default=str()), # Email
H: str = Form(default=False), # Hide Email
BE: str = Form(default=None), # Backup Email
R: str = Form(default=None), # Real Name
HP: str = Form(default=None), # Homepage
I: str = Form(default=None), # IRC Nick
K: str = Form(default=None), # PGP Key FP
L: str = Form(default=aurweb.config.get("options", "default_lang")),
TZ: str = Form(default=aurweb.config.get("options", "default_timezone")),
PK: str = Form(default=None),
CN: bool = Form(default=False), # Comment Notify
CU: bool = Form(default=False), # Update Notify
CO: bool = Form(default=False), # Owner Notify
HDC: bool = Form(default=False), # Hide Deleted Comments
captcha: str = Form(default=str()),
):
context = await make_variable_context(request, "Register")
context["captcha_salt"] = get_captcha_salts()[0]
context = make_account_form_context(context, request, None, dict())
return render_template(request, "register.html", context)
@db.async_retry_deadlock
@router.post("/register", response_class=HTMLResponse)
@handle_form_exceptions
@requires_guest
async def account_register_post(request: Request,
U: str = Form(default=str()), # Username
E: str = Form(default=str()), # Email
H: str = Form(default=False), # Hide Email
BE: str = Form(default=None), # Backup Email
R: str = Form(default=''), # Real Name
HP: str = Form(default=None), # Homepage
I: str = Form(default=None), # IRC Nick
K: str = Form(default=None), # PGP Key
L: str = Form(default=aurweb.config.get(
"options", "default_lang")),
TZ: str = Form(default=aurweb.config.get(
"options", "default_timezone")),
PK: str = Form(default=str()), # SSH PubKey
CN: bool = Form(default=False),
UN: bool = Form(default=False),
ON: bool = Form(default=False),
captcha: str = Form(default=None),
captcha_salt: str = Form(...)):
async def account_register_post(
request: Request,
U: str = Form(default=str()), # Username
E: str = Form(default=str()), # Email
H: str = Form(default=False), # Hide Email
BE: str = Form(default=None), # Backup Email
R: str = Form(default=""), # Real Name
HP: str = Form(default=None), # Homepage
I: str = Form(default=None), # IRC Nick
K: str = Form(default=None), # PGP Key
L: str = Form(default=aurweb.config.get("options", "default_lang")),
TZ: str = Form(default=aurweb.config.get("options", "default_timezone")),
PK: str = Form(default=str()), # SSH PubKey
CN: bool = Form(default=False),
UN: bool = Form(default=False),
ON: bool = Form(default=False),
HDC: bool = Form(default=False),
captcha: str = Form(default=None),
captcha_salt: str = Form(...),
):
context = await make_variable_context(request, "Register")
args = dict(await request.form())
args["K"] = args.get("K", str()).replace(" ", "")
K = args.get("K")
# Force "H" into a boolean.
args["H"] = H = (args.get("H", str()) == "on")
args["H"] = H = args.get("H", str()) == "on"
context = make_account_form_context(context, request, None, args)
ok, errors = process_account_form(request, request.user, args)
@@ -289,42 +302,56 @@ async def account_register_post(request: Request,
# If the field values given do not meet the requirements,
# return HTTP 400 with an error.
context["errors"] = errors
return render_template(request, "register.html", context,
status_code=HTTPStatus.BAD_REQUEST)
return render_template(
request, "register.html", context, status_code=HTTPStatus.BAD_REQUEST
)
if not captcha:
context["errors"] = ["The CAPTCHA is missing."]
return render_template(request, "register.html", context,
status_code=HTTPStatus.BAD_REQUEST)
return render_template(
request, "register.html", context, status_code=HTTPStatus.BAD_REQUEST
)
# Create a user with no password with a resetkey, then send
# an email off about it.
resetkey = generate_resetkey()
# By default, we grab the User account type to associate with.
atype = db.query(models.AccountType,
models.AccountType.AccountType == "User").first()
atype = db.query(
models.AccountType, models.AccountType.AccountType == "User"
).first()
# Create a user given all parameters available.
with db.begin():
user = db.create(models.User, Username=U,
Email=E, HideEmail=H, BackupEmail=BE,
RealName=R, Homepage=HP, IRCNick=I, PGPKey=K,
LangPreference=L, Timezone=TZ, CommentNotify=CN,
UpdateNotify=UN, OwnershipNotify=ON,
ResetKey=resetkey, AccountType=atype)
user = db.create(
models.User,
Username=U,
Email=E,
HideEmail=H,
BackupEmail=BE,
RealName=R,
Homepage=HP,
IRCNick=I,
PGPKey=K,
LangPreference=L,
Timezone=TZ,
CommentNotify=CN,
UpdateNotify=UN,
OwnershipNotify=ON,
HideDeletedComments=HDC,
ResetKey=resetkey,
AccountType=atype,
)
# If a PK was given and either one does not exist or the given
# PK mismatches the existing user's SSHPubKey.PubKey.
if PK:
# Get the second element in the PK, which is the actual key.
keys = util.parse_ssh_keys(PK.strip())
for k in keys:
pk = " ".join(k)
fprint = get_fingerprint(pk)
with db.begin():
db.create(models.SSHPubKey, UserID=user.ID,
PubKey=pk, Fingerprint=fprint)
# If a PK was given and either one does not exist or the given
# PK mismatches the existing user's SSHPubKey.PubKey.
if PK:
# Get the second element in the PK, which is the actual key.
keys = util.parse_ssh_keys(PK.strip())
for k in keys:
pk = " ".join(k)
fprint = get_fingerprint(pk)
db.create(models.SSHPubKey, User=user, PubKey=pk, Fingerprint=fprint)
# Send a reset key notification to the new user.
WelcomeNotification(user.ID).send()
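The registration hunk above joins each parsed SSH key back into a single line and stores a per-key fingerprint via `get_fingerprint`. As a standalone illustration (not aurweb's implementation), the OpenSSH-style SHA256 fingerprint of a public key line can be computed with the standard library alone:

```python
import base64
import hashlib


def ssh_fingerprint(pubkey_line: str) -> str:
    """Return the OpenSSH-style SHA256 fingerprint of a public key line.

    A key line looks like "<type> <base64-blob> [comment]"; the fingerprint
    is the unpadded base64 of the SHA256 digest of the decoded blob.
    """
    parts = pubkey_line.strip().split()
    if len(parts) < 2:
        raise ValueError("not an OpenSSH public key line")
    blob = base64.b64decode(parts[1])
    digest = hashlib.sha256(blob).digest()
    return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")
```

This mirrors what `ssh-keygen -lf` prints for the same key; aurweb's own helper may differ in details.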
@@ -334,8 +361,9 @@ async def account_register_post(request: Request,
return render_template(request, "register.html", context)
def cannot_edit(request: Request, user: models.User) \
-> typing.Optional[RedirectResponse]:
def cannot_edit(
request: Request, user: models.User
) -> typing.Optional[RedirectResponse]:
"""
Decide if `request.user` cannot edit `user`.
@@ -346,6 +374,9 @@ def cannot_edit(request: Request, user: models.User) \
:param user: Target user to be edited
:return: RedirectResponse if approval != granted else None
"""
# raise 404 if user does not exist
if not user:
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
approved = request.user.can_edit_user(user)
if not approved and (to := "/"):
if user:
@@ -373,31 +404,32 @@ async def account_edit(request: Request, username: str):
@router.post("/account/{username}/edit", response_class=HTMLResponse)
@handle_form_exceptions
@requires_auth
async def account_edit_post(request: Request,
username: str,
U: str = Form(default=str()), # Username
J: bool = Form(default=False),
E: str = Form(default=str()), # Email
H: str = Form(default=False), # Hide Email
BE: str = Form(default=None), # Backup Email
R: str = Form(default=None), # Real Name
HP: str = Form(default=None), # Homepage
I: str = Form(default=None), # IRC Nick
K: str = Form(default=None), # PGP Key
L: str = Form(aurweb.config.get(
"options", "default_lang")),
TZ: str = Form(aurweb.config.get(
"options", "default_timezone")),
P: str = Form(default=str()), # New Password
C: str = Form(default=None), # Password Confirm
PK: str = Form(default=None), # PubKey
CN: bool = Form(default=False), # Comment Notify
UN: bool = Form(default=False), # Update Notify
ON: bool = Form(default=False), # Owner Notify
T: int = Form(default=None),
passwd: str = Form(default=str())):
user = db.query(models.User).filter(
models.User.Username == username).first()
async def account_edit_post(
request: Request,
username: str,
U: str = Form(default=str()), # Username
J: bool = Form(default=False),
E: str = Form(default=str()), # Email
H: str = Form(default=False), # Hide Email
BE: str = Form(default=None), # Backup Email
R: str = Form(default=None), # Real Name
HP: str = Form(default=None), # Homepage
I: str = Form(default=None), # IRC Nick
K: str = Form(default=None), # PGP Key
L: str = Form(aurweb.config.get("options", "default_lang")),
TZ: str = Form(aurweb.config.get("options", "default_timezone")),
P: str = Form(default=str()), # New Password
C: str = Form(default=None), # Password Confirm
S: bool = Form(default=False), # Suspended
PK: str = Form(default=None), # PubKey
CN: bool = Form(default=False), # Comment Notify
UN: bool = Form(default=False), # Update Notify
ON: bool = Form(default=False), # Owner Notify
HDC: bool = Form(default=False), # Hide Deleted Comments
T: int = Form(default=None),
passwd: str = Form(default=str()),
):
user = db.query(models.User).filter(models.User.Username == username).first()
response = cannot_edit(request, user)
if response:
return response
@@ -416,13 +448,15 @@ async def account_edit_post(request: Request,
if not passwd:
context["errors"] = ["Invalid password."]
return render_template(request, "account/edit.html", context,
status_code=HTTPStatus.BAD_REQUEST)
return render_template(
request, "account/edit.html", context, status_code=HTTPStatus.BAD_REQUEST
)
if not ok:
context["errors"] = errors
return render_template(request, "account/edit.html", context,
status_code=HTTPStatus.BAD_REQUEST)
return render_template(
request, "account/edit.html", context, status_code=HTTPStatus.BAD_REQUEST
)
updates = [
update.simple,
@@ -430,29 +464,29 @@ async def account_edit_post(request: Request,
update.timezone,
update.ssh_pubkey,
update.account_type,
update.password
update.password,
update.suspend,
]
# These update functions are all guarded by retry_deadlock;
# there's no need to guard this route itself.
for f in updates:
f(**args, request=request, user=user, context=context)
if not errors:
context["complete"] = True
# Update cookies with requests, in case they were changed.
response = render_template(request, "account/edit.html", context)
return cookies.update_response_cookies(request, response,
aurtz=TZ, aurlang=L)
return render_template(request, "account/edit.html", context)
@router.get("/account/{username}")
async def account(request: Request, username: str):
_ = l10n.get_translator_for_request(request)
context = await make_variable_context(
request, _("Account") + " " + username)
context = await make_variable_context(request, _("Account") + " " + username)
if not request.user.is_authenticated():
return render_template(request, "account/show.html", context,
status_code=HTTPStatus.UNAUTHORIZED)
return render_template(
request, "account/show.html", context, status_code=HTTPStatus.UNAUTHORIZED
)
# Get related User record, if possible.
user = get_user_by_name(username)
@@ -460,11 +494,10 @@ async def account(request: Request, username: str):
# Format PGPKey for display with a space between each 4 characters.
k = user.PGPKey or str()
context["pgp_key"] = " ".join([k[i:i + 4] for i in range(0, len(k), 4)])
context["pgp_key"] = " ".join([k[i : i + 4] for i in range(0, len(k), 4)])
login_ts = None
session = db.query(models.Session).filter(
models.Session.UsersID == user.ID).first()
session = db.query(models.Session).filter(models.Session.UsersID == user.ID).first()
if session:
login_ts = user.session.LastUpdateTS
context["login_ts"] = login_ts
@@ -480,15 +513,16 @@ async def account_comments(request: Request, username: str):
context = make_context(request, "Accounts")
context["username"] = username
context["comments"] = user.package_comments.order_by(
models.PackageComment.CommentTS.desc())
models.PackageComment.CommentTS.desc()
)
return render_template(request, "account/comments.html", context)
@router.get("/accounts")
@requires_auth
@account_type_required({at.TRUSTED_USER,
at.DEVELOPER,
at.TRUSTED_USER_AND_DEV})
@account_type_required(
{at.PACKAGE_MAINTAINER, at.DEVELOPER, at.PACKAGE_MAINTAINER_AND_DEV}
)
async def accounts(request: Request):
context = make_context(request, "Accounts")
return render_template(request, "account/search.html", context)
@@ -497,19 +531,21 @@ async def accounts(request: Request):
@router.post("/accounts")
@handle_form_exceptions
@requires_auth
@account_type_required({at.TRUSTED_USER,
at.DEVELOPER,
at.TRUSTED_USER_AND_DEV})
async def accounts_post(request: Request,
O: int = Form(default=0), # Offset
SB: str = Form(default=str()), # Sort By
U: str = Form(default=str()), # Username
T: str = Form(default=str()), # Account Type
S: bool = Form(default=False), # Suspended
E: str = Form(default=str()), # Email
R: str = Form(default=str()), # Real Name
I: str = Form(default=str()), # IRC Nick
K: str = Form(default=str())): # PGP Key
@account_type_required(
{at.PACKAGE_MAINTAINER, at.DEVELOPER, at.PACKAGE_MAINTAINER_AND_DEV}
)
async def accounts_post(
request: Request,
O: int = Form(default=0), # Offset
SB: str = Form(default=str()), # Sort By
U: str = Form(default=str()), # Username
T: str = Form(default=str()), # Account Type
S: bool = Form(default=False), # Suspended
E: str = Form(default=str()), # Email
R: str = Form(default=str()), # Real Name
I: str = Form(default=str()), # IRC Nick
K: str = Form(default=str()),
): # PGP Key
context = await make_variable_context(request, "Accounts")
context["pp"] = pp = 50 # Hits per page.
@@ -532,9 +568,9 @@ async def accounts_post(request: Request,
# Convert parameter T to an AccountType ID.
account_types = {
"u": at.USER_ID,
"t": at.TRUSTED_USER_ID,
"t": at.PACKAGE_MAINTAINER_ID,
"d": at.DEVELOPER_ID,
"td": at.TRUSTED_USER_AND_DEV_ID
"td": at.PACKAGE_MAINTAINER_AND_DEV_ID,
}
account_type_id = account_types.get(T, None)
@@ -545,7 +581,8 @@ async def accounts_post(request: Request,
# Populate this list with any additional statements to
# be ANDed together.
statements = [
v for k, v in [
v
for k, v in [
(account_type_id is not None, models.AccountType.ID == account_type_id),
(bool(U), models.User.Username.like(f"%{U}%")),
(bool(S), models.User.Suspended == S),
@@ -553,7 +590,8 @@ async def accounts_post(request: Request,
(bool(R), models.User.RealName.like(f"%{R}%")),
(bool(I), models.User.IRCNick.like(f"%{I}%")),
(bool(K), models.User.PGPKey.like(f"%{K}%")),
] if k
]
if k
]
# Filter the query by combining all statements added above into
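The `statements` comprehension in the hunk above keeps only the clauses whose guard is truthy before they are ANDed into the search query. The same pattern in plain Python, with strings standing in for SQLAlchemy expressions and illustrative field names:

```python
def build_filters(params: dict) -> list[str]:
    """Keep only the filter clauses whose guard is truthy.

    Same shape as the `statements` comprehension in the diff: a list of
    (guard, clause) pairs, filtered down to the active clauses.
    """
    U = params.get("U", "")
    S = params.get("S", False)
    E = params.get("E", "")
    pairs = [
        (bool(U), f"Username LIKE '%{U}%'"),
        (bool(S), "Suspended = 1"),
        (bool(E), f"Email LIKE '%{E}%'"),
    ]
    return [clause for guard, clause in pairs if guard]
```

With real SQLAlchemy expressions the guards matter because an expression like `models.User.Username.like(...)` is always truthy as an object, so the separate boolean guard decides inclusion.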
@@ -571,9 +609,79 @@ async def accounts_post(request: Request,
return render_template(request, "account/index.html", context)
def render_terms_of_service(request: Request,
context: dict,
terms: typing.Iterable):
@router.get("/account/{name}/delete")
@requires_auth
async def account_delete(request: Request, name: str):
user = db.query(models.User).filter(models.User.Username == name).first()
if not user:
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
has_cred = request.user.has_credential(creds.ACCOUNT_EDIT, approved=[user])
if not has_cred:
_ = l10n.get_translator_for_request(request)
raise HTTPException(
detail=_("You do not have permission to edit this account."),
status_code=HTTPStatus.UNAUTHORIZED,
)
context = make_context(request, "Accounts")
context["name"] = name
return render_template(request, "account/delete.html", context)
@db.async_retry_deadlock
@router.post("/account/{name}/delete")
@handle_form_exceptions
@requires_auth
async def account_delete_post(
request: Request,
name: str,
passwd: str = Form(default=str()),
confirm: bool = Form(default=False),
):
user = db.query(models.User).filter(models.User.Username == name).first()
if not user:
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
has_cred = request.user.has_credential(creds.ACCOUNT_EDIT, approved=[user])
if not has_cred:
_ = l10n.get_translator_for_request(request)
raise HTTPException(
detail=_("You do not have permission to edit this account."),
status_code=HTTPStatus.UNAUTHORIZED,
)
context = make_context(request, "Accounts")
context["name"] = name
confirm = util.strtobool(confirm)
if not confirm:
context["errors"] = [
"The account has not been deleted, check the confirmation checkbox."
]
return render_template(
request,
"account/delete.html",
context,
status_code=HTTPStatus.BAD_REQUEST,
)
if not request.user.valid_password(passwd):
context["errors"] = ["Invalid password."]
return render_template(
request,
"account/delete.html",
context,
status_code=HTTPStatus.BAD_REQUEST,
)
with db.begin():
db.delete(user)
return RedirectResponse("/", status_code=HTTPStatus.SEE_OTHER)
def render_terms_of_service(request: Request, context: dict, terms: typing.Iterable):
if not terms:
return RedirectResponse("/", status_code=HTTPStatus.SEE_OTHER)
context["unaccepted_terms"] = terms
@@ -585,14 +693,21 @@ def render_terms_of_service(request: Request,
async def terms_of_service(request: Request):
# Query the database for terms that were previously accepted,
# but now have a bumped Revision that needs to be accepted.
diffs = db.query(models.Term).join(models.AcceptedTerm).filter(
models.AcceptedTerm.Revision < models.Term.Revision).all()
diffs = (
db.query(models.Term)
.join(models.AcceptedTerm)
.filter(models.AcceptedTerm.Revision < models.Term.Revision)
.all()
)
# Query the database for any terms that have not yet been accepted.
unaccepted = db.query(models.Term).filter(
~models.Term.ID.in_(db.query(models.AcceptedTerm.TermsID))).all()
unaccepted = (
db.query(models.Term)
.filter(~models.Term.ID.in_(db.query(models.AcceptedTerm.TermsID)))
.all()
)
for record in (diffs + unaccepted):
for record in diffs + unaccepted:
db.refresh(record)
# Translate the 'Terms of Service' part of our page title.
@@ -604,19 +719,26 @@ async def terms_of_service(request: Request):
return render_terms_of_service(request, context, accept_needed)
@db.async_retry_deadlock
@router.post("/tos")
@handle_form_exceptions
@requires_auth
async def terms_of_service_post(request: Request,
accept: bool = Form(default=False)):
async def terms_of_service_post(request: Request, accept: bool = Form(default=False)):
# Query the database for terms that were previously accepted,
# but now have a bumped Revision that needs to be accepted.
diffs = db.query(models.Term).join(models.AcceptedTerm).filter(
models.AcceptedTerm.Revision < models.Term.Revision).all()
diffs = (
db.query(models.Term)
.join(models.AcceptedTerm)
.filter(models.AcceptedTerm.Revision < models.Term.Revision)
.all()
)
# Query the database for any terms that have not yet been accepted.
unaccepted = db.query(models.Term).filter(
~models.Term.ID.in_(db.query(models.AcceptedTerm.TermsID))).all()
unaccepted = (
db.query(models.Term)
.filter(~models.Term.ID.in_(db.query(models.AcceptedTerm.TermsID)))
.all()
)
if not accept:
# Translate the 'Terms of Service' part of our page title.
@@ -628,7 +750,8 @@ async def terms_of_service_post(request: Request,
# them instead of reiterating the process in terms_of_service.
accept_needed = sorted(unaccepted + diffs)
return render_terms_of_service(
request, context, util.apply_all(accept_needed, db.refresh))
request, context, util.apply_all(accept_needed, db.refresh)
)
with db.begin():
# For each term we found, query for the matching accepted term
@@ -636,13 +759,18 @@ async def terms_of_service_post(request: Request,
for term in diffs:
db.refresh(term)
accepted_term = request.user.accepted_terms.filter(
models.AcceptedTerm.TermsID == term.ID).first()
models.AcceptedTerm.TermsID == term.ID
).first()
accepted_term.Revision = term.Revision
# For each term that was never accepted, accept it!
for term in unaccepted:
db.refresh(term)
db.create(models.AcceptedTerm, User=request.user,
Term=term, Revision=term.Revision)
db.create(
models.AcceptedTerm,
User=request.user,
Term=term,
Revision=term.Revision,
)
return RedirectResponse("/", status_code=HTTPStatus.SEE_OTHER)
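Several routes in this diff are wrapped in `db.retry_deadlock` / `db.async_retry_deadlock` so that a transaction killed by the database as a deadlock victim is simply re-run. A minimal sketch of that retry pattern, assuming a driver that signals deadlocks with a dedicated exception (`DeadlockError` here is a stand-in, not aurweb's actual exception type):

```python
import functools
import random
import time


class DeadlockError(Exception):
    """Stand-in for a database driver's deadlock exception."""


def retry_deadlock(func, attempts: int = 3):
    """Re-run a callable when the database reports a deadlock.

    The real aurweb decorators wrap the driver-specific exception and
    retry the whole transaction; this sketch shows only the control flow.
    """
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        for attempt in range(attempts):
            try:
                return func(*args, **kwargs)
            except DeadlockError:
                if attempt == attempts - 1:
                    raise
                # Back off briefly before retrying the transaction.
                time.sleep(random.uniform(0, 0.05 * (attempt + 1)))
    return wrapper
```

Because the retry re-executes the wrapped function from the top, the function must be safe to run more than once, which is why the diff notes that routes whose update helpers are already guarded need no guard of their own.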


@@ -5,8 +5,7 @@ from fastapi.responses import HTMLResponse, RedirectResponse
from sqlalchemy import or_
import aurweb.config
from aurweb import cookies, db, time
from aurweb import cookies, db
from aurweb.auth import requires_auth, requires_guest
from aurweb.exceptions import handle_form_exceptions
from aurweb.l10n import get_translator_for_request
@@ -17,7 +16,7 @@ router = APIRouter()
async def login_template(request: Request, next: str, errors: list = None):
""" Provide login-specific template context to render_template. """
"""Provide login-specific template context to render_template."""
context = await make_variable_context(request, "Login", next)
context["errors"] = errors
context["url_base"] = f"{request.url.scheme}://{request.url.netloc}"
@@ -29,77 +28,95 @@ async def login_get(request: Request, next: str = "/"):
return await login_template(request, next)
@db.retry_deadlock
def _retry_login(request: Request, user: User, passwd: str) -> str:
return user.login(request, passwd)
@router.post("/login", response_class=HTMLResponse)
@handle_form_exceptions
@requires_guest
async def login_post(request: Request,
next: str = Form(...),
user: str = Form(default=str()),
passwd: str = Form(default=str()),
remember_me: bool = Form(default=False)):
async def login_post(
request: Request,
next: str = Form(...),
user: str = Form(default=str()),
passwd: str = Form(default=str()),
remember_me: bool = Form(default=False),
):
# TODO: Once the Origin header gets broader adoption, this code can be
# slightly simplified to use it.
login_path = aurweb.config.get("options", "aur_location") + "/login"
referer = request.headers.get("Referer")
if not referer or not referer.startswith(login_path):
_ = get_translator_for_request(request)
raise HTTPException(status_code=HTTPStatus.BAD_REQUEST,
detail=_("Bad Referer header."))
with db.begin():
user = db.query(User).filter(
or_(User.Username == user, User.Email == user)
).first()
raise HTTPException(
status_code=HTTPStatus.BAD_REQUEST, detail=_("Bad Referer header.")
)
user = (
db.query(User)
.filter(
or_(
User.Username == user,
User.Email == user,
)
)
.first()
)
if not user:
return await login_template(request, next,
errors=["Bad username or password."])
return await login_template(request, next, errors=["Bad username or password."])
if user.Suspended:
return await login_template(request, next,
errors=["Account Suspended"])
return await login_template(request, next, errors=["Account Suspended"])
cookie_timeout = cookies.timeout(remember_me)
sid = user.login(request, passwd, cookie_timeout)
# If "remember me" was not ticked, we set a session cookie for AURSID,
# otherwise we make it a persistent cookie
cookie_timeout = None
if remember_me:
cookie_timeout = aurweb.config.getint("options", "persistent_cookie_timeout")
perma_timeout = aurweb.config.getint("options", "permanent_cookie_timeout")
sid = _retry_login(request, user, passwd)
if not sid:
return await login_template(request, next,
errors=["Bad username or password."])
return await login_template(request, next, errors=["Bad username or password."])
login_timeout = aurweb.config.getint("options", "login_timeout")
expires_at = int(time.utcnow() + max(cookie_timeout, login_timeout))
response = RedirectResponse(url=next,
status_code=HTTPStatus.SEE_OTHER)
response = RedirectResponse(url=next, status_code=HTTPStatus.SEE_OTHER)
secure = aurweb.config.getboolean("options", "disable_http_login")
response.set_cookie("AURSID", sid, expires=expires_at,
secure=secure, httponly=secure,
samesite=cookies.samesite())
response.set_cookie("AURTZ", user.Timezone,
secure=secure, httponly=secure,
samesite=cookies.samesite())
response.set_cookie("AURLANG", user.LangPreference,
secure=secure, httponly=secure,
samesite=cookies.samesite())
response.set_cookie("AURREMEMBER", remember_me,
expires=expires_at,
secure=secure, httponly=secure,
samesite=cookies.samesite())
response.set_cookie(
"AURSID",
sid,
max_age=cookie_timeout,
secure=secure,
httponly=secure,
samesite=cookies.samesite(),
)
response.set_cookie(
"AURREMEMBER",
remember_me,
max_age=perma_timeout,
secure=secure,
httponly=secure,
samesite=cookies.samesite(),
)
return response
@db.retry_deadlock
def _retry_logout(request: Request) -> None:
request.user.logout(request)
@router.post("/logout")
@handle_form_exceptions
@requires_auth
async def logout(request: Request, next: str = Form(default="/")):
if request.user.is_authenticated():
request.user.logout(request)
_retry_logout(request)
# Use 303 since we may be handling a post request, that'll get it
# to redirect to a get request.
response = RedirectResponse(url=next,
status_code=HTTPStatus.SEE_OTHER)
response = RedirectResponse(url=next, status_code=HTTPStatus.SEE_OTHER)
response.delete_cookie("AURSID")
response.delete_cookie("AURTZ")
response.delete_cookie("AURREMEMBER")
return response
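The login hunk above replaces absolute `expires` timestamps with a relative `max_age`, and omits it entirely when "remember me" is unticked so that AURSID becomes a plain session cookie the browser drops on exit. A self-contained sketch of that decision using the standard library (the cookie name is kept; the 30-day default is illustrative, not aurweb's configured timeout):

```python
from http.cookies import SimpleCookie


def make_sid_cookie(sid: str, remember_me: bool,
                    persistent_timeout: int = 2592000) -> str:
    """Build a Set-Cookie header value for a session id.

    With "remember me" the cookie gets a relative Max-Age; without it,
    no Max-Age is set, so it lives only for the browser session.
    """
    cookie = SimpleCookie()
    cookie["AURSID"] = sid
    cookie["AURSID"]["httponly"] = True
    cookie["AURSID"]["samesite"] = "Lax"
    if remember_me:
        cookie["AURSID"]["max-age"] = persistent_timeout
    return cookie["AURSID"].OutputString()
```

Relative `Max-Age` also sidesteps clock skew between server and client, which absolute `Expires` timestamps are vulnerable to.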


@@ -1,43 +1,48 @@
""" AURWeb's primary routing module. Define all routes via @app.app.{get,post}
decorators in some way; more complex routes should be defined in their
own modules and imported here. """
import os
import os
from http import HTTPStatus
from fastapi import APIRouter, Form, HTTPException, Request, Response
from fastapi.responses import HTMLResponse, RedirectResponse
from prometheus_client import CONTENT_TYPE_LATEST, CollectorRegistry, generate_latest, multiprocess
from sqlalchemy import and_, case, or_
from prometheus_client import (
CONTENT_TYPE_LATEST,
CollectorRegistry,
generate_latest,
multiprocess,
)
from sqlalchemy import case, or_
import aurweb.config
import aurweb.models.package_request
from aurweb import cookies, db, logging, models, time, util
from aurweb.cache import db_count_cache
from aurweb import aur_logging, cookies, db, models, statistics, time, util
from aurweb.exceptions import handle_form_exceptions
from aurweb.models.account_type import TRUSTED_USER_AND_DEV_ID, TRUSTED_USER_ID
from aurweb.models.package_request import PENDING_ID
from aurweb.packages.util import query_notified, query_voted, updated_packages
from aurweb.templates import make_context, render_template
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
router = APIRouter()
@router.get("/favicon.ico")
async def favicon(request: Request):
""" Some browsers attempt to find a website's favicon via root uri at
/favicon.ico, so provide a redirection here to our static icon. """
"""Some browsers attempt to find a website's favicon via root uri at
/favicon.ico, so provide a redirection here to our static icon."""
return RedirectResponse("/static/images/favicon.ico")
@db.async_retry_deadlock
@router.post("/language", response_class=RedirectResponse)
@handle_form_exceptions
async def language(request: Request,
set_lang: str = Form(...),
next: str = Form(...),
q: str = Form(default=None)):
async def language(
request: Request,
set_lang: str = Form(...),
next: str = Form(...),
q: str = Form(default=None),
):
"""
A POST route used to set a session's language.
@@ -45,92 +50,48 @@ async def language(request: Request,
setting the language on any page, we want to preserve query
parameters across the redirect.
"""
if next[0] != '/':
if next[0] != "/":
return HTMLResponse(b"Invalid 'next' parameter.", status_code=400)
query_string = "?" + q if q else str()
response = RedirectResponse(
url=f"{next}{query_string}", status_code=HTTPStatus.SEE_OTHER
)
# If the user is authenticated, update the user's LangPreference.
# Otherwise set an AURLANG cookie
if request.user.is_authenticated():
with db.begin():
request.user.LangPreference = set_lang
else:
secure = aurweb.config.getboolean("options", "disable_http_login")
perma_timeout = aurweb.config.getint("options", "permanent_cookie_timeout")
response.set_cookie(
"AURLANG",
set_lang,
secure=secure,
httponly=secure,
max_age=perma_timeout,
samesite=cookies.samesite(),
)
# In any case, set the response's AURLANG cookie that never expires.
response = RedirectResponse(url=f"{next}{query_string}",
status_code=HTTPStatus.SEE_OTHER)
secure = aurweb.config.getboolean("options", "disable_http_login")
response.set_cookie("AURLANG", set_lang,
secure=secure, httponly=secure,
samesite=cookies.samesite())
return response
@router.get("/", response_class=HTMLResponse)
async def index(request: Request):
""" Homepage route. """
"""Homepage route."""
context = make_context(request, "Home")
context['ssh_fingerprints'] = util.get_ssh_fingerprints()
context["ssh_fingerprints"] = util.get_ssh_fingerprints()
bases = db.query(models.PackageBase)
redis = aurweb.redis.redis_connection()
cache_expire = 300 # Five minutes.
cache_expire = aurweb.config.getint("cache", "expiry_time_statistics", 300)
# Package statistics.
query = bases.filter(models.PackageBase.PackagerUID.isnot(None))
context["package_count"] = await db_count_cache(
redis, "package_count", query, expire=cache_expire)
query = bases.filter(
and_(models.PackageBase.MaintainerUID.is_(None),
models.PackageBase.PackagerUID.isnot(None))
)
context["orphan_count"] = await db_count_cache(
redis, "orphan_count", query, expire=cache_expire)
query = db.query(models.User)
context["user_count"] = await db_count_cache(
redis, "user_count", query, expire=cache_expire)
query = query.filter(
or_(models.User.AccountTypeID == TRUSTED_USER_ID,
models.User.AccountTypeID == TRUSTED_USER_AND_DEV_ID))
context["trusted_user_count"] = await db_count_cache(
redis, "trusted_user_count", query, expire=cache_expire)
# Current timestamp.
now = time.utcnow()
seven_days = 86400 * 7 # Seven days worth of seconds.
seven_days_ago = now - seven_days
one_hour = 3600
updated = bases.filter(
and_(models.PackageBase.ModifiedTS - models.PackageBase.SubmittedTS >= one_hour,
models.PackageBase.PackagerUID.isnot(None))
)
query = bases.filter(
and_(models.PackageBase.SubmittedTS >= seven_days_ago,
models.PackageBase.PackagerUID.isnot(None))
)
context["seven_days_old_added"] = await db_count_cache(
redis, "seven_days_old_added", query, expire=cache_expire)
query = updated.filter(models.PackageBase.ModifiedTS >= seven_days_ago)
context["seven_days_old_updated"] = await db_count_cache(
redis, "seven_days_old_updated", query, expire=cache_expire)
year = seven_days * 52 # Fifty two weeks worth: one year.
year_ago = now - year
query = updated.filter(models.PackageBase.ModifiedTS >= year_ago)
context["year_old_updated"] = await db_count_cache(
redis, "year_old_updated", query, expire=cache_expire)
query = bases.filter(
models.PackageBase.ModifiedTS - models.PackageBase.SubmittedTS < 3600)
context["never_updated"] = await db_count_cache(
redis, "never_updated", query, expire=cache_expire)
counts = statistics.get_homepage_counts()
for k in counts:
context[k] = counts[k]
# Get the 15 most recently updated packages.
context["package_updates"] = updated_packages(15, cache_expire)
@@ -140,78 +101,92 @@ async def index(request: Request):
# the dashboard display.
packages = db.query(models.Package).join(models.PackageBase)
maintained = (
packages.join(
models.PackageComaintainer,
models.PackageComaintainer.PackageBaseID == models.PackageBase.ID,
isouter=True,
)
.join(
models.User,
or_(
models.PackageBase.MaintainerUID == models.User.ID,
models.PackageComaintainer.UsersID == models.User.ID,
),
)
.filter(models.User.ID == request.user.ID)
)
# Packages maintained by the user that have been flagged.
context["flagged_packages"] = (
maintained.filter(models.PackageBase.OutOfDateTS.isnot(None))
.order_by(models.PackageBase.ModifiedTS.desc(), models.Package.Name.asc())
.limit(50)
.all()
)
# Flagged packages that request.user has voted for.
context["flagged_packages_voted"] = query_voted(
context.get("flagged_packages"), request.user
)
# Flagged packages that request.user is being notified about.
context["flagged_packages_notified"] = query_notified(
context.get("flagged_packages"), request.user
)
archive_time = aurweb.config.getint("options", "request_archive_time")
start = time.utcnow() - archive_time
# Package requests created by request.user.
context["package_requests"] = (
request.user.package_requests.filter(
models.PackageRequest.RequestTS >= start
)
.order_by(
# Order primarily by the Status column being PENDING_ID,
# and secondarily by RequestTS; both in descending order.
case([(models.PackageRequest.Status == PENDING_ID, 1)], else_=0).desc(),
models.PackageRequest.RequestTS.desc(),
)
.limit(50)
.all()
)
# Packages that the request user maintains or comaintains.
context["packages"] = (
maintained.filter(models.User.ID == models.PackageBase.MaintainerUID)
.order_by(models.PackageBase.ModifiedTS.desc(), models.Package.Name.desc())
.limit(50)
.all()
)
# Packages that request.user has voted for.
context["packages_voted"] = query_voted(context.get("packages"), request.user)
# Packages that request.user is being notified about.
context["packages_notified"] = query_notified(
context.get("packages"), request.user
)
# Any packages that the request user comaintains.
context["comaintained"] = (
packages.join(models.PackageComaintainer)
.filter(models.PackageComaintainer.UsersID == request.user.ID)
.order_by(models.PackageBase.ModifiedTS.desc(), models.Package.Name.desc())
.limit(50)
.all()
)
# Comaintained packages that request.user has voted for.
context["comaintained_voted"] = query_voted(
context.get("comaintained"), request.user
)
# Comaintained packages that request.user is being notified about.
context["comaintained_notified"] = query_notified(
context.get("comaintained"), request.user
)
return render_template(request, "index.html", context)
@@ -232,16 +207,18 @@ async def archive_sha256(request: Request, archive: str):
@router.get("/metrics")
async def metrics(request: Request):
if not os.environ.get("PROMETHEUS_MULTIPROC_DIR", None):
return Response(
"Prometheus metrics are not enabled.",
status_code=HTTPStatus.SERVICE_UNAVAILABLE,
)
# update prometheus gauges for packages and users
statistics.update_prometheus_metrics()
registry = CollectorRegistry()
multiprocess.MultiProcessCollector(registry)
data = generate_latest(registry)
headers = {"Content-Type": CONTENT_TYPE_LATEST, "Content-Length": str(len(data))}
return Response(data, headers=headers)
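The dashboard counts above all flow through db_count_cache, which memoizes a query's count under a Redis key with an expiry (cache_expire). A minimal sketch of that pattern, with a plain dict standing in for the Redis connection and a callable standing in for query.count() (count_cache and _fake_cache are hypothetical names, not aurweb's API):

```python
import time

# In-memory stand-in for the Redis connection used by db_count_cache.
_fake_cache: dict = {}


def count_cache(key, count_fn, expire=300):
    """Return a cached count for `key`, recomputing via count_fn() on a miss."""
    now = time.monotonic()
    entry = _fake_cache.get(key)
    if entry is not None and entry[1] > now:
        return entry[0]  # cache hit: skip the expensive COUNT query
    value = count_fn()
    _fake_cache[key] = (value, now + expire)
    return value
```

The real helper additionally serializes through Redis so the cache is shared across worker processes; the control flow is the same.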


@@ -0,0 +1,394 @@
import html
import typing
from http import HTTPStatus
from typing import Any
from fastapi import APIRouter, Form, HTTPException, Request
from fastapi.responses import RedirectResponse, Response
from sqlalchemy import and_, func, or_
from aurweb import aur_logging, db, l10n, models, time
from aurweb.auth import creds, requires_auth
from aurweb.exceptions import handle_form_exceptions
from aurweb.models import User
from aurweb.models.account_type import (
PACKAGE_MAINTAINER_AND_DEV_ID,
PACKAGE_MAINTAINER_ID,
)
from aurweb.templates import make_context, make_variable_context, render_template
router = APIRouter()
logger = aur_logging.get_logger(__name__)
# Some PM route specific constants.
ITEMS_PER_PAGE = 10 # Paged table size.
MAX_AGENDA_LENGTH = 75 # Agenda table column length.
ADDVOTE_SPECIFICS = {
# This dict stores a vote duration and quorum for a proposal.
# When a proposal is added, duration is added to the current
# timestamp.
# "addvote_type": (duration, quorum)
"add_pm": (7 * 24 * 60 * 60, 0.66),
"remove_pm": (7 * 24 * 60 * 60, 0.75),
"remove_inactive_pm": (5 * 24 * 60 * 60, 0.66),
"bylaws": (7 * 24 * 60 * 60, 0.75),
}
def populate_package_maintainer_counts(context: dict[str, Any]) -> None:
pm_query = db.query(User).filter(
or_(
User.AccountTypeID == PACKAGE_MAINTAINER_ID,
User.AccountTypeID == PACKAGE_MAINTAINER_AND_DEV_ID,
)
)
context["package_maintainer_count"] = pm_query.count()
# In case any records have a None InactivityTS.
active_pm_query = pm_query.filter(
or_(User.InactivityTS.is_(None), User.InactivityTS == 0)
)
context["active_package_maintainer_count"] = active_pm_query.count()
@router.get("/package-maintainer")
@requires_auth
async def package_maintainer(
request: Request,
coff: int = 0, # current offset
cby: str = "desc", # current by
poff: int = 0, # past offset
pby: str = "desc",
): # past by
"""Proposal listings."""
if not request.user.has_credential(creds.PM_LIST_VOTES):
return RedirectResponse("/", status_code=HTTPStatus.SEE_OTHER)
context = make_context(request, "Package Maintainer")
current_by, past_by = cby, pby
current_off, past_off = coff, poff
context["pp"] = pp = ITEMS_PER_PAGE
context["prev_len"] = MAX_AGENDA_LENGTH
ts = time.utcnow()
if current_by not in {"asc", "desc"}:
# If a malicious by was given, default to desc.
current_by = "desc"
context["current_by"] = current_by
if past_by not in {"asc", "desc"}:
# If a malicious by was given, default to desc.
past_by = "desc"
context["past_by"] = past_by
current_votes = (
db.query(models.VoteInfo)
.filter(models.VoteInfo.End > ts)
.order_by(models.VoteInfo.Submitted.desc())
)
context["current_votes_count"] = current_votes.count()
current_votes = current_votes.limit(pp).offset(current_off)
context["current_votes"] = (
reversed(current_votes.all()) if current_by == "asc" else current_votes.all()
)
context["current_off"] = current_off
past_votes = (
db.query(models.VoteInfo)
.filter(models.VoteInfo.End <= ts)
.order_by(models.VoteInfo.Submitted.desc())
)
context["past_votes_count"] = past_votes.count()
past_votes = past_votes.limit(pp).offset(past_off)
context["past_votes"] = (
reversed(past_votes.all()) if past_by == "asc" else past_votes.all()
)
context["past_off"] = past_off
last_vote = func.max(models.Vote.VoteID).label("LastVote")
last_votes_by_pm = (
db.query(models.Vote)
.join(models.User)
.join(models.VoteInfo, models.VoteInfo.ID == models.Vote.VoteID)
.filter(
and_(
models.Vote.VoteID == models.VoteInfo.ID,
models.User.ID == models.Vote.UserID,
models.VoteInfo.End < ts,
or_(models.User.AccountTypeID == 2, models.User.AccountTypeID == 4),
)
)
.with_entities(models.Vote.UserID, last_vote, models.User.Username)
.group_by(models.Vote.UserID)
.order_by(last_vote.desc(), models.User.Username.asc())
)
context["last_votes_by_pm"] = last_votes_by_pm.all()
context["current_by_next"] = "asc" if current_by == "desc" else "desc"
context["past_by_next"] = "asc" if past_by == "desc" else "desc"
populate_package_maintainer_counts(context)
context["q"] = {
"coff": current_off,
"cby": current_by,
"poff": past_off,
"pby": past_by,
}
return render_template(request, "package-maintainer/index.html", context)
def render_proposal(
request: Request,
context: dict,
proposal: int,
voteinfo: models.VoteInfo,
voters: typing.Iterable[models.User],
vote: models.Vote,
status_code: HTTPStatus = HTTPStatus.OK,
):
"""Render a single PM proposal."""
context["proposal"] = proposal
context["voteinfo"] = voteinfo
context["voters"] = voters.all()
total = voteinfo.total_votes()
participation = (total / voteinfo.ActiveUsers) if voteinfo.ActiveUsers else 0
context["participation"] = participation
accepted = (voteinfo.Yes > voteinfo.ActiveUsers / 2) or (
participation > voteinfo.Quorum and voteinfo.Yes > voteinfo.No
)
context["accepted"] = accepted
can_vote = voters.filter(models.Vote.User == request.user).first() is None
context["can_vote"] = can_vote
if not voteinfo.is_running():
context["error"] = "Voting is closed for this proposal."
context["vote"] = vote
context["has_voted"] = vote is not None
return render_template(
request, "package-maintainer/show.html", context, status_code=status_code
)
@router.get("/package-maintainer/{proposal}")
@requires_auth
async def package_maintainer_proposal(request: Request, proposal: int):
if not request.user.has_credential(creds.PM_LIST_VOTES):
return RedirectResponse("/package-maintainer", status_code=HTTPStatus.SEE_OTHER)
context = await make_variable_context(request, "Package Maintainer")
proposal = int(proposal)
voteinfo = db.query(models.VoteInfo).filter(models.VoteInfo.ID == proposal).first()
if not voteinfo:
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
voters = (
db.query(models.User)
.join(models.Vote)
.filter(models.Vote.VoteID == voteinfo.ID)
)
vote = (
db.query(models.Vote)
.filter(
and_(
models.Vote.UserID == request.user.ID,
models.Vote.VoteID == voteinfo.ID,
)
)
.first()
)
if not request.user.has_credential(creds.PM_VOTE):
context["error"] = "Only Package Maintainers are allowed to vote."
if voteinfo.User == request.user.Username:
context["error"] = "You cannot vote in a proposal about you."
elif vote is not None:
context["error"] = "You've already voted for this proposal."
context["vote"] = vote
return render_proposal(request, context, proposal, voteinfo, voters, vote)
@db.async_retry_deadlock
@router.post("/package-maintainer/{proposal}")
@handle_form_exceptions
@requires_auth
async def package_maintainer_proposal_post(
request: Request, proposal: int, decision: str = Form(...)
):
if not request.user.has_credential(creds.PM_LIST_VOTES):
return RedirectResponse("/package-maintainer", status_code=HTTPStatus.SEE_OTHER)
context = await make_variable_context(request, "Package Maintainer")
proposal = int(proposal) # Make sure it's an int.
voteinfo = db.query(models.VoteInfo).filter(models.VoteInfo.ID == proposal).first()
if not voteinfo:
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
voters = (
db.query(models.User)
.join(models.Vote)
.filter(models.Vote.VoteID == voteinfo.ID)
)
vote = (
db.query(models.Vote)
.filter(
and_(
models.Vote.UserID == request.user.ID,
models.Vote.VoteID == voteinfo.ID,
)
)
.first()
)
status_code = HTTPStatus.OK
if not request.user.has_credential(creds.PM_VOTE):
context["error"] = "Only Package Maintainers are allowed to vote."
status_code = HTTPStatus.UNAUTHORIZED
elif voteinfo.User == request.user.Username:
context["error"] = "You cannot vote in a proposal about you."
status_code = HTTPStatus.BAD_REQUEST
elif vote is not None:
context["error"] = "You've already voted for this proposal."
status_code = HTTPStatus.BAD_REQUEST
if status_code != HTTPStatus.OK:
return render_proposal(
request, context, proposal, voteinfo, voters, vote, status_code=status_code
)
with db.begin():
if decision in {"Yes", "No", "Abstain"}:
# Increment whichever decision was given to us.
setattr(voteinfo, decision, getattr(voteinfo, decision) + 1)
else:
return Response(
"Invalid 'decision' value.", status_code=HTTPStatus.BAD_REQUEST
)
vote = db.create(models.Vote, User=request.user, VoteInfo=voteinfo)
context["error"] = "You've already voted for this proposal."
return render_proposal(request, context, proposal, voteinfo, voters, vote)
@router.get("/addvote")
@requires_auth
async def package_maintainer_addvote(
request: Request, user: str = str(), type: str = "add_pm", agenda: str = str()
):
if not request.user.has_credential(creds.PM_ADD_VOTE):
return RedirectResponse("/package-maintainer", status_code=HTTPStatus.SEE_OTHER)
context = await make_variable_context(request, "Add Proposal")
if type not in ADDVOTE_SPECIFICS:
context["error"] = "Invalid type."
type = "add_pm" # Default it.
context["user"] = user
context["type"] = type
context["agenda"] = agenda
return render_template(request, "addvote.html", context)
@db.async_retry_deadlock
@router.post("/addvote")
@handle_form_exceptions
@requires_auth
async def package_maintainer_addvote_post(
request: Request,
user: str = Form(default=str()),
type: str = Form(default=str()),
agenda: str = Form(default=str()),
):
if not request.user.has_credential(creds.PM_ADD_VOTE):
return RedirectResponse("/package-maintainer", status_code=HTTPStatus.SEE_OTHER)
# Build a context.
context = await make_variable_context(request, "Add Proposal")
context["type"] = type
context["user"] = user
context["agenda"] = agenda
def render_addvote(context, status_code):
"""Simplify render_template a bit for this test."""
return render_template(request, "addvote.html", context, status_code)
# Alright, get some database records, if we can.
if type != "bylaws":
user_record = db.query(models.User).filter(models.User.Username == user).first()
if user_record is None:
context["error"] = "Username does not exist."
return render_addvote(context, HTTPStatus.NOT_FOUND)
utcnow = time.utcnow()
voteinfo = (
db.query(models.VoteInfo)
.filter(and_(models.VoteInfo.User == user, models.VoteInfo.End > utcnow))
.count()
)
if voteinfo:
_ = l10n.get_translator_for_request(request)
context["error"] = _("%s already has a proposal running for them.") % (
html.escape(user),
)
return render_addvote(context, HTTPStatus.BAD_REQUEST)
if type not in ADDVOTE_SPECIFICS:
context["error"] = "Invalid type."
context["type"] = type = "add_pm" # Default for rendering.
return render_addvote(context, HTTPStatus.BAD_REQUEST)
if not agenda:
context["error"] = "Proposal cannot be empty."
return render_addvote(context, HTTPStatus.BAD_REQUEST)
# Gather some mapped constants and the current timestamp.
duration, quorum = ADDVOTE_SPECIFICS.get(type)
timestamp = time.utcnow()
# Active PM types we filter for.
types = {PACKAGE_MAINTAINER_ID, PACKAGE_MAINTAINER_AND_DEV_ID}
# Create a new VoteInfo (proposal)!
with db.begin():
active_pms = (
db.query(User)
.filter(
and_(
User.Suspended == 0,
User.InactivityTS.isnot(None),
User.AccountTypeID.in_(types),
)
)
.count()
)
voteinfo = db.create(
models.VoteInfo,
User=user,
Agenda=html.escape(agenda),
Submitted=timestamp,
End=(timestamp + duration),
Quorum=quorum,
ActiveUsers=active_pms,
Submitter=request.user,
)
# Redirect to the new proposal.
endpoint = f"/package-maintainer/{voteinfo.ID}"
return RedirectResponse(endpoint, status_code=HTTPStatus.SEE_OTHER)
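How a new proposal's End timestamp and Quorum fall out of the ADDVOTE_SPECIFICS mapping can be sketched as a standalone helper (the duration/quorum values are copied from the mapping above; voting_window is a hypothetical name):

```python
# (duration in seconds, quorum) per addvote type, as in ADDVOTE_SPECIFICS.
SPECIFICS = {
    "add_pm": (7 * 24 * 60 * 60, 0.66),
    "remove_pm": (7 * 24 * 60 * 60, 0.75),
    "remove_inactive_pm": (5 * 24 * 60 * 60, 0.66),
    "bylaws": (7 * 24 * 60 * 60, 0.75),
}


def voting_window(addvote_type: str, submitted: int) -> tuple[int, float]:
    """Return (end_timestamp, quorum) for a proposal submitted at `submitted`."""
    duration, quorum = SPECIFICS[addvote_type]
    return submitted + duration, quorum
```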


@@ -1,39 +1,41 @@
from collections import defaultdict
from http import HTTPStatus
from typing import Any
from fastapi import APIRouter, Form, Query, Request, Response
import aurweb.filters # noqa: F401
from aurweb import aur_logging, config, db, defaults, models, util
from aurweb.auth import creds, requires_auth
from aurweb.cache import db_count_cache, db_query_cache
from aurweb.exceptions import InvariantError, handle_form_exceptions
from aurweb.models.relation_type import CONFLICTS_ID, PROVIDES_ID, REPLACES_ID
from aurweb.packages import util as pkgutil
from aurweb.packages.search import PackageSearch
from aurweb.packages.util import get_pkg_or_base
from aurweb.pkgbase import actions as pkgbase_actions, util as pkgbaseutil
from aurweb.templates import make_context, make_variable_context, render_template
from aurweb.util import hash_query
logger = aur_logging.get_logger(__name__)
router = APIRouter()
async def packages_get(
request: Request, context: dict[str, Any], status_code: HTTPStatus = HTTPStatus.OK
):
# Query parameters used in this request.
context["q"] = dict(request.query_params)
# Per page and offset.
offset, per_page = util.sanitize_params(
request.query_params.get("O", defaults.O),
request.query_params.get("PP", defaults.PP),
)
context["O"] = offset
# Limit PP to options.max_search_results
max_search_results = config.getint("options", "max_search_results")
context["PP"] = per_page = min(per_page, max_search_results)
# Query search by.
@@ -82,13 +84,14 @@ async def packages_get(request: Request, context: Dict[str, Any],
if submit == "Orphans":
# If the user clicked the "Orphans" button, we only want
# orphaned packages.
search.query = search.query.filter(models.PackageBase.MaintainerUID.is_(None))
# Collect search result count here; we've applied our keywords.
# Including more query operations below, like ordering, will
# increase the amount of time required to collect a count.
# we use redis for caching the results of the query
cache_expire = config.getint("cache", "expiry_time_search", 600)
num_packages = db_count_cache(hash_query(search.query), search.query, cache_expire)
# Apply user-specified sort column and ordering.
search.sort_by(sort_by, sort_order)
@@ -103,17 +106,24 @@ async def packages_get(request: Request, context: Dict[str, Any],
models.PackageBase.Popularity,
models.PackageBase.NumVotes,
models.PackageBase.OutOfDateTS,
models.PackageBase.ModifiedTS,
models.User.Username.label("Maintainer"),
models.PackageVote.PackageBaseID.label("Voted"),
models.PackageNotification.PackageBaseID.label("Notify"),
)
# paging
results = results.limit(per_page).offset(offset)
# we use redis for caching the results of the query
packages = db_query_cache(hash_query(results), results, cache_expire)
context["packages"] = packages
context["packages_count"] = num_packages
return render_template(
request, "packages/index.html", context, status_code=status_code
)
@router.get("/packages")
@@ -123,7 +133,25 @@ async def packages(request: Request) -> Response:
@router.get("/packages/{name}")
async def package(
request: Request,
name: str,
all_deps: bool = Query(default=False),
all_reqs: bool = Query(default=False),
) -> Response:
"""
Get a package by name.
By default, we limit the number of depends and requires results
to 20. Passing all_deps/all_reqs bypasses this limit, and is meant
to be triggered via a "Show more" link near the limited listing.
:param name: Package.Name
:param all_deps: Boolean indicating whether we should load all depends
:param all_reqs: Boolean indicating whether we should load all requires
:return: FastAPI Response
"""
# Get the Package.
pkg = get_pkg_or_base(name, models.Package)
pkgbase = pkg.PackageBase
@@ -140,25 +168,51 @@ async def package(request: Request, name: str) -> Response:
# Add our base information.
context = pkgbaseutil.make_context(request, pkgbase)
context["q"] = dict(request.query_params)
context.update({"all_deps": all_deps, "all_reqs": all_reqs})
context["package"] = pkg
# Package sources.
context["sources"] = pkg.package_sources.order_by(
models.PackageSource.Source.asc()
).all()
# Listing metadata.
context["max_listing"] = max_listing = 20
# Package dependencies.
deps = pkg.package_dependencies.order_by(
models.PackageDependency.DepTypeID.asc(), models.PackageDependency.DepName.asc()
)
context["depends_count"] = deps.count()
if not all_deps:
deps = deps.limit(max_listing)
context["dependencies"] = deps.all()
# Existing dependencies to avoid multiple lookups
context["dependencies_names_from_aur"] = [
item.Name
for item in db.query(models.Package)
.filter(
models.Package.Name.in_(
pkg.package_dependencies.with_entities(models.PackageDependency.DepName)
)
)
.all()
]
# Package requirements (other packages depend on this one).
reqs = pkgutil.pkg_required(pkg.Name, [p.RelName for p in rels_data.get("p", [])])
context["reqs_count"] = reqs.count()
if not all_reqs:
reqs = reqs.limit(max_listing)
context["required_by"] = reqs.all()
context["licenses"] = pkg.package_licenses
context["groups"] = pkg.package_groups
conflicts = pkg.package_relations.filter(
models.PackageRelation.RelTypeID == CONFLICTS_ID
).order_by(models.PackageRelation.RelName.asc())
@@ -177,46 +231,42 @@ async def package(request: Request, name: str) -> Response:
return render_template(request, "packages/show.html", context)
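The all_deps/all_reqs handling above is a limit-unless-all pattern: count everything, but cap the listing at max_listing unless the "Show more" flag was passed. A sketch with plain lists in place of SQLAlchemy queries (limited_listing is a hypothetical helper, not part of aurweb):

```python
def limited_listing(items, show_all: bool, max_listing: int = 20) -> list:
    # Mirrors the depends/requires handling: cap the listing unless the
    # caller asked for everything via a "Show more" style flag.
    items = list(items)
    return items if show_all else items[:max_listing]
```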
async def packages_unflag(request: Request, package_ids: list[int] = [], **kwargs):
if not package_ids:
return False, ["You did not select any packages to unflag."]
# Holds the set of package bases we're looking to unflag.
# Constructed below via looping through the packages query.
bases = set()
package_ids = set(package_ids) # Convert this to a set for O(1).
packages = db.query(models.Package).filter(models.Package.ID.in_(package_ids)).all()
for pkg in packages:
has_cred = request.user.has_credential(
creds.PKGBASE_UNFLAG, approved=[pkg.PackageBase.Flagger]
)
if not has_cred:
return False, ["You did not select any packages to unflag."]
if pkg.PackageBase not in bases:
bases.update({pkg.PackageBase})
for pkgbase in bases:
pkgbase_actions.pkgbase_unflag_instance(request, pkgbase)
return True, ["The selected packages have been unflagged."]
async def packages_notify(request: Request, package_ids: list[int] = [], **kwargs):
# In cases where we encounter errors with the request, we'll
# use this error tuple as a return value.
# TODO: This error does not yet have a translation.
error_tuple = (False, ["You did not select any packages to be notified about."])
if not package_ids:
return error_tuple
bases = set()
package_ids = set(package_ids)
packages = db.query(models.Package).filter(models.Package.ID.in_(package_ids)).all()
for pkg in packages:
if pkg.PackageBase not in bases:
@@ -224,9 +274,11 @@ async def packages_notify(request: Request, package_ids: List[int] = [],
# Perform some checks on what the user selected for notify.
for pkgbase in bases:
notif = db.query(
pkgbase.notifications.filter(
models.PackageNotification.UserID == request.user.ID
).exists()
).scalar()
has_cred = request.user.has_credential(creds.PKGBASE_NOTIFY)
# If the request user either does not have credentials
@@ -239,26 +291,23 @@ async def packages_notify(request: Request, package_ids: List[int] = [],
pkgbase_actions.pkgbase_notify_instance(request, pkgbase)
# TODO: This message does not yet have a translation.
return True, ["The selected packages' notifications have been enabled."]
async def packages_unnotify(request: Request, package_ids: list[int] = [], **kwargs):
if not package_ids:
# TODO: This error does not yet have a translation.
return False, ["You did not select any packages for notification removal."]
# TODO: This error does not yet have a translation.
error_tuple = (
False,
["A package you selected does not have notifications enabled."],
)
bases = set()
package_ids = set(package_ids)
packages = db.query(models.Package).filter(models.Package.ID.in_(package_ids)).all()
for pkg in packages:
if pkg.PackageBase not in bases:
@@ -266,9 +315,11 @@ async def packages_unnotify(request: Request, package_ids: List[int] = [],
# Perform some checks on what the user selected for notify.
for pkgbase in bases:
notif = db.query(
pkgbase.notifications.filter(
models.PackageNotification.UserID == request.user.ID
).exists()
).scalar()
if not notif:
return error_tuple
@@ -276,22 +327,27 @@ async def packages_unnotify(request: Request, package_ids: List[int] = [],
pkgbase_actions.pkgbase_unnotify_instance(request, pkgbase)
# TODO: This message does not yet have a translation.
return True, ["The selected packages' notifications have been removed."]
async def packages_adopt(
request: Request, package_ids: list[int] = [], confirm: bool = False, **kwargs
):
if not package_ids:
return False, ["You did not select any packages to adopt."]
if not confirm:
return (
False,
[
"The selected packages have not been adopted, "
"check the confirmation checkbox."
],
)
bases = set()
package_ids = set(package_ids)
packages = db.query(models.Package).filter(models.Package.ID.in_(package_ids)).all()
for pkg in packages:
if pkg.PackageBase not in bases:
@@ -302,18 +358,19 @@ async def packages_adopt(request: Request, package_ids: List[int] = [],
has_cred = request.user.has_credential(creds.PKGBASE_ADOPT)
if not (has_cred or not pkgbase.Maintainer):
# TODO: This error needs to be translated.
return (
False,
["You are not allowed to adopt one of the packages you selected."],
)
# Now, really adopt the bases.
for pkgbase in bases:
pkgbase_actions.pkgbase_adopt_instance(request, pkgbase)
return True, ["The selected packages have been adopted."]
def disown_all(request: Request, pkgbases: list[models.PackageBase]) -> list[str]:
errors = []
for pkgbase in pkgbases:
try:
@@ -323,19 +380,24 @@ def disown_all(request: Request, pkgbases: List[models.PackageBase]) \
return errors
async def packages_disown(
request: Request, package_ids: list[int] = [], confirm: bool = False, **kwargs
):
if not package_ids:
return False, ["You did not select any packages to disown."]
if not confirm:
return (
False,
[
"The selected packages have not been disowned, "
"check the confirmation checkbox."
],
)
bases = set()
package_ids = set(package_ids)
packages = db.query(models.Package).filter(models.Package.ID.in_(package_ids)).all()
for pkg in packages:
if pkg.PackageBase not in bases:
@@ -343,43 +405,54 @@ async def packages_disown(request: Request, package_ids: List[int] = [],
# Check that the user has credentials for every package they selected.
for pkgbase in bases:
has_cred = request.user.has_credential(
creds.PKGBASE_DISOWN, approved=[pkgbase.Maintainer]
)
if not has_cred:
# TODO: This error needs to be translated.
return (
False,
["You are not allowed to disown one of the packages you selected."],
)
# Now, disown all the bases if we can.
if errors := disown_all(request, bases):
return False, errors
return True, ["The selected packages have been disowned."]
async def packages_delete(
request: Request,
package_ids: list[int] = [],
confirm: bool = False,
merge_into: str = str(),
**kwargs,
):
if not package_ids:
return False, ["You did not select any packages to delete."]
if not confirm:
return (
False,
[
"The selected packages have not been deleted, "
"check the confirmation checkbox."
],
)
if not request.user.has_credential(creds.PKGBASE_DELETE):
return (False, ["You do not have permission to delete packages."])
return False, ["You do not have permission to delete packages."]
# set-ify package_ids and query the database for related records.
package_ids = set(package_ids)
packages = db.query(models.Package).filter(
models.Package.ID.in_(package_ids)).all()
packages = db.query(models.Package).filter(models.Package.ID.in_(package_ids)).all()
if len(packages) != len(package_ids):
# Let the user know there was an issue with their input: they have
# provided at least one package_id which does not exist in the DB.
# TODO: This error has not yet been translated.
return (False, ["One of the packages you selected does not exist."])
return False, ["One of the packages you selected does not exist."]
# Make a set out of all package bases related to `packages`.
bases = {pkg.PackageBase for pkg in packages}
@@ -389,15 +462,18 @@ async def packages_delete(request: Request, package_ids: List[int] = [],
notifs += pkgbase_actions.pkgbase_delete_instance(request, pkgbase)
# Log out the fact that this happened for accountability.
logger.info(f"Privileged user '{request.user.Username}' deleted the "
f"following package bases: {str(deleted_bases)}.")
logger.info(
f"Privileged user '{request.user.Username}' deleted the "
f"following package bases: {str(deleted_bases)}."
)
util.apply_all(notifs, lambda n: n.send())
return (True, ["The selected packages have been deleted."])
return True, ["The selected packages have been deleted."]
# A mapping of action string -> callback functions used within the
# `packages_post` route below. We expect any action callback to
# return a tuple in the format: (succeeded: bool, message: List[str]).
# return a tuple in the format: (succeeded: bool, message: list[str]).
PACKAGE_ACTIONS = {
"unflag": packages_unflag,
"notify": packages_notify,
@@ -411,11 +487,12 @@ PACKAGE_ACTIONS = {
@router.post("/packages")
@handle_form_exceptions
@requires_auth
async def packages_post(request: Request,
IDs: List[int] = Form(default=[]),
action: str = Form(default=str()),
confirm: bool = Form(default=False)):
async def packages_post(
request: Request,
IDs: list[int] = Form(default=[]),
action: str = Form(default=str()),
confirm: bool = Form(default=False),
):
# If an invalid action is specified, just render GET /packages
# with a BAD_REQUEST status_code.
if action not in PACKAGE_ACTIONS:
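The `PACKAGE_ACTIONS` table above dispatches each POST /packages action to a callback that returns a `(succeeded: bool, messages: list[str])` tuple. A minimal standalone sketch of that dispatch contract (the action name, helper names, and messages here are illustrative, not aurweb's real ones):

```python
# Minimal sketch of the POST /packages dispatch contract: every action
# callback returns (succeeded: bool, messages: list[str]).

def unflag(package_ids: list[int]) -> tuple[bool, list[str]]:
    if not package_ids:
        return False, ["You did not select any packages to unflag."]
    return True, ["The selected packages have been unflagged."]

ACTIONS = {"unflag": unflag}

def dispatch(action: str, package_ids: list[int]) -> tuple[bool, list[str]]:
    # Unknown actions mirror the BAD_REQUEST branch in packages_post.
    if action not in ACTIONS:
        return False, ["Invalid action."]
    return ACTIONS[action](package_ids)
```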

File diff suppressed because it is too large

@@ -2,57 +2,127 @@ from http import HTTPStatus
from fastapi import APIRouter, Form, Query, Request
from fastapi.responses import RedirectResponse
from sqlalchemy import case
from sqlalchemy import case, orm
from aurweb import db, defaults, time, util
from aurweb.auth import creds, requires_auth
from aurweb.exceptions import handle_form_exceptions
from aurweb.models import PackageRequest, User
from aurweb.models.package_request import PENDING_ID, REJECTED_ID
from aurweb.models import PackageBase, PackageRequest, User
from aurweb.models.package_request import (
ACCEPTED_ID,
CLOSED_ID,
PENDING_ID,
REJECTED_ID,
)
from aurweb.requests.util import get_pkgreq_by_id
from aurweb.scripts import notify
from aurweb.statistics import get_request_counts
from aurweb.templates import make_context, render_template
FILTER_PARAMS = {
"filter_pending",
"filter_closed",
"filter_accepted",
"filter_rejected",
"filter_maintainers_requests",
}
router = APIRouter()
@router.get("/requests")
@requires_auth
async def requests(request: Request,
O: int = Query(default=defaults.O),
PP: int = Query(default=defaults.PP)):
async def requests( # noqa: C901
request: Request,
O: int = Query(default=defaults.O),
PP: int = Query(default=defaults.PP),
filter_pending: bool = False,
filter_closed: bool = False,
filter_accepted: bool = False,
filter_rejected: bool = False,
filter_maintainer_requests: bool = False,
filter_pkg_name: str = None,
):
context = make_context(request, "Requests")
context["q"] = dict(request.query_params)
O, PP = util.sanitize_params(O, PP)
# Set pending filter by default if no status filter was provided.
# In case we got a package name filter, but no status filter,
# we enable the other ones too.
if not dict(request.query_params).keys() & FILTER_PARAMS:
filter_pending = True
if filter_pkg_name:
filter_closed = True
filter_accepted = True
filter_rejected = True
O, PP = util.sanitize_params(str(O), str(PP))
context["O"] = O
context["PP"] = PP
context["filter_pending"] = filter_pending
context["filter_closed"] = filter_closed
context["filter_accepted"] = filter_accepted
context["filter_rejected"] = filter_rejected
context["filter_maintainer_requests"] = filter_maintainer_requests
context["filter_pkg_name"] = filter_pkg_name
# A PackageRequest query, with left inner joined User and RequestType.
query = db.query(PackageRequest).join(
User, User.ID == PackageRequest.UsersID)
Maintainer = orm.aliased(User)
# A PackageRequest query
query = (
db.query(PackageRequest)
.join(PackageBase)
.join(User, PackageRequest.UsersID == User.ID, isouter=True)
.join(Maintainer, PackageBase.MaintainerUID == Maintainer.ID, isouter=True)
)
# Requests statistics
counts = get_request_counts()
for k in counts:
context[k] = counts[k]
# Apply status filters
in_filters = []
if filter_pending:
in_filters.append(PENDING_ID)
if filter_closed:
in_filters.append(CLOSED_ID)
if filter_accepted:
in_filters.append(ACCEPTED_ID)
if filter_rejected:
in_filters.append(REJECTED_ID)
filtered = query.filter(PackageRequest.Status.in_(in_filters))
# Name filter (contains)
if filter_pkg_name:
filtered = filtered.filter(PackageBase.Name.like(f"%{filter_pkg_name}%"))
# Additionally filter for requests made from package maintainer
if filter_maintainer_requests:
filtered = filtered.filter(PackageRequest.UsersID == PackageBase.MaintainerUID)
# If the request user is not elevated (TU or Dev), then
# filter PackageRequests which are owned by the request user.
if not request.user.is_elevated():
query = query.filter(PackageRequest.UsersID == request.user.ID)
context["total"] = query.count()
context["results"] = query.order_by(
# Order primarily by the Status column being PENDING_ID,
# and secondarily by RequestTS; both in descending order.
case([(PackageRequest.Status == PENDING_ID, 1)], else_=0).desc(),
PackageRequest.RequestTS.desc()
).limit(PP).offset(O).all()
filtered = filtered.filter(PackageRequest.UsersID == request.user.ID)
context["total"] = filtered.count()
context["results"] = (
filtered.order_by(
# Order primarily by the Status column being PENDING_ID,
# and secondarily by RequestTS; both in descending order.
case([(PackageRequest.Status == PENDING_ID, 1)], else_=0).desc(),
PackageRequest.RequestTS.desc(),
)
.limit(PP)
.offset(O)
.all()
)
return render_template(request, "requests.html", context)
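The `case(...)` ordering above floats pending requests to the top and then sorts by request timestamp, both descending. The same two-level ordering can be expressed in plain Python (placeholder data; the `PENDING_ID` value here is illustrative):

```python
PENDING_ID = 0  # placeholder; the real constant lives in aurweb.models.package_request

requests = [
    {"id": 1, "status": 2, "ts": 300},
    {"id": 2, "status": PENDING_ID, "ts": 100},
    {"id": 3, "status": PENDING_ID, "ts": 200},
    {"id": 4, "status": 3, "ts": 400},
]

# Primary key: "is pending" descending, secondary key: timestamp descending,
# mirroring case([(Status == PENDING_ID, 1)], else_=0).desc(), RequestTS.desc().
ordered = sorted(
    requests,
    key=lambda r: (r["status"] == PENDING_ID, r["ts"]),
    reverse=True,
)
```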
@router.get("/requests/{id}/close")
@requires_auth
async def request_close(request: Request, id: int):
pkgreq = get_pkgreq_by_id(id)
if not request.user.is_elevated() and request.user != pkgreq.User:
# Request user doesn't have permission here: redirect to '/'.
@@ -63,11 +133,13 @@ async def request_close(request: Request, id: int):
return render_template(request, "requests/close.html", context)
@db.async_retry_deadlock
@router.post("/requests/{id}/close")
@handle_form_exceptions
@requires_auth
async def request_close_post(request: Request, id: int,
comments: str = Form(default=str())):
async def request_close_post(
request: Request, id: int, comments: str = Form(default=str())
):
pkgreq = get_pkgreq_by_id(id)
# `pkgreq`.User can close their own request.
@@ -87,7 +159,8 @@ async def request_close_post(request: Request, id: int,
pkgreq.Status = REJECTED_ID
notify_ = notify.RequestCloseNotification(
request.user.ID, pkgreq.ID, pkgreq.status_display())
request.user.ID, pkgreq.ID, pkgreq.status_display()
)
notify_.send()
return RedirectResponse("/requests", status_code=HTTPStatus.SEE_OTHER)


@@ -1,12 +1,36 @@
"""
RPC API routing module
For legacy route documentation, see https://aur.archlinux.org/rpc
Legacy Routes:
- GET /rpc
- POST /rpc
Legacy example (version 5): /rpc?v=5&type=info&arg=my-package
For OpenAPI route documentation, see https://aur.archlinux.org/docs
OpenAPI Routes:
- GET /rpc/v{version}/info/{arg}
- GET /rpc/v{version}/info
- POST /rpc/v{version}/info
- GET /rpc/v{version}/search/{arg}
- GET /rpc/v{version}/search
- POST /rpc/v{version}/search
- GET /rpc/v{version}/suggest/{arg}
OpenAPI example (version 5): /rpc/v5/info/my-package
"""
import hashlib
import re
from http import HTTPStatus
from typing import List, Optional
from typing import Optional
from urllib.parse import unquote
import orjson
from fastapi import APIRouter, Form, Query, Request, Response
from fastapi.responses import JSONResponse
@@ -19,7 +43,7 @@ router = APIRouter()
def parse_args(request: Request):
""" Handle legacy logic of 'arg' and 'arg[]' query parameter handling.
"""Handle legacy logic of 'arg' and 'arg[]' query parameter handling.
When 'arg' appears as the last argument given to the query string,
that argument is used by itself as one single argument, regardless
@@ -39,9 +63,7 @@ def parse_args(request: Request):
# Create a list of (key, value) pairs of the given 'arg' and 'arg[]'
# query parameters from last to first.
query = list(reversed(unquote(request.url.query).split("&")))
parts = [
e.split("=", 1) for e in query if e.startswith(("arg=", "arg[]="))
]
parts = [e.split("=", 1) for e in query if e.startswith(("arg=", "arg[]="))]
args = []
if parts:
@@ -63,24 +85,27 @@ def parse_args(request: Request):
return args
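The docstring's precedence rule for `arg` and `arg[]` can be exercised standalone. This sketch re-implements the documented behavior on a raw query string; it mirrors the hunk above but is an independent approximation, not the module itself:

```python
from urllib.parse import unquote

def parse_args(query_string: str) -> list[str]:
    """Legacy 'arg' / 'arg[]' handling: if the last parameter given is a
    plain 'arg', it wins by itself as one single argument; otherwise all
    'arg[]' values are collected in their original order."""
    query = list(reversed(unquote(query_string).split("&")))
    parts = [e.split("=", 1) for e in query if e.startswith(("arg=", "arg[]="))]
    if not parts:
        return []
    key, value = parts[0]  # last occurrence in the original query string
    if key == "arg":
        return [value]
    return list(reversed([v for k, v in parts if k == "arg[]"]))
```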
JSONP_EXPR = re.compile(r'^[a-zA-Z0-9()_.]{1,128}$')
JSONP_EXPR = re.compile(r"^[a-zA-Z0-9()_.]{1,128}$")
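`JSONP_EXPR` restricts callback names before the response is wrapped as `/**/callback(json)`. A self-contained sketch of that validation step (`wrap_jsonp` is a hypothetical helper; `rpc_request` inlines this logic):

```python
import re

# Same pattern as JSONP_EXPR above: 1-128 characters from a safe set.
JSONP_EXPR = re.compile(r"^[a-zA-Z0-9()_.]{1,128}$")

def wrap_jsonp(callback: str, payload: str) -> str:
    # The /**/ prefix guards against content-sniffing attacks on the
    # text/javascript response.
    if not JSONP_EXPR.match(callback):
        raise ValueError("invalid callback name")
    return f"/**/{callback}({payload})"
```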
async def rpc_request(request: Request,
v: Optional[int] = None,
type: Optional[str] = None,
by: Optional[str] = defaults.RPC_SEARCH_BY,
arg: Optional[str] = None,
args: Optional[List[str]] = [],
callback: Optional[str] = None):
async def rpc_request(
request: Request,
v: Optional[int] = None,
type: Optional[str] = None,
by: Optional[str] = defaults.RPC_SEARCH_BY,
arg: Optional[str] = None,
args: Optional[list[str]] = [],
callback: Optional[str] = None,
):
# Create a handle to our RPC class.
rpc = RPC(version=v, type=type)
# If ratelimit was exceeded, return a 429 Too Many Requests.
if check_ratelimit(request):
return JSONResponse(rpc.error("Rate limit reached"),
status_code=int(HTTPStatus.TOO_MANY_REQUESTS))
return JSONResponse(
rpc.error("Rate limit reached"),
status_code=int(HTTPStatus.TOO_MANY_REQUESTS),
)
# If `callback` was provided, produce a text/javascript response
# valid for the jsonp callback. Otherwise, by default, return
@@ -115,15 +140,11 @@ async def rpc_request(request: Request,
# The ETag header expects quotes to surround any identifier.
# https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/ETag
headers = {
"Content-Type": content_type,
"ETag": f'"{etag}"'
}
headers = {"Content-Type": content_type, "ETag": f'"{etag}"'}
if_none_match = request.headers.get("If-None-Match", str())
if if_none_match and if_none_match.strip("\t\n\r\" ") == etag:
return Response(headers=headers,
status_code=int(HTTPStatus.NOT_MODIFIED))
if if_none_match and if_none_match.strip('\t\n\r" ') == etag:
return Response(headers=headers, status_code=int(HTTPStatus.NOT_MODIFIED))
if callback:
content = f"/**/{callback}({content.decode()})"
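The conditional-request logic above quotes the ETag and strips quotes and whitespace from `If-None-Match` before comparing. A standalone sketch of the same handshake (`conditional_response` is a hypothetical helper, and the MD5 content hash is an assumption for illustration; only the quote/strip/compare step mirrors the hunk):

```python
import hashlib

def conditional_response(content: bytes, if_none_match: str) -> tuple[int, str]:
    etag = hashlib.md5(content).hexdigest()
    quoted = f'"{etag}"'  # the ETag header expects surrounding quotes
    # Clients echo the ETag back with quotes (and possibly whitespace).
    if if_none_match and if_none_match.strip('\t\n\r" ') == etag:
        return 304, quoted  # Not Modified: the body can be skipped
    return 200, quoted
```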
@@ -135,13 +156,15 @@ async def rpc_request(request: Request,
@router.get("/rpc.php") # Temporary! Remove on 03/04
@router.get("/rpc/")
@router.get("/rpc")
async def rpc(request: Request,
v: Optional[int] = Query(default=None),
type: Optional[str] = Query(default=None),
by: Optional[str] = Query(default=defaults.RPC_SEARCH_BY),
arg: Optional[str] = Query(default=None),
args: Optional[List[str]] = Query(default=[], alias="arg[]"),
callback: Optional[str] = Query(default=None)):
async def rpc(
request: Request,
v: Optional[int] = Query(default=None),
type: Optional[str] = Query(default=None),
by: Optional[str] = Query(default=defaults.RPC_SEARCH_BY),
arg: Optional[str] = Query(default=None),
args: Optional[list[str]] = Query(default=[], alias="arg[]"),
callback: Optional[str] = Query(default=None),
):
if not request.url.query:
return documentation()
return await rpc_request(request, v, type, by, arg, args, callback)
@@ -152,11 +175,146 @@ async def rpc(request: Request,
@router.post("/rpc/")
@router.post("/rpc")
@handle_form_exceptions
async def rpc_post(request: Request,
v: Optional[int] = Form(default=None),
type: Optional[str] = Form(default=None),
by: Optional[str] = Form(default=defaults.RPC_SEARCH_BY),
arg: Optional[str] = Form(default=None),
args: Optional[List[str]] = Form(default=[], alias="arg[]"),
callback: Optional[str] = Form(default=None)):
async def rpc_post(
request: Request,
v: Optional[int] = Form(default=None),
type: Optional[str] = Form(default=None),
by: Optional[str] = Form(default=defaults.RPC_SEARCH_BY),
arg: Optional[str] = Form(default=None),
args: list[str] = Form(default=[], alias="arg[]"),
callback: Optional[str] = Form(default=None),
):
return await rpc_request(request, v, type, by, arg, args, callback)
@router.get("/rpc/v{version}/info/{name}")
async def rpc_openapi_info(request: Request, version: int, name: str):
return await rpc_request(
request,
version,
"info",
defaults.RPC_SEARCH_BY,
name,
[],
)
@router.get("/rpc/v{version}/info")
async def rpc_openapi_multiinfo(
request: Request,
version: int,
args: Optional[list[str]] = Query(default=[], alias="arg[]"),
):
arg = args.pop(0) if args else None
return await rpc_request(
request,
version,
"info",
defaults.RPC_SEARCH_BY,
arg,
args,
)
@router.post("/rpc/v{version}/info")
async def rpc_openapi_multiinfo_post(
request: Request,
version: int,
):
data = await request.json()
args = data.get("arg", [])
if not isinstance(args, list):
rpc = RPC(version, "info")
return JSONResponse(
rpc.error("the 'arg' parameter must be of array type"),
status_code=HTTPStatus.BAD_REQUEST,
)
arg = args.pop(0) if args else None
return await rpc_request(
request,
version,
"info",
defaults.RPC_SEARCH_BY,
arg,
args,
)
@router.get("/rpc/v{version}/search/{arg}")
async def rpc_openapi_search_arg(
request: Request,
version: int,
arg: str,
by: Optional[str] = Query(default=defaults.RPC_SEARCH_BY),
):
return await rpc_request(
request,
version,
"search",
by,
arg,
[],
)
@router.get("/rpc/v{version}/search")
async def rpc_openapi_search(
request: Request,
version: int,
arg: Optional[str] = Query(default=str()),
by: Optional[str] = Query(default=defaults.RPC_SEARCH_BY),
):
return await rpc_request(
request,
version,
"search",
by,
arg,
[],
)
@router.post("/rpc/v{version}/search")
async def rpc_openapi_search_post(
request: Request,
version: int,
):
data = await request.json()
by = data.get("by", defaults.RPC_SEARCH_BY)
if not isinstance(by, str):
rpc = RPC(version, "search")
return JSONResponse(
rpc.error("the 'by' parameter must be of string type"),
status_code=HTTPStatus.BAD_REQUEST,
)
arg = data.get("arg", str())
if not isinstance(arg, str):
rpc = RPC(version, "search")
return JSONResponse(
rpc.error("the 'arg' parameter must be of string type"),
status_code=HTTPStatus.BAD_REQUEST,
)
return await rpc_request(
request,
version,
"search",
by,
arg,
[],
)
@router.get("/rpc/v{version}/suggest/{arg}")
async def rpc_openapi_suggest(request: Request, version: int, arg: str):
return await rpc_request(
request,
version,
"suggest",
defaults.RPC_SEARCH_BY,
arg,
[],
)


@@ -1,22 +1,19 @@
from datetime import datetime
from fastapi import APIRouter, Request
from fastapi.responses import Response
from feedgen.feed import FeedGenerator
from aurweb import db, filters
from aurweb import config, db, filters
from aurweb.cache import lambda_cache
from aurweb.models import Package, PackageBase
router = APIRouter()
def make_rss_feed(request: Request, packages: list,
date_attr: str):
""" Create an RSS Feed string for some packages.
def make_rss_feed(request: Request, packages: list):
"""Create an RSS Feed string for some packages.
:param request: A FastAPI request
:param packages: A list of packages to add to the RSS feed
:param date_attr: The date attribute (DB column) to use
:return: RSS Feed string
"""
@@ -26,58 +23,67 @@ def make_rss_feed(request: Request, packages: list,
base = f"{request.url.scheme}://{request.url.netloc}"
feed.link(href=base, rel="alternate")
feed.link(href=f"{base}/rss", rel="self")
feed.image(title="AUR Newest Packages",
url=f"{base}/static/css/archnavbar/aurlogo.png",
link=base,
description="AUR Newest Packages Feed")
feed.image(
title="AUR Newest Packages",
url=f"{base}/static/css/archnavbar/aurlogo.png",
link=base,
description="AUR Newest Packages Feed",
)
for pkg in packages:
entry = feed.add_entry(order="append")
entry.title(pkg.Name)
entry.link(href=f"{base}/packages/{pkg.Name}", rel="alternate")
entry.link(href=f"{base}/rss", rel="self", type="application/rss+xml")
entry.description(pkg.Description or str())
attr = getattr(pkg.PackageBase, date_attr)
dt = filters.timestamp_to_datetime(attr)
dt = filters.timestamp_to_datetime(pkg.Timestamp)
dt = filters.as_timezone(dt, request.user.Timezone)
entry.pubDate(dt.strftime("%Y-%m-%d %H:%M:%S%z"))
entry.source(f"{base}")
if pkg.PackageBase.Maintainer:
entry.author(author={"name": pkg.PackageBase.Maintainer.Username})
entry.guid(f"{pkg.Name} - {attr}")
entry.guid(f"{pkg.Name}-{pkg.Timestamp}")
return feed.rss_str()
@router.get("/rss/")
async def rss(request: Request):
packages = db.query(Package).join(PackageBase).order_by(
PackageBase.SubmittedTS.desc()).limit(100)
feed = make_rss_feed(request, packages, "SubmittedTS")
packages = (
db.query(Package)
.join(PackageBase)
.order_by(PackageBase.SubmittedTS.desc())
.limit(100)
.with_entities(
Package.Name,
Package.Description,
PackageBase.SubmittedTS.label("Timestamp"),
)
)
# we use redis for caching the results of the feedgen
cache_expire = config.getint("cache", "expiry_time_rss", 300)
feed = lambda_cache("rss", lambda: make_rss_feed(request, packages), cache_expire)
response = Response(feed, media_type="application/rss+xml")
package = packages.first()
if package:
dt = datetime.utcfromtimestamp(package.Timestamp)
modified = dt.strftime("%a, %d %b %Y %H:%M:%S GMT")
response.headers["Last-Modified"] = modified
return response
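`lambda_cache(key, fn, expire)` memoizes the rendered feed; aurweb's version stores results in Redis, but the call shape can be sketched with an in-process dict (an illustrative stand-in, not the real implementation):

```python
import time

_CACHE: dict[str, tuple[float, object]] = {}

def lambda_cache(key: str, fn, expire: int):
    # Run fn() once per key, then reuse the value until the entry expires.
    now = time.monotonic()
    hit = _CACHE.get(key)
    if hit is not None and hit[0] > now:
        return hit[1]
    value = fn()
    _CACHE[key] = (now + expire, value)
    return value
```

Call shape as in the route above: `feed = lambda_cache("rss", lambda: make_rss_feed(request, packages), cache_expire)`.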
@router.get("/rss/modified")
async def rss_modified(request: Request):
packages = db.query(Package).join(PackageBase).order_by(
PackageBase.ModifiedTS.desc()).limit(100)
feed = make_rss_feed(request, packages, "ModifiedTS")
packages = (
db.query(Package)
.join(PackageBase)
.order_by(PackageBase.ModifiedTS.desc())
.limit(100)
.with_entities(
Package.Name,
Package.Description,
PackageBase.ModifiedTS.label("Timestamp"),
)
)
# we use redis for caching the results of the feedgen
cache_expire = config.getint("cache", "expiry_time_rss", 300)
feed = lambda_cache(
"rss_modified", lambda: make_rss_feed(request, packages), cache_expire
)
response = Response(feed, media_type="application/rss+xml")
package = packages.first()
if package:
dt = datetime.utcfromtimestamp(package.Timestamp)
modified = dt.strftime("%a, %d %b %Y %H:%M:%S GMT")
response.headers["Last-Modified"] = modified
return response


@@ -1,11 +1,9 @@
import time
import uuid
from http import HTTPStatus
from urllib.parse import urlencode
import fastapi
from authlib.integrations.starlette_client import OAuth, OAuthError
from fastapi import Depends, HTTPException
from fastapi.responses import RedirectResponse
@@ -14,7 +12,6 @@ from starlette.requests import Request
import aurweb.config
import aurweb.db
from aurweb import util
from aurweb.l10n import get_translator_for_request
from aurweb.schema import Bans, Sessions, Users
@@ -43,14 +40,18 @@ async def login(request: Request, redirect: str = None):
The `redirect` argument is a query parameter specifying the post-login
redirect URL.
"""
authenticate_url = aurweb.config.get("options", "aur_location") + "/sso/authenticate"
authenticate_url = (
aurweb.config.get("options", "aur_location") + "/sso/authenticate"
)
if redirect:
authenticate_url = authenticate_url + "?" + urlencode([("redirect", redirect)])
return await oauth.sso.authorize_redirect(request, authenticate_url, prompt="login")
def is_account_suspended(conn, user_id):
row = conn.execute(select([Users.c.Suspended]).where(Users.c.ID == user_id)).fetchone()
row = conn.execute(
select([Users.c.Suspended]).where(Users.c.ID == user_id)
).fetchone()
return row is not None and bool(row[0])
@@ -60,23 +61,29 @@ def open_session(request, conn, user_id):
"""
if is_account_suspended(conn, user_id):
_ = get_translator_for_request(request)
raise HTTPException(status_code=HTTPStatus.FORBIDDEN,
detail=_('Account suspended'))
raise HTTPException(
status_code=HTTPStatus.FORBIDDEN, detail=_("Account suspended")
)
# TODO This is a terrible message because it could imply the attempt at
# logging in just caused the suspension.
sid = uuid.uuid4().hex
conn.execute(Sessions.insert().values(
UsersID=user_id,
SessionID=sid,
LastUpdateTS=time.time(),
))
conn.execute(
Sessions.insert().values(
UsersID=user_id,
SessionID=sid,
LastUpdateTS=time.time(),
)
)
# Update users last login information.
conn.execute(Users.update()
.where(Users.c.ID == user_id)
.values(LastLogin=int(time.time()),
LastLoginIPAddress=request.client.host))
conn.execute(
Users.update()
.where(Users.c.ID == user_id)
.values(
LastLogin=int(time.time()), LastLoginIPAddress=util.get_client_ip(request)
)
)
return sid
@@ -98,18 +105,23 @@ def is_aur_url(url):
@router.get("/sso/authenticate")
async def authenticate(request: Request, redirect: str = None, conn=Depends(aurweb.db.connect)):
async def authenticate(
request: Request, redirect: str = None, conn=Depends(aurweb.db.connect)
):
"""
Receive an OpenID Connect ID token, validate it, then process it to create
a new AUR session.
"""
if is_ip_banned(conn, request.client.host):
if is_ip_banned(conn, util.get_client_ip(request)):
_ = get_translator_for_request(request)
raise HTTPException(
status_code=HTTPStatus.FORBIDDEN,
detail=_('The login form is currently disabled for your IP address, '
'probably due to sustained spam attacks. Sorry for the '
'inconvenience.'))
detail=_(
"The login form is currently disabled for your IP address, "
"probably due to sustained spam attacks. Sorry for the "
"inconvenience."
),
)
try:
token = await oauth.sso.authorize_access_token(request)
@@ -120,30 +132,41 @@ async def authenticate(request: Request, redirect: str = None, conn=Depends(aurw
_ = get_translator_for_request(request)
raise HTTPException(
status_code=HTTPStatus.BAD_REQUEST,
detail=_('Bad OAuth token. Please retry logging in from the start.'))
detail=_("Bad OAuth token. Please retry logging in from the start."),
)
sub = user.get("sub") # this is the SSO account ID in JWT terminology
if not sub:
_ = get_translator_for_request(request)
raise HTTPException(status_code=HTTPStatus.BAD_REQUEST,
detail=_("JWT is missing its `sub` field."))
raise HTTPException(
status_code=HTTPStatus.BAD_REQUEST,
detail=_("JWT is missing its `sub` field."),
)
aur_accounts = conn.execute(select([Users.c.ID]).where(Users.c.SSOAccountID == sub)) \
.fetchall()
aur_accounts = conn.execute(
select([Users.c.ID]).where(Users.c.SSOAccountID == sub)
).fetchall()
if not aur_accounts:
return "Sorry, we don't seem to know you Sir " + sub
elif len(aur_accounts) == 1:
sid = open_session(request, conn, aur_accounts[0][Users.c.ID])
response = RedirectResponse(redirect if redirect and is_aur_url(redirect) else "/")
response = RedirectResponse(
redirect if redirect and is_aur_url(redirect) else "/"
)
secure_cookies = aurweb.config.getboolean("options", "disable_http_login")
response.set_cookie(key="AURSID", value=sid, httponly=True,
secure=secure_cookies)
response.set_cookie(
key="AURSID", value=sid, httponly=True, secure=secure_cookies
)
if "id_token" in token:
# We save the id_token for the SSO logout. It's not too important
# though, so if we can't find it, we can live without it.
response.set_cookie(key="SSO_ID_TOKEN", value=token["id_token"],
path="/sso/", httponly=True,
secure=secure_cookies)
response.set_cookie(
key="SSO_ID_TOKEN",
value=token["id_token"],
path="/sso/",
httponly=True,
secure=secure_cookies,
)
return util.add_samesite_fields(response, "strict")
else:
# We've got a severe integrity violation.
@@ -165,8 +188,12 @@ async def logout(request: Request):
return RedirectResponse("/")
metadata = await oauth.sso.load_server_metadata()
query = urlencode({'post_logout_redirect_uri': aurweb.config.get('options', 'aur_location'),
'id_token_hint': id_token})
response = RedirectResponse(metadata["end_session_endpoint"] + '?' + query)
query = urlencode(
{
"post_logout_redirect_uri": aurweb.config.get("options", "aur_location"),
"id_token_hint": id_token,
}
)
response = RedirectResponse(metadata["end_session_endpoint"] + "?" + query)
response.delete_cookie("SSO_ID_TOKEN", path="/sso/")
return response


@@ -1,319 +0,0 @@
import html
import typing
from http import HTTPStatus
from fastapi import APIRouter, Form, HTTPException, Request
from fastapi.responses import RedirectResponse, Response
from sqlalchemy import and_, func, or_
from aurweb import db, l10n, logging, models, time
from aurweb.auth import creds, requires_auth
from aurweb.exceptions import handle_form_exceptions
from aurweb.models import User
from aurweb.models.account_type import TRUSTED_USER_AND_DEV_ID, TRUSTED_USER_ID
from aurweb.templates import make_context, make_variable_context, render_template
router = APIRouter()
logger = logging.get_logger(__name__)
# Some TU route specific constants.
ITEMS_PER_PAGE = 10 # Paged table size.
MAX_AGENDA_LENGTH = 75 # Agenda table column length.
ADDVOTE_SPECIFICS = {
# This dict stores a vote duration and quorum for a proposal.
# When a proposal is added, duration is added to the current
# timestamp.
# "addvote_type": (duration, quorum)
"add_tu": (7 * 24 * 60 * 60, 0.66),
"remove_tu": (7 * 24 * 60 * 60, 0.75),
"remove_inactive_tu": (5 * 24 * 60 * 60, 0.66),
"bylaws": (7 * 24 * 60 * 60, 0.75)
}
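Each `ADDVOTE_SPECIFICS` entry pairs a vote duration, added to the current timestamp to get the proposal's `End`, with a quorum fraction. A small sketch of that computation (`proposal_window` is a hypothetical helper; the route inlines this when creating a `TUVoteInfo` row):

```python
ADDVOTE_SPECIFICS = {
    # "addvote_type": (duration in seconds, quorum fraction)
    "add_tu": (7 * 24 * 60 * 60, 0.66),
    "remove_tu": (7 * 24 * 60 * 60, 0.75),
    "remove_inactive_tu": (5 * 24 * 60 * 60, 0.66),
    "bylaws": (7 * 24 * 60 * 60, 0.75),
}

def proposal_window(addvote_type: str, now: int) -> tuple[int, float]:
    # End timestamp = submission time + configured duration.
    duration, quorum = ADDVOTE_SPECIFICS[addvote_type]
    return now + duration, quorum
```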
@router.get("/tu")
@requires_auth
async def trusted_user(request: Request,
coff: int = 0, # current offset
cby: str = "desc", # current by
poff: int = 0, # past offset
pby: str = "desc"): # past by
if not request.user.has_credential(creds.TU_LIST_VOTES):
return RedirectResponse("/", status_code=HTTPStatus.SEE_OTHER)
context = make_context(request, "Trusted User")
current_by, past_by = cby, pby
current_off, past_off = coff, poff
context["pp"] = pp = ITEMS_PER_PAGE
context["prev_len"] = MAX_AGENDA_LENGTH
ts = time.utcnow()
if current_by not in {"asc", "desc"}:
# If a malicious by was given, default to desc.
current_by = "desc"
context["current_by"] = current_by
if past_by not in {"asc", "desc"}:
# If a malicious by was given, default to desc.
past_by = "desc"
context["past_by"] = past_by
current_votes = db.query(models.TUVoteInfo).filter(
models.TUVoteInfo.End > ts).order_by(
models.TUVoteInfo.Submitted.desc())
context["current_votes_count"] = current_votes.count()
current_votes = current_votes.limit(pp).offset(current_off)
context["current_votes"] = reversed(current_votes.all()) \
if current_by == "asc" else current_votes.all()
context["current_off"] = current_off
past_votes = db.query(models.TUVoteInfo).filter(
models.TUVoteInfo.End <= ts).order_by(
models.TUVoteInfo.Submitted.desc())
context["past_votes_count"] = past_votes.count()
past_votes = past_votes.limit(pp).offset(past_off)
context["past_votes"] = reversed(past_votes.all()) \
if past_by == "asc" else past_votes.all()
context["past_off"] = past_off
last_vote = func.max(models.TUVote.VoteID).label("LastVote")
last_votes_by_tu = db.query(models.TUVote).join(models.User).join(
models.TUVoteInfo,
models.TUVoteInfo.ID == models.TUVote.VoteID
).filter(
and_(models.TUVote.VoteID == models.TUVoteInfo.ID,
models.User.ID == models.TUVote.UserID,
models.TUVoteInfo.End < ts,
or_(models.User.AccountTypeID == 2,
models.User.AccountTypeID == 4))
).with_entities(
models.TUVote.UserID,
last_vote,
models.User.Username
).group_by(models.TUVote.UserID).order_by(
last_vote.desc(), models.User.Username.asc())
context["last_votes_by_tu"] = last_votes_by_tu.all()
context["current_by_next"] = "asc" if current_by == "desc" else "desc"
context["past_by_next"] = "asc" if past_by == "desc" else "desc"
context["q"] = {
"coff": current_off,
"cby": current_by,
"poff": past_off,
"pby": past_by
}
return render_template(request, "tu/index.html", context)
def render_proposal(request: Request, context: dict, proposal: int,
voteinfo: models.TUVoteInfo,
voters: typing.Iterable[models.User],
vote: models.TUVote,
status_code: HTTPStatus = HTTPStatus.OK):
""" Render a single TU proposal. """
context["proposal"] = proposal
context["voteinfo"] = voteinfo
context["voters"] = voters.all()
total = voteinfo.total_votes()
participation = (total / voteinfo.ActiveTUs) if voteinfo.ActiveTUs else 0
context["participation"] = participation
accepted = (voteinfo.Yes > voteinfo.ActiveTUs / 2) or \
(participation > voteinfo.Quorum and voteinfo.Yes > voteinfo.No)
context["accepted"] = accepted
can_vote = voters.filter(models.TUVote.User == request.user).first() is None
context["can_vote"] = can_vote
if not voteinfo.is_running():
context["error"] = "Voting is closed for this proposal."
context["vote"] = vote
context["has_voted"] = vote is not None
return render_template(request, "tu/show.html", context,
status_code=status_code)
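The acceptance rule above passes a proposal either on an absolute majority of all active TUs, or on quorum participation with more Yes than No votes. Restated as a standalone function (assuming `voteinfo.total_votes()` sums Yes, No, and Abstain):

```python
def proposal_accepted(yes: int, no: int, abstain: int,
                      active_tus: int, quorum: float) -> bool:
    # Mirrors render_proposal: absolute majority of active TUs, or
    # participation above quorum with Yes ahead of No.
    total = yes + no + abstain
    participation = (total / active_tus) if active_tus else 0
    return yes > active_tus / 2 or (participation > quorum and yes > no)
```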
@router.get("/tu/{proposal}")
@requires_auth
async def trusted_user_proposal(request: Request, proposal: int):
if not request.user.has_credential(creds.TU_LIST_VOTES):
return RedirectResponse("/tu", status_code=HTTPStatus.SEE_OTHER)
context = await make_variable_context(request, "Trusted User")
proposal = int(proposal)
voteinfo = db.query(models.TUVoteInfo).filter(
models.TUVoteInfo.ID == proposal).first()
if not voteinfo:
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
voters = db.query(models.User).join(models.TUVote).filter(
models.TUVote.VoteID == voteinfo.ID)
vote = db.query(models.TUVote).filter(
and_(models.TUVote.UserID == request.user.ID,
models.TUVote.VoteID == voteinfo.ID)).first()
if not request.user.has_credential(creds.TU_VOTE):
context["error"] = "Only Trusted Users are allowed to vote."
if voteinfo.User == request.user.Username:
context["error"] = "You cannot vote in a proposal about you."
elif vote is not None:
context["error"] = "You've already voted for this proposal."
context["vote"] = vote
return render_proposal(request, context, proposal, voteinfo, voters, vote)
@router.post("/tu/{proposal}")
@handle_form_exceptions
@requires_auth
async def trusted_user_proposal_post(request: Request, proposal: int,
decision: str = Form(...)):
if not request.user.has_credential(creds.TU_LIST_VOTES):
return RedirectResponse("/tu", status_code=HTTPStatus.SEE_OTHER)
context = await make_variable_context(request, "Trusted User")
proposal = int(proposal) # Make sure it's an int.
voteinfo = db.query(models.TUVoteInfo).filter(
models.TUVoteInfo.ID == proposal).first()
if not voteinfo:
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
voters = db.query(models.User).join(models.TUVote).filter(
models.TUVote.VoteID == voteinfo.ID)
vote = db.query(models.TUVote).filter(
and_(models.TUVote.UserID == request.user.ID,
models.TUVote.VoteID == voteinfo.ID)).first()
status_code = HTTPStatus.OK
if not request.user.has_credential(creds.TU_VOTE):
context["error"] = "Only Trusted Users are allowed to vote."
status_code = HTTPStatus.UNAUTHORIZED
elif voteinfo.User == request.user.Username:
context["error"] = "You cannot vote in a proposal about you."
status_code = HTTPStatus.BAD_REQUEST
elif vote is not None:
context["error"] = "You've already voted for this proposal."
status_code = HTTPStatus.BAD_REQUEST
if status_code != HTTPStatus.OK:
return render_proposal(request, context, proposal,
voteinfo, voters, vote,
status_code=status_code)
if decision in {"Yes", "No", "Abstain"}:
# Increment whichever decision was given to us.
setattr(voteinfo, decision, getattr(voteinfo, decision) + 1)
else:
return Response("Invalid 'decision' value.",
status_code=HTTPStatus.BAD_REQUEST)
with db.begin():
vote = db.create(models.TUVote, User=request.user, VoteInfo=voteinfo)
context["error"] = "You've already voted for this proposal."
return render_proposal(request, context, proposal, voteinfo, voters, vote)
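The POST handler above tallies a ballot with a getattr/setattr increment on whichever decision column matched. The pattern can be exercised standalone; `VoteInfo` here is a hypothetical stand-in for the `TUVoteInfo` model, not aurweb's actual class:

```python
# Hypothetical stand-in for TUVoteInfo: just the three counters
# the handler increments.
class VoteInfo:
    def __init__(self):
        self.Yes = 0
        self.No = 0
        self.Abstain = 0

def cast(voteinfo, decision):
    # Mirror the handler: only the three known decisions are counted;
    # anything else is rejected before touching the record.
    if decision not in {"Yes", "No", "Abstain"}:
        raise ValueError("Invalid 'decision' value.")
    setattr(voteinfo, decision, getattr(voteinfo, decision) + 1)

info = VoteInfo()
cast(info, "Yes")
cast(info, "Yes")
cast(info, "Abstain")
print(info.Yes, info.No, info.Abstain)  # 2 0 1
```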
@router.get("/addvote")
@requires_auth
async def trusted_user_addvote(request: Request, user: str = str(),
type: str = "add_tu", agenda: str = str()):
if not request.user.has_credential(creds.TU_ADD_VOTE):
return RedirectResponse("/tu", status_code=HTTPStatus.SEE_OTHER)
context = await make_variable_context(request, "Add Proposal")
if type not in ADDVOTE_SPECIFICS:
context["error"] = "Invalid type."
type = "add_tu" # Default it.
context["user"] = user
context["type"] = type
context["agenda"] = agenda
return render_template(request, "addvote.html", context)
@router.post("/addvote")
@handle_form_exceptions
@requires_auth
async def trusted_user_addvote_post(request: Request,
user: str = Form(default=str()),
type: str = Form(default=str()),
agenda: str = Form(default=str())):
if not request.user.has_credential(creds.TU_ADD_VOTE):
return RedirectResponse("/tu", status_code=HTTPStatus.SEE_OTHER)
# Build a context.
context = await make_variable_context(request, "Add Proposal")
context["type"] = type
context["user"] = user
context["agenda"] = agenda
def render_addvote(context, status_code):
""" Simplify render_template a bit for this test. """
return render_template(request, "addvote.html", context, status_code)
# Alright, get some database records, if we can.
if type != "bylaws":
user_record = db.query(models.User).filter(
models.User.Username == user).first()
if user_record is None:
context["error"] = "Username does not exist."
return render_addvote(context, HTTPStatus.NOT_FOUND)
utcnow = time.utcnow()
voteinfo = db.query(models.TUVoteInfo).filter(
and_(models.TUVoteInfo.User == user,
models.TUVoteInfo.End > utcnow)).count()
if voteinfo:
_ = l10n.get_translator_for_request(request)
context["error"] = _(
"%s already has a proposal running for them.") % (
html.escape(user),)
return render_addvote(context, HTTPStatus.BAD_REQUEST)
if type not in ADDVOTE_SPECIFICS:
context["error"] = "Invalid type."
context["type"] = type = "add_tu" # Default for rendering.
return render_addvote(context, HTTPStatus.BAD_REQUEST)
if not agenda:
context["error"] = "Proposal cannot be empty."
return render_addvote(context, HTTPStatus.BAD_REQUEST)
# Gather some mapped constants and the current timestamp.
duration, quorum = ADDVOTE_SPECIFICS.get(type)
timestamp = time.utcnow()
# Active TU types we filter for.
types = {TRUSTED_USER_ID, TRUSTED_USER_AND_DEV_ID}
# Create a new TUVoteInfo (proposal)!
with db.begin():
active_tus = db.query(User).filter(
and_(User.Suspended == 0,
User.InactivityTS.isnot(None),
User.AccountTypeID.in_(types))
).count()
voteinfo = db.create(models.TUVoteInfo, User=user,
Agenda=html.escape(agenda),
Submitted=timestamp, End=(timestamp + duration),
Quorum=quorum, ActiveTUs=active_tus,
Submitter=request.user)
# Redirect to the new proposal.
endpoint = f"/tu/{voteinfo.ID}"
return RedirectResponse(endpoint, status_code=HTTPStatus.SEE_OTHER)
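The proposal's `End` is an absolute epoch timestamp computed from the `(duration, quorum)` pair that `ADDVOTE_SPECIFICS` maps each proposal type to. A minimal sketch of that computation; the mapping values below are illustrative only, not aurweb's real durations or quorums:

```python
import time

# Hypothetical (duration_seconds, quorum) pairs; ADDVOTE_SPECIFICS is
# defined elsewhere in aurweb with its own values.
ADDVOTE_SPECIFICS = {
    "add_tu": (7 * 24 * 3600, 0.66),
    "bylaws": (14 * 24 * 3600, 0.75),
}

def proposal_window(vote_type):
    duration, quorum = ADDVOTE_SPECIFICS[vote_type]
    submitted = int(time.time())
    # End is stored as an absolute timestamp, as in the TUVoteInfo row.
    return submitted, submitted + duration, quorum

start, end, quorum = proposal_window("add_tu")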


@@ -1,16 +1,15 @@
import os
from collections import defaultdict
from typing import Any, Callable, Dict, List, NewType, Union
from typing import Any, Callable, NewType, Union
from fastapi.responses import HTMLResponse
from sqlalchemy import and_, literal, orm
import aurweb.config as config
from aurweb import db, defaults, models
from aurweb import db, defaults, models, time
from aurweb.exceptions import RPCError
from aurweb.filters import number_format
from aurweb.models.package_base import popularity
from aurweb.packages.search import RPCSearch
TYPE_MAPPING = {
@@ -23,8 +22,7 @@ TYPE_MAPPING = {
"replaces": "Replaces",
}
DataGenerator = NewType("DataGenerator",
Callable[[models.Package], Dict[str, Any]])
DataGenerator = NewType("DataGenerator", Callable[[models.Package], dict[str, Any]])
def documentation():
@@ -40,7 +38,7 @@ def documentation():
class RPC:
""" RPC API handler class.
"""RPC API handler class.
There are various pieces to RPC's process, and encapsulating them
inside of a class means that external users do not abuse the
@@ -66,36 +64,58 @@ class RPC:
# A set of RPC types supported by this API.
EXPOSED_TYPES = {
"info", "multiinfo",
"search", "msearch",
"suggest", "suggest-pkgbase"
"info",
"multiinfo",
"search",
"msearch",
"suggest",
"suggest-pkgbase",
}
# A mapping of type aliases.
TYPE_ALIASES = {"info": "multiinfo"}
EXPOSED_BYS = {
"name-desc", "name", "maintainer",
"depends", "makedepends", "optdepends", "checkdepends"
"name-desc",
"name",
"maintainer",
"depends",
"makedepends",
"optdepends",
"checkdepends",
"provides",
"conflicts",
"replaces",
"groups",
"submitter",
"keywords",
"comaintainers",
}
# A mapping of by aliases.
BY_ALIASES = {"name-desc": "nd", "name": "n", "maintainer": "m"}
BY_ALIASES = {
"name-desc": "nd",
"name": "n",
"maintainer": "m",
"submitter": "s",
"keywords": "k",
"comaintainers": "c",
}
def __init__(self, version: int = 0, type: str = None) -> "RPC":
self.version = version
self.type = RPC.TYPE_ALIASES.get(type, type)
def error(self, message: str) -> Dict[str, Any]:
def error(self, message: str) -> dict[str, Any]:
return {
"version": self.version,
"results": [],
"resultcount": 0,
"type": "error",
"error": message
"error": message,
}
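Every failure path in the API funnels through this error envelope: the usual response shape with an empty result set and `type` forced to `"error"`. Sketched as a standalone function for clarity:

```python
def rpc_error(version, message):
    # Same shape as RPC.error(): empty results plus type "error",
    # so clients can parse failures with the same envelope as successes.
    return {
        "version": version,
        "results": [],
        "resultcount": 0,
        "type": "error",
        "error": message,
    }

resp = rpc_error(5, "Incorrect request type specified.")
print(resp["type"], resp["resultcount"])  # error 0
```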
def _verify_inputs(self, by: str = [], args: List[str] = []) -> None:
def _verify_inputs(self, by: str = [], args: list[str] = []) -> None:
if self.version is None:
raise RPCError("Please specify an API version.")
@@ -111,20 +131,19 @@ class RPC:
if self.type not in RPC.EXPOSED_TYPES:
raise RPCError("Incorrect request type specified.")
def _enforce_args(self, args: List[str]) -> None:
def _enforce_args(self, args: list[str]) -> None:
if not args:
raise RPCError("No request type/data specified.")
def _get_json_data(self, package: models.Package) -> Dict[str, Any]:
""" Produce dictionary data of one Package that can be JSON-serialized.
def get_json_data(self, package: models.Package) -> dict[str, Any]:
"""Produce dictionary data of one Package that can be JSON-serialized.
:param package: Package instance
:returns: JSON-serializable dictionary
"""
# Produce RPC API compatible Popularity: If zero, it's an integer
# 0, otherwise, it's formatted to the 6th decimal place.
pop = package.Popularity
# Normalize Popularity for RPC output to 6 decimal precision
pop = popularity(package, time.utcnow())
pop = 0 if not pop else float(number_format(pop, 6))
snapshot_uri = config.get("options", "snapshot_uri")
@@ -135,26 +154,24 @@ class RPC:
"PackageBase": package.PackageBaseName,
# Maintainer should be set following this update if one exists.
"Maintainer": package.Maintainer,
"Submitter": package.Submitter,
"Version": package.Version,
"Description": package.Description,
"URL": package.URL,
"URLPath": snapshot_uri % package.Name,
"URLPath": snapshot_uri % package.PackageBaseName,
"NumVotes": package.NumVotes,
"Popularity": pop,
"OutOfDate": package.OutOfDateTS,
"FirstSubmitted": package.SubmittedTS,
"LastModified": package.ModifiedTS
"LastModified": package.ModifiedTS,
}
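The Popularity normalization above keeps a zero popularity as an integer `0` and renders anything else as a float with six decimal places. A minimal sketch of that rule; `number_format` is aurweb's own filter, assumed here to behave like `format(value, ".6f")` for illustration:

```python
# Assumption: aurweb.filters.number_format rounds like format(v, ".6f").
def number_format(value, places):
    return format(value, f".{places}f")

def rpc_popularity(pop):
    # Zero stays an integer 0; anything else is rounded to six
    # decimal places, matching the RPC output field above.
    return 0 if not pop else float(number_format(pop, 6))

print(rpc_popularity(0))          # 0
print(rpc_popularity(0.1234567))  # 0.123457
```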
def _get_info_json_data(self, package: models.Package) -> Dict[str, Any]:
data = self._get_json_data(package)
def get_info_json_data(self, package: models.Package) -> dict[str, Any]:
data = self.get_json_data(package)
# All info results have _at least_ an empty list of
# License and Keywords.
data.update({
"License": [],
"Keywords": []
})
data.update({"License": [], "Keywords": []})
# If we actually got extra_info records, update data with
# them for this particular package.
@@ -163,9 +180,9 @@ class RPC:
return data
def _assemble_json_data(self, packages: List[models.Package],
data_generator: DataGenerator) \
-> List[Dict[str, Any]]:
def _assemble_json_data(
self, packages: list[models.Package], data_generator: DataGenerator
) -> list[dict[str, Any]]:
"""
Assemble JSON data out of a list of packages.
@@ -174,108 +191,129 @@ class RPC:
"""
return [data_generator(pkg) for pkg in packages]
def _entities(self, query: orm.Query) -> orm.Query:
""" Select specific RPC columns on `query`. """
return query.with_entities(
models.Package.ID,
models.Package.Name,
models.Package.Version,
models.Package.Description,
models.Package.URL,
models.Package.PackageBaseID,
models.PackageBase.Name.label("PackageBaseName"),
models.PackageBase.NumVotes,
models.PackageBase.Popularity,
models.PackageBase.OutOfDateTS,
models.PackageBase.SubmittedTS,
models.PackageBase.ModifiedTS,
models.User.Username.label("Maintainer"),
).group_by(models.Package.ID)
def entities(self, query: orm.Query) -> orm.Query:
"""Select specific RPC columns on `query`."""
Submitter = orm.aliased(models.User)
def _handle_multiinfo_type(self, args: List[str] = [], **kwargs) \
-> List[Dict[str, Any]]:
self._enforce_args(args)
args = set(args)
query = (
query.join(
Submitter,
Submitter.ID == models.PackageBase.SubmitterUID,
isouter=True,
)
.with_entities(
models.Package.ID,
models.Package.Name,
models.Package.Version,
models.Package.Description,
models.Package.URL,
models.Package.PackageBaseID,
models.PackageBase.Name.label("PackageBaseName"),
models.PackageBase.NumVotes,
models.PackageBase.Popularity,
models.PackageBase.PopularityUpdated,
models.PackageBase.OutOfDateTS,
models.PackageBase.SubmittedTS,
models.PackageBase.ModifiedTS,
models.User.Username.label("Maintainer"),
Submitter.Username.label("Submitter"),
)
.group_by(models.Package.ID)
)
packages = db.query(models.Package).join(models.PackageBase).join(
models.User,
models.User.ID == models.PackageBase.MaintainerUID,
isouter=True
).filter(models.Package.Name.in_(args))
return query
max_results = config.getint("options", "max_rpc_results")
packages = self._entities(packages).limit(max_results + 1)
if packages.count() > max_results:
raise RPCError("Too many package results.")
ids = {pkg.ID for pkg in packages}
# Aliases for 80-width.
def subquery(self, ids: set[int]):
Package = models.Package
PackageKeyword = models.PackageKeyword
subqueries = [
# PackageDependency
db.query(
models.PackageDependency
).join(models.DependencyType).filter(
models.PackageDependency.PackageID.in_(ids)
).with_entities(
db.query(models.PackageDependency)
.join(models.DependencyType)
.filter(models.PackageDependency.PackageID.in_(ids))
.with_entities(
models.PackageDependency.PackageID.label("ID"),
models.DependencyType.Name.label("Type"),
models.PackageDependency.DepName.label("Name"),
models.PackageDependency.DepCondition.label("Cond")
).distinct().order_by("Name"),
models.PackageDependency.DepCondition.label("Cond"),
)
.distinct()
.order_by("Name"),
# PackageRelation
db.query(
models.PackageRelation
).join(models.RelationType).filter(
models.PackageRelation.PackageID.in_(ids)
).with_entities(
db.query(models.PackageRelation)
.join(models.RelationType)
.filter(models.PackageRelation.PackageID.in_(ids))
.with_entities(
models.PackageRelation.PackageID.label("ID"),
models.RelationType.Name.label("Type"),
models.PackageRelation.RelName.label("Name"),
models.PackageRelation.RelCondition.label("Cond")
).distinct().order_by("Name"),
models.PackageRelation.RelCondition.label("Cond"),
)
.distinct()
.order_by("Name"),
# Groups
db.query(models.PackageGroup).join(
db.query(models.PackageGroup)
.join(
models.Group,
and_(models.PackageGroup.GroupID == models.Group.ID,
models.PackageGroup.PackageID.in_(ids))
).with_entities(
and_(
models.PackageGroup.GroupID == models.Group.ID,
models.PackageGroup.PackageID.in_(ids),
),
)
.with_entities(
models.PackageGroup.PackageID.label("ID"),
literal("Groups").label("Type"),
models.Group.Name.label("Name"),
literal(str()).label("Cond")
).distinct().order_by("Name"),
literal(str()).label("Cond"),
)
.distinct()
.order_by("Name"),
# Licenses
db.query(models.PackageLicense).join(
models.License,
models.PackageLicense.LicenseID == models.License.ID
).filter(
models.PackageLicense.PackageID.in_(ids)
).with_entities(
db.query(models.PackageLicense)
.join(models.License, models.PackageLicense.LicenseID == models.License.ID)
.filter(models.PackageLicense.PackageID.in_(ids))
.with_entities(
models.PackageLicense.PackageID.label("ID"),
literal("License").label("Type"),
models.License.Name.label("Name"),
literal(str()).label("Cond")
).distinct().order_by("Name"),
literal(str()).label("Cond"),
)
.distinct()
.order_by("Name"),
# Keywords
db.query(models.PackageKeyword).join(
db.query(models.PackageKeyword)
.join(
models.Package,
and_(Package.PackageBaseID == PackageKeyword.PackageBaseID,
Package.ID.in_(ids))
).with_entities(
and_(
Package.PackageBaseID == PackageKeyword.PackageBaseID,
Package.ID.in_(ids),
),
)
.with_entities(
models.Package.ID.label("ID"),
literal("Keywords").label("Type"),
models.PackageKeyword.Keyword.label("Name"),
literal(str()).label("Cond")
).distinct().order_by("Name")
literal(str()).label("Cond"),
)
.distinct()
.order_by("Name"),
# Co-Maintainer
db.query(models.PackageComaintainer)
.join(models.User, models.User.ID == models.PackageComaintainer.UsersID)
.join(
models.Package,
models.Package.PackageBaseID
== models.PackageComaintainer.PackageBaseID,
)
.with_entities(
models.Package.ID,
literal("CoMaintainers").label("Type"),
models.User.Username.label("Name"),
literal(str()).label("Cond"),
)
.distinct() # A package could have the same co-maintainer multiple times
.order_by("Name"),
]
# Union all subqueries together.
@@ -293,10 +331,37 @@ class RPC:
self.extra_info[record.ID][type_].append(name)
return self._assemble_json_data(packages, self._get_info_json_data)
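Each unioned subquery yields uniform `(ID, Type, Name, Cond)` rows, which the loop above fans out into `extra_info`, a per-package dict of lists keyed by `Type`. A standalone sketch of that grouping step, with illustrative rows and simplified condition handling:

```python
from collections import defaultdict

# Rows shaped like the unioned subqueries: (PackageID, Type, Name, Cond).
# The values are illustrative, not real AUR data.
rows = [
    (1, "Depends", "glibc", ">=2.38"),
    (1, "License", "GPL", ""),
    (1, "Keywords", "cli", ""),
    (2, "License", "MIT", ""),
]

# Mirror of the extra_info fan-out: one dict of lists per package ID,
# with the version condition appended to the name when present.
extra_info = defaultdict(lambda: defaultdict(list))
for pkg_id, type_, name, cond in rows:
    extra_info[pkg_id][type_].append(name + cond)
```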
def _handle_multiinfo_type(
self, args: list[str] = [], **kwargs
) -> list[dict[str, Any]]:
self._enforce_args(args)
args = set(args)
def _handle_search_type(self, by: str = defaults.RPC_SEARCH_BY,
args: List[str] = []) -> List[Dict[str, Any]]:
packages = (
db.query(models.Package)
.join(models.PackageBase)
.join(
models.User,
models.User.ID == models.PackageBase.MaintainerUID,
isouter=True,
)
.filter(models.Package.Name.in_(args))
)
max_results = config.getint("options", "max_rpc_results")
packages = self.entities(packages).limit(max_results + 1)
if packages.count() > max_results:
raise RPCError("Too many package results.")
ids = {pkg.ID for pkg in packages}
self.subquery(ids)
return self._assemble_json_data(packages, self.get_info_json_data)
def _handle_search_type(
self, by: str = defaults.RPC_SEARCH_BY, args: list[str] = []
) -> list[dict[str, Any]]:
# If `by` isn't maintainer and we don't have any args, raise an error.
# In maintainer's case, return all orphans if there are no args,
# so we need args to pass through to the handler without errors.
@@ -311,56 +376,77 @@ class RPC:
search.search_by(by, arg)
max_results = config.getint("options", "max_rpc_results")
results = self._entities(search.results()).limit(max_results + 1).all()
query = self.entities(search.results()).limit(max_results + 1)
# For "provides", we need to union our relation search
# with an exact search since a package always provides itself.
# Turns out that doing this with an OR statement is extremely slow
if by == "provides":
search = RPCSearch()
search._search_by_exact_name(arg)
query = query.union(self.entities(search.results()))
results = query.all()
if len(results) > max_results:
raise RPCError("Too many package results.")
return self._assemble_json_data(results, self._get_json_data)
data = self._assemble_json_data(results, self.get_json_data)
def _handle_msearch_type(self, args: List[str] = [], **kwargs)\
-> List[Dict[str, Any]]:
# remove Submitter for search results
for pkg in data:
pkg.pop("Submitter")
return data
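The comment above notes that folding the self-provide into the relation search with an `OR` was extremely slow, so the handler unions a separate exact-name query instead: two cheap lookups whose duplicates `UNION` removes. A small SQLite sketch of the same idea, with made-up tables rather than aurweb's schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE packages (name TEXT)")
con.execute("CREATE TABLE provides (pkg TEXT, name TEXT)")
con.executemany("INSERT INTO packages VALUES (?)", [("foo",), ("foo-git",)])
# foo-git explicitly provides foo; foo implicitly provides itself.
con.execute("INSERT INTO provides VALUES ('foo-git', 'foo')")

# Two simple lookups unioned (UNION also de-duplicates), instead of
# one slow OR across both conditions.
rows = con.execute(
    "SELECT pkg FROM provides WHERE name = 'foo' "
    "UNION "
    "SELECT name FROM packages WHERE name = 'foo'"
).fetchall()
print(sorted(r[0] for r in rows))  # ['foo', 'foo-git']
```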
def _handle_msearch_type(
self, args: list[str] = [], **kwargs
) -> list[dict[str, Any]]:
return self._handle_search_type(by="m", args=args)
def _handle_suggest_type(self, args: List[str] = [], **kwargs)\
-> List[str]:
def _handle_suggest_type(self, args: list[str] = [], **kwargs) -> list[str]:
if not args:
return []
arg = args[0]
packages = db.query(models.Package.Name).join(
models.PackageBase
).filter(
and_(models.PackageBase.PackagerUID.isnot(None),
models.Package.Name.like(f"%{arg}%"))
).order_by(models.Package.Name.asc()).limit(20)
packages = (
db.query(models.Package.Name)
.join(models.PackageBase)
.filter(models.Package.Name.like(f"{arg}%"))
.order_by(models.Package.Name.asc())
.limit(20)
)
return [pkg.Name for pkg in packages]
def _handle_suggest_pkgbase_type(self, args: List[str] = [], **kwargs)\
-> List[str]:
def _handle_suggest_pkgbase_type(self, args: list[str] = [], **kwargs) -> list[str]:
if not args:
return []
packages = db.query(models.PackageBase.Name).filter(
and_(models.PackageBase.PackagerUID.isnot(None),
models.PackageBase.Name.like(f"%{args[0]}%"))
).order_by(models.PackageBase.Name.asc()).limit(20)
arg = args[0]
packages = (
db.query(models.PackageBase.Name)
.filter(models.PackageBase.Name.like(f"{arg}%"))
.order_by(models.PackageBase.Name.asc())
.limit(20)
)
return [pkg.Name for pkg in packages]
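Both suggest handlers now match `LIKE 'arg%'` (a prefix) rather than the old `'%arg%'` substring pattern, which lets the database satisfy the query from an index on the name column. A small SQLite sketch of the behavioral difference, with made-up package names:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE packages (name TEXT)")
con.executemany(
    "INSERT INTO packages VALUES (?)",
    [("python",), ("python-requests",), ("mypython",)],
)

# Prefix match, as in the rewritten suggest handlers: 'mypython' no
# longer matches, and the leading-wildcard index penalty is gone.
rows = con.execute(
    "SELECT name FROM packages WHERE name LIKE 'python%' "
    "ORDER BY name ASC LIMIT 20"
).fetchall()
print([r[0] for r in rows])  # ['python', 'python-requests']
```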
def _is_suggestion(self) -> bool:
return self.type.startswith("suggest")
def _handle_callback(self, by: str, args: List[str])\
-> Union[List[Dict[str, Any]], List[str]]:
def _handle_callback(
self, by: str, args: list[str]
) -> Union[list[dict[str, Any]], list[str]]:
# Get a handle to our callback and trap an RPCError with
# an empty list of results based on callback's execution.
callback = getattr(self, f"_handle_{self.type.replace('-', '_')}_type")
results = callback(by=by, args=args)
return results
def handle(self, by: str = defaults.RPC_SEARCH_BY, args: List[str] = [])\
-> Union[List[Dict[str, Any]], Dict[str, Any]]:
""" Request entrypoint. A router should pass v, type and args
def handle(
self, by: str = defaults.RPC_SEARCH_BY, args: list[str] = []
) -> Union[list[dict[str, Any]], dict[str, Any]]:
"""Request entrypoint. A router should pass v, type and args
to this function and expect an output dictionary to be returned.
:param v: RPC version argument
@@ -391,8 +477,5 @@ class RPC:
return results
# Return JSON output.
data.update({
"resultcount": len(results),
"results": results
})
data.update({"resultcount": len(results), "results": results})
return data


@@ -5,8 +5,18 @@ Changes here should always be accompanied by an Alembic migration, which can
usually be automatically generated. See `migrations/README` for details.
"""
from sqlalchemy import CHAR, TIMESTAMP, Column, ForeignKey, Index, MetaData, String, Table, Text, text
from sqlalchemy import (
CHAR,
TIMESTAMP,
Column,
ForeignKey,
Index,
MetaData,
String,
Table,
Text,
text,
)
from sqlalchemy.dialects.mysql import BIGINT, DECIMAL, INTEGER, TINYINT
from sqlalchemy.ext.compiler import compiles
@@ -15,13 +25,13 @@ import aurweb.config
db_backend = aurweb.config.get("database", "backend")
@compiles(TINYINT, 'sqlite')
@compiles(TINYINT, "sqlite")
def compile_tinyint_sqlite(type_, compiler, **kw): # pragma: no cover
"""TINYINT is not supported on SQLite. Substitute it with INTEGER."""
return 'INTEGER'
return "INTEGER"
@compiles(BIGINT, 'sqlite')
@compiles(BIGINT, "sqlite")
def compile_bigint_sqlite(type_, compiler, **kw): # pragma: no cover
"""
For SQLite's AUTOINCREMENT to work on BIGINT columns, we need to map BIGINT
@@ -29,429 +39,585 @@ def compile_bigint_sqlite(type_, compiler, **kw):  # pragma: no cover
See https://docs.sqlalchemy.org/en/13/dialects/sqlite.html#allowing-autoincrement-behavior-sqlalchemy-types-other-than-integer-integer
""" # noqa: E501
return 'INTEGER'
return "INTEGER"
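Both compile hooks substitute `INTEGER` because SQLite's `INTEGER` storage class is variable-width up to 8 bytes (so it loses nothing against `BIGINT` or `TINYINT`), and only an `INTEGER PRIMARY KEY` column aliases the auto-incrementing rowid. A quick sketch demonstrating both points with the stdlib driver:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# INTEGER PRIMARY KEY aliases the rowid, so ids auto-assign; a plain
# INTEGER column happily stores values far beyond 32 bits.
con.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, v INTEGER)")
con.execute("INSERT INTO t (v) VALUES (?)", (2**40,))
con.execute("INSERT INTO t (v) VALUES (?)", (255,))
rows = con.execute("SELECT id, v FROM t ORDER BY id").fetchall()
print(rows)  # [(1, 1099511627776), (2, 255)]
```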
metadata = MetaData()
# Define the Account Types for the AUR.
AccountTypes = Table(
'AccountTypes', metadata,
Column('ID', TINYINT(unsigned=True), primary_key=True),
Column('AccountType', String(32), nullable=False, server_default=text("''")),
mysql_engine='InnoDB',
mysql_charset='utf8mb4',
mysql_collate='utf8mb4_general_ci'
"AccountTypes",
metadata,
Column("ID", TINYINT(unsigned=True), primary_key=True),
Column("AccountType", String(32), nullable=False, server_default=text("''")),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# User information for each user regardless of type.
Users = Table(
'Users', metadata,
Column('ID', INTEGER(unsigned=True), primary_key=True),
Column('AccountTypeID', ForeignKey('AccountTypes.ID', ondelete="NO ACTION"), nullable=False, server_default=text("1")),
Column('Suspended', TINYINT(unsigned=True), nullable=False, server_default=text("0")),
Column('Username', String(32), nullable=False, unique=True),
Column('Email', String(254), nullable=False, unique=True),
Column('BackupEmail', String(254)),
Column('HideEmail', TINYINT(unsigned=True), nullable=False, server_default=text("0")),
Column('Passwd', String(255), nullable=False),
Column('Salt', CHAR(32), nullable=False, server_default=text("''")),
Column('ResetKey', CHAR(32), nullable=False, server_default=text("''")),
Column('RealName', String(64), nullable=False, server_default=text("''")),
Column('LangPreference', String(6), nullable=False, server_default=text("'en'")),
Column('Timezone', String(32), nullable=False, server_default=text("'UTC'")),
Column('Homepage', Text),
Column('IRCNick', String(32), nullable=False, server_default=text("''")),
Column('PGPKey', String(40)),
Column('LastLogin', BIGINT(unsigned=True), nullable=False, server_default=text("0")),
Column('LastLoginIPAddress', String(45)),
Column('LastSSHLogin', BIGINT(unsigned=True), nullable=False, server_default=text("0")),
Column('LastSSHLoginIPAddress', String(45)),
Column('InactivityTS', BIGINT(unsigned=True), nullable=False, server_default=text("0")),
Column('RegistrationTS', TIMESTAMP, nullable=False, server_default=text("CURRENT_TIMESTAMP")),
Column('CommentNotify', TINYINT(1), nullable=False, server_default=text("1")),
Column('UpdateNotify', TINYINT(1), nullable=False, server_default=text("0")),
Column('OwnershipNotify', TINYINT(1), nullable=False, server_default=text("1")),
Column('SSOAccountID', String(255), nullable=True, unique=True),
Index('UsersAccountTypeID', 'AccountTypeID'),
mysql_engine='InnoDB',
mysql_charset='utf8mb4',
mysql_collate='utf8mb4_general_ci',
"Users",
metadata,
Column("ID", INTEGER(unsigned=True), primary_key=True),
Column(
"AccountTypeID",
ForeignKey("AccountTypes.ID", ondelete="NO ACTION"),
nullable=False,
server_default=text("1"),
),
Column(
"Suspended", TINYINT(unsigned=True), nullable=False, server_default=text("0")
),
Column("Username", String(32), nullable=False, unique=True),
Column("Email", String(254), nullable=False, unique=True),
Column("BackupEmail", String(254)),
Column(
"HideEmail", TINYINT(unsigned=True), nullable=False, server_default=text("0")
),
Column("Passwd", String(255), nullable=False),
Column("Salt", CHAR(32), nullable=False, server_default=text("''")),
Column("ResetKey", CHAR(32), nullable=False, server_default=text("''")),
Column("RealName", String(64), nullable=False, server_default=text("''")),
Column("LangPreference", String(6), nullable=False, server_default=text("'en'")),
Column("Timezone", String(32), nullable=False, server_default=text("'UTC'")),
Column("Homepage", Text),
Column("IRCNick", String(32), nullable=False, server_default=text("''")),
Column("PGPKey", String(40)),
Column(
"LastLogin", BIGINT(unsigned=True), nullable=False, server_default=text("0")
),
Column("LastLoginIPAddress", String(45)),
Column(
"LastSSHLogin", BIGINT(unsigned=True), nullable=False, server_default=text("0")
),
Column("LastSSHLoginIPAddress", String(45)),
Column(
"InactivityTS", BIGINT(unsigned=True), nullable=False, server_default=text("0")
),
Column(
"RegistrationTS",
TIMESTAMP,
nullable=False,
server_default=text("CURRENT_TIMESTAMP"),
),
Column("CommentNotify", TINYINT(1), nullable=False, server_default=text("1")),
Column("UpdateNotify", TINYINT(1), nullable=False, server_default=text("0")),
Column("OwnershipNotify", TINYINT(1), nullable=False, server_default=text("1")),
Column("SSOAccountID", String(255), nullable=True, unique=True),
Index("UsersAccountTypeID", "AccountTypeID"),
Column(
"HideDeletedComments",
TINYINT(unsigned=True),
nullable=False,
server_default=text("0"),
),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# SSH public keys used for the aurweb SSH/Git interface.
SSHPubKeys = Table(
'SSHPubKeys', metadata,
Column('UserID', ForeignKey('Users.ID', ondelete='CASCADE'), nullable=False),
Column('Fingerprint', String(44), primary_key=True),
Column('PubKey', String(4096), nullable=False),
mysql_engine='InnoDB', mysql_charset='utf8mb4', mysql_collate='utf8mb4_bin',
"SSHPubKeys",
metadata,
Column("UserID", ForeignKey("Users.ID", ondelete="CASCADE"), nullable=False),
Column("Fingerprint", String(44), primary_key=True),
Column("PubKey", String(4096), nullable=False),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_bin",
)
# Track Users logging in/out of AUR web site.
Sessions = Table(
'Sessions', metadata,
Column('UsersID', ForeignKey('Users.ID', ondelete='CASCADE'), nullable=False),
Column('SessionID', CHAR(32), nullable=False, unique=True),
Column('LastUpdateTS', BIGINT(unsigned=True), nullable=False),
mysql_engine='InnoDB', mysql_charset='utf8mb4', mysql_collate='utf8mb4_bin',
"Sessions",
metadata,
Column("UsersID", ForeignKey("Users.ID", ondelete="CASCADE"), nullable=False),
Column("SessionID", CHAR(32), nullable=False, unique=True),
Column("LastUpdateTS", BIGINT(unsigned=True), nullable=False),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_bin",
)
# Information on package bases
PackageBases = Table(
'PackageBases', metadata,
Column('ID', INTEGER(unsigned=True), primary_key=True),
Column('Name', String(255), nullable=False, unique=True),
Column('NumVotes', INTEGER(unsigned=True), nullable=False, server_default=text("0")),
Column('Popularity',
DECIMAL(10, 6, unsigned=True)
if db_backend == "mysql" else String(17),
nullable=False, server_default=text("0")),
Column('OutOfDateTS', BIGINT(unsigned=True)),
Column('FlaggerComment', Text, nullable=False),
Column('SubmittedTS', BIGINT(unsigned=True), nullable=False),
Column('ModifiedTS', BIGINT(unsigned=True), nullable=False),
Column('FlaggerUID', ForeignKey('Users.ID', ondelete='SET NULL')), # who flagged the package out-of-date?
"PackageBases",
metadata,
Column("ID", INTEGER(unsigned=True), primary_key=True),
Column("Name", String(255), nullable=False, unique=True),
Column(
"NumVotes", INTEGER(unsigned=True), nullable=False, server_default=text("0")
),
Column(
"Popularity",
DECIMAL(10, 6, unsigned=True) if db_backend == "mysql" else String(17),
nullable=False,
server_default=text("0"),
),
Column(
"PopularityUpdated",
TIMESTAMP,
nullable=False,
server_default=text("'1970-01-01 00:00:01.000000'"),
),
Column("OutOfDateTS", BIGINT(unsigned=True)),
Column("FlaggerComment", Text, nullable=False),
Column("SubmittedTS", BIGINT(unsigned=True), nullable=False),
Column("ModifiedTS", BIGINT(unsigned=True), nullable=False),
Column(
"FlaggerUID", ForeignKey("Users.ID", ondelete="SET NULL")
), # who flagged the package out-of-date?
# deleting a user will cause packages to be orphaned, not deleted
Column('SubmitterUID', ForeignKey('Users.ID', ondelete='SET NULL')), # who submitted it?
Column('MaintainerUID', ForeignKey('Users.ID', ondelete='SET NULL')), # User
Column('PackagerUID', ForeignKey('Users.ID', ondelete='SET NULL')), # Last packager
Index('BasesMaintainerUID', 'MaintainerUID'),
Index('BasesNumVotes', 'NumVotes'),
Index('BasesPackagerUID', 'PackagerUID'),
Index('BasesSubmitterUID', 'SubmitterUID'),
mysql_engine='InnoDB',
mysql_charset='utf8mb4',
mysql_collate='utf8mb4_general_ci',
Column(
"SubmitterUID", ForeignKey("Users.ID", ondelete="SET NULL")
), # who submitted it?
Column("MaintainerUID", ForeignKey("Users.ID", ondelete="SET NULL")), # User
Column("PackagerUID", ForeignKey("Users.ID", ondelete="SET NULL")), # Last packager
Index("BasesMaintainerUID", "MaintainerUID"),
Index("BasesNumVotes", "NumVotes"),
Index("BasesPackagerUID", "PackagerUID"),
Index("BasesSubmitterUID", "SubmitterUID"),
Index("BasesSubmittedTS", "SubmittedTS"),
Index("BasesModifiedTS", "ModifiedTS"),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Keywords of package bases
PackageKeywords = Table(
'PackageKeywords', metadata,
Column('PackageBaseID', ForeignKey('PackageBases.ID', ondelete='CASCADE'), primary_key=True, nullable=True),
Column('Keyword', String(255), primary_key=True, nullable=False, server_default=text("''")),
mysql_engine='InnoDB',
mysql_charset='utf8mb4',
mysql_collate='utf8mb4_general_ci',
"PackageKeywords",
metadata,
Column(
"PackageBaseID",
ForeignKey("PackageBases.ID", ondelete="CASCADE"),
primary_key=True,
nullable=True,
),
Column(
"Keyword",
String(255),
primary_key=True,
nullable=False,
server_default=text("''"),
),
Index("KeywordsPackageBaseID", "PackageBaseID"),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Information about the actual packages
Packages = Table(
'Packages', metadata,
Column('ID', INTEGER(unsigned=True), primary_key=True),
Column('PackageBaseID', ForeignKey('PackageBases.ID', ondelete='CASCADE'), nullable=False),
Column('Name', String(255), nullable=False, unique=True),
Column('Version', String(255), nullable=False, server_default=text("''")),
Column('Description', String(255)),
Column('URL', String(8000)),
mysql_engine='InnoDB',
mysql_charset='utf8mb4',
mysql_collate='utf8mb4_general_ci',
"Packages",
metadata,
Column("ID", INTEGER(unsigned=True), primary_key=True),
Column(
"PackageBaseID",
ForeignKey("PackageBases.ID", ondelete="CASCADE"),
nullable=False,
),
Column("Name", String(255), nullable=False, unique=True),
Column("Version", String(255), nullable=False, server_default=text("''")),
Column("Description", String(255)),
Column("URL", String(8000)),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Information about licenses
Licenses = Table(
'Licenses', metadata,
Column('ID', INTEGER(unsigned=True), primary_key=True),
Column('Name', String(255), nullable=False, unique=True),
mysql_engine='InnoDB',
mysql_charset='utf8mb4',
mysql_collate='utf8mb4_general_ci',
"Licenses",
metadata,
Column("ID", INTEGER(unsigned=True), primary_key=True),
Column("Name", String(255), nullable=False, unique=True),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Information about package-license-relations
PackageLicenses = Table(
'PackageLicenses', metadata,
Column('PackageID', ForeignKey('Packages.ID', ondelete='CASCADE'), primary_key=True, nullable=True),
Column('LicenseID', ForeignKey('Licenses.ID', ondelete='CASCADE'), primary_key=True, nullable=True),
mysql_engine='InnoDB',
"PackageLicenses",
metadata,
Column(
"PackageID",
ForeignKey("Packages.ID", ondelete="CASCADE"),
primary_key=True,
nullable=True,
),
Column(
"LicenseID",
ForeignKey("Licenses.ID", ondelete="CASCADE"),
primary_key=True,
nullable=True,
),
mysql_engine="InnoDB",
)
# Information about groups
Groups = Table(
'Groups', metadata,
Column('ID', INTEGER(unsigned=True), primary_key=True),
Column('Name', String(255), nullable=False, unique=True),
mysql_engine='InnoDB',
mysql_charset='utf8mb4',
mysql_collate='utf8mb4_general_ci',
"Groups",
metadata,
Column("ID", INTEGER(unsigned=True), primary_key=True),
Column("Name", String(255), nullable=False, unique=True),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Information about package-group-relations
PackageGroups = Table(
'PackageGroups', metadata,
Column('PackageID', ForeignKey('Packages.ID', ondelete='CASCADE'), primary_key=True, nullable=True),
Column('GroupID', ForeignKey('Groups.ID', ondelete='CASCADE'), primary_key=True, nullable=True),
mysql_engine='InnoDB',
"PackageGroups",
metadata,
Column(
"PackageID",
ForeignKey("Packages.ID", ondelete="CASCADE"),
primary_key=True,
nullable=True,
),
Column(
"GroupID",
ForeignKey("Groups.ID", ondelete="CASCADE"),
primary_key=True,
nullable=True,
),
mysql_engine="InnoDB",
)
# Define the package dependency types
DependencyTypes = Table(
"DependencyTypes",
metadata,
Column("ID", TINYINT(unsigned=True), primary_key=True),
Column("Name", String(32), nullable=False, server_default=text("''")),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Track which dependencies a package has
PackageDepends = Table(
"PackageDepends",
metadata,
Column("PackageID", ForeignKey("Packages.ID", ondelete="CASCADE"), nullable=False),
Column(
"DepTypeID",
ForeignKey("DependencyTypes.ID", ondelete="NO ACTION"),
nullable=False,
),
Column("DepName", String(255), nullable=False),
Column("DepDesc", String(255)),
Column("DepCondition", String(255)),
Column("DepArch", String(255)),
Index("DependsDepName", "DepName"),
Index("DependsPackageID", "PackageID"),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Define the package relation types
RelationTypes = Table(
"RelationTypes",
metadata,
Column("ID", TINYINT(unsigned=True), primary_key=True),
Column("Name", String(32), nullable=False, server_default=text("''")),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Track which conflicts, provides and replaces a package has
PackageRelations = Table(
"PackageRelations",
metadata,
Column("PackageID", ForeignKey("Packages.ID", ondelete="CASCADE"), nullable=False),
Column(
"RelTypeID",
ForeignKey("RelationTypes.ID", ondelete="NO ACTION"),
nullable=False,
),
Column("RelName", String(255), nullable=False),
Column("RelCondition", String(255)),
Column("RelArch", String(255)),
Index("RelationsPackageID", "PackageID"),
Index("RelationsRelName", "RelName"),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Track which sources a package has
PackageSources = Table(
"PackageSources",
metadata,
Column("PackageID", ForeignKey("Packages.ID", ondelete="CASCADE"), nullable=False),
Column("Source", String(8000), nullable=False, server_default=text("'/dev/null'")),
Column("SourceArch", String(255)),
Index("SourcesPackageID", "PackageID"),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Track votes for packages
PackageVotes = Table(
"PackageVotes",
metadata,
Column("UsersID", ForeignKey("Users.ID", ondelete="CASCADE"), nullable=False),
Column(
"PackageBaseID",
ForeignKey("PackageBases.ID", ondelete="CASCADE"),
nullable=False,
),
Column("VoteTS", BIGINT(unsigned=True), nullable=False),
Index("VoteUsersIDPackageID", "UsersID", "PackageBaseID", unique=True),
Index("VotesPackageBaseID", "PackageBaseID"),
Index("VotesUsersID", "UsersID"),
mysql_engine="InnoDB",
)
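The unique composite index `VoteUsersIDPackageID` on `PackageVotes` is what enforces one vote per user per package base. A minimal sketch of the same constraint using only the stdlib `sqlite3` module (the real schema targets MySQL/MariaDB through SQLAlchemy; column types are simplified here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE PackageVotes ("
    "UsersID INTEGER NOT NULL, "
    "PackageBaseID INTEGER NOT NULL, "
    "VoteTS INTEGER NOT NULL)"
)
# Mirror of Index("VoteUsersIDPackageID", "UsersID", "PackageBaseID", unique=True)
conn.execute(
    "CREATE UNIQUE INDEX VoteUsersIDPackageID "
    "ON PackageVotes (UsersID, PackageBaseID)"
)

conn.execute("INSERT INTO PackageVotes VALUES (1, 10, 1700000000)")
try:
    # Same user voting on the same package base again is rejected.
    conn.execute("INSERT INTO PackageVotes VALUES (1, 10, 1700000001)")
    duplicate_allowed = True
except sqlite3.IntegrityError:
    duplicate_allowed = False

print(duplicate_allowed)  # False
```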
# Record comments for packages
PackageComments = Table(
"PackageComments",
metadata,
Column("ID", BIGINT(unsigned=True), primary_key=True),
Column(
"PackageBaseID",
ForeignKey("PackageBases.ID", ondelete="CASCADE"),
nullable=False,
),
Column("UsersID", ForeignKey("Users.ID", ondelete="SET NULL")),
Column("Comments", Text, nullable=False),
Column("RenderedComment", Text, nullable=False),
Column(
"CommentTS", BIGINT(unsigned=True), nullable=False, server_default=text("0")
),
Column("EditedTS", BIGINT(unsigned=True)),
Column("EditedUsersID", ForeignKey("Users.ID", ondelete="SET NULL")),
Column("DelTS", BIGINT(unsigned=True)),
Column("DelUsersID", ForeignKey("Users.ID", ondelete="CASCADE")),
Column("PinnedTS", BIGINT(unsigned=True), nullable=False, server_default=text("0")),
Index("CommentsPackageBaseID", "PackageBaseID"),
Index("CommentsUsersID", "UsersID"),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Package base co-maintainers
PackageComaintainers = Table(
"PackageComaintainers",
metadata,
Column("UsersID", ForeignKey("Users.ID", ondelete="CASCADE"), nullable=False),
Column(
"PackageBaseID",
ForeignKey("PackageBases.ID", ondelete="CASCADE"),
nullable=False,
),
Column("Priority", INTEGER(unsigned=True), nullable=False),
Index("ComaintainersPackageBaseID", "PackageBaseID"),
Index("ComaintainersUsersID", "UsersID"),
mysql_engine="InnoDB",
)
# Package base notifications
PackageNotifications = Table(
"PackageNotifications",
metadata,
Column(
"PackageBaseID",
ForeignKey("PackageBases.ID", ondelete="CASCADE"),
nullable=False,
),
Column("UserID", ForeignKey("Users.ID", ondelete="CASCADE"), nullable=False),
Index("NotifyUserIDPkgID", "UserID", "PackageBaseID", unique=True),
mysql_engine="InnoDB",
)
# Package name blacklist
PackageBlacklist = Table(
"PackageBlacklist",
metadata,
Column("ID", INTEGER(unsigned=True), primary_key=True),
Column("Name", String(64), nullable=False, unique=True),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Providers in the official repositories
OfficialProviders = Table(
"OfficialProviders",
metadata,
Column("ID", INTEGER(unsigned=True), primary_key=True),
Column("Name", String(64), nullable=False),
Column("Repo", String(64), nullable=False),
Column("Provides", String(64), nullable=False),
Index("ProviderNameProvides", "Name", "Provides", unique=True),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_bin",
)
# Define package request types
RequestTypes = Table(
"RequestTypes",
metadata,
Column("ID", TINYINT(unsigned=True), primary_key=True),
Column("Name", String(32), nullable=False, server_default=text("''")),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Package requests
PackageRequests = Table(
"PackageRequests",
metadata,
Column("ID", BIGINT(unsigned=True), primary_key=True),
Column(
"ReqTypeID", ForeignKey("RequestTypes.ID", ondelete="NO ACTION"), nullable=False
),
Column("PackageBaseID", ForeignKey("PackageBases.ID", ondelete="SET NULL")),
Column("PackageBaseName", String(255), nullable=False),
Column("MergeBaseName", String(255)),
Column("UsersID", ForeignKey("Users.ID", ondelete="SET NULL")),
Column("Comments", Text, nullable=False),
Column("ClosureComment", Text, nullable=False),
Column(
"RequestTS", BIGINT(unsigned=True), nullable=False, server_default=text("0")
),
Column("ClosedTS", BIGINT(unsigned=True)),
Column("ClosedUID", ForeignKey("Users.ID", ondelete="SET NULL")),
Column("Status", TINYINT(unsigned=True), nullable=False, server_default=text("0")),
Index("RequestsPackageBaseID", "PackageBaseID"),
Index("RequestsUsersID", "UsersID"),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Vote information
TU_VoteInfo = Table(
'TU_VoteInfo', metadata,
Column('ID', INTEGER(unsigned=True), primary_key=True),
Column('Agenda', Text, nullable=False),
Column('User', String(32), nullable=False),
Column('Submitted', BIGINT(unsigned=True), nullable=False),
Column('End', BIGINT(unsigned=True), nullable=False),
Column('Quorum',
DECIMAL(2, 2, unsigned=True)
if db_backend == "mysql" else String(5),
nullable=False),
Column('SubmitterID', ForeignKey('Users.ID', ondelete='CASCADE'), nullable=False),
Column('Yes', INTEGER(unsigned=True), nullable=False, server_default=text("'0'")),
Column('No', INTEGER(unsigned=True), nullable=False, server_default=text("'0'")),
Column('Abstain', INTEGER(unsigned=True), nullable=False, server_default=text("'0'")),
Column('ActiveTUs', INTEGER(unsigned=True), nullable=False, server_default=text("'0'")),
mysql_engine='InnoDB',
mysql_charset='utf8mb4',
mysql_collate='utf8mb4_general_ci',
VoteInfo = Table(
"VoteInfo",
metadata,
Column("ID", INTEGER(unsigned=True), primary_key=True),
Column("Agenda", Text, nullable=False),
Column("User", String(32), nullable=False),
Column("Submitted", BIGINT(unsigned=True), nullable=False),
Column("End", BIGINT(unsigned=True), nullable=False),
Column(
"Quorum",
DECIMAL(2, 2, unsigned=True) if db_backend == "mysql" else String(5),
nullable=False,
),
Column("SubmitterID", ForeignKey("Users.ID", ondelete="CASCADE"), nullable=False),
Column("Yes", INTEGER(unsigned=True), nullable=False, server_default=text("'0'")),
Column("No", INTEGER(unsigned=True), nullable=False, server_default=text("'0'")),
Column(
"Abstain", INTEGER(unsigned=True), nullable=False, server_default=text("'0'")
),
Column(
"ActiveUsers",
INTEGER(unsigned=True),
nullable=False,
server_default=text("'0'"),
),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Individual vote records
TU_Votes = Table(
'TU_Votes', metadata,
Column('VoteID', ForeignKey('TU_VoteInfo.ID', ondelete='CASCADE'), nullable=False),
Column('UserID', ForeignKey('Users.ID', ondelete='CASCADE'), nullable=False),
mysql_engine='InnoDB',
Votes = Table(
"Votes",
metadata,
Column("VoteID", ForeignKey("VoteInfo.ID", ondelete="CASCADE"), nullable=False),
Column("UserID", ForeignKey("Users.ID", ondelete="CASCADE"), nullable=False),
mysql_engine="InnoDB",
)
# Malicious user banning
Bans = Table(
"Bans",
metadata,
Column("IPAddress", String(45), primary_key=True),
Column("BanTS", TIMESTAMP, nullable=False),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Terms and Conditions
Terms = Table(
"Terms",
metadata,
Column("ID", INTEGER(unsigned=True), primary_key=True),
Column("Description", String(255), nullable=False),
Column("URL", String(8000), nullable=False),
Column(
"Revision", INTEGER(unsigned=True), nullable=False, server_default=text("1")
),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
# Terms and Conditions accepted by users
AcceptedTerms = Table(
"AcceptedTerms",
metadata,
Column("UsersID", ForeignKey("Users.ID", ondelete="CASCADE"), nullable=False),
Column("TermsID", ForeignKey("Terms.ID", ondelete="CASCADE"), nullable=False),
Column(
"Revision", INTEGER(unsigned=True), nullable=False, server_default=text("0")
),
mysql_engine="InnoDB",
)
# Rate limits for API
ApiRateLimit = Table(
"ApiRateLimit",
metadata,
Column("IP", String(45), primary_key=True, unique=True, default=str()),
Column("Requests", INTEGER(11), nullable=False),
Column("WindowStart", BIGINT(20), nullable=False),
Index("ApiRateLimitWindowStart", "WindowStart"),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
)
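Association tables like `PackageLicenses` and `PackageGroups` above rely on `ON DELETE CASCADE` foreign keys, so relation rows vanish with their package. A stdlib `sqlite3` sketch of that cascade (simplified column types; the actual schema uses the MySQL dialect types and SQLAlchemy `Table` objects shown above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # sqlite needs this enabled per-connection

conn.execute("CREATE TABLE Packages (ID INTEGER PRIMARY KEY, Name TEXT NOT NULL UNIQUE)")
conn.execute("CREATE TABLE Licenses (ID INTEGER PRIMARY KEY, Name TEXT NOT NULL UNIQUE)")
# Composite primary key plus cascading FKs, mirroring PackageLicenses above.
conn.execute(
    "CREATE TABLE PackageLicenses ("
    "PackageID INTEGER REFERENCES Packages(ID) ON DELETE CASCADE, "
    "LicenseID INTEGER REFERENCES Licenses(ID) ON DELETE CASCADE, "
    "PRIMARY KEY (PackageID, LicenseID))"
)

conn.execute("INSERT INTO Packages VALUES (1, 'aurweb')")
conn.execute("INSERT INTO Licenses VALUES (1, 'GPL')")
conn.execute("INSERT INTO PackageLicenses VALUES (1, 1)")

# Deleting the package cascades to its license relations.
conn.execute("DELETE FROM Packages WHERE ID = 1")
remaining = conn.execute("SELECT COUNT(*) FROM PackageLicenses").fetchone()[0]
print(remaining)  # 0
```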


@@ -6,12 +6,12 @@ See `aurweb-adduser --help` for documentation.
Copyright (C) 2022 aurweb Development Team
All Rights Reserved
"""
import argparse
import sys
import traceback
import aurweb.models.account_type as at
from aurweb import db
from aurweb.models.account_type import AccountType
from aurweb.models.ssh_pub_key import SSHPubKey, get_fingerprint
@@ -30,8 +30,9 @@ def parse_args():
parser.add_argument("--ssh-pubkey", help="SSH PubKey")
choices = at.ACCOUNT_TYPE_NAME.values()
parser.add_argument(
"-t", "--type", help="Account Type", choices=choices, default=at.USER
)
return parser.parse_args()
@@ -40,25 +41,29 @@ def main():
args = parse_args()
db.get_engine()
type = db.query(AccountType, AccountType.AccountType == args.type).first()
with db.begin():
user = db.create(
User,
Username=args.username,
Email=args.email,
Passwd=args.password,
RealName=args.realname,
IRCNick=args.ircnick,
PGPKey=args.pgp_key,
AccountType=type,
)
if args.ssh_pubkey:
pubkey = args.ssh_pubkey.strip()
# Remove host from the pubkey if it's there.
pubkey = " ".join(pubkey.split(" ")[:2])
with db.begin():
db.create(
SSHPubKey, User=user, PubKey=pubkey, Fingerprint=get_fingerprint(pubkey)
)
print(user.json())
return 0
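The `" ".join(pubkey.split(" ")[:2])` step above keeps only the key type and key material of an SSH public key, dropping any trailing comment or host. A small illustration (the helper name `strip_pubkey` is made up for this sketch and does not exist in aurweb):

```python
def strip_pubkey(pubkey: str) -> str:
    # Keep the first two space-separated fields: key type and base64 blob.
    # Anything after that (user@host comment) is discarded.
    return " ".join(pubkey.strip().split(" ")[:2])


key = "ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIexample user@host"
print(strip_pubkey(key))  # ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIexample
```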


@@ -3,11 +3,9 @@
import re
import pyalpm
from sqlalchemy import and_
import aurweb.config
from aurweb import db, util
from aurweb.models import OfficialProvider
@@ -18,8 +16,8 @@ def _main(force: bool = False):
repomap = dict()
db_path = aurweb.config.get("aurblup", "db-path")
sync_dbs = aurweb.config.get("aurblup", "sync-dbs").split(" ")
server = aurweb.config.get("aurblup", "server")
h = pyalpm.Handle("/", db_path)
for sync_db in sync_dbs:
@@ -35,28 +33,46 @@
providers.add((pkg.name, pkg.name))
repomap[(pkg.name, pkg.name)] = repo.name
for provision in pkg.provides:
provisionname = re.sub(r"(<|=|>).*", "", provision)
providers.add((pkg.name, provisionname))
repomap[(pkg.name, provisionname)] = repo.name
with db.begin():
old_providers = set(
db.query(OfficialProvider)
.with_entities(
OfficialProvider.Name.label("Name"),
OfficialProvider.Provides.label("Provides"),
)
.distinct()
.order_by("Name")
.all()
)
# delete providers not existing in any of our alpm repos
for name, provides in old_providers.difference(providers):
db.delete_all(
db.query(OfficialProvider).filter(
and_(
OfficialProvider.Name == name,
OfficialProvider.Provides == provides,
)
)
)
# add new providers that do not yet exist in our DB
for name, provides in providers.difference(old_providers):
repo = repomap.get((name, provides))
db.create(OfficialProvider, Name=name, Repo=repo, Provides=provides)
# update providers where a pkg was moved from one repo to another
all_providers = db.query(OfficialProvider)
for op in all_providers:
new_repo = repomap.get((op.Name, op.Provides))
if op.Repo != new_repo:
op.Repo = new_repo
def main(force: bool = False):
@@ -64,5 +80,5 @@ def main(force: bool = False):
_main(force)
if __name__ == "__main__":
main()
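The aurblup sync above boils down to set differences between `(name, provides)` pairs found in the alpm databases and those already stored in `OfficialProviders`: stale pairs are deleted, new ones created. A self-contained sketch of that reconciliation step (the names `reconcile`, `removed`, and `added` are illustrative, not from aurweb):

```python
def reconcile(old_providers: set, new_providers: set) -> tuple[set, set]:
    """Return (pairs to delete, pairs to create) from two provider sets."""
    to_delete = old_providers - new_providers  # in DB, no longer in repos
    to_create = new_providers - old_providers  # in repos, not yet in DB
    return to_delete, to_create


old = {("vim", "vim"), ("gvim", "vim")}
new = {("vim", "vim"), ("neovim", "vim")}
removed, added = reconcile(old, new)
print(sorted(removed), sorted(added))
```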


@@ -3,6 +3,7 @@ Perform an action on the aurweb config.
When AUR_CONFIG_IMMUTABLE is set, the `set` action is noop.
"""
import argparse
import configparser
import os
@@ -50,12 +51,12 @@
actions = ["get", "set", "unset"]
parser = argparse.ArgumentParser(
description="aurweb configuration tool",
formatter_class=lambda prog: fmt_cls(prog=prog, max_help_position=80),
)
parser.add_argument("action", choices=actions, help="script action")
parser.add_argument("section", help="config section")
parser.add_argument("option", help="config option")
parser.add_argument("value", nargs="?", default=0, help="config option value")
return parser.parse_args()
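The `formatter_class=lambda prog: fmt_cls(prog=prog, max_help_position=80)` pattern widens argparse's help column beyond the default. A runnable sketch assuming `fmt_cls` is `argparse.HelpFormatter` (which is how such a lambda is typically used), mirroring the positional arguments the config tool declares:

```python
import argparse

fmt_cls = argparse.HelpFormatter  # assumed; the real fmt_cls may be a subclass

parser = argparse.ArgumentParser(
    description="aurweb configuration tool (demo)",
    # max_help_position moves option descriptions further right in --help output
    formatter_class=lambda prog: fmt_cls(prog=prog, max_help_position=80),
)
parser.add_argument("action", choices=["get", "set", "unset"], help="script action")
parser.add_argument("section", help="config section")
parser.add_argument("option", help="config option")
parser.add_argument("value", nargs="?", default=0, help="config option value")

args = parser.parse_args(["get", "options", "username_min_len"])
print(args.action, args.section, args.option, args.value)
```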


@@ -0,0 +1,125 @@
import argparse
import importlib
import os
import sys
import traceback
from datetime import UTC, datetime
import orjson
import pygit2
from aurweb import config
# Constants
REF = "refs/heads/master"
ORJSON_OPTS = orjson.OPT_SORT_KEYS | orjson.OPT_INDENT_2
def init_repository(git_info) -> None:
pygit2.init_repository(git_info.path)
repo = pygit2.Repository(git_info.path)
for k, v in git_info.config.items():
repo.config[k] = v
def parse_args():
parser = argparse.ArgumentParser()
parser.add_argument(
"--spec",
type=str,
required=True,
help="name of spec module in the aurweb.archives.spec package",
)
return parser.parse_args()
def update_repository(repo: pygit2.Repository):
# Use git status to determine file changes
has_changes = False
changes = repo.status()
for filepath, flags in changes.items():
if flags != pygit2.GIT_STATUS_CURRENT:
has_changes = True
break
if has_changes:
print("diff detected, committing")
# Add everything in the tree.
print("adding files to git tree")
# Add the tree to staging
repo.index.read()
repo.index.add_all()
repo.index.write()
tree = repo.index.write_tree()
# Determine base commit; if repo.head.target raises GitError,
# we have no current commits
try:
base = [repo.head.target]
except pygit2.GitError:
base = []
utcnow = datetime.now(UTC)
author = pygit2.Signature(
config.get("git-archive", "author"),
config.get("git-archive", "author-email"),
int(utcnow.timestamp()),
0,
)
# Commit the changes
timestamp = utcnow.strftime("%Y-%m-%d %H:%M:%S")
title = f"update - {timestamp}"
repo.create_commit(REF, author, author, title, tree, base)
print("committed changes")
else:
print("no diff detected")
def main() -> int:
args = parse_args()
print(f"loading '{args.spec}' spec")
spec_package = "aurweb.archives.spec"
module_path = f"{spec_package}.{args.spec}"
spec_module = importlib.import_module(module_path)
print(f"loaded '{args.spec}'")
# Track repositories that the spec modifies. After we run
# through specs, we want to make a single commit for all
# repositories that contain changes.
repos = dict()
print(f"running '{args.spec}' spec...")
spec = spec_module.Spec()
for output in spec.generate():
if not os.path.exists(output.git_info.path / ".git"):
init_repository(output.git_info)
path = output.git_info.path / output.filename
with open(path, "wb") as f:
f.write(output.data)
if output.git_info.path not in repos:
repos[output.git_info.path] = pygit2.Repository(output.git_info.path)
print(f"done running '{args.spec}' spec")
print("processing repositories")
for path in spec.repos:
print(f"processing repository: {path}")
update_repository(pygit2.Repository(path))
return 0
if __name__ == "__main__":
try:
sys.exit(main())
except KeyboardInterrupt:
sys.exit(0)
except Exception:
traceback.print_exc()
sys.exit(1)
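In `main()` above, `importlib.import_module` resolves the spec module from its dotted path (`f"{spec_package}.{args.spec}"`) at runtime. A minimal demonstration of that mechanism using a stdlib package in place of `aurweb.archives.spec`:

```python
import importlib

# Stand-ins for spec_package and args.spec; "json"/"decoder" are only
# used here because they ship with the stdlib and are always importable.
spec_package = "json"
spec_name = "decoder"

module = importlib.import_module(f"{spec_package}.{spec_name}")
# The archive script would now instantiate module.Spec(); here we just
# confirm the dynamic import resolved the expected module.
print(module.__name__)  # json.decoder
```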

Some files were not shown because too many files have changed in this diff.