Compare commits


334 commits

Author SHA1 Message Date
Leonidas Spyropoulos
8ca61eded2
chore(release): prepare for 6.2.16 2025-01-13 15:52:13 +00:00
Leonidas Spyropoulos
a9bf714dae
fix: bump deps for python 3.13 and vulnerability
- pygit2 and watchfiles for precompiled wheels
- greenlet for python 3.13 compatibility
- python-multipart for security vulnerability

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2025-01-12 20:39:02 +00:00
Leonidas Spyropoulos
3e3173b5c9
chore: avoid cache for new pacman 7
Pacman 7 introduced sandboxing, which breaks the cache in containers due to permission restrictions.

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2025-01-12 20:39:02 +00:00
Leonidas Spyropoulos
eca8bbf515
chore(release): prepare for 6.2.15
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2024-09-15 12:03:17 +03:00
Jelle van der Waa
edc1ab949a perf(captcha): simplify count() query for user ids
Using .count() isn't great as it runs a count query on a subquery which
selects all fields in the Users table. This rewrites it into a simple
SELECT count(ID) FROM Users query.
2024-09-12 12:29:46 +00:00
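A minimal sketch of the difference described above, using a stand-in `User` model (the real aurweb model lives elsewhere; column names assumed):

```python
from sqlalchemy import Column, Integer, create_engine, func
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class User(Base):  # minimal stand-in for aurweb's Users model
    __tablename__ = "Users"
    ID = Column(Integer, primary_key=True)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    # .count() emits: SELECT count(*) FROM (SELECT ... FROM Users) — a subquery.
    slow = session.query(User).count()
    # func.count() emits a plain: SELECT count(Users.ID) FROM Users
    fast = session.query(func.count(User.ID)).scalar()
    assert slow == fast == 0
```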
Muflone
97cc6196eb fix: reduce the number of subqueries against Packages by preloading the existing dependencies' names from AUR 2024-08-21 01:36:15 +02:00
Muflone
77ef87c882 housekeep: code re-formatted by black for lint pipeline 2024-08-20 21:00:46 +00:00
Muflone
a40283cdb2 fix: reduce the number of subqueries against User by eagerly loading the Users from PackageComaintainer 2024-08-20 21:00:46 +00:00
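A hedged sketch of the eager-loading idea: fetch the related User rows in one JOINed query instead of one lazy sub-query per row. Model and column names here are assumptions, not aurweb's actual schema:

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base, joinedload, relationship

Base = declarative_base()

class User(Base):  # minimal stand-ins for aurweb's models (assumed shapes)
    __tablename__ = "Users"
    ID = Column(Integer, primary_key=True)

class PackageComaintainer(Base):
    __tablename__ = "PackageComaintainers"
    ID = Column(Integer, primary_key=True)
    UsersID = Column(Integer, ForeignKey("Users.ID"))
    User = relationship("User")

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    # Without joinedload, touching row.User triggers one extra SELECT per row;
    # with it, the Users come back in the same JOINed query.
    rows = (
        session.query(PackageComaintainer)
        .options(joinedload(PackageComaintainer.User))
        .all()
    )
```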
Levente Polyak
4f68532ee2
chore(mariadb): fix mysql deprecation warnings by using mariadb commands
MariaDB has scheduled the removal of the deprecated mysql drop-in interface.
Let's adapt, which also removes a lot of warnings while spinning up the
service.
2024-08-19 15:26:36 +02:00
Levente Polyak
439ccd4aa3
feat(docker): add full grafana, prometheus, tempo setup for local dev
This is a very useful stack for local development as well, allowing one
to easily access a local Grafana instance and look at the accessed
endpoints, query usage, durations, etc.
As a nice side effect this also makes sure we have an easy way to
actually test any changes to the opentelemetry integration in an actual
environment instead of just listening to a raw socket.
2024-08-19 15:26:29 +02:00
Levente Polyak
8dcf0b2d97
fix(docker): fix compose race conditions on mariadb_init
We want the dependent services to wait until the initialization service
of mariadb finishes, but also properly accept if it already exited
before a leaf service gets picked up and put into created state. By
using the service_completed_successfully signal, we can ensure precisely
this, without being racy and ending up with services that never boot.

While at it, remove the compose version identifiers as docker-compose
deprecated them and always warned about when running docker-compose.
2024-08-19 15:26:21 +02:00
Leonidas Spyropoulos
88e8db4404
chore(release): prepare version 6.2.14 2024-08-17 17:28:26 +01:00
Sven-Hendrik Haase
b730f6447d
feat: Add opentelemetry-based tracing
This adds tracing to fastapi, redis, and sqlalchemy. It uses the
recommended OTLP exporter to send the tracing data.
2024-08-17 11:27:26 +01:00
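A minimal sketch of how such wiring typically looks with the upstream opentelemetry packages (the exact setup in aurweb's asgi module may differ):

```python
from fastapi import FastAPI
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.instrumentation.fastapi import FastAPIInstrumentor
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Export spans via OTLP (endpoint taken from OTEL_* env vars by default).
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
trace.set_tracer_provider(provider)

app = FastAPI()
# Auto-instrument the app; the redis and sqlalchemy instrumentors work alike.
FastAPIInstrumentor.instrument_app(app)
```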
Leonidas Spyropoulos
92f5bbd37f
housekeep: reformat asgi.py 2024-08-17 01:31:43 +01:00
Jelle van der Waa
6c6ecd3971
perf(aurweb): create a context with what is required
The pkgbase/util.py `make_context` helper does a lot of unrelated
expensive queries which are not required for any of the templates. Only
the 404 template shows git_clone_uri_* and pkgbase.
2024-08-16 21:32:22 +02:00
Leonidas Spyropoulos
9b12eaf2b9
chore(release): prepare version 6.2.13 2024-08-16 16:03:40 +01:00
Jelle van der Waa
d1a66a743e
perf(aurweb/pkgbase): use exists() to avoid fetching a row
The previous approach fetched the matching row; by using `exists()`,
SQLAlchemy changes the query to a `SELECT 1`.
2024-08-09 16:07:17 +02:00
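A small sketch of the pattern, with a stand-in model (not aurweb's actual query):

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class PackageBase(Base):  # minimal stand-in model
    __tablename__ = "PackageBases"
    ID = Column(Integer, primary_key=True)
    Name = Column(String(255))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    q = session.query(PackageBase).filter(PackageBase.Name == "foo")
    # Before: fetches the whole row just to test for existence.
    found = q.first() is not None
    # After: wraps the query in EXISTS, emitting a SELECT 1; no row is fetched.
    found = session.query(q.exists()).scalar()
```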
Jelle van der Waa
b65d6c5e3a
perf(aurweb/pkgbase): only relevant queries when logged in
Don't query for notify, requests and vote information when the user is
not logged in as this information is not shown.
2024-08-09 16:07:17 +02:00
Jelle van der Waa
d393ed2352
fix(templates): hide non-actionable links when not logged in
A non-logged in user cannot vote/enable notifications or submit a
request so hide these links.
2024-08-09 16:07:17 +02:00
Leonidas Spyropoulos
a16fac9b95
fix: revert mysqlclient to 2.2.3 2024-08-09 11:02:13 +01:00
renovate
5dd65846d1
chore(deps): update dependency coverage to v7.6.1 2024-08-05 11:25:17 +00:00
renovate
a1b2d231c3
fix(deps): update dependency aiofiles to v24 2024-08-04 20:25:21 +00:00
renovate
f306b6df7a
fix(deps): update dependency fastapi to ^0.112.0 2024-08-04 12:25:03 +00:00
renovate
0d17895647
fix(deps): update dependency gunicorn to v22 2024-08-04 10:24:33 +00:00
renovate
36a56e9d3c
fix(deps): update all non-major dependencies 2024-08-04 09:24:29 +00:00
Diego Viola
80d3e5f7b6 housekeep: update .editorconfig url
Signed-off-by: Diego Viola <diego.viola@gmail.com>
2024-08-03 11:58:58 +00:00
Leonidas Spyropoulos
2df5a2d5a8
chore(release): prepare version 6.2.12 2024-08-03 10:46:29 +01:00
Leonidas Spyropoulos
a54b6935a1
housekeep: reformat files with pre-hooks
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2024-08-03 08:15:56 +01:00
Levente Polyak
4d5909256f
fix: add missing indicies on PackageBase ordered columns
Signed-off-by: Levente Polyak <anthraxx@archlinux.org>
2024-08-03 04:45:31 +02:00
Levente Polyak
a5b94a47f3
feat: cache rss feedgen for 5 minutes
The RSS feeds should be perfectly fine even when cached for 5
minutes. This should massively reduce the response times on the
endpoint.

Signed-off-by: Levente Polyak <anthraxx@archlinux.org>
2024-08-03 04:45:24 +02:00
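The caching idea, as a hedged sketch: serve the rendered feed from Redis until the TTL expires, then regenerate. The key name and `render_feed` callable are assumptions for illustration:

```python
import redis

CACHE_TTL = 300  # 5 minutes, in seconds

def get_rss(r: redis.Redis, render_feed) -> bytes:
    cached = r.get("rss")  # key name assumed
    if cached is not None:
        return cached
    body = render_feed()  # the expensive feedgen rendering
    r.setex("rss", CACHE_TTL, body)  # Redis evicts the key after the TTL
    return body
```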
moson
33d31d4117
style: Indicate deleted accounts on requests page
Show "(deleted)" on requests page for user accounts that were removed.

Fixes #505

Signed-off-by: moson <moson@archlinux.org>
2024-06-24 16:35:21 +02:00
Leonidas Spyropoulos
ed878c8c5e
chore(release): prepare for 6.2.11 2024-06-10 11:49:00 +01:00
Leonidas Spyropoulos
77e4979f79
fix: remove the extra spaces in requests textarea
fixes: #503
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2024-06-10 11:41:19 +01:00
Leonidas Spyropoulos
85af7d6f04
fix: revert Set reply-to header for notifications to ML
The change broke the initial emails to the ML. Not sure why, but reverting this for now; might look at it later.

This reverts commit 783422369e.

fixes: #502
2024-06-10 11:40:36 +01:00
Leonidas Spyropoulos
ef0619dc2f
chore(release): prepare for 6.2.10 2024-05-18 20:46:17 +01:00
moson
43b322e739
fix(CI): lint job - fix for python 3.12
Signed-off-by: moson <moson@archlinux.org>
2024-04-28 17:49:08 +02:00
moson
afb7af3e27
housekeep: replace deprecated datetime functions
tests show warnings for deprecated utc functions with python 3.12

Signed-off-by: moson <moson@archlinux.org>
2024-04-25 18:24:16 +02:00
moson
ffddf63975
housekeep: poetry - include python version 3.12
Signed-off-by: moson <moson@archlinux.org>
2024-04-25 07:46:39 +02:00
moson
c6a530f24f
chore(deps): bump pre-commit tools/libs
Prep for python 3.12
Reformat files with latest pre-commit tools

Signed-off-by: moson <moson@archlinux.org>
2024-04-25 07:25:39 +02:00
moson
3220cf886e
fix(CI): Remove "fast-single-thread" tag
Signed-off-by: moson <moson@archlinux.org>
2024-04-08 08:37:41 +02:00
moson
21e2ef5ecb
fix(test): Fix "TestClient"
TestClient changes were reverted with 0.37.2:

https://github.com/encode/starlette/pull/2525
https://github.com/encode/starlette/releases/tag/0.37.2
Signed-off-by: moson <moson@archlinux.org>
2024-04-08 08:37:41 +02:00
moson
6ba06801f7
chore(deps): update dependencies
  - Updating pycparser (2.21 -> 2.22)
  - Updating sniffio (1.3.0 -> 1.3.1)
  - Updating typing-extensions (4.8.0 -> 4.11.0)
  - Updating anyio (3.7.1 -> 4.3.0)
  - Updating certifi (2023.11.17 -> 2024.2.2)
  - Updating greenlet (3.0.1 -> 3.0.3)
  - Updating markupsafe (2.1.3 -> 2.1.5)
  - Updating packaging (23.2 -> 24.0)
  - Updating pluggy (1.3.0 -> 1.4.0)
  - Updating pydantic-core (2.14.5 -> 2.16.3)
  - Updating coverage (7.4.0 -> 7.4.4)
  - Updating cryptography (41.0.5 -> 42.0.5)
  - Updating dnspython (2.4.2 -> 2.6.1)
  - Updating execnet (2.0.2 -> 2.1.0)
  - Updating httpcore (1.0.2 -> 1.0.5)
  - Updating lxml (5.1.0 -> 5.2.1)
  - Updating mako (1.3.0 -> 1.3.2)
  - Updating parse (1.20.0 -> 1.20.1)
  - Updating prometheus-client (0.19.0 -> 0.20.0)
  - Updating pydantic (2.5.2 -> 2.6.4)
  - Updating pytest (7.4.4 -> 8.1.1)
  - Updating python-dateutil (2.8.2 -> 2.9.0.post0)
  - Updating redis (5.0.1 -> 5.0.3)
  - Updating urllib3 (2.1.0 -> 2.2.1)
  - Updating asgiref (3.7.2 -> 3.8.1)
  - Updating email-validator (2.1.0.post1 -> 2.1.1)
  - Updating fakeredis (2.20.1 -> 2.21.3)
  - Updating fastapi (0.109.0 -> 0.110.1)
  - Updating filelock (3.13.1 -> 3.13.3)
  - Updating markdown (3.5.2 -> 3.6)
  - Updating mysqlclient (2.2.1 -> 2.2.4)
  - Updating orjson (3.9.12 -> 3.10.0)
  - Updating prometheus-fastapi-instrumentator (6.1.0 -> 7.0.0)
  - Updating protobuf (4.25.2 -> 5.26.1)
  - Updating pygit2 (1.13.3 -> 1.14.1)
  - Updating pytest-asyncio (0.23.3 -> 0.23.6)
  - Updating pytest-cov (4.1.0 -> 5.0.0)
  - Updating tomlkit (0.12.3 -> 0.12.4)
  - Updating uvicorn (0.27.0 -> 0.27.1)
  - Updating werkzeug (3.0.1 -> 3.0.2)
  - Updating starlette (0.35.0 -> 0.37.2)
  - Updating httpx (0.26.0 -> 0.27.0)
  - Updating python-multipart (0.0.6 -> 0.0.9)
  - Updating uvicorn (0.27.1 -> 0.29.0)
  - Updating sqlalchemy (1.4.50 -> 1.4.52)

Signed-off-by: moson <moson@archlinux.org>
2024-04-08 08:37:41 +02:00
moson
21a23c9abe
feat: Limit comment length
Limit the amount of characters that can be entered for a comment.

Signed-off-by: moson <moson@archlinux.org>
2024-02-25 10:46:47 +01:00
moson
d050b626db
feat: Add blacklist check for pkgbase
Also check "pkgbase" against our blacklist.

Signed-off-by: moson <moson@archlinux.org>
2024-02-17 15:55:46 +01:00
moson
057685f304
fix: Fix package info for 404 errors
We try to find packages when a user enters a URL like /somepkg
or accidentally opens /somepkg.git in the browser.

However, it currently also does this for URLs like /pkgbase/doesnotexist
and falsely interprets the "pkgbase" part as a package or pkgbase name.
This, in combination with a pkgbase that is named "pkgbase", generates
a misleading 404 message for URLs like /pkgbase/doesnotexist.

That being said, we should probably add pkgbase to the blacklist check
as well (we do this for pkgname already) and add things like
"pkgbase" to the blacklist -> Will be picked up in another commit.

Signed-off-by: moson <moson@archlinux.org>
2024-02-17 14:12:09 +01:00
renovate
319c565cb9
fix(deps): update all non-major dependencies 2024-01-23 22:24:28 +00:00
renovate
db6bba8bc8
fix(deps): update dependency feedgen to v1 2024-01-23 21:24:53 +00:00
renovate
a37b9685de
fix(deps): update dependency lxml to v5 2024-01-21 14:24:22 +00:00
moson
6e32cf4275
fix(i18n): Adjust transifex host URL
Fix URL, otherwise the API token won't be picked up from ~/.transifexrc

Signed-off-by: moson <moson@archlinux.org>
2024-01-21 11:40:14 +01:00
moson
76b6971267
chore(deps): Ignore python upgrades with Renovate
Stop Renovate from trying to bump the python version.

Signed-off-by: moson <moson@archlinux.org>
2024-01-21 10:43:12 +01:00
Robin Candau
9818c3f48c chore(i18n): Replace [community] leftover mentions to [extra] 2024-01-21 10:27:57 +01:00
moson
f967c3565a
chore(i18n): Update translations
Pull in updated translations from Transifex: 2023-01-18

Signed-off-by: moson <moson@archlinux.org>
2024-01-21 09:59:05 +01:00
moson
2fcd793a58
fix(test): Fixes for "TestClient" changes
Seems that client is optional according to the ASGI spec.
https://asgi.readthedocs.io/en/latest/specs/www.html

With Starlette 0.35 the TestClient connection scope is None for "client".
https://github.com/encode/starlette/pull/2377

Signed-off-by: moson <moson@archlinux.org>
2024-01-19 16:37:42 +01:00
renovate
22e1577324
fix(deps): update dependency fastapi to ^0.109.0 2024-01-19 10:26:02 +01:00
moson
baf97bd159
fix(test): FastAPI 0.104.1 - Fix warnings
FastAPI events are deprecated. Use "Lifespan" function instead.

Signed-off-by: moson <moson@archlinux.org>
2023-12-08 14:15:18 +01:00
moson
a0b2e826be
feat: Parse markdown within html block elements
By default, markdown within an HTML block element is not parsed.
Add markdown extension to support markdown text within block
elements.

With this we can annotate our element with a "markdown" attribute,
e.g. <details markdown>*Markdown*</details>, thus indicating that
the content should be parsed.

Signed-off-by: moson <moson@archlinux.org>
2023-12-08 14:14:24 +01:00
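Assuming the extension meant here is python-markdown's bundled "md_in_html" (the commit doesn't name it), a minimal sketch looks like this:

```python
import markdown

# md_in_html parses markdown inside HTML block elements that carry a
# markdown attribute (e.g. markdown="1").
html = markdown.markdown(
    '<details markdown="1"><summary>Hint</summary>*parsed as markdown*</details>',
    extensions=["md_in_html"],
)
print(html)
```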
moson
1ba9e6eb44
fix: change git-cliff "tag_pattern" option to regex
Changed with v1.4.0
See: https://github.com/orhun/git-cliff/pull/318

Signed-off-by: moson <moson@archlinux.org>
2023-12-08 14:12:48 +01:00
Rafael Fontenelle
1b82887cd6
docs: Change i18n.txt to markdown format 2023-12-08 14:10:32 +01:00
moson
783422369e
feat: Set reply-to header for notifications to ML
We can set the "reply-to" header to the "to" address for any mails
that go out to the aur-requests mailing list.

Signed-off-by: moson <moson@archlinux.org>
2023-11-28 09:33:07 +01:00
moson
4637b2edba
fix(tests): Fix test case for Prometheus metrics
Disable prometheus multiprocess mode in tests to avoid global state:
Depending on the workers which are processing a testfile,
we might run into race issues where tests might influence each other.

We also need to make sure to clear any previously collected values
in case the same worker/process is executing different tests which
evaluate prometheus values.

Signed-off-by: moson <moson@archlinux.org>
2023-11-27 13:21:37 +01:00
moson
027dfbd970
chore(release): prepare for 6.2.9
Signed-off-by: moson <moson@archlinux.org>
2023-11-25 20:30:29 +01:00
moson
8b234c580d
chore(deps): update dependencies
* Updating idna (3.4 -> 3.6)
* Updating annotated-types (0.5.0 -> 0.6.0)
* Updating pydantic-core (2.10.1 -> 2.14.5)
* Updating certifi (2023.7.22 -> 2023.11.17)
* Updating greenlet (3.0.0 -> 3.0.1)
* Updating pydantic (2.4.2 -> 2.5.2)
* Updating charset-normalizer (3.3.0 -> 3.3.2)
* Updating cryptography (41.0.4 -> 41.0.5)
* Updating fastapi (0.103.2 -> 0.104.1)
* Updating mako (1.2.4 -> 1.3.0)
* Updating parse (1.19.1 -> 1.20.0)
* Updating prometheus-client (0.17.1 -> 0.19.0)
* Updating urllib3 (2.0.6 -> 2.1.0)

Fix type annotation for new test function

Signed-off-by: moson <moson@archlinux.org>
2023-11-25 20:23:56 +01:00
renovate
9bf0c61051
fix(deps): update all non-major dependencies 2023-11-25 18:25:05 +00:00
moson
9d5b9c4795
feat: Add "groups" to package details page
Signed-off-by: moson <moson@archlinux.org>
2023-11-25 18:59:43 +01:00
moson
765f989b7d
feat: Allow <del> and <details/summary> tags in comments
* Allow additional html tags: <del> and <details/summary>
* Convert markdown double-tilde (~~) to <del> tags

Signed-off-by: moson <moson@archlinux.org>
2023-11-25 18:41:28 +01:00
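On the sanitizer side, a hedged sketch of extending bleach's allow-list so the new tags survive cleaning (the tag set is illustrative; the ~~ conversion happens in the markdown pipeline, not shown here):

```python
import bleach

# Start from bleach's defaults and allow the three extra tags.
ALLOWED_TAGS = list(bleach.sanitizer.ALLOWED_TAGS) + ["del", "details", "summary"]

def clean_comment(html: str) -> str:
    return bleach.clean(html, tags=ALLOWED_TAGS)

print(clean_comment("<details><summary>hint</summary><del>old</del></details>"))
```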
Jelle van der Waa
029ce3b418
templates: update Gitlab navbar to point to Arch namespace
Instead of showing your own projects, show the Arch Linux namespace
where all our bugs/projects are.
2023-11-24 18:20:25 +01:00
Jelle van der Waa
3241391af0
templates: update bugs navbar entry to GitLab
Flyspray is no more and all projects are now on our own GitLab instance.
2023-11-12 16:02:16 +01:00
moson
5d302ae00c
feat: Support timezone and language query params
Support setting the timezone as well as the language via query params:
The timezone parameter previously only worked on certain pages.
While we're at it, let's also add the language as a param.
Refactor code for timezone and language functions.
Remove unused AURTZ cookie.

Signed-off-by: moson <moson@archlinux.org>
2023-10-21 10:41:44 +02:00
moson
933654fcbb
fix: Restrict context var override on the package page
Users can (accidentally) override context vars with query params.
This may lead to issues when rendering templates (e.g. "comments=").

Signed-off-by: moson <moson@archlinux.org>
2023-10-21 10:41:43 +02:00
moson
40c1d3e8ee
fix(ci): Don't create error reports from sandbox
We should not try to create issue reports for internal server errors
from a sandbox/review-app environment.

Signed-off-by: moson <moson@archlinux.org>
2023-10-20 15:45:58 +02:00
Hanabishi
2b8c8fc92a fix: make dependency source use superscript tag
Avoid using special characters and use the '<sup>' HTML tag instead,
so as not to rely on the Unicode coverage of the user's fonts.

Closes: #490
Signed-off-by: Hanabishi <1722-hanabishi@users.noreply.gitlab.archlinux.org>
2023-10-18 16:19:58 +00:00
moson
27c51430fb
chore(release): prepare for 6.2.8
Signed-off-by: moson <moson@archlinux.org>
2023-10-15 20:52:57 +02:00
moson
27cd533654
fix: Skip setting existing context values
When setting up a context with user provided variables,
we should not override any existing values previously set.

Signed-off-by: moson <moson@archlinux.org>
2023-10-12 18:09:07 +02:00
moson
2166426d4c
fix(deps): update dependencies
* Updating typing-extensions (4.5.0 -> 4.8.0)
* Installing annotated-types (0.5.0)
* Updating anyio (3.6.2 -> 3.7.1)
* Installing pydantic-core (2.10.1)
* Updating certifi (2023.5.7 -> 2023.7.22)
* Updating cffi (1.15.1 -> 1.16.0)
* Updating greenlet (2.0.2 -> 3.0.0)
* Updating markupsafe (2.1.2 -> 2.1.3)
* Updating packaging (23.1 -> 23.2)
* Updating pluggy (1.0.0 -> 1.3.0)
* Updating pydantic (1.10.7 -> 2.4.2)
* Updating charset-normalizer (3.1.0 -> 3.3.0)
* Updating click (8.1.3 -> 8.1.7)
* Updating coverage (7.2.7 -> 7.3.2)
* Updating cryptography (40.0.2 -> 41.0.4)
* Updating dnspython (2.3.0 -> 2.4.2)
* Updating execnet (1.9.0 -> 2.0.2)
* Updating fastapi (0.100.1 -> 0.103.2)
* Updating httpcore (0.17.0 -> 0.17.3)
* Updating parse (1.19.0 -> 1.19.1)
* Updating prometheus-client (0.16.0 -> 0.17.1)
* Updating pytest (7.4.0 -> 7.4.2)
* Updating redis (4.6.0 -> 5.0.1)
* Updating urllib3 (2.0.2 -> 2.0.6)
* Updating aiofiles (23.1.0 -> 23.2.1)
* Updating alembic (1.11.2 -> 1.12.0)
* Updating fakeredis (2.17.0 -> 2.19.0)
* Updating filelock (3.12.2 -> 3.12.4)
* Updating orjson (3.9.2 -> 3.9.7)
* Updating protobuf (4.23.4 -> 4.24.4)
* Updating pygit2 (1.12.2 -> 1.13.1)
* Updating werkzeug (2.3.6 -> 3.0.0)

Signed-off-by: moson <moson@archlinux.org>
2023-10-05 17:59:14 +02:00
moson
fd3022ff6c
fix: Correct password length message.
The wrong config option was used to display the minimum length error msg.
(username_min_len instead of passwd_min_len)

Signed-off-by: moson <moson@archlinux.org>
2023-10-02 13:47:38 +02:00
moson
9e9ba15813
housekeep: TU rename - Misc
Fix some more test functions

Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:05 +02:00
moson
d2d47254b4
housekeep: TU rename - Table/Column names, scripts
TU_VoteInfo -> VoteInfo
TU_Votes -> Votes
TU_VoteInfo.ActiveTUs -> VoteInfo.ActiveUsers

script: tuvotereminder -> votereminder
Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:05 +02:00
moson
87f6791ea8
housekeep: TU rename - Comments
Changes to comments, function descriptions, etc.

Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:05 +02:00
moson
61f1e5b399
housekeep: TU rename - Test suite
Rename tests: Function names, variables, etc.

Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:05 +02:00
moson
148c882501
housekeep: TU rename - /tu routes
Change /tu to /package-maintainer

Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:04 +02:00
moson
f540c79580
housekeep: TU rename - UI elements
Rename all UI elements and translations.

Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:04 +02:00
moson
1702075875
housekeep: TU rename - code changes
Renaming of symbols. Functions, variables, values, DB values, etc.
Basically everything that is not user-facing.

This only covers "Trusted User" things:
tests, comments, etc. will be covered in a following commit.
2023-09-30 16:45:04 +02:00
moson
7466e96449
fix(ci): Exclude review-app jobs for renovate MR's
Signed-off-by: moson <moson@archlinux.org>
2023-09-26 13:47:03 +02:00
moson
0a7b02956f
feat: Indicate dependency source
Dependencies might reside in the AUR or official repositories.
Add "AUR" as superscript letters to indicate if a package/provider
is present in the AUR.

Signed-off-by: moson <moson@archlinux.org>
2023-09-03 14:17:11 +02:00
moson
1433553c05
fix(test): Clear previous prometheus data for test
It could happen that test data is already generated by a previous test.
(running in the same worker)

Make sure we clear everything before performing our checks.

Signed-off-by: moson <moson@archlinux.org>
2023-09-01 22:51:55 +02:00
moson
5699e9bb41
fix(test): Remove file locking and semaphore
All tests within a file run in the same worker and our test DB names
are unique per file as well. We don't really need a locking
mechanism here.

Same is valid for the test-emails. The only potential issue is that it
might try to create the same directory multiple times and thus run
into an error. However, that can be covered by specifying
"exist_ok=True" with os.makedirs such that those errors are ignored.

Signed-off-by: moson <moson@archlinux.org>
2023-09-01 22:51:55 +02:00
moson
9eda6a42c6
feat: Add ansible provisioning step for review-app
Clone infrastructure repository and run playbook to provision our VM
with aurweb.

Signed-off-by: moson <moson@archlinux.org>
2023-08-27 13:54:39 +02:00
Kristian Klausen
6c610b26a3
feat: Add terraform config for review-app[1]
Also removed the logic for deploying to the long-gone aur-dev box.

Ansible will be added in an upcoming commit for configuring and
deploying aurweb on the VM.

[1] https://docs.gitlab.com/ee/ci/review_apps/
2023-08-27 12:05:52 +02:00
moson
3005e82f60
fix: Cleanup prometheus metrics for dead workers
The current "cleanup" function that is removing orphan prometheus files
is actually never invoked.
We move this to a default gunicorn config file to register our hook(s).

https://docs.gunicorn.org/en/stable/configure.html
https://docs.gunicorn.org/en/stable/settings.html#child-exit
Signed-off-by: moson <moson@archlinux.org>
2023-08-18 22:04:55 +02:00
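This follows the hook documented by prometheus_client for gunicorn's multiprocess mode; a minimal gunicorn config file (e.g. a gunicorn.conf.py) would contain roughly:

```python
from prometheus_client import multiprocess

def child_exit(server, worker):
    # Called by gunicorn when a worker dies; remove that worker's metric
    # files so they don't linger as orphans in the multiprocess dir.
    multiprocess.mark_process_dead(worker.pid)
```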
Leonidas Spyropoulos
f05f1dbac7
chore(release): prepare for 6.2.7
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-08-04 19:18:38 +03:00
renovate
8ad03522de
fix(deps): update all non-major dependencies 2023-08-04 14:25:22 +00:00
moson
94b62d2949
fix: Check if user exists when editing account
We should check if a user (target) exists before validating permissions.
Otherwise things crash when a TU is trying to edit an account that
does not exist.

Fixes: aurweb-errors#529
Signed-off-by: moson <moson@archlinux.org>
2023-08-04 14:12:50 +02:00
renovate
7a44f37968
fix(deps): update dependency fastapi to v0.100.1 2023-07-27 19:24:28 +00:00
renovate
969b84afe4
fix(deps): update all non-major dependencies 2023-07-25 11:24:30 +00:00
renovate
f74f94b501
fix(deps): update dependency gunicorn to v21 2023-07-24 11:24:26 +00:00
moson
375895f080
feat: Add Prometheus metrics for requests
Adds gauge for requests by type and status

Signed-off-by: moson <moson@archlinux.org>
2023-07-23 22:46:44 +02:00
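A hedged sketch of such a labelled gauge; the metric and label names below are assumptions, not aurweb's actual identifiers:

```python
from prometheus_client import Gauge

REQUESTS = Gauge("aur_requests", "Package requests", ["type", "status"])

# Typically updated from a periodic count query; value here is illustrative.
REQUESTS.labels(type="deletion", status="pending").set(42)
```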
moson
e45878a058
fix: Fix issue with requests totals
The problem is that we join with PackageBase, and thus we are missing
requests for packages that were deleted.

Fixes: #483
Signed-off-by: moson <moson@archlinux.org>
2023-07-23 18:53:58 +02:00
moson
6cd70a5c9f
test: Add tests for user/package statistics
Signed-off-by: moson <moson@archlinux.org>
2023-07-23 13:58:51 +02:00
moson
8699457917
feat: Separate cache expiry for stats and search
Allows us to set different cache eviction timespans for search queries
and statistics.

Stats and especially "last package updates" should probably be refreshed
more often, whereas we might want to cache search results for a bit
longer.

So this gives us a bit more flexibility playing around with different
settings and tweak things.

Signed-off-by: moson <moson@archlinux.org>
2023-07-23 13:58:50 +02:00
moson
44c158b8c2
feat: Implement statistics class & additional metrics
The new module/class helps us construct queries and count records to
expose various statistics on the homepage. We also utilize it for some new
prometheus metrics (package and user gauges).
Record counts are being cached with Redis.

Signed-off-by: moson <moson@archlinux.org>
2023-07-23 13:58:50 +02:00
moson
347c2ce721
change: Change order of commit validation routine
We currently validate all commits going from latest -> oldest.

It would be nicer to go oldest -> latest so that, in case of errors,
we would indicate which commit "introduced" the problem.

Signed-off-by: moson <moson@archlinux.org>
2023-07-22 10:45:08 +02:00
moson
bc03d8b8f2
fix: Fix middleware checking for accepted terms
The current query is a bit mixed up. The intention was to return the
number of unaccepted records; however, it also counts records
that were accepted by some other user.

Let's check the total number of terms vs. the number of accepted
records (by our user) instead.

Signed-off-by: moson <moson@archlinux.org>
2023-07-20 18:21:05 +02:00
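The corrected check, as a hedged sketch (model classes are passed in here just to keep the sketch self-contained; the real aurweb schema may differ):

```python
from sqlalchemy import func
from sqlalchemy.orm import Session

def has_unaccepted_terms(session: Session, user_id: int, Terms, AcceptedTerms) -> bool:
    # Total number of terms on record.
    total = session.query(func.count(Terms.ID)).scalar()
    # Number of terms accepted by *this* user only.
    accepted = (
        session.query(func.count(AcceptedTerms.TermsID))
        .filter(AcceptedTerms.UsersID == user_id)
        .scalar()
    )
    return accepted < total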
moson
5729d6787f
fix: git links in comments for multiple OIDs
The chance of finding multiple object IDs when performing lookups with
a shortened SHA1 hash (7 digits) seems to be quite high.

In those cases pygit2 will throw an error.
Let's catch those exceptions and gracefully handle them.

Fixes: aurweb-errors#496 (and alike)
Signed-off-by: moson <moson@archlinux.org>
2023-07-17 12:45:16 +02:00
renovate
862221f5ce
fix(deps): update all non-major dependencies 2023-07-15 20:27:12 +00:00
moson
27819b4465
fix: /rss lazy load issue & perf improvements
Some fixes for the /rss endpoints

* Load all data in one go:
Previously data was lazy loaded thus it made several sub-queries per
package (> 200 queries for composing the rss data for a single request).
Now we are performing a single SQL query.
(request time improvement: 550ms -> 130ms)
This also fixes aurweb-errors#510 and alike

* Remove some "dead code":
The fields "source, author, link" were never included in the rss output
(wrong or insufficient data passed to the different entry.xyz functions)
Nobody seems to be missing them anyways, so let's remove them.

* Remove "Last-Modified" header:
Obsolete since nginx can/will only handle "If-Modified-Since" requests
in its current configuration. All requests are passed to fastapi anyways.

Signed-off-by: moson <moson@archlinux.org>
2023-07-13 18:27:02 +02:00
moson
fa1212f2de
fix: translations not containing string formatting
In some translations we might be missing replacement placeholders (%).
This turns out to be problematic when calling the format function.

Wrap the jinja2 format function and just return the string unformatted
when % is missing.

Fixes: #341
Signed-off-by: moson <moson@archlinux.org>
2023-07-10 18:02:20 +02:00
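The wrapper idea, as a minimal sketch (not the actual aurweb filter):

```python
def safe_format(value: str, *args) -> str:
    if "%" not in value:
        # Translation lost its placeholders; return it unformatted
        # instead of raising a TypeError from %-formatting.
        return value
    return value % args

assert safe_format("Hello %s", "world") == "Hello world"
assert safe_format("Hello") == "Hello"
```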
moson
c0bbe21d81
fix(test): correct test for ssh-key parsing
Our set of keys returned by "util.parse_ssh_keys" is unordered so we
have to adapt our test to not rely on a specific order for multiple keys.

Fixes: 5ccfa7c0fd ("fix: same ssh key entered multiple times")
Signed-off-by: moson <moson@archlinux.org>
2023-07-09 16:13:02 +02:00
moson
5ccfa7c0fd
fix: same ssh key entered multiple times
Users might accidentally paste their ssh key multiple times
when they try to register or edit their account.

Convert our list of keys to a set, removing any duplicate keys.

Signed-off-by: moson <moson@archlinux.org>
2023-07-09 14:52:15 +02:00
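The dedup step, sketched (names assumed):

```python
def dedupe_ssh_keys(keys: list[str]) -> set[str]:
    # A set drops keys pasted more than once; note it is unordered,
    # which is what the follow-up test fix above accounts for.
    return {k.strip() for k in keys if k.strip()}

assert dedupe_ssh_keys(["ssh-ed25519 AAAA", "ssh-ed25519 AAAA"]) == {"ssh-ed25519 AAAA"}
```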
Leonidas Spyropoulos
225ce23761
chore(release): prepare for 6.2.6
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-07-08 12:54:43 +01:00
moson
4821fc1312
fix: show placeholder for deleted user in comments
show "<deleted-account>" in comment headers in case a user
deleted their account.

Signed-off-by: moson <moson@archlinux.org>
2023-07-08 13:44:24 +02:00
Leonidas Spyropoulos
1f40f6c5a0
housekeep: set current maintainers
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-07-08 12:38:19 +01:00
renovate
81d29b4c66
fix(deps): update dependency fastapi to ^0.100.0 2023-07-08 11:24:29 +00:00
renovate
7cde1ca560
fix(deps): update all non-major dependencies 2023-07-08 09:25:09 +00:00
moson-mo
f3f8c0a871
fix: add recipients to BCC when email is hidden
Package requests are sent to the ML as well as users (CC).
For those who chose to hide their mail address,
we should add them to the BCC list instead.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-07-08 11:19:02 +02:00
moson
9fe8d524ff
fix(test): MariaDB 11 upgrade, query result order
Fix order of recipients for "FlagNotification" test.
Apply sorting to the recipients query.
(only relevant for tests, but who knows when they change things again)

MariaDB 11 includes some changes related to the
query optimizer. Turns out that this might have effects
on how records are ordered for certain queries.
(in case no ORDER BY clause was specified)

https://mariadb.com/kb/en/mariadb-11-0-0-release-notes/
Signed-off-by: moson <moson@archlinux.org>
2023-07-08 10:32:26 +02:00
moson-mo
814ccf6b04
feat: add Prometheus metrics for Redis cache
Adding a Prometheus counter to be able to monitor cache hits/misses
for search queries

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-07-04 11:57:56 +02:00
moson-mo
3acfb08a0f
feat: cache package search results with Redis
The queries being done on the package search page are quite costly.
(Especially the default one ordered by "Popularity" when navigating to /packages)

Let's add the search results to the Redis cache:
Every result of a search query is being pushed to Redis until we hit our maximum of 50k.
An entry expires after 3 minutes, at which point it is evicted from the cache.
Lifetime and max values are configurable.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-07-04 11:57:56 +02:00
moson-mo
7c8b9ba6bc
perf: add index to tweak our default search query
Adds an index on PackageBases.Popularity and PackageBases.Name to
improve performance of our default search query sorted by "Popularity"

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-07-02 13:55:21 +02:00
moson-mo
c41f2e854a
perf: tweak some search queries
We are currently sorting on two columns in different tables, which is quite
expensive in terms of performance:
MariaDB first merges the data into some temporary table to apply the
sorting and record limiting.

We can tweak a couple of these queries by changing the "order by" clause
such that they refer to columns within the same table (PackageBases).
So instead of performing the secondary sort on "Packages.Name", we do
it on "PackageBases.Name" instead.
This should still be "good enough" to produce properly sorted results.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-07-02 13:21:11 +02:00
Leonidas Spyropoulos
e2c113caee
chore(release): prepare for 6.2.5
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-06-22 19:22:56 +01:00
moson-mo
143575c9de
fix: restore command, remove premature creation of pkgbase
We're currently creating a "PackageBases" record when the "restore" command is executed.

This is problematic for pkgbases that never existed before.
In those cases it will create the record but fail in the update.py script.
Thus it leaves an orphan "PackageBases" record in the DB
(which does not have any related "Packages" record(s)).

Navigating to such a package's /pkgbase/... URL will result in a crash
since it is not foreseen to have "orphan" pkgbase records.

We can safely remove the early creation of that record because
it'll be taken care of in the update.py script that is being called

We'll also fix some tests. Before it was executing a dummy script
instead of "update.py" which might be a bit misleading
since it did not check the real outcome of our "restore" action.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-16 14:22:22 +02:00
moson-mo
c6c81f0789
housekeep: Amend .gitignore and .dockerignore
Prevent some files/dirs from ending up in the repo / docker image:
* directories typically used for python virtualenvs
* files that are being generated by running tests

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-16 13:33:39 +02:00
moson-mo
32461f28ea
fix(docker): Suppress error PEP-668
When using docker (compose), we don't create a venv and just install
python packages system-wide.

With python 3.11 (PEP 668) we need to explicitly tell pip to allow this.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-15 14:19:02 +02:00
moson-mo
58158505b0
fix: browser hints for password fields
Co-authored-by: eNV25 <env252525@gmail.com>
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-11 21:04:35 +02:00
moson-mo
ed17486da6
change(git): allow keys/pgp subdir with .asc files
This allows migration of git history for packages dropped from a repo to AUR
in case they contain PGP key material

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-11 12:20:02 +02:00
moson-mo
1c11c901a2
feat: switch requests filter for pkgname to "contains"
Use "contains" filtering instead of an exact match
when a package name filter is given.

This makes it easier to find requests for a "group" of packages.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-10 09:40:35 +02:00
Christian Heusel
26b2566b3f
change: print the user name if connecting via ssh
this is similar to the message that gitlab produces:

$ ssh -T aur.archlinux.org
Welcome to AUR, gromit! Interactive shell is disabled.
Try `ssh ssh://aur@aur.archlinux.org help` for a list of commands.

$ ssh -T gitlab.archlinux.org
Welcome to GitLab, @gromit!

Signed-off-by: Christian Heusel <christian@heusel.eu>
2023-06-08 12:47:27 +02:00
Christian Heusel
e9cc2fb437
change: only require .SRCINFO in the latest revision
This is done in order to relax the constraints so that dropping packages
from the official repos can be done while preserving their history.

It's sufficient to also have this present in the latest commit of a push.

Signed-off-by: Christian Heusel <christian@heusel.eu>
2023-06-07 18:54:31 +02:00
Leonidas Spyropoulos
ed2f85ad04
chore(release): prepare for 6.2.4
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-05-27 14:28:48 +01:00
renovate
2709585a70
fix(deps): update dependency fastapi to v0.95.2 2023-05-27 11:24:46 +00:00
renovate
d1a3fee9fe fix(deps): update all non-major dependencies 2023-05-26 21:12:13 +00:00
moson-mo
49e98d64f4
chore: increase default session/cookie timeout
change from 2 to 4 hours.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 23:03:38 +02:00
moson-mo
a7882c7533
refactor: remove session_time from user.login
The parameter is not used, we can remove it and adapt the callers.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 23:02:38 +02:00
moson-mo
22fe4a988a
fix: make AURSID a session cookie if "remember me" is not checked
This should match more closely the expectation of a user.
A session cookie should vanish on browser close,
and thus the user needs to authenticate again.

There is no need to bump the expiration of AURSID either,
so we can remove that part.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 22:57:47 +02:00
moson-mo
0807ae6b7c
test: add tests for cookie handling
add a bunch of test cases to ensure our cookies work properly

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 22:57:46 +02:00
moson-mo
d366377231
fix: make AURREMEMBER cookie a permanent one
If it's a session cookie it poses issues for users
whose browsers wipe session cookies after close.
They'd be logged out early even if they chose
the "remember me" option when they log in.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 22:57:46 +02:00
moson-mo
57c154a72c
fix: increase expiry for AURLANG cookie; only set when needed
We add a new config option for cookies with a 400 day lifetime.
AURLANG should survive longer for unauthenticated users.
Today they have to set this again after each browser restart.
(for users whose browsers wipe session cookies on close)

Authenticated users don't need this cookie
since the setting is saved to the DB.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 22:57:46 +02:00
moson-mo
638ca7b1d0
chore: remove setting AURLANG and AURTZ on login
We don't need to set these cookies when logging in.
These settings are saved to the DB anyways.
(and they are picked up from there as well for any web requests,
when no cookies are given)

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 22:57:46 +02:00
moson-mo
edc4ac332d
chore: remove setting AURLANG and AURTZ on account edit
We don't need to set these cookies when an account is edited.
These settings are saved to the DB anyways.
(and they are picked up from there as well for any web requests,
when no cookies are given)

Setting these cookies can even be counter-productive:
Imagine a TU/Dev editing another user's account.
They would overwrite their own cookies with the other user's TZ/LANG settings.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 22:57:46 +02:00
moson-mo
2eacc84cd0
fix: properly evaluate AURREMEMBER cookie
Whenever the AURREMEMBER cookie is defined, regardless of its value,
"remember_me" is always set to True.

The get method of a dict returns a string, and
converting the str "False" into a bool yields True.

We have to check AURREMEMBER's value instead.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-26 22:57:46 +02:00
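The bug in two lines, sketched (cookie dict is illustrative):

```python
cookies = {"AURREMEMBER": "False"}

remember_me = bool(cookies.get("AURREMEMBER", False))        # True -- the bug:
                                                             # any non-empty string is truthy
remember_me = cookies.get("AURREMEMBER", "False") == "True"  # correct check
```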
Daniel M. Capella
5fe375bdc3
feat: add link to MergeBaseName in requests.html 2023-05-26 13:26:41 -04:00
renovate
1b41e8572a
fix(deps): update all non-major dependencies 2023-05-26 02:24:39 +00:00
moson-mo
7a88aeb673
chore: update .gitignore for test-emails
emails generated when running tests are stored in test-emails/ dir

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-25 11:18:08 +02:00
moson-mo
f24fae0ce6
feat: Add "Requests" filter option for package name
- Add package name textbox for filtering requests (with auto-suggest)
- Make "x pending requests" a link for TU/Dev on the package details page

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-25 11:18:08 +02:00
Leonidas Spyropoulos
acdb2864de
Merge branch 'aurblup-update-repo' into 'master'
fix: update repo information with aurblup script / git packaging repo changes

See merge request archlinux/aurweb!710
2023-05-25 10:06:44 +01:00
moson-mo
146943b3b6
housekeep: support new default repos after git migration
community is merged into extra
testing -> core-testing & extra-testing

Announcement: https://archlinux.org/news/git-migration-announcement/

We list "testing" repos first:
See d0b0e4d88b

Co-authored-by: artafinde <artafinde@archlinux.org>
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-18 13:06:21 +02:00
moson-mo
d0b0e4d88b
fix: update repo information with aurblup script
Currently, the "Repo" column in "OfficialProviders" is not updated
when a package is moved from one repository to another.

Note that we only save a package/provides combination once,
hence if a package is available in core and testing at the same time,
it would put just one record into the OfficialProviders table.

We iterate through the repos one by one and the last value
is kept for mapping a (package/provides) combination to a repo.
Due to that, the repos listed in the "sync-db" config setting
should be ordered such that the "testing" repos are listed first.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-17 18:22:53 +02:00
moson-mo
3253a6ad29
fix(deps): remove urllib3 from dependency list
Previously pinned urllib3 to v1.x. This is not needed though.
The incompatibility of v2.x is with poetry itself, not with aurweb.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-07 09:58:17 +02:00
Daniel M. Capella
d2e8fa0249
chore(deps): "Group all minor and patch updates together"
Treat FastAPI separately due to regular breakage.

Co-authored-by: moson-mo <mo-son@mailbox.org>
2023-05-06 18:03:05 -04:00
Leonidas Spyropoulos
1d627edbe7
chore(release): prepare for 6.2.3
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-05-06 20:34:54 +01:00
moson-mo
b115aedf97
chore(deps): update several dependencies
- Removing rfc3986 (1.5.0)
- Updating coverage (7.2.4 -> 7.2.5)
- Updating fastapi (0.94.1 -> 0.95.1)
- Updating httpcore (0.16.3 -> 0.17.0)
- Updating sqlalchemy (1.4.47 -> 1.4.48)
- Updating httpx (0.23.3 -> 0.24.0)
- Updating prometheus-fastapi-instrumentator (5.11.2 -> 6.0.0)
- Updating protobuf (4.22.3 -> 4.22.4)
- Updating pytest-asyncio (0.20.3 -> 0.21.0)
- Updating requests (2.29.0 -> 2.30.0)
- Updating uvicorn (0.21.1 -> 0.22.0)
- Updating watchfiles (0.18.1 -> 0.19.0)
- Updating werkzeug (2.3.2 -> 2.3.3)

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-06 20:29:05 +02:00
Christian Heusel
af4239bcac
replace reference to AUR TU Guidelines with AUR Submission Guidelines
Signed-off-by: Christian Heusel <christian@heusel.eu>
2023-05-06 19:47:01 +02:00
Leonidas Spyropoulos
a8d14e0194
housekeep: remove unused templates and rework existing ones
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-05-01 10:44:45 +01:00
moson-mo
8c5b85db5c
housekeep: remove fix for poetry installer
The problems with the "modern installer" got fixed.
We don't need this workaround anymore.

https://github.com/python-poetry/poetry/issues/7572
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-01 10:23:34 +02:00
moson-mo
b3fcfb7679
doc: improve instructions for setting up a dev/test env
Provide more detailed information on how to get started with a dev/test env.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-05-01 10:23:34 +02:00
Leonidas Spyropoulos
e896edaccc
chore: support for python 3.11 and poetry.lock update
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-04-30 10:12:09 +01:00
moson-mo
bab17a9d26
doc: amend INSTALL instructions
change path for metadata archive files

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-04-29 09:59:34 +02:00
moson-mo
ad61c443f4
fix: restore & move cgit html files
restore files accidentally deleted with PHP cleanup.

1325c71712/web/template/cgit
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-04-29 09:55:54 +02:00
moson-mo
8ca63075e9
housekeep: remove PHP implementation
removal of the PHP codebase

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-04-28 16:10:32 +02:00
moson-mo
97d0eac303
housekeep: copy static files
we copy static files used by PHP and Python versions into /static

preparation work for the removal of the PHP version

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-04-28 10:53:22 +02:00
Leonidas Spyropoulos
1325c71712
chore: update poetry.lock
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-04-24 09:13:38 +01:00
Leonidas Spyropoulos
6ede837b4f
feat: allow users to hide deleted comments
Closes: #435

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-04-24 09:13:38 +01:00
Leonidas Spyropoulos
174af5f025
chore(release): prepare for 6.2.2
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-03-15 12:09:47 +00:00
moson-mo
993a044680
fix(poetry): use classic installer
The "install" module (v0.6.0) which is being used by poetry 1.4.0
has problems installing certain packages.

Disable the modern installer for now, until things are fixed.

https://github.com/python-poetry/poetry/issues/7572
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-03-14 17:57:57 +01:00
moson-mo
bf0d4a2be7
fix(deps): bump dependencies
bump all deps except sqlalchemy.

- Updating exceptiongroup (1.1.0 -> 1.1.1)
- Updating pydantic (1.10.5 -> 1.10.6)
- Updating starlette (0.25.0 -> 0.26.1)
- Updating charset-normalizer (3.0.1 -> 3.1.0)
- Updating fastapi (0.92.0 -> 0.94.1)
- Updating setuptools (67.4.0 -> 67.6.0)
- Updating urllib3 (1.26.14 -> 1.26.15)
- Updating alembic (1.9.4 -> 1.10.2)
- Updating fakeredis (2.9.2 -> 2.10.0)
- Updating prometheus-fastapi-instrumentator (5.10.0 -> 5.11.1)
- Updating protobuf (4.22.0 -> 4.22.1)
- Updating pytest-xdist (3.2.0 -> 3.2.1)
- Updating uvicorn (0.20.0 -> 0.21.0)
- Updating filelock (3.9.0 -> 3.9.1)

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-03-14 17:57:56 +01:00
moson-mo
b9df7541b3
fix: add comments in email for direct deletion/merge
TUs and Devs can delete and merge packages directly.
Currently the comments they enter don't end up in the ML notification.

Include the comment in the notifications for direct deletion / merge

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-03-14 11:17:45 +01:00
moson-mo
7d1827ffc5
feat: cancel button for comment editing
Adds button that allows cancellation while editing a comment

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-03-09 21:48:58 +01:00
moson-mo
52c962a590
fix(deps): fastapi 0.92.0 upgrade
middleware must be added before startup:

fixes: "RuntimeError: Cannot add middleware after an application has started"

https://fastapi.tiangolo.com/release-notes/#0910
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-03-04 10:29:54 +01:00
moson-mo
c0390240bc
housekeep(deps): bump dependencies
update all poetry deps to the latest version (except sqlalchemy)

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-03-04 10:27:57 +01:00
moson-mo
7d06c9ab97
fix: encode package name in URL for source files
Package(Base) names might include characters that require url-encoding

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-03-01 18:04:20 +01:00
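The fix amounts to quoting the name before embedding it in the source URL; a sketch (URL shape assumed for illustration):

```python
from urllib.parse import quote

pkgbase = "libc++"  # '+' and similar characters must not appear raw in a URL
url = f"https://aur.archlinux.org/cgit/aur.git/tree/PKGBUILD?h={quote(pkgbase)}"
print(url)  # ...?h=libc%2B%2B
```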
moson-mo
8aac842307
fix(test): use single-quotes for strings in sql statements
Currently, in the sharness test suites, we use double-quotes
for string literals in SQL statements passed to sqlite3.

With sqlite version 3.41 the usage of double-quotes for string literals
is deactivated by default:
We'll need to switch to single-quotes in our tests.

Ref: Section 6.f. at https://www.sqlite.org/releaselog/3_41_0.html
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-02-24 10:11:33 +01:00
moson-mo
0c5b4721d6
fix: include package data without "Last Packager"
Data for packages that do not have a "Last Packager"
(e.g. because the user account was deleted)
should still be available from the /rpc and metadata archives.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-02-21 11:19:02 +01:00
moson-mo
8d2e176c2f
housekeep: stop "pkgmaint" script (cron job)
With the removal of the "setup-repo" command this script becomes obsolete,
because it is not possible to reserve a repo anymore.
Hence we don't need cleanup.

We've also seen issues in case the last packager's user account is removed,
leading to the deletion of a Package.

Let's deactivate this for now.

Issue report: #425

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-02-21 11:19:02 +01:00
moson-mo
b1a9efd552
housekeep(git): remove deprecated "setup-repo" command
Marked as deprecated about 6 years ago.
Time to bury it.

Issue report: #428

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-02-21 11:19:02 +01:00
Leonidas Spyropoulos
68813abcf0
fix(RTL): make RTL layout properly displayed
Closes: #290

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-02-19 02:14:57 +00:00
Leonidas Spyropoulos
45218c4ce7
fix: per-page needs to be non zero
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-02-08 15:13:21 +00:00
Leonidas Spyropoulos
cb16f42a27
fix: validate timezone before use
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-02-06 16:40:43 +00:00
moson-mo
f9a5188fb7
chore(lint): reformatting after black update
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-02-06 09:15:18 +01:00
moson-mo
2373bdf400
fix(deps): bump pre-commit hooks
Bump hooks with "pre-commit autoupdate".

There is an issue with the latest poetry version and the "isort" hook module
https://github.com/PyCQA/isort/issues/2077

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-02-06 09:12:00 +01:00
Leonidas Spyropoulos
8b25d11a3a
chore(release): prepare for 6.2.1
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-01-27 18:08:54 +00:00
Leonidas Spyropoulos
ef2baad7b3
feat: expand on update.py tests and show on Gitlab UI
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-01-27 17:16:25 +00:00
moson-mo
137ed04d34
test: add tests .SRCINFO parsing and git update routine
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-27 15:40:25 +01:00
moson-mo
97e1f07f71
fix(deps): update srcinfo to 0.1.2
Fixes issue parsing .SRCINFO files

Issue report: #422

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-27 14:04:55 +01:00
Leonidas Spyropoulos
2b76b90885
chore(release): prepare for 6.2.0
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-01-26 23:19:04 +00:00
moson-mo
7f9ac28f6e
feat(deps): add watchfiles
When running aurweb with hot-reloading, the CPU consumption is quite high.
This is because it is using "StatReload" for detecting modified files.
(which seems to be rather inefficient)

When "watchfiles" is installed it'll automatically usees that instead and
CPU load goes down to 1%.
watchfiles uses filesystem events for detecting changes and is way more efficient.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-26 12:59:40 +01:00
Leonidas Spyropoulos
255cdcf667
fix:(revert): fix: only try to show dependencies if object exists
This reverts commit 0e44687ab1.

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-01-25 22:17:33 +00:00
moson-mo
ec239ceeb3
feat: add "Last Updated" column to search results
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-25 22:39:36 +01:00
moson-mo
becce1aac4
fix: occasional errors when loading package details
Fixes errors that might occur when loading the package details page.

Problem:
We are querying a list of "Required by" packages.
This list is loaded with all details for a "PackageDependency" record.

Now we also have a reference to some attributes from the
related package (PackageDependency.Package.xxx)

This will effectively trigger the ORM to run another query (lazyload),
to fetch the missing Package data (for each PackageDependency record).

At that point it might have happened that a referenced package
got deleted / updated so that we can't retrieve this data anymore and
our dep.Package object is "None"

Fix:
We can force our query to include Package data right away.
Thus we can avoid running a separate query (per "required by"...)

As a side-effect we get better performance.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-25 22:34:19 +01:00
Leonidas Spyropoulos
6c9be9eb97
fix(deps): update dependencies from renovate
fastapi ^0.89.0
coverage v7
srcinfo ^0.1.0

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-01-25 21:17:50 +00:00
Leonidas Spyropoulos
c176b2b611
feature: increase mandatory coverage to 95%
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-01-25 19:34:52 +00:00
moson-mo
ff0123b54a
fix: save notification state for unchanged comments
When we edit a comment we can enable notifications (if not yet enabled).

We should also do this when the comment text is not changed.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-25 09:42:20 +01:00
moson-mo
36fd58d7a6
fix: show notification box when adding a comment
Currently, the "Enable notifications" checkbox
is only shown when editing a comment.

We should also show it when a new comment is about to be added.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-25 09:42:19 +01:00
moson-mo
65ba735f18
fix: bleach upgrade 6.0
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-23 23:50:04 +01:00
renovate
a2487c20d8
fix(deps): update dependency bleach to v6 2023-01-23 17:24:53 +00:00
Christian Heusel
f41f090ed7 simplify the docker development setup instructions
use `docker compose exec` instead of `docker ps` and `docker exec`

Signed-off-by: Christian Heusel <christian@heusel.eu>
2023-01-15 09:25:22 +00:00
Leonidas Spyropoulos
0e44687ab1 fix: only try to show dependencies if object exists
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-01-14 21:08:34 +00:00
Leonidas Spyropoulos
4d0a982c51 fix: assert offset and per_page are positive
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-01-14 20:57:11 +00:00
moson-mo
f6c4891415
feat: add Support section to Dashboard
Adds the "Support" section (displayed on "Home") to the "Dashboard" page as well.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-14 13:12:33 +01:00
moson-mo
2150f8bc19
fix(docker): nginx health check
nginx health check always results in "unhealthy":

There is no such option "--no-verify" for curl.
We can use "-k" or "--insecure" for disabling SSL checks.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-13 10:26:43 +01:00
moson-mo
ff44eb02de
feat: add link to mailing list article on requests page
Provides a convenient way to check for responses on the
mailing list prior to Accepting/Rejecting requests.

We compute the Message-ID hash that can be used to
link back to the article in the mailing list archive.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-12 12:06:28 +01:00
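Assuming the archive is HyperKitty-style, the Message-ID hash it uses in permalinks is commonly the base32-encoded SHA-1 of the Message-ID; a sketch of computing it:

```python
import base64
import hashlib

def message_id_hash(message_id: str) -> str:
    # Strip the angle brackets, then base32-encode the SHA-1 digest.
    msgid = message_id.strip("<>").encode()
    return base64.b32encode(hashlib.sha1(msgid).digest()).decode()

print(message_id_hash("<20230112120628.example@aur.archlinux.org>"))
```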
Kevin Morris
154bb239bf
update-zh_TW translations 2023-01-11 12:25:54 -08:00
Kevin Morris
65d364fe90
update-zh_CN translations 2023-01-11 12:25:53 -08:00
Kevin Morris
ef0e3b9f35
update-zh translations 2023-01-11 12:25:53 -08:00
Kevin Morris
2770952dfb
update-vi translations 2023-01-11 12:25:53 -08:00
Kevin Morris
4cff1e500b
update-uk translations 2023-01-11 12:25:53 -08:00
Kevin Morris
b36cbd526b
update-tr translations 2023-01-11 12:25:52 -08:00
Kevin Morris
5609ddf791
update-sv_SE translations 2023-01-11 12:25:52 -08:00
Kevin Morris
8592bada16
update-sr_RS translations 2023-01-11 12:25:52 -08:00
Kevin Morris
46c925bc82
update-sr translations 2023-01-11 12:25:52 -08:00
Kevin Morris
8ee843b7b1
update-sk translations 2023-01-11 12:25:51 -08:00
Kevin Morris
ebae0d4304
update-ru translations 2023-01-11 12:25:51 -08:00
Kevin Morris
fa20a3b5d8
update-ro translations 2023-01-11 12:25:51 -08:00
Kevin Morris
e7bcf2fc97
update-pt_PT translations 2023-01-11 12:25:51 -08:00
Kevin Morris
bb00a4ecfd
update-pt_BR translations 2023-01-11 12:25:50 -08:00
Kevin Morris
6ee7598211
update-pt translations 2023-01-11 12:25:50 -08:00
Kevin Morris
e572b86fd3
update-pl translations 2023-01-11 12:25:50 -08:00
Kevin Morris
05c6266986
update-nl translations 2023-01-11 12:25:50 -08:00
Kevin Morris
57a2b4b516
update-nb_NO translations 2023-01-11 12:25:49 -08:00
Kevin Morris
d20dbbcf74
update-nb translations 2023-01-11 12:25:49 -08:00
Kevin Morris
e5137e0c42
update-lt translations 2023-01-11 12:25:49 -08:00
Kevin Morris
e6d36101d9
update-ko translations 2023-01-11 12:25:49 -08:00
Kevin Morris
08af8cad8d
update-ja translations 2023-01-11 12:25:49 -08:00
Kevin Morris
a12dbd191a
update-it translations 2023-01-11 12:25:48 -08:00
Kevin Morris
0d950a0c9f
update-is translations 2023-01-11 12:25:48 -08:00
Kevin Morris
3a460faa6e
update-id_ID translations 2023-01-11 12:25:48 -08:00
Kevin Morris
28e8b31211
update-id translations 2023-01-11 12:25:48 -08:00
Kevin Morris
5f71e58db1
update-hu translations 2023-01-11 12:25:47 -08:00
Kevin Morris
bf348fa572
update-hr translations 2023-01-11 12:25:47 -08:00
Kevin Morris
b209cd962c
update-hi_IN translations 2023-01-11 12:25:47 -08:00
Kevin Morris
9385c14f77
update-he translations 2023-01-11 12:25:47 -08:00
Kevin Morris
ff01947f3d
update-fr translations 2023-01-11 12:25:47 -08:00
Kevin Morris
3fa9047864
update-fi_FI translations 2023-01-11 12:25:46 -08:00
Kevin Morris
bce9bedaf4
update-fi translations 2023-01-11 12:25:46 -08:00
Kevin Morris
076245e061
update-et translations 2023-01-11 12:25:46 -08:00
Kevin Morris
aeb38b599d
update-es translations 2023-01-11 12:25:46 -08:00
Kevin Morris
6bf408775c
update-el translations 2023-01-11 12:25:46 -08:00
Kevin Morris
791e715aee
update-de translations 2023-01-11 12:25:45 -08:00
Kevin Morris
5a7a9c2c9f
update-da translations 2023-01-11 12:25:45 -08:00
Kevin Morris
da458ae70a
update-cs translations 2023-01-11 12:25:45 -08:00
Kevin Morris
618a382e6c
update-ca_ES translations 2023-01-11 12:25:45 -08:00
Kevin Morris
d6661403aa
update-ca translations 2023-01-11 12:25:45 -08:00
Kevin Morris
9229220e21
update-bg translations 2023-01-11 12:25:44 -08:00
Kevin Morris
b89fe9eb13
update-az_AZ translations 2023-01-11 12:25:44 -08:00
Kevin Morris
3a13eeb744
update-az translations 2023-01-11 12:25:44 -08:00
Kevin Morris
65266d752b
update-ar translations 2023-01-11 03:09:09 -08:00
Kevin Morris
413de914ca
fix: remove trailing whitespace lint check for ./po
Signed-off-by: Kevin Morris <kevr@0cost.org>
2023-01-10 14:36:31 -08:00
moson-mo
7a9448a3e5
perf: improve packages search-query
Improves performance for queries with large result sets.

The "group by" clause can be removed for all search types but the keywords.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-05 22:00:32 +01:00
moson-mo
d8e91d058c
fix(rpc): provides search should return name match
We need to return packages matching on the name as well.
(A package always provides itself)

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-01-03 15:58:45 +01:00
moson-mo
2b8dedb3a2
feat: add pagination element below comments
other pages like the "package search" have this as well.

Issue report: #390

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-28 17:01:44 +01:00
moson-mo
8027ff936c
fix: alignment of pagination element
pagination for comments should appear on the right instead of center

Issue report: #390

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-28 16:57:27 +01:00
Leonidas Spyropoulos
c74772cb36
chore: bump to v6.1.9
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-27 10:34:07 +00:00
moson-mo
7864ac6dfe
fix: search-by parameter for keyword links
Fixes:
Keyword links on the package page pass the wrong query parameter,
so a name/description search is performed instead of a keyword search.

Issue report: #397

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-27 10:33:58 +01:00
moson-mo
a08681ba23
fix: Add "Show more..." link for "Required by"
Fix glitch on the package page:
"Show more..." not displayed for the "Required by" list

Fix test case:
Function name does not start with "test", hence it was never executed during test runs.

Issue report: #363

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-25 12:24:04 +01:00
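For context on the test-case fix: pytest only collects functions whose names start with "test", so a mistyped name silently never runs. An illustrative sketch:

def show_more_required_by():       # never collected, never runs
    assert False

def test_show_more_required_by():  # collected and executed by pytest
    assert True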
moson-mo
a832b3cddb
fix(test): FastAPI 0.87.0 - warning fixes
FastAPI 0.87.0 switched to the httpx library for their TestClient

* cookies need to be defined on the request instance instead of method calls

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-24 22:43:31 +01:00
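A hedged sketch of the cookie change (route and cookie names are illustrative): with the httpx-based TestClient, cookies are set on the client instance rather than passed per call.

from fastapi import FastAPI, Request
from fastapi.testclient import TestClient

app = FastAPI()

@app.get("/me")
async def me(request: Request):
    return {"sid": request.cookies.get("AURSID")}

client = TestClient(app)

# Before (requests-based client): client.get("/me", cookies={"AURSID": "abc"})
# After (httpx-based client): set cookies on the instance.
client.cookies = {"AURSID": "abc"}
assert client.get("/me").json() == {"sid": "abc"}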
moson-mo
1216399d53
fix(test): FastAPI 0.87.0 - error fixes
FastAPI 0.87.0 switched to the httpx library for their TestClient

* allow_redirects is deprecated and replaced by follow_redirects

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-24 22:23:37 +01:00
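A hedged sketch of the rename (route is illustrative); the behavior is unchanged, only the keyword argument differs:

from fastapi import FastAPI
from fastapi.responses import RedirectResponse
from fastapi.testclient import TestClient

app = FastAPI()

@app.get("/old")
async def old():
    return RedirectResponse("/new")

client = TestClient(app)

# Before: client.get("/old", allow_redirects=False)  # deprecated by httpx
resp = client.get("/old", follow_redirects=False)
assert resp.status_code == 307  # RedirectResponse's default status code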
renovate
512ba02389
fix(deps): update dependency fastapi to ^0.87.0 2022-11-23 00:25:31 +00:00
Leonidas Spyropoulos
6b0978b9a5
fix(deps): update dependencies from renovate
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-22 21:51:15 +00:00
moson-mo
d5e102e3f4
feat: add "Submitter" field to /rpc info request
Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-22 18:46:57 +01:00
Leonidas Spyropoulos
ff92e95f7a
fix: delete associated ssh public keys with account deletion
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-22 16:51:09 +00:00
Leonidas Spyropoulos
bce5b81acd
feat: allow filtering requests from maintainers
These are usually easy for TUs to handle, so allow filtering for them.

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-22 16:39:11 +00:00
Leonidas Spyropoulos
500d6b403b
feat: add co-maintainers to RPC
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-22 16:32:51 +00:00
moson-mo
bcd808ddc1
feat(rpc): add "by" parameter - comaintainers
Add "by" parameter: comaintainers

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-11 11:32:39 +01:00
moson-mo
efd20ed2c7
feat(rpc): add "by" parameter - keywords
Add "by" parameter: keywords

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-11 11:32:31 +01:00
moson-mo
5484e68b42
feat(rpc): add "by" parameter - submitter
Add "by" parameter: submitter

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-11 11:32:19 +01:00
moson-mo
0583f30a53
feat(rpc): add "by" parameter - groups
Adding "by" parameter to search by "groups"

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-11 11:32:01 +01:00
moson-mo
50287cb066
feat(rpc): add "by" parameters - package relations
This adds new "by" search-parameters: provides, conflicts and replaces

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-11 11:30:44 +01:00
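A hedged usage sketch for the new search parameters against the classic RPC endpoint (argument values are illustrative):

import requests

resp = requests.get(
    "https://aur.archlinux.org/rpc",
    params={"v": 5, "type": "search", "by": "provides", "arg": "java-environment"},
)
print(resp.json()["resultcount"])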
Leonidas Spyropoulos
73f0bddf0b
fix: handle default requests when using pages
The default page shows the pending requests, which worked fine when one
used the Filters button. This fixes the case where someone navigates by
using the pager (Next, Last, etc.).

Closes: #405

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-08 13:14:42 +00:00
moson-mo
c248a74f80
chore: fix mailing-list URL on passreset page
small addition to the patch provided in #404

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-07 14:36:34 +01:00
Lex Black
4f56a01662
chore: fix mailing-lists urls
Those changed after the migration to mailman3

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-04 14:17:08 +00:00
Leonidas Spyropoulos
c0e806072e
chore: bump to v6.1.8
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-01 18:31:37 +00:00
Leonidas Spyropoulos
d00371f444
housekeep: bump renovate dependencies
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-01 17:24:13 +00:00
Leonidas Spyropoulos
f10c1a0505
perf: add PackageKeywords.PackageBaseID index
This index is used by the export for package-meta.v1.gz generation.

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-11-01 17:24:13 +00:00
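A hedged sketch of what such an index addition looks like as an alembic migration (revision identifiers omitted, index name illustrative):

from alembic import op

def upgrade():
    op.create_index(
        "PackageKeywordsPackageBaseID", "PackageKeywords", ["PackageBaseID"]
    )

def downgrade():
    op.drop_index("PackageKeywordsPackageBaseID", "PackageKeywords")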
moson-mo
5669821b29
perf: tweak some queries in mkpkglists
We can omit the "distinct" from some queries
because constraints in the DB ensure uniqueness:

* Groups sub-query
PackageGroup -> Primary key makes "PackageID" + "GroupID" unique
Groups -> Unique index on "Name" column
-> Technically we can't have a package with the same group-name twice

* Licenses sub-query:
PackageLicense -> Primary key makes "PackageID" + "LicenseID" unique
Licenses -> Unique index on "Name" column
-> Technically we can't have a package with the same license-name twice

* Keywords sub-query:
PackageKeywords -> Primary key makes "PackageBaseID" + "KeywordID" unique
(And a Package can only have one PackageBase)
Keywords -> Unique index on "Name" column
-> Technically we can't have a package with the same Keyword twice

* Packages main-query:
We join PackageBases and Users on their primary key columns
(which are guaranteed to be unique)
-> There is no way we could end up with more than one record for a Package

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-11-01 18:18:06 +01:00
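A sketch of the DISTINCT removal for one of the sub-queries (query shape illustrative, table names taken from the message above):

from sqlalchemy import text

# Before: DISTINCT guarded against duplicates the schema already prevents.
before = text(
    "SELECT DISTINCT pg.PackageID, g.Name "
    "FROM PackageGroup pg JOIN Groups g ON g.ID = pg.GroupID"
)

# After: the primary key on (PackageID, GroupID) and the unique index on
# Groups.Name make each row unique, so DISTINCT is pure overhead.
after = text(
    "SELECT pg.PackageID, g.Name "
    "FROM PackageGroup pg JOIN Groups g ON g.ID = pg.GroupID"
)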
Leonidas Spyropoulos
286834bab1
fix: regression on gzipped filenames from 3dcbee5a
With 3dcbee5a, the filenames inside the .gz archives contained .tmp at
the end. This fixes them by using the GzipFile class constructor instead
of the gzip.open method.

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-31 14:43:31 +00:00
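A minimal sketch of the fix, assuming the write-to-.tmp-then-rename pattern introduced by 3dcbee5a: gzip.open(tmp) records the temporary file's name in the gzip header, while an explicit GzipFile constructor lets us store the final name instead.

import gzip
import os

def write_archive(path: str, data: bytes) -> None:
    tmp = path + ".tmp"
    # Before: gzip.open(tmp, "wb") embedded "<name>.gz.tmp" in the header.
    with open(tmp, "wb") as out, gzip.GzipFile(
        filename=os.path.basename(path), mode="wb", fileobj=out
    ) as gz:
        gz.write(data)
    os.rename(tmp, path)  # atomic replacement on POSIX filesystems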
Mario Oenning
6ee34ab3cb feat: add field "CoMaintainers" to metadata-archives 2022-10-31 09:42:56 +00:00
Mario Oenning
333051ab1f feat: add field "Submitter" to metadata-archives 2022-10-28 16:55:16 +00:00
Leonidas Spyropoulos
48e5dc6763
feat: remove empty lines from ssh_keys text area, and show helpful message
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-28 13:43:32 +01:00
Leonidas Spyropoulos
7e06823e58
refactor: remove redundant parentheses when returning a tuple
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-28 13:43:32 +01:00
Leonidas Spyropoulos
d793193fdf
style: make logging easier to read
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-28 13:43:32 +01:00
Mario Oenning
3dcbee5a4f fix: make overwriting of archive files atomic 2022-10-28 12:42:50 +00:00
Leonidas Spyropoulos
524334409a
fix: add production logging.prod.conf to be less verbose
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-22 21:58:30 +01:00
Leonidas Spyropoulos
0417603499
housekeep: bump renovate dependencies
email-validator:  1.2.1 -> ^1.3.0
uvicorn:          ^0.18.0 -> ^0.19.0
fastapi:          ^0.83.0 -> ^0.85.0
pytest-asyncio:   ^0.19.0 -> ^0.20.1
pytest-cov:       ^3.0.0 -> ^4.0.0

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-22 21:48:40 +01:00
Leonidas Spyropoulos
8555e232ae
docs: fix mailing list after migration to mailman3
Closes: #396

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-22 20:15:46 +01:00
Leonidas Spyropoulos
9c0f8f053e
chore: rename logging.py and redis.py to avoid circular imports
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-22 18:51:38 +01:00
Leonidas Spyropoulos
b757e66997 feature: add filters and stats for requests
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-10-15 15:26:53 +03:00
Kevin Morris
da5a646a73
upgrade: bump to v6.1.7
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-10-11 15:04:25 -07:00
Kevin Morris
18f5e142b9
fix: include orphaned packages in metadata output
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-10-11 14:50:09 -07:00
Kevin Morris
3ae6323a7c
upgrade: bump to v6.1.6
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-30 05:19:58 -07:00
Kevin Morris
8657fd336e
feat: GET|POST /account/{name}/delete
Closes #348

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-30 05:08:50 -07:00
Kevin Morris
1180565d0c
Merge branch 'upd-metadata-doc' 2022-09-26 01:39:09 -07:00
Kevin Morris
eb0c5605e4
upgrade: bump version to v6.1.5
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-26 01:28:38 -07:00
Kevin Morris
e00b0059f7
doc: remove --spec popularity from cron recommendations
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-26 01:27:37 -07:00
Leonidas Spyropoulos
0dddaeeb98
fix: remove sessions of suspended users
Fixes: #394

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-09-26 08:59:44 +01:00
moson-mo
137644e919
docs: suggest shallow clone in git-archive.md
We should suggest making a shallow clone to reduce the amount of data
transferred initially.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2022-09-25 10:03:05 +02:00
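For instance (repository URL illustrative), a shallow clone fetches only the latest commit:

$ git clone --depth=1 https://example.org/aur-metadata.git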
Kevin Morris
30e72d2db5 feat: archive git repository (experimental)
See doc/git-archive.md for general Git archive specifications
See doc/repos/metadata-repo.md for info and direction related to the new Git metadata archive
2022-09-24 16:51:25 +00:00
Kevin Morris
ec3152014b
fix: retry transactions who fail due to deadlocks
In my opinion, this kind of handling of transactions is pretty ugly.
That being said, we have issues with running into deadlocks on aur.al,
so this commit works against that immediate bug.

An ideal solution would be to deal with retrying transactions through
the `db.begin()` scope, so we wouldn't have to explicitly annotate
functions as "retry functions," which is what this commit does.

Closes #376

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-13 12:54:08 -07:00
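A minimal sketch of the "retry function" idea (the decorator mirrors aurweb's async_retry_deadlock in spirit; this body is illustrative, not the project's implementation):

import functools
import time

class DeadlockError(Exception):
    """Stand-in for the database driver's deadlock exception."""

def retry_deadlock(attempts: int = 3, backoff: float = 0.05):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(attempts):
                try:
                    return func(*args, **kwargs)  # run the transaction
                except DeadlockError:
                    if attempt == attempts - 1:
                        raise  # give up after the final attempt
                    time.sleep(backoff * (attempt + 1))
        return wrapper
    return decorator

@retry_deadlock()
def update_votes():
    ...  # open a transaction, modify rows, commit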
Kevin Morris
f450b5dfc7
upgrade: bump to version v6.1.4
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-12 12:29:57 -07:00
Kevin Morris
adc3a21863
fix: add 'unsafe-inline' to script-src CSP
swagger-ui uses inline JavaScript to bootstrap itself, so we need to
allow 'unsafe-inline' because we can't give swagger-ui a nonce to embed.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-12 12:28:42 -07:00
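The resulting header looks roughly like this (nonce value illustrative):

nonce = "d2VsbCBoZWxsbw"  # per-request nonce in the real middleware
csp = (
    "default-src 'self'; "
    f"script-src 'self' 'unsafe-inline' 'nonce-{nonce}'; "
    "style-src 'self' 'unsafe-inline'"
)
# response.headers["Content-Security-Policy"] = csp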
Kevin Morris
37c7dee099
fix: produce DeleteNotification a line before handle_request
With this on a single line, the argument ordering and class/function
execution order was unpredictable, causing exceptions to be thrown when
producing a notification based on an already-deleted pkgbase object.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-12 10:36:50 -07:00
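A hedged sketch of the ordering fix (class and function names follow the commit message; bodies are stand-ins): constructing the notification on its own line guarantees it captures the pkgbase data before handle_request can delete it.

class DeleteNotification:
    def __init__(self, pkgbase_id: int):
        self.pkgbase_id = pkgbase_id  # capture data while it still exists

def handle_request(pkgbase: dict) -> None:
    pkgbase.clear()  # stand-in for deleting the pkgbase record

pkgbase = {"ID": 42}
# Before (fragile, single line): build the notification inline with the call.
notif = DeleteNotification(pkgbase["ID"])  # produced one line before
handle_request(pkgbase)                    # pkgbase is gone afterwards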
Kevin Morris
624954042b
doc(rpc): include route doc at the top of aurweb.routers.rpc
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-12 06:59:52 -07:00
Kevin Morris
17f2c05fd3
feat(rpc): add GET /rpc/v5/suggest/{arg} openapi route
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-12 06:50:26 -07:00
Kevin Morris
8e8b746a5b
feat(rpc): add GET /rpc/v5/search/{arg} openapi route
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-12 06:50:19 -07:00
Kevin Morris
5e75a00c17
upgrade: bump to version v6.1.3
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-11 19:59:16 -07:00
Kevin Morris
9faa7b801d
feat: add cdn.jsdelivr.net to script/style CSP
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-11 19:56:29 -07:00
Kevin Morris
df0a4a2be2
feat(rpc): add /rpc/v5/{type} openapi-compatible routes
We will be modeling future RPC implementations on an OpenAPI spec.
While this commit does not completely conform to OpenAPI in terms
of response data, it is a good start and will allow us to cleanly
document these OpenAPI routes now and in the future.

This commit brings in the new RPC routes:
- GET /rpc/v5/info/{pkgname}
- GET /rpc/v5/info?arg[]=pkg1&arg[]=pkg2
- POST /rpc/v5/info with JSON data `{"arg": ["pkg1", "pkg2"]}`
- GET /rpc/v5/search?arg=keywords&by=valid-by-value
- POST /rpc/v5/search with JSON data `{"by": "valid-by-value", "arg": "keywords"}`

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-11 19:11:18 -07:00
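A hedged usage sketch of the new routes (package names and the by value are illustrative):

import requests

base = "https://aur.archlinux.org"

info = requests.get(f"{base}/rpc/v5/info/aurweb").json()
many = requests.get(f"{base}/rpc/v5/info", params={"arg[]": ["pkg1", "pkg2"]}).json()
found = requests.get(
    f"{base}/rpc/v5/search", params={"arg": "keywords", "by": "name"}
).json()

info_post = requests.post(f"{base}/rpc/v5/info", json={"arg": ["pkg1", "pkg2"]}).json()
search_post = requests.post(
    f"{base}/rpc/v5/search", json={"by": "name", "arg": "keywords"}
).json()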
renovate
bb6e602e13 fix(deps): update dependency fastapi to ^0.83.0 2022-09-12 01:42:09 +00:00
Kevin Morris
4e0618469d
fix(test): JSONResponse() requires a content argument with fastapi 0.83.0
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-11 18:40:31 -07:00
Kevin Morris
b3853e01b8
fix(pre-commit): include migrations in fixes/checks
We want all Python files related to the project to be checked. Most of
them already are, but migrations are a core part of FastAPI aurweb and
should be included as well.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-11 18:07:54 -07:00
Kevin Morris
03776c4663
fix(docker): cache & install pre-commit deps during image build
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-11 18:00:11 -07:00
Kevin Morris
a2d08e441e
fix(docker): run pre-commit run -a once
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-11 17:59:45 -07:00
Kevin Morris
6ad24fc950
Merge branch 'fix-docker-test' 2022-09-11 15:57:08 -07:00
renovate
69d6724749
fix(deps): update dependency redis to v4 2022-09-10 05:25:06 +00:00
renovate
307d944cf1
fix(deps): update dependency protobuf to v4 2022-09-10 03:25:08 +00:00
renovate
3de17311cf
fix(deps): update dependency bleach to v5 2022-09-10 00:25:02 +00:00
renovate
7ad22d8143
fix(deps): update dependency bcrypt to v4 2022-09-07 14:24:55 +00:00
renovate
6ab9663b76
fix(deps): update dependency authlib to v1 2022-09-07 06:25:25 +00:00
renovate
486f8bd61c
fix(deps): update dependency aiofiles to v22 2022-09-07 04:24:53 +00:00
renovate
a39f34d695
chore(deps): update dependency pytest to v7 2022-09-07 03:25:30 +00:00
renovate
bb310bdf65
fix(deps): update dependency uvicorn to ^0.18.0 2022-09-07 02:24:55 +00:00
renovate
a73af3e76d
fix(deps): update dependency hypercorn to ^0.14.0 2022-09-07 01:25:03 +00:00
renovate
a981ae4052
fix(deps): update dependency httpx to ^0.23.0 2022-09-07 00:25:32 +00:00
renovate
cdc7bd618c
fix(deps): update dependency email-validator to v1.2.1 2022-09-06 23:24:49 +00:00
renovate
b38e765dfe
fix(deps): update dependency aiofiles to ^0.8.0 2022-09-06 22:24:52 +00:00
renovate
655402a509
chore(deps): update dependency pytest-asyncio to ^0.19.0 2022-09-06 10:25:02 +00:00
renovate
a84d115fa1
chore(deps): add renovate.json 2022-09-06 08:24:03 +00:00
Leonidas Spyropoulos
310c469ba8
fix: run pre-commit checks instead of flake8 and isort
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2022-09-06 08:07:05 +01:00
Kevin Morris
25e05830a6
test: test that /packages/{name} produces the package's description
This commit fixes two of our tests in test_templates.py to go along
with our new template modifications, as well as a new test in
test_packages_routes.py which constructs two packages belonging
to the same package base, then tests that viewing their pages
produces their independent descriptions.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-05 19:50:41 -07:00
Kevin Morris
0388b12896
fix: package description on /packages/{name} view
...What in the world happened here. We were literally just populating
`pkg` based on `pkgbase.packages.first()`. We should have been focusing
on the package passed by the context, which is always available when
`show_package_details` is true.

Closes #384

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-05 19:25:32 -07:00
Kevin Morris
83ddbd220f
test: get /requests displays all requests, including those without a User
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-05 02:56:48 -07:00
Kevin Morris
a629098b92
fix: conditional display on Request's 'Filed by' field
Since we support requests which have no associated user, we must
support the case where we are displaying such a request.

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-05 02:55:20 -07:00
Kevin Morris
7fed5742b8
fix: display requests for TUs which no longer have an associated User
Closes #387

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-05 02:43:23 -07:00
Kevin Morris
6435c2b1f1
upgrade: bump to version v6.1.2
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-02 15:28:02 -07:00
Kevin Morris
b8a4ce4ceb
fix: include maint/comaint state in pkgbase post's error context
Closes #386

Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-09-02 15:12:41 -07:00
Kevin Morris
8a3a7e31ac
upgrade: bump version to v6.1.1
Signed-off-by: Kevin Morris <kevr@0cost.org>
2022-08-31 22:01:54 -07:00
401 changed files with 14944 additions and 19814 deletions

View file

@ -3,7 +3,7 @@ disable_warnings = already-imported
[report]
include = aurweb/*
fail_under = 85
fail_under = 95
exclude_lines =
if __name__ == .__main__.:
pragma: no cover

View file

@ -1,6 +1,23 @@
*/*.mo
# Config files
conf/config
conf/config.sqlite
conf/config.sqlite.defaults
conf/docker
conf/docker.defaults
# Compiled translation files
**/*.mo
# Typical virtualenv directories
env/
venv/
.venv/
# Test output
htmlcov/
test-emails/
test/__pycache__
test/test-results
test/trash_directory*
.coverage
.pytest_cache

View file

@ -1,5 +1,5 @@
# EditorConfig configuration for aurweb
# https://EditorConfig.org
# https://editorconfig.org
# Top-most EditorConfig file
root = true
@ -8,6 +8,3 @@ root = true
end_of_line = lf
insert_final_newline = true
charset = utf-8
[*.{php,t}]
indent_style = tab

1
.env
View file

@ -1,7 +1,6 @@
FASTAPI_BACKEND="uvicorn"
FASTAPI_WORKERS=2
MARIADB_SOCKET_DIR="/var/run/mysqld/"
AURWEB_PHP_PREFIX=https://localhost:8443
AURWEB_FASTAPI_PREFIX=https://localhost:8444
AURWEB_SSHD_PREFIX=ssh://aur@localhost:2222
GIT_DATA_DIR="./aur.git/"

21
.gitignore vendored
View file

@ -24,7 +24,6 @@ conf/docker
conf/docker.defaults
data.sql
dummy-data.sql*
env/
fastapi_aw/
htmlcov/
po/*.mo
@ -32,7 +31,7 @@ po/*.po~
po/POTFILES
schema/aur-schema-sqlite.sql
test/test-results/
test/trash directory*
test/trash_directory*
web/locale/*/
web/html/*.gz
@ -44,3 +43,21 @@ doc/rpc.html
# Ignore .python-version file from Pyenv
.python-version
# Ignore coverage report
coverage.xml
# Ignore pytest report
report.xml
# Ignore test emails
test-emails/
# Ignore typical virtualenv directories
env/
venv/
.venv/
# Ignore some terraform files
/ci/tf/.terraform
/ci/tf/terraform.tfstate*

View file

@ -13,24 +13,22 @@ variables:
TEST_RECURSION_LIMIT: 10000
CURRENT_DIR: "$(pwd)"
LOG_CONFIG: logging.test.conf
DEV_FQDN: aurweb-$CI_COMMIT_REF_SLUG.sandbox.archlinux.page
INFRASTRUCTURE_REPO: https://gitlab.archlinux.org/archlinux/infrastructure.git
lint:
stage: .pre
before_script:
- pacman -Sy --noconfirm --noprogressbar --cachedir .pkg-cache
- pacman -Sy --noconfirm --noprogressbar
archlinux-keyring
- pacman -Syu --noconfirm --noprogressbar --cachedir .pkg-cache
- pacman -Syu --noconfirm --noprogressbar
git python python-pre-commit
script:
# https://github.com/pre-commit/pre-commit/issues/2178#issuecomment-1002163763
- export SETUPTOOLS_USE_DISTUTILS=stdlib
- export XDG_CACHE_HOME=.pre-commit
- pre-commit run -a
test:
stage: test
tags:
- fast-single-thread
before_script:
- export PATH="$HOME/.poetry/bin:${PATH}"
- ./docker/scripts/install-deps.sh
@ -51,44 +49,113 @@ test:
# Run sharness.
- make -C test sh
# Run pytest.
- pytest
- pytest --junitxml="pytest-report.xml"
- make -C test coverage # Produce coverage reports.
coverage: '/TOTAL.*\s+(\d+\%)/'
coverage: '/(?i)total.*? (100(?:\.0+)?\%|[1-9]?\d(?:\.\d+)?\%)$/'
artifacts:
reports:
junit: pytest-report.xml
coverage_report:
coverage_format: cobertura
path: coverage.xml
deploy:
stage: deploy
tags:
- secure
rules:
- if: $CI_COMMIT_BRANCH == "pu"
when: manual
variables:
FASTAPI_BACKEND: gunicorn
FASTAPI_WORKERS: 5
AURWEB_PHP_PREFIX: https://aur-dev.archlinux.org
AURWEB_FASTAPI_PREFIX: https://aur-dev.archlinux.org
AURWEB_SSHD_PREFIX: ssh://aur@aur-dev.archlinux.org:2222
COMMIT_HASH: $CI_COMMIT_SHA
GIT_DATA_DIR: git_data
script:
- pacman -Syu --noconfirm docker docker-compose socat openssh
- chmod 600 ${SSH_KEY}
- socat "UNIX-LISTEN:/tmp/docker.sock,reuseaddr,fork" EXEC:"ssh -o UserKnownHostsFile=${SSH_KNOWN_HOSTS} -Ti ${SSH_KEY} ${SSH_USER}@${SSH_HOST}" &
- export DOCKER_HOST="unix:///tmp/docker.sock"
# Set secure login config for aurweb.
- sed -ri "s/^(disable_http_login).*$/\1 = 1/" conf/config.dev
- docker-compose build
- docker-compose -f docker-compose.yml -f docker-compose.aur-dev.yml down --remove-orphans
- docker-compose -f docker-compose.yml -f docker-compose.aur-dev.yml up -d
- docker image prune -f
- docker container prune -f
- docker volume prune -f
.init_tf: &init_tf
- pacman -Syu --needed --noconfirm terraform
- export TF_VAR_name="aurweb-${CI_COMMIT_REF_SLUG}"
- TF_ADDRESS="${CI_API_V4_URL}/projects/${TF_STATE_PROJECT}/terraform/state/${CI_COMMIT_REF_SLUG}"
- cd ci/tf
- >
terraform init \
-backend-config="address=${TF_ADDRESS}" \
-backend-config="lock_address=${TF_ADDRESS}/lock" \
-backend-config="unlock_address=${TF_ADDRESS}/lock" \
-backend-config="username=x-access-token" \
-backend-config="password=${TF_STATE_GITLAB_ACCESS_TOKEN}" \
-backend-config="lock_method=POST" \
-backend-config="unlock_method=DELETE" \
-backend-config="retry_wait_min=5"
deploy_review:
stage: deploy
script:
- *init_tf
- terraform apply -auto-approve
environment:
name: development
url: https://aur-dev.archlinux.org
name: review/$CI_COMMIT_REF_NAME
url: https://$DEV_FQDN
on_stop: stop_review
auto_stop_in: 1 week
rules:
- if: $CI_COMMIT_REF_NAME =~ /^renovate\//
when: never
- if: $CI_MERGE_REQUEST_ID && $CI_PROJECT_PATH == "archlinux/aurweb"
when: manual
provision_review:
stage: deploy
needs:
- deploy_review
script:
- *init_tf
- pacman -Syu --noconfirm --needed ansible git openssh jq
# Get ssh key from terraform state file
- mkdir -p ~/.ssh
- chmod 700 ~/.ssh
- terraform show -json |
jq -r '.values.root_module.resources[] |
select(.address == "tls_private_key.this") |
.values.private_key_openssh' > ~/.ssh/id_ed25519
- chmod 400 ~/.ssh/id_ed25519
# Clone infra repo
- git clone $INFRASTRUCTURE_REPO
- cd infrastructure
# Remove vault files
- rm $(git grep -l 'ANSIBLE_VAULT;1.1;AES256$')
# Remove vault config
- sed -i '/^vault/d' ansible.cfg
# Add host config
- mkdir -p host_vars/$DEV_FQDN
- 'echo "filesystem: btrfs" > host_vars/$DEV_FQDN/misc'
# Add host
- echo "$DEV_FQDN" > hosts
# Add our pubkey and hostkeys
- ssh-keyscan $DEV_FQDN >> ~/.ssh/known_hosts
- ssh-keygen -f ~/.ssh/id_ed25519 -y > pubkeys/aurweb-dev.pub
# Run our ansible playbook
- >
ansible-playbook playbooks/aur-dev.archlinux.org.yml \
-e "aurdev_fqdn=$DEV_FQDN" \
-e "aurweb_repository=$CI_REPOSITORY_URL" \
-e "aurweb_version=$CI_COMMIT_SHA" \
-e "{\"vault_mariadb_users\":{\"root\":\"aur\"}}" \
-e "vault_aurweb_db_password=aur" \
-e "vault_aurweb_gitlab_instance=https://does.not.exist" \
-e "vault_aurweb_error_project=set-me" \
-e "vault_aurweb_error_token=set-me" \
-e "vault_aurweb_secret=aur" \
-e "vault_goaurrpc_metrics_token=aur" \
-e '{"root_additional_keys": ["moson.pub", "aurweb-dev.pub"]}'
environment:
name: review/$CI_COMMIT_REF_NAME
action: access
rules:
- if: $CI_COMMIT_REF_NAME =~ /^renovate\//
when: never
- if: $CI_MERGE_REQUEST_ID && $CI_PROJECT_PATH == "archlinux/aurweb"
stop_review:
stage: deploy
needs:
- deploy_review
script:
- *init_tf
- terraform destroy -auto-approve
- 'curl --silent --show-error --fail --header "Private-Token: ${TF_STATE_GITLAB_ACCESS_TOKEN}" --request DELETE "${CI_API_V4_URL}/projects/${TF_STATE_PROJECT}/terraform/state/${CI_COMMIT_REF_SLUG}"'
environment:
name: review/$CI_COMMIT_REF_NAME
action: stop
rules:
- if: $CI_COMMIT_REF_NAME =~ /^renovate\//
when: never
- if: $CI_MERGE_REQUEST_ID && $CI_PROJECT_PATH == "archlinux/aurweb"
when: manual

View file

@ -1,14 +0,0 @@
## Checklist
- [ ] I have set a Username in the Details section
- [ ] I have set an Email in the Details section
- [ ] I have set a valid Account Type in the Details section
## Details
- Instance: aur-dev.archlinux.org
- Username: the_username_you_want
- Email: valid@email.org
- Account Type: (User|Trusted User)
/label account-request

View file

@ -1,12 +1,24 @@
<!--
This template is used to report potential bugs with the AURweb website.
NOTE: All comment sections with a MODIFY note need to be edited. All checkboxes
in the "Checklist" section need to be checked by the owner of the issue.
-->
/label ~bug ~unconfirmed
/title [BUG] <!-- MODIFY: add subject -->
<!--
Please do not remove the above quick actions, which automatically label the
issue and assign relevant users.
-->
### Checklist
This bug template is meant to provide bug issues for code existing in
the aurweb repository. This bug template is **not meant** to handle
bugs with user-uploaded packages.
**NOTE:** This bug template is meant to provide bug issues for code existing in
the aurweb repository.
To work out a bug you have found in a user-uploaded package, contact
the package's maintainer first. If you receive no response, file the
relevant package request against it so TUs can deal with cleanup.
**This bug template is not meant to handle bugs with user-uploaded packages.**
To report issues you might have found in a user-uploaded package, contact
the package's maintainer in comments.
- [ ] I confirm that this is an issue with aurweb's code and not a
user-uploaded package.
@ -29,7 +41,7 @@ this bug.
### Logs
If you have any logs relevent to the bug, include them here in
If you have any logs relevant to the bug, include them here in
quoted or code blocks.
### Version(s)

View file

@ -1,3 +1,25 @@
<!--
This template is used to feature request for AURweb website.
NOTE: All comment sections with a MODIFY note need to be edited. All checkboxes
in the "Checklist" section need to be checked by the owner of the issue.
-->
/label ~feature ~unconfirmed
/title [FEATURE] <!-- MODIFY: add subject -->
<!--
Please do not remove the above quick actions, which automatically label the
issue and assign relevant users.
-->
### Checklist
**NOTE:** This bug template is meant to provide bug issues for code existing in
the aurweb repository.
**This bug template is not meant to handle bugs with user-uploaded packages.**
To report issues you might have found in a user-uploaded package, contact
the package's maintainer in comments.
- [ ] I have summed up the feature in concise words in the [Summary](#summary) section.
- [ ] I have completely described the feature in the [Description](#description) section.
- [ ] I have completed the [Blockers](#blockers) section.
@ -28,5 +50,3 @@ Example:
- [Feature] Do not allow users to be Tyrants
- \<(issue|merge_request)_link\>
/label feature unconsidered

View file

@ -1,58 +0,0 @@
**NOTE:** This issue template is only applicable to FastAPI implementations
in the code-base, which only exists within the `pu` branch. If you wish to
file an issue for the current PHP implementation of aurweb, please file a
standard issue prefixed with `[Bug]` or `[Feature]`.
**Checklist**
- [ ] I have prefixed the issue title with `[Feedback]` along with a message
pointing to the route or feature tested.
- Example: `[Feedback] /packages/{name}`
- [ ] I have completed the [Changes](#changes) section.
- [ ] I have completed the [Bugs](#bugs) section.
- [ ] I have completed the [Improvements](#improvements) section.
- [ ] I have completed the [Summary](#summary) section.
### Changes
Please describe changes in user experience when compared to the PHP
implementation. This section can actually hold a lot of info if you
are up for it -- changes in routes, HTML rendering, back-end behavior,
etc.
If you cannot see any changes from your standpoint, include a short
statement about that fact.
### Bugs
Please describe any bugs you've experienced while testing the route
pertaining to this issue. A "perfect" bug report would include your
specific experience, what you expected to occur, and what happened
otherwise. If you can, please include output of `docker-compose logs fastapi`
with your report; especially if any unintended exceptions occurred.
### Improvements
If you've experienced improvements in the route when compared to PHP,
please do include those here. We'd like to know if users are noticing
these improvements and how they feel about them.
There are multiple routes with no improvements. For these, just include
a short sentence about the fact that you've experienced none.
### Summary
First: If you've gotten here and completed the [Changes](#changes),
[Bugs](#bugs), and [Improvements](#improvements) sections, we'd like
to thank you very much for your contribution and willingness to test.
We are not a company, and we are not a large team; any bit of assistance
here helps the project astronomically and moves us closer toward a
new release.
That being said: please include an overall summary of your experience
and how you felt about the current implementation which you're testing
in comparison with PHP (current aur.archlinux.org, or https://localhost:8443
through docker).
/label feedback

View file

@ -1,8 +1,6 @@
exclude: ^migrations/versions
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.3.0
rev: v4.5.0
hooks:
- id: check-added-large-files
- id: check-case-conflict
@ -10,10 +8,11 @@ repos:
- id: check-toml
- id: end-of-file-fixer
- id: trailing-whitespace
exclude: ^po/
- id: debug-statements
- repo: https://github.com/myint/autoflake
rev: v1.4
rev: v2.3.1
hooks:
- id: autoflake
args:
@ -22,16 +21,16 @@ repos:
- --ignore-init-module-imports
- repo: https://github.com/pycqa/isort
rev: 5.10.1
rev: 5.13.2
hooks:
- id: isort
- repo: https://github.com/psf/black
rev: 22.6.0
rev: 24.4.1
hooks:
- id: black
- repo: https://github.com/PyCQA/flake8
rev: 5.0.4
rev: 7.0.0
hooks:
- id: flake8

View file

@ -1,5 +1,5 @@
[main]
host = https://www.transifex.com
host = https://app.transifex.com
[o:lfleischer:p:aurweb:r:aurwebpot]
file_filter = po/<lang>.po

View file

@ -8,7 +8,7 @@ Before sending patches, you are recommended to run `flake8` and `isort`.
You can add a git hook to do this by installing `python-pre-commit` and running
`pre-commit install`.
[1]: https://lists.archlinux.org/listinfo/aur-dev
[1]: https://lists.archlinux.org/mailman3/lists/aur-dev.lists.archlinux.org/
[2]: https://gitlab.archlinux.org/archlinux/aurweb
### Coding Guidelines
@ -91,13 +91,14 @@ browser if desired.
Accessible services (on the host):
- https://localhost:8444 (python via nginx)
- https://localhost:8443 (php via nginx)
- localhost:13306 (mariadb)
- localhost:16379 (redis)
Docker services, by default, are setup to be hot reloaded when source code
is changed.
For detailed setup instructions have a look at [TESTING](TESTING)
#### Using INSTALL
The [INSTALL](INSTALL) file describes steps to install the application on

View file

@ -2,6 +2,7 @@ FROM archlinux:base-devel
VOLUME /root/.cache/pypoetry/cache
VOLUME /root/.cache/pypoetry/artifacts
VOLUME /root/.cache/pre-commit
ENV PATH="/root/.poetry/bin:${PATH}"
ENV PYTHONPATH=/aurweb
@ -41,3 +42,6 @@ RUN ln -sf /usr/share/zoneinfo/UTC /etc/localtime
# Install translations.
RUN make -C po all install
# Install pre-commit repositories and run lint check.
RUN pre-commit run -a

16
INSTALL
View file

@ -14,8 +14,7 @@ read the instructions below.
$ cd aurweb
$ poetry install
2) Setup a web server with PHP and MySQL. Configure the web server to redirect
all URLs to /index.php/foo/bar/. The following block can be used with nginx:
2) Setup a web server with MySQL. The following block can be used with nginx:
server {
# https is preferred and can be done easily with LetsEncrypt
@ -31,14 +30,6 @@ read the instructions below.
ssl_certificate /etc/ssl/certs/aur.cert.pem;
ssl_certificate_key /etc/ssl/private/aur.key.pem;
# Asset root. This is used to match against gzip archives.
root /srv/http/aurweb/web/html;
# TU Bylaws redirect.
location = /trusted-user/TUbylaws.html {
return 301 https://tu-bylaws.aur.archlinux.org;
}
# smartgit location.
location ~ "^/([a-z0-9][a-z0-9.+_-]*?)(\.git)?/(git-(receive|upload)-pack|HEAD|info/refs|objects/(info/(http-)?alternates|packs)|[0-9a-f]{2}/[0-9a-f]{38}|pack/pack-[0-9a-f]{40}\.(pack|idx))$" {
include uwsgi_params;
@ -63,6 +54,9 @@ read the instructions below.
# Static archive assets.
location ~ \.gz$ {
# Asset root. This is used to match against gzip archives.
root /srv/http/aurweb/archives;
types { application/gzip text/plain }
default_type text/plain;
add_header Content-Encoding gzip;
@ -126,7 +120,7 @@ interval:
*/2 * * * * bash -c 'poetry run aurweb-pkgmaint'
*/2 * * * * bash -c 'poetry run aurweb-usermaint'
*/2 * * * * bash -c 'poetry run aurweb-popupdate'
*/12 * * * * bash -c 'poetry run aurweb-tuvotereminder'
*/12 * * * * bash -c 'poetry run aurweb-votereminder'
7) Create a new database and a user and import the aurweb SQL schema:

View file

@ -11,8 +11,8 @@ The aurweb project includes
* A web interface to search for packaging scripts and display package details.
* An SSH/Git interface to submit and update packages and package meta data.
* Community features such as comments, votes, package flagging and requests.
* Editing/deletion of packages and accounts by Trusted Users and Developers.
* Area for Trusted Users to post AUR-related proposals and vote on them.
* Editing/deletion of packages and accounts by Package Maintainers and Developers.
* Area for Package Maintainers to post AUR-related proposals and vote on them.
Directory Layout
----------------
@ -26,7 +26,6 @@ Directory Layout
* `schema`: schema for the SQL database
* `test`: test suite and test cases
* `upgrading`: instructions for upgrading setups from one release to another
* `web`: PHP-based web interface for the AUR
Documentation
-------------
@ -57,7 +56,7 @@ Translations
------------
Translations are welcome via our Transifex project at
https://www.transifex.com/lfleischer/aurweb; see `doc/i18n.txt` for details.
https://www.transifex.com/lfleischer/aurweb; see [doc/i18n.md](./doc/i18n.md) for details.
![Transifex](https://www.transifex.com/projects/p/aurweb/chart/image_png)

208
TESTING
View file

@ -1,60 +1,130 @@
Setup Testing Environment
=========================
The quickest way to get you hacking on aurweb is to utilize docker.
In case you prefer to run it bare-metal see instructions further below.
Containerized environment
-------------------------
1) Clone the aurweb project:
$ git clone https://gitlab.archlinux.org/archlinux/aurweb.git
$ cd aurweb
2) Install the necessary packages:
# pacman -S --needed docker docker-compose
3) Build the aurweb:latest image:
# systemctl start docker
# docker compose build
4) Run local Docker development instance:
# docker compose up -d
5) Browse to local aurweb development server.
https://localhost:8444/
6) [Optionally] populate the database with dummy data:
# docker compose exec mariadb /bin/bash
# pacman -S --noconfirm words fortune-mod
# poetry run schema/gendummydata.py dummy_data.sql
# mariadb -uaur -paur aurweb < dummy_data.sql
# exit
Inspect `dummy_data.sql` for test credentials.
Passwords match usernames.
We now have a fully set up environment which we can start and stop with:
# docker compose start
# docker compose stop
Proceed with topic "Setup for running tests"
Bare Metal installation
-----------------------
Note that this setup is only to test the web interface. If you need to have a
full aurweb instance with cgit, ssh interface, etc, follow the directions in
INSTALL.
docker-compose
--------------
1) Clone the aurweb project:
$ git clone https://gitlab.archlinux.org/archlinux/aurweb.git
2) Install the necessary packages:
# pacman -S docker-compose
2) Build the aurweb:latest image:
$ cd /path/to/aurweb/
$ docker-compose build
3) Run local Docker development instance:
$ cd /path/to/aurweb/
$ docker-compose up -d nginx
4) Browse to local aurweb development server.
Python: https://localhost:8444/
PHP: https://localhost:8443/
5) [Optionally] populate the database with dummy data:
$ docker-compose up mariadb
$ docker-compose exec mariadb /bin/sh
# pacman -S --noconfirm words fortune-mod
# poetry run schema/gendummydata.py dummy_data.sql
# mysql -uaur -paur aurweb < dummy_data.sql
Inspect `dummy_data.sql` for test credentials. Passwords match usernames.
Bare Metal
----------
1) Clone the aurweb project:
$ git clone git://git.archlinux.org/aurweb.git
$ cd aurweb
2) Install the necessary packages:
# pacman -S python-poetry
# pacman -S --needed python-poetry mariadb words fortune-mod nginx
4) Install the package/dependencies via `poetry`:
3) Install the package/dependencies via `poetry`:
$ poetry install
4) Copy conf/config.dev to conf/config and replace YOUR_AUR_ROOT by the absolute
path to the root of your aurweb clone. sed can do both tasks for you:
$ sed -e "s;YOUR_AUR_ROOT;$PWD;g" conf/config.dev > conf/config
Note that when the upstream config.dev is updated, you should compare it to
your conf/config, or regenerate your configuration with the command above.
5) Set up mariadb:
# mariadb-install-db --user=mysql --basedir=/usr --datadir=/var/lib/mysql
# systemctl start mariadb
# mariadb -u root
> CREATE USER 'aur'@'localhost' IDENTIFIED BY 'aur';
> GRANT ALL ON *.* TO 'aur'@'localhost' WITH GRANT OPTION;
> CREATE DATABASE aurweb;
> exit
6) Prepare a database and insert dummy data:
$ AUR_CONFIG=conf/config poetry run python -m aurweb.initdb
$ poetry run schema/gendummydata.py dummy_data.sql
$ mariadb -uaur -paur aurweb < dummy_data.sql
7) Run the test server:
## set AUR_CONFIG to our locally created config
$ export AUR_CONFIG=conf/config
## with aurweb.spawn
$ poetry run python -m aurweb.spawn
## with systemd service
$ sudo install -m644 examples/aurweb.service /etc/systemd/system/
# systemctl enable --now aurweb.service
Setup for running tests
-----------------------
If you've set up a docker environment, you can run the full test-suite with:
# docker compose run test
You can collect code-coverage data with:
$ ./util/fix-coverage data/.coverage
See information further below on how to visualize the data.
For running individual tests, we need to perform a couple of additional steps.
In case you did the bare-metal install, steps 2, 3, 4 and 5 should be skipped.
1) Install the necessary packages:
# pacman -S --needed python-poetry mariadb-libs asciidoc openssh
2) Install the package/dependencies via `poetry`:
$ cd /path/to/aurweb/
$ poetry install
3) Copy conf/config.dev to conf/config and replace YOUR_AUR_ROOT by the absolute
@ -65,23 +135,51 @@ Bare Metal
Note that when the upstream config.dev is updated, you should compare it to
your conf/config, or regenerate your configuration with the command above.
4) Prepare a database:
4) Edit the config file conf/config and change the mysql/mariadb portion
$ cd /path/to/aurweb/
We can make use of our mariadb docker container instead of having to install
mariadb. Change the config as follows:
$ AUR_CONFIG=conf/config poetry run python -m aurweb.initdb
---------------------------------------------------------------------
; MySQL database information. User defaults to root for containerized
; testing with mysqldb. This should be set to a non-root user.
user = root
password = aur
host = 127.0.0.1
port = 13306
;socket = /var/run/mysqld/mysqld.sock
---------------------------------------------------------------------
$ poetry run schema/gendummydata.py dummy_data.sql
$ mysql -uaur -paur aurweb < dummy_data.sql
5) Start our mariadb docker container
5) Run the test server:
# docker compose start mariadb
## set AUR_CONFIG to our locally created config
$ export AUR_CONFIG=conf/config
6) Set environment variables
## with aurweb.spawn
$ poetry run python -m aurweb.spawn
$ export AUR_CONFIG=conf/config
$ export LOG_CONFIG=logging.test.conf
## with systemd service
$ sudo install -m644 examples/aurweb.service /etc/systemd/system/
$ systemctl enable --now aurweb.service
7) Compile translation & doc files
$ make -C po install
$ make -C doc
Now we can run our python test-suite or individual tests with:
$ poetry run pytest test/
$ poetry run pytest test/test_whatever.py
To run Sharness tests:
$ poetry run make -C test sh
The e-Mails that have been generated can be found at test-emails/
After test runs, code-coverage reports can be created with:
## CLI report
$ coverage report
## HTML version stored at htmlcov/
$ coverage html
More information about tests can be found at test/README.md

View file

@ -0,0 +1 @@
# aurweb.archives

View file

@ -0,0 +1 @@
# aurweb.archives.spec

View file

@ -0,0 +1,77 @@
from pathlib import Path
from typing import Any, Dict, Iterable, List, Set
class GitInfo:
"""Information about a Git repository."""
""" Path to Git repository. """
path: str
""" Local Git repository configuration. """
config: Dict[str, Any]
def __init__(self, path: str, config: Dict[str, Any] = dict()) -> "GitInfo":
self.path = Path(path)
self.config = config
class SpecOutput:
"""Class used for git_archive.py output details."""
""" Filename relative to the Git repository root. """
filename: Path
""" Git repository information. """
git_info: GitInfo
""" Bytes bound for `SpecOutput.filename`. """
data: bytes
def __init__(self, filename: str, git_info: GitInfo, data: bytes) -> "SpecOutput":
self.filename = filename
self.git_info = git_info
self.data = data
class SpecBase:
"""
Base for Spec classes defined in git_archive.py --spec modules.
All supported --spec modules must contain the following classes:
- Spec(SpecBase)
"""
""" A list of SpecOutputs, each of which contain output file data. """
outputs: List[SpecOutput] = list()
""" A set of repositories to commit changes to. """
repos: Set[str] = set()
def generate(self) -> Iterable[SpecOutput]:
"""
"Pure virtual" output generator.
`SpecBase.outputs` and `SpecBase.repos` should be populated within an
overridden version of this function in SpecBase derivatives.
"""
raise NotImplementedError()
def add_output(self, filename: str, git_info: GitInfo, data: bytes) -> None:
"""
Add a SpecOutput instance to the set of outputs.
:param filename: Filename relative to the git repository root
:param git_info: GitInfo instance
:param data: Binary data bound for `filename`
"""
if git_info.path not in self.repos:
self.repos.add(git_info.path)
self.outputs.append(
SpecOutput(
filename,
git_info,
data,
)
)

View file

@ -0,0 +1,85 @@
from typing import Iterable
import orjson
from aurweb import config, db
from aurweb.models import Package, PackageBase, User
from aurweb.rpc import RPC
from .base import GitInfo, SpecBase, SpecOutput
ORJSON_OPTS = orjson.OPT_SORT_KEYS | orjson.OPT_INDENT_2
class Spec(SpecBase):
def __init__(self) -> "Spec":
self.metadata_repo = GitInfo(
config.get("git-archive", "metadata-repo"),
)
def generate(self) -> Iterable[SpecOutput]:
# Base query used by the RPC.
base_query = (
db.query(Package)
.join(PackageBase)
.join(User, PackageBase.MaintainerUID == User.ID, isouter=True)
)
# Create an instance of RPC, use it to get entities from
# our query and perform a metadata subquery for all packages.
rpc = RPC(version=5, type="info")
print("performing package database query")
packages = rpc.entities(base_query).all()
print("performing package database subqueries")
rpc.subquery({pkg.ID for pkg in packages})
pkgbases, pkgnames = dict(), dict()
for package in packages:
# Produce RPC type=info data for `package`
data = rpc.get_info_json_data(package)
pkgbase_name = data.get("PackageBase")
pkgbase_data = {
"ID": data.pop("PackageBaseID"),
"URLPath": data.pop("URLPath"),
"FirstSubmitted": data.pop("FirstSubmitted"),
"LastModified": data.pop("LastModified"),
"OutOfDate": data.pop("OutOfDate"),
"Maintainer": data.pop("Maintainer"),
"Keywords": data.pop("Keywords"),
"NumVotes": data.pop("NumVotes"),
"Popularity": data.pop("Popularity"),
"PopularityUpdated": package.PopularityUpdated.timestamp(),
}
# Store the data in `pkgbases` dict. We do this so we only
# end up processing a single `pkgbase` if repeated after
# this loop
pkgbases[pkgbase_name] = pkgbase_data
# Remove Popularity and NumVotes from package data.
# These fields change quite often which causes git data
# modification to explode.
# data.pop("NumVotes")
# data.pop("Popularity")
# Remove the ID key from package json.
data.pop("ID")
# Add the `package`.Name to the pkgnames set
name = data.get("Name")
pkgnames[name] = data
# Add metadata outputs
self.add_output(
"pkgname.json",
self.metadata_repo,
orjson.dumps(pkgnames, option=ORJSON_OPTS),
)
self.add_output(
"pkgbase.json",
self.metadata_repo,
orjson.dumps(pkgbases, option=ORJSON_OPTS),
)
return self.outputs

View file

@ -0,0 +1,26 @@
from typing import Iterable
import orjson
from aurweb import config, db
from aurweb.models import PackageBase
from .base import GitInfo, SpecBase, SpecOutput
ORJSON_OPTS = orjson.OPT_SORT_KEYS | orjson.OPT_INDENT_2
class Spec(SpecBase):
def __init__(self) -> "Spec":
self.pkgbases_repo = GitInfo(config.get("git-archive", "pkgbases-repo"))
def generate(self) -> Iterable[SpecOutput]:
query = db.query(PackageBase.Name).order_by(PackageBase.Name.asc()).all()
pkgbases = [pkgbase.Name for pkgbase in query]
self.add_output(
"pkgbase.json",
self.pkgbases_repo,
orjson.dumps(pkgbases, option=ORJSON_OPTS),
)
return self.outputs

View file

@ -0,0 +1,31 @@
from typing import Iterable
import orjson
from aurweb import config, db
from aurweb.models import Package, PackageBase
from .base import GitInfo, SpecBase, SpecOutput
ORJSON_OPTS = orjson.OPT_SORT_KEYS | orjson.OPT_INDENT_2
class Spec(SpecBase):
def __init__(self) -> "Spec":
self.pkgnames_repo = GitInfo(config.get("git-archive", "pkgnames-repo"))
def generate(self) -> Iterable[SpecOutput]:
query = (
db.query(Package.Name)
.join(PackageBase, PackageBase.ID == Package.PackageBaseID)
.order_by(Package.Name.asc())
.all()
)
pkgnames = [pkg.Name for pkg in query]
self.add_output(
"pkgname.json",
self.pkgnames_repo,
orjson.dumps(pkgnames, option=ORJSON_OPTS),
)
return self.outputs

View file

@ -0,0 +1,26 @@
from typing import Iterable
import orjson
from aurweb import config, db
from aurweb.models import User
from .base import GitInfo, SpecBase, SpecOutput
ORJSON_OPTS = orjson.OPT_SORT_KEYS | orjson.OPT_INDENT_2
class Spec(SpecBase):
def __init__(self) -> "Spec":
self.users_repo = GitInfo(config.get("git-archive", "users-repo"))
def generate(self) -> Iterable[SpecOutput]:
query = db.query(User.Username).order_by(User.Username.asc()).all()
users = [user.Username for user in query]
self.add_output(
"users.json",
self.users_repo,
orjson.dumps(users, option=ORJSON_OPTS),
)
return self.outputs

View file

@ -6,6 +6,7 @@ import re
import sys
import traceback
import typing
from contextlib import asynccontextmanager
from urllib.parse import quote_plus
import requests
@ -13,8 +14,13 @@ from fastapi import FastAPI, HTTPException, Request, Response
from fastapi.responses import RedirectResponse
from fastapi.staticfiles import StaticFiles
from jinja2 import TemplateNotFound
from prometheus_client import multiprocess
from sqlalchemy import and_, or_
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.instrumentation.fastapi import FastAPIInstrumentor
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from sqlalchemy import and_
from starlette.exceptions import HTTPException as StarletteHTTPException
from starlette.middleware.authentication import AuthenticationMiddleware
from starlette.middleware.sessions import SessionMiddleware
@ -22,22 +28,29 @@ from starlette.middleware.sessions import SessionMiddleware
import aurweb.captcha # noqa: F401
import aurweb.config
import aurweb.filters # noqa: F401
import aurweb.logging
import aurweb.pkgbase.util as pkgbaseutil
from aurweb import logging, prometheus, util
from aurweb import aur_logging, prometheus, util
from aurweb.aur_redis import redis_connection
from aurweb.auth import BasicAuthBackend
from aurweb.db import get_engine, query
from aurweb.models import AcceptedTerm, Term
from aurweb.packages.util import get_pkg_or_base
from aurweb.prometheus import instrumentator
from aurweb.redis import redis_connection
from aurweb.routers import APP_ROUTES
from aurweb.templates import make_context, render_template
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
session_secret = aurweb.config.get("fastapi", "session_secret")
@asynccontextmanager
async def lifespan(app: FastAPI):
await app_startup()
yield
# Setup the FastAPI app.
app = FastAPI()
app = FastAPI(lifespan=lifespan)
# Instrument routes with the prometheus-fastapi-instrumentator
# library with custom collectors and expose /metrics.
@ -46,7 +59,17 @@ instrumentator().add(prometheus.http_requests_total())
instrumentator().instrument(app)
@app.on_event("startup")
# Instrument FastAPI for tracing
FastAPIInstrumentor.instrument_app(app)
resource = Resource(attributes={"service.name": "aurweb"})
otlp_endpoint = aurweb.config.get("tracing", "otlp_endpoint")
otlp_exporter = OTLPSpanExporter(endpoint=otlp_endpoint)
span_processor = BatchSpanProcessor(otlp_exporter)
trace.set_tracer_provider(TracerProvider(resource=resource))
trace.get_tracer_provider().add_span_processor(span_processor)
async def app_startup():
# https://stackoverflow.com/questions/67054759/about-the-maximum-recursion-error-in-fastapi
# Test failures have been observed by internal starlette code when
@ -69,7 +92,6 @@ async def app_startup():
f"Supported backends: {str(aurweb.db.DRIVERS.keys())}"
)
session_secret = aurweb.config.get("fastapi", "session_secret")
if not session_secret:
raise Exception("[fastapi] session_secret must not be empty")
@ -79,15 +101,7 @@ async def app_startup():
"endpoint is disabled."
)
app.mount("/static/css", StaticFiles(directory="web/html/css"), name="static_css")
app.mount("/static/js", StaticFiles(directory="web/html/js"), name="static_js")
app.mount(
"/static/images", StaticFiles(directory="web/html/images"), name="static_images"
)
# Add application middlewares.
app.add_middleware(AuthenticationMiddleware, backend=BasicAuthBackend())
app.add_middleware(SessionMiddleware, secret_key=session_secret)
app.mount("/static", StaticFiles(directory="static"), name="static_files")
# Add application routes.
def add_router(module):
@ -99,12 +113,6 @@ async def app_startup():
get_engine()
def child_exit(server, worker): # pragma: no cover
"""This function is required for gunicorn customization
of prometheus multiprocessing."""
multiprocess.mark_process_dead(worker.pid)
async def internal_server_error(request: Request, exc: Exception) -> Response:
"""
Catch all uncaught Exceptions thrown in a route.
@ -220,10 +228,16 @@ async def http_exception_handler(request: Request, exc: HTTPException) -> Respon
if exc.status_code == http.HTTPStatus.NOT_FOUND:
tokens = request.url.path.split("/")
matches = re.match("^([a-z0-9][a-z0-9.+_-]*?)(\\.git)?$", tokens[1])
if matches:
if matches and len(tokens) == 2:
try:
pkgbase = get_pkg_or_base(matches.group(1))
context = pkgbaseutil.make_context(request, pkgbase)
context["pkgbase"] = pkgbase
context["git_clone_uri_anon"] = aurweb.config.get(
"options", "git_clone_uri_anon"
)
context["git_clone_uri_priv"] = aurweb.config.get(
"options", "git_clone_uri_priv"
)
except HTTPException:
pass
@ -253,10 +267,16 @@ async def add_security_headers(request: Request, call_next: typing.Callable):
# Add CSP header.
nonce = request.user.nonce
csp = "default-src 'self'; "
script_hosts = []
csp += f"script-src 'self' 'nonce-{nonce}' " + " ".join(script_hosts)
# It's fine if css is inlined.
csp += "; style-src 'self' 'unsafe-inline'"
# swagger-ui needs access to cdn.jsdelivr.net javascript
script_hosts = ["cdn.jsdelivr.net"]
csp += f"script-src 'self' 'unsafe-inline' 'nonce-{nonce}' " + " ".join(
script_hosts
)
# swagger-ui needs access to cdn.jsdelivr.net css
css_hosts = ["cdn.jsdelivr.net"]
csp += "; style-src 'self' 'unsafe-inline' " + " ".join(css_hosts)
response.headers["Content-Security-Policy"] = csp
# Add XTCO header.
@ -279,21 +299,18 @@ async def check_terms_of_service(request: Request, call_next: typing.Callable):
"""This middleware function redirects authenticated users if they
have any outstanding Terms to agree to."""
if request.user.is_authenticated() and request.url.path != "/tos":
unaccepted = (
accepted = (
query(Term)
.join(AcceptedTerm)
.filter(
or_(
AcceptedTerm.UsersID != request.user.ID,
and_(
AcceptedTerm.UsersID == request.user.ID,
AcceptedTerm.TermsID == Term.ID,
AcceptedTerm.Revision < Term.Revision,
),
)
and_(
AcceptedTerm.UsersID == request.user.ID,
AcceptedTerm.TermsID == Term.ID,
AcceptedTerm.Revision >= Term.Revision,
),
)
)
if query(Term).count() > unaccepted.count():
if query(Term).count() - accepted.count() > 0:
return RedirectResponse("/tos", status_code=int(http.HTTPStatus.SEE_OTHER))
return await util.error_or_result(call_next, request)
@ -315,3 +332,8 @@ async def id_redirect_middleware(request: Request, call_next: typing.Callable):
return RedirectResponse(f"{path}/{id}{qs}")
return await util.error_or_result(call_next, request)
# Add application middlewares.
app.add_middleware(AuthenticationMiddleware, backend=BasicAuthBackend())
app.add_middleware(SessionMiddleware, secret_key=session_secret)

View file

@ -1,12 +1,15 @@
import fakeredis
from opentelemetry.instrumentation.redis import RedisInstrumentor
from redis import ConnectionPool, Redis
import aurweb.config
from aurweb import logging
from aurweb import aur_logging
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
pool = None
RedisInstrumentor().instrument()
class FakeConnectionPool:
"""A fake ConnectionPool class which holds an internal reference

View file

@ -71,7 +71,7 @@ class AnonymousUser:
return False
@staticmethod
def is_trusted_user():
def is_package_maintainer():
return False
@staticmethod
@ -96,6 +96,7 @@ class AnonymousUser:
class BasicAuthBackend(AuthenticationBackend):
@db.async_retry_deadlock
async def authenticate(self, conn: HTTPConnection):
unauthenticated = (None, AnonymousUser())
sid = conn.cookies.get("AURSID")
@ -103,9 +104,7 @@ class BasicAuthBackend(AuthenticationBackend):
return unauthenticated
timeout = aurweb.config.getint("options", "login_timeout")
remembered = "AURREMEMBER" in conn.cookies and bool(
conn.cookies.get("AURREMEMBER")
)
remembered = conn.cookies.get("AURREMEMBER") == "True"
if remembered:
timeout = aurweb.config.getint("options", "persistent_cookie_timeout")
@ -122,12 +121,11 @@ class BasicAuthBackend(AuthenticationBackend):
# At this point, we cannot have an invalid user if the record
# exists, due to ForeignKey constraints in the schema upheld
# by mysqlclient.
with db.begin():
user = db.query(User).filter(User.ID == record.UsersID).first()
user = db.query(User).filter(User.ID == record.UsersID).first()
user.nonce = util.make_nonce()
user.authenticated = True
return (AuthCredentials(["authenticated"]), user)
return AuthCredentials(["authenticated"]), user
def _auth_required(auth_goal: bool = True):
@ -207,7 +205,7 @@ def account_type_required(one_of: set):
@router.get('/some_route')
@auth_required(True)
@account_type_required({"Trusted User", "Trusted User & Developer"})
@account_type_required({"Package Maintainer", "Package Maintainer & Developer"})
async def some_route(request: fastapi.Request):
return Response()

View file

@ -1,7 +1,7 @@
from aurweb.models.account_type import (
DEVELOPER_ID,
TRUSTED_USER_AND_DEV_ID,
TRUSTED_USER_ID,
PACKAGE_MAINTAINER_AND_DEV_ID,
PACKAGE_MAINTAINER_ID,
USER_ID,
)
from aurweb.models.user import User
@ -30,52 +30,53 @@ PKGBASE_VOTE = 16
PKGREQ_FILE = 23
PKGREQ_CLOSE = 17
PKGREQ_LIST = 18
TU_ADD_VOTE = 19
TU_LIST_VOTES = 20
TU_VOTE = 21
PM_ADD_VOTE = 19
PM_LIST_VOTES = 20
PM_VOTE = 21
PKGBASE_MERGE = 29
user_developer_or_trusted_user = set(
[USER_ID, TRUSTED_USER_ID, DEVELOPER_ID, TRUSTED_USER_AND_DEV_ID]
user_developer_or_package_maintainer = set(
[USER_ID, PACKAGE_MAINTAINER_ID, DEVELOPER_ID, PACKAGE_MAINTAINER_AND_DEV_ID]
)
trusted_user_or_dev = set([TRUSTED_USER_ID, DEVELOPER_ID, TRUSTED_USER_AND_DEV_ID])
developer = set([DEVELOPER_ID, TRUSTED_USER_AND_DEV_ID])
trusted_user = set([TRUSTED_USER_ID, TRUSTED_USER_AND_DEV_ID])
package_maintainer_or_dev = set(
[PACKAGE_MAINTAINER_ID, DEVELOPER_ID, PACKAGE_MAINTAINER_AND_DEV_ID]
)
developer = set([DEVELOPER_ID, PACKAGE_MAINTAINER_AND_DEV_ID])
package_maintainer = set([PACKAGE_MAINTAINER_ID, PACKAGE_MAINTAINER_AND_DEV_ID])
cred_filters = {
PKGBASE_FLAG: user_developer_or_trusted_user,
PKGBASE_NOTIFY: user_developer_or_trusted_user,
PKGBASE_VOTE: user_developer_or_trusted_user,
PKGREQ_FILE: user_developer_or_trusted_user,
ACCOUNT_CHANGE_TYPE: trusted_user_or_dev,
ACCOUNT_EDIT: trusted_user_or_dev,
ACCOUNT_LAST_LOGIN: trusted_user_or_dev,
ACCOUNT_LIST_COMMENTS: trusted_user_or_dev,
ACCOUNT_SEARCH: trusted_user_or_dev,
COMMENT_DELETE: trusted_user_or_dev,
COMMENT_UNDELETE: trusted_user_or_dev,
COMMENT_VIEW_DELETED: trusted_user_or_dev,
COMMENT_EDIT: trusted_user_or_dev,
COMMENT_PIN: trusted_user_or_dev,
PKGBASE_ADOPT: trusted_user_or_dev,
PKGBASE_SET_KEYWORDS: trusted_user_or_dev,
PKGBASE_DELETE: trusted_user_or_dev,
PKGBASE_EDIT_COMAINTAINERS: trusted_user_or_dev,
PKGBASE_DISOWN: trusted_user_or_dev,
PKGBASE_LIST_VOTERS: trusted_user_or_dev,
PKGBASE_UNFLAG: trusted_user_or_dev,
PKGREQ_CLOSE: trusted_user_or_dev,
PKGREQ_LIST: trusted_user_or_dev,
TU_ADD_VOTE: trusted_user,
TU_LIST_VOTES: trusted_user_or_dev,
TU_VOTE: trusted_user,
PKGBASE_FLAG: user_developer_or_package_maintainer,
PKGBASE_NOTIFY: user_developer_or_package_maintainer,
PKGBASE_VOTE: user_developer_or_package_maintainer,
PKGREQ_FILE: user_developer_or_package_maintainer,
ACCOUNT_CHANGE_TYPE: package_maintainer_or_dev,
ACCOUNT_EDIT: package_maintainer_or_dev,
ACCOUNT_LAST_LOGIN: package_maintainer_or_dev,
ACCOUNT_LIST_COMMENTS: package_maintainer_or_dev,
ACCOUNT_SEARCH: package_maintainer_or_dev,
COMMENT_DELETE: package_maintainer_or_dev,
COMMENT_UNDELETE: package_maintainer_or_dev,
COMMENT_VIEW_DELETED: package_maintainer_or_dev,
COMMENT_EDIT: package_maintainer_or_dev,
COMMENT_PIN: package_maintainer_or_dev,
PKGBASE_ADOPT: package_maintainer_or_dev,
PKGBASE_SET_KEYWORDS: package_maintainer_or_dev,
PKGBASE_DELETE: package_maintainer_or_dev,
PKGBASE_EDIT_COMAINTAINERS: package_maintainer_or_dev,
PKGBASE_DISOWN: package_maintainer_or_dev,
PKGBASE_LIST_VOTERS: package_maintainer_or_dev,
PKGBASE_UNFLAG: package_maintainer_or_dev,
PKGREQ_CLOSE: package_maintainer_or_dev,
PKGREQ_LIST: package_maintainer_or_dev,
PM_ADD_VOTE: package_maintainer,
PM_LIST_VOTES: package_maintainer_or_dev,
PM_VOTE: package_maintainer,
ACCOUNT_EDIT_DEV: developer,
PKGBASE_MERGE: trusted_user_or_dev,
PKGBASE_MERGE: package_maintainer_or_dev,
}
def has_credential(user: User, credential: int, approved: list = tuple()):
if user in approved:
return True
return user.AccountTypeID in cred_filters[credential]

View file
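A minimal sketch of how these credential sets are consumed downstream (the `user` object and the helper are illustrative; `has_credential` is the entry point defined above):

from aurweb.auth import creds

def may_cast_proposal_vote(user) -> bool:
    # PM_VOTE maps to the package_maintainer set via cred_filters.
    return creds.has_credential(user, creds.PM_VOTE)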

@ -1,4 +1,4 @@
from datetime import datetime
from datetime import UTC, datetime
class Benchmark:
@ -7,7 +7,7 @@ class Benchmark:
def _timestamp(self) -> float:
"""Generate a timestamp."""
return float(datetime.utcnow().timestamp())
return float(datetime.now(UTC).timestamp())
def start(self) -> int:
"""Start a benchmark."""

View file
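For context, a short sketch of why the switch matters (behavior as of Python 3.12, where datetime.utcnow() is deprecated):

from datetime import UTC, datetime

aware = datetime.now(UTC)   # timezone-aware UTC timestamp
naive = datetime.utcnow()   # naive and deprecated; .timestamp() would
                            # interpret it as local time on non-UTC hosts

Using the aware variant keeps _timestamp() correct regardless of the host timezone.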

@ -1,21 +1,64 @@
from redis import Redis
import pickle
from typing import Any, Callable
from sqlalchemy import orm
from aurweb import config
from aurweb.aur_redis import redis_connection
from aurweb.prometheus import SEARCH_REQUESTS
async def db_count_cache(
redis: Redis, key: str, query: orm.Query, expire: int = None
) -> int:
_redis = redis_connection()
def lambda_cache(key: str, value: Callable[[], Any], expire: int = None) -> list:
"""Store and retrieve lambda results via redis cache.
:param key: Redis key
:param value: Lambda callable returning the value
:param expire: Optional expiration in seconds
:return: result of the callable, or the cached value
"""
result = _redis.get(key)
if result is not None:
return pickle.loads(result)
_redis.set(key, (pickle.dumps(result := value())), ex=expire)
return result
def db_count_cache(key: str, query: orm.Query, expire: int = None) -> int:
"""Store and retrieve a query.count() via redis cache.
:param redis: Redis handle
:param key: Redis key
:param query: SQLAlchemy ORM query
:param expire: Optional expiration in seconds
:return: query.count()
"""
result = redis.get(key)
result = _redis.get(key)
if result is None:
redis.set(key, (result := int(query.count())))
_redis.set(key, (result := int(query.count())))
if expire:
redis.expire(key, expire)
_redis.expire(key, expire)
return int(result)
def db_query_cache(key: str, query: orm.Query, expire: int = None) -> list:
"""Store and retrieve query results via redis cache.
:param key: Redis key
:param query: SQLAlchemy ORM query
:param expire: Optional expiration in seconds
:return: query.all()
"""
result = _redis.get(key)
if result is None:
SEARCH_REQUESTS.labels(cache="miss").inc()
if _redis.dbsize() > config.getint("cache", "max_search_entries", 50000):
return query.all()
_redis.set(key, (result := pickle.dumps(query.all())))
if expire:
_redis.expire(key, expire)
else:
SEARCH_REQUESTS.labels(cache="hit").inc()
return pickle.loads(result)

View file
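A usage sketch for the reworked helpers, assuming they live in aurweb.cache and a Redis backend is configured (the key names and the expensive_stats callable are hypothetical):

from aurweb import cache, db
from aurweb.models import User

# Cached SELECT count(...) with a five-minute TTL.
num_users = cache.db_count_cache("users:count", db.query(User), expire=300)

# Arbitrary picklable results can be cached through lambda_cache.
stats = cache.lambda_cache("stats:home", lambda: expensive_stats(), expire=60)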

@ -1,7 +1,9 @@
""" This module consists of aurweb's CAPTCHA utility functions and filters. """
import hashlib
from jinja2 import pass_context
from sqlalchemy import func
from aurweb.db import query
from aurweb.models import User
@ -10,7 +12,8 @@ from aurweb.templates import register_filter
def get_captcha_salts():
"""Produce salts based on the current user count."""
count = query(User).count()
count = query(func.count(User.ID)).scalar()
salts = []
for i in range(0, 6):
salts.append(f"aurweb-{count - i}")

View file

@ -2,10 +2,7 @@ import configparser
import os
from typing import Any
# Publicly visible version of aurweb. This is used to display
# aurweb versioning in the footer and must be maintained.
# Todo: Make this dynamic/automated.
AURWEB_VERSION = "v6.0.28"
import tomlkit
_parser = None
@ -42,6 +39,18 @@ def get(section, option):
return _get_parser().get(section, option)
def _get_project_meta():
with open(os.path.join(get("options", "aurwebdir"), "pyproject.toml")) as pyproject:
file_contents = pyproject.read()
return tomlkit.parse(file_contents)["tool"]["poetry"]
# Publicly visible version of aurweb. This is used to display
# aurweb versioning in the footer and must be maintained.
AURWEB_VERSION = str(_get_project_meta()["version"])
def getboolean(section, option):
return _get_parser().getboolean(section, option)

View file
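The same pattern in isolation, a sketch assuming a Poetry-style pyproject.toml in the working directory:

import tomlkit

with open("pyproject.toml") as f:
    # tomlkit preserves the document structure; indexing works like a dict.
    version = str(tomlkit.parse(f.read())["tool"]["poetry"]["version"])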

@ -1,9 +1,3 @@
from fastapi import Request
from fastapi.responses import Response
from aurweb import config
def samesite() -> str:
"""Produce cookie SameSite value.
@ -12,66 +6,3 @@ def samesite() -> str:
:returns "lax"
"""
return "lax"
def timeout(extended: bool) -> int:
"""Produce a session timeout based on `remember_me`.
This method returns one of AUR_CONFIG's options.persistent_cookie_timeout
and options.login_timeout based on the `extended` argument.
The `extended` argument is typically the value of the AURREMEMBER
cookie, defaulted to False.
If `extended` is False, options.login_timeout is returned. Otherwise,
if `extended` is True, options.persistent_cookie_timeout is returned.
:param extended: Flag which generates an extended timeout when True
:returns: Cookie timeout based on configuration options
"""
timeout = config.getint("options", "login_timeout")
if bool(extended):
timeout = config.getint("options", "persistent_cookie_timeout")
return timeout
def update_response_cookies(
request: Request,
response: Response,
aurtz: str = None,
aurlang: str = None,
aursid: str = None,
) -> Response:
"""Update session cookies. This method is particularly useful
when updating a cookie which was already set.
The AURSID cookie's expiration is based on the AURREMEMBER cookie,
which is retrieved from `request`.
:param request: FastAPI request
:param response: FastAPI response
:param aurtz: Optional AURTZ cookie value
:param aurlang: Optional AURLANG cookie value
:param aursid: Optional AURSID cookie value
:returns: Updated response
"""
secure = config.getboolean("options", "disable_http_login")
if aurtz:
response.set_cookie(
"AURTZ", aurtz, secure=secure, httponly=secure, samesite=samesite()
)
if aurlang:
response.set_cookie(
"AURLANG", aurlang, secure=secure, httponly=secure, samesite=samesite()
)
if aursid:
remember_me = bool(request.cookies.get("AURREMEMBER", False))
response.set_cookie(
"AURSID",
aursid,
secure=secure,
httponly=secure,
max_age=timeout(remember_me),
samesite=samesite(),
)
return response

View file

@ -161,6 +161,46 @@ def begin():
return get_session().begin()
def retry_deadlock(func):
from sqlalchemy.exc import OperationalError
def wrapper(*args, _i: int = 0, **kwargs):
# Retry 10 times, then raise the exception
# If we fail before the 10th, recurse into `wrapper`
# If we fail on the 10th, continue to throw the exception
limit = 10
try:
return func(*args, **kwargs)
except OperationalError as exc:
if _i < limit and "Deadlock found" in str(exc):
# Retry on deadlock by recursing into `wrapper`
return wrapper(*args, _i=_i + 1, **kwargs)
# Otherwise, just raise the exception
raise exc
return wrapper
def async_retry_deadlock(func):
from sqlalchemy.exc import OperationalError
async def wrapper(*args, _i: int = 0, **kwargs):
# Retry 10 times, then raise the exception
# If we fail before the 10th, recurse into `wrapper`
# If we fail on the 10th, continue to throw the exception
limit = 10
try:
return await func(*args, **kwargs)
except OperationalError as exc:
if _i < limit and "Deadlock found" in str(exc):
# Retry on deadlock by recursing into `wrapper`
return await wrapper(*args, _i=_i + 1, **kwargs)
# Otherwise, just raise the exception
raise exc
return wrapper
def get_sqlalchemy_url():
"""
Build an SQLAlchemy URL for use with create_engine.
@ -258,9 +298,12 @@ def get_engine(dbname: str = None, echo: bool = False):
connect_args["check_same_thread"] = False
kwargs = {"echo": echo, "connect_args": connect_args}
from opentelemetry.instrumentation.sqlalchemy import SQLAlchemyInstrumentor
from sqlalchemy import create_engine
_engines[dbname] = create_engine(get_sqlalchemy_url(), **kwargs)
engine = create_engine(get_sqlalchemy_url(), **kwargs)
SQLAlchemyInstrumentor().instrument(engine=engine)
_engines[dbname] = engine
if is_sqlite: # pragma: no cover
setup_sqlite(_engines.get(dbname))
@ -324,7 +367,7 @@ class ConnectionExecutor:
def execute(self, query, params=()): # pragma: no cover
# TODO: SQLite support has been removed in FastAPI. It remains
# here to fund its support for PHP until it is removed.
# here to fund its support for the Sharness testsuite.
if self._paramstyle in ("format", "pyformat"):
query = query.replace("%", "%%").replace("?", "%s")
elif self._paramstyle == "qmark":
@ -370,7 +413,7 @@ class Connection:
)
elif aur_db_backend == "sqlite": # pragma: no cover
# TODO: SQLite support has been removed in FastAPI. It remains
# here to fund its support for PHP until it is removed.
# here to fund its support for the Sharness testsuite.
import math
import sqlite3

View file
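A minimal sketch of the new decorators above in use (the function body and status value are illustrative):

from aurweb import db

@db.retry_deadlock
def close_request(pkgreq):
    # Re-executed up to 10 times if MariaDB raises "Deadlock found".
    with db.begin():
        pkgreq.Status = 2  # illustrative status ID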

@ -1,6 +1,6 @@
import copy
import math
from datetime import datetime
from datetime import UTC, datetime
from typing import Any, Union
from urllib.parse import quote_plus, urlencode
from zoneinfo import ZoneInfo
@ -8,6 +8,7 @@ from zoneinfo import ZoneInfo
import fastapi
import paginate
from jinja2 import pass_context
from jinja2.filters import do_format
import aurweb.models
from aurweb import config, l10n
@ -93,7 +94,7 @@ def tn(context: dict[str, Any], count: int, singular: str, plural: str) -> str:
@register_filter("dt")
def timestamp_to_datetime(timestamp: int):
return datetime.utcfromtimestamp(int(timestamp))
return datetime.fromtimestamp(timestamp, UTC)
@register_filter("as_timezone")
@ -117,9 +118,9 @@ def to_qs(query: dict[str, Any]) -> str:
@register_filter("get_vote")
def get_vote(voteinfo, request: fastapi.Request):
from aurweb.models import TUVote
from aurweb.models import Vote
return voteinfo.tu_votes.filter(TUVote.User == request.user).first()
return voteinfo.votes.filter(Vote.User == request.user).first()
@register_filter("number_format")
@ -164,3 +165,17 @@ def date_display(context: dict[str, Any], dt: Union[int, datetime]) -> str:
@pass_context
def datetime_display(context: dict[str, Any], dt: Union[int, datetime]) -> str:
return date_strftime(context, dt, "%Y-%m-%d %H:%M (%Z)")
@register_filter("format")
def safe_format(value: str, *args: Any, **kwargs: Any) -> str:
"""Wrapper for jinja2 format function to perform additional checks."""
# If we don't have anything to be formatted, just return the value.
# We have some translations that do not contain placeholders for replacement.
# In these cases the jinja2 function is throwing an error:
# "TypeError: not all arguments converted during string formatting"
if "%" not in value:
return value
return do_format(value, *args, **kwargs)

View file
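Behavior sketch of the filter above:

safe_format("No placeholders here")    # returned unchanged, no TypeError
safe_format("%d package(s) found", 3)  # delegates to jinja2: "3 package(s) found"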

@ -52,7 +52,7 @@ def list_repos(user):
conn.close()
def create_pkgbase(pkgbase, user):
def validate_pkgbase(pkgbase, user):
if not re.match(repo_regex, pkgbase):
raise aurweb.exceptions.InvalidRepositoryNameException(pkgbase)
if pkgbase_exists(pkgbase):
@ -62,26 +62,12 @@ def create_pkgbase(pkgbase, user):
cur = conn.execute("SELECT ID FROM Users WHERE Username = ?", [user])
userid = cur.fetchone()[0]
conn.close()
if userid == 0:
raise aurweb.exceptions.InvalidUserException(user)
now = int(time.time())
cur = conn.execute(
"INSERT INTO PackageBases (Name, SubmittedTS, "
+ "ModifiedTS, SubmitterUID, MaintainerUID, "
+ "FlaggerComment) VALUES (?, ?, ?, ?, ?, '')",
[pkgbase, now, now, userid, userid],
)
pkgbase_id = cur.lastrowid
cur = conn.execute(
"INSERT INTO PackageNotifications " + "(PackageBaseID, UserID) VALUES (?, ?)",
[pkgbase_id, userid],
)
conn.commit()
conn.close()
def pkgbase_adopt(pkgbase, user, privileged):
pkgbase_id = pkgbase_from_name(pkgbase)
@ -279,7 +265,7 @@ def pkgbase_disown(pkgbase, user, privileged):
conn = aurweb.db.Connection()
# Make the first co-maintainer the new maintainer, unless the action was
# enforced by a Trusted User.
# enforced by a Package Maintainer.
if initialized_by_owner:
comaintainers = pkgbase_get_comaintainers(pkgbase)
if len(comaintainers) > 0:
@ -573,18 +559,11 @@ def serve(action, cmdargv, user, privileged, remote_addr): # noqa: C901
elif action == "list-repos":
checkarg(cmdargv)
list_repos(user)
elif action == "setup-repo":
checkarg(cmdargv, "repository name")
warn(
"{:s} is deprecated. "
"Use `git push` to create new repositories.".format(action)
)
create_pkgbase(cmdargv[1], user)
elif action == "restore":
checkarg(cmdargv, "repository name")
pkgbase = cmdargv[1]
create_pkgbase(pkgbase, user)
validate_pkgbase(pkgbase, user)
os.environ["AUR_USER"] = user
os.environ["AUR_PKGBASE"] = pkgbase
@ -636,7 +615,6 @@ def serve(action, cmdargv, user, privileged, remote_addr): # noqa: C901
"restore <name>": "Restore a deleted package base.",
"set-comaintainers <name> [...]": "Set package base co-maintainers.",
"set-keywords <name> [...]": "Change package base keywords.",
"setup-repo <name>": "Create a repository (deprecated).",
"unflag <name>": "Remove out-of-date flag from a package base.",
"unvote <name>": "Remove vote from a package base.",
"vote <name>": "Vote for a package base.",
@ -656,7 +634,7 @@ def main():
ssh_client = os.environ.get("SSH_CLIENT")
if not ssh_cmd:
die_with_help("Interactive shell is disabled.")
die_with_help(f"Welcome to AUR, {user}! Interactive shell is disabled.")
cmdargv = shlex.split(ssh_cmd)
action = cmdargv[0]
remote_addr = ssh_client.split(" ")[0] if ssh_client else None

View file

@ -52,7 +52,7 @@ def parse_dep(depstring):
depname = re.sub(r"(<|=|>).*", "", dep)
depcond = dep[len(depname) :]
return (depname, desc, depcond)
return depname, desc, depcond
def create_pkgbase(conn, pkgbase, user):
@ -258,6 +258,71 @@ def die_commit(msg, commit):
exit(1)
def validate_metadata(metadata, commit): # noqa: C901
try:
metadata_pkgbase = metadata["pkgbase"]
except KeyError:
die_commit(
"invalid .SRCINFO, does not contain a pkgbase (is the file empty?)",
str(commit.id),
)
if not re.match(repo_regex, metadata_pkgbase):
die_commit("invalid pkgbase: {:s}".format(metadata_pkgbase), str(commit.id))
if not metadata["packages"]:
die_commit("missing pkgname entry", str(commit.id))
for pkgname in set(metadata["packages"].keys()):
pkginfo = srcinfo.utils.get_merged_package(pkgname, metadata)
for field in ("pkgver", "pkgrel", "pkgname"):
if field not in pkginfo:
die_commit(
"missing mandatory field: {:s}".format(field), str(commit.id)
)
if "epoch" in pkginfo and not pkginfo["epoch"].isdigit():
die_commit("invalid epoch: {:s}".format(pkginfo["epoch"]), str(commit.id))
if not re.match(r"[a-z0-9][a-z0-9\.+_-]*$", pkginfo["pkgname"]):
die_commit(
"invalid package name: {:s}".format(pkginfo["pkgname"]),
str(commit.id),
)
max_len = {"pkgname": 255, "pkgdesc": 255, "url": 8000}
for field in max_len.keys():
if field in pkginfo and len(pkginfo[field]) > max_len[field]:
die_commit(
"{:s} field too long: {:s}".format(field, pkginfo[field]),
str(commit.id),
)
for field in ("install", "changelog"):
if field in pkginfo and not pkginfo[field] in commit.tree:
die_commit(
"missing {:s} file: {:s}".format(field, pkginfo[field]),
str(commit.id),
)
for field in extract_arch_fields(pkginfo, "source"):
fname = field["value"]
if len(fname) > 8000:
die_commit("source entry too long: {:s}".format(fname), str(commit.id))
if "://" in fname or "lp:" in fname:
continue
if fname not in commit.tree:
die_commit("missing source file: {:s}".format(fname), str(commit.id))
def validate_blob_size(blob: pygit2.Object, commit: pygit2.Commit):
if isinstance(blob, pygit2.Blob) and blob.size > max_blob_size:
die_commit(
"maximum blob size ({:s}) exceeded".format(size_humanize(max_blob_size)),
str(commit.id),
)
def main(): # noqa: C901
repo = pygit2.Repository(repo_path)
@ -291,110 +356,69 @@ def main(): # noqa: C901
die("denying non-fast-forward (you should pull first)")
# Prepare the walker that validates new commits.
walker = repo.walk(sha1_new, pygit2.GIT_SORT_TOPOLOGICAL)
walker = repo.walk(sha1_new, pygit2.GIT_SORT_REVERSE)
if sha1_old != "0" * 40:
walker.hide(sha1_old)
head_commit = repo[sha1_new]
if ".SRCINFO" not in head_commit.tree:
die_commit("missing .SRCINFO", str(head_commit.id))
# Read .SRCINFO from the HEAD commit.
metadata_raw = repo[head_commit.tree[".SRCINFO"].id].data.decode()
(metadata, errors) = srcinfo.parse.parse_srcinfo(metadata_raw)
if errors:
sys.stderr.write(
"error: The following errors occurred " "when parsing .SRCINFO in commit\n"
)
sys.stderr.write("error: {:s}:\n".format(str(head_commit.id)))
for error in errors:
for err in error["error"]:
sys.stderr.write("error: line {:d}: {:s}\n".format(error["line"], err))
exit(1)
# check if there is a correct .SRCINFO file in the latest revision
validate_metadata(metadata, head_commit)
# Validate all new commits.
for commit in walker:
for fname in (".SRCINFO", "PKGBUILD"):
if fname not in commit.tree:
die_commit("missing {:s}".format(fname), str(commit.id))
if "PKGBUILD" not in commit.tree:
die_commit("missing PKGBUILD", str(commit.id))
# Iterate over files in root dir
for treeobj in commit.tree:
blob = repo[treeobj.id]
if isinstance(blob, pygit2.Tree):
# Don't allow any subdirs besides "keys/"
if isinstance(treeobj, pygit2.Tree) and treeobj.name != "keys":
die_commit(
"the repository must not contain subdirectories", str(commit.id)
)
if not isinstance(blob, pygit2.Blob):
die_commit("not a blob object: {:s}".format(treeobj), str(commit.id))
if blob.size > max_blob_size:
die_commit(
"maximum blob size ({:s}) exceeded".format(
size_humanize(max_blob_size)
),
"the repository must not contain subdirectories",
str(commit.id),
)
metadata_raw = repo[commit.tree[".SRCINFO"].id].data.decode()
(metadata, errors) = srcinfo.parse.parse_srcinfo(metadata_raw)
if errors:
sys.stderr.write(
"error: The following errors occurred "
"when parsing .SRCINFO in commit\n"
)
sys.stderr.write("error: {:s}:\n".format(str(commit.id)))
for error in errors:
for err in error["error"]:
sys.stderr.write(
"error: line {:d}: {:s}\n".format(error["line"], err)
)
exit(1)
# Check size of files in root dir
validate_blob_size(treeobj, commit)
try:
metadata_pkgbase = metadata["pkgbase"]
except KeyError:
die_commit(
"invalid .SRCINFO, does not contain a pkgbase (is the file empty?)",
str(commit.id),
)
if not re.match(repo_regex, metadata_pkgbase):
die_commit("invalid pkgbase: {:s}".format(metadata_pkgbase), str(commit.id))
if not metadata["packages"]:
die_commit("missing pkgname entry", str(commit.id))
for pkgname in set(metadata["packages"].keys()):
pkginfo = srcinfo.utils.get_merged_package(pkgname, metadata)
for field in ("pkgver", "pkgrel", "pkgname"):
if field not in pkginfo:
# If we got a subdir keys/,
# make sure it only contains a pgp/ subdir with key files
if "keys" in commit.tree:
# Check for forbidden files/dirs in keys/
for keyobj in commit.tree["keys"]:
if not isinstance(keyobj, pygit2.Tree) or keyobj.name != "pgp":
die_commit(
"missing mandatory field: {:s}".format(field), str(commit.id)
)
if "epoch" in pkginfo and not pkginfo["epoch"].isdigit():
die_commit(
"invalid epoch: {:s}".format(pkginfo["epoch"]), str(commit.id)
)
if not re.match(r"[a-z0-9][a-z0-9\.+_-]*$", pkginfo["pkgname"]):
die_commit(
"invalid package name: {:s}".format(pkginfo["pkgname"]),
str(commit.id),
)
max_len = {"pkgname": 255, "pkgdesc": 255, "url": 8000}
for field in max_len.keys():
if field in pkginfo and len(pkginfo[field]) > max_len[field]:
die_commit(
"{:s} field too long: {:s}".format(field, pkginfo[field]),
"the keys/ subdir may only contain a pgp/ directory",
str(commit.id),
)
for field in ("install", "changelog"):
if field in pkginfo and not pkginfo[field] in commit.tree:
die_commit(
"missing {:s} file: {:s}".format(field, pkginfo[field]),
str(commit.id),
)
for field in extract_arch_fields(pkginfo, "source"):
fname = field["value"]
if len(fname) > 8000:
die_commit(
"source entry too long: {:s}".format(fname), str(commit.id)
)
if "://" in fname or "lp:" in fname:
continue
if fname not in commit.tree:
die_commit(
"missing source file: {:s}".format(fname), str(commit.id)
)
# Check for forbidden files in keys/pgp/
if "keys/pgp" in commit.tree:
for pgpobj in commit.tree["keys/pgp"]:
if not isinstance(pgpobj, pygit2.Blob) or not pgpobj.name.endswith(
".asc"
):
die_commit(
"the subdir may only contain .asc (PGP pub key) files",
str(commit.id),
)
# Check file size for pgp key files
validate_blob_size(pgpobj, commit)
# Display a warning if .SRCINFO is unchanged.
if sha1_old not in ("0000000000000000000000000000000000000000", sha1_new):
@ -403,10 +427,6 @@ def main(): # noqa: C901
if srcinfo_id_old == srcinfo_id_new:
warn(".SRCINFO unchanged. " "The package database will not be updated!")
# Read .SRCINFO from the HEAD commit.
metadata_raw = repo[repo[sha1_new].tree[".SRCINFO"].id].data.decode()
(metadata, errors) = srcinfo.parse.parse_srcinfo(metadata_raw)
# Ensure that the package base name matches the repository name.
metadata_pkgbase = metadata["pkgbase"]
if metadata_pkgbase != pkgbase:
@ -420,6 +440,8 @@ def main(): # noqa: C901
cur = conn.execute("SELECT Name FROM PackageBlacklist")
blacklist = [row[0] for row in cur.fetchall()]
if pkgbase in blacklist:
warn_or_die("pkgbase is blacklisted: {:s}".format(pkgbase))
cur = conn.execute("SELECT Name, Repo FROM OfficialProviders")
providers = dict(cur.fetchall())

View file
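A reduced sketch of the revised commit walk (the repository path is illustrative; GIT_SORT_REVERSE yields oldest-first order, so every pushed commit is validated in sequence):

import pygit2

repo = pygit2.Repository(".")
walker = repo.walk(repo.head.target, pygit2.GIT_SORT_REVERSE)
for commit in walker:
    # Each new commit must carry a PKGBUILD at the tree root.
    print(commit.id, "PKGBUILD" in commit.tree)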

@ -3,8 +3,8 @@ import argparse
import alembic.command
import alembic.config
import aurweb.aur_logging
import aurweb.db
import aurweb.logging
import aurweb.schema
@ -13,9 +13,9 @@ def feed_initial_data(conn):
aurweb.schema.AccountTypes.insert(),
[
{"ID": 1, "AccountType": "User"},
{"ID": 2, "AccountType": "Trusted User"},
{"ID": 2, "AccountType": "Package Maintainer"},
{"ID": 3, "AccountType": "Developer"},
{"ID": 4, "AccountType": "Trusted User & Developer"},
{"ID": 4, "AccountType": "Package Maintainer & Developer"},
],
)
conn.execute(

View file

@ -64,11 +64,24 @@ class Translator:
translator = Translator()
def get_request_language(request: Request):
if request.user.is_authenticated():
def get_request_language(request: Request) -> str:
"""Get a request's language from either query param, user setting or
cookie. We use the configuration's [options] default_lang otherwise.
@param request FastAPI request
"""
request_lang = request.query_params.get("language")
cookie_lang = request.cookies.get("AURLANG")
if request_lang and request_lang in SUPPORTED_LANGUAGES:
return request_lang
elif (
request.user.is_authenticated()
and request.user.LangPreference in SUPPORTED_LANGUAGES
):
return request.user.LangPreference
default_lang = aurweb.config.get("options", "default_lang")
return request.cookies.get("AURLANG", default_lang)
elif cookie_lang and cookie_lang in SUPPORTED_LANGUAGES:
return cookie_lang
return aurweb.config.get_with_fallback("options", "default_lang", "en")
def get_raw_translator_for_request(request: Request):

View file

@ -1,4 +1,5 @@
""" Collection of all aurweb SQLAlchemy declarative models. """
from .accepted_term import AcceptedTerm # noqa: F401
from .account_type import AccountType # noqa: F401
from .api_rate_limit import ApiRateLimit # noqa: F401
@ -26,6 +27,6 @@ from .request_type import RequestType # noqa: F401
from .session import Session # noqa: F401
from .ssh_pub_key import SSHPubKey # noqa: F401
from .term import Term # noqa: F401
from .tu_vote import TUVote # noqa: F401
from .tu_voteinfo import TUVoteInfo # noqa: F401
from .user import User # noqa: F401
from .vote import Vote # noqa: F401
from .voteinfo import VoteInfo # noqa: F401

View file

@ -2,21 +2,21 @@ from aurweb import schema
from aurweb.models.declarative import Base
USER = "User"
TRUSTED_USER = "Trusted User"
PACKAGE_MAINTAINER = "Package Maintainer"
DEVELOPER = "Developer"
TRUSTED_USER_AND_DEV = "Trusted User & Developer"
PACKAGE_MAINTAINER_AND_DEV = "Package Maintainer & Developer"
USER_ID = 1
TRUSTED_USER_ID = 2
PACKAGE_MAINTAINER_ID = 2
DEVELOPER_ID = 3
TRUSTED_USER_AND_DEV_ID = 4
PACKAGE_MAINTAINER_AND_DEV_ID = 4
# Map string constants to integer constants.
ACCOUNT_TYPE_ID = {
USER: USER_ID,
TRUSTED_USER: TRUSTED_USER_ID,
PACKAGE_MAINTAINER: PACKAGE_MAINTAINER_ID,
DEVELOPER: DEVELOPER_ID,
TRUSTED_USER_AND_DEV: TRUSTED_USER_AND_DEV_ID,
PACKAGE_MAINTAINER_AND_DEV: PACKAGE_MAINTAINER_AND_DEV_ID,
}
# Reversed ACCOUNT_TYPE_ID mapping.

View file

@ -2,6 +2,7 @@ from fastapi import Request
from aurweb import db, schema
from aurweb.models.declarative import Base
from aurweb.util import get_client_ip
class Ban(Base):
@ -14,6 +15,6 @@ class Ban(Base):
def is_banned(request: Request):
ip = request.client.host
ip = get_client_ip(request)
exists = db.query(Ban).filter(Ban.IPAddress == ip).exists()
return db.query(exists).scalar()

View file

@ -64,3 +64,13 @@ class PackageBase(Base):
if key in PackageBase.TO_FLOAT and not isinstance(attr, float):
return float(attr)
return attr
def popularity_decay(pkgbase: PackageBase, utcnow: int):
"""Return the delta between now and the last time popularity was updated, in days"""
return int((utcnow - pkgbase.PopularityUpdated.timestamp()) / 86400)
def popularity(pkgbase: PackageBase, utcnow: int):
"""Return up-to-date popularity"""
return float(pkgbase.Popularity) * (0.98 ** popularity_decay(pkgbase, utcnow))

View file
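A worked sketch of the decay above: with a stored Popularity of 1.0 and PopularityUpdated 30 days in the past,

decayed = 1.0 * (0.98 ** 30)   # ≈ 0.545

so a package loses roughly 2% of its displayed popularity per day without the batch job having to rewrite the column.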

@ -57,14 +57,17 @@ class PackageDependency(Base):
params=("NULL"),
)
def is_package(self) -> bool:
def is_aur_package(self) -> bool:
pkg = db.query(_Package).filter(_Package.Name == self.DepName).exists()
return db.query(pkg).scalar()
def is_package(self) -> bool:
official = (
db.query(_OfficialProvider)
.filter(_OfficialProvider.Name == self.DepName)
.exists()
)
return db.query(pkg).scalar() or db.query(official).scalar()
return self.is_aur_package() or db.query(official).scalar()
def provides(self) -> list[PackageRelation]:
from aurweb.models.relation_type import PROVIDES_ID

View file

@ -1,7 +1,10 @@
import base64
import hashlib
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship
from aurweb import schema
from aurweb import config, schema
from aurweb.models.declarative import Base
from aurweb.models.package_base import PackageBase as _PackageBase
from aurweb.models.request_type import RequestType as _RequestType
@ -103,3 +106,16 @@ class PackageRequest(Base):
def status_display(self) -> str:
"""Return a display string for the Status column."""
return self.STATUS_DISPLAY[self.Status]
def ml_message_id_hash(self) -> str:
"""Return the X-Message-ID-Hash that is used in the mailing list archive."""
# X-Message-ID-Hash is a base32 encoded SHA1 hash
msgid = f"pkg-request-{str(self.ID)}@aur.archlinux.org"
sha1 = hashlib.sha1(msgid.encode()).digest()
return base64.b32encode(sha1).decode()
def ml_message_url(self) -> str:
"""Return the mailing list URL for the request."""
url = config.get("options", "ml_thread_url") % (self.ml_message_id_hash())
return url

View file
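The same derivation in isolation (request ID 123 is hypothetical):

import base64
import hashlib

msgid = "pkg-request-123@aur.archlinux.org"
# Mailman's X-Message-ID-Hash: base32 of the SHA1 of the Message-ID.
id_hash = base64.b32encode(hashlib.sha1(msgid.encode()).digest()).decode()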

@ -14,7 +14,7 @@ class PackageVote(Base):
User = relationship(
_User,
backref=backref("package_votes", lazy="dynamic"),
backref=backref("package_votes", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.UsersID],
)

View file

@ -13,7 +13,7 @@ class Session(Base):
User = relationship(
_User,
backref=backref("session", uselist=False),
backref=backref("session", cascade="all, delete", uselist=False),
foreign_keys=[__table__.c.UsersID],
)

View file

@ -13,7 +13,7 @@ class SSHPubKey(Base):
User = relationship(
"User",
backref=backref("ssh_pub_keys", lazy="dynamic"),
backref=backref("ssh_pub_keys", lazy="dynamic", cascade="all, delete"),
foreign_keys=[__table__.c.UserID],
)

View file

@ -10,12 +10,12 @@ from sqlalchemy.orm import backref, relationship
import aurweb.config
import aurweb.models.account_type
import aurweb.schema
from aurweb import db, logging, schema, time, util
from aurweb import aur_logging, db, schema, time, util
from aurweb.models.account_type import AccountType as _AccountType
from aurweb.models.ban import is_banned
from aurweb.models.declarative import Base
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
SALT_ROUNDS_DEFAULT = 12
@ -95,7 +95,7 @@ class User(Base):
def _login_approved(self, request: Request):
return not is_banned(request) and not self.Suspended
def login(self, request: Request, password: str, session_time: int = 0) -> str:
def login(self, request: Request, password: str) -> str:
"""Login and authenticate a request."""
from aurweb import db
@ -122,7 +122,7 @@ class User(Base):
try:
with db.begin():
self.LastLogin = now_ts
self.LastLoginIPAddress = request.client.host
self.LastLoginIPAddress = util.get_client_ip(request)
if not self.session:
sid = generate_unique_sid()
self.session = db.create(
@ -151,31 +151,31 @@ class User(Base):
return has_credential(self, credential, approved)
def logout(self, request: Request):
def logout(self, request: Request) -> None:
self.authenticated = False
if self.session:
with db.begin():
db.delete(self.session)
def is_trusted_user(self):
def is_package_maintainer(self):
return self.AccountType.ID in {
aurweb.models.account_type.TRUSTED_USER_ID,
aurweb.models.account_type.TRUSTED_USER_AND_DEV_ID,
aurweb.models.account_type.PACKAGE_MAINTAINER_ID,
aurweb.models.account_type.PACKAGE_MAINTAINER_AND_DEV_ID,
}
def is_developer(self):
return self.AccountType.ID in {
aurweb.models.account_type.DEVELOPER_ID,
aurweb.models.account_type.TRUSTED_USER_AND_DEV_ID,
aurweb.models.account_type.PACKAGE_MAINTAINER_AND_DEV_ID,
}
def is_elevated(self):
"""A User is 'elevated' when they have either a
Trusted User or Developer AccountType."""
Package Maintainer or Developer AccountType."""
return self.AccountType.ID in {
aurweb.models.account_type.TRUSTED_USER_ID,
aurweb.models.account_type.PACKAGE_MAINTAINER_ID,
aurweb.models.account_type.DEVELOPER_ID,
aurweb.models.account_type.TRUSTED_USER_AND_DEV_ID,
aurweb.models.account_type.PACKAGE_MAINTAINER_AND_DEV_ID,
}
def can_edit_user(self, target: "User") -> bool:
@ -188,7 +188,7 @@ class User(Base):
In short, a user must at least have credentials and be at least
the same account type as the target.
User < Trusted User < Developer < Trusted User & Developer
User < Package Maintainer < Developer < Package Maintainer & Developer
:param target: Target User to be edited
:return: Boolean indicating whether `self` can edit `target`

View file

@ -3,24 +3,24 @@ from sqlalchemy.orm import backref, relationship
from aurweb import schema
from aurweb.models.declarative import Base
from aurweb.models.tu_voteinfo import TUVoteInfo as _TUVoteInfo
from aurweb.models.user import User as _User
from aurweb.models.voteinfo import VoteInfo as _VoteInfo
class TUVote(Base):
__table__ = schema.TU_Votes
class Vote(Base):
__table__ = schema.Votes
__tablename__ = __table__.name
__mapper_args__ = {"primary_key": [__table__.c.VoteID, __table__.c.UserID]}
VoteInfo = relationship(
_TUVoteInfo,
backref=backref("tu_votes", lazy="dynamic"),
_VoteInfo,
backref=backref("votes", lazy="dynamic"),
foreign_keys=[__table__.c.VoteID],
)
User = relationship(
_User,
backref=backref("tu_votes", lazy="dynamic"),
backref=backref("votes", lazy="dynamic"),
foreign_keys=[__table__.c.UserID],
)
@ -30,13 +30,13 @@ class TUVote(Base):
if not self.VoteInfo and not self.VoteID:
raise IntegrityError(
statement="Foreign key VoteID cannot be null.",
orig="TU_Votes.VoteID",
orig="Votes.VoteID",
params=("NULL"),
)
if not self.User and not self.UserID:
raise IntegrityError(
statement="Foreign key UserID cannot be null.",
orig="TU_Votes.UserID",
orig="Votes.UserID",
params=("NULL"),
)

View file

@ -8,14 +8,14 @@ from aurweb.models.declarative import Base
from aurweb.models.user import User as _User
class TUVoteInfo(Base):
__table__ = schema.TU_VoteInfo
class VoteInfo(Base):
__table__ = schema.VoteInfo
__tablename__ = __table__.name
__mapper_args__ = {"primary_key": [__table__.c.ID]}
Submitter = relationship(
_User,
backref=backref("tu_voteinfo_set", lazy="dynamic"),
backref=backref("voteinfo_set", lazy="dynamic"),
foreign_keys=[__table__.c.SubmitterID],
)
@ -30,35 +30,35 @@ class TUVoteInfo(Base):
if self.Agenda is None:
raise IntegrityError(
statement="Column Agenda cannot be null.",
orig="TU_VoteInfo.Agenda",
orig="VoteInfo.Agenda",
params=("NULL"),
)
if self.User is None:
raise IntegrityError(
statement="Column User cannot be null.",
orig="TU_VoteInfo.User",
orig="VoteInfo.User",
params=("NULL"),
)
if self.Submitted is None:
raise IntegrityError(
statement="Column Submitted cannot be null.",
orig="TU_VoteInfo.Submitted",
orig="VoteInfo.Submitted",
params=("NULL"),
)
if self.End is None:
raise IntegrityError(
statement="Column End cannot be null.",
orig="TU_VoteInfo.End",
orig="VoteInfo.End",
params=("NULL"),
)
if not self.Submitter:
raise IntegrityError(
statement="Foreign key SubmitterID cannot be null.",
orig="TU_VoteInfo.SubmitterID",
orig="VoteInfo.SubmitterID",
params=("NULL"),
)

View file

@ -151,8 +151,13 @@ def close_pkgreq(
pkgreq.ClosedTS = now
@db.retry_deadlock
def handle_request(
request: Request, reqtype_id: int, pkgbase: PackageBase, target: PackageBase = None
request: Request,
reqtype_id: int,
pkgbase: PackageBase,
target: PackageBase = None,
comments: str = str(),
) -> list[notify.Notification]:
"""
Handle package requests before performing an action.
@ -227,7 +232,7 @@ def handle_request(
PackageBase=pkgbase,
PackageBaseName=pkgbase.Name,
Comments="Autogenerated by aurweb.",
ClosureComment=str(),
ClosureComment=comments,
)
# If it's a merge request, set MergeBaseName to `target`.Name.
@ -239,15 +244,19 @@ def handle_request(
to_accept.append(pkgreq)
# Update requests with their new status and closures.
with db.begin():
util.apply_all(
to_accept,
lambda p: close_pkgreq(p, request.user, pkgbase, target, ACCEPTED_ID),
)
util.apply_all(
to_reject,
lambda p: close_pkgreq(p, request.user, pkgbase, target, REJECTED_ID),
)
@db.retry_deadlock
def retry_closures():
with db.begin():
util.apply_all(
to_accept,
lambda p: close_pkgreq(p, request.user, pkgbase, target, ACCEPTED_ID),
)
util.apply_all(
to_reject,
lambda p: close_pkgreq(p, request.user, pkgbase, target, REJECTED_ID),
)
retry_closures()
# Create RequestCloseNotifications for all requests involved.
for pkgreq in to_accept + to_reject:

View file

@ -3,7 +3,7 @@ from typing import Set
from sqlalchemy import and_, case, or_, orm
from aurweb import db, models
from aurweb.models import Package, PackageBase, User
from aurweb.models import Group, Package, PackageBase, User
from aurweb.models.dependency_type import (
CHECKDEPENDS_ID,
DEPENDS_ID,
@ -11,9 +11,11 @@ from aurweb.models.dependency_type import (
OPTDEPENDS_ID,
)
from aurweb.models.package_comaintainer import PackageComaintainer
from aurweb.models.package_group import PackageGroup
from aurweb.models.package_keyword import PackageKeyword
from aurweb.models.package_notification import PackageNotification
from aurweb.models.package_vote import PackageVote
from aurweb.models.relation_type import CONFLICTS_ID, PROVIDES_ID, REPLACES_ID
class PackageSearch:
@ -134,7 +136,10 @@ class PackageSearch:
self._join_user()
self._join_keywords()
keywords = set(k.lower() for k in keywords)
self.query = self.query.filter(PackageKeyword.Keyword.in_(keywords))
self.query = self.query.filter(PackageKeyword.Keyword.in_(keywords)).group_by(
models.Package.Name
)
return self
def _search_by_maintainer(self, keywords: str) -> orm.Query:
@ -190,13 +195,13 @@ class PackageSearch:
def _sort_by_votes(self, order: str):
column = getattr(models.PackageBase.NumVotes, order)
name = getattr(models.Package.Name, order)
name = getattr(models.PackageBase.Name, order)
self.query = self.query.order_by(column(), name())
return self
def _sort_by_popularity(self, order: str):
column = getattr(models.PackageBase.Popularity, order)
name = getattr(models.Package.Name, order)
name = getattr(models.PackageBase.Name, order)
self.query = self.query.order_by(column(), name())
return self
@ -231,7 +236,7 @@ class PackageSearch:
def _sort_by_last_modified(self, order: str):
column = getattr(models.PackageBase.ModifiedTS, order)
name = getattr(models.Package.Name, order)
name = getattr(models.PackageBase.Name, order)
self.query = self.query.order_by(column(), name())
return self
@ -267,7 +272,7 @@ class RPCSearch(PackageSearch):
sanitization done for the PackageSearch `by` argument.
"""
keys_removed = ("b", "N", "B", "k", "c", "M", "s")
keys_removed = ("b", "N", "B", "M")
def __init__(self) -> "RPCSearch":
super().__init__()
@ -286,6 +291,10 @@ class RPCSearch(PackageSearch):
"makedepends": self._search_by_makedepends,
"optdepends": self._search_by_optdepends,
"checkdepends": self._search_by_checkdepends,
"provides": self._search_by_provides,
"conflicts": self._search_by_conflicts,
"replaces": self._search_by_replaces,
"groups": self._search_by_groups,
}
)
@ -304,6 +313,26 @@ class RPCSearch(PackageSearch):
)
return self.query
def _join_relations(self, rel_type_id: int) -> orm.Query:
"""Join Package with PackageRelation and filter results
based on `rel_type_id`.
:param rel_type_id: RelationType ID
:returns: PackageRelation-joined orm.Query
"""
self.query = self.query.join(models.PackageRelation).filter(
models.PackageRelation.RelTypeID == rel_type_id
)
return self.query
def _join_groups(self) -> orm.Query:
"""Join Package with PackageGroup and Group.
:returns: PackageGroup/Group-joined orm.Query
"""
self.query = self.query.join(PackageGroup).join(Group)
return self.query
def _search_by_depends(self, keywords: str) -> "RPCSearch":
self.query = self._join_depends(DEPENDS_ID).filter(
models.PackageDependency.DepName == keywords
@ -328,6 +357,34 @@ class RPCSearch(PackageSearch):
)
return self
def _search_by_provides(self, keywords: str) -> "RPCSearch":
self.query = self._join_relations(PROVIDES_ID).filter(
models.PackageRelation.RelName == keywords
)
return self
def _search_by_conflicts(self, keywords: str) -> "RPCSearch":
self.query = self._join_relations(CONFLICTS_ID).filter(
models.PackageRelation.RelName == keywords
)
return self
def _search_by_replaces(self, keywords: str) -> "RPCSearch":
self.query = self._join_relations(REPLACES_ID).filter(
models.PackageRelation.RelName == keywords
)
return self
def _search_by_groups(self, keywords: str) -> "RPCSearch":
self._join_groups()
self.query = self.query.filter(Group.Name == keywords)
return self
def _search_by_keywords(self, keywords: str) -> "RPCSearch":
self._join_keywords()
self.query = self.query.filter(PackageKeyword.Keyword == keywords)
return self
def search_by(self, by: str, keywords: str) -> "RPCSearch":
"""Override inherited search_by. In this override, we reduce the
scope of what we handle within this function. We do not set `by`
@ -343,4 +400,4 @@ class RPCSearch(PackageSearch):
return result
def results(self) -> orm.Query:
return self.query.filter(models.PackageBase.PackagerUID.isnot(None))
return self.query

View file
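A sketch of the new RPC filters in use, assuming the class lives in aurweb.packages.search and a database session is active (the keyword is illustrative):

from aurweb.packages.search import RPCSearch

# Equivalent of /rpc?v=5&type=search&by=provides&arg=java-environment
pkgs = RPCSearch().search_by("provides", "java-environment").results().all()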

@ -1,17 +1,18 @@
from collections import defaultdict
from http import HTTPStatus
from typing import Tuple, Union
from urllib.parse import quote_plus
import orjson
from fastapi import HTTPException
from sqlalchemy import orm
from aurweb import config, db, models
from aurweb.aur_redis import redis_connection
from aurweb.models import Package
from aurweb.models.official_provider import OFFICIAL_BASE, OfficialProvider
from aurweb.models.package_dependency import PackageDependency
from aurweb.models.package_relation import PackageRelation
from aurweb.redis import redis_connection
from aurweb.templates import register_filter
Providers = list[Union[PackageRelation, OfficialProvider]]
@ -82,9 +83,11 @@ def package_link(package: Union[Package, OfficialProvider]) -> str:
@register_filter("provides_markup")
def provides_markup(provides: Providers) -> str:
return ", ".join(
[f'<a href="{package_link(pkg)}">{pkg.Name}</a>' for pkg in provides]
)
links = []
for pkg in provides:
aur = "<sup><small>AUR</small></sup>" if not pkg.is_official else ""
links.append(f'<a href="{package_link(pkg)}">{pkg.Name}</a>{aur}')
return ", ".join(links)
def get_pkg_or_base(
@ -99,8 +102,7 @@ def get_pkg_or_base(
:raises HTTPException: With status code 404 if record doesn't exist
:return: {Package,PackageBase} instance
"""
with db.begin():
instance = db.query(cls).filter(cls.Name == name).first()
instance = db.query(cls).filter(cls.Name == name).first()
if not instance:
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
return instance
@ -133,16 +135,14 @@ def updated_packages(limit: int = 0, cache_ttl: int = 600) -> list[models.Packag
# If we already have a cache, deserialize it and return.
return orjson.loads(packages)
with db.begin():
query = (
db.query(models.Package)
.join(models.PackageBase)
.filter(models.PackageBase.PackagerUID.isnot(None))
.order_by(models.PackageBase.ModifiedTS.desc())
)
query = (
db.query(models.Package)
.join(models.PackageBase)
.order_by(models.PackageBase.ModifiedTS.desc())
)
if limit:
query = query.limit(limit)
if limit:
query = query.limit(limit)
packages = []
for pkg in query:
@ -219,6 +219,7 @@ def pkg_required(pkgname: str, provides: list[str]) -> list[PackageDependency]:
query = (
db.query(PackageDependency)
.join(Package)
.options(orm.contains_eager(PackageDependency.Package))
.filter(PackageDependency.DepName.in_(targets))
.order_by(Package.Name.asc())
)
@ -241,12 +242,12 @@ def source_uri(pkgsrc: models.PackageSource) -> Tuple[str, str]:
the package base name.
:param pkgsrc: PackageSource instance
:return (text, uri) tuple
:return (text, uri) tuple
"""
if "::" in pkgsrc.Source:
return pkgsrc.Source.split("::", 1)
elif "://" in pkgsrc.Source:
return (pkgsrc.Source, pkgsrc.Source)
return pkgsrc.Source, pkgsrc.Source
path = config.get("options", "source_file_uri")
pkgbasename = pkgsrc.Package.PackageBase.Name
return (pkgsrc.Source, path % (pkgsrc.Source, pkgbasename))
pkgbasename = quote_plus(pkgsrc.Package.PackageBase.Name)
return pkgsrc.Source, path % (pkgsrc.Source, pkgbasename)

View file

@ -1,8 +1,8 @@
from fastapi import Request
from aurweb import db, logging, util
from aurweb import aur_logging, db, util
from aurweb.auth import creds
from aurweb.models import PackageBase
from aurweb.models import PackageBase, User
from aurweb.models.package_comaintainer import PackageComaintainer
from aurweb.models.package_notification import PackageNotification
from aurweb.models.request_type import DELETION_ID, MERGE_ID, ORPHAN_ID
@ -10,7 +10,13 @@ from aurweb.packages.requests import handle_request, update_closure_comment
from aurweb.pkgbase import util as pkgbaseutil
from aurweb.scripts import notify, popupdate
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
@db.retry_deadlock
def _retry_notify(user: User, pkgbase: PackageBase) -> None:
with db.begin():
db.create(PackageNotification, PackageBase=pkgbase, User=user)
def pkgbase_notify_instance(request: Request, pkgbase: PackageBase) -> None:
@ -21,8 +27,13 @@ def pkgbase_notify_instance(request: Request, pkgbase: PackageBase) -> None:
).scalar()
has_cred = request.user.has_credential(creds.PKGBASE_NOTIFY)
if has_cred and not notif:
with db.begin():
db.create(PackageNotification, PackageBase=pkgbase, User=request.user)
_retry_notify(request.user, pkgbase)
@db.retry_deadlock
def _retry_unnotify(notif: PackageNotification, pkgbase: PackageBase) -> None:
with db.begin():
db.delete(notif)
def pkgbase_unnotify_instance(request: Request, pkgbase: PackageBase) -> None:
@ -31,8 +42,15 @@ def pkgbase_unnotify_instance(request: Request, pkgbase: PackageBase) -> None:
).first()
has_cred = request.user.has_credential(creds.PKGBASE_NOTIFY)
if has_cred and notif:
with db.begin():
db.delete(notif)
_retry_unnotify(notif, pkgbase)
@db.retry_deadlock
def _retry_unflag(pkgbase: PackageBase) -> None:
with db.begin():
pkgbase.OutOfDateTS = None
pkgbase.Flagger = None
pkgbase.FlaggerComment = str()
def pkgbase_unflag_instance(request: Request, pkgbase: PackageBase) -> None:
@ -42,20 +60,17 @@ def pkgbase_unflag_instance(request: Request, pkgbase: PackageBase) -> None:
+ [c.User for c in pkgbase.comaintainers],
)
if has_cred:
with db.begin():
pkgbase.OutOfDateTS = None
pkgbase.Flagger = None
pkgbase.FlaggerComment = str()
_retry_unflag(pkgbase)
def pkgbase_disown_instance(request: Request, pkgbase: PackageBase) -> None:
disowner = request.user
notifs = [notify.DisownNotification(disowner.ID, pkgbase.ID)]
@db.retry_deadlock
def _retry_disown(request: Request, pkgbase: PackageBase):
notifs: list[notify.Notification] = []
is_maint = disowner == pkgbase.Maintainer
is_maint = request.user == pkgbase.Maintainer
comaint = pkgbase.comaintainers.filter(
PackageComaintainer.User == disowner
PackageComaintainer.User == request.user
).one_or_none()
is_comaint = comaint is not None
@ -79,45 +94,54 @@ def pkgbase_disown_instance(request: Request, pkgbase: PackageBase) -> None:
notifs.append(notif)
elif request.user.has_credential(creds.PKGBASE_DISOWN):
# Otherwise, the request user performing this disownage is a
# Trusted User and we treat it like a standard orphan request.
# Package Maintainer and we treat it like a standard orphan request.
notifs += handle_request(request, ORPHAN_ID, pkgbase)
with db.begin():
pkgbase.Maintainer = None
db.delete_all(pkgbase.comaintainers)
return notifs
def pkgbase_disown_instance(request: Request, pkgbase: PackageBase) -> None:
disowner = request.user
notifs = [notify.DisownNotification(disowner.ID, pkgbase.ID)]
notifs += _retry_disown(request, pkgbase)
util.apply_all(notifs, lambda n: n.send())
def pkgbase_adopt_instance(request: Request, pkgbase: PackageBase) -> None:
@db.retry_deadlock
def _retry_adopt(request: Request, pkgbase: PackageBase) -> None:
with db.begin():
pkgbase.Maintainer = request.user
def pkgbase_adopt_instance(request: Request, pkgbase: PackageBase) -> None:
_retry_adopt(request, pkgbase)
notif = notify.AdoptNotification(request.user.ID, pkgbase.ID)
notif.send()
@db.retry_deadlock
def _retry_delete(pkgbase: PackageBase, comments: str) -> None:
with db.begin():
update_closure_comment(pkgbase, DELETION_ID, comments)
db.delete(pkgbase)
def pkgbase_delete_instance(
request: Request, pkgbase: PackageBase, comments: str = str()
) -> list[notify.Notification]:
notifs = handle_request(request, DELETION_ID, pkgbase) + [
notify.DeleteNotification(request.user.ID, pkgbase.ID)
]
notif = notify.DeleteNotification(request.user.ID, pkgbase.ID)
notifs = handle_request(request, DELETION_ID, pkgbase, comments=comments) + [notif]
with db.begin():
update_closure_comment(pkgbase, DELETION_ID, comments)
db.delete(pkgbase)
_retry_delete(pkgbase, comments)
return notifs
def pkgbase_merge_instance(
request: Request, pkgbase: PackageBase, target: PackageBase, comments: str = str()
) -> None:
pkgbasename = str(pkgbase.Name)
# Create notifications.
notifs = handle_request(request, MERGE_ID, pkgbase, target)
@db.retry_deadlock
def _retry_merge(pkgbase: PackageBase, target: PackageBase) -> None:
# Target votes and notifications sets of user IDs that are
# looking to be migrated.
target_votes = set(v.UsersID for v in target.package_votes)
@ -147,9 +171,23 @@ def pkgbase_merge_instance(
db.delete(pkg)
db.delete(pkgbase)
def pkgbase_merge_instance(
request: Request,
pkgbase: PackageBase,
target: PackageBase,
comments: str = str(),
) -> None:
pkgbasename = str(pkgbase.Name)
# Create notifications.
notifs = handle_request(request, MERGE_ID, pkgbase, target, comments)
_retry_merge(pkgbase, target)
# Log this out for accountability purposes.
logger.info(
f"Trusted User '{request.user.Username}' merged "
f"Package Maintainer '{request.user.Username}' merged "
f"'{pkgbasename}' into '{target.Name}'."
)

View file

@ -2,25 +2,17 @@ from typing import Any
from fastapi import Request
from sqlalchemy import and_
from sqlalchemy.orm import joinedload
from aurweb import config, db, defaults, l10n, util
from aurweb import config, db, defaults, l10n, time, util
from aurweb.models import PackageBase, User
from aurweb.models.package_base import popularity
from aurweb.models.package_comaintainer import PackageComaintainer
from aurweb.models.package_comment import PackageComment
from aurweb.models.package_request import PENDING_ID, PackageRequest
from aurweb.models.package_vote import PackageVote
from aurweb.scripts import notify
from aurweb.templates import (
make_context as _make_context,
make_variable_context as _make_variable_context,
)
async def make_variable_context(
request: Request, pkgbase: PackageBase
) -> dict[str, Any]:
ctx = await _make_variable_context(request, pkgbase.Name)
return make_context(request, pkgbase, ctx)
from aurweb.templates import make_context as _make_context
def make_context(
@ -35,6 +27,8 @@ def make_context(
if not context:
context = _make_context(request, pkgbase.Name)
is_authenticated = request.user.is_authenticated()
# Per page and offset.
offset, per_page = util.sanitize_params(
request.query_params.get("O", defaults.O),
@ -47,12 +41,15 @@ def make_context(
context["pkgbase"] = pkgbase
context["comaintainers"] = [
c.User
for c in pkgbase.comaintainers.order_by(
PackageComaintainer.Priority.asc()
).all()
for c in pkgbase.comaintainers.options(joinedload(PackageComaintainer.User))
.order_by(PackageComaintainer.Priority.asc())
.all()
]
context["unflaggers"] = context["comaintainers"].copy()
context["unflaggers"].extend([pkgbase.Maintainer, pkgbase.Flagger])
if is_authenticated:
context["unflaggers"] = context["comaintainers"].copy()
context["unflaggers"].extend([pkgbase.Maintainer, pkgbase.Flagger])
else:
context["unflaggers"] = []
context["packages_count"] = pkgbase.packages.count()
context["keywords"] = pkgbase.keywords
@ -69,17 +66,30 @@ def make_context(
).order_by(PackageComment.CommentTS.desc())
context["is_maintainer"] = bool(request.user == pkgbase.Maintainer)
context["notified"] = request.user.notified(pkgbase)
if is_authenticated:
context["notified"] = request.user.notified(pkgbase)
else:
context["notified"] = False
context["out_of_date"] = bool(pkgbase.OutOfDateTS)
context["voted"] = request.user.package_votes.filter(
PackageVote.PackageBaseID == pkgbase.ID
).scalar()
if is_authenticated:
context["voted"] = db.query(
request.user.package_votes.filter(
PackageVote.PackageBaseID == pkgbase.ID
).exists()
).scalar()
else:
context["voted"] = False
context["requests"] = pkgbase.requests.filter(
and_(PackageRequest.Status == PENDING_ID, PackageRequest.ClosedTS.is_(None))
).count()
if is_authenticated:
context["requests"] = pkgbase.requests.filter(
and_(PackageRequest.Status == PENDING_ID, PackageRequest.ClosedTS.is_(None))
).count()
else:
context["requests"] = []
context["popularity"] = popularity(pkgbase, time.utcnow())
return context
@ -106,6 +116,7 @@ def remove_comaintainer(
return notif
@db.retry_deadlock
def remove_comaintainers(pkgbase: PackageBase, usernames: list[str]) -> None:
"""
Remove comaintainers from `pkgbase`.
@ -155,6 +166,7 @@ class NoopComaintainerNotification:
return
@db.retry_deadlock
def add_comaintainer(
pkgbase: PackageBase, comaintainer: User
) -> notify.ComaintainerAddNotification:

View file

@ -1,6 +1,9 @@
from http import HTTPStatus
from typing import Any
from aurweb import db
from fastapi import HTTPException
from aurweb import config, db
from aurweb.exceptions import ValidationError
from aurweb.models import PackageBase
@ -12,8 +15,8 @@ def request(
merge_into: str,
context: dict[str, Any],
) -> None:
if not comments:
raise ValidationError(["The comment field must not be empty."])
# validate comment
comment(comments)
if type == "merge":
# Perform merge-related checks.
@ -32,3 +35,21 @@ def request(
if target.ID == pkgbase.ID:
# TODO: This error needs to be translated.
raise ValidationError(["You cannot merge a package base into itself."])
def comment(comment: str):
if not comment:
raise ValidationError(["The comment field must not be empty."])
if len(comment) > config.getint("options", "max_chars_comment", 5000):
raise ValidationError(["Maximum number of characters for comment exceeded."])
def comment_raise_http_ex(comments: str):
try:
comment(comments)
except ValidationError as err:
raise HTTPException(
status_code=HTTPStatus.BAD_REQUEST,
detail=err.data[0],
)

View file

@ -1,20 +1,42 @@
from typing import Any, Callable, Optional
from prometheus_client import Counter
from prometheus_client import Counter, Gauge
from prometheus_fastapi_instrumentator import Instrumentator
from prometheus_fastapi_instrumentator.metrics import Info
from starlette.routing import Match, Route
from aurweb import logging
from aurweb import aur_logging
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
_instrumentator = Instrumentator()
# Custom metrics
SEARCH_REQUESTS = Counter(
"aur_search_requests", "Number of search requests by cache hit/miss", ["cache"]
)
USERS = Gauge(
"aur_users", "Number of AUR users by type", ["type"], multiprocess_mode="livemax"
)
PACKAGES = Gauge(
"aur_packages",
"Number of AUR packages by state",
["state"],
multiprocess_mode="livemax",
)
REQUESTS = Gauge(
"aur_requests",
"Number of AUR requests by type and status",
["type", "status"],
multiprocess_mode="livemax",
)
def instrumentator():
return _instrumentator
# FastAPI metrics
# Taken from https://github.com/stephenhillier/starlette_exporter
# Their license is included in LICENSES/starlette_exporter.
# The code has been modified to remove child route checks

View file
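A sketch of updating one of the new gauges (the count is illustrative; multiprocess_mode="livemax" reports the maximum across live worker processes):

from aurweb.prometheus import USERS

USERS.labels(type="package_maintainer").set(42)  # illustrative count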

@ -1,11 +1,12 @@
from fastapi import Request
from redis.client import Pipeline
from aurweb import config, db, logging, time
from aurweb import aur_logging, config, db, time
from aurweb.aur_redis import redis_connection
from aurweb.models import ApiRateLimit
from aurweb.redis import redis_connection
from aurweb.util import get_client_ip
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
def _update_ratelimit_redis(request: Request, pipeline: Pipeline):
@ -13,7 +14,7 @@ def _update_ratelimit_redis(request: Request, pipeline: Pipeline):
now = time.utcnow()
time_to_delete = now - window_length
host = request.client.host
host = get_client_ip(request)
window_key = f"ratelimit-ws:{host}"
requests_key = f"ratelimit:{host}"
@ -38,17 +39,26 @@ def _update_ratelimit_db(request: Request):
now = time.utcnow()
time_to_delete = now - window_length
records = db.query(ApiRateLimit).filter(ApiRateLimit.WindowStart < time_to_delete)
with db.begin():
db.delete_all(records)
@db.retry_deadlock
def retry_delete(records: list[ApiRateLimit]) -> None:
with db.begin():
db.delete_all(records)
host = request.client.host
records = db.query(ApiRateLimit).filter(ApiRateLimit.WindowStart < time_to_delete)
retry_delete(records)
@db.retry_deadlock
def retry_create(record: ApiRateLimit, now: int, host: str) -> ApiRateLimit:
with db.begin():
if not record:
record = db.create(ApiRateLimit, WindowStart=now, IP=host, Requests=1)
else:
record.Requests += 1
return record
host = get_client_ip(request)
record = db.query(ApiRateLimit, ApiRateLimit.IP == host).first()
with db.begin():
if not record:
record = db.create(ApiRateLimit, WindowStart=now, IP=host, Requests=1)
else:
record.Requests += 1
record = retry_create(record, now, host)
logger.debug(record.Requests)
return record
@ -83,7 +93,7 @@ def check_ratelimit(request: Request):
record = update_ratelimit(request, pipeline)
# Get cache value, else None.
host = request.client.host
host = get_client_ip(request)
pipeline.get(f"ratelimit:{host}")
requests = pipeline.execute()[0]
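get_client_ip replaces request.client.host throughout this file; its body is not shown in the diff, so this is only the usual shape of such a helper, with the header name an explicit assumption:

from fastapi import Request

def get_client_ip(request: Request) -> str:
    # Assumption: behind a reverse proxy the socket peer is the proxy itself,
    # so prefer a forwarded header when one is present.
    return request.headers.get("X-Forwarded-For", request.client.host)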

View file

@ -3,17 +3,18 @@ API routers for FastAPI.
See https://fastapi.tiangolo.com/tutorial/bigger-applications/
"""
from . import (
accounts,
auth,
html,
package_maintainer,
packages,
pkgbase,
requests,
rpc,
rss,
sso,
trusted_user,
)
"""
@ -28,7 +29,7 @@ APP_ROUTES = [
packages,
pkgbase,
requests,
trusted_user,
package_maintainer,
rss,
rpc,
sso,

View file

@ -3,13 +3,13 @@ import typing
from http import HTTPStatus
from typing import Any
from fastapi import APIRouter, Form, Request
from fastapi import APIRouter, Form, HTTPException, Request
from fastapi.responses import HTMLResponse, RedirectResponse
from sqlalchemy import and_, or_
import aurweb.config
from aurweb import cookies, db, l10n, logging, models, util
from aurweb.auth import account_type_required, requires_auth, requires_guest
from aurweb import aur_logging, db, l10n, models, util
from aurweb.auth import account_type_required, creds, requires_auth, requires_guest
from aurweb.captcha import get_captcha_salts
from aurweb.exceptions import ValidationError, handle_form_exceptions
from aurweb.l10n import get_translator_for_request
@ -22,7 +22,7 @@ from aurweb.users import update, validate
from aurweb.users.util import get_user_by_name
router = APIRouter()
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
@router.get("/passreset", response_class=HTMLResponse)
@ -32,6 +32,7 @@ async def passreset(request: Request):
return render_template(request, "passreset.html", context)
@db.async_retry_deadlock
@router.post("/passreset", response_class=HTMLResponse)
@handle_form_exceptions
@requires_guest
@ -159,9 +160,9 @@ def process_account_form(request: Request, user: models.User, args: dict[str, Any]):
for check in checks:
check(**args, request=request, user=user, _=_)
except ValidationError as exc:
return (False, exc.data)
return False, exc.data
return (True, [])
return True, []
def make_account_form_context(
@ -183,9 +184,9 @@ def make_account_form_context(
lambda e: request.user.AccountTypeID >= e[0],
[
(at.USER_ID, f"Normal {at.USER}"),
(at.TRUSTED_USER_ID, at.TRUSTED_USER),
(at.PACKAGE_MAINTAINER_ID, at.PACKAGE_MAINTAINER),
(at.DEVELOPER_ID, at.DEVELOPER),
(at.TRUSTED_USER_AND_DEV_ID, at.TRUSTED_USER_AND_DEV),
(at.PACKAGE_MAINTAINER_AND_DEV_ID, at.PACKAGE_MAINTAINER_AND_DEV),
],
)
)
@ -208,6 +209,7 @@ def make_account_form_context(
context["cn"] = args.get("CN", user.CommentNotify)
context["un"] = args.get("UN", user.UpdateNotify)
context["on"] = args.get("ON", user.OwnershipNotify)
context["hdc"] = args.get("HDC", user.HideDeletedComments)
context["inactive"] = args.get("J", user.InactivityTS != 0)
else:
context["username"] = args.get("U", str())
@ -226,6 +228,7 @@ def make_account_form_context(
context["cn"] = args.get("CN", True)
context["un"] = args.get("UN", False)
context["on"] = args.get("ON", True)
context["hdc"] = args.get("HDC", False)
context["inactive"] = args.get("J", False)
context["password"] = args.get("P", str())
@ -252,6 +255,7 @@ async def account_register(
CN: bool = Form(default=False), # Comment Notify
CU: bool = Form(default=False), # Update Notify
CO: bool = Form(default=False), # Owner Notify
HDC: bool = Form(default=False), # Hide Deleted Comments
captcha: str = Form(default=str()),
):
context = await make_variable_context(request, "Register")
@ -260,6 +264,7 @@ async def account_register(
return render_template(request, "register.html", context)
@db.async_retry_deadlock
@router.post("/register", response_class=HTMLResponse)
@handle_form_exceptions
@requires_guest
@ -279,6 +284,7 @@ async def account_register_post(
CN: bool = Form(default=False),
UN: bool = Form(default=False),
ON: bool = Form(default=False),
HDC: bool = Form(default=False),
captcha: str = Form(default=None),
captcha_salt: str = Form(...),
):
@ -332,22 +338,20 @@ async def account_register_post(
CommentNotify=CN,
UpdateNotify=UN,
OwnershipNotify=ON,
HideDeletedComments=HDC,
ResetKey=resetkey,
AccountType=atype,
)
# If a PK was given and either one does not exist or the given
# PK mismatches the existing user's SSHPubKey.PubKey.
if PK:
# Get the second element in the PK, which is the actual key.
keys = util.parse_ssh_keys(PK.strip())
for k in keys:
pk = " ".join(k)
fprint = get_fingerprint(pk)
with db.begin():
db.create(
models.SSHPubKey, UserID=user.ID, PubKey=pk, Fingerprint=fprint
)
# If a PK was given and either one does not exist or the given
# PK mismatches the existing user's SSHPubKey.PubKey.
if PK:
# Get the second element in the PK, which is the actual key.
keys = util.parse_ssh_keys(PK.strip())
for k in keys:
pk = " ".join(k)
fprint = get_fingerprint(pk)
db.create(models.SSHPubKey, User=user, PubKey=pk, Fingerprint=fprint)
# Send a reset key notification to the new user.
WelcomeNotification(user.ID).send()
@ -370,6 +374,9 @@ def cannot_edit(
:param user: Target user to be edited
:return: RedirectResponse if approval != granted else None
"""
# raise 404 if user does not exist
if not user:
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
approved = request.user.can_edit_user(user)
if not approved and (to := "/"):
if user:
@ -413,10 +420,12 @@ async def account_edit_post(
TZ: str = Form(aurweb.config.get("options", "default_timezone")),
P: str = Form(default=str()), # New Password
C: str = Form(default=None), # Password Confirm
S: bool = Form(default=False), # Suspended
PK: str = Form(default=None), # PubKey
CN: bool = Form(default=False), # Comment Notify
UN: bool = Form(default=False), # Update Notify
ON: bool = Form(default=False), # Owner Notify
HDC: bool = Form(default=False), # Hide Deleted Comments
T: int = Form(default=None),
passwd: str = Form(default=str()),
):
@ -456,17 +465,18 @@ async def account_edit_post(
update.ssh_pubkey,
update.account_type,
update.password,
update.suspend,
]
# These update functions are all guarded by retry_deadlock;
# there's no need to guard this route itself.
for f in updates:
f(**args, request=request, user=user, context=context)
if not errors:
context["complete"] = True
# Update cookies with requests, in case they were changed.
response = render_template(request, "account/edit.html", context)
return cookies.update_response_cookies(request, response, aurtz=TZ, aurlang=L)
return render_template(request, "account/edit.html", context)
@router.get("/account/{username}")
@ -510,7 +520,9 @@ async def account_comments(request: Request, username: str):
@router.get("/accounts")
@requires_auth
@account_type_required({at.TRUSTED_USER, at.DEVELOPER, at.TRUSTED_USER_AND_DEV})
@account_type_required(
{at.PACKAGE_MAINTAINER, at.DEVELOPER, at.PACKAGE_MAINTAINER_AND_DEV}
)
async def accounts(request: Request):
context = make_context(request, "Accounts")
return render_template(request, "account/search.html", context)
@ -519,7 +531,9 @@ async def accounts(request: Request):
@router.post("/accounts")
@handle_form_exceptions
@requires_auth
@account_type_required({at.TRUSTED_USER, at.DEVELOPER, at.TRUSTED_USER_AND_DEV})
@account_type_required(
{at.PACKAGE_MAINTAINER, at.DEVELOPER, at.PACKAGE_MAINTAINER_AND_DEV}
)
async def accounts_post(
request: Request,
O: int = Form(default=0), # Offset
@ -554,9 +568,9 @@ async def accounts_post(
# Convert parameter T to an AccountType ID.
account_types = {
"u": at.USER_ID,
"t": at.TRUSTED_USER_ID,
"t": at.PACKAGE_MAINTAINER_ID,
"d": at.DEVELOPER_ID,
"td": at.TRUSTED_USER_AND_DEV_ID,
"td": at.PACKAGE_MAINTAINER_AND_DEV_ID,
}
account_type_id = account_types.get(T, None)
@ -595,6 +609,78 @@ async def accounts_post(
return render_template(request, "account/index.html", context)
@router.get("/account/{name}/delete")
@requires_auth
async def account_delete(request: Request, name: str):
user = db.query(models.User).filter(models.User.Username == name).first()
if not user:
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
has_cred = request.user.has_credential(creds.ACCOUNT_EDIT, approved=[user])
if not has_cred:
_ = l10n.get_translator_for_request(request)
raise HTTPException(
detail=_("You do not have permission to edit this account."),
status_code=HTTPStatus.UNAUTHORIZED,
)
context = make_context(request, "Accounts")
context["name"] = name
return render_template(request, "account/delete.html", context)
@db.async_retry_deadlock
@router.post("/account/{name}/delete")
@handle_form_exceptions
@requires_auth
async def account_delete_post(
request: Request,
name: str,
passwd: str = Form(default=str()),
confirm: bool = Form(default=False),
):
user = db.query(models.User).filter(models.User.Username == name).first()
if not user:
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
has_cred = request.user.has_credential(creds.ACCOUNT_EDIT, approved=[user])
if not has_cred:
_ = l10n.get_translator_for_request(request)
raise HTTPException(
detail=_("You do not have permission to edit this account."),
status_code=HTTPStatus.UNAUTHORIZED,
)
context = make_context(request, "Accounts")
context["name"] = name
confirm = util.strtobool(confirm)
if not confirm:
context["errors"] = [
"The account has not been deleted, check the confirmation checkbox."
]
return render_template(
request,
"account/delete.html",
context,
status_code=HTTPStatus.BAD_REQUEST,
)
if not request.user.valid_password(passwd):
context["errors"] = ["Invalid password."]
return render_template(
request,
"account/delete.html",
context,
status_code=HTTPStatus.BAD_REQUEST,
)
with db.begin():
db.delete(user)
return RedirectResponse("/", status_code=HTTPStatus.SEE_OTHER)
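A sketch of exercising the new deletion endpoint with FastAPI's test client; the application import path, username, password, and session cookie are illustrative:

from fastapi.testclient import TestClient

from aurweb.asgi import app  # assumed ASGI entry point

client = TestClient(app)
resp = client.post(
    "/account/some-user/delete",
    data={"passwd": "correct-password", "confirm": True},
    cookies={"AURSID": "valid-session-id"},
    follow_redirects=False,
)
assert resp.status_code == 303  # SEE_OTHER redirect to "/" on success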
def render_terms_of_service(request: Request, context: dict, terms: typing.Iterable):
if not terms:
return RedirectResponse("/", status_code=HTTPStatus.SEE_OTHER)
@ -633,6 +719,7 @@ async def terms_of_service(request: Request):
return render_terms_of_service(request, context, accept_needed)
@db.async_retry_deadlock
@router.post("/tos")
@handle_form_exceptions
@requires_auth

View file

@ -28,6 +28,11 @@ async def login_get(request: Request, next: str = "/"):
return await login_template(request, next)
@db.retry_deadlock
def _retry_login(request: Request, user: User, passwd: str) -> str:
return user.login(request, passwd)
@router.post("/login", response_class=HTMLResponse)
@handle_form_exceptions
@requires_guest
@ -48,21 +53,30 @@ async def login_post(
status_code=HTTPStatus.BAD_REQUEST, detail=_("Bad Referer header.")
)
with db.begin():
user = (
db.query(User)
.filter(or_(User.Username == user, User.Email == user))
.first()
user = (
db.query(User)
.filter(
or_(
User.Username == user,
User.Email == user,
)
)
.first()
)
if not user:
return await login_template(request, next, errors=["Bad username or password."])
if user.Suspended:
return await login_template(request, next, errors=["Account Suspended"])
cookie_timeout = cookies.timeout(remember_me)
sid = user.login(request, passwd, cookie_timeout)
# If "remember me" was not ticked, we set a session cookie for AURSID,
# otherwise we make it a persistent cookie
cookie_timeout = None
if remember_me:
cookie_timeout = aurweb.config.getint("options", "persistent_cookie_timeout")
perma_timeout = aurweb.config.getint("options", "permanent_cookie_timeout")
sid = _retry_login(request, user, passwd)
if not sid:
return await login_template(request, next, errors=["Bad username or password."])
@ -77,23 +91,10 @@ async def login_post(
httponly=secure,
samesite=cookies.samesite(),
)
response.set_cookie(
"AURTZ",
user.Timezone,
secure=secure,
httponly=secure,
samesite=cookies.samesite(),
)
response.set_cookie(
"AURLANG",
user.LangPreference,
secure=secure,
httponly=secure,
samesite=cookies.samesite(),
)
response.set_cookie(
"AURREMEMBER",
remember_me,
max_age=perma_timeout,
secure=secure,
httponly=secure,
samesite=cookies.samesite(),
@ -101,16 +102,21 @@ async def login_post(
return response
@db.retry_deadlock
def _retry_logout(request: Request) -> None:
request.user.logout(request)
@router.post("/logout")
@handle_form_exceptions
@requires_auth
async def logout(request: Request, next: str = Form(default="/")):
if request.user.is_authenticated():
request.user.logout(request)
_retry_logout(request)
# Use 303 since we may be handling a post request, that'll get it
# to redirect to a get request.
response = RedirectResponse(url=next, status_code=HTTPStatus.SEE_OTHER)
response.delete_cookie("AURSID")
response.delete_cookie("AURTZ")
response.delete_cookie("AURREMEMBER")
return response
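The net effect of the login change above, in Set-Cookie terms (values illustrative):

# Without "remember me": no max_age, so AURSID is a session cookie that the
# browser discards when it closes.
response.set_cookie("AURSID", sid, secure=secure, httponly=secure)
# With "remember me": max_age comes from options.persistent_cookie_timeout,
# so AURSID survives browser restarts.
response.set_cookie("AURSID", sid, max_age=cookie_timeout, secure=secure, httponly=secure)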

View file

@ -1,6 +1,7 @@
""" AURWeb's primary routing module. Define all routes via @app.app.{get,post}
decorators in some way; more complex routes should be defined in their
own modules and imported here. """
import os
from http import HTTPStatus
@ -12,19 +13,17 @@ from prometheus_client import (
generate_latest,
multiprocess,
)
from sqlalchemy import and_, case, or_
from sqlalchemy import case, or_
import aurweb.config
import aurweb.models.package_request
from aurweb import cookies, db, logging, models, time, util
from aurweb.cache import db_count_cache
from aurweb import aur_logging, cookies, db, models, statistics, time, util
from aurweb.exceptions import handle_form_exceptions
from aurweb.models.account_type import TRUSTED_USER_AND_DEV_ID, TRUSTED_USER_ID
from aurweb.models.package_request import PENDING_ID
from aurweb.packages.util import query_notified, query_voted, updated_packages
from aurweb.templates import make_context, render_template
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
router = APIRouter()
@ -35,6 +34,7 @@ async def favicon(request: Request):
return RedirectResponse("/static/images/favicon.ico")
@db.async_retry_deadlock
@router.post("/language", response_class=RedirectResponse)
@handle_form_exceptions
async def language(
@ -55,19 +55,28 @@ async def language(
query_string = "?" + q if q else str()
# If the user is authenticated, update the user's LangPreference.
if request.user.is_authenticated():
with db.begin():
request.user.LangPreference = set_lang
# In any case, set the response's AURLANG cookie that never expires.
response = RedirectResponse(
url=f"{next}{query_string}", status_code=HTTPStatus.SEE_OTHER
)
secure = aurweb.config.getboolean("options", "disable_http_login")
response.set_cookie(
"AURLANG", set_lang, secure=secure, httponly=secure, samesite=cookies.samesite()
)
# If the user is authenticated, update the user's LangPreference.
# Otherwise set an AURLANG cookie
if request.user.is_authenticated():
with db.begin():
request.user.LangPreference = set_lang
else:
secure = aurweb.config.getboolean("options", "disable_http_login")
perma_timeout = aurweb.config.getint("options", "permanent_cookie_timeout")
response.set_cookie(
"AURLANG",
set_lang,
secure=secure,
httponly=secure,
max_age=perma_timeout,
samesite=cookies.samesite(),
)
return response
@ -77,84 +86,12 @@ async def index(request: Request):
context = make_context(request, "Home")
context["ssh_fingerprints"] = util.get_ssh_fingerprints()
bases = db.query(models.PackageBase)
redis = aurweb.redis.redis_connection()
cache_expire = 300 # Five minutes.
cache_expire = aurweb.config.getint("cache", "expiry_time_statistics", 300)
# Package statistics.
query = bases.filter(models.PackageBase.PackagerUID.isnot(None))
context["package_count"] = await db_count_cache(
redis, "package_count", query, expire=cache_expire
)
query = bases.filter(
and_(
models.PackageBase.MaintainerUID.is_(None),
models.PackageBase.PackagerUID.isnot(None),
)
)
context["orphan_count"] = await db_count_cache(
redis, "orphan_count", query, expire=cache_expire
)
query = db.query(models.User)
context["user_count"] = await db_count_cache(
redis, "user_count", query, expire=cache_expire
)
query = query.filter(
or_(
models.User.AccountTypeID == TRUSTED_USER_ID,
models.User.AccountTypeID == TRUSTED_USER_AND_DEV_ID,
)
)
context["trusted_user_count"] = await db_count_cache(
redis, "trusted_user_count", query, expire=cache_expire
)
# Current timestamp.
now = time.utcnow()
seven_days = 86400 * 7 # Seven days worth of seconds.
seven_days_ago = now - seven_days
one_hour = 3600
updated = bases.filter(
and_(
models.PackageBase.ModifiedTS - models.PackageBase.SubmittedTS >= one_hour,
models.PackageBase.PackagerUID.isnot(None),
)
)
query = bases.filter(
and_(
models.PackageBase.SubmittedTS >= seven_days_ago,
models.PackageBase.PackagerUID.isnot(None),
)
)
context["seven_days_old_added"] = await db_count_cache(
redis, "seven_days_old_added", query, expire=cache_expire
)
query = updated.filter(models.PackageBase.ModifiedTS >= seven_days_ago)
context["seven_days_old_updated"] = await db_count_cache(
redis, "seven_days_old_updated", query, expire=cache_expire
)
year = seven_days * 52 # Fifty two weeks worth: one year.
year_ago = now - year
query = updated.filter(models.PackageBase.ModifiedTS >= year_ago)
context["year_old_updated"] = await db_count_cache(
redis, "year_old_updated", query, expire=cache_expire
)
query = bases.filter(
models.PackageBase.ModifiedTS - models.PackageBase.SubmittedTS < 3600
)
context["never_updated"] = await db_count_cache(
redis, "never_updated", query, expire=cache_expire
)
counts = statistics.get_homepage_counts()
for k in counts:
context[k] = counts[k]
# Get the 15 most recently updated packages.
context["package_updates"] = updated_packages(15, cache_expire)
@ -199,7 +136,7 @@ async def index(request: Request):
)
archive_time = aurweb.config.getint("options", "request_archive_time")
start = now - archive_time
start = time.utcnow() - archive_time
# Package requests created by request.user.
context["package_requests"] = (
@ -275,6 +212,9 @@ async def metrics(request: Request):
status_code=HTTPStatus.SERVICE_UNAVAILABLE,
)
# update prometheus gauges for packages and users
statistics.update_prometheus_metrics()
registry = CollectorRegistry()
multiprocess.MultiProcessCollector(registry)
data = generate_latest(registry)
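Judging from the two calls above, the new statistics module owns both the cached homepage counters and the gauge refresh; under that assumption, consuming it reduces to:

from aurweb import statistics

counts = statistics.get_homepage_counts()  # Redis-cached counters for the index page
context.update(counts)                     # equivalent to the loop shown above
statistics.update_prometheus_metrics()     # refresh gauges before generate_latest()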

View file

@ -7,17 +7,20 @@ from fastapi import APIRouter, Form, HTTPException, Request
from fastapi.responses import RedirectResponse, Response
from sqlalchemy import and_, func, or_
from aurweb import db, l10n, logging, models, time
from aurweb import aur_logging, db, l10n, models, time
from aurweb.auth import creds, requires_auth
from aurweb.exceptions import handle_form_exceptions
from aurweb.models import User
from aurweb.models.account_type import TRUSTED_USER_AND_DEV_ID, TRUSTED_USER_ID
from aurweb.models.account_type import (
PACKAGE_MAINTAINER_AND_DEV_ID,
PACKAGE_MAINTAINER_ID,
)
from aurweb.templates import make_context, make_variable_context, render_template
router = APIRouter()
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
# Some TU route specific constants.
# Some PM route specific constants.
ITEMS_PER_PAGE = 10 # Paged table size.
MAX_AGENDA_LENGTH = 75 # Agenda table column length.
@ -26,32 +29,32 @@ ADDVOTE_SPECIFICS = {
# When a proposal is added, duration is added to the current
# timestamp.
# "addvote_type": (duration, quorum)
"add_tu": (7 * 24 * 60 * 60, 0.66),
"remove_tu": (7 * 24 * 60 * 60, 0.75),
"remove_inactive_tu": (5 * 24 * 60 * 60, 0.66),
"add_pm": (7 * 24 * 60 * 60, 0.66),
"remove_pm": (7 * 24 * 60 * 60, 0.75),
"remove_inactive_pm": (5 * 24 * 60 * 60, 0.66),
"bylaws": (7 * 24 * 60 * 60, 0.75),
}
def populate_trusted_user_counts(context: dict[str, Any]) -> None:
tu_query = db.query(User).filter(
def populate_package_maintainer_counts(context: dict[str, Any]) -> None:
pm_query = db.query(User).filter(
or_(
User.AccountTypeID == TRUSTED_USER_ID,
User.AccountTypeID == TRUSTED_USER_AND_DEV_ID,
User.AccountTypeID == PACKAGE_MAINTAINER_ID,
User.AccountTypeID == PACKAGE_MAINTAINER_AND_DEV_ID,
)
)
context["trusted_user_count"] = tu_query.count()
context["package_maintainer_count"] = pm_query.count()
# In case any records have a None InactivityTS.
active_tu_query = tu_query.filter(
active_pm_query = pm_query.filter(
or_(User.InactivityTS.is_(None), User.InactivityTS == 0)
)
context["active_trusted_user_count"] = active_tu_query.count()
context["active_package_maintainer_count"] = active_pm_query.count()
@router.get("/tu")
@router.get("/package-maintainer")
@requires_auth
async def trusted_user(
async def package_maintainer(
request: Request,
coff: int = 0, # current offset
cby: str = "desc", # current by
@ -60,10 +63,10 @@ async def trusted_user(
): # past by
"""Proposal listings."""
if not request.user.has_credential(creds.TU_LIST_VOTES):
if not request.user.has_credential(creds.PM_LIST_VOTES):
return RedirectResponse("/", status_code=HTTPStatus.SEE_OTHER)
context = make_context(request, "Trusted User")
context = make_context(request, "Package Maintainer")
current_by, past_by = cby, pby
current_off, past_off = coff, poff
@ -84,9 +87,9 @@ async def trusted_user(
context["past_by"] = past_by
current_votes = (
db.query(models.TUVoteInfo)
.filter(models.TUVoteInfo.End > ts)
.order_by(models.TUVoteInfo.Submitted.desc())
db.query(models.VoteInfo)
.filter(models.VoteInfo.End > ts)
.order_by(models.VoteInfo.Submitted.desc())
)
context["current_votes_count"] = current_votes.count()
current_votes = current_votes.limit(pp).offset(current_off)
@ -96,9 +99,9 @@ async def trusted_user(
context["current_off"] = current_off
past_votes = (
db.query(models.TUVoteInfo)
.filter(models.TUVoteInfo.End <= ts)
.order_by(models.TUVoteInfo.Submitted.desc())
db.query(models.VoteInfo)
.filter(models.VoteInfo.End <= ts)
.order_by(models.VoteInfo.Submitted.desc())
)
context["past_votes_count"] = past_votes.count()
past_votes = past_votes.limit(pp).offset(past_off)
@ -107,29 +110,29 @@ async def trusted_user(
)
context["past_off"] = past_off
last_vote = func.max(models.TUVote.VoteID).label("LastVote")
last_votes_by_tu = (
db.query(models.TUVote)
last_vote = func.max(models.Vote.VoteID).label("LastVote")
last_votes_by_pm = (
db.query(models.Vote)
.join(models.User)
.join(models.TUVoteInfo, models.TUVoteInfo.ID == models.TUVote.VoteID)
.join(models.VoteInfo, models.VoteInfo.ID == models.Vote.VoteID)
.filter(
and_(
models.TUVote.VoteID == models.TUVoteInfo.ID,
models.User.ID == models.TUVote.UserID,
models.TUVoteInfo.End < ts,
models.Vote.VoteID == models.VoteInfo.ID,
models.User.ID == models.Vote.UserID,
models.VoteInfo.End < ts,
or_(models.User.AccountTypeID == 2, models.User.AccountTypeID == 4),
)
)
.with_entities(models.TUVote.UserID, last_vote, models.User.Username)
.group_by(models.TUVote.UserID)
.with_entities(models.Vote.UserID, last_vote, models.User.Username)
.group_by(models.Vote.UserID)
.order_by(last_vote.desc(), models.User.Username.asc())
)
context["last_votes_by_tu"] = last_votes_by_tu.all()
context["last_votes_by_pm"] = last_votes_by_pm.all()
context["current_by_next"] = "asc" if current_by == "desc" else "desc"
context["past_by_next"] = "asc" if past_by == "desc" else "desc"
populate_trusted_user_counts(context)
populate_package_maintainer_counts(context)
context["q"] = {
"coff": current_off,
@ -138,33 +141,33 @@ async def trusted_user(
"pby": past_by,
}
return render_template(request, "tu/index.html", context)
return render_template(request, "package-maintainer/index.html", context)
def render_proposal(
request: Request,
context: dict,
proposal: int,
voteinfo: models.TUVoteInfo,
voteinfo: models.VoteInfo,
voters: typing.Iterable[models.User],
vote: models.TUVote,
vote: models.Vote,
status_code: HTTPStatus = HTTPStatus.OK,
):
"""Render a single TU proposal."""
"""Render a single PM proposal."""
context["proposal"] = proposal
context["voteinfo"] = voteinfo
context["voters"] = voters.all()
total = voteinfo.total_votes()
participation = (total / voteinfo.ActiveTUs) if voteinfo.ActiveTUs else 0
participation = (total / voteinfo.ActiveUsers) if voteinfo.ActiveUsers else 0
context["participation"] = participation
accepted = (voteinfo.Yes > voteinfo.ActiveTUs / 2) or (
accepted = (voteinfo.Yes > voteinfo.ActiveUsers / 2) or (
participation > voteinfo.Quorum and voteinfo.Yes > voteinfo.No
)
context["accepted"] = accepted
can_vote = voters.filter(models.TUVote.User == request.user).first() is None
can_vote = voters.filter(models.Vote.User == request.user).first() is None
context["can_vote"] = can_vote
if not voteinfo.is_running():
@ -173,41 +176,41 @@ def render_proposal(
context["vote"] = vote
context["has_voted"] = vote is not None
return render_template(request, "tu/show.html", context, status_code=status_code)
return render_template(
request, "package-maintainer/show.html", context, status_code=status_code
)
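A worked example of the acceptance rule computed above, with illustrative numbers:

# 40 active package maintainers, quorum 0.66; votes: 20 Yes, 5 No, 2 Abstain.
total = 20 + 5 + 2                 # voteinfo.total_votes() -> 27
participation = total / 40         # 0.675
accepted = (20 > 40 / 2) or (participation > 0.66 and 20 > 5)
# (20 > 20) is False, but 0.675 > 0.66 and Yes > No, so accepted is True.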
@router.get("/tu/{proposal}")
@router.get("/package-maintainer/{proposal}")
@requires_auth
async def trusted_user_proposal(request: Request, proposal: int):
if not request.user.has_credential(creds.TU_LIST_VOTES):
return RedirectResponse("/tu", status_code=HTTPStatus.SEE_OTHER)
async def package_maintainer_proposal(request: Request, proposal: int):
if not request.user.has_credential(creds.PM_LIST_VOTES):
return RedirectResponse("/package-maintainer", status_code=HTTPStatus.SEE_OTHER)
context = await make_variable_context(request, "Trusted User")
context = await make_variable_context(request, "Package Maintainer")
proposal = int(proposal)
voteinfo = (
db.query(models.TUVoteInfo).filter(models.TUVoteInfo.ID == proposal).first()
)
voteinfo = db.query(models.VoteInfo).filter(models.VoteInfo.ID == proposal).first()
if not voteinfo:
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
voters = (
db.query(models.User)
.join(models.TUVote)
.filter(models.TUVote.VoteID == voteinfo.ID)
.join(models.Vote)
.filter(models.Vote.VoteID == voteinfo.ID)
)
vote = (
db.query(models.TUVote)
db.query(models.Vote)
.filter(
and_(
models.TUVote.UserID == request.user.ID,
models.TUVote.VoteID == voteinfo.ID,
models.Vote.UserID == request.user.ID,
models.Vote.VoteID == voteinfo.ID,
)
)
.first()
)
if not request.user.has_credential(creds.TU_VOTE):
context["error"] = "Only Trusted Users are allowed to vote."
if not request.user.has_credential(creds.PM_VOTE):
context["error"] = "Only Package Maintainers are allowed to vote."
if voteinfo.User == request.user.Username:
context["error"] = "You cannot vote in an proposal about you."
elif vote is not None:
@ -217,43 +220,42 @@ async def trusted_user_proposal(request: Request, proposal: int):
return render_proposal(request, context, proposal, voteinfo, voters, vote)
@router.post("/tu/{proposal}")
@db.async_retry_deadlock
@router.post("/package-maintainer/{proposal}")
@handle_form_exceptions
@requires_auth
async def trusted_user_proposal_post(
async def package_maintainer_proposal_post(
request: Request, proposal: int, decision: str = Form(...)
):
if not request.user.has_credential(creds.TU_LIST_VOTES):
return RedirectResponse("/tu", status_code=HTTPStatus.SEE_OTHER)
if not request.user.has_credential(creds.PM_LIST_VOTES):
return RedirectResponse("/package-maintainer", status_code=HTTPStatus.SEE_OTHER)
context = await make_variable_context(request, "Trusted User")
context = await make_variable_context(request, "Package Maintainer")
proposal = int(proposal) # Make sure it's an int.
voteinfo = (
db.query(models.TUVoteInfo).filter(models.TUVoteInfo.ID == proposal).first()
)
voteinfo = db.query(models.VoteInfo).filter(models.VoteInfo.ID == proposal).first()
if not voteinfo:
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
voters = (
db.query(models.User)
.join(models.TUVote)
.filter(models.TUVote.VoteID == voteinfo.ID)
.join(models.Vote)
.filter(models.Vote.VoteID == voteinfo.ID)
)
vote = (
db.query(models.TUVote)
db.query(models.Vote)
.filter(
and_(
models.TUVote.UserID == request.user.ID,
models.TUVote.VoteID == voteinfo.ID,
models.Vote.UserID == request.user.ID,
models.Vote.VoteID == voteinfo.ID,
)
)
.first()
)
status_code = HTTPStatus.OK
if not request.user.has_credential(creds.TU_VOTE):
context["error"] = "Only Trusted Users are allowed to vote."
if not request.user.has_credential(creds.PM_VOTE):
context["error"] = "Only Package Maintainers are allowed to vote."
status_code = HTTPStatus.UNAUTHORIZED
elif voteinfo.User == request.user.Username:
context["error"] = "You cannot vote in an proposal about you."
@ -267,14 +269,16 @@ async def trusted_user_proposal_post(
request, context, proposal, voteinfo, voters, vote, status_code=status_code
)
if decision in {"Yes", "No", "Abstain"}:
# Increment whichever decision was given to us.
setattr(voteinfo, decision, getattr(voteinfo, decision) + 1)
else:
return Response("Invalid 'decision' value.", status_code=HTTPStatus.BAD_REQUEST)
with db.begin():
vote = db.create(models.TUVote, User=request.user, VoteInfo=voteinfo)
if decision in {"Yes", "No", "Abstain"}:
# Increment whichever decision was given to us.
setattr(voteinfo, decision, getattr(voteinfo, decision) + 1)
else:
return Response(
"Invalid 'decision' value.", status_code=HTTPStatus.BAD_REQUEST
)
vote = db.create(models.Vote, User=request.user, VoteInfo=voteinfo)
context["error"] = "You've already voted for this proposal."
return render_proposal(request, context, proposal, voteinfo, voters, vote)
@ -282,17 +286,17 @@ async def trusted_user_proposal_post(
@router.get("/addvote")
@requires_auth
async def trusted_user_addvote(
request: Request, user: str = str(), type: str = "add_tu", agenda: str = str()
async def package_maintainer_addvote(
request: Request, user: str = str(), type: str = "add_pm", agenda: str = str()
):
if not request.user.has_credential(creds.TU_ADD_VOTE):
return RedirectResponse("/tu", status_code=HTTPStatus.SEE_OTHER)
if not request.user.has_credential(creds.PM_ADD_VOTE):
return RedirectResponse("/package-maintainer", status_code=HTTPStatus.SEE_OTHER)
context = await make_variable_context(request, "Add Proposal")
if type not in ADDVOTE_SPECIFICS:
context["error"] = "Invalid type."
type = "add_tu" # Default it.
type = "add_pm" # Default it.
context["user"] = user
context["type"] = type
@ -301,17 +305,18 @@ async def trusted_user_addvote(
return render_template(request, "addvote.html", context)
@db.async_retry_deadlock
@router.post("/addvote")
@handle_form_exceptions
@requires_auth
async def trusted_user_addvote_post(
async def package_maintainer_addvote_post(
request: Request,
user: str = Form(default=str()),
type: str = Form(default=str()),
agenda: str = Form(default=str()),
):
if not request.user.has_credential(creds.TU_ADD_VOTE):
return RedirectResponse("/tu", status_code=HTTPStatus.SEE_OTHER)
if not request.user.has_credential(creds.PM_ADD_VOTE):
return RedirectResponse("/package-maintainer", status_code=HTTPStatus.SEE_OTHER)
# Build a context.
context = await make_variable_context(request, "Add Proposal")
@ -333,10 +338,8 @@ async def trusted_user_addvote_post(
utcnow = time.utcnow()
voteinfo = (
db.query(models.TUVoteInfo)
.filter(
and_(models.TUVoteInfo.User == user, models.TUVoteInfo.End > utcnow)
)
db.query(models.VoteInfo)
.filter(and_(models.VoteInfo.User == user, models.VoteInfo.End > utcnow))
.count()
)
if voteinfo:
@ -348,7 +351,7 @@ async def trusted_user_addvote_post(
if type not in ADDVOTE_SPECIFICS:
context["error"] = "Invalid type."
context["type"] = type = "add_tu" # Default for rendering.
context["type"] = type = "add_pm" # Default for rendering.
return render_addvote(context, HTTPStatus.BAD_REQUEST)
if not agenda:
@ -359,12 +362,12 @@ async def trusted_user_addvote_post(
duration, quorum = ADDVOTE_SPECIFICS.get(type)
timestamp = time.utcnow()
# Active TU types we filter for.
types = {TRUSTED_USER_ID, TRUSTED_USER_AND_DEV_ID}
# Active PM types we filter for.
types = {PACKAGE_MAINTAINER_ID, PACKAGE_MAINTAINER_AND_DEV_ID}
# Create a new TUVoteInfo (proposal)!
# Create a new VoteInfo (proposal)!
with db.begin():
active_tus = (
active_pms = (
db.query(User)
.filter(
and_(
@ -376,16 +379,16 @@ async def trusted_user_addvote_post(
.count()
)
voteinfo = db.create(
models.TUVoteInfo,
models.VoteInfo,
User=user,
Agenda=html.escape(agenda),
Submitted=timestamp,
End=(timestamp + duration),
Quorum=quorum,
ActiveTUs=active_tus,
ActiveUsers=active_pms,
Submitter=request.user,
)
# Redirect to the new proposal.
endpoint = f"/tu/{voteinfo.ID}"
endpoint = f"/package-maintainer/{voteinfo.ID}"
return RedirectResponse(endpoint, status_code=HTTPStatus.SEE_OTHER)

View file

@ -5,8 +5,9 @@ from typing import Any
from fastapi import APIRouter, Form, Query, Request, Response
import aurweb.filters # noqa: F401
from aurweb import config, db, defaults, logging, models, util
from aurweb import aur_logging, config, db, defaults, models, util
from aurweb.auth import creds, requires_auth
from aurweb.cache import db_count_cache, db_query_cache
from aurweb.exceptions import InvariantError, handle_form_exceptions
from aurweb.models.relation_type import CONFLICTS_ID, PROVIDES_ID, REPLACES_ID
from aurweb.packages import util as pkgutil
@ -14,8 +15,9 @@ from aurweb.packages.search import PackageSearch
from aurweb.packages.util import get_pkg_or_base
from aurweb.pkgbase import actions as pkgbase_actions, util as pkgbaseutil
from aurweb.templates import make_context, make_variable_context, render_template
from aurweb.util import hash_query
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
router = APIRouter()
@ -87,31 +89,35 @@ async def packages_get(
# Collect search result count here; we've applied our keywords.
# Including more query operations below, like ordering, will
# increase the amount of time required to collect a count.
num_packages = search.count()
# we use redis for caching the results of the query
cache_expire = config.getint("cache", "expiry_time_search", 600)
num_packages = db_count_cache(hash_query(search.query), search.query, cache_expire)
# Apply user-specified sort column and ordering.
search.sort_by(sort_by, sort_order)
# Insert search results into the context.
results = (
search.results()
.with_entities(
models.Package.ID,
models.Package.Name,
models.Package.PackageBaseID,
models.Package.Version,
models.Package.Description,
models.PackageBase.Popularity,
models.PackageBase.NumVotes,
models.PackageBase.OutOfDateTS,
models.User.Username.label("Maintainer"),
models.PackageVote.PackageBaseID.label("Voted"),
models.PackageNotification.PackageBaseID.label("Notify"),
)
.group_by(models.Package.Name)
results = search.results().with_entities(
models.Package.ID,
models.Package.Name,
models.Package.PackageBaseID,
models.Package.Version,
models.Package.Description,
models.PackageBase.Popularity,
models.PackageBase.NumVotes,
models.PackageBase.OutOfDateTS,
models.PackageBase.ModifiedTS,
models.User.Username.label("Maintainer"),
models.PackageVote.PackageBaseID.label("Voted"),
models.PackageNotification.PackageBaseID.label("Notify"),
)
packages = results.limit(per_page).offset(offset)
# paging
results = results.limit(per_page).offset(offset)
# we use redis for caching the results of the query
packages = db_query_cache(hash_query(results), results, cache_expire)
context["packages"] = packages
context["packages_count"] = num_packages
@ -161,7 +167,8 @@ async def package(
rels_data["r"].append(rel)
# Add our base information.
context = await pkgbaseutil.make_variable_context(request, pkgbase)
context = pkgbaseutil.make_context(request, pkgbase)
context["q"] = dict(request.query_params)
context.update({"all_deps": all_deps, "all_reqs": all_reqs})
@ -183,6 +190,17 @@ async def package(
if not all_deps:
deps = deps.limit(max_listing)
context["dependencies"] = deps.all()
# Existing dependencies to avoid multiple lookups
context["dependencies_names_from_aur"] = [
item.Name
for item in db.query(models.Package)
.filter(
models.Package.Name.in_(
pkg.package_dependencies.with_entities(models.PackageDependency.DepName)
)
)
.all()
]
# Package requirements (other packages depend on this one).
reqs = pkgutil.pkg_required(pkg.Name, [p.RelName for p in rels_data.get("p", [])])
@ -193,6 +211,8 @@ async def package(
context["licenses"] = pkg.package_licenses
context["groups"] = pkg.package_groups
conflicts = pkg.package_relations.filter(
models.PackageRelation.RelTypeID == CONFLICTS_ID
).order_by(models.PackageRelation.RelName.asc())
@ -213,7 +233,7 @@ async def package(
async def packages_unflag(request: Request, package_ids: list[int] = [], **kwargs):
if not package_ids:
return (False, ["You did not select any packages to unflag."])
return False, ["You did not select any packages to unflag."]
# Holds the set of package bases we're looking to unflag.
# Constructed below via looping through the packages query.
@ -226,14 +246,14 @@ async def packages_unflag(request: Request, package_ids: list[int] = [], **kwargs):
creds.PKGBASE_UNFLAG, approved=[pkg.PackageBase.Flagger]
)
if not has_cred:
return (False, ["You did not select any packages to unflag."])
return False, ["You did not select any packages to unflag."]
if pkg.PackageBase not in bases:
bases.update({pkg.PackageBase})
for pkgbase in bases:
pkgbase_actions.pkgbase_unflag_instance(request, pkgbase)
return (True, ["The selected packages have been unflagged."])
return True, ["The selected packages have been unflagged."]
async def packages_notify(request: Request, package_ids: list[int] = [], **kwargs):
@ -271,13 +291,13 @@ async def packages_notify(request: Request, package_ids: list[int] = [], **kwargs):
pkgbase_actions.pkgbase_notify_instance(request, pkgbase)
# TODO: This message does not yet have a translation.
return (True, ["The selected packages' notifications have been enabled."])
return True, ["The selected packages' notifications have been enabled."]
async def packages_unnotify(request: Request, package_ids: list[int] = [], **kwargs):
if not package_ids:
# TODO: This error does not yet have a translation.
return (False, ["You did not select any packages for notification removal."])
return False, ["You did not select any packages for notification removal."]
# TODO: This error does not yet have a translation.
error_tuple = (
@ -307,14 +327,14 @@ async def packages_unnotify(request: Request, package_ids: list[int] = [], **kwargs):
pkgbase_actions.pkgbase_unnotify_instance(request, pkgbase)
# TODO: This message does not yet have a translation.
return (True, ["The selected packages' notifications have been removed."])
return True, ["The selected packages' notifications have been removed."]
async def packages_adopt(
request: Request, package_ids: list[int] = [], confirm: bool = False, **kwargs
):
if not package_ids:
return (False, ["You did not select any packages to adopt."])
return False, ["You did not select any packages to adopt."]
if not confirm:
return (
@ -347,7 +367,7 @@ async def packages_adopt(
for pkgbase in bases:
pkgbase_actions.pkgbase_adopt_instance(request, pkgbase)
return (True, ["The selected packages have been adopted."])
return True, ["The selected packages have been adopted."]
def disown_all(request: Request, pkgbases: list[models.PackageBase]) -> list[str]:
@ -364,7 +384,7 @@ async def packages_disown(
request: Request, package_ids: list[int] = [], confirm: bool = False, **kwargs
):
if not package_ids:
return (False, ["You did not select any packages to disown."])
return False, ["You did not select any packages to disown."]
if not confirm:
return (
@ -397,9 +417,9 @@ async def packages_disown(
# Now, disown all the bases if we can.
if errors := disown_all(request, bases):
return (False, errors)
return False, errors
return (True, ["The selected packages have been disowned."])
return True, ["The selected packages have been disowned."]
async def packages_delete(
@ -410,7 +430,7 @@ async def packages_delete(
**kwargs,
):
if not package_ids:
return (False, ["You did not select any packages to delete."])
return False, ["You did not select any packages to delete."]
if not confirm:
return (
@ -422,7 +442,7 @@ async def packages_delete(
)
if not request.user.has_credential(creds.PKGBASE_DELETE):
return (False, ["You do not have permission to delete packages."])
return False, ["You do not have permission to delete packages."]
# set-ify package_ids and query the database for related records.
package_ids = set(package_ids)
@ -432,7 +452,7 @@ async def packages_delete(
# Let the user know there was an issue with their input: they have
# provided at least one package_id which does not exist in the DB.
# TODO: This error has not yet been translated.
return (False, ["One of the packages you selected does not exist."])
return False, ["One of the packages you selected does not exist."]
# Make a set out of all package bases related to `packages`.
bases = {pkg.PackageBase for pkg in packages}
@ -448,7 +468,7 @@ async def packages_delete(
)
util.apply_all(notifs, lambda n: n.send())
return (True, ["The selected packages have been deleted."])
return True, ["The selected packages have been deleted."]
# A mapping of action string -> callback functions used within the
@ -473,7 +493,6 @@ async def packages_post(
action: str = Form(default=str()),
confirm: bool = Form(default=False),
):
# If an invalid action is specified, just render GET /packages
# with an BAD_REQUEST status_code.
if action not in PACKAGE_ACTIONS:

View file

@ -4,7 +4,7 @@ from fastapi import APIRouter, Form, HTTPException, Query, Request, Response
from fastapi.responses import JSONResponse, RedirectResponse
from sqlalchemy import and_
from aurweb import config, db, l10n, logging, templates, time, util
from aurweb import aur_logging, config, db, l10n, templates, time, util
from aurweb.auth import creds, requires_auth
from aurweb.exceptions import InvariantError, ValidationError, handle_form_exceptions
from aurweb.models import PackageBase
@ -21,7 +21,7 @@ from aurweb.scripts import notify, popupdate
from aurweb.scripts.rendercomment import update_comment_render_fastapi
from aurweb.templates import make_variable_context, render_template
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
router = APIRouter()
@ -87,6 +87,7 @@ async def pkgbase_flag_comment(request: Request, name: str):
return render_template(request, "pkgbase/flag-comment.html", context)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/keywords")
@handle_form_exceptions
async def pkgbase_keywords(
@ -139,6 +140,7 @@ async def pkgbase_flag_get(request: Request, name: str):
return render_template(request, "pkgbase/flag.html", context)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/flag")
@handle_form_exceptions
@requires_auth
@ -157,6 +159,8 @@ async def pkgbase_flag_post(
request, "pkgbase/flag.html", context, status_code=HTTPStatus.BAD_REQUEST
)
validate.comment_raise_http_ex(comments)
has_cred = request.user.has_credential(creds.PKGBASE_FLAG)
if has_cred and not pkgbase.OutOfDateTS:
now = time.utcnow()
@ -170,6 +174,7 @@ async def pkgbase_flag_post(
return RedirectResponse(f"/pkgbase/{name}", status_code=HTTPStatus.SEE_OTHER)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/comments")
@handle_form_exceptions
@requires_auth
@ -182,8 +187,7 @@ async def pkgbase_comments_post(
"""Add a new comment via POST request."""
pkgbase = get_pkg_or_base(name, PackageBase)
if not comment:
raise HTTPException(status_code=HTTPStatus.BAD_REQUEST)
validate.comment_raise_http_ex(comment)
# If the provided comment is different than the record's version,
# update the db record.
@ -279,6 +283,7 @@ async def pkgbase_comment_edit(
return render_template(request, "pkgbase/comments/edit.html", context)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/comments/{id}")
@handle_form_exceptions
@requires_auth
@ -289,14 +294,20 @@ async def pkgbase_comment_post(
comment: str = Form(default=str()),
enable_notifications: bool = Form(default=False),
next: str = Form(default=None),
cancel: bool = Form(default=False),
):
"""Edit an existing comment."""
if cancel:
return RedirectResponse(
f"/pkgbase/{name}#comment-{id}", status_code=HTTPStatus.SEE_OTHER
)
pkgbase = get_pkg_or_base(name, PackageBase)
db_comment = get_pkgbase_comment(pkgbase, id)
if not comment:
raise HTTPException(status_code=HTTPStatus.BAD_REQUEST)
elif request.user.ID != db_comment.UsersID:
validate.comment_raise_http_ex(comment)
if request.user.ID != db_comment.UsersID:
raise HTTPException(status_code=HTTPStatus.UNAUTHORIZED)
# If the provided comment is different than the record's version,
@ -308,11 +319,14 @@ async def pkgbase_comment_post(
db_comment.Editor = request.user
db_comment.EditedTS = now
if enable_notifications:
with db.begin():
db_notif = request.user.notifications.filter(
PackageNotification.PackageBaseID == pkgbase.ID
).first()
if enable_notifications and not db_notif:
if not db_notif:
db.create(PackageNotification, User=request.user, PackageBase=pkgbase)
update_comment_render_fastapi(db_comment)
if not next:
@ -324,6 +338,7 @@ async def pkgbase_comment_post(
)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/comments/{id}/pin")
@handle_form_exceptions
@requires_auth
@ -362,6 +377,7 @@ async def pkgbase_comment_pin(
return RedirectResponse(next, status_code=HTTPStatus.SEE_OTHER)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/comments/{id}/unpin")
@handle_form_exceptions
@requires_auth
@ -399,6 +415,7 @@ async def pkgbase_comment_unpin(
return RedirectResponse(next, status_code=HTTPStatus.SEE_OTHER)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/comments/{id}/delete")
@handle_form_exceptions
@requires_auth
@ -440,6 +457,7 @@ async def pkgbase_comment_delete(
return RedirectResponse(next, status_code=HTTPStatus.SEE_OTHER)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/comments/{id}/undelete")
@handle_form_exceptions
@requires_auth
@ -482,6 +500,7 @@ async def pkgbase_comment_undelete(
return RedirectResponse(next, status_code=HTTPStatus.SEE_OTHER)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/vote")
@handle_form_exceptions
@requires_auth
@ -501,6 +520,7 @@ async def pkgbase_vote(request: Request, name: str):
return RedirectResponse(f"/pkgbase/{name}", status_code=HTTPStatus.SEE_OTHER)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/unvote")
@handle_form_exceptions
@requires_auth
@ -519,6 +539,7 @@ async def pkgbase_unvote(request: Request, name: str):
return RedirectResponse(f"/pkgbase/{name}", status_code=HTTPStatus.SEE_OTHER)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/notify")
@handle_form_exceptions
@requires_auth
@ -528,6 +549,7 @@ async def pkgbase_notify(request: Request, name: str):
return RedirectResponse(f"/pkgbase/{name}", status_code=HTTPStatus.SEE_OTHER)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/unnotify")
@handle_form_exceptions
@requires_auth
@ -537,6 +559,7 @@ async def pkgbase_unnotify(request: Request, name: str):
return RedirectResponse(f"/pkgbase/{name}", status_code=HTTPStatus.SEE_OTHER)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/unflag")
@handle_form_exceptions
@requires_auth
@ -567,6 +590,7 @@ async def pkgbase_disown_get(
return render_template(request, "pkgbase/disown.html", context)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/disown")
@handle_form_exceptions
@requires_auth
@ -579,6 +603,9 @@ async def pkgbase_disown_post(
):
pkgbase = get_pkg_or_base(name, PackageBase)
if comments:
validate.comment_raise_http_ex(comments)
comaints = {c.User for c in pkgbase.comaintainers}
approved = [pkgbase.Maintainer] + list(comaints)
has_cred = request.user.has_credential(creds.PKGBASE_DISOWN, approved=approved)
@ -587,6 +614,9 @@ async def pkgbase_disown_post(
context = templates.make_context(request, "Disown Package")
context["pkgbase"] = pkgbase
context["is_maint"] = request.user == pkgbase.Maintainer
context["is_comaint"] = request.user in comaints
if not confirm:
context["errors"] = [
(
@ -610,12 +640,11 @@ async def pkgbase_disown_post(
request, "pkgbase/disown.html", context, status_code=HTTPStatus.BAD_REQUEST
)
if not next:
next = f"/pkgbase/{name}"
next = next or f"/pkgbase/{name}"
return RedirectResponse(next, status_code=HTTPStatus.SEE_OTHER)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/adopt")
@handle_form_exceptions
@requires_auth
@ -658,6 +687,7 @@ async def pkgbase_comaintainers(request: Request, name: str) -> Response:
return render_template(request, "pkgbase/comaintainers.html", context)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/comaintainers")
@handle_form_exceptions
@requires_auth
@ -714,6 +744,7 @@ async def pkgbase_request(
return render_template(request, "pkgbase/request.html", context)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/request")
@handle_form_exceptions
@requires_auth
@ -816,6 +847,7 @@ async def pkgbase_delete_get(
return render_template(request, "pkgbase/delete.html", context)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/delete")
@handle_form_exceptions
@requires_auth
@ -845,6 +877,7 @@ async def pkgbase_delete_post(
)
if comments:
validate.comment_raise_http_ex(comments)
# Update any existing deletion requests' ClosureComment.
with db.begin():
requests = pkgbase.requests.filter(
@ -880,7 +913,9 @@ async def pkgbase_merge_get(
# Perhaps additionally: bad_credential_status_code(creds.PKGBASE_MERGE).
# Don't take these examples verbatim. We should find good naming.
if not request.user.has_credential(creds.PKGBASE_MERGE):
context["errors"] = ["Only Trusted Users and Developers can merge packages."]
context["errors"] = [
"Only Package Maintainers and Developers can merge packages."
]
status_code = HTTPStatus.UNAUTHORIZED
return render_template(
@ -888,6 +923,7 @@ async def pkgbase_merge_get(
)
@db.async_retry_deadlock
@router.post("/pkgbase/{name}/merge")
@handle_form_exceptions
@requires_auth
@ -905,7 +941,9 @@ async def pkgbase_merge_post(
# TODO: Lookup errors from credential instead of hardcoding them.
if not request.user.has_credential(creds.PKGBASE_MERGE):
context["errors"] = ["Only Trusted Users and Developers can merge packages."]
context["errors"] = [
"Only Package Maintainers and Developers can merge packages."
]
return render_template(
request, "pkgbase/merge.html", context, status_code=HTTPStatus.UNAUTHORIZED
)
@ -933,6 +971,9 @@ async def pkgbase_merge_post(
request, "pkgbase/merge.html", context, status_code=HTTPStatus.BAD_REQUEST
)
if comments:
validate.comment_raise_http_ex(comments)
with db.begin():
update_closure_comment(pkgbase, MERGE_ID, comments, target=target)
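The decorator stack added throughout this file always follows the same order; a sketch with a hypothetical route, using the imports already present in the file:

@db.async_retry_deadlock                   # re-run the handler if the DB reports a deadlock
@router.post("/pkgbase/{name}/example")    # hypothetical route, for illustration only
@handle_form_exceptions                    # render form errors instead of raising
@requires_auth                             # redirect guests to the login page
async def pkgbase_example(request: Request, name: str):
    ...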

View file

@ -2,46 +2,112 @@ from http import HTTPStatus
from fastapi import APIRouter, Form, Query, Request
from fastapi.responses import RedirectResponse
from sqlalchemy import case
from sqlalchemy import case, orm
from aurweb import db, defaults, time, util
from aurweb.auth import creds, requires_auth
from aurweb.exceptions import handle_form_exceptions
from aurweb.models import PackageRequest, User
from aurweb.models.package_request import PENDING_ID, REJECTED_ID
from aurweb.models import PackageBase, PackageRequest, User
from aurweb.models.package_request import (
ACCEPTED_ID,
CLOSED_ID,
PENDING_ID,
REJECTED_ID,
)
from aurweb.requests.util import get_pkgreq_by_id
from aurweb.scripts import notify
from aurweb.statistics import get_request_counts
from aurweb.templates import make_context, render_template
FILTER_PARAMS = {
"filter_pending",
"filter_closed",
"filter_accepted",
"filter_rejected",
"filter_maintainers_requests",
}
router = APIRouter()
@router.get("/requests")
@requires_auth
async def requests(
async def requests( # noqa: C901
request: Request,
O: int = Query(default=defaults.O),
PP: int = Query(default=defaults.PP),
filter_pending: bool = False,
filter_closed: bool = False,
filter_accepted: bool = False,
filter_rejected: bool = False,
filter_maintainer_requests: bool = False,
filter_pkg_name: str = None,
):
context = make_context(request, "Requests")
context["q"] = dict(request.query_params)
O, PP = util.sanitize_params(O, PP)
# Set pending filter by default if no status filter was provided.
# In case we got a package name filter, but no status filter,
# we enable the other ones too.
if not dict(request.query_params).keys() & FILTER_PARAMS:
filter_pending = True
if filter_pkg_name:
filter_closed = True
filter_accepted = True
filter_rejected = True
O, PP = util.sanitize_params(str(O), str(PP))
context["O"] = O
context["PP"] = PP
context["filter_pending"] = filter_pending
context["filter_closed"] = filter_closed
context["filter_accepted"] = filter_accepted
context["filter_rejected"] = filter_rejected
context["filter_maintainer_requests"] = filter_maintainer_requests
context["filter_pkg_name"] = filter_pkg_name
# A PackageRequest query, with left inner joined User and RequestType.
query = db.query(PackageRequest).join(User, User.ID == PackageRequest.UsersID)
Maintainer = orm.aliased(User)
# A PackageRequest query
query = (
db.query(PackageRequest)
.join(PackageBase)
.join(User, PackageRequest.UsersID == User.ID, isouter=True)
.join(Maintainer, PackageBase.MaintainerUID == Maintainer.ID, isouter=True)
)
# Requests statistics
counts = get_request_counts()
for k in counts:
context[k] = counts[k]
# Apply status filters
in_filters = []
if filter_pending:
in_filters.append(PENDING_ID)
if filter_closed:
in_filters.append(CLOSED_ID)
if filter_accepted:
in_filters.append(ACCEPTED_ID)
if filter_rejected:
in_filters.append(REJECTED_ID)
filtered = query.filter(PackageRequest.Status.in_(in_filters))
# Name filter (contains)
if filter_pkg_name:
filtered = filtered.filter(PackageBase.Name.like(f"%{filter_pkg_name}%"))
# Additionally filter for requests made from package maintainer
if filter_maintainer_requests:
filtered = filtered.filter(PackageRequest.UsersID == PackageBase.MaintainerUID)
# If the request user is not elevated (TU or Dev), then
# filter PackageRequests which are owned by the request user.
if not request.user.is_elevated():
query = query.filter(PackageRequest.UsersID == request.user.ID)
filtered = filtered.filter(PackageRequest.UsersID == request.user.ID)
context["total"] = query.count()
context["total"] = filtered.count()
context["results"] = (
query.order_by(
filtered.order_by(
# Order primarily by the Status column being PENDING_ID,
# and secondarily by RequestTS; both in descending order.
case([(PackageRequest.Status == PENDING_ID, 1)], else_=0).desc(),
@ -51,14 +117,12 @@ async def requests(
.offset(O)
.all()
)
return render_template(request, "requests.html", context)
@router.get("/requests/{id}/close")
@requires_auth
async def request_close(request: Request, id: int):
pkgreq = get_pkgreq_by_id(id)
if not request.user.is_elevated() and request.user != pkgreq.User:
# Request user doesn't have permission here: redirect to '/'.
@ -69,6 +133,7 @@ async def request_close(request: Request, id: int):
return render_template(request, "requests/close.html", context)
@db.async_retry_deadlock
@router.post("/requests/{id}/close")
@handle_form_exceptions
@requires_auth

View file

@ -1,3 +1,29 @@
"""
RPC API routing module
For legacy route documentation, see https://aur.archlinux.org/rpc
Legacy Routes:
- GET /rpc
- POST /rpc
Legacy example (version 5): /rpc?v=5&type=info&arg=my-package
For OpenAPI route documentation, see https://aur.archlinux.org/docs
OpenAPI Routes:
- GET /rpc/v{version}/info/{arg}
- GET /rpc/v{version}/info
- POST /rpc/v{version}/info
- GET /rpc/v{version}/search/{arg}
- GET /rpc/v{version}/search
- POST /rpc/v{version}/search
- GET /rpc/v{version}/suggest/{arg}
OpenAPI example (version 5): /rpc/v5/info/my-package
"""
import hashlib
import re
from http import HTTPStatus
@ -71,7 +97,6 @@ async def rpc_request(
args: Optional[list[str]] = [],
callback: Optional[str] = None,
):
# Create a handle to our RPC class.
rpc = RPC(version=v, type=type)
@ -156,7 +181,140 @@ async def rpc_post(
type: Optional[str] = Form(default=None),
by: Optional[str] = Form(default=defaults.RPC_SEARCH_BY),
arg: Optional[str] = Form(default=None),
args: Optional[list[str]] = Form(default=[], alias="arg[]"),
args: list[str] = Form(default=[], alias="arg[]"),
callback: Optional[str] = Form(default=None),
):
return await rpc_request(request, v, type, by, arg, args, callback)
@router.get("/rpc/v{version}/info/{name}")
async def rpc_openapi_info(request: Request, version: int, name: str):
return await rpc_request(
request,
version,
"info",
defaults.RPC_SEARCH_BY,
name,
[],
)
@router.get("/rpc/v{version}/info")
async def rpc_openapi_multiinfo(
request: Request,
version: int,
args: Optional[list[str]] = Query(default=[], alias="arg[]"),
):
arg = args.pop(0) if args else None
return await rpc_request(
request,
version,
"info",
defaults.RPC_SEARCH_BY,
arg,
args,
)
@router.post("/rpc/v{version}/info")
async def rpc_openapi_multiinfo_post(
request: Request,
version: int,
):
data = await request.json()
args = data.get("arg", [])
if not isinstance(args, list):
rpc = RPC(version, "info")
return JSONResponse(
rpc.error("the 'arg' parameter must be of array type"),
status_code=HTTPStatus.BAD_REQUEST,
)
arg = args.pop(0) if args else None
return await rpc_request(
request,
version,
"info",
defaults.RPC_SEARCH_BY,
arg,
args,
)
@router.get("/rpc/v{version}/search/{arg}")
async def rpc_openapi_search_arg(
request: Request,
version: int,
arg: str,
by: Optional[str] = Query(default=defaults.RPC_SEARCH_BY),
):
return await rpc_request(
request,
version,
"search",
by,
arg,
[],
)
@router.get("/rpc/v{version}/search")
async def rpc_openapi_search(
request: Request,
version: int,
arg: Optional[str] = Query(default=str()),
by: Optional[str] = Query(default=defaults.RPC_SEARCH_BY),
):
return await rpc_request(
request,
version,
"search",
by,
arg,
[],
)
@router.post("/rpc/v{version}/search")
async def rpc_openapi_search_post(
request: Request,
version: int,
):
data = await request.json()
by = data.get("by", defaults.RPC_SEARCH_BY)
if not isinstance(by, str):
rpc = RPC(version, "search")
return JSONResponse(
rpc.error("the 'by' parameter must be of string type"),
status_code=HTTPStatus.BAD_REQUEST,
)
arg = data.get("arg", str())
if not isinstance(arg, str):
rpc = RPC(version, "search")
return JSONResponse(
rpc.error("the 'arg' parameter must be of string type"),
status_code=HTTPStatus.BAD_REQUEST,
)
return await rpc_request(
request,
version,
"search",
by,
arg,
[],
)
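
The POST search variant likewise validates types: both "by" and "arg" must be strings. A sketch with placeholder values:

import requests

r = requests.post(
    "https://aur.archlinux.org/rpc/v5/search",
    json={"by": "name-desc", "arg": "some-term"},
)
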
@router.get("/rpc/v{version}/suggest/{arg}")
async def rpc_openapi_suggest(request: Request, version: int, arg: str):
return await rpc_request(
request,
version,
"suggest",
defaults.RPC_SEARCH_BY,
arg,
[],
)

View file

@ -1,21 +1,19 @@
from datetime import datetime
from fastapi import APIRouter, Request
from fastapi.responses import Response
from feedgen.feed import FeedGenerator
from aurweb import db, filters
from aurweb import config, db, filters
from aurweb.cache import lambda_cache
from aurweb.models import Package, PackageBase
router = APIRouter()
def make_rss_feed(request: Request, packages: list, date_attr: str):
def make_rss_feed(request: Request, packages: list):
"""Create an RSS Feed string for some packages.
:param request: A FastAPI request
:param packages: A list of packages to add to the RSS feed
:param date_attr: The date attribute (DB column) to use
:return: RSS Feed string
"""
@ -36,18 +34,11 @@ def make_rss_feed(request: Request, packages: list, date_attr: str):
entry = feed.add_entry(order="append")
entry.title(pkg.Name)
entry.link(href=f"{base}/packages/{pkg.Name}", rel="alternate")
entry.link(href=f"{base}/rss", rel="self", type="application/rss+xml")
entry.description(pkg.Description or str())
attr = getattr(pkg.PackageBase, date_attr)
dt = filters.timestamp_to_datetime(attr)
dt = filters.timestamp_to_datetime(pkg.Timestamp)
dt = filters.as_timezone(dt, request.user.Timezone)
entry.pubDate(dt.strftime("%Y-%m-%d %H:%M:%S%z"))
entry.source(f"{base}")
if pkg.PackageBase.Maintainer:
entry.author(author={"name": pkg.PackageBase.Maintainer.Username})
entry.guid(f"{pkg.Name} - {attr}")
entry.guid(f"{pkg.Name}-{pkg.Timestamp}")
return feed.rss_str()
@ -59,16 +50,18 @@ async def rss(request: Request):
.join(PackageBase)
.order_by(PackageBase.SubmittedTS.desc())
.limit(100)
.with_entities(
Package.Name,
Package.Description,
PackageBase.SubmittedTS.label("Timestamp"),
)
)
feed = make_rss_feed(request, packages, "SubmittedTS")
# We use Redis to cache the feedgen results
cache_expire = config.getint("cache", "expiry_time_rss", 300)
feed = lambda_cache("rss", lambda: make_rss_feed(request, packages), cache_expire)
response = Response(feed, media_type="application/rss+xml")
package = packages.first()
if package:
dt = datetime.utcfromtimestamp(package.PackageBase.SubmittedTS)
modified = dt.strftime("%a, %d %m %Y %H:%M:%S GMT")
response.headers["Last-Modified"] = modified
return response
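
lambda_cache comes from aurweb.cache and is not shown in this diff; judging by the call sites here and in rss_modified below, it memoizes a callable's result under a key with an expiry. A minimal sketch of such a helper, assuming a Redis backend (hypothetical, not the actual aurweb implementation):

import pickle

from redis import Redis

redis_client = Redis()  # assumed connection; aurweb configures its own client

def lambda_cache(key: str, fn, expire: int = None):
    """Return the cached value for `key`, computing it via `fn` on a miss."""
    cached = redis_client.get(key)
    if cached is not None:
        return pickle.loads(cached)
    value = fn()
    redis_client.set(key, pickle.dumps(value), ex=expire)
    return value
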
@ -79,14 +72,18 @@ async def rss_modified(request: Request):
.join(PackageBase)
.order_by(PackageBase.ModifiedTS.desc())
.limit(100)
.with_entities(
Package.Name,
Package.Description,
PackageBase.ModifiedTS.label("Timestamp"),
)
)
# We use Redis to cache the feedgen results
cache_expire = config.getint("cache", "expiry_time_rss", 300)
feed = lambda_cache(
"rss_modified", lambda: make_rss_feed(request, packages), cache_expire
)
feed = make_rss_feed(request, packages, "ModifiedTS")
response = Response(feed, media_type="application/rss+xml")
package = packages.first()
if package:
dt = datetime.utcfromtimestamp(package.PackageBase.ModifiedTS)
modified = dt.strftime("%a, %d %m %Y %H:%M:%S GMT")
response.headers["Last-Modified"] = modified
return response

View file

@ -80,7 +80,9 @@ def open_session(request, conn, user_id):
conn.execute(
Users.update()
.where(Users.c.ID == user_id)
.values(LastLogin=int(time.time()), LastLoginIPAddress=request.client.host)
.values(
LastLogin=int(time.time()), LastLoginIPAddress=util.get_client_ip(request)
)
)
return sid
@ -110,7 +112,7 @@ async def authenticate(
Receive an OpenID Connect ID token, validate it, then process it to create
a new AUR session.
"""
if is_ip_banned(conn, request.client.host):
if is_ip_banned(conn, util.get_client_ip(request)):
_ = get_translator_for_request(request)
raise HTTPException(
status_code=HTTPStatus.FORBIDDEN,

View file

@ -6,9 +6,10 @@ from fastapi.responses import HTMLResponse
from sqlalchemy import and_, literal, orm
import aurweb.config as config
from aurweb import db, defaults, models
from aurweb import db, defaults, models, time
from aurweb.exceptions import RPCError
from aurweb.filters import number_format
from aurweb.models.package_base import popularity
from aurweb.packages.search import RPCSearch
TYPE_MAPPING = {
@ -82,10 +83,24 @@ class RPC:
"makedepends",
"optdepends",
"checkdepends",
"provides",
"conflicts",
"replaces",
"groups",
"submitter",
"keywords",
"comaintainers",
}
# A mapping of by aliases.
BY_ALIASES = {"name-desc": "nd", "name": "n", "maintainer": "m"}
BY_ALIASES = {
"name-desc": "nd",
"name": "n",
"maintainer": "m",
"submitter": "s",
"keywords": "k",
"comaintainers": "c",
}
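
With the added aliases, callers can select the new criteria directly; a sketch with placeholder names, again using the requests library:

import requests

r = requests.get(
    "https://aur.archlinux.org/rpc/v5/search/some-user",
    params={"by": "submitter"},  # also: keywords, comaintainers
)
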
def __init__(self, version: int = 0, type: str = None) -> "RPC":
self.version = version
@ -120,16 +135,15 @@ class RPC:
if not args:
raise RPCError("No request type/data specified.")
def _get_json_data(self, package: models.Package) -> dict[str, Any]:
def get_json_data(self, package: models.Package) -> dict[str, Any]:
"""Produce dictionary data of one Package that can be JSON-serialized.
:param package: Package instance
:returns: JSON-serializable dictionary
"""
# Produce RPC API compatible Popularity: If zero, it's an integer
# 0, otherwise, it's formatted to the 6th decimal place.
pop = package.Popularity
# Normalize Popularity for RPC output to 6 decimal precision
pop = popularity(package, time.utcnow())
pop = 0 if not pop else float(number_format(pop, 6))
snapshot_uri = config.get("options", "snapshot_uri")
@ -140,6 +154,7 @@ class RPC:
"PackageBase": package.PackageBaseName,
# Maintainer should be set following this update if one exists.
"Maintainer": package.Maintainer,
"Submitter": package.Submitter,
"Version": package.Version,
"Description": package.Description,
"URL": package.URL,
@ -151,8 +166,8 @@ class RPC:
"LastModified": package.ModifiedTS,
}
def _get_info_json_data(self, package: models.Package) -> dict[str, Any]:
data = self._get_json_data(package)
def get_info_json_data(self, package: models.Package) -> dict[str, Any]:
data = self.get_json_data(package)
# All info results have _at least_ an empty list of
# License and Keywords.
@ -176,50 +191,39 @@ class RPC:
"""
return [data_generator(pkg) for pkg in packages]
def _entities(self, query: orm.Query) -> orm.Query:
def entities(self, query: orm.Query) -> orm.Query:
"""Select specific RPC columns on `query`."""
return query.with_entities(
models.Package.ID,
models.Package.Name,
models.Package.Version,
models.Package.Description,
models.Package.URL,
models.Package.PackageBaseID,
models.PackageBase.Name.label("PackageBaseName"),
models.PackageBase.NumVotes,
models.PackageBase.Popularity,
models.PackageBase.OutOfDateTS,
models.PackageBase.SubmittedTS,
models.PackageBase.ModifiedTS,
models.User.Username.label("Maintainer"),
).group_by(models.Package.ID)
Submitter = orm.aliased(models.User)
def _handle_multiinfo_type(
self, args: list[str] = [], **kwargs
) -> list[dict[str, Any]]:
self._enforce_args(args)
args = set(args)
packages = (
db.query(models.Package)
.join(models.PackageBase)
.join(
models.User,
models.User.ID == models.PackageBase.MaintainerUID,
query = (
query.join(
Submitter,
Submitter.ID == models.PackageBase.SubmitterUID,
isouter=True,
)
.filter(models.Package.Name.in_(args))
.with_entities(
models.Package.ID,
models.Package.Name,
models.Package.Version,
models.Package.Description,
models.Package.URL,
models.Package.PackageBaseID,
models.PackageBase.Name.label("PackageBaseName"),
models.PackageBase.NumVotes,
models.PackageBase.Popularity,
models.PackageBase.PopularityUpdated,
models.PackageBase.OutOfDateTS,
models.PackageBase.SubmittedTS,
models.PackageBase.ModifiedTS,
models.User.Username.label("Maintainer"),
Submitter.Username.label("Submitter"),
)
.group_by(models.Package.ID)
)
max_results = config.getint("options", "max_rpc_results")
packages = self._entities(packages).limit(max_results + 1)
return query
if packages.count() > max_results:
raise RPCError("Too many package results.")
ids = {pkg.ID for pkg in packages}
# Aliases for 80-width.
def subquery(self, ids: set[int]):
Package = models.Package
PackageKeyword = models.PackageKeyword
@ -294,6 +298,22 @@ class RPC:
)
.distinct()
.order_by("Name"),
# Co-Maintainer
db.query(models.PackageComaintainer)
.join(models.User, models.User.ID == models.PackageComaintainer.UsersID)
.join(
models.Package,
models.Package.PackageBaseID
== models.PackageComaintainer.PackageBaseID,
)
.with_entities(
models.Package.ID,
literal("CoMaintainers").label("Type"),
models.User.Username.label("Name"),
literal(str()).label("Cond"),
)
.distinct() # A package could have the same co-maintainer multiple times
.order_by("Name"),
]
# Union all subqueries together.
@ -311,7 +331,33 @@ class RPC:
self.extra_info[record.ID][type_].append(name)
return self._assemble_json_data(packages, self._get_info_json_data)
def _handle_multiinfo_type(
self, args: list[str] = [], **kwargs
) -> list[dict[str, Any]]:
self._enforce_args(args)
args = set(args)
packages = (
db.query(models.Package)
.join(models.PackageBase)
.join(
models.User,
models.User.ID == models.PackageBase.MaintainerUID,
isouter=True,
)
.filter(models.Package.Name.in_(args))
)
max_results = config.getint("options", "max_rpc_results")
packages = self.entities(packages).limit(max_results + 1)
if packages.count() > max_results:
raise RPCError("Too many package results.")
ids = {pkg.ID for pkg in packages}
self.subquery(ids)
return self._assemble_json_data(packages, self.get_info_json_data)
def _handle_search_type(
self, by: str = defaults.RPC_SEARCH_BY, args: list[str] = []
@ -330,12 +376,28 @@ class RPC:
search.search_by(by, arg)
max_results = config.getint("options", "max_rpc_results")
results = self._entities(search.results()).limit(max_results + 1).all()
query = self.entities(search.results()).limit(max_results + 1)
# For "provides", we need to union our relation search
# with an exact search since a package always provides itself.
# Turns out that doing this with an OR statement is extremely slow
if by == "provides":
search = RPCSearch()
search._search_by_exact_name(arg)
query = query.union(self.entities(search.results()))
results = query.all()
if len(results) > max_results:
raise RPCError("Too many package results.")
return self._assemble_json_data(results, self._get_json_data)
data = self._assemble_json_data(results, self.get_json_data)
# remove Submitter for search results
for pkg in data:
pkg.pop("Submitter")
return data
def _handle_msearch_type(
self, args: list[str] = [], **kwargs
@ -350,12 +412,7 @@ class RPC:
packages = (
db.query(models.Package.Name)
.join(models.PackageBase)
.filter(
and_(
models.PackageBase.PackagerUID.isnot(None),
models.Package.Name.like(f"{arg}%"),
)
)
.filter(models.Package.Name.like(f"{arg}%"))
.order_by(models.Package.Name.asc())
.limit(20)
)
@ -368,12 +425,7 @@ class RPC:
arg = args[0]
packages = (
db.query(models.PackageBase.Name)
.filter(
and_(
models.PackageBase.PackagerUID.isnot(None),
models.PackageBase.Name.like(f"{arg}%"),
)
)
.filter(models.PackageBase.Name.like(f"{arg}%"))
.order_by(models.PackageBase.Name.asc())
.limit(20)
)

View file

@ -5,7 +5,6 @@ Changes here should always be accompanied by an Alembic migration, which can be
usually be automatically generated. See `migrations/README` for details.
"""
from sqlalchemy import (
CHAR,
TIMESTAMP,
@ -108,6 +107,12 @@ Users = Table(
Column("OwnershipNotify", TINYINT(1), nullable=False, server_default=text("1")),
Column("SSOAccountID", String(255), nullable=True, unique=True),
Index("UsersAccountTypeID", "AccountTypeID"),
Column(
"HideDeletedComments",
TINYINT(unsigned=True),
nullable=False,
server_default=text("0"),
),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
@ -155,6 +160,12 @@ PackageBases = Table(
nullable=False,
server_default=text("0"),
),
Column(
"PopularityUpdated",
TIMESTAMP,
nullable=False,
server_default=text("'1970-01-01 00:00:01.000000'"),
),
Column("OutOfDateTS", BIGINT(unsigned=True)),
Column("FlaggerComment", Text, nullable=False),
Column("SubmittedTS", BIGINT(unsigned=True), nullable=False),
@ -172,6 +183,8 @@ PackageBases = Table(
Index("BasesNumVotes", "NumVotes"),
Index("BasesPackagerUID", "PackagerUID"),
Index("BasesSubmitterUID", "SubmitterUID"),
Index("BasesSubmittedTS", "SubmittedTS"),
Index("BasesModifiedTS", "ModifiedTS"),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
@ -195,6 +208,7 @@ PackageKeywords = Table(
nullable=False,
server_default=text("''"),
),
Index("KeywordsPackageBaseID", "PackageBaseID"),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
mysql_collate="utf8mb4_general_ci",
@ -513,8 +527,8 @@ PackageRequests = Table(
# Vote information
TU_VoteInfo = Table(
"TU_VoteInfo",
VoteInfo = Table(
"VoteInfo",
metadata,
Column("ID", INTEGER(unsigned=True), primary_key=True),
Column("Agenda", Text, nullable=False),
@ -533,7 +547,10 @@ TU_VoteInfo = Table(
"Abstain", INTEGER(unsigned=True), nullable=False, server_default=text("'0'")
),
Column(
"ActiveTUs", INTEGER(unsigned=True), nullable=False, server_default=text("'0'")
"ActiveUsers",
INTEGER(unsigned=True),
nullable=False,
server_default=text("'0'"),
),
mysql_engine="InnoDB",
mysql_charset="utf8mb4",
@ -542,10 +559,10 @@ TU_VoteInfo = Table(
# Individual vote records
TU_Votes = Table(
"TU_Votes",
Votes = Table(
"Votes",
metadata,
Column("VoteID", ForeignKey("TU_VoteInfo.ID", ondelete="CASCADE"), nullable=False),
Column("VoteID", ForeignKey("VoteInfo.ID", ondelete="CASCADE"), nullable=False),
Column("UserID", ForeignKey("Users.ID", ondelete="CASCADE"), nullable=False),
mysql_engine="InnoDB",
)

View file

@ -6,6 +6,7 @@ See `aurweb-adduser --help` for documentation.
Copyright (C) 2022 aurweb Development Team
All Rights Reserved
"""
import argparse
import sys
import traceback

View file

@ -49,6 +49,7 @@ def _main(force: bool = False):
.all()
)
# delete providers not existing in any of our alpm repos
for name, provides in old_providers.difference(providers):
db.delete_all(
db.query(OfficialProvider).filter(
@ -59,10 +60,20 @@ def _main(force: bool = False):
)
)
# add new providers that do not yet exist in our DB
for name, provides in providers.difference(old_providers):
repo = repomap.get((name, provides))
db.create(OfficialProvider, Name=name, Repo=repo, Provides=provides)
# update providers where a pkg was moved from one repo to another
all_providers = db.query(OfficialProvider)
for op in all_providers:
new_repo = repomap.get((op.Name, op.Provides))
if op.Repo != new_repo:
op.Repo = new_repo
def main(force: bool = False):
db.get_engine()

View file

@ -3,6 +3,7 @@ Perform an action on the aurweb config.
When AUR_CONFIG_IMMUTABLE is set, the `set` action is noop.
"""
import argparse
import configparser
import os

View file

@ -0,0 +1,125 @@
import argparse
import importlib
import os
import sys
import traceback
from datetime import UTC, datetime
import orjson
import pygit2
from aurweb import config
# Constants
REF = "refs/heads/master"
ORJSON_OPTS = orjson.OPT_SORT_KEYS | orjson.OPT_INDENT_2
def init_repository(git_info) -> None:
pygit2.init_repository(git_info.path)
repo = pygit2.Repository(git_info.path)
for k, v in git_info.config.items():
repo.config[k] = v
def parse_args():
parser = argparse.ArgumentParser()
parser.add_argument(
"--spec",
type=str,
required=True,
help="name of spec module in the aurweb.archives.spec package",
)
return parser.parse_args()
def update_repository(repo: pygit2.Repository):
# Use git status to determine file changes
has_changes = False
changes = repo.status()
for filepath, flags in changes.items():
if flags != pygit2.GIT_STATUS_CURRENT:
has_changes = True
break
if has_changes:
print("diff detected, committing")
# Add everything in the tree.
print("adding files to git tree")
# Add the tree to staging
repo.index.read()
repo.index.add_all()
repo.index.write()
tree = repo.index.write_tree()
# Determine base commit; if repo.head.target raises GitError,
# we have no current commits
try:
base = [repo.head.target]
except pygit2.GitError:
base = []
utcnow = datetime.now(UTC)
author = pygit2.Signature(
config.get("git-archive", "author"),
config.get("git-archive", "author-email"),
int(utcnow.timestamp()),
0,
)
# Commit the changes
timestamp = utcnow.strftime("%Y-%m-%d %H:%M:%S")
title = f"update - {timestamp}"
repo.create_commit(REF, author, author, title, tree, base)
print("committed changes")
else:
print("no diff detected")
def main() -> int:
args = parse_args()
print(f"loading '{args.spec}' spec")
spec_package = "aurweb.archives.spec"
module_path = f"{spec_package}.{args.spec}"
spec_module = importlib.import_module(module_path)
print(f"loaded '{args.spec}'")
# Track repositories that the spec modifies. After we run
# through specs, we want to make a single commit for all
# repositories that contain changes.
repos = dict()
print(f"running '{args.spec}' spec...")
spec = spec_module.Spec()
for output in spec.generate():
if not os.path.exists(output.git_info.path / ".git"):
init_repository(output.git_info)
path = output.git_info.path / output.filename
with open(path, "wb") as f:
f.write(output.data)
if output.git_info.path not in repos:
repos[output.git_info.path] = pygit2.Repository(output.git_info.path)
print(f"done running '{args.spec}' spec")
print("processing repositories")
for path in spec.repos:
print(f"processing repository: {path}")
update_repository(pygit2.Repository(path))
return 0
if __name__ == "__main__":
try:
sys.exit(main())
except KeyboardInterrupt:
sys.exit(0)
except Exception:
traceback.print_exc()
sys.exit(1)
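
A hedged invocation sketch: neither the installed entry point nor a concrete spec name appears in this diff, so both are placeholders here; per parse_args(), --spec names a module inside the aurweb.archives.spec package:

$ python -m aurweb.scripts.git_archive --spec packages
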

View file

@ -24,7 +24,6 @@ import io
import os
import shutil
import sys
import tempfile
from collections import defaultdict
from typing import Any
@ -32,11 +31,11 @@ import orjson
from sqlalchemy import literal, orm
import aurweb.config
from aurweb import db, filters, logging, models, util
from aurweb import aur_logging, db, filters, models, util
from aurweb.benchmark import Benchmark
from aurweb.models import Package, PackageBase, User
logger = logging.get_logger("aurweb.scripts.mkpkglists")
logger = aur_logging.get_logger("aurweb.scripts.mkpkglists")
TYPE_MAP = {
@ -95,7 +94,7 @@ def get_extended_fields():
models.PackageDependency.DepName.label("Name"),
models.PackageDependency.DepCondition.label("Cond"),
)
.distinct()
.distinct() # A package could have the same dependency multiple times
.order_by("Name"),
# PackageRelation
db.query(models.PackageRelation)
@ -106,7 +105,7 @@ def get_extended_fields():
models.PackageRelation.RelName.label("Name"),
models.PackageRelation.RelCondition.label("Cond"),
)
.distinct()
.distinct() # A package could have the same relation multiple times
.order_by("Name"),
# Groups
db.query(models.PackageGroup)
@ -117,7 +116,6 @@ def get_extended_fields():
models.Group.Name.label("Name"),
literal(str()).label("Cond"),
)
.distinct()
.order_by("Name"),
# Licenses
db.query(models.PackageLicense)
@ -128,7 +126,6 @@ def get_extended_fields():
models.License.Name.label("Name"),
literal(str()).label("Cond"),
)
.distinct()
.order_by("Name"),
# Keywords
db.query(models.PackageKeyword)
@ -141,7 +138,21 @@ def get_extended_fields():
models.PackageKeyword.Keyword.label("Name"),
literal(str()).label("Cond"),
)
.distinct()
.order_by("Name"),
# Co-Maintainer
db.query(models.PackageComaintainer)
.join(models.User, models.User.ID == models.PackageComaintainer.UsersID)
.join(
models.Package,
models.Package.PackageBaseID == models.PackageComaintainer.PackageBaseID,
)
.with_entities(
models.Package.ID,
literal("CoMaintainers").label("Type"),
models.User.Username.label("Name"),
literal(str()).label("Cond"),
)
.distinct() # A package could have the same co-maintainer multiple times
.order_by("Name"),
]
query = subqueries[0].union_all(*subqueries[1:])
@ -164,6 +175,7 @@ def as_dict(package: Package) -> dict[str, Any]:
"Popularity": float(package.Popularity),
"OutOfDate": package.OutOfDate,
"Maintainer": package.Maintainer,
"Submitter": package.Submitter,
"FirstSubmitted": package.FirstSubmitted,
"LastModified": package.LastModified,
}
@ -188,13 +200,16 @@ def _main():
USERS = aurweb.config.get("mkpkglists", "userfile")
bench = Benchmark()
logger.warning(f"{sys.argv[0]} is deprecated and will soon be removed")
logger.info("Started re-creating archives, wait a while...")
Submitter = orm.aliased(User)
query = (
db.query(Package)
.join(PackageBase, PackageBase.ID == Package.PackageBaseID)
.join(User, PackageBase.MaintainerUID == User.ID, isouter=True)
.filter(PackageBase.PackagerUID.isnot(None))
.join(Submitter, PackageBase.SubmitterUID == Submitter.ID, isouter=True)
.with_entities(
Package.ID,
Package.Name,
@ -207,10 +222,10 @@ def _main():
PackageBase.Popularity,
PackageBase.OutOfDateTS.label("OutOfDate"),
User.Username.label("Maintainer"),
Submitter.Username.label("Submitter"),
PackageBase.SubmittedTS.label("FirstSubmitted"),
PackageBase.ModifiedTS.label("LastModified"),
)
.distinct()
.order_by("Name")
)
@ -218,13 +233,14 @@ def _main():
output = list()
snapshot_uri = aurweb.config.get("options", "snapshot_uri")
tmpdir = tempfile.mkdtemp()
tmp_packages = os.path.join(tmpdir, os.path.basename(PACKAGES))
tmp_meta = os.path.join(tmpdir, os.path.basename(META))
tmp_metaext = os.path.join(tmpdir, os.path.basename(META_EXT))
tmp_packages = f"{PACKAGES}.tmp"
tmp_meta = f"{META}.tmp"
tmp_metaext = f"{META_EXT}.tmp"
gzips = {
"packages": gzip.open(tmp_packages, "wt"),
"meta": gzip.open(tmp_meta, "wb"),
"packages": gzip.GzipFile(
filename=PACKAGES, mode="wb", fileobj=open(tmp_packages, "wb")
),
"meta": gzip.GzipFile(filename=META, mode="wb", fileobj=open(tmp_meta, "wb")),
}
# Append list opening to the metafile.
@ -233,7 +249,9 @@ def _main():
# Produce packages.gz + packages-meta-ext-v1.json.gz
extended = False
if len(sys.argv) > 1 and sys.argv[1] in EXTENDED_FIELD_HANDLERS:
gzips["meta_ext"] = gzip.open(tmp_metaext, "wb")
gzips["meta_ext"] = gzip.GzipFile(
filename=META_EXT, mode="wb", fileobj=open(tmp_metaext, "wb")
)
# Append list opening to the meta_ext file.
gzips.get("meta_ext").write(b"[\n")
f = EXTENDED_FIELD_HANDLERS.get(sys.argv[1])
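
The switch from gzip.open() on a temp path to gzip.GzipFile(filename=..., fileobj=...) writes the archive's final name into the gzip header even though the bytes go to a sibling .tmp file, which presumably keeps the header stable now that temporaries are named {target}.tmp rather than placed in a tmpdir. The pattern in isolation (paths are placeholders):

import gzip
import shutil

final = "packages.gz"  # placeholder path
tmp = f"{final}.tmp"

# Write to the .tmp file while embedding the final name in the gzip header:
gz = gzip.GzipFile(filename=final, mode="wb", fileobj=open(tmp, "wb"))
gz.write(b"example-package\n")
gz.close()

shutil.move(tmp, final)  # replace the target, as _main() does at the end
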
@ -242,28 +260,29 @@ def _main():
results = query.all()
n = len(results) - 1
for i, result in enumerate(results):
# Append to packages.gz.
gzips.get("packages").write(f"{result.Name}\n")
with io.TextIOWrapper(gzips.get("packages")) as p:
for i, result in enumerate(results):
# Append to packages.gz.
p.write(f"{result.Name}\n")
# Construct our result JSON dictionary.
item = as_dict(result)
item["URLPath"] = snapshot_uri % result.Name
# Construct our result JSON dictionary.
item = as_dict(result)
item["URLPath"] = snapshot_uri % result.Name
# We stream out package json objects line per line, so
# we also need to include the ',' character at the end
# of package lines (excluding the last package).
suffix = b",\n" if i < n else b"\n"
# We stream out package json objects line per line, so
# we also need to include the ',' character at the end
# of package lines (excluding the last package).
suffix = b",\n" if i < n else b"\n"
# Write out to packagesmetafile
output.append(item)
gzips.get("meta").write(orjson.dumps(output[-1]) + suffix)
# Write out to packagesmetafile
output.append(item)
gzips.get("meta").write(orjson.dumps(output[-1]) + suffix)
if extended:
# Write out to packagesmetaextfile.
data_ = data.get(result.ID, {})
output[-1].update(data_)
gzips.get("meta_ext").write(orjson.dumps(output[-1]) + suffix)
if extended:
# Write out to packagesmetaextfile.
data_ = data.get(result.ID, {})
output[-1].update(data_)
gzips.get("meta_ext").write(orjson.dumps(output[-1]) + suffix)
# Append the list closing to meta/meta_ext.
gzips.get("meta").write(b"]")
@ -274,15 +293,19 @@ def _main():
util.apply_all(gzips.values(), lambda gz: gz.close())
# Produce pkgbase.gz
query = db.query(PackageBase.Name).filter(PackageBase.PackagerUID.isnot(None)).all()
tmp_pkgbase = os.path.join(tmpdir, os.path.basename(PKGBASE))
with gzip.open(tmp_pkgbase, "wt") as f:
query = db.query(PackageBase.Name).all()
tmp_pkgbase = f"{PKGBASE}.tmp"
pkgbase_gzip = gzip.GzipFile(
filename=PKGBASE, mode="wb", fileobj=open(tmp_pkgbase, "wb")
)
with io.TextIOWrapper(pkgbase_gzip) as f:
f.writelines([f"{base.Name}\n" for i, base in enumerate(query)])
# Produce users.gz
query = db.query(User.Username).all()
tmp_users = os.path.join(tmpdir, os.path.basename(USERS))
with gzip.open(tmp_users, "wt") as f:
tmp_users = f"{USERS}.tmp"
users_gzip = gzip.GzipFile(filename=USERS, mode="wb", fileobj=open(tmp_users, "wb"))
with io.TextIOWrapper(users_gzip) as f:
f.writelines([f"{user.Username}\n" for i, user in enumerate(query)])
files = [
@ -296,7 +319,7 @@ def _main():
for src, dst in files:
checksum = sha256sum(src)
base = os.path.basename(src)
base = os.path.basename(dst)
checksum_formatted = f"SHA256 ({base}) = {checksum}"
checksum_file = f"{dst}.sha256"
@ -306,7 +329,6 @@ def _main():
# Move the new archive into its rightful place.
shutil.move(src, dst)
os.removedirs(tmpdir)
seconds = filters.number_format(bench.end(), 4)
logger.info(f"Completed in {seconds} seconds.")

View file

@ -13,16 +13,16 @@ import aurweb.config
import aurweb.db
import aurweb.filters
import aurweb.l10n
from aurweb import db, logging
from aurweb import aur_logging, db
from aurweb.models import PackageBase, User
from aurweb.models.package_comaintainer import PackageComaintainer
from aurweb.models.package_comment import PackageComment
from aurweb.models.package_notification import PackageNotification
from aurweb.models.package_request import PackageRequest
from aurweb.models.request_type import RequestType
from aurweb.models.tu_vote import TUVote
from aurweb.models.vote import Vote
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
aur_location = aurweb.config.get("options", "aur_location")
@ -45,6 +45,9 @@ class Notification:
def get_cc(self):
return []
def get_bcc(self):
return []
def get_body_fmt(self, lang):
body = ""
for line in self.get_body(lang).splitlines():
@ -114,7 +117,7 @@ class Notification:
server.login(user, passwd)
server.set_debuglevel(0)
deliver_to = [to] + self.get_cc()
deliver_to = [to] + self.get_cc() + self.get_bcc()
server.sendmail(sender, deliver_to, msg.as_bytes())
server.quit()
@ -131,7 +134,6 @@ class Notification:
class ResetKeyNotification(Notification):
def __init__(self, uid):
user = (
db.query(User)
.filter(and_(User.ID == uid, User.Suspended == 0))
@ -194,7 +196,6 @@ class WelcomeNotification(ResetKeyNotification):
class CommentNotification(Notification):
def __init__(self, uid, pkgbase_id, comment_id):
self._user = db.query(User.Username).filter(User.ID == uid).first().Username
self._pkgbase = (
db.query(PackageBase.Name).filter(PackageBase.ID == pkgbase_id).first().Name
@ -260,7 +261,6 @@ class CommentNotification(Notification):
class UpdateNotification(Notification):
def __init__(self, uid, pkgbase_id):
self._user = db.query(User.Username).filter(User.ID == uid).first().Username
self._pkgbase = (
db.query(PackageBase.Name).filter(PackageBase.ID == pkgbase_id).first().Name
@ -319,7 +319,6 @@ class UpdateNotification(Notification):
class FlagNotification(Notification):
def __init__(self, uid, pkgbase_id):
self._user = db.query(User.Username).filter(User.ID == uid).first().Username
self._pkgbase = (
db.query(PackageBase.Name).filter(PackageBase.ID == pkgbase_id).first().Name
@ -338,6 +337,7 @@ class FlagNotification(Notification):
.filter(and_(PackageBase.ID == pkgbase_id, User.Suspended == 0))
.with_entities(User.Email, User.LangPreference)
.distinct()
.order_by(User.Email)
)
self._recipients = [(u.Email, u.LangPreference) for u in query]
@ -375,7 +375,6 @@ class FlagNotification(Notification):
class OwnershipEventNotification(Notification):
def __init__(self, uid, pkgbase_id):
self._user = db.query(User.Username).filter(User.ID == uid).first().Username
self._pkgbase = (
db.query(PackageBase.Name).filter(PackageBase.ID == pkgbase_id).first().Name
@ -437,7 +436,6 @@ class DisownNotification(OwnershipEventNotification):
class ComaintainershipEventNotification(Notification):
def __init__(self, uid, pkgbase_id):
self._pkgbase = (
db.query(PackageBase.Name).filter(PackageBase.ID == pkgbase_id).first().Name
)
@ -482,7 +480,6 @@ class ComaintainerRemoveNotification(ComaintainershipEventNotification):
class DeleteNotification(Notification):
def __init__(self, uid, old_pkgbase_id, new_pkgbase_id=None):
self._user = db.query(User.Username).filter(User.ID == uid).first().Username
self._old_pkgbase = (
db.query(PackageBase.Name)
@ -560,7 +557,6 @@ class DeleteNotification(Notification):
class RequestOpenNotification(Notification):
def __init__(self, uid, reqid, reqtype, pkgbase_id, merge_into=None):
self._user = db.query(User.Username).filter(User.ID == uid).first().Username
self._pkgbase = (
db.query(PackageBase.Name).filter(PackageBase.ID == pkgbase_id).first().Name
@ -585,10 +581,11 @@ class RequestOpenNotification(Notification):
),
)
.filter(and_(PackageRequest.ID == reqid, User.Suspended == 0))
.with_entities(User.Email)
.with_entities(User.Email, User.HideEmail)
.distinct()
)
self._cc = [u.Email for u in query]
self._cc = [u.Email for u in query if u.HideEmail == 0]
self._bcc = [u.Email for u in query if u.HideEmail == 1]
pkgreq = (
db.query(PackageRequest.Comments).filter(PackageRequest.ID == reqid).first()
@ -605,6 +602,9 @@ class RequestOpenNotification(Notification):
def get_cc(self):
return self._cc
def get_bcc(self):
return self._bcc
def get_subject(self, lang):
return "[PRQ#%d] %s Request for %s" % (
self._reqid,
@ -672,10 +672,11 @@ class RequestCloseNotification(Notification):
),
)
.filter(and_(PackageRequest.ID == reqid, User.Suspended == 0))
.with_entities(User.Email)
.with_entities(User.Email, User.HideEmail)
.distinct()
)
self._cc = [u.Email for u in query]
self._cc = [u.Email for u in query if u.HideEmail == 0]
self._bcc = [u.Email for u in query if u.HideEmail == 1]
pkgreq = (
db.query(PackageRequest)
@ -702,6 +703,9 @@ class RequestCloseNotification(Notification):
def get_cc(self):
return self._cc
def get_bcc(self):
return self._bcc
def get_subject(self, lang):
return "[PRQ#%d] %s Request for %s %s" % (
self._reqid,
@ -740,11 +744,11 @@ class RequestCloseNotification(Notification):
return headers
class TUVoteReminderNotification(Notification):
class VoteReminderNotification(Notification):
def __init__(self, vote_id):
self._vote_id = int(vote_id)
subquery = db.query(TUVote.UserID).filter(TUVote.VoteID == vote_id)
subquery = db.query(Vote.UserID).filter(Vote.VoteID == vote_id)
query = (
db.query(User)
.filter(
@ -765,7 +769,7 @@ class TUVoteReminderNotification(Notification):
def get_subject(self, lang):
return aurweb.l10n.translator.translate(
"TU Vote Reminder: Proposal {id}", lang
"Package Maintainer Vote Reminder: Proposal {id}", lang
).format(id=self._vote_id)
def get_body(self, lang):
@ -776,7 +780,7 @@ class TUVoteReminderNotification(Notification):
).format(id=self._vote_id)
def get_refs(self):
return (aur_location + "/tu/?id=" + str(self._vote_id),)
return (aur_location + "/package-maintainer/?id=" + str(self._vote_id),)
def main():
@ -795,7 +799,7 @@ def main():
"delete": DeleteNotification,
"request-open": RequestOpenNotification,
"request-close": RequestCloseNotification,
"tu-vote-reminder": TUVoteReminderNotification,
"vote-reminder": VoteReminderNotification,
}
with db.begin():

View file

@ -17,6 +17,12 @@ def _main():
def main():
# Previously used to clean up "reserved" packages which never got pushed.
# Let's deactivate this for now since "setup-repo" is gone and we see
# another issue where deletion of a user account might cause unintended
# removal of a package (where PackagerUID account was deleted)
return
db.get_engine()
with db.begin():
_main()

View file

@ -1,9 +1,10 @@
#!/usr/bin/env python3
from datetime import datetime
from sqlalchemy import and_, func
from sqlalchemy.sql.functions import coalesce, sum as _sum
from aurweb import db, time
from aurweb import config, db, time
from aurweb.models import PackageBase, PackageVote
@ -46,13 +47,24 @@ def run_variable(pkgbases: list[PackageBase] = []) -> None:
ids = set()
if pkgbases:
# If `pkgbases` were given, we should forcefully update the given
# package base records' popularities.
ids = {pkgbase.ID for pkgbase in pkgbases}
query = query.filter(PackageBase.ID.in_(ids))
else:
# Otherwise, we should only update popularities which have exceeded
# the popularity interval length.
interval = config.getint("git-archive", "popularity-interval")
query = query.filter(
PackageBase.PopularityUpdated
<= datetime.fromtimestamp((now - interval))
)
query.update(
{
"NumVotes": votes_subq.scalar_subquery(),
"Popularity": pop_subq.scalar_subquery(),
"PopularityUpdated": datetime.fromtimestamp(now),
}
)
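
The interval is read from the [git-archive] section; a hedged example of what the configuration might look like (the value is illustrative, and the real default is not shown in this diff):

[git-archive]
; illustrative value: recompute a row's Popularity at most once per day
popularity-interval = 86400
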

View file

@ -9,10 +9,10 @@ import markdown
import pygit2
import aurweb.config
from aurweb import db, logging, util
from aurweb import aur_logging, db, util
from aurweb.models import PackageComment
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
class LinkifyExtension(markdown.extensions.Extension):
@ -46,7 +46,7 @@ class FlysprayLinksInlineProcessor(markdown.inlinepatterns.InlineProcessor):
el = Element("a")
el.set("href", f"https://bugs.archlinux.org/task/{m.group(1)}")
el.text = markdown.util.AtomicString(m.group(0))
return (el, m.start(0), m.end(0))
return el, m.start(0), m.end(0)
class FlysprayLinksExtension(markdown.extensions.Extension):
@ -72,9 +72,14 @@ class GitCommitsInlineProcessor(markdown.inlinepatterns.InlineProcessor):
def handleMatch(self, m, data):
oid = m.group(1)
if oid not in self._repo:
# Unknown OID; preserve the orginal text.
return (None, None, None)
# Lookup might raise ValueError in case multiple object IDs were found
try:
if oid not in self._repo:
# Unknown OID; preserve the orginal text.
return None, None, None
except ValueError:
# Multiple OID's found; preserve the orginal text.
return None, None, None
el = Element("a")
commit_uri = aurweb.config.get("options", "commit_uri")
@ -83,7 +88,7 @@ class GitCommitsInlineProcessor(markdown.inlinepatterns.InlineProcessor):
"href", commit_uri % (quote_plus(self._head), quote_plus(oid[:prefixlen]))
)
el.text = markdown.util.AtomicString(oid[:prefixlen])
return (el, m.start(0), m.end(0))
return el, m.start(0), m.end(0)
class GitCommitsExtension(markdown.extensions.Extension):
@ -116,6 +121,20 @@ class HeadingExtension(markdown.extensions.Extension):
md.treeprocessors.register(HeadingTreeprocessor(md), "heading", 30)
class StrikethroughInlineProcessor(markdown.inlinepatterns.InlineProcessor):
def handleMatch(self, m, data):
el = Element("del")
el.text = m.group(1)
return el, m.start(0), m.end(0)
class StrikethroughExtension(markdown.extensions.Extension):
def extendMarkdown(self, md):
pattern = r"~~(.*?)~~"
processor = StrikethroughInlineProcessor(pattern, md)
md.inlinePatterns.register(processor, "del", 40)
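
A quick way to exercise the new extension (assuming the classes above are importable from this module; the input string is arbitrary):

import markdown

html = markdown.markdown("~~deprecated~~", extensions=[StrikethroughExtension()])
print(html)  # expected: <p><del>deprecated</del></p>
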
def save_rendered_comment(comment: PackageComment, html: str):
with db.begin():
comment.RenderedComment = html
@ -132,15 +151,17 @@ def update_comment_render(comment: PackageComment) -> None:
html = markdown.markdown(
text,
extensions=[
"md_in_html",
"fenced_code",
LinkifyExtension(),
FlysprayLinksExtension(),
GitCommitsExtension(pkgbasename),
HeadingExtension(),
StrikethroughExtension(),
],
)
allowed_tags = bleach.sanitizer.ALLOWED_TAGS + [
allowed_tags = list(bleach.sanitizer.ALLOWED_TAGS) + [
"p",
"pre",
"h4",
@ -148,6 +169,9 @@ def update_comment_render(comment: PackageComment) -> None:
"h6",
"br",
"hr",
"del",
"details",
"summary",
]
html = bleach.clean(html, tags=allowed_tags)
save_rendered_comment(comment, html)

View file

@ -4,7 +4,7 @@ from sqlalchemy import and_
import aurweb.config
from aurweb import db, time
from aurweb.models import TUVoteInfo
from aurweb.models import VoteInfo
from aurweb.scripts import notify
notify_cmd = aurweb.config.get("notifications", "notify-cmd")
@ -15,17 +15,17 @@ def main():
now = time.utcnow()
start = aurweb.config.getint("tuvotereminder", "range_start")
start = aurweb.config.getint("votereminder", "range_start")
filter_from = now + start
end = aurweb.config.getint("tuvotereminder", "range_end")
end = aurweb.config.getint("votereminder", "range_end")
filter_to = now + end
query = db.query(TUVoteInfo.ID).filter(
and_(TUVoteInfo.End >= filter_from, TUVoteInfo.End <= filter_to)
query = db.query(VoteInfo.ID).filter(
and_(VoteInfo.End >= filter_from, VoteInfo.End <= filter_to)
)
for voteinfo in query:
notif = notify.TUVoteReminderNotification(voteinfo.ID)
notif = notify.VoteReminderNotification(voteinfo.ID)
notif.send()

View file

@ -7,7 +7,6 @@ This module uses a global state, since you can't open two servers with the same
configuration anyway.
"""
import argparse
import atexit
import os
@ -20,7 +19,6 @@ from typing import Iterable
import aurweb.config
import aurweb.schema
from aurweb.exceptions import AurwebException
children = []
temporary_dir = None
@ -28,9 +26,6 @@ verbosity = 0
asgi_backend = ""
workers = 1
PHP_BINARY = os.environ.get("PHP_BINARY", "php")
PHP_MODULES = ["pdo_mysql", "pdo_sqlite"]
PHP_NGINX_PORT = int(os.environ.get("PHP_NGINX_PORT", 8001))
FASTAPI_NGINX_PORT = int(os.environ.get("FASTAPI_NGINX_PORT", 8002))
@ -47,91 +42,55 @@ class ProcessExceptions(Exception):
super().__init__("\n- ".join(messages))
def validate_php_config() -> None:
"""
Perform a validation check against PHP_BINARY's configuration.
AurwebException is raised here if checks fail to pass. We require
the 'pdo_mysql' and 'pdo_sqlite' modules to be enabled.
:raises: AurwebException
:return: None
"""
try:
proc = subprocess.Popen(
[PHP_BINARY, "-m"], stdout=subprocess.PIPE, stderr=subprocess.PIPE
)
out, _ = proc.communicate()
except FileNotFoundError:
raise AurwebException(f"Unable to locate the '{PHP_BINARY}' " "executable.")
assert proc.returncode == 0, (
"Received non-zero error code " f"{proc.returncode} from '{PHP_BINARY}'."
)
modules = out.decode().splitlines()
for module in PHP_MODULES:
if module not in modules:
raise AurwebException(f"PHP does not have the '{module}' module enabled.")
def generate_nginx_config():
"""
Generate an nginx configuration based on aurweb's configuration.
The file is generated under `temporary_dir`.
Returns the path to the created configuration file.
"""
php_bind = aurweb.config.get("php", "bind_address")
php_host = php_bind.split(":")[0]
fastapi_bind = aurweb.config.get("fastapi", "bind_address")
fastapi_host = fastapi_bind.split(":")[0]
config_path = os.path.join(temporary_dir, "nginx.conf")
config = open(config_path, "w")
# We double nginx's braces because they conflict with Python's f-strings.
config.write(
f"""
events {{}}
daemon off;
error_log /dev/stderr info;
pid {os.path.join(temporary_dir, "nginx.pid")};
http {{
access_log /dev/stdout;
client_body_temp_path {os.path.join(temporary_dir, "client_body")};
proxy_temp_path {os.path.join(temporary_dir, "proxy")};
fastcgi_temp_path {os.path.join(temporary_dir, "fastcgi")}1 2;
uwsgi_temp_path {os.path.join(temporary_dir, "uwsgi")};
scgi_temp_path {os.path.join(temporary_dir, "scgi")};
server {{
listen {php_host}:{PHP_NGINX_PORT};
location / {{
proxy_pass http://{php_bind};
with open(config_path, "w") as config:
# We double nginx's braces because they conflict with Python's f-strings.
config.write(
f"""
events {{}}
daemon off;
error_log /dev/stderr info;
pid {os.path.join(temporary_dir, "nginx.pid")};
http {{
access_log /dev/stdout;
client_body_temp_path {os.path.join(temporary_dir, "client_body")};
proxy_temp_path {os.path.join(temporary_dir, "proxy")};
fastcgi_temp_path {os.path.join(temporary_dir, "fastcgi")}1 2;
uwsgi_temp_path {os.path.join(temporary_dir, "uwsgi")};
scgi_temp_path {os.path.join(temporary_dir, "scgi")};
server {{
listen {fastapi_host}:{FASTAPI_NGINX_PORT};
location / {{
try_files $uri @proxy_to_app;
}}
location @proxy_to_app {{
proxy_set_header Host $http_host;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_redirect off;
proxy_buffering off;
proxy_pass http://{fastapi_bind};
}}
}}
}}
server {{
listen {fastapi_host}:{FASTAPI_NGINX_PORT};
location / {{
try_files $uri @proxy_to_app;
}}
location @proxy_to_app {{
proxy_set_header Host $http_host;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_redirect off;
proxy_buffering off;
proxy_pass http://{fastapi_bind};
}}
}}
}}
"""
)
"""
)
return config_path
def spawn_child(args):
def spawn_child(_args):
"""Open a subprocess and add it to the global state."""
if verbosity >= 1:
print(f":: Spawning {args}", file=sys.stderr)
children.append(subprocess.Popen(args))
print(f":: Spawning {_args}", file=sys.stderr)
children.append(subprocess.Popen(_args))
def start():
@ -154,7 +113,7 @@ def start():
terminal_width = 80
print(
"{ruler}\n"
"Spawing PHP and FastAPI, then nginx as a reverse proxy.\n"
"Spawing FastAPI, then nginx as a reverse proxy.\n"
"Check out {aur_location}\n"
"Hit ^C to terminate everything.\n"
"{ruler}".format(
@ -163,12 +122,6 @@ def start():
)
)
# PHP
php_address = aurweb.config.get("php", "bind_address")
php_host = php_address.split(":")[0]
htmldir = aurweb.config.get("php", "htmldir")
spawn_child(["php", "-S", php_address, "-t", htmldir])
# FastAPI
fastapi_host, fastapi_port = aurweb.config.get("fastapi", "bind_address").rsplit(
":", 1
@ -210,10 +163,7 @@ def start():
f"""
> Started nginx.
>
> PHP backend: http://{php_address}
> FastAPI backend: http://{fastapi_host}:{fastapi_port}
>
> PHP frontend: http://{php_host}:{PHP_NGINX_PORT}
> FastAPI backend: http://{fastapi_host}:{fastapi_port}
> FastAPI frontend: http://{fastapi_host}:{FASTAPI_NGINX_PORT}
>
> Frontends are hosted via nginx and should be preferred.
@ -221,17 +171,17 @@ def start():
)
def _kill_children(
children: Iterable, exceptions: list[Exception] = []
) -> list[Exception]:
def _kill_children(_children: Iterable, exceptions=None) -> list[Exception]:
"""
Kill each process found in `children`.
:param children: Iterable of child processes
:param _children: Iterable of child processes
:param exceptions: Exception memo
:return: `exceptions`
"""
for p in children:
if exceptions is None:
exceptions = []
for p in _children:
try:
p.terminate()
if verbosity >= 1:
@ -241,17 +191,17 @@ def _kill_children(
return exceptions
def _wait_for_children(
children: Iterable, exceptions: list[Exception] = []
) -> list[Exception]:
def _wait_for_children(_children: Iterable, exceptions=None) -> list[Exception]:
"""
Wait for each process to end found in `children`.
:param children: Iterable of child processes
:param _children: Iterable of child processes
:param exceptions: Exception memo
:return: `exceptions`
"""
for p in children:
if exceptions is None:
exceptions = []
for p in _children:
try:
rc = p.wait()
if rc != 0 and rc != -15:
@ -307,12 +257,6 @@ if __name__ == "__main__":
)
args = parser.parse_args()
try:
validate_php_config()
except AurwebException as exc:
print(f"error: {str(exc)}")
sys.exit(1)
verbosity = args.verbose
asgi_backend = args.backend
workers = args.workers

aurweb/statistics.py Normal file
View file

@ -0,0 +1,169 @@
from sqlalchemy import func
from aurweb import config, db, time
from aurweb.cache import db_count_cache, db_query_cache
from aurweb.models import PackageBase, PackageRequest, RequestType, User
from aurweb.models.account_type import (
PACKAGE_MAINTAINER_AND_DEV_ID,
PACKAGE_MAINTAINER_ID,
USER_ID,
)
from aurweb.models.package_request import (
ACCEPTED_ID,
CLOSED_ID,
PENDING_ID,
REJECTED_ID,
)
from aurweb.prometheus import PACKAGES, REQUESTS, USERS
cache_expire = config.getint("cache", "expiry_time_statistics", 300)
HOMEPAGE_COUNTERS = [
"package_count",
"orphan_count",
"seven_days_old_added",
"seven_days_old_updated",
"year_old_updated",
"never_updated",
"user_count",
"package_maintainer_count",
]
REQUEST_COUNTERS = [
"total_requests",
"pending_requests",
"closed_requests",
"accepted_requests",
"rejected_requests",
]
PROMETHEUS_USER_COUNTERS = [
("package_maintainer_count", "package_maintainer"),
("regular_user_count", "user"),
]
PROMETHEUS_PACKAGE_COUNTERS = [
("orphan_count", "orphan"),
("never_updated", "not_updated"),
("updated_packages", "updated"),
]
class Statistics:
seven_days = 86400 * 7
one_hour = 3600
year = seven_days * 52
def __init__(self, cache_expire: int = None) -> "Statistics":
self.expiry_time = cache_expire
self.now = time.utcnow()
self.seven_days_ago = self.now - self.seven_days
self.year_ago = self.now - self.year
self.user_query = db.query(User)
self.bases_query = db.query(PackageBase)
self.updated_query = db.query(PackageBase).filter(
PackageBase.ModifiedTS - PackageBase.SubmittedTS >= self.one_hour
)
self.request_query = db.query(PackageRequest)
def get_count(self, counter: str) -> int:
query = None
match counter:
# Packages
case "package_count":
query = self.bases_query
case "orphan_count":
query = self.bases_query.filter(PackageBase.MaintainerUID.is_(None))
case "seven_days_old_added":
query = self.bases_query.filter(
PackageBase.SubmittedTS >= self.seven_days_ago
)
case "seven_days_old_updated":
query = self.updated_query.filter(
PackageBase.ModifiedTS >= self.seven_days_ago
)
case "year_old_updated":
query = self.updated_query.filter(
PackageBase.ModifiedTS >= self.year_ago
)
case "never_updated":
query = self.bases_query.filter(
PackageBase.ModifiedTS - PackageBase.SubmittedTS < self.one_hour
)
case "updated_packages":
query = self.bases_query.filter(
PackageBase.ModifiedTS - PackageBase.SubmittedTS > self.one_hour,
~PackageBase.MaintainerUID.is_(None),
)
# Users
case "user_count":
query = self.user_query
case "package_maintainer_count":
query = self.user_query.filter(
User.AccountTypeID.in_(
(
PACKAGE_MAINTAINER_ID,
PACKAGE_MAINTAINER_AND_DEV_ID,
)
)
)
case "regular_user_count":
query = self.user_query.filter(User.AccountTypeID == USER_ID)
# Requests
case "total_requests":
query = self.request_query
case "pending_requests":
query = self.request_query.filter(PackageRequest.Status == PENDING_ID)
case "closed_requests":
query = self.request_query.filter(PackageRequest.Status == CLOSED_ID)
case "accepted_requests":
query = self.request_query.filter(PackageRequest.Status == ACCEPTED_ID)
case "rejected_requests":
query = self.request_query.filter(PackageRequest.Status == REJECTED_ID)
case _:
return -1
return db_count_cache(counter, query, expire=self.expiry_time)
def update_prometheus_metrics():
stats = Statistics(cache_expire)
# Users gauge
for counter, utype in PROMETHEUS_USER_COUNTERS:
count = stats.get_count(counter)
USERS.labels(utype).set(count)
# Packages gauge
for counter, state in PROMETHEUS_PACKAGE_COUNTERS:
count = stats.get_count(counter)
PACKAGES.labels(state).set(count)
# Requests gauge
query = (
db.get_session()
.query(PackageRequest, func.count(PackageRequest.ID), RequestType.Name)
.join(RequestType)
.group_by(RequestType.Name, PackageRequest.Status)
)
results = db_query_cache("request_metrics", query, cache_expire)
for record in results:
status = record[0].status_display()
count = record[1]
rtype = record[2]
REQUESTS.labels(type=rtype, status=status).set(count)
def _get_counts(counters: list[str]) -> dict[str, int]:
stats = Statistics(cache_expire)
result = dict()
for counter in counters:
result[counter] = stats.get_count(counter)
return result
def get_homepage_counts() -> dict[str, int]:
return _get_counts(HOMEPAGE_COUNTERS)
def get_request_counts() -> dict[str, int]:
return _get_counts(REQUEST_COUNTERS)
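
A hedged usage sketch for the new module (it assumes an initialized database session, which aurweb sets up elsewhere):

from aurweb.statistics import Statistics, cache_expire, get_homepage_counts

stats = Statistics(cache_expire)
print(stats.get_count("orphan_count"))   # cached via db_count_cache
print(stats.get_count("not-a-counter"))  # unknown counters return -1
print(get_homepage_counts())             # dict keyed by HOMEPAGE_COUNTERS
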

View file

@ -9,7 +9,7 @@ from fastapi import Request
from fastapi.responses import HTMLResponse
import aurweb.config
from aurweb import cookies, l10n, time
from aurweb import l10n, time
# Prepare jinja2 objects.
_loader = jinja2.FileSystemLoader(
@ -19,6 +19,8 @@ _env = jinja2.Environment(
loader=_loader, autoescape=True, extensions=["jinja2.ext.i18n"]
)
DEFAULT_TIMEZONE = aurweb.config.get("options", "default_timezone")
def register_filter(name: str) -> Callable:
"""A decorator that can be used to register a filter.
@ -68,6 +70,7 @@ def make_context(request: Request, title: str, next: str = None):
commit_url = aurweb.config.get_with_fallback("devel", "commit_url", None)
commit_hash = aurweb.config.get_with_fallback("devel", "commit_hash", None)
max_chars_comment = aurweb.config.getint("options", "max_chars_comment", 5000)
if commit_hash:
# Shorten commit_hash to a short Git hash.
commit_hash = commit_hash[:7]
@ -90,6 +93,7 @@ def make_context(request: Request, title: str, next: str = None):
"creds": aurweb.auth.creds,
"next": next if next else request.url.path,
"version": os.environ.get("COMMIT_HASH", aurweb.config.AURWEB_VERSION),
"max_chars_comment": max_chars_comment,
}
@ -104,8 +108,8 @@ async def make_variable_context(request: Request, title: str, next: str = None):
)
for k, v in to_copy.items():
context[k] = v
if k not in context:
context[k] = v
context["q"] = dict(request.query_params)
return context
@ -137,13 +141,4 @@ def render_template(
):
"""Render a template as an HTMLResponse."""
rendered = render_raw_template(request, path, context)
response = HTMLResponse(rendered, status_code=int(status_code))
sid = None
if request.user.is_authenticated():
sid = request.cookies.get("AURSID")
# Re-emit SID via update_response_cookies with an updated expiration.
# This extends the life of a user session based on the AURREMEMBER
# cookie, which is always set to the "Remember Me" state on login.
return cookies.update_response_cookies(request, response, aursid=sid)
return HTMLResponse(rendered, status_code=int(status_code))

View file

@ -51,8 +51,8 @@ def setup_test_db(*args):
models.Session.__tablename__,
models.SSHPubKey.__tablename__,
models.Term.__tablename__,
models.TUVote.__tablename__,
models.TUVoteInfo.__tablename__,
models.Vote.__tablename__,
models.VoteInfo.__tablename__,
models.User.__tablename__,
]

View file

@ -4,10 +4,10 @@ import re
import shutil
import subprocess
from aurweb import logging, util
from aurweb import aur_logging, util
from aurweb.templates import base_template
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
class AlpmDatabase:

View file

@ -4,9 +4,9 @@ from typing import Callable
from posix_ipc import O_CREAT, Semaphore
from aurweb import logging
from aurweb import aur_logging
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
def default_on_create(path):

View file

@ -1,6 +1,4 @@
import os
import shlex
from subprocess import PIPE, Popen
from typing import Tuple
import py
@ -8,6 +6,7 @@ import py
from aurweb.models import Package
from aurweb.templates import base_template
from aurweb.testing.filelock import FileLock
from aurweb.util import shell_exec
class GitRepository:
@ -24,10 +23,7 @@ class GitRepository:
self.file_lock.lock(on_create=self._setup)
def _exec(self, cmdline: str, cwd: str) -> Tuple[int, str, str]:
args = shlex.split(cmdline)
proc = Popen(args, cwd=cwd, stdout=PIPE, stderr=PIPE)
out, err = proc.communicate()
return (proc.returncode, out.decode().strip(), err.decode().strip())
return shell_exec(cmdline, cwd)
def _exec_repository(self, cmdline: str) -> Tuple[int, str, str]:
return self._exec(cmdline, cwd=str(self.file_lock.path))

View file

@ -0,0 +1,8 @@
from aurweb import prometheus
def clear_metrics():
prometheus.PACKAGES.clear()
prometheus.REQUESTS.clear()
prometheus.SEARCH_REQUESTS.clear()
prometheus.USERS.clear()

View file

@ -23,7 +23,10 @@ class Client:
class URL:
path = "/"
path: str
def __init__(self, path: str = "/"):
self.path = path
class Request:
@ -39,6 +42,8 @@ class Request:
method: str = "GET",
headers: dict[str, str] = dict(),
cookies: dict[str, str] = dict(),
url: str = "/",
query_params: dict[str, str] = dict(),
) -> "Request":
self.user = user
self.user.authenticated = authenticated
@ -46,3 +51,5 @@ class Request:
self.method = method.upper()
self.headers = headers
self.cookies = cookies
self.url = URL(path=url)
self.query_params = query_params

View file

@ -1,7 +1,6 @@
import zoneinfo
from collections import OrderedDict
from datetime import datetime
from urllib.parse import unquote
from datetime import UTC, datetime
from zoneinfo import ZoneInfo
from fastapi import Request
@ -58,16 +57,20 @@ SUPPORTED_TIMEZONES = OrderedDict(
)
def get_request_timezone(request: Request):
"""Get a request's timezone by its AURTZ cookie. We use the
configuration's [options] default_timezone otherwise.
def get_request_timezone(request: Request) -> str:
"""Get a request's timezone from either query param or user settings.
We use the configuration's [options] default_timezone otherwise.
@param request FastAPI request
"""
default_tz = aurweb.config.get("options", "default_timezone")
if request.user.is_authenticated():
default_tz = request.user.Timezone
return unquote(request.cookies.get("AURTZ", default_tz))
request_tz = request.query_params.get("timezone")
if request_tz and request_tz in SUPPORTED_TIMEZONES:
return request_tz
elif (
request.user.is_authenticated() and request.user.Timezone in SUPPORTED_TIMEZONES
):
return request.user.Timezone
return aurweb.config.get_with_fallback("options", "default_timezone", "UTC")
def now(timezone: str) -> datetime:
@ -86,4 +89,4 @@ def utcnow() -> int:
:return: Current UTC timestamp
"""
return int(datetime.utcnow().timestamp())
return int(datetime.now(UTC).timestamp())

View file

@ -2,12 +2,13 @@ from typing import Any
from fastapi import Request
from aurweb import cookies, db, models, time, util
from aurweb import db, models, time, util
from aurweb.models import SSHPubKey
from aurweb.models.ssh_pub_key import get_fingerprint
from aurweb.util import strtobool
@db.retry_deadlock
def simple(
U: str = str(),
E: str = str(),
@ -21,6 +22,7 @@ def simple(
CN: bool = False,
UN: bool = False,
ON: bool = False,
HDC: bool = False,
S: bool = False,
user: models.User = None,
**kwargs,
@ -40,8 +42,10 @@ def simple(
user.CommentNotify = strtobool(CN)
user.UpdateNotify = strtobool(UN)
user.OwnershipNotify = strtobool(ON)
user.HideDeletedComments = strtobool(HDC)
@db.retry_deadlock
def language(
L: str = str(),
request: Request = None,
@ -55,6 +59,7 @@ def language(
context["language"] = L
@db.retry_deadlock
def timezone(
TZ: str = str(),
request: Request = None,
@ -68,6 +73,7 @@ def timezone(
context["language"] = TZ
@db.retry_deadlock
def ssh_pubkey(PK: str = str(), user: models.User = None, **kwargs) -> None:
if not PK:
# If no pubkey is provided, wipe out any pubkeys the user
@ -101,12 +107,14 @@ def ssh_pubkey(PK: str = str(), user: models.User = None, **kwargs) -> None:
)
@db.retry_deadlock
def account_type(T: int = None, user: models.User = None, **kwargs) -> None:
if T is not None and (T := int(T)) != user.AccountTypeID:
with db.begin():
user.AccountTypeID = T
@db.retry_deadlock
def password(
P: str = str(),
request: Request = None,
@ -123,8 +131,22 @@ def password(
user.update_password(P)
if user == request.user:
remember_me = request.cookies.get("AURREMEMBER", False)
# If the target user is the request user, login with
# the updated password to update the Session record.
user.login(request, P, cookies.timeout(remember_me))
user.login(request, P)
@db.retry_deadlock
def suspend(
S: bool = False,
request: Request = None,
user: models.User = None,
context: dict[str, Any] = {},
**kwargs,
) -> None:
if S and user.session:
context["S"] = None
with db.begin():
db.delete_all(
db.query(models.Session).filter(models.Session.UsersID == user.ID)
)

View file

@ -6,10 +6,11 @@ out of form data from /account/register or /account/{username}/edit.
All functions in this module raise aurweb.exceptions.ValidationError
when encountering invalid criteria and return silently otherwise.
"""
from fastapi import Request
from sqlalchemy import and_
from aurweb import config, db, l10n, logging, models, time, util
from aurweb import aur_logging, config, db, l10n, models, time, util
from aurweb.auth import creds
from aurweb.captcha import get_captcha_answer, get_captcha_salts, get_captcha_token
from aurweb.exceptions import ValidationError
@ -17,7 +18,7 @@ from aurweb.models.account_type import ACCOUNT_TYPE_NAME
from aurweb.models.ssh_pub_key import get_fingerprint
from aurweb.util import strtobool
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
def invalid_fields(E: str = str(), U: str = str(), **kwargs) -> None:
@@ -56,12 +57,9 @@
) -> None:
if P:
if not util.valid_password(P):
username_min_len = config.getint("options", "username_min_len")
passwd_min_len = config.getint("options", "passwd_min_len")
raise ValidationError(
[
_("Your password must be at least %s characters.")
% (username_min_len)
]
[_("Your password must be at least %s characters.") % (passwd_min_len)]
)
elif not C:
raise ValidationError(["Please confirm your new password."])
@@ -70,7 +68,7 @@
def is_banned(request: Request = None, **kwargs) -> None:
host = request.client.host
host = util.get_client_ip(request)
exists = db.query(models.Ban, models.Ban.IPAddress == host).exists()
if db.query(exists).scalar():
raise ValidationError(
@@ -220,7 +218,7 @@ def invalid_account_type(
raise ValidationError([error])
logger.debug(
f"Trusted User '{request.user.Username}' has "
f"Package Maintainer '{request.user.Username}' has "
f"modified '{user.Username}' account's type to"
f" {name}."
)
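Besides switching to aur_logging and renaming the "Trusted User" role to "Package Maintainer" in the debug message, this hunk fixes a real bug: the password-length error previously interpolated username_min_len, so users could be told the wrong minimum. Validators here follow a simple contract, raise ValidationError with a list of messages or return silently. A self-contained sketch of that contract (the local ValidationError class and the 8-character minimum are stand-ins for aurweb.exceptions.ValidationError and the options.passwd_min_len config value):

# Sketch of the validator contract; the exception class and constant
# are local stand-ins, not aurweb's actual objects.
class ValidationError(Exception):
    def __init__(self, errors: list[str]):
        super().__init__(errors)
        self.errors = errors

PASSWD_MIN_LEN = 8  # aurweb reads options.passwd_min_len from its config

def invalid_password(P: str = "", **kwargs) -> None:
    if P and len(P) < PASSWD_MIN_LEN:
        raise ValidationError(
            ["Your password must be at least %s characters." % PASSWD_MIN_LEN]
        )

try:
    invalid_password(P="short")
except ValidationError as exc:
    print(exc.errors[0])  # Your password must be at least 8 characters.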

aurweb/util.py

@@ -1,8 +1,10 @@
import math
import re
import secrets
import shlex
import string
from datetime import datetime
from hashlib import sha1
from http import HTTPStatus
from subprocess import PIPE, Popen
from typing import Callable, Iterable, Tuple, Union
@@ -12,11 +14,12 @@ import fastapi
import pygit2
from email_validator import EmailSyntaxError, validate_email
from fastapi.responses import JSONResponse
from sqlalchemy.orm import Query
import aurweb.config
from aurweb import defaults, logging
from aurweb import aur_logging, defaults
logger = logging.get_logger(__name__)
logger = aur_logging.get_logger(__name__)
def make_random_string(length: int) -> str:
@@ -95,18 +98,18 @@ def apply_all(iterable: Iterable, fn: Callable):
return iterable
def sanitize_params(offset: str, per_page: str) -> Tuple[int, int]:
def sanitize_params(offset_str: str, per_page_str: str) -> Tuple[int, int]:
try:
offset = int(offset)
offset = defaults.O if int(offset_str) < 0 else int(offset_str)
except ValueError:
offset = defaults.O
try:
per_page = int(per_page)
per_page = defaults.PP if int(per_page_str) <= 0 else int(per_page_str)
except ValueError:
per_page = defaults.PP
return (offset, per_page)
return offset, per_page
def strtobool(value: Union[str, bool]) -> bool:
@@ -186,9 +189,30 @@ def parse_ssh_key(string: str) -> Tuple[str, str]:
if proc.returncode:
raise invalid_exc
return (prefix, key)
return prefix, key
def parse_ssh_keys(string: str) -> list[Tuple[str, str]]:
def parse_ssh_keys(string: str) -> set[Tuple[str, str]]:
"""Parse a list of SSH public keys."""
return [parse_ssh_key(e) for e in string.splitlines()]
return set([parse_ssh_key(e) for e in string.strip().splitlines(True) if e.strip()])
def shell_exec(cmdline: str, cwd: str) -> Tuple[int, str, str]:
args = shlex.split(cmdline)
proc = Popen(args, cwd=cwd, stdout=PIPE, stderr=PIPE)
out, err = proc.communicate()
return proc.returncode, out.decode().strip(), err.decode().strip()
def hash_query(query: Query):
return sha1(
str(query.statement.compile(compile_kwargs={"literal_binds": True})).encode()
).hexdigest()
def get_client_ip(request: fastapi.Request) -> str:
"""
Returns the client's IP address for a Request.
Falls back to 'testclient' if request.client is None
"""
return request.client.host if request.client else "testclient"
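These utility changes are behavioural as well as cosmetic: sanitize_params now clamps negative offsets and non-positive page sizes to the defaults instead of passing them through, parse_ssh_keys returns a set so duplicate pasted keys collapse to one entry and blank lines are skipped, shell_exec wraps shlex.split plus Popen, and hash_query derives a stable SHA-1 from a compiled SQLAlchemy statement (suitable as a cache key). A quick demonstration of the clamping, with O and PP standing in for aurweb.defaults (0 and 50 are assumed values):

# Behavioural sketch mirroring the new sanitize_params; O and PP stand
# in for aurweb.defaults.O and aurweb.defaults.PP (values assumed).
O, PP = 0, 50

def sanitize_params(offset_str: str, per_page_str: str) -> tuple[int, int]:
    try:
        offset = O if int(offset_str) < 0 else int(offset_str)
    except ValueError:
        offset = O
    try:
        per_page = PP if int(per_page_str) <= 0 else int(per_page_str)
    except ValueError:
        per_page = PP
    return offset, per_page

assert sanitize_params("-5", "0") == (0, 50)     # clamped to defaults
assert sanitize_params("10", "25") == (10, 25)   # valid values pass through
assert sanitize_params("abc", "xyz") == (0, 50)  # garbage falls back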

ci/tf/.terraform.lock.hcl (new file, generated, 61 lines)

@@ -0,0 +1,61 @@
# This file is maintained automatically by "terraform init".
# Manual edits may be lost in future updates.
provider "registry.terraform.io/hashicorp/dns" {
version = "3.3.2"
hashes = [
"h1:HjskPLRqmCw8Q/kiSuzti3iJBSpcAvcBFdlwFFQuoDE=",
"zh:05d2d50e301318362a4a82e6b7a9734ace07bc01abaaa649c566baf98814755f",
"zh:1e9fd1c3bfdda777e83e42831dd45b7b9e794250a0f351e5fd39762e8a0fe15b",
"zh:40e715fc7a2ede21f919567249b613844692c2f8a64f93ee64e5b68bae7ac2a2",
"zh:454d7aa83000a6e2ba7a7bfde4bcf5d7ed36298b22d760995ca5738ab02ee468",
"zh:46124ded51b4153ad90f12b0305fdbe0c23261b9669aa58a94a31c9cca2f4b19",
"zh:55a4f13d20f73534515a6b05701abdbfc54f4e375ba25b2dffa12afdad20e49d",
"zh:78d5eefdd9e494defcb3c68d282b8f96630502cac21d1ea161f53cfe9bb483b3",
"zh:7903b1ceb8211e2b8c79290e2e70906a4b88f4fba71c900eb3a425ce12f1716a",
"zh:b79fc4f444ef7a2fd7111a80428c070ad824f43a681699e99ab7f83074dfedbd",
"zh:ca9f45e0c4cb94e7d62536c226024afef3018b1de84f1ea4608b51bcd497a2a0",
"zh:ddc8bd894559d7d176e0ceb0bb1ae266519b01b315362ebfee8327bb7e7e5fa8",
"zh:e77334c0794ef8f9354b10e606040f6b0b67b373f5ff1db65bddcdd4569b428b",
]
}
provider "registry.terraform.io/hashicorp/tls" {
version = "4.0.4"
hashes = [
"h1:pe9vq86dZZKCm+8k1RhzARwENslF3SXb9ErHbQfgjXU=",
"zh:23671ed83e1fcf79745534841e10291bbf34046b27d6e68a5d0aab77206f4a55",
"zh:45292421211ffd9e8e3eb3655677700e3c5047f71d8f7650d2ce30242335f848",
"zh:59fedb519f4433c0fdb1d58b27c210b27415fddd0cd73c5312530b4309c088be",
"zh:5a8eec2409a9ff7cd0758a9d818c74bcba92a240e6c5e54b99df68fff312bbd5",
"zh:5e6a4b39f3171f53292ab88058a59e64825f2b842760a4869e64dc1dc093d1fe",
"zh:810547d0bf9311d21c81cc306126d3547e7bd3f194fc295836acf164b9f8424e",
"zh:824a5f3617624243bed0259d7dd37d76017097dc3193dac669be342b90b2ab48",
"zh:9361ccc7048be5dcbc2fafe2d8216939765b3160bd52734f7a9fd917a39ecbd8",
"zh:aa02ea625aaf672e649296bce7580f62d724268189fe9ad7c1b36bb0fa12fa60",
"zh:c71b4cd40d6ec7815dfeefd57d88bc592c0c42f5e5858dcc88245d371b4b8b1e",
"zh:dabcd52f36b43d250a3d71ad7abfa07b5622c69068d989e60b79b2bb4f220316",
"zh:f569b65999264a9416862bca5cd2a6177d94ccb0424f3a4ef424428912b9cb3c",
]
}
provider "registry.terraform.io/hetznercloud/hcloud" {
version = "1.42.0"
hashes = [
"h1:cr9lh26H3YbWSHb7OUnCoYw169cYO3Cjpt3yPnRhXS0=",
"zh:153b5f39d780e9a18bc1ea377d872647d328d943813cbd25d3d20863f8a37782",
"zh:35b9e95760c58cca756e34ad5f4138ac6126aa3e8c41b4a0f1d5dc9ee5666c73",
"zh:47a3cdbce982f2b4e17f73d4934bdb3e905a849b36fb59b80f87d852496ed049",
"zh:6a718c244c2ba300fbd43791661a061ad1ab16225ef3e8aeaa3db8c9eff12c85",
"zh:a2cbfc95c5e2c9422ed0a7b6292192c38241220d5b7813c678f937ab3ef962ae",
"zh:b837e118e08fd36aa8be48af7e9d0d3d112d2680c79cfc71cfe2501fb40dbefa",
"zh:bf66db8c680e18b77e16dc1f20ed1cdcc7876bfb7848c320ccb86f0fb80661ed",
"zh:c1ad80bbe48dc8a272a02dcdb4b12f019606f445606651c01e561b9d72d816b1",
"zh:d4e616701128ad14a6b5a427b0e9145ece4cad02aa3b5f9945c6d0b9ada8ab70",
"zh:d9d01f727037d028720100a5bc9fd213cb01e63e4b439a16f2f482c147976530",
"zh:dea047ee4d679370d4376fb746c4b959bf51dd06047c1c2656b32789c2433643",
"zh:e5ad7a3c556894bd40b28a874e7d2f6924876fa75fa443136a7d6ab9a00abbaa",
"zh:edf6e7e129157bd45e3da4a330d1ace17a336d417c3b77c620f302d440c368e8",
"zh:f610bc729866d58da9cffa4deae34dbfdba96655e855a87c6bb2cb7b35a8961c",
]
}

ci/tf/main.tf (new file, 67 lines)

@@ -0,0 +1,67 @@
terraform {
backend "http" {
}
}
provider "hcloud" {
token = var.hcloud_token
}
provider "dns" {
update {
server = var.dns_server
key_name = var.dns_tsig_key
key_algorithm = var.dns_tsig_algorithm
key_secret = var.dns_tsig_secret
}
}
resource "tls_private_key" "this" {
algorithm = "ED25519"
}
resource "hcloud_ssh_key" "this" {
name = var.name
public_key = tls_private_key.this.public_key_openssh
}
data "hcloud_image" "this" {
with_selector = "custom_image=archlinux"
most_recent = true
with_status = ["available"]
}
resource "hcloud_server" "this" {
name = var.name
image = data.hcloud_image.this.id
server_type = var.server_type
datacenter = var.datacenter
ssh_keys = [hcloud_ssh_key.this.name]
public_net {
ipv4_enabled = true
ipv6_enabled = true
}
}
resource "hcloud_rdns" "this" {
for_each = { ipv4 : hcloud_server.this.ipv4_address, ipv6 : hcloud_server.this.ipv6_address }
server_id = hcloud_server.this.id
ip_address = each.value
dns_ptr = "${var.name}.${var.dns_zone}"
}
resource "dns_a_record_set" "this" {
zone = "${var.dns_zone}."
name = var.name
addresses = [hcloud_server.this.ipv4_address]
ttl = 300
}
resource "dns_aaaa_record_set" "this" {
zone = "${var.dns_zone}."
name = var.name
addresses = [hcloud_server.this.ipv6_address]
ttl = 300
}

ci/tf/terraform.tfvars (new file, 4 lines)

@@ -0,0 +1,4 @@
server_type = "cpx11"
datacenter = "fsn1-dc14"
dns_server = "redirect.archlinux.org"
dns_zone = "sandbox.archlinux.page"

ci/tf/variables.tf (new file, 36 lines)

@@ -0,0 +1,36 @@
variable "hcloud_token" {
type = string
sensitive = true
}
variable "dns_server" {
type = string
}
variable "dns_tsig_key" {
type = string
}
variable "dns_tsig_algorithm" {
type = string
}
variable "dns_tsig_secret" {
type = string
}
variable "dns_zone" {
type = string
}
variable "name" {
type = string
}
variable "server_type" {
type = string
}
variable "datacenter" {
type = string
}

Some files were not shown because too many files have changed in this diff.