Compare commits

...

128 commits

Author SHA1 Message Date
Leonidas Spyropoulos
8ca61eded2
chore(release): prepare for 6.2.16 2025-01-13 15:52:13 +00:00
Leonidas Spyropoulos
a9bf714dae
fix: bump deps for python 3.13 and vulnerability
pygit2 and watchfiles for precompiled wheels
greenlet for python 3.13 compatibility
python-multipart for security vulnerability

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2025-01-12 20:39:02 +00:00
Leonidas Spyropoulos
3e3173b5c9
chore: avoid cache for new pacman 7
Pacman 7 introduced sandboxing, which breaks the cache in containers due to permission restrictions.

Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2025-01-12 20:39:02 +00:00
Leonidas Spyropoulos
eca8bbf515
chore(release): prepare for 6.2.15
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2024-09-15 12:03:17 +03:00
Jelle van der Waa
edc1ab949a perf(captcha): simplify count() query for user ids
Using .count() isn't great as it runs a count query on a subquery which
selects all fields in the Users table. This rewrites it into a simple
SELECT count(ID) FROM Users query.
2024-09-12 12:29:46 +00:00
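
For illustration, the two query shapes before and after (a sketch using aurweb's query helper and User model, mirroring the captcha.py change further down this page):

    from sqlalchemy import func

    from aurweb.db import query
    from aurweb.models import User

    # Before: COUNT(*) over a subquery that selects every column of Users.
    count = query(User).count()

    # After: a plain SELECT count(ID) FROM Users.
    count = query(func.count(User.ID)).scalar()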
Muflone
97cc6196eb fix: reduce the number of subqueries against Packages by preloading the existing dependency names from AUR 2024-08-21 01:36:15 +02:00
Muflone
77ef87c882 housekeep: code re-formatted by black for lint pipeline 2024-08-20 21:00:46 +00:00
Muflone
a40283cdb2 fix: reduce the number of subqueries against User by eagerly loading the Users from PackageComaintainer 2024-08-20 21:00:46 +00:00
Levente Polyak
4f68532ee2
chore(mariadb): fix mysql deprecation warnings by using mariadb commands
MariaDB has scheduled the removal of the deprecated mysql drop-in interface.
Let's adapt, which also removes a lot of warnings while spinning up the
service.
2024-08-19 15:26:36 +02:00
Levente Polyak
439ccd4aa3
feat(docker): add full grafana, prometheus, tempo setup for local dev
This is a very useful stack for local development as well, allowing easy
access to a local grafana instance to look at the accessed endpoints,
query usage, durations, etc.
As a nice side effect, this also gives us an easy way to test any changes
to the opentelemetry integration in a real environment instead of just
listening to a raw socket.
2024-08-19 15:26:29 +02:00
Levente Polyak
8dcf0b2d97
fix(docker): fix compose race conditions on mariadb_init
We want the dependent services to wait until the initialization service
of mariadb finishes, but also properly handle the case where it already
exited before a leaf service gets picked up and put into the created
state. By using the service_completed_successfully condition, we can
ensure precisely this, without being racy and ending up with services
that never boot.

While at it, remove the compose version identifiers, as docker-compose
deprecated them and always warned about them when running docker-compose.
2024-08-19 15:26:21 +02:00
Leonidas Spyropoulos
88e8db4404
chore(release): prepare version 6.2.14 2024-08-17 17:28:26 +01:00
Sven-Hendrik Haase
b730f6447d
feat: Add opentelemetry-based tracing
This adds tracing to fastapi, redis, and sqlalchemy. It uses the
recommended OTLP exporter to send the tracing data.
2024-08-17 11:27:26 +01:00
Leonidas Spyropoulos
92f5bbd37f
housekeep: reformat asgi.py 2024-08-17 01:31:43 +01:00
Jelle van der Waa
6c6ecd3971
perf(aurweb): create a context with what is required
The pkgbase/util.py `make_context` helper does a lot of unrelated
expensive queries which are not required for any of the templates. Only
the 404 template shows git_clone_uri_* and pkgbase.
2024-08-16 21:32:22 +02:00
Leonidas Spyropoulos
9b12eaf2b9
chore(release): prepare version 6.2.13 2024-08-16 16:03:40 +01:00
Jelle van der Waa
d1a66a743e
perf(aurweb/pkgbase): use exists() to avoid fetching a row
The previous approach fetched the matching row; by using `exists()`,
SQLAlchemy changes the query to a `SELECT 1`.
2024-08-09 16:07:17 +02:00
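
A sketch of the exists() pattern; the PackageBase filter here is only illustrative, the point being that SQLAlchemy emits SELECT 1 instead of fetching the row:

    from sqlalchemy.orm import Session

    from aurweb.models import PackageBase

    def name_taken(session: Session, name: str) -> bool:
        # Wrapping the query in exists() avoids transferring any row data.
        subq = session.query(PackageBase).filter(PackageBase.Name == name)
        return session.query(subq.exists()).scalar()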
Jelle van der Waa
b65d6c5e3a
perf(aurweb/pkgbase): only relevant queries when logged in
Don't query for notify, requests and vote information when the user is
not logged in as this information is not shown.
2024-08-09 16:07:17 +02:00
Jelle van der Waa
d393ed2352
fix(templates): hide non-actionable links when not logged in
A non-logged-in user cannot vote, enable notifications or submit a
request, so hide these links.
2024-08-09 16:07:17 +02:00
Leonidas Spyropoulos
a16fac9b95
fix: revert mysqlclient to 2.2.3 2024-08-09 11:02:13 +01:00
renovate
5dd65846d1
chore(deps): update dependency coverage to v7.6.1 2024-08-05 11:25:17 +00:00
renovate
a1b2d231c3
fix(deps): update dependency aiofiles to v24 2024-08-04 20:25:21 +00:00
renovate
f306b6df7a
fix(deps): update dependency fastapi to ^0.112.0 2024-08-04 12:25:03 +00:00
renovate
0d17895647
fix(deps): update dependency gunicorn to v22 2024-08-04 10:24:33 +00:00
renovate
36a56e9d3c
fix(deps): update all non-major dependencies 2024-08-04 09:24:29 +00:00
Diego Viola
80d3e5f7b6 housekeep: update .editorconfig url
Signed-off-by: Diego Viola <diego.viola@gmail.com>
2024-08-03 11:58:58 +00:00
Leonidas Spyropoulos
2df5a2d5a8
chore(release): prepare version 6.2.12 2024-08-03 10:46:29 +01:00
Leonidas Spyropoulos
a54b6935a1
housekeep: reformat files with pre-hooks
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2024-08-03 08:15:56 +01:00
Levente Polyak
4d5909256f
fix: add missing indicies on PackageBase ordered columns
Signed-off-by: Levente Polyak <anthraxx@archlinux.org>
2024-08-03 04:45:31 +02:00
Levente Polyak
a5b94a47f3
feat: cache rss feedgen for 5 minutes
The RSS feeds should be perfectly fine even when cached for 5 minutes.
This should massively reduce the response times on the endpoint.

Signed-off-by: Levente Polyak <anthraxx@archlinux.org>
2024-08-03 04:45:24 +02:00
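
A hedged sketch of what a 5-minute cache around the feed generation could look like, using the lambda_cache helper added to aurweb/cache.py (see the diff further down); build_feed() is a placeholder:

    from aurweb.cache import lambda_cache

    def build_feed() -> str:
        # Placeholder for the real feedgen-based RSS rendering.
        return "<rss/>"

    # Render at most once every 5 minutes; subsequent requests hit Redis.
    feed = lambda_cache("rss", build_feed, expire=300)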
moson
33d31d4117
style: Indicate deleted accounts on requests page
Show "(deleted)" on requests page for user accounts that were removed.

Fixes #505

Signed-off-by: moson <moson@archlinux.org>
2024-06-24 16:35:21 +02:00
Leonidas Spyropoulos
ed878c8c5e
chore(release): prepare for 6.2.11 2024-06-10 11:49:00 +01:00
Leonidas Spyropoulos
77e4979f79
fix: remove the extra spaces in requests textarea
fixes: #503
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2024-06-10 11:41:19 +01:00
Leonidas Spyropoulos
85af7d6f04
fix: revert Set reply-to header for notifications to ML
The change broke the initial emails to the ML. Not sure why, but reverting this now; might look at it later.

This reverts commit 783422369e.

fixes: #502
2024-06-10 11:40:36 +01:00
Leonidas Spyropoulos
ef0619dc2f
chore(release): prepare for 6.2.10 2024-05-18 20:46:17 +01:00
moson
43b322e739
fix(CI): lint job - fix for python 3.12
Signed-off-by: moson <moson@archlinux.org>
2024-04-28 17:49:08 +02:00
moson
afb7af3e27
housekeep: replace deprecated datetime functions
tests show warnings for deprecated utc functions with python 3.12

Signed-off-by: moson <moson@archlinux.org>
2024-04-25 18:24:16 +02:00
moson
ffddf63975
housekeep: poetry - include python version 3.12
Signed-off-by: moson <moson@archlinux.org>
2024-04-25 07:46:39 +02:00
moson
c6a530f24f
chore(deps): bump pre-commit tools/libs
Prep for python 3.12
Reformat files with latest pre-commit tools

Signed-off-by: moson <moson@archlinux.org>
2024-04-25 07:25:39 +02:00
moson
3220cf886e
fix(CI): Remove "fast-single-thread" tag
Signed-off-by: moson <moson@archlinux.org>
2024-04-08 08:37:41 +02:00
moson
21e2ef5ecb
fix(test): Fix "TestClient"
TestClient changes were reverted with 0.37.2:

https://github.com/encode/starlette/pull/2525
https://github.com/encode/starlette/releases/tag/0.37.2
Signed-off-by: moson <moson@archlinux.org>
2024-04-08 08:37:41 +02:00
moson
6ba06801f7
chore(deps): update dependencies
  - Updating pycparser (2.21 -> 2.22)
  - Updating sniffio (1.3.0 -> 1.3.1)
  - Updating typing-extensions (4.8.0 -> 4.11.0)
  - Updating anyio (3.7.1 -> 4.3.0)
  - Updating certifi (2023.11.17 -> 2024.2.2)
  - Updating greenlet (3.0.1 -> 3.0.3)
  - Updating markupsafe (2.1.3 -> 2.1.5)
  - Updating packaging (23.2 -> 24.0)
  - Updating pluggy (1.3.0 -> 1.4.0)
  - Updating pydantic-core (2.14.5 -> 2.16.3)
  - Updating coverage (7.4.0 -> 7.4.4)
  - Updating cryptography (41.0.5 -> 42.0.5)
  - Updating dnspython (2.4.2 -> 2.6.1)
  - Updating execnet (2.0.2 -> 2.1.0)
  - Updating httpcore (1.0.2 -> 1.0.5)
  - Updating lxml (5.1.0 -> 5.2.1)
  - Updating mako (1.3.0 -> 1.3.2)
  - Updating parse (1.20.0 -> 1.20.1)
  - Updating prometheus-client (0.19.0 -> 0.20.0)
  - Updating pydantic (2.5.2 -> 2.6.4)
  - Updating pytest (7.4.4 -> 8.1.1)
  - Updating python-dateutil (2.8.2 -> 2.9.0.post0)
  - Updating redis (5.0.1 -> 5.0.3)
  - Updating urllib3 (2.1.0 -> 2.2.1)
  - Updating asgiref (3.7.2 -> 3.8.1)
  - Updating email-validator (2.1.0.post1 -> 2.1.1)
  - Updating fakeredis (2.20.1 -> 2.21.3)
  - Updating fastapi (0.109.0 -> 0.110.1)
  - Updating filelock (3.13.1 -> 3.13.3)
  - Updating markdown (3.5.2 -> 3.6)
  - Updating mysqlclient (2.2.1 -> 2.2.4)
  - Updating orjson (3.9.12 -> 3.10.0)
  - Updating prometheus-fastapi-instrumentator (6.1.0 -> 7.0.0)
  - Updating protobuf (4.25.2 -> 5.26.1)
  - Updating pygit2 (1.13.3 -> 1.14.1)
  - Updating pytest-asyncio (0.23.3 -> 0.23.6)
  - Updating pytest-cov (4.1.0 -> 5.0.0)
  - Updating tomlkit (0.12.3 -> 0.12.4)
  - Updating uvicorn (0.27.0 -> 0.27.1)
  - Updating werkzeug (3.0.1 -> 3.0.2)
  - Updating starlette (0.35.0 -> 0.37.2)
  - Updating httpx (0.26.0 -> 0.27.0)
  - Updating python-multipart (0.0.6 -> 0.0.9)
  - Updating uvicorn (0.27.1 -> 0.29.0)
  - Updating sqlalchemy (1.4.50 -> 1.4.52)

Signed-off-by: moson <moson@archlinux.org>
2024-04-08 08:37:41 +02:00
moson
21a23c9abe
feat: Limit comment length
Limit the amount of characters that can be entered for a comment.

Signed-off-by: moson <moson@archlinux.org>
2024-02-25 10:46:47 +01:00
moson
d050b626db
feat: Add blacklist check for pkgbase
Also check "pkgbase" against our blacklist.

Signed-off-by: moson <moson@archlinux.org>
2024-02-17 15:55:46 +01:00
moson
057685f304
fix: Fix package info for 404 errors
We try to find packages when a user enters a URL like /somepkg
or accidentally opens /somepkg.git in the browser.

However, it currently also does this for URLs like /pkgbase/doesnotexist
and falsely interprets the "pkgbase" part as a package or pkgbase name.
This, in combination with a pkgbase that is named "pkgbase", generates
a misleading 404 message for URLs like /pkgbase/doesnotexist.

That being said, we should probably add pkgbase to the blacklist check
as well (we do this for pkgname already) and add things like
"pkgbase" to the blacklist -> Will be picked up in another commit.

Signed-off-by: moson <moson@archlinux.org>
2024-02-17 14:12:09 +01:00
renovate
319c565cb9
fix(deps): update all non-major dependencies 2024-01-23 22:24:28 +00:00
renovate
db6bba8bc8
fix(deps): update dependency feedgen to v1 2024-01-23 21:24:53 +00:00
renovate
a37b9685de
fix(deps): update dependency lxml to v5 2024-01-21 14:24:22 +00:00
moson
6e32cf4275
fix(i18n): Adjust transifex host URL
Fix URL, otherwise the API token won't be picked up from ~/.transifexrc

Signed-off-by: moson <moson@archlinux.org>
2024-01-21 11:40:14 +01:00
moson
76b6971267
chore(deps): Ignore python upgrades with Renovate
Stop Renovate from trying to bump the python version.

Signed-off-by: moson <moson@archlinux.org>
2024-01-21 10:43:12 +01:00
Robin Candau
9818c3f48c chore(i18n): Replace leftover [community] mentions with [extra] 2024-01-21 10:27:57 +01:00
moson
f967c3565a
chore(i18n): Update translations
Pull in updated translations from Transifex: 2023-01-18

Signed-off-by: moson <moson@archlinux.org>
2024-01-21 09:59:05 +01:00
moson
2fcd793a58
fix(test): Fixes for "TestClient" changes
Seems that client is optional according to the ASGI spec.
https://asgi.readthedocs.io/en/latest/specs/www.html

With Starlette 0.35 the TestClient connection scope is None for "client".
https://github.com/encode/starlette/pull/2377

Signed-off-by: moson <moson@archlinux.org>
2024-01-19 16:37:42 +01:00
renovate
22e1577324
fix(deps): update dependency fastapi to ^0.109.0 2024-01-19 10:26:02 +01:00
moson
baf97bd159
fix(test): FastAPI 0.104.1 - Fix warnings
FastAPI events are deprecated. Use "Lifespan" function instead.

Signed-off-by: moson <moson@archlinux.org>
2023-12-08 14:15:18 +01:00
moson
a0b2e826be
feat: Parse markdown within html block elements
By default, markdown within an HTML block element is not parsed.
Add a markdown extension to support markdown text within block
elements.

With this we can annotate our element with a "markdown" attribute,
e.g. <details markdown>*Markdown*</details>,
and thus indicate that the content should be parsed.

Signed-off-by: moson <moson@archlinux.org>
2023-12-08 14:14:24 +01:00
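
Assuming the extension in question is Python-Markdown's md_in_html, the behaviour looks roughly like this:

    import markdown

    text = '<details markdown><summary>More</summary>*emphasised* content</details>'

    # Without md_in_html the inner markdown is left untouched; with it,
    # elements carrying a "markdown" attribute get their content parsed.
    html = markdown.markdown(text, extensions=["md_in_html"])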
moson
1ba9e6eb44
fix: change git-cliff "tag_pattern" option to regex
Changed with v1.4.0
See: https://github.com/orhun/git-cliff/pull/318

Signed-off-by: moson <moson@archlinux.org>
2023-12-08 14:12:48 +01:00
Rafael Fontenelle
1b82887cd6
docs: Change i18n.txt to markdown format 2023-12-08 14:10:32 +01:00
moson
783422369e
feat: Set reply-to header for notifications to ML
We can set the "reply-to" header to the "to" address for any mails
that go out to the aur-requests mailing list.

Signed-off-by: moson <moson@archlinux.org>
2023-11-28 09:33:07 +01:00
moson
4637b2edba
fix(tests): Fix test case for Prometheus metrics
Disable prometheus multiprocess mode in tests to avoid global state:
Depending on which workers are processing a testfile,
we might run into race issues where tests influence each other.

We also need to make sure to clear any previously collected values
in case the same worker/process is executing different tests which
evaluate prometheus values.

Signed-off-by: moson <moson@archlinux.org>
2023-11-27 13:21:37 +01:00
moson
027dfbd970
chore(release): prepare for 6.2.9
Signed-off-by: moson <moson@archlinux.org>
2023-11-25 20:30:29 +01:00
moson
8b234c580d
chore(deps): update dependencies
* Updating idna (3.4 -> 3.6)
* Updating annotated-types (0.5.0 -> 0.6.0)
* Updating pydantic-core (2.10.1 -> 2.14.5)
* Updating certifi (2023.7.22 -> 2023.11.17)
* Updating greenlet (3.0.0 -> 3.0.1)
* Updating pydantic (2.4.2 -> 2.5.2)
* Updating charset-normalizer (3.3.0 -> 3.3.2)
* Updating cryptography (41.0.4 -> 41.0.5)
* Updating fastapi (0.103.2 -> 0.104.1)
* Updating mako (1.2.4 -> 1.3.0)
* Updating parse (1.19.1 -> 1.20.0)
* Updating prometheus-client (0.17.1 -> 0.19.0)
* Updating urllib3 (2.0.6 -> 2.1.0)

Fix type annotation for new test function

Signed-off-by: moson <moson@archlinux.org>
2023-11-25 20:23:56 +01:00
renovate
9bf0c61051
fix(deps): update all non-major dependencies 2023-11-25 18:25:05 +00:00
moson
9d5b9c4795
feat: Add "groups" to package details page
Signed-off-by: moson <moson@archlinux.org>
2023-11-25 18:59:43 +01:00
moson
765f989b7d
feat: Allow <del> and <details/summary> tags in comments
* Allow additional html tags: <del> and <details/summary>
* Convert markdown double-tilde (~~) to <del> tags

Signed-off-by: moson <moson@archlinux.org>
2023-11-25 18:41:28 +01:00
Jelle van der Waa
029ce3b418
templates: update Gitlab navbar to point to Arch namespace
Instead of showing your own projects, show the Arch Linux namespace
where all our bugs/projects are.
2023-11-24 18:20:25 +01:00
Jelle van der Waa
3241391af0
templates: update bugs navbar entry to GitLab
Flyspray is no more and all projects are now on our own GitLab instance.
2023-11-12 16:02:16 +01:00
moson
5d302ae00c
feat: Support timezone and language query params
Support setting the timezone as well as the language via query params:
The timezone parameter previously only worked on certain pages.
While we're at it, let's also add the language as a param.
Refactor code for timezone and language functions.
Remove unused AURTZ cookie.

Signed-off-by: moson <moson@archlinux.org>
2023-10-21 10:41:44 +02:00
moson
933654fcbb
fix: Restrict context var override on the package page
Users can (accidentally) override context vars with query params.
This may lead to issues when rendering templates (e.g. "comments=").

Signed-off-by: moson <moson@archlinux.org>
2023-10-21 10:41:43 +02:00
moson
40c1d3e8ee
fix(ci): Don't create error reports from sandbox
We should not try to create issue reports for internal server errors
from a sandbox/review-app environment.

Signed-off-by: moson <moson@archlinux.org>
2023-10-20 15:45:58 +02:00
Hanabishi
2b8c8fc92a fix: make dependency source use superscript tag
Avoid using special characters and use the '<sup>' HTML tag instead,
so as not to rely on the Unicode coverage of the user's fonts.

Closes: #490
Signed-off-by: Hanabishi <1722-hanabishi@users.noreply.gitlab.archlinux.org>
2023-10-18 16:19:58 +00:00
moson
27c51430fb
chore(release): prepare for 6.2.8
Signed-off-by: moson <moson@archlinux.org>
2023-10-15 20:52:57 +02:00
moson
27cd533654
fix: Skip setting existing context values
When setting up a context with user provided variables,
we should not override any existing values previously set.

Signed-off-by: moson <moson@archlinux.org>
2023-10-12 18:09:07 +02:00
moson
2166426d4c
fix(deps): update dependencies
* Updating typing-extensions (4.5.0 -> 4.8.0)
* Installing annotated-types (0.5.0)
* Updating anyio (3.6.2 -> 3.7.1)
* Installing pydantic-core (2.10.1)
* Updating certifi (2023.5.7 -> 2023.7.22)
* Updating cffi (1.15.1 -> 1.16.0)
* Updating greenlet (2.0.2 -> 3.0.0)
* Updating markupsafe (2.1.2 -> 2.1.3)
* Updating packaging (23.1 -> 23.2)
* Updating pluggy (1.0.0 -> 1.3.0)
* Updating pydantic (1.10.7 -> 2.4.2)
* Updating charset-normalizer (3.1.0 -> 3.3.0)
* Updating click (8.1.3 -> 8.1.7)
* Updating coverage (7.2.7 -> 7.3.2)
* Updating cryptography (40.0.2 -> 41.0.4)
* Updating dnspython (2.3.0 -> 2.4.2)
* Updating execnet (1.9.0 -> 2.0.2)
* Updating fastapi (0.100.1 -> 0.103.2)
* Updating httpcore (0.17.0 -> 0.17.3)
* Updating parse (1.19.0 -> 1.19.1)
* Updating prometheus-client (0.16.0 -> 0.17.1)
* Updating pytest (7.4.0 -> 7.4.2)
* Updating redis (4.6.0 -> 5.0.1)
* Updating urllib3 (2.0.2 -> 2.0.6)
* Updating aiofiles (23.1.0 -> 23.2.1)
* Updating alembic (1.11.2 -> 1.12.0)
* Updating fakeredis (2.17.0 -> 2.19.0)
* Updating filelock (3.12.2 -> 3.12.4)
* Updating orjson (3.9.2 -> 3.9.7)
* Updating protobuf (4.23.4 -> 4.24.4)
* Updating pygit2 (1.12.2 -> 1.13.1)
* Updating werkzeug (2.3.6 -> 3.0.0)

Signed-off-by: moson <moson@archlinux.org>
2023-10-05 17:59:14 +02:00
moson
fd3022ff6c
fix: Correct password length message.
Wrong config option was used to display the minimum length error msg.
(username_min_len instead of passwd_min_len)

Signed-off-by: moson <moson@archlinux.org>
2023-10-02 13:47:38 +02:00
moson
9e9ba15813
housekeep: TU rename - Misc
Fix some more test functions

Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:05 +02:00
moson
d2d47254b4
housekeep: TU rename - Table/Column names, scripts
TU_VoteInfo -> VoteInfo
TU_Votes -> Votes
TU_VoteInfo.ActiveTUs -> VoteInfo.ActiveUsers

script: tuvotereminder -> votereminder
Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:05 +02:00
moson
87f6791ea8
housekeep: TU rename - Comments
Changes to comments, function descriptions, etc.

Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:05 +02:00
moson
61f1e5b399
housekeep: TU rename - Test suite
Rename tests: Function names, variables, etc.

Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:05 +02:00
moson
148c882501
housekeep: TU rename - /tu routes
Change /tu to /package-maintainer

Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:04 +02:00
moson
f540c79580
housekeep: TU rename - UI elements
Rename all UI elements and translations.

Signed-off-by: moson <moson@archlinux.org>
2023-09-30 16:45:04 +02:00
moson
1702075875
housekeep: TU rename - code changes
Renaming of symbols. Functions, variables, values, DB values, etc.
Basically everything that is not user-facing.

This only covers "Trusted User" things:
tests, comments, etc. will be covered in a following commit.
2023-09-30 16:45:04 +02:00
moson
7466e96449
fix(ci): Exclude review-app jobs for renovate MR's
Signed-off-by: moson <moson@archlinux.org>
2023-09-26 13:47:03 +02:00
moson
0a7b02956f
feat: Indicate dependency source
Dependencies might reside in the AUR or official repositories.
Add "AUR" as superscript letters to indicate if a package/provider
is present in the AUR.

Signed-off-by: moson <moson@archlinux.org>
2023-09-03 14:17:11 +02:00
moson
1433553c05
fix(test): Clear previous prometheus data for test
It could happen that test data is already generated by a previous test.
(running in the same worker)

Make sure we clear everything before performing our checks.

Signed-off-by: moson <moson@archlinux.org>
2023-09-01 22:51:55 +02:00
moson
5699e9bb41
fix(test): Remove file locking and semaphore
All tests within a file run in the same worker and our test DB names
are unique per file as well. We don't really need a locking
mechanism here.

The same is true for the test-emails. The only potential issue is that it
might try to create the same directory multiple times and thus run
into an error. However, that can be covered by specifying
"exist_ok=True" with os.makedirs such that those errors are ignored.

Signed-off-by: moson <moson@archlinux.org>
2023-09-01 22:51:55 +02:00
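
The os.makedirs call mentioned above, in isolation:

    import os

    # Multiple workers may race to create the same mail directory;
    # exist_ok=True turns the "already exists" error into a no-op.
    os.makedirs("test-emails", exist_ok=True)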
moson
9eda6a42c6
feat: Add ansible provisioning step for review-app
Clone infrastructure repository and run playbook to provision our VM
with aurweb.

Signed-off-by: moson <moson@archlinux.org>
2023-08-27 13:54:39 +02:00
Kristian Klausen
6c610b26a3
feat: Add terraform config for review-app[1]
Also removed the logic for deploying to the long-gone aur-dev box.

Ansible will be added in an upcoming commit for configuring and
deploying aurweb on the VM.

[1] https://docs.gitlab.com/ee/ci/review_apps/
2023-08-27 12:05:52 +02:00
moson
3005e82f60
fix: Cleanup prometheus metrics for dead workers
The current "cleanup" function that is removing orphan prometheus files
is actually never invoked.
We move this to a default gunicorn config file to register our hook(s).

https://docs.gunicorn.org/en/stable/configure.html
https://docs.gunicorn.org/en/stable/settings.html#child-exit
Signed-off-by: moson <moson@archlinux.org>
2023-08-18 22:04:55 +02:00
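
The hook itself is small; a sketch of a gunicorn config file registering it (the exact file path used by aurweb is not shown here):

    from prometheus_client import multiprocess

    def child_exit(server, worker):
        # Called by gunicorn whenever a worker exits; lets prometheus_client
        # clean up the metric files belonging to the dead worker.
        multiprocess.mark_process_dead(worker.pid)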
Leonidas Spyropoulos
f05f1dbac7
chore(release): prepare for 6.2.7
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-08-04 19:18:38 +03:00
renovate
8ad03522de
fix(deps): update all non-major dependencies 2023-08-04 14:25:22 +00:00
moson
94b62d2949
fix: Check if user exists when editing account
We should check if a user (target) exists before validating permissions.
Otherwise things crash when a TU is trying to edit an account that
does not exist.

Fixes: aurweb-errors#529
Signed-off-by: moson <moson@archlinux.org>
2023-08-04 14:12:50 +02:00
renovate
7a44f37968
fix(deps): update dependency fastapi to v0.100.1 2023-07-27 19:24:28 +00:00
renovate
969b84afe4
fix(deps): update all non-major dependencies 2023-07-25 11:24:30 +00:00
renovate
f74f94b501
fix(deps): update dependency gunicorn to v21 2023-07-24 11:24:26 +00:00
moson
375895f080
feat: Add Prometheus metrics for requests
Adds a gauge for requests by type and status

Signed-off-by: moson <moson@archlinux.org>
2023-07-23 22:46:44 +02:00
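
Roughly what such a gauge looks like with prometheus_client (metric and label names here are illustrative, not necessarily aurweb's):

    from prometheus_client import Gauge

    PACKAGE_REQUESTS = Gauge(
        "aur_requests",
        "Number of package requests by type and status",
        labelnames=["type", "status"],
    )

    # e.g. after counting rows grouped by (type, status):
    PACKAGE_REQUESTS.labels(type="deletion", status="Pending").set(42)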
moson
e45878a058
fix: Fix issue with requests totals
The problem is that we join with PackageBase and thus miss
requests for packages that were deleted.

Fixes: #483
Signed-off-by: moson <moson@archlinux.org>
2023-07-23 18:53:58 +02:00
moson
6cd70a5c9f
test: Add tests for user/package statistics
Signed-off-by: moson <moson@archlinux.org>
2023-07-23 13:58:51 +02:00
moson
8699457917
feat: Separate cache expiry for stats and search
Allows us to set different cache eviction timespans for search queries
and statistics.

Stats and especially "last package updates" should probably be refreshed
more often, whereas we might want to cache search results for a bit
longer.

So this gives us a bit more flexibility to play around with different
settings and tweak things.

Signed-off-by: moson <moson@archlinux.org>
2023-07-23 13:58:50 +02:00
moson
44c158b8c2
feat: Implement statistics class & additional metrics
The new module/class helps us construct queries and count records to
expose various statistics on the homepage. We also utilize it for some new
prometheus metrics (package and user gauges).
Record counts are being cached with Redis.

Signed-off-by: moson <moson@archlinux.org>
2023-07-23 13:58:50 +02:00
moson
347c2ce721
change: Change order of commit validation routine
We currently validate all commits going from latest -> oldest.

It would be nicer to go oldest -> latest so that, in case of errors,
we would indicate which commit "introduced" the problem.

Signed-off-by: moson <moson@archlinux.org>
2023-07-22 10:45:08 +02:00
moson
bc03d8b8f2
fix: Fix middleware checking for accepted terms
The current query is a bit mixed up. The intention was to return the
number of unaccepted records. As it stands, however, it also counts
records that were accepted by some other user.

Let's check the total number of terms vs. the number of accepted
records (by our user) instead.

Signed-off-by: moson <moson@archlinux.org>
2023-07-20 18:21:05 +02:00
moson
5729d6787f
fix: git links in comments for multiple OIDs
The chance of finding multiple object IDs when performing lookups with
a shortened SHA1 hash (7 digits) seems to be quite high.

In those cases pygit2 will throw an error.
Let's catch those exceptions and gracefully handle them.

Fixes: aurweb-errors#496 (and alike)
Signed-off-by: moson <moson@archlinux.org>
2023-07-17 12:45:16 +02:00
renovate
862221f5ce
fix(deps): update all non-major dependencies 2023-07-15 20:27:12 +00:00
moson
27819b4465
fix: /rss lazy load issue & perf improvements
Some fixes for the /rss endpoints

* Load all data in one go:
Previously data was lazy loaded, thus it made several sub-queries per
package (> 200 queries for composing the rss data for a single request).
Now we are performing a single SQL query.
(request time improvement: 550ms -> 130ms)
This also fixes aurweb-errors#510 and alike

* Remove some "dead code":
The fields "source, author, link" were never included in the rss output
(wrong or insufficient data passed to the different entry.xyz functions)
Nobody seems to be missing them anyway, so let's remove them.

* Remove "Last-Modified" header:
Obsolete since nginx can/will only handle "If-Modified-Since" requests
in its current configuration. All requests are passed to fastapi anyway.

Signed-off-by: moson <moson@archlinux.org>
2023-07-13 18:27:02 +02:00
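
The N+1 fix boils down to eager-loading the related rows in one statement; a sketch with SQLAlchemy's joinedload (relationship and column names are illustrative):

    from sqlalchemy.orm import Session, joinedload

    from aurweb.models import Package

    def latest_packages(session: Session, limit: int = 100) -> list:
        # One JOINed SELECT instead of a lazy sub-query per package.
        return (
            session.query(Package)
            .options(joinedload(Package.PackageBase))
            .order_by(Package.ID.desc())
            .limit(limit)
            .all()
        )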
moson
fa1212f2de
fix: translations not containing string formatting
In some translations we might be missing replacement placeholders (%).
This turns out to be problematic when calling the format function.

Wrap the jinja2 format function and just return the string unformatted
when % is missing.

Fixes: #341
Signed-off-by: moson <moson@archlinux.org>
2023-07-10 18:02:20 +02:00
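
A sketch of such a wrapper (the filter name and the way it gets registered are illustrative):

    def safe_format(value: str, *args, **kwargs) -> str:
        # Translations that lost their "%" placeholders would make the
        # printf-style format call blow up; return them untouched instead.
        if "%" not in value:
            return value
        return value % (kwargs or args)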
moson
c0bbe21d81
fix(test): correct test for ssh-key parsing
Our set of keys returned by "util.parse_ssh_keys" is unordered so we
have to adapt our test to not rely on a specific order for multiple keys.

Fixes: 5ccfa7c0fd ("fix: same ssh key entered multiple times")
Signed-off-by: moson <moson@archlinux.org>
2023-07-09 16:13:02 +02:00
moson
5ccfa7c0fd
fix: same ssh key entered multiple times
Users might accidentally paste their ssh key multiple times
when they try to register or edit their account.

Convert our list of keys to a set, removing any duplicate keys.

Signed-off-by: moson <moson@archlinux.org>
2023-07-09 14:52:15 +02:00
Leonidas Spyropoulos
225ce23761
chore(release): prepare for 6.2.6
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-07-08 12:54:43 +01:00
moson
4821fc1312
fix: show placeholder for deleted user in comments
show "<deleted-account>" in comment headers in case a user
deleted their account.

Signed-off-by: moson <moson@archlinux.org>
2023-07-08 13:44:24 +02:00
Leonidas Spyropoulos
1f40f6c5a0
housekeep: set current maintainers
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-07-08 12:38:19 +01:00
renovate
81d29b4c66
fix(deps): update dependency fastapi to ^0.100.0 2023-07-08 11:24:29 +00:00
renovate
7cde1ca560
fix(deps): update all non-major dependencies 2023-07-08 09:25:09 +00:00
moson-mo
f3f8c0a871
fix: add recipients to BCC when email is hidden
Package requests are sent to the ML as well as users (CC).
For those who chose to hide their mail address,
we should add them to the BCC list instead.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-07-08 11:19:02 +02:00
moson
9fe8d524ff
fix(test): MariaDB 11 upgrade, query result order
Fix order of recipients for "FlagNotification" test.
Apply sorting to the recipients query.
(only relevant for tests, but who knows when they change things again)

MariaDB 11 includes some changes related to the
query optimizer. Turns out that this might have effects
on how records are ordered for certain queries.
(in case no ORDER BY clause was specified)

https://mariadb.com/kb/en/mariadb-11-0-0-release-notes/
Signed-off-by: moson <moson@archlinux.org>
2023-07-08 10:32:26 +02:00
moson-mo
814ccf6b04
feat: add Prometheus metrics for Redis cache
Adding a Prometheus counter to be able to monitor cache hits/misses
for search queries

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-07-04 11:57:56 +02:00
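
Roughly the counter involved (the metric name may differ from aurweb's exact definition; the cache label matches its usage in the cache.py diff further down):

    from prometheus_client import Counter

    SEARCH_REQUESTS = Counter(
        "search_requests", "Search requests by cache hit/miss", labelnames=["cache"]
    )

    SEARCH_REQUESTS.labels(cache="hit").inc()
    SEARCH_REQUESTS.labels(cache="miss").inc()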
moson-mo
3acfb08a0f
feat: cache package search results with Redis
The queries being done on the package search page are quite costly.
(Especially the default one ordered by "Popularity" when navigating to /packages)

Let's add the search results to the Redis cache:
Every result of a search query is being pushed to Redis until we hit our maximum of 50k.
An entry expires and is evicted from the cache after 3 minutes.
Lifetime and max values are configurable.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-07-04 11:57:56 +02:00
moson-mo
7c8b9ba6bc
perf: add index to tweak our default search query
Adds an index on PackageBases.Popularity and PackageBases.Name to
improve performance of our default search query sorted by "Popularity"

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-07-02 13:55:21 +02:00
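
In migration form this is roughly the following (index and function names are illustrative):

    from alembic import op

    def upgrade():
        # Composite index matching the default search: ORDER BY Popularity, Name.
        op.create_index("BasesPopularityName", "PackageBases", ["Popularity", "Name"])

    def downgrade():
        op.drop_index("BasesPopularityName", table_name="PackageBases")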
moson-mo
c41f2e854a
perf: tweak some search queries
We are currently sorting on two columns in different tables, which is quite
expensive in terms of performance:
MariaDB first merges the data into some temporary table to apply the
sorting and record limiting.

We can tweak a couple of these queries by changing the "order by" clause
such that they refer to columns within the same table (PackageBases).
So instead of performing the secondary sort on "Packages.Name", we do
it on "PackageBases.Name".
This should still be "good enough" to produce properly sorted results.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-07-02 13:21:11 +02:00
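
A sketch of the order-by change (model attribute names as used in aurweb's ORM; the query itself is illustrative):

    from aurweb.models import Package, PackageBase

    def apply_default_sort(query):
        # Before: secondary sort on a column from another table.
        #   query.order_by(PackageBase.Popularity.desc(), Package.Name.asc())
        # After: both sort keys live in PackageBases, so MariaDB can skip
        # the temporary-table sort.
        return query.order_by(PackageBase.Popularity.desc(), PackageBase.Name.asc())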
Leonidas Spyropoulos
e2c113caee
chore(release): prepare for 6.2.5
Signed-off-by: Leonidas Spyropoulos <artafinde@archlinux.org>
2023-06-22 19:22:56 +01:00
moson-mo
143575c9de
fix: restore command, remove premature creation of pkgbase
We're currently creating a "PackageBases" record when the "restore" command is executed.

This is problematic for pkgbases that never existed before.
In those cases it will create the record but fail in the update.py script.
Thus it leaves an orphan "PackageBases" record in the DB
(which does not have any related "Packages" record(s))

Navigating to such a package's /pkgbase/... URL will result in a crash
since it is not foreseen to have "orphan" pkgbase records.

We can safely remove the early creation of that record because
it'll be taken care of in the update.py script that is being called

We'll also fix some tests. Before, it was executing a dummy script
instead of "update.py", which might be a bit misleading
since it did not check the real outcome of our "restore" action.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-16 14:22:22 +02:00
moson-mo
c6c81f0789
housekeep: Amend .gitignore and .dockerignore
Prevent some files/dirs from ending up in the repo / docker image:
* directories typically used for python virtualenvs
* files that are being generated by running tests

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-16 13:33:39 +02:00
moson-mo
32461f28ea
fix(docker): Suppress error PEP-668
When using docker (compose), we don't create a venv and just install
python packages system-wide.

With python 3.11 (PEP 668) we need to explicitly tell pip to allow this.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-15 14:19:02 +02:00
moson-mo
58158505b0
fix: browser hints for password fields
Co-authored-by: eNV25 <env252525@gmail.com>
Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-11 21:04:35 +02:00
moson-mo
ed17486da6
change(git): allow keys/pgp subdir with .asc files
This allows migration of git history for packages dropped from a repo to AUR
in case they contain PGP key material

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-11 12:20:02 +02:00
moson-mo
1c11c901a2
feat: switch requests filter for pkgname to "contains"
Use "contains" filtering instead of an exact match
when a package name filter is given.

This makes it easier to find requests for a "group" of packages.

Signed-off-by: moson-mo <mo-son@mailbox.org>
2023-06-10 09:40:35 +02:00
Christian Heusel
26b2566b3f
change: print the user name if connecting via ssh
this is similar to the message that gitlab produces:

$ ssh -T aur.archlinux.org
Welcome to AUR, gromit! Interactive shell is disabled.
Try `ssh ssh://aur@aur.archlinux.org help` for a list of commands.

$ ssh -T gitlab.archlinux.org
Welcome to GitLab, @gromit!

Signed-off-by: Christian Heusel <christian@heusel.eu>
2023-06-08 12:47:27 +02:00
Christian Heusel
e9cc2fb437
change: only require .SRCINFO in the latest revision
This is done in order to relax the constraints so that dropping packages
from the official repos can be done with preserving their history.

It's sufficient to have this present in the latest commit of a push.

Signed-off-by: Christian Heusel <christian@heusel.eu>
2023-06-07 18:54:31 +02:00
217 changed files with 9286 additions and 4845 deletions


@@ -1,6 +1,23 @@
-*/*.mo
+# Config files
 conf/config
 conf/config.sqlite
 conf/config.sqlite.defaults
 conf/docker
 conf/docker.defaults
+
+# Compiled translation files
+**/*.mo
+
+# Typical virtualenv directories
+env/
+venv/
+.venv/
+
+# Test output
+htmlcov/
+test-emails/
+test/__pycache__
+test/test-results
+test/trash_directory*
+.coverage
+.pytest_cache


@@ -1,5 +1,5 @@
 # EditorConfig configuration for aurweb
-# https://EditorConfig.org
+# https://editorconfig.org
 
 # Top-most EditorConfig file
 root = true

.gitignore

@@ -24,7 +24,6 @@ conf/docker
 conf/docker.defaults
 data.sql
 dummy-data.sql*
-env/
 fastapi_aw/
 htmlcov/
 po/*.mo
@@ -32,7 +31,7 @@ po/*.po~
 po/POTFILES
 schema/aur-schema-sqlite.sql
 test/test-results/
-test/trash directory*
+test/trash_directory*
 web/locale/*/
 web/html/*.gz
@@ -53,3 +52,12 @@ report.xml
 # Ignore test emails
 test-emails/
+
+# Ignore typical virtualenv directories
+env/
+venv/
+.venv/
+
+# Ignore some terraform files
+/ci/tf/.terraform
+/ci/tf/terraform.tfstate*


@@ -13,24 +13,22 @@ variables:
   TEST_RECURSION_LIMIT: 10000
   CURRENT_DIR: "$(pwd)"
   LOG_CONFIG: logging.test.conf
+  DEV_FQDN: aurweb-$CI_COMMIT_REF_SLUG.sandbox.archlinux.page
+  INFRASTRUCTURE_REPO: https://gitlab.archlinux.org/archlinux/infrastructure.git
 
 lint:
   stage: .pre
   before_script:
-    - pacman -Sy --noconfirm --noprogressbar --cachedir .pkg-cache
+    - pacman -Sy --noconfirm --noprogressbar
       archlinux-keyring
-    - pacman -Syu --noconfirm --noprogressbar --cachedir .pkg-cache
+    - pacman -Syu --noconfirm --noprogressbar
       git python python-pre-commit
   script:
-    # https://github.com/pre-commit/pre-commit/issues/2178#issuecomment-1002163763
-    - export SETUPTOOLS_USE_DISTUTILS=stdlib
     - export XDG_CACHE_HOME=.pre-commit
     - pre-commit run -a
 
 test:
   stage: test
-  tags:
-    - fast-single-thread
   before_script:
     - export PATH="$HOME/.poetry/bin:${PATH}"
     - ./docker/scripts/install-deps.sh
@@ -61,34 +59,103 @@ test:
        coverage_format: cobertura
        path: coverage.xml
 
-deploy:
-  stage: deploy
-  tags:
-    - secure
-  rules:
-    - if: $CI_COMMIT_BRANCH == "pu"
-      when: manual
-  variables:
-    FASTAPI_BACKEND: gunicorn
-    FASTAPI_WORKERS: 5
-    AURWEB_FASTAPI_PREFIX: https://aur-dev.archlinux.org
-    AURWEB_SSHD_PREFIX: ssh://aur@aur-dev.archlinux.org:2222
-    COMMIT_HASH: $CI_COMMIT_SHA
-    GIT_DATA_DIR: git_data
-  script:
-    - pacman -Syu --noconfirm docker docker-compose socat openssh
-    - chmod 600 ${SSH_KEY}
-    - socat "UNIX-LISTEN:/tmp/docker.sock,reuseaddr,fork" EXEC:"ssh -o UserKnownHostsFile=${SSH_KNOWN_HOSTS} -Ti ${SSH_KEY} ${SSH_USER}@${SSH_HOST}" &
-    - export DOCKER_HOST="unix:///tmp/docker.sock"
-    # Set secure login config for aurweb.
-    - sed -ri "s/^(disable_http_login).*$/\1 = 1/" conf/config.dev
-    - docker-compose build
-    - docker-compose -f docker-compose.yml -f docker-compose.aur-dev.yml down --remove-orphans
-    - docker-compose -f docker-compose.yml -f docker-compose.aur-dev.yml up -d
-    - docker image prune -f
-    - docker container prune -f
-    - docker volume prune -f
+.init_tf: &init_tf
+  - pacman -Syu --needed --noconfirm terraform
+  - export TF_VAR_name="aurweb-${CI_COMMIT_REF_SLUG}"
+  - TF_ADDRESS="${CI_API_V4_URL}/projects/${TF_STATE_PROJECT}/terraform/state/${CI_COMMIT_REF_SLUG}"
+  - cd ci/tf
+  - >
+    terraform init \
+      -backend-config="address=${TF_ADDRESS}" \
+      -backend-config="lock_address=${TF_ADDRESS}/lock" \
+      -backend-config="unlock_address=${TF_ADDRESS}/lock" \
+      -backend-config="username=x-access-token" \
+      -backend-config="password=${TF_STATE_GITLAB_ACCESS_TOKEN}" \
+      -backend-config="lock_method=POST" \
+      -backend-config="unlock_method=DELETE" \
+      -backend-config="retry_wait_min=5"
+
+deploy_review:
+  stage: deploy
+  script:
+    - *init_tf
+    - terraform apply -auto-approve
   environment:
-    name: development
-    url: https://aur-dev.archlinux.org
+    name: review/$CI_COMMIT_REF_NAME
+    url: https://$DEV_FQDN
+    on_stop: stop_review
+    auto_stop_in: 1 week
+  rules:
+    - if: $CI_COMMIT_REF_NAME =~ /^renovate\//
+      when: never
+    - if: $CI_MERGE_REQUEST_ID && $CI_PROJECT_PATH == "archlinux/aurweb"
+      when: manual
+
+provision_review:
+  stage: deploy
+  needs:
+    - deploy_review
+  script:
+    - *init_tf
+    - pacman -Syu --noconfirm --needed ansible git openssh jq
+    # Get ssh key from terraform state file
+    - mkdir -p ~/.ssh
+    - chmod 700 ~/.ssh
+    - terraform show -json |
+      jq -r '.values.root_module.resources[] |
+      select(.address == "tls_private_key.this") |
+      .values.private_key_openssh' > ~/.ssh/id_ed25519
+    - chmod 400 ~/.ssh/id_ed25519
+    # Clone infra repo
+    - git clone $INFRASTRUCTURE_REPO
+    - cd infrastructure
+    # Remove vault files
+    - rm $(git grep -l 'ANSIBLE_VAULT;1.1;AES256$')
+    # Remove vault config
+    - sed -i '/^vault/d' ansible.cfg
+    # Add host config
+    - mkdir -p host_vars/$DEV_FQDN
+    - 'echo "filesystem: btrfs" > host_vars/$DEV_FQDN/misc'
+    # Add host
+    - echo "$DEV_FQDN" > hosts
+    # Add our pubkey and hostkeys
+    - ssh-keyscan $DEV_FQDN >> ~/.ssh/known_hosts
+    - ssh-keygen -f ~/.ssh/id_ed25519 -y > pubkeys/aurweb-dev.pub
+    # Run our ansible playbook
+    - >
+      ansible-playbook playbooks/aur-dev.archlinux.org.yml \
+        -e "aurdev_fqdn=$DEV_FQDN" \
+        -e "aurweb_repository=$CI_REPOSITORY_URL" \
+        -e "aurweb_version=$CI_COMMIT_SHA" \
+        -e "{\"vault_mariadb_users\":{\"root\":\"aur\"}}" \
+        -e "vault_aurweb_db_password=aur" \
+        -e "vault_aurweb_gitlab_instance=https://does.not.exist" \
+        -e "vault_aurweb_error_project=set-me" \
+        -e "vault_aurweb_error_token=set-me" \
+        -e "vault_aurweb_secret=aur" \
+        -e "vault_goaurrpc_metrics_token=aur" \
+        -e '{"root_additional_keys": ["moson.pub", "aurweb-dev.pub"]}'
+  environment:
+    name: review/$CI_COMMIT_REF_NAME
+    action: access
+  rules:
+    - if: $CI_COMMIT_REF_NAME =~ /^renovate\//
+      when: never
+    - if: $CI_MERGE_REQUEST_ID && $CI_PROJECT_PATH == "archlinux/aurweb"
+
+stop_review:
+  stage: deploy
+  needs:
+    - deploy_review
+  script:
+    - *init_tf
+    - terraform destroy -auto-approve
+    - 'curl --silent --show-error --fail --header "Private-Token: ${TF_STATE_GITLAB_ACCESS_TOKEN}" --request DELETE "${CI_API_V4_URL}/projects/${TF_STATE_PROJECT}/terraform/state/${CI_COMMIT_REF_SLUG}"'
+  environment:
+    name: review/$CI_COMMIT_REF_NAME
+    action: stop
+  rules:
+    - if: $CI_COMMIT_REF_NAME =~ /^renovate\//
+      when: never
+    - if: $CI_MERGE_REQUEST_ID && $CI_PROJECT_PATH == "archlinux/aurweb"
+      when: manual


@@ -1,6 +1,6 @@
 repos:
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v4.4.0
+    rev: v4.5.0
     hooks:
       - id: check-added-large-files
       - id: check-case-conflict
@@ -12,7 +12,7 @@ repos:
       - id: debug-statements
 
   - repo: https://github.com/myint/autoflake
-    rev: v2.0.1
+    rev: v2.3.1
     hooks:
       - id: autoflake
         args:
@@ -21,16 +21,16 @@ repos:
           - --ignore-init-module-imports
 
   - repo: https://github.com/pycqa/isort
-    rev: 5.12.0
+    rev: 5.13.2
     hooks:
       - id: isort
 
   - repo: https://github.com/psf/black
-    rev: 23.1.0
+    rev: 24.4.1
     hooks:
       - id: black
 
   - repo: https://github.com/PyCQA/flake8
-    rev: 6.0.0
+    rev: 7.0.0
     hooks:
       - id: flake8


@@ -1,5 +1,5 @@
 [main]
-host = https://www.transifex.com
+host = https://app.transifex.com
 
 [o:lfleischer:p:aurweb:r:aurwebpot]
 file_filter = po/<lang>.po


@@ -30,11 +30,6 @@ read the instructions below.
     ssl_certificate /etc/ssl/certs/aur.cert.pem;
     ssl_certificate_key /etc/ssl/private/aur.key.pem;
 
-    # TU Bylaws redirect.
-    location = /trusted-user/TUbylaws.html {
-        return 301 https://tu-bylaws.aur.archlinux.org;
-    }
-
     # smartgit location.
     location ~ "^/([a-z0-9][a-z0-9.+_-]*?)(\.git)?/(git-(receive|upload)-pack|HEAD|info/refs|objects/(info/(http-)?alternates|packs)|[0-9a-f]{2}/[0-9a-f]{38}|pack/pack-[0-9a-f]{40}\.(pack|idx))$" {
         include uwsgi_params;
@@ -125,7 +120,7 @@ interval:
 */2 * * * * bash -c 'poetry run aurweb-pkgmaint'
 */2 * * * * bash -c 'poetry run aurweb-usermaint'
 */2 * * * * bash -c 'poetry run aurweb-popupdate'
-*/12 * * * * bash -c 'poetry run aurweb-tuvotereminder'
+*/12 * * * * bash -c 'poetry run aurweb-votereminder'
 
 7) Create a new database and a user and import the aurweb SQL schema:


@@ -11,8 +11,8 @@ The aurweb project includes
 * A web interface to search for packaging scripts and display package details.
 * An SSH/Git interface to submit and update packages and package meta data.
 * Community features such as comments, votes, package flagging and requests.
-* Editing/deletion of packages and accounts by Trusted Users and Developers.
-* Area for Trusted Users to post AUR-related proposals and vote on them.
+* Editing/deletion of packages and accounts by Package Maintainers and Developers.
+* Area for Package Maintainers to post AUR-related proposals and vote on them.
 
 Directory Layout
 ----------------
@@ -56,7 +56,7 @@ Translations
 ------------
 
 Translations are welcome via our Transifex project at
-https://www.transifex.com/lfleischer/aurweb; see `doc/i18n.txt` for details.
+https://www.transifex.com/lfleischer/aurweb; see [doc/i18n.md](./doc/i18n.md) for details.
 
 ![Transifex](https://www.transifex.com/projects/p/aurweb/chart/image_png)


@@ -6,6 +6,7 @@ import re
 import sys
 import traceback
 import typing
+from contextlib import asynccontextmanager
 from urllib.parse import quote_plus
 
 import requests
@@ -13,8 +14,13 @@ from fastapi import FastAPI, HTTPException, Request, Response
 from fastapi.responses import RedirectResponse
 from fastapi.staticfiles import StaticFiles
 from jinja2 import TemplateNotFound
-from prometheus_client import multiprocess
-from sqlalchemy import and_, or_
+from opentelemetry import trace
+from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
+from opentelemetry.instrumentation.fastapi import FastAPIInstrumentor
+from opentelemetry.sdk.resources import Resource
+from opentelemetry.sdk.trace import TracerProvider
+from opentelemetry.sdk.trace.export import BatchSpanProcessor
+from sqlalchemy import and_
 from starlette.exceptions import HTTPException as StarletteHTTPException
 from starlette.middleware.authentication import AuthenticationMiddleware
 from starlette.middleware.sessions import SessionMiddleware
@@ -22,7 +28,6 @@ from starlette.middleware.sessions import SessionMiddleware
 import aurweb.captcha  # noqa: F401
 import aurweb.config
 import aurweb.filters  # noqa: F401
-import aurweb.pkgbase.util as pkgbaseutil
 from aurweb import aur_logging, prometheus, util
 from aurweb.aur_redis import redis_connection
 from aurweb.auth import BasicAuthBackend
@@ -34,11 +39,18 @@ from aurweb.routers import APP_ROUTES
 from aurweb.templates import make_context, render_template
 
 logger = aur_logging.get_logger(__name__)
-session_secret = aurweb.config.get("fastapi", "session_secret")
+
+
+@asynccontextmanager
+async def lifespan(app: FastAPI):
+    await app_startup()
+    yield
+
 
 # Setup the FastAPI app.
-app = FastAPI()
+app = FastAPI(lifespan=lifespan)
+
+session_secret = aurweb.config.get("fastapi", "session_secret")
 
 # Instrument routes with the prometheus-fastapi-instrumentator
 # library with custom collectors and expose /metrics.
@@ -47,7 +59,17 @@ instrumentator().add(prometheus.http_requests_total())
 instrumentator().instrument(app)
 
-@app.on_event("startup")
+# Instrument FastAPI for tracing
+FastAPIInstrumentor.instrument_app(app)
+
+resource = Resource(attributes={"service.name": "aurweb"})
+otlp_endpoint = aurweb.config.get("tracing", "otlp_endpoint")
+otlp_exporter = OTLPSpanExporter(endpoint=otlp_endpoint)
+span_processor = BatchSpanProcessor(otlp_exporter)
+trace.set_tracer_provider(TracerProvider(resource=resource))
+trace.get_tracer_provider().add_span_processor(span_processor)
+
+
 async def app_startup():
     # https://stackoverflow.com/questions/67054759/about-the-maximum-recursion-error-in-fastapi
     # Test failures have been observed by internal starlette code when
@@ -91,12 +113,6 @@ async def app_startup():
     get_engine()
 
 
-def child_exit(server, worker):  # pragma: no cover
-    """This function is required for gunicorn customization
-    of prometheus multiprocessing."""
-    multiprocess.mark_process_dead(worker.pid)
-
-
 async def internal_server_error(request: Request, exc: Exception) -> Response:
     """
     Catch all uncaught Exceptions thrown in a route.
@@ -212,10 +228,16 @@ async def http_exception_handler(request: Request, exc: HTTPException) -> Respon
     if exc.status_code == http.HTTPStatus.NOT_FOUND:
         tokens = request.url.path.split("/")
         matches = re.match("^([a-z0-9][a-z0-9.+_-]*?)(\\.git)?$", tokens[1])
-        if matches:
+        if matches and len(tokens) == 2:
             try:
                 pkgbase = get_pkg_or_base(matches.group(1))
-                context = pkgbaseutil.make_context(request, pkgbase)
+                context["pkgbase"] = pkgbase
+                context["git_clone_uri_anon"] = aurweb.config.get(
+                    "options", "git_clone_uri_anon"
+                )
+                context["git_clone_uri_priv"] = aurweb.config.get(
+                    "options", "git_clone_uri_priv"
+                )
             except HTTPException:
                 pass
@@ -277,21 +299,18 @@ async def check_terms_of_service(request: Request, call_next: typing.Callable):
     """This middleware function redirects authenticated users if they
     have any outstanding Terms to agree to."""
     if request.user.is_authenticated() and request.url.path != "/tos":
-        unaccepted = (
+        accepted = (
             query(Term)
             .join(AcceptedTerm)
             .filter(
-                or_(
-                    AcceptedTerm.UsersID != request.user.ID,
-                    and_(
-                        AcceptedTerm.UsersID == request.user.ID,
-                        AcceptedTerm.TermsID == Term.ID,
-                        AcceptedTerm.Revision < Term.Revision,
-                    ),
-                )
+                and_(
+                    AcceptedTerm.UsersID == request.user.ID,
+                    AcceptedTerm.TermsID == Term.ID,
+                    AcceptedTerm.Revision >= Term.Revision,
+                ),
             )
         )
-        if query(Term).count() > unaccepted.count():
+        if query(Term).count() - accepted.count() > 0:
             return RedirectResponse("/tos", status_code=int(http.HTTPStatus.SEE_OTHER))
 
     return await util.error_or_result(call_next, request)


@@ -1,4 +1,5 @@
 import fakeredis
+from opentelemetry.instrumentation.redis import RedisInstrumentor
 from redis import ConnectionPool, Redis
 
 import aurweb.config
@@ -7,6 +8,8 @@ from aurweb import aur_logging
 logger = aur_logging.get_logger(__name__)
 pool = None
 
+RedisInstrumentor().instrument()
+
 
 class FakeConnectionPool:
     """A fake ConnectionPool class which holds an internal reference


@@ -71,7 +71,7 @@ class AnonymousUser:
         return False
 
     @staticmethod
-    def is_trusted_user():
+    def is_package_maintainer():
         return False
 
     @staticmethod
@@ -205,7 +205,7 @@ def account_type_required(one_of: set):
         @router.get('/some_route')
         @auth_required(True)
-        @account_type_required({"Trusted User", "Trusted User & Developer"})
+        @account_type_required({"Package Maintainer", "Package Maintainer & Developer"})
         async def some_route(request: fastapi.Request):
             return Response()


@@ -1,7 +1,7 @@
 from aurweb.models.account_type import (
     DEVELOPER_ID,
-    TRUSTED_USER_AND_DEV_ID,
-    TRUSTED_USER_ID,
+    PACKAGE_MAINTAINER_AND_DEV_ID,
+    PACKAGE_MAINTAINER_ID,
     USER_ID,
 )
 from aurweb.models.user import User
@@ -30,47 +30,49 @@ PKGBASE_VOTE = 16
 PKGREQ_FILE = 23
 PKGREQ_CLOSE = 17
 PKGREQ_LIST = 18
-TU_ADD_VOTE = 19
-TU_LIST_VOTES = 20
-TU_VOTE = 21
+PM_ADD_VOTE = 19
+PM_LIST_VOTES = 20
+PM_VOTE = 21
 PKGBASE_MERGE = 29
 
-user_developer_or_trusted_user = set(
-    [USER_ID, TRUSTED_USER_ID, DEVELOPER_ID, TRUSTED_USER_AND_DEV_ID]
+user_developer_or_package_maintainer = set(
+    [USER_ID, PACKAGE_MAINTAINER_ID, DEVELOPER_ID, PACKAGE_MAINTAINER_AND_DEV_ID]
 )
-trusted_user_or_dev = set([TRUSTED_USER_ID, DEVELOPER_ID, TRUSTED_USER_AND_DEV_ID])
-developer = set([DEVELOPER_ID, TRUSTED_USER_AND_DEV_ID])
-trusted_user = set([TRUSTED_USER_ID, TRUSTED_USER_AND_DEV_ID])
+package_maintainer_or_dev = set(
+    [PACKAGE_MAINTAINER_ID, DEVELOPER_ID, PACKAGE_MAINTAINER_AND_DEV_ID]
+)
+developer = set([DEVELOPER_ID, PACKAGE_MAINTAINER_AND_DEV_ID])
+package_maintainer = set([PACKAGE_MAINTAINER_ID, PACKAGE_MAINTAINER_AND_DEV_ID])
 
 cred_filters = {
-    PKGBASE_FLAG: user_developer_or_trusted_user,
-    PKGBASE_NOTIFY: user_developer_or_trusted_user,
-    PKGBASE_VOTE: user_developer_or_trusted_user,
-    PKGREQ_FILE: user_developer_or_trusted_user,
-    ACCOUNT_CHANGE_TYPE: trusted_user_or_dev,
-    ACCOUNT_EDIT: trusted_user_or_dev,
-    ACCOUNT_LAST_LOGIN: trusted_user_or_dev,
-    ACCOUNT_LIST_COMMENTS: trusted_user_or_dev,
-    ACCOUNT_SEARCH: trusted_user_or_dev,
-    COMMENT_DELETE: trusted_user_or_dev,
-    COMMENT_UNDELETE: trusted_user_or_dev,
-    COMMENT_VIEW_DELETED: trusted_user_or_dev,
-    COMMENT_EDIT: trusted_user_or_dev,
-    COMMENT_PIN: trusted_user_or_dev,
-    PKGBASE_ADOPT: trusted_user_or_dev,
-    PKGBASE_SET_KEYWORDS: trusted_user_or_dev,
-    PKGBASE_DELETE: trusted_user_or_dev,
-    PKGBASE_EDIT_COMAINTAINERS: trusted_user_or_dev,
-    PKGBASE_DISOWN: trusted_user_or_dev,
-    PKGBASE_LIST_VOTERS: trusted_user_or_dev,
-    PKGBASE_UNFLAG: trusted_user_or_dev,
-    PKGREQ_CLOSE: trusted_user_or_dev,
-    PKGREQ_LIST: trusted_user_or_dev,
-    TU_ADD_VOTE: trusted_user,
-    TU_LIST_VOTES: trusted_user_or_dev,
-    TU_VOTE: trusted_user,
+    PKGBASE_FLAG: user_developer_or_package_maintainer,
+    PKGBASE_NOTIFY: user_developer_or_package_maintainer,
+    PKGBASE_VOTE: user_developer_or_package_maintainer,
+    PKGREQ_FILE: user_developer_or_package_maintainer,
+    ACCOUNT_CHANGE_TYPE: package_maintainer_or_dev,
+    ACCOUNT_EDIT: package_maintainer_or_dev,
+    ACCOUNT_LAST_LOGIN: package_maintainer_or_dev,
+    ACCOUNT_LIST_COMMENTS: package_maintainer_or_dev,
+    ACCOUNT_SEARCH: package_maintainer_or_dev,
+    COMMENT_DELETE: package_maintainer_or_dev,
+    COMMENT_UNDELETE: package_maintainer_or_dev,
+    COMMENT_VIEW_DELETED: package_maintainer_or_dev,
+    COMMENT_EDIT: package_maintainer_or_dev,
+    COMMENT_PIN: package_maintainer_or_dev,
+    PKGBASE_ADOPT: package_maintainer_or_dev,
+    PKGBASE_SET_KEYWORDS: package_maintainer_or_dev,
+    PKGBASE_DELETE: package_maintainer_or_dev,
+    PKGBASE_EDIT_COMAINTAINERS: package_maintainer_or_dev,
+    PKGBASE_DISOWN: package_maintainer_or_dev,
+    PKGBASE_LIST_VOTERS: package_maintainer_or_dev,
+    PKGBASE_UNFLAG: package_maintainer_or_dev,
+    PKGREQ_CLOSE: package_maintainer_or_dev,
+    PKGREQ_LIST: package_maintainer_or_dev,
+    PM_ADD_VOTE: package_maintainer,
+    PM_LIST_VOTES: package_maintainer_or_dev,
+    PM_VOTE: package_maintainer,
     ACCOUNT_EDIT_DEV: developer,
-    PKGBASE_MERGE: trusted_user_or_dev,
+    PKGBASE_MERGE: package_maintainer_or_dev,
 }


@@ -1,4 +1,4 @@
-from datetime import datetime
+from datetime import UTC, datetime
 
 
 class Benchmark:
@@ -7,7 +7,7 @@ class Benchmark:
 
     def _timestamp(self) -> float:
         """Generate a timestamp."""
-        return float(datetime.utcnow().timestamp())
+        return float(datetime.now(UTC).timestamp())
 
     def start(self) -> int:
         """Start a benchmark."""


@@ -1,21 +1,64 @@
-from redis import Redis
+import pickle
+from typing import Any, Callable
+
 from sqlalchemy import orm
 
+from aurweb import config
+from aurweb.aur_redis import redis_connection
+from aurweb.prometheus import SEARCH_REQUESTS
 
-async def db_count_cache(
-    redis: Redis, key: str, query: orm.Query, expire: int = None
-) -> int:
-    """Store and retrieve a query.count() via redis cache.
+_redis = redis_connection()
+
+
+def lambda_cache(key: str, value: Callable[[], Any], expire: int = None) -> list:
+    """Store and retrieve lambda results via redis cache.
+
+    :param key: Redis key
+    :param value: Lambda callable returning the value
+    :param expire: Optional expiration in seconds
+    :return: result of callable or cache
+    """
+    result = _redis.get(key)
+    if result is not None:
+        return pickle.loads(result)
+
+    _redis.set(key, (pickle.dumps(result := value())), ex=expire)
+    return result
+
+
+def db_count_cache(key: str, query: orm.Query, expire: int = None) -> int:
+    """Store and retrieve a query.count() via redis cache.
 
-    :param redis: Redis handle
     :param key: Redis key
     :param query: SQLAlchemy ORM query
     :param expire: Optional expiration in seconds
     :return: query.count()
     """
-    result = redis.get(key)
+    result = _redis.get(key)
     if result is None:
-        redis.set(key, (result := int(query.count())))
+        _redis.set(key, (result := int(query.count())))
         if expire:
-            redis.expire(key, expire)
-
+            _redis.expire(key, expire)
     return int(result)
+
+
+def db_query_cache(key: str, query: orm.Query, expire: int = None) -> list:
+    """Store and retrieve query results via redis cache.
+
+    :param key: Redis key
+    :param query: SQLAlchemy ORM query
+    :param expire: Optional expiration in seconds
+    :return: query.all()
+    """
+    result = _redis.get(key)
+    if result is None:
+        SEARCH_REQUESTS.labels(cache="miss").inc()
+
+        if _redis.dbsize() > config.getint("cache", "max_search_entries", 50000):
+            return query.all()
+
+        _redis.set(key, (result := pickle.dumps(query.all())))
+        if expire:
+            _redis.expire(key, expire)
+    else:
+        SEARCH_REQUESTS.labels(cache="hit").inc()
+
+    return pickle.loads(result)
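
A short usage sketch for the new module-level cache helpers; the keys, query, and cached values below are illustrative, not taken from aurweb:

from aurweb import db
from aurweb.cache import db_count_cache, db_query_cache, lambda_cache
from aurweb.models import PackageBase

# Cache an expensive COUNT for five minutes under a fixed key.
orphans = db.query(PackageBase).filter(PackageBase.MaintainerUID.is_(None))
orphan_count = db_count_cache("orphan_count", orphans, expire=300)

# Cache full result rows (pickled) for a search-style query.
rows = db_query_cache("search:example", db.query(PackageBase).limit(50), expire=300)

# Cache any callable's result; the lambda only runs on a cache miss.
fingerprints = lambda_cache("example_key", lambda: {"ed25519": "..."}, expire=3600)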


@@ -1,7 +1,9 @@
 """ This module consists of aurweb's CAPTCHA utility functions and filters. """
+
 import hashlib
 
 from jinja2 import pass_context
+from sqlalchemy import func
 
 from aurweb.db import query
 from aurweb.models import User
@@ -10,7 +12,8 @@ from aurweb.templates import register_filter
 
 def get_captcha_salts():
     """Produce salts based on the current user count."""
-    count = query(User).count()
+    count = query(func.count(User.ID)).scalar()
+
     salts = []
     for i in range(0, 6):
         salts.append(f"aurweb-{count - i}")


@@ -298,9 +298,12 @@ def get_engine(dbname: str = None, echo: bool = False):
         connect_args["check_same_thread"] = False
 
     kwargs = {"echo": echo, "connect_args": connect_args}
+    from opentelemetry.instrumentation.sqlalchemy import SQLAlchemyInstrumentor
     from sqlalchemy import create_engine
 
-    _engines[dbname] = create_engine(get_sqlalchemy_url(), **kwargs)
+    engine = create_engine(get_sqlalchemy_url(), **kwargs)
+    SQLAlchemyInstrumentor().instrument(engine=engine)
+    _engines[dbname] = engine
 
     if is_sqlite:  # pragma: no cover
         setup_sqlite(_engines.get(dbname))
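
SQLAlchemyInstrumentor().instrument(engine=engine) registers OpenTelemetry hooks on the engine so each statement is emitted as a span. A hedged sketch of wiring an engine up this way with a console exporter for local testing; the exporter setup is an assumption for illustration and is not part of this diff:

from opentelemetry import trace
from opentelemetry.instrumentation.sqlalchemy import SQLAlchemyInstrumentor
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor
from sqlalchemy import create_engine, text

provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

engine = create_engine("sqlite:///:memory:")
SQLAlchemyInstrumentor().instrument(engine=engine)

with engine.connect() as conn:
    conn.execute(text("SELECT 1"))  # emits one span per statement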


@ -1,6 +1,6 @@
import copy import copy
import math import math
from datetime import datetime from datetime import UTC, datetime
from typing import Any, Union from typing import Any, Union
from urllib.parse import quote_plus, urlencode from urllib.parse import quote_plus, urlencode
from zoneinfo import ZoneInfo from zoneinfo import ZoneInfo
@ -8,6 +8,7 @@ from zoneinfo import ZoneInfo
import fastapi import fastapi
import paginate import paginate
from jinja2 import pass_context from jinja2 import pass_context
from jinja2.filters import do_format
import aurweb.models import aurweb.models
from aurweb import config, l10n from aurweb import config, l10n
@ -93,7 +94,7 @@ def tn(context: dict[str, Any], count: int, singular: str, plural: str) -> str:
@register_filter("dt") @register_filter("dt")
def timestamp_to_datetime(timestamp: int): def timestamp_to_datetime(timestamp: int):
return datetime.utcfromtimestamp(int(timestamp)) return datetime.fromtimestamp(timestamp, UTC)
@register_filter("as_timezone") @register_filter("as_timezone")
@ -117,9 +118,9 @@ def to_qs(query: dict[str, Any]) -> str:
@register_filter("get_vote") @register_filter("get_vote")
def get_vote(voteinfo, request: fastapi.Request): def get_vote(voteinfo, request: fastapi.Request):
from aurweb.models import TUVote from aurweb.models import Vote
return voteinfo.tu_votes.filter(TUVote.User == request.user).first() return voteinfo.votes.filter(Vote.User == request.user).first()
@register_filter("number_format") @register_filter("number_format")
@ -164,3 +165,17 @@ def date_display(context: dict[str, Any], dt: Union[int, datetime]) -> str:
@pass_context @pass_context
def datetime_display(context: dict[str, Any], dt: Union[int, datetime]) -> str: def datetime_display(context: dict[str, Any], dt: Union[int, datetime]) -> str:
return date_strftime(context, dt, "%Y-%m-%d %H:%M (%Z)") return date_strftime(context, dt, "%Y-%m-%d %H:%M (%Z)")
@register_filter("format")
def safe_format(value: str, *args: Any, **kwargs: Any) -> str:
"""Wrapper for jinja2 format function to perform additional checks."""
# If we don't have anything to be formatted, just return the value.
# We have some translations that do not contain placeholders for replacement.
# In these cases the jinja2 function is throwing an error:
# "TypeError: not all arguments converted during string formatting"
if "%" not in value:
return value
return do_format(value, *args, **kwargs)
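
safe_format only delegates to jinja2's do_format when the string actually contains printf-style placeholders, so translations without a "%s" no longer raise TypeError. A small behavioural sketch; the strings are illustrative:

from aurweb.filters import safe_format

print(safe_format("Flag package out-of-date"))           # returned unchanged, no "%" present
print(safe_format("Flag %s out-of-date", "python-foo"))  # "Flag python-foo out-of-date"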


@ -52,7 +52,7 @@ def list_repos(user):
conn.close() conn.close()
def create_pkgbase(pkgbase, user): def validate_pkgbase(pkgbase, user):
if not re.match(repo_regex, pkgbase): if not re.match(repo_regex, pkgbase):
raise aurweb.exceptions.InvalidRepositoryNameException(pkgbase) raise aurweb.exceptions.InvalidRepositoryNameException(pkgbase)
if pkgbase_exists(pkgbase): if pkgbase_exists(pkgbase):
@ -62,26 +62,12 @@ def create_pkgbase(pkgbase, user):
cur = conn.execute("SELECT ID FROM Users WHERE Username = ?", [user]) cur = conn.execute("SELECT ID FROM Users WHERE Username = ?", [user])
userid = cur.fetchone()[0] userid = cur.fetchone()[0]
conn.close()
if userid == 0: if userid == 0:
raise aurweb.exceptions.InvalidUserException(user) raise aurweb.exceptions.InvalidUserException(user)
now = int(time.time())
cur = conn.execute(
"INSERT INTO PackageBases (Name, SubmittedTS, "
+ "ModifiedTS, SubmitterUID, MaintainerUID, "
+ "FlaggerComment) VALUES (?, ?, ?, ?, ?, '')",
[pkgbase, now, now, userid, userid],
)
pkgbase_id = cur.lastrowid
cur = conn.execute(
"INSERT INTO PackageNotifications " + "(PackageBaseID, UserID) VALUES (?, ?)",
[pkgbase_id, userid],
)
conn.commit()
conn.close()
def pkgbase_adopt(pkgbase, user, privileged): def pkgbase_adopt(pkgbase, user, privileged):
pkgbase_id = pkgbase_from_name(pkgbase) pkgbase_id = pkgbase_from_name(pkgbase)
@ -279,7 +265,7 @@ def pkgbase_disown(pkgbase, user, privileged):
conn = aurweb.db.Connection() conn = aurweb.db.Connection()
# Make the first co-maintainer the new maintainer, unless the action was # Make the first co-maintainer the new maintainer, unless the action was
# enforced by a Trusted User. # enforced by a Package Maintainer.
if initialized_by_owner: if initialized_by_owner:
comaintainers = pkgbase_get_comaintainers(pkgbase) comaintainers = pkgbase_get_comaintainers(pkgbase)
if len(comaintainers) > 0: if len(comaintainers) > 0:
@ -577,7 +563,7 @@ def serve(action, cmdargv, user, privileged, remote_addr): # noqa: C901
checkarg(cmdargv, "repository name") checkarg(cmdargv, "repository name")
pkgbase = cmdargv[1] pkgbase = cmdargv[1]
create_pkgbase(pkgbase, user) validate_pkgbase(pkgbase, user)
os.environ["AUR_USER"] = user os.environ["AUR_USER"] = user
os.environ["AUR_PKGBASE"] = pkgbase os.environ["AUR_PKGBASE"] = pkgbase
@ -648,7 +634,7 @@ def main():
ssh_client = os.environ.get("SSH_CLIENT") ssh_client = os.environ.get("SSH_CLIENT")
if not ssh_cmd: if not ssh_cmd:
die_with_help("Interactive shell is disabled.") die_with_help(f"Welcome to AUR, {user}! Interactive shell is disabled.")
cmdargv = shlex.split(ssh_cmd) cmdargv = shlex.split(ssh_cmd)
action = cmdargv[0] action = cmdargv[0]
remote_addr = ssh_client.split(" ")[0] if ssh_client else None remote_addr = ssh_client.split(" ")[0] if ssh_client else None


@ -258,6 +258,71 @@ def die_commit(msg, commit):
exit(1) exit(1)
def validate_metadata(metadata, commit): # noqa: C901
try:
metadata_pkgbase = metadata["pkgbase"]
except KeyError:
die_commit(
"invalid .SRCINFO, does not contain a pkgbase (is the file empty?)",
str(commit.id),
)
if not re.match(repo_regex, metadata_pkgbase):
die_commit("invalid pkgbase: {:s}".format(metadata_pkgbase), str(commit.id))
if not metadata["packages"]:
die_commit("missing pkgname entry", str(commit.id))
for pkgname in set(metadata["packages"].keys()):
pkginfo = srcinfo.utils.get_merged_package(pkgname, metadata)
for field in ("pkgver", "pkgrel", "pkgname"):
if field not in pkginfo:
die_commit(
"missing mandatory field: {:s}".format(field), str(commit.id)
)
if "epoch" in pkginfo and not pkginfo["epoch"].isdigit():
die_commit("invalid epoch: {:s}".format(pkginfo["epoch"]), str(commit.id))
if not re.match(r"[a-z0-9][a-z0-9\.+_-]*$", pkginfo["pkgname"]):
die_commit(
"invalid package name: {:s}".format(pkginfo["pkgname"]),
str(commit.id),
)
max_len = {"pkgname": 255, "pkgdesc": 255, "url": 8000}
for field in max_len.keys():
if field in pkginfo and len(pkginfo[field]) > max_len[field]:
die_commit(
"{:s} field too long: {:s}".format(field, pkginfo[field]),
str(commit.id),
)
for field in ("install", "changelog"):
if field in pkginfo and not pkginfo[field] in commit.tree:
die_commit(
"missing {:s} file: {:s}".format(field, pkginfo[field]),
str(commit.id),
)
for field in extract_arch_fields(pkginfo, "source"):
fname = field["value"]
if len(fname) > 8000:
die_commit("source entry too long: {:s}".format(fname), str(commit.id))
if "://" in fname or "lp:" in fname:
continue
if fname not in commit.tree:
die_commit("missing source file: {:s}".format(fname), str(commit.id))
def validate_blob_size(blob: pygit2.Object, commit: pygit2.Commit):
if isinstance(blob, pygit2.Blob) and blob.size > max_blob_size:
die_commit(
"maximum blob size ({:s}) exceeded".format(size_humanize(max_blob_size)),
str(commit.id),
)
def main(): # noqa: C901 def main(): # noqa: C901
repo = pygit2.Repository(repo_path) repo = pygit2.Repository(repo_path)
@ -291,110 +356,69 @@ def main(): # noqa: C901
die("denying non-fast-forward (you should pull first)") die("denying non-fast-forward (you should pull first)")
# Prepare the walker that validates new commits. # Prepare the walker that validates new commits.
walker = repo.walk(sha1_new, pygit2.GIT_SORT_TOPOLOGICAL) walker = repo.walk(sha1_new, pygit2.GIT_SORT_REVERSE)
if sha1_old != "0" * 40: if sha1_old != "0" * 40:
walker.hide(sha1_old) walker.hide(sha1_old)
head_commit = repo[sha1_new]
if ".SRCINFO" not in head_commit.tree:
die_commit("missing .SRCINFO", str(head_commit.id))
# Read .SRCINFO from the HEAD commit.
metadata_raw = repo[head_commit.tree[".SRCINFO"].id].data.decode()
(metadata, errors) = srcinfo.parse.parse_srcinfo(metadata_raw)
if errors:
sys.stderr.write(
"error: The following errors occurred " "when parsing .SRCINFO in commit\n"
)
sys.stderr.write("error: {:s}:\n".format(str(head_commit.id)))
for error in errors:
for err in error["error"]:
sys.stderr.write("error: line {:d}: {:s}\n".format(error["line"], err))
exit(1)
# check if there is a correct .SRCINFO file in the latest revision
validate_metadata(metadata, head_commit)
# Validate all new commits. # Validate all new commits.
for commit in walker: for commit in walker:
for fname in (".SRCINFO", "PKGBUILD"): if "PKGBUILD" not in commit.tree:
if fname not in commit.tree: die_commit("missing PKGBUILD", str(commit.id))
die_commit("missing {:s}".format(fname), str(commit.id))
# Iterate over files in root dir
for treeobj in commit.tree: for treeobj in commit.tree:
blob = repo[treeobj.id] # Don't allow any subdirs besides "keys/"
if isinstance(treeobj, pygit2.Tree) and treeobj.name != "keys":
if isinstance(blob, pygit2.Tree):
die_commit( die_commit(
"the repository must not contain subdirectories", str(commit.id) "the repository must not contain subdirectories",
)
if not isinstance(blob, pygit2.Blob):
die_commit("not a blob object: {:s}".format(treeobj), str(commit.id))
if blob.size > max_blob_size:
die_commit(
"maximum blob size ({:s}) exceeded".format(
size_humanize(max_blob_size)
),
str(commit.id), str(commit.id),
) )
metadata_raw = repo[commit.tree[".SRCINFO"].id].data.decode() # Check size of files in root dir
(metadata, errors) = srcinfo.parse.parse_srcinfo(metadata_raw) validate_blob_size(treeobj, commit)
if errors:
sys.stderr.write(
"error: The following errors occurred "
"when parsing .SRCINFO in commit\n"
)
sys.stderr.write("error: {:s}:\n".format(str(commit.id)))
for error in errors:
for err in error["error"]:
sys.stderr.write(
"error: line {:d}: {:s}\n".format(error["line"], err)
)
exit(1)
try: # If we got a subdir keys/,
metadata_pkgbase = metadata["pkgbase"] # make sure it only contains a pgp/ subdir with key files
except KeyError: if "keys" in commit.tree:
die_commit( # Check for forbidden files/dirs in keys/
"invalid .SRCINFO, does not contain a pkgbase (is the file empty?)", for keyobj in commit.tree["keys"]:
str(commit.id), if not isinstance(keyobj, pygit2.Tree) or keyobj.name != "pgp":
)
if not re.match(repo_regex, metadata_pkgbase):
die_commit("invalid pkgbase: {:s}".format(metadata_pkgbase), str(commit.id))
if not metadata["packages"]:
die_commit("missing pkgname entry", str(commit.id))
for pkgname in set(metadata["packages"].keys()):
pkginfo = srcinfo.utils.get_merged_package(pkgname, metadata)
for field in ("pkgver", "pkgrel", "pkgname"):
if field not in pkginfo:
die_commit( die_commit(
"missing mandatory field: {:s}".format(field), str(commit.id) "the keys/ subdir may only contain a pgp/ directory",
)
if "epoch" in pkginfo and not pkginfo["epoch"].isdigit():
die_commit(
"invalid epoch: {:s}".format(pkginfo["epoch"]), str(commit.id)
)
if not re.match(r"[a-z0-9][a-z0-9\.+_-]*$", pkginfo["pkgname"]):
die_commit(
"invalid package name: {:s}".format(pkginfo["pkgname"]),
str(commit.id),
)
max_len = {"pkgname": 255, "pkgdesc": 255, "url": 8000}
for field in max_len.keys():
if field in pkginfo and len(pkginfo[field]) > max_len[field]:
die_commit(
"{:s} field too long: {:s}".format(field, pkginfo[field]),
str(commit.id), str(commit.id),
) )
# Check for forbidden files in keys/pgp/
for field in ("install", "changelog"): if "keys/pgp" in commit.tree:
if field in pkginfo and not pkginfo[field] in commit.tree: for pgpobj in commit.tree["keys/pgp"]:
die_commit( if not isinstance(pgpobj, pygit2.Blob) or not pgpobj.name.endswith(
"missing {:s} file: {:s}".format(field, pkginfo[field]), ".asc"
str(commit.id), ):
) die_commit(
"the subdir may only contain .asc (PGP pub key) files",
for field in extract_arch_fields(pkginfo, "source"): str(commit.id),
fname = field["value"] )
if len(fname) > 8000: # Check file size for pgp key files
die_commit( validate_blob_size(pgpobj, commit)
"source entry too long: {:s}".format(fname), str(commit.id)
)
if "://" in fname or "lp:" in fname:
continue
if fname not in commit.tree:
die_commit(
"missing source file: {:s}".format(fname), str(commit.id)
)
# Display a warning if .SRCINFO is unchanged. # Display a warning if .SRCINFO is unchanged.
if sha1_old not in ("0000000000000000000000000000000000000000", sha1_new): if sha1_old not in ("0000000000000000000000000000000000000000", sha1_new):
@ -403,10 +427,6 @@ def main(): # noqa: C901
if srcinfo_id_old == srcinfo_id_new: if srcinfo_id_old == srcinfo_id_new:
warn(".SRCINFO unchanged. " "The package database will not be updated!") warn(".SRCINFO unchanged. " "The package database will not be updated!")
# Read .SRCINFO from the HEAD commit.
metadata_raw = repo[repo[sha1_new].tree[".SRCINFO"].id].data.decode()
(metadata, errors) = srcinfo.parse.parse_srcinfo(metadata_raw)
# Ensure that the package base name matches the repository name. # Ensure that the package base name matches the repository name.
metadata_pkgbase = metadata["pkgbase"] metadata_pkgbase = metadata["pkgbase"]
if metadata_pkgbase != pkgbase: if metadata_pkgbase != pkgbase:
@ -420,6 +440,8 @@ def main(): # noqa: C901
cur = conn.execute("SELECT Name FROM PackageBlacklist") cur = conn.execute("SELECT Name FROM PackageBlacklist")
blacklist = [row[0] for row in cur.fetchall()] blacklist = [row[0] for row in cur.fetchall()]
if pkgbase in blacklist:
warn_or_die("pkgbase is blacklisted: {:s}".format(pkgbase))
cur = conn.execute("SELECT Name, Repo FROM OfficialProviders") cur = conn.execute("SELECT Name, Repo FROM OfficialProviders")
providers = dict(cur.fetchall()) providers = dict(cur.fetchall())
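
The hook now parses and validates .SRCINFO once, from the HEAD commit, while per-commit checks are reduced to PKGBUILD presence, directory layout, and blob sizes. A compact sketch of the same blob-size walk with pygit2; the repository path and size limit are illustrative:

import pygit2

MAX_BLOB_SIZE = 250 * 1024  # illustrative limit

repo = pygit2.Repository("/path/to/repo.git")
walker = repo.walk(repo.head.target, pygit2.GIT_SORT_REVERSE)

for commit in walker:
    if "PKGBUILD" not in commit.tree:
        raise SystemExit(f"missing PKGBUILD in {commit.id}")
    for obj in commit.tree:
        # Reject oversized files anywhere in the root tree.
        if isinstance(obj, pygit2.Blob) and obj.size > MAX_BLOB_SIZE:
            raise SystemExit(f"blob too large in {commit.id}: {obj.name}")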


@@ -13,9 +13,9 @@ def feed_initial_data(conn):
         aurweb.schema.AccountTypes.insert(),
         [
             {"ID": 1, "AccountType": "User"},
-            {"ID": 2, "AccountType": "Trusted User"},
+            {"ID": 2, "AccountType": "Package Maintainer"},
             {"ID": 3, "AccountType": "Developer"},
-            {"ID": 4, "AccountType": "Trusted User & Developer"},
+            {"ID": 4, "AccountType": "Package Maintainer & Developer"},
         ],
     )
     conn.execute(


@@ -64,11 +64,24 @@ class Translator:
 translator = Translator()
 
 
-def get_request_language(request: Request):
-    if request.user.is_authenticated():
+def get_request_language(request: Request) -> str:
+    """Get a request's language from either query param, user setting or
+    cookie. We use the configuration's [options] default_lang otherwise.
+
+    @param request FastAPI request
+    """
+    request_lang = request.query_params.get("language")
+    cookie_lang = request.cookies.get("AURLANG")
+    if request_lang and request_lang in SUPPORTED_LANGUAGES:
+        return request_lang
+    elif (
+        request.user.is_authenticated()
+        and request.user.LangPreference in SUPPORTED_LANGUAGES
+    ):
         return request.user.LangPreference
-    default_lang = aurweb.config.get("options", "default_lang")
-    return request.cookies.get("AURLANG", default_lang)
+    elif cookie_lang and cookie_lang in SUPPORTED_LANGUAGES:
+        return cookie_lang
+    return aurweb.config.get_with_fallback("options", "default_lang", "en")
 
 
 def get_raw_translator_for_request(request: Request):
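
The rewritten helper resolves the language with a fixed precedence: the ?language= query parameter, then the authenticated user's LangPreference, then the AURLANG cookie, then the configured default. A condensed sketch of the same precedence chain without the FastAPI plumbing; the stub function and its inputs are illustrative:

SUPPORTED_LANGUAGES = {"en": "English", "de": "Deutsch"}


def resolve_language(query_lang, user_lang, cookie_lang, default="en"):
    # Same ordering as get_request_language(), with plain values instead of a Request.
    if query_lang in SUPPORTED_LANGUAGES:
        return query_lang
    if user_lang in SUPPORTED_LANGUAGES:
        return user_lang
    if cookie_lang in SUPPORTED_LANGUAGES:
        return cookie_lang
    return default


print(resolve_language(None, "de", "en"))  # "de": user preference wins over the cookie
print(resolve_language("en", "de", None))  # "en": query parameter wins over everything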


@@ -1,4 +1,5 @@
 """ Collection of all aurweb SQLAlchemy declarative models. """
+
 from .accepted_term import AcceptedTerm  # noqa: F401
 from .account_type import AccountType  # noqa: F401
 from .api_rate_limit import ApiRateLimit  # noqa: F401
@@ -26,6 +27,6 @@ from .request_type import RequestType  # noqa: F401
 from .session import Session  # noqa: F401
 from .ssh_pub_key import SSHPubKey  # noqa: F401
 from .term import Term  # noqa: F401
-from .tu_vote import TUVote  # noqa: F401
-from .tu_voteinfo import TUVoteInfo  # noqa: F401
 from .user import User  # noqa: F401
+from .vote import Vote  # noqa: F401
+from .voteinfo import VoteInfo  # noqa: F401


@@ -2,21 +2,21 @@ from aurweb import schema
 from aurweb.models.declarative import Base
 
 USER = "User"
-TRUSTED_USER = "Trusted User"
+PACKAGE_MAINTAINER = "Package Maintainer"
 DEVELOPER = "Developer"
-TRUSTED_USER_AND_DEV = "Trusted User & Developer"
+PACKAGE_MAINTAINER_AND_DEV = "Package Maintainer & Developer"
 
 USER_ID = 1
-TRUSTED_USER_ID = 2
+PACKAGE_MAINTAINER_ID = 2
 DEVELOPER_ID = 3
-TRUSTED_USER_AND_DEV_ID = 4
+PACKAGE_MAINTAINER_AND_DEV_ID = 4
 
 # Map string constants to integer constants.
 ACCOUNT_TYPE_ID = {
     USER: USER_ID,
-    TRUSTED_USER: TRUSTED_USER_ID,
+    PACKAGE_MAINTAINER: PACKAGE_MAINTAINER_ID,
     DEVELOPER: DEVELOPER_ID,
-    TRUSTED_USER_AND_DEV: TRUSTED_USER_AND_DEV_ID,
+    PACKAGE_MAINTAINER_AND_DEV: PACKAGE_MAINTAINER_AND_DEV_ID,
 }
 
 # Reversed ACCOUNT_TYPE_ID mapping.


@@ -2,6 +2,7 @@ from fastapi import Request
 
 from aurweb import db, schema
 from aurweb.models.declarative import Base
+from aurweb.util import get_client_ip
 
 
 class Ban(Base):
@@ -14,6 +15,6 @@ class Ban(Base):
 
 
 def is_banned(request: Request):
-    ip = request.client.host
+    ip = get_client_ip(request)
     exists = db.query(Ban).filter(Ban.IPAddress == ip).exists()
     return db.query(exists).scalar()
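
aurweb.util.get_client_ip() itself is not shown in this diff; the sketch below is an assumption of what such a helper typically does (honour a reverse-proxy header before falling back to the socket peer), not aurweb's actual implementation:

from fastapi import Request


def get_client_ip(request: Request) -> str:
    # Assumed behaviour: prefer X-Forwarded-For when present (e.g. behind nginx),
    # otherwise use the directly connected peer address.
    forwarded = request.headers.get("X-Forwarded-For")
    if forwarded:
        return forwarded.split(",")[0].strip()
    return request.client.host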


@@ -57,14 +57,17 @@ class PackageDependency(Base):
                 params=("NULL"),
             )
 
-    def is_package(self) -> bool:
+    def is_aur_package(self) -> bool:
         pkg = db.query(_Package).filter(_Package.Name == self.DepName).exists()
+        return db.query(pkg).scalar()
+
+    def is_package(self) -> bool:
         official = (
             db.query(_OfficialProvider)
             .filter(_OfficialProvider.Name == self.DepName)
             .exists()
         )
-        return db.query(pkg).scalar() or db.query(official).scalar()
+        return self.is_aur_package() or db.query(official).scalar()
 
     def provides(self) -> list[PackageRelation]:
         from aurweb.models.relation_type import PROVIDES_ID
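
A brief usage sketch for the split helpers; the dependency lookup below is illustrative:

from aurweb import db
from aurweb.models import PackageDependency

dep = db.query(PackageDependency).first()
if dep is not None:
    print(dep.is_aur_package())  # True only if an AUR package provides the name
    print(dep.is_package())      # True for AUR packages or official providers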


@ -122,7 +122,7 @@ class User(Base):
try: try:
with db.begin(): with db.begin():
self.LastLogin = now_ts self.LastLogin = now_ts
self.LastLoginIPAddress = request.client.host self.LastLoginIPAddress = util.get_client_ip(request)
if not self.session: if not self.session:
sid = generate_unique_sid() sid = generate_unique_sid()
self.session = db.create( self.session = db.create(
@ -157,25 +157,25 @@ class User(Base):
with db.begin(): with db.begin():
db.delete(self.session) db.delete(self.session)
def is_trusted_user(self): def is_package_maintainer(self):
return self.AccountType.ID in { return self.AccountType.ID in {
aurweb.models.account_type.TRUSTED_USER_ID, aurweb.models.account_type.PACKAGE_MAINTAINER_ID,
aurweb.models.account_type.TRUSTED_USER_AND_DEV_ID, aurweb.models.account_type.PACKAGE_MAINTAINER_AND_DEV_ID,
} }
def is_developer(self): def is_developer(self):
return self.AccountType.ID in { return self.AccountType.ID in {
aurweb.models.account_type.DEVELOPER_ID, aurweb.models.account_type.DEVELOPER_ID,
aurweb.models.account_type.TRUSTED_USER_AND_DEV_ID, aurweb.models.account_type.PACKAGE_MAINTAINER_AND_DEV_ID,
} }
def is_elevated(self): def is_elevated(self):
"""A User is 'elevated' when they have either a """A User is 'elevated' when they have either a
Trusted User or Developer AccountType.""" Package Maintainer or Developer AccountType."""
return self.AccountType.ID in { return self.AccountType.ID in {
aurweb.models.account_type.TRUSTED_USER_ID, aurweb.models.account_type.PACKAGE_MAINTAINER_ID,
aurweb.models.account_type.DEVELOPER_ID, aurweb.models.account_type.DEVELOPER_ID,
aurweb.models.account_type.TRUSTED_USER_AND_DEV_ID, aurweb.models.account_type.PACKAGE_MAINTAINER_AND_DEV_ID,
} }
def can_edit_user(self, target: "User") -> bool: def can_edit_user(self, target: "User") -> bool:
@ -188,7 +188,7 @@ class User(Base):
In short, a user must at least have credentials and be at least In short, a user must at least have credentials and be at least
the same account type as the target. the same account type as the target.
User < Trusted User < Developer < Trusted User & Developer User < Package Maintainer < Developer < Package Maintainer & Developer
:param target: Target User to be edited :param target: Target User to be edited
:return: Boolean indicating whether `self` can edit `target` :return: Boolean indicating whether `self` can edit `target`


@ -3,24 +3,24 @@ from sqlalchemy.orm import backref, relationship
from aurweb import schema from aurweb import schema
from aurweb.models.declarative import Base from aurweb.models.declarative import Base
from aurweb.models.tu_voteinfo import TUVoteInfo as _TUVoteInfo
from aurweb.models.user import User as _User from aurweb.models.user import User as _User
from aurweb.models.voteinfo import VoteInfo as _VoteInfo
class TUVote(Base): class Vote(Base):
__table__ = schema.TU_Votes __table__ = schema.Votes
__tablename__ = __table__.name __tablename__ = __table__.name
__mapper_args__ = {"primary_key": [__table__.c.VoteID, __table__.c.UserID]} __mapper_args__ = {"primary_key": [__table__.c.VoteID, __table__.c.UserID]}
VoteInfo = relationship( VoteInfo = relationship(
_TUVoteInfo, _VoteInfo,
backref=backref("tu_votes", lazy="dynamic"), backref=backref("votes", lazy="dynamic"),
foreign_keys=[__table__.c.VoteID], foreign_keys=[__table__.c.VoteID],
) )
User = relationship( User = relationship(
_User, _User,
backref=backref("tu_votes", lazy="dynamic"), backref=backref("votes", lazy="dynamic"),
foreign_keys=[__table__.c.UserID], foreign_keys=[__table__.c.UserID],
) )
@ -30,13 +30,13 @@ class TUVote(Base):
if not self.VoteInfo and not self.VoteID: if not self.VoteInfo and not self.VoteID:
raise IntegrityError( raise IntegrityError(
statement="Foreign key VoteID cannot be null.", statement="Foreign key VoteID cannot be null.",
orig="TU_Votes.VoteID", orig="Votes.VoteID",
params=("NULL"), params=("NULL"),
) )
if not self.User and not self.UserID: if not self.User and not self.UserID:
raise IntegrityError( raise IntegrityError(
statement="Foreign key UserID cannot be null.", statement="Foreign key UserID cannot be null.",
orig="TU_Votes.UserID", orig="Votes.UserID",
params=("NULL"), params=("NULL"),
) )


@ -8,14 +8,14 @@ from aurweb.models.declarative import Base
from aurweb.models.user import User as _User from aurweb.models.user import User as _User
class TUVoteInfo(Base): class VoteInfo(Base):
__table__ = schema.TU_VoteInfo __table__ = schema.VoteInfo
__tablename__ = __table__.name __tablename__ = __table__.name
__mapper_args__ = {"primary_key": [__table__.c.ID]} __mapper_args__ = {"primary_key": [__table__.c.ID]}
Submitter = relationship( Submitter = relationship(
_User, _User,
backref=backref("tu_voteinfo_set", lazy="dynamic"), backref=backref("voteinfo_set", lazy="dynamic"),
foreign_keys=[__table__.c.SubmitterID], foreign_keys=[__table__.c.SubmitterID],
) )
@ -30,35 +30,35 @@ class TUVoteInfo(Base):
if self.Agenda is None: if self.Agenda is None:
raise IntegrityError( raise IntegrityError(
statement="Column Agenda cannot be null.", statement="Column Agenda cannot be null.",
orig="TU_VoteInfo.Agenda", orig="VoteInfo.Agenda",
params=("NULL"), params=("NULL"),
) )
if self.User is None: if self.User is None:
raise IntegrityError( raise IntegrityError(
statement="Column User cannot be null.", statement="Column User cannot be null.",
orig="TU_VoteInfo.User", orig="VoteInfo.User",
params=("NULL"), params=("NULL"),
) )
if self.Submitted is None: if self.Submitted is None:
raise IntegrityError( raise IntegrityError(
statement="Column Submitted cannot be null.", statement="Column Submitted cannot be null.",
orig="TU_VoteInfo.Submitted", orig="VoteInfo.Submitted",
params=("NULL"), params=("NULL"),
) )
if self.End is None: if self.End is None:
raise IntegrityError( raise IntegrityError(
statement="Column End cannot be null.", statement="Column End cannot be null.",
orig="TU_VoteInfo.End", orig="VoteInfo.End",
params=("NULL"), params=("NULL"),
) )
if not self.Submitter: if not self.Submitter:
raise IntegrityError( raise IntegrityError(
statement="Foreign key SubmitterID cannot be null.", statement="Foreign key SubmitterID cannot be null.",
orig="TU_VoteInfo.SubmitterID", orig="VoteInfo.SubmitterID",
params=("NULL"), params=("NULL"),
) )


@@ -195,13 +195,13 @@ class PackageSearch:
 
     def _sort_by_votes(self, order: str):
         column = getattr(models.PackageBase.NumVotes, order)
-        name = getattr(models.Package.Name, order)
+        name = getattr(models.PackageBase.Name, order)
         self.query = self.query.order_by(column(), name())
         return self
 
     def _sort_by_popularity(self, order: str):
         column = getattr(models.PackageBase.Popularity, order)
-        name = getattr(models.Package.Name, order)
+        name = getattr(models.PackageBase.Name, order)
         self.query = self.query.order_by(column(), name())
         return self
 
@@ -236,7 +236,7 @@ class PackageSearch:
 
     def _sort_by_last_modified(self, order: str):
         column = getattr(models.PackageBase.ModifiedTS, order)
-        name = getattr(models.Package.Name, order)
+        name = getattr(models.PackageBase.Name, order)
         self.query = self.query.order_by(column(), name())
         return self


@@ -83,9 +83,11 @@ def package_link(package: Union[Package, OfficialProvider]) -> str:
 
 @register_filter("provides_markup")
 def provides_markup(provides: Providers) -> str:
-    return ", ".join(
-        [f'<a href="{package_link(pkg)}">{pkg.Name}</a>' for pkg in provides]
-    )
+    links = []
+    for pkg in provides:
+        aur = "<sup><small>AUR</small></sup>" if not pkg.is_official else ""
+        links.append(f'<a href="{package_link(pkg)}">{pkg.Name}</a>{aur}')
+    return ", ".join(links)
 
 
 def get_pkg_or_base(


@@ -94,7 +94,7 @@ def _retry_disown(request: Request, pkgbase: PackageBase):
         notifs.append(notif)
     elif request.user.has_credential(creds.PKGBASE_DISOWN):
         # Otherwise, the request user performing this disownage is a
-        # Trusted User and we treat it like a standard orphan request.
+        # Package Maintainer and we treat it like a standard orphan request.
         notifs += handle_request(request, ORPHAN_ID, pkgbase)
         with db.begin():
             pkgbase.Maintainer = None
@@ -187,7 +187,7 @@ def pkgbase_merge_instance(
 
     # Log this out for accountability purposes.
     logger.info(
-        f"Trusted User '{request.user.Username}' merged "
+        f"Package Maintainer '{request.user.Username}' merged "
         f"'{pkgbasename}' into '{target.Name}'."
     )


@ -2,6 +2,7 @@ from typing import Any
from fastapi import Request from fastapi import Request
from sqlalchemy import and_ from sqlalchemy import and_
from sqlalchemy.orm import joinedload
from aurweb import config, db, defaults, l10n, time, util from aurweb import config, db, defaults, l10n, time, util
from aurweb.models import PackageBase, User from aurweb.models import PackageBase, User
@ -11,17 +12,7 @@ from aurweb.models.package_comment import PackageComment
from aurweb.models.package_request import PENDING_ID, PackageRequest from aurweb.models.package_request import PENDING_ID, PackageRequest
from aurweb.models.package_vote import PackageVote from aurweb.models.package_vote import PackageVote
from aurweb.scripts import notify from aurweb.scripts import notify
from aurweb.templates import ( from aurweb.templates import make_context as _make_context
make_context as _make_context,
make_variable_context as _make_variable_context,
)
async def make_variable_context(
request: Request, pkgbase: PackageBase
) -> dict[str, Any]:
ctx = await _make_variable_context(request, pkgbase.Name)
return make_context(request, pkgbase, ctx)
def make_context( def make_context(
@ -36,6 +27,8 @@ def make_context(
if not context: if not context:
context = _make_context(request, pkgbase.Name) context = _make_context(request, pkgbase.Name)
is_authenticated = request.user.is_authenticated()
# Per page and offset. # Per page and offset.
offset, per_page = util.sanitize_params( offset, per_page = util.sanitize_params(
request.query_params.get("O", defaults.O), request.query_params.get("O", defaults.O),
@ -48,12 +41,15 @@ def make_context(
context["pkgbase"] = pkgbase context["pkgbase"] = pkgbase
context["comaintainers"] = [ context["comaintainers"] = [
c.User c.User
for c in pkgbase.comaintainers.order_by( for c in pkgbase.comaintainers.options(joinedload(PackageComaintainer.User))
PackageComaintainer.Priority.asc() .order_by(PackageComaintainer.Priority.asc())
).all() .all()
] ]
context["unflaggers"] = context["comaintainers"].copy() if is_authenticated:
context["unflaggers"].extend([pkgbase.Maintainer, pkgbase.Flagger]) context["unflaggers"] = context["comaintainers"].copy()
context["unflaggers"].extend([pkgbase.Maintainer, pkgbase.Flagger])
else:
context["unflaggers"] = []
context["packages_count"] = pkgbase.packages.count() context["packages_count"] = pkgbase.packages.count()
context["keywords"] = pkgbase.keywords context["keywords"] = pkgbase.keywords
@ -70,17 +66,28 @@ def make_context(
).order_by(PackageComment.CommentTS.desc()) ).order_by(PackageComment.CommentTS.desc())
context["is_maintainer"] = bool(request.user == pkgbase.Maintainer) context["is_maintainer"] = bool(request.user == pkgbase.Maintainer)
context["notified"] = request.user.notified(pkgbase) if is_authenticated:
context["notified"] = request.user.notified(pkgbase)
else:
context["notified"] = False
context["out_of_date"] = bool(pkgbase.OutOfDateTS) context["out_of_date"] = bool(pkgbase.OutOfDateTS)
context["voted"] = request.user.package_votes.filter( if is_authenticated:
PackageVote.PackageBaseID == pkgbase.ID context["voted"] = db.query(
).scalar() request.user.package_votes.filter(
PackageVote.PackageBaseID == pkgbase.ID
).exists()
).scalar()
else:
context["voted"] = False
context["requests"] = pkgbase.requests.filter( if is_authenticated:
and_(PackageRequest.Status == PENDING_ID, PackageRequest.ClosedTS.is_(None)) context["requests"] = pkgbase.requests.filter(
).count() and_(PackageRequest.Status == PENDING_ID, PackageRequest.ClosedTS.is_(None))
).count()
else:
context["requests"] = []
context["popularity"] = popularity(pkgbase, time.utcnow()) context["popularity"] = popularity(pkgbase, time.utcnow())


@@ -1,6 +1,9 @@
+from http import HTTPStatus
 from typing import Any
 
-from aurweb import db
+from fastapi import HTTPException
+
+from aurweb import config, db
 from aurweb.exceptions import ValidationError
 from aurweb.models import PackageBase
 
@@ -12,8 +15,8 @@ def request(
     merge_into: str,
     context: dict[str, Any],
 ) -> None:
-    if not comments:
-        raise ValidationError(["The comment field must not be empty."])
+    # validate comment
+    comment(comments)
 
     if type == "merge":
         # Perform merge-related checks.
@@ -32,3 +35,21 @@ def request(
         if target.ID == pkgbase.ID:
             # TODO: This error needs to be translated.
             raise ValidationError(["You cannot merge a package base into itself."])
+
+
+def comment(comment: str):
+    if not comment:
+        raise ValidationError(["The comment field must not be empty."])
+
+    if len(comment) > config.getint("options", "max_chars_comment", 5000):
+        raise ValidationError(["Maximum number of characters for comment exceeded."])
+
+
+def comment_raise_http_ex(comments: str):
+    try:
+        comment(comments)
+    except ValidationError as err:
+        raise HTTPException(
+            status_code=HTTPStatus.BAD_REQUEST,
+            detail=err.data[0],
+        )
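
Routes that accept a comment can now share one validator: comment() raises ValidationError for form-level handling, while comment_raise_http_ex() converts the same failure into a 400 response. A usage sketch; the comment text is illustrative:

from aurweb.exceptions import ValidationError
from aurweb.pkgbase import validate

try:
    validate.comment("")  # empty -> "The comment field must not be empty."
except ValidationError as err:
    print(err.data[0])

# Inside a FastAPI route, the HTTP variant raises HTTPException(400) directly:
# validate.comment_raise_http_ex(comment_from_form)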


@ -1,6 +1,6 @@
from typing import Any, Callable, Optional from typing import Any, Callable, Optional
from prometheus_client import Counter from prometheus_client import Counter, Gauge
from prometheus_fastapi_instrumentator import Instrumentator from prometheus_fastapi_instrumentator import Instrumentator
from prometheus_fastapi_instrumentator.metrics import Info from prometheus_fastapi_instrumentator.metrics import Info
from starlette.routing import Match, Route from starlette.routing import Match, Route
@ -11,10 +11,32 @@ logger = aur_logging.get_logger(__name__)
_instrumentator = Instrumentator() _instrumentator = Instrumentator()
# Custom metrics
SEARCH_REQUESTS = Counter(
"aur_search_requests", "Number of search requests by cache hit/miss", ["cache"]
)
USERS = Gauge(
"aur_users", "Number of AUR users by type", ["type"], multiprocess_mode="livemax"
)
PACKAGES = Gauge(
"aur_packages",
"Number of AUR packages by state",
["state"],
multiprocess_mode="livemax",
)
REQUESTS = Gauge(
"aur_requests",
"Number of AUR requests by type and status",
["type", "status"],
multiprocess_mode="livemax",
)
def instrumentator(): def instrumentator():
return _instrumentator return _instrumentator
# FastAPI metrics
# Taken from https://github.com/stephenhillier/starlette_exporter # Taken from https://github.com/stephenhillier/starlette_exporter
# Their license is included in LICENSES/starlette_exporter. # Their license is included in LICENSES/starlette_exporter.
# The code has been modified to remove child route checks # The code has been modified to remove child route checks
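
The new gauges declare a multiprocess_mode so values aggregate sensibly across gunicorn workers; that only works when prometheus_client runs in multiprocess mode (PROMETHEUS_MULTIPROC_DIR set) and the /metrics route collects via MultiProcessCollector. A condensed sketch of that collection step, assuming the environment variable is configured; the label value is illustrative:

from prometheus_client import CollectorRegistry, generate_latest, multiprocess

from aurweb.prometheus import USERS

# Example update from any worker process.
USERS.labels(type="package_maintainer").set(42)  # illustrative label value

# Aggregation at scrape time, as done in the /metrics route.
registry = CollectorRegistry()
multiprocess.MultiProcessCollector(registry)
print(generate_latest(registry).decode())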


@@ -4,6 +4,7 @@ from redis.client import Pipeline
 from aurweb import aur_logging, config, db, time
 from aurweb.aur_redis import redis_connection
 from aurweb.models import ApiRateLimit
+from aurweb.util import get_client_ip
 
 logger = aur_logging.get_logger(__name__)
 
@@ -13,7 +14,7 @@ def _update_ratelimit_redis(request: Request, pipeline: Pipeline):
     now = time.utcnow()
     time_to_delete = now - window_length
 
-    host = request.client.host
+    host = get_client_ip(request)
     window_key = f"ratelimit-ws:{host}"
     requests_key = f"ratelimit:{host}"
 
@@ -55,7 +56,7 @@ def _update_ratelimit_db(request: Request):
                 record.Requests += 1
         return record
 
-    host = request.client.host
+    host = get_client_ip(request)
     record = db.query(ApiRateLimit, ApiRateLimit.IP == host).first()
     record = retry_create(record, now, host)
 
@@ -92,7 +93,7 @@ def check_ratelimit(request: Request):
         record = update_ratelimit(request, pipeline)
 
     # Get cache value, else None.
-    host = request.client.host
+    host = get_client_ip(request)
     pipeline.get(f"ratelimit:{host}")
     requests = pipeline.execute()[0]
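
The rate limiter keys its Redis entries by client address, which is why every lookup now goes through get_client_ip(). A generic fixed-window sketch of the same idea; this is not aurweb's exact accounting, and the window and limit values are illustrative:

from aurweb.aur_redis import redis_connection

WINDOW = 86400  # seconds; illustrative
LIMIT = 4000    # requests per window; illustrative


def over_limit(host: str) -> bool:
    redis = redis_connection()
    key = f"ratelimit:{host}"
    pipeline = redis.pipeline()
    pipeline.incr(key)            # count this request
    pipeline.expire(key, WINDOW)  # (re)arm the window
    requests, _ = pipeline.execute()
    return int(requests) > LIMIT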


@@ -3,17 +3,18 @@ API routers for FastAPI.
 
 See https://fastapi.tiangolo.com/tutorial/bigger-applications/
 """
+
 from . import (
     accounts,
     auth,
     html,
+    package_maintainer,
     packages,
     pkgbase,
     requests,
     rpc,
     rss,
     sso,
-    trusted_user,
 )
 
 """
@@ -28,7 +29,7 @@ APP_ROUTES = [
     packages,
     pkgbase,
     requests,
-    trusted_user,
+    package_maintainer,
     rss,
     rpc,
     sso,


@ -184,9 +184,9 @@ def make_account_form_context(
lambda e: request.user.AccountTypeID >= e[0], lambda e: request.user.AccountTypeID >= e[0],
[ [
(at.USER_ID, f"Normal {at.USER}"), (at.USER_ID, f"Normal {at.USER}"),
(at.TRUSTED_USER_ID, at.TRUSTED_USER), (at.PACKAGE_MAINTAINER_ID, at.PACKAGE_MAINTAINER),
(at.DEVELOPER_ID, at.DEVELOPER), (at.DEVELOPER_ID, at.DEVELOPER),
(at.TRUSTED_USER_AND_DEV_ID, at.TRUSTED_USER_AND_DEV), (at.PACKAGE_MAINTAINER_AND_DEV_ID, at.PACKAGE_MAINTAINER_AND_DEV),
], ],
) )
) )
@ -374,6 +374,9 @@ def cannot_edit(
:param user: Target user to be edited :param user: Target user to be edited
:return: RedirectResponse if approval != granted else None :return: RedirectResponse if approval != granted else None
""" """
# raise 404 if user does not exist
if not user:
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
approved = request.user.can_edit_user(user) approved = request.user.can_edit_user(user)
if not approved and (to := "/"): if not approved and (to := "/"):
if user: if user:
@ -517,7 +520,9 @@ async def account_comments(request: Request, username: str):
@router.get("/accounts") @router.get("/accounts")
@requires_auth @requires_auth
@account_type_required({at.TRUSTED_USER, at.DEVELOPER, at.TRUSTED_USER_AND_DEV}) @account_type_required(
{at.PACKAGE_MAINTAINER, at.DEVELOPER, at.PACKAGE_MAINTAINER_AND_DEV}
)
async def accounts(request: Request): async def accounts(request: Request):
context = make_context(request, "Accounts") context = make_context(request, "Accounts")
return render_template(request, "account/search.html", context) return render_template(request, "account/search.html", context)
@ -526,7 +531,9 @@ async def accounts(request: Request):
@router.post("/accounts") @router.post("/accounts")
@handle_form_exceptions @handle_form_exceptions
@requires_auth @requires_auth
@account_type_required({at.TRUSTED_USER, at.DEVELOPER, at.TRUSTED_USER_AND_DEV}) @account_type_required(
{at.PACKAGE_MAINTAINER, at.DEVELOPER, at.PACKAGE_MAINTAINER_AND_DEV}
)
async def accounts_post( async def accounts_post(
request: Request, request: Request,
O: int = Form(default=0), # Offset O: int = Form(default=0), # Offset
@ -561,9 +568,9 @@ async def accounts_post(
# Convert parameter T to an AccountType ID. # Convert parameter T to an AccountType ID.
account_types = { account_types = {
"u": at.USER_ID, "u": at.USER_ID,
"t": at.TRUSTED_USER_ID, "t": at.PACKAGE_MAINTAINER_ID,
"d": at.DEVELOPER_ID, "d": at.DEVELOPER_ID,
"td": at.TRUSTED_USER_AND_DEV_ID, "td": at.PACKAGE_MAINTAINER_AND_DEV_ID,
} }
account_type_id = account_types.get(T, None) account_type_id = account_types.get(T, None)


@ -1,6 +1,7 @@
""" AURWeb's primary routing module. Define all routes via @app.app.{get,post} """ AURWeb's primary routing module. Define all routes via @app.app.{get,post}
decorators in some way; more complex routes should be defined in their decorators in some way; more complex routes should be defined in their
own modules and imported here. """ own modules and imported here. """
import os import os
from http import HTTPStatus from http import HTTPStatus
@ -16,10 +17,8 @@ from sqlalchemy import case, or_
import aurweb.config import aurweb.config
import aurweb.models.package_request import aurweb.models.package_request
from aurweb import aur_logging, cookies, db, models, time, util from aurweb import aur_logging, cookies, db, models, statistics, time, util
from aurweb.cache import db_count_cache
from aurweb.exceptions import handle_form_exceptions from aurweb.exceptions import handle_form_exceptions
from aurweb.models.account_type import TRUSTED_USER_AND_DEV_ID, TRUSTED_USER_ID
from aurweb.models.package_request import PENDING_ID from aurweb.models.package_request import PENDING_ID
from aurweb.packages.util import query_notified, query_voted, updated_packages from aurweb.packages.util import query_notified, query_voted, updated_packages
from aurweb.templates import make_context, render_template from aurweb.templates import make_context, render_template
@ -87,70 +86,12 @@ async def index(request: Request):
context = make_context(request, "Home") context = make_context(request, "Home")
context["ssh_fingerprints"] = util.get_ssh_fingerprints() context["ssh_fingerprints"] = util.get_ssh_fingerprints()
bases = db.query(models.PackageBase) cache_expire = aurweb.config.getint("cache", "expiry_time_statistics", 300)
redis = aurweb.aur_redis.redis_connection()
cache_expire = 300 # Five minutes.
# Package statistics. # Package statistics.
context["package_count"] = await db_count_cache( counts = statistics.get_homepage_counts()
redis, "package_count", bases, expire=cache_expire for k in counts:
) context[k] = counts[k]
query = bases.filter(models.PackageBase.MaintainerUID.is_(None))
context["orphan_count"] = await db_count_cache(
redis, "orphan_count", query, expire=cache_expire
)
query = db.query(models.User)
context["user_count"] = await db_count_cache(
redis, "user_count", query, expire=cache_expire
)
query = query.filter(
or_(
models.User.AccountTypeID == TRUSTED_USER_ID,
models.User.AccountTypeID == TRUSTED_USER_AND_DEV_ID,
)
)
context["trusted_user_count"] = await db_count_cache(
redis, "trusted_user_count", query, expire=cache_expire
)
# Current timestamp.
now = time.utcnow()
seven_days = 86400 * 7 # Seven days worth of seconds.
seven_days_ago = now - seven_days
one_hour = 3600
updated = bases.filter(
models.PackageBase.ModifiedTS - models.PackageBase.SubmittedTS >= one_hour
)
query = bases.filter(models.PackageBase.SubmittedTS >= seven_days_ago)
context["seven_days_old_added"] = await db_count_cache(
redis, "seven_days_old_added", query, expire=cache_expire
)
query = updated.filter(models.PackageBase.ModifiedTS >= seven_days_ago)
context["seven_days_old_updated"] = await db_count_cache(
redis, "seven_days_old_updated", query, expire=cache_expire
)
year = seven_days * 52 # Fifty two weeks worth: one year.
year_ago = now - year
query = updated.filter(models.PackageBase.ModifiedTS >= year_ago)
context["year_old_updated"] = await db_count_cache(
redis, "year_old_updated", query, expire=cache_expire
)
query = bases.filter(
models.PackageBase.ModifiedTS - models.PackageBase.SubmittedTS < 3600
)
context["never_updated"] = await db_count_cache(
redis, "never_updated", query, expire=cache_expire
)
# Get the 15 most recently updated packages. # Get the 15 most recently updated packages.
context["package_updates"] = updated_packages(15, cache_expire) context["package_updates"] = updated_packages(15, cache_expire)
@ -195,7 +136,7 @@ async def index(request: Request):
) )
archive_time = aurweb.config.getint("options", "request_archive_time") archive_time = aurweb.config.getint("options", "request_archive_time")
start = now - archive_time start = time.utcnow() - archive_time
# Package requests created by request.user. # Package requests created by request.user.
context["package_requests"] = ( context["package_requests"] = (
@ -271,6 +212,9 @@ async def metrics(request: Request):
status_code=HTTPStatus.SERVICE_UNAVAILABLE, status_code=HTTPStatus.SERVICE_UNAVAILABLE,
) )
# update prometheus gauges for packages and users
statistics.update_prometheus_metrics()
registry = CollectorRegistry() registry = CollectorRegistry()
multiprocess.MultiProcessCollector(registry) multiprocess.MultiProcessCollector(registry)
data = generate_latest(registry) data = generate_latest(registry)
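
statistics.get_homepage_counts() and update_prometheus_metrics() live in the new aurweb.statistics module, whose implementation is not part of this hunk. The sketch below is only an assumption of how such counts can be layered on the new cache helpers instead of per-request queries; everything except the key names taken from the removed code is hypothetical:

import aurweb.config
from aurweb import db, models
from aurweb.cache import db_count_cache


def get_homepage_counts() -> dict[str, int]:
    # Hypothetical body; the real aurweb.statistics implementation is not shown here.
    expire = aurweb.config.getint("cache", "expiry_time_statistics", 300)
    bases = db.query(models.PackageBase)
    return {
        "package_count": db_count_cache("package_count", bases, expire=expire),
        "orphan_count": db_count_cache(
            "orphan_count",
            bases.filter(models.PackageBase.MaintainerUID.is_(None)),
            expire=expire,
        ),
        "user_count": db_count_cache("user_count", db.query(models.User), expire=expire),
    }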


@ -11,13 +11,16 @@ from aurweb import aur_logging, db, l10n, models, time
from aurweb.auth import creds, requires_auth from aurweb.auth import creds, requires_auth
from aurweb.exceptions import handle_form_exceptions from aurweb.exceptions import handle_form_exceptions
from aurweb.models import User from aurweb.models import User
from aurweb.models.account_type import TRUSTED_USER_AND_DEV_ID, TRUSTED_USER_ID from aurweb.models.account_type import (
PACKAGE_MAINTAINER_AND_DEV_ID,
PACKAGE_MAINTAINER_ID,
)
from aurweb.templates import make_context, make_variable_context, render_template from aurweb.templates import make_context, make_variable_context, render_template
router = APIRouter() router = APIRouter()
logger = aur_logging.get_logger(__name__) logger = aur_logging.get_logger(__name__)
# Some TU route specific constants. # Some PM route specific constants.
ITEMS_PER_PAGE = 10 # Paged table size. ITEMS_PER_PAGE = 10 # Paged table size.
MAX_AGENDA_LENGTH = 75 # Agenda table column length. MAX_AGENDA_LENGTH = 75 # Agenda table column length.
@ -26,32 +29,32 @@ ADDVOTE_SPECIFICS = {
# When a proposal is added, duration is added to the current # When a proposal is added, duration is added to the current
# timestamp. # timestamp.
# "addvote_type": (duration, quorum) # "addvote_type": (duration, quorum)
"add_tu": (7 * 24 * 60 * 60, 0.66), "add_pm": (7 * 24 * 60 * 60, 0.66),
"remove_tu": (7 * 24 * 60 * 60, 0.75), "remove_pm": (7 * 24 * 60 * 60, 0.75),
"remove_inactive_tu": (5 * 24 * 60 * 60, 0.66), "remove_inactive_pm": (5 * 24 * 60 * 60, 0.66),
"bylaws": (7 * 24 * 60 * 60, 0.75), "bylaws": (7 * 24 * 60 * 60, 0.75),
} }
def populate_trusted_user_counts(context: dict[str, Any]) -> None: def populate_package_maintainer_counts(context: dict[str, Any]) -> None:
tu_query = db.query(User).filter( pm_query = db.query(User).filter(
or_( or_(
User.AccountTypeID == TRUSTED_USER_ID, User.AccountTypeID == PACKAGE_MAINTAINER_ID,
User.AccountTypeID == TRUSTED_USER_AND_DEV_ID, User.AccountTypeID == PACKAGE_MAINTAINER_AND_DEV_ID,
) )
) )
context["trusted_user_count"] = tu_query.count() context["package_maintainer_count"] = pm_query.count()
# In case any records have a None InactivityTS. # In case any records have a None InactivityTS.
active_tu_query = tu_query.filter( active_pm_query = pm_query.filter(
or_(User.InactivityTS.is_(None), User.InactivityTS == 0) or_(User.InactivityTS.is_(None), User.InactivityTS == 0)
) )
context["active_trusted_user_count"] = active_tu_query.count() context["active_package_maintainer_count"] = active_pm_query.count()
@router.get("/tu") @router.get("/package-maintainer")
@requires_auth @requires_auth
async def trusted_user( async def package_maintainer(
request: Request, request: Request,
coff: int = 0, # current offset coff: int = 0, # current offset
cby: str = "desc", # current by cby: str = "desc", # current by
@ -60,10 +63,10 @@ async def trusted_user(
): # past by ): # past by
"""Proposal listings.""" """Proposal listings."""
if not request.user.has_credential(creds.TU_LIST_VOTES): if not request.user.has_credential(creds.PM_LIST_VOTES):
return RedirectResponse("/", status_code=HTTPStatus.SEE_OTHER) return RedirectResponse("/", status_code=HTTPStatus.SEE_OTHER)
context = make_context(request, "Trusted User") context = make_context(request, "Package Maintainer")
current_by, past_by = cby, pby current_by, past_by = cby, pby
current_off, past_off = coff, poff current_off, past_off = coff, poff
@ -84,9 +87,9 @@ async def trusted_user(
context["past_by"] = past_by context["past_by"] = past_by
current_votes = ( current_votes = (
db.query(models.TUVoteInfo) db.query(models.VoteInfo)
.filter(models.TUVoteInfo.End > ts) .filter(models.VoteInfo.End > ts)
.order_by(models.TUVoteInfo.Submitted.desc()) .order_by(models.VoteInfo.Submitted.desc())
) )
context["current_votes_count"] = current_votes.count() context["current_votes_count"] = current_votes.count()
current_votes = current_votes.limit(pp).offset(current_off) current_votes = current_votes.limit(pp).offset(current_off)
@ -96,9 +99,9 @@ async def trusted_user(
context["current_off"] = current_off context["current_off"] = current_off
past_votes = ( past_votes = (
db.query(models.TUVoteInfo) db.query(models.VoteInfo)
.filter(models.TUVoteInfo.End <= ts) .filter(models.VoteInfo.End <= ts)
.order_by(models.TUVoteInfo.Submitted.desc()) .order_by(models.VoteInfo.Submitted.desc())
) )
context["past_votes_count"] = past_votes.count() context["past_votes_count"] = past_votes.count()
past_votes = past_votes.limit(pp).offset(past_off) past_votes = past_votes.limit(pp).offset(past_off)
@ -107,29 +110,29 @@ async def trusted_user(
) )
context["past_off"] = past_off context["past_off"] = past_off
last_vote = func.max(models.TUVote.VoteID).label("LastVote") last_vote = func.max(models.Vote.VoteID).label("LastVote")
last_votes_by_tu = ( last_votes_by_pm = (
db.query(models.TUVote) db.query(models.Vote)
.join(models.User) .join(models.User)
-        .join(models.TUVoteInfo, models.TUVoteInfo.ID == models.TUVote.VoteID)
+        .join(models.VoteInfo, models.VoteInfo.ID == models.Vote.VoteID)
         .filter(
             and_(
-                models.TUVote.VoteID == models.TUVoteInfo.ID,
-                models.User.ID == models.TUVote.UserID,
-                models.TUVoteInfo.End < ts,
+                models.Vote.VoteID == models.VoteInfo.ID,
+                models.User.ID == models.Vote.UserID,
+                models.VoteInfo.End < ts,
                 or_(models.User.AccountTypeID == 2, models.User.AccountTypeID == 4),
             )
         )
-        .with_entities(models.TUVote.UserID, last_vote, models.User.Username)
-        .group_by(models.TUVote.UserID)
+        .with_entities(models.Vote.UserID, last_vote, models.User.Username)
+        .group_by(models.Vote.UserID)
         .order_by(last_vote.desc(), models.User.Username.asc())
     )
-    context["last_votes_by_tu"] = last_votes_by_tu.all()
+    context["last_votes_by_pm"] = last_votes_by_pm.all()
     context["current_by_next"] = "asc" if current_by == "desc" else "desc"
     context["past_by_next"] = "asc" if past_by == "desc" else "desc"
-    populate_trusted_user_counts(context)
+    populate_package_maintainer_counts(context)
     context["q"] = {
         "coff": current_off,
@@ -138,33 +141,33 @@ async def trusted_user(
         "pby": past_by,
     }
-    return render_template(request, "tu/index.html", context)
+    return render_template(request, "package-maintainer/index.html", context)

 def render_proposal(
     request: Request,
     context: dict,
     proposal: int,
-    voteinfo: models.TUVoteInfo,
+    voteinfo: models.VoteInfo,
     voters: typing.Iterable[models.User],
-    vote: models.TUVote,
+    vote: models.Vote,
     status_code: HTTPStatus = HTTPStatus.OK,
 ):
-    """Render a single TU proposal."""
+    """Render a single PM proposal."""
     context["proposal"] = proposal
     context["voteinfo"] = voteinfo
     context["voters"] = voters.all()
     total = voteinfo.total_votes()
-    participation = (total / voteinfo.ActiveTUs) if voteinfo.ActiveTUs else 0
+    participation = (total / voteinfo.ActiveUsers) if voteinfo.ActiveUsers else 0
     context["participation"] = participation
-    accepted = (voteinfo.Yes > voteinfo.ActiveTUs / 2) or (
+    accepted = (voteinfo.Yes > voteinfo.ActiveUsers / 2) or (
         participation > voteinfo.Quorum and voteinfo.Yes > voteinfo.No
     )
     context["accepted"] = accepted
-    can_vote = voters.filter(models.TUVote.User == request.user).first() is None
+    can_vote = voters.filter(models.Vote.User == request.user).first() is None
     context["can_vote"] = can_vote
     if not voteinfo.is_running():
@@ -173,41 +176,41 @@ def render_proposal(
     context["vote"] = vote
     context["has_voted"] = vote is not None
-    return render_template(request, "tu/show.html", context, status_code=status_code)
+    return render_template(
+        request, "package-maintainer/show.html", context, status_code=status_code
+    )

-@router.get("/tu/{proposal}")
+@router.get("/package-maintainer/{proposal}")
 @requires_auth
-async def trusted_user_proposal(request: Request, proposal: int):
-    if not request.user.has_credential(creds.TU_LIST_VOTES):
-        return RedirectResponse("/tu", status_code=HTTPStatus.SEE_OTHER)
+async def package_maintainer_proposal(request: Request, proposal: int):
+    if not request.user.has_credential(creds.PM_LIST_VOTES):
+        return RedirectResponse("/package-maintainer", status_code=HTTPStatus.SEE_OTHER)

-    context = await make_variable_context(request, "Trusted User")
+    context = await make_variable_context(request, "Package Maintainer")
     proposal = int(proposal)

-    voteinfo = (
-        db.query(models.TUVoteInfo).filter(models.TUVoteInfo.ID == proposal).first()
-    )
+    voteinfo = db.query(models.VoteInfo).filter(models.VoteInfo.ID == proposal).first()
     if not voteinfo:
         raise HTTPException(status_code=HTTPStatus.NOT_FOUND)

     voters = (
         db.query(models.User)
-        .join(models.TUVote)
-        .filter(models.TUVote.VoteID == voteinfo.ID)
+        .join(models.Vote)
+        .filter(models.Vote.VoteID == voteinfo.ID)
     )
     vote = (
-        db.query(models.TUVote)
+        db.query(models.Vote)
         .filter(
             and_(
-                models.TUVote.UserID == request.user.ID,
-                models.TUVote.VoteID == voteinfo.ID,
+                models.Vote.UserID == request.user.ID,
+                models.Vote.VoteID == voteinfo.ID,
             )
         )
         .first()
     )
-    if not request.user.has_credential(creds.TU_VOTE):
-        context["error"] = "Only Trusted Users are allowed to vote."
+    if not request.user.has_credential(creds.PM_VOTE):
+        context["error"] = "Only Package Maintainers are allowed to vote."
     if voteinfo.User == request.user.Username:
         context["error"] = "You cannot vote in an proposal about you."
     elif vote is not None:
@@ -218,43 +221,41 @@ async def trusted_user_proposal(request: Request, proposal: int):
 @db.async_retry_deadlock
-@router.post("/tu/{proposal}")
+@router.post("/package-maintainer/{proposal}")
 @handle_form_exceptions
 @requires_auth
-async def trusted_user_proposal_post(
+async def package_maintainer_proposal_post(
     request: Request, proposal: int, decision: str = Form(...)
 ):
-    if not request.user.has_credential(creds.TU_LIST_VOTES):
-        return RedirectResponse("/tu", status_code=HTTPStatus.SEE_OTHER)
+    if not request.user.has_credential(creds.PM_LIST_VOTES):
+        return RedirectResponse("/package-maintainer", status_code=HTTPStatus.SEE_OTHER)

-    context = await make_variable_context(request, "Trusted User")
+    context = await make_variable_context(request, "Package Maintainer")
     proposal = int(proposal)  # Make sure it's an int.

-    voteinfo = (
-        db.query(models.TUVoteInfo).filter(models.TUVoteInfo.ID == proposal).first()
-    )
+    voteinfo = db.query(models.VoteInfo).filter(models.VoteInfo.ID == proposal).first()
     if not voteinfo:
         raise HTTPException(status_code=HTTPStatus.NOT_FOUND)

     voters = (
         db.query(models.User)
-        .join(models.TUVote)
-        .filter(models.TUVote.VoteID == voteinfo.ID)
+        .join(models.Vote)
+        .filter(models.Vote.VoteID == voteinfo.ID)
     )
     vote = (
-        db.query(models.TUVote)
+        db.query(models.Vote)
         .filter(
             and_(
-                models.TUVote.UserID == request.user.ID,
-                models.TUVote.VoteID == voteinfo.ID,
+                models.Vote.UserID == request.user.ID,
+                models.Vote.VoteID == voteinfo.ID,
             )
         )
         .first()
     )
     status_code = HTTPStatus.OK
-    if not request.user.has_credential(creds.TU_VOTE):
-        context["error"] = "Only Trusted Users are allowed to vote."
+    if not request.user.has_credential(creds.PM_VOTE):
+        context["error"] = "Only Package Maintainers are allowed to vote."
         status_code = HTTPStatus.UNAUTHORIZED
     elif voteinfo.User == request.user.Username:
         context["error"] = "You cannot vote in an proposal about you."
@@ -277,7 +278,7 @@ async def trusted_user_proposal_post(
             "Invalid 'decision' value.", status_code=HTTPStatus.BAD_REQUEST
         )
-        vote = db.create(models.TUVote, User=request.user, VoteInfo=voteinfo)
+        vote = db.create(models.Vote, User=request.user, VoteInfo=voteinfo)
     context["error"] = "You've already voted for this proposal."
     return render_proposal(request, context, proposal, voteinfo, voters, vote)
@@ -285,17 +286,17 @@ async def trusted_user_proposal_post(
 @router.get("/addvote")
 @requires_auth
-async def trusted_user_addvote(
-    request: Request, user: str = str(), type: str = "add_tu", agenda: str = str()
+async def package_maintainer_addvote(
+    request: Request, user: str = str(), type: str = "add_pm", agenda: str = str()
 ):
-    if not request.user.has_credential(creds.TU_ADD_VOTE):
-        return RedirectResponse("/tu", status_code=HTTPStatus.SEE_OTHER)
+    if not request.user.has_credential(creds.PM_ADD_VOTE):
+        return RedirectResponse("/package-maintainer", status_code=HTTPStatus.SEE_OTHER)

     context = await make_variable_context(request, "Add Proposal")

     if type not in ADDVOTE_SPECIFICS:
         context["error"] = "Invalid type."
-        type = "add_tu"  # Default it.
+        type = "add_pm"  # Default it.

     context["user"] = user
     context["type"] = type
@@ -308,14 +309,14 @@ async def trusted_user_addvote(
 @router.post("/addvote")
 @handle_form_exceptions
 @requires_auth
-async def trusted_user_addvote_post(
+async def package_maintainer_addvote_post(
     request: Request,
     user: str = Form(default=str()),
     type: str = Form(default=str()),
     agenda: str = Form(default=str()),
 ):
-    if not request.user.has_credential(creds.TU_ADD_VOTE):
-        return RedirectResponse("/tu", status_code=HTTPStatus.SEE_OTHER)
+    if not request.user.has_credential(creds.PM_ADD_VOTE):
+        return RedirectResponse("/package-maintainer", status_code=HTTPStatus.SEE_OTHER)

     # Build a context.
     context = await make_variable_context(request, "Add Proposal")
@@ -337,10 +338,8 @@ async def trusted_user_addvote_post(
     utcnow = time.utcnow()

     voteinfo = (
-        db.query(models.TUVoteInfo)
-        .filter(
-            and_(models.TUVoteInfo.User == user, models.TUVoteInfo.End > utcnow)
-        )
+        db.query(models.VoteInfo)
+        .filter(and_(models.VoteInfo.User == user, models.VoteInfo.End > utcnow))
         .count()
     )
     if voteinfo:
@@ -352,7 +351,7 @@ async def trusted_user_addvote_post(
     if type not in ADDVOTE_SPECIFICS:
         context["error"] = "Invalid type."
-        context["type"] = type = "add_tu"  # Default for rendering.
+        context["type"] = type = "add_pm"  # Default for rendering.
         return render_addvote(context, HTTPStatus.BAD_REQUEST)

     if not agenda:
@@ -363,12 +362,12 @@ async def trusted_user_addvote_post(
     duration, quorum = ADDVOTE_SPECIFICS.get(type)
     timestamp = time.utcnow()

-    # Active TU types we filter for.
-    types = {TRUSTED_USER_ID, TRUSTED_USER_AND_DEV_ID}
+    # Active PM types we filter for.
+    types = {PACKAGE_MAINTAINER_ID, PACKAGE_MAINTAINER_AND_DEV_ID}

-    # Create a new TUVoteInfo (proposal)!
+    # Create a new VoteInfo (proposal)!
     with db.begin():
-        active_tus = (
+        active_pms = (
             db.query(User)
             .filter(
                 and_(
@@ -380,16 +379,16 @@ async def trusted_user_addvote_post(
             .count()
         )
         voteinfo = db.create(
-            models.TUVoteInfo,
+            models.VoteInfo,
             User=user,
             Agenda=html.escape(agenda),
             Submitted=timestamp,
             End=(timestamp + duration),
             Quorum=quorum,
-            ActiveTUs=active_tus,
+            ActiveUsers=active_pms,
             Submitter=request.user,
         )

     # Redirect to the new proposal.
-    endpoint = f"/tu/{voteinfo.ID}"
+    endpoint = f"/package-maintainer/{voteinfo.ID}"
     return RedirectResponse(endpoint, status_code=HTTPStatus.SEE_OTHER)

View file

@@ -7,6 +7,7 @@ from fastapi import APIRouter, Form, Query, Request, Response
 import aurweb.filters  # noqa: F401
 from aurweb import aur_logging, config, db, defaults, models, util
 from aurweb.auth import creds, requires_auth
+from aurweb.cache import db_count_cache, db_query_cache
 from aurweb.exceptions import InvariantError, handle_form_exceptions
 from aurweb.models.relation_type import CONFLICTS_ID, PROVIDES_ID, REPLACES_ID
 from aurweb.packages import util as pkgutil
@@ -14,6 +15,7 @@ from aurweb.packages.search import PackageSearch
 from aurweb.packages.util import get_pkg_or_base
 from aurweb.pkgbase import actions as pkgbase_actions, util as pkgbaseutil
 from aurweb.templates import make_context, make_variable_context, render_template
+from aurweb.util import hash_query

 logger = aur_logging.get_logger(__name__)
 router = APIRouter()
@@ -87,7 +89,9 @@ async def packages_get(
     # Collect search result count here; we've applied our keywords.
     # Including more query operations below, like ordering, will
     # increase the amount of time required to collect a count.
-    num_packages = search.count()
+    # we use redis for caching the results of the query
+    cache_expire = config.getint("cache", "expiry_time_search", 600)
+    num_packages = db_count_cache(hash_query(search.query), search.query, cache_expire)

     # Apply user-specified sort column and ordering.
     search.sort_by(sort_by, sort_order)
@@ -108,7 +112,12 @@ async def packages_get(
         models.PackageNotification.PackageBaseID.label("Notify"),
     )

-    packages = results.limit(per_page).offset(offset)
+    # paging
+    results = results.limit(per_page).offset(offset)
+
+    # we use redis for caching the results of the query
+    packages = db_query_cache(hash_query(results), results, cache_expire)
+
     context["packages"] = packages
     context["packages_count"] = num_packages
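The two hunks above swap a direct search.count() and result iteration for db_count_cache() / db_query_cache() lookups keyed by hash_query() (defined in the aurweb/util.py hunk near the end of this diff). As a minimal sketch of a count cache with this call shape, assuming a module-level redis-py client; aurweb's real helpers live in aurweb.cache and may differ in detail:

    # Sketch only: same call shape as db_count_cache(key, query, expire) above.
    # The module-level Redis() client and the int round-trip are assumptions.
    from redis import Redis
    from sqlalchemy.orm import Query

    _redis = Redis()  # assumed connection; the real client is configured elsewhere


    def db_count_cache(key: str, query: Query, expire: int = None) -> int:
        cached = _redis.get(key)
        if cached is None:
            cached = query.count()  # one COUNT query against the database
            _redis.set(key, int(cached), ex=expire)
        return int(cached)

Because hash_query() compiles the statement together with its literal parameters, two requests using the same search filters map to the same cache key.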
@@ -158,7 +167,8 @@ async def package(
             rels_data["r"].append(rel)

     # Add our base information.
-    context = await pkgbaseutil.make_variable_context(request, pkgbase)
+    context = pkgbaseutil.make_context(request, pkgbase)
+    context["q"] = dict(request.query_params)

     context.update({"all_deps": all_deps, "all_reqs": all_reqs})
@@ -180,6 +190,17 @@ async def package(
     if not all_deps:
         deps = deps.limit(max_listing)
     context["dependencies"] = deps.all()
+    # Existing dependencies to avoid multiple lookups
+    context["dependencies_names_from_aur"] = [
+        item.Name
+        for item in db.query(models.Package)
+        .filter(
+            models.Package.Name.in_(
+                pkg.package_dependencies.with_entities(models.PackageDependency.DepName)
+            )
+        )
+        .all()
+    ]

     # Package requirements (other packages depend on this one).
     reqs = pkgutil.pkg_required(pkg.Name, [p.RelName for p in rels_data.get("p", [])])
@@ -190,6 +211,8 @@ async def package(
     context["licenses"] = pkg.package_licenses

+    context["groups"] = pkg.package_groups
+
     conflicts = pkg.package_relations.filter(
         models.PackageRelation.RelTypeID == CONFLICTS_ID
     ).order_by(models.PackageRelation.RelName.asc())
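The dependencies_names_from_aur block added above collects, in one IN query, every dependency name that exists as an AUR package, so the template can test membership without a lookup per dependency row. An illustrative sketch of the pattern (the deps iterable below is a stand-in for the package's dependency rows, not code from this diff):

    # Per-row lookups (the N+1 shape this change avoids):
    known = {
        dep.DepName
        for dep in deps
        if db.query(models.Package).filter(models.Package.Name == dep.DepName).first()
    }

    # Single query over all dependency names, as added above:
    names = pkg.package_dependencies.with_entities(models.PackageDependency.DepName)
    known = {
        row.Name
        for row in db.query(models.Package).filter(models.Package.Name.in_(names)).all()
    }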

View file

@@ -159,6 +159,8 @@ async def pkgbase_flag_post(
             request, "pkgbase/flag.html", context, status_code=HTTPStatus.BAD_REQUEST
         )

+    validate.comment_raise_http_ex(comments)
+
     has_cred = request.user.has_credential(creds.PKGBASE_FLAG)
     if has_cred and not pkgbase.OutOfDateTS:
         now = time.utcnow()
@@ -185,8 +187,7 @@ async def pkgbase_comments_post(
     """Add a new comment via POST request."""
     pkgbase = get_pkg_or_base(name, PackageBase)

-    if not comment:
-        raise HTTPException(status_code=HTTPStatus.BAD_REQUEST)
+    validate.comment_raise_http_ex(comment)

     # If the provided comment is different than the record's version,
     # update the db record.
@@ -304,9 +305,9 @@ async def pkgbase_comment_post(
     pkgbase = get_pkg_or_base(name, PackageBase)
     db_comment = get_pkgbase_comment(pkgbase, id)

-    if not comment:
-        raise HTTPException(status_code=HTTPStatus.BAD_REQUEST)
-    elif request.user.ID != db_comment.UsersID:
+    validate.comment_raise_http_ex(comment)
+
+    if request.user.ID != db_comment.UsersID:
         raise HTTPException(status_code=HTTPStatus.UNAUTHORIZED)

     # If the provided comment is different than the record's version,
@@ -602,6 +603,9 @@ async def pkgbase_disown_post(
 ):
     pkgbase = get_pkg_or_base(name, PackageBase)

+    if comments:
+        validate.comment_raise_http_ex(comments)
+
     comaints = {c.User for c in pkgbase.comaintainers}
     approved = [pkgbase.Maintainer] + list(comaints)
     has_cred = request.user.has_credential(creds.PKGBASE_DISOWN, approved=approved)
@@ -873,6 +877,7 @@ async def pkgbase_delete_post(
         )

     if comments:
+        validate.comment_raise_http_ex(comments)
         # Update any existing deletion requests' ClosureComment.
         with db.begin():
             requests = pkgbase.requests.filter(
@@ -908,7 +913,9 @@ async def pkgbase_merge_get(
     # Perhaps additionally: bad_credential_status_code(creds.PKGBASE_MERGE).
     # Don't take these examples verbatim. We should find good naming.
     if not request.user.has_credential(creds.PKGBASE_MERGE):
-        context["errors"] = ["Only Trusted Users and Developers can merge packages."]
+        context["errors"] = [
+            "Only Package Maintainers and Developers can merge packages."
+        ]
         status_code = HTTPStatus.UNAUTHORIZED

     return render_template(
@@ -934,7 +941,9 @@ async def pkgbase_merge_post(
     # TODO: Lookup errors from credential instead of hardcoding them.
     if not request.user.has_credential(creds.PKGBASE_MERGE):
-        context["errors"] = ["Only Trusted Users and Developers can merge packages."]
+        context["errors"] = [
+            "Only Package Maintainers and Developers can merge packages."
+        ]
         return render_template(
             request, "pkgbase/merge.html", context, status_code=HTTPStatus.UNAUTHORIZED
         )
@@ -962,6 +971,9 @@ async def pkgbase_merge_post(
             request, "pkgbase/merge.html", context, status_code=HTTPStatus.BAD_REQUEST
         )

+    if comments:
+        validate.comment_raise_http_ex(comments)
+
     with db.begin():
         update_closure_comment(pkgbase, MERGE_ID, comments, target=target)
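Several handlers above now call validate.comment_raise_http_ex() instead of open-coding the empty-comment check. Its implementation is not part of this excerpt; a plausible sketch, assuming it also enforces the max_chars_comment option that the aurweb/templates.py hunk further down exposes:

    # Hypothetical sketch; the real helper lives in aurweb.pkgbase.validate
    # and is not shown in this diff.
    from http import HTTPStatus

    from fastapi import HTTPException

    from aurweb import config


    def comment_raise_http_ex(comment: str) -> None:
        if not comment:
            raise HTTPException(status_code=HTTPStatus.BAD_REQUEST)
        max_chars = config.getint("options", "max_chars_comment", 5000)
        if len(comment) > max_chars:
            raise HTTPException(status_code=HTTPStatus.BAD_REQUEST)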

View file

@@ -16,6 +16,7 @@ from aurweb.models.package_request import (
 )
 from aurweb.requests.util import get_pkgreq_by_id
 from aurweb.scripts import notify
+from aurweb.statistics import get_request_counts
 from aurweb.templates import make_context, render_template

 FILTER_PARAMS = {
@@ -31,7 +32,7 @@ router = APIRouter()

 @router.get("/requests")
 @requires_auth
-async def requests(
+async def requests(  # noqa: C901
     request: Request,
     O: int = Query(default=defaults.O),
     PP: int = Query(default=defaults.PP),
@@ -74,18 +75,11 @@ async def requests(
         .join(User, PackageRequest.UsersID == User.ID, isouter=True)
         .join(Maintainer, PackageBase.MaintainerUID == Maintainer.ID, isouter=True)
     )
-    # query = db.query(PackageRequest).join(User)

     # Requests statistics
-    context["total_requests"] = query.count()
-    pending_count = 0 + query.filter(PackageRequest.Status == PENDING_ID).count()
-    context["pending_requests"] = pending_count
-    closed_count = 0 + query.filter(PackageRequest.Status == CLOSED_ID).count()
-    context["closed_requests"] = closed_count
-    accepted_count = 0 + query.filter(PackageRequest.Status == ACCEPTED_ID).count()
-    context["accepted_requests"] = accepted_count
-    rejected_count = 0 + query.filter(PackageRequest.Status == REJECTED_ID).count()
-    context["rejected_requests"] = rejected_count
+    counts = get_request_counts()
+    for k in counts:
+        context[k] = counts[k]

     # Apply status filters
     in_filters = []
@@ -99,9 +93,9 @@ async def requests(
         in_filters.append(REJECTED_ID)
     filtered = query.filter(PackageRequest.Status.in_(in_filters))

-    # Name filter
+    # Name filter (contains)
     if filter_pkg_name:
-        filtered = filtered.filter(PackageBase.Name == filter_pkg_name)
+        filtered = filtered.filter(PackageBase.Name.like(f"%{filter_pkg_name}%"))

     # Additionally filter for requests made from package maintainer
     if filter_maintainer_requests:

View file

@@ -23,6 +23,7 @@ OpenAPI Routes:
 OpenAPI example (version 5): /rpc/v5/info/my-package

 """
+import hashlib
 import re
 from http import HTTPStatus
@@ -180,7 +181,7 @@ async def rpc_post(
     type: Optional[str] = Form(default=None),
     by: Optional[str] = Form(default=defaults.RPC_SEARCH_BY),
     arg: Optional[str] = Form(default=None),
-    args: Optional[list[str]] = Form(default=[], alias="arg[]"),
+    args: list[str] = Form(default=[], alias="arg[]"),
     callback: Optional[str] = Form(default=None),
 ):
     return await rpc_request(request, v, type, by, arg, args, callback)

View file

@@ -1,21 +1,19 @@
-from datetime import datetime
-
 from fastapi import APIRouter, Request
 from fastapi.responses import Response
 from feedgen.feed import FeedGenerator

-from aurweb import db, filters
+from aurweb import config, db, filters
+from aurweb.cache import lambda_cache
 from aurweb.models import Package, PackageBase

 router = APIRouter()

-def make_rss_feed(request: Request, packages: list, date_attr: str):
+def make_rss_feed(request: Request, packages: list):
     """Create an RSS Feed string for some packages.

     :param request: A FastAPI request
     :param packages: A list of packages to add to the RSS feed
-    :param date_attr: The date attribute (DB column) to use
     :return: RSS Feed string
     """
@@ -36,18 +34,11 @@ def make_rss_feed(request: Request, packages: list, date_attr: str):
         entry = feed.add_entry(order="append")
         entry.title(pkg.Name)
         entry.link(href=f"{base}/packages/{pkg.Name}", rel="alternate")
-        entry.link(href=f"{base}/rss", rel="self", type="application/rss+xml")
         entry.description(pkg.Description or str())
-        attr = getattr(pkg.PackageBase, date_attr)
-        dt = filters.timestamp_to_datetime(attr)
+        dt = filters.timestamp_to_datetime(pkg.Timestamp)
         dt = filters.as_timezone(dt, request.user.Timezone)
         entry.pubDate(dt.strftime("%Y-%m-%d %H:%M:%S%z"))
-        entry.source(f"{base}")
-        if pkg.PackageBase.Maintainer:
-            entry.author(author={"name": pkg.PackageBase.Maintainer.Username})
-        entry.guid(f"{pkg.Name} - {attr}")
+        entry.guid(f"{pkg.Name}-{pkg.Timestamp}")

     return feed.rss_str()
@@ -59,16 +50,18 @@ async def rss(request: Request):
         .join(PackageBase)
         .order_by(PackageBase.SubmittedTS.desc())
         .limit(100)
+        .with_entities(
+            Package.Name,
+            Package.Description,
+            PackageBase.SubmittedTS.label("Timestamp"),
+        )
     )
-    feed = make_rss_feed(request, packages, "SubmittedTS")

+    # we use redis for caching the results of the feedgen
+    cache_expire = config.getint("cache", "expiry_time_rss", 300)
+    feed = lambda_cache("rss", lambda: make_rss_feed(request, packages), cache_expire)
     response = Response(feed, media_type="application/rss+xml")
-    package = packages.first()
-    if package:
-        dt = datetime.utcfromtimestamp(package.PackageBase.SubmittedTS)
-        modified = dt.strftime("%a, %d %m %Y %H:%M:%S GMT")
-        response.headers["Last-Modified"] = modified

     return response
@@ -79,14 +72,18 @@ async def rss_modified(request: Request):
         .join(PackageBase)
         .order_by(PackageBase.ModifiedTS.desc())
         .limit(100)
+        .with_entities(
+            Package.Name,
+            Package.Description,
+            PackageBase.ModifiedTS.label("Timestamp"),
+        )
     )
-    feed = make_rss_feed(request, packages, "ModifiedTS")

+    # we use redis for caching the results of the feedgen
+    cache_expire = config.getint("cache", "expiry_time_rss", 300)
+    feed = lambda_cache(
+        "rss_modified", lambda: make_rss_feed(request, packages), cache_expire
+    )
     response = Response(feed, media_type="application/rss+xml")
-    package = packages.first()
-    if package:
-        dt = datetime.utcfromtimestamp(package.PackageBase.ModifiedTS)
-        modified = dt.strftime("%a, %d %m %Y %H:%M:%S GMT")
-        response.headers["Last-Modified"] = modified

     return response
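Both feeds are now built through lambda_cache(key, fn, expire), so the feedgen work runs once per expiry window instead of on every request (which is also why the per-request Last-Modified header above was dropped). A minimal sketch of a helper with that call shape, assuming a redis-py client and pickle serialization; aurweb's version lives in aurweb.cache:

    # Sketch only; client setup and serialization are assumptions.
    import pickle

    from redis import Redis

    _redis = Redis()


    def lambda_cache(key: str, value_fn, expire: int = None):
        cached = _redis.get(key)
        if cached is not None:
            return pickle.loads(cached)  # serve the cached feed body
        value = value_fn()  # build the RSS string once
        _redis.set(key, pickle.dumps(value), ex=expire)
        return value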

View file

@@ -80,7 +80,9 @@ def open_session(request, conn, user_id):
         conn.execute(
             Users.update()
             .where(Users.c.ID == user_id)
-            .values(LastLogin=int(time.time()), LastLoginIPAddress=request.client.host)
+            .values(
+                LastLogin=int(time.time()), LastLoginIPAddress=util.get_client_ip(request)
+            )
         )

     return sid
@@ -110,7 +112,7 @@ async def authenticate(
     Receive an OpenID Connect ID token, validate it, then process it to create
     an new AUR session.
     """
-    if is_ip_banned(conn, request.client.host):
+    if is_ip_banned(conn, util.get_client_ip(request)):
         _ = get_translator_for_request(request)
         raise HTTPException(
             status_code=HTTPStatus.FORBIDDEN,

View file

@@ -5,7 +5,6 @@ Changes here should always be accompanied by an Alembic migration, which can
 usually be automatically generated. See `migrations/README` for details.
 """
 from sqlalchemy import (
     CHAR,
     TIMESTAMP,
@@ -184,6 +183,8 @@ PackageBases = Table(
     Index("BasesNumVotes", "NumVotes"),
     Index("BasesPackagerUID", "PackagerUID"),
     Index("BasesSubmitterUID", "SubmitterUID"),
+    Index("BasesSubmittedTS", "SubmittedTS"),
+    Index("BasesModifiedTS", "ModifiedTS"),
     mysql_engine="InnoDB",
     mysql_charset="utf8mb4",
     mysql_collate="utf8mb4_general_ci",
@@ -526,8 +527,8 @@ PackageRequests = Table(

 # Vote information
-TU_VoteInfo = Table(
-    "TU_VoteInfo",
+VoteInfo = Table(
+    "VoteInfo",
     metadata,
     Column("ID", INTEGER(unsigned=True), primary_key=True),
     Column("Agenda", Text, nullable=False),
@@ -546,7 +547,10 @@ TU_VoteInfo = Table(
         "Abstain", INTEGER(unsigned=True), nullable=False, server_default=text("'0'")
     ),
     Column(
-        "ActiveTUs", INTEGER(unsigned=True), nullable=False, server_default=text("'0'")
+        "ActiveUsers",
+        INTEGER(unsigned=True),
+        nullable=False,
+        server_default=text("'0'"),
     ),
     mysql_engine="InnoDB",
     mysql_charset="utf8mb4",
@@ -555,10 +559,10 @@ TU_VoteInfo = Table(

 # Individual vote records
-TU_Votes = Table(
-    "TU_Votes",
+Votes = Table(
+    "Votes",
     metadata,
-    Column("VoteID", ForeignKey("TU_VoteInfo.ID", ondelete="CASCADE"), nullable=False),
+    Column("VoteID", ForeignKey("VoteInfo.ID", ondelete="CASCADE"), nullable=False),
     Column("UserID", ForeignKey("Users.ID", ondelete="CASCADE"), nullable=False),
     mysql_engine="InnoDB",
 )
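As the module docstring notes, schema changes like the TU_VoteInfo/TU_Votes renames, the ActiveTUs column rename, and the two new PackageBases indexes need an accompanying Alembic migration. A hedged sketch of the upgrade step such a migration might contain (revision identifiers omitted; the actual migration under migrations/versions/ may differ):

    # Sketch of a possible upgrade(); not the migration shipped with aurweb.
    from alembic import op
    from sqlalchemy.dialects.mysql import INTEGER


    def upgrade():
        op.rename_table("TU_VoteInfo", "VoteInfo")
        op.rename_table("TU_Votes", "Votes")
        op.alter_column(
            "VoteInfo",
            "ActiveTUs",
            new_column_name="ActiveUsers",
            existing_type=INTEGER(unsigned=True),
        )
        op.create_index("BasesSubmittedTS", "PackageBases", ["SubmittedTS"])
        op.create_index("BasesModifiedTS", "PackageBases", ["ModifiedTS"])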

View file

@@ -6,6 +6,7 @@ See `aurweb-adduser --help` for documentation.
 Copyright (C) 2022 aurweb Development Team
 All Rights Reserved
 """
 import argparse
 import sys
 import traceback

View file

@@ -3,6 +3,7 @@ Perform an action on the aurweb config.
 When AUR_CONFIG_IMMUTABLE is set, the `set` action is noop.
 """
 import argparse
 import configparser
 import os

View file

@@ -3,7 +3,7 @@ import importlib
 import os
 import sys
 import traceback
-from datetime import datetime
+from datetime import UTC, datetime

 import orjson
 import pygit2
@@ -60,7 +60,7 @@ def update_repository(repo: pygit2.Repository):
     except pygit2.GitError:
         base = []

-    utcnow = datetime.utcnow()
+    utcnow = datetime.now(UTC)
     author = pygit2.Signature(
         config.get("git-archive", "author"),
         config.get("git-archive", "author-email"),

View file

@@ -20,7 +20,7 @@ from aurweb.models.package_comment import PackageComment
 from aurweb.models.package_notification import PackageNotification
 from aurweb.models.package_request import PackageRequest
 from aurweb.models.request_type import RequestType
-from aurweb.models.tu_vote import TUVote
+from aurweb.models.vote import Vote

 logger = aur_logging.get_logger(__name__)
@@ -45,6 +45,9 @@ class Notification:
     def get_cc(self):
         return []

+    def get_bcc(self):
+        return []
+
     def get_body_fmt(self, lang):
         body = ""
         for line in self.get_body(lang).splitlines():
@@ -114,7 +117,7 @@ class Notification:
         server.login(user, passwd)
         server.set_debuglevel(0)
-        deliver_to = [to] + self.get_cc()
+        deliver_to = [to] + self.get_cc() + self.get_bcc()
         server.sendmail(sender, deliver_to, msg.as_bytes())
         server.quit()
@@ -334,6 +337,7 @@ class FlagNotification(Notification):
             .filter(and_(PackageBase.ID == pkgbase_id, User.Suspended == 0))
             .with_entities(User.Email, User.LangPreference)
             .distinct()
+            .order_by(User.Email)
         )
         self._recipients = [(u.Email, u.LangPreference) for u in query]
@@ -577,10 +581,11 @@ class RequestOpenNotification(Notification):
                 ),
             )
             .filter(and_(PackageRequest.ID == reqid, User.Suspended == 0))
-            .with_entities(User.Email)
+            .with_entities(User.Email, User.HideEmail)
             .distinct()
         )
-        self._cc = [u.Email for u in query]
+        self._cc = [u.Email for u in query if u.HideEmail == 0]
+        self._bcc = [u.Email for u in query if u.HideEmail == 1]

         pkgreq = (
             db.query(PackageRequest.Comments).filter(PackageRequest.ID == reqid).first()
@@ -597,6 +602,9 @@ class RequestOpenNotification(Notification):
     def get_cc(self):
         return self._cc

+    def get_bcc(self):
+        return self._bcc
+
     def get_subject(self, lang):
         return "[PRQ#%d] %s Request for %s" % (
             self._reqid,
@@ -664,10 +672,11 @@ class RequestCloseNotification(Notification):
                 ),
             )
             .filter(and_(PackageRequest.ID == reqid, User.Suspended == 0))
-            .with_entities(User.Email)
+            .with_entities(User.Email, User.HideEmail)
             .distinct()
         )
-        self._cc = [u.Email for u in query]
+        self._cc = [u.Email for u in query if u.HideEmail == 0]
+        self._bcc = [u.Email for u in query if u.HideEmail == 1]

         pkgreq = (
             db.query(PackageRequest)
@@ -694,6 +703,9 @@ class RequestCloseNotification(Notification):
     def get_cc(self):
         return self._cc

+    def get_bcc(self):
+        return self._bcc
+
     def get_subject(self, lang):
         return "[PRQ#%d] %s Request for %s %s" % (
             self._reqid,
@@ -732,11 +744,11 @@ class RequestCloseNotification(Notification):
         return headers

-class TUVoteReminderNotification(Notification):
+class VoteReminderNotification(Notification):
     def __init__(self, vote_id):
         self._vote_id = int(vote_id)

-        subquery = db.query(TUVote.UserID).filter(TUVote.VoteID == vote_id)
+        subquery = db.query(Vote.UserID).filter(Vote.VoteID == vote_id)
         query = (
             db.query(User)
             .filter(
@@ -757,7 +769,7 @@ class TUVoteReminderNotification(Notification):
     def get_subject(self, lang):
         return aurweb.l10n.translator.translate(
-            "TU Vote Reminder: Proposal {id}", lang
+            "Package Maintainer Vote Reminder: Proposal {id}", lang
         ).format(id=self._vote_id)

     def get_body(self, lang):
@@ -768,7 +780,7 @@ class TUVoteReminderNotification(Notification):
         ).format(id=self._vote_id)

     def get_refs(self):
-        return (aur_location + "/tu/?id=" + str(self._vote_id),)
+        return (aur_location + "/package-maintainer/?id=" + str(self._vote_id),)

 def main():
@@ -787,7 +799,7 @@ def main():
         "delete": DeleteNotification,
         "request-open": RequestOpenNotification,
         "request-close": RequestCloseNotification,
-        "tu-vote-reminder": TUVoteReminderNotification,
+        "vote-reminder": VoteReminderNotification,
     }

     with db.begin():

View file

@@ -72,8 +72,13 @@ class GitCommitsInlineProcessor(markdown.inlinepatterns.InlineProcessor):
     def handleMatch(self, m, data):
         oid = m.group(1)
-        if oid not in self._repo:
-            # Unknown OID; preserve the orginal text.
+        # Lookup might raise ValueError in case multiple object ID's were found
+        try:
+            if oid not in self._repo:
+                # Unknown OID; preserve the orginal text.
+                return None, None, None
+        except ValueError:
+            # Multiple OID's found; preserve the orginal text.
             return None, None, None

         el = Element("a")
@@ -116,6 +121,20 @@ class HeadingExtension(markdown.extensions.Extension):
         md.treeprocessors.register(HeadingTreeprocessor(md), "heading", 30)

+class StrikethroughInlineProcessor(markdown.inlinepatterns.InlineProcessor):
+    def handleMatch(self, m, data):
+        el = Element("del")
+        el.text = m.group(1)
+        return el, m.start(0), m.end(0)
+
+
+class StrikethroughExtension(markdown.extensions.Extension):
+    def extendMarkdown(self, md):
+        pattern = r"~~(.*?)~~"
+        processor = StrikethroughInlineProcessor(pattern, md)
+        md.inlinePatterns.register(processor, "del", 40)
+
+
 def save_rendered_comment(comment: PackageComment, html: str):
     with db.begin():
         comment.RenderedComment = html
@@ -132,11 +151,13 @@ def update_comment_render(comment: PackageComment) -> None:
     html = markdown.markdown(
         text,
         extensions=[
+            "md_in_html",
             "fenced_code",
             LinkifyExtension(),
             FlysprayLinksExtension(),
             GitCommitsExtension(pkgbasename),
             HeadingExtension(),
+            StrikethroughExtension(),
         ],
     )
@@ -148,6 +169,9 @@ def update_comment_render(comment: PackageComment) -> None:
         "h6",
         "br",
         "hr",
+        "del",
+        "details",
+        "summary",
     ]
     html = bleach.clean(html, tags=allowed_tags)
     save_rendered_comment(comment, html)
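Since the strikethrough extension is fully defined in the hunks above, a quick standalone check of what it produces (assumes the two classes above are in scope; the bleach allow-list must also contain "del", which the last hunk adds):

    import markdown

    html = markdown.markdown(
        "this option is ~~deprecated~~ removed",
        extensions=[StrikethroughExtension()],
    )
    # html == '<p>this option is <del>deprecated</del> removed</p>'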

View file

@@ -4,7 +4,7 @@ from sqlalchemy import and_

 import aurweb.config
 from aurweb import db, time
-from aurweb.models import TUVoteInfo
+from aurweb.models import VoteInfo
 from aurweb.scripts import notify

 notify_cmd = aurweb.config.get("notifications", "notify-cmd")
@@ -15,17 +15,17 @@ def main():
     now = time.utcnow()

-    start = aurweb.config.getint("tuvotereminder", "range_start")
+    start = aurweb.config.getint("votereminder", "range_start")
     filter_from = now + start
-    end = aurweb.config.getint("tuvotereminder", "range_end")
+    end = aurweb.config.getint("votereminder", "range_end")
     filter_to = now + end

-    query = db.query(TUVoteInfo.ID).filter(
-        and_(TUVoteInfo.End >= filter_from, TUVoteInfo.End <= filter_to)
+    query = db.query(VoteInfo.ID).filter(
+        and_(VoteInfo.End >= filter_from, VoteInfo.End <= filter_to)
     )
     for voteinfo in query:
-        notif = notify.TUVoteReminderNotification(voteinfo.ID)
+        notif = notify.VoteReminderNotification(voteinfo.ID)
         notif.send()

View file

@@ -7,7 +7,6 @@ This module uses a global state, since you cant open two servers with the same
 configuration anyway.
 """
 import argparse
 import atexit
 import os
@@ -52,46 +51,46 @@ def generate_nginx_config():
     fastapi_bind = aurweb.config.get("fastapi", "bind_address")
     fastapi_host = fastapi_bind.split(":")[0]
     config_path = os.path.join(temporary_dir, "nginx.conf")
-    config = open(config_path, "w")
-    # We double nginx's braces because they conflict with Python's f-strings.
-    config.write(
-        f"""
+    with open(config_path, "w") as config:
+        # We double nginx's braces because they conflict with Python's f-strings.
+        config.write(
+            f"""
 events {{}}
 daemon off;
 error_log /dev/stderr info;
 pid {os.path.join(temporary_dir, "nginx.pid")};
 http {{
     access_log /dev/stdout;
     client_body_temp_path {os.path.join(temporary_dir, "client_body")};
     proxy_temp_path {os.path.join(temporary_dir, "proxy")};
     fastcgi_temp_path {os.path.join(temporary_dir, "fastcgi")}1 2;
     uwsgi_temp_path {os.path.join(temporary_dir, "uwsgi")};
     scgi_temp_path {os.path.join(temporary_dir, "scgi")};
     server {{
         listen {fastapi_host}:{FASTAPI_NGINX_PORT};
         location / {{
             try_files $uri @proxy_to_app;
         }}
         location @proxy_to_app {{
             proxy_set_header Host $http_host;
             proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
             proxy_set_header X-Forwarded-Proto $scheme;
             proxy_redirect off;
             proxy_buffering off;
             proxy_pass http://{fastapi_bind};
         }}
     }}
 }}
 """
-    )
+        )
     return config_path

-def spawn_child(args):
+def spawn_child(_args):
     """Open a subprocess and add it to the global state."""
     if verbosity >= 1:
-        print(f":: Spawning {args}", file=sys.stderr)
-    children.append(subprocess.Popen(args))
+        print(f":: Spawning {_args}", file=sys.stderr)
+    children.append(subprocess.Popen(_args))

 def start():
@@ -172,17 +171,17 @@ def start():
     )

-def _kill_children(
-    children: Iterable, exceptions: list[Exception] = []
-) -> list[Exception]:
+def _kill_children(_children: Iterable, exceptions=None) -> list[Exception]:
     """
     Kill each process found in `children`.

-    :param children: Iterable of child processes
+    :param _children: Iterable of child processes
     :param exceptions: Exception memo
     :return: `exceptions`
     """
-    for p in children:
+    if exceptions is None:
+        exceptions = []
+    for p in _children:
         try:
             p.terminate()
             if verbosity >= 1:
@@ -192,17 +191,17 @@ def _kill_children(
     return exceptions

-def _wait_for_children(
-    children: Iterable, exceptions: list[Exception] = []
-) -> list[Exception]:
+def _wait_for_children(_children: Iterable, exceptions=None) -> list[Exception]:
     """
     Wait for each process to end found in `children`.

-    :param children: Iterable of child processes
+    :param _children: Iterable of child processes
     :param exceptions: Exception memo
     :return: `exceptions`
     """
-    for p in children:
+    if exceptions is None:
+        exceptions = []
+    for p in _children:
         try:
             rc = p.wait()
             if rc != 0 and rc != -15:
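The new _kill_children/_wait_for_children signatures replace the mutable default `exceptions: list[Exception] = []` with `exceptions=None`, avoiding Python's shared-default pitfall in which one list instance is reused across calls. A standalone illustration (not aurweb code):

    def buggy(item, memo=[]):  # one list object shared by every call
        memo.append(item)
        return memo


    def fixed(item, memo=None):  # fresh list per call unless one is passed in
        if memo is None:
            memo = []
        memo.append(item)
        return memo


    print(buggy(1), buggy(2))  # [1, 2] [1, 2]  (state leaks between calls)
    print(fixed(1), fixed(2))  # [1] [2]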

aurweb/statistics.py (new file, 169 lines)

@@ -0,0 +1,169 @@
from sqlalchemy import func
from aurweb import config, db, time
from aurweb.cache import db_count_cache, db_query_cache
from aurweb.models import PackageBase, PackageRequest, RequestType, User
from aurweb.models.account_type import (
PACKAGE_MAINTAINER_AND_DEV_ID,
PACKAGE_MAINTAINER_ID,
USER_ID,
)
from aurweb.models.package_request import (
ACCEPTED_ID,
CLOSED_ID,
PENDING_ID,
REJECTED_ID,
)
from aurweb.prometheus import PACKAGES, REQUESTS, USERS
cache_expire = config.getint("cache", "expiry_time_statistics", 300)
HOMEPAGE_COUNTERS = [
"package_count",
"orphan_count",
"seven_days_old_added",
"seven_days_old_updated",
"year_old_updated",
"never_updated",
"user_count",
"package_maintainer_count",
]
REQUEST_COUNTERS = [
"total_requests",
"pending_requests",
"closed_requests",
"accepted_requests",
"rejected_requests",
]
PROMETHEUS_USER_COUNTERS = [
("package_maintainer_count", "package_maintainer"),
("regular_user_count", "user"),
]
PROMETHEUS_PACKAGE_COUNTERS = [
("orphan_count", "orphan"),
("never_updated", "not_updated"),
("updated_packages", "updated"),
]
class Statistics:
seven_days = 86400 * 7
one_hour = 3600
year = seven_days * 52
def __init__(self, cache_expire: int = None) -> "Statistics":
self.expiry_time = cache_expire
self.now = time.utcnow()
self.seven_days_ago = self.now - self.seven_days
self.year_ago = self.now - self.year
self.user_query = db.query(User)
self.bases_query = db.query(PackageBase)
self.updated_query = db.query(PackageBase).filter(
PackageBase.ModifiedTS - PackageBase.SubmittedTS >= self.one_hour
)
self.request_query = db.query(PackageRequest)
def get_count(self, counter: str) -> int:
query = None
match counter:
# Packages
case "package_count":
query = self.bases_query
case "orphan_count":
query = self.bases_query.filter(PackageBase.MaintainerUID.is_(None))
case "seven_days_old_added":
query = self.bases_query.filter(
PackageBase.SubmittedTS >= self.seven_days_ago
)
case "seven_days_old_updated":
query = self.updated_query.filter(
PackageBase.ModifiedTS >= self.seven_days_ago
)
case "year_old_updated":
query = self.updated_query.filter(
PackageBase.ModifiedTS >= self.year_ago
)
case "never_updated":
query = self.bases_query.filter(
PackageBase.ModifiedTS - PackageBase.SubmittedTS < self.one_hour
)
case "updated_packages":
query = self.bases_query.filter(
PackageBase.ModifiedTS - PackageBase.SubmittedTS > self.one_hour,
~PackageBase.MaintainerUID.is_(None),
)
# Users
case "user_count":
query = self.user_query
case "package_maintainer_count":
query = self.user_query.filter(
User.AccountTypeID.in_(
(
PACKAGE_MAINTAINER_ID,
PACKAGE_MAINTAINER_AND_DEV_ID,
)
)
)
case "regular_user_count":
query = self.user_query.filter(User.AccountTypeID == USER_ID)
# Requests
case "total_requests":
query = self.request_query
case "pending_requests":
query = self.request_query.filter(PackageRequest.Status == PENDING_ID)
case "closed_requests":
query = self.request_query.filter(PackageRequest.Status == CLOSED_ID)
case "accepted_requests":
query = self.request_query.filter(PackageRequest.Status == ACCEPTED_ID)
case "rejected_requests":
query = self.request_query.filter(PackageRequest.Status == REJECTED_ID)
case _:
return -1
return db_count_cache(counter, query, expire=self.expiry_time)
def update_prometheus_metrics():
stats = Statistics(cache_expire)
# Users gauge
for counter, utype in PROMETHEUS_USER_COUNTERS:
count = stats.get_count(counter)
USERS.labels(utype).set(count)
# Packages gauge
for counter, state in PROMETHEUS_PACKAGE_COUNTERS:
count = stats.get_count(counter)
PACKAGES.labels(state).set(count)
# Requests gauge
query = (
db.get_session()
.query(PackageRequest, func.count(PackageRequest.ID), RequestType.Name)
.join(RequestType)
.group_by(RequestType.Name, PackageRequest.Status)
)
results = db_query_cache("request_metrics", query, cache_expire)
for record in results:
status = record[0].status_display()
count = record[1]
rtype = record[2]
REQUESTS.labels(type=rtype, status=status).set(count)
def _get_counts(counters: list[str]) -> dict[str, int]:
stats = Statistics(cache_expire)
result = dict()
for counter in counters:
result[counter] = stats.get_count(counter)
return result
def get_homepage_counts() -> dict[str, int]:
return _get_counts(HOMEPAGE_COUNTERS)
def get_request_counts() -> dict[str, int]:
return _get_counts(REQUEST_COUNTERS)
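The helpers at the bottom return plain dicts keyed by the counter names above, which is what the /requests route earlier in this diff merges into its template context. A short usage sketch (the empty `context` dict stands in for a template context):

    from aurweb.statistics import get_homepage_counts, update_prometheus_metrics

    context = {}
    context.update(get_homepage_counts())  # context["package_count"], context["orphan_count"], ...

    # Prometheus gauges can be refreshed from the same cached counters; how often
    # this runs (request hook, periodic task) is not specified by this file.
    update_prometheus_metrics()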

View file

@@ -3,7 +3,6 @@ import functools
 import os
 from http import HTTPStatus
 from typing import Callable
-from zoneinfo import ZoneInfoNotFoundError

 import jinja2
 from fastapi import Request
@@ -71,14 +70,12 @@ def make_context(request: Request, title: str, next: str = None):
     commit_url = aurweb.config.get_with_fallback("devel", "commit_url", None)
     commit_hash = aurweb.config.get_with_fallback("devel", "commit_hash", None)
+    max_chars_comment = aurweb.config.getint("options", "max_chars_comment", 5000)
     if commit_hash:
         # Shorten commit_hash to a short Git hash.
         commit_hash = commit_hash[:7]

-    try:
-        timezone = time.get_request_timezone(request)
-    except ZoneInfoNotFoundError:
-        timezone = DEFAULT_TIMEZONE
+    timezone = time.get_request_timezone(request)
     language = l10n.get_request_language(request)
     return {
         "request": request,
@@ -96,6 +93,7 @@ def make_context(request: Request, title: str, next: str = None):
         "creds": aurweb.auth.creds,
         "next": next if next else request.url.path,
         "version": os.environ.get("COMMIT_HASH", aurweb.config.AURWEB_VERSION),
+        "max_chars_comment": max_chars_comment,
     }
@@ -110,9 +108,7 @@ async def make_variable_context(request: Request, title: str, next: str = None):
     )

     for k, v in to_copy.items():
-        if k == "timezone":
-            context[k] = v if v in time.SUPPORTED_TIMEZONES else DEFAULT_TIMEZONE
-        else:
+        if k not in context:
             context[k] = v

     context["q"] = dict(request.query_params)

View file

@@ -51,8 +51,8 @@ def setup_test_db(*args):
         models.Session.__tablename__,
         models.SSHPubKey.__tablename__,
         models.Term.__tablename__,
-        models.TUVote.__tablename__,
-        models.TUVoteInfo.__tablename__,
+        models.Vote.__tablename__,
+        models.VoteInfo.__tablename__,
         models.User.__tablename__,
     ]

View file

@@ -0,0 +1,8 @@
from aurweb import prometheus
def clear_metrics():
prometheus.PACKAGES.clear()
prometheus.REQUESTS.clear()
prometheus.SEARCH_REQUESTS.clear()
prometheus.USERS.clear()

View file

@@ -1,7 +1,6 @@
 import zoneinfo
 from collections import OrderedDict
-from datetime import datetime
-from urllib.parse import unquote
+from datetime import UTC, datetime
 from zoneinfo import ZoneInfo

 from fastapi import Request
@@ -58,16 +57,20 @@ SUPPORTED_TIMEZONES = OrderedDict(
 )

-def get_request_timezone(request: Request):
-    """Get a request's timezone by its AURTZ cookie. We use the
-    configuration's [options] default_timezone otherwise.
+def get_request_timezone(request: Request) -> str:
+    """Get a request's timezone from either query param or user settings.
+    We use the configuration's [options] default_timezone otherwise.

     @param request FastAPI request
     """
-    default_tz = aurweb.config.get("options", "default_timezone")
-    if request.user.is_authenticated():
-        default_tz = request.user.Timezone
-    return unquote(request.cookies.get("AURTZ", default_tz))
+    request_tz = request.query_params.get("timezone")
+    if request_tz and request_tz in SUPPORTED_TIMEZONES:
+        return request_tz
+    elif (
+        request.user.is_authenticated() and request.user.Timezone in SUPPORTED_TIMEZONES
+    ):
+        return request.user.Timezone
+    return aurweb.config.get_with_fallback("options", "default_timezone", "UTC")

 def now(timezone: str) -> datetime:
@@ -86,4 +89,4 @@ def utcnow() -> int:
     :return: Current UTC timestamp
     """
-    return int(datetime.utcnow().timestamp())
+    return int(datetime.now(UTC).timestamp())

View file

@@ -6,6 +6,7 @@ out of form data from /account/register or /account/{username}/edit.
 All functions in this module raise aurweb.exceptions.ValidationError
 when encountering invalid criteria and return silently otherwise.
 """
 from fastapi import Request
 from sqlalchemy import and_
@@ -56,12 +57,9 @@ def invalid_password(
 ) -> None:
     if P:
         if not util.valid_password(P):
-            username_min_len = config.getint("options", "username_min_len")
+            passwd_min_len = config.getint("options", "passwd_min_len")
             raise ValidationError(
-                [
-                    _("Your password must be at least %s characters.")
-                    % (username_min_len)
-                ]
+                [_("Your password must be at least %s characters.") % (passwd_min_len)]
             )
         elif not C:
             raise ValidationError(["Please confirm your new password."])
@@ -70,7 +68,7 @@ def invalid_password(

 def is_banned(request: Request = None, **kwargs) -> None:
-    host = request.client.host
+    host = util.get_client_ip(request)
     exists = db.query(models.Ban, models.Ban.IPAddress == host).exists()
     if db.query(exists).scalar():
         raise ValidationError(
@@ -220,7 +218,7 @@ def invalid_account_type(
         raise ValidationError([error])

     logger.debug(
-        f"Trusted User '{request.user.Username}' has "
+        f"Package Maintainer '{request.user.Username}' has "
         f"modified '{user.Username}' account's type to"
         f" {name}."
     )

View file

@@ -4,6 +4,7 @@ import secrets
 import shlex
 import string
 from datetime import datetime
+from hashlib import sha1
 from http import HTTPStatus
 from subprocess import PIPE, Popen
 from typing import Callable, Iterable, Tuple, Union
@@ -13,6 +14,7 @@ import fastapi
 import pygit2
 from email_validator import EmailSyntaxError, validate_email
 from fastapi.responses import JSONResponse
+from sqlalchemy.orm import Query

 import aurweb.config
 from aurweb import aur_logging, defaults
@@ -190,9 +192,9 @@ def parse_ssh_key(string: str) -> Tuple[str, str]:
     return prefix, key

-def parse_ssh_keys(string: str) -> list[Tuple[str, str]]:
+def parse_ssh_keys(string: str) -> set[Tuple[str, str]]:
     """Parse a list of SSH public keys."""
-    return [parse_ssh_key(e) for e in string.strip().splitlines(True) if e.strip()]
+    return set([parse_ssh_key(e) for e in string.strip().splitlines(True) if e.strip()])

 def shell_exec(cmdline: str, cwd: str) -> Tuple[int, str, str]:
@@ -200,3 +202,17 @@ def shell_exec(cmdline: str, cwd: str) -> Tuple[int, str, str]:
     proc = Popen(args, cwd=cwd, stdout=PIPE, stderr=PIPE)
     out, err = proc.communicate()
     return proc.returncode, out.decode().strip(), err.decode().strip()
+
+
+def hash_query(query: Query):
+    return sha1(
+        str(query.statement.compile(compile_kwargs={"literal_binds": True})).encode()
+    ).hexdigest()
+
+
+def get_client_ip(request: fastapi.Request) -> str:
+    """
+    Returns the client's IP address for a Request.
+    Falls back to 'testclient' if request.client is None
+    """
+    return request.client.host if request.client else "testclient"
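A brief usage sketch for the two new helpers, tying them back to the caching calls in aurweb/routers/packages.py and the client-IP call sites earlier in this diff (the example query is arbitrary):

    from aurweb import db, models
    from aurweb.util import get_client_ip, hash_query

    query = db.query(models.Package).filter(models.Package.Name.like("python-%"))
    key = hash_query(query)  # stable SHA-1 of the compiled SQL, usable as a cache key

    # Inside a FastAPI handler, get_client_ip(request) is preferred over
    # request.client.host, which the docstring above notes can be unset in tests.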

ci/tf/.terraform.lock.hcl (generated new file, 61 lines)

@@ -0,0 +1,61 @@
# This file is maintained automatically by "terraform init".
# Manual edits may be lost in future updates.
provider "registry.terraform.io/hashicorp/dns" {
version = "3.3.2"
hashes = [
"h1:HjskPLRqmCw8Q/kiSuzti3iJBSpcAvcBFdlwFFQuoDE=",
"zh:05d2d50e301318362a4a82e6b7a9734ace07bc01abaaa649c566baf98814755f",
"zh:1e9fd1c3bfdda777e83e42831dd45b7b9e794250a0f351e5fd39762e8a0fe15b",
"zh:40e715fc7a2ede21f919567249b613844692c2f8a64f93ee64e5b68bae7ac2a2",
"zh:454d7aa83000a6e2ba7a7bfde4bcf5d7ed36298b22d760995ca5738ab02ee468",
"zh:46124ded51b4153ad90f12b0305fdbe0c23261b9669aa58a94a31c9cca2f4b19",
"zh:55a4f13d20f73534515a6b05701abdbfc54f4e375ba25b2dffa12afdad20e49d",
"zh:78d5eefdd9e494defcb3c68d282b8f96630502cac21d1ea161f53cfe9bb483b3",
"zh:7903b1ceb8211e2b8c79290e2e70906a4b88f4fba71c900eb3a425ce12f1716a",
"zh:b79fc4f444ef7a2fd7111a80428c070ad824f43a681699e99ab7f83074dfedbd",
"zh:ca9f45e0c4cb94e7d62536c226024afef3018b1de84f1ea4608b51bcd497a2a0",
"zh:ddc8bd894559d7d176e0ceb0bb1ae266519b01b315362ebfee8327bb7e7e5fa8",
"zh:e77334c0794ef8f9354b10e606040f6b0b67b373f5ff1db65bddcdd4569b428b",
]
}
provider "registry.terraform.io/hashicorp/tls" {
version = "4.0.4"
hashes = [
"h1:pe9vq86dZZKCm+8k1RhzARwENslF3SXb9ErHbQfgjXU=",
"zh:23671ed83e1fcf79745534841e10291bbf34046b27d6e68a5d0aab77206f4a55",
"zh:45292421211ffd9e8e3eb3655677700e3c5047f71d8f7650d2ce30242335f848",
"zh:59fedb519f4433c0fdb1d58b27c210b27415fddd0cd73c5312530b4309c088be",
"zh:5a8eec2409a9ff7cd0758a9d818c74bcba92a240e6c5e54b99df68fff312bbd5",
"zh:5e6a4b39f3171f53292ab88058a59e64825f2b842760a4869e64dc1dc093d1fe",
"zh:810547d0bf9311d21c81cc306126d3547e7bd3f194fc295836acf164b9f8424e",
"zh:824a5f3617624243bed0259d7dd37d76017097dc3193dac669be342b90b2ab48",
"zh:9361ccc7048be5dcbc2fafe2d8216939765b3160bd52734f7a9fd917a39ecbd8",
"zh:aa02ea625aaf672e649296bce7580f62d724268189fe9ad7c1b36bb0fa12fa60",
"zh:c71b4cd40d6ec7815dfeefd57d88bc592c0c42f5e5858dcc88245d371b4b8b1e",
"zh:dabcd52f36b43d250a3d71ad7abfa07b5622c69068d989e60b79b2bb4f220316",
"zh:f569b65999264a9416862bca5cd2a6177d94ccb0424f3a4ef424428912b9cb3c",
]
}
provider "registry.terraform.io/hetznercloud/hcloud" {
version = "1.42.0"
hashes = [
"h1:cr9lh26H3YbWSHb7OUnCoYw169cYO3Cjpt3yPnRhXS0=",
"zh:153b5f39d780e9a18bc1ea377d872647d328d943813cbd25d3d20863f8a37782",
"zh:35b9e95760c58cca756e34ad5f4138ac6126aa3e8c41b4a0f1d5dc9ee5666c73",
"zh:47a3cdbce982f2b4e17f73d4934bdb3e905a849b36fb59b80f87d852496ed049",
"zh:6a718c244c2ba300fbd43791661a061ad1ab16225ef3e8aeaa3db8c9eff12c85",
"zh:a2cbfc95c5e2c9422ed0a7b6292192c38241220d5b7813c678f937ab3ef962ae",
"zh:b837e118e08fd36aa8be48af7e9d0d3d112d2680c79cfc71cfe2501fb40dbefa",
"zh:bf66db8c680e18b77e16dc1f20ed1cdcc7876bfb7848c320ccb86f0fb80661ed",
"zh:c1ad80bbe48dc8a272a02dcdb4b12f019606f445606651c01e561b9d72d816b1",
"zh:d4e616701128ad14a6b5a427b0e9145ece4cad02aa3b5f9945c6d0b9ada8ab70",
"zh:d9d01f727037d028720100a5bc9fd213cb01e63e4b439a16f2f482c147976530",
"zh:dea047ee4d679370d4376fb746c4b959bf51dd06047c1c2656b32789c2433643",
"zh:e5ad7a3c556894bd40b28a874e7d2f6924876fa75fa443136a7d6ab9a00abbaa",
"zh:edf6e7e129157bd45e3da4a330d1ace17a336d417c3b77c620f302d440c368e8",
"zh:f610bc729866d58da9cffa4deae34dbfdba96655e855a87c6bb2cb7b35a8961c",
]
}

ci/tf/main.tf Normal file

@ -0,0 +1,67 @@
terraform {
backend "http" {
}
}
provider "hcloud" {
token = var.hcloud_token
}
provider "dns" {
update {
server = var.dns_server
key_name = var.dns_tsig_key
key_algorithm = var.dns_tsig_algorithm
key_secret = var.dns_tsig_secret
}
}
resource "tls_private_key" "this" {
algorithm = "ED25519"
}
resource "hcloud_ssh_key" "this" {
name = var.name
public_key = tls_private_key.this.public_key_openssh
}
data "hcloud_image" "this" {
with_selector = "custom_image=archlinux"
most_recent = true
with_status = ["available"]
}
resource "hcloud_server" "this" {
name = var.name
image = data.hcloud_image.this.id
server_type = var.server_type
datacenter = var.datacenter
ssh_keys = [hcloud_ssh_key.this.name]
public_net {
ipv4_enabled = true
ipv6_enabled = true
}
}
resource "hcloud_rdns" "this" {
for_each = { ipv4 : hcloud_server.this.ipv4_address, ipv6 : hcloud_server.this.ipv6_address }
server_id = hcloud_server.this.id
ip_address = each.value
dns_ptr = "${var.name}.${var.dns_zone}"
}
resource "dns_a_record_set" "this" {
zone = "${var.dns_zone}."
name = var.name
addresses = [hcloud_server.this.ipv4_address]
ttl = 300
}
resource "dns_aaaa_record_set" "this" {
zone = "${var.dns_zone}."
name = var.name
addresses = [hcloud_server.this.ipv6_address]
ttl = 300
}

ci/tf/terraform.tfvars Normal file

@ -0,0 +1,4 @@
server_type = "cpx11"
datacenter = "fsn1-dc14"
dns_server = "redirect.archlinux.org"
dns_zone = "sandbox.archlinux.page"

ci/tf/variables.tf Normal file

@ -0,0 +1,36 @@
variable "hcloud_token" {
type = string
sensitive = true
}
variable "dns_server" {
type = string
}
variable "dns_tsig_key" {
type = string
}
variable "dns_tsig_algorithm" {
type = string
}
variable "dns_tsig_secret" {
type = string
}
variable "dns_zone" {
type = string
}
variable "name" {
type = string
}
variable "server_type" {
type = string
}
variable "datacenter" {
type = string
}

ci/tf/versions.tf Normal file

@ -0,0 +1,13 @@
terraform {
required_providers {
tls = {
source = "hashicorp/tls"
}
hcloud = {
source = "hetznercloud/hcloud"
}
dns = {
source = "hashicorp/dns"
}
}
}


@ -47,6 +47,6 @@ commit_parsers = [
# filter out the commits that are not matched by commit parsers # filter out the commits that are not matched by commit parsers
filter_commits = false filter_commits = false
# glob pattern for matching git tags # glob pattern for matching git tags
tag_pattern = "*[0-9]*" tag_pattern = "v[0-9]."
# regex for skipping tags # regex for skipping tags
skip_tags = "v0.1.0-beta.1" skip_tags = "v0.1.0-beta.1"


@ -49,6 +49,8 @@ salt_rounds = 12
redis_address = redis://localhost redis_address = redis://localhost
; Toggles traceback display in templates/errors/500.html. ; Toggles traceback display in templates/errors/500.html.
traceback = 0 traceback = 0
; Maximum number of characters for a comment
max_chars_comment = 5000
[ratelimit] [ratelimit]
request_limit = 4000 request_limit = 4000
@ -158,10 +160,23 @@ commit_url = https://gitlab.archlinux.org/archlinux/aurweb/-/commits/%s
; sed -r "s/^;?(commit_hash) =.*$/\1 = $(git rev-parse HEAD)/" config ; sed -r "s/^;?(commit_hash) =.*$/\1 = $(git rev-parse HEAD)/" config
;commit_hash = 1234567 ;commit_hash = 1234567
[tuvotereminder] [votereminder]
; Offsets used to determine when TUs should be reminded about ; Offsets used to determine when Package Maintainers should be reminded about
; votes that they should make. ; votes that they should make.
; Reminders will be sent out for all votes that a TU has not yet ; Reminders will be sent out for all votes that a Package Maintainer has not yet
; voted on based on `now + range_start <= End <= now + range_end`. ; voted on based on `now + range_start <= End <= now + range_end`.
range_start = 500 range_start = 500
range_end = 172800 range_end = 172800
[cache]
; maximum number of keys/entries (for search results) in our redis cache, default is 50000
max_search_entries = 50000
; number of seconds after a cache entry for search queries expires, default is 10 minutes
expiry_time_search = 600
; number of seconds after a cache entry for statistics queries expires, default is 5 minutes
expiry_time_statistics = 300
; number of seconds after a cache entry for rss queries expires, default is 5 minutes
expiry_time_rss = 300
[tracing]
otlp_endpoint = http://localhost:4318/v1/traces
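As a rough illustration (not aurweb's own config loader), the new keys shown above can be read with the standard library's configparser. The placement of `max_chars_comment` under `[options]` is inferred from the surrounding `salt_rounds`/`traceback` context and is an assumption here:

```python
# Sketch: reading the newly added options, cache and tracing keys with
# configparser. Section name [options] for max_chars_comment is assumed.
from configparser import ConfigParser
from io import StringIO

sample = """
[options]
max_chars_comment = 5000

[cache]
max_search_entries = 50000
expiry_time_search = 600
expiry_time_statistics = 300
expiry_time_rss = 300

[tracing]
otlp_endpoint = http://localhost:4318/v1/traces
"""

cfg = ConfigParser()
cfg.read_file(StringIO(sample))

max_chars = cfg.getint("options", "max_chars_comment", fallback=5000)
search_ttl = cfg.getint("cache", "expiry_time_search", fallback=600)
otlp_endpoint = cfg.get("tracing", "otlp_endpoint")
print(max_chars, search_ttl, otlp_endpoint)
```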


@ -73,3 +73,6 @@ pkgnames-repo = pkgnames.git
[aurblup] [aurblup]
db-path = YOUR_AUR_ROOT/aurblup/ db-path = YOUR_AUR_ROOT/aurblup/
[tracing]
otlp_endpoint = http://tempo:4318/v1/traces
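A generic OpenTelemetry setup that ships spans to such an OTLP/HTTP endpoint looks roughly like the sketch below; this is not necessarily how aurweb wires its instrumentation, and it assumes the opentelemetry-sdk and opentelemetry-exporter-otlp-proto-http packages are installed:

```python
# Sketch: exporting spans to the OTLP/HTTP endpoint configured above
# (http://tempo:4318/v1/traces in the dev compose setup).
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

provider = TracerProvider(resource=Resource.create({"service.name": "aurweb-dev"}))
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://tempo:4318/v1/traces"))
)
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("example-span"):
    pass  # spans created here are batched and sent to the collector
```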


@ -35,7 +35,7 @@ usually points to the git-serve program.
If SSH has been configured to pass on the AUR_OVERWRITE environment variable If SSH has been configured to pass on the AUR_OVERWRITE environment variable
(via SendEnv, see ssh_config(5) for details) and the user's account is a (via SendEnv, see ssh_config(5) for details) and the user's account is a
registered Trusted User or Developer, this will be passed on to the git-update registered Package Maintainer or Developer, this will be passed on to the git-update
program in order to enable a non-fast-forward push. program in order to enable a non-fast-forward push.
The INSTALL file in the top-level directory contains detailed instructions on The INSTALL file in the top-level directory contains detailed instructions on
@ -70,8 +70,8 @@ The Update Hook: git-update
The Git update hook, called git-update, performs several subtasks: The Git update hook, called git-update, performs several subtasks:
* Prevent from creating branches or tags other than master. * Prevent from creating branches or tags other than master.
* Deny non-fast-forwards, except for Trusted Users and Developers. * Deny non-fast-forwards, except for Package Maintainers and Developers.
* Deny blacklisted packages, except for Trusted Users and Developers. * Deny blacklisted packages, except for Package Maintainers and Developers.
* Verify each new commit (validate meta data, impose file size limits, ...) * Verify each new commit (validate meta data, impose file size limits, ...)
* Update package base information and package information in the database. * Update package base information and package information in the database.
* Update the named branch and the namespaced HEAD ref of the package. * Update the named branch and the namespaced HEAD ref of the package.
@ -109,7 +109,7 @@ is also recommended to disable automatic garbage collection by setting
receive.autogc to false. Remember to periodically run `git gc` manually or receive.autogc to false. Remember to periodically run `git gc` manually or
setup a maintenance script which initiates the garbage collection if you follow setup a maintenance script which initiates the garbage collection if you follow
this advice. For gc.pruneExpire, we recommend "3.months.ago", such that commits this advice. For gc.pruneExpire, we recommend "3.months.ago", such that commits
that became unreachable by TU intervention are kept for a while. that became unreachable by Package Maintainer intervention are kept for a while.
Script Wrappers (poetry) Script Wrappers (poetry)
------------------------ ------------------------


@ -3,9 +3,9 @@ aurweb Translation
This document describes how to create and maintain aurweb translations. This document describes how to create and maintain aurweb translations.
Creating an aurweb translation requires a Transifex (http://www.transifex.com/) Creating an aurweb translation requires a Transifex (https://app.transifex.com/)
account. You will need to register with a translation team on the aurweb account. You will need to register with a translation team on the aurweb
project page (http://www.transifex.com/projects/p/aurweb/). project page (https://app.transifex.com/lfleischer/aurweb/).
Creating a New Translation Creating a New Translation
@ -21,23 +21,23 @@ strings for the translation to be usable, and it may have to be disabled.
1. Check out the aurweb source using git: 1. Check out the aurweb source using git:
$ git clone https://gitlab.archlinux.org/archlinux/aurweb.git aurweb-git $ git clone https://gitlab.archlinux.org/archlinux/aurweb.git aurweb-git
2. Go into the "po/" directory in the aurweb source and run msginit(1) to 2. Go into the "po/" directory in the aurweb source and run [msginit(1)][msginit] to
create an initial translation file from our translation catalog: create an initial translation file from our translation catalog:
$ cd aurweb-git $ cd aurweb-git
$ git checkout master $ git checkout master
$ git pull $ git pull
$ cd po $ cd po
$ msginit -l <locale> -o <locale>.po -i aurweb.pot $ msginit -l <locale> -o <locale>.po -i aurweb.pot
3. Use some editor or a translation helper like poedit to add translations: 3. Use some editor or a translation helper like poedit to add translations:
$ poedit <locale>.po $ poedit <locale>.po
5. If you have a working aurweb setup, add a line for the new translation in 5. If you have a working aurweb setup, add a line for the new translation in
"web/lib/config.inc.php.proto" and test if everything looks right. "po/Makefile" and test if everything looks right.
6. Upload the newly created ".po" file to Transifex. If you don't like the web 6. Upload the newly created ".po" file to Transifex. If you don't like the web
interface, you can also use transifex-client to do that (see below). interface, you can also use transifex-client to do that (see below).
@ -49,13 +49,15 @@ Updating an Existing Translation
1. Download current translation files from Transifex. You can also do this 1. Download current translation files from Transifex. You can also do this
using transifex-client which is available through the AUR: using transifex-client which is available through the AUR:
$ tx pull -a $ tx pull -a
2. Update the existing translation file using an editor or a tool like poedit: 2. Update the existing translation file using an editor or a tool like poedit:
$ poedit po/<locale>.po $ poedit po/<locale>.po
3. Push the updated translation file back to Transifex. Using transifex-client, 3. Push the updated translation file back to Transifex. Using transifex-client,
this works as follows: this works as follows:
$ tx push -r aurweb.aurwebpot -t -l <locale> $ tx push -r aurweb.aurwebpot -t -l <locale>
[msginit]: https://man.archlinux.org/man/msginit.1


@ -12,8 +12,8 @@ package maintenance from the command-line. More details can be found in
The web interface can be used to browse packages, view package details, manage The web interface can be used to browse packages, view package details, manage
aurweb accounts, add comments, vote for packages, flag packages, and submit aurweb accounts, add comments, vote for packages, flag packages, and submit
requests. Trusted Users can update package maintainers and delete/merge requests. Package Maintainers can update package maintainers and delete/merge
packages. The web interface also includes an area for Trusted Users to post packages. The web interface also includes an area for Package Maintainers to post
AUR-related proposals and vote on them. AUR-related proposals and vote on them.
The RPC interface can be used to query package information via HTTP. The RPC interface can be used to query package information via HTTP.
@ -62,8 +62,8 @@ computations and clean up the database:
the official repositories. It is also used to prevent users from uploading the official repositories. It is also used to prevent users from uploading
packages that are in the official repositories already. packages that are in the official repositories already.
* aurweb-tuvotereminder sends out reminders to TUs if the voting period for a * aurweb-votereminder sends out reminders if the voting period for a
TU proposal ends soon. Package Maintainer proposal ends soon.
* aurweb-popupdate is used to recompute the popularity score of packages. * aurweb-popupdate is used to recompute the popularity score of packages.
@ -107,13 +107,13 @@ usually scheduled using Cron. The current setup is:
2 */2 * * * poetry run aurweb-aurblup 2 */2 * * * poetry run aurweb-aurblup
3 */2 * * * poetry run aurweb-pkgmaint 3 */2 * * * poetry run aurweb-pkgmaint
4 */2 * * * poetry run aurweb-usermaint 4 */2 * * * poetry run aurweb-usermaint
5 */12 * * * poetry run aurweb-tuvotereminder 5 */12 * * * poetry run aurweb-votereminder
---- ----
Advanced Administrative Features Advanced Administrative Features
-------------------------------- --------------------------------
Trusted Users can set the AUR_OVERWRITE environment variable to enable Package Maintainers can set the AUR_OVERWRITE environment variable to enable
non-fast-forward pushes to the Git repositories. This feature is documented in non-fast-forward pushes to the Git repositories. This feature is documented in
`doc/git-interface.txt`. `doc/git-interface.txt`.


@ -1,5 +1,4 @@
version: "3.8" ---
services: services:
ca: ca:
volumes: volumes:


@ -1,16 +1,10 @@
version: "3.8" ---
services: services:
ca: ca:
volumes: volumes:
- ./data:/data - ./data:/data
- step:/root/.step - step:/root/.step
mariadb_init:
depends_on:
mariadb:
condition: service_healthy
git: git:
volumes: volumes:
- git_data:/aurweb/aur.git - git_data:/aurweb/aur.git
@ -21,9 +15,6 @@ services:
- git_data:/aurweb/aur.git - git_data:/aurweb/aur.git
- ./data:/data - ./data:/data
- smartgit_run:/var/run/smartgit - smartgit_run:/var/run/smartgit
depends_on:
mariadb:
condition: service_healthy
fastapi: fastapi:
volumes: volumes:


@ -1,3 +1,4 @@
---
# #
# Docker service definitions for the aurweb project. # Docker service definitions for the aurweb project.
# #
@ -16,8 +17,6 @@
# #
# Copyright (C) 2021 aurweb Development # Copyright (C) 2021 aurweb Development
# All Rights Reserved. # All Rights Reserved.
version: "3.8"
services: services:
aurweb-image: aurweb-image:
build: . build: .
@ -49,7 +48,7 @@ services:
image: aurweb:latest image: aurweb:latest
init: true init: true
entrypoint: /docker/mariadb-entrypoint.sh entrypoint: /docker/mariadb-entrypoint.sh
command: /usr/bin/mysqld_safe --datadir=/var/lib/mysql command: /usr/bin/mariadbd-safe --datadir=/var/lib/mysql
ports: ports:
# This will expose mariadbd on 127.0.0.1:13306 in the host. # This will expose mariadbd on 127.0.0.1:13306 in the host.
# Ex: `mysql -uaur -paur -h 127.0.0.1 -P 13306 aurweb` # Ex: `mysql -uaur -paur -h 127.0.0.1 -P 13306 aurweb`
@ -81,7 +80,7 @@ services:
environment: environment:
- MARIADB_PRIVILEGED=1 - MARIADB_PRIVILEGED=1
entrypoint: /docker/mariadb-entrypoint.sh entrypoint: /docker/mariadb-entrypoint.sh
command: /usr/bin/mysqld_safe --datadir=/var/lib/mysql command: /usr/bin/mariadbd-safe --datadir=/var/lib/mysql
ports: ports:
# This will expose mariadbd on 127.0.0.1:13307 in the host. # This will expose mariadbd on 127.0.0.1:13307 in the host.
# Ex: `mysql -uaur -paur -h 127.0.0.1 -P 13306 aurweb` # Ex: `mysql -uaur -paur -h 127.0.0.1 -P 13306 aurweb`
@ -107,8 +106,10 @@ services:
test: "bash /docker/health/sshd.sh" test: "bash /docker/health/sshd.sh"
interval: 3s interval: 3s
depends_on: depends_on:
mariadb:
condition: service_healthy
mariadb_init: mariadb_init:
condition: service_started condition: service_completed_successfully
volumes: volumes:
- mariadb_run:/var/run/mysqld - mariadb_run:/var/run/mysqld
@ -122,6 +123,9 @@ services:
healthcheck: healthcheck:
test: "bash /docker/health/smartgit.sh" test: "bash /docker/health/smartgit.sh"
interval: 3s interval: 3s
depends_on:
mariadb:
condition: service_healthy
cgit-fastapi: cgit-fastapi:
image: aurweb:latest image: aurweb:latest
@ -152,8 +156,10 @@ services:
entrypoint: /docker/cron-entrypoint.sh entrypoint: /docker/cron-entrypoint.sh
command: /docker/scripts/run-cron.sh command: /docker/scripts/run-cron.sh
depends_on: depends_on:
mariadb:
condition: service_healthy
mariadb_init: mariadb_init:
condition: service_started condition: service_completed_successfully
volumes: volumes:
- ./aurweb:/aurweb/aurweb - ./aurweb:/aurweb/aurweb
- mariadb_run:/var/run/mysqld - mariadb_run:/var/run/mysqld
@ -182,6 +188,12 @@ services:
condition: service_healthy condition: service_healthy
cron: cron:
condition: service_started condition: service_started
mariadb:
condition: service_healthy
mariadb_init:
condition: service_completed_successfully
tempo:
condition: service_healthy
volumes: volumes:
- archives:/var/lib/aurweb/archives - archives:/var/lib/aurweb/archives
- mariadb_run:/var/run/mysqld - mariadb_run:/var/run/mysqld
@ -281,6 +293,56 @@ services:
- ./test:/aurweb/test - ./test:/aurweb/test
- ./templates:/aurweb/templates - ./templates:/aurweb/templates
grafana:
# TODO: check if we need init: true
image: grafana/grafana:11.1.3
environment:
- GF_AUTH_ANONYMOUS_ENABLED=true
- GF_AUTH_ANONYMOUS_ORG_ROLE=Admin
- GF_AUTH_DISABLE_LOGIN_FORM=true
- GF_LOG_LEVEL=warn
# check if depends are correct; does stopping or restarting a child exit grafana?
depends_on:
prometheus:
condition: service_healthy
tempo:
condition: service_healthy
ports:
- "127.0.0.1:3000:3000"
volumes:
- ./docker/config/grafana/datasources:/etc/grafana/provisioning/datasources
prometheus:
image: prom/prometheus:latest
command:
- --config.file=/etc/prometheus/prometheus.yml
- --web.enable-remote-write-receiver
- --web.listen-address=prometheus:9090
healthcheck:
# TODO: check if there is a status route
test: "sh /docker/health/prometheus.sh"
interval: 3s
ports:
- "127.0.0.1:9090:9090"
volumes:
- ./docker/config/prometheus.yml:/etc/prometheus/prometheus.yml
- ./docker/health/prometheus.sh:/docker/health/prometheus.sh
tempo:
image: grafana/tempo:2.5.0
command:
- -config.file=/etc/tempo/config.yml
healthcheck:
# TODO: check if there is a status route
test: "sh /docker/health/tempo.sh"
interval: 3s
ports:
- "127.0.0.1:3200:3200"
- "127.0.0.1:4318:4318"
volumes:
- ./docker/config/tempo.yml:/etc/tempo/config.yml
- ./docker/health/tempo.sh:/docker/health/tempo.sh
volumes: volumes:
mariadb_test_run: {} mariadb_test_run: {}
mariadb_run: {} # Share /var/run/mysqld/mysqld.sock mariadb_run: {} # Share /var/run/mysqld/mysqld.sock


@ -47,7 +47,7 @@ Luckily such data can be generated.
docker compose exec fastapi /bin/bash docker compose exec fastapi /bin/bash
pacman -S words fortune-mod pacman -S words fortune-mod
./schema/gendummydata.py dummy.sql ./schema/gendummydata.py dummy.sql
mysql aurweb < dummy.sql mariadb aurweb < dummy.sql
``` ```
The generation script may prompt you to install other Arch packages before it The generation script may prompt you to install other Arch packages before it


@ -4,4 +4,4 @@ AUR_CONFIG='/aurweb/conf/config'
*/2 * * * * bash -c 'aurweb-pkgmaint' */2 * * * * bash -c 'aurweb-pkgmaint'
*/2 * * * * bash -c 'aurweb-usermaint' */2 * * * * bash -c 'aurweb-usermaint'
*/2 * * * * bash -c 'aurweb-popupdate' */2 * * * * bash -c 'aurweb-popupdate'
*/12 * * * * bash -c 'aurweb-tuvotereminder' */12 * * * * bash -c 'aurweb-votereminder'


@ -0,0 +1,42 @@
---
apiVersion: 1
deleteDatasources:
- name: Prometheus
- name: Tempo
datasources:
- name: Prometheus
type: prometheus
uid: prometheus
access: proxy
url: http://prometheus:9090
orgId: 1
editable: false
jsonData:
timeInterval: 1m
- name: Tempo
type: tempo
uid: tempo
access: proxy
url: http://tempo:3200
orgId: 1
editable: false
jsonData:
tracesToMetrics:
datasourceUid: 'prometheus'
spanStartTimeShift: '1h'
spanEndTimeShift: '-1h'
serviceMap:
datasourceUid: 'prometheus'
nodeGraph:
enabled: true
search:
hide: false
traceQuery:
timeShiftEnabled: true
spanStartTimeShift: '1h'
spanEndTimeShift: '-1h'
spanBar:
type: 'Tag'
tag: 'http.path'


@ -0,0 +1,15 @@
---
global:
scrape_interval: 60s
scrape_configs:
- job_name: tempo
static_configs:
- targets: ['tempo:3200']
labels:
instance: tempo
- job_name: aurweb
static_configs:
- targets: ['fastapi:8000']
labels:
instance: aurweb

docker/config/tempo.yml Normal file

@ -0,0 +1,54 @@
---
stream_over_http_enabled: true
server:
http_listen_address: tempo
http_listen_port: 3200
log_level: info
query_frontend:
search:
duration_slo: 5s
throughput_bytes_slo: 1.073741824e+09
trace_by_id:
duration_slo: 5s
distributor:
receivers:
otlp:
protocols:
http:
endpoint: tempo:4318
log_received_spans:
enabled: false
metric_received_spans:
enabled: false
ingester:
max_block_duration: 5m
compactor:
compaction:
block_retention: 1h
metrics_generator:
registry:
external_labels:
source: tempo
storage:
path: /tmp/tempo/generator/wal
remote_write:
- url: http://prometheus:9090/api/v1/write
send_exemplars: true
traces_storage:
path: /tmp/tempo/generator/traces
storage:
trace:
backend: local
wal:
path: /tmp/tempo/wal
local:
path: /tmp/tempo/blocks
overrides:
metrics_generator_processors: [service-graphs, span-metrics, local-blocks]


@ -1,2 +1,2 @@
#!/bin/bash #!/bin/bash
exec mysqladmin ping --silent exec mariadb-admin ping --silent

docker/health/prometheus.sh Executable file

@ -0,0 +1,2 @@
#!/bin/sh
exec wget -q http://prometheus:9090/status -O /dev/null

docker/health/tempo.sh Executable file

@ -0,0 +1,2 @@
#!/bin/sh
exec wget -q http://tempo:3200/status -O /dev/null


@ -6,8 +6,8 @@ MYSQL_DATA=/var/lib/mysql
mariadb-install-db --user=mysql --basedir=/usr --datadir=$MYSQL_DATA mariadb-install-db --user=mysql --basedir=/usr --datadir=$MYSQL_DATA
# Start it up. # Start it up.
mysqld_safe --datadir=$MYSQL_DATA --skip-networking & mariadbd-safe --datadir=$MYSQL_DATA --skip-networking &
while ! mysqladmin ping 2>/dev/null; do while ! mariadb-admin ping 2>/dev/null; do
sleep 1s sleep 1s
done done
@ -15,17 +15,17 @@ done
DATABASE="aurweb" # Persistent database for fastapi. DATABASE="aurweb" # Persistent database for fastapi.
echo "Taking care of primary database '${DATABASE}'..." echo "Taking care of primary database '${DATABASE}'..."
mysql -u root -e "CREATE USER IF NOT EXISTS 'aur'@'localhost' IDENTIFIED BY 'aur';" mariadb -u root -e "CREATE USER IF NOT EXISTS 'aur'@'localhost' IDENTIFIED BY 'aur';"
mysql -u root -e "CREATE USER IF NOT EXISTS 'aur'@'%' IDENTIFIED BY 'aur';" mariadb -u root -e "CREATE USER IF NOT EXISTS 'aur'@'%' IDENTIFIED BY 'aur';"
mysql -u root -e "CREATE DATABASE IF NOT EXISTS $DATABASE;" mariadb -u root -e "CREATE DATABASE IF NOT EXISTS $DATABASE;"
mysql -u root -e "CREATE USER IF NOT EXISTS 'aur'@'%' IDENTIFIED BY 'aur';" mariadb -u root -e "CREATE USER IF NOT EXISTS 'aur'@'%' IDENTIFIED BY 'aur';"
mysql -u root -e "GRANT ALL ON aurweb.* TO 'aur'@'localhost';" mariadb -u root -e "GRANT ALL ON aurweb.* TO 'aur'@'localhost';"
mysql -u root -e "GRANT ALL ON aurweb.* TO 'aur'@'%';" mariadb -u root -e "GRANT ALL ON aurweb.* TO 'aur'@'%';"
mysql -u root -e "CREATE USER IF NOT EXISTS 'root'@'%' IDENTIFIED BY 'aur';" mariadb -u root -e "CREATE USER IF NOT EXISTS 'root'@'%' IDENTIFIED BY 'aur';"
mysql -u root -e "GRANT ALL ON *.* TO 'root'@'%' WITH GRANT OPTION;" mariadb -u root -e "GRANT ALL ON *.* TO 'root'@'%' WITH GRANT OPTION;"
mysqladmin -uroot shutdown mariadb-admin -uroot shutdown
exec "$@" exec "$@"


@ -13,7 +13,7 @@ pacman -Sy --noconfirm --noprogressbar archlinux-keyring
# Install other OS dependencies. # Install other OS dependencies.
pacman -Syu --noconfirm --noprogressbar \ pacman -Syu --noconfirm --noprogressbar \
--cachedir .pkg-cache git gpgme nginx redis openssh \ git gpgme nginx redis openssh \
mariadb mariadb-libs cgit-aurweb uwsgi uwsgi-plugin-cgi \ mariadb mariadb-libs cgit-aurweb uwsgi uwsgi-plugin-cgi \
python-pip pyalpm python-srcinfo curl libeatmydata cronie \ python-pip pyalpm python-srcinfo curl libeatmydata cronie \
python-poetry python-poetry-core step-cli step-ca asciidoc \ python-poetry python-poetry-core step-cli step-ca asciidoc \


@ -1,10 +1,8 @@
#!/bin/bash #!/bin/bash
set -eou pipefail set -eou pipefail
# Upgrade PIP; Arch Linux's version of pip is outdated for Poetry.
pip install --upgrade pip
if [ ! -z "${COMPOSE+x}" ]; then if [ ! -z "${COMPOSE+x}" ]; then
export PIP_BREAK_SYSTEM_PACKAGES=1
poetry config virtualenvs.create false poetry config virtualenvs.create false
fi fi
poetry install --no-interaction --no-ansi poetry install --no-interaction --no-ansi

gunicorn.conf.py Normal file

@ -0,0 +1,7 @@
from prometheus_client import multiprocess
def child_exit(server, worker): # pragma: no cover
"""This function is required for gunicorn customization
of prometheus multiprocessing."""
multiprocess.mark_process_dead(worker.pid)
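For orientation, the scrape side of prometheus_client's multiprocess mode, which this `child_exit` hook supports, looks roughly like the sketch below. It is not aurweb's exact metrics endpoint; it assumes a shared directory referenced by PROMETHEUS_MULTIPROC_DIR (the variable name used by recent prometheus_client releases):

```python
# Companion sketch: aggregating per-worker metric files written by
# prometheus_client in multiprocess mode and rendering them for a scrape.
import os

from prometheus_client import CollectorRegistry, generate_latest, multiprocess

os.environ.setdefault("PROMETHEUS_MULTIPROC_DIR", "/tmp/prom-multiproc")
os.makedirs(os.environ["PROMETHEUS_MULTIPROC_DIR"], exist_ok=True)

registry = CollectorRegistry()
multiprocess.MultiProcessCollector(registry)  # merges the per-worker .db files

payload = generate_latest(registry)  # Prometheus text exposition format (bytes)
print(payload.decode())
```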


@ -0,0 +1,29 @@
"""add indices on PackageBases for RSS order by
Revision ID: 38e5b9982eea
Revises: 7d65d35fae45
Create Date: 2024-08-03 01:35:39.104283
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "38e5b9982eea"
down_revision = "7d65d35fae45"
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_index("BasesModifiedTS", "PackageBases", ["ModifiedTS"], unique=False)
op.create_index("BasesSubmittedTS", "PackageBases", ["SubmittedTS"], unique=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index("BasesSubmittedTS", table_name="PackageBases")
op.drop_index("BasesModifiedTS", table_name="PackageBases")
# ### end Alembic commands ###
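Illustrative only: the shape of query these two indexes are meant to serve is the "newest" / "recently updated" ordering used by the RSS feeds. The table definition below is a minimal stand-in, not aurweb's real schema:

```python
# Sketch: ORDER BY queries over SubmittedTS / ModifiedTS that the new
# BasesSubmittedTS and BasesModifiedTS indexes accelerate.
from sqlalchemy import Column, Integer, String, create_engine, select
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class PackageBases(Base):
    __tablename__ = "PackageBases"
    ID = Column(Integer, primary_key=True)
    Name = Column(String(255))
    SubmittedTS = Column(Integer, index=True)  # BasesSubmittedTS
    ModifiedTS = Column(Integer, index=True)   # BasesModifiedTS


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

newest = select(PackageBases).order_by(PackageBases.SubmittedTS.desc()).limit(100)
updated = select(PackageBases).order_by(PackageBases.ModifiedTS.desc()).limit(100)
print(newest)
print(updated)
```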


@ -5,6 +5,7 @@ Revises: ef39fcd6e1cd
Create Date: 2021-05-17 14:23:00.008479 Create Date: 2021-05-17 14:23:00.008479
""" """
from alembic import op from alembic import op
import aurweb.config import aurweb.config


@ -5,6 +5,7 @@ Revises: d64e5571bc8d
Create Date: 2022-09-22 18:08:03.280664 Create Date: 2022-09-22 18:08:03.280664
""" """
from alembic import op from alembic import op
from sqlalchemy.exc import OperationalError from sqlalchemy.exc import OperationalError


@ -0,0 +1,38 @@
"""Rename TU to Package Maintainer
Revision ID: 6a64dd126029
Revises: c5a6a9b661a0
Create Date: 2023-09-01 13:48:15.315244
"""
from aurweb import db
from aurweb.models import AccountType
# revision identifiers, used by Alembic.
revision = "6a64dd126029"
down_revision = "c5a6a9b661a0"
branch_labels = None
depends_on = None
# AccountTypes
# ID 2 -> Trusted User / Package Maintainer
# ID 4 -> Trusted User & Developer / Package Maintainer & Developer
def upgrade():
with db.begin():
tu = db.query(AccountType).filter(AccountType.ID == 2).first()
tudev = db.query(AccountType).filter(AccountType.ID == 4).first()
tu.AccountType = "Package Maintainer"
tudev.AccountType = "Package Maintainer & Developer"
def downgrade():
with db.begin():
pm = db.query(AccountType).filter(AccountType.ID == 2).first()
pmdev = db.query(AccountType).filter(AccountType.ID == 4).first()
pm.AccountType = "Trusted User"
pmdev.AccountType = "Trusted User & Developer"


@ -0,0 +1,48 @@
"""Rename TU tables/columns
Revision ID: 7d65d35fae45
Revises: 6a64dd126029
Create Date: 2023-09-10 10:21:33.092342
"""
from alembic import op
from sqlalchemy.dialects.mysql import INTEGER
# revision identifiers, used by Alembic.
revision = "7d65d35fae45"
down_revision = "6a64dd126029"
branch_labels = None
depends_on = None
# TU_VoteInfo -> VoteInfo
# TU_VoteInfo.ActiveTUs -> VoteInfo.ActiveUsers
# TU_Votes -> Votes
def upgrade():
# Tables
op.rename_table("TU_VoteInfo", "VoteInfo")
op.rename_table("TU_Votes", "Votes")
# Columns
op.alter_column(
"VoteInfo",
"ActiveTUs",
existing_type=INTEGER(unsigned=True),
new_column_name="ActiveUsers",
)
def downgrade():
# Tables
op.rename_table("VoteInfo", "TU_VoteInfo")
op.rename_table("Votes", "TU_Votes")
# Columns
op.alter_column(
"TU_VoteInfo",
"ActiveUsers",
existing_type=INTEGER(unsigned=True),
new_column_name="ActiveTUs",
)
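These revisions are normally applied with `alembic upgrade head`; the sketch below shows the equivalent via Alembic's Python API, assuming an alembic.ini and migration environment like the repository's:

```python
# Sketch: applying / rolling back the rename revisions programmatically.
from alembic import command
from alembic.config import Config

cfg = Config("alembic.ini")

command.upgrade(cfg, "7d65d35fae45")    # rename TU tables/columns (this revision)
command.downgrade(cfg, "6a64dd126029")  # step back to the AccountType rename
```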


@ -5,6 +5,7 @@ Revises: 6441d3b65270
Create Date: 2022-10-17 11:11:46.203322 Create Date: 2022-10-17 11:11:46.203322
""" """
from alembic import op from alembic import op
# revision identifiers, used by Alembic. # revision identifiers, used by Alembic.


@ -15,6 +15,7 @@ Revision ID: be7adae47ac3
Revises: 56e2ce8e2ffa Revises: 56e2ce8e2ffa
Create Date: 2022-01-06 14:37:07.899778 Create Date: 2022-01-06 14:37:07.899778
""" """
from alembic import op from alembic import op
from sqlalchemy.dialects.mysql import INTEGER, TINYINT from sqlalchemy.dialects.mysql import INTEGER, TINYINT


@ -0,0 +1,25 @@
"""Add index on PackageBases.Popularity and .Name
Revision ID: c5a6a9b661a0
Revises: e4e49ffce091
Create Date: 2023-07-02 13:46:52.522146
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "c5a6a9b661a0"
down_revision = "e4e49ffce091"
branch_labels = None
depends_on = None
def upgrade():
op.create_index(
"BasesPopularityName", "PackageBases", ["Popularity", "Name"], unique=False
)
def downgrade():
op.drop_index("BasesPopularityName", table_name="PackageBases")


@ -5,6 +5,7 @@ Revises: be7adae47ac3
Create Date: 2022-02-18 12:47:05.322766 Create Date: 2022-02-18 12:47:05.322766
""" """
from datetime import datetime from datetime import datetime
import sqlalchemy as sa import sqlalchemy as sa


@ -5,6 +5,7 @@ Revises: 9e3158957fd7
Create Date: 2023-04-19 23:24:25.854874 Create Date: 2023-04-19 23:24:25.854874
""" """
from alembic import op from alembic import op
from sqlalchemy.exc import OperationalError from sqlalchemy.exc import OperationalError


@ -5,6 +5,7 @@ Revises: f47cad5d6d03
Create Date: 2020-06-08 10:04:13.898617 Create Date: 2020-06-08 10:04:13.898617
""" """
import sqlalchemy as sa import sqlalchemy as sa
from alembic import op from alembic import op
from sqlalchemy.engine.reflection import Inspector from sqlalchemy.engine.reflection import Inspector


@ -4,6 +4,7 @@ Revision ID: f47cad5d6d03
Create Date: 2020-02-23 13:23:32.331396 Create Date: 2020-02-23 13:23:32.331396
""" """
# revision identifiers, used by Alembic. # revision identifiers, used by Alembic.
revision = "f47cad5d6d03" revision = "f47cad5d6d03"
down_revision = None down_revision = None

po/ar.po

@ -12,7 +12,7 @@ msgstr ""
"POT-Creation-Date: 2020-01-31 09:29+0100\n" "POT-Creation-Date: 2020-01-31 09:29+0100\n"
"PO-Revision-Date: 2011-04-10 13:21+0000\n" "PO-Revision-Date: 2011-04-10 13:21+0000\n"
"Last-Translator: صفا الفليج <safaalfulaij@hotmail.com>, 2015-2016\n" "Last-Translator: صفا الفليج <safaalfulaij@hotmail.com>, 2015-2016\n"
"Language-Team: Arabic (http://www.transifex.com/lfleischer/aurweb/language/ar/)\n" "Language-Team: Arabic (http://app.transifex.com/lfleischer/aurweb/language/ar/)\n"
"MIME-Version: 1.0\n" "MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n" "Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n" "Content-Transfer-Encoding: 8bit\n"
@ -133,15 +133,15 @@ msgid "Type"
msgstr "النّوع" msgstr "النّوع"
#: html/addvote.php #: html/addvote.php
msgid "Addition of a TU" msgid "Addition of a Package Maintainer"
msgstr "إضافة م‌م" msgstr ""
#: html/addvote.php #: html/addvote.php
msgid "Removal of a TU" msgid "Removal of a Package Maintainer"
msgstr "إزالة م‌م" msgstr ""
#: html/addvote.php #: html/addvote.php
msgid "Removal of a TU (undeclared inactivity)" msgid "Removal of a Package Maintainer (undeclared inactivity)"
msgstr "" msgstr ""
#: html/addvote.php #: html/addvote.php
@ -199,9 +199,10 @@ msgstr ""
#: html/home.php #: html/home.php
#, php-format #, php-format
msgid "" msgid ""
"Welcome to the AUR! Please read the %sAUR User Guidelines%s and %sAUR TU " "Welcome to the AUR! Please read the %sAUR User Guidelines%s for more "
"Guidelines%s for more information." "information and the %sAUR Submission Guidelines%s if you want to contribute "
msgstr "مرحبًا بك في م‌م‌آ! فضلًا اقرأ %sإرشادات مستخدمي م‌م‌آ%s و%sإرشادات مستخدمي م‌م‌آ الموثوقين (م‌م)%s لمعلومات أكثر." "a PKGBUILD."
msgstr ""
#: html/home.php #: html/home.php
#, php-format #, php-format
@ -215,8 +216,8 @@ msgid "Remember to vote for your favourite packages!"
msgstr "تذكّر أن تصوّت لحزمك المفضّلة!" msgstr "تذكّر أن تصوّت لحزمك المفضّلة!"
#: html/home.php #: html/home.php
msgid "Some packages may be provided as binaries in [community]." msgid "Some packages may be provided as binaries in [extra]."
msgstr "قد تكون بعض الحزم متوفّرة كثنائيّات في مستودع المجتمع [community]." msgstr "قد تكون بعض الحزم متوفّرة كثنائيّات في مستودع المجتمع [extra]."
#: html/home.php #: html/home.php
msgid "DISCLAIMER" msgid "DISCLAIMER"
@ -265,8 +266,8 @@ msgstr "طلب الحذف"
msgid "" msgid ""
"Request a package to be removed from the Arch User Repository. Please do not" "Request a package to be removed from the Arch User Repository. Please do not"
" use this if a package is broken and can be fixed easily. Instead, contact " " use this if a package is broken and can be fixed easily. Instead, contact "
"the package maintainer and file orphan request if necessary." "the maintainer and file orphan request if necessary."
msgstr "اطلب أن تُزال الحزمة من مستودع مستخدمي آرتش. فضلًا لا تستخدم هذه إن كانت الحزمة معطوبة ويمكن إصلاحها بسهولة. بدل ذلك تواصل مع مصين الحزمة وأبلغ عن طلب \"يتيمة\" إن تطلّب الأمر." msgstr ""
#: html/home.php #: html/home.php
msgid "Merge Request" msgid "Merge Request"
@ -308,10 +309,11 @@ msgstr "النّقاش"
#: html/home.php #: html/home.php
#, php-format #, php-format
msgid "" msgid ""
"General discussion regarding the Arch User Repository (AUR) and Trusted User" "General discussion regarding the Arch User Repository (AUR) and Package "
" structure takes place on %saur-general%s. For discussion relating to the " "Maintainer structure takes place on %saur-general%s. For discussion relating"
"development of the AUR web interface, use the %saur-dev%s mailing list." " to the development of the AUR web interface, use the %saur-dev%s mailing "
msgstr "النّقاشات العاّمة حول مستودع مستخدمي آرتش (م‌م‌آ) وبنية المستخدمين الموثوقين تكون في %saur-general%s. للنّقاشات المتعلّقة بتطوير واجهة وِبّ م‌م‌آ، استخدم قائمة %saur-dev%s البريديّة." "list."
msgstr ""
#: html/home.php #: html/home.php
msgid "Bug Reporting" msgid "Bug Reporting"
@ -322,9 +324,9 @@ msgstr "الإبلاغ عن العلل"
msgid "" msgid ""
"If you find a bug in the AUR web interface, please fill out a bug report on " "If you find a bug in the AUR web interface, please fill out a bug report on "
"our %sbug tracker%s. Use the tracker to report bugs in the AUR web interface" "our %sbug tracker%s. Use the tracker to report bugs in the AUR web interface"
" %sonly%s. To report packaging bugs contact the package maintainer or leave " " %sonly%s. To report packaging bugs contact the maintainer or leave a "
"a comment on the appropriate package page." "comment on the appropriate package page."
msgstr "إن وجدت علّة في واجهة وِبّ م‌م‌آ، فضلًا املأ تقريرًا بها في %sمتعقّب العلل%s. استخدم المتعقّب للإبلاغ عن العلل في واجهة وِبّ م‌م‌آ %sفقط%s. للإبلاغ عن علل الحزم راسل مديرها أو اترك تعليقًا في صفحة الحزمة المناسبة." msgstr ""
#: html/home.php #: html/home.php
msgid "Package Search" msgid "Package Search"
@ -524,8 +526,8 @@ msgid "Delete"
msgstr "احذف" msgstr "احذف"
#: html/pkgdel.php #: html/pkgdel.php
msgid "Only Trusted Users and Developers can delete packages." msgid "Only Package Maintainers and Developers can delete packages."
msgstr "يمكن فقط للمستخدمين الموثوقين والمطوّرين حذف الحزم." msgstr ""
#: html/pkgdisown.php template/pkgbase_actions.php #: html/pkgdisown.php template/pkgbase_actions.php
msgid "Disown Package" msgid "Disown Package"
@ -565,8 +567,8 @@ msgid "Disown"
msgstr "تنازل" msgstr "تنازل"
#: html/pkgdisown.php #: html/pkgdisown.php
msgid "Only Trusted Users and Developers can disown packages." msgid "Only Package Maintainers and Developers can disown packages."
msgstr "يمكن فقط للمستخدمين الموثوقين والمطوّرين التّنازل عن الحزم." msgstr ""
#: html/pkgflagcomment.php #: html/pkgflagcomment.php
msgid "Flag Comment" msgid "Flag Comment"
@ -655,8 +657,8 @@ msgid "Merge"
msgstr "دمج" msgstr "دمج"
#: html/pkgmerge.php #: html/pkgmerge.php
msgid "Only Trusted Users and Developers can merge packages." msgid "Only Package Maintainers and Developers can merge packages."
msgstr "يمكن فقط للمستخدمين الموثوقين والمطوّرين دمج الحزم." msgstr ""
#: html/pkgreq.php template/pkgbase_actions.php template/pkgreq_form.php #: html/pkgreq.php template/pkgbase_actions.php template/pkgreq_form.php
msgid "Submit Request" msgid "Submit Request"
@ -713,8 +715,8 @@ msgid "I accept the terms and conditions above."
msgstr "" msgstr ""
#: html/tu.php template/account_details.php template/header.php #: html/tu.php template/account_details.php template/header.php
msgid "Trusted User" msgid "Package Maintainer"
msgstr "مستخدم موثوق" msgstr ""
#: html/tu.php #: html/tu.php
msgid "Could not retrieve proposal details." msgid "Could not retrieve proposal details."
@ -725,8 +727,8 @@ msgid "Voting is closed for this proposal."
msgstr "أُغلق التّصويت على هذا الرّأي." msgstr "أُغلق التّصويت على هذا الرّأي."
#: html/tu.php #: html/tu.php
msgid "Only Trusted Users are allowed to vote." msgid "Only Package Maintainers are allowed to vote."
msgstr "فقط المستخدمين الموثوقين مسموح لهم بالتّصويت." msgstr ""
#: html/tu.php #: html/tu.php
msgid "You cannot vote in an proposal about you." msgid "You cannot vote in an proposal about you."
@ -1221,8 +1223,8 @@ msgstr "مطوّر"
#: template/account_details.php template/account_edit_form.php #: template/account_details.php template/account_edit_form.php
#: template/search_accounts_form.php #: template/search_accounts_form.php
msgid "Trusted User & Developer" msgid "Package Maintainer & Developer"
msgstr "مستخدم موثوق ومطوّر" msgstr ""
#: template/account_details.php template/account_edit_form.php #: template/account_details.php template/account_edit_form.php
#: template/search_accounts_form.php #: template/search_accounts_form.php
@ -1324,10 +1326,6 @@ msgstr ""
msgid "Normal user" msgid "Normal user"
msgstr "مستخدم عاديّ" msgstr "مستخدم عاديّ"
#: template/account_edit_form.php template/search_accounts_form.php
msgid "Trusted user"
msgstr "مستخدم موثوق"
#: template/account_edit_form.php template/search_accounts_form.php #: template/account_edit_form.php template/search_accounts_form.php
msgid "Account Suspended" msgid "Account Suspended"
msgstr "حساب معلّق" msgstr "حساب معلّق"
@ -1400,6 +1398,15 @@ msgid ""
" the Arch User Repository." " the Arch User Repository."
msgstr "المعلومات الآتية مطلوبة فقط إن أردت تقديم حزم إلى مستودع مستخدمي آرتش." msgstr "المعلومات الآتية مطلوبة فقط إن أردت تقديم حزم إلى مستودع مستخدمي آرتش."
#: templates/partials/account_form.html
msgid ""
"Specify multiple SSH Keys separated by new line, empty lines are ignored."
msgstr ""
#: templates/partials/account_form.html
msgid "Hide deleted comments"
msgstr ""
#: template/account_edit_form.php #: template/account_edit_form.php
msgid "SSH Public Key" msgid "SSH Public Key"
msgstr "مفتاح SSH العموميّ" msgstr "مفتاح SSH العموميّ"
@ -1827,22 +1834,22 @@ msgstr "ادمج مع"
#: template/pkgreq_form.php #: template/pkgreq_form.php
msgid "" msgid ""
"By submitting a deletion request, you ask a Trusted User to delete the " "By submitting a deletion request, you ask a Package Maintainer to delete the"
"package base. This type of request should be used for duplicates, software " " package base. This type of request should be used for duplicates, software "
"abandoned by upstream, as well as illegal and irreparably broken packages." "abandoned by upstream, as well as illegal and irreparably broken packages."
msgstr "" msgstr ""
#: template/pkgreq_form.php #: template/pkgreq_form.php
msgid "" msgid ""
"By submitting a merge request, you ask a Trusted User to delete the package " "By submitting a merge request, you ask a Package Maintainer to delete the "
"base and transfer its votes and comments to another package base. Merging a " "package base and transfer its votes and comments to another package base. "
"package does not affect the corresponding Git repositories. Make sure you " "Merging a package does not affect the corresponding Git repositories. Make "
"update the Git history of the target package yourself." "sure you update the Git history of the target package yourself."
msgstr "" msgstr ""
#: template/pkgreq_form.php #: template/pkgreq_form.php
msgid "" msgid ""
"By submitting an orphan request, you ask a Trusted User to disown the " "By submitting an orphan request, you ask a Package Maintainer to disown the "
"package base. Please only do this if the package needs maintainer action, " "package base. Please only do this if the package needs maintainer action, "
"the maintainer is MIA and you already tried to contact the maintainer " "the maintainer is MIA and you already tried to contact the maintainer "
"previously." "previously."
@ -2115,8 +2122,8 @@ msgid "Registered Users"
msgstr "المستخدمون المسجّلون" msgstr "المستخدمون المسجّلون"
#: template/stats/general_stats_table.php #: template/stats/general_stats_table.php
msgid "Trusted Users" msgid "Package Maintainers"
msgstr "المستخدمون الموثوقون" msgstr ""
#: template/stats/updates_table.php #: template/stats/updates_table.php
msgid "Recent Updates" msgid "Recent Updates"
@ -2301,7 +2308,7 @@ msgstr ""
#: scripts/notify.py #: scripts/notify.py
#, python-brace-format #, python-brace-format
msgid "TU Vote Reminder: Proposal {id}" msgid "Package Maintainer Vote Reminder: Proposal {id}"
msgstr "" msgstr ""
#: scripts/notify.py #: scripts/notify.py
@ -2355,3 +2362,35 @@ msgid ""
"This action will close any pending package requests related to it. If " "This action will close any pending package requests related to it. If "
"%sComments%s are omitted, a closure comment will be autogenerated." "%sComments%s are omitted, a closure comment will be autogenerated."
msgstr "" msgstr ""
#: templates/partials/tu/proposal/details.html
msgid "assigned"
msgstr ""
#: templates/partials/packages/package_metadata.html
msgid "Show %d more"
msgstr ""
#: templates/partials/packages/package_metadata.html
msgid "dependencies"
msgstr ""
#: aurweb/routers/accounts.py
msgid "The account has not been deleted, check the confirmation checkbox."
msgstr ""
#: templates/partials/packages/comment_form.html
msgid "Cancel"
msgstr ""
#: templates/requests.html
msgid "Package name"
msgstr ""
#: templates/partials/account_form.html
msgid ""
"Note that if you hide your email address, it'll end up on the BCC list for "
"any request notifications. In case someone replies to these notifications, "
"you won't receive an email. However, replies are typically sent to the "
"mailing-list and would then be visible in the archive."
msgstr ""

po/ast.po

@ -1,19 +1,21 @@
# SOME DESCRIPTIVE TITLE. # SOME DESCRIPTIVE TITLE.
# Copyright (C) YEAR THE PACKAGE'S COPYRIGHT HOLDER # Copyright (C) YEAR THE PACKAGE'S COPYRIGHT HOLDER
# This file is distributed under the same license as the AURWEB package. # This file is distributed under the same license as the AURWEB package.
# #
# Translators: # Translators:
# enolp <enolp@softastur.org>, 2014-2015,2017 # enolp <enolp@softastur.org>, 2014-2015,2017,2020,2022
# Ḷḷumex03 <tornes@opmbx.org>, 2014 # enolp <enolp@softastur.org>, 2020
# prflr88 <prflr88@gmail.com>, 2014-2015 # Ḷḷumex03, 2014
# Ḷḷumex03, 2014
# Pablo Lezaeta Reyes <prflr88@gmail.com>, 2014-2015
msgid "" msgid ""
msgstr "" msgstr ""
"Project-Id-Version: aurweb\n" "Project-Id-Version: aurweb\n"
"Report-Msgid-Bugs-To: https://bugs.archlinux.org/index.php?project=2\n" "Report-Msgid-Bugs-To: https://gitlab.archlinux.org/archlinux/aurweb/-/issues\n"
"POT-Creation-Date: 2020-01-31 09:29+0100\n" "POT-Creation-Date: 2020-01-31 09:29+0100\n"
"PO-Revision-Date: 2020-03-07 17:55+0000\n" "PO-Revision-Date: 2011-04-10 13:21+0000\n"
"Last-Translator: enolp <enolp@softastur.org>\n" "Last-Translator: enolp <enolp@softastur.org>, 2014-2015,2017,2020,2022\n"
"Language-Team: Asturian (http://www.transifex.com/lfleischer/aurweb/language/ast/)\n" "Language-Team: Asturian (http://app.transifex.com/lfleischer/aurweb/language/ast/)\n"
"MIME-Version: 1.0\n" "MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n" "Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n" "Content-Transfer-Encoding: 8bit\n"
@ -22,7 +24,7 @@ msgstr ""
#: html/404.php #: html/404.php
msgid "Page Not Found" msgid "Page Not Found"
msgstr "" msgstr "Nun s'atopó la páxina"
#: html/404.php #: html/404.php
msgid "Sorry, the page you've requested does not exist." msgid "Sorry, the page you've requested does not exist."
@ -48,7 +50,7 @@ msgstr ""
#: html/503.php #: html/503.php
msgid "Service Unavailable" msgid "Service Unavailable"
msgstr "" msgstr "El serviciu nun ta disponible"
#: html/503.php #: html/503.php
msgid "" msgid ""
@ -69,11 +71,11 @@ msgstr ""
#: html/account.php #: html/account.php
msgid "Could not retrieve information for the specified user." msgid "Could not retrieve information for the specified user."
msgstr "" msgstr "Nun se pudo recuperar la información del usuariu especificáu."
#: html/account.php #: html/account.php
msgid "You do not have permission to edit this account." msgid "You do not have permission to edit this account."
msgstr "" msgstr "Nun tienes permisu pa editar esta cuenta."
#: html/account.php lib/acctfuncs.inc.php #: html/account.php lib/acctfuncs.inc.php
msgid "Invalid password." msgid "Invalid password."
@ -134,15 +136,15 @@ msgid "Type"
msgstr "" msgstr ""
#: html/addvote.php #: html/addvote.php
msgid "Addition of a TU" msgid "Addition of a Package Maintainer"
msgstr "" msgstr ""
#: html/addvote.php #: html/addvote.php
msgid "Removal of a TU" msgid "Removal of a Package Maintainer"
msgstr "" msgstr ""
#: html/addvote.php #: html/addvote.php
msgid "Removal of a TU (undeclared inactivity)" msgid "Removal of a Package Maintainer (undeclared inactivity)"
msgstr "" msgstr ""
#: html/addvote.php #: html/addvote.php
@ -200,8 +202,9 @@ msgstr ""
#: html/home.php #: html/home.php
#, php-format #, php-format
msgid "" msgid ""
"Welcome to the AUR! Please read the %sAUR User Guidelines%s and %sAUR TU " "Welcome to the AUR! Please read the %sAUR User Guidelines%s for more "
"Guidelines%s for more information." "information and the %sAUR Submission Guidelines%s if you want to contribute "
"a PKGBUILD."
msgstr "" msgstr ""
#: html/home.php #: html/home.php
@ -216,7 +219,7 @@ msgid "Remember to vote for your favourite packages!"
msgstr "" msgstr ""
#: html/home.php #: html/home.php
msgid "Some packages may be provided as binaries in [community]." msgid "Some packages may be provided as binaries in [extra]."
msgstr "" msgstr ""
#: html/home.php #: html/home.php
@ -260,18 +263,18 @@ msgstr ""
#: html/home.php #: html/home.php
msgid "Deletion Request" msgid "Deletion Request"
msgstr "" msgstr "Solicitú de desaniciu"
#: html/home.php #: html/home.php
msgid "" msgid ""
"Request a package to be removed from the Arch User Repository. Please do not" "Request a package to be removed from the Arch User Repository. Please do not"
" use this if a package is broken and can be fixed easily. Instead, contact " " use this if a package is broken and can be fixed easily. Instead, contact "
"the package maintainer and file orphan request if necessary." "the maintainer and file orphan request if necessary."
msgstr "" msgstr ""
#: html/home.php #: html/home.php
msgid "Merge Request" msgid "Merge Request"
msgstr "" msgstr "Solicitú de mecíu"
#: html/home.php #: html/home.php
msgid "" msgid ""
@ -304,14 +307,15 @@ msgstr ""
#: html/home.php #: html/home.php
msgid "Discussion" msgid "Discussion"
msgstr "" msgstr "Discutiniu"
#: html/home.php #: html/home.php
#, php-format #, php-format
msgid "" msgid ""
"General discussion regarding the Arch User Repository (AUR) and Trusted User" "General discussion regarding the Arch User Repository (AUR) and Package "
" structure takes place on %saur-general%s. For discussion relating to the " "Maintainer structure takes place on %saur-general%s. For discussion relating"
"development of the AUR web interface, use the %saur-dev%s mailing list." " to the development of the AUR web interface, use the %saur-dev%s mailing "
"list."
msgstr "" msgstr ""
#: html/home.php #: html/home.php
@ -323,8 +327,8 @@ msgstr ""
msgid "" msgid ""
"If you find a bug in the AUR web interface, please fill out a bug report on " "If you find a bug in the AUR web interface, please fill out a bug report on "
"our %sbug tracker%s. Use the tracker to report bugs in the AUR web interface" "our %sbug tracker%s. Use the tracker to report bugs in the AUR web interface"
" %sonly%s. To report packaging bugs contact the package maintainer or leave " " %sonly%s. To report packaging bugs contact the maintainer or leave a "
"a comment on the appropriate package page." "comment on the appropriate package page."
msgstr "" msgstr ""
#: html/home.php #: html/home.php
@ -473,6 +477,12 @@ msgid ""
"checkbox." "checkbox."
msgstr "" msgstr ""
#: aurweb/routers/packages.py
msgid ""
"The selected packages have not been adopted, check the confirmation "
"checkbox."
msgstr ""
#: html/pkgbase.php lib/pkgreqfuncs.inc.php #: html/pkgbase.php lib/pkgreqfuncs.inc.php
msgid "Cannot find package to merge votes and comments into." msgid "Cannot find package to merge votes and comments into."
msgstr "" msgstr ""
@ -519,7 +529,7 @@ msgid "Delete"
msgstr "" msgstr ""
#: html/pkgdel.php #: html/pkgdel.php
msgid "Only Trusted Users and Developers can delete packages." msgid "Only Package Maintainers and Developers can delete packages."
msgstr "" msgstr ""
#: html/pkgdisown.php template/pkgbase_actions.php #: html/pkgdisown.php template/pkgbase_actions.php
@ -560,7 +570,7 @@ msgid "Disown"
msgstr "" msgstr ""
#: html/pkgdisown.php #: html/pkgdisown.php
msgid "Only Trusted Users and Developers can disown packages." msgid "Only Package Maintainers and Developers can disown packages."
msgstr "" msgstr ""
#: html/pkgflagcomment.php #: html/pkgflagcomment.php
@ -571,6 +581,14 @@ msgstr ""
msgid "Flag Package Out-Of-Date" msgid "Flag Package Out-Of-Date"
msgstr "" msgstr ""
#: templates/packages/flag.html
msgid ""
"This seems to be a VCS package. Please do %snot%s flag it out-of-date if the"
" package version in the AUR does not match the most recent commit. Flagging "
"this package should only be done if the sources moved or changes in the "
"PKGBUILD are required because of recent upstream changes."
msgstr ""
#: html/pkgflag.php #: html/pkgflag.php
#, php-format #, php-format
msgid "" msgid ""
@ -642,7 +660,7 @@ msgid "Merge"
msgstr "" msgstr ""
#: html/pkgmerge.php #: html/pkgmerge.php
msgid "Only Trusted Users and Developers can merge packages." msgid "Only Package Maintainers and Developers can merge packages."
msgstr "" msgstr ""
#: html/pkgreq.php template/pkgbase_actions.php template/pkgreq_form.php #: html/pkgreq.php template/pkgbase_actions.php template/pkgreq_form.php
@ -697,22 +715,22 @@ msgstr ""
#: html/tos.php #: html/tos.php
msgid "I accept the terms and conditions above." msgid "I accept the terms and conditions above."
msgstr "" msgstr "Acepto los términos y les condiciones d'arriba."
#: html/tu.php template/account_details.php template/header.php #: html/tu.php template/account_details.php template/header.php
msgid "Trusted User" msgid "Package Maintainer"
msgstr "" msgstr ""
#: html/tu.php #: html/tu.php
msgid "Could not retrieve proposal details." msgid "Could not retrieve proposal details."
msgstr "" msgstr "Nun se pudieron recuperar los detalles de la propuesta."
#: html/tu.php #: html/tu.php
msgid "Voting is closed for this proposal." msgid "Voting is closed for this proposal."
msgstr "" msgstr ""
#: html/tu.php #: html/tu.php
msgid "Only Trusted Users are allowed to vote." msgid "Only Package Maintainers are allowed to vote."
msgstr "" msgstr ""
#: html/tu.php #: html/tu.php
@ -867,6 +885,10 @@ msgstr ""
msgid "Account suspended" msgid "Account suspended"
msgstr "" msgstr ""
#: aurweb/routers/accounts.py
msgid "You do not have permission to suspend accounts."
msgstr ""
#: lib/acctfuncs.inc.php #: lib/acctfuncs.inc.php
#, php-format #, php-format
msgid "" msgid ""
@ -910,7 +932,7 @@ msgstr ""
#: lib/pkgbasefuncs.inc.php #: lib/pkgbasefuncs.inc.php
msgid "Comment cannot be empty." msgid "Comment cannot be empty."
msgstr "" msgstr "El comentariu nun pue tar baleru."
#: lib/pkgbasefuncs.inc.php #: lib/pkgbasefuncs.inc.php
msgid "Comment has been added." msgid "Comment has been added."
@ -918,7 +940,7 @@ msgstr ""
#: lib/pkgbasefuncs.inc.php #: lib/pkgbasefuncs.inc.php
msgid "You must be logged in before you can edit package information." msgid "You must be logged in before you can edit package information."
msgstr "" msgstr "Tienes d'aniciar la sesión enantes d'editar la información del paquete."
#: lib/pkgbasefuncs.inc.php #: lib/pkgbasefuncs.inc.php
msgid "Missing comment ID." msgid "Missing comment ID."
@ -926,15 +948,15 @@ msgstr ""
#: lib/pkgbasefuncs.inc.php #: lib/pkgbasefuncs.inc.php
msgid "No more than 5 comments can be pinned." msgid "No more than 5 comments can be pinned."
msgstr "" msgstr "Nun se puen fixar más de 5 comentarios."
#: lib/pkgbasefuncs.inc.php #: lib/pkgbasefuncs.inc.php
msgid "You are not allowed to pin this comment." msgid "You are not allowed to pin this comment."
msgstr "" msgstr "Nun tienes permisu pa fixar esti comentariu."
#: lib/pkgbasefuncs.inc.php #: lib/pkgbasefuncs.inc.php
msgid "You are not allowed to unpin this comment." msgid "You are not allowed to unpin this comment."
msgstr "" msgstr "Nun tienes permisu pa lliberar esti comentariu."
#: lib/pkgbasefuncs.inc.php #: lib/pkgbasefuncs.inc.php
msgid "Comment has been pinned." msgid "Comment has been pinned."
@ -950,6 +972,30 @@ msgstr ""
#: lib/pkgbasefuncs.inc.php lib/pkgfuncs.inc.php #: lib/pkgbasefuncs.inc.php lib/pkgfuncs.inc.php
msgid "Package details could not be found." msgid "Package details could not be found."
msgstr "Nun se pudieron atopar los detalles del paquete."
#: aurweb/routers/auth.py
msgid "Bad Referer header."
msgstr ""
#: aurweb/routers/packages.py
msgid "You did not select any packages to be notified about."
msgstr ""
#: aurweb/routers/packages.py
msgid "The selected packages' notifications have been enabled."
msgstr ""
#: aurweb/routers/packages.py
msgid "You did not select any packages for notification removal."
msgstr ""
#: aurweb/routers/packages.py
msgid "A package you selected does not have notifications enabled."
msgstr ""
#: aurweb/routers/packages.py
msgid "The selected packages' notifications have been removed."
msgstr "" msgstr ""
#: lib/pkgbasefuncs.inc.php #: lib/pkgbasefuncs.inc.php
@ -988,6 +1034,10 @@ msgstr ""
msgid "You did not select any packages to delete." msgid "You did not select any packages to delete."
msgstr "" msgstr ""
#: aurweb/routers/packages.py
msgid "One of the packages you selected does not exist."
msgstr ""
#: lib/pkgbasefuncs.inc.php #: lib/pkgbasefuncs.inc.php
msgid "The selected packages have been deleted." msgid "The selected packages have been deleted."
msgstr "" msgstr ""
@@ -996,10 +1046,18 @@ msgstr ""
msgid "You must be logged in before you can adopt packages."
msgstr ""
+#: aurweb/routers/package.py
+msgid "You are not allowed to adopt one of the packages you selected."
+msgstr ""
#: lib/pkgbasefuncs.inc.php
msgid "You must be logged in before you can disown packages."
msgstr ""
+#: aurweb/routers/packages.py
+msgid "You are not allowed to disown one of the packages you selected."
+msgstr ""
#: lib/pkgbasefuncs.inc.php
msgid "You did not select any packages to adopt."
msgstr ""
@@ -1168,7 +1226,7 @@ msgstr ""
#: template/account_details.php template/account_edit_form.php
#: template/search_accounts_form.php
-msgid "Trusted User & Developer"
+msgid "Package Maintainer & Developer"
msgstr ""
#: template/account_details.php template/account_edit_form.php
@@ -1271,10 +1329,6 @@ msgstr ""
msgid "Normal user"
msgstr ""
-#: template/account_edit_form.php template/search_accounts_form.php
-msgid "Trusted user"
-msgstr ""
#: template/account_edit_form.php template/search_accounts_form.php
msgid "Account Suspended"
msgstr ""
@@ -1347,6 +1401,15 @@ msgid ""
" the Arch User Repository."
msgstr ""
+#: templates/partials/account_form.html
+msgid ""
+"Specify multiple SSH Keys separated by new line, empty lines are ignored."
+msgstr ""
+#: templates/partials/account_form.html
+msgid "Hide deleted comments"
+msgstr ""
#: template/account_edit_form.php
msgid "SSH Public Key"
msgstr ""
@@ -1551,7 +1614,7 @@ msgstr ""
#: template/pkgbase_details.php template/pkg_details.php
#: template/pkg_search_form.php
msgid "Keywords"
-msgstr ""
+msgstr "Pallabres clave"
#: template/pkgbase_details.php template/pkg_details.php
#: template/pkg_search_form.php
@@ -1770,22 +1833,22 @@ msgstr ""
#: template/pkgreq_form.php
msgid ""
-"By submitting a deletion request, you ask a Trusted User to delete the "
-"package base. This type of request should be used for duplicates, software "
+"By submitting a deletion request, you ask a Package Maintainer to delete the"
+" package base. This type of request should be used for duplicates, software "
"abandoned by upstream, as well as illegal and irreparably broken packages."
msgstr ""
#: template/pkgreq_form.php
msgid ""
-"By submitting a merge request, you ask a Trusted User to delete the package "
-"base and transfer its votes and comments to another package base. Merging a "
-"package does not affect the corresponding Git repositories. Make sure you "
-"update the Git history of the target package yourself."
+"By submitting a merge request, you ask a Package Maintainer to delete the "
+"package base and transfer its votes and comments to another package base. "
+"Merging a package does not affect the corresponding Git repositories. Make "
+"sure you update the Git history of the target package yourself."
msgstr ""
#: template/pkgreq_form.php
msgid ""
-"By submitting an orphan request, you ask a Trusted User to disown the "
+"By submitting an orphan request, you ask a Package Maintainer to disown the "
"package base. Please only do this if the package needs maintainer action, "
"the maintainer is MIA and you already tried to contact the maintainer "
"previously."
@@ -1805,7 +1868,7 @@ msgstr[1] ""
#: template/pkgreq_results.php template/pkg_search_results.php
#, php-format
msgid "Page %d of %d."
-msgstr ""
+msgstr "Páxina %d de %d."
#: template/pkgreq_results.php
msgid "Package"
@@ -2019,7 +2082,7 @@ msgstr ""
#: template/stats/general_stats_table.php
msgid "Orphan Packages"
-msgstr ""
+msgstr "Paquetes güérfanos"
#: template/stats/general_stats_table.php
msgid "Packages added in the past 7 days"
@@ -2039,15 +2102,15 @@ msgstr ""
#: template/stats/general_stats_table.php
msgid "Registered Users"
-msgstr ""
+msgstr "Usuarios rexistraos"
#: template/stats/general_stats_table.php
-msgid "Trusted Users"
+msgid "Package Maintainers"
msgstr ""
#: template/stats/updates_table.php
msgid "Recent Updates"
-msgstr ""
+msgstr "Anovamientos de recién"
#: template/stats/updates_table.php
msgid "more"
@@ -2092,7 +2155,7 @@ msgstr ""
#: template/tu_details.php
msgid "Participation"
-msgstr ""
+msgstr "Participación"
#: template/tu_last_votes_list.php
msgid "Last Votes by TU"
@@ -2228,7 +2291,7 @@ msgstr ""
#: scripts/notify.py
#, python-brace-format
-msgid "TU Vote Reminder: Proposal {id}"
+msgid "Package Maintainer Vote Reminder: Proposal {id}"
msgstr ""
#: scripts/notify.py
@@ -2237,3 +2300,80 @@ msgid ""
"Please remember to cast your vote on proposal {id} [1]. The voting period "
"ends in less than 48 hours."
msgstr ""
+#: aurweb/routers/accounts.py
+msgid "Invalid account type provided."
+msgstr ""
+#: aurweb/routers/accounts.py
+msgid "You do not have permission to change account types."
+msgstr ""
+#: aurweb/routers/accounts.py
+msgid "You do not have permission to change this user's account type to %s."
+msgstr ""
+#: aurweb/packages/requests.py
+msgid "No due existing orphan requests to accept for %s."
+msgstr ""
+#: aurweb/asgi.py
+msgid "Internal Server Error"
+msgstr "Fallu internu del sirvidor"
+#: templates/errors/500.html
+msgid "A fatal error has occurred."
+msgstr "Asocedió un fallu fatal."
+#: templates/errors/500.html
+msgid ""
+"Details have been logged and will be reviewed by the postmaster posthaste. "
+"We apologize for any inconvenience this may have caused."
+msgstr ""
+#: aurweb/scripts/notify.py
+msgid "AUR Server Error"
+msgstr ""
+#: templates/pkgbase/merge.html templates/packages/delete.html
+#: templates/packages/disown.html
+msgid "Related package request closure comments..."
+msgstr ""
+#: templates/pkgbase/merge.html templates/packages/delete.html
+msgid ""
+"This action will close any pending package requests related to it. If "
+"%sComments%s are omitted, a closure comment will be autogenerated."
+msgstr ""
+#: templates/partials/tu/proposal/details.html
+msgid "assigned"
+msgstr ""
+#: templaets/partials/packages/package_metadata.html
+msgid "Show %d more"
+msgstr ""
+#: templates/partials/packages/package_metadata.html
+msgid "dependencies"
+msgstr ""
+#: aurweb/routers/accounts.py
+msgid "The account has not been deleted, check the confirmation checkbox."
+msgstr ""
+#: templates/partials/packages/comment_form.html
+msgid "Cancel"
+msgstr ""
+#: templates/requests.html
+msgid "Package name"
+msgstr ""
+#: templates/partials/account_form.html
+msgid ""
+"Note that if you hide your email address, it'll end up on the BCC list for "
+"any request notifications. In case someone replies to these notifications, "
+"you won't receive an email. However, replies are typically sent to the "
+"mailing-list and would then be visible in the archive."
+msgstr ""

Some files were not shown because too many files have changed in this diff.
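The entries in the diff above carry gettext format flags (#, php-format and #, python-brace-format), so a translated msgstr such as "Páxina %d de %d." has to keep exactly the placeholders of its msgid. A minimal sketch of such a consistency check, assuming the third-party polib library and a hypothetical catalogue path (aurweb's own pipeline may simply rely on msgfmt's checks instead):

    import re
    import sys

    import polib  # third-party; not part of aurweb itself

    # Match printf-style (%s, %d) and brace-style ({id}) placeholders.
    PLACEHOLDERS = re.compile(r"%[sd]|\{[a-z_]+\}")


    def check_placeholders(po_path: str) -> int:
        """Return the number of entries whose msgstr drops or adds placeholders."""
        mismatches = 0
        for entry in polib.pofile(po_path):
            if not entry.msgstr:
                continue  # untranslated entries cannot mismatch
            if sorted(PLACEHOLDERS.findall(entry.msgid)) != sorted(
                PLACEHOLDERS.findall(entry.msgstr)
            ):
                mismatches += 1
                print(f"placeholder mismatch: {entry.msgid!r}")
        return mismatches


    if __name__ == "__main__":
        # e.g. python check_po.py po/ast.po  (hypothetical path)
        sys.exit(1 if check_placeholders(sys.argv[1]) else 0)

GNU msgfmt's --check-format option performs a similar validation for the printf-style flags at compile time.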
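For the python-brace-format entries such as "Package Maintainer Vote Reminder: Proposal {id}", the placeholder is filled in with str.format() after the catalogue lookup. A rough sketch using the standard gettext module; the "aurweb" domain name, the po locale directory and the ast language code are assumptions here, since aurweb routes translations through its own l10n helpers:

    import gettext

    # fallback=True returns the untranslated msgid when no compiled catalogue is found.
    translation = gettext.translation(
        "aurweb", localedir="po", languages=["ast"], fallback=True
    )

    template = translation.gettext("Package Maintainer Vote Reminder: Proposal {id}")
    print(template.format(id=42))  # brace placeholders are substituted after translation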