Mirror of https://gitlab.archlinux.org/archlinux/aurweb.git (synced 2025-02-03 10:43:03 +01:00)
Merge branch 'pu': pre-v6.0.0
Release v6.0.0 - Python

This documents UX and functional changes for the v6.0.0 aurweb release.
Following this release, we'll be working on a few very nice features noted
at the end of this article in Upcoming Work.

Preface
-------

This v6.0.0 release makes the long-awaited Python port official. Along with
the development of the Python port, we have modified a number of features.
There have been some integral changes to how package requests are dealt
with, so _Trusted Users_ should read the entirety of this document.

Legend
------

There are a few terms which I'd like to define to increase understanding of
these changes as they are listed:

- _self_ - Refers to a user viewing or doing something regarding their own
  account
- _/pkgbase/{name}/{action}_ - Refers to a POST action which can be
  triggered via the relevant package page at `/{pkgbase,packages}/{name}`.

Grouped changes explained in multiple items will always be prefixed with the
same letter surrounded by braces. Example:

- [A] Some feature that does something
- [A] The same feature where another thing has changed

Infrastructure
--------------

- Python packaging is now done with poetry.
- SQLite support has been removed. Even though SQLAlchemy is an ORM, SQLite
  lacks quite a few SQL-server-like features, both out of the box and
  integrally, which forced us to account for the different database types.
  We now support only MySQL, and should be able to support PostgreSQL
  without much effort in the future.
  - Note: Users wishing to spin up a database quickly can use
    `docker-compose up -d mariadb` for a Docker-hosted mariadb service.
- An example systemd service has been included at `examples/aurweb.service`.
- Example wrappers to `aurweb-git-(auth|serve|update)` have been included at
  `examples/aurweb-git-(auth|serve|update).sh` and should be used to call
  these scripts when aurweb is installed into a poetry virtualenv.

HTML
----

- Pagers have all been modified.
  They still serve the same purpose, but their display is slightly
  different.
- Some markup and methods around the website have been changed for POST
  requests, and some forms have been completely reworked.

Package Requests
----------------

- Normal users can now view and close their own requests.
- [A] Requests can no longer be accepted through manual closures.
- [A] Requests are now closed via their relevant actions:
  - Deletion
    - Through the `/packages` bulk delete action
    - Through `/pkgbase/{name}/delete`
  - Merge
    - Through `/pkgbase/{name}/merge`
  - Orphan
    - Through the `/packages` bulk disown action
    - Through `/pkgbase/{name}/disown`
- Deletion and merge requests (and their closures) are now autogenerated if
  no pre-existing request exists. This was done to increase tracking of
  package modifications performed by those with access to do so (TUs).
- Deletion, merge and orphan request actions now close all (one or more)
  requests pertaining to the action performed. This comes with the downside
  of multiple notifications being sent out about a closure if more than one
  request (or no request) exists for them.
- Merge actions now automatically reject other pre-existing merge requests
  with a mismatched `MergeBaseName` column when a merge action is performed.
- The last `/requests` page no longer leads nowhere.

Package Bulk Actions: /packages
-------------------------------

- The `Merge into` field has been removed. Merges must now be performed via
  the `/pkgbase/{name}/merge` action.

Package View
------------

- Some package metadata (pkginfo) is no longer cached. Previously, some
  package information was cached for one day by default. If we need to
  bring this back, we can.

TU Proposals
------------

- A valid username is now required for any addition or removal of a TU.

RPC
---

- `type=get-comment-form` has been removed and is now located at
  `/pkgbase/{name}/comments/{id}/form`.
- Support for versions 1-4 has been removed.
- JSON key ordering is different than PHP's JSON.
- `type=search` performance is overall slightly worse than PHP's. This
  should not heavily affect users, as a 3,000 record query is returned in
  roughly 0.20ms from a local standpoint. We will be working on this with
  the aim of pushing it past PHP's.

Archives
--------

- Added metadata archive `packages-meta-v1.json.gz`.
- Added metadata archive `packages-meta-ext-v1.json.gz`.
  - Enable this by passing `--extended` to `aurweb-mkpkglists`.

Performance Changes
-------------------

As is expected from a complete rewrite of the website, performance has
changed across the board. In most places, the Python implementation now
performs better than the pre-existing PHP implementation, with the
exception of a few routes. Notably:

- `/` loads much quicker, as it is now forcibly cached for five minutes at
  a time.
- `/packages` search is much quicker.
- `/packages/{name}` view is slightly slower; we are no longer caching
  various pieces of package info for `cache_pkginfo_ttl`, which defaults to
  86400 seconds, or one day.
- Request actions are slower due to the removal of the `via` parameter. We
  now query the database for requests related to the action based on the
  current state of the DB.
- `/rpc?type=info` queries are slightly quicker.
- `/rpc?type=search` queries with low result counts are quicker.
- `/rpc?type=search` queries with large result counts (> 2500) are slower.
  - We are not satisfied with this. We'll be working on pushing this over
    the edge along with the rest of the DB-intensive routes. However, the
    speed degradation is quite negligible for users' experience: 0.12ms PHP
    vs 0.15ms Python on a 3,000 record query on my local 4-core 8-thread
    system.

Upcoming Work
-------------

This release is the first major release of the Python implementation. We
have multiple tasks up for work immediately, which will bring us a few more
minor versions forward as they are completed.
- Update request and tu vote pagers
- Archive differentials
- Archive mimetypes
- (a) Git scripts to ORM conversion
- (a) Sharness removal
- Restriction of the number of requests users can submit
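As a consumer-side sketch of the metadata archives added in this release: `packages-meta-v1.json.gz` is gzip-compressed JSON readable with the Python standard library alone. The `Name`/`Version` fields below are illustrative, not a statement of the archive's full schema:

```python
import gzip
import json

# Build a tiny stand-in archive in memory; the real file would be
# downloaded from an AUR mirror as packages-meta-v1.json.gz.
fake_archive = [{"Name": "example-pkg", "Version": "1.0-1"}]
blob = gzip.compress(json.dumps(fake_archive).encode())

# Reading it back is the same call a consumer of the archive would use.
packages = json.loads(gzip.decompress(blob))
print(packages[0]["Name"])  # -> example-pkg
```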
Commit a467b18474: 403 changed files with 78721 additions and 2947 deletions
`.coveragerc` (new file, +9):

```ini
[run]
disable_warnings = already-imported

[report]
include = aurweb/*
fail_under = 85
exclude_lines =
    if __name__ == .__main__.:
    pragma: no cover
```
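coverage.py treats each `exclude_lines` entry as a regular expression matched against source lines; the bare `.` wildcards in `if __name__ == .__main__.:` are what let the pattern match either quote style. A quick sketch of that match:

```python
import re

# The same pattern coverage.py would compile from the config entry;
# each `.` matches the quote character on either side of __main__.
EXCLUDE = re.compile(r"if __name__ == .__main__.:")

print(bool(EXCLUDE.search("if __name__ == '__main__':")))  # -> True
print(bool(EXCLUDE.search('if __name__ == "__main__":')))  # -> True
```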
`.dockerignore` (new file, +6):

```
*/*.mo
conf/config
conf/config.sqlite
conf/config.sqlite.defaults
conf/docker
conf/docker.defaults
```
`.env` (new file, +9):

```sh
FASTAPI_BACKEND="uvicorn"
FASTAPI_WORKERS=2
MARIADB_SOCKET_DIR="/var/run/mysqld/"
AURWEB_PHP_PREFIX=https://localhost:8443
AURWEB_FASTAPI_PREFIX=https://localhost:8444
AURWEB_SSHD_PREFIX=ssh://aur@localhost:2222
GIT_DATA_DIR="./aur.git/"
TEST_RECURSION_LIMIT=10000
COMMIT_HASH=
```
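The `.env` file above is a plain KEY=VALUE list consumed by docker-compose. A minimal hand-rolled parser sketch (my own illustrative helper, ignoring quoting edge cases beyond simple double quotes):

```python
def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines like the .env file above."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key] = value.strip('"')  # strip simple double quotes
    return env


sample = 'FASTAPI_BACKEND="uvicorn"\nFASTAPI_WORKERS=2\nCOMMIT_HASH='
print(parse_env(sample)["FASTAPI_BACKEND"])  # -> uvicorn
```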
`.gitignore` (vendored, @@ -1,11 +1,45 @@), new contents:

```
__pycache__/
*.py[cod]
.vim/
.pylintrc
.coverage
.idea
/cache/*
/logs/*
/build/
/dist/
/aurweb.egg-info/
/personal/
/notes/
/vendor/
/pyrightconfig.json
/taskell.md
aur.git/
aurweb.sqlite3
conf/config
conf/config.sqlite
conf/config.sqlite.defaults
conf/docker
conf/docker.defaults
data.sql
dummy-data.sql*
env/
fastapi_aw/
htmlcov/
po/*.mo
po/*.po~
po/POTFILES
schema/aur-schema-sqlite.sql
test/test-results/
test/trash directory*
web/locale/*/
web/html/*.gz

# Do not stage compiled asciidoc: make -C doc
doc/rpc.html

# Ignore any user-configured .envrc files at the root.
/.envrc

# Ignore .python-version file from Pyenv
.python-version
```
GitLab CI configuration (@@ -1,19 +1,81 @@), new contents:

```yaml
image: archlinux:base-devel

cache:
  key: system-v1
  paths:
    # For some reason Gitlab CI only supports storing cache/artifacts in a path relative to the build directory
    - .pkg-cache

variables:
  AUR_CONFIG: conf/config  # Default MySQL config setup in before_script.
  DB_HOST: localhost
  TEST_RECURSION_LIMIT: 10000
  CURRENT_DIR: "$(pwd)"
  LOG_CONFIG: logging.test.conf

test:
  stage: test
  tags:
    - fast-single-thread
  before_script:
    - export PATH="$HOME/.poetry/bin:${PATH}"
    - ./docker/scripts/install-deps.sh
    - ./docker/scripts/install-python-deps.sh
    - useradd -U -d /aurweb -c 'AUR User' aur
    - ./docker/mariadb-entrypoint.sh
    - (cd '/usr' && /usr/bin/mysqld_safe --datadir='/var/lib/mysql') &
    - 'until : > /dev/tcp/127.0.0.1/3306; do sleep 1s; done'
    - cp -v conf/config.dev conf/config
    - sed -i "s;YOUR_AUR_ROOT;$(pwd);g" conf/config
    - ./docker/test-mysql-entrypoint.sh  # Create mysql AUR_CONFIG.
    - make -C po all install  # Compile translations.
    - make -C doc  # Compile asciidoc.
    - make -C test clean  # Cleanup coverage.
  script:
    # Run sharness.
    - make -C test sh
    # Run pytest.
    - pytest
    - make -C test coverage  # Produce coverage reports.
    - flake8 --count aurweb  # Assert no flake8 violations in aurweb.
    - flake8 --count test  # Assert no flake8 violations in test.
    - flake8 --count migrations  # Assert no flake8 violations in migrations.
    - isort --check-only aurweb  # Assert no isort violations in aurweb.
    - isort --check-only test  # Assert no isort violations in test.
    - isort --check-only migrations  # Assert no isort violations in migrations.
  coverage: '/TOTAL.*\s+(\d+\%)/'
  artifacts:
    reports:
      cobertura: coverage.xml

deploy:
  stage: deploy
  tags:
    - secure
  rules:
    - if: $CI_COMMIT_BRANCH == "pu"
      when: manual
  variables:
    FASTAPI_BACKEND: gunicorn
    FASTAPI_WORKERS: 5
    AURWEB_PHP_PREFIX: https://aur-dev.archlinux.org
    AURWEB_FASTAPI_PREFIX: https://aur-dev.archlinux.org
    AURWEB_SSHD_PREFIX: ssh://aur@aur-dev.archlinux.org:2222
    COMMIT_HASH: $CI_COMMIT_SHA
    GIT_DATA_DIR: git_data
  script:
    - pacman -Syu --noconfirm docker docker-compose socat openssh
    - chmod 600 ${SSH_KEY}
    - socat "UNIX-LISTEN:/tmp/docker.sock,reuseaddr,fork" EXEC:"ssh -o UserKnownHostsFile=${SSH_KNOWN_HOSTS} -Ti ${SSH_KEY} ${SSH_USER}@${SSH_HOST}" &
    - export DOCKER_HOST="unix:///tmp/docker.sock"
    # Set secure login config for aurweb.
    - sed -ri "s/^(disable_http_login).*$/\1 = 1/" conf/config.dev
    - docker-compose build
    - docker-compose -f docker-compose.yml -f docker-compose.aur-dev.yml down --remove-orphans
    - docker-compose -f docker-compose.yml -f docker-compose.aur-dev.yml up -d
    - docker image prune -f
    - docker container prune -f
    - docker volume prune -f
  environment:
    name: development
    url: https://aur-dev.archlinux.org
```
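The CI config's `coverage: '/TOTAL.*\s+(\d+\%)/'` key tells GitLab to extract the total coverage percentage from the job log with that regex. A sketch of the same match in Python (the sample log line is illustrative):

```python
import re

# The regex GitLab applies to job output; group 1 captures the percentage
# from the TOTAL row of a `coverage report` table.
COVERAGE_RE = re.compile(r"TOTAL.*\s+(\d+\%)")

sample_log_line = "TOTAL                            4321    210    95%"
match = COVERAGE_RE.search(sample_log_line)
print(match.group(1))  # -> 95%
```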
Contributing guidelines (@@ -8,3 +8,12 @@, context: "You can add a git hook to do this by installing `python-pre-commit` and running"), new contents:

```
`pre-commit install`.

[1] https://lists.archlinux.org/listinfo/aur-dev

### Coding Guidelines

1. All source modified or added within a patchset **must** maintain equivalent
   or increased coverage by providing tests that use the functionality.

2. Please keep your source within an 80 column width.

Test patches that increase coverage in the codebase are always welcome.
```
`Dockerfile` (new file, +42):

```dockerfile
FROM archlinux:base-devel

VOLUME /root/.cache/pypoetry/cache
VOLUME /root/.cache/pypoetry/artifacts

ENV PATH="/root/.poetry/bin:${PATH}"
ENV PYTHONPATH=/aurweb
ENV AUR_CONFIG=conf/config

# Install system-wide dependencies.
COPY ./docker/scripts/install-deps.sh /install-deps.sh
RUN /install-deps.sh

# Copy Docker scripts
COPY ./docker /docker
COPY ./docker/scripts/* /usr/local/bin/

# Copy over all aurweb files.
COPY . /aurweb

# Working directory is aurweb root @ /aurweb.
WORKDIR /aurweb

# Copy initial config to conf/config.
RUN cp -vf conf/config.dev conf/config
RUN sed -i "s;YOUR_AUR_ROOT;/aurweb;g" conf/config

# Install Python dependencies.
RUN /docker/scripts/install-python-deps.sh

# Compile asciidocs.
RUN make -C doc

# Add our aur user.
RUN useradd -U -d /aurweb -c 'AUR User' aur

# Setup some default system stuff.
RUN ln -sf /usr/share/zoneinfo/UTC /etc/localtime

# Install translations.
RUN make -C po all install
```
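The Dockerfile renders the dev config by substituting the `YOUR_AUR_ROOT` placeholder with `sed`. The same substitution sketched in Python (`render_config` is my own illustrative helper, not part of aurweb):

```python
def render_config(template: str, aur_root: str) -> str:
    # Mirrors `sed -i "s;YOUR_AUR_ROOT;/aurweb;g" conf/config`
    # from the Dockerfile above.
    return template.replace("YOUR_AUR_ROOT", aur_root)


print(render_config("aur_location = YOUR_AUR_ROOT/web", "/aurweb"))
# -> aur_location = /aurweb/web
```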
`INSTALL` (reworked; hunks @@ -4,64 +4,135 @@, @@ -69,19 +140,26 @@ and @@ -100,8 +178,17 @@), new version:

```
For testing aurweb patches before submission, you can use the instructions in
TESTING for testing the web interface only.

For a detailed description on how to setup a full aurweb server,
read the instructions below.

1) Clone the aurweb project and install it (via `python-poetry`):

   $ cd /srv/http/
   $ git clone git://git.archlinux.org/aurweb.git
   $ cd aurweb
   $ poetry install

2) Setup a web server with PHP and MySQL. Configure the web server to redirect
   all URLs to /index.php/foo/bar/. The following block can be used with nginx:

   server {
       # https is preferred and can be done easily with LetsEncrypt
       # or self-CA signing. Users can still listen over 80 for plain
       # http, for which the [options] disable_http_login used to toggle
       # the authentication feature.
       listen 443 ssl http2;
       server_name aur.local aur;

       # To enable SSL proxy properly, make sure gunicorn and friends
       # are supporting forwarded headers over 127.0.0.1 or any if
       # the asgi server is contacted by non-localhost hosts.
       ssl_certificate /etc/ssl/certs/aur.cert.pem;
       ssl_certificate_key /etc/ssl/private/aur.key.pem;

       # Asset root. This is used to match against gzip archives.
       root /srv/http/aurweb/web/html;

       # TU Bylaws redirect.
       location = /trusted-user/TUbylaws.html {
           return 301 https://tu-bylaws.aur.archlinux.org;
       }

       # smartgit location.
       location ~ "^/([a-z0-9][a-z0-9.+_-]*?)(\.git)?/(git-(receive|upload)-pack|HEAD|info/refs|objects/(info/(http-)?alternates|packs)|[0-9a-f]{2}/[0-9a-f]{38}|pack/pack-[0-9a-f]{40}\.(pack|idx))$" {
           include uwsgi_params;
           uwsgi_pass smartgit;
           uwsgi_modifier1 9;
           uwsgi_param SCRIPT_FILENAME /usr/lib/git-core/git-http-backend;
           uwsgi_param PATH_INFO /aur.git/$3;
           uwsgi_param GIT_HTTP_EXPORT_ALL "";
           uwsgi_param GIT_NAMESPACE $1;
           uwsgi_param GIT_PROJECT_ROOT /srv/http/aurweb;
       }

       # cgitrc.proto should be configured and located somewhere
       # of your choosing.
       location ~ ^/cgit {
           include uwsgi_params;
           rewrite ^/cgit/([^?/]+/[^?]*)?(?:\?(.*))?$ /cgit.cgi?url=$1&$2 last;
           uwsgi_modifier1 9;
           uwsgi_param CGIT_CONFIG /srv/http/aurweb/conf/cgitrc.proto;
           uwsgi_pass cgit;
       }

       # Static archive assets.
       location ~ \.gz$ {
           types { application/gzip text/plain }
           default_type text/plain;
           add_header Content-Encoding gzip;
           expires 5m;
       }

       # For everything else, proxy the http request to (guni|uvi|hyper)corn.
       # The ASGI server application should allow this request's IP to be
       # forwarded via the headers used below.
       # https://docs.gunicorn.org/en/stable/settings.html#forwarded-allow-ips
       location / {
           proxy_pass http://127.0.0.1:8000;
           proxy_set_header Host $http_host;
           proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
           proxy_set_header X-Forwarded-Protocol ssl;
           proxy_set_header X-Forwarded-Proto $scheme;
           proxy_set_header X-Forwarded-Ssl on;
       }
   }

3) Optionally copy conf/config.defaults to /etc/aurweb/. Create or copy
   /etc/aurweb/config (this is expected to contain all configuration settings
   if the defaults file does not exist) and adjust the configuration (pay
   attention to disable_http_login, enable_maintenance and aur_location).

4) Install system-wide dependencies:

   # pacman -S git gpgme cgit curl openssh uwsgi uwsgi-plugin-cgi \
               python-poetry

5) Create a new user:

   # useradd -U -d /srv/http/aurweb -c 'AUR user' aur
   # su - aur

6a) Install Python dependencies via poetry:

   # Install the package and scripts as the aur user.
   $ poetry install

6b) Setup Services

   aurweb utilizes the following systemd services:
   - mariadb
   - redis (optional, requires [options] cache 'redis')
   - `examples/aurweb.service`

6c) Setup Cron

   Using [cronie](https://archlinux.org/packages/core/x86_64/cronie/):

   # su - aur
   $ crontab -e

   The following crontab file uses every script meant to be run on an
   interval:

   AUR_CONFIG='/etc/aurweb/config'
   */5 * * * * bash -c 'poetry run aurweb-mkpkglists --extended'
   */2 * * * * bash -c 'poetry run aurweb-aurblup'
   */2 * * * * bash -c 'poetry run aurweb-pkgmaint'
   */2 * * * * bash -c 'poetry run aurweb-usermaint'
   */2 * * * * bash -c 'poetry run aurweb-popupdate'
   */12 * * * * bash -c 'poetry run aurweb-tuvotereminder'

7) Create a new database and a user and import the aurweb SQL schema:

   $ poetry run python -m aurweb.initdb

8) Initialize the Git repository:

   # mkdir /srv/http/aurweb/aur.git/
   # cd /srv/http/aurweb/aur.git/
   # git config --local transfer.hideRefs '^refs/'
   # git config --local --add transfer.hideRefs '!refs/'
   # git config --local --add transfer.hideRefs '!HEAD'
   # chown -R aur .

   Link to the `aurweb-git-update` poetry wrapper provided at
   `examples/aurweb-git-update.sh`, which should be installed
   somewhere as executable:

   # ln -s /path/to/aurweb-git-update.sh hooks/update

It is recommended to read doc/git-interface.txt for more information on the
administration of the package Git repository.

9) Configure sshd(8) for the AUR. Add the following lines at the end of your
   sshd_config(5) and restart the sshd.

   If using a virtualenv, copy `examples/aurweb-git-auth.sh` to a location
   and call it below:

   Match User aur
       PasswordAuthentication no
       AuthorizedKeysCommand /path/to/aurweb-git-auth.sh "%t" "%k"
       AuthorizedKeysCommandUser aur
       AcceptEnv AUR_OVERWRITE

   Sample systemd unit files for fcgiwrap can be found under conf/.

10) If you want Redis to cache data:

   # pacman -S redis
   # systemctl enable --now redis

   And edit the configuration file to enable redis caching
   (`[options] cache = redis`).

11) Start `aurweb.service`.

   An example systemd unit has been included at `examples/aurweb.service`.
   This unit can be used to manage the aurweb asgi backend. By default,
   it is configured to use `poetry` as the `aur` user; this should be
   configured as needed.
```
`LICENSES/starlette_exporter` (new file, +201): the standard Apache License, Version 2.0, January 2004 (http://www.apache.org/licenses/).
|
||||||
|
Notwithstanding the above, nothing herein shall supersede or modify
|
||||||
|
the terms of any separate license agreement you may have executed
|
||||||
|
with Licensor regarding such Contributions.
|
||||||
|
|
||||||
|
6. Trademarks. This License does not grant permission to use the trade
|
||||||
|
names, trademarks, service marks, or product names of the Licensor,
|
||||||
|
except as required for reasonable and customary use in describing the
|
||||||
|
origin of the Work and reproducing the content of the NOTICE file.
|
||||||
|
|
||||||
|
7. Disclaimer of Warranty. Unless required by applicable law or
|
||||||
|
agreed to in writing, Licensor provides the Work (and each
|
||||||
|
Contributor provides its Contributions) on an "AS IS" BASIS,
|
||||||
|
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||||
|
implied, including, without limitation, any warranties or conditions
|
||||||
|
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
|
||||||
|
PARTICULAR PURPOSE. You are solely responsible for determining the
|
||||||
|
appropriateness of using or redistributing the Work and assume any
|
||||||
|
risks associated with Your exercise of permissions under this License.
|
||||||
|
|
||||||
|
8. Limitation of Liability. In no event and under no legal theory,
|
||||||
|
whether in tort (including negligence), contract, or otherwise,
|
||||||
|
unless required by applicable law (such as deliberate and grossly
|
||||||
|
negligent acts) or agreed to in writing, shall any Contributor be
|
||||||
|
liable to You for damages, including any direct, indirect, special,
|
||||||
|
incidental, or consequential damages of any character arising as a
|
||||||
|
result of this License or out of the use or inability to use the
|
||||||
|
Work (including but not limited to damages for loss of goodwill,
|
||||||
|
work stoppage, computer failure or malfunction, or any and all
|
||||||
|
other commercial damages or losses), even if such Contributor
|
||||||
|
has been advised of the possibility of such damages.
|
||||||
|
|
||||||
|
9. Accepting Warranty or Additional Liability. While redistributing
|
||||||
|
the Work or Derivative Works thereof, You may choose to offer,
|
||||||
|
and charge a fee for, acceptance of support, warranty, indemnity,
|
||||||
|
or other liability obligations and/or rights consistent with this
|
||||||
|
License. However, in accepting such obligations, You may act only
|
||||||
|
on Your own behalf and on Your sole responsibility, not on behalf
|
||||||
|
of any other Contributor, and only if You agree to indemnify,
|
||||||
|
defend, and hold each Contributor harmless for any liability
|
||||||
|
incurred by, or claims asserted against, such Contributor by reason
|
||||||
|
of your accepting any such warranty or additional liability.
|
||||||
|
|
||||||
|
END OF TERMS AND CONDITIONS
|
||||||
|
|
||||||
|
APPENDIX: How to apply the Apache License to your work.
|
||||||
|
|
||||||
|
To apply the Apache License to your work, attach the following
|
||||||
|
boilerplate notice, with the fields enclosed by brackets "[]"
|
||||||
|
replaced with your own identifying information. (Don't include
|
||||||
|
the brackets!) The text should be enclosed in the appropriate
|
||||||
|
comment syntax for the file format. We also recommend that a
|
||||||
|
file or class name and description of purpose be included on the
|
||||||
|
same "printed page" as the copyright notice for easier
|
||||||
|
identification within third-party archives.
|
||||||
|
|
||||||
|
Copyright [yyyy] [name of copyright owner]
|
||||||
|
|
||||||
|
Licensed under the Apache License, Version 2.0 (the "License");
|
||||||
|
you may not use this file except in compliance with the License.
|
||||||
|
You may obtain a copy of the License at
|
||||||
|
|
||||||
|
http://www.apache.org/licenses/LICENSE-2.0
|
||||||
|
|
||||||
|
Unless required by applicable law or agreed to in writing, software
|
||||||
|
distributed under the License is distributed on an "AS IS" BASIS,
|
||||||
|
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||||
|
See the License for the specific language governing permissions and
|
||||||
|
limitations under the License.
|
README
@@ -19,12 +19,14 @@ Directory Layout

* `aurweb`: aurweb Python modules, Git interface and maintenance scripts
* `conf`: configuration and configuration templates
* `static`: static resource files
* `templates`: jinja2 template collection
* `doc`: project documentation
* `po`: translation files for strings in the aurweb interface
* `schema`: schema for the SQL database
* `test`: test suite and test cases
* `upgrading`: instructions for upgrading setups from one release to another
* `web`: PHP-based web interface for the AUR

Links
-----

@@ -46,3 +48,8 @@ Translations are welcome via our Transifex project at
https://www.transifex.com/lfleischer/aurweb; see `doc/i18n.txt` for details.

Testing
-------

See [test/README.md](test/README.md) for details on dependencies and testing.
TESTING
@@ -5,19 +5,47 @@

Note that this setup is only to test the web interface. If you need to have a
full aurweb instance with cgit, ssh interface, etc, follow the directions in
INSTALL.

docker-compose
--------------

1) Clone the aurweb project:

       $ git clone https://gitlab.archlinux.org/archlinux/aurweb.git

2) Install the necessary packages:

       # pacman -S docker-compose

3) Build the aurweb:latest image:

       $ cd /path/to/aurweb/
       $ docker-compose build

4) Run a local Docker development instance:

       $ cd /path/to/aurweb/
       $ docker-compose up -d nginx

5) Browse to the local aurweb development server:

       Python: https://localhost:8444/
       PHP: https://localhost:8443/

Bare Metal
----------

1) Clone the aurweb project:

       $ git clone git://git.archlinux.org/aurweb.git

2) Install the necessary packages:

       # pacman -S python-poetry

3) Install the package and its dependencies via `poetry`:

       $ cd /path/to/aurweb/
       $ poetry install

4) Copy conf/config.dev to conf/config and replace YOUR_AUR_ROOT by the absolute
   path to the root of your aurweb clone. sed can do both tasks for you:

@@ -27,15 +55,23 @@ INSTALL.

   Note that when the upstream config.dev is updated, you should compare it to
   your conf/config, or regenerate your configuration with the command above.

5) Prepare a database:

       $ cd /path/to/aurweb/

       $ AUR_CONFIG=conf/config poetry run python -m aurweb.initdb

       $ poetry run schema/gendummydata.py dummy_data.sql
       $ mysql -uaur -paur aurweb < dummy_data.sql

6) Run the test server:

       ## set AUR_CONFIG to our locally created config
       $ export AUR_CONFIG=conf/config

       ## with aurweb.spawn
       $ poetry run python -m aurweb.spawn

       ## with systemd service
       $ sudo install -m644 examples/aurweb.service /etc/systemd/system/
       $ systemctl enable --now aurweb.service
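The hunk above elides the actual sed invocation for the "sed can do both tasks for you" step. A hypothetical illustration of that substitution follows; the file path and config contents here are invented for the demo and are not the real conf/config.dev:

```shell
# Hypothetical demo: substitute YOUR_AUR_ROOT with the clone's absolute path.
# The config line below is made up; only the sed substitution pattern is
# the point being illustrated.
mkdir -p /tmp/aurweb-demo/conf
cd /tmp/aurweb-demo
printf 'aur_root = YOUR_AUR_ROOT\n' > conf/config.dev
sed -e "s;YOUR_AUR_ROOT;$PWD;g" conf/config.dev > conf/config
cat conf/config
# prints: aur_root = /tmp/aurweb-demo
```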
aurweb/asgi.py (261 lines)
@@ -1,30 +1,259 @@

import hashlib
import http
import io
import os
import re
import sys
import traceback
import typing

from urllib.parse import quote_plus

from fastapi import FastAPI, HTTPException, Request, Response
from fastapi.responses import RedirectResponse
from fastapi.staticfiles import StaticFiles
from jinja2 import TemplateNotFound
from prometheus_client import multiprocess
from sqlalchemy import and_, or_
from starlette.exceptions import HTTPException as StarletteHTTPException
from starlette.middleware.authentication import AuthenticationMiddleware
from starlette.middleware.sessions import SessionMiddleware

import aurweb.captcha  # noqa: F401
import aurweb.config
import aurweb.filters  # noqa: F401
import aurweb.logging
import aurweb.pkgbase.util as pkgbaseutil

from aurweb import logging, prometheus, util
from aurweb.auth import BasicAuthBackend
from aurweb.db import get_engine, query
from aurweb.models import AcceptedTerm, Term
from aurweb.packages.util import get_pkg_or_base
from aurweb.prometheus import instrumentator
from aurweb.redis import redis_connection
from aurweb.routers import APP_ROUTES
from aurweb.scripts import notify
from aurweb.templates import make_context, render_template

logger = logging.get_logger(__name__)

# Setup the FastAPI app.
app = FastAPI()

# Instrument routes with the prometheus-fastapi-instrumentator
# library with custom collectors and expose /metrics.
instrumentator().add(prometheus.http_api_requests_total())
instrumentator().add(prometheus.http_requests_total())
instrumentator().instrument(app)


@app.on_event("startup")
async def app_startup():
    # https://stackoverflow.com/questions/67054759/about-the-maximum-recursion-error-in-fastapi
    # Test failures have been observed by internal starlette code when
    # using starlette.testclient.TestClient. Looking around in regards
    # to the recursion error has really not recommended a course of action
    # other than increasing the recursion limit. For now, that is how
    # we handle the issue: an optional TEST_RECURSION_LIMIT env var
    # provided by the user. Docker uses .env's TEST_RECURSION_LIMIT
    # when running test suites.
    # TODO: Find a proper fix to this issue.
    recursion_limit = int(os.environ.get(
        "TEST_RECURSION_LIMIT", sys.getrecursionlimit() + 1000))
    sys.setrecursionlimit(recursion_limit)

    backend = aurweb.config.get("database", "backend")
    if backend not in aurweb.db.DRIVERS:
        raise ValueError(
            f"The configured database backend ({backend}) is unsupported. "
            f"Supported backends: {str(aurweb.db.DRIVERS.keys())}")

    session_secret = aurweb.config.get("fastapi", "session_secret")
    if not session_secret:
        raise Exception("[fastapi] session_secret must not be empty")

    app.mount("/static/css",
              StaticFiles(directory="web/html/css"),
              name="static_css")
    app.mount("/static/js",
              StaticFiles(directory="web/html/js"),
              name="static_js")
    app.mount("/static/images",
              StaticFiles(directory="web/html/images"),
              name="static_images")

    # Add application middlewares.
    app.add_middleware(AuthenticationMiddleware, backend=BasicAuthBackend())
    app.add_middleware(SessionMiddleware, secret_key=session_secret)

    # Add application routes.
    def add_router(module):
        app.include_router(module.router)
    util.apply_all(APP_ROUTES, add_router)

    # Initialize the database engine and ORM.
    get_engine()


def child_exit(server, worker):  # pragma: no cover
    """ This function is required for gunicorn customization
    of prometheus multiprocessing. """
    multiprocess.mark_process_dead(worker.pid)


async def internal_server_error(request: Request, exc: Exception) -> Response:
    """
    Catch all uncaught Exceptions thrown in a route.

    :param request: FastAPI Request
    :return: Rendered 500.html template with status_code 500
    """
    context = make_context(request, "Internal Server Error")

    # Print out the exception via `traceback` and store the value
    # into the `traceback` context variable.
    tb_io = io.StringIO()
    traceback.print_exc(file=tb_io)
    tb = tb_io.getvalue()
    context["traceback"] = tb

    # Produce a SHA1 hash of the traceback string.
    tb_hash = hashlib.sha1(tb.encode()).hexdigest()

    # Use the first 7 characters of the sha1 for the traceback id.
    # We will use this to log and include in the notification.
    tb_id = tb_hash[:7]

    redis = redis_connection()
    pipe = redis.pipeline()
    key = f"tb:{tb_hash}"
    pipe.get(key)
    retval, = pipe.execute()
    if not retval:
        # Expire in one hour; this is just done to make sure we
        # don't infinitely store these values, but reduce the number
        # of automated reports (notification below). At this time of
        # writing, unexpected exceptions are not common, thus this
        # will not produce a large memory footprint in redis.
        pipe.set(key, tb)
        pipe.expire(key, 3600)
        pipe.execute()

        # Send out notification about it.
        notif = notify.ServerErrorNotification(
            tb_id, context.get("version"), context.get("utcnow"))
        notif.send()

        retval = tb
    else:
        retval = retval.decode()

    # Log details about the exception traceback.
    logger.error(f"FATAL[{tb_id}]: An unexpected exception has occurred.")
    logger.error(retval)

    return render_template(request, "errors/500.html", context,
                           status_code=http.HTTPStatus.INTERNAL_SERVER_ERROR)


@app.exception_handler(StarletteHTTPException)
async def http_exception_handler(request: Request, exc: HTTPException) \
        -> Response:
    """ Handle an HTTPException thrown in a route. """
    phrase = http.HTTPStatus(exc.status_code).phrase
    context = make_context(request, phrase)
    context["exc"] = exc
    context["phrase"] = phrase

    # Additional context for some exceptions.
    if exc.status_code == http.HTTPStatus.NOT_FOUND:
        tokens = request.url.path.split("/")
        matches = re.match("^([a-z0-9][a-z0-9.+_-]*?)(\\.git)?$", tokens[1])
        if matches:
            try:
                pkgbase = get_pkg_or_base(matches.group(1))
                context = pkgbaseutil.make_context(request, pkgbase)
            except HTTPException:
                pass

    try:
        return render_template(request, f"errors/{exc.status_code}.html",
                               context, exc.status_code)
    except TemplateNotFound:
        return render_template(request, "errors/detail.html",
                               context, exc.status_code)


@app.middleware("http")
async def add_security_headers(request: Request, call_next: typing.Callable):
    """ This middleware adds the CSP, XCTO, XFO and RP security
    headers to the HTTP response associated with request.

    CSP: Content-Security-Policy
    XCTO: X-Content-Type-Options
    RP: Referrer-Policy
    XFO: X-Frame-Options
    """
    try:
        response = await util.error_or_result(call_next, request)
    except Exception as exc:
        return await internal_server_error(request, exc)

    # Add CSP header.
    nonce = request.user.nonce
    csp = "default-src 'self'; "
    script_hosts = []
    csp += f"script-src 'self' 'nonce-{nonce}' " + ' '.join(script_hosts)
    # It's fine if css is inlined.
    csp += "; style-src 'self' 'unsafe-inline'"
    response.headers["Content-Security-Policy"] = csp

    # Add XCTO header.
    xcto = "nosniff"
    response.headers["X-Content-Type-Options"] = xcto

    # Add Referrer Policy header.
    rp = "same-origin"
    response.headers["Referrer-Policy"] = rp

    # Add X-Frame-Options header.
    xfo = "SAMEORIGIN"
    response.headers["X-Frame-Options"] = xfo

    return response


@app.middleware("http")
async def check_terms_of_service(request: Request, call_next: typing.Callable):
    """ This middleware function redirects authenticated users if they
    have any outstanding Terms to agree to. """
    if request.user.is_authenticated() and request.url.path != "/tos":
        unaccepted = query(Term).join(AcceptedTerm).filter(
            or_(AcceptedTerm.UsersID != request.user.ID,
                and_(AcceptedTerm.UsersID == request.user.ID,
                     AcceptedTerm.TermsID == Term.ID,
                     AcceptedTerm.Revision < Term.Revision)))
        if query(Term).count() > unaccepted.count():
            return RedirectResponse(
                "/tos", status_code=int(http.HTTPStatus.SEE_OTHER))

    return await util.error_or_result(call_next, request)


@app.middleware("http")
async def id_redirect_middleware(request: Request, call_next: typing.Callable):
    id = request.query_params.get("id")

    if id is not None:
        # Preserve query string.
        qs = []
        for k, v in request.query_params.items():
            if k != "id":
                qs.append(f"{k}={quote_plus(str(v))}")
        qs = str() if not qs else '?' + '&'.join(qs)

        path = request.url.path.rstrip('/')
        return RedirectResponse(f"{path}/{id}{qs}")

    return await util.error_or_result(call_next, request)
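The query-string handling in `id_redirect_middleware` can be exercised in isolation. This is a minimal, dependency-free sketch of that logic; the `build_redirect` helper is invented here for illustration and is not part of aurweb:

```python
from urllib.parse import quote_plus


def build_redirect(path: str, params: dict) -> str:
    """Mirror id_redirect_middleware: move ?id=X into the path and
    preserve every other query parameter, in order."""
    id = params.get("id")
    if id is None:
        return path
    # Re-encode every parameter except "id" into the query string.
    qs = [f"{k}={quote_plus(str(v))}" for k, v in params.items() if k != "id"]
    suffix = "" if not qs else "?" + "&".join(qs)
    return f"{path.rstrip('/')}/{id}{suffix}"


print(build_redirect("/packages/", {"id": "foo", "K": "10"}))
# /packages/foo?K=10
```

This shows why the middleware iterates the parameters itself rather than forwarding the raw query string: the `id` key must be dropped while everything else survives the redirect.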
226
aurweb/auth/__init__.py
Normal file
226
aurweb/auth/__init__.py
Normal file
|
@ -0,0 +1,226 @@
|
||||||
|
import functools
|
||||||
|
|
||||||
|
from http import HTTPStatus
|
||||||
|
from typing import Callable
|
||||||
|
|
||||||
|
import fastapi
|
||||||
|
|
||||||
|
from fastapi import HTTPException
|
||||||
|
from fastapi.responses import RedirectResponse
|
||||||
|
from starlette.authentication import AuthCredentials, AuthenticationBackend
|
||||||
|
from starlette.requests import HTTPConnection
|
||||||
|
|
||||||
|
import aurweb.config
|
||||||
|
|
||||||
|
from aurweb import db, filters, l10n, time, util
|
||||||
|
from aurweb.models import Session, User
|
||||||
|
from aurweb.models.account_type import ACCOUNT_TYPE_ID
|
||||||
|
|
||||||
|
|
||||||
|
class StubQuery:
|
||||||
|
""" Acts as a stubbed version of an orm.Query. Typically used
|
||||||
|
to masquerade fake records for an AnonymousUser. """
|
||||||
|
|
||||||
|
def filter(self, *args):
|
||||||
|
return StubQuery()
|
||||||
|
|
||||||
|
def scalar(self):
|
||||||
|
return 0
|
||||||
|
|
||||||
|
|
||||||
|
class AnonymousUser:
|
||||||
|
""" A stubbed User class used when an unauthenticated User
|
||||||
|
makes a request against FastAPI. """
|
||||||
|
# Stub attributes used to mimic a real user.
|
||||||
|
ID = 0
|
||||||
|
|
||||||
|
class AccountType:
|
||||||
|
""" A stubbed AccountType static class. In here, we use an ID
|
||||||
|
and AccountType which do not exist in our constant records.
|
||||||
|
All records primary keys (AccountType.ID) should be non-zero,
|
||||||
|
so using a zero here means that we'll never match against a
|
||||||
|
real AccountType. """
|
||||||
|
ID = 0
|
||||||
|
AccountType = "Anonymous"
|
||||||
|
|
||||||
|
# AccountTypeID == AccountType.ID; assign a stubbed column.
|
||||||
|
AccountTypeID = AccountType.ID
|
||||||
|
|
||||||
|
LangPreference = aurweb.config.get("options", "default_lang")
|
||||||
|
Timezone = aurweb.config.get("options", "default_timezone")
|
||||||
|
|
||||||
|
Suspended = 0
|
||||||
|
InactivityTS = 0
|
||||||
|
|
||||||
|
# A stub ssh_pub_key relationship.
|
||||||
|
ssh_pub_key = None
|
||||||
|
|
||||||
|
# Add stubbed relationship backrefs.
|
||||||
|
notifications = StubQuery()
|
||||||
|
package_votes = StubQuery()
|
||||||
|
|
||||||
|
# A nonce attribute, needed for all browser sessions; set in __init__.
|
||||||
|
nonce = None
|
||||||
|
|
||||||
|
def __init__(self):
|
||||||
|
self.nonce = util.make_nonce()
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def is_authenticated():
|
||||||
|
return False
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def is_trusted_user():
|
||||||
|
return False
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def is_developer():
|
||||||
|
return False
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def is_elevated():
|
||||||
|
return False
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def has_credential(credential, **kwargs):
|
||||||
|
return False
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def voted_for(package):
|
||||||
|
return False
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def notified(package):
|
||||||
|
return False
|
||||||
|
|
||||||
|
|
||||||
|
class BasicAuthBackend(AuthenticationBackend):
|
||||||
|
async def authenticate(self, conn: HTTPConnection):
|
||||||
|
unauthenticated = (None, AnonymousUser())
|
||||||
|
sid = conn.cookies.get("AURSID")
|
||||||
|
if not sid:
|
||||||
|
return unauthenticated
|
||||||
|
|
||||||
|
timeout = aurweb.config.getint("options", "login_timeout")
|
||||||
|
remembered = ("AURREMEMBER" in conn.cookies
|
||||||
|
and bool(conn.cookies.get("AURREMEMBER")))
|
||||||
|
if remembered:
|
||||||
|
timeout = aurweb.config.getint("options",
|
||||||
|
"persistent_cookie_timeout")
|
||||||
|
|
||||||
|
# If no session with sid and a LastUpdateTS now or later exists.
|
||||||
|
now_ts = time.utcnow()
|
||||||
|
record = db.query(Session).filter(Session.SessionID == sid).first()
|
||||||
|
if not record:
|
||||||
|
return unauthenticated
|
||||||
|
elif record.LastUpdateTS < (now_ts - timeout):
|
||||||
|
with db.begin():
|
||||||
|
db.delete_all([record])
|
||||||
|
return unauthenticated
|
||||||
|
|
||||||
|
# At this point, we cannot have an invalid user if the record
|
||||||
|
# exists, due to ForeignKey constraints in the schema upheld
|
||||||
|
# by mysqlclient.
|
||||||
|
with db.begin():
|
||||||
|
user = db.query(User).filter(User.ID == record.UsersID).first()
|
||||||
|
user.nonce = util.make_nonce()
|
||||||
|
user.authenticated = True
|
||||||
|
|
||||||
|
return (AuthCredentials(["authenticated"]), user)
|
||||||
|
|
||||||
|
|
||||||
|
def _auth_required(auth_goal: bool = True):
|
||||||
|
"""
|
||||||
|
Enforce a user's authentication status, bringing them to the login page
|
||||||
|
or homepage if their authentication status does not match the goal.
|
||||||
|
|
||||||
|
NOTE: This function should not need to be used in downstream code.
|
||||||
|
See `requires_auth` and `requires_guest` for decorators meant to be
|
||||||
|
used on routes (they're a bit more implicitly understandable).
|
||||||
|
|
||||||
|
:param auth_goal: Whether authentication is required or entirely disallowed
|
||||||
|
for a user to perform this request.
|
||||||
|
:return: Return the FastAPI function this decorator wraps.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def decorator(func):
|
||||||
|
@functools.wraps(func)
|
||||||
|
async def wrapper(request, *args, **kwargs):
|
||||||
|
if request.user.is_authenticated() == auth_goal:
|
||||||
|
return await func(request, *args, **kwargs)
|
||||||
|
|
||||||
|
url = "/"
|
||||||
|
if auth_goal is False:
|
||||||
|
return RedirectResponse(url, status_code=int(HTTPStatus.SEE_OTHER))
|
||||||
|
|
||||||
|
# Use the request path when the user can visit a page directly but
|
||||||
|
# is not authenticated and use the Referer header if visiting the
|
||||||
|
# page itself is not directly possible (e.g. submitting a form).
|
||||||
|
if request.method in ("GET", "HEAD"):
|
||||||
|
url = request.url.path
|
||||||
|
elif (referer := request.headers.get("Referer")):
|
||||||
|
aur = aurweb.config.get("options", "aur_location") + "/"
|
||||||
|
if not referer.startswith(aur):
|
||||||
|
_ = l10n.get_translator_for_request(request)
|
||||||
|
raise HTTPException(status_code=HTTPStatus.BAD_REQUEST,
|
||||||
|
detail=_("Bad Referer header."))
|
||||||
|
url = referer[len(aur) - 1:]
|
||||||
|
url = "/login?" + filters.urlencode({"next": url})
|
||||||
|
return RedirectResponse(url, status_code=int(HTTPStatus.SEE_OTHER))
|
||||||
|
return wrapper
|
||||||
|
|
||||||
|
return decorator
|
||||||
|
|
||||||
|
|
||||||
|
def requires_auth(func: Callable) -> Callable:
    """ Require an authenticated session for a particular route. """

    @functools.wraps(func)
    async def wrapper(*args, **kwargs):
        return await _auth_required(True)(func)(*args, **kwargs)
    return wrapper


def requires_guest(func: Callable) -> Callable:
    """ Require a guest (unauthenticated) session for a particular route. """

    @functools.wraps(func)
    async def wrapper(*args, **kwargs):
        return await _auth_required(False)(func)(*args, **kwargs)
    return wrapper


def account_type_required(one_of: set):
    """ A decorator that can be used on FastAPI routes to dictate
    that a user belongs to one of the types defined in one_of.

    This decorator should be run after an @auth_required(True) is
    dictated.

    - Example code:

    @router.get('/some_route')
    @auth_required(True)
    @account_type_required({"Trusted User", "Trusted User & Developer"})
    async def some_route(request: fastapi.Request):
        return Response()

    :param one_of: A set consisting of strings to match against AccountType.
    :return: Return the FastAPI function this decorator wraps.
    """
    # Convert any account type string constants to their integer IDs.
    one_of = {
        ACCOUNT_TYPE_ID[atype]
        for atype in one_of
        if isinstance(atype, str)
    }

    def decorator(func):
        @functools.wraps(func)
        async def wrapper(request: fastapi.Request, *args, **kwargs):
            if request.user.AccountTypeID not in one_of:
                return RedirectResponse("/",
                                        status_code=int(HTTPStatus.SEE_OTHER))
            return await func(request, *args, **kwargs)
        return wrapper
    return decorator
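The decorator chain above is easiest to understand in isolation. The sketch below re-creates the same wrapper pattern with a made-up `_gate` factory standing in for aurweb's `_auth_required` (which is not reproduced here); the names and the session dict are illustrative assumptions, not aurweb's API.

```python
import asyncio
import functools
from typing import Callable


def _gate(expect_authed: bool) -> Callable:
    # Stand-in for aurweb's _auth_required factory: returns a decorator
    # that checks a fake session flag before calling the route.
    def decorator(func):
        @functools.wraps(func)
        async def wrapper(session: dict, *args, **kwargs):
            if bool(session.get("authenticated")) != expect_authed:
                return "redirect:/"
            return await func(session, *args, **kwargs)
        return wrapper
    return decorator


def requires_auth(func: Callable) -> Callable:
    # Same shape as the requires_auth above: wrap, then delegate.
    @functools.wraps(func)
    async def wrapper(*args, **kwargs):
        return await _gate(True)(func)(*args, **kwargs)
    return wrapper


@requires_auth
async def dashboard(session: dict) -> str:
    return "ok"


print(asyncio.run(dashboard({"authenticated": True})))  # ok
print(asyncio.run(dashboard({})))                       # redirect:/
```

Authenticated sessions fall through to the route; guests get bounced, mirroring the `_auth_required(True)` / `_auth_required(False)` split.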

aurweb/auth/creds.py (new file)
@@ -0,0 +1,76 @@
from aurweb.models.account_type import DEVELOPER_ID, TRUSTED_USER_AND_DEV_ID, TRUSTED_USER_ID, USER_ID
from aurweb.models.user import User

ACCOUNT_CHANGE_TYPE = 1
ACCOUNT_EDIT = 2
ACCOUNT_EDIT_DEV = 3
ACCOUNT_LAST_LOGIN = 4
ACCOUNT_SEARCH = 5
ACCOUNT_LIST_COMMENTS = 28
COMMENT_DELETE = 6
COMMENT_UNDELETE = 27
COMMENT_VIEW_DELETED = 22
COMMENT_EDIT = 25
COMMENT_PIN = 26
PKGBASE_ADOPT = 7
PKGBASE_SET_KEYWORDS = 8
PKGBASE_DELETE = 9
PKGBASE_DISOWN = 10
PKGBASE_EDIT_COMAINTAINERS = 24
PKGBASE_FLAG = 11
PKGBASE_LIST_VOTERS = 12
PKGBASE_NOTIFY = 13
PKGBASE_UNFLAG = 15
PKGBASE_VOTE = 16
PKGREQ_FILE = 23
PKGREQ_CLOSE = 17
PKGREQ_LIST = 18
TU_ADD_VOTE = 19
TU_LIST_VOTES = 20
TU_VOTE = 21
PKGBASE_MERGE = 29

user_developer_or_trusted_user = set([USER_ID, TRUSTED_USER_ID, DEVELOPER_ID, TRUSTED_USER_AND_DEV_ID])
trusted_user_or_dev = set([TRUSTED_USER_ID, DEVELOPER_ID, TRUSTED_USER_AND_DEV_ID])
developer = set([DEVELOPER_ID, TRUSTED_USER_AND_DEV_ID])
trusted_user = set([TRUSTED_USER_ID, TRUSTED_USER_AND_DEV_ID])

cred_filters = {
    PKGBASE_FLAG: user_developer_or_trusted_user,
    PKGBASE_NOTIFY: user_developer_or_trusted_user,
    PKGBASE_VOTE: user_developer_or_trusted_user,
    PKGREQ_FILE: user_developer_or_trusted_user,
    ACCOUNT_CHANGE_TYPE: trusted_user_or_dev,
    ACCOUNT_EDIT: trusted_user_or_dev,
    ACCOUNT_LAST_LOGIN: trusted_user_or_dev,
    ACCOUNT_LIST_COMMENTS: trusted_user_or_dev,
    ACCOUNT_SEARCH: trusted_user_or_dev,
    COMMENT_DELETE: trusted_user_or_dev,
    COMMENT_UNDELETE: trusted_user_or_dev,
    COMMENT_VIEW_DELETED: trusted_user_or_dev,
    COMMENT_EDIT: trusted_user_or_dev,
    COMMENT_PIN: trusted_user_or_dev,
    PKGBASE_ADOPT: trusted_user_or_dev,
    PKGBASE_SET_KEYWORDS: trusted_user_or_dev,
    PKGBASE_DELETE: trusted_user_or_dev,
    PKGBASE_EDIT_COMAINTAINERS: trusted_user_or_dev,
    PKGBASE_DISOWN: trusted_user_or_dev,
    PKGBASE_LIST_VOTERS: trusted_user_or_dev,
    PKGBASE_UNFLAG: trusted_user_or_dev,
    PKGREQ_CLOSE: trusted_user_or_dev,
    PKGREQ_LIST: trusted_user_or_dev,
    TU_ADD_VOTE: trusted_user,
    TU_LIST_VOTES: trusted_user_or_dev,
    TU_VOTE: trusted_user,
    ACCOUNT_EDIT_DEV: developer,
    PKGBASE_MERGE: trusted_user_or_dev,
}


def has_credential(user: User,
                   credential: int,
                   approved_users: list = tuple()):
    if user in approved_users:
        return True
    return user.AccountTypeID in cred_filters[credential]
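The credential check at the heart of `has_credential()` is a set-membership test. A minimal standalone sketch, using made-up account-type IDs in place of the constants from `aurweb.models.account_type`:

```python
# Hypothetical account type IDs standing in for aurweb's constants.
USER_ID, TRUSTED_USER_ID, DEVELOPER_ID, TU_AND_DEV_ID = 1, 2, 3, 4

PKGBASE_DELETE = 9
cred_filters = {
    # Only TUs and Developers may delete a package base.
    PKGBASE_DELETE: {TRUSTED_USER_ID, DEVELOPER_ID, TU_AND_DEV_ID},
}


def has_credential(account_type_id: int, credential: int) -> bool:
    # Mirrors the membership test at the end of has_credential() above.
    return account_type_id in cred_filters[credential]


print(has_credential(TRUSTED_USER_ID, PKGBASE_DELETE))  # True
print(has_credential(USER_ID, PKGBASE_DELETE))          # False
```

Keeping the credential-to-role mapping in one dict is what lets the PHP-era scattered permission checks collapse into a single lookup.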

aurweb/benchmark.py (new file)
@@ -0,0 +1,21 @@
from datetime import datetime


class Benchmark:
    def __init__(self):
        self.start()

    def _timestamp(self) -> float:
        """ Generate a timestamp. """
        return float(datetime.utcnow().timestamp())

    def start(self) -> int:
        """ Start a benchmark. """
        self.current = self._timestamp()
        return self.current

    def end(self):
        """ Return the diff between now - start(). """
        n = self._timestamp() - self.current
        self.current = float(0)
        return n
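Usage of the class above is a start/end pair around the work being measured. The sketch re-declares the class so it runs standalone; it is the same code as the file, not an addition to it.

```python
from datetime import datetime


class Benchmark:
    def __init__(self):
        self.start()

    def _timestamp(self) -> float:
        return float(datetime.utcnow().timestamp())

    def start(self) -> float:
        self.current = self._timestamp()
        return self.current

    def end(self) -> float:
        # Elapsed seconds since start(); resets the stored timestamp.
        n = self._timestamp() - self.current
        self.current = float(0)
        return n


bench = Benchmark()
# ... the work being timed would go here ...
elapsed = bench.end()
print(elapsed >= 0.0)  # True
```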

aurweb/cache.py (new file)
@@ -0,0 +1,20 @@
from redis import Redis
from sqlalchemy import orm


async def db_count_cache(redis: Redis, key: str, query: orm.Query,
                         expire: int = None) -> int:
    """ Store and retrieve a query.count() via redis cache.

    :param redis: Redis handle
    :param key: Redis key
    :param query: SQLAlchemy ORM query
    :param expire: Optional expiration in seconds
    :return: query.count()
    """
    result = redis.get(key)
    if result is None:
        redis.set(key, (result := int(query.count())))
        if expire:
            redis.expire(key, expire)
    return int(result)
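`db_count_cache()` is a read-through cache: the database is only counted on a miss. The same pattern with a plain dict standing in for Redis and a hypothetical `FakeQuery` standing in for an ORM query:

```python
class FakeQuery:
    """ Hypothetical stand-in for an SQLAlchemy Query. """
    def __init__(self, n: int):
        self.n = n
        self.calls = 0

    def count(self) -> int:
        self.calls += 1
        return self.n


cache = {}


def count_cache(key: str, query: FakeQuery) -> int:
    # Read-through: only hit the "database" on a cache miss.
    result = cache.get(key)
    if result is None:
        cache[key] = result = int(query.count())
    return int(result)


q = FakeQuery(42)
print(count_cache("pkg_count", q))  # 42, counted once
print(count_cache("pkg_count", q))  # 42, served from cache
```

The optional `expire` in the real function is what keeps homepage counters fresh without recounting on every request.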

aurweb/captcha.py (new file)
@@ -0,0 +1,57 @@
""" This module consists of aurweb's CAPTCHA utility functions and filters. """
import hashlib

from jinja2 import pass_context

from aurweb.db import query
from aurweb.models import User
from aurweb.templates import register_filter


def get_captcha_salts():
    """ Produce salts based on the current user count. """
    count = query(User).count()
    salts = []
    for i in range(0, 6):
        salts.append(f"aurweb-{count - i}")
    return salts


def get_captcha_token(salt):
    """ Produce a token for the CAPTCHA salt. """
    return hashlib.md5(salt.encode()).hexdigest()[:3]


def get_captcha_challenge(salt):
    """ Get a CAPTCHA challenge string (shell command) for a salt. """
    token = get_captcha_token(salt)
    return f"LC_ALL=C pacman -V|sed -r 's#[0-9]+#{token}#g'|md5sum|cut -c1-6"


def get_captcha_answer(token):
    """ Compute the answer via md5 of the real template text, return the
    first six digits of the hexadecimal hash. """
    text = r"""
 .--.                  Pacman v%s.%s.%s - libalpm v%s.%s.%s
/ _.-' .-.  .-.  .-.   Copyright (C) %s-%s Pacman Development Team
\  '-. '-'  '-'  '-'   Copyright (C) %s-%s Judd Vinet
 '--'
                       This program may be freely redistributed under
                       the terms of the GNU General Public License.
""" % tuple([token] * 10)
    return hashlib.md5((text + "\n").encode()).hexdigest()[:6]


@register_filter("captcha_salt")
@pass_context
def captcha_salt_filter(context):
    """ Returns the most recent CAPTCHA salt in the list of salts. """
    salts = get_captcha_salts()
    return salts[0]


@register_filter("captcha_cmdline")
@pass_context
def captcha_cmdline_filter(context, salt):
    """ Returns a CAPTCHA challenge for a given salt. """
    return get_captcha_challenge(salt)
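The three-character token derivation used by the challenge is easy to check in isolation (the salt value below is an example, not one a live site would necessarily produce):

```python
import hashlib


def get_captcha_token(salt: str) -> str:
    # First three hex digits of the salt's MD5, as in captcha.py above.
    return hashlib.md5(salt.encode()).hexdigest()[:3]


token = get_captcha_token("aurweb-100")
print(len(token))  # 3
```

The challenge string then asks the registrant to run a `pacman -V | sed | md5sum` pipeline locally; the server computes the same answer from its banner template.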
@@ -1,6 +1,13 @@
 import configparser
 import os
+
+from typing import Any
+
+# Publicly visible version of aurweb. This is used to display
+# aurweb versioning in the footer and must be maintained.
+# Todo: Make this dynamic/automated.
+AURWEB_VERSION = "v5.0.0"

 _parser = None

@@ -12,6 +19,7 @@ def _get_parser():
         defaults = os.environ.get('AUR_CONFIG_DEFAULTS', path + '.defaults')

         _parser = configparser.RawConfigParser()
+        _parser.optionxform = lambda option: option
         if os.path.isfile(defaults):
             with open(defaults) as f:
                 _parser.read_file(f)

@@ -20,6 +28,17 @@ def _get_parser():
     return _parser


+def rehash():
+    """ Globally rehash the configuration parser. """
+    global _parser
+    _parser = None
+    _get_parser()
+
+
+def get_with_fallback(section, option, fallback):
+    return _get_parser().get(section, option, fallback=fallback)
+
+
 def get(section, option):
     return _get_parser().get(section, option)

@@ -28,5 +47,25 @@ def getboolean(section, option):
     return _get_parser().getboolean(section, option)


-def getint(section, option):
-    return _get_parser().getint(section, option)
+def getint(section, option, fallback=None):
+    return _get_parser().getint(section, option, fallback=fallback)
+
+
+def get_section(section):
+    if section in _get_parser().sections():
+        return _get_parser()[section]
+
+
+def unset_option(section: str, option: str) -> None:
+    _get_parser().remove_option(section, option)
+
+
+def set_option(section: str, option: str, value: Any) -> None:
+    _get_parser().set(section, option, value)
+    return value
+
+
+def save() -> None:
+    aur_config = os.environ.get("AUR_CONFIG", "/etc/aurweb/config")
+    with open(aur_config, "w") as fp:
+        _get_parser().write(fp)
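The new `get_with_fallback` helper is a thin pass-through to configparser's `fallback` keyword, which the standard library has supported since Python 3.2. A standalone check of that behavior:

```python
import configparser

parser = configparser.RawConfigParser()
parser.read_string("""
[database]
user = aur
""")


def get_with_fallback(section, option, fallback):
    # Same shape as the helper added to the config module above.
    return parser.get(section, option, fallback=fallback)


print(get_with_fallback("database", "user", None))      # aur
print(get_with_fallback("database", "password", None))  # None
```

This is what lets the new database URL builder treat `password` and `port` as optional without try/except around every lookup.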

aurweb/cookies.py (new file)
@@ -0,0 +1,68 @@
from fastapi import Request
from fastapi.responses import Response

from aurweb import config


def samesite() -> str:
    """ Produce cookie SameSite value based on options.disable_http_login.

    When options.disable_http_login is True, "strict" is returned. Otherwise,
    "lax" is returned.

    :returns "strict" if options.disable_http_login else "lax"
    """
    secure = config.getboolean("options", "disable_http_login")
    return "strict" if secure else "lax"


def timeout(extended: bool) -> int:
    """ Produce a session timeout based on `remember_me`.

    This method returns one of AUR_CONFIG's options.persistent_cookie_timeout
    and options.login_timeout based on the `extended` argument.

    The `extended` argument is typically the value of the AURREMEMBER
    cookie, defaulted to False.

    If `extended` is False, options.login_timeout is returned. Otherwise,
    if `extended` is True, options.persistent_cookie_timeout is returned.

    :param extended: Flag which generates an extended timeout when True
    :returns: Cookie timeout based on configuration options
    """
    timeout = config.getint("options", "login_timeout")
    if bool(extended):
        timeout = config.getint("options", "persistent_cookie_timeout")
    return timeout


def update_response_cookies(request: Request, response: Response,
                            aurtz: str = None, aurlang: str = None,
                            aursid: str = None) -> Response:
    """ Update session cookies. This method is particularly useful
    when updating a cookie which was already set.

    The AURSID cookie's expiration is based on the AURREMEMBER cookie,
    which is retrieved from `request`.

    :param request: FastAPI request
    :param response: FastAPI response
    :param aurtz: Optional AURTZ cookie value
    :param aurlang: Optional AURLANG cookie value
    :param aursid: Optional AURSID cookie value
    :returns: Updated response
    """
    secure = config.getboolean("options", "disable_http_login")
    if aurtz:
        response.set_cookie("AURTZ", aurtz, secure=secure, httponly=secure,
                            samesite=samesite())
    if aurlang:
        response.set_cookie("AURLANG", aurlang, secure=secure, httponly=secure,
                            samesite=samesite())
    if aursid:
        remember_me = bool(request.cookies.get("AURREMEMBER", False))
        response.set_cookie("AURSID", aursid, secure=secure, httponly=secure,
                            max_age=timeout(remember_me),
                            samesite=samesite())
    return response
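The `timeout()` selection can be sketched without a config file; the two constants below are assumed stand-ins for `options.login_timeout` and `options.persistent_cookie_timeout`, not aurweb's shipped defaults:

```python
LOGIN_TIMEOUT = 7200                 # assumed stand-in value (seconds)
PERSISTENT_COOKIE_TIMEOUT = 604800   # assumed stand-in value (seconds)


def timeout(extended: bool) -> int:
    # Mirrors cookies.timeout(): "remember me" sessions get the long value.
    if bool(extended):
        return PERSISTENT_COOKIE_TIMEOUT
    return LOGIN_TIMEOUT


print(timeout(False))  # 7200
print(timeout(True))   # 604800
```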

aurweb/db.py
@@ -1,37 +1,205 @@
-try:
-    import mysql.connector
-except ImportError:
-    pass
-
-try:
-    import sqlite3
-except ImportError:
-    pass
+import functools
+import hashlib
+import math
+import os
+import re
+
+from typing import Iterable, NewType
+
+import sqlalchemy
+
+from sqlalchemy import create_engine, event
+from sqlalchemy.engine.base import Engine
+from sqlalchemy.engine.url import URL
+from sqlalchemy.orm import Query, Session, SessionTransaction, scoped_session, sessionmaker

 import aurweb.config
+import aurweb.util

-engine = None  # See get_engine
+DRIVERS = {
+    "mysql": "mysql+mysqldb"
+}
+
+# Some types we don't get access to in this module.
+Base = NewType("Base", "aurweb.models.declarative_base.Base")


-def get_sqlalchemy_url():
-    """
-    Build an SQLAlchemy for use with create_engine based on the aurweb configuration.
-    """
-    import sqlalchemy
+def make_random_value(table: str, column: str, length: int):
+    """ Generate a unique, random value for a string column in a table.
+
+    :return: A unique string that is not in the database
+    """
+    string = aurweb.util.make_random_string(length)
+    while query(table).filter(column == string).first():
+        string = aurweb.util.make_random_string(length)
+    return string
+
+
+def test_name() -> str:
+    """
+    Return the unhashed database name.
+
+    The unhashed database name is determined (lower = higher priority) by:
+    -------------------------------------------
+    1. {test_suite} portion of PYTEST_CURRENT_TEST
+    2. aurweb.config.get("database", "name")
+
+    During `pytest` runs, the PYTEST_CURRENT_TEST environment variable
+    is set to the current test in the format `{test_suite}::{test_func}`.
+
+    This allows tests to use a suite-specific database for its runs,
+    which decouples database state from test suites.
+
+    :return: Unhashed database name
+    """
+    db = os.environ.get("PYTEST_CURRENT_TEST",
+                        aurweb.config.get("database", "name"))
+    return db.split(":")[0]
+
+
+def name() -> str:
+    """
+    Return sanitized database name that can be used for tests or production.
+
+    If test_name() starts with "test/", the database name is SHA-1 hashed,
+    prefixed with 'db', and returned. Otherwise, test_name() is passed
+    through and not hashed at all.
+
+    :return: SHA1-hashed database name prefixed with 'db'
+    """
+    dbname = test_name()
+    if not dbname.startswith("test/"):
+        return dbname
+    sha1 = hashlib.sha1(dbname.encode()).hexdigest()
+    return "db" + sha1
+
+
+# Module-private global memo used to store SQLAlchemy sessions.
+_sessions = dict()
+
+
+def get_session(engine: Engine = None) -> Session:
+    """ Return aurweb.db's global session. """
+    dbname = name()
+
+    global _sessions
+    if dbname not in _sessions:
+
+        if not engine:  # pragma: no cover
+            engine = get_engine()
+
+        Session = scoped_session(
+            sessionmaker(autocommit=True, autoflush=False, bind=engine))
+        _sessions[dbname] = Session()
+
+    return _sessions.get(dbname)
+
+
+def pop_session(dbname: str) -> None:
+    """
+    Pop a Session out of the private _sessions memo.
+
+    :param dbname: Database name
+    :raises KeyError: When `dbname` does not exist in the memo
+    """
+    global _sessions
+    _sessions.pop(dbname)
+
+
+def refresh(model: Base) -> Base:
+    """ Refresh the session's knowledge of `model`. """
+    get_session().refresh(model)
+    return model
+
+
+def query(Model: Base, *args, **kwargs) -> Query:
+    """
+    Perform an ORM query against the database session.
+
+    This method also runs Query.filter on the resulting model
+    query with *args and **kwargs.
+
+    :param Model: Declarative ORM class
+    """
+    return get_session().query(Model).filter(*args, **kwargs)
+
+
+def create(Model: Base, *args, **kwargs) -> Base:
+    """
+    Create a record and add() it to the database session.
+
+    :param Model: Declarative ORM class
+    :return: Model instance
+    """
+    instance = Model(*args, **kwargs)
+    return add(instance)
+
+
+def delete(model: Base) -> None:
+    """
+    Delete a set of records found by Query.filter(*args, **kwargs).
+
+    :param Model: Declarative ORM class
+    """
+    get_session().delete(model)
+
+
+def delete_all(iterable: Iterable) -> None:
+    """ Delete each instance found in `iterable`. """
+    session_ = get_session()
+    aurweb.util.apply_all(iterable, session_.delete)
+
+
+def rollback() -> None:
+    """ Rollback the database session. """
+    get_session().rollback()
+
+
+def add(model: Base) -> Base:
+    """ Add `model` to the database session. """
+    get_session().add(model)
+    return model
+
+
+def begin() -> SessionTransaction:
+    """ Begin an SQLAlchemy SessionTransaction. """
+    return get_session().begin()
+
+
+def get_sqlalchemy_url() -> URL:
+    """
+    Build an SQLAlchemy URL for use with create_engine.
+
+    :return: sqlalchemy.engine.url.URL
+    """
+    constructor = URL
+
+    parts = sqlalchemy.__version__.split('.')
+    major = int(parts[0])
+    minor = int(parts[1])
+    if major == 1 and minor >= 4:  # pragma: no cover
+        constructor = URL.create

     aur_db_backend = aurweb.config.get('database', 'backend')
     if aur_db_backend == 'mysql':
-        return sqlalchemy.engine.url.URL(
-            'mysql+mysqlconnector',
+        param_query = {}
+        port = aurweb.config.get_with_fallback("database", "port", None)
+        if not port:
+            param_query["unix_socket"] = aurweb.config.get(
+                "database", "socket")
+
+        return constructor(
+            DRIVERS.get(aur_db_backend),
             username=aurweb.config.get('database', 'user'),
-            password=aurweb.config.get('database', 'password'),
+            password=aurweb.config.get_with_fallback('database', 'password',
+                                                     fallback=None),
             host=aurweb.config.get('database', 'host'),
-            database=aurweb.config.get('database', 'name'),
-            query={
-                'unix_socket': aurweb.config.get('database', 'socket'),
-            },
+            database=name(),
+            port=port,
+            query=param_query
         )
     elif aur_db_backend == 'sqlite':
-        return sqlalchemy.engine.url.URL(
+        return constructor(
             'sqlite',
             database=aurweb.config.get('database', 'name'),
        )
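The test-database naming scheme in `name()` can be verified standalone; only `test/...` names are hashed, so production names pass through untouched:

```python
import hashlib


def db_name(unhashed: str) -> str:
    # Same rule as aurweb.db.name(): only "test/..." names are hashed.
    if not unhashed.startswith("test/"):
        return unhashed
    return "db" + hashlib.sha1(unhashed.encode()).hexdigest()


print(db_name("aurweb"))                 # aurweb
hashed = db_name("test/test_foo.py")
print(hashed.startswith("db"), len(hashed))  # True 42
```

The `db` prefix matters because a raw SHA-1 hex digest can start with a digit, which some database servers reject as a schema name.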
@@ -39,26 +207,83 @@ def get_sqlalchemy_url():
         raise ValueError('unsupported database backend')


-def get_engine():
+def sqlite_regexp(regex, item) -> bool:  # pragma: no cover
+    """ Method which mimics SQL's REGEXP for SQLite. """
+    return bool(re.search(regex, str(item)))
+
+
+def setup_sqlite(engine: Engine) -> None:  # pragma: no cover
+    """ Perform setup for an SQLite engine. """
+    @event.listens_for(engine, "connect")
+    def do_begin(conn, record):
+        create_deterministic_function = functools.partial(
+            conn.create_function,
+            deterministic=True
+        )
+        create_deterministic_function("REGEXP", 2, sqlite_regexp)
+
+
+# Module-private global memo used to store SQLAlchemy engines.
+_engines = dict()
+
+
+def get_engine(dbname: str = None, echo: bool = False) -> Engine:
     """
-    Return the global SQLAlchemy engine.
+    Return the SQLAlchemy engine for `dbname`.

     The engine is created on the first call to get_engine and then stored in the
     `engine` global variable for the next calls.
-    """
-    from sqlalchemy import create_engine
-    global engine
-    if engine is None:
-        connect_args = dict()
-        if aurweb.config.get("database", "backend") == "sqlite":
-            # check_same_thread is for a SQLite technicality
-            # https://fastapi.tiangolo.com/tutorial/sql-databases/#note
-            connect_args["check_same_thread"] = False
-        engine = create_engine(get_sqlalchemy_url(), connect_args=connect_args)
-        Session = sessionmaker(autocommit=False, autoflush=False, bind=engine)
-        session = Session()

-    return engine
+    :param dbname: Database name (default: aurweb.db.name())
+    :param echo: Flag passed through to sqlalchemy.create_engine
+    :return: SQLAlchemy Engine instance
+    """
+    if not dbname:
+        dbname = name()
+
+    global _engines
+    if dbname not in _engines:
+        db_backend = aurweb.config.get("database", "backend")
+        connect_args = dict()
+
+        is_sqlite = bool(db_backend == "sqlite")
+        if is_sqlite:  # pragma: no cover
+            connect_args["check_same_thread"] = False
+
+        kwargs = {
+            "echo": echo,
+            "connect_args": connect_args
+        }
+        _engines[dbname] = create_engine(get_sqlalchemy_url(), **kwargs)
+
+        if is_sqlite:  # pragma: no cover
+            setup_sqlite(_engines.get(dbname))
+
+    return _engines.get(dbname)
+
+
+def pop_engine(dbname: str) -> None:
+    """
+    Pop an Engine out of the private _engines memo.
+
+    :param dbname: Database name
+    :raises KeyError: When `dbname` does not exist in the memo
+    """
+    global _engines
+    _engines.pop(dbname)
+
+
+def kill_engine() -> None:
+    """ Close the current session and dispose of the engine. """
+    dbname = name()
+
+    session = get_session()
+    session.close()
+    pop_session(dbname)
+
+    engine = get_engine()
+    engine.dispose()
+    pop_engine(dbname)
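The `_engines` memo replaces the old single `engine` global with a keyed singleton: one engine per database name, created lazily on first use. The pattern in miniature, with `object()` standing in for `create_engine(...)`:

```python
_engines = {}


def get_engine(dbname: str):
    # Create on first use, then reuse, like aurweb.db.get_engine().
    if dbname not in _engines:
        _engines[dbname] = object()  # stand-in for create_engine(...)
    return _engines.get(dbname)


a = get_engine("aurweb")
b = get_engine("aurweb")
print(a is b)  # True: same engine on every call for the same name
```

Keying on the database name is what lets each pytest suite get its own isolated engine and session while production keeps exactly one.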
@@ -72,34 +297,24 @@ def connect():
     return get_engine().connect()


-class Connection:
+class ConnectionExecutor:
     _conn = None
     _paramstyle = None

-    def __init__(self):
-        aur_db_backend = aurweb.config.get('database', 'backend')
-
-        if aur_db_backend == 'mysql':
-            aur_db_host = aurweb.config.get('database', 'host')
-            aur_db_name = aurweb.config.get('database', 'name')
-            aur_db_user = aurweb.config.get('database', 'user')
-            aur_db_pass = aurweb.config.get('database', 'password')
-            aur_db_socket = aurweb.config.get('database', 'socket')
-            self._conn = mysql.connector.connect(host=aur_db_host,
-                                                 user=aur_db_user,
-                                                 passwd=aur_db_pass,
-                                                 db=aur_db_name,
-                                                 unix_socket=aur_db_socket,
-                                                 buffered=True)
-            self._paramstyle = mysql.connector.paramstyle
-        elif aur_db_backend == 'sqlite':
-            aur_db_name = aurweb.config.get('database', 'name')
-            self._conn = sqlite3.connect(aur_db_name)
+    def __init__(self, conn, backend=aurweb.config.get("database", "backend")):
+        self._conn = conn
+        if backend == "mysql":
+            self._paramstyle = "format"
+        elif backend == "sqlite":
+            import sqlite3
             self._paramstyle = sqlite3.paramstyle
-        else:
-            raise ValueError('unsupported database backend')

-    def execute(self, query, params=()):
+    def paramstyle(self):
+        return self._paramstyle
+
+    def execute(self, query, params=()):  # pragma: no cover
+        # TODO: SQLite support has been removed in FastAPI. It remains
+        # here to fund its support for PHP until it is removed.
         if self._paramstyle in ('format', 'pyformat'):
             query = query.replace('%', '%%').replace('?', '%s')
         elif self._paramstyle == 'qmark':
@@ -117,3 +332,45 @@ class Connection:

     def close(self):
         self._conn.close()
+
+
+class Connection:
+    _executor = None
+    _conn = None
+
+    def __init__(self):
+        aur_db_backend = aurweb.config.get('database', 'backend')
+
+        if aur_db_backend == 'mysql':
+            import MySQLdb
+            aur_db_host = aurweb.config.get('database', 'host')
+            aur_db_name = name()
+            aur_db_user = aurweb.config.get('database', 'user')
+            aur_db_pass = aurweb.config.get_with_fallback(
+                'database', 'password', str())
+            aur_db_socket = aurweb.config.get('database', 'socket')
+            self._conn = MySQLdb.connect(host=aur_db_host,
+                                         user=aur_db_user,
+                                         passwd=aur_db_pass,
+                                         db=aur_db_name,
+                                         unix_socket=aur_db_socket)
+        elif aur_db_backend == 'sqlite':  # pragma: no cover
+            # TODO: SQLite support has been removed in FastAPI. It remains
+            # here to fund its support for PHP until it is removed.
+            import sqlite3
+            aur_db_name = aurweb.config.get('database', 'name')
+            self._conn = sqlite3.connect(aur_db_name)
+            self._conn.create_function("POWER", 2, math.pow)
+        else:
+            raise ValueError('unsupported database backend')
+
+        self._conn = ConnectionExecutor(self._conn, aur_db_backend)
+
+    def execute(self, query, params=()):
+        return self._conn.execute(query, params)
+
+    def commit(self):
+        self._conn.commit()
+
+    def close(self):
+        self._conn.close()
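The paramstyle translation in `ConnectionExecutor.execute()` rewrites qmark-style (`?`) placeholders into MySQL's format style (`%s`), escaping literal `%` first so it survives the driver's formatting. Standalone:

```python
def translate(query: str, paramstyle: str) -> str:
    # Same replace chain as ConnectionExecutor.execute() above.
    if paramstyle in ('format', 'pyformat'):
        query = query.replace('%', '%%').replace('?', '%s')
    return query


print(translate("SELECT * FROM Users WHERE UserName = ?", "format"))
# SELECT * FROM Users WHERE UserName = %s
```

The `%` escape must happen before the `?` substitution; reversing the order would double the freshly inserted `%s` placeholders.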

aurweb/defaults.py (new file)
@@ -0,0 +1,21 @@
""" Constant default values centralized in one place. """

# Default [O]ffset
O = 0

# Default [P]er [P]age
PP = 50

# A whitelist of valid PP values
PP_WHITELIST = {50, 100, 250}

# Default `by` parameter for RPC search.
RPC_SEARCH_BY = "name-desc"


def fallback_pp(per_page: int) -> int:
    """ If `per_page` is a valid value in PP_WHITELIST, return it.
    Otherwise, return defaults.PP. """
    if per_page not in PP_WHITELIST:
        return PP
    return per_page
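`fallback_pp()` clamps any page size outside the whitelist back to the default, which keeps arbitrary `?PP=` query values from driving huge result pages. The same logic, runnable standalone:

```python
PP = 50
PP_WHITELIST = {50, 100, 250}


def fallback_pp(per_page: int) -> int:
    # Identical logic to aurweb/defaults.py above.
    if per_page not in PP_WHITELIST:
        return PP
    return per_page


print(fallback_pp(100))  # 100: whitelisted, passes through
print(fallback_pp(37))   # 50: unknown value falls back to PP
```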
@@ -1,3 +1,6 @@
+from typing import Any
+
+
 class AurwebException(Exception):
     pass

@@ -73,3 +76,17 @@ class NotVotedException(AurwebException):
 class InvalidArgumentsException(AurwebException):
     def __init__(self, msg):
         super(InvalidArgumentsException, self).__init__(msg)
+
+
+class RPCError(AurwebException):
+    pass
+
+
+class ValidationError(AurwebException):
+    def __init__(self, data: Any, *args, **kwargs):
+        super().__init__(*args, **kwargs)
+        self.data = data
+
+
+class InvariantError(AurwebException):
+    pass
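`ValidationError` carries an arbitrary payload (typically a list of error strings) alongside the exception, so route handlers can raise once and render every message. A usage sketch that re-declares the class so it runs standalone:

```python
from typing import Any


class AurwebException(Exception):
    pass


class ValidationError(AurwebException):
    def __init__(self, data: Any, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.data = data  # payload for the error template


try:
    raise ValidationError(["Invalid e-mail."])
except ValidationError as exc:
    print(exc.data)  # ['Invalid e-mail.']
```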

aurweb/filters.py (new file)
@@ -0,0 +1,150 @@
import copy
import math

from datetime import datetime
from typing import Any, Dict
from urllib.parse import quote_plus, urlencode
from zoneinfo import ZoneInfo

import fastapi
import paginate

from jinja2 import pass_context

import aurweb.models

from aurweb import config, l10n
from aurweb.templates import register_filter, register_function


@register_filter("pager_nav")
@pass_context
def pager_nav(context: Dict[str, Any],
              page: int, total: int, prefix: str) -> str:
    page = int(page)  # Make sure this is an int.

    pp = context.get("PP", 50)

    # Setup a local query string dict, optionally passed by caller.
    q = context.get("q", dict())

    search_by = context.get("SeB", None)
    if search_by:
        q["SeB"] = search_by

    sort_by = context.get("SB", None)
    if sort_by:
        q["SB"] = sort_by

    def create_url(page: int):
        nonlocal q
        offset = max(page * pp - pp, 0)
        qs = to_qs(extend_query(q, ["O", offset]))
        return f"{prefix}?{qs}"

    # Use the paginate module to produce our linkage.
    pager = paginate.Page([], page=page + 1,
                          items_per_page=pp,
                          item_count=total,
                          url_maker=create_url)

    return pager.pager(
        link_attr={"class": "page"},
        curpage_attr={"class": "page"},
        separator=" ",
        format="$link_first $link_previous ~5~ $link_next $link_last",
        symbol_first="« First",
        symbol_previous="‹ Previous",
        symbol_next="Next ›",
        symbol_last="Last »")


@register_function("config_getint")
def config_getint(section: str, key: str) -> int:
    return config.getint(section, key)


@register_function("round")
def do_round(f: float) -> int:
    return round(f)


@register_filter("tr")
@pass_context
def tr(context: Dict[str, Any], value: str):
    """ A translation filter; example: {{ "Hello" | tr("de") }}. """
    _ = l10n.get_translator_for_request(context.get("request"))
    return _(value)


@register_filter("tn")
@pass_context
def tn(context: Dict[str, Any], count: int,
       singular: str, plural: str) -> str:
    """ A singular and plural translation filter.

    Example:
        {{ some_integer | tn("singular %d", "plural %d") }}

    :param context: Response context
    :param count: The number used to decide singular or plural state
    :param singular: The singular translation
    :param plural: The plural translation
    :return: Translated string
    """
    gettext = l10n.get_raw_translator_for_request(context.get("request"))
    return gettext.ngettext(singular, plural, count)


@register_filter("dt")
def timestamp_to_datetime(timestamp: int):
    return datetime.utcfromtimestamp(int(timestamp))
@register_filter("as_timezone")
|
||||||
|
def as_timezone(dt: datetime, timezone: str):
|
||||||
|
return dt.astimezone(tz=ZoneInfo(timezone))
|
||||||
|
|
||||||
|
|
||||||
|
@register_filter("extend_query")
|
||||||
|
def extend_query(query: Dict[str, Any], *additions) -> Dict[str, Any]:
|
||||||
|
""" Add additional key value pairs to query. """
|
||||||
|
q = copy.copy(query)
|
||||||
|
for k, v in list(additions):
|
||||||
|
q[k] = v
|
||||||
|
return q
|
||||||
|
|
||||||
|
|
||||||
|
@register_filter("urlencode")
|
||||||
|
def to_qs(query: Dict[str, Any]) -> str:
|
||||||
|
return urlencode(query, doseq=True)
|
||||||
|
|
||||||
|
|
||||||
|
@register_filter("get_vote")
|
||||||
|
def get_vote(voteinfo, request: fastapi.Request):
|
||||||
|
from aurweb.models import TUVote
|
||||||
|
return voteinfo.tu_votes.filter(TUVote.User == request.user).first()
|
||||||
|
|
||||||
|
|
||||||
|
@register_filter("number_format")
|
||||||
|
def number_format(value: float, places: int):
|
||||||
|
""" A converter function similar to PHP's number_format. """
|
||||||
|
return f"{value:.{places}f}"
|
||||||
|
|
||||||
|
|
||||||
|
@register_filter("account_url")
|
||||||
|
@pass_context
|
||||||
|
def account_url(context: Dict[str, Any],
|
||||||
|
user: "aurweb.models.user.User") -> str:
|
||||||
|
base = aurweb.config.get("options", "aur_location")
|
||||||
|
return f"{base}/account/{user.Username}"
|
||||||
|
|
||||||
|
|
||||||
|
@register_filter("quote_plus")
|
||||||
|
def _quote_plus(*args, **kwargs) -> str:
|
||||||
|
return quote_plus(*args, **kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
@register_filter("ceil")
|
||||||
|
def ceil(*args, **kwargs) -> int:
|
||||||
|
return math.ceil(*args, **kwargs)
|
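The query-string helpers above compose: `extend_query` copies the caller's parameters before applying additions, and `to_qs` urlencodes the result, which is exactly how `create_url` builds each pager link. A minimal standalone sketch of that composition (plain Python, no Jinja context; the parameter values are hypothetical):

```python
import copy
from urllib.parse import urlencode


def extend_query(query, *additions):
    """Return a copy of `query` with extra [key, value] pairs applied."""
    q = copy.copy(query)
    for k, v in list(additions):
        q[k] = v
    return q


def to_qs(query):
    """urlencode a query dict, expanding list values (doseq=True)."""
    return urlencode(query, doseq=True)


# Build the URL for page 3 of a 50-per-page listing, as create_url does.
pp = 50
page = 3
offset = max(page * pp - pp, 0)  # 100
qs = to_qs(extend_query({"SeB": "nd", "SB": "p"}, ["O", offset]))
url = f"/packages?{qs}"
print(url)  # → /packages?SeB=nd&SB=p&O=100
```

Because `extend_query` copies before mutating, the context's shared `q` dict is left untouched between pager links.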
@@ -305,9 +305,9 @@ def main():  # noqa: C901
     try:
         metadata_pkgbase = metadata['pkgbase']
-    except KeyError as e:
+    except KeyError:
         die_commit('invalid .SRCINFO, does not contain a pkgbase (is the file empty?)',
                    str(commit.id))
     if not re.match(repo_regex, metadata_pkgbase):
         die_commit('invalid pkgbase: {:s}'.format(metadata_pkgbase),
                    str(commit.id))
@@ -2,9 +2,9 @@ import argparse

 import alembic.command
 import alembic.config
-import sqlalchemy

 import aurweb.db
+import aurweb.logging
 import aurweb.schema
@@ -34,17 +34,21 @@ def feed_initial_data(conn):


 def run(args):
+    aurweb.config.rehash()
+
     # Ensure Alembic is fine before we do the real work, in order not to fail at
     # the last step and leave the database in an inconsistent state. The
     # configuration is loaded lazily, so we query it to force its loading.
     if args.use_alembic:
         alembic_config = alembic.config.Config('alembic.ini')
         alembic_config.get_main_option('script_location')
+        alembic_config.attributes["configure_logger"] = False

-    engine = sqlalchemy.create_engine(aurweb.db.get_sqlalchemy_url(),
-                                      echo=(args.verbose >= 1))
+    engine = aurweb.db.get_engine(echo=(args.verbose >= 1))
     aurweb.schema.metadata.create_all(engine)
-    feed_initial_data(engine.connect())
+    conn = engine.connect()
+    feed_initial_data(conn)
+    conn.close()

     if args.use_alembic:
         alembic.command.stamp(alembic_config, 'head')
@@ -1,38 +1,84 @@
 import gettext

+from collections import OrderedDict
+
+from fastapi import Request
+
 import aurweb.config

+SUPPORTED_LANGUAGES = OrderedDict({
+    "ar": "العربية",
+    "ast": "Asturianu",
+    "ca": "Català",
+    "cs": "Český",
+    "da": "Dansk",
+    "de": "Deutsch",
+    "el": "Ελληνικά",
+    "en": "English",
+    "es": "Español",
+    "es_419": "Español (Latinoamérica)",
+    "fi": "Suomi",
+    "fr": "Français",
+    "he": "עברית",
+    "hr": "Hrvatski",
+    "hu": "Magyar",
+    "it": "Italiano",
+    "ja": "日本語",
+    "nb": "Norsk",
+    "nl": "Nederlands",
+    "pl": "Polski",
+    "pt_BR": "Português (Brasil)",
+    "pt_PT": "Português (Portugal)",
+    "ro": "Română",
+    "ru": "Русский",
+    "sk": "Slovenčina",
+    "sr": "Srpski",
+    "tr": "Türkçe",
+    "uk": "Українська",
+    "zh_CN": "简体中文",
+    "zh_TW": "正體中文"
+})
+

 class Translator:
     def __init__(self):
         self._localedir = aurweb.config.get('options', 'localedir')
         self._translator = {}

-    def translate(self, s, lang):
-        if lang == 'en':
-            return s
+    def get_translator(self, lang: str):
         if lang not in self._translator:
             self._translator[lang] = gettext.translation("aurweb",
                                                          self._localedir,
-                                                         languages=[lang])
-        return self._translator[lang].gettext(s)
+                                                         languages=[lang],
+                                                         fallback=True)
+        return self._translator.get(lang)
+
+    def translate(self, s: str, lang: str):
+        return self.get_translator(lang).gettext(s)
+

-def get_translator_for_request(request):
+# Global translator object.
+translator = Translator()
+
+
+def get_request_language(request: Request):
+    if request.user.is_authenticated():
+        return request.user.LangPreference
+    default_lang = aurweb.config.get("options", "default_lang")
+    return request.cookies.get("AURLANG", default_lang)
+
+
+def get_raw_translator_for_request(request: Request):
+    lang = get_request_language(request)
+    return translator.get_translator(lang)
+
+
+def get_translator_for_request(request: Request):
     """
     Determine the preferred language from a FastAPI request object and build a
     translator function for it.
+
+    Example:
+        _ = get_translator_for_request(request)
+        print(_("Hello"))
     """
-    lang = request.cookies.get("AURLANG")
-    if lang is None:
-        lang = aurweb.config.get("options", "default_lang")
-    translator = Translator()
+    lang = get_request_language(request)

     def translate(message):
         return translator.translate(message, lang)
26  aurweb/logging.py  Normal file
@@ -0,0 +1,26 @@
import logging
import logging.config
import os

import aurweb.config

# For testing, users should set LOG_CONFIG=logging.test.conf
# We test against various debug log output.
aurwebdir = aurweb.config.get("options", "aurwebdir")
log_config = os.environ.get("LOG_CONFIG", "logging.conf")
config_path = os.path.join(aurwebdir, log_config)

logging.config.fileConfig(config_path, disable_existing_loggers=False)
logging.getLogger("root").addHandler(logging.NullHandler())


def get_logger(name: str) -> logging.Logger:
    """ A logging.getLogger wrapper. Importing this function and
    using it to get a module-local logger ensures that logging.conf
    initialization is performed wherever loggers are used.

    :param name: Logger name; typically `__name__`
    :returns: name's logging.Logger
    """
    return logging.getLogger(name)
31  aurweb/models/__init__.py  Normal file
@@ -0,0 +1,31 @@
""" Collection of all aurweb SQLAlchemy declarative models. """
from .accepted_term import AcceptedTerm  # noqa: F401
from .account_type import AccountType  # noqa: F401
from .api_rate_limit import ApiRateLimit  # noqa: F401
from .ban import Ban  # noqa: F401
from .dependency_type import DependencyType  # noqa: F401
from .group import Group  # noqa: F401
from .license import License  # noqa: F401
from .official_provider import OfficialProvider  # noqa: F401
from .package import Package  # noqa: F401
from .package_base import PackageBase  # noqa: F401
from .package_blacklist import PackageBlacklist  # noqa: F401
from .package_comaintainer import PackageComaintainer  # noqa: F401
from .package_comment import PackageComment  # noqa: F401
from .package_dependency import PackageDependency  # noqa: F401
from .package_group import PackageGroup  # noqa: F401
from .package_keyword import PackageKeyword  # noqa: F401
from .package_license import PackageLicense  # noqa: F401
from .package_notification import PackageNotification  # noqa: F401
from .package_relation import PackageRelation  # noqa: F401
from .package_request import PackageRequest  # noqa: F401
from .package_source import PackageSource  # noqa: F401
from .package_vote import PackageVote  # noqa: F401
from .relation_type import RelationType  # noqa: F401
from .request_type import RequestType  # noqa: F401
from .session import Session  # noqa: F401
from .ssh_pub_key import SSHPubKey  # noqa: F401
from .term import Term  # noqa: F401
from .tu_vote import TUVote  # noqa: F401
from .tu_voteinfo import TUVoteInfo  # noqa: F401
from .user import User  # noqa: F401
36  aurweb/models/accepted_term.py  Normal file
@@ -0,0 +1,36 @@
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import schema
from aurweb.models.declarative import Base
from aurweb.models.term import Term as _Term
from aurweb.models.user import User as _User


class AcceptedTerm(Base):
    __table__ = schema.AcceptedTerms
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.TermsID]}

    User = relationship(
        _User, backref=backref("accepted_terms", lazy="dynamic"),
        foreign_keys=[__table__.c.UsersID])

    Term = relationship(
        _Term, backref=backref("accepted_terms", lazy="dynamic"),
        foreign_keys=[__table__.c.TermsID])

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.User and not self.UsersID:
            raise IntegrityError(
                statement="Foreign key UsersID cannot be null.",
                orig="AcceptedTerms.UserID",
                params=("NULL"))

        if not self.Term and not self.TermsID:
            raise IntegrityError(
                statement="Foreign key TermID cannot be null.",
                orig="AcceptedTerms.TermID",
                params=("NULL"))
40  aurweb/models/account_type.py  Normal file
@@ -0,0 +1,40 @@
from aurweb import schema
from aurweb.models.declarative import Base

USER = "User"
TRUSTED_USER = "Trusted User"
DEVELOPER = "Developer"
TRUSTED_USER_AND_DEV = "Trusted User & Developer"

USER_ID = 1
TRUSTED_USER_ID = 2
DEVELOPER_ID = 3
TRUSTED_USER_AND_DEV_ID = 4

# Map string constants to integer constants.
ACCOUNT_TYPE_ID = {
    USER: USER_ID,
    TRUSTED_USER: TRUSTED_USER_ID,
    DEVELOPER: DEVELOPER_ID,
    TRUSTED_USER_AND_DEV: TRUSTED_USER_AND_DEV_ID
}

# Reversed ACCOUNT_TYPE_ID mapping.
ACCOUNT_TYPE_NAME = {v: k for k, v in ACCOUNT_TYPE_ID.items()}


class AccountType(Base):
    """ An ORM model of a single AccountTypes record. """
    __table__ = schema.AccountTypes
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.ID]}

    def __init__(self, **kwargs):
        self.AccountType = kwargs.pop("AccountType")

    def __str__(self):
        return str(self.AccountType)

    def __repr__(self):
        return "<AccountType(ID='%s', AccountType='%s')>" % (
            self.ID, str(self))
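The two account-type constants above are kept in sync mechanically: `ACCOUNT_TYPE_NAME` is derived from `ACCOUNT_TYPE_ID` by a dict comprehension, so adding a new account type to the forward map automatically updates the reverse lookup. The pattern in isolation:

```python
# Forward map: display name to integer ID (values from the module above).
ACCOUNT_TYPE_ID = {
    "User": 1,
    "Trusted User": 2,
    "Developer": 3,
    "Trusted User & Developer": 4,
}

# Reversed mapping: integer ID back to display name, derived, never
# hand-maintained.
ACCOUNT_TYPE_NAME = {v: k for k, v in ACCOUNT_TYPE_ID.items()}

print(ACCOUNT_TYPE_NAME[2])  # → Trusted User
```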
25  aurweb/models/api_rate_limit.py  Normal file
@@ -0,0 +1,25 @@
from sqlalchemy.exc import IntegrityError

from aurweb import schema
from aurweb.models.declarative import Base


class ApiRateLimit(Base):
    __table__ = schema.ApiRateLimit
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.IP]}

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if self.Requests is None:
            raise IntegrityError(
                statement="Column Requests cannot be null.",
                orig="ApiRateLimit.Requests",
                params=("NULL"))

        if self.WindowStart is None:
            raise IntegrityError(
                statement="Column WindowStart cannot be null.",
                orig="ApiRateLimit.WindowStart",
                params=("NULL"))
|
19
aurweb/models/ban.py
Normal file
19
aurweb/models/ban.py
Normal file
|
@ -0,0 +1,19 @@
|
||||||
|
from fastapi import Request
|
||||||
|
|
||||||
|
from aurweb import db, schema
|
||||||
|
from aurweb.models.declarative import Base
|
||||||
|
|
||||||
|
|
||||||
|
class Ban(Base):
|
||||||
|
__table__ = schema.Bans
|
||||||
|
__tablename__ = __table__.name
|
||||||
|
__mapper_args__ = {"primary_key": [__table__.c.IPAddress]}
|
||||||
|
|
||||||
|
def __init__(self, **kwargs):
|
||||||
|
super().__init__(**kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
def is_banned(request: Request):
|
||||||
|
ip = request.client.host
|
||||||
|
exists = db.query(Ban).filter(Ban.IPAddress == ip).exists()
|
||||||
|
return db.query(exists).scalar()
|
36  aurweb/models/declarative.py  Normal file
@@ -0,0 +1,36 @@
import json

from sqlalchemy.ext.declarative import declarative_base

from aurweb import util


def to_dict(model):
    return {
        c.name: getattr(model, c.name)
        for c in model.__table__.columns
    }


def to_json(model, indent: int = None):
    return json.dumps({
        k: util.jsonify(v)
        for k, v in to_dict(model).items()
    }, indent=indent)


Base = declarative_base()

# Setup __table_args__ applicable to every table.
Base.__table_args__ = {
    "autoload": False,
    "extend_existing": True
}

# Setup Base.as_dict and Base.json.
#
# With this, declarative models can use .as_dict() or .json()
# at any time to produce a dict and json out of table columns.
#
Base.as_dict = to_dict
Base.json = to_json
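`to_dict` and `to_json` only rely on a model exposing `__table__.columns`; a minimal stand-in (no SQLAlchemy required, `fake` is a hypothetical instance) shows the serialization contract that attaching them as `Base.as_dict` and `Base.json` gives every declarative model:

```python
import json
from types import SimpleNamespace


def to_dict(model):
    # Collect each mapped column's current value off the instance.
    return {c.name: getattr(model, c.name) for c in model.__table__.columns}


def to_json(model, indent: int = None):
    # aurweb additionally routes values through util.jsonify; plain
    # json.dumps suffices here for primitive column types.
    return json.dumps(to_dict(model), indent=indent)


# Hypothetical instance mimicking a declarative mapping with two columns.
fake = SimpleNamespace(
    ID=1, Name="example",
    __table__=SimpleNamespace(columns=[SimpleNamespace(name="ID"),
                                       SimpleNamespace(name="Name")]))

print(to_dict(fake))  # → {'ID': 1, 'Name': 'example'}
```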
21  aurweb/models/dependency_type.py  Normal file
@@ -0,0 +1,21 @@
from aurweb import schema
from aurweb.models.declarative import Base

DEPENDS = "depends"
MAKEDEPENDS = "makedepends"
CHECKDEPENDS = "checkdepends"
OPTDEPENDS = "optdepends"

DEPENDS_ID = 1
MAKEDEPENDS_ID = 2
CHECKDEPENDS_ID = 3
OPTDEPENDS_ID = 4


class DependencyType(Base):
    __table__ = schema.DependencyTypes
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.ID]}

    def __init__(self, Name: str = None):
        self.Name = Name
18  aurweb/models/group.py  Normal file
@@ -0,0 +1,18 @@
from sqlalchemy.exc import IntegrityError

from aurweb import schema
from aurweb.models.declarative import Base


class Group(Base):
    __table__ = schema.Groups
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.ID]}

    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        if self.Name is None:
            raise IntegrityError(
                statement="Column Name cannot be null.",
                orig="Groups.Name",
                params=("NULL"))
19  aurweb/models/license.py  Normal file
@@ -0,0 +1,19 @@
from sqlalchemy.exc import IntegrityError

from aurweb import schema
from aurweb.models.declarative import Base


class License(Base):
    __table__ = schema.Licenses
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.ID]}

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.Name:
            raise IntegrityError(
                statement="Column Name cannot be null.",
                orig="Licenses.Name",
                params=("NULL"))
36  aurweb/models/official_provider.py  Normal file
@@ -0,0 +1,36 @@
from sqlalchemy.exc import IntegrityError

from aurweb import schema
from aurweb.models.declarative import Base

OFFICIAL_BASE = "https://archlinux.org"


class OfficialProvider(Base):
    __table__ = schema.OfficialProviders
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.ID]}

    # OfficialProvider instances are official packages.
    is_official = True

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.Name:
            raise IntegrityError(
                statement="Column Name cannot be null.",
                orig="OfficialProviders.Name",
                params=("NULL"))

        if not self.Repo:
            raise IntegrityError(
                statement="Column Repo cannot be null.",
                orig="OfficialProviders.Repo",
                params=("NULL"))

        if not self.Provides:
            raise IntegrityError(
                statement="Column Provides cannot be null.",
                orig="OfficialProviders.Provides",
                params=("NULL"))
35  aurweb/models/package.py  Normal file
@@ -0,0 +1,35 @@
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import schema
from aurweb.models.declarative import Base
from aurweb.models.package_base import PackageBase as _PackageBase


class Package(Base):
    __table__ = schema.Packages
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.ID]}

    PackageBase = relationship(
        _PackageBase, backref=backref("packages", lazy="dynamic",
                                      cascade="all, delete"),
        foreign_keys=[__table__.c.PackageBaseID])

    # No Package instances are official packages.
    is_official = False

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.PackageBase and not self.PackageBaseID:
            raise IntegrityError(
                statement="Foreign key PackageBaseID cannot be null.",
                orig="Packages.PackageBaseID",
                params=("NULL"))

        if self.Name is None:
            raise IntegrityError(
                statement="Column Name cannot be null.",
                orig="Packages.Name",
                params=("NULL"))
57  aurweb/models/package_base.py  Normal file
@@ -0,0 +1,57 @@
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import schema, time
from aurweb.models.declarative import Base
from aurweb.models.user import User as _User


class PackageBase(Base):
    __table__ = schema.PackageBases
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.ID]}

    Flagger = relationship(
        _User, backref=backref("flagged_bases", lazy="dynamic"),
        foreign_keys=[__table__.c.FlaggerUID])

    Submitter = relationship(
        _User, backref=backref("submitted_bases", lazy="dynamic"),
        foreign_keys=[__table__.c.SubmitterUID])

    Maintainer = relationship(
        _User, backref=backref("maintained_bases", lazy="dynamic"),
        foreign_keys=[__table__.c.MaintainerUID])

    Packager = relationship(
        _User, backref=backref("package_bases", lazy="dynamic"),
        foreign_keys=[__table__.c.PackagerUID])

    # A set used to check for floatable values.
    TO_FLOAT = {"Popularity"}

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if self.Name is None:
            raise IntegrityError(
                statement="Column Name cannot be null.",
                orig="PackageBases.Name",
                params=("NULL"))

        # If no SubmittedTS/ModifiedTS is provided on creation, set them
        # here to the current utc timestamp.
        now = time.utcnow()
        if not self.SubmittedTS:
            self.SubmittedTS = now
        if not self.ModifiedTS:
            self.ModifiedTS = now

        if not self.FlaggerComment:
            self.FlaggerComment = str()

    def __getattribute__(self, key: str):
        attr = super().__getattribute__(key)
        if key in PackageBase.TO_FLOAT and not isinstance(attr, float):
            return float(attr)
        return attr
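The `__getattribute__` override above coerces any attribute named in `TO_FLOAT` (here, `Popularity`, which a database driver may hand back as a decimal or string) to `float` on every access. The pattern in isolation, on a small hypothetical class:

```python
class FloatCoercing:
    # Attributes in this set are normalized to float on access,
    # mirroring PackageBase.TO_FLOAT handling of Popularity.
    TO_FLOAT = {"Popularity"}

    def __init__(self, popularity):
        self.Popularity = popularity

    def __getattribute__(self, key: str):
        attr = super().__getattribute__(key)
        # Referencing the set via the class object avoids re-entering
        # this method recursively.
        if key in FloatCoercing.TO_FLOAT and not isinstance(attr, float):
            return float(attr)
        return attr


pkg = FloatCoercing("0.000000")  # driver may hand back a string
print(isinstance(pkg.Popularity, float))  # → True
```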
19  aurweb/models/package_blacklist.py  Normal file
@@ -0,0 +1,19 @@
from sqlalchemy.exc import IntegrityError

from aurweb import schema
from aurweb.models.declarative import Base


class PackageBlacklist(Base):
    __table__ = schema.PackageBlacklist
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.ID]}

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.Name:
            raise IntegrityError(
                statement="Column Name cannot be null.",
                orig="PackageBlacklist.Name",
                params=("NULL"))
46  aurweb/models/package_comaintainer.py  Normal file
@@ -0,0 +1,46 @@
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import schema
from aurweb.models.declarative import Base
from aurweb.models.package_base import PackageBase as _PackageBase
from aurweb.models.user import User as _User


class PackageComaintainer(Base):
    __table__ = schema.PackageComaintainers
    __tablename__ = __table__.name
    __mapper_args__ = {
        "primary_key": [__table__.c.UsersID, __table__.c.PackageBaseID]
    }

    User = relationship(
        _User, backref=backref("comaintained", lazy="dynamic",
                               cascade="all, delete"),
        foreign_keys=[__table__.c.UsersID])

    PackageBase = relationship(
        _PackageBase, backref=backref("comaintainers", lazy="dynamic",
                                      cascade="all, delete"),
        foreign_keys=[__table__.c.PackageBaseID])

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.User and not self.UsersID:
            raise IntegrityError(
                statement="Foreign key UsersID cannot be null.",
                orig="PackageComaintainers.UsersID",
                params=("NULL"))

        if not self.PackageBase and not self.PackageBaseID:
            raise IntegrityError(
                statement="Foreign key PackageBaseID cannot be null.",
                orig="PackageComaintainers.PackageBaseID",
                params=("NULL"))

        if not self.Priority:
            raise IntegrityError(
                statement="Column Priority cannot be null.",
                orig="PackageComaintainers.Priority",
                params=("NULL"))
54  aurweb/models/package_comment.py  Normal file
@@ -0,0 +1,54 @@
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import schema
from aurweb.models.declarative import Base
from aurweb.models.package_base import PackageBase as _PackageBase
from aurweb.models.user import User as _User


class PackageComment(Base):
    __table__ = schema.PackageComments
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.ID]}

    PackageBase = relationship(
        _PackageBase, backref=backref("comments", lazy="dynamic",
                                      cascade="all, delete"),
        foreign_keys=[__table__.c.PackageBaseID])

    User = relationship(
        _User, backref=backref("package_comments", lazy="dynamic"),
        foreign_keys=[__table__.c.UsersID])

    Editor = relationship(
        _User, backref=backref("edited_comments", lazy="dynamic"),
        foreign_keys=[__table__.c.EditedUsersID])

    Deleter = relationship(
        _User, backref=backref("deleted_comments", lazy="dynamic"),
        foreign_keys=[__table__.c.DelUsersID])

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.PackageBase and not self.PackageBaseID:
            raise IntegrityError(
                statement="Foreign key PackageBaseID cannot be null.",
                orig="PackageComments.PackageBaseID",
                params=("NULL"))

        if not self.User and not self.UsersID:
            raise IntegrityError(
                statement="Foreign key UsersID cannot be null.",
                orig="PackageComments.UsersID",
                params=("NULL"))

        if self.Comments is None:
            raise IntegrityError(
                statement="Column Comments cannot be null.",
                orig="PackageComments.Comments",
                params=("NULL"))

        if self.RenderedComment is None:
            self.RenderedComment = str()
82 aurweb/models/package_dependency.py Normal file
@@ -0,0 +1,82 @@
from typing import List

from sqlalchemy import and_, literal
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import db, schema
from aurweb.models.declarative import Base
from aurweb.models.dependency_type import DependencyType as _DependencyType
from aurweb.models.official_provider import OfficialProvider as _OfficialProvider
from aurweb.models.package import Package as _Package
from aurweb.models.package_relation import PackageRelation


class PackageDependency(Base):
    __table__ = schema.PackageDepends
    __tablename__ = __table__.name
    __mapper_args__ = {
        "primary_key": [
            __table__.c.PackageID,
            __table__.c.DepTypeID,
            __table__.c.DepName,
        ]
    }

    Package = relationship(
        _Package, backref=backref("package_dependencies", lazy="dynamic",
                                  cascade="all, delete"),
        foreign_keys=[__table__.c.PackageID])

    DependencyType = relationship(
        _DependencyType,
        backref=backref("package_dependencies", lazy="dynamic"),
        foreign_keys=[__table__.c.DepTypeID])

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.Package and not self.PackageID:
            raise IntegrityError(
                statement="Foreign key PackageID cannot be null.",
                orig="PackageDependencies.PackageID",
                params=("NULL"))

        if not self.DependencyType and not self.DepTypeID:
            raise IntegrityError(
                statement="Foreign key DepTypeID cannot be null.",
                orig="PackageDependencies.DepTypeID",
                params=("NULL"))

        if self.DepName is None:
            raise IntegrityError(
                statement="Column DepName cannot be null.",
                orig="PackageDependencies.DepName",
                params=("NULL"))

    def is_package(self) -> bool:
        pkg = db.query(_Package).filter(_Package.Name == self.DepName).exists()
        official = db.query(_OfficialProvider).filter(
            _OfficialProvider.Name == self.DepName).exists()
        return db.query(pkg).scalar() or db.query(official).scalar()

    def provides(self) -> List[PackageRelation]:
        from aurweb.models.relation_type import PROVIDES_ID

        rels = db.query(PackageRelation).join(_Package).filter(
            and_(PackageRelation.RelTypeID == PROVIDES_ID,
                 PackageRelation.RelName == self.DepName)
        ).with_entities(
            _Package.Name,
            literal(False).label("is_official")
        ).order_by(_Package.Name.asc())

        official_rels = db.query(_OfficialProvider).filter(
            and_(_OfficialProvider.Provides == self.DepName,
                 _OfficialProvider.Name != self.DepName)
        ).with_entities(
            _OfficialProvider.Name,
            literal(True).label("is_official")
        ).order_by(_OfficialProvider.Name.asc())

        return rels.union(official_rels).all()
40 aurweb/models/package_group.py Normal file
@@ -0,0 +1,40 @@
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import schema
from aurweb.models.declarative import Base
from aurweb.models.group import Group as _Group
from aurweb.models.package import Package as _Package


class PackageGroup(Base):
    __table__ = schema.PackageGroups
    __tablename__ = __table__.name
    __mapper_args__ = {
        "primary_key": [__table__.c.PackageID, __table__.c.GroupID]
    }

    Package = relationship(
        _Package, backref=backref("package_groups", lazy="dynamic",
                                  cascade="all, delete"),
        foreign_keys=[__table__.c.PackageID])

    Group = relationship(
        _Group, backref=backref("package_groups", lazy="dynamic",
                                cascade="all, delete"),
        foreign_keys=[__table__.c.GroupID])

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.Package and not self.PackageID:
            raise IntegrityError(
                statement="Primary key PackageID cannot be null.",
                orig="PackageGroups.PackageID",
                params=("NULL"))

        if not self.Group and not self.GroupID:
            raise IntegrityError(
                statement="Primary key GroupID cannot be null.",
                orig="PackageGroups.GroupID",
                params=("NULL"))
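The constructor above shows the validation pattern these models share: either the mapped object or its raw id must be supplied, otherwise the record is rejected before it ever reaches the database. Here is a minimal, hypothetical stand-in for that check (the real model raises `sqlalchemy.exc.IntegrityError`; this sketch uses a plain `ValueError` so it runs without SQLAlchemy, and `PackageGroupSketch` is an invented name):

```python
class PackageGroupSketch:
    """Stand-in for PackageGroup's constructor check: either the
    mapped object or its raw id must be provided for each half of
    the composite primary key."""

    def __init__(self, Package=None, PackageID=None,
                 Group=None, GroupID=None):
        if not Package and not PackageID:
            # The real model raises IntegrityError here.
            raise ValueError("Primary key PackageID cannot be null.")
        if not Group and not GroupID:
            raise ValueError("Primary key GroupID cannot be null.")
        self.Package, self.PackageID = Package, PackageID
        self.Group, self.GroupID = Group, GroupID
```

Constructing with ids alone succeeds; omitting both the object and its id for either key fails fast.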
28 aurweb/models/package_keyword.py Normal file
@@ -0,0 +1,28 @@
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import schema
from aurweb.models.declarative import Base
from aurweb.models.package_base import PackageBase as _PackageBase


class PackageKeyword(Base):
    __table__ = schema.PackageKeywords
    __tablename__ = __table__.name
    __mapper_args__ = {
        "primary_key": [__table__.c.PackageBaseID, __table__.c.Keyword]
    }

    PackageBase = relationship(
        _PackageBase, backref=backref("keywords", lazy="dynamic",
                                      cascade="all, delete"),
        foreign_keys=[__table__.c.PackageBaseID])

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.PackageBase and not self.PackageBaseID:
            raise IntegrityError(
                statement="Primary key PackageBaseID cannot be null.",
                orig="PackageKeywords.PackageBaseID",
                params=("NULL"))
40 aurweb/models/package_license.py Normal file
@@ -0,0 +1,40 @@
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import schema
from aurweb.models.declarative import Base
from aurweb.models.license import License as _License
from aurweb.models.package import Package as _Package


class PackageLicense(Base):
    __table__ = schema.PackageLicenses
    __tablename__ = __table__.name
    __mapper_args__ = {
        "primary_key": [__table__.c.PackageID, __table__.c.LicenseID]
    }

    Package = relationship(
        _Package, backref=backref("package_licenses", lazy="dynamic",
                                  cascade="all, delete"),
        foreign_keys=[__table__.c.PackageID])

    License = relationship(
        _License, backref=backref("package_licenses", lazy="dynamic",
                                  cascade="all, delete"),
        foreign_keys=[__table__.c.LicenseID])

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.Package and not self.PackageID:
            raise IntegrityError(
                statement="Primary key PackageID cannot be null.",
                orig="PackageLicenses.PackageID",
                params=("NULL"))

        if not self.License and not self.LicenseID:
            raise IntegrityError(
                statement="Primary key LicenseID cannot be null.",
                orig="PackageLicenses.LicenseID",
                params=("NULL"))
41 aurweb/models/package_notification.py Normal file
@@ -0,0 +1,41 @@
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import schema
from aurweb.models.declarative import Base
from aurweb.models.package_base import PackageBase as _PackageBase
from aurweb.models.user import User as _User


class PackageNotification(Base):
    __table__ = schema.PackageNotifications
    __tablename__ = __table__.name
    __mapper_args__ = {
        "primary_key": [__table__.c.UserID, __table__.c.PackageBaseID]
    }

    User = relationship(
        _User, backref=backref("notifications", lazy="dynamic",
                               cascade="all, delete"),
        foreign_keys=[__table__.c.UserID])

    PackageBase = relationship(
        _PackageBase,
        backref=backref("notifications", lazy="dynamic",
                        cascade="all, delete"),
        foreign_keys=[__table__.c.PackageBaseID])

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.User and not self.UserID:
            raise IntegrityError(
                statement="Foreign key UserID cannot be null.",
                orig="PackageNotifications.UserID",
                params=("NULL"))

        if not self.PackageBase and not self.PackageBaseID:
            raise IntegrityError(
                statement="Foreign key PackageBaseID cannot be null.",
                orig="PackageNotifications.PackageBaseID",
                params=("NULL"))
49 aurweb/models/package_relation.py Normal file
@@ -0,0 +1,49 @@
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import schema
from aurweb.models.declarative import Base
from aurweb.models.package import Package as _Package
from aurweb.models.relation_type import RelationType as _RelationType


class PackageRelation(Base):
    __table__ = schema.PackageRelations
    __tablename__ = __table__.name
    __mapper_args__ = {
        "primary_key": [
            __table__.c.PackageID,
            __table__.c.RelTypeID,
            __table__.c.RelName,
        ]
    }

    Package = relationship(
        _Package, backref=backref("package_relations", lazy="dynamic",
                                  cascade="all, delete"),
        foreign_keys=[__table__.c.PackageID])

    RelationType = relationship(
        _RelationType, backref=backref("package_relations", lazy="dynamic"),
        foreign_keys=[__table__.c.RelTypeID])

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.Package and not self.PackageID:
            raise IntegrityError(
                statement="Foreign key PackageID cannot be null.",
                orig="PackageRelations.PackageID",
                params=("NULL"))

        if not self.RelationType and not self.RelTypeID:
            raise IntegrityError(
                statement="Foreign key RelTypeID cannot be null.",
                orig="PackageRelations.RelTypeID",
                params=("NULL"))

        if not self.RelName:
            raise IntegrityError(
                statement="Column RelName cannot be null.",
                orig="PackageRelations.RelName",
                params=("NULL"))
91 aurweb/models/package_request.py Normal file
@@ -0,0 +1,91 @@
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import schema
from aurweb.models.declarative import Base
from aurweb.models.package_base import PackageBase as _PackageBase
from aurweb.models.request_type import RequestType as _RequestType
from aurweb.models.user import User as _User

PENDING = "Pending"
CLOSED = "Closed"
ACCEPTED = "Accepted"
REJECTED = "Rejected"

# Integer values used for the Status column of PackageRequest.
PENDING_ID = 0
CLOSED_ID = 1
ACCEPTED_ID = 2
REJECTED_ID = 3


class PackageRequest(Base):
    __table__ = schema.PackageRequests
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.ID]}

    RequestType = relationship(
        _RequestType, backref=backref("package_requests", lazy="dynamic"),
        foreign_keys=[__table__.c.ReqTypeID])

    User = relationship(
        _User, backref=backref("package_requests", lazy="dynamic"),
        foreign_keys=[__table__.c.UsersID])

    PackageBase = relationship(
        _PackageBase, backref=backref("requests", lazy="dynamic"),
        foreign_keys=[__table__.c.PackageBaseID])

    Closer = relationship(
        _User, backref=backref("closed_requests", lazy="dynamic"),
        foreign_keys=[__table__.c.ClosedUID])

    STATUS_DISPLAY = {
        PENDING_ID: PENDING,
        CLOSED_ID: CLOSED,
        ACCEPTED_ID: ACCEPTED,
        REJECTED_ID: REJECTED
    }

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.RequestType and not self.ReqTypeID:
            raise IntegrityError(
                statement="Foreign key ReqTypeID cannot be null.",
                orig="PackageRequests.ReqTypeID",
                params=("NULL"))

        if not self.PackageBase and not self.PackageBaseID:
            raise IntegrityError(
                statement="Foreign key PackageBaseID cannot be null.",
                orig="PackageRequests.PackageBaseID",
                params=("NULL"))

        if not self.PackageBaseName:
            raise IntegrityError(
                statement="Column PackageBaseName cannot be null.",
                orig="PackageRequests.PackageBaseName",
                params=("NULL"))

        if not self.User and not self.UsersID:
            raise IntegrityError(
                statement="Foreign key UsersID cannot be null.",
                orig="PackageRequests.UsersID",
                params=("NULL"))

        if self.Comments is None:
            raise IntegrityError(
                statement="Column Comments cannot be null.",
                orig="PackageRequests.Comments",
                params=("NULL"))

        if self.ClosureComment is None:
            raise IntegrityError(
                statement="Column ClosureComment cannot be null.",
                orig="PackageRequests.ClosureComment",
                params=("NULL"))

    def status_display(self) -> str:
        """ Return a display string for the Status column. """
        return self.STATUS_DISPLAY[self.Status]
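The request status handling above is a plain id-to-label lookup, which can be exercised without a database. A minimal sketch using the same constants and labels as the model (the free function `status_display` here is a stand-in for the bound method):

```python
# Status ids and labels as defined in aurweb/models/package_request.py.
PENDING_ID, CLOSED_ID, ACCEPTED_ID, REJECTED_ID = 0, 1, 2, 3

STATUS_DISPLAY = {
    PENDING_ID: "Pending",
    CLOSED_ID: "Closed",
    ACCEPTED_ID: "Accepted",
    REJECTED_ID: "Rejected",
}


def status_display(status: int) -> str:
    """Return a display string for a Status value."""
    return STATUS_DISPLAY[status]
```

Templates can call this to render a request row's status column directly from the stored integer.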
34 aurweb/models/package_source.py Normal file
@@ -0,0 +1,34 @@
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import schema
from aurweb.models.declarative import Base
from aurweb.models.package import Package as _Package


class PackageSource(Base):
    __table__ = schema.PackageSources
    __tablename__ = __table__.name
    __mapper_args__ = {
        "primary_key": [
            __table__.c.PackageID,
            __table__.c.Source
        ]
    }

    Package = relationship(
        _Package, backref=backref("package_sources", lazy="dynamic",
                                  cascade="all, delete"),
        foreign_keys=[__table__.c.PackageID])

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.Package and not self.PackageID:
            raise IntegrityError(
                statement="Foreign key PackageID cannot be null.",
                orig="PackageSources.PackageID",
                params=("NULL"))

        if not self.Source:
            self.Source = "/dev/null"
45 aurweb/models/package_vote.py Normal file
@@ -0,0 +1,45 @@
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import schema
from aurweb.models.declarative import Base
from aurweb.models.package_base import PackageBase as _PackageBase
from aurweb.models.user import User as _User


class PackageVote(Base):
    __table__ = schema.PackageVotes
    __tablename__ = __table__.name
    __mapper_args__ = {
        "primary_key": [__table__.c.UsersID, __table__.c.PackageBaseID]
    }

    User = relationship(
        _User, backref=backref("package_votes", lazy="dynamic"),
        foreign_keys=[__table__.c.UsersID])

    PackageBase = relationship(
        _PackageBase, backref=backref("package_votes", lazy="dynamic",
                                      cascade="all, delete"),
        foreign_keys=[__table__.c.PackageBaseID])

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.User and not self.UsersID:
            raise IntegrityError(
                statement="Foreign key UsersID cannot be null.",
                orig="PackageVotes.UsersID",
                params=("NULL"))

        if not self.PackageBase and not self.PackageBaseID:
            raise IntegrityError(
                statement="Foreign key PackageBaseID cannot be null.",
                orig="PackageVotes.PackageBaseID",
                params=("NULL"))

        if not self.VoteTS:
            raise IntegrityError(
                statement="Column VoteTS cannot be null.",
                orig="PackageVotes.VoteTS",
                params=("NULL"))
19 aurweb/models/relation_type.py Normal file
@@ -0,0 +1,19 @@
from aurweb import schema
from aurweb.models.declarative import Base

CONFLICTS = "conflicts"
PROVIDES = "provides"
REPLACES = "replaces"

CONFLICTS_ID = 1
PROVIDES_ID = 2
REPLACES_ID = 3


class RelationType(Base):
    __table__ = schema.RelationTypes
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.ID]}

    def __init__(self, Name: str = None):
        self.Name = Name
20 aurweb/models/request_type.py Normal file
@@ -0,0 +1,20 @@
from aurweb import schema
from aurweb.models.declarative import Base

DELETION = "deletion"
ORPHAN = "orphan"
MERGE = "merge"

DELETION_ID = 1
ORPHAN_ID = 2
MERGE_ID = 3


class RequestType(Base):
    __table__ = schema.RequestTypes
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.ID]}

    def name_display(self) -> str:
        """ Return the Name column with its first char capitalized. """
        return self.Name.title()
39 aurweb/models/session.py Normal file
@@ -0,0 +1,39 @@
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import db, schema
from aurweb.models.declarative import Base
from aurweb.models.user import User as _User


class Session(Base):
    __table__ = schema.Sessions
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.UsersID]}

    User = relationship(
        _User, backref=backref("session", uselist=False),
        foreign_keys=[__table__.c.UsersID])

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        # We'll try to either use UsersID or User.ID if we can.
        # If neither exist, an AttributeError is raised, in which case
        # we set the uid to 0, which triggers IntegrityError below.
        try:
            uid = self.UsersID or self.User.ID
        except AttributeError:
            uid = 0

        user_exists = db.query(_User).filter(_User.ID == uid).exists()
        if not db.query(user_exists).scalar():
            raise IntegrityError(
                statement=("Foreign key UsersID cannot be null and "
                           "must be a valid user's ID."),
                orig="Sessions.UsersID",
                params=("NULL"))


def generate_unique_sid():
    return db.make_random_value(Session, Session.SessionID, 32)
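`generate_unique_sid` above delegates to `db.make_random_value`, which draws random strings until one is not already present in the given column. A self-contained sketch of that pattern, checking uniqueness against a plain set instead of a table (the function name and signature here are illustrative, not aurweb's actual `db.make_random_value` signature):

```python
import secrets
import string


def make_random_value(existing: set, length: int = 32) -> str:
    """Generate a random alphanumeric value not present in `existing`.

    Stand-in for aurweb's db.make_random_value(), which checks
    uniqueness against a model column rather than a set.
    """
    alphabet = string.ascii_letters + string.digits
    while True:
        value = "".join(secrets.choice(alphabet) for _ in range(length))
        if value not in existing:
            return value
```

With a 62-character alphabet and 32 positions, collisions are vanishingly rare, but the retry loop guarantees the returned SID is unused at generation time.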
42 aurweb/models/ssh_pub_key.py Normal file
@@ -0,0 +1,42 @@
import os
import tempfile

from subprocess import PIPE, Popen

from sqlalchemy.orm import backref, relationship

from aurweb import schema
from aurweb.models.declarative import Base


class SSHPubKey(Base):
    __table__ = schema.SSHPubKeys
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.Fingerprint]}

    User = relationship(
        "User", backref=backref("ssh_pub_key", uselist=False),
        foreign_keys=[__table__.c.UserID])

    def __init__(self, **kwargs):
        super().__init__(**kwargs)


def get_fingerprint(pubkey):
    with tempfile.TemporaryDirectory() as tmpdir:
        pk = os.path.join(tmpdir, "ssh.pub")

        with open(pk, "w") as f:
            f.write(pubkey)

        proc = Popen(["ssh-keygen", "-l", "-f", pk], stdout=PIPE, stderr=PIPE)
        out, err = proc.communicate()

        # Invalid SSH Public Key. Return None to the caller.
        if proc.returncode != 0:
            return None

        parts = out.decode().split()
        fp = parts[1].replace("SHA256:", "")

    return fp
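`get_fingerprint` shells out to `ssh-keygen -l` and then parses its one-line report: the second whitespace-separated field is the fingerprint, prefixed by the hash algorithm. The parsing step can be isolated and tested without invoking `ssh-keygen` (the sample output line below is hypothetical, constructed to mirror the `ssh-keygen -l` format):

```python
def parse_fingerprint(keygen_output: str) -> str:
    """Extract the fingerprint from `ssh-keygen -l` output.

    The report looks like: "<bits> SHA256:<hash> <comment> (<type>)".
    We take the second field and strip the algorithm prefix.
    """
    parts = keygen_output.split()
    return parts[1].replace("SHA256:", "")


# Hypothetical sample line for illustration only:
sample = "3072 SHA256:AbCdEf0123456789 user@host (RSA)"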
25 aurweb/models/term.py Normal file
@@ -0,0 +1,25 @@
from sqlalchemy.exc import IntegrityError

from aurweb import schema
from aurweb.models.declarative import Base


class Term(Base):
    __table__ = schema.Terms
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.ID]}

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.Description:
            raise IntegrityError(
                statement="Column Description cannot be null.",
                orig="Terms.Description",
                params=("NULL"))

        if not self.URL:
            raise IntegrityError(
                statement="Column URL cannot be null.",
                orig="Terms.URL",
                params=("NULL"))
38 aurweb/models/tu_vote.py Normal file
@@ -0,0 +1,38 @@
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import schema
from aurweb.models.declarative import Base
from aurweb.models.tu_voteinfo import TUVoteInfo as _TUVoteInfo
from aurweb.models.user import User as _User


class TUVote(Base):
    __table__ = schema.TU_Votes
    __tablename__ = __table__.name
    __mapper_args__ = {
        "primary_key": [__table__.c.VoteID, __table__.c.UserID]
    }

    VoteInfo = relationship(
        _TUVoteInfo, backref=backref("tu_votes", lazy="dynamic"),
        foreign_keys=[__table__.c.VoteID])

    User = relationship(
        _User, backref=backref("tu_votes", lazy="dynamic"),
        foreign_keys=[__table__.c.UserID])

    def __init__(self, **kwargs):
        super().__init__(**kwargs)

        if not self.VoteInfo and not self.VoteID:
            raise IntegrityError(
                statement="Foreign key VoteID cannot be null.",
                orig="TU_Votes.VoteID",
                params=("NULL"))

        if not self.User and not self.UserID:
            raise IntegrityError(
                statement="Foreign key UserID cannot be null.",
                orig="TU_Votes.UserID",
                params=("NULL"))
75 aurweb/models/tu_voteinfo.py Normal file
@@ -0,0 +1,75 @@
import typing

from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

from aurweb import schema, time
from aurweb.models.declarative import Base
from aurweb.models.user import User as _User


class TUVoteInfo(Base):
    __table__ = schema.TU_VoteInfo
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.ID]}

    Submitter = relationship(
        _User, backref=backref("tu_voteinfo_set", lazy="dynamic"),
        foreign_keys=[__table__.c.SubmitterID])

    def __init__(self, **kwargs):
        # Default Quorum, Yes, No and Abstain columns to 0.
        for col in ("Quorum", "Yes", "No", "Abstain"):
            if col not in kwargs:
                kwargs.update({col: 0})

        super().__init__(**kwargs)

        if self.Agenda is None:
            raise IntegrityError(
                statement="Column Agenda cannot be null.",
                orig="TU_VoteInfo.Agenda",
                params=("NULL"))

        if self.User is None:
            raise IntegrityError(
                statement="Column User cannot be null.",
                orig="TU_VoteInfo.User",
                params=("NULL"))

        if self.Submitted is None:
            raise IntegrityError(
                statement="Column Submitted cannot be null.",
                orig="TU_VoteInfo.Submitted",
                params=("NULL"))

        if self.End is None:
            raise IntegrityError(
                statement="Column End cannot be null.",
                orig="TU_VoteInfo.End",
                params=("NULL"))

        if not self.Submitter:
            raise IntegrityError(
                statement="Foreign key SubmitterID cannot be null.",
                orig="TU_VoteInfo.SubmitterID",
                params=("NULL"))

    def __setattr__(self, key: str, value: typing.Any):
        """ Customize setattr to stringify any Quorum keys given. """
        if key == "Quorum":
            value = str(value)
        return super().__setattr__(key, value)

    def __getattribute__(self, key: str):
        """ Customize getattr to floatify any fetched Quorum values. """
        attr = super().__getattribute__(key)
        if key == "Quorum":
            return float(attr)
        return attr

    def is_running(self):
        return self.End > time.utcnow()

    def total_votes(self):
        return self.Yes + self.No + self.Abstain
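The `__setattr__`/`__getattribute__` pair in TUVoteInfo converts the Quorum value between its stored string form and the float callers work with. That coercion pattern can be demonstrated in isolation (the `QuorumModel` class below is a hypothetical stand-in with the ORM machinery stripped out):

```python
class QuorumModel:
    """Minimal stand-in for TUVoteInfo's Quorum handling: values are
    stringified on assignment and converted back to float on access."""

    def __setattr__(self, key, value):
        # Store Quorum as its string representation.
        if key == "Quorum":
            value = str(value)
        return super().__setattr__(key, value)

    def __getattribute__(self, key):
        # Hand Quorum back to callers as a float.
        attr = super().__getattribute__(key)
        if key == "Quorum":
            return float(attr)
        return attr
```

Callers assign and read floats transparently while the attribute itself holds a string, mirroring the column's storage type.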
256 aurweb/models/user.py Normal file
@@ -0,0 +1,256 @@
import hashlib

from typing import List, Set

import bcrypt

from fastapi import Request
from sqlalchemy import or_
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import backref, relationship

import aurweb.config
import aurweb.models.account_type
import aurweb.schema

from aurweb import db, logging, schema, time, util
from aurweb.models.account_type import AccountType as _AccountType
from aurweb.models.ban import is_banned
from aurweb.models.declarative import Base

logger = logging.get_logger(__name__)

SALT_ROUNDS_DEFAULT = 12


class User(Base):
    """ An ORM model of a single Users record. """
    __table__ = schema.Users
    __tablename__ = __table__.name
    __mapper_args__ = {"primary_key": [__table__.c.ID]}

    AccountType = relationship(
        _AccountType,
        backref=backref("users", lazy="dynamic"),
        foreign_keys=[__table__.c.AccountTypeID],
        uselist=False)

    # High-level variables used to track authentication (not in DB).
    authenticated = False
    nonce = None

    # Make this static to the class just in case SQLAlchemy ever
    # does something to bypass our constructor.
    salt_rounds = aurweb.config.getint("options", "salt_rounds",
                                       SALT_ROUNDS_DEFAULT)

    def __init__(self, Passwd: str = str(), **kwargs):
        super().__init__(**kwargs, Passwd=str())

        # Run this again in the constructor in case we rehashed config.
        self.salt_rounds = aurweb.config.getint("options", "salt_rounds",
                                                SALT_ROUNDS_DEFAULT)
        if Passwd:
            self.update_password(Passwd)

    def update_password(self, password):
        self.Passwd = bcrypt.hashpw(
            password.encode(),
            bcrypt.gensalt(rounds=self.salt_rounds)).decode()

    @staticmethod
    def minimum_passwd_length():
        return aurweb.config.getint("options", "passwd_min_len")

    def is_authenticated(self):
        """ Return internal authenticated state. """
        return self.authenticated

    def valid_password(self, password: str):
        """ Check authentication against a given password. """
        if password is None:
            return False

        password_is_valid = False

        try:
            password_is_valid = bcrypt.checkpw(password.encode(),
                                               self.Passwd.encode())
        except ValueError:
            pass

        # If our Salt column is not empty, we're using a legacy password.
        if not password_is_valid and self.Salt != str():
            # Try to login with legacy method.
            password_is_valid = hashlib.md5(
                f"{self.Salt}{password}".encode()
            ).hexdigest() == self.Passwd

            # We got here, we passed the legacy authentication.
            # Update the password to our modern hash style.
            if password_is_valid:
                self.update_password(password)

        return password_is_valid

    def _login_approved(self, request: Request):
        return not is_banned(request) and not self.Suspended

    def login(self, request: Request, password: str,
              session_time: int = 0) -> str:
        """ Login and authenticate a request. """

        from aurweb import db
        from aurweb.models.session import Session, generate_unique_sid

        if not self._login_approved(request):
            return None

        self.authenticated = self.valid_password(password)
        if not self.authenticated:
            return None

        # Maximum number of iterations where we attempt to generate
        # a unique SID. In cases where the Session table has
        # exhausted all possible values, this will catch exceptions
        # instead of raising them and include details about failing
        # generation in an HTTPException.
        tries = 36

        exc = None
        for i in range(tries):
            exc = None
            now_ts = time.utcnow()
            try:
                with db.begin():
                    self.LastLogin = now_ts
                    self.LastLoginIPAddress = request.client.host
                    if not self.session:
                        sid = generate_unique_sid()
                        self.session = db.create(Session, User=self,
                                                 SessionID=sid,
                                                 LastUpdateTS=now_ts)
                    else:
                        last_updated = self.session.LastUpdateTS
                        if last_updated and last_updated < now_ts:
                            self.session.SessionID = generate_unique_sid()
                        self.session.LastUpdateTS = now_ts
                break
            except IntegrityError as exc_:
                exc = exc_

        if exc:
            raise exc

        return self.session.SessionID

    def has_credential(self, credential: Set[int],
                       approved: List["User"] = list()):
        from aurweb.auth.creds import has_credential
        return has_credential(self, credential, approved)

    def logout(self, request: Request):
        self.authenticated = False
        if self.session:
            with db.begin():
                db.delete(self.session)

    def is_trusted_user(self):
        return self.AccountType.ID in {
            aurweb.models.account_type.TRUSTED_USER_ID,
            aurweb.models.account_type.TRUSTED_USER_AND_DEV_ID
        }

    def is_developer(self):
        return self.AccountType.ID in {
            aurweb.models.account_type.DEVELOPER_ID,
            aurweb.models.account_type.TRUSTED_USER_AND_DEV_ID
        }

    def is_elevated(self):
        """ A User is 'elevated' when they have either a
|
||||||
|
Trusted User or Developer AccountType. """
|
||||||
|
return self.AccountType.ID in {
|
||||||
|
aurweb.models.account_type.TRUSTED_USER_ID,
|
||||||
|
aurweb.models.account_type.DEVELOPER_ID,
|
||||||
|
aurweb.models.account_type.TRUSTED_USER_AND_DEV_ID,
|
||||||
|
}
|
||||||
|
|
||||||
|
def can_edit_user(self, target: "User") -> bool:
|
||||||
|
"""
|
||||||
|
Whether this User instance can edit `target`.
|
||||||
|
|
||||||
|
This User can edit user `target` if we both: have credentials and
|
||||||
|
self.AccountTypeID is greater or equal to `target`.AccountTypeID.
|
||||||
|
|
||||||
|
In short, a user must at least have credentials and be at least
|
||||||
|
the same account type as the target.
|
||||||
|
|
||||||
|
User < Trusted User < Developer < Trusted User & Developer
|
||||||
|
|
||||||
|
:param target: Target User to be edited
|
||||||
|
:return: Boolean indicating whether `self` can edit `target`
|
||||||
|
"""
|
||||||
|
from aurweb.auth import creds
|
||||||
|
has_cred = self.has_credential(creds.ACCOUNT_EDIT, approved=[target])
|
||||||
|
return has_cred and self.AccountTypeID >= target.AccountTypeID
|
||||||
|
|
||||||
|
def voted_for(self, package) -> bool:
|
||||||
|
""" Has this User voted for package? """
|
||||||
|
from aurweb.models.package_vote import PackageVote
|
||||||
|
return bool(package.PackageBase.package_votes.filter(
|
||||||
|
PackageVote.UsersID == self.ID
|
||||||
|
).scalar())
|
||||||
|
|
||||||
|
def notified(self, package) -> bool:
|
||||||
|
""" Is this User being notified about package (or package base)?
|
||||||
|
|
||||||
|
:param package: Package or PackageBase instance
|
||||||
|
:return: Boolean indicating state of package notification
|
||||||
|
in relation to this User
|
||||||
|
"""
|
||||||
|
from aurweb.models.package import Package
|
||||||
|
from aurweb.models.package_base import PackageBase
|
||||||
|
from aurweb.models.package_notification import PackageNotification
|
||||||
|
|
||||||
|
query = None
|
||||||
|
if isinstance(package, Package):
|
||||||
|
query = package.PackageBase.notifications
|
||||||
|
elif isinstance(package, PackageBase):
|
||||||
|
query = package.notifications
|
||||||
|
|
||||||
|
# Run an exists() query where a pkgbase-related
|
||||||
|
# PackageNotification exists for self (a user).
|
||||||
|
return bool(db.query(
|
||||||
|
query.filter(PackageNotification.UserID == self.ID).exists()
|
||||||
|
).scalar())
|
||||||
|
|
||||||
|
def packages(self):
|
||||||
|
""" Returns an ORM query to Package objects owned by this user.
|
||||||
|
|
||||||
|
This should really be replaced with an internal ORM join
|
||||||
|
configured for the User model. This has not been done yet
|
||||||
|
due to issues I've been encountering in the process, so
|
||||||
|
sticking with this function until we can properly implement it.
|
||||||
|
|
||||||
|
:return: ORM query of User-packaged or maintained Package objects
|
||||||
|
"""
|
||||||
|
from aurweb.models.package import Package
|
||||||
|
from aurweb.models.package_base import PackageBase
|
||||||
|
return db.query(Package).join(PackageBase).filter(
|
||||||
|
or_(
|
||||||
|
PackageBase.PackagerUID == self.ID,
|
||||||
|
PackageBase.MaintainerUID == self.ID
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
|
def __repr__(self):
|
||||||
|
return "<User(ID='%s', AccountType='%s', Username='%s')>" % (
|
||||||
|
self.ID, str(self.AccountType), self.Username)
|
||||||
|
|
||||||
|
def __str__(self) -> str:
|
||||||
|
return self.Username
|
||||||
|
|
||||||
|
|
||||||
|
def generate_resetkey():
|
||||||
|
return util.make_random_string(32)
|
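The legacy fallback in `valid_password` above reduces to comparing md5(Salt + password) against the stored digest when the bcrypt check fails; on a match, the real method upgrades the account via `update_password`. A minimal standalone sketch of just that legacy check (the salt and stored hash here are hypothetical, for illustration only):

```python
import hashlib

def check_legacy_password(salt: str, password: str, stored_hash: str) -> bool:
    # Recompute the legacy digest, md5(Salt + password), and compare
    # it against the stored hex digest.
    digest = hashlib.md5(f"{salt}{password}".encode()).hexdigest()
    return digest == stored_hash

# Hypothetical credentials, for illustration only.
salt = "Xy3"
stored = hashlib.md5(b"Xy3hunter2").hexdigest()
print(check_legacy_password(salt, "hunter2", stored))  # True
print(check_legacy_password(salt, "wrong", stored))    # False
```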
aurweb/packages/__init__.py | 0 lines (new file)
aurweb/packages/requests.py | 235 lines (new file)
@@ -0,0 +1,235 @@
from typing import List, Optional, Set

from fastapi import Request
from sqlalchemy import and_, orm

from aurweb import config, db, l10n, time, util
from aurweb.exceptions import InvariantError
from aurweb.models import PackageBase, PackageRequest, User
from aurweb.models.package_request import ACCEPTED_ID, PENDING_ID, REJECTED_ID
from aurweb.models.request_type import DELETION, DELETION_ID, MERGE, MERGE_ID, ORPHAN, ORPHAN_ID
from aurweb.scripts import notify


class ClosureFactory:
    """ A factory class used to autogenerate closure comments. """

    REQTYPE_NAMES = {
        DELETION_ID: DELETION,
        MERGE_ID: MERGE,
        ORPHAN_ID: ORPHAN
    }

    def _deletion_closure(self, requester: User,
                          pkgbase: PackageBase,
                          target: PackageBase = None):
        return f"[Autogenerated] Accepted deletion for {pkgbase.Name}."

    def _merge_closure(self, requester: User,
                       pkgbase: PackageBase,
                       target: PackageBase = None):
        return (f"[Autogenerated] Accepted merge for {pkgbase.Name} "
                f"into {target.Name}.")

    def _orphan_closure(self, requester: User,
                        pkgbase: PackageBase,
                        target: PackageBase = None):
        return f"[Autogenerated] Accepted orphan for {pkgbase.Name}."

    def _rejected_merge_closure(self, requester: User,
                                pkgbase: PackageBase,
                                target: PackageBase = None):
        return (f"[Autogenerated] Another request to merge {pkgbase.Name} "
                f"into {target.Name} has rendered this request invalid.")

    def get_closure(self, reqtype_id: int,
                    requester: User,
                    pkgbase: PackageBase,
                    target: PackageBase = None,
                    status: int = ACCEPTED_ID) -> str:
        """
        Return a closure comment handled by this class.

        :param reqtype_id: RequestType.ID
        :param requester: User who is closing a request
        :param pkgbase: PackageBase instance related to the request
        :param target: Merge request target PackageBase instance
        :param status: PackageRequest.Status
        """
        reqtype = ClosureFactory.REQTYPE_NAMES.get(reqtype_id)

        partial = str()
        if status == REJECTED_ID:
            partial = "_rejected"

        try:
            handler = getattr(self, f"{partial}_{reqtype}_closure")
        except AttributeError:
            raise NotImplementedError("Unsupported 'reqtype_id' value.")
        return handler(requester, pkgbase, target)
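The `get_closure` dispatch builds a method name out of the status and request type and resolves it with `getattr`. A simplified standalone model of that pattern (class and handler names here are hypothetical, not the aurweb API):

```python
class Dispatcher:
    """Minimal sketch of the getattr() dispatch used by ClosureFactory."""

    def _deletion_closure(self, name: str) -> str:
        return f"Accepted deletion for {name}."

    def _rejected_merge_closure(self, name: str) -> str:
        return f"Rejected merge for {name}."

    def get_closure(self, reqtype: str, name: str, rejected: bool = False) -> str:
        # Compose the handler name from an optional "_rejected" prefix
        # and the request type, then resolve it on the instance.
        partial = "_rejected" if rejected else ""
        try:
            handler = getattr(self, f"{partial}_{reqtype}_closure")
        except AttributeError:
            raise NotImplementedError("Unsupported request type.")
        return handler(name)

d = Dispatcher()
print(d.get_closure("deletion", "foo"))              # Accepted deletion for foo.
print(d.get_closure("merge", "foo", rejected=True))  # Rejected merge for foo.
```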
def update_closure_comment(pkgbase: PackageBase, reqtype_id: int,
                           comments: str, target: PackageBase = None) -> None:
    """
    Update all pending requests related to `pkgbase` with a closure comment.

    In order to persist closure comments through `handle_request`'s
    algorithm, we must set `PackageRequest.ClosureComment` before calling
    it. This function can be used to update the closure comment of all
    package requests related to `pkgbase` and `reqtype_id`.

    If an empty `comments` string is provided, we no-op out of this.

    :param pkgbase: PackageBase instance
    :param reqtype_id: RequestType.ID
    :param comments: PackageRequest.ClosureComment to update to
    :param target: Merge request target PackageBase instance
    """
    if not comments:
        return

    query = pkgbase.requests.filter(
        and_(PackageRequest.ReqTypeID == reqtype_id,
             PackageRequest.Status == PENDING_ID))
    if reqtype_id == MERGE_ID:
        query = query.filter(PackageRequest.MergeBaseName == target.Name)

    for pkgreq in query:
        pkgreq.ClosureComment = comments
def verify_orphan_request(user: User, pkgbase: PackageBase):
    """ Verify that a due orphan request exists for `pkgbase`. """
    requests = pkgbase.requests.filter(
        PackageRequest.ReqTypeID == ORPHAN_ID)
    for pkgreq in requests:
        idle_time = config.getint("options", "request_idle_time")
        time_delta = time.utcnow() - pkgreq.RequestTS
        is_due = pkgreq.Status == PENDING_ID and time_delta > idle_time
        if is_due:
            # The request has been pending longer than the configured
            # idle time, so it is due: return True.
            return True

    return False
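The due check above is a plain elapsed-time comparison against a configured idle period. A standalone sketch of just that comparison (the two-week constant below is an assumption; the real value comes from the `options.request_idle_time` configuration entry):

```python
import time

# Assumed idle period for illustration; aurweb reads this from
# the `options.request_idle_time` config entry.
REQUEST_IDLE_TIME = 14 * 24 * 60 * 60

def orphan_request_is_due(request_ts: int, now: int = None) -> bool:
    # A pending orphan request becomes due once more than the idle
    # period has elapsed since it was filed.
    if now is None:
        now = int(time.time())
    return (now - request_ts) > REQUEST_IDLE_TIME

now = 1_700_000_000
print(orphan_request_is_due(now - REQUEST_IDLE_TIME - 1, now))  # True
print(orphan_request_is_due(now - 3600, now))                   # False
```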
def close_pkgreq(pkgreq: PackageRequest, closer: User,
                 pkgbase: PackageBase, target: Optional[PackageBase],
                 status: int) -> None:
    """
    Close a package request with `pkgreq`.Status == `status`.

    :param pkgreq: PackageRequest instance
    :param closer: `pkgreq`.Closer User instance to update to
    :param pkgbase: PackageBase instance which `pkgreq` is about
    :param target: Optional PackageBase instance to merge into
    :param status: `pkgreq`.Status value to update to
    """
    now = time.utcnow()
    pkgreq.Status = status
    pkgreq.Closer = closer
    pkgreq.ClosureComment = (
        pkgreq.ClosureComment or ClosureFactory().get_closure(
            pkgreq.ReqTypeID, closer, pkgbase, target, status)
    )
    pkgreq.ClosedTS = now
def handle_request(request: Request, reqtype_id: int,
                   pkgbase: PackageBase,
                   target: PackageBase = None) -> List[notify.Notification]:
    """
    Handle package requests before performing an action.

    The actions we're interested in are disown (orphan), delete and
    merge. There is now automated request generation and a closure
    notification when a privileged user performs one of these actions
    without a pre-existing request. They all commit changes to the
    database, so state should be verified before calling to avoid
    leaking database records regarding these requests.

    Otherwise, we accept and reject requests based on their state
    and send out the relevant notifications.

    :param request: Request made by the user performing the action
    :param reqtype_id: RequestType.ID
    :param pkgbase: PackageBase which the request is about
    :param target: Optional target to merge into
    """
    notifs: List[notify.Notification] = []

    # If it's an orphan request, perform further verification
    # regarding existing requests.
    if reqtype_id == ORPHAN_ID:
        if not verify_orphan_request(request.user, pkgbase):
            _ = l10n.get_translator_for_request(request)
            raise InvariantError(_(
                "No due existing orphan requests to accept for %s."
            ) % pkgbase.Name)

    # Produce a base query for requests related to `pkgbase`, based
    # on ReqTypeID matching `reqtype_id`, pending status and a correct
    # PackageBaseName column.
    query: orm.Query = pkgbase.requests.filter(
        and_(PackageRequest.ReqTypeID == reqtype_id,
             PackageRequest.Status == PENDING_ID,
             PackageRequest.PackageBaseName == pkgbase.Name))

    # Build a query for records we should accept. For merge requests,
    # this is specific to a matching MergeBaseName. For others, this
    # just ends up being `query` itself.
    accept_query: orm.Query = query
    if target:
        # If a `target` was supplied, filter by MergeBaseName.
        accept_query = query.filter(
            PackageRequest.MergeBaseName == target.Name)

    # Build an accept list out of `accept_query`.
    to_accept: List[PackageRequest] = accept_query.all()
    accepted_ids: Set[int] = set(p.ID for p in to_accept)

    # Build a reject list out of `query` filtered by IDs not found
    # in `to_accept`; that is, unmatched records with the same base
    # query properties.
    to_reject: List[PackageRequest] = query.filter(
        ~PackageRequest.ID.in_(accepted_ids)
    ).all()

    # If we have no requests to accept, create a new one.
    # This is done to increase tracking of actions occurring
    # through the website.
    if not to_accept:
        with db.begin():
            pkgreq = db.create(PackageRequest,
                               ReqTypeID=reqtype_id,
                               User=request.user,
                               PackageBase=pkgbase,
                               PackageBaseName=pkgbase.Name,
                               Comments="Autogenerated by aurweb.",
                               ClosureComment=str())

            # If it's a merge request, set MergeBaseName to `target`.Name.
            if pkgreq.ReqTypeID == MERGE_ID:
                pkgreq.MergeBaseName = target.Name

        # Add the new request to `to_accept` and allow the standard
        # flow to continue afterward.
        to_accept.append(pkgreq)

    # Update requests with their new status and closures.
    with db.begin():
        util.apply_all(to_accept, lambda p: close_pkgreq(
            p, request.user, pkgbase, target, ACCEPTED_ID))
        util.apply_all(to_reject, lambda p: close_pkgreq(
            p, request.user, pkgbase, target, REJECTED_ID))

    # Create RequestCloseNotifications for all requests involved.
    for pkgreq in (to_accept + to_reject):
        notif = notify.RequestCloseNotification(
            request.user.ID, pkgreq.ID, pkgreq.status_display())
        notifs.append(notif)

    # Return notifications to the caller for sending.
    return notifs
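The accept/reject split in `handle_request` can be sketched without the ORM: for a merge, only pending requests matching the merge target are accepted, and the remaining pending requests of the same type are rejected as invalidated. A minimal model of that partition (the dict keys here are hypothetical stand-ins for PackageRequest columns):

```python
from typing import List, Optional, Tuple

def partition_requests(pending: List[dict],
                       target: Optional[str] = None) -> Tuple[list, list]:
    # Mirror handle_request's split: merge requests matching `target`
    # are accepted; other pending requests of the same set are rejected.
    if target is None:
        return pending, []
    to_accept = [r for r in pending if r["merge_into"] == target]
    accepted_ids = {r["id"] for r in to_accept}
    to_reject = [r for r in pending if r["id"] not in accepted_ids]
    return to_accept, to_reject

pending = [{"id": 1, "merge_into": "bar"}, {"id": 2, "merge_into": "baz"}]
accepted, rejected = partition_requests(pending, target="bar")
print([r["id"] for r in accepted])  # [1]
print([r["id"] for r in rejected])  # [2]
```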
aurweb/packages/search.py | 320 lines (new file)
@@ -0,0 +1,320 @@
from sqlalchemy import and_, case, or_, orm

from aurweb import db, models
from aurweb.models import Package, PackageBase, User
from aurweb.models.dependency_type import CHECKDEPENDS_ID, DEPENDS_ID, MAKEDEPENDS_ID, OPTDEPENDS_ID
from aurweb.models.package_comaintainer import PackageComaintainer
from aurweb.models.package_keyword import PackageKeyword
from aurweb.models.package_notification import PackageNotification
from aurweb.models.package_vote import PackageVote


class PackageSearch:
    """ A Package search query builder. """

    # A constant mapping of short to full name sort orderings.
    FULL_SORT_ORDER = {"d": "desc", "a": "asc"}

    def __init__(self, user: models.User = None):
        self.query = db.query(Package).join(PackageBase)

        self.user = user
        if self.user:
            self.query = self.query.join(
                PackageVote,
                and_(PackageVote.PackageBaseID == PackageBase.ID,
                     PackageVote.UsersID == self.user.ID),
                isouter=True
            ).join(
                PackageNotification,
                and_(PackageNotification.PackageBaseID == PackageBase.ID,
                     PackageNotification.UserID == self.user.ID),
                isouter=True
            )

        self.ordering = "d"

        # Set up SeB (Search By) callbacks.
        self.search_by_cb = {
            "nd": self._search_by_namedesc,
            "n": self._search_by_name,
            "b": self._search_by_pkgbase,
            "N": self._search_by_exact_name,
            "B": self._search_by_exact_pkgbase,
            "k": self._search_by_keywords,
            "m": self._search_by_maintainer,
            "c": self._search_by_comaintainer,
            "M": self._search_by_co_or_maintainer,
            "s": self._search_by_submitter
        }

        # Set up SB (Sort By) callbacks.
        self.sort_by_cb = {
            "n": self._sort_by_name,
            "v": self._sort_by_votes,
            "p": self._sort_by_popularity,
            "w": self._sort_by_voted,
            "o": self._sort_by_notify,
            "m": self._sort_by_maintainer,
            "l": self._sort_by_last_modified
        }

        self._joined = False

    def _join_user(self, outer: bool = True) -> orm.Query:
        """ Centralized joining of a package base's maintainer. """
        if not self._joined:
            self.query = self.query.join(
                User,
                User.ID == PackageBase.MaintainerUID,
                isouter=outer
            )
            self._joined = True
        return self.query
    def _search_by_namedesc(self, keywords: str) -> orm.Query:
        self._join_user()
        self.query = self.query.filter(
            or_(Package.Name.like(f"%{keywords}%"),
                Package.Description.like(f"%{keywords}%"))
        )
        return self

    def _search_by_name(self, keywords: str) -> orm.Query:
        self._join_user()
        self.query = self.query.filter(Package.Name.like(f"%{keywords}%"))
        return self

    def _search_by_exact_name(self, keywords: str) -> orm.Query:
        self._join_user()
        self.query = self.query.filter(Package.Name == keywords)
        return self

    def _search_by_pkgbase(self, keywords: str) -> orm.Query:
        self._join_user()
        self.query = self.query.filter(PackageBase.Name.like(f"%{keywords}%"))
        return self

    def _search_by_exact_pkgbase(self, keywords: str) -> orm.Query:
        self._join_user()
        self.query = self.query.filter(PackageBase.Name == keywords)
        return self

    def _search_by_keywords(self, keywords: str) -> orm.Query:
        self._join_user()
        self.query = self.query.join(PackageKeyword).filter(
            PackageKeyword.Keyword == keywords
        )
        return self

    def _search_by_maintainer(self, keywords: str) -> orm.Query:
        self._join_user()
        if keywords:
            self.query = self.query.filter(
                and_(User.Username == keywords,
                     User.ID == PackageBase.MaintainerUID)
            )
        else:
            self.query = self.query.filter(PackageBase.MaintainerUID.is_(None))
        return self

    def _search_by_comaintainer(self, keywords: str) -> orm.Query:
        self._join_user()
        exists_subq = db.query(PackageComaintainer).join(User).filter(
            and_(PackageComaintainer.PackageBaseID == PackageBase.ID,
                 User.Username == keywords)
        ).exists()
        self.query = self.query.filter(db.query(exists_subq).scalar_subquery())
        return self

    def _search_by_co_or_maintainer(self, keywords: str) -> orm.Query:
        self._join_user()
        exists_subq = db.query(PackageComaintainer).join(User).filter(
            and_(PackageComaintainer.PackageBaseID == PackageBase.ID,
                 User.Username == keywords)
        ).exists()
        self.query = self.query.filter(
            or_(and_(User.Username == keywords,
                     User.ID == PackageBase.MaintainerUID),
                db.query(exists_subq).scalar_subquery())
        )
        return self

    def _search_by_submitter(self, keywords: str) -> orm.Query:
        self._join_user()

        uid = 0
        user = db.query(User).filter(User.Username == keywords).first()
        if user:
            uid = user.ID

        self.query = self.query.filter(PackageBase.SubmitterUID == uid)
        return self

    def search_by(self, search_by: str, keywords: str) -> orm.Query:
        if search_by not in self.search_by_cb:
            search_by = "nd"  # Default: Name, Description
        callback = self.search_by_cb.get(search_by)
        result = callback(keywords)
        return result
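The `search_by` entry point above is a callback-table dispatch with a safe default. The same shape, stripped of the ORM (handler bodies here are hypothetical placeholders):

```python
# Callback table with a default key, mirroring how PackageSearch.search_by
# falls back to "nd" (Name, Description) for unknown SeB values.
callbacks = {
    "nd": lambda kw: f"name/desc: {kw}",
    "n": lambda kw: f"name: {kw}",
}

def search_by(seb: str, keywords: str) -> str:
    if seb not in callbacks:
        seb = "nd"  # Default: Name, Description
    return callbacks[seb](keywords)

print(search_by("n", "vim"))      # name: vim
print(search_by("bogus", "vim"))  # name/desc: vim
```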
    def _sort_by_name(self, order: str):
        column = getattr(models.Package.Name, order)
        self.query = self.query.order_by(column())
        return self

    def _sort_by_votes(self, order: str):
        column = getattr(models.PackageBase.NumVotes, order)
        name = getattr(models.Package.Name, order)
        self.query = self.query.order_by(column(), name())
        return self

    def _sort_by_popularity(self, order: str):
        column = getattr(models.PackageBase.Popularity, order)
        name = getattr(models.Package.Name, order)
        self.query = self.query.order_by(column(), name())
        return self

    def _sort_by_voted(self, order: str):
        # FIXME: Currently, PHP is destroying this implementation
        # in terms of performance. We should improve this; there's no
        # reason it should take _longer_.
        column = getattr(
            case([(models.PackageVote.UsersID == self.user.ID, 1)], else_=0),
            order
        )
        name = getattr(models.Package.Name, order)
        self.query = self.query.order_by(column(), name())
        return self

    def _sort_by_notify(self, order: str):
        # FIXME: Currently, PHP is destroying this implementation
        # in terms of performance. We should improve this; there's no
        # reason it should take _longer_.
        column = getattr(
            case([(models.PackageNotification.UserID == self.user.ID, 1)],
                 else_=0),
            order
        )
        name = getattr(models.Package.Name, order)
        self.query = self.query.order_by(column(), name())
        return self

    def _sort_by_maintainer(self, order: str):
        column = getattr(models.User.Username, order)
        name = getattr(models.Package.Name, order)
        self.query = self.query.order_by(column(), name())
        return self

    def _sort_by_last_modified(self, order: str):
        column = getattr(models.PackageBase.ModifiedTS, order)
        name = getattr(models.Package.Name, order)
        self.query = self.query.order_by(column(), name())
        return self

    def sort_by(self, sort_by: str, ordering: str = "d") -> orm.Query:
        if sort_by not in self.sort_by_cb:
            sort_by = "p"  # Default: Popularity
        callback = self.sort_by_cb.get(sort_by)
        if ordering not in self.FULL_SORT_ORDER:
            ordering = "d"  # Default: Descending
        ordering = self.FULL_SORT_ORDER.get(ordering)
        return callback(ordering)

    def count(self, limit: int) -> int:
        """
        Return the internal query's count, up to `limit`.

        :param limit: Upper bound
        :return: Database count up to `limit`
        """
        return self.query.limit(limit).count()

    def results(self) -> orm.Query:
        """ Return the internal query. """
        return self.query
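The sort callbacks resolve the short `"d"`/`"a"` codes through `FULL_SORT_ORDER` and then call `getattr(column, order)()` to get the SQLAlchemy `asc()`/`desc()` expression. A plain-Python analogue of the same defaulting and ordering logic, using `sorted()` over hypothetical row dicts:

```python
FULL_SORT_ORDER = {"d": "desc", "a": "asc"}

def sort_rows(rows, key, ordering="d"):
    # Unknown orderings fall back to descending, as in sort_by().
    if ordering not in FULL_SORT_ORDER:
        ordering = "d"  # Default: Descending
    reverse = FULL_SORT_ORDER[ordering] == "desc"
    return sorted(rows, key=lambda r: r[key], reverse=reverse)

rows = [{"Name": "a", "NumVotes": 1}, {"Name": "b", "NumVotes": 5}]
print([r["Name"] for r in sort_rows(rows, "NumVotes")])       # ['b', 'a']
print([r["Name"] for r in sort_rows(rows, "NumVotes", "a")])  # ['a', 'b']
```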
class RPCSearch(PackageSearch):
    """ A PackageSearch-derived RPC package search query builder.

    With RPC search, we need a subset of PackageSearch's handlers,
    with a few additional handlers added. So, within the RPCSearch
    constructor, we pop unneeded keys out of the inherited
    self.search_by_cb and add a few more to it, namely: depends,
    makedepends, optdepends and checkdepends.

    Additionally, some logic within the inherited PackageSearch.search_by
    method is not needed, so it is overridden in this class without
    the sanitization done for the PackageSearch `by` argument.
    """

    keys_removed = ("b", "N", "B", "k", "c", "M", "s")

    def __init__(self) -> "RPCSearch":
        super().__init__()

        # Fix up the inherited search_by_cb to reflect RPC-specific by params.
        # We keep: "nd", "n" and "m". We also overlay four new by params
        # on top: "depends", "makedepends", "optdepends" and "checkdepends".
        self.search_by_cb = {
            k: v for k, v in self.search_by_cb.items()
            if k not in RPCSearch.keys_removed
        }
        self.search_by_cb.update({
            "depends": self._search_by_depends,
            "makedepends": self._search_by_makedepends,
            "optdepends": self._search_by_optdepends,
            "checkdepends": self._search_by_checkdepends
        })

        # We always want an optional Maintainer in the RPC.
        self._join_user()

    def _join_depends(self, dep_type_id: int) -> orm.Query:
        """ Join Package with PackageDependency and filter results
        based on `dep_type_id`.

        :param dep_type_id: DependencyType ID
        :returns: PackageDependency-joined orm.Query
        """
        self.query = self.query.join(models.PackageDependency).filter(
            models.PackageDependency.DepTypeID == dep_type_id)
        return self.query

    def _search_by_depends(self, keywords: str) -> "RPCSearch":
        self.query = self._join_depends(DEPENDS_ID).filter(
            models.PackageDependency.DepName == keywords)
        return self

    def _search_by_makedepends(self, keywords: str) -> "RPCSearch":
        self.query = self._join_depends(MAKEDEPENDS_ID).filter(
            models.PackageDependency.DepName == keywords)
        return self

    def _search_by_optdepends(self, keywords: str) -> "RPCSearch":
        self.query = self._join_depends(OPTDEPENDS_ID).filter(
            models.PackageDependency.DepName == keywords)
        return self

    def _search_by_checkdepends(self, keywords: str) -> "RPCSearch":
        self.query = self._join_depends(CHECKDEPENDS_ID).filter(
            models.PackageDependency.DepName == keywords)
        return self

    def search_by(self, by: str, keywords: str) -> "RPCSearch":
        """ Override the inherited search_by. In this override, we reduce
        the scope of what we handle within this function. We do not set
        `by` to a default of "nd" in the RPC, as the RPC returns an error
        when an incorrect `by` field is specified.

        :param by: RPC `by` argument
        :param keywords: RPC `arg` argument
        :returns: self
        """
        callback = self.search_by_cb.get(by)
        result = callback(keywords)
        return result

    def results(self) -> orm.Query:
        return self.query.filter(models.PackageBase.PackagerUID.isnot(None))
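The constructor's dict surgery above, filtering inherited handlers and overlaying dependency-specific ones, can be shown in isolation (keys and values here are hypothetical placeholders, not the full aurweb tables):

```python
# RPCSearch trims inherited handlers via a dict comprehension, then
# overlays the dependency-specific ones with update().
base_cb = {"nd": "namedesc", "n": "name", "b": "pkgbase", "N": "exact",
           "m": "maintainer"}
keys_removed = ("b", "N")

cb = {k: v for k, v in base_cb.items() if k not in keys_removed}
cb.update({"depends": "by-depends", "makedepends": "by-makedepends"})

print(sorted(cb))  # ['depends', 'm', 'makedepends', 'n', 'nd']
```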
aurweb/packages/util.py | 261 lines (new file)
@@ -0,0 +1,261 @@
from collections import defaultdict
from http import HTTPStatus
from typing import Dict, List, Tuple, Union

import orjson

from fastapi import HTTPException
from sqlalchemy import orm

from aurweb import config, db, models
from aurweb.models import Package
from aurweb.models.official_provider import OFFICIAL_BASE, OfficialProvider
from aurweb.models.package_dependency import PackageDependency
from aurweb.models.package_relation import PackageRelation
from aurweb.redis import redis_connection
from aurweb.templates import register_filter

Providers = List[Union[PackageRelation, OfficialProvider]]


def dep_extra_with_arch(dep: models.PackageDependency, annotation: str) -> str:
    output = [annotation]
    if dep.DepArch:
        output.append(dep.DepArch)
    return f"({', '.join(output)})"


def dep_depends_extra(dep: models.PackageDependency) -> str:
    return str()


def dep_makedepends_extra(dep: models.PackageDependency) -> str:
    return dep_extra_with_arch(dep, "make")


def dep_checkdepends_extra(dep: models.PackageDependency) -> str:
    return dep_extra_with_arch(dep, "check")


def dep_optdepends_extra(dep: models.PackageDependency) -> str:
    return dep_extra_with_arch(dep, "optional")


@register_filter("dep_extra")
def dep_extra(dep: models.PackageDependency) -> str:
    """ Some dependency types have extra text added to their
    display. This function provides that output. However, it
    **assumes** that the dep passed is bound to a valid one
    of: depends, makedepends, checkdepends or optdepends. """
    f = globals().get(f"dep_{dep.DependencyType.Name}_extra")
    return f(dep)
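The `dep_extra` filter resolves its handler by constructing a function name and looking it up in the module's `globals()`. A standalone sketch of that lookup, with hypothetical handlers taking a plain string instead of a PackageDependency:

```python
def dep_depends_extra(dep: str) -> str:
    # Plain depends entries carry no annotation.
    return ""

def dep_makedepends_extra(dep: str) -> str:
    return "(make)"

def dep_extra(dep_type: str, dep: str) -> str:
    # Handler lookup by constructed name, as done with
    # dep.DependencyType.Name in the real filter.
    f = globals().get(f"dep_{dep_type}_extra")
    return f(dep)

print(dep_extra("makedepends", "gcc"))  # (make)
```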
@register_filter("dep_extra_desc")
|
||||||
|
def dep_extra_desc(dep: models.PackageDependency) -> str:
|
||||||
|
extra = dep_extra(dep)
|
||||||
|
if not dep.DepDesc:
|
||||||
|
return extra
|
||||||
|
return extra + f" – {dep.DepDesc}"
|
||||||
|
|
||||||
|
|
||||||
|
@register_filter("pkgname_link")
|
||||||
|
def pkgname_link(pkgname: str) -> str:
|
||||||
|
official = db.query(OfficialProvider).filter(
|
||||||
|
OfficialProvider.Name == pkgname).exists()
|
||||||
|
if db.query(official).scalar():
|
||||||
|
base = "/".join([OFFICIAL_BASE, "packages"])
|
||||||
|
return f"{base}/?q={pkgname}"
|
||||||
|
return f"/packages/{pkgname}"
|
||||||
|
|
||||||
|
|
||||||
|
@register_filter("package_link")
|
||||||
|
def package_link(package: Union[Package, OfficialProvider]) -> str:
|
||||||
|
if package.is_official:
|
||||||
|
base = "/".join([OFFICIAL_BASE, "packages"])
|
||||||
|
return f"{base}/?q={package.Name}"
|
||||||
|
return f"/packages/{package.Name}"
|
||||||
|
|
||||||
|
|
||||||
|
@register_filter("provides_markup")
|
||||||
|
def provides_markup(provides: Providers) -> str:
|
||||||
|
return ", ".join([
|
||||||
|
f'<a href="{package_link(pkg)}">{pkg.Name}</a>'
|
||||||
|
for pkg in provides
|
||||||
|
])
|
||||||
|
|
||||||
|
|
||||||
|
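The dependency filters above build a parenthesized annotation out of a dependency type plus an optional architecture. A minimal standalone sketch of that formatting, using a plain dataclass as a stand-in for the SQLAlchemy PackageDependency model (the stand-in is an assumption, not aurweb's API):

```python
from dataclasses import dataclass


@dataclass
class Dep:
    # Hypothetical stand-in for aurweb's PackageDependency model.
    DepArch: str = None


def dep_extra_with_arch(dep: Dep, annotation: str) -> str:
    # Join the annotation with the optional architecture,
    # e.g. "(make, x86_64)" or just "(optional)".
    output = [annotation]
    if dep.DepArch:
        output.append(dep.DepArch)
    return f"({', '.join(output)})"


print(dep_extra_with_arch(Dep("x86_64"), "make"))  # → (make, x86_64)
print(dep_extra_with_arch(Dep(), "optional"))      # → (optional)
```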
def get_pkg_or_base(
        name: str,
        cls: Union[models.Package, models.PackageBase] = models.PackageBase) \
        -> Union[models.Package, models.PackageBase]:
    """ Get a Package or PackageBase instance by its name or raise
    a 404 if it can't be found in the database.

    :param name: {Package,PackageBase}.Name
    :param cls: Model class to query against: Package or PackageBase
    :raises HTTPException: With status code 404 if the record doesn't exist
    :return: {Package,PackageBase} instance
    """
    provider = db.query(models.OfficialProvider).filter(
        models.OfficialProvider.Name == name).first()
    if provider:
        raise HTTPException(status_code=HTTPStatus.NOT_FOUND)

    with db.begin():
        instance = db.query(cls).filter(cls.Name == name).first()

    if not instance:
        raise HTTPException(status_code=HTTPStatus.NOT_FOUND)

    return instance


def get_pkgbase_comment(pkgbase: models.PackageBase, id: int) \
        -> models.PackageComment:
    comment = pkgbase.comments.filter(models.PackageComment.ID == id).first()
    if not comment:
        raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
    return db.refresh(comment)


@register_filter("out_of_date")
def out_of_date(packages: orm.Query) -> orm.Query:
    return packages.filter(models.PackageBase.OutOfDateTS.isnot(None))

def updated_packages(limit: int = 0,
                     cache_ttl: int = 600) -> List[models.Package]:
    """ Return a list of valid Package objects ordered by their
    ModifiedTS column in descending order from cache, after setting
    the cache when no key yet exists.

    :param limit: Optional record limit
    :param cache_ttl: Cache expiration time (in seconds)
    :return: A list of Packages
    """
    redis = redis_connection()
    packages = redis.get("package_updates")
    if packages:
        # If we already have a cache, deserialize it and return.
        return orjson.loads(packages)

    query = db.query(models.Package).join(models.PackageBase).filter(
        models.PackageBase.PackagerUID.isnot(None)
    ).order_by(
        models.PackageBase.ModifiedTS.desc()
    )

    if limit:
        query = query.limit(limit)

    packages = []
    for pkg in query:
        # For each Package returned by the query, append a dict
        # containing the Package columns we're interested in.
        db.refresh(pkg)
        packages.append({
            "Name": pkg.Name,
            "Version": pkg.Version,
            "PackageBase": {
                "ModifiedTS": pkg.PackageBase.ModifiedTS
            }
        })

    # Store the JSON serialization of the package_updates key into Redis.
    redis.set("package_updates", orjson.dumps(packages))
    redis.expire("package_updates", cache_ttl)

    # Return the freshly built list of packages.
    return packages
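updated_packages above follows a cache-aside pattern: try the cache, fall back to the query, then populate the cache under a TTL. The same flow can be sketched self-contained with a plain dict standing in for Redis and a callable standing in for the database query (all names here are illustrative, not aurweb's API):

```python
import json

cache = {}  # stands in for Redis


def updated_packages(fetch, limit=0):
    # Cache hit: deserialize and return.
    if "package_updates" in cache:
        return json.loads(cache["package_updates"])

    packages = fetch()
    if limit:
        packages = packages[:limit]

    # Cache miss: serialize and store (a real Redis SET + EXPIRE).
    cache["package_updates"] = json.dumps(packages)
    return packages


rows = lambda: [{"Name": "foo", "Version": "1-1"}]
first = updated_packages(rows)   # populates the cache
second = updated_packages(rows)  # served from the cache
```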
def query_voted(query: List[models.Package],
                user: models.User) -> Dict[int, bool]:
    """ Produce a dictionary of package base ID keys to boolean values,
    which indicate whether or not the package base has a vote record
    related to user.

    :param query: A collection of Package models
    :param user: The user whose votes are being looked up
    :return: Vote state dict (PackageBase.ID: int -> bool)
    """
    output = defaultdict(bool)
    query_set = {pkg.PackageBaseID for pkg in query}
    voted = db.query(models.PackageVote).join(
        models.PackageBase,
        models.PackageBase.ID.in_(query_set)
    ).filter(
        models.PackageVote.UsersID == user.ID
    )
    for vote in voted:
        output[vote.PackageBase.ID] = True
    return output


def query_notified(query: List[models.Package],
                   user: models.User) -> Dict[int, bool]:
    """ Produce a dictionary of package base ID keys to boolean values,
    which indicate whether or not the package base has a notification
    record related to user.

    :param query: A collection of Package models
    :param user: The user that is being notified or not
    :return: Notification state dict (PackageBase.ID: int -> bool)
    """
    output = defaultdict(bool)
    query_set = {pkg.PackageBaseID for pkg in query}
    notified = db.query(models.PackageNotification).join(
        models.PackageBase,
        models.PackageBase.ID.in_(query_set)
    ).filter(
        models.PackageNotification.UserID == user.ID
    )
    for notif in notified:
        output[notif.PackageBase.ID] = True
    return output
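Both query helpers return a defaultdict(bool), so template code can index by any PackageBase.ID and get False for bases the user never voted on or watched. A sketch of that lookup shape on plain IDs (the ID lists are illustrative):

```python
from collections import defaultdict


def query_voted(base_ids, voted_ids):
    # base_ids: PackageBase IDs on the current page; voted_ids: IDs the
    # user actually voted on. Only intersecting IDs are marked True;
    # every other key falls back to the defaultdict's False.
    output = defaultdict(bool)
    for pkgbase_id in set(base_ids) & set(voted_ids):
        output[pkgbase_id] = True
    return output


votes = query_voted([1, 2, 3], [2])
print(votes[2], votes[3])  # → True False
```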
def pkg_required(pkgname: str, provides: List[str], limit: int) \
        -> List[PackageDependency]:
    """
    Get dependencies that match a string in `[pkgname] + provides`.

    :param pkgname: Package.Name
    :param provides: List of PackageRelation.Name
    :param limit: Maximum number of dependencies to query
    :return: List of PackageDependency instances
    """
    targets = set([pkgname] + provides)
    query = db.query(PackageDependency).join(Package).filter(
        PackageDependency.DepName.in_(targets)
    ).order_by(Package.Name.asc()).limit(limit)
    return query.all()


@register_filter("source_uri")
def source_uri(pkgsrc: models.PackageSource) -> Tuple[str, str]:
    """
    Produce a (text, uri) tuple out of `pkgsrc`.

    In this filter, we cover various cases:
    1. If "::" is anywhere in the Source column, split the string,
       which should produce a (text, uri), where text is before "::"
       and uri is after "::".
    2. Otherwise, if "://" is anywhere in the Source column, it's just
       some sort of URI, which we'll return verbatim as both text and uri.
    3. Otherwise, we'll return a path to the source file in a uri produced
       out of options.source_file_uri formatted with the source file and
       the package base name.

    :param pkgsrc: PackageSource instance
    :return: (text, uri) tuple
    """
    if "::" in pkgsrc.Source:
        return pkgsrc.Source.split("::", 1)
    elif "://" in pkgsrc.Source:
        return (pkgsrc.Source, pkgsrc.Source)
    path = config.get("options", "source_file_uri")
    pkgbasename = pkgsrc.Package.PackageBase.Name
    return (pkgsrc.Source, path % (pkgsrc.Source, pkgbasename))
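The three docstring cases of source_uri can be exercised in isolation. This sketch reproduces the branching on plain strings; the hard-coded format string standing in for the options.source_file_uri config value is an assumption for illustration only:

```python
def source_uri(source: str, pkgbasename: str) -> tuple:
    # Case 1: "text::uri" sources split into a (text, uri) pair.
    if "::" in source:
        return tuple(source.split("::", 1))
    # Case 2: bare URIs are returned verbatim as both text and uri.
    elif "://" in source:
        return (source, source)
    # Case 3: local files point at a source_file_uri-style path
    # (illustrative format string, not the real config value).
    path = "/cgit/aur.git/tree/%s?h=%s"
    return (source, path % (source, pkgbasename))


print(source_uri("foo::https://example.org/foo.tar.gz", "foo"))
```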

new file: aurweb/pkgbase/__init__.py (0 lines)

new file: aurweb/pkgbase/actions.py (141 lines)
@@ -0,0 +1,141 @@
from typing import List

from fastapi import Request

from aurweb import db, logging, util
from aurweb.auth import creds
from aurweb.models import PackageBase
from aurweb.models.package_comaintainer import PackageComaintainer
from aurweb.models.package_notification import PackageNotification
from aurweb.models.request_type import DELETION_ID, MERGE_ID, ORPHAN_ID
from aurweb.packages.requests import handle_request, update_closure_comment
from aurweb.pkgbase import util as pkgbaseutil
from aurweb.scripts import notify, popupdate

logger = logging.get_logger(__name__)


def pkgbase_notify_instance(request: Request, pkgbase: PackageBase) -> None:
    notif = db.query(pkgbase.notifications.filter(
        PackageNotification.UserID == request.user.ID
    ).exists()).scalar()
    has_cred = request.user.has_credential(creds.PKGBASE_NOTIFY)
    if has_cred and not notif:
        with db.begin():
            db.create(PackageNotification,
                      PackageBase=pkgbase,
                      User=request.user)


def pkgbase_unnotify_instance(request: Request, pkgbase: PackageBase) -> None:
    notif = pkgbase.notifications.filter(
        PackageNotification.UserID == request.user.ID
    ).first()
    has_cred = request.user.has_credential(creds.PKGBASE_NOTIFY)
    if has_cred and notif:
        with db.begin():
            db.delete(notif)


def pkgbase_unflag_instance(request: Request, pkgbase: PackageBase) -> None:
    has_cred = request.user.has_credential(
        creds.PKGBASE_UNFLAG, approved=[pkgbase.Flagger, pkgbase.Maintainer])
    if has_cred:
        with db.begin():
            pkgbase.OutOfDateTS = None
            pkgbase.Flagger = None
            pkgbase.FlaggerComment = str()


def pkgbase_disown_instance(request: Request, pkgbase: PackageBase) -> None:
    disowner = request.user
    notifs = [notify.DisownNotification(disowner.ID, pkgbase.ID)]

    is_maint = disowner == pkgbase.Maintainer
    if is_maint:
        with db.begin():
            # Comaintainer with the lowest Priority value; next-in-line.
            prio_comaint = pkgbase.comaintainers.order_by(
                PackageComaintainer.Priority.asc()
            ).first()
            if prio_comaint:
                # If there is such a comaintainer, promote them to maintainer.
                pkgbase.Maintainer = prio_comaint.User
                notifs.append(pkgbaseutil.remove_comaintainer(prio_comaint))
            else:
                # Otherwise, just orphan the package completely.
                pkgbase.Maintainer = None
    elif request.user.has_credential(creds.PKGBASE_DISOWN):
        # Otherwise, the request user performing this disownage is a
        # Trusted User and we treat it like a standard orphan request.
        notifs += handle_request(request, ORPHAN_ID, pkgbase)
        with db.begin():
            pkgbase.Maintainer = None

    util.apply_all(notifs, lambda n: n.send())

def pkgbase_adopt_instance(request: Request, pkgbase: PackageBase) -> None:
    with db.begin():
        pkgbase.Maintainer = request.user

    notif = notify.AdoptNotification(request.user.ID, pkgbase.ID)
    notif.send()


def pkgbase_delete_instance(request: Request, pkgbase: PackageBase,
                            comments: str = str()) \
        -> List[notify.Notification]:
    notifs = handle_request(request, DELETION_ID, pkgbase) + [
        notify.DeleteNotification(request.user.ID, pkgbase.ID)
    ]

    with db.begin():
        update_closure_comment(pkgbase, DELETION_ID, comments)
        db.delete(pkgbase)

    return notifs


def pkgbase_merge_instance(request: Request, pkgbase: PackageBase,
                           target: PackageBase, comments: str = str()) -> None:
    pkgbasename = str(pkgbase.Name)

    # Create notifications.
    notifs = handle_request(request, MERGE_ID, pkgbase, target)

    # Sets of user IDs that already have votes and notifications on the
    # target; used to skip duplicates while migrating.
    target_votes = set(v.UsersID for v in target.package_votes)
    target_notifs = set(n.UserID for n in target.notifications)

    with db.begin():
        # Merge pkgbase's comments.
        for comment in pkgbase.comments:
            comment.PackageBase = target

        # Merge notifications that don't yet exist in the target.
        for notif in pkgbase.notifications:
            if notif.UserID not in target_notifs:
                notif.PackageBase = target

        # Merge votes that don't yet exist in the target.
        for vote in pkgbase.package_votes:
            if vote.UsersID not in target_votes:
                vote.PackageBase = target

    # Run popupdate.
    popupdate.run_single(target)

    with db.begin():
        # Delete pkgbase and its packages now that everything's merged.
        for pkg in pkgbase.packages:
            db.delete(pkg)
        db.delete(pkgbase)

    # Log this out for accountability purposes.
    logger.info(f"Trusted User '{request.user.Username}' merged "
                f"'{pkgbasename}' into '{target.Name}'.")

    # Send notifications.
    util.apply_all(notifs, lambda n: n.send())
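pkgbase_merge_instance avoids duplicate rows by first collecting the target's existing voter IDs into a set, then only re-pointing source records whose user ID is absent from it. The set-based dedup can be sketched with plain dicts in place of ORM rows (illustrative data, no database):

```python
def merge_votes(source_votes, target_votes):
    # User IDs that already voted on the target; migrating these again
    # would duplicate a (UsersID, PackageBaseID) vote.
    existing = {v["UsersID"] for v in target_votes}
    migrated = [v for v in source_votes if v["UsersID"] not in existing]
    return target_votes + migrated


target = [{"UsersID": 1}]
source = [{"UsersID": 1}, {"UsersID": 2}]
merged = merge_votes(source, target)
print(merged)  # user 1 is skipped, user 2 migrates
```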

new file: aurweb/pkgbase/util.py (196 lines)
@@ -0,0 +1,196 @@
from typing import Any, Dict, List

from fastapi import Request
from sqlalchemy import and_

from aurweb import config, db, l10n, util
from aurweb.models import PackageBase, User
from aurweb.models.package_comaintainer import PackageComaintainer
from aurweb.models.package_comment import PackageComment
from aurweb.models.package_request import PENDING_ID, PackageRequest
from aurweb.models.package_vote import PackageVote
from aurweb.scripts import notify
from aurweb.templates import make_context as _make_context


def make_context(request: Request, pkgbase: PackageBase) -> Dict[str, Any]:
    """ Make a basic context for package or pkgbase.

    :param request: FastAPI request
    :param pkgbase: PackageBase instance
    :return: A pkgbase context without specific differences
    """
    context = _make_context(request, pkgbase.Name)

    context["git_clone_uri_anon"] = config.get("options", "git_clone_uri_anon")
    context["git_clone_uri_priv"] = config.get("options", "git_clone_uri_priv")
    context["pkgbase"] = pkgbase
    context["packages_count"] = pkgbase.packages.count()
    context["keywords"] = pkgbase.keywords
    context["comments"] = pkgbase.comments.order_by(
        PackageComment.CommentTS.desc()
    )
    context["pinned_comments"] = pkgbase.comments.filter(
        PackageComment.PinnedTS != 0
    ).order_by(PackageComment.CommentTS.desc())

    context["is_maintainer"] = bool(request.user == pkgbase.Maintainer)
    context["notified"] = request.user.notified(pkgbase)

    context["out_of_date"] = bool(pkgbase.OutOfDateTS)

    context["voted"] = request.user.package_votes.filter(
        PackageVote.PackageBaseID == pkgbase.ID
    ).scalar()

    context["requests"] = pkgbase.requests.filter(
        and_(PackageRequest.Status == PENDING_ID,
             PackageRequest.ClosedTS.is_(None))
    ).count()

    return context

def remove_comaintainer(comaint: PackageComaintainer) \
        -> notify.ComaintainerRemoveNotification:
    """
    Remove a PackageComaintainer.

    This function does *not* begin any database transaction and
    must be used **within** a database transaction, e.g.:

        with db.begin():
            remove_comaintainer(comaint)

    :param comaint: Target PackageComaintainer to be deleted
    :return: ComaintainerRemoveNotification
    """
    pkgbase = comaint.PackageBase
    notif = notify.ComaintainerRemoveNotification(comaint.User.ID, pkgbase.ID)
    db.delete(comaint)
    rotate_comaintainers(pkgbase)
    return notif


def remove_comaintainers(pkgbase: PackageBase, usernames: List[str]) -> None:
    """
    Remove comaintainers from `pkgbase`.

    :param pkgbase: PackageBase instance
    :param usernames: Iterable of username strings
    """
    notifications = []
    with db.begin():
        comaintainers = pkgbase.comaintainers.join(User).filter(
            User.Username.in_(usernames)
        ).all()
        notifications = [
            notify.ComaintainerRemoveNotification(co.User.ID, pkgbase.ID)
            for co in comaintainers
        ]
        db.delete_all(comaintainers)

    # Rotate comaintainer priority values.
    with db.begin():
        rotate_comaintainers(pkgbase)

    # Send out notifications.
    util.apply_all(notifications, lambda n: n.send())

def latest_priority(pkgbase: PackageBase) -> int:
    """
    Return the highest Priority column related to `pkgbase`.

    :param pkgbase: PackageBase instance
    :return: Highest Priority found or 0 if no records exist
    """

    # Order comaintainers related to pkgbase by Priority DESC.
    record = pkgbase.comaintainers.order_by(
        PackageComaintainer.Priority.desc()).first()

    # Use the Priority column if a record exists, otherwise 0.
    return record.Priority if record else 0


class NoopComaintainerNotification:
    """ A noop notification stub used as an error-state return value. """

    def send(self) -> None:
        """ noop """
        return


def add_comaintainer(pkgbase: PackageBase, comaintainer: User) \
        -> notify.ComaintainerAddNotification:
    """
    Add a new comaintainer to `pkgbase`.

    :param pkgbase: PackageBase instance
    :param comaintainer: User instance used for the new comaintainer record
    :return: ComaintainerAddNotification
    """
    # Skip a given `comaintainer` who is already the maintainer.
    if pkgbase.Maintainer == comaintainer:
        return NoopComaintainerNotification()

    # Priority for the new comaintainer is +1 more than the highest.
    new_prio = latest_priority(pkgbase) + 1

    with db.begin():
        db.create(PackageComaintainer, PackageBase=pkgbase,
                  User=comaintainer, Priority=new_prio)

    return notify.ComaintainerAddNotification(comaintainer.ID, pkgbase.ID)

def add_comaintainers(request: Request, pkgbase: PackageBase,
                      usernames: List[str]) -> None:
    """
    Add comaintainers to `pkgbase`.

    :param request: FastAPI request
    :param pkgbase: PackageBase instance
    :param usernames: Iterable of username strings
    :return: Error string on failure else None
    """
    # For each username in usernames, perform validation of the username
    # and append the User record to `users` if no errors occur.
    users = []
    for username in usernames:
        user = db.query(User).filter(User.Username == username).first()
        if not user:
            _ = l10n.get_translator_for_request(request)
            return _("Invalid user name: %s") % username
        users.append(user)

    notifications = []

    def add_comaint(user: User):
        nonlocal notifications
        # Populate `notifications` with add_comaintainer's return value,
        # which is a ComaintainerAddNotification.
        notifications.append(add_comaintainer(pkgbase, user))

    # Move along: add all `users` as new `pkgbase` comaintainers.
    util.apply_all(users, add_comaint)

    # Send out notifications.
    util.apply_all(notifications, lambda n: n.send())


def rotate_comaintainers(pkgbase: PackageBase) -> None:
    """
    Rotate `pkgbase` comaintainers.

    This function resets the Priority column of all PackageComaintainer
    instances related to `pkgbase` to sequential 1 .. n values with
    persisted order.

    :param pkgbase: PackageBase instance
    """
    comaintainers = pkgbase.comaintainers.order_by(
        PackageComaintainer.Priority.asc())
    for i, comaint in enumerate(comaintainers):
        comaint.Priority = i + 1
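rotate_comaintainers renumbers priorities to a dense 1..n sequence while preserving relative order, which keeps latest_priority(pkgbase) + 1 well-defined after deletions. A sketch of the same renumbering on plain dicts in place of ORM rows (the dict shape is illustrative):

```python
def rotate_comaintainers(comaintainers):
    # Sort by current Priority ascending, then reassign dense 1..n
    # values in place, mirroring the ORM loop above.
    for i, co in enumerate(sorted(comaintainers, key=lambda c: c["Priority"])):
        co["Priority"] = i + 1
    return comaintainers


# After deleting the Priority=2 comaintainer, priorities 1 and 3
# collapse back down to 1 and 2.
remaining = [{"User": "alice", "Priority": 1}, {"User": "carol", "Priority": 3}]
rotate_comaintainers(remaining)
print(remaining)
```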

new file: aurweb/pkgbase/validate.py (35 lines)
@@ -0,0 +1,35 @@
from typing import Any, Dict

from aurweb import db
from aurweb.exceptions import ValidationError
from aurweb.models import PackageBase


def request(pkgbase: PackageBase,
            type: str, comments: str, merge_into: str,
            context: Dict[str, Any]) -> None:
    if not comments:
        raise ValidationError(["The comment field must not be empty."])

    if type == "merge":
        # Perform merge-related checks.
        if not merge_into:
            # TODO: This error needs to be translated.
            raise ValidationError(
                ['The "Merge into" field must not be empty.'])

        target = db.query(PackageBase).filter(
            PackageBase.Name == merge_into
        ).first()
        if not target:
            # TODO: This error needs to be translated.
            raise ValidationError([
                "The package base you want to merge into does not exist."
            ])

        db.refresh(target)
        if target.ID == pkgbase.ID:
            # TODO: This error needs to be translated.
            raise ValidationError([
                "You cannot merge a package base into itself."
            ])

new file: aurweb/prometheus.py (103 lines)
@@ -0,0 +1,103 @@
from typing import Any, Callable, Dict, List, Optional

from prometheus_client import Counter
from prometheus_fastapi_instrumentator import Instrumentator
from prometheus_fastapi_instrumentator.metrics import Info
from starlette.routing import Match, Route

from aurweb import logging

logger = logging.get_logger(__name__)
_instrumentator = Instrumentator()


def instrumentator():
    return _instrumentator


# Taken from https://github.com/stephenhillier/starlette_exporter
# Their license is included in LICENSES/starlette_exporter.
# The code has been modified to remove child route checks
# (since we don't have any) and to stay within an 80-width limit.
def get_matching_route_path(scope: Dict[Any, Any], routes: List[Route],
                            route_name: Optional[str] = None) -> str:
    """
    Find a matching route and return its original path string.

    Will attempt to enter mounted routes and subrouters.

    Credit to https://github.com/elastic/apm-agent-python
    """
    for route in routes:
        match, child_scope = route.matches(scope)
        if match == Match.FULL:
            route_name = route.path

            '''
            # This path exists in the original function's code, but we
            # don't need it (currently), so it's been removed to avoid
            # useless test coverage.
            child_scope = {**scope, **child_scope}
            if isinstance(route, Mount) and route.routes:
                child_route_name = get_matching_route_path(child_scope,
                                                           route.routes,
                                                           route_name)
                if child_route_name is None:
                    route_name = None
                else:
                    route_name += child_route_name
            '''

            return route_name
        elif match == Match.PARTIAL and route_name is None:
            route_name = route.path

def http_requests_total() -> Callable[[Info], None]:
    metric = Counter("http_requests_total",
                     "Number of HTTP requests.",
                     labelnames=("method", "path", "status"))

    def instrumentation(info: Info) -> None:
        scope = info.request.scope

        # Taken from https://github.com/stephenhillier/starlette_exporter
        # Their license is included at LICENSES/starlette_exporter.
        # The code has been slightly modified: we no longer catch
        # exceptions; we expect this collector to always succeed.
        # Failures in this collector shall cause test failures.
        if not (scope.get("endpoint", None) and scope.get("router", None)):
            return None

        base_scope = {
            "type": scope.get("type"),
            "path": scope.get("root_path", "") + scope.get("path"),
            "path_params": scope.get("path_params", {}),
            "method": scope.get("method")
        }

        method = scope.get("method")
        path = get_matching_route_path(base_scope, scope.get("router").routes)

        if info.response:
            status = str(int(info.response.status_code))[:1] + "xx"
            metric.labels(method=method, path=path, status=status).inc()

    return instrumentation


def http_api_requests_total() -> Callable[[Info], None]:
    metric = Counter(
        "http_api_requests",
        "Number of times an RPC API type has been requested.",
        labelnames=("type", "status"))

    def instrumentation(info: Info) -> None:
        if info.request.url.path.rstrip("/") == "/rpc":
            type = info.request.query_params.get("type", "None")
            if info.response:
                status = str(info.response.status_code)[:1] + "xx"
                metric.labels(type=type, status=status).inc()

    return instrumentation
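Both collectors bucket response codes into coarse classes ("2xx", "4xx", ...) rather than recording exact codes, which keeps the Prometheus label cardinality low. The bucketing expression on its own:

```python
def status_class(status_code: int) -> str:
    # Collapse e.g. 200/201/204 into "2xx" so the `status` label
    # takes on at most a handful of values instead of dozens.
    return str(int(status_code))[:1] + "xx"


print(status_class(404))  # → 4xx
```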

new file: aurweb/ratelimit.py (110 lines)
@@ -0,0 +1,110 @@
from fastapi import Request
from redis.client import Pipeline

from aurweb import config, db, logging, time
from aurweb.models import ApiRateLimit
from aurweb.redis import redis_connection

logger = logging.get_logger(__name__)


def _update_ratelimit_redis(request: Request, pipeline: Pipeline):
    window_length = config.getint("ratelimit", "window_length")
    now = time.utcnow()
    time_to_delete = now - window_length

    host = request.client.host
    window_key = f"ratelimit-ws:{host}"
    requests_key = f"ratelimit:{host}"

    pipeline.get(window_key)
    window = pipeline.execute()[0]

    if not window or int(window.decode()) < time_to_delete:
        pipeline.set(window_key, now)
        pipeline.expire(window_key, window_length)

        pipeline.set(requests_key, 1)
        pipeline.expire(requests_key, window_length)

        pipeline.execute()
    else:
        pipeline.incr(requests_key)
        pipeline.execute()


def _update_ratelimit_db(request: Request):
    window_length = config.getint("ratelimit", "window_length")
    now = time.utcnow()
    time_to_delete = now - window_length

    records = db.query(ApiRateLimit).filter(
        ApiRateLimit.WindowStart < time_to_delete)
    with db.begin():
        db.delete_all(records)

    host = request.client.host
    record = db.query(ApiRateLimit, ApiRateLimit.IP == host).first()
    with db.begin():
        if not record:
            record = db.create(ApiRateLimit,
                               WindowStart=now,
                               IP=host, Requests=1)
        else:
            record.Requests += 1

    logger.debug(record.Requests)
    return record

def update_ratelimit(request: Request, pipeline: Pipeline):
|
||||||
|
""" Update the ratelimit stored in Redis or the database depending
|
||||||
|
on AUR_CONFIG's [options] cache setting.
|
||||||
|
|
||||||
|
This Redis-capable function is slightly different than most. If Redis
|
||||||
|
is not configured to use a real server, this function instead uses
|
||||||
|
the database to persist tracking of a particular host.
|
||||||
|
|
||||||
|
:param request: FastAPI request
|
||||||
|
:param pipeline: redis.client.Pipeline
|
||||||
|
:returns: ApiRateLimit record when Redis cache is not configured, else None
|
||||||
|
"""
|
||||||
|
if config.getboolean("ratelimit", "cache"):
|
||||||
|
return _update_ratelimit_redis(request, pipeline)
|
||||||
|
return _update_ratelimit_db(request)
|
||||||
|
|
||||||
|
|
||||||
|
def check_ratelimit(request: Request):
|
||||||
|
""" Increment and check to see if request has exceeded their rate limit.
|
||||||
|
|
||||||
|
:param request: FastAPI request
|
||||||
|
:returns: True if the request host has exceeded the rate limit else False
|
||||||
|
"""
|
||||||
|
redis = redis_connection()
|
||||||
|
pipeline = redis.pipeline()
|
||||||
|
|
||||||
|
record = update_ratelimit(request, pipeline)
|
||||||
|
|
||||||
|
# Get cache value, else None.
|
||||||
|
host = request.client.host
|
||||||
|
pipeline.get(f"ratelimit:{host}")
|
||||||
|
requests = pipeline.execute()[0]
|
||||||
|
|
||||||
|
# Take into account the split paths. When Redis is used, a
|
||||||
|
# valid cache value will be returned which must be converted
|
||||||
|
# to an int. Otherwise, use the database record returned
|
||||||
|
# by update_ratelimit.
|
||||||
|
if not config.getboolean("ratelimit", "cache"):
|
||||||
|
# If we got nothing from pipeline.get, we did not use
|
||||||
|
# the Redis path of logic: use the DB record's count.
|
||||||
|
requests = record.Requests
|
||||||
|
else:
|
||||||
|
# Otherwise, just case Redis results over to an int.
|
||||||
|
requests = int(requests.decode())
|
||||||
|
|
||||||
|
limit = config.getint("ratelimit", "request_limit")
|
||||||
|
exceeded_ratelimit = requests > limit
|
||||||
|
if exceeded_ratelimit:
|
||||||
|
logger.debug(f"{host} has exceeded the ratelimit.")
|
||||||
|
|
||||||
|
return exceeded_ratelimit
|
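Both code paths in check_ratelimit reduce to the same scheme: increment a per-host counter, then compare it against the configured request limit. A minimal, self-contained sketch of that scheme follows, with a plain dict standing in for Redis or the database; the names `RateLimiter` and `hit` are illustrative and not part of aurweb:

```python
class RateLimiter:
    def __init__(self, request_limit: int):
        self.request_limit = request_limit
        self.requests = {}  # host -> request count

    def hit(self, host: str) -> bool:
        """ Record one request from `host`; return True if over the limit. """
        self.requests[host] = self.requests.get(host, 0) + 1
        return self.requests[host] > self.request_limit


limiter = RateLimiter(request_limit=2)
results = [limiter.hit("203.0.113.7") for _ in range(3)]
print(results)  # the third request exceeds the limit
```

Counters are tracked per host, so traffic from one address never pushes another over the limit.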
aurweb/redis.py (new file, 57 lines)
@@ -0,0 +1,57 @@
import fakeredis

from redis import ConnectionPool, Redis

import aurweb.config

from aurweb import logging

logger = logging.get_logger(__name__)
pool = None


class FakeConnectionPool:
    """ A fake ConnectionPool class which holds an internal reference
    to a fakeredis handle.

    We normally deal with Redis by keeping its ConnectionPool globally
    referenced so we can persist connection state through different calls
    to redis_connection(), and since FakeRedis does not offer a ConnectionPool,
    we craft one up here to hang onto the same handle instance as long as the
    same instance is alive; this allows us to use a similar flow from the
    redis_connection() user's perspective.
    """

    def __init__(self):
        self.handle = fakeredis.FakeStrictRedis()

    def disconnect(self):
        pass


def redis_connection():  # pragma: no cover
    global pool

    disabled = aurweb.config.get("options", "cache") != "redis"

    # If we haven't initialized redis yet, construct a pool.
    if disabled:
        if pool is None:
            logger.debug("Initializing fake Redis instance.")
            pool = FakeConnectionPool()
        return pool.handle
    else:
        if pool is None:
            logger.debug("Initializing real Redis instance.")
            redis_addr = aurweb.config.get("options", "redis_address")
            pool = ConnectionPool.from_url(redis_addr)

    # Create a connection to the pool.
    return Redis(connection_pool=pool)


def kill_redis():
    global pool
    if pool:
        pool.disconnect()
        pool = None
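The core of redis_connection() is a lazily initialized module-level handle: the first call constructs it, later calls reuse it, and kill_redis() resets the global. A stripped-down sketch of that pattern, with a hypothetical `FakeHandle` standing in for the Redis pool:

```python
class FakeHandle:
    """ Illustrative stand-in for a connection pool or client handle. """
    def disconnect(self):
        pass


_pool = None


def get_connection() -> FakeHandle:
    global _pool
    if _pool is None:  # construct only on first use
        _pool = FakeHandle()
    return _pool


def kill():
    global _pool
    if _pool:
        _pool.disconnect()
        _pool = None


a, b = get_connection(), get_connection()
print(a is b)  # the same handle is reused across calls
```

Resetting the global in kill() is what lets tests tear down state between runs.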
aurweb/requests/__init__.py (new file, 0 lines)

aurweb/requests/util.py (new file, 13 lines)
@@ -0,0 +1,13 @@
from http import HTTPStatus

from fastapi import HTTPException

from aurweb import db
from aurweb.models import PackageRequest


def get_pkgreq_by_id(id: int) -> PackageRequest:
    pkgreq = db.query(PackageRequest).filter(PackageRequest.ID == id).first()
    if not pkgreq:
        raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
    return db.refresh(pkgreq)
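get_pkgreq_by_id follows a common "get or 404" shape: look the record up, raise an HTTP error when it is missing, otherwise return it. A self-contained sketch of the same shape, where a dict stands in for the database and `NotFound` mimics fastapi.HTTPException (both names are illustrative):

```python
from http import HTTPStatus


class NotFound(Exception):
    """ Illustrative stand-in for fastapi.HTTPException. """
    def __init__(self, status_code):
        self.status_code = status_code


RECORDS = {1: {"ID": 1, "Status": "pending"}}


def get_record_by_id(id: int) -> dict:
    record = RECORDS.get(id)
    if not record:
        raise NotFound(status_code=HTTPStatus.NOT_FOUND)
    return record


print(get_record_by_id(1)["Status"])  # pending
```

Raising from the helper keeps every route that resolves a request by ID consistent about the 404 behavior.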
aurweb/routers/__init__.py
@@ -3,3 +3,22 @@ API routers for FastAPI.
See https://fastapi.tiangolo.com/tutorial/bigger-applications/
"""
from . import accounts, auth, html, packages, pkgbase, requests, rpc, rss, sso, trusted_user

"""
aurweb application routes. This constant can be any iterable
and each element must have a .router attribute which points
to a fastapi.APIRouter.
"""
APP_ROUTES = [
    accounts,
    auth,
    html,
    packages,
    pkgbase,
    requests,
    trusted_user,
    rss,
    rpc,
    sso,
]
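An APP_ROUTES-style iterable is typically consumed by walking the modules and mounting each module's `.router` onto the application; `include_router` is how FastAPI mounts an APIRouter. A sketch under those assumptions, with module objects faked via SimpleNamespace and a minimal `App` stand-in:

```python
from types import SimpleNamespace


class App:
    """ Illustrative stand-in for a FastAPI application. """
    def __init__(self):
        self.routers = []

    def include_router(self, router):
        self.routers.append(router)


# Fake "modules", each carrying a .router attribute as APP_ROUTES requires.
APP_ROUTES = [SimpleNamespace(router=f"router-{name}")
              for name in ("accounts", "auth", "html")]

app = App()
for module in APP_ROUTES:
    app.include_router(module.router)

print(len(app.routers))  # 3
```

Because only the `.router` attribute is required, the list can hold modules, objects, or anything else that exposes one.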
aurweb/routers/accounts.py (new file, 633 lines)
@@ -0,0 +1,633 @@
import copy
import typing

from http import HTTPStatus

from fastapi import APIRouter, Form, Request
from fastapi.responses import HTMLResponse, RedirectResponse
from sqlalchemy import and_, or_

import aurweb.config

from aurweb import cookies, db, l10n, logging, models, util
from aurweb.auth import account_type_required, requires_auth, requires_guest
from aurweb.captcha import get_captcha_salts
from aurweb.exceptions import ValidationError
from aurweb.l10n import get_translator_for_request
from aurweb.models import account_type as at
from aurweb.models.ssh_pub_key import get_fingerprint
from aurweb.models.user import generate_resetkey
from aurweb.scripts.notify import ResetKeyNotification, WelcomeNotification
from aurweb.templates import make_context, make_variable_context, render_template
from aurweb.users import update, validate
from aurweb.users.util import get_user_by_name

router = APIRouter()
logger = logging.get_logger(__name__)


@router.get("/passreset", response_class=HTMLResponse)
@requires_guest
async def passreset(request: Request):
    context = await make_variable_context(request, "Password Reset")
    return render_template(request, "passreset.html", context)


@router.post("/passreset", response_class=HTMLResponse)
@requires_guest
async def passreset_post(request: Request,
                         user: str = Form(...),
                         resetkey: str = Form(default=None),
                         password: str = Form(default=None),
                         confirm: str = Form(default=None)):
    context = await make_variable_context(request, "Password Reset")

    # The user parameter is required, so we can match it against
    # either the Username or the Email.
    user = db.query(models.User, or_(models.User.Username == user,
                                     models.User.Email == user)).first()
    if not user:
        context["errors"] = ["Invalid e-mail."]
        return render_template(request, "passreset.html", context,
                               status_code=HTTPStatus.NOT_FOUND)

    db.refresh(user)
    if resetkey:
        context["resetkey"] = resetkey

        if not user.ResetKey or resetkey != user.ResetKey:
            context["errors"] = ["Invalid e-mail."]
            return render_template(request, "passreset.html", context,
                                   status_code=HTTPStatus.NOT_FOUND)

        if not user or not password:
            context["errors"] = ["Missing a required field."]
            return render_template(request, "passreset.html", context,
                                   status_code=HTTPStatus.BAD_REQUEST)

        if password != confirm:
            # If the provided password does not match the provided confirm.
            context["errors"] = ["Password fields do not match."]
            return render_template(request, "passreset.html", context,
                                   status_code=HTTPStatus.BAD_REQUEST)

        if len(password) < models.User.minimum_passwd_length():
            # Translate the error here, which simplifies error output
            # in the jinja2 template.
            _ = get_translator_for_request(request)
            context["errors"] = [_(
                "Your password must be at least %s characters.") % (
                str(models.User.minimum_passwd_length()))]
            return render_template(request, "passreset.html", context,
                                   status_code=HTTPStatus.BAD_REQUEST)

        # We got to this point; everything matched up. Update the password
        # and remove the ResetKey.
        with db.begin():
            user.ResetKey = str()
            if user.session:
                db.delete(user.session)
            user.update_password(password)

        # Render ?step=complete.
        return RedirectResponse(url="/passreset?step=complete",
                                status_code=HTTPStatus.SEE_OTHER)

    # If we got here, we continue with issuing a resetkey for the user.
    resetkey = generate_resetkey()
    with db.begin():
        user.ResetKey = resetkey

    ResetKeyNotification(user.ID).send()

    # Render ?step=confirm.
    return RedirectResponse(url="/passreset?step=confirm",
                            status_code=HTTPStatus.SEE_OTHER)
def process_account_form(request: Request, user: models.User, args: dict):
    """ Process an account form. All fields are optional; requirements
    are only checked for fields that are present.

    ```
    context = await make_variable_context(request, "Accounts")
    ok, errors = process_account_form(request, user, **kwargs)
    if not ok:
        context["errors"] = errors
        return render_template(request, "some_account_template.html", context)
    ```

    :param request: An incoming FastAPI request
    :param user: The user model of the account being processed
    :param args: A dictionary of arguments generated via request.form()
    :return: A (passed processing boolean, list of errors) tuple
    """
    # Get a local translator.
    _ = get_translator_for_request(request)

    checks = [
        validate.is_banned,
        validate.invalid_user_password,
        validate.invalid_fields,
        validate.invalid_suspend_permission,
        validate.invalid_username,
        validate.invalid_password,
        validate.invalid_email,
        validate.invalid_backup_email,
        validate.invalid_homepage,
        validate.invalid_pgp_key,
        validate.invalid_ssh_pubkey,
        validate.invalid_language,
        validate.invalid_timezone,
        validate.username_in_use,
        validate.email_in_use,
        validate.invalid_account_type,
        validate.invalid_captcha
    ]

    try:
        for check in checks:
            check(**args, request=request, user=user, _=_)
    except ValidationError as exc:
        return (False, exc.data)

    return (True, [])
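The check-chain above runs each validator in order and converts the first ValidationError into a (False, errors) result. A self-contained sketch of that flow, with two illustrative stand-in validators (the names, rules, and messages here are hypothetical, not aurweb's):

```python
class ValidationError(Exception):
    def __init__(self, data):
        self.data = data


def invalid_username(username=None, **kwargs):
    if username is not None and not username.isalnum():
        raise ValidationError(["The username is invalid."])


def invalid_password(password=None, **kwargs):
    if password is not None and len(password) < 8:
        raise ValidationError(["Your password must be at least 8 characters."])


def process_form(args: dict):
    checks = [invalid_username, invalid_password]
    try:
        for check in checks:
            check(**args)  # each validator raises on failure
    except ValidationError as exc:
        return (False, exc.data)
    return (True, [])


print(process_form({"username": "alice", "password": "short"}))  # the short password fails
```

Accepting `**kwargs` in every validator is what lets the whole form be splatted into each check, with each validator picking out only the fields it cares about.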
def make_account_form_context(context: dict,
                              request: Request,
                              user: models.User,
                              args: dict):
    """ Modify a FastAPI context and add attributes for the account form.

    :param context: FastAPI context
    :param request: FastAPI request
    :param user: Target user
    :param args: Persistent arguments: request.form()
    :return: FastAPI context adjusted for account form
    """
    # Do not modify the original context.
    context = copy.copy(context)

    context["account_types"] = list(filter(
        lambda e: request.user.AccountTypeID >= e[0],
        [
            (at.USER_ID, f"Normal {at.USER}"),
            (at.TRUSTED_USER_ID, at.TRUSTED_USER),
            (at.DEVELOPER_ID, at.DEVELOPER),
            (at.TRUSTED_USER_AND_DEV_ID, at.TRUSTED_USER_AND_DEV)
        ]
    ))

    if request.user.is_authenticated():
        context["username"] = args.get("U", user.Username)
        context["account_type"] = args.get("T", user.AccountType.ID)
        context["suspended"] = args.get("S", user.Suspended)
        context["email"] = args.get("E", user.Email)
        context["hide_email"] = args.get("H", user.HideEmail)
        context["backup_email"] = args.get("BE", user.BackupEmail)
        context["realname"] = args.get("R", user.RealName)
        context["homepage"] = args.get("HP", user.Homepage or str())
        context["ircnick"] = args.get("I", user.IRCNick)
        context["pgp"] = args.get("K", user.PGPKey or str())
        context["lang"] = args.get("L", user.LangPreference)
        context["tz"] = args.get("TZ", user.Timezone)
        ssh_pk = user.ssh_pub_key.PubKey if user.ssh_pub_key else str()
        context["ssh_pk"] = args.get("PK", ssh_pk)
        context["cn"] = args.get("CN", user.CommentNotify)
        context["un"] = args.get("UN", user.UpdateNotify)
        context["on"] = args.get("ON", user.OwnershipNotify)
        context["inactive"] = args.get("J", user.InactivityTS != 0)
    else:
        context["username"] = args.get("U", str())
        context["account_type"] = args.get("T", at.USER_ID)
        context["suspended"] = args.get("S", False)
        context["email"] = args.get("E", str())
        context["hide_email"] = args.get("H", False)
        context["backup_email"] = args.get("BE", str())
        context["realname"] = args.get("R", str())
        context["homepage"] = args.get("HP", str())
        context["ircnick"] = args.get("I", str())
        context["pgp"] = args.get("K", str())
        context["lang"] = args.get("L", context.get("language"))
        context["tz"] = args.get("TZ", context.get("timezone"))
        context["ssh_pk"] = args.get("PK", str())
        context["cn"] = args.get("CN", True)
        context["un"] = args.get("UN", False)
        context["on"] = args.get("ON", True)
        context["inactive"] = args.get("J", False)

    context["password"] = args.get("P", str())
    context["confirm"] = args.get("C", str())

    return context
@router.get("/register", response_class=HTMLResponse)
@requires_guest
async def account_register(request: Request,
                           U: str = Form(default=str()),   # Username
                           E: str = Form(default=str()),   # Email
                           H: str = Form(default=False),   # Hide Email
                           BE: str = Form(default=None),   # Backup Email
                           R: str = Form(default=None),    # Real Name
                           HP: str = Form(default=None),   # Homepage
                           I: str = Form(default=None),    # IRC Nick
                           K: str = Form(default=None),    # PGP Key FP
                           L: str = Form(default=aurweb.config.get(
                               "options", "default_lang")),
                           TZ: str = Form(default=aurweb.config.get(
                               "options", "default_timezone")),
                           PK: str = Form(default=None),
                           CN: bool = Form(default=False),  # Comment Notify
                           CU: bool = Form(default=False),  # Update Notify
                           CO: bool = Form(default=False),  # Owner Notify
                           captcha: str = Form(default=str())):
    context = await make_variable_context(request, "Register")
    context["captcha_salt"] = get_captcha_salts()[0]
    context = make_account_form_context(context, request, None, dict())
    return render_template(request, "register.html", context)


@router.post("/register", response_class=HTMLResponse)
@requires_guest
async def account_register_post(request: Request,
                                U: str = Form(default=str()),   # Username
                                E: str = Form(default=str()),   # Email
                                H: str = Form(default=False),   # Hide Email
                                BE: str = Form(default=None),   # Backup Email
                                R: str = Form(default=''),      # Real Name
                                HP: str = Form(default=None),   # Homepage
                                I: str = Form(default=None),    # IRC Nick
                                K: str = Form(default=None),    # PGP Key
                                L: str = Form(default=aurweb.config.get(
                                    "options", "default_lang")),
                                TZ: str = Form(default=aurweb.config.get(
                                    "options", "default_timezone")),
                                PK: str = Form(default=None),   # SSH PubKey
                                CN: bool = Form(default=False),
                                UN: bool = Form(default=False),
                                ON: bool = Form(default=False),
                                captcha: str = Form(default=None),
                                captcha_salt: str = Form(...)):
    context = await make_variable_context(request, "Register")
    args = dict(await request.form())

    context = make_account_form_context(context, request, None, args)
    ok, errors = process_account_form(request, request.user, args)
    if not ok:
        # If the field values given do not meet the requirements,
        # return HTTP 400 with an error.
        context["errors"] = errors
        return render_template(request, "register.html", context,
                               status_code=HTTPStatus.BAD_REQUEST)

    if not captcha:
        context["errors"] = ["The CAPTCHA is missing."]
        return render_template(request, "register.html", context,
                               status_code=HTTPStatus.BAD_REQUEST)

    # Create a user with no password and a resetkey, then send
    # an email off about it.
    resetkey = generate_resetkey()

    # By default, we grab the User account type to associate with.
    atype = db.query(models.AccountType,
                     models.AccountType.AccountType == "User").first()

    # Create a user given all parameters available.
    with db.begin():
        user = db.create(models.User, Username=U,
                         Email=E, HideEmail=H, BackupEmail=BE,
                         RealName=R, Homepage=HP, IRCNick=I, PGPKey=K,
                         LangPreference=L, Timezone=TZ, CommentNotify=CN,
                         UpdateNotify=UN, OwnershipNotify=ON,
                         ResetKey=resetkey, AccountType=atype)

    # If a PK was given, normalize it and store it alongside its
    # fingerprint as the user's SSHPubKey.
    if PK:
        pubkey = PK.strip()
        parts = pubkey.split(" ")
        if len(parts) == 3:
            # Remove the host part, keeping the key type and the key itself.
            pubkey = parts[0] + " " + parts[1]
        fingerprint = get_fingerprint(pubkey)
        with db.begin():
            user.ssh_pub_key = models.SSHPubKey(UserID=user.ID,
                                                PubKey=pubkey,
                                                Fingerprint=fingerprint)

    # Send a reset key notification to the new user.
    WelcomeNotification(user.ID).send()

    context["complete"] = True
    context["user"] = user
    return render_template(request, "register.html", context)
def cannot_edit(request: Request, user: models.User) \
        -> typing.Optional[RedirectResponse]:
    """
    Decide if `request.user` cannot edit `user`.

    If the request user can edit the target user, None is returned.
    Otherwise, a redirect is returned to /account/{user.Username}.

    :param request: FastAPI request
    :param user: Target user to be edited
    :return: RedirectResponse if approval != granted else None
    """
    approved = request.user.can_edit_user(user)
    if not approved and (to := "/"):
        if user:
            to = f"/account/{user.Username}"
        return RedirectResponse(to, status_code=HTTPStatus.SEE_OTHER)
    return None
@router.get("/account/{username}/edit", response_class=HTMLResponse)
@requires_auth
async def account_edit(request: Request, username: str):
    user = db.query(models.User, models.User.Username == username).first()

    response = cannot_edit(request, user)
    if response:
        return response

    context = await make_variable_context(request, "Accounts")
    context["user"] = db.refresh(user)

    context = make_account_form_context(context, request, user, dict())
    return render_template(request, "account/edit.html", context)
@router.post("/account/{username}/edit", response_class=HTMLResponse)
@requires_auth
async def account_edit_post(request: Request,
                            username: str,
                            U: str = Form(default=str()),   # Username
                            J: bool = Form(default=False),
                            E: str = Form(default=str()),   # Email
                            H: str = Form(default=False),   # Hide Email
                            BE: str = Form(default=None),   # Backup Email
                            R: str = Form(default=None),    # Real Name
                            HP: str = Form(default=None),   # Homepage
                            I: str = Form(default=None),    # IRC Nick
                            K: str = Form(default=None),    # PGP Key
                            L: str = Form(aurweb.config.get(
                                "options", "default_lang")),
                            TZ: str = Form(aurweb.config.get(
                                "options", "default_timezone")),
                            P: str = Form(default=str()),   # New Password
                            C: str = Form(default=None),    # Password Confirm
                            PK: str = Form(default=None),   # PubKey
                            CN: bool = Form(default=False),  # Comment Notify
                            UN: bool = Form(default=False),  # Update Notify
                            ON: bool = Form(default=False),  # Owner Notify
                            T: int = Form(default=None),
                            passwd: str = Form(default=str())):
    user = db.query(models.User).filter(
        models.User.Username == username).first()
    response = cannot_edit(request, user)
    if response:
        return response

    context = await make_variable_context(request, "Accounts")
    context["user"] = db.refresh(user)

    args = dict(await request.form())
    context = make_account_form_context(context, request, user, args)
    ok, errors = process_account_form(request, user, args)

    if not passwd:
        context["errors"] = ["Invalid password."]
        return render_template(request, "account/edit.html", context,
                               status_code=HTTPStatus.BAD_REQUEST)

    if not ok:
        context["errors"] = errors
        return render_template(request, "account/edit.html", context,
                               status_code=HTTPStatus.BAD_REQUEST)

    updates = [
        update.simple,
        update.language,
        update.timezone,
        update.ssh_pubkey,
        update.account_type,
        update.password
    ]

    for f in updates:
        f(**args, request=request, user=user, context=context)

    if not errors:
        context["complete"] = True

    # Update the response cookies, in case the timezone or language changed.
    response = render_template(request, "account/edit.html", context)
    return cookies.update_response_cookies(request, response,
                                           aurtz=TZ, aurlang=L)
@router.get("/account/{username}")
async def account(request: Request, username: str):
    _ = l10n.get_translator_for_request(request)
    context = await make_variable_context(
        request, _("Account") + " " + username)
    if not request.user.is_authenticated():
        return render_template(request, "account/show.html", context,
                               status_code=HTTPStatus.UNAUTHORIZED)

    # Get related User record, if possible.
    user = get_user_by_name(username)
    context["user"] = user

    # Format PGPKey for display with a space after every 4 characters.
    k = user.PGPKey or str()
    context["pgp_key"] = " ".join([k[i:i + 4] for i in range(0, len(k), 4)])

    login_ts = None
    session = db.query(models.Session).filter(
        models.Session.UsersID == user.ID).first()
    if session:
        login_ts = user.session.LastUpdateTS
    context["login_ts"] = login_ts

    # Render the template.
    return render_template(request, "account/show.html", context)


@router.get("/account/{username}/comments")
@requires_auth
async def account_comments(request: Request, username: str):
    user = get_user_by_name(username)
    context = make_context(request, "Accounts")
    context["username"] = username
    context["comments"] = user.package_comments.order_by(
        models.PackageComment.CommentTS.desc())
    return render_template(request, "account/comments.html", context)
@router.get("/accounts")
@requires_auth
@account_type_required({at.TRUSTED_USER,
                        at.DEVELOPER,
                        at.TRUSTED_USER_AND_DEV})
async def accounts(request: Request):
    context = make_context(request, "Accounts")
    return render_template(request, "account/search.html", context)


@router.post("/accounts")
@requires_auth
@account_type_required({at.TRUSTED_USER,
                        at.DEVELOPER,
                        at.TRUSTED_USER_AND_DEV})
async def accounts_post(request: Request,
                        O: int = Form(default=0),       # Offset
                        SB: str = Form(default=str()),  # Sort By
                        U: str = Form(default=str()),   # Username
                        T: str = Form(default=str()),   # Account Type
                        S: bool = Form(default=False),  # Suspended
                        E: str = Form(default=str()),   # Email
                        R: str = Form(default=str()),   # Real Name
                        I: str = Form(default=str()),   # IRC Nick
                        K: str = Form(default=str())):  # PGP Key
    context = await make_variable_context(request, "Accounts")
    context["pp"] = pp = 50  # Hits per page.

    offset = max(O, 0)  # Clamp the offset to a minimum of 0.
    context["offset"] = offset

    context["params"] = dict(await request.form())
    if "O" in context["params"]:
        context["params"].pop("O")

    # Set up order-by criteria based on SB.
    order_by_columns = {
        "t": (models.AccountType.ID.asc(), models.User.Username.asc()),
        "r": (models.User.RealName.asc(), models.AccountType.ID.asc()),
        "i": (models.User.IRCNick.asc(), models.AccountType.ID.asc()),
    }
    default_order = (models.User.Username.asc(), models.AccountType.ID.asc())
    order_by = order_by_columns.get(SB, default_order)

    # Convert parameter T to an AccountType ID.
    account_types = {
        "u": at.USER_ID,
        "t": at.TRUSTED_USER_ID,
        "d": at.DEVELOPER_ID,
        "td": at.TRUSTED_USER_AND_DEV_ID
    }
    account_type_id = account_types.get(T, None)

    # Get a query handle to users; populate the total user
    # count into a jinja2 context variable.
    query = db.query(models.User).join(models.AccountType)

    # Populate this list with any additional statements to
    # be ANDed together.
    statements = [
        v for k, v in [
            (account_type_id is not None, models.AccountType.ID == account_type_id),
            (bool(U), models.User.Username.like(f"%{U}%")),
            (bool(S), models.User.Suspended == S),
            (bool(E), models.User.Email.like(f"%{E}%")),
            (bool(R), models.User.RealName.like(f"%{R}%")),
            (bool(I), models.User.IRCNick.like(f"%{I}%")),
            (bool(K), models.User.PGPKey.like(f"%{K}%")),
        ] if k
    ]

    # Filter the query by combining all statements added above
    # into a single AND statement.
    if statements:
        query = query.filter(and_(*statements))

    context["total_users"] = query.count()

    # Finally, order and truncate our users for the current page.
    users = query.order_by(*order_by).limit(pp).offset(offset).all()
    context["users"] = util.apply_all(users, db.refresh)

    return render_template(request, "account/index.html", context)
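The `statements` construction in accounts_post is a conditional-filter pattern: build (condition, clause) pairs and keep only the clauses whose condition holds, so empty form fields contribute no filter. A self-contained sketch, with strings standing in for the SQLAlchemy expressions:

```python
# Form fields: only U and R are populated, E is empty.
U, E, R = "alice", "", "Alice"

statements = [
    v for k, v in [
        (bool(U), f"Username LIKE %{U}%"),
        (bool(E), f"Email LIKE %{E}%"),
        (bool(R), f"RealName LIKE %{R}%"),
    ] if k
]

print(statements)  # only the populated fields produce clauses
```

One caveat of the pattern as written: every clause expression is evaluated eagerly, even when its condition is false, so the expressions must be side-effect free.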
def render_terms_of_service(request: Request,
                            context: dict,
                            terms: typing.Iterable):
    if not terms:
        return RedirectResponse("/", status_code=HTTPStatus.SEE_OTHER)
    context["unaccepted_terms"] = terms
    return render_template(request, "tos/index.html", context)


@router.get("/tos")
@requires_auth
async def terms_of_service(request: Request):
    # Query the database for terms that were previously accepted,
    # but now have a bumped Revision that needs to be accepted.
    diffs = db.query(models.Term).join(models.AcceptedTerm).filter(
        models.AcceptedTerm.Revision < models.Term.Revision).all()

    # Query the database for any terms that have not yet been accepted.
    unaccepted = db.query(models.Term).filter(
        ~models.Term.ID.in_(db.query(models.AcceptedTerm.TermsID))).all()

    for record in (diffs + unaccepted):
        db.refresh(record)

    # Translate the 'Terms of Service' part of our page title.
    _ = l10n.get_translator_for_request(request)
    title = f"AUR {_('Terms of Service')}"
    context = await make_variable_context(request, title)

    accept_needed = sorted(unaccepted + diffs)
    return render_terms_of_service(request, context, accept_needed)


@router.post("/tos")
@requires_auth
async def terms_of_service_post(request: Request,
                                accept: bool = Form(default=False)):
    # Query the database for terms that were previously accepted,
    # but now have a bumped Revision that needs to be accepted.
    diffs = db.query(models.Term).join(models.AcceptedTerm).filter(
        models.AcceptedTerm.Revision < models.Term.Revision).all()

    # Query the database for any terms that have not yet been accepted.
    unaccepted = db.query(models.Term).filter(
        ~models.Term.ID.in_(db.query(models.AcceptedTerm.TermsID))).all()

    if not accept:
        # Translate the 'Terms of Service' part of our page title.
        _ = l10n.get_translator_for_request(request)
        title = f"AUR {_('Terms of Service')}"
        context = await make_variable_context(request, title)

        # We already did the database filters here, so let's just use
        # them instead of reiterating the process in terms_of_service.
        accept_needed = sorted(unaccepted + diffs)
        return render_terms_of_service(
            request, context, util.apply_all(accept_needed, db.refresh))

    with db.begin():
        # For each term we found, query for the matching accepted term
        # and update its Revision to the term's current Revision.
        for term in diffs:
            db.refresh(term)
            accepted_term = request.user.accepted_terms.filter(
                models.AcceptedTerm.TermsID == term.ID).first()
            accepted_term.Revision = term.Revision

        # For each term that was never accepted, accept it!
        for term in unaccepted:
            db.refresh(term)
            db.create(models.AcceptedTerm, User=request.user,
                      Term=term, Revision=term.Revision)

    return RedirectResponse("/", status_code=HTTPStatus.SEE_OTHER)
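The two /tos queries split the pending terms into "diffs" (terms whose Revision was bumped past the revision the user accepted) and "unaccepted" (terms never accepted at all). The same partition can be sketched with plain dicts standing in for the Term and AcceptedTerm models; the term names and revision numbers here are made up:

```python
terms = {"privacy": 2, "submission": 1, "conduct": 1}   # name -> current Revision
accepted = {"privacy": 1, "submission": 1}              # name -> accepted Revision

# Accepted before, but the Revision has since been bumped.
diffs = [t for t, rev in terms.items()
         if t in accepted and accepted[t] < rev]

# Never accepted at all.
unaccepted = [t for t in terms if t not in accepted]

print(sorted(diffs + unaccepted))  # both kinds need (re-)acceptance
```

The POST handler then resolves each kind differently: bumped terms get their AcceptedTerm row updated to the new Revision, while never-accepted terms get a fresh AcceptedTerm row created.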
aurweb/routers/auth.py (new file, 93 lines)
@@ -0,0 +1,93 @@
|
||||||
|
from http import HTTPStatus

from fastapi import APIRouter, Form, HTTPException, Request
from fastapi.responses import HTMLResponse, RedirectResponse

import aurweb.config

from aurweb import cookies, db, time
from aurweb.auth import requires_auth, requires_guest
from aurweb.l10n import get_translator_for_request
from aurweb.models import User
from aurweb.templates import make_variable_context, render_template

router = APIRouter()


async def login_template(request: Request, next: str, errors: list = None):
    """ Provide login-specific template context to render_template. """
    context = await make_variable_context(request, "Login", next)
    context["errors"] = errors
    context["url_base"] = f"{request.url.scheme}://{request.url.netloc}"
    return render_template(request, "login.html", context)


@router.get("/login", response_class=HTMLResponse)
async def login_get(request: Request, next: str = "/"):
    return await login_template(request, next)


@router.post("/login", response_class=HTMLResponse)
@requires_guest
async def login_post(request: Request,
                     next: str = Form(...),
                     user: str = Form(default=str()),
                     passwd: str = Form(default=str()),
                     remember_me: bool = Form(default=False)):
    # TODO: Once the Origin header gets broader adoption, this code can be
    # slightly simplified to use it.
    login_path = aurweb.config.get("options", "aur_location") + "/login"
    referer = request.headers.get("Referer")
    if not referer or not referer.startswith(login_path):
        _ = get_translator_for_request(request)
        raise HTTPException(status_code=HTTPStatus.BAD_REQUEST,
                            detail=_("Bad Referer header."))

    user = db.query(User).filter(User.Username == user).first()
    if not user:
        return await login_template(request, next,
                                    errors=["Bad username or password."])

    cookie_timeout = cookies.timeout(remember_me)
    sid = user.login(request, passwd, cookie_timeout)
    if not sid:
        return await login_template(request, next,
                                    errors=["Bad username or password."])

    login_timeout = aurweb.config.getint("options", "login_timeout")

    expires_at = int(time.utcnow() + max(cookie_timeout, login_timeout))

    response = RedirectResponse(url=next,
                                status_code=HTTPStatus.SEE_OTHER)

    secure = aurweb.config.getboolean("options", "disable_http_login")
    response.set_cookie("AURSID", sid, expires=expires_at,
                        secure=secure, httponly=secure,
                        samesite=cookies.samesite())
    response.set_cookie("AURTZ", user.Timezone,
                        secure=secure, httponly=secure,
                        samesite=cookies.samesite())
    response.set_cookie("AURLANG", user.LangPreference,
                        secure=secure, httponly=secure,
                        samesite=cookies.samesite())
    response.set_cookie("AURREMEMBER", remember_me,
                        expires=expires_at,
                        secure=secure, httponly=secure,
                        samesite=cookies.samesite())
    return response


@router.post("/logout")
@requires_auth
async def logout(request: Request, next: str = Form(default="/")):
    if request.user.is_authenticated():
        request.user.logout(request)

    # Use 303 since we may be handling a post request, that'll get it
    # to redirect to a get request.
    response = RedirectResponse(url=next,
                                status_code=HTTPStatus.SEE_OTHER)
    response.delete_cookie("AURSID")
    response.delete_cookie("AURTZ")
    return response
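The session cookie's expiry in login_post is the larger of the remember-me cookie timeout and the configured login_timeout, both counted from the current UTC timestamp. A minimal sketch of that computation (times in seconds; the sample timeout values are made up, not aurweb's defaults):

```python
import time


def session_expiry(now: int, cookie_timeout: int, login_timeout: int) -> int:
    # The AURSID cookie lives for whichever timeout is longer.
    return int(now + max(cookie_timeout, login_timeout))


now = int(time.time())
# e.g. a long remember-me cookie timeout vs. a 2-hour login timeout:
# the remember-me value wins.
assert session_expiry(now, 403200, 7200) == now + 403200
```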
aurweb/routers/html.py (new file, 223 lines)
@@ -0,0 +1,223 @@
""" AURWeb's primary routing module. Define all routes via @app.app.{get,post}
|
||||||
|
decorators in some way; more complex routes should be defined in their
|
||||||
|
own modules and imported here. """
|
||||||
|
import os
|
||||||
|
|
||||||
|
from http import HTTPStatus
|
||||||
|
|
||||||
|
from fastapi import APIRouter, Form, HTTPException, Request, Response
|
||||||
|
from fastapi.responses import HTMLResponse, RedirectResponse
|
||||||
|
from prometheus_client import CONTENT_TYPE_LATEST, CollectorRegistry, generate_latest, multiprocess
|
||||||
|
from sqlalchemy import and_, case, or_
|
||||||
|
|
||||||
|
import aurweb.config
|
||||||
|
import aurweb.models.package_request
|
||||||
|
|
||||||
|
from aurweb import cookies, db, models, time, util
|
||||||
|
from aurweb.cache import db_count_cache
|
||||||
|
from aurweb.models.account_type import TRUSTED_USER_AND_DEV_ID, TRUSTED_USER_ID
|
||||||
|
from aurweb.models.package_request import PENDING_ID
|
||||||
|
from aurweb.packages.util import query_notified, query_voted, updated_packages
|
||||||
|
from aurweb.templates import make_context, render_template
|
||||||
|
|
||||||
|
router = APIRouter()
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/favicon.ico")
|
||||||
|
async def favicon(request: Request):
|
||||||
|
""" Some browsers attempt to find a website's favicon via root uri at
|
||||||
|
/favicon.ico, so provide a redirection here to our static icon. """
|
||||||
|
return RedirectResponse("/static/images/favicon.ico")
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/language", response_class=RedirectResponse)
|
||||||
|
async def language(request: Request,
|
||||||
|
set_lang: str = Form(...),
|
||||||
|
next: str = Form(...),
|
||||||
|
q: str = Form(default=None)):
|
||||||
|
"""
|
||||||
|
A POST route used to set a session's language.
|
||||||
|
|
||||||
|
Return a 303 See Other redirect to {next}?next={next}. If we are
|
||||||
|
setting the language on any page, we want to preserve query
|
||||||
|
parameters across the redirect.
|
||||||
|
"""
|
||||||
|
if next[0] != '/':
|
||||||
|
return HTMLResponse(b"Invalid 'next' parameter.", status_code=400)
|
||||||
|
|
||||||
|
query_string = "?" + q if q else str()
|
||||||
|
|
||||||
|
# If the user is authenticated, update the user's LangPreference.
|
||||||
|
if request.user.is_authenticated():
|
||||||
|
with db.begin():
|
||||||
|
request.user.LangPreference = set_lang
|
||||||
|
|
||||||
|
# In any case, set the response's AURLANG cookie that never expires.
|
||||||
|
response = RedirectResponse(url=f"{next}{query_string}",
|
||||||
|
status_code=HTTPStatus.SEE_OTHER)
|
||||||
|
secure = aurweb.config.getboolean("options", "disable_http_login")
|
||||||
|
response.set_cookie("AURLANG", set_lang,
|
||||||
|
secure=secure, httponly=secure,
|
||||||
|
samesite=cookies.samesite())
|
||||||
|
return response
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/", response_class=HTMLResponse)
|
||||||
|
async def index(request: Request):
|
||||||
|
""" Homepage route. """
|
||||||
|
context = make_context(request, "Home")
|
||||||
|
context['ssh_fingerprints'] = util.get_ssh_fingerprints()
|
||||||
|
|
||||||
|
bases = db.query(models.PackageBase)
|
||||||
|
|
||||||
|
redis = aurweb.redis.redis_connection()
|
||||||
|
cache_expire = 300 # Five minutes.
|
||||||
|
|
||||||
|
# Package statistics.
|
||||||
|
query = bases.filter(models.PackageBase.PackagerUID.isnot(None))
|
||||||
|
context["package_count"] = await db_count_cache(
|
||||||
|
redis, "package_count", query, expire=cache_expire)
|
||||||
|
|
||||||
|
query = bases.filter(
|
||||||
|
and_(models.PackageBase.MaintainerUID.is_(None),
|
||||||
|
models.PackageBase.PackagerUID.isnot(None))
|
||||||
|
)
|
||||||
|
context["orphan_count"] = await db_count_cache(
|
||||||
|
redis, "orphan_count", query, expire=cache_expire)
|
||||||
|
|
||||||
|
query = db.query(models.User)
|
||||||
|
context["user_count"] = await db_count_cache(
|
||||||
|
redis, "user_count", query, expire=cache_expire)
|
||||||
|
|
||||||
|
query = query.filter(
|
||||||
|
or_(models.User.AccountTypeID == TRUSTED_USER_ID,
|
||||||
|
models.User.AccountTypeID == TRUSTED_USER_AND_DEV_ID))
|
||||||
|
context["trusted_user_count"] = await db_count_cache(
|
||||||
|
redis, "trusted_user_count", query, expire=cache_expire)
|
||||||
|
|
||||||
|
# Current timestamp.
|
||||||
|
now = time.utcnow()
|
||||||
|
|
||||||
|
seven_days = 86400 * 7 # Seven days worth of seconds.
|
||||||
|
seven_days_ago = now - seven_days
|
||||||
|
|
||||||
|
one_hour = 3600
|
||||||
|
updated = bases.filter(
|
||||||
|
and_(models.PackageBase.ModifiedTS - models.PackageBase.SubmittedTS >= one_hour,
|
||||||
|
models.PackageBase.PackagerUID.isnot(None))
|
||||||
|
)
|
||||||
|
|
||||||
|
query = bases.filter(
|
||||||
|
and_(models.PackageBase.SubmittedTS >= seven_days_ago,
|
||||||
|
models.PackageBase.PackagerUID.isnot(None))
|
||||||
|
)
|
||||||
|
context["seven_days_old_added"] = await db_count_cache(
|
||||||
|
redis, "seven_days_old_added", query, expire=cache_expire)
|
||||||
|
|
||||||
|
query = updated.filter(models.PackageBase.ModifiedTS >= seven_days_ago)
|
||||||
|
context["seven_days_old_updated"] = await db_count_cache(
|
||||||
|
redis, "seven_days_old_updated", query, expire=cache_expire)
|
||||||
|
|
||||||
|
year = seven_days * 52 # Fifty two weeks worth: one year.
|
||||||
|
year_ago = now - year
|
||||||
|
query = updated.filter(models.PackageBase.ModifiedTS >= year_ago)
|
||||||
|
context["year_old_updated"] = await db_count_cache(
|
||||||
|
redis, "year_old_updated", query, expire=cache_expire)
|
||||||
|
|
||||||
|
query = bases.filter(
|
||||||
|
models.PackageBase.ModifiedTS - models.PackageBase.SubmittedTS < 3600)
|
||||||
|
context["never_updated"] = await db_count_cache(
|
||||||
|
redis, "never_updated", query, expire=cache_expire)
|
||||||
|
|
||||||
|
# Get the 15 most recently updated packages.
|
||||||
|
context["package_updates"] = updated_packages(15, cache_expire)
|
||||||
|
|
||||||
|
if request.user.is_authenticated():
|
||||||
|
# Authenticated users get a few extra pieces of data for
|
||||||
|
# the dashboard display.
|
||||||
|
packages = db.query(models.Package).join(models.PackageBase)
|
||||||
|
|
||||||
|
maintained = packages.join(
|
||||||
|
models.User, models.PackageBase.MaintainerUID == models.User.ID
|
||||||
|
).filter(
|
||||||
|
models.PackageBase.MaintainerUID == request.user.ID
|
||||||
|
)
|
||||||
|
|
||||||
|
# Packages maintained by the user that have been flagged.
|
||||||
|
context["flagged_packages"] = maintained.filter(
|
||||||
|
models.PackageBase.OutOfDateTS.isnot(None)
|
||||||
|
).order_by(
|
||||||
|
models.PackageBase.ModifiedTS.desc(), models.Package.Name.asc()
|
||||||
|
).limit(50).all()
|
||||||
|
|
||||||
|
# Flagged packages that request.user has voted for.
|
||||||
|
context["flagged_packages_voted"] = query_voted(
|
||||||
|
context.get("flagged_packages"), request.user)
|
||||||
|
|
||||||
|
# Flagged packages that request.user is being notified about.
|
||||||
|
context["flagged_packages_notified"] = query_notified(
|
||||||
|
context.get("flagged_packages"), request.user)
|
||||||
|
|
||||||
|
archive_time = aurweb.config.getint('options', 'request_archive_time')
|
||||||
|
start = now - archive_time
|
||||||
|
|
||||||
|
# Package requests created by request.user.
|
||||||
|
context["package_requests"] = request.user.package_requests.filter(
|
||||||
|
models.PackageRequest.RequestTS >= start
|
||||||
|
).order_by(
|
||||||
|
# Order primarily by the Status column being PENDING_ID,
|
||||||
|
# and secondarily by RequestTS; both in descending order.
|
||||||
|
case([(models.PackageRequest.Status == PENDING_ID, 1)],
|
||||||
|
else_=0).desc(),
|
||||||
|
models.PackageRequest.RequestTS.desc()
|
||||||
|
).limit(50).all()
|
||||||
|
|
||||||
|
# Packages that the request user maintains or comaintains.
|
||||||
|
context["packages"] = maintained.order_by(
|
||||||
|
models.PackageBase.ModifiedTS.desc(), models.Package.Name.desc()
|
||||||
|
).limit(50).all()
|
||||||
|
|
||||||
|
# Packages that request.user has voted for.
|
||||||
|
context["packages_voted"] = query_voted(
|
||||||
|
context.get("packages"), request.user)
|
||||||
|
|
||||||
|
# Packages that request.user is being notified about.
|
||||||
|
context["packages_notified"] = query_notified(
|
||||||
|
context.get("packages"), request.user)
|
||||||
|
|
||||||
|
# Any packages that the request user comaintains.
|
||||||
|
context["comaintained"] = packages.join(
|
||||||
|
models.PackageComaintainer
|
||||||
|
).filter(
|
||||||
|
models.PackageComaintainer.UsersID == request.user.ID
|
||||||
|
).order_by(
|
||||||
|
models.PackageBase.ModifiedTS.desc(), models.Package.Name.desc()
|
||||||
|
).limit(50).all()
|
||||||
|
|
||||||
|
# Comaintained packages that request.user has voted for.
|
||||||
|
context["comaintained_voted"] = query_voted(
|
||||||
|
context.get("comaintained"), request.user)
|
||||||
|
|
||||||
|
# Comaintained packages that request.user is being notified about.
|
||||||
|
context["comaintained_notified"] = query_notified(
|
||||||
|
context.get("comaintained"), request.user)
|
||||||
|
|
||||||
|
return render_template(request, "index.html", context)
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/metrics")
|
||||||
|
async def metrics(request: Request):
|
||||||
|
registry = CollectorRegistry()
|
||||||
|
if os.environ.get("PROMETHEUS_MULTIPROC_DIR", None): # pragma: no cover
|
||||||
|
multiprocess.MultiProcessCollector(registry)
|
||||||
|
data = generate_latest(registry)
|
||||||
|
headers = {
|
||||||
|
"Content-Type": CONTENT_TYPE_LATEST,
|
||||||
|
"Content-Length": str(len(data))
|
||||||
|
}
|
||||||
|
return Response(data, headers=headers)
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/raisefivethree", response_class=HTMLResponse)
|
||||||
|
async def raise_service_unavailable(request: Request):
|
||||||
|
raise HTTPException(status_code=HTTPStatus.SERVICE_UNAVAILABLE)
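The dashboard's package_requests ordering in index above sorts pending requests first and, within each group, newest first, via a SQL case(...).desc(). The same ordering can be mimicked in plain Python with a sort key; PENDING_ID and the (status, timestamp) tuples below are simplified stand-ins for PackageRequest rows:

```python
PENDING_ID = 0  # Stand-in for aurweb's pending status ID.

# (status, request_timestamp) pairs standing in for PackageRequest rows.
requests = [(1, 300), (0, 100), (0, 200), (2, 400)]

# Sort primarily by "is pending" and secondarily by timestamp,
# both descending, mirroring the SQL case(...).desc() ordering.
ordered = sorted(requests,
                 key=lambda r: (1 if r[0] == PENDING_ID else 0, r[1]),
                 reverse=True)
print(ordered)  # Pending requests (status 0) first, then newest first.
```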
aurweb/routers/packages.py (new file, 440 lines)
@@ -0,0 +1,440 @@
from collections import defaultdict
|
||||||
|
from http import HTTPStatus
|
||||||
|
from typing import Any, Dict, List
|
||||||
|
|
||||||
|
from fastapi import APIRouter, Form, Request, Response
|
||||||
|
|
||||||
|
import aurweb.filters # noqa: F401
|
||||||
|
|
||||||
|
from aurweb import config, db, defaults, logging, models, util
|
||||||
|
from aurweb.auth import creds, requires_auth
|
||||||
|
from aurweb.exceptions import InvariantError
|
||||||
|
from aurweb.models.relation_type import CONFLICTS_ID, PROVIDES_ID, REPLACES_ID
|
||||||
|
from aurweb.packages import util as pkgutil
|
||||||
|
from aurweb.packages.search import PackageSearch
|
||||||
|
from aurweb.packages.util import get_pkg_or_base
|
||||||
|
from aurweb.pkgbase import actions as pkgbase_actions
|
||||||
|
from aurweb.pkgbase import util as pkgbaseutil
|
||||||
|
from aurweb.templates import make_context, render_template
|
||||||
|
|
||||||
|
logger = logging.get_logger(__name__)
|
||||||
|
router = APIRouter()
|
||||||
|
|
||||||
|
|
||||||
|
async def packages_get(request: Request, context: Dict[str, Any],
|
||||||
|
status_code: HTTPStatus = HTTPStatus.OK):
|
||||||
|
# Query parameters used in this request.
|
||||||
|
context["q"] = dict(request.query_params)
|
||||||
|
|
||||||
|
# Per page and offset.
|
||||||
|
offset, per_page = util.sanitize_params(
|
||||||
|
request.query_params.get("O", defaults.O),
|
||||||
|
request.query_params.get("PP", defaults.PP))
|
||||||
|
context["O"] = offset
|
||||||
|
context["PP"] = per_page
|
||||||
|
|
||||||
|
# Query search by.
|
||||||
|
search_by = context["SeB"] = request.query_params.get("SeB", "nd")
|
||||||
|
|
||||||
|
# Query sort by.
|
||||||
|
sort_by = context["SB"] = request.query_params.get("SB", "p")
|
||||||
|
|
||||||
|
# Query sort order.
|
||||||
|
sort_order = request.query_params.get("SO", None)
|
||||||
|
|
||||||
|
# Apply ordering, limit and offset.
|
||||||
|
search = PackageSearch(request.user)
|
||||||
|
|
||||||
|
# For each keyword found in K, apply a search_by filter.
|
||||||
|
# This means that for any sentences separated by spaces,
|
||||||
|
# they are used as if they were ANDed.
|
||||||
|
keywords = context["K"] = request.query_params.get("K", str())
|
||||||
|
keywords = keywords.split(" ")
|
||||||
|
for keyword in keywords:
|
||||||
|
search.search_by(search_by, keyword)
|
||||||
|
|
||||||
|
# Collect search result count here; we've applied our keywords.
|
||||||
|
# Including more query operations below, like ordering, will
|
||||||
|
# increase the amount of time required to collect a count.
|
||||||
|
limit = config.getint("options", "max_search_results")
|
||||||
|
num_packages = search.count(limit)
|
||||||
|
|
||||||
|
flagged = request.query_params.get("outdated", None)
|
||||||
|
if flagged:
|
||||||
|
# If outdated was given, set it up in the context.
|
||||||
|
context["outdated"] = flagged
|
||||||
|
|
||||||
|
# When outdated is set to "on," we filter records which do have
|
||||||
|
# an OutOfDateTS. When it's set to "off," we filter out any which
|
||||||
|
# do **not** have OutOfDateTS.
|
||||||
|
criteria = None
|
||||||
|
if flagged == "on":
|
||||||
|
criteria = models.PackageBase.OutOfDateTS.isnot
|
||||||
|
else:
|
||||||
|
criteria = models.PackageBase.OutOfDateTS.is_
|
||||||
|
|
||||||
|
# Apply the flag criteria to our PackageSearch.query.
|
||||||
|
search.query = search.query.filter(criteria(None))
|
||||||
|
|
||||||
|
submit = request.query_params.get("submit", "Go")
|
||||||
|
if submit == "Orphans":
|
||||||
|
# If the user clicked the "Orphans" button, we only want
|
||||||
|
# orphaned packages.
|
||||||
|
search.query = search.query.filter(
|
||||||
|
models.PackageBase.MaintainerUID.is_(None))
|
||||||
|
|
||||||
|
# Apply user-specified specified sort column and ordering.
|
||||||
|
search.sort_by(sort_by, sort_order)
|
||||||
|
|
||||||
|
# If no SO was given, default the context SO to 'a' (Ascending).
|
||||||
|
# By default, if no SO is given, the search should sort by 'd'
|
||||||
|
# (Descending), but display "Ascending" for the Sort order select.
|
||||||
|
if sort_order is None:
|
||||||
|
sort_order = "a"
|
||||||
|
context["SO"] = sort_order
|
||||||
|
|
||||||
|
# Insert search results into the context.
|
||||||
|
results = search.results().with_entities(
|
||||||
|
models.Package.ID,
|
||||||
|
models.Package.Name,
|
||||||
|
models.Package.PackageBaseID,
|
||||||
|
models.Package.Version,
|
||||||
|
models.Package.Description,
|
||||||
|
models.PackageBase.Popularity,
|
||||||
|
models.PackageBase.NumVotes,
|
||||||
|
models.PackageBase.OutOfDateTS,
|
||||||
|
models.User.Username.label("Maintainer"),
|
||||||
|
models.PackageVote.PackageBaseID.label("Voted"),
|
||||||
|
models.PackageNotification.PackageBaseID.label("Notify")
|
||||||
|
)
|
||||||
|
|
||||||
|
packages = results.limit(per_page).offset(offset)
|
||||||
|
context["packages"] = packages
|
||||||
|
context["packages_count"] = num_packages
|
||||||
|
|
||||||
|
return render_template(request, "packages/index.html", context,
|
||||||
|
status_code=status_code)
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/packages")
|
||||||
|
async def packages(request: Request) -> Response:
|
||||||
|
context = make_context(request, "Packages")
|
||||||
|
return await packages_get(request, context)
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/packages/{name}")
|
||||||
|
async def package(request: Request, name: str) -> Response:
|
||||||
|
# Get the Package.
|
||||||
|
pkg = get_pkg_or_base(name, models.Package)
|
||||||
|
pkgbase = pkg.PackageBase
|
||||||
|
|
||||||
|
rels = pkg.package_relations.order_by(models.PackageRelation.RelName.asc())
|
||||||
|
rels_data = defaultdict(list)
|
||||||
|
for rel in rels:
|
||||||
|
if rel.RelTypeID == CONFLICTS_ID:
|
||||||
|
rels_data["c"].append(rel)
|
||||||
|
elif rel.RelTypeID == PROVIDES_ID:
|
||||||
|
rels_data["p"].append(rel)
|
||||||
|
elif rel.RelTypeID == REPLACES_ID:
|
||||||
|
rels_data["r"].append(rel)
|
||||||
|
|
||||||
|
# Add our base information.
|
||||||
|
context = pkgbaseutil.make_context(request, pkgbase)
|
||||||
|
context["package"] = pkg
|
||||||
|
|
||||||
|
# Package sources.
|
||||||
|
context["sources"] = pkg.package_sources.order_by(
|
||||||
|
models.PackageSource.Source.asc()).all()
|
||||||
|
|
||||||
|
# Package dependencies.
|
||||||
|
max_depends = config.getint("options", "max_depends")
|
||||||
|
context["dependencies"] = pkg.package_dependencies.order_by(
|
||||||
|
models.PackageDependency.DepTypeID.asc(),
|
||||||
|
models.PackageDependency.DepName.asc()
|
||||||
|
).limit(max_depends).all()
|
||||||
|
|
||||||
|
# Package requirements (other packages depend on this one).
|
||||||
|
context["required_by"] = pkgutil.pkg_required(
|
||||||
|
pkg.Name, [p.RelName for p in rels_data.get("p", [])], max_depends)
|
||||||
|
|
||||||
|
context["licenses"] = pkg.package_licenses
|
||||||
|
|
||||||
|
conflicts = pkg.package_relations.filter(
|
||||||
|
models.PackageRelation.RelTypeID == CONFLICTS_ID
|
||||||
|
).order_by(models.PackageRelation.RelName.asc())
|
||||||
|
context["conflicts"] = conflicts
|
||||||
|
|
||||||
|
provides = pkg.package_relations.filter(
|
||||||
|
models.PackageRelation.RelTypeID == PROVIDES_ID
|
||||||
|
).order_by(models.PackageRelation.RelName.asc())
|
||||||
|
context["provides"] = provides
|
||||||
|
|
||||||
|
replaces = pkg.package_relations.filter(
|
||||||
|
models.PackageRelation.RelTypeID == REPLACES_ID
|
||||||
|
).order_by(models.PackageRelation.RelName.asc())
|
||||||
|
context["replaces"] = replaces
|
||||||
|
|
||||||
|
return render_template(request, "packages/show.html", context)
|
||||||
|
|
||||||
|
|
||||||
|
async def packages_unflag(request: Request, package_ids: List[int] = [],
|
||||||
|
**kwargs):
|
||||||
|
if not package_ids:
|
||||||
|
return (False, ["You did not select any packages to unflag."])
|
||||||
|
|
||||||
|
# Holds the set of package bases we're looking to unflag.
|
||||||
|
# Constructed below via looping through the packages query.
|
||||||
|
bases = set()
|
||||||
|
|
||||||
|
package_ids = set(package_ids) # Convert this to a set for O(1).
|
||||||
|
packages = db.query(models.Package).filter(
|
||||||
|
models.Package.ID.in_(package_ids)).all()
|
||||||
|
for pkg in packages:
|
||||||
|
has_cred = request.user.has_credential(
|
||||||
|
creds.PKGBASE_UNFLAG, approved=[pkg.PackageBase.Flagger])
|
||||||
|
if not has_cred:
|
||||||
|
return (False, ["You did not select any packages to unflag."])
|
||||||
|
|
||||||
|
if pkg.PackageBase not in bases:
|
||||||
|
bases.update({pkg.PackageBase})
|
||||||
|
|
||||||
|
for pkgbase in bases:
|
||||||
|
pkgbase_actions.pkgbase_unflag_instance(request, pkgbase)
|
||||||
|
return (True, ["The selected packages have been unflagged."])
|
||||||
|
|
||||||
|
|
||||||
|
async def packages_notify(request: Request, package_ids: List[int] = [],
|
||||||
|
**kwargs):
|
||||||
|
# In cases where we encounter errors with the request, we'll
|
||||||
|
# use this error tuple as a return value.
|
||||||
|
# TODO: This error does not yet have a translation.
|
||||||
|
error_tuple = (False,
|
||||||
|
["You did not select any packages to be notified about."])
|
||||||
|
if not package_ids:
|
||||||
|
return error_tuple
|
||||||
|
|
||||||
|
bases = set()
|
||||||
|
package_ids = set(package_ids)
|
||||||
|
packages = db.query(models.Package).filter(
|
||||||
|
models.Package.ID.in_(package_ids)).all()
|
||||||
|
|
||||||
|
for pkg in packages:
|
||||||
|
if pkg.PackageBase not in bases:
|
||||||
|
bases.update({pkg.PackageBase})
|
||||||
|
|
||||||
|
# Perform some checks on what the user selected for notify.
|
||||||
|
for pkgbase in bases:
|
||||||
|
notif = db.query(pkgbase.notifications.filter(
|
||||||
|
models.PackageNotification.UserID == request.user.ID
|
||||||
|
).exists()).scalar()
|
||||||
|
has_cred = request.user.has_credential(creds.PKGBASE_NOTIFY)
|
||||||
|
|
||||||
|
# If the request user either does not have credentials
|
||||||
|
# or the notification already exists:
|
||||||
|
if not (has_cred and not notif):
|
||||||
|
return error_tuple
|
||||||
|
|
||||||
|
# If we get here, user input is good.
|
||||||
|
for pkgbase in bases:
|
||||||
|
pkgbase_actions.pkgbase_notify_instance(request, pkgbase)
|
||||||
|
|
||||||
|
# TODO: This message does not yet have a translation.
|
||||||
|
return (True, ["The selected packages' notifications have been enabled."])
|
||||||
|
|
||||||
|
|
||||||
|
async def packages_unnotify(request: Request, package_ids: List[int] = [],
|
||||||
|
**kwargs):
|
||||||
|
if not package_ids:
|
||||||
|
# TODO: This error does not yet have a translation.
|
||||||
|
return (False,
|
||||||
|
["You did not select any packages for notification removal."])
|
||||||
|
|
||||||
|
# TODO: This error does not yet have a translation.
|
||||||
|
error_tuple = (
|
||||||
|
False,
|
||||||
|
["A package you selected does not have notifications enabled."]
|
||||||
|
)
|
||||||
|
|
||||||
|
bases = set()
|
||||||
|
package_ids = set(package_ids)
|
||||||
|
packages = db.query(models.Package).filter(
|
||||||
|
models.Package.ID.in_(package_ids)).all()
|
||||||
|
|
||||||
|
for pkg in packages:
|
||||||
|
if pkg.PackageBase not in bases:
|
||||||
|
bases.update({pkg.PackageBase})
|
||||||
|
|
||||||
|
# Perform some checks on what the user selected for notify.
|
||||||
|
for pkgbase in bases:
|
||||||
|
notif = db.query(pkgbase.notifications.filter(
|
||||||
|
models.PackageNotification.UserID == request.user.ID
|
||||||
|
).exists()).scalar()
|
||||||
|
if not notif:
|
||||||
|
return error_tuple
|
||||||
|
|
||||||
|
for pkgbase in bases:
|
||||||
|
pkgbase_actions.pkgbase_unnotify_instance(request, pkgbase)
|
||||||
|
|
||||||
|
# TODO: This message does not yet have a translation.
|
||||||
|
return (True, ["The selected packages' notifications have been removed."])
|
||||||
|
|
||||||
|
|
||||||
|
async def packages_adopt(request: Request, package_ids: List[int] = [],
|
||||||
|
confirm: bool = False, **kwargs):
|
||||||
|
if not package_ids:
|
||||||
|
return (False, ["You did not select any packages to adopt."])
|
||||||
|
|
||||||
|
if not confirm:
|
||||||
|
return (False, ["The selected packages have not been adopted, "
|
||||||
|
"check the confirmation checkbox."])
|
||||||
|
|
||||||
|
bases = set()
|
||||||
|
package_ids = set(package_ids)
|
||||||
|
packages = db.query(models.Package).filter(
|
||||||
|
models.Package.ID.in_(package_ids)).all()
|
||||||
|
|
||||||
|
for pkg in packages:
|
||||||
|
if pkg.PackageBase not in bases:
|
||||||
|
bases.update({pkg.PackageBase})
|
||||||
|
|
||||||
|
# Check that the user has credentials for every package they selected.
|
||||||
|
for pkgbase in bases:
|
||||||
|
has_cred = request.user.has_credential(creds.PKGBASE_ADOPT)
|
||||||
|
if not (has_cred or not pkgbase.Maintainer):
|
||||||
|
# TODO: This error needs to be translated.
|
||||||
|
return (False, ["You are not allowed to adopt one of the "
|
||||||
|
"packages you selected."])
|
||||||
|
|
||||||
|
# Now, really adopt the bases.
|
||||||
|
for pkgbase in bases:
|
||||||
|
pkgbase_actions.pkgbase_adopt_instance(request, pkgbase)
|
||||||
|
|
||||||
|
return (True, ["The selected packages have been adopted."])
|
||||||
|
|
||||||
|
|
||||||
|
def disown_all(request: Request, pkgbases: List[models.PackageBase]) \
|
||||||
|
-> List[str]:
|
||||||
|
errors = []
|
||||||
|
for pkgbase in pkgbases:
|
||||||
|
try:
|
||||||
|
pkgbase_actions.pkgbase_disown_instance(request, pkgbase)
|
||||||
|
except InvariantError as exc:
|
||||||
|
errors.append(str(exc))
|
||||||
|
return errors
|
||||||
|
|
||||||
|
|
||||||
|
async def packages_disown(request: Request, package_ids: List[int] = [],
|
||||||
|
confirm: bool = False, **kwargs):
|
||||||
|
if not package_ids:
|
||||||
|
return (False, ["You did not select any packages to disown."])
|
||||||
|
|
||||||
|
if not confirm:
|
||||||
|
return (False, ["The selected packages have not been disowned, "
|
||||||
|
"check the confirmation checkbox."])
|
||||||
|
|
||||||
|
bases = set()
|
||||||
|
package_ids = set(package_ids)
|
||||||
|
packages = db.query(models.Package).filter(
|
||||||
|
models.Package.ID.in_(package_ids)).all()
|
||||||
|
|
||||||
|
for pkg in packages:
|
||||||
|
if pkg.PackageBase not in bases:
|
||||||
|
bases.update({pkg.PackageBase})
|
||||||
|
|
||||||
|
# Check that the user has credentials for every package they selected.
|
||||||
|
for pkgbase in bases:
|
||||||
|
has_cred = request.user.has_credential(creds.PKGBASE_DISOWN,
|
||||||
|
approved=[pkgbase.Maintainer])
|
||||||
|
if not has_cred:
|
||||||
|
# TODO: This error needs to be translated.
|
||||||
|
return (False, ["You are not allowed to disown one "
|
||||||
|
"of the packages you selected."])
|
||||||
|
|
||||||
|
# Now, disown all the bases if we can.
|
||||||
|
if errors := disown_all(request, bases):
|
||||||
|
return (False, errors)
|
||||||
|
|
||||||
|
return (True, ["The selected packages have been disowned."])
|
||||||
|
|
||||||
|
|
||||||
|
async def packages_delete(request: Request, package_ids: List[int] = [],
|
||||||
|
confirm: bool = False, merge_into: str = str(),
|
||||||
|
**kwargs):
|
||||||
|
if not package_ids:
|
||||||
|
return (False, ["You did not select any packages to delete."])
|
||||||
|
|
||||||
|
if not confirm:
|
||||||
|
return (False, ["The selected packages have not been deleted, "
|
||||||
|
"check the confirmation checkbox."])
|
||||||
|
|
||||||
|
if not request.user.has_credential(creds.PKGBASE_DELETE):
|
||||||
|
return (False, ["You do not have permission to delete packages."])
|
||||||
|
|
||||||
|
# set-ify package_ids and query the database for related records.
|
||||||
|
package_ids = set(package_ids)
|
||||||
|
packages = db.query(models.Package).filter(
|
||||||
|
models.Package.ID.in_(package_ids)).all()
|
||||||
|
|
||||||
|
if len(packages) != len(package_ids):
|
||||||
|
# Let the user know there was an issue with their input: they have
|
||||||
|
# provided at least one package_id which does not exist in the DB.
|
||||||
|
# TODO: This error has not yet been translated.
|
||||||
|
return (False, ["One of the packages you selected does not exist."])
|
||||||
|
|
||||||
|
# Make a set out of all package bases related to `packages`.
|
||||||
|
bases = {pkg.PackageBase for pkg in packages}
|
||||||
|
deleted_bases, notifs = [], []
|
||||||
|
for pkgbase in bases:
|
||||||
|
deleted_bases.append(pkgbase.Name)
|
||||||
|
notifs += pkgbase_actions.pkgbase_delete_instance(request, pkgbase)
|
||||||
|
|
||||||
|
# Log out the fact that this happened for accountability.
|
||||||
|
logger.info(f"Privileged user '{request.user.Username}' deleted the "
|
||||||
|
f"following package bases: {str(deleted_bases)}.")
|
||||||
|
|
||||||
|
util.apply_all(notifs, lambda n: n.send())
|
||||||
|
return (True, ["The selected packages have been deleted."])
|
||||||
|
|
||||||
|
# A mapping of action string -> callback functions used within the
|
||||||
|
# `packages_post` route below. We expect any action callback to
|
||||||
|
# return a tuple in the format: (succeeded: bool, message: List[str]).
|
||||||
|
PACKAGE_ACTIONS = {
|
||||||
|
"unflag": packages_unflag,
|
||||||
|
"notify": packages_notify,
|
||||||
|
"unnotify": packages_unnotify,
|
||||||
|
"adopt": packages_adopt,
|
||||||
|
"disown": packages_disown,
|
||||||
|
"delete": packages_delete,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/packages")
@requires_auth
async def packages_post(request: Request,
                        IDs: List[int] = Form(default=[]),
                        action: str = Form(default=str()),
                        confirm: bool = Form(default=False)):

    # If an invalid action is specified, just render GET /packages
    # with a BAD_REQUEST status_code.
    if action not in PACKAGE_ACTIONS:
        context = make_context(request, "Packages")
        return await packages_get(request, context, HTTPStatus.BAD_REQUEST)

    context = make_context(request, "Packages")

    # We deal with `IDs`, `merge_into` and `confirm` arguments
    # within action callbacks.
    callback = PACKAGE_ACTIONS.get(action)
    retval = await callback(request, package_ids=IDs, confirm=confirm)
    if retval:  # If *anything* was returned:
        success, messages = retval
        if not success:
            # If the first element was False:
            context["errors"] = messages
            return await packages_get(request, context, HTTPStatus.BAD_REQUEST)
        else:
            # Otherwise:
            context["success"] = messages

    return await packages_get(request, context)
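The action-to-callback dispatch used by `packages_post` can be sketched in isolation. This is a minimal, self-contained illustration of the `(succeeded, messages)` callback contract, not aurweb code; `demo_unflag`, `DEMO_ACTIONS` and `dispatch` are hypothetical stand-ins.

```python
import asyncio
from typing import Callable, Dict, List, Tuple

# Every action callback returns (succeeded: bool, messages: List[str]).
Result = Tuple[bool, List[str]]


async def demo_unflag(request, package_ids: List[int] = [], **kwargs) -> Result:
    if not package_ids:
        return (False, ["You did not select any packages to unflag."])
    return (True, ["The selected packages have been unflagged."])


DEMO_ACTIONS: Dict[str, Callable] = {"unflag": demo_unflag}


async def dispatch(action: str, **kwargs) -> Result:
    # Unknown actions are rejected before any callback runs, mirroring
    # the BAD_REQUEST branch in the route above.
    callback = DEMO_ACTIONS.get(action)
    if callback is None:
        return (False, ["Invalid action."])
    return await callback(None, **kwargs)


success, messages = asyncio.run(dispatch("unflag", package_ids=[1, 2]))
# success is True; messages carries the user-facing feedback strings.
```

The mapping keeps the route body a thin dispatcher: adding a new bulk action only means adding one entry to the dict.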
 aurweb/routers/pkgbase.py | 856 (new file)
@@ -0,0 +1,856 @@
from http import HTTPStatus

from fastapi import APIRouter, Form, HTTPException, Query, Request, Response
from fastapi.responses import JSONResponse, RedirectResponse
from sqlalchemy import and_

from aurweb import config, db, l10n, logging, templates, time, util
from aurweb.auth import creds, requires_auth
from aurweb.exceptions import InvariantError, ValidationError
from aurweb.models import PackageBase
from aurweb.models.package_comment import PackageComment
from aurweb.models.package_keyword import PackageKeyword
from aurweb.models.package_notification import PackageNotification
from aurweb.models.package_request import ACCEPTED_ID, PENDING_ID, PackageRequest
from aurweb.models.package_vote import PackageVote
from aurweb.models.request_type import DELETION_ID, MERGE_ID, ORPHAN_ID
from aurweb.packages.requests import update_closure_comment
from aurweb.packages.util import get_pkg_or_base, get_pkgbase_comment
from aurweb.pkgbase import actions
from aurweb.pkgbase import util as pkgbaseutil
from aurweb.pkgbase import validate
from aurweb.scripts import notify, popupdate
from aurweb.scripts.rendercomment import update_comment_render_fastapi
from aurweb.templates import make_variable_context, render_template

logger = logging.get_logger(__name__)
router = APIRouter()


@router.get("/pkgbase/{name}")
async def pkgbase(request: Request, name: str) -> Response:
    """
    Single package base view.

    :param request: FastAPI Request
    :param name: PackageBase.Name
    :return: HTMLResponse
    """
    # Get the PackageBase.
    pkgbase = get_pkg_or_base(name, PackageBase)

    # If this is not a split package, redirect to /packages/{name}.
    if pkgbase.packages.count() == 1:
        return RedirectResponse(f"/packages/{name}",
                                status_code=int(HTTPStatus.SEE_OTHER))

    # Add our base information.
    context = pkgbaseutil.make_context(request, pkgbase)
    context["packages"] = pkgbase.packages.all()

    return render_template(request, "pkgbase/index.html", context)


@router.get("/pkgbase/{name}/voters")
async def pkgbase_voters(request: Request, name: str) -> Response:
    """
    View of package base voters.

    Requires `request.user` has creds.PKGBASE_LIST_VOTERS credential.

    :param request: FastAPI Request
    :param name: PackageBase.Name
    :return: HTMLResponse
    """
    # Get the PackageBase.
    pkgbase = get_pkg_or_base(name, PackageBase)

    if not request.user.has_credential(creds.PKGBASE_LIST_VOTERS):
        return RedirectResponse(f"/pkgbase/{name}",
                                status_code=HTTPStatus.SEE_OTHER)

    context = templates.make_context(request, "Voters")
    context["pkgbase"] = pkgbase
    return render_template(request, "pkgbase/voters.html", context)


@router.get("/pkgbase/{name}/flag-comment")
async def pkgbase_flag_comment(request: Request, name: str):
    pkgbase = get_pkg_or_base(name, PackageBase)

    if pkgbase.Flagger is None:
        return RedirectResponse(f"/pkgbase/{name}",
                                status_code=HTTPStatus.SEE_OTHER)

    context = templates.make_context(request, "Flag Comment")
    context["pkgbase"] = pkgbase
    return render_template(request, "pkgbase/flag-comment.html", context)


@router.post("/pkgbase/{name}/keywords")
async def pkgbase_keywords(request: Request, name: str,
                           keywords: str = Form(default=str())):
    pkgbase = get_pkg_or_base(name, PackageBase)
    keywords = set(keywords.split(" "))

    # Delete all keywords which are not supplied by the user.
    other_keywords = pkgbase.keywords.filter(
        ~PackageKeyword.Keyword.in_(keywords))
    other_keyword_strings = [kwd.Keyword for kwd in other_keywords]

    existing_keywords = set(
        kwd.Keyword for kwd in
        pkgbase.keywords.filter(
            ~PackageKeyword.Keyword.in_(other_keyword_strings))
    )
    with db.begin():
        db.delete_all(other_keywords)
        for keyword in keywords.difference(existing_keywords):
            db.create(PackageKeyword,
                      PackageBase=pkgbase,
                      Keyword=keyword)

    return RedirectResponse(f"/pkgbase/{name}",
                            status_code=HTTPStatus.SEE_OTHER)


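The keyword update above reduces to plain set arithmetic: keywords in the database but absent from the submission are deleted, and submitted keywords not yet in the database are created. A standalone sketch with illustrative values (not aurweb code):

```python
# Hypothetical sample data: what the user submitted vs. what is stored.
submitted = {"vpn", "wireguard"}
in_db = {"vpn", "openvpn"}

to_delete = in_db - submitted   # stored keywords the user dropped
to_create = submitted - in_db   # new keywords the user added
# "vpn" appears in both sets, so it is left untouched.
```

Keywords present in both sets are neither deleted nor re-created, which avoids churning rows that did not change.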
@router.get("/pkgbase/{name}/flag")
@requires_auth
async def pkgbase_flag_get(request: Request, name: str):
    pkgbase = get_pkg_or_base(name, PackageBase)

    has_cred = request.user.has_credential(creds.PKGBASE_FLAG)
    if not has_cred or pkgbase.Flagger is not None:
        return RedirectResponse(f"/pkgbase/{name}",
                                status_code=HTTPStatus.SEE_OTHER)

    context = templates.make_context(request, "Flag Package Out-Of-Date")
    context["pkgbase"] = pkgbase
    return render_template(request, "pkgbase/flag.html", context)


@router.post("/pkgbase/{name}/flag")
@requires_auth
async def pkgbase_flag_post(request: Request, name: str,
                            comments: str = Form(default=str())):
    pkgbase = get_pkg_or_base(name, PackageBase)

    if not comments:
        context = templates.make_context(request, "Flag Package Out-Of-Date")
        context["pkgbase"] = pkgbase
        context["errors"] = ["The selected packages have not been flagged, "
                             "please enter a comment."]
        return render_template(request, "pkgbase/flag.html", context,
                               status_code=HTTPStatus.BAD_REQUEST)

    has_cred = request.user.has_credential(creds.PKGBASE_FLAG)
    if has_cred and not pkgbase.Flagger:
        now = time.utcnow()
        with db.begin():
            pkgbase.OutOfDateTS = now
            pkgbase.Flagger = request.user
            pkgbase.FlaggerComment = comments

    return RedirectResponse(f"/pkgbase/{name}",
                            status_code=HTTPStatus.SEE_OTHER)


@router.post("/pkgbase/{name}/comments")
@requires_auth
async def pkgbase_comments_post(
        request: Request, name: str,
        comment: str = Form(default=str()),
        enable_notifications: bool = Form(default=False)):
    """ Add a new comment via POST request. """
    pkgbase = get_pkg_or_base(name, PackageBase)

    if not comment:
        raise HTTPException(status_code=HTTPStatus.BAD_REQUEST)

    # Create the new comment record, along with a notification
    # subscription if the user requested one.
    now = time.utcnow()
    with db.begin():
        comment = db.create(PackageComment, User=request.user,
                            PackageBase=pkgbase,
                            Comments=comment, RenderedComment=str(),
                            CommentTS=now)

        if enable_notifications and not request.user.notified(pkgbase):
            db.create(PackageNotification,
                      User=request.user,
                      PackageBase=pkgbase)
    update_comment_render_fastapi(comment)

    # Redirect to the pkgbase page.
    return RedirectResponse(f"/pkgbase/{pkgbase.Name}#comment-{comment.ID}",
                            status_code=HTTPStatus.SEE_OTHER)


@router.get("/pkgbase/{name}/comments/{id}/form")
@requires_auth
async def pkgbase_comment_form(request: Request, name: str, id: int,
                               next: str = Query(default=None)):
    """
    Produce a comment form for comment {id}.

    This route is used as a partial HTML endpoint when editing
    package comments via Javascript. This endpoint used to be
    part of the RPC as type=get-comment-form and has been
    relocated here because the form returned cannot be used
    externally and requires a POST request by the user.

    :param request: FastAPI Request
    :param name: PackageBase.Name
    :param id: PackageComment.ID
    :param next: Optional `next` value used for the comment form
    :return: JSONResponse
    """
    pkgbase = get_pkg_or_base(name, PackageBase)
    comment = pkgbase.comments.filter(PackageComment.ID == id).first()
    if not comment:
        return JSONResponse({}, status_code=HTTPStatus.NOT_FOUND)

    if not request.user.is_elevated() and request.user != comment.User:
        return JSONResponse({}, status_code=HTTPStatus.UNAUTHORIZED)

    context = pkgbaseutil.make_context(request, pkgbase)
    context["comment"] = comment

    if not next:
        next = f"/pkgbase/{name}"

    context["next"] = next

    form = templates.render_raw_template(
        request, "partials/packages/comment_form.html", context)
    return JSONResponse({"form": form})


@router.get("/pkgbase/{name}/comments/{id}/edit")
@requires_auth
async def pkgbase_comment_edit(request: Request, name: str, id: int,
                               next: str = Form(default=None)):
    """
    Render the non-javascript edit form.

    :param request: FastAPI Request
    :param name: PackageBase.Name
    :param id: PackageComment.ID
    :param next: Optional `next` parameter used in the POST request
    :return: HTMLResponse
    """
    pkgbase = get_pkg_or_base(name, PackageBase)
    comment = get_pkgbase_comment(pkgbase, id)

    if not next:
        next = f"/pkgbase/{name}"

    context = await make_variable_context(request, "Edit comment", next=next)
    context["comment"] = comment
    return render_template(request, "pkgbase/comments/edit.html", context)


@router.post("/pkgbase/{name}/comments/{id}")
@requires_auth
async def pkgbase_comment_post(
        request: Request, name: str, id: int,
        comment: str = Form(default=str()),
        enable_notifications: bool = Form(default=False),
        next: str = Form(default=None)):
    """ Edit an existing comment. """
    pkgbase = get_pkg_or_base(name, PackageBase)
    db_comment = get_pkgbase_comment(pkgbase, id)

    if not comment:
        raise HTTPException(status_code=HTTPStatus.BAD_REQUEST)

    # If the provided comment is different than the record's version,
    # update the db record.
    now = time.utcnow()
    if db_comment.Comments != comment:
        with db.begin():
            db_comment.Comments = comment
            db_comment.Editor = request.user
            db_comment.EditedTS = now

    db_notif = request.user.notifications.filter(
        PackageNotification.PackageBaseID == pkgbase.ID
    ).first()
    if enable_notifications and not db_notif:
        db.create(PackageNotification,
                  User=request.user,
                  PackageBase=pkgbase)
    update_comment_render_fastapi(db_comment)

    if not next:
        next = f"/pkgbase/{pkgbase.Name}"

    # Redirect to the pkgbase page anchored to the updated comment.
    return RedirectResponse(f"{next}#comment-{db_comment.ID}",
                            status_code=HTTPStatus.SEE_OTHER)


@router.post("/pkgbase/{name}/comments/{id}/pin")
@requires_auth
async def pkgbase_comment_pin(request: Request, name: str, id: int,
                              next: str = Form(default=None)):
    """
    Pin a comment.

    :param request: FastAPI Request
    :param name: PackageBase.Name
    :param id: PackageComment.ID
    :param next: Optional `next` parameter used in the POST request
    :return: RedirectResponse to `next`
    """
    pkgbase = get_pkg_or_base(name, PackageBase)
    comment = get_pkgbase_comment(pkgbase, id)

    has_cred = request.user.has_credential(creds.COMMENT_PIN,
                                           approved=[pkgbase.Maintainer])
    if not has_cred:
        _ = l10n.get_translator_for_request(request)
        raise HTTPException(
            status_code=HTTPStatus.UNAUTHORIZED,
            detail=_("You are not allowed to pin this comment."))

    now = time.utcnow()
    with db.begin():
        comment.PinnedTS = now

    if not next:
        next = f"/pkgbase/{name}"

    return RedirectResponse(next, status_code=HTTPStatus.SEE_OTHER)


@router.post("/pkgbase/{name}/comments/{id}/unpin")
@requires_auth
async def pkgbase_comment_unpin(request: Request, name: str, id: int,
                                next: str = Form(default=None)):
    """
    Unpin a comment.

    :param request: FastAPI Request
    :param name: PackageBase.Name
    :param id: PackageComment.ID
    :param next: Optional `next` parameter used in the POST request
    :return: RedirectResponse to `next`
    """
    pkgbase = get_pkg_or_base(name, PackageBase)
    comment = get_pkgbase_comment(pkgbase, id)

    has_cred = request.user.has_credential(creds.COMMENT_PIN,
                                           approved=[pkgbase.Maintainer])
    if not has_cred:
        _ = l10n.get_translator_for_request(request)
        raise HTTPException(
            status_code=HTTPStatus.UNAUTHORIZED,
            detail=_("You are not allowed to unpin this comment."))

    with db.begin():
        comment.PinnedTS = 0

    if not next:
        next = f"/pkgbase/{name}"

    return RedirectResponse(next, status_code=HTTPStatus.SEE_OTHER)


@router.post("/pkgbase/{name}/comments/{id}/delete")
@requires_auth
async def pkgbase_comment_delete(request: Request, name: str, id: int,
                                 next: str = Form(default=None)):
    """
    Delete a comment.

    This action does **not** delete the comment from the database, but
    sets PackageComment.DelTS and PackageComment.DeleterUID, which are
    used to decide who gets to view the comment and what utilities it
    gets.

    :param request: FastAPI Request
    :param name: PackageBase.Name
    :param id: PackageComment.ID
    :param next: Optional `next` parameter used in the POST request
    :return: RedirectResponse to `next`
    """
    pkgbase = get_pkg_or_base(name, PackageBase)
    comment = get_pkgbase_comment(pkgbase, id)

    authorized = request.user.has_credential(creds.COMMENT_DELETE,
                                             [comment.User])
    if not authorized:
        _ = l10n.get_translator_for_request(request)
        raise HTTPException(
            status_code=HTTPStatus.UNAUTHORIZED,
            detail=_("You are not allowed to delete this comment."))

    now = time.utcnow()
    with db.begin():
        comment.Deleter = request.user
        comment.DelTS = now

    if not next:
        next = f"/pkgbase/{name}"

    return RedirectResponse(next, status_code=HTTPStatus.SEE_OTHER)


@router.post("/pkgbase/{name}/comments/{id}/undelete")
@requires_auth
async def pkgbase_comment_undelete(request: Request, name: str, id: int,
                                   next: str = Form(default=None)):
    """
    Undelete a comment.

    This action does **not** undelete any comment from the database, but
    unsets PackageComment.DelTS and PackageComment.DeleterUID, which
    restores the comment to a standard state.

    :param request: FastAPI Request
    :param name: PackageBase.Name
    :param id: PackageComment.ID
    :param next: Optional `next` parameter used in the POST request
    :return: RedirectResponse to `next`
    """
    pkgbase = get_pkg_or_base(name, PackageBase)
    comment = get_pkgbase_comment(pkgbase, id)

    has_cred = request.user.has_credential(creds.COMMENT_UNDELETE,
                                           approved=[comment.User])
    if not has_cred:
        _ = l10n.get_translator_for_request(request)
        raise HTTPException(
            status_code=HTTPStatus.UNAUTHORIZED,
            detail=_("You are not allowed to undelete this comment."))

    with db.begin():
        comment.Deleter = None
        comment.DelTS = None

    if not next:
        next = f"/pkgbase/{name}"

    return RedirectResponse(next, status_code=HTTPStatus.SEE_OTHER)


@router.post("/pkgbase/{name}/vote")
@requires_auth
async def pkgbase_vote(request: Request, name: str):
    pkgbase = get_pkg_or_base(name, PackageBase)

    vote = pkgbase.package_votes.filter(
        PackageVote.UsersID == request.user.ID
    ).first()
    has_cred = request.user.has_credential(creds.PKGBASE_VOTE)
    if has_cred and not vote:
        now = time.utcnow()
        with db.begin():
            db.create(PackageVote,
                      User=request.user,
                      PackageBase=pkgbase,
                      VoteTS=now)

        # Update NumVotes/Popularity.
        popupdate.run_single(pkgbase)

    return RedirectResponse(f"/pkgbase/{name}",
                            status_code=HTTPStatus.SEE_OTHER)


@router.post("/pkgbase/{name}/unvote")
@requires_auth
async def pkgbase_unvote(request: Request, name: str):
    pkgbase = get_pkg_or_base(name, PackageBase)

    vote = pkgbase.package_votes.filter(
        PackageVote.UsersID == request.user.ID
    ).first()
    has_cred = request.user.has_credential(creds.PKGBASE_VOTE)
    if has_cred and vote:
        with db.begin():
            db.delete(vote)

        # Update NumVotes/Popularity.
        popupdate.run_single(pkgbase)

    return RedirectResponse(f"/pkgbase/{name}",
                            status_code=HTTPStatus.SEE_OTHER)


@router.post("/pkgbase/{name}/notify")
@requires_auth
async def pkgbase_notify(request: Request, name: str):
    pkgbase = get_pkg_or_base(name, PackageBase)
    actions.pkgbase_notify_instance(request, pkgbase)
    return RedirectResponse(f"/pkgbase/{name}",
                            status_code=HTTPStatus.SEE_OTHER)


@router.post("/pkgbase/{name}/unnotify")
@requires_auth
async def pkgbase_unnotify(request: Request, name: str):
    pkgbase = get_pkg_or_base(name, PackageBase)
    actions.pkgbase_unnotify_instance(request, pkgbase)
    return RedirectResponse(f"/pkgbase/{name}",
                            status_code=HTTPStatus.SEE_OTHER)


@router.post("/pkgbase/{name}/unflag")
@requires_auth
async def pkgbase_unflag(request: Request, name: str):
    pkgbase = get_pkg_or_base(name, PackageBase)
    actions.pkgbase_unflag_instance(request, pkgbase)
    return RedirectResponse(f"/pkgbase/{name}",
                            status_code=HTTPStatus.SEE_OTHER)


@router.get("/pkgbase/{name}/disown")
@requires_auth
async def pkgbase_disown_get(request: Request, name: str):
    pkgbase = get_pkg_or_base(name, PackageBase)

    has_cred = request.user.has_credential(creds.PKGBASE_DISOWN,
                                           approved=[pkgbase.Maintainer])
    if not has_cred:
        return RedirectResponse(f"/pkgbase/{name}",
                                status_code=HTTPStatus.SEE_OTHER)

    context = templates.make_context(request, "Disown Package")
    context["pkgbase"] = pkgbase
    return render_template(request, "pkgbase/disown.html", context)


@router.post("/pkgbase/{name}/disown")
@requires_auth
async def pkgbase_disown_post(request: Request, name: str,
                              comments: str = Form(default=str()),
                              confirm: bool = Form(default=False)):
    pkgbase = get_pkg_or_base(name, PackageBase)

    has_cred = request.user.has_credential(creds.PKGBASE_DISOWN,
                                           approved=[pkgbase.Maintainer])
    if not has_cred:
        return RedirectResponse(f"/pkgbase/{name}",
                                status_code=HTTPStatus.SEE_OTHER)

    context = templates.make_context(request, "Disown Package")
    context["pkgbase"] = pkgbase
    if not confirm:
        context["errors"] = [("The selected packages have not been disowned, "
                              "check the confirmation checkbox.")]
        return render_template(request, "pkgbase/disown.html", context,
                               status_code=HTTPStatus.BAD_REQUEST)

    with db.begin():
        update_closure_comment(pkgbase, ORPHAN_ID, comments)

    try:
        actions.pkgbase_disown_instance(request, pkgbase)
    except InvariantError as exc:
        context["errors"] = [str(exc)]
        return render_template(request, "pkgbase/disown.html", context,
                               status_code=HTTPStatus.BAD_REQUEST)

    return RedirectResponse(f"/pkgbase/{name}",
                            status_code=HTTPStatus.SEE_OTHER)


@router.post("/pkgbase/{name}/adopt")
@requires_auth
async def pkgbase_adopt_post(request: Request, name: str):
    pkgbase = get_pkg_or_base(name, PackageBase)

    has_cred = request.user.has_credential(creds.PKGBASE_ADOPT)
    if has_cred or not pkgbase.Maintainer:
        # If the user has credentials, they'll adopt the package regardless
        # of maintainership. Otherwise, we'll promote the user to maintainer
        # if no maintainer currently exists.
        actions.pkgbase_adopt_instance(request, pkgbase)

    return RedirectResponse(f"/pkgbase/{name}",
                            status_code=HTTPStatus.SEE_OTHER)


@router.get("/pkgbase/{name}/comaintainers")
@requires_auth
async def pkgbase_comaintainers(request: Request, name: str) -> Response:
    # Get the PackageBase.
    pkgbase = get_pkg_or_base(name, PackageBase)

    # Unauthorized users (Non-TU/Dev and not the pkgbase maintainer)
    # get redirected to the package base's page.
    has_creds = request.user.has_credential(creds.PKGBASE_EDIT_COMAINTAINERS,
                                            approved=[pkgbase.Maintainer])
    if not has_creds:
        return RedirectResponse(f"/pkgbase/{name}",
                                status_code=HTTPStatus.SEE_OTHER)

    # Add our base information.
    context = templates.make_context(request, "Manage Co-maintainers")
    context.update({
        "pkgbase": pkgbase,
        "comaintainers": [
            c.User.Username for c in pkgbase.comaintainers
        ]
    })

    return render_template(request, "pkgbase/comaintainers.html", context)


@router.post("/pkgbase/{name}/comaintainers")
@requires_auth
async def pkgbase_comaintainers_post(request: Request, name: str,
                                     users: str = Form(default=str())) \
        -> Response:
    # Get the PackageBase.
    pkgbase = get_pkg_or_base(name, PackageBase)

    # Unauthorized users (Non-TU/Dev and not the pkgbase maintainer)
    # get redirected to the package base's page.
    has_creds = request.user.has_credential(creds.PKGBASE_EDIT_COMAINTAINERS,
                                            approved=[pkgbase.Maintainer])
    if not has_creds:
        return RedirectResponse(f"/pkgbase/{name}",
                                status_code=HTTPStatus.SEE_OTHER)

    users = {e.strip() for e in users.split("\n") if bool(e.strip())}
    records = {c.User.Username for c in pkgbase.comaintainers}

    users_to_rm = records.difference(users)
    pkgbaseutil.remove_comaintainers(pkgbase, users_to_rm)
    logger.debug(f"{request.user} removed comaintainers from "
                 f"{pkgbase.Name}: {users_to_rm}")

    users_to_add = users.difference(records)
    error = pkgbaseutil.add_comaintainers(request, pkgbase, users_to_add)
    if error:
        context = templates.make_context(request, "Manage Co-maintainers")
        context["pkgbase"] = pkgbase
        context["comaintainers"] = [
            c.User.Username for c in pkgbase.comaintainers
        ]
        context["errors"] = [error]
        return render_template(request, "pkgbase/comaintainers.html", context)

    logger.debug(f"{request.user} added comaintainers to "
                 f"{pkgbase.Name}: {users_to_add}")

    return RedirectResponse(f"/pkgbase/{pkgbase.Name}",
                            status_code=HTTPStatus.SEE_OTHER)


@router.get("/pkgbase/{name}/request")
@requires_auth
async def pkgbase_request(request: Request, name: str):
    pkgbase = get_pkg_or_base(name, PackageBase)
    context = await make_variable_context(request, "Submit Request")
    context["pkgbase"] = pkgbase
    return render_template(request, "pkgbase/request.html", context)


@router.post("/pkgbase/{name}/request")
@requires_auth
async def pkgbase_request_post(request: Request, name: str,
                               type: str = Form(...),
                               merge_into: str = Form(default=None),
                               comments: str = Form(default=str())):
    pkgbase = get_pkg_or_base(name, PackageBase)

    # Create our render context.
    context = await make_variable_context(request, "Submit Request")
    context["pkgbase"] = pkgbase

    types = {
        "deletion": DELETION_ID,
        "merge": MERGE_ID,
        "orphan": ORPHAN_ID
    }

    if type not in types:
        # In the case that someone crafted a POST request with an invalid
        # type, just return them to the request form with BAD_REQUEST status.
        return render_template(request, "pkgbase/request.html", context,
                               status_code=HTTPStatus.BAD_REQUEST)

    try:
        validate.request(pkgbase, type, comments, merge_into, context)
    except ValidationError as exc:
        logger.error(f"Request Validation Error: {str(exc.data)}")
        context["errors"] = exc.data
        return render_template(request, "pkgbase/request.html", context)

    # All good. Create a new PackageRequest based on the given type.
    now = time.utcnow()
    with db.begin():
        pkgreq = db.create(PackageRequest,
                           ReqTypeID=types.get(type),
                           User=request.user,
                           RequestTS=now,
                           PackageBase=pkgbase,
                           PackageBaseName=pkgbase.Name,
                           MergeBaseName=merge_into,
                           Comments=comments,
                           ClosureComment=str())

    # Prepare notification object.
    notif = notify.RequestOpenNotification(
        request.user.ID, pkgreq.ID, type,
        pkgreq.PackageBase.ID, merge_into=merge_into or None)

    # Send the notification now that we're out of the DB scope.
    notif.send()

    auto_orphan_age = config.getint("options", "auto_orphan_age")
    auto_delete_age = config.getint("options", "auto_delete_age")

    ood_ts = pkgbase.OutOfDateTS or 0
    flagged = ood_ts and (now - ood_ts) >= auto_orphan_age
    is_maintainer = pkgbase.Maintainer == request.user
    outdated = (now - pkgbase.SubmittedTS) <= auto_delete_age

    if type == "orphan" and flagged:
        # This request should be auto-accepted.
        with db.begin():
            pkgbase.Maintainer = None
            pkgreq.Status = ACCEPTED_ID
        notif = notify.RequestCloseNotification(
            request.user.ID, pkgreq.ID, pkgreq.status_display())
        notif.send()
        logger.debug(f"New request #{pkgreq.ID} is marked for auto-orphan.")
    elif type == "deletion" and is_maintainer and outdated:
        # This request should be auto-accepted.
        notifs = actions.pkgbase_delete_instance(
            request, pkgbase, comments=comments)
        util.apply_all(notifs, lambda n: n.send())
        logger.debug(f"New request #{pkgreq.ID} is marked for auto-deletion.")

    # Redirect the submitting user to /packages.
    return RedirectResponse("/packages", status_code=HTTPStatus.SEE_OTHER)


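The auto-orphan branch in `pkgbase_request_post` hinges on simple epoch arithmetic: a base qualifies once it has been flagged out-of-date for at least `auto_orphan_age` seconds. A standalone sketch of that check, with hypothetical config values and timestamps (not aurweb code):

```python
import time

# Hypothetical stand-in for config.getint("options", "auto_orphan_age"),
# i.e. the out-of-date cutoff in seconds (6 hours here).
auto_orphan_age = 21600

now = int(time.time())
out_of_date_ts = now - 30000   # flagged out-of-date 30000 seconds ago

# Mirrors `ood_ts = pkgbase.OutOfDateTS or 0` and the age comparison:
ood_ts = out_of_date_ts or 0
flagged = bool(ood_ts) and (now - ood_ts) >= auto_orphan_age

# A base that was never flagged (OutOfDateTS is None/0) short-circuits
# to False before the subtraction runs.
never_flagged = bool(0) and (now - 0) >= auto_orphan_age
```

The `or 0` guard matters: it keeps the comparison well-defined when `OutOfDateTS` is `NULL`, and the truthiness test ensures un-flagged bases can never auto-orphan.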
@router.get("/pkgbase/{name}/delete")
@requires_auth
async def pkgbase_delete_get(request: Request, name: str):
    if not request.user.has_credential(creds.PKGBASE_DELETE):
        return RedirectResponse(f"/pkgbase/{name}",
                                status_code=HTTPStatus.SEE_OTHER)

    context = templates.make_context(request, "Package Deletion")
    context["pkgbase"] = get_pkg_or_base(name, PackageBase)
    return render_template(request, "pkgbase/delete.html", context)


@router.post("/pkgbase/{name}/delete")
|
||||||
|
@requires_auth
|
||||||
|
async def pkgbase_delete_post(request: Request, name: str,
|
||||||
|
confirm: bool = Form(default=False),
|
||||||
|
comments: str = Form(default=str())):
|
||||||
|
pkgbase = get_pkg_or_base(name, PackageBase)
|
||||||
|
|
||||||
|
if not request.user.has_credential(creds.PKGBASE_DELETE):
|
||||||
|
return RedirectResponse(f"/pkgbase/{name}",
|
||||||
|
status_code=HTTPStatus.SEE_OTHER)
|
||||||
|
|
||||||
|
if not confirm:
|
||||||
|
context = templates.make_context(request, "Package Deletion")
|
||||||
|
context["pkgbase"] = pkgbase
|
||||||
|
context["errors"] = [("The selected packages have not been deleted, "
|
||||||
|
"check the confirmation checkbox.")]
|
||||||
|
return render_template(request, "pkgbase/delete.html", context,
|
||||||
|
status_code=HTTPStatus.BAD_REQUEST)
|
||||||
|
|
||||||
|
if comments:
|
||||||
|
# Update any existing deletion requests' ClosureComment.
|
||||||
|
with db.begin():
|
||||||
|
requests = pkgbase.requests.filter(
|
||||||
|
and_(PackageRequest.Status == PENDING_ID,
|
||||||
|
PackageRequest.ReqTypeID == DELETION_ID)
|
||||||
|
)
|
||||||
|
for pkgreq in requests:
|
||||||
|
pkgreq.ClosureComment = comments
|
||||||
|
|
||||||
|
notifs = actions.pkgbase_delete_instance(
|
||||||
|
request, pkgbase, comments=comments)
|
||||||
|
util.apply_all(notifs, lambda n: n.send())
|
||||||
|
return RedirectResponse("/packages", status_code=HTTPStatus.SEE_OTHER)
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/pkgbase/{name}/merge")
|
||||||
|
@requires_auth
|
||||||
|
async def pkgbase_merge_get(request: Request, name: str,
|
||||||
|
into: str = Query(default=str()),
|
||||||
|
next: str = Query(default=str())):
|
||||||
|
pkgbase = get_pkg_or_base(name, PackageBase)
|
||||||
|
|
||||||
|
context = templates.make_context(request, "Package Merging")
|
||||||
|
context.update({
|
||||||
|
"pkgbase": pkgbase,
|
||||||
|
"into": into,
|
||||||
|
"next": next
|
||||||
|
})
|
||||||
|
|
||||||
|
status_code = HTTPStatus.OK
|
||||||
|
# TODO: Lookup errors from credential instead of hardcoding them.
|
||||||
|
# Idea: Something like credential_errors(creds.PKGBASE_MERGE).
|
||||||
|
# Perhaps additionally: bad_credential_status_code(creds.PKGBASE_MERGE).
|
||||||
|
# Don't take these examples verbatim. We should find good naming.
|
||||||
|
if not request.user.has_credential(creds.PKGBASE_MERGE):
|
||||||
|
context["errors"] = [
|
||||||
|
"Only Trusted Users and Developers can merge packages."]
|
||||||
|
status_code = HTTPStatus.UNAUTHORIZED
|
||||||
|
|
||||||
|
return render_template(request, "pkgbase/merge.html", context,
|
||||||
|
status_code=status_code)
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/pkgbase/{name}/merge")
|
||||||
|
@requires_auth
|
||||||
|
async def pkgbase_merge_post(request: Request, name: str,
|
||||||
|
into: str = Form(default=str()),
|
||||||
|
comments: str = Form(default=str()),
|
||||||
|
confirm: bool = Form(default=False),
|
||||||
|
next: str = Form(default=str())):
|
||||||
|
|
||||||
|
pkgbase = get_pkg_or_base(name, PackageBase)
|
||||||
|
context = await make_variable_context(request, "Package Merging")
|
||||||
|
context["pkgbase"] = pkgbase
|
||||||
|
|
||||||
|
# TODO: Lookup errors from credential instead of hardcoding them.
|
||||||
|
if not request.user.has_credential(creds.PKGBASE_MERGE):
|
||||||
|
context["errors"] = [
|
||||||
|
"Only Trusted Users and Developers can merge packages."]
|
||||||
|
return render_template(request, "pkgbase/merge.html", context,
|
||||||
|
status_code=HTTPStatus.UNAUTHORIZED)
|
||||||
|
|
||||||
|
if not confirm:
|
||||||
|
context["errors"] = ["The selected packages have not been deleted, "
|
||||||
|
"check the confirmation checkbox."]
|
||||||
|
return render_template(request, "pkgbase/merge.html", context,
|
||||||
|
status_code=HTTPStatus.BAD_REQUEST)
|
||||||
|
|
||||||
|
try:
|
||||||
|
target = get_pkg_or_base(into, PackageBase)
|
||||||
|
except HTTPException:
|
||||||
|
context["errors"] = [
|
||||||
|
"Cannot find package to merge votes and comments into."]
|
||||||
|
return render_template(request, "pkgbase/merge.html", context,
|
||||||
|
status_code=HTTPStatus.BAD_REQUEST)
|
||||||
|
|
||||||
|
if pkgbase == target:
|
||||||
|
context["errors"] = ["Cannot merge a package base with itself."]
|
||||||
|
return render_template(request, "pkgbase/merge.html", context,
|
||||||
|
status_code=HTTPStatus.BAD_REQUEST)
|
||||||
|
|
||||||
|
with db.begin():
|
||||||
|
update_closure_comment(pkgbase, MERGE_ID, comments, target=target)
|
||||||
|
|
||||||
|
# Merge pkgbase into target.
|
||||||
|
actions.pkgbase_merge_instance(request, pkgbase, target, comments=comments)
|
||||||
|
|
||||||
|
if not next:
|
||||||
|
next = f"/pkgbase/{target.Name}"
|
||||||
|
|
||||||
|
# Redirect to the newly merged into package.
|
||||||
|
return RedirectResponse(next, status_code=HTTPStatus.SEE_OTHER)
|
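The auto-accept conditions above reduce to plain timestamp arithmetic. A minimal standalone sketch of those two checks (the function names are illustrative, not part of aurweb; ages are in seconds, matching the `auto_orphan_age` and `auto_delete_age` config options):

```python
from typing import Optional


def should_auto_orphan(now: int, out_of_date_ts: Optional[int],
                       auto_orphan_age: int) -> bool:
    # Mirrors the router's check: the package base must have been
    # flagged out-of-date for at least auto_orphan_age seconds.
    ood_ts = out_of_date_ts or 0
    return bool(ood_ts and (now - ood_ts) >= auto_orphan_age)


def should_auto_delete(now: int, submitted_ts: int, is_maintainer: bool,
                       auto_delete_age: int) -> bool:
    # Mirrors the router's check: a maintainer's own deletion request is
    # auto-accepted only while the package is younger than auto_delete_age.
    return is_maintainer and (now - submitted_ts) <= auto_delete_age
```

An unflagged package (no `OutOfDateTS`) can never be auto-orphaned, and a deletion request from a non-maintainer always goes through normal review.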
aurweb/routers/requests.py (new file, 91 lines)
@@ -0,0 +1,91 @@
from http import HTTPStatus

from fastapi import APIRouter, Form, Query, Request
from fastapi.responses import RedirectResponse
from sqlalchemy import case

from aurweb import db, defaults, time, util
from aurweb.auth import creds, requires_auth
from aurweb.models import PackageRequest, User
from aurweb.models.package_request import PENDING_ID, REJECTED_ID
from aurweb.requests.util import get_pkgreq_by_id
from aurweb.scripts import notify
from aurweb.templates import make_context, render_template

router = APIRouter()


@router.get("/requests")
@requires_auth
async def requests(request: Request,
                   O: int = Query(default=defaults.O),
                   PP: int = Query(default=defaults.PP)):
    context = make_context(request, "Requests")

    context["q"] = dict(request.query_params)

    O, PP = util.sanitize_params(O, PP)
    context["O"] = O
    context["PP"] = PP

    # A PackageRequest query, with left inner joined User and RequestType.
    query = db.query(PackageRequest).join(
        User, User.ID == PackageRequest.UsersID)

    # If the request user is not elevated (TU or Dev), then
    # filter PackageRequests which are owned by the request user.
    if not request.user.is_elevated():
        query = query.filter(PackageRequest.UsersID == request.user.ID)

    context["total"] = query.count()
    context["results"] = query.order_by(
        # Order primarily by the Status column being PENDING_ID,
        # and secondarily by RequestTS; both in descending order.
        case([(PackageRequest.Status == PENDING_ID, 1)], else_=0).desc(),
        PackageRequest.RequestTS.desc()
    ).limit(PP).offset(O).all()

    return render_template(request, "requests.html", context)


@router.get("/requests/{id}/close")
@requires_auth
async def request_close(request: Request, id: int):

    pkgreq = get_pkgreq_by_id(id)
    if not request.user.is_elevated() and request.user != pkgreq.User:
        # Request user doesn't have permission here: redirect to '/'.
        return RedirectResponse("/", status_code=HTTPStatus.SEE_OTHER)

    context = make_context(request, "Close Request")
    context["pkgreq"] = pkgreq
    return render_template(request, "requests/close.html", context)


@router.post("/requests/{id}/close")
@requires_auth
async def request_close_post(request: Request, id: int,
                             comments: str = Form(default=str())):
    pkgreq = get_pkgreq_by_id(id)

    # `pkgreq`.User can close their own request.
    approved = [pkgreq.User]
    if not request.user.has_credential(creds.PKGREQ_CLOSE, approved=approved):
        # Request user doesn't have permission here: redirect to '/'.
        return RedirectResponse("/", status_code=HTTPStatus.SEE_OTHER)

    context = make_context(request, "Close Request")
    context["pkgreq"] = pkgreq

    now = time.utcnow()
    with db.begin():
        pkgreq.Closer = request.user
        pkgreq.ClosureComment = comments
        pkgreq.ClosedTS = now
        pkgreq.Status = REJECTED_ID

    notify_ = notify.RequestCloseNotification(
        request.user.ID, pkgreq.ID, pkgreq.status_display())
    notify_.send()

    return RedirectResponse("/requests", status_code=HTTPStatus.SEE_OTHER)
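The `case(...)` ordering above lists pending requests first and, within each group, newest first. The same two-level ordering expressed in plain Python over illustrative dicts (the sample data and the assumed `PENDING_ID` value are for demonstration only, not aurweb models):

```python
PENDING_ID = 0  # Assumed value for illustration only.

requests = [
    {"id": 1, "status": 2, "ts": 50},          # closed
    {"id": 2, "status": PENDING_ID, "ts": 10},  # pending
    {"id": 3, "status": PENDING_ID, "ts": 30},  # pending
    {"id": 4, "status": 1, "ts": 99},          # closed
]

# Primary key: Status == PENDING_ID (descending, so pending sorts first);
# secondary key: request timestamp, also descending.
ordered = sorted(requests,
                 key=lambda r: (r["status"] == PENDING_ID, r["ts"]),
                 reverse=True)
```

This yields pending requests 3 and 2 (newest pending first) ahead of closed requests 4 and 1, which is exactly what the SQL `case().desc()` plus `RequestTS.desc()` combination produces.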
aurweb/routers/rpc.py (new file, 127 lines)
@@ -0,0 +1,127 @@
import hashlib
import re

from http import HTTPStatus
from typing import List, Optional
from urllib.parse import unquote

import orjson

from fastapi import APIRouter, Query, Request, Response
from fastapi.responses import JSONResponse

from aurweb import defaults
from aurweb.ratelimit import check_ratelimit
from aurweb.rpc import RPC, documentation

router = APIRouter()


def parse_args(request: Request):
    """ Handle legacy logic of 'arg' and 'arg[]' query parameter handling.

    When 'arg' appears as the last argument given to the query string,
    that argument is used by itself as one single argument, regardless
    of any more 'arg' or 'arg[]' parameters supplied before it.

    When 'arg[]' appears as the last argument given to the query string,
    we iterate from last to first and build a list of arguments until
    we hit an 'arg'.

    TODO: This handling should be addressed in v6 of the RPC API. This
    was most likely a by-product of legacy handling of versions 1-4
    which we no longer support.

    :param request: FastAPI request
    :returns: List of deduced arguments
    """
    # Create a list of (key, value) pairs of the given 'arg' and 'arg[]'
    # query parameters from last to first.
    query = list(reversed(unquote(request.url.query).split("&")))
    parts = [
        e.split("=", 1) for e in query if e.startswith(("arg=", "arg[]="))
    ]

    args = []
    if parts:
        # If we found 'arg' and/or 'arg[]' arguments, we begin processing
        # the set of arguments depending on the last key found.
        last = parts[0][0]

        if last == "arg":
            # If the last key was 'arg', then it is our sole argument.
            args.append(parts[0][1])
        else:
            # Otherwise, it must be 'arg[]', so traverse backward
            # until we reach a non-'arg[]' key.
            for key, value in parts:
                if key != last:
                    break
                args.append(value)

    return args


JSONP_EXPR = re.compile(r'^[a-zA-Z0-9()_.]{1,128}$')


@router.get("/rpc")
async def rpc(request: Request,
              v: Optional[int] = Query(default=None),
              type: Optional[str] = Query(default=None),
              by: Optional[str] = Query(default=defaults.RPC_SEARCH_BY),
              arg: Optional[str] = Query(default=None),
              args: Optional[List[str]] = Query(default=[], alias="arg[]"),
              callback: Optional[str] = Query(default=None)):

    if not request.url.query:
        return documentation()

    # Create a handle to our RPC class.
    rpc = RPC(version=v, type=type)

    # If ratelimit was exceeded, return a 429 Too Many Requests.
    if check_ratelimit(request):
        return JSONResponse(rpc.error("Rate limit reached"),
                            status_code=int(HTTPStatus.TOO_MANY_REQUESTS))

    # If `callback` was provided, produce a text/javascript response
    # valid for the jsonp callback. Otherwise, by default, return
    # application/json containing `output`.
    content_type = "application/json"
    if callback:
        if not re.match(JSONP_EXPR, callback):
            return rpc.error("Invalid callback name.")

        content_type = "text/javascript"

    # Prepare list of arguments for input. If 'arg' was given, it'll
    # be a list with one element.
    arguments = parse_args(request)
    data = rpc.handle(by=by, args=arguments)

    # Serialize `data` into JSON in a sorted fashion. This way, our
    # ETag header produced below will never end up changed.
    content = orjson.dumps(data, option=orjson.OPT_SORT_KEYS)

    # Produce an md5 hash based on `content`.
    md5 = hashlib.md5()
    md5.update(content)
    etag = md5.hexdigest()

    # The ETag header expects quotes to surround any identifier.
    # https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/ETag
    headers = {
        "Content-Type": content_type,
        "ETag": f'"{etag}"'
    }

    if_none_match = request.headers.get("If-None-Match", str())
    if if_none_match and if_none_match.strip("\t\n\r\" ") == etag:
        return Response(headers=headers,
                        status_code=int(HTTPStatus.NOT_MODIFIED))

    if callback:
        content = f"/**/{callback}({content.decode()})"

    return Response(content, headers=headers)
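The `arg`/`arg[]` precedence that `parse_args` documents can be exercised standalone by feeding raw query strings to a helper that mirrors its body without needing a FastAPI `Request` (a sketch for illustration; `parse_args_from_query` is not an aurweb function):

```python
from urllib.parse import unquote


def parse_args_from_query(query_string: str) -> list:
    # Same logic as rpc.parse_args(), operating on the raw query string.
    query = list(reversed(unquote(query_string).split("&")))
    parts = [e.split("=", 1) for e in query
             if e.startswith(("arg=", "arg[]="))]
    args = []
    if parts:
        last = parts[0][0]
        if last == "arg":
            # A trailing 'arg' wins outright and becomes the sole argument.
            args.append(parts[0][1])
        else:
            # A trailing 'arg[]' collects values (last to first) until a
            # non-'arg[]' key is reached.
            for key, value in parts:
                if key != last:
                    break
                args.append(value)
    return args
```

For example, `?v=5&arg[]=a&arg[]=b&arg=c` yields only `["c"]`, while `?v=5&arg=a&arg[]=b&arg[]=c` yields `["c", "b"]` and drops the earlier `arg=a` — the legacy quirk the TODO proposes removing in RPC v6.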
aurweb/routers/rss.py (new file, 83 lines)
@@ -0,0 +1,83 @@
from datetime import datetime

from fastapi import APIRouter, Request
from fastapi.responses import Response
from feedgen.feed import FeedGenerator

from aurweb import db, filters
from aurweb.models import Package, PackageBase

router = APIRouter()


def make_rss_feed(request: Request, packages: list,
                  date_attr: str):
    """ Create an RSS Feed string for some packages.

    :param request: A FastAPI request
    :param packages: A list of packages to add to the RSS feed
    :param date_attr: The date attribute (DB column) to use
    :return: RSS Feed string
    """

    feed = FeedGenerator()
    feed.title("AUR Newest Packages")
    feed.description("The latest and greatest packages in the AUR")
    base = f"{request.url.scheme}://{request.url.netloc}"
    feed.link(href=base, rel="alternate")
    feed.link(href=f"{base}/rss", rel="self")
    feed.image(title="AUR Newest Packages",
               url=f"{base}/css/archnavbar/aurlogo.png",
               link=base,
               description="AUR Newest Packages Feed")

    for pkg in packages:
        entry = feed.add_entry(order="append")
        entry.title(pkg.Name)
        entry.link(href=f"{base}/packages/{pkg.Name}", rel="alternate")
        entry.link(href=f"{base}/rss", rel="self", type="application/rss+xml")
        entry.description(pkg.Description or str())

        attr = getattr(pkg.PackageBase, date_attr)
        dt = filters.timestamp_to_datetime(attr)
        dt = filters.as_timezone(dt, request.user.Timezone)
        entry.pubDate(dt.strftime("%Y-%m-%d %H:%M:%S%z"))

        entry.source(f"{base}")
        if pkg.PackageBase.Maintainer:
            entry.author(author={"name": pkg.PackageBase.Maintainer.Username})
        entry.guid(f"{pkg.Name} - {attr}")

    return feed.rss_str()


@router.get("/rss/")
async def rss(request: Request):
    packages = db.query(Package).join(PackageBase).order_by(
        PackageBase.SubmittedTS.desc()).limit(100)
    feed = make_rss_feed(request, packages, "SubmittedTS")

    response = Response(feed, media_type="application/rss+xml")
    package = packages.first()
    if package:
        dt = datetime.utcfromtimestamp(package.PackageBase.SubmittedTS)
        modified = dt.strftime("%a, %d %b %Y %H:%M:%S GMT")
        response.headers["Last-Modified"] = modified

    return response


@router.get("/rss/modified")
async def rss_modified(request: Request):
    packages = db.query(Package).join(PackageBase).order_by(
        PackageBase.ModifiedTS.desc()).limit(100)
    feed = make_rss_feed(request, packages, "ModifiedTS")

    response = Response(feed, media_type="application/rss+xml")
    package = packages.first()
    if package:
        dt = datetime.utcfromtimestamp(package.PackageBase.ModifiedTS)
        modified = dt.strftime("%a, %d %b %Y %H:%M:%S GMT")
        response.headers["Last-Modified"] = modified

    return response
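The `Last-Modified` value above is assembled by hand with `strftime`; the standard library can emit the RFC 1123 HTTP-date format directly, which avoids format-string mistakes such as using a numeric month. A sketch of that alternative (not how aurweb does it):

```python
from email.utils import formatdate


def http_date(timestamp: int) -> str:
    # RFC 1123 HTTP-date from a Unix timestamp, always rendered in GMT,
    # e.g. "Thu, 01 Jan 1970 00:00:00 GMT".
    return formatdate(timestamp, usegmt=True)
```

With this, the handler body would reduce to `response.headers["Last-Modified"] = http_date(package.PackageBase.SubmittedTS)`.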
@@ -1,6 +1,7 @@
 import time
 import uuid
 
+from http import HTTPStatus
 from urllib.parse import urlencode
 
 import fastapi
@@ -14,6 +15,7 @@ from starlette.requests import Request
 import aurweb.config
 import aurweb.db
 
+from aurweb import util
 from aurweb.l10n import get_translator_for_request
 from aurweb.schema import Bans, Sessions, Users
 
@@ -58,7 +60,8 @@ def open_session(request, conn, user_id):
     """
     if is_account_suspended(conn, user_id):
         _ = get_translator_for_request(request)
-        raise HTTPException(status_code=403, detail=_('Account suspended'))
+        raise HTTPException(status_code=HTTPStatus.FORBIDDEN,
+                            detail=_('Account suspended'))
     # TODO This is a terrible message because it could imply the attempt at
     # logging in just caused the suspension.
 
@@ -103,7 +106,7 @@ async def authenticate(request: Request, redirect: str = None, conn=Depends(aurw
     if is_ip_banned(conn, request.client.host):
         _ = get_translator_for_request(request)
         raise HTTPException(
-            status_code=403,
+            status_code=HTTPStatus.FORBIDDEN,
            detail=_('The login form is currently disabled for your IP address, '
                     'probably due to sustained spam attacks. Sorry for the '
                     'inconvenience.'))
@@ -116,13 +119,14 @@ async def authenticate(request: Request, redirect: str = None, conn=Depends(aurw
         # Let’s give attackers as little information as possible.
         _ = get_translator_for_request(request)
         raise HTTPException(
-            status_code=400,
+            status_code=HTTPStatus.BAD_REQUEST,
            detail=_('Bad OAuth token. Please retry logging in from the start.'))
 
     sub = user.get("sub")  # this is the SSO account ID in JWT terminology
     if not sub:
         _ = get_translator_for_request(request)
-        raise HTTPException(status_code=400, detail=_("JWT is missing its `sub` field."))
+        raise HTTPException(status_code=HTTPStatus.BAD_REQUEST,
+                            detail=_("JWT is missing its `sub` field."))
 
     aur_accounts = conn.execute(select([Users.c.ID]).where(Users.c.SSOAccountID == sub)) \
                        .fetchall()
@@ -131,14 +135,16 @@ async def authenticate(request: Request, redirect: str = None, conn=Depends(aurw
     elif len(aur_accounts) == 1:
         sid = open_session(request, conn, aur_accounts[0][Users.c.ID])
         response = RedirectResponse(redirect if redirect and is_aur_url(redirect) else "/")
+        secure_cookies = aurweb.config.getboolean("options", "disable_http_login")
         response.set_cookie(key="AURSID", value=sid, httponly=True,
-                            secure=request.url.scheme == "https")
+                            secure=secure_cookies)
         if "id_token" in token:
             # We save the id_token for the SSO logout. It’s not too important
             # though, so if we can’t find it, we can live without it.
-            response.set_cookie(key="SSO_ID_TOKEN", value=token["id_token"], path="/sso/",
-                                httponly=True, secure=request.url.scheme == "https")
-        return response
+            response.set_cookie(key="SSO_ID_TOKEN", value=token["id_token"],
+                                path="/sso/", httponly=True,
+                                secure=secure_cookies)
+        return util.add_samesite_fields(response, "strict")
     else:
         # We’ve got a severe integrity violation.
         raise Exception("Multiple accounts found for SSO account " + sub)
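The `util.add_samesite_fields(response, "strict")` call introduced above post-processes the response's `Set-Cookie` headers. A minimal sketch of the general idea (illustrative only — not aurweb's implementation, which operates on the response object itself):

```python
def add_samesite(set_cookie_headers: list, value: str = "strict") -> list:
    # Append a SameSite attribute to each Set-Cookie header value that
    # does not already carry one; leave existing attributes untouched.
    out = []
    for header in set_cookie_headers:
        if "samesite=" not in header.lower():
            header = f"{header}; SameSite={value}"
        out.append(header)
    return out
```

Combined with the new `secure_cookies` flag (driven by the `disable_http_login` option), session cookies end up marked `HttpOnly; Secure; SameSite=strict` when HTTP logins are disabled.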
aurweb/routers/trusted_user.py (new file, 311 lines)
@@ -0,0 +1,311 @@
import html
import typing

from http import HTTPStatus

from fastapi import APIRouter, Form, HTTPException, Request
from fastapi.responses import RedirectResponse, Response
from sqlalchemy import and_, or_

from aurweb import db, l10n, logging, models, time
from aurweb.auth import creds, requires_auth
from aurweb.models import User
from aurweb.models.account_type import TRUSTED_USER_AND_DEV_ID, TRUSTED_USER_ID
from aurweb.templates import make_context, make_variable_context, render_template

router = APIRouter()
logger = logging.get_logger(__name__)

# Some TU route specific constants.
ITEMS_PER_PAGE = 10  # Paged table size.
MAX_AGENDA_LENGTH = 75  # Agenda table column length.

ADDVOTE_SPECIFICS = {
    # This dict stores a vote duration and quorum for a proposal.
    # When a proposal is added, duration is added to the current
    # timestamp.
    # "addvote_type": (duration, quorum)
    "add_tu": (7 * 24 * 60 * 60, 0.66),
    "remove_tu": (7 * 24 * 60 * 60, 0.75),
    "remove_inactive_tu": (5 * 24 * 60 * 60, 0.66),
    "bylaws": (7 * 24 * 60 * 60, 0.75)
}


@router.get("/tu")
@requires_auth
async def trusted_user(request: Request,
                       coff: int = 0,  # current offset
                       cby: str = "desc",  # current by
                       poff: int = 0,  # past offset
                       pby: str = "desc"):  # past by
    if not request.user.has_credential(creds.TU_LIST_VOTES):
        return RedirectResponse("/", status_code=HTTPStatus.SEE_OTHER)

    context = make_context(request, "Trusted User")

    current_by, past_by = cby, pby
    current_off, past_off = coff, poff

    context["pp"] = pp = ITEMS_PER_PAGE
    context["prev_len"] = MAX_AGENDA_LENGTH

    ts = time.utcnow()

    if current_by not in {"asc", "desc"}:
        # If a malicious by was given, default to desc.
        current_by = "desc"
    context["current_by"] = current_by

    if past_by not in {"asc", "desc"}:
        # If a malicious by was given, default to desc.
        past_by = "desc"
    context["past_by"] = past_by

    current_votes = db.query(models.TUVoteInfo).filter(
        models.TUVoteInfo.End > ts).order_by(
        models.TUVoteInfo.Submitted.desc())
    context["current_votes_count"] = current_votes.count()
    current_votes = current_votes.limit(pp).offset(current_off)
    context["current_votes"] = reversed(current_votes.all()) \
        if current_by == "asc" else current_votes.all()
    context["current_off"] = current_off

    past_votes = db.query(models.TUVoteInfo).filter(
        models.TUVoteInfo.End <= ts).order_by(
        models.TUVoteInfo.Submitted.desc())
    context["past_votes_count"] = past_votes.count()
    past_votes = past_votes.limit(pp).offset(past_off)
    context["past_votes"] = reversed(past_votes.all()) \
        if past_by == "asc" else past_votes.all()
    context["past_off"] = past_off

    # TODO
    # We order last votes by TUVote.VoteID and User.Username.
    # This is really bad. We should add a Created column to
    # TUVote of type Timestamp and order by that instead.
    last_votes_by_tu = db.query(models.TUVote).filter(
        and_(models.TUVote.VoteID == models.TUVoteInfo.ID,
             models.TUVoteInfo.End <= ts,
             models.TUVote.UserID == models.User.ID,
             or_(models.User.AccountTypeID == 2,
                 models.User.AccountTypeID == 4))
    ).group_by(models.User.ID).order_by(
        models.TUVote.VoteID.desc(), models.User.Username.asc())
    context["last_votes_by_tu"] = last_votes_by_tu.all()

    context["current_by_next"] = "asc" if current_by == "desc" else "desc"
    context["past_by_next"] = "asc" if past_by == "desc" else "desc"

    context["q"] = {
        "coff": current_off,
        "cby": current_by,
        "poff": past_off,
        "pby": past_by
    }

    return render_template(request, "tu/index.html", context)


def render_proposal(request: Request, context: dict, proposal: int,
                    voteinfo: models.TUVoteInfo,
                    voters: typing.Iterable[models.User],
                    vote: models.TUVote,
                    status_code: HTTPStatus = HTTPStatus.OK):
    """ Render a single TU proposal. """
    context["proposal"] = proposal
    context["voteinfo"] = voteinfo
    context["voters"] = voters.all()

    total = voteinfo.total_votes()
    participation = (total / voteinfo.ActiveTUs) if total else 0
    context["participation"] = participation

    accepted = (voteinfo.Yes > voteinfo.ActiveTUs / 2) or \
        (participation > voteinfo.Quorum and voteinfo.Yes > voteinfo.No)
    context["accepted"] = accepted

    can_vote = voters.filter(models.TUVote.User == request.user).first() is None
    context["can_vote"] = can_vote

    if not voteinfo.is_running():
        context["error"] = "Voting is closed for this proposal."

    context["vote"] = vote
    context["has_voted"] = vote is not None

    return render_template(request, "tu/show.html", context,
                           status_code=status_code)
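The acceptance rule in `render_proposal` — an absolute majority of active TUs, or quorum reached with more yes than no votes — distilled into a standalone check (an illustrative sketch; in aurweb the vote total comes from `voteinfo.total_votes()`):

```python
def proposal_accepted(yes: int, no: int, active_tus: int,
                      quorum: float) -> bool:
    # Participation is the fraction of active TUs who cast a vote.
    # (Sketch simplification: abstentions are omitted from the total here.)
    total = yes + no
    participation = (total / active_tus) if total else 0
    # Accepted if yes votes exceed half of all active TUs outright, or
    # if quorum was exceeded and yes votes outnumber no votes.
    return (yes > active_tus / 2) or \
        (participation > quorum and yes > no)
```

So with 10 active TUs and a 0.66 quorum (the `add_tu` setting above), 6 yes votes accept a proposal regardless of turnout, while a 3-2 result fails both branches.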
|
||||||
|
|
||||||
|
@router.get("/tu/{proposal}")
|
||||||
|
@requires_auth
|
||||||
|
async def trusted_user_proposal(request: Request, proposal: int):
|
||||||
|
if not request.user.has_credential(creds.TU_LIST_VOTES):
|
||||||
|
return RedirectResponse("/tu", status_code=HTTPStatus.SEE_OTHER)
|
||||||
|
|
||||||
|
context = await make_variable_context(request, "Trusted User")
|
||||||
|
proposal = int(proposal)
|
||||||
|
|
||||||
|
voteinfo = db.query(models.TUVoteInfo).filter(
|
||||||
|
models.TUVoteInfo.ID == proposal).first()
|
||||||
|
if not voteinfo:
|
||||||
|
raise HTTPException(status_code=HTTPStatus.NOT_FOUND)
|
||||||
|
|
||||||
|
voters = db.query(models.User).join(models.TUVote).filter(
|
||||||
|
models.TUVote.VoteID == voteinfo.ID)
|
||||||
|
vote = db.query(models.TUVote).filter(
|
||||||
|
and_(models.TUVote.UserID == request.user.ID,
|
||||||
|
models.TUVote.VoteID == voteinfo.ID)).first()
|
||||||
|
if not request.user.has_credential(creds.TU_VOTE):
|
||||||
|
context["error"] = "Only Trusted Users are allowed to vote."
|
||||||
|
if voteinfo.User == request.user.Username:
|
||||||
|
context["error"] = "You cannot vote in an proposal about you."
|
||||||
|
elif vote is not None:
|
||||||
|
context["error"] = "You've already voted for this proposal."
|
||||||
|
|
||||||
|
context["vote"] = vote
|
||||||
|
return render_proposal(request, context, proposal, voteinfo, voters, vote)
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/tu/{proposal}")
@requires_auth
async def trusted_user_proposal_post(request: Request, proposal: int,
                                     decision: str = Form(...)):
    if not request.user.has_credential(creds.TU_LIST_VOTES):
        return RedirectResponse("/tu", status_code=HTTPStatus.SEE_OTHER)

    context = await make_variable_context(request, "Trusted User")
    proposal = int(proposal)  # Make sure it's an int.

    voteinfo = db.query(models.TUVoteInfo).filter(
        models.TUVoteInfo.ID == proposal).first()
    if not voteinfo:
        raise HTTPException(status_code=HTTPStatus.NOT_FOUND)

    voters = db.query(models.User).join(models.TUVote).filter(
        models.TUVote.VoteID == voteinfo.ID)
    vote = db.query(models.TUVote).filter(
        and_(models.TUVote.UserID == request.user.ID,
             models.TUVote.VoteID == voteinfo.ID)).first()

    status_code = HTTPStatus.OK
    if not request.user.has_credential(creds.TU_VOTE):
        context["error"] = "Only Trusted Users are allowed to vote."
        status_code = HTTPStatus.UNAUTHORIZED
    elif voteinfo.User == request.user.Username:
        context["error"] = "You cannot vote in a proposal about you."
        status_code = HTTPStatus.BAD_REQUEST
    elif vote is not None:
        context["error"] = "You've already voted for this proposal."
        status_code = HTTPStatus.BAD_REQUEST

    if status_code != HTTPStatus.OK:
        return render_proposal(request, context, proposal,
                               voteinfo, voters, vote,
                               status_code=status_code)

    if decision in {"Yes", "No", "Abstain"}:
        # Increment whichever decision was given to us.
        setattr(voteinfo, decision, getattr(voteinfo, decision) + 1)
    else:
        return Response("Invalid 'decision' value.",
                        status_code=HTTPStatus.BAD_REQUEST)

    with db.begin():
        vote = db.create(models.TUVote, User=request.user, VoteInfo=voteinfo)
        voteinfo.ActiveTUs += 1

    context["error"] = "You've already voted for this proposal."
    return render_proposal(request, context, proposal, voteinfo, voters, vote)


@router.get("/addvote")
@requires_auth
async def trusted_user_addvote(request: Request, user: str = str(),
                               type: str = "add_tu", agenda: str = str()):
    if not request.user.has_credential(creds.TU_ADD_VOTE):
        return RedirectResponse("/tu", status_code=HTTPStatus.SEE_OTHER)

    context = await make_variable_context(request, "Add Proposal")

    if type not in ADDVOTE_SPECIFICS:
        context["error"] = "Invalid type."
        type = "add_tu"  # Default it.

    context["user"] = user
    context["type"] = type
    context["agenda"] = agenda

    return render_template(request, "addvote.html", context)


@router.post("/addvote")
@requires_auth
async def trusted_user_addvote_post(request: Request,
                                    user: str = Form(default=str()),
                                    type: str = Form(default=str()),
                                    agenda: str = Form(default=str())):
    if not request.user.has_credential(creds.TU_ADD_VOTE):
        return RedirectResponse("/tu", status_code=HTTPStatus.SEE_OTHER)

    # Build a context.
    context = await make_variable_context(request, "Add Proposal")

    context["type"] = type
    context["user"] = user
    context["agenda"] = agenda

    def render_addvote(context, status_code):
        """ Simplify render_template a bit for this route. """
        return render_template(request, "addvote.html", context, status_code)

    # Alright, get some database records, if we can.
    if type != "bylaws":
        user_record = db.query(models.User).filter(
            models.User.Username == user).first()
        if user_record is None:
            context["error"] = "Username does not exist."
            return render_addvote(context, HTTPStatus.NOT_FOUND)

        voteinfo = db.query(models.TUVoteInfo).filter(
            models.TUVoteInfo.User == user).count()
        if voteinfo:
            _ = l10n.get_translator_for_request(request)
            context["error"] = _(
                "%s already has a proposal running for them.") % (
                html.escape(user),)
            return render_addvote(context, HTTPStatus.BAD_REQUEST)

    if type not in ADDVOTE_SPECIFICS:
        context["error"] = "Invalid type."
        context["type"] = type = "add_tu"  # Default for rendering.
        return render_addvote(context, HTTPStatus.BAD_REQUEST)

    if not agenda:
        context["error"] = "Proposal cannot be empty."
        return render_addvote(context, HTTPStatus.BAD_REQUEST)

    # Gather some mapped constants and the current timestamp.
    duration, quorum = ADDVOTE_SPECIFICS.get(type)
    timestamp = time.utcnow()

    # Active TU types we filter for.
    types = {TRUSTED_USER_ID, TRUSTED_USER_AND_DEV_ID}

    # Create a new TUVoteInfo (proposal)!
    with db.begin():
        active_tus = db.query(User).filter(
            and_(User.Suspended == 0,
                 User.InactivityTS.isnot(None),
                 User.AccountTypeID.in_(types))
        ).count()
        voteinfo = db.create(models.TUVoteInfo, User=user,
                             Agenda=html.escape(agenda),
                             Submitted=timestamp, End=(timestamp + duration),
                             Quorum=quorum, ActiveTUs=active_tus,
                             Submitter=request.user)

    # Redirect to the new proposal.
    endpoint = f"/tu/{voteinfo.ID}"
    return RedirectResponse(endpoint, status_code=HTTPStatus.SEE_OTHER)
389	aurweb/rpc.py	Normal file
@ -0,0 +1,389 @@
import os

from collections import defaultdict
from typing import Any, Callable, Dict, List, NewType, Union

from fastapi.responses import HTMLResponse
from sqlalchemy import and_, literal, orm

import aurweb.config as config

from aurweb import db, defaults, models
from aurweb.exceptions import RPCError
from aurweb.filters import number_format
from aurweb.packages.search import RPCSearch

TYPE_MAPPING = {
    "depends": "Depends",
    "makedepends": "MakeDepends",
    "checkdepends": "CheckDepends",
    "optdepends": "OptDepends",
    "conflicts": "Conflicts",
    "provides": "Provides",
    "replaces": "Replaces",
}

DataGenerator = NewType("DataGenerator",
                        Callable[[models.Package], Dict[str, Any]])


def documentation():
    aurwebdir = config.get("options", "aurwebdir")
    rpc_doc = os.path.join(aurwebdir, "doc", "rpc.html")

    if not os.path.exists(rpc_doc):
        raise OSError("doc/rpc.html could not be read")

    with open(rpc_doc) as f:
        data = f.read()
    return HTMLResponse(data)


class RPC:
    """ RPC API handler class.

    There are various pieces to RPC's process, and encapsulating them
    inside of a class means that external users do not abuse the
    RPC implementation to achieve goals. We call type handlers
    by taking a reference to the callback named "_handle_{type}_type(...)",
    and if the handler does not exist, we return a not implemented
    error to the API user.

    EXPOSED_VERSIONS holds the set of versions that the API
    officially supports.

    EXPOSED_TYPES holds the set of types that the API officially
    supports.

    ALIASES holds an alias mapping of type -> type strings.

    We should focus on privatizing implementation helpers and
    focusing on performance in the code used.
    """

    # A set of RPC versions supported by this API.
    EXPOSED_VERSIONS = {5}

    # A set of RPC types supported by this API.
    EXPOSED_TYPES = {
        "info", "multiinfo",
        "search", "msearch",
        "suggest", "suggest-pkgbase"
    }

    # A mapping of type aliases.
    TYPE_ALIASES = {"info": "multiinfo"}

    EXPOSED_BYS = {
        "name-desc", "name", "maintainer",
        "depends", "makedepends", "optdepends", "checkdepends"
    }

    # A mapping of by aliases.
    BY_ALIASES = {"name-desc": "nd", "name": "n", "maintainer": "m"}

    def __init__(self, version: int = 0, type: str = None) -> "RPC":
        self.version = version
        self.type = RPC.TYPE_ALIASES.get(type, type)

    def error(self, message: str) -> Dict[str, Any]:
        return {
            "version": self.version,
            "results": [],
            "resultcount": 0,
            "type": "error",
            "error": message
        }

    def _verify_inputs(self, by: str = [], args: List[str] = []) -> None:
        if self.version is None:
            raise RPCError("Please specify an API version.")

        if self.version not in RPC.EXPOSED_VERSIONS:
            raise RPCError("Invalid version specified.")

        if by not in RPC.EXPOSED_BYS:
            raise RPCError("Incorrect by field specified.")

        if self.type is None:
            raise RPCError("No request type/data specified.")

        if self.type not in RPC.EXPOSED_TYPES:
            raise RPCError("Incorrect request type specified.")

    def _enforce_args(self, args: List[str]) -> None:
        if not args:
            raise RPCError("No request type/data specified.")

    def _get_json_data(self, package: models.Package) -> Dict[str, Any]:
        """ Produce dictionary data of one Package that can be JSON-serialized.

        :param package: Package instance
        :returns: JSON-serializable dictionary
        """

        # Produce RPC API compatible Popularity: If zero, it's an integer
        # 0, otherwise, it's formatted to the 6th decimal place.
        pop = package.Popularity
        pop = 0 if not pop else float(number_format(pop, 6))

        snapshot_uri = config.get("options", "snapshot_uri")
        return {
            "ID": package.ID,
            "Name": package.Name,
            "PackageBaseID": package.PackageBaseID,
            "PackageBase": package.PackageBaseName,
            # Maintainer should be set following this update if one exists.
            "Maintainer": package.Maintainer,
            "Version": package.Version,
            "Description": package.Description,
            "URL": package.URL,
            "URLPath": snapshot_uri % package.Name,
            "NumVotes": package.NumVotes,
            "Popularity": pop,
            "OutOfDate": package.OutOfDateTS,
            "FirstSubmitted": package.SubmittedTS,
            "LastModified": package.ModifiedTS
        }

    def _get_info_json_data(self, package: models.Package) -> Dict[str, Any]:
        data = self._get_json_data(package)

        # All info results have _at least_ an empty list of
        # License and Keywords.
        data.update({
            "License": [],
            "Keywords": []
        })

        # If we actually got extra_info records, update data with
        # them for this particular package.
        if self.extra_info:
            data.update(self.extra_info.get(package.ID, {}))

        return data

    def _assemble_json_data(self, packages: List[models.Package],
                            data_generator: DataGenerator) \
            -> List[Dict[str, Any]]:
        """
        Assemble JSON data out of a list of packages.

        :param packages: A list of Package instances or a Package ORM query
        :param data_generator: Generator callable of single-Package JSON data
        """
        return [data_generator(pkg) for pkg in packages]

    def _entities(self, query: orm.Query) -> orm.Query:
        """ Select specific RPC columns on `query`. """
        return query.with_entities(
            models.Package.ID,
            models.Package.Name,
            models.Package.Version,
            models.Package.Description,
            models.Package.URL,
            models.Package.PackageBaseID,
            models.PackageBase.Name.label("PackageBaseName"),
            models.PackageBase.NumVotes,
            models.PackageBase.Popularity,
            models.PackageBase.OutOfDateTS,
            models.PackageBase.SubmittedTS,
            models.PackageBase.ModifiedTS,
            models.User.Username.label("Maintainer"),
        ).group_by(models.Package.ID)

    def _handle_multiinfo_type(self, args: List[str] = [], **kwargs) \
            -> List[Dict[str, Any]]:
        self._enforce_args(args)
        args = set(args)

        packages = db.query(models.Package).join(models.PackageBase).join(
            models.User,
            models.User.ID == models.PackageBase.MaintainerUID,
            isouter=True
        ).filter(models.Package.Name.in_(args))
        packages = self._entities(packages)

        ids = {pkg.ID for pkg in packages}

        # Aliases for 80-width.
        Package = models.Package
        PackageKeyword = models.PackageKeyword

        subqueries = [
            # PackageDependency
            db.query(
                models.PackageDependency
            ).join(models.DependencyType).filter(
                models.PackageDependency.PackageID.in_(ids)
            ).with_entities(
                models.PackageDependency.PackageID.label("ID"),
                models.DependencyType.Name.label("Type"),
                models.PackageDependency.DepName.label("Name"),
                models.PackageDependency.DepCondition.label("Cond")
            ).distinct().order_by("Name"),

            # PackageRelation
            db.query(
                models.PackageRelation
            ).join(models.RelationType).filter(
                models.PackageRelation.PackageID.in_(ids)
            ).with_entities(
                models.PackageRelation.PackageID.label("ID"),
                models.RelationType.Name.label("Type"),
                models.PackageRelation.RelName.label("Name"),
                models.PackageRelation.RelCondition.label("Cond")
            ).distinct().order_by("Name"),

            # Groups
            db.query(models.PackageGroup).join(
                models.Group,
                and_(models.PackageGroup.GroupID == models.Group.ID,
                     models.PackageGroup.PackageID.in_(ids))
            ).with_entities(
                models.PackageGroup.PackageID.label("ID"),
                literal("Groups").label("Type"),
                models.Group.Name.label("Name"),
                literal(str()).label("Cond")
            ).distinct().order_by("Name"),

            # Licenses
            db.query(models.PackageLicense).join(
                models.License,
                models.PackageLicense.LicenseID == models.License.ID
            ).filter(
                models.PackageLicense.PackageID.in_(ids)
            ).with_entities(
                models.PackageLicense.PackageID.label("ID"),
                literal("License").label("Type"),
                models.License.Name.label("Name"),
                literal(str()).label("Cond")
            ).distinct().order_by("Name"),

            # Keywords
            db.query(models.PackageKeyword).join(
                models.Package,
                and_(Package.PackageBaseID == PackageKeyword.PackageBaseID,
                     Package.ID.in_(ids))
            ).with_entities(
                models.Package.ID.label("ID"),
                literal("Keywords").label("Type"),
                models.PackageKeyword.Keyword.label("Name"),
                literal(str()).label("Cond")
            ).distinct().order_by("Name")
        ]

        # Union all subqueries together.
        query = subqueries[0].union_all(*subqueries[1:])

        # Store our extra information in a class-wise dictionary,
        # which contains package id -> extra info dict mappings.
        self.extra_info = defaultdict(lambda: defaultdict(list))
        for record in query:
            type_ = TYPE_MAPPING.get(record.Type, record.Type)

            name = record.Name
            if record.Cond:
                name += record.Cond

            self.extra_info[record.ID][type_].append(name)

        return self._assemble_json_data(packages, self._get_info_json_data)

    def _handle_search_type(self, by: str = defaults.RPC_SEARCH_BY,
                            args: List[str] = []) -> List[Dict[str, Any]]:
        # If `by` isn't maintainer and we don't have any args, raise an error.
        # In maintainer's case, return all orphans if there are no args,
        # so we need args to pass through to the handler without errors.
        if by != "m" and not len(args):
            raise RPCError("No request type/data specified.")

        arg = args[0] if args else str()
        if by != "m" and len(arg) < 2:
            raise RPCError("Query arg too small.")

        search = RPCSearch()
        search.search_by(by, arg)

        max_results = config.getint("options", "max_rpc_results")
        results = self._entities(search.results()).limit(max_results)
        return self._assemble_json_data(results, self._get_json_data)

    def _handle_msearch_type(self, args: List[str] = [], **kwargs)\
            -> List[Dict[str, Any]]:
        return self._handle_search_type(by="m", args=args)

    def _handle_suggest_type(self, args: List[str] = [], **kwargs)\
            -> List[str]:
        if not args:
            return []

        arg = args[0]
        packages = db.query(models.Package.Name).join(
            models.PackageBase
        ).filter(
            and_(models.PackageBase.PackagerUID.isnot(None),
                 models.Package.Name.like(f"%{arg}%"))
        ).order_by(models.Package.Name.asc()).limit(20)
        return [pkg.Name for pkg in packages]

    def _handle_suggest_pkgbase_type(self, args: List[str] = [], **kwargs)\
            -> List[str]:
        if not args:
            return []

        packages = db.query(models.PackageBase.Name).filter(
            and_(models.PackageBase.PackagerUID.isnot(None),
                 models.PackageBase.Name.like(f"%{args[0]}%"))
        ).order_by(models.PackageBase.Name.asc()).limit(20)
        return [pkg.Name for pkg in packages]

    def _is_suggestion(self) -> bool:
        return self.type.startswith("suggest")

    def _handle_callback(self, by: str, args: List[str])\
            -> Union[List[Dict[str, Any]], List[str]]:
        # Get a handle to our callback and trap an RPCError with
        # an empty list of results based on callback's execution.
        callback = getattr(self, f"_handle_{self.type.replace('-', '_')}_type")
        results = callback(by=by, args=args)
        return results

    def handle(self, by: str = defaults.RPC_SEARCH_BY, args: List[str] = [])\
            -> Union[List[Dict[str, Any]], Dict[str, Any]]:
        """ Request entrypoint. A router should pass v, type and args
        to this function and expect an output dictionary to be returned.

        :param v: RPC version argument
        :param type: RPC type argument
        :param args: Deciphered list of arguments based on arg/arg[] inputs
        """
        # Prepare our output data dictionary with some basic keys.
        data = {"version": self.version, "type": self.type}

        # Run some verification on our given arguments.
        try:
            self._verify_inputs(by=by, args=args)
        except RPCError as exc:
            return self.error(str(exc))

        # Convert by to its aliased value if it has one.
        by = RPC.BY_ALIASES.get(by, by)

        # Process the requested handler.
        try:
            results = self._handle_callback(by, args)
        except RPCError as exc:
            return self.error(str(exc))

        # These types are special: we produce a different kind of
        # successful JSON output: a list of results.
        if self._is_suggestion():
            return results

        # Return JSON output.
        data.update({
            "resultcount": len(results),
            "results": results
        })
        return data

@ -16,13 +16,13 @@ db_backend = aurweb.config.get("database", "backend")
 
 
 @compiles(TINYINT, 'sqlite')
-def compile_tinyint_sqlite(type_, compiler, **kw):
+def compile_tinyint_sqlite(type_, compiler, **kw):  # pragma: no cover
     """TINYINT is not supported on SQLite. Substitute it with INTEGER."""
     return 'INTEGER'
 
 
 @compiles(BIGINT, 'sqlite')
-def compile_bigint_sqlite(type_, compiler, **kw):
+def compile_bigint_sqlite(type_, compiler, **kw):  # pragma: no cover
     """
     For SQLite's AUTOINCREMENT to work on BIGINT columns, we need to map BIGINT
     to INTEGER. Aside from that, BIGINT is the same as INTEGER for SQLite.
@ -107,7 +107,10 @@ PackageBases = Table(
     Column('ID', INTEGER(unsigned=True), primary_key=True),
     Column('Name', String(255), nullable=False, unique=True),
     Column('NumVotes', INTEGER(unsigned=True), nullable=False, server_default=text("0")),
-    Column('Popularity', DECIMAL(10, 6, unsigned=True), nullable=False, server_default=text("0")),
+    Column('Popularity',
+           DECIMAL(10, 6, unsigned=True)
+           if db_backend == "mysql" else String(17),
+           nullable=False, server_default=text("0")),
     Column('OutOfDateTS', BIGINT(unsigned=True)),
     Column('FlaggerComment', Text, nullable=False),
     Column('SubmittedTS', BIGINT(unsigned=True), nullable=False),
@ -130,7 +133,7 @@ PackageBases = Table(
 # Keywords of package bases
 PackageKeywords = Table(
     'PackageKeywords', metadata,
-    Column('PackageBaseID', ForeignKey('PackageBases.ID', ondelete='CASCADE'), primary_key=True, nullable=False),
+    Column('PackageBaseID', ForeignKey('PackageBases.ID', ondelete='CASCADE'), primary_key=True, nullable=True),
     Column('Keyword', String(255), primary_key=True, nullable=False, server_default=text("''")),
     mysql_engine='InnoDB',
     mysql_charset='utf8mb4',
@ -167,8 +170,8 @@ Licenses = Table(
 # Information about package-license-relations
 PackageLicenses = Table(
     'PackageLicenses', metadata,
-    Column('PackageID', ForeignKey('Packages.ID', ondelete='CASCADE'), primary_key=True, nullable=False),
-    Column('LicenseID', ForeignKey('Licenses.ID', ondelete='CASCADE'), primary_key=True, nullable=False),
+    Column('PackageID', ForeignKey('Packages.ID', ondelete='CASCADE'), primary_key=True, nullable=True),
+    Column('LicenseID', ForeignKey('Licenses.ID', ondelete='CASCADE'), primary_key=True, nullable=True),
     mysql_engine='InnoDB',
 )

@ -187,8 +190,8 @@ Groups = Table(
 # Information about package-group-relations
 PackageGroups = Table(
     'PackageGroups', metadata,
-    Column('PackageID', ForeignKey('Packages.ID', ondelete='CASCADE'), primary_key=True, nullable=False),
-    Column('GroupID', ForeignKey('Groups.ID', ondelete='CASCADE'), primary_key=True, nullable=False),
+    Column('PackageID', ForeignKey('Packages.ID', ondelete='CASCADE'), primary_key=True, nullable=True),
+    Column('GroupID', ForeignKey('Groups.ID', ondelete='CASCADE'), primary_key=True, nullable=True),
     mysql_engine='InnoDB',
 )

@ -383,12 +386,15 @@ TU_VoteInfo = Table(
     Column('User', String(32), nullable=False),
     Column('Submitted', BIGINT(unsigned=True), nullable=False),
     Column('End', BIGINT(unsigned=True), nullable=False),
-    Column('Quorum', DECIMAL(2, 2, unsigned=True), nullable=False),
+    Column('Quorum',
+           DECIMAL(2, 2, unsigned=True)
+           if db_backend == "mysql" else String(5),
+           nullable=False),
     Column('SubmitterID', ForeignKey('Users.ID', ondelete='CASCADE'), nullable=False),
-    Column('Yes', TINYINT(3, unsigned=True), nullable=False, server_default=text("'0'")),
-    Column('No', TINYINT(3, unsigned=True), nullable=False, server_default=text("'0'")),
-    Column('Abstain', TINYINT(3, unsigned=True), nullable=False, server_default=text("'0'")),
-    Column('ActiveTUs', TINYINT(3, unsigned=True), nullable=False, server_default=text("'0'")),
+    Column('Yes', INTEGER(unsigned=True), nullable=False, server_default=text("'0'")),
+    Column('No', INTEGER(unsigned=True), nullable=False, server_default=text("'0'")),
+    Column('Abstain', INTEGER(unsigned=True), nullable=False, server_default=text("'0'")),
+    Column('ActiveTUs', INTEGER(unsigned=True), nullable=False, server_default=text("'0'")),
     mysql_engine='InnoDB',
     mysql_charset='utf8mb4',
     mysql_collate='utf8mb4_general_ci',
@ -441,7 +447,7 @@ AcceptedTerms = Table(
 # Rate limits for API
 ApiRateLimit = Table(
     'ApiRateLimit', metadata,
-    Column('IP', String(45), primary_key=True),
+    Column('IP', String(45), primary_key=True, unique=True, default=str()),
     Column('Requests', INTEGER(11), nullable=False),
     Column('WindowStart', BIGINT(20), nullable=False),
     Index('ApiRateLimitWindowStart', 'WindowStart'),

74	aurweb/scripts/adduser.py	Normal file
@ -0,0 +1,74 @@
"""
Add a user to the configured aurweb database.

See `aurweb-adduser --help` for documentation.

Copyright (C) 2022 aurweb Development Team
All Rights Reserved
"""
import argparse
import sys
import traceback

import aurweb.models.account_type as at

from aurweb import db
from aurweb.models.account_type import AccountType
from aurweb.models.ssh_pub_key import SSHPubKey, get_fingerprint
from aurweb.models.user import User


def parse_args():
    parser = argparse.ArgumentParser(description="aurweb-adduser options")

    parser.add_argument("-u", "--username", help="Username", required=True)
    parser.add_argument("-e", "--email", help="Email", required=True)
    parser.add_argument("-p", "--password", help="Password", required=True)
    parser.add_argument("-r", "--realname", help="Real Name")
    parser.add_argument("-i", "--ircnick", help="IRC Nick")
    parser.add_argument("--pgp-key", help="PGP Key Fingerprint")
    parser.add_argument("--ssh-pubkey", help="SSH PubKey")

    choices = at.ACCOUNT_TYPE_NAME.values()
    parser.add_argument("-t", "--type", help="Account Type",
                        choices=choices, default=at.USER)

    return parser.parse_args()


def main():
    args = parse_args()

    db.get_engine()
    type = db.query(AccountType,
                    AccountType.AccountType == args.type).first()
    with db.begin():
        user = db.create(User, Username=args.username,
                         Email=args.email, Passwd=args.password,
                         RealName=args.realname, IRCNick=args.ircnick,
                         PGPKey=args.pgp_key, AccountType=type)

    if args.ssh_pubkey:
        pubkey = args.ssh_pubkey.strip()

        # Remove host from the pubkey if it's there.
        pubkey = ' '.join(pubkey.split(' ')[:2])

        with db.begin():
            db.create(SSHPubKey,
                      User=user,
                      PubKey=pubkey,
                      Fingerprint=get_fingerprint(pubkey))

    print(user.json())
    return 0


if __name__ == "__main__":
    e = 1
    try:
        e = main()
    except Exception:
        traceback.print_exc()
        e = 1
    sys.exit(e)

@ -4,30 +4,34 @@ import re
 
 import pyalpm
 
+from sqlalchemy import and_
+
 import aurweb.config
-import aurweb.db
 
-db_path = aurweb.config.get('aurblup', 'db-path')
-sync_dbs = aurweb.config.get('aurblup', 'sync-dbs').split(' ')
-server = aurweb.config.get('aurblup', 'server')
+from aurweb import db, util
+from aurweb.models import OfficialProvider
 
 
-def main():
+def _main(force: bool = False):
     blacklist = set()
     providers = set()
     repomap = dict()
 
+    db_path = aurweb.config.get("aurblup", "db-path")
+    sync_dbs = aurweb.config.get('aurblup', 'sync-dbs').split(' ')
+    server = aurweb.config.get('aurblup', 'server')
+
     h = pyalpm.Handle("/", db_path)
     for sync_db in sync_dbs:
         repo = h.register_syncdb(sync_db, pyalpm.SIG_DATABASE_OPTIONAL)
         repo.servers = [server.replace("%s", sync_db)]
         t = h.init_transaction()
-        repo.update(False)
+        repo.update(force)
         t.release()
 
     for pkg in repo.pkgcache:
         blacklist.add(pkg.name)
-        [blacklist.add(x) for x in pkg.replaces]
+        util.apply_all(pkg.replaces, blacklist.add)
         providers.add((pkg.name, pkg.name))
         repomap[(pkg.name, pkg.name)] = repo.name
         for provision in pkg.provides:
@ -35,21 +39,29 @@ def main():
|
||||||
providers.add((pkg.name, provisionname))
|
providers.add((pkg.name, provisionname))
|
||||||
repomap[(pkg.name, provisionname)] = repo.name
|
repomap[(pkg.name, provisionname)] = repo.name
|
||||||
|
|
||||||
conn = aurweb.db.Connection()
|
with db.begin():
|
||||||
|
old_providers = set(
|
||||||
|
db.query(OfficialProvider).with_entities(
|
||||||
|
OfficialProvider.Name.label("Name"),
|
||||||
|
OfficialProvider.Provides.label("Provides")
|
||||||
|
).distinct().order_by("Name").all()
|
||||||
|
)
|
||||||
|
|
||||||
cur = conn.execute("SELECT Name, Provides FROM OfficialProviders")
|
for name, provides in old_providers.difference(providers):
|
||||||
oldproviders = set(cur.fetchall())
|
db.delete_all(db.query(OfficialProvider).filter(
|
||||||
|
and_(OfficialProvider.Name == name,
|
||||||
|
OfficialProvider.Provides == provides)
|
||||||
|
))
|
||||||
|
|
||||||
for pkg, provides in oldproviders.difference(providers):
|
for name, provides in providers.difference(old_providers):
|
||||||
conn.execute("DELETE FROM OfficialProviders "
|
repo = repomap.get((name, provides))
|
||||||
"WHERE Name = ? AND Provides = ?", [pkg, provides])
|
db.create(OfficialProvider, Name=name,
|
||||||
for pkg, provides in providers.difference(oldproviders):
|
Repo=repo, Provides=provides)
|
||||||
repo = repomap[(pkg, provides)]
|
|
||||||
conn.execute("INSERT INTO OfficialProviders (Name, Repo, Provides) "
|
|
||||||
"VALUES (?, ?, ?)", [pkg, repo, provides])
|
|
||||||
|
|
||||||
conn.commit()
|
|
||||||
conn.close()
|
def main(force: bool = False):
|
||||||
|
db.get_engine()
|
||||||
|
_main(force)
|
||||||
|
|
||||||
|
|
||||||
if __name__ == '__main__':
|
if __name__ == '__main__':
|
||||||
|
|
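The rewritten loop above replaces a list-comprehension-for-side-effects with `util.apply_all`. As a rough sketch of what that helper does (this re-implementation is an assumption based on its call sites here, not aurweb's actual source), it simply applies a callable to every element of an iterable:

```python
def apply_all(iterable, fn):
    # Hypothetical re-implementation of aurweb.util.apply_all:
    # call fn on every item, purely for its side effects,
    # and hand the iterable back to the caller.
    for item in iterable:
        fn(item)
    return iterable


# Mirrors the aurblup usage: feed replaced package names into a blacklist.
blacklist = set()
apply_all(["pkg-old", "pkg-legacy"], blacklist.add)
```

This reads better than `[blacklist.add(x) for x in pkg.replaces]`, which builds and discards a throwaway list of `None`s.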
aurweb/scripts/config.py  69 lines  Normal file
@@ -0,0 +1,69 @@
+"""
+Perform an action on the aurweb config.
+
+When AUR_CONFIG_IMMUTABLE is set, the `set` action is noop.
+"""
+import argparse
+import configparser
+import os
+import sys
+
+import aurweb.config
+
+
+def do_action(func, *args, save: bool = True):
+    # If AUR_CONFIG_IMMUTABLE is defined, skip out on config setting.
+    if int(os.environ.get("AUR_CONFIG_IMMUTABLE", 0)):
+        return
+
+    value = None
+    try:
+        value = func(*args)
+        if save:
+            aurweb.config.save()
+    except configparser.NoSectionError:
+        print("error: no section found", file=sys.stderr)
+    except configparser.NoOptionError:
+        print("error: no option found", file=sys.stderr)
+
+    return value
+
+
+def action_set(args):
+    if not args.value:
+        print("error: no value provided", file=sys.stderr)
+        return
+    do_action(aurweb.config.set_option, args.section, args.option, args.value)
+
+
+def action_unset(args):
+    do_action(aurweb.config.unset_option, args.section, args.option)
+
+
+def action_get(args):
+    val = do_action(aurweb.config.get, args.section, args.option, save=False)
+    print(val)
+
+
+def parse_args():
+    fmt_cls = argparse.RawDescriptionHelpFormatter
+    actions = ["get", "set", "unset"]
+    parser = argparse.ArgumentParser(
+        description="aurweb configuration tool",
+        formatter_class=lambda prog: fmt_cls(prog=prog, max_help_position=80))
+    parser.add_argument("action", choices=actions, help="script action")
+    parser.add_argument("section", help="config section")
+    parser.add_argument("option", help="config option")
+    parser.add_argument("value", nargs="?", default=0,
+                        help="config option value")
+    return parser.parse_args()
+
+
+def main():
+    args = parse_args()
+    action = getattr(sys.modules[__name__], f"action_{args.action}")
+    return action(args)
+
+
+if __name__ == "__main__":
+    main()
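`main()` in the new config tool dispatches to `action_get`/`action_set`/`action_unset` by naming convention, looking the handler up on the module object with `getattr`. A minimal, self-contained sketch of that pattern (the handler names and arguments here are illustrative, not aurweb's):

```python
import sys


def action_get(args):
    # Stand-in handler; the real one reads the aurweb config.
    return f"get {args}"


def action_unset(args):
    # Stand-in handler; the real one removes a config option.
    return f"unset {args}"


def dispatch(action: str, args):
    # Resolve the handler on this module by name, exactly like
    # getattr(sys.modules[__name__], f"action_{args.action}") above.
    handler = getattr(sys.modules[__name__], f"action_{action}")
    return handler(args)
```

Adding a new action then only requires defining a matching `action_*` function; no dispatch table needs updating.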
@@ -18,25 +18,33 @@ on the following, right-hand side fields are added to each item.
 """

-import datetime
 import gzip
+import os
 import sys

 from collections import defaultdict
-from decimal import Decimal
+from typing import Any, Dict

 import orjson

+from sqlalchemy import literal, orm
+
 import aurweb.config
-import aurweb.db

-packagesfile = aurweb.config.get('mkpkglists', 'packagesfile')
-packagesmetafile = aurweb.config.get('mkpkglists', 'packagesmetafile')
-packagesmetaextfile = aurweb.config.get('mkpkglists', 'packagesmetaextfile')
-
-pkgbasefile = aurweb.config.get('mkpkglists', 'pkgbasefile')
-
-userfile = aurweb.config.get('mkpkglists', 'userfile')
+from aurweb import db, filters, logging, models, util
+from aurweb.benchmark import Benchmark
+from aurweb.models import Package, PackageBase, User
+
+logger = logging.get_logger("aurweb.scripts.mkpkglists")
+
+archivedir = aurweb.config.get("mkpkglists", "archivedir")
+os.makedirs(archivedir, exist_ok=True)
+
+PACKAGES = aurweb.config.get('mkpkglists', 'packagesfile')
+META = aurweb.config.get('mkpkglists', 'packagesmetafile')
+META_EXT = aurweb.config.get('mkpkglists', 'packagesmetaextfile')
+PKGBASE = aurweb.config.get('mkpkglists', 'pkgbasefile')
+USERS = aurweb.config.get('mkpkglists', 'userfile')


 TYPE_MAP = {

@@ -50,7 +58,7 @@ TYPE_MAP = {
 }


-def get_extended_dict(query: str):
+def get_extended_dict(query: orm.Query):
     """
     Produce data in the form in a single bulk SQL query:

@@ -71,61 +79,75 @@ def get_extended_dict(query: str):
        output[i].update(data.get(package_id))
     """

-    conn = aurweb.db.Connection()
-
-    cursor = conn.execute(query)
-
     data = defaultdict(lambda: defaultdict(list))
-    for result in cursor.fetchall():
+    for result in query:
         pkgid = result[0]
         key = TYPE_MAP.get(result[1], result[1])
         output = result[2]
         if result[3]:
             output += result[3]

-        # In all cases, we have at least an empty License list.
-        if "License" not in data[pkgid]:
-            data[pkgid]["License"] = []
-
-        # In all cases, we have at least an empty Keywords list.
-        if "Keywords" not in data[pkgid]:
-            data[pkgid]["Keywords"] = []
-
         data[pkgid][key].append(output)

-    conn.close()
     return data

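`get_extended_dict` groups flat `(ID, Type, Name, Cond)` rows into a per-package mapping of type-keyed lists using a nested `defaultdict`. A self-contained sketch of that grouping step, with illustrative rows in place of the real query results:

```python
from collections import defaultdict

# Illustrative rows in the (ID, Type, Name, Cond) shape the bulk query yields.
rows = [
    (1, "Depends", "glibc", ""),
    (1, "Depends", "python", ">=3.9"),
    (2, "License", "MIT", ""),
]

# defaultdict(lambda: defaultdict(list)) means data[pkgid][key] is always a
# list, so rows can be appended without any membership checks.
data = defaultdict(lambda: defaultdict(list))
for pkgid, key, name, cond in rows:
    # Concatenate the condition onto the name, as the script does when
    # result[3] is non-empty (e.g. "python" + ">=3.9").
    data[pkgid][key].append(name + cond)
```

The nested-default pattern is what lets the loop body stay a single `append` regardless of whether the package or the type has been seen before.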
 def get_extended_fields():
-    # Returns: [ID, Type, Name, Cond]
-    query = """
-    SELECT PackageDepends.PackageID AS ID, DependencyTypes.Name AS Type,
-           PackageDepends.DepName AS Name, PackageDepends.DepCondition AS Cond
-    FROM PackageDepends
-    LEFT JOIN DependencyTypes
-    ON DependencyTypes.ID = PackageDepends.DepTypeID
-    UNION SELECT PackageRelations.PackageID AS ID, RelationTypes.Name AS Type,
-                 PackageRelations.RelName AS Name,
-                 PackageRelations.RelCondition AS Cond
-    FROM PackageRelations
-    LEFT JOIN RelationTypes
-    ON RelationTypes.ID = PackageRelations.RelTypeID
-    UNION SELECT PackageGroups.PackageID AS ID, 'Groups' AS Type,
-                 Groups.Name, '' AS Cond
-    FROM Groups
-    INNER JOIN PackageGroups ON PackageGroups.GroupID = Groups.ID
-    UNION SELECT PackageLicenses.PackageID AS ID, 'License' AS Type,
-                 Licenses.Name, '' as Cond
-    FROM Licenses
-    INNER JOIN PackageLicenses ON PackageLicenses.LicenseID = Licenses.ID
-    UNION SELECT Packages.ID AS ID, 'Keywords' AS Type,
-                 PackageKeywords.Keyword AS Name, '' as Cond
-    FROM PackageKeywords
-    INNER JOIN Packages ON Packages.PackageBaseID = PackageKeywords.PackageBaseID
-    """
+    subqueries = [
+        # PackageDependency
+        db.query(
+            models.PackageDependency
+        ).join(models.DependencyType).with_entities(
+            models.PackageDependency.PackageID.label("ID"),
+            models.DependencyType.Name.label("Type"),
+            models.PackageDependency.DepName.label("Name"),
+            models.PackageDependency.DepCondition.label("Cond")
+        ).distinct().order_by("Name"),
+
+        # PackageRelation
+        db.query(
+            models.PackageRelation
+        ).join(models.RelationType).with_entities(
+            models.PackageRelation.PackageID.label("ID"),
+            models.RelationType.Name.label("Type"),
+            models.PackageRelation.RelName.label("Name"),
+            models.PackageRelation.RelCondition.label("Cond")
+        ).distinct().order_by("Name"),
+
+        # Groups
+        db.query(models.PackageGroup).join(
+            models.Group,
+            models.PackageGroup.GroupID == models.Group.ID
+        ).with_entities(
+            models.PackageGroup.PackageID.label("ID"),
+            literal("Groups").label("Type"),
+            models.Group.Name.label("Name"),
+            literal(str()).label("Cond")
+        ).distinct().order_by("Name"),
+
+        # Licenses
+        db.query(models.PackageLicense).join(
+            models.License,
+            models.PackageLicense.LicenseID == models.License.ID
+        ).with_entities(
+            models.PackageLicense.PackageID.label("ID"),
+            literal("License").label("Type"),
+            models.License.Name.label("Name"),
+            literal(str()).label("Cond")
+        ).distinct().order_by("Name"),
+
+        # Keywords
+        db.query(models.PackageKeyword).join(
+            models.Package,
+            Package.PackageBaseID == models.PackageKeyword.PackageBaseID
+        ).with_entities(
+            models.Package.ID.label("ID"),
+            literal("Keywords").label("Type"),
+            models.PackageKeyword.Keyword.label("Name"),
+            literal(str()).label("Cond")
+        ).distinct().order_by("Name")
+    ]
+    query = subqueries[0].union_all(*subqueries[1:])
     return get_extended_dict(query)

@@ -134,97 +156,122 @@ EXTENDED_FIELD_HANDLERS = {
 }


-def is_decimal(column):
-    """ Check if an SQL column is of decimal.Decimal type. """
-    if isinstance(column, Decimal):
-        return float(column)
-    return column
+def as_dict(package: Package) -> Dict[str, Any]:
+    return {
+        "ID": package.ID,
+        "Name": package.Name,
+        "PackageBaseID": package.PackageBaseID,
+        "PackageBase": package.PackageBase,
+        "Version": package.Version,
+        "Description": package.Description,
+        "NumVotes": package.NumVotes,
+        "Popularity": float(package.Popularity),
+        "OutOfDate": package.OutOfDate,
+        "Maintainer": package.Maintainer,
+        "FirstSubmitted": package.FirstSubmitted,
+        "LastModified": package.LastModified,
+    }


-def write_archive(archive: str, output: list):
-    with gzip.open(archive, "wb") as f:
-        f.write(b"[\n")
-        for i, item in enumerate(output):
-            f.write(orjson.dumps(item))
-            if i < len(output) - 1:
-                f.write(b",")
-            f.write(b"\n")
-        f.write(b"]")
-
-
-def main():
-    conn = aurweb.db.Connection()
-
-    datestr = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
-    pkglist_header = "# AUR package list, generated on " + datestr
-    pkgbaselist_header = "# AUR package base list, generated on " + datestr
-    userlist_header = "# AUR user name list, generated on " + datestr
-
-    # Query columns; copied from RPC.
-    columns = ("Packages.ID, Packages.Name, "
-               "PackageBases.ID AS PackageBaseID, "
-               "PackageBases.Name AS PackageBase, "
-               "Version, Description, URL, NumVotes, "
-               "Popularity, OutOfDateTS AS OutOfDate, "
-               "Users.UserName AS Maintainer, "
-               "SubmittedTS AS FirstSubmitted, "
-               "ModifiedTS AS LastModified")
-
-    # Perform query.
-    cur = conn.execute(f"SELECT {columns} FROM Packages "
-                       "LEFT JOIN PackageBases "
-                       "ON PackageBases.ID = Packages.PackageBaseID "
-                       "LEFT JOIN Users "
-                       "ON PackageBases.MaintainerUID = Users.ID "
-                       "WHERE PackageBases.PackagerUID IS NOT NULL")
+def _main():
+    bench = Benchmark()
+    logger.info("Started re-creating archives, wait a while...")
+
+    query = db.query(Package).join(
+        PackageBase,
+        PackageBase.ID == Package.PackageBaseID
+    ).join(
+        User,
+        PackageBase.MaintainerUID == User.ID,
+        isouter=True
+    ).filter(PackageBase.PackagerUID.isnot(None)).with_entities(
+        Package.ID,
+        Package.Name,
+        PackageBase.ID.label("PackageBaseID"),
+        PackageBase.Name.label("PackageBase"),
+        Package.Version,
+        Package.Description,
+        PackageBase.NumVotes,
+        PackageBase.Popularity,
+        PackageBase.OutOfDateTS.label("OutOfDate"),
+        User.Username.label("Maintainer"),
+        PackageBase.SubmittedTS.label("FirstSubmitted"),
+        PackageBase.ModifiedTS.label("LastModified")
+    ).distinct().order_by("Name")

     # Produce packages-meta-v1.json.gz
     output = list()
     snapshot_uri = aurweb.config.get("options", "snapshot_uri")
-    for result in cur.fetchall():
-        item = {
-            column[0]: is_decimal(result[i])
-            for i, column in enumerate(cur.description)
-        }
-        item["URLPath"] = snapshot_uri % item.get("Name")
-        output.append(item)
-
-    write_archive(packagesmetafile, output)
+    gzips = {
+        "packages": gzip.open(PACKAGES, "wt"),
+        "meta": gzip.open(META, "wb"),
+    }
+
+    # Append list opening to the metafile.
+    gzips["meta"].write(b"[\n")

-    # Produce packages-meta-ext-v1.json.gz
+    # Produce packages.gz + packages-meta-ext-v1.json.gz
+    extended = False
     if len(sys.argv) > 1 and sys.argv[1] in EXTENDED_FIELD_HANDLERS:
+        gzips["meta_ext"] = gzip.open(META_EXT, "wb")
+        # Append list opening to the meta_ext file.
+        gzips.get("meta_ext").write(b"[\n")
         f = EXTENDED_FIELD_HANDLERS.get(sys.argv[1])
         data = f()
+        extended = True

-        default_ = {"Groups": [], "License": [], "Keywords": []}
-        for i in range(len(output)):
-            data_ = data.get(output[i].get("ID"), default_)
-            output[i].update(data_)
-
-        write_archive(packagesmetaextfile, output)
+    results = query.all()
+    n = len(results) - 1
+    for i, result in enumerate(results):
+        # Append to packages.gz.
+        gzips.get("packages").write(f"{result.Name}\n")

-    # Produce packages.gz
-    with gzip.open(packagesfile, "wb") as f:
-        f.write(bytes(pkglist_header + "\n", "UTF-8"))
-        f.writelines([
-            bytes(x.get("Name") + "\n", "UTF-8")
-            for x in output
-        ])
+        # Construct our result JSON dictionary.
+        item = as_dict(result)
+        item["URLPath"] = snapshot_uri % result.Name
+
+        # We stream out package json objects line per line, so
+        # we also need to include the ',' character at the end
+        # of package lines (excluding the last package).
+        suffix = b",\n" if i < n else b'\n'
+
+        # Write out to packagesmetafile
+        output.append(item)
+        gzips.get("meta").write(orjson.dumps(output[-1]) + suffix)
+
+        if extended:
+            # Write out to packagesmetaextfile.
+            data_ = data.get(result.ID, {})
+            output[-1].update(data_)
+            gzips.get("meta_ext").write(orjson.dumps(output[-1]) + suffix)
+
+    # Append the list closing to meta/meta_ext.
+    gzips.get("meta").write(b"]")
+    if extended:
+        gzips.get("meta_ext").write(b"]")
+
+    # Close gzip files.
+    util.apply_all(gzips.values(), lambda gz: gz.close())

     # Produce pkgbase.gz
-    with gzip.open(pkgbasefile, "w") as f:
-        f.write(bytes(pkgbaselist_header + "\n", "UTF-8"))
-        cur = conn.execute("SELECT Name FROM PackageBases " +
-                           "WHERE PackagerUID IS NOT NULL")
-        f.writelines([bytes(x[0] + "\n", "UTF-8") for x in cur.fetchall()])
+    query = db.query(PackageBase.Name).filter(
+        PackageBase.PackagerUID.isnot(None)).all()
+    with gzip.open(PKGBASE, "wt") as f:
+        f.writelines([f"{base.Name}\n" for i, base in enumerate(query)])

     # Produce users.gz
-    with gzip.open(userfile, "w") as f:
-        f.write(bytes(userlist_header + "\n", "UTF-8"))
-        cur = conn.execute("SELECT UserName FROM Users")
-        f.writelines([bytes(x[0] + "\n", "UTF-8") for x in cur.fetchall()])
-
-    conn.close()
+    query = db.query(User.Username).all()
+    with gzip.open(USERS, "wt") as f:
+        f.writelines([f"{user.Username}\n" for i, user in enumerate(query)])
+
+    seconds = filters.number_format(bench.end(), 4)
+    logger.info(f"Completed in {seconds} seconds.")
+
+
+def main():
+    db.get_engine()
+    with db.begin():
+        _main()


 if __name__ == '__main__':

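The new `_main` streams each package's JSON object out line by line instead of buffering the whole archive through `write_archive`, appending a trailing comma to every element except the last so the gzip file remains one valid JSON array. A minimal sketch of that streaming pattern (using stdlib `json` and a temporary path for illustration; the real script uses `orjson` and configured archive paths):

```python
import gzip
import json
import os
import tempfile

# Hypothetical records standing in for the SQLAlchemy query results.
records = [{"Name": "foo", "Version": "1.0"},
           {"Name": "bar", "Version": "2.0"}]

path = os.path.join(tempfile.mkdtemp(), "meta.json.gz")
with gzip.open(path, "wb") as gz:
    gz.write(b"[\n")
    n = len(records) - 1
    for i, item in enumerate(records):
        # Comma after every element except the last keeps the array valid.
        suffix = b",\n" if i < n else b"\n"
        gz.write(json.dumps(item).encode() + suffix)
    gz.write(b"]")

# The archive round-trips back to the original records.
with gzip.open(path, "rb") as gz:
    decoded = json.loads(gz.read())
```

Streaming keeps peak memory proportional to one record rather than to the whole package list, which matters for AUR-sized archives.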
@@ -7,10 +7,26 @@ import subprocess
 import sys
 import textwrap

+from typing import List, Tuple
+
+from sqlalchemy import and_, or_
+
 import aurweb.config
 import aurweb.db
+import aurweb.filters
 import aurweb.l10n

+from aurweb import db, l10n, logging
+from aurweb.models import PackageBase, User
+from aurweb.models.package_comaintainer import PackageComaintainer
+from aurweb.models.package_comment import PackageComment
+from aurweb.models.package_notification import PackageNotification
+from aurweb.models.package_request import PackageRequest
+from aurweb.models.request_type import RequestType
+from aurweb.models.tu_vote import TUVote
+
+logger = logging.get_logger(__name__)
+
 aur_location = aurweb.config.get('options', 'aur_location')


@@ -22,27 +38,7 @@ def headers_reply(thread_id):
     return {'In-Reply-To': thread_id, 'References': thread_id}


-def username_from_id(conn, uid):
-    cur = conn.execute('SELECT UserName FROM Users WHERE ID = ?', [uid])
-    return cur.fetchone()[0]
-
-
-def pkgbase_from_id(conn, pkgbase_id):
-    cur = conn.execute('SELECT Name FROM PackageBases WHERE ID = ?',
-                       [pkgbase_id])
-    return cur.fetchone()[0]
-
-
-def pkgbase_from_pkgreq(conn, reqid):
-    cur = conn.execute('SELECT PackageBaseID FROM PackageRequests ' +
-                       'WHERE ID = ?', [reqid])
-    return cur.fetchone()[0]
-
-
 class Notification:
-    def __init__(self):
-        self._l10n = aurweb.l10n.Translator()
-
     def get_refs(self):
         return ()

@@ -55,15 +51,15 @@ class Notification:
     def get_body_fmt(self, lang):
         body = ''
         for line in self.get_body(lang).splitlines():
-            if line == '-- ':
-                body += '-- \n'
+            if line == '--':
+                body += '--\n'
                 continue
             body += textwrap.fill(line, break_long_words=False) + '\n'
         for i, ref in enumerate(self.get_refs()):
             body += '\n' + '[%d] %s' % (i + 1, ref)
         return body.rstrip()

-    def send(self):
+    def _send(self) -> None:
         sendmail = aurweb.config.get('notifications', 'sendmail')
         sender = aurweb.config.get('notifications', 'sender')
         reply_to = aurweb.config.get('notifications', 'reply-to')

@@ -97,16 +93,20 @@ class Notification:
         else:
             # send email using smtplib; no local MTA required
             server_addr = aurweb.config.get('notifications', 'smtp-server')
-            server_port = aurweb.config.getint('notifications', 'smtp-port')
-            use_ssl = aurweb.config.getboolean('notifications', 'smtp-use-ssl')
-            use_starttls = aurweb.config.getboolean('notifications', 'smtp-use-starttls')
+            server_port = aurweb.config.getint('notifications',
+                                               'smtp-port')
+            use_ssl = aurweb.config.getboolean('notifications',
+                                               'smtp-use-ssl')
+            use_starttls = aurweb.config.getboolean('notifications',
+                                                    'smtp-use-starttls')
             user = aurweb.config.get('notifications', 'smtp-user')
             passwd = aurweb.config.get('notifications', 'smtp-password')

-            if use_ssl:
-                server = smtplib.SMTP_SSL(server_addr, server_port)
-            else:
-                server = smtplib.SMTP(server_addr, server_port)
+            classes = {
+                False: smtplib.SMTP,
+                True: smtplib.SMTP_SSL,
+            }
+            server = classes[use_ssl](server_addr, server_port)

             if use_starttls:
                 server.ehlo()

@@ -121,13 +121,76 @@ class Notification:
             server.sendmail(sender, deliver_to, msg.as_bytes())
             server.quit()

+    def send(self) -> None:
+        try:
+            self._send()
+        except OSError as exc:
+            logger.error("Unable to emit notification due to an "
+                         "OSError (precise exception following).")
+            logger.error(str(exc))

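The `use_ssl` branch above collapses into a dictionary that maps the boolean flag straight to the right `smtplib` class. A self-contained sketch of that selection pattern (the factory wrapper here is illustrative; the real code indexes the dict inline):

```python
import smtplib


def select_smtp_class(use_ssl: bool):
    # Map the boolean config flag directly to the smtplib class,
    # replacing the former if/else over SMTP vs SMTP_SSL.
    classes = {
        False: smtplib.SMTP,
        True: smtplib.SMTP_SSL,
    }
    return classes[use_ssl]
```

Calling `select_smtp_class(use_ssl)(server_addr, server_port)` then constructs the appropriate client; the STARTTLS upgrade still happens afterwards on a plain `SMTP` connection when configured.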
+
+class ServerErrorNotification(Notification):
+    """ A notification used to represent an internal server error. """
+
+    def __init__(self, traceback_id: int, version: str, utc: int):
+        """
+        Construct a ServerErrorNotification.
+
+        :param traceback_id: Traceback ID
+        :param version: aurweb version
+        :param utc: UTC timestamp
+        """
+        self._tb_id = traceback_id
+        self._version = version
+        self._utc = utc
+
+        postmaster = aurweb.config.get("notifications", "postmaster")
+        self._to = postmaster
+
+        super().__init__()
+
+    def get_recipients(self) -> List[Tuple[str, str]]:
+        from aurweb.auth import AnonymousUser
+        user = (db.query(User).filter(User.Email == self._to).first()
+                or AnonymousUser())
+        return [(self._to, user.LangPreference)]
+
+    def get_subject(self, lang: str) -> str:
+        return l10n.translator.translate("AUR Server Error", lang)
+
+    def get_body(self, lang: str) -> str:
+        """ A forcibly English email body. """
+        dt = aurweb.filters.timestamp_to_datetime(self._utc)
+        dts = dt.strftime("%Y-%m-%d %H:%M")
+        return (f"Traceback ID: {self._tb_id}\n"
+                f"Location: {aur_location}\n"
+                f"Version: {self._version}\n"
+                f"Datetime: {dts} UTC\n")
+
+    def get_refs(self):
+        return (aur_location,)

class ResetKeyNotification(Notification):
|
class ResetKeyNotification(Notification):
|
||||||
def __init__(self, conn, uid):
|
def __init__(self, uid):
|
||||||
cur = conn.execute('SELECT UserName, Email, BackupEmail, ' +
|
|
||||||
'LangPreference, ResetKey ' +
|
user = db.query(User).filter(
|
||||||
'FROM Users WHERE ID = ? AND Suspended = 0', [uid])
|
and_(User.ID == uid, User.Suspended == 0)
|
||||||
self._username, self._to, self._backup, self._lang, self._resetkey = cur.fetchone()
|
).with_entities(
|
||||||
|
User.Username,
|
||||||
|
User.Email,
|
||||||
|
User.BackupEmail,
|
||||||
|
User.LangPreference,
|
||||||
|
User.ResetKey
|
||||||
|
).order_by(User.Username.asc()).first()
|
||||||
|
|
||||||
|
self._username = user.Username
|
||||||
|
self._to = user.Email
|
||||||
|
self._backup = user.BackupEmail
|
||||||
|
self._lang = user.LangPreference
|
||||||
|
self._resetkey = user.ResetKey
|
||||||
|
|
||||||
super().__init__()
|
super().__init__()
|
||||||
|
|
||||||
def get_recipients(self):
|
def get_recipients(self):
|
||||||
|
@ -137,15 +200,15 @@ class ResetKeyNotification(Notification):
|
||||||
return [(self._to, self._lang)]
|
return [(self._to, self._lang)]
|
||||||
|
|
||||||
def get_subject(self, lang):
|
def get_subject(self, lang):
|
||||||
return self._l10n.translate('AUR Password Reset', lang)
|
return aurweb.l10n.translator.translate('AUR Password Reset', lang)
|
||||||
|
|
||||||
def get_body(self, lang):
|
def get_body(self, lang):
|
||||||
return self._l10n.translate(
|
return aurweb.l10n.translator.translate(
|
||||||
'A password reset request was submitted for the account '
|
'A password reset request was submitted for the account '
|
||||||
'{user} associated with your email address. If you wish to '
|
'{user} associated with your email address. If you wish to '
|
||||||
'reset your password follow the link [1] below, otherwise '
|
'reset your password follow the link [1] below, otherwise '
|
||||||
'ignore this message and nothing will happen.',
|
'ignore this message and nothing will happen.',
|
||||||
lang).format(user=self._username)
|
lang).format(user=self._username)
|
||||||
|
|
||||||
def get_refs(self):
|
def get_refs(self):
|
||||||
return (aur_location + '/passreset/?resetkey=' + self._resetkey,)
|
return (aur_location + '/passreset/?resetkey=' + self._resetkey,)
|
||||||
|
@ -153,52 +216,62 @@ class ResetKeyNotification(Notification):
|
||||||
|
|
||||||
class WelcomeNotification(ResetKeyNotification):
|
class WelcomeNotification(ResetKeyNotification):
|
||||||
def get_subject(self, lang):
|
def get_subject(self, lang):
|
||||||
return self._l10n.translate('Welcome to the Arch User Repository',
|
return aurweb.l10n.translator.translate(
|
||||||
lang)
|
'Welcome to the Arch User Repository',
|
||||||
|
lang)
|
||||||
|
|
||||||
def get_body(self, lang):
|
def get_body(self, lang):
|
||||||
return self._l10n.translate(
|
return aurweb.l10n.translator.translate(
|
||||||
'Welcome to the Arch User Repository! In order to set an '
|
'Welcome to the Arch User Repository! In order to set an '
|
||||||
'initial password for your new account, please click the '
|
'initial password for your new account, please click the '
|
||||||
'link [1] below. If the link does not work, try copying and '
|
'link [1] below. If the link does not work, try copying and '
|
||||||
'pasting it into your browser.', lang)
|
'pasting it into your browser.', lang)
|
||||||
|
|
||||||
|
|
||||||
class CommentNotification(Notification):
|
class CommentNotification(Notification):
|
||||||
def __init__(self, conn, uid, pkgbase_id, comment_id):
|
def __init__(self, uid, pkgbase_id, comment_id):
|
||||||
self._user = username_from_id(conn, uid)
|
|
||||||
self._pkgbase = pkgbase_from_id(conn, pkgbase_id)
|
self._user = db.query(User.Username).filter(
|
||||||
cur = conn.execute('SELECT DISTINCT Users.Email, Users.LangPreference '
|
User.ID == uid).first().Username
|
||||||
'FROM Users INNER JOIN PackageNotifications ' +
|
self._pkgbase = db.query(PackageBase.Name).filter(
|
||||||
'ON PackageNotifications.UserID = Users.ID WHERE ' +
|
PackageBase.ID == pkgbase_id).first().Name
|
||||||
'Users.CommentNotify = 1 AND ' +
|
|
||||||
'PackageNotifications.UserID != ? AND ' +
|
query = db.query(User).join(PackageNotification).filter(
|
||||||
'PackageNotifications.PackageBaseID = ? AND ' +
|
and_(User.CommentNotify == 1,
|
||||||
'Users.Suspended = 0',
|
PackageNotification.UserID != uid,
|
||||||
[uid, pkgbase_id])
|
PackageNotification.PackageBaseID == pkgbase_id,
|
||||||
self._recipients = cur.fetchall()
|
User.Suspended == 0)
|
||||||
-        cur = conn.execute('SELECT Comments FROM PackageComments WHERE ID = ?',
-                           [comment_id])
-        self._text = cur.fetchone()[0]
+        ).with_entities(
+            User.Email,
+            User.LangPreference
+        ).distinct()
+        self._recipients = [(u.Email, u.LangPreference) for u in query]
+
+        pkgcomment = db.query(PackageComment.Comments).filter(
+            PackageComment.ID == comment_id).first()
+        self._text = pkgcomment.Comments
+
         super().__init__()

     def get_recipients(self):
         return self._recipients

     def get_subject(self, lang):
-        return self._l10n.translate('AUR Comment for {pkgbase}',
-                                    lang).format(pkgbase=self._pkgbase)
+        return aurweb.l10n.translator.translate(
+            'AUR Comment for {pkgbase}',
+            lang).format(pkgbase=self._pkgbase)

     def get_body(self, lang):
-        body = self._l10n.translate(
+        body = aurweb.l10n.translator.translate(
             '{user} [1] added the following comment to {pkgbase} [2]:',
             lang).format(user=self._user, pkgbase=self._pkgbase)
-        body += '\n\n' + self._text + '\n\n-- \n'
-        dnlabel = self._l10n.translate('Disable notifications', lang)
-        body += self._l10n.translate(
+        body += '\n\n' + self._text + '\n\n--\n'
+        dnlabel = aurweb.l10n.translator.translate(
+            'Disable notifications', lang)
+        body += aurweb.l10n.translator.translate(
             'If you no longer wish to receive notifications about this '
             'package, please go to the package page [2] and select '
             '"{label}".', lang).format(label=dnlabel)
         return body

     def get_refs(self):
@@ -212,39 +285,45 @@ class CommentNotification(Notification):


 class UpdateNotification(Notification):
-    def __init__(self, conn, uid, pkgbase_id):
-        self._user = username_from_id(conn, uid)
-        self._pkgbase = pkgbase_from_id(conn, pkgbase_id)
-        cur = conn.execute('SELECT DISTINCT Users.Email, ' +
-                           'Users.LangPreference FROM Users ' +
-                           'INNER JOIN PackageNotifications ' +
-                           'ON PackageNotifications.UserID = Users.ID WHERE ' +
-                           'Users.UpdateNotify = 1 AND ' +
-                           'PackageNotifications.UserID != ? AND ' +
-                           'PackageNotifications.PackageBaseID = ? AND ' +
-                           'Users.Suspended = 0',
-                           [uid, pkgbase_id])
-        self._recipients = cur.fetchall()
+    def __init__(self, uid, pkgbase_id):
+        self._user = db.query(User.Username).filter(
+            User.ID == uid).first().Username
+        self._pkgbase = db.query(PackageBase.Name).filter(
+            PackageBase.ID == pkgbase_id).first().Name
+
+        query = db.query(User).join(PackageNotification).filter(
+            and_(User.UpdateNotify == 1,
+                 PackageNotification.UserID != uid,
+                 PackageNotification.PackageBaseID == pkgbase_id,
+                 User.Suspended == 0)
+        ).with_entities(
+            User.Email,
+            User.LangPreference
+        ).distinct()
+        self._recipients = [(u.Email, u.LangPreference) for u in query]

         super().__init__()

     def get_recipients(self):
         return self._recipients

     def get_subject(self, lang):
-        return self._l10n.translate('AUR Package Update: {pkgbase}',
-                                    lang).format(pkgbase=self._pkgbase)
+        return aurweb.l10n.translator.translate(
+            'AUR Package Update: {pkgbase}',
+            lang).format(pkgbase=self._pkgbase)

     def get_body(self, lang):
-        body = self._l10n.translate('{user} [1] pushed a new commit to '
-                                    '{pkgbase} [2].', lang).format(
-                                        user=self._user,
-                                        pkgbase=self._pkgbase)
-        body += '\n\n-- \n'
-        dnlabel = self._l10n.translate('Disable notifications', lang)
-        body += self._l10n.translate(
+        body = aurweb.l10n.translator.translate(
+            '{user} [1] pushed a new commit to {pkgbase} [2].',
+            lang).format(user=self._user, pkgbase=self._pkgbase)
+        body += '\n\n--\n'
+        dnlabel = aurweb.l10n.translator.translate(
+            'Disable notifications', lang)
+        body += aurweb.l10n.translator.translate(
             'If you no longer wish to receive notifications about this '
             'package, please go to the package page [2] and select '
             '"{label}".', lang).format(label=dnlabel)
         return body

     def get_refs(self):
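The recipient queries in the hunks above end in `.with_entities(User.Email, User.LangPreference).distinct()`: the joins can yield the same user more than once, and `.distinct()` collapses those duplicates before the rows are turned into `(email, lang)` tuples. A stdlib-only sketch of that deduplication step, using made-up row data:

```python
# Stdlib-only sketch of the deduplication that .distinct() performs in the
# recipient queries above. The addresses are hypothetical.
def collect_recipients(rows):
    """Deduplicate (email, lang) pairs while preserving first-seen order."""
    seen = set()
    recipients = []
    for email, lang in rows:
        if (email, lang) not in seen:
            seen.add((email, lang))
            recipients.append((email, lang))
    return recipients


rows = [("dev@example.org", "en"),
        ("dev@example.org", "en"),   # duplicate produced by a second join path
        ("uwe@example.org", "de")]
print(collect_recipients(rows))
# [('dev@example.org', 'en'), ('uwe@example.org', 'de')]
```

The language preference is carried alongside the address so that each notification body can later be rendered per recipient via `get_subject(lang)` and `get_body(lang)`.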
@@ -258,37 +337,45 @@ class UpdateNotification(Notification):


 class FlagNotification(Notification):
-    def __init__(self, conn, uid, pkgbase_id):
-        self._user = username_from_id(conn, uid)
-        self._pkgbase = pkgbase_from_id(conn, pkgbase_id)
-        cur = conn.execute('SELECT DISTINCT Users.Email, ' +
-                           'Users.LangPreference FROM Users ' +
-                           'LEFT JOIN PackageComaintainers ' +
-                           'ON PackageComaintainers.UsersID = Users.ID ' +
-                           'INNER JOIN PackageBases ' +
-                           'ON PackageBases.MaintainerUID = Users.ID OR ' +
-                           'PackageBases.ID = PackageComaintainers.PackageBaseID ' +
-                           'WHERE PackageBases.ID = ? AND ' +
-                           'Users.Suspended = 0', [pkgbase_id])
-        self._recipients = cur.fetchall()
-        cur = conn.execute('SELECT FlaggerComment FROM PackageBases WHERE ' +
-                           'ID = ?', [pkgbase_id])
-        self._text = cur.fetchone()[0]
+    def __init__(self, uid, pkgbase_id):
+        self._user = db.query(User.Username).filter(
+            User.ID == uid).first().Username
+        self._pkgbase = db.query(PackageBase.Name).filter(
+            PackageBase.ID == pkgbase_id).first().Name
+
+        query = db.query(User).join(PackageComaintainer, isouter=True).join(
+            PackageBase,
+            or_(PackageBase.MaintainerUID == User.ID,
+                PackageBase.ID == PackageComaintainer.PackageBaseID)
+        ).filter(
+            and_(PackageBase.ID == pkgbase_id,
+                 User.Suspended == 0)
+        ).with_entities(
+            User.Email,
+            User.LangPreference
+        ).distinct()
+        self._recipients = [(u.Email, u.LangPreference) for u in query]
+
+        pkgbase = db.query(PackageBase.FlaggerComment).filter(
+            PackageBase.ID == pkgbase_id).first()
+        self._text = pkgbase.FlaggerComment

         super().__init__()

     def get_recipients(self):
         return self._recipients

     def get_subject(self, lang):
-        return self._l10n.translate('AUR Out-of-date Notification for '
-                                    '{pkgbase}',
-                                    lang).format(pkgbase=self._pkgbase)
+        return aurweb.l10n.translator.translate(
+            'AUR Out-of-date Notification for {pkgbase}',
+            lang).format(pkgbase=self._pkgbase)

     def get_body(self, lang):
-        body = self._l10n.translate(
+        body = aurweb.l10n.translator.translate(
             'Your package {pkgbase} [1] has been flagged out-of-date by '
             '{user} [2]:', lang).format(pkgbase=self._pkgbase,
                                         user=self._user)
         body += '\n\n' + self._text
         return body
@@ -298,30 +385,37 @@ class FlagNotification(Notification):


 class OwnershipEventNotification(Notification):
-    def __init__(self, conn, uid, pkgbase_id):
-        self._user = username_from_id(conn, uid)
-        self._pkgbase = pkgbase_from_id(conn, pkgbase_id)
-        cur = conn.execute('SELECT DISTINCT Users.Email, ' +
-                           'Users.LangPreference FROM Users ' +
-                           'INNER JOIN PackageNotifications ' +
-                           'ON PackageNotifications.UserID = Users.ID WHERE ' +
-                           'Users.OwnershipNotify = 1 AND ' +
-                           'PackageNotifications.UserID != ? AND ' +
-                           'PackageNotifications.PackageBaseID = ? AND ' +
-                           'Users.Suspended = 0',
-                           [uid, pkgbase_id])
-        self._recipients = cur.fetchall()
-        cur = conn.execute('SELECT FlaggerComment FROM PackageBases WHERE ' +
-                           'ID = ?', [pkgbase_id])
-        self._text = cur.fetchone()[0]
+    def __init__(self, uid, pkgbase_id):
+        self._user = db.query(User.Username).filter(
+            User.ID == uid).first().Username
+        self._pkgbase = db.query(PackageBase.Name).filter(
+            PackageBase.ID == pkgbase_id).first().Name
+
+        query = db.query(User).join(PackageNotification).filter(
+            and_(User.OwnershipNotify == 1,
+                 PackageNotification.UserID != uid,
+                 PackageNotification.PackageBaseID == pkgbase_id,
+                 User.Suspended == 0)
+        ).with_entities(
+            User.Email,
+            User.LangPreference
+        ).distinct()
+        self._recipients = [(u.Email, u.LangPreference) for u in query]
+
+        pkgbase = db.query(PackageBase.FlaggerComment).filter(
+            PackageBase.ID == pkgbase_id).first()
+        self._text = pkgbase.FlaggerComment

         super().__init__()

     def get_recipients(self):
         return self._recipients

     def get_subject(self, lang):
-        return self._l10n.translate('AUR Ownership Notification for {pkgbase}',
-                                    lang).format(pkgbase=self._pkgbase)
+        return aurweb.l10n.translator.translate(
+            'AUR Ownership Notification for {pkgbase}',
+            lang).format(pkgbase=self._pkgbase)

     def get_refs(self):
         return (aur_location + '/pkgbase/' + self._pkgbase + '/',
@@ -330,34 +424,45 @@ class OwnershipEventNotification(Notification):

 class AdoptNotification(OwnershipEventNotification):
     def get_body(self, lang):
-        return self._l10n.translate(
+        return aurweb.l10n.translator.translate(
             'The package {pkgbase} [1] was adopted by {user} [2].',
             lang).format(pkgbase=self._pkgbase, user=self._user)


 class DisownNotification(OwnershipEventNotification):
     def get_body(self, lang):
-        return self._l10n.translate(
+        return aurweb.l10n.translator.translate(
             'The package {pkgbase} [1] was disowned by {user} '
             '[2].', lang).format(pkgbase=self._pkgbase,
                                  user=self._user)


 class ComaintainershipEventNotification(Notification):
-    def __init__(self, conn, uid, pkgbase_id):
-        self._pkgbase = pkgbase_from_id(conn, pkgbase_id)
-        cur = conn.execute('SELECT Email, LangPreference FROM Users ' +
-                           'WHERE ID = ? AND Suspended = 0', [uid])
-        self._to, self._lang = cur.fetchone()
+    def __init__(self, uid, pkgbase_id):
+        self._pkgbase = db.query(PackageBase.Name).filter(
+            PackageBase.ID == pkgbase_id).first().Name
+
+        user = db.query(User).filter(
+            and_(User.ID == uid,
+                 User.Suspended == 0)
+        ).with_entities(
+            User.Email,
+            User.LangPreference
+        ).first()
+
+        self._to = user.Email
+        self._lang = user.LangPreference

         super().__init__()

     def get_recipients(self):
         return [(self._to, self._lang)]

     def get_subject(self, lang):
-        return self._l10n.translate('AUR Co-Maintainer Notification for '
-                                    '{pkgbase}',
-                                    lang).format(pkgbase=self._pkgbase)
+        return aurweb.l10n.translator.translate(
+            'AUR Co-Maintainer Notification for {pkgbase}',
+            lang).format(pkgbase=self._pkgbase)

     def get_refs(self):
         return (aur_location + '/pkgbase/' + self._pkgbase + '/',)
@@ -365,60 +470,68 @@ class ComaintainershipEventNotification(Notification):

 class ComaintainerAddNotification(ComaintainershipEventNotification):
     def get_body(self, lang):
-        return self._l10n.translate(
+        return aurweb.l10n.translator.translate(
             'You were added to the co-maintainer list of {pkgbase} [1].',
             lang).format(pkgbase=self._pkgbase)


 class ComaintainerRemoveNotification(ComaintainershipEventNotification):
     def get_body(self, lang):
-        return self._l10n.translate(
+        return aurweb.l10n.translator.translate(
             'You were removed from the co-maintainer list of {pkgbase} '
             '[1].', lang).format(pkgbase=self._pkgbase)


 class DeleteNotification(Notification):
-    def __init__(self, conn, uid, old_pkgbase_id, new_pkgbase_id=None):
-        self._user = username_from_id(conn, uid)
-        self._old_pkgbase = pkgbase_from_id(conn, old_pkgbase_id)
+    def __init__(self, uid, old_pkgbase_id, new_pkgbase_id=None):
+        self._user = db.query(User.Username).filter(
+            User.ID == uid).first().Username
+        self._old_pkgbase = db.query(PackageBase.Name).filter(
+            PackageBase.ID == old_pkgbase_id).first().Name
+
+        self._new_pkgbase = None
         if new_pkgbase_id:
-            self._new_pkgbase = pkgbase_from_id(conn, new_pkgbase_id)
-        else:
-            self._new_pkgbase = None
-        cur = conn.execute('SELECT DISTINCT Users.Email, ' +
-                           'Users.LangPreference FROM Users ' +
-                           'INNER JOIN PackageNotifications ' +
-                           'ON PackageNotifications.UserID = Users.ID WHERE ' +
-                           'PackageNotifications.UserID != ? AND ' +
-                           'PackageNotifications.PackageBaseID = ? AND ' +
-                           'Users.Suspended = 0',
-                           [uid, old_pkgbase_id])
-        self._recipients = cur.fetchall()
+            self._new_pkgbase = db.query(PackageBase.Name).filter(
+                PackageBase.ID == new_pkgbase_id).first().Name
+
+        query = db.query(User).join(PackageNotification).filter(
+            and_(PackageNotification.UserID != uid,
+                 PackageNotification.PackageBaseID == old_pkgbase_id,
+                 User.Suspended == 0)
+        ).with_entities(
+            User.Email,
+            User.LangPreference
+        ).distinct()
+        self._recipients = [(u.Email, u.LangPreference) for u in query]

         super().__init__()

     def get_recipients(self):
         return self._recipients

     def get_subject(self, lang):
-        return self._l10n.translate('AUR Package deleted: {pkgbase}',
-                                    lang).format(pkgbase=self._old_pkgbase)
+        return aurweb.l10n.translator.translate(
+            'AUR Package deleted: {pkgbase}',
+            lang).format(pkgbase=self._old_pkgbase)

     def get_body(self, lang):
         if self._new_pkgbase:
-            dnlabel = self._l10n.translate('Disable notifications', lang)
-            return self._l10n.translate(
+            dnlabel = aurweb.l10n.translator.translate(
+                'Disable notifications', lang)
+            return aurweb.l10n.translator.translate(
                 '{user} [1] merged {old} [2] into {new} [3].\n\n'
-                '-- \n'
+                '--\n'
                 'If you no longer wish receive notifications about the '
                 'new package, please go to [3] and click "{label}".',
                 lang).format(user=self._user, old=self._old_pkgbase,
                              new=self._new_pkgbase, label=dnlabel)
         else:
-            return self._l10n.translate(
+            return aurweb.l10n.translator.translate(
                 '{user} [1] deleted {pkgbase} [2].\n\n'
                 'You will no longer receive notifications about this '
                 'package.', lang).format(user=self._user,
                                          pkgbase=self._old_pkgbase)

     def get_refs(self):
         refs = (aur_location + '/account/' + self._user + '/',
@@ -429,25 +542,36 @@ class DeleteNotification(Notification):


 class RequestOpenNotification(Notification):
-    def __init__(self, conn, uid, reqid, reqtype, pkgbase_id, merge_into=None):
-        self._user = username_from_id(conn, uid)
-        self._pkgbase = pkgbase_from_id(conn, pkgbase_id)
-        cur = conn.execute('SELECT DISTINCT Users.Email FROM PackageRequests ' +
-                           'INNER JOIN PackageBases ' +
-                           'ON PackageBases.ID = PackageRequests.PackageBaseID ' +
-                           'LEFT JOIN PackageComaintainers ' +
-                           'ON PackageComaintainers.PackageBaseID = PackageRequests.PackageBaseID ' +
-                           'INNER JOIN Users ' +
-                           'ON Users.ID = PackageRequests.UsersID ' +
-                           'OR Users.ID = PackageBases.MaintainerUID ' +
-                           'OR Users.ID = PackageComaintainers.UsersID ' +
-                           'WHERE PackageRequests.ID = ? AND ' +
-                           'Users.Suspended = 0', [reqid])
+    def __init__(self, uid, reqid, reqtype, pkgbase_id, merge_into=None):
+        self._user = db.query(User.Username).filter(
+            User.ID == uid).first().Username
+        self._pkgbase = db.query(PackageBase.Name).filter(
+            PackageBase.ID == pkgbase_id).first().Name
+
         self._to = aurweb.config.get('options', 'aur_request_ml')
-        self._cc = [row[0] for row in cur.fetchall()]
-        cur = conn.execute('SELECT Comments FROM PackageRequests WHERE ID = ?',
-                           [reqid])
-        self._text = cur.fetchone()[0]
+
+        query = db.query(PackageRequest).join(PackageBase).join(
+            PackageComaintainer,
+            PackageComaintainer.PackageBaseID == PackageRequest.PackageBaseID,
+            isouter=True
+        ).join(
+            User,
+            or_(User.ID == PackageRequest.UsersID,
+                User.ID == PackageBase.MaintainerUID,
+                User.ID == PackageComaintainer.UsersID)
+        ).filter(
+            and_(PackageRequest.ID == reqid,
+                 User.Suspended == 0)
+        ).with_entities(
+            User.Email
+        ).distinct()
+        self._cc = [u.Email for u in query]
+
+        pkgreq = db.query(PackageRequest.Comments).filter(
+            PackageRequest.ID == reqid).first()
+
+        self._text = pkgreq.Comments
         self._reqid = int(reqid)
         self._reqtype = reqtype
         self._merge_into = merge_into
@@ -490,29 +614,42 @@ class RequestOpenNotification(Notification):


 class RequestCloseNotification(Notification):
-    def __init__(self, conn, uid, reqid, reason):
-        self._user = username_from_id(conn, uid) if int(uid) else None
-        cur = conn.execute('SELECT DISTINCT Users.Email FROM PackageRequests ' +
-                           'INNER JOIN PackageBases ' +
-                           'ON PackageBases.ID = PackageRequests.PackageBaseID ' +
-                           'LEFT JOIN PackageComaintainers ' +
-                           'ON PackageComaintainers.PackageBaseID = PackageRequests.PackageBaseID ' +
-                           'INNER JOIN Users ' +
-                           'ON Users.ID = PackageRequests.UsersID ' +
-                           'OR Users.ID = PackageBases.MaintainerUID ' +
-                           'OR Users.ID = PackageComaintainers.UsersID ' +
-                           'WHERE PackageRequests.ID = ? AND ' +
-                           'Users.Suspended = 0', [reqid])
+    def __init__(self, uid, reqid, reason):
+        user = db.query(User.Username).filter(User.ID == uid).first()
+        self._user = user.Username if user else None
+
         self._to = aurweb.config.get('options', 'aur_request_ml')
-        self._cc = [row[0] for row in cur.fetchall()]
-        cur = conn.execute('SELECT PackageRequests.ClosureComment, ' +
-                           'RequestTypes.Name, ' +
-                           'PackageRequests.PackageBaseName ' +
-                           'FROM PackageRequests ' +
-                           'INNER JOIN RequestTypes ' +
-                           'ON RequestTypes.ID = PackageRequests.ReqTypeID ' +
-                           'WHERE PackageRequests.ID = ?', [reqid])
-        self._text, self._reqtype, self._pkgbase = cur.fetchone()
+
+        query = db.query(PackageRequest).join(PackageBase).join(
+            PackageComaintainer,
+            PackageComaintainer.PackageBaseID == PackageRequest.PackageBaseID,
+            isouter=True
+        ).join(
+            User,
+            or_(User.ID == PackageRequest.UsersID,
+                User.ID == PackageBase.MaintainerUID,
+                User.ID == PackageComaintainer.UsersID)
+        ).filter(
+            and_(PackageRequest.ID == reqid,
+                 User.Suspended == 0)
+        ).with_entities(
+            User.Email
+        ).distinct()
+        self._cc = [u.Email for u in query]
+
+        pkgreq = db.query(PackageRequest).join(RequestType).filter(
+            PackageRequest.ID == reqid
+        ).with_entities(
+            PackageRequest.ClosureComment,
+            RequestType.Name,
+            PackageRequest.PackageBaseName
+        ).first()
+
+        self._text = pkgreq.ClosureComment
+        self._reqtype = pkgreq.Name
+        self._pkgbase = pkgreq.PackageBaseName
+
         self._reqid = int(reqid)
         self._reason = reason

@@ -555,34 +692,41 @@ class RequestCloseNotification(Notification):


 class TUVoteReminderNotification(Notification):
-    def __init__(self, conn, vote_id):
+    def __init__(self, vote_id):
         self._vote_id = int(vote_id)
-        cur = conn.execute('SELECT Email, LangPreference FROM Users ' +
-                           'WHERE AccountTypeID IN (2, 4) AND ID NOT IN ' +
-                           '(SELECT UserID FROM TU_Votes ' +
-                           'WHERE TU_Votes.VoteID = ?) AND ' +
-                           'Users.Suspended = 0', [vote_id])
-        self._recipients = cur.fetchall()
+
+        subquery = db.query(TUVote.UserID).filter(TUVote.VoteID == vote_id)
+        query = db.query(User).filter(
+            and_(User.AccountTypeID.in_((2, 4)),
+                 ~User.ID.in_(subquery),
+                 User.Suspended == 0)
+        ).with_entities(
+            User.Email, User.LangPreference
+        )
+        self._recipients = [(u.Email, u.LangPreference) for u in query]

         super().__init__()

     def get_recipients(self):
         return self._recipients

     def get_subject(self, lang):
-        return self._l10n.translate('TU Vote Reminder: Proposal {id}',
-                                    lang).format(id=self._vote_id)
+        return aurweb.l10n.translator.translate(
+            'TU Vote Reminder: Proposal {id}',
+            lang).format(id=self._vote_id)

     def get_body(self, lang):
-        return self._l10n.translate(
+        return aurweb.l10n.translator.translate(
             'Please remember to cast your vote on proposal {id} [1]. '
             'The voting period ends in less than 48 hours.',
             lang).format(id=self._vote_id)

     def get_refs(self):
         return (aur_location + '/tu/?id=' + str(self._vote_id),)


 def main():
+    db.get_engine()
     action = sys.argv[1]
     action_map = {
         'send-resetkey': ResetKeyNotification,
@@ -600,14 +744,10 @@ def main():
         'tu-vote-reminder': TUVoteReminderNotification,
     }

-    conn = aurweb.db.Connection()
-
-    notification = action_map[action](conn, *sys.argv[2:])
+    with db.begin():
+        notification = action_map[action](*sys.argv[2:])
     notification.send()

-    conn.commit()
-    conn.close()


 if __name__ == '__main__':
     main()
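The `main()` hunk above keeps the same dispatch shape before and after the port: the first command-line argument selects a notification class from `action_map`, and the remaining arguments become constructor parameters. A stdlib-only sketch of that pattern, with a stand-in class rather than the real notification types:

```python
# Sketch of the action_map dispatch used by notify.py's main(). The class
# below is a hypothetical stand-in; the real map holds the notification
# classes shown in the diff.
class UpdateNotification:
    def __init__(self, uid, pkgbase_id):
        # CLI arguments arrive as strings; constructors coerce as needed.
        self.args = (int(uid), int(pkgbase_id))


ACTION_MAP = {'update': UpdateNotification}


def dispatch(argv):
    action, *rest = argv
    return ACTION_MAP[action](*rest)


notification = dispatch(['update', '7', '42'])
print(notification.args)  # (7, 42)
```

One consequence of this shape is visible in the diff: dropping the `conn` parameter from every constructor is what lets the new code pass `*sys.argv[2:]` straight through without threading a connection handle.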
@@ -1,19 +1,25 @@
 #!/usr/bin/env python3

-import time
+from sqlalchemy import and_

-import aurweb.db
+from aurweb import db, time
+from aurweb.models import PackageBase
+
+
+def _main():
+    # One day behind.
+    limit_to = time.utcnow() - 86400
+
+    query = db.query(PackageBase).filter(
+        and_(PackageBase.SubmittedTS < limit_to,
+             PackageBase.PackagerUID.is_(None)))
+    db.delete_all(query)


 def main():
-    conn = aurweb.db.Connection()
-
-    limit_to = int(time.time()) - 86400
-    conn.execute("DELETE FROM PackageBases WHERE " +
-                 "SubmittedTS < ? AND PackagerUID IS NULL", [limit_to])
-
-    conn.commit()
-    conn.close()
+    db.get_engine()
+    with db.begin():
+        _main()


 if __name__ == '__main__':
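The cleanup rule in the diff above is small but easy to misread: a package base is deleted only when it was submitted more than a day ago *and* never had a packager assigned. A plain-Python sketch of that predicate, with dicts standing in for `PackageBase` rows:

```python
# Sketch of the pkgbase cleanup predicate above: submitted over max_age
# seconds ago and PackagerUID still NULL. Dicts stand in for ORM rows.
def stale_bases(bases, now, max_age=86400):
    limit_to = now - max_age
    return [b for b in bases
            if b["SubmittedTS"] < limit_to and b["PackagerUID"] is None]


now = 1_000_000
bases = [
    {"Name": "never-pushed", "SubmittedTS": now - 2 * 86400, "PackagerUID": None},
    {"Name": "fresh", "SubmittedTS": now - 100, "PackagerUID": None},
    {"Name": "pushed", "SubmittedTS": now - 2 * 86400, "PackagerUID": 3},
]
print([b["Name"] for b in stale_bases(bases, now)])  # ['never-pushed']
```

The one-day grace period is why a freshly created, still-empty package base survives the nightly run.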
@@ -1,25 +1,70 @@
 #!/usr/bin/env python3

-import time
+from typing import List

-import aurweb.db
+from sqlalchemy import and_, func
+from sqlalchemy.sql.functions import coalesce
+from sqlalchemy.sql.functions import sum as _sum
+
+from aurweb import db, time
+from aurweb.models import PackageBase, PackageVote
+
+
+def run_variable(pkgbases: List[PackageBase] = []) -> None:
+    """
+    Update popularity on a list of PackageBases.
+
+    If no PackageBase is included, we update the popularity
+    of every PackageBase in the database.
+
+    :param pkgbases: List of PackageBase instances
+    """
+    now = time.utcnow()
+
+    # NumVotes subquery.
+    votes_subq = db.get_session().query(
+        func.count("*")
+    ).select_from(PackageVote).filter(
+        PackageVote.PackageBaseID == PackageBase.ID
+    )
+
+    # Popularity subquery.
+    pop_subq = db.get_session().query(
+        coalesce(_sum(func.pow(0.98, (now - PackageVote.VoteTS) / 86400)), 0.0),
+    ).select_from(PackageVote).filter(
+        and_(PackageVote.PackageBaseID == PackageBase.ID,
+             PackageVote.VoteTS.isnot(None))
+    )
+
+    with db.begin():
+        query = db.query(PackageBase)
+
+        ids = set()
+        if pkgbases:
+            ids = {pkgbase.ID for pkgbase in pkgbases}
+            query = query.filter(PackageBase.ID.in_(ids))
+
+        query.update({
+            "NumVotes": votes_subq.scalar_subquery(),
+            "Popularity": pop_subq.scalar_subquery()
+        })
+
+
+def run_single(pkgbase: PackageBase) -> None:
+    """ A single popupdate. The given pkgbase instance will be
+    refreshed after the database update is done.
+
+    NOTE: This function is compatible only with aurweb FastAPI.
+
+    :param pkgbase: Instance of db.PackageBase
+    """
+    run_variable([pkgbase])
+    db.refresh(pkgbase)


 def main():
-    conn = aurweb.db.Connection()
-
-    conn.execute("UPDATE PackageBases SET NumVotes = (" +
-                 "SELECT COUNT(*) FROM PackageVotes " +
-                 "WHERE PackageVotes.PackageBaseID = PackageBases.ID)")
-
-    now = int(time.time())
-    conn.execute("UPDATE PackageBases SET Popularity = (" +
-                 "SELECT COALESCE(SUM(POWER(0.98, (? - VoteTS) / 86400)), 0.0) " +
-                 "FROM PackageVotes WHERE PackageVotes.PackageBaseID = " +
-                 "PackageBases.ID AND NOT VoteTS IS NULL)", [now])
-
-    conn.commit()
-    conn.close()
+    db.get_engine()
+    run_variable()


 if __name__ == '__main__':
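Both the old SQL and the new ORM subquery above compute the same exponential decay: each vote contributes `0.98 ** (age_in_days)`, so a vote loses roughly 2% of its weight per day and a package with no recent votes drifts toward zero popularity. A pure-Python sketch of that formula:

```python
# Pure-Python equivalent of the Popularity expression in the diff above:
# SUM(POWER(0.98, (now - VoteTS) / 86400)) over a package's votes.
def popularity(vote_timestamps, now):
    return sum(0.98 ** ((now - ts) / 86400) for ts in vote_timestamps)


now = 1_000_000
votes = [now, now - 86400, now - 10 * 86400]  # today, yesterday, 10 days ago
print(round(popularity(votes, now), 4))
```

A vote cast today contributes 1.0, yesterday's vote 0.98, and a ten-day-old vote about 0.82, which is why popularity rankings favor recently voted packages over ones with a large but stale vote count.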
@@ -2,15 +2,18 @@
 import sys

+from xml.etree.ElementTree import Element
+
 import bleach
 import markdown
 import pygit2

 import aurweb.config
-import aurweb.db

-repo_path = aurweb.config.get('serve', 'repo-path')
-commit_uri = aurweb.config.get('options', 'commit_uri')
+from aurweb import db, logging, util
+from aurweb.models import PackageComment
+
+logger = logging.get_logger(__name__)


 class LinkifyExtension(markdown.extensions.Extension):
@@ -24,7 +27,7 @@ class LinkifyExtension(markdown.extensions.Extension):
     _urlre = (r'(\b(?:https?|ftp):\/\/[\w\/\#~:.?+=&%@!\-;,]+?'
               r'(?=[.:?\-;,]*(?:[^\w\/\#~:.?+=&%@!\-;,]|$)))')

-    def extendMarkdown(self, md, md_globals):
+    def extendMarkdown(self, md):
         processor = markdown.inlinepatterns.AutolinkInlineProcessor(self._urlre, md)
         # Register it right after the default <>-link processor (priority 120).
         md.inlinePatterns.register(processor, 'linkify', 119)
@@ -39,14 +42,14 @@ class FlysprayLinksInlineProcessor(markdown.inlinepatterns.InlineProcessor):
     """

     def handleMatch(self, m, data):
-        el = markdown.util.etree.Element('a')
+        el = Element('a')
         el.set('href', f'https://bugs.archlinux.org/task/{m.group(1)}')
         el.text = markdown.util.AtomicString(m.group(0))
-        return el, m.start(0), m.end(0)
+        return (el, m.start(0), m.end(0))


 class FlysprayLinksExtension(markdown.extensions.Extension):
-    def extendMarkdown(self, md, md_globals):
+    def extendMarkdown(self, md):
         processor = FlysprayLinksInlineProcessor(r'\bFS#(\d+)\b', md)
         md.inlinePatterns.register(processor, 'flyspray-links', 118)

@@ -60,9 +63,9 @@ class GitCommitsInlineProcessor(markdown.inlinepatterns.InlineProcessor):
     considered.
     """

-    _repo = pygit2.Repository(repo_path)
-
     def __init__(self, md, head):
+        repo_path = aurweb.config.get('serve', 'repo-path')
+        self._repo = pygit2.Repository(repo_path)
         self._head = head
         super().__init__(r'\b([0-9a-f]{7,40})\b', md)

@@ -70,18 +73,14 @@ class GitCommitsInlineProcessor(markdown.inlinepatterns.InlineProcessor):
         oid = m.group(1)
         if oid not in self._repo:
             # Unkwown OID; preserve the orginal text.
-            return None, None, None
+            return (None, None, None)

-        prefixlen = 12
-        while prefixlen < 40:
-            if oid[:prefixlen] in self._repo:
-                break
-            prefixlen += 1
-
-        el = markdown.util.etree.Element('a')
+        el = Element('a')
+        commit_uri = aurweb.config.get("options", "commit_uri")
+        prefixlen = util.git_search(self._repo, oid)
         el.set('href', commit_uri % (self._head, oid[:prefixlen]))
         el.text = markdown.util.AtomicString(oid[:prefixlen])
-        return el, m.start(0), m.end(0)
+        return (el, m.start(0), m.end(0))


 class GitCommitsExtension(markdown.extensions.Extension):
@@ -91,9 +90,12 @@ class GitCommitsExtension(markdown.extensions.Extension):
         self._head = head
         super(markdown.extensions.Extension, self).__init__()

-    def extendMarkdown(self, md, md_globals):
-        processor = GitCommitsInlineProcessor(md, self._head)
-        md.inlinePatterns.register(processor, 'git-commits', 117)
+    def extendMarkdown(self, md):
+        try:
+            processor = GitCommitsInlineProcessor(md, self._head)
+            md.inlinePatterns.register(processor, 'git-commits', 117)
+        except pygit2.GitError:
+            logger.error(f"No git repository found for '{self._head}'.")


 class HeadingTreeprocessor(markdown.treeprocessors.Treeprocessor):
@@ -106,42 +108,46 @@ class HeadingTreeprocessor(markdown.treeprocessors.Treeprocessor):
|
||||||
|
|
||||||
|
|
||||||
class HeadingExtension(markdown.extensions.Extension):
|
class HeadingExtension(markdown.extensions.Extension):
|
||||||
def extendMarkdown(self, md, md_globals):
|
def extendMarkdown(self, md):
|
||||||
# Priority doesn't matter since we don't conflict with other processors.
|
# Priority doesn't matter since we don't conflict with other processors.
|
||||||
md.treeprocessors.register(HeadingTreeprocessor(md), 'heading', 30)
|
md.treeprocessors.register(HeadingTreeprocessor(md), 'heading', 30)
|
||||||
|
|
||||||
|
|
||||||
def get_comment(conn, commentid):
|
def save_rendered_comment(comment: PackageComment, html: str):
|
||||||
cur = conn.execute('SELECT PackageComments.Comments, PackageBases.Name '
|
with db.begin():
|
||||||
'FROM PackageComments INNER JOIN PackageBases '
|
comment.RenderedComment = html
|
||||||
'ON PackageBases.ID = PackageComments.PackageBaseID '
|
|
||||||
'WHERE PackageComments.ID = ?', [commentid])
|
|
||||||
return cur.fetchone()
|
|
||||||
|
|
||||||
|
|
||||||
def save_rendered_comment(conn, commentid, html):
|
def update_comment_render_fastapi(comment: PackageComment) -> None:
|
||||||
conn.execute('UPDATE PackageComments SET RenderedComment = ? WHERE ID = ?',
|
update_comment_render(comment)
|
||||||
[html, commentid])
|
|
||||||
|
|
||||||
|
def update_comment_render(comment: PackageComment) -> None:
|
||||||
|
text = comment.Comments
|
||||||
|
pkgbasename = comment.PackageBase.Name
|
||||||
|
|
||||||
|
html = markdown.markdown(text, extensions=[
|
||||||
|
'fenced_code',
|
||||||
|
LinkifyExtension(),
|
||||||
|
FlysprayLinksExtension(),
|
||||||
|
GitCommitsExtension(pkgbasename),
|
||||||
|
HeadingExtension()
|
||||||
|
])
|
||||||
|
|
||||||
|
allowed_tags = (bleach.sanitizer.ALLOWED_TAGS
|
||||||
|
+ ['p', 'pre', 'h4', 'h5', 'h6', 'br', 'hr'])
|
||||||
|
html = bleach.clean(html, tags=allowed_tags)
|
||||||
|
save_rendered_comment(comment, html)
|
||||||
|
db.refresh(comment)
|
||||||
|
|
||||||
|
|
||||||
def main():
|
def main():
|
||||||
commentid = int(sys.argv[1])
|
db.get_engine()
|
||||||
|
comment_id = int(sys.argv[1])
|
||||||
conn = aurweb.db.Connection()
|
comment = db.query(PackageComment).filter(
|
||||||
|
PackageComment.ID == comment_id
|
||||||
text, pkgbase = get_comment(conn, commentid)
|
).first()
|
||||||
html = markdown.markdown(text, extensions=['fenced_code',
|
update_comment_render(comment)
|
||||||
LinkifyExtension(),
|
|
||||||
FlysprayLinksExtension(),
|
|
||||||
GitCommitsExtension(pkgbase),
|
|
||||||
HeadingExtension()])
|
|
||||||
allowed_tags = (bleach.sanitizer.ALLOWED_TAGS +
|
|
||||||
['p', 'pre', 'h4', 'h5', 'h6', 'br', 'hr'])
|
|
||||||
html = bleach.clean(html, tags=allowed_tags)
|
|
||||||
save_rendered_comment(conn, commentid, html)
|
|
||||||
|
|
||||||
conn.commit()
|
|
||||||
conn.close()
|
|
||||||
|
|
||||||
|
|
||||||
if __name__ == '__main__':
|
if __name__ == '__main__':
|
||||||
|
|
|
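The removed loop searched for the shortest commit-hash prefix (minimum 12 characters) that still resolves in the repository; the new code delegates this to `util.git_search`, which is not shown in this diff. Assuming `util.git_search` keeps the same semantics, the search can be sketched with the standard library only (the helper name and sample OIDs below are illustrative):

```python
def shortest_unique_prefix(oid: str, known_oids: set, start: int = 12) -> int:
    """Return the length of the shortest prefix of `oid`, at least
    `start` characters long, that matches exactly one known object ID."""
    for length in range(start, len(oid) + 1):
        prefix = oid[:length]
        if sum(1 for o in known_oids if o.startswith(prefix)) == 1:
            return length
    return len(oid)


oids = {
    "1a2b3c4d5e6f7a8b9c0d1e2f3a4b5c6d7e8f9a0b",
    "1a2b3c4d5e6f0000000000000000000000000000",
}
# The two OIDs share their first 12 hex digits, so 13 characters are needed.
prefixlen = shortest_unique_prefix(
    "1a2b3c4d5e6f7a8b9c0d1e2f3a4b5c6d7e8f9a0b", oids)
```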
@@ -1,27 +1,34 @@
 #!/usr/bin/env python3

-import subprocess
-import time
+from sqlalchemy import and_

 import aurweb.config
-import aurweb.db
+
+from aurweb import db, time
+from aurweb.models import TUVoteInfo
+from aurweb.scripts import notify

 notify_cmd = aurweb.config.get('notifications', 'notify-cmd')


 def main():
-    conn = aurweb.db.Connection()
+    db.get_engine()

-    now = int(time.time())
-    filter_from = now + 500
-    filter_to = now + 172800
-
-    cur = conn.execute("SELECT ID FROM TU_VoteInfo " +
-                       "WHERE End >= ? AND End <= ?",
-                       [filter_from, filter_to])
-
-    for vote_id in [row[0] for row in cur.fetchall()]:
-        subprocess.Popen((notify_cmd, 'tu-vote-reminder', str(vote_id))).wait()
+    now = time.utcnow()
+
+    start = aurweb.config.getint("tuvotereminder", "range_start")
+    filter_from = now + start
+
+    end = aurweb.config.getint("tuvotereminder", "range_end")
+    filter_to = now + end
+
+    query = db.query(TUVoteInfo.ID).filter(
+        and_(TUVoteInfo.End >= filter_from,
+             TUVoteInfo.End <= filter_to)
+    )
+    for voteinfo in query:
+        notif = notify.TUVoteReminderNotification(voteinfo.ID)
+        notif.send()


 if __name__ == '__main__':
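The rewritten reminder script replaces the hard-coded offsets (500 and 172800 seconds) with `tuvotereminder.range_start` and `range_end` configuration options, but the selection logic is unchanged: a vote qualifies when its `End` timestamp lies in an inclusive window around the current time. A minimal sketch of that predicate (the sample timestamp is arbitrary):

```python
def in_reminder_window(end: int, now: int,
                       range_start: int, range_end: int) -> bool:
    """A vote qualifies for a reminder when its End timestamp falls
    inside the inclusive window [now + range_start, now + range_end]."""
    return now + range_start <= end <= now + range_end


now = 1_600_000_000
hit = in_reminder_window(now + 500, now, 500, 172800)    # lower bound hit
miss = in_reminder_window(now + 499, now, 500, 172800)   # just outside
```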
@@ -1,21 +1,29 @@
 #!/usr/bin/env python3

-import time
+from sqlalchemy import update

-import aurweb.db
+from aurweb import db, time
+from aurweb.models import User
+
+
+def _main():
+    limit_to = time.utcnow() - 86400 * 7
+
+    update_ = update(User).where(
+        User.LastLogin < limit_to
+    ).values(LastLoginIPAddress=None)
+    db.get_session().execute(update_)
+
+    update_ = update(User).where(
+        User.LastSSHLogin < limit_to
+    ).values(LastSSHLoginIPAddress=None)
+    db.get_session().execute(update_)


 def main():
-    conn = aurweb.db.Connection()
-
-    limit_to = int(time.time()) - 86400 * 7
-    conn.execute("UPDATE Users SET LastLoginIPAddress = NULL " +
-                 "WHERE LastLogin < ?", [limit_to])
-    conn.execute("UPDATE Users SET LastSSHLoginIPAddress = NULL " +
-                 "WHERE LastSSHLogin < ?", [limit_to])
-
-    conn.commit()
-    conn.close()
+    db.get_engine()
+    with db.begin():
+        _main()


 if __name__ == '__main__':
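Both the old raw-SQL version and the new `sqlalchemy.update()` version of this maintenance script do the same thing: scrub login IP addresses older than seven days. A self-contained sketch of that behavior using the standard library's `sqlite3` (purely illustrative — the release itself drops SQLite support; the table here is a two-row stand-in for `Users`):

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Users (ID INTEGER, LastLogin INTEGER, "
             "LastLoginIPAddress TEXT)")

now = int(time.time())
conn.executemany(
    "INSERT INTO Users VALUES (?, ?, ?)",
    [(1, now, "10.0.0.1"),                  # logged in today: IP kept
     (2, now - 86400 * 30, "10.0.0.2")])    # stale for 30 days: IP scrubbed

# Null out IP addresses for users who have not logged in for a week.
limit_to = now - 86400 * 7
conn.execute("UPDATE Users SET LastLoginIPAddress = NULL "
             "WHERE LastLogin < ?", [limit_to])

rows = conn.execute(
    "SELECT ID, LastLoginIPAddress FROM Users ORDER BY ID").fetchall()
# rows == [(1, "10.0.0.1"), (2, None)]
```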
aurweb/spawn.py (167 changed lines)

@@ -16,15 +16,24 @@ import subprocess
 import sys
 import tempfile
 import time
-import urllib
+
+from typing import Iterable, List

 import aurweb.config
 import aurweb.schema

+from aurweb.exceptions import AurwebException
+
 children = []
 temporary_dir = None
 verbosity = 0
 asgi_backend = ''
+workers = 1
+
+PHP_BINARY = os.environ.get("PHP_BINARY", "php")
+PHP_MODULES = ["pdo_mysql", "pdo_sqlite"]
+PHP_NGINX_PORT = int(os.environ.get("PHP_NGINX_PORT", 8001))
+FASTAPI_NGINX_PORT = int(os.environ.get("FASTAPI_NGINX_PORT", 8002))


 class ProcessExceptions(Exception):
@@ -40,14 +49,45 @@ class ProcessExceptions(Exception):
         super().__init__("\n- ".join(messages))


+def validate_php_config() -> None:
+    """
+    Perform a validation check against PHP_BINARY's configuration.
+
+    AurwebException is raised here if checks fail to pass. We require
+    the 'pdo_mysql' and 'pdo_sqlite' modules to be enabled.
+
+    :raises: AurwebException
+    :return: None
+    """
+    try:
+        proc = subprocess.Popen([PHP_BINARY, "-m"],
+                                stdout=subprocess.PIPE,
+                                stderr=subprocess.PIPE)
+        out, _ = proc.communicate()
+    except FileNotFoundError:
+        raise AurwebException(f"Unable to locate the '{PHP_BINARY}' "
+                              "executable.")
+
+    assert proc.returncode == 0, ("Received non-zero error code "
+                                  f"{proc.returncode} from '{PHP_BINARY}'.")
+
+    modules = out.decode().splitlines()
+    for module in PHP_MODULES:
+        if module not in modules:
+            raise AurwebException(
+                f"PHP does not have the '{module}' module enabled.")
+
+
 def generate_nginx_config():
     """
     Generate an nginx configuration based on aurweb's configuration.
     The file is generated under `temporary_dir`.
     Returns the path to the created configuration file.
     """
-    aur_location = aurweb.config.get("options", "aur_location")
-    aur_location_parts = urllib.parse.urlsplit(aur_location)
+    php_bind = aurweb.config.get("php", "bind_address")
+    php_host = php_bind.split(":")[0]
+    fastapi_bind = aurweb.config.get("fastapi", "bind_address")
+    fastapi_host = fastapi_bind.split(":")[0]
     config_path = os.path.join(temporary_dir, "nginx.conf")
     config = open(config_path, "w")
     # We double nginx's braces because they conflict with Python's f-strings.
@@ -58,13 +98,29 @@ def generate_nginx_config():
     pid {os.path.join(temporary_dir, "nginx.pid")};
     http {{
         access_log /dev/stdout;
+        client_body_temp_path {os.path.join(temporary_dir, "client_body")};
+        proxy_temp_path {os.path.join(temporary_dir, "proxy")};
+        fastcgi_temp_path {os.path.join(temporary_dir, "fastcgi")}1 2;
+        uwsgi_temp_path {os.path.join(temporary_dir, "uwsgi")};
+        scgi_temp_path {os.path.join(temporary_dir, "scgi")};
         server {{
-            listen {aur_location_parts.netloc};
+            listen {php_host}:{PHP_NGINX_PORT};
             location / {{
-                proxy_pass http://{aurweb.config.get("php", "bind_address")};
+                proxy_pass http://{php_bind};
             }}
-            location /sso {{
-                proxy_pass http://{aurweb.config.get("fastapi", "bind_address")};
+        }}
+        server {{
+            listen {fastapi_host}:{FASTAPI_NGINX_PORT};
+            location / {{
+                try_files $uri @proxy_to_app;
+            }}
+            location @proxy_to_app {{
+                proxy_set_header Host $http_host;
+                proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
+                proxy_set_header X-Forwarded-Proto $scheme;
+                proxy_redirect off;
+                proxy_buffering off;
+                proxy_pass http://{fastapi_bind};
             }}
         }}
     }}
@@ -107,32 +163,56 @@ def start():

     # PHP
     php_address = aurweb.config.get("php", "bind_address")
+    php_host = php_address.split(":")[0]
     htmldir = aurweb.config.get("php", "htmldir")
     spawn_child(["php", "-S", php_address, "-t", htmldir])

     # FastAPI
-    host, port = aurweb.config.get("fastapi", "bind_address").rsplit(":", 1)
-    if asgi_backend == "hypercorn":
-        portargs = ["-b", f"{host}:{port}"]
-    elif asgi_backend == "uvicorn":
-        portargs = ["--host", host, "--port", port]
-    spawn_child(["python", "-m", asgi_backend] + portargs + ["aurweb.asgi:app"])
+    fastapi_host, fastapi_port = aurweb.config.get(
+        "fastapi", "bind_address").rsplit(":", 1)
+
+    # Logging config.
+    aurwebdir = aurweb.config.get("options", "aurwebdir")
+    fastapi_log_config = os.path.join(aurwebdir, "logging.conf")
+
+    backend_args = {
+        "hypercorn": ["-b", f"{fastapi_host}:{fastapi_port}"],
+        "uvicorn": ["--host", fastapi_host, "--port", fastapi_port],
+        "gunicorn": ["--bind", f"{fastapi_host}:{fastapi_port}",
                     "-k", "uvicorn.workers.UvicornWorker",
                     "-w", str(workers)]
+    }
+    backend_args = backend_args.get(asgi_backend)
+    spawn_child([
+        "python", "-m", asgi_backend,
+        "--log-config", fastapi_log_config,
+    ] + backend_args + ["aurweb.asgi:app"])

     # nginx
     spawn_child(["nginx", "-p", temporary_dir, "-c", generate_nginx_config()])

+    print(f"""
+ > Started nginx.
+ >
+ > PHP backend: http://{php_address}
+ > FastAPI backend: http://{fastapi_host}:{fastapi_port}
+ >
+ > PHP frontend: http://{php_host}:{PHP_NGINX_PORT}
+ > FastAPI frontend: http://{fastapi_host}:{FASTAPI_NGINX_PORT}
+ >
+ > Frontends are hosted via nginx and should be preferred.
+""")

-def stop():
-    """
-    Stop all the child processes.
-
-    If an exception occurs during the process, the process continues anyway
-    because we don’t want to leave runaway processes around, and all the
-    exceptions are finally raised as a single ProcessExceptions.
+
+def _kill_children(children: Iterable, exceptions: List[Exception] = []) \
+        -> List[Exception]:
+    """
+    Kill each process found in `children`.
+
+    :param children: Iterable of child processes
+    :param exceptions: Exception memo
+    :return: `exceptions`
     """
-    global children
-    atexit.unregister(stop)
-    exceptions = []
     for p in children:
         try:
             p.terminate()
@@ -140,6 +220,18 @@
             print(f":: Sent SIGTERM to {p.args}", file=sys.stderr)
         except Exception as e:
             exceptions.append(e)
+    return exceptions
+
+
+def _wait_for_children(children: Iterable, exceptions: List[Exception] = []) \
+        -> List[Exception]:
+    """
+    Wait for each process to end found in `children`.
+
+    :param children: Iterable of child processes
+    :param exceptions: Exception memo
+    :return: `exceptions`
+    """
     for p in children:
         try:
             rc = p.wait()
@@ -149,6 +241,24 @@
                 raise Exception(f"Process {p.args} exited with {rc}")
         except Exception as e:
             exceptions.append(e)
+    return exceptions
+
+
+def stop() -> None:
+    """
+    Stop all the child processes.
+
+    If an exception occurs during the process, the process continues anyway
+    because we don’t want to leave runaway processes around, and all the
+    exceptions are finally raised as a single ProcessExceptions.
+
+    :raises: ProcessException
+    :return: None
+    """
+    global children
+    atexit.unregister(stop)
+    exceptions = _kill_children(children)
+    exceptions = _wait_for_children(children, exceptions)
     children = []
     if exceptions:
         raise ProcessExceptions("Errors terminating the child processes:",
@@ -161,11 +271,22 @@ if __name__ == '__main__':
                                      description='Start aurweb\'s test server.')
     parser.add_argument('-v', '--verbose', action='count', default=0,
                         help='increase verbosity')
-    parser.add_argument('-b', '--backend', choices=['hypercorn', 'uvicorn'], default='hypercorn',
+    choices = ['hypercorn', 'gunicorn', 'uvicorn']
+    parser.add_argument('-b', '--backend', choices=choices, default='uvicorn',
                         help='asgi backend used to launch the python server')
+    parser.add_argument("-w", "--workers", default=1, type=int,
+                        help="number of workers to use in gunicorn")
     args = parser.parse_args()
+
+    try:
+        validate_php_config()
+    except AurwebException as exc:
+        print(f"error: {str(exc)}")
+        sys.exit(1)
+
     verbosity = args.verbose
     asgi_backend = args.backend
+    workers = args.workers
     with tempfile.TemporaryDirectory(prefix="aurweb-") as tmpdirname:
         temporary_dir = tmpdirname
         start()
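spawn.py's `-b/--backend` flag now accepts gunicorn alongside hypercorn and uvicorn, with uvicorn as the new default, and `-w/--workers` sizes the gunicorn worker pool. How the flags map onto a backend command line can be sketched as follows (`localhost:8000` is a placeholder bind address, not aurweb's configured one):

```python
import argparse

parser = argparse.ArgumentParser(description="Start aurweb's test server.")
choices = ['hypercorn', 'gunicorn', 'uvicorn']
parser.add_argument('-b', '--backend', choices=choices, default='uvicorn',
                    help='asgi backend used to launch the python server')
parser.add_argument('-w', '--workers', default=1, type=int,
                    help='number of workers to use in gunicorn')

# Parse a sample command line instead of sys.argv.
args = parser.parse_args(['-b', 'gunicorn', '-w', '4'])

# Per-backend argument lists, keyed by backend name.
backend_args = {
    'hypercorn': ['-b', 'localhost:8000'],
    'uvicorn': ['--host', 'localhost', '--port', '8000'],
    'gunicorn': ['--bind', 'localhost:8000',
                 '-k', 'uvicorn.workers.UvicornWorker',
                 '-w', str(args.workers)],
}[args.backend]
```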
aurweb/templates.py (new file, 140 lines)

import copy
import functools
import os

from http import HTTPStatus
from typing import Callable

import jinja2

from fastapi import Request
from fastapi.responses import HTMLResponse

import aurweb.config

from aurweb import cookies, l10n, time

# Prepare jinja2 objects.
_loader = jinja2.FileSystemLoader(os.path.join(
    aurweb.config.get("options", "aurwebdir"), "templates"))
_env = jinja2.Environment(loader=_loader, autoescape=True,
                          extensions=["jinja2.ext.i18n"])


def register_filter(name: str) -> Callable:
    """ A decorator that can be used to register a filter.

    Example
        @register_filter("some_filter")
        def some_filter(some_value: str) -> str:
            return some_value.replace("-", "_")

    Jinja2
        {{ 'blah-blah' | some_filter }}

    :param name: Filter name
    :return: Callable used for filter
    """
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            return func(*args, **kwargs)
        _env.filters[name] = wrapper
        return wrapper
    return decorator


def register_function(name: str) -> Callable:
    """ A decorator that can be used to register a function.
    """
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            return func(*args, **kwargs)
        if name in _env.globals:
            raise KeyError(f"Jinja already has a function named '{name}'")
        _env.globals[name] = wrapper
        return wrapper
    return decorator


def make_context(request: Request, title: str, next: str = None):
    """ Create a context for a jinja2 TemplateResponse. """
    import aurweb.auth.creds

    commit_url = aurweb.config.get_with_fallback("devel", "commit_url", None)
    commit_hash = aurweb.config.get_with_fallback("devel", "commit_hash", None)
    if commit_hash:
        # Shorten commit_hash to a short Git hash.
        commit_hash = commit_hash[:7]

    timezone = time.get_request_timezone(request)
    return {
        "request": request,
        "commit_url": commit_url,
        "commit_hash": commit_hash,
        "language": l10n.get_request_language(request),
        "languages": l10n.SUPPORTED_LANGUAGES,
        "timezone": timezone,
        "timezones": time.SUPPORTED_TIMEZONES,
        "title": title,
        "now": time.now(timezone),
        "utcnow": time.utcnow(),
        "config": aurweb.config,
        "creds": aurweb.auth.creds,
        "next": next if next else request.url.path,
        "version": os.environ.get("COMMIT_HASH", aurweb.config.AURWEB_VERSION)
    }


async def make_variable_context(request: Request, title: str, next: str = None):
    """ Make a context with variables provided by the user
    (query params via GET or form data via POST). """
    context = make_context(request, title, next)
    to_copy = dict(request.query_params) \
        if request.method.lower() == "get" \
        else dict(await request.form())

    for k, v in to_copy.items():
        context[k] = v

    return context


def base_template(path: str):
    templates = copy.copy(_env)
    return templates.get_template(path)


def render_raw_template(request: Request, path: str, context: dict):
    """ Render a Jinja2 multi-lingual template with some context. """
    # Create a deep copy of our jinja2 _environment. The _environment in
    # total by itself is 48 bytes large (according to sys.getsizeof).
    # This is done so we can install gettext translations on the template
    # _environment being rendered without installing them into a global
    # which is reused in this function.
    templates = copy.copy(_env)

    translator = l10n.get_raw_translator_for_request(context.get("request"))
    templates.install_gettext_translations(translator)

    template = templates.get_template(path)
    return template.render(context)


def render_template(request: Request,
                    path: str,
                    context: dict,
                    status_code: HTTPStatus = HTTPStatus.OK):
    """ Render a template as an HTMLResponse. """
    rendered = render_raw_template(request, path, context)
    response = HTMLResponse(rendered, status_code=int(status_code))

    sid = None
    if request.user.is_authenticated():
        sid = request.cookies.get("AURSID")

    # Re-emit SID via update_response_cookies with an updated expiration.
    # This extends the life of a user session based on the AURREMEMBER
    # cookie, which is always set to the "Remember Me" state on login.
    return cookies.update_response_cookies(request, response, aursid=sid)
aurweb/testing/__init__.py (new file, 68 lines)

import aurweb.db

from aurweb import models


def setup_test_db(*args):
    """ This function is to be used to setup a test database before
    using it. It takes a variable number of table strings, and for
    each table in that set of table strings, it deletes all records.

    The primary goal of this method is to configure empty tables
    that tests can use from scratch. This means that tests using
    this function should make sure they do not depend on external
    records and keep their logic self-contained.

    Generally used inside of pytest fixtures, this function
    can be used anywhere, but keep in mind its functionality when
    doing so.

    Examples:
        setup_test_db("Users", "Sessions")

        test_tables = ["Users", "Sessions"];
        setup_test_db(*test_tables)
    """
    # Make sure that we've grabbed the engine before using the session.
    aurweb.db.get_engine()

    tables = list(args)
    if not tables:
        tables = [
            models.AcceptedTerm.__tablename__,
            models.ApiRateLimit.__tablename__,
            models.Ban.__tablename__,
            models.Group.__tablename__,
            models.License.__tablename__,
            models.OfficialProvider.__tablename__,
            models.Package.__tablename__,
            models.PackageBase.__tablename__,
            models.PackageBlacklist.__tablename__,
            models.PackageComaintainer.__tablename__,
            models.PackageComment.__tablename__,
            models.PackageDependency.__tablename__,
            models.PackageGroup.__tablename__,
            models.PackageKeyword.__tablename__,
            models.PackageLicense.__tablename__,
            models.PackageNotification.__tablename__,
            models.PackageRelation.__tablename__,
            models.PackageRequest.__tablename__,
            models.PackageSource.__tablename__,
            models.PackageVote.__tablename__,
            models.Session.__tablename__,
            models.SSHPubKey.__tablename__,
            models.Term.__tablename__,
            models.TUVote.__tablename__,
            models.TUVoteInfo.__tablename__,
            models.User.__tablename__,
        ]

    aurweb.db.get_session().execute("SET FOREIGN_KEY_CHECKS = 0")
    for table in tables:
        aurweb.db.get_session().execute(f"DELETE FROM {table}")
    aurweb.db.get_session().execute("SET FOREIGN_KEY_CHECKS = 1")
    aurweb.db.get_session().expunge_all()


def noop(*args, **kwargs) -> None:
    return
aurweb/testing/alpm.py (new file, 87 lines)

import hashlib
import os
import re
import shutil
import subprocess

from typing import List

from aurweb import logging, util
from aurweb.templates import base_template

logger = logging.get_logger(__name__)


class AlpmDatabase:
    """
    Fake libalpm database management class.

    This class can be used to add or remove packages from a
    test repository.
    """
    repo = "test"

    def __init__(self, database_root: str):
        self.root = database_root
        self.local = os.path.join(self.root, "local")
        self.remote = os.path.join(self.root, "remote")
        self.repopath = os.path.join(self.remote, self.repo)

        # Make directories.
        os.makedirs(self.local)
        os.makedirs(self.remote)

    def _get_pkgdir(self, pkgname: str, pkgver: str, repo: str) -> str:
        pkgfile = f"{pkgname}-{pkgver}-1"
        pkgdir = os.path.join(self.remote, repo, pkgfile)
        os.makedirs(pkgdir)
        return pkgdir

    def add(self, pkgname: str, pkgver: str, arch: str,
            provides: List[str] = []) -> None:
        context = {
            "pkgname": pkgname,
            "pkgver": pkgver,
            "arch": arch,
            "provides": provides
        }
        template = base_template("testing/alpm_package.j2")
        pkgdir = self._get_pkgdir(pkgname, pkgver, self.repo)
        desc = os.path.join(pkgdir, "desc")
        with open(desc, "w") as f:
            f.write(template.render(context))

        self.compile()

    def remove(self, pkgname: str):
        files = os.listdir(self.repopath)
        logger.info(f"Files: {files}")
        expr = "^" + pkgname + r"-[0-9.]+-1$"
        logger.info(f"Expression: {expr}")
        to_delete = filter(lambda e: re.match(expr, e), files)

        for target in to_delete:
            logger.info(f"Deleting {target}")
            path = os.path.join(self.repopath, target)
            shutil.rmtree(path)

        self.compile()

    def clean(self) -> None:
        db_file = os.path.join(self.remote, "test.db")
        try:
            os.remove(db_file)
        except Exception:
            pass

    def compile(self) -> None:
        self.clean()
        cmdline = ["bash", "-c", "bsdtar -czvf ../test.db *"]
        proc = subprocess.run(cmdline, cwd=self.repopath)
        assert proc.returncode == 0, \
            f"Bad return code while creating alpm database: {proc.returncode}"

        # Print out the md5 hash value of the new test.db.
        test_db = os.path.join(self.remote, "test.db")
        db_hash = util.file_hash(test_db, hashlib.md5)
        logger.debug(f"{test_db}: {db_hash}")
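`AlpmDatabase.remove()` locates a package's directories by matching `{pkgname}-{version}-1` names against a regular expression. The matching step in isolation (`re.escape` is added here defensively and is not in the original, which interpolates `pkgname` directly; the entry names are made up):

```python
import re


def matching_package_dirs(pkgname: str, entries: list) -> list:
    """Select entries named `{pkgname}-{version}-1`, the pattern that
    AlpmDatabase.remove() uses to locate a package's directories."""
    expr = "^" + re.escape(pkgname) + r"-[0-9.]+-1$"
    return [e for e in entries if re.match(expr, e)]


entries = ["pkg-1.0.0-1", "pkg-2.1-1", "pkg-extra-1.0-1", "other-1.0-1"]
matches = matching_package_dirs("pkg", entries)
# "pkg-extra-1.0-1" is excluded: "extra" is not a [0-9.]+ version string.
```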
155
aurweb/testing/email.py
Normal file
155
aurweb/testing/email.py
Normal file
|
@ -0,0 +1,155 @@
import base64
import binascii
import copy
import email
import os
import re
import sys

from typing import TextIO


class Email:
    """
    An email class used for testing.

    This class targets a specific serial of emails for PYTEST_CURRENT_TEST.
    As emails are sent out with util/sendmail, the serial number increases,
    starting at 1.

    Email content sent out by aurweb is always base64-encoded. Email.parse()
    decodes that for us and puts it into Email.body.

    Example:

        # Get the {test_suite}_{test_function}.1.txt email.
        email = Email(1).parse()
        print(email.body)
        print(email.headers)

    """
    TEST_DIR = "test-emails"

    def __init__(self, serial: int = 1, autoparse: bool = True):
        self.serial = serial
        self.content = self._get()

        if autoparse:
            self._parse()

    @staticmethod
    def email_prefix(suite: bool = False) -> str:
        """
        Get the email prefix.

        We find the email prefix by reducing PYTEST_CURRENT_TEST to
        {test_suite}_{test_function}. If `suite` is set, we reduce
        it to {test_suite} only.

        :param suite: Reduce PYTEST_CURRENT_TEST to {test_suite}
        :return: Email prefix with '/', '.', ',', and ':' chars replaced by '_'
        """
        value = os.environ.get("PYTEST_CURRENT_TEST", "email").split(" ")[0]
        if suite:
            value = value.split(":")[0]
        return re.sub(r'(\/|\.|,|:)', "_", value)

    @staticmethod
    def count() -> int:
        """
        Count the current number of emails sent from the test.

        This function is **only** supported inside of pytest functions.
        Do not use it elsewhere as data races will occur.

        :return: Number of emails sent by the current test
        """
        files = os.listdir(Email.TEST_DIR)
        prefix = Email.email_prefix()
        expr = "^" + prefix + r"\.\d+\.txt$"
        subset = filter(lambda e: re.match(expr, e), files)
        return len(list(subset))

    def _email_path(self) -> str:
        filename = self.email_prefix() + f".{self.serial}.txt"
        return os.path.join(Email.TEST_DIR, filename)

    def _get(self) -> str:
        """
        Get this email's content by reading its file.

        :return: Email content
        """
        path = self._email_path()
        with open(path) as f:
            return f.read()

    def _parse(self) -> "Email":
        """
        Parse this email and base64-decode the body.

        This function populates Email.message, Email.headers and Email.body.

        Additionally, after parsing, we write over our email file with
        self.glue()'d content (base64-decoded). This is done for ease
        of inspection by users.

        :return: self
        """
        self.message = email.message_from_string(self.content)
        self.headers = dict(self.message)

        # aurweb email notifications always have base64-encoded content.
        # Decode it here so self.body is human-readable.
        try:
            self.body = base64.b64decode(self.message.get_payload()).decode()
        except (binascii.Error, UnicodeDecodeError):
            self.body = self.message.get_payload()

        path = self._email_path()
        with open(path, "w") as f:
            f.write(self.glue())

        return self

    def parse(self) -> "Email":
        # Parsing already happened in __init__ when autoparse=True;
        # this method exists so callers can chain Email(n).parse().
        return self

    def glue(self) -> str:
        """
        Glue parsed content back into a complete email document, but
        base64-decoded this time.

        :return: Email document as a string
        """
        headers = copy.copy(self.headers)

        if "Content-Transfer-Encoding" in headers:
            headers.pop("Content-Transfer-Encoding")

        output = []
        for k, v in headers.items():
            output.append(f"{k}: {v}")
        output.append("")
        output.append(self.body)
        return "\n".join(output)

    @staticmethod
    def dump(file: TextIO = sys.stdout) -> None:
        """
        Dump the contents of all emails to `file`.

        This function is intended to be used to debug email issues
        while testing something relevant to email.

        :param file: Writable file object
        """
        lines = []
        for i in range(Email.count()):
            email = Email(i + 1)
            lines += [
                f"== Email #{i + 1} ==",
                email.glue(),
                f"== End of Email #{i + 1}"
            ]
        print("\n".join(lines), file=file)
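The decode step that `_parse()` performs, and the reassembly that `glue()` reverses it with, can be exercised in isolation using only the standard library. The sketch below builds a raw message the way the class docstring describes aurweb notifications (headers, a blank line, then a base64-encoded body); the addresses and subject here are made up for illustration, not aurweb output:

```python
import base64
import email

# Raw message: headers, a blank line, then a base64-encoded body,
# matching the format _parse() expects.
raw = "\n".join([
    "To: user@example.org",
    "Subject: AUR Package Update",
    "Content-Transfer-Encoding: base64",
    "",
    base64.b64encode(b"Your package was updated.").decode(),
])

message = email.message_from_string(raw)
headers = dict(message)  # header name -> value, as in Email.headers
body = base64.b64decode(message.get_payload()).decode()

print(headers["Subject"])  # AUR Package Update
print(body)                # Your package was updated.
```

This also shows why `_parse()` guards the decode with `binascii.Error` and `UnicodeDecodeError`: a payload that is not valid base64 (or not valid UTF-8 once decoded) falls back to the raw payload string.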