Mirror of https://gitlab.archlinux.org/archlinux/aurweb.git (synced 2025-02-03 10:43:03 +01:00)
Merge branch 'pu': pre-v6.0.0
Release v6.0.0 - Python
=======================

This documents UX and functional changes for the v6.0.0 aurweb release.
Following this release, we'll be working on a few very nice features noted
at the end of this article in Upcoming Work.

Preface
-------

This v6.0.0 release makes the long-awaited Python port official. Along with
the development of the Python port, we have modified a number of features.
There have been some integral changes to how package requests are dealt
with, so _Trusted Users_ should read the entirety of this document.

Legend
------

There are a few terms which I'd like to define to increase understanding of
these changes as they are listed:

- _self_ - Refers to a user viewing or doing something regarding their own
  account
- _/pkgbase/{name}/{action}_ - Refers to a POST action which can be
  triggered via the relevant package page at `/{pkgbase,packages}/{name}`

Grouped changes explained in multiple items will always be prefixed with the
same letter surrounded by brackets. Example:

- [A] Some feature that does something
- [A] The same feature where another thing has changed

Infrastructure
--------------

- Python packaging is now done with poetry.
- SQLite support has been removed. Even though SQLAlchemy is an ORM, SQLite
  lacks quite a few features found in full SQL servers, both out of the box
  and integrally, which forced us to account for the different database
  types. We now only support mysql, and should be able to support postgresql
  without much effort in the future.
  Note: Users wishing to spin up a database quickly can use
  `docker-compose up -d mariadb` for a Docker-hosted mariadb service.
- An example systemd service has been included at `examples/aurweb.service`.
- Example wrappers to `aurweb-git-(auth|serve|update)` have been included at
  `examples/aurweb-git-(auth|serve|update).sh` and should be used to call
  these scripts when aurweb is installed into a poetry virtualenv.

HTML
----

- Pagers have all been modified. They still serve the same purpose, but they
  have a slightly different display.
- Some markup and methods around the website have been changed for POST
  requests, and some forms have been completely reworked.

Package Requests
----------------

- Normal users can now view and close their own requests.
- [A] Requests can no longer be accepted through manual closures.
- [A] Requests are now closed via their relevant actions:
  - Deletion
    - Through the `/packages` bulk delete action
    - Through `/pkgbase/{name}/delete`
  - Merge
    - Through `/pkgbase/{name}/merge`
  - Orphan
    - Through the `/packages` bulk disown action
    - Through `/pkgbase/{name}/disown`
- Deletion and merge requests (and their closures) are now autogenerated if
  no pre-existing request exists. This was done to increase tracking of
  package modifications performed by those with access to do so (TUs).
- Deletion, merge and orphan request actions now close all (one or more)
  requests pertaining to the action performed. This comes with the downside
  of multiple notifications being sent out about a closure if more than one
  request (or no request) exists for the action.
- Merge actions now automatically reject other pre-existing merge requests
  with a mismatched `MergeBaseName` column when a merge action is performed.
- The last page of `/requests` no longer leads nowhere.

Package Bulk Actions: /packages
-------------------------------

- The `Merge into` field has been removed. Merges must now be performed via
  the `/pkgbase/{name}/merge` action.

Package View
------------

- Some package metadata (pkginfo) is no longer cached. Previously, this
  information was cached for one day by default. If we need to bring this
  back, we can.

TU Proposals
------------

- A valid username is now required for any addition or removal of a TU.

RPC
---

- `type=get-comment-form` has been removed and is now located at
  `/pkgbase/{name}/comments/{id}/form`.
- Support for versions 1-4 has been removed.
- JSON key ordering is different than PHP's JSON.
- `type=search` performance is overall slightly worse than PHP's. This
  should not heavily affect users, as a 3,000 record query is returned in
  roughly 0.20ms from a local standpoint. We will be working on this with
  the aim of pushing it past PHP.

Archives
--------

- Added metadata archive `packages-meta-v1.json.gz`.
- Added metadata archive `packages-meta-ext-v1.json.gz`.
  - Enable this by passing `--extended` to `aurweb-mkpkglists`.

Performance Changes
-------------------

As is expected from a complete rewrite of the website, performance has
changed across the board. In most places, the Python implementation now
performs better than the pre-existing PHP implementation, with the exception
of a few routes. Notably:

- `/` loads much quicker, as it is now persistently cached for five minutes
  at a time.
- `/packages` search is much quicker.
- `/packages/{name}` view is slightly slower; we are no longer caching
  various pieces of package info for `cache_pkginfo_ttl`, which defaults to
  86400 seconds, or one day.
- Request actions are slower due to the removal of the `via` parameter. We
  now query the database for requests related to the action based on the
  current state of the DB.
- `/rpc?type=info` queries are slightly quicker.
- `/rpc?type=search` queries with low result counts are quicker.
- `/rpc?type=search` queries with large result counts (> 2500) are slower.
  - We are not satisfied with this. We'll be working on pushing this over
    the edge along with the rest of the DB-intensive routes. However, the
    speed degradation is quite negligible for users' experience: 0.12ms PHP
    vs 0.15ms Python on a 3,000 record query on my local 4-core 8-thread
    system.

Upcoming Work
-------------

This release is the first major release of the Python implementation. We
have multiple tasks up for work immediately, which will bring us a few more
minor versions forward as they are completed.

- Update request and TU vote pagers
- Archive differentials
- Archive mimetypes
- (a) Git scripts to ORM conversion
- (a) Sharness removal
- Restriction of the number of requests users can submit
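For readers who want to try the release locally, a minimal sketch of the
Docker-hosted database and the extended archive generation mentioned above
(this assumes a checkout of the aurweb repository; whether `aurweb-mkpkglists`
is invoked directly or through `poetry run` depends on how aurweb was
installed):

    # Spin up the Docker-hosted mariadb service noted under Infrastructure.
    $ docker-compose up -d mariadb

    # Generate the package metadata archives; --extended also produces
    # packages-meta-ext-v1.json.gz as described under Archives.
    $ poetry run aurweb-mkpkglists --extended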
commit a467b18474
403 changed files with 78721 additions and 2947 deletions
@@ -2,17 +2,30 @@ T = $(sort $(wildcard *.t))
PROVE := $(shell command -v prove 2> /dev/null)

MAKEFLAGS = -j1

# IMPORTANT: `sh` should come somewhere AFTER `pytest`.
check: sh pytest

pytest:
        cd .. && coverage run --append /usr/bin/pytest test

ifdef PROVE
check:
sh:
        prove .
else
check: $(T)
sh: $(T)
endif

coverage:
        cd .. && coverage report --include='aurweb/*'
        cd .. && coverage xml --include='aurweb/*'

clean:
        $(RM) -r test-results/
        rm -f ../.coverage

$(T):
        @echo "*** $@ ***"; $(SHELL) $@
        @echo "*** $@ ***"; $(SHELL) $@ -v

.PHONY: check $(FOREIGN_TARGETS) clean $(T)
.PHONY: check coverage $(FOREIGN_TARGETS) clean $(T)
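If these Makefile targets land as shown in the hunk above, the suites and the
coverage report could be driven from the repository root roughly like this
(a sketch based on the target names in that hunk):

    # Run the sharness and pytest suites via the combined check target.
    $ make -C test check

    # Render the coverage report defined by the new coverage target.
    $ make -C test coverage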
test/README.md (186 lines changed)

@@ -1,37 +1,185 @@
aurweb Test Collection
======================

To run all tests, you may run `make -C test sh` and `pytest` within
the project root:

    $ make -C test sh       # Run Sharness tests.
    $ poetry run pytest     # Run Pytest suites.

For more control, you may use the `prove` or `pytest` command, which receives a
directory or a list of files to run, and produces a report.

Each test script is standalone, so you may run them individually. Some tests
may receive command-line options to help debugging.

Dependencies
------------

For all tests to run, dependencies provided via `poetry` are required:

    $ poetry install

Logging
-------

Tests also require the `logging.test.conf` logging configuration
file to be used. You can specify the `LOG_CONFIG` environment
variable to override:

    $ export LOG_CONFIG=logging.test.conf

`logging.test.conf` enables debug logging for the aurweb package
that we run tests against.

Test Configuration
------------------

To perform any tests, we need to supply `aurweb` with a valid
configuration. For development (and testing) purposes, an example
[conf/config.dev](../conf/config.dev) can be slightly modified.

Start off by copying `config.dev` to a new configuration.

    $ cp -v conf/config.dev conf/config

First, we must tell `aurweb` where the root of our project
lives by replacing `YOUR_AUR_ROOT` with the path to the aurweb
repository.

    $ sed -i "s;YOUR_AUR_ROOT;/path/to/aurweb;g" conf/config

Test Databases
--------------

Python tests create and drop hashed database names based on
`PYTEST_CURRENT_TEST`. To run tests with a database, the database
user must have privileges to create and drop their own databases.
Typically, this is the root user, but it can be configured for any
other user:

    GRANT ALL ON *.* TO 'user'@'localhost' WITH GRANT OPTION

The aurweb platform is intended to use the `mysql` backend, but
the `sqlite` backend is still used for sharness tests. These tests
will soon be replaced with pytest suites and `sqlite` removed.

After ensuring you've configured a test database, users can continue
on to [Running Tests](#running-tests).
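As a sketch of the grant above, a dedicated test account could be created
through your MySQL/MariaDB server; the `aur-test` user name and password
below are placeholders, not names used by aurweb itself:

    $ mysql -u root -p -e "CREATE USER 'aur-test'@'localhost' IDENTIFIED BY 'changeme'"
    $ mysql -u root -p -e "GRANT ALL ON *.* TO 'aur-test'@'localhost' WITH GRANT OPTION"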
Running tests
-------------

To run all the tests, you may run `make check` under `test/`.
Makefile test targets: `sh`, `clean`.

For more control, you may use the `prove` command, which receives a directory
or a list of files to run, and produces a report.
Recommended method of running tests: `pytest`.

Each test script is standalone, so you may run them individually. Some tests
may receive command-line options to help debugging. See for example sharness's
documentation for shell test scripts:
https://github.com/chriscool/sharness/blob/master/README.git
Legacy sharness tests: `make -C test sh`.
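Concretely, a single suite can be run either way; a short sketch (the pytest
module name below is a hypothetical example):

    # Run one legacy sharness script directly, with verbose output.
    $ ./t1100_git_auth.t --verbose

    # Run one pytest module through poetry.
    $ poetry run pytest test/test_example.py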
### Dependencies

aurweb is currently going through a refactor where the majority of
`sharness` tests have been replaced with `pytest` units. There are
still a few `sharness` tests around, and they are required to gain
as much coverage as possible over an entire test run. Users should
be writing `pytest` units for any new features.

For all the test to run, the following Arch packages should be installed:

- pyalpm
- python-alembic
- python-bleach
- python-markdown
- python-pygit2
- python-sqlalchemy
- python-srcinfo

Run tests from the project root.

    $ cd /path/to/aurweb

Writing tests
-------------

Test scripts must follow the Test Anything Protocol specification:

Ensure you have the proper `AUR_CONFIG` and `LOG_CONFIG` exported:

    $ export AUR_CONFIG=conf/config
    $ export LOG_CONFIG=logging.test.conf

To run `sharness` shell test suites (requires Arch Linux):

    $ make -C test sh

To run `pytest` Python test suites:

    # With poetry-installed aurweb
    $ poetry run pytest

    # With globally-installed aurweb
    $ pytest

After tests are run, one can produce coverage reports.

    # Print out a CLI coverage report.
    $ coverage report

    # Produce an HTML-based coverage report.
    $ coverage html
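Putting these together, a full coverage pass over the Python suites might
look like the following sketch; it mirrors the
`coverage run --append /usr/bin/pytest test` invocation from the test
Makefile hunk, and the `/usr/bin/pytest` path may differ on your system:

    $ export AUR_CONFIG=conf/config
    $ export LOG_CONFIG=logging.test.conf

    # Collect coverage while running the pytest suites.
    $ coverage run --append /usr/bin/pytest test

    # Summarize coverage for the aurweb package only.
    $ coverage report --include='aurweb/*'
    $ coverage html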
Writing Python tests (current)
------------------------------

Almost all of our `pytest` suites use the database in some way. There
are a few particular testing utilities in `aurweb` that one should
keep aware of to aid testing code:

- `db_test` pytest fixture
    - Prepares test databases for the module and cleans out database
      tables for each test function requiring this fixture.
- `aurweb.testing.requests.Request`
    - A fake stripped down version of `fastapi.Request` that can
      be passed to any functions in our codebase which use
      `fastapi.Request` parameters.

Example code:

    import pytest

    from aurweb import db
    from aurweb.models.account_type import USER_ID
    from aurweb.models.user import User
    from aurweb.testing.requests import Request

    # We need to use the `db_test` fixture at some point
    # during our test functions.
    @pytest.fixture(autouse=True)
    def setup(db_test: None) -> None:
        return

    # Or... specify it in a dependency fixture.
    @pytest.fixture
    def user(db_test: None) -> User:
        with db.begin():
            user = db.create(User, Username="test",
                             Email="test@example.org",
                             Passwd="testPassword",
                             AccountTypeID=USER_ID)
        yield user

    def test_user_login(user: User):
        assert isinstance(user, User) is True

        fake_request = Request()
        sid = user.login(fake_request, "testPassword")
        assert sid is not None

Writing Sharness tests (legacy)
-------------------------------

Shell test scripts must follow the Test Anything Protocol specification:
http://testanything.org/tap-specification.html

Python tests must be compatible with `pytest` and included in `pytest test/`
execution after setting up a configuration.

Tests must support being run from any directory. They may use $0 to determine
their location. Python scripts should expect aurweb to be installed and
importable without toying with os.path or PYTHONPATH.

Tests written in shell should use sharness. In general, new tests should be
consistent with existing tests unless they have a good reason not to.

Debugging Sharness tests
------------------------

By default, `make -C test` is quiet and does not print out verbose information
about tests being run. If a test is failing, one can look into verbose details
of sharness tests by executing them with the `--verbose` flag. Example:
`./t1100_git_auth.t --verbose`. This is particularly useful when tests happen
to fail in a remote continuous integration environment, where the reader does
not have complete access to the runner.
test/__init__.py (new file, empty)

test/conftest.py (new file, 231 lines)

@@ -0,0 +1,231 @@
"""
pytest configuration.

The conftest.py file is used to define pytest-global fixtures
or actions run before tests.

Module scoped fixtures:
----------------------
- setup_database
- db_session (depends: setup_database)

Function scoped fixtures:
------------------------
- db_test (depends: db_session)

Tests in aurweb which access the database **must** use the `db_test`
function fixture. Most database tests simply require this fixture in
an autouse=True setup fixture, or for fixtures used in DB tests example:

    # In scenarios which there are no other database fixtures
    # or other database fixtures dependency paths don't always
    # hit `db_test`.
    @pytest.fixture(autouse=True)
    def setup(db_test):
        return

    # In scenarios where we can embed the `db_test` fixture in
    # specific fixtures that already exist.
    @pytest.fixture
    def user(db_test):
        with db.begin():
            user = db.create(User, ...)
        yield user

The `db_test` fixture triggers our module-level database fixtures,
then clears the database for each test function run in that module.
It is done this way because migration has a large cost; migrating
ahead of each function takes too long when compared to this method.
"""
import os
import pathlib

from multiprocessing import Lock

import py
import pytest

from posix_ipc import O_CREAT, Semaphore
from sqlalchemy import create_engine
from sqlalchemy.engine import URL
from sqlalchemy.engine.base import Engine
from sqlalchemy.exc import ProgrammingError
from sqlalchemy.orm import scoped_session

import aurweb.config
import aurweb.db

from aurweb import initdb, logging, testing
from aurweb.testing.email import Email
from aurweb.testing.filelock import FileLock
from aurweb.testing.git import GitRepository

logger = logging.get_logger(__name__)

# Synchronization lock for database setup.
setup_lock = Lock()


def test_engine() -> Engine:
    """
    Return a privileged SQLAlchemy engine with no database.

    This method is particularly useful for providing an engine that
    can be used to create and drop databases from an SQL server.

    :return: SQLAlchemy Engine instance (not connected to a database)
    """
    unix_socket = aurweb.config.get_with_fallback("database", "socket", None)
    kwargs = {
        "username": aurweb.config.get("database", "user"),
        "password": aurweb.config.get_with_fallback(
            "database", "password", None),
        "host": aurweb.config.get("database", "host"),
        "port": aurweb.config.get_with_fallback("database", "port", None),
        "query": {
            "unix_socket": unix_socket
        }
    }

    backend = aurweb.config.get("database", "backend")
    driver = aurweb.db.DRIVERS.get(backend)
    return create_engine(URL.create(driver, **kwargs))


class AlembicArgs:
    """
    Masquerade an ArgumentParser like structure.

    This structure is needed to pass conftest-specific arguments
    to initdb.run duration database creation.
    """
    verbose = False
    use_alembic = True


def _create_database(engine: Engine, dbname: str) -> None:
    """
    Create a test database.

    :param engine: Engine returned by test_engine()
    :param dbname: Database name to create
    """
    conn = engine.connect()
    try:
        conn.execute(f"CREATE DATABASE {dbname}")
    except ProgrammingError:  # pragma: no cover
        # The database most likely already existed if we hit
        # a ProgrammingError. Just drop the database and try
        # again. If at that point things still fail, any
        # exception will be propogated up to the caller.
        conn.execute(f"DROP DATABASE {dbname}")
        conn.execute(f"CREATE DATABASE {dbname}")
    conn.close()
    initdb.run(AlembicArgs)


def _drop_database(engine: Engine, dbname: str) -> None:
    """
    Drop a test database.

    :param engine: Engine returned by test_engine()
    :param dbname: Database name to drop
    """
    aurweb.schema.metadata.drop_all(bind=engine)
    conn = engine.connect()
    conn.execute(f"DROP DATABASE {dbname}")
    conn.close()


def setup_email():
    # TODO: Fix this data race! This try/catch is ugly; why is it even
    # racing here? Perhaps we need to multiproc + multithread lock
    # inside of setup_database to block the check?
    with Semaphore("/test-emails", flags=O_CREAT, initial_value=1):
        if not os.path.exists(Email.TEST_DIR):
            # Create the directory.
            os.makedirs(Email.TEST_DIR)

        # Cleanup all email files for this test suite.
        prefix = Email.email_prefix(suite=True)
        files = os.listdir(Email.TEST_DIR)
        for file in files:
            if file.startswith(prefix):
                os.remove(os.path.join(Email.TEST_DIR, file))


@pytest.fixture(scope="module")
def setup_database(tmp_path_factory: pathlib.Path, worker_id: str) -> None:
    """ Create and drop a database for the suite this fixture is used in. """
    engine = test_engine()
    dbname = aurweb.db.name()

    if worker_id == "master":  # pragma: no cover
        # If we're not running tests through multiproc pytest-xdist.
        setup_email()
        yield _create_database(engine, dbname)
        _drop_database(engine, dbname)
        return

    def setup(path):
        setup_email()
        _create_database(engine, dbname)

    tmpdir = tmp_path_factory.getbasetemp().parent
    file_lock = FileLock(tmpdir, dbname)
    file_lock.lock(on_create=setup)
    yield  # Run the test function depending on this fixture.
    _drop_database(engine, dbname)  # Cleanup the database.


@pytest.fixture(scope="module")
def db_session(setup_database: None) -> scoped_session:
    """
    Yield a database session based on aurweb.db.name().

    The returned session is popped out of persistence after the test is run.
    """
    # After the test runs, aurweb.db.name() ends up returning the
    # configured database, because PYTEST_CURRENT_TEST is removed.
    dbname = aurweb.db.name()
    session = aurweb.db.get_session()

    yield session

    # Close the session and pop it.
    session.close()
    aurweb.db.pop_session(dbname)


@pytest.fixture
def db_test(db_session: scoped_session) -> None:
    """
    Database test fixture.

    This fixture should be included in any tests which access the
    database. It ensures that a test database is created and
    alembic migrated, takes care of dropping the database when
    the module is complete, and runs setup_test_db() to clear out
    tables for each test.

    Tests using this fixture should access the database
    session via aurweb.db.get_session().
    """
    testing.setup_test_db()


@pytest.fixture
def git(tmpdir: py.path.local) -> GitRepository:
    yield GitRepository(tmpdir)


@pytest.fixture
def email_test() -> None:
    """
    A decoupled test email setup fixture.

    When using the `db_test` fixture, this fixture is redundant. Otherwise,
    email tests need to run through our `setup_email` function to ensure
    that we set them up to be used via aurweb.testing.email.Email.
    """
    setup_email()
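The `worker_id` check and per-database `FileLock` in the conftest above are
there for multiprocess pytest-xdist runs; assuming `pytest-xdist` is available
in the environment, a parallel run is roughly:

    # Each worker/module gets its own hashed test database via the
    # setup_database and db_session fixtures defined in conftest.py.
    $ poetry run pytest -n auto test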
test/scripts/cover (new executable file, 13 lines)

@@ -0,0 +1,13 @@
#!/bin/sh
# This script is used by sharness tests hosted in our `test`
# directory. We require a concrete script to make using this easily,
# because we often call `env` in those tests.
#
# The purpose of this script is to allow sharness tests to gather
# Python coverage when calling scripts within `aurweb`.
#
TOPLEVEL=$(dirname "$0")/../..

# Define a COVERAGE_FILE in our root directory.
COVERAGE_FILE="$TOPLEVEL/.coverage" \
    coverage run -L --source="$TOPLEVEL/aurweb" --append "$@"
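As the sharness hunks below show, a test simply prefixes the script under test
with `cover` so its Python coverage is appended to the shared `.coverage` file
in the project root, for example:

    # From the git-serve tests below: run git-serve under coverage.
    cover "$GIT_SERVE" 2>&1 | grep -q "Interactive shell is disabled."

The same wrapper also works behind `env`, which is why the setup script adds
`test/scripts` to `PATH`.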
@@ -18,6 +18,11 @@ AURBLUP="$TOPLEVEL/aurweb/scripts/aurblup.py"
NOTIFY="$TOPLEVEL/aurweb/scripts/notify.py"
RENDERCOMMENT="$TOPLEVEL/aurweb/scripts/rendercomment.py"

# We reuse some of these scripts when running `env`, so add
# it to PATH; that way, env can pick up the script when loaded.
PATH="${PATH}:${TOPLEVEL}/test/scripts"
export PATH

# Create the configuration file and a dummy notification script.
cat >config <<-EOF
[database]

@@ -25,6 +30,7 @@ backend = sqlite
name = aur.db

[options]
aurwebdir = $TOPLEVEL
aur_location = https://aur.archlinux.org
aur_request_ml = aur-requests@lists.archlinux.org
enable-maintenance = 0

@@ -61,6 +67,7 @@ sync-dbs = test
server = file://$(pwd)/remote/

[mkpkglists]
archivedir = $(pwd)/archive
packagesfile = packages.gz
packagesmetafile = packages-meta-v1.json.gz
packagesmetaextfile = packages-meta-ext-v1.json.gz
@@ -4,24 +4,25 @@ test_description='git-auth tests'

. "$(dirname "$0")/setup.sh"

test_expect_success 'Test basic authentication.' '
"$GIT_AUTH" "$AUTH_KEYTYPE_USER" "$AUTH_KEYTEXT_USER" >out &&
cover "$GIT_AUTH" "$AUTH_KEYTYPE_USER" "$AUTH_KEYTEXT_USER" >out &&
grep -q AUR_USER=user out &&
grep -q AUR_PRIVILEGED=0 out
'

test_expect_success 'Test Trusted User authentication.' '
"$GIT_AUTH" "$AUTH_KEYTYPE_TU" "$AUTH_KEYTEXT_TU" >out &&
cover "$GIT_AUTH" "$AUTH_KEYTYPE_TU" "$AUTH_KEYTEXT_TU" >out &&
grep -q AUR_USER=tu out &&
grep -q AUR_PRIVILEGED=1 out
'

test_expect_success 'Test authentication with an unsupported key type.' '
test_must_fail "$GIT_AUTH" ssh-xxx "$AUTH_KEYTEXT_USER"
test_must_fail cover "$GIT_AUTH" ssh-xxx "$AUTH_KEYTEXT_USER"
'

test_expect_success 'Test authentication with a wrong key.' '
"$GIT_AUTH" "$AUTH_KEYTYPE_MISSING" "$AUTH_KEYTEXT_MISSING" >out
cover "$GIT_AUTH" "$AUTH_KEYTYPE_MISSING" "$AUTH_KEYTEXT_MISSING" >out
test_must_be_empty out
'
@@ -2,14 +2,14 @@

test_description='git-serve tests'

. "$(dirname "$0")/setup.sh"
. "$(dirname $0)/setup.sh"

test_expect_success 'Test interactive shell.' '
"$GIT_SERVE" 2>&1 | grep -q "Interactive shell is disabled."
cover "$GIT_SERVE" 2>&1 | grep -q "Interactive shell is disabled."
'

test_expect_success 'Test help.' '
SSH_ORIGINAL_COMMAND=help "$GIT_SERVE" 2>actual &&
SSH_ORIGINAL_COMMAND=help cover "$GIT_SERVE" 2>actual &&
save_IFS=$IFS
IFS=
while read -r line; do

@@ -25,7 +25,7 @@ test_expect_success 'Test maintenance mode.' '
sed "s/^\(enable-maintenance = \)0$/\\11/" config.old >config &&
test_must_fail \
env SSH_ORIGINAL_COMMAND=help \
"$GIT_SERVE" 2>actual &&
cover "$GIT_SERVE" 2>actual &&
cat >expected <<-EOF &&
The AUR is down due to maintenance. We will be back soon.
EOF

@@ -34,7 +34,7 @@ test_expect_success 'Test maintenance mode.' '
'

test_expect_success 'Test IP address logging.' '
SSH_ORIGINAL_COMMAND=help AUR_USER=user "$GIT_SERVE" 2>actual &&
SSH_ORIGINAL_COMMAND=help AUR_USER=user cover "$GIT_SERVE" 2>actual &&
cat >expected <<-EOF &&
1.2.3.4
EOF

@@ -48,7 +48,7 @@ test_expect_success 'Test IP address bans.' '
SSH_CLIENT="1.3.3.7 1337 22" &&
test_must_fail \
env SSH_ORIGINAL_COMMAND=help \
"$GIT_SERVE" 2>actual &&
cover "$GIT_SERVE" 2>actual &&
cat >expected <<-EOF &&
The SSH interface is disabled for your IP address.
EOF

@@ -58,14 +58,14 @@ test_expect_success 'Test IP address bans.' '

test_expect_success 'Test setup-repo and list-repos.' '
SSH_ORIGINAL_COMMAND="setup-repo foobar" AUR_USER=user \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
SSH_ORIGINAL_COMMAND="setup-repo foobar2" AUR_USER=tu \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
*foobar
EOF
SSH_ORIGINAL_COMMAND="list-repos" AUR_USER=user \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual
'

@@ -77,7 +77,7 @@ test_expect_success 'Test git-receive-pack.' '
EOF
SSH_ORIGINAL_COMMAND="git-receive-pack /foobar.git/" \
AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual
'

@@ -85,7 +85,7 @@ test_expect_success 'Test git-receive-pack with an invalid repository name.' '
test_must_fail \
env SSH_ORIGINAL_COMMAND="git-receive-pack /!.git/" \
AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 >actual
cover "$GIT_SERVE" 2>&1 >actual
'

test_expect_success "Test git-upload-pack." '

@@ -96,7 +96,7 @@ test_expect_success "Test git-upload-pack." '
EOF
SSH_ORIGINAL_COMMAND="git-upload-pack /foobar.git/" \
AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual
'

@@ -108,7 +108,7 @@ test_expect_success "Try to pull from someone else's repository." '
EOF
SSH_ORIGINAL_COMMAND="git-upload-pack /foobar2.git/" \
AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual
'

@@ -116,7 +116,7 @@ test_expect_success "Try to push to someone else's repository." '
test_must_fail \
env SSH_ORIGINAL_COMMAND="git-receive-pack /foobar2.git/" \
AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1
cover "$GIT_SERVE" 2>&1
'

test_expect_success "Try to push to someone else's repository as Trusted User." '

@@ -127,7 +127,7 @@ test_expect_success "Try to push to someone else's repository as Trusted User."
EOF
SSH_ORIGINAL_COMMAND="git-receive-pack /foobar.git/" \
AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual
'

@@ -139,40 +139,40 @@ test_expect_success "Test restore." '
foobar
EOF
SSH_ORIGINAL_COMMAND="restore foobar" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 >actual
cover "$GIT_SERVE" 2>&1 >actual
test_cmp expected actual
'

test_expect_success "Try to restore an existing package base." '
test_must_fail \
env SSH_ORIGINAL_COMMAND="restore foobar2" \
env SSH_ORIGINAL_COMMAND="restore foobar2"\
AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1
cover "$GIT_SERVE" 2>&1
'

test_expect_success "Disown all package bases." '
SSH_ORIGINAL_COMMAND="disown foobar" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
SSH_ORIGINAL_COMMAND="disown foobar2" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
EOF
SSH_ORIGINAL_COMMAND="list-repos" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual &&
SSH_ORIGINAL_COMMAND="list-repos" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual
'

test_expect_success "Adopt a package base as a regular user." '
SSH_ORIGINAL_COMMAND="adopt foobar" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
*foobar
EOF
SSH_ORIGINAL_COMMAND="list-repos" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual
'

@@ -180,119 +180,119 @@ test_expect_success "Adopt an already adopted package base." '
test_must_fail \
env SSH_ORIGINAL_COMMAND="adopt foobar" \
AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1
cover "$GIT_SERVE" 2>&1
'

test_expect_success "Adopt a package base as a Trusted User." '
SSH_ORIGINAL_COMMAND="adopt foobar2" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
*foobar2
EOF
SSH_ORIGINAL_COMMAND="list-repos" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual
'

test_expect_success "Disown one's own package base as a regular user." '
SSH_ORIGINAL_COMMAND="disown foobar" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
EOF
SSH_ORIGINAL_COMMAND="list-repos" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual
'

test_expect_success "Disown one's own package base as a Trusted User." '
SSH_ORIGINAL_COMMAND="disown foobar2" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
EOF
SSH_ORIGINAL_COMMAND="list-repos" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual
'

test_expect_success "Try to steal another user's package as a regular user." '
SSH_ORIGINAL_COMMAND="adopt foobar2" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
test_must_fail \
env SSH_ORIGINAL_COMMAND="adopt foobar2" \
AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
EOF
SSH_ORIGINAL_COMMAND="list-repos" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual &&
cat >expected <<-EOF &&
*foobar2
EOF
SSH_ORIGINAL_COMMAND="list-repos" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual &&
SSH_ORIGINAL_COMMAND="disown foobar2" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1
cover "$GIT_SERVE" 2>&1
'

test_expect_success "Try to steal another user's package as a Trusted User." '
SSH_ORIGINAL_COMMAND="adopt foobar" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
SSH_ORIGINAL_COMMAND="adopt foobar" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
EOF
SSH_ORIGINAL_COMMAND="list-repos" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual &&
cat >expected <<-EOF &&
*foobar
EOF
SSH_ORIGINAL_COMMAND="list-repos" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual &&
SSH_ORIGINAL_COMMAND="disown foobar" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1
cover "$GIT_SERVE" 2>&1
'

test_expect_success "Try to disown another user's package as a regular user." '
SSH_ORIGINAL_COMMAND="adopt foobar2" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
test_must_fail \
env SSH_ORIGINAL_COMMAND="disown foobar2" \
AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
*foobar2
EOF
SSH_ORIGINAL_COMMAND="list-repos" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual &&
SSH_ORIGINAL_COMMAND="disown foobar2" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1
cover "$GIT_SERVE" 2>&1
'

test_expect_success "Try to disown another user's package as a Trusted User." '
SSH_ORIGINAL_COMMAND="adopt foobar" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
SSH_ORIGINAL_COMMAND="disown foobar" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
EOF
SSH_ORIGINAL_COMMAND="list-repos" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual &&
SSH_ORIGINAL_COMMAND="disown foobar" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1
cover "$GIT_SERVE" 2>&1
'

test_expect_success "Adopt a package base and add co-maintainers." '
SSH_ORIGINAL_COMMAND="adopt foobar" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
SSH_ORIGINAL_COMMAND="set-comaintainers foobar user3 user4" \
AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
5|3|1
6|3|2

@@ -305,7 +305,7 @@ test_expect_success "Adopt a package base and add co-maintainers." '
test_expect_success "Update package base co-maintainers." '
SSH_ORIGINAL_COMMAND="set-comaintainers foobar user2 user3 user4" \
AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
4|3|1
5|3|2

@@ -320,7 +320,7 @@ test_expect_success "Try to add co-maintainers to an orphan package base." '
test_must_fail \
env SSH_ORIGINAL_COMMAND="set-comaintainers foobar2 user2 user3 user4" \
AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
4|3|1
5|3|2

@@ -333,12 +333,12 @@ test_expect_success "Try to add co-maintainers to an orphan package base." '

test_expect_success "Disown a package base and check (co-)maintainer list." '
SSH_ORIGINAL_COMMAND="disown foobar" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
*foobar
EOF
SSH_ORIGINAL_COMMAND="list-repos" AUR_USER=user2 AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual &&
cat >expected <<-EOF &&
5|3|1

@@ -351,11 +351,11 @@ test_expect_success "Disown a package base and check (co-)maintainer list." '

test_expect_success "Force-disown a package base and check (co-)maintainer list." '
SSH_ORIGINAL_COMMAND="disown foobar" AUR_USER=tu AUR_PRIVILEGED=1 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
EOF
SSH_ORIGINAL_COMMAND="list-repos" AUR_USER=user3 AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 >actual &&
cover "$GIT_SERVE" 2>&1 >actual &&
test_cmp expected actual &&
cat >expected <<-EOF &&
EOF

@@ -366,7 +366,7 @@ test_expect_success "Force-disown a package base and check (co-)maintainer list.

test_expect_success "Check whether package requests are closed when disowning." '
SSH_ORIGINAL_COMMAND="adopt foobar" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat <<-EOD | sqlite3 aur.db &&
INSERT INTO PackageRequests (ID, ReqTypeID, PackageBaseID, PackageBaseName, UsersID, Comments, ClosureComment) VALUES (1, 2, 3, "foobar", 4, "", "");
INSERT INTO PackageRequests (ID, ReqTypeID, PackageBaseID, PackageBaseName, UsersID, Comments, ClosureComment) VALUES (2, 3, 3, "foobar", 5, "", "");

@@ -374,7 +374,7 @@ test_expect_success "Check whether package requests are closed when disowning."
EOD
>sendmail.out &&
SSH_ORIGINAL_COMMAND="disown foobar" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat <<-EOD >expected &&
Subject: [PRQ#1] Orphan Request for foobar Accepted
EOD

@@ -389,7 +389,7 @@ test_expect_success "Check whether package requests are closed when disowning."

test_expect_success "Flag a package base out-of-date." '
SSH_ORIGINAL_COMMAND="flag foobar Because." AUR_USER=user2 AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
1|Because.
EOF

@@ -400,7 +400,7 @@ test_expect_success "Flag a package base out-of-date." '

test_expect_success "Unflag a package base as flagger." '
SSH_ORIGINAL_COMMAND="unflag foobar" AUR_USER=user2 AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
0|Because.
EOF

@@ -411,11 +411,11 @@ test_expect_success "Unflag a package base as flagger." '

test_expect_success "Unflag a package base as maintainer." '
SSH_ORIGINAL_COMMAND="adopt foobar" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
SSH_ORIGINAL_COMMAND="flag foobar Because." AUR_USER=user2 AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
SSH_ORIGINAL_COMMAND="unflag foobar" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
0|Because.
EOF

@@ -426,9 +426,9 @@ test_expect_success "Unflag a package base as maintainer." '

test_expect_success "Unflag a package base as random user." '
SSH_ORIGINAL_COMMAND="flag foobar Because." AUR_USER=user2 AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
SSH_ORIGINAL_COMMAND="unflag foobar" AUR_USER=user3 AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
1|Because.
EOF

@@ -439,11 +439,11 @@ test_expect_success "Unflag a package base as random user." '

test_expect_success "Flag using a comment which is too short." '
SSH_ORIGINAL_COMMAND="unflag foobar" AUR_USER=user2 AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
test_must_fail \
env SSH_ORIGINAL_COMMAND="flag foobar xx" \
AUR_USER=user2 AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
0|Because.
EOF

@@ -454,7 +454,7 @@ test_expect_success "Flag using a comment which is too short." '

test_expect_success "Vote for a package base." '
SSH_ORIGINAL_COMMAND="vote foobar" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
3|1
EOF

@@ -472,7 +472,7 @@ test_expect_success "Vote for a package base." '
test_expect_success "Vote for a package base twice." '
test_must_fail \
env SSH_ORIGINAL_COMMAND="vote foobar" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
3|1
EOF

@@ -489,7 +489,7 @@ test_expect_success "Vote for a package base twice." '

test_expect_success "Remove vote from a package base." '
SSH_ORIGINAL_COMMAND="unvote foobar" AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
EOF
echo "SELECT PackageBaseID, UsersID FROM PackageVotes;" | \

@@ -507,7 +507,7 @@ test_expect_success "Try to remove the vote again." '
test_must_fail \
env SSH_ORIGINAL_COMMAND="unvote foobar" \
AUR_USER=user AUR_PRIVILEGED=0 \
"$GIT_SERVE" 2>&1 &&
cover "$GIT_SERVE" 2>&1 &&
cat >expected <<-EOF &&
EOF
echo "SELECT PackageBaseID, UsersID FROM PackageVotes;" | \
@@ -16,7 +16,7 @@ test_expect_success 'Test update hook on a fresh repository.' '
old=0000000000000000000000000000000000000000 &&
new=$(git -C aur.git rev-parse HEAD^) &&
AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" 2>&1 &&
cat >expected <<-EOF &&
1|1|foobar|1-1|aurweb test package.|https://aur.archlinux.org/
1|GPL

@@ -34,7 +34,7 @@ test_expect_success 'Test update hook on another fresh repository.' '
git -C aur.git checkout -q refs/namespaces/foobar2/refs/heads/master &&
new=$(git -C aur.git rev-parse HEAD) &&
AUR_USER=user AUR_PKGBASE=foobar2 AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" 2>&1 &&
cat >expected <<-EOF &&
1|1|foobar|1-1|aurweb test package.|https://aur.archlinux.org/
2|2|foobar2|1-1|aurweb test package.|https://aur.archlinux.org/

@@ -55,7 +55,7 @@ test_expect_success 'Test update hook on an updated repository.' '
old=$(git -C aur.git rev-parse HEAD^) &&
new=$(git -C aur.git rev-parse HEAD) &&
AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" 2>&1 &&
cat >expected <<-EOF &&
2|2|foobar2|1-1|aurweb test package.|https://aur.archlinux.org/
3|1|foobar|1-2|aurweb test package.|https://aur.archlinux.org/

@@ -74,7 +74,7 @@ test_expect_success 'Test update hook on an updated repository.' '

test_expect_success 'Test restore mode.' '
AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" restore 2>&1 &&
cover "$GIT_UPDATE" restore 2>&1 &&
cat >expected <<-EOF &&
2|2|foobar2|1-1|aurweb test package.|https://aur.archlinux.org/
3|1|foobar|1-2|aurweb test package.|https://aur.archlinux.org/

@@ -97,7 +97,7 @@ test_expect_success 'Test restore mode on a non-existent repository.' '
EOD
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar3 AUR_PRIVILEGED=0 \
"$GIT_UPDATE" restore >actual 2>&1 &&
cover "$GIT_UPDATE" restore >actual 2>&1 &&
test_cmp expected actual
'

@@ -109,7 +109,7 @@ test_expect_success 'Pushing to a branch other than master.' '
EOD
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/pu "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/pu "$old" "$new" >actual 2>&1 &&
test_cmp expected actual
'

@@ -121,7 +121,7 @@ test_expect_success 'Performing a non-fast-forward ref update.' '
EOD
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
test_cmp expected actual
'

@@ -133,7 +133,7 @@ test_expect_success 'Performing a non-fast-forward ref update as Trusted User.'
EOD
test_must_fail \
env AUR_USER=tu AUR_PKGBASE=foobar AUR_PRIVILEGED=1 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" 2>&1 &&
test_cmp expected actual
'

@@ -145,7 +145,7 @@ test_expect_success 'Performing a non-fast-forward ref update as normal user wit
EOD
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 AUR_OVERWRITE=1 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" 2>&1 &&
test_cmp expected actual
'

@@ -153,7 +153,7 @@ test_expect_success 'Performing a non-fast-forward ref update as Trusted User wi
old=$(git -C aur.git rev-parse HEAD) &&
new=$(git -C aur.git rev-parse HEAD^) &&
AUR_USER=tu AUR_PKGBASE=foobar AUR_PRIVILEGED=1 AUR_OVERWRITE=1 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" 2>&1
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" 2>&1
'

test_expect_success 'Removing .SRCINFO.' '

@@ -164,7 +164,7 @@ test_expect_success 'Removing .SRCINFO.' '
new=$(git -C aur.git rev-parse HEAD) &&
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
grep -q "^error: missing .SRCINFO$" actual
'

@@ -177,7 +177,7 @@ test_expect_success 'Removing .SRCINFO with a follow-up fix.' '
new=$(git -C aur.git rev-parse HEAD) &&
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
grep -q "^error: missing .SRCINFO$" actual
'

@@ -189,7 +189,7 @@ test_expect_success 'Removing PKGBUILD.' '
new=$(git -C aur.git rev-parse HEAD) &&
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
grep -q "^error: missing PKGBUILD$" actual
'

@@ -203,7 +203,7 @@ test_expect_success 'Pushing a tree with a subdirectory.' '
new=$(git -C aur.git rev-parse HEAD) &&
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
grep -q "^error: the repository must not contain subdirectories$" actual
'

@@ -216,7 +216,7 @@ test_expect_success 'Pushing a tree with a large blob.' '
new=$(git -C aur.git rev-parse HEAD) &&
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
grep -q "^error: maximum blob size (250.00KiB) exceeded$" actual
'

@@ -232,7 +232,7 @@ test_expect_success 'Pushing .SRCINFO with a non-matching package base.' '
new=$(git -C aur.git rev-parse HEAD) &&
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
grep -q "^error: invalid pkgbase: foobar2, expected foobar$" actual
'

@@ -248,7 +248,7 @@ test_expect_success 'Pushing .SRCINFO with invalid syntax.' '
new=$(git -C aur.git rev-parse HEAD) &&
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" 2>&1
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" 2>&1
'

test_expect_success 'Pushing .SRCINFO without pkgver.' '

@@ -263,7 +263,7 @@ test_expect_success 'Pushing .SRCINFO without pkgver.' '
new=$(git -C aur.git rev-parse HEAD) &&
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
grep -q "^error: missing mandatory field: pkgver$" actual
'

@@ -279,7 +279,7 @@ test_expect_success 'Pushing .SRCINFO without pkgrel.' '
new=$(git -C aur.git rev-parse HEAD) &&
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
grep -q "^error: missing mandatory field: pkgrel$" actual
'

@@ -294,7 +294,7 @@ test_expect_success 'Pushing .SRCINFO with epoch.' '
) &&
new=$(git -C aur.git rev-parse HEAD) &&
AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" 2>&1 &&
cat >expected <<-EOF &&
2|2|foobar2|1-1|aurweb test package.|https://aur.archlinux.org/
3|1|foobar|1:1-2|aurweb test package.|https://aur.archlinux.org/

@@ -315,7 +315,7 @@ test_expect_success 'Pushing .SRCINFO with invalid pkgname.' '
new=$(git -C aur.git rev-parse HEAD) &&
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
grep -q "^error: invalid package name: !$" actual
'

@@ -331,7 +331,7 @@ test_expect_success 'Pushing .SRCINFO with invalid epoch.' '
new=$(git -C aur.git rev-parse HEAD) &&
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
grep -q "^error: invalid epoch: !$" actual
'

@@ -348,7 +348,7 @@ test_expect_success 'Pushing .SRCINFO with too long URL.' '
new=$(git -C aur.git rev-parse HEAD) &&
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
grep -q "^error: url field too long: $url\$" actual
'

@@ -364,7 +364,7 @@ test_expect_success 'Missing install file.' '
new=$(git -C aur.git rev-parse HEAD) &&
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
grep -q "^error: missing install file: install$" actual
'

@@ -380,7 +380,7 @@ test_expect_success 'Missing changelog file.' '
new=$(git -C aur.git rev-parse HEAD) &&
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
grep -q "^error: missing changelog file: changelog$" actual
'

@@ -396,7 +396,7 @@ test_expect_success 'Missing source file.' '
new=$(git -C aur.git rev-parse HEAD) &&
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
grep -q "^error: missing source file: file$" actual
'

@@ -413,7 +413,7 @@ test_expect_success 'Pushing .SRCINFO with too long source URL.' '
new=$(git -C aur.git rev-parse HEAD) &&
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
grep -q "^error: source entry too long: $url\$" actual
'

@@ -428,7 +428,7 @@ test_expect_success 'Pushing a blacklisted package.' '
EOD
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
test_cmp expected actual
'

@@ -442,7 +442,7 @@ test_expect_success 'Pushing a blacklisted package as Trusted User.' '
warning: package is blacklisted: forbidden
EOD
AUR_USER=tu AUR_PKGBASE=foobar AUR_PRIVILEGED=1 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
test_cmp expected actual
'

@@ -457,7 +457,7 @@ test_expect_success 'Pushing a package already in the official repositories.' '
EOD
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
test_cmp expected actual
'

@@ -471,7 +471,7 @@ test_expect_success 'Pushing a package already in the official repositories as T
warning: package already provided by [core]: official
EOD
AUR_USER=tu AUR_PKGBASE=foobar AUR_PRIVILEGED=1 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
test_cmp expected actual
'

@@ -491,7 +491,7 @@ test_expect_success 'Trying to hijack a package.' '
EOD
test_must_fail \
env AUR_USER=user AUR_PKGBASE=foobar2 AUR_PRIVILEGED=0 \
"$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
cover "$GIT_UPDATE" refs/heads/master "$old" "$new" >actual 2>&1 &&
test_cmp expected actual
'
@@ -1,65 +0,0 @@
#!/bin/sh

test_description='mkpkglists tests'

. "$(dirname "$0")/setup.sh"

test_expect_success 'Test package list generation with no packages.' '
echo "DELETE FROM Packages;" | sqlite3 aur.db &&
echo "DELETE FROM PackageBases;" | sqlite3 aur.db &&
"$MKPKGLISTS" &&
test $(zcat packages.gz | wc -l) -eq 1 &&
test $(zcat pkgbase.gz | wc -l) -eq 1
'

test_expect_success 'Test package list generation.' '
cat <<-EOD | sqlite3 aur.db &&
INSERT INTO PackageBases (ID, Name, PackagerUID, SubmittedTS, ModifiedTS, FlaggerComment) VALUES (1, "foobar", 1, 0, 0, "");
INSERT INTO PackageBases (ID, Name, PackagerUID, SubmittedTS, ModifiedTS, FlaggerComment) VALUES (2, "foobar2", 2, 0, 0, "");
INSERT INTO PackageBases (ID, Name, PackagerUID, SubmittedTS, ModifiedTS, FlaggerComment) VALUES (3, "foobar3", NULL, 0, 0, "");
INSERT INTO PackageBases (ID, Name, PackagerUID, SubmittedTS, ModifiedTS, FlaggerComment) VALUES (4, "foobar4", 1, 0, 0, "");
INSERT INTO Packages (ID, PackageBaseID, Name) VALUES (1, 1, "pkg1");
INSERT INTO Packages (ID, PackageBaseID, Name) VALUES (2, 1, "pkg2");
INSERT INTO Packages (ID, PackageBaseID, Name) VALUES (3, 1, "pkg3");
INSERT INTO Packages (ID, PackageBaseID, Name) VALUES (4, 2, "pkg4");
INSERT INTO Packages (ID, PackageBaseID, Name) VALUES (5, 3, "pkg5");
EOD
"$MKPKGLISTS" &&
cat <<-EOD >expected &&
foobar
foobar2
foobar4
EOD
gunzip pkgbase.gz &&
sed "/^#/d" pkgbase >actual &&
test_cmp actual expected &&
cat <<-EOD >expected &&
pkg1
pkg2
pkg3
pkg4
EOD
gunzip packages.gz &&
sed "/^#/d" packages >actual &&
test_cmp actual expected
'

test_expect_success 'Test user list generation.' '
"$MKPKGLISTS" &&
cat <<-EOD >expected &&
dev
tu
tu2
tu3
tu4
user
user2
user3
user4
EOD
gunzip users.gz &&
sed "/^#/d" users >actual &&
test_cmp actual expected
'

test_done
@@ -1,53 +0,0 @@
#!/bin/sh

test_description='tuvotereminder tests'

. "$(dirname "$0")/setup.sh"

test_expect_success 'Test Trusted User vote reminders.' '
now=$(date -d now +%s) &&
tomorrow=$(date -d tomorrow +%s) &&
threedays=$(date -d "3 days" +%s) &&
cat <<-EOD | sqlite3 aur.db &&
INSERT INTO TU_VoteInfo (ID, Agenda, User, Submitted, End, Quorum, SubmitterID) VALUES (1, "Lorem ipsum.", "user", 0, $now, 0.00, 2);
INSERT INTO TU_VoteInfo (ID, Agenda, User, Submitted, End, Quorum, SubmitterID) VALUES (2, "Lorem ipsum.", "user", 0, $tomorrow, 0.00, 2);
INSERT INTO TU_VoteInfo (ID, Agenda, User, Submitted, End, Quorum, SubmitterID) VALUES (3, "Lorem ipsum.", "user", 0, $tomorrow, 0.00, 2);
INSERT INTO TU_VoteInfo (ID, Agenda, User, Submitted, End, Quorum, SubmitterID) VALUES (4, "Lorem ipsum.", "user", 0, $threedays, 0.00, 2);
EOD
>sendmail.out &&
"$TUVOTEREMINDER" &&
grep -q "Proposal 2" sendmail.out &&
grep -q "Proposal 3" sendmail.out &&
test_must_fail grep -q "Proposal 1" sendmail.out &&
test_must_fail grep -q "Proposal 4" sendmail.out
'

test_expect_success 'Check that only TUs who did not vote receive reminders.' '
cat <<-EOD | sqlite3 aur.db &&
INSERT INTO TU_Votes (VoteID, UserID) VALUES (1, 2);
INSERT INTO TU_Votes (VoteID, UserID) VALUES (2, 2);
INSERT INTO TU_Votes (VoteID, UserID) VALUES (3, 2);
INSERT INTO TU_Votes (VoteID, UserID) VALUES (4, 2);
INSERT INTO TU_Votes (VoteID, UserID) VALUES (1, 7);
INSERT INTO TU_Votes (VoteID, UserID) VALUES (3, 7);
INSERT INTO TU_Votes (VoteID, UserID) VALUES (2, 8);
INSERT INTO TU_Votes (VoteID, UserID) VALUES (4, 8);
INSERT INTO TU_Votes (VoteID, UserID) VALUES (1, 9);
EOD
>sendmail.out &&
"$TUVOTEREMINDER" &&
cat <<-EOD >expected &&
Subject: TU Vote Reminder: Proposal 2
To: tu2@localhost
Subject: TU Vote Reminder: Proposal 2
To: tu4@localhost
Subject: TU Vote Reminder: Proposal 3
To: tu3@localhost
Subject: TU Vote Reminder: Proposal 3
To: tu4@localhost
EOD
grep "^\(Subject\|To\)" sendmail.out >sendmail.parts &&
test_cmp sendmail.parts expected
'

test_done
@@ -1,26 +0,0 @@
#!/bin/sh

test_description='pkgmaint tests'

. "$(dirname "$0")/setup.sh"

test_expect_success 'Test package base cleanup script.' '
now=$(date -d now +%s) &&
threedaysago=$(date -d "3 days ago" +%s) &&
cat <<-EOD | sqlite3 aur.db &&
INSERT INTO PackageBases (ID, Name, PackagerUID, SubmittedTS, ModifiedTS, FlaggerComment) VALUES (1, "foobar", 1, $now, 0, "");
INSERT INTO PackageBases (ID, Name, PackagerUID, SubmittedTS, ModifiedTS, FlaggerComment) VALUES (2, "foobar2", 2, $threedaysago, 0, "");
INSERT INTO PackageBases (ID, Name, PackagerUID, SubmittedTS, ModifiedTS, FlaggerComment) VALUES (3, "foobar3", NULL, $now, 0, "");
INSERT INTO PackageBases (ID, Name, PackagerUID, SubmittedTS, ModifiedTS, FlaggerComment) VALUES (4, "foobar4", NULL, $threedaysago, 0, "");
EOD
"$PKGMAINT" &&
cat <<-EOD >expected &&
foobar
foobar2
foobar3
EOD
echo "SELECT Name FROM PackageBases;" | sqlite3 aur.db >actual &&
test_cmp actual expected
'

test_done
@@ -1,53 +0,0 @@
#!/bin/sh

test_description='aurblup tests'

. "$(dirname "$0")/setup.sh"

test_expect_success 'Test official provider update script.' '
mkdir -p remote/test/foobar-1.0-1 &&
cat <<-EOD >remote/test/foobar-1.0-1/desc &&
%FILENAME%
foobar-1.0-any.pkg.tar.xz

%NAME%
foobar

%VERSION%
1.0-1

%ARCH%
any
EOD
mkdir -p remote/test/foobar2-1.0-1 &&
cat <<-EOD >remote/test/foobar2-1.0-1/desc &&
%FILENAME%
foobar2-1.0-any.pkg.tar.xz

%NAME%
foobar2

%VERSION%
1.0-1

%ARCH%
any

%PROVIDES%
foobar3
foobar4
EOD
( cd remote/test && bsdtar -czf ../test.db * ) &&
mkdir sync &&
"$AURBLUP" &&
cat <<-EOD >expected &&
foobar|test|foobar
foobar2|test|foobar2
foobar2|test|foobar3
foobar2|test|foobar4
EOD
echo "SELECT Name, Repo, Provides FROM OfficialProviders ORDER BY Provides;" | sqlite3 aur.db >actual &&
test_cmp actual expected
'

test_done
@ -1,431 +0,0 @@
|
|||
#!/bin/sh
|
||||
|
||||
test_description='notify tests'
|
||||
|
||||
. "$(dirname "$0")/setup.sh"
|
||||
|
||||
test_expect_success 'Test out-of-date notifications.' '
|
||||
cat <<-EOD | sqlite3 aur.db &&
|
||||
/* Use package base IDs which can be distinguished from user IDs. */
|
||||
INSERT INTO PackageBases (ID, Name, MaintainerUID, SubmittedTS, ModifiedTS, FlaggerComment) VALUES (1001, "foobar", 1, 0, 0, "This is a test OOD comment.");
|
||||
INSERT INTO PackageBases (ID, Name, MaintainerUID, SubmittedTS, ModifiedTS, FlaggerComment) VALUES (1002, "foobar2", 2, 0, 0, "");
|
||||
INSERT INTO PackageBases (ID, Name, MaintainerUID, SubmittedTS, ModifiedTS, FlaggerComment) VALUES (1003, "foobar3", NULL, 0, 0, "");
|
||||
INSERT INTO PackageBases (ID, Name, MaintainerUID, SubmittedTS, ModifiedTS, FlaggerComment) VALUES (1004, "foobar4", 1, 0, 0, "");
|
||||
INSERT INTO PackageComaintainers (PackageBaseID, UsersID, Priority) VALUES (1001, 2, 1);
|
||||
INSERT INTO PackageComaintainers (PackageBaseID, UsersID, Priority) VALUES (1001, 4, 2);
|
||||
INSERT INTO PackageComaintainers (PackageBaseID, UsersID, Priority) VALUES (1002, 3, 1);
|
||||
INSERT INTO PackageComaintainers (PackageBaseID, UsersID, Priority) VALUES (1002, 5, 2);
|
||||
INSERT INTO PackageComaintainers (PackageBaseID, UsersID, Priority) VALUES (1003, 4, 1);
|
||||
EOD
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" flag 1 1001 &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: AUR Out-of-date Notification for foobar
|
||||
To: tu@localhost
|
||||
Subject: AUR Out-of-date Notification for foobar
|
||||
To: user2@localhost
|
||||
Subject: AUR Out-of-date Notification for foobar
|
||||
To: user@localhost
|
||||
EOD
|
||||
grep "^\(Subject\|To\)" sendmail.out >sendmail.parts &&
|
||||
test_cmp sendmail.parts expected &&
|
||||
cat <<-EOD | sqlite3 aur.db
|
||||
DELETE FROM PackageComaintainers;
|
||||
EOD
|
||||
'
|
||||
|
||||
test_expect_success 'Test subject and body of reset key notifications.' '
|
||||
cat <<-EOD | sqlite3 aur.db &&
|
||||
UPDATE Users SET ResetKey = "12345678901234567890123456789012" WHERE ID = 1;
|
||||
EOD
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" send-resetkey 1 &&
|
||||
grep ^Subject: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: AUR Password Reset
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
A password reset request was submitted for the account user associated
|
||||
with your email address. If you wish to reset your password follow the
|
||||
link [1] below, otherwise ignore this message and nothing will happen.
|
||||
|
||||
[1] https://aur.archlinux.org/passreset/?resetkey=12345678901234567890123456789012
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test subject and body of welcome notifications.' '
|
||||
cat <<-EOD | sqlite3 aur.db &&
|
||||
UPDATE Users SET ResetKey = "12345678901234567890123456789012" WHERE ID = 1;
|
||||
EOD
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" welcome 1 &&
|
||||
grep ^Subject: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: Welcome to the Arch User Repository
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Welcome to the Arch User Repository! In order to set an initial
|
||||
password for your new account, please click the link [1] below. If the
|
||||
link does not work, try copying and pasting it into your browser.
|
||||
|
||||
[1] https://aur.archlinux.org/passreset/?resetkey=12345678901234567890123456789012
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test subject and body of comment notifications.' '
|
||||
cat <<-EOD | sqlite3 aur.db &&
|
||||
/* Use package comments IDs which can be distinguished from other IDs. */
|
||||
INSERT INTO PackageComments (ID, PackageBaseID, UsersID, Comments, RenderedComment) VALUES (2001, 1001, 1, "This is a test comment.", "This is a test comment.");
|
||||
INSERT INTO PackageNotifications (PackageBaseID, UserID) VALUES (1001, 2);
|
||||
UPDATE Users SET CommentNotify = 1 WHERE ID = 2;
|
||||
EOD
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" comment 1 1001 2001 &&
|
||||
grep ^Subject: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: AUR Comment for foobar
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
user [1] added the following comment to foobar [2]:
|
||||
|
||||
This is a test comment.
|
||||
|
||||
--
|
||||
If you no longer wish to receive notifications about this package,
|
||||
please go to the package page [2] and select "Disable notifications".
|
||||
|
||||
[1] https://aur.archlinux.org/account/user/
|
||||
[2] https://aur.archlinux.org/pkgbase/foobar/
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test subject and body of update notifications.' '
|
||||
cat <<-EOD | sqlite3 aur.db &&
|
||||
UPDATE Users SET UpdateNotify = 1 WHERE ID = 2;
|
||||
EOD
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" update 1 1001 &&
|
||||
grep ^Subject: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: AUR Package Update: foobar
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
user [1] pushed a new commit to foobar [2].
|
||||
|
||||
--
|
||||
If you no longer wish to receive notifications about this package,
|
||||
please go to the package page [2] and select "Disable notifications".
|
||||
|
||||
[1] https://aur.archlinux.org/account/user/
|
||||
[2] https://aur.archlinux.org/pkgbase/foobar/
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test subject and body of out-of-date notifications.' '
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" flag 1 1001 &&
|
||||
grep ^Subject: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: AUR Out-of-date Notification for foobar
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Your package foobar [1] has been flagged out-of-date by user [2]:
|
||||
|
||||
This is a test OOD comment.
|
||||
|
||||
[1] https://aur.archlinux.org/pkgbase/foobar/
|
||||
[2] https://aur.archlinux.org/account/user/
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test subject and body of adopt notifications.' '
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" adopt 1 1001 &&
|
||||
grep ^Subject: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: AUR Ownership Notification for foobar
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
The package foobar [1] was adopted by user [2].
|
||||
|
||||
[1] https://aur.archlinux.org/pkgbase/foobar/
|
||||
[2] https://aur.archlinux.org/account/user/
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test subject and body of disown notifications.' '
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" disown 1 1001 &&
|
||||
grep ^Subject: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: AUR Ownership Notification for foobar
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
The package foobar [1] was disowned by user [2].
|
||||
|
||||
[1] https://aur.archlinux.org/pkgbase/foobar/
|
||||
[2] https://aur.archlinux.org/account/user/
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test subject and body of co-maintainer addition notifications.' '
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" comaintainer-add 1 1001 &&
|
||||
grep ^Subject: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: AUR Co-Maintainer Notification for foobar
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
You were added to the co-maintainer list of foobar [1].
|
||||
|
||||
[1] https://aur.archlinux.org/pkgbase/foobar/
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test subject and body of co-maintainer removal notifications.' '
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" comaintainer-remove 1 1001 &&
|
||||
grep ^Subject: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: AUR Co-Maintainer Notification for foobar
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
You were removed from the co-maintainer list of foobar [1].
|
||||
|
||||
[1] https://aur.archlinux.org/pkgbase/foobar/
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test subject and body of delete notifications.' '
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" delete 1 1001 &&
|
||||
grep ^Subject: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: AUR Package deleted: foobar
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
user [1] deleted foobar [2].
|
||||
|
||||
You will no longer receive notifications about this package.
|
||||
|
||||
[1] https://aur.archlinux.org/account/user/
|
||||
[2] https://aur.archlinux.org/pkgbase/foobar/
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test subject and body of merge notifications.' '
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" delete 1 1001 1002 &&
|
||||
grep ^Subject: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: AUR Package deleted: foobar
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
user [1] merged foobar [2] into foobar2 [3].
|
||||
|
||||
--
|
||||
If you no longer wish receive notifications about the new package,
|
||||
please go to [3] and click "Disable notifications".
|
||||
|
||||
[1] https://aur.archlinux.org/account/user/
|
||||
[2] https://aur.archlinux.org/pkgbase/foobar/
|
||||
[3] https://aur.archlinux.org/pkgbase/foobar2/
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test Cc, subject and body of request open notifications.' '
|
||||
cat <<-EOD | sqlite3 aur.db &&
|
||||
/* Use package request IDs which can be distinguished from other IDs. */
|
||||
INSERT INTO PackageRequests (ID, PackageBaseID, PackageBaseName, UsersID, ReqTypeID, Comments, ClosureComment) VALUES (3001, 1001, "foobar", 2, 1, "This is a request test comment.", "");
|
||||
EOD
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" request-open 1 3001 orphan 1001 &&
|
||||
grep ^Cc: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Cc: user@localhost, tu@localhost
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
grep ^Subject: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: [PRQ#3001] Orphan Request for foobar
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
user [1] filed an orphan request for foobar [2]:
|
||||
|
||||
This is a request test comment.
|
||||
|
||||
[1] https://aur.archlinux.org/account/user/
|
||||
[2] https://aur.archlinux.org/pkgbase/foobar/
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test subject and body of request open notifications for merge requests.' '
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" request-open 1 3001 merge 1001 foobar2 &&
|
||||
grep ^Subject: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: [PRQ#3001] Merge Request for foobar
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
user [1] filed a request to merge foobar [2] into foobar2 [3]:
|
||||
|
||||
This is a request test comment.
|
||||
|
||||
[1] https://aur.archlinux.org/account/user/
|
||||
[2] https://aur.archlinux.org/pkgbase/foobar/
|
||||
[3] https://aur.archlinux.org/pkgbase/foobar2/
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test Cc, subject and body of request close notifications.' '
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" request-close 1 3001 accepted &&
|
||||
grep ^Cc: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Cc: user@localhost, tu@localhost
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
grep ^Subject: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: [PRQ#3001] Deletion Request for foobar Accepted
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Request #3001 has been accepted by user [1].
|
||||
|
||||
[1] https://aur.archlinux.org/account/user/
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test subject and body of request close notifications (auto-accept).' '
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" request-close 0 3001 accepted &&
|
||||
grep ^Subject: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: [PRQ#3001] Deletion Request for foobar Accepted
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Request #3001 has been accepted automatically by the Arch User
|
||||
Repository package request system.
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test Cc of request close notification with co-maintainer.' '
|
||||
cat <<-EOD | sqlite3 aur.db &&
|
||||
/* Use package base IDs which can be distinguished from user IDs. */
|
||||
INSERT INTO PackageComaintainers (PackageBaseID, UsersID, Priority) VALUES (1001, 3, 1);
|
||||
EOD
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" request-close 0 3001 accepted &&
|
||||
grep ^Cc: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Cc: user@localhost, tu@localhost, dev@localhost
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
cat <<-EOD | sqlite3 aur.db
|
||||
DELETE FROM PackageComaintainers;
|
||||
EOD
|
||||
'
|
||||
|
||||
test_expect_success 'Test subject and body of request close notifications with closure comment.' '
|
||||
cat <<-EOD | sqlite3 aur.db &&
|
||||
UPDATE PackageRequests SET ClosureComment = "This is a test closure comment." WHERE ID = 3001;
|
||||
EOD
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" request-close 1 3001 accepted &&
|
||||
grep ^Subject: sendmail.out >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: [PRQ#3001] Deletion Request for foobar Accepted
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Request #3001 has been accepted by user [1]:
|
||||
|
||||
This is a test closure comment.
|
||||
|
||||
[1] https://aur.archlinux.org/account/user/
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test subject and body of TU vote reminders.' '
|
||||
>sendmail.out &&
|
||||
"$NOTIFY" tu-vote-reminder 1 &&
|
||||
grep ^Subject: sendmail.out | head -1 >actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Subject: TU Vote Reminder: Proposal 1
|
||||
EOD
|
||||
test_cmp actual expected &&
|
||||
sed -n "/^\$/,\$p" sendmail.out | head -4 | base64 -d >actual &&
|
||||
echo >>actual &&
|
||||
cat <<-EOD >expected &&
|
||||
Please remember to cast your vote on proposal 1 [1]. The voting period
|
||||
ends in less than 48 hours.
|
||||
|
||||
[1] https://aur.archlinux.org/tu/?id=1
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_done
|
|
@ -1,160 +0,0 @@
|
|||
#!/bin/sh
|
||||
|
||||
test_description='rendercomment tests'
|
||||
|
||||
. "$(dirname "$0")/setup.sh"
|
||||
|
||||
test_expect_success 'Test comment rendering.' '
|
||||
cat <<-EOD | sqlite3 aur.db &&
|
||||
INSERT INTO PackageBases (ID, Name, PackagerUID, SubmittedTS, ModifiedTS, FlaggerComment) VALUES (1, "foobar", 1, 0, 0, "");
|
||||
INSERT INTO PackageComments (ID, PackageBaseID, Comments, RenderedComment) VALUES (1, 1, "Hello world!
|
||||
This is a comment.", "");
|
||||
EOD
|
||||
"$RENDERCOMMENT" 1 &&
|
||||
cat <<-EOD >expected &&
|
||||
<p>Hello world!
|
||||
This is a comment.</p>
|
||||
EOD
|
||||
cat <<-EOD | sqlite3 aur.db >actual &&
|
||||
SELECT RenderedComment FROM PackageComments WHERE ID = 1;
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test Markdown conversion.' '
|
||||
cat <<-EOD | sqlite3 aur.db &&
|
||||
INSERT INTO PackageComments (ID, PackageBaseID, Comments, RenderedComment) VALUES (2, 1, "*Hello* [world](https://www.archlinux.org/)!", "");
|
||||
EOD
|
||||
"$RENDERCOMMENT" 2 &&
|
||||
cat <<-EOD >expected &&
|
||||
<p><em>Hello</em> <a href="https://www.archlinux.org/">world</a>!</p>
|
||||
EOD
|
||||
cat <<-EOD | sqlite3 aur.db >actual &&
|
||||
SELECT RenderedComment FROM PackageComments WHERE ID = 2;
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test HTML sanitizing.' '
|
||||
cat <<-EOD | sqlite3 aur.db &&
|
||||
INSERT INTO PackageComments (ID, PackageBaseID, Comments, RenderedComment) VALUES (3, 1, "<script>alert(""XSS!"");</script>", "");
|
||||
EOD
|
||||
"$RENDERCOMMENT" 3 &&
|
||||
cat <<-EOD >expected &&
|
||||
<script>alert("XSS!");</script>
|
||||
EOD
|
||||
cat <<-EOD | sqlite3 aur.db >actual &&
|
||||
SELECT RenderedComment FROM PackageComments WHERE ID = 3;
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test link conversion.' '
|
||||
cat <<-EOD | sqlite3 aur.db &&
|
||||
INSERT INTO PackageComments (ID, PackageBaseID, Comments, RenderedComment) VALUES (4, 1, "
|
||||
Visit https://www.archlinux.org/#_test_.
|
||||
Visit *https://www.archlinux.org/*.
|
||||
Visit <https://www.archlinux.org/>.
|
||||
Visit \`https://www.archlinux.org/\`.
|
||||
Visit [Arch Linux](https://www.archlinux.org/).
|
||||
Visit [Arch Linux][arch].
|
||||
[arch]: https://www.archlinux.org/
|
||||
", "");
|
||||
EOD
|
||||
"$RENDERCOMMENT" 4 &&
|
||||
cat <<-EOD >expected &&
|
||||
<p>Visit <a href="https://www.archlinux.org/#_test_">https://www.archlinux.org/#_test_</a>.
|
||||
Visit <em><a href="https://www.archlinux.org/">https://www.archlinux.org/</a></em>.
|
||||
Visit <a href="https://www.archlinux.org/">https://www.archlinux.org/</a>.
|
||||
Visit <code>https://www.archlinux.org/</code>.
|
||||
Visit <a href="https://www.archlinux.org/">Arch Linux</a>.
|
||||
Visit <a href="https://www.archlinux.org/">Arch Linux</a>.</p>
|
||||
EOD
|
||||
cat <<-EOD | sqlite3 aur.db >actual &&
|
||||
SELECT RenderedComment FROM PackageComments WHERE ID = 4;
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test Git commit linkification.' '
|
||||
local oid=`git -C aur.git rev-parse --verify HEAD`
|
||||
cat <<-EOD | sqlite3 aur.db &&
|
||||
INSERT INTO PackageComments (ID, PackageBaseID, Comments, RenderedComment) VALUES (5, 1, "
|
||||
$oid
|
||||
${oid:0:7}
|
||||
x.$oid.x
|
||||
${oid}x
|
||||
0123456789abcdef
|
||||
\`$oid\`
|
||||
http://example.com/$oid
|
||||
", "");
|
||||
EOD
|
||||
"$RENDERCOMMENT" 5 &&
|
||||
cat <<-EOD >expected &&
|
||||
<p><a href="https://aur.archlinux.org/cgit/aur.git/log/?h=foobar&id=${oid:0:12}">${oid:0:12}</a>
|
||||
<a href="https://aur.archlinux.org/cgit/aur.git/log/?h=foobar&id=${oid:0:7}">${oid:0:7}</a>
|
||||
x.<a href="https://aur.archlinux.org/cgit/aur.git/log/?h=foobar&id=${oid:0:12}">${oid:0:12}</a>.x
|
||||
${oid}x
|
||||
0123456789abcdef
|
||||
<code>$oid</code>
|
||||
<a href="http://example.com/$oid">http://example.com/$oid</a></p>
|
||||
EOD
|
||||
cat <<-EOD | sqlite3 aur.db >actual &&
|
||||
SELECT RenderedComment FROM PackageComments WHERE ID = 5;
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test Flyspray issue linkification.' '
|
||||
sqlite3 aur.db <<-EOD &&
|
||||
INSERT INTO PackageComments (ID, PackageBaseID, Comments, RenderedComment) VALUES (6, 1, "
|
||||
FS#1234567.
|
||||
*FS#1234*
|
||||
FS#
|
||||
XFS#1
|
||||
\`FS#1234\`
|
||||
https://archlinux.org/?test=FS#1234
|
||||
", "");
|
||||
EOD
|
||||
"$RENDERCOMMENT" 6 &&
|
||||
cat <<-EOD >expected &&
|
||||
<p><a href="https://bugs.archlinux.org/task/1234567">FS#1234567</a>.
|
||||
<em><a href="https://bugs.archlinux.org/task/1234">FS#1234</a></em>
|
||||
FS#
|
||||
XFS#1
|
||||
<code>FS#1234</code>
|
||||
<a href="https://archlinux.org/?test=FS#1234">https://archlinux.org/?test=FS#1234</a></p>
|
||||
EOD
|
||||
sqlite3 aur.db <<-EOD >actual &&
|
||||
SELECT RenderedComment FROM PackageComments WHERE ID = 6;
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_expect_success 'Test headings lowering.' '
|
||||
sqlite3 aur.db <<-EOD &&
|
||||
INSERT INTO PackageComments (ID, PackageBaseID, Comments, RenderedComment) VALUES (7, 1, "
|
||||
# One
|
||||
## Two
|
||||
### Three
|
||||
#### Four
|
||||
##### Five
|
||||
###### Six
|
||||
", "");
|
||||
EOD
|
||||
"$RENDERCOMMENT" 7 &&
|
||||
cat <<-EOD >expected &&
|
||||
<h5>One</h5>
|
||||
<h6>Two</h6>
|
||||
<h6>Three</h6>
|
||||
<h6>Four</h6>
|
||||
<h6>Five</h6>
|
||||
<h6>Six</h6>
|
||||
EOD
|
||||
sqlite3 aur.db <<-EOD >actual &&
|
||||
SELECT RenderedComment FROM PackageComments WHERE ID = 7;
|
||||
EOD
|
||||
test_cmp actual expected
|
||||
'
|
||||
|
||||
test_done
|
|
@@ -1,49 +0,0 @@
#!/bin/sh

test_description='usermaint tests'

. "$(dirname "$0")/setup.sh"

test_expect_success 'Test removal of login IP addresses.' '
now=$(date -d now +%s) &&
threedaysago=$(date -d "3 days ago" +%s) &&
tendaysago=$(date -d "10 days ago" +%s) &&
cat <<-EOD | sqlite3 aur.db &&
UPDATE Users SET LastLogin = $threedaysago, LastLoginIPAddress = "1.2.3.4" WHERE ID = 1;
UPDATE Users SET LastLogin = $tendaysago, LastLoginIPAddress = "2.3.4.5" WHERE ID = 2;
UPDATE Users SET LastLogin = $now, LastLoginIPAddress = "3.4.5.6" WHERE ID = 3;
UPDATE Users SET LastLogin = 0, LastLoginIPAddress = "4.5.6.7" WHERE ID = 4;
UPDATE Users SET LastLogin = 0, LastLoginIPAddress = "5.6.7.8" WHERE ID = 5;
UPDATE Users SET LastLogin = $tendaysago, LastLoginIPAddress = "6.7.8.9" WHERE ID = 6;
EOD
"$USERMAINT" &&
cat <<-EOD >expected &&
1.2.3.4
3.4.5.6
EOD
echo "SELECT LastLoginIPAddress FROM Users WHERE LastLoginIPAddress IS NOT NULL;" | sqlite3 aur.db >actual &&
test_cmp actual expected
'

test_expect_success 'Test removal of SSH login IP addresses.' '
now=$(date -d now +%s) &&
threedaysago=$(date -d "3 days ago" +%s) &&
tendaysago=$(date -d "10 days ago" +%s) &&
cat <<-EOD | sqlite3 aur.db &&
UPDATE Users SET LastSSHLogin = $now, LastSSHLoginIPAddress = "1.2.3.4" WHERE ID = 1;
UPDATE Users SET LastSSHLogin = $threedaysago, LastSSHLoginIPAddress = "2.3.4.5" WHERE ID = 2;
UPDATE Users SET LastSSHLogin = $tendaysago, LastSSHLoginIPAddress = "3.4.5.6" WHERE ID = 3;
UPDATE Users SET LastSSHLogin = 0, LastSSHLoginIPAddress = "4.5.6.7" WHERE ID = 4;
UPDATE Users SET LastSSHLogin = 0, LastSSHLoginIPAddress = "5.6.7.8" WHERE ID = 5;
UPDATE Users SET LastSSHLogin = $tendaysago, LastSSHLoginIPAddress = "6.7.8.9" WHERE ID = 6;
EOD
"$USERMAINT" &&
cat <<-EOD >expected &&
1.2.3.4
2.3.4.5
EOD
echo "SELECT LastSSHLoginIPAddress FROM Users WHERE LastSSHLoginIPAddress IS NOT NULL;" | sqlite3 aur.db >actual &&
test_cmp actual expected
'

test_done
test/test_accepted_term.py (new file, 55 lines)
@@ -0,0 +1,55 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.accepted_term import AcceptedTerm
from aurweb.models.account_type import USER_ID
from aurweb.models.term import Term
from aurweb.models.user import User


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd="testPassword",
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def term() -> Term:
    with db.begin():
        term = db.create(Term, Description="Test term",
                         URL="https://test.term")
    yield term


@pytest.fixture
def accepted_term(user: User, term: Term) -> AcceptedTerm:
    with db.begin():
        accepted_term = db.create(AcceptedTerm, User=user, Term=term)
    yield accepted_term


def test_accepted_term(user: User, term: Term, accepted_term: AcceptedTerm):
    # Make sure our AcceptedTerm relationships got initialized properly.
    assert accepted_term.User == user
    assert accepted_term in user.accepted_terms
    assert accepted_term in term.accepted_terms


def test_accepted_term_null_user_raises_exception(term: Term):
    with pytest.raises(IntegrityError):
        AcceptedTerm(Term=term)


def test_accepted_term_null_term_raises_exception(user: User):
    with pytest.raises(IntegrityError):
        AcceptedTerm(User=user)
test/test_account_type.py (new file, 51 lines)
|
@ -0,0 +1,51 @@
|
|||
import pytest
|
||||
|
||||
from aurweb import db
|
||||
from aurweb.models.account_type import AccountType
|
||||
from aurweb.models.user import User
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def setup(db_test):
|
||||
return
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def account_type() -> AccountType:
|
||||
with db.begin():
|
||||
account_type_ = db.create(AccountType, AccountType="TestUser")
|
||||
|
||||
yield account_type_
|
||||
|
||||
with db.begin():
|
||||
db.delete(account_type_)
|
||||
|
||||
|
||||
def test_account_type(account_type):
|
||||
""" Test creating an AccountType, and reading its columns. """
|
||||
# Make sure it got db.created and was given an ID.
|
||||
assert bool(account_type.ID)
|
||||
|
||||
# Next, test our string functions.
|
||||
assert str(account_type) == "TestUser"
|
||||
assert repr(account_type) == \
|
||||
"<AccountType(ID='%s', AccountType='TestUser')>" % (
|
||||
account_type.ID)
|
||||
|
||||
record = db.query(AccountType,
|
||||
AccountType.AccountType == "TestUser").first()
|
||||
assert account_type == record
|
||||
|
||||
|
||||
def test_user_account_type_relationship(account_type):
|
||||
with db.begin():
|
||||
user = db.create(User, Username="test", Email="test@example.org",
|
||||
RealName="Test User", Passwd="testPassword",
|
||||
AccountType=account_type)
|
||||
|
||||
assert user.AccountType == account_type
|
||||
|
||||
# This must be db.deleted here to avoid foreign key issues when
|
||||
# deleting the temporary AccountType in the fixture.
|
||||
with db.begin():
|
||||
db.delete(user)
|
test/test_accounts_routes.py (new file, 1867 lines)
File diff suppressed because it is too large
test/test_adduser.py (new file, 56 lines)
|
@ -0,0 +1,56 @@
|
|||
from typing import List
|
||||
from unittest import mock
|
||||
|
||||
import pytest
|
||||
|
||||
import aurweb.models.account_type as at
|
||||
|
||||
from aurweb import db
|
||||
from aurweb.models import User
|
||||
from aurweb.scripts import adduser
|
||||
from aurweb.testing.requests import Request
|
||||
|
||||
TEST_SSH_PUBKEY = ("ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAI"
|
||||
"bmlzdHAyNTYAAABBBEURnkiY6JoLyqDE8Li1XuAW+LHmkmLDMW/GL5wY"
|
||||
"7k4/A+Ta7bjA3MOKrF9j4EuUTvCuNXULxvpfSqheTFWZc+g= "
|
||||
"kevr@volcano")
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def setup(db_test):
|
||||
return
|
||||
|
||||
|
||||
def run_main(args: List[str] = []):
|
||||
with mock.patch("sys.argv", ["aurweb-adduser"] + args):
|
||||
adduser.main()
|
||||
|
||||
|
||||
def test_adduser_no_args():
|
||||
with pytest.raises(SystemExit):
|
||||
run_main()
|
||||
|
||||
|
||||
def test_adduser():
|
||||
run_main(["-u", "test", "-e", "test@example.org", "-p", "abcd1234"])
|
||||
test = db.query(User).filter(User.Username == "test").first()
|
||||
assert test is not None
|
||||
assert test.login(Request(), "abcd1234")
|
||||
|
||||
|
||||
def test_adduser_tu():
|
||||
run_main([
|
||||
"-u", "test", "-e", "test@example.org", "-p", "abcd1234",
|
||||
"-t", at.TRUSTED_USER
|
||||
])
|
||||
test = db.query(User).filter(User.Username == "test").first()
|
||||
assert test is not None
|
||||
assert test.AccountTypeID == at.TRUSTED_USER_ID
|
||||
|
||||
|
||||
def test_adduser_ssh_pk():
|
||||
run_main(["-u", "test", "-e", "test@example.org", "-p", "abcd1234",
|
||||
"--ssh-pubkey", TEST_SSH_PUBKEY])
|
||||
test = db.query(User).filter(User.Username == "test").first()
|
||||
assert test is not None
|
||||
assert TEST_SSH_PUBKEY.startswith(test.ssh_pub_key.PubKey)
|
test/test_api_rate_limit.py (new file, 36 lines)
@@ -0,0 +1,36 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.api_rate_limit import ApiRateLimit


@pytest.fixture(autouse=True)
def setup(db_test):
    return


def test_api_rate_key_creation():
    with db.begin():
        rate = db.create(ApiRateLimit, IP="127.0.0.1", Requests=10,
                         WindowStart=1)
    assert rate.IP == "127.0.0.1"
    assert rate.Requests == 10
    assert rate.WindowStart == 1


def test_api_rate_key_ip_default():
    with db.begin():
        api_rate_limit = db.create(ApiRateLimit, Requests=10, WindowStart=1)
    assert api_rate_limit.IP == str()


def test_api_rate_key_null_requests_raises_exception():
    with pytest.raises(IntegrityError):
        ApiRateLimit(IP="127.0.0.1", WindowStart=1)


def test_api_rate_key_null_window_start_raises_exception():
    with pytest.raises(IntegrityError):
        ApiRateLimit(IP="127.0.0.1", Requests=1)
test/test_asgi.py (new file, 119 lines)
|
@ -0,0 +1,119 @@
|
|||
import http
|
||||
import os
|
||||
import re
|
||||
|
||||
from unittest import mock
|
||||
|
||||
import fastapi
|
||||
import pytest
|
||||
|
||||
from fastapi import HTTPException
|
||||
from fastapi.testclient import TestClient
|
||||
|
||||
import aurweb.asgi
|
||||
import aurweb.config
|
||||
import aurweb.redis
|
||||
|
||||
from aurweb.testing.email import Email
|
||||
from aurweb.testing.requests import Request
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def setup(db_test, email_test):
|
||||
return
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_asgi_startup_session_secret_exception(monkeypatch):
|
||||
""" Test that we get an IOError on app_startup when we cannot
|
||||
connect to options.redis_address. """
|
||||
|
||||
redis_addr = aurweb.config.get("options", "redis_address")
|
||||
|
||||
def mock_get(section: str, key: str):
|
||||
if section == "fastapi" and key == "session_secret":
|
||||
return None
|
||||
return redis_addr
|
||||
|
||||
with mock.patch("aurweb.config.get", side_effect=mock_get):
|
||||
with pytest.raises(Exception):
|
||||
await aurweb.asgi.app_startup()
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_asgi_startup_exception(monkeypatch):
|
||||
with mock.patch.dict(os.environ, {"AUR_CONFIG": "conf/config.defaults"}):
|
||||
aurweb.config.rehash()
|
||||
with pytest.raises(Exception):
|
||||
await aurweb.asgi.app_startup()
|
||||
aurweb.config.rehash()
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_asgi_http_exception_handler():
|
||||
exc = HTTPException(status_code=422, detail="EXCEPTION!")
|
||||
phrase = http.HTTPStatus(exc.status_code).phrase
|
||||
response = await aurweb.asgi.http_exception_handler(Request(), exc)
|
||||
assert response.status_code == 422
|
||||
content = response.body.decode()
|
||||
assert f"{exc.status_code} - {phrase}" in content
|
||||
assert "EXCEPTION!" in content
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_asgi_app_unsupported_backends():
|
||||
config_get = aurweb.config.get
|
||||
|
||||
# Test that the previously supported "sqlite" backend is now
|
||||
# unsupported by FastAPI.
|
||||
def mock_sqlite_backend(section: str, key: str):
|
||||
if section == "database" and key == "backend":
|
||||
return "sqlite"
|
||||
return config_get(section, key)
|
||||
|
||||
with mock.patch("aurweb.config.get", side_effect=mock_sqlite_backend):
|
||||
expr = r"^.*\(sqlite\) is unsupported.*$"
|
||||
with pytest.raises(ValueError, match=expr):
|
||||
await aurweb.asgi.app_startup()
|
||||
|
||||
|
||||
def test_internal_server_error(setup: None,
|
||||
caplog: pytest.LogCaptureFixture):
|
||||
config_getboolean = aurweb.config.getboolean
|
||||
|
||||
def mock_getboolean(section: str, key: str) -> bool:
|
||||
if section == "options" and key == "traceback":
|
||||
return True
|
||||
return config_getboolean(section, key)
|
||||
|
||||
@aurweb.asgi.app.get("/internal_server_error")
|
||||
async def internal_server_error(request: fastapi.Request):
|
||||
raise ValueError("test exception")
|
||||
|
||||
with mock.patch("aurweb.config.getboolean", side_effect=mock_getboolean):
|
||||
with TestClient(app=aurweb.asgi.app) as request:
|
||||
resp = request.get("/internal_server_error")
|
||||
assert resp.status_code == int(http.HTTPStatus.INTERNAL_SERVER_ERROR)
|
||||
|
||||
# Let's assert that a notification was sent out to the postmaster.
|
||||
assert Email.count() == 1
|
||||
|
||||
aur_location = aurweb.config.get("options", "aur_location")
|
||||
email = Email(1)
|
||||
assert f"Location: {aur_location}" in email.body
|
||||
assert "Traceback ID:" in email.body
|
||||
assert "Version:" in email.body
|
||||
assert "Datetime:" in email.body
|
||||
assert f"[1] {aur_location}" in email.body
|
||||
|
||||
# Assert that the exception got logged with with its traceback id.
|
||||
expr = r"FATAL\[.{7}\]"
|
||||
assert re.search(expr, caplog.text)
|
||||
|
||||
# Let's do it again; no email should be sent the next time,
|
||||
# since the hash is stored in redis.
|
||||
with mock.patch("aurweb.config.getboolean", side_effect=mock_getboolean):
|
||||
with TestClient(app=aurweb.asgi.app) as request:
|
||||
resp = request.get("/internal_server_error")
|
||||
assert resp.status_code == int(http.HTTPStatus.INTERNAL_SERVER_ERROR)
|
||||
assert Email.count() == 1
|
test/test_aurblup.py (new file, 91 lines)
|
@ -0,0 +1,91 @@
|
|||
import tempfile
|
||||
|
||||
from unittest import mock
|
||||
|
||||
import py
|
||||
import pytest
|
||||
|
||||
from aurweb import config, db
|
||||
from aurweb.models import OfficialProvider
|
||||
from aurweb.scripts import aurblup
|
||||
from aurweb.testing.alpm import AlpmDatabase
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def tempdir() -> str:
|
||||
with tempfile.TemporaryDirectory() as name:
|
||||
yield name
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def alpm_db(tempdir: py.path.local) -> AlpmDatabase:
|
||||
yield AlpmDatabase(tempdir)
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def setup(db_test, alpm_db: AlpmDatabase, tempdir: py.path.local) -> None:
|
||||
config_get = config.get
|
||||
|
||||
def mock_config_get(section: str, key: str) -> str:
|
||||
value = config_get(section, key)
|
||||
if section == "aurblup":
|
||||
if key == "db-path":
|
||||
return alpm_db.local
|
||||
elif key == "server":
|
||||
return f'file://{alpm_db.remote}'
|
||||
elif key == "sync-dbs":
|
||||
return alpm_db.repo
|
||||
return value
|
||||
|
||||
with mock.patch("aurweb.config.get", side_effect=mock_config_get):
|
||||
config.rehash()
|
||||
yield
|
||||
config.rehash()
|
||||
|
||||
|
||||
def test_aurblup(alpm_db: AlpmDatabase):
|
||||
# Test that we can add a package.
|
||||
alpm_db.add("pkg", "1.0", "x86_64", provides=["pkg2", "pkg3"])
|
||||
alpm_db.add("pkg2", "2.0", "x86_64")
|
||||
aurblup.main()
|
||||
|
||||
# Test that the package got added to the database.
|
||||
for name in ("pkg", "pkg2"):
|
||||
pkg = db.query(OfficialProvider).filter(
|
||||
OfficialProvider.Name == name).first()
|
||||
assert pkg is not None
|
||||
|
||||
# Test that we can remove the package.
|
||||
alpm_db.remove("pkg")
|
||||
|
||||
# Run aurblup again with forced repository update.
|
||||
aurblup.main(True)
|
||||
|
||||
# Expect that the database got updated accordingly.
|
||||
pkg = db.query(OfficialProvider).filter(
|
||||
OfficialProvider.Name == "pkg").first()
|
||||
assert pkg is None
|
||||
pkg2 = db.query(OfficialProvider).filter(
|
||||
OfficialProvider.Name == "pkg2").first()
|
||||
assert pkg2 is not None
|
||||
|
||||
|
||||
def test_aurblup_cleanup(alpm_db: AlpmDatabase):
|
||||
# Add a package and sync up the database.
|
||||
alpm_db.add("pkg", "1.0", "x86_64", provides=["pkg2", "pkg3"])
|
||||
aurblup.main()
|
||||
|
||||
# Now, let's insert an OfficialPackage that doesn't exist,
|
||||
# then exercise the old provider deletion path.
|
||||
with db.begin():
|
||||
db.create(OfficialProvider, Name="fake package",
|
||||
Repo="test", Provides="package")
|
||||
|
||||
# Run aurblup again.
|
||||
aurblup.main()
|
||||
|
||||
# Expect that the fake package got deleted because it's
|
||||
# not in alpm_db anymore.
|
||||
providers = db.query(OfficialProvider).filter(
|
||||
OfficialProvider.Name == "fake package").all()
|
||||
assert len(providers) == 0
|
test/test_auth.py (new file, 153 lines)
|
@ -0,0 +1,153 @@
|
|||
import fastapi
|
||||
import pytest
|
||||
|
||||
from fastapi import HTTPException
|
||||
from sqlalchemy.exc import IntegrityError
|
||||
|
||||
from aurweb import config, db, time
|
||||
from aurweb.auth import AnonymousUser, BasicAuthBackend, _auth_required, account_type_required
|
||||
from aurweb.models.account_type import USER, USER_ID
|
||||
from aurweb.models.session import Session
|
||||
from aurweb.models.user import User
|
||||
from aurweb.testing.requests import Request
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def setup(db_test):
|
||||
return
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def user() -> User:
|
||||
with db.begin():
|
||||
user = db.create(User, Username="test", Email="test@example.com",
|
||||
RealName="Test User", Passwd="testPassword",
|
||||
AccountTypeID=USER_ID)
|
||||
yield user
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def backend() -> BasicAuthBackend:
|
||||
yield BasicAuthBackend()
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_auth_backend_missing_sid(backend: BasicAuthBackend):
|
||||
# The request has no AURSID cookie, so authentication fails, and
|
||||
# AnonymousUser is returned.
|
||||
_, result = await backend.authenticate(Request())
|
||||
assert not result.is_authenticated()
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_auth_backend_invalid_sid(backend: BasicAuthBackend):
|
||||
# Provide a fake AURSID that won't be found in the database.
|
||||
# This results in our path going down the invalid sid route,
|
||||
# which gives us an AnonymousUser.
|
||||
request = Request()
|
||||
request.cookies["AURSID"] = "fake"
|
||||
_, result = await backend.authenticate(request)
|
||||
assert not result.is_authenticated()
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_auth_backend_invalid_user_id():
|
||||
# Create a new session with a fake user id.
|
||||
now_ts = time.utcnow()
|
||||
with pytest.raises(IntegrityError):
|
||||
Session(UsersID=666, SessionID="realSession",
|
||||
LastUpdateTS=now_ts + 5)
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_basic_auth_backend(user: User, backend: BasicAuthBackend):
|
||||
# This time, everything matches up. We expect the user to
|
||||
# equal the real_user.
|
||||
now_ts = time.utcnow()
|
||||
with db.begin():
|
||||
db.create(Session, UsersID=user.ID, SessionID="realSession",
|
||||
LastUpdateTS=now_ts + 5)
|
||||
|
||||
request = Request()
|
||||
request.cookies["AURSID"] = "realSession"
|
||||
_, result = await backend.authenticate(request)
|
||||
assert result == user
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_expired_session(backend: BasicAuthBackend, user: User):
|
||||
""" Login, expire the session manually, then authenticate. """
|
||||
# First, build a Request with a logged in user.
|
||||
request = Request()
|
||||
request.user = user
|
||||
sid = request.user.login(Request(), "testPassword")
|
||||
request.cookies["AURSID"] = sid
|
||||
|
||||
# Set Session.LastUpdateTS to 20 seconds expired.
|
||||
timeout = config.getint("options", "login_timeout")
|
||||
now_ts = time.utcnow()
|
||||
with db.begin():
|
||||
request.user.session.LastUpdateTS = now_ts - timeout - 20
|
||||
|
||||
# Run through authentication backend and get the session
|
||||
# deleted due to its expiration.
|
||||
await backend.authenticate(request)
|
||||
session = db.query(Session).filter(Session.SessionID == sid).first()
|
||||
assert session is None
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_auth_required_redirection_bad_referrer():
|
||||
# Create a fake route function which can be wrapped by auth_required.
|
||||
def bad_referrer_route(request: fastapi.Request):
|
||||
pass
|
||||
|
||||
# Get down to the nitty gritty internal wrapper.
|
||||
bad_referrer_route = _auth_required()(bad_referrer_route)
|
||||
|
||||
# Execute the route with a "./blahblahblah" Referer, which does not
|
||||
# match aur_location; `./` has been used as a prefix to attempt to
|
||||
# ensure we're providing a fake referer.
|
||||
with pytest.raises(HTTPException) as exc:
|
||||
request = Request(method="POST", headers={"Referer": "./blahblahblah"})
|
||||
await bad_referrer_route(request)
|
||||
assert exc.detail == "Bad Referer header."
|
||||
|
||||
|
||||
def test_account_type_required():
|
||||
""" This test merely asserts that a few different paths
|
||||
do not raise exceptions. """
|
||||
# This one shouldn't raise.
|
||||
account_type_required({USER})
|
||||
|
||||
# This one also shouldn't raise.
|
||||
account_type_required({USER_ID})
|
||||
|
||||
# But this one should! We have no "FAKE" key.
|
||||
with pytest.raises(KeyError):
|
||||
account_type_required({'FAKE'})
|
||||
|
||||
|
||||
def test_is_trusted_user():
|
||||
user_ = AnonymousUser()
|
||||
assert not user_.is_trusted_user()
|
||||
|
||||
|
||||
def test_is_developer():
|
||||
user_ = AnonymousUser()
|
||||
assert not user_.is_developer()
|
||||
|
||||
|
||||
def test_is_elevated():
|
||||
user_ = AnonymousUser()
|
||||
assert not user_.is_elevated()
|
||||
|
||||
|
||||
def test_voted_for():
|
||||
user_ = AnonymousUser()
|
||||
assert not user_.voted_for(None)
|
||||
|
||||
|
||||
def test_notified():
|
||||
user_ = AnonymousUser()
|
||||
assert not user_.notified(None)
|
test/test_auth_routes.py (new file, 335 lines)
|
@ -0,0 +1,335 @@
|
|||
import re
|
||||
|
||||
from http import HTTPStatus
|
||||
from unittest import mock
|
||||
|
||||
import pytest
|
||||
|
||||
from fastapi.testclient import TestClient
|
||||
|
||||
import aurweb.config

from aurweb import db, time
from aurweb.asgi import app
from aurweb.models.account_type import USER_ID
from aurweb.models.session import Session
from aurweb.models.user import User

# Some test global constants.
TEST_USERNAME = "test"
TEST_EMAIL = "test@example.org"
TEST_REFERER = {
    "referer": aurweb.config.get("options", "aur_location") + "/login",
}


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def client() -> TestClient:
    client = TestClient(app=app)

    # Necessary for forged login CSRF protection on the login route. Set here
    # instead of only on the necessary requests for convenience.
    client.headers.update(TEST_REFERER)
    yield client


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username=TEST_USERNAME, Email=TEST_EMAIL,
                         RealName="Test User", Passwd="testPassword",
                         AccountTypeID=USER_ID)
    yield user


def test_login_logout(client: TestClient, user: User):
    post_data = {
        "user": "test",
        "passwd": "testPassword",
        "next": "/"
    }

    with client as request:
        # First, let's test GET /login.
        response = request.get("/login")
        assert response.status_code == int(HTTPStatus.OK)

        response = request.post("/login", data=post_data,
                                allow_redirects=False)
        assert response.status_code == int(HTTPStatus.SEE_OTHER)

        # Simulate following the redirect location from the above response.
        response = request.get(response.headers.get("location"))
        assert response.status_code == int(HTTPStatus.OK)

        response = request.post("/logout", data=post_data,
                                allow_redirects=False)
        assert response.status_code == int(HTTPStatus.SEE_OTHER)

        response = request.post("/logout", data=post_data, cookies={
            "AURSID": response.cookies.get("AURSID")
        }, allow_redirects=False)
        assert response.status_code == int(HTTPStatus.SEE_OTHER)

    assert "AURSID" not in response.cookies


def mock_getboolean(a, b):
    if a == "options" and b == "disable_http_login":
        return True
    return bool(aurweb.config.get(a, b))


@mock.patch("aurweb.config.getboolean", side_effect=mock_getboolean)
def test_secure_login(getboolean: bool, client: TestClient, user: User):
    """ In this test, we check to verify the course of action taken
    by starlette when providing secure=True to a response cookie.
    This is achieved by mocking aurweb.config.getboolean to return
    True (or 1) when looking for `options.disable_http_login`.
    When we receive a response with `disable_http_login` enabled,
    we check the fields in cookies received for the secure and
    httponly fields, in addition to the rest of the fields given
    on such a request. """

    # Create a local TestClient here since we mocked configuration.
    # client = TestClient(app)

    # Necessary for forged login CSRF protection on the login route. Set here
    # instead of only on the necessary requests for convenience.
    # client.headers.update(TEST_REFERER)

    # Data used for our upcoming http post request.
    post_data = {
        "user": user.Username,
        "passwd": "testPassword",
        "next": "/"
    }

    # Perform a login request with the data matching our user.
    with client as request:
        response = request.post("/login", data=post_data,
                                allow_redirects=False)

    # Make sure we got the expected status out of it.
    assert response.status_code == int(HTTPStatus.SEE_OTHER)

    # Let's check what we got in terms of cookies for AURSID.
    # Make sure that a secure cookie got passed to us.
    cookie = next(c for c in response.cookies if c.name == "AURSID")
    assert cookie.secure is True
    assert cookie.has_nonstandard_attr("HttpOnly") is True
    assert cookie.has_nonstandard_attr("SameSite") is True
    assert cookie.get_nonstandard_attr("SameSite") == "strict"
    assert cookie.value is not None and len(cookie.value) > 0

    # Let's make sure we actually have a session relationship
    # with the AURSID we ended up with.
    record = db.query(Session, Session.SessionID == cookie.value).first()
    assert record is not None and record.User == user
    assert user.session == record


def test_authenticated_login(client: TestClient, user: User):
    post_data = {
        "user": user.Username,
        "passwd": "testPassword",
        "next": "/"
    }

    with client as request:
        # Try to login.
        response = request.post("/login", data=post_data,
                                allow_redirects=False)
        assert response.status_code == int(HTTPStatus.SEE_OTHER)
        assert response.headers.get("location") == "/"

        # Now, let's verify that we get the logged-in rendering
        # when requesting GET /login as an authenticated user.
        response = request.get("/login", cookies=response.cookies,
                               allow_redirects=False)
        assert response.status_code == int(HTTPStatus.OK)
        assert "Logged-in as: <strong>test</strong>" in response.text


def test_unauthenticated_logout_unauthorized(client: TestClient):
    with client as request:
        # Verify that attempting to /logout when not authenticated
        # redirects us back to /login.
        response = request.post("/logout", allow_redirects=False)
        assert response.status_code == int(HTTPStatus.SEE_OTHER)
        assert response.headers.get("location").startswith("/login")


def test_login_missing_username(client: TestClient):
    post_data = {
        "passwd": "testPassword",
        "next": "/"
    }

    with client as request:
        response = request.post("/login", data=post_data)
    assert "AURSID" not in response.cookies

    # Make sure password isn't prefilled and remember_me isn't checked.
    content = response.content.decode()
    assert post_data["passwd"] not in content
    assert "checked" not in content


def test_login_remember_me(client: TestClient, user: User):
    post_data = {
        "user": "test",
        "passwd": "testPassword",
        "next": "/",
        "remember_me": True
    }

    with client as request:
        response = request.post("/login", data=post_data,
                                allow_redirects=False)
    assert response.status_code == int(HTTPStatus.SEE_OTHER)
    assert "AURSID" in response.cookies

    cookie_timeout = aurweb.config.getint(
        "options", "persistent_cookie_timeout")
    now_ts = time.utcnow()
    session = db.query(Session).filter(Session.UsersID == user.ID).first()

    # Expect that LastUpdateTS is not past the cookie timeout
    # for a remembered session.
    assert session.LastUpdateTS > (now_ts - cookie_timeout)


def test_login_incorrect_password_remember_me(client: TestClient, user: User):
    post_data = {
        "user": "test",
        "passwd": "badPassword",
        "next": "/",
        "remember_me": "on"
    }

    with client as request:
        response = request.post("/login", data=post_data)
    assert "AURSID" not in response.cookies

    # Make sure username is prefilled, password isn't prefilled,
    # and remember_me is checked.
    assert post_data["user"] in response.text
    assert post_data["passwd"] not in response.text
    assert "checked" in response.text


def test_login_missing_password(client: TestClient):
    post_data = {
        "user": "test",
        "next": "/"
    }

    with client as request:
        response = request.post("/login", data=post_data)
    assert "AURSID" not in response.cookies

    # Make sure username is prefilled and remember_me isn't checked.
    assert post_data["user"] in response.text
    assert "checked" not in response.text


def test_login_incorrect_password(client: TestClient):
    post_data = {
        "user": "test",
        "passwd": "badPassword",
        "next": "/"
    }

    with client as request:
        response = request.post("/login", data=post_data)
    assert "AURSID" not in response.cookies

    # Make sure username is prefilled, password isn't prefilled
    # and remember_me isn't checked.
    assert post_data["user"] in response.text
    assert post_data["passwd"] not in response.text
    assert "checked" not in response.text


def test_login_bad_referer(client: TestClient):
    post_data = {
        "user": "test",
        "passwd": "testPassword",
        "next": "/",
    }

    # Create new TestClient without a Referer header.
    client = TestClient(app)

    with client as request:
        response = request.post("/login", data=post_data)
    assert "AURSID" not in response.cookies

    BAD_REFERER = {
        "referer": aurweb.config.get("options", "aur_location") + ".mal.local",
    }
    with client as request:
        response = request.post("/login", data=post_data, headers=BAD_REFERER)
    assert response.status_code == int(HTTPStatus.BAD_REQUEST)
    assert "AURSID" not in response.cookies


def test_generate_unique_sid_exhausted(client: TestClient, user: User,
                                       caplog: pytest.LogCaptureFixture):
    """
    In this test, we mock up generate_unique_sid() to infinitely return
    the same SessionID given to `user`. Within that mocking, we try
    to login as `user2` and expect the internal server error rendering
    by our error handler.

    This exercises the bad path of /login, where we can't find a unique
    SID to assign the user.
    """
    now = time.utcnow()
    with db.begin():
        # Create a second user; we'll login with this one.
        user2 = db.create(User, Username="test2", Email="test2@example.org",
                          ResetKey="testReset", Passwd="testPassword",
                          AccountTypeID=USER_ID)

        # Create a session with ID == "testSession" for `user`.
        db.create(Session, User=user, SessionID="testSession",
                  LastUpdateTS=now)

    # Mock out generate_unique_sid; always return "testSession" which
    # causes us to eventually error out and raise an internal error.
    def mock_generate_sid():
        return "testSession"

    # Login as `user2`; we expect an internal server error response
    # with a relevant detail.
    post_data = {
        "user": user2.Username,
        "passwd": "testPassword",
        "next": "/",
    }
    generate_unique_sid_ = "aurweb.models.session.generate_unique_sid"
    with mock.patch(generate_unique_sid_, mock_generate_sid):
        with client as request:
            # Set cookies = {} to remove any previous login kept by TestClient.
            response = request.post("/login", data=post_data, cookies={})
    assert response.status_code == int(HTTPStatus.INTERNAL_SERVER_ERROR)

    assert "500 - Internal Server Error" in response.text

    # Make sure an IntegrityError from the DB got logged out
    # with a FATAL traceback ID.
    expr = r"FATAL\[.{7}\]"
    assert re.search(expr, caplog.text)
    assert "IntegrityError" in caplog.text

    expr = r"Duplicate entry .+ for key .+SessionID.+"
    assert re.search(expr, response.text)
test/test_ban.py (new file, 58 lines)
@@ -0,0 +1,58 @@
import warnings

from datetime import datetime, timedelta

import pytest

from sqlalchemy import exc as sa_exc

from aurweb import db
from aurweb.db import create
from aurweb.models.ban import Ban, is_banned
from aurweb.testing.requests import Request


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def ban() -> Ban:
    ts = datetime.utcnow() + timedelta(seconds=30)
    with db.begin():
        ban = create(Ban, IPAddress="127.0.0.1", BanTS=ts)
    yield ban


def test_ban(ban: Ban):
    assert ban.IPAddress == "127.0.0.1"
    assert bool(ban.BanTS)


def test_invalid_ban():
    with pytest.raises(sa_exc.IntegrityError):
        bad_ban = Ban(BanTS=datetime.utcnow())

        # We're adding a ban with no primary key; this causes
        # SQLAlchemy warnings when committing to the DB.
        # Ignore them.
        with warnings.catch_warnings():
            warnings.simplefilter("ignore", sa_exc.SAWarning)
            with db.begin():
                db.add(bad_ban)

    # Since we got a transaction failure, we need to rollback.
    db.rollback()


def test_banned(ban: Ban):
    request = Request()
    request.client.host = "127.0.0.1"
    assert is_banned(request)


def test_not_banned(ban: Ban):
    request = Request()
    request.client.host = "192.168.0.1"
    assert not is_banned(request)
test/test_cache.py (new file, 71 lines)
@@ -0,0 +1,71 @@
import pytest

from aurweb import cache, db
from aurweb.models.account_type import USER_ID
from aurweb.models.user import User


@pytest.fixture(autouse=True)
def setup(db_test):
    return


class StubRedis:
    """ A class which acts as a RedisConnection without using Redis. """

    cache = dict()
    expires = dict()

    def get(self, key, *args):
        if key not in self.cache:
            self.cache[key] = None
        return self.cache[key]

    def set(self, key, *args):
        self.cache[key] = list(args)[0]

    def expire(self, key, *args):
        self.expires[key] = list(args)[0]

    async def execute(self, command, key, *args):
        f = getattr(self, command)
        return f(key, *args)


@pytest.fixture
def redis():
    yield StubRedis()


@pytest.mark.asyncio
async def test_db_count_cache(redis):
    db.create(User, Username="user1",
              Email="user1@example.org",
              Passwd="testPassword",
              AccountTypeID=USER_ID)

    query = db.query(User)

    # Now, perform several checks that db_count_cache matches query.count().

    # We have no cached value yet.
    assert await cache.db_count_cache(redis, "key1", query) == query.count()

    # It's cached now.
    assert await cache.db_count_cache(redis, "key1", query) == query.count()


@pytest.mark.asyncio
async def test_db_count_cache_expires(redis):
    db.create(User, Username="user1",
              Email="user1@example.org",
              Passwd="testPassword",
              AccountTypeID=USER_ID)

    query = db.query(User)

    # Cache a query with an expire.
    value = await cache.db_count_cache(redis, "key1", query, 100)
    assert value == query.count()

    assert redis.expires["key1"] == 100
test/test_captcha.py (new file, 67 lines)
@@ -0,0 +1,67 @@
import hashlib

import pytest

from aurweb import captcha


@pytest.fixture(autouse=True)
def setup(db_test):
    return


def test_captcha_salts():
    """ Make sure we can get some captcha salts. """
    salts = captcha.get_captcha_salts()
    assert len(salts) == 6


def test_captcha_token():
    """ Make sure getting a captcha salt's token matches up against
    the first three digits of the md5 hash of the salt. """
    salts = captcha.get_captcha_salts()
    salt = salts[0]

    token1 = captcha.get_captcha_token(salt)
    token2 = hashlib.md5(salt.encode()).hexdigest()[:3]

    assert token1 == token2


def test_captcha_challenge_answer():
    """ Make sure that executing the captcha challenge via shell
    produces the correct result by comparing it against a straight
    up token conversion. """
    salts = captcha.get_captcha_salts()
    salt = salts[0]

    challenge = captcha.get_captcha_challenge(salt)

    token = captcha.get_captcha_token(salt)
    challenge2 = f"LC_ALL=C pacman -V|sed -r 's#[0-9]+#{token}#g'|md5sum|cut -c1-6"

    assert challenge == challenge2


def test_captcha_salt_filter():
    """ Make sure captcha_salt_filter returns the first salt from
    get_captcha_salts().

    Example usage:
        <input type="hidden" name="captcha_salt" value="{{ captcha_salt }}">
    """
    salt = captcha.captcha_salt_filter(None)
    assert salt == captcha.get_captcha_salts()[0]


def test_captcha_cmdline_filter():
    """ Make sure that the captcha_cmdline filter gives us the
    same challenge that get_captcha_challenge does.

    Example usage:
        <code>{{ captcha_salt | captcha_cmdline }}</code>
    """
    salt = captcha.captcha_salt_filter(None)
    display1 = captcha.captcha_cmdline_filter(None, salt)
    display2 = captcha.get_captcha_challenge(salt)
    assert display1 == display2
test/test_config.py (new file, 177 lines)
@@ -0,0 +1,177 @@
import configparser
|
||||
import io
|
||||
import os
|
||||
import re
|
||||
|
||||
from unittest import mock
|
||||
|
||||
import py
|
||||
|
||||
from aurweb import config
|
||||
from aurweb.scripts.config import main
|
||||
|
||||
|
||||
def noop(*args, **kwargs) -> None:
|
||||
return
|
||||
|
||||
|
||||
def test_get():
|
||||
assert config.get("options", "disable_http_login") == "0"
|
||||
|
||||
|
||||
def test_getboolean():
|
||||
assert not config.getboolean("options", "disable_http_login")
|
||||
|
||||
|
||||
def test_getint():
|
||||
assert config.getint("options", "disable_http_login") == 0
|
||||
|
||||
|
||||
def mock_config_get():
|
||||
config_get = config.get
|
||||
|
||||
def _mock_config_get(section: str, option: str):
|
||||
if section == "options":
|
||||
if option == "salt_rounds":
|
||||
return "666"
|
||||
return config_get(section, option)
|
||||
return _mock_config_get
|
||||
|
||||
|
||||
@mock.patch("aurweb.config.get", side_effect=mock_config_get())
|
||||
def test_config_main_get(get: str):
|
||||
stdout = io.StringIO()
|
||||
args = ["aurweb-config", "get", "options", "salt_rounds"]
|
||||
with mock.patch("sys.argv", args):
|
||||
with mock.patch("sys.stdout", stdout):
|
||||
main()
|
||||
|
||||
expected = "666"
|
||||
assert stdout.getvalue().strip() == expected
|
||||
|
||||
|
||||
@mock.patch("aurweb.config.get", side_effect=mock_config_get())
|
||||
def test_config_main_get_unknown_section(get: str):
|
||||
stderr = io.StringIO()
|
||||
args = ["aurweb-config", "get", "fakeblahblah", "salt_rounds"]
|
||||
with mock.patch("sys.argv", args):
|
||||
with mock.patch("sys.stderr", stderr):
|
||||
main()
|
||||
|
||||
# With an invalid section, we should get a usage error.
|
||||
expected = r'^error: no section found$'
|
||||
assert re.match(expected, stderr.getvalue().strip())
|
||||
|
||||
|
||||
@mock.patch("aurweb.config.get", side_effect=mock_config_get())
|
||||
def test_config_main_get_unknown_option(get: str):
|
||||
stderr = io.StringIO()
|
||||
args = ["aurweb-config", "get", "options", "fakeblahblah"]
|
||||
with mock.patch("sys.argv", args):
|
||||
with mock.patch("sys.stderr", stderr):
|
||||
main()
|
||||
|
||||
expected = "error: no option found"
|
||||
assert stderr.getvalue().strip() == expected
|
||||
|
||||
|
||||
@mock.patch("aurweb.config.save", side_effect=noop)
|
||||
def test_config_main_set(save: None):
|
||||
data = None
|
||||
|
||||
def set_option(section: str, option: str, value: str) -> None:
|
||||
nonlocal data
|
||||
data = value
|
||||
|
||||
args = ["aurweb-config", "set", "options", "salt_rounds", "666"]
|
||||
with mock.patch("sys.argv", args):
|
||||
with mock.patch("aurweb.config.set_option", side_effect=set_option):
|
||||
main()
|
||||
|
||||
expected = "666"
|
||||
assert data == expected
|
||||
|
||||
|
||||
def test_config_main_set_real(tmpdir: py.path.local):
|
||||
"""
|
||||
Test a real set_option path.
|
||||
"""
|
||||
|
||||
# Copy AUR_CONFIG to {tmpdir}/aur.config.
|
||||
aur_config = os.environ.get("AUR_CONFIG")
|
||||
tmp_aur_config = os.path.join(str(tmpdir), "aur.config")
|
||||
with open(aur_config) as f:
|
||||
with open(tmp_aur_config, "w") as o:
|
||||
o.write(f.read())
|
||||
|
||||
# Force reset the parser. This should NOT be done publicly.
|
||||
config._parser = None
|
||||
|
||||
value = 666
|
||||
args = ["aurweb-config", "set", "options", "fake-key", str(value)]
|
||||
with mock.patch.dict("os.environ", {"AUR_CONFIG": tmp_aur_config}):
|
||||
with mock.patch("sys.argv", args):
|
||||
# Run aurweb.config.main().
|
||||
main()
|
||||
|
||||
# Update the config; fake-key should be set.
|
||||
config.rehash()
|
||||
assert config.getint("options", "fake-key") == 666
|
||||
|
||||
# Restore config back to normal.
|
||||
args = ["aurweb-config", "unset", "options", "fake-key"]
|
||||
with mock.patch("sys.argv", args):
|
||||
main()
|
||||
|
||||
# Return the config back to normal.
|
||||
config.rehash()
|
||||
|
||||
# fake-key should no longer exist.
|
||||
assert config.getint("options", "fake-key") is None
|
||||
|
||||
|
||||
def test_config_main_set_immutable():
|
||||
data = None
|
||||
|
||||
def mock_set_option(section: str, option: str, value: str) -> None:
|
||||
nonlocal data
|
||||
data = value
|
||||
|
||||
args = ["aurweb-config", "set", "options", "salt_rounds", "666"]
|
||||
with mock.patch.dict(os.environ, {"AUR_CONFIG_IMMUTABLE": "1"}):
|
||||
with mock.patch("sys.argv", args):
|
||||
with mock.patch("aurweb.config.set_option",
|
||||
side_effect=mock_set_option):
|
||||
main()
|
||||
|
||||
expected = None
|
||||
assert data == expected
|
||||
|
||||
|
||||
def test_config_main_set_invalid_value():
|
||||
stderr = io.StringIO()
|
||||
|
||||
args = ["aurweb-config", "set", "options", "salt_rounds"]
|
||||
with mock.patch("sys.argv", args):
|
||||
with mock.patch("sys.stderr", stderr):
|
||||
main()
|
||||
|
||||
expected = "error: no value provided"
|
||||
assert stderr.getvalue().strip() == expected
|
||||
|
||||
|
||||
@ mock.patch("aurweb.config.save", side_effect=noop)
|
||||
def test_config_main_set_unknown_section(save: None):
|
||||
stderr = io.StringIO()
|
||||
|
||||
def mock_set_option(section: str, option: str, value: str) -> None:
|
||||
raise configparser.NoSectionError(section=section)
|
||||
|
||||
args = ["aurweb-config", "set", "options", "salt_rounds", "666"]
|
||||
with mock.patch("sys.argv", args):
|
||||
with mock.patch("sys.stderr", stderr):
|
||||
with mock.patch("aurweb.config.set_option",
|
||||
side_effect=mock_set_option):
|
||||
main()
|
||||
|
||||
assert stderr.getvalue().strip() == "error: no section found"
|
test/test_db.py (new file, 224 lines)
@@ -0,0 +1,224 @@
import os
|
||||
import re
|
||||
import sqlite3
|
||||
import tempfile
|
||||
|
||||
from unittest import mock
|
||||
|
||||
import pytest
|
||||
|
||||
import aurweb.config
|
||||
import aurweb.initdb
|
||||
|
||||
from aurweb import db
|
||||
from aurweb.models.account_type import AccountType
|
||||
|
||||
|
||||
class Args:
|
||||
""" Stub arguments used for running aurweb.initdb. """
|
||||
use_alembic = True
|
||||
verbose = True
|
||||
|
||||
|
||||
class DBCursor:
|
||||
""" A fake database cursor object used in tests. """
|
||||
items = []
|
||||
|
||||
def execute(self, *args, **kwargs):
|
||||
self.items = list(args)
|
||||
return self
|
||||
|
||||
def fetchall(self):
|
||||
return self.items
|
||||
|
||||
|
||||
class DBConnection:
|
||||
""" A fake database connection object used in tests. """
|
||||
@staticmethod
|
||||
def cursor():
|
||||
return DBCursor()
|
||||
|
||||
@staticmethod
|
||||
def create_function(name, num_args, func):
|
||||
pass
|
||||
|
||||
|
||||
def make_temp_config(*replacements):
|
||||
""" Generate a temporary config file with a set of replacements.
|
||||
|
||||
:param *replacements: A variable number of tuple regex replacement pairs
|
||||
:return: A tuple containing (temp directory, temp config file)
|
||||
"""
|
||||
aurwebdir = aurweb.config.get("options", "aurwebdir")
|
||||
config_file = os.path.join(aurwebdir, "conf", "config.dev")
|
||||
config_defaults = os.path.join(aurwebdir, "conf", "config.defaults")
|
||||
|
||||
db_name = aurweb.config.get("database", "name")
|
||||
db_host = aurweb.config.get_with_fallback("database", "host", "localhost")
|
||||
db_port = aurweb.config.get_with_fallback("database", "port", "3306")
|
||||
db_user = aurweb.config.get_with_fallback("database", "user", "root")
|
||||
db_password = aurweb.config.get_with_fallback("database", "password", None)
|
||||
|
||||
# Replacements to perform before *replacements.
|
||||
# These serve as generic replacements in config.dev
|
||||
perform = (
|
||||
(r"name = .+", f"name = {db_name}"),
|
||||
(r"host = .+", f"host = {db_host}"),
|
||||
(r";port = .+", f";port = {db_port}"),
|
||||
(r"user = .+", f"user = {db_user}"),
|
||||
(r"password = .+", f"password = {db_password}"),
|
||||
("YOUR_AUR_ROOT", aurwebdir),
|
||||
)
|
||||
|
||||
tmpdir = tempfile.TemporaryDirectory()
|
||||
tmp = os.path.join(tmpdir.name, "config.tmp")
|
||||
with open(config_file) as f:
|
||||
config = f.read()
|
||||
for repl in tuple(perform + replacements):
|
||||
config = re.sub(repl[0], repl[1], config)
|
||||
with open(tmp, "w") as o:
|
||||
o.write(config)
|
||||
with open(config_defaults) as i:
|
||||
with open(f"{tmp}.defaults", "w") as o:
|
||||
o.write(i.read())
|
||||
return tmpdir, tmp
|
||||
|
||||
|
||||
def make_temp_sqlite_config():
|
||||
return make_temp_config((r"backend = .*", "backend = sqlite"),
|
||||
(r"name = .*", "name = /tmp/aurweb.sqlite3"))
|
||||
|
||||
|
||||
def make_temp_mysql_config():
|
||||
return make_temp_config((r"backend = .*", "backend = mysql"),
|
||||
(r"name = .*", "name = aurweb_test"))
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def setup(db_test):
|
||||
if os.path.exists("/tmp/aurweb.sqlite3"):
|
||||
os.remove("/tmp/aurweb.sqlite3")
|
||||
|
||||
|
||||
def test_sqlalchemy_sqlite_url():
|
||||
tmpctx, tmp = make_temp_sqlite_config()
|
||||
with tmpctx:
|
||||
with mock.patch.dict(os.environ, {"AUR_CONFIG": tmp}):
|
||||
aurweb.config.rehash()
|
||||
assert db.get_sqlalchemy_url()
|
||||
aurweb.config.rehash()
|
||||
|
||||
|
||||
def test_sqlalchemy_mysql_url():
|
||||
tmpctx, tmp = make_temp_mysql_config()
|
||||
with tmpctx:
|
||||
with mock.patch.dict(os.environ, {"AUR_CONFIG": tmp}):
|
||||
aurweb.config.rehash()
|
||||
assert db.get_sqlalchemy_url()
|
||||
aurweb.config.rehash()
|
||||
|
||||
|
||||
def test_sqlalchemy_mysql_port_url():
|
||||
tmpctx, tmp = make_temp_config((r";port = 3306", "port = 3306"))
|
||||
|
||||
with tmpctx:
|
||||
with mock.patch.dict(os.environ, {"AUR_CONFIG": tmp}):
|
||||
aurweb.config.rehash()
|
||||
assert db.get_sqlalchemy_url()
|
||||
aurweb.config.rehash()
|
||||
|
||||
|
||||
def test_sqlalchemy_mysql_socket_url():
|
||||
tmpctx, tmp = make_temp_config()
|
||||
|
||||
with tmpctx:
|
||||
with mock.patch.dict(os.environ, {"AUR_CONFIG": tmp}):
|
||||
aurweb.config.rehash()
|
||||
assert db.get_sqlalchemy_url()
|
||||
aurweb.config.rehash()
|
||||
|
||||
|
||||
def test_sqlalchemy_unknown_backend():
|
||||
tmpctx, tmp = make_temp_config((r"backend = .+", "backend = blah"))
|
||||
|
||||
with tmpctx:
|
||||
with mock.patch.dict(os.environ, {"AUR_CONFIG": tmp}):
|
||||
aurweb.config.rehash()
|
||||
with pytest.raises(ValueError):
|
||||
db.get_sqlalchemy_url()
|
||||
aurweb.config.rehash()
|
||||
|
||||
|
||||
def test_db_connects_without_fail():
|
||||
""" This only tests the actual config supplied to pytest. """
|
||||
db.connect()
|
||||
|
||||
|
||||
def test_connection_class_unsupported_backend():
|
||||
tmpctx, tmp = make_temp_config((r"backend = .+", "backend = blah"))
|
||||
|
||||
with tmpctx:
|
||||
with mock.patch.dict(os.environ, {"AUR_CONFIG": tmp}):
|
||||
aurweb.config.rehash()
|
||||
with pytest.raises(ValueError):
|
||||
db.Connection()
|
||||
aurweb.config.rehash()
|
||||
|
||||
|
||||
@mock.patch("MySQLdb.connect", mock.MagicMock(return_value=True))
|
||||
def test_connection_mysql():
|
||||
tmpctx, tmp = make_temp_mysql_config()
|
||||
with tmpctx:
|
||||
with mock.patch.dict(os.environ, {"AUR_CONFIG": tmp}):
|
||||
aurweb.config.rehash()
|
||||
db.Connection()
|
||||
aurweb.config.rehash()
|
||||
|
||||
|
||||
def test_create_delete():
|
||||
with db.begin():
|
||||
account_type = db.create(AccountType, AccountType="test")
|
||||
|
||||
record = db.query(AccountType, AccountType.AccountType == "test").first()
|
||||
assert record is not None
|
||||
|
||||
with db.begin():
|
||||
db.delete(account_type)
|
||||
|
||||
record = db.query(AccountType, AccountType.AccountType == "test").first()
|
||||
assert record is None
|
||||
|
||||
|
||||
def test_add_commit():
|
||||
# Use db.add and db.commit to add a temporary record.
|
||||
account_type = AccountType(AccountType="test")
|
||||
with db.begin():
|
||||
db.add(account_type)
|
||||
|
||||
# Assert it got created in the DB.
|
||||
assert bool(account_type.ID)
|
||||
|
||||
# Query the DB for it and compare the record with our object.
|
||||
record = db.query(AccountType, AccountType.AccountType == "test").first()
|
||||
assert record == account_type
|
||||
|
||||
# Remove the record.
|
||||
with db.begin():
|
||||
db.delete(account_type)
|
||||
|
||||
|
||||
def test_connection_executor_mysql_paramstyle():
|
||||
executor = db.ConnectionExecutor(None, backend="mysql")
|
||||
assert executor.paramstyle() == "format"
|
||||
|
||||
|
||||
@mock.patch("sqlite3.paramstyle", "pyformat")
|
||||
def test_connection_executor_sqlite_paramstyle():
|
||||
executor = db.ConnectionExecutor(None, backend="sqlite")
|
||||
assert executor.paramstyle() == sqlite3.paramstyle
|
||||
|
||||
|
||||
def test_name_without_pytest_current_test():
|
||||
with mock.patch.dict("os.environ", {}, clear=True):
|
||||
dbname = aurweb.db.name()
|
||||
assert dbname == aurweb.config.get("database", "name")
|
test/test_defaults.py (new file, 14 lines)
@@ -0,0 +1,14 @@
from aurweb import defaults


def test_fallback_pp():
    assert defaults.fallback_pp(75) == defaults.PP
    assert defaults.fallback_pp(100) == 100


def test_pp():
    assert defaults.PP == 50


def test_o():
    assert defaults.O == 0
test/test_dependency_type.py (new file, 34 lines)
@@ -0,0 +1,34 @@
import pytest

from aurweb.db import begin, create, delete, query
from aurweb.models.dependency_type import DependencyType


@pytest.fixture(autouse=True)
def setup(db_test):
    return


def test_dependency_types():
    dep_types = ["depends", "makedepends", "checkdepends", "optdepends"]
    for dep_type in dep_types:
        dependency_type = query(DependencyType,
                                DependencyType.Name == dep_type).first()
        assert dependency_type is not None


def test_dependency_type_creation():
    with begin():
        dependency_type = create(DependencyType, Name="Test Type")
    assert bool(dependency_type.ID)
    assert dependency_type.Name == "Test Type"
    with begin():
        delete(dependency_type)


def test_dependency_type_null_name_uses_default():
    with begin():
        dependency_type = create(DependencyType)
    assert dependency_type.Name == str()
    with begin():
        delete(dependency_type)
test/test_email.py (new file, 59 lines)
@@ -0,0 +1,59 @@
import io

from subprocess import PIPE, Popen

import pytest

from aurweb import config
from aurweb.testing.email import Email


@pytest.fixture(autouse=True)
def setup(email_test):
    return


def sendmail(from_: str, to_: str, content: str) -> None:
    binary = config.get("notifications", "sendmail")
    proc = Popen(binary, stdin=PIPE, stdout=PIPE, stderr=PIPE)
    content = f"From: {from_}\nTo: {to_}\n\n{content}"
    proc.communicate(content.encode())
    proc.wait()
    assert proc.returncode == 0


def test_email_glue():
    """ Test that Email.glue() decodes both base64 and decoded content. """
    body = "Test email."
    sendmail("test@example.org", "test@example.org", body)
    assert Email.count() == 1

    email1 = Email(1)
    email2 = Email(1)
    assert email1.glue() == email2.glue()


def test_email_dump():
    """ Test that Email.dump() dumps a single email. """
    body = "Test email."
    sendmail("test@example.org", "test@example.org", body)
    assert Email.count() == 1

    stdout = io.StringIO()
    Email.dump(file=stdout)
    content = stdout.getvalue()
    assert "== Email #1 ==" in content


def test_email_dump_multiple():
    """ Test that Email.dump() dumps multiple emails. """
    body = "Test email."
    sendmail("test@example.org", "test@example.org", body)
    sendmail("test2@example.org", "test2@example.org", body)
    assert Email.count() == 2

    stdout = io.StringIO()
    Email.dump(file=stdout)
    content = stdout.getvalue()
    assert "== Email #1 ==" in content
    assert "== Email #2 ==" in content
test/test_exceptions.py (new file, 106 lines)
@@ -0,0 +1,106 @@
from aurweb import exceptions


def test_aurweb_exception():
    try:
        raise exceptions.AurwebException("test")
    except exceptions.AurwebException as exc:
        assert str(exc) == "test"


def test_maintenance_exception():
    try:
        raise exceptions.MaintenanceException("test")
    except exceptions.MaintenanceException as exc:
        assert str(exc) == "test"


def test_banned_exception():
    try:
        raise exceptions.BannedException("test")
    except exceptions.BannedException as exc:
        assert str(exc) == "test"


def test_already_voted_exception():
    try:
        raise exceptions.AlreadyVotedException("test")
    except exceptions.AlreadyVotedException as exc:
        assert str(exc) == "already voted for package base: test"


def test_broken_update_hook_exception():
    try:
        raise exceptions.BrokenUpdateHookException("test")
    except exceptions.BrokenUpdateHookException as exc:
        assert str(exc) == "broken update hook: test"


def test_invalid_arguments_exception():
    try:
        raise exceptions.InvalidArgumentsException("test")
    except exceptions.InvalidArgumentsException as exc:
        assert str(exc) == "test"


def test_invalid_packagebase_exception():
    try:
        raise exceptions.InvalidPackageBaseException("test")
    except exceptions.InvalidPackageBaseException as exc:
        assert str(exc) == "package base not found: test"


def test_invalid_comment_exception():
    try:
        raise exceptions.InvalidCommentException("test")
    except exceptions.InvalidCommentException as exc:
        assert str(exc) == "comment is too short: test"


def test_invalid_reason_exception():
    try:
        raise exceptions.InvalidReasonException("test")
    except exceptions.InvalidReasonException as exc:
        assert str(exc) == "invalid reason: test"


def test_invalid_user_exception():
    try:
        raise exceptions.InvalidUserException("test")
    except exceptions.InvalidUserException as exc:
        assert str(exc) == "unknown user: test"


def test_not_voted_exception():
    try:
        raise exceptions.NotVotedException("test")
    except exceptions.NotVotedException as exc:
        assert str(exc) == "missing vote for package base: test"


def test_packagebase_exists_exception():
    try:
        raise exceptions.PackageBaseExistsException("test")
    except exceptions.PackageBaseExistsException as exc:
        assert str(exc) == "package base already exists: test"


def test_permission_denied_exception():
    try:
        raise exceptions.PermissionDeniedException("test")
    except exceptions.PermissionDeniedException as exc:
        assert str(exc) == "permission denied: test"


def test_repository_name_exception():
    try:
        raise exceptions.InvalidRepositoryNameException("test")
    except exceptions.InvalidRepositoryNameException as exc:
        assert str(exc) == "invalid repository name: test"


def test_invariant_error():
    try:
        raise exceptions.InvariantError("test")
    except exceptions.InvariantError as exc:
        assert str(exc) == "test"
test/test_filelock.py (new file, 26 lines)
@@ -0,0 +1,26 @@
import py

from _pytest.logging import LogCaptureFixture

from aurweb.testing.filelock import FileLock


def test_filelock(tmpdir: py.path.local):
    cb_path = None

    def setup(path: str):
        nonlocal cb_path
        cb_path = str(path)

    flock = FileLock(tmpdir, "test")
    assert not flock.lock(on_create=setup)
    assert cb_path == str(tmpdir / "test")
    assert flock.lock()


def test_filelock_default(caplog: LogCaptureFixture, tmpdir: py.path.local):
    # Test default_on_create here.
    flock = FileLock(tmpdir, "test")
    assert not flock.lock()
    assert caplog.messages[0] == f"Filelock at {flock.path} acquired."
    assert flock.lock()
test/test_filters.py (new file, 36 lines)
@@ -0,0 +1,36 @@
from datetime import datetime
from zoneinfo import ZoneInfo

from aurweb import filters, time


def test_timestamp_to_datetime():
    ts = time.utcnow()
    dt = datetime.utcfromtimestamp(int(ts))
    assert filters.timestamp_to_datetime(ts) == dt


def test_as_timezone():
    ts = time.utcnow()
    dt = filters.timestamp_to_datetime(ts)
    assert filters.as_timezone(dt, "UTC") == dt.astimezone(tz=ZoneInfo("UTC"))


def test_number_format():
    assert filters.number_format(0.222, 2) == "0.22"
    assert filters.number_format(0.226, 2) == "0.23"


def test_extend_query():
    """ Test extension of a query via extend_query. """
    query = {"a": "b"}
    extended = filters.extend_query(query, ("a", "c"), ("b", "d"))
    assert extended.get("a") == "c"
    assert extended.get("b") == "d"


def test_to_qs():
    """ Test conversion from a query dictionary to a query string. """
    query = {"a": "b", "c": [1, 2, 3]}
    qs = filters.to_qs(query)
    assert qs == "a=b&c=1&c=2&c=3"
test/test_group.py (new file, 23 lines)
@@ -0,0 +1,23 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.group import Group


@pytest.fixture(autouse=True)
def setup(db_test):
    return


def test_group_creation():
    with db.begin():
        group = db.create(Group, Name="Test Group")
    assert bool(group.ID)
    assert group.Name == "Test Group"


def test_group_null_name_raises_exception():
    with pytest.raises(IntegrityError):
        Group()
test/test_homepage.py (new file, 223 lines)
@@ -0,0 +1,223 @@
import re
|
||||
|
||||
from http import HTTPStatus
|
||||
from unittest.mock import patch
|
||||
|
||||
import pytest
|
||||
|
||||
from fastapi.testclient import TestClient
|
||||
|
||||
from aurweb import db, time
|
||||
from aurweb.asgi import app
|
||||
from aurweb.models.account_type import USER_ID
|
||||
from aurweb.models.package import Package
|
||||
from aurweb.models.package_base import PackageBase
|
||||
from aurweb.models.package_comaintainer import PackageComaintainer
|
||||
from aurweb.models.package_request import PackageRequest
|
||||
from aurweb.models.request_type import DELETION_ID, RequestType
|
||||
from aurweb.models.user import User
|
||||
from aurweb.redis import redis_connection
|
||||
from aurweb.testing.html import parse_root
|
||||
from aurweb.testing.requests import Request
|
||||
|
||||
client = TestClient(app)
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def setup(db_test):
|
||||
return
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def user():
|
||||
with db.begin():
|
||||
user = db.create(User, Username="test", Email="test@example.org",
|
||||
Passwd="testPassword", AccountTypeID=USER_ID)
|
||||
yield user
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def redis():
|
||||
redis = redis_connection()
|
||||
|
||||
def delete_keys():
|
||||
# Cleanup keys if they exist.
|
||||
for key in ("package_count", "orphan_count", "user_count",
|
||||
"trusted_user_count", "seven_days_old_added",
|
||||
"seven_days_old_updated", "year_old_updated",
|
||||
"never_updated", "package_updates"):
|
||||
if redis.get(key) is not None:
|
||||
redis.delete(key)
|
||||
|
||||
delete_keys()
|
||||
yield redis
|
||||
delete_keys()
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def packages(user):
|
||||
""" Yield a list of num_packages Package objects maintained by user. """
|
||||
num_packages = 50 # Tunable
|
||||
|
||||
# For i..num_packages, create a package named pkg_{i}.
|
||||
pkgs = []
|
||||
now = time.utcnow()
|
||||
with db.begin():
|
||||
for i in range(num_packages):
|
||||
pkgbase = db.create(PackageBase, Name=f"pkg_{i}",
|
||||
Maintainer=user, Packager=user,
|
||||
SubmittedTS=now, ModifiedTS=now)
|
||||
pkg = db.create(Package, PackageBase=pkgbase, Name=pkgbase.Name)
|
||||
pkgs.append(pkg)
|
||||
now += 1
|
||||
|
||||
yield pkgs
|
||||
|
||||
|
||||
def test_homepage():
|
||||
with client as request:
|
||||
response = request.get("/")
|
||||
assert response.status_code == int(HTTPStatus.OK)
|
||||
|
||||
|
||||
@patch('aurweb.util.get_ssh_fingerprints')
|
||||
def test_homepage_ssh_fingerprints(get_ssh_fingerprints_mock):
|
||||
fingerprints = {'Ed25519': "SHA256:RFzBCUItH9LZS0cKB5UE6ceAYhBD5C8GeOBip8Z11+4"}
|
||||
get_ssh_fingerprints_mock.return_value = fingerprints
|
||||
|
||||
with client as request:
|
||||
response = request.get("/")
|
||||
|
||||
for key, value in fingerprints.items():
|
||||
assert key in response.content.decode()
|
||||
assert value in response.content.decode()
|
||||
assert 'The following SSH fingerprints are used for the AUR' in response.content.decode()
|
||||
|
||||
|
||||
@patch('aurweb.util.get_ssh_fingerprints')
|
||||
def test_homepage_no_ssh_fingerprints(get_ssh_fingerprints_mock):
|
||||
get_ssh_fingerprints_mock.return_value = {}
|
||||
|
||||
with client as request:
|
||||
response = request.get("/")
|
||||
|
||||
assert 'The following SSH fingerprints are used for the AUR' not in response.content.decode()
|
||||
|
||||
|
||||
def test_homepage_stats(redis, packages):
|
||||
with client as request:
|
||||
response = request.get("/")
|
||||
assert response.status_code == int(HTTPStatus.OK)
|
||||
|
||||
root = parse_root(response.text)
|
||||
|
||||
expectations = [
|
||||
("Packages", r'\d+'),
|
||||
("Orphan Packages", r'\d+'),
|
||||
("Packages added in the past 7 days", r'\d+'),
|
||||
("Packages updated in the past 7 days", r'\d+'),
|
||||
("Packages updated in the past year", r'\d+'),
|
||||
("Packages never updated", r'\d+'),
|
||||
("Registered Users", r'\d+'),
|
||||
("Trusted Users", r'\d+')
|
||||
]
|
||||
|
||||
stats = root.xpath('//div[@id="pkg-stats"]//tr')
|
||||
for i, expected in enumerate(expectations):
|
||||
expected_key, expected_regex = expected
|
||||
key, value = stats[i].xpath('./td')
|
||||
assert key.text.strip() == expected_key
|
||||
assert re.match(expected_regex, value.text.strip())
|
||||
|
||||
|
||||
def test_homepage_updates(redis, packages):
|
||||
with client as request:
|
||||
response = request.get("/")
|
||||
assert response.status_code == int(HTTPStatus.OK)
|
||||
# Run the request a second time to exercise the Redis path.
|
||||
response = request.get("/")
|
||||
assert response.status_code == int(HTTPStatus.OK)
|
||||
|
||||
root = parse_root(response.text)
|
||||
|
||||
# We expect to see the latest 15 packages, which happens to be
|
||||
# pkg_49 .. pkg_34. So, create a list of expectations using a range
|
||||
# starting at 49, stepping down to 49 - 15, -1 step at a time.
|
||||
expectations = [f"pkg_{i}" for i in range(50 - 1, 50 - 1 - 15, -1)]
|
||||
updates = root.xpath('//div[@id="pkg-updates"]/table/tbody/tr')
|
||||
for i, expected in enumerate(expectations):
|
||||
pkgname = updates[i].xpath('./td/a').pop(0)
|
||||
assert pkgname.text.strip() == expected
|
||||
|
||||
|
||||
def test_homepage_dashboard(redis, packages, user):
|
||||
# Create Comaintainer records for all of the packages.
|
||||
with db.begin():
|
||||
for pkg in packages:
|
||||
db.create(PackageComaintainer,
|
||||
PackageBase=pkg.PackageBase,
|
||||
User=user, Priority=1)
|
||||
|
||||
cookies = {"AURSID": user.login(Request(), "testPassword")}
|
||||
with client as request:
|
||||
response = request.get("/", cookies=cookies)
|
||||
assert response.status_code == int(HTTPStatus.OK)
|
||||
|
||||
root = parse_root(response.text)
|
||||
|
||||
# Assert some expectations that we end up getting all fifty
|
||||
# packages in the "My Packages" table.
|
||||
expectations = [f"pkg_{i}" for i in range(50 - 1, 0, -1)]
|
||||
my_packages = root.xpath('//table[@id="my-packages"]/tbody/tr')
|
||||
for i, expected in enumerate(expectations):
|
||||
name, version, votes, pop, voted, notify, desc, maint \
|
||||
= my_packages[i].xpath('./td')
|
||||
assert name.xpath('./a').pop(0).text.strip() == expected
|
||||
|
||||
# Do the same for the Comaintained Packages table.
|
||||
my_packages = root.xpath('//table[@id="comaintained-packages"]/tbody/tr')
|
||||
for i, expected in enumerate(expectations):
|
||||
name, version, votes, pop, voted, notify, desc, maint \
|
||||
= my_packages[i].xpath('./td')
|
||||
assert name.xpath('./a').pop(0).text.strip() == expected
|
||||
|
||||
|
||||
def test_homepage_dashboard_requests(redis, packages, user):
|
||||
now = time.utcnow()
|
||||
|
||||
pkg = packages[0]
|
||||
reqtype = db.query(RequestType, RequestType.ID == DELETION_ID).first()
|
||||
with db.begin():
|
||||
pkgreq = db.create(PackageRequest, PackageBase=pkg.PackageBase,
|
||||
PackageBaseName=pkg.PackageBase.Name,
|
||||
User=user, Comments=str(),
|
||||
ClosureComment=str(), RequestTS=now,
|
||||
RequestType=reqtype)
|
||||
|
||||
cookies = {"AURSID": user.login(Request(), "testPassword")}
|
||||
with client as request:
|
||||
response = request.get("/", cookies=cookies)
|
||||
assert response.status_code == int(HTTPStatus.OK)
|
||||
|
||||
root = parse_root(response.text)
|
||||
request = root.xpath('//table[@id="pkgreq-results"]/tbody/tr').pop(0)
|
||||
pkgname = request.xpath('./td/a').pop(0)
|
||||
assert pkgname.text.strip() == pkgreq.PackageBaseName
|
||||
|
||||
|
||||
def test_homepage_dashboard_flagged_packages(redis, packages, user):
|
||||
# Set the first Package flagged by setting its OutOfDateTS column.
|
||||
pkg = packages[0]
|
||||
with db.begin():
|
||||
pkg.PackageBase.OutOfDateTS = time.utcnow()
|
||||
|
||||
cookies = {"AURSID": user.login(Request(), "testPassword")}
|
||||
with client as request:
|
||||
response = request.get("/", cookies=cookies)
|
||||
assert response.status_code == int(HTTPStatus.OK)
|
||||
|
||||
# Check to see that the package showed up in the Flagged Packages table.
|
||||
root = parse_root(response.text)
|
||||
flagged_pkg = root.xpath('//table[@id="flagged-packages"]/tbody/tr').pop(0)
|
||||
flagged_name = flagged_pkg.xpath('./td/a').pop(0)
|
||||
assert flagged_name.text.strip() == pkg.Name
|
test/test_html.py (new file, 167 lines)
@@ -0,0 +1,167 @@
""" A test suite used to test HTML renders in different cases. """
|
||||
from http import HTTPStatus
|
||||
|
||||
import fastapi
|
||||
import pytest
|
||||
|
||||
from fastapi import HTTPException
|
||||
from fastapi.testclient import TestClient
|
||||
|
||||
from aurweb import asgi, db
|
||||
from aurweb.models import PackageBase
|
||||
from aurweb.models.account_type import TRUSTED_USER_ID, USER_ID
|
||||
from aurweb.models.user import User
|
||||
from aurweb.testing.html import get_errors, get_successes, parse_root
|
||||
from aurweb.testing.requests import Request
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def setup(db_test):
|
||||
return
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def client() -> TestClient:
|
||||
yield TestClient(app=asgi.app)
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def user() -> User:
|
||||
with db.begin():
|
||||
user = db.create(User, Username="test", Email="test@example.org",
|
||||
Passwd="testPassword", AccountTypeID=USER_ID)
|
||||
yield user
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def trusted_user(user: User) -> User:
|
||||
with db.begin():
|
||||
user.AccountTypeID = TRUSTED_USER_ID
|
||||
yield user
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def pkgbase(user: User) -> PackageBase:
|
||||
with db.begin():
|
||||
pkgbase = db.create(PackageBase, Name="test-pkg", Maintainer=user)
|
||||
yield pkgbase
|
||||
|
||||
|
||||
def test_archdev_navbar(client: TestClient):
|
||||
expected = [
|
||||
"AUR Home",
|
||||
"Packages",
|
||||
"Register",
|
||||
"Login"
|
||||
]
|
||||
with client as request:
|
||||
resp = request.get("/")
|
||||
assert resp.status_code == int(HTTPStatus.OK)
|
||||
|
||||
root = parse_root(resp.text)
|
||||
items = root.xpath('//div[@id="archdev-navbar"]/ul/li/a')
|
||||
for i, item in enumerate(items):
|
||||
assert item.text.strip() == expected[i]
|
||||
|
||||
|
||||
def test_archdev_navbar_authenticated(client: TestClient, user: User):
|
||||
expected = [
|
||||
"Dashboard",
|
||||
"Packages",
|
||||
"Requests",
|
||||
"My Account",
|
||||
"Logout"
|
||||
]
|
||||
cookies = {"AURSID": user.login(Request(), "testPassword")}
|
||||
with client as request:
|
||||
resp = request.get("/", cookies=cookies)
|
||||
assert resp.status_code == int(HTTPStatus.OK)
|
||||
|
||||
root = parse_root(resp.text)
|
||||
items = root.xpath('//div[@id="archdev-navbar"]/ul/li/a')
|
||||
for i, item in enumerate(items):
|
||||
assert item.text.strip() == expected[i]
|
||||
|
||||
|
||||
def test_archdev_navbar_authenticated_tu(client: TestClient,
|
||||
trusted_user: User):
|
||||
expected = [
|
||||
"Dashboard",
|
||||
"Packages",
|
||||
"Requests",
|
||||
"Accounts",
|
||||
"My Account",
|
||||
"Trusted User",
|
||||
"Logout"
|
||||
]
|
||||
cookies = {"AURSID": trusted_user.login(Request(), "testPassword")}
|
||||
with client as request:
|
||||
resp = request.get("/", cookies=cookies)
|
||||
assert resp.status_code == int(HTTPStatus.OK)
|
||||
|
||||
root = parse_root(resp.text)
|
||||
items = root.xpath('//div[@id="archdev-navbar"]/ul/li/a')
|
||||
for i, item in enumerate(items):
|
||||
assert item.text.strip() == expected[i]
|
||||
|
||||
|
||||
def test_get_errors():
|
||||
html = """
|
||||
<ul class="errorlist">
|
||||
<li>Test</li>
|
||||
</ul>
|
||||
"""
|
||||
errors = get_errors(html)
|
||||
assert errors[0].text.strip() == "Test"
|
||||
|
||||
|
||||
def test_get_successes():
|
||||
html = """
|
||||
<ul class="success">
|
||||
<li>Test</li>
|
||||
</ul>
|
||||
"""
|
||||
successes = get_successes(html)
|
||||
assert successes[0].text.strip() == "Test"
|
||||
|
||||
|
||||
def test_metrics(client: TestClient):
|
||||
with client as request:
|
||||
resp = request.get("/metrics")
|
||||
assert resp.status_code == int(HTTPStatus.OK)
|
||||
assert resp.headers.get("Content-Type").startswith("text/plain")
|
||||
|
||||
|
||||
def test_404_with_valid_pkgbase(client: TestClient, pkgbase: PackageBase):
|
||||
""" Test HTTPException with status_code == 404 and valid pkgbase. """
|
||||
endpoint = f"/{pkgbase.Name}"
|
||||
with client as request:
|
||||
response = request.get(endpoint)
|
||||
assert response.status_code == int(HTTPStatus.NOT_FOUND)
|
||||
|
||||
body = response.text
|
||||
assert "404 - Page Not Found" in body
|
||||
assert "To clone the Git repository" in body
|
||||
|
||||
|
||||
def test_404(client: TestClient):
|
||||
""" Test HTTPException with status_code == 404 without a valid pkgbase. """
|
||||
with client as request:
|
||||
response = request.get("/nonexistentroute")
|
||||
assert response.status_code == int(HTTPStatus.NOT_FOUND)
|
||||
|
||||
body = response.text
|
||||
assert "404 - Page Not Found" in body
|
||||
# No `pkgbase` is provided here; we don't see the extra info.
|
||||
assert "To clone the Git repository" not in body
|
||||
|
||||
|
||||
def test_503(client: TestClient):
|
||||
""" Test HTTPException with status_code == 503 (Service Unavailable). """
|
||||
@asgi.app.get("/raise-503")
|
||||
async def raise_503(request: fastapi.Request):
|
||||
raise HTTPException(status_code=HTTPStatus.SERVICE_UNAVAILABLE)
|
||||
|
||||
with TestClient(app=asgi.app) as request:
|
||||
response = request.get("/raise-503")
|
||||
assert response.status_code == int(HTTPStatus.SERVICE_UNAVAILABLE)
|
test/test_initdb.py (new file, 29 lines)
@@ -0,0 +1,29 @@
import pytest

import aurweb.config
import aurweb.db
import aurweb.initdb

from aurweb.models.account_type import AccountType


@pytest.fixture(autouse=True)
def setup(db_test):
    return


class Args:
    use_alembic = True
    verbose = True


def test_run():
    from aurweb.schema import metadata
    aurweb.db.kill_engine()
    metadata.drop_all(aurweb.db.get_engine())
    aurweb.initdb.run(Args())

    # Check that constant table rows got added via initdb.
    record = aurweb.db.query(AccountType,
                             AccountType.AccountType == "User").first()
    assert record is not None
test/test_l10n.py (new file, 52 lines)
@@ -0,0 +1,52 @@
""" Test our l10n module. """
|
||||
from aurweb import filters, l10n
|
||||
from aurweb.testing.requests import Request
|
||||
|
||||
|
||||
def test_translator():
|
||||
""" Test creating l10n translation tools. """
|
||||
de_home = l10n.translator.translate("Home", "de")
|
||||
assert de_home == "Startseite"
|
||||
|
||||
|
||||
def test_get_request_language():
|
||||
""" First, tests default_lang, then tests a modified AURLANG cookie. """
|
||||
request = Request()
|
||||
assert l10n.get_request_language(request) == "en"
|
||||
|
||||
request.cookies["AURLANG"] = "de"
|
||||
assert l10n.get_request_language(request) == "de"
|
||||
|
||||
|
||||
def test_get_raw_translator_for_request():
|
||||
""" Make sure that get_raw_translator_for_request is giving us
|
||||
the translator we expect. """
|
||||
request = Request()
|
||||
request.cookies["AURLANG"] = "de"
|
||||
translator = l10n.get_raw_translator_for_request(request)
|
||||
assert translator.gettext("Home") == \
|
||||
l10n.translator.translate("Home", "de")
|
||||
|
||||
|
||||
def test_get_translator_for_request():
|
||||
""" Make sure that get_translator_for_request is giving us back
|
||||
our expected translation function. """
|
||||
request = Request()
|
||||
request.cookies["AURLANG"] = "de"
|
||||
|
||||
translate = l10n.get_translator_for_request(request)
|
||||
assert translate("Home") == "Startseite"
|
||||
|
||||
|
||||
def test_tn_filter():
|
||||
request = Request()
|
||||
request.cookies["AURLANG"] = "en"
|
||||
context = {"language": "en", "request": request}
|
||||
|
||||
translated = filters.tn(context, 1, "%d package found.",
|
||||
"%d packages found.")
|
||||
assert translated == "%d package found."
|
||||
|
||||
translated = filters.tn(context, 2, "%d package found.",
|
||||
"%d packages found.")
|
||||
assert translated == "%d packages found."
|
test/test_license.py (new file, 23 lines)
@@ -0,0 +1,23 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.license import License


@pytest.fixture(autouse=True)
def setup(db_test):
    return


def test_license_creation():
    with db.begin():
        license = db.create(License, Name="Test License")
    assert bool(license.ID)
    assert license.Name == "Test License"


def test_license_null_name_raises_exception():
    with pytest.raises(IntegrityError):
        License()
test/test_logging.py (new file, 16 lines)
@@ -0,0 +1,16 @@
from aurweb import logging

logger = logging.get_logger(__name__)


def test_logging(caplog):
    logger.info("Test log.")

    # Test that we logged once.
    assert len(caplog.records) == 1

    # Test that our log record was of INFO level.
    assert caplog.records[0].levelname == "INFO"

    # Test that our message got logged.
    assert "Test log." in caplog.text
test/test_mkpkglists.py (new file, 215 lines)
@@ -0,0 +1,215 @@
import json
|
||||
|
||||
from typing import List, Union
|
||||
from unittest import mock
|
||||
|
||||
import pytest
|
||||
|
||||
from aurweb import config, db, util
|
||||
from aurweb.models import License, Package, PackageBase, PackageDependency, PackageLicense, User
|
||||
from aurweb.models.account_type import USER_ID
|
||||
from aurweb.models.dependency_type import DEPENDS_ID
|
||||
from aurweb.testing import noop
|
||||
|
||||
|
||||
class FakeFile:
|
||||
data = str()
|
||||
__exit__ = noop
|
||||
|
||||
def __init__(self, modes: str) -> "FakeFile":
|
||||
self.modes = modes
|
||||
|
||||
def __enter__(self, *args, **kwargs) -> "FakeFile":
|
||||
return self
|
||||
|
||||
def write(self, data: Union[str, bytes]) -> None:
|
||||
if isinstance(data, bytes):
|
||||
data = data.decode()
|
||||
self.data += data
|
||||
|
||||
def writelines(self, dataset: List[Union[str, bytes]]) -> None:
|
||||
util.apply_all(dataset, self.write)
|
||||
|
||||
def close(self) -> None:
|
||||
return
|
||||
|
||||
|
||||
class MockGzipOpen:
|
||||
def __init__(self):
|
||||
self.gzips = dict()
|
||||
|
||||
def open(self, archive: str, modes: str):
|
||||
self.gzips[archive] = FakeFile(modes)
|
||||
return self.gzips.get(archive)
|
||||
|
||||
def get(self, key: str) -> FakeFile:
|
||||
return self.gzips.get(key)
|
||||
|
||||
def __getitem__(self, key: str) -> FakeFile:
|
||||
return self.get(key)
|
||||
|
||||
def __contains__(self, key: str) -> bool:
|
||||
return key in self.gzips
|
||||
|
||||
def data(self, archive: str):
|
||||
return self.get(archive).data
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def setup(db_test):
|
||||
config.rehash()
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def user() -> User:
|
||||
with db.begin():
|
||||
user = db.create(User, Username="test",
|
||||
Email="test@example.org",
|
||||
Passwd="testPassword",
|
||||
AccountTypeID=USER_ID)
|
||||
yield user
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def packages(user: User) -> List[Package]:
|
||||
output = []
|
||||
with db.begin():
|
||||
lic = db.create(License, Name="GPL")
|
||||
for i in range(5):
|
||||
# Create the package.
|
||||
pkgbase = db.create(PackageBase, Name=f"pkgbase_{i}",
|
||||
Packager=user)
|
||||
pkg = db.create(Package, PackageBase=pkgbase,
|
||||
Name=f"pkg_{i}")
|
||||
|
||||
# Create some related records.
|
||||
db.create(PackageLicense, Package=pkg, License=lic)
|
||||
db.create(PackageDependency, DepTypeID=DEPENDS_ID,
|
||||
Package=pkg, DepName=f"dep_{i}",
|
||||
DepCondition=">=1.0")
|
||||
|
||||
# Add the package to our output list.
|
||||
output.append(pkg)
|
||||
|
||||
# Sort output by the package name and return it.
|
||||
yield sorted(output, key=lambda k: k.Name)
|
||||
|
||||
|
||||
@mock.patch("os.makedirs", side_effect=noop)
|
||||
def test_mkpkglists_empty(makedirs: mock.MagicMock):
|
||||
gzips = MockGzipOpen()
|
||||
with mock.patch("gzip.open", side_effect=gzips.open):
|
||||
from aurweb.scripts import mkpkglists
|
||||
mkpkglists.main()
|
||||
|
||||
archives = config.get_section("mkpkglists")
|
||||
archives.pop("archivedir")
|
||||
archives.pop("packagesmetaextfile")
|
||||
|
||||
for archive in archives.values():
|
||||
assert archive in gzips
|
||||
|
||||
# Expect that packagesfile got created, but is empty because
|
||||
# we have no DB records.
|
||||
packages_file = archives.get("packagesfile")
|
||||
assert gzips.data(packages_file) == str()
|
||||
|
||||
# Expect that pkgbasefile got created, but is empty because
|
||||
# we have no DB records.
|
||||
users_file = archives.get("pkgbasefile")
|
||||
assert gzips.data(users_file) == str()
|
||||
|
||||
# Expect that userfile got created, but is empty because
|
||||
# we have no DB records.
|
||||
users_file = archives.get("userfile")
|
||||
assert gzips.data(users_file) == str()
|
||||
|
||||
# Expect that packagesmetafile got created, but is empty because
|
||||
# we have no DB records; it's still a valid empty JSON list.
|
||||
meta_file = archives.get("packagesmetafile")
|
||||
assert gzips.data(meta_file) == "[\n]"
|
||||
|
||||
|
||||
@mock.patch("sys.argv", ["mkpkglists", "--extended"])
|
||||
@mock.patch("os.makedirs", side_effect=noop)
|
||||
def test_mkpkglists_extended_empty(makedirs: mock.MagicMock):
|
||||
gzips = MockGzipOpen()
|
||||
with mock.patch("gzip.open", side_effect=gzips.open):
|
||||
from aurweb.scripts import mkpkglists
|
||||
mkpkglists.main()
|
||||
|
||||
archives = config.get_section("mkpkglists")
|
||||
archives.pop("archivedir")
|
||||
|
||||
for archive in archives.values():
|
||||
assert archive in gzips
|
||||
|
||||
# Expect that packagesfile got created, but is empty because
|
||||
# we have no DB records.
|
||||
packages_file = archives.get("packagesfile")
|
||||
assert gzips.data(packages_file) == str()
|
||||
|
||||
# Expect that pkgbasefile got created, but is empty because
|
||||
# we have no DB records.
|
||||
users_file = archives.get("pkgbasefile")
|
||||
assert gzips.data(users_file) == str()
|
||||
|
||||
# Expect that userfile got created, but is empty because
|
||||
# we have no DB records.
|
||||
users_file = archives.get("userfile")
|
||||
assert gzips.data(users_file) == str()
|
||||
|
||||
# Expect that packagesmetafile got created, but is empty because
|
||||
# we have no DB records; it's still a valid empty JSON list.
|
||||
meta_file = archives.get("packagesmetafile")
|
||||
assert gzips.data(meta_file) == "[\n]"
|
||||
|
||||
# Expect that packagesmetafile got created, but is empty because
|
||||
# we have no DB records; it's still a valid empty JSON list.
|
||||
meta_file = archives.get("packagesmetaextfile")
|
||||
assert gzips.data(meta_file) == "[\n]"
|
||||
|
||||
|
||||
@mock.patch("sys.argv", ["mkpkglists", "--extended"])
|
||||
@mock.patch("os.makedirs", side_effect=noop)
|
||||
def test_mkpkglists_extended(makedirs: mock.MagicMock, user: User,
|
||||
packages: List[Package]):
|
||||
gzips = MockGzipOpen()
|
||||
with mock.patch("gzip.open", side_effect=gzips.open):
|
||||
from aurweb.scripts import mkpkglists
|
||||
mkpkglists.main()
|
||||
|
||||
archives = config.get_section("mkpkglists")
|
||||
archives.pop("archivedir")
|
||||
|
||||
for archive in archives.values():
|
||||
assert archive in gzips
|
||||
|
||||
# Expect that packagesfile got created, but is empty because
|
||||
# we have no DB records.
|
||||
packages_file = archives.get("packagesfile")
|
||||
expected = "\n".join([p.Name for p in packages]) + "\n"
|
||||
assert gzips.data(packages_file) == expected
|
||||
|
||||
# Expect that pkgbasefile got created, but is empty because
|
||||
# we have no DB records.
|
||||
users_file = archives.get("pkgbasefile")
|
||||
expected = "\n".join([p.PackageBase.Name for p in packages]) + "\n"
|
||||
assert gzips.data(users_file) == expected
|
||||
|
||||
# Expect that userfile got created, but is empty because
|
||||
# we have no DB records.
|
||||
users_file = archives.get("userfile")
|
||||
assert gzips.data(users_file) == "test\n"
|
||||
|
||||
# Expect that packagesmetafile got created, but is empty because
|
||||
# we have no DB records; it's still a valid empty JSON list.
|
||||
meta_file = archives.get("packagesmetafile")
|
||||
data = json.loads(gzips.data(meta_file))
|
||||
assert len(data) == 5
|
||||
|
||||
# Expect that packagesmetafile got created, but is empty because
|
||||
# we have no DB records; it's still a valid empty JSON list.
|
||||
meta_file = archives.get("packagesmetaextfile")
|
||||
data = json.loads(gzips.data(meta_file))
|
||||
assert len(data) == 5
|
test/test_notify.py (new file, 665 lines)
@@ -0,0 +1,665 @@
from logging import ERROR
from typing import List
from unittest import mock

import pytest

from aurweb import config, db, models, time
from aurweb.models import Package, PackageBase, PackageRequest, User
from aurweb.models.account_type import TRUSTED_USER_ID, USER_ID
from aurweb.models.request_type import ORPHAN_ID
from aurweb.scripts import notify, rendercomment
from aurweb.testing.email import Email
from aurweb.testing.smtp import FakeSMTP, FakeSMTP_SSL

aur_location = config.get("options", "aur_location")
aur_request_ml = config.get("options", "aur_request_ml")


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         Passwd=str(), AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def user1() -> User:
    with db.begin():
        user1 = db.create(User, Username="user1", Email="user1@example.org",
                          Passwd=str(), AccountTypeID=USER_ID)
    yield user1


@pytest.fixture
def user2() -> User:
    with db.begin():
        user2 = db.create(User, Username="user2", Email="user2@example.org",
                          Passwd=str(), AccountTypeID=USER_ID)
    yield user2


@pytest.fixture
def pkgbases(user: User) -> List[PackageBase]:
    now = time.utcnow()

    output = []
    with db.begin():
        for i in range(5):
            output.append(
                db.create(PackageBase, Name=f"pkgbase_{i}",
                          Maintainer=user, SubmittedTS=now,
                          ModifiedTS=now))
            db.create(models.PackageNotification, PackageBase=output[-1],
                      User=user)
    yield output


@pytest.fixture
def pkgreq(user2: User, pkgbases: List[PackageBase]):
    pkgbase = pkgbases[0]
    with db.begin():
        pkgreq_ = db.create(PackageRequest, PackageBase=pkgbase,
                            PackageBaseName=pkgbase.Name, User=user2,
                            ReqTypeID=ORPHAN_ID,
                            Comments="This is a request test comment.",
                            ClosureComment=str())
    yield pkgreq_


@pytest.fixture
def packages(pkgbases: List[PackageBase]) -> List[Package]:
    output = []
    with db.begin():
        for i, pkgbase in enumerate(pkgbases):
            output.append(
                db.create(Package, PackageBase=pkgbase,
                          Name=f"pkg_{i}", Version=f"{i}.0"))
    yield output


def test_out_of_date(user: User, user1: User, user2: User,
                     pkgbases: List[PackageBase]):
    pkgbase = pkgbases[0]
    # Create two comaintainers. We'll pass the maintainer uid to
    # FlagNotification, so we should expect to get two emails.
    with db.begin():
        db.create(models.PackageComaintainer,
                  PackageBase=pkgbase, User=user1, Priority=1)
        db.create(models.PackageComaintainer,
                  PackageBase=pkgbase, User=user2, Priority=2)

    # Send the notification for pkgbases[0].
    notif = notify.FlagNotification(user.ID, pkgbases[0].ID)
    notif.send()

    # Should've gotten three emails: maintainer + the two comaintainers.
    assert Email.count() == 3

    # Comaintainer 1.
    first = Email(1).parse()
    assert first.headers.get("To") == user1.Email

    expected = f"AUR Out-of-date Notification for {pkgbase.Name}"
    assert first.headers.get("Subject") == expected

    # Comaintainer 2.
    second = Email(2).parse()
    assert second.headers.get("To") == user2.Email

    # Maintainer.
    third = Email(3).parse()
    assert third.headers.get("To") == user.Email


def test_reset(user: User):
    with db.begin():
        user.ResetKey = "12345678901234567890123456789012"

    notif = notify.ResetKeyNotification(user.ID)
    notif.send()
    assert Email.count() == 1

    email = Email(1).parse()
    expected = "AUR Password Reset"
    assert email.headers.get("Subject") == expected

    expected = f"""\
A password reset request was submitted for the account test associated
with your email address. If you wish to reset your password follow the
link [1] below, otherwise ignore this message and nothing will happen.

[1] {aur_location}/passreset/?resetkey=12345678901234567890123456789012\
"""
    assert email.body == expected


def test_welcome(user: User):
    with db.begin():
        user.ResetKey = "12345678901234567890123456789012"

    notif = notify.WelcomeNotification(user.ID)
    notif.send()
    assert Email.count() == 1

    email = Email(1).parse()
    expected = "Welcome to the Arch User Repository"
    assert email.headers.get("Subject") == expected

    expected = f"""\
Welcome to the Arch User Repository! In order to set an initial
password for your new account, please click the link [1] below. If the
link does not work, try copying and pasting it into your browser.

[1] {aur_location}/passreset/?resetkey=12345678901234567890123456789012\
"""
    assert email.body == expected


def test_comment(user: User, user2: User, pkgbases: List[PackageBase]):
    pkgbase = pkgbases[0]

    with db.begin():
        comment = db.create(models.PackageComment, PackageBase=pkgbase,
                            User=user2, Comments="This is a test comment.")
    rendercomment.update_comment_render_fastapi(comment)

    notif = notify.CommentNotification(user2.ID, pkgbase.ID, comment.ID)
    notif.send()
    assert Email.count() == 1

    email = Email(1).parse()
    assert email.headers.get("To") == user.Email
    expected = f"AUR Comment for {pkgbase.Name}"
    assert email.headers.get("Subject") == expected

    expected = f"""\
{user2.Username} [1] added the following comment to {pkgbase.Name} [2]:

This is a test comment.

--
If you no longer wish to receive notifications about this package,
please go to the package page [2] and select "Disable notifications".

[1] {aur_location}/account/{user2.Username}/
[2] {aur_location}/pkgbase/{pkgbase.Name}/\
"""
    assert expected == email.body


def test_update(user: User, user2: User, pkgbases: List[PackageBase]):
    pkgbase = pkgbases[0]
    with db.begin():
        user.UpdateNotify = 1

    notif = notify.UpdateNotification(user2.ID, pkgbase.ID)
    notif.send()
    assert Email.count() == 1

    email = Email(1).parse()
    assert email.headers.get("To") == user.Email
    expected = f"AUR Package Update: {pkgbase.Name}"
    assert email.headers.get("Subject") == expected

    expected = f"""\
{user2.Username} [1] pushed a new commit to {pkgbase.Name} [2].

--
If you no longer wish to receive notifications about this package,
please go to the package page [2] and select "Disable notifications".

[1] {aur_location}/account/{user2.Username}/
[2] {aur_location}/pkgbase/{pkgbase.Name}/\
"""
    assert expected == email.body


def test_adopt(user: User, user2: User, pkgbases: List[PackageBase]):
    pkgbase = pkgbases[0]
    notif = notify.AdoptNotification(user2.ID, pkgbase.ID)
    notif.send()
    assert Email.count() == 1

    email = Email(1).parse()
    assert email.headers.get("To") == user.Email
    expected = f"AUR Ownership Notification for {pkgbase.Name}"
    assert email.headers.get("Subject") == expected

    expected = f"""\
The package {pkgbase.Name} [1] was adopted by {user2.Username} [2].

[1] {aur_location}/pkgbase/{pkgbase.Name}/
[2] {aur_location}/account/{user2.Username}/\
"""
    assert email.body == expected


def test_disown(user: User, user2: User, pkgbases: List[PackageBase]):
    pkgbase = pkgbases[0]
    notif = notify.DisownNotification(user2.ID, pkgbase.ID)
    notif.send()
    assert Email.count() == 1

    email = Email(1).parse()
    assert email.headers.get("To") == user.Email
    expected = f"AUR Ownership Notification for {pkgbase.Name}"
    assert email.headers.get("Subject") == expected

    expected = f"""\
The package {pkgbase.Name} [1] was disowned by {user2.Username} [2].

[1] {aur_location}/pkgbase/{pkgbase.Name}/
[2] {aur_location}/account/{user2.Username}/\
"""
    assert email.body == expected


def test_comaintainer_addition(user: User, pkgbases: List[PackageBase]):
    pkgbase = pkgbases[0]
    notif = notify.ComaintainerAddNotification(user.ID, pkgbase.ID)
    notif.send()
    assert Email.count() == 1

    email = Email(1).parse()
    assert email.headers.get("To") == user.Email
    expected = f"AUR Co-Maintainer Notification for {pkgbase.Name}"
    assert email.headers.get("Subject") == expected

    expected = f"""\
You were added to the co-maintainer list of {pkgbase.Name} [1].

[1] {aur_location}/pkgbase/{pkgbase.Name}/\
"""
    assert email.body == expected


def test_comaintainer_removal(user: User, pkgbases: List[PackageBase]):
    pkgbase = pkgbases[0]
    notif = notify.ComaintainerRemoveNotification(user.ID, pkgbase.ID)
    notif.send()
    assert Email.count() == 1

    email = Email(1).parse()
    assert email.headers.get("To") == user.Email
    expected = f"AUR Co-Maintainer Notification for {pkgbase.Name}"
    assert email.headers.get("Subject") == expected

    expected = f"""\
You were removed from the co-maintainer list of {pkgbase.Name} [1].

[1] {aur_location}/pkgbase/{pkgbase.Name}/\
"""
    assert email.body == expected


def test_delete(user: User, user2: User, pkgbases: List[PackageBase]):
    pkgbase = pkgbases[0]
    notif = notify.DeleteNotification(user2.ID, pkgbase.ID)
    notif.send()
    assert Email.count() == 1

    email = Email(1).parse()
    assert email.headers.get("To") == user.Email
    expected = f"AUR Package deleted: {pkgbase.Name}"
    assert email.headers.get("Subject") == expected

    expected = f"""\
{user2.Username} [1] deleted {pkgbase.Name} [2].

You will no longer receive notifications about this package.

[1] {aur_location}/account/{user2.Username}/
[2] {aur_location}/pkgbase/{pkgbase.Name}/\
"""
    assert email.body == expected


def test_merge(user: User, user2: User, pkgbases: List[PackageBase]):
    source, target = pkgbases[:2]
    notif = notify.DeleteNotification(user2.ID, source.ID, target.ID)
    notif.send()
    assert Email.count() == 1

    email = Email(1).parse()
    assert email.headers.get("To") == user.Email
    expected = f"AUR Package deleted: {source.Name}"
    assert email.headers.get("Subject") == expected

    expected = f"""\
{user2.Username} [1] merged {source.Name} [2] into {target.Name} [3].

--
If you no longer wish receive notifications about the new package,
please go to [3] and click "Disable notifications".

[1] {aur_location}/account/{user2.Username}/
[2] {aur_location}/pkgbase/{source.Name}/
[3] {aur_location}/pkgbase/{target.Name}/\
"""
    assert email.body == expected


def set_tu(users: List[User]) -> User:
    with db.begin():
        for user in users:
            user.AccountTypeID = TRUSTED_USER_ID


def test_open_close_request(user: User, user2: User,
                            pkgreq: PackageRequest,
                            pkgbases: List[PackageBase]):
    set_tu([user])
    pkgbase = pkgbases[0]

    # Send an open request notification.
    notif = notify.RequestOpenNotification(
        user2.ID, pkgreq.ID, pkgreq.RequestType.Name, pkgbase.ID)
    notif.send()
    assert Email.count() == 1

    email = Email(1).parse()
    assert email.headers.get("To") == aur_request_ml
    assert email.headers.get("Cc") == ", ".join([user.Email, user2.Email])
    expected = f"[PRQ#{pkgreq.ID}] Orphan Request for {pkgbase.Name}"
    assert email.headers.get("Subject") == expected

    expected = f"""\
{user2.Username} [1] filed an orphan request for {pkgbase.Name} [2]:

This is a request test comment.

[1] {aur_location}/account/{user2.Username}/
[2] {aur_location}/pkgbase/{pkgbase.Name}/\
"""
    assert email.body == expected

    # Now send a closure notification on the pkgbase we just opened.
    notif = notify.RequestCloseNotification(user2.ID, pkgreq.ID, "rejected")
    notif.send()
    assert Email.count() == 2

    email = Email(2).parse()
    assert email.headers.get("To") == aur_request_ml
    assert email.headers.get("Cc") == ", ".join([user.Email, user2.Email])
    expected = f"[PRQ#{pkgreq.ID}] Orphan Request for {pkgbase.Name} Rejected"
    assert email.headers.get("Subject") == expected

    expected = f"""\
Request #{pkgreq.ID} has been rejected by {user2.Username} [1].

[1] {aur_location}/account/{user2.Username}/\
"""
    assert email.body == expected

    # Test auto-accept.
    notif = notify.RequestCloseNotification(0, pkgreq.ID, "accepted")
    notif.send()
    assert Email.count() == 3

    email = Email(3).parse()
    assert email.headers.get("To") == aur_request_ml
    assert email.headers.get("Cc") == ", ".join([user.Email, user2.Email])
    expected = (f"[PRQ#{pkgreq.ID}] Orphan Request for "
                f"{pkgbase.Name} Accepted")
    assert email.headers.get("Subject") == expected

    expected = (f"Request #{pkgreq.ID} has been accepted automatically "
                "by the Arch User Repository\npackage request system.")
    assert email.body == expected


def test_close_request_comaintainer_cc(user: User, user2: User,
                                       pkgreq: PackageRequest,
                                       pkgbases: List[PackageBase]):
    pkgbase = pkgbases[0]
    with db.begin():
        db.create(models.PackageComaintainer, PackageBase=pkgbase,
                  User=user2, Priority=1)

    notif = notify.RequestCloseNotification(0, pkgreq.ID, "accepted")
    notif.send()
    assert Email.count() == 1

    email = Email(1).parse()
    assert email.headers.get("To") == aur_request_ml
    assert email.headers.get("Cc") == ", ".join([user.Email, user2.Email])


def test_close_request_closure_comment(user: User, user2: User,
                                       pkgreq: PackageRequest,
                                       pkgbases: List[PackageBase]):
    pkgbase = pkgbases[0]
    with db.begin():
        pkgreq.ClosureComment = "This is a test closure comment."

    notif = notify.RequestCloseNotification(user2.ID, pkgreq.ID, "accepted")
    notif.send()
    assert Email.count() == 1

    email = Email(1).parse()
    assert email.headers.get("To") == aur_request_ml
    assert email.headers.get("Cc") == ", ".join([user.Email, user2.Email])
    expected = f"[PRQ#{pkgreq.ID}] Orphan Request for {pkgbase.Name} Accepted"
    assert email.headers.get("Subject") == expected

    expected = f"""\
Request #{pkgreq.ID} has been accepted by {user2.Username} [1]:

This is a test closure comment.

[1] {aur_location}/account/{user2.Username}/\
"""
    assert email.body == expected


def test_tu_vote_reminders(user: User):
    set_tu([user])

    vote_id = 1
    notif = notify.TUVoteReminderNotification(vote_id)
    notif.send()
    assert Email.count() == 1

    email = Email(1).parse()
    assert email.headers.get("To") == user.Email
    expected = f"TU Vote Reminder: Proposal {vote_id}"
    assert email.headers.get("Subject") == expected

    expected = f"""\
Please remember to cast your vote on proposal {vote_id} [1]. The voting period
ends in less than 48 hours.

[1] {aur_location}/tu/?id={vote_id}\
"""
    assert email.body == expected


def test_notify_main(user: User):
    """ Test TU vote reminder through aurweb.notify.main(). """
    set_tu([user])

    vote_id = 1
    args = ["aurweb-notify", "tu-vote-reminder", str(vote_id)]
    with mock.patch("sys.argv", args):
        notify.main()

    assert Email.count() == 1

    email = Email(1).parse()
    assert email.headers.get("To") == user.Email
    expected = f"TU Vote Reminder: Proposal {vote_id}"
    assert email.headers.get("Subject") == expected

    expected = f"""\
Please remember to cast your vote on proposal {vote_id} [1]. The voting period
ends in less than 48 hours.

[1] {aur_location}/tu/?id={vote_id}\
"""
    assert email.body == expected


# Save the original config.get; we're going to mock it and need
# to be able to fall back to it when we are not overriding.
config_get = config.get


def mock_smtp_config(cls):
    def _mock_smtp_config(section: str, key: str):
        if section == "notifications":
            if key == "sendmail":
                return cls()
            elif key == "smtp-use-ssl":
                return cls(0)
            elif key == "smtp-use-starttls":
                return cls(0)
            elif key == "smtp-user":
                return cls()
            elif key == "smtp-password":
                return cls()
        return cls(config_get(section, key))
    return _mock_smtp_config


def test_smtp(user: User):
    with db.begin():
        user.ResetKey = "12345678901234567890123456789012"

    SMTP = FakeSMTP()

    get = "aurweb.config.get"
    getboolean = "aurweb.config.getboolean"
    with mock.patch(get, side_effect=mock_smtp_config(str)):
        with mock.patch(getboolean, side_effect=mock_smtp_config(bool)):
            with mock.patch("smtplib.SMTP", side_effect=lambda a, b: SMTP):
                config.rehash()
                notif = notify.WelcomeNotification(user.ID)
                notif.send()
    config.rehash()
    assert len(SMTP.emails) == 1


def mock_smtp_starttls_config(cls):
    def _mock_smtp_starttls_config(section: str, key: str):
        if section == "notifications":
            if key == "sendmail":
                return cls()
            elif key == "smtp-use-ssl":
                return cls(0)
            elif key == "smtp-use-starttls":
                return cls(1)
            elif key == "smtp-user":
                return cls("test")
            elif key == "smtp-password":
                return cls("password")
        return cls(config_get(section, key))
    return _mock_smtp_starttls_config


def test_smtp_starttls(user: User):
    # This test does two things: it exercises the starttls path and the
    # path where we have a backup email.

    with db.begin():
        user.ResetKey = "12345678901234567890123456789012"
        user.BackupEmail = "backup@example.org"

    SMTP = FakeSMTP()

    get = "aurweb.config.get"
    getboolean = "aurweb.config.getboolean"
    with mock.patch(get, side_effect=mock_smtp_starttls_config(str)):
        with mock.patch(
                getboolean, side_effect=mock_smtp_starttls_config(bool)):
            with mock.patch("smtplib.SMTP", side_effect=lambda a, b: SMTP):
                notif = notify.WelcomeNotification(user.ID)
                notif.send()
    assert SMTP.starttls_enabled
    assert SMTP.user
    assert SMTP.passwd

    assert len(SMTP.emails) == 2
    to = SMTP.emails[0][1]
    assert to == [user.Email]

    to = SMTP.emails[1][1]
    assert to == [user.BackupEmail]


def mock_smtp_ssl_config(cls):
    def _mock_smtp_ssl_config(section: str, key: str):
        if section == "notifications":
            if key == "sendmail":
                return cls()
            elif key == "smtp-use-ssl":
                return cls(1)
            elif key == "smtp-use-starttls":
                return cls(0)
            elif key == "smtp-user":
                return cls("test")
            elif key == "smtp-password":
                return cls("password")
        return cls(config_get(section, key))
    return _mock_smtp_ssl_config


def test_smtp_ssl(user: User):
    with db.begin():
        user.ResetKey = "12345678901234567890123456789012"

    SMTP = FakeSMTP_SSL()

    get = "aurweb.config.get"
    getboolean = "aurweb.config.getboolean"
    with mock.patch(get, side_effect=mock_smtp_ssl_config(str)):
        with mock.patch(getboolean, side_effect=mock_smtp_ssl_config(bool)):
            with mock.patch("smtplib.SMTP_SSL", side_effect=lambda a, b: SMTP):
                notif = notify.WelcomeNotification(user.ID)
                notif.send()
    assert len(SMTP.emails) == 1
    assert SMTP.use_ssl
    assert SMTP.user
    assert SMTP.passwd


def test_notification_defaults():
    notif = notify.Notification()
    assert notif.get_refs() == tuple()
    assert notif.get_headers() == dict()
    assert notif.get_cc() == list()


def test_notification_oserror(user: User, caplog: pytest.LogCaptureFixture):
    """ Try sending a notification with a bad SMTP configuration. """
    caplog.set_level(ERROR)
    config_get = config.get

    mocked_options = {
        "sendmail": str(),
        "smtp-server": "mail.server.xyz",
        "smtp-port": "587",
        "smtp-user": "notify@server.xyz",
        "smtp-password": "notify_server_xyz",
        "sender": "notify@server.xyz",
        "reply-to": "no-reply@server.xyz"
    }

    def mock_config_get(section: str, key: str) -> str:
        if section == "notifications":
            if key in mocked_options:
                return mocked_options.get(key)
        return config_get(section, key)

    notif = notify.WelcomeNotification(user.ID)
    with mock.patch("aurweb.config.get", side_effect=mock_config_get):
        notif.send()

    expected = "Unable to emit notification due to an OSError"
    assert expected in caplog.text

test/test_official_provider.py (new file, 65 lines)
@@ -0,0 +1,65 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.official_provider import OfficialProvider


@pytest.fixture(autouse=True)
def setup(db_test):
    return


def test_official_provider_creation():
    with db.begin():
        oprovider = db.create(OfficialProvider,
                              Name="some-name",
                              Repo="some-repo",
                              Provides="some-provides")
    assert bool(oprovider.ID)
    assert oprovider.Name == "some-name"
    assert oprovider.Repo == "some-repo"
    assert oprovider.Provides == "some-provides"


def test_official_provider_cs():
    """ Test case sensitivity of the database table. """
    with db.begin():
        oprovider = db.create(OfficialProvider,
                              Name="some-name",
                              Repo="some-repo",
                              Provides="some-provides")
    assert bool(oprovider.ID)

    with db.begin():
        oprovider_cs = db.create(OfficialProvider,
                                 Name="SOME-NAME",
                                 Repo="SOME-REPO",
                                 Provides="SOME-PROVIDES")
    assert bool(oprovider_cs.ID)

    assert oprovider.ID != oprovider_cs.ID

    assert oprovider.Name == "some-name"
    assert oprovider.Repo == "some-repo"
    assert oprovider.Provides == "some-provides"

    assert oprovider_cs.Name == "SOME-NAME"
    assert oprovider_cs.Repo == "SOME-REPO"
    assert oprovider_cs.Provides == "SOME-PROVIDES"


def test_official_provider_null_name_raises_exception():
    with pytest.raises(IntegrityError):
        OfficialProvider(Repo="some-repo", Provides="some-provides")


def test_official_provider_null_repo_raises_exception():
    with pytest.raises(IntegrityError):
        OfficialProvider(Name="some-name", Provides="some-provides")


def test_official_provider_null_provides_raises_exception():
    with pytest.raises(IntegrityError):
        OfficialProvider(Name="some-name", Repo="some-repo")

test/test_package.py (new file, 68 lines)
@@ -0,0 +1,68 @@
import pytest

from sqlalchemy import and_
from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.account_type import USER_ID
from aurweb.models.package import Package
from aurweb.models.package_base import PackageBase
from aurweb.models.user import User

user = pkgbase = package = None


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd="testPassword",
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def package(user: User) -> Package:
    with db.begin():
        pkgbase = db.create(PackageBase, Name="beautiful-package",
                            Maintainer=user)
        package = db.create(Package, PackageBase=pkgbase, Name=pkgbase.Name,
                            Description="Test description.",
                            URL="https://test.package")
    yield package


def test_package(package: Package):
    assert package.Name == "beautiful-package"
    assert package.Description == "Test description."
    assert package.Version == str()  # Default version.
    assert package.URL == "https://test.package"

    # Update package Version.
    with db.begin():
        package.Version = "1.2.3"

    # Make sure it got updated in the database.
    record = db.query(Package).filter(
        and_(Package.ID == package.ID,
             Package.Version == "1.2.3")
    ).first()
    assert record is not None


def test_package_null_pkgbase_raises():
    with pytest.raises(IntegrityError):
        Package(Name="some-package", Description="Some description.",
                URL="https://some.package")


def test_package_null_name_raises(package: Package):
    pkgbase = package.PackageBase
    with pytest.raises(IntegrityError):
        Package(PackageBase=pkgbase, Description="Some description.",
                URL="https://some.package")

test/test_package_base.py (new file, 67 lines)
@@ -0,0 +1,67 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.account_type import USER_ID
from aurweb.models.package_base import PackageBase
from aurweb.models.user import User


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd="testPassword",
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def pkgbase(user: User) -> PackageBase:
    with db.begin():
        pkgbase = db.create(PackageBase, Name="beautiful-package",
                            Maintainer=user)
    yield pkgbase


def test_package_base(user: User, pkgbase: PackageBase):
    assert pkgbase in user.maintained_bases
    assert not pkgbase.OutOfDateTS
    assert pkgbase.SubmittedTS > 0
    assert pkgbase.ModifiedTS > 0

    # Set Popularity to a string, then get it by attribute to
    # exercise the string -> float conversion path.
    with db.begin():
        pkgbase.Popularity = "0.0"
    assert pkgbase.Popularity == 0.0


def test_package_base_ci(user: User, pkgbase: PackageBase):
    """ Test case insensitivity of the database table. """
    with pytest.raises(IntegrityError):
        with db.begin():
            db.create(PackageBase, Name=pkgbase.Name.upper(), Maintainer=user)
    db.rollback()


def test_package_base_relationships(user: User, pkgbase: PackageBase):
    with db.begin():
        pkgbase.Flagger = user
        pkgbase.Submitter = user
        pkgbase.Packager = user
    assert pkgbase in user.flagged_bases
    assert pkgbase in user.maintained_bases
    assert pkgbase in user.submitted_bases
    assert pkgbase in user.package_bases


def test_package_base_null_name_raises_exception():
    with pytest.raises(IntegrityError):
        PackageBase()

test/test_package_blacklist.py (new file, 23 lines)
@@ -0,0 +1,23 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.package_blacklist import PackageBlacklist


@pytest.fixture(autouse=True)
def setup(db_test):
    return


def test_package_blacklist_creation():
    with db.begin():
        package_blacklist = db.create(PackageBlacklist, Name="evil-package")
    assert bool(package_blacklist.ID)
    assert package_blacklist.Name == "evil-package"


def test_package_blacklist_null_name_raises_exception():
    with pytest.raises(IntegrityError):
        PackageBlacklist()

test/test_package_comaintainer.py (new file, 56 lines)
@@ -0,0 +1,56 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.account_type import USER_ID
from aurweb.models.package_base import PackageBase
from aurweb.models.package_comaintainer import PackageComaintainer
from aurweb.models.user import User


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd="testPassword",
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def pkgbase(user: User) -> PackageBase:
    with db.begin():
        pkgbase = db.create(PackageBase, Name="test-package", Maintainer=user)
    yield pkgbase


def test_package_comaintainer_creation(user: User, pkgbase: PackageBase):
    with db.begin():
        package_comaintainer = db.create(PackageComaintainer, User=user,
                                         PackageBase=pkgbase, Priority=5)
    assert bool(package_comaintainer)
    assert package_comaintainer.User == user
    assert package_comaintainer.PackageBase == pkgbase
    assert package_comaintainer.Priority == 5


def test_package_comaintainer_null_user_raises(pkgbase: PackageBase):
    with pytest.raises(IntegrityError):
        PackageComaintainer(PackageBase=pkgbase, Priority=1)


def test_package_comaintainer_null_pkgbase_raises(user: User):
    with pytest.raises(IntegrityError):
        PackageComaintainer(User=user, Priority=1)


def test_package_comaintainer_null_priority_raises(user: User,
                                                   pkgbase: PackageBase):
    with pytest.raises(IntegrityError):
        PackageComaintainer(User=user, PackageBase=pkgbase)

test/test_package_comment.py (new file, 66 lines)
@@ -0,0 +1,66 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.account_type import USER_ID
from aurweb.models.package_base import PackageBase
from aurweb.models.package_comment import PackageComment
from aurweb.models.user import User


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd="testPassword",
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def pkgbase(user: User) -> PackageBase:
    with db.begin():
        pkgbase = db.create(PackageBase, Name="test-package", Maintainer=user)
    yield pkgbase


def test_package_comment_creation(user: User, pkgbase: PackageBase):
    with db.begin():
        package_comment = db.create(PackageComment, PackageBase=pkgbase,
                                    User=user, Comments="Test comment.",
                                    RenderedComment="Test rendered comment.")
    assert bool(package_comment.ID)


def test_package_comment_null_pkgbase_raises(user: User):
    with pytest.raises(IntegrityError):
        PackageComment(User=user, Comments="Test comment.",
                       RenderedComment="Test rendered comment.")


def test_package_comment_null_user_raises(pkgbase: PackageBase):
    with pytest.raises(IntegrityError):
        PackageComment(PackageBase=pkgbase,
                       Comments="Test comment.",
                       RenderedComment="Test rendered comment.")


def test_package_comment_null_comments_raises(user: User,
                                              pkgbase: PackageBase):
    with pytest.raises(IntegrityError):
        PackageComment(PackageBase=pkgbase, User=user,
                       RenderedComment="Test rendered comment.")


def test_package_comment_null_renderedcomment_defaults(user: User,
                                                       pkgbase: PackageBase):
    with db.begin():
        record = db.create(PackageComment, PackageBase=pkgbase,
                           User=user, Comments="Test comment.")
    assert record.RenderedComment == str()

test/test_package_dependency.py (new file, 66 lines)
@@ -0,0 +1,66 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.account_type import USER_ID
from aurweb.models.dependency_type import DEPENDS_ID
from aurweb.models.package import Package
from aurweb.models.package_base import PackageBase
from aurweb.models.package_dependency import PackageDependency
from aurweb.models.user import User


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd=str(),
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def package(user: User) -> Package:
    with db.begin():
        pkgbase = db.create(PackageBase, Name="test-package", Maintainer=user)
        package = db.create(Package, PackageBase=pkgbase, Name=pkgbase.Name,
                            Description="Test description.",
                            URL="https://test.package")
    yield package


def test_package_dependencies(user: User, package: Package):
    with db.begin():
        pkgdep = db.create(PackageDependency, Package=package,
                           DepTypeID=DEPENDS_ID, DepName="test-dep")
    assert pkgdep.DepName == "test-dep"
    assert pkgdep.Package == package
    assert pkgdep in package.package_dependencies
    assert not pkgdep.is_package()

    with db.begin():
        base = db.create(PackageBase, Name=pkgdep.DepName, Maintainer=user)
        db.create(Package, PackageBase=base, Name=pkgdep.DepName)

    assert pkgdep.is_package()


def test_package_dependencies_null_package_raises():
    with pytest.raises(IntegrityError):
        PackageDependency(DepTypeID=DEPENDS_ID, DepName="test-dep")


def test_package_dependencies_null_dependency_type_raises(package: Package):
    with pytest.raises(IntegrityError):
        PackageDependency(Package=package, DepName="test-dep")


def test_package_dependencies_null_depname_raises(package: Package):
    with pytest.raises(IntegrityError):
        PackageDependency(DepTypeID=DEPENDS_ID, Package=package)

test/test_package_group.py (new file, 57 lines)
@@ -0,0 +1,57 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.account_type import USER_ID
from aurweb.models.group import Group
from aurweb.models.package import Package
from aurweb.models.package_base import PackageBase
from aurweb.models.package_group import PackageGroup
from aurweb.models.user import User


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd="testPassword",
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def group() -> Group:
    with db.begin():
        group = db.create(Group, Name="Test Group")
    yield group


@pytest.fixture
def package(user: User) -> Package:
    with db.begin():
        pkgbase = db.create(PackageBase, Name="test-package", Maintainer=user)
        package = db.create(Package, PackageBase=pkgbase, Name=pkgbase.Name)
    yield package


def test_package_group(package: Package, group: Group):
    with db.begin():
        package_group = db.create(PackageGroup, Package=package, Group=group)
    assert package_group.Group == group
    assert package_group.Package == package


def test_package_group_null_package_raises(group: Group):
    with pytest.raises(IntegrityError):
        PackageGroup(Group=group)


def test_package_group_null_group_raises(package: Package):
    with pytest.raises(IntegrityError):
        PackageGroup(Package=package)

test/test_package_keyword.py (new file, 44 lines)
@@ -0,0 +1,44 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.account_type import USER_ID
from aurweb.models.package_base import PackageBase
from aurweb.models.package_keyword import PackageKeyword
from aurweb.models.user import User


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd="testPassword",
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def pkgbase(user: User) -> PackageBase:
    with db.begin():
        pkgbase = db.create(PackageBase, Name="beautiful-package",
                            Maintainer=user)
    yield pkgbase


def test_package_keyword(pkgbase: PackageBase):
    with db.begin():
        pkg_keyword = db.create(PackageKeyword, PackageBase=pkgbase,
                                Keyword="test")
    assert pkg_keyword in pkgbase.keywords
    assert pkgbase == pkg_keyword.PackageBase


def test_package_keyword_null_pkgbase_raises_exception():
    with pytest.raises(IntegrityError):
        PackageKeyword(Keyword="test")

test/test_package_license.py (new file, 58 lines)
@@ -0,0 +1,58 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.account_type import USER_ID
from aurweb.models.license import License
from aurweb.models.package import Package
from aurweb.models.package_base import PackageBase
from aurweb.models.package_license import PackageLicense
from aurweb.models.user import User


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd="testPassword",
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def license() -> License:
    with db.begin():
        license = db.create(License, Name="Test License")
    yield license


@pytest.fixture
def package(user: User, license: License):
    with db.begin():
        pkgbase = db.create(PackageBase, Name="test-package", Maintainer=user)
        package = db.create(Package, PackageBase=pkgbase, Name=pkgbase.Name)
    yield package


def test_package_license(license: License, package: Package):
    with db.begin():
        package_license = db.create(PackageLicense, Package=package,
                                    License=license)
    assert package_license.License == license
    assert package_license.Package == package


def test_package_license_null_package_raises(license: License):
    with pytest.raises(IntegrityError):
        PackageLicense(License=license)


def test_package_license_null_license_raises(package: Package):
    with pytest.raises(IntegrityError):
        PackageLicense(Package=package)

test/test_package_notification.py (new file, 47 lines)
@@ -0,0 +1,47 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.package_base import PackageBase
from aurweb.models.package_notification import PackageNotification
from aurweb.models.user import User


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd="testPassword")
    yield user


@pytest.fixture
def pkgbase(user: User) -> PackageBase:
    with db.begin():
        pkgbase = db.create(PackageBase, Name="test-package", Maintainer=user)
    yield pkgbase


def test_package_notification_creation(user: User, pkgbase: PackageBase):
    with db.begin():
        package_notification = db.create(
            PackageNotification, User=user, PackageBase=pkgbase)
    assert bool(package_notification)
    assert package_notification.User == user
    assert package_notification.PackageBase == pkgbase


def test_package_notification_null_user_raises(pkgbase: PackageBase):
    with pytest.raises(IntegrityError):
        PackageNotification(PackageBase=pkgbase)


def test_package_notification_null_pkgbase_raises(user: User):
    with pytest.raises(IntegrityError):
        PackageNotification(User=user)

test/test_package_relation.py (new file, 67 lines)
@@ -0,0 +1,67 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.account_type import USER_ID
from aurweb.models.package import Package
from aurweb.models.package_base import PackageBase
from aurweb.models.package_relation import PackageRelation
from aurweb.models.relation_type import CONFLICTS_ID, PROVIDES_ID, REPLACES_ID
from aurweb.models.user import User


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd="testPassword",
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def package(user: User) -> Package:
    with db.begin():
        pkgbase = db.create(PackageBase, Name="test-package", Maintainer=user)
        package = db.create(Package, PackageBase=pkgbase, Name=pkgbase.Name,
                            Description="Test description.",
                            URL="https://test.package")
    yield package


def test_package_relation(package: Package):
    with db.begin():
        pkgrel = db.create(PackageRelation, Package=package,
                           RelTypeID=CONFLICTS_ID,
                           RelName="test-relation")

    assert pkgrel.RelName == "test-relation"
    assert pkgrel.Package == package
    assert pkgrel in package.package_relations

    with db.begin():
        pkgrel.RelTypeID = PROVIDES_ID

    with db.begin():
        pkgrel.RelTypeID = REPLACES_ID


def test_package_relation_null_package_raises():
    with pytest.raises(IntegrityError):
        PackageRelation(RelTypeID=CONFLICTS_ID, RelName="test-relation")


def test_package_relation_null_relation_type_raises(package: Package):
    with pytest.raises(IntegrityError):
        PackageRelation(Package=package, RelName="test-relation")


def test_package_relation_null_relname_raises(package: Package):
    with pytest.raises(IntegrityError):
        PackageRelation(Package=package, RelTypeID=CONFLICTS_ID)

test/test_package_request.py (new file, 142 lines)
@@ -0,0 +1,142 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db, time
from aurweb.models.account_type import USER_ID
from aurweb.models.package_base import PackageBase
from aurweb.models.package_request import (ACCEPTED, ACCEPTED_ID, CLOSED,
                                           CLOSED_ID, PENDING, PENDING_ID,
                                           REJECTED, REJECTED_ID,
                                           PackageRequest)
from aurweb.models.request_type import MERGE_ID
from aurweb.models.user import User


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd="testPassword",
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def pkgbase(user: User) -> PackageBase:
    with db.begin():
        pkgbase = db.create(PackageBase, Name="test-package", Maintainer=user)
    yield pkgbase


def test_package_request_creation(user: User, pkgbase: PackageBase):
    with db.begin():
        package_request = db.create(PackageRequest, ReqTypeID=MERGE_ID,
                                    User=user, PackageBase=pkgbase,
                                    PackageBaseName=pkgbase.Name,
                                    Comments=str(), ClosureComment=str())

    assert bool(package_request.ID)
    assert package_request.User == user
    assert package_request.PackageBase == pkgbase
    assert package_request.PackageBaseName == pkgbase.Name
    assert package_request.Comments == str()
    assert package_request.ClosureComment == str()

    # Make sure that everything is cross-referenced with relationships.
    assert package_request in user.package_requests
    assert package_request in pkgbase.requests


def test_package_request_closed(user: User, pkgbase: PackageBase):
    ts = time.utcnow()
    with db.begin():
        package_request = db.create(PackageRequest, ReqTypeID=MERGE_ID,
                                    User=user, PackageBase=pkgbase,
                                    PackageBaseName=pkgbase.Name,
                                    Closer=user, ClosedTS=ts,
                                    Comments=str(), ClosureComment=str())

    assert package_request.Closer == user
    assert package_request.ClosedTS == ts

    # Test relationships.
    assert package_request in user.closed_requests


def test_package_request_null_request_type_raises(user: User,
                                                  pkgbase: PackageBase):
    with pytest.raises(IntegrityError):
        PackageRequest(User=user, PackageBase=pkgbase,
                       PackageBaseName=pkgbase.Name,
                       Comments=str(), ClosureComment=str())


def test_package_request_null_user_raises(pkgbase: PackageBase):
    with pytest.raises(IntegrityError):
        PackageRequest(ReqTypeID=MERGE_ID,
                       PackageBase=pkgbase, PackageBaseName=pkgbase.Name,
                       Comments=str(), ClosureComment=str())


def test_package_request_null_package_base_raises(user: User,
                                                  pkgbase: PackageBase):
    with pytest.raises(IntegrityError):
        PackageRequest(ReqTypeID=MERGE_ID,
                       User=user, PackageBaseName=pkgbase.Name,
                       Comments=str(), ClosureComment=str())


def test_package_request_null_package_base_name_raises(user: User,
                                                       pkgbase: PackageBase):
    with pytest.raises(IntegrityError):
        PackageRequest(ReqTypeID=MERGE_ID,
                       User=user, PackageBase=pkgbase,
                       Comments=str(), ClosureComment=str())


def test_package_request_null_comments_raises(user: User,
                                              pkgbase: PackageBase):
    with pytest.raises(IntegrityError):
        PackageRequest(ReqTypeID=MERGE_ID, User=user,
                       PackageBase=pkgbase, PackageBaseName=pkgbase.Name,
                       ClosureComment=str())


def test_package_request_null_closure_comment_raises(user: User,
                                                     pkgbase: PackageBase):
    with pytest.raises(IntegrityError):
        PackageRequest(ReqTypeID=MERGE_ID, User=user,
                       PackageBase=pkgbase, PackageBaseName=pkgbase.Name,
                       Comments=str())


def test_package_request_status_display(user: User, pkgbase: PackageBase):
    """ Test status_display() based on the Status column value. """
    with db.begin():
        pkgreq = db.create(PackageRequest, ReqTypeID=MERGE_ID,
                           User=user, PackageBase=pkgbase,
                           PackageBaseName=pkgbase.Name,
                           Comments=str(), ClosureComment=str(),
                           Status=PENDING_ID)
    assert pkgreq.status_display() == PENDING

    with db.begin():
        pkgreq.Status = CLOSED_ID
    assert pkgreq.status_display() == CLOSED

    with db.begin():
        pkgreq.Status = ACCEPTED_ID
    assert pkgreq.status_display() == ACCEPTED

    with db.begin():
        pkgreq.Status = REJECTED_ID
    assert pkgreq.status_display() == REJECTED

    with db.begin():
        pkgreq.Status = 124
    with pytest.raises(KeyError):
        pkgreq.status_display()


test/test_package_source.py | 46 lines (new file)
@@ -0,0 +1,46 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db
from aurweb.models.account_type import USER_ID
from aurweb.models.package import Package
from aurweb.models.package_base import PackageBase
from aurweb.models.package_source import PackageSource
from aurweb.models.user import User


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd="testPassword",
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def package(user: User) -> Package:
    with db.begin():
        pkgbase = db.create(PackageBase, Name="test-package", Maintainer=user)
        package = db.create(Package, PackageBase=pkgbase, Name="test-package")
    yield package


def test_package_source(package: Package):
    with db.begin():
        pkgsource = db.create(PackageSource, Package=package)
    assert pkgsource.Package == package
    # By default, PackageSources.Source assigns the string '/dev/null'.
    assert pkgsource.Source == "/dev/null"
    assert pkgsource.SourceArch is None


def test_package_source_null_package_raises():
    with pytest.raises(IntegrityError):
        PackageSource()

test/test_package_vote.py | 57 lines (new file)
@@ -0,0 +1,57 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db, time
from aurweb.models.account_type import USER_ID
from aurweb.models.package_base import PackageBase
from aurweb.models.package_vote import PackageVote
from aurweb.models.user import User


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd=str(),
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def pkgbase(user: User) -> PackageBase:
    with db.begin():
        pkgbase = db.create(PackageBase, Name="test-package", Maintainer=user)
    yield pkgbase


def test_package_vote_creation(user: User, pkgbase: PackageBase):
    ts = time.utcnow()

    with db.begin():
        package_vote = db.create(PackageVote, User=user,
                                 PackageBase=pkgbase, VoteTS=ts)
    assert bool(package_vote)
    assert package_vote.User == user
    assert package_vote.PackageBase == pkgbase
    assert package_vote.VoteTS == ts


def test_package_vote_null_user_raises(pkgbase: PackageBase):
    with pytest.raises(IntegrityError):
        PackageVote(PackageBase=pkgbase, VoteTS=1)


def test_package_vote_null_pkgbase_raises(user: User):
    with pytest.raises(IntegrityError):
        PackageVote(User=user, VoteTS=1)


def test_package_vote_null_votets_raises(user: User, pkgbase: PackageBase):
    with pytest.raises(IntegrityError):
        PackageVote(User=user, PackageBase=pkgbase)

test/test_packages_routes.py | 1485 lines (new file)
File diff suppressed because it is too large

test/test_packages_util.py | 128 lines (new file)
@@ -0,0 +1,128 @@
import pytest

from fastapi.testclient import TestClient

from aurweb import asgi, config, db, time
from aurweb.models.account_type import USER_ID
from aurweb.models.official_provider import OFFICIAL_BASE, OfficialProvider
from aurweb.models.package import Package
from aurweb.models.package_base import PackageBase
from aurweb.models.package_notification import PackageNotification
from aurweb.models.package_source import PackageSource
from aurweb.models.package_vote import PackageVote
from aurweb.models.user import User
from aurweb.packages import util
from aurweb.redis import kill_redis


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def maintainer() -> User:
    with db.begin():
        maintainer = db.create(User, Username="test_maintainer",
                               Email="test_maintainer@example.org",
                               Passwd="testPassword",
                               AccountTypeID=USER_ID)
    yield maintainer


@pytest.fixture
def package(maintainer: User) -> Package:
    with db.begin():
        pkgbase = db.create(PackageBase, Name="test-pkg",
                            Packager=maintainer, Maintainer=maintainer)
        package = db.create(Package, Name=pkgbase.Name, PackageBase=pkgbase)
    yield package


@pytest.fixture
def client() -> TestClient:
    yield TestClient(app=asgi.app)


def test_package_link(client: TestClient, package: Package):
    expected = f"/packages/{package.Name}"
    assert util.package_link(package) == expected


def test_official_package_link(client: TestClient, package: Package):
    with db.begin():
        provider = db.create(OfficialProvider,
                             Name=package.Name,
                             Repo="core",
                             Provides=package.Name)
    expected = f"{OFFICIAL_BASE}/packages/?q={package.Name}"
    assert util.package_link(provider) == expected


def test_updated_packages(maintainer: User, package: Package):
    expected = {
        "Name": package.Name,
        "Version": package.Version,
        "PackageBase": {
            "ModifiedTS": package.PackageBase.ModifiedTS
        }
    }

    kill_redis()  # Kill it here to ensure we're on a fake instance.
    assert util.updated_packages(1, 0) == [expected]
    assert util.updated_packages(1, 600) == [expected]
    kill_redis()  # Kill it again, in case other tests use a real instance.


def test_query_voted(maintainer: User, package: Package):
    now = time.utcnow()
    with db.begin():
        db.create(PackageVote, User=maintainer, VoteTS=now,
                  PackageBase=package.PackageBase)

    query = db.query(Package).filter(Package.ID == package.ID).all()
    query_voted = util.query_voted(query, maintainer)
    assert query_voted[package.PackageBase.ID]


def test_query_notified(maintainer: User, package: Package):
    with db.begin():
        db.create(PackageNotification, User=maintainer,
                  PackageBase=package.PackageBase)

    query = db.query(Package).filter(Package.ID == package.ID).all()
    query_notified = util.query_notified(query, maintainer)
    assert query_notified[package.PackageBase.ID]


def test_source_uri_file(package: Package):
    FILE = "test_file"

    with db.begin():
        pkgsrc = db.create(PackageSource, Source=FILE,
                           Package=package, SourceArch="x86_64")
    source_file_uri = config.get("options", "source_file_uri")
    file, uri = util.source_uri(pkgsrc)
    expected = source_file_uri % (pkgsrc.Source, package.PackageBase.Name)
    assert (file, uri) == (FILE, expected)


def test_source_uri_named_uri(package: Package):
    FILE = "test"
    URL = "https://test.xyz"

    with db.begin():
        pkgsrc = db.create(PackageSource, Source=f"{FILE}::{URL}",
                           Package=package, SourceArch="x86_64")
    file, uri = util.source_uri(pkgsrc)
    assert (file, uri) == (FILE, URL)


def test_source_uri_unnamed_uri(package: Package):
    URL = "https://test.xyz"

    with db.begin():
        pkgsrc = db.create(PackageSource, Source=f"{URL}",
                           Package=package, SourceArch="x86_64")
    file, uri = util.source_uri(pkgsrc)
    assert (file, uri) == (URL, URL)

test/test_pkgbase_routes.py | 1333 lines (new file)
File diff suppressed because it is too large

test/test_pkgmaint.py | 64 lines (new file)
@@ -0,0 +1,64 @@
from typing import List

import pytest

from aurweb import db, time
from aurweb.models import Package, PackageBase, User
from aurweb.models.account_type import USER_ID
from aurweb.scripts import pkgmaint


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         Passwd="testPassword", AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def packages(user: User) -> List[Package]:
    output = []

    now = time.utcnow()
    with db.begin():
        for i in range(5):
            pkgbase = db.create(PackageBase, Name=f"pkg_{i}",
                                SubmittedTS=now,
                                ModifiedTS=now)
            pkg = db.create(Package, PackageBase=pkgbase,
                            Name=f"pkg_{i}", Version=f"{i}.0")
            output.append(pkg)
    yield output


def test_pkgmaint_noop(packages: List[Package]):
    assert len(packages) == 5
    pkgmaint.main()
    packages = db.query(Package).all()
    assert len(packages) == 5


def test_pkgmaint(packages: List[Package]):
    assert len(packages) == 5

    # Modify the first package so it's out of date and gets deleted.
    with db.begin():
        # Reduce SubmittedTS by a day + 10 seconds.
        packages[0].PackageBase.SubmittedTS -= (86400 + 10)

    # Run pkgmaint.
    pkgmaint.main()

    # Query package objects again and assert that the
    # first package was deleted but all others are intact.
    packages = db.query(Package).all()
    assert len(packages) == 4
    expected = ["pkg_1", "pkg_2", "pkg_3", "pkg_4"]
    for i, pkgname in enumerate(expected):
        assert packages[i].Name == pkgname

test/test_popupdate.py | 12 lines (new file)
@@ -0,0 +1,12 @@
import pytest

from aurweb.scripts import popupdate


@pytest.fixture(autouse=True)
def setup(db_test):
    return


def test_popupdate():
    popupdate.main()

test/test_ratelimit.py | 117 lines (new file)
@@ -0,0 +1,117 @@
from unittest import mock

import pytest

from redis.client import Pipeline

from aurweb import config, db, logging
from aurweb.models import ApiRateLimit
from aurweb.ratelimit import check_ratelimit
from aurweb.redis import redis_connection
from aurweb.testing.requests import Request

logger = logging.get_logger(__name__)


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def pipeline():
    redis = redis_connection()
    pipeline = redis.pipeline()

    pipeline.delete("ratelimit-ws:127.0.0.1")
    pipeline.delete("ratelimit:127.0.0.1")
    pipeline.execute()

    yield pipeline


config_getint = config.getint


def mock_config_getint(section: str, key: str):
    if key == "request_limit":
        return 4
    elif key == "window_length":
        return 100
    return config_getint(section, key)


config_getboolean = config.getboolean


def mock_config_getboolean(return_value: int = 0):
    def fn(section: str, key: str):
        if section == "ratelimit" and key == "cache":
            return return_value
        return config_getboolean(section, key)
    return fn


config_get = config.get


def mock_config_get(return_value: str = "none"):
    def fn(section: str, key: str):
        if section == "options" and key == "cache":
            return return_value
        return config_get(section, key)
    return fn


@mock.patch("aurweb.config.getint", side_effect=mock_config_getint)
@mock.patch("aurweb.config.getboolean", side_effect=mock_config_getboolean(1))
@mock.patch("aurweb.config.get", side_effect=mock_config_get("none"))
def test_ratelimit_redis(get: mock.MagicMock, getboolean: mock.MagicMock,
                         getint: mock.MagicMock, pipeline: Pipeline):
    """ This test will only cover aurweb.ratelimit's Redis
    path if a real Redis server is configured. Otherwise,
    it'll use the database. """

    # We'll need a Request for everything here.
    request = Request()

    # Run check_ratelimit for our request_limit. These should succeed.
    for i in range(4):
        assert not check_ratelimit(request)

    # This check_ratelimit should fail: it is the fifth request,
    # one past the mocked request_limit of 4.
    assert check_ratelimit(request)

    # Delete the Redis keys.
    host = request.client.host
    pipeline.delete(f"ratelimit-ws:{host}")
    pipeline.delete(f"ratelimit:{host}")
    one, two = pipeline.execute()
    assert one and two

    # Should be good to go again!
    assert not check_ratelimit(request)


@mock.patch("aurweb.config.getint", side_effect=mock_config_getint)
@mock.patch("aurweb.config.getboolean", side_effect=mock_config_getboolean(0))
@mock.patch("aurweb.config.get", side_effect=mock_config_get("none"))
def test_ratelimit_db(get: mock.MagicMock, getboolean: mock.MagicMock,
                      getint: mock.MagicMock, pipeline: Pipeline):

    # We'll need a Request for everything here.
    request = Request()

    # Run check_ratelimit for our request_limit. These should succeed.
    for i in range(4):
        assert not check_ratelimit(request)

    # This check_ratelimit should fail: it is the fifth request,
    # one past the mocked request_limit of 4.
    assert check_ratelimit(request)

    # Delete the ApiRateLimit record.
    with db.begin():
        db.delete(db.query(ApiRateLimit).first())

    # Should be good to go again!
    assert not check_ratelimit(request)
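
Note: the two rate-limit tests above assert the same observable behaviour against both backends under the mocked settings (request_limit=4, window_length=100): four requests pass, the fifth is refused, and clearing the stored window state resets the counter. The short sketch below is illustrative only, a self-contained fixed-window counter, not aurweb's actual check_ratelimit implementation; all names in it are hypothetical.

import time


class FixedWindowLimiter:
    """Illustrative fixed-window limiter: allow `limit` requests per window."""

    def __init__(self, limit: int = 4, window_length: int = 100):
        self.limit = limit
        self.window_length = window_length
        self.windows = {}  # host -> (window_start, count)

    def exceeded(self, host: str) -> bool:
        now = time.time()
        start, count = self.windows.get(host, (now, 0))
        if now - start >= self.window_length:
            # The window expired; start counting from scratch.
            start, count = now, 0
        count += 1
        self.windows[host] = (start, count)
        return count > self.limit


limiter = FixedWindowLimiter(limit=4, window_length=100)
assert not any(limiter.exceeded("127.0.0.1") for _ in range(4))  # first four pass
assert limiter.exceeded("127.0.0.1")  # the fifth is rejected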

test/test_redis.py | 40 lines (new file)
@@ -0,0 +1,40 @@
from unittest import mock

import pytest

import aurweb.config

from aurweb.redis import redis_connection


@pytest.fixture
def rediss():
    """ Create a RedisStub. """
    def mock_get(section, key):
        return "none"

    with mock.patch("aurweb.config.get", side_effect=mock_get):
        aurweb.config.rehash()
        redis = redis_connection()
    aurweb.config.rehash()

    yield redis


def test_redis_stub(rediss):
    # We don't yet have a test key set.
    assert rediss.get("test") is None

    # Set the test key to abc.
    rediss.set("test", "abc")
    assert rediss.get("test").decode() == "abc"

    # Test expire.
    rediss.expire("test", 0)
    assert rediss.get("test") is None

    # Now, set the test key again and use delete() on it.
    rediss.set("test", "abc")
    assert rediss.get("test").decode() == "abc"
    rediss.delete("test")
    assert rediss.get("test") is None

test/test_relation_type.py | 34 lines (new file)
@@ -0,0 +1,34 @@
import pytest

from aurweb import db
from aurweb.models.relation_type import RelationType


@pytest.fixture(autouse=True)
def setup(db_test):
    return


def test_relation_type_creation():
    with db.begin():
        relation_type = db.create(RelationType, Name="test-relation")

    assert bool(relation_type.ID)
    assert relation_type.Name == "test-relation"

    with db.begin():
        db.delete(relation_type)


def test_relation_types():
    conflicts = db.query(RelationType, RelationType.Name == "conflicts").first()
    assert conflicts is not None
    assert conflicts.Name == "conflicts"

    provides = db.query(RelationType, RelationType.Name == "provides").first()
    assert provides is not None
    assert provides.Name == "provides"

    replaces = db.query(RelationType, RelationType.Name == "replaces").first()
    assert replaces is not None
    assert replaces.Name == "replaces"

test/test_rendercomment.py | 201 lines (new file)
@@ -0,0 +1,201 @@
from unittest import mock

import pytest

from aurweb import config, db, logging, time
from aurweb.models import Package, PackageBase, PackageComment, User
from aurweb.models.account_type import USER_ID
from aurweb.scripts import rendercomment
from aurweb.scripts.rendercomment import update_comment_render
from aurweb.testing.git import GitRepository

logger = logging.get_logger(__name__)
aur_location = config.get("options", "aur_location")


@pytest.fixture(autouse=True)
def setup(db_test, git: GitRepository):
    config_get = config.get

    def mock_config_get(section: str, key: str) -> str:
        if section == "serve" and key == "repo-path":
            return git.file_lock.path
        elif section == "options" and key == "commit_uri":
            return "/cgit/aur.git/log/?h=%s&id=%s"
        return config_get(section, key)

    with mock.patch("aurweb.config.get", side_effect=mock_config_get):
        yield


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         Passwd=str(), AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def pkgbase(user: User) -> PackageBase:
    now = time.utcnow()
    with db.begin():
        pkgbase = db.create(PackageBase, Packager=user, Name="pkgbase_0",
                            SubmittedTS=now, ModifiedTS=now)
    yield pkgbase


@pytest.fixture
def package(pkgbase: PackageBase) -> Package:
    with db.begin():
        package = db.create(Package, PackageBase=pkgbase,
                            Name=pkgbase.Name, Version="1.0")
    yield package


def create_comment(user: User, pkgbase: PackageBase, comments: str,
                   render: bool = True):
    with db.begin():
        comment = db.create(PackageComment, User=user,
                            PackageBase=pkgbase, Comments=comments)
    if render:
        update_comment_render(comment)
    return comment


def test_comment_rendering(user: User, pkgbase: PackageBase):
    text = "Hello world! This is a comment."
    comment = create_comment(user, pkgbase, text)
    expected = f"<p>{text}</p>"
    assert comment.RenderedComment == expected


def test_rendercomment_main(user: User, pkgbase: PackageBase):
    text = "Hello world! This is a comment."
    comment = create_comment(user, pkgbase, text, False)

    args = ["aurweb-rendercomment", str(comment.ID)]
    with mock.patch("sys.argv", args):
        rendercomment.main()
    db.refresh(comment)

    expected = f"<p>{text}</p>"
    assert comment.RenderedComment == expected


def test_markdown_conversion(user: User, pkgbase: PackageBase):
    text = "*Hello* [world](https://aur.archlinux.org)!"
    comment = create_comment(user, pkgbase, text)
    expected = ('<p><em>Hello</em> '
                '<a href="https://aur.archlinux.org">world</a>!</p>')
    assert comment.RenderedComment == expected


def test_html_sanitization(user: User, pkgbase: PackageBase):
    text = '<script>alert("XSS!")</script>'
    comment = create_comment(user, pkgbase, text)
    expected = '&lt;script&gt;alert("XSS!")&lt;/script&gt;'
    assert comment.RenderedComment == expected


def test_link_conversion(user: User, pkgbase: PackageBase):
    text = """\
Visit https://www.archlinux.org/#_test_.
Visit *https://www.archlinux.org/*.
Visit <https://www.archlinux.org/>.
Visit `https://www.archlinux.org/`.
Visit [Arch Linux](https://www.archlinux.org/).
Visit [Arch Linux][arch].
[arch]: https://www.archlinux.org/\
"""
    comment = create_comment(user, pkgbase, text)
    expected = '''\
<p>Visit <a href="https://www.archlinux.org/#_test_">\
https://www.archlinux.org/#_test_</a>.
Visit <em><a href="https://www.archlinux.org/">https://www.archlinux.org/</a></em>.
Visit <a href="https://www.archlinux.org/">https://www.archlinux.org/</a>.
Visit <code>https://www.archlinux.org/</code>.
Visit <a href="https://www.archlinux.org/">Arch Linux</a>.
Visit <a href="https://www.archlinux.org/">Arch Linux</a>.</p>\
'''
    assert comment.RenderedComment == expected


def test_git_commit_link(git: GitRepository, user: User, package: Package):
    commit_hash = git.commit(package, "Initial commit.")
    logger.info(f"Created commit: {commit_hash}")
    logger.info(f"Short hash: {commit_hash[:7]}")

    text = f"""\
{commit_hash}
{commit_hash[:7]}
x.{commit_hash}.x
{commit_hash}x
0123456789abcdef
`{commit_hash}`
http://example.com/{commit_hash}\
"""
    comment = create_comment(user, package.PackageBase, text)

    pkgname = package.PackageBase.Name
    cgit_path = f"/cgit/aur.git/log/?h={pkgname}&"
    expected = f"""\
<p><a href="{cgit_path}id={commit_hash[:12]}">{commit_hash[:12]}</a>
<a href="{cgit_path}id={commit_hash[:7]}">{commit_hash[:7]}</a>
x.<a href="{cgit_path}id={commit_hash[:12]}">{commit_hash[:12]}</a>.x
{commit_hash}x
0123456789abcdef
<code>{commit_hash}</code>
<a href="http://example.com/{commit_hash}">\
http://example.com/{commit_hash}\
</a>\
</p>\
"""
    assert comment.RenderedComment == expected


def test_flyspray_issue_link(user: User, pkgbase: PackageBase):
    text = """\
FS#1234567.
*FS#1234*
FS#
XFS#1
`FS#1234`
https://archlinux.org/?test=FS#1234\
"""
    comment = create_comment(user, pkgbase, text)

    expected = """\
<p><a href="https://bugs.archlinux.org/task/1234567">FS#1234567</a>.
<em><a href="https://bugs.archlinux.org/task/1234">FS#1234</a></em>
FS#
XFS#1
<code>FS#1234</code>
<a href="https://archlinux.org/?test=FS#1234">\
https://archlinux.org/?test=FS#1234\
</a>\
</p>\
"""
    assert comment.RenderedComment == expected


def test_lower_headings(user: User, pkgbase: PackageBase):
    text = """\
# One
## Two
### Three
#### Four
##### Five
###### Six\
"""
    comment = create_comment(user, pkgbase, text)

    expected = """\
<h5>One</h5>
<h6>Two</h6>
<h6>Three</h6>
<h6>Four</h6>
<h6>Five</h6>
<h6>Six</h6>\
"""
    assert comment.RenderedComment == expected
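
Note: test_lower_headings above documents how the comment renderer demotes Markdown headings so user comments cannot outrank the page's own headings: level N becomes N + 4, capped at h6. The sketch below only illustrates that transformation over already-rendered HTML; it is not aurweb's actual Markdown extension, and the function name is hypothetical.

import re


def lower_headings(html: str, shift: int = 4, max_level: int = 6) -> str:
    """Demote <hN> tags by `shift` levels, capping at <h6> (illustrative only)."""
    def demote(match: re.Match) -> str:
        close, level = match.group(1), int(match.group(2))
        return f"<{close}h{min(level + shift, max_level)}>"
    return re.sub(r"<(/?)h([1-6])>", demote, html)


assert lower_headings("<h1>One</h1><h2>Two</h2>") == "<h5>One</h5><h6>Two</h6>"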

test/test_request_type.py | 42 lines (new file)
@@ -0,0 +1,42 @@
import pytest

from aurweb import db
from aurweb.models.request_type import DELETION_ID, MERGE_ID, ORPHAN_ID, RequestType


@pytest.fixture(autouse=True)
def setup(db_test):
    return


def test_request_type_creation():
    with db.begin():
        request_type = db.create(RequestType, Name="Test Request")

    assert bool(request_type.ID)
    assert request_type.Name == "Test Request"

    with db.begin():
        db.delete(request_type)


def test_request_type_null_name_returns_empty_string():
    with db.begin():
        request_type = db.create(RequestType)

    assert bool(request_type.ID)
    assert request_type.Name == str()

    with db.begin():
        db.delete(request_type)


def test_request_type_name_display():
    deletion = db.query(RequestType, RequestType.ID == DELETION_ID).first()
    assert deletion.name_display() == "Deletion"

    orphan = db.query(RequestType, RequestType.ID == ORPHAN_ID).first()
    assert orphan.name_display() == "Orphan"

    merge = db.query(RequestType, RequestType.ID == MERGE_ID).first()
    assert merge.name_display() == "Merge"

test/test_requests.py | 759 lines (new file)
@@ -0,0 +1,759 @@
import re

from http import HTTPStatus
from logging import DEBUG
from typing import List

import pytest

from fastapi import HTTPException
from fastapi.testclient import TestClient

from aurweb import asgi, config, db, defaults, time
from aurweb.models import Package, PackageBase, PackageRequest, User
from aurweb.models.account_type import TRUSTED_USER_ID, USER_ID
from aurweb.models.package_notification import PackageNotification
from aurweb.models.package_request import ACCEPTED_ID, PENDING_ID, REJECTED_ID
from aurweb.models.request_type import DELETION_ID, MERGE_ID, ORPHAN_ID
from aurweb.packages.requests import ClosureFactory
from aurweb.requests.util import get_pkgreq_by_id
from aurweb.testing.email import Email
from aurweb.testing.html import get_errors, parse_root
from aurweb.testing.requests import Request


@pytest.fixture(autouse=True)
def setup(db_test) -> None:
    """ Setup the database. """
    return


@pytest.fixture
def client() -> TestClient:
    """ Yield a TestClient. """
    yield TestClient(app=asgi.app)


def create_user(username: str, email: str) -> User:
    """
    Create a user based on `username` and `email`.

    :param username: User.Username
    :param email: User.Email
    :return: User instance
    """
    with db.begin():
        user = db.create(User, Username=username, Email=email,
                         Passwd="testPassword", AccountTypeID=USER_ID)
    return user


@pytest.fixture
def user() -> User:
    """ Yield a User instance. """
    user = create_user("test", "test@example.org")
    yield user


@pytest.fixture
def auser(user: User) -> User:
    """ Yield an authenticated User instance. """
    cookies = {"AURSID": user.login(Request(), "testPassword")}
    user.cookies = cookies
    yield user


@pytest.fixture
def user2() -> User:
    """ Yield a secondary non-maintainer User instance. """
    user = create_user("test2", "test2@example.org")
    yield user


@pytest.fixture
def auser2(user2: User) -> User:
    """ Yield an authenticated secondary non-maintainer User instance. """
    cookies = {"AURSID": user2.login(Request(), "testPassword")}
    user2.cookies = cookies
    yield user2


@pytest.fixture
def maintainer() -> User:
    """ Yield a specific User used to maintain packages. """
    with db.begin():
        maintainer = db.create(User, Username="test_maintainer",
                               Email="test_maintainer@example.org",
                               Passwd="testPassword",
                               AccountTypeID=USER_ID)
    yield maintainer


@pytest.fixture
def packages(maintainer: User) -> List[Package]:
    """ Yield 55 packages named pkg_0 .. pkg_54. """
    packages_ = []
    now = time.utcnow()
    with db.begin():
        for i in range(55):
            pkgbase = db.create(PackageBase,
                                Name=f"pkg_{i}",
                                Maintainer=maintainer,
                                Packager=maintainer,
                                Submitter=maintainer,
                                ModifiedTS=now)
            package = db.create(Package,
                                PackageBase=pkgbase,
                                Name=f"pkg_{i}")
            packages_.append(package)

    yield packages_


@pytest.fixture
def requests(user: User, packages: List[Package]) -> List[PackageRequest]:
    pkgreqs = []
    with db.begin():
        for i in range(55):
            pkgreq = db.create(PackageRequest,
                               ReqTypeID=DELETION_ID,
                               User=user,
                               PackageBase=packages[i].PackageBase,
                               PackageBaseName=packages[i].Name,
                               Comments=f"Deletion request for pkg_{i}",
                               ClosureComment=str())
            pkgreqs.append(pkgreq)
    yield pkgreqs


@pytest.fixture
def tu_user() -> User:
    """ Yield an authenticated Trusted User instance. """
    user = create_user("test_tu", "test_tu@example.org")
    with db.begin():
        user.AccountTypeID = TRUSTED_USER_ID
    cookies = {"AURSID": user.login(Request(), "testPassword")}
    user.cookies = cookies
    yield user


def create_pkgbase(user: User, name: str) -> PackageBase:
    """
    Create a package base based on `user` and `name`.

    This function also creates a matching Package record.

    :param user: User instance
    :param name: PackageBase.Name
    :return: PackageBase instance
    """
    now = time.utcnow()
    with db.begin():
        pkgbase = db.create(PackageBase, Name=name,
                            Maintainer=user, Packager=user,
                            SubmittedTS=now, ModifiedTS=now)
        db.create(Package, Name=pkgbase.Name, PackageBase=pkgbase)
    return pkgbase


@pytest.fixture
def pkgbase(user: User) -> PackageBase:
    """ Yield a package base. """
    pkgbase = create_pkgbase(user, "test-package")
    yield pkgbase


@pytest.fixture
def target(user: User) -> PackageBase:
    """ Yield a merge target (package base). """
    with db.begin():
        target = db.create(PackageBase, Name="target-package",
                           Maintainer=user, Packager=user)
    yield target


def create_request(reqtype_id: int, user: User, pkgbase: PackageBase,
                   comments: str) -> PackageRequest:
    """
    Create a package request based on `reqtype_id`, `user`,
    `pkgbase` and `comments`.

    :param reqtype_id: RequestType.ID
    :param user: User instance
    :param pkgbase: PackageBase instance
    :param comments: PackageRequest.Comments
    :return: PackageRequest instance
    """
    now = time.utcnow()
    with db.begin():
        pkgreq = db.create(PackageRequest, ReqTypeID=reqtype_id,
                           User=user, PackageBase=pkgbase,
                           PackageBaseName=pkgbase.Name,
                           RequestTS=now,
                           Comments=comments,
                           ClosureComment=str())
    return pkgreq


@pytest.fixture
def pkgreq(user: User, pkgbase: PackageBase):
    """ Yield a package request. """
    pkgreq = create_request(DELETION_ID, user, pkgbase, "Test request.")
    yield pkgreq


def create_notification(user: User, pkgbase: PackageBase):
    """ Create a notification for a `user` on `pkgbase`. """
    with db.begin():
        notif = db.create(PackageNotification, User=user, PackageBase=pkgbase)
    return notif


def test_request(client: TestClient, auser: User, pkgbase: PackageBase):
    """ Test the standard pkgbase request route GET method. """
    endpoint = f"/pkgbase/{pkgbase.Name}/request"
    with client as request:
        resp = request.get(endpoint, cookies=auser.cookies)
    assert resp.status_code == int(HTTPStatus.OK)


def test_request_post_deletion(client: TestClient, auser2: User,
                               pkgbase: PackageBase):
    """ Test the POST route for creating a deletion request works. """
    endpoint = f"/pkgbase/{pkgbase.Name}/request"
    data = {"comments": "Test request.", "type": "deletion"}
    with client as request:
        resp = request.post(endpoint, data=data, cookies=auser2.cookies)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)

    pkgreq = pkgbase.requests.first()
    assert pkgreq is not None
    assert pkgreq.ReqTypeID == DELETION_ID
    assert pkgreq.Status == PENDING_ID

    # A RequestOpenNotification should've been sent out.
    assert Email.count() == 1
    email = Email(1)
    expr = r"^\[PRQ#%d\] Deletion Request for [^ ]+$" % pkgreq.ID
    assert re.match(expr, email.headers.get("Subject"))


def test_request_post_deletion_as_maintainer(client: TestClient, auser: User,
                                             pkgbase: PackageBase):
    """ Test the POST route for creating a deletion request as maint works. """
    endpoint = f"/pkgbase/{pkgbase.Name}/request"
    data = {"comments": "Test request.", "type": "deletion"}
    with client as request:
        resp = request.post(endpoint, data=data, cookies=auser.cookies)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)

    # Check the pkgreq record got created and accepted.
    pkgreq = db.query(PackageRequest).first()
    assert pkgreq is not None
    assert pkgreq.ReqTypeID == DELETION_ID
    assert pkgreq.Status == ACCEPTED_ID

    # Should've gotten two emails.
    assert Email.count() == 2

    # A RequestOpenNotification should've been sent out.
    email = Email(1)
    expr = r"^\[PRQ#%d\] Deletion Request for [^ ]+$" % pkgreq.ID
    assert re.match(expr, email.headers.get("Subject"))

    # Check the content of the close notification.
    email = Email(2)
    expr = r"^\[PRQ#%d\] Deletion Request for [^ ]+ Accepted$" % pkgreq.ID
    assert re.match(expr, email.headers.get("Subject"))


def test_request_post_deletion_autoaccept(client: TestClient, auser: User,
                                          pkgbase: PackageBase,
                                          caplog: pytest.LogCaptureFixture):
    """ Test the request route for deletion as maintainer. """
    caplog.set_level(DEBUG)

    now = time.utcnow()
    auto_delete_age = config.getint("options", "auto_delete_age")
    with db.begin():
        pkgbase.ModifiedTS = now - auto_delete_age + 100

    endpoint = f"/pkgbase/{pkgbase.Name}/request"
    data = {"comments": "Test request.", "type": "deletion"}
    with client as request:
        resp = request.post(endpoint, data=data, cookies=auser.cookies)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)

    pkgreq = db.query(PackageRequest).filter(
        PackageRequest.PackageBaseName == pkgbase.Name
    ).first()
    assert pkgreq is not None
    assert pkgreq.ReqTypeID == DELETION_ID
    assert pkgreq.Status == ACCEPTED_ID

    # A RequestOpenNotification should've been sent out.
    assert Email.count() == 2
    Email.dump()

    # Check the content of the open notification.
    email = Email(1)
    expr = r"^\[PRQ#%d\] Deletion Request for [^ ]+$" % pkgreq.ID
    assert re.match(expr, email.headers.get("Subject"))

    # Check the content of the close notification.
    email = Email(2)
    expr = r"^\[PRQ#%d\] Deletion Request for [^ ]+ Accepted$" % pkgreq.ID
    assert re.match(expr, email.headers.get("Subject"))

    # Check logs.
    expr = r"New request #\d+ is marked for auto-deletion."
    assert re.search(expr, caplog.text)


def test_request_post_merge(client: TestClient, auser: User,
                            pkgbase: PackageBase, target: PackageBase):
    """ Test the request route for merge as maintainer. """
    endpoint = f"/pkgbase/{pkgbase.Name}/request"
    data = {
        "type": "merge",
        "merge_into": target.Name,
        "comments": "Test request.",
    }
    with client as request:
        resp = request.post(endpoint, data=data, cookies=auser.cookies)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)

    pkgreq = pkgbase.requests.first()
    assert pkgreq is not None
    assert pkgreq.ReqTypeID == MERGE_ID
    assert pkgreq.Status == PENDING_ID
    assert pkgreq.MergeBaseName == target.Name

    # A RequestOpenNotification should've been sent out.
    assert Email.count() == 1
    email = Email(1)
    expr = r"^\[PRQ#%d\] Merge Request for [^ ]+$" % pkgreq.ID
    assert re.match(expr, email.headers.get("Subject"))


def test_request_post_orphan(client: TestClient, auser: User,
                             pkgbase: PackageBase):
    """ Test the POST route for creating an orphan request works. """
    endpoint = f"/pkgbase/{pkgbase.Name}/request"
    data = {
        "type": "orphan",
        "comments": "Test request.",
    }
    with client as request:
        resp = request.post(endpoint, data=data, cookies=auser.cookies)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)

    pkgreq = pkgbase.requests.first()
    assert pkgreq is not None
    assert pkgreq.ReqTypeID == ORPHAN_ID
    assert pkgreq.Status == PENDING_ID

    # A RequestOpenNotification should've been sent out.
    assert Email.count() == 1

    email = Email(1)
    expr = r"^\[PRQ#%d\] Orphan Request for [^ ]+$" % pkgreq.ID
    assert re.match(expr, email.headers.get("Subject"))


def test_deletion_request(client: TestClient, user: User, tu_user: User,
                          pkgbase: PackageBase, pkgreq: PackageRequest):
    """ Test deleting a package with a preexisting request. """
    # `pkgreq`.ReqTypeID is already DELETION_ID.
    create_request(DELETION_ID, user, pkgbase, "Other request.")

    # Create a notification record for another user. They should then
    # also receive a DeleteNotification.
    user2 = create_user("test2", "test2@example.org")
    create_notification(user2, pkgbase)

    endpoint = f"/pkgbase/{pkgbase.Name}/delete"
    comments = "Test closure."
    data = {"comments": comments, "confirm": True}
    with client as request:
        resp = request.post(endpoint, data=data, cookies=tu_user.cookies)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)
    assert resp.headers.get("location") == "/packages"

    # Ensure that `pkgreq`.ClosureComment was left alone when specified.
    assert pkgreq.ClosureComment == comments

    # We should've gotten three emails. Two accepted requests and
    # a DeleteNotification.
    assert Email.count() == 3

    # Both requests should have gotten accepted and had a notification
    # sent out for them.
    for i in range(Email.count() - 1):
        email = Email(i + 1).parse()
        expr = r"^\[PRQ#\d+\] Deletion Request for [^ ]+ Accepted$"
        assert re.match(expr, email.headers.get("Subject"))

    # We should've also had a DeleteNotification sent out.
    email = Email(3).parse()
    subject = r"^AUR Package deleted: [^ ]+$"
    assert re.match(subject, email.headers.get("Subject"))
    body = r"%s [1] deleted %s [2]." % (tu_user.Username, pkgbase.Name)
    assert body in email.body


def test_deletion_autorequest(client: TestClient, tu_user: User,
                              pkgbase: PackageBase):
    """ Test deleting a package without a request. """
    endpoint = f"/pkgbase/{pkgbase.Name}/delete"
    data = {"confirm": True}
    with client as request:
        resp = request.post(endpoint, data=data, cookies=tu_user.cookies)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)

    assert resp.headers.get("location") == "/packages"
    assert Email.count() == 1

    email = Email(1).parse()
    subject = r"^\[PRQ#\d+\] Deletion Request for [^ ]+ Accepted$"
    assert re.match(subject, email.headers.get("Subject"))
    assert "[Autogenerated]" in email.body


def test_merge_request(client: TestClient, user: User, tu_user: User,
                       pkgbase: PackageBase, target: PackageBase,
                       pkgreq: PackageRequest):
    """ Test merging a package with a pre-existing request. """
    with db.begin():
        pkgreq.ReqTypeID = MERGE_ID
        pkgreq.MergeBaseName = target.Name

    other_target = create_pkgbase(user, "other-target")
    other_request = create_request(MERGE_ID, user, pkgbase, "Other request.")
    other_target2 = create_pkgbase(user, "other-target2")
    other_request2 = create_request(MERGE_ID, user, pkgbase, "Other request2.")
    with db.begin():
        other_request.MergeBaseName = other_target.Name
        other_request2.MergeBaseName = other_target2.Name

    endpoint = f"/pkgbase/{pkgbase.Name}/merge"
    comments = "Test merge closure."
    data = {"into": target.Name, "comments": comments, "confirm": True}
    with client as request:
        resp = request.post(endpoint, data=data, cookies=tu_user.cookies)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)
    assert resp.headers.get("location") == f"/pkgbase/{target.Name}"

    # Ensure that `pkgreq`.ClosureComment was left alone when specified.
    assert pkgreq.ClosureComment == comments

    # We should've gotten 3 emails: an accepting and two rejections.
    assert Email.count() == 3

    # Assert specific IDs match up in the subjects.
    accepted = Email(1).parse()
    subj = r"^\[PRQ#%d\] Merge Request for [^ ]+ Accepted$" % pkgreq.ID
    assert re.match(subj, accepted.headers.get("Subject"))

    # In the accepted case, we already supplied a closure comment,
    # which stops one from being autogenerated by the algorithm.
    assert "[Autogenerated]" not in accepted.body

    # Test rejection emails, which do have autogenerated closures.
    rejected = Email(2).parse()
    subj = r"^\[PRQ#%d\] Merge Request for [^ ]+ Rejected$" % other_request.ID
    assert re.match(subj, rejected.headers.get("Subject"))
    assert "[Autogenerated]" in rejected.body

    rejected = Email(3).parse()
    subj = r"^\[PRQ#%d\] Merge Request for [^ ]+ Rejected$" % other_request2.ID
    assert re.match(subj, rejected.headers.get("Subject"))
    assert "[Autogenerated]" in rejected.body


def test_merge_autorequest(client: TestClient, user: User, tu_user: User,
                           pkgbase: PackageBase, target: PackageBase,
                           pkgreq: PackageRequest):
    """ Test merging a package without a request. """
    with db.begin():
        pkgreq.ReqTypeID = MERGE_ID
        pkgreq.MergeBaseName = target.Name

    endpoint = f"/pkgbase/{pkgbase.Name}/merge"
    data = {"into": target.Name, "confirm": True}
    with client as request:
        resp = request.post(endpoint, data=data, cookies=tu_user.cookies)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)
    assert resp.headers.get("location") == f"/pkgbase/{target.Name}"

    # Should've gotten one email; an [Autogenerated] one.
    assert Email.count() == 1

    # Test accepted merge request notification.
    email = Email(1).parse()
    subj = r"^\[PRQ#\d+\] Merge Request for [^ ]+ Accepted$"
    assert re.match(subj, email.headers.get("Subject"))
    assert "[Autogenerated]" in email.body


def test_orphan_request(client: TestClient, user: User, tu_user: User,
                        pkgbase: PackageBase, pkgreq: PackageRequest):
    """ Test the standard orphan request route. """
    idle_time = config.getint("options", "request_idle_time")
    now = time.utcnow()
    with db.begin():
        pkgreq.ReqTypeID = ORPHAN_ID
        # Set the request time so it's seen as due (idle_time has passed).
        pkgreq.RequestTS = now - idle_time - 10

    endpoint = f"/pkgbase/{pkgbase.Name}/disown"
    comments = "Test orphan closure."
    data = {"comments": comments, "confirm": True}
    with client as request:
        resp = request.post(endpoint, data=data, cookies=tu_user.cookies)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)
    assert resp.headers.get("location") == f"/pkgbase/{pkgbase.Name}"

    # Ensure that `pkgreq`.ClosureComment was left alone when specified.
    assert pkgreq.ClosureComment == comments

    # Check the email we expect.
    assert Email.count() == 1
    email = Email(1).parse()
    subj = r"^\[PRQ#%d\] Orphan Request for [^ ]+ Accepted$" % pkgreq.ID
    assert re.match(subj, email.headers.get("Subject"))


def test_request_post_orphan_autogenerated_closure(client: TestClient,
                                                   tu_user: User,
                                                   pkgbase: PackageBase,
                                                   pkgreq: PackageRequest):
    idle_time = config.getint("options", "request_idle_time")
    now = time.utcnow()
    with db.begin():
        pkgreq.ReqTypeID = ORPHAN_ID
        # Set the request time so it's seen as due (idle_time has passed).
        pkgreq.RequestTS = now - idle_time - 10

    endpoint = f"/pkgbase/{pkgbase.Name}/disown"
    data = {"confirm": True}
    with client as request:
        resp = request.post(endpoint, data=data, cookies=tu_user.cookies)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)
    assert resp.headers.get("location") == f"/pkgbase/{pkgbase.Name}"

    assert Email.count() == 1
    email = Email(1)
    expr = r"^\[PRQ#\d+\] Orphan Request for .+ Accepted$"
    assert re.match(expr, email.headers.get("Subject"))

    expr = r"\[Autogenerated\] Accepted orphan for .+\."
    assert re.search(expr, email.body)


def test_request_post_orphan_autoaccept(client: TestClient, auser: User,
                                        pkgbase: PackageBase,
                                        caplog: pytest.LogCaptureFixture):
    """ Test orphan requests are auto-accepted for long out-of-date packages. """
    caplog.set_level(DEBUG)
    now = time.utcnow()
    auto_orphan_age = config.getint("options", "auto_orphan_age")
    with db.begin():
        pkgbase.OutOfDateTS = now - auto_orphan_age - 100

    endpoint = f"/pkgbase/{pkgbase.Name}/request"
    data = {
        "type": "orphan",
        "comments": "Test request.",
    }
    with client as request:
        resp = request.post(endpoint, data=data, cookies=auser.cookies)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)

    pkgreq = pkgbase.requests.first()
    assert pkgreq is not None
    assert pkgreq.ReqTypeID == ORPHAN_ID

    # A Request(Open|Close)Notification should've been sent out.
    assert Email.count() == 2

    # Check the first email; should be our open request.
    email = Email(1)
    expr = r"^\[PRQ#%d\] Orphan Request for [^ ]+$" % pkgreq.ID
    assert re.match(expr, email.headers.get("Subject"))

    # And the second should be the automated closure.
    email = Email(2)
    expr = r"^\[PRQ#%d\] Orphan Request for [^ ]+ Accepted$" % pkgreq.ID
    assert re.match(expr, email.headers.get("Subject"))

    # Check logs.
    expr = r"New request #\d+ is marked for auto-orphan."
    assert re.search(expr, caplog.text)


def test_orphan_as_maintainer(client: TestClient, auser: User,
                              pkgbase: PackageBase):
    endpoint = f"/pkgbase/{pkgbase.Name}/disown"
    data = {"confirm": True}
    with client as request:
        resp = request.post(endpoint, data=data, cookies=auser.cookies)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)
    assert resp.headers.get("location") == f"/pkgbase/{pkgbase.Name}"

    # As the pkgbase maintainer, disowning the package just ends up
    # either promoting the lowest priority comaintainer or removing
    # the associated maintainer relationship altogether.
    assert pkgbase.Maintainer is None


def test_orphan_without_requests(client: TestClient, tu_user: User,
                                 pkgbase: PackageBase):
    """ Test that disowning with no due orphan requests is rejected. """
    endpoint = f"/pkgbase/{pkgbase.Name}/disown"
    data = {"confirm": True}
    with client as request:
        resp = request.post(endpoint, data=data, cookies=tu_user.cookies)
    assert resp.status_code == int(HTTPStatus.BAD_REQUEST)

    errors = get_errors(resp.text)
    expected = r"^No due existing orphan requests to accept for .+\.$"
    assert re.match(expected, errors[0].text.strip())

    assert Email.count() == 0


def test_closure_factory_invalid_reqtype_id():
    """ Test providing an invalid reqtype_id raises NotImplementedError. """
    automated = ClosureFactory()
    match = r"^Unsupported '.+' value\.$"
    with pytest.raises(NotImplementedError, match=match):
        automated.get_closure(666, None, None, None, ACCEPTED_ID)
    with pytest.raises(NotImplementedError, match=match):
        automated.get_closure(666, None, None, None, REJECTED_ID)


def test_pkgreq_by_id_not_found():
    with pytest.raises(HTTPException):
        get_pkgreq_by_id(0)


def test_requests_unauthorized(client: TestClient):
    with client as request:
        resp = request.get("/requests", allow_redirects=False)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)


def test_requests(client: TestClient,
                  tu_user: User,
                  packages: List[Package],
                  requests: List[PackageRequest]):
    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        resp = request.get("/requests", params={
            # Pass in url query parameters O, SeB and SB to exercise
            # their paths inside of the pager_nav used in this request.
            "O": 0,  # Page 1
            "SeB": "nd",
            "SB": "n"
        }, cookies=cookies)
    assert resp.status_code == int(HTTPStatus.OK)

    assert "Next ›" in resp.text
    assert "Last »" in resp.text

    root = parse_root(resp.text)
    # We have 55 requests, our defaults.PP is 50, so expect we have 50 rows.
    rows = root.xpath('//table[@class="results"]/tbody/tr')
    assert len(rows) == defaults.PP

    # Request page 2 of the requests page.
    with client as request:
        resp = request.get("/requests", params={
            "O": 50  # Page 2
        }, cookies=cookies)
    assert resp.status_code == int(HTTPStatus.OK)

    assert "‹ Previous" in resp.text
    assert "« First" in resp.text

    root = parse_root(resp.text)
    rows = root.xpath('//table[@class="results"]/tbody/tr')
    assert len(rows) == 5  # There are five records left on the second page.


def test_requests_selfmade(client: TestClient, user: User,
                           requests: List[PackageRequest]):
    cookies = {"AURSID": user.login(Request(), "testPassword")}
    with client as request:
        resp = request.get("/requests", cookies=cookies)
    assert resp.status_code == int(HTTPStatus.OK)

    # As the user who creates all of the requests, we should see all of them.
    # However, we are not allowed to accept any of them ourselves.
    root = parse_root(resp.text)
    rows = root.xpath('//table[@class="results"]/tbody/tr')
    assert len(rows) == defaults.PP

    # Our first and only link in the last row should be "Close".
    for row in rows:
        last_row = row.xpath('./td')[-1].xpath('./a')[0]
        assert last_row.text.strip() == "Close"


def test_requests_close(client: TestClient, user: User,
                        pkgreq: PackageRequest):
    cookies = {"AURSID": user.login(Request(), "testPassword")}
    with client as request:
        resp = request.get(f"/requests/{pkgreq.ID}/close", cookies=cookies,
                           allow_redirects=False)
    assert resp.status_code == int(HTTPStatus.OK)


def test_requests_close_unauthorized(client: TestClient, maintainer: User,
                                     pkgreq: PackageRequest):
    cookies = {"AURSID": maintainer.login(Request(), "testPassword")}
    with client as request:
        resp = request.get(f"/requests/{pkgreq.ID}/close", cookies=cookies,
                           allow_redirects=False)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)
    assert resp.headers.get("location") == "/"


def test_requests_close_post_unauthorized(client: TestClient, maintainer: User,
                                          pkgreq: PackageRequest):
    cookies = {"AURSID": maintainer.login(Request(), "testPassword")}
    with client as request:
        resp = request.post(f"/requests/{pkgreq.ID}/close", data={
            "reason": ACCEPTED_ID
        }, cookies=cookies, allow_redirects=False)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)
    assert resp.headers.get("location") == "/"


def test_requests_close_post(client: TestClient, user: User,
                             pkgreq: PackageRequest):
    cookies = {"AURSID": user.login(Request(), "testPassword")}
    with client as request:
        resp = request.post(f"/requests/{pkgreq.ID}/close",
                            cookies=cookies, allow_redirects=False)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)

    assert pkgreq.Status == REJECTED_ID
    assert pkgreq.Closer == user
    assert pkgreq.ClosureComment == str()


def test_requests_close_post_rejected(client: TestClient, user: User,
                                      pkgreq: PackageRequest):
    cookies = {"AURSID": user.login(Request(), "testPassword")}
    with client as request:
        resp = request.post(f"/requests/{pkgreq.ID}/close",
                            cookies=cookies, allow_redirects=False)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)

    assert pkgreq.Status == REJECTED_ID
    assert pkgreq.Closer == user
    assert pkgreq.ClosureComment == str()
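
Note: the comment in test_orphan_as_maintainer above summarizes the disown behaviour exercised here: when a maintainer disowns their own package, the lowest-priority comaintainer is promoted if one exists, otherwise the package base is left with no maintainer. The sketch below is a rough illustration of that decision only, with hypothetical names; it is not aurweb's actual disown code.

from typing import List, Optional


def new_maintainer_after_disown(comaintainers: List[str]) -> Optional[str]:
    """Illustrative only: pick the successor after a maintainer self-disown.

    `comaintainers` is assumed to be ordered by priority; an empty list
    means the package base becomes orphaned (no maintainer).
    """
    if comaintainers:
        return comaintainers[0]
    return None


assert new_maintainer_after_disown([]) is None              # orphaned
assert new_maintainer_after_disown(["comaint1"]) == "comaint1"  # promoted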

test/test_routes.py | 162 lines (new file)
@@ -0,0 +1,162 @@
import re
import urllib.parse

from http import HTTPStatus

import lxml.etree
import pytest

from fastapi.testclient import TestClient

from aurweb import db
from aurweb.asgi import app
from aurweb.models.account_type import USER_ID
from aurweb.models.user import User
from aurweb.testing.requests import Request


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def client() -> TestClient:
    yield TestClient(app=app)


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd="testPassword",
                         AccountTypeID=USER_ID)
    yield user


def test_index(client: TestClient):
    """ Test the index route at '/'. """
    with client as req:
        response = req.get("/")
    assert response.status_code == int(HTTPStatus.OK)


def test_index_security_headers(client: TestClient):
    """ Check for the existence of CSP, XCTO, XFO and RP security headers.

    CSP: Content-Security-Policy
    XCTO: X-Content-Type-Options
    RP: Referrer-Policy
    XFO: X-Frame-Options
    """
    # Use `with` to trigger FastAPI app events.
    with client as req:
        response = req.get("/")
    assert response.status_code == int(HTTPStatus.OK)
    assert response.headers.get("Content-Security-Policy") is not None
    assert response.headers.get("X-Content-Type-Options") == "nosniff"
    assert response.headers.get("Referrer-Policy") == "same-origin"
    assert response.headers.get("X-Frame-Options") == "SAMEORIGIN"


def test_favicon(client: TestClient):
    """ Test the favicon route at '/favicon.ico'. """
    with client as request:
        response1 = request.get("/static/images/favicon.ico")
        response2 = request.get("/favicon.ico")
    assert response1.status_code == int(HTTPStatus.OK)
    assert response1.content == response2.content


def test_language(client: TestClient):
    """ Test the language post route as a guest user. """
    post_data = {
        "set_lang": "de",
        "next": "/"
    }
    with client as req:
        response = req.post("/language", data=post_data)
    assert response.status_code == int(HTTPStatus.SEE_OTHER)


def test_language_invalid_next(client: TestClient):
    """ Test an invalid next route at '/language'. """
    post_data = {
        "set_lang": "de",
        "next": "https://evil.net"
    }
    with client as req:
        response = req.post("/language", data=post_data)
    assert response.status_code == int(HTTPStatus.BAD_REQUEST)


def test_user_language(client: TestClient, user: User):
    """ Test the language post route as an authenticated user. """
    post_data = {
        "set_lang": "de",
        "next": "/"
    }

    sid = user.login(Request(), "testPassword")
    assert sid is not None

    with client as req:
        response = req.post("/language", data=post_data,
                            cookies={"AURSID": sid})
    assert response.status_code == int(HTTPStatus.SEE_OTHER)
    assert user.LangPreference == "de"


def test_language_query_params(client: TestClient):
    """ Test the language post route with query params. """
    next = urllib.parse.quote_plus("/")
    post_data = {
        "set_lang": "de",
        "next": "/",
        "q": f"next={next}"
    }
    q = post_data.get("q")
    with client as req:
        response = req.post("/language", data=post_data)
    assert response.headers.get("location") == f"/?{q}"
    assert response.status_code == int(HTTPStatus.SEE_OTHER)


def test_error_messages(client: TestClient):
    with client as request:
        response1 = request.get("/thisroutedoesnotexist")
        response2 = request.get("/raisefivethree")
    assert response1.status_code == int(HTTPStatus.NOT_FOUND)
    assert response2.status_code == int(HTTPStatus.SERVICE_UNAVAILABLE)


def test_nonce_csp(client: TestClient):
    with client as request:
        response = request.get("/")
    data = response.headers.get("Content-Security-Policy")
    nonce = next(field for field in data.split("; ") if "nonce" in field)
    match = re.match(r"^script-src .*'nonce-([a-fA-F0-9]{8})' .*$", nonce)
    nonce = match.group(1)
    assert nonce is not None and len(nonce) == 8

    parser = lxml.etree.HTMLParser(recover=True)
    root = lxml.etree.fromstring(response.text, parser=parser)

    nonce_verified = False
    scripts = root.xpath("//script")
    for script in scripts:
        if script.text is not None:
            assert "nonce" in script.keys()
            if not (nonce_verified := (script.get("nonce") == nonce)):
                break
    assert nonce_verified is True


def test_id_redirect(client: TestClient):
    with client as request:
        response = request.get("/", params={
            "id": "test",  # This param will be rewritten into Location.
            "key": "value",  # Test that this param persists.
            "key2": "value2"  # And this one.
        }, allow_redirects=False)
    assert response.headers.get("location") == "/test?key=value&key2=value2"
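A pattern worth noting across the route tests above and the request tests earlier: a fixture user logs in through `User.login(Request(), "testPassword")`, and the returned session ID is passed back as the `AURSID` cookie on subsequent requests. A condensed sketch of that flow, using only names that already appear in these tests:

    # Sketch of the login-then-request pattern used throughout these suites.
    sid = user.login(Request(), "testPassword")   # returns the session ID string
    cookies = {"AURSID": sid}                     # aurweb's session cookie
    with client as request:
        resp = request.post("/language", data={"set_lang": "de", "next": "/"},
                            cookies=cookies, allow_redirects=False)
    assert resp.status_code == int(HTTPStatus.SEE_OTHER)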
test/test_rpc.py (new file, 784 lines)
@@ -0,0 +1,784 @@
import re

from http import HTTPStatus
from typing import List
from unittest import mock

import orjson
import pytest

from fastapi.testclient import TestClient
from redis.client import Pipeline

import aurweb.models.dependency_type as dt
import aurweb.models.relation_type as rt

from aurweb import asgi, config, db, rpc, scripts, time
from aurweb.models.account_type import USER_ID
from aurweb.models.license import License
from aurweb.models.package import Package
from aurweb.models.package_base import PackageBase
from aurweb.models.package_dependency import PackageDependency
from aurweb.models.package_keyword import PackageKeyword
from aurweb.models.package_license import PackageLicense
from aurweb.models.package_relation import PackageRelation
from aurweb.models.package_vote import PackageVote
from aurweb.models.user import User
from aurweb.redis import redis_connection


@pytest.fixture
def client() -> TestClient:
    yield TestClient(app=asgi.app)


@pytest.fixture
def user(db_test) -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User 1", Passwd=str(),
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def user2() -> User:
    with db.begin():
        user = db.create(User, Username="user2", Email="user2@example.org",
                         RealName="Test User 2", Passwd=str(),
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def user3() -> User:
    with db.begin():
        user = db.create(User, Username="user3", Email="user3@example.org",
                         RealName="Test User 3", Passwd=str(),
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def packages(user: User, user2: User, user3: User) -> List[Package]:
    output = []

    # Create package records used in our tests.
    with db.begin():
        pkgbase = db.create(PackageBase, Name="big-chungus",
                            Maintainer=user, Packager=user)
        pkg = db.create(Package, PackageBase=pkgbase, Name=pkgbase.Name,
                        Description="Bunny bunny around bunny",
                        URL="https://example.com/")
        output.append(pkg)

        pkgbase = db.create(PackageBase, Name="chungy-chungus",
                            Maintainer=user, Packager=user)
        pkg = db.create(Package, PackageBase=pkgbase, Name=pkgbase.Name,
                        Description="Wubby wubby on wobba wuubu",
                        URL="https://example.com/")
        output.append(pkg)

        pkgbase = db.create(PackageBase, Name="gluggly-chungus",
                            Maintainer=user, Packager=user)
        pkg = db.create(Package, PackageBase=pkgbase, Name=pkgbase.Name,
                        Description="glurrba glurrba gur globba",
                        URL="https://example.com/")
        output.append(pkg)

        pkgbase = db.create(PackageBase, Name="fugly-chungus",
                            Maintainer=user, Packager=user)

        desc = "A Package belonging to a PackageBase with another name."
        pkg = db.create(Package, PackageBase=pkgbase, Name="other-pkg",
                        Description=desc, URL="https://example.com")
        output.append(pkg)

        pkgbase = db.create(PackageBase, Name="woogly-chungus")
        pkg = db.create(Package, PackageBase=pkgbase, Name=pkgbase.Name,
                        Description="wuggla woblabeloop shemashmoop",
                        URL="https://example.com/")
        output.append(pkg)

    # Setup a few more related records on the first package:
    # a license, some keywords and some votes.
    with db.begin():
        lic = db.create(License, Name="GPL")
        db.create(PackageLicense, Package=output[0], License=lic)

        for keyword in ["big-chungus", "smol-chungus", "sizeable-chungus"]:
            db.create(PackageKeyword,
                      PackageBase=output[0].PackageBase,
                      Keyword=keyword)

        now = time.utcnow()
        for user_ in [user, user2, user3]:
            db.create(PackageVote, User=user_,
                      PackageBase=output[0].PackageBase, VoteTS=now)
    scripts.popupdate.run_single(output[0].PackageBase)

    yield output


@pytest.fixture
def depends(packages: List[Package]) -> List[PackageDependency]:
    output = []

    with db.begin():
        dep = db.create(PackageDependency,
                        Package=packages[0],
                        DepTypeID=dt.DEPENDS_ID,
                        DepName="chungus-depends")
        output.append(dep)

        dep = db.create(PackageDependency,
                        Package=packages[1],
                        DepTypeID=dt.DEPENDS_ID,
                        DepName="chungy-depends")
        output.append(dep)

        dep = db.create(PackageDependency,
                        Package=packages[0],
                        DepTypeID=dt.OPTDEPENDS_ID,
                        DepName="chungus-optdepends",
                        DepCondition="=50")
        output.append(dep)

        dep = db.create(PackageDependency,
                        Package=packages[0],
                        DepTypeID=dt.MAKEDEPENDS_ID,
                        DepName="chungus-makedepends")
        output.append(dep)

        dep = db.create(PackageDependency,
                        Package=packages[0],
                        DepTypeID=dt.CHECKDEPENDS_ID,
                        DepName="chungus-checkdepends")
        output.append(dep)

    yield output


@pytest.fixture
def relations(user: User, packages: List[Package]) -> List[PackageRelation]:
    output = []

    with db.begin():
        rel = db.create(PackageRelation,
                        Package=packages[0],
                        RelTypeID=rt.CONFLICTS_ID,
                        RelName="chungus-conflicts")
        output.append(rel)

        rel = db.create(PackageRelation,
                        Package=packages[1],
                        RelTypeID=rt.CONFLICTS_ID,
                        RelName="chungy-conflicts")
        output.append(rel)

        rel = db.create(PackageRelation,
                        Package=packages[0],
                        RelTypeID=rt.PROVIDES_ID,
                        RelName="chungus-provides",
                        RelCondition="<=200")
        output.append(rel)

        rel = db.create(PackageRelation,
                        Package=packages[0],
                        RelTypeID=rt.REPLACES_ID,
                        RelName="chungus-replaces",
                        RelCondition="<=200")
        output.append(rel)

    # Finally, yield the relations.
    yield output


@pytest.fixture(autouse=True)
def setup(db_test):
    # Create some extra package relationships.
    pass


@pytest.fixture
def pipeline():
    redis = redis_connection()
    pipeline = redis.pipeline()

    # The 'testclient' host is used when requesting the app
    # via fastapi.testclient.TestClient.
    pipeline.delete("ratelimit-ws:testclient")
    pipeline.delete("ratelimit:testclient")
    pipeline.execute()

    yield pipeline


def test_rpc_documentation(client: TestClient):
    with client as request:
        resp = request.get("/rpc")
    assert resp.status_code == int(HTTPStatus.OK)
    assert "aurweb RPC Interface" in resp.text


def test_rpc_documentation_missing():
    config_get = config.get

    def mock_get(section: str, key: str) -> str:
        if section == "options" and key == "aurwebdir":
            return "/missing"
        return config_get(section, key)

    with mock.patch("aurweb.config.get", side_effect=mock_get):
        config.rehash()
        expr = r"^doc/rpc\.html could not be read$"
        with pytest.raises(OSError, match=expr):
            rpc.documentation()
    config.rehash()


def test_rpc_singular_info(client: TestClient,
                           user: User,
                           packages: List[Package],
                           depends: List[PackageDependency],
                           relations: List[PackageRelation]):
    # Define expected response.
    pkg = packages[0]
    expected_data = {
        "version": 5,
        "results": [{
            "Name": pkg.Name,
            "Version": pkg.Version,
            "Description": pkg.Description,
            "URL": pkg.URL,
            "PackageBase": pkg.PackageBase.Name,
            "NumVotes": pkg.PackageBase.NumVotes,
            "Popularity": float(pkg.PackageBase.Popularity),
            "OutOfDate": None,
            "Maintainer": user.Username,
            "URLPath": f"/cgit/aur.git/snapshot/{pkg.Name}.tar.gz",
            "Depends": ["chungus-depends"],
            "OptDepends": ["chungus-optdepends=50"],
            "MakeDepends": ["chungus-makedepends"],
            "CheckDepends": ["chungus-checkdepends"],
            "Conflicts": ["chungus-conflicts"],
            "Provides": ["chungus-provides<=200"],
            "Replaces": ["chungus-replaces<=200"],
            "License": [pkg.package_licenses.first().License.Name],
            "Keywords": [
                "big-chungus",
                "sizeable-chungus",
                "smol-chungus"
            ]
        }],
        "resultcount": 1,
        "type": "multiinfo"
    }

    # Make dummy request.
    with client as request:
        resp = request.get("/rpc", params={
            "v": 5,
            "type": "info",
            "arg": ["chungy-chungus", "big-chungus"],
        })

    # Load request response into Python dictionary.
    response_data = orjson.loads(resp.text)

    # Remove the FirstSubmitted, LastModified, ID and PackageBaseID keys from
    # the response, as those values aren't guaranteed to match between the two
    # (the keys are already removed from 'expected_data').
    for i in ["FirstSubmitted", "LastModified", "ID", "PackageBaseID"]:
        response_data["results"][0].pop(i)

    # Validate that the new dictionaries are the same.
    assert response_data == expected_data


def test_rpc_nonexistent_package(client: TestClient):
    # Make dummy request.
    with client as request:
        response = request.get("/rpc/?v=5&type=info&arg=nonexistent-package")

    # Load request response into Python dictionary.
    response_data = orjson.loads(response.content.decode())

    # Validate data.
    assert response_data["resultcount"] == 0


def test_rpc_multiinfo(client: TestClient, packages: List[Package]):
    # Make dummy request.
    request_packages = ["big-chungus", "chungy-chungus"]
    with client as request:
        response = request.get("/rpc", params={
            "v": 5, "type": "info", "arg[]": request_packages
        })

    # Load request response into Python dictionary.
    response_data = orjson.loads(response.content.decode())

    # Validate data.
    for i in response_data["results"]:
        request_packages.remove(i["Name"])

    assert request_packages == []


def test_rpc_mixedargs(client: TestClient, packages: List[Package]):
    # Make dummy request.
    response1_packages = ["gluggly-chungus"]
    response2_packages = ["gluggly-chungus", "chungy-chungus"]

    with client as request:
        # Supply all of the args in the url to enforce ordering.
        response1 = request.get(
            "/rpc?v=5&arg[]=big-chungus&arg=gluggly-chungus&type=info")
    assert response1.status_code == int(HTTPStatus.OK)

    with client as request:
        response2 = request.get(
            "/rpc?v=5&arg=big-chungus&arg[]=gluggly-chungus"
            "&type=info&arg[]=chungy-chungus")
    assert response2.status_code == int(HTTPStatus.OK)

    # Load request response into Python dictionary.
    response1_data = orjson.loads(response1.content.decode())
    response2_data = orjson.loads(response2.content.decode())

    # Validate data.
    for i in response1_data["results"]:
        response1_packages.remove(i["Name"])

    for i in response2_data["results"]:
        response2_packages.remove(i["Name"])

    for i in [response1_packages, response2_packages]:
        assert i == []


def test_rpc_no_dependencies_omits_key(client: TestClient, user: User,
                                       packages: List[Package],
                                       depends: List[PackageDependency],
                                       relations: List[PackageRelation]):
    """
    This makes sure things like 'MakeDepends' get removed from JSON strings
    when they don't have set values.
    """
    pkg = packages[1]
    expected_response = {
        'version': 5,
        'results': [{
            'Name': pkg.Name,
            'Version': pkg.Version,
            'Description': pkg.Description,
            'URL': pkg.URL,
            'PackageBase': pkg.PackageBase.Name,
            'NumVotes': pkg.PackageBase.NumVotes,
            'Popularity': int(pkg.PackageBase.Popularity),
            'OutOfDate': None,
            'Maintainer': user.Username,
            'URLPath': '/cgit/aur.git/snapshot/chungy-chungus.tar.gz',
            'Depends': ['chungy-depends'],
            'Conflicts': ['chungy-conflicts'],
            'License': [],
            'Keywords': []
        }],
        'resultcount': 1,
        'type': 'multiinfo'
    }

    # Make dummy request.
    with client as request:
        response = request.get("/rpc", params={
            "v": 5, "type": "info", "arg": "chungy-chungus"
        })
    response_data = orjson.loads(response.content.decode())

    # Remove inconsistent keys.
    for i in ["ID", "PackageBaseID", "FirstSubmitted", "LastModified"]:
        response_data["results"][0].pop(i)

    assert response_data == expected_response


def test_rpc_bad_type(client: TestClient):
    # Define expected response.
    expected_data = {
        'version': 5,
        'results': [],
        'resultcount': 0,
        'type': 'error',
        'error': 'Incorrect request type specified.'
    }

    # Make dummy request.
    with client as request:
        response = request.get("/rpc", params={
            "v": 5, "type": "invalid-type", "arg": "big-chungus"
        })

    # Load request response into Python dictionary.
    response_data = orjson.loads(response.content.decode())

    # Validate data.
    assert expected_data == response_data


def test_rpc_bad_version(client: TestClient):
    # Define expected response.
    expected_data = {
        'version': 0,
        'resultcount': 0,
        'results': [],
        'type': 'error',
        'error': 'Invalid version specified.'
    }

    # Make dummy request.
    with client as request:
        response = request.get("/rpc", params={
            "v": 0, "type": "info", "arg": "big-chungus"
        })

    # Load request response into Python dictionary.
    response_data = orjson.loads(response.content.decode())

    # Validate data.
    assert expected_data == response_data


def test_rpc_no_version(client: TestClient):
    # Define expected response.
    expected_data = {
        'version': None,
        'resultcount': 0,
        'results': [],
        'type': 'error',
        'error': 'Please specify an API version.'
    }

    # Make dummy request.
    with client as request:
        response = request.get("/rpc", params={
            "type": "info",
            "arg": "big-chungus"
        })

    # Load request response into Python dictionary.
    response_data = orjson.loads(response.content.decode())

    # Validate data.
    assert expected_data == response_data


def test_rpc_no_type(client: TestClient):
    # Define expected response.
    expected_data = {
        'version': 5,
        'results': [],
        'resultcount': 0,
        'type': 'error',
        'error': 'No request type/data specified.'
    }

    # Make dummy request.
    with client as request:
        response = request.get("/rpc", params={"v": 5, "arg": "big-chungus"})

    # Load request response into Python dictionary.
    response_data = orjson.loads(response.content.decode())

    # Validate data.
    assert expected_data == response_data


def test_rpc_no_args(client: TestClient):
    # Define expected response.
    expected_data = {
        'version': 5,
        'results': [],
        'resultcount': 0,
        'type': 'error',
        'error': 'No request type/data specified.'
    }

    # Make dummy request.
    with client as request:
        response = request.get("/rpc", params={"v": 5, "type": "info"})

    # Load request response into Python dictionary.
    response_data = orjson.loads(response.content.decode())

    # Validate data.
    assert expected_data == response_data


def test_rpc_no_maintainer(client: TestClient, packages: List[Package]):
    # Make dummy request.
    with client as request:
        response = request.get("/rpc", params={
            "v": 5, "type": "info", "arg": "woogly-chungus"
        })

    # Load request response into Python dictionary.
    response_data = orjson.loads(response.content.decode())

    # Validate data.
    assert response_data["results"][0]["Maintainer"] is None


def test_rpc_suggest_pkgbase(client: TestClient, packages: List[Package]):
    params = {"v": 5, "type": "suggest-pkgbase", "arg": "big"}
    with client as request:
        response = request.get("/rpc", params=params)
    data = response.json()
    assert data == ["big-chungus"]

    params["arg"] = "chungy"
    with client as request:
        response = request.get("/rpc", params=params)
    data = response.json()
    assert data == ["chungy-chungus"]

    # Test no arg supplied.
    del params["arg"]
    with client as request:
        response = request.get("/rpc", params=params)
    data = response.json()
    assert data == []


def test_rpc_suggest(client: TestClient, packages: List[Package]):
    params = {"v": 5, "type": "suggest", "arg": "other"}
    with client as request:
        response = request.get("/rpc", params=params)
    data = response.json()
    assert data == ["other-pkg"]

    # Test non-existent Package.
    params["arg"] = "nonexistent"
    with client as request:
        response = request.get("/rpc", params=params)
    data = response.json()
    assert data == []

    # Test no arg supplied.
    del params["arg"]
    with client as request:
        response = request.get("/rpc", params=params)
    data = response.json()
    assert data == []


def mock_config_getint(section: str, key: str):
    if key == "request_limit":
        return 4
    elif key == "window_length":
        return 100
    return config.getint(section, key)


@mock.patch("aurweb.config.getint", side_effect=mock_config_getint)
def test_rpc_ratelimit(getint: mock.MagicMock, client: TestClient,
                       pipeline: Pipeline, packages: List[Package]):
    params = {"v": 5, "type": "suggest-pkgbase", "arg": "big"}

    for i in range(4):
        # The first 4 requests should be good.
        with client as request:
            response = request.get("/rpc", params=params)
        assert response.status_code == int(HTTPStatus.OK)

    # The fifth request should be banned.
    with client as request:
        response = request.get("/rpc", params=params)
    assert response.status_code == int(HTTPStatus.TOO_MANY_REQUESTS)

    # Delete the cached records.
    pipeline.delete("ratelimit-ws:testclient")
    pipeline.delete("ratelimit:testclient")
    one, two = pipeline.execute()
    assert one and two

    # The new first request should be good.
    with client as request:
        response = request.get("/rpc", params=params)
    assert response.status_code == int(HTTPStatus.OK)


def test_rpc_etag(client: TestClient, packages: List[Package]):
    params = {"v": 5, "type": "suggest-pkgbase", "arg": "big"}

    with client as request:
        response1 = request.get("/rpc", params=params)
    with client as request:
        response2 = request.get("/rpc", params=params)

    assert response1.headers.get("ETag") is not None
    assert response1.headers.get("ETag") != str()
    assert response1.headers.get("ETag") == response2.headers.get("ETag")


def test_rpc_search_arg_too_small(client: TestClient):
    params = {"v": 5, "type": "search", "arg": "b"}
    with client as request:
        response = request.get("/rpc", params=params)
    assert response.status_code == int(HTTPStatus.OK)
    assert response.json().get("error") == "Query arg too small."


def test_rpc_search(client: TestClient, packages: List[Package]):
    params = {"v": 5, "type": "search", "arg": "big"}
    with client as request:
        response = request.get("/rpc", params=params)
    assert response.status_code == int(HTTPStatus.OK)

    data = response.json()
    assert data.get("resultcount") == 1

    result = data.get("results")[0]
    assert result.get("Name") == packages[0].Name

    # Test the If-None-Match headers.
    etag = response.headers.get("ETag").strip('"')
    headers = {"If-None-Match": etag}
    response = request.get("/rpc", params=params, headers=headers)
    assert response.status_code == int(HTTPStatus.NOT_MODIFIED)
    assert response.content == b''

    # A missing arg on search types other than 'm' (maintainer) is an error.
    del params["arg"]
    with client as request:
        response = request.get("/rpc", params=params)
    assert response.json().get("error") == "No request type/data specified."


def test_rpc_msearch(client: TestClient, user: User, packages: List[Package]):
    params = {"v": 5, "type": "msearch", "arg": user.Username}
    with client as request:
        response = request.get("/rpc", params=params)
    data = response.json()

    # user1 maintains 4 packages; assert that we got them all.
    assert data.get("resultcount") == 4
    names = list(sorted(r.get("Name") for r in data.get("results")))
    expected_results = [
        "big-chungus",
        "chungy-chungus",
        "gluggly-chungus",
        "other-pkg"
    ]
    assert names == expected_results

    # Search for a non-existent maintainer, giving us zero packages.
    params["arg"] = "blah-blah"
    response = request.get("/rpc", params=params)
    data = response.json()
    assert data.get("resultcount") == 0

    with db.begin():
        packages[0].PackageBase.Maintainer = None

    # A missing arg still succeeds, but it returns all orphans.
    # Just verify that we receive no error and the orphaned result.
    params.pop("arg")
    response = request.get("/rpc", params=params)
    data = response.json()
    assert data.get("resultcount") == 1
    result = data.get("results")[0]
    assert result.get("Name") == "big-chungus"


def test_rpc_search_depends(client: TestClient, packages: List[Package],
                            depends: List[PackageDependency]):
    params = {
        "v": 5, "type": "search", "by": "depends", "arg": "chungus-depends"
    }
    with client as request:
        response = request.get("/rpc", params=params)
    data = response.json()
    assert data.get("resultcount") == 1
    result = data.get("results")[0]
    assert result.get("Name") == packages[0].Name


def test_rpc_search_makedepends(client: TestClient, packages: List[Package],
                                depends: List[PackageDependency]):
    params = {
        "v": 5,
        "type": "search",
        "by": "makedepends",
        "arg": "chungus-makedepends"
    }
    with client as request:
        response = request.get("/rpc", params=params)
    data = response.json()
    assert data.get("resultcount") == 1
    result = data.get("results")[0]
    assert result.get("Name") == packages[0].Name


def test_rpc_search_optdepends(client: TestClient, packages: List[Package],
                               depends: List[PackageDependency]):
    params = {
        "v": 5,
        "type": "search",
        "by": "optdepends",
        "arg": "chungus-optdepends"
    }
    with client as request:
        response = request.get("/rpc", params=params)
    data = response.json()
    assert data.get("resultcount") == 1
    result = data.get("results")[0]
    assert result.get("Name") == packages[0].Name


def test_rpc_search_checkdepends(client: TestClient, packages: List[Package],
                                 depends: List[PackageDependency]):
    params = {
        "v": 5,
        "type": "search",
        "by": "checkdepends",
        "arg": "chungus-checkdepends"
    }
    with client as request:
        response = request.get("/rpc", params=params)
    data = response.json()
    assert data.get("resultcount") == 1
    result = data.get("results")[0]
    assert result.get("Name") == packages[0].Name


def test_rpc_incorrect_by(client: TestClient):
    params = {"v": 5, "type": "search", "by": "fake", "arg": "big"}
    with client as request:
        response = request.get("/rpc", params=params)
    assert response.json().get("error") == "Incorrect by field specified."


def test_rpc_jsonp_callback(client: TestClient):
    """ Test the callback parameter.

    For end-to-end verification, the `examples/jsonp.html` file can be
    used to submit jsonp callback requests to the RPC.
    """
    params = {
        "v": 5,
        "type": "search",
        "arg": "big",
        "callback": "jsonCallback"
    }
    with client as request:
        response = request.get("/rpc", params=params)
    assert response.headers.get("content-type") == "text/javascript"
    assert re.search(r'^/\*\*/jsonCallback\(.*\)$', response.text) is not None

    # Test an invalid callback name; we get an application/json error.
    params["callback"] = "jsonCallback!"
    with client as request:
        response = request.get("/rpc", params=params)
    assert response.headers.get("content-type") == "application/json"
    assert response.json().get("error") == "Invalid callback name."
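For orientation, every request in the suite above hits the same endpoint shape: /rpc?v=5&type=<type>&arg=<name> (or arg[] for multiple values). A minimal client-side sketch of the same call from outside the test suite; the `requests` library and the localhost URL are illustrative assumptions, not part of this diff:

    # Illustrative only: query the v5 info endpoint the way the tests above do.
    import requests

    resp = requests.get("http://localhost:8080/rpc",
                        params={"v": 5, "type": "info", "arg[]": ["big-chungus"]})
    data = resp.json()  # {"version": 5, "type": "multiinfo", "resultcount": ..., "results": [...]}
    print(data["resultcount"])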
test/test_rss.py (new file, 102 lines)
@@ -0,0 +1,102 @@
from http import HTTPStatus

import lxml.etree
import pytest

from fastapi.testclient import TestClient

from aurweb import db, logging, time
from aurweb.asgi import app
from aurweb.models.account_type import AccountType
from aurweb.models.package import Package
from aurweb.models.package_base import PackageBase
from aurweb.models.user import User

logger = logging.get_logger(__name__)


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def client():
    yield TestClient(app=app)


@pytest.fixture
def user():
    account_type = db.query(AccountType,
                            AccountType.AccountType == "User").first()
    yield db.create(User, Username="test",
                    Email="test@example.org",
                    RealName="Test User",
                    Passwd="testPassword",
                    AccountType=account_type)


@pytest.fixture
def packages(user):
    pkgs = []
    now = time.utcnow()

    # Create 101 packages; we limit 100 on RSS feeds.
    with db.begin():
        for i in range(101):
            pkgbase = db.create(
                PackageBase, Maintainer=user, Name=f"test-package-{i}",
                SubmittedTS=(now + i), ModifiedTS=(now + i))
            pkg = db.create(Package, Name=pkgbase.Name, PackageBase=pkgbase)
            pkgs.append(pkg)
    yield pkgs


def parse_root(xml):
    return lxml.etree.fromstring(xml)


def test_rss(client, user, packages):
    with client as request:
        response = request.get("/rss/")
    assert response.status_code == int(HTTPStatus.OK)

    # Test that the RSS we got is sorted by descending SubmittedTS.
    def key_(pkg):
        return pkg.PackageBase.SubmittedTS
    packages = list(reversed(sorted(packages, key=key_)))

    # Just take the first 100.
    packages = packages[:100]

    root = parse_root(response.content)
    items = root.xpath("//channel/item")
    assert len(items) == 100

    for i, item in enumerate(items):
        title = next(iter(item.xpath('./title')))
        logger.debug(f"title: '{title.text}' vs name: '{packages[i].Name}'")
        assert title.text == packages[i].Name


def test_rss_modified(client, user, packages):
    with client as request:
        response = request.get("/rss/modified")
    assert response.status_code == int(HTTPStatus.OK)

    # Test that the RSS we got is sorted by descending ModifiedTS.
    def key_(pkg):
        return pkg.PackageBase.ModifiedTS
    packages = list(reversed(sorted(packages, key=key_)))

    # Just take the first 100.
    packages = packages[:100]

    root = parse_root(response.content)
    items = root.xpath("//channel/item")
    assert len(items) == 100

    for i, item in enumerate(items):
        title = next(iter(item.xpath('./title')))
        logger.debug(f"title: '{title.text}' vs name: '{packages[i].Name}'")
        assert title.text == packages[i].Name
test/test_session.py (new file, 80 lines)
@@ -0,0 +1,80 @@
""" Test our Session model. """
|
||||
from unittest import mock
|
||||
|
||||
import pytest
|
||||
|
||||
from sqlalchemy.exc import IntegrityError
|
||||
|
||||
from aurweb import db, time
|
||||
from aurweb.models.account_type import USER_ID
|
||||
from aurweb.models.session import Session, generate_unique_sid
|
||||
from aurweb.models.user import User
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def setup(db_test):
|
||||
return
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def user() -> User:
|
||||
with db.begin():
|
||||
user = db.create(User, Username="test", Email="test@example.org",
|
||||
ResetKey="testReset", Passwd="testPassword",
|
||||
AccountTypeID=USER_ID)
|
||||
yield user
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def session(user: User) -> Session:
|
||||
with db.begin():
|
||||
session = db.create(Session, User=user, SessionID="testSession",
|
||||
LastUpdateTS=time.utcnow())
|
||||
yield session
|
||||
|
||||
|
||||
def test_session(user: User, session: Session):
|
||||
assert session.SessionID == "testSession"
|
||||
assert session.UsersID == user.ID
|
||||
|
||||
|
||||
def test_session_cs():
|
||||
""" Test case sensitivity of the database table. """
|
||||
with db.begin():
|
||||
user2 = db.create(User, Username="test2", Email="test2@example.org",
|
||||
ResetKey="testReset2", Passwd="testPassword",
|
||||
AccountTypeID=USER_ID)
|
||||
|
||||
with db.begin():
|
||||
session_cs = db.create(Session, User=user2, SessionID="TESTSESSION",
|
||||
LastUpdateTS=time.utcnow())
|
||||
|
||||
assert session_cs.SessionID == "TESTSESSION"
|
||||
assert session_cs.SessionID != "testSession"
|
||||
|
||||
|
||||
def test_session_user_association(user: User, session: Session):
|
||||
# Make sure that the Session user attribute is correct.
|
||||
assert session.User == user
|
||||
|
||||
|
||||
def test_session_null_user_raises():
|
||||
with pytest.raises(IntegrityError):
|
||||
Session()
|
||||
|
||||
|
||||
def test_generate_unique_sid(session: Session):
|
||||
# Mock up aurweb.models.session.generate_sid by returning
|
||||
# sids[i % 2] from 0 .. n. This will swap between each sid
|
||||
# between each call.
|
||||
sids = ["testSession", "realSession"]
|
||||
i = 0
|
||||
|
||||
def mock_generate_sid(length):
|
||||
nonlocal i
|
||||
sid = sids[i % 2]
|
||||
i += 1
|
||||
return sid
|
||||
|
||||
with mock.patch("aurweb.util.make_random_string", mock_generate_sid):
|
||||
assert generate_unique_sid() == "realSession"
|
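The final test above pins down collision handling: the first SID produced ("testSession") already exists via the `session` fixture, so `generate_unique_sid()` must generate again and return "realSession". A rough sketch of that contract, not aurweb's actual implementation, just the behavior the test verifies:

    # Sketch of the retry-on-collision contract; not the real aurweb code.
    def unique_sid_sketch(sid_exists, generate_sid, length=32):
        sid = generate_sid(length)
        while sid_exists(sid):    # e.g. a Session row already uses this SessionID
            sid = generate_sid(length)
        return sid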
test/test_spawn.py (new file, 149 lines)
@@ -0,0 +1,149 @@
import os
import tempfile

from typing import Tuple
from unittest import mock

import pytest

import aurweb.config
import aurweb.spawn

from aurweb.exceptions import AurwebException

# Some os.environ overrides we use in this suite.
TEST_ENVIRONMENT = {
    "PHP_NGINX_PORT": "8001",
    "FASTAPI_NGINX_PORT": "8002"
}


class FakeProcess:
    """ Fake a subprocess.Popen return object. """

    returncode = 0
    stdout = b''
    stderr = b''

    def __init__(self, *args, **kwargs):
        """ We need this constructor to remain compatible with Popen. """
        pass

    def communicate(self) -> Tuple[bytes, bytes]:
        return (self.stdout, self.stderr)

    def terminate(self) -> None:
        raise Exception("Fake termination.")

    def wait(self) -> int:
        return self.returncode


class MockFakeProcess:
    """ FakeProcess construction helper to be used in mocks. """

    def __init__(self, return_code: int = 0, stdout: bytes = b'',
                 stderr: bytes = b''):
        self.returncode = return_code
        self.stdout = stdout
        self.stderr = stderr

    def process(self, *args, **kwargs) -> FakeProcess:
        proc = FakeProcess()
        proc.returncode = self.returncode
        proc.stdout = self.stdout
        proc.stderr = self.stderr
        return proc


@mock.patch("aurweb.spawn.PHP_BINARY", "does-not-exist")
def test_spawn():
    match = r"^Unable to locate the '.*' executable\.$"
    with pytest.raises(AurwebException, match=match):
        aurweb.spawn.validate_php_config()


@mock.patch("subprocess.Popen", side_effect=MockFakeProcess(1).process)
def test_spawn_non_zero_php_binary(fake_process: FakeProcess):
    match = r"^Received non-zero error code.*$"
    with pytest.raises(AssertionError, match=match):
        aurweb.spawn.validate_php_config()


def test_spawn_missing_modules():
    side_effect = MockFakeProcess(stdout=b"pdo_sqlite").process
    with mock.patch("subprocess.Popen", side_effect=side_effect):
        match = r"PHP does not have the 'pdo_mysql' module enabled\.$"
        with pytest.raises(AurwebException, match=match):
            aurweb.spawn.validate_php_config()

    side_effect = MockFakeProcess(stdout=b"pdo_mysql").process
    with mock.patch("subprocess.Popen", side_effect=side_effect):
        match = r"PHP does not have the 'pdo_sqlite' module enabled\.$"
        with pytest.raises(AurwebException, match=match):
            aurweb.spawn.validate_php_config()


@mock.patch.dict("os.environ", TEST_ENVIRONMENT)
def test_spawn_generate_nginx_config():
    ctx = tempfile.TemporaryDirectory()
    with ctx and mock.patch("aurweb.spawn.temporary_dir", ctx.name):
        aurweb.spawn.generate_nginx_config()
        nginx_config_path = os.path.join(ctx.name, "nginx.conf")
        with open(nginx_config_path) as f:
            nginx_config = f.read().rstrip()

    php_address = aurweb.config.get("php", "bind_address")
    php_host = php_address.split(":")[0]
    fastapi_address = aurweb.config.get("fastapi", "bind_address")
    fastapi_host = fastapi_address.split(":")[0]
    expected_content = [
        f'listen {php_host}:{TEST_ENVIRONMENT.get("PHP_NGINX_PORT")}',
        f"proxy_pass http://{php_address}",
        f'listen {fastapi_host}:{TEST_ENVIRONMENT.get("FASTAPI_NGINX_PORT")}',
        f"proxy_pass http://{fastapi_address}"
    ]
    for expected in expected_content:
        assert expected in nginx_config


@mock.patch("aurweb.spawn.asgi_backend", "uvicorn")
@mock.patch("aurweb.spawn.verbosity", 1)
@mock.patch("aurweb.spawn.workers", 1)
def test_spawn_start_stop():
    ctx = tempfile.TemporaryDirectory()
    with ctx and mock.patch("aurweb.spawn.temporary_dir", ctx.name):
        aurweb.spawn.start()
        aurweb.spawn.stop()


@mock.patch("aurweb.spawn.asgi_backend", "uvicorn")
@mock.patch("aurweb.spawn.verbosity", 1)
@mock.patch("aurweb.spawn.workers", 1)
@mock.patch("aurweb.spawn.children", [MockFakeProcess().process()])
def test_spawn_start_noop_with_children():
    aurweb.spawn.start()


@mock.patch("aurweb.spawn.asgi_backend", "uvicorn")
@mock.patch("aurweb.spawn.verbosity", 1)
@mock.patch("aurweb.spawn.workers", 1)
@mock.patch("aurweb.spawn.children", [MockFakeProcess().process()])
def test_spawn_stop_terminate_failure():
    ctx = tempfile.TemporaryDirectory()
    with ctx and mock.patch("aurweb.spawn.temporary_dir", ctx.name):
        match = r"^Errors terminating the child processes"
        with pytest.raises(aurweb.spawn.ProcessExceptions, match=match):
            aurweb.spawn.stop()


@mock.patch("aurweb.spawn.asgi_backend", "uvicorn")
@mock.patch("aurweb.spawn.verbosity", 1)
@mock.patch("aurweb.spawn.workers", 1)
@mock.patch("aurweb.spawn.children", [MockFakeProcess(1).process()])
def test_spawn_stop_wait_failure():
    ctx = tempfile.TemporaryDirectory()
    with ctx and mock.patch("aurweb.spawn.temporary_dir", ctx.name):
        match = r"^Errors terminating the child processes"
        with pytest.raises(aurweb.spawn.ProcessExceptions, match=match):
            aurweb.spawn.stop()
test/test_ssh_pub_key.py (new file, 68 lines)
@@ -0,0 +1,68 @@
import pytest

from aurweb import db
from aurweb.models.account_type import USER_ID
from aurweb.models.ssh_pub_key import SSHPubKey, get_fingerprint
from aurweb.models.user import User

TEST_SSH_PUBKEY = """
ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCycoCi5yGCvSclH2wmNBUuwsYEzRZZBJaQquRc4y\
sl+Tg+/jiDkR3Zn9fIznC4KnFoyrIHzkKuePZ3bNDYwkZxkJKoWBCh4hXKDXSm87FMN0+VDC+1QxF/\
z0XaAGr/P6f4XukabyddypBdnHcZiplbw+YOSqcAE2TCqOlSXwNMOcF9U89UsR/Q9i9I52hlvU0q8+\
fZVGhou1KCowFSnHYtrr5KYJ04CXkJ13DkVf3+pjQWyrByvBcf1hGEaczlgfobrrv/y96jDhgfXucx\
liNKLdufDPPkii3LhhsNcDmmI1VZ3v0irKvd9WZuauqloobY84zEFcDTyjn0hxGjVeYFejm4fBnvjg\
a0yZXORuWksdNfXWLDxFk6MDDd1jF0ExRbP+OxDuU4IVyIuDL7S3cnbf2YjGhkms/8voYT2OBE7FwN\
lfv98Kr0NUp51zpf55Arxn9j0Rz9xTA7FiODQgCn6iQ0SDtzUNL0IKTCw26xJY5gzMxbfpvzPQGeul\
x/ioM= kevr@volcano
"""


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd="testPassword",
                         AccountTypeID=USER_ID)
    yield user


@pytest.fixture
def pubkey(user: User) -> SSHPubKey:
    with db.begin():
        pubkey = db.create(SSHPubKey, User=user,
                           Fingerprint="testFingerprint",
                           PubKey="testPubKey")
    yield pubkey


def test_pubkey(user: User, pubkey: SSHPubKey):
    assert pubkey.UserID == user.ID
    assert pubkey.User == user
    assert pubkey.Fingerprint == "testFingerprint"
    assert pubkey.PubKey == "testPubKey"


def test_pubkey_cs(user: User):
    """ Test case sensitivity of the database table. """
    with db.begin():
        pubkey_cs = db.create(SSHPubKey, User=user,
                              Fingerprint="TESTFINGERPRINT",
                              PubKey="TESTPUBKEY")

    assert pubkey_cs.Fingerprint == "TESTFINGERPRINT"
    assert pubkey_cs.Fingerprint != "testFingerprint"
    assert pubkey_cs.PubKey == "TESTPUBKEY"
    assert pubkey_cs.PubKey != "testPubKey"


def test_pubkey_fingerprint():
    assert get_fingerprint(TEST_SSH_PUBKEY) is not None


def test_pubkey_invalid_fingerprint():
    assert get_fingerprint("ssh-rsa fake and invalid") is None
test/test_templates.py (new file, 328 lines)
@@ -0,0 +1,328 @@
import re

from typing import Any, Dict

import pytest

import aurweb.filters  # noqa: F401

from aurweb import config, db, templates, time
from aurweb.filters import as_timezone, number_format
from aurweb.filters import timestamp_to_datetime as to_dt
from aurweb.models import Package, PackageBase, User
from aurweb.models.account_type import USER_ID
from aurweb.models.license import License
from aurweb.models.package_license import PackageLicense
from aurweb.models.package_relation import PackageRelation
from aurweb.models.relation_type import PROVIDES_ID, REPLACES_ID
from aurweb.templates import base_template, make_context, register_filter, register_function
from aurweb.testing.html import parse_root
from aurweb.testing.requests import Request

GIT_CLONE_URI_ANON = "anon_%s"
GIT_CLONE_URI_PRIV = "priv_%s"


@register_filter("func")
def func():
    pass


@register_function("function")
def function():
    pass


def create_user(username: str) -> User:
    with db.begin():
        user = db.create(User, Username=username,
                         Email=f"{username}@example.org",
                         Passwd="testPassword",
                         AccountTypeID=USER_ID)
    return user


def create_pkgrel(package: Package, reltype_id: int, relname: str) \
        -> PackageRelation:
    return db.create(PackageRelation,
                     Package=package,
                     RelTypeID=reltype_id,
                     RelName=relname)


@pytest.fixture
def user(db_test) -> User:
    user = create_user("test")
    yield user


@pytest.fixture
def pkgbase(user: User) -> PackageBase:
    now = time.utcnow()
    with db.begin():
        pkgbase = db.create(PackageBase, Name="test-pkg", Maintainer=user,
                            SubmittedTS=now, ModifiedTS=now)
    yield pkgbase


@pytest.fixture
def package(user: User, pkgbase: PackageBase) -> Package:
    with db.begin():
        pkg = db.create(Package, PackageBase=pkgbase, Name=pkgbase.Name)
    yield pkg


def create_license(pkg: Package, license_name: str) -> PackageLicense:
    lic = db.create(License, Name=license_name)
    pkglic = db.create(PackageLicense, License=lic, Package=pkg)
    return pkglic


def test_register_function_exists_key_error():
    """ Most instances of register_filter are tested through module
    imports or template renders, so we only test failures here. """
    with pytest.raises(KeyError):
        @register_function("function")
        def some_func():
            pass


def test_commit_hash():
    # Hashes we'll use for this test. long_commit_hash should be
    # shortened to commit_hash for rendering.
    commit_hash = "abcdefg"
    long_commit_hash = commit_hash + "1234567"

    def config_get_with_fallback(section: str, option: str,
                                 fallback: str = None) -> str:
        if section == "devel" and option == "commit_hash":
            return long_commit_hash
        return config.original_get_with_fallback(section, option, fallback)

    # Fake config.get_with_fallback.
    config.original_get_with_fallback = config.get_with_fallback
    config.get_with_fallback = config_get_with_fallback

    request = Request()
    context = templates.make_context(request, "Test Context")
    render = templates.render_raw_template(request, "index.html", context)

    # We've faked config.get_with_fallback to return a "valid" commit_hash
    # when queried. Test that the expected render occurs.
    commit_url = config.get("devel", "commit_url")
    expected = commit_url % commit_hash
    assert expected in render
    assert f"HEAD@{commit_hash}" in render
    assert long_commit_hash not in render

    # Restore config.get_with_fallback.
    config.get_with_fallback = config.original_get_with_fallback
    config.original_get_with_fallback = None

    # Now, we no longer fake the commit_hash option: no commit
    # is displayed in the footer. Assert this expectation.
    context = templates.make_context(request, "Test Context")
    render = templates.render_raw_template(request, "index.html", context)
    assert commit_hash not in render


def pager_context(num_packages: int) -> Dict[str, Any]:
    return {
        "request": Request(),
        "singular": "%d package found.",
        "plural": "%d packages found.",
        "prefix": "/packages",
        "total": num_packages,
        "O": 0,
        "PP": 50
    }


def test_pager_no_results():
    """ Test the pager partial with no results. """
    num_packages = 0
    context = pager_context(num_packages)
    body = base_template("partials/pager.html").render(context)

    root = parse_root(body)
    stats = root.xpath('//div[@class="pkglist-stats"]/p')
    expected = "0 packages found."
    assert stats[0].text.strip() == expected


def test_pager():
    """ Test the pager partial with two pages of results. """
    num_packages = 100
    context = pager_context(num_packages)
    body = base_template("partials/pager.html").render(context)

    root = parse_root(body)
    stats = root.xpath('//div[@class="pkglist-stats"]/p')
    stats = re.sub(r"\s{2,}", " ", stats[0].text.strip())
    expected = f"{num_packages} packages found. Page 1 of 2."
    assert stats == expected


def check_package_details(content: str, pkg: Package) -> None:
    """
    Perform assertion checks against package details.
    """
    pkgbase = pkg.PackageBase

    root = parse_root(content)
    pkginfo = root.xpath('//table[@id="pkginfo"]')[0]
    rows = pkginfo.xpath("./tr")

    # Check Git Clone URL.
    git_clone_uris = rows[0].xpath("./td/a")
    anon_uri, priv_uri = git_clone_uris
    pkgbasename = pkgbase.Name
    assert anon_uri.text.strip() == GIT_CLONE_URI_ANON % pkgbasename
    assert priv_uri.text.strip() == GIT_CLONE_URI_PRIV % pkgbasename

    # Check Package Base.
    pkgbase_markup = rows[1].xpath("./td/a")[0]
    assert pkgbase_markup.text.strip() == pkgbasename

    # Check Description.
    desc = rows[2].xpath("./td")[0]
    assert desc.text.strip() == str(pkg.Description)

    # Check URL, for which we have none. In this case, no <a> should
    # be used since we have nothing to link.
    url = rows[3].xpath("./td")[0]
    assert url.text.strip() == str(pkg.URL)

    # Check Keywords, which should be empty.
    keywords = rows[4].xpath("./td/form/div/input")[0]
    assert keywords.attrib["value"] == str()

    i = 4
    licenses = pkg.package_licenses.all()
    if licenses:
        i += 1
        expected = ", ".join([p.License.Name for p in licenses])
        license_markup = rows[i].xpath("./td")[0]
        assert license_markup.text.strip() == expected
    else:
        assert "Licenses" not in content

    provides = pkg.package_relations.filter(
        PackageRelation.RelTypeID == PROVIDES_ID
    ).all()
    if provides:
        i += 1
        expected = ", ".join([p.RelName for p in provides])
        provides_markup = rows[i].xpath("./td")[0]
        assert provides_markup.text.strip() == expected
    else:
        assert "Provides" not in content

    replaces = pkg.package_relations.filter(
        PackageRelation.RelTypeID == REPLACES_ID
    ).all()
    if replaces:
        i += 1
        expected = ", ".join([r.RelName for r in replaces])
        replaces_markup = rows[i].xpath("./td")[0]
        assert replaces_markup.text.strip() == expected
    else:
        assert "Replaces" not in content

    # Check Submitter.
    selector = "./td" if not pkg.PackageBase.Submitter else "./td/a"
    i += 1
    submitter = rows[i].xpath(selector)[0]
    assert submitter.text.strip() == str(pkg.PackageBase.Submitter)

    # Check Maintainer.
    selector = "./td" if not pkg.PackageBase.Maintainer else "./td/a"
    i += 1
    maintainer = rows[i].xpath(selector)[0]
    assert maintainer.text.strip() == str(pkg.PackageBase.Maintainer)

    # Check Packager.
    selector = "./td" if not pkg.PackageBase.Packager else "./td/a"
    i += 1
    packager = rows[i].xpath(selector)[0]
    assert packager.text.strip() == str(pkg.PackageBase.Packager)

    # Check Votes.
    i += 1
    votes = rows[i].xpath("./td")[0]
    assert votes.text.strip() == str(pkg.PackageBase.NumVotes)

    # Check Popularity; for this package, a number_format of 6 places is used.
    i += 1
    pop = rows[i].xpath("./td")[0]
    assert pop.text.strip() == number_format(0, 6)

    # Check First Submitted.
    date_fmt = "%Y-%m-%d %H:%M"
    i += 1
    first_submitted = rows[i].xpath("./td")[0]
    converted_dt = as_timezone(to_dt(pkg.PackageBase.SubmittedTS), "UTC")
    expected = converted_dt.strftime(date_fmt)
    assert first_submitted.text.strip() == expected

    # Check Last Updated.
    i += 1
    last_updated = rows[i].xpath("./td")[0]
    converted_dt = as_timezone(to_dt(pkg.PackageBase.ModifiedTS), "UTC")
    expected = converted_dt.strftime(date_fmt)
    assert last_updated.text.strip() == expected


def test_package_details(user: User, package: Package):
    """ Test package details with most fields populated, but not all. """
    request = Request(user=user, authenticated=True)
    context = make_context(request, "Test Details")
    context.update({
        "request": request,
        "git_clone_uri_anon": GIT_CLONE_URI_ANON,
        "git_clone_uri_priv": GIT_CLONE_URI_PRIV,
        "pkgbase": package.PackageBase,
        "pkg": package
    })

    base = base_template("partials/packages/details.html")
    body = base.render(context, show_package_details=True)
    check_package_details(body, package)


def test_package_details_filled(user: User, package: Package):
    """ Test package details with all fields populated. """

    pkgbase = package.PackageBase
    with db.begin():
        # Setup Submitter and Packager; Maintainer is already set to `user`.
        pkgbase.Submitter = pkgbase.Packager = user

        # Create two licenses.
        create_license(package, "TPL")  # Testing Public License
        create_license(package, "TPL2")  # Testing Public License 2

        # Add provides.
        create_pkgrel(package, PROVIDES_ID, "test-provider")

        # Add replaces.
        create_pkgrel(package, REPLACES_ID, "test-replacement")

    request = Request(user=user, authenticated=True)
    context = make_context(request, "Test Details")
    context.update({
        "request": request,
        "git_clone_uri_anon": GIT_CLONE_URI_ANON,
        "git_clone_uri_priv": GIT_CLONE_URI_PRIV,
        "pkgbase": package.PackageBase,
        "pkg": package,
        "licenses": package.package_licenses,
        "provides": package.package_relations.filter(
            PackageRelation.RelTypeID == PROVIDES_ID),
        "replaces": package.package_relations.filter(
            PackageRelation.RelTypeID == REPLACES_ID),
    })

    base = base_template("partials/packages/details.html")
    body = base.render(context, show_package_details=True)
    check_package_details(body, package)
test/test_term.py (new file, 31 lines)
@@ -0,0 +1,31 @@
import pytest
|
||||
|
||||
from sqlalchemy.exc import IntegrityError
|
||||
|
||||
from aurweb import db
|
||||
from aurweb.models.term import Term
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def setup(db_test):
|
||||
return
|
||||
|
||||
|
||||
def test_term_creation():
|
||||
with db.begin():
|
||||
term = db.create(Term, Description="Term description",
|
||||
URL="https://fake_url.io")
|
||||
assert bool(term.ID)
|
||||
assert term.Description == "Term description"
|
||||
assert term.URL == "https://fake_url.io"
|
||||
assert term.Revision == 1
|
||||
|
||||
|
||||
def test_term_null_description_raises_exception():
|
||||
with pytest.raises(IntegrityError):
|
||||
Term(URL="https://fake_url.io")
|
||||
|
||||
|
||||
def test_term_null_url_raises_exception():
|
||||
with pytest.raises(IntegrityError):
|
||||
Term(Description="Term description")
|

test/test_time.py (new file, 33 lines)
@@ -0,0 +1,33 @@
import aurweb.config

from aurweb.testing.requests import Request
from aurweb.time import get_request_timezone, tz_offset


def test_tz_offset_utc():
    offset = tz_offset("UTC")
    assert offset == "+00:00"


def test_tz_offset_mst():
    offset = tz_offset("MST")
    assert offset == "-07:00"


def test_request_timezone():
    request = Request()
    tz = get_request_timezone(request)
    assert tz == aurweb.config.get("options", "default_timezone")


def test_authenticated_request_timezone():
    # Modify a fake request to be authenticated with the
    # America/Los_Angeles timezone.
    request = Request()
    request.user.authenticated = True
    request.user.Timezone = "America/Los_Angeles"

    # Get the request's timezone, it should be America/Los_Angeles.
    tz = get_request_timezone(request)
    assert tz == request.user.Timezone
    assert tz == "America/Los_Angeles"
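
The tz_offset expectations above ("+00:00", "-07:00") are plain UTC-offset strings. As a rough illustration of the conversion such a helper has to perform (this is not the aurweb implementation; the function below is a hedged sketch based only on the assertions above, using the standard-library zoneinfo module):

    # Illustration only: derive a "+HH:MM"/"-HH:MM" offset string for a zone
    # name, similar in spirit to what tz_offset is asserted to return above.
    from datetime import datetime
    from zoneinfo import ZoneInfo

    def offset_string(tzname: str) -> str:
        delta = datetime.now(ZoneInfo(tzname)).utcoffset()
        total_minutes = int(delta.total_seconds()) // 60
        sign = "+" if total_minutes >= 0 else "-"
        hours, minutes = divmod(abs(total_minutes), 60)
        return f"{sign}{hours:02d}:{minutes:02d}"

For example, offset_string("MST") yields "-07:00", which is what the test asserts for the real helper.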

test/test_trusted_user_routes.py (new file, 867 lines)
@@ -0,0 +1,867 @@
import re

from http import HTTPStatus
from io import StringIO
from typing import Tuple

import lxml.etree
import pytest

from fastapi.testclient import TestClient

from aurweb import config, db, filters, time
from aurweb.models.account_type import DEVELOPER_ID, AccountType
from aurweb.models.tu_vote import TUVote
from aurweb.models.tu_voteinfo import TUVoteInfo
from aurweb.models.user import User
from aurweb.testing.requests import Request

DATETIME_REGEX = r'^[0-9]{4}-[0-9]{2}-[0-9]{2}$'
PARTICIPATION_REGEX = r'^1?[0-9]{2}[%]$'  # 0% - 100%


def parse_root(html):
    parser = lxml.etree.HTMLParser(recover=True)
    tree = lxml.etree.parse(StringIO(html), parser)
    return tree.getroot()


def get_table(root, class_name):
    table = root.xpath(f'//table[contains(@class, "{class_name}")]')[0]
    return table


def get_table_rows(table):
    tbody = table.xpath("./tbody")[0]
    return tbody.xpath("./tr")


def get_pkglist_directions(table):
    stats = table.getparent().xpath("./div[@class='pkglist-stats']")[0]
    nav = stats.xpath("./p[@class='pkglist-nav']")[0]
    return nav.xpath("./a")


def get_a(node):
    return node.xpath('./a')[0].text.strip()


def get_span(node):
    return node.xpath('./span')[0].text.strip()


def assert_current_vote_html(row, expected):
    columns = row.xpath("./td")
    proposal, start, end, user, voted = columns
    p, s, e, u, v = expected  # Column expectations.
    assert re.match(p, get_a(proposal)) is not None
    assert re.match(s, start.text) is not None
    assert re.match(e, end.text) is not None
    assert re.match(u, get_a(user)) is not None
    assert re.match(v, get_span(voted)) is not None


def assert_past_vote_html(row, expected):
    columns = row.xpath("./td")
    proposal, start, end, user, yes, no, voted = columns  # Real columns.
    p, s, e, u, y, n, v = expected  # Column expectations.
    assert re.match(p, get_a(proposal)) is not None
    assert re.match(s, start.text) is not None
    assert re.match(e, end.text) is not None
    assert re.match(u, get_a(user)) is not None
    assert re.match(y, yes.text) is not None
    assert re.match(n, no.text) is not None
    assert re.match(v, get_span(voted)) is not None


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def client():
    from aurweb.asgi import app
    yield TestClient(app=app)


@pytest.fixture
def tu_user():
    tu_type = db.query(AccountType,
                       AccountType.AccountType == "Trusted User").first()
    with db.begin():
        tu_user = db.create(User, Username="test_tu",
                            Email="test_tu@example.org",
                            RealName="Test TU", Passwd="testPassword",
                            AccountType=tu_type)
    yield tu_user


@pytest.fixture
def user():
    user_type = db.query(AccountType,
                         AccountType.AccountType == "User").first()
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd="testPassword",
                         AccountType=user_type)
    yield user


@pytest.fixture
def proposal(user, tu_user):
    ts = time.utcnow()
    agenda = "Test proposal."
    start = ts - 5
    end = ts + 1000

    with db.begin():
        voteinfo = db.create(TUVoteInfo,
                             Agenda=agenda, Quorum=0.0,
                             User=user.Username, Submitter=tu_user,
                             Submitted=start, End=end)
    yield (tu_user, user, voteinfo)


def test_tu_index_guest(client):
    headers = {"referer": config.get("options", "aur_location") + "/tu"}
    with client as request:
        response = request.get("/tu", allow_redirects=False, headers=headers)
    assert response.status_code == int(HTTPStatus.SEE_OTHER)

    params = filters.urlencode({"next": "/tu"})
    assert response.headers.get("location") == f"/login?{params}"


def test_tu_index_unauthorized(client: TestClient, user: User):
    cookies = {"AURSID": user.login(Request(), "testPassword")}
    with client as request:
        # Login as a normal user, not a TU.
        response = request.get("/tu", cookies=cookies, allow_redirects=False)
    assert response.status_code == int(HTTPStatus.SEE_OTHER)
    assert response.headers.get("location") == "/"


def test_tu_empty_index(client, tu_user):
    """ Check an empty index when we don't create any records. """

    # Make a default get request to /tu.
    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        response = request.get("/tu", cookies=cookies, allow_redirects=False)
    assert response.status_code == int(HTTPStatus.OK)

    # Parse lxml root.
    root = parse_root(response.text)

    # Check that .current-votes does not exist.
    tables = root.xpath('//table[contains(@class, "current-votes")]')
    assert len(tables) == 0

    # Check that .past-votes has does not exist.
    tables = root.xpath('//table[contains(@class, "current-votes")]')
    assert len(tables) == 0


def test_tu_index(client, tu_user):
    ts = time.utcnow()

    # Create some test votes: (Agenda, Start, End).
    votes = [
        ("Test agenda 1", ts - 5, ts + 1000),  # Still running.
        ("Test agenda 2", ts - 1000, ts - 5)  # Not running anymore.
    ]
    vote_records = []
    with db.begin():
        for vote in votes:
            agenda, start, end = vote
            vote_records.append(
                db.create(TUVoteInfo, Agenda=agenda,
                          User=tu_user.Username,
                          Submitted=start, End=end,
                          Quorum=0.0,
                          Submitter=tu_user))

    with db.begin():
        # Vote on an ended proposal.
        vote_record = vote_records[1]
        vote_record.Yes += 1
        vote_record.ActiveTUs += 1
        db.create(TUVote, VoteInfo=vote_record, User=tu_user)

    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        # Pass an invalid cby and pby; let them default to "desc".
        response = request.get("/tu", cookies=cookies, params={
            "cby": "BAD!",
            "pby": "blah"
        }, allow_redirects=False)

    assert response.status_code == int(HTTPStatus.OK)

    # Rows we expect to exist in HTML produced by /tu for current votes.
    expected_rows = [
        (
            r'Test agenda 1',
            DATETIME_REGEX,
            DATETIME_REGEX,
            tu_user.Username,
            r'^(Yes|No)$'
        )
    ]

    # Assert that we are matching the number of current votes.
    current_votes = [c for c in votes if c[2] > ts]
    assert len(current_votes) == len(expected_rows)

    # Parse lxml.etree root.
    root = parse_root(response.text)

    table = get_table(root, "current-votes")
    rows = get_table_rows(table)
    for i, row in enumerate(rows):
        assert_current_vote_html(row, expected_rows[i])

    # Assert that we are matching the number of past votes.
    past_votes = [c for c in votes if c[2] <= ts]
    assert len(past_votes) == len(expected_rows)

    # Rows we expect to exist in HTML produced by /tu for past votes.
    expected_rows = [
        (
            r'Test agenda 2',
            DATETIME_REGEX,
            DATETIME_REGEX,
            tu_user.Username,
            r'^\d+$',
            r'^\d+$',
            r'^(Yes|No)$'
        )
    ]

    table = get_table(root, "past-votes")
    rows = get_table_rows(table)
    for i, row in enumerate(rows):
        assert_past_vote_html(row, expected_rows[i])

    # Get the .last-votes table and check that our vote shows up.
    table = get_table(root, "last-votes")
    rows = get_table_rows(table)
    assert len(rows) == 1

    # Check to see the rows match up to our user and related vote.
    username, vote_id = rows[0]
    vote_id = vote_id.xpath("./a")[0]
    assert username.text.strip() == tu_user.Username
    assert int(vote_id.text.strip()) == vote_records[1].ID


def test_tu_index_table_paging(client, tu_user):
    ts = time.utcnow()

    with db.begin():
        for i in range(25):
            # Create 25 current votes.
            db.create(TUVoteInfo, Agenda=f"Agenda #{i}",
                      User=tu_user.Username,
                      Submitted=(ts - 5), End=(ts + 1000),
                      Quorum=0.0,
                      Submitter=tu_user)

        for i in range(25):
            # Create 25 past votes.
            db.create(TUVoteInfo, Agenda=f"Agenda #{25 + i}",
                      User=tu_user.Username,
                      Submitted=(ts - 1000), End=(ts - 5),
                      Quorum=0.0,
                      Submitter=tu_user)

    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        response = request.get("/tu", cookies=cookies, allow_redirects=False)
    assert response.status_code == int(HTTPStatus.OK)

    # Parse lxml.etree root.
    root = parse_root(response.text)

    table = get_table(root, "current-votes")
    rows = get_table_rows(table)
    assert len(rows) == 10

    def make_expectation(offset, i):
        return [
            f"Agenda #{offset + i}",
            DATETIME_REGEX,
            DATETIME_REGEX,
            tu_user.Username,
            r'^(Yes|No)$'
        ]

    for i, row in enumerate(rows):
        assert_current_vote_html(row, make_expectation(0, i))

    # Parse out Back/Next buttons.
    directions = get_pkglist_directions(table)
    assert len(directions) == 1
    assert "Next" in directions[0].text

    # Now, get the next page of current votes.
    offset = 10  # Specify coff=10
    with client as request:
        response = request.get("/tu", cookies=cookies, params={
            "coff": offset
        }, allow_redirects=False)
    assert response.status_code == int(HTTPStatus.OK)

    old_rows = rows
    root = parse_root(response.text)

    table = get_table(root, "current-votes")
    rows = get_table_rows(table)
    assert rows != old_rows

    for i, row in enumerate(rows):
        assert_current_vote_html(row, make_expectation(offset, i))

    # Parse out Back/Next buttons.
    directions = get_pkglist_directions(table)
    assert len(directions) == 2
    assert "Back" in directions[0].text
    assert "Next" in directions[1].text

    # Make sure past-votes' Back/Next were not affected.
    past_votes = get_table(root, "past-votes")
    past_directions = get_pkglist_directions(past_votes)
    assert len(past_directions) == 1
    assert "Next" in past_directions[0].text

    offset = 20  # Specify coff=10
    with client as request:
        response = request.get("/tu", cookies=cookies, params={
            "coff": offset
        }, allow_redirects=False)
    assert response.status_code == int(HTTPStatus.OK)

    # Do it again, we only have five left.
    old_rows = rows
    root = parse_root(response.text)

    table = get_table(root, "current-votes")
    rows = get_table_rows(table)
    assert rows != old_rows
    for i, row in enumerate(rows):
        assert_current_vote_html(row, make_expectation(offset, i))

    # Parse out Back/Next buttons.
    directions = get_pkglist_directions(table)
    assert len(directions) == 1
    assert "Back" in directions[0].text

    # Make sure past-votes' Back/Next were not affected.
    past_votes = get_table(root, "past-votes")
    past_directions = get_pkglist_directions(past_votes)
    assert len(past_directions) == 1
    assert "Next" in past_directions[0].text


def test_tu_index_sorting(client, tu_user):
    ts = time.utcnow()

    with db.begin():
        for i in range(2):
            # Create 'Agenda #1' and 'Agenda #2'.
            db.create(TUVoteInfo, Agenda=f"Agenda #{i + 1}",
                      User=tu_user.Username,
                      Submitted=(ts + 5), End=(ts + 1000),
                      Quorum=0.0,
                      Submitter=tu_user)

            # Let's order each vote one day after the other.
            # This will allow us to test the sorting nature
            # of the tables.
            ts += 86405

    # Make a default request to /tu.
    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        response = request.get("/tu", cookies=cookies, allow_redirects=False)
    assert response.status_code == int(HTTPStatus.OK)

    # Get lxml handles of the document.
    root = parse_root(response.text)
    table = get_table(root, "current-votes")
    rows = get_table_rows(table)

    # The latest Agenda is at the top by default.
    expected = [
        "Agenda #2",
        "Agenda #1"
    ]

    assert len(rows) == len(expected)
    for i, row in enumerate(rows):
        assert_current_vote_html(row, [
            expected[i],
            DATETIME_REGEX,
            DATETIME_REGEX,
            tu_user.Username,
            r'^(Yes|No)$'
        ])

    # Make another request; one that sorts the current votes
    # in ascending order instead of the default descending order.
    with client as request:
        response = request.get("/tu", cookies=cookies, params={
            "cby": "asc"
        }, allow_redirects=False)
    assert response.status_code == int(HTTPStatus.OK)

    # Get lxml handles of the document.
    root = parse_root(response.text)
    table = get_table(root, "current-votes")
    rows = get_table_rows(table)

    # Reverse our expectations and assert that the proposals got flipped.
    rev_expected = list(reversed(expected))
    assert len(rows) == len(rev_expected)
    for i, row in enumerate(rows):
        assert_current_vote_html(row, [
            rev_expected[i],
            DATETIME_REGEX,
            DATETIME_REGEX,
            tu_user.Username,
            r'^(Yes|No)$'
        ])


def test_tu_index_last_votes(client, tu_user, user):
    ts = time.utcnow()

    with db.begin():
        # Create a proposal which has ended.
        voteinfo = db.create(TUVoteInfo, Agenda="Test agenda",
                             User=user.Username,
                             Submitted=(ts - 1000),
                             End=(ts - 5),
                             Yes=1,
                             ActiveTUs=1,
                             Quorum=0.0,
                             Submitter=tu_user)

        # Create a vote on it from tu_user.
        db.create(TUVote, VoteInfo=voteinfo, User=tu_user)

    # Now, check that tu_user got populated in the .last-votes table.
    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        response = request.get("/tu", cookies=cookies)
    assert response.status_code == int(HTTPStatus.OK)

    root = parse_root(response.text)
    table = get_table(root, "last-votes")
    rows = get_table_rows(table)
    assert len(rows) == 1

    last_vote = rows[0]
    user, vote_id = last_vote.xpath("./td")
    vote_id = vote_id.xpath("./a")[0]

    assert user.text.strip() == tu_user.Username
    assert int(vote_id.text.strip()) == voteinfo.ID


def test_tu_proposal_not_found(client, tu_user):
    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        response = request.get("/tu", params={"id": 1}, cookies=cookies)
    assert response.status_code == int(HTTPStatus.NOT_FOUND)


def test_tu_proposal_unauthorized(client: TestClient, user: User,
                                  proposal: Tuple[User, User, TUVoteInfo]):
    cookies = {"AURSID": user.login(Request(), "testPassword")}
    endpoint = f"/tu/{proposal[2].ID}"
    with client as request:
        response = request.get(endpoint, cookies=cookies,
                               allow_redirects=False)
    assert response.status_code == int(HTTPStatus.SEE_OTHER)
    assert response.headers.get("location") == "/tu"

    with client as request:
        response = request.post(endpoint, cookies=cookies,
                                data={"decision": False},
                                allow_redirects=False)
    assert response.status_code == int(HTTPStatus.SEE_OTHER)
    assert response.headers.get("location") == "/tu"


def test_tu_running_proposal(client: TestClient,
                             proposal: Tuple[User, User, TUVoteInfo]):
    tu_user, user, voteinfo = proposal

    # Initiate an authenticated GET request to /tu/{proposal_id}.
    proposal_id = voteinfo.ID
    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        response = request.get(f"/tu/{proposal_id}", cookies=cookies)
    assert response.status_code == int(HTTPStatus.OK)

    # Alright, now let's continue on to verifying some markup.
    # First, let's verify that the proposal details match.
    root = parse_root(response.text)
    details = root.xpath('//div[@class="proposal details"]')[0]

    vote_running = root.xpath('//p[contains(@class, "vote-running")]')[0]
    assert vote_running.text.strip() == "This vote is still running."

    # Verify User field.
    username = details.xpath(
        './div[contains(@class, "user")]/strong/a/text()')[0]
    assert username.strip() == user.Username

    submitted = details.xpath(
        './div[contains(@class, "submitted")]/text()')[0]
    assert re.match(r'^Submitted: \d{4}-\d{2}-\d{2} \d{2}:\d{2} by$',
                    submitted.strip()) is not None
    submitter = details.xpath('./div[contains(@class, "submitted")]/a')[0]
    assert submitter.text.strip() == tu_user.Username
    assert submitter.attrib["href"] == f"/account/{tu_user.Username}"

    end = details.xpath('./div[contains(@class, "end")]')[0]
    end_label = end.xpath("./text()")[0]
    assert end_label.strip() == "End:"

    end_datetime = end.xpath("./strong/text()")[0]
    assert re.match(r'^\d{4}-\d{2}-\d{2} \d{2}:\d{2}$',
                    end_datetime.strip()) is not None

    # We have not voted yet. Assert that our voting form is shown.
    form = root.xpath('//form[contains(@class, "action-form")]')[0]
    fields = form.xpath("./fieldset")[0]
    buttons = fields.xpath('./button[@name="decision"]')
    assert len(buttons) == 3

    # Check the button names and values.
    yes, no, abstain = buttons

    # Yes
    assert yes.attrib["name"] == "decision"
    assert yes.attrib["value"] == "Yes"

    # No
    assert no.attrib["name"] == "decision"
    assert no.attrib["value"] == "No"

    # Abstain
    assert abstain.attrib["name"] == "decision"
    assert abstain.attrib["value"] == "Abstain"

    # Create a vote.
    with db.begin():
        db.create(TUVote, VoteInfo=voteinfo, User=tu_user)
        voteinfo.ActiveTUs += 1
        voteinfo.Yes += 1

    # Make another request now that we've voted.
    with client as request:
        response = request.get(
            "/tu", params={"id": voteinfo.ID}, cookies=cookies)
    assert response.status_code == int(HTTPStatus.OK)

    # Parse our new root.
    root = parse_root(response.text)

    # Check that we no longer have a voting form.
    form = root.xpath('//form[contains(@class, "action-form")]')
    assert not form

    # Check that we're told we've voted.
    status = root.xpath('//span[contains(@class, "status")]/text()')[0]
    assert status == "You've already voted for this proposal."


def test_tu_ended_proposal(client, proposal):
    tu_user, user, voteinfo = proposal

    ts = time.utcnow()
    with db.begin():
        voteinfo.End = ts - 5  # 5 seconds ago.

    # Initiate an authenticated GET request to /tu/{proposal_id}.
    proposal_id = voteinfo.ID
    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        response = request.get(f"/tu/{proposal_id}", cookies=cookies)
    assert response.status_code == int(HTTPStatus.OK)

    # Alright, now let's continue on to verifying some markup.
    # First, let's verify that the proposal details match.
    root = parse_root(response.text)
    details = root.xpath('//div[@class="proposal details"]')[0]

    vote_running = root.xpath('//p[contains(@class, "vote-running")]')
    assert not vote_running

    result_node = details.xpath('./div[contains(@class, "result")]')[0]
    result_label = result_node.xpath("./text()")[0]
    assert result_label.strip() == "Result:"

    result = result_node.xpath("./span/text()")[0]
    assert result.strip() == "unknown"

    # Check that voting has ended.
    form = root.xpath('//form[contains(@class, "action-form")]')
    assert not form

    # We should see a status about it.
    status = root.xpath('//span[contains(@class, "status")]/text()')[0]
    assert status == "Voting is closed for this proposal."


def test_tu_proposal_vote_not_found(client, tu_user):
    """ Test POST request to a missing vote. """
    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        data = {"decision": "Yes"}
        response = request.post("/tu/1", cookies=cookies,
                                data=data, allow_redirects=False)
    assert response.status_code == int(HTTPStatus.NOT_FOUND)


def test_tu_proposal_vote(client, proposal):
    tu_user, user, voteinfo = proposal

    # Store the current related values.
    yes = voteinfo.Yes
    active_tus = voteinfo.ActiveTUs

    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        data = {"decision": "Yes"}
        response = request.post(f"/tu/{voteinfo.ID}", cookies=cookies,
                                data=data)
    assert response.status_code == int(HTTPStatus.OK)

    # Check that the proposal record got updated.
    assert voteinfo.Yes == yes + 1
    assert voteinfo.ActiveTUs == active_tus + 1

    # Check that the new TUVote exists.
    vote = db.query(TUVote, TUVote.VoteInfo == voteinfo,
                    TUVote.User == tu_user).first()
    assert vote is not None

    root = parse_root(response.text)

    # Check that we're told we've voted.
    status = root.xpath('//span[contains(@class, "status")]/text()')[0]
    assert status == "You've already voted for this proposal."


def test_tu_proposal_vote_unauthorized(
        client: TestClient, proposal: Tuple[User, User, TUVoteInfo]):
    tu_user, user, voteinfo = proposal

    with db.begin():
        tu_user.AccountTypeID = DEVELOPER_ID

    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        data = {"decision": "Yes"}
        response = request.post(f"/tu/{voteinfo.ID}", cookies=cookies,
                                data=data, allow_redirects=False)
    assert response.status_code == int(HTTPStatus.UNAUTHORIZED)

    root = parse_root(response.text)
    status = root.xpath('//span[contains(@class, "status")]/text()')[0]
    assert status == "Only Trusted Users are allowed to vote."

    with client as request:
        data = {"decision": "Yes"}
        response = request.get(f"/tu/{voteinfo.ID}", cookies=cookies,
                               data=data, allow_redirects=False)
    assert response.status_code == int(HTTPStatus.OK)

    root = parse_root(response.text)
    status = root.xpath('//span[contains(@class, "status")]/text()')[0]
    assert status == "Only Trusted Users are allowed to vote."


def test_tu_proposal_vote_cant_self_vote(client, proposal):
    tu_user, user, voteinfo = proposal

    # Update voteinfo.User.
    with db.begin():
        voteinfo.User = tu_user.Username

    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        data = {"decision": "Yes"}
        response = request.post(f"/tu/{voteinfo.ID}", cookies=cookies,
                                data=data, allow_redirects=False)
    assert response.status_code == int(HTTPStatus.BAD_REQUEST)

    root = parse_root(response.text)
    status = root.xpath('//span[contains(@class, "status")]/text()')[0]
    assert status == "You cannot vote in an proposal about you."

    with client as request:
        data = {"decision": "Yes"}
        response = request.get(f"/tu/{voteinfo.ID}", cookies=cookies,
                               data=data, allow_redirects=False)
    assert response.status_code == int(HTTPStatus.OK)

    root = parse_root(response.text)
    status = root.xpath('//span[contains(@class, "status")]/text()')[0]
    assert status == "You cannot vote in an proposal about you."


def test_tu_proposal_vote_already_voted(client, proposal):
    tu_user, user, voteinfo = proposal

    with db.begin():
        db.create(TUVote, VoteInfo=voteinfo, User=tu_user)
        voteinfo.Yes += 1
        voteinfo.ActiveTUs += 1

    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        data = {"decision": "Yes"}
        response = request.post(f"/tu/{voteinfo.ID}", cookies=cookies,
                                data=data, allow_redirects=False)
    assert response.status_code == int(HTTPStatus.BAD_REQUEST)

    root = parse_root(response.text)
    status = root.xpath('//span[contains(@class, "status")]/text()')[0]
    assert status == "You've already voted for this proposal."

    with client as request:
        data = {"decision": "Yes"}
        response = request.get(f"/tu/{voteinfo.ID}", cookies=cookies,
                               data=data, allow_redirects=False)
    assert response.status_code == int(HTTPStatus.OK)

    root = parse_root(response.text)
    status = root.xpath('//span[contains(@class, "status")]/text()')[0]
    assert status == "You've already voted for this proposal."


def test_tu_proposal_vote_invalid_decision(client, proposal):
    tu_user, user, voteinfo = proposal

    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        data = {"decision": "EVIL"}
        response = request.post(f"/tu/{voteinfo.ID}", cookies=cookies,
                                data=data)
    assert response.status_code == int(HTTPStatus.BAD_REQUEST)
    assert response.text == "Invalid 'decision' value."


def test_tu_addvote(client: TestClient, tu_user: User):
    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        response = request.get("/addvote", cookies=cookies)
    assert response.status_code == int(HTTPStatus.OK)


def test_tu_addvote_unauthorized(client: TestClient, user: User,
                                 proposal: Tuple[User, User, TUVoteInfo]):
    cookies = {"AURSID": user.login(Request(), "testPassword")}
    with client as request:
        response = request.get("/addvote", cookies=cookies,
                               allow_redirects=False)
    assert response.status_code == int(HTTPStatus.SEE_OTHER)
    assert response.headers.get("location") == "/tu"

    with client as request:
        response = request.post("/addvote", cookies=cookies,
                                allow_redirects=False)
    assert response.status_code == int(HTTPStatus.SEE_OTHER)
    assert response.headers.get("location") == "/tu"


def test_tu_addvote_invalid_type(client: TestClient, tu_user: User):
    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    with client as request:
        response = request.get("/addvote", params={"type": "faketype"},
                               cookies=cookies)
    assert response.status_code == int(HTTPStatus.OK)

    root = parse_root(response.text)
    error = root.xpath('//*[contains(@class, "error")]/text()')[0]
    assert error.strip() == "Invalid type."


def test_tu_addvote_post(client: TestClient, tu_user: User, user: User):
    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}

    data = {
        "user": user.Username,
        "type": "add_tu",
        "agenda": "Blah"
    }

    with client as request:
        response = request.post("/addvote", cookies=cookies, data=data)
    assert response.status_code == int(HTTPStatus.SEE_OTHER)

    voteinfo = db.query(TUVoteInfo, TUVoteInfo.Agenda == "Blah").first()
    assert voteinfo is not None


def test_tu_addvote_post_cant_duplicate_username(client: TestClient,
                                                 tu_user: User, user: User):
    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}

    data = {
        "user": user.Username,
        "type": "add_tu",
        "agenda": "Blah"
    }

    with client as request:
        response = request.post("/addvote", cookies=cookies, data=data)
    assert response.status_code == int(HTTPStatus.SEE_OTHER)

    voteinfo = db.query(TUVoteInfo, TUVoteInfo.Agenda == "Blah").first()
    assert voteinfo is not None

    with client as request:
        response = request.post("/addvote", cookies=cookies, data=data)
    assert response.status_code == int(HTTPStatus.BAD_REQUEST)


def test_tu_addvote_post_invalid_username(client: TestClient, tu_user: User):
    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    data = {"user": "fakeusername"}
    with client as request:
        response = request.post("/addvote", cookies=cookies, data=data)
    assert response.status_code == int(HTTPStatus.NOT_FOUND)


def test_tu_addvote_post_invalid_type(client: TestClient, tu_user: User,
                                      user: User):
    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    data = {"user": user.Username}
    with client as request:
        response = request.post("/addvote", cookies=cookies, data=data)
    assert response.status_code == int(HTTPStatus.BAD_REQUEST)


def test_tu_addvote_post_invalid_agenda(client: TestClient,
                                        tu_user: User, user: User):
    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    data = {"user": user.Username, "type": "add_tu"}
    with client as request:
        response = request.post("/addvote", cookies=cookies, data=data)
    assert response.status_code == int(HTTPStatus.BAD_REQUEST)


def test_tu_addvote_post_bylaws(client: TestClient, tu_user: User):
    # Bylaws votes do not need a user specified.
    cookies = {"AURSID": tu_user.login(Request(), "testPassword")}
    data = {"type": "bylaws", "agenda": "Blah blah!"}
    with client as request:
        response = request.post("/addvote", cookies=cookies, data=data)
    assert response.status_code == int(HTTPStatus.SEE_OTHER)
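
Nearly every route test above follows the same request pattern: log a fixture user in to obtain an AURSID session cookie, then issue the request through FastAPI's TestClient. A condensed sketch of that pattern, using only calls already exercised in the file above (the helper name itself is illustrative, not part of aurweb):

    # Illustrative helper mirroring the cookie/login pattern used in the tests above.
    from fastapi.testclient import TestClient

    from aurweb.testing.requests import Request

    def get_as(client: TestClient, user, path: str, **params):
        # user.login() returns the AURSID session ID used as the auth cookie.
        cookies = {"AURSID": user.login(Request(), "testPassword")}
        with client as request:
            return request.get(path, cookies=cookies, params=params,
                               allow_redirects=False)

The tests inline this pattern rather than sharing a helper so each case can vary the HTTP method, form data and redirect handling independently.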

test/test_tu_vote.py (new file, 54 lines)
@@ -0,0 +1,54 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db, time
from aurweb.models.account_type import TRUSTED_USER_ID
from aurweb.models.tu_vote import TUVote
from aurweb.models.tu_voteinfo import TUVoteInfo
from aurweb.models.user import User


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         RealName="Test User", Passwd="testPassword",
                         AccountTypeID=TRUSTED_USER_ID)
    yield user


@pytest.fixture
def tu_voteinfo(user: User) -> TUVoteInfo:
    ts = time.utcnow()
    with db.begin():
        tu_voteinfo = db.create(TUVoteInfo, Agenda="Blah blah.",
                                User=user.Username,
                                Submitted=ts, End=ts + 5,
                                Quorum=0.5, Submitter=user)
    yield tu_voteinfo


def test_tu_vote_creation(user: User, tu_voteinfo: TUVoteInfo):
    with db.begin():
        tu_vote = db.create(TUVote, User=user, VoteInfo=tu_voteinfo)

    assert tu_vote.VoteInfo == tu_voteinfo
    assert tu_vote.User == user
    assert tu_vote in user.tu_votes
    assert tu_vote in tu_voteinfo.tu_votes


def test_tu_vote_null_user_raises_exception(tu_voteinfo: TUVoteInfo):
    with pytest.raises(IntegrityError):
        TUVote(VoteInfo=tu_voteinfo)


def test_tu_vote_null_voteinfo_raises_exception(user: User):
    with pytest.raises(IntegrityError):
        TUVote(User=user)

test/test_tu_voteinfo.py (new file, 148 lines)
@@ -0,0 +1,148 @@
import pytest

from sqlalchemy.exc import IntegrityError

from aurweb import db, time
from aurweb.db import create, rollback
from aurweb.models.account_type import TRUSTED_USER_ID
from aurweb.models.tu_voteinfo import TUVoteInfo
from aurweb.models.user import User


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = create(User, Username="test", Email="test@example.org",
                      RealName="Test User", Passwd="testPassword",
                      AccountTypeID=TRUSTED_USER_ID)
    yield user


def test_tu_voteinfo_creation(user: User):
    ts = time.utcnow()
    with db.begin():
        tu_voteinfo = create(TUVoteInfo,
                             Agenda="Blah blah.",
                             User=user.Username,
                             Submitted=ts, End=ts + 5,
                             Quorum=0.5,
                             Submitter=user)
    assert bool(tu_voteinfo.ID)
    assert tu_voteinfo.Agenda == "Blah blah."
    assert tu_voteinfo.User == user.Username
    assert tu_voteinfo.Submitted == ts
    assert tu_voteinfo.End == ts + 5
    assert tu_voteinfo.Quorum == 0.5
    assert tu_voteinfo.Submitter == user
    assert tu_voteinfo.Yes == 0
    assert tu_voteinfo.No == 0
    assert tu_voteinfo.Abstain == 0
    assert tu_voteinfo.ActiveTUs == 0

    assert tu_voteinfo in user.tu_voteinfo_set


def test_tu_voteinfo_is_running(user: User):
    ts = time.utcnow()
    with db.begin():
        tu_voteinfo = create(TUVoteInfo,
                             Agenda="Blah blah.",
                             User=user.Username,
                             Submitted=ts, End=ts + 1000,
                             Quorum=0.5,
                             Submitter=user)
    assert tu_voteinfo.is_running() is True

    with db.begin():
        tu_voteinfo.End = ts - 5
    assert tu_voteinfo.is_running() is False


def test_tu_voteinfo_total_votes(user: User):
    ts = time.utcnow()
    with db.begin():
        tu_voteinfo = create(TUVoteInfo,
                             Agenda="Blah blah.",
                             User=user.Username,
                             Submitted=ts, End=ts + 1000,
                             Quorum=0.5,
                             Submitter=user)

        tu_voteinfo.Yes = 1
        tu_voteinfo.No = 3
        tu_voteinfo.Abstain = 5

    # total_votes() should be the sum of Yes, No and Abstain: 1 + 3 + 5 = 9.
    assert tu_voteinfo.total_votes() == 9


def test_tu_voteinfo_null_submitter_raises(user: User):
    with pytest.raises(IntegrityError):
        with db.begin():
            create(TUVoteInfo,
                   Agenda="Blah blah.",
                   User=user.Username,
                   Submitted=0, End=0,
                   Quorum=0.50)
    rollback()


def test_tu_voteinfo_null_agenda_raises(user: User):
    with pytest.raises(IntegrityError):
        with db.begin():
            create(TUVoteInfo,
                   User=user.Username,
                   Submitted=0, End=0,
                   Quorum=0.50,
                   Submitter=user)
    rollback()


def test_tu_voteinfo_null_user_raises(user: User):
    with pytest.raises(IntegrityError):
        with db.begin():
            create(TUVoteInfo,
                   Agenda="Blah blah.",
                   Submitted=0, End=0,
                   Quorum=0.50,
                   Submitter=user)
    rollback()


def test_tu_voteinfo_null_submitted_raises(user: User):
    with pytest.raises(IntegrityError):
        with db.begin():
            create(TUVoteInfo,
                   Agenda="Blah blah.",
                   User=user.Username,
                   End=0,
                   Quorum=0.50,
                   Submitter=user)
    rollback()


def test_tu_voteinfo_null_end_raises(user: User):
    with pytest.raises(IntegrityError):
        with db.begin():
            create(TUVoteInfo,
                   Agenda="Blah blah.",
                   User=user.Username,
                   Submitted=0,
                   Quorum=0.50,
                   Submitter=user)
    rollback()


def test_tu_voteinfo_null_quorum_default(user: User):
    with db.begin():
        vi = create(TUVoteInfo,
                    Agenda="Blah blah.",
                    User=user.Username,
                    Submitted=0, End=0,
                    Submitter=user)
    assert vi.Quorum == 0

test/test_tuvotereminder.py (new file, 101 lines)
@@ -0,0 +1,101 @@
from typing import Tuple

import pytest

from aurweb import config, db, time
from aurweb.models import TUVote, TUVoteInfo, User
from aurweb.models.account_type import TRUSTED_USER_ID
from aurweb.scripts import tuvotereminder as reminder
from aurweb.testing.email import Email

aur_location = config.get("options", "aur_location")


def create_vote(user: User, voteinfo: TUVoteInfo) -> TUVote:
    with db.begin():
        vote = db.create(TUVote, User=user, VoteID=voteinfo.ID)
    return vote


def create_user(username: str, type_id: int):
    with db.begin():
        user = db.create(User, AccountTypeID=type_id, Username=username,
                         Email=f"{username}@example.org", Passwd=str())
    return user


def email_pieces(voteinfo: TUVoteInfo) -> Tuple[str, str]:
    """
    Return a (subject, content) tuple based on voteinfo.ID

    :param voteinfo: TUVoteInfo instance
    :return: tuple(subject, content)
    """
    subject = f"TU Vote Reminder: Proposal {voteinfo.ID}"
    content = (f"Please remember to cast your vote on proposal {voteinfo.ID} "
               f"[1]. The voting period\nends in less than 48 hours.\n\n"
               f"[1] {aur_location}/tu/?id={voteinfo.ID}")
    return (subject, content)


@pytest.fixture
def user(db_test) -> User:
    yield create_user("test", TRUSTED_USER_ID)


@pytest.fixture
def user2() -> User:
    yield create_user("test2", TRUSTED_USER_ID)


@pytest.fixture
def user3() -> User:
    yield create_user("test3", TRUSTED_USER_ID)


@pytest.fixture
def voteinfo(user: User) -> TUVoteInfo:
    now = time.utcnow()
    start = config.getint("tuvotereminder", "range_start")
    with db.begin():
        voteinfo = db.create(TUVoteInfo, Agenda="Lorem ipsum.",
                             User=user.Username, End=(now + start + 1),
                             Quorum=0.00, Submitter=user, Submitted=0)
    yield voteinfo


def test_tu_vote_reminders(user: User, user2: User, user3: User,
                           voteinfo: TUVoteInfo):
    reminder.main()
    assert Email.count() == 3

    emails = [Email(i).parse() for i in range(1, 4)]
    subject, content = email_pieces(voteinfo)
    expectations = [
        # (to, content)
        (user.Email, subject, content),
        (user2.Email, subject, content),
        (user3.Email, subject, content)
    ]
    for i, element in enumerate(expectations):
        email, subject, content = element
        assert emails[i].headers.get("To") == email
        assert emails[i].headers.get("Subject") == subject
        assert emails[i].body == content


def test_tu_vote_reminders_only_unvoted(user: User, user2: User, user3: User,
                                        voteinfo: TUVoteInfo):
    # Vote with user2 and user3; leaving only user to be notified.
    create_vote(user2, voteinfo)
    create_vote(user3, voteinfo)

    reminder.main()
    assert Email.count() == 1

    email = Email(1).parse()
    assert email.headers.get("To") == user.Email

    subject, content = email_pieces(voteinfo)
    assert email.headers.get("Subject") == subject
    assert email.body == content
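
The reminder tests rely on the aurweb.testing.email.Email capture helper; the only API they exercise is Email.count(), Email(serial).parse(), .headers and .body. A minimal sketch of inspecting the most recently captured mail with just those calls (the helper function below is illustrative, not part of the test suite):

    # Illustration built only on the Email calls used in the tests above.
    from aurweb.testing.email import Email

    def last_captured_email():
        # Serial numbers start at 1; Email.count() is the latest serial.
        assert Email.count() >= 1
        email = Email(Email.count()).parse()
        return email.headers.get("To"), email.headers.get("Subject"), email.body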

test/test_user.py (new file, 316 lines)
@@ -0,0 +1,316 @@
import hashlib
import json

from datetime import datetime, timedelta

import bcrypt
import pytest

import aurweb.auth
import aurweb.config
import aurweb.models.account_type as at

from aurweb import db
from aurweb.auth import creds
from aurweb.models.account_type import DEVELOPER_ID, TRUSTED_USER_AND_DEV_ID, TRUSTED_USER_ID, USER_ID
from aurweb.models.ban import Ban
from aurweb.models.package import Package
from aurweb.models.package_base import PackageBase
from aurweb.models.package_notification import PackageNotification
from aurweb.models.package_vote import PackageVote
from aurweb.models.session import Session
from aurweb.models.ssh_pub_key import SSHPubKey
from aurweb.models.user import User
from aurweb.testing.requests import Request


@pytest.fixture(autouse=True)
def setup(db_test):
    return


def create_user(username: str, account_type_id: int):
    with db.begin():
        user = db.create(User, Username=username,
                         Email=f"{username}@example.org",
                         RealName=username.title(), Passwd="testPassword",
                         AccountTypeID=account_type_id)
    return user


@pytest.fixture
def user() -> User:
    user = create_user("test", USER_ID)
    yield user


@pytest.fixture
def tu_user() -> User:
    user = create_user("test_tu", TRUSTED_USER_ID)
    yield user


@pytest.fixture
def dev_user() -> User:
    user = create_user("test_dev", DEVELOPER_ID)
    yield user


@pytest.fixture
def tu_and_dev_user() -> User:
    user = create_user("test_tu_and_dev", TRUSTED_USER_AND_DEV_ID)
    yield user


@pytest.fixture
def package(user: User) -> Package:
    with db.begin():
        pkgbase = db.create(PackageBase, Name="pkg1", Maintainer=user)
        pkg = db.create(Package, PackageBase=pkgbase, Name=pkgbase.Name)
    yield pkg


def test_user_login_logout(user: User):
    """ Test creating a user and reading its columns. """
    # Assert that make_user created a valid user.
    assert bool(user.ID)

    # Test authentication.
    assert user.valid_password("testPassword")
    assert not user.valid_password("badPassword")

    # Make a raw request.
    request = Request()
    assert not user.login(request, "badPassword")
    assert not user.is_authenticated()

    sid = user.login(request, "testPassword")
    assert sid is not None
    assert user.is_authenticated()

    # Expect that User session relationships work right.
    user_session = db.query(Session,
                            Session.UsersID == user.ID).first()
    assert user_session == user.session
    assert user.session.SessionID == sid
    assert user.session.User == user

    # Search for the user via query API.
    result = db.query(User, User.ID == user.ID).first()

    # Compare the result and our original user.
    assert result == user
    assert result.ID == user.ID
    assert result.AccountType.ID == user.AccountType.ID
    assert result.Username == user.Username
    assert result.Email == user.Email

    # Test result authenticate methods to ensure they work the same.
    assert not result.valid_password("badPassword")
    assert result.valid_password("testPassword")
    assert result.is_authenticated()

    # Test out user string functions.
    assert repr(user) == f"<User(ID='{user.ID}', " + \
        "AccountType='User', Username='test')>"

    # Test logout.
    user.logout(request)
    assert not user.is_authenticated()


def test_user_login_twice(user: User):
    request = Request()
    assert user.login(request, "testPassword")
    assert user.login(request, "testPassword")


def test_user_login_banned(user: User):
    # Add ban for the next 30 seconds.
    banned_timestamp = datetime.utcnow() + timedelta(seconds=30)
    with db.begin():
        db.create(Ban, IPAddress="127.0.0.1", BanTS=banned_timestamp)

    request = Request()
    request.client.host = "127.0.0.1"
    assert not user.login(request, "testPassword")


def test_user_login_suspended(user: User):
    with db.begin():
        user.Suspended = True
    assert not user.login(Request(), "testPassword")


def test_legacy_user_authentication(user: User):
    with db.begin():
        user.Salt = bcrypt.gensalt().decode()
        user.Passwd = hashlib.md5(
            f"{user.Salt}testPassword".encode()
        ).hexdigest()

    assert not user.valid_password("badPassword")
    assert user.valid_password("testPassword")

    # Test by passing a password of None value in.
    assert not user.valid_password(None)


def test_user_login_with_outdated_sid(user: User):
    # Make a session with a LastUpdateTS 5 seconds ago, causing
    # user.login to update it with a new sid.
    with db.begin():
        db.create(Session, UsersID=user.ID, SessionID="stub",
                  LastUpdateTS=datetime.utcnow().timestamp() - 5)
    sid = user.login(Request(), "testPassword")
    assert sid and user.is_authenticated()
    assert sid != "stub"


def test_user_update_password(user: User):
    user.update_password("secondPassword")
    assert not user.valid_password("testPassword")
    assert user.valid_password("secondPassword")


def test_user_minimum_passwd_length():
    passwd_min_len = aurweb.config.getint("options", "passwd_min_len")
    assert User.minimum_passwd_length() == passwd_min_len


def test_user_has_credential(user: User):
    assert not user.has_credential(creds.ACCOUNT_CHANGE_TYPE)


def test_user_ssh_pub_key(user: User):
    assert user.ssh_pub_key is None

    with db.begin():
        ssh_pub_key = db.create(SSHPubKey, UserID=user.ID,
                                Fingerprint="testFingerprint",
                                PubKey="testPubKey")

    assert user.ssh_pub_key == ssh_pub_key


def test_user_credential_types(user: User):
    assert user.AccountTypeID in creds.user_developer_or_trusted_user
    assert user.AccountTypeID not in creds.trusted_user
    assert user.AccountTypeID not in creds.developer
    assert user.AccountTypeID not in creds.trusted_user_or_dev

    with db.begin():
        user.AccountTypeID = at.TRUSTED_USER_ID

    assert user.AccountTypeID in creds.trusted_user
    assert user.AccountTypeID in creds.trusted_user_or_dev

    with db.begin():
        user.AccountTypeID = at.DEVELOPER_ID

    assert user.AccountTypeID in creds.developer
    assert user.AccountTypeID in creds.trusted_user_or_dev

    with db.begin():
        user.AccountTypeID = at.TRUSTED_USER_AND_DEV_ID

    assert user.AccountTypeID in creds.trusted_user
    assert user.AccountTypeID in creds.developer
    assert user.AccountTypeID in creds.trusted_user_or_dev

    # Some model authorization checks.
    assert user.is_elevated()
    assert user.is_trusted_user()
    assert user.is_developer()


def test_user_json(user: User):
    data = json.loads(user.json())
    assert data.get("ID") == user.ID
    assert data.get("Username") == user.Username
    assert data.get("Email") == user.Email
    # .json() converts datetime values to integer timestamps.
    assert isinstance(data.get("RegistrationTS"), int)


def test_user_as_dict(user: User):
    data = user.as_dict()
    assert data.get("ID") == user.ID
    assert data.get("Username") == user.Username
    assert data.get("Email") == user.Email
    # .as_dict() does not convert values to json-capable types.
    assert isinstance(data.get("RegistrationTS"), datetime)


def test_user_is_trusted_user(user: User):
    with db.begin():
        user.AccountTypeID = at.TRUSTED_USER_ID
    assert user.is_trusted_user() is True

    # Do it again with the combined role.
    with db.begin():
        user.AccountTypeID = at.TRUSTED_USER_AND_DEV_ID
    assert user.is_trusted_user() is True


def test_user_is_developer(user: User):
    with db.begin():
        user.AccountTypeID = at.DEVELOPER_ID
    assert user.is_developer() is True

    # Do it again with the combined role.
    with db.begin():
        user.AccountTypeID = at.TRUSTED_USER_AND_DEV_ID
    assert user.is_developer() is True


def test_user_voted_for(user: User, package: Package):
    pkgbase = package.PackageBase
    now = int(datetime.utcnow().timestamp())
    with db.begin():
        db.create(PackageVote, PackageBase=pkgbase, User=user, VoteTS=now)
    assert user.voted_for(package)


def test_user_notified(user: User, package: Package):
    pkgbase = package.PackageBase
    with db.begin():
        db.create(PackageNotification, PackageBase=pkgbase, User=user)
    assert user.notified(package)


def test_user_packages(user: User, package: Package):
    assert package in user.packages()


def test_can_edit_user(user: User, tu_user: User, dev_user: User,
                       tu_and_dev_user: User):
    # User can edit.
    assert user.can_edit_user(user)

    # User cannot edit.
    assert not user.can_edit_user(tu_user)
    assert not user.can_edit_user(dev_user)
    assert not user.can_edit_user(tu_and_dev_user)

    # Trusted User can edit.
    assert tu_user.can_edit_user(user)
    assert tu_user.can_edit_user(tu_user)

    # Trusted User cannot edit.
    assert not tu_user.can_edit_user(dev_user)
    assert not tu_user.can_edit_user(tu_and_dev_user)

    # Developer can edit.
    assert dev_user.can_edit_user(user)
    assert dev_user.can_edit_user(tu_user)
    assert dev_user.can_edit_user(dev_user)

    # Developer cannot edit.
    assert not dev_user.can_edit_user(tu_and_dev_user)

    # Trusted User & Developer can edit.
    assert tu_and_dev_user.can_edit_user(user)
    assert tu_and_dev_user.can_edit_user(tu_user)
    assert tu_and_dev_user.can_edit_user(dev_user)
    assert tu_and_dev_user.can_edit_user(tu_and_dev_user)

test/test_usermaint.py (new file, 65 lines)
@@ -0,0 +1,65 @@
import pytest

from aurweb import db, time
from aurweb.models import User
from aurweb.models.account_type import USER_ID
from aurweb.scripts import usermaint


@pytest.fixture(autouse=True)
def setup(db_test):
    return


@pytest.fixture
def user() -> User:
    with db.begin():
        user = db.create(User, Username="test", Email="test@example.org",
                         Passwd="testPassword", AccountTypeID=USER_ID)
    yield user


def test_usermaint_noop(user: User):
    """ Last[SSH]Login isn't expired in this test: usermaint is noop. """

    now = time.utcnow()
    with db.begin():
        user.LastLoginIPAddress = "127.0.0.1"
        user.LastLogin = now - 10
        user.LastSSHLoginIPAddress = "127.0.0.1"
        user.LastSSHLogin = now - 10

    usermaint.main()

    assert user.LastLoginIPAddress == "127.0.0.1"
    assert user.LastSSHLoginIPAddress == "127.0.0.1"


def test_usermaint(user: User):
    """
    In this case, we first test that only the expired record gets
    updated, but the non-expired record remains untouched. After,
    we update the login time on the non-expired record and exercise
    its code path.
    """

    now = time.utcnow()
    limit_to = now - 86400 * 7
    with db.begin():
        user.LastLoginIPAddress = "127.0.0.1"
        user.LastLogin = limit_to - 666
        user.LastSSHLoginIPAddress = "127.0.0.1"
        user.LastSSHLogin = now - 10

    usermaint.main()

    assert user.LastLoginIPAddress is None
    assert user.LastSSHLoginIPAddress == "127.0.0.1"

    with db.begin():
        user.LastSSHLogin = limit_to - 666

    usermaint.main()

    assert user.LastLoginIPAddress is None
    assert user.LastSSHLoginIPAddress is None

test/test_util.py (new file, 62 lines)
@@ -0,0 +1,62 @@
import json

from http import HTTPStatus

import fastapi
import pytest

from fastapi.responses import JSONResponse

from aurweb import filters, util
from aurweb.testing.requests import Request


def test_round():
    assert filters.do_round(1.3) == 1
    assert filters.do_round(1.5) == 2
    assert filters.do_round(2.0) == 2


def test_git_search():
    """ Test that git_search matches the full commit if necessary. """
    commit_hash = "0123456789abcdef"
    repo = {commit_hash}
    prefixlen = util.git_search(repo, commit_hash)
    assert prefixlen == 16


def test_git_search_double_commit():
    """ Test that git_search matches a shorter prefix length. """
    commit_hash = "0123456789abcdef"
    repo = {commit_hash[:13]}
    # Locate the shortest prefix length that matches commit_hash.
    prefixlen = util.git_search(repo, commit_hash)
    assert prefixlen == 13


@pytest.mark.asyncio
async def test_error_or_result():

    async def route(request: fastapi.Request):
        raise RuntimeError("No response returned.")

    response = await util.error_or_result(route, Request())
    assert response.status_code == HTTPStatus.INTERNAL_SERVER_ERROR

    data = json.loads(response.body)
    assert data.get("error") == "No response returned."

    async def good_route(request: fastapi.Request):
        return JSONResponse()

    response = await util.error_or_result(good_route, Request())
    assert response.status_code == HTTPStatus.OK


def test_valid_homepage():
    assert util.valid_homepage("http://google.com")
    assert util.valid_homepage("https://google.com")
    assert not util.valid_homepage("http://[google.com/broken-ipv6")
    assert not util.valid_homepage("https://[google.com/broken-ipv6")

    assert not util.valid_homepage("gopher://gopher.hprc.utoronto.ca/")