fix(FastAPI): Improve sqlite testing speed

This commit adds a new Arch dependency, `libeatmydata`, which
provides the `eatmydata` executable that stubs out fsync() calls.
We now use `eatmydata` to run our sharness and pytest suites in Docker.
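
For context, `eatmydata` works by LD_PRELOAD-ing a small library that turns
fsync()/fdatasync()-style calls into no-ops for the wrapped process. A rough
sketch of the idea in the Arch-based test image (the pytest invocation below
is illustrative, not taken from this change):

```sh
# Install the Arch package that ships the eatmydata wrapper.
pacman -S --noconfirm libeatmydata

# Wrap a test command; the LD_PRELOAD shim makes fsync()/fdatasync()
# no-ops in the child process, so SQLite never waits on the disk.
eatmydata -- python -m pytest test/
```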

With `autocommit=True`, which SQLAlchemy requires to keep the
session up to date with external DB modifications, the SQLite backend
issues a large number of fsync calls, especially because we wipe
and recreate records in every DB-bound test.
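
One way to confirm that the sync calls are the bottleneck (a quick sketch,
not part of this change; the test target is illustrative) is to compare
syscall counts with and without the wrapper:

```sh
# Count fsync/fdatasync syscalls during a SQLite-backed test run.
strace -f -c -e trace=fsync,fdatasync python -m pytest test/ >/dev/null

# The same run under eatmydata: the calls are stubbed out in libc
# before reaching the kernel, so the counts drop to (near) zero.
strace -f -c -e trace=fsync,fdatasync eatmydata -- python -m pytest test/ >/dev/null
```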

**Before:**

- mysql: 1m42s (elapsed during pytest run)
- sqlite: 3m06s (elapsed during pytest run)

**After:**

- mysql: 1m40s (elapsed during pytest run)
- sqlite: 1m50s (elapsed during pytest run)

Shout out to @klausenbusk, who suggested this as a possible fix,
and it was. Thanks, Kristian!

Closes #120

Signed-off-by: Kevin Morris <kevr@0cost.org>

Author: Kevin Morris <kevr@0cost.org>
Date: 2021-10-03 15:11:42 -07:00
Commit: 7bfc2bf9b4 (parent: b5f8e69b8a)
GPG key ID: F7E46DED420788F3 (no known key found for this signature in database)
4 changed files with 13 additions and 3 deletions


```diff
@@ -4,4 +4,4 @@ set -eou pipefail
 # Initialize the new database; ignore errors.
 python -m aurweb.initdb 2>/dev/null || /bin/true
-make -C test sh
+eatmydata -- make -C test sh
```