==============================================================================
                               IETF Datatracker
==============================================================================

------------------------------------------------------------------------------
                          Installation Instructions
------------------------------------------------------------------------------


General Instructions for Deployment of a New Release
====================================================

 1. Make a directory to hold the new release::

      sudo su - -s /bin/bash wwwrun
      mkdir /a/www/ietf-datatracker/${releasenumber}
      cd /a/www/ietf-datatracker/${releasenumber}

 2. Fetch the release tarball from GitHub
    (see https://github.com/ietf-tools/datatracker/releases)::

      wget https://github.com/ietf-tools/datatracker/releases/download/${releasenumber}/release.tar.gz
      tar xzvf release.tar.gz

 3. Copy ietf/settings_local.py from the previous release::

      cp ../web/ietf/settings_local.py ietf/

 4. Set up a new virtual environment and install the requirements::

      python3.9 -m venv env
      source env/bin/activate
      pip install -r requirements.txt
      pip freeze > frozen-requirements.txt

    (The pip freeze command records the exact versions of the Python
    libraries that pip installed. The resulting frozen-requirements.txt is
    used by the celery docker container to ensure it uses the same library
    versions as the datatracker service.)

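    As an illustration (not a step to run here), the celery container can
    then install exactly the recorded versions with a command along these
    lines; the precise invocation in the container's init scripts may
    differ::

      pip install -r frozen-requirements.txt
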
 5. Move static files into place for CDN (/a/www/www6s/lib/dt)::

      ietf/manage.py collectstatic

 6. Run the system checks (this also applies patches to the just-installed
    modules)::

      ietf/manage.py check

 7. Switch to the docker directory and update the images, tagging the
    current images as fallbacks in case the update has to be reverted
    (see step 15)::

      cd /a/docker/datatracker-cel
      docker image tag ghcr.io/ietf-tools/datatracker-celery:latest datatracker-celery-fallback
      docker image tag ghcr.io/ietf-tools/datatracker-mq:latest datatracker-mq-fallback
      docker-compose pull

 8. Stop and remove the async task containers::

      docker-compose down

    Wait for this to finish cleanly. It may take up to about 10 minutes to
    complete if a long-running task is in progress.

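    The container states can be watched while waiting (an informal check,
    not a required step)::

      docker-compose ps
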
 9. Stop the datatracker
    (consider doing this from a second shell on ietfa to avoid the exit
    from and return to the wwwrun account)::

      exit
      sudo systemctl stop datatracker.socket datatracker.service
      sudo su - -s /bin/bash wwwrun

 10. Return to the release directory and run migrations::

       cd /a/www/ietf-datatracker/${releasenumber}
       ietf/manage.py migrate

     Take note of whether any migrations were executed; this determines
     what is needed if the install has to be reverted (see step 15).

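     If it is unclear which migrations have been applied, Django's
     showmigrations command lists them, marking applied migrations with an
     '[X]' (an optional check, not part of the official procedure)::

       ietf/manage.py showmigrations
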
 11. Back out one directory level, then re-point the 'web' symlink::

       cd ..
       rm ./web; ln -s ${releasenumber} web

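     A quick optional check that the symlink now points at the new
     release::

       ls -ld web
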
 12. Start the datatracker service (it is no longer necessary to restart
     apache)::

       exit    # or CTRL-D, back to the root level shell
       sudo systemctl start datatracker.service datatracker.socket

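     The service state can be confirmed afterwards (an optional check)::

       sudo systemctl status datatracker.service datatracker.socket
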
 13. Start the async task worker and message broker::

       cd /a/docker/datatracker-cel
       bash startcommand

 14. Verify operation::

       http://datatracker.ietf.org/

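     A quick scripted alternative (an informal check, not part of the
     official procedure; assumes the server answers plain HTTP as above)::

       curl -sSI http://datatracker.ietf.org/ | head -n 1
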
 15. If the install failed and there were no migrations at step 10, revert
     the 'web' symlink and the docker update, then repeat the restarts in
     steps 12 and 13. To revert the docker update::

       cd /a/docker/datatracker-cel
       docker-compose down
       docker image rm ghcr.io/ietf-tools/datatracker-celery:latest ghcr.io/ietf-tools/datatracker-mq:latest
       docker image tag datatracker-celery-fallback ghcr.io/ietf-tools/datatracker-celery:latest
       docker image tag datatracker-mq-fallback ghcr.io/ietf-tools/datatracker-mq:latest
       cd -

     If there were migrations at step 10, they will need to be reversed
     before the restart at step 12. If it is not obvious how to reverse the
     migrations, contact the dev team.

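     Reversing a migration generally means migrating each affected app back
     to the last migration of the previous release. As a generic Django
     illustration only (the app label and migration name are placeholders,
     not values from an actual release)::

       ietf/manage.py migrate <app_label> <previous_migration_name>
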

Patching a Production Release
=============================

Sometimes it can prove necessary to patch an existing release.
The following process should be used:

 1. Code and test the patch on a copy of the release with any
    previously applied patches put in place.

 2. Produce a patch file, named with date and subject::

      $ git diff > 2013-03-25-ballot-calculation.patch

 3. Move the patch file to the production server, and place it in
    '/a/www/ietf-datatracker/patches/'.

 4. Make a recursive copy of the production code to a new directory, named
    with a patch number::

      /a/www/ietf-datatracker $ rsync -a web/ ${releasenumber}.p1/

 5. Apply the patch::

      /a/www/ietf-datatracker $ cd ${releasenumber}.p1/
      /a/www/ietf-datatracker/${releasenumber}.p1 $ patch -p1 \
          < ../patches/2013-03-25-ballot-calculation.patch

    This must not produce any messages about failing to apply any chunks;
    if it does, go back to step 1 and figure out why.

 6. Edit ``.../ietf/__init__.py`` in the new patched release to indicate
    the patch version in the ``__patch__`` string.

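    For example, the first patch of a release might be indicated like this
    (illustrative only; follow the format already present in the file)::

      __patch__ = ".p1"
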
 7. Stop the async task container (this may take a few minutes if tasks
    are in progress)::

      cd /a/docker/datatracker-cel
      docker-compose stop celery

 8. Change the 'web' symlink, reload etc. as described in
    `General Instructions for Deployment of a New Release`_.

 9. Start the async task worker::

      cd /a/docker/datatracker-cel
      bash startcommand