ci: merge main to release (#8407)

Robert Sparks, 2025-01-09 13:39:50 -06:00, committed by GitHub
commit 2c04f367b4 (GPG key ID: B5690EEEBB952194)
25 changed files with 163 additions and 324 deletions

@@ -1,157 +0,0 @@
==============================================================================
IETF Datatracker
==============================================================================
------------------------------------------------------------------------------
Installation Instructions
------------------------------------------------------------------------------
General Instructions for Deployment of a New Release
====================================================
0. Prepare to hold different roles at different stages of the instructions below.
You will need to be root, wwwrun, and some user in group docker.
Consider using separate shells for the wwwrun and other roles. These instructions
are written assuming you will only use one shell.
1. Make a directory to hold the new release as wwwrun::
sudo su - -s /bin/bash wwwrun
mkdir /a/www/ietf-datatracker/${releasenumber}
cd /a/www/ietf-datatracker/${releasenumber}
2. Fetch the release tarball from GitHub
(see https://github.com/ietf-tools/datatracker/releases)::
wget https://github.com/ietf-tools/datatracker/releases/download/${releasenumber}/release.tar.gz
tar xzvf release.tar.gz
3. Copy ietf/settings_local.py from the previous release::
cp ../web/ietf/settings_local.py ietf/
4. Set up a new virtual environment and install requirements::
python3.9 -mvenv env
source env/bin/activate
pip install -r requirements.txt
pip freeze > frozen-requirements.txt
(The pip freeze command records the exact versions of the Python libraries that pip installed.
This is used by the celery docker container to ensure it uses the same library versions as
the datatracker service.)
5. Move static files into place for the CDN (/a/www/www6s/lib/dt)::
ietf/manage.py collectstatic
6. Run the system checks (which also patch the just-installed modules)::
ietf/manage.py check
7. Switch to the docker directory and update images as a user in group docker::
exit
cd /a/docker/datatracker
docker image tag ghcr.io/ietf-tools/datatracker-celery:latest datatracker-celery-fallback
docker image tag ghcr.io/ietf-tools/datatracker-mq:latest datatracker-mq-fallback
docker-compose pull
8. Stop and remove the async task containers::
docker-compose down
Wait for this to finish cleanly. Usually it takes only a few seconds, but the 'down' command
may take up to about 10 minutes to complete if a long-running task is in progress.
9. Stop the datatracker and remove the 'web' symlink so cron and other applications
don't run code in the older deployment::
sudo systemctl stop datatracker.socket datatracker.service
rm /a/www/ietf-datatracker/web
10. Return to the release directory and run migrations as wwwrun::
sudo su - -s /bin/bash wwwrun
cd /a/www/ietf-datatracker/${releasenumber}
ietf/manage.py migrate
Take note of whether any migrations were executed; this matters for the rollback in step 15.
11. Back out one directory level, then re-point the 'web' symlink::
cd ..
ln -s ${releasenumber} web
12. Start the datatracker service (it is no longer necessary to restart apache)::
exit
sudo systemctl start datatracker.service datatracker.socket
13. Start the async task worker and message broker::
cd /a/docker/datatracker
bash startcommand
14. Verify operation:
http://datatracker.ietf.org/
15. If the install failed and there were no migrations at step 10, revert the 'web' symlink
and the docker update, then repeat the restarts in steps 12 and 13. To revert the docker update::
cd /a/docker/datatracker
docker-compose down
docker image rm ghcr.io/ietf-tools/datatracker-celery:latest ghcr.io/ietf-tools/datatracker-mq:latest
docker image tag datatracker-celery-fallback ghcr.io/ietf-tools/datatracker-celery:latest
docker image tag datatracker-mq-fallback ghcr.io/ietf-tools/datatracker-mq:latest
cd -
If there were migrations at step 10, they will need to be reversed before the restart at step 12.
If it's not obvious how to reverse the migrations, contact the dev team.
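
Reversing a migration means migrating each affected app back to its last pre-release
migration (ietf/manage.py migrate <app_label> <target_migration>). The equivalent from
Python, as a minimal sketch with placeholder names::

    # Sketch only: "doc" and "0001_placeholder" are placeholders, not
    # values from this release. Run from the release directory as wwwrun.
    import os
    import django

    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "ietf.settings")
    django.setup()

    from django.core.management import call_command
    call_command("migrate", "doc", "0001_placeholder")
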
Patching a Production Release
=============================
Sometimes it can prove necessary to patch an existing release.
The following process should be used:
1. Code and test the patch on a copy of the release, with any
previously applied patches in place.
2. Produce a patch file, named with date and subject::
$ git diff > 2013-03-25-ballot-calculation.patch
3. Move the patch file to the production server, and place it in
'/a/www/ietf-datatracker/patches/'
4. Make a recursive copy of the production code to a new directory named with a patch number::
/a/www/ietf-datatracker $ rsync -a web/ ${releasenumber}.p1/
5. Apply the patch::
/a/www/ietf-datatracker $ cd ${releasenumber}.p1/
/a/www/ietf-datatracker/${releasenumber}.p1 $ patch -p1 \
< ../patches/2013-03-25-ballot-calculation.patch
This must not produce any messages about failing to apply any hunks;
if it does, go back to step 1 and figure out why.
6. Edit ``.../ietf/__init__.py`` in the new patched release to indicate the patch
version in the ``__patch__`` string.
7. Stop the async task containers (this may take a few minutes if tasks are in progress)::
cd /a/docker/datatracker
docker-compose down
8. Change the 'web' symlink, reload etc. as described in
`General Instructions for Deployment of a New Release`_.
9. Start the async task worker::
cd /a/docker/datatracker
bash startcommand
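
A final sanity check after the switch is to confirm the deployed tree reports the expected
patch version (a sketch; assumes __version__ and __patch__ are both defined in
ietf/__init__.py, per step 6)::

    # Run with the release's virtual environment active.
    import ietf
    # getattr guards are used because only __patch__ is named in step 6;
    # __version__ is an assumption.
    print(getattr(ietf, "__version__", "?"), getattr(ietf, "__patch__", "?"))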

@@ -64,6 +64,7 @@ INTERNET_ALL_DRAFTS_ARCHIVE_DIR = '/assets/archive/id'
BIBXML_BASE_PATH = '/assets/ietfdata/derived/bibxml'
IDSUBMIT_REPOSITORY_PATH = INTERNET_DRAFT_PATH
FTP_DIR = '/assets/ftp'
NFS_METRICS_TMP_DIR = '/assets/tmp'
NOMCOM_PUBLIC_KEYS_DIR = 'data/nomcom_keys/public_keys/'
SLIDE_STAGING_PATH = '/test/staging/'

@@ -60,6 +60,7 @@ INTERNET_DRAFT_ARCHIVE_DIR = '/assets/collection/draft-archive'
INTERNET_ALL_DRAFTS_ARCHIVE_DIR = '/assets/ietf-ftp/internet-drafts/'
BIBXML_BASE_PATH = '/assets/ietfdata/derived/bibxml'
FTP_DIR = '/assets/ftp'
NFS_METRICS_TMP_DIR = '/assets/tmp'
NOMCOM_PUBLIC_KEYS_DIR = 'data/nomcom_keys/public_keys/'
SLIDE_STAGING_PATH = 'test/staging/'

@@ -59,6 +59,7 @@ INTERNET_DRAFT_ARCHIVE_DIR = '/assets/collection/draft-archive'
INTERNET_ALL_DRAFTS_ARCHIVE_DIR = '/assets/ietf-ftp/internet-drafts/'
BIBXML_BASE_PATH = '/assets/ietfdata/derived/bibxml'
FTP_DIR = '/assets/ftp'
NFS_METRICS_TMP_DIR = '/assets/tmp'
NOMCOM_PUBLIC_KEYS_DIR = 'data/nomcom_keys/public_keys/'
SLIDE_STAGING_PATH = 'test/staging/'

@@ -50,6 +50,7 @@ INTERNET_ALL_DRAFTS_ARCHIVE_DIR = '/assets/archive/id'
BIBXML_BASE_PATH = '/assets/ietfdata/derived/bibxml'
IDSUBMIT_REPOSITORY_PATH = INTERNET_DRAFT_PATH
FTP_DIR = '/assets/ftp'
NFS_METRICS_TMP_DIR = '/assets/tmp'
NOMCOM_PUBLIC_KEYS_DIR = 'data/nomcom_keys/public_keys/'
SLIDE_STAGING_PATH = '/assets/www6s/staging/'

@@ -29,6 +29,7 @@ for sub in \
/assets/www6/iesg \
/assets/www6/iesg/evaluation \
/assets/media/photo \
/assets/tmp \
/assets/ftp \
/assets/ftp/charter \
/assets/ftp/internet-drafts \

@@ -970,6 +970,14 @@ class CustomApiTests(TestCase):
self.assertEqual(jsondata['success'], True)
self.client.logout()
@override_settings(APP_API_TOKENS={"ietf.api.views.nfs_metrics": ["valid-token"]})
def test_api_nfs_metrics(self):
url = urlreverse("ietf.api.views.nfs_metrics")
r = self.client.get(url)
self.assertEqual(r.status_code, 403)
r = self.client.get(url, headers={"X-Api-Key": "valid-token"})
self.assertContains(r, 'nfs_latency_seconds{operation="write"}')
def test_api_get_session_matherials_no_agenda_meeting_url(self):
meeting = MeetingFactory(type_id='ietf')
session = SessionFactory(meeting=meeting)

@@ -82,6 +82,8 @@ urlpatterns = [
url(r'^version/?$', api_views.version),
# Application authentication API key
url(r'^appauth/(?P<app>authortools|bibxml)$', api_views.app_auth),
# NFS metrics endpoint
url(r'^metrics/nfs/?$', api_views.nfs_metrics),
# latest versions
url(r'^rfcdiff-latest-json/%(name)s(?:-%(rev)s)?(\.txt|\.html)?/?$' % settings.URL_REGEXPS, api_views.rfcdiff_latest_json),
url(r'^rfcdiff-latest-json/(?P<name>[Rr][Ff][Cc] [0-9]+?)(\.txt|\.html)?/?$', api_views.rfcdiff_latest_json),

@@ -3,7 +3,10 @@
import base64
import binascii
import datetime
import json
from pathlib import Path
from tempfile import NamedTemporaryFile
import jsonschema
import pytz
import re
@@ -264,7 +267,22 @@ def app_auth(request, app: Literal["authortools", "bibxml"]):
json.dumps({'success': True}),
content_type='application/json')
@requires_api_token
@csrf_exempt
def nfs_metrics(request):
with NamedTemporaryFile(dir=settings.NFS_METRICS_TMP_DIR,delete=False) as fp:
fp.close()
mark = datetime.datetime.now()
with open(fp.name, mode="w") as f:
f.write("whyioughta"*1024)
write_latency = (datetime.datetime.now() - mark).total_seconds()
mark = datetime.datetime.now()
with open(fp.name, "r") as f:
_=f.read()
read_latency = (datetime.datetime.now() - mark).total_seconds()
Path(f.name).unlink()
response=f'nfs_latency_seconds{{operation="write"}} {write_latency}\nnfs_latency_seconds{{operation="read"}} {read_latency}\n'
return HttpResponse(response)
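
Given the metrics/nfs route added in urls.py above and the token handling exercised in the
tests, scraping the new endpoint looks roughly like this (a sketch: the host name, the /api/
mount point, and the token value are assumptions; real tokens come from APP_API_TOKENS):

    # Sketch of a client for the new NFS metrics endpoint.
    import urllib.request

    req = urllib.request.Request(
        "https://datatracker.example.org/api/metrics/nfs/",  # assumed mount point
        headers={"X-Api-Key": "valid-token"},  # assumed token value
    )
    with urllib.request.urlopen(req) as resp:
        # Expect Prometheus-style lines such as:
        #   nfs_latency_seconds{operation="write"} 0.0123
        #   nfs_latency_seconds{operation="read"} 0.0004
        print(resp.read().decode())
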
def find_doc_for_rfcdiff(name, rev):
"""rfcdiff lookup heuristics

@@ -13,10 +13,10 @@ from pathlib import Path
from typing import List, Optional # pyflakes:ignore
from ietf.doc.utils import new_state_change_event, update_action_holders
from ietf.doc.utils import update_action_holders
from ietf.utils import log
from ietf.utils.mail import send_mail
from ietf.doc.models import Document, DocEvent, State, StateDocEvent
from ietf.doc.models import Document, DocEvent, State
from ietf.person.models import Person
from ietf.meeting.models import Meeting
from ietf.mailtrigger.utils import gather_address_lists
@@ -213,11 +213,11 @@ def clean_up_draft_files():
def move_file_to(subdir):
# Similar to move_draft_files_to_archive
# ghostlinkd would keep this in the combined all archive since it would
# be sourced from a different place. But when ghostlinkd is removed, nothing
# new is needed here - the file will already exist in the combined archive
shutil.move(path,
os.path.join(settings.INTERNET_DRAFT_ARCHIVE_DIR, subdir, basename))
mark = Path(settings.FTP_DIR) / "internet-drafts" / basename
if mark.exists():
mark.unlink()
try:
doc = Document.objects.get(name=filename, rev=revision)
@@ -235,41 +235,3 @@ def clean_up_draft_files():
# All uses of this past 2014 seem related to major system failures.
move_file_to("unknown_ids")
def repair_dead_on_expire():
by = Person.objects.get(name="(System)")
id_exists = State.objects.get(type="draft-iesg", slug="idexists")
dead = State.objects.get(type="draft-iesg", slug="dead")
dead_drafts = Document.objects.filter(
states__type="draft-iesg", states__slug="dead", type_id="draft"
)
for d in dead_drafts:
dead_event = d.latest_event(
StateDocEvent, state_type="draft-iesg", state__slug="dead"
)
if dead_event is not None:
if d.docevent_set.filter(type="expired_document").exists():
closest_expiry = min(
[
abs(e.time - dead_event.time)
for e in d.docevent_set.filter(type="expired_document")
]
)
if closest_expiry.total_seconds() < 60:
d.set_state(id_exists)
events = []
e = DocEvent(
doc=d,
rev=d.rev,
type="added_comment",
by=by,
desc="IESG Dead state was set due only to document expiry - changing IESG state to ID-Exists",
)
e.skip_community_list_notification = True
e.save()
events.append(e)
e = new_state_change_event(d, by, dead, id_exists)
e.skip_community_list_notification = True
e.save()
events.append(e)
d.save_with_history(events)

@@ -18,7 +18,6 @@ from .expire import (
in_draft_expire_freeze,
get_expired_drafts,
expirable_drafts,
repair_dead_on_expire,
send_expire_notice_for_draft,
expire_draft,
clean_up_draft_files,
@@ -62,11 +61,6 @@ def expire_ids_task():
raise
@shared_task
def repair_dead_on_expire_task():
repair_dead_on_expire()
@shared_task
def notify_expirations_task(notify_days=14):
for doc in get_soon_to_expire_drafts(notify_days):

@@ -4,6 +4,7 @@
import io
import os
from pathlib import Path
from pyquery import PyQuery
from textwrap import wrap
@@ -387,7 +388,7 @@ class ConflictReviewTests(TestCase):
class ConflictReviewSubmitTests(TestCase):
settings_temp_path_overrides = TestCase.settings_temp_path_overrides + ['CONFLICT_REVIEW_PATH']
settings_temp_path_overrides = TestCase.settings_temp_path_overrides + ['CONFLICT_REVIEW_PATH','FTP_PATH']
def test_initial_submission(self):
doc = Document.objects.get(name='conflict-review-imaginary-irtf-submission')
url = urlreverse('ietf.doc.views_conflict_review.submit',kwargs=dict(name=doc.name))
@@ -403,9 +404,15 @@ class ConflictReviewSubmitTests(TestCase):
# Right now, nothing to test - we let people put whatever the web browser will let them put into that textbox
# sane post using textbox
path = os.path.join(settings.CONFLICT_REVIEW_PATH, '%s-%s.txt' % (doc.name, doc.rev))
basename = f"{doc.name}-{doc.rev}.txt"
path = Path(settings.CONFLICT_REVIEW_PATH) / basename
ftp_dir = Path(settings.FTP_DIR) / "conflict-reviews"
if not ftp_dir.exists():
ftp_dir.mkdir()
ftp_path = ftp_dir / basename
self.assertEqual(doc.rev,'00')
self.assertFalse(os.path.exists(path))
self.assertFalse(path.exists())
self.assertFalse(ftp_path.exists())
r = self.client.post(url,dict(content="Some initial review text\n",submit_response="1"))
self.assertEqual(r.status_code,302)
doc = Document.objects.get(name='conflict-review-imaginary-irtf-submission')
@@ -413,6 +420,7 @@ class ConflictReviewSubmitTests(TestCase):
with io.open(path) as f:
self.assertEqual(f.read(),"Some initial review text\n")
f.close()
self.assertTrue(ftp_path.exists())
self.assertTrue( "submission-00" in doc.latest_event(NewRevisionDocEvent).desc)
def test_subsequent_submission(self):
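
The FTP_PATH/FTP_DIR entries added to settings_temp_path_overrides here and below rely on
the test harness pointing each listed setting at a fresh temporary directory. A rough
equivalent of that mechanism (an assumption about ietf.utils.test_utils.TestCase, not its
actual code):

    import tempfile
    from django.test import override_settings

    def temp_path_overrides(*names):
        # Point each named path setting at a fresh temp dir for the test,
        # which is why Path(settings.FTP_DIR) exists inside these tests.
        return override_settings(**{name: tempfile.mkdtemp() for name in names})

    # e.g. @temp_path_overrides('CONFLICT_REVIEW_PATH', 'FTP_DIR') on a test class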

@@ -19,10 +19,10 @@ from django.utils.html import escape
import debug # pyflakes:ignore
from ietf.doc.expire import expirable_drafts, get_expired_drafts, repair_dead_on_expire, send_expire_notice_for_draft, expire_draft
from ietf.doc.factories import EditorialDraftFactory, IndividualDraftFactory, StateDocEventFactory, WgDraftFactory, RgDraftFactory, DocEventFactory
from ietf.doc.expire import expirable_drafts, get_expired_drafts, send_expire_notice_for_draft, expire_draft
from ietf.doc.factories import EditorialDraftFactory, IndividualDraftFactory, WgDraftFactory, RgDraftFactory, DocEventFactory
from ietf.doc.models import ( Document, DocReminder, DocEvent,
ConsensusDocEvent, LastCallDocEvent, RelatedDocument, State, StateDocEvent, TelechatDocEvent,
ConsensusDocEvent, LastCallDocEvent, RelatedDocument, State, TelechatDocEvent,
WriteupDocEvent, DocRelationshipName, IanaExpertDocEvent )
from ietf.doc.utils import get_tags_for_stream_id, create_ballot_if_not_open
from ietf.doc.views_draft import AdoptDraftForm
@@ -36,7 +36,7 @@ from ietf.iesg.models import TelechatDate
from ietf.utils.test_utils import login_testing_unauthorized
from ietf.utils.mail import outbox, empty_outbox, get_payload_text
from ietf.utils.test_utils import TestCase
from ietf.utils.timezone import date_today, datetime_today, datetime_from_date, DEADLINE_TZINFO
from ietf.utils.timezone import date_today, datetime_from_date, DEADLINE_TZINFO
class ChangeStateTests(TestCase):
@@ -845,77 +845,6 @@ class ExpireIDsTests(DraftFileMixin, TestCase):
self.assertTrue(not os.path.exists(os.path.join(settings.INTERNET_DRAFT_PATH, txt)))
self.assertTrue(os.path.exists(os.path.join(settings.INTERNET_DRAFT_ARCHIVE_DIR, txt)))
@mock.patch("ietf.community.signals.notify_of_event")
def test_repair_dead_on_expire(self, mock_notify):
# Create a draft in iesg idexists - ensure it doesn't get new docevents.
# Create a draft in iesg dead with no expires within the window - ensure it doesn't get new docevents and its state doesn't change.
# Create a draft in iesg dead with an expiry in the window - ensure it gets the right doc events, iesg state changes, draft state doesn't change.
last_year = datetime_today() - datetime.timedelta(days=365)
not_dead = WgDraftFactory(name="draft-not-dead")
not_dead_event_count = not_dead.docevent_set.count()
dead_not_from_expires = WgDraftFactory(name="draft-dead-not-from-expiring")
dead_not_from_expires.set_state(
State.objects.get(type="draft-iesg", slug="dead")
)
StateDocEventFactory(
doc=dead_not_from_expires, state=("draft-iesg", "dead"), time=last_year
)
DocEventFactory(
doc=dead_not_from_expires,
type="expired_document",
time=last_year + datetime.timedelta(days=1),
)
dead_not_from_expires_event_count = dead_not_from_expires.docevent_set.count()
dead_from_expires = []
dead_from_expires_event_count = dict()
for delta in [-5, 5]:
d = WgDraftFactory(
name=f"draft-dead-from-expiring-just-{'before' if delta<0 else 'after'}"
)
d.set_state(State.objects.get(type="draft-iesg", slug="dead"))
StateDocEventFactory(doc=d, state=("draft-iesg", "dead"), time=last_year)
DocEventFactory(
doc=d,
type="expired_document",
time=last_year + datetime.timedelta(seconds=delta),
)
dead_from_expires.append(d)
dead_from_expires_event_count[d] = d.docevent_set.count()
notified_during_factory_work = mock_notify.call_count
for call_args in mock_notify.call_args_list:
e = call_args.args[0]
self.assertTrue(isinstance(e,DocEvent))
self.assertFalse(hasattr(e,"skip_community_list_notification"))
repair_dead_on_expire()
self.assertEqual(not_dead.docevent_set.count(), not_dead_event_count)
self.assertEqual(
dead_not_from_expires.docevent_set.count(),
dead_not_from_expires_event_count,
)
for d in dead_from_expires:
self.assertEqual(
d.docevent_set.count(), dead_from_expires_event_count[d] + 2
)
self.assertIn(
"due only to document expiry", d.latest_event(type="added_comment").desc
)
self.assertEqual(
d.latest_event(StateDocEvent).desc,
"IESG state changed to <b>I-D Exists</b> from Dead",
)
self.assertEqual(mock_notify.call_count, 4+notified_during_factory_work)
for call_args in mock_notify.call_args_list[-4:]:
e = call_args.args[0]
self.assertTrue(isinstance(e,DocEvent))
self.assertTrue(hasattr(e,"skip_community_list_notification"))
self.assertTrue(e.skip_community_list_notification)
class ExpireLastCallTests(TestCase):
def test_expire_last_call(self):

@@ -28,7 +28,7 @@ from ietf.utils.test_utils import TestCase, login_testing_unauthorized
class GroupMaterialTests(TestCase):
settings_temp_path_overrides = TestCase.settings_temp_path_overrides + ['AGENDA_PATH']
settings_temp_path_overrides = TestCase.settings_temp_path_overrides + ['AGENDA_PATH', 'FTP_DIR']
def setUp(self):
super().setUp()
self.materials_dir = self.tempdir("materials")
@@ -37,6 +37,10 @@ class GroupMaterialTests(TestCase):
self.slides_dir.mkdir()
self.saved_document_path_pattern = settings.DOCUMENT_PATH_PATTERN
settings.DOCUMENT_PATH_PATTERN = self.materials_dir + "/{doc.type_id}/"
self.assertTrue(Path(settings.FTP_DIR).exists())
ftp_slides_dir = Path(settings.FTP_DIR) / "slides"
if not ftp_slides_dir.exists():
ftp_slides_dir.mkdir()
self.meeting_slides_dir = Path(settings.AGENDA_PATH) / "42" / "slides"
if not self.meeting_slides_dir.exists():
@@ -112,7 +116,12 @@ class GroupMaterialTests(TestCase):
self.assertEqual(doc.title, "Test File - with fancy title")
self.assertEqual(doc.get_state_slug(), "active")
with io.open(os.path.join(self.materials_dir, "slides", doc.name + "-" + doc.rev + ".pdf")) as f:
basename=f"{doc.name}-{doc.rev}.pdf"
filepath=Path(self.materials_dir) / "slides" / basename
with filepath.open() as f:
self.assertEqual(f.read(), content)
ftp_filepath=Path(settings.FTP_DIR) / "slides" / basename
with ftp_filepath.open() as f:
self.assertEqual(f.read(), content)
# check that posting same name is prevented

@@ -4,6 +4,7 @@
import io
import os
from pathlib import Path
import debug # pyflakes:ignore
@@ -540,7 +541,7 @@ class StatusChangeTests(TestCase):
DocumentFactory(type_id='statchg',name='status-change-imaginary-mid-review',notify='notify@example.org')
class StatusChangeSubmitTests(TestCase):
settings_temp_path_overrides = TestCase.settings_temp_path_overrides + ['STATUS_CHANGE_PATH']
settings_temp_path_overrides = TestCase.settings_temp_path_overrides + ['STATUS_CHANGE_PATH', 'FTP_PATH']
def test_initial_submission(self):
doc = Document.objects.get(name='status-change-imaginary-mid-review')
url = urlreverse('ietf.doc.views_status_change.submit',kwargs=dict(name=doc.name))
@@ -556,14 +557,19 @@ class StatusChangeSubmitTests(TestCase):
# Right now, nothing to test - we let people put whatever the web browser will let them put into that textbox
# sane post using textbox
path = os.path.join(settings.STATUS_CHANGE_PATH, '%s-%s.txt' % (doc.name, doc.rev))
self.assertEqual(doc.rev,'00')
self.assertFalse(os.path.exists(path))
basename = f"{doc.name}-{doc.rev}.txt"
filepath = Path(settings.STATUS_CHANGE_PATH) / basename
ftp_filepath = Path(settings.FTP_DIR) / "status-changes" / basename
self.assertFalse(filepath.exists())
self.assertFalse(ftp_filepath.exists())
r = self.client.post(url,dict(content="Some initial review text\n",submit_response="1"))
self.assertEqual(r.status_code,302)
doc = Document.objects.get(name='status-change-imaginary-mid-review')
self.assertEqual(doc.rev,'00')
with io.open(path) as f:
with filepath.open() as f:
self.assertEqual(f.read(),"Some initial review text\n")
with ftp_filepath.open() as f:
self.assertEqual(f.read(),"Some initial review text\n")
self.assertTrue( "mid-review-00" in doc.latest_event(NewRevisionDocEvent).desc)
@@ -628,3 +634,6 @@ class StatusChangeSubmitTests(TestCase):
def setUp(self):
super().setUp()
DocumentFactory(type_id='statchg',name='status-change-imaginary-mid-review',notify='notify@example.org')
ftp_subdir=Path(settings.FTP_DIR)/"status-changes"
if not ftp_subdir.exists():
ftp_subdir.mkdir()

@@ -21,7 +21,6 @@ from .tasks import (
generate_idnits2_rfcs_obsoleted_task,
generate_idnits2_rfc_status_task,
notify_expirations_task,
repair_dead_on_expire_task,
)
class TaskTests(TestCase):
@@ -99,10 +98,6 @@ class TaskTests(TestCase):
self.assertEqual(mock_expire.call_args_list[1], mock.call(docs[1]))
self.assertEqual(mock_expire.call_args_list[2], mock.call(docs[2]))
@mock.patch("ietf.doc.tasks.repair_dead_on_expire")
def test_repair_dead_on_expire_task(self, mock_repair):
repair_dead_on_expire_task()
self.assertEqual(mock_repair.call_count, 1)
class Idnits2SupportTests(TestCase):
settings_temp_path_overrides = TestCase.settings_temp_path_overrides + ['DERIVED_DIR']

@@ -112,10 +112,10 @@ def fix_charter_revision_after_approval(charter, by):
)
try:
os.link(new, ftp_filepath)
except IOError:
except IOError as ex:
log(
"There was an error creating a harlink at %s pointing to %s"
% (ftp_filepath, new)
"There was an error creating a hardlink at %s pointing to %s: %s"
% (ftp_filepath, new, ex)
)
events = []

@@ -5,6 +5,7 @@
import datetime
import io
import os
from pathlib import Path
from django import forms
from django.shortcuts import render, get_object_or_404, redirect
@@ -181,12 +182,21 @@ class UploadForm(forms.Form):
return get_cleaned_text_file_content(self.cleaned_data["txt"])
def save(self, review):
filename = os.path.join(settings.CONFLICT_REVIEW_PATH, '%s-%s.txt' % (review.name, review.rev))
with io.open(filename, 'w', encoding='utf-8') as destination:
basename = f"{review.name}-{review.rev}.txt"
filepath = Path(settings.CONFLICT_REVIEW_PATH) / basename
with filepath.open('w', encoding='utf-8') as destination:
if self.cleaned_data['txt']:
destination.write(self.cleaned_data['txt'])
else:
destination.write(self.cleaned_data['content'])
ftp_filepath = Path(settings.FTP_DIR) / "conflict-reviews" / basename
try:
os.link(filepath, ftp_filepath) # Path.hardlink_to is not available until 3.10
except IOError as e:
log.log(
"There was an error creating a hardlink at %s pointing to %s: %s"
% (ftp_filepath, filepath, e)
)
#This is very close to submit on charter - can we get better reuse?
@role_required('Area Director','Secretariat')
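
The hardlink-then-log pattern above recurs in the charter, group materials, status change,
and yang checker changes in this commit, so one answer to the reuse question in the comment
above is a shared helper along these lines (a sketch, not code from this commit):

    import os
    from pathlib import Path

    from django.conf import settings
    from ietf.utils.log import log

    def hardlink_to_ftp(filepath, subdir):
        # Hard-link filepath into FTP_DIR/<subdir>/, logging instead of raising,
        # mirroring the per-view blocks added in this commit.
        filepath = Path(filepath)
        ftp_filepath = Path(settings.FTP_DIR) / subdir / filepath.name
        try:
            os.link(filepath, ftp_filepath)  # Path.hardlink_to is not available until 3.10
        except IOError as ex:
            log(
                "There was an error creating a hardlink at %s pointing to %s: %s"
                % (ftp_filepath, filepath, ex)
            )

Each call site would then reduce to hardlink_to_ftp(filepath, "conflict-reviews") and the like.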

@@ -3,8 +3,8 @@
# views for managing group materials (slides, ...)
import io
import os
from pathlib import Path
import re
from django import forms
@@ -162,9 +162,21 @@ def edit_material(request, name=None, acronym=None, action=None, doc_type=None):
f = form.cleaned_data["material"]
file_ext = os.path.splitext(f.name)[1]
with io.open(os.path.join(doc.get_file_path(), doc.name + "-" + doc.rev + file_ext), 'wb+') as dest:
basename = f"{doc.name}-{doc.rev}{file_ext}" # Note the lack of a . before file_ext - see os.path.splitext
filepath = Path(doc.get_file_path()) / basename
with filepath.open('wb+') as dest:
for chunk in f.chunks():
dest.write(chunk)
if not doc.meeting_related():
log.assertion('doc.type_id == "slides"')
ftp_filepath = Path(settings.FTP_DIR) / doc.type_id / basename
try:
os.link(filepath, ftp_filepath) # Path.hardlink is not available until 3.10
except IOError as ex:
log.log(
"There was an error creating a hardlink at %s pointing to %s: %s"
% (ftp_filepath, filepath, ex)
)
if prev_rev != doc.rev:
e = NewRevisionDocEvent(type="new_revision", doc=doc, rev=doc.rev)

@@ -5,6 +5,7 @@
import datetime
import io
import os
from pathlib import Path
import re
from typing import Dict # pyflakes:ignore
@@ -33,6 +34,7 @@ from ietf.ietfauth.utils import has_role, role_required
from ietf.mailtrigger.utils import gather_address_lists
from ietf.name.models import DocRelationshipName, StdLevelName
from ietf.person.models import Person
from ietf.utils.log import log
from ietf.utils.mail import send_mail_preformatted
from ietf.utils.textupload import get_cleaned_text_file_content
from ietf.utils.timezone import date_today, DEADLINE_TZINFO
@@ -154,12 +156,21 @@ class UploadForm(forms.Form):
return get_cleaned_text_file_content(self.cleaned_data["txt"])
def save(self, doc):
filename = os.path.join(settings.STATUS_CHANGE_PATH, '%s-%s.txt' % (doc.name, doc.rev))
with io.open(filename, 'w', encoding='utf-8') as destination:
if self.cleaned_data['txt']:
destination.write(self.cleaned_data['txt'])
else:
destination.write(self.cleaned_data['content'])
basename = f"{doc.name}-{doc.rev}.txt"
filename = Path(settings.STATUS_CHANGE_PATH) / basename
with io.open(filename, 'w', encoding='utf-8') as destination:
if self.cleaned_data['txt']:
destination.write(self.cleaned_data['txt'])
else:
destination.write(self.cleaned_data['content'])
try:
ftp_filename = Path(settings.FTP_DIR) / "status-changes" / basename
os.link(filename, ftp_filename) # Path.hardlink is not available until 3.10
except IOError as ex:
log(
"There was an error creating a hardlink at %s pointing to %s: %s"
% (ftp_filename, filename, ex)
)
#This is very close to submit on charter - can we get better reuse?
@role_required('Area Director','Secretariat')

@@ -43,23 +43,28 @@ def generate_wg_charters_files_task():
encoding="utf8",
)
charter_copy_dest = getattr(settings, "CHARTER_COPY_PATH", None)
if charter_copy_dest is not None:
if not Path(charter_copy_dest).is_dir():
log.log(
f"Error copying 1wg-charter files to {charter_copy_dest}: it does not exist or is not a directory"
)
else:
try:
shutil.copy2(charters_file, charter_copy_dest)
except IOError as err:
log.log(f"Error copying {charters_file} to {charter_copy_dest}: {err}")
try:
shutil.copy2(charters_by_acronym_file, charter_copy_dest)
except IOError as err:
charter_copy_dests = [
getattr(settings, "CHARTER_COPY_PATH", None),
getattr(settings, "CHARTER_COPY_OTHER_PATH", None),
getattr(settings, "CHARTER_COPY_THIRD_PATH", None),
]
for charter_copy_dest in charter_copy_dests:
if charter_copy_dest is not None:
if not Path(charter_copy_dest).is_dir():
log.log(
f"Error copying {charters_by_acronym_file} to {charter_copy_dest}: {err}"
f"Error copying 1wg-charter files to {charter_copy_dest}: it does not exist or is not a directory"
)
else:
try:
shutil.copy2(charters_file, charter_copy_dest)
except IOError as err:
log.log(f"Error copying {charters_file} to {charter_copy_dest}: {err}")
try:
shutil.copy2(charters_by_acronym_file, charter_copy_dest)
except IOError as err:
log.log(
f"Error copying {charters_by_acronym_file} to {charter_copy_dest}: {err}"
)
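
The widened copy loop above could equivalently skip unset destinations up front; a compact
variant (a sketch, not code from this commit):

    import shutil
    from pathlib import Path

    from django.conf import settings
    from ietf.utils import log

    def copy_charter_files(files):
        names = ("CHARTER_COPY_PATH", "CHARTER_COPY_OTHER_PATH", "CHARTER_COPY_THIRD_PATH")
        # Iterate only over destinations that are actually configured.
        for dest in filter(None, (getattr(settings, n, None) for n in names)):
            if not Path(dest).is_dir():
                log.log(f"Error copying 1wg-charter files to {dest}: it does not exist or is not a directory")
                continue
            for f in files:
                try:
                    shutil.copy2(f, dest)
                except IOError as err:
                    log.log(f"Error copying {f} to {dest}: {err}")
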
@shared_task

@@ -62,6 +62,8 @@ class GroupPagesTests(TestCase):
settings_temp_path_overrides = TestCase.settings_temp_path_overrides + [
"CHARTER_PATH",
"CHARTER_COPY_PATH",
"CHARTER_COPY_OTHER_PATH", # Note: not explicitly testing use of
"CHARTER_COPY_THIRD_PATH", # either of these settings
"GROUP_SUMMARY_PATH",
]

@@ -744,6 +744,8 @@ INTERNET_DRAFT_PDF_PATH = '/a/www/ietf-datatracker/pdf/'
RFC_PATH = '/a/www/ietf-ftp/rfc/'
CHARTER_PATH = '/a/ietfdata/doc/charter/'
CHARTER_COPY_PATH = '/a/www/ietf-ftp/ietf' # copy 1wg-charters files here if set
CHARTER_COPY_OTHER_PATH = '/a/www/ftp/ietf'
CHARTER_COPY_THIRD_PATH = '/a/www/ftp/charter'
GROUP_SUMMARY_PATH = '/a/www/ietf-ftp/ietf'
BOFREQ_PATH = '/a/ietfdata/doc/bofreq/'
CONFLICT_REVIEW_PATH = '/a/ietfdata/doc/conflict-review'
@@ -759,6 +761,7 @@ MEETING_RECORDINGS_DIR = '/a/www/audio'
DERIVED_DIR = '/a/ietfdata/derived'
FTP_DIR = '/a/ftp'
ALL_ID_DOWNLOAD_DIR = '/a/www/www6s/download'
NFS_METRICS_TMP_DIR = '/a/tmp'
DOCUMENT_FORMAT_ALLOWLIST = ["txt", "ps", "pdf", "xml", "html", ]

@@ -4,6 +4,7 @@
import io
import os
from pathlib import Path
import re
import shutil
import sys
@@ -280,6 +281,15 @@ class DraftYangChecker(object):
dest = os.path.join(settings.SUBMIT_YANG_DRAFT_MODEL_DIR, model)
shutil.move(path, dest)
ftp_dest = Path(settings.FTP_DIR) / "yang" / "draftmod" / model
try:
os.link(dest, ftp_dest)
except IOError as ex:
log(
"There was an error creating a hardlink at %s pointing to %s: %s"
% (ftp_dest, dest, ex)
)
# summary result
results.append({

@@ -8,6 +8,7 @@ import json
import os
import pathlib
import re
import subprocess
import sys
import time
import traceback
@@ -1596,3 +1597,6 @@ def populate_yang_model_dirs():
modfile.unlink()
except UnicodeDecodeError as e:
log.log(f"Error processing {item.name}: {e}")
ftp_moddir = Path(settings.FTP_DIR) / "yang" / "draftmod"
subprocess.call(("/usr/bin/rsync", "-aq", "--delete", moddir, ftp_moddir))