IETF Datatracker


The day-to-day front-end to the IETF database for people who work on IETF standards.

Getting Started

This project follows the standard Git Feature Workflow development model. Learn about all the steps of the development workflow, from creating a fork to submitting a pull request, in the Contributing guide.

Make sure to read the Styleguides section to ensure a cohesive code format across the project.

You can submit bug reports and enhancement or new feature requests in the discussions area. Accepted tickets will be converted to issues.

Creating a Fork

Click the Fork button in the top-right corner of the repository to create a personal copy that you can work on.

Note that some GitHub Actions might be enabled by default in your fork. You should disable them by going to Settings > Actions > General and selecting Disable actions (then Save).

Git Cloning Tips

As outlined in the Contributing guide, you will first want to create a fork of the datatracker project in your personal GitHub account before cloning it.

Windows developers: Start with WSL2 from the beginning.

Because of this project's extensive history, cloning the datatracker repository locally can take a long time and a lot of disk space. You can speed up the cloning process by limiting the history depth, for example (replace USERNAME with your GitHub username):

  • To fetch only up to the 10 latest commits:
    git clone --depth=10 https://github.com/USERNAME/datatracker.git
    
  • To fetch only up to a specific date:
    git clone --shallow-since=DATE https://github.com/USERNAME/datatracker.git
    

The tl;dr to get going

Note that you must have cloned the datatracker code locally first; please read the sections above.

Datatracker development is performed using Docker containers. You will need to be able to run docker (and docker compose) on your machine to develop effectively. It is possible to get a purely native install working, but it is very complicated and typically takes a first-time datatracker developer a full day of setup, whereas the docker setup completes in a few minutes.

Many developers use VS Code and take advantage of its ability to start a project in a set of containers. If you are using VS Code, simply open your clone in VS Code and choose Reopen in Container.

If VS Code is not available to you, in your clone, type cd docker; ./run

Once the containers are started, run the tests to make sure your checkout is a good place to start from (all tests should pass - if any fail, ask for help at tools-help@). Inside the app container's shell type:

ietf/manage.py test --settings=settings_test

Note that we recently moved the datatracker onto PostgreSQL - you may still find older documentation that suggests testing with settings_sqlitetest. That will no longer work.

For a more detailed description of getting going, see docker/README.md.

Overview of the datatracker models

An initial walkthrough of the datatracker models was prepared for the IAB AID workshop.

Docker Dev Environment

In order to simplify and reduce the time required for setup, a preconfigured docker environment is available.

Read the Docker Dev Environment guide to get started.

Database & Assets

Nightly database dumps of the datatracker are available as Docker images: ghcr.io/ietf-tools/datatracker-db:latest

Note that to update the database in your dev environment to the latest version, you should run the docker/cleandb script.

Blob storage for dev/test

The dev and test environments use minio to provide local blob storage. See the settings files for how the app container communicates with the blobstore container. If you need to work with minio directly from outside the containers (to interact with its API or console), use docker compose from the top-level directory of your clone to expose it at an ephemeral port.

$ docker compose port blobstore 9001
0.0.0.0:<some ephemeral port>

$ curl -I http://localhost:<some ephemeral port>
HTTP/1.1 200 OK
...

The minio container exposes the minio API at port 9000 and the minio console at port 9001.
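
If you want to exercise the blobstore from Python (for example, from a shell in the app container), a minimal boto3 sketch looks like the following. The endpoint combines the compose service name with the minio API port noted above; the access keys are just minio's defaults and are an assumption here, so check the dev settings files for the values actually in use.

import boto3

client = boto3.client(
    "s3",
    endpoint_url="http://blobstore:9000",  # compose service name + minio API port
    aws_access_key_id="minioadmin",        # assumed default dev credential
    aws_secret_access_key="minioadmin",    # assumed default dev credential
)
print(client.list_buckets()["Buckets"])    # list the buckets the app has created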

Frontend Development

Intro

We now use yarn to manage assets for the Datatracker, and vite/parcel to package them. yarn maintains its node packages under the .yarn directory.

The datatracker uses 2 different build systems, depending on the use case:

  • Vite for Vue 3 pages / components
  • Parcel for legacy pages / jQuery

Vite (Vue 3)

Pages will gradually be updated to Vue 3 components. These components are located under the /client directory.

Each Vue 3 app has its own sub-directory. For example, the agenda app is located under /client/agenda.

The datatracker makes use of the Django-Vite plugin to point to either the Vite.js server or the precompiled production files. The DJANGO_VITE_DEV_MODE flag, found in the ietf/settings_local.py file, determines whether the Vite.js server is used.
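
For example, a minimal ietf/settings_local.py entry could look like the line below (the value shown is illustrative; set it according to whether you want to use the Vite.js dev server):

DJANGO_VITE_DEV_MODE = True  # set to False to use the precompiled production files instead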

In development mode, you must start the Vite.js development server, in addition to the usual Datatracker server:

yarn dev

Any changes made to the files under /client will automatically trigger a hot-reload of the modified components.

To generate production assets, run the build command:

yarn build

This will create packages under ietf/static/dist-neue, which are then served by the Django development server, and which must be uploaded to the CDN.

Parcel (Legacy/jQuery)

The Datatracker builds these legacy packages from the various Javascript and CSS files in ietf/static/js and ietf/static/css respectively, bundled using Parcel. Static images are likewise located in ietf/static/images.

Whenever changes are made to the files under ietf/static, you must re-run the build command to package them:

yarn legacy:build

This will create packages under ietf/static/dist/ietf, which are then served by the Django development server, and which must be uploaded to the CDN.

Bootstrap

The "new" datatracker uses Twitter Bootstrap for the UI.

Get familiar with https://getbootstrap.com/getting-started/ and use those UI elements, CSS classes, etc. instead of cooking up your own.

Some ground rules:

  • Think hard before tweaking the bootstrap CSS; it will make it harder to upgrade to future releases.
  • No <style> tags in the HTML! Put CSS into the "morecss" block of a template instead.
  • CSS that is used by multiple templates goes into static/css/ietf.css or a new CSS file.
  • Javascript that is only used on one template goes into the "js" block of that template.
  • Javascript that is used by multiple templates goes into static/js/ietf.js or a new js file.
  • Avoid CSS, HTML styling or Javascript in the python code!

Serving Static Files via CDN

Production Mode

If resources served over a CDN and/or with a high max-age don't have different URLs for different versions, then any component upgrade which is accompanied by a change in template functionality will have a long transition time during which the new pages are served with old components, with possible breakage. We want to avoid this.

The intention is that after a release has been checked out, but before it is deployed, the standard django collectstatic management command will be run, resulting in all static files being collected from their working directory location and placed in an appropriate location for serving via CDN. This location will have the datatracker release version as part of its URL, so that after the deployment of a new release, the CDN will be forced to fetch the appropriate static files for that release.

An important part of this is to set up the STATIC_ROOT and STATIC_URL settings appropriately. In 6.4.0, the settings are as follows in production mode:

STATIC_URL = "https://www.ietf.org/lib/dt/%s/"%__version__
STATIC_ROOT = CDN_ROOT + "/a/www/www6s/lib/dt/%s/"%__version__

The result is that all static files collected via the collectstatic command will be placed in a location served via CDN, with the release version being part of the URL.

Development Mode

In development mode, STATIC_URL is set to /static/, and Django's staticfiles infrastructure makes the static files available under that local URL root (unless you set settings.SERVE_CDN_FILES_LOCALLY_IN_DEV_MODE to False). It is not necessary to actually populate the static/ directory by running collectstatic in order for static files to be served when running ietf/manage.py runserver -- the runserver command has extra support for finding and serving static files without running collectstatic.
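
As a sketch, the development-mode behavior described above corresponds to settings along these lines (assumed values inferred from the description; see the settings files for the authoritative configuration):

STATIC_URL = "/static/"
SERVE_CDN_FILES_LOCALLY_IN_DEV_MODE = True  # set to False to disable local serving of these files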

In order to work backwards from a file served in development mode to the location from which it is served, the mapping is as follows:

Development URL                 Working copy location
localhost:8000/static/ietf/*    ietf/static/ietf/*
localhost:8000/static/secr/*    ietf/secr/static/secr/*

Handling of External Javascript and CSS Components

In order to make it easy to keep track of and upgrade external components, these are now handled by a tool called yarn via the configuration in package.json.

To add a new package, simply run (replace <package-name> with the NPM module name):

yarn add <package-name>

Handling of Internal Static Files

Prior to this release, internal static files were located under static/, mixed together with the external components. They are now located under ietf/static/ietf/ and ietf/secr/static/secr, and will be collected for serving via CDN by the collectstatic command. Any static files associated with a particular app will be handled the same way (which means that all admin/ static files will automatically be handled correctly, too).

Changes to Template Files

In order to make the template files refer to the correct versioned CDN URL (as given by the STATIC_URL root), all references to static files in the templates have been updated to use the static template tag. This automatically results in references to the correct versioned URL in production mode and to the simpler /static/ URLs in development mode.

Deployment

During deployment, it is now necessary to run the management command:

ietf/manage.py collectstatic

before activating a new release.

Running Tests

Python Tests

From a datatracker container, run the command:

./ietf/manage.py test --settings=settings_test

You can limit the run to specific tests using the --pattern argument.

Frontend Tests

Frontend tests are done via Playwright. There are 2 different types of tests:

  • Tests that test Vue pages / components and run natively without any external dependency.
  • Tests that require a running datatracker instance to test against (usually legacy views).

Make sure you have Node.js 16.x or later installed on your machine.

Run Vue Tests

⚠️ All commands below MUST be run from the ./playwright directory, unless noted otherwise.

  1. Run once to install dependencies on your system:

    npm install
    npm run install-deps
    
  2. Run in a separate process, from the project root directory:

    yarn preview
    
  3. Run the tests, in one of these 3 modes, from the ./playwright directory:

    3.1 To run the tests headlessly (command line mode):

    npm test
    

    3.2 To run the tests visually (CANNOT run in docker):

    npm run test:visual
    

    3.3 To run the tests in debug mode (CANNOT run in docker):

    npm run test:debug
    

Run Legacy Views Tests

First, you need to start a datatracker instance (dev or prod), ideally from a docker container, exposing port 8000.

⚠️ All commands below MUST be run from the ./playwright directory.

  1. Run once to install dependencies on your system:

    npm install
    npm run install-deps

  2. Run the tests headlessly (command line mode):

    npm run test:legacy

Diff Tool

To compare 2 different datatracker instances and look for differences, read the diff tool instructions.