Merge 7.45.1.dev0 into Bootstrap 5 update branch. Made a first pass at reconciling differences.

- Legacy-Id: 19945
This commit is contained in:
Jennifer Richards 2022-02-17 20:09:49 +00:00
commit 6c260a5b7e
162 changed files with 10405 additions and 1441 deletions


@@ -58,18 +58,19 @@
// Add the IDs of extensions you want installed when the container is created.
"extensions": [
"ms-python.python",
"ms-python.vscode-pylance",
"ms-azuretools.vscode-docker",
"editorconfig.editorconfig",
"redhat.vscode-yaml",
"visualstudioexptteam.vscodeintellicode",
"batisteo.vscode-django",
"mutantdino.resourcemonitor",
"spmeesseman.vscode-taskexplorer",
"mtxr.sqltools",
"mtxr.sqltools-driver-mysql"
],
"ms-python.python",
"ms-python.vscode-pylance",
"ms-azuretools.vscode-docker",
"editorconfig.editorconfig",
"redhat.vscode-yaml",
"visualstudioexptteam.vscodeintellicode",
"batisteo.vscode-django",
"mutantdino.resourcemonitor",
"spmeesseman.vscode-taskexplorer",
"mtxr.sqltools",
"mtxr.sqltools-driver-mysql",
"mrmlnc.vscode-duplicate"
],
// Use 'forwardPorts' to make a list of ports inside the container available locally.
"forwardPorts": [8000, 3306],


@@ -4,7 +4,8 @@ services:
app:
environment:
EDITOR_VSCODE: 1
DJANGO_SETTINGS_MODULE: settings_local_sqlitetest
volumes:
- ..:/root/src:cached
- ..:/root/src
# Runs app on the same network as the database container, allows "forwardPorts" in devcontainer.json function.
network_mode: service:db


@@ -5,6 +5,8 @@
root = true
# Settings for IETF datatracker
# ---------------------------------------------------------
# PEP8 Style
[*]
indent_style = space
@@ -14,3 +16,39 @@ charset = utf-8
# to avoid tripping Henrik's commit hook:
trim_trailing_whitespace = false
insert_final_newline = false
# Settings for .github folder
# ---------------------------------------------------------
# GitHub Markdown Style
[.github/**]
indent_style = space
indent_size = 2
end_of_line = lf
charset = utf-8
trim_trailing_whitespace = false
insert_final_newline = true
# Settings for client-side JS / Vue files
# ---------------------------------------------------------
# StandardJS Style
[client/**]
indent_style = space
indent_size = 2
end_of_line = lf
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true
# Settings for cypress tests
# ---------------------------------------------------------
# StandardJS Style
[cypress/**]
indent_style = space
indent_size = 2
end_of_line = lf
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true

.github/workflows/dev-db-nightly.yml

@@ -0,0 +1,43 @@
# GITHUB ACTIONS - WORKFLOW
# Build the database dev docker image with the latest database dump every night
# so that developers don't have to manually build it themselves.
name: Nightly Dev DB Image
# Controls when the workflow will run
on:
schedule:
- cron: '0 0 * * *'
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
env:
REGISTRY: ghcr.io
IMAGE_NAME: datatracker-db
jobs:
build:
runs-on: ubuntu-latest
if: ${{ github.ref == 'refs/heads/main' }}
permissions:
contents: read
packages: write
steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- uses: actions/checkout@v2
- name: Get Current Date as Tag
id: date
run: echo "::set-output name=date::$(date +'%Y%m%d')"
- name: Docker Build & Push Action
uses: mr-smithers-excellent/docker-build-push@v5.6
with:
image: ${{ env.IMAGE_NAME }}
tags: nightly-${{ steps.date.outputs.date }}, latest
registry: ${{ env.REGISTRY }}
dockerfile: docker/db.Dockerfile
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}

.gitignore

@@ -54,3 +54,4 @@
/unix.tag
*.pyc
__pycache__
node_modules

.vscode/tasks.json

@@ -43,7 +43,7 @@
"problemMatcher": []
},
{
"label": "Run JS Tests",
"label": "Run JS Tests (python)",
"type": "shell",
"command": "/usr/local/bin/python",
"args": [
@@ -117,7 +117,7 @@
"type": "shell",
"command": "/bin/bash",
"args": [
"${workspaceFolder}/docker/app-win32-timezone-fix.sh"
"${workspaceFolder}/docker/scripts/app-win32-timezone-fix.sh"
],
"presentation": {
"echo": true,
@@ -128,6 +128,24 @@
"clear": false
},
"problemMatcher": []
},
{
"label": "Run JS Tests (cypress)",
"type": "shell",
"command": "/bin/bash",
"args": [
"${workspaceFolder}/docker/scripts/app-cypress.sh"
],
"group": "test",
"presentation": {
"echo": true,
"reveal": "always",
"focus": true,
"panel": "new",
"showReuseMessage": false,
"clear": false
},
"problemMatcher": []
}
]
}

CODE_OF_CONDUCT.md

@@ -0,0 +1,33 @@
# Code of Conduct
This is a reminder of IETF policies in effect on various topics such as patents
or code of conduct. It is only meant to point you in the right direction.
Exceptions may apply. The IETF's patent policy and the definition of an IETF
"contribution" and "participation" are set forth in
[BCP 79](https://www.rfc-editor.org/info/bcp79); please read it carefully.
As a reminder:
* By participating in the IETF, you agree to follow IETF processes and
policies.
* If you are aware that any IETF contribution is covered by patents or patent
applications that are owned or controlled by you or your sponsor, you must
disclose that fact, or not participate in the discussion.
* As a participant in or attendee to any IETF activity you acknowledge that
written, audio, video, and photographic records of meetings may be made public.
* Personal information that you provide to IETF will be handled in accordance
with the IETF Privacy Statement.
* As a participant or attendee, you agree to work respectfully with other
participants; please contact the
[ombudsteam](https://www.ietf.org/contact/ombudsteam/) if you have questions
or concerns about this.
Definitive information is in the documents listed below and other IETF BCPs.
For advice, please talk to WG chairs or ADs:
* [BCP 9 (Internet Standards Process)](https://www.rfc-editor.org/info/bcp9)
* [BCP 25 (Working Group processes)](https://www.rfc-editor.org/info/bcp25)
* [BCP 25 (Anti-Harassment Procedures)](https://www.rfc-editor.org/info/bcp25)
* [BCP 54 (Code of Conduct)](https://www.rfc-editor.org/info/bcp54)
* [BCP 78 (Copyright)](https://www.rfc-editor.org/info/bcp78)
* [BCP 79 (Patents, Participation)](https://www.rfc-editor.org/info/bcp79)
* [Privacy Policy](https://www.ietf.org/privacy-policy/)

CONTRIBUTING.md

@@ -0,0 +1,310 @@
# Contributing to Datatracker
:+1::tada: First off, thanks for taking the time to contribute! :tada::+1:
Before going any further, make sure you read the [code of conduct](CODE_OF_CONDUCT.md).
#### Table Of Contents
- [Workflow Overview](#workflow-overview)
- [Creating a Fork](#creating-a-fork)
- [Cloning a Fork](#cloning-a-fork)
- [Using Git Command Line](#using-git-command-line)
- [Using GitHub Desktop / GitKraken](#using-github-desktop--gitkraken)
- [Using GitHub CLI](#using-github-cli)
- [Create a Local Branch](#create-a-local-branch)
- [Creating a Commit](#creating-a-commit)
- [From your editor / GUI tool](#from-your-editor--gui-tool)
- [From the command line](#from-the-command-line)
- [Push Commits](#push-commits)
- [Create a Pull Request](#create-a-pull-request)
- [Sync your Fork](#sync-your-fork)
- [Syncing with uncommitted changes](#syncing-with-uncommitted-changes)
- [Styleguides](#styleguides)
- [Git Commit Messages](#git-commit-messages)
- [JavaScript](#javascript)
- [Python](#python)
## Workflow Overview
The datatracker project uses the **Git Feature Workflow with Develop Branch** model.
It consists of two primary branches:
**Main** - The main branch always reflects a production-ready state. Any push to this branch will trigger a deployment to production. Developers never push code directly to this branch.
**Develop** - The develop branch contains the latest development changes for the next release. This is where new commits are merged.
A typical development workflow:
1. First, [create a fork](#creating-a-fork) of the repository and then [clone the fork](#cloning-a-fork) to your local machine.
2. [Create a new branch](#create-a-local-branch), based on the develop branch, for the feature / fix you are to work on.
3. [Add one or more commits](#creating-a-commit) to this feature/fix branch.
4. [Push the commits](#push-commits) to the remote fork.
5. [Create a pull request (PR)](#create-a-pull-request) to request your feature branch from your fork to be merged to the source repository `develop` branch.
6. The PR is reviewed by the lead developer / other developers, automated tests / checks are run to catch any errors and if accepted, the PR is merged with the `develop` branch.
7. [Fast-forward (sync)](#sync-your-fork) your forked develop branch to include the latest changes made by all developers.
8. Repeat this workflow from step 2.
![](media/docs/workflow-diagram.jpg)
## Creating a Fork
As a general rule, work is never done directly on the datatracker repository. You instead [create a fork](https://docs.github.com/en/get-started/quickstart/fork-a-repo) of the project. Creating a "fork" is producing a personal copy of the datatracker project. Forks act as a sort of bridge between the original repository and your personal copy.
1. Navigate to https://github.com/ietf-tools/datatracker
2. Click the **Fork** button. *You may be prompted to select where the fork should be created, in which case you should select your personal GitHub account.*
![](media/docs/fork-button.jpg)
Your personal fork contains all the branches / contents of the original repository as it was at the exact moment you created the fork. You are free to create new branches or modify existing ones on your personal fork, as it won't affect the original repository.
Note that forks live on GitHub and not locally on your personal machine. To get a copy locally, we need to clone the fork...
## Cloning a Fork
Right now, you have a fork of the datatracker repository, but you don't have the files in that repository locally on your computer.
After forking the datatracker repository, you should have landed on your personal forked copy. If that's not the case, make sure you are on the fork (e.g. `john-doe/datatracker` and not the original repository `ietf-tools/datatracker`).
Above the list of files, click the **Code** button. A clone dialog will appear.
![](media/docs/code-button.png)
There are several ways to clone a repository, depending on your personal preferences. Let's go through them...
> :triangular_flag_on_post: In all cases, you must have **git** already installed on your system.
- [Using Git Command Line](#using-git-command-line)
- [Using GitHub Desktop / GitKraken](#using-github-desktop--gitkraken)
- [Using GitHub CLI](#using-github-cli)
### Using Git Command Line
1. Copy the URL in the **Clone with HTTPS** dialog.
2. In a terminal window, navigate to where you want to work. Subfolders will be created for each project you clone. **DO NOT** create empty folders for projects to be cloned. This is done automatically by git.
3. Type `git clone` and then paste the URL you just copied, e.g.:
```sh
git clone https://github.com/YOUR-USERNAME/datatracker
```
4. Press **Enter**. Your local clone will be created in a subfolder named `datatracker`.
### Using GitHub Desktop / GitKraken
There are several GUI tools which simplify your interaction with git:
- [GitHub Desktop](https://desktop.github.com/) *(macOS / Windows)*
- [GitKraken](https://www.gitkraken.com/) *(Linux / macOS / Windows)*
- [Sourcetree](https://www.sourcetreeapp.com/) *(macOS / Windows)*
If using **GitHub Desktop**, you can simply click **Open with GitHub Desktop** in the clone dialog.
For other tools, you must either manually browse to your forked repository or paste the HTTPS URL from the clone dialog.
### Using GitHub CLI
The GitHub CLI offers tight integration with GitHub.
1. Install the [GitHub CLI](https://cli.github.com/).
2. In a terminal window, navigate to where you want to work. Subfolders will be created for each project you clone. **DO NOT** create empty folders for projects to be cloned. This is done automatically by git.
3. Type `gh repo clone` followed by `YOUR-USERNAME/datatracker` (replacing YOUR-USERNAME with your GitHub username), e.g.:
```sh
gh repo clone john-doe/datatracker
```
4. Press **Enter**. Your local clone will be created in a subfolder named `datatracker`.
## Create a Local Branch
While you could *technically* work directly on the develop branch, it is best practice to create a branch for the feature / fix you are working on. It also makes it much easier to fast-forward your fork's develop branch to match the source repository.
1. From a terminal window, navigate to the project directory you cloned earlier.
2. First, make sure you are on the `develop` branch:
```sh
git checkout develop
```
3. Let's create a branch named `feature-1` based on the `develop` branch:
```sh
git checkout -b feature-1
```
4. Press **Enter**. A new branch will be created as an exact copy of the develop branch.
You are now ready to work on your feature / fix in your favorite editor.
## Creating a Commit
Once you are ready to commit the changes you made to the project code, it's time to stage the modifications.
### From your editor / GUI tool
It's generally easier to use either your editor (assuming it has git capabilities) or a git GUI tool. This ensures you're not missing any new untracked files. Select the changes / new files you wish to include in the commit, enter a meaningful short description of the change (see the [Git Commit Messages](#git-commit-messages) section) and create a commit.
### From the command line
If you wish to use the command line instead, you can view the current state of your local repository using the [git status](https://git-scm.com/docs/git-status) command:
```sh
git status
```
To stage a modification, use the [git add](https://git-scm.com/docs/git-add) command:
```sh
git add some-file.py
```
Finally, create the commit by running the [git commit](https://git-scm.com/docs/git-commit) command:
```sh
git commit
```
This will launch a text editor prompting you for a commit message. Enter a meaningful short description of the change (see [Git Commit Messages](#git-commit-messages) section) and save.
> :information_source: There are several command parameters you can use to quickly add all modifications or execute several actions at once. Refer to the documentation for each command above.
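For instance, `git commit -a` combines the staging and commit steps for files git already tracks. A minimal sketch in a throwaway repository (the file name `some-file.py` and the commit messages are hypothetical):

```shell
# Sketch: combining staging and committing in a throwaway repo.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email dev@example.com
git config user.name "Dev"
echo "v1" > some-file.py
git add some-file.py                          # brand-new files must be staged explicitly
git commit -q -m "feat: add some-file"
echo "v2" > some-file.py
git commit -q -a -m "fix: update some-file"   # -a stages all modified *tracked* files
git log --oneline
```

Note that `-a` never picks up untracked files, which is exactly why the editor / GUI route above is safer when you have new files.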
## Push Commits
You can now push your commits to your forked repository. This will add the commits you created locally to the feature/fix branch on the remote forked repository.
Look for the **Push** button in your editor / GUI tool.
If you prefer to use the command line, you would use the [git push](https://git-scm.com/docs/git-push) command:
```sh
git push origin feature-1
```
> :information_source: If the feature branch doesn't exist on the remote fork, it will automatically be created.
## Create a Pull Request
When your feature / fix is ready to be merged with the source repository `develop` branch, it's time to create a **Pull Request (PR)**.
On GitHub, navigate to your branch (in your forked repository). A yellow banner will invite you to **Compare & pull request**. You can also click the **Contribute** dropdown to initiate a PR.
![](media/docs/pr-buttons.png)
Make sure the base repository is set to `ietf-tools/datatracker` with the branch `develop` (this is the destination):
![](media/docs/pr-form.png)
Enter a title and description of what your PR includes and click **Create pull request** when ready.
Your PR will then be reviewed by the lead developer / other developers. Automated tests will also run on your code to catch any potential errors.
Once approved and merged, your changes will appear in the `develop` branch. It's now time to fast-forward your fork to the source repository. This ensures your fork's develop branch is in sync with the source develop branch...
## Sync your Fork
Your fork's `develop` branch is now behind the source `develop` branch. To fast-forward it to the latest changes, click the **Fetch upstream** button:
![](media/docs/sync-branch.png)
Note that you also need to fast-forward your **local machine** `develop` branch. This can again be done quickly from your editor / GUI tool. If you're using the command line, run these commands:
```sh
git checkout develop
git merge --ff-only origin/develop
```
> :information_source: While you could use the `git pull` command to achieve the same thing, this ensures that only a fast-forward operation will be executed and not a merge (which is most likely not what you want). You can read more about the different ways of pulling the latest changes via [git merge](https://git-scm.com/docs/git-merge), [git pull](https://git-scm.com/docs/git-pull) and [git rebase](https://git-scm.com/docs/git-rebase).
### Syncing with uncommitted changes
In some cases, you may need to get the latest changes while you're still working on your local branch.
Some tools, like GitKraken, automate this process and will even handle stashing if necessary.
If you prefer to use the command line:
1. You must first [git stash](https://git-scm.com/docs/git-stash) any uncommitted changes:
```sh
git stash
```
This will save the current state of your branch so that it can be re-applied later.
2. Run the [git rebase](https://git-scm.com/docs/git-rebase) command to fast-forward your branch to the latest commit from `develop` and then apply all your new commits on top of it:
```sh
git rebase develop
```
You can add the `-i` flag to the above command to trigger an interactive rebase session. Instead of blindly moving all of the commits to the new base, interactive rebasing gives you the opportunity to alter individual commits in the process.
3. Use the [git stash pop](https://git-scm.com/docs/git-stash) :musical_note: command to restore any changes you previously stashed:
```sh
git stash pop
```
> :warning: Note that you should **never** rebase once you've pushed commits to the source repository. After a PR, **always** fast-forward your forked develop branch to match the source one and create a new feature branch from it. Continuing directly from a previously merged branch will result in duplicated commits when you try to push or create a PR.
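The safe post-merge cycle can be sketched end to end. To keep the sketch self-contained, a local bare repository stands in for your fork (`origin`); in real use, `origin` is your GitHub fork and the fast-forward follows a **Fetch upstream**:

```shell
# Sketch of the post-merge cycle against a stand-in "origin".
set -e
work=$(mktemp -d)
git init -q --bare "$work/fork.git"
git clone -q "$work/fork.git" "$work/clone"
cd "$work/clone"
git config user.email dev@example.com
git config user.name "Dev"
git commit -q --allow-empty -m "chore: seed develop"
git branch -M develop
git push -q origin develop
# ...PR merged upstream, fork fast-forwarded via Fetch upstream...
git checkout -q develop
git merge --ff-only origin/develop   # fast-forward only, never a merge commit
git checkout -q -b feature-2         # fresh branch for the next piece of work
```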
## Styleguides
### Git Commit Messages
* Use the present tense ("Add feature" not "Added feature")
* Use the imperative mood ("Move cursor to..." not "Moves cursor to...")
* Limit the first line to 72 characters or less
* Reference issues and pull requests liberally after the first line
* When only changing documentation, include `[ci skip]` in the commit title
* Consider starting the commit message with one of the following keywords (see [Conventional Commits](https://www.conventionalcommits.org/) specification):
* `build:` Changes that affect the build system or external dependencies
* `docs:` Documentation only changes
* `feat:` A new feature
* `fix:` A bug fix
* `perf:` A code change that improves performance
* `refactor:` A code change that neither fixes a bug nor adds a feature
* `style:` Changes that do not affect the meaning of the code *(white-space, formatting, missing semi-colons, etc)*
* `test:` Adding missing tests or correcting existing tests
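As an illustration, commit titles following these conventions look like the following (the messages are hypothetical, loosely modeled on entries in this project's changelog; the throwaway repo just lets `git log` show the result):

```shell
# Hypothetical conventional-commit titles, recorded in a throwaway repo.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email dev@example.com
git config user.name "Dev"
git commit -q --allow-empty -m "fix: guard against reference sections without names"
git commit -q --allow-empty -m "feat: create Meetecho conferences for interim sessions"
git commit -q --allow-empty -m "docs: clarify fork syncing steps [ci skip]"
git log --format=%s
```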
### JavaScript
#### JS Coding Style
[StandardJS](https://standardjs.com/) is the style guide used for this project.
[![JavaScript Style Guide](https://cdn.rawgit.com/standard/standard/master/badge.svg)](https://github.com/standard/standard)
ESLint and EditorConfig configuration files are present in the project root. Most editors can automatically enforce these [rules](https://standardjs.com/rules.html) and even format your code accordingly as you type.
These rules apply whether the code is inside a `.js` file or as part of a `.vue` / `.html` file.
Refer to the [rules](https://standardjs.com/rules.html) for a complete list with examples. However, here are some of the major ones:
* No semi-colons! :no_entry_sign:
* Use 2 spaces for indentation
* Use single quotes for strings (except to avoid escaping)
* Use camelCase when naming variables and functions
* Always use `===` instead of `==` (unless you **specifically** need to check for `null || undefined`)
* No unused variables
* Keep `else` statements on the same line as their curly braces
* No trailing commas
* Files must end with a newline *(only for new .js / .vue files. See the Python directives below for other file types.)*
Finally, avoid using `var` to declare variables. You should instead use `const` and `let`. `var` unnecessarily pollutes the global scope and there's almost no use-case where it should be used.
#### JS Tests
The [Cypress](https://www.cypress.io/) framework is used for JavaScript testing (in addition to end-to-end testing which covers the whole application).
The tests are located under the `cypress/` directory.
*To be expanded*
### Python
#### Python Coding Style
* Follow the coding style in the piece of code you are working on. Don't re-format code you're not working on to fit your preferred style. As a whole, the piece of code you're working on will be more readable if the style is consistent, even if it's not your style.
* For Python code, PEP 8 is the style guide. Please adhere to it, except when in conflict with the bullet above.
* Don't change whitespace in files you are working on, (except for in the code you're actually adding/changing, of course); and don't let your editor do end-of-line space stripping on saving. Gratuitous whitespace changes may give commit logs and diffs an appearance of there being a lot of changes, and your actual code change can be buried in all the whitespace-change noise.
* Now and then, code clean-up projects are run. During those, it can be the right thing to do whitespace clean-up, coding style alignment, moving code around in order to have it live in a more appropriate place, etc. The point in *those* cases is that when you do that kind of work, it is labelled as such, and actual code changes are not to be inserted in style and whitespace-change commits. If you are not in a clean-up project, don't move code around if you're not actually doing work on it.
* If you are modifying existing code, consider whether you're bending it out of shape in order to support your needs. If you're bending it too much out of shape, consider refactoring. Always try to leave code you change in a better shape than you found it.
#### Python Tests
* Reasonably comprehensive test suites should be written and committed to the project repository.
* Projects written for Django should use Django's test facility, in files tests.py in each application directory.
* Other projects, written in Python, should use Python's doctests or unittest framework.
* Other projects should use the best practice for the respective code environment for testing.
* As of release 5.12.0, the Django test suite for the datatracker includes tests which measure the test suite's code, template, and URL coverage and fail if it drops below that of the latest release. When merged in, your code should not make the test coverage drop below the latest release. Please run the full test suite regularly, to keep an eye on your coverage numbers.
* Please shoot for a test suite with at least 80% code coverage for new code, as measured by the built-in coverage tests for the datatracker or standalone use of coverage.py for other Python projects. For non-Python projects, use the most appropriate test coverage measurement tool.
* For the datatracker, aim for 100% test suite template coverage for new templates.
* When a reported functional bug is being addressed, a test must be written or updated to fail while the bug is present and succeed when it has been fixed, and made part of the bugfix. This is not applicable for minor functional bugs, typos or template changes.

README.md

@@ -1,4 +1,147 @@
# IETF Datatracker
<div align="center">
<img src="media/docs/ietf-datatracker-logo.svg" alt="IETF Datatracker" width="600" />
[![Release](https://img.shields.io/github/release/ietf-tools/datatracker.svg?style=flat&maxAge=3600)](https://github.com/ietf-tools/datatracker/releases)
[![License](https://img.shields.io/badge/license-BSD3-blue.svg?style=flat)](https://github.com/ietf-tools/datatracker/blob/main/LICENSE)
![Nightly DB Build](https://img.shields.io/github/workflow/status/ietf-tools/datatracker/dev-db-nightly?label=Nightly%20DB%20Build&style=flat&logo=docker&logoColor=white&maxAge=3600)
##### The day-to-day front-end to the IETF database for people who work on IETF standards.
</div>
- [**Production Website**](https://datatracker.ietf.org)
- [Getting Started](#getting-started)
- [Prerequisites](#prerequisites)
- [Code Tree Overview](#code-tree-overview)
- [Adding a New Web Page](#adding-a-new-web-page)
- [Testing your work](#testing-your-work)
- [Docker Dev Environment](#docker-dev-environment)
- [Continuous Integration](#continuous-integration)
- [Database & Assets](#database--assets)
- [Bootstrap 5 Upgrade](#bootstrap-5-upgrade)
---
### Getting Started
This project follows the standard **Git Feature Workflow with Develop Branch** development model. Learn about all the various steps of the development workflow, from creating a fork to submitting a pull request, in the [Contributing](CONTRIBUTING.md) guide.
> Make sure to read the [Styleguides](CONTRIBUTING.md#styleguides) section to ensure a cohesive code format across the project.
You can submit bug reports, enhancement and new feature requests in the [discussions](https://github.com/ietf-tools/datatracker/discussions) area. Accepted tickets will be converted to issues.
#### Prerequisites
- Python 3.6
- Django 2.x
- Node.js 16.x
- MariaDB 10
> See the [Docker Dev Environment](#docker-dev-environment) section below for a preconfigured docker environment.
#### Code Tree Overview
The `ietf/templates/` directory contains Django templates used to generate web pages for the datatracker, mailing list, wgcharter and other things.
Most of the other `ietf` sub-directories, such as `meeting`, contain the python/Django model and view information that go with the related templates. In these directories, the key files are:
| File | Description |
|--|--|
| urls.py | binds a URL to a view, possibly selecting some data from the model. |
| models.py | has the data models for the tool area. |
| views.py | has the views for this tool area, and is where views are bound to the template. |
#### Adding a New Web Page
To add a new page to the tools, first explore the `models.py` to see if the model you need already exists. Within `models.py` are classes such as:
```python
class IETFWG(models.Model):
    ACTIVE = 1
    group_acronym = models.ForeignKey(Acronym, primary_key=True, unique=True, editable=False)
    group_type = models.ForeignKey(WGType)
    proposed_date = models.DateField(null=True, blank=True)
    start_date = models.DateField(null=True, blank=True)
    dormant_date = models.DateField(null=True, blank=True)
    ...
```
In this example, the `IETFWG` class can be used to reference various fields of the database including `group_type`. Of note here is that `group_acronym` is the `Acronym` model so fields in that model can be accessed (e.g., `group_acronym.name`).
Next, add a template for the new page in the proper sub-directory of the `ietf/templates` directory. For a simple page that iterates over one type of object, the key part of the template will look something like this:
```html
{% for wg in object_list %}
  <tr>
    <td><a href="{{ wg.email_archive }}">{{ wg }}</a></td>
    <td>{{ wg.group_acronym.name }}</td>
  </tr>
{% endfor %}
```
In this case, we're expecting `object_list` to be passed to the template from the view and expecting it to contain objects with the `IETFWG` model.
Then add a view for the template to `views.py`. A simple view might look like:
```python
def list_wgwebmail(request):
    wgs = IETFWG.objects.all()
    return render_to_response('mailinglists/wgwebmail_list.html', {'object_list': wgs})
```
This selects the IETFWG objects from the database and renders the template with them as `object_list`. The model you're using has to be explicitly imported at the top of `views.py` in the imports statement.
Finally, add a URL to display the view to `urls.py`. For this example, the reference to `list_wgwebmail` view is called:
```python
urlpatterns += [
    ...
    url(r'^wg/$', views.list_wgwebmail),
]
```
#### Testing your work
Assuming you have the database settings configured already, you can run the server locally with:
```sh
$ ietf/manage.py runserver localhost:<port>
```
where `<port>` is arbitrary. Then point your web browser at `localhost:<port>` plus the URL path for your new page to see your work.
When you believe you are ready to commit your work, you should run the test suite to make sure that no tests break. You do this by running
```sh
$ ietf/manage.py test --settings=settings_sqlitetest
```
### Docker Dev Environment
In order to simplify and reduce the time required for setup, a preconfigured docker environment is available.
Read the [Docker Dev Environment](docker/README.md) guide to get started.
### Continuous Integration
*TODO*
### Database & Assets
Nightly database dumps of the datatracker are available at
https://www.ietf.org/lib/dt/sprint/ietf_utf8.sql.gz
> Note that this link is provided as reference only. To update the database in your dev environment to the latest version, you should instead run the `docker/cleandb` script!
Additional data files used by the datatracker (e.g. instance drafts, charters, rfcs, agendas, minutes, etc.) are available at
https://www.ietf.org/standards/ids/internet-draft-mirror-sites/
> A script is available at `docker/scripts/app-rsync-extras.sh` to automatically fetch these resources via rsync.
---
# Bootstrap 5 Update
An update of the UI to use Bootstrap 5 is under way. The following notes describe this work-in-progress and should
be integrated with the rest of the document as the details and processes become final.
## Intro


@@ -14,7 +14,7 @@ django.setup()
from django.conf import settings
from django.core.validators import validate_email, ValidationError
from ietf.utils.draft import Draft
from ietf.utils.draft import PlaintextDraft
from ietf.submit.utils import update_authors
import debug # pyflakes:ignore
@@ -61,7 +61,7 @@ for name in sorted(names):
except UnicodeDecodeError:
text = raw.decode('latin1')
try:
draft = Draft(text, txt_file.name, name_from_source=True)
draft = PlaintextDraft(text, txt_file.name, name_from_source=True)
except Exception as e:
print name, rev, "Can't parse", p,":",e
continue


@@ -92,6 +92,9 @@ CHARTER=/a/www/ietf-ftp/charter
wget -q https://datatracker.ietf.org/wg/1wg-charters-by-acronym.txt -O $CHARTER/1wg-charters-by-acronym.txt
wget -q https://datatracker.ietf.org/wg/1wg-charters.txt -O $CHARTER/1wg-charters.txt
# Regenerate the last week of bibxml-ids
$DTDIR/ietf/manage.py generate_draft_bibxml_files
# Create and update group wikis
#$DTDIR/ietf/manage.py create_group_wikis


@@ -232,6 +232,8 @@ def skip_url(url):
# Skip most html conversions, not worth the time
"^/doc/html/draft-[0-9ac-z]",
"^/doc/html/draft-b[0-9b-z]",
"^/doc/pdf/draft-[0-9ac-z]",
"^/doc/pdf/draft-b[0-9b-z]",
"^/doc/html/charter-.*",
"^/doc/html/status-.*",
"^/doc/html/rfc.*",

changelog

@@ -1,3 +1,283 @@
ietfdb (7.45.0) ietf; urgency=medium
** MeetEcho interim request integration, bugfixes **
* Merged in [19892] from rjsparks@nostrum.com:
Guard against reference sections without names.
* Merged in [19895] from jennifer@painless-security.com:
Look at v2 'title' attribute in reference type heuristics for XML
drafts. Related to #3529.
* Merged in [19900] from jennifer@painless-security.com:
Fix hiding of name/purpose/type fields when not needed in secr/sreq.
Fixes #3531.
* Merged in [19907] from rjsparks@nostrum.com:
Provide the complete context to the template for mail about approved
interim requests. Fixes #3534.
* Merged in [19915] from rjsparks@nostrum.com:
Simplify search for link back to group from the review management view.
* Merged in [19919] from rjsparks@nostrum.com:
Allow secretariat to edit session requests when tool is closed to
chairs. Fixes #3547.
* Merged in [19920] from rjsparks@nostrum.com:
Make working with session purpose easier in the admin.
* Merged in [19921] from rjsparks@nostrum.com:
add search to the doc states admin form.
* Merged in [19922] from jennifer@painless-security.com:
Fix scoping of session loop variables in sreq edit view. Improve tests
that should have caught this.
* Merged in [19925] from jennifer@painless-security.com:
Suppress origin template tag in production mode, show relative path
only in other modes.
* Merged in [19917] and [19930] from jennifer@painless-security.com:
Create/delete Meetecho conferences when requesting/canceling interim
sessions. Fixes #3507. Fixes #3508.
-- Robert Sparks <rjsparks@nostrum.com> 15 Feb 2022 14:51:10 +0000
ietfdb (7.44.0) ietf; urgency=medium
** Schedule editor improvements, bugfixes **
* Merged in [19874] from rjsparks@nostrum.com:
Rollback menu caching. More work is required to allow left menu to
function correctly.
* Merged in [19876] from jennifer@painless-security.com:
Do not redirect user to the logout page when logging in. Fixes #3478.
* Merged in [19878] from jennifer@painless-security.com:
Hide timeslots whose type is disabled, plus other schedule editor
debugging/improvements. Fixes #3510. Fixes #3430.
* Merged in [19880] from rjsparks@nostrum.com:
Add gunicorn to requirements to support new deployment model.
-- Robert Sparks <rjsparks@nostrum.com> 28 Jan 2022 14:56:13 +0000
ietfdb (7.43.0) ietf; urgency=medium
** Easier account creation, bugfixes, enhancements **
* Merged in [19825] from jennifer@painless-security.com:
Find references from submitted XML instead of rendering to text and
parsing. Fixes #3342.
* Merged in [19826] from jennifer@painless-security.com:
Remove code still using old 'length_sessionX' SessionForm fields.
* Merged in [19830] from jennifer@painless-security.com:
Include RFC title in doc/html view title element. Fixes #3488.
* Merged in [19831] and [19832] from rjsparks@nostrum.com:
Cache menus by login.
* Merged in [19833] from jennifer@painless-security.com:
Point to RFC editor info page in document_bibtex view. Fixes #3484.
* Merged in [19834] from lars@eggert.org:
Add djhtml (https://github.com/rtts/djhtml), for auto-reformatting
of the Django templates via 'djlint --profile django --reformat'.
It still has some bugs and issues, esp. on complex templates and with
regards to whitespace after links, but those are manageable, and the
benefits of having consistently-formatted templates IMO outweigh them.
* Merged in [19837] from jennifer@painless-security.com:
Update any_email_sent() to use balloters instead of old ad field. Add
tests to catch the otherwise quiet failure. Fixes #3438.
* Merged in [19838] from jennifer@painless-security.com:
Allow editing of non-chartered group descriptions through the UI.
Fixes #3388.
* Merged in [19839] from jennifer@painless-security.com:
Add timeouts to requests library calls. Fixes #3498.
* Merged in [19841] from jennifer@painless-security.com:
Link to the timeslot editor when meeting has no timeslots. Fixes #3511.
* Merged in [19848] from jennifer@painless-security.com:
Fix several review reminder problems.
Send secretary's review reminders to secretary instead of assignee.
Send unconfirmed assignment reminders based on assignment age and CC
secretaries.
Correctly separate open review reminders by review team.
Fixes #3482. Fixes #3324.
* Merged in [19857] from rjsparks@nostrum.com:
Add a link to account creation in the login page body.
* Merged in [19858] from rjsparks@nostrum.com:
Remove the manual intervention step for account creation.
* Merged in [19863] from rjsparks@nostrum.com:
Add de-gfm to the docker setup. Fixes #3494.
-- Robert Sparks <rjsparks@nostrum.com> 19 Jan 2022 20:03:55 +0000
ietfdb (7.42.0) ietf; urgency=medium
** Bugfixes and minor features **
* Merged in [19786] from jennifer@painless-security.com:
Strip Unicode control characters out of feed content. Fixes #3398.
* Merged in [19787] from rjsparks@nostrum.com:
Change to not serve any personalapikey metadata.
* Merged in [19788] from jennifer@painless-security.com:
Import django.conf.settings instead of ietf.settings. Fixes #3392.
* Merged in [19790] from rjsparks@nostrum.com:
Provide and maintain an rsyncable bibxml-ids dataset.
* Merged in [19793] from nick@staff.ietf.org:
misc: new README.md + docker dir cleanup
* Merged in [19801] from nick@staff.ietf.org:
fix: missing dependencies in dockerfile from changeset #19767
* Merged in [19804] from rjsparks@nostrum.com:
Pin tastypie at 0.14.3. Related to #3500.
* Merged in [19806] from rjsparks@nostrum.com:
Correct the url for the bibtex button. Provide a pdfized button. Fixes
#3501.
* Merged in [19811] from lars@eggert.org:
When using Docker, the runserver isn't being accessed over loopback,
so we need to initialize INTERNAL_IPS based on the current interface
configuration.
* Merged in [19813] from rjsparks@nostrum.com:
Improve robustness of pdfization. Tune the test crawler. Don't show
htmlized and pdfized buttons when that generation will fail.
-- Robert Sparks <rjsparks@nostrum.com> 07 Jan 2022 15:23:26 +0000
ietfdb (7.41.0) ietf; urgency=medium
** improved markdown uploads, js testing, prep for move to github, pdfized documents **
* Merged in [19672] from jennifer@painless-security.com:
Add tests of meeting forms for the new session purpose work and a few
other untested parts. Fix a few bugs uncovered.
* Merged in [19675] from jennifer@painless-security.com:
Update uploaded_filename when modifying agenda through the interim
meeting request edit view. Fixes #3395.
* Merged in [19679] from jennifer@painless-security.com:
Include requester's last name as part of a bofreq document's name.
Fixes #3377.
* Merged in [19683] from jennifer@painless-security.com:
Guard against absent 'form_class' kwarg in IETFJSONField.formfield().
* Merged in [19694] from jennifer@painless-security.com:
Better handle invalid character encodings in process_email and
feedback_email commands. Add tests of this using stdin.
* Merged in [19700] from lars@eggert.org:
Add space between RFC and number.
* Merged in [19710] from jennifer@painless-security.com:
Allow nomcom chair to edit liaisons as well as members and generate
GroupEvents when changed. Share code between group and nomcom for this
purpose. Fixes #3376.
* Merged in [19711] from krathnayake@ietf.org:
Adds private app authentication API for bibxml. Fixes #3480.
* Merged in [19713] from lars@eggert.org:
Remove ietf/templates/iesg/scribe_template.html and related,
which is not used anymore according to the secretariat.
(On merge, rjsparks@nostrum.com also removed the three other
templates that only that one included, and removed the test
that covered the view that was removed).
* Merged in [19716] from jennifer@painless-security.com:
Update CSS selectors to update times/timezones for any elements with
.time/.current-tz classes, not just span. Fixes #3485.
* Merged in [19718] from rjsparks@nostrum.com:
Update the utility that generates batches of bibxml3 files to match the
way the view uses the templates.
* Merged in [19719] from rjsparks@nostrum.com:
Change the I-D announce text to mention rsync instead of ftp per
RFC9141 and its associated transition plan.
* Merged in [19693] from nick@staff.ietf.org:
feat: cypress JS testing for agenda meetings + weekview swimlane (WIP)
* Merged in [19696] from nick@staff.ietf.org:
feat: add nomcom expand panel test
* Merged in [19697] from nick@staff.ietf.org:
feat: add nomcom expand panel test (with missing file)
* Merged in [19698] from nick@staff.ietf.org:
feat: add nomcom questionnaires tabs tests
* Update coverage to reflect removal of scribe templates
* Merged in [19741] from lars@eggert.org:
Add ghostscript to app image, which is used by some tests.
* Merged in [19744] from jennifer@painless-security.com:
Treat application/octet-stream as text/markdown for '.md' materials
uploads. Refactor FileUploadForm hierarchy to reduce boilerplate. Fixes
#3163.
* Merged in [19747] from rjsparks@nostrum.com:
Provide a more direct replacement for tools.ietf.org/id at doc/id.
* Merged in [19748] from nick@staff.ietf.org:
docs: add CONTRIBUTING.md (with associated assets) and
CODE_OF_CONDUCT.md
* Merged in [19750] from nick@staff.ietf.org:
build: Add GitHub Actions workflow for automatic nightly datatracker DB
image build
* Merged in [19751] from nick@staff.ietf.org:
misc: add .gitignore + fix cypress files to match JS style guide
* Merged in [19753] from rjsparks@nostrum.com:
Provide pdfs of htmlized (pdfized) documents to replace
tools.ietf.org/pdf/ at /doc/pdf.
* Merged in [19761] from nick@staff.ietf.org:
fix: skip chromedriver install if arch is not supported in docker build
* Merged in [19763] from jennifer@painless-security.com:
Add ability to import session minutes from notes.ietf.org. Mock out
calls to the requests library in tests. Call markdown library through a
util method. Fixes #3489.
* Merged in [19766] from jennifer@painless-security.com:
Accept/replace invalid Unicode bytes when processing ipr response
emails. Fixes #3489.
* Pin weasyprint to an earlier version because of packaging trouble with
dependencies.
-- Robert Sparks <rjsparks@nostrum.com> 10 Dec 2021 16:30:21 +0000
ietfdb (7.40.0) ietf; urgency=medium
** Codesprint, session purposes, new docker dev env, performance improvements **

cypress.json

@@ -0,0 +1,6 @@
{
"baseUrl": "http://localhost:8000",
"chromeWebSecurity": false,
"viewportWidth": 1280,
"viewportHeight": 800
}

cypress/.gitignore

@@ -0,0 +1,2 @@
screenshots
videos

cypress/fixtures/users.json

@@ -0,0 +1,232 @@
[
{
"id": 1,
"name": "Leanne Graham",
"username": "Bret",
"email": "Sincere@april.biz",
"address": {
"street": "Kulas Light",
"suite": "Apt. 556",
"city": "Gwenborough",
"zipcode": "92998-3874",
"geo": {
"lat": "-37.3159",
"lng": "81.1496"
}
},
"phone": "1-770-736-8031 x56442",
"website": "hildegard.org",
"company": {
"name": "Romaguera-Crona",
"catchPhrase": "Multi-layered client-server neural-net",
"bs": "harness real-time e-markets"
}
},
{
"id": 2,
"name": "Ervin Howell",
"username": "Antonette",
"email": "Shanna@melissa.tv",
"address": {
"street": "Victor Plains",
"suite": "Suite 879",
"city": "Wisokyburgh",
"zipcode": "90566-7771",
"geo": {
"lat": "-43.9509",
"lng": "-34.4618"
}
},
"phone": "010-692-6593 x09125",
"website": "anastasia.net",
"company": {
"name": "Deckow-Crist",
"catchPhrase": "Proactive didactic contingency",
"bs": "synergize scalable supply-chains"
}
},
{
"id": 3,
"name": "Clementine Bauch",
"username": "Samantha",
"email": "Nathan@yesenia.net",
"address": {
"street": "Douglas Extension",
"suite": "Suite 847",
"city": "McKenziehaven",
"zipcode": "59590-4157",
"geo": {
"lat": "-68.6102",
"lng": "-47.0653"
}
},
"phone": "1-463-123-4447",
"website": "ramiro.info",
"company": {
"name": "Romaguera-Jacobson",
"catchPhrase": "Face to face bifurcated interface",
"bs": "e-enable strategic applications"
}
},
{
"id": 4,
"name": "Patricia Lebsack",
"username": "Karianne",
"email": "Julianne.OConner@kory.org",
"address": {
"street": "Hoeger Mall",
"suite": "Apt. 692",
"city": "South Elvis",
"zipcode": "53919-4257",
"geo": {
"lat": "29.4572",
"lng": "-164.2990"
}
},
"phone": "493-170-9623 x156",
"website": "kale.biz",
"company": {
"name": "Robel-Corkery",
"catchPhrase": "Multi-tiered zero tolerance productivity",
"bs": "transition cutting-edge web services"
}
},
{
"id": 5,
"name": "Chelsey Dietrich",
"username": "Kamren",
"email": "Lucio_Hettinger@annie.ca",
"address": {
"street": "Skiles Walks",
"suite": "Suite 351",
"city": "Roscoeview",
"zipcode": "33263",
"geo": {
"lat": "-31.8129",
"lng": "62.5342"
}
},
"phone": "(254)954-1289",
"website": "demarco.info",
"company": {
"name": "Keebler LLC",
"catchPhrase": "User-centric fault-tolerant solution",
"bs": "revolutionize end-to-end systems"
}
},
{
"id": 6,
"name": "Mrs. Dennis Schulist",
"username": "Leopoldo_Corkery",
"email": "Karley_Dach@jasper.info",
"address": {
"street": "Norberto Crossing",
"suite": "Apt. 950",
"city": "South Christy",
"zipcode": "23505-1337",
"geo": {
"lat": "-71.4197",
"lng": "71.7478"
}
},
"phone": "1-477-935-8478 x6430",
"website": "ola.org",
"company": {
"name": "Considine-Lockman",
"catchPhrase": "Synchronised bottom-line interface",
"bs": "e-enable innovative applications"
}
},
{
"id": 7,
"name": "Kurtis Weissnat",
"username": "Elwyn.Skiles",
"email": "Telly.Hoeger@billy.biz",
"address": {
"street": "Rex Trail",
"suite": "Suite 280",
"city": "Howemouth",
"zipcode": "58804-1099",
"geo": {
"lat": "24.8918",
"lng": "21.8984"
}
},
"phone": "210.067.6132",
"website": "elvis.io",
"company": {
"name": "Johns Group",
"catchPhrase": "Configurable multimedia task-force",
"bs": "generate enterprise e-tailers"
}
},
{
"id": 8,
"name": "Nicholas Runolfsdottir V",
"username": "Maxime_Nienow",
"email": "Sherwood@rosamond.me",
"address": {
"street": "Ellsworth Summit",
"suite": "Suite 729",
"city": "Aliyaview",
"zipcode": "45169",
"geo": {
"lat": "-14.3990",
"lng": "-120.7677"
}
},
"phone": "586.493.6943 x140",
"website": "jacynthe.com",
"company": {
"name": "Abernathy Group",
"catchPhrase": "Implemented secondary concept",
"bs": "e-enable extensible e-tailers"
}
},
{
"id": 9,
"name": "Glenna Reichert",
"username": "Delphine",
"email": "Chaim_McDermott@dana.io",
"address": {
"street": "Dayna Park",
"suite": "Suite 449",
"city": "Bartholomebury",
"zipcode": "76495-3109",
"geo": {
"lat": "24.6463",
"lng": "-168.8889"
}
},
"phone": "(775)976-6794 x41206",
"website": "conrad.com",
"company": {
"name": "Yost and Sons",
"catchPhrase": "Switchable contextually-based project",
"bs": "aggregate real-time technologies"
}
},
{
"id": 10,
"name": "Clementina DuBuque",
"username": "Moriah.Stanton",
"email": "Rey.Padberg@karina.biz",
"address": {
"street": "Kattie Turnpike",
"suite": "Suite 198",
"city": "Lebsackbury",
"zipcode": "31428-2261",
"geo": {
"lat": "-38.2386",
"lng": "57.2232"
}
},
"phone": "024-648-3804",
"website": "ambrose.net",
"company": {
"name": "Hoeger LLC",
"catchPhrase": "Centralized empowering task-force",
"bs": "target end-to-end models"
}
}
]


@@ -0,0 +1,103 @@
/// <reference types="cypress" />
describe('meeting agenda', () => {
before(() => {
cy.visit('/meeting/agenda/')
})
it('toggle customize panel when clicking on customize header bar', () => {
cy.get('#agenda-filter-customize').click()
cy.get('#customize').should('be.visible').and('have.class', 'in')
cy.get('#agenda-filter-customize').click()
cy.get('#customize').should('not.be.visible').and('not.have.class', 'in')
})
it('customize panel should have at least 3 areas', () => {
cy.get('#agenda-filter-customize').click()
cy.get('.agenda-filter-areaselectbtn').should('have.length.at.least', 3)
})
it('customize panel should have at least 10 groups', () => {
cy.get('.agenda-filter-groupselectbtn').should('have.length.at.least', 10)
})
it('filtering the agenda should modify the URL', () => {
// cy.intercept({
// method: 'GET',
// path: '/meeting/agenda/week-view.html**',
// times: 10
// }, {
// forceNetworkError: true
// })
cy.get('.agenda-filter-groupselectbtn').any(5).as('selectedGroups').each(randomElement => {
cy.wrap(randomElement).click()
cy.wrap(randomElement).invoke('attr', 'data-filter-item').then(keyword => {
cy.url().should('contain', keyword)
})
})
// Deselect everything
cy.get('@selectedGroups').click({ multiple: true })
})
it('selecting an area should select all corresponding groups', () => {
cy.get('.agenda-filter-areaselectbtn').any().click().invoke('attr', 'data-filter-item').then(area => {
cy.url().should('contain', area)
cy.get(`.agenda-filter-groupselectbtn[data-filter-keywords*="${area}"]`).each(group => {
cy.wrap(group).invoke('attr', 'data-filter-keywords').then(groupKeywords => {
// In case value is a comma-separated list of keywords...
if (groupKeywords.indexOf(',') < 0 || groupKeywords.split(',').includes(area)) {
cy.wrap(group).should('have.class', 'active')
}
})
})
})
})
it('weekview iframe should load', () => {
cy.get('iframe#weekview').its('0.contentDocument').should('exist')
cy.get('iframe#weekview').its('0.contentDocument.readyState').should('equal', 'complete')
cy.get('iframe#weekview').its('0.contentDocument.body', {
timeout: 30000
}).should('not.be.empty')
})
})
describe('meeting agenda weekview', () => {
before(() => {
cy.visit('/meeting/agenda/week-view.html')
})
it('should have day headers', () => {
cy.get('.agenda-weekview-day').should('have.length.greaterThan', 0).and('be.visible')
})
it('should have day columns', () => {
cy.get('.agenda-weekview-column').should('have.length.greaterThan', 0).and('be.visible')
})
it('should have the same number of day headers and columns', () => {
cy.get('.agenda-weekview-day').its('length').then(lgth => {
cy.get('.agenda-weekview-column').should('have.length', lgth)
})
})
it('should have meetings', () => {
cy.get('.agenda-weekview-meeting').should('have.length.greaterThan', 0).and('be.visible')
})
it('meeting hover should cause expansion to column width', () => {
cy.get('.agenda-weekview-column:first').invoke('outerWidth').then(colWidth => {
cy.get('.agenda-weekview-meeting-mini').any(5).each(meeting => {
cy.wrap(meeting)
.wait(250)
.realHover({ position: 'center' })
.invoke('outerWidth')
.should('be.closeTo', colWidth, 1)
// Move over to top left corner of the page to end the mouseover of the current meeting block
cy.get('.agenda-weekview-day:first').realHover().wait(250)
})
})
})
})


@@ -0,0 +1,27 @@
/// <reference types="cypress" />
describe('expertise', () => {
before(() => {
cy.visit('/nomcom/2021/expertise/')
})
it('expertises with expandable panels should expand', () => {
cy.get('.nomcom-req-positions-tabs > li > a').each($tab => {
cy.wrap($tab).click()
cy.wrap($tab).parent().should('have.class', 'active')
cy.wrap($tab).invoke('attr', 'href').then($tabId => {
cy.get($tabId).should('have.class', 'tab-pane').and('have.class', 'active').and('be.visible')
cy.get($tabId).then($tabContent => {
if ($tabContent.find('.generic_iesg_reqs_header').length) {
cy.wrap($tabContent).find('.generic_iesg_reqs_header').click()
cy.wrap($tabContent).find('.generic_iesg_reqs_header').invoke('attr', 'href').then($expandId => {
cy.get($expandId).should('be.visible')
})
}
})
})
})
})
})


@@ -0,0 +1,18 @@
/// <reference types="cypress" />
describe('questionnaires', () => {
before(() => {
cy.visit('/nomcom/2021/questionnaires/')
})
it('position tabs should display the appropriate panel on click', () => {
cy.get('.nomcom-questnr-positions-tabs > li > a').each($tab => {
cy.wrap($tab).click()
cy.wrap($tab).parent().should('have.class', 'active')
cy.wrap($tab).invoke('attr', 'href').then($tabId => {
cy.get($tabId).should('have.class', 'tab-pane').and('have.class', 'active').and('be.visible')
})
})
})
})

cypress/plugins/index.js

@@ -0,0 +1,22 @@
/// <reference types="cypress" />
// ***********************************************************
// This example plugins/index.js can be used to load plugins
//
// You can change the location of this file or turn off loading
// the plugins file with the 'pluginsFile' configuration option.
//
// You can read more here:
// https://on.cypress.io/plugins-guide
// ***********************************************************
// This function is called when a project is opened or re-opened (e.g. due to
// the project's config changing)
/**
* @type {Cypress.PluginConfig}
*/
// eslint-disable-next-line no-unused-vars
module.exports = (on, config) => {
// `on` is used to hook into various events Cypress emits
// `config` is the resolved Cypress config
}


@@ -0,0 +1,34 @@
// ***********************************************
// This example commands.js shows you how to
// create various custom commands and overwrite
// existing commands.
//
// For more comprehensive examples of custom
// commands please read more here:
// https://on.cypress.io/custom-commands
// ***********************************************
//
//
// -- This is a parent command --
// Cypress.Commands.add('login', (email, password) => { ... })
//
//
// -- This is a child command --
// Cypress.Commands.add('drag', { prevSubject: 'element'}, (subject, options) => { ... })
//
//
// -- This is a dual command --
// Cypress.Commands.add('dismiss', { prevSubject: 'optional'}, (subject, options) => { ... })
//
//
// -- This will overwrite an existing command --
// Cypress.Commands.overwrite('visit', (originalFn, url, options) => { ... })
Cypress.Commands.add('any', { prevSubject: 'element' }, (subject, size = 1) => {
cy.wrap(subject).then(elementList => {
elementList = (elementList.jquery) ? elementList.get() : elementList
elementList = Cypress._.sampleSize(elementList, size)
elementList = (elementList.length > 1) ? elementList : elementList[0]
cy.wrap(elementList)
})
})
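The custom `any` command registered above selects a random sample of elements from the previous subject, returning a single element when only one is requested. A minimal plain-JavaScript sketch of the same sampling behavior, with a simple Fisher-Yates shuffle standing in for the lodash `sampleSize` that Cypress bundles as `Cypress._` (the standalone `sampleSize`/`any` names here are illustrative, not part of the Cypress API):

```javascript
// Pick `size` random elements from a list (Fisher-Yates shuffle, then
// take the first `size` entries). Mirrors lodash's sampleSize().
function sampleSize(list, size) {
  const copy = list.slice()
  for (let i = copy.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1))
    ;[copy[i], copy[j]] = [copy[j], copy[i]]
  }
  return copy.slice(0, Math.min(size, copy.length))
}

// Same unwrapping rule as the custom command: a lone element is
// returned bare, multiple elements come back as an array.
function any(elementList, size = 1) {
  const sampled = sampleSize(elementList, size)
  return sampled.length > 1 ? sampled : sampled[0]
}
```

In the specs this shows up as `cy.get('.agenda-filter-groupselectbtn').any(5)`, which yields five randomly chosen buttons to click, so each test run exercises a different subset of the filter UI.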

cypress/support/index.js

@@ -0,0 +1,22 @@
// ***********************************************************
// This example support/index.js is processed and
// loaded automatically before your test files.
//
// This is a great place to put global configuration and
// behavior that modifies Cypress.
//
// You can change the location of this file or turn off
// automatically serving support files with the
// 'supportFile' configuration option.
//
// You can read more here:
// https://on.cypress.io/configuration
// ***********************************************************
// Import commands.js using ES2015 syntax:
import './commands'
// Alternatively you can use CommonJS syntax:
// require('./commands')
import 'cypress-real-events/support'


@@ -1,56 +0,0 @@
# Datatracker Development in Docker
## Getting started
1. [Set up Docker](https://docs.docker.com/get-started/) on your preferred
platform.
2. If you have a copy of the datatracker code checked out already, simply `cd`
to the top-level directory.
If not, check out a datatracker branch as usual. We'll check out `trunk`
below, but you can use any branch:
svn co https://svn.ietf.org/svn/tools/ietfdb/trunk
cd trunk
3. **TEMPORARY:** Replace the contents of the `docker` directory with [Lars'
files](https://svn.ietf.org/svn/tools/ietfdb/personal/lars/7.39.1.dev0/docker/).
4. **TEMPORARY:** Until [Lars'
changes](https://svn.ietf.org/svn/tools/ietfdb/personal/lars/7.39.1.dev0/docker/)
have been merged and a docker image is available for download, you will need
to build it locally:
docker/build
This will take a while, but only needs to be done once.
5. Use the `docker/run` script to start the datatracker container. You will be
dropped into a shell from which you can start the datatracker and execute
related commands as usual, for example
ietf/manage.py runserver 0.0.0.0:8000
to start the datatracker.
You can also pass additional arguments to `docker/run`, in which case they
will be executed in the container (instead of a shell being started.)
If you do not already have a copy of the IETF database available in the
`data` directory, one will be downloaded and imported the first time you run
`docker/run`. This will take some time.
Once the datatracker has started, you should be able to open
[http://localhost:8000](http://localhost:8000) in a browser and see the
landing page.
## Troubleshooting
- If the database fails to start, the cause is usually an incompatibility
between the database that last touched the files in `data/mysql` and the
database running inside the docker container.
The solution is to blow away your existing database (`rm -rf data/mysql`). A
fresh copy will be retrieved and imported next time you do `docker/run`, which
should resolve this issue.


@@ -1,110 +0,0 @@
#!/bin/bash
version=0.20
program=${0##*/}
progdir=${0%/*}
if [ "$progdir" = "$program" ]; then progdir="."; fi
if [ "$progdir" = "." ]; then progdir="$PWD"; fi
parent=$(dirname "$progdir")
if [ "$parent" = "." ]; then parent="$PWD"; fi
if [[ $(uname) =~ CYGWIN.* ]]; then parent=$(echo "$parent" | sed -e 's/^\/cygdrive\/\(.\)/\1:/'); fi
function usage() {
cat <<EOF
NAME
$program - Run a docker datatracker container with suitable settings
SYNOPSIS
$program [OPTIONS] ARGS
DESCRIPTION
This is a wrapper which runs an Ubuntu-based docker image which
has been set up with the dependencies needed to easily run the
IETF datatracker in development mode.
MySQL database files at data/mysql will be used; if they do not exist,
a database dump will be retrieved and restored on first run.
OPTIONS
EOF
grep -E '^\s+-[a-zA-Z])' "$0" | sed -E -e 's/\)[^#]+#/ /'
cat <<EOF
AUTHOR
Written by:
Henrik Levkowetz, <henrik@levkowetz.com>
Lars Eggert, <lars@eggert.org>
COPYRIGHT
Copyright (c) 2016 IETF Trust and the persons identified as authors of
the code. All rights reserved. Redistribution and use in source and
binary forms, with or without modification, is permitted pursuant to,
and subject to the license terms contained in, the Revised BSD
License set forth in Section 4.c of the IETF Trust's Legal Provisions
Relating to IETF Documents (https://trustee.ietf.org/license-info).
EOF
}
function die() {
echo -e "\n$program: error: $*" >&2
exit 1
}
function version() {
echo -e "$program $version"
}
trap 'echo "$program($LINENO): Command failed with error code $? ([$$] $0 $*)"; exit 1' ERR
# Default values
MYSQLDIR=$parent/data/mysql
PORT=8000
REPO="ietf/datatracker-environment"
CACHED=':cached'
# Option parsing
shortopts=cChp:V
args=$(getopt -o$shortopts $*)
if [ $? != 0 ] ; then die "Terminating..." >&2 ; exit 1 ; fi
set -- $args
while true ; do
case "$1" in
-c) CACHED=':cached';; # Use cached disk access to reduce system load
-C) CACHED=':consistent';; # Use fully synchronized disk access
-h) usage; exit;; # Show this help, then exit
-p) PORT=$2; shift;; # Bind the container's port 8000 to external port PORT
-V) version; exit;; # Show program version, then exit
--) shift; break;;
*) die "Internal error, inconsistent option specification: '$1'";;
esac
shift
done
if [ -z "$TAG" ]; then
TAG=$(basename "$(svn info "$parent" | grep ^URL | awk '{print $2}' | tr -d '\r')")
fi
if [[ $(uname) =~ CYGWIN.* ]]; then
echo "Running under Cygwin, replacing symlinks with file copies"
ICSFILES=$(/usr/bin/find "$parent/vzic/zoneinfo/" -name '*.ics' -print)
for ICSFILE in $ICSFILES; do
LINK=$(head -n1 "$ICSFILE" | sed -e '/link .*/!d' -e 's/link \(.*\)/\1/')
if [ "$LINK" ]; then
WDIR=$(dirname "$ICSFILE")
echo "Replacing $(basename "$ICSFILE") with $LINK"
cp -f "$WDIR/$LINK" "$ICSFILE"
fi
done
fi
echo "Starting a docker container for '$REPO:$TAG'."
mkdir -p "$MYSQLDIR"
docker run -ti -p "$PORT":8000 -p 33306:3306 \
-v "$parent:/root/src$CACHED" \
-v "$MYSQLDIR:/var/lib/mysql:delegated" \
"$REPO:$TAG" "$@"


@@ -36,6 +36,16 @@ RUN apt-get install -qy \
graphviz \
jq \
less \
libcairo2-dev \
libgtk2.0-0 \
libgtk-3-0 \
libnotify-dev \
libgconf-2-4 \
libgbm-dev \
libnss3 \
libxss1 \
libasound2 \
libxtst6 \
libmagic-dev \
libmariadb-dev \
libtidy-dev \
@@ -49,27 +59,37 @@ RUN apt-get install -qy \
ripgrep \
rsync \
rsyslog \
ruby \
ruby-rubygems \
subversion \
unzip \
wget \
yang-tools && \
xauth \
xvfb \
yang-tools \
zsh
# Install chromedriver
RUN wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | apt-key add - && \
echo "deb http://dl.google.com/linux/chrome/deb/ stable main" >> /etc/apt/sources.list.d/google.list && \
apt-get update -y && \
apt-get install -y google-chrome-stable && \
CHROMEVER=$(google-chrome --product-version | grep -o "[^\.]*\.[^\.]*\.[^\.]*") && \
DRIVERVER=$(curl -s "https://chromedriver.storage.googleapis.com/LATEST_RELEASE_$CHROMEVER") && \
wget -q --continue -P /chromedriver "http://chromedriver.storage.googleapis.com/$DRIVERVER/chromedriver_linux64.zip" && \
unzip /chromedriver/chromedriver* -d /chromedriver && \
ln -s /chromedriver/chromedriver /usr/local/bin/chromedriver && \
ln -s /chromedriver/chromedriver /usr/bin/chromedriver
# Install kramdown-rfc2629 (ruby)
RUN gem install kramdown-rfc2629
# Install chromedriver if supported
COPY docker/scripts/app-install-chromedriver.sh /tmp/app-install-chromedriver.sh
RUN sed -i 's/\r$//' /tmp/app-install-chromedriver.sh && \
chmod +x /tmp/app-install-chromedriver.sh
RUN /tmp/app-install-chromedriver.sh
# Get rid of installation files we don't need in the image, to reduce size
RUN apt-get clean && rm -rf /var/lib/apt/lists/*
# "fake" dbus address to prevent errors
# https://github.com/SeleniumHQ/docker-selenium/issues/87
ENV DBUS_SESSION_BUS_ADDRESS=/dev/null
# avoid million NPM install messages
ENV npm_config_loglevel warn
# allow installing when the main user is root
ENV npm_config_unsafe_perm true
# Set locale to en_US.UTF-8
RUN echo "LC_ALL=en_US.UTF-8" >> /etc/environment && \
echo "en_US.UTF-8 UTF-8" >> /etc/locale.gen && \


@@ -48,8 +48,6 @@ MEDIA_URL = '/media/'
PHOTOS_DIRNAME = 'photo'
PHOTOS_DIR = MEDIA_ROOT + PHOTOS_DIRNAME
DOCUMENT_PATH_PATTERN = 'data/developers/ietf-ftp/{doc.type_id}/'
SUBMIT_YANG_CATALOG_MODEL_DIR = 'data/developers/ietf-ftp/yang/catalogmod/'
SUBMIT_YANG_DRAFT_MODEL_DIR = 'data/developers/ietf-ftp/yang/draftmod/'
SUBMIT_YANG_INVAL_MODEL_DIR = 'data/developers/ietf-ftp/yang/invalmod/'
@@ -76,4 +74,6 @@ INTERNET_DRAFT_ARCHIVE_DIR = 'data/developers/ietf-ftp/internet-drafts/'
INTERNET_ALL_DRAFTS_ARCHIVE_DIR = 'data/developers/ietf-ftp/internet-drafts/'
NOMCOM_PUBLIC_KEYS_DIR = 'data/nomcom_keys/public_keys/'
SLIDE_STAGING_PATH = 'test/staging/'
DE_GFM_BINARY = '/usr/local/bin/de-gfm'


@@ -80,3 +80,5 @@ SUBMIT_YANG_DRAFT_MODEL_DIR = 'data/developers/ietf-ftp/yang/draftmod/'
SUBMIT_YANG_INVAL_MODEL_DIR = 'data/developers/ietf-ftp/yang/invalmod/'
SUBMIT_YANG_IANA_MODEL_DIR = 'data/developers/ietf-ftp/yang/ianamod/'
SUBMIT_YANG_RFC_MODEL_DIR = 'data/developers/ietf-ftp/yang/rfcmod/'
DE_GFM_BINARY = '/usr/local/bin/de-gfm'


@@ -23,6 +23,8 @@ services:
depends_on:
- db
ipc: host
# environment:
# USER: django
# UID: 1001
@@ -37,6 +39,7 @@
# (Adding the "ports" property to this file will not forward from a Codespace.)
db:
# image: ghcr.io/ngpixel/datatracker-db:nightly-20211208
build:
context: ..
dockerfile: docker/db.Dockerfile

docker/scripts/app-cypress.sh (executable)

@@ -0,0 +1,25 @@
#!/bin/bash
WORKSPACEDIR="/root/src"
pushd .
cd $WORKSPACEDIR
echo "Installing NPM dependencies..."
npm install --silent
echo "Starting datatracker server..."
ietf/manage.py runserver 0.0.0.0:8000 --settings=settings_local > /dev/null 2>&1 &
serverPID=$!
echo "Waiting for server to come online ..."
wget -qO- https://raw.githubusercontent.com/eficode/wait-for/v2.1.3/wait-for | sh -s -- localhost:8000 -- echo "Server ready"
echo "Run dbus process to silence warnings..."
sudo mkdir -p /run/dbus
sudo dbus-daemon --system &> /dev/null
echo "Starting JS tests..."
npx cypress run
kill $serverPID
popd


@@ -0,0 +1,18 @@
#!/bin/bash
HOSTARCH=$(arch)
if [ "$HOSTARCH" == "x86_64" ]; then
echo "Installing chrome driver..."
wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | apt-key add -
echo "deb http://dl.google.com/linux/chrome/deb/ stable main" >> /etc/apt/sources.list.d/google.list
apt-get update -y
apt-get install -y google-chrome-stable
CHROMEVER=$(google-chrome --product-version | grep -o "[^\.]*\.[^\.]*\.[^\.]*")
DRIVERVER=$(curl -s "https://chromedriver.storage.googleapis.com/LATEST_RELEASE_$CHROMEVER")
wget -q --continue -P /chromedriver "http://chromedriver.storage.googleapis.com/$DRIVERVER/chromedriver_linux64.zip"
unzip /chromedriver/chromedriver* -d /chromedriver
ln -s /chromedriver/chromedriver /usr/local/bin/chromedriver
ln -s /chromedriver/chromedriver /usr/bin/chromedriver
else
echo "This architecture doesn't support chromedriver. Skipping installation..."
fi


@@ -1,105 +1,5 @@
#!/bin/bash
version=0.20
program=${0##*/}
progdir=${0%/*}
if [ "$progdir" = "$program" ]; then progdir="."; fi
if [ "$progdir" = "." ]; then progdir="$PWD"; fi
parent=$(dirname "$progdir")
if [ "$parent" = "." ]; then parent="$PWD"; fi
if [[ $(uname) =~ CYGWIN.* ]]; then parent=$(echo "$parent" | sed -e 's/^\/cygdrive\/\(.\)/\1:/'); fi
echo "This script is deprecated. Please use the 'cleandb' script in the parent folder instead."
function usage() {
cat <<EOF
NAME
$program - Update the local copy of the IETF database from a dump
SYNOPSIS
$program [OPTIONS] ARGS
DESCRIPTION
This script downloads a dump of the IETF database and loads into the
local sql server if it is newer than the current dump.
OPTIONS
EOF
grep -E '^\s+-[a-zA-Z])' "$0" | sed -E -e 's/\)[^#]+#/ /'
cat <<EOF
AUTHOR
Written by:
Henrik Levkowetz, <henrik@levkowetz.com>
Lars Eggert, <lars@eggert.org>
COPYRIGHT
Copyright (c) 2016 IETF Trust and the persons identified as authors of
the code. All rights reserved. Redistribution and use in source and
binary forms, with or without modification, is permitted pursuant to,
and subject to the license terms contained in, the Revised BSD
License set forth in Section 4.c of the IETF Trust's Legal Provisions
Relating to IETF Documents (https://trustee.ietf.org/license-info).
EOF
}
function die() {
echo -e "\n$program: error: $*" >&2
exit 1
}
function version() {
echo -e "$program $version"
}
trap 'echo "$program($LINENO): Command failed with error code $? ([$$] $0 $*)"; exit 1' ERR
# Option parsing
shortopts=DLZhV
LOAD=1
DOWNLOAD=1
DROP=1
args=$(getopt -o$shortopts $*)
if [ $? != 0 ] ; then die "Terminating..." >&2 ; exit 1 ; fi
set -- $args
while true ; do
case "$1" in
-D) DOWNLOAD="";; # Don't download, use existing file
-L) LOAD=""; ;; # Don't load the database
-Z) DROP="";; # Don't drop new tables
-h) usage; exit;; # Show this help, then exit
-V) version; exit;; # Show program version, then exit
--) shift; break;;
*) die "Internal error, inconsistent option specification: '$1'";;
esac
shift
done
# The program itself
DATADIR=$parent/data
DUMP=ietf_utf8.sql.gz
if [ "$DOWNLOAD" ]; then
echo "Fetching database dump..."
rsync --info=progress2 rsync.ietf.org::dev.db/$DUMP "$DATADIR"
fi
if [ "$LOAD" ]; then
echo "Loading database..."
SIZE=$(pigz --list "$DATADIR/$DUMP" | tail -n 1 | awk '{ print $2 }')
pigz -d < "$DATADIR/$DUMP" \
| pv --progress --bytes --rate --eta --size "$SIZE" \
| sed -e 's/ENGINE=MyISAM/ENGINE=InnoDB/' \
| "$parent/ietf/manage.py" dbshell
fi
if [ "$DROP" ]; then
echo "Dropping tables not in the dump (so migrations can succeed)..."
diff <(pigz -d -c "$DATADIR/$DUMP" | grep '^DROP TABLE IF EXISTS' | tr -d '`;' | awk '{ print $5 }') \
<("$parent/ietf/manage.py" dbshell <<< 'show tables;' | tail -n +2) \
| grep '^>' | awk '{print "drop table if exists", $2, ";";}' \
| tee >(cat >&2) | "$parent/ietf/manage.py" dbshell
fi
# Modified on 2021-12-20, remove this file after a while


@ -1,7 +1,6 @@
# -*- conf-mode -*-
#
/personal/lars/7.39.1.dev0@19495 # Hold the modal 'give us your xml' poking until bibxml service is stable
# and maybe until we have rendered previews.


@ -5,13 +5,13 @@
from . import checks # pyflakes:ignore
# Don't add patch number here:
__version__ = "7.40.1.dev0"
__version__ = "7.45.1.dev0"
# set this to ".p1", ".p2", etc. after patching
__patch__ = ""
__date__ = "$Date$"
__rev__ = "$Rev$ (dev) Latest release: Rev. 19686 "
__rev__ = "$Rev$ (dev) Latest release: Rev. 19938 "
__id__ = "$Id$"


@ -356,8 +356,8 @@ class CustomApiTests(TestCase):
self.assertEqual(data['version'], ietf.__version__+ietf.__patch__)
self.assertIn(data['date'], ietf.__date__)
def test_api_appauth_authortools(self):
url = urlreverse('ietf.api.views.author_tools')
def test_api_appauth(self):
url = urlreverse('ietf.api.views.app_auth')
person = PersonFactory()
apikey = PersonalApiKey.objects.create(endpoint=url, person=person)


@ -40,8 +40,8 @@ urlpatterns = [
url(r'^submit/?$', submit_views.api_submit),
# Datatracker version
url(r'^version/?$', api_views.version),
# Authtools API key
url(r'^appauth/authortools', api_views.author_tools),
# Application authentication API key
url(r'^appauth/(authortools|bibxml)', api_views.app_auth),
]
# Additional (standard) Tastypie endpoints


@ -218,7 +218,7 @@ def version(request):
@require_api_key
@csrf_exempt
def author_tools(request):
def app_auth(request):
return HttpResponse(
json.dumps({'success': True}),
content_type='application/json')


@ -31,8 +31,16 @@ syslog.syslog("Updating history log with new RFC entries from IANA protocols pag
# FIXME: this needs to be the date where this tool is first deployed
rfc_must_published_later_than = datetime.datetime(2012, 11, 26, 0, 0, 0)
text = requests.get(settings.IANA_SYNC_PROTOCOLS_URL).text
rfc_numbers = parse_protocol_page(text)
try:
response = requests.get(
settings.IANA_SYNC_PROTOCOLS_URL,
timeout=30,
)
except requests.Timeout as exc:
syslog.syslog(f'GET request timed out retrieving IANA protocols page: {exc}')
sys.exit(1)
rfc_numbers = parse_protocol_page(response.text)
for chunk in chunks(rfc_numbers, 100):
updated = update_rfc_log_from_protocol_page(chunk, rfc_must_published_later_than)

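The hunk above, like the other sync-script hunks in this commit, replaces a bare `requests.get()` with a guarded call: an explicit timeout plus a clean exit when `requests.Timeout` fires, so a stalled server cannot hang a cron job indefinitely. A minimal standalone sketch of that shared shape (the helper name and the `log` callable are illustrative, not part of the source, which inlines the pattern in each script):

```python
import sys

import requests


def get_with_timeout(url, log, timeout=30):
    """GET a URL, logging and exiting cleanly if the server does not answer in time.

    Mirrors the pattern the sync scripts now use: without an explicit
    timeout, requests.get() can block forever on an unresponsive server.
    """
    try:
        return requests.get(url, timeout=timeout)  # timeout in seconds
    except requests.Timeout as exc:
        log(f'GET request timed out retrieving {url}: {exc}')
        sys.exit(1)
```

Exiting with a nonzero status lets cron surface the failure rather than silently processing an empty response.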

@ -5,10 +5,8 @@
import datetime
import io
import json
import os
import requests
import socket
import sys
import syslog
import traceback
@ -48,11 +46,28 @@ if options.skip_date:
log("Updating document metadata from RFC index going back to %s, from %s" % (skip_date, settings.RFC_EDITOR_INDEX_URL))
socket.setdefaulttimeout(30)
rfc_index_xml = requests.get(settings.RFC_EDITOR_INDEX_URL).text
try:
response = requests.get(
settings.RFC_EDITOR_INDEX_URL,
timeout=30, # seconds
)
except requests.Timeout as exc:
log(f'GET request timed out retrieving RFC editor index: {exc}')
sys.exit(1)
rfc_index_xml = response.text
index_data = ietf.sync.rfceditor.parse_index(io.StringIO(rfc_index_xml))
errata_data = requests.get(settings.RFC_EDITOR_ERRATA_JSON_URL).json()
try:
response = requests.get(
settings.RFC_EDITOR_ERRATA_JSON_URL,
timeout=30, # seconds
)
except requests.Timeout as exc:
log(f'GET request timed out retrieving RFC editor errata: {exc}')
sys.exit(1)
errata_data = response.json()
if len(index_data) < ietf.sync.rfceditor.MIN_INDEX_RESULTS:
log("Not enough index entries, only %s" % len(index_data))


@ -3,7 +3,6 @@
import io
import os
import requests
import socket
import sys
# boilerplate
@ -21,9 +20,15 @@ from ietf.utils.log import log
log("Updating RFC Editor queue states from %s" % settings.RFC_EDITOR_QUEUE_URL)
socket.setdefaulttimeout(30)
response = requests.get(settings.RFC_EDITOR_QUEUE_URL).text
drafts, warnings = parse_queue(io.StringIO(response))
try:
response = requests.get(
settings.RFC_EDITOR_QUEUE_URL,
timeout=30, # seconds
)
except requests.Timeout as exc:
log(f'GET request timed out retrieving RFC editor queue: {exc}')
sys.exit(1)
drafts, warnings = parse_queue(io.StringIO(response.text))
for w in warnings:
log(u"Warning: %s" % w)


@ -23,7 +23,7 @@ django.setup()
from ietf.review.utils import (
review_assignments_needing_reviewer_reminder, email_reviewer_reminder,
review_assignments_needing_secretary_reminder, email_secretary_reminder,
send_unavaibility_period_ending_reminder, send_reminder_all_open_reviews,
send_unavailability_period_ending_reminder, send_reminder_all_open_reviews,
send_review_reminder_overdue_assignment, send_reminder_unconfirmed_assignments)
from ietf.utils.log import log
@ -38,7 +38,7 @@ for assignment, secretary_role in review_assignments_needing_secretary_reminder(
review_req = assignment.review_request
log("Emailed reminder to {} for review of {} in {} (req. id {})".format(secretary_role.email.address, review_req.doc_id, review_req.team.acronym, review_req.pk))
period_end_reminders_sent = send_unavaibility_period_ending_reminder(today)
period_end_reminders_sent = send_unavailability_period_ending_reminder(today)
for msg in period_end_reminders_sent:
log(msg)


@ -23,6 +23,7 @@ admin.site.register(StateType, StateTypeAdmin)
class StateAdmin(admin.ModelAdmin):
list_display = ["slug", "type", 'name', 'order', 'desc']
list_filter = ["type", ]
search_fields = ["slug", "type__label", "type__slug", "name", "desc"]
filter_horizontal = ["next_states"]
admin.site.register(State, StateAdmin)


@ -147,6 +147,12 @@ class IndividualRfcFactory(IndividualDraftFactory):
else:
obj.set_state(State.objects.get(type_id='draft',slug='rfc'))
@factory.post_generation
def reset_canonical_name(obj, create, extracted, **kwargs):
if hasattr(obj, '_canonical_name'):
del obj._canonical_name
return None
class WgDraftFactory(BaseDocumentFactory):
type_id = 'draft'
@ -186,6 +192,11 @@ class WgRfcFactory(WgDraftFactory):
obj.set_state(State.objects.get(type_id='draft',slug='rfc'))
obj.set_state(State.objects.get(type_id='draft-iesg', slug='pub'))
@factory.post_generation
def reset_canonical_name(obj, create, extracted, **kwargs):
if hasattr(obj, '_canonical_name'):
del obj._canonical_name
return None
class RgDraftFactory(BaseDocumentFactory):
@ -230,6 +241,12 @@ class RgRfcFactory(RgDraftFactory):
obj.set_state(State.objects.get(type_id='draft-stream-irtf', slug='pub'))
obj.set_state(State.objects.get(type_id='draft-iesg',slug='idexists'))
@factory.post_generation
def reset_canonical_name(obj, create, extracted, **kwargs):
if hasattr(obj, '_canonical_name'):
del obj._canonical_name
return None
class CharterFactory(BaseDocumentFactory):
@ -394,12 +411,8 @@ class BallotPositionDocEventFactory(DocEventFactory):
model = BallotPositionDocEvent
type = 'changed_ballot_position'
# This isn't right - it needs to build a ballot for the same doc as this position
# For now, deal with this in test code by building BallotDocEvent and BallotPositionDocEvent
# separately and passing the same doc into their factories.
ballot = factory.SubFactory(BallotDocEventFactory)
ballot = factory.SubFactory(BallotDocEventFactory)
doc = factory.SelfAttribute('ballot.doc') # point to same doc as the ballot
balloter = factory.SubFactory('ietf.person.factories.PersonFactory')
pos_id = 'discuss'
@ -464,11 +477,14 @@ class BofreqResponsibleDocEventFactory(DocEventFactory):
class BofreqFactory(BaseDocumentFactory):
type_id = 'bofreq'
title = factory.Faker('sentence')
name = factory.LazyAttribute(lambda o: 'bofreq-%s'%(xslugify(o.title)))
name = factory.LazyAttribute(lambda o: 'bofreq-%s-%s'%(xslugify(o.requester_lastname), xslugify(o.title)))
bofreqeditordocevent = factory.RelatedFactory('ietf.doc.factories.BofreqEditorDocEventFactory','doc')
bofreqresponsibledocevent = factory.RelatedFactory('ietf.doc.factories.BofreqResponsibleDocEventFactory','doc')
class Params:
requester_lastname = factory.Faker('last_name')
@factory.post_generation
def states(obj, create, extracted, **kwargs):
if not create:


@ -3,6 +3,7 @@
import datetime
import unicodedata
from django.contrib.syndication.views import Feed, FeedDoesNotExist
from django.utils.feedgenerator import Atom1Feed, Rss201rev2Feed
@ -15,6 +16,15 @@ from ietf.doc.models import Document, State, LastCallDocEvent, DocEvent
from ietf.doc.utils import augment_events_with_revision
from ietf.doc.templatetags.ietf_filters import format_textarea
def strip_control_characters(s):
"""Remove Unicode control / non-printing characters from a string"""
replacement_char = unicodedata.lookup('REPLACEMENT CHARACTER')
return ''.join(
replacement_char if unicodedata.category(c)[0] == 'C' else c
for c in s
)
class DocumentChangesFeed(Feed):
feed_type = Atom1Feed
@ -38,10 +48,14 @@ class DocumentChangesFeed(Feed):
return events
def item_title(self, item):
return "[%s] %s [rev. %s]" % (item.by, truncatewords(strip_tags(item.desc), 15), item.rev)
return strip_control_characters("[%s] %s [rev. %s]" % (
item.by,
truncatewords(strip_tags(item.desc), 15),
item.rev,
))
def item_description(self, item):
return truncatewords_html(format_textarea(item.desc), 20)
return strip_control_characters(truncatewords_html(format_textarea(item.desc), 20))
def item_pubdate(self, item):
return item.time
@ -75,7 +89,7 @@ class InLastCallFeed(Feed):
datefilter(item.lc_event.expires, "F j, Y"))
def item_description(self, item):
return linebreaks(item.lc_event.desc)
return strip_control_characters(linebreaks(item.lc_event.desc))
def item_pubdate(self, item):
return item.lc_event.time

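The `strip_control_characters()` helper introduced above works because every Unicode control or other non-printing character has a General_Category code starting with "C" (Cc, Cf, Cn, Co, Cs). Reproduced here standalone so the behavior can be checked in isolation:

```python
import unicodedata


def strip_control_characters(s):
    """Replace Unicode control / non-printing characters (category C*) with U+FFFD."""
    replacement_char = unicodedata.lookup('REPLACEMENT CHARACTER')
    return ''.join(
        replacement_char if unicodedata.category(c)[0] == 'C' else c
        for c in s
    )
```

Substituting U+FFFD rather than deleting keeps the string length stable and makes the scrubbed position visible, while keeping the feed XML well-formed.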

@ -61,7 +61,7 @@ class Command(BaseCommand):
process_all = options.get("all")
days = options.get("days")
#
bibxmldir = os.path.join(settings.BIBXML_BASE_PATH, 'bibxml3')
bibxmldir = os.path.join(settings.BIBXML_BASE_PATH, 'bibxml-ids')
if not os.path.exists(bibxmldir):
os.makedirs(bibxmldir)
#
@ -75,19 +75,21 @@ class Command(BaseCommand):
for e in doc_events:
self.mutter('%s %s' % (e.time, e.doc.name))
try:
e.doc.date = e.time.date()
doc = e.doc
if e.rev != doc.rev:
for h in doc.history_set.order_by("-time"):
if e.rev == h.rev:
doc = h
break
ref_text = '%s' % render_to_string('doc/bibxml.xml', {'doc': doc, 'doc_bibtype':'I-D'})
if e.rev == e.doc.rev:
ref_file_name = os.path.join(bibxmldir, 'reference.I-D.%s.xml' % (doc.name[6:], ))
self.write(ref_file_name, ref_text)
else:
self.note("Skipping %s; outdated revision: %s" % (os.path.basename(ref_file_name), e.rev))
doc.date = e.time.date()
ref_text = '%s' % render_to_string('doc/bibxml.xml', {'name':doc.name, 'doc': doc, 'doc_bibtype':'I-D'})
# if e.rev == e.doc.rev:
# for name in (doc.name, doc.name[6:]):
# ref_file_name = os.path.join(bibxmldir, 'reference.I-D.%s.xml' % (name, ))
# self.write(ref_file_name, ref_text)
# for name in (doc.name, doc.name[6:]):
# ref_rev_file_name = os.path.join(bibxmldir, 'reference.I-D.%s-%s.xml' % (name, doc.rev))
# self.write(ref_rev_file_name, ref_text)
ref_rev_file_name = os.path.join(bibxmldir, 'reference.I-D.%s-%s.xml' % (doc.name, doc.rev))
self.write(ref_rev_file_name, ref_text)
except Exception as ee:


@ -10,6 +10,7 @@ import rfc2html
import time
from typing import Optional, TYPE_CHECKING
from weasyprint import HTML as wpHTML
from django.db import models
from django.core import checks
@ -565,6 +566,26 @@ class DocumentInfo(models.Model):
cache.set(cache_key, html, settings.HTMLIZER_CACHE_TIME)
return html
def pdfized(self):
name = self.get_base_name()
text = self.text()
cache = caches['pdfized']
cache_key = name.split('.')[0]
try:
pdf = cache.get(cache_key)
except EOFError:
pdf = None
if not pdf:
html = rfc2html.markup(text, path=settings.PDFIZER_URL_PREFIX)
try:
pdf = wpHTML(string=html.replace('\xad','')).write_pdf(stylesheets=[io.BytesIO(b'html { font-size: 94%;}')])
except AssertionError:
log.log(f'weasyprint failed with an assert on {self.name}')
pdf = None
if pdf:
cache.set(cache_key, pdf, settings.PDFIZER_CACHE_TIME)
return pdf
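The `pdfized()` method above follows a get-or-compute cache shape: read the cache while tolerating corrupt entries (the `EOFError` catch), render on a miss, and write back only when rendering succeeded. Stripped of the weasyprint specifics, and with a plain dict standing in for Django's cache backend (the helper name is illustrative, not from the source), the shape is:

```python
def get_or_render(cache, key, render):
    """Fetch a cached rendering, regenerating on a miss or a corrupt entry."""
    try:
        value = cache.get(key)
    except EOFError:  # a corrupt pickle in the cache backend reads as EOF
        value = None
    if not value:
        value = render()  # may legitimately return None when rendering fails
        if value:
            cache[key] = value  # Django's cache framework would use cache.set(...)
    return value
```

Caching only on success means a transient render failure is retried on the next request instead of being pinned in the cache.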
def references(self):
return self.relations_that_doc(('refnorm','refinfo','refunk','refold'))
@ -1332,7 +1353,11 @@ class BallotPositionDocEvent(DocEvent):
def any_email_sent(self):
# When the send_email field is introduced, old positions will have it
# set to None. We still essentially return True, False, or don't know:
sent_list = BallotPositionDocEvent.objects.filter(ballot=self.ballot, time__lte=self.time, ad=self.ad).values_list('send_email', flat=True)
sent_list = BallotPositionDocEvent.objects.filter(
ballot=self.ballot,
time__lte=self.time,
balloter=self.balloter,
).values_list('send_email', flat=True)
false = any( s==False for s in sent_list )
true = any( s==True for s in sent_list )
return True if true else False if false else None

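The chained conditional at the end of `any_email_sent()` above encodes a three-valued answer: any recorded True wins, an explicit False wins over nothing at all, and None means no position in the history ever carried a `send_email` value. The reduction in isolation:

```python
def reduce_send_email(sent_list):
    """Collapse a history of send_email values (True/False/None) to one answer.

    True if an email was ever requested, False if at least one explicit
    refusal exists (and no request), None if nothing was ever recorded.
    """
    false = any(s is False for s in sent_list)
    true = any(s is True for s in sent_list)
    return True if true else False if false else None
```

The None outcome matters because pre-migration positions have `send_email` set to None, and the ballot page renders "don't know" differently from an explicit "no email requested".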

@ -180,6 +180,11 @@ def rfclink(string):
string = str(string);
return "https://datatracker.ietf.org/doc/html/rfc" + string;
@register.filter
def rfceditor_info_url(rfcnum : str):
"""Link to the RFC editor info page for an RFC"""
return urljoin(settings.RFC_EDITOR_INFO_BASE_URL, f'rfc{rfcnum}')
@register.filter(name='urlize_ietf_docs', is_safe=True, needs_autoescape=True)
def urlize_ietf_docs(string, autoescape=None):
"""
@ -357,6 +362,23 @@ def expires_soon(x,request):
def startswith(x, y):
return str(x).startswith(y)
@register.filter(name='removesuffix', is_safe=False)
def removesuffix(value, suffix):
"""Remove an exact-match suffix
The is_safe flag is False because indiscriminate use of this could result in non-safe output.
See https://docs.djangoproject.com/en/2.2/howto/custom-template-tags/#filters-and-auto-escaping
which describes the possibility that removing characters from an escaped string may introduce
HTML-unsafe output.
"""
base = str(value)
if suffix and base.endswith(suffix):
return base[:-len(suffix)]
else:
return base
@register.filter
def has_role(user, role_names):
from ietf.ietfauth.utils import has_role

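The `removesuffix` filter above strips an exact-match suffix, unlike `str.rstrip()`, which removes any run of the listed characters; on Python 3.9+ the stdlib `str.removesuffix()` has the same semantics. A standalone sketch, with an explicit guard for the empty-suffix edge case (`base[:-0]` would otherwise yield an empty string):

```python
def removesuffix(value, suffix):
    """Remove an exact-match suffix; return the string unchanged otherwise."""
    base = str(value)
    if suffix and base.endswith(suffix):
        return base[:-len(suffix)]
    return base
```

For contrast, `'item.html'.rstrip('.xml')` yields `'item.ht'` because rstrip treats its argument as a character set, which is exactly the behavior this filter avoids.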

@ -22,6 +22,7 @@ from django.urls import reverse as urlreverse
from django.conf import settings
from django.forms import Form
from django.utils.html import escape
from django.test import override_settings
from django.utils.text import slugify
from tastypie.test import ResourceTestCaseMixin
@ -678,17 +679,24 @@ Man Expires September 22, 2015 [Page 3]
self.assertContains(r, "Versions:")
self.assertContains(r, "Deimos street")
q = PyQuery(r.content)
self.assertEqual(q('title').text(), 'draft-ietf-mars-test-01')
self.assertEqual(len(q('.rfcmarkup pre')), 4)
self.assertEqual(len(q('.rfcmarkup span.h1')), 2)
self.assertEqual(len(q('.rfcmarkup a[href]')), 41)
r = self.client.get(urlreverse("ietf.doc.views_doc.document_html", kwargs=dict(name=draft.name, rev=draft.rev)))
self.assertEqual(r.status_code, 200)
q = PyQuery(r.content)
self.assertEqual(q('title').text(), 'draft-ietf-mars-test-01')
rfc = WgRfcFactory()
(Path(settings.RFC_PATH) / rfc.get_base_name()).touch()
r = self.client.get(urlreverse("ietf.doc.views_doc.document_html", kwargs=dict(name=rfc.canonical_name())))
self.assertEqual(r.status_code, 200)
q = PyQuery(r.content)
self.assertEqual(q('title').text(), f'RFC {rfc.rfc_number()} - {rfc.title}')
# synonyms for the rfc should be redirected to its canonical view
r = self.client.get(urlreverse("ietf.doc.views_doc.document_html", kwargs=dict(name=rfc.rfc_number())))
self.assertRedirects(r, urlreverse("ietf.doc.views_doc.document_html", kwargs=dict(name=rfc.canonical_name())))
r = self.client.get(urlreverse("ietf.doc.views_doc.document_html", kwargs=dict(name=f'RFC {rfc.rfc_number()}')))
@ -1704,6 +1712,20 @@ class DocTestCase(TestCase):
self.assertEqual(r.status_code, 200)
self.assertContains(r, e.desc)
def test_document_feed_with_control_character(self):
doc = IndividualDraftFactory()
DocEvent.objects.create(
doc=doc,
rev=doc.rev,
desc="Something happened involving the \x0b character.",
type="added_comment",
by=Person.objects.get(name="(System)"))
r = self.client.get("/feed/document-changes/%s/" % doc.name)
self.assertEqual(r.status_code, 200)
self.assertContains(r, 'Something happened involving the')
def test_last_call_feed(self):
doc = IndividualDraftFactory()
@ -1712,7 +1734,7 @@ class DocTestCase(TestCase):
LastCallDocEvent.objects.create(
doc=doc,
rev=doc.rev,
desc="Last call",
desc="Last call\x0b", # include a control character to be sure it does not break anything
type="sent_last_call",
by=Person.objects.get(user__username="secretary"),
expires=datetime.date.today() + datetime.timedelta(days=7))
@ -1752,6 +1774,12 @@ class DocTestCase(TestCase):
self.assertEqual(r.status_code, 200)
self.assertNotContains(r, "Request publication")
def _parse_bibtex_response(self, response) -> dict:
parser = bibtexparser.bparser.BibTexParser()
parser.homogenise_fields = False # do not modify field names (e.g., turns "url" into "link" by default)
return bibtexparser.loads(response.content.decode(), parser=parser).get_entry_dict()
@override_settings(RFC_EDITOR_INFO_BASE_URL='https://www.rfc-editor.ietf.org/info/')
def test_document_bibtex(self):
rfc = WgRfcFactory.create(
#other_aliases = ['rfc6020',],
@ -1764,12 +1792,13 @@ class DocTestCase(TestCase):
#
url = urlreverse('ietf.doc.views_doc.document_bibtex', kwargs=dict(name=rfc.name))
r = self.client.get(url)
entry = bibtexparser.loads(unicontent(r)).get_entry_dict()["rfc%s"%num]
entry = self._parse_bibtex_response(r)["rfc%s"%num]
self.assertEqual(entry['series'], 'Request for Comments')
self.assertEqual(entry['number'], num)
self.assertEqual(entry['doi'], '10.17487/RFC%s'%num)
self.assertEqual(entry['year'], '2010')
self.assertEqual(entry['month'], 'oct')
self.assertEqual(entry['url'], f'https://www.rfc-editor.ietf.org/info/rfc{num}')
#
self.assertNotIn('day', entry)
@ -1785,25 +1814,27 @@ class DocTestCase(TestCase):
url = urlreverse('ietf.doc.views_doc.document_bibtex', kwargs=dict(name=april1.name))
r = self.client.get(url)
self.assertEqual(r.get('Content-Type'), 'text/plain; charset=utf-8')
entry = bibtexparser.loads(unicontent(r)).get_entry_dict()['rfc%s'%num]
entry = self._parse_bibtex_response(r)["rfc%s"%num]
self.assertEqual(entry['series'], 'Request for Comments')
self.assertEqual(entry['number'], num)
self.assertEqual(entry['doi'], '10.17487/RFC%s'%num)
self.assertEqual(entry['year'], '1990')
self.assertEqual(entry['month'], 'apr')
self.assertEqual(entry['day'], '1')
self.assertEqual(entry['url'], f'https://www.rfc-editor.ietf.org/info/rfc{num}')
draft = IndividualDraftFactory.create()
docname = '%s-%s' % (draft.name, draft.rev)
bibname = docname[6:] # drop the 'draft-' prefix
url = urlreverse('ietf.doc.views_doc.document_bibtex', kwargs=dict(name=draft.name))
r = self.client.get(url)
entry = bibtexparser.loads(unicontent(r)).get_entry_dict()[bibname]
entry = self._parse_bibtex_response(r)[bibname]
self.assertEqual(entry['note'], 'Work in Progress')
self.assertEqual(entry['number'], docname)
self.assertEqual(entry['year'], str(draft.pub_date().year))
self.assertEqual(entry['month'], draft.pub_date().strftime('%b').lower())
self.assertEqual(entry['day'], str(draft.pub_date().day))
self.assertEqual(entry['url'], f'https://datatracker.ietf.org/doc/html/{docname}')
#
self.assertNotIn('doi', entry)
@ -2681,4 +2712,91 @@ class RfcdiffSupportTests(TestCase):
self.do_rfc_with_broken_history_test(draft_name='draft-some-draft')
# tricky draft names
self.do_rfc_with_broken_history_test(draft_name='draft-gizmo-01')
self.do_rfc_with_broken_history_test(draft_name='draft-oh-boy-what-a-draft-02-03')
self.do_rfc_with_broken_history_test(draft_name='draft-oh-boy-what-a-draft-02-03')
class RawIdTests(TestCase):
def __init__(self, *args, **kwargs):
self.view = "ietf.doc.views_doc.document_raw_id"
self.mimetypes = {'txt':'text/plain','html':'text/html','xml':'application/xml'}
super(self.__class__, self).__init__(*args, **kwargs)
def should_succeed(self, argdict):
url = urlreverse(self.view, kwargs=argdict)
r = self.client.get(url)
self.assertEqual(r.status_code,200)
self.assertEqual(r.get('Content-Type'),f"{self.mimetypes[argdict.get('ext','txt')]};charset=utf-8")
def should_404(self, argdict):
url = urlreverse(self.view, kwargs=argdict)
r = self.client.get(url)
self.assertEqual(r.status_code, 404)
def test_raw_id(self):
draft = WgDraftFactory(create_revisions=range(0,2))
dir = settings.INTERNET_ALL_DRAFTS_ARCHIVE_DIR
for r in range(0,2):
rev = f'{r:02d}'
(Path(dir) / f'{draft.name}-{rev}.txt').touch()
if r == 1:
(Path(dir) / f'{draft.name}-{rev}.html').touch()
(Path(dir) / f'{draft.name}-{rev}.xml').touch()
self.should_succeed(dict(name=draft.name))
for ext in ('txt', 'html', 'xml'):
self.should_succeed(dict(name=draft.name, ext=ext))
self.should_succeed(dict(name=draft.name, rev='01', ext=ext))
self.should_404(dict(name=draft.name, ext='pdf'))
self.should_succeed(dict(name=draft.name, rev='00'))
self.should_succeed(dict(name=draft.name, rev='00',ext='txt'))
self.should_404(dict(name=draft.name, rev='00',ext='html'))
def test_raw_id_rfc(self):
rfc = WgRfcFactory()
dir = settings.INTERNET_ALL_DRAFTS_ARCHIVE_DIR
(Path(dir) / f'{rfc.name}-{rfc.rev}.txt').touch()
self.should_succeed(dict(name=rfc.name))
self.should_404(dict(name=rfc.canonical_name()))
def test_non_draft(self):
charter = CharterFactory()
self.should_404(dict(name=charter.name))
class PdfizedTests(TestCase):
def __init__(self, *args, **kwargs):
self.view = "ietf.doc.views_doc.document_pdfized"
super(self.__class__, self).__init__(*args, **kwargs)
def should_succeed(self, argdict):
url = urlreverse(self.view, kwargs=argdict)
r = self.client.get(url)
self.assertEqual(r.status_code,200)
self.assertEqual(r.get('Content-Type'),'application/pdf;charset=utf-8')
def should_404(self, argdict):
url = urlreverse(self.view, kwargs=argdict)
r = self.client.get(url)
self.assertEqual(r.status_code, 404)
def test_pdfized(self):
rfc = WgRfcFactory(create_revisions=range(0,2))
dir = settings.RFC_PATH
with (Path(dir) / f'{rfc.canonical_name()}.txt').open('w') as f:
f.write('text content')
dir = settings.INTERNET_ALL_DRAFTS_ARCHIVE_DIR
for r in range(0,2):
with (Path(dir) / f'{rfc.name}-{r:02d}.txt').open('w') as f:
f.write('text content')
self.should_succeed(dict(name=rfc.canonical_name()))
self.should_succeed(dict(name=rfc.name))
for r in range(0,2):
self.should_succeed(dict(name=rfc.name,rev=f'{r:02d}'))
for ext in ('pdf','txt','html','anythingatall'):
self.should_succeed(dict(name=rfc.name,rev=f'{r:02d}',ext=ext))
self.should_404(dict(name=rfc.name,rev='02'))


@ -9,12 +9,16 @@ from pyquery import PyQuery
import debug # pyflakes:ignore
from django.test import RequestFactory
from django.utils.text import slugify
from django.urls import reverse as urlreverse
from ietf.doc.models import ( Document, State, DocEvent,
BallotPositionDocEvent, LastCallDocEvent, WriteupDocEvent, TelechatDocEvent )
from ietf.doc.factories import DocumentFactory, IndividualDraftFactory, IndividualRfcFactory, WgDraftFactory
from ietf.doc.models import (Document, State, DocEvent,
BallotPositionDocEvent, LastCallDocEvent, WriteupDocEvent, TelechatDocEvent)
from ietf.doc.factories import (DocumentFactory, IndividualDraftFactory, IndividualRfcFactory, WgDraftFactory,
BallotPositionDocEventFactory, BallotDocEventFactory)
from ietf.doc.utils import create_ballot_if_not_open
from ietf.doc.views_doc import document_ballot_content
from ietf.group.models import Group, Role
from ietf.group.factories import GroupFactory, RoleFactory, ReviewTeamFactory
from ietf.ipr.factories import HolderIprDisclosureFactory
@ -22,6 +26,7 @@ from ietf.name.models import BallotPositionName
from ietf.iesg.models import TelechatDate
from ietf.person.models import Person, PersonalApiKey
from ietf.person.factories import PersonFactory
from ietf.person.utils import get_active_ads
from ietf.utils.test_utils import TestCase, login_testing_unauthorized
from ietf.utils.mail import outbox, empty_outbox, get_payload_text
from ietf.utils.text import unwrap
@ -1100,11 +1105,278 @@ class RegenerateLastCallTestCase(TestCase):
self.assertTrue("rfc6666" in lc_text)
self.assertTrue("Independent Submission" in lc_text)
draft.relateddocument_set.create(target=rfc.docalias.get(name='rfc6666'),relationship_id='downref-approval')
draft.relateddocument_set.create(target=rfc.docalias.get(name='rfc6666'), relationship_id='downref-approval')
r = self.client.post(url, dict(regenerate_last_call_text="1"))
self.assertEqual(r.status_code, 200)
draft = Document.objects.get(name=draft.name)
lc_text = draft.latest_event(WriteupDocEvent, type="changed_last_call_text").text
self.assertFalse("contains these normative down" in lc_text)
self.assertFalse("rfc6666" in lc_text)
self.assertFalse("rfc6666" in lc_text)
class BallotContentTests(TestCase):
def test_ballotpositiondocevent_any_email_sent(self):
now = datetime.datetime.now() # be sure event timestamps are at distinct times
bpde_with_null_send_email = BallotPositionDocEventFactory(
time=now - datetime.timedelta(minutes=30),
send_email=None,
)
ballot = bpde_with_null_send_email.ballot
balloter = bpde_with_null_send_email.balloter
self.assertIsNone(
bpde_with_null_send_email.any_email_sent(),
'Result is None when only send_email is None',
)
self.assertIsNone(
BallotPositionDocEventFactory(
ballot=ballot,
balloter=balloter,
time=now - datetime.timedelta(minutes=29),
send_email=None,
).any_email_sent(),
'Result is None when all send_email values are None',
)
# test with assertIs instead of assertFalse to distinguish None from False
self.assertIs(
BallotPositionDocEventFactory(
ballot=ballot,
balloter=balloter,
time=now - datetime.timedelta(minutes=28),
send_email=False,
).any_email_sent(),
False,
'Result is False when current send_email is False'
)
self.assertIs(
BallotPositionDocEventFactory(
ballot=ballot,
balloter=balloter,
time=now - datetime.timedelta(minutes=27),
send_email=None,
).any_email_sent(),
False,
'Result is False when earlier send_email is False'
)
self.assertIs(
BallotPositionDocEventFactory(
ballot=ballot,
balloter=balloter,
time=now - datetime.timedelta(minutes=26),
send_email=True,
).any_email_sent(),
True,
'Result is True when current send_email is True'
)
self.assertIs(
BallotPositionDocEventFactory(
ballot=ballot,
balloter=balloter,
time=now - datetime.timedelta(minutes=25),
send_email=None,
).any_email_sent(),
True,
'Result is True when earlier send_email is True and current is None'
)
self.assertIs(
BallotPositionDocEventFactory(
ballot=ballot,
balloter=balloter,
time=now - datetime.timedelta(minutes=24),
send_email=False,
).any_email_sent(),
True,
'Result is True when earlier send_email is True and current is False'
)
def _assertBallotMessage(self, q, balloter, expected):
heading = q(f'h4[id$="_{slugify(balloter.plain_name())}"]')
self.assertEqual(len(heading), 1)
# <h4/> is followed by a panel with the message of interest, so use next()
self.assertEqual(
len(heading.next().find(
f'*[title="{expected}"]'
)),
1,
)
def test_document_ballot_content_email_sent(self):
"""Ballot content correctly describes whether email is requested for each position"""
ballot = BallotDocEventFactory()
balloters = get_active_ads()
self.assertGreaterEqual(len(balloters), 6,
'Oops! Need to create additional active balloters for test')
# send_email is True
BallotPositionDocEventFactory(
ballot=ballot,
balloter=balloters[0],
pos_id='discuss',
discuss='Discussion text',
discuss_time=datetime.datetime.now(),
send_email=True,
)
BallotPositionDocEventFactory(
ballot=ballot,
balloter=balloters[1],
pos_id='noobj',
comment='Commentary',
comment_time=datetime.datetime.now(),
send_email=True,
)
# send_email False
BallotPositionDocEventFactory(
ballot=ballot,
balloter=balloters[2],
pos_id='discuss',
discuss='Discussion text',
discuss_time=datetime.datetime.now(),
send_email=False,
)
BallotPositionDocEventFactory(
ballot=ballot,
balloter=balloters[3],
pos_id='noobj',
comment='Commentary',
comment_time=datetime.datetime.now(),
send_email=False,
)
# send_email False but earlier position had send_email True
BallotPositionDocEventFactory(
ballot=ballot,
balloter=balloters[4],
pos_id='discuss',
discuss='Discussion text',
discuss_time=datetime.datetime.now() - datetime.timedelta(days=1),
send_email=True,
)
BallotPositionDocEventFactory(
ballot=ballot,
balloter=balloters[4],
pos_id='discuss',
discuss='Discussion text',
discuss_time=datetime.datetime.now(),
send_email=False,
)
BallotPositionDocEventFactory(
ballot=ballot,
balloter=balloters[5],
pos_id='noobj',
comment='Commentary',
comment_time=datetime.datetime.now() - datetime.timedelta(days=1),
send_email=True,
)
BallotPositionDocEventFactory(
ballot=ballot,
balloter=balloters[5],
pos_id='noobj',
comment='Commentary',
comment_time=datetime.datetime.now(),
send_email=False,
)
# Create a few positions with non-active-ad people. These will be treated
# as "old" ballot positions because the people are not in the list returned
# by get_active_ads()
#
# Some faked non-ASCII names wind up with plain names that cannot be slugified.
# This causes test failure because that slug is used in an HTML element ID.
# Until that's fixed, set the plain names to something guaranteed unique so
# the test does not randomly fail.
no_email_balloter = BallotPositionDocEventFactory(
ballot=ballot,
balloter__plain='plain name1',
pos_id='discuss',
discuss='Discussion text',
discuss_time=datetime.datetime.now(),
send_email=False,
).balloter
send_email_balloter = BallotPositionDocEventFactory(
ballot=ballot,
balloter__plain='plain name2',
pos_id='discuss',
discuss='Discussion text',
discuss_time=datetime.datetime.now(),
send_email=True,
).balloter
prev_send_email_balloter = BallotPositionDocEventFactory(
ballot=ballot,
balloter__plain='plain name3',
pos_id='discuss',
discuss='Discussion text',
discuss_time=datetime.datetime.now() - datetime.timedelta(days=1),
send_email=True,
).balloter
BallotPositionDocEventFactory(
ballot=ballot,
balloter=prev_send_email_balloter,
pos_id='discuss',
discuss='Discussion text',
discuss_time=datetime.datetime.now(),
send_email=False,
)
content = document_ballot_content(
request=RequestFactory(),
doc=ballot.doc,
ballot_id=ballot.pk,
)
q = PyQuery(content)
self._assertBallotMessage(q, balloters[0], 'Email requested to be sent for this discuss')
self._assertBallotMessage(q, balloters[1], 'Email requested to be sent for this comment')
self._assertBallotMessage(q, balloters[2], 'No email send requests for this discuss')
self._assertBallotMessage(q, balloters[3], 'No email send requests for this comment')
self._assertBallotMessage(q, balloters[4], 'Email requested to be sent for earlier discuss')
self._assertBallotMessage(q, balloters[5], 'Email requested to be sent for earlier comment')
self._assertBallotMessage(q, no_email_balloter, 'No email send requests for this ballot position')
self._assertBallotMessage(q, send_email_balloter, 'Email requested to be sent for this ballot position')
self._assertBallotMessage(q, prev_send_email_balloter, 'Email requested to be sent for earlier ballot position')
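The expected message for each balloter follows a small decision rule over that balloter's chronological send_email values; a minimal illustrative helper (not the actual template logic, names hypothetical) captures it:

```python
def send_log_message(flags, kind='ballot position'):
    """flags: chronological send_email values for one balloter's positions.

    Illustrative sketch only -- the real text comes from the ballot templates.
    """
    if not flags or flags[-1] is None:
        # no send log recorded for the latest position
        return 'No ballot position send log available'
    if flags[-1]:
        return f'Email requested to be sent for this {kind}'
    if any(flags[:-1]):
        # an earlier position requested email even though the latest did not
        return f'Email requested to be sent for earlier {kind}'
    return f'No email send requests for this {kind}'

# Mirrors the assertions above:
assert send_log_message([True], 'discuss') == 'Email requested to be sent for this discuss'
assert send_log_message([True, False], 'comment') == 'Email requested to be sent for earlier comment'
assert send_log_message([False]) == 'No email send requests for this ballot position'
assert send_log_message([None]) == 'No ballot position send log available'
```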
def test_document_ballot_content_without_send_email_values(self):
"""Ballot content correctly indicates lack of send_email field in records"""
ballot = BallotDocEventFactory()
balloters = get_active_ads()
self.assertGreaterEqual(len(balloters), 2,
'Oops! Need to create additional active balloters for test')
BallotPositionDocEventFactory(
ballot=ballot,
balloter=balloters[0],
pos_id='discuss',
discuss='Discussion text',
discuss_time=datetime.datetime.now(),
send_email=None,
)
BallotPositionDocEventFactory(
ballot=ballot,
balloter=balloters[1],
pos_id='noobj',
comment='Commentary',
comment_time=datetime.datetime.now(),
send_email=None,
)
old_balloter = BallotPositionDocEventFactory(
ballot=ballot,
balloter__plain='plain name', # ensure plain name is slugifiable
pos_id='discuss',
discuss='Discussion text',
discuss_time=datetime.datetime.now(),
send_email=None,
).balloter
content = document_ballot_content(
request=RequestFactory(),
doc=ballot.doc,
ballot_id=ballot.pk,
)
q = PyQuery(content)
self._assertBallotMessage(q, balloters[0], 'No email send requests for this discuss')
self._assertBallotMessage(q, balloters[1], 'No ballot position send log available')
self._assertBallotMessage(q, old_balloter, 'No ballot position send log available')

View file

@@ -21,6 +21,7 @@ from ietf.doc.utils_bofreq import bofreq_editors, bofreq_responsible
from ietf.person.factories import PersonFactory
from ietf.utils.mail import outbox, empty_outbox
from ietf.utils.test_utils import TestCase, reload_db_objects, unicontent, login_testing_unauthorized
from ietf.utils.text import xslugify
class BofreqTests(TestCase):
@@ -333,7 +334,7 @@ This test section has some text.
empty_outbox()
r = self.client.post(url, postdict)
self.assertEqual(r.status_code,302)
name = f"bofreq-{postdict['title']}".replace(' ','-')
name = f"bofreq-{xslugify(nobody.last_name())[:64]}-{postdict['title']}".replace(' ','-')
bofreq = Document.objects.filter(name=name,type_id='bofreq').first()
self.assertIsNotNone(bofreq)
self.assertIsNotNone(DocAlias.objects.filter(name=name).first())
@@ -345,7 +346,7 @@ This test section has some text.
self.assertEqual(bofreq.text_or_error(), 'some stuff')
self.assertEqual(len(outbox),1)
os.unlink(file.name)
existing_bofreq = BofreqFactory()
existing_bofreq = BofreqFactory(requester_lastname=nobody.last_name())
for postdict in [
dict(title='', bofreq_submission='enter', bofreq_content='some stuff'),
dict(title='a title', bofreq_submission='enter', bofreq_content=''),
@@ -354,9 +355,9 @@ This test section has some text.
dict(title='a title', bofreq_submission='', bofreq_content='some stuff'),
]:
r = self.client.post(url,postdict)
self.assertEqual(r.status_code, 200)
self.assertEqual(r.status_code, 200, f'Wrong status_code for {postdict}')
q = PyQuery(r.content)
self.assertTrue(q('form div.is-invalid'))
self.assertTrue(q('form div.is-invalid'), f'Expected an error for {postdict}')
def test_post_proposed_restrictions(self):
states = State.objects.filter(type_id='bofreq').exclude(slug='proposed')
@@ -384,4 +385,3 @@ This test section has some text.
q = PyQuery(r.content)
self.assertEqual(0, len(q('td.edit>a.btn')))
self.assertEqual([],q('#change-request'))

View file

@@ -1,16 +1,22 @@
# Copyright The IETF Trust 2020, All Rights Reserved
import datetime
import debug # pyflakes:ignore
from unittest.mock import patch
from django.db import IntegrityError
from ietf.group.factories import GroupFactory, RoleFactory
from ietf.name.models import DocTagName
from ietf.person.factories import PersonFactory
from ietf.utils.test_utils import TestCase
from ietf.utils.test_utils import TestCase, name_of_file_containing
from ietf.person.models import Person
from ietf.doc.factories import DocumentFactory, WgRfcFactory
from ietf.doc.factories import DocumentFactory, WgRfcFactory, WgDraftFactory
from ietf.doc.models import State, DocumentActionHolder, DocumentAuthor, Document
from ietf.doc.utils import update_action_holders, add_state_change_event, update_documentauthors, fuzzy_find_documents
from ietf.doc.utils import (update_action_holders, add_state_change_event, update_documentauthors,
fuzzy_find_documents, rebuild_reference_relations)
from ietf.utils.draft import Draft, PlaintextDraft
from ietf.utils.xmldraft import XMLDraft
class ActionHoldersTests(TestCase):
@@ -285,3 +291,140 @@ class MiscTests(TestCase):
self.do_fuzzy_find_documents_rfc_test('draft-name-with-number-01')
self.do_fuzzy_find_documents_rfc_test('draft-name-that-has-two-02-04')
self.do_fuzzy_find_documents_rfc_test('draft-wild-01-numbers-0312')
class RebuildReferenceRelationsTests(TestCase):
def setUp(self):
super().setUp()
self.doc = WgDraftFactory() # document under test
# Other documents that should be found by rebuild_reference_relations
self.normative, self.informative, self.unknown = WgRfcFactory.create_batch(3)
for relationship in ['refnorm', 'refinfo', 'refunk', 'refold']:
self.doc.relateddocument_set.create(
target=WgRfcFactory().docalias.first(),
relationship_id=relationship,
)
self.updated = WgRfcFactory() # related document that should be left alone
self.doc.relateddocument_set.create(target=self.updated.docalias.first(), relationship_id='updates')
self.assertCountEqual(self.doc.relateddocument_set.values_list('relationship__slug', flat=True),
['refnorm', 'refinfo', 'refold', 'refunk', 'updates'],
'Test conditions set up incorrectly: wrong prior document relationships')
for other_doc in [self.normative, self.informative, self.unknown]:
self.assertEqual(
self.doc.relateddocument_set.filter(target__name=other_doc.canonical_name()).count(),
0,
'Test conditions set up incorrectly: new documents already related',
)
def _get_refs_return_value(self):
return {
self.normative.canonical_name(): Draft.REF_TYPE_NORMATIVE,
self.informative.canonical_name(): Draft.REF_TYPE_INFORMATIVE,
self.unknown.canonical_name(): Draft.REF_TYPE_UNKNOWN,
'draft-not-found': Draft.REF_TYPE_NORMATIVE,
}
def test_requires_txt_or_xml(self):
result = rebuild_reference_relations(self.doc, {})
self.assertCountEqual(result.keys(), ['errors'])
self.assertEqual(len(result['errors']), 1)
self.assertIn('No draft text available', result['errors'][0],
'Error should be reported if no draft file is given')
result = rebuild_reference_relations(self.doc, {'md': 'cant-do-this.md'})
self.assertCountEqual(result.keys(), ['errors'])
self.assertEqual(len(result['errors']), 1)
self.assertIn('No draft text available', result['errors'][0],
'Error should be reported if no XML or plaintext file is given')
@patch.object(XMLDraft, 'get_refs')
@patch.object(XMLDraft, '__init__', return_value=None)
def test_xml(self, mock_init, mock_get_refs):
"""Should build reference relations with only XML"""
mock_get_refs.return_value = self._get_refs_return_value()
result = rebuild_reference_relations(self.doc, {'xml': 'file.xml'})
# if the method of calling the XMLDraft() constructor changes, this will need to be updated
xmldraft_init_args, _ = mock_init.call_args
self.assertEqual(xmldraft_init_args, ('file.xml',), 'XMLDraft initialized with unexpected arguments')
self.assertEqual(
result,
{
'warnings': ['There were 1 references with no matching DocAlias'],
'unfound': ['draft-not-found'],
}
)
self.assertCountEqual(
self.doc.relateddocument_set.values_list('target__name', 'relationship__slug'),
[
(self.normative.canonical_name(), 'refnorm'),
(self.informative.canonical_name(), 'refinfo'),
(self.unknown.canonical_name(), 'refunk'),
(self.updated.docalias.first().name, 'updates'),
]
)
@patch.object(PlaintextDraft, 'get_refs')
@patch.object(PlaintextDraft, '__init__', return_value=None)
def test_plaintext(self, mock_init, mock_get_refs):
"""Should build reference relations with only plaintext"""
mock_get_refs.return_value = self._get_refs_return_value()
with name_of_file_containing('contents') as temp_file_name:
result = rebuild_reference_relations(self.doc, {'txt': temp_file_name})
# if the method of calling the PlaintextDraft() constructor changes, this test will need to be updated
_, mock_init_kwargs = mock_init.call_args
self.assertEqual(mock_init_kwargs, {'text': 'contents', 'source': temp_file_name},
'PlaintextDraft initialized with unexpected arguments')
self.assertEqual(
result,
{
'warnings': ['There were 1 references with no matching DocAlias'],
'unfound': ['draft-not-found'],
}
)
self.assertCountEqual(
self.doc.relateddocument_set.values_list('target__name', 'relationship__slug'),
[
(self.normative.canonical_name(), 'refnorm'),
(self.informative.canonical_name(), 'refinfo'),
(self.unknown.canonical_name(), 'refunk'),
(self.updated.docalias.first().name, 'updates'),
]
)
@patch.object(PlaintextDraft, '__init__')
@patch.object(XMLDraft, 'get_refs')
@patch.object(XMLDraft, '__init__', return_value=None)
def test_xml_and_plaintext(self, mock_init, mock_get_refs, mock_plaintext_init):
"""Should build reference relations with XML when plaintext also available"""
mock_get_refs.return_value = self._get_refs_return_value()
result = rebuild_reference_relations(self.doc, {'txt': 'file.txt', 'xml': 'file.xml'})
self.assertFalse(mock_plaintext_init.called, 'PlaintextDraft should not be used when XML is available')
# if the method of calling the XMLDraft() constructor changes, this will need to be updated
xmldraft_init_args, _ = mock_init.call_args
self.assertEqual(xmldraft_init_args, ('file.xml',), 'XMLDraft initialized with unexpected arguments')
self.assertEqual(
result,
{
'warnings': ['There were 1 references with no matching DocAlias'],
'unfound': ['draft-not-found'],
}
)
self.assertCountEqual(
self.doc.relateddocument_set.values_list('target__name', 'relationship__slug'),
[
(self.normative.canonical_name(), 'refnorm'),
(self.informative.canonical_name(), 'refinfo'),
(self.unknown.canonical_name(), 'refunk'),
(self.updated.docalias.first().name, 'updates'),
]
)
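The constructor-patching pattern these tests rely on can be reduced to a standalone sketch; `XMLDraft` here is a local stand-in class, not the datatracker import, and `rebuild` only mirrors the XML-first dispatch:

```python
from unittest.mock import patch

class XMLDraft:
    """Stand-in with the same shape as the parser class under test."""
    def __init__(self, path):
        self.path = path

    def get_refs(self):
        raise NotImplementedError  # would normally parse the file

def rebuild(filenames):
    # same dispatch idea as rebuild_reference_relations: prefer XML input
    if 'xml' in filenames:
        return XMLDraft(filenames['xml']).get_refs()
    return {}

# Patch __init__ so no real file is needed, then inspect how it was called
with patch.object(XMLDraft, 'get_refs', return_value={'rfc9000': 'norm'}), \
     patch.object(XMLDraft, '__init__', return_value=None) as mock_init:
    refs = rebuild({'xml': 'file.xml'})

init_args, _ = mock_init.call_args  # positional args the mock received
assert init_args == ('file.xml',)
assert refs == {'rfc9000': 'norm'}
```

As in the tests above, asserting on `mock_init.call_args` ties the test to how the constructor is invoked, which is why the tests carry a comment warning that they must change if the call style changes.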

View file

@@ -65,9 +65,17 @@ urlpatterns = [
url(r'^stats/newrevisiondocevent/data/?$', views_stats.chart_data_newrevisiondocevent),
url(r'^stats/person/(?P<id>[0-9]+)/drafts/conf/?$', views_stats.chart_conf_person_drafts),
url(r'^stats/person/(?P<id>[0-9]+)/drafts/data/?$', views_stats.chart_data_person_drafts),
# This block should really all be at the idealized docs.ietf.org service
url(r'^html/(?P<name>bcp[0-9]+?)(\.txt|\.html)?/?$', RedirectView.as_view(url=settings.RFC_EDITOR_INFO_BASE_URL+"%(name)s", permanent=False)),
url(r'^html/(?P<name>std[0-9]+?)(\.txt|\.html)?/?$', RedirectView.as_view(url=settings.RFC_EDITOR_INFO_BASE_URL+"%(name)s", permanent=False)),
url(r'^html/%(name)s(?:-%(rev)s)?(\.txt|\.html)?/?$' % settings.URL_REGEXPS, views_doc.document_html),
url(r'^id/%(name)s(?:-%(rev)s)?(?:\.(?P<ext>(txt|html|xml)))?/?$' % settings.URL_REGEXPS, views_doc.document_raw_id),
url(r'^pdf/%(name)s(?:-%(rev)s)?(?:\.(?P<ext>[a-z]+))?/?$' % settings.URL_REGEXPS, views_doc.document_pdfized),
# End of block that should be an idealized docs.ietf.org service instead
url(r'^html/(?P<name>[Rr][Ff][Cc] [0-9]+?)(\.txt|\.html)?/?$', views_doc.document_html),
url(r'^idnits2-rfcs-obsoleted/?$', views_doc.idnits2_rfcs_obsoleted),
url(r'^idnits2-rfc-status/?$', views_doc.idnits2_rfc_status),

View file

@@ -39,6 +39,8 @@ from ietf.utils import draft, text
from ietf.utils.mail import send_mail
from ietf.mailtrigger.utils import gather_address_lists
from ietf.utils import log
from ietf.utils.xmldraft import XMLDraft
def save_document_in_history(doc):
"""Save a snapshot of document and related objects in the database."""
@@ -742,21 +744,25 @@ def update_telechat(request, doc, by, new_telechat_date, new_returning_item=None
return e
def rebuild_reference_relations(doc,filename=None):
def rebuild_reference_relations(doc, filenames):
"""Rebuild reference relations for a document
filenames should be a dict mapping file ext (i.e., type) to the full path of each file.
"""
if doc.type.slug != 'draft':
return None
if not filename:
if doc.get_state_slug() == 'rfc':
filename=os.path.join(settings.RFC_PATH,doc.canonical_name()+".txt")
else:
filename=os.path.join(settings.INTERNET_DRAFT_PATH,doc.filename_with_rev())
try:
with io.open(filename, 'rb') as file:
refs = draft.Draft(file.read().decode('utf8'), filename).get_refs()
except IOError as e:
return { 'errors': ["%s :%s" % (e.strerror, filename)] }
# try XML first
if 'xml' in filenames:
refs = XMLDraft(filenames['xml']).get_refs()
elif 'txt' in filenames:
filename = filenames['txt']
try:
refs = draft.PlaintextDraft.from_file(filename).get_refs()
except IOError as e:
return { 'errors': ["%s: %s" % (e.strerror, filename)] }
else:
return {'errors': ['No draft text available for rebuilding reference relations. Need XML or plaintext.']}
doc.relateddocument_set.filter(relationship__slug__in=['refnorm','refinfo','refold','refunk']).delete()
@@ -764,6 +770,7 @@ def rebuild_reference_relations(doc,filename=None):
errors = []
unfound = set()
for ( ref, refType ) in refs.items():
# As of Dec 2021, DocAlias has a unique constraint on the name field, so count > 1 should not occur
refdoc = DocAlias.objects.filter( name=ref )
count = refdoc.count()
if count == 0:
@@ -1040,7 +1047,7 @@ def build_file_urls(doc):
file_urls.append(("htmlized", urlreverse('ietf.doc.views_doc.document_html', kwargs=dict(name=name))))
if doc.tags.filter(slug="verified-errata").exists():
file_urls.append(("with errata", settings.RFC_EDITOR_INLINE_ERRATA_URL.format(rfc_number=doc.rfc_number())))
file_urls.append(("bibtex", urlreverse('ietf.doc.views_doc.document_main',kwargs=dict(name=name))+"bibtex"))
file_urls.append(("bibtex", urlreverse('ietf.doc.views_doc.document_bibtex',kwargs=dict(name=name))))
else:
base_path = os.path.join(settings.INTERNET_ALL_DRAFTS_ARCHIVE_DIR, doc.name + "-" + doc.rev + ".")
possible_types = settings.IDSUBMIT_FILE_TYPES
@@ -1051,10 +1058,10 @@ def build_file_urls(doc):
label = "plain text" if t == "txt" else t
file_urls.append((label, base + doc.name + "-" + doc.rev + "." + t))
if "pdf" not in found_types:
file_urls.append(("pdf", settings.TOOLS_ID_PDF_URL + doc.name + "-" + doc.rev + ".pdf"))
file_urls.append(("htmlized", urlreverse('ietf.doc.views_doc.document_html', kwargs=dict(name=doc.name, rev=doc.rev))))
file_urls.append(("bibtex", urlreverse('ietf.doc.views_doc.document_main',kwargs=dict(name=doc.name,rev=doc.rev))+"bibtex"))
if doc.text():
file_urls.append(("htmlized", urlreverse('ietf.doc.views_doc.document_html', kwargs=dict(name=doc.name, rev=doc.rev))))
file_urls.append(("pdfized", urlreverse('ietf.doc.views_doc.document_pdfized', kwargs=dict(name=doc.name, rev=doc.rev))))
file_urls.append(("bibtex", urlreverse('ietf.doc.views_doc.document_bibtex',kwargs=dict(name=doc.name,rev=doc.rev))))
return file_urls, found_types
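With the new signature, the parser choice depends only on which keys are present in the `filenames` dict; the dispatch order can be sketched standalone (error text copied from the function above, helper name illustrative):

```python
def pick_source(filenames):
    """Mirror the dispatch in rebuild_reference_relations:
    XML wins, plaintext is the fallback, anything else is an error."""
    if 'xml' in filenames:
        return ('xml', filenames['xml'])
    if 'txt' in filenames:
        return ('txt', filenames['txt'])
    return ('errors',
            'No draft text available for rebuilding reference relations. Need XML or plaintext.')

assert pick_source({'xml': 'd.xml', 'txt': 'd.txt'}) == ('xml', 'd.xml')  # XML preferred
assert pick_source({'txt': 'd.txt'}) == ('txt', 'd.txt')
assert pick_source({'md': 'd.md'})[0] == 'errors'
```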

View file

@@ -3,7 +3,6 @@
import debug # pyflakes:ignore
import io
import markdown
from django import forms
from django.contrib.auth.decorators import login_required
@@ -20,6 +19,7 @@ from ietf.doc.utils import add_state_change_event
from ietf.doc.utils_bofreq import bofreq_editors, bofreq_responsible
from ietf.ietfauth.utils import has_role, role_required
from ietf.person.fields import SearchablePersonsField
from ietf.utils import markdown
from ietf.utils.response import permission_denied
from ietf.utils.text import xslugify
from ietf.utils.textupload import get_cleaned_text_file_content
@@ -64,7 +64,7 @@ class BofreqUploadForm(forms.Form):
if require_field("bofreq_file"):
content = get_cleaned_text_file_content(self.cleaned_data["bofreq_file"])
try:
_ = markdown.markdown(content, extensions=['extra'])
_ = markdown.markdown(content)
except Exception as e:
raise forms.ValidationError(f'Markdown processing failed: {e}')
@@ -113,14 +113,20 @@ class NewBofreqForm(BofreqUploadForm):
title = forms.CharField(max_length=255)
field_order = ['title','bofreq_submission','bofreq_file','bofreq_content']
def name_from_title(self,title):
name = 'bofreq-' + xslugify(title).replace('_', '-')[:128]
return name
def __init__(self, requester, *args, **kwargs):
self._requester = requester
super().__init__(*args, **kwargs)
def name_from_title(self, title):
requester_slug = xslugify(self._requester.last_name())
title_slug = xslugify(title)
name = f'bofreq-{requester_slug[:64]}-{title_slug[:128]}'
return name.replace('_', '-')
def clean_title(self):
title = self.cleaned_data['title']
name = self.name_from_title(title)
if name == 'bofreq-':
if name == self.name_from_title(''):
raise forms.ValidationError('The filename derived from this title is empty. Please include a few descriptive words using ASCII or numeric characters')
if Document.objects.filter(name=name).exists():
raise forms.ValidationError('This title produces a filename already used by an existing BOF request')
@@ -130,7 +136,7 @@ class NewBofreqForm(BofreqUploadForm):
def new_bof_request(request):
if request.method == 'POST':
form = NewBofreqForm(request.POST, request.FILES)
form = NewBofreqForm(request.user.person, request.POST, request.FILES)
if form.is_valid():
title = form.cleaned_data['title']
name = form.name_from_title(title)
@@ -175,7 +181,7 @@ def new_bof_request(request):
init = {'bofreq_content':escape(render_to_string('doc/bofreq/bofreq_template.md',{})),
'bofreq_submission':'enter',
}
form = NewBofreqForm(initial=init)
form = NewBofreqForm(request.user.person, initial=init)
return render(request, 'doc/bofreq/new_bofreq.html',
{'form':form})

View file

@@ -40,7 +40,6 @@ import io
import json
import os
import re
import markdown
from urllib.parse import quote
@@ -80,8 +79,8 @@ from ietf.meeting.utils import group_sessions, get_upcoming_manageable_sessions,
from ietf.review.models import ReviewAssignment
from ietf.review.utils import can_request_review_of_doc, review_assignments_to_list_for_docs
from ietf.review.utils import no_review_from_teams_on_doc
from ietf.utils import markup_txt, log
from ietf.utils.draft import Draft
from ietf.utils import markup_txt, log, markdown
from ietf.utils.draft import PlaintextDraft
from ietf.utils.response import permission_denied
from ietf.utils.text import maybe_split
@@ -550,7 +549,7 @@ def document_main(request, name, rev=None):
))
if doc.type_id == "bofreq":
content = markdown.markdown(doc.text_or_error(),extensions=['extra'])
content = markdown.markdown(doc.text_or_error())
editors = bofreq_editors(doc)
responsible = bofreq_responsible(doc)
can_manage = has_role(request.user,['Secretariat', 'Area Director', 'IAB'])
@@ -661,7 +660,7 @@ def document_main(request, name, rev=None):
content = doc.text_or_error()
t = "plain text"
elif extension == ".md":
content = markdown.markdown(doc.text_or_error(), extensions=['extra'])
content = markdown.markdown(doc.text_or_error())
content_is_html = True
t = "markdown"
other_types.append((t, url))
@@ -719,6 +718,45 @@ def document_main(request, name, rev=None):
raise Http404("Document not found: %s" % (name + ("-%s"%rev if rev else "")))
def document_raw_id(request, name, rev=None, ext=None):
if not name.startswith('draft-'):
raise Http404
found = fuzzy_find_documents(name, rev)
num_found = found.documents.count()
if num_found == 0:
raise Http404("Document not found: %s" % name)
if num_found > 1:
raise Http404("Multiple documents matched: %s" % name)
doc = found.documents.get()
if found.matched_rev or found.matched_name.startswith('rfc'):
rev = found.matched_rev
else:
rev = doc.rev
if rev:
doc = doc.history_set.filter(rev=rev).first() or doc.fake_history_obj(rev)
base_path = os.path.join(settings.INTERNET_ALL_DRAFTS_ARCHIVE_DIR, doc.name + "-" + doc.rev + ".")
possible_types = settings.IDSUBMIT_FILE_TYPES
found_types=dict()
for t in possible_types:
if os.path.exists(base_path + t):
found_types[t]=base_path+t
if ext is None:
ext = 'txt'
if ext not in found_types:
raise Http404("Don't have the file for that extension")
mimetypes = {'txt':'text/plain','html':'text/html','xml':'application/xml'}
try:
with open(found_types[ext],'rb') as f:
blob = f.read()
return HttpResponse(blob,content_type=f'{mimetypes[ext]};charset=utf-8')
except OSError:
raise Http404
def document_html(request, name, rev=None):
found = fuzzy_find_documents(name, rev)
num_found = found.documents.count()
@@ -731,9 +769,6 @@ def document_html(request, name, rev=None):
return redirect('ietf.doc.views_doc.document_html', name=found.matched_name)
doc = found.documents.get()
if not os.path.exists(doc.get_file_name()):
raise Http404("File not found: %s" % doc.get_file_name())
if found.matched_rev or found.matched_name.startswith('rfc'):
rev = found.matched_rev
@@ -742,6 +777,9 @@ def document_html(request, name, rev=None):
if rev:
doc = doc.history_set.filter(rev=rev).first() or doc.fake_history_obj(rev)
if not os.path.exists(doc.get_file_name()):
raise Http404("File not found: %s" % doc.get_file_name())
if doc.type_id in ['draft',]:
doc.supermeta = build_doc_supermeta_block(doc)
doc.meta = build_doc_meta_block(doc, settings.HTMLIZER_URL_PREFIX)
@@ -767,6 +805,36 @@ def document_html(request, name, rev=None):
return render(request, "doc/document_html.html", {"doc":doc, "doccolor":doccolor })
def document_pdfized(request, name, rev=None, ext=None):
found = fuzzy_find_documents(name, rev)
num_found = found.documents.count()
if num_found == 0:
raise Http404("Document not found: %s" % name)
if num_found > 1:
raise Http404("Multiple documents matched: %s" % name)
if found.matched_name.startswith('rfc') and name != found.matched_name:
return redirect('ietf.doc.views_doc.document_pdfized', name=found.matched_name)
doc = found.documents.get()
if found.matched_rev or found.matched_name.startswith('rfc'):
rev = found.matched_rev
else:
rev = doc.rev
if rev:
doc = doc.history_set.filter(rev=rev).first() or doc.fake_history_obj(rev)
if not os.path.exists(doc.get_file_name()):
raise Http404("File not found: %s" % doc.get_file_name())
pdf = doc.pdfized()
if pdf:
return HttpResponse(pdf,content_type='application/pdf;charset=utf-8')
else:
raise Http404
def check_doc_email_aliases():
pattern = re.compile(r'^expand-(.*?)(\..*?)?@.*? +(.*)$')
good_count = 0
@@ -1108,6 +1176,10 @@ def document_ballot_content(request, doc, ballot_id, editable=True):
positions = ballot.all_positions()
# put into position groups
#
# Each position group is a tuple (BallotPositionName, [BallotPositionDocEvent, ...])
# The list contains the latest entry for each AD, possibly with a fake 'no record' entry
# for any ADs without an event. Blocking positions are earlier in the list than non-blocking.
position_groups = []
for n in BallotPositionName.objects.filter(slug__in=[p.pos_id for p in positions]).order_by('order'):
g = (n, [p for p in positions if p.pos_id == n.slug])
@@ -1773,7 +1845,7 @@ def idnits2_state(request, name, rev=None):
else:
text = doc.text()
if text:
parsed_draft = Draft(text=doc.text(), source=name, name_from_source=False)
parsed_draft = PlaintextDraft(text=doc.text(), source=name, name_from_source=False)
doc.deststatus = parsed_draft.get_status()
else:
doc.deststatus="Unknown"

View file

@@ -44,6 +44,7 @@ from ietf.review.utils import (active_review_teams, assign_review_request_to_rev
close_review_request_states,
close_review_request)
from ietf.review import mailarch
from ietf.utils import log
from ietf.utils.fields import DatepickerDateField
from ietf.utils.text import strip_prefix, xslugify
from ietf.utils.textupload import get_cleaned_text_file_content
@@ -621,9 +622,13 @@ class CompleteReviewForm(forms.Form):
url = self.cleaned_data['review_url']
#scheme, netloc, path, parameters, query, fragment = urlparse(url)
if url:
r = requests.get(url)
try:
r = requests.get(url, timeout=settings.DEFAULT_REQUESTS_TIMEOUT)
except requests.Timeout as exc:
log.log(f'GET request timed out for [{url}]: {exc}')
raise forms.ValidationError("Trying to retrieve the URL resulted in a request timeout. Please provide a URL that can be retrieved.") from exc
if r.status_code != 200:
raise forms.ValidationError("Trying to retrieve the URL resulted in status code %s: %s. Please provide an URL that can be retrieved." % (r.status_code, r.reason))
raise forms.ValidationError("Trying to retrieve the URL resulted in status code %s: %s. Please provide a URL that can be retrieved." % (r.status_code, r.reason))
return url
def clean(self):

View file

@@ -66,6 +66,7 @@ class GroupForm(forms.Form):
list_email = forms.CharField(max_length=64, required=False)
list_subscribe = forms.CharField(max_length=255, required=False)
list_archive = forms.CharField(max_length=255, required=False)
description = forms.CharField(widget=forms.Textarea, required=False, help_text='Text that appears on the "about" page.')
urls = forms.CharField(widget=forms.Textarea, label="Additional URLs", help_text="Format: https://site/path (Optional description). Separate multiple entries with newline. Prefer HTTPS URLs where possible.", required=False)
resources = forms.CharField(widget=forms.Textarea, label="Additional Resources", help_text="Format: tag value (Optional description). Separate multiple entries with newline. Prefer HTTPS URLs where possible.", required=False)
closing_note = forms.CharField(widget=forms.Textarea, label="Closing note", required=False)
@@ -103,6 +104,9 @@ class GroupForm(forms.Form):
super(self.__class__, self).__init__(*args, **kwargs)
if not group_features or group_features.has_chartering_process:
self.fields.pop('description') # do not show the description field for chartered groups
for role_slug in self.used_roles:
role_name = RoleName.objects.get(slug=role_slug)
fieldname = '%s_roles'%role_slug

View file

@@ -8,6 +8,7 @@ import datetime
import io
import bleach
from unittest.mock import patch
from pathlib import Path
from pyquery import PyQuery
from tempfile import NamedTemporaryFile
@@ -515,12 +516,16 @@ class GroupEditTests(TestCase):
self.assertTrue(len(q('form .is-invalid')) > 0)
# Ok creation
r = self.client.post(url, dict(acronym="testwg", name="Testing WG", state=bof_state.pk, parent=area.pk))
r = self.client.post(
url,
dict(acronym="testwg", name="Testing WG", state=bof_state.pk, parent=area.pk, description="ignored"),
)
self.assertEqual(r.status_code, 302)
self.assertEqual(len(Group.objects.filter(type="wg")), num_wgs + 1)
group = Group.objects.get(acronym="testwg")
self.assertEqual(group.name, "Testing WG")
self.assertEqual(charter_name_for_group(group), "charter-ietf-testwg")
self.assertEqual(group.description, '', 'Description should be ignored for a WG')
def test_create_rg(self):
@@ -579,6 +584,28 @@ class GroupEditTests(TestCase):
# self.assertEqual(Group.objects.get(acronym=group.acronym).state_id, "proposed")
# self.assertEqual(Group.objects.get(acronym=group.acronym).name, "Test")
def test_create_non_chartered_includes_description(self):
parent = GroupFactory(type_id='area')
group_type = GroupTypeName.objects.filter(used=True, features__has_chartering_process=False).first()
self.assertIsNotNone(group_type)
url = urlreverse('ietf.group.views.edit', kwargs=dict(group_type=group_type.slug, action="create"))
login_testing_unauthorized(self, "secretary", url)
r = self.client.post(
url,
{
'acronym': "testgrp",
'name': "Testing",
'state': GroupStateName.objects.get(slug='active').pk,
'parent': parent.pk,
'description': "not ignored",
},
)
self.assertEqual(r.status_code, 302)
group = Group.objects.get(acronym="testgrp")
self.assertEqual(group.name, "Testing")
self.assertEqual(group.description, 'not ignored', 'Description should not be ignored')
def test_edit_info(self):
group = GroupFactory(acronym='mars',parent=GroupFactory(type_id='area'))
CharterFactory(group=group)
@@ -640,6 +667,7 @@ class GroupEditTests(TestCase):
list_email="mars@mail",
list_subscribe="subscribe.mars",
list_archive="archive.mars",
description='ignored'
))
self.assertEqual(r.status_code, 302)
@@ -658,6 +686,7 @@ class GroupEditTests(TestCase):
self.assertEqual(group.list_email, "mars@mail")
self.assertEqual(group.list_subscribe, "subscribe.mars")
self.assertEqual(group.list_archive, "archive.mars")
self.assertEqual(group.description, '')
self.assertTrue((Path(settings.CHARTER_PATH) / ("%s-%s.txt" % (group.charter.canonical_name(), group.charter.rev))).exists())
self.assertEqual(len(outbox), 2)
@@ -843,6 +872,60 @@ class GroupEditTests(TestCase):
self.assertEqual(review_assignment.state_id, 'accepted')
self.assertEqual(other_review_assignment.state_id, 'assigned')
def test_edit_info_non_chartered_includes_description(self):
group_type = GroupTypeName.objects.filter(used=True, features__has_chartering_process=False).first()
self.assertIsNotNone(group_type)
group = GroupFactory(type_id=group_type.pk, description='Original description')
url = urlreverse('ietf.group.views.edit', kwargs={'acronym': group.acronym, 'action': 'edit'})
PersonFactory(user__username='plain')
self.client.login(username='plain', password='plain+password')
# mock the auth check so we don't have to delve into details of GroupFeatures for testing
with patch('ietf.group.views.can_manage_group', return_value=True):
r = self.client.get(url)
self.assertEqual(r.status_code, 200)
q = PyQuery(r.content)
self.assertTrue(q('textarea[name="description"]'))
with patch('ietf.group.views.can_manage_group', return_value=True):
r = self.client.post(url, {
'name': group.name,
'acronym': group.acronym,
'state': group.state.pk,
'description': 'Updated description',
})
self.assertEqual(r.status_code, 302)
group = Group.objects.get(pk=group.pk) # refresh
self.assertEqual(group.description, 'Updated description')
def test_edit_description_field(self):
group_type = GroupTypeName.objects.filter(used=True, features__has_chartering_process=False).first()
self.assertIsNotNone(group_type)
group = GroupFactory(type_id=group_type.pk, description='Original description')
url = urlreverse('ietf.group.views.edit',
kwargs={'acronym': group.acronym, 'action': 'edit', 'field': 'description'})
PersonFactory(user__username='plain')
self.client.login(username='plain', password='plain+password')
# mock the auth check so we don't have to delve into details of GroupFeatures for testing
with patch('ietf.group.views.can_manage_group', return_value=True):
r = self.client.post(url, {
'description': 'Updated description',
})
self.assertEqual(r.status_code, 302)
group = Group.objects.get(pk=group.pk) # refresh
self.assertEqual(group.description, 'Updated description')
# Convert the group to a chartered type and repeat - should no longer be able to edit the desc
group.type = GroupTypeName.objects.filter(used=True, features__has_chartering_process=True).first()
group.save()
with patch('ietf.group.views.can_manage_group', return_value=True):
r = self.client.post(url, {
'description': 'Ignored description',
})
self.assertEqual(r.status_code, 302)
group = Group.objects.get(pk=group.pk) # refresh
self.assertEqual(group.description, 'Updated description')
def test_conclude(self):
group = GroupFactory(acronym="mars")
@@ -1036,6 +1119,32 @@ class GroupFormTests(TestCase):
self.assertTrue(form.is_valid())
self._assert_cleaned_data_equal(form.cleaned_data, data)
def test_no_description_field_for_chartered_groups(self):
group = GroupFactory()
self.assertTrue(
group.features.has_chartering_process,
'Group type must have has_chartering_process=True for this test',
)
self.assertNotIn('description', GroupForm(group=group).fields)
self.assertNotIn('description', GroupForm(group_type=group.type).fields)
self.assertNotIn('description', GroupForm(group=group, group_type=group.type).fields)
self.assertNotIn('description', GroupForm(data={'description': 'blah'}, group=group).fields)
self.assertNotIn('description', GroupForm(data={'description': 'blah'}, group_type=group.type).fields)
self.assertNotIn('description', GroupForm(data={'description': 'blah'}, group=group, group_type=group.type).fields)
def test_have_description_field_for_non_chartered_groups(self):
group = GroupFactory(type_id='dir')
self.assertFalse(
group.features.has_chartering_process,
'Group type must have has_chartering_process=False for this test',
)
self.assertIn('description', GroupForm(group=group).fields)
self.assertIn('description', GroupForm(group_type=group.type).fields)
self.assertIn('description', GroupForm(group=group, group_type=group.type).fields)
self.assertIn('description', GroupForm(data={'description': 'blah'}, group=group).fields)
self.assertIn('description', GroupForm(data={'description': 'blah'}, group_type=group.type).fields)
self.assertIn('description', GroupForm(data={'description': 'blah'}, group=group, group_type=group.type).fields)
class MilestoneTests(TestCase):
def create_test_milestones(self):

View file

@@ -12,17 +12,10 @@ from django.urls import reverse as urlreverse
from ietf.review.policies import get_reviewer_queue_policy
from ietf.utils.test_utils import login_testing_unauthorized, TestCase, reload_db_objects
from ietf.doc.models import TelechatDocEvent, LastCallDocEvent, State
from ietf.group.models import Role
from ietf.iesg.models import TelechatDate
from ietf.person.models import Person
from ietf.review.models import ( ReviewerSettings, UnavailablePeriod, ReviewSecretarySettings,
ReviewTeamSettings, NextReviewerInTeam )
from ietf.review.utils import (
suggested_review_requests_for_team,
review_assignments_needing_reviewer_reminder, email_reviewer_reminder,
review_assignments_needing_secretary_reminder, email_secretary_reminder,
send_unavaibility_period_ending_reminder, send_reminder_all_open_reviews,
send_review_reminder_overdue_assignment, send_reminder_unconfirmed_assignments)
from ietf.review.models import ReviewerSettings, UnavailablePeriod, ReviewSecretarySettings, NextReviewerInTeam
from ietf.review.utils import suggested_review_requests_for_team
from ietf.name.models import ReviewResultName, ReviewRequestStateName, ReviewAssignmentStateName, \
ReviewTypeName
import ietf.group.views
@@ -701,199 +694,6 @@ class ReviewTests(TestCase):
self.assertEqual(settings.max_items_to_show_in_reviewer_list, 10)
self.assertEqual(settings.days_to_show_in_reviewer_list, 365)
def test_review_reminders(self):
review_req = ReviewRequestFactory(state_id='assigned')
reviewer = RoleFactory(name_id='reviewer',group=review_req.team,person__user__username='reviewer').person
assignment = ReviewAssignmentFactory(review_request=review_req, state_id='assigned', assigned_on = review_req.time, reviewer=reviewer.email_set.first())
RoleFactory(name_id='secr',group=review_req.team,person__user__username='reviewsecretary')
ReviewerSettingsFactory(team = review_req.team, person = reviewer)
remind_days = 6
reviewer_settings = ReviewerSettings.objects.get(team=review_req.team, person=reviewer)
reviewer_settings.remind_days_before_deadline = remind_days
reviewer_settings.save()
secretary = Person.objects.get(user__username="reviewsecretary")
secretary_role = Role.objects.get(group=review_req.team, name="secr", person=secretary)
secretary_settings = ReviewSecretarySettings(team=review_req.team, person=secretary)
secretary_settings.remind_days_before_deadline = remind_days
secretary_settings.save()
today = datetime.date.today()
review_req.reviewer = reviewer.email_set.first()
review_req.deadline = today + datetime.timedelta(days=remind_days)
review_req.save()
# reviewer
needing_reminders = review_assignments_needing_reviewer_reminder(today - datetime.timedelta(days=1))
self.assertEqual(list(needing_reminders), [])
needing_reminders = review_assignments_needing_reviewer_reminder(today)
self.assertEqual(list(needing_reminders), [assignment])
needing_reminders = review_assignments_needing_reviewer_reminder(today + datetime.timedelta(days=1))
self.assertEqual(list(needing_reminders), [])
# secretary
needing_reminders = review_assignments_needing_secretary_reminder(today - datetime.timedelta(days=1))
self.assertEqual(list(needing_reminders), [])
needing_reminders = review_assignments_needing_secretary_reminder(today)
self.assertEqual(list(needing_reminders), [(assignment, secretary_role)])
needing_reminders = review_assignments_needing_secretary_reminder(today + datetime.timedelta(days=1))
self.assertEqual(list(needing_reminders), [])
# email reviewer
empty_outbox()
email_reviewer_reminder(assignment)
self.assertEqual(len(outbox), 1)
self.assertTrue(review_req.doc.name in get_payload_text(outbox[0]))
# email secretary
empty_outbox()
email_secretary_reminder(assignment, secretary_role)
self.assertEqual(len(outbox), 1)
self.assertTrue(review_req.doc.name in get_payload_text(outbox[0]))
def test_send_unavaibility_period_ending_reminder(self):
review_team = ReviewTeamFactory(acronym="reviewteam", name="Review Team", type_id="review",
list_email="reviewteam@ietf.org")
reviewer = RoleFactory(group=review_team, person__user__username='reviewer',
person__user__email='reviewer@example.com',
person__name='Some Reviewer', name_id='reviewer')
secretary = RoleFactory(group=review_team, person__user__username='reviewsecretary',
person__user__email='reviewsecretary@example.com', name_id='secr')
empty_outbox()
today = datetime.date.today()
UnavailablePeriod.objects.create(
team=review_team,
person=reviewer.person,
start_date=today - datetime.timedelta(days=40),
end_date=today + datetime.timedelta(days=3),
availability="unavailable",
)
UnavailablePeriod.objects.create(
team=review_team,
person=reviewer.person,
# This object should be ignored, length is too short
start_date=today - datetime.timedelta(days=20),
end_date=today + datetime.timedelta(days=3),
availability="unavailable",
)
UnavailablePeriod.objects.create(
team=review_team,
person=reviewer.person,
start_date=today - datetime.timedelta(days=40),
# This object should be ignored, end date is too far away
end_date=today + datetime.timedelta(days=4),
availability="unavailable",
)
UnavailablePeriod.objects.create(
team=review_team,
person=reviewer.person,
# This object should be ignored, end date is too close
start_date=today - datetime.timedelta(days=40),
end_date=today + datetime.timedelta(days=2),
availability="unavailable",
)
log = send_unavaibility_period_ending_reminder(today)
self.assertEqual(len(outbox), 1)
self.assertTrue(reviewer.person.email_address() in outbox[0]["To"])
self.assertTrue(secretary.person.email_address() in outbox[0]["To"])
message = get_payload_text(outbox[0])
self.assertTrue(reviewer.person.name in message)
self.assertTrue(review_team.acronym in message)
self.assertEqual(len(log), 1)
self.assertTrue(reviewer.person.name in log[0])
self.assertTrue(review_team.acronym in log[0])
def test_send_review_reminder_overdue_assignment(self):
today = datetime.date.today()
# An assignment that's exactly on the date at which the grace period expires
review_req = ReviewRequestFactory(state_id='assigned', deadline=today - datetime.timedelta(5))
reviewer = RoleFactory(name_id='reviewer', group=review_req.team,person__user__username='reviewer').person
ReviewAssignmentFactory(review_request=review_req, state_id='assigned', assigned_on=review_req.time, reviewer=reviewer.email_set.first())
secretary = RoleFactory(name_id='secr', group=review_req.team, person__user__username='reviewsecretary')
# An assignment that is not yet overdue
not_overdue = today + datetime.timedelta(days=1)
ReviewAssignmentFactory(review_request__team=review_req.team, review_request__state_id='assigned', review_request__deadline=not_overdue, state_id='assigned', assigned_on=not_overdue, reviewer=reviewer.email_set.first())
# An assignment that is overdue but is not past the grace period
in_grace_period = today - datetime.timedelta(days=1)
ReviewAssignmentFactory(review_request__team=review_req.team, review_request__state_id='assigned', review_request__deadline=in_grace_period, state_id='assigned', assigned_on=in_grace_period, reviewer=reviewer.email_set.first())
empty_outbox()
log = send_review_reminder_overdue_assignment(today)
self.assertEqual(len(log), 1)
self.assertEqual(len(outbox), 1)
self.assertTrue(secretary.person.email_address() in outbox[0]["To"])
self.assertEqual(outbox[0]["Subject"], "1 Overdue review for team {}".format(review_req.team.acronym))
message = get_payload_text(outbox[0])
self.assertIn(review_req.team.acronym + ' has 1 accepted or assigned review overdue by at least 5 days.', message)
self.assertIn('Review of {} by {}'.format(review_req.doc.name, reviewer.plain_name()), message)
self.assertEqual(len(log), 1)
self.assertIn(secretary.person.email_address(), log[0])
self.assertIn('1 overdue review', log[0])
def test_send_reminder_all_open_reviews(self):
review_req = ReviewRequestFactory(state_id='assigned')
reviewer = RoleFactory(name_id='reviewer', group=review_req.team,person__user__username='reviewer').person
ReviewAssignmentFactory(review_request=review_req, state_id='assigned', assigned_on=review_req.time, reviewer=reviewer.email_set.first())
RoleFactory(name_id='secr', group=review_req.team, person__user__username='reviewsecretary')
ReviewerSettingsFactory(team=review_req.team, person=reviewer, remind_days_open_reviews=1)
empty_outbox()
today = datetime.date.today()
log = send_reminder_all_open_reviews(today)
self.assertEqual(len(outbox), 1)
self.assertTrue(reviewer.email_address() in outbox[0]["To"])
self.assertEqual(outbox[0]["Subject"], "Reminder: you have 1 open review assignment")
message = get_payload_text(outbox[0])
self.assertTrue(review_req.team.acronym in message)
self.assertTrue('you have 1 open review' in message)
self.assertTrue(review_req.doc.name in message)
self.assertTrue(review_req.deadline.strftime('%Y-%m-%d') in message)
self.assertEqual(len(log), 1)
self.assertTrue(reviewer.email_address() in log[0])
self.assertTrue('1 open review' in log[0])
def test_send_reminder_unconfirmed_assignments(self):
review_req = ReviewRequestFactory(state_id='assigned')
reviewer = RoleFactory(name_id='reviewer', group=review_req.team, person__user__username='reviewer').person
ReviewAssignmentFactory(review_request=review_req, state_id='assigned', assigned_on=review_req.time, reviewer=reviewer.email_set.first())
RoleFactory(name_id='secr', group=review_req.team, person__user__username='reviewsecretary')
today = datetime.date.today()
# By default, these reminders are disabled for all teams.
empty_outbox()
log = send_reminder_unconfirmed_assignments(today)
self.assertEqual(len(outbox), 0)
self.assertFalse(log)
ReviewTeamSettings.objects.update(remind_days_unconfirmed_assignments=1)
empty_outbox()
log = send_reminder_unconfirmed_assignments(today)
self.assertEqual(len(outbox), 1)
self.assertIn(reviewer.email_address(), outbox[0]["To"])
self.assertEqual(outbox[0]["Subject"], "Reminder: you have not responded to a review assignment")
message = get_payload_text(outbox[0])
self.assertIn(review_req.team.acronym, message)
self.assertIn('accept or reject the assignment on', message)
self.assertIn(review_req.doc.name, message)
self.assertEqual(len(log), 1)
self.assertIn(reviewer.email_address(), log[0])
self.assertIn('not accepted/rejected review assignment', log[0])
class BulkAssignmentTests(TestCase):

View file

@@ -7,6 +7,7 @@ import os
from django.db.models import Q
from django.shortcuts import get_object_or_404
from django.utils.html import format_html
from django.utils.safestring import mark_safe
from django.urls import reverse as urlreverse
@@ -15,9 +16,9 @@ import debug # pyflakes:ignore
from ietf.community.models import CommunityList, SearchRule
from ietf.community.utils import reset_name_contains_index_for_rule, can_manage_community_list
from ietf.doc.models import Document, State
from ietf.group.models import Group, RoleHistory, Role, GroupFeatures
from ietf.group.models import Group, RoleHistory, Role, GroupFeatures, GroupEvent
from ietf.ietfauth.utils import has_role
from ietf.name.models import GroupTypeName
from ietf.name.models import GroupTypeName, RoleName
from ietf.person.models import Email
from ietf.review.utils import can_manage_review_requests_for_team
from ietf.utils import log
@@ -279,4 +280,45 @@ def group_features_role_filter(roles, person, feature):
return roles.none()
q = reduce(lambda a,b:a|b, [ Q(person=person, name__slug__in=getattr(t.features, feature)) for t in group_types ])
return roles.filter(q)
def group_attribute_change_desc(attr, new, old=None):
if old is None:
return format_html('{} changed to <b>{}</b>', attr, new)
else:
return format_html('{} changed to <b>{}</b> from {}', attr, new, old)
def update_role_set(group, role_name, new_value, by):
"""Alter role_set for a group
Updates the value and creates history events.
"""
if isinstance(role_name, str):
role_name = RoleName.objects.get(slug=role_name)
new = set(new_value)
old = set(r.email for r in group.role_set.filter(name=role_name).distinct().select_related("person"))
removed = old - new
added = new - old
if added or removed:
GroupEvent.objects.create(
group=group,
by=by,
type='info_changed',
desc=group_attribute_change_desc(
role_name.name,
", ".join(sorted(x.get_name() for x in new)),
", ".join(sorted(x.get_name() for x in old)),
)
)
group.role_set.filter(name=role_name, email__in=removed).delete()
for email in added:
group.role_set.create(name=role_name, email=email, person=email.person)
for e in new:
if not e.origin or (e.person.user and e.origin == e.person.user.username):
e.origin = "role: %s %s" % (group.acronym, role_name.slug)
e.save()
return added, removed
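The heart of the new `update_role_set` helper is plain set arithmetic over the role's email addresses. As a hedged illustration (plain strings stand in for the `Email` objects the real helper works with, which is a simplification for the sketch):

```python
def diff_role_members(old_members, new_members):
    """Mirror the set arithmetic in update_role_set: return (added, removed)."""
    old = set(old_members)
    new = set(new_members)
    return new - old, old - new

# Chair kept, one member dropped, one member added.
added, removed = diff_role_members(
    ["chair@example.com", "outgoing@example.com"],
    ["chair@example.com", "incoming@example.com"],
)
```

Because only the members in `removed` are deleted and only those in `added` are created, unchanged roles keep their existing database rows and history events are only written when the sets actually differ.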

View file

@@ -38,7 +38,6 @@ import copy
import datetime
import itertools
import io
import markdown
import math
import os
import re
@@ -77,9 +76,9 @@ from ietf.group.models import ( Group, Role, GroupEvent, GroupStateTransitions,
ChangeStateGroupEvent, GroupFeatures )
from ietf.group.utils import (get_charter_text, can_manage_all_groups_of_type,
milestone_reviewer_for_group_type, can_provide_status_update,
can_manage_materials,
can_manage_materials, group_attribute_change_desc,
construct_group_menu_context, get_group_materials,
save_group_in_history, can_manage_group,
save_group_in_history, can_manage_group, update_role_set,
get_group_or_404, setup_default_community_list_for_group, )
#
from ietf.ietfauth.utils import has_role, is_authorized_in_group
@@ -121,7 +120,7 @@ from ietf.settings import MAILING_LIST_INFO_URL
from ietf.utils.pipe import pipe
from ietf.utils.response import permission_denied
from ietf.utils.text import strip_suffix
from ietf.utils import markdown
# --- Helpers ----------------------------------------------------------
@@ -581,7 +580,7 @@ def group_about_rendertest(request, acronym, group_type=None):
if group.charter:
charter = get_charter_text(group)
try:
rendered = markdown.markdown(charter, extensions=['extra'])
rendered = markdown.markdown(charter)
except Exception as e:
rendered = f'Markdown rendering failed: {e}'
return render(request, 'group/group_about_rendertest.html', {'group':group, 'charter':charter, 'rendered':rendered})
@@ -873,13 +872,6 @@ def group_photos(request, group_type=None, acronym=None):
def edit(request, group_type=None, acronym=None, action="edit", field=None):
"""Edit or create a group, notifying parties as
necessary and logging changes as group events."""
def desc(attr, new, old):
entry = "%(attr)s changed to <b>%(new)s</b> from %(old)s"
if new_group:
entry = "%(attr)s changed to <b>%(new)s</b>"
return entry % dict(attr=attr, new=new, old=old)
def format_resources(resources, fs="\n"):
res = []
for r in resources:
@@ -892,11 +884,15 @@ def edit(request, group_type=None, acronym=None, action="edit", field=None):
return fs.join(res)
def diff(attr, name):
if field and attr != field:
if attr not in clean or (field and attr != field):
return
v = getattr(group, attr)
if clean[attr] != v:
changes.append((attr, clean[attr], desc(name, clean[attr], v)))
changes.append((
attr,
clean[attr],
group_attribute_change_desc(name, clean[attr], v if v else None)
))
setattr(group, attr, clean[attr])
if action == "edit":
@@ -955,6 +951,7 @@ def edit(request, group_type=None, acronym=None, action="edit", field=None):
diff('name', "Name")
diff('acronym', "Acronym")
diff('state', "State")
diff('description', "Description")
diff('parent', "IETF Area" if group.type=="wg" else "Group parent")
diff('list_email', "Mailing list email")
diff('list_subscribe', "Mailing list subscribe address")
@@ -972,47 +969,33 @@ def edit(request, group_type=None, acronym=None, action="edit", field=None):
title = f.label
new = clean[attr]
old = Email.objects.filter(role__group=group, role__name=slug).select_related("person")
if set(new) != set(old):
changes.append((attr, new, desc(title,
", ".join(sorted(x.get_name() for x in new)),
", ".join(sorted(x.get_name() for x in old)))))
group.role_set.filter(name=slug).delete()
for e in new:
Role.objects.get_or_create(name_id=slug, email=e, group=group, person=e.person)
if not e.origin or (e.person.user and e.origin == e.person.user.username):
e.origin = "role: %s %s" % (group.acronym, slug)
e.save()
added, deleted = update_role_set(group, slug, clean[attr], request.user.person)
changed_personnel.update(added | deleted)
if added:
change_text=title + ' added: ' + ", ".join(x.name_and_email() for x in added)
personnel_change_text+=change_text+"\n"
if deleted:
change_text=title + ' deleted: ' + ", ".join(x.name_and_email() for x in deleted)
personnel_change_text+=change_text+"\n"
today = datetime.date.today()
for deleted_email in deleted:
# Verify the person doesn't have a separate reviewer role for the group with a different address
if not group.role_set.filter(name_id='reviewer',person=deleted_email.person).exists():
active_assignments = ReviewAssignment.objects.filter(
review_request__team=group,
reviewer__person=deleted_email.person,
state_id__in=['accepted', 'assigned'],
)
for assignment in active_assignments:
if assignment.review_request.deadline > today:
assignment.state_id = 'withdrawn'
else:
assignment.state_id = 'no-response'
# save() will update review_request state to 'requested'
# if needed, so that the review can be assigned to someone else
assignment.save()
added = set(new) - set(old)
deleted = set(old) - set(new)
if added:
change_text=title + ' added: ' + ", ".join(x.name_and_email() for x in added)
personnel_change_text+=change_text+"\n"
if deleted:
change_text=title + ' deleted: ' + ", ".join(x.name_and_email() for x in deleted)
personnel_change_text+=change_text+"\n"
today = datetime.date.today()
for deleted_email in deleted:
# Verify the person doesn't have a separate reviewer role for the group with a different address
if not group.role_set.filter(name_id='reviewer',person=deleted_email.person).exists():
active_assignments = ReviewAssignment.objects.filter(
review_request__team=group,
reviewer__person=deleted_email.person,
state_id__in=['accepted', 'assigned'],
)
for assignment in active_assignments:
if assignment.review_request.deadline > today:
assignment.state_id = 'withdrawn'
else:
assignment.state_id = 'no-response'
# save() will update review_request state to 'requested'
# if needed, so that the review can be assigned to someone else
assignment.save()
changed_personnel.update(set(old)^set(new))
if personnel_change_text!="":
changed_personnel = [ str(p) for p in changed_personnel ]
@@ -1030,7 +1013,15 @@ def edit(request, group_type=None, acronym=None, action="edit", field=None):
value = parts[1]
display_name = ' '.join(parts[2:]).strip('()')
group.groupextresource_set.create(value=value, name_id=name, display_name=display_name)
changes.append(('resources', new_resources, desc('Resources', ", ".join(new_resources), ", ".join(old_resources))))
changes.append((
'resources',
new_resources,
group_attribute_change_desc(
'Resources',
", ".join(new_resources),
", ".join(old_resources) if old_resources else None
)
))
group.time = datetime.datetime.now()
@@ -1077,6 +1068,7 @@ def edit(request, group_type=None, acronym=None, action="edit", field=None):
init = dict(name=group.name,
acronym=group.acronym,
state=group.state,
description = group.description,
parent=group.parent.id if group.parent else None,
list_email=group.list_email if group.list_email else None,
list_subscribe=group.list_subscribe if group.list_subscribe else None,

View file

@@ -27,6 +27,7 @@ from urllib.parse import urlsplit
from django.urls import reverse as urlreverse
from django.contrib.auth.models import User
from django.conf import settings
from django.template.loader import render_to_string
import debug # pyflakes:ignore
@@ -94,6 +95,7 @@ class IetfAuthTests(TestCase):
# try logging out
r = self.client.get(urlreverse('django.contrib.auth.views.logout'))
self.assertEqual(r.status_code, 200)
self.assertNotContains(r, "accounts/logout")
r = self.client.get(urlreverse(ietf.ietfauth.views.profile))
self.assertEqual(r.status_code, 302)
@@ -138,20 +140,26 @@
return False
def test_create_account_failure(self):
# For the lowered barrier to account creation period, we are disabling this kind of failure
# def test_create_account_failure(self):
url = urlreverse(ietf.ietfauth.views.create_account)
# url = urlreverse(ietf.ietfauth.views.create_account)
# get
r = self.client.get(url)
self.assertEqual(r.status_code, 200)
# # get
# r = self.client.get(url)
# self.assertEqual(r.status_code, 200)
# register email and verify failure
email = 'new-account@example.com'
empty_outbox()
r = self.client.post(url, { 'email': email })
self.assertEqual(r.status_code, 200)
self.assertContains(r, "Additional Assistance Required")
# # register email and verify failure
# email = 'new-account@example.com'
# empty_outbox()
# r = self.client.post(url, { 'email': email })
# self.assertEqual(r.status_code, 200)
# self.assertContains(r, "Additional Assistance Required")
# Rather than delete the failure template just yet, here's a test to make sure it still renders should we need to revert to it.
def test_create_account_failure_template(self):
r = render_to_string('registration/manual.html', { 'account_request_email': settings.ACCOUNT_REQUEST_EMAIL })
self.assertTrue("Additional Assistance Required" in r)
def register_and_verify(self, email):
url = urlreverse(ietf.ietfauth.views.create_account)
@@ -655,7 +663,7 @@ class IetfAuthTests(TestCase):
self.assertContains(r, 'Invalid apikey', status_code=403)
# invalid apikey (invalidated api key)
unauthorized_url = urlreverse('ietf.api.views.author_tools')
unauthorized_url = urlreverse('ietf.api.views.app_auth')
invalidated_apikey = PersonalApiKey.objects.create(
endpoint=unauthorized_url, person=person, valid=False)
r = self.client.post(unauthorized_url, {'apikey': invalidated_apikey.hash()})

View file

@@ -36,7 +36,9 @@
import importlib
from datetime import datetime as DateTime, timedelta as TimeDelta, date as Date
from datetime import date as Date
# needed if we revert to higher barrier for account creation
#from datetime import datetime as DateTime, timedelta as TimeDelta, date as Date
from collections import defaultdict
import django.core.signing
@@ -65,7 +67,9 @@ from ietf.ietfauth.forms import ( RegistrationForm, PasswordForm, ResetPasswordF
NewEmailForm, ChangeUsernameForm, PersonPasswordForm)
from ietf.ietfauth.htpasswd import update_htpasswd_file
from ietf.ietfauth.utils import role_required, has_role
from ietf.mailinglists.models import Subscribed, Whitelisted
from ietf.mailinglists.models import Whitelisted
# needed if we revert to higher barrier for account creation
#from ietf.mailinglists.models import Subscribed, Whitelisted
from ietf.name.models import ExtResourceName
from ietf.nomcom.models import NomCom
from ietf.person.models import Person, Email, Alias, PersonalApiKey, PERSON_API_KEY_VALUES
@@ -76,6 +80,9 @@ from ietf.utils.decorators import person_required
from ietf.utils.mail import send_mail
from ietf.utils.validators import validate_external_resource_value
# These are needed if we revert to the higher bar for account creation
def index(request):
return render(request, 'registration/index.html')
@@ -114,13 +121,19 @@ def create_account(request):
form = RegistrationForm(request.POST)
if form.is_valid():
to_email = form.cleaned_data['email'] # This will be lowercase if form.is_valid()
existing = Subscribed.objects.filter(email=to_email).first()
ok_to_create = ( Whitelisted.objects.filter(email=to_email).exists()
or existing and (existing.time + TimeDelta(seconds=settings.LIST_ACCOUNT_DELAY)) < DateTime.now() )
if ok_to_create:
send_account_creation_email(request, to_email)
else:
return render(request, 'registration/manual.html', { 'account_request_email': settings.ACCOUNT_REQUEST_EMAIL })
# For the IETF 113 Registration period (at least) we are lowering the barriers for account creation
# to the simple email round-trip check
send_account_creation_email(request, to_email)
# The following is what to revert to should that lowered barrier prove problematic
# existing = Subscribed.objects.filter(email=to_email).first()
# ok_to_create = ( Whitelisted.objects.filter(email=to_email).exists()
# or existing and (existing.time + TimeDelta(seconds=settings.LIST_ACCOUNT_DELAY)) < DateTime.now() )
# if ok_to_create:
# send_account_creation_email(request, to_email)
# else:
# return render(request, 'registration/manual.html', { 'account_request_email': settings.ACCOUNT_REQUEST_EMAIL })
else:
form = RegistrationForm()

View file

@@ -11,7 +11,7 @@ import pytz
import re
from django.template.loader import render_to_string
from django.utils.encoding import force_text, force_str
from django.utils.encoding import force_text, force_bytes
import debug # pyflakes:ignore
@@ -174,7 +174,7 @@ def process_response_email(msg):
a matching value in the reply_to field, associated to an IPR disclosure through
IprEvent. Create a Message object for the incoming message and associate it to
the original message via new IprEvent"""
message = email.message_from_string(force_str(msg))
message = email.message_from_bytes(force_bytes(msg))
to = message.get('To', '')
# exit if this isn't a response we're interested in (with plus addressing)
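The switch from `email.message_from_string` to `email.message_from_bytes` is what lets `process_response_email` accept raw input that is not valid UTF-8. A minimal sketch of that stdlib behavior (the addresses and body text here are made up for illustration):

```python
import email

# Raw message bytes containing an invalid UTF-8 sequence in the body,
# similar to what the encoding tests below feed the command.
raw = b"To: a@example.com\nSubject: test\n\nInvalid stuff: \xfe\xff\n"
msg = email.message_from_bytes(raw)  # parses despite the bad bytes

# decode=True recovers the original payload bytes; decoding them with
# errors="replace" maps each invalid byte to U+FFFD.
body = msg.get_payload(decode=True).decode("utf8", errors="replace")
```

Parsing the same input as a decoded `str` would instead fail (or mangle the payload) before the message ever reached the IPR-response matching logic.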

View file

@@ -23,15 +23,10 @@ class Command(EmailOnFailureCommand):
def handle(self, *args, **options):
email = options.get('email', None)
if not email:
msg = sys.stdin.read()
self.msg_bytes = msg.encode()
else:
self.msg_bytes = io.open(email, "rb").read()
msg = self.msg_bytes.decode()
binary_input = io.open(email, 'rb') if email else sys.stdin.buffer
self.msg_bytes = binary_input.read()
try:
process_response_email(msg)
process_response_email(self.msg_bytes)
except ValueError as e:
raise CommandError(e)

View file

@@ -2,6 +2,7 @@
# -*- coding: utf-8 -*-
"""Tests of ipr management commands"""
import mock
import sys
from django.core.management import call_command
from django.test.utils import override_settings
@@ -17,16 +18,17 @@ class ProcessEmailTests(TestCase):
with name_of_file_containing('contents') as filename:
call_command('process_email', email_file=filename)
self.assertEqual(process_mock.call_count, 1, 'process_response_email should be called once')
(msg,) = process_mock.call_args.args
self.assertEqual(
process_mock.call_args.args,
('contents',),
msg.decode(),
'contents',
'process_response_email should receive the correct contents'
)
@mock.patch('ietf.utils.management.base.send_smtp')
@mock.patch('ietf.ipr.management.commands.process_email.process_response_email')
def test_send_error_to_admin(self, process_mock, send_smtp_mock):
"""The process_email command should email the admins on error"""
"""The process_email command should email the admins on error in process_response_email"""
# arrange an mock error during processing
process_mock.side_effect = RuntimeError('mock error')
@@ -47,3 +49,30 @@ class ProcessEmailTests(TestCase):
self.assertIn('mock.py', content, 'File where error occurred should be included in error email')
self.assertIn('traceback', traceback.lower(), 'Traceback should be attached to error email')
self.assertEqual(original, 'contents', 'Original message should be attached to error email')
@mock.patch('ietf.utils.management.base.send_smtp')
@mock.patch('ietf.ipr.management.commands.process_email.process_response_email')
def test_invalid_character_encodings(self, process_mock, send_smtp_mock):
"""The process_email command should accept messages with invalid encoding when using a file input"""
invalid_characters = b'\xfe\xff'
with name_of_file_containing(invalid_characters, mode='wb') as filename:
call_command('process_email', email_file=filename)
self.assertFalse(send_smtp_mock.called) # should not send an error email
self.assertTrue(process_mock.called)
(msg,) = process_mock.call_args.args
self.assertEqual(msg, invalid_characters, 'Invalid unicode should be passed to process_email()')
@mock.patch.object(sys.stdin.buffer, 'read')
@mock.patch('ietf.utils.management.base.send_smtp')
@mock.patch('ietf.ipr.management.commands.process_email.process_response_email')
def test_invalid_character_encodings_via_stdin(self, process_mock, send_smtp_mock, stdin_read_mock):
"""The process_email command should attach messages with invalid encoding when using stdin"""
invalid_characters = b'\xfe\xff'
stdin_read_mock.return_value = invalid_characters
call_command('process_email')
self.assertFalse(send_smtp_mock.called) # should not send an error email
self.assertTrue(process_mock.called)
(msg,) = process_mock.call_args.args
self.assertEqual(msg, invalid_characters, 'Invalid unicode should be passed to process_email()')

View file

@@ -592,8 +592,7 @@ I would like to revoke this declaration.
self.assertEqual(len(outbox),2)
self.assertIn('Secretariat on '+ipr.get_latest_event_submitted().time.strftime("%Y-%m-%d"), get_payload_text(outbox[1]).replace('\n',' '))
def test_process_response_email(self):
# first send a mail
def send_ipr_email_helper(self):
ipr = HolderIprDisclosureFactory()
url = urlreverse('ietf.ipr.views.email',kwargs={ "id": ipr.id })
self.client.login(username="secretary", password="secretary+password")
@@ -614,19 +613,25 @@ I would like to revoke this declaration.
self.assertTrue(event.response_past_due())
self.assertEqual(len(outbox), 1)
self.assertTrue('joe@test.com' in outbox[0]['To'])
return data['reply_to'], event
uninteresting_ipr_message_strings = [
("To: {to}\nCc: {cc}\nFrom: joe@test.com\nDate: {date}\nSubject: test\n"),
("Cc: {cc}\nFrom: joe@test.com\nDate: {date}\nSubject: test\n"), # no To
("To: {to}\nFrom: joe@test.com\nDate: {date}\nSubject: test\n"), # no Cc
("From: joe@test.com\nDate: {date}\nSubject: test\n"), # no To or Cc
("Cc: {cc}\nDate: {date}\nSubject: test\n"), # no To
("To: {to}\nDate: {date}\nSubject: test\n"), # no Cc
("Date: {date}\nSubject: test\n"), # no To or Cc
]
def test_process_response_email(self):
# first send a mail
reply_to, event = self.send_ipr_email_helper()
# test process response uninteresting messages
addrs = gather_address_lists('ipr_disclosure_submitted').as_strings()
uninteresting_message_strings = [
("To: {to}\nCc: {cc}\nFrom: joe@test.com\nDate: {date}\nSubject: test\n"),
("Cc: {cc}\nFrom: joe@test.com\nDate: {date}\nSubject: test\n"), # no To
("To: {to}\nFrom: joe@test.com\nDate: {date}\nSubject: test\n"), # no Cc
("From: joe@test.com\nDate: {date}\nSubject: test\n"), # no To or Cc
("Cc: {cc}\nDate: {date}\nSubject: test\n"), # no To
("To: {to}\nDate: {date}\nSubject: test\n"), # no Cc
("Date: {date}\nSubject: test\n"), # no To or Cc
]
for message_string in uninteresting_message_strings:
for message_string in self.uninteresting_ipr_message_strings:
result = process_response_email(
message_string.format(
to=addrs.to,
@ -641,12 +646,41 @@ I would like to revoke this declaration.
From: joe@test.com
Date: {}
Subject: test
""".format(data['reply_to'],datetime.datetime.now().ctime())
""".format(reply_to, datetime.datetime.now().ctime())
result = process_response_email(message_string)
self.assertIsInstance(result,Message)
self.assertIsInstance(result, Message)
self.assertFalse(event.response_past_due())
def test_process_response_email_with_invalid_encoding(self):
"""Interesting emails with invalid encoding should be handled"""
reply_to, _ = self.send_ipr_email_helper()
# test process response
message_string = """To: {}
From: joe@test.com
Date: {}
Subject: test
""".format(reply_to, datetime.datetime.now().ctime())
message_bytes = message_string.encode('utf8') + b'\nInvalid stuff: \xfe\xff\n'
result = process_response_email(message_bytes)
self.assertIsInstance(result, Message)
# \ufffd is the Unicode replacement character (a diamond containing a question mark), used to replace invalid bytes
self.assertEqual(result.body, 'Invalid stuff: \ufffd\ufffd\n\n', # not sure where the extra \n is from
'Invalid characters should be replaced with \ufffd characters')
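The \ufffd substitution asserted above is ordinary `errors='replace'` decoding behavior; the internals of `process_response_email` are not shown here, so this is only a sketch of the replacement the assertion depends on:

```python
# Each invalid UTF-8 byte is replaced by U+FFFD when decoding with errors='replace'
raw = b'Invalid stuff: \xfe\xff'
decoded = raw.decode('utf8', errors='replace')
print(decoded)  # Invalid stuff: \ufffd\ufffd (shown as two replacement characters)
```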
def test_process_response_email_uninteresting_with_invalid_encoding(self):
"""Uninteresting emails with invalid encoding should be quietly dropped"""
self.send_ipr_email_helper()
addrs = gather_address_lists('ipr_disclosure_submitted').as_strings()
for message_string in self.uninteresting_ipr_message_strings:
message_bytes = message_string.format(
to=addrs.to,
cc=addrs.cc,
date=datetime.datetime.now().ctime(),
).encode('utf8') + b'\nInvalid stuff: \xfe\xff\n'
result = process_response_email(message_bytes)
self.assertIsNone(result)
def test_ajax_search(self):
url = urlreverse('ietf.ipr.views.ajax_search')
response=self.client.get(url+'?q=disclosure')

View file

@ -96,10 +96,10 @@ class SchedulingEventInline(admin.TabularInline):
raw_id_fields = ["by"]
class SessionAdmin(admin.ModelAdmin):
list_display = ["meeting", "name", "group", "attendees", "requested", "current_status"]
list_filter = ["meeting", ]
list_display = ["meeting", "name", "group_acronym", "purpose", "attendees", "requested", "current_status"]
list_filter = ["purpose", "meeting", ]
raw_id_fields = ["meeting", "group", "materials", "joint_with_groups", "tombstone_for"]
search_fields = ["meeting__number", "name", "group__name", "group__acronym", ]
search_fields = ["meeting__number", "name", "group__name", "group__acronym", "purpose__name"]
ordering = ["-id"]
inlines = [SchedulingEventInline]
@ -108,10 +108,13 @@ class SessionAdmin(admin.ModelAdmin):
qs = super(SessionAdmin, self).get_queryset(request)
return qs.prefetch_related('schedulingevent_set')
def group_acronym(self, instance):
return instance.group and instance.group.acronym
def current_status(self, instance):
events = sorted(instance.schedulingevent_set.all(), key=lambda e: (e.time, e.id))
if events:
return events[-1].time
return f'{events[-1].status} ({events[-1].time:%Y-%m-%d %H:%M})'
else:
return None

View file

@ -6,6 +6,9 @@ import io
import os
import datetime
import json
import re
from pathlib import Path
from django import forms
from django.conf import settings
@ -13,6 +16,7 @@ from django.core import validators
from django.core.exceptions import ValidationError
from django.db.models import Q
from django.forms import BaseInlineFormSet
from django.utils.functional import cached_property
import debug # pyflakes:ignore
@ -58,21 +62,18 @@ def duration_string(duration):
'''Custom duration_string to return HH:MM (no seconds)'''
days = duration.days
seconds = duration.seconds
microseconds = duration.microseconds
minutes = seconds // 60
seconds = seconds % 60
hours = minutes // 60
minutes = minutes % 60
string = '{:02d}:{:02d}'.format(hours, minutes)
if days:
string = '{} '.format(days) + string
if microseconds:
string += '.{:06d}'.format(microseconds)
return string
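With the microseconds branch removed, the helper reduces to HH:MM with an optional day prefix. A self-contained sketch of the resulting behavior (equivalent arithmetic, written with `divmod` for brevity):

```python
import datetime

def duration_string(duration):
    '''Custom duration_string to return HH:MM (no seconds)'''
    minutes, _ = divmod(duration.seconds, 60)
    hours, minutes = divmod(minutes, 60)
    string = '{:02d}:{:02d}'.format(hours, minutes)
    if duration.days:
        string = '{} '.format(duration.days) + string
    return string

print(duration_string(datetime.timedelta(hours=1, minutes=30)))  # 01:30
print(duration_string(datetime.timedelta(days=2, minutes=5)))    # 2 00:05
```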
# -------------------------------------------------
# Forms
# -------------------------------------------------
@ -104,18 +105,34 @@ class InterimSessionInlineFormSet(BaseInlineFormSet):
return # formset doesn't have cleaned_data
class InterimMeetingModelForm(forms.ModelForm):
group = GroupModelChoiceField(queryset=Group.objects.filter(type_id__in=GroupFeatures.objects.filter(has_meetings=True).values_list('type_id',flat=True), state__in=('active', 'proposed', 'bof')).order_by('acronym'), required=False, empty_label="Click to select")
group = GroupModelChoiceField(
queryset=Group.objects.filter(
type_id__in=GroupFeatures.objects.filter(
has_meetings=True
).values_list('type_id',flat=True),
state__in=('active', 'proposed', 'bof')
).order_by('acronym'),
required=False,
empty_label="Click to select",
)
in_person = forms.BooleanField(required=False)
meeting_type = forms.ChoiceField(choices=(
("single", "Single"),
("multi-day", "Multi-Day"),
('series', 'Series')), required=False, initial='single', widget=forms.RadioSelect, help_text='''
meeting_type = forms.ChoiceField(
choices=(
("single", "Single"),
("multi-day", "Multi-Day"),
('series', 'Series')
),
required=False,
initial='single',
widget=forms.RadioSelect,
help_text='''
Use <b>Multi-Day</b> for a single meeting that spans more than one contiguous
workday. Do not use Multi-Day for a series of separate meetings (such as
periodic interim calls). Use Series instead.
Use <b>Series</b> for a series of separate meetings, such as periodic interim calls.
Use Multi-Day for a single meeting that spans more than one contiguous
workday.''')
workday.''',
)
approved = forms.BooleanField(required=False)
city = forms.CharField(max_length=255, required=False)
city.widget.attrs['placeholder'] = "City"
@ -216,10 +233,16 @@ class InterimSessionModelForm(forms.ModelForm):
requested_duration = CustomDurationField(required=True)
end_time = forms.TimeField(required=False, help_text="Local time")
end_time.widget.attrs['placeholder'] = "HH:MM"
remote_instructions = forms.CharField(max_length=1024, required=True, help_text='''
For virtual interims, a conference link <b>should be provided in the original request</b> in all but the most unusual circumstances.
Otherwise, "Remote participation is not supported" or "Remote participation information will be obtained at the time of approval" are acceptable values.
See <a href="https://www.ietf.org/forms/wg-webex-account-request/">here</a> for more on remote participation support.''')
remote_participation = forms.ChoiceField(choices=(), required=False)
remote_instructions = forms.CharField(
max_length=1024,
required=False,
help_text='''
For virtual interims, a conference link <b>should be provided in the original request</b> in all but the most unusual circumstances.
Otherwise, "Remote participation is not supported" or "Remote participation information will be obtained at the time of approval" are acceptable values.
See <a href="https://www.ietf.org/forms/wg-webex-account-request/">here</a> for more on remote participation support.
''',
)
agenda = forms.CharField(required=False, widget=forms.Textarea, strip=False)
agenda.widget.attrs['placeholder'] = "Paste agenda here"
agenda_note = forms.CharField(max_length=255, required=False, label=" Additional information")
@ -246,7 +269,13 @@ class InterimSessionModelForm(forms.ModelForm):
doc = self.instance.agenda()
content = doc.text_or_error()
self.initial['agenda'] = content
# set up remote participation choices
choices = []
if hasattr(settings, 'MEETECHO_API_CONFIG'):
choices.append(('meetecho', 'Automatically create Meetecho conference'))
choices.append(('manual', 'Manually specify remote instructions...'))
self.fields['remote_participation'].choices = choices
def clean_date(self):
'''Date field validator. We can't use required on the input because
@ -264,6 +293,21 @@ class InterimSessionModelForm(forms.ModelForm):
raise forms.ValidationError('Provide a duration, %s-%smin.' % (min_minutes, max_minutes))
return duration
def clean(self):
if self.cleaned_data.get('remote_participation', None) == 'meetecho':
self.cleaned_data['remote_instructions'] = '' # blank this out if we're creating a Meetecho conference
elif not self.cleaned_data.get('remote_instructions'):
self.add_error('remote_instructions', 'This field is required')
return self.cleaned_data
# Override to ignore the non-model 'remote_participation' field when computing has_changed()
@cached_property
def changed_data(self):
data = super().changed_data
if 'remote_participation' in data:
data.remove('remote_participation')
return data
def save(self, *args, **kwargs):
"""NOTE: as the baseform of an inlineformset self.save(commit=True)
never gets called"""
@ -279,6 +323,7 @@ class InterimSessionModelForm(forms.ModelForm):
if self.instance.agenda():
doc = self.instance.agenda()
doc.rev = str(int(doc.rev) + 1).zfill(2)
doc.uploaded_filename = doc.filename_with_rev()
e = NewRevisionDocEvent.objects.create(
type='new_revision',
by=self.user.person,
@ -339,14 +384,19 @@ class InterimCancelForm(forms.Form):
self.fields['date'].widget.attrs['disabled'] = True
class FileUploadForm(forms.Form):
"""Base class for FileUploadForms
Abstract base class - subclasses must fill in the doc_type value with
the type of document they handle.
"""
file = forms.FileField(label='File to upload')
doc_type = '' # subclasses must set this
def __init__(self, *args, **kwargs):
doc_type = kwargs.pop('doc_type')
assert doc_type in settings.MEETING_VALID_UPLOAD_EXTENSIONS
self.doc_type = doc_type
self.extensions = settings.MEETING_VALID_UPLOAD_EXTENSIONS[doc_type]
self.mime_types = settings.MEETING_VALID_UPLOAD_MIME_TYPES[doc_type]
assert self.doc_type in settings.MEETING_VALID_UPLOAD_EXTENSIONS
self.extensions = settings.MEETING_VALID_UPLOAD_EXTENSIONS[self.doc_type]
self.mime_types = settings.MEETING_VALID_UPLOAD_MIME_TYPES[self.doc_type]
super(FileUploadForm, self).__init__(*args, **kwargs)
label = '%s file to upload. ' % (self.doc_type.capitalize(), )
if self.doc_type == "slides":
@ -359,6 +409,15 @@ class FileUploadForm(forms.Form):
file = self.cleaned_data['file']
validate_file_size(file)
ext = validate_file_extension(file, self.extensions)
# override the Content-Type if needed
if file.content_type == 'application/octet-stream':
content_type_map = settings.MEETING_APPLICATION_OCTET_STREAM_OVERRIDES
filename = Path(file.name)
if filename.suffix in content_type_map:
file.content_type = content_type_map[filename.suffix]
self.cleaned_data['file'] = file
mime_type, encoding = validate_mime_type(file, self.mime_types)
if not hasattr(self, 'file_encoding'):
self.file_encoding = {}
@ -366,15 +425,76 @@ class FileUploadForm(forms.Form):
if self.mime_types:
if not file.content_type in settings.MEETING_VALID_UPLOAD_MIME_FOR_OBSERVED_MIME[mime_type]:
raise ValidationError('Upload Content-Type (%s) is different from the observed mime-type (%s)' % (file.content_type, mime_type))
if mime_type in settings.MEETING_VALID_MIME_TYPE_EXTENSIONS:
if not ext in settings.MEETING_VALID_MIME_TYPE_EXTENSIONS[mime_type]:
# We just validated that file.content_type is safe to accept despite being identified
# as a different MIME type by the validator. Check extension based on file.content_type
# because that better reflects the intention of the upload client.
if file.content_type in settings.MEETING_VALID_MIME_TYPE_EXTENSIONS:
if not ext in settings.MEETING_VALID_MIME_TYPE_EXTENSIONS[file.content_type]:
raise ValidationError('Upload Content-Type (%s) does not match the extension (%s)' % (file.content_type, ext))
if mime_type in ['text/html', ] or ext in settings.MEETING_VALID_MIME_TYPE_EXTENSIONS['text/html']:
if (file.content_type in ['text/html', ]
or ext in settings.MEETING_VALID_MIME_TYPE_EXTENSIONS.get('text/html', [])):
# We'll do html sanitization later, but for frames, we fail here,
# as the sanitized version will most likely be useless.
validate_no_html_frame(file)
return file
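The override step in `clean_file` above remaps only the generic `application/octet-stream` type, keyed on the filename suffix. A standalone sketch (the mapping values here are illustrative; the real ones come from `settings.MEETING_APPLICATION_OCTET_STREAM_OVERRIDES`):

```python
from pathlib import Path

# Illustrative mapping, standing in for
# settings.MEETING_APPLICATION_OCTET_STREAM_OVERRIDES
OVERRIDES = {'.md': 'text/markdown', '.txt': 'text/plain'}

def effective_content_type(filename, content_type):
    """Remap only the generic octet-stream type, based on the file's suffix."""
    if content_type == 'application/octet-stream':
        suffix = Path(filename).suffix
        if suffix in OVERRIDES:
            return OVERRIDES[suffix]
    return content_type

print(effective_content_type('minutes.md', 'application/octet-stream'))  # text/markdown
print(effective_content_type('minutes.md', 'application/json'))          # application/json
```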
class UploadBlueSheetForm(FileUploadForm):
doc_type = 'bluesheets'
class ApplyToAllFileUploadForm(FileUploadForm):
"""FileUploadField that adds an apply_to_all checkbox
Checkbox can be disabled by passing show_apply_to_all_checkbox=False to the constructor.
This entirely removes the field from the form.
"""
# Note: subclasses must set doc_type for FileUploadForm
apply_to_all = forms.BooleanField(label='Apply to all group sessions at this meeting',initial=True,required=False)
def __init__(self, show_apply_to_all_checkbox, *args, **kwargs):
super().__init__(*args, **kwargs)
if not show_apply_to_all_checkbox:
self.fields.pop('apply_to_all')
else:
self.order_fields(
sorted(
self.fields.keys(),
key=lambda f: 'zzzzzz' if f == 'apply_to_all' else f
)
)
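The `order_fields()` call above relies on sorting with a sentinel key ('zzzzzz') that pushes `apply_to_all` after every other field name. In isolation:

```python
# Sentinel-key sort: 'apply_to_all' sorts as 'zzzzzz', so it lands last
fields = ['apply_to_all', 'file']
ordered = sorted(fields, key=lambda f: 'zzzzzz' if f == 'apply_to_all' else f)
print(ordered)  # ['file', 'apply_to_all']
```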
class UploadMinutesForm(ApplyToAllFileUploadForm):
doc_type = 'minutes'
class UploadAgendaForm(ApplyToAllFileUploadForm):
doc_type = 'agenda'
class UploadSlidesForm(ApplyToAllFileUploadForm):
doc_type = 'slides'
title = forms.CharField(max_length=255)
def __init__(self, session, *args, **kwargs):
super().__init__(*args, **kwargs)
self.session = session
def clean_title(self):
title = self.cleaned_data['title']
# The current tables only handle the Unicode BMP:
if ord(max(title)) > 0xffff:
raise forms.ValidationError("The title contains characters outside the Unicode BMP, which is not currently supported")
if self.session.meeting.type_id=='interim':
if re.search(r'-\d{2}$', title):
raise forms.ValidationError("Interim slides currently may not have a title that ends with something that looks like a revision number (-nn)")
return title
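The two checks in `clean_title` can be exercised standalone; note that `max()` on a `str` compares by code point, so it finds the highest code point in the title (the helper name below is hypothetical):

```python
import re

def check_title(title, is_interim):
    """Sketch of the two title checks; returns a list of validation problems."""
    problems = []
    # max() compares characters by code point, so this finds the highest one
    if ord(max(title)) > 0xffff:
        problems.append('outside Unicode BMP')
    if is_interim and re.search(r'-\d{2}$', title):
        problems.append('ends with a revision-number-like suffix')
    return problems

print(check_title('Transport slides-03', is_interim=True))
# ['ends with a revision-number-like suffix']
print(check_title('Plain title', is_interim=True))  # []
```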
class ImportMinutesForm(forms.Form):
markdown_text = forms.CharField(strip=False, widget=forms.HiddenInput)
class RequestMinutesForm(forms.Form):
to = MultiEmailField()
cc = MultiEmailField(required=False)
@ -560,7 +680,11 @@ class DurationChoiceField(forms.ChoiceField):
return ''
def to_python(self, value):
return datetime.timedelta(seconds=round(float(value))) if value not in self.empty_values else None
if value in self.empty_values or (isinstance(value, str) and not value.isnumeric()):
return None # treat non-numeric values as empty
else:
# noinspection PyTypeChecker
return datetime.timedelta(seconds=round(float(value)))
def valid_value(self, value):
return super().valid_value(self.prepare_value(value))
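The rewritten `to_python` above treats non-numeric strings as empty instead of raising on `float()`. A standalone sketch of that conversion (the `empty_values` default here mirrors Django's, as an assumption):

```python
import datetime

def to_python(value, empty_values=('', None, [], (), {})):
    # Non-numeric strings are treated as empty rather than raising in float()
    if value in empty_values or (isinstance(value, str) and not value.isnumeric()):
        return None
    return datetime.timedelta(seconds=round(float(value)))

print(to_python('5400'))     # 1:30:00
print(to_python('garbage'))  # None
print(to_python(''))         # None
```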
@ -595,11 +719,15 @@ class SessionDetailsForm(forms.ModelForm):
def __init__(self, group, *args, **kwargs):
session_purposes = group.features.session_purposes
kwargs.setdefault('initial', {})
kwargs['initial'].setdefault(
'purpose',
session_purposes[0] if len(session_purposes) > 0 else None,
)
# Default to the first allowed session purpose. Do not do this if we have an instance,
# though, because ModelForm will override instance data with initial data if it gets both.
# When we have an instance we want to keep its value.
if 'instance' not in kwargs:
kwargs.setdefault('initial', {})
kwargs['initial'].setdefault(
'purpose',
session_purposes[0] if len(session_purposes) > 0 else None,
)
super().__init__(*args, **kwargs)
self.fields['type'].widget.attrs.update({
@ -615,18 +743,22 @@ class SessionDetailsForm(forms.ModelForm):
class Meta:
model = Session
fields = (
'name', 'short', 'purpose', 'type', 'requested_duration',
'purpose', 'name', 'short', 'type', 'requested_duration',
'on_agenda', 'remote_instructions', 'attendees', 'comments',
)
labels = {'requested_duration': 'Length'}
def clean(self):
super().clean()
# Fill in on_agenda. If this is a new instance or we have changed its purpose, then use
# the on_agenda value for the purpose. Otherwise, keep the value of an existing instance (if any)
# or leave it blank.
if 'purpose' in self.cleaned_data and (
'purpose' in self.changed_data or self.instance.pk is None
self.instance.pk is None or (self.instance.purpose != self.cleaned_data['purpose'])
):
self.cleaned_data['on_agenda'] = self.cleaned_data['purpose'].on_agenda
elif self.instance.pk is not None:
self.cleaned_data['on_agenda'] = self.instance.on_agenda
return self.cleaned_data
class Media:
@ -642,10 +774,9 @@ class SessionEditForm(SessionDetailsForm):
super().__init__(instance=instance, group=instance.group, *args, **kwargs)
class SessionDetailsInlineFormset(forms.BaseInlineFormSet):
class SessionDetailsInlineFormSet(forms.BaseInlineFormSet):
def __init__(self, group, meeting, queryset=None, *args, **kwargs):
self._meeting = meeting
self.created_instances = []
# Restrict sessions to the meeting and group. The instance
# property handles one of these for free.
@ -667,12 +798,6 @@ class SessionDetailsInlineFormset(forms.BaseInlineFormSet):
form.instance.meeting = self._meeting
return super().save_new(form, commit)
def save(self, commit=True):
existing_instances = set(form.instance for form in self.forms if form.instance.pk)
saved = super().save(commit)
self.created_instances = [inst for inst in saved if inst not in existing_instances]
return saved
@property
def forms_to_keep(self):
"""Get the not-deleted forms"""
@ -682,7 +807,7 @@ def sessiondetailsformset_factory(min_num=1, max_num=3):
return forms.inlineformset_factory(
Group,
Session,
formset=SessionDetailsInlineFormset,
formset=SessionDetailsInlineFormSet,
form=SessionDetailsForm,
can_delete=True,
can_order=False,

View file

@ -12,6 +12,7 @@ from tempfile import mkstemp
from django.http import Http404
from django.db.models import F, Prefetch
from django.conf import settings
from django.contrib import messages
from django.contrib.auth.models import AnonymousUser
from django.urls import reverse
from django.shortcuts import get_object_or_404
@ -29,7 +30,7 @@ from ietf.person.models import Person
from ietf.meeting.models import Meeting, Schedule, TimeSlot, SchedTimeSessAssignment, ImportantDate, SchedulingEvent, Session
from ietf.meeting.utils import session_requested_by, add_event_info_to_session_qs
from ietf.name.models import ImportantDateName, SessionPurposeName
from ietf.utils import log
from ietf.utils import log, meetecho
from ietf.utils.history import find_history_replacements_active_at
from ietf.utils.mail import send_mail
from ietf.utils.pipe import pipe
@ -942,6 +943,8 @@ def send_interim_approval(user, meeting):
template = 'meeting/interim_approval.txt'
context = {
'meeting': meeting,
'group' : first_session.group,
'requester' : session_requested_by(first_session),
}
send_mail(None,
to_email,
@ -1072,6 +1075,76 @@ def sessions_post_save(request, forms):
if 'agenda' in form.changed_data:
form.save_agenda()
try:
create_interim_session_conferences(
form.instance for form in forms
if form.cleaned_data.get('remote_participation', None) == 'meetecho'
)
except RuntimeError:
messages.warning(
request,
'An error occurred while creating a Meetecho conference. The interim meeting request '
'has been created without complete remote participation information. '
'Please edit the request to add this or contact the secretariat if you require assistance.',
)
def create_interim_session_conferences(sessions):
error_occurred = False
if hasattr(settings, 'MEETECHO_API_CONFIG'): # do nothing if not configured
meetecho_manager = meetecho.ConferenceManager(settings.MEETECHO_API_CONFIG)
for session in sessions:
ts = session.official_timeslotassignment().timeslot
try:
confs = meetecho_manager.create(
group=session.group,
description=str(session),
start_time=ts.time,
duration=ts.duration,
)
except Exception as err:
log.log(f'Exception creating Meetecho conference for {session}: {err}')
confs = []
if len(confs) == 1:
session.remote_instructions = confs[0].url
session.save()
else:
error_occurred = True
if error_occurred:
raise RuntimeError('error creating meetecho conferences')
def delete_interim_session_conferences(sessions):
"""Delete Meetecho conference for the session, if any"""
if hasattr(settings, 'MEETECHO_API_CONFIG'): # do nothing if Meetecho API not configured
meetecho_manager = meetecho.ConferenceManager(settings.MEETECHO_API_CONFIG)
for session in sessions:
if session.remote_instructions:
for conference in meetecho_manager.fetch(session.group):
if conference.url == session.remote_instructions:
conference.delete()
break
def sessions_post_cancel(request, sessions):
"""Clean up after session cancellation
When this is called, the session has already been canceled, so exceptions should
not be raised.
"""
try:
delete_interim_session_conferences(sessions)
except Exception as err:
sess_pks = ', '.join(str(s.pk) for s in sessions)
log.log(f'Exception deleting Meetecho conferences for sessions [{sess_pks}]: {err}')
messages.warning(
request,
'An error occurred while cleaning up Meetecho conferences for the canceled sessions. '
'The session or sessions have been canceled, but Meetecho conferences may not have been cleaned '
'up properly.',
)
def update_interim_session_assignment(form):
"""Helper function to create / update timeslot assigned to interim session"""

View file

@ -0,0 +1,91 @@
# Copyright The IETF Trust 2022, All Rights Reserved
# -*- coding: utf-8 -*-
import datetime
from textwrap import dedent
from django.conf import settings
from django.core.management.base import BaseCommand, CommandError
from ietf.meeting.models import Session
from ietf.utils.meetecho import ConferenceManager, MeetechoAPIError
class Command(BaseCommand):
help = 'Manage Meetecho conferences'
def add_arguments(self, parser) -> None:
parser.add_argument('group', type=str)
parser.add_argument('-d', '--delete', type=int, action='append',
metavar='SESSION_PK',
help='Delete the conference associated with the specified Session')
def handle(self, group, delete, *args, **options):
conf_mgr = ConferenceManager(settings.MEETECHO_API_CONFIG)
if delete:
self.handle_delete_conferences(conf_mgr, group, delete)
else:
self.handle_list_conferences(conf_mgr, group)
def handle_list_conferences(self, conf_mgr, group):
confs, conf_sessions = self.fetch_conferences(conf_mgr, group)
self.stdout.write(f'Meetecho conferences for {group}:\n\n')
for conf in confs:
sessions_desc = ', '.join(str(s.pk) for s in conf_sessions[conf.id]) or None
self.stdout.write(
dedent(f'''\
* {conf.description}
Start time: {conf.start_time}
Duration: {int(conf.duration.total_seconds() // 60)} minutes
URL: {conf.url}
Associated session PKs: {sessions_desc}
''')
)
def handle_delete_conferences(self, conf_mgr, group, session_pks_to_delete):
sessions_to_delete = Session.objects.filter(pk__in=session_pks_to_delete)
confs, conf_sessions = self.fetch_conferences(conf_mgr, group)
confs_to_delete = []
descriptions = []
for session in sessions_to_delete:
for conf in confs:
associated = conf_sessions[conf.id]
if session in associated:
confs_to_delete.append(conf)
sessions_desc = ', '.join(str(s.pk) for s in associated) or None
descriptions.append(
f'{conf.description} ({conf.start_time}, {int(conf.duration.total_seconds() // 60)} mins) - used by {sessions_desc}'
)
if len(confs_to_delete) > 0:
self.stdout.write('Will delete:')
for desc in descriptions:
self.stdout.write(f'* {desc}')
try:
proceed = input('Proceed [y/N]? ').lower()
except EOFError:
proceed = 'n'
if proceed in ['y', 'yes']:
for conf, desc in zip(confs_to_delete, descriptions):
conf.delete()
self.stdout.write(f'Deleted {desc}')
else:
self.stdout.write('Nothing deleted.')
else:
self.stdout.write('No associated Meetecho conferences found')
def fetch_conferences(self, conf_mgr, group):
try:
confs = conf_mgr.fetch(group)
except MeetechoAPIError as err:
raise CommandError('API error fetching Meetecho conference data') from err
conf_sessions = {}
for conf in confs:
conf_sessions[conf.id] = Session.objects.filter(
group__acronym=group,
meeting__date__gte=datetime.date.today(),
remote_instructions__contains=conf.url,
)
return confs, conf_sessions
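Both the listing and deletion paths above render conference durations as whole minutes via `total_seconds()`; the arithmetic in isolation:

```python
import datetime

# Whole-minute rendering used when printing conference durations
duration = datetime.timedelta(hours=1, minutes=30)
minutes = int(duration.total_seconds() // 60)
print(f'{minutes} minutes')  # 90 minutes
```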

View file

@ -14,6 +14,7 @@ import string
from collections import namedtuple
from pathlib import Path
from urllib.parse import urljoin
import debug # pyflakes:ignore
@ -1260,6 +1261,13 @@ class Session(models.Model):
else:
return self.group.acronym
def notes_id(self):
note_id_fragment = 'plenary' if self.type.slug == 'plenary' else self.group.acronym
return f'notes-ietf-{self.meeting.number}-{note_id_fragment}'
def notes_url(self):
return urljoin(settings.IETF_NOTES_URL, self.notes_id())
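`notes_url` joins the computed notes ID onto the configured base; a sketch with a hypothetical `IETF_NOTES_URL` value standing in for the setting:

```python
from urllib.parse import urljoin

# Hypothetical value; the real base comes from settings.IETF_NOTES_URL
IETF_NOTES_URL = 'https://notes.ietf.org/'

def notes_url(meeting_number, group_acronym, is_plenary=False):
    fragment = 'plenary' if is_plenary else group_acronym
    return urljoin(IETF_NOTES_URL, f'notes-ietf-{meeting_number}-{fragment}')

print(notes_url('113', 'tsvwg'))  # https://notes.ietf.org/notes-ietf-113-tsvwg
```

Note that `urljoin` requires the trailing slash on the base for the fragment to append rather than replace the last path segment.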
class SchedulingEvent(models.Model):
session = ForeignKey(Session)
time = models.DateTimeField(default=datetime.datetime.now, help_text="When the event happened")

View file

@ -166,11 +166,12 @@ api.meeting.register(ScheduleResource())
from ietf.group.resources import GroupResource
from ietf.doc.resources import DocumentResource
from ietf.name.resources import TimeSlotTypeNameResource
from ietf.name.resources import TimeSlotTypeNameResource, SessionPurposeNameResource
from ietf.person.resources import PersonResource
class SessionResource(ModelResource):
meeting = ToOneField(MeetingResource, 'meeting')
type = ToOneField(TimeSlotTypeNameResource, 'type')
purpose = ToOneField(SessionPurposeNameResource, 'purpose')
group = ToOneField(GroupResource, 'group')
materials = ToManyField(DocumentResource, 'materials', null=True)
resources = ToManyField(ResourceAssociationResource, 'resources', null=True)
@ -195,6 +196,7 @@ class SessionResource(ModelResource):
"modified": ALL,
"meeting": ALL_WITH_RELATIONS,
"type": ALL_WITH_RELATIONS,
"purpose": ALL_WITH_RELATIONS,
"group": ALL_WITH_RELATIONS,
"requested_by": ALL_WITH_RELATIONS,
"status": ALL_WITH_RELATIONS,

View file

@ -72,7 +72,7 @@ def webcal_url(context, viewname, *args, **kwargs):
@register.simple_tag
def assignment_display_name(assignment):
"""Get name for an assignment"""
if assignment.session.type.slug == 'regular' and assignment.session.historic_group:
if assignment.session.type.slug == 'regular' and getattr(assignment.session, 'historic_group', None):
return assignment.session.historic_group.name
return assignment.session.name or assignment.timeslot.name

ietf/meeting/tests_forms.py (new file, 121 lines)
View file

@ -0,0 +1,121 @@
# Copyright The IETF Trust 2021, All Rights Reserved
# -*- coding: utf-8 -*-
"""Tests of forms in the Meeting application"""
from django.conf import settings
from django.core.files.uploadedfile import SimpleUploadedFile
from django.test import override_settings
from ietf.meeting.forms import FileUploadForm, ApplyToAllFileUploadForm, InterimSessionModelForm
from ietf.utils.test_utils import TestCase
@override_settings(
MEETING_APPLICATION_OCTET_STREAM_OVERRIDES={'.md': 'text/markdown'}, # test relies on .txt not mapping
MEETING_VALID_UPLOAD_EXTENSIONS={'minutes': ['.txt', '.md']}, # test relies on .exe being absent
MEETING_VALID_UPLOAD_MIME_TYPES={'minutes': ['text/plain', 'text/markdown']},
MEETING_VALID_MIME_TYPE_EXTENSIONS={'text/plain': ['.txt'], 'text/markdown': ['.md']},
MEETING_VALID_UPLOAD_MIME_FOR_OBSERVED_MIME={'text/plain': ['text/plain', 'text/markdown']},
)
class FileUploadFormTests(TestCase):
class TestClass(FileUploadForm):
doc_type = 'minutes'
def test_accepts_valid_data(self):
test_file = SimpleUploadedFile(
name='file.txt',
content=b'plain text',
content_type='text/plain',
)
form = FileUploadFormTests.TestClass(files={'file': test_file})
self.assertTrue(form.is_valid(), 'Test data are valid input')
cleaned_file = form.cleaned_data['file']
self.assertEqual(cleaned_file.name, 'file.txt', 'Uploaded filename should not be changed')
with cleaned_file.open('rb') as f:
self.assertEqual(f.read(), b'plain text', 'Uploaded file contents should not be changed')
self.assertEqual(cleaned_file.content_type, 'text/plain', 'Content-Type should not be changed')
def test_overrides_content_type_application_octet_stream(self):
test_file = SimpleUploadedFile(
name='file.md',
content=b'plain text',
content_type='application/octet-stream',
)
form = FileUploadFormTests.TestClass(files={'file': test_file})
self.assertTrue(form.is_valid(), 'Test data are valid input')
cleaned_file = form.cleaned_data['file']
# Test that the test_file is what actually came out of the cleaning process.
# This is not technically required here, but the other tests check that test_file's
# content_type has not been changed. If cleaning does not modify the content_type
# when it succeeds, then those other tests are not actually testing anything.
self.assertEqual(cleaned_file, test_file, 'Cleaning should return the file object that was passed in')
self.assertEqual(cleaned_file.name, 'file.md', 'Uploaded filename should not be changed')
with cleaned_file.open('rb') as f:
self.assertEqual(f.read(), b'plain text', 'Uploaded file contents should not be changed')
self.assertEqual(cleaned_file.content_type, 'text/markdown', 'Content-Type should be overridden')
def test_overrides_only_application_octet_stream(self):
test_file = SimpleUploadedFile(
name='file.md',
content=b'plain text',
content_type='application/json'
)
form = FileUploadFormTests.TestClass(files={'file': test_file})
self.assertFalse(form.is_valid(), 'Test data are invalid input')
self.assertEqual(test_file.name, 'file.md', 'Uploaded filename should not be changed')
self.assertEqual(test_file.content_type, 'application/json', 'Uploaded Content-Type should not be changed')
def test_overrides_only_requested_extensions_when_valid_ext(self):
test_file = SimpleUploadedFile(
name='file.txt',
content=b'plain text',
content_type='application/octet-stream',
)
form = FileUploadFormTests.TestClass(files={'file': test_file})
self.assertFalse(form.is_valid(), 'Test data are invalid input')
self.assertEqual(test_file.name, 'file.txt', 'Uploaded filename should not be changed')
self.assertEqual(test_file.content_type, 'application/octet-stream', 'Uploaded Content-Type should not be changed')
def test_overrides_only_requested_extensions_when_invalid_ext(self):
test_file = SimpleUploadedFile(
name='file.exe',
content=b'plain text',
content_type='application/octet-stream'
)
form = FileUploadFormTests.TestClass(files={'file': test_file})
self.assertFalse(form.is_valid(), 'Test data are invalid input')
self.assertEqual(test_file.name, 'file.exe', 'Uploaded filename should not be changed')
self.assertEqual(test_file.content_type, 'application/octet-stream', 'Uploaded Content-Type should not be changed')
class ApplyToAllFileUploadFormTests(TestCase):
class TestClass(ApplyToAllFileUploadForm):
doc_type = 'minutes'
def test_has_apply_to_all_field_by_default(self):
form = ApplyToAllFileUploadFormTests.TestClass(show_apply_to_all_checkbox=True)
self.assertIn('apply_to_all', form.fields)
def test_no_show_apply_to_all_field(self):
form = ApplyToAllFileUploadFormTests.TestClass(show_apply_to_all_checkbox=False)
self.assertNotIn('apply_to_all', form.fields)
class InterimSessionModelFormTests(TestCase):
@override_settings(MEETECHO_API_CONFIG={}) # setting needs to exist, don't care about its value in this test
def test_remote_participation_options(self):
"""Only offer Meetecho conference creation when configured"""
form = InterimSessionModelForm()
choice_vals = [choice[0] for choice in form.fields['remote_participation'].choices]
self.assertIn('meetecho', choice_vals)
self.assertIn('manual', choice_vals)
del settings.MEETECHO_API_CONFIG
form = InterimSessionModelForm()
choice_vals = [choice[0] for choice in form.fields['remote_participation'].choices]
self.assertNotIn('meetecho', choice_vals)
self.assertIn('manual', choice_vals)

View file

@ -1,15 +1,20 @@
# Copyright The IETF Trust 2020, All Rights Reserved
# -*- coding: utf-8 -*-
from unittest.mock import patch, Mock
from django.conf import settings
from django.test import override_settings
from django.contrib.messages.storage.fallback import FallbackStorage
from django.test import override_settings, RequestFactory
from ietf.group.factories import GroupFactory
from ietf.group.models import Group
from ietf.meeting.factories import SessionFactory, MeetingFactory, TimeSlotFactory
from ietf.meeting.helpers import AgendaFilterOrganizer, AgendaKeywordTagger
from ietf.meeting.models import SchedTimeSessAssignment
from ietf.meeting.helpers import (AgendaFilterOrganizer, AgendaKeywordTagger,
delete_interim_session_conferences, sessions_post_save, sessions_post_cancel,
create_interim_session_conferences)
from ietf.meeting.models import SchedTimeSessAssignment, Session
from ietf.meeting.test_data import make_meeting_test_data
from ietf.utils.meetecho import Conference
from ietf.utils.test_utils import TestCase
@ -332,4 +337,281 @@ class AgendaFilterOrganizerTests(TestCase):
self.assertEqual(filter_organizer.get_non_area_keywords(), expected)
filter_organizer = AgendaFilterOrganizer(assignments=assignments, single_category=True)
self.assertEqual(filter_organizer.get_non_area_keywords(), expected)
self.assertEqual(filter_organizer.get_non_area_keywords(), expected)
@override_settings(
MEETECHO_API_CONFIG={
'api_base': 'https://example.com',
'client_id': 'datatracker',
'client_secret': 'secret',
'request_timeout': 3.01,
}
)
class InterimTests(TestCase):
@patch('ietf.utils.meetecho.ConferenceManager')
def test_delete_interim_session_conferences(self, mock):
mock_conf_mgr = mock.return_value # "instance" seen by the internals
sessions = [
SessionFactory(meeting__type_id='interim', remote_instructions='fake-meetecho-url'),
SessionFactory(meeting__type_id='interim', remote_instructions='other-fake-meetecho-url'),
]
timeslots = [
session.official_timeslotassignment().timeslot for session in sessions
]
conferences = [
Conference(
manager=mock_conf_mgr, id=1, public_id='some-uuid', description='desc',
start_time=timeslots[0].time, duration=timeslots[0].duration, url='fake-meetecho-url',
deletion_token='please-delete-me',
),
Conference(
manager=mock_conf_mgr, id=2, public_id='some-uuid-2', description='desc',
start_time=timeslots[1].time, duration=timeslots[1].duration, url='other-fake-meetecho-url',
deletion_token='please-delete-me-as-well',
),
]
# should not call the API if MEETECHO_API_CONFIG is not defined
with override_settings(): # will undo any changes to settings in the block
del settings.MEETECHO_API_CONFIG
delete_interim_session_conferences([sessions[0], sessions[1]])
self.assertFalse(mock.called)
# no conferences, no sessions being deleted -> no conferences deleted
mock.reset_mock()
mock_conf_mgr.fetch.return_value = []
delete_interim_session_conferences([])
self.assertFalse(mock_conf_mgr.delete_conference.called)
# two conferences, no sessions being deleted -> no conferences deleted
mock_conf_mgr.fetch.return_value = [conferences[0], conferences[1]]
mock_conf_mgr.delete_conference.reset_mock()
delete_interim_session_conferences([])
self.assertFalse(mock_conf_mgr.delete_conference.called)
mock_conf_mgr.delete_conference.reset_mock()
# one conference, other session being deleted -> no conferences deleted
mock_conf_mgr.fetch.return_value = [conferences[0]]
delete_interim_session_conferences([sessions[1]])
self.assertFalse(mock_conf_mgr.delete_conference.called)
# one conference, same session being deleted -> conference deleted
mock.reset_mock()
mock_conf_mgr.fetch.return_value = [conferences[0]]
delete_interim_session_conferences([sessions[0]])
self.assertTrue(mock_conf_mgr.delete_conference.called)
self.assertCountEqual(
mock_conf_mgr.delete_conference.call_args[0],
(conferences[0],)
)
# two conferences, one being deleted -> correct conference deleted
mock.reset_mock()
mock_conf_mgr.fetch.return_value = [conferences[0], conferences[1]]
delete_interim_session_conferences([sessions[1]])
self.assertTrue(mock_conf_mgr.delete_conference.called)
self.assertEqual(mock_conf_mgr.delete_conference.call_count, 1)
self.assertEqual(
mock_conf_mgr.delete_conference.call_args[0],
(conferences[1],)
)
# two conferences, both being deleted -> both conferences deleted
mock.reset_mock()
mock_conf_mgr.fetch.return_value = [conferences[0], conferences[1]]
delete_interim_session_conferences([sessions[0], sessions[1]])
self.assertTrue(mock_conf_mgr.delete_conference.called)
self.assertEqual(mock_conf_mgr.delete_conference.call_count, 2)
args_list = [call_args[0] for call_args in mock_conf_mgr.delete_conference.call_args_list]
self.assertCountEqual(
args_list,
((conferences[0],), (conferences[1],)),
)
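The assertions above read positional arguments off `call_args` and `call_args_list`. As a reference for that idiom, a minimal stdlib sketch of how `unittest.mock` records calls (the `mgr`/`delete_conference` names here are illustrative, not the datatracker's objects):

```python
from unittest.mock import Mock

mgr = Mock()
mgr.delete_conference("conf-1", force=True)

# call_args is an (args, kwargs) pair recorded for the most recent call
args, kwargs = mgr.delete_conference.call_args
assert args == ("conf-1",)
assert kwargs == {"force": True}

# call_args_list keeps one entry per call, in order
mgr.delete_conference("conf-2")
assert len(mgr.delete_conference.call_args_list) == 2
```

This is why the tests index with `call_args[0]` (positional args) and iterate `call_args_list` when several deletions are expected.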
@patch('ietf.meeting.helpers.delete_interim_session_conferences')
def test_sessions_post_cancel(self, mock):
sessions_post_cancel(RequestFactory().post('/some/url'), 'sessions arg')
self.assertTrue(mock.called)
self.assertEqual(mock.call_args[0], ('sessions arg',))
@patch('ietf.meeting.helpers.delete_interim_session_conferences')
def test_sessions_post_cancel_delete_exception(self, mock):
"""sessions_post_cancel prevents exceptions percolating up"""
mock.side_effect = RuntimeError('oops')
sessions = SessionFactory.create_batch(3, meeting__type_id='interim')
# create mock request with session / message storage
request = RequestFactory().post('/some/url')
setattr(request, 'session', 'session')
messages = FallbackStorage(request)
setattr(request, '_messages', messages)
sessions_post_cancel(request, sessions)
self.assertTrue(mock.called)
self.assertEqual(mock.call_args[0], (sessions,))
msgs = [str(msg) for msg in messages]
self.assertEqual(len(msgs), 1)
self.assertIn('An error occurred', msgs[0])
@patch('ietf.utils.meetecho.ConferenceManager')
def test_create_interim_session_conferences(self, mock):
mock_conf_mgr = mock.return_value # "instance" seen by the internals
sessions = [
SessionFactory(meeting__type_id='interim', remote_instructions='junk'),
SessionFactory(meeting__type_id='interim', remote_instructions=''),
]
timeslots = [
session.official_timeslotassignment().timeslot for session in sessions
]
with override_settings(): # will undo any changes to settings in the block
del settings.MEETECHO_API_CONFIG
create_interim_session_conferences([sessions[0], sessions[1]])
self.assertFalse(mock.called)
# create for 0 sessions
mock.reset_mock()
create_interim_session_conferences([])
self.assertFalse(mock_conf_mgr.create.called)
self.assertEqual(
Session.objects.get(pk=sessions[0].pk).remote_instructions,
'junk',
)
# create for 1 session
mock.reset_mock()
mock_conf_mgr.create.return_value = [
Conference(
manager=mock_conf_mgr, id=1, public_id='some-uuid', description='desc',
start_time=timeslots[0].time, duration=timeslots[0].duration, url='fake-meetecho-url',
deletion_token='please-delete-me',
),
]
create_interim_session_conferences([sessions[0]])
self.assertTrue(mock_conf_mgr.create.called)
self.assertCountEqual(
mock_conf_mgr.create.call_args[1],
{
'group': sessions[0].group,
'description': str(sessions[0]),
'start_time': timeslots[0].time,
'duration': timeslots[0].duration,
}
)
self.assertEqual(
Session.objects.get(pk=sessions[0].pk).remote_instructions,
'fake-meetecho-url',
)
# create for 2 sessions
mock.reset_mock()
mock_conf_mgr.create.side_effect = [
[Conference(
manager=mock_conf_mgr, id=1, public_id='some-uuid', description='desc',
start_time=timeslots[0].time, duration=timeslots[0].duration, url='different-fake-meetecho-url',
deletion_token='please-delete-me',
)],
[Conference(
manager=mock_conf_mgr, id=2, public_id='another-uuid', description='desc',
start_time=timeslots[1].time, duration=timeslots[1].duration, url='another-fake-meetecho-url',
deletion_token='please-delete-me-too',
)],
]
create_interim_session_conferences([sessions[0], sessions[1]])
self.assertTrue(mock_conf_mgr.create.called)
self.assertCountEqual(
mock_conf_mgr.create.call_args_list,
[
({
'group': sessions[0].group,
'description': str(sessions[0]),
'start_time': timeslots[0].time,
'duration': timeslots[0].duration,
},),
({
'group': sessions[1].group,
'description': str(sessions[1]),
'start_time': timeslots[1].time,
'duration': timeslots[1].duration,
},),
]
)
self.assertEqual(
Session.objects.get(pk=sessions[0].pk).remote_instructions,
'different-fake-meetecho-url',
)
self.assertEqual(
Session.objects.get(pk=sessions[1].pk).remote_instructions,
'another-fake-meetecho-url',
)
@patch('ietf.utils.meetecho.ConferenceManager')
def test_create_interim_session_conferences_errors(self, mock):
mock_conf_mgr = mock.return_value
session = SessionFactory(meeting__type_id='interim')
timeslot = session.official_timeslotassignment().timeslot
mock_conf_mgr.create.return_value = []
with self.assertRaises(RuntimeError):
create_interim_session_conferences([session])
mock.reset_mock()
mock_conf_mgr.create.return_value = [
Conference(
manager=mock_conf_mgr, id=1, public_id='some-uuid', description='desc',
start_time=timeslot.time, duration=timeslot.duration, url='different-fake-meetecho-url',
deletion_token='please-delete-me',
),
Conference(
manager=mock_conf_mgr, id=2, public_id='another-uuid', description='desc',
start_time=timeslot.time, duration=timeslot.duration, url='another-fake-meetecho-url',
deletion_token='please-delete-me-too',
),
]
with self.assertRaises(RuntimeError):
create_interim_session_conferences([session])
mock.reset_mock()
mock_conf_mgr.create.side_effect = ValueError('some error')
with self.assertRaises(RuntimeError):
create_interim_session_conferences([session])
@patch('ietf.meeting.helpers.create_interim_session_conferences')
def test_sessions_post_save_creates_meetecho_conferences(self, mock_create_method):
session = SessionFactory(meeting__type_id='interim')
mock_form = Mock()
mock_form.instance = session
mock_form.has_changed.return_value = True
mock_form.changed_data = []
mock_form.requires_approval = True
mock_form.cleaned_data = {'remote_participation': None}
sessions_post_save(RequestFactory().post('/some/url'), [mock_form])
self.assertTrue(mock_create_method.called)
self.assertCountEqual(mock_create_method.call_args[0][0], [])
mock_create_method.reset_mock()
mock_form.cleaned_data = {'remote_participation': 'manual'}
sessions_post_save(RequestFactory().post('/some/url'), [mock_form])
self.assertTrue(mock_create_method.called)
self.assertCountEqual(mock_create_method.call_args[0][0], [])
mock_create_method.reset_mock()
mock_form.cleaned_data = {'remote_participation': 'meetecho'}
sessions_post_save(RequestFactory().post('/some/url'), [mock_form])
self.assertTrue(mock_create_method.called)
self.assertCountEqual(mock_create_method.call_args[0][0], [session])
# Check that an exception does not percolate through sessions_post_save
mock_create_method.side_effect = RuntimeError('some error')
mock_form.cleaned_data = {'remote_participation': 'meetecho'}
# create mock request with session / message storage
request = RequestFactory().post('/some/url')
setattr(request, 'session', 'session')
messages = FallbackStorage(request)
setattr(request, '_messages', messages)
sessions_post_save(request, [mock_form])
self.assertTrue(mock_create_method.called)
self.assertCountEqual(mock_create_method.call_args[0][0], [session])
msgs = [str(msg) for msg in messages]
self.assertEqual(len(msgs), 1)
self.assertIn('An error occurred', msgs[0])
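The tests in this file lean on one mocking idiom worth spelling out: when a class is patched, `mock.return_value` is the "instance" that the code under test receives when it instantiates that class. A self-contained sketch of the pattern, using illustrative names (`services`, `Client`, `fetch_data`) rather than datatracker code:

```python
import types
from unittest.mock import patch

# stand-in namespace so the example does not depend on module names
services = types.SimpleNamespace()

class Client:
    def fetch(self):
        return "real data"

services.Client = Client

def fetch_data():
    # code under test constructs its own Client internally
    return services.Client().fetch()

# patching the class: mock_cls.return_value is the instance the internals see
with patch.object(services, "Client") as mock_cls:
    mock_cls.return_value.fetch.return_value = "fake data"
    result = fetch_data()

assert result == "fake data"
assert services.Client is Client  # original restored when the patch exits
```

This mirrors the `mock_conf_mgr = mock.return_value` lines above: the test configures the instance-level mock even though only the class was patched.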

View file

@ -15,6 +15,7 @@ from django.utils.timezone import now
from django.db.models import F
import pytz
from django.conf import settings
from django.test.utils import override_settings
import debug # pyflakes:ignore
@ -35,7 +36,6 @@ from ietf.meeting.utils import add_event_info_to_session_qs
from ietf.utils.test_utils import assert_ical_response_is_valid
from ietf.utils.jstest import ( IetfSeleniumTestCase, ifSeleniumEnabled, selenium_enabled,
presence_of_element_child_by_css_selector )
from ietf import settings
if selenium_enabled():
from selenium.webdriver.common.action_chains import ActionChains
@ -545,21 +545,21 @@ class EditMeetingScheduleTests(IetfSeleniumTestCase):
past_swap_ts_buttons = self.driver.find_elements(By.CSS_SELECTOR,
','.join(
'.swap-timeslot-col[data-start="{}"]'.format(ts.utc_start_time().isoformat()) for ts in past_timeslots
'*[data-start="{}"] .swap-timeslot-col'.format(ts.utc_start_time().isoformat()) for ts in past_timeslots
)
)
self.assertEqual(len(past_swap_ts_buttons), len(past_timeslots), 'Missing past swap timeslot col buttons')
future_swap_ts_buttons = self.driver.find_elements(By.CSS_SELECTOR,
','.join(
'.swap-timeslot-col[data-start="{}"]'.format(ts.utc_start_time().isoformat()) for ts in future_timeslots
'*[data-start="{}"] .swap-timeslot-col'.format(ts.utc_start_time().isoformat()) for ts in future_timeslots
)
)
self.assertEqual(len(future_swap_ts_buttons), len(future_timeslots), 'Missing future swap timeslot col buttons')
now_swap_ts_buttons = self.driver.find_elements(By.CSS_SELECTOR,
','.join(
'.swap-timeslot-col[data-start="{}"]'.format(ts.utc_start_time().isoformat()) for ts in now_timeslots
'[data-start="{}"] .swap-timeslot-col'.format(ts.utc_start_time().isoformat()) for ts in now_timeslots
)
)
self.assertEqual(len(now_swap_ts_buttons), len(now_timeslots), 'Missing "now" swap timeslot col buttons')

View file

@ -0,0 +1,476 @@
# Copyright The IETF Trust 2021, All Rights Reserved
# -*- coding: utf-8 -*-
import json
from datetime import date, timedelta
from unittest.mock import patch
from django import forms
import debug # pyflakes: ignore
from ietf.group.factories import GroupFactory
from ietf.meeting.factories import MeetingFactory, TimeSlotFactory, RoomFactory, SessionFactory
from ietf.meeting.forms import (CsvModelPkInput, CustomDurationField, SwapTimeslotsForm, duration_string,
TimeSlotDurationField, TimeSlotEditForm, TimeSlotCreateForm, DurationChoiceField,
SessionDetailsForm, sessiondetailsformset_factory, SessionEditForm)
from ietf.name.models import SessionPurposeName
from ietf.utils.test_utils import TestCase
class CsvModelPkInputTests(TestCase):
widget = CsvModelPkInput()
def test_render_none(self):
result = self.widget.render('csv_model', value=None)
self.assertHTMLEqual(result, '<input type="text" name="csv_model" value="">')
def test_render_value(self):
result = self.widget.render('csv_model', value=[1, 2, 3])
self.assertHTMLEqual(result, '<input type="text" name="csv_model" value="1,2,3">')
def test_value_from_datadict(self):
result = self.widget.value_from_datadict({'csv_model': '11,23,47'}, {}, 'csv_model')
self.assertEqual(result, ['11', '23', '47'])
class SwapTimeslotsFormTests(TestCase):
def setUp(self):
super().setUp()
self.meeting = MeetingFactory(type_id='ietf', populate_schedule=False)
self.timeslots = TimeSlotFactory.create_batch(2, meeting=self.meeting)
self.other_meeting_timeslot = TimeSlotFactory()
def test_valid(self):
form = SwapTimeslotsForm(
meeting=self.meeting,
data={
'origin_timeslot': str(self.timeslots[0].pk),
'target_timeslot': str(self.timeslots[1].pk),
'rooms': ','.join(str(rm.pk) for rm in self.meeting.room_set.all()),
}
)
self.assertTrue(form.is_valid())
def test_invalid(self):
# the magic numbers are (very likely) non-existent pks
form = SwapTimeslotsForm(
meeting=self.meeting,
data={
'origin_timeslot': '25',
'target_timeslot': str(self.timeslots[1].pk),
'rooms': ','.join(str(rm.pk) for rm in self.meeting.room_set.all()),
}
)
self.assertFalse(form.is_valid())
form = SwapTimeslotsForm(
meeting=self.meeting,
data={
'origin_timeslot': str(self.timeslots[0].pk),
'target_timeslot': str(self.other_meeting_timeslot.pk),
'rooms': ','.join(str(rm.pk) for rm in self.meeting.room_set.all()),
}
)
self.assertFalse(form.is_valid())
form = SwapTimeslotsForm(
meeting=self.meeting,
data={
'origin_timeslot': str(self.timeslots[0].pk),
'target_timeslot': str(self.timeslots[1].pk),
'rooms': '1034',
}
)
self.assertFalse(form.is_valid())
class CustomDurationFieldTests(TestCase):
def test_duration_string(self):
self.assertEqual(duration_string(timedelta(hours=3, minutes=17)), '03:17')
self.assertEqual(duration_string(timedelta(hours=3, minutes=17, seconds=43)), '03:17')
self.assertEqual(duration_string(timedelta(days=1, hours=3, minutes=17, seconds=43)), '1 03:17')
self.assertEqual(duration_string(timedelta(hours=3, minutes=17, seconds=43, microseconds=37438)), '03:17')
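The expected values above pin down `duration_string` fairly tightly: anything below the minute is truncated, and whole days become a leading prefix. A hypothetical re-implementation consistent with those assertions (a sketch only, not the datatracker's actual code):

```python
from datetime import timedelta

def duration_string(duration):
    # truncate below the minute; render whole days as a leading "D " prefix
    days, rem = divmod(duration, timedelta(days=1))
    hours, rem = divmod(rem, timedelta(hours=1))
    minutes = rem // timedelta(minutes=1)
    prefix = f"{days} " if days else ""
    return f"{prefix}{hours:02d}:{minutes:02d}"

assert duration_string(timedelta(hours=3, minutes=17)) == "03:17"
assert duration_string(timedelta(hours=3, minutes=17, seconds=43)) == "03:17"
assert duration_string(timedelta(days=1, hours=3, minutes=17, seconds=43)) == "1 03:17"
```

Note that `divmod` works directly on `timedelta` pairs, which keeps the truncation logic free of manual seconds arithmetic.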
def _render_field(self, field):
"""Helper to render a form containing a field"""
class Form(forms.Form):
f = field
return str(Form()['f'])
@patch('ietf.meeting.forms.duration_string', return_value='12:34')
def test_render(self, mock_duration_string):
self.assertHTMLEqual(
self._render_field(CustomDurationField()),
'<input id="id_f" name="f" type="text" placeholder="HH:MM" required>'
)
self.assertHTMLEqual(
self._render_field(CustomDurationField(initial=timedelta(hours=1))),
'<input id="id_f" name="f" type="text" placeholder="HH:MM" required value="12:34">',
'Rendered value should come from duration_string when initial value is a timedelta'
)
self.assertHTMLEqual(
self._render_field(CustomDurationField(initial="01:02")),
'<input id="id_f" name="f" type="text" placeholder="HH:MM" required value="01:02">',
'Rendered value should come from initial when it is not a timedelta'
)
class TimeSlotDurationFieldTests(TestCase):
def test_validation(self):
field = TimeSlotDurationField()
with self.assertRaises(forms.ValidationError):
field.clean('-01:00')
with self.assertRaises(forms.ValidationError):
field.clean('12:01')
self.assertEqual(field.clean('00:00'), timedelta(seconds=0))
self.assertEqual(field.clean('01:00'), timedelta(hours=1))
self.assertEqual(field.clean('12:00'), timedelta(hours=12))
class TimeSlotEditFormTests(TestCase):
def test_location_options(self):
meeting = MeetingFactory(type_id='ietf', populate_schedule=False)
rooms = [
RoomFactory(meeting=meeting, capacity=3),
RoomFactory(meeting=meeting, capacity=123),
]
ts = TimeSlotFactory(meeting=meeting)
rendered = str(TimeSlotEditForm(instance=ts)['location'])
# noinspection PyTypeChecker
self.assertInHTML(
f'<option value="{ts.location.pk}" selected>{ts.location.name} size: None</option>',
rendered,
)
for room in rooms:
# noinspection PyTypeChecker
self.assertInHTML(
f'<option value="{room.pk}">{room.name} size: {room.capacity}</option>',
rendered,
)
class TimeSlotCreateFormTests(TestCase):
def setUp(self):
super().setUp()
self.meeting = MeetingFactory(type_id='ietf', date=date(2021, 11, 16), days=3, populate_schedule=False)
def test_other_date(self):
room = RoomFactory(meeting=self.meeting)
# no other_date, no day selected
form = TimeSlotCreateForm(
self.meeting,
data={
'name': 'time slot',
'type': 'regular',
'time': '12:00',
'duration': '01:00',
'locations': [str(room.pk)],
})
self.assertFalse(form.is_valid())
# no other_date, day is selected
form = TimeSlotCreateForm(
self.meeting,
data={
'name': 'time slot',
'type': 'regular',
'days': ['738111'], # date(2021,11,17).toordinal()
'time': '12:00',
'duration': '01:00',
'locations': [str(room.pk)],
})
self.assertTrue(form.is_valid())
self.assertNotIn('other_date', form.cleaned_data)
self.assertEqual(form.cleaned_data['days'], [date(2021, 11, 17)])
# other_date given, no day is selected
form = TimeSlotCreateForm(
self.meeting,
data={
'name': 'time slot',
'type': 'regular',
'time': '12:00',
'duration': '01:00',
'locations': [str(room.pk)],
'other_date': '2021-11-15',
})
self.assertTrue(form.is_valid())
self.assertNotIn('other_date', form.cleaned_data)
self.assertEqual(form.cleaned_data['days'], [date(2021, 11, 15)])
# day is selected and other_date is given
form = TimeSlotCreateForm(
self.meeting,
data={
'name': 'time slot',
'type': 'regular',
'days': ['738111'], # date(2021,11,17).toordinal()
'time': '12:00',
'duration': '01:00',
'locations': [str(room.pk)],
'other_date': '2021-11-15',
})
self.assertTrue(form.is_valid())
self.assertNotIn('other_date', form.cleaned_data)
self.assertCountEqual(form.cleaned_data['days'], [date(2021, 11, 17), date(2021, 11, 15)])
# invalid other_date, no day selected
form = TimeSlotCreateForm(
self.meeting,
data={
'name': 'time slot',
'type': 'regular',
'time': '12:00',
'duration': '01:00',
'locations': [str(room.pk)],
'other_date': 'invalid',
})
self.assertFalse(form.is_valid())
# invalid other_date, day selected
form = TimeSlotCreateForm(
self.meeting,
data={
'name': 'time slot',
'type': 'regular',
'days': ['738111'], # date(2021,11,17).toordinal()
'time': '12:00',
'duration': '01:00',
'locations': [str(room.pk)],
'other_date': 'invalid',
})
self.assertFalse(form.is_valid())
def test_meeting_days(self):
form = TimeSlotCreateForm(self.meeting)
self.assertEqual(
form.fields['days'].choices,
[
('738110', 'Tuesday (2021-11-16)'),
('738111', 'Wednesday (2021-11-17)'),
('738112', 'Thursday (2021-11-18)'),
],
)
def test_locations(self):
rooms = RoomFactory.create_batch(5, meeting=self.meeting)
form = TimeSlotCreateForm(self.meeting)
self.assertCountEqual(form.fields['locations'].queryset.all(), rooms)
class DurationChoiceFieldTests(TestCase):
def test_choices_default(self):
f = DurationChoiceField()
self.assertEqual(f.choices, [('', '--Please select'), ('3600', '1 hour'), ('7200', '2 hours')])
def test_choices(self):
f = DurationChoiceField([60, 1800, 3600, 5400, 7260, 7261])
self.assertEqual(
f.choices,
[
('', '--Please select'),
('60', '1 minute'),
('1800', '30 minutes'),
('3600', '1 hour'),
('5400', '1 hour 30 minutes'),
('7260', '2 hours 1 minute'),
('7261', '2 hours 1 minute'),
]
)
def test_bound_value(self):
class Form(forms.Form):
f = DurationChoiceField()
form = Form(data={'f': '3600'})
self.assertTrue(form.is_valid())
self.assertEqual(form.cleaned_data['f'], timedelta(hours=1))
form = Form(data={'f': '7200'})
self.assertTrue(form.is_valid())
self.assertEqual(form.cleaned_data['f'], timedelta(hours=2))
self.assertFalse(Form(data={'f': '3601'}).is_valid())
self.assertFalse(Form(data={'f': ''}).is_valid())
self.assertFalse(Form(data={'f': 'bob'}).is_valid())
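The choice labels checked above follow a simple seconds-to-English rule: singular/plural "hour"/"minute" parts, at minute resolution, which is why 7260 and 7261 share the label "2 hours 1 minute". A sketch of a formatter reproducing those labels (illustrative only, not the field's real implementation):

```python
def duration_label(seconds):
    # minute resolution: 7260 and 7261 both label as "2 hours 1 minute"
    hours, rem = divmod(int(seconds), 3600)
    minutes = rem // 60
    parts = []
    if hours:
        parts.append(f"{hours} hour{'s' if hours != 1 else ''}")
    if minutes:
        parts.append(f"{minutes} minute{'s' if minutes != 1 else ''}")
    return " ".join(parts)

assert duration_label(60) == "1 minute"
assert duration_label(5400) == "1 hour 30 minutes"
assert duration_label(7261) == "2 hours 1 minute"
```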
class SessionDetailsFormTests(TestCase):
def setUp(self):
super().setUp()
self.meeting = MeetingFactory(type_id='ietf', populate_schedule=False)
self.group = GroupFactory()
def test_initial_purpose(self):
"""First session purpose for group should be default"""
# change the session_purposes GroupFeature to check that it's being used
self.group.features.session_purposes = ['coding', 'admin', 'closed_meeting']
self.group.features.save()
self.assertEqual(SessionDetailsForm(group=self.group).initial['purpose'], 'coding')
self.group.features.session_purposes = ['admin', 'coding', 'closed_meeting']
self.group.features.save()
self.assertEqual(SessionDetailsForm(group=self.group).initial['purpose'], 'admin')
def test_session_purposes(self):
# change the session_purposes GroupFeature to check that it's being used
self.group.features.session_purposes = ['coding', 'admin', 'closed_meeting']
self.group.features.save()
self.assertCountEqual(
SessionDetailsForm(group=self.group).fields['purpose'].queryset.values_list('slug', flat=True),
['coding', 'admin', 'closed_meeting'],
)
self.group.features.session_purposes = ['admin', 'closed_meeting']
self.group.features.save()
self.assertCountEqual(
SessionDetailsForm(group=self.group).fields['purpose'].queryset.values_list('slug', flat=True),
['admin', 'closed_meeting'],
)
def test_allowed_types(self):
"""Correct map from SessionPurposeName to allowed TimeSlotTypeName should be sent to JS"""
# change the allowed map to a known and non-standard arrangement
SessionPurposeName.objects.filter(slug='regular').update(timeslot_types=['other'])
SessionPurposeName.objects.filter(slug='admin').update(timeslot_types=['break', 'regular'])
SessionPurposeName.objects.exclude(slug__in=['regular', 'admin']).update(timeslot_types=[])
# check that the map we just installed is actually passed along to the JS through a widget attr
allowed = json.loads(SessionDetailsForm(group=self.group).fields['type'].widget.attrs['data-allowed-options'])
self.assertEqual(allowed['regular'], ['other'])
self.assertEqual(allowed['admin'], ['break', 'regular'])
for purpose in SessionPurposeName.objects.exclude(slug__in=['regular', 'admin']):
self.assertEqual(allowed[purpose.slug], [])
def test_duration_options(self):
self.assertTrue(self.group.features.acts_like_wg)
self.assertEqual(
SessionDetailsForm(group=self.group).fields['requested_duration'].choices,
[('', '--Please select'), ('3600', '1 hour'), ('7200', '2 hours')],
)
self.group.features.acts_like_wg = False
self.group.features.save()
self.assertEqual(
SessionDetailsForm(group=self.group).fields['requested_duration'].choices,
[('', '--Please select'), ('1800', '30 minutes'),
('3600', '1 hour'), ('5400', '1 hour 30 minutes'),
('7200', '2 hours'), ('9000', '2 hours 30 minutes'),
('10800', '3 hours'), ('12600', '3 hours 30 minutes'),
('14400', '4 hours')],
)
def test_on_agenda(self):
# new session gets its purpose's on_agenda value when True
self.assertTrue(SessionPurposeName.objects.get(slug='regular').on_agenda)
form = SessionDetailsForm(group=self.group, data={
'name': 'blah',
'purpose': 'regular',
'type': 'regular',
'requested_duration': '3600',
})
self.assertTrue(form.is_valid())
self.assertTrue(form.cleaned_data['on_agenda'])
# new session gets its purpose's on_agenda value when False
SessionPurposeName.objects.filter(slug='regular').update(on_agenda=False)
form = SessionDetailsForm(group=self.group, data={
'name': 'blah',
'purpose': 'regular',
'type': 'regular',
'requested_duration': '3600',
})
self.assertTrue(form.is_valid())
self.assertFalse(form.cleaned_data['on_agenda'])
# updated session keeps its on_agenda value, even if it differs from its purpose
session = SessionFactory(meeting=self.meeting, add_to_schedule=False, on_agenda=True)
form = SessionDetailsForm(
group=self.group,
instance=session,
data={
'name': 'blah',
'purpose': 'regular',
'type': 'regular',
'requested_duration': '3600',
},
)
self.assertTrue(form.is_valid())
self.assertTrue(form.cleaned_data['on_agenda'])
# session gets purpose's on_agenda value if its purpose changes (changing the
# purpose away from 'regular' so we can use the 'wg' type group that only allows
# regular sessions)
session.purpose_id = 'admin'
session.save()
form = SessionDetailsForm(
group=self.group,
instance=session,
data={
'name': 'blah',
'purpose': 'regular',
'type': 'regular',
'requested_duration': '3600',
},
)
self.assertTrue(form.is_valid())
self.assertFalse(form.cleaned_data['on_agenda'])
class SessionEditFormTests(TestCase):
def test_rejects_group_mismatch(self):
session = SessionFactory(meeting__type_id='ietf', meeting__populate_schedule=False, add_to_schedule=False)
other_group = GroupFactory()
with self.assertRaisesMessage(ValueError, 'Session group does not match group keyword'):
SessionEditForm(instance=session, group=other_group)
class SessionDetailsInlineFormset(TestCase):
def setUp(self):
super().setUp()
self.meeting = MeetingFactory(type_id='ietf', populate_schedule=False)
self.group = GroupFactory()
def test_initial_sessions(self):
"""Sessions for the correct meeting and group should be included"""
sessions = SessionFactory.create_batch(2, meeting=self.meeting, group=self.group, add_to_schedule=False)
SessionFactory(meeting=self.meeting, add_to_schedule=False) # should be ignored
SessionFactory(group=self.group, add_to_schedule=False) # should be ignored
formset_class = sessiondetailsformset_factory()
formset = formset_class(group=self.group, meeting=self.meeting)
self.assertCountEqual(formset.queryset.all(), sessions)
def test_forms_created_with_group_kwarg(self):
class MockFormClass(SessionDetailsForm):
"""Mock class to track the group that was passed to the init method"""
def __init__(self, group, *args, **kwargs):
self.init_group_argument = group
super().__init__(group, *args, **kwargs)
with patch('ietf.meeting.forms.SessionDetailsForm', MockFormClass):
formset_class = sessiondetailsformset_factory()
formset = formset_class(meeting=self.meeting, group=self.group)
str(formset) # triggers instantiation of forms
self.assertGreaterEqual(len(formset), 1)
for form in formset:
self.assertEqual(form.init_group_argument, self.group)
def test_add_instance(self):
session = SessionFactory(meeting=self.meeting, group=self.group, add_to_schedule=False)
formset_class = sessiondetailsformset_factory()
formset = formset_class(group=self.group, meeting=self.meeting, data={
'session_set-TOTAL_FORMS': '2',
'session_set-INITIAL_FORMS': '1',
'session_set-0-id': str(session.pk),
'session_set-0-name': 'existing',
'session_set-0-purpose': 'regular',
'session_set-0-type': 'regular',
'session_set-0-requested_duration': '3600',
'session_set-1-name': 'new',
'session_set-1-purpose': 'regular',
'session_set-1-type': 'regular',
'session_set-1-requested_duration': '3600',
})
formset.save()
# make sure session created
self.assertEqual(self.meeting.session_set.count(), 2)
self.assertIn(session, self.meeting.session_set.all())
self.assertEqual(len(formset.new_objects), 1)
self.assertEqual(formset.new_objects[0].name, 'new')
self.assertEqual(formset.new_objects[0].meeting, self.meeting)
self.assertEqual(formset.new_objects[0].group, self.group)

View file

@ -8,6 +8,8 @@ import random
import re
import shutil
import pytz
import requests.exceptions
import requests_mock
from unittest import skipIf
from mock import patch, PropertyMock
@ -19,7 +21,6 @@ from urllib.parse import urlparse, urlsplit
from PIL import Image
from pathlib import Path
from django.urls import reverse as urlreverse
from django.conf import settings
from django.contrib.auth.models import User
@ -44,7 +45,7 @@ from ietf.meeting.models import Session, TimeSlot, Meeting, SchedTimeSessAssignm
from ietf.meeting.test_data import make_meeting_test_data, make_interim_meeting, make_interim_test_data
from ietf.meeting.utils import finalize, condition_slide_order
from ietf.meeting.utils import add_event_info_to_session_qs
from ietf.meeting.views import session_draft_list, parse_agenda_filter_params
from ietf.meeting.views import session_draft_list, parse_agenda_filter_params, sessions_post_save
from ietf.name.models import SessionStatusName, ImportantDateName, RoleName, ProceedingsMaterialTypeName
from ietf.utils.decorators import skip_coverage
from ietf.utils.mail import outbox, empty_outbox, get_payload_text
@ -1676,6 +1677,23 @@ class EditMeetingScheduleTests(TestCase):
self.assertEqual(r.status_code, 200)
self.assertTrue(self._decode_json_response(r)['success'])
def test_editor_with_no_timeslots(self):
"""Schedule editor should not crash when there are no timeslots"""
meeting = MeetingFactory(
type_id='ietf',
date=datetime.date.today() + datetime.timedelta(days=7),
populate_schedule=False,
)
meeting.schedule = ScheduleFactory(meeting=meeting)
meeting.save()
SessionFactory(meeting=meeting, add_to_schedule=False)
self.assertEqual(meeting.timeslot_set.count(), 0, 'Test problem - meeting should not have any timeslots')
url = urlreverse('ietf.meeting.views.edit_meeting_schedule', kwargs={'num': meeting.number})
self.assertTrue(self.client.login(username='secretary', password='secretary+password'))
r = self.client.get(url)
self.assertEqual(r.status_code, 200)
self.assertContains(r, 'No timeslots exist')
self.assertContains(r, urlreverse('ietf.meeting.views.edit_timeslots', kwargs={'num': meeting.number}))
class EditTimeslotsTests(TestCase):
@ -4438,7 +4456,9 @@ class InterimTests(TestCase):
'session_set-MIN_NUM_FORMS':0,
'session_set-MAX_NUM_FORMS':1000}
r = self.client.post(urlreverse("ietf.meeting.views.interim_request"),data)
with patch('ietf.meeting.views.sessions_post_save', wraps=sessions_post_save) as mock:
r = self.client.post(urlreverse("ietf.meeting.views.interim_request"),data)
self.assertTrue(mock.called)
self.assertRedirects(r,urlreverse('ietf.meeting.views.upcoming'))
meeting = Meeting.objects.order_by('id').last()
self.assertEqual(meeting.type_id,'interim')
@ -4507,7 +4527,9 @@ class InterimTests(TestCase):
'session_set-TOTAL_FORMS':1,
'session_set-INITIAL_FORMS':0}
r = self.client.post(urlreverse("ietf.meeting.views.interim_request"),data)
with patch('ietf.meeting.views.sessions_post_save', wraps=sessions_post_save) as mock:
r = self.client.post(urlreverse("ietf.meeting.views.interim_request"),data)
self.assertTrue(mock.called)
self.assertRedirects(r,urlreverse('ietf.meeting.views.upcoming'))
meeting = Meeting.objects.order_by('id').last()
self.assertEqual(meeting.type_id,'interim')
@ -4561,7 +4583,9 @@ class InterimTests(TestCase):
'session_set-TOTAL_FORMS':2,
'session_set-INITIAL_FORMS':0}
r = self.client.post(urlreverse("ietf.meeting.views.interim_request"),data)
with patch('ietf.meeting.views.sessions_post_save', wraps=sessions_post_save) as mock:
r = self.client.post(urlreverse("ietf.meeting.views.interim_request"),data)
self.assertTrue(mock.called)
self.assertRedirects(r,urlreverse('ietf.meeting.views.upcoming'))
meeting = Meeting.objects.order_by('id').last()
@@ -4697,8 +4721,9 @@ class InterimTests(TestCase):
'session_set-TOTAL_FORMS':2,
'session_set-INITIAL_FORMS':0}
r = self.client.post(urlreverse("ietf.meeting.views.interim_request"),data)
with patch('ietf.meeting.views.sessions_post_save', wraps=sessions_post_save) as mock:
r = self.client.post(urlreverse("ietf.meeting.views.interim_request"),data)
self.assertTrue(mock.called)
self.assertRedirects(r,urlreverse('ietf.meeting.views.upcoming'))
meeting_count_after = Meeting.objects.filter(type='interim').count()
self.assertEqual(meeting_count_after,meeting_count_before + 2)
@@ -5033,7 +5058,8 @@ class InterimTests(TestCase):
def test_interim_request_disapprove_with_extra_and_canceled_sessions(self):
self.do_interim_request_disapprove_test(extra_session=True, canceled_session=True)
def test_interim_request_cancel(self):
@patch('ietf.meeting.views.sessions_post_cancel')
def test_interim_request_cancel(self, mock):
"""Test that interim request cancel function works
Does not test that UI buttons are present, that is handled elsewhere.
@@ -5052,6 +5078,7 @@ class InterimTests(TestCase):
self.client.login(username="ameschairman", password="ameschairman+password")
r = self.client.post(url, {'comments': comments})
self.assertEqual(r.status_code, 403)
self.assertFalse(mock.called, 'Should not cancel sessions if request rejected')
# test cancelling before announcement
self.client.login(username="marschairman", password="marschairman+password")
@@ -5062,8 +5089,11 @@ class InterimTests(TestCase):
self.assertEqual(session.current_status,'canceledpa')
self.assertEqual(session.agenda_note, comments)
self.assertEqual(len(outbox), length_before) # no email notice
self.assertTrue(mock.called, 'Should cancel sessions if request handled')
self.assertCountEqual(mock.call_args[0][1], meeting.session_set.all())
# test cancelling after announcement
mock.reset_mock()
meeting = add_event_info_to_session_qs(Session.objects.filter(meeting__type='interim', group__acronym='mars')).filter(current_status='sched').first().meeting
url = urlreverse('ietf.meeting.views.interim_request_cancel', kwargs={'number': meeting.number})
r = self.client.post(url, {'comments': comments})
@@ -5073,8 +5103,11 @@ class InterimTests(TestCase):
self.assertEqual(session.agenda_note, comments)
self.assertEqual(len(outbox), length_before + 1)
self.assertIn('Interim Meeting Cancelled', outbox[-1]['Subject'])
self.assertTrue(mock.called, 'Should cancel sessions if request handled')
self.assertCountEqual(mock.call_args[0][1], meeting.session_set.all())
def test_interim_request_session_cancel(self):
@patch('ietf.meeting.views.sessions_post_cancel')
def test_interim_request_session_cancel(self, mock):
"""Test that interim meeting session cancellation functions
Does not test that UI buttons are present, that is handled elsewhere.
@@ -5090,6 +5123,7 @@ class InterimTests(TestCase):
url = urlreverse('ietf.meeting.views.interim_request_session_cancel', kwargs={'sessionid': session.pk})
r = self.client.post(url, {'comments': comments})
self.assertEqual(r.status_code, 409)
self.assertFalse(mock.called, 'Should not cancel sessions if request rejected')
# Add a second session
SessionFactory(meeting=meeting, status_id='apprw')
@@ -5099,7 +5133,8 @@ class InterimTests(TestCase):
self.client.login(username="ameschairman", password="ameschairman+password")
r = self.client.post(url, {'comments': comments})
self.assertEqual(r.status_code, 403)
self.assertFalse(mock.called, 'Should not cancel sessions if request rejected')
# test cancelling before announcement
self.client.login(username="marschairman", password="marschairman+password")
length_before = len(outbox)
@@ -5108,6 +5143,9 @@ class InterimTests(TestCase):
r = self.client.post(url, {'comments': comments})
self.assertRedirects(r, urlreverse('ietf.meeting.views.interim_request_details',
kwargs={'number': meeting.number}))
self.assertTrue(mock.called, 'Should cancel sessions if request handled')
self.assertCountEqual(mock.call_args[0][1], [session])
# This session should be canceled...
sessions = meeting.session_set.with_current_status()
session = sessions.filter(id=session.pk).first() # reload our session info
@@ -5121,6 +5159,7 @@ class InterimTests(TestCase):
self.assertEqual(len(outbox), length_before) # no email notice
# test cancelling after announcement
mock.reset_mock()
session = Session.objects.with_current_status().filter(
meeting__type='interim', group__acronym='mars', current_status='sched').first()
meeting = session.meeting
@@ -5129,6 +5168,7 @@ class InterimTests(TestCase):
url = urlreverse('ietf.meeting.views.interim_request_session_cancel', kwargs={'sessionid': session.pk})
r = self.client.post(url, {'comments': comments})
self.assertEqual(r.status_code, 409)
self.assertFalse(mock.called, 'Should not cancel sessions if request rejected')
# Add another session
SessionFactory(meeting=meeting, status_id='sched') # two sessions so canceling a session makes sense
@@ -5138,6 +5178,9 @@ class InterimTests(TestCase):
r = self.client.post(url, {'comments': comments})
self.assertRedirects(r, urlreverse('ietf.meeting.views.interim_request_details',
kwargs={'number': meeting.number}))
self.assertTrue(mock.called, 'Should cancel sessions if request handled')
self.assertCountEqual(mock.call_args[0][1], [session])
# This session should be canceled...
sessions = meeting.session_set.with_current_status()
session = sessions.filter(id=session.pk).first() # reload our session info
@@ -5236,6 +5279,45 @@ class InterimTests(TestCase):
d["minutes"], d["seconds"] = divmod(rem, 60)
return fmt.format(**d)
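The `divmod` chain in the helper fragment above splits a second count into fields for a format string. A self-contained sketch of the same pattern (the hours step sits above the quoted hunk, so it is reconstructed here as an assumption, along with the format string):

```python
def format_duration(total_seconds, fmt="{hours}:{minutes:02}:{seconds:02}"):
    d = {}
    d["hours"], rem = divmod(total_seconds, 3600)  # assumed step, outside the quoted hunk
    d["minutes"], d["seconds"] = divmod(rem, 60)   # the step shown in the diff
    return fmt.format(**d)

print(format_duration(5430))  # 1:30:30
```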
def test_interim_request_edit_agenda_updates_doc(self):
"""Updating the agenda through the request edit form should update the doc correctly"""
make_interim_test_data()
meeting = add_event_info_to_session_qs(Session.objects.filter(meeting__type='interim', group__acronym='mars')).filter(current_status='sched').first().meeting
group = meeting.session_set.first().group
url = urlreverse('ietf.meeting.views.interim_request_edit', kwargs={'number': meeting.number})
session = meeting.session_set.first()
agenda_doc = session.agenda()
rev_before = agenda_doc.rev
uploaded_filename_before = agenda_doc.uploaded_filename
self.client.login(username='secretary', password='secretary+password')
r = self.client.get(url)
form_initial = r.context['form'].initial
formset_initial = r.context['formset'].forms[0].initial
data = {
'group': group.pk,
'meeting_type': 'single',
'session_set-0-id': session.id,
'session_set-0-date': formset_initial['date'].strftime('%Y-%m-%d'),
'session_set-0-time': formset_initial['time'].strftime('%H:%M'),
'session_set-0-requested_duration': '00:30',
'session_set-0-remote_instructions': formset_initial['remote_instructions'],
'session_set-0-agenda': 'modified agenda contents',
'session_set-0-agenda_note': formset_initial['agenda_note'],
'session_set-TOTAL_FORMS': 1,
'session_set-INITIAL_FORMS': 1,
}
data.update(form_initial)
r = self.client.post(url, data)
self.assertRedirects(r, urlreverse('ietf.meeting.views.interim_request_details', kwargs={'number': meeting.number}))
session = Session.objects.get(pk=session.pk) # refresh
agenda_doc = session.agenda()
self.assertEqual(agenda_doc.rev, f'{int(rev_before) + 1:02}', 'Revision of agenda should increase')
self.assertNotEqual(agenda_doc.uploaded_filename, uploaded_filename_before, 'Uploaded filename should be updated')
with (Path(agenda_doc.get_file_path()) / agenda_doc.uploaded_filename).open() as f:
self.assertEqual(f.read(), 'modified agenda contents', 'New agenda contents should be saved')
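The revision assertion above (`f'{int(rev_before) + 1:02}'`) relies on the two-digit, zero-padded revision scheme also used in the utils diff (`'%02d' % (int(doc.rev)+1)`). As a standalone check of that arithmetic:

```python
def next_rev(rev):
    # Document revisions are zero-padded two-digit strings: "00" -> "01" -> ... -> "10"
    return f"{int(rev) + 1:02}"

print(next_rev("00"), next_rev("09"))  # 01 10
```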
def test_interim_request_details_permissions(self):
make_interim_test_data()
meeting = add_event_info_to_session_qs(Session.objects.filter(meeting__type='interim', group__acronym='mars')).filter(current_status='apprw').first().meeting
@@ -5404,12 +5486,16 @@ class IphoneAppJsonTests(TestCase):
self.assertTrue(msessions.filter(group__acronym=s['group']['acronym']).exists())
class FinalizeProceedingsTests(TestCase):
@patch('urllib.request.urlopen')
def test_finalize_proceedings(self, mock_urlopen):
mock_urlopen.return_value = BytesIO(b'[{"LastName":"Smith","FirstName":"John","Company":"ABC","Country":"US"}]')
@override_settings(STATS_REGISTRATION_ATTENDEES_JSON_URL='https://ietf.example.com/{number}')
@requests_mock.Mocker()
def test_finalize_proceedings(self, mock):
make_meeting_test_data()
meeting = Meeting.objects.filter(type_id='ietf').order_by('id').last()
meeting.session_set.filter(group__acronym='mars').first().sessionpresentation_set.create(document=Document.objects.filter(type='draft').first(),rev=None)
mock.get(
settings.STATS_REGISTRATION_ATTENDEES_JSON_URL.format(number=meeting.number),
text=json.dumps([{"LastName": "Smith", "FirstName": "John", "Company": "ABC", "Country": "US"}]),
)
url = urlreverse('ietf.meeting.views.finalize_proceedings',kwargs={'num':meeting.number})
login_testing_unauthorized(self,"secretary",url)
@@ -5604,8 +5690,10 @@ class MaterialsTests(TestCase):
self.assertEqual(doc.rev,'02')
# Verify that we don't have dead links
url = url=urlreverse('ietf.meeting.views.session_details', kwargs={'num':session.meeting.number, 'acronym': session.group.acronym})
url = urlreverse('ietf.meeting.views.session_details', kwargs={'num':session.meeting.number, 'acronym': session.group.acronym})
top = '/meeting/%s/' % session.meeting.number
self.requests_mock.get(f'{session.notes_url()}/download', text='markdown notes')
self.requests_mock.get(f'{session.notes_url()}/info', text=json.dumps({'title': 'title', 'updatetime': '2021-12-01T17:11:00z'}))
self.crawl_materials(url=url, top=top)
def test_upload_minutes_agenda_unscheduled(self):
@@ -5652,8 +5740,10 @@ class MaterialsTests(TestCase):
self.assertEqual(doc.rev,'00')
# Verify that we don't have dead links
url = url=urlreverse('ietf.meeting.views.session_details', kwargs={'num':session.meeting.number, 'acronym': session.group.acronym})
url = urlreverse('ietf.meeting.views.session_details', kwargs={'num':session.meeting.number, 'acronym': session.group.acronym})
top = '/meeting/%s/' % session.meeting.number
self.requests_mock.get(f'{session.notes_url()}/download', text='markdown notes')
self.requests_mock.get(f'{session.notes_url()}/info', text=json.dumps({'title': 'title', 'updatetime': '2021-12-01T17:11:00z'}))
self.crawl_materials(url=url, top=top)
def test_upload_slides(self):
@@ -5927,6 +6017,151 @@ class MaterialsTests(TestCase):
self.assertIn('third version', contents)
@override_settings(IETF_NOTES_URL='https://notes.ietf.org/')
class ImportNotesTests(TestCase):
settings_temp_path_overrides = TestCase.settings_temp_path_overrides + ['AGENDA_PATH']
def setUp(self):
super().setUp()
self.session = SessionFactory(meeting__type_id='ietf')
self.meeting = self.session.meeting
def test_retrieves_note(self):
"""Can import and preview a note from notes.ietf.org"""
url = urlreverse('ietf.meeting.views.import_session_minutes',
kwargs={'num': self.meeting.number, 'session_id': self.session.pk})
self.client.login(username='secretary', password='secretary+password')
with requests_mock.Mocker() as mock:
mock.get(f'https://notes.ietf.org/{self.session.notes_id()}/download', text='markdown text')
mock.get(f'https://notes.ietf.org/{self.session.notes_id()}/info',
text=json.dumps({"title": "title", "updatetime": "2021-12-02T11:22:33z"}))
r = self.client.get(url)
self.assertEqual(r.status_code, 200)
q = PyQuery(r.content)
iframe = q('iframe#preview')
self.assertEqual('<p>markdown text</p>', iframe.attr('srcdoc'))
markdown_text_input = q('form #id_markdown_text')
self.assertEqual(markdown_text_input.val(), 'markdown text')
def test_retrieves_with_broken_metadata(self):
"""Can import and preview a note even if it has a metadata problem"""
url = urlreverse('ietf.meeting.views.import_session_minutes',
kwargs={'num': self.meeting.number, 'session_id': self.session.pk})
self.client.login(username='secretary', password='secretary+password')
with requests_mock.Mocker() as mock:
mock.get(f'https://notes.ietf.org/{self.session.notes_id()}/download', text='markdown text')
mock.get(f'https://notes.ietf.org/{self.session.notes_id()}/info', text='this is not valid json {]')
r = self.client.get(url)
self.assertEqual(r.status_code, 200)
q = PyQuery(r.content)
iframe = q('iframe#preview')
self.assertEqual('<p>markdown text</p>', iframe.attr('srcdoc'))
markdown_text_input = q('form #id_markdown_text')
self.assertEqual(markdown_text_input.val(), 'markdown text')
def test_redirects_on_success(self):
"""Redirects to session details page after import"""
url = urlreverse('ietf.meeting.views.import_session_minutes',
kwargs={'num': self.meeting.number, 'session_id': self.session.pk})
self.client.login(username='secretary', password='secretary+password')
r = self.client.post(url, {'markdown_text': 'markdown text'})
self.assertRedirects(
r,
urlreverse(
'ietf.meeting.views.session_details',
kwargs={
'num': self.meeting.number,
'acronym': self.session.group.acronym,
},
),
)
def test_imports_previewed_text(self):
"""Import text that was shown as preview even if notes site is updated"""
url = urlreverse('ietf.meeting.views.import_session_minutes',
kwargs={'num': self.meeting.number, 'session_id': self.session.pk})
self.client.login(username='secretary', password='secretary+password')
with requests_mock.Mocker() as mock:
mock.get(f'https://notes.ietf.org/{self.session.notes_id()}/download', text='updated markdown text')
mock.get(f'https://notes.ietf.org/{self.session.notes_id()}/info',
text=json.dumps({"title": "title", "updatetime": "2021-12-02T11:22:33z"}))
r = self.client.post(url, {'markdown_text': 'original markdown text'})
self.assertEqual(r.status_code, 302)
minutes_path = Path(self.meeting.get_materials_path()) / 'minutes'
with (minutes_path / self.session.minutes().uploaded_filename).open() as f:
self.assertEqual(f.read(), 'original markdown text')
def test_refuses_identical_import(self):
"""Should not be able to import text identical to the current revision"""
url = urlreverse('ietf.meeting.views.import_session_minutes',
kwargs={'num': self.meeting.number, 'session_id': self.session.pk})
self.client.login(username='secretary', password='secretary+password')
r = self.client.post(url, {'markdown_text': 'original markdown text'}) # create a rev
self.assertEqual(r.status_code, 302)
with requests_mock.Mocker() as mock:
mock.get(f'https://notes.ietf.org/{self.session.notes_id()}/download', text='original markdown text')
mock.get(f'https://notes.ietf.org/{self.session.notes_id()}/info',
text=json.dumps({"title": "title", "updatetime": "2021-12-02T11:22:33z"}))
r = self.client.get(url) # try to import the same text
self.assertContains(r, "This document is identical", status_code=200)
q = PyQuery(r.content)
self.assertEqual(len(q('button:disabled[type="submit"]')), 1)
self.assertEqual(len(q('button:not(:disabled)[type="submit"]')), 0)
def test_handles_missing_previous_revision_file(self):
"""Should still allow import if the file for the previous revision is missing"""
url = urlreverse('ietf.meeting.views.import_session_minutes',
kwargs={'num': self.meeting.number, 'session_id': self.session.pk})
self.client.login(username='secretary', password='secretary+password')
r = self.client.post(url, {'markdown_text': 'original markdown text'}) # create a rev
# remove the file uploaded for the first rev
minutes_docs = self.session.sessionpresentation_set.filter(document__type='minutes')
self.assertEqual(minutes_docs.count(), 1)
Path(minutes_docs.first().document.get_file_name()).unlink()
self.assertEqual(r.status_code, 302)
with requests_mock.Mocker() as mock:
mock.get(f'https://notes.ietf.org/{self.session.notes_id()}/download', text='original markdown text')
mock.get(f'https://notes.ietf.org/{self.session.notes_id()}/info',
text=json.dumps({"title": "title", "updatetime": "2021-12-02T11:22:33z"}))
r = self.client.get(url)
self.assertEqual(r.status_code, 200)
q = PyQuery(r.content)
iframe = q('iframe#preview')
self.assertEqual('<p>original markdown text</p>', iframe.attr('srcdoc'))
markdown_text_input = q('form #id_markdown_text')
self.assertEqual(markdown_text_input.val(), 'original markdown text')
def test_handles_note_does_not_exist(self):
"""Should not try to import a note that does not exist"""
url = urlreverse('ietf.meeting.views.import_session_minutes',
kwargs={'num': self.meeting.number, 'session_id': self.session.pk})
self.client.login(username='secretary', password='secretary+password')
with requests_mock.Mocker() as mock:
mock.get(requests_mock.ANY, status_code=404)
r = self.client.get(url, follow=True)
self.assertContains(r, 'Could not import', status_code=200)
def test_handles_notes_server_failure(self):
"""Problems communicating with the notes server should be handled gracefully"""
url = urlreverse('ietf.meeting.views.import_session_minutes',
kwargs={'num': self.meeting.number, 'session_id': self.session.pk})
self.client.login(username='secretary', password='secretary+password')
with requests_mock.Mocker() as mock:
mock.get(re.compile(r'.+/download'), exc=requests.exceptions.ConnectTimeout)
mock.get(re.compile(r'.+/info'), text='{}')
r = self.client.get(url, follow=True)
self.assertContains(r, 'Could not reach the notes server', status_code=200)
class SessionTests(TestCase):
def test_meeting_requests(self):
@@ -6911,12 +7146,15 @@ class ProceedingsTests(BaseMeetingTestCase):
0,
)
@patch('ietf.meeting.utils.requests.get')
def test_proceedings_attendees(self, mockobj):
mockobj.return_value.text = b'[{"LastName":"Smith","FirstName":"John","Company":"ABC","Country":"US"}]'
mockobj.return_value.json = lambda: json.loads(b'[{"LastName":"Smith","FirstName":"John","Company":"ABC","Country":"US"}]')
@override_settings(STATS_REGISTRATION_ATTENDEES_JSON_URL='https://ietf.example.com/{number}')
@requests_mock.Mocker()
def test_proceedings_attendees(self, mock):
make_meeting_test_data()
meeting = MeetingFactory(type_id='ietf', date=datetime.date(2016,7,14), number="97")
mock.get(
settings.STATS_REGISTRATION_ATTENDEES_JSON_URL.format(number=meeting.number),
text=json.dumps([{"LastName": "Smith", "FirstName": "John", "Company": "ABC", "Country": "US"}]),
)
finalize(meeting)
url = urlreverse('ietf.meeting.views.proceedings_attendees',kwargs={'num':97})
response = self.client.get(url)
@@ -6924,14 +7162,18 @@ class ProceedingsTests(BaseMeetingTestCase):
q = PyQuery(response.content)
self.assertEqual(1,len(q("#id_attendees tbody tr")))
@patch('urllib.request.urlopen')
def test_proceedings_overview(self, mock_urlopen):
@override_settings(STATS_REGISTRATION_ATTENDEES_JSON_URL='https://ietf.example.com/{number}')
@requests_mock.Mocker()
def test_proceedings_overview(self, mock):
'''Test proceedings IETF Overview page.
Note: old meetings aren't supported so need to add a new meeting then test.
'''
mock_urlopen.return_value = BytesIO(b'[{"LastName":"Smith","FirstName":"John","Company":"ABC","Country":"US"}]')
make_meeting_test_data()
meeting = MeetingFactory(type_id='ietf', date=datetime.date(2016,7,14), number="97")
mock.get(
settings.STATS_REGISTRATION_ATTENDEES_JSON_URL.format(number=meeting.number),
text=json.dumps([{"LastName": "Smith", "FirstName": "John", "Company": "ABC", "Country": "US"}]),
)
finalize(meeting)
url = urlreverse('ietf.meeting.views.proceedings_overview',kwargs={'num':97})
response = self.client.get(url)

View file

@@ -13,6 +13,7 @@ safe_for_all_meeting_types = [
url(r'^session/(?P<session_id>\d+)/bluesheets$', views.upload_session_bluesheets),
url(r'^session/(?P<session_id>\d+)/minutes$', views.upload_session_minutes),
url(r'^session/(?P<session_id>\d+)/agenda$', views.upload_session_agenda),
url(r'^session/(?P<session_id>\d+)/import/minutes$', views.import_session_minutes),
url(r'^session/(?P<session_id>\d+)/propose_slides$', views.propose_session_slides),
url(r'^session/(?P<session_id>\d+)/slides(?:/%(name)s)?$' % settings.URL_REGEXPS, views.upload_session_slides),
url(r'^session/(?P<session_id>\d+)/add_to_session$', views.ajax_add_slides_to_session),

View file

@@ -1,17 +1,19 @@
# Copyright The IETF Trust 2016-2020, All Rights Reserved
# -*- coding: utf-8 -*-
import datetime
import itertools
import re
import requests
import subprocess
from collections import defaultdict
from pathlib import Path
from urllib.error import HTTPError
from django.conf import settings
from django.contrib import messages
from django.template.loader import render_to_string
from django.utils.encoding import smart_text
from django.utils.html import format_html
from django.utils.safestring import mark_safe
@@ -19,11 +21,15 @@ import debug # pyflakes:ignore
from ietf.dbtemplate.models import DBTemplate
from ietf.meeting.models import Session, SchedulingEvent, TimeSlot, Constraint, SchedTimeSessAssignment
from ietf.doc.models import Document, DocAlias, State, NewRevisionDocEvent
from ietf.group.models import Group
from ietf.group.utils import can_manage_materials
from ietf.name.models import SessionStatusName, ConstraintName
from ietf.person.models import Person
from ietf.secr.proceedings.proc_utils import import_audio_files
from ietf.utils.html import sanitize_document
from ietf.utils.log import log
def session_time_for_sorting(session, use_meeting_date):
official_timeslot = TimeSlot.objects.filter(sessionassignments__session=session, sessionassignments__schedule__in=[session.meeting.schedule, session.meeting.schedule.base if session.meeting.schedule else None]).first()
@@ -118,9 +124,10 @@ def create_proceedings_templates(meeting):
# Get meeting attendees from registration system
url = settings.STATS_REGISTRATION_ATTENDEES_JSON_URL.format(number=meeting.number)
try:
attendees = requests.get(url).json()
except (ValueError, HTTPError):
attendees = requests.get(url, timeout=settings.DEFAULT_REQUESTS_TIMEOUT).json()
except (ValueError, HTTPError, requests.Timeout) as exc:
attendees = []
log(f'Failed to retrieve meeting attendees from [{url}]: {exc}')
if attendees:
attendees = sorted(attendees, key = lambda a: a['LastName'])
@@ -543,4 +550,180 @@ def preprocess_meeting_important_dates(meetings):
m.important_dates = m.importantdate_set.prefetch_related("name")
for d in m.important_dates:
d.midnight_cutoff = "UTC 23:59" in d.name.name
def get_meeting_sessions(num, acronym):
types = ['regular','plenary','other']
sessions = Session.objects.filter(
meeting__number=num,
group__acronym=acronym,
type__in=types,
)
if not sessions:
sessions = Session.objects.filter(
meeting__number=num,
short=acronym,
type__in=types,
)
return sessions
class SessionNotScheduledError(Exception):
"""Indicates failure because operation requires a scheduled session"""
pass
class SaveMaterialsError(Exception):
"""Indicates failure saving session materials"""
pass
def save_session_minutes_revision(session, file, ext, request, encoding=None, apply_to_all=False):
"""Creates or updates session minutes records
This updates the database models to reflect a new version. It does not handle
storing the new file contents, that should be handled via handle_upload_file()
or similar.
If the session does not already have minutes, it must be a scheduled
session. If not, SessionNotScheduledError will be raised.
Returns (Document, [DocEvents]), which should be passed to doc.save_with_history()
if the file contents are stored successfully.
"""
minutes_sp = session.sessionpresentation_set.filter(document__type='minutes').first()
if minutes_sp:
doc = minutes_sp.document
doc.rev = '%02d' % (int(doc.rev)+1)
minutes_sp.rev = doc.rev
minutes_sp.save()
else:
ota = session.official_timeslotassignment()
sess_time = ota and ota.timeslot.time
if not sess_time:
raise SessionNotScheduledError
if session.meeting.type_id=='ietf':
name = 'minutes-%s-%s' % (session.meeting.number,
session.group.acronym)
title = 'Minutes IETF%s: %s' % (session.meeting.number,
session.group.acronym)
if not apply_to_all:
name += '-%s' % (sess_time.strftime("%Y%m%d%H%M"),)
title += ': %s' % (sess_time.strftime("%a %H:%M"),)
else:
name = 'minutes-%s-%s' % (session.meeting.number, sess_time.strftime("%Y%m%d%H%M"))
title = 'Minutes %s: %s' % (session.meeting.number, sess_time.strftime("%a %H:%M"))
if Document.objects.filter(name=name).exists():
doc = Document.objects.get(name=name)
doc.rev = '%02d' % (int(doc.rev)+1)
else:
doc = Document.objects.create(
name = name,
type_id = 'minutes',
title = title,
group = session.group,
rev = '00',
)
DocAlias.objects.create(name=doc.name).docs.add(doc)
doc.states.add(State.objects.get(type_id='minutes',slug='active'))
if session.sessionpresentation_set.filter(document=doc).exists():
sp = session.sessionpresentation_set.get(document=doc)
sp.rev = doc.rev
sp.save()
else:
session.sessionpresentation_set.create(document=doc,rev=doc.rev)
if apply_to_all:
for other_session in get_meeting_sessions(session.meeting.number, session.group.acronym):
if other_session != session:
other_session.sessionpresentation_set.filter(document__type='minutes').delete()
other_session.sessionpresentation_set.create(document=doc,rev=doc.rev)
filename = f'{doc.name}-{doc.rev}{ext}'
doc.uploaded_filename = filename
e = NewRevisionDocEvent.objects.create(
doc=doc,
by=request.user.person,
type='new_revision',
desc=f'New revision available: {doc.rev}',
rev=doc.rev,
)
# The way this function builds the filename, it will never trigger the file delete in handle_upload_file.
save_error = handle_upload_file(
file=file,
filename=doc.uploaded_filename,
meeting=session.meeting,
subdir='minutes',
request=request,
encoding=encoding,
)
if save_error:
raise SaveMaterialsError(save_error)
else:
doc.save_with_history([e])
def handle_upload_file(file, filename, meeting, subdir, request=None, encoding=None):
"""Accept an uploaded materials file
This function takes a file object, a filename and a meeting object and subdir as string.
It saves the file to the appropriate directory, get_materials_path() + subdir.
If the file is a zip file, it creates a new directory in 'slides', which is the basename of the
zip file and unzips the file in the new directory.
"""
filename = Path(filename)
is_zipfile = filename.suffix == '.zip'
path = Path(meeting.get_materials_path()) / subdir
if is_zipfile:
path = path / filename.stem
path.mkdir(parents=True, exist_ok=True)
# agendas and minutes can only have one file instance so delete file if it already exists
if subdir in ('agenda', 'minutes'):
for f in path.glob(f'{filename.stem}.*'):
try:
f.unlink()
except FileNotFoundError:
pass # if the file is already gone, so be it
with (path / filename).open('wb+') as destination:
if filename.suffix in settings.MEETING_VALID_MIME_TYPE_EXTENSIONS['text/html']:
file.open()
text = file.read()
if encoding:
try:
text = text.decode(encoding)
except LookupError as e:
return (
f"Failure trying to save '{filename}': "
f"Could not identify the file encoding, got '{str(e)[:120]}'. "
f"Hint: Try to upload as UTF-8."
)
else:
try:
text = smart_text(text)
except UnicodeDecodeError as e:
return "Failure trying to save '%s'. Hint: Try to upload as UTF-8: %s..." % (filename, str(e)[:120])
# Whole file sanitization; add back what's missing from a complete
# document (sanitize will remove these).
clean = sanitize_document(text)
destination.write(clean.encode('utf8'))
if request and clean != text:
messages.warning(request,
(
f"Uploaded html content is sanitized to prevent unsafe content. "
f"Your upload {filename} was changed by the sanitization; "
f"please check the resulting content. "
))
else:
if hasattr(file, 'chunks'):
for chunk in file.chunks():
destination.write(chunk)
else:
destination.write(file.read())
# unzip zipfile
if is_zipfile:
subprocess.call(['unzip', filename], cwd=path)
return None
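The agenda/minutes branch of `handle_upload_file` enforces a single file instance per document by globbing for the filename stem and unlinking any match before writing. A minimal standalone sketch of that delete-then-write rule (helper name and layout here are illustrative, not the datatracker's API):

```python
import tempfile
from pathlib import Path

def replace_single_instance(path: Path, filename: str, data: bytes):
    # Any existing file with the same stem is removed before the new
    # upload is written, so only one instance ever exists on disk.
    stem = Path(filename).stem
    for f in path.glob(f"{stem}.*"):
        try:
            f.unlink()
        except FileNotFoundError:
            pass  # already gone, as in the helper above
    (path / filename).write_bytes(data)

with tempfile.TemporaryDirectory() as tmp:
    d = Path(tmp)
    replace_single_instance(d, "minutes-1.txt", b"old")
    replace_single_instance(d, "minutes-1.md", b"new")  # same stem, new extension
    remaining = sorted(p.name for p in d.iterdir())

print(remaining)  # ['minutes-1.md']
```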

View file

@@ -14,7 +14,6 @@ import pytz
import re
import tarfile
import tempfile
import markdown
from calendar import timegm
from collections import OrderedDict, Counter, deque, defaultdict
@@ -26,7 +25,7 @@ from django import forms
from django.shortcuts import render, redirect, get_object_or_404
from django.http import (HttpResponse, HttpResponseRedirect, HttpResponseForbidden,
HttpResponseNotFound, Http404, HttpResponseBadRequest,
JsonResponse)
JsonResponse, HttpResponseGone)
from django.conf import settings
from django.contrib import messages
from django.contrib.auth.decorators import login_required
@@ -57,7 +56,7 @@ from ietf.ietfauth.utils import role_required, has_role, user_is_person
from ietf.mailtrigger.utils import gather_address_lists
from ietf.meeting.models import Meeting, Session, Schedule, FloorPlan, SessionPresentation, TimeSlot, SlideSubmission
from ietf.meeting.models import SessionStatusName, SchedulingEvent, SchedTimeSessAssignment, Room, TimeSlotTypeName
from ietf.meeting.forms import ( CustomDurationField, SwapDaysForm, SwapTimeslotsForm,
from ietf.meeting.forms import ( CustomDurationField, SwapDaysForm, SwapTimeslotsForm, ImportMinutesForm,
TimeSlotCreateForm, TimeSlotEditForm, SessionEditForm )
from ietf.meeting.helpers import get_person_by_email, get_schedule_by_name
from ietf.meeting.helpers import get_meeting, get_ietf_meeting, get_current_ietf_meeting_num
@@ -72,23 +71,24 @@ from ietf.meeting.helpers import sessions_post_save, is_interim_meeting_approved
from ietf.meeting.helpers import send_interim_meeting_cancellation_notice, send_interim_session_cancellation_notice
from ietf.meeting.helpers import send_interim_approval
from ietf.meeting.helpers import send_interim_approval_request
from ietf.meeting.helpers import send_interim_announcement_request
from ietf.meeting.helpers import send_interim_announcement_request, sessions_post_cancel
from ietf.meeting.utils import finalize, sort_accept_tuple, condition_slide_order
from ietf.meeting.utils import add_event_info_to_session_qs
from ietf.meeting.utils import session_time_for_sorting
from ietf.meeting.utils import session_requested_by
from ietf.meeting.utils import current_session_status
from ietf.meeting.utils import data_for_meetings_overview
from ietf.meeting.utils import session_requested_by, SaveMaterialsError
from ietf.meeting.utils import current_session_status, get_meeting_sessions, SessionNotScheduledError
from ietf.meeting.utils import data_for_meetings_overview, handle_upload_file, save_session_minutes_revision
from ietf.meeting.utils import preprocess_constraints_for_meeting_schedule_editor
from ietf.meeting.utils import diff_meeting_schedules, prefetch_schedule_diff_objects
from ietf.meeting.utils import swap_meeting_schedule_timeslot_assignments, bulk_create_timeslots
from ietf.meeting.utils import preprocess_meeting_important_dates
from ietf.message.utils import infer_message
from ietf.name.models import SlideSubmissionStatusName, ProceedingsMaterialTypeName, SessionPurposeName
from ietf.secr.proceedings.utils import handle_upload_file
from ietf.secr.proceedings.proc_utils import (get_progress_stats, post_process, import_audio_files,
create_recording)
from ietf.utils import markdown
from ietf.utils.decorators import require_api_key
from ietf.utils.hedgedoc import Note, NoteError
from ietf.utils.history import find_history_replacements_active_at
from ietf.utils.log import assertion
from ietf.utils.mail import send_mail_message, send_mail_text
@@ -99,7 +99,8 @@ from ietf.utils.response import permission_denied
from ietf.utils.text import xslugify
from .forms import (InterimMeetingModelForm, InterimAnnounceForm, InterimSessionModelForm,
InterimCancelForm, InterimSessionInlineFormSet, FileUploadForm, RequestMinutesForm,)
InterimCancelForm, InterimSessionInlineFormSet, RequestMinutesForm,
UploadAgendaForm, UploadBlueSheetForm, UploadMinutesForm, UploadSlidesForm)
def get_interim_menu_entries(request):
@@ -258,7 +259,7 @@ def materials_document(request, document, num=None, ext=None):
content_type = content_type.replace('plain', 'markdown', 1)
break;
elif atype[0] == 'text/html':
bytes = "<html>\n<head></head>\n<body>\n%s\n</body>\n</html>\n" % markdown.markdown(bytes.decode(),extensions=['extra'])
bytes = "<html>\n<head></head>\n<body>\n%s\n</body>\n</html>\n" % markdown.markdown(bytes.decode())
content_type = content_type.replace('plain', 'html', 1)
break;
elif atype[0] == 'text/plain':
@@ -505,8 +506,8 @@ def edit_meeting_schedule(request, num=None, owner=None, name=None):
min_duration = min(t.duration for t in timeslots_qs)
max_duration = max(t.duration for t in timeslots_qs)
else:
min_duration = 1
max_duration = 2
min_duration = datetime.timedelta(minutes=30)
max_duration = datetime.timedelta(minutes=120)
def timedelta_to_css_ems(timedelta):
# we scale the session and slots a bit according to their
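The new `timedelta` defaults above feed `timedelta_to_css_ems`, whose body this hunk elides. A hypothetical sketch of such a scaling helper, where every bound and em value is an assumption for illustration rather than the datatracker's actual numbers:

```python
from datetime import timedelta

# Hypothetical sketch only: clamp a slot duration into an assumed range and
# map it linearly onto an assumed em range for CSS sizing.
def timedelta_to_css_ems_sketch(d, min_d=timedelta(minutes=30),
                                max_d=timedelta(minutes=120),
                                min_ems=8.0, max_ems=16.0):
    d = min(max(d, min_d), max_d)          # clamp to [min_d, max_d]
    frac = (d - min_d) / (max_d - min_d)   # 0.0 .. 1.0
    return min_ems + frac * (max_ems - min_ems)
```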
@@ -535,15 +536,17 @@ def edit_meeting_schedule(request, num=None, owner=None, name=None):
for s in sessions:
s.requested_by_person = requested_by_lookup.get(s.requested_by)
s.scheduling_label = "???"
s.purpose_label = None
if (s.purpose.slug in ('none', 'regular')) and s.group:
s.scheduling_label = s.group.acronym
s.purpose_label = 'BoF' if s.group.is_bof() else s.group.type.name
if s.group:
if (s.purpose.slug in ('none', 'regular')):
s.scheduling_label = s.group.acronym
s.purpose_label = 'BoF' if s.group.is_bof() else s.group.type.name
else:
s.scheduling_label = s.name if s.name else f'??? [{s.group.acronym}]'
s.purpose_label = s.purpose.name
else:
s.scheduling_label = s.name if s.name else '???'
s.purpose_label = s.purpose.name
if s.name:
s.scheduling_label = s.name
s.requested_duration_in_hours = round(s.requested_duration.seconds / 60.0 / 60.0, 1)
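An aside on the conversion above: for sub-day durations `timedelta.seconds` and `total_seconds()` agree, and the general form also handles the `.days` component. A minimal sketch of the same conversion:

```python
from datetime import timedelta

# For durations under one day, timedelta.seconds matches total_seconds();
# total_seconds() is the safer general form since it includes the days part.
def duration_in_hours(d):
    return round(d.total_seconds() / 60.0 / 60.0, 1)
```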
@@ -2202,16 +2205,13 @@ def meeting_requests(request, num=None):
{"meeting": meeting, "sessions":sessions,
"groups_not_meeting": groups_not_meeting})
def get_sessions(num, acronym):
meeting = get_meeting(num=num,type_in=None)
sessions = Session.objects.filter(meeting=meeting,group__acronym=acronym,type__in=['regular','plenary','other'])
return sorted(
get_meeting_sessions(num, acronym).with_current_status(),
key=lambda s: session_time_for_sorting(s, use_meeting_date=False)
)
if not sessions:
sessions = Session.objects.filter(meeting=meeting,short=acronym,type__in=['regular','plenary','other'])
sessions = sessions.with_current_status()
return sorted(sessions, key=lambda s: session_time_for_sorting(s, use_meeting_date=False))
def session_details(request, num, acronym):
meeting = get_meeting(num=num,type_in=None)
@@ -2343,13 +2343,6 @@ def add_session_drafts(request, session_id, num):
})
class UploadBlueSheetForm(FileUploadForm):
def __init__(self, *args, **kwargs):
kwargs['doc_type'] = 'bluesheets'
super(UploadBlueSheetForm, self).__init__(*args, **kwargs )
def upload_session_bluesheets(request, session_id, num):
# num is redundant, but we're dragging it along as an artifact of where we are in the current URL structure
session = get_object_or_404(Session,pk=session_id)
@@ -2375,13 +2368,14 @@ def upload_session_bluesheets(request, session_id, num):
ota = session.official_timeslotassignment()
sess_time = ota and ota.timeslot.time
if not sess_time:
return HttpResponse("Cannot receive uploads for an unscheduled session. Please check the session ID.", status=410, content_type="text/plain")
return HttpResponseGone("Cannot receive uploads for an unscheduled session. Please check the session ID.", content_type="text/plain")
save_error = save_bluesheet(request, session, file, encoding=form.file_encoding[file.name])
if save_error:
form.add_error(None, save_error)
else:
messages.success(request, 'Successfully uploaded bluesheets.')
return redirect('ietf.meeting.views.session_details',num=num,acronym=session.group.acronym)
else:
form = UploadBlueSheetForm()
@@ -2437,15 +2431,6 @@ def save_bluesheet(request, session, file, encoding='utf-8'):
doc.save_with_history([e])
return save_error
class UploadMinutesForm(FileUploadForm):
apply_to_all = forms.BooleanField(label='Apply to all group sessions at this meeting',initial=True,required=False)
def __init__(self, show_apply_to_all_checkbox, *args, **kwargs):
kwargs['doc_type'] = 'minutes'
super(UploadMinutesForm, self).__init__(*args, **kwargs )
if not show_apply_to_all_checkbox:
self.fields.pop('apply_to_all')
def upload_session_minutes(request, session_id, num):
# num is redundant, but we're dragging it along as an artifact of where we are in the current URL structure
@@ -2472,62 +2457,29 @@ def upload_session_minutes(request, session_id, num):
apply_to_all = session.type_id == 'regular'
if show_apply_to_all_checkbox:
apply_to_all = form.cleaned_data['apply_to_all']
if minutes_sp:
doc = minutes_sp.document
doc.rev = '%02d' % (int(doc.rev)+1)
minutes_sp.rev = doc.rev
minutes_sp.save()
# Set up the new revision
try:
save_session_minutes_revision(
session=session,
apply_to_all=apply_to_all,
file=file,
ext=ext,
encoding=form.file_encoding[file.name],
request=request,
)
except SessionNotScheduledError:
return HttpResponseGone(
"Cannot receive uploads for an unscheduled session. Please check the session ID.",
content_type="text/plain",
)
except SaveMaterialsError as err:
form.add_error(None, str(err))
else:
ota = session.official_timeslotassignment()
sess_time = ota and ota.timeslot.time
if not sess_time:
return HttpResponse("Cannot receive uploads for an unscheduled session. Please check the session ID.", status=410, content_type="text/plain")
if session.meeting.type_id=='ietf':
name = 'minutes-%s-%s' % (session.meeting.number,
session.group.acronym)
title = 'Minutes IETF%s: %s' % (session.meeting.number,
session.group.acronym)
if not apply_to_all:
name += '-%s' % (sess_time.strftime("%Y%m%d%H%M"),)
title += ': %s' % (sess_time.strftime("%a %H:%M"),)
else:
name = 'minutes-%s-%s' % (session.meeting.number, sess_time.strftime("%Y%m%d%H%M"))
title = 'Minutes %s: %s' % (session.meeting.number, sess_time.strftime("%a %H:%M"))
if Document.objects.filter(name=name).exists():
doc = Document.objects.get(name=name)
doc.rev = '%02d' % (int(doc.rev)+1)
else:
doc = Document.objects.create(
name = name,
type_id = 'minutes',
title = title,
group = session.group,
rev = '00',
)
DocAlias.objects.create(name=doc.name).docs.add(doc)
doc.states.add(State.objects.get(type_id='minutes',slug='active'))
if session.sessionpresentation_set.filter(document=doc).exists():
sp = session.sessionpresentation_set.get(document=doc)
sp.rev = doc.rev
sp.save()
else:
session.sessionpresentation_set.create(document=doc,rev=doc.rev)
if apply_to_all:
for other_session in sessions:
if other_session != session:
other_session.sessionpresentation_set.filter(document__type='minutes').delete()
other_session.sessionpresentation_set.create(document=doc,rev=doc.rev)
filename = '%s-%s%s'% ( doc.name, doc.rev, ext)
doc.uploaded_filename = filename
e = NewRevisionDocEvent.objects.create(doc=doc, by=request.user.person, type='new_revision', desc='New revision available: %s'%doc.rev, rev=doc.rev)
# The way this function builds the filename, it will never trigger the file delete in handle_file_upload.
save_error = handle_upload_file(file, filename, session.meeting, 'minutes', request=request, encoding=form.file_encoding[file.name])
if save_error:
form.add_error(None, save_error)
else:
doc.save_with_history([e])
return redirect('ietf.meeting.views.session_details',num=num,acronym=session.group.acronym)
else:
# no exception -- success!
messages.success(request, f'Successfully uploaded minutes as revision {session.minutes().rev}.')
return redirect('ietf.meeting.views.session_details', num=num, acronym=session.group.acronym)
else:
form = UploadMinutesForm(show_apply_to_all_checkbox)
return render(request, "meeting/upload_session_minutes.html",
@@ -2538,15 +2490,6 @@ def upload_session_minutes(request, session_id, num):
})
class UploadAgendaForm(FileUploadForm):
apply_to_all = forms.BooleanField(label='Apply to all group sessions at this meeting',initial=True,required=False)
def __init__(self, show_apply_to_all_checkbox, *args, **kwargs):
kwargs['doc_type'] = 'agenda'
super(UploadAgendaForm, self).__init__(*args, **kwargs )
if not show_apply_to_all_checkbox:
self.fields.pop('apply_to_all')
def upload_session_agenda(request, session_id, num):
# num is redundant, but we're dragging it along as an artifact of where we are in the current URL structure
session = get_object_or_404(Session,pk=session_id)
@@ -2558,7 +2501,7 @@ def upload_session_agenda(request, session_id, num):
session_number = None
sessions = get_sessions(session.meeting.number,session.group.acronym)
show_apply_to_all_checkbox = len(sessions) > 1 if session.type_id == 'regular' else False
show_apply_to_all_checkbox = len(sessions) > 1 if session.type.slug == 'regular' else False
if len(sessions) > 1:
session_number = 1 + sessions.index(session)
@@ -2569,7 +2512,7 @@ def upload_session_agenda(request, session_id, num):
if form.is_valid():
file = request.FILES['file']
_, ext = os.path.splitext(file.name)
apply_to_all = session.type_id == 'regular'
apply_to_all = session.type.slug == 'regular'
if show_apply_to_all_checkbox:
apply_to_all = form.cleaned_data['apply_to_all']
if agenda_sp:
@@ -2581,7 +2524,7 @@ def upload_session_agenda(request, session_id, num):
ota = session.official_timeslotassignment()
sess_time = ota and ota.timeslot.time
if not sess_time:
return HttpResponse("Cannot receive uploads for an unscheduled session. Please check the session ID.", status=410, content_type="text/plain")
return HttpResponseGone("Cannot receive uploads for an unscheduled session. Please check the session ID.", content_type="text/plain")
if session.meeting.type_id=='ietf':
name = 'agenda-%s-%s' % (session.meeting.number,
session.group.acronym)
@@ -2629,6 +2572,7 @@ def upload_session_agenda(request, session_id, num):
form.add_error(None, save_error)
else:
doc.save_with_history([e])
messages.success(request, f'Successfully uploaded agenda as revision {doc.rev}.')
return redirect('ietf.meeting.views.session_details',num=num,acronym=session.group.acronym)
else:
form = UploadAgendaForm(show_apply_to_all_checkbox, initial={'apply_to_all':session.type_id=='regular'})
@@ -2641,27 +2585,6 @@ def upload_session_agenda(request, session_id, num):
})
class UploadSlidesForm(FileUploadForm):
title = forms.CharField(max_length=255)
apply_to_all = forms.BooleanField(label='Apply to all group sessions at this meeting',initial=False,required=False)
def __init__(self, session, show_apply_to_all_checkbox, *args, **kwargs):
self.session = session
kwargs['doc_type'] = 'slides'
super(UploadSlidesForm, self).__init__(*args, **kwargs )
if not show_apply_to_all_checkbox:
self.fields.pop('apply_to_all')
def clean_title(self):
title = self.cleaned_data['title']
# The current tables only handle the Unicode BMP:
if ord(max(title)) > 0xffff:
raise forms.ValidationError("The title contains characters outside the Unicode BMP, which is not currently supported")
if self.session.meeting.type_id=='interim':
if re.search(r'-\d{2}$', title):
raise forms.ValidationError("Interim slides currently may not have a title that ends with something that looks like a revision number (-nn)")
return title
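The `clean_title` check removed in this hunk relies on `max()` returning the character with the highest code point, so a single comparison covers the whole string. An equivalent explicit form, shown only for illustration:

```python
# Equivalent to `ord(max(title)) > 0xffff` for non-empty strings: True when
# any character falls outside the Basic Multilingual Plane (above U+FFFF).
def outside_bmp(title):
    return any(ord(ch) > 0xFFFF for ch in title)
```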
def upload_session_slides(request, session_id, num, name):
# num is redundant, but we're dragging it along as an artifact of where we are in the current URL structure
session = get_object_or_404(Session,pk=session_id)
@@ -2745,6 +2668,9 @@ def upload_session_slides(request, session_id, num, name):
else:
doc.save_with_history([e])
post_process(doc)
messages.success(
request,
f'Successfully uploaded slides as revision {doc.rev} of {doc.name}.')
return redirect('ietf.meeting.views.session_details',num=num,acronym=session.group.acronym)
else:
initial = {}
@@ -2811,6 +2737,7 @@ def propose_session_slides(request, session_id, num):
msg.by = request.user.person
msg.save()
send_mail_message(request, msg)
messages.success(request, 'Successfully submitted proposed slides.')
return redirect('ietf.meeting.views.session_details',num=num,acronym=session.group.acronym)
else:
initial = {}
@@ -2834,6 +2761,7 @@ def remove_sessionpresentation(request, session_id, num, name):
c = DocEvent(type="added_comment", doc=sp.document, rev=sp.document.rev, by=request.user.person)
c.desc = "Removed from session: %s" % (session)
c.save()
messages.success(request, f'Successfully removed {name}.')
return redirect('ietf.meeting.views.session_details', num=session.meeting.number, acronym=session.group.acronym)
return render(request,'meeting/remove_sessionpresentation.html', {'sp': sp })
@@ -3258,7 +3186,9 @@ def interim_request_cancel(request, number):
was_scheduled = session_status.slug == 'sched'
result_status = SessionStatusName.objects.get(slug='canceled' if was_scheduled else 'canceledpa')
for session in meeting.session_set.not_canceled():
sessions_to_cancel = meeting.session_set.not_canceled()
for session in sessions_to_cancel:
SchedulingEvent.objects.create(
session=session,
status=result_status,
@@ -3268,6 +3198,8 @@ def interim_request_cancel(request, number):
if was_scheduled:
send_interim_meeting_cancellation_notice(meeting)
sessions_post_cancel(request, sessions_to_cancel)
messages.success(request, 'Interim meeting cancelled')
return redirect(upcoming)
else:
@@ -3315,6 +3247,8 @@ def interim_request_session_cancel(request, sessionid):
if was_scheduled:
send_interim_session_cancellation_notice(session)
sessions_post_cancel(request, [session])
messages.success(request, 'Interim meeting session cancelled')
return redirect(interim_request_details, number=session.meeting.number)
else:
@@ -4165,4 +4099,88 @@ def approve_proposed_slides(request, slidesubmission_id, num):
'session_number': session_number,
'existing_doc' : existing_doc,
'form': form,
})
def import_session_minutes(request, session_id, num):
"""Import session minutes from the ietf.notes.org site
A GET pulls in the markdown for a session's notes using the HedgeDoc API. An HTML preview of how
the datatracker will render the result is sent back. The confirmation form presented to the user
contains a hidden copy of the markdown source that will be submitted back if approved.
A POST accepts the hidden source and creates a new revision of the notes. This step does *not*
retrieve the note from the HedgeDoc API again - it posts the hidden source from the form. Any
changes to the HedgeDoc site after the preview was retrieved will be ignored. We could also pull
the source again and re-display the updated preview with an explanatory message, but there will
always be a race condition. Rather than add that complication, we assume that the user previewing
the imported minutes will be aware of anyone else changing the notes and coordinate with them.
A consequence is that the user could tamper with the hidden form and it would be accepted. This is
ok, though, because they could more simply upload whatever they want through the upload form with
the same effect, so no new exploit is introduced.
"""
session = get_object_or_404(Session, pk=session_id)
note = Note(session.notes_id())
if not session.can_manage_materials(request.user):
permission_denied(request, "You don't have permission to import minutes for this session.")
if session.is_material_submission_cutoff() and not has_role(request.user, "Secretariat"):
permission_denied(request, "The materials cutoff for this session has passed. Contact the secretariat for further action.")
if request.method == 'POST':
form = ImportMinutesForm(request.POST)
if not form.is_valid():
import_contents = form.data['markdown_text']
else:
import_contents = form.cleaned_data['markdown_text']
try:
save_session_minutes_revision(
session=session,
file=io.BytesIO(import_contents.encode('utf8')),
ext='.md',
request=request,
)
except SessionNotScheduledError:
return HttpResponseGone(
"Cannot import minutes for an unscheduled session. Please check the session ID.",
content_type="text/plain",
)
except SaveMaterialsError as err:
form.add_error(None, str(err))
else:
messages.success(request, f'Successfully imported minutes as revision {session.minutes().rev}.')
return redirect('ietf.meeting.views.session_details', num=num, acronym=session.group.acronym)
else:
try:
import_contents = note.get_source()
except NoteError as err:
messages.error(request, f'Could not import notes with id {note.id}: {err}.')
return redirect('ietf.meeting.views.session_details', num=num, acronym=session.group.acronym)
form = ImportMinutesForm(initial={'markdown_text': import_contents})
# Try to prevent pointless revision creation. Note that we do not block replacing
# a document with an identical copy in the validation above. We cannot entirely
# avoid a race condition and the likelihood/amount of damage is very low so no
# need to complicate things further.
current_minutes = session.minutes()
contents_changed = True
if current_minutes:
try:
with open(current_minutes.get_file_name()) as f:
if import_contents == Note.preprocess_source(f.read()):
contents_changed = False
messages.warning(request, 'This document is identical to the current revision, no need to import.')
except FileNotFoundError:
pass # allow import if the file is missing
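The guard above can be sketched standalone; `normalize()` here is an assumed stand-in for whatever normalization `Note.preprocess_source` actually applies:

```python
# Skip creating a revision when the imported text matches what is already on
# disk; a missing current file (the FileNotFoundError case) always imports.
def normalize(text):
    return text.replace('\r\n', '\n').strip()

def needs_new_revision(imported, current):
    if current is None:
        return True
    return normalize(imported) != normalize(current)
```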
return render(
request,
'meeting/import_minutes.html',
{
'form': form,
'note': note,
'session': session,
'contents_changed': contents_changed,
},
)

View file

@@ -14,10 +14,12 @@ from ietf.meeting.forms import FileUploadForm
from ietf.meeting.models import Meeting, MeetingHost
from ietf.meeting.helpers import get_meeting
from ietf.name.models import ProceedingsMaterialTypeName
from ietf.secr.proceedings.utils import handle_upload_file
from ietf.meeting.utils import handle_upload_file
from ietf.utils.text import xslugify
class UploadProceedingsMaterialForm(FileUploadForm):
doc_type = 'procmaterials'
use_url = forms.BooleanField(
required=False,
label='Use an external URL instead of uploading a document',
@@ -34,7 +36,7 @@ class UploadProceedingsMaterialForm(FileUploadForm):
)
def __init__(self, *args, **kwargs):
super().__init__(doc_type='procmaterials', *args, **kwargs)
super().__init__(*args, **kwargs)
self.fields['file'].label = 'Select a file to upload. Allowed format{}: {}'.format(
'' if len(self.mime_types) == 1 else 's',
', '.join(self.mime_types),

View file

@@ -86,9 +86,23 @@ class MultiplePositionNomineeField(forms.MultipleChoiceField, PositionNomineeFie
return result
class NewEditMembersForm(forms.Form):
class EditMembersForm(forms.Form):
members = SearchableEmailsField(only_users=True, all_emails=True, required=False)
liaisons = SearchableEmailsField(only_users=True, all_emails=True, required=False)
def __init__(self, nomcom, *args, **kwargs):
initial = kwargs.setdefault('initial', {})
roles = nomcom.group.role_set.filter(
name__slug__in=('member', 'liaison')
).order_by('email__person__name').select_related('email')
initial['members'] = [
r.email for r in roles if r.name.slug == 'member'
]
initial['liaisons'] = [
r.email for r in roles if r.name.slug == 'liaison'
]
super().__init__(*args, **kwargs)
members = SearchableEmailsField(only_users=True,all_emails=True)
class EditNomcomForm(forms.ModelForm):

View file

@@ -42,10 +42,8 @@ class Command(EmailOnFailureCommand):
except NomCom.DoesNotExist:
raise CommandError("NomCom %s does not exist or it isn't active" % year)
if not email:
self.msg = io.open(sys.stdin.fileno(), 'rb').read()
else:
self.msg = io.open(email, "rb").read()
binary_input = io.open(email, 'rb') if email else sys.stdin.buffer
self.msg = binary_input.read()
try:
feedback = create_feedback_email(self.nomcom, self.msg)
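The file-or-stdin read introduced in this hunk, as a standalone sketch:

```python
import sys

# Read raw bytes from a file if a path is given, else from stdin. Binary
# mode matters here: feedback mail may contain bytes that are not valid UTF-8.
def read_message(path=None):
    stream = open(path, 'rb') if path else sys.stdin.buffer
    return stream.read()
```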

View file

@@ -2,6 +2,7 @@
# -*- coding: utf-8 -*-
"""Tests of nomcom management commands"""
import mock
import sys
from collections import namedtuple
@@ -83,3 +84,30 @@ class FeedbackEmailTests(TestCase):
(self.nomcom, b'feedback message'),
'feedback_email should process the correct email for the correct nomcom'
)
@mock.patch('ietf.utils.management.base.send_smtp')
def test_invalid_character_encodings(self, send_smtp_mock):
"""The feedback_email command should send a message when file input has invalid encoding"""
# mock an exception in create_feedback_email()
invalid_characters = b'\xfe\xff'
with name_of_file_containing(invalid_characters, mode='wb') as filename:
call_command('feedback_email', nomcom_year=self.year, email_file=filename)
self.assertTrue(send_smtp_mock.called)
(msg,) = send_smtp_mock.call_args.args # get the message to be sent
parts = msg.get_payload()
self.assertEqual(len(parts), 2, 'Error email should contain message and original message')
@mock.patch.object(sys.stdin.buffer, 'read')
@mock.patch('ietf.utils.management.base.send_smtp')
def test_invalid_character_encodings_via_stdin(self, send_smtp_mock, stdin_read_mock):
"""The feedback_email command should send a message when stdin input has invalid encoding"""
# mock an exception in create_feedback_email()
invalid_characters = b'\xfe\xff'
stdin_read_mock.return_value = invalid_characters
call_command('feedback_email', nomcom_year=self.year)
self.assertTrue(send_smtp_mock.called)
(msg,) = send_smtp_mock.call_args.args # get the message to be sent
parts = msg.get_payload()
self.assertEqual(len(parts), 2, 'Error email should contain message and original message')

View file

@@ -406,9 +406,14 @@ class NomcomViewsTest(TestCase):
self.client.logout()
def change_members(self, members):
members_emails = ['%s%s' % (member, EMAIL_DOMAIN) for member in members]
test_data = {'members': members_emails,}
def change_members(self, members=None, liaisons=None):
test_data = {}
if members is not None:
members_emails = ['%s%s' % (member, EMAIL_DOMAIN) for member in members]
test_data['members'] = members_emails
if liaisons is not None:
liaisons_emails = ['%s%s' % (liaison, EMAIL_DOMAIN) for liaison in liaisons]
test_data['liaisons'] = liaisons_emails
self.client.post(self.edit_members_url, test_data)
def test_edit_members_view(self):
@@ -430,6 +435,54 @@ class NomcomViewsTest(TestCase):
self.check_url_status(self.private_index_url, 403)
self.client.logout()
def test_edit_members_only_removes_member_roles(self):
"""Removing a member or liaison should not affect other roles"""
# log in and set up members/liaisons lists
self.access_chair_url(self.edit_members_url)
self.change_members(
members=[CHAIR_USER, COMMUNITY_USER],
liaisons=[CHAIR_USER, COMMUNITY_USER],
)
nomcom_group = Group.objects.get(acronym=f'nomcom{self.year}')
self.assertCountEqual(
nomcom_group.role_set.filter(name='member').values_list('email__address', flat=True),
[CHAIR_USER + EMAIL_DOMAIN, COMMUNITY_USER + EMAIL_DOMAIN],
)
self.assertCountEqual(
nomcom_group.role_set.filter(name='liaison').values_list('email__address', flat=True),
[CHAIR_USER + EMAIL_DOMAIN, COMMUNITY_USER + EMAIL_DOMAIN],
)
# remove a member who is also a liaison and check that the liaisons list is unchanged
self.change_members(
members=[COMMUNITY_USER],
liaisons=[CHAIR_USER, COMMUNITY_USER],
)
nomcom_group = Group.objects.get(pk=nomcom_group.pk) # refresh from db
self.assertCountEqual(
nomcom_group.role_set.filter(name='member').values_list('email__address', flat=True),
[COMMUNITY_USER + EMAIL_DOMAIN],
)
self.assertCountEqual(
nomcom_group.role_set.filter(name='liaison').values_list('email__address', flat=True),
[CHAIR_USER + EMAIL_DOMAIN, COMMUNITY_USER + EMAIL_DOMAIN],
)
# remove a liaison who is also a member and check that the members list is unchanged
self.change_members(
members=[COMMUNITY_USER],
liaisons=[CHAIR_USER],
)
nomcom_group = Group.objects.get(pk=nomcom_group.pk) # refresh from db
self.assertCountEqual(
nomcom_group.role_set.filter(name='member').values_list('email__address', flat=True),
[COMMUNITY_USER + EMAIL_DOMAIN],
)
self.assertCountEqual(
nomcom_group.role_set.filter(name='liaison').values_list('email__address', flat=True),
[CHAIR_USER + EMAIL_DOMAIN],
)
def test_edit_nomcom_view(self):
r = self.access_chair_url(self.edit_nomcom_url)
q = PyQuery(r.content)

View file

@@ -22,7 +22,8 @@ from django.utils.encoding import force_bytes, force_text
from ietf.dbtemplate.models import DBTemplate
from ietf.dbtemplate.views import group_template_edit, group_template_show
from ietf.name.models import NomineePositionStateName, FeedbackTypeName
from ietf.group.models import Group, GroupEvent, Role
from ietf.group.utils import update_role_set
from ietf.message.models import Message
from ietf.nomcom.decorators import nomcom_private_key_required
@@ -31,7 +32,7 @@ from ietf.nomcom.forms import (NominateForm, NominateNewPersonForm, FeedbackForm
PrivateKeyForm, EditNomcomForm, EditNomineeForm,
PendingFeedbackForm, ReminderDatesForm, FullFeedbackFormSet,
FeedbackEmailForm, NominationResponseCommentForm, TopicForm,
NewEditMembersForm, VolunteerForm, )
EditMembersForm, VolunteerForm, )
from ietf.nomcom.models import (Position, NomineePosition, Nominee, Feedback, NomCom, ReminderDates,
FeedbackLastSeen, Topic, TopicFeedbackLastSeen, )
from ietf.nomcom.utils import (get_nomcom_by_year, store_nomcom_private_key, suggest_affiliation,
@@ -1230,18 +1231,14 @@ def edit_members(request, year):
if nomcom.group.state_id=='conclude':
permission_denied(request, 'This nomcom is closed.')
old_members_email = [r.email for r in nomcom.group.role_set.filter(name='member')]
if request.method=='POST':
form = NewEditMembersForm(data=request.POST)
form = EditMembersForm(nomcom, data=request.POST)
if form.is_valid():
new_members_email = form.cleaned_data['members']
nomcom.group.role_set.filter( email__in=set(old_members_email)-set(new_members_email) ).delete()
for email in set(new_members_email)-set(old_members_email):
nomcom.group.role_set.create(email=email,person=email.person,name_id='member')
update_role_set(nomcom.group, 'member', form.cleaned_data['members'], request.user.person)
update_role_set(nomcom.group, 'liaison', form.cleaned_data['liaisons'], request.user.person)
return HttpResponseRedirect(reverse('ietf.nomcom.views.private_index',kwargs={'year':year}))
else:
form = NewEditMembersForm(initial={ 'members' : old_members_email })
form = EditMembersForm(nomcom)
return render(request, 'nomcom/new_edit_members.html',
{'nomcom' : nomcom,
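`update_role_set` (imported from `ietf.group.utils`) replaces the hand-rolled set arithmetic removed above. Its exact signature lives in that module; the core idea, sketched with illustrative names:

```python
# Set arithmetic behind replacing one role's membership: delete roles whose
# email is no longer listed, create newly listed ones, and leave overlapping
# entries (and every other role slug) untouched.
def membership_changes(current, desired):
    current, desired = set(current), set(desired)
    return desired - current, current - desired   # (to_add, to_remove)
```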

View file

@@ -0,0 +1,18 @@
# Generated by Django 2.2.25 on 2021-12-10 08:05
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('person', '0020_auto_20210920_0924'),
]
operations = [
migrations.AlterField(
model_name='personalapikey',
name='endpoint',
field=models.CharField(choices=[('/api/appauth/authortools', '/api/appauth/authortools'), ('/api/appauth/bibxml', '/api/appauth/bibxml'), ('/api/iesg/position', '/api/iesg/position'), ('/api/meeting/session/video/url', '/api/meeting/session/video/url'), ('/api/notify/meeting/bluesheet', '/api/notify/meeting/bluesheet'), ('/api/notify/meeting/registration', '/api/notify/meeting/registration'), ('/api/v2/person/person', '/api/v2/person/person')], max_length=128),
),
]

View file

@@ -357,6 +357,7 @@ PERSON_API_KEY_VALUES = [
("/api/notify/meeting/registration", "/api/notify/meeting/registration", "Robot"),
("/api/notify/meeting/bluesheet", "/api/notify/meeting/bluesheet", "Recording Manager"),
("/api/appauth/authortools", "/api/appauth/authortools", None),
("/api/appauth/bibxml", "/api/appauth/bibxml", None),
]
PERSON_API_KEY_ENDPOINTS = sorted(list(set([ (v, n) for (v, n, r) in PERSON_API_KEY_VALUES ])))

View file

@@ -69,7 +69,7 @@ api.person.register(AliasResource())
class PersonalApiKeyResource(ModelResource):
person = ToOneField(PersonResource, 'person')
class Meta:
queryset = PersonalApiKey.objects.all()
queryset = PersonalApiKey.objects.none()
serializer = api.Serializer()
cache = SimpleCache()
excludes = ['salt', ]

View file

@@ -1,10 +1,17 @@
# Copyright The IETF Trust 2019-2020, All Rights Reserved
# -*- coding: utf-8 -*-
import datetime
from ietf.review.factories import ReviewAssignmentFactory, ReviewRequestFactory
from ietf.group.factories import RoleFactory
from ietf.utils.mail import empty_outbox, get_payload_text, outbox
from ietf.utils.test_utils import TestCase, reload_db_objects
from .factories import ReviewAssignmentFactory, ReviewRequestFactory, ReviewerSettingsFactory
from .mailarch import hash_list_message_id
from .models import ReviewerSettings, ReviewSecretarySettings, ReviewTeamSettings, UnavailablePeriod
from .utils import (email_secretary_reminder, review_assignments_needing_secretary_reminder,
email_reviewer_reminder, review_assignments_needing_reviewer_reminder,
send_reminder_unconfirmed_assignments, send_review_reminder_overdue_assignment,
send_reminder_all_open_reviews, send_unavailability_period_ending_reminder)
class HashTest(TestCase):
@@ -63,3 +70,434 @@ class ReviewAssignmentTest(TestCase):
assignment.save()
review_req = reload_db_objects(review_req)
self.assertEqual(review_req.state_id, 'withdrawn')
class ReviewAssignmentReminderTests(TestCase):
today = datetime.date.today()
deadline = today + datetime.timedelta(days=6)
def setUp(self):
super().setUp()
self.review_req = ReviewRequestFactory(
state_id='assigned',
deadline=self.deadline,
)
self.team = self.review_req.team
self.reviewer = RoleFactory(
name_id='reviewer',
group=self.team,
person__user__username='reviewer',
).person
self.assignment = ReviewAssignmentFactory(
review_request=self.review_req,
state_id='assigned',
assigned_on=self.review_req.time,
reviewer=self.reviewer.email_set.first(),
)
def make_secretary(self, username, remind_days=None):
secretary_role = RoleFactory(
name_id='secr',
group=self.team,
person__user__username=username,
)
ReviewSecretarySettings.objects.create(
team=self.team,
person=secretary_role.person,
remind_days_before_deadline=remind_days,
)
return secretary_role
def make_non_secretary(self, username, remind_days=None):
"""Make a non-secretary role that has a ReviewSecretarySettings
This is a little odd, but might come up if an ex-secretary takes on another role and still
has a ReviewSecretarySettings record.
"""
role = RoleFactory(
name_id='reviewer',
group=self.team,
person__user__username=username,
)
ReviewSecretarySettings.objects.create(
team=self.team,
person=role.person,
remind_days_before_deadline=remind_days,
)
return role
def test_review_assignments_needing_secretary_reminder(self):
"""Notification sent to multiple secretaries"""
# Set up two secretaries with the same remind_days, one with a different value, and one with None.
secretary_roles = [
self.make_secretary(username='reviewsecretary0', remind_days=6),
self.make_secretary(username='reviewsecretary1', remind_days=6),
self.make_secretary(username='reviewsecretary2', remind_days=5),
self.make_secretary(username='reviewsecretary3', remind_days=None), # never notified
]
self.make_non_secretary(username='nonsecretary', remind_days=6) # never notified
# Check from more than remind_days before the deadline all the way through the day before.
# Should only get reminders on the expected days.
self.assertCountEqual(
review_assignments_needing_secretary_reminder(self.deadline - datetime.timedelta(days=7)),
[],
'No reminder needed when deadline is more than remind_days away',
)
self.assertCountEqual(
review_assignments_needing_secretary_reminder(self.deadline - datetime.timedelta(days=6)),
[(self.assignment, secretary_roles[0]), (self.assignment, secretary_roles[1])],
'Reminders needed for all secretaries when deadline is exactly remind_days away',
)
self.assertCountEqual(
review_assignments_needing_secretary_reminder(self.deadline - datetime.timedelta(days=5)),
[(self.assignment, secretary_roles[2])],
'Reminder needed when deadline is exactly remind_days away',
)
for days in range(1, 5):
self.assertCountEqual(
review_assignments_needing_secretary_reminder(self.deadline - datetime.timedelta(days=days)),
[],
f'No reminder needed when deadline is less than remind_days away (tried {days})',
)
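The rule these assertions exercise reduces to a one-day window. A sketch of that rule, not the actual `review_assignments_needing_secretary_reminder` query:

```python
import datetime

# A reminder fires on exactly one day, remind_days before the deadline;
# a remind_days of None disables reminders entirely.
def reminder_due(deadline, today, remind_days):
    if remind_days is None:
        return False
    return (deadline - today).days == remind_days
```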
def test_email_secretary_reminder_emails_secretaries(self):
"""Secretary review assignment reminders are sent to secretaries"""
secretary_role = self.make_secretary(username='reviewsecretary')
# create a couple other roles for the team to check that only the requested secretary is reminded
self.make_secretary(username='ignoredsecretary')
self.make_non_secretary(username='nonsecretary')
empty_outbox()
email_secretary_reminder(self.assignment, secretary_role)
self.assertEqual(len(outbox), 1)
msg = outbox[0]
text = get_payload_text(msg)
self.assertIn(secretary_role.email.address, msg['to'])
self.assertIn(self.review_req.doc.name, msg['subject'])
self.assertIn(self.review_req.doc.name, text)
self.assertIn(self.team.acronym, msg['subject'])
self.assertIn(self.team.acronym, text)
def test_review_assignments_needing_reviewer_reminder(self):
# The method should find assignments whose reviewers need reminders, respecting per-team reviewer settings.
reviewer_settings = ReviewerSettings.objects.create(
team=self.team,
person=self.reviewer,
remind_days_before_deadline=6,
)
# Give this reviewer another team with a review to be sure
# we don't have cross-talk between teams.
second_req = ReviewRequestFactory(state_id='assigned', deadline=self.deadline)
second_team = second_req.team
second_assignment = ReviewAssignmentFactory(
review_request=second_req,
state_id='assigned',
assigned_on=second_req.time,
reviewer=self.reviewer.email(),
)
ReviewerSettingsFactory(
team=second_team,
person=self.reviewer,
remind_days_before_deadline=5,
)
self.assertCountEqual(
review_assignments_needing_reviewer_reminder(self.deadline - datetime.timedelta(days=7)),
[],
'No reminder needed when deadline is more than remind_days away'
)
self.assertCountEqual(
review_assignments_needing_reviewer_reminder(self.deadline - datetime.timedelta(days=6)),
[self.assignment],
'Reminder needed when deadline is exactly remind_days away',
)
self.assertCountEqual(
review_assignments_needing_reviewer_reminder(self.deadline - datetime.timedelta(days=5)),
[second_assignment],
'Reminder needed for other assignment'
)
self.assertCountEqual(
review_assignments_needing_reviewer_reminder(self.deadline - datetime.timedelta(days=4)),
[],
'No reminder needed when deadline is less than remind_days away'
)
# should never send a reminder when disabled
reviewer_settings.remind_days_before_deadline = None
reviewer_settings.save()
second_assignment.delete() # get rid of this one for the second test
# test over a range that includes when we *did* send a reminder above
for days in range(1, 8):
self.assertCountEqual(
review_assignments_needing_reviewer_reminder(self.deadline - datetime.timedelta(days=days)),
[],
f'No reminder should be sent when reminders are disabled (sent for days={days})',
)
def test_email_review_reminder_emails_reviewers(self):
"""Reviewer assignment reminders are sent to the reviewers"""
empty_outbox()
email_reviewer_reminder(self.assignment)
self.assertEqual(len(outbox), 1)
msg = outbox[0]
text = get_payload_text(msg)
self.assertIn(self.reviewer.email_address(), msg['to'])
self.assertIn(self.review_req.doc.name, msg['subject'])
self.assertIn(self.review_req.doc.name, text)
self.assertIn(self.team.acronym, msg['subject'])
def test_send_reminder_unconfirmed_assignments(self):
"""Unconfirmed assignment reminders are sent to reviewer and team secretary"""
assigned_on = self.assignment.assigned_on.date()
secretaries = [
self.make_secretary(username='reviewsecretary0').person,
self.make_secretary(username='reviewsecretary1').person,
]
# assignments that should be ignored (will result in extra emails being sent if not)
ReviewAssignmentFactory(
review_request=self.review_req,
state_id='accepted',
assigned_on=self.review_req.time,
)
ReviewAssignmentFactory(
review_request=self.review_req,
state_id='completed',
assigned_on=self.review_req.time,
)
ReviewAssignmentFactory(
review_request=self.review_req,
state_id='rejected',
assigned_on=self.review_req.time,
)
# Create a second review for a different team to test for cross-talk between teams.
ReviewAssignmentFactory(
state_id='completed', # something that does not need a reminder
reviewer=self.reviewer.email(),
)
# By default, these reminders are disabled for all teams.
ReviewTeamSettings.objects.update(remind_days_unconfirmed_assignments=1)
empty_outbox()
log = send_reminder_unconfirmed_assignments(assigned_on + datetime.timedelta(days=1))
self.assertEqual(len(outbox), 1)
self.assertIn(self.reviewer.email_address(), outbox[0]["To"])
for secretary in secretaries:
self.assertIn(
secretary.email_address(),
outbox[0]["Cc"],
f'Secretary {secretary.user.username} was not copied on the reminder',
)
self.assertEqual(outbox[0]["Subject"], "Reminder: you have not responded to a review assignment")
message = get_payload_text(outbox[0])
self.assertIn(self.team.acronym, message)
self.assertIn('accept or reject the assignment on', message)
self.assertIn(self.review_req.doc.name, message)
self.assertEqual(len(log), 1)
self.assertIn(self.reviewer.email_address(), log[0])
self.assertIn('not accepted/rejected review assignment', log[0])
def test_send_reminder_unconfirmed_assignments_respects_remind_days(self):
"""Unconfirmed assignment reminders should respect the team settings"""
assigned_on = self.assignment.assigned_on.date()
# By default, these reminders are disabled for all teams.
empty_outbox()
for days in range(10):
send_reminder_unconfirmed_assignments(assigned_on + datetime.timedelta(days=days))
self.assertEqual(len(outbox), 0)
# expect a notification every day except the day of assignment
ReviewTeamSettings.objects.update(remind_days_unconfirmed_assignments=1)
send_reminder_unconfirmed_assignments(assigned_on + datetime.timedelta(days=0))
self.assertEqual(len(outbox), 0) # no message
send_reminder_unconfirmed_assignments(assigned_on + datetime.timedelta(days=1))
self.assertEqual(len(outbox), 1) # one new message
send_reminder_unconfirmed_assignments(assigned_on + datetime.timedelta(days=2))
self.assertEqual(len(outbox), 2) # one new message
send_reminder_unconfirmed_assignments(assigned_on + datetime.timedelta(days=3))
self.assertEqual(len(outbox), 3) # one new message
# expect a notification every other day
empty_outbox()
ReviewTeamSettings.objects.update(remind_days_unconfirmed_assignments=2)
send_reminder_unconfirmed_assignments(assigned_on + datetime.timedelta(days=0))
self.assertEqual(len(outbox), 0) # no message
send_reminder_unconfirmed_assignments(assigned_on + datetime.timedelta(days=1))
self.assertEqual(len(outbox), 0) # no message
send_reminder_unconfirmed_assignments(assigned_on + datetime.timedelta(days=2))
self.assertEqual(len(outbox), 1) # one new message
send_reminder_unconfirmed_assignments(assigned_on + datetime.timedelta(days=3))
self.assertEqual(len(outbox), 1) # no new message
send_reminder_unconfirmed_assignments(assigned_on + datetime.timedelta(days=4))
self.assertEqual(len(outbox), 2) # one new message
send_reminder_unconfirmed_assignments(assigned_on + datetime.timedelta(days=5))
self.assertEqual(len(outbox), 2) # no new message
send_reminder_unconfirmed_assignments(assigned_on + datetime.timedelta(days=6))
self.assertEqual(len(outbox), 3) # one new message
def test_send_unavailability_period_ending_reminder(self):
secretary = self.make_secretary(username='reviewsecretary')
empty_outbox()
today = datetime.date.today()
UnavailablePeriod.objects.create(
team=self.team,
person=self.reviewer,
start_date=today - datetime.timedelta(days=40),
end_date=today + datetime.timedelta(days=3),
availability="unavailable",
)
UnavailablePeriod.objects.create(
team=self.team,
person=self.reviewer,
# This object should be ignored, length is too short
start_date=today - datetime.timedelta(days=20),
end_date=today + datetime.timedelta(days=3),
availability="unavailable",
)
UnavailablePeriod.objects.create(
team=self.team,
person=self.reviewer,
start_date=today - datetime.timedelta(days=40),
# This object should be ignored, end date is too far away
end_date=today + datetime.timedelta(days=4),
availability="unavailable",
)
UnavailablePeriod.objects.create(
team=self.team,
person=self.reviewer,
# This object should be ignored, end date is too close
start_date=today - datetime.timedelta(days=40),
end_date=today + datetime.timedelta(days=2),
availability="unavailable",
)
log = send_unavailability_period_ending_reminder(today)
self.assertEqual(len(outbox), 1)
self.assertTrue(self.reviewer.email_address() in outbox[0]["To"])
self.assertTrue(secretary.person.email_address() in outbox[0]["To"])
message = get_payload_text(outbox[0])
self.assertTrue(self.reviewer.name in message)
self.assertTrue(self.team.acronym in message)
self.assertEqual(len(log), 1)
self.assertTrue(self.reviewer.name in log[0])
self.assertTrue(self.team.acronym in log[0])
def test_send_review_reminder_overdue_assignment(self):
"""An overdue assignment reminder should be sent to the secretary
This tests that a second set of assignments for the same reviewer but a different
review team does not cause cross-talk between teams. To do this, it removes the
ReviewTeamSettings instance for the second review team. At the moment, this has
the effect of disabling these reminders. This is a bit of a hack, because I'm not
sure that review teams without the ReviewTeamSettings should exist. It has the
needed effect but might require rethinking in the future.
"""
secretary = self.make_secretary(username='reviewsecretary')
# Set the remind_date to be exactly one grace period after self.deadline
remind_date = self.deadline + datetime.timedelta(days=5)
# Create a second request for a second team that will not be sent reminders
second_team = ReviewAssignmentFactory(
review_request__state_id='assigned',
review_request__deadline=self.deadline,
state_id='assigned',
assigned_on=self.deadline,
reviewer=self.reviewer.email_set.first(),
).review_request.team
second_team.reviewteamsettings.delete() # prevent it from being sent reminders
# An assignment that is not yet overdue
not_overdue = remind_date + datetime.timedelta(days=1)
ReviewAssignmentFactory(
review_request__team=self.team,
review_request__state_id='assigned',
review_request__deadline=not_overdue,
state_id='assigned',
assigned_on=not_overdue,
reviewer=self.reviewer.email_set.first(),
)
ReviewAssignmentFactory(
review_request__team=second_team,
review_request__state_id='assigned',
review_request__deadline=not_overdue,
state_id='assigned',
assigned_on=not_overdue,
reviewer=self.reviewer.email_set.first(),
)
# An assignment that is overdue but is not past the grace period
in_grace_period = remind_date - datetime.timedelta(days=1)
ReviewAssignmentFactory(
review_request__team=self.team,
review_request__state_id='assigned',
review_request__deadline=in_grace_period,
state_id='assigned',
assigned_on=in_grace_period,
reviewer=self.reviewer.email_set.first(),
)
ReviewAssignmentFactory(
review_request__team=second_team,
review_request__state_id='assigned',
review_request__deadline=in_grace_period,
state_id='assigned',
assigned_on=in_grace_period,
reviewer=self.reviewer.email_set.first(),
)
empty_outbox()
log = send_review_reminder_overdue_assignment(remind_date)
self.assertEqual(len(log), 1)
self.assertEqual(len(outbox), 1)
self.assertTrue(secretary.person.email_address() in outbox[0]["To"])
self.assertEqual(outbox[0]["Subject"], "1 Overdue review for team {}".format(self.team.acronym))
message = get_payload_text(outbox[0])
self.assertIn(
self.team.acronym + ' has 1 accepted or assigned review overdue by at least 5 days.',
message,
)
self.assertIn('Review of {} by {}'.format(self.review_req.doc.name, self.reviewer.plain_name()), message)
self.assertEqual(len(log), 1)
self.assertIn(secretary.person.email_address(), log[0])
self.assertIn('1 overdue review', log[0])
def test_send_reminder_all_open_reviews(self):
self.make_secretary(username='reviewsecretary')
ReviewerSettingsFactory(team=self.team, person=self.reviewer, remind_days_open_reviews=1)
# Create another assignment for this reviewer in a different team.
# Configure so that a reminder should not be sent for the date we test. It should not
# be included in the reminder that's sent - only one open review assignment should be
# reported.
second_req = ReviewRequestFactory(state_id='assigned', deadline=self.deadline)
second_team = second_req.team
ReviewAssignmentFactory(
review_request=second_req,
state_id='assigned',
assigned_on=second_req.time,
reviewer=self.reviewer.email(),
)
ReviewerSettingsFactory(team=second_team, person=self.reviewer, remind_days_open_reviews=13)
empty_outbox()
today = datetime.date.today()
log = send_reminder_all_open_reviews(today)
self.assertEqual(len(outbox), 1)
self.assertTrue(self.reviewer.email_address() in outbox[0]["To"])
self.assertEqual(outbox[0]["Subject"], "Reminder: you have 1 open review assignment")
message = get_payload_text(outbox[0])
self.assertTrue(self.team.acronym in message)
self.assertTrue('you have 1 open review' in message)
self.assertTrue(self.review_req.doc.name in message)
self.assertTrue(self.review_req.deadline.strftime('%Y-%m-%d') in message)
self.assertEqual(len(log), 1)
self.assertTrue(self.reviewer.email_address() in log[0])
self.assertTrue('1 open review' in log[0])
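The cross-team isolation exercised by the tests above hinges on filtering open assignments by both person and team when applying each team's reminder cadence. A minimal sketch of that selection, with illustrative names rather than the actual datatracker models:

```python
from dataclasses import dataclass

@dataclass
class Assignment:
    team: str
    reviewer: str

@dataclass
class ReviewerSettings:
    team: str
    person: str
    remind_days_open_reviews: int

def assignments_to_remind(assignments, all_settings, days_since_origin):
    """Select open assignments whose reviewer is due a reminder today.

    Filtering by person *and* team mirrors the fix under test: a
    reviewer's cadence for one team must not trigger reminders for
    that reviewer's assignments in another team.
    """
    due = []
    for settings in all_settings:
        if days_since_origin % settings.remind_days_open_reviews != 0:
            continue  # not a reminder day for this team's cadence
        due.extend(
            a for a in assignments
            if a.reviewer == settings.person and a.team == settings.team
        )
    return due
```

With a reviewer on two teams (cadences 1 and 13 days, as in the test), only the matching team's assignment is selected on day 7.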

View file

@ -702,7 +702,7 @@ def get_default_filter_re(person):
return '^draft-(%s|%s)-.*$' % ( person.last_name().lower(), '|'.join(['ietf-%s' % g.acronym for g in groups_to_avoid]))
def send_unavaibility_period_ending_reminder(remind_date):
def send_unavailability_period_ending_reminder(remind_date):
reminder_days = 3
end_date = remind_date + datetime.timedelta(days=reminder_days)
min_start_date = end_date - datetime.timedelta(days=30)
@ -771,6 +771,7 @@ def send_reminder_all_open_reviews(remind_date):
assignments = ReviewAssignment.objects.filter(
state__in=("assigned", "accepted"),
reviewer__person=reviewer_settings.person,
review_request__team=reviewer_settings.team,
)
if not assignments:
continue
@ -800,14 +801,10 @@ def send_reminder_unconfirmed_assignments(remind_date):
accepted or rejected, if enabled in ReviewTeamSettings.
"""
log = []
days_since_origin = (remind_date - ORIGIN_DATE_PERIODIC_REMINDERS).days
relevant_review_team_settings = ReviewTeamSettings.objects.filter(
remind_days_unconfirmed_assignments__isnull=False)
for review_team_settings in relevant_review_team_settings:
if days_since_origin % review_team_settings.remind_days_unconfirmed_assignments != 0:
continue
assignments = ReviewAssignment.objects.filter(
state='assigned',
review_request__team=review_team_settings.group,
@ -816,6 +813,9 @@ def send_reminder_unconfirmed_assignments(remind_date):
continue
for assignment in assignments:
days_old = (remind_date - assignment.assigned_on.date()).days
if days_old == 0 or (days_old % review_team_settings.remind_days_unconfirmed_assignments) != 0:
continue # skip those created today or not due for a reminder today
to = assignment.reviewer.formatted_email()
subject = "Reminder: you have not responded to a review assignment"
domain = Site.objects.get_current().domain
@ -823,19 +823,33 @@ def send_reminder_unconfirmed_assignments(remind_date):
"name": assignment.review_request.doc.name,
"request_id": assignment.review_request.pk
})
cc = [secr_role.formatted_email()
for secr_role in assignment.review_request.team.role_set.filter(name__slug='secr')]
send_mail(None, to, None, subject, "review/reviewer_reminder_unconfirmed_assignments.txt", {
"review_request_url": "https://{}{}".format(domain, review_request_url),
"assignment": assignment,
"team": assignment.review_request.team,
"remind_days": review_team_settings.remind_days_unconfirmed_assignments,
})
send_mail(
request=None,
to=to,
cc=cc,
frm=None,
subject=subject,
template="review/reviewer_reminder_unconfirmed_assignments.txt",
context={
"review_request_url": "https://{}{}".format(domain, review_request_url),
"assignment": assignment,
"team": assignment.review_request.team,
"remind_days": review_team_settings.remind_days_unconfirmed_assignments,
},
)
log.append("Emailed reminder to {} about not accepted/rejected review assignment {}".format(to, assignment.pk))
return log
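The per-assignment cadence added in this hunk (skip the assignment day, then remind every remind_days days, with None disabling reminders) can be sketched as a standalone predicate. This is an illustration of the logic, not the datatracker function itself:

```python
import datetime
from typing import Optional

def reminder_due(assigned_on: datetime.date,
                 remind_date: datetime.date,
                 remind_days: Optional[int]) -> bool:
    """Return True when an unconfirmed assignment is due a nudge.

    Reminders fire every `remind_days` days after assignment, skipping
    the assignment day itself; None disables them entirely.
    """
    if remind_days is None:
        return False
    days_old = (remind_date - assigned_on).days
    return days_old > 0 and days_old % remind_days == 0
```

With remind_days=2 this yields reminders on days 2, 4, 6, ... after assignment, matching the every-other-day pattern the tests assert.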
def review_assignments_needing_reviewer_reminder(remind_date):
"""Get review assignments needing reviewer reminders
Returns a queryset of ReviewAssignments whose reviewers should be notified.
"""
assignment_qs = ReviewAssignment.objects.filter(
state__in=("assigned", "accepted"),
reviewer__person__reviewersettings__remind_days_before_deadline__isnull=False,
@ -876,23 +890,44 @@ def email_reviewer_reminder(assignment):
})
def review_assignments_needing_secretary_reminder(remind_date):
"""Find ReviewAssignments whose secretary should be sent a reminder today"""
# Get ReviewAssignments for teams whose secretaries have a non-null remind_days_before_deadline
# setting.
assignment_qs = ReviewAssignment.objects.filter(
state__in=("assigned", "accepted"),
review_request__team__role__name__slug='secr',
review_request__team__role__person__reviewsecretarysettings__remind_days_before_deadline__isnull=False,
review_request__team__role__person__reviewsecretarysettings__team=F("review_request__team"),
).exclude(
reviewer=None
).values_list("pk", "review_request__deadline", "review_request__team__role", "review_request__team__role__person__reviewsecretarysettings__remind_days_before_deadline").distinct()
assignment_pks = {}
# For each assignment, find all secretaries who should be reminded today
assignment_pks = set()
secretary_pks = set()
notifications = []
for a_pk, deadline, secretary_role_pk, remind_days in assignment_qs:
if (deadline - remind_date).days == remind_days:
assignment_pks[a_pk] = secretary_role_pk
notifications.append((a_pk, secretary_role_pk))
assignment_pks.add(a_pk)
secretary_pks.add(secretary_role_pk)
review_assignments = { a.pk: a for a in ReviewAssignment.objects.filter(pk__in=list(assignment_pks.keys())).select_related("reviewer", "reviewer__person", "state", "review_request__team") }
secretary_roles = { r.pk: r for r in Role.objects.filter(pk__in=list(assignment_pks.values())).select_related("email", "person") }
review_assignments = {
a.pk: a
for a in ReviewAssignment.objects.filter(pk__in=assignment_pks).select_related(
"reviewer", "reviewer__person", "state", "review_request__team"
)
}
secretary_roles = {
r.pk: r
for r in Role.objects.filter(pk__in=secretary_pks).select_related("email", "person")
}
return [
(review_assignments[a_pk], secretary_roles[secretary_role_pk])
for a_pk, secretary_role_pk in notifications
]
return [ (review_assignments[a_pk], secretary_roles[secretary_role_pk]) for a_pk, secretary_role_pk in assignment_pks.items() ]
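The rewrite above replaces a dict keyed by assignment pk, which silently kept only one secretary per assignment, with a list of (assignment, secretary) pairs so every matching secretary is notified. A minimal sketch of the pairing over the values_list-style tuples, using illustrative data rather than real model instances:

```python
import datetime

def secretary_notifications(candidates, remind_date):
    """Pair each assignment with every secretary due a reminder today.

    `candidates` is an iterable of (assignment_pk, deadline,
    secretary_role_pk, remind_days) tuples, mirroring the shape of the
    values_list query above; the same assignment may appear once per
    secretary, and all matching pairs are kept.
    """
    return [
        (a_pk, secr_pk)
        for a_pk, deadline, secr_pk, remind_days in candidates
        if (deadline - remind_date).days == remind_days
    ]
```

Two secretaries sharing remind_days=6 both appear in the result for the same assignment, which is exactly the multi-secretary case the new test checks.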
def email_secretary_reminder(assignment, secretary_role):
review_request = assignment.review_request
@ -912,7 +947,7 @@ def email_secretary_reminder(assignment, secretary_role):
settings = ReviewSecretarySettings.objects.filter(person=secretary_role.person_id, team=team).first()
remind_days = settings.remind_days_before_deadline if settings else 0
send_mail(None, [assignment.reviewer.formatted_email()], None, subject, "review/secretary_reminder.txt", {
send_mail(None, [secretary_role.email.formatted_email()], None, subject, "review/secretary_reminder.txt", {
"review_request_url": "https://{}{}".format(domain, request_url),
"settings_url": "https://{}{}".format(domain, settings_url),
"review_request": review_request,

View file

@ -190,7 +190,7 @@ class MiscSessionForm(TimeSlotForm):
Plenary = IETF''',
required=False)
location = forms.ModelChoiceField(queryset=Room.objects, required=False)
remote_instructions = forms.CharField(max_length=255)
remote_instructions = forms.CharField(max_length=255, required=False)
show_location = forms.BooleanField(required=False)
def __init__(self,*args,**kwargs):

View file

@ -20,14 +20,13 @@ from ietf.utils.mail import send_mail
from ietf.meeting.forms import duration_string
from ietf.meeting.helpers import get_meeting, make_materials_directories, populate_important_dates
from ietf.meeting.models import Meeting, Session, Room, TimeSlot, SchedTimeSessAssignment, Schedule, SchedulingEvent
from ietf.meeting.utils import add_event_info_to_session_qs
from ietf.meeting.utils import add_event_info_to_session_qs, handle_upload_file
from ietf.name.models import SessionStatusName
from ietf.group.models import Group, GroupEvent
from ietf.secr.meetings.blue_sheets import create_blue_sheets
from ietf.secr.meetings.forms import ( BaseMeetingRoomFormSet, MeetingModelForm, MeetingSelectForm,
MeetingRoomForm, MiscSessionForm, TimeSlotForm, RegularSessionEditForm,
UploadBlueSheetForm, MeetingRoomOptionsForm )
from ietf.secr.proceedings.utils import handle_upload_file
from ietf.secr.sreq.views import get_initial_session
from ietf.secr.utils.meeting import get_session, get_timeslot
from ietf.mailtrigger.utils import gather_address_lists
@ -431,6 +430,7 @@ def misc_sessions(request, meeting_id, schedule_name):
group=group,
type=type,
purpose=purpose,
on_agenda=purpose.on_agenda,
remote_instructions=remote_instructions)
SchedulingEvent.objects.create(
@ -558,6 +558,8 @@ def misc_session_edit(request, meeting_id, schedule_name, slot_id):
session.name = name
session.short = short
session.remote_instructions = remote_instructions
if session.purpose != session_purpose: # only change if purpose is changing
session.on_agenda = session_purpose.on_agenda
session.purpose = session_purpose
session.type = slot_type
session.save()

View file

@ -1,74 +0,0 @@
# Copyright The IETF Trust 2016-2019, All Rights Reserved
import glob
import io
import os
from django.conf import settings
from django.contrib import messages
from django.utils.encoding import smart_text
import debug # pyflakes:ignore
from ietf.utils.html import sanitize_document
def handle_upload_file(file,filename,meeting,subdir, request=None, encoding=None):
'''
This function takes a file object, a filename, a meeting object, and a subdir string.
It saves the file to the appropriate directory, get_materials_path() + subdir.
If the file is a zip file, it creates a new directory under 'slides' named after the
zip file's basename and unzips the archive into that directory.
'''
base, extension = os.path.splitext(filename)
if extension == '.zip':
path = os.path.join(meeting.get_materials_path(),subdir,base)
if not os.path.exists(path):
os.mkdir(path)
else:
path = os.path.join(meeting.get_materials_path(),subdir)
if not os.path.exists(path):
os.makedirs(path)
# agendas and minutes can only have one file instance so delete file if it already exists
if subdir in ('agenda','minutes'):
old_files = glob.glob(os.path.join(path,base) + '.*')
for f in old_files:
os.remove(f)
destination = io.open(os.path.join(path,filename), 'wb+')
if extension in settings.MEETING_VALID_MIME_TYPE_EXTENSIONS['text/html']:
file.open()
text = file.read()
if encoding:
try:
text = text.decode(encoding)
except LookupError as e:
return "Failure trying to save '%s': Could not identify the file encoding, got '%s'. Hint: Try to upload as UTF-8." % (filename, str(e)[:120])
else:
try:
text = smart_text(text)
except UnicodeDecodeError as e:
return "Failure trying to save '%s'. Hint: Try to upload as UTF-8: %s..." % (filename, str(e)[:120])
# Whole file sanitization; add back what's missing from a complete
# document (sanitize will remove these).
clean = sanitize_document(text)
destination.write(clean.encode('utf8'))
if request and clean != text:
messages.warning(request, "Uploaded html content is sanitized to prevent unsafe content. "
"Your upload %s was changed by the sanitization; please check the "
"resulting content. " % (filename, ))
else:
if hasattr(file, 'chunks'):
for chunk in file.chunks():
destination.write(chunk)
else:
destination.write(file.read())
destination.close()
# unzip zipfile
if extension == '.zip':
os.chdir(path)
os.system('unzip %s' % filename)
return None

View file

@ -33,7 +33,10 @@ def display_duration(value):
3600: '1 Hour',
5400: '1.5 Hours',
7200: '2 Hours',
9000: '2.5 Hours'}
9000: '2.5 Hours',
10800: '3 Hours',
12600: '3.5 Hours',
14400: '4 Hours'}
if value in map:
return map[value]
else:

View file

@ -95,6 +95,7 @@ class SessionRequestTestCase(TestCase):
attendees = 10
comments = 'need lights'
mars_sessions = meeting.session_set.filter(group__acronym='mars')
empty_outbox()
post_data = {'num_session':'2',
'attendees': attendees,
'constraint_chair_conflict':iabprog.acronym,
@ -103,7 +104,7 @@ class SessionRequestTestCase(TestCase):
'joint_with_groups': group3.acronym + ' ' + group4.acronym,
'joint_for_session': '2',
'timeranges': ['thursday-afternoon-early', 'thursday-afternoon-late'],
'session_set-TOTAL_FORMS': '2',
'session_set-TOTAL_FORMS': '3', # matches what view actually sends, even with only 2 filled in
'session_set-INITIAL_FORMS': '1',
'session_set-MIN_NUM_FORMS': '1',
'session_set-MAX_NUM_FORMS': '3',
@ -129,6 +130,16 @@ class SessionRequestTestCase(TestCase):
'session_set-1-attendees': attendees,
'session_set-1-comments': comments,
'session_set-1-DELETE': '',
'session_set-2-id': '',
'session_set-2-name': '',
'session_set-2-short': '',
'session_set-2-purpose': 'regular',
'session_set-2-type': 'regular',
'session_set-2-requested_duration': '',
'session_set-2-on_agenda': 'True',
'session_set-2-attendees': attendees,
'session_set-2-comments': '',
'session_set-2-DELETE': 'on',
'submit': 'Continue'}
r = self.client.post(url, post_data, HTTP_HOST='example.com')
redirect_url = reverse('ietf.secr.sreq.views.view', kwargs={'acronym': 'mars'})
@ -156,17 +167,22 @@ class SessionRequestTestCase(TestCase):
self.assertContains(r, group2.acronym)
self.assertContains(r, 'Second session with: {} {}'.format(group3.acronym, group4.acronym))
# check that a notification was sent
self.assertEqual(len(outbox), 1)
notification_payload = get_payload_text(outbox[0])
self.assertIn('1 Hour, 1 Hour', notification_payload)
self.assertNotIn('1 Hour, 1 Hour, 1 Hour', notification_payload)
# Edit again, changing the joint sessions and clearing some fields. The behaviour of
# edit is different depending on whether previous joint sessions were recorded.
empty_outbox()
post_data = {'num_session':'2',
'length_session1':'3600',
'length_session2':'3600',
'attendees':attendees,
'constraint_chair_conflict':'',
'comments':'need lights',
'joint_with_groups': group2.acronym,
'joint_for_session': '1',
'session_set-TOTAL_FORMS': '2',
'session_set-TOTAL_FORMS': '3', # matches what view actually sends, even with only 2 filled in
'session_set-INITIAL_FORMS': '2',
'session_set-MIN_NUM_FORMS': '1',
'session_set-MAX_NUM_FORMS': '3',
@ -192,6 +208,16 @@ class SessionRequestTestCase(TestCase):
'session_set-1-attendees': sessions[1].attendees,
'session_set-1-comments': sessions[1].comments,
'session_set-1-DELETE': '',
'session_set-2-id': '',
'session_set-2-name': '',
'session_set-2-short': '',
'session_set-2-purpose': 'regular',
'session_set-2-type': 'regular',
'session_set-2-requested_duration': '',
'session_set-2-on_agenda': 'True',
'session_set-2-attendees': attendees,
'session_set-2-comments': '',
'session_set-2-DELETE': 'on',
'submit': 'Continue'}
r = self.client.post(url, post_data, HTTP_HOST='example.com')
self.assertRedirects(r, redirect_url)
@ -206,10 +232,84 @@ class SessionRequestTestCase(TestCase):
self.assertEqual(list(sessions[0].joint_with_groups.all()), [group2])
self.assertFalse(sessions[1].joint_with_groups.count())
# check that a notification was sent
self.assertEqual(len(outbox), 1)
notification_payload = get_payload_text(outbox[0])
self.assertIn('1 Hour, 1 Hour', notification_payload)
self.assertNotIn('1 Hour, 1 Hour, 1 Hour', notification_payload)
# Check whether the updated data is visible on the view page
r = self.client.get(redirect_url)
self.assertContains(r, 'First session with: {}'.format(group2.acronym))
@override_settings(SECR_VIRTUAL_MEETINGS=tuple()) # ensure not unexpectedly testing a virtual meeting session
def test_edit_constraint_bethere(self):
meeting = MeetingFactory(type_id='ietf', date=datetime.date.today())
mars = RoleFactory(name_id='chair', person__user__username='marschairman', group__acronym='mars').group
session = SessionFactory(meeting=meeting, group=mars, status_id='sched')
Constraint.objects.create(
meeting=meeting,
source=mars,
person=Person.objects.get(user__username='marschairman'),
name_id='bethere',
)
self.assertEqual(session.people_constraints.count(), 1)
url = reverse('ietf.secr.sreq.views.edit', kwargs=dict(acronym='mars'))
self.client.login(username='marschairman', password='marschairman+password')
attendees = '10'
ad = Person.objects.get(user__username='ad')
post_data = {
'num_session': '1',
'attendees': attendees,
'bethere': str(ad.pk),
'constraint_chair_conflict':'',
'comments':'',
'joint_with_groups': '',
'joint_for_session': '',
'delete_conflict': 'on',
'session_set-TOTAL_FORMS': '3', # matches what view actually sends, even with only 2 filled in
'session_set-INITIAL_FORMS': '1',
'session_set-MIN_NUM_FORMS': '1',
'session_set-MAX_NUM_FORMS': '3',
'session_set-0-id':session.pk,
'session_set-0-name': session.name,
'session_set-0-short': session.short,
'session_set-0-purpose': session.purpose_id,
'session_set-0-type': session.type_id,
'session_set-0-requested_duration': '3600',
'session_set-0-on_agenda': session.on_agenda,
'session_set-0-remote_instructions': session.remote_instructions,
'session_set-0-attendees': attendees,
'session_set-0-comments': '',
'session_set-0-DELETE': '',
'session_set-1-id': '',
'session_set-1-name': '',
'session_set-1-short': '',
'session_set-1-purpose':'regular',
'session_set-1-type':'regular',
'session_set-1-requested_duration': '',
'session_set-1-on_agenda': 'True',
'session_set-1-attendees': attendees,
'session_set-1-comments': '',
'session_set-1-DELETE': 'on',
'session_set-2-id': '',
'session_set-2-name': '',
'session_set-2-short': '',
'session_set-2-purpose': 'regular',
'session_set-2-type': 'regular',
'session_set-2-requested_duration': '',
'session_set-2-on_agenda': 'True',
'session_set-2-attendees': attendees,
'session_set-2-comments': '',
'session_set-2-DELETE': 'on',
'submit': 'Save',
}
r = self.client.post(url, post_data, HTTP_HOST='example.com')
redirect_url = reverse('ietf.secr.sreq.views.view', kwargs={'acronym': 'mars'})
self.assertRedirects(r, redirect_url)
self.assertEqual([pc.person for pc in session.people_constraints.all()], [ad])
def test_edit_inactive_conflicts(self):
"""Inactive conflicts should be displayed and removable"""
meeting = MeetingFactory(type_id='ietf', date=datetime.date.today(), group_conflicts=['chair_conflict'])
@ -579,7 +679,7 @@ class SubmitRequestCase(TestCase):
sessions = Session.objects.filter(meeting=meeting,group=group)
self.assertEqual(len(sessions), 2)
session = sessions[0]
self.assertEqual(session.resources.count(),1)
self.assertEqual(session.people_constraints.count(),1)
self.assertEqual(session.constraints().get(name='time_relation').time_relation, 'subsequent-days')
@ -597,6 +697,115 @@ class SubmitRequestCase(TestCase):
self.assertTrue(ad.ascii_name() in notification_payload)
self.assertIn(ConstraintName.objects.get(slug='chair_conflict').name, notification_payload)
self.assertIn(group.acronym, notification_payload)
self.assertIn('1 Hour, 1 Hour', notification_payload)
self.assertNotIn('1 Hour, 1 Hour, 1 Hour', notification_payload)
self.assertNotIn('The third session requires your approval', notification_payload)
def test_request_notification_third_session(self):
meeting = MeetingFactory(type_id='ietf', date=datetime.date.today())
ad = Person.objects.get(user__username='ad')
area = GroupFactory(type_id='area')
RoleFactory(name_id='ad', person=ad, group=area)
group = GroupFactory(acronym='ames', parent=area)
group2 = GroupFactory(acronym='ames2', parent=area)
group3 = GroupFactory(acronym='ames3', parent=area)
group4 = GroupFactory(acronym='ames4', parent=area)
RoleFactory(name_id='chair', group=group, person__user__username='ameschairman')
resource = ResourceAssociation.objects.create(name_id='project')
# Bit of a test data hack - the fixture now has no used resources to pick from
resource.name.used=True
resource.name.save()
url = reverse('ietf.secr.sreq.views.new',kwargs={'acronym':group.acronym})
confirm_url = reverse('ietf.secr.sreq.views.confirm',kwargs={'acronym':group.acronym})
len_before = len(outbox)
attendees = '10'
post_data = {'num_session':'2',
'third_session': 'true',
'attendees':attendees,
'bethere':str(ad.pk),
'constraint_chair_conflict':group4.acronym,
'comments':'',
'resources': resource.pk,
'session_time_relation': 'subsequent-days',
'adjacent_with_wg': group2.acronym,
'joint_with_groups': group3.acronym,
'joint_for_session': '2',
'timeranges': ['thursday-afternoon-early', 'thursday-afternoon-late'],
'session_set-TOTAL_FORMS': '3',
'session_set-INITIAL_FORMS': '0',
'session_set-MIN_NUM_FORMS': '1',
'session_set-MAX_NUM_FORMS': '3',
# no 'session_set-0-id' for new session
'session_set-0-name': '',
'session_set-0-short': '',
'session_set-0-purpose': 'regular',
'session_set-0-type': 'regular',
'session_set-0-requested_duration': '3600',
'session_set-0-on_agenda': True,
'session_set-0-remote_instructions': '',
'session_set-0-attendees': attendees,
'session_set-0-comments': '',
'session_set-0-DELETE': '',
# no 'session_set-1-id' for new session
'session_set-1-name': '',
'session_set-1-short': '',
'session_set-1-purpose': 'regular',
'session_set-1-type': 'regular',
'session_set-1-requested_duration': '3600',
'session_set-1-on_agenda': True,
'session_set-1-remote_instructions': '',
'session_set-1-attendees': attendees,
'session_set-1-comments': '',
'session_set-1-DELETE': '',
# no 'session_set-2-id' for new session
'session_set-2-name': '',
'session_set-2-short': '',
'session_set-2-purpose': 'regular',
'session_set-2-type': 'regular',
'session_set-2-requested_duration': '3600',
'session_set-2-on_agenda': True,
'session_set-2-remote_instructions': '',
'session_set-2-attendees': attendees,
'session_set-2-comments': '',
'session_set-2-DELETE': '',
'submit': 'Continue'}
self.client.login(username="ameschairman", password="ameschairman+password")
# submit
r = self.client.post(url,post_data)
self.assertEqual(r.status_code, 200)
q = PyQuery(r.content)
self.assertTrue('Confirm' in str(q("title")), r.context['form'].errors)
# confirm
post_data['submit'] = 'Submit'
r = self.client.post(confirm_url,post_data)
self.assertRedirects(r, reverse('ietf.secr.sreq.views.main'))
self.assertEqual(len(outbox),len_before+1)
notification = outbox[-1]
notification_payload = get_payload_text(notification)
sessions = Session.objects.filter(meeting=meeting,group=group)
self.assertEqual(len(sessions), 3)
session = sessions[0]
self.assertEqual(session.resources.count(),1)
self.assertEqual(session.people_constraints.count(),1)
self.assertEqual(session.constraints().get(name='time_relation').time_relation, 'subsequent-days')
self.assertEqual(session.constraints().get(name='wg_adjacent').target.acronym, group2.acronym)
self.assertEqual(
list(session.constraints().get(name='timerange').timeranges.all().values('name')),
list(TimerangeName.objects.filter(name__in=['thursday-afternoon-early', 'thursday-afternoon-late']).values('name'))
)
resource = session.resources.first()
self.assertTrue(resource.desc in notification_payload)
self.assertTrue('Schedule the sessions on subsequent days' in notification_payload)
self.assertTrue(group2.acronym in notification_payload)
self.assertTrue("Can't meet: Thursday early afternoon, Thursday late" in notification_payload)
self.assertTrue('Second session joint with: {}'.format(group3.acronym) in notification_payload)
self.assertTrue(ad.ascii_name() in notification_payload)
self.assertIn(ConstraintName.objects.get(slug='chair_conflict').name, notification_payload)
self.assertIn(group.acronym, notification_payload)
self.assertIn('1 Hour, 1 Hour, 1 Hour', notification_payload)
self.assertIn('The third session requires your approval', notification_payload)
class LockAppTestCase(TestCase):
def setUp(self):
@@ -612,6 +821,12 @@ class LockAppTestCase(TestCase):
r = self.client.get(url)
self.assertEqual(r.status_code, 200)
q = PyQuery(r.content)
self.assertEqual(len(q(':disabled[name="submit"]')), 0)
chair = self.group.role_set.filter(name_id='chair').first().person.user.username
self.client.login(username=chair, password=f'{chair}+password')
r = self.client.get(url)
self.assertEqual(r.status_code, 200)
q = PyQuery(r.content)
self.assertEqual(len(q(':disabled[name="submit"]')), 1)
def test_view_request(self):
@@ -747,8 +962,8 @@ class SessionFormTest(TestCase):
# Test with two sessions
self.valid_form_data.update({
'length_session3': '',
'third_session': '',
'session_set-TOTAL_FORMS': '2',
'joint_for_session': '2'
})
form = SessionForm(data=self.valid_form_data, group=self.group1, meeting=self.meeting)
@@ -756,8 +971,8 @@ class SessionFormTest(TestCase):
# Test with one session
self.valid_form_data.update({
'length_session2': '',
'num_session': 1,
'session_set-TOTAL_FORMS': '1',
'joint_for_session': '1',
'session_time_relation': '',
})
@@ -806,7 +1021,7 @@ class SessionFormTest(TestCase):
def test_invalid_session_time_relation(self):
form = self._invalid_test_helper({
'third_session': '',
'length_session2': '',
'session_set-TOTAL_FORMS': 1,
'num_session': 1,
'joint_for_session': '1',
})

View file

@@ -141,10 +141,11 @@ def save_conflicts(group, meeting, conflicts, name):
name=constraint_name)
constraint.save()
def send_notification(group,meeting,login,session,action):
def send_notification(group, meeting, login, sreq_data, session_data, action):
'''
This function generates email notifications for various session request activities.
session argument is a dictionary of fields from the session request form
sreq_data argument is a dictionary of fields from the session request form
session_data is an array of data from individual session subforms
action argument is a string [new|update].
'''
(to_email, cc_list) = gather_address_lists('session_requested',group=group,person=login)
@@ -154,7 +155,7 @@ def send_notification(group,meeting,login,session,action):
# send email
context = {}
context['session'] = session
context['session'] = sreq_data
context['group'] = group
context['meeting'] = meeting
context['login'] = login
@@ -168,12 +169,14 @@ def send_notification(group,meeting,login,session,action):
# if third session requested approval is required
# change headers TO=ADs, CC=session-request, submitter and cochairs
if session.get('length_session3',None):
context['session']['num_session'] = 3
if len(session_data) > 2:
(to_email, cc_list) = gather_address_lists('session_requested_long',group=group,person=login)
subject = '%s - Request for meeting session approval for IETF %s' % (group.acronym, meeting.number)
template = 'sreq/session_approval_notification.txt'
#status_text = 'the %s Directors for approval' % group.parent
context['session_lengths'] = [sd['requested_duration'] for sd in session_data]
send_mail(None,
to_email,
from_email,
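The hunk above changes how `send_notification` decides its recipients: instead of sniffing `length_session3` out of the form dictionary, it counts the per-session subform entries and collects their durations for the template. The routing can be sketched in isolation with plain data (the `route_notification` helper and the sample dicts here are illustrative stand-ins, not the actual view code; the two mail-type names come from the `gather_address_lists` calls in the diff):

```python
from datetime import timedelta

def route_notification(session_data):
    """Sketch of the branch above: pick the address-list type and collect
    per-session lengths from the cleaned subform data."""
    context = {"session_lengths": [sd["requested_duration"] for sd in session_data]}
    if len(session_data) > 2:
        # a third session was requested, so AD approval is required
        mail_type = "session_requested_long"
    else:
        mail_type = "session_requested"
    return mail_type, context

# Hypothetical cleaned_data from three session subforms
data = [{"requested_duration": timedelta(seconds=3600)} for _ in range(3)]
mail_type, ctx = route_notification(data)
```

Passing the subform list explicitly, rather than a magic `length_session3` key, is what lets the same function serve both the two-session and three-session paths.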
@@ -368,7 +371,14 @@ def confirm(request, acronym):
# send notification
session_data['outbound_conflicts'] = [f"{d['name']}: {d['groups']}" for d in outbound_conflicts]
send_notification(group,meeting,login,session_data,'new')
send_notification(
group,
meeting,
login,
session_data,
[sf.cleaned_data for sf in form.session_forms[:num_sessions]],
'new',
)
status_text = 'IETF Agenda to be scheduled'
messages.success(request, 'Your request has been sent to %s' % status_text)
@@ -436,9 +446,9 @@ def edit(request, acronym, num=None):
)
login = request.user.person
session = Session()
first_session = Session()
if(len(sessions) > 0):
session = sessions[0]
first_session = sessions[0]
if request.method == 'POST':
button_text = request.POST.get('submit', '')
@@ -451,11 +461,10 @@ def edit(request, acronym, num=None):
changed_session_forms = [sf for sf in form.session_forms.forms_to_keep if sf.has_changed()]
form.session_forms.save()
for n, subform in enumerate(form.session_forms):
session = subform.instance
if session in form.session_forms.created_instances:
if subform.instance in form.session_forms.new_objects:
SchedulingEvent.objects.create(
session=session,
status_id=status_slug_for_new_session(session, n),
session=subform.instance,
status_id=status_slug_for_new_session(subform.instance, n),
by=request.user.person,
)
for sf in changed_session_forms:
@@ -473,10 +482,10 @@ def edit(request, acronym, num=None):
new_joint_for_session_idx = int(form.data.get('joint_for_session', '-1')) - 1
current_joint_for_session_idx = None
current_joint_with_groups = None
for idx, session in enumerate(sessions):
if session.joint_with_groups.count():
for idx, sess in enumerate(sessions):
if sess.joint_with_groups.count():
current_joint_for_session_idx = idx
current_joint_with_groups = session.joint_with_groups.all()
current_joint_with_groups = sess.joint_with_groups.all()
if current_joint_with_groups != new_joint_with_groups or current_joint_for_session_idx != new_joint_for_session_idx:
if current_joint_for_session_idx is not None:
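The renamed loop above scans the existing sessions to find which one currently carries the joint-with constraint, so it can be reconciled against the form's `joint_for_session` choice. Reduced to plain data it looks like this (dicts stand in for `Session` objects; the helper name is illustrative):

```python
def current_joint_session(sessions):
    """Sketch of the loop above: return (index, joint groups) for the
    session that currently has joint_with_groups set, else (None, None)."""
    idx_found, groups_found = None, None
    for idx, sess in enumerate(sessions):
        if sess["joint_with_groups"]:
            idx_found = idx
            groups_found = sess["joint_with_groups"]
    return idx_found, groups_found

# Hypothetical data: only the second of two sessions is joint with 'ames2'
sessions = [{"joint_with_groups": []}, {"joint_with_groups": ["ames2"]}]
```

Renaming the loop variable to `sess` avoids clobbering the `session` (now `first_session`) reference that the surrounding code still uses after the loop.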
@@ -510,13 +519,13 @@ def edit(request, acronym, num=None):
new_resource_ids = form.cleaned_data['resources']
new_resources = [ ResourceAssociation.objects.get(pk=a)
for a in new_resource_ids]
session.resources = new_resources
first_session.resources = new_resources
if 'bethere' in form.changed_data and set(form.cleaned_data['bethere'])!=set(initial['bethere']):
session.constraints().filter(name='bethere').delete()
first_session.constraints().filter(name='bethere').delete()
bethere_cn = ConstraintName.objects.get(slug='bethere')
for p in form.cleaned_data['bethere']:
Constraint.objects.create(name=bethere_cn, source=group, person=p, meeting=session.meeting)
Constraint.objects.create(name=bethere_cn, source=group, person=p, meeting=first_session.meeting)
if 'session_time_relation' in form.changed_data:
Constraint.objects.filter(meeting=meeting, source=group, name='time_relation').delete()
@@ -537,7 +546,14 @@ def edit(request, acronym, num=None):
#add_session_activity(group,'Session Request was updated',meeting,user)
# send notification
send_notification(group,meeting,login,form.cleaned_data,'update')
send_notification(
group,
meeting,
login,
form.cleaned_data,
[sf.cleaned_data for sf in form.session_forms.forms_to_keep],
'update',
)
messages.success(request, 'Session Request updated')
return redirect('ietf.secr.sreq.views.view', acronym=acronym)
@@ -555,7 +571,7 @@ def edit(request, acronym, num=None):
form = FormClass(group, meeting, initial=initial)
return render(request, 'sreq/edit.html', {
'is_locked': is_locked,
'is_locked': is_locked and not has_role(request.user,'Secretariat'),
'is_virtual': meeting.number in settings.SECR_VIRTUAL_MEETINGS,
'meeting': meeting,
'form': form,
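The `is_locked` change above lets the Secretariat keep editing session requests while the tool is locked for everyone else. As a plain predicate (with a set of role names standing in for `has_role`; the function name is illustrative):

```python
def form_is_locked(is_locked: bool, roles: set) -> bool:
    """Sketch of the template flag above: the tool lock applies to
    everyone except users holding the Secretariat role."""
    return is_locked and "Secretariat" not in roles

# Chairs see a disabled form while locked; the Secretariat does not
locked_for_chair = form_is_locked(True, {"WG Chair"})
locked_for_secretariat = form_is_locked(True, {"Secretariat"})
```

This matches the new `LockAppTestCase` assertions earlier in the diff, where the submit button is disabled for a chair but not for the Secretariat.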

View file

@@ -6,7 +6,7 @@ Session Requester: {{ login }}
{% if session.joint_with_groups %}{{ session.joint_for_session_display }} joint with: {{ session.joint_with_groups }}{% endif %}
Number of Sessions: {{ session.num_session }}
Length of Session(s): {{ session.length_session1|display_duration }}{% if session.length_session2 %}, {{ session.length_session2|display_duration }}{% endif %}{% if session.length_session3 %}, {{ session.length_session3|display_duration }}{% endif %}
Length of Session(s): {% for session_length in session_lengths %}{{ session_length.total_seconds|display_duration }}{% if not forloop.last %}, {% endif %}{% endfor %}
Number of Attendees: {{ session.attendees }}
Conflicts to Avoid:
{% for line in session.outbound_conflicts %} {{line}}
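The rewritten "Length of Session(s)" line above loops over `session_lengths` (a list of `timedelta`s) and pipes each value's `total_seconds` through the `display_duration` filter. Outside Django, that pipeline can be sketched with a hypothetical stand-in for the filter (its exact wording is an assumption, though the tests earlier in this diff do expect output like `1 Hour, 1 Hour, 1 Hour`):

```python
from datetime import timedelta

def display_duration(seconds):
    """Hypothetical stand-in for the |display_duration template filter;
    the "N Hour(s) M Minute(s)" wording is an assumption."""
    hours, rem = divmod(int(seconds), 3600)
    minutes = rem // 60
    parts = []
    if hours:
        parts.append(f"{hours} Hour" + ("s" if hours != 1 else ""))
    if minutes:
        parts.append(f"{minutes} Minute" + ("s" if minutes != 1 else ""))
    return " ".join(parts) or "0 Minutes"

session_lengths = [timedelta(seconds=3600)] * 3
line = ", ".join(display_duration(d.total_seconds()) for d in session_lengths)
```

Driving the line from `session_lengths` means the template no longer hard-codes three `length_sessionN` slots, so it scales with however many sessions were requested.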

View file

@@ -6,6 +6,10 @@
<script src="{% static 'secr/js/utils.js' %}"></script>
<script src="{% static 'secr/js/sessions.js' %}"></script>
{{ form.media }}
<style>
.hidden {display: none !important;}
div.form-group {display: inline;}
</style>
{% endblock %}
{% block breadcrumbs %}{{ block.super }}

Some files were not shown because too many files have changed in this diff.