Release Process
This is an outline/checklist of the process to create Archivematica & Storage Service releases.
Overview
- Merge new features
- Test new features
- #Update PRONOM
- Write documentation
- Update dependencies
- Update version
- #Build deb/rpm packages
- Test packages for new installs and upgrades
- #Tag Release
- Update ansible roles
- Announce release
Translations
Needs to be improved!
- Determine code freeze / call for translations process
- Describe processes: push and pull - and when it needs to happen
- We made a choice on how we're using Transifex to keep things simple: only one branch at a time is pushed to Transifex. E.g. once SS 0.10.0 is released we have to decide whether:
- We move Transifex to stable/0.10.x for a while so we can work on a minor release with translation fixes (e.g. 0.10.1), or
- We move to qa/0.11.x, which would only make it possible to bring new translations to SS 0.11.0.
- Affected repositories
- archivematica-storage-service
- archivematica-workflow
- archivematica-dashboard
- Includes archivematica-fpr-admin
- Includes appraisal-tab
Update PRONOM
PRONOM needs to be updated in our file identification tools, FIDO & Siegfried, as well as in the FPR.
Update FIDO
The FPR update currently uses FIDO as a source for new PRONOM data, since FIDO's files are formatted more conveniently than what PRONOM offers, so we depend on FIDO having updated its PRONOM signatures. If that has not happened, we can generate a new formats-v##.xml by updating signatures manually. Artefactual can also update PRONOM and submit a PR to FIDO.
- Checkout fido from https://github.com/openpreserve/fido
- Update signatures
  - Run `python setup.py install`
  - Run `python -m fido.update_signatures` from the fido repository root
- Add:
  - New signature file `fido/conf/DROID_SignatureFile-v##.xml`
  - New formats file `fido/conf/formats-v##.xml`
  - New PRONOM zip file `fido/conf/pronom-xml-v##.zip`
  - Updated `fido/conf/versions.xml`
- Remove:
  - Old signature file
  - Old formats file
  - Old PRONOM zip file
- Replace:
  - the container signatures (download here: https://www.nationalarchives.gov.uk/aboutapps/pronom/droid-signature-files.htm)
  - the reference path to the container signatures in `fido/fido.py`
- Update version:
  - Update in `__init__.py`
  - Update in `versions.xml`
- Create pull request.
- Release new version of FIDO.
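Putting the clone and signature-update steps together, a minimal shell sketch (run it in a virtualenv; the commands are the ones listed above):

git clone https://github.com/openpreserve/fido
cd fido
python setup.py install
# Regenerates the signature, formats and PRONOM zip files listed above
python -m fido.update_signatures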
Remember to package FIDO and Siegfried in Archivematica! See next step below.
Package FIDO
FIDO is packaged on PyPI (https://pypi.org/) as opf-fido. The Makefile includes a `make package` command that will do the update.
MCPClient's base.txt (https://github.com/artefactual/archivematica/blob/qa/1.x/src/MCPClient/requirements/base.txt) will need to be updated with the latest version.
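A hedged sketch of that flow (run `make package` from the fido checkout, as noted above; the exact pin format in base.txt should follow whatever is already in the file):

# In the fido checkout: build and upload the new opf-fido release to PyPI
make package
# In the archivematica checkout: find the opf-fido pin and bump it to the new version
grep -in fido src/MCPClient/requirements/base.txt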
Package Siegfried
When a new Siegfried version becomes available, clone the am-packbuild repo (https://github.com/artefactual-labs/am-packbuild/), check out the `qa/1.x` branch, update the Makefiles at `rpms/EL9/siegfried/Makefile` and `debs/siegfried/Makefile`, and run `make` in each directory to build the packages.
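For example (a hedged sketch of the sentence above; set the new Siegfried version inside each Makefile before running make):

git clone https://github.com/artefactual-labs/am-packbuild
cd am-packbuild
git checkout qa/1.x
# Edit rpms/EL9/siegfried/Makefile and debs/siegfried/Makefile to point at the new Siegfried release
(cd rpms/EL9/siegfried && make)
(cd debs/siegfried && make)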
Update FPR
Examples
- In https://github.com/artefactual/archivematica-fpr-admin/pull/51, the devtools used here to generate the migration were included in archivematica-fpr-admin as django-admin management tools.
- At the end of the process: remember to remove "pk"s from the final migration fixture! E.g. see PR #55
- Examples:
  - https://github.com/artefactual/archivematica-fpr-admin/pull/51
  - https://github.com/artefactual/archivematica-fpr-admin/pull/55
There used to be a bug (https://projects.artefactual.com/issues/10466#change-46673) in the imports model. To fix, remove the `apps.get_model` lines and import the models directly with `from fpr.models import Format, FormatVersion, IDRule`. You shouldn't have to do this now.
Update workflow
This depends on FIDO having updated PRONOM files. See #Update FIDO
- Generate a JSON with the current version of the FPR (for use later)
  - `python src/dashboard/src/manage.py dumpdata -o tmp/fpr-current.json fpr`
- Make a new migration (you can copy from a previous one) and update it accordingly
  - E.g. `cp src/dashboard/src/fpr/migrations/0022_pronom_94.py src/dashboard/src/fpr/migrations/0032_pronom_96.py`
- Generate the FPR migration body. This also updates the local database's FPR with the new PRONOM IDs
  - E.g. `python src/dashboard/src/manage.py import_pronom_ids path/to/fido/fido/conf/formats-v96.xml --output-filename pronom96.txt`
- Copy the output into the blank migration, above the Migration class. (Note: this is temporary, to create the data inside the FPR for the analyst steps below)
- Make sure the RunPython operation is in the Migration class, in the operations list (copy it from the previous PRONOM migration)
- Deploy on the testing pipeline or locally
(Analyst work)
- Update the new entries. Edit ONLY entries added by the latest PRONOM update, otherwise the fixture won't work properly!
- Move new formats to the most appropriate category
- Create rules & commands
- Test with data for new formats
(End Analyst work)
- Generate a JSON with the updated version of the FPR on the testing pipeline
  - `python src/dashboard/src/manage.py dumpdata -o tmp/fpr-updated.json fpr`
- Get the updates as JSON
  - E.g. `python src/dashboard/src/manage.py get_fpr_changes fpr-current.json fpr-updated.json pronom_96.json`
- Update the migration to load the JSON updates (see previous migrations)
- Review the JSON -- some IDs with multiple formats are being imported and will have to be manually reviewed until the bug is identified/corrected.
- Remove any direct imports from the bug
- Remove the pk's from the entries in the JSON document.
- Improvement Note: Because this is using loaddata, this will have problems if the FPR models are changed. A possible solution is to update get-fpr-changes to generate a migration instead of JSON
- Rebuild and test migration (one way to sanity-check it is sketched after this list)
- Commit, send PR, merge
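A minimal sketch of one way to re-apply and check the new migration locally before sending the PR (standard Django management commands; run them wherever the dashboard's manage.py normally runs for you, e.g. inside the dev container):

python src/dashboard/src/manage.py migrate fpr
# The new pronom migration should be listed with an [X] next to it
python src/dashboard/src/manage.py showmigrations fpr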
Finally, update IDTools versions in the FPR
- FPR needs a migration that points to the latest and accurate versions of Fido and Siegfried and disables the previous version of Siegfried (Fido is disabled by default; as of 1.9, there can only be one enabled identification tool). See this PR for an example of a functional migration, and heed the messy commits as a warning: https://github.com/artefactual/archivematica/pull/1547/files
- Testing the above migration can be done by running `make bootstrap-dashboard-db` to recreate the dashboard and run all associated migrations.
Update dependencies
Python Packages
metsrw and agentarchives both have Makefiles that handle most of the packaging
- Check for open PRs, merge as necessary
- Update `setup.py` with the new version, create a pull request, code review, merge
- Tag new release, push tag: `git push --tags`
- Run `make package`
  - This will build the package and upload it to PyPI. It will prompt for your PyPI username and password for the upload
  - The Makefile was created from instructions at https://packaging.python.org/distributing/
- `make clean` will delete packaging related files
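For example, a metsrw release might look roughly like this (version number illustrative):

git tag v0.x.y
git push --tags
make package   # builds and uploads to PyPI; prompts for your PyPI credentials
make clean     # removes the packaging-related files afterwards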
Update version
- Update PREMIS agent to Archivematica-X.X.X
- Update Dashboard-Administration-Version to X.X.X
- Update Storage services-Admin-Version to X.X.X
Build deb/rpm packages
The am-packbuild repository has all the code related to building packages, except for the GPG signing keys. The steps to follow in order to build production packages are as follows.
Debian packages
- Clone the am-packbuild repo. Latest work is available in master
- Put your gpg private key into `debs/GPG-KEYS-REPOS`. That's the place the Dockerfile looks for it when building the environment.
- Update the makefile at `debs/archivematica/Makefile` to reflect the version/keys you want to use.
- Run `make`, and the packages will be available in the `build` directory once the build finishes.
- Upload packages to the public Debian repository.
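A hedged sketch of the whole Debian build, assuming the layout described in the steps above (the key file name is illustrative):

git clone https://github.com/artefactual-labs/am-packbuild
cd am-packbuild/debs
cp /path/to/your-private-key.asc GPG-KEYS-REPOS/   # where the Dockerfile expects the signing key
$EDITOR archivematica/Makefile                     # set the version and keys to use
cd archivematica && make                           # packages end up in build/ once the build finishes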
Debian repositories
There are two Debian repositories, one for archivematica packages and one for dependencies. The procedure to create new ones, or to upload packages to them, is the same:
- Create folder for repo, and configuration file:
mkdir -p /path/to/repos/repo/conf
cat > /path/to/repos/repo/conf/distributions << EOF
Codename: trusty
Components: main
Architectures: amd64 source
SignWith: <short gpg keyid>
EOF
- Go inside the repo, and import the packages previously uploaded with:
cd /path/to/repos/repo/
reprepro includedeb trusty /path/to/packages/*.deb
reprepro includedsc trusty /path/to/packages/*.dsc
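To confirm the import, reprepro can list what is now in the distribution (optional check, run from the same repository directory):

reprepro list trusty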
The current official repo is at packages.archivematica.org
RPM Packages
- Package specs are available in am-packbuild/rpms
- There are vars in the Makefiles for version/release, so update them when needed
- To build them, go into the directory you want to build and run `make`
RPM Repositories
Once the packages are built, upload them to packages.archivematica.org/<version>/centos.
Go inside that dir, and as user ohbot run:
- `rpm --addsign *.rpm` (already signed packages will be skipped)
- `createrepo .` (for packages other than archivematica, use the "centos-extras" repository)
- `gpg --detach-sign --armor repodata/repomd.xml`
The `rpm --addsign` command signs the RPMs, and the `gpg --detach-sign` command signs the repository metadata.
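An optional sanity check after signing, as a hedged sketch (the .asc file name assumes the default output of the gpg command above; rpm --checksig only reports OK once the signing key has been imported into the rpm keyring):

rpm --checksig *.rpm
gpg --verify repodata/repomd.xml.asc repodata/repomd.xml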
Development stage
In the final stages of development, the repositories for the new releases are created, but packages are signed with a development key to avoid mistakes. Once the development stage finishes, all new packages need to be rebuilt using the production keys.
Development packages are built on each new commit to stable branches by Jenkins. Repositories are available at http://jenkins-ci.archivematica.org/repos/
Website
Needs to be improved!
Homepage
- Make changes in archivematica-web.git
- Update links
- Add new doc repos
- Deploy
- Log in to sites-pub as archivematica-web and run `update-web.sh`
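Roughly (host name and script location assumed from the note above):

ssh archivematica-web@sites-pub
./update-web.sh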
Documentation
- Deploy
- Log in to sites-pub as archivematica-web and run `update-docs.sh`
Wiki
- Release notes
- Installation notes
- ...
News
- Mailing list
- News section in artefactual.com
Update ansible roles
Check that the deploy-pub vars files for archivematica are updated, and that the ansible-archivematica-src and ansible-archivematica-pkg roles are able to deploy the new version.
Tag Release
- Add release tags to repositories
- Archivematica
- Storage Service
- FPR-admin
- appraisal tab
- transfer browser
- Others?
- Create stable/#.x branch
- Delete qa/#.x branch if necessary
Release Day Checklist
Step 0: Operations Team checklist tasks are complete (as of September 2018, this is an internal Trello list).
- At this point, there should be a release candidate that has been tested. A release candidate always precedes a tagged release. If any new issues have been uncovered, a new release candidate should be made and tested before proceeding with the rest of the steps.
Step 1: Release Captain decides whether the current release candidate is ready to release
- Look at all recently filed issues in GitHub -- is anything concerning / relevant for this release? Does anything need to be addressed? If an issue is uncovered that does need to be fixed, this restarts the checklist back to Step 0.
- Have automated tests passed?
Step 2: Release Captain creates a new tag for the release via GitHub (e.g. `v1.7.0` or `v0.11.0` for the Storage Service) or assigns someone else to do it.
- Ensure you are adding the tag to the right commit! It should match the last commit of the final release candidate.
- You can create the tags from GitHub or from the CLI: `git tag $VERSION $REVISION` and `git push origin refs/tags/$VERSION`
- Make sure that the version is valid:
- Valid values: v1.8.1, v1.8.1-rc.1
- Invalid values: 1.8.1, 1.8, 1.8.1-rc1, v1.8.1-rc1
Step 3: Sysadmin builds new packages using the release tag.
- See the internal wiki for steps.
Step 4: Sysadmin copies new packages to the proper repository (e.g., https://packages.archivematica.org/1.7.x/)
- See the internal wiki for steps.
Step 5: Analyst updates the Archivematica documentation links in the install / upgrade section with the correct package names and locations (e.g. conf.py in docs repo)
Step 6: Sysadmin updates deploy-pub to use the new links.
Step 7: Developer updates archivematica-web (managed in Gitolite / GitLab) to show the new release.
- See the internal wiki for steps.
Step 8: Developer changes the default branches in GitHub and GitLab and updates references in https://gist.github.com/qubot.
Step 9: Systems administrator updates am-packbuild and upgrades public and private demo sites.
- See the internal wiki for steps.
Step 10: Developer, Ops, or Analyst creates a release branch and a release of the Archivematica Automated Acceptance Tests (AMAUAT, https://github.com/artefactual-labs/archivematica-acceptance-tests) in line with the Archivematica versioning, e.g. for Archivematica 1.10, create a 1.10 branch and release of AMAUAT.
Step 11: Release Captain finalizes the release notes and adds a link to them in the GitHub release. Make sure that the releases are marked as "published".
- https://github.com/artefactual/archivematica/releases
- https://github.com/artefactual/archivematica-storage-service/releases
Step 12: Release Captain posts a notification to the Archivematica Google Group and the News section of the Artefactual website.
Step 13: Release Captain closes all release-related issues.
Step 14: All involved eat cake (for a major release) or cupcakes (for a minor release).
Post-release cleanup: remove any temporary VMs created for testing.