Release Process

From Archivematica
Latest revision as of 18:25, 10 March 2024

This is an outline/checklist of the process to create Archivematica & Storage Service releases.

Overview

  1. Merge new features
  2. Test new features
  3. Update PRONOM
  4. Write documentation
  5. Update dependencies
  6. Update version
  7. Build deb/rpm packages
  8. Test packages for new installs and upgrades
  9. Tag Release
  10. Update ansible roles
  11. Announce release

Translations

Needs to be improved!

  • Determine code freeze / call for translations process
  • Describe processes: push and pull - and when it needs to happen
  • We made a choice on how we're using Transifex to keep things simple: only one branch at a time pushed to Transifex. E.g. once SS 0.10.0 is released we have to decide if:
    1. We move Transifex to stable/0.10.x for a while so we can work on a minor release with translation fixes (e.g. 0.10.1), or
    2. We move to qa/0.11.x, which would make it possible to bring new translations only to SS 0.11.0.
  • Affected repositories
    • archivematica-storage-service
    • archivematica-workflow
    • archivematica-dashboard
      • Includes archivematica-fpr-admin
      • Includes appraisal-tab

Update PRONOM

PRONOM needs to be updated in our file identification tools, FIDO & Siegfried, as well as in the FPR.

Update FIDO

The FPR update currently uses FIDO as a source for new PRONOM data, since FIDO's format files are formatted more conveniently than what PRONOM itself offers, so we depend on FIDO having updated its PRONOM files. If that has not happened, we can generate a new formats-v##.xml by updating the signatures manually. Artefactual can also update PRONOM and submit a PR to FIDO.

  1. Checkout fido from https://github.com/openpreserve/fido
  2. Update signatures
    • Run python setup.py install
    • Run python -m fido.update_signatures from the fido repository root
  3. Add:
    • New signature file fido/conf/DROID_SignatureFile-v##.xml
    • New formats file fido/conf/formats-v##.xml
    • New PRONOM zip file fido/conf/pronom-xml-v##.zip
    • Updated fido/conf/versions.xml
  4. Remove:
    • Old signature file
    • Old formats file
    • Old PRONOM zip file
  5. Replace:
    • the container signatures (download here: https://www.nationalarchives.gov.uk/aboutapps/pronom/droid-signature-files.htm)
    • the reference path to the container signatures in fido/fido.py
  6. Update version:
    • Update in __init__.py
    • Update in versions.xml
  7. Create pull request.
  8. Release new version of FIDO.


Remember to package FIDO and Siegfried in Archivematica! See next step below.

Package FIDO

FIDO is packaged via PyPI under opf-fido. The Makefile includes a `make package` command that will do the update.

MCPClient's base.txt (https://github.com/artefactual/archivematica/blob/qa/1.x/src/MCPClient/requirements/base.txt) will need to be updated with the latest version.

Package Siegfried

When a new Siegfried version becomes available, clone the am-packbuild repo, check out the qa/1.x branch, update the Makefiles available at rpms/EL9/siegfried/Makefile and debs/siegfried/Makefile, and run make in each directory to build the packages.

Update FPR

Examples

  • In https://github.com/artefactual/archivematica-fpr-admin/pull/51, the devtools used here to generate the migration were included in archivematica-fpr-admin as django-admin management tools.
  • At the end of the process: remember to remove "pk"s from the final migration fixture! E.g. see PR #55
  • Examples
    • PRONOM 92:
      • https://github.com/artefactual/archivematica-fpr-admin/pull/63
      • ???
    • PRONOM 90:
      • https://github.com/artefactual/archivematica-fpr-admin/pull/51
      • https://github.com/artefactual/archivematica-fpr-admin/pull/55

There used to be a bug in the imports model. To fix it, remove the apps.get_model lines and import the models directly with from fpr.models import Format, FormatVersion, IDRule. You shouldn't have to do this now.

Update workflow

This depends on FIDO having updated its PRONOM files. See Update FIDO above.


  1. Generate a JSON with the current version of the FPR (for use later)
    • python src/dashboard/src/manage.py dumpdata -o tmp/fpr-current.json fpr
  2. Make a new migration (you can copy from a previous one) and update it accordingly
    • E.g. cp src/dashboard/src/fpr/migrations/0022_pronom_94.py src/dashboard/src/fpr/migrations/0032_pronom_96.py
  3. Generate the FPR migration body. This also updates the local database's FPR with the new PRONOM IDs
    • E.g. python src/dashboard/src/manage.py import_pronom_ids path/to/fido/fido/conf/formats-v96.xml --output-filename pronom96.txt
  4. Copy the output into the blank migration above the Migration class. (Note: This is temporary, to create the data inside the FPR for the analyst steps below)
  5. Make sure the below RunPython operation is in the Migration class, in the operations list:
    • migrations.RunPython(data_migration),
  6. Deploy on testing pipeline or locally


(Analyst work)

  1. Update the new entries. Edit ONLY entries added by the latest PRONOM update, otherwise the fixture won't work properly!
    • Move new formats to the most appropriate category
    • Create rules & commands
    • Test with data for new formats

(End Analyst work)


  1. Generate a JSON with the updated version of the FPR on the testing pipeline
    • python src/dashboard/src/manage.py dumpdata -o tmp/fpr-updated.json fpr
  2. Get the updates as JSON
    • E.g. python src/dashboard/src/manage.py get_fpr_changes fpr-current.json fpr-updated.json pronom_96.json
  3. Update the migration to load the JSON updates (see previous migrations)
  4. Review the JSON -- some IDs with multiple formats are being imported and will have to be manually reviewed until the bug is identified/corrected.
    • Remove any direct imports from the bug
    • Remove the pk's from the entries in the JSON document.
    • Improvement Note: Because this is using loaddata, this will have problems if the FPR models are changed. A possible solution is to update get_fpr_changes to generate a migration instead of JSON.
  5. Rebuild and test migration
  6. Commit, send PR, merge
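Conceptually, the dumpdata/get_fpr_changes steps above diff the "current" and "updated" dumps and strip the "pk"s, as the warnings above require. A minimal sketch in plain Python (fpr_changes and the sample entries are illustrative, not the real management command):

```python
import json

def fpr_changes(current, updated):
    """Return entries that are new in `updated` relative to `current`,
    with their "pk" keys removed so the fixture does not pin primary keys."""
    seen = {(entry["model"], entry["pk"]) for entry in current}
    changes = []
    for entry in updated:
        if (entry["model"], entry["pk"]) not in seen:
            entry = dict(entry)    # copy before mutating
            entry.pop("pk", None)  # the step above: no pk's in the fixture
            changes.append(entry)
    return changes

# Tiny illustration with made-up FPR entries:
current = [{"model": "fpr.formatversion", "pk": 1, "fields": {"description": "PDF"}}]
updated = current + [{"model": "fpr.formatversion", "pk": 2, "fields": {"description": "new format"}}]
print(json.dumps(fpr_changes(current, updated)))
```

Only the new entry survives, without its "pk", which mirrors why editing pre-existing entries on the testing pipeline would break the fixture.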

Finally, update IDTools versions in the FPR

  1. The FPR needs a migration that points to the latest versions of Fido and Siegfried and disables the previous version of Siegfried (Fido is disabled by default; as of 1.9, there can only be one enabled identification tool). See this PR for an example of a functional migration, and heed the messy commits as a warning: https://github.com/artefactual/archivematica/pull/1547/files
  2. Testing the above migration can be done by running make bootstrap-dashboard-db to recreate the dashboard and run all associated migrations.

Update dependencies

Python Packages

metsrw and agentarchives both have Makefiles that handle most of the packaging.

  1. Check for open PRs, merge as necessary
  2. Update setup.py with the new version, create a pull request, code review, merge.
  3. Tag new release, push tag
    • git push --tags
  4. Run make package
    • This will build the package and upload it to PyPI. It will prompt for your PyPI username and password for the upload.
    • The Makefile was created from instructions at https://packaging.python.org/distributing/
  5. make clean will delete packaging-related files

Update version

  1. Update PREMIS agent to Archivematica-X.X.X
  2. Update Dashboard-Administration-Version to X.X.X
  3. Update Storage Service-Admin-Version to X.X.X

Build deb/rpm packages

The am-packbuild repository has all the code related to building packages, except the GPG signing keys. The steps to follow in order to build production packages are as follows:

Debian packages

  1. Clone the am-packbuild repo. Latest work is available in master
  2. Put your gpg private key into debs/GPG-KEYS-REPOS. That's the place the Dockerfile looks for it when building the environment.
  3. Update the makefile at debs/archivematica/Makefile in order to reflect version/keys you want to use.
  4. Run make, and the packages will be available in the build directory once the build finishes.
  5. Upload the packages to the public Debian repository

Debian repositories

There are two Debian repositories, one for archivematica packages, and one for dependencies. The procedure to create new ones, or to upload packages to them, is the same:

    • Create a folder for the repo, and a configuration file:

      mkdir -p /path/to/repos/repo/conf
      cat > /path/to/repos/repo/conf/distributions << EOF
      Codename: trusty
      Components: main
      Architectures: amd64 source
      SignWith: <short gpg keyid>
      EOF

    • Go inside the repo, and import the previously uploaded packages with:

      cd /path/to/repos/repo/
      reprepro includedeb trusty /path/to/packages/*.deb
      reprepro includedsc trusty /path/to/packages/*.dsc

The current official repo is at packages.archivematica.org

RPM Packages

  1. Package specs are available in am-packbuild/rpms
  2. There are vars in the Makefiles for version/release, so update them when needed
  3. To build them, just go into the directory you want to build, and run "make"

RPM Repositories

Once the packages are built, upload them to packages.archivematica.org/<version>/centos.

Go inside that dir, and as user ohbot run:

  • rpm --addsign *.rpm (already signed packages will be skipped)
  • createrepo . (for packages other than archivematica, use the "centos-extras" repository)
  • gpg --detach-sign --armor repodata/repomd.xml

The rpm --addsign command signs the RPMs, and the gpg --detach-sign command signs the repository metadata.

Development stage

In the final stages of development, the repositories for the new releases are created, but packages are signed with a development key to avoid mistakes. Once the development stage finishes, all new packages need to be rebuilt using the production keys.

Development packages are built on each new commit to stable branches by Jenkins. Repositories are available at http://jenkins-ci.archivematica.org/repos/

Website

Needs to be improved!

Homepage

  • Make changes in archivematica-web.git
    • Update links
    • Add new doc repos
  • Deploy
    • Log in to sites-pub as archivematica-web and run update-web.sh

Documentation

  • Deploy
    • Log in to sites-pub as archivematica-web and run update-docs.sh

Wiki

  • Release notes
  • Installation notes
  • ...

News

  • Twitter
  • Mailing list
  • News section in artefactual.com

Update ansible roles

Check that the deploy-pub vars files for archivematica are updated, and that the ansible-archivematica-src and ansible-archivematica-pkg roles are able to deploy the new version.

Tag Release

  1. Add release tags to repositories
    • Archivematica
    • Storage Service
    • FPR-admin
    • appraisal tab
    • transfer browser
    • Others?
  2. Create stable/#.x branch
  3. Delete qa/#.x branch if necessary

Release Day Checklist

Step 0: Operations Team checklist tasks are complete (as of September 2018, this is an internal Trello list).

  • At this point, there should be a release candidate that has been tested. A release candidate always precedes a tagged release. If any new issues have been uncovered, a new release candidate should be made and tested before proceeding with the rest of the steps.

Step 1: Release Captain decides whether the current release candidate is ready to release

  • Look at all recently filed issues in GitHub -- is anything concerning / relevant for this release? Does anything need to be addressed? If an issue is uncovered that does need to be fixed, this restarts the checklist back to Step 0.
  • Have automated tests passed?

Step 2: Release Captain creates a new tag for the release via GitHub (e.g. `v1.7.0` or `v0.11.0` for the Storage Service) or assigns someone else to do it.

  • Ensure you are adding the tag to the right commit! It should match the last commit of the final release candidate.
  • You can create the tags from GitHub or from the CLI: git tag $VERSION $REVISION and git push origin refs/tags/$VERSION
  • Make sure that the version is valid:
    • Valid values: v1.8.1, v1.8.1-rc.1
    • Invalid values: 1.8.1, 1.8, 1.8.1-rc1, v1.8.1-rc1
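The valid/invalid examples above follow a simple pattern; a small pre-push check can catch a malformed tag before it is created. This is a sketch whose regex is inferred from the examples, not from an official policy:

```python
import re

# Inferred from the examples above: a leading "v", three numeric components,
# and an optional "-rc.<n>" suffix with a dot between "rc" and the number.
TAG_RE = re.compile(r"^v\d+\.\d+\.\d+(?:-rc\.\d+)?$")

def is_valid_tag(tag: str) -> bool:
    """Return True for tags like v1.8.1 or v1.8.1-rc.1."""
    return bool(TAG_RE.match(tag))

for tag in ("v1.8.1", "v1.8.1-rc.1", "1.8.1", "1.8", "1.8.1-rc1", "v1.8.1-rc1"):
    print(tag, is_valid_tag(tag))
```

Run against the lists above, only v1.8.1 and v1.8.1-rc.1 pass; the missing "v", the two-component version, and the dotless "-rc1" variants are all rejected.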

Step 3: Sysadmin builds new packages using the release tag.

  • See the internal wiki for steps.

Step 4: Sysadmin copies new packages to the proper repository (e.g., https://packages.archivematica.org/1.7.x/)

  • See the internal wiki for steps.

Step 5: Analyst updates the Archivematica documentation links in the install / upgrade section with the correct package names and locations (e.g. conf.py in docs repo)

Step 6: Sysadmin updates deploy-pub to use the new links.

Step 7: Developer updates archivematica-web (managed in Gitolite / GitLab) to show the new release.

  • See the internal wiki for steps.

Step 8: Developer changes the default branches in GitHub and GitLab and updates references in https://gist.github.com/qubot.

Step 9: Systems administrator updates am-packbuild and upgrades public and private demo sites.

  • See the internal wiki for steps.

Step 10: Developer, Ops, or Analyst creates a release branch and a release of the Archivematica Automated Acceptance Tests (AMAUAT) in line with the Archivematica versioning, e.g. for the Archivematica 1.10 release, a 1.10 branch and release of AMAUAT.

Step 11: Release Captain finalizes the release notes and adds a link to them in the GitHub release. Make sure that the releases are marked as "published".

Step 12: Release Captain posts a notification to the Archivematica Google Group and the News section of the Artefactual website.

Step 13: Release Captain closes all release-related issues.

Step 14: All involved eat cake (for a major release) or cupcakes (for a minor release).

Post-release cleanup: remove any temporary VMs created for testing.