diff --git a/.aptly-bin/LICENSE b/.aptly-bin/LICENSE
deleted file mode 100644
index ba83ce3..0000000
--- a/.aptly-bin/LICENSE
+++ /dev/null
@@ -1,21 +0,0 @@
-Copyright 2013-2015 aptly authors. All rights reserved.
-
-MIT License
-
-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in
-all copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR
-OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
-FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
-IN THE SOFTWARE.
\ No newline at end of file
diff --git a/.aptly-bin/README.rst b/.aptly-bin/README.rst
deleted file mode 100644
index c5c7b07..0000000
--- a/.aptly-bin/README.rst
+++ /dev/null
@@ -1,106 +0,0 @@
-=====
-aptly
-=====
-
-.. image:: https://travis-ci.org/smira/aptly.png?branch=master
- :target: https://travis-ci.org/smira/aptly
-
-.. image:: https://coveralls.io/repos/smira/aptly/badge.png?branch=HEAD
- :target: https://coveralls.io/r/smira/aptly?branch=HEAD
-
-.. image:: https://badges.gitter.im/Join Chat.svg
- :target: https://gitter.im/smira/aptly?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge
-
-.. image:: http://goreportcard.com/badge/gojp/goreportcard
- :target: http://goreportcard.com/report/gojp/goreportcard
-
-Aptly is a swiss army knife for Debian repository management.
-
-.. image:: http://www.aptly.info/img/aptly_logo.png
- :target: http://www.aptly.info/
-
-Documentation is available at `http://www.aptly.info/ `_. For support use
-mailing list `aptly-discuss `_.
-
-Aptly features: ("+" means planned features)
-
-* make mirrors of remote Debian/Ubuntu repositories, limiting by components/architectures
-* take snapshots of mirrors at any point in time, fixing state of repository at some moment of time
-* publish snapshot as Debian repository, ready to be consumed by apt
-* controlled update of one or more packages in snapshot from upstream mirror, tracking dependencies
-* merge two or more snapshots into one
-* filter repository by search query, pulling dependencies when required
-* publish self-made packages as Debian repositories
-* REST API for remote access
-* mirror repositories "as-is" (without resigning with user's key) (+)
-* support for yum repositories (+)
-
-Current limitations:
-
-* translations are not supported yet
-
-Download
---------
-
-To install aptly on Debian/Ubuntu, add new repository to /etc/apt/sources.list::
-
- deb http://repo.aptly.info/ squeeze main
-
-And import key that is used to sign the release::
-
- $ apt-key adv --keyserver keys.gnupg.net --recv-keys E083A3782A194991
-
-After that you can install aptly as any other software package::
-
- $ apt-get update
- $ apt-get install aptly
-
-Don't worry about squeeze part in repo name: aptly package should work on Debian squeeze+,
-Ubuntu 10.0+. Package contains aptly binary, man page and bash completion.
-
-If you would like to use nightly builds (unstable), please use following repository::
-
- deb http://repo.aptly.info/ nightly main
-
-Binary executables (depends almost only on libc) are available for download from `Bintray `_.
-
-If you have Go environment set up, you can build aptly from source by running (go 1.4+ required)::
-
- go get -u github.com/mattn/gom
- mkdir -p $GOPATH/src/github.com/smira/aptly
- git clone https://github.com/smira/aptly $GOPATH/src/github.com/smira/aptly
- cd $GOPATH/src/github.com/smira/aptly
- gom -production install
- gom build -o $GOPATH/bin/aptly
-
-Aptly is using `gom `_ to fix external dependencies, so regular ``go get github.com/smira/aptly``
-should work as well, but might fail or produce different result (if external libraries got updated).
-
-If you don't have Go installed (or older version), you can easily install Go using `gvm `_.
-
-Integrations
-------------
-
-Vagrant:
-
-- `Vagrant configuration `_ by
- Zane Williamson, allowing to bring two virtual servers, one with aptly installed
- and another one set up to install packages from repository published by aptly
-
-Docker:
-
-- `Docker container `_ with aptly inside by Mike Purvis
-
-With configuration management systems:
-
-- `Chef cookbook `_ by Aaron Baer
- (Heavy Water Operations, LLC)
-- `Puppet module `_ by
- Government Digital Services
-- `SaltStack Formula `_ by
- Forrest Alvarez and Brian Jackson
-- `Ansible role `_ by Tom Paine
-
-CLI for aptly API:
-
-- `Ruby aptly CLI/library `_ by Zane Williamson
diff --git a/.editorconfig b/.editorconfig
deleted file mode 100644
index f41d58e..0000000
--- a/.editorconfig
+++ /dev/null
@@ -1,10 +0,0 @@
-[*.yml]
-indent_style = space
-indent_size = 2
-
-[*.py]
-indent_style = space
-indent_size = 4
-
-[Makefile]
-indent_style = tab
diff --git a/.flake8 b/.flake8
deleted file mode 100644
index 47ae620..0000000
--- a/.flake8
+++ /dev/null
@@ -1,3 +0,0 @@
-[flake8]
-ignore = E203,E221,E222,E241,E251,E272
-jobs=auto
diff --git a/.github/workflows/main.yml b/.github/workflows/main.yml
new file mode 100644
index 0000000..d86b60c
--- /dev/null
+++ b/.github/workflows/main.yml
@@ -0,0 +1,63 @@
+name: Main branch
+
+on:
+ push:
+ branches:
+ - main
+ schedule:
+ - cron: "0 0 * * 0"
+
+jobs:
+ cache:
+ runs-on: ubuntu-latest
+
+ steps:
+ - name: Checkout code
+ uses: actions/checkout@v4
+
+ - name: Login to GitHub Container Registry
+ uses: docker/login-action@v3
+ with:
+ registry: ghcr.io
+ username: ${{ github.actor }}
+ password: ${{ secrets.GITHUB_TOKEN }}
+
+ - name: Set up Docker Buildx
+ uses: docker/setup-buildx-action@v3
+ with:
+ buildkitd-flags: --debug
+
+ - name: Build container
+ uses: docker/build-push-action@v5
+ with:
+ context: compose
+ push: true
+ tags: ghcr.io/adfinis/pyaptly/cache:latest
+ cache-from: type=registry,ref=ghcr.io/adfinis/pyaptly/cache:gha
+ cache-to: type=registry,ref=ghcr.io/adfinis/pyaptly/cache:gha,mode=max
+
+ test:
+ runs-on: ubuntu-latest
+ needs: [cache]
+
+ steps:
+ - name: Checkout code
+        uses: actions/checkout@v4
+
+ - name: Set up Docker Buildx
+ uses: docker/setup-buildx-action@v3
+ with:
+ buildkitd-flags: --debug
+
+ - name: Build container
+ uses: docker/build-push-action@v5
+ with:
+ context: compose
+ push: false
+ load: true
+ tags: ghcr.io/adfinis/pyaptly/cache:latest
+ cache-from: type=registry,ref=ghcr.io/adfinis/pyaptly/cache:gha
+
+ - name: Run tests
+ run: |
+ echo testing
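For reference, the `cache` job boils down to one buildx invocation that both reads and refreshes the registry build cache. A minimal sketch of the equivalent command (assuming a local docker CLI with buildx; image refs copied from the YAML above):

```python
# Rough equivalent of the cache job: build the compose/ context, seeding from
# and refreshing the registry build cache, then push the resulting image.
import subprocess

CACHE = "ghcr.io/adfinis/pyaptly/cache"

subprocess.run(
    [
        "docker", "buildx", "build", "compose",
        "--tag", f"{CACHE}:latest",
        "--cache-from", f"type=registry,ref={CACHE}:gha",
        "--cache-to", f"type=registry,ref={CACHE}:gha,mode=max",
        "--push",  # the test jobs use --load instead and skip --cache-to
    ],
    check=True,
)
```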
diff --git a/.github/workflows/pull-requests.yml b/.github/workflows/pull-requests.yml
new file mode 100644
index 0000000..ea8e80a
--- /dev/null
+++ b/.github/workflows/pull-requests.yml
@@ -0,0 +1,32 @@
+name: Pull requests
+
+on:
+ pull_request:
+ branches:
+ - main
+
+jobs:
+ test:
+ runs-on: ubuntu-latest
+
+ steps:
+ - name: Checkout code
+        uses: actions/checkout@v4
+
+ - name: Set up Docker Buildx
+ uses: docker/setup-buildx-action@v3
+ with:
+ buildkitd-flags: --debug
+
+ - name: Build container
+ uses: docker/build-push-action@v5
+ with:
+ context: compose
+ push: false
+ load: true
+ tags: ghcr.io/adfinis/pyaptly/cache:latest
+ cache-from: type=registry,ref=ghcr.io/adfinis/pyaptly/cache:gha
+
+ - name: Run tests
+ run: |
+ echo testing
diff --git a/.gitignore b/.gitignore
index 20d8971..a1fa701 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,18 +1,2 @@
-/pyaptly.egg-info
-*.swp
-*.pyc
-.python-version
__pycache__
-/.cache
/.hypothesis
-/.coverage
-/.vagrant
-/build
-/doc/_build
-.hypothesis
-/.deps
-/.gnupg
-/.aptly.conf
-/.aptly
-/.local
-.work
diff --git a/.gitlab-ci.yml b/.gitlab-ci.yml
deleted file mode 100644
index c0b7e5a..0000000
--- a/.gitlab-ci.yml
+++ /dev/null
@@ -1,28 +0,0 @@
-before_script:
- - source /etc/profile
- - GIT_SSL_NO_VERIFY=true git submodule update --init --recursive
-
-stages:
- - doc
- - test
-
-doc:
- stage: doc
- script:
- - pyenv local 3.5.1
- - make install
- - make .deps/pytest .deps/hypothesis .deps/freeze .deps/testfixtures
- - make doc
- - rsync -av --delete doc/_build/html/ doc-sync@docs.adfinis-sygroup.ch:/var/www/html/public/pyaptly/
-
-test27:
- stage: test
- script:
- - pyenv local 2.7.11
- - make test-local
-
-test3:
- stage: test
- script:
- - pyenv local 3.5.1
- - make test-local
diff --git a/.gitmodules b/.gitmodules
deleted file mode 100644
index 39d8ab0..0000000
--- a/.gitmodules
+++ /dev/null
@@ -1,9 +0,0 @@
-[submodule "pyproject"]
- path = pyproject
- url = https://github.com/adfinis-sygroup/pyproject.git
-[submodule "vagrant/libfaketime"]
- path = vagrant/libfaketime
- url = https://github.com/wolfcw/libfaketime.git
-[submodule "doc/adsy-sphinx-template.src"]
- path = doc/adsy-sphinx-template.src
- url = https://github.com/adfinis-sygroup/adsy-sphinx-template.git
diff --git a/.requirements.txt b/.requirements.txt
deleted file mode 100644
index 31c1453..0000000
--- a/.requirements.txt
+++ /dev/null
@@ -1,3 +0,0 @@
-mock
-freezegun
-pytz
diff --git a/.travis.yml b/.travis.yml
deleted file mode 100644
index c38abba..0000000
--- a/.travis.yml
+++ /dev/null
@@ -1,11 +0,0 @@
-language: python
-env:
- - HYPOTHESIS_PROFILE=ci
-python:
- - "2.6"
- - "2.7"
- - "3.4"
- - "3.5"
- - "pypy"
-install: "pip install -r .requirements.txt"
-script: make test-local
diff --git a/HOW_TO_RELEASE.rst b/HOW_TO_RELEASE.rst
deleted file mode 120000
index f03e166..0000000
--- a/HOW_TO_RELEASE.rst
+++ /dev/null
@@ -1 +0,0 @@
-pyproject/HOW_TO_RELEASE.rst
\ No newline at end of file
diff --git a/LICENSE b/LICENSE
index dba13ed..be3f7b2 100644
--- a/LICENSE
+++ b/LICENSE
@@ -1,7 +1,7 @@
GNU AFFERO GENERAL PUBLIC LICENSE
Version 3, 19 November 2007
- Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
+ Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
@@ -643,7 +643,7 @@ the "copyright" line and a pointer to where the full notice is found.
GNU Affero General Public License for more details.
You should have received a copy of the GNU Affero General Public License
- along with this program. If not, see <http://www.gnu.org/licenses/>.
+ along with this program. If not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
@@ -658,4 +658,4 @@ specific requirements.
You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU AGPL, see
-<http://www.gnu.org/licenses/>.
+<https://www.gnu.org/licenses/>.
diff --git a/Makefile b/Makefile
index 4ced522..740762c 100644
--- a/Makefile
+++ b/Makefile
@@ -1,51 +1,53 @@
-.PHONY: webserver
-PROJECT := pyaptly
-GIT_HUB := "https://github.com/adfinis-sygroup/pyaptly"
-
-include pyproject/Makefile
-
-PYTHON26 := $(shell echo $(PYTHON_VERSION) | grep -Eq 2.6 && echo True 2> /dev/null)
-
-# not all comprehensions are supported in 2.6 therefore
-# need to disable linter for such
-DEVNULL := $(shell touch .deps/flake8_comprehensions)
-
-ifeq ($(PYTHON26),True)
- # disable installation of hypothesis on python version <2.7
- DEVNULL := $(shell touch .deps/hypothesis .deps/hypothesispytest)
-endif
-
-test-local:
- source testenv; \
- make webserver && \
- make test
-
-.gnupg:
- bash -c '[[ "$$HOME" == *"pyaptly"* ]]'
- gpg -k
- gpg --batch --import < vagrant/key.pub
- gpg --batch --import < vagrant/key.sec
- gpg --batch --no-default-keyring --keyring trustedkeys.gpg --import < vagrant/key.pub
- cat vagrant/*.key | gpg --batch --no-default-keyring --keyring trustedkeys.gpg --import
- gpg -k
-
-.aptly:
- aptly repo create -architectures="amd64" fakerepo01
- aptly repo create -architectures="amd64" fakerepo02
- aptly repo add fakerepo01 vagrant/*.deb
- aptly repo add fakerepo02 vagrant/*.deb
-
-.aptly/public: .aptly .gnupg
- aptly publish repo -gpg-key="650FE755" -distribution="main" fakerepo01 fakerepo01; true
- aptly publish repo -gpg-key="650FE755" -distribution="main" fakerepo02 fakerepo02; true
- touch .aptly/public
-
-webserver: .aptly/public
- pkill -f -x "python -m SimpleHTTPServer 8421"; true
- pkill -f -x "python -m http.server 8421"; true
- cd .aptly/public && python -m SimpleHTTPServer 8421 > /dev/null 2> /dev/null &
- cd .aptly/public && python -m http.server 8421 > /dev/null 2> /dev/null &
-
-remote-test:
- vagrant up
- vagrant ssh -c "cd /vagrant && make test"
+.DEFAULT_GOAL := help
+
+CACHE_IMG = "ghcr.io/adfinis/pyaptly/cache:latest"
+
+DOCKER_BUILDKIT = 1
+export DOCKER_BUILDKIT
+
+# Help target extracts the double-hash comments from the targets and shows them
+# in a convenient way. This allows us to easily document the user-facing Make
+# targets
+.PHONY: help
+help:
+ @grep -E '^[a-zA-Z0-9_-]+:.*?## .*$$' $(MAKEFILE_LIST) | sort -k 1,1 | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-30s\033[0m %s\n", $$1, $$2}'
+
+.PHONY: build
+build: ## Build the container in case you made changes
+ @docker compose build
+
+.PHONY: up
+up: ## start the container (cached)
+ @docker compose up -d
+
+.PHONY: push
+push: ## push docker build cache to registry
+ @docker push $(CACHE_IMG)
+
+.PHONY: down
+down: ## stop and remove container
+ @docker compose down -v
+
+.PHONY: recreate
+recreate: down up ## recreate container
+
+.PHONY: wait-for-ready
+wait-for-ready: up ## wait for web-server to be ready for testing
+ @docker compose exec testing wait-for-it -t 0 127.0.0.1:3123
+ @docker compose exec testing wait-for-it -t 0 127.0.0.1:8080
+
+.PHONY: poetry-install
+poetry-install: wait-for-ready ## install dev environment
+ @docker compose exec testing poetry install
+
+.PHONY: test
+test: poetry-install ## run pytest
+ @docker compose exec testing poetry run pytest
+
+.PHONY: shell
+shell: poetry-install ## run shell
+ @docker compose exec testing bash -c "SHELL=bash poetry shell"
+
+.PHONY: entr
+entr: poetry-install ## run entr
+ @docker compose exec testing bash -c "find -name '*.py' | SHELL=bash poetry run entr bash -c 'pytest -x --lf'"
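The `help` target is pure convention: any target annotated with a trailing `## comment` shows up in the listing. A small standalone Python sketch of the same extraction as the grep/awk pipeline above, for illustration:

```python
# List `target: ## description` pairs from a Makefile, sorted by target name.
import re
from pathlib import Path

PATTERN = re.compile(r"^([a-zA-Z0-9_-]+):.*?## (.*)$")

rows = []
for line in Path("Makefile").read_text().splitlines():
    if match := PATTERN.match(line):
        rows.append(match.groups())

for target, doc in sorted(rows):  # mirrors `sort -k 1,1`
    print(f"{target:<30} {doc}")
```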
diff --git a/doc/_static/.gitkeep b/README.md
similarity index 100%
rename from doc/_static/.gitkeep
rename to README.md
diff --git a/TODO b/TODO
index 0d545dd..f9ffe6d 100644
--- a/TODO
+++ b/TODO
@@ -1,6 +1,9 @@
-TODO
-====
+# Update old files
-* Cleanup of rotated snapshots after snapshot-update
-* rotated snapshots should be identified by configuration option, not just by
- "not being timestamped
+The following files have been copied over without checking whether their content is still accurate:
+
+- CHANGELOG
+- CHANGELOG.rst
+- README.rst
+
+They were copied verbatim to preserve the continuity of the git history.
\ No newline at end of file
diff --git a/Vagrantfile b/Vagrantfile
deleted file mode 100644
index 2232290..0000000
--- a/Vagrantfile
+++ /dev/null
@@ -1,126 +0,0 @@
-# -*- mode: ruby -*-
-# vi: set ft=ruby :
-
-# All Vagrant configuration is done below. The "2" in Vagrant.configure
-# configures the configuration version (we support older styles for
-# backwards compatibility). Please don't change it unless you know what
-# you're doing.
-Vagrant.configure(2) do |config|
- # The most common configuration options are documented and commented below.
- # For a complete reference, please see the online documentation at
- # https://docs.vagrantup.com.
-
- # Every Vagrant development environment requires a box. You can search for
- # boxes at https://atlas.hashicorp.com/search.
- config.vm.box = "adsy-centos-6.5.box"
- config.vm.box_url = "https://pkg.adfinis-sygroup.ch/vagrant/adsy-centos-6.5.box"
- config.vm.box_download_checksum = "a0f2cc25560495cd927da103659a59d69b2e4f1bf032ee67f35e8ea1b1c88a80"
- config.vm.box_download_checksum_type = "sha256"
- begin
- if Vagrant.plugin("2").manager.config.has_key? :vbguest then
- config.vbguest.auto_update = false
- end
- rescue
- end
- if ! File.exists?(".vagrant/machines/default/virtualbox/id")
- # Then this machine is brannd new.
- system "rm -rf pyaptly.egg-info/"
- end
-
- config.ssh.forward_agent = true
-
- # Disable automatic box update checking. If you disable this, then
- # boxes will only be checked for updates when the user runs
- # `vagrant box outdated`. This is not recommended.
- # config.vm.box_check_update = false
-
- # Create a forwarded port mapping which allows access to a specific port
- # within the machine from a port on the host machine. In the example below,
- # accessing "localhost:8080" will access port 80 on the guest machine.
- # config.vm.network "forwarded_port", guest: 80, host: 8080
-
- # Create a private network, which allows host-only access to the machine
- # using a specific IP.
- # config.vm.network "private_network", ip: "192.168.33.10"
-
- # Create a public network, which generally matched to bridged network.
- # Bridged networks make the machine appear as another physical device on
- # your network.
- # config.vm.network "public_network"
-
- # Share an additional folder to the guest VM. The first argument is
- # the path on the host to the actual folder. The second argument is
- # the path on the guest to mount the folder. And the optional third
- # argument is a set of non-required options.
- # config.vm.synced_folder "../data", "/vagrant_data"
-
- # Provider-specific configuration so you can fine-tune various
- # backing providers for Vagrant. These expose provider-specific options.
- # Example for VirtualBox:
- #
- config.vm.provider "virtualbox" do |vb|
- # Display the VirtualBox GUI when booting the machine
- # vb.gui = true
-
- # Customize the amount of memory on the VM:
- vb.memory = "512"
- vb.customize ["modifyvm", :id, "--natdnshostresolver1", "on"]
- end
- #
- # View the documentation for the provider you are using for more
- # information on available options.
-
- # Define a Vagrant Push strategy for pushing to Atlas. Other push strategies
- # such as FTP and Heroku are also available. See the documentation at
- # https://docs.vagrantup.com/v2/push/atlas.html for more information.
- # config.push.define "atlas" do |push|
- # push.app = "YOUR_ATLAS_USERNAME/YOUR_APPLICATION_NAME"
- # end
-
- # Enable provisioning with a shell script. Additional provisioners such as
- # Puppet, Chef, Ansible, Salt, and Docker are also available. Please see the
- # documentation for more information about their specific syntax and use.
- config.vm.provision "shell", inline: <<-SHELL
- set -e
- yum -y install wget rsync
- cd /usr/local/bin
- wget -q https://dl.bintray.com/smira/aptly/0.9.5/centos-6.5-x64/aptly
- sha256sum -c <> /home/vagrant/.bashrc"
- true
- SHELL
-end
diff --git a/compose/Dockerfile b/compose/Dockerfile
new file mode 100644
index 0000000..bf18a8d
--- /dev/null
+++ b/compose/Dockerfile
@@ -0,0 +1,36 @@
+FROM debian:bookworm-slim
+
+RUN apt-get update && apt-get install -y --no-install-recommends \
+ python3-toml \
+ python3-requests \
+ python3-poetry \
+ gnupg \
+ bzip2 \
+ tini \
+ curl \
+ wait-for-it \
+ entr \
+ gnutls-bin \
+ nettle-dev \
+ gcc \
+ llvm-dev \
+ libclang-dev \
+ build-essential \
+ pkg-config \
+ gettext \
+ git \
+ procps \
+ psmisc \
+ vim-tiny \
+ cargo \
+ && rm -rf /var/lib/apt/lists/* \
+ && apt-get clean \
+ && apt-get autoclean \
+ && rm -rf /var/lib/apt/archives/* \
+ && rm -rf /var/cache/apt/*
+ADD setup /setup
+RUN /setup/setup
+ADD run /setup/run
+WORKDIR /source
+ENTRYPOINT ["/usr/bin/tini", "--"]
+CMD ["bash", "/setup/run"]
\ No newline at end of file
diff --git a/compose/compose-norecommend b/compose/compose-norecommend
new file mode 100644
index 0000000..afede6f
--- /dev/null
+++ b/compose/compose-norecommend
@@ -0,0 +1,11 @@
+APT::Install-Recommends "0";
+APT::Install-Suggests "0";
+Dpkg::Options {
+ "--exclude=/usr/share/doc";
+ "--exclude=/usr/share/man";
+ "--exclude=/usr/share/groff";
+ "--exclude=/usr/share/info";
+ "--exclude=/usr/share/lintian";
+ "--exclude=/usr/share/linda";
+ "--exclude=/var/cache/man";
+};
diff --git a/compose/run b/compose/run
new file mode 100755
index 0000000..2384b2b
--- /dev/null
+++ b/compose/run
@@ -0,0 +1,25 @@
+#!/bin/bash
+
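+# unique suffix baked into the process names (via exec -a below) so cleanup() can pkill exactly these two servers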
+TAG=3a9c40c25b2f
+
+cleanup() {
+ pkill -f http-$TAG
+ pkill -f hagrid-$TAG
+}
+
+(
+cd /setup/aptly/public
+exec -a http-$TAG python3 -m http.server 3123
+) &
+
+(
+cd /setup/hagrid
+exec -a hagrid-$TAG /root/.cargo/bin/hagrid
+) &
+
+trap cleanup SIGTERM SIGINT
+
+wait-for-it -t 0 127.0.0.1:8080
+cat /setup/test01.pub | curl -T - http://127.0.0.1:8080
+
+wait
\ No newline at end of file
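The `exec -a http-$TAG` / `exec -a hagrid-$TAG` calls (a bashism, hence the bash shebang) rewrite each server's argv[0], so the `pkill -f` calls in `cleanup()` match exactly these two processes and nothing else. A stdlib-only sketch of what that match searches, assuming Linux's /proc (tag copied from the script):

```python
# What `pkill -f http-$TAG` effectively scans: full command lines under /proc.
from pathlib import Path

TAG = "3a9c40c25b2f"

for proc in Path("/proc").glob("[0-9]*"):
    try:
        cmdline = (proc / "cmdline").read_bytes().replace(b"\0", b" ").decode()
    except OSError:
        continue  # the process exited while we were scanning
    if f"http-{TAG}" in cmdline:
        print(proc.name, cmdline)
```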
diff --git a/compose/setup/aptly.conf b/compose/setup/aptly.conf
new file mode 100644
index 0000000..e141d63
--- /dev/null
+++ b/compose/setup/aptly.conf
@@ -0,0 +1,3 @@
+{
+ "rootDir": "/setup/aptly"
+}
diff --git a/doc/_templates/.gitkeep b/compose/setup/aptly/.keep
similarity index 100%
rename from doc/_templates/.gitkeep
rename to compose/setup/aptly/.keep
diff --git a/.aptly-bin/aptly b/compose/setup/aptly_1.5.0_amd64.deb
old mode 100755
new mode 100644
similarity index 50%
rename from .aptly-bin/aptly
rename to compose/setup/aptly_1.5.0_amd64.deb
index 55d8ad8..e291fcf
Binary files a/.aptly-bin/aptly and b/compose/setup/aptly_1.5.0_amd64.deb differ
diff --git a/compose/setup/config.toml b/compose/setup/config.toml
new file mode 100644
index 0000000..64fa972
--- /dev/null
+++ b/compose/setup/config.toml
@@ -0,0 +1,22 @@
+# Run compose/setup/setup with nocheck to update hashes
+
+[aptly]
+url="https://github.com/aptly-dev/aptly/releases/download/v1.5.0/aptly_1.5.0_amd64.deb"
+hash="c606c06ef2ddc6f0b225d6cbecaccd4b17f537ddc8a3fc72a12be94f864674cb"
+target="/usr/local"
+base=["aptly", "-config=/setup/aptly.conf"]
+
+[gnupg]
+base=[
+ "gpg",
+ "--no-default-keyring",
+ "--keyring",
+ "trustedkeys.gpg"
+]
+test01="2841988729C7F3FF"
+test02="EC54D33E5B5EBE98"
+
+[hagrid]
+repo="https://gitlab.com/keys.openpgp.org/hagrid.git"
+# Upstream project does not tag, this is version 1.2.1
+revision="5e08a7086eccf03bfe6d3bb06e197c33735c96f4"
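The `hash` field pins the aptly package; `setup` verifies it on download unless `nocheck` is passed, as noted in the first line. A hedged helper sketch for refreshing the pin after bumping `url` (hypothetical local filename):

```python
# Recompute the sha256 that belongs in the `hash` field above.
import hashlib
from pathlib import Path

def sha256_of(path: Path, blocksize: int = 1024 * 16) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(blocksize):
            digest.update(block)
    return digest.hexdigest()

print(sha256_of(Path("aptly_1.5.0_amd64.deb")))
```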
diff --git a/vagrant/hellome_0.1-1_amd64.deb b/compose/setup/hellome_0.1-1_amd64.deb
similarity index 100%
rename from vagrant/hellome_0.1-1_amd64.deb
rename to compose/setup/hellome_0.1-1_amd64.deb
diff --git a/vagrant/libhello_0.1-1_amd64.deb b/compose/setup/libhello_0.1-1_amd64.deb
similarity index 100%
rename from vagrant/libhello_0.1-1_amd64.deb
rename to compose/setup/libhello_0.1-1_amd64.deb
diff --git a/compose/setup/setup b/compose/setup/setup
new file mode 100755
index 0000000..4d57559
--- /dev/null
+++ b/compose/setup/setup
@@ -0,0 +1,150 @@
+#!/usr/bin/env python3
+import hashlib
+import os
+import shutil
+import sys
+from glob import glob
+from pathlib import Path
+from subprocess import run
+
+import requests
+import toml
+
+base = Path("/setup")
+_nocheck = False
+_blocksize = 1024 * 16
+
+
+def get(url, path):
+ print(f"Download {url}")
+ hash = hashlib.sha256()
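+    # stream the body in chunks, feeding the hash as we write so the file is read only once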
+ with requests.get(url, stream=True) as response:
+        response.raise_for_status()  # fail loudly; a silent non-200 would hash nothing
+        if response.status_code == 200:
+ with path.open("wb") as file:
+ for chunk in response.iter_content(chunk_size=8192):
+ if chunk:
+ sys.stdout.write(".")
+ sys.stdout.flush()
+ file.write(chunk)
+ hash.update(chunk)
+ print("done")
+ hexdigest = hash.hexdigest()
+ print(f"sha256sum: {hexdigest}")
+ return hexdigest
+
+
+def main():
+ global _nocheck
+ _nocheck = "nocheck" in sys.argv
+ config = toml.load(base / "config.toml")
+ aptly = config["aptly"]
+ url = aptly["url"]
+ aptly["filename"] = Path(url).name
+ setup_gnupg(config["gnupg"])
+ install_aptly(aptly)
+ setup_static_aptly(config)
+ publish_key("01")
+ publish_key("02")
+ build_hagrid(config["hagrid"])
+
+
+def drun(cmd, *args, **kwargs):
+    """Echo and run a command, for debugging."""
+ print(" ".join(cmd))
+ run(cmd, *args, **kwargs)
+
+
+def create_repo(config, name, key):
+ cmd = config["base"]
+    run(cmd + ["repo", "create", "-architectures=amd64", name], check=True)
+ run(cmd + ["repo", "add", name] + glob("/setup/*.deb"), check=True)
+ run(
+ cmd
+ + [
+ "publish",
+ "repo",
+ "-keyring=trustedkeys.gpg",
+ f"-gpg-key={key}",
+ "-distribution=main",
+ name,
+ name,
+ ],
+ check=True,
+ )
+
+
+def setup_static_aptly(config):
+ aptly = config["aptly"]
+ gnupg = config["gnupg"]
+ create_repo(aptly, "fakerepo01", gnupg["test01"])
+ create_repo(aptly, "fakerepo02", gnupg["test01"])
+ create_repo(aptly, "fakerepo03", gnupg["test02"])
+
+
+def setup_key(gpg, number):
+ run(gpg + ["--import", base / f"test{number}.key"], check=True)
+ run(gpg + ["--import", base / f"test{number}.pub"], check=True)
+
+
+def setup_gnupg(config):
+ gpg = config["base"]
+ setup_key(gpg, "01")
+ setup_key(gpg, "02")
+
+
+def install_aptly(config):
+ url = config["url"]
+ filename = config["filename"]
+ target = Path(config["target"])
+ bin = target / "bin"
+ aptly_path = base / filename
+ print(aptly_path)
+ if not aptly_path.exists():
+ hexdigest = get(url, aptly_path)
+ if not _nocheck:
+ assert hexdigest == config["hash"]
+ run(["dpkg", "-i", aptly_path], check=True)
+ run(["aptly", "version"], check=True)
+
+
+def publish_key(number):
+ aptly = base / "aptly"
+ public = aptly / "public"
+ keydir = public / "keys"
+ keydir.mkdir(parents=True, exist_ok=True)
+ key = f"test{number}.key"
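+    # hard-link the key into aptly's public tree so the static web server serves it alongside the repos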
+ (keydir / key).hardlink_to(base / key)
+
+
+def digestdir(hash, dir: Path):
+ for item in sorted(dir.iterdir(), key=lambda x: x.name):
+ if item.is_file():
+ with item.open("rb") as f:
+ while block := f.read(_blocksize):
+ hash.update(block)
+ elif item.is_dir():
+ digestdir(hash, item)
+
+
+def build_hagrid(config):
+ os.chdir(base)
+ hagrid = base / "hagrid.src"
+ hagrid_dst = base / "hagrid"
+ hagrid_dst.mkdir()
+ run(["git", "clone", config["repo"], hagrid])
+ os.chdir(hagrid)
+ run(["git", "checkout", config["revision"]])
+ run(["cargo", "install", "--locked", "--path", "."])
+ rocket_dist = hagrid / "dist"
+ rocket_dist.rename(hagrid_dst)
+ config_src = hagrid / "Rocket.toml.dist"
+ config_dst = hagrid_dst / "Rocket.toml"
+ config_src.rename(config_dst)
+ os.chdir(base)
+ shutil.rmtree(hagrid)
+ shutil.rmtree("/root/.cargo/git")
+ shutil.rmtree("/root/.cargo/registry")
+
+
+if __name__ == "__main__":
+ main()
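Note that `digestdir()` above has no caller in `main()`; it walks a tree in sorted order so the digest is deterministic across runs. A usage sketch, assuming the function from this script and the published tree (hypothetical path):

```python
# Hypothetical use of digestdir(): fingerprint the published aptly tree.
import hashlib
from pathlib import Path

h = hashlib.sha256()
digestdir(h, Path("/setup/aptly/public"))  # digestdir as defined in setup
print(h.hexdigest())
```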
diff --git a/compose/setup/test01.key b/compose/setup/test01.key
new file mode 100644
index 0000000..384564e
--- /dev/null
+++ b/compose/setup/test01.key
@@ -0,0 +1,15 @@
+-----BEGIN PGP PRIVATE KEY BLOCK-----
+
+lFgEZXCo4BYJKwYBBAHaRw8BAQdA6cL1J8RN8QGco4pWCJ6ad9gI9eqGGGBVOsY3
++a3nBGYAAP9pKzNFmGve0do5FGQ6tKg62rVE0D4Eiji3jjUk3dr1UhBbtChQeWFw
+dGx5IFRlc3QgMDEgPHRlc3QwMUBweWFwdGx5Lm5vd2hlcmU+iJAEExYIADgWIQRj
+gMB/9klgFuAc9FIoQZiHKcfz/wUCZXCo4AIbAwULCQgHAgYVCgkICwIEFgIDAQIe
+AQIXgAAKCRAoQZiHKcfz/+LJAP9buWREh4VdFWcMCMJlFT8EqJV9/LxT43Khgb9C
+4J9skwEArWaaC/XieUFLIUXMXhw6ovdxHEx4djMZ8yG8SCwDBQecXQRlcKjgEgor
+BgEEAZdVAQUBAQdAfncDnoTVLSw0zMF6DXnH+m99zRksaUOIYS1pccMDlBQDAQgH
+AAD/fj/MmqfQ8V7jOFXcQPMtoYW4KlgglA7AyQduAD7U9aARW4h4BBgWCAAgFiEE
+Y4DAf/ZJYBbgHPRSKEGYhynH8/8FAmVwqOACGwwACgkQKEGYhynH8/8dZgD9FjC1
+S3Qi+MwczZduqN1r7EGW0+LEXxKzW5Mqemktr7IA/0kQusprlpU1cjmnwHgQ41bD
+CLN+ZceKvLqhbLLEGaoA
+=QonA
+-----END PGP PRIVATE KEY BLOCK-----
diff --git a/compose/setup/test01.pub b/compose/setup/test01.pub
new file mode 100644
index 0000000..732fed2
--- /dev/null
+++ b/compose/setup/test01.pub
@@ -0,0 +1,13 @@
+-----BEGIN PGP PUBLIC KEY BLOCK-----
+
+mDMEZXCo4BYJKwYBBAHaRw8BAQdA6cL1J8RN8QGco4pWCJ6ad9gI9eqGGGBVOsY3
++a3nBGa0KFB5YXB0bHkgVGVzdCAwMSA8dGVzdDAxQHB5YXB0bHkubm93aGVyZT6I
+kAQTFggAOBYhBGOAwH/2SWAW4Bz0UihBmIcpx/P/BQJlcKjgAhsDBQsJCAcCBhUK
+CQgLAgQWAgMBAh4BAheAAAoJEChBmIcpx/P/4skA/1u5ZESHhV0VZwwIwmUVPwSo
+lX38vFPjcqGBv0Lgn2yTAQCtZpoL9eJ5QUshRcxeHDqi93EcTHh2MxnzIbxILAMF
+B7g4BGVwqOASCisGAQQBl1UBBQEBB0B+dwOehNUtLDTMwXoNecf6b33NGSxpQ4hh
+LWlxwwOUFAMBCAeIeAQYFggAIBYhBGOAwH/2SWAW4Bz0UihBmIcpx/P/BQJlcKjg
+AhsMAAoJEChBmIcpx/P/HWYA/RYwtUt0IvjMHM2Xbqjda+xBltPixF8Ss1uTKnpp
+La+yAP9JELrKa5aVNXI5p8B4EONWwwizfmXHiry6oWyyxBmqAA==
+=zUlB
+-----END PGP PUBLIC KEY BLOCK-----
\ No newline at end of file
diff --git a/compose/setup/test02.key b/compose/setup/test02.key
new file mode 100644
index 0000000..8e7f5bf
--- /dev/null
+++ b/compose/setup/test02.key
@@ -0,0 +1,15 @@
+-----BEGIN PGP PRIVATE KEY BLOCK-----
+
+lFgEZXCpuRYJKwYBBAHaRw8BAQdAJUDsX/4AJM/azcK9wyzBqdK6ok3unG/5ry7Q
+48C14QAAAP0euwdpeDf8G0/TYbgeWT4ywZEj1dgvLYzWJHESgSw98g6RtChQeWFw
+dGx5IFRlc3QgMDIgPHRlc3QwMkBweWFwdGx5Lm5vd2hlcmU+iJAEExYIADgWIQRm
+DUUiira1nM5Ir7PsVNM+W16+mAUCZXCpuQIbAwULCQgHAgYVCgkICwIEFgIDAQIe
+AQIXgAAKCRDsVNM+W16+mGWOAQDEd/bFb2Ir/4xq1dBP+yw54xRWLowVLvQidZ3r
+I9nVnQD7B/JyQxpXJsqql/UegS2wmODkG092r4vezVpt/007ZQ6cXQRlcKm5Egor
+BgEEAZdVAQUBAQdAgyGTMcFXMwCZ3CwAHunuqPdQrS33OB9ZC6+AtwTGXk8DAQgH
+AAD/dGmh1IdXftTLAQaa+vs7l2ZH90BR9qUHI3Kvx25YLvgRgoh4BBgWCAAgFiEE
+Zg1FIoq2tZzOSK+z7FTTPltevpgFAmVwqbkCGwwACgkQ7FTTPltevpj8HQEA1Gun
+fISJE5s2nHRVqKkQP5zLQAIEk+QFzK/V2WHMN3IBAJRDBIo9PD+QiRaNtGV5brUd
+Yg3uqcnSvqrKNPpXo5gF
+=WRdn
+-----END PGP PRIVATE KEY BLOCK-----
diff --git a/compose/setup/test02.pub b/compose/setup/test02.pub
new file mode 100644
index 0000000..a02977e
--- /dev/null
+++ b/compose/setup/test02.pub
@@ -0,0 +1,13 @@
+-----BEGIN PGP PUBLIC KEY BLOCK-----
+
+mDMEZXCpuRYJKwYBBAHaRw8BAQdAJUDsX/4AJM/azcK9wyzBqdK6ok3unG/5ry7Q
+48C14QC0KFB5YXB0bHkgVGVzdCAwMiA8dGVzdDAyQHB5YXB0bHkubm93aGVyZT6I
+kAQTFggAOBYhBGYNRSKKtrWczkivs+xU0z5bXr6YBQJlcKm5AhsDBQsJCAcCBhUK
+CQgLAgQWAgMBAh4BAheAAAoJEOxU0z5bXr6YZY4BAMR39sVvYiv/jGrV0E/7LDnj
+FFYujBUu9CJ1nesj2dWdAPsH8nJDGlcmyqqX9R6BLbCY4OQbT3avi97NWm3/TTtl
+Drg4BGVwqbkSCisGAQQBl1UBBQEBB0CDIZMxwVczAJncLAAe6e6o91CtLfc4H1kL
+r4C3BMZeTwMBCAeIeAQYFggAIBYhBGYNRSKKtrWczkivs+xU0z5bXr6YBQJlcKm5
+AhsMAAoJEOxU0z5bXr6Y/B0BANRrp3yEiRObNpx0VaipED+cy0ACBJPkBcyv1dlh
+zDdyAQCUQwSKPTw/kIkWjbRleW61HWIN7qnJ0r6qyjT6V6OYBQ==
+=MNVw
+-----END PGP PUBLIC KEY BLOCK-----
\ No newline at end of file
diff --git a/compose/setup/version b/compose/setup/version
new file mode 100644
index 0000000..0cfbf08
--- /dev/null
+++ b/compose/setup/version
@@ -0,0 +1 @@
+2
diff --git a/debian/changelog b/debian/changelog
deleted file mode 100644
index 36a89d7..0000000
--- a/debian/changelog
+++ /dev/null
@@ -1,156 +0,0 @@
-pyaptly (1.2.0-1) stable; urgency=low
-
- * Important: Because of PR #29 pyaptly needs at least aptly 0.9.6
- * Merge pull request #29 from sliverc/support_flags
- - https://github.com/adfinis-sygroup/pyaptly/pull/29
- - Added additional option to skip contents file generation in publish
- * Merge pull request #30 from sliverc/add_python26_support
- - https://github.com/adfinis-sygroup/pyaptly/pull/30
- - Add python26 tests
- * Merge pull request #28 from winged/do_not_expect_timestamp_in_snapshot_dict
- - https://github.com/adfinis-sygroup/pyaptly/pull/28
- - Do not expect dict-snapshots to contain timestamps
- * Merge pull request #27 from msabramo/patch-4
- - https://github.com/adfinis-sygroup/pyaptly/pull/27
- - README.rst: Add PyPI badge
- * Merge pull request #26 from msabramo/patch-3
- - https://github.com/adfinis-sygroup/pyaptly/pull/26
- - Fix some typos
- * Merge pull request #25 from msabramo/patch-2
- - https://github.com/adfinis-sygroup/pyaptly/pull/25
- - format.rst: Fix a few typos
- * Merge pull request #24 from msabramo/patch-1
- - https://github.com/adfinis-sygroup/pyaptly/pull/24
- - setup.py: Set url to GitHub repo
- * Merge pull request #22 from ganwell/master
- - https://github.com/adfinis-sygroup/pyaptly/pull/22
- - Updated pyproject to support version suffix
- * Merge pull request #21 from ganwell/master
- - https://github.com/adfinis-sygroup/pyaptly/pull/21
- - Update pyproject for CI build
- * Merge pull request #20 from karras/bug_fix_snapshot_update_when_publishing_repos
- - https://github.com/adfinis-sygroup/pyaptly/pull/20
- - Fix bug when executing snapshot update
-
- -- Jean-Louis Fuchs Mon, 24 Oct 2016 15:25:13 +0000
-
-pyaptly (1.2.0-1) stable; urgency=low
-
- * Need at least aptly version 0.9.6
- * Merge pull request #30 from sliverc/add_python26_support
- - https://github.com/adfinis-sygroup/pyaptly/pull/30
- - Add python26 tests
- * Merge pull request #28 from winged/do_not_expect_timestamp_in_snapshot_dict
- - https://github.com/adfinis-sygroup/pyaptly/pull/28
- - Do not expect dict-snapshots to contain timestamps
- * Merge pull request #27 from msabramo/patch-4
- - https://github.com/adfinis-sygroup/pyaptly/pull/27
- - README.rst: Add PyPI badge
- * Merge pull request #26 from msabramo/patch-3
- - https://github.com/adfinis-sygroup/pyaptly/pull/26
- - Fix some typos
- * Merge pull request #25 from msabramo/patch-2
- - https://github.com/adfinis-sygroup/pyaptly/pull/25
- - format.rst: Fix a few typos
- * Merge pull request #24 from msabramo/patch-1
- - https://github.com/adfinis-sygroup/pyaptly/pull/24
- - setup.py: Set url to GitHub repo
- * Merge pull request #22 from ganwell/master
- - https://github.com/adfinis-sygroup/pyaptly/pull/22
- - Updated pyproject to support version suffix
- * Merge pull request #21 from ganwell/master
- - https://github.com/adfinis-sygroup/pyaptly/pull/21
- - Update pyproject for CI build
- * Merge branch 'master' of https://github.com/adfinis-sygroup/pyaptly
- - https://github.com/adfinis-sygroup/pyaptly/commit/87094a2
- * Merge pull request #20 from karras/bug_fix_snapshot_update_when_publishing_repos
- - https://github.com/adfinis-sygroup/pyaptly/pull/20
- - Fix bug when executing snapshot update
-
- -- Jean-Louis Fuchs Mon, 24 Oct 2016 15:19:16 +0000
-
-pyaptly (1.2.0-1) stable; urgency=low
-
- * Merge pull request #30 from sliverc/add_python26_support
- - https://github.com/adfinis-sygroup/pyaptly/pull/30
- - Add python26 tests
- * Merge pull request #28 from winged/do_not_expect_timestamp_in_snapshot_dict
- - https://github.com/adfinis-sygroup/pyaptly/pull/28
- - Do not expect dict-snapshots to contain timestamps
- * Merge pull request #27 from msabramo/patch-4
- - https://github.com/adfinis-sygroup/pyaptly/pull/27
- - README.rst: Add PyPI badge
- * Merge pull request #26 from msabramo/patch-3
- - https://github.com/adfinis-sygroup/pyaptly/pull/26
- - Fix some typos
- * Merge pull request #25 from msabramo/patch-2
- - https://github.com/adfinis-sygroup/pyaptly/pull/25
- - format.rst: Fix a few typos
- * Merge pull request #24 from msabramo/patch-1
- - https://github.com/adfinis-sygroup/pyaptly/pull/24
- - setup.py: Set url to GitHub repo
- * Merge pull request #23 from ganwell/master
- - https://github.com/adfinis-sygroup/pyaptly/pull/23
- - Installation
- * Merge pull request #22 from ganwell/master
- - https://github.com/adfinis-sygroup/pyaptly/pull/22
- - Updated pyproject to support version suffix
- * Merge branch 'master' of https://github.com/adfinis-sygroup/pyaptly
- - https://github.com/adfinis-sygroup/pyaptly/commit/146e7b3
- * Merge pull request #21 from ganwell/master
- - https://github.com/adfinis-sygroup/pyaptly/pull/21
- - Update pyproject for CI build
- * Merge branch 'master' of https://github.com/adfinis-sygroup/pyaptly
- - https://github.com/adfinis-sygroup/pyaptly/commit/87094a2
- * Merge pull request #20 from karras/bug_fix_snapshot_update_when_publishing_repos
- - https://github.com/adfinis-sygroup/pyaptly/pull/20
- - Fix bug when executing snapshot update
-
- -- Jean-Louis Fuchs Mon, 24 Oct 2016 15:19:16 +0000
-
-pyaptly (1.1.0-1) stable; urgency=low
-
- * Merge pull request #15 from ganwell/feature_gpg_for_publish
- - https://github.com/adfinis-sygroup/pyaptly/pull/15
- - Update documentation about gpg-key and the gpg-agent. Read public keys and subkeys from gpg
- * Merge pull request #14 from ganwell/master
- - https://github.com/adfinis-sygroup/pyaptly/pull/14
- - Meta PR: Fixing PR #11 and #12
- * Merge pull request #9 from winged/fix_exponential_complexity_in_read_snapshot_map
- - https://github.com/adfinis-sygroup/pyaptly/pull/9
- - Fix exponential complexity when reading snapshot map.
- * Merge pull request #7 from ganwell/master
- - https://github.com/adfinis-sygroup/pyaptly/pull/7
- - Making hypothesis examples smaller to avoid timeouts
- * Merge pull request #6 from karras/master
- - https://github.com/adfinis-sygroup/pyaptly/pull/6
- - Fix typos in README
- * Merge pull request #5 from ganwell/master
- - https://github.com/adfinis-sygroup/pyaptly/pull/5
- - Display travis badge
- * Merge pull request #4 from ganwell/master
- - https://github.com/adfinis-sygroup/pyaptly/pull/4
- - Update CHANGELOG
- * Merge pull request #3 from ganwell/master
- - https://github.com/adfinis-sygroup/pyaptly/pull/3
- - Change to public documentation location
-
- -- Jean-Louis Fuchs Wed, 15 Jun 2016 13:21:43 +0000
-
-pyaptly (1.0.1-1) stable; urgency=low
-
- * Merge pull request #3 from ganwell/master
- - https://github.com/adfinis-sygroup/pyaptly/pull/3
- - Change to public documentation location
-
- -- Jean-Louis Fuchs Sat, 07 May 2016 14:17:42 +0000
-
-pyaptly (1.0.0-1) stable; urgency=low
-
- * Merge pull request #2 from ganwell/master
- - https://github.com/adfinis-sygroup/pyaptly/pull/2
- - Semi-Automatic Release of deb and rpm Packages
- * Added CHANGELOG
- - https://github.com/adfinis-sygroup/pyaptly/commit/9f8ea2e
-
- -- Jean-Louis Fuchs Fri, 06 May 2016 19:40:42 +0000
diff --git a/debian/clean b/debian/clean
deleted file mode 100644
index 45149aa..0000000
--- a/debian/clean
+++ /dev/null
@@ -1 +0,0 @@
-*.egg-info/*
diff --git a/debian/compat b/debian/compat
deleted file mode 100644
index ec63514..0000000
--- a/debian/compat
+++ /dev/null
@@ -1 +0,0 @@
-9
diff --git a/debian/control b/debian/control
deleted file mode 100644
index 07f015e..0000000
--- a/debian/control
+++ /dev/null
@@ -1,21 +0,0 @@
-Source: pyaptly
-Section: python
-Priority: optional
-Maintainer: Jean-Louis Fuchs - Adfinis-SyGroup
-Uploaders: Jean-Louis Fuchs - Adfinis-SyGroup
-Build-Depends: debhelper (>= 9), dh-python, python-all, python3-all,
- python-setuptools, python3-setuptools
-Standards-Version: 3.9.4
-Homepage: https://github.com/adfinis-sygroup/pyaptly
-Vcs-Git: https://github.com/adfinis-sygroup/pyaptly
-Vcs-Browser: https://github.com/adfinis-sygroup/pyaptly
-
-Package: python-pyaptly
-Architecture: all
-Depends: ${misc:Depends}, ${python:Depends}, python-pkg-resources
-Description: Automates the creation and managment of aptly mirrors and snapshots based on yml input files.
-
-Package: python3-pyaptly
-Architecture: all
-Depends: ${misc:Depends}, ${python3:Depends}, python3-pkg-resources
-Description: Automates the creation and managment of aptly mirrors and snapshots based on yml input files.
diff --git a/debian/copyright b/debian/copyright
deleted file mode 100644
index db2d99c..0000000
--- a/debian/copyright
+++ /dev/null
@@ -1,723 +0,0 @@
-Format: http://www.debian.org/doc/packaging-manuals/copyright-format/1.0/
-Upstream-Name: pyaptly
-Upstream-Contact: Jean-Louis Fuchs - Adfinis-SyGroup
-Source: https://github.com/adfinis-sygroup/pyaptly
-
-Files: *
-Copyright: Jean-Louis Fuchs - Adfinis-SyGroup
-License: AGPLv3
-
-Files: debian/*
-Copyright: Jean-Louis Fuchs - Adfinis-SyGroup
-License: GPL-2+
-
-License: GPL-2+
- This program is free software; you can redistribute it
- and/or modify it under the terms of the GNU General Public
- License as published by the Free Software Foundation; either
- version 2 of the License, or (at your option) any later
- version.
- .
- This program is distributed in the hope that it will be
- useful, but WITHOUT ANY WARRANTY; without even the implied
- warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR
- PURPOSE. See the GNU General Public License for more
- details.
- .
- You should have received a copy of the GNU General Public
- License along with this package; if not, write to the Free
- Software Foundation, Inc., 51 Franklin St, Fifth Floor,
- Boston, MA 02110-1301 USA
- .
- On Debian systems, the full text of the GNU General Public
- License version 2 can be found in the file
- `/usr/share/common-licenses/GPL-2'.
-
-License: AGPLv3
- GNU AFFERO GENERAL PUBLIC LICENSE
- Version 3, 19 November 2007
-
- Copyright (C) 2007 Free Software Foundation, Inc.
- Everyone is permitted to copy and distribute verbatim copies
- of this license document, but changing it is not allowed.
-
- Preamble
-
- The GNU Affero General Public License is a free, copyleft license for
- software and other kinds of works, specifically designed to ensure
- cooperation with the community in the case of network server software.
-
- The licenses for most software and other practical works are designed
- to take away your freedom to share and change the works. By contrast,
- our General Public Licenses are intended to guarantee your freedom to
- share and change all versions of a program--to make sure it remains free
- software for all its users.
-
- When we speak of free software, we are referring to freedom, not
- price. Our General Public Licenses are designed to make sure that you
- have the freedom to distribute copies of free software (and charge for
- them if you wish), that you receive source code or can get it if you
- want it, that you can change the software or use pieces of it in new
- free programs, and that you know you can do these things.
-
- Developers that use our General Public Licenses protect your rights
- with two steps: (1) assert copyright on the software, and (2) offer
- you this License which gives you legal permission to copy, distribute
- and/or modify the software.
-
- A secondary benefit of defending all users' freedom is that
- improvements made in alternate versions of the program, if they
- receive widespread use, become available for other developers to
- incorporate. Many developers of free software are heartened and
- encouraged by the resulting cooperation. However, in the case of
- software used on network servers, this result may fail to come about.
- The GNU General Public License permits making a modified version and
- letting the public access it on a server without ever releasing its
- source code to the public.
-
- The GNU Affero General Public License is designed specifically to
- ensure that, in such cases, the modified source code becomes available
- to the community. It requires the operator of a network server to
- provide the source code of the modified version running there to the
- users of that server. Therefore, public use of a modified version, on
- a publicly accessible server, gives the public access to the source
- code of the modified version.
-
- An older license, called the Affero General Public License and
- published by Affero, was designed to accomplish similar goals. This is
- a different license, not a version of the Affero GPL, but Affero has
- released a new version of the Affero GPL which permits relicensing under
- this license.
-
- The precise terms and conditions for copying, distribution and
- modification follow.
-
- TERMS AND CONDITIONS
-
- 0. Definitions.
-
- "This License" refers to version 3 of the GNU Affero General Public License.
-
- "Copyright" also means copyright-like laws that apply to other kinds of
- works, such as semiconductor masks.
-
- "The Program" refers to any copyrightable work licensed under this
- License. Each licensee is addressed as "you". "Licensees" and
- "recipients" may be individuals or organizations.
-
- To "modify" a work means to copy from or adapt all or part of the work
- in a fashion requiring copyright permission, other than the making of an
- exact copy. The resulting work is called a "modified version" of the
- earlier work or a work "based on" the earlier work.
-
- A "covered work" means either the unmodified Program or a work based
- on the Program.
-
- To "propagate" a work means to do anything with it that, without
- permission, would make you directly or secondarily liable for
- infringement under applicable copyright law, except executing it on a
- computer or modifying a private copy. Propagation includes copying,
- distribution (with or without modification), making available to the
- public, and in some countries other activities as well.
-
- To "convey" a work means any kind of propagation that enables other
- parties to make or receive copies. Mere interaction with a user through
- a computer network, with no transfer of a copy, is not conveying.
-
- An interactive user interface displays "Appropriate Legal Notices"
- to the extent that it includes a convenient and prominently visible
- feature that (1) displays an appropriate copyright notice, and (2)
- tells the user that there is no warranty for the work (except to the
- extent that warranties are provided), that licensees may convey the
- work under this License, and how to view a copy of this License. If
- the interface presents a list of user commands or options, such as a
- menu, a prominent item in the list meets this criterion.
-
- 1. Source Code.
-
- The "source code" for a work means the preferred form of the work
- for making modifications to it. "Object code" means any non-source
- form of a work.
-
- A "Standard Interface" means an interface that either is an official
- standard defined by a recognized standards body, or, in the case of
- interfaces specified for a particular programming language, one that
- is widely used among developers working in that language.
-
- The "System Libraries" of an executable work include anything, other
- than the work as a whole, that (a) is included in the normal form of
- packaging a Major Component, but which is not part of that Major
- Component, and (b) serves only to enable use of the work with that
- Major Component, or to implement a Standard Interface for which an
- implementation is available to the public in source code form. A
- "Major Component", in this context, means a major essential component
- (kernel, window system, and so on) of the specific operating system
- (if any) on which the executable work runs, or a compiler used to
- produce the work, or an object code interpreter used to run it.
-
- The "Corresponding Source" for a work in object code form means all
- the source code needed to generate, install, and (for an executable
- work) run the object code and to modify the work, including scripts to
- control those activities. However, it does not include the work's
- System Libraries, or general-purpose tools or generally available free
- programs which are used unmodified in performing those activities but
- which are not part of the work. For example, Corresponding Source
- includes interface definition files associated with source files for
- the work, and the source code for shared libraries and dynamically
- linked subprograms that the work is specifically designed to require,
- such as by intimate data communication or control flow between those
- subprograms and other parts of the work.
-
- The Corresponding Source need not include anything that users
- can regenerate automatically from other parts of the Corresponding
- Source.
-
- The Corresponding Source for a work in source code form is that
- same work.
-
- 2. Basic Permissions.
-
- All rights granted under this License are granted for the term of
- copyright on the Program, and are irrevocable provided the stated
- conditions are met. This License explicitly affirms your unlimited
- permission to run the unmodified Program. The output from running a
- covered work is covered by this License only if the output, given its
- content, constitutes a covered work. This License acknowledges your
- rights of fair use or other equivalent, as provided by copyright law.
-
- You may make, run and propagate covered works that you do not
- convey, without conditions so long as your license otherwise remains
- in force. You may convey covered works to others for the sole purpose
- of having them make modifications exclusively for you, or provide you
- with facilities for running those works, provided that you comply with
- the terms of this License in conveying all material for which you do
- not control copyright. Those thus making or running the covered works
- for you must do so exclusively on your behalf, under your direction
- and control, on terms that prohibit them from making any copies of
- your copyrighted material outside their relationship with you.
-
- Conveying under any other circumstances is permitted solely under
- the conditions stated below. Sublicensing is not allowed; section 10
- makes it unnecessary.
-
- 3. Protecting Users' Legal Rights From Anti-Circumvention Law.
-
- No covered work shall be deemed part of an effective technological
- measure under any applicable law fulfilling obligations under article
- 11 of the WIPO copyright treaty adopted on 20 December 1996, or
- similar laws prohibiting or restricting circumvention of such
- measures.
-
- When you convey a covered work, you waive any legal power to forbid
- circumvention of technological measures to the extent such circumvention
- is effected by exercising rights under this License with respect to
- the covered work, and you disclaim any intention to limit operation or
- modification of the work as a means of enforcing, against the work's
- users, your or third parties' legal rights to forbid circumvention of
- technological measures.
-
- 4. Conveying Verbatim Copies.
-
- You may convey verbatim copies of the Program's source code as you
- receive it, in any medium, provided that you conspicuously and
- appropriately publish on each copy an appropriate copyright notice;
- keep intact all notices stating that this License and any
- non-permissive terms added in accord with section 7 apply to the code;
- keep intact all notices of the absence of any warranty; and give all
- recipients a copy of this License along with the Program.
-
- You may charge any price or no price for each copy that you convey,
- and you may offer support or warranty protection for a fee.
-
- 5. Conveying Modified Source Versions.
-
- You may convey a work based on the Program, or the modifications to
- produce it from the Program, in the form of source code under the
- terms of section 4, provided that you also meet all of these conditions:
-
- a) The work must carry prominent notices stating that you modified
- it, and giving a relevant date.
-
- b) The work must carry prominent notices stating that it is
- released under this License and any conditions added under section
- 7. This requirement modifies the requirement in section 4 to
- "keep intact all notices".
-
- c) You must license the entire work, as a whole, under this
- License to anyone who comes into possession of a copy. This
- License will therefore apply, along with any applicable section 7
- additional terms, to the whole of the work, and all its parts,
- regardless of how they are packaged. This License gives no
- permission to license the work in any other way, but it does not
- invalidate such permission if you have separately received it.
-
- d) If the work has interactive user interfaces, each must display
- Appropriate Legal Notices; however, if the Program has interactive
- interfaces that do not display Appropriate Legal Notices, your
- work need not make them do so.
-
- A compilation of a covered work with other separate and independent
- works, which are not by their nature extensions of the covered work,
- and which are not combined with it such as to form a larger program,
- in or on a volume of a storage or distribution medium, is called an
- "aggregate" if the compilation and its resulting copyright are not
- used to limit the access or legal rights of the compilation's users
- beyond what the individual works permit. Inclusion of a covered work
- in an aggregate does not cause this License to apply to the other
- parts of the aggregate.
-
- 6. Conveying Non-Source Forms.
-
- You may convey a covered work in object code form under the terms
- of sections 4 and 5, provided that you also convey the
- machine-readable Corresponding Source under the terms of this License,
- in one of these ways:
-
- a) Convey the object code in, or embodied in, a physical product
- (including a physical distribution medium), accompanied by the
- Corresponding Source fixed on a durable physical medium
- customarily used for software interchange.
-
- b) Convey the object code in, or embodied in, a physical product
- (including a physical distribution medium), accompanied by a
- written offer, valid for at least three years and valid for as
- long as you offer spare parts or customer support for that product
- model, to give anyone who possesses the object code either (1) a
- copy of the Corresponding Source for all the software in the
- product that is covered by this License, on a durable physical
- medium customarily used for software interchange, for a price no
- more than your reasonable cost of physically performing this
- conveying of source, or (2) access to copy the
- Corresponding Source from a network server at no charge.
-
- c) Convey individual copies of the object code with a copy of the
- written offer to provide the Corresponding Source. This
- alternative is allowed only occasionally and noncommercially, and
- only if you received the object code with such an offer, in accord
- with subsection 6b.
-
- d) Convey the object code by offering access from a designated
- place (gratis or for a charge), and offer equivalent access to the
- Corresponding Source in the same way through the same place at no
- further charge. You need not require recipients to copy the
- Corresponding Source along with the object code. If the place to
- copy the object code is a network server, the Corresponding Source
- may be on a different server (operated by you or a third party)
- that supports equivalent copying facilities, provided you maintain
- clear directions next to the object code saying where to find the
- Corresponding Source. Regardless of what server hosts the
- Corresponding Source, you remain obligated to ensure that it is
- available for as long as needed to satisfy these requirements.
-
- e) Convey the object code using peer-to-peer transmission, provided
- you inform other peers where the object code and Corresponding
- Source of the work are being offered to the general public at no
- charge under subsection 6d.
-
- A separable portion of the object code, whose source code is excluded
- from the Corresponding Source as a System Library, need not be
- included in conveying the object code work.
-
- A "User Product" is either (1) a "consumer product", which means any
- tangible personal property which is normally used for personal, family,
- or household purposes, or (2) anything designed or sold for incorporation
- into a dwelling. In determining whether a product is a consumer product,
- doubtful cases shall be resolved in favor of coverage. For a particular
- product received by a particular user, "normally used" refers to a
- typical or common use of that class of product, regardless of the status
- of the particular user or of the way in which the particular user
- actually uses, or expects or is expected to use, the product. A product
- is a consumer product regardless of whether the product has substantial
- commercial, industrial or non-consumer uses, unless such uses represent
- the only significant mode of use of the product.
-
- "Installation Information" for a User Product means any methods,
- procedures, authorization keys, or other information required to install
- and execute modified versions of a covered work in that User Product from
- a modified version of its Corresponding Source. The information must
- suffice to ensure that the continued functioning of the modified object
- code is in no case prevented or interfered with solely because
- modification has been made.
-
- If you convey an object code work under this section in, or with, or
- specifically for use in, a User Product, and the conveying occurs as
- part of a transaction in which the right of possession and use of the
- User Product is transferred to the recipient in perpetuity or for a
- fixed term (regardless of how the transaction is characterized), the
- Corresponding Source conveyed under this section must be accompanied
- by the Installation Information. But this requirement does not apply
- if neither you nor any third party retains the ability to install
- modified object code on the User Product (for example, the work has
- been installed in ROM).
-
- The requirement to provide Installation Information does not include a
- requirement to continue to provide support service, warranty, or updates
- for a work that has been modified or installed by the recipient, or for
- the User Product in which it has been modified or installed. Access to a
- network may be denied when the modification itself materially and
- adversely affects the operation of the network or violates the rules and
- protocols for communication across the network.
-
- Corresponding Source conveyed, and Installation Information provided,
- in accord with this section must be in a format that is publicly
- documented (and with an implementation available to the public in
- source code form), and must require no special password or key for
- unpacking, reading or copying.
-
- 7. Additional Terms.
-
- "Additional permissions" are terms that supplement the terms of this
- License by making exceptions from one or more of its conditions.
- Additional permissions that are applicable to the entire Program shall
- be treated as though they were included in this License, to the extent
- that they are valid under applicable law. If additional permissions
- apply only to part of the Program, that part may be used separately
- under those permissions, but the entire Program remains governed by
- this License without regard to the additional permissions.
-
- When you convey a copy of a covered work, you may at your option
- remove any additional permissions from that copy, or from any part of
- it. (Additional permissions may be written to require their own
- removal in certain cases when you modify the work.) You may place
- additional permissions on material, added by you to a covered work,
- for which you have or can give appropriate copyright permission.
-
- Notwithstanding any other provision of this License, for material you
- add to a covered work, you may (if authorized by the copyright holders of
- that material) supplement the terms of this License with terms:
-
- a) Disclaiming warranty or limiting liability differently from the
- terms of sections 15 and 16 of this License; or
-
- b) Requiring preservation of specified reasonable legal notices or
- author attributions in that material or in the Appropriate Legal
- Notices displayed by works containing it; or
-
- c) Prohibiting misrepresentation of the origin of that material, or
- requiring that modified versions of such material be marked in
- reasonable ways as different from the original version; or
-
- d) Limiting the use for publicity purposes of names of licensors or
- authors of the material; or
-
- e) Declining to grant rights under trademark law for use of some
- trade names, trademarks, or service marks; or
-
- f) Requiring indemnification of licensors and authors of that
- material by anyone who conveys the material (or modified versions of
- it) with contractual assumptions of liability to the recipient, for
- any liability that these contractual assumptions directly impose on
- those licensors and authors.
-
- All other non-permissive additional terms are considered "further
- restrictions" within the meaning of section 10. If the Program as you
- received it, or any part of it, contains a notice stating that it is
- governed by this License along with a term that is a further
- restriction, you may remove that term. If a license document contains
- a further restriction but permits relicensing or conveying under this
- License, you may add to a covered work material governed by the terms
- of that license document, provided that the further restriction does
- not survive such relicensing or conveying.
-
- If you add terms to a covered work in accord with this section, you
- must place, in the relevant source files, a statement of the
- additional terms that apply to those files, or a notice indicating
- where to find the applicable terms.
-
- Additional terms, permissive or non-permissive, may be stated in the
- form of a separately written license, or stated as exceptions;
- the above requirements apply either way.
-
- 8. Termination.
-
- You may not propagate or modify a covered work except as expressly
- provided under this License. Any attempt otherwise to propagate or
- modify it is void, and will automatically terminate your rights under
- this License (including any patent licenses granted under the third
- paragraph of section 11).
-
- However, if you cease all violation of this License, then your
- license from a particular copyright holder is reinstated (a)
- provisionally, unless and until the copyright holder explicitly and
- finally terminates your license, and (b) permanently, if the copyright
- holder fails to notify you of the violation by some reasonable means
- prior to 60 days after the cessation.
-
- Moreover, your license from a particular copyright holder is
- reinstated permanently if the copyright holder notifies you of the
- violation by some reasonable means, this is the first time you have
- received notice of violation of this License (for any work) from that
- copyright holder, and you cure the violation prior to 30 days after
- your receipt of the notice.
-
- Termination of your rights under this section does not terminate the
- licenses of parties who have received copies or rights from you under
- this License. If your rights have been terminated and not permanently
- reinstated, you do not qualify to receive new licenses for the same
- material under section 10.
-
- 9. Acceptance Not Required for Having Copies.
-
- You are not required to accept this License in order to receive or
- run a copy of the Program. Ancillary propagation of a covered work
- occurring solely as a consequence of using peer-to-peer transmission
- to receive a copy likewise does not require acceptance. However,
- nothing other than this License grants you permission to propagate or
- modify any covered work. These actions infringe copyright if you do
- not accept this License. Therefore, by modifying or propagating a
- covered work, you indicate your acceptance of this License to do so.
-
- 10. Automatic Licensing of Downstream Recipients.
-
- Each time you convey a covered work, the recipient automatically
- receives a license from the original licensors, to run, modify and
- propagate that work, subject to this License. You are not responsible
- for enforcing compliance by third parties with this License.
-
- An "entity transaction" is a transaction transferring control of an
- organization, or substantially all assets of one, or subdividing an
- organization, or merging organizations. If propagation of a covered
- work results from an entity transaction, each party to that
- transaction who receives a copy of the work also receives whatever
- licenses to the work the party's predecessor in interest had or could
- give under the previous paragraph, plus a right to possession of the
- Corresponding Source of the work from the predecessor in interest, if
- the predecessor has it or can get it with reasonable efforts.
-
- You may not impose any further restrictions on the exercise of the
- rights granted or affirmed under this License. For example, you may
- not impose a license fee, royalty, or other charge for exercise of
- rights granted under this License, and you may not initiate litigation
- (including a cross-claim or counterclaim in a lawsuit) alleging that
- any patent claim is infringed by making, using, selling, offering for
- sale, or importing the Program or any portion of it.
-
- 11. Patents.
-
- A "contributor" is a copyright holder who authorizes use under this
- License of the Program or a work on which the Program is based. The
- work thus licensed is called the contributor's "contributor version".
-
- A contributor's "essential patent claims" are all patent claims
- owned or controlled by the contributor, whether already acquired or
- hereafter acquired, that would be infringed by some manner, permitted
- by this License, of making, using, or selling its contributor version,
- but do not include claims that would be infringed only as a
- consequence of further modification of the contributor version. For
- purposes of this definition, "control" includes the right to grant
- patent sublicenses in a manner consistent with the requirements of
- this License.
-
- Each contributor grants you a non-exclusive, worldwide, royalty-free
- patent license under the contributor's essential patent claims, to
- make, use, sell, offer for sale, import and otherwise run, modify and
- propagate the contents of its contributor version.
-
- In the following three paragraphs, a "patent license" is any express
- agreement or commitment, however denominated, not to enforce a patent
- (such as an express permission to practice a patent or covenant not to
- sue for patent infringement). To "grant" such a patent license to a
- party means to make such an agreement or commitment not to enforce a
- patent against the party.
-
- If you convey a covered work, knowingly relying on a patent license,
- and the Corresponding Source of the work is not available for anyone
- to copy, free of charge and under the terms of this License, through a
- publicly available network server or other readily accessible means,
- then you must either (1) cause the Corresponding Source to be so
- available, or (2) arrange to deprive yourself of the benefit of the
- patent license for this particular work, or (3) arrange, in a manner
- consistent with the requirements of this License, to extend the patent
- license to downstream recipients. "Knowingly relying" means you have
- actual knowledge that, but for the patent license, your conveying the
- covered work in a country, or your recipient's use of the covered work
- in a country, would infringe one or more identifiable patents in that
- country that you have reason to believe are valid.
-
- If, pursuant to or in connection with a single transaction or
- arrangement, you convey, or propagate by procuring conveyance of, a
- covered work, and grant a patent license to some of the parties
- receiving the covered work authorizing them to use, propagate, modify
- or convey a specific copy of the covered work, then the patent license
- you grant is automatically extended to all recipients of the covered
- work and works based on it.
-
- A patent license is "discriminatory" if it does not include within
- the scope of its coverage, prohibits the exercise of, or is
- conditioned on the non-exercise of one or more of the rights that are
- specifically granted under this License. You may not convey a covered
- work if you are a party to an arrangement with a third party that is
- in the business of distributing software, under which you make payment
- to the third party based on the extent of your activity of conveying
- the work, and under which the third party grants, to any of the
- parties who would receive the covered work from you, a discriminatory
- patent license (a) in connection with copies of the covered work
- conveyed by you (or copies made from those copies), or (b) primarily
- for and in connection with specific products or compilations that
- contain the covered work, unless you entered into that arrangement,
- or that patent license was granted, prior to 28 March 2007.
-
- Nothing in this License shall be construed as excluding or limiting
- any implied license or other defenses to infringement that may
- otherwise be available to you under applicable patent law.
-
- 12. No Surrender of Others' Freedom.
-
- If conditions are imposed on you (whether by court order, agreement or
- otherwise) that contradict the conditions of this License, they do not
- excuse you from the conditions of this License. If you cannot convey a
- covered work so as to satisfy simultaneously your obligations under this
- License and any other pertinent obligations, then as a consequence you may
- not convey it at all. For example, if you agree to terms that obligate you
- to collect a royalty for further conveying from those to whom you convey
- the Program, the only way you could satisfy both those terms and this
- License would be to refrain entirely from conveying the Program.
-
- 13. Remote Network Interaction; Use with the GNU General Public License.
-
- Notwithstanding any other provision of this License, if you modify the
- Program, your modified version must prominently offer all users
- interacting with it remotely through a computer network (if your version
- supports such interaction) an opportunity to receive the Corresponding
- Source of your version by providing access to the Corresponding Source
- from a network server at no charge, through some standard or customary
- means of facilitating copying of software. This Corresponding Source
- shall include the Corresponding Source for any work covered by version 3
- of the GNU General Public License that is incorporated pursuant to the
- following paragraph.
-
- Notwithstanding any other provision of this License, you have
- permission to link or combine any covered work with a work licensed
- under version 3 of the GNU General Public License into a single
- combined work, and to convey the resulting work. The terms of this
- License will continue to apply to the part which is the covered work,
- but the work with which it is combined will remain governed by version
- 3 of the GNU General Public License.
-
- 14. Revised Versions of this License.
-
- The Free Software Foundation may publish revised and/or new versions of
- the GNU Affero General Public License from time to time. Such new versions
- will be similar in spirit to the present version, but may differ in detail to
- address new problems or concerns.
-
- Each version is given a distinguishing version number. If the
- Program specifies that a certain numbered version of the GNU Affero General
- Public License "or any later version" applies to it, you have the
- option of following the terms and conditions either of that numbered
- version or of any later version published by the Free Software
- Foundation. If the Program does not specify a version number of the
- GNU Affero General Public License, you may choose any version ever published
- by the Free Software Foundation.
-
- If the Program specifies that a proxy can decide which future
- versions of the GNU Affero General Public License can be used, that proxy's
- public statement of acceptance of a version permanently authorizes you
- to choose that version for the Program.
-
- Later license versions may give you additional or different
- permissions. However, no additional obligations are imposed on any
- author or copyright holder as a result of your choosing to follow a
- later version.
-
- 15. Disclaimer of Warranty.
-
- THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
- APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
- HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
- OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
- THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
- PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
- IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
- ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
-
- 16. Limitation of Liability.
-
- IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
- WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
- THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
- GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
- USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
- DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
- PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
- EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
- SUCH DAMAGES.
-
- 17. Interpretation of Sections 15 and 16.
-
- If the disclaimer of warranty and limitation of liability provided
- above cannot be given local legal effect according to their terms,
- reviewing courts shall apply local law that most closely approximates
- an absolute waiver of all civil liability in connection with the
- Program, unless a warranty or assumption of liability accompanies a
- copy of the Program in return for a fee.
-
- END OF TERMS AND CONDITIONS
-
- How to Apply These Terms to Your New Programs
-
- If you develop a new program, and you want it to be of the greatest
- possible use to the public, the best way to achieve this is to make it
- free software which everyone can redistribute and change under these terms.
-
- To do so, attach the following notices to the program. It is safest
- to attach them to the start of each source file to most effectively
- state the exclusion of warranty; and each file should have at least
- the "copyright" line and a pointer to where the full notice is found.
-
-
- <one line to give the program's name and a brief idea of what it does.>
- Copyright (C) <year>  <name of author>
-
- This program is free software: you can redistribute it and/or modify
- it under the terms of the GNU Affero General Public License as published by
- the Free Software Foundation, either version 3 of the License, or
- (at your option) any later version.
-
- This program is distributed in the hope that it will be useful,
- but WITHOUT ANY WARRANTY; without even the implied warranty of
- MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
- GNU Affero General Public License for more details.
-
- You should have received a copy of the GNU Affero General Public License
- along with this program. If not, see <https://www.gnu.org/licenses/>.
-
- Also add information on how to contact you by electronic and paper mail.
-
- If your software can interact with users remotely through a computer
- network, you should also make sure that it provides a way for users to
- get its source. For example, if your program is a web application, its
- interface could display a "Source" link that leads users to an archive
- of the code. There are many ways you could offer source, and different
- solutions will be better for different programs; see section 13 for the
- specific requirements.
-
- You should also get your employer (if you work as a programmer) or school,
- if any, to sign a "copyright disclaimer" for the program, if necessary.
- For more information on this, and how to apply and follow the GNU AGPL, see
- <https://www.gnu.org/licenses/>.
-
-# If you want to use GPL v2 or later for the /debian/* files use
-# the following clauses, or change it to suit. Delete these two lines
-Files: debian/*
-Copyright: 2015 Adfinis SyGroup AG
-License: GPL-2+
- This package is free software; you can redistribute it and/or modify
- it under the terms of the GNU General Public License as published by
- the Free Software Foundation; either version 2 of the License, or
- (at your option) any later version.
- .
- This package is distributed in the hope that it will be useful,
- but WITHOUT ANY WARRANTY; without even the implied warranty of
- MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
- GNU General Public License for more details.
- .
- You should have received a copy of the GNU General Public License
- along with this program. If not, see
- <http://www.gnu.org/licenses/>.
- On Debian systems, the complete text of the GNU General
- Public License version 2 can be found in "/usr/share/common-licenses/GPL-2".
-
-# Please also look if there are files or directories which have a
-# different copyright/license attached and list them here.
-# Please avoid picking license terms that are more restrictive than the
-# packaged work, as it may make Debian's contributions unacceptable upstream.
diff --git a/debian/docs b/debian/docs
deleted file mode 100644
index b9df965..0000000
--- a/debian/docs
+++ /dev/null
@@ -1,2 +0,0 @@
-README.rst
-doc/format.rst
diff --git a/debian/pydist-overrides b/debian/pydist-overrides
deleted file mode 100644
index 2380247..0000000
--- a/debian/pydist-overrides
+++ /dev/null
@@ -1 +0,0 @@
-ipython ipython
diff --git a/debian/rules b/debian/rules
deleted file mode 100755
index f2d26c3..0000000
--- a/debian/rules
+++ /dev/null
@@ -1,7 +0,0 @@
-#!/usr/bin/make -f
-
-export PYBUILD_DESTDIR_python2=debian/python-pyaptly/
-export PYBUILD_DESTDIR_python3=debian/python3-pyaptly/
-
-%:
- dh $@ --with python2,python3 --buildsystem=pybuild
diff --git a/debian/source/format b/debian/source/format
deleted file mode 100644
index 163aaf8..0000000
--- a/debian/source/format
+++ /dev/null
@@ -1 +0,0 @@
-3.0 (quilt)
diff --git a/doc/CHANGELOG.rst b/doc/CHANGELOG.rst
deleted file mode 120000
index e22698b..0000000
--- a/doc/CHANGELOG.rst
+++ /dev/null
@@ -1 +0,0 @@
-../CHANGELOG.rst
\ No newline at end of file
diff --git a/doc/Makefile b/doc/Makefile
deleted file mode 100644
index c854f3d..0000000
--- a/doc/Makefile
+++ /dev/null
@@ -1,196 +0,0 @@
-# Makefile for Sphinx documentation
-#
-
-# You can set these variables from the command line.
-SPHINXOPTS =
-SPHINXBUILD = sphinx-build
-PAPER =
-BUILDDIR = _build
-
-# User-friendly check for sphinx-build
-ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
-$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
-endif
-
-# Internal variables.
-PAPEROPT_a4 = -D latex_paper_size=a4
-PAPEROPT_letter = -D latex_paper_size=letter
-ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
-# the i18n builder cannot share the environment and doctrees with the others
-I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
-
-.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest coverage gettext
-
-help:
- @echo "Please use \`make ' where is one of"
- @echo " html to make standalone HTML files"
- @echo " dirhtml to make HTML files named index.html in directories"
- @echo " singlehtml to make a single large HTML file"
- @echo " pickle to make pickle files"
- @echo " json to make JSON files"
- @echo " htmlhelp to make HTML files and a HTML help project"
- @echo " qthelp to make HTML files and a qthelp project"
- @echo " applehelp to make an Apple Help Book"
- @echo " devhelp to make HTML files and a Devhelp project"
- @echo " epub to make an epub"
- @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
- @echo " latexpdf to make LaTeX files and run them through pdflatex"
- @echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
- @echo " text to make text files"
- @echo " man to make manual pages"
- @echo " texinfo to make Texinfo files"
- @echo " info to make Texinfo files and run them through makeinfo"
- @echo " gettext to make PO message catalogs"
- @echo " changes to make an overview of all changed/added/deprecated items"
- @echo " xml to make Docutils-native XML files"
- @echo " pseudoxml to make pseudoxml-XML files for display purposes"
- @echo " linkcheck to check all external links for integrity"
- @echo " doctest to run all doctests embedded in the documentation (if enabled)"
- @echo " coverage to run coverage check of the documentation (if enabled)"
-
-clean:
- rm -rf $(BUILDDIR)/*
-
-html:
- $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
- @echo
- @echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
-
-dirhtml:
- $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
- @echo
- @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
-
-singlehtml:
- $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
- @echo
- @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
-
-pickle:
- $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
- @echo
- @echo "Build finished; now you can process the pickle files."
-
-json:
- $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
- @echo
- @echo "Build finished; now you can process the JSON files."
-
-htmlhelp:
- $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
- @echo
- @echo "Build finished; now you can run HTML Help Workshop with the" \
- ".hhp project file in $(BUILDDIR)/htmlhelp."
-
-qthelp:
- $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
- @echo
- @echo "Build finished; now you can run "qcollectiongenerator" with the" \
- ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
- @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/PyAptly.qhcp"
- @echo "To view the help file:"
- @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/PyAptly.qhc"
-
-applehelp:
- $(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp
- @echo
- @echo "Build finished. The help book is in $(BUILDDIR)/applehelp."
- @echo "N.B. You won't be able to view it unless you put it in" \
- "~/Library/Documentation/Help or install it in your application" \
- "bundle."
-
-devhelp:
- $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
- @echo
- @echo "Build finished."
- @echo "To view the help file:"
- @echo "# mkdir -p $$HOME/.local/share/devhelp/PyAptly"
- @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/PyAptly"
- @echo "# devhelp"
-
-epub:
- $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
- @echo
- @echo "Build finished. The epub file is in $(BUILDDIR)/epub."
-
-latex:
- $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
- @echo
- @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
- @echo "Run \`make' in that directory to run these through (pdf)latex" \
- "(use \`make latexpdf' here to do that automatically)."
-
-latexpdf:
- $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
- sed -i 's/pdflatex/xelatex/g' $(BUILDDIR)/latex/Makefile
- sed -i '/^\\DeclareUnicodeCharacter/d' $(BUILDDIR)/latex/*.tex
- sed -i '/\\usepackage{hyperref}/d' $(BUILDDIR)/latex/sphinxmanual.cls
- sed -i '/\\usepackage\[Bjarne\]{fncychap}/d' $(BUILDDIR)/latex/*.tex
- @echo "Running LaTeX files through pdflatex..."
- $(MAKE) -C $(BUILDDIR)/latex all-pdf
- @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
-
-latexpdfja:
- $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
- @echo "Running LaTeX files through platex and dvipdfmx..."
- $(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
- @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
-
-text:
- $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
- @echo
- @echo "Build finished. The text files are in $(BUILDDIR)/text."
-
-man:
- $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
- @echo
- @echo "Build finished. The manual pages are in $(BUILDDIR)/man."
-
-texinfo:
- $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
- @echo
- @echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
- @echo "Run \`make' in that directory to run these through makeinfo" \
- "(use \`make info' here to do that automatically)."
-
-info:
- $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
- @echo "Running Texinfo files through makeinfo..."
- make -C $(BUILDDIR)/texinfo info
- @echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
-
-gettext:
- $(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
- @echo
- @echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
-
-changes:
- $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
- @echo
- @echo "The overview file is in $(BUILDDIR)/changes."
-
-linkcheck:
- $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
- @echo
- @echo "Link check complete; look for any errors in the above output " \
- "or in $(BUILDDIR)/linkcheck/output.txt."
-
-doctest:
- $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
- @echo "Testing of doctests in the sources finished, look at the " \
- "results in $(BUILDDIR)/doctest/output.txt."
-
-coverage:
- $(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
- @echo "Testing of coverage in the sources finished, look at the " \
- "results in $(BUILDDIR)/coverage/python.txt."
-
-xml:
- $(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
- @echo
- @echo "Build finished. The XML files are in $(BUILDDIR)/xml."
-
-pseudoxml:
- $(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
- @echo
- @echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."
diff --git a/doc/README.rst b/doc/README.rst
deleted file mode 120000
index 89a0106..0000000
--- a/doc/README.rst
+++ /dev/null
@@ -1 +0,0 @@
-../README.rst
\ No newline at end of file
diff --git a/doc/adsy-sphinx-template.src b/doc/adsy-sphinx-template.src
deleted file mode 160000
index 8faa61d..0000000
--- a/doc/adsy-sphinx-template.src
+++ /dev/null
@@ -1 +0,0 @@
-Subproject commit 8faa61dc94ac36d09f2055a2ac8c7fe159420594
diff --git a/doc/aptly_test.rst b/doc/aptly_test.rst
deleted file mode 100644
index 0c2db55..0000000
--- a/doc/aptly_test.rst
+++ /dev/null
@@ -1,6 +0,0 @@
-==========
-aptly_test
-==========
-
-.. automodule:: pyaptly.aptly_test
- :members:
diff --git a/doc/conf.py b/doc/conf.py
deleted file mode 100644
index e18d449..0000000
--- a/doc/conf.py
+++ /dev/null
@@ -1,318 +0,0 @@
-#!/usr/bin/env python3
-# -*- coding: utf-8 -*-
-#
-# PyAptly documentation build configuration file, created by
-# sphinx-quickstart on Tue Dec 1 18:08:44 2015.
-#
-# This file is execfile()d with the current directory set to its
-# containing dir.
-#
-# Note that not all possible configuration values are present in this
-# autogenerated file.
-#
-# All configuration values have a default; values that are commented out
-# serve to show the default.
-
-import sys
-import os
-import shlex
-
-# If extensions (or modules to document with autodoc) are in another directory,
-# add these directories to sys.path here. If the directory is relative to the
-# documentation root, use os.path.abspath to make it absolute, like shown here.
-#sys.path.insert(0, os.path.abspath('.'))
-
-# -- General configuration ------------------------------------------------
-
-# If your documentation needs a minimal Sphinx version, state it here.
-#needs_sphinx = '1.0'
-
-# Add any Sphinx extension module names here, as strings. They can be
-# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
-# ones.
-extensions = [
- 'sphinx.ext.autodoc',
- 'sphinx.ext.intersphinx',
- 'sphinx.ext.todo',
- 'sphinx.ext.coverage',
- 'sphinx.ext.mathjax',
- 'sphinx.ext.ifconfig',
- 'sphinx.ext.viewcode',
-]
-
-# Add any paths that contain templates here, relative to this directory.
-templates_path = ['_templates']
-
-# The suffix(es) of source filenames.
-# You can specify multiple suffix as a list of string:
-# source_suffix = ['.rst', '.md']
-source_suffix = '.rst'
-
-# The encoding of source files.
-#source_encoding = 'utf-8-sig'
-
-# The master toctree document.
-master_doc = 'index'
-
-# General information about the project.
-project = 'PyAptly'
-copyright = '2015, Lukas Grossar, David Vogt, Jean-Louis Fuchs'
-author = 'Lukas Grossar, David Vogt, Jean-Louis Fuchs'
-
-# The version info for the project you're documenting, acts as replacement for
-# |version| and |release|, also used in various other places throughout the
-# built documents.
-#
-# The short X.Y version.
-__version__ = None
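-# exec'ing version.py (below) defines __version__ without importing the package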
-version_file = "../pyaptly/version.py"
-with open(version_file) as f:
- code = compile(f.read(), version_file, 'exec')
- exec(code)
-version = __version__
-# The full version, including alpha/beta/rc tags.
-release = __version__
-
-# The language for content autogenerated by Sphinx. Refer to documentation
-# for a list of supported languages.
-#
-# This is also used if you do content translation via gettext catalogs.
-# Usually you set "language" from the command line for these cases.
-language = None
-
-# There are two options for replacing |today|: either, you set today to some
-# non-false value, then it is used:
-#today = ''
-# Else, today_fmt is used as the format for a strftime call.
-#today_fmt = '%B %d, %Y'
-
-# List of patterns, relative to source directory, that match files and
-# directories to ignore when looking for source files.
-exclude_patterns = ['_build']
-
-# The reST default role (used for this markup: `text`) to use for all
-# documents.
-#default_role = None
-
-# If true, '()' will be appended to :func: etc. cross-reference text.
-#add_function_parentheses = True
-
-# If true, the current module name will be prepended to all description
-# unit titles (such as .. function::).
-#add_module_names = True
-
-# If true, sectionauthor and moduleauthor directives will be shown in the
-# output. They are ignored by default.
-#show_authors = False
-
-# The name of the Pygments (syntax highlighting) style to use.
-pygments_style = 'sphinx'
-
-# A list of ignored prefixes for module index sorting.
-#modindex_common_prefix = []
-
-# If true, keep warnings as "system message" paragraphs in the built documents.
-#keep_warnings = False
-
-# If true, `todo` and `todoList` produce output, else they produce nothing.
-todo_include_todos = True
-
-
-# -- Options for HTML output ----------------------------------------------
-html_context = {
- 'source_url_prefix':
- "https://github.com/adfinis-sygroup/pyaptly/tree/master/doc/",
- 'source_suffix': ".rst",
-}
-# The theme to use for HTML and HTML Help pages. See the documentation for
-# a list of builtin themes.
-html_theme = 'adsy'
-
-# Theme options are theme-specific and customize the look and feel of a theme
-# further. For a list of options available for each theme, see the
-# documentation.
-#html_theme_options = {}
-
-# Add any paths that contain custom themes here, relative to this directory.
-html_theme_path = [ 'adsy-sphinx-template.src/html' ]
-
-# The name for this set of Sphinx documents. If None, it defaults to
-# "<project> v<release> documentation".
-#html_title = None
-
-# A shorter title for the navigation bar. Default is the same as html_title.
-#html_short_title = None
-
-# The name of an image file (relative to this directory) to place at the top
-# of the sidebar.
-#html_logo = None
-
-# The name of an image file (within the static path) to use as favicon of the
-# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
-# pixels large.
-#html_favicon = None
-
-# Add any paths that contain custom static files (such as style sheets) here,
-# relative to this directory. They are copied after the builtin static files,
-# so a file named "default.css" will overwrite the builtin "default.css".
-html_static_path = ['_static']
-
-# Add any extra paths that contain custom files (such as robots.txt or
-# .htaccess) here, relative to this directory. These files are copied
-# directly to the root of the documentation.
-#html_extra_path = []
-
-# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
-# using the given strftime format.
-#html_last_updated_fmt = '%b %d, %Y'
-
-# If true, SmartyPants will be used to convert quotes and dashes to
-# typographically correct entities.
-#html_use_smartypants = True
-
-# Custom sidebar templates, maps document names to template names.
-#html_sidebars = {}
-
-# Additional templates that should be rendered to pages, maps page names to
-# template names.
-#html_additional_pages = {}
-
-# If false, no module index is generated.
-#html_domain_indices = True
-
-# If false, no index is generated.
-#html_use_index = True
-
-# If true, the index is split into individual pages for each letter.
-#html_split_index = False
-
-# If true, links to the reST sources are added to the pages.
-#html_show_sourcelink = True
-
-# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
-#html_show_sphinx = True
-
-# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
-#html_show_copyright = True
-
-# If true, an OpenSearch description file will be output, and all pages will
-# contain a <link> tag referring to it. The value of this option must be the
-# base URL from which the finished HTML is served.
-#html_use_opensearch = ''
-
-# This is the file name suffix for HTML files (e.g. ".xhtml").
-#html_file_suffix = None
-
-# Language to be used for generating the HTML full-text search index.
-# Sphinx supports the following languages:
-# 'da', 'de', 'en', 'es', 'fi', 'fr', 'hu', 'it', 'ja'
-# 'nl', 'no', 'pt', 'ro', 'ru', 'sv', 'tr'
-#html_search_language = 'en'
-
-# A dictionary with options for the search language support, empty by default.
-# Now only 'ja' uses this config value
-#html_search_options = {'type': 'default'}
-
-# The name of a javascript file (relative to the configuration directory) that
-# implements a search results scorer. If empty, the default will be used.
-#html_search_scorer = 'scorer.js'
-
-# Output file base name for HTML help builder.
-htmlhelp_basename = 'PyAptlydoc'
-
-# -- Options for LaTeX output ---------------------------------------------
-
-latex_additional_files = [
- 'adsy-sphinx-template.src/latex/logo.png',
- 'adsy-sphinx-template.src/latex/sphinx.sty',
- 'adsy-sphinx-template.src/latex/adsy.sty'
-]
-
-
-latex_elements = {
- # The paper size ('letterpaper' or 'a4paper').
- 'papersize': 'a4paper',
-
- # The font size ('10pt', '11pt' or '12pt').
- 'pointsize': '10pt',
-
- # Additional stuff for the LaTeX preamble.
- 'preamble' : r"""
-
- \usepackage{adsy}
-
-
- \renewcommand{\subtitle}{%s}
-
- """ % (project)
-
-}
-
-# Grouping the document tree into LaTeX files. List of tuples
-# (source start file, target name, title,
-# author, documentclass [howto, manual, or own class]).
-latex_documents = [
- (master_doc, 'PyAptly.tex', 'PyAptly Documentation',
- 'Lukas Grossar, David Vogt, Jean-Louis Fuchs', 'manual'),
-]
-
-# The name of an image file (relative to this directory) to place at the top of
-# the title page.
-#latex_logo = None
-
-# For "manual" documents, if this is true, then toplevel headings are parts,
-# not chapters.
-#latex_use_parts = False
-
-# If true, show page references after internal links.
-#latex_show_pagerefs = False
-
-# If true, show URL addresses after external links.
-#latex_show_urls = False
-
-# Documents to append as an appendix to all manuals.
-#latex_appendices = []
-
-# If false, no module index is generated.
-#latex_domain_indices = True
-
-
-# -- Options for manual page output ---------------------------------------
-
-# One entry per manual page. List of tuples
-# (source start file, name, description, authors, manual section).
-man_pages = [
- (master_doc, 'pyaptly', 'PyAptly Documentation',
- [author], 1)
-]
-
-# If true, show URL addresses after external links.
-#man_show_urls = False
-
-
-# -- Options for Texinfo output -------------------------------------------
-
-# Grouping the document tree into Texinfo files. List of tuples
-# (source start file, target name, title, author,
-# dir menu entry, description, category)
-texinfo_documents = [
- (master_doc, 'PyAptly', 'PyAptly Documentation',
- author, 'PyAptly', 'One line description of project.',
- 'Miscellaneous'),
-]
-
-# Documents to append as an appendix to all manuals.
-#texinfo_appendices = []
-
-# If false, no module index is generated.
-#texinfo_domain_indices = True
-
-# How to display URL addresses: 'footnote', 'no', or 'inline'.
-#texinfo_show_urls = 'footnote'
-
-# If true, do not generate a @detailmenu in the "Top" node's menu.
-#texinfo_no_detailmenu = False
-
-
-# Example configuration for intersphinx: refer to the Python standard library.
-intersphinx_mapping = {'https://docs.python.org/': None}
diff --git a/doc/dateround_test.rst b/doc/dateround_test.rst
deleted file mode 100644
index 43db6f7..0000000
--- a/doc/dateround_test.rst
+++ /dev/null
@@ -1,6 +0,0 @@
-==============
-dateround_test
-==============
-
-.. automodule:: pyaptly.dateround_test
- :members:
diff --git a/doc/format.rst b/doc/format.rst
deleted file mode 100644
index 3f4b1b8..0000000
--- a/doc/format.rst
+++ /dev/null
@@ -1,227 +0,0 @@
-======================
-YAML input file format
-======================
-
-The yaml file defines the four main components: **repo**, **mirror**,
-**snapshot** and **publish**.
-
-repo
- is a local repository. PyAptly only creates repositories for you; the rest
- of the interaction with repositories is done via aptly directly.
-
-.. caution::
-
- You have to add at least one package before you can use a repo in snapshots
- or publishes.
-
-mirror
- mirrors a remote repository. PyAptly creates and updates mirrors. Usually
- all interaction is done through PyAptly.
-
-.. caution::
-
- You have to update a mirror (download packages) at least once before you can
- use a mirror in snapshots or publishes.
-
-snapshot
- PyAptly supports daily and weekly snapshots. Snapshots are created at a fixed
- time. If PyAptly is called later than that time, the snapshot will still have
- the timestamp of the time when it should have been created. This is needed to
- find the snapshots in publishes or merges. Snapshots can also merge multiple
- other snapshots. There are also so-called current snapshots, which are always
- updated and republished when PyAptly is run with "snapshot update" or
- "publish update".
-
-publish
- can have **snapshot**, **repo** or **publish** as source. If a publish has a
- current-snapshot as source, it is automatically updated.
-
-.. note::
-
- Current-snapshots are not the same as timestamped snapshots using
- "current" as timestamp. A current-snapshot has a unique name and is updated
- on every PyAptly run.
-
-Defining a mirror
-=================
-
-.. note::
-
- Every config-key ending in an "s" expects a list, but the yaml-reader will
- convert a single value into a list for you.
-
-.. code-block:: yaml
-
- mirror:
- google-chrome:
- components: "main"
- architectures: ["amd64", "i386"]
- distribution: "stable"
- archive: "http://dl.google.com/linux/chrome/deb/"
- gpg-keys: ["7FAC5991"]
- gpg-urls: ["https://dl.google.com/linux/linux_signing_key.pub"]
-
-components
- main, contrib and non-free are the classical components from Debian. Other
- repositories may use this to subdivide the repositories in other ways.
-
-architectures
- is another way of subdividing your repository and should be used
- accordingly; usually there are amd64 and i386.
-
-distribution
- is a distribution name, e.g. squeeze; for flat repositories use ./ instead
- of a distribution name.
-
-archive
- is the URL to download from.
-
-gpg-keys
- a list of gpg-keys that are automatically fetched from the key-server before
- the mirror is created.
-
-gpg-urls
- if the keys are not on a public keyserver pyaptly can download them from URLs
- too.
-
-sources
- if set to True, pyaptly will tell aptly to also download sources.
-
-udeb
- if set to True, use udeb (micro debs), which are stripped-down Debian
- packages intended to save disk space.
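-
-For example, a mirror that also fetches source packages and udebs could be
-defined like this (a sketch combining the keys documented above; the name,
-URL and key are illustrative):
-
-.. code-block:: yaml
-
-    mirror:
-        debian-main:
-            components: "main"
-            architectures: ["amd64", "i386"]
-            distribution: "stable"
-            archive: "http://deb.debian.org/debian/"
-            gpg-keys: ["DEADBEEF"]
-            sources: True
-            udeb: True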
-
-Defining a snapshot
-===================
-
-.. code-block:: yaml
-
- snapshot:
- google-chrome-stable-%T:
- timestamp: {"time": "00:00", "repeat-weekly": "sat"}
- filter:
- source: {"name": "google-chrome-%T", "timestamp": "current"}
- query: "google-chrome-stable"
-
-The name of a snapshot can include the `%T` macro, which is replaced by the
-calculated time of the snapshot.
-
-timestamp
- can contain **time** and **repeat-weekly**. If only **time** is defined it is
- a daily snapshot and is created daily at the given time. If **repeat-weekly**
- is also defined the snapshot will be created only on the given day. Allowed
- values are: 'mon' 'tue' 'wed' 'thu' 'fri' 'sat' 'sun'
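-
-For example, a snapshot created daily at midnight could be defined like this
-(a sketch using only the keys shown above; the name is illustrative):
-
-.. code-block:: yaml
-
-    snapshot:
-        icaclient-%T:
-            timestamp: {"time": "00:00"}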
-
-.. code-block:: yaml
-
- merge:
- - "roche-keyring-latest"
- - {"name": "trusty-main-stable-%T", "timestamp": "current"}
-
-merge
- merges multiple snapshots. An entry can either be a plain snapshot name, in
- this case *roche-keyring-latest*, or a snapshot definition. The definition
- contains the name of the snapshot including a %T macro and a **timestamp**
- which selects the Nth-latest snapshot: "current" is a name for 0 and
- "previous" for 1, but you can also use any other number.
-
-.. caution::
-
- If the Nth-latest snapshot hasn't been created yet, you will see an error,
- but PyAptly should continue.
-
-.. code-block:: yaml
-
- google-chrome-stable-%T:
- timestamp: {"time": "00:00", "repeat-weekly": "sat"}
- filter:
- source: {"name": "google-chrome-%T", "timestamp": "current"}
- query: "google-chrome-stable"
-
-filter
- Filters a snapshot using an aptly query. Define the source using the same
- syntax as in merge. The query uses aptly-query-syntax.
-
-Defining a publish
-==================
-
-.. code-block:: yaml
-
- publish:
- icaclient:
- -
- distribution: "latest"
- architectures: ["amd64", "i386"]
- components: "main"
- repo: "icaclient"
- automatic-update: true
- gpg-key: "7FAC5991"
-
-The name of the publish may include slashes, e.g. "ubuntu/latest".
-
-The sources of a publish can be:
-
-repo
- Name of repo defined in the yaml
-
-.. code-block:: yaml
-
- publish:
- ubuntu/latest:
- -
- distribution: "trusty"
- origin: "Ubuntu"
- architectures: ["amd64", "i386", "source"]
- components: ["main", "restricted", "universe", "multiverse"]
- snapshots:
- - {"name": "trusty-main_roche-keyring-%T", "timestamp": "current"}
- - {"name": "trusty-restricted-%T", "timestamp": "current"}
- - {"name": "trusty-universe-%T", "timestamp": "current"}
- - {"name": "trusty-multiverse-%T", "timestamp": "current"}
- automatic-update: true
-
-snapshots
- A list of snapshots using the same syntax as in merge.
-
-mirror
- Name of a mirror defined in the yaml
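-
-A publish built directly from a mirror might look like this (a sketch; it
-reuses the google-chrome mirror defined earlier, and the exact key
-combination is illustrative):
-
-.. code-block:: yaml
-
-    publish:
-        chrome/latest:
-            -
-                distribution: "stable"
-                components: "main"
-                architectures: ["amd64", "i386"]
-                mirror: "google-chrome"
-                automatic-update: true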
-
-These fields are the same as in the mirror definition:
-
-components
- main, contrib and non-free are the classical components from Debian. Other
- repositories may use this to subdivide the repositories in other ways.
-
-architectures
- is another way of subdividing your repository and should be used
- accordingly; usually there are amd64 and i386.
-
-distribution
- is a distribution name, e.g. squeeze; for flat repositories use ./ instead
- of a distribution name.
-
-Additional fields are:
-
- origin
- Optional field indicating the origin of the repository, a single line of
- free form text.
-
- automatic-update
- If automatic-update is false the publish will only be updated if you
- explicitly name it: "pyaptly publish update ubuntu/stable". If you just
- call "pyaptly publish update", it will stay on the last publish point
- (snapshot).
-
- gpg-key
- The key must exist in the user's gpg database, and if the database has a
- password, the gpg-agent must be active and the password must have been
- entered.
-
- See also gpg-agent.conf::
-
-     default-cache-ttl 31536000 # A Year
-     max-cache-ttl 31536000
-
- skip-contents
- If true, pyaptly will tell aptly to not generate contents index files.
diff --git a/doc/graph_test.rst b/doc/graph_test.rst
deleted file mode 100644
index e112947..0000000
--- a/doc/graph_test.rst
+++ /dev/null
@@ -1,6 +0,0 @@
-==========
-graph_test
-==========
-
-.. automodule:: pyaptly.graph_test
- :members:
diff --git a/doc/helpers_test.rst b/doc/helpers_test.rst
deleted file mode 100644
index 50336e5..0000000
--- a/doc/helpers_test.rst
+++ /dev/null
@@ -1,6 +0,0 @@
-============
-helpers_test
-============
-
-.. automodule:: pyaptly.helpers_test
- :members:
diff --git a/doc/index.rst b/doc/index.rst
deleted file mode 100644
index a048dbf..0000000
--- a/doc/index.rst
+++ /dev/null
@@ -1,26 +0,0 @@
-.. PyAptly documentation master file, created by
- sphinx-quickstart on Tue Dec 1 18:08:44 2015.
- You can adapt this file completely to your liking, but it should at least
- contain the root `toctree` directive.
-
-PyAptly
-=======
-
-Contents:
-
-.. toctree::
- :maxdepth: 3
-
- README
- format
- modules
- CHANGELOG
-
-
-Indices and tables
-==================
-
-* :ref:`genindex`
-* :ref:`modindex`
-* :ref:`search`
-
diff --git a/doc/modules.rst b/doc/modules.rst
deleted file mode 100644
index 2dc17f9..0000000
--- a/doc/modules.rst
+++ /dev/null
@@ -1,14 +0,0 @@
-============
-Internal API
-============
-
-.. toctree::
- :maxdepth: 2
-
- pyaptly
- test
- aptly_test
- dateround_test
- helpers_test
- test_test
- graph_test
diff --git a/doc/pyaptly.rst b/doc/pyaptly.rst
deleted file mode 100644
index 02c308a..0000000
--- a/doc/pyaptly.rst
+++ /dev/null
@@ -1,6 +0,0 @@
-=======
-pyaptly
-=======
-
-.. automodule:: pyaptly
- :members:
diff --git a/doc/test.rst b/doc/test.rst
deleted file mode 100644
index 81530c2..0000000
--- a/doc/test.rst
+++ /dev/null
@@ -1,6 +0,0 @@
-====
-test
-====
-
-.. automodule:: pyaptly.test
- :members:
diff --git a/doc/test_test.rst b/doc/test_test.rst
deleted file mode 100644
index e06ebd5..0000000
--- a/doc/test_test.rst
+++ /dev/null
@@ -1,6 +0,0 @@
-=========
-test_test
-=========
-
-.. automodule:: pyaptly.test_test
- :members:
diff --git a/docker-compose.yml b/docker-compose.yml
new file mode 100644
index 0000000..dd7d501
--- /dev/null
+++ b/docker-compose.yml
@@ -0,0 +1,8 @@
+# See compose/config.toml
+services:
+ testing:
+ image: ghcr.io/adfinis/pyaptly/cache:latest
+ build:
+ context: compose
+ volumes:
+ - ./:/source
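+      # assumed usage: run the test suite inside the container, e.g.
+      #   docker compose run --rm testing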
diff --git a/poetry.lock b/poetry.lock
new file mode 100644
index 0000000..9f42a28
--- /dev/null
+++ b/poetry.lock
@@ -0,0 +1,921 @@
+# This file is automatically @generated by Poetry 1.5.1 and should not be changed by hand.
+
+[[package]]
+name = "attrs"
+version = "23.1.0"
+description = "Classes Without Boilerplate"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"},
+ {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"},
+]
+
+[package.extras]
+cov = ["attrs[tests]", "coverage[toml] (>=5.3)"]
+dev = ["attrs[docs,tests]", "pre-commit"]
+docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
+tests = ["attrs[tests-no-zope]", "zope-interface"]
+tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+
+[[package]]
+name = "black"
+version = "23.12.0"
+description = "The uncompromising code formatter."
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "black-23.12.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:67f19562d367468ab59bd6c36a72b2c84bc2f16b59788690e02bbcb140a77175"},
+ {file = "black-23.12.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:bbd75d9f28a7283b7426160ca21c5bd640ca7cd8ef6630b4754b6df9e2da8462"},
+ {file = "black-23.12.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:593596f699ca2dcbbbdfa59fcda7d8ad6604370c10228223cd6cf6ce1ce7ed7e"},
+ {file = "black-23.12.0-cp310-cp310-win_amd64.whl", hash = "sha256:12d5f10cce8dc27202e9a252acd1c9a426c83f95496c959406c96b785a92bb7d"},
+ {file = "black-23.12.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:e73c5e3d37e5a3513d16b33305713237a234396ae56769b839d7c40759b8a41c"},
+ {file = "black-23.12.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:ba09cae1657c4f8a8c9ff6cfd4a6baaf915bb4ef7d03acffe6a2f6585fa1bd01"},
+ {file = "black-23.12.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ace64c1a349c162d6da3cef91e3b0e78c4fc596ffde9413efa0525456148873d"},
+ {file = "black-23.12.0-cp311-cp311-win_amd64.whl", hash = "sha256:72db37a2266b16d256b3ea88b9affcdd5c41a74db551ec3dd4609a59c17d25bf"},
+ {file = "black-23.12.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:fdf6f23c83078a6c8da2442f4d4eeb19c28ac2a6416da7671b72f0295c4a697b"},
+ {file = "black-23.12.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:39dda060b9b395a6b7bf9c5db28ac87b3c3f48d4fdff470fa8a94ab8271da47e"},
+ {file = "black-23.12.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7231670266ca5191a76cb838185d9be59cfa4f5dd401b7c1c70b993c58f6b1b5"},
+ {file = "black-23.12.0-cp312-cp312-win_amd64.whl", hash = "sha256:193946e634e80bfb3aec41830f5d7431f8dd5b20d11d89be14b84a97c6b8bc75"},
+ {file = "black-23.12.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:bcf91b01ddd91a2fed9a8006d7baa94ccefe7e518556470cf40213bd3d44bbbc"},
+ {file = "black-23.12.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:996650a89fe5892714ea4ea87bc45e41a59a1e01675c42c433a35b490e5aa3f0"},
+ {file = "black-23.12.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bdbff34c487239a63d86db0c9385b27cdd68b1bfa4e706aa74bb94a435403672"},
+ {file = "black-23.12.0-cp38-cp38-win_amd64.whl", hash = "sha256:97af22278043a6a1272daca10a6f4d36c04dfa77e61cbaaf4482e08f3640e9f0"},
+ {file = "black-23.12.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:ead25c273adfad1095a8ad32afdb8304933efba56e3c1d31b0fee4143a1e424a"},
+ {file = "black-23.12.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:c71048345bdbced456cddf1622832276d98a710196b842407840ae8055ade6ee"},
+ {file = "black-23.12.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:81a832b6e00eef2c13b3239d514ea3b7d5cc3eaa03d0474eedcbbda59441ba5d"},
+ {file = "black-23.12.0-cp39-cp39-win_amd64.whl", hash = "sha256:6a82a711d13e61840fb11a6dfecc7287f2424f1ca34765e70c909a35ffa7fb95"},
+ {file = "black-23.12.0-py3-none-any.whl", hash = "sha256:a7c07db8200b5315dc07e331dda4d889a56f6bf4db6a9c2a526fa3166a81614f"},
+ {file = "black-23.12.0.tar.gz", hash = "sha256:330a327b422aca0634ecd115985c1c7fd7bdb5b5a2ef8aa9888a82e2ebe9437a"},
+]
+
+[package.dependencies]
+click = ">=8.0.0"
+mypy-extensions = ">=0.4.3"
+packaging = ">=22.0"
+pathspec = ">=0.9.0"
+platformdirs = ">=2"
+
+[package.extras]
+colorama = ["colorama (>=0.4.3)"]
+d = ["aiohttp (>=3.7.4)", "aiohttp (>=3.7.4,!=3.9.0)"]
+jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
+uvloop = ["uvloop (>=0.15.2)"]
+
+[[package]]
+name = "click"
+version = "8.1.7"
+description = "Composable command line interface toolkit"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "click-8.1.7-py3-none-any.whl", hash = "sha256:ae74fb96c20a0277a1d615f1e4d73c8414f5a98db8b799a7931d1582f3390c28"},
+ {file = "click-8.1.7.tar.gz", hash = "sha256:ca9853ad459e787e2192211578cc907e7594e294c7ccc834310722b41b9ca6de"},
+]
+
+[package.dependencies]
+colorama = {version = "*", markers = "platform_system == \"Windows\""}
+
+[[package]]
+name = "colorama"
+version = "0.4.6"
+description = "Cross-platform colored terminal text."
+optional = false
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
+files = [
+ {file = "colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6"},
+ {file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"},
+]
+
+[[package]]
+name = "docstring-to-markdown"
+version = "0.13"
+description = "On the fly conversion of Python docstrings to markdown"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "docstring-to-markdown-0.13.tar.gz", hash = "sha256:3025c428638ececae920d6d26054546a20335af3504a145327e657e7ad7ce1ce"},
+ {file = "docstring_to_markdown-0.13-py3-none-any.whl", hash = "sha256:aa487059d0883e70e54da25c7b230e918d9e4d40f23d6dfaa2b73e4225b2d7dd"},
+]
+
+[[package]]
+name = "fancycompleter"
+version = "0.9.1"
+description = "colorful TAB completion for Python prompt"
+optional = false
+python-versions = "*"
+files = [
+ {file = "fancycompleter-0.9.1-py3-none-any.whl", hash = "sha256:dd076bca7d9d524cc7f25ec8f35ef95388ffef9ef46def4d3d25e9b044ad7080"},
+ {file = "fancycompleter-0.9.1.tar.gz", hash = "sha256:09e0feb8ae242abdfd7ef2ba55069a46f011814a80fe5476be48f51b00247272"},
+]
+
+[package.dependencies]
+pyreadline = {version = "*", markers = "platform_system == \"Windows\""}
+pyrepl = ">=0.8.2"
+
+[[package]]
+name = "flake8"
+version = "6.1.0"
+description = "the modular source code checker: pep8 pyflakes and co"
+optional = false
+python-versions = ">=3.8.1"
+files = [
+ {file = "flake8-6.1.0-py2.py3-none-any.whl", hash = "sha256:ffdfce58ea94c6580c77888a86506937f9a1a227dfcd15f245d694ae20a6b6e5"},
+ {file = "flake8-6.1.0.tar.gz", hash = "sha256:d5b3857f07c030bdb5bf41c7f53799571d75c4491748a3adcd47de929e34cd23"},
+]
+
+[package.dependencies]
+mccabe = ">=0.7.0,<0.8.0"
+pycodestyle = ">=2.11.0,<2.12.0"
+pyflakes = ">=3.1.0,<3.2.0"
+
+[[package]]
+name = "flake8-bugbear"
+version = "23.12.2"
+description = "A plugin for flake8 finding likely bugs and design problems in your program. Contains warnings that don't belong in pyflakes and pycodestyle."
+optional = false
+python-versions = ">=3.8.1"
+files = [
+ {file = "flake8-bugbear-23.12.2.tar.gz", hash = "sha256:32b2903e22331ae04885dae25756a32a8c666c85142e933f43512a70f342052a"},
+ {file = "flake8_bugbear-23.12.2-py3-none-any.whl", hash = "sha256:83324bad4d90fee4bf64dd69c61aff94debf8073fbd807c8b6a36eec7a2f0719"},
+]
+
+[package.dependencies]
+attrs = ">=19.2.0"
+flake8 = ">=6.0.0"
+
+[package.extras]
+dev = ["coverage", "hypothesis", "hypothesmith (>=0.2)", "pre-commit", "pytest", "tox"]
+
+[[package]]
+name = "flake8-debugger"
+version = "4.1.2"
+description = "ipdb/pdb statement checker plugin for flake8"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "flake8-debugger-4.1.2.tar.gz", hash = "sha256:52b002560941e36d9bf806fca2523dc7fb8560a295d5f1a6e15ac2ded7a73840"},
+ {file = "flake8_debugger-4.1.2-py3-none-any.whl", hash = "sha256:0a5e55aeddcc81da631ad9c8c366e7318998f83ff00985a49e6b3ecf61e571bf"},
+]
+
+[package.dependencies]
+flake8 = ">=3.0"
+pycodestyle = "*"
+
+[[package]]
+name = "flake8-docstrings"
+version = "1.7.0"
+description = "Extension for flake8 which uses pydocstyle to check docstrings"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "flake8_docstrings-1.7.0-py2.py3-none-any.whl", hash = "sha256:51f2344026da083fc084166a9353f5082b01f72901df422f74b4d953ae88ac75"},
+ {file = "flake8_docstrings-1.7.0.tar.gz", hash = "sha256:4c8cc748dc16e6869728699e5d0d685da9a10b0ea718e090b1ba088e67a941af"},
+]
+
+[package.dependencies]
+flake8 = ">=3"
+pydocstyle = ">=2.1"
+
+[[package]]
+name = "flake8-isort"
+version = "6.1.1"
+description = "flake8 plugin that integrates isort"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "flake8_isort-6.1.1-py3-none-any.whl", hash = "sha256:0fec4dc3a15aefbdbe4012e51d5531a2eb5fa8b981cdfbc882296a59b54ede12"},
+ {file = "flake8_isort-6.1.1.tar.gz", hash = "sha256:c1f82f3cf06a80c13e1d09bfae460e9666255d5c780b859f19f8318d420370b3"},
+]
+
+[package.dependencies]
+flake8 = "*"
+isort = ">=5.0.0,<6"
+
+[package.extras]
+test = ["pytest"]
+
+[[package]]
+name = "flake8-string-format"
+version = "0.3.0"
+description = "string format checker, plugin for flake8"
+optional = false
+python-versions = "*"
+files = [
+ {file = "flake8-string-format-0.3.0.tar.gz", hash = "sha256:65f3da786a1461ef77fca3780b314edb2853c377f2e35069723348c8917deaa2"},
+ {file = "flake8_string_format-0.3.0-py2.py3-none-any.whl", hash = "sha256:812ff431f10576a74c89be4e85b8e075a705be39bc40c4b4278b5b13e2afa9af"},
+]
+
+[package.dependencies]
+flake8 = "*"
+
+[[package]]
+name = "flake8-tuple"
+version = "0.4.1"
+description = "Check code for 1 element tuple."
+optional = false
+python-versions = "*"
+files = [
+ {file = "flake8_tuple-0.4.1-py2.py3-none-any.whl", hash = "sha256:d828cc8e461c50cacca116e9abb0c9e3be565e8451d3f5c00578c63670aae680"},
+ {file = "flake8_tuple-0.4.1.tar.gz", hash = "sha256:8a1b42aab134ef4c3fef13c6a8f383363f158b19fbc165bd91aed9c51851a61d"},
+]
+
+[package.dependencies]
+flake8 = "*"
+six = "*"
+
+[[package]]
+name = "freezegun"
+version = "1.3.1"
+description = "Let your Python tests travel through time"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "freezegun-1.3.1-py3-none-any.whl", hash = "sha256:065e77a12624d05531afa87ade12a0b9bdb53495c4573893252a055b545ce3ea"},
+ {file = "freezegun-1.3.1.tar.gz", hash = "sha256:48984397b3b58ef5dfc645d6a304b0060f612bcecfdaaf45ce8aff0077a6cb6a"},
+]
+
+[package.dependencies]
+python-dateutil = ">=2.7"
+
+[[package]]
+name = "hypothesis"
+version = "6.92.0"
+description = "A library for property-based testing"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "hypothesis-6.92.0-py3-none-any.whl", hash = "sha256:d4577f99b912acc725bea684899b7cb62591a0412e2446c618be0b4855995276"},
+ {file = "hypothesis-6.92.0.tar.gz", hash = "sha256:65b72c7dc7da3e16144db54fe093c6b74a33631b933a8063eb754c5a61361ae6"},
+]
+
+[package.dependencies]
+attrs = ">=22.2.0"
+sortedcontainers = ">=2.1.0,<3.0.0"
+
+[package.extras]
+all = ["backports.zoneinfo (>=0.2.1)", "black (>=19.10b0)", "click (>=7.0)", "django (>=3.2)", "dpcontracts (>=0.4)", "lark (>=0.10.1)", "libcst (>=0.3.16)", "numpy (>=1.17.3)", "pandas (>=1.1)", "pytest (>=4.6)", "python-dateutil (>=1.4)", "pytz (>=2014.1)", "redis (>=3.0.0)", "rich (>=9.0.0)", "tzdata (>=2023.3)"]
+cli = ["black (>=19.10b0)", "click (>=7.0)", "rich (>=9.0.0)"]
+codemods = ["libcst (>=0.3.16)"]
+dateutil = ["python-dateutil (>=1.4)"]
+django = ["django (>=3.2)"]
+dpcontracts = ["dpcontracts (>=0.4)"]
+ghostwriter = ["black (>=19.10b0)"]
+lark = ["lark (>=0.10.1)"]
+numpy = ["numpy (>=1.17.3)"]
+pandas = ["pandas (>=1.1)"]
+pytest = ["pytest (>=4.6)"]
+pytz = ["pytz (>=2014.1)"]
+redis = ["redis (>=3.0.0)"]
+zoneinfo = ["backports.zoneinfo (>=0.2.1)", "tzdata (>=2023.3)"]
+
+[[package]]
+name = "iniconfig"
+version = "2.0.0"
+description = "brain-dead simple config-ini parsing"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374"},
+ {file = "iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3"},
+]
+
+[[package]]
+name = "isort"
+version = "5.13.2"
+description = "A Python utility / library to sort Python imports."
+optional = false
+python-versions = ">=3.8.0"
+files = [
+ {file = "isort-5.13.2-py3-none-any.whl", hash = "sha256:8ca5e72a8d85860d5a3fa69b8745237f2939afe12dbf656afbcb47fe72d947a6"},
+ {file = "isort-5.13.2.tar.gz", hash = "sha256:48fdfcb9face5d58a4f6dde2e72a1fb8dcaf8ab26f95ab49fab84c2ddefb0109"},
+]
+
+[package.extras]
+colors = ["colorama (>=0.4.6)"]
+
+[[package]]
+name = "jedi"
+version = "0.19.1"
+description = "An autocompletion tool for Python that can be used for text editors."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "jedi-0.19.1-py2.py3-none-any.whl", hash = "sha256:e983c654fe5c02867aef4cdfce5a2fbb4a50adc0af145f70504238f18ef5e7e0"},
+ {file = "jedi-0.19.1.tar.gz", hash = "sha256:cf0496f3651bc65d7174ac1b7d043eff454892c708a87d1b683e57b569927ffd"},
+]
+
+[package.dependencies]
+parso = ">=0.8.3,<0.9.0"
+
+[package.extras]
+docs = ["Jinja2 (==2.11.3)", "MarkupSafe (==1.1.1)", "Pygments (==2.8.1)", "alabaster (==0.7.12)", "babel (==2.9.1)", "chardet (==4.0.0)", "commonmark (==0.8.1)", "docutils (==0.17.1)", "future (==0.18.2)", "idna (==2.10)", "imagesize (==1.2.0)", "mock (==1.0.1)", "packaging (==20.9)", "pyparsing (==2.4.7)", "pytz (==2021.1)", "readthedocs-sphinx-ext (==2.1.4)", "recommonmark (==0.5.0)", "requests (==2.25.1)", "six (==1.15.0)", "snowballstemmer (==2.1.0)", "sphinx (==1.8.5)", "sphinx-rtd-theme (==0.4.3)", "sphinxcontrib-serializinghtml (==1.1.4)", "sphinxcontrib-websupport (==1.2.4)", "urllib3 (==1.26.4)"]
+qa = ["flake8 (==5.0.4)", "mypy (==0.971)", "types-setuptools (==67.2.0.1)"]
+testing = ["Django", "attrs", "colorama", "docopt", "pytest (<7.0.0)"]
+
+[[package]]
+name = "mccabe"
+version = "0.7.0"
+description = "McCabe checker, plugin for flake8"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "mccabe-0.7.0-py2.py3-none-any.whl", hash = "sha256:6c2d30ab6be0e4a46919781807b4f0d834ebdd6c6e3dca0bda5a15f863427b6e"},
+ {file = "mccabe-0.7.0.tar.gz", hash = "sha256:348e0240c33b60bbdf4e523192ef919f28cb2c3d7d5c7794f74009290f236325"},
+]
+
+[[package]]
+name = "mock"
+version = "5.1.0"
+description = "Rolling backport of unittest.mock for all Pythons"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "mock-5.1.0-py3-none-any.whl", hash = "sha256:18c694e5ae8a208cdb3d2c20a993ca1a7b0efa258c247a1e565150f477f83744"},
+ {file = "mock-5.1.0.tar.gz", hash = "sha256:5e96aad5ccda4718e0a229ed94b2024df75cc2d55575ba5762d31f5767b8767d"},
+]
+
+[package.extras]
+build = ["blurb", "twine", "wheel"]
+docs = ["sphinx"]
+test = ["pytest", "pytest-cov"]
+
+[[package]]
+name = "mypy"
+version = "1.7.1"
+description = "Optional static typing for Python"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "mypy-1.7.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:12cce78e329838d70a204293e7b29af9faa3ab14899aec397798a4b41be7f340"},
+ {file = "mypy-1.7.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:1484b8fa2c10adf4474f016e09d7a159602f3239075c7bf9f1627f5acf40ad49"},
+ {file = "mypy-1.7.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:31902408f4bf54108bbfb2e35369877c01c95adc6192958684473658c322c8a5"},
+ {file = "mypy-1.7.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:f2c2521a8e4d6d769e3234350ba7b65ff5d527137cdcde13ff4d99114b0c8e7d"},
+ {file = "mypy-1.7.1-cp310-cp310-win_amd64.whl", hash = "sha256:fcd2572dd4519e8a6642b733cd3a8cfc1ef94bafd0c1ceed9c94fe736cb65b6a"},
+ {file = "mypy-1.7.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:4b901927f16224d0d143b925ce9a4e6b3a758010673eeded9b748f250cf4e8f7"},
+ {file = "mypy-1.7.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2f7f6985d05a4e3ce8255396df363046c28bea790e40617654e91ed580ca7c51"},
+ {file = "mypy-1.7.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:944bdc21ebd620eafefc090cdf83158393ec2b1391578359776c00de00e8907a"},
+ {file = "mypy-1.7.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:9c7ac372232c928fff0645d85f273a726970c014749b924ce5710d7d89763a28"},
+ {file = "mypy-1.7.1-cp311-cp311-win_amd64.whl", hash = "sha256:f6efc9bd72258f89a3816e3a98c09d36f079c223aa345c659622f056b760ab42"},
+ {file = "mypy-1.7.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:6dbdec441c60699288adf051f51a5d512b0d818526d1dcfff5a41f8cd8b4aaf1"},
+ {file = "mypy-1.7.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:4fc3d14ee80cd22367caaaf6e014494415bf440980a3045bf5045b525680ac33"},
+ {file = "mypy-1.7.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2c6e4464ed5f01dc44dc9821caf67b60a4e5c3b04278286a85c067010653a0eb"},
+ {file = "mypy-1.7.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:d9b338c19fa2412f76e17525c1b4f2c687a55b156320acb588df79f2e6fa9fea"},
+ {file = "mypy-1.7.1-cp312-cp312-win_amd64.whl", hash = "sha256:204e0d6de5fd2317394a4eff62065614c4892d5a4d1a7ee55b765d7a3d9e3f82"},
+ {file = "mypy-1.7.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:84860e06ba363d9c0eeabd45ac0fde4b903ad7aa4f93cd8b648385a888e23200"},
+ {file = "mypy-1.7.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:8c5091ebd294f7628eb25ea554852a52058ac81472c921150e3a61cdd68f75a7"},
+ {file = "mypy-1.7.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:40716d1f821b89838589e5b3106ebbc23636ffdef5abc31f7cd0266db936067e"},
+ {file = "mypy-1.7.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:5cf3f0c5ac72139797953bd50bc6c95ac13075e62dbfcc923571180bebb662e9"},
+ {file = "mypy-1.7.1-cp38-cp38-win_amd64.whl", hash = "sha256:78e25b2fd6cbb55ddfb8058417df193f0129cad5f4ee75d1502248e588d9e0d7"},
+ {file = "mypy-1.7.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:75c4d2a6effd015786c87774e04331b6da863fc3fc4e8adfc3b40aa55ab516fe"},
+ {file = "mypy-1.7.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:2643d145af5292ee956aa0a83c2ce1038a3bdb26e033dadeb2f7066fb0c9abce"},
+ {file = "mypy-1.7.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:75aa828610b67462ffe3057d4d8a4112105ed211596b750b53cbfe182f44777a"},
+ {file = "mypy-1.7.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ee5d62d28b854eb61889cde4e1dbc10fbaa5560cb39780c3995f6737f7e82120"},
+ {file = "mypy-1.7.1-cp39-cp39-win_amd64.whl", hash = "sha256:72cf32ce7dd3562373f78bd751f73c96cfb441de147cc2448a92c1a308bd0ca6"},
+ {file = "mypy-1.7.1-py3-none-any.whl", hash = "sha256:f7c5d642db47376a0cc130f0de6d055056e010debdaf0707cd2b0fc7e7ef30ea"},
+ {file = "mypy-1.7.1.tar.gz", hash = "sha256:fcb6d9afb1b6208b4c712af0dafdc650f518836065df0d4fb1d800f5d6773db2"},
+]
+
+[package.dependencies]
+mypy-extensions = ">=1.0.0"
+typing-extensions = ">=4.1.0"
+
+[package.extras]
+dmypy = ["psutil (>=4.0)"]
+install-types = ["pip"]
+mypyc = ["setuptools (>=50)"]
+reports = ["lxml"]
+
+[[package]]
+name = "mypy-extensions"
+version = "1.0.0"
+description = "Type system extensions for programs checked with the mypy type checker."
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "mypy_extensions-1.0.0-py3-none-any.whl", hash = "sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d"},
+ {file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
+]
+
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
+[[package]]
+name = "parso"
+version = "0.8.3"
+description = "A Python Parser"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "parso-0.8.3-py2.py3-none-any.whl", hash = "sha256:c001d4636cd3aecdaf33cbb40aebb59b094be2a74c556778ef5576c175e19e75"},
+ {file = "parso-0.8.3.tar.gz", hash = "sha256:8c07be290bb59f03588915921e29e8a50002acaf2cdc5fa0e0114f91709fafa0"},
+]
+
+[package.extras]
+qa = ["flake8 (==3.8.3)", "mypy (==0.782)"]
+testing = ["docopt", "pytest (<6.0.0)"]
+
+[[package]]
+name = "pathspec"
+version = "0.12.1"
+description = "Utility library for gitignore style pattern matching of file paths."
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08"},
+ {file = "pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712"},
+]
+
+[[package]]
+name = "pdbpp"
+version = "0.10.3"
+description = "pdb++, a drop-in replacement for pdb"
+optional = false
+python-versions = "*"
+files = [
+ {file = "pdbpp-0.10.3-py2.py3-none-any.whl", hash = "sha256:79580568e33eb3d6f6b462b1187f53e10cd8e4538f7d31495c9181e2cf9665d1"},
+ {file = "pdbpp-0.10.3.tar.gz", hash = "sha256:d9e43f4fda388eeb365f2887f4e7b66ac09dce9b6236b76f63616530e2f669f5"},
+]
+
+[package.dependencies]
+fancycompleter = ">=0.8"
+pygments = "*"
+wmctrl = "*"
+
+[package.extras]
+funcsigs = ["funcsigs"]
+testing = ["funcsigs", "pytest"]
+
+[[package]]
+name = "platformdirs"
+version = "4.1.0"
+description = "A small Python package for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "platformdirs-4.1.0-py3-none-any.whl", hash = "sha256:11c8f37bcca40db96d8144522d925583bdb7a31f7b0e37e3ed4318400a8e2380"},
+ {file = "platformdirs-4.1.0.tar.gz", hash = "sha256:906d548203468492d432bcb294d4bc2fff751bf84971fbb2c10918cc206ee420"},
+]
+
+[package.extras]
+docs = ["furo (>=2023.7.26)", "proselint (>=0.13)", "sphinx (>=7.1.1)", "sphinx-autodoc-typehints (>=1.24)"]
+test = ["appdirs (==1.4.4)", "covdefaults (>=2.3)", "pytest (>=7.4)", "pytest-cov (>=4.1)", "pytest-mock (>=3.11.1)"]
+
+[[package]]
+name = "pluggy"
+version = "1.3.0"
+description = "plugin and hook calling mechanisms for python"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "pluggy-1.3.0-py3-none-any.whl", hash = "sha256:d89c696a773f8bd377d18e5ecda92b7a3793cbe66c87060a6fb58c7b6e1061f7"},
+ {file = "pluggy-1.3.0.tar.gz", hash = "sha256:cf61ae8f126ac6f7c451172cf30e3e43d3ca77615509771b3a984a0730651e12"},
+]
+
+[package.extras]
+dev = ["pre-commit", "tox"]
+testing = ["pytest", "pytest-benchmark"]
+
+[[package]]
+name = "pretty-dump"
+version = "3.0"
+description = "Diff and dump anything"
+optional = false
+python-versions = "*"
+files = []
+develop = false
+
+[package.dependencies]
+six = "*"
+
+[package.source]
+type = "git"
+url = "https://github.com/adfinis/freeze"
+reference = "HEAD"
+resolved_reference = "a5bd2bdfc68d46df01695079886b3818477f3137"
+
+[[package]]
+name = "pycodestyle"
+version = "2.11.1"
+description = "Python style guide checker"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "pycodestyle-2.11.1-py2.py3-none-any.whl", hash = "sha256:44fe31000b2d866f2e41841b18528a505fbd7fef9017b04eff4e2648a0fadc67"},
+ {file = "pycodestyle-2.11.1.tar.gz", hash = "sha256:41ba0e7afc9752dfb53ced5489e89f8186be00e599e712660695b7a75ff2663f"},
+]
+
+[[package]]
+name = "pydocstyle"
+version = "6.3.0"
+description = "Python docstring style checker"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "pydocstyle-6.3.0-py3-none-any.whl", hash = "sha256:118762d452a49d6b05e194ef344a55822987a462831ade91ec5c06fd2169d019"},
+ {file = "pydocstyle-6.3.0.tar.gz", hash = "sha256:7ce43f0c0ac87b07494eb9c0b462c0b73e6ff276807f204d6b53edc72b7e44e1"},
+]
+
+[package.dependencies]
+snowballstemmer = ">=2.2.0"
+
+[package.extras]
+toml = ["tomli (>=1.2.3)"]
+
+[[package]]
+name = "pyflakes"
+version = "3.1.0"
+description = "passive checker of Python programs"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "pyflakes-3.1.0-py2.py3-none-any.whl", hash = "sha256:4132f6d49cb4dae6819e5379898f2b8cce3c5f23994194c24b77d5da2e36f774"},
+ {file = "pyflakes-3.1.0.tar.gz", hash = "sha256:a0aae034c444db0071aa077972ba4768d40c830d9539fd45bf4cd3f8f6992efc"},
+]
+
+[[package]]
+name = "pygments"
+version = "2.17.2"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "pygments-2.17.2-py3-none-any.whl", hash = "sha256:b27c2826c47d0f3219f29554824c30c5e8945175d888647acd804ddd04af846c"},
+ {file = "pygments-2.17.2.tar.gz", hash = "sha256:da46cec9fd2de5be3a8a784f434e4c4ab670b4ff54d605c4c2717e9d49c4c367"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+windows-terminal = ["colorama (>=0.4.6)"]
+
+[[package]]
+name = "pyreadline"
+version = "2.1"
+description = "A python implmementation of GNU readline."
+optional = false
+python-versions = "*"
+files = [
+ {file = "pyreadline-2.1.zip", hash = "sha256:4530592fc2e85b25b1a9f79664433da09237c1a270e4d78ea5aa3a2c7229e2d1"},
+]
+
+[[package]]
+name = "pyrepl"
+version = "0.9.0"
+description = "A library for building flexible command line interfaces"
+optional = false
+python-versions = "*"
+files = [
+ {file = "pyrepl-0.9.0.tar.gz", hash = "sha256:292570f34b5502e871bbb966d639474f2b57fbfcd3373c2d6a2f3d56e681a775"},
+]
+
+[[package]]
+name = "pytest"
+version = "7.4.3"
+description = "pytest: simple powerful testing with Python"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "pytest-7.4.3-py3-none-any.whl", hash = "sha256:0d009c083ea859a71b76adf7c1d502e4bc170b80a8ef002da5806527b9591fac"},
+ {file = "pytest-7.4.3.tar.gz", hash = "sha256:d989d136982de4e3b29dabcc838ad581c64e8ed52c11fbe86ddebd9da0818cd5"},
+]
+
+[package.dependencies]
+colorama = {version = "*", markers = "sys_platform == \"win32\""}
+iniconfig = "*"
+packaging = "*"
+pluggy = ">=0.12,<2.0"
+
+[package.extras]
+testing = ["argcomplete", "attrs (>=19.2.0)", "hypothesis (>=3.56)", "mock", "nose", "pygments (>=2.7.2)", "requests", "setuptools", "xmlschema"]
+
+[[package]]
+name = "python-dateutil"
+version = "2.8.2"
+description = "Extensions to the standard Python datetime module"
+optional = false
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7"
+files = [
+ {file = "python-dateutil-2.8.2.tar.gz", hash = "sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86"},
+ {file = "python_dateutil-2.8.2-py2.py3-none-any.whl", hash = "sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9"},
+]
+
+[package.dependencies]
+six = ">=1.5"
+
+[[package]]
+name = "python-lsp-black"
+version = "1.3.0"
+description = "Black plugin for the Python LSP Server"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "python-lsp-black-1.3.0.tar.gz", hash = "sha256:5aa257e9e7b7e5a2316ef2a9fbcd242e82e0f695bf1622e31c0bf5cd69e6113f"},
+ {file = "python_lsp_black-1.3.0-py3-none-any.whl", hash = "sha256:5f583b4395d8d048885974095088ab81e36e501de369cc49a621a82473bb9070"},
+]
+
+[package.dependencies]
+black = ">=22.3.0"
+python-lsp-server = ">=1.4.0"
+
+[package.extras]
+dev = ["flake8", "isort (>=5.0)", "mypy", "pre-commit", "pytest", "types-pkg-resources", "types-setuptools"]
+
+[[package]]
+name = "python-lsp-isort"
+version = "0.1"
+description = "isort plugin for the Python LSP Server"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "python-lsp-isort-0.1.tar.gz", hash = "sha256:f02948bc8e7549905032100e772f03464f7548afa96f07d744ff1f93cc58339a"},
+]
+
+[package.dependencies]
+isort = ">=5.0"
+python-lsp-server = "*"
+
+[package.extras]
+dev = ["pytest"]
+
+[[package]]
+name = "python-lsp-jsonrpc"
+version = "1.1.2"
+description = "JSON RPC 2.0 server library"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "python-lsp-jsonrpc-1.1.2.tar.gz", hash = "sha256:4688e453eef55cd952bff762c705cedefa12055c0aec17a06f595bcc002cc912"},
+ {file = "python_lsp_jsonrpc-1.1.2-py3-none-any.whl", hash = "sha256:7339c2e9630ae98903fdaea1ace8c47fba0484983794d6aafd0bd8989be2b03c"},
+]
+
+[package.dependencies]
+ujson = ">=3.0.0"
+
+[package.extras]
+test = ["coverage", "pycodestyle", "pyflakes", "pylint", "pytest", "pytest-cov"]
+
+[[package]]
+name = "python-lsp-server"
+version = "1.9.0"
+description = "Python Language Server for the Language Server Protocol"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "python-lsp-server-1.9.0.tar.gz", hash = "sha256:dc0c8298f0222fd66a52aa3170f3a5c8fe3021007a02098bb72f7fd8df353d13"},
+ {file = "python_lsp_server-1.9.0-py3-none-any.whl", hash = "sha256:6b947cf9dc33d7bed9abc936bb173140fcf606b6eb50cf02e27d4cb09f10d3fb"},
+]
+
+[package.dependencies]
+docstring-to-markdown = "*"
+jedi = ">=0.17.2,<0.20.0"
+pluggy = ">=1.0.0"
+python-lsp-jsonrpc = ">=1.1.0,<2.0.0"
+ujson = ">=3.0.0"
+
+[package.extras]
+all = ["autopep8 (>=2.0.4,<2.1.0)", "flake8 (>=6.1.0,<7)", "mccabe (>=0.7.0,<0.8.0)", "pycodestyle (>=2.11.0,<2.12.0)", "pydocstyle (>=6.3.0,<6.4.0)", "pyflakes (>=3.1.0,<3.2.0)", "pylint (>=2.5.0,<3.1)", "rope (>1.2.0)", "whatthepatch (>=1.0.2,<2.0.0)", "yapf (>=0.33.0)"]
+autopep8 = ["autopep8 (>=1.6.0,<2.1.0)"]
+flake8 = ["flake8 (>=6.1.0,<7)"]
+mccabe = ["mccabe (>=0.7.0,<0.8.0)"]
+pycodestyle = ["pycodestyle (>=2.11.0,<2.12.0)"]
+pydocstyle = ["pydocstyle (>=6.3.0,<6.4.0)"]
+pyflakes = ["pyflakes (>=3.1.0,<3.2.0)"]
+pylint = ["pylint (>=2.5.0,<3.1)"]
+rope = ["rope (>1.2.0)"]
+test = ["coverage", "flaky", "matplotlib", "numpy", "pandas", "pylint (>=2.5.0,<3.1)", "pyqt5", "pytest", "pytest-cov"]
+websockets = ["websockets (>=10.3)"]
+yapf = ["whatthepatch (>=1.0.2,<2.0.0)", "yapf (>=0.33.0)"]
+
+[[package]]
+name = "pytz"
+version = "2023.3.post1"
+description = "World timezone definitions, modern and historical"
+optional = false
+python-versions = "*"
+files = [
+ {file = "pytz-2023.3.post1-py2.py3-none-any.whl", hash = "sha256:ce42d816b81b68506614c11e8937d3aa9e41007ceb50bfdcb0749b921bf646c7"},
+ {file = "pytz-2023.3.post1.tar.gz", hash = "sha256:7b4fddbeb94a1eba4b557da24f19fdf9db575192544270a9101d8509f9f43d7b"},
+]
+
+[[package]]
+name = "pyyaml"
+version = "6.0.1"
+description = "YAML parser and emitter for Python"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "PyYAML-6.0.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d858aa552c999bc8a8d57426ed01e40bef403cd8ccdd0fc5f6f04a00414cac2a"},
+ {file = "PyYAML-6.0.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:fd66fc5d0da6d9815ba2cebeb4205f95818ff4b79c3ebe268e75d961704af52f"},
+ {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
+ {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
+ {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
+ {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
+ {file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
+ {file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
+ {file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
+ {file = "PyYAML-6.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f003ed9ad21d6a4713f0a9b5a7a0a79e08dd0f221aff4525a2be4c346ee60aab"},
+ {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
+ {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
+ {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
+ {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
+ {file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
+ {file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
+ {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
+ {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
+ {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
+ {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
+ {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
+ {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
+ {file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
+ {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
+ {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
+ {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:afd7e57eddb1a54f0f1a974bc4391af8bcce0b444685d936840f125cf046d5bd"},
+ {file = "PyYAML-6.0.1-cp36-cp36m-win32.whl", hash = "sha256:fca0e3a251908a499833aa292323f32437106001d436eca0e6e7833256674585"},
+ {file = "PyYAML-6.0.1-cp36-cp36m-win_amd64.whl", hash = "sha256:f22ac1c3cac4dbc50079e965eba2c1058622631e526bd9afd45fedd49ba781fa"},
+ {file = "PyYAML-6.0.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:b1275ad35a5d18c62a7220633c913e1b42d44b46ee12554e5fd39c70a243d6a3"},
+ {file = "PyYAML-6.0.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:18aeb1bf9a78867dc38b259769503436b7c72f7a1f1f4c93ff9a17de54319b27"},
+ {file = "PyYAML-6.0.1-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:596106435fa6ad000c2991a98fa58eeb8656ef2325d7e158344fb33864ed87e3"},
+ {file = "PyYAML-6.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:baa90d3f661d43131ca170712d903e6295d1f7a0f595074f151c0aed377c9b9c"},
+ {file = "PyYAML-6.0.1-cp37-cp37m-win32.whl", hash = "sha256:9046c58c4395dff28dd494285c82ba00b546adfc7ef001486fbf0324bc174fba"},
+ {file = "PyYAML-6.0.1-cp37-cp37m-win_amd64.whl", hash = "sha256:4fb147e7a67ef577a588a0e2c17b6db51dda102c71de36f8549b6816a96e1867"},
+ {file = "PyYAML-6.0.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1d4c7e777c441b20e32f52bd377e0c409713e8bb1386e1099c2415f26e479595"},
+ {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
+ {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
+ {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
+ {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
+ {file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
+ {file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
+ {file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
+ {file = "PyYAML-6.0.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:c8098ddcc2a85b61647b2590f825f3db38891662cfc2fc776415143f599bb859"},
+ {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
+ {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
+ {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
+ {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
+ {file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
+ {file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
+ {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
+]
+
+[[package]]
+name = "six"
+version = "1.16.0"
+description = "Python 2 and 3 compatibility utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*"
+files = [
+ {file = "six-1.16.0-py2.py3-none-any.whl", hash = "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"},
+ {file = "six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926"},
+]
+
+[[package]]
+name = "snowballstemmer"
+version = "2.2.0"
+description = "This package provides 29 stemmers for 28 languages generated from Snowball algorithms."
+optional = false
+python-versions = "*"
+files = [
+ {file = "snowballstemmer-2.2.0-py2.py3-none-any.whl", hash = "sha256:c8e1716e83cc398ae16824e5572ae04e0d9fc2c6b985fb0f900f5f0c96ecba1a"},
+ {file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
+]
+
+[[package]]
+name = "sortedcontainers"
+version = "2.4.0"
+description = "Sorted Containers -- Sorted List, Sorted Dict, Sorted Set"
+optional = false
+python-versions = "*"
+files = [
+ {file = "sortedcontainers-2.4.0-py2.py3-none-any.whl", hash = "sha256:a163dcaede0f1c021485e957a39245190e74249897e2ae4b2aa38595db237ee0"},
+ {file = "sortedcontainers-2.4.0.tar.gz", hash = "sha256:25caa5a06cc30b6b83d11423433f65d1f9d76c4c6a0c90e3379eaa43b9bfdb88"},
+]
+
+[[package]]
+name = "testfixtures"
+version = "7.2.2"
+description = "A collection of helpers and mock objects for unit tests and doc tests."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "testfixtures-7.2.2-py3-none-any.whl", hash = "sha256:c72adaf9c7b06882b0ea379dd8159422ca2be0205eedc7c3fee0074cd32f45fa"},
+ {file = "testfixtures-7.2.2.tar.gz", hash = "sha256:80774aecb0249458275ab783f53093fbe75795ff2b3218d22ce3fff0a12c4da6"},
+]
+
+[package.extras]
+build = ["setuptools-git", "twine", "wheel"]
+docs = ["django", "furo", "sphinx", "sybil (>=3)", "twisted", "zope.component"]
+test = ["django", "mypy", "pytest (>=3.6)", "pytest-cov", "pytest-django", "sybil (>=3)", "twisted", "zope.component"]
+
+[[package]]
+name = "typing-extensions"
+version = "4.9.0"
+description = "Backported and Experimental Type Hints for Python 3.8+"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "typing_extensions-4.9.0-py3-none-any.whl", hash = "sha256:af72aea155e91adfc61c3ae9e0e342dbc0cba726d6cba4b6c72c1f34e47291cd"},
+ {file = "typing_extensions-4.9.0.tar.gz", hash = "sha256:23478f88c37f27d76ac8aee6c905017a143b0b1b886c3c9f66bc2fd94f9f5783"},
+]
+
+[[package]]
+name = "ujson"
+version = "5.9.0"
+description = "Ultra fast JSON encoder and decoder for Python"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "ujson-5.9.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ab71bf27b002eaf7d047c54a68e60230fbd5cd9da60de7ca0aa87d0bccead8fa"},
+ {file = "ujson-5.9.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:7a365eac66f5aa7a7fdf57e5066ada6226700884fc7dce2ba5483538bc16c8c5"},
+ {file = "ujson-5.9.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e015122b337858dba5a3dc3533af2a8fc0410ee9e2374092f6a5b88b182e9fcc"},
+ {file = "ujson-5.9.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:779a2a88c53039bebfbccca934430dabb5c62cc179e09a9c27a322023f363e0d"},
+ {file = "ujson-5.9.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:10ca3c41e80509fd9805f7c149068fa8dbee18872bbdc03d7cca928926a358d5"},
+ {file = "ujson-5.9.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:4a566e465cb2fcfdf040c2447b7dd9718799d0d90134b37a20dff1e27c0e9096"},
+ {file = "ujson-5.9.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:f833c529e922577226a05bc25b6a8b3eb6c4fb155b72dd88d33de99d53113124"},
+ {file = "ujson-5.9.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:b68a0caab33f359b4cbbc10065c88e3758c9f73a11a65a91f024b2e7a1257106"},
+ {file = "ujson-5.9.0-cp310-cp310-win32.whl", hash = "sha256:7cc7e605d2aa6ae6b7321c3ae250d2e050f06082e71ab1a4200b4ae64d25863c"},
+ {file = "ujson-5.9.0-cp310-cp310-win_amd64.whl", hash = "sha256:a6d3f10eb8ccba4316a6b5465b705ed70a06011c6f82418b59278fbc919bef6f"},
+ {file = "ujson-5.9.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3b23bbb46334ce51ddb5dded60c662fbf7bb74a37b8f87221c5b0fec1ec6454b"},
+ {file = "ujson-5.9.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:6974b3a7c17bbf829e6c3bfdc5823c67922e44ff169851a755eab79a3dd31ec0"},
+ {file = "ujson-5.9.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b5964ea916edfe24af1f4cc68488448fbb1ec27a3ddcddc2b236da575c12c8ae"},
+ {file = "ujson-5.9.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8ba7cac47dd65ff88571eceeff48bf30ed5eb9c67b34b88cb22869b7aa19600d"},
+ {file = "ujson-5.9.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6bbd91a151a8f3358c29355a491e915eb203f607267a25e6ab10531b3b157c5e"},
+ {file = "ujson-5.9.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:829a69d451a49c0de14a9fecb2a2d544a9b2c884c2b542adb243b683a6f15908"},
+ {file = "ujson-5.9.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:a807ae73c46ad5db161a7e883eec0fbe1bebc6a54890152ccc63072c4884823b"},
+ {file = "ujson-5.9.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:8fc2aa18b13d97b3c8ccecdf1a3c405f411a6e96adeee94233058c44ff92617d"},
+ {file = "ujson-5.9.0-cp311-cp311-win32.whl", hash = "sha256:70e06849dfeb2548be48fdd3ceb53300640bc8100c379d6e19d78045e9c26120"},
+ {file = "ujson-5.9.0-cp311-cp311-win_amd64.whl", hash = "sha256:7309d063cd392811acc49b5016728a5e1b46ab9907d321ebbe1c2156bc3c0b99"},
+ {file = "ujson-5.9.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:20509a8c9f775b3a511e308bbe0b72897ba6b800767a7c90c5cca59d20d7c42c"},
+ {file = "ujson-5.9.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:b28407cfe315bd1b34f1ebe65d3bd735d6b36d409b334100be8cdffae2177b2f"},
+ {file = "ujson-5.9.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9d302bd17989b6bd90d49bade66943c78f9e3670407dbc53ebcf61271cadc399"},
+ {file = "ujson-5.9.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9f21315f51e0db8ee245e33a649dd2d9dce0594522de6f278d62f15f998e050e"},
+ {file = "ujson-5.9.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5635b78b636a54a86fdbf6f027e461aa6c6b948363bdf8d4fbb56a42b7388320"},
+ {file = "ujson-5.9.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:82b5a56609f1235d72835ee109163c7041b30920d70fe7dac9176c64df87c164"},
+ {file = "ujson-5.9.0-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:5ca35f484622fd208f55041b042d9d94f3b2c9c5add4e9af5ee9946d2d30db01"},
+ {file = "ujson-5.9.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:829b824953ebad76d46e4ae709e940bb229e8999e40881338b3cc94c771b876c"},
+ {file = "ujson-5.9.0-cp312-cp312-win32.whl", hash = "sha256:25fa46e4ff0a2deecbcf7100af3a5d70090b461906f2299506485ff31d9ec437"},
+ {file = "ujson-5.9.0-cp312-cp312-win_amd64.whl", hash = "sha256:60718f1720a61560618eff3b56fd517d107518d3c0160ca7a5a66ac949c6cf1c"},
+ {file = "ujson-5.9.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:d581db9db9e41d8ea0b2705c90518ba623cbdc74f8d644d7eb0d107be0d85d9c"},
+ {file = "ujson-5.9.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:ff741a5b4be2d08fceaab681c9d4bc89abf3c9db600ab435e20b9b6d4dfef12e"},
+ {file = "ujson-5.9.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cdcb02cabcb1e44381221840a7af04433c1dc3297af76fde924a50c3054c708c"},
+ {file = "ujson-5.9.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e208d3bf02c6963e6ef7324dadf1d73239fb7008491fdf523208f60be6437402"},
+ {file = "ujson-5.9.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f4b3917296630a075e04d3d07601ce2a176479c23af838b6cf90a2d6b39b0d95"},
+ {file = "ujson-5.9.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:0c4d6adb2c7bb9eb7c71ad6f6f612e13b264942e841f8cc3314a21a289a76c4e"},
+ {file = "ujson-5.9.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:0b159efece9ab5c01f70b9d10bbb77241ce111a45bc8d21a44c219a2aec8ddfd"},
+ {file = "ujson-5.9.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:f0cb4a7814940ddd6619bdce6be637a4b37a8c4760de9373bac54bb7b229698b"},
+ {file = "ujson-5.9.0-cp38-cp38-win32.whl", hash = "sha256:dc80f0f5abf33bd7099f7ac94ab1206730a3c0a2d17549911ed2cb6b7aa36d2d"},
+ {file = "ujson-5.9.0-cp38-cp38-win_amd64.whl", hash = "sha256:506a45e5fcbb2d46f1a51fead991c39529fc3737c0f5d47c9b4a1d762578fc30"},
+ {file = "ujson-5.9.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:d0fd2eba664a22447102062814bd13e63c6130540222c0aa620701dd01f4be81"},
+ {file = "ujson-5.9.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:bdf7fc21a03bafe4ba208dafa84ae38e04e5d36c0e1c746726edf5392e9f9f36"},
+ {file = "ujson-5.9.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e2f909bc08ce01f122fd9c24bc6f9876aa087188dfaf3c4116fe6e4daf7e194f"},
+ {file = "ujson-5.9.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bd4ea86c2afd41429751d22a3ccd03311c067bd6aeee2d054f83f97e41e11d8f"},
+ {file = "ujson-5.9.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:63fb2e6599d96fdffdb553af0ed3f76b85fda63281063f1cb5b1141a6fcd0617"},
+ {file = "ujson-5.9.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:32bba5870c8fa2a97f4a68f6401038d3f1922e66c34280d710af00b14a3ca562"},
+ {file = "ujson-5.9.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:37ef92e42535a81bf72179d0e252c9af42a4ed966dc6be6967ebfb929a87bc60"},
+ {file = "ujson-5.9.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:f69f16b8f1c69da00e38dc5f2d08a86b0e781d0ad3e4cc6a13ea033a439c4844"},
+ {file = "ujson-5.9.0-cp39-cp39-win32.whl", hash = "sha256:3382a3ce0ccc0558b1c1668950008cece9bf463ebb17463ebf6a8bfc060dae34"},
+ {file = "ujson-5.9.0-cp39-cp39-win_amd64.whl", hash = "sha256:6adef377ed583477cf005b58c3025051b5faa6b8cc25876e594afbb772578f21"},
+ {file = "ujson-5.9.0-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:ffdfebd819f492e48e4f31c97cb593b9c1a8251933d8f8972e81697f00326ff1"},
+ {file = "ujson-5.9.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c4eec2ddc046360d087cf35659c7ba0cbd101f32035e19047013162274e71fcf"},
+ {file = "ujson-5.9.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2fbb90aa5c23cb3d4b803c12aa220d26778c31b6e4b7a13a1f49971f6c7d088e"},
+ {file = "ujson-5.9.0-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ba0823cb70866f0d6a4ad48d998dd338dce7314598721bc1b7986d054d782dfd"},
+ {file = "ujson-5.9.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:4e35d7885ed612feb6b3dd1b7de28e89baaba4011ecdf995e88be9ac614765e9"},
+ {file = "ujson-5.9.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:b048aa93eace8571eedbd67b3766623e7f0acbf08ee291bef7d8106210432427"},
+ {file = "ujson-5.9.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:323279e68c195110ef85cbe5edce885219e3d4a48705448720ad925d88c9f851"},
+ {file = "ujson-5.9.0-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9ac92d86ff34296f881e12aa955f7014d276895e0e4e868ba7fddebbde38e378"},
+ {file = "ujson-5.9.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:6eecbd09b316cea1fd929b1e25f70382917542ab11b692cb46ec9b0a26c7427f"},
+ {file = "ujson-5.9.0-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:473fb8dff1d58f49912323d7cb0859df5585cfc932e4b9c053bf8cf7f2d7c5c4"},
+ {file = "ujson-5.9.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f91719c6abafe429c1a144cfe27883eace9fb1c09a9c5ef1bcb3ae80a3076a4e"},
+ {file = "ujson-5.9.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7b1c0991c4fe256f5fdb19758f7eac7f47caac29a6c57d0de16a19048eb86bad"},
+ {file = "ujson-5.9.0-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2a8ea0f55a1396708e564595aaa6696c0d8af532340f477162ff6927ecc46e21"},
+ {file = "ujson-5.9.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:07e0cfdde5fd91f54cd2d7ffb3482c8ff1bf558abf32a8b953a5d169575ae1cd"},
+ {file = "ujson-5.9.0.tar.gz", hash = "sha256:89cc92e73d5501b8a7f48575eeb14ad27156ad092c2e9fc7e3cf949f07e75532"},
+]
+
+[[package]]
+name = "wmctrl"
+version = "0.5"
+description = "A tool to programmatically control windows inside X"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "wmctrl-0.5-py2.py3-none-any.whl", hash = "sha256:ae695c1863a314c899e7cf113f07c0da02a394b968c4772e1936219d9234ddd7"},
+ {file = "wmctrl-0.5.tar.gz", hash = "sha256:7839a36b6fe9e2d6fd22304e5dc372dbced2116ba41283ea938b2da57f53e962"},
+]
+
+[package.dependencies]
+attrs = "*"
+
+[package.extras]
+test = ["pytest"]
+
+[metadata]
+lock-version = "2.0"
+python-versions = "^3.11"
+content-hash = "961562d63a5a6df669954ff44ed4df0f6b130b26c01336919ee93ac1df85236e"
diff --git a/pyaptly/__init__.py b/pyaptly/__init__.py
index 84cef98..120e5b5 100755
--- a/pyaptly/__init__.py
+++ b/pyaptly/__init__.py
@@ -25,12 +25,11 @@
def init_hypothesis():
"""Initialize hypothesis profile if hypothesis is available"""
     try:  # pragma: no cover
- if b'HYPOTHESIS_PROFILE' in environb:
+ if b"HYPOTHESIS_PROFILE" in environb:
from hypothesis import Settings
- Settings.register_profile("ci", Settings(
- max_examples=10000
- ))
- Settings.load_profile(os.getenv(u'HYPOTHESIS_PROFILE', 'default'))
+
+ Settings.register_profile("ci", Settings(max_examples=10000))
+ Settings.load_profile(os.getenv("HYPOTHESIS_PROFILE", "default"))
except (ImportError, AttributeError): # pragma: no cover
pass
@@ -41,6 +40,7 @@ def get_logger():
:rtype: logging.Logger"""
return logging.getLogger("pyaptly")
+
lg = get_logger()
init_hypothesis()
@@ -65,10 +65,7 @@ def iso_to_gregorian(iso_year, iso_week, iso_day, tzinfo=None):
:param iso_day: ISO day
:type iso_day: int"""
year_start = iso_first_week_start(iso_year, tzinfo)
- return year_start + datetime.timedelta(
- days=iso_day - 1,
- weeks=iso_week - 1
- )
+ return year_start + datetime.timedelta(days=iso_day - 1, weeks=iso_week - 1)
def time_remove_tz(time):
@@ -80,10 +77,10 @@ def time_remove_tz(time):
:rtype: :py:class:`datetime.time`
"""
return datetime.time(
- hour = time.hour,
- minute = time.minute,
- second = time.second,
- microsecond = time.microsecond,
+ hour=time.hour,
+ minute=time.minute,
+ second=time.second,
+ microsecond=time.microsecond,
)
@@ -96,14 +93,14 @@ def time_delta_helper(time): # pragma: no cover
:rtype: :py:class:`datetime.datetime`
"""
return datetime.datetime(
- year = 2000,
- month = 1,
- day = 1,
- hour = time.hour,
- minute = time.minute,
- second = time.second,
- microsecond = time.microsecond,
- tzinfo = time.tzinfo,
+ year=2000,
+ month=1,
+ day=1,
+ hour=time.hour,
+ minute=time.minute,
+ second=time.second,
+ microsecond=time.microsecond,
+ tzinfo=time.tzinfo,
)
@@ -121,18 +118,18 @@ def date_round_weekly(date, day_of_week=1, time=None):
:type time: :py:class:`datetime.time`
:rtype: :py:class:`datetime.datetime`"""
if time:
- time = time_remove_tz(time)
+ time = time_remove_tz(time)
else: # pragma: no cover
- time = datetime.time(hour=0, minute=0)
-
- delta = datetime.timedelta(
- days = day_of_week - 1,
- hours = time.hour,
- minutes = time.minute,
- seconds = time.second,
- microseconds = time.microsecond,
+ time = datetime.time(hour=0, minute=0)
+
+ delta = datetime.timedelta(
+ days=day_of_week - 1,
+ hours=time.hour,
+ minutes=time.minute,
+ seconds=time.second,
+ microseconds=time.microsecond,
)
- raster_date = date - delta
+ raster_date = date - delta
iso = raster_date.isocalendar()
rounded_date = iso_to_gregorian(iso[0], iso[1], 1, date.tzinfo)
return rounded_date + delta
@@ -150,21 +147,21 @@ def date_round_daily(date, time=None):
:type time: :py:class:`datetime.time`
:rtype: :py:class:`datetime.datetime`"""
if time:
- time = time_remove_tz(time)
+ time = time_remove_tz(time)
else: # pragma: no cover
- time = datetime.time(hour=0, minute=0)
- delta = datetime.timedelta(
- hours = time.hour,
- minutes = time.minute,
- seconds = time.second,
- microseconds = time.microsecond,
+ time = datetime.time(hour=0, minute=0)
+ delta = datetime.timedelta(
+ hours=time.hour,
+ minutes=time.minute,
+ seconds=time.second,
+ microseconds=time.microsecond,
)
- raster_date = date - delta
+ raster_date = date - delta
rounded_date = datetime.datetime(
- year = raster_date.year,
- month = raster_date.month,
- day = raster_date.day,
- tzinfo = raster_date.tzinfo
+ year=raster_date.year,
+ month=raster_date.month,
+ day=raster_date.day,
+ tzinfo=raster_date.tzinfo,
)
return rounded_date + delta
@@ -178,16 +175,15 @@ def call_output(args, input_=None):
:type input_: bytes
"""
p = subprocess.Popen(
- args,
- stdin=subprocess.PIPE,
- stdout=subprocess.PIPE,
- stderr=subprocess.PIPE
+ args, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE
)
output, err = p.communicate(input_)
if p.returncode != 0:
raise subprocess.CalledProcessError(
p.returncode,
args,
+ output,
+ err,
)
return (output.decode("UTF-8"), err.decode("UTF-8"))
@@ -208,7 +204,11 @@ def __init__(self, cmd):
self._provides = set()
self._finished = None
self._known_dependency_types = (
- 'snapshot', 'mirror', 'repo', 'publish', 'virtual'
+ "snapshot",
+ "mirror",
+ "repo",
+ "publish",
+ "virtual",
)
def get_provides(self): # pragma: no cover
@@ -235,9 +235,9 @@ def require(self, type_, identifier):
:type identifier: usually str
"""
assert type_ in (
- self._known_dependency_types +
- ('any', ) +
- SystemStateReader.known_dependency_types
+ self._known_dependency_types
+ + ("any",)
+ + SystemStateReader.known_dependency_types
)
self._requires.add((type_, str(identifier)))
@@ -261,10 +261,10 @@ def execute(self):
return self._finished
if not Command.pretend_mode:
- lg.debug('Running command: %s', ' '.join(self.cmd))
+ lg.debug("Running command: %s", " ".join(self.cmd))
self._finished = subprocess.check_call(self.cmd)
else:
- lg.info('Pretending to run command: %s', ' '.join(self.cmd))
+ lg.info("Pretending to run command: %s", " ".join(self.cmd))
return self._finished
@@ -278,9 +278,7 @@ def __hash__(self):
"""Hash of the command.
:rtype: integer"""
- return freeze.recursive_hash(
- (self.cmd, self._requires, self._provides)
- )
+ return freeze.recursive_hash((self.cmd, self._requires, self._provides))
def __eq__(self, other):
"""Equalitity based on the hash, might collide... hmm"""
@@ -312,14 +310,14 @@ def result_node(type_, name):
"""Get the dot representation of a result node."""
return (
'"%s %s" [shape=ellipse]' % (type_, name),
- '"%s %s"' % (type_, name),
+ '"%s %s"' % (type_, name),
)
def cmd_node(command):
"""Get the dot representation of a command node."""
return (
'"%s" [shape=box]' % command.repr_cmd(),
- '"%s"' % command.repr_cmd(),
+ '"%s"' % command.repr_cmd(),
)
for cmd in commands:
@@ -347,7 +345,7 @@ def cmd_node(command):
"""
return template % (
";\n".join(nodes),
- ";\n".join(['%s -> %s' % edge for edge in edges])
+ ";\n".join(["%s -> %s" % edge for edge in edges]),
)
@staticmethod
@@ -363,13 +361,11 @@ def order_commands(commands, has_dependency_cb=lambda x: False):
commands = set([c for c in commands if c is not None])
- lg.debug('Ordering commands: %s', [
- str(cmd) for cmd in commands
- ])
+ lg.debug("Ordering commands: %s", [str(cmd) for cmd in commands])
have_requirements = collections.defaultdict(lambda: 0)
- required_number = collections.defaultdict(lambda: 0)
- scheduled = []
+ required_number = collections.defaultdict(lambda: 0)
+ scheduled = []
for cmd in commands:
for provide in cmd._provides:
@@ -403,34 +399,26 @@ def order_commands(commands, has_dependency_cb=lambda x: False):
break
if can_schedule:
- lg.debug(
- "%s: all dependencies fulfilled" % cmd
- )
+ lg.debug("%s: all dependencies fulfilled" % cmd)
scheduled.append(cmd)
for provide in cmd._provides:
have_requirements[provide] += 1
something_changed = True
- unresolved = [
- cmd
- for cmd in commands
- if cmd not in scheduled
- ]
+ unresolved = [cmd for cmd in commands if cmd not in scheduled]
if len(unresolved) > 0: # pragma: no cover
- raise ValueError('Commands with unresolved deps: %s' % [
- str(cmd) for cmd in unresolved
- ])
+ raise ValueError(
+ "Commands with unresolved deps: %s" % [str(cmd) for cmd in unresolved]
+ )
# Just one last verification before we commence
scheduled_set = set([cmd for cmd in scheduled])
- incoming_set = set([cmd for cmd in commands])
+ incoming_set = set([cmd for cmd in commands])
assert incoming_set == scheduled_set
- lg.info('Reordered commands: %s', [
- str(cmd) for cmd in scheduled
- ])
+ lg.info("Reordered commands: %s", [str(cmd) for cmd in scheduled])
return scheduled
@@ -447,20 +435,14 @@ class FunctionCommand(Command):
def __init__(self, func, *args, **kwargs):
super(FunctionCommand, self).__init__(None)
- assert hasattr(func, '__call__')
- self.cmd = func
- self.args = args
+ assert hasattr(func, "__call__")
+ self.cmd = func
+ self.args = args
self.kwargs = kwargs
def __hash__(self):
return freeze.recursive_hash(
- (
- id(self.cmd),
- self.args,
- self.kwargs,
- self._requires,
- self._provides
- )
+ (id(self.cmd), self.args, self.kwargs, self._requires, self._provides)
)
def execute(self):
@@ -470,7 +452,7 @@ def execute(self):
if not Command.pretend_mode:
lg.debug(
- 'Running code: %s(args=%s, kwargs=%s)',
+ "Running code: %s(args=%s, kwargs=%s)",
self.cmd.__name__,
repr(self.args),
repr(self.kwargs),
@@ -481,7 +463,7 @@ def execute(self):
self._finished = True
else: # pragma: no cover
lg.info(
- 'Pretending to run code: %s(args=%s, kwargs=%s)',
+ "Pretending to run code: %s(args=%s, kwargs=%s)",
self.repr_cmd(),
repr(self.args),
repr(self.kwargs),
@@ -495,7 +477,7 @@ def repr_cmd(self):
:rtype: str"""
# We need to "id" ourselves here so that multiple commands that call a
# function with the same name won't be shown as being equal.
- return '%s|%s' % (self.cmd.__name__, id(self))
+ return "%s|%s" % (self.cmd.__name__, id(self))
def __repr__(self):
return "FunctionCommand<%s requires %s, provides %s>\n" % (
@@ -509,18 +491,17 @@ class SystemStateReader(object):
"""Reads the state from aptly and gpg to find out what operations have to
be performed to reach the state defined in the yml config-file.
"""
- known_dependency_types = (
- 'repo', 'snapshot', 'mirror', 'gpg_key'
- )
+
+ known_dependency_types = ("repo", "snapshot", "mirror", "gpg_key")
def __init__(self):
- self.gpg_keys = set()
- self.mirrors = set()
- self.repos = set()
- self.snapshots = set()
+ self.gpg_keys = set()
+ self.mirrors = set()
+ self.repos = set()
+ self.snapshots = set()
self.snapshot_map = {}
- self.publishes = set()
- self.publish_map = {}
+ self.publishes = set()
+ self.publish_map = {}
def _extract_sources(self, data):
"""
@@ -536,7 +517,7 @@ def _extract_sources(self, data):
sources = []
for line in data.split("\n"):
             # source lines need to start with two spaces
- if entered_sources and line[0:2] != ' ':
+ if entered_sources and line[0:2] != " ":
break
if entered_sources:
@@ -560,14 +541,16 @@ def read(self):
def read_gpg(self):
"""Read all trusted keys in gpg."""
self.gpg_keys = set()
- data, _ = call_output([
+ cmd = [
"gpg",
"--no-default-keyring",
- "--keyring", "trustedkeys.gpg",
+ "--keyring",
+ "trustedkeys.gpg",
"--list-keys",
- "--with-colons"
- ])
- lg.debug('GPG returned: %s', data)
+ "--with-colons",
+ ]
+ data, _ = call_output(cmd)
+ lg.debug("GPG returned: %s", data)
for line in data.split("\n"):
field = line.split(":")
if field[0] in ("pub", "sub"):
@@ -582,18 +565,15 @@ def read_publish_map(self):
# match example: main: test-snapshot [snapshot]
re_snap = re.compile(r"\s+[\w\d-]+\:\s([\w\d-]+)\s\[snapshot\]")
for publish in self.publishes:
-
- prefix, dist = publish.split(' ')
- data, _ = call_output([
- "aptly", "publish", "show", dist, prefix
- ])
+ prefix, dist = publish.split(" ")
+ data, _ = call_output(["aptly", "publish", "show", dist, prefix])
sources = self._extract_sources(data)
matches = [re_snap.match(source) for source in sources]
snapshots = [match.group(1) for match in matches if match]
self.publish_map[publish] = set(snapshots)
- lg.debug('Joined snapshots and publishes: %s', self.publish_map)
+ lg.debug("Joined snapshots and publishes: %s", self.publish_map)
def read_snapshot_map(self):
"""Create a snapshot map. snapshot -> snapshots. This is also called
@@ -602,18 +582,13 @@ def read_snapshot_map(self):
# match example: test-snapshot [snapshot]
re_snap = re.compile(r"\s+([\w\d-]+)\s\[snapshot\]")
for snapshot_outer in self.snapshots:
- data, _ = call_output([
- "aptly", "snapshot", "show", snapshot_outer
- ])
+ data, _ = call_output(["aptly", "snapshot", "show", snapshot_outer])
sources = self._extract_sources(data)
matches = [re_snap.match(source) for source in sources]
snapshots = [match.group(1) for match in matches if match]
self.snapshot_map[snapshot_outer] = set(snapshots)
- lg.debug(
- 'Joined snapshots with self(snapshots): %s',
- self.snapshot_map
- )
+ lg.debug("Joined snapshots with self(snapshots): %s", self.snapshot_map)
def read_publishes(self):
"""Read all available publishes."""
@@ -642,10 +617,8 @@ def read_aptly_list(self, type_, list_):
:type type_: str
:param list_: Read into this list
:param list_: list"""
- data, _ = call_output([
- "aptly", type_, "list", "-raw"
- ])
- lg.debug('Aptly returned %s: %s', type_, data)
+ data, _ = call_output(["aptly", type_, "list", "-raw"])
+ lg.debug("Aptly returned %s: %s", type_, data)
for line in data.split("\n"):
clean_line = line.strip()
if clean_line:
@@ -658,23 +631,21 @@ def has_dependency(self, dependency):
:type dependency: list"""
type_, name = dependency
- if type_ == 'repo': # pragma: no cover
+ if type_ == "repo": # pragma: no cover
return name in self.repos
- if type_ == 'mirror': # pragma: no cover
+ if type_ == "mirror": # pragma: no cover
return name in self.mirrors
- elif type_ == 'snapshot':
+ elif type_ == "snapshot":
return name in self.snapshots # pragma: no cover
- elif type_ == 'gpg_key': # pragma: no cover
+ elif type_ == "gpg_key": # pragma: no cover
return name in self.gpg_keys # Not needed ATM
- elif type_ == 'virtual':
+ elif type_ == "virtual":
# virtual dependencies can never be resolved by the
# system state reader - they are used for internal
# ordering only
return False
else:
- raise ValueError(
- "Unknown dependency to resolve: %s" % str(dependency)
- )
+ raise ValueError("Unknown dependency to resolve: %s" % str(dependency))
state = SystemStateReader()
@@ -688,84 +659,50 @@ def main(argv=None):
global _logging_setup
if not argv: # pragma: no cover
argv = sys.argv[1:]
- parser = argparse.ArgumentParser(description='Manage aptly')
+ parser = argparse.ArgumentParser(description="Manage aptly")
parser.add_argument(
- '--config',
- '-c',
- help='Yaml config file defining mirrors and snapshots',
+ "--config",
+ "-c",
+ help="Yaml config file defining mirrors and snapshots",
type=str,
- required=True
+ required=True,
)
parser.add_argument(
- '--debug',
- '-d',
- help='Enable debug output',
- action='store_true',
+ "--debug",
+ "-d",
+ help="Enable debug output",
+ action="store_true",
)
parser.add_argument(
- '--pretend',
- '-p',
- help='Do not do anything, just print out what WOULD be done',
- action='store_true',
+ "--pretend",
+ "-p",
+ help="Do not do anything, just print out what WOULD be done",
+ action="store_true",
)
subparsers = parser.add_subparsers()
- mirror_parser = subparsers.add_parser(
- 'mirror',
- help='manage aptly mirrors'
- )
+ mirror_parser = subparsers.add_parser("mirror", help="manage aptly mirrors")
mirror_parser.set_defaults(func=mirror)
- mirror_parser.add_argument(
- 'task',
- type=str,
- choices=['create', 'update']
- )
- mirror_parser.add_argument(
- 'mirror_name',
- type=str,
- nargs='?',
- default='all'
- )
- snap_parser = subparsers.add_parser(
- 'snapshot',
- help='manage aptly snapshots'
- )
+ mirror_parser.add_argument("task", type=str, choices=["create", "update"])
+ mirror_parser.add_argument("mirror_name", type=str, nargs="?", default="all")
+ snap_parser = subparsers.add_parser("snapshot", help="manage aptly snapshots")
snap_parser.set_defaults(func=snapshot)
- snap_parser.add_argument('task', type=str, choices=['create', 'update'])
- snap_parser.add_argument(
- 'snapshot_name',
- type=str,
- nargs='?',
- default='all'
- )
+ snap_parser.add_argument("task", type=str, choices=["create", "update"])
+ snap_parser.add_argument("snapshot_name", type=str, nargs="?", default="all")
publish_parser = subparsers.add_parser(
- 'publish',
- help='manage aptly publish endpoints'
+ "publish", help="manage aptly publish endpoints"
)
publish_parser.set_defaults(func=publish)
- publish_parser.add_argument('task', type=str, choices=['create', 'update'])
- publish_parser.add_argument(
- 'publish_name',
- type=str,
- nargs='?',
- default='all'
- )
- repo_parser = subparsers.add_parser(
- 'repo',
- help='manage aptly repositories'
- )
+ publish_parser.add_argument("task", type=str, choices=["create", "update"])
+ publish_parser.add_argument("publish_name", type=str, nargs="?", default="all")
+ repo_parser = subparsers.add_parser("repo", help="manage aptly repositories")
repo_parser.set_defaults(func=repo)
- repo_parser.add_argument('task', type=str, choices=['create'])
- repo_parser.add_argument(
- 'repo_name',
- type=str,
- nargs='?',
- default='all'
- )
+ repo_parser.add_argument("task", type=str, choices=["create"])
+ repo_parser.add_argument("repo_name", type=str, nargs="?", default="all")
args = parser.parse_args(argv)
root = logging.getLogger()
formatter = logging.Formatter(
- '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
+ "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
)
if not _logging_setup: # noqa
handler = logging.StreamHandler(sys.stderr)
@@ -783,21 +720,22 @@ def main(argv=None):
_logging_setup = True # noqa
lg.debug("Args: %s", vars(args))
- with codecs.open(args.config, 'r', encoding="UTF-8") as cfgfile:
+ with codecs.open(args.config, "r", encoding="UTF-8") as cfgfile:
cfg = yaml.load(cfgfile, Loader=yaml.FullLoader)
state.read()
# run function for selected subparser
args.func(cfg, args)
+
day_of_week_map = {
- 'mon': 1,
- 'tue': 2,
- 'wed': 3,
- 'thu': 4,
- 'fri': 5,
- 'sat': 6,
- 'sun': 7,
+ "mon": 1,
+ "tue": 2,
+ "wed": 3,
+ "thu": 4,
+ "fri": 5,
+ "sat": 6,
+ "sun": 7,
}
@@ -809,10 +747,10 @@ def expand_timestamped_name(name, timestamp_config, date=None):
:type timestamp_config: dict
:param date: The date to expand the timestamp with.
:type date: :py:class:`datetime.datetime`"""
- if '%T' not in name:
+ if "%T" not in name:
return name
timestamp = round_timestamp(timestamp_config, date)
- return name.replace('%T', timestamp.strftime('%Y%m%dT%H%MZ'))
+ return name.replace("%T", timestamp.strftime("%Y%m%dT%H%MZ"))
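# Example (sketch): with a daily "00:00" timestamp config, as used by the
# test fixtures below, the %T placeholder resolves to the last rounded
# timestamp:
#
#     expand_timestamped_name(
#         "fakerepo01-%T",
#         {"timestamp": {"time": "00:00"}},
#         datetime.datetime(2012, 10, 10, 10, 10),
#     )
#     # -> "fakerepo01-20121010T0000Z"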
def round_timestamp(timestamp_config, date=None):
@@ -866,17 +804,16 @@ def round_timestamp(timestamp_config, date=None):
:param date: The date to round.
:type date: :py:class:`datetime.datetime`
"""
- timestamp_info = timestamp_config.get('timestamp', timestamp_config)
- config_time = timestamp_info.get('time', 'FAIL')
- if config_time == 'FAIL': # pragma: no cover
+ timestamp_info = timestamp_config.get("timestamp", timestamp_config)
+ config_time = timestamp_info.get("time", "FAIL")
+ if config_time == "FAIL": # pragma: no cover
raise ValueError(
- "Timestamp config has no valid time entry: %s" %
- str(timestamp_config)
+ "Timestamp config has no valid time entry: %s" % str(timestamp_config)
)
- config_repeat_weekly = timestamp_info.get('repeat-weekly', None)
+ config_repeat_weekly = timestamp_info.get("repeat-weekly", None)
- hour, minute = [int(x) for x in config_time.split(':')][:2]
+ hour, minute = [int(x) for x in config_time.split(":")][:2]
if date is None:
date = datetime.datetime.now()
@@ -885,15 +822,10 @@ def round_timestamp(timestamp_config, date=None):
day_of_week = day_of_week_map.get(config_repeat_weekly.lower())
timestamp = date_round_weekly(
- date,
- day_of_week,
- datetime.time(hour=hour, minute=minute)
+ date, day_of_week, datetime.time(hour=hour, minute=minute)
)
else:
- timestamp = date_round_daily(
- date,
- datetime.time(hour=hour, minute=minute)
- )
+ timestamp = date_round_daily(date, datetime.time(hour=hour, minute=minute))
return timestamp
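# Example (sketch): daily rounding truncates to the configured time of day;
# adding "repeat-weekly" rounds down to that weekday instead (2012-10-06
# was a Saturday, matching the fakerepo02-20121006T0000Z fixtures below):
#
#     round_timestamp({"time": "00:00"}, datetime.datetime(2012, 10, 10, 10, 10))
#     # -> datetime.datetime(2012, 10, 10, 0, 0)
#
#     round_timestamp(
#         {"time": "00:00", "repeat-weekly": "sat"},
#         datetime.datetime(2012, 10, 10, 10, 10),
#     )
#     # -> datetime.datetime(2012, 10, 6, 0, 0)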
@@ -909,10 +841,7 @@ def unit_or_list_to_list(thingy):
return [thingy]
-def publish_cmd_create(cfg,
- publish_name,
- publish_config,
- ignore_existing=False):
+def publish_cmd_create(cfg, publish_name, publish_config, ignore_existing=False):
"""Creates a publish command with its dependencies to be ordered and
executed later.
@@ -922,108 +851,93 @@ def publish_cmd_create(cfg,
:type publish_name: str
:param publish_config: Configuration of the publish from the yml file.
:type publish_config: dict"""
- publish_fullname = '%s %s' % (publish_name, publish_config['distribution'])
+ publish_fullname = "%s %s" % (publish_name, publish_config["distribution"])
if publish_fullname in state.publishes and not ignore_existing:
# Nothing to do, publish already created
return
- publish_cmd = ['aptly', 'publish']
- options = []
- source_args = []
- endpoint_args = [
- publish_name
- ]
+ publish_cmd = ["aptly", "publish"]
+ options = []
+ source_args = []
+ endpoint_args = [publish_name]
has_source = False
num_sources = 0
for conf, conf_value in publish_config.items():
-
- if conf == 'skip-contents':
+ if conf == "skip-contents":
if conf_value:
- options.append('-skip-contents=true')
- elif conf == 'architectures': # pragma: no cover
+ options.append("-skip-contents=true")
+ elif conf == "architectures": # pragma: no cover
options.append(
- '-architectures=%s' %
- ','.join(unit_or_list_to_list(conf_value))
+ "-architectures=%s" % ",".join(unit_or_list_to_list(conf_value))
)
- elif conf == 'components':
+ elif conf == "components":
components = unit_or_list_to_list(conf_value)
- options.append(
- '-component=%s' % ','.join(components)
- )
- elif conf == 'label': # pragma: no cover
- options.append(
- '-label=%s' % conf_value
- )
- elif conf == 'origin': # pragma: no cover
- options.append('-origin=%s' % conf_value)
-
- elif conf == 'distribution':
- options.append('-distribution=%s' % conf_value)
-
- elif conf == 'gpg-key':
- options.append('-gpg-key=%s' % conf_value)
- elif conf == 'automatic-update':
+ options.append("-component=%s" % ",".join(components))
+ elif conf == "label": # pragma: no cover
+ options.append("-label=%s" % conf_value)
+ elif conf == "origin": # pragma: no cover
+ options.append("-origin=%s" % conf_value)
+
+ elif conf == "distribution":
+ options.append("-distribution=%s" % conf_value)
+
+ elif conf == "gpg-key":
+ options.append("-gpg-key=%s" % conf_value)
+ elif conf == "automatic-update":
# Ignored here
pass
- elif conf == 'snapshots':
+ elif conf == "snapshots":
if has_source: # pragma: no cover
raise ValueError(
- "Multiple sources for publish %s %s" % (
- publish_name,
- publish_config
- )
+ "Multiple sources for publish %s %s"
+ % (publish_name, publish_config)
)
has_source = True
snapshots = unit_or_list_to_list(conf_value)
- source_args.append('snapshot')
- source_args.extend([
- snapshot_spec_to_name(cfg, conf_value)
- for conf_value
- in snapshots
- ])
+ source_args.append("snapshot")
+ source_args.extend(
+ [snapshot_spec_to_name(cfg, conf_value) for conf_value in snapshots]
+ )
num_sources = len(snapshots)
- elif conf == 'repo':
+ elif conf == "repo":
if has_source: # pragma: no cover
raise ValueError(
- "Multiple sources for publish %s %s" % (
- publish_name,
- publish_config
- )
+ "Multiple sources for publish %s %s"
+ % (publish_name, publish_config)
)
has_source = True
- source_args = [
- 'repo',
- conf_value
- ]
+ source_args = ["repo", conf_value]
num_sources = 1
- elif conf == 'publish':
+ elif conf == "publish":
if has_source: # pragma: no cover
raise ValueError(
- "Multiple sources for publish %s %s" % (
- publish_name,
- publish_config
- )
+ "Multiple sources for publish %s %s"
+ % (publish_name, publish_config)
)
has_source = True
conf_value = " ".join(conf_value.split("/"))
- source_args.append('snapshot')
+ source_args.append("snapshot")
try:
sources = state.publish_map[conf_value]
except KeyError:
- lg.critical((
- "Creating %s has been deferred, please call publish "
- "create again"
- ) % publish_name)
+ lg.critical(
+ (
+ "Creating %s has been deferred, please call publish "
+ "create again"
+ )
+ % publish_name
+ )
return
source_args.extend(sources)
num_sources = len(sources)
else: # pragma: no cover
raise ValueError(
- "Don't know how to handle publish config entry %s in %s" % (
+ "Don't know how to handle publish config entry %s in %s"
+ % (
conf,
publish_name,
)
@@ -1042,22 +956,13 @@ def clone_snapshot(origin, destination):
:type origin: str
:param destination: The new name of the snapshot
:type destination: str"""
- cmd = Command([
- 'aptly',
- 'snapshot',
- 'merge',
- destination,
- origin
- ])
- cmd.provide('snapshot', destination)
- cmd.require('snapshot', origin)
+ cmd = Command(["aptly", "snapshot", "merge", destination, origin])
+ cmd.provide("snapshot", destination)
+ cmd.require("snapshot", origin)
return cmd
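# Note: aptly has no dedicated "snapshot copy" command, so merging a single
# source into a new name is the idiomatic way to duplicate a snapshot.
# Sketch of the archive-on-update usage in publish_cmd_update() below
# (snapshot names hypothetical):
#
#     clone_snapshot(
#         "fakerepo01-current", "fakerepo01-archive-20121010T1010Z"
#     ).execute()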
-def publish_cmd_update(cfg,
- publish_name,
- publish_config,
- ignore_existing=False):
+def publish_cmd_update(cfg, publish_name, publish_config, ignore_existing=False):
"""Creates a publish command with its dependencies to be ordered and
executed later.
@@ -1068,33 +973,29 @@ def publish_cmd_update(cfg,
:param publish_config: Configuration of the publish from the yml file.
:type publish_config: dict"""
- publish_cmd = ['aptly', 'publish']
- options = []
- args = [publish_config['distribution'], publish_name]
+ publish_cmd = ["aptly", "publish"]
+ options = []
+ args = [publish_config["distribution"], publish_name]
- if 'skip-contents' in publish_config and publish_config['skip-contents']:
- options.append('-skip-contents=true')
+ if "skip-contents" in publish_config and publish_config["skip-contents"]:
+ options.append("-skip-contents=true")
- if 'repo' in publish_config:
- publish_cmd.append('update')
+ if "repo" in publish_config:
+ publish_cmd.append("update")
return Command(publish_cmd + options + args)
- publish_fullname = '%s %s' % (publish_name, publish_config['distribution'])
+ publish_fullname = "%s %s" % (publish_name, publish_config["distribution"])
current_snapshots = state.publish_map[publish_fullname]
- if 'snapshots' in publish_config:
- snapshots_config = publish_config['snapshots']
- new_snapshots = [
- snapshot_spec_to_name(cfg, snap)
- for snap
- in snapshots_config
- ]
- elif 'publish' in publish_config:
- conf_value = publish_config['publish']
+ if "snapshots" in publish_config:
+ snapshots_config = publish_config["snapshots"]
+ new_snapshots = [snapshot_spec_to_name(cfg, snap) for snap in snapshots_config]
+ elif "publish" in publish_config:
+ conf_value = publish_config["publish"]
snapshots_config = []
- ref_publish_name, distribution = conf_value.split(" ")
- for publish in cfg['publish'][ref_publish_name]:
- if publish['distribution'] == distribution:
- snapshots_config.extend(publish['snapshots'])
+ ref_publish_name, distribution = conf_value.split(" ")
+ for publish in cfg["publish"][ref_publish_name]:
+ if publish["distribution"] == distribution:
+ snapshots_config.extend(publish["snapshots"])
break
new_snapshots = list(state.publish_map[conf_value])
else: # pragma: no cover
@@ -1105,40 +1006,38 @@ def publish_cmd_update(cfg,
if set(new_snapshots) == set(current_snapshots) and not ignore_existing:
# Already pointing to the newest snapshot, nothing to do
return
- components = unit_or_list_to_list(publish_config['components'])
+ components = unit_or_list_to_list(publish_config["components"])
for snap in snapshots_config:
# snap may be a plain name or a dict.
- if hasattr(snap, 'items'):
+ if hasattr(snap, "items"):
# Dict mode - only here can we even have an archive option
- archive = snap.get('archive-on-update', None)
+ archive = snap.get("archive-on-update", None)
if archive:
# Replace any timestamp placeholder with the current
# date/time. Note that this is NOT rounded, as we want to
# know exactly when the archival happened.
archive = archive.replace(
- '%T',
- format_timestamp(datetime.datetime.now())
+ "%T", format_timestamp(datetime.datetime.now())
)
if archive in state.snapshots: # pragma: no cover
continue
- prefix_to_search = re.sub('%T$', '', snap['name'])
+ prefix_to_search = re.sub("%T$", "", snap["name"])
current_snapshot = [
snap_name
- for snap_name
- in sorted(current_snapshots, key=lambda x: -len(x))
+ for snap_name in sorted(current_snapshots, key=lambda x: -len(x))
if snap_name.startswith(prefix_to_search)
][0]
clone_snapshot(current_snapshot, archive).execute()
- publish_cmd.append('switch')
- options.append('-component=%s' % ','.join(components))
+ publish_cmd.append("switch")
+ options.append("-component=%s" % ",".join(components))
- if 'skip-contents' in publish_config and publish_config['skip-contents']:
- options.append('-skip-contents=true')
+ if "skip-contents" in publish_config and publish_config["skip-contents"]:
+ options.append("-skip-contents=true")
return Command(publish_cmd + options + args + new_snapshots)
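# Sketch of the dict-form snapshot entry that triggers the archive branch
# above (yml snippet in a comment; names assumed):
#
#     publish:
#       fakerepo01:
#         - distribution: stable
#           components: main
#           automatic-update: true
#           snapshots:
#             - name: fakerepo01-current
#               archive-on-update: fakerepo01-archive-%T
#
# On update, %T in the archive name is filled with the *unrounded* current
# time, the published snapshot matching the name prefix is cloned to that
# archive name, and the publish is switched to the rebuilt snapshot.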
@@ -1157,30 +1056,26 @@ def repo_cmd_create(cfg, repo_name, repo_config):
# Nothing to do, repo already created
return
- repo_cmd = ['aptly', 'repo']
- options = []
- endpoint_args = ['create', repo_name]
+ repo_cmd = ["aptly", "repo"]
+ options = []
+ endpoint_args = ["create", repo_name]
for conf, conf_value in repo_config.items():
- if conf == 'architectures':
+ if conf == "architectures":
options.append(
- '-architectures=%s' %
- ','.join(unit_or_list_to_list(conf_value))
+ "-architectures=%s" % ",".join(unit_or_list_to_list(conf_value))
)
- elif conf == 'component':
+ elif conf == "component":
components = unit_or_list_to_list(conf_value)
- options.append(
- '-component=%s' % ','.join(components)
- )
- elif conf == 'comment': # pragma: no cover
- options.append(
- '-comment=%s' % conf_value
- )
- elif conf == 'distribution':
- options.append('-distribution=%s' % conf_value)
+ options.append("-component=%s" % ",".join(components))
+ elif conf == "comment": # pragma: no cover
+ options.append("-comment=%s" % conf_value)
+ elif conf == "distribution":
+ options.append("-distribution=%s" % conf_value)
else: # pragma: no cover
raise ValueError(
- "Don't know how to handle repo config entry %s in %s" % (
+ "Don't know how to handle repo config entry %s in %s"
+ % (
conf,
repo_name,
)
@@ -1196,10 +1091,10 @@ def repo(cfg, args):
:type cfg: dict
:param args: The command-line arguments read with :py:mod:`argparse`
:type args: namespace"""
- lg.debug("Repositories to create: %s", cfg['repo'])
+ lg.debug("Repositories to create: %s", cfg["repo"])
repo_cmds = {
- 'create': repo_cmd_create,
+ "create": repo_cmd_create,
}
cmd_repo = repo_cmds[args.task]
@@ -1207,28 +1102,20 @@ def repo(cfg, args):
if args.repo_name == "all":
commands = [
cmd_repo(cfg, repo_name, repo_conf)
- for repo_name, repo_conf in cfg['repo'].items()
+ for repo_name, repo_conf in cfg["repo"].items()
]
for cmd in Command.order_commands(commands, state.has_dependency):
cmd.execute()
else:
- if args.repo_name in cfg['repo']:
- commands = [
- cmd_repo(
- cfg,
- args.repo_name,
- cfg['repo'][args.repo_name]
- )
- ]
+ if args.repo_name in cfg["repo"]:
+ commands = [cmd_repo(cfg, args.repo_name, cfg["repo"][args.repo_name])]
for cmd in Command.order_commands(commands, state.has_dependency):
cmd.execute()
else:
raise ValueError(
- "Requested publish is not defined in config file: %s" % (
- args.repo_name
- )
+ "Requested publish is not defined in config file: %s" % (args.repo_name)
)
@@ -1239,14 +1126,14 @@ def publish(cfg, args):
:type cfg: dict
:param args: The command-line arguments read with :py:mod:`argparse`
:type args: namespace"""
- lg.debug("Publishes to create / update: %s", cfg['publish'])
+ lg.debug("Publishes to create / update: %s", cfg["publish"])
# aptly publish snapshot -components ... -architectures ... -distribution
# ... -origin Ubuntu trusty-stable ubuntu/stable
publish_cmds = {
- 'create': publish_cmd_create,
- 'update': publish_cmd_update,
+ "create": publish_cmd_create,
+ "update": publish_cmd_update,
}
cmd_publish = publish_cmds[args.task]
@@ -1254,32 +1141,26 @@ def publish(cfg, args):
if args.publish_name == "all":
commands = [
cmd_publish(cfg, publish_name, publish_conf_entry)
- for publish_name, publish_conf in cfg['publish'].items()
+ for publish_name, publish_conf in cfg["publish"].items()
for publish_conf_entry in publish_conf
- if publish_conf_entry.get('automatic-update', 'false') is True
+ if publish_conf_entry.get("automatic-update", "false") is True
]
for cmd in Command.order_commands(commands, state.has_dependency):
cmd.execute()
else:
- if args.publish_name in cfg['publish']:
+ if args.publish_name in cfg["publish"]:
commands = [
- cmd_publish(
- cfg,
- args.publish_name,
- publish_conf_entry
- )
- for publish_conf_entry
- in cfg['publish'][args.publish_name]
+ cmd_publish(cfg, args.publish_name, publish_conf_entry)
+ for publish_conf_entry in cfg["publish"][args.publish_name]
]
for cmd in Command.order_commands(commands, state.has_dependency):
cmd.execute()
else:
raise ValueError(
- "Requested publish is not defined in config file: %s" % (
- args.publish_name
- )
+ "Requested publish is not defined in config file: %s"
+ % (args.publish_name)
)
@@ -1290,11 +1171,11 @@ def snapshot(cfg, args):
:type cfg: dict
:param args: The command-line arguments read with :py:mod:`argparse`
:type args: namespace"""
- lg.debug("Snapshots to create: %s", cfg['snapshot'].keys())
+ lg.debug("Snapshots to create: %s", cfg["snapshot"].keys())
snapshot_cmds = {
- 'create': cmd_snapshot_create,
- 'update': cmd_snapshot_update,
+ "create": cmd_snapshot_create,
+ "update": cmd_snapshot_update,
}
cmd_snapshot = snapshot_cmds[args.task]
@@ -1302,39 +1183,34 @@ def snapshot(cfg, args):
if args.snapshot_name == "all":
commands = [
cmd
- for snapshot_name, snapshot_config in cfg['snapshot'].items()
+ for snapshot_name, snapshot_config in cfg["snapshot"].items()
for cmd in cmd_snapshot(cfg, snapshot_name, snapshot_config)
]
if args.debug: # pragma: no cover
dot_file = "/tmp/commands.dot"
- with codecs.open(dot_file, 'w', "UTF-8") as fh_dot:
+ with codecs.open(dot_file, "w", "UTF-8") as fh_dot:
fh_dot.write(Command.command_list_to_digraph(commands))
- lg.info('Wrote command dependency tree graph to %s', dot_file)
+ lg.info("Wrote command dependency tree graph to %s", dot_file)
if len(commands) > 0:
- for cmd in Command.order_commands(commands,
- state.has_dependency):
+ for cmd in Command.order_commands(commands, state.has_dependency):
cmd.execute()
else:
- if args.snapshot_name in cfg['snapshot']:
+ if args.snapshot_name in cfg["snapshot"]:
commands = cmd_snapshot(
- cfg,
- args.snapshot_name,
- cfg['snapshot'][args.snapshot_name]
+ cfg, args.snapshot_name, cfg["snapshot"][args.snapshot_name]
)
if len(commands) > 0:
- for cmd in Command.order_commands(commands,
- state.has_dependency):
+ for cmd in Command.order_commands(commands, state.has_dependency):
cmd.execute()
else:
raise ValueError(
- "Requested snapshot is not defined in config file: %s" % (
- args.snapshot_name
- )
+ "Requested snapshot is not defined in config file: %s"
+ % (args.snapshot_name)
)
@@ -1343,11 +1219,11 @@ def format_timestamp(timestamp):
:param timestamp: The timestamp to format
:type timestamp: :py:class:`datetime.datetime`"""
- return timestamp.strftime('%Y%m%dT%H%MZ')
+ return timestamp.strftime("%Y%m%dT%H%MZ")
back_reference_map = {
- "current": 0,
+ "current": 0,
"previous": 1,
}
@@ -1374,16 +1250,16 @@ def snapshot_spec_to_name(cfg, snapshot):
:type snapshot: dict
"""
delta = datetime.timedelta(seconds=1)
- if hasattr(snapshot, 'items'):
- name = snapshot['name']
- if 'timestamp' not in snapshot:
+ if hasattr(snapshot, "items"):
+ name = snapshot["name"]
+ if "timestamp" not in snapshot:
return name
- ts = snapshot['timestamp']
- back_ref = back_reference_map.get(ts)
+ ts = snapshot["timestamp"]
+ back_ref = back_reference_map.get(ts)
if back_ref is None:
back_ref = int(ts)
- reference = cfg['snapshot'][name]
+ reference = cfg["snapshot"][name]
timestamp = datetime.datetime.now()
for _ in range(back_ref + 1):
@@ -1391,7 +1267,7 @@ def snapshot_spec_to_name(cfg, snapshot):
timestamp -= delta
timestamp += delta
- return name.replace('%T', format_timestamp(timestamp))
+ return name.replace("%T", format_timestamp(timestamp))
else: # pragma: no cover
return snapshot
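# Example (sketch, frozen at 2012-10-10 10:10 with a daily "00:00" config
# assumed for fakerepo01):
#
#     snapshot_spec_to_name(cfg, {"name": "fakerepo01-%T", "timestamp": "current"})
#     # -> "fakerepo01-20121010T0000Z"
#
#     snapshot_spec_to_name(cfg, {"name": "fakerepo01-%T", "timestamp": "previous"})
#     # -> "fakerepo01-20121009T0000Z"  (one rounding period earlier)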
@@ -1414,11 +1290,9 @@ def rotate_snapshot(cfg, snapshot_name):
:type cfg: dict
:param snapshot_name: the snapshot to rotate
:type snapshot_name: str"""
- rotated_name = cfg['snapshot'][snapshot_name].get(
- 'rotate_via', '%s-rotated-%s' % (
- snapshot_name,
- format_timestamp(datetime.datetime.now())
- )
+ rotated_name = cfg["snapshot"][snapshot_name].get(
+ "rotate_via",
+ "%s-rotated-%s" % (snapshot_name, format_timestamp(datetime.datetime.now())),
)
# First, verify that our snapshot environment is in a sane state.
@@ -1426,16 +1300,13 @@ def rotate_snapshot(cfg, snapshot_name):
if rotated_name in state.snapshots: # pragma: no cover
raise Exception(
- "Cannot update snapshot %s - rotated name %s already exists" % (
- snapshot_name, rotated_name
- )
+ "Cannot update snapshot %s - rotated name %s already exists"
+ % (snapshot_name, rotated_name)
)
- cmd = Command([
- 'aptly', 'snapshot', 'rename', snapshot_name, rotated_name
- ])
+ cmd = Command(["aptly", "snapshot", "rename", snapshot_name, rotated_name])
- cmd.provide('virtual', rotated_name)
+ cmd.provide("virtual", rotated_name)
return cmd
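# Example (sketch, matching the test fixtures below): without an explicit
# rotate_via option, rotating "fakerepo01-current" at 2012-10-10 10:10
# renames it to "fakerepo01-current-rotated-20121010T1010Z".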
@@ -1456,7 +1327,7 @@ def cmd_snapshot_update(cfg, snapshot_name, snapshot_config):
# 4) Update / switch-over publishes
# 5) Remove the rotated temporary snapshots
- if '%T' in snapshot_name: # pragma: no cover
+ if "%T" in snapshot_name: # pragma: no cover
# Timestamped snapshots are never rotated by design.
return []
@@ -1466,69 +1337,60 @@ def cmd_snapshot_update(cfg, snapshot_name, snapshot_config):
# TODO: rotated snapshots should be identified by configuration option, not
# just by "not being timestamped"
- rename_cmds = [
- rotate_snapshot(cfg, snap)
- for snap
- in affected_snapshots
- ]
+ rename_cmds = [rotate_snapshot(cfg, snap) for snap in affected_snapshots]
# The "intermediate" command causes the state reader to refresh. At the
# same time, it provides a collection point for dependency handling.
intermediate = FunctionCommand(state.read)
- intermediate.provide('virtual', 'all-snapshots-rotated')
+ intermediate.provide("virtual", "all-snapshots-rotated")
for cmd in rename_cmds:
# Ensure that our "intermediate" pseudo command comes after all
# the rename commands, by ensuring it depends on all their "virtual"
# provided items.
cmd_vprovides = [
- provide
- for ptype, provide
- in cmd.get_provides()
- if ptype == 'virtual'
+ provide for ptype, provide in cmd.get_provides() if ptype == "virtual"
]
for provide in cmd_vprovides:
- intermediate.require('virtual', provide)
+ intermediate.require("virtual", provide)
# Same as before - create a focal point to "collect" dependencies
# after the snapshots have been rebuilt. Also reload state once again
intermediate2 = FunctionCommand(state.read)
- intermediate2.provide('virtual', 'all-snapshots-rebuilt')
+ intermediate2.provide("virtual", "all-snapshots-rebuilt")
create_cmds = []
for _ in affected_snapshots:
-
# Well.. there's normally just one, but since we need interface
# consistency, cmd_snapshot_create() returns a list. And since it
# returns a list, we may just as well future-proof it and loop instead
# of assuming it's going to be a single entry (and fail horribly if
# this assumption changes in the future).
- for create_cmd in cmd_snapshot_create(cfg,
- snapshot_name,
- cfg['snapshot'][snapshot_name],
- ignore_existing=True):
-
+ for create_cmd in cmd_snapshot_create(
+ cfg, snapshot_name, cfg["snapshot"][snapshot_name], ignore_existing=True
+ ):
# enforce cmd to run after the refresh, and thus also
# after all the renames
- create_cmd.require('virtual', 'all-snapshots-rotated')
+ create_cmd.require("virtual", "all-snapshots-rotated")
# Evil hack - we must do the dependencies ourselves, to avoid
# getting a circular graph
- create_cmd._requires = set([
- (type_, req)
- for type_, req
- in create_cmd._requires
- if type_ != 'snapshot'
- ])
-
- create_cmd.provide('virtual', 'readyness-for-%s' % snapshot_name)
+ create_cmd._requires = set(
+ [
+ (type_, req)
+ for type_, req in create_cmd._requires
+ if type_ != "snapshot"
+ ]
+ )
+
+ create_cmd.provide("virtual", "readyness-for-%s" % snapshot_name)
for follower in dependents_of_snapshot(snapshot_name):
- create_cmd.require('virtual', 'readyness-for-%s' % follower)
+ create_cmd.require("virtual", "readyness-for-%s" % follower)
# "Focal point" - make intermediate2 run after all the commands
# that re-create the snapshots
- create_cmd.provide('virtual', 'rebuilt-%s' % snapshot_name)
- intermediate2.require('virtual', 'rebuilt-%s' % snapshot_name)
+ create_cmd.provide("virtual", "rebuilt-%s" % snapshot_name)
+ intermediate2.require("virtual", "rebuilt-%s" % snapshot_name)
create_cmds.append(create_cmd)
@@ -1537,65 +1399,51 @@ def cmd_snapshot_update(cfg, snapshot_name, snapshot_config):
# So now, we're left with updating the publishes.
def is_publish_affected(name, publish):
- if "%s %s" % (
- name,
- publish['distribution']
- ) in state.publishes:
+ if "%s %s" % (name, publish["distribution"]) in state.publishes:
try:
- for snap in publish['snapshots']:
+ for snap in publish["snapshots"]:
snap_name = snapshot_spec_to_name(cfg, snap)
if snap_name in affected_snapshots:
return True
except KeyError: # pragma: no cover
- lg.debug((
- "Publish endpoint %s is not affected because it has no "
- "snapshots defined"
- ) % name)
+ lg.debug(
+ (
+ "Publish endpoint %s is not affected because it has no "
+ "snapshots defined"
+ )
+ % name
+ )
return False
return False
- if 'publish' in cfg:
+ if "publish" in cfg:
all_publish_commands = [
- publish_cmd_update(cfg,
- publish_name,
- publish_conf_entry,
- ignore_existing=True)
- for publish_name, publish_conf in cfg['publish'].items()
+ publish_cmd_update(
+ cfg, publish_name, publish_conf_entry, ignore_existing=True
+ )
+ for publish_name, publish_conf in cfg["publish"].items()
for publish_conf_entry in publish_conf
- if publish_conf_entry.get('automatic-update', 'false') is True
+ if publish_conf_entry.get("automatic-update", "false") is True
if is_publish_affected(publish_name, publish_conf_entry)
]
else:
all_publish_commands = []
- republish_cmds = [
- c
- for c
- in all_publish_commands
- if c
- ]
+ republish_cmds = [c for c in all_publish_commands if c]
# Ensure that the republish commands run AFTER the snapshots are rebuilt
for cmd in republish_cmds:
- cmd.require('virtual', 'all-snapshots-rebuilt')
+ cmd.require("virtual", "all-snapshots-rebuilt")
# TODO:
# - We need to cleanup all the rotated snapshots after the publishes are
# rebuilt
# - Filter publishes, so only the non-timestamped publishes are rebuilt
- return (
- rename_cmds +
- create_cmds +
- republish_cmds +
- [intermediate, intermediate2]
- )
+ return rename_cmds + create_cmds + republish_cmds + [intermediate, intermediate2]
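# Rough execution order produced by the virtual dependencies above (sketch):
#
#     1. rename_cmds    - rotate the existing snapshots out of the way
#     2. intermediate   - state.read() once "all-snapshots-rotated" holds
#     3. create_cmds    - rebuild each snapshot under its plain name
#     4. intermediate2  - state.read() once "all-snapshots-rebuilt" holds
#     5. republish_cmds - switch affected publishes to the new snapshots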
-def cmd_snapshot_create(cfg,
- snapshot_name,
- snapshot_config,
- ignore_existing=False):
+def cmd_snapshot_create(cfg, snapshot_name, snapshot_config, ignore_existing=False):
"""Create a snapshot create command to be ordered and executed later.
:param cfg: pyaptly config
@@ -1615,69 +1463,64 @@ def cmd_snapshot_create(cfg,
# TODO: extract possible timestamp component
# and generate *actual* snapshot name
- snapshot_name = expand_timestamped_name(
- snapshot_name, snapshot_config
- )
+ snapshot_name = expand_timestamped_name(snapshot_name, snapshot_config)
if snapshot_name in state.snapshots and not ignore_existing:
return []
- default_aptly_cmd = ['aptly', 'snapshot', 'create']
+ default_aptly_cmd = ["aptly", "snapshot", "create"]
default_aptly_cmd.append(snapshot_name)
- default_aptly_cmd.append('from')
+ default_aptly_cmd.append("from")
- if 'mirror' in snapshot_config:
- cmd = Command(
- default_aptly_cmd + ['mirror', snapshot_config['mirror']]
- )
- cmd.provide('snapshot', snapshot_name)
- cmd.require('mirror', snapshot_config['mirror'])
+ if "mirror" in snapshot_config:
+ cmd = Command(default_aptly_cmd + ["mirror", snapshot_config["mirror"]])
+ cmd.provide("snapshot", snapshot_name)
+ cmd.require("mirror", snapshot_config["mirror"])
return [cmd]
- elif 'repo' in snapshot_config:
- cmd = Command(default_aptly_cmd + ['repo', snapshot_config['repo']])
- cmd.provide('snapshot', snapshot_name)
- cmd.require('repo', snapshot_config['repo'])
+ elif "repo" in snapshot_config:
+ cmd = Command(default_aptly_cmd + ["repo", snapshot_config["repo"]])
+ cmd.provide("snapshot", snapshot_name)
+ cmd.require("repo", snapshot_config["repo"])
return [cmd]
- elif 'filter' in snapshot_config:
- cmd = Command([
- 'aptly',
- 'snapshot',
- 'filter',
- snapshot_spec_to_name(cfg, snapshot_config['filter']['source']),
- snapshot_name,
- snapshot_config['filter']['query'],
- ])
- cmd.provide('snapshot', snapshot_name)
+ elif "filter" in snapshot_config:
+ cmd = Command(
+ [
+ "aptly",
+ "snapshot",
+ "filter",
+ snapshot_spec_to_name(cfg, snapshot_config["filter"]["source"]),
+ snapshot_name,
+ snapshot_config["filter"]["query"],
+ ]
+ )
+ cmd.provide("snapshot", snapshot_name)
cmd.require(
- 'snapshot',
- snapshot_spec_to_name(cfg, snapshot_config['filter']['source'])
+ "snapshot", snapshot_spec_to_name(cfg, snapshot_config["filter"]["source"])
)
return [cmd]
- elif 'merge' in snapshot_config:
- cmd = Command([
- 'aptly',
- 'snapshot',
- 'merge',
- snapshot_name,
- ])
- cmd.provide('snapshot', snapshot_name)
+ elif "merge" in snapshot_config:
+ cmd = Command(
+ [
+ "aptly",
+ "snapshot",
+ "merge",
+ snapshot_name,
+ ]
+ )
+ cmd.provide("snapshot", snapshot_name)
- for source in snapshot_config['merge']:
+ for source in snapshot_config["merge"]:
source_name = snapshot_spec_to_name(cfg, source)
cmd.append(source_name)
- cmd.require('snapshot', source_name)
+ cmd.require("snapshot", source_name)
return [cmd]
else: # pragma: no cover
- raise ValueError(
- "Don't know how to handle snapshot config" % (
- snapshot_config
- )
- )
+ raise ValueError("Don't know how to handle snapshot config" % (snapshot_config))
def mirror(cfg, args):
@@ -1687,30 +1530,25 @@ def mirror(cfg, args):
:type cfg: dict
:param args: The command-line arguments read with :py:mod:`argparse`
:type args: namespace"""
- lg.debug("Mirrors to create: %s", cfg['mirror'])
+ lg.debug("Mirrors to create: %s", cfg["mirror"])
mirror_cmds = {
- 'create': cmd_mirror_create,
- 'update': cmd_mirror_update,
+ "create": cmd_mirror_create,
+ "update": cmd_mirror_update,
}
cmd_mirror = mirror_cmds[args.task]
if args.mirror_name == "all":
- for mirror_name, mirror_config in cfg['mirror'].items():
+ for mirror_name, mirror_config in cfg["mirror"].items():
cmd_mirror(cfg, mirror_name, mirror_config)
else:
- if args.mirror_name in cfg['mirror']:
- cmd_mirror(
- cfg,
- args.mirror_name,
- cfg['mirror'][args.mirror_name]
- )
+ if args.mirror_name in cfg["mirror"]:
+ cmd_mirror(cfg, args.mirror_name, cfg["mirror"][args.mirror_name])
else:
raise ValueError(
- "Requested mirror is not defined in config file: %s" % (
- args.mirror_name
- )
+ "Requested mirror is not defined in config file: %s"
+ % (args.mirror_name)
)
@@ -1722,10 +1560,10 @@ def add_gpg_keys(mirror_config):
:type mirror_config: dict
"""
keys_urls = {}
- if 'gpg-keys' in mirror_config:
- keys = unit_or_list_to_list(mirror_config['gpg-keys'])
- if 'gpg-urls' in mirror_config:
- urls = unit_or_list_to_list(mirror_config['gpg-urls'])
+ if "gpg-keys" in mirror_config:
+ keys = unit_or_list_to_list(mirror_config["gpg-keys"])
+ if "gpg-urls" in mirror_config:
+ urls = unit_or_list_to_list(mirror_config["gpg-urls"])
urls_len = len(urls)
for x in range(len(keys)):
if x < urls_len:
@@ -1747,9 +1585,9 @@ def add_gpg_keys(mirror_config):
"--keyring",
"trustedkeys.gpg",
"--keyserver",
- "pool.sks-keyservers.net",
+ "keys.openpgp.org",
"--recv-keys",
- key
+ key,
]
lg.debug("Adding gpg key with call: %s", key_command)
subprocess.check_call(key_command)
@@ -1758,10 +1596,10 @@ def add_gpg_keys(mirror_config):
if url:
key_command = (
"wget -q -O - %s | "
- "gpg --no-default-keyring "
- "--keyring trustedkeys.gpg --import"
+ "gpg --no-default-keyring --keyring trustedkeys.gpg "
+ "--import"
) % url
- subprocess.check_call(['bash', '-c', key_command])
+ subprocess.check_call(["bash", "-c", key_command])
else:
raise
state.read_gpg()
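# Sketch of the key/url pairing above (ids and url assumed): given
#
#     gpg-keys: [7FAC5991, 1397BC53]
#     gpg-urls: https://example.com/archive-key.asc
#
# 7FAC5991 is paired with the url and, should the keyserver fail, is
# imported via "wget ... | gpg --import"; 1397BC53 has no url, so a
# keyserver failure for it is re-raised.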
@@ -1781,27 +1619,29 @@ def cmd_mirror_create(cfg, mirror_name, mirror_config):
return
add_gpg_keys(mirror_config)
- aptly_cmd = ['aptly', 'mirror', 'create']
+ aptly_cmd = ["aptly", "mirror", "create"]
- if 'sources' in mirror_config and mirror_config['sources']:
- aptly_cmd.append('-with-sources')
+ if "sources" in mirror_config and mirror_config["sources"]:
+ aptly_cmd.append("-with-sources")
else:
- aptly_cmd.append('-with-sources=false')
+ aptly_cmd.append("-with-sources=false")
- if 'udeb' in mirror_config and mirror_config['udeb']:
- aptly_cmd.append('-with-udebs')
+ if "udeb" in mirror_config and mirror_config["udeb"]:
+ aptly_cmd.append("-with-udebs")
- if 'architectures' in mirror_config:
- aptly_cmd.append('-architectures={0}'.format(
- ','.join(unit_or_list_to_list(mirror_config['architectures']))
- ))
+ if "architectures" in mirror_config:
+ aptly_cmd.append(
+ "-architectures={0}".format(
+ ",".join(unit_or_list_to_list(mirror_config["architectures"]))
+ )
+ )
aptly_cmd.append(mirror_name)
- aptly_cmd.append(mirror_config['archive'])
- aptly_cmd.append(mirror_config['distribution'])
- aptly_cmd.extend(unit_or_list_to_list(mirror_config['components']))
+ aptly_cmd.append(mirror_config["archive"])
+ aptly_cmd.append(mirror_config["distribution"])
+ aptly_cmd.extend(unit_or_list_to_list(mirror_config["components"]))
- lg.debug('Running command: %s', ' '.join(aptly_cmd))
+ lg.debug("Running command: %s", " ".join(aptly_cmd))
subprocess.check_call(aptly_cmd)
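# Sketch: a mirror config like (names assumed)
#
#     mirror:
#       fakerepo01:
#         archive: http://localhost:3123/fakerepo01
#         distribution: stable
#         components: main
#         architectures: [amd64, i386]
#
# would run roughly:
#
#     aptly mirror create -with-sources=false -architectures=amd64,i386 \
#         fakerepo01 http://localhost:3123/fakerepo01 stable main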
@@ -1817,13 +1657,14 @@ def cmd_mirror_update(cfg, mirror_name, mirror_config):
if mirror_name not in state.mirrors: # pragma: no cover
raise Exception("Mirror not created yet")
add_gpg_keys(mirror_config)
- aptly_cmd = ['aptly', 'mirror', 'update']
- if 'max-tries' in mirror_config:
- aptly_cmd.append('-max-tries=%d' % mirror_config['max-tries'])
+ aptly_cmd = ["aptly", "mirror", "update"]
+ if "max-tries" in mirror_config:
+ aptly_cmd.append("-max-tries=%d" % mirror_config["max-tries"])
aptly_cmd.append(mirror_name)
- lg.debug('Running command: %s', ' '.join(aptly_cmd))
+ lg.debug("Running command: %s", " ".join(aptly_cmd))
subprocess.check_call(aptly_cmd)
-if __name__ == '__main__': # pragma: no cover
+
+if __name__ == "__main__": # pragma: no cover
main()
diff --git a/pyaptly/aptly_test.py b/pyaptly/aptly_test.py
index 74da5ba..c673283 100644
--- a/pyaptly/aptly_test.py
+++ b/pyaptly/aptly_test.py
@@ -6,8 +6,7 @@
import freezegun
import testfixtures
-from pyaptly import (Command, SystemStateReader, call_output, main,
- snapshot_spec_to_name)
+from pyaptly import Command, SystemStateReader, call_output, main, snapshot_spec_to_name
from . import test
@@ -17,9 +16,7 @@
import mock
-_test_base = os.path.dirname(
- os.path.abspath(__file__)
-).encode("UTF-8")
+_test_base = os.path.dirname(os.path.abspath(__file__)).encode("UTF-8")
@contextlib.contextmanager
@@ -37,11 +34,11 @@ def test_debug():
with mock_subprocess() as (_, gpg):
gpg.side_effect = lambda _: ("", "")
args = [
- '-d',
- '-c',
- os.path.join(_test_base, b'test01.yml').decode("UTF-8"),
- 'mirror',
- 'create'
+ "-d",
+ "-c",
+ os.path.join(_test_base, b"test01.yml").decode("UTF-8"),
+ "mirror",
+ "create",
]
main(args)
assert logging.getLogger().level == logging.DEBUG
@@ -49,18 +46,20 @@ def test_debug():
def test_pretend():
"""Test if pretend is enabled with -p"""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"publish.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_snapshot_create(config)
args = [
- '-p',
- '-c',
+ "-p",
+ "-c",
config,
- 'publish',
- 'create',
- 'fakerepo01',
+ "publish",
+ "create",
+ "fakerepo01",
]
main(args)
state = SystemStateReader()
@@ -72,16 +71,13 @@ def test_pretend():
def test_mirror_create():
"""Test if createing mirrors works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"mirror-google.yml",
- )) as (tyml, config):
- args = [
- '-c',
- config,
- 'mirror',
- 'create'
- ]
+ )
+ ) as (tyml, config):
+ args = ["-c", config, "mirror", "create"]
keys_added = []
with testfixtures.LogCapture() as l:
main(args)
@@ -91,11 +87,9 @@ def test_mirror_create():
if arg[0] == "gpg":
keys_added.append(arg[7])
assert len(keys_added) > 0
- assert len(keys_added) == len(set(keys_added)), (
- "Key multiple times added"
- )
+ assert len(keys_added) == len(set(keys_added)), "Key multiple times added"
- expect = set(tyml['mirror'].keys())
+ expect = set(tyml["mirror"].keys())
state = SystemStateReader()
state.read()
assert state.mirrors == expect
@@ -103,54 +97,47 @@ def test_mirror_create():
def do_mirror_update(config):
"""Test if updating mirrors works."""
- args = [
- '-c',
- config,
- 'mirror',
- 'create'
- ]
+ args = ["-c", config, "mirror", "create"]
state = SystemStateReader()
state.read()
assert "fakerepo01" not in state.mirrors
main(args)
state.read()
assert "fakerepo01" in state.mirrors
- args[3] = 'update'
+ args[3] = "update"
main(args)
args = [
- 'aptly',
- 'mirror',
- 'show',
+ "aptly",
+ "mirror",
+ "show",
]
args01 = list(args)
args01.append("fakerepo01")
aptly_state = test.execute_and_parse_show_cmd(args01)
- assert aptly_state['number of packages'] == "2"
+ assert aptly_state["number of packages"] == "2"
def test_mirror_update():
"""Test if updating mirrors works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"mirror-no-google.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_mirror_update(config)
def test_mirror_update_inexistent():
"""Test if updating an inexistent mirror causes an error."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"mirror-no-google.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_mirror_update(config)
- args = [
- '-c',
- config,
- 'mirror',
- 'update',
- 'asdfasdf'
- ]
+ args = ["-c", config, "mirror", "update", "asdfasdf"]
error = False
try:
main(args)
@@ -161,52 +148,45 @@ def test_mirror_update_inexistent():
def test_mirror_update_single():
"""Test if updating a single mirror works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"mirror-no-google.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_mirror_update(config)
- args = [
- '-c',
- config,
- 'mirror',
- 'update',
- 'fakerepo01'
- ]
+ args = ["-c", config, "mirror", "update", "fakerepo01"]
main(args)
def do_snapshot_create(config):
"""Test if createing snapshots works"""
do_mirror_update(config)
- args = [
- '-c',
- config,
- 'snapshot',
- 'create'
- ]
+ args = ["-c", config, "snapshot", "create"]
main(args)
state = SystemStateReader()
state.read()
- assert set(
- ['fakerepo01-20121010T0000Z', 'fakerepo02-20121006T0000Z']
- ).issubset(state.snapshots)
+ assert set(["fakerepo01-20121010T0000Z", "fakerepo02-20121006T0000Z"]).issubset(
+ state.snapshots
+ )
return state
def test_snapshot_create_inexistent():
"""Test if creating an inexistent snapshot raises an error."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"snapshot.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_mirror_update(config)
args = [
- '-c',
+ "-c",
config,
- 'snapshot',
- 'create',
- 'asdfasdf-%T',
+ "snapshot",
+ "create",
+ "asdfasdf-%T",
]
error = False
try:
@@ -218,150 +198,162 @@ def test_snapshot_create_inexistent():
def test_snapshot_create_single():
"""Test if single snapshot create works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"snapshot.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_mirror_update(config)
args = [
- '-c',
+ "-c",
config,
- 'snapshot',
- 'create',
- 'fakerepo01-%T',
+ "snapshot",
+ "create",
+ "fakerepo01-%T",
]
main(args)
state = SystemStateReader()
state.read()
- assert set(
- ['fakerepo01-20121010T0000Z']
- ).issubset(state.snapshots)
+ assert set(["fakerepo01-20121010T0000Z"]).issubset(state.snapshots)
def test_snapshot_create_rotating():
"""Test if rotating snapshot create works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"snapshot-current.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_mirror_update(config)
args = [
- '-c',
+ "-c",
config,
- 'snapshot',
- 'create',
+ "snapshot",
+ "create",
]
main(args)
state = SystemStateReader()
state.read()
assert set(
[
- 'fake-current',
- 'fakerepo01-current',
- 'fakerepo02-current',
+ "fake-current",
+ "fakerepo01-current",
+ "fakerepo02-current",
]
).issubset(state.snapshots)
def test_snapshot_update_rotating():
"""Test if rotating snapshot update works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"snapshot-current.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_snapshot_update_rotating(config)
def test_snapshot_update_threetimes_rotating():
"""Test if rotating snapshot update works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"snapshot-current.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_snapshot_update_rotating(config)
with freezegun.freeze_time("2012-10-11 10:10:10"):
args = [
- '-c',
+ "-c",
config,
- 'snapshot',
- 'update',
+ "snapshot",
+ "update",
]
main(args)
state = SystemStateReader()
state.read()
assert set(
[
- 'fake-current',
- 'fakerepo01-current-rotated-20121010T1010Z',
- 'fakerepo02-current-rotated-20121010T1010Z',
- 'fakerepo01-current-rotated-20121011T1010Z',
- 'fakerepo02-current-rotated-20121011T1010Z',
+ "fake-current",
+ "fakerepo01-current-rotated-20121010T1010Z",
+ "fakerepo02-current-rotated-20121010T1010Z",
+ "fakerepo01-current-rotated-20121011T1010Z",
+ "fakerepo02-current-rotated-20121011T1010Z",
]
).issubset(state.snapshots)
expected = {
- u'fake-current': set([
- u'fakerepo01-current', u'fakerepo02-current'
- ]),
- u'fake-current-rotated-20121010T1010Z': set([
- u'fakerepo01-current-rotated-20121010T1010Z',
- u'fakerepo02-current-rotated-20121010T1010Z'
- ]),
- u'fake-current-rotated-20121011T1010Z': set([
- u'fakerepo01-current-rotated-20121011T1010Z',
- u'fakerepo02-current-rotated-20121011T1010Z',
- ]),
- u'fakerepo01-current': set([]),
- u'fakerepo01-current-rotated-20121010T1010Z': set([]),
- u'fakerepo01-current-rotated-20121011T1010Z': set([]),
- u'fakerepo02-current': set([]),
- u'fakerepo02-current-rotated-20121010T1010Z': set([]),
- u'fakerepo02-current-rotated-20121011T1010Z': set([])
+ "fake-current": set(["fakerepo01-current", "fakerepo02-current"]),
+ "fake-current-rotated-20121010T1010Z": set(
+ [
+ "fakerepo01-current-rotated-20121010T1010Z",
+ "fakerepo02-current-rotated-20121010T1010Z",
+ ]
+ ),
+ "fake-current-rotated-20121011T1010Z": set(
+ [
+ "fakerepo01-current-rotated-20121011T1010Z",
+ "fakerepo02-current-rotated-20121011T1010Z",
+ ]
+ ),
+ "fakerepo01-current": set([]),
+ "fakerepo01-current-rotated-20121010T1010Z": set([]),
+ "fakerepo01-current-rotated-20121011T1010Z": set([]),
+ "fakerepo02-current": set([]),
+ "fakerepo02-current-rotated-20121010T1010Z": set([]),
+ "fakerepo02-current-rotated-20121011T1010Z": set([]),
}
assert state.snapshot_map == expected
with freezegun.freeze_time("2012-10-12 10:10:10"):
args = [
- '-c',
+ "-c",
config,
- 'snapshot',
- 'update',
+ "snapshot",
+ "update",
]
main(args)
state = SystemStateReader()
state.read()
assert set(
[
- 'fake-current',
- 'fakerepo01-current-rotated-20121011T1010Z',
- 'fakerepo02-current-rotated-20121011T1010Z',
- 'fakerepo01-current-rotated-20121012T1010Z',
- 'fakerepo02-current-rotated-20121012T1010Z',
+ "fake-current",
+ "fakerepo01-current-rotated-20121011T1010Z",
+ "fakerepo02-current-rotated-20121011T1010Z",
+ "fakerepo01-current-rotated-20121012T1010Z",
+ "fakerepo02-current-rotated-20121012T1010Z",
]
).issubset(state.snapshots)
expected = {
- u'fake-current': set([
- u'fakerepo01-current', u'fakerepo02-current'
- ]),
- u'fake-current-rotated-20121010T1010Z': set([
- u'fakerepo01-current-rotated-20121010T1010Z',
- u'fakerepo02-current-rotated-20121010T1010Z'
- ]),
- u'fake-current-rotated-20121011T1010Z': set([
- u'fakerepo01-current-rotated-20121011T1010Z',
- u'fakerepo02-current-rotated-20121011T1010Z',
- ]),
- u'fake-current-rotated-20121012T1010Z': set([
- u'fakerepo01-current-rotated-20121012T1010Z',
- u'fakerepo02-current-rotated-20121012T1010Z',
- ]),
- u'fakerepo01-current': set([]),
- u'fakerepo01-current-rotated-20121010T1010Z': set([]),
- u'fakerepo01-current-rotated-20121011T1010Z': set([]),
- u'fakerepo01-current-rotated-20121012T1010Z': set([]),
- u'fakerepo02-current': set([]),
- u'fakerepo02-current-rotated-20121010T1010Z': set([]),
- u'fakerepo02-current-rotated-20121011T1010Z': set([]),
- u'fakerepo02-current-rotated-20121012T1010Z': set([]),
+ "fake-current": set(["fakerepo01-current", "fakerepo02-current"]),
+ "fake-current-rotated-20121010T1010Z": set(
+ [
+ "fakerepo01-current-rotated-20121010T1010Z",
+ "fakerepo02-current-rotated-20121010T1010Z",
+ ]
+ ),
+ "fake-current-rotated-20121011T1010Z": set(
+ [
+ "fakerepo01-current-rotated-20121011T1010Z",
+ "fakerepo02-current-rotated-20121011T1010Z",
+ ]
+ ),
+ "fake-current-rotated-20121012T1010Z": set(
+ [
+ "fakerepo01-current-rotated-20121012T1010Z",
+ "fakerepo02-current-rotated-20121012T1010Z",
+ ]
+ ),
+ "fakerepo01-current": set([]),
+ "fakerepo01-current-rotated-20121010T1010Z": set([]),
+ "fakerepo01-current-rotated-20121011T1010Z": set([]),
+ "fakerepo01-current-rotated-20121012T1010Z": set([]),
+ "fakerepo02-current": set([]),
+ "fakerepo02-current-rotated-20121010T1010Z": set([]),
+ "fakerepo02-current-rotated-20121011T1010Z": set([]),
+ "fakerepo02-current-rotated-20121012T1010Z": set([]),
}
assert state.snapshot_map == expected
@@ -370,148 +362,141 @@ def do_snapshot_update_rotating(config):
"""Helper for rotating snapshot tests"""
do_mirror_update(config)
args = [
- '-c',
+ "-c",
config,
- 'snapshot',
- 'create',
+ "snapshot",
+ "create",
]
main(args)
state = SystemStateReader()
state.read()
assert set(
[
- 'fake-current',
- 'fakerepo01-current',
- 'fakerepo02-current',
+ "fake-current",
+ "fakerepo01-current",
+ "fakerepo02-current",
]
).issubset(state.snapshots)
args = [
- '-c',
+ "-c",
config,
- 'snapshot',
- 'update',
+ "snapshot",
+ "update",
]
main(args)
state.read()
assert set(
[
- 'fake-current',
- 'fakerepo01-current-rotated-20121010T1010Z',
- 'fakerepo02-current-rotated-20121010T1010Z',
+ "fake-current",
+ "fakerepo01-current-rotated-20121010T1010Z",
+ "fakerepo02-current-rotated-20121010T1010Z",
]
).issubset(state.snapshots)
expected = {
- u'fake-current': set([
- u'fakerepo01-current', u'fakerepo02-current'
- ]),
- u'fake-current-rotated-20121010T1010Z': set([
- u'fakerepo01-current-rotated-20121010T1010Z',
- u'fakerepo02-current-rotated-20121010T1010Z'
- ]),
- u'fakerepo01-current': set([]),
- u'fakerepo01-current-rotated-20121010T1010Z': set([]),
- u'fakerepo02-current': set([]),
- u'fakerepo02-current-rotated-20121010T1010Z': set([]),
+ "fake-current": set(["fakerepo01-current", "fakerepo02-current"]),
+ "fake-current-rotated-20121010T1010Z": set(
+ [
+ "fakerepo01-current-rotated-20121010T1010Z",
+ "fakerepo02-current-rotated-20121010T1010Z",
+ ]
+ ),
+ "fakerepo01-current": set([]),
+ "fakerepo01-current-rotated-20121010T1010Z": set([]),
+ "fakerepo02-current": set([]),
+ "fakerepo02-current-rotated-20121010T1010Z": set([]),
}
assert state.snapshot_map == expected
def test_snapshot_create_basic():
"""Test if snapshot create works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"snapshot.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
state = do_snapshot_create(config)
- assert set(
- ['fakerepo01-20121010T0000Z', 'fakerepo02-20121006T0000Z']
- ) == state.snapshots
+ assert (
+ set(["fakerepo01-20121010T0000Z", "fakerepo02-20121006T0000Z"])
+ == state.snapshots
+ )
def test_snapshot_create_repo():
"""Test if repo snapshot create works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"snapshot_repo.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_repo_create(config)
- args = [
- '-c',
- config,
- 'snapshot',
- 'create'
- ]
+ args = ["-c", config, "snapshot", "create"]
main(args)
state = SystemStateReader()
state.read()
- assert set(
- ['centrify-latest']
- ).issubset(state.snapshots)
+ assert set(["centrify-latest"]).issubset(state.snapshots)
return state
def test_snapshot_create_merge():
"""Test if snapshot merge create works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"snapshot_merge.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
state = do_snapshot_create(config)
- assert set(
- [
- 'fakerepo01-20121010T0000Z',
- 'fakerepo02-20121006T0000Z',
- 'superfake-20121010T0000Z'
- ]
- ) == state.snapshots
+ assert (
+ set(
+ [
+ "fakerepo01-20121010T0000Z",
+ "fakerepo02-20121006T0000Z",
+ "superfake-20121010T0000Z",
+ ]
+ )
+ == state.snapshots
+ )
expect = {
- 'fakerepo01-20121010T0000Z': set([]),
- 'fakerepo02-20121006T0000Z': set([]),
- 'superfake-20121010T0000Z': set([
- 'fakerepo01-20121010T0000Z',
- 'fakerepo02-20121006T0000Z'
- ])
+ "fakerepo01-20121010T0000Z": set([]),
+ "fakerepo02-20121006T0000Z": set([]),
+ "superfake-20121010T0000Z": set(
+ ["fakerepo01-20121010T0000Z", "fakerepo02-20121006T0000Z"]
+ ),
}
assert expect == state.snapshot_map
def test_snapshot_create_filter():
"""Test if snapshot filter create works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"snapshot_filter.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_snapshot_create(config)
- data, _ = call_output([
- 'aptly',
- 'snapshot',
- 'search',
- 'filterfake01-20121010T0000Z',
- 'Name (% *)'
- ])
- state = [x.strip() for x in data.split('\n') if x]
- expect = ['libhello_0.1-1_amd64']
+ data, _ = call_output(
+ ["aptly", "snapshot", "search", "filterfake01-20121010T0000Z", "Name (% *)"]
+ )
+ state = [x.strip() for x in data.split("\n") if x]
+ expect = ["libhello_0.1-1_amd64"]
assert state == expect
def do_publish_create(config):
"""Test if creating publishes works."""
do_snapshot_create(config)
- args = [
- '-c',
- config,
- 'publish',
- 'create'
- ]
+ args = ["-c", config, "publish", "create"]
main(args)
state = SystemStateReader()
state.read()
- assert set(
- ['fakerepo02 main', 'fakerepo01 main']
- ) == state.publishes
+ assert set(["fakerepo02 main", "fakerepo01 main"]) == state.publishes
expect = {
- 'fakerepo02 main': set(['fakerepo02-20121006T0000Z']),
- 'fakerepo01 main': set(['fakerepo01-20121010T0000Z'])
+ "fakerepo02 main": set(["fakerepo02-20121006T0000Z"]),
+ "fakerepo01 main": set(["fakerepo01-20121010T0000Z"]),
}
assert expect == state.publish_map
@@ -519,67 +504,67 @@ def do_publish_create(config):
def do_publish_create_rotating(config):
"""Test if creating publishes works."""
do_snapshot_update_rotating(config)
- args = [
- '-c',
- config,
- 'publish',
- 'create'
- ]
+ args = ["-c", config, "publish", "create"]
main(args)
state = SystemStateReader()
state.read()
- assert set([
- 'fakerepo01/current stable',
- 'fake/current stable',
- 'fakerepo02/current stable',
- ]) == state.publishes
+ assert (
+ set(
+ [
+ "fakerepo01/current stable",
+ "fake/current stable",
+ "fakerepo02/current stable",
+ ]
+ )
+ == state.publishes
+ )
expect = {
- u'fake/current stable': set([u'fake-current']),
- u'fakerepo01/current stable': set([u'fakerepo01-current']),
- u'fakerepo02/current stable': set([u'fakerepo02-current'])
+ "fake/current stable": set(["fake-current"]),
+ "fakerepo01/current stable": set(["fakerepo01-current"]),
+ "fakerepo02/current stable": set(["fakerepo02-current"]),
}
assert expect == state.publish_map
def test_publish_create_single():
"""Test if creating a single publish works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"publish.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_snapshot_create(config)
args = [
- '-c',
+ "-c",
config,
- 'publish',
- 'create',
- 'fakerepo01',
+ "publish",
+ "create",
+ "fakerepo01",
]
main(args)
state = SystemStateReader()
state.read()
- assert set(
- ['fakerepo01 main']
- ) == state.publishes
- expect = {
- 'fakerepo01 main': set(['fakerepo01-20121010T0000Z'])
- }
+ assert set(["fakerepo01 main"]) == state.publishes
+ expect = {"fakerepo01 main": set(["fakerepo01-20121010T0000Z"])}
assert expect == state.publish_map
def test_publish_create_inexistent():
"""Test if creating inexistent publish raises an error."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"publish.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_snapshot_create(config)
args = [
- '-c',
+ "-c",
config,
- 'publish',
- 'create',
- 'asdfasdf',
+ "publish",
+ "create",
+ "asdfasdf",
]
error = False
try:
@@ -591,98 +576,106 @@ def test_publish_create_inexistent():
def test_publish_create_repo():
"""Test if creating repo publishes works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"publish_repo.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_repo_create(config)
args = [
- '-c',
+ "-c",
config,
- 'publish',
- 'create',
+ "publish",
+ "create",
]
main(args)
args = [
- '-c',
+ "-c",
config,
- 'publish',
- 'update',
+ "publish",
+ "update",
]
main(args)
state = SystemStateReader()
state.read()
- assert set(
- ['centrify latest']
- ) == state.publishes
- assert {'centrify latest': set([])} == state.publish_map
+ assert set(["centrify latest"]) == state.publishes
+ assert {"centrify latest": set([])} == state.publish_map
def test_publish_create_basic():
"""Test if creating publishes works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"publish.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_publish_create(config)
def test_publish_update_rotating():
"""Test if update rotating publishes works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"publish-current.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_publish_create_rotating(config)
with freezegun.freeze_time("2012-10-11 10:10:10"):
args = [
- '-c',
+ "-c",
config,
- 'publish',
- 'update',
+ "publish",
+ "update",
]
main(args)
state = SystemStateReader()
state.read()
expect = {
- u'fake/current stable': set([u'fake-current']),
- u'fakerepo01/current stable': set([u'fakerepo01-current']),
- u'fakerepo02/current stable': set([u'fakerepo02-current'])
+ "fake/current stable": set(["fake-current"]),
+ "fakerepo01/current stable": set(["fakerepo01-current"]),
+ "fakerepo02/current stable": set(["fakerepo02-current"]),
}
assert expect == state.publish_map
def test_publish_snapshot_update_rotating():
"""Test if update rotating publishes via snapshot works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"publish-current.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_publish_create_rotating(config)
with freezegun.freeze_time("2012-10-11 10:10:10"):
args = [
- '-c',
+ "-c",
config,
- 'snapshot',
- 'update',
+ "snapshot",
+ "update",
]
main(args)
state = SystemStateReader()
state.read()
expect = {
- u'fake/current stable': set([u'fake-current']),
- u'fakerepo01/current stable': set([u'fakerepo01-current']),
- u'fakerepo02/current stable': set([u'fakerepo02-current'])
+ "fake/current stable": set(["fake-current"]),
+ "fakerepo01/current stable": set(["fakerepo01-current"]),
+ "fakerepo02/current stable": set(["fakerepo02-current"]),
}
assert expect == state.publish_map
def test_publish_create_rotating():
"""Test if creating rotating publishes works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"publish-current.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_publish_create_rotating(config)
@@ -697,51 +690,55 @@ def do_publish_create_republish(config):
found = True
assert found
args = [
- '-c',
+ "-c",
config,
- 'publish',
- 'create',
+ "publish",
+ "create",
]
main(args)
state = SystemStateReader()
state.read()
- assert 'fakerepo01-stable main' in state.publishes
+ assert "fakerepo01-stable main" in state.publishes
def test_publish_create_republish():
"""Test if creating republishes works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"publish_publish.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_publish_create_republish(config)
def test_publish_update_republish():
"""Test if update republishes works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"publish_publish.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_publish_create_republish(config)
with freezegun.freeze_time("2012-10-11 10:10:10"):
args = [
- '-c',
+ "-c",
config,
- 'snapshot',
- 'create',
+ "snapshot",
+ "create",
]
main(args)
args = [
- '-c',
+ "-c",
config,
- 'publish',
- 'update',
+ "publish",
+ "update",
]
main(args)
state = SystemStateReader()
state.read()
- assert 'fakerepo01-stable main' in state.publishes
+ assert "fakerepo01-stable main" in state.publishes
        # As you can see, fakerepo01-stable main still points to the old
        # snapshot. Strictly speaking this is not correct, but it will be
        # fixed with the next call to publish update. Since we run this from an hourly cron
@@ -749,103 +746,90 @@ def test_publish_update_republish():
# This can't be easily fixed and would need a rewrite of the
# dependencies engine.
expect = {
- 'fakerepo01-stable main': set(['fakerepo01-20121010T0000Z']),
- 'fakerepo02 main': set(['fakerepo02-20121006T0000Z']),
- 'fakerepo01 main': set(['fakerepo01-20121011T0000Z'])
+ "fakerepo01-stable main": set(["fakerepo01-20121010T0000Z"]),
+ "fakerepo02 main": set(["fakerepo02-20121006T0000Z"]),
+ "fakerepo01 main": set(["fakerepo01-20121011T0000Z"]),
}
assert expect == state.publish_map
def test_publish_updating_basic():
"""Test if updating publishes works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"publish.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_publish_create(config)
with freezegun.freeze_time("2012-10-11 10:10:10"):
- args = [
- '-c',
- config,
- 'snapshot',
- 'create'
- ]
+ args = ["-c", config, "snapshot", "create"]
main(args)
- args = [
- '-c',
- config,
- 'publish',
- 'update'
- ]
+ args = ["-c", config, "publish", "update"]
main(args)
state = SystemStateReader()
state.read()
- expect = set([
- 'archived-fakerepo01-20121011T1010Z',
- 'fakerepo01-20121011T0000Z',
- 'fakerepo02-20121006T0000Z',
- 'fakerepo01-20121010T0000Z',
- ])
+ expect = set(
+ [
+ "archived-fakerepo01-20121011T1010Z",
+ "fakerepo01-20121011T0000Z",
+ "fakerepo02-20121006T0000Z",
+ "fakerepo01-20121010T0000Z",
+ ]
+ )
assert expect == state.snapshots
expect = {
- 'fakerepo02 main': set(['fakerepo02-20121006T0000Z']),
- 'fakerepo01 main': set(['fakerepo01-20121011T0000Z'])
+ "fakerepo02 main": set(["fakerepo02-20121006T0000Z"]),
+ "fakerepo01 main": set(["fakerepo01-20121011T0000Z"]),
}
assert expect == state.publish_map
def do_repo_create(config):
"""Test if creating repositories works."""
- args = [
- '-c',
- config,
- 'repo',
- 'create'
- ]
+ args = ["-c", config, "repo", "create"]
main(args)
state = SystemStateReader()
state.read()
- call_output([
- 'aptly',
- 'repo',
- 'add',
- 'centrify',
- 'vagrant/hellome_0.1-1_amd64.deb'
- ])
- assert set(['centrify']) == state.repos
+ call_output(["aptly", "repo", "add", "centrify", "vagrant/hellome_0.1-1_amd64.deb"])
+ assert set(["centrify"]) == state.repos
def test_repo_create_single():
"""Test if creating a single repo works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"repo.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
args = [
- '-c',
+ "-c",
config,
- 'repo',
- 'create',
- 'centrify',
+ "repo",
+ "create",
+ "centrify",
]
main(args)
state = SystemStateReader()
state.read()
- assert set(['centrify']) == state.repos
+ assert set(["centrify"]) == state.repos
def test_repo_create_inexistent():
"""Test if creating an inexistent repo causes an error."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"repo.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
args = [
- '-c',
+ "-c",
config,
- 'repo',
- 'create',
- 'asdfasdf',
+ "repo",
+ "create",
+ "asdfasdf",
]
error = False
try:
@@ -857,26 +841,26 @@ def test_repo_create_inexistent():
def test_repo_create_basic():
"""Test if creating repositories works."""
- with test.clean_and_config(os.path.join(
+ with test.clean_and_config(
+ os.path.join(
_test_base,
b"repo.yml",
- )) as (tyml, config):
+ )
+ ) as (tyml, config):
do_repo_create(config)
def test_snapshot_spec_as_dict():
"Test various snapshot formats for snapshot_spec_to_name()"
- snap_string = 'snapshot-foo'
- snap_dict = {
- 'name': 'foo'
- }
+ snap_string = "snapshot-foo"
+ snap_dict = {"name": "foo"}
cfg = {
- 'snapshot': {
- 'foo': {},
+ "snapshot": {
+ "foo": {},
}
}
assert snapshot_spec_to_name(cfg, snap_string) == snap_string
- assert snapshot_spec_to_name(cfg, snap_dict) == 'foo'
+ assert snapshot_spec_to_name(cfg, snap_dict) == "foo"
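A spec given as a plain string is already a snapshot name, while a dict spec refers to a snapshot defined in the config. A minimal sketch of that resolution, capturing only the contract the test above checks (the real snapshot_spec_to_name may do more, e.g. timestamp expansion):

    def snapshot_spec_to_name(cfg, spec):
        """Resolve a snapshot spec (str or dict) to a snapshot name."""
        if isinstance(spec, dict):
            name = spec["name"]
            # A dict spec must reference a snapshot configured in cfg.
            assert name in cfg["snapshot"]
            return name
        return spec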
diff --git a/pyaptly/dateround_test.py b/pyaptly/dateround_test.py
index 83ab6cc..c12aed1 100644
--- a/pyaptly/dateround_test.py
+++ b/pyaptly/dateround_test.py
@@ -14,7 +14,7 @@
if not sys.version_info < (2, 7): # pragma: no cover
from hypothesis import given # noqa
- from hypothesis.extra.datetime import datetimes, times # noqa
+ from hypothesis.strategies import datetimes, times # noqa
from hypothesis.strategies import integers # noqa
@@ -39,7 +39,7 @@ def test_is_to_gregorian(date): # pragma: no cover
@test.hypothesis_min_ver
@given(
- datetimes(min_year=2),
+ datetimes(min_value=datetime.datetime(year=2, month=1, day=1)),
integers(min_value=1, max_value=7),
times())
def test_round_weekly(date, day_of_week, time): # pragma: no cover
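The change above tracks Hypothesis's API move: hypothesis.extra.datetime is gone, and datetimes() from hypothesis.strategies takes concrete min_value/max_value bounds instead of min_year. The equivalent modern usage as a standalone sketch:

    import datetime

    from hypothesis import given
    from hypothesis.strategies import datetimes, times

    # min_year=2 from the old API becomes an explicit datetime lower bound.
    @given(datetimes(min_value=datetime.datetime(year=2, month=1, day=1)), times())
    def test_years_start_at_two(date, time):
        assert date.year >= 2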
diff --git a/pyaptly/graph_test.py b/pyaptly/graph_test.py
index 25d7760..64a082b 100644
--- a/pyaptly/graph_test.py
+++ b/pyaptly/graph_test.py
@@ -8,9 +8,15 @@
from hypothesis import strategies as st
from hypothesis import given
+from hypothesis import settings
+
+# Disable the deadline globally for all tests
+settings.register_profile("my_profile", deadline=None)
+settings.load_profile("my_profile")
if sys.version_info < (2, 7): # pragma: no cover
import mock
+
given = mock.MagicMock() # noqa
example = mock.MagicMock() # noqa
st = mock.MagicMock() # noqa
@@ -26,17 +32,11 @@ def provide_require_st(draw, filter_=True): # pragma: no cover
provides = draw(
st.lists(
st.lists(range_intagers_st, max_size=10),
- min_size = commands,
- max_size = commands
+ min_size=commands,
+ max_size=commands,
),
)
- is_func = draw(
- st.lists(
- st.booleans(),
- min_size = commands,
- max_size = commands
- )
- )
+ is_func = draw(st.lists(st.booleans(), min_size=commands, max_size=commands))
provides_set = set()
for command in provides:
provides_set.update(command)
@@ -52,7 +52,7 @@ def provide_require_st(draw, filter_=True): # pragma: no cover
else:
provides_filter = provides_set
if provides_filter:
- sample = st.sampled_from(provides_filter)
+ sample = st.sampled_from(list(provides_filter))
requires.append(draw(st.lists(sample, max_size=10)))
else:
requires.append([])
@@ -63,11 +63,13 @@ def provide_require_st(draw, filter_=True): # pragma: no cover
def print_example(): # pragma: no cover
example = provide_require_st().example()
- print("""
+ print(
+ """
digraph g {
label="Command graph";
graph [splines=line];
- """)
+ """
+ )
for i in range(len(example[0])):
print(" c%03d [shape=triangle];" % i)
for provides in example[0][i]:
@@ -98,11 +100,7 @@ def test_graph_cycles(tree, rnd): # pragma: no cover
@test.hypothesis_min_ver
-@given(
- provide_require_st(),
- provide_require_st(),
- st.random_module()
-)
+@given(provide_require_st(), provide_require_st(), st.random_module())
def test_graph_island(tree0, tree1, rnd): # pragma: no cover
"""Test with two independant graphs which can form a island"""
tree = (tree0[0] + tree1[0], tree0[1] + tree1[1], tree0[2] + tree1[2])
@@ -115,6 +113,7 @@ def run_graph(tree): # pragma: no cover
index = list(range(len(tree[0])))
random.shuffle(index)
for i in index:
+
def dummy(): # pragma: no cover
return i
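Two of the graph_test changes are behavioural rather than cosmetic: newer Hypothesis rejects bare sets in sampled_from (it needs a deterministically ordered sequence, hence list(provides_filter)), and it enforces a per-test deadline that slow composite strategies routinely exceed, hence the profile registered with deadline=None. Both patterns in isolation:

    from hypothesis import given, settings
    from hypothesis import strategies as st

    # Opt out of the per-test deadline once, at import time.
    settings.register_profile("no_deadline", deadline=None)
    settings.load_profile("no_deadline")

    # sampled_from() wants an ordered sequence, so convert sets first.
    @given(st.sampled_from(sorted({"a", "b", "c"})))
    def test_sampled(value):
        assert value in {"a", "b", "c"}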
diff --git a/pyaptly/mirror-google.yml b/pyaptly/mirror-google.yml
index 207cbb5..60aa346 100644
--- a/pyaptly/mirror-google.yml
+++ b/pyaptly/mirror-google.yml
@@ -2,5 +2,5 @@ mirror:
google-chrome:
archive: "http://dl.google.com/linux/chrome/deb/"
distribution: "stable"
- gpg-keys: ["7FAC5991", "1397BC53640DB551"]
+ gpg-keys: ["7FAC5991", "EB4C1BFD4F042F6DDDCCEC917721F63BD38B4796"]
gpg-urls: ["https://dl.google.com/linux/linux_signing_key.pub"]
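The Google key is now pinned by its full fingerprint rather than a short 8-hex-digit ID; short IDs are cheap to collide, so the fingerprint is the safer reference. Roughly the retrieval this implies, sketched with subprocess (the keyserver address is an assumption for illustration):

    import subprocess

    # Import a key into aptly's trusted keyring by full fingerprint.
    subprocess.check_call([
        "gpg",
        "--no-default-keyring",
        "--keyring", "trustedkeys.gpg",
        "--keyserver", "hkps://keyserver.ubuntu.com",  # assumed keyserver
        "--recv-keys", "EB4C1BFD4F042F6DDDCCEC917721F63BD38B4796",
    ])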
diff --git a/pyaptly/mirror.yml b/pyaptly/mirror.yml
index 99fd609..df7cbc4 100644
--- a/pyaptly/mirror.yml
+++ b/pyaptly/mirror.yml
@@ -5,8 +5,3 @@ mirror:
fakerepo02:
archive: "http://localhost:8421/fakerepo02"
gpg-keys: ["5ED1AC57"]
- google-chrome:
- archive: "http://dl.google.com/linux/chrome/deb/"
- distribution: "stable"
- gpg-keys: ["7FAC5991"]
- gpg-urls: ["https://dl.google.com/linux/linux_signing_key.pub"]
diff --git a/pyaptly/test.py b/pyaptly/test.py
index ea15842..a18f2bf 100644
--- a/pyaptly/test.py
+++ b/pyaptly/test.py
@@ -2,11 +2,13 @@
import codecs
import contextlib
+import json
import os
import shutil
import subprocess
import sys
import tempfile
+from pathlib import Path
import freezegun
import pytest
@@ -15,9 +17,10 @@
import pyaptly
+aptly_conf = Path.home().absolute() / ".aptly.conf"
+
hypothesis_min_ver = pytest.mark.skipif(
- sys.version_info < (2, 7),
- reason="requires python2.7"
+ sys.version_info < (2, 7), reason="requires python2.7"
)
if six.PY2: # pragma: no cover
@@ -33,16 +36,16 @@ def read_yml(file_):
:type file_: str"""
directory = os.path.dirname(file_)
with codecs.open(file_, encoding="UTF-8") as f:
- main_yml = dict(yaml.load(f.read()))
+ main_yml = dict(yaml.safe_load(f.read()))
merges = []
if "merge" in main_yml:
- for merge_path in main_yml['merge']:
+ for merge_path in main_yml["merge"]:
path = os.path.join(
directory,
merge_path.encode("UTF-8"),
)
merges.append(read_yml(path))
- del main_yml['merge']
+ del main_yml["merge"]
for merge_struct in merges:
main_yml = merge(main_yml, merge_struct)
return main_yml
@@ -75,7 +78,7 @@ def execute_and_parse_show_cmd(args):
"""
result = {}
show, _ = pyaptly.call_output(args)
- for line in show.split('\n'):
+ for line in show.split("\n"):
if ":" in line:
key, value = line.split(":", 1)
key = key.lower()
@@ -93,23 +96,21 @@ def create_config(test_input):
:rtype: (dict, str)
"""
input_ = read_yml(test_input)
- if 'mirror' in input_:
- for mirror in input_['mirror'].values():
- if 'components' not in mirror:
- mirror['components'] = "main"
- if 'distribution' not in mirror:
- mirror['distribution'] = "main"
- if 'publish' in input_:
- for publish in input_['publish'].values():
+ if "mirror" in input_:
+ for mirror in input_["mirror"].values():
+ if "components" not in mirror:
+ mirror["components"] = "main"
+ if "distribution" not in mirror:
+ mirror["distribution"] = "main"
+ if "publish" in input_:
+ for publish in input_["publish"].values():
for item in publish:
- if 'components' not in item:
- item['components'] = "main"
- if 'distribution' not in item:
- item['distribution'] = "main"
+ if "components" not in item:
+ item["components"] = "main"
+ if "distribution" not in item:
+ item["distribution"] = "main"
try:
- file_ = codecs.getwriter("UTF-8")(
- tempfile.NamedTemporaryFile(delete=False)
- )
+ file_ = codecs.getwriter("UTF-8")(tempfile.NamedTemporaryFile(delete=False))
yaml.dump(input_, file_)
finally:
file_.close()
@@ -129,12 +130,36 @@ def clean_and_config(test_input, freeze="2012-10-10 10:10:10"):
:param freeze: str
:rtype: (dict, str)
"""
- old_home = environb[b'HOME']
- if b"pyaptly" not in old_home and b"vagrant" not in old_home: # pragma: no cover # noqa
+ tempdir_obj = tempfile.TemporaryDirectory()
+ tempdir = Path(tempdir_obj.name).absolute()
+
+ aptly = tempdir / "aptly"
+ aptly.mkdir(parents=True)
+ config = {"rootDir": str(aptly)}
+ if aptly_conf.exists():
+ aptly_conf.unlink()
+ with aptly_conf.open("w") as f:
+ json.dump(config, f)
+
+    gnupg = tempdir / "gnupg"
+ gnupg.mkdir(parents=True)
+ environb[b"GNUPGHOME"] = str(gnupg).encode("UTF-8")
+
+ input_, file_ = create_config(test_input)
+ try:
+ yield (input_, file_)
+ finally:
+ tempdir_obj.cleanup()
+ aptly_conf.unlink()
+ return
+ old_home = environb[b"HOME"]
+ if (
+ b"pyaptly" not in old_home and b"vagrant" not in old_home
+ ): # pragma: no cover # noqa
raise ValueError(
"Not safe to test here. Either you haven't set HOME to the "
"repository path %s. Or you havn't checked out the repository "
- "as pyaptly." % os.path.abspath('.')
+ "as pyaptly." % os.path.abspath(".")
)
file_ = None
new_home = None
@@ -145,10 +170,14 @@ def clean_and_config(test_input, freeze="2012-10-10 10:10:10"):
except OSError: # pragma: no cover
pass
os.mkdir(new_home)
- environb[b'HOME'] = new_home
+ shutil.copytree(
+ f"{old_home.decode('UTF-8')}/.gnupg", f"{new_home.decode('UTF-8')}/.gnupg"
+ )
+ environb[b"HOME"] = new_home
with freezegun.freeze_time(freeze):
try:
- shutil.rmtree("%s/.aptly" % new_home.decode("UTF-8"))
+ aptly_dir = Path("%s/.aptly" % new_home.decode("UTF-8"))
+ shutil.rmtree(aptly_dir)
except OSError: # pragma: no cover
pass
try:
@@ -156,29 +185,31 @@ def clean_and_config(test_input, freeze="2012-10-10 10:10:10"):
except OSError: # pragma: no cover
pass
try:
- os.unlink('%s/.gnupg/S.gpg-agent' % old_home.decode("UTF-8"))
+ os.unlink("%s/.gnupg/S.gpg-agent" % old_home.decode("UTF-8"))
except OSError:
pass
shutil.copytree(
"%s/.gnupg/" % old_home.decode("UTF-8"),
- "%s/.gnupg" % new_home.decode("UTF-8")
+ "%s/.gnupg" % new_home.decode("UTF-8"),
)
input_, file_ = create_config(test_input)
try:
- subprocess.check_call([
- b'gpg',
- b'--keyring',
- b'trustedkeys.gpg',
- b'--batch',
- b'--yes',
- b'--delete-key',
- b'7FAC5991',
- ])
+ subprocess.check_call(
+ [
+ b"gpg",
+ b"--keyring",
+ b"trustedkeys.gpg",
+ b"--batch",
+ b"--yes",
+ b"--delete-key",
+ b"7FAC5991",
+ ]
+ )
except subprocess.CalledProcessError: # pragma: no cover
pass
yield (input_, file_)
finally:
- environb[b'HOME'] = old_home
+ environb[b"HOME"] = old_home
if file_:
os.unlink(file_)
if new_home:
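The rewritten clean_and_config no longer copies state out of the developer's real $HOME: it fabricates a temporary aptly rootDir, announces it through ~/.aptly.conf, and points GNUPGHOME at a fresh directory, tearing everything down afterwards (the legacy HOME-swapping branch below the early return appears to be dead code kept for reference). The same isolation, condensed into a hypothetical standalone context manager:

    import json
    import tempfile
    from contextlib import contextmanager
    from os import environb
    from pathlib import Path

    @contextmanager
    def isolated_aptly():
        """Point aptly and gpg at throwaway per-test directories."""
        with tempfile.TemporaryDirectory() as tmp:
            root = Path(tmp) / "aptly"
            root.mkdir()
            conf = Path.home() / ".aptly.conf"
            conf.write_text(json.dumps({"rootDir": str(root)}))
            gnupg = Path(tmp) / "gnupg"
            gnupg.mkdir()
            environb[b"GNUPGHOME"] = str(gnupg).encode("UTF-8")
            try:
                yield root
            finally:
                conf.unlink()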
diff --git a/pyaptly/test_test.py b/pyaptly/test_test.py
index 43f21dd..32b591c 100644
--- a/pyaptly/test_test.py
+++ b/pyaptly/test_test.py
@@ -26,11 +26,10 @@
st.floats(-1, 1) | st.booleans() |
st.text() | st.none() | st.binary(),
lambda children: st.lists(
- children, average_size=5, max_size=10
+ children, max_size=10
) | st.dictionaries(
st.text(),
children,
- average_size=5,
max_size=10
),
max_leaves=30
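average_size was removed from Hypothesis's collection strategies (element-count distribution is now handled internally), leaving only max_size. The recursive JSON-like strategy above, written out in full under the current API:

    from hypothesis import strategies as st

    # Arbitrary JSON-ish values: scalar leaves, or lists/dicts of them.
    json_st = st.recursive(
        st.floats(-1, 1) | st.booleans() | st.text() | st.none() | st.binary(),
        lambda children: st.lists(children, max_size=10)
        | st.dictionaries(st.text(), children, max_size=10),
        max_leaves=30,
    )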
diff --git a/pyproject b/pyproject
deleted file mode 160000
index 8471700..0000000
--- a/pyproject
+++ /dev/null
@@ -1 +0,0 @@
-Subproject commit 8471700f2009599d037ba2b941aef704655b13f8
diff --git a/pyproject.toml b/pyproject.toml
new file mode 100644
index 0000000..f3dfb2f
--- /dev/null
+++ b/pyproject.toml
@@ -0,0 +1,39 @@
+[tool.poetry]
+name = "pyaptly"
+version = "2.0.0"
+description = "Automates the creation and management of aptly mirrors and snapshots based on YAML input files."
+authors = ["Jean-Louis Fuchs "]
+license = "AGPL-3.0-or-later"
+readme = "README.md"
+
+[tool.poetry.dependencies]
+python = "^3.11"
+pretty-dump = {git = "https://github.com/adfinis/freeze"}
+pytz = "^2023.3.post1"
+pyyaml = "^6.0.1"
+
+[tool.poetry.group.dev.dependencies]
+freezegun = "^1.2.2"
+hypothesis = "^6.87.1"
+testfixtures = "^7.2.0"
+mock = "^5.1.0"
+
+pytest = "^7.4.3"
+mypy = "^1.7.1"
+pdbpp = "^0.10.3"
+black = "^23.11.0"
+isort = "^5.12.0"
+flake8 = "^6.1.0"
+python-lsp-server = "^1.9.0"
+python-lsp-black = "^1.3.0"
+flake8-bugbear = "^23.12.2"
+flake8-debugger = "^4.1.2"
+flake8-isort = "^6.1.1"
+flake8-docstrings = "^1.7.0"
+flake8-string-format = "^0.3.0"
+flake8-tuple = "^0.4.1"
+python-lsp-isort = "^0.1"
+
+[build-system]
+requires = ["poetry-core"]
+build-backend = "poetry.core.masonry.api"
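With packaging moved from setup.py to Poetry, the version now lives only in pyproject.toml; at runtime it can be read through importlib.metadata instead of the old pyaptly/version.py module (a sketch, assuming the package is installed):

    from importlib.metadata import version

    assert version("pyaptly") == "2.0.0"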
diff --git a/setup.py b/setup.py
deleted file mode 100644
index 963fc36..0000000
--- a/setup.py
+++ /dev/null
@@ -1,78 +0,0 @@
-"""Setuptools package definition"""
-
-from setuptools import setup
-from setuptools import find_packages
-import os
-
-
-__version__ = None
-version_file = "pyaptly/version.py"
-with open(version_file) as f:
- code = compile(f.read(), version_file, 'exec')
- exec(code)
-
-
-def find_data(packages, extensions):
- """Finds data files along with source.
-
- :param packages: Look in these packages
- :param extensions: Look for these extensions
- """
- data = {}
- for package in packages:
- package_path = package.replace('.', '/')
- for dirpath, _, filenames in os.walk(package_path):
- for filename in filenames:
- for extension in extensions:
- if filename.endswith(".%s" % extension):
- file_path = os.path.join(
- dirpath,
- filename
- )
- file_path = file_path[len(package) + 1:]
- if package not in data:
- data[package] = []
- data[package].append(file_path)
- return data
-
-with open('README.rst', 'r') as f:
- README_TEXT = f.read()
-
-setup(
- name = "pyaptly",
- version = __version__,
- packages = find_packages(),
- package_data=find_data(
- find_packages(), ["yml"]
- ),
- entry_points = {
- 'console_scripts': [
- "pyaptly = pyaptly:main",
- ]
- },
- install_requires = [
- "pyyaml",
- "freeze",
- "six"
- ],
- author = "Adfinis-SyGroup",
- author_email = "https://adfinis-sygroup.ch/",
- description = "Aptly mirror/snapshot managment automation.",
- long_description = README_TEXT,
- keywords = "aptly mirror snapshot automation",
- url = "https://github.com/adfinis-sygroup/pyaptly",
- classifiers = [
- "Development Status :: 5 - Production/Stable",
- "Environment :: Console",
- "Intended Audience :: Developers",
- "Intended Audience :: Information Technology",
- "License :: OSI Approved :: "
- "GNU Affero General Public License v3",
- "Natural Language :: English",
- "Operating System :: OS Independent",
- "Programming Language :: Python :: 2.6",
- "Programming Language :: Python :: 2.7",
- "Programming Language :: Python :: 3.4",
- "Programming Language :: Python :: 3.5",
- ]
-)
diff --git a/testenv b/testenv
deleted file mode 100644
index 20c8032..0000000
--- a/testenv
+++ /dev/null
@@ -1,2 +0,0 @@
-export HOME="$(pwd)"
-export PATH="$HOME/.aptly-bin/:$PATH"
diff --git a/vagrant/5ED1AC57.key b/vagrant/5ED1AC57.key
deleted file mode 100644
index 5a2c626..0000000
Binary files a/vagrant/5ED1AC57.key and /dev/null differ
diff --git a/vagrant/640DB551.key b/vagrant/640DB551.key
deleted file mode 100644
index 0434c9b..0000000
Binary files a/vagrant/640DB551.key and /dev/null differ
diff --git a/vagrant/7FAC5991.key b/vagrant/7FAC5991.key
deleted file mode 100644
index 1a8192d..0000000
Binary files a/vagrant/7FAC5991.key and /dev/null differ
diff --git a/vagrant/default.conf b/vagrant/default.conf
deleted file mode 100644
index 2bf9ede..0000000
--- a/vagrant/default.conf
+++ /dev/null
@@ -1,17 +0,0 @@
-#
-# The default server
-#
-server {
- listen 80 default_server;
- server_name _;
-
- # Load configuration files for the default server block.
- include /etc/nginx/default.d/*.conf;
-
- location / {
- autoindex on;
- root /root/.aptly/public;
- }
-}
-
-
diff --git a/vagrant/get-pip.py b/vagrant/get-pip.py
deleted file mode 100644
index 30a6cd7..0000000
--- a/vagrant/get-pip.py
+++ /dev/null
@@ -1,17759 +0,0 @@
-#!/usr/bin/env python
-#
-# Hi There!
-# You may be wondering what this giant blob of binary data here is, you might
-# even be worried that we're up to something nefarious (good for you for being
-# paranoid!). This is a base85 encoding of a zip file, this zip file contains
-# an entire copy of pip.
-#
-# Pip is a thing that installs packages, pip itself is a package that someone
-# might want to install, especially if they're looking to run this get-pip.py
-# script. Pip has a lot of code to deal with the security of installing
-# packages, various edge cases on various platforms, and other such sort of
-# "tribal knowledge" that has been encoded in its code base. Because of this
-# we basically include an entire copy of pip inside this blob. We do this
-# because the alternatives are attempt to implement a "minipip" that probably
-# doesn't do things correctly and has weird edge cases, or compress pip itself
-# down into a single file.
-#
-# If you're wondering how this is created, it is using an invoke task located
-# in tasks/generate.py called "installer". It can be invoked by using
-# ``invoke generate.installer``.
-
-import os.path
-import pkgutil
-import shutil
-import sys
-import struct
-import tempfile
-
-# Useful for very coarse version differentiation.
-PY2 = sys.version_info[0] == 2
-PY3 = sys.version_info[0] == 3
-
-if PY3:
- iterbytes = iter
-else:
- def iterbytes(buf):
- return (ord(byte) for byte in buf)
-
-try:
- from base64 import b85decode
-except ImportError:
- _b85alphabet = (b"0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"
- b"abcdefghijklmnopqrstuvwxyz!#$%&()*+-;<=>?@^_`{|}~")
-
- def b85decode(b):
- _b85dec = [None] * 256
- for i, c in enumerate(iterbytes(_b85alphabet)):
- _b85dec[c] = i
-
- padding = (-len(b)) % 5
- b = b + b'~' * padding
- out = []
- packI = struct.Struct('!I').pack
- for i in range(0, len(b), 5):
- chunk = b[i:i + 5]
- acc = 0
- try:
- for c in iterbytes(chunk):
- acc = acc * 85 + _b85dec[c]
- except TypeError:
- for j, c in enumerate(iterbytes(chunk)):
- if _b85dec[c] is None:
- raise ValueError(
- 'bad base85 character at position %d' % (i + j)
- )
- raise
- try:
- out.append(packI(acc))
- except struct.error:
- raise ValueError('base85 overflow in hunk starting at byte %d'
- % i)
-
- result = b''.join(out)
- if padding:
- result = result[:-padding]
- return result
-
-
-def bootstrap(tmpdir=None):
- # Import pip so we can use it to install pip and maybe setuptools too
- import pip
- from pip.commands.install import InstallCommand
-
- # Wrapper to provide default certificate with the lowest priority
- class CertInstallCommand(InstallCommand):
- def parse_args(self, args):
- # If cert isn't specified in config or environment, we provide our
- # own certificate through defaults.
- # This allows user to specify custom cert anywhere one likes:
- # config, environment variable or argv.
- if not self.parser.get_default_values().cert:
- self.parser.defaults["cert"] = cert_path # calculated below
- return super(CertInstallCommand, self).parse_args(args)
-
- pip.commands_dict["install"] = CertInstallCommand
-
- # We always want to install pip
- packages = ["pip"]
-
- # Check if the user has requested us not to install setuptools
- if "--no-setuptools" in sys.argv or os.environ.get("PIP_NO_SETUPTOOLS"):
- args = [x for x in sys.argv[1:] if x != "--no-setuptools"]
- else:
- args = sys.argv[1:]
-
- # We want to see if setuptools is available before attempting to
- # install it
- try:
- import setuptools # noqa
- except ImportError:
- packages += ["setuptools"]
-
- # Check if the user has requested us not to install wheel
- if "--no-wheel" in args or os.environ.get("PIP_NO_WHEEL"):
- args = [x for x in args if x != "--no-wheel"]
- else:
- # We want to see if wheel is available before attempting to install it.
- try:
- import wheel # noqa
- except ImportError:
- args += ["wheel"]
-
- delete_tmpdir = False
- try:
- # Create a temporary directory to act as a working directory if we were
- # not given one.
- if tmpdir is None:
- tmpdir = tempfile.mkdtemp()
- delete_tmpdir = True
-
- # We need to extract the SSL certificates from requests so that they
- # can be passed to --cert
- cert_path = os.path.join(tmpdir, "cacert.pem")
- with open(cert_path, "wb") as cert:
- cert.write(pkgutil.get_data("pip._vendor.requests", "cacert.pem"))
-
- # Execute the included pip and use it to install the latest pip and
- # setuptools from PyPI
- sys.exit(pip.main(["install", "--upgrade"] + packages + args))
- finally:
- # Remove our temporary directory
- if delete_tmpdir and tmpdir:
- shutil.rmtree(tmpdir, ignore_errors=True)
-
-
-def main():
- tmpdir = None
- try:
- # Create a temporary working directory
- tmpdir = tempfile.mkdtemp()
-
- # Unpack the zipfile into the temporary directory
- pip_zip = os.path.join(tmpdir, "pip.zip")
- with open(pip_zip, "wb") as fp:
- fp.write(b85decode(DATA.replace(b"\n", b"")))
-
- # Add the zipfile to sys.path so that we can import it
- sys.path.insert(0, pip_zip)
-
- # Run the bootstrap
- bootstrap(tmpdir=tmpdir)
- finally:
- # Clean up our temporary working directory
- if tmpdir:
- shutil.rmtree(tmpdir, ignore_errors=True)
-
-
-DATA = b"""
-[... base85-encoded zip payload elided: the remainder of this deleted file is thousands of lines of packed binary data with no human-readable content ...]
-T1;%QOe2dsLn0iIi1>Rla)vJVWsEPR;bJR9VA6rofvpAesiDZUCl)=zzew_0cRuR(T2^9Y33ytxO#)V{(JApQ=tI~^xCN_(-oyM3!fP41Kx!jh2hU%Bj@^X$^o6^f1K>w5|opeV@`1y$le5BO7#q%cAy9chRaUTM&$h2e8BWxsP8@L$M6hSzQf2
-GeQ>l>W9BvTzoqXiMM*%vgLMmUpe(?SOhPb4>VUrhc%LXoE2D4*Vv5-j`ML=AIl(|;R>!N;|<~E^Q`y
-GbDmiTW^xP~QMKDZ$5Brf^{Z=iIHxBJ$Doi0`d`1yaE%Zd`WB7vOdL$c2;Y@0DlVWHfT0XVHSLg~NF_
-94#zSo@)U~Rb{WAh@t&>8$UgTC=Cp!gX$3KWA$YjzZ)#Z1WB--nU96w2eTR;t&&xlS_8q*)?4hvaxPDS)4*?vvWytyTKEjA>1mzw}#n95~Qm$`OI!K%&N13JPt=-7#Cu=qi#
-2&Pg&7V+9&lpL%yTQmXGBzMH7n*{eb#nD7uL2QyrIc0q7v>x*JMY+kex9P+yt?U-0plYC_-)ssX2vJEL0
-C{HX#oOZ#%y#C{?^tp)yA66a@?B_w8E>@%hr0RBd!M$Ii4hRk2C$XIGWWc$$z*q^*4kJsev8-EMfWvx
-JGs?_d@uzBCE3oHf|OpBvM9{`t{VQbv!ZUD=}3Y*mz
-%o(`987ZqVtLuA8%nUDN^3p3~c*=D*CBiyn&?ev1tH*Z$~_W+e{Sae$S>bfmLXnu`s8r0QdrZ&SBdAP
-l9idr4b<8;thZ$lrkltx
-X&tepk6=D^Z-`ski;o4pEUGp6+P6buF#jU_}PeQ81;4J)e}=j7gf>Mm*9j@^U`cZpc%z%0VikV|s?>E
-J#`w^vqTC;E~S-Nf7uD=MV7GTD;jfccxjf$lFv;iEwVS9lz7ERJU`KelwY2tafjc{7%s7@j9Z$@)Q2Q
-Mkk0Vb`#}8DcRLKG%53aPZf7t01u2;AqQ_ea+#>!