Use cookiecutter #1 (Open)

Wants to merge 42 commits into base branch master.
42 commits
fda0e70
Regenerate using ionelmc/cookiecutter-pylibrary.
dHannasch Sep 14, 2019
36ace9a
Revert "Regenerate using ionelmc/cookiecutter-pylibrary."
dHannasch Sep 14, 2019
48fb468
Regenerate using ionelmc/cookiecutter-pylibrary.
dHannasch Sep 14, 2019
76fcf27
move prednet into src/
dHannasch Sep 14, 2019
11430e3
fix footnote
dHannasch Sep 14, 2019
7411627
fix here
dHannasch Sep 14, 2019
5cb12da
fix footnote
dHannasch Sep 14, 2019
bab143f
don't try badges yet
dHannasch Sep 14, 2019
55e15c7
restore end
dHannasch Sep 14, 2019
7230e0c
fix footnote
dHannasch Sep 14, 2019
67c6984
fix long_description for tox -e check
dHannasch Sep 15, 2019
467eadd
require matplotlib
dHannasch Sep 15, 2019
e5ef651
import from prednet
dHannasch Sep 15, 2019
cdbc142
import from prednet
dHannasch Sep 15, 2019
d7401cd
import from prednet
dHannasch Sep 15, 2019
435a1db
replace scipy.misc.imresize with PIL.Image.fromarray(im).resize
dHannasch Sep 15, 2019
3684972
Use keras.backend.backend() instead of keras.backend._BACKEND.
dHannasch Sep 15, 2019
a7477a6
move kitti to tests
dHannasch Sep 17, 2019
2811162
move download scripts
dHannasch Sep 18, 2019
b388b66
LanaSina's kitti_hkl
dHannasch Sep 18, 2019
1f6faaa
what you should see
dHannasch Sep 18, 2019
9ea55d2
move environment.yml to top level
dHannasch Sep 18, 2019
04c4059
warnings
dHannasch Sep 18, 2019
3b1662d
use kitti_hkl
dHannasch Sep 23, 2019
c2059b7
Merge branch 'use-cookiecutter' of https://github.com/dHannasch/predn…
dHannasch Sep 23, 2019
a05abad
pytest passes with kitti moved to separate directory
dHannasch Sep 23, 2019
1fd7173
include files in manifest
dHannasch Sep 23, 2019
d563a97
require pytest
dHannasch Sep 23, 2019
5776691
Regenerated.
dHannasch Oct 21, 2019
d122b3c
allow mirror to gitlab
dHannasch Oct 21, 2019
f629f0c
image: dahanna/python:3.7-seaborn-alpine
dHannasch Oct 21, 2019
70b4e40
image: ufoym/deepo
dHannasch Oct 21, 2019
de07f8b
Disable flake8 and isort because don't want to mess with lots of thin…
dHannasch Oct 21, 2019
0483a1d
fix AttributeError: module 'enum' has no attribute 'IntFlag'
dHannasch Oct 21, 2019
9591031
uninstall enum34
dHannasch Oct 21, 2019
5b03af9
install tox
dHannasch Oct 21, 2019
1da1789
restrict python versions and use python -m pytest
dHannasch Oct 21, 2019
9f4e01c
Use a hacked version of sphinx until we have some kind of fix for sph…
dHannasch Oct 22, 2019
9265dc4
Disable testing on PyPy.
dHannasch Oct 22, 2019
32c940e
Try sphinx-doc/sphinx#6756.
dHannasch Oct 23, 2019
6f6ba06
Sphinx automodule submodules.
dHannasch Oct 25, 2019
7826390
Module must be importable for autodoc.
dHannasch Oct 25, 2019
48 changes: 48 additions & 0 deletions .appveyor.yml
@@ -0,0 +1,48 @@
version: '{branch}-{build}'
build: off
environment:
  global:
  matrix:
    - TOXENV: check
      TOXPYTHON: C:\Python36\python.exe
      PYTHON_HOME: C:\Python36
      PYTHON_VERSION: '3.6'
      PYTHON_ARCH: '32'
    - TOXENV: py36
      TOXPYTHON: C:\Python36\python.exe
      PYTHON_HOME: C:\Python36
      PYTHON_VERSION: '3.6'
      PYTHON_ARCH: '32'
    - TOXENV: py36
      TOXPYTHON: C:\Python36-x64\python.exe
      PYTHON_HOME: C:\Python36-x64
      PYTHON_VERSION: '3.6'
      PYTHON_ARCH: '64'
    - TOXENV: py37
      TOXPYTHON: C:\Python37\python.exe
      PYTHON_HOME: C:\Python37
      PYTHON_VERSION: '3.7'
      PYTHON_ARCH: '32'
    - TOXENV: py37
      TOXPYTHON: C:\Python37-x64\python.exe
      PYTHON_HOME: C:\Python37-x64
      PYTHON_VERSION: '3.7'
      PYTHON_ARCH: '64'
init:
  - ps: echo $env:TOXENV
  - ps: ls C:\Python*
install:
  - '%PYTHON_HOME%\python -mpip install --progress-bar=off tox -rci/requirements.txt'
  - '%PYTHON_HOME%\Scripts\virtualenv --version'
  - '%PYTHON_HOME%\Scripts\easy_install --version'
  - '%PYTHON_HOME%\Scripts\pip --version'
  - '%PYTHON_HOME%\Scripts\tox --version'
test_script:
  - cmd /E:ON /V:ON /C .\ci\appveyor-with-compiler.cmd %PYTHON_HOME%\Scripts\tox
on_failure:
  - ps: dir "env:"
  - ps: get-content .tox\*\log\*

### To enable remote debugging uncomment this (also, see: http://www.appveyor.com/docs/how-to/rdp-to-build-worker):
# on_finish:
# - ps: $blockRdp = $true; iex ((new-object net.webclient).DownloadString('https://raw.githubusercontent.com/appveyor/ci/master/scripts/enable-rdp.ps1'))
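
Note: each matrix entry above only selects a Python interpreter and then hands control to tox, so a failed AppVeyor job can usually be reproduced locally with the same tox environments. A rough sketch, assuming Python 3.6 or 3.7 and pip are available locally:

    # same environments AppVeyor runs, minus the Windows interpreter matrix
    pip install tox
    tox -e check   # packaging checks (e.g. check-manifest)
    tox -e py37    # run the test suite under Python 3.7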
20 changes: 20 additions & 0 deletions .bumpversion.cfg
@@ -0,0 +1,20 @@
[bumpversion]
current_version = 0.0.0
commit = True
tag = True

[bumpversion:file:setup.py]
search = 'fallback_version': '{current_version}'
replace = 'fallback_version': '{new_version}'

[bumpversion:file:README.rst]
search = v{current_version}.
replace = v{new_version}.

[bumpversion:file:docs/conf.py]
search = version = release = '{current_version}'
replace = version = release = '{new_version}'

[bumpversion:file:src/prednet/__init__.py]
search = __version__ = '{current_version}'
replace = __version__ = '{new_version}'
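
With this config, a release bump is a single command. A minimal sketch, assuming the bump2version tool is installed and the working tree is clean:

    # rewrite the version string in setup.py, README.rst, docs/conf.py and
    # src/prednet/__init__.py, then commit and tag (commit = True, tag = True)
    pip install bump2version
    bump2version patch   # 0.0.0 -> 0.0.1; 'minor' and 'major' work the same way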
70 changes: 70 additions & 0 deletions .cookiecutterrc
@@ -0,0 +1,70 @@
# This file exists so you can easily regenerate your project.
#
# `cookiepatcher` is a convenient shim around `cookiecutter`
# for regenerating projects (it will generate a .cookiecutterrc
# automatically for any template). To use it:
#
# pip install cookiepatcher
# cookiepatcher gh:ionelmc/cookiecutter-pylibrary project-path
#
# See:
# https://pypi.org/project/cookiepatcher
#
# Alternatively, you can run:
#
# cookiecutter --overwrite-if-exists --config-file=project-path/.cookiecutterrc gh:ionelmc/cookiecutter-pylibrary

default_context:

    _extensions: ['jinja2_time.TimeExtension']
    _template: 'cookiecutter-pylibrary/'
    allow_tests_inside_package: 'yes'
    appveyor: 'yes'
    c_extension_function: 'longest'
    c_extension_module: '_prednet'
    c_extension_optional: 'no'
    c_extension_support: 'no'
    c_extension_test_pypi: 'no'
    c_extension_test_pypi_username: 'coxlab'
    ci_https_proxy: 'no'
    codacy: 'no'
    codacy_projectid: '[Get ID from https://app.codacy.com/app/coxlab/prednet/settings]'
    codeclimate: 'no'
    codecov: 'no'
    command_line_interface: 'argparse'
    command_line_interface_bin_name: 'prednet'
    coveralls: 'no'
    coveralls_token: '[Required for Appveyor, take it from https://coveralls.io/github/coxlab/prednet]'
    distribution_name: 'prednet'
    email: '[email protected]'
    full_name: 'Bill Lotter'
    landscape: 'no'
    license: 'MIT license'
    linter: 'flake8'
    package_name: 'prednet'
    project_name: 'PredNet'
    project_short_description: 'Code and models accompanying Deep Predictive Coding Networks for Video Prediction and Unsupervised Learning by Bill Lotter, Gabriel Kreiman, and David Cox. The PredNet is a deep recurrent convolutional neural network that is inspired by the neuroscience concept of predictive coding (Rao and Ballard, 1999; Friston, 2005).'
    pypi_badge: 'no'
    pypi_disable_upload: 'no'
    release_date: '8 July 2016'
    repo_hosting: 'github.com'
    repo_hosting_domain: 'github.com'
    repo_name: 'prednet'
    repo_username: 'coxlab'
    requiresio: 'no'
    scrutinizer: 'no'
    setup_py_uses_setuptools_scm: 'yes'
    setup_py_uses_test_runner: 'yes'
    sphinx_docs: 'yes'
    sphinx_docs_hosting: 'https://coxlab.github.io/prednet/'
    sphinx_doctest: 'yes'
    sphinx_theme: 'sphinx-rtd-theme'
    test_matrix_configurator: 'no'
    test_matrix_separate_coverage: 'no'
    test_runner: 'pytest'
    travis: 'yes'
    travis_osx: 'no'
    version: '0.0.0'
    website: 'http://www.coxlab.org/'
    year_from: '2016'
    year_to: '2019'
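
As the header comment explains, this file exists so the scaffolding can be regenerated later. Roughly (a sketch, run from the repository root):

    pip install cookiepatcher
    cookiepatcher gh:ionelmc/cookiecutter-pylibrary .
    # or, without cookiepatcher:
    cookiecutter --overwrite-if-exists --config-file=.cookiecutterrc gh:ionelmc/cookiecutter-pylibrary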
16 changes: 16 additions & 0 deletions .coveragerc
@@ -0,0 +1,16 @@
[paths]
source =
    src
    */site-packages

[run]
branch = true
source =
    prednet
    tests
parallel = true

[report]
show_missing = true
precision = 2
omit = *migrations*
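
Because parallel = true, each run writes its own .coverage.* data file, and the [paths] section folds site-packages paths back onto src/ when the data files are combined. The manual equivalent looks roughly like this (a sketch, assuming coverage and pytest are installed; the tox environments normally drive this):

    coverage run -m pytest   # writes a .coverage.<host>.<pid>.<random> data file
    coverage combine         # merge parallel data files, remapping */site-packages to src via [paths]
    coverage report          # show_missing = true, precision = 2 from [report]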
13 changes: 13 additions & 0 deletions .editorconfig
@@ -0,0 +1,13 @@
# see http://editorconfig.org
root = true

[*]
end_of_line = lf
trim_trailing_whitespace = true
insert_final_newline = true
indent_style = space
indent_size = 4
charset = utf-8

[*.{bat,cmd,ps1}]
end_of_line = crlf
202 changes: 202 additions & 0 deletions .gitlab-ci.yml
@@ -0,0 +1,202 @@
image: ufoym/deepo

default:
  before_script:
    - right_after_pull_docker_image=$(date +%s)
    - echo $(whoami)
    - echo $USER

    # If we need to apk add openssh-client, then we will need HTTPS_PROXY set first.
    # This potentially leads to a problem if we need SSH to access the ETC_ENVIRONMENT_LOCATION.
    # The ETC_ENVIRONMENT_LOCATION is not generally intended for secret keys like the SSH_PRIVATE_KEY.
    - if [ -z ${ETC_ENVIRONMENT_LOCATION+ABC} ]; then echo "ETC_ENVIRONMENT_LOCATION is unset, so assuming you do not need environment variables set.";
      else
    # All of this will be skipped unless you set ETC_ENVIRONMENT_LOCATION in GitLab.
    # Strictly speaking, this serves the same function as .profile, being run before everything else.
    # You *could* put arbitrary shell commands in the file, but the intended purpose is
    # to save on manual work by allowing you to set only one GitLab variable that points
    # to more variables to set.
    # Special note if the environment file is used to set up a proxy with HTTPS_PROXY...
    # $ETC_ENVIRONMENT_LOCATION must be a location that we can access *before* setting up the proxy variables.
    - echo $ETC_ENVIRONMENT_LOCATION
    # We do not want the script to hang waiting for a password if the private key is rejected.
    - mkdir --parents ~/.ssh
    - echo "PasswordAuthentication=no" >> ~/.ssh/config
    - echo $SSH_PRIVATE_KEY > SSH.PRIVATE.KEY # If SSH_PRIVATE_KEY is unset, this will just be empty.
    - wget $ETC_ENVIRONMENT_LOCATION --output-document environment.sh --no-clobber || scp -i SSH.PRIVATE.KEY $ETC_ENVIRONMENT_LOCATION environment.sh
    - rm SSH.PRIVATE.KEY
    - cat environment.sh
    - set -o allexport
    - source environment.sh
    - set +o allexport
    - fi

    - if [ -z ${SSH_PRIVATE_KEY+ABC} ]; then echo "SSH_PRIVATE_KEY is unset, so assuming you do not need SSH set up.";
      else
    # All of this will be skipped unless you set SSH_PRIVATE_KEY as a variable
    - if [ ${#SSH_PRIVATE_KEY} -le 5 ]; then echo "SSH_PRIVATE_KEY looks far too short, something is wrong"; fi
    - apk add openssh-client || apt-get install --assume-yes openssh-client
    - echo "adding openssh-client took $(( $(date +%s) - right_after_pull_docker_image)) seconds"

    # ssh-agent -s starts the ssh-agent and then outputs shell commands to run.
    - eval $(ssh-agent -s)

    ##
    ## Add the SSH key stored in SSH_PRIVATE_KEY variable to the agent store.
    ## We're using tr to fix line endings which makes ed25519 keys work
    ## without extra base64 encoding.
    ## We use -d because the version of tr on alpine does not recognize --delete.
    ## https://gitlab.com/gitlab-examples/ssh-private-key/issues/1#note_48526556
    ##
    - echo "$SSH_PRIVATE_KEY" | tr -d '\r' | ssh-add -

    ##
    ## Sometimes we may want to install directly from a git repository.
    ## Using up-to-the-minute updates of dependencies in our own tests alerts
    ## us if something breaks with the latest version of a dependency, even if
    ## that dependency has not made a new release yet.
    ## In order to pip install directly from git repositories,
    ## we need to whitelist the public keys of the git servers.
    ## You may want to add more lines for the domains of any other git servers
    ## you want to install dependencies from (which may or may not include the
    ## server that hosts your own repo).
    ## Similarly, if you want to push to a secondary repo as part of your build
    ## (as how cookiecutter-pylibrary builds examples and
    ## pushes to python-nameless), ssh will need to be allowed to reach that
    ## server.
    ## https://docs.travis-ci.com/user/ssh-known-hosts/
    ## https://discuss.circleci.com/t/add-known-hosts-on-startup-via-config-yml-configuration/12022/2
    ## Unfortunately, there seems to be no way to use ssh-keyscan on a server
    ## that you can only reach through a proxy. Thus, a simple
    ## ssh-keyscan -t rsa github.com gitlab.com >> ~/.ssh/known_hosts
    ## will fail. As a workaround, I just grabbed their public keys now and
    ## included them. These might go stale eventually, I'm not sure.
    ##
    - mkdir --parents ~/.ssh
    - echo "# github.com:22 SSH-2.0-babeld-f345ed5d\n" >> ~/.ssh/known_hosts
    - echo "github.com ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEAq2A7hRGmdnm9tUDbO9IDSwBK6TbQa+PXYPCPy6rbTrTtw7PHkccKrpp0yVhp5HdEIcKr6pLlVDBfOLX9QUsyCOV0wzfjIJNlGEYsdlLJizHhbn2mUjvSAHQqZETYP81eFzLQNnPHt4EVVUh7VfDESU84KezmD5QlWpXLmvU31/yMf+Se8xhHTvKSCZIFImWwoG6mbUoWf9nzpIoaSjB+weqqUUmpaaasXVal72J+UX2B+2RPW3RcT0eOzQgqlJL3RKrTJvdsjE3JEAvGq3lGHSZXy28G3skua2SmVi/w4yCE6gbODqnTWlg7+wC604ydGXA8VJiS5ap43JXiUFFAaQ==\n" >> ~/.ssh/known_hosts
    - echo "# gitlab.com:22 SSH-2.0-OpenSSH_7.2p2 Ubuntu-4ubuntu2.8\n" >> ~/.ssh/known_hosts
    - echo "gitlab.com ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCsj2bNKTBSpIYDEGk9KxsGh3mySTRgMtXL583qmBpzeQ+jqCMRgBqB98u3z++J1sKlXHWfM9dyhSevkMwSbhoR8XIq/U0tCNyokEi/ueaBMCvbcTHhO7FcwzY92WK4Yt0aGROY5qX2UKSeOvuP4D6TPqKF1onrSzH9bx9XUf2lEdWT/ia1NEKjunUqu1xOB/StKDHMoX4/OKyIzuS0q/T1zOATthvasJFoPrAjkohTyaDUz2LN5JoH839hViyEG82yB+MjcFV5MU3N1l1QL3cVUCh93xSaua1N85qivl+siMkPGbO5xR/En4iEY6K2XPASUEMaieWVNTRCtJ4S8H+9\n" >> ~/.ssh/known_hosts
    - fi

    # When we get the environment file, it might have some servers for us to whitelist.
    # Alternatively, maybe there was no ETC_ENVIRONMENT_LOCATION
    # and SERVERS_TO_WHITELIST_FOR_SSH is just manually set as a GitLab variable.
    # If SSH_PRIVATE_KEY is not set, then we will silently ignore SERVERS_TO_WHITELIST_FOR_SSH,
    # since without a key of some kind we cannot use SSH anyway.
    # This allows us to share around a common ETC_ENVIRONMENT_LOCATION that includes SERVERS_TO_WHITELIST_FOR_SSH,
    # even though only some people actually use SSH for anything.
    - if [ -z ${SERVERS_TO_WHITELIST_FOR_SSH+ABC} ] || [ -z ${SSH_PRIVATE_KEY+ABC} ]; then echo "SERVERS_TO_WHITELIST_FOR_SSH and SSH_PRIVATE_KEY are not both set, so assuming you do not need any servers whitelisted for SSH.";
      else
    - echo $SERVERS_TO_WHITELIST_FOR_SSH
    - mkdir --parents ~/.ssh
    - ssh-keyscan -t rsa $SERVERS_TO_WHITELIST_FOR_SSH >> ~/.ssh/known_hosts
    - fi

    - pip install --upgrade pip
    - if [ -z ${PROXY_CA_PEM+ABC} ]; then echo "PROXY_CA_PEM is unset, so assuming you do not need a merged CA certificate set up.";
      else
    # All of this will be skipped unless you set PROXY_CA_PEM in GitLab.
    # You will usually want to cat your.pem | xclip and paste it in as a File on GitLab.
    # See the KUBE_CA_PEM example at https://docs.gitlab.com/ee/ci/variables/README.html#variable-types
    - right_before_pull_cert=$(date +%s)
    - if [ ${#PROXY_CA_PEM} -ge 1024 ]; then
    - echo "The PROXY_CA_PEM filename looks far too long, did you set it as a Variable instead of a File?"
    # If it's the full certificate rather than a filename, write it to a file and save the file name.
    - echo "$PROXY_CA_PEM" > tmp-proxy-ca.pem
    # The quotes are very important here; echo $PROXY_CA_PEM will destroy the
    # newlines, and requests will (silently!) fail to parse the certificate,
    # leading to SSLError SSLCertVerificationError 'certificate verify failed self signed certificate in certificate chain (_ssl.c:1076)'
    - PROXY_CA_PEM=tmp-proxy-ca.pem
      ; fi
    # If some of the links in your documentation require a special PEM to verify,
    # then sphinx -b linkcheck will fail without that PEM.
    # But setting REQUESTS_CA_BUNDLE to that PEM will cause other links to fail,
    # because the runner will only accept that PEM, not the defaults.
    # Therefore you will usually want to bundle all certificates together with
    # cat `python -c "import requests; print(requests.certs.where())"` ~/your.pem > ~/bundled.pem
    # pip uses requests, but not the normal requests.
    # pip uses a vendored version of requests, so that pip will still work if anything goes wrong with your requests installation.
    # We find where that vendored version of requests keeps its certs and merge in the cert from PROXY_CA_PEM.
    # On some systems, we might need to try the import twice, and the first time, it will fail with an AttributeError.
    # Therefore we need a block to suppress the AttributeError, which requires a colon.
    # But that causes parsing of .gitlab-ci.yml to fail with "before_script config should be an array of strings",
    # so we need to wrap the entire line in ''.
    # https://gitlab.com/gitlab-org/gitlab-foss/merge_requests/5481
    - 'echo -e "import contextlib\nwith contextlib.suppress(AttributeError): import pip._vendor.requests\nfrom pip._vendor.requests.certs import where\nprint(where())" | python'
    - 'cat `echo -e "import contextlib\nwith contextlib.suppress(AttributeError): import pip._vendor.requests\nfrom pip._vendor.requests.certs import where\nprint(where())" | python` $PROXY_CA_PEM > bundled.pem'
    - export REQUESTS_CA_BUNDLE=bundled.pem
    - echo "Merging the certificate bundle took $(( $(date +%s) - right_before_pull_cert)) seconds total"
    - fi

    ##
    ## With all our proxy variables and certificates in place, we should now be
    ## able to install from repositories, and optionally push to repositories.
    ## Optionally, if you will be making any git commits, set the user name and
    ## email.
    ##
    #- git config --global user.email "[email protected]"
    #- git config --global user.name "Bill Lotter"

    # With --sitepackages, we can save time by installing once
    # for both regular tests and documentation checks.
    - python -c "import enum; print(enum.__file__)"
    - pip uninstall --yes enum34
    - pip install .

# In general we want to use tox -e docs, but GitLab.com will not deploy Pages
# if the pages build fails.
# The pages build will fail if you use tox -e docs with a link to your GitLab
# Pages documentation that is not yet deployed, because tox -e docs includes
# sphinx-build -b linkcheck. So the pages will never get deployed...
# That's why we deploy pages with no checks here.
# The tests will still run linkcheck on the documentation.
# Since "It may take up to 30 minutes before the site is available after the
# first deployment." (per GitLab), the tests will still fail for a little
# while.
pages:
  tags:
    - docker
  stage: build
  # On GitLab, the stages are build->test->deploy.
  # If the test stage fails, the deploy stage is skipped.
  script:
    - pip install -r docs/requirements.txt
    - sphinx-build -E -b html docs dist/docs
    - mv dist/docs/ public/
  artifacts:
    paths:
      - public
  only:
    - master

test:
  tags:
    - docker
  stage: test
  script:
    # apk add any needed packages not included in the image.
    # check-manifest, used in tox -e check, requires git,
    # so we need to either use an image that includes git or
    # apk add git here.
    # If using an image that does not include tox, we will
    # need to pip install tox here.
    - pip install tox
    - git --version
    - python --version
    - python2 --version || echo "python2 is not installed."
    - virtualenv --version
    - pip --version
    - tox --version
    - uname --all
    - lsb_release --all || echo "lsb_release is not supported on this host."
    - start_tox=$(date +%s)
    # When testing locally, we might not want to set sitepackages=true,
    # because the local machine might have all kinds of weird things in the
    # environment. But for continuous integration, we do want sitepackages=true,
    # because it allows us to use a Docker image with some packages already
    # installed to accelerate testing.
    - tox --sitepackages
    - echo "tox tests took $(( $(date +%s) - start_tox)) seconds"
    - echo "Everything after pulling the Docker image took $(( $(date +%s) - right_after_pull_docker_image)) seconds total"
