The MDM holds the metadata of the data packages available in our Research Data Center (FDZ). It enables researchers to browse our data packages before signing a contract for using the data.
Please check out the development branch before starting to code, and create a new branch named after your username followed by the backlog item's issue number you will be working on:
git checkout development
git checkout -b rreitmann/issue1234
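As an illustration, the naming convention above can be checked with a small hypothetical helper (not part of the project; the allowed username characters are an assumption):

```python
import re

# Branches are named "<username>/issue<number>", e.g. "rreitmann/issue1234".
# The username character set ([a-z0-9-]) is an assumption for illustration.
BRANCH_PATTERN = re.compile(r"[a-z0-9-]+/issue\d+")

def is_valid_branch(name: str) -> bool:
    """Return True if the branch name follows the username/issue<number> convention."""
    return BRANCH_PATTERN.fullmatch(name) is not None
```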
Before you can build this project, you must install and configure the following dependencies on your machine:
- Java: Install the Java 15 SDK on your system. On Ubuntu you should use SDKMAN!:
sdk install java 15.0.2.hs-adpt
- Maven: Install Maven 3.6.1 or above on your system. On Ubuntu you should use SDKMAN!:
sdk install maven
- Node.js: Node.js 16 and npm (bundled with Node.js) are required as well. On Ubuntu you should install Node.js using NVM:
nvm install v16
On Windows, patch.exe has to exist in the PATH. It is distributed as part of Git Bash, or can be downloaded manually from GnuWin32.
Make sure that you have read-write access to the data directory (in your project directory) for Elasticsearch and MongoDB. Mac users in particular need to run the following command to create all data directories before bringing up the containers for the first time:
mkdir -p data/elasticsearch/data data/mongodb/db data/mongodb/logs
Otherwise your Docker Host will attempt to change permissions on the directories and fail.
Use docker-compose up to create all containers initially. MongoDB and Elasticsearch will be listening on their default ports. MailDev will show all locally sent email on port 8081, and the identity provider can be set up on port 8082. Any time after that, use either docker-compose up or docker-compose start.
In case Elasticsearch does not start successfully, you might need to increase its memory limit mem_limit: 512m, e.g. to 1024m (this change requires removing and re-building the container).
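For example, the Elasticsearch service entry in docker-compose.yml would change roughly like this (a sketch; the exact service definition in your checkout may differ):

```yaml
services:
  elasticsearch:
    # raised from 512m so the container has enough memory to start
    mem_limit: 1024m
```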
You can get a MongoDB dump and restore it locally:
$ wget https://metadatamanagement-public.s3.eu-central-1.amazonaws.com/20220926_metadatamanagement_e2e.zip
$ unzip 20220926_metadatamanagement_e2e.zip
$ mv dump/metadatamanagement data/mongodb/db/
$ docker exec -it mongodb bash
mongo$ cd /data/mongodb/db
mongo$ mongorestore ./metadatamanagement --db=metadatamanagement
mongo$ exit
$ rm -r dump
You will need to set up your ~/.m2/settings.xml so that Maven can download a dependency from GitHub:
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
http://maven.apache.org/xsd/settings-1.0.0.xsd">
<servers>
<server>
<id>github</id>
<username>${GITHUB_USERNAME}</username>
<password>${GITHUB_TOKEN}</password>
</server>
</servers>
</settings>
Run Maven first to start the Spring backend and to make sure the frontend Angular constants module has been generated by the Maven plugin:
mvn clean install -f maven-plugin/pom.xml
mvn spring-boot:run
Then run
npm --prefix mdm-frontend start
to start the Angular frontend.
In order for all external services to work on your local machine, you need to set the following variables in application-local.yml:
dara:
endpoint: "https://labs.da-ra.de/dara/"
username: {see s3://metadatamanagement-private/sensitive_variables.tf}
password: {see s3://metadatamanagement-private/sensitive_variables.tf}
If you run the backend on your machine for the first time, or you have restored a MongoDB dump, you need to set up/reindex the Elasticsearch indices. To do so, log in to the application as admin, go to Administration on the left, navigate to External Services, and click the red Reindex button for the Elasticsearch service. Reindexing can take up to one hour.
If you want to build a Docker image for the metadatamanagement server app, you can run
mvn deploy
This image can be run with all its dependent containers by
docker-compose -f docker-compose.yml -f docker-compose-app.yml up -d --build
Our CI pipeline will run some automatic checks and tests, and it will optimize the metadatamanagement client for the dev environment. To be sure you won't fail the build, you should run the following before pushing to GitHub:
mvn -Pdev clean verify
This will concatenate and minify CSS and JavaScript files using Grunt. It will also modify the index.html so it references these new files.
We test our project continuously with the Robot Framework. Test Developers can get further info here.
When an analysis package or data package is released with version >=1.0.0, the user can optionally post a message about the release on X (formerly Twitter).
To set this up, you need an X Developer Account (Free Access Level) and your project's API credentials: consumer key and consumer secret. Be aware that the current Free Access Level is limited to 50 tweets per 24 hours, 1,500 tweets per month, 1 environment, and 1 project.
Make your consumer key and consumer secret accessible to the application.yml of the current stage through sensitive_variables.tf, just like other highly sensitive data.
[application.yml]
...
tweet:
consumerkey: ${vcap.services.tweet.credentials.consumerkey}
consumersecret: ${vcap.services.tweet.credentials.consumersecret}
oauthtoken: ${vcap.services.tweet.credentials.oauthtoken}
oauthtokensecret: ${vcap.services.tweet.credentials.oauthtokensecret}
...
Create your oauthtoken and oauthtokensecret by following the three steps of the Postman Twitter examples: Twitter OAuth 1.0a flow test.
- Step oauth/request_token: Execute the request with your consumer key and consumer secret from your Developer Account. An OAUTH_TOKEN_FROM_STEP1 and OAUTH_TOKEN_SECRET_FROM_STEP1 will be returned.
- Step oauth/authorize: Visit https://api.twitter.com/oauth/authorize?oauth_token={OAUTH_TOKEN_FROM_STEP1}&oauth_token_secret={OAUTH_TOKEN_SECRET_FROM_STEP1}&oauth_callback_confirmed=true with OAUTH_TOKEN_FROM_STEP1 and OAUTH_TOKEN_SECRET_FROM_STEP1 from the first step, and authenticate your app. After being redirected to X, open the network tab and copy the value of oauth_token as OAUTH_TOKEN_FROM_STEP2 and the value of oauth_verifier as OAUTH_VERIFIER_FROM_STEP2 from this GET request:
GET 'http://twitter.com/?oauth_token={OAUTH_TOKEN_FROM_STEP2}&oauth_verifier={OAUTH_VERIFIER_FROM_STEP2}'
- Step oauth/access_token: Insert the OAUTH_TOKEN_FROM_STEP2 and OAUTH_VERIFIER_FROM_STEP2 from step 2 into the third request (if you are using Postman like the linked Twitter example, select No Auth instead of OAuth 1.0).
POST 'https://api.twitter.com/oauth/access_token?oauth_token={OAUTH_TOKEN_FROM_STEP2}&oauth_verifier={OAUTH_VERIFIER_FROM_STEP2}'
Add the returned values for oauth_token and oauth_token_secret from step 3 to the sensitive_variables.tf.
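The authorize URL used in step 2 can also be assembled programmatically; a minimal Python sketch (the function and constant names are illustrative, not project code):

```python
from urllib.parse import urlencode

AUTHORIZE_ENDPOINT = "https://api.twitter.com/oauth/authorize"

def build_authorize_url(oauth_token: str, oauth_token_secret: str) -> str:
    """Build the step-2 authorize URL from the step-1 request token."""
    params = {
        "oauth_token": oauth_token,
        "oauth_token_secret": oauth_token_secret,
        "oauth_callback_confirmed": "true",
    }
    return f"{AUTHORIZE_ENDPOINT}?{urlencode(params)}"
```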
For further details also see Authentication OAuth FAQ.
Cross-browser Testing Platform and Open Source ❤️ Provided by Sauce Labs
Continuous Integration Platform provided by Github Actions