diff --git a/extensions/data-transfer/portability-data-transfer-generic/README.md b/extensions/data-transfer/portability-data-transfer-generic/README.md
new file mode 100644
index 000000000..31a09fcd1
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/README.md
@@ -0,0 +1,736 @@
+# Generic Importers
+
+## Overview
+
+The Data Transfer Project transfer worker manages the transfer of data between data exporter and data importer services. \
+The transfer of some set of user data is encapsulated by a transfer job. Each job consists of many data items from a particular data vertical (CALENDAR, PHOTOS, etc.). See the [Schemas](#schemas) section for the list of data verticals supported by Generic Importers.
+
+Transfer jobs are initiated by a user, generally on the platform owned by the exporter service. When processing a transfer job the transfer worker pulls data items from the exporter service and pushes those data items to the importer service. \
+The interfaces for pulling data from exporters and pushing data to importers are implemented in DTP by `Exporter` modules and `Importer` modules respectively. This allows the DTP worker to transfer data between services agnostic of protocol and encoding, and without assumptions about the storage or organisation of data at the destination.
+
+The typical way to integrate with DTP as a data importer is to implement an `Importer` module through an extension that pushes data to a new or existing API. This requires writing and maintaining a Java extension, but gives complete control over how the module interacts with your API. \
+As an alternative path for integration, the Generic Importers extension provides an `Importer` module implementation that can be configured to call any endpoint conforming to the API described in the [Generic Importer API](#generic-importer-api) section. This allows developers to focus on languages and frameworks they are familiar with, at the expense of being able to customise the behaviour of the importer or reusing existing APIs.
+
+## Generic Importer API
+
+The Generic Importer API uses HTTP as the application layer protocol and encodes most request data as `application/json`. Data items are sent one per request.
+
+The service should accept POST requests on a given base URL, with a kebab-case sub-path specific to each data vertical (`https://example.com/import/blobs`, `https://example.com/import/social-posts`). \
+See the [HTTP Interface](#http-interface) section for details on the API.
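
The mapping from vertical name to sub-path is a simple lower-casing of the vertical identifier. A small illustrative sketch (a hypothetical helper, not part of DTP):

```python
def endpoint_for_vertical(base_url, vertical):
    """Derive the kebab-case endpoint for a data vertical, e.g. SOCIAL-POSTS -> social-posts."""
    return base_url.rstrip("/") + "/" + vertical.lower()

print(endpoint_for_vertical("https://example.com/import", "SOCIAL-POSTS"))
# https://example.com/import/social-posts
```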
+
+When a transfer job is created, credentials to call your service endpoint are obtained using OAuth 2.0 (see [Authentication and Authorization](#authentication-and-authorization) below).
+Your service should therefore support OAuth 2.0 or integrate with another service which provides these features.
+
+### DTP Behaviour
+
+This section contains information on how the DTP platform behaves, and in particular how the Generic `Importer` module works. This should help to guide decisions in how you implement your importer service.
+
+#### Job Life-cycle
+
+Transfer jobs are created by users on the platform hosting the DTP transfer worker, which is generally owned by the exporter. \
+During job creation the platform needs to obtain credentials to call the exporting service and importing service on behalf of the user, which for importer services using Generic Importers means using the OAuth 2.0 Authorization Code flow discussed in [Authentication and Authorization](#authentication-and-authorization). \
+Once a DTP transfer worker begins working on the transfer job it will begin sending POST requests to your endpoint containing user data items to associate with their account, authenticated and authorized using the access token obtained at job creation. \
+The job continues until all items have been sent.
+
+> [!NOTE]
+> The Generic Importers extension currently provides no mechanism for informing an importer service of the beginning or end of a transfer job.
+
+#### Ordering
+
+Data items are sent to your importer service in the same order that they are exported from the exporter service. This makes the order of data items transferred in a job an implementation detail of the `Exporter` module and corresponding exporter service; however, there is a helpful convention that exporters tend to follow: data items that are containers of other items, such as photo albums, are conventionally exported before the data items contained inside them (or at the latest in the same page). \
+The Generic `Importer` module also follows this convention; exporters export data in pages (`ContainerResource`s, for those reading the code), and the Generic `Importer` module will import the albums in a page before the photos in that page.
+
+##### Failures on dependent items
+
+In some circumstances the transfer of a data item that is a container of other data items may fail. \
+The DTP transfer worker makes a best-effort attempt to transfer all data and does not consider the dependencies between data items; importer services may have their own ways of recovering from such failures.
+For this reason, if a container data item fails to transfer, the data items contained within that container will still be transferred. It is the responsibility of the importer service to decide how to handle this case.
+
+#### Concurrency
+
+To keep the API simple, data items in a transfer job are sent by the `Importer` module to your HTTP service sequentially and as one data item per request. \
+Your API may receive multiple concurrent requests, even for the same user, if there are multiple concurrent transfer jobs to your service.
+
+#### Data Rates
+
+The limiting factors for data import rate are the speeds of the export service and your HTTP service. Since export services batch their exports to the DTP transfer worker and are typically hosted in or near the same data-center as the transfer worker, this practically means data will be sent as fast as your HTTP service can accept it. \
+If your service cannot accept data at the rate sent by the DTP transfer worker some strategies for managing this include throttling and rate-limiting.
+
+Your service can throttle the DTP transfer worker by setting a maximum request rate per unit of time and queueing requests that exceed that quota until the next time period. Since the Generic `Importer` module sends each data item sequentially, the next data item will not be sent until your service has completed an HTTP response for the current item, making this an effective strategy for slowing the transfer worker down. \
+Note that throttling requests for too long may cause the HTTP request to time out.
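
The fixed-window throttling described above can be sketched as follows. This is an illustrative, hypothetical helper (not part of DTP); the clock is injected so the behaviour is deterministic, and a real service would sleep for the returned delay before completing the HTTP response:

```python
class FixedWindowThrottle:
    """Report a delay for requests that exceed a per-window quota."""

    def __init__(self, max_requests, window_seconds, clock):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self.clock = clock  # injected for testability
        self.window_start = clock()
        self.count = 0

    def delay_for_next_request(self):
        """Return seconds to wait before handling the next request."""
        now = self.clock()
        if now - self.window_start >= self.window_seconds:
            # Start a new window
            self.window_start = now
            self.count = 0
        self.count += 1
        if self.count <= self.max_requests:
            return 0.0
        # Defer the request until the next window opens
        return self.window_seconds - (now - self.window_start)
```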
+
+Alternatively or additionally, your service can rate limit the DTP worker by responding with a 429 Too Many Requests status code when the transfer worker is sending data too quickly. The request will be retried with a delay as described in the [Retries](#retries) section. \
+Note that the transfer worker will not retry sending a given data item indefinitely, so this may result in lost data.
+
+#### Retries
+
+If a request sent to your HTTP service fails it will usually be retried. The retry and back-off logic is based on a configuration set for the instance of the DTP transfer worker that processes the transfer job. The retry logic is therefore typically configured by the owners of each exporter service.
+
+The DTP transfer worker's default retry strategy is defined in `distributions/demo-server/src/main/resources/config/retry/default.yaml`. At the time of writing, the default strategy is an exponential back-off with a 1.5x multiplier, starting at 1000ms, with up to 5 retry attempts. \
+If the default retry behaviour is not sufficient for your service, please contact the owner of the exporter service to configure a more specific retry strategy for your service, or create an issue or pull request to make this configurable per importer service.
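
Under those defaults, the delay before each retry attempt works out as below (a sketch assuming the interval is multiplied by 1.5 after every attempt):

```python
def backoff_schedule(initial_delay_ms, multiplier, max_attempts):
    """Compute the delay (in ms) before each retry attempt of an exponential back-off."""
    delays = []
    delay = float(initial_delay_ms)
    for _ in range(max_attempts):
        delays.append(delay)
        delay *= multiplier
    return delays

# Default strategy at the time of writing: 1000ms initial delay, 1.5x multiplier, 5 attempts
print(backoff_schedule(1000, 1.5, 5))  # [1000.0, 1500.0, 2250.0, 3375.0, 5062.5]
```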
+
+### HTTP Interface
+
+The Generic `Importer` module maps each data item in a job to one HTTP request made to your HTTP service. \
+Data will be POSTed to your endpoint with a `Content-Type` of either `application/json` for basic data-types, or `multipart/related` for file-based data-types. See below for how to interpret each of these. \
+Your endpoint should return a 2xx status code if the resource has been created, a 4xx code for errors caused by the importer, or a 5xx code for service errors. See [Endpoint Errors](#endpoint-errors) below for details on returning errors.
+
+#### Basic data types
+
+For basic data-types the Importer will send a POST request with a `Content-Type: application/json` header. The body of the POST request will be a UTF-8 encoded JSON payload conforming to the relevant data-type schema detailed in the [Schemas](#schemas) section.
+
+For example, below is a full request for a SOCIAL-POSTS item (JSON formatted for readability).
+
+```http
+POST /import/social-posts HTTP/1.1
+Content-Type: application/json
+Content-Length: 660
+Host: localhost:8080
+Connection: Keep-Alive
+Accept-Encoding: gzip
+User-Agent: okhttp/3.9.1
+Authorization: Bearer accessToken
+
+{
+ "@type": "GenericPayload",
+ "schemaSource": ".../SocialPostsSerializer.java",
+ "apiVersion": "0.1.0",
+ "payload": {
+ "@type": "SocialActivityData",
+ "metadata": {
+ "@type": "SocialActivityMetadata",
+ "actor": {
+ "@type": "SocialActivityActor",
+ "id": "321",
+ "name": "Steve",
+ "url": null
+ }
+ },
+ "activity": {
+ "@type": "SocialActivityModel",
+ "id": "456",
+ "published": 1731604863.845677,
+ "type": "NOTE",
+ "attachments": [
+ {
+ "@type": "SocialActivityAttachment",
+ "type": "IMAGE",
+ "url": "foo.com",
+ "name": "Foo",
+ "content": null
+ }
+ ],
+ "location": {
+ "@type": "SocialActivityLocation",
+ "name": "foo",
+ "longitude": 10.0,
+ "latitude": 10.0
+ },
+ "title": "Hello world!",
+ "content": "Hi there",
+ "url": null
+ }
+ }
+}
+```
+
+#### File-based data types
+
+To support large files as part of MEDIA, PHOTOS, VIDEOS and BLOBS imports, the request for these data-types is a `multipart/related` request with a `boundary`. \
+The first part will be an `application/json` payload containing metadata related to the file being imported. \
+The second part will have its `Content-Type` set to the MIME type of the file, and its body will be the bytes of the file, delimited by the `boundary` separators.
+
+```http
+POST /import/blobs HTTP/1.1
+Content-Type: multipart/related; boundary=1581c5eb-05d0-42fe-bfb3-472151f366cd
+Content-Length: 524288384
+Host: localhost:8080
+Connection: Keep-Alive
+Accept-Encoding: gzip
+User-Agent: okhttp/3.9.1
+Authorization: Bearer accessToken
+
+--1581c5eb-05d0-42fe-bfb3-472151f366cd
+Content-Type: application/json
+Content-Length: 236
+
+{"@type":"GenericPayload","schemaSource":".../BlobbySerializer.java","apiVersion": "0.1.0","payload":{"@type":"BlobbyFileData","folder":"/root/foo","document":{"name":"bar.mp4","dateModified":"2020-02-01","encodingFormat":"video/mp4"}}}
+--1581c5eb-05d0-42fe-bfb3-472151f366cd
+Content-Type: video/mp4
+Content-Length: 524288000
+
+...524288000 bytes follow
+--1581c5eb-05d0-42fe-bfb3-472151f366cd--
+```
+
+> [!NOTE]
+> Endpoints supporting file-based data-types may also receive basic data-types, for example the `/blobs` endpoint should
+> support receiving folders encoded as `application/json` POST data.
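
A service therefore needs to dispatch on the request `Content-Type` and, for `multipart/related`, split out the metadata and file parts. A minimal sketch for illustration only (a production service should rely on its framework's JSON and multipart parsers instead):

```python
import json

def parse_import_request(content_type, body):
    """Parse an import request body into (metadata dict, file bytes or None)."""
    if content_type.startswith("application/json"):
        return json.loads(body), None
    if content_type.startswith("multipart/related"):
        boundary = content_type.split("boundary=")[1].strip('"')
        delimiter = b"--" + boundary.encode()
        # Parts sit between boundary delimiters; the final delimiter ends with "--"
        parts = body.split(delimiter)[1:-1]
        contents = []
        for part in parts:
            headers, _, content = part.lstrip(b"\r\n").partition(b"\r\n\r\n")
            contents.append(content.rstrip(b"\r\n"))
        metadata_part, file_part = contents
        return json.loads(metadata_part), file_part
    raise ValueError("unsupported Content-Type: %s" % content_type)
```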
+
+### Configuration
+
+To configure an importer, create a YAML configuration file in the `extensions/data-transfer/portability-data-transfer-generic/src/main/resources/config/` directory of this repository.
+
+```yaml
+# example.yaml
+serviceConfig:
+ serviceId: "Example"
+ endpoint: "https://example.com/dtp/"
+ verticals:
+ - vertical: SOCIAL-POSTS
+ - vertical: BLOBS
+ - vertical: MEDIA
+ - vertical: CALENDAR
+```
+
+## Schemas
+
+Below are the [JSON schemas](https://json-schema.org/specification) for each supported vertical.
+
+All POST requests containing data to be imported are wrapped in a top-level `GenericPayload` wrapper.
+
+```json
+{
+ "$schema": "https://json-schema.org/draft/2020-12/schema",
+ "type": "object",
+ "properties": {
+ "apiVersion": {
+ "type": "string"
+ },
+ "payload": {
+ "type": "object",
+ "description": "The inner payload, which contains data from one of the described data verticals"
+ },
+ "schemaSource": {
+ "type": "string",
+ "description": "The source file containing the schema definition"
+ },
+ "@type": {
+ "const": "GenericPayload"
+ }
+ },
+ "required": [
+ "@type",
+ "schemaSource",
+ "apiVersion",
+ "payload"
+ ]
+}
+```
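
A service can check the wrapper before dispatching on the inner payload. A minimal hand-rolled check (an illustrative sketch; a JSON Schema validation library would do this more thoroughly):

```python
def unwrap_generic_payload(envelope):
    """Validate the GenericPayload wrapper fields and return the inner payload."""
    required = ("@type", "schemaSource", "apiVersion", "payload")
    missing = [field for field in required if field not in envelope]
    if missing:
        raise ValueError("missing required fields: " + ", ".join(missing))
    if envelope["@type"] != "GenericPayload":
        raise ValueError("expected @type GenericPayload, got " + str(envelope["@type"]))
    return envelope["payload"]
```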
+
+### MEDIA
+
+Endpoint: `/media`
+
+The MEDIA vertical describes a user's photos and videos, and the albums that contain them.
+
+Albums are conventionally imported before the photos and videos contained in them.
+
+#### Basic Data Types
+
+The only basic data type exposed for MEDIA data is `Album`, which contains metadata about an album.
+
+```json
+{
+ "$schema" : "https://json-schema.org/draft/2020-12/schema",
+ "type": "object",
+ "properties": {
+ "description": {
+ "type": "string"
+ },
+ "id": {
+ "type": "string"
+ },
+ "name": {
+ "type": "string"
+ },
+ "@type": {
+ "const": "Album"
+ }
+ },
+ "required": [
+ "@type",
+ "id",
+ "name"
+ ]
+}
+```
+
+#### File-based Data Types
+
+The `Photo` and `Video` data types are sent as file-based data containing metadata described below in the JSON part of the payload.
+
+```json
+{
+ "$schema" : "https://json-schema.org/draft/2020-12/schema",
+ "$defs": {
+ "FavoriteInfo": {
+ "type": "object",
+ "properties": {
+ "favorite": {
+ "type": "boolean"
+ },
+ "lastUpdateTime": {
+ "type": "string",
+ "format": "date-time"
+ },
+ "@type": {
+ "const": "FavoriteInfo"
+ }
+ },
+ "required": [
+ "@type",
+ "favorite",
+ "lastUpdateTime"
+ ]
+ }
+ },
+ "anyOf": [
+ {
+ "type": "object",
+ "properties": {
+ "albumId": {
+ "type": "string"
+ },
+ "description": {
+ "type": "string"
+ },
+ "favoriteInfo": {
+ "$ref": "#/$defs/FavoriteInfo"
+ },
+ "name": {
+ "type": "string"
+ },
+ "uploadedTime": {
+ "type": "string",
+ "format": "date-time"
+ },
+ "@type": {
+ "const": "Video"
+ }
+ },
+ "required": [
+ "@type",
+ "albumId",
+ "name"
+ ]
+ },
+ {
+ "type": "object",
+ "properties": {
+ "albumId": {
+ "type": "string"
+ },
+ "description": {
+ "type": "string"
+ },
+ "favoriteInfo": {
+ "$ref": "#/$defs/FavoriteInfo"
+ },
+ "name": {
+ "type": "string"
+ },
+ "uploadedTime": {
+ "type": "string",
+ "format": "date-time"
+ },
+ "@type": {
+ "const": "Photo"
+ }
+ },
+ "required": [
+ "@type",
+ "albumId",
+ "name"
+ ]
+ }
+ ]
+}
+```
+
+#### PHOTOS and VIDEOS
+
+The PHOTOS and VIDEOS verticals are implemented as a subset of the MEDIA data type. If your importer is configured to support only PHOTOS, for example, photo data will be imported using the MEDIA schemas, but containing only `Album` and `Photo` payloads.
+
+### BLOBS
+
+Endpoint: `/blobs`
+
+The BLOBS vertical represents arbitrary file data and folder structures.
+
+Folders are conventionally imported before any of the data they contain, which may include folders and/or files.
+
+#### Basic Data Types
+
+The only basic data type exposed for BLOBS data is `Folder`, which describes a folder containing other folders and/or files.
+
+```json
+{
+ "$schema": "https://json-schema.org/draft/2020-12/schema",
+ "type": "object",
+ "properties": {
+ "path": {
+ "type": "string",
+ "description": "Full path of the folder to be created"
+ },
+ "@type": {
+ "const": "Folder"
+ }
+ },
+ "required": [
+ "@type",
+ "path"
+ ]
+}
+```
+
+#### File-based Data Types
+
+The only file-based data type exposed for BLOBS data is `File`, which describes an arbitrary file in a folder. The file's metadata will be described in the JSON part of the payload, with the below schema.
+
+```json
+{
+ "$schema": "https://json-schema.org/draft/2020-12/schema",
+ "type": "object",
+ "properties": {
+ "dateModified": {
+ "type": [
+ "string",
+ "null"
+ ],
+ "format": "date-time"
+ },
+ "folder": {
+ "type": "string"
+ },
+ "name": {
+ "type": "string"
+ },
+ "@type": {
+ "const": "File"
+ }
+ },
+ "required": [
+ "@type",
+ "name",
+ "folder"
+ ]
+}
+```
+
+### CALENDAR
+
+Endpoint: `/calendar`
+
+The CALENDAR vertical describes calendars and events on those calendars.
+
+Calendars are conventionally imported before the events associated with them.
+
+All data exposed by the CALENDAR vertical is encoded as a basic data type.
+
+```json
+{
+ "$schema": "https://json-schema.org/draft/2020-12/schema",
+ "$defs": {
+ "CalendarEventTime": {
+ "type": "object",
+ "properties": {
+ "dateOnly": {
+ "type": "boolean"
+ },
+ "dateTime": {
+ "type": "string",
+ "format": "date-time"
+ },
+ "@type": {
+ "const": "CalendarEventModel$CalendarEventTime"
+ }
+ },
+ "required": [
+ "@type",
+ "dateTime"
+ ]
+ }
+ },
+ "anyOf": [
+ {
+ "type": "object",
+ "properties": {
+ "description": {
+ "type": "string"
+ },
+ "id": {
+ "type": "string"
+ },
+ "name": {
+ "type": "string"
+ },
+ "@type": {
+ "const": "Calendar"
+ }
+ },
+ "required": [
+ "@type",
+ "name",
+ "id"
+ ]
+ },
+ {
+ "type": "object",
+ "properties": {
+ "attendees": {
+ "type": "array",
+ "items": {
+ "type": "object",
+ "properties": {
+ "displayName": {
+ "type": "string"
+ },
+ "email": {
+ "type": "string"
+ },
+ "optional": {
+ "type": "boolean"
+ },
+ "@type": {
+ "const": "CalendarAttendeeModel"
+ }
+ },
+ "required": [
+ "@type"
+ ]
+ }
+ },
+ "calendarId": {
+ "type": "string"
+ },
+ "endTime": {
+ "$ref": "#/$defs/CalendarEventTime"
+ },
+ "location": {
+ "type": "string"
+ },
+ "notes": {
+ "type": "string"
+ },
+ "recurrenceRule": {
+ "type": "object",
+ "properties": {
+ "exDate": {
+ "type": "object"
+ }
+ }
+ },
+ "startTime": {
+ "$ref": "#/$defs/CalendarEventTime"
+ },
+ "title": {
+ "type": "string"
+ },
+ "@type": {
+ "const": "CalendarEvent"
+ }
+ },
+ "required": [
+ "@type",
+ "calendarId",
+ "title"
+ ]
+ }
+ ]
+}
+```
+
+### SOCIAL-POSTS
+
+Endpoint: `/social-posts`
+
+The SOCIAL-POSTS vertical represents posts made by the user on a social media platform.
+
+Only the `SocialActivity` data type is exposed by the SOCIAL-POSTS vertical, which is a basic data type.
+
+```json
+{
+ "$schema": "https://json-schema.org/draft/2020-12/schema",
+ "type": "object",
+ "properties": {
+ "activity": {
+ "type": "object",
+ "properties": {
+ "attachments": {
+ "type": "array",
+ "items": {
+ "type": "object",
+ "properties": {
+ "content": {
+ "type": "string"
+ },
+ "name": {
+ "type": "string"
+ },
+ "type": {
+ "type": "string",
+ "enum": [
+ "LINK",
+ "IMAGE",
+ "VIDEO"
+ ]
+ },
+ "url": {
+ "type": "string"
+ },
+ "@type": {
+ "const": "SocialActivityAttachment"
+ }
+ },
+ "required": [
+ "@type",
+ "name"
+ ]
+ }
+ },
+ "content": {
+ "type": "string"
+ },
+ "id": {
+ "type": "string"
+ },
+ "location": {
+ "type": "object",
+ "properties": {
+ "latitude": {
+ "type": "number"
+ },
+ "longitude": {
+ "type": "number"
+ },
+ "name": {
+ "type": "string"
+ },
+ "@type": {
+ "const": "SocialActivityLocation"
+ }
+ },
+ "required": [
+ "@type",
+ "latitude",
+ "longitude"
+ ]
+ },
+ "published": {
+ "type": "string",
+ "format": "date-time"
+ },
+ "title": {
+ "type": "string"
+ },
+ "type": {
+ "type": "string",
+ "enum": [
+ "CHECKIN",
+ "POST",
+ "NOTE"
+ ]
+ },
+ "url": {
+ "type": "string"
+ },
+ "@type": {
+ "const": "SocialActivityModel"
+ }
+ },
+ "required": [
+ "@type",
+ "id",
+ "content"
+ ]
+ },
+ "metadata": {
+ "type": "object",
+ "properties": {
+ "actor": {
+ "type": "object",
+ "properties": {
+ "id": {
+ "type": "string"
+ },
+ "name": {
+ "type": "string"
+ },
+ "url": {
+ "type": "string"
+ },
+ "@type": {
+ "const": "SocialActivityActor"
+ }
+ },
+ "required": [
+ "@type",
+ "id"
+ ]
+ },
+ "@type": {
+ "const": "SocialActivityMetadata"
+ }
+ },
+ "required": [
+ "@type"
+ ]
+ },
+ "@type": {
+ "const": "SocialActivity"
+ }
+ },
+ "required": [
+ "@type",
+ "activity"
+ ]
+}
+```
+
+### Endpoint Errors
+
+If there is an error importing an item on any endpoint, the relevant HTTP response code (4xx, 5xx) should be set, and a JSON encoded response body with the following schema should be sent.
+
+```json
+{
+ "type": "object",
+ "properties": {
+ "error": {
+ "type": "string"
+ },
+ "error_description": {
+ "type": "string"
+ }
+ },
+ "required": [
+ "error"
+ ]
+}
+```
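
For example, a service that is rate limiting the worker might build its 429 response like this (an illustrative sketch; the field names follow the schema above, while the `error` value is hypothetical):

```python
import json

def error_response(status_code, error, description=None):
    """Build the (status, headers, body) triple for an import error response."""
    body = {"error": error}
    if description is not None:
        body["error_description"] = description
    return status_code, {"Content-Type": "application/json"}, json.dumps(body)

status, headers, body = error_response(429, "rate_limited", "Try again later")
```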
+
+The combination of the HTTP response code and `error` field can be used to encode specific failure modes; see [Token Refresh](#token-refresh) below.
+
+## Authentication and Authorization
+
+Generic Importers support the [OAuth 2.0 Authorization Code Flow](https://datatracker.ietf.org/doc/html/rfc6749#section-1.3.1); platforms will direct users to your OAuth authorization page, requesting an authorization code with OAuth scopes defined by your importer configuration, which is then exchanged for an access token.
+The access token will be sent as a Bearer token in the HTTP Authorization header sent with all import requests made by the DTP transfer worker.
+
+The authorization server may be the same as your HTTP service, or can be a separate server that your service integrates with. \
+When implementing your importer service there is likely an existing OAuth library that integrates with your chosen framework, or there are third-party authorization services you can integrate with.
+
+### Token Refresh
+
+If your configuration specifies a token refresh endpoint and a refresh token is provided alongside the access token, the transfer worker will attempt to refresh the access token in the event your service endpoint returns an HTTP 401 (Unauthorized) error with a JSON payload containing an `invalid_token` error message.
+
+```http
+HTTP/1.1 401 Unauthorized
+Content-Type: application/json
+
+{
+ "error": "invalid_token",
+ "error_description": "The access token expired"
+}
+```
+
+The worker will request a token refresh through the standard OAuth refresh token flow.
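
The worker-side behaviour can be pictured as follows (a simplified sketch with injected callables, not the actual DTP implementation):

```python
def send_with_refresh(send_request, refresh_token, request):
    """Send a request; on a 401 invalid_token response, refresh once and retry."""
    status, body = send_request(request)
    if status == 401 and body.get("error") == "invalid_token" and refresh_token is not None:
        refresh_token()  # obtain a new access token via the OAuth refresh flow
        status, body = send_request(request)
    return status, body
```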
diff --git a/extensions/data-transfer/portability-data-transfer-generic/build.gradle b/extensions/data-transfer/portability-data-transfer-generic/build.gradle
new file mode 100644
index 000000000..28820f54b
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/build.gradle
@@ -0,0 +1,33 @@
+/*
+ * Copyright 2018 The Data Transfer Project Authors.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * https://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+plugins {
+ id 'maven'
+ id 'signing'
+ id 'application'
+}
+
+dependencies {
+ compile project(':portability-spi-transfer')
+ compile project(':portability-spi-cloud')
+ compile project(':portability-types-common')
+ compile "com.squareup.okhttp3:okhttp:${okHttpVersion}"
+ compile "com.fasterxml.jackson.datatype:jackson-datatype-jsr310:${jacksonVersion}"
+ compile "com.fasterxml.jackson.datatype:jackson-datatype-jdk8:${jacksonVersion}"
+ testCompile "com.squareup.okhttp3:mockwebserver:${okHttpVersion}"
+ testCompile "commons-fileupload:commons-fileupload:1.5"
+}
+
+configurePublication(project)
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/BlobbySerializer.java b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/BlobbySerializer.java
new file mode 100644
index 000000000..853aea2d2
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/BlobbySerializer.java
@@ -0,0 +1,177 @@
+package org.datatransferproject.datatransfer.generic;
+
+import static java.lang.String.format;
+
+import com.fasterxml.jackson.annotation.JsonCreator;
+import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.annotation.JsonSubTypes;
+import com.fasterxml.jackson.annotation.JsonTypeInfo;
+import com.fasterxml.jackson.annotation.JsonTypeName;
+import java.time.ZonedDateTime;
+import java.util.ArrayDeque;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Optional;
+import java.util.Queue;
+import org.datatransferproject.types.common.DownloadableItem;
+import org.datatransferproject.types.common.models.blob.BlobbyStorageContainerResource;
+import org.datatransferproject.types.common.models.blob.DigitalDocumentWrapper;
+import org.datatransferproject.types.common.models.blob.DtpDigitalDocument;
+
+/**
+ * Wrapper to adapt items known to be in temp storage (e.g. BLOB data) into {@link DownloadableItem}
+ *
+ * <p>It's useful to wrap such items so upstream code can consume either known temp store'd items or
+ * items the Importer has to download itself (some MEDIA items) from the same interface.
+ */
+class CachedDownloadableItem implements DownloadableItem {
+ private String cachedId;
+ private String name;
+
+ public CachedDownloadableItem(String cachedId, String name) {
+ this.cachedId = cachedId;
+ this.name = name;
+ }
+
+ @Override
+ public String getIdempotentId() {
+ return cachedId;
+ }
+
+ @Override
+ public String getFetchableUrl() {
+ // 'url' is ID when cached
+ return cachedId;
+ }
+
+ @Override
+ public boolean isInTempStore() {
+ return true;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+}
+
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
+@JsonTypeName("File")
+class FileExportData implements BlobbySerializer.ExportData {
+ @JsonProperty private final String folder;
+ @JsonProperty private final String name;
+ @JsonProperty private final Optional<ZonedDateTime> dateModified;
+
+ private FileExportData(String folder, String name, Optional<ZonedDateTime> dateModified) {
+ this.folder = folder;
+ this.name = name;
+ this.dateModified = dateModified;
+ }
+
+ public String getFolder() {
+ return folder;
+ }
+
+ public String getName() {
+ return name;
+ }
+
+ public Optional<ZonedDateTime> getDateModified() {
+ return dateModified;
+ }
+
+ public static FileExportData fromDtpDigitalDocument(String path, DtpDigitalDocument model) {
+ return new FileExportData(
+ path,
+ model.getName(),
+ Optional.ofNullable(model.getDateModified())
+ .filter(string -> !string.isEmpty())
+ .map(dateString -> ZonedDateTime.parse(model.getDateModified())));
+ }
+}
+
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
+@JsonTypeName("Folder")
+class FolderExportData implements BlobbySerializer.ExportData {
+ @JsonProperty private final String path;
+
+ @JsonCreator
+ public FolderExportData(@JsonProperty String path) {
+ this.path = path;
+ }
+
+ public String getPath() {
+ return path;
+ }
+}
+
+public class BlobbySerializer {
+ @JsonSubTypes({
+ @JsonSubTypes.Type(FolderExportData.class),
+ @JsonSubTypes.Type(FileExportData.class),
+ })
+ public interface ExportData {}
+
+ static class BlobbyContainerPath {
+ private BlobbyStorageContainerResource container;
+ private String path;
+
+ public BlobbyContainerPath(BlobbyStorageContainerResource container, String path) {
+ this.container = container;
+ this.path = path;
+ }
+
+ public BlobbyStorageContainerResource getContainer() {
+ return container;
+ }
+
+ public String getPath() {
+ return path;
+ }
+ }
+
+ static final String SCHEMA_SOURCE =
+ GenericTransferConstants.SCHEMA_SOURCE_BASE
+ + "/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/BlobbySerializer.java";
+
+ public static Iterable<ImportableData<ExportData>> serialize(
+ BlobbyStorageContainerResource root) {
+ List<ImportableData<ExportData>> results = new ArrayList<>();
+ // Search whole tree of container resource
+ Queue<BlobbyContainerPath> horizon = new ArrayDeque<>();
+ BlobbyContainerPath containerAndPath = new BlobbyContainerPath(root, "");
+ do {
+ BlobbyStorageContainerResource container = containerAndPath.getContainer();
+ String parentPath = containerAndPath.getPath();
+ String path = format("%s/%s", parentPath, container.getName());
+ // Import the current folder
+ results.add(
+ new ImportableData<>(
+ new GenericPayload<>(new FolderExportData(path), SCHEMA_SOURCE),
+ container.getId(),
+ path));
+
+ // Add all sub-folders to the search tree
+ for (BlobbyStorageContainerResource child : container.getFolders()) {
+ horizon.add(new BlobbyContainerPath(child, path));
+ }
+
+ // Import all files in the current folder
+ // Intentionally done after importing the current folder
+ for (DigitalDocumentWrapper file : container.getFiles()) {
+ results.add(
+ new ImportableFileData<>(
+ new CachedDownloadableItem(
+ file.getCachedContentId(), file.getDtpDigitalDocument().getName()),
+ file.getDtpDigitalDocument().getEncodingFormat(),
+ new GenericPayload<>(
+ FileExportData.fromDtpDigitalDocument(path, file.getDtpDigitalDocument()),
+ SCHEMA_SOURCE),
+ file.getCachedContentId(),
+ file.getDtpDigitalDocument().getName()));
+ }
+ } while ((containerAndPath = horizon.poll()) != null);
+
+ return results;
+ }
+}
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/CalendarSerializer.java b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/CalendarSerializer.java
new file mode 100644
index 000000000..34d04ab2e
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/CalendarSerializer.java
@@ -0,0 +1,86 @@
+package org.datatransferproject.datatransfer.generic;
+
+import com.fasterxml.jackson.annotation.JsonSubTypes;
+import java.util.List;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+import org.datatransferproject.types.common.models.calendar.CalendarAttendeeModel;
+import org.datatransferproject.types.common.models.calendar.CalendarContainerResource;
+import org.datatransferproject.types.common.models.calendar.CalendarEventModel;
+import org.datatransferproject.types.common.models.calendar.CalendarModel;
+import org.datatransferproject.types.common.models.calendar.RecurrenceRule;
+
+class CalendarExportData extends CalendarModel implements CalendarSerializer.ExportData {
+ private CalendarExportData(String id, String name, String description) {
+ super(id, name, description);
+ }
+
+ public static CalendarExportData fromModel(CalendarModel model) {
+ return new CalendarExportData(model.getId(), model.getName(), model.getDescription());
+ }
+}
+
+class CalendarEventExportData extends CalendarEventModel implements CalendarSerializer.ExportData {
+
+ private CalendarEventExportData(
+ String calendarId,
+ String title,
+ String notes,
+ List<CalendarAttendeeModel> attendees,
+ String location,
+ CalendarEventTime startTime,
+ CalendarEventTime endTime,
+ RecurrenceRule recurrenceRule) {
+ super(calendarId, title, notes, attendees, location, startTime, endTime, recurrenceRule);
+ }
+
+ public static CalendarEventExportData fromModel(CalendarEventModel model) {
+ return new CalendarEventExportData(
+ model.getCalendarId(),
+ model.getTitle(),
+ model.getNotes(),
+ model.getAttendees(),
+ model.getLocation(),
+ model.getStartTime(),
+ model.getEndTime(),
+ model.getRecurrenceRule());
+ }
+}
+
+public class CalendarSerializer {
+
+ @JsonSubTypes({
+ @JsonSubTypes.Type(value = CalendarExportData.class, name = "Calendar"),
+ @JsonSubTypes.Type(value = CalendarEventExportData.class, name = "CalendarEvent"),
+ })
+ public interface ExportData {}
+
+ static final String SCHEMA_SOURCE_CALENDAR =
+ GenericTransferConstants.SCHEMA_SOURCE_BASE
+ + "/portability-types-common/src/main/java/org/datatransferproject/types/common/models/calendar/CalendarModel.java";
+ static final String SCHEMA_SOURCE_EVENT =
+ GenericTransferConstants.SCHEMA_SOURCE_BASE
+ + "/portability-types-common/src/main/java/org/datatransferproject/types/common/models/calendar/CalendarEventModel.java";
+
+ public static Iterable<ImportableData<ExportData>> serialize(
+ CalendarContainerResource container) {
+ return Stream.concat(
+ container.getCalendars().stream()
+ .map(
+ calendar ->
+ new ImportableData<>(
+ new GenericPayload<ExportData>(
+ CalendarExportData.fromModel(calendar), SCHEMA_SOURCE_CALENDAR),
+ calendar.getId(),
+ calendar.getName())),
+ container.getEvents().stream()
+ .map(
+ event ->
+ new ImportableData<>(
+ new GenericPayload<ExportData>(
+ CalendarEventExportData.fromModel(event), SCHEMA_SOURCE_EVENT),
+ String.valueOf(event.hashCode()),
+ event.getTitle())))
+ .collect(Collectors.toList());
+ }
+}
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/GenericFileImporter.java b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/GenericFileImporter.java
new file mode 100644
index 000000000..ff64ec6f7
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/GenericFileImporter.java
@@ -0,0 +1,80 @@
+package org.datatransferproject.datatransfer.generic;
+
+import static java.lang.String.format;
+
+import java.io.File;
+import java.io.IOException;
+import java.net.URL;
+import java.util.Optional;
+import java.util.UUID;
+import okhttp3.MediaType;
+import okhttp3.MultipartBody;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.datatransferproject.api.launcher.Monitor;
+import org.datatransferproject.spi.cloud.connection.ConnectionProvider;
+import org.datatransferproject.spi.cloud.storage.TemporaryPerJobDataStore;
+import org.datatransferproject.spi.cloud.storage.TemporaryPerJobDataStore.InputStreamWrapper;
+import org.datatransferproject.spi.transfer.types.InvalidTokenException;
+import org.datatransferproject.types.common.models.ContainerResource;
+import org.datatransferproject.types.transfer.auth.AppCredentials;
+import org.datatransferproject.types.transfer.auth.AuthData;
+import org.datatransferproject.types.transfer.auth.TokensAndUrlAuthData;
+
+public class GenericFileImporter<C extends ContainerResource, R> extends GenericImporter<C, R> {
+ private TemporaryPerJobDataStore dataStore;
+ private ConnectionProvider connectionProvider;
+
+ static final MediaType MULTIPART_RELATED = MediaType.parse("multipart/related");
+ static final MediaType OCTET_STREAM = MediaType.parse("application/octet-stream");
+
+ public GenericFileImporter(
+ ContainerSerializer<C, R> containerSerializer,
+ AppCredentials appCredentials,
+ URL endpoint,
+ TemporaryPerJobDataStore dataStore,
+ Monitor monitor) {
+ super(containerSerializer, appCredentials, endpoint, monitor);
+ this.dataStore = dataStore;
+ this.connectionProvider = new ConnectionProvider(dataStore);
+ }
+
+ @Override
+ public boolean importSingleItem(
+ UUID jobId, TokensAndUrlAuthData authData, ImportableData<R> dataItem)
+ throws IOException, InvalidTokenException {
+ if (dataItem instanceof ImportableFileData) {
+ return importSingleFileItem(jobId, authData, (ImportableFileData<R>) dataItem);
+ } else {
+ return super.importSingleItem(jobId, authData, dataItem);
+ }
+ }
+
+ private boolean importSingleFileItem(
+ UUID jobId, AuthData authData, ImportableFileData<R> data)
+ throws IOException, InvalidTokenException {
+ InputStreamWrapper wrapper = connectionProvider.getInputStreamForItem(jobId, data.getFile());
+ File tempFile =
+ dataStore.getTempFileFromInputStream(wrapper.getStream(), data.getFile().getName(), null);
+ MediaType mimeType =
+ Optional.ofNullable(MediaType.parse(data.getFileMimeType())).orElse(OCTET_STREAM);
+ Request request =
+ new Request.Builder()
+ .url(endpoint)
+ .addHeader("Authorization", format("Bearer %s", authData.getToken()))
+ .post(
+ new MultipartBody.Builder()
+ .setType(MULTIPART_RELATED)
+ .addPart(RequestBody.create(JSON, om.writeValueAsBytes(data.getJsonData())))
+ .addPart(MultipartBody.create(mimeType, tempFile))
+ .build())
+ .build();
+
+ try (Response response = client.newCall(request).execute()) {
+ return parseResponse(response);
+ } finally {
+ tempFile.delete();
+ }
+ }
+}
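For file-bearing verticals, `GenericFileImporter` POSTs a `multipart/related` body whose first part is the JSON payload and whose second part is the raw file bytes, falling back to `application/octet-stream` when the file's MIME type can't be parsed. A sketch of the resulting request (the boundary, host, path, and values are illustrative only):

```http
POST /media HTTP/1.1
Host: example.com
Authorization: Bearer <access-token>
Content-Type: multipart/related; boundary=boundary

--boundary
Content-Type: application/json

{"payload": {"...": "..."}, "schemaSource": "...", "apiVersion": "0.1.0"}
--boundary
Content-Type: image/jpeg

<raw file bytes>
--boundary--
```

The file is first buffered to a temporary file via `TemporaryPerJobDataStore` and deleted once the request completes, so large items are not held in memory.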
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/GenericImporter.java b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/GenericImporter.java
new file mode 100644
index 000000000..fee8ef42a
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/GenericImporter.java
@@ -0,0 +1,172 @@
+package org.datatransferproject.datatransfer.generic;
+
+import static java.lang.String.format;
+
+import com.fasterxml.jackson.annotation.JsonCreator;
+import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
+import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.core.JsonParseException;
+import com.fasterxml.jackson.databind.JsonMappingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.SerializationFeature;
+import com.fasterxml.jackson.datatype.jdk8.Jdk8Module;
+import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;
+import com.google.common.annotations.VisibleForTesting;
+import java.io.IOException;
+import java.net.URL;
+import java.nio.charset.StandardCharsets;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.Optional;
+import java.util.UUID;
+import javax.annotation.Nullable;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.datatransferproject.api.launcher.Monitor;
+import org.datatransferproject.datatransfer.generic.auth.OAuthTokenManager;
+import org.datatransferproject.spi.transfer.idempotentexecutor.IdempotentImportExecutor;
+import org.datatransferproject.spi.transfer.provider.ImportResult;
+import org.datatransferproject.spi.transfer.provider.ImportResult.ResultType;
+import org.datatransferproject.spi.transfer.provider.Importer;
+import org.datatransferproject.spi.transfer.types.InvalidTokenException;
+import org.datatransferproject.types.common.models.ContainerResource;
+import org.datatransferproject.types.transfer.auth.AppCredentials;
+import org.datatransferproject.types.transfer.auth.TokensAndUrlAuthData;
+
+@FunctionalInterface
+interface ContainerSerializer<C extends ContainerResource, R> {
+ public Iterable<ImportableData<R>> apply(C containerResource);
+}
+
+public class GenericImporter<C extends ContainerResource, R>
+ implements Importer<TokensAndUrlAuthData, C> {
+
+ @JsonIgnoreProperties(ignoreUnknown = true)
+ static class ErrorResponse {
+ private final String error;
+ private final Optional<String> errorDescription;
+
+ @JsonCreator
+ public ErrorResponse(
+ @JsonProperty(value = "error", required = true) String error,
+ @Nullable @JsonProperty("error_description") String errorDescription) {
+ this.error = error;
+ this.errorDescription = Optional.ofNullable(errorDescription);
+ }
+
+ public String getError() {
+ return error;
+ }
+
+ public Optional<String> getErrorDescription() {
+ return errorDescription;
+ }
+
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append(error);
+ if (errorDescription.isPresent()) {
+ builder.append(" - ");
+ builder.append(errorDescription.get());
+ }
+ return builder.toString();
+ }
+ }
+
+ ContainerSerializer<C, R> containerSerializer;
+ URL endpoint;
+ Monitor monitor;
+ AppCredentials appCredentials;
+ OkHttpClient client = new OkHttpClient();
+ ObjectMapper om = new ObjectMapper();
+ Map<UUID, OAuthTokenManager> jobTokenManagerMap = new HashMap<>();
+
+ static final MediaType JSON = MediaType.parse("application/json");
+
+ public GenericImporter(
+ ContainerSerializer<C, R> containerSerializer,
+ AppCredentials appCredentials,
+ URL endpoint,
+ Monitor monitor) {
+ this.monitor = monitor;
+ this.appCredentials = appCredentials;
+ this.endpoint = endpoint;
+ this.containerSerializer = containerSerializer;
+ configureObjectMapper(om);
+ }
+
+ @VisibleForTesting
+ static void configureObjectMapper(ObjectMapper objectMapper) {
+ // ZonedDateTime and friends
+ objectMapper.registerModule(new JavaTimeModule());
+ // Optional fields
+ objectMapper.registerModule(new Jdk8Module());
+ // ISO timestamps rather than unix epoch seconds
+ objectMapper.configure(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS, false);
+ }
+
+ @Override
+ public ImportResult importItem(
+ UUID jobId,
+ IdempotentImportExecutor idempotentExecutor,
+ TokensAndUrlAuthData initialAuthData,
+ C data)
+ throws Exception {
+ OAuthTokenManager tokenManager =
+ jobTokenManagerMap.computeIfAbsent(
+ jobId,
+ ignored -> new OAuthTokenManager(initialAuthData, appCredentials, client, monitor));
+ for (ImportableData<R> importableData : containerSerializer.apply(data)) {
+ idempotentExecutor.executeAndSwallowIOExceptions(
+ importableData.getIdempotentId(),
+ importableData.getName(),
+ () ->
+ tokenManager.withAuthData(
+ authData -> importSingleItem(jobId, authData, importableData)));
+ }
+ return new ImportResult(ResultType.OK);
+ }
+
+ boolean parseResponse(Response response) throws IOException, InvalidTokenException {
+ if (response.code() >= 400) {
+ byte[] body = response.body().bytes();
+ ErrorResponse error;
+ try {
+ error = om.readValue(body, ErrorResponse.class);
+ } catch (JsonParseException | JsonMappingException e) {
+ throw new IOException(
+ format(
+ "Unexpected response (%d) '%s'",
+ response.code(), new String(body, StandardCharsets.UTF_8)),
+ e);
+ }
+
+ if (response.code() == 401 && error.getError().equals("invalid_token")) {
+ throw new InvalidTokenException(error.toString(), null);
+ } else {
+ throw new IOException(format("Error (%d) %s", response.code(), error.toString()));
+ }
+ }
+ if (response.code() < 200 || response.code() >= 300) {
+ throw new IOException(format("Unexpected response code (%d)", response.code()));
+ }
+ return true;
+ }
+
+ boolean importSingleItem(UUID jobId, TokensAndUrlAuthData authData, ImportableData<R> dataItem)
+ throws IOException, InvalidTokenException {
+ Request request =
+ new Request.Builder()
+ .url(endpoint)
+ .addHeader("Authorization", format("Bearer %s", authData.getToken()))
+ .post(RequestBody.create(JSON, om.writeValueAsBytes(dataItem.getJsonData())))
+ .build();
+
+ try (Response response = client.newCall(request).execute()) {
+ return parseResponse(response);
+ }
+ }
+}
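`parseResponse` expects failing responses to carry a JSON body deserializable into `ErrorResponse` (`error` required, `error_description` optional). As an illustrative sketch (field values made up, not from the source), a destination endpoint signalling an expired token might respond with status 401 and:

```json
{
  "error": "invalid_token",
  "error_description": "The access token has expired"
}
```

A 401 whose `error` is `invalid_token` is surfaced as an `InvalidTokenException`, which lets `OAuthTokenManager` refresh the token and retry; any other 4xx/5xx response becomes an `IOException` and fails the item.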
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/GenericPayload.java b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/GenericPayload.java
new file mode 100644
index 000000000..821e14f9d
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/GenericPayload.java
@@ -0,0 +1,33 @@
+package org.datatransferproject.datatransfer.generic;
+
+import com.fasterxml.jackson.annotation.JsonCreator;
+import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.annotation.JsonTypeInfo;
+
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
+public class GenericPayload<T> {
+ @JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.PROPERTY)
+ private final T payload;
+
+ private final String schemaSource;
+ // TODO: update
+ private final String apiVersion = "0.1.0";
+
+ @JsonCreator
+ public GenericPayload(@JsonProperty T payload, @JsonProperty String schemaSource) {
+ this.payload = payload;
+ this.schemaSource = schemaSource;
+ }
+
+ public T getPayload() {
+ return payload;
+ }
+
+ public String getApiVersion() {
+ return apiVersion;
+ }
+
+ public String getSchemaSource() {
+ return schemaSource;
+ }
+}
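Serialized with the `ObjectMapper` configured in `GenericImporter`, a `GenericPayload` wraps each exported item in a small envelope. Assuming Jackson's default `@type` property for `JsonTypeInfo.Id.NAME` (the exact discriminator key depends on Jackson defaults, and the field values below are illustrative), a calendar payload would look roughly like:

```json
{
  "@type": "GenericPayload",
  "payload": {
    "@type": "Calendar",
    "id": "calendar-1",
    "name": "Personal",
    "description": "An example calendar"
  },
  "schemaSource": "https://github.com/dtinit/data-transfer-project/blob/master/portability-types-common/src/main/java/org/datatransferproject/types/common/models/calendar/CalendarModel.java",
  "apiVersion": "0.1.0"
}
```

`schemaSource` points the receiving endpoint at the Java model the payload was serialized from, and `apiVersion` lets the endpoint detect envelope format changes.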
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/GenericTransferConstants.java b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/GenericTransferConstants.java
new file mode 100644
index 000000000..8732adfd1
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/GenericTransferConstants.java
@@ -0,0 +1,6 @@
+package org.datatransferproject.datatransfer.generic;
+
+final class GenericTransferConstants {
+ public static final String SCHEMA_SOURCE_BASE =
+ "https://github.com/dtinit/data-transfer-project/blob/master";
+}
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/GenericTransferExtension.java b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/GenericTransferExtension.java
new file mode 100644
index 000000000..15ccd8f46
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/GenericTransferExtension.java
@@ -0,0 +1,220 @@
+package org.datatransferproject.datatransfer.generic;
+
+import static java.lang.String.format;
+import static org.datatransferproject.types.common.models.DataVertical.BLOBS;
+import static org.datatransferproject.types.common.models.DataVertical.CALENDAR;
+import static org.datatransferproject.types.common.models.DataVertical.MEDIA;
+import static org.datatransferproject.types.common.models.DataVertical.PHOTOS;
+import static org.datatransferproject.types.common.models.DataVertical.SOCIAL_POSTS;
+import static org.datatransferproject.types.common.models.DataVertical.VIDEOS;
+
+import com.fasterxml.jackson.annotation.JsonCreator;
+import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.JsonNode;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import java.io.IOException;
+import java.net.MalformedURLException;
+import java.net.URISyntaxException;
+import java.net.URL;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+import org.datatransferproject.api.launcher.ExtensionContext;
+import org.datatransferproject.spi.cloud.storage.AppCredentialStore;
+import org.datatransferproject.spi.cloud.storage.JobStore;
+import org.datatransferproject.spi.transfer.extension.TransferExtension;
+import org.datatransferproject.spi.transfer.provider.Exporter;
+import org.datatransferproject.spi.transfer.provider.Importer;
+import org.datatransferproject.types.common.models.DataVertical;
+import org.datatransferproject.types.common.models.blob.BlobbyStorageContainerResource;
+import org.datatransferproject.types.common.models.calendar.CalendarContainerResource;
+import org.datatransferproject.types.common.models.media.MediaContainerResource;
+import org.datatransferproject.types.common.models.social.SocialActivityContainerResource;
+import org.datatransferproject.types.transfer.auth.AppCredentials;
+import org.datatransferproject.types.transfer.serviceconfig.TransferServiceConfig;
+
+class GenericTransferServiceVerticalConfig {
+ private final DataVertical vertical;
+
+ @JsonCreator
+ public GenericTransferServiceVerticalConfig(
+ @JsonProperty(value = "vertical", required = true) DataVertical vertical) {
+ this.vertical = vertical;
+ }
+
+ public DataVertical getVertical() {
+ return vertical;
+ }
+
+ @Override
+ public int hashCode() {
+ return this.vertical.hashCode();
+ }
+}
+
+class GenericTransferServiceConfig {
+ private final String serviceId;
+ private final URL endpoint;
+ private final Set<GenericTransferServiceVerticalConfig> verticals;
+
+ public GenericTransferServiceConfig(
+ @JsonProperty(value = "serviceId", required = true) String serviceId,
+ @JsonProperty(value = "endpoint", required = true) URL endpoint,
+ @JsonProperty(value = "verticals", required = true)
+ List<GenericTransferServiceVerticalConfig> verticals) {
+ this.serviceId = serviceId;
+ this.endpoint = endpoint;
+ this.verticals = new HashSet<>(verticals);
+ }
+
+ public String getServiceId() {
+ return serviceId;
+ }
+
+ public URL getEndpoint() {
+ return endpoint;
+ }
+
+ public Set<GenericTransferServiceVerticalConfig> getVerticals() {
+ return verticals;
+ }
+
+ public boolean supportsVertical(DataVertical vertical) {
+ return verticals.stream()
+ .map(verticalConfig -> verticalConfig.getVertical())
+ .collect(Collectors.toSet())
+ .contains(vertical);
+ }
+}
+
+public class GenericTransferExtension implements TransferExtension {
+ Map<DataVertical, Importer<?, ?>> importerMap = new HashMap<>();
+
+ @Override
+ public boolean supportsService(String service) {
+ try {
+ TransferServiceConfig config = TransferServiceConfig.getForService(service);
+ if (config.getServiceConfig().isEmpty()) {
+ return false;
+ }
+ // Parse failures throw
+ parseConfig(config.getServiceConfig().get());
+ } catch (IOException e) {
+ return false;
+ }
+ // Found and parsed a valid generic service config for the service
+ return true;
+ }
+
+ @Override
+ public void initialize(ExtensionContext context) {
+ JobStore jobStore = context.getService(JobStore.class);
+ TransferServiceConfig configuration = context.getService(TransferServiceConfig.class);
+ if (configuration.getServiceConfig().isEmpty()) {
+ throw new RuntimeException("Empty service configuration");
+ }
+ GenericTransferServiceConfig serviceConfig;
+ try {
+ serviceConfig = parseConfig(configuration.getServiceConfig().get());
+ } catch (JsonProcessingException e) {
+ throw new RuntimeException("Invalid service configuration", e);
+ }
+
+ AppCredentialStore appCredentialStore = context.getService(AppCredentialStore.class);
+ String serviceNameUpper = serviceConfig.getServiceId().toUpperCase();
+ AppCredentials appCredentials;
+ try {
+ appCredentials =
+ appCredentialStore.getAppCredentials(
+ format("%s_KEY", serviceNameUpper), format("%s_SECRET", serviceNameUpper));
+ } catch (IOException e) {
+ throw new RuntimeException(
+ format(
+ "Failed to get application credentials for %s (%s)",
+ serviceNameUpper, serviceConfig.getServiceId()),
+ e);
+ }
+
+ if (serviceConfig.supportsVertical(BLOBS)) {
+ importerMap.put(
+ BLOBS,
+ new GenericFileImporter<BlobbyStorageContainerResource, BlobbySerializer.ExportData>(
+ BlobbySerializer::serialize,
+ appCredentials,
+ urlAppend(serviceConfig.getEndpoint(), "blobs"),
+ jobStore,
+ context.getMonitor()));
+ }
+
+ if (serviceConfig.supportsVertical(MEDIA)
+ // PHOTOS and VIDEOS can be mapped from MEDIA
+ || serviceConfig.supportsVertical(PHOTOS)
+ || serviceConfig.supportsVertical(VIDEOS)) {
+ importerMap.put(
+ MEDIA,
+ new GenericFileImporter<MediaContainerResource, MediaSerializer.ExportData>(
+ MediaSerializer::serialize,
+ appCredentials,
+ urlAppend(serviceConfig.getEndpoint(), "media"),
+ jobStore,
+ context.getMonitor()));
+ }
+
+ if (serviceConfig.supportsVertical(SOCIAL_POSTS)) {
+ importerMap.put(
+ SOCIAL_POSTS,
+ new GenericImporter<SocialActivityContainerResource, SocialPostsSerializer.ExportData>(
+ SocialPostsSerializer::serialize,
+ appCredentials,
+ urlAppend(serviceConfig.getEndpoint(), "social-posts"),
+ context.getMonitor()));
+ }
+
+ if (serviceConfig.supportsVertical(CALENDAR)) {
+ importerMap.put(
+ CALENDAR,
+ new GenericImporter<CalendarContainerResource, CalendarSerializer.ExportData>(
+ CalendarSerializer::serialize,
+ appCredentials,
+ urlAppend(serviceConfig.getEndpoint(), "calendar"),
+ context.getMonitor()));
+ }
+ }
+
+ private URL urlAppend(URL base, String suffix) {
+ try {
+ String path = base.getPath();
+ if (!path.endsWith("/")) {
+ path += "/";
+ }
+ path += suffix;
+ return base.toURI().resolve(path).toURL();
+ } catch (MalformedURLException | URISyntaxException e) {
+ throw new RuntimeException("Failed to build URL", e);
+ }
+ }
+
+ private GenericTransferServiceConfig parseConfig(JsonNode config) throws JsonProcessingException {
+ ObjectMapper om = new ObjectMapper();
+ return om.treeToValue(config, GenericTransferServiceConfig.class);
+ }
+
+ @Override
+ public String getServiceId() {
+ return "Generic";
+ }
+
+ @Override
+ public Exporter<?, ?> getExporter(DataVertical transferDataType) {
+ throw new UnsupportedOperationException("Generic exporters aren't supported");
+ }
+
+ @Override
+ public Importer<?, ?> getImporter(DataVertical transferDataType) {
+ return importerMap.get(transferDataType);
+ }
+}
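`parseConfig` deserializes the per-service configuration block of `TransferServiceConfig` into `GenericTransferServiceConfig`. A minimal sketch of such a config, with a hypothetical service name and endpoint, would be:

```json
{
  "serviceId": "ExampleService",
  "endpoint": "https://example.com/dtp",
  "verticals": [
    { "vertical": "CALENDAR" },
    { "vertical": "MEDIA" }
  ]
}
```

With this config the extension registers importers that POST to `https://example.com/dtp/calendar` and `https://example.com/dtp/media` (the vertical-specific suffix is appended by `urlAppend`), and looks up app credentials under the keys `EXAMPLESERVICE_KEY` and `EXAMPLESERVICE_SECRET`.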
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/ImportableData.java b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/ImportableData.java
new file mode 100644
index 000000000..b6945aeaf
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/ImportableData.java
@@ -0,0 +1,30 @@
+package org.datatransferproject.datatransfer.generic;
+
+public class ImportableData<T> {
+ /** JSON serializable data to POST */
+ private GenericPayload<T> jsonData;
+
+ /** Globally unique ID to avoid re-importing data */
+ private String idempotentId;
+
+ /** Human-readable item name */
+ private String name;
+
+ public ImportableData(GenericPayload<T> jsonData, String idempotentId, String name) {
+ this.jsonData = jsonData;
+ this.idempotentId = idempotentId;
+ this.name = name;
+ }
+
+ public GenericPayload<T> getJsonData() {
+ return jsonData;
+ }
+
+ public String getIdempotentId() {
+ return idempotentId;
+ }
+
+ public String getName() {
+ return name;
+ }
+}
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/ImportableFileData.java b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/ImportableFileData.java
new file mode 100644
index 000000000..094fc8150
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/ImportableFileData.java
@@ -0,0 +1,29 @@
+package org.datatransferproject.datatransfer.generic;
+
+import org.datatransferproject.types.common.DownloadableItem;
+
+public class ImportableFileData<T> extends ImportableData<T> {
+ /** File to POST */
+ private DownloadableItem file;
+
+ private String fileMimeType;
+
+ public ImportableFileData(
+ DownloadableItem file,
+ String fileMimeType,
+ GenericPayload<T> jsonData,
+ String idempotentId,
+ String name) {
+ super(jsonData, idempotentId, name);
+ this.file = file;
+ this.fileMimeType = fileMimeType;
+ }
+
+ public DownloadableItem getFile() {
+ return file;
+ }
+
+ public String getFileMimeType() {
+ return fileMimeType;
+ }
+}
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/MediaSerializer.java b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/MediaSerializer.java
new file mode 100644
index 000000000..200a0371e
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/MediaSerializer.java
@@ -0,0 +1,204 @@
+package org.datatransferproject.datatransfer.generic;
+
+import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.annotation.JsonSubTypes;
+import com.fasterxml.jackson.annotation.JsonTypeInfo;
+import com.fasterxml.jackson.annotation.JsonTypeName;
+import java.time.ZoneOffset;
+import java.time.ZonedDateTime;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+import org.datatransferproject.types.common.models.FavoriteInfo;
+import org.datatransferproject.types.common.models.media.MediaAlbum;
+import org.datatransferproject.types.common.models.media.MediaContainerResource;
+import org.datatransferproject.types.common.models.photos.PhotoModel;
+import org.datatransferproject.types.common.models.videos.VideoModel;
+
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
+@JsonTypeName("Album")
+class AlbumExportData implements MediaSerializer.ExportData {
+ @JsonProperty private final String id;
+ @JsonProperty private final String name;
+ @JsonProperty private final String description;
+
+ private AlbumExportData(String id, String name, String description) {
+ this.id = id;
+ this.name = name;
+ this.description = description;
+ }
+
+ static AlbumExportData fromModel(MediaAlbum model) {
+ return new AlbumExportData(model.getId(), model.getName(), model.getDescription());
+ }
+
+ public String getId() {
+ return id;
+ }
+
+ public String getName() {
+ return name;
+ }
+
+ public String getDescription() {
+ return description;
+ }
+}
+
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
+@JsonTypeName("FavoriteInfo")
+class FavoriteInfoExportData {
+ @JsonProperty private final boolean favorite;
+ @JsonProperty private final ZonedDateTime lastUpdateTime;
+
+ private FavoriteInfoExportData(boolean favorite, ZonedDateTime lastUpdateTime) {
+ this.favorite = favorite;
+ this.lastUpdateTime = lastUpdateTime;
+ }
+
+ public static FavoriteInfoExportData fromModel(FavoriteInfo model) {
+ return new FavoriteInfoExportData(
+ model.getFavorited(),
+ ZonedDateTime.ofInstant(model.getLastUpdateTime().toInstant(), ZoneOffset.UTC));
+ }
+
+ public boolean isFavorite() {
+ return favorite;
+ }
+
+ public ZonedDateTime getLastUpdateTime() {
+ return lastUpdateTime;
+ }
+}
+
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
+class MediaItemExportData implements MediaSerializer.ExportData {
+ @JsonProperty private final String name;
+ @JsonProperty private final String description;
+ @JsonProperty private final String albumId;
+ @JsonProperty private final ZonedDateTime uploadedTime;
+ @JsonProperty private final FavoriteInfoExportData favoriteInfo;
+
+ public MediaItemExportData(
+ String name,
+ String description,
+ String albumId,
+ ZonedDateTime uploadedTime,
+ FavoriteInfoExportData favoriteInfo) {
+ this.name = name;
+ this.description = description;
+ this.albumId = albumId;
+ this.uploadedTime = uploadedTime;
+ this.favoriteInfo = favoriteInfo;
+ }
+
+ public String getName() {
+ return name;
+ }
+
+ public String getDescription() {
+ return description;
+ }
+
+ public String getAlbumId() {
+ return albumId;
+ }
+
+ public ZonedDateTime getUploadedTime() {
+ return uploadedTime;
+ }
+
+ public FavoriteInfoExportData getFavoriteInfo() {
+ return favoriteInfo;
+ }
+}
+
+@JsonTypeName("Video")
+class VideoExportData extends MediaItemExportData {
+ private VideoExportData(
+ String name,
+ String description,
+ String albumId,
+ ZonedDateTime uploadedTime,
+ FavoriteInfoExportData favoriteInfo) {
+ super(name, description, albumId, uploadedTime, favoriteInfo);
+ }
+
+ static VideoExportData fromModel(VideoModel model) {
+ return new VideoExportData(
+ model.getName(),
+ model.getDescription(),
+ model.getAlbumId(),
+ ZonedDateTime.ofInstant(model.getUploadedTime().toInstant(), ZoneOffset.UTC),
+ FavoriteInfoExportData.fromModel(model.getFavoriteInfo()));
+ }
+}
+
+@JsonTypeName("Photo")
+class PhotoExportData extends MediaItemExportData {
+ private PhotoExportData(
+ String name,
+ String description,
+ String albumId,
+ ZonedDateTime uploadedTime,
+ FavoriteInfoExportData favoriteInfo) {
+ super(name, description, albumId, uploadedTime, favoriteInfo);
+ }
+
+ static PhotoExportData fromModel(PhotoModel model) {
+ return new PhotoExportData(
+ model.getName(),
+ model.getDescription(),
+ model.getAlbumId(),
+ ZonedDateTime.ofInstant(model.getUploadedTime().toInstant(), ZoneOffset.UTC),
+ FavoriteInfoExportData.fromModel(model.getFavoriteInfo()));
+ }
+}
+
+public class MediaSerializer {
+ static final String SCHEMA_SOURCE =
+ GenericTransferConstants.SCHEMA_SOURCE_BASE
+ + "/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/MediaSerializer.java";
+
+ @JsonSubTypes({
+ @JsonSubTypes.Type(AlbumExportData.class),
+ @JsonSubTypes.Type(VideoExportData.class),
+ @JsonSubTypes.Type(PhotoExportData.class),
+ })
+ public interface ExportData {}
+
+ public static Iterable<ImportableData<ExportData>> serialize(MediaContainerResource container) {
+ return Stream.concat(
+ container.getAlbums().stream()
+ .map(
+ album ->
+ new ImportableData<>(
+ new GenericPayload<ExportData>(
+ AlbumExportData.fromModel(album), SCHEMA_SOURCE),
+ album.getIdempotentId(),
+ album.getName())),
+ Stream.concat(
+ container.getVideos().stream()
+ .map(
+ (video) -> {
+ return new ImportableFileData<>(
+ video,
+ video.getMimeType(),
+ new GenericPayload<ExportData>(
+ VideoExportData.fromModel(video), SCHEMA_SOURCE),
+ video.getIdempotentId(),
+ video.getName());
+ }),
+ container.getPhotos().stream()
+ .map(
+ photo -> {
+ return new ImportableFileData<>(
+ photo,
+ photo.getMimeType(),
+ new GenericPayload<ExportData>(
+ PhotoExportData.fromModel(photo), SCHEMA_SOURCE),
+ photo.getIdempotentId(),
+ photo.getName());
+ })))
+ .collect(Collectors.toList());
+ }
+}
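Each serialized media item carries the `@JsonTypeName` discriminator of its concrete class (`Album`, `Photo`, or `Video`), so the receiving endpoint can tell the item kinds apart. An illustrative photo payload, assuming Jackson's default `@type` discriminator key and made-up values:

```json
{
  "@type": "Photo",
  "name": "IMG_0001.jpg",
  "description": "Holiday photo",
  "albumId": "album-1",
  "uploadedTime": "2024-01-01T12:00:00Z",
  "favoriteInfo": {
    "@type": "FavoriteInfo",
    "favorite": true,
    "lastUpdateTime": "2024-01-02T08:30:00Z"
  }
}
```

For photos and videos this JSON travels as the first part of the multipart request built by `GenericFileImporter`, alongside the file bytes; albums are plain JSON POSTs.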
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/SocialPostsSerializer.java b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/SocialPostsSerializer.java
new file mode 100644
index 000000000..a167e6916
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/SocialPostsSerializer.java
@@ -0,0 +1,72 @@
+package org.datatransferproject.datatransfer.generic;
+
+import com.fasterxml.jackson.annotation.JsonCreator;
+import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.annotation.JsonSubTypes;
+import com.fasterxml.jackson.annotation.JsonTypeInfo;
+import java.util.stream.Collectors;
+import org.datatransferproject.types.common.models.social.SocialActivityActor;
+import org.datatransferproject.types.common.models.social.SocialActivityContainerResource;
+import org.datatransferproject.types.common.models.social.SocialActivityModel;
+
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
+class SocialActivityMetadata {
+ private final SocialActivityActor actor;
+
+ @JsonCreator
+ public SocialActivityMetadata(@JsonProperty SocialActivityActor actor) {
+ this.actor = actor;
+ }
+
+ public SocialActivityActor getActor() {
+ return actor;
+ }
+}
+
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
+class SocialActivity implements SocialPostsSerializer.ExportData {
+ private final SocialActivityMetadata metadata;
+ private final SocialActivityModel activity;
+
+ public SocialActivity(SocialActivityMetadata metadata, SocialActivityModel activity) {
+ this.metadata = metadata;
+ this.activity = activity;
+ }
+
+ public SocialActivityMetadata getMetadata() {
+ return metadata;
+ }
+
+ public SocialActivityModel getActivity() {
+ return activity;
+ }
+}
+
+public class SocialPostsSerializer {
+
+ @JsonSubTypes({
+ @JsonSubTypes.Type(value = SocialActivity.class),
+ })
+ public interface ExportData {}
+
+ static final String SCHEMA_SOURCE =
+ GenericTransferConstants.SCHEMA_SOURCE_BASE
+ + "/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/SocialPostsSerializer.java";
+
+ public static Iterable<ImportableData<ExportData>> serialize(
+ SocialActivityContainerResource container) {
+ return container.getActivities().stream()
+ .map(
+ activity ->
+ new ImportableData<>(
+ new GenericPayload<ExportData>(
+ // "actor" is stored at the container level, but isn't replicated
+ // in each activity, so merge it in as a metadata field
+ new SocialActivity(
+ new SocialActivityMetadata(container.getActor()), activity),
+ SCHEMA_SOURCE),
+ activity.getIdempotentId(),
+ activity.getName()))
+ .collect(Collectors.toList());
+ }
+}
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/auth/OAuthTokenManager.java b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/auth/OAuthTokenManager.java
new file mode 100644
index 000000000..8f3218749
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/main/java/org/datatransferproject/datatransfer/generic/auth/OAuthTokenManager.java
@@ -0,0 +1,171 @@
+package org.datatransferproject.datatransfer.generic.auth;
+
+import static java.lang.String.format;
+
+import com.fasterxml.jackson.annotation.JsonCreator;
+import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.core.JsonFactory;
+import com.fasterxml.jackson.core.JsonParseException;
+import com.fasterxml.jackson.databind.JsonMappingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.PropertyNamingStrategies;
+import com.fasterxml.jackson.databind.annotation.JsonNaming;
+import java.io.IOException;
+import java.nio.charset.StandardCharsets;
+import java.util.Optional;
+import javax.annotation.Nullable;
+import okhttp3.FormBody;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.Response;
+import org.datatransferproject.api.launcher.Monitor;
+import org.datatransferproject.spi.transfer.types.InvalidTokenException;
+import org.datatransferproject.types.transfer.auth.AppCredentials;
+import org.datatransferproject.types.transfer.auth.TokensAndUrlAuthData;
+
+@JsonNaming(PropertyNamingStrategies.SnakeCaseStrategy.class)
+class RefreshTokenResponse {
+ private final String accessToken;
+ private final Optional<String> refreshToken;
+ private final String tokenType;
+ private final Optional<String> expiresIn;
+
+ @JsonCreator
+ public RefreshTokenResponse(
+ @JsonProperty(value = "access_token", required = true) String accessToken,
+ @Nullable @JsonProperty("refresh_token") String refreshToken,
+ @JsonProperty(value = "token_type", required = true) String tokenType,
+ @Nullable @JsonProperty("expires_in") String expiresIn) {
+ this.accessToken = accessToken;
+ this.refreshToken = Optional.ofNullable(refreshToken);
+ this.tokenType = tokenType;
+ this.expiresIn = Optional.ofNullable(expiresIn);
+ }
+
+ public String getAccessToken() {
+ return accessToken;
+ }
+
+ public Optional<String> getRefreshToken() {
+ return refreshToken;
+ }
+
+ public String getTokenType() {
+ return tokenType;
+ }
+
+ public Optional<String> getExpiresIn() {
+ return expiresIn;
+ }
+}
+
+/**
+ * Utility to manage {@link TokensAndUrlAuthData} containing OAuth refresh and access tokens.
+ * See {@link #withAuthData} for how to use the auth data managed by this class.
+ */
+public class OAuthTokenManager {
+ @FunctionalInterface
+ public interface FunctionRequiringAuthData<T> {
+ public T execute(TokensAndUrlAuthData authData) throws IOException, InvalidTokenException;
+ }
+
+ AppCredentials appCredentials;
+ TokensAndUrlAuthData authData;
+
+ OkHttpClient client;
+ Monitor monitor;
+ ObjectMapper om = new ObjectMapper(new JsonFactory());
+
+ static final MediaType JSON = MediaType.parse("application/json");
+ static final MediaType FORM_DATA = MediaType.parse("application/x-www-form-urlencoded");
+
+ /**
+ * @param initialAuthData The auth data to be used for the first request
+ * @param appCredentials containing the OAuth client ID and client secret
+ * @param client to use for making HTTP requests when refreshing the token
+ * @param monitor for logging
+ */
+ public OAuthTokenManager(
+ TokensAndUrlAuthData initialAuthData,
+ AppCredentials appCredentials,
+ OkHttpClient client,
+ Monitor monitor) {
+ this.appCredentials = appCredentials;
+ this.authData = initialAuthData;
+
+ this.client = client;
+ this.monitor = monitor;
+ this.om.setPropertyNamingStrategy(PropertyNamingStrategies.SNAKE_CASE);
+ }
+
+ private TokensAndUrlAuthData refreshToken() throws IOException {
+ monitor.info(() -> "Refreshing OAuth token");
+ Request request =
+ new Request.Builder()
+ .url(authData.getTokenServerEncodedUrl())
+ .addHeader("Accept", JSON.toString())
+ .post(
+ new FormBody.Builder()
+ .add("grant_type", "refresh_token")
+ .add("client_id", appCredentials.getKey())
+ .add("client_secret", appCredentials.getSecret())
+ .add("refresh_token", authData.getRefreshToken())
+ .build())
+ .build();
+
+ try (Response response = client.newCall(request).execute()) {
+ if (response.code() >= 400) {
+ throw new IOException(
+ format(
+ "Error while refreshing token (%d): %s",
+ response.code(), new String(response.body().bytes(), StandardCharsets.UTF_8)));
+ }
+ byte[] body = response.body().bytes();
+ RefreshTokenResponse responsePayload;
+ try {
+ responsePayload = om.readValue(body, RefreshTokenResponse.class);
+ } catch (JsonParseException | JsonMappingException e) {
+ throw new IOException(
+ format(
+ "Unexpected response while refreshing token: %s",
+ new String(body, StandardCharsets.UTF_8)),
+ e);
+ }
+ return new TokensAndUrlAuthData(
+ responsePayload.getAccessToken(),
+ responsePayload.getRefreshToken().orElse(authData.getRefreshToken()),
+ authData.getTokenServerEncodedUrl());
+ }
+ }
+
+ /**
+ * Call a function {@code f} requiring auth data, injecting the auth data. If the function raises
+ * an {@link InvalidTokenException}, the token will be refreshed and the function will be called
+ * again with the fresh token.
+ *
+ * @param f The function to call - {@code (authData) -> T}
+ * @param <T> {@code f}'s return type
+ * @return The {@code T} returned by {@code f}
+ * @throws IOException if {@code f} throws an {@link IOException}
+ * @throws InvalidTokenException if {@code f} throws an {@link InvalidTokenException} after the
+ * access token has been refreshed
+ */
+ public <T> T withAuthData(FunctionRequiringAuthData<T> f)
+ throws IOException, InvalidTokenException {
+ try {
+ return f.execute(authData);
+ } catch (InvalidTokenException e) {
+ if (authData.getRefreshToken() == null || authData.getRefreshToken().isEmpty()) {
+ throw e;
+ }
+
+ this.authData = refreshToken();
+ try {
+ return f.execute(this.authData);
+ } catch (InvalidTokenException innerException) {
+ throw new InvalidTokenException("Token still expired after refresh", innerException);
+ }
+ }
+ }
+}
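The refresh-and-retry flow that `withAuthData` implements can be sketched outside DTP as follows. `InvalidToken` and `refresh()` are stand-ins for DTP's `InvalidTokenException` and the HTTP token-refresh round-trip; they are illustrative only:

```java
// Minimal sketch of the refresh-and-retry flow of OAuthTokenManager.withAuthData.
public class RetrySketch {
  static class InvalidToken extends Exception {}

  interface Call<T> {
    T execute(String token) throws InvalidToken;
  }

  static String token = "expired";

  static String refresh() {
    return "fresh"; // a real implementation POSTs grant_type=refresh_token
  }

  static <T> T withToken(Call<T> f) throws InvalidToken {
    try {
      return f.execute(token);
    } catch (InvalidToken e) {
      token = refresh(); // refresh once...
      return f.execute(token); // ...and retry exactly once; a second failure propagates
    }
  }
}
```

The key design point mirrored here is that the refresh happens lazily, only after a call fails with an invalid-token error, and the call is retried at most once.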
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/BlobbySerializerTest.java b/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/BlobbySerializerTest.java
new file mode 100644
index 000000000..ea2bcb6a3
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/BlobbySerializerTest.java
@@ -0,0 +1,163 @@
+package org.datatransferproject.datatransfer.generic;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertTrue;
+
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.List;
+import org.datatransferproject.types.common.models.blob.BlobbyStorageContainerResource;
+import org.datatransferproject.types.common.models.blob.DigitalDocumentWrapper;
+import org.datatransferproject.types.common.models.blob.DtpDigitalDocument;
+import org.junit.Test;
+
+public class BlobbySerializerTest extends GenericImportSerializerTestBase {
+ @Test
+ public void testBlobbySerializerFolders() throws Exception {
+ // Folder structure of
+ // /root/
+ // /foo/
+ BlobbyStorageContainerResource container =
+ new BlobbyStorageContainerResource(
+ "root",
+ "rootdir",
+ new ArrayList<>(),
+ Arrays.asList(
+ new BlobbyStorageContainerResource(
+ "foo", "foodir", new ArrayList<>(), new ArrayList<>())));
+
+ List<ImportableData<GenericPayload>> res =
+ iterableToList(BlobbySerializer.serialize(container));
+
+ assertEquals(2, res.size());
+ assertEquals("rootdir", res.get(0).getIdempotentId());
+ assertEquals("/root", res.get(0).getName());
+ assertJsonEquals("{\"@type\": \"Folder\", \"path\": \"/root\"}", res.get(0).getJsonData());
+
+ assertEquals("foodir", res.get(1).getIdempotentId());
+ assertEquals("/root/foo", res.get(1).getName());
+ assertJsonEquals("{\"@type\": \"Folder\", \"path\": \"/root/foo\"}", res.get(1).getJsonData());
+ }
+
+ @Test
+ public void testBlobbySerializerFiles() throws Exception {
+ // Folder structure of
+ // /root
+ // foo.mp4
+ // bar.txt
+ BlobbyStorageContainerResource container =
+ new BlobbyStorageContainerResource(
+ "root",
+ "rootdir",
+ Arrays.asList(
+ new DigitalDocumentWrapper(
+ new DtpDigitalDocument("foo.mp4", "2020-02-01T01:02:03Z", "video/mp4"),
+ "video/mp4",
+ "foomp4"),
+ new DigitalDocumentWrapper(
+ new DtpDigitalDocument("bar.txt", null, "text/plain"), "text/plain", "bartxt")),
+ new ArrayList<>());
+
+ List<ImportableData<GenericPayload>> res =
+ iterableToList(BlobbySerializer.serialize(container));
+
+ assertEquals(3, res.size());
+
+ assertEquals("rootdir", res.get(0).getIdempotentId());
+ assertEquals("/root", res.get(0).getName());
+ assertJsonEquals(
+ "" + "{" + " \"@type\": \"Folder\"," + " \"path\": \"/root\"" + "}",
+ res.get(0).getJsonData());
+
+ assertEquals("foomp4", res.get(1).getIdempotentId());
+ assertEquals("foo.mp4", res.get(1).getName());
+ assertJsonEquals(
+ ""
+ + "{"
+ + " \"@type\": \"File\","
+ + " \"folder\": \"/root\","
+ + " \"name\": \"foo.mp4\","
+ + " \"dateModified\": \"2020-02-01T01:02:03Z\""
+ + "}",
+ res.get(1).getJsonData());
+ assertTrue(res.get(1) instanceof ImportableFileData);
+ assertTrue(
+ ((ImportableFileData<GenericPayload>) res.get(1)).getFile().isInTempStore());
+ assertEquals(
+ "foomp4",
+ ((ImportableFileData<GenericPayload>) res.get(1)).getFile().getFetchableUrl());
+
+ assertEquals("bartxt", res.get(2).getIdempotentId());
+ assertEquals("bar.txt", res.get(2).getName());
+ assertJsonEquals(
+ ""
+ + "{"
+ + " \"@type\": \"File\","
+ + " \"folder\": \"/root\","
+ + " \"name\": \"bar.txt\","
+ + " \"dateModified\": null"
+ + "}",
+ res.get(2).getJsonData());
+ assertTrue(res.get(2) instanceof ImportableFileData);
+ assertTrue(
+ ((ImportableFileData<GenericPayload>) res.get(2)).getFile().isInTempStore());
+ assertEquals(
+ "bartxt",
+ ((ImportableFileData<GenericPayload>) res.get(2)).getFile().getFetchableUrl());
+ }
+
+ @Test
+ public void testBlobbySerializerNested() throws Exception {
+ // Folder structure of
+ // /root
+ // /foo/
+ // bar.txt
+ BlobbyStorageContainerResource container =
+ new BlobbyStorageContainerResource(
+ "root",
+ "rootdir",
+ new ArrayList<>(),
+ Arrays.asList(
+ new BlobbyStorageContainerResource(
+ "foo",
+ "foodir",
+ Arrays.asList(
+ new DigitalDocumentWrapper(
+ new DtpDigitalDocument("bar.txt", "2020-03-01T01:02:03Z", "text/plain"),
+ "text/plain",
+ "bartxt")),
+ new ArrayList<>())));
+
+ List<ImportableData<GenericPayload>> res =
+ iterableToList(BlobbySerializer.serialize(container));
+
+ assertEquals(3, res.size());
+
+ assertEquals("rootdir", res.get(0).getIdempotentId());
+ assertEquals("/root", res.get(0).getName());
+ assertJsonEquals("{\"@type\": \"Folder\", \"path\": \"/root\"}", res.get(0).getJsonData());
+
+ assertEquals("foodir", res.get(1).getIdempotentId());
+ assertEquals("/root/foo", res.get(1).getName());
+ assertJsonEquals("{\"@type\": \"Folder\", \"path\": \"/root/foo\"}", res.get(1).getJsonData());
+
+ assertEquals("bartxt", res.get(2).getIdempotentId());
+ assertEquals("bar.txt", res.get(2).getName());
+ assertJsonEquals(
+ ""
+ + "{"
+ + " \"@type\": \"File\","
+ + " \"folder\": \"/root/foo\","
+ + " \"name\": \"bar.txt\","
+ + " \"dateModified\": \"2020-03-01T01:02:03Z\""
+ + " }"
+ + "}",
+ res.get(2).getJsonData());
+ assertTrue(res.get(2) instanceof ImportableFileData);
+ assertTrue(
+ ((ImportableFileData<GenericPayload>) res.get(2)).getFile().isInTempStore());
+ assertEquals(
+ "bartxt",
+ ((ImportableFileData<GenericPayload>) res.get(2)).getFile().getFetchableUrl());
+ }
+}
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/CalendarSerializerTest.java b/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/CalendarSerializerTest.java
new file mode 100644
index 000000000..558695002
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/CalendarSerializerTest.java
@@ -0,0 +1,91 @@
+package org.datatransferproject.datatransfer.generic;
+
+import static org.junit.Assert.assertEquals;
+
+import java.time.Instant;
+import java.time.OffsetDateTime;
+import java.time.ZoneOffset;
+import java.util.Arrays;
+import java.util.List;
+import org.datatransferproject.types.common.models.calendar.CalendarAttendeeModel;
+import org.datatransferproject.types.common.models.calendar.CalendarContainerResource;
+import org.datatransferproject.types.common.models.calendar.CalendarEventModel;
+import org.datatransferproject.types.common.models.calendar.CalendarModel;
+import org.junit.Test;
+
+public class CalendarSerializerTest extends GenericImportSerializerTestBase {
+ @Test
+ public void testCalendarSerializer() throws Exception {
+ CalendarContainerResource container =
+ new CalendarContainerResource(
+ Arrays.asList(new CalendarModel("calendar123", "Calendar 123", "Calendar description")),
+ Arrays.asList(
+ new CalendarEventModel(
+ "calendar123",
+ "Event 1",
+ "Event notes",
+ Arrays.asList(
+ new CalendarAttendeeModel("attendee1", "attendee1@example.com", false),
+ new CalendarAttendeeModel("attendee2", "attendee2@example.com", true)),
+ "Event Place",
+ new CalendarEventModel.CalendarEventTime(
+ OffsetDateTime.ofInstant(Instant.ofEpochSecond(1732713392), ZoneOffset.UTC),
+ false),
+ new CalendarEventModel.CalendarEventTime(
+ OffsetDateTime.ofInstant(
+ Instant.ofEpochSecond(1732713392 + 60 * 60 * 2), ZoneOffset.UTC),
+ false),
+ null)));
+
+ List<ImportableData<GenericPayload>> res =
+ iterableToList(CalendarSerializer.serialize(container));
+
+ assertEquals(2, res.size());
+
+ assertJsonEquals(
+ ""
+ + "{"
+ + " \"@type\": \"Calendar\","
+ + " \"id\": \"calendar123\","
+ + " \"name\": \"Calendar 123\","
+ + " \"description\": \"Calendar description\""
+ + "}",
+ res.get(0).getJsonData());
+
+ assertJsonEquals(
+ ""
+ + "{"
+ + " \"@type\": \"CalendarEvent\","
+ + " \"calendarId\": \"calendar123\","
+ + " \"title\": \"Event 1\","
+ + " \"notes\": \"Event notes\","
+ + " \"attendees\": ["
+ + " {"
+ + " \"@type\": \"CalendarAttendeeModel\","
+ + " \"displayName\": \"attendee1\","
+ + " \"email\": \"attendee1@example.com\","
+ + " \"optional\": false"
+ + " },"
+ + " {"
+ + " \"@type\": \"CalendarAttendeeModel\","
+ + " \"displayName\": \"attendee2\","
+ + " \"email\": \"attendee2@example.com\","
+ + " \"optional\": true"
+ + " }"
+ + " ],"
+ + " \"location\": \"Event Place\","
+ + " \"startTime\": {"
+ + " \"@type\": \"CalendarEventModel$CalendarEventTime\","
+ + " \"dateTime\": \"2024-11-27T13:16:32Z\","
+ + " \"dateOnly\": false"
+ + " },"
+ + " \"endTime\": {"
+ + " \"@type\": \"CalendarEventModel$CalendarEventTime\","
+ + " \"dateTime\": \"2024-11-27T15:16:32Z\","
+ + " \"dateOnly\": false"
+ + " },"
+ + " \"recurrenceRule\": null"
+ + "}",
+ res.get(1).getJsonData());
+ }
+}
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/GenericFileImporterTest.java b/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/GenericFileImporterTest.java
new file mode 100644
index 000000000..d60af8bab
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/GenericFileImporterTest.java
@@ -0,0 +1,177 @@
+package org.datatransferproject.datatransfer.generic;
+
+import static java.lang.String.format;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertFalse;
+import static org.junit.Assert.assertTrue;
+
+import java.io.ByteArrayInputStream;
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.nio.charset.StandardCharsets;
+import java.util.Arrays;
+import java.util.UUID;
+import okhttp3.Headers;
+import okhttp3.mockwebserver.MockResponse;
+import okhttp3.mockwebserver.MockWebServer;
+import okhttp3.mockwebserver.RecordedRequest;
+import org.apache.commons.fileupload.FileUploadBase.FileUploadIOException;
+import org.apache.commons.fileupload.MultipartStream;
+import org.apache.commons.fileupload.MultipartStream.MalformedStreamException;
+import org.datatransferproject.api.launcher.Monitor;
+import org.datatransferproject.spi.cloud.storage.TemporaryPerJobDataStore;
+import org.datatransferproject.spi.transfer.idempotentexecutor.InMemoryIdempotentImportExecutor;
+import org.datatransferproject.types.common.models.IdOnlyContainerResource;
+import org.datatransferproject.types.transfer.auth.AppCredentials;
+import org.datatransferproject.types.transfer.auth.TokensAndUrlAuthData;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+
+public class GenericFileImporterTest {
+ private MockWebServer webServer;
+ private Monitor monitor = new Monitor() {};
+ private TemporaryPerJobDataStore dataStore =
+ new TemporaryPerJobDataStore() {
+ @Override
+ public InputStreamWrapper getStream(UUID jobId, String key) throws IOException {
+ return new InputStreamWrapper(new ByteArrayInputStream("Hello world".getBytes()));
+ }
+ };
+
+ @Before
+ public void setup() throws IOException {
+ webServer = new MockWebServer();
+ webServer.start();
+ }
+
+ @After
+ public void teardown() throws IOException {
+ webServer.shutdown();
+ }
+
+ MultipartStream getMultipartStream(RecordedRequest request) {
+ assertTrue(
+ format("Invalid Content-Type '%s'", request.getHeader("Content-Type")),
+ request.getHeader("Content-Type").startsWith("multipart/related"));
+ String boundaryString = request.getHeader("Content-Type").split(";", 2)[1].strip();
+ assertTrue(
+ format("Invalid boundary string '%s'", boundaryString),
+ boundaryString.startsWith("boundary="));
+ String boundary = boundaryString.split("=", 2)[1];
+
+ return new MultipartStream(
+ request.getBody().inputStream(),
+ boundary.getBytes(),
+ Integer.parseInt(request.getHeader("Content-Length")),
+ null);
+ }
+
+ Headers readPartHeaders(MultipartStream stream)
+ throws FileUploadIOException, MalformedStreamException {
+ Headers.Builder builder = new Headers.Builder();
+ stream.readHeaders().strip().lines().forEach(line -> builder.add(line));
+ return builder.build();
+ }
+
+ String readPartBody(MultipartStream stream) throws MalformedStreamException, IOException {
+ ByteArrayOutputStream os = new ByteArrayOutputStream();
+ stream.readBodyData(os);
+ return new String(os.toByteArray(), StandardCharsets.UTF_8);
+ }
+
+ @Test
+ public void testGenericFileImporter() throws Exception {
+ GenericFileImporter importer =
+ new GenericFileImporter<>(
+ container ->
+ Arrays.asList(
+ new ImportableFileData<>(
+ new CachedDownloadableItem(container.getId(), container.getId()),
+ "video/mp4",
+ new GenericPayload<>(container.getId(), "schemasource"),
+ container.getId(),
+ container.getId())),
+ new AppCredentials("key", "secret"),
+ webServer.url("/id").url(),
+ dataStore,
+ monitor);
+ InMemoryIdempotentImportExecutor executor = new InMemoryIdempotentImportExecutor(monitor);
+ webServer.enqueue(new MockResponse().setResponseCode(201).setBody("OK"));
+
+ importer.importItem(
+ UUID.randomUUID(),
+ executor,
+ new TokensAndUrlAuthData(
+ "accessToken", "refreshToken", webServer.url("/refresh").toString()),
+ new IdOnlyContainerResource("id"));
+
+ assertEquals(1, webServer.getRequestCount());
+
+ RecordedRequest request = webServer.takeRequest();
+ MultipartStream stream = getMultipartStream(request);
+
+ assertTrue("Missing JSON part", stream.skipPreamble());
+ assertEquals("application/json", readPartHeaders(stream).get("Content-Type"));
+ assertEquals(
+ "{\"@type\":\"GenericPayload\",\"payload\":\"id\",\"schemaSource\":\"schemasource\",\"apiVersion\":\"0.1.0\"}",
+ readPartBody(stream));
+
+ assertTrue("Missing file part", stream.readBoundary());
+ assertEquals("video/mp4", readPartHeaders(stream).get("Content-Type"));
+ assertEquals("Hello world", readPartBody(stream));
+
+ assertFalse("Unexpected extra data", stream.readBoundary());
+ }
+
+ @Test
+ public void testGenericFileImporterMixedTypes() throws Exception {
+ GenericFileImporter importer =
+ new GenericFileImporter<>(
+ container ->
+ Arrays.asList(
+ new ImportableFileData<>(
+ new CachedDownloadableItem("file", "file"),
+ "invalid_mimetype",
+ new GenericPayload<>("file", "schemasource"),
+ "file",
+ "file"),
+ new ImportableData<>(
+ new GenericPayload<>("not file", "schemasource"), "not file", "not file")),
+ new AppCredentials("key", "secret"),
+ webServer.url("/id").url(),
+ dataStore,
+ monitor);
+ InMemoryIdempotentImportExecutor executor = new InMemoryIdempotentImportExecutor(monitor);
+ webServer.enqueue(new MockResponse().setResponseCode(201).setBody("OK"));
+ webServer.enqueue(new MockResponse().setResponseCode(201).setBody("OK"));
+
+ importer.importItem(
+ UUID.randomUUID(),
+ executor,
+ new TokensAndUrlAuthData(
+ "accessToken", "refreshToken", webServer.url("/refresh").toString()),
+ new IdOnlyContainerResource("id"));
+
+ assertEquals(2, webServer.getRequestCount());
+
+ RecordedRequest fileRequest = webServer.takeRequest();
+ MultipartStream stream = getMultipartStream(fileRequest);
+ assertTrue("Missing JSON part", stream.skipPreamble());
+ assertEquals("application/json", readPartHeaders(stream).get("Content-Type"));
+ assertEquals(
+ "{\"@type\":\"GenericPayload\",\"payload\":\"file\",\"schemaSource\":\"schemasource\",\"apiVersion\":\"0.1.0\"}",
+ readPartBody(stream));
+ assertTrue("Missing file part", stream.readBoundary());
+ assertEquals("application/octet-stream", readPartHeaders(stream).get("Content-Type"));
+ assertEquals("Hello world", readPartBody(stream));
+ assertFalse("Unexpected extra data", stream.readBoundary());
+
+ RecordedRequest jsonRequest = webServer.takeRequest();
+ assertEquals("application/json", jsonRequest.getHeader("Content-Type"));
+ assertEquals(
+ "{\"@type\":\"GenericPayload\",\"payload\":\"not"
+ + " file\",\"schemaSource\":\"schemasource\",\"apiVersion\":\"0.1.0\"}",
+ jsonRequest.getBody().readUtf8());
+ }
+}
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/GenericImportSerializerTestBase.java b/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/GenericImportSerializerTestBase.java
new file mode 100644
index 000000000..041bb2740
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/GenericImportSerializerTestBase.java
@@ -0,0 +1,33 @@
+package org.datatransferproject.datatransfer.generic;
+
+import static org.junit.Assert.assertNotNull;
+import static org.junit.Assert.assertEquals;
+
+import com.fasterxml.jackson.databind.ObjectMapper;
+import java.util.List;
+import java.util.stream.Collectors;
+import java.util.stream.StreamSupport;
+import org.junit.BeforeClass;
+
+class GenericImportSerializerTestBase {
+ static ObjectMapper objectMapper = new ObjectMapper();
+
+ @BeforeClass
+ public static void onlyOnce() {
+ GenericImporter.configureObjectMapper(objectMapper);
+ }
+
+ void assertJsonEquals(String expectedPayload, GenericPayload actualWrapperPayload)
+ throws Exception {
+ assertNotNull(actualWrapperPayload.getApiVersion());
+ assertNotNull(actualWrapperPayload.getSchemaSource());
+ assertEquals(
+ objectMapper.readTree(expectedPayload),
+ // Wrap/unwrap to compare just what gets serialized, which is the most important thing
+ objectMapper.readTree(objectMapper.writeValueAsString(actualWrapperPayload.getPayload())));
+ }
+
+ static <T> List<T> iterableToList(Iterable<T> iterable) {
+ return StreamSupport.stream(iterable.spliterator(), false).collect(Collectors.toList());
+ }
+}
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/GenericImporterTest.java b/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/GenericImporterTest.java
new file mode 100644
index 000000000..2a053491c
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/GenericImporterTest.java
@@ -0,0 +1,320 @@
+package org.datatransferproject.datatransfer.generic;
+
+import static java.lang.String.format;
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertTrue;
+
+import java.io.IOException;
+import java.nio.charset.StandardCharsets;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.UUID;
+import okhttp3.mockwebserver.Dispatcher;
+import okhttp3.mockwebserver.MockResponse;
+import okhttp3.mockwebserver.MockWebServer;
+import okhttp3.mockwebserver.RecordedRequest;
+import org.datatransferproject.api.launcher.Monitor;
+import org.datatransferproject.spi.cloud.storage.TemporaryPerJobDataStore;
+import org.datatransferproject.spi.transfer.idempotentexecutor.InMemoryIdempotentImportExecutor;
+import org.datatransferproject.types.common.models.IdOnlyContainerResource;
+import org.datatransferproject.types.transfer.auth.AppCredentials;
+import org.datatransferproject.types.transfer.auth.TokensAndUrlAuthData;
+import org.datatransferproject.types.transfer.errors.ErrorDetail;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
+import org.junit.runners.Parameterized.Parameter;
+import org.junit.runners.Parameterized.Parameters;
+
+@RunWith(Parameterized.class)
+public class GenericImporterTest {
+ private MockWebServer webServer;
+ private Monitor monitor = new Monitor() {};
+ private TemporaryPerJobDataStore dataStore = new TemporaryPerJobDataStore() {};
+
+ @Parameters(name = "{0}")
+ public static Collection<String> strings() {
+ return Arrays.asList(GenericImporter.class.getName(), GenericFileImporter.class.getName());
+ }
+
+ @Parameter public String importerClass;
+
+ @Before
+ public void setup() throws IOException {
+ webServer = new MockWebServer();
+ webServer.start();
+ }
+
+ @After
+ public void teardown() throws IOException {
+ webServer.shutdown();
+ }
+
+ static void assertContains(String expected, String actual) {
+ assertTrue(
+ format("Missing substring [%s] from [%s]", expected, actual), actual.contains(expected));
+ }
+
+ GenericImporter getImporter(
+ String cls, ContainerSerializer containerSerializer) {
+ if (cls.equals(GenericFileImporter.class.getName())) {
+ return new GenericFileImporter<>(
+ containerSerializer,
+ new AppCredentials("key", "secret"),
+ webServer.url("/id").url(),
+ dataStore,
+ monitor);
+ } else {
+ return new GenericImporter<>(
+ containerSerializer,
+ new AppCredentials("key", "secret"),
+ webServer.url("/id").url(),
+ monitor);
+ }
+ }
+
+ Dispatcher getDispatcher() {
+ return new Dispatcher() {
+ @Override
+ public MockResponse dispatch(RecordedRequest request) {
+ switch (request.getPath()) {
+ case "/id":
+ if (request.getHeader("Authorization").equals("Bearer invalidToken")) {
+ return new MockResponse()
+ .setResponseCode(401)
+ .setBody("{\"error\":\"invalid_token\"}");
+ }
+ return new MockResponse().setResponseCode(201).setBody("OK");
+ case "/refresh":
+ return new MockResponse()
+ .setResponseCode(200)
+ .setBody("{\"access_token\":\"newAccessToken\",\"token_type\":\"Bearer\"}");
+ default:
+ return new MockResponse().setResponseCode(404);
+ }
+ }
+ };
+ }
+
+ @Test
+ public void testGenericImporter() throws Exception {
+ InMemoryIdempotentImportExecutor executor = new InMemoryIdempotentImportExecutor(monitor);
+ GenericImporter importer =
+ getImporter(
+ importerClass,
+ container ->
+ Arrays.asList(
+ new ImportableData<>(
+ new GenericPayload<>(container.getId(), "schemasource"),
+ container.getId(),
+ container.getId())));
+ webServer.setDispatcher(getDispatcher());
+
+ importer.importItem(
+ UUID.randomUUID(),
+ executor,
+ new TokensAndUrlAuthData(
+ "accessToken", "refreshToken", webServer.url("/refresh").toString()),
+ new IdOnlyContainerResource("id"));
+
+ assertEquals(1, webServer.getRequestCount());
+ RecordedRequest request = webServer.takeRequest();
+ assertEquals("POST", request.getMethod());
+ assertEquals("Bearer accessToken", request.getHeader("Authorization"));
+ assertEquals(
+ "{\"@type\":\"GenericPayload\",\"payload\":\"id\",\"schemaSource\":\"schemasource\",\"apiVersion\":\"0.1.0\"}",
+ new String(request.getBody().readByteArray(), StandardCharsets.UTF_8));
+ assertTrue(executor.getErrors().isEmpty());
+ }
+
+ @Test
+ public void testGenericImporterMultipleItems() throws Exception {
+ GenericImporter importer =
+ getImporter(
+ importerClass,
+ container ->
+ Arrays.asList(
+ new ImportableData<>(new GenericPayload<>(1, "schemasource"), "id1", "id1"),
+ new ImportableData<>(new GenericPayload<>(2, "schemasource"), "id2", "id2")));
+ webServer.setDispatcher(getDispatcher());
+
+ importer.importItem(
+ UUID.randomUUID(),
+ new InMemoryIdempotentImportExecutor(monitor),
+ new TokensAndUrlAuthData(
+ "accessToken", "refreshToken", webServer.url("/refresh").toString()),
+ new IdOnlyContainerResource("id"));
+
+ assertEquals(2, webServer.getRequestCount());
+
+ RecordedRequest request1 = webServer.takeRequest();
+ assertEquals("POST", request1.getMethod());
+ assertEquals("Bearer accessToken", request1.getHeader("Authorization"));
+ assertEquals(
+ "{\"@type\":\"GenericPayload\",\"payload\":1,\"schemaSource\":\"schemasource\",\"apiVersion\":\"0.1.0\"}",
+ new String(request1.getBody().readByteArray(), StandardCharsets.UTF_8));
+
+ RecordedRequest request2 = webServer.takeRequest();
+ assertEquals("POST", request2.getMethod());
+ assertEquals("Bearer accessToken", request2.getHeader("Authorization"));
+ assertEquals(
+ "{\"@type\":\"GenericPayload\",\"payload\":2,\"schemaSource\":\"schemasource\",\"apiVersion\":\"0.1.0\"}",
+ new String(request2.getBody().readByteArray(), StandardCharsets.UTF_8));
+ }
+
+ @Test
+ public void testGenericImporterMultipleItemsWithSameID() throws Exception {
+ GenericImporter importer =
+ getImporter(
+ importerClass,
+ container ->
+ Arrays.asList(
+ new ImportableData<>(new GenericPayload<>(1, "schemasource"), "id1", "id1"),
+ new ImportableData<>(new GenericPayload<>(1, "schemasource"), "id1", "id1")));
+ webServer.setDispatcher(getDispatcher());
+
+ importer.importItem(
+ UUID.randomUUID(),
+ new InMemoryIdempotentImportExecutor(monitor),
+ new TokensAndUrlAuthData(
+ "accessToken", "refreshToken", webServer.url("/refresh").toString()),
+ new IdOnlyContainerResource("id"));
+
+ assertEquals(1, webServer.getRequestCount());
+ RecordedRequest request = webServer.takeRequest();
+ assertEquals("POST", request.getMethod());
+ assertEquals("Bearer accessToken", request.getHeader("Authorization"));
+ assertEquals(
+ "{\"@type\":\"GenericPayload\",\"payload\":1,\"schemaSource\":\"schemasource\",\"apiVersion\":\"0.1.0\"}",
+ new String(request.getBody().readByteArray(), StandardCharsets.UTF_8));
+ }
+
+ @Test
+ public void testGenericImporterTokenRefresh() throws Exception {
+ GenericImporter importer =
+ getImporter(
+ importerClass,
+ container ->
+ Arrays.asList(
+ new ImportableData<>(
+ new GenericPayload<>(container.getId(), "schemasource"),
+ container.getId(),
+ container.getId())));
+ webServer.setDispatcher(getDispatcher());
+
+ importer.importItem(
+ UUID.randomUUID(),
+ new InMemoryIdempotentImportExecutor(monitor),
+ new TokensAndUrlAuthData(
+ "invalidToken", "refreshToken", webServer.url("/refresh").toString()),
+ new IdOnlyContainerResource("id"));
+
+ assertEquals(3, webServer.getRequestCount());
+
+ RecordedRequest request1 = webServer.takeRequest();
+ assertEquals("POST", request1.getMethod());
+ assertEquals("Bearer invalidToken", request1.getHeader("Authorization"));
+ assertEquals(
+ "{\"@type\":\"GenericPayload\",\"payload\":\"id\",\"schemaSource\":\"schemasource\",\"apiVersion\":\"0.1.0\"}",
+ new String(request1.getBody().readByteArray(), StandardCharsets.UTF_8));
+
+ RecordedRequest refreshRequest = webServer.takeRequest();
+ assertEquals("POST", refreshRequest.getMethod());
+ assertEquals(
+ "grant_type=refresh_token&client_id=key&client_secret=secret&refresh_token=refreshToken",
+ new String(refreshRequest.getBody().readByteArray(), StandardCharsets.UTF_8));
+
+ RecordedRequest request2 = webServer.takeRequest();
+ assertEquals("POST", request2.getMethod());
+ assertEquals("Bearer newAccessToken", request2.getHeader("Authorization"));
+ assertEquals(
+ "{\"@type\":\"GenericPayload\",\"payload\":\"id\",\"schemaSource\":\"schemasource\",\"apiVersion\":\"0.1.0\"}",
+ new String(request2.getBody().readByteArray(), StandardCharsets.UTF_8));
+ }
+
+ @Test
+ public void testGenericImporterBadRequest() throws Exception {
+ InMemoryIdempotentImportExecutor executor = new InMemoryIdempotentImportExecutor(monitor);
+ GenericImporter importer =
+ getImporter(
+ importerClass,
+ container ->
+ Arrays.asList(
+ new ImportableData<>(
+ new GenericPayload<>(container.getId(), "schemasource"),
+ container.getId(),
+ container.getId())));
+ webServer.enqueue(
+ new MockResponse().setResponseCode(400).setBody("{\"error\":\"bad_request\"}"));
+
+ importer.importItem(
+ UUID.randomUUID(),
+ executor,
+ new TokensAndUrlAuthData(
+ "accessToken", "refreshToken", webServer.url("/refresh").toString()),
+ new IdOnlyContainerResource("itemId"));
+
+ Collection<ErrorDetail> errors = executor.getErrors();
+ assertEquals(1, errors.size());
+ ErrorDetail error = errors.iterator().next();
+ assertEquals("itemId", error.title());
+ assertContains("(400) bad_request", error.exception());
+ }
+
+ @Test
+ public void testGenericImporterUnexpectedResponse() throws Exception {
+ InMemoryIdempotentImportExecutor executor = new InMemoryIdempotentImportExecutor(monitor);
+ GenericImporter importer =
+ getImporter(
+ importerClass,
+ container ->
+ Arrays.asList(
+ new ImportableData<>(
+ new GenericPayload<>(container.getId(), "schemasource"),
+ container.getId(),
+ container.getId())));
+ webServer.enqueue(new MockResponse().setResponseCode(400).setBody("notjson"));
+
+ importer.importItem(
+ UUID.randomUUID(),
+ executor,
+ new TokensAndUrlAuthData(
+ "accessToken", "refreshToken", webServer.url("/refresh").toString()),
+ new IdOnlyContainerResource("itemId"));
+
+ Collection<ErrorDetail> errors = executor.getErrors();
+ assertEquals(1, errors.size());
+ ErrorDetail error = errors.iterator().next();
+ assertEquals("itemId", error.title());
+ assertContains("Unexpected response (400) 'notjson'", error.exception());
+ }
+
+ @Test
+ public void testGenericImporterUnexpectedResponseCode() throws Exception {
+ InMemoryIdempotentImportExecutor executor = new InMemoryIdempotentImportExecutor(monitor);
+ GenericImporter importer =
+ getImporter(
+ importerClass,
+ container ->
+ Arrays.asList(
+ new ImportableData<>(
+ new GenericPayload<>(container.getId(), "schemasource"),
+ container.getId(),
+ container.getId())));
+ webServer.enqueue(new MockResponse().setResponseCode(111));
+
+ importer.importItem(
+ UUID.randomUUID(),
+ executor,
+ new TokensAndUrlAuthData(
+ "accessToken", "refreshToken", webServer.url("/refresh").toString()),
+ new IdOnlyContainerResource("itemId"));
+
+ Collection<ErrorDetail> errors = executor.getErrors();
+ assertEquals(1, errors.size());
+ ErrorDetail error = errors.iterator().next();
+ assertEquals("itemId", error.title());
+ assertContains("Unexpected response code (111)", error.exception());
+ }
+}
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/MediaSerializerTest.java b/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/MediaSerializerTest.java
new file mode 100644
index 000000000..ccd87c1aa
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/MediaSerializerTest.java
@@ -0,0 +1,99 @@
+package org.datatransferproject.datatransfer.generic;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertFalse;
+import static org.junit.Assert.assertTrue;
+
+import java.time.Instant;
+import java.util.Arrays;
+import java.util.Date;
+import java.util.List;
+import org.datatransferproject.types.common.models.media.MediaAlbum;
+import org.datatransferproject.types.common.models.media.MediaContainerResource;
+import org.datatransferproject.types.common.models.photos.PhotoModel;
+import org.datatransferproject.types.common.models.videos.VideoModel;
+import org.junit.Test;
+
+public class MediaSerializerTest extends GenericImportSerializerTestBase {
+ @Test
+ public void testMediaSerializer() throws Exception {
+ MediaContainerResource container =
+ new MediaContainerResource(
+ Arrays.asList(new MediaAlbum("album123", "Album 123", "Album description")),
+ Arrays.asList(
+ new PhotoModel(
+ "bar.jpeg",
+ "https://example.com/bar.jpg",
+ "Bar description",
+ "image/jpeg",
+ "idempotentVal1",
+ "album123",
+ false,
+ null,
+ Date.from(Instant.ofEpochSecond(1732713392)))),
+ Arrays.asList(
+ new VideoModel(
+ "foo.mp4",
+ "cachedVal1",
+ "Foo description",
+ "video/mp4",
+ "cachedVal1",
+ "album123",
+ true,
+ Date.from(Instant.ofEpochSecond(1732713392)))));
+
+ List<? extends ImportableData<?>> res =
+ iterableToList(MediaSerializer.serialize(container));
+ assertEquals(3, res.size());
+
+ assertJsonEquals(
+ ""
+ + "{"
+ + " \"@type\": \"Album\","
+ + " \"id\": \"album123\","
+ + " \"name\": \"Album 123\","
+ + " \"description\": \"Album description\""
+ + "}",
+ res.get(0).getJsonData());
+
+ assertJsonEquals(
+ ""
+ + "{"
+ + " \"@type\": \"Video\","
+ + " \"name\": \"foo.mp4\","
+ + " \"description\": \"Foo description\","
+ + " \"albumId\": \"album123\","
+ + " \"uploadedTime\": \"2024-11-27T13:16:32Z\","
+ + " \"favoriteInfo\": {"
+ + " \"@type\": \"FavoriteInfo\","
+ + " \"lastUpdateTime\": \"2024-11-27T13:16:32Z\","
+ + " \"favorite\": false"
+ + " }"
+ + "}",
+ res.get(1).getJsonData());
+ assertTrue(res.get(1) instanceof ImportableFileData);
+ assertTrue(((ImportableFileData<?>) res.get(1)).getFile().isInTempStore());
+ assertEquals("cachedVal1", ((ImportableFileData<?>) res.get(1)).getFile().getFetchableUrl());
+
+ assertJsonEquals(
+ ""
+ + "{"
+ + " \"@type\": \"Photo\","
+ + " \"name\": \"bar.jpeg\","
+ + " \"description\": \"Bar description\","
+ + " \"albumId\": \"album123\","
+ + " \"uploadedTime\": \"2024-11-27T13:16:32Z\","
+ + " \"favoriteInfo\": {"
+ + " \"@type\": \"FavoriteInfo\","
+ + " \"lastUpdateTime\": \"2024-11-27T13:16:32Z\","
+ + " \"favorite\": false"
+ + " }"
+ + "}",
+ res.get(2).getJsonData());
+ assertTrue(res.get(2) instanceof ImportableFileData);
+ assertFalse(((ImportableFileData<?>) res.get(2)).getFile().isInTempStore());
+ assertEquals(
+ "https://example.com/bar.jpg",
+ ((ImportableFileData<?>) res.get(2)).getFile().getFetchableUrl());
+ }
+}
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/SocialPostsSerializerTest.java b/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/SocialPostsSerializerTest.java
new file mode 100644
index 000000000..236a8d5de
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/SocialPostsSerializerTest.java
@@ -0,0 +1,81 @@
+package org.datatransferproject.datatransfer.generic;
+
+import static org.junit.Assert.assertEquals;
+
+import java.time.Instant;
+import java.util.Arrays;
+import java.util.List;
+import org.datatransferproject.types.common.models.social.SocialActivityActor;
+import org.datatransferproject.types.common.models.social.SocialActivityAttachment;
+import org.datatransferproject.types.common.models.social.SocialActivityAttachmentType;
+import org.datatransferproject.types.common.models.social.SocialActivityContainerResource;
+import org.datatransferproject.types.common.models.social.SocialActivityLocation;
+import org.datatransferproject.types.common.models.social.SocialActivityModel;
+import org.datatransferproject.types.common.models.social.SocialActivityType;
+import org.junit.Test;
+
+public class SocialPostsSerializerTest extends GenericImportSerializerTestBase {
+ @Test
+ public void testSocialPostsSerializer() throws Exception {
+ SocialActivityContainerResource container =
+ new SocialActivityContainerResource(
+ "123",
+ new SocialActivityActor("321", "Steve", null),
+ Arrays.asList(
+ new SocialActivityModel(
+ "456",
+ Instant.ofEpochSecond(1732713392),
+ SocialActivityType.NOTE,
+ Arrays.asList(
+ new SocialActivityAttachment(
+ SocialActivityAttachmentType.IMAGE, "foo.com", "Foo", null)),
+ new SocialActivityLocation("foo", 10, 10),
+ "Hello world!",
+ "Hi there",
+ null)));
+
+ List<? extends ImportableData<?>> res =
+ iterableToList(SocialPostsSerializer.serialize(container));
+
+ assertEquals(1, res.size());
+ assertJsonEquals(
+ ""
+ + "{"
+ + " \"@type\": \"SocialActivity\","
+ + " \"metadata\": {"
+ + " \"@type\": \"SocialActivityMetadata\","
+ + " \"actor\": {"
+ + " \"@type\": \"SocialActivityActor\","
+ + " \"id\": \"321\","
+ + " \"name\": \"Steve\","
+ + " \"url\": null"
+ + " }"
+ + " },"
+ + " \"activity\": {"
+ + " \"@type\": \"SocialActivityModel\","
+ + " \"id\": \"456\","
+ + " \"published\": \"2024-11-27T13:16:32Z\","
+ + " \"type\": \"NOTE\","
+ + " \"attachments\": ["
+ + " {"
+ + " \"@type\": \"SocialActivityAttachment\","
+ + " \"type\": \"IMAGE\","
+ + " \"url\": \"foo.com\","
+ + " \"name\": \"Foo\","
+ + " \"content\": null"
+ + " }"
+ + " ],"
+ + " \"location\": {"
+ + " \"@type\": \"SocialActivityLocation\","
+ + " \"name\": \"foo\","
+ + " \"longitude\": 10.0,"
+ + " \"latitude\": 10.0"
+ + " },"
+ + " \"title\": \"Hello world!\","
+ + " \"content\": \"Hi there\","
+ + " \"url\": null"
+ + " }"
+ + "}",
+ res.get(0).getJsonData());
+ }
+}
diff --git a/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/auth/OAuthTokenManagerTest.java b/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/auth/OAuthTokenManagerTest.java
new file mode 100644
index 000000000..cbeb87788
--- /dev/null
+++ b/extensions/data-transfer/portability-data-transfer-generic/src/test/java/org/datatransferproject/datatransfer/generic/auth/OAuthTokenManagerTest.java
@@ -0,0 +1,247 @@
+package org.datatransferproject.datatransfer.generic.auth;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertThrows;
+
+import java.io.IOException;
+import java.nio.charset.StandardCharsets;
+import okhttp3.OkHttpClient;
+import okhttp3.mockwebserver.MockResponse;
+import okhttp3.mockwebserver.MockWebServer;
+import okhttp3.mockwebserver.RecordedRequest;
+import org.datatransferproject.api.launcher.Monitor;
+import org.datatransferproject.spi.transfer.types.InvalidTokenException;
+import org.datatransferproject.types.transfer.auth.AppCredentials;
+import org.datatransferproject.types.transfer.auth.TokensAndUrlAuthData;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+
+public class OAuthTokenManagerTest {
+ private MockWebServer webServer;
+ private AppCredentials appCredentials = new AppCredentials("appKey", "appSecret");
+ private Monitor monitor = new Monitor() {};
+
+ @Before
+ public void setup() throws IOException {
+ webServer = new MockWebServer();
+ webServer.start();
+ }
+
+ @After
+ public void teardown() throws IOException {
+ webServer.shutdown();
+ }
+
+ private TokensAndUrlAuthData getInitialAuthData() {
+ return new TokensAndUrlAuthData(
+ "initialAccessToken", "refreshToken", webServer.url("/refresh").toString());
+ }
+
+ @Test
+ public void testWithAuthDataNoRefresh() throws Exception {
+ TokensAndUrlAuthData initialAuthData = getInitialAuthData();
+ OAuthTokenManager tokenManager =
+ new OAuthTokenManager(initialAuthData, appCredentials, new OkHttpClient(), monitor);
+
+ TokensAndUrlAuthData usedAuthData = tokenManager.withAuthData(authData -> authData);
+ assertEquals(initialAuthData.getToken(), usedAuthData.getToken());
+ }
+
+ @Test
+ public void testWithAuthDataWithRefresh() throws Exception {
+ TokensAndUrlAuthData initialAuthData = getInitialAuthData();
+ OAuthTokenManager tokenManager =
+ new OAuthTokenManager(initialAuthData, appCredentials, new OkHttpClient(), monitor);
+ webServer.enqueue(
+ new MockResponse()
+ .setResponseCode(200)
+ .setBody(
+ ""
+ + "{"
+ + " \"access_token\": \"newAccessToken\","
+ + " \"token_type\": \"Bearer\""
+ + "}"));
+
+ TokensAndUrlAuthData usedAuthData =
+ tokenManager.withAuthData(
+ authData -> {
+ if (authData.equals(initialAuthData)) {
+ throw new InvalidTokenException("Token expired", null);
+ }
+ return authData;
+ });
+
+ assertEquals("newAccessToken", usedAuthData.getToken());
+ assertEquals("refreshToken", usedAuthData.getRefreshToken());
+ assertEquals(1, webServer.getRequestCount());
+ RecordedRequest request = webServer.takeRequest();
+ assertEquals(
+ "grant_type=refresh_token&client_id=appKey&client_secret=appSecret&refresh_token=refreshToken",
+ new String(request.getBody().readByteArray(), StandardCharsets.UTF_8));
+ }
+
+ @Test
+ public void testWithAuthDataWithRefreshedRefreshToken() throws Exception {
+ TokensAndUrlAuthData initialAuthData = getInitialAuthData();
+ OAuthTokenManager tokenManager =
+ new OAuthTokenManager(initialAuthData, appCredentials, new OkHttpClient(), monitor);
+ webServer.enqueue(
+ new MockResponse()
+ .setResponseCode(200)
+ .setBody(
+ ""
+ + "{"
+ + " \"access_token\": \"newAccessToken\","
+ + " \"refresh_token\": \"newRefreshToken\","
+ + " \"token_type\": \"Bearer\""
+ + "}"));
+
+ TokensAndUrlAuthData usedAuthData =
+ tokenManager.withAuthData(
+ authData -> {
+ if (authData.equals(initialAuthData)) {
+ throw new InvalidTokenException("Token expired", null);
+ }
+ return authData;
+ });
+
+ assertEquals("newRefreshToken", usedAuthData.getRefreshToken());
+ }
+
+ @Test
+ public void testWithAuthDataWithRefreshFailure() throws Exception {
+ TokensAndUrlAuthData initialAuthData = getInitialAuthData();
+ OAuthTokenManager tokenManager =
+ new OAuthTokenManager(initialAuthData, appCredentials, new OkHttpClient(), monitor);
+ webServer.enqueue(
+ new MockResponse().setResponseCode(400).setBody("{\"error\": \"invalid_token\"}"));
+
+ assertThrows(
+ "invalid_token",
+ IOException.class,
+ () ->
+ tokenManager.withAuthData(
+ authData -> {
+ if (authData.equals(initialAuthData)) {
+ throw new InvalidTokenException("Token expired", null);
+ }
+ return authData;
+ }));
+ }
+
+ @Test
+ public void testWithAuthDataWithRefreshSuccessWithUnexpectedResponse() throws Exception {
+ TokensAndUrlAuthData initialAuthData = getInitialAuthData();
+ OAuthTokenManager tokenManager =
+ new OAuthTokenManager(initialAuthData, appCredentials, new OkHttpClient(), monitor);
+ webServer.enqueue(new MockResponse().setResponseCode(200).setBody("invalidresponsebody"));
+
+ assertThrows(
+ "invalidresponsebody",
+ IOException.class,
+ () ->
+ tokenManager.withAuthData(
+ authData -> {
+ if (authData.equals(initialAuthData)) {
+ throw new InvalidTokenException("Token expired", null);
+ }
+ return authData;
+ }));
+ }
+
+ @Test
+ public void testWithAuthDataWithRefreshNewTokenStillInvalid() throws Exception {
+ TokensAndUrlAuthData initialAuthData = getInitialAuthData();
+ OAuthTokenManager tokenManager =
+ new OAuthTokenManager(initialAuthData, appCredentials, new OkHttpClient(), monitor);
+ webServer.enqueue(
+ new MockResponse()
+ .setResponseCode(200)
+ .setBody(
+ ""
+ + "{"
+ + " \"access_token\": \"newAccessToken\","
+ + " \"token_type\": \"Bearer\""
+ + "}"));
+
+ assertThrows(
+ "Token still expired after refresh",
+ InvalidTokenException.class,
+ () ->
+ tokenManager.withAuthData(
+ authData -> {
+ throw new InvalidTokenException("Token expired", null);
+ }));
+
+ // Only try refreshing once
+ assertEquals(1, webServer.getRequestCount());
+ }
+
+ @Test
+ public void testWithAuthDataWithRefreshPreservesNewToken() throws Exception {
+ TokensAndUrlAuthData initialAuthData = getInitialAuthData();
+ OAuthTokenManager tokenManager =
+ new OAuthTokenManager(initialAuthData, appCredentials, new OkHttpClient(), monitor);
+ webServer.enqueue(
+ new MockResponse()
+ .setResponseCode(200)
+ .setBody(
+ ""
+ + "{"
+ + " \"access_token\": \"newAccessToken\","
+ + " \"token_type\": \"Bearer\""
+ + "}"));
+
+ TokensAndUrlAuthData usedAuthData =
+ tokenManager.withAuthData(
+ authData -> {
+ if (authData.equals(initialAuthData)) {
+ throw new InvalidTokenException("Token expired", null);
+ }
+ return authData;
+ });
+ assertEquals("newAccessToken", usedAuthData.getToken());
+ assertEquals(1, webServer.getRequestCount());
+
+ TokensAndUrlAuthData secondUsedAuthData = tokenManager.withAuthData(authData -> authData);
+ assertEquals("newAccessToken", secondUsedAuthData.getToken());
+ // Still only 1 request to refresh
+ assertEquals(1, webServer.getRequestCount());
+ }
+
+ @Test
+ public void testWithAuthDataNoRefreshToken() throws Exception {
+ TokensAndUrlAuthData initialAuthData = new TokensAndUrlAuthData("initialAccessToken", null, "");
+ OAuthTokenManager tokenManager =
+ new OAuthTokenManager(initialAuthData, appCredentials, new OkHttpClient(), monitor);
+
+ assertThrows(
+ "Token expired",
+ InvalidTokenException.class,
+ () ->
+ tokenManager.withAuthData(
+ authData -> {
+ if (authData.equals(initialAuthData)) {
+ throw new InvalidTokenException("Token expired", null);
+ }
+ return authData;
+ }));
+ assertEquals(0, webServer.getRequestCount());
+ }
+
+ @Test
+ public void testWithAuthDataPropagatesIOExceptions() throws Exception {
+ OAuthTokenManager tokenManager =
+ new OAuthTokenManager(getInitialAuthData(), appCredentials, new OkHttpClient(), monitor);
+
+ assertThrows(
+ "messagetopropagate",
+ IOException.class,
+ () ->
+ tokenManager.withAuthData(
+ authData -> {
+ throw new IOException("messagetopropagate");
+ }));
+ }
+}
diff --git a/portability-spi-transfer/src/main/java/org/datatransferproject/spi/transfer/extension/TransferExtension.java b/portability-spi-transfer/src/main/java/org/datatransferproject/spi/transfer/extension/TransferExtension.java
index 82111f871..11804abd7 100644
--- a/portability-spi-transfer/src/main/java/org/datatransferproject/spi/transfer/extension/TransferExtension.java
+++ b/portability-spi-transfer/src/main/java/org/datatransferproject/spi/transfer/extension/TransferExtension.java
@@ -28,6 +28,10 @@ public interface TransferExtension extends AbstractExtension {
/** The key associated with this extension's service. */
String getServiceId();
+ default boolean supportsService(String service) {
+ return this.getServiceId().toLowerCase().equals(service.toLowerCase());
+ }
+
/** Returns initialized extension exporter.
* @param transferDataType*/
Exporter<?, ?> getExporter(DataVertical transferDataType);
diff --git a/portability-transfer/src/main/java/org/datatransferproject/transfer/WorkerModule.java b/portability-transfer/src/main/java/org/datatransferproject/transfer/WorkerModule.java
index ebd3296aa..7731eb665 100644
--- a/portability-transfer/src/main/java/org/datatransferproject/transfer/WorkerModule.java
+++ b/portability-transfer/src/main/java/org/datatransferproject/transfer/WorkerModule.java
@@ -97,7 +97,7 @@ static TransferExtension findTransferExtension(
try {
return transferExtensions
.stream()
- .filter(ext -> ext.getServiceId().toLowerCase().equals(service.toLowerCase()))
+ .filter(ext -> ext.supportsService(service))
.collect(onlyElement());
} catch (IllegalArgumentException e) {
throw new IllegalStateException(
@@ -283,21 +283,10 @@ ExtensionContext getContext() {
}
private TransferServiceConfig getTransferServiceConfig(TransferExtension ext) {
- String configFileName = "config/" + ext.getServiceId().toLowerCase() + ".yaml";
- InputStream inputStream = this.getClass().getClassLoader().getResourceAsStream(configFileName);
- getMonitor()
- .info(
- () ->
- format(
- "Service %s has a config file: %s", ext.getServiceId(), (inputStream != null)));
- if (inputStream == null) {
- return TransferServiceConfig.getDefaultInstance();
- } else {
- try {
- return TransferServiceConfig.create(inputStream);
- } catch (IOException e) {
- throw new RuntimeException("Couldn't create config for " + ext.getServiceId(), e);
- }
+ try {
+ return TransferServiceConfig.getForService(ext.getServiceId());
+ } catch (IOException e) {
+ throw new RuntimeException("Couldn't create config for " + ext.getServiceId(), e);
}
}
diff --git a/portability-types-common/src/main/java/org/datatransferproject/types/common/models/blob/DtpDigitalDocument.java b/portability-types-common/src/main/java/org/datatransferproject/types/common/models/blob/DtpDigitalDocument.java
index b3eb10ed2..6ba3093b4 100644
--- a/portability-types-common/src/main/java/org/datatransferproject/types/common/models/blob/DtpDigitalDocument.java
+++ b/portability-types-common/src/main/java/org/datatransferproject/types/common/models/blob/DtpDigitalDocument.java
@@ -2,6 +2,7 @@
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.annotation.JsonTypeInfo;
/**
* This is intended to by a sub set of schema.org's DigitalDocumentWrapper
@@ -10,6 +11,7 @@
*/
// N.B. if this class gets more complex we can just use: https://github.com/google/schemaorg-java
// but right now that probably add more complexity in terms of extra cognitive load.
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
public class DtpDigitalDocument {
private final String name;
diff --git a/portability-types-common/src/main/java/org/datatransferproject/types/common/models/calendar/CalendarAttendeeModel.java b/portability-types-common/src/main/java/org/datatransferproject/types/common/models/calendar/CalendarAttendeeModel.java
index 69764b0cb..f1860071b 100644
--- a/portability-types-common/src/main/java/org/datatransferproject/types/common/models/calendar/CalendarAttendeeModel.java
+++ b/portability-types-common/src/main/java/org/datatransferproject/types/common/models/calendar/CalendarAttendeeModel.java
@@ -17,9 +17,11 @@
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.annotation.JsonTypeInfo;
import java.util.Objects;
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
public class CalendarAttendeeModel {
private final String displayName;
private final String email;
diff --git a/portability-types-common/src/main/java/org/datatransferproject/types/common/models/calendar/CalendarEventModel.java b/portability-types-common/src/main/java/org/datatransferproject/types/common/models/calendar/CalendarEventModel.java
index f5a230493..20143bd33 100644
--- a/portability-types-common/src/main/java/org/datatransferproject/types/common/models/calendar/CalendarEventModel.java
+++ b/portability-types-common/src/main/java/org/datatransferproject/types/common/models/calendar/CalendarEventModel.java
@@ -17,10 +17,13 @@
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.annotation.JsonTypeInfo;
+
import java.time.OffsetDateTime;
import java.util.List;
import java.util.Objects;
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
public class CalendarEventModel {
private final String calendarId;
@@ -112,6 +115,7 @@ public int hashCode() {
getRecurrenceRule());
}
+ @JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
public static class CalendarEventTime {
private final OffsetDateTime dateTime;
private final boolean dateOnly;
diff --git a/portability-types-common/src/main/java/org/datatransferproject/types/common/models/calendar/CalendarModel.java b/portability-types-common/src/main/java/org/datatransferproject/types/common/models/calendar/CalendarModel.java
index 1ce4b42fd..6eca2c43c 100644
--- a/portability-types-common/src/main/java/org/datatransferproject/types/common/models/calendar/CalendarModel.java
+++ b/portability-types-common/src/main/java/org/datatransferproject/types/common/models/calendar/CalendarModel.java
@@ -18,11 +18,14 @@
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.annotation.JsonTypeInfo;
+
import org.datatransferproject.types.common.ImportableItem;
import javax.annotation.Nonnull;
import java.util.Objects;
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
public class CalendarModel implements ImportableItem {
private final String id;
private final String name;
diff --git a/portability-types-common/src/main/java/org/datatransferproject/types/common/models/social/SocialActivityActor.java b/portability-types-common/src/main/java/org/datatransferproject/types/common/models/social/SocialActivityActor.java
index 434c61875..1e807e50c 100644
--- a/portability-types-common/src/main/java/org/datatransferproject/types/common/models/social/SocialActivityActor.java
+++ b/portability-types-common/src/main/java/org/datatransferproject/types/common/models/social/SocialActivityActor.java
@@ -19,11 +19,14 @@
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.annotation.JsonTypeInfo;
+
import org.datatransferproject.types.common.ImportableItem;
import javax.annotation.Nonnull;
import java.util.Objects;
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
public class SocialActivityActor implements ImportableItem {
private final String url;
private final String name;
diff --git a/portability-types-common/src/main/java/org/datatransferproject/types/common/models/social/SocialActivityAttachment.java b/portability-types-common/src/main/java/org/datatransferproject/types/common/models/social/SocialActivityAttachment.java
index 2bb96a53b..9dbcba4fc 100644
--- a/portability-types-common/src/main/java/org/datatransferproject/types/common/models/social/SocialActivityAttachment.java
+++ b/portability-types-common/src/main/java/org/datatransferproject/types/common/models/social/SocialActivityAttachment.java
@@ -19,11 +19,14 @@
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.annotation.JsonTypeInfo;
+
import org.datatransferproject.types.common.ImportableItem;
import javax.annotation.Nonnull;
import java.util.Objects;
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
public class SocialActivityAttachment implements ImportableItem {
private final SocialActivityAttachmentType type;
private final String url;
diff --git a/portability-types-common/src/main/java/org/datatransferproject/types/common/models/social/SocialActivityLocation.java b/portability-types-common/src/main/java/org/datatransferproject/types/common/models/social/SocialActivityLocation.java
index e610d54ea..84558ab6a 100644
--- a/portability-types-common/src/main/java/org/datatransferproject/types/common/models/social/SocialActivityLocation.java
+++ b/portability-types-common/src/main/java/org/datatransferproject/types/common/models/social/SocialActivityLocation.java
@@ -18,9 +18,11 @@
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.annotation.JsonTypeInfo;
import java.util.Objects;
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
public class SocialActivityLocation {
private final String name;
diff --git a/portability-types-common/src/main/java/org/datatransferproject/types/common/models/social/SocialActivityModel.java b/portability-types-common/src/main/java/org/datatransferproject/types/common/models/social/SocialActivityModel.java
index 2e7c8d607..59fa7c695 100644
--- a/portability-types-common/src/main/java/org/datatransferproject/types/common/models/social/SocialActivityModel.java
+++ b/portability-types-common/src/main/java/org/datatransferproject/types/common/models/social/SocialActivityModel.java
@@ -18,6 +18,7 @@
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.annotation.JsonTypeInfo;
import com.google.common.collect.ImmutableList;
import org.datatransferproject.types.common.ImportableItem;
@@ -26,6 +27,7 @@
import java.util.Collection;
import java.util.Objects;
+@JsonTypeInfo(use = JsonTypeInfo.Id.NAME)
public class SocialActivityModel implements ImportableItem {
private final String id;
private final Instant published;
diff --git a/portability-types-transfer/src/main/java/org/datatransferproject/types/transfer/serviceconfig/TransferServiceConfig.java b/portability-types-transfer/src/main/java/org/datatransferproject/types/transfer/serviceconfig/TransferServiceConfig.java
index fa1cb38e3..f96dbb32a 100644
--- a/portability-types-transfer/src/main/java/org/datatransferproject/types/transfer/serviceconfig/TransferServiceConfig.java
+++ b/portability-types-transfer/src/main/java/org/datatransferproject/types/transfer/serviceconfig/TransferServiceConfig.java
@@ -16,46 +16,66 @@
package org.datatransferproject.types.transfer.serviceconfig;
+import static com.google.common.base.Preconditions.checkNotNull;
+import static java.lang.String.format;
+
+import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;
import com.google.common.util.concurrent.RateLimiter;
-
import java.io.IOException;
import java.io.InputStream;
-
-import static com.google.common.base.Preconditions.checkNotNull;
+import java.util.Optional;
/**
- * A wrapper around {@link TransferServiceConfigSpecification} to provide service specific
- * settings for transfer extensions.
+ * A wrapper around {@link TransferServiceConfigSpecification} to provide service specific settings
+ * for transfer extensions.
*/
public final class TransferServiceConfig {
private static final ObjectMapper YAML_OBJECT_MAPPER = new ObjectMapper(new YAMLFactory());
private final RateLimiter rateLimiter;
+ private final Optional<JsonNode> serviceConfig;
public static TransferServiceConfig create(InputStream s) throws IOException {
return new TransferServiceConfig(
YAML_OBJECT_MAPPER.readValue(s, TransferServiceConfigSpecification.class));
}
- /** Gets a default instance for services that don't have a specific config. **/
+ public static TransferServiceConfig getForService(String service) throws IOException {
+ InputStream stream =
+ TransferServiceConfig.class
+ .getClassLoader()
+ .getResourceAsStream(format("config/%s.yaml", service.toLowerCase()));
+ if (stream == null) {
+ return TransferServiceConfig.getDefaultInstance();
+ } else {
+ return TransferServiceConfig.create(stream);
+ }
+ }
+
+ /** Gets a default instance for services that don't have a specific config. * */
public static TransferServiceConfig getDefaultInstance() {
return new TransferServiceConfig(
- new TransferServiceConfigSpecification(
- Double.MAX_VALUE));
+ new TransferServiceConfigSpecification(Double.MAX_VALUE, null));
}
private TransferServiceConfig(TransferServiceConfigSpecification specification) {
checkNotNull(specification, "specification can't be null");
rateLimiter = RateLimiter.create(specification.getPerUserRateLimit());
+ serviceConfig = specification.getServiceConfig();
}
/**
- * A {@link RateLimiter} that enforces the per-user rate limit that is specified in
- * the config/[service].yaml config file.
- **/
+ * A {@link RateLimiter} that enforces the per-user rate limit that is specified in the
+ * config/[service].yaml config file.
+ */
public RateLimiter getPerUserRateLimiter() {
return rateLimiter;
}
+
+ /** Service-specific configuration * */
+ public Optional<JsonNode> getServiceConfig() {
+ return serviceConfig;
+ }
}
diff --git a/portability-types-transfer/src/main/java/org/datatransferproject/types/transfer/serviceconfig/TransferServiceConfigSpecification.java b/portability-types-transfer/src/main/java/org/datatransferproject/types/transfer/serviceconfig/TransferServiceConfigSpecification.java
index 99c9a4555..d15dbaab2 100644
--- a/portability-types-transfer/src/main/java/org/datatransferproject/types/transfer/serviceconfig/TransferServiceConfigSpecification.java
+++ b/portability-types-transfer/src/main/java/org/datatransferproject/types/transfer/serviceconfig/TransferServiceConfigSpecification.java
@@ -17,26 +17,37 @@
package org.datatransferproject.types.transfer.serviceconfig;
import com.fasterxml.jackson.annotation.JsonProperty;
+import com.fasterxml.jackson.databind.JsonNode;
import com.google.common.base.Preconditions;
+import java.util.Optional;
+import javax.annotation.Nullable;
-/**
- * POJO Specification for Transfer Service specific configuration details.
- */
+/** POJO Specification for Transfer Service specific configuration details. */
public class TransferServiceConfigSpecification {
@JsonProperty("perUserRateLimit")
private final double perUserRateLimit;
+ private final Optional&lt;JsonNode&gt; serviceConfig;
+
public TransferServiceConfigSpecification(
- @JsonProperty("perUserRateLimit") double perUserRateLimit) {
- Preconditions.checkArgument(
- perUserRateLimit > 0,
- "perUserRateLimit must be greater than zero");
+ @JsonProperty("perUserRateLimit") @Nullable Double perUserRateLimit,
+ @JsonProperty("serviceConfig") @Nullable JsonNode serviceConfig) {
+ if (perUserRateLimit == null) {
+ perUserRateLimit = Double.MAX_VALUE;
+ }
+ Preconditions.checkArgument(perUserRateLimit > 0, "perUserRateLimit must be greater than zero");
this.perUserRateLimit = perUserRateLimit;
+ this.serviceConfig = Optional.ofNullable(serviceConfig);
}
- /** The number of operations per second allowed for a user. **/
+ /** The number of operations per second allowed for a user. * */
public double getPerUserRateLimit() {
return perUserRateLimit;
}
+
+ /** Service-specific configuration * */
+ public Optional&lt;JsonNode&gt; getServiceConfig() {
+ return serviceConfig;
+ }
}
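
The reworked constructor above combines two patterns: a `@Nullable Double` parameter that defaults to `Double.MAX_VALUE` when the YAML key is absent, and a nullable `serviceConfig` wrapped in `Optional.ofNullable` so callers never see `null`. A simplified sketch of that defaulting logic, with `String` standing in for Jackson's `JsonNode` so the example needs no external dependency (the class name is hypothetical):

```java
import java.util.Optional;

// Simplified sketch of TransferServiceConfigSpecification's defaulting logic.
// String stands in for com.fasterxml.jackson.databind.JsonNode.
class SpecSketch {
    final double perUserRateLimit;
    final Optional<String> serviceConfig;

    SpecSketch(Double perUserRateLimit, String serviceConfig) {
        // A missing rate limit means "effectively unlimited".
        if (perUserRateLimit == null) {
            perUserRateLimit = Double.MAX_VALUE;
        }
        // Same validation as the Preconditions.checkArgument call.
        if (perUserRateLimit <= 0) {
            throw new IllegalArgumentException("perUserRateLimit must be greater than zero");
        }
        this.perUserRateLimit = perUserRateLimit;
        // Absent service-specific config becomes Optional.empty(), not null.
        this.serviceConfig = Optional.ofNullable(serviceConfig);
    }
}
```

Exposing `Optional<JsonNode>` rather than a nullable field pushes the "is there service-specific config?" check to the call site, which is why `TransferServiceConfig` can forward it unchanged.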
diff --git a/settings.gradle b/settings.gradle
index 07b328f2a..832635613 100644
--- a/settings.gradle
+++ b/settings.gradle
@@ -83,6 +83,9 @@ include ':extensions:data-transfer:portability-data-transfer-koofr'
// Backblaze
include ':extensions:data-transfer:portability-data-transfer-backblaze'
+// Generic
+include ':extensions:data-transfer:portability-data-transfer-generic'
+
include ':extensions:auth:portability-auth-offline-demo'
include ':extensions:data-transfer:portability-data-transfer-offline-demo'