This is a sample DGS project that shows how to implement DataLoaders using Kotlin's `suspend` functions.
Major thanks to @lennyburdette for the inspiration to use debounce in this gist.
- `CoroutineDataLoader`: the DataFetcher-facing API
- `CoroutineMappedBatchLoader` and `CoroutineMappedBatchLoaderWithContext`: loader interfaces to implement
- `ShowDataLoader`: a sample implementation
- `DataFetcher`: a sample caller
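The loader contract and a sample implementation might look roughly like this. This is a hedged sketch: the interface name mirrors the component list above, but the exact signatures in the project may differ, and `SampleShowLoader` and its dummy data are purely illustrative.

```kotlin
import kotlinx.coroutines.delay

// Sketch of the loader contract: one suspend call receives the whole
// batch of keys at once (the project's actual signatures may differ).
interface CoroutineMappedBatchLoader<K, V> {
    suspend fun load(keys: Set<K>): Map<K, V>
}

// A ShowDataLoader-like implementation (hypothetical): resolves show
// titles to release years in one batched call instead of one per title.
class SampleShowLoader : CoroutineMappedBatchLoader<String, Int> {
    override suspend fun load(keys: Set<String>): Map<String, Int> {
        println("Running load($keys)")
        delay(10) // stand-in for a real batched backend call
        return keys.associateWith { 2019 } // dummy data
    }
}
```

Because the batch method is itself a `suspend fun`, implementations can call other suspending APIs directly, with no `CompletableFuture` plumbing.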
Run the server, visit http://localhost:8080/graphiql in a browser, and run the query:
```graphql
{
  shows {
    title
    releaseYear
  }
}
```
Note the terminal output, which should look something like this:
```
Calling load(Stranger Things)
Calling load(Ozark)
Calling load(The Crown)
Calling load(Dead to Me)
Calling load(Orange is the New Black)
Running load([Dead to Me, Stranger Things, Orange is the New Black, The Crown, Ozark])
```
Those come from here and here, showing how the loads are batched. Note that the batching may not be perfect (see "why not do this?" below)
DGS uses java-dataloader, which is built on Java's `CompletableFuture`. Kotlin's concurrency model, by contrast, uses coroutines via the `suspend` keyword. Switching between the two requires calls to `.await()` and `asCompletableFuture()` from kotlinx-coroutines-core, which is a bit annoying.
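The two bridging directions look something like this (using the `kotlinx.coroutines.future` extensions; the function names and key/value types here are illustrative, not from the project):

```kotlin
import java.util.concurrent.CompletableFuture
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.async
import kotlinx.coroutines.future.asCompletableFuture
import kotlinx.coroutines.future.await

// CompletableFuture -> coroutine: suspend until the future completes.
suspend fun awaitTitle(future: CompletableFuture<String>): String =
    future.await()

// Coroutine -> CompletableFuture: the shape java-dataloader's batch
// loader contract ultimately expects back (illustrative types).
fun loadBatch(scope: CoroutineScope, keys: Set<String>): CompletableFuture<Map<String, String>> =
    scope.async { keys.associateWith { "show:$it" } }.asCompletableFuture()
```

Neither direction is hard, but sprinkling these conversions through every DataFetcher and loader is the boilerplate this project tries to hide.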
Related issues:
graphql-java doesn't support using multiple DataLoaders to resolve the same field (graphql-java/graphql-java#1198). DGS shares this limitation, since it uses graphql-java under the hood. This limitation forces implementers to call `.dispatch()` manually inside their DataFetchers to prevent requests from hanging, which of course negates the value of using a DataLoader at all. The approach in this project circumvents that issue.
There is a hardcoded debounce period separating the calls to `loader.load()` from the execution of the DataLoader implementation.
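A minimal sketch of that debounce idea, with hypothetical names (the real implementation in this project may be structured differently): each `load()` call parks its key on a channel and suspends, while a single collector waits out the debounce window, drains everything queued so far, and runs one batched load.

```kotlin
import kotlinx.coroutines.CompletableDeferred
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.channels.Channel
import kotlinx.coroutines.delay
import kotlinx.coroutines.launch

// Hypothetical debounced batching loader, not the project's actual class.
class DebouncedLoader<K, V>(
    scope: CoroutineScope,
    private val debounceMillis: Long = 5,
    private val batchLoad: suspend (Set<K>) -> Map<K, V>,
) {
    private val pending = Channel<Pair<K, CompletableDeferred<V>>>(Channel.UNLIMITED)

    init {
        scope.launch {
            while (true) {
                val batch = mutableListOf(pending.receive()) // first key arrives
                delay(debounceMillis)                        // debounce window
                while (true) {                               // drain whatever queued up
                    batch += pending.tryReceive().getOrNull() ?: break
                }
                val results = batchLoad(batch.map { it.first }.toSet())
                for ((key, deferred) in batch) deferred.complete(results.getValue(key))
            }
        }
    }

    // Enqueue a key and suspend until the batched load resolves it.
    suspend fun load(key: K): V {
        val deferred = CompletableDeferred<V>()
        pending.send(key to deferred)
        return deferred.await()
    }
}
```

Any `load()` calls that land inside the same window share one `batchLoad` invocation, which is what produces the single `Running load([...])` line in the terminal output above.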
There's a compromise here with no perfect option: the longer the delay, the more latency added to each query; too short a delay, and the calls to the DataLoader implementation won't be batched as effectively as they could be.
Depending on your performance needs and scale, this may or may not be a problem for you.