This is a brief manual for a workshop that took place online at https://youtube.com/confluent
Make sure you have the following tools installed on your computer:

- A Confluent Cloud account. Tip: keep your login and password handy after you sign up for Confluent Cloud; the ccloud init script will ask you for your login information.
- Docker (we use Docker to build images locally)
- Git
- Your favorite IDE or text editor (personally, I recommend IntelliJ IDEA)
- k3d, to run a local Kubernetes cluster
- jq, for fancy JSON manipulation
- skaffold, to build, run, and deploy images
- k9s, a fancy console GUI for Kubernetes
Before you proceed, be sure to complete the following steps:
git clone https://github.com/confluentinc/demo-scene #(1)
cd demo-scene/streaming-movies-workshop #(2)
1. Clone the repository.
2. Change into the workshop directory.
Alternatively, if you follow the steps below, you will check out only the directory that contains the source code relevant to this post:
# create a working directory and initialize an empty repository
mkdir -p ~/temp/demo-scene
cd ~/temp/demo-scene
git init .
git remote add origin -f https://github.com/confluentinc/demo-scene/
# enable sparse checkout and fetch only the workshop folder
git config core.sparsecheckout true
echo "streaming-movies-workshop/*" >> .git/info/sparse-checkout
git pull --depth=1 origin master
cd streaming-movies-workshop
ls -lh
Note: If you are on a Mac, you can use brew to install all the dependencies by running make install-deps.
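If you are not on a Mac, or you want to see roughly what make install-deps does, the tools can be installed one by one. Here is a sketch using Homebrew; the formula names below are the usual Homebrew ones and may differ for your platform or package manager:

# rough equivalent of installing the workshop dependencies by hand (assumption: Homebrew)
brew install git jq k3d skaffold k9s
# Docker Desktop is distributed as a cask on macOS
brew install --cask docker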
Note: You can try to deploy the apps to a local Kubernetes cluster. There are plenty of options available: minikube, k3d, Docker Desktop. Frankly, I had a hard time using them. You can try a local minikube cluster with make create-local-minikube-cluster. In this tutorial, I will use Google Kubernetes Engine (GKE) to run my test apps. If you want to follow the same route, you need to install the Google Cloud SDK tools. You can create a GKE cluster by calling the make create-gke-cluster command, and destroy it afterwards by calling make destroy-gke-cluster.
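For reference, the cluster-creation targets boil down to commands along these lines. This is only a sketch: the cluster name, zone, and node count below are placeholders, and the workshop Makefile may use different values:

# local option: create a small k3d cluster (cluster name is just an example)
k3d cluster create movies-workshop --agents 1
# GKE option: create a cluster with the Google Cloud SDK (zone and size are placeholders)
gcloud container clusters create movies-workshop --zone us-east1-b --num-nodes 3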
$ ccloud login --save #(1)
$ make create-ccloud-cluster #(2)
1. Log in to your Confluent Cloud account.
2. The ccloud-stack script will ask you to log in to your Confluent Cloud account and will automatically provision a Kafka cluster and a ksqlDB cluster.
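Once the stack is provisioned, you can sanity-check the result from the same terminal. A minimal sketch using the ccloud CLI; exact commands and output depend on your CLI version:

# list the Kafka clusters in your Confluent Cloud account
ccloud kafka cluster list
# list the provisioned ksqlDB applications
ccloud ksql app list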
This workshop includes two apps, microservices developed with Spring Boot:
- movies-generator loads movie data into the Kafka cluster and randomly generates new ratings.
- ratings-processor processes new ratings and constantly recalculates the rating for a given movie.
./gradlew test #(1)
1. This command will download the Gradle wrapper (if it wasn’t previously installed) and run the tests.
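If you want to run the tests for only one of the services, Gradle can target a single module. A sketch, assuming the subprojects are named after the apps (check settings.gradle for the actual module names):

# run tests for a single module; the module paths here are assumptions
./gradlew :movies-generator:test
./gradlew :ratings-processor:test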
❏ Model generation from the Avro schemas (Movie, Rating)
❏ Producer application using KafkaTemplate
❏ Ratings Processor app
  - Explore the tests using TDD
  - Overview of the topologies using the Kafka Streams Topology Visualizer
skaffold run #(1)
1. This command will build images for both applications and deploy them to your Kubernetes cluster.
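After skaffold finishes, you can verify that both services came up. A quick sketch using kubectl and k9s (the pod and deployment names depend on the manifests in the repo):

# watch the pods come up in the current namespace
kubectl get pods -w
# or browse the cluster interactively
k9s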
Tip: Connect to ksqlDB with the CLI. In this exercise, we’re going to use the ksqlDB Cloud UI, but you can also run the CLI using Docker.
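Here is a minimal sketch of running the CLI from Docker against a Confluent Cloud ksqlDB endpoint; the image tag, the endpoint, and the API key variables are placeholders for your own values:

# run the ksqlDB CLI in a container and point it at your ksqlDB endpoint
docker run -it confluentinc/ksqldb-cli:0.28.2 ksql \
  -u "$KSQLDB_API_KEY" \
  -p "$KSQLDB_API_SECRET" \
  "$KSQLDB_ENDPOINT"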
Materialized view

CREATE STREAM RATED_MOVIES_STREAM WITH (
    kafka_topic = 'rated-movies',
    value_format = 'avro'
);

CREATE TABLE RATED_MOVIES_VIEW AS
  SELECT
    TITLE AS TITLE,
    LATEST_BY_OFFSET(RELEASE_YEAR) AS RELEASE_YEAR,
    LATEST_BY_OFFSET(MOVIE_ID) AS MOVIE_ID,
    LATEST_BY_OFFSET(RATING) AS CURRENT_RATING
  FROM RATED_MOVIES_STREAM
  GROUP BY TITLE
  EMIT CHANGES;
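Once the table is populated, you can issue a pull query against it. A sketch over the ksqlDB REST API with curl; the endpoint and API key variables, as well as the movie title, are placeholder values:

# pull query against the materialized view via the ksqlDB REST API
curl -s -u "$KSQLDB_API_KEY:$KSQLDB_API_SECRET" \
  -X POST "$KSQLDB_ENDPOINT/query" \
  -H "Content-Type: application/vnd.ksql.v1+json" \
  -d '{"ksql": "SELECT * FROM RATED_MOVIES_VIEW WHERE TITLE = '\''Lethal Weapon'\'';", "streamsProperties": {}}'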
Note: If you are getting an error about accessing the …