Refactor java/scala templates to maven/sbt instead
1 parent a248a2f · commit 020bcc3
Showing 11 changed files with 24 additions and 23 deletions.
Dockerfile:

````diff
@@ -3,6 +3,7 @@ FROM bde2020/spark-submit:3.1.1-hadoop3.2
 LABEL maintainer="Gezim Sejdiu <[email protected]>, Giannis Mouchakis <[email protected]>"
 
 ENV SPARK_APPLICATION_JAR_NAME application-1.0
 ENV SPARK_APPLICATION_JAR_LOCATION /app/application.jar
 
 COPY template.sh /
````
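These environment variables tell the bundled `template.sh` where to find the application JAR. As a rough sketch (not part of this commit; the values are illustrative assumptions), a downstream project could extend the image and override them:

```
FROM bde2020/spark-maven-template:3.1.1-hadoop3.2

# Illustrative values only; the variable names come from this commit's
# Dockerfile and README, the values do not
ENV SPARK_APPLICATION_JAR_NAME my-app-1.0
ENV SPARK_MASTER_NAME spark-master
ENV SPARK_MASTER_PORT 7077
```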
README (Maven template):

````diff
@@ -1,17 +1,17 @@
-# Spark Java template
+# Spark Maven template
 
-The Spark Java template image serves as a base image to build your own Java application to run on a Spark cluster. See [big-data-europe/docker-spark README](https://github.com/big-data-europe/docker-spark) for a description of how to set up a Spark cluster.
+The Spark Maven template image serves as a base image to build your own Maven application to run on a Spark cluster. See [big-data-europe/docker-spark README](https://github.com/big-data-europe/docker-spark) for a description of how to set up a Spark cluster.
 
 ### Package your application using Maven
-You can build and launch your Java application on a Spark cluster by extending this image with your sources. The template uses [Maven](https://maven.apache.org/) as its build tool, so make sure you have a `pom.xml` file for your application specifying all the dependencies.
+You can build and launch your Maven application on a Spark cluster by extending this image with your sources. The template uses [Maven](https://maven.apache.org/) as its build tool, so make sure you have a `pom.xml` file for your application specifying all the dependencies.
 
 The Maven `package` command must create an assembly JAR (or 'uber' JAR) containing your code and its dependencies. Spark and Hadoop dependencies should be listed as `provided`. The [Maven shade plugin](http://maven.apache.org/plugins/maven-shade-plugin/) can be used to build such assembly JARs.
 
-### Extending the Spark Java template with your application
+### Extending the Spark Maven template with your application
 
-#### Steps to extend the Spark Java template
+#### Steps to extend the Spark Maven template
 1. Create a Dockerfile in the root folder of your project (which also contains a `pom.xml`)
-2. Extend the Spark Java template Docker image
+2. Extend the Spark Maven template Docker image
 3. Configure the following environment variables (unless the default value satisfies):
    * `SPARK_MASTER_NAME` (default: spark-master)
    * `SPARK_MASTER_PORT` (default: 7077)
````
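The hunk above requires `mvn package` to produce an uber JAR with Spark scoped as `provided`. A minimal `pom.xml` fragment along those lines might look like this (illustrative only; the artifact and version choices are assumptions, not taken from the commit):

```
<!-- Illustrative fragment, not part of this commit -->
<dependencies>
  <!-- Spark is provided by the cluster, so keep it out of the uber JAR -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.1.1</version>
    <scope>provided</scope>
  </dependency>
</dependencies>

<build>
  <plugins>
    <!-- Bind the shade plugin to the package phase so that
         `mvn package` emits the assembly JAR -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.2.4</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```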
````diff
@@ -21,10 +21,10 @@ The Maven `package` command must create an assembly JAR (or 'uber' JAR) containing
 4. Build and run the image
 ```
 docker build --rm=true -t bde/spark-app .
-docker run --name my-spark-app -e ENABLE_INIT_DAEMON=false --link spark-master:spark-master -d bde/spark-app
+docker run --name my-spark-app --link spark-master:spark-master -d bde/spark-app
 ```
 
-The sources in the project folder will be automatically added to `/usr/src/app` if you directly extend the Spark Java template image. Otherwise you will have to add and package the sources yourself in your Dockerfile with the commands:
+The sources in the project folder will be automatically added to `/usr/src/app` if you directly extend the Spark Maven template image. Otherwise you will have to add and package the sources yourself in your Dockerfile with the commands:
 
     COPY . /usr/src/app
     RUN cd /usr/src/app \
````
````diff
@@ -34,7 +34,7 @@ If you overwrite the template's `CMD` in your Dockerfile, make sure to execute the
 
 #### Example Dockerfile
 ```
-FROM bde2020/spark-java-template:3.1.1-hadoop3.2
+FROM bde2020/spark-maven-template:3.1.1-hadoop3.2
 MAINTAINER Erika Pauwels <[email protected]>
 MAINTAINER Gezim Sejdiu <[email protected]>
````
File renamed without changes.
File renamed without changes.
README (SBT template):

````diff
@@ -1,6 +1,6 @@
-# Spark Scala template
+# Spark SBT template
 
-The Spark Scala template image serves as a base image to build your own Scala
+The Spark SBT template image serves as a base image to build your own Scala
 application to run on a Spark cluster. See
 [big-data-europe/docker-spark README](https://github.com/big-data-europe/docker-spark)
 for a description of how to set up a Spark cluster.
````
````diff
@@ -11,7 +11,7 @@ for a description of how to set up a Spark cluster.
 spark-shell:
 
 ```
-docker run -it --rm bde2020/spark-scala-template sbt console
+docker run -it --rm bde2020/spark-sbt-template sbt console
 ```
 
 You can also use your Docker image directly and test your own code that way.
````
````diff
@@ -29,9 +29,9 @@ When the Docker image is built using this template, you should get a Docker
 image that includes a fat JAR containing your application and all its
 dependencies.
 
-### Extending the Spark Scala template with your application
+### Extending the Spark SBT template with your application
 
-#### Steps to extend the Spark Scala template
+#### Steps to extend the Spark SBT template
 
 1. Create a Dockerfile in the root folder of your project (which also contains
    a `build.sbt`)
````
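For context on the fat-JAR requirement mentioned above, here is a minimal sketch of an sbt setup that produces one, assuming the commonly used sbt-assembly plugin (the plugin choice, names, and versions are assumptions, not taken from this commit):

```
// project/plugins.sbt: illustrative assumption that sbt-assembly builds the fat JAR
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.15.0")

// build.sbt
name := "application"
version := "1.0"
scalaVersion := "2.12.13"

// Spark is provided by the cluster, so it stays out of the fat JAR
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.1.1" % "provided"
```

With a setup like this, `sbt assembly` would emit a single JAR containing the application and its non-provided dependencies.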
````diff
@@ -45,7 +45,7 @@ dependencies.
 4. Build and run the image:
 ```
 docker build --rm=true -t bde/spark-app .
-docker run --name my-spark-app -e ENABLE_INIT_DAEMON=false --link spark-master:spark-master -d bde/spark-app
+docker run --name my-spark-app --link spark-master:spark-master -d bde/spark-app
 ```
 
 The sources in the project folder will be automatically added to `/usr/src/app`
````
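By analogy with the Maven README above, adding and packaging the sources manually in your own Dockerfile might look like this (a sketch; using `sbt assembly` as the packaging step is an assumption, not taken from this commit):

```
COPY . /usr/src/app
RUN cd /usr/src/app \
  && sbt assembly
```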
````diff
@@ -62,7 +62,7 @@ the `/template.sh` script at the end.
 
 #### Example Dockerfile
 
 ```
-FROM bde2020/spark-scala-template:3.1.1-hadoop3.2
+FROM bde2020/spark-sbt-template:3.1.1-hadoop3.2
 MAINTAINER Cecile Tonglet <[email protected]>
````
File renamed without changes.
File renamed without changes.
File renamed without changes.