Spark client update to use the new lakefs sdk package #9226
esti.yaml
on: pull_request
Job | Duration
---|---
Check if secrets are available. | 3s
Generate code from latest lakeFS app | 12s
Test lakeFS metadata client export with Spark 3.x | 0s
Test lakeFS rclone export functionality | 51s
Test lakeFS Hadoop FileSystem | 6m 9s
Test lakeFS multipart upload with Hadoop S3A | 2m 20s
Test unified gc | 0s
Test metastore client commands using trino | 1m 55s
Run latest lakeFS app on AWS S3 DynamoDB KV | 4m 44s
Run latest lakeFS app on AWS S3 | 6m 8s
Run latest lakeFS app on Google Cloud Platform and Google Cloud Storage | 6m 39s
Run latest lakeFS app on Azure with Azure blobstore | 6m 8s
Run latest lakeFS app on Azure with Azure Data Lake Storage Gen2 and CosmosDB | 9m 16s

Matrix: spark (the Spark test jobs run as a matrix; a hedged sketch of such a setup follows below)
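Only the pull_request trigger and the existence of a "spark" matrix can be read from this run summary. A minimal sketch of how such a workflow is typically declared, with assumed matrix values and step contents rather than the actual esti.yaml:

```yaml
# Hypothetical sketch, not the real esti.yaml: only "on: pull_request" and the
# presence of a "spark" matrix come from this run; everything else is assumed.
name: Esti
on: pull_request

jobs:
  spark:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        spark: ["3.2.1", "3.3.1"]   # assumed Spark versions
    steps:
      - uses: actions/checkout@v4
      - name: Test lakeFS metadata client export with Spark ${{ matrix.spark }}
        run: echo "placeholder for the Esti Spark test step"
```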
Annotations
1 error and 2 warnings
Error (Build metadata client for Spark 3.x): Process completed with exit code 1.

Warning (login-to-amazon-ecr): Your docker password is not masked. See https://github.com/aws-actions/amazon-ecr-login#docker-credentials for more information.

Warning (Build lakeFS HadoopFS): Path Validation Error: Path(s) specified in the action for caching do(es) not exist, hence no cache is being saved.
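Both warnings point at workflow configuration rather than at the Spark client change itself. A minimal sketch of the usual remedies, assuming the mask-password input of aws-actions/amazon-ecr-login and an actions/cache step; the step names, action versions, and cache path are illustrative, not the actual esti.yaml contents:

```yaml
# Hypothetical sketch; step names, action versions, and cache paths are assumptions.
on: pull_request
jobs:
  example:
    runs-on: ubuntu-latest
    steps:
      - name: login-to-amazon-ecr
        uses: aws-actions/amazon-ecr-login@v2
        with:
          mask-password: "true"    # keeps the docker password out of the job log
      - name: Cache build dependencies (Build lakeFS HadoopFS)
        uses: actions/cache@v4
        with:
          # actions/cache prints "Path Validation Error" in its post step when this
          # path does not exist, so it must point at a directory the build creates.
          path: ~/.m2/repository   # assumed cache location
          key: maven-${{ hashFiles('**/pom.xml') }}
```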
Artifacts
Produced during runtime
Name | Size | Status
---|---|---
generated-code | 15.3 MB | Expired
lakefs-hadoopfs | 5.96 MB | Expired
spark-apps | 4.48 MB | Expired
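These artifacts are the usual hand-off between the build jobs and the test jobs in the run. A hedged sketch of how such artifacts are typically produced and consumed with actions/upload-artifact and actions/download-artifact; only the artifact name lakefs-hadoopfs comes from this run, while the job names, paths, and retention period are assumptions:

```yaml
# Hypothetical sketch; only the artifact name comes from this run, the job names,
# paths, and retention period are assumptions.
on: pull_request
jobs:
  build-hadoopfs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build lakeFS HadoopFS
        run: echo "placeholder for the actual build step"
      - uses: actions/upload-artifact@v4
        with:
          name: lakefs-hadoopfs
          path: clients/hadoopfs/target/*.jar   # assumed build output path
          retention-days: 7                     # the artifacts listed above have already expired

  test-hadoopfs:
    needs: build-hadoopfs
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: lakefs-hadoopfs
          path: ./artifacts                     # the downloaded jar lands here
```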