Either build the local Hadoop/Spark versions you want, or choose a profile which refers to public releases.
Note: the public Spark release MUST have been built with the spark-hadoop-cloud module; the ASF releases tend not to be.
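As a rough sketch (the exact flags depend on the Spark version and which other profiles you need), a Spark build that includes the spark-hadoop-cloud module can typically be produced by enabling Spark's hadoop-cloud profile:

# sketch: build Spark from its source tree with the hadoop-cloud module enabled
./build/mvn -Phadoop-cloud -DskipTests clean install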
Example: build with Steve's s3a test configuration, Hadoop trunk, and the Cloudera Spark build.
# Build locally against the hadoop-trunk and spark-cdpd-master profiles
mi -Phadoop-trunk -Pspark-cdpd-master
# Run the tests, pointing them at the s3a cloud test configuration file
mvt -Dcloud.test.configuration.file=/Users/stevel/Projects/sparkwork/cloud-test-configs/s3a.xml -Phadoop-trunk -Pspark-cdpd-master
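Here mi and mvt appear to be the author's shell aliases for the underlying Maven invocations; a plausible expansion (an assumption, not confirmed by the source) would be:

# assumed expansion of the aliases above: install skipping tests, then run the test phase
mvn install -DskipTests -Phadoop-trunk -Pspark-cdpd-master
mvn test -Dcloud.test.configuration.file=/Users/stevel/Projects/sparkwork/cloud-test-configs/s3a.xml -Phadoop-trunk -Pspark-cdpd-master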