Extract common classes from src/scala/microsoft-spark-<version>. #15
We create multiple jars during our builds to accommodate multiple versions of Apache Spark. In the current approach, the implementation is copied from one version to another and then the necessary changes are made.

An ideal approach could create a `common` directory and extract the common classes from the duplicated code. Note that even if a class is exactly the same across versions, it cannot be pulled out into a common class if it depends on Apache Spark.
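To make that constraint concrete, here is a minimal hypothetical sketch (the object names and bodies are invented for illustration, not taken from the repository):

```scala
import org.apache.spark.sql.SparkSession

// Safe to extract into `common`: plain Scala with no Spark imports,
// so one compiled copy works for every supported Spark version.
object PortUtils {
  def parsePort(s: String, default: Int): Int =
    try s.trim.toInt
    catch { case _: NumberFormatException => default }
}

// NOT safe to extract, even if this file is byte-for-byte identical in
// every microsoft-spark-<version> folder: it compiles against the Spark
// API, so it must be rebuilt once per supported Spark version.
object SessionUtils {
  def activeAppName(spark: SparkSession): String =
    spark.sparkContext.appName
}
```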
Success Criteria:

Comments

spzSource: Hi @imback82 …
imback82: That will be great, thanks @spzSource!
spzSource: Hi @imback82 Before starting work I just want to confirm that I correctly understand the suggested approach. Am I correct in saying that the intention is to create a separate maven project (for instance, …)?
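For illustration, the layout being proposed might look roughly like the sketch below; the module names and version numbers are guesses, since the example in the comment above was truncated:

```
src/scala/
├── pom.xml                     (parent pom declaring the modules below)
├── microsoft-spark-common/     (new module: Spark-independent classes only)
├── microsoft-spark-2.3.x/     (depends on microsoft-spark-common and Spark 2.3)
└── microsoft-spark-2.4.x/     (depends on microsoft-spark-common and Spark 2.4)
```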
imback82: Yea, I think that's one way to do it. But we have to make sure …
spzSource: Hi @imback82 Looks like I got stuck right after creating the common maven module. Almost all classes inherit from … Am I correct in understanding that removing …
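The name of the base class is truncated above, but a typical blocker of this shape is inheriting from a Spark trait such as org.apache.spark.internal.Logging. As a hedged sketch of one possible workaround (an assumption, not necessarily what the project adopted), a small Spark-free logging trait could live in the common module instead:

```scala
import org.slf4j.LoggerFactory

// Hypothetical Spark-free stand-in for Spark's Logging trait.
// Because it depends only on SLF4J, classes mixing it in can move
// to the common module without dragging in a Spark dependency.
trait CommonLogging {
  @transient private lazy val logger = LoggerFactory.getLogger(getClass)

  protected def logInfo(msg: => String): Unit =
    if (logger.isInfoEnabled) logger.info(msg)

  protected def logWarning(msg: => String): Unit =
    if (logger.isWarnEnabled) logger.warn(msg)
}

// Illustrative class that previously extended the Spark trait and now
// compiles in the common module against the shim alone.
class BackendHandler extends CommonLogging {
  def start(): Unit = logInfo("Backend handler started")
}
```

Since Spark itself logs through SLF4J, output from such a shim would normally flow into the same logging configuration as the rest of the application.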