How to use it #2

Open
liuxingping138 opened this issue Feb 15, 2022 · 9 comments

Comments

@liuxingping138

How can I use this directly in my project?

@niutaofan
Owner

Adapt it to your needs and compile it; once it passes your tests, replace your production version with it.

@liuxingping138
Author

My requirement is exactly to insert data directly through Spark SQL: if a row already exists, update it, otherwise insert it. Could you write a Spark SQL test demo for your modified version?

@liuxingping138
Author

def register(className: String): Unit = {
  val cls = Utils.getContextOrSparkClassLoader.loadClass(className)

Utils is flagged as an error here; it seems to be a version problem.

Symbol Utils is inaccessible from this place
Maven: org.apache.spark:spark-core_2.11:2.3.0

@liuxingping138
Author

I just dropped it into my local Spark project and the version conflicts are serious. This project is built against Scala 2.11 and I am using spark-core 2.3.0, but there are still a lot of conflicts and I don't know how to resolve them.

@liuxingping138
Author

Could you update it for a newer version?

@niutaofan
Owner

My requirement is exactly to insert data directly through Spark SQL: if a row already exists, update it, otherwise insert it. Could you write a Spark SQL test demo for your modified version?

Once it is packaged, just call it directly:
resultDF
  .write
  .format("org.apache.spark.sql.execution.customDatasource.jdbc")
  .option("jdbc.driver", "com.mysql.jdbc.Driver")
  .option("url", "<jdbc url>")
  .option("dbtable", "<table name>")
  .option("saveMode", "update")
  .save()
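
For context, a minimal end-to-end sketch of how that write call could sit inside a job. The SparkSession setup, the MySQL URL, the table name, and the sample DataFrame are illustrative placeholders; only the format string and option keys come from the snippet above.

import org.apache.spark.sql.SparkSession

object UpsertDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("jdbc-upsert-demo")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Rows whose primary key already exists in the MySQL table are updated,
    // the remaining rows are inserted (the "update" save mode of this datasource).
    val resultDF = Seq((1, "alice"), (2, "bob")).toDF("id", "name")

    resultDF.write
      .format("org.apache.spark.sql.execution.customDatasource.jdbc")
      .option("jdbc.driver", "com.mysql.jdbc.Driver")
      .option("url", "jdbc:mysql://localhost:3306/test")  // placeholder URL
      .option("dbtable", "user")                          // placeholder table
      .option("saveMode", "update")
      .save()

    spark.stop()
  }
}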

@niutaofan
Owner

Backward compatibility is enough; across these version iterations the internal changes are actually small.
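
For the version question, a hedged sketch of what aligning the build could look like in build.sbt; the exact Scala patch version and the Provided scope are assumptions, and only the 2.11 / 2.3.0 numbers come from the error report above.

// build.sbt: pin the build to the Spark/Scala versions reported above (assumed layout).
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.0" % Provided,
  "org.apache.spark" %% "spark-sql"  % "2.3.0" % Provided
)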

@niutaofan
Owner

def register(className: String): Unit = {
  val cls = Utils.getContextOrSparkClassLoader.loadClass(className)

Utils is flagged as an error here; it seems to be a version problem.

Symbol Utils is inaccessible from this place
Maven: org.apache.spark:spark-core_2.11:2.3.0

This error is not caused by the version; it is a problem with the private package path.
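
Concretely, org.apache.spark.util.Utils is declared private[spark], so it is only visible from code that itself lives in a package under org.apache.spark. A minimal sketch of the usual fix follows; the object name and package are illustrative, not necessarily the ones used in this repository.

// Keep the file inside an org.apache.spark.* package so the private[spark] Utils is visible.
package org.apache.spark.sql.execution.customDatasource.jdbc

import org.apache.spark.util.Utils

object DriverRegistry {
  def register(className: String): Unit = {
    // Load the JDBC driver class with the context (or Spark) class loader.
    val cls = Utils.getContextOrSparkClassLoader.loadClass(className)
    // Instantiating it triggers the driver's static registration with java.sql.DriverManager.
    cls.getDeclaredConstructor().newInstance()
    // Alternative that avoids the private Utils entirely:
    // Thread.currentThread().getContextClassLoader.loadClass(className)
  }
}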

@liuxingping138
Author

OK, I'll try it locally. I think this is a very good fit for my use case.
