Not able to perform an action on Hive tables using spark-acid jar #34
Hi @vinay-kl, can you try with the latest jar and check whether the issue persists?
@amoghmargoor Sure, I am trying that; will post an update soon.
@amoghmargoor With the latest jar the issue still persists; however, if I change to the JobConf constructor with no arguments, there seems to be no issue.
Hey @vinay-kl, I guess it is because the conf being passed as an argument to JobConf is null. That means the conf being passed at this location will be lost: spark-acid/src/main/scala/com/qubole/spark/hiveacid/reader/hive/HiveAcidReader.scala, line 93 (at 2e66f76).
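For context, here is a minimal sketch of the mechanism by which an RDD like this typically ships the driver's Configuration to executors. The wrapper name is hypothetical (Spark's own SerializableConfiguration is private[spark]), but the Writable-based round trip is the standard pattern; if the value recovered from the broadcast on the executor side is null, new JobConf(conf) has nothing to copy:

```scala
import java.io.{ObjectInputStream, ObjectOutputStream}
import org.apache.hadoop.conf.Configuration

// Hypothetical wrapper: Configuration is not java-serializable, so it is
// written out via its Writable interface before being broadcast.
class SerializableConf(@transient var value: Configuration) extends Serializable {
  private def writeObject(out: ObjectOutputStream): Unit = {
    out.defaultWriteObject()
    value.write(out) // serialize every key/value pair
  }
  private def readObject(in: ObjectInputStream): Unit = {
    in.defaultReadObject()
    value = new Configuration(false) // rebuild without loading default resources
    value.readFields(in)             // restore the broadcast pairs
  }
}

// Driver side:   val broadcastConf = sc.broadcast(new SerializableConf(hadoopConf))
// Executor side: new JobConf(broadcastConf.value.value)
```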
Hey @amoghmargoor, the readerOptions.hadoopConf is not null, and the broadcasted copy is also non-null. I'm able to use the API properly only if the no-args constructor is used instead, at src/main/scala/com/qubole/spark/hiveacid/rdd/HiveAcidRDD.scala, line 174.
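For what it's worth, a minimal sketch of the difference between the two constructors, assuming standard Hadoop semantics rather than anything spark-acid specific: the copy constructor dereferences its argument, so a null conf fails outright, while the no-args constructor loads only the default resources and silently drops whatever the caller had set:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.mapred.JobConf

val custom = new Configuration()
custom.set("hive.txn.manager", "org.apache.hadoop.hive.ql.lockmgr.DbTxnManager")

val jc1 = new JobConf(custom) // keeps the caller's settings
val jc2 = new JobConf()       // default resources only; the custom key is absent

val nullConf: Configuration = null
// new JobConf(nullConf)      // NullPointerException: the copy constructor
                              // dereferences its argument

println(jc1.get("hive.txn.manager")) // ...DbTxnManager
println(jc2.get("hive.txn.manager")) // null
```

So switching to the no-args constructor avoids the crash, but it would also explain why settings pushed into readerOptions.hadoopConf stop reaching the reader.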
@vinay-kl So I assume this problem is solved after the jar upgrade, according to your last comment. Now, if the broadcasted conf is not null, what error do you get (with stack trace)?
@amoghmargoor My bad for not being clear on that: the issue still persists even after the jar upgrade, and the stack trace remains the same. A warning of this sort is also thrown; all these config keys are present in hive-site.xml.
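In case it helps narrow this down, a small diagnostic sketch (the key names are just examples, not the exact ones from the warning) to confirm the hive-site.xml values actually reach the driver-side conf that gets broadcast:

```scala
// Run in spark-shell, where `spark` is predefined; hadoopConfiguration is
// the driver-side Configuration that Hadoop-based readers start from.
val hc = spark.sparkContext.hadoopConfiguration
Seq("hive.metastore.uris", "hive.txn.manager").foreach { k =>
  println(s"$k -> ${hc.get(k)}")
}
```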
But you just said the conf is not null, so it cannot be the same issue. Can you please paste the current stack trace, even if you think it is the same?
```
conf::::Configuration: core-default.xml, core-site.xml, mapred-default.xml, mapred-site.xml, yarn-default.xml, yarn-site.xml, hdfs-default.xml, hdfs-site.xml, spark_hadoop_conf.xml, file:/etc/spark2/3.1.2.2-1/0/hive-site.xml
```
Running on HDI 4.0 with Spark 2.4.0 and Hive 3.1.2.

--stack-trace--

```scala
val a: org.apache.hadoop.conf.Configuration = null
new JobConf(a)
```

The value is coming in as null at this line:
https://github.com/qubole/spark-acid/blob/f445eeef4416ee27192905e0e69a43076db7b2b1/src/main/scala/com/qubole/spark/datasources/hiveacid/rdd/Hive3Rdd.scala#L138
Even tried setting spark.hadoop.cloneConf; it seems to break in both the if and the else branch (see the sketch below).
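For reference, a rough sketch of the shape of that code path, modelled on Spark's HadoopRDD.getJobConf rather than the exact spark-acid source; both branches end up copying from the broadcast conf, which would explain why cloneConf makes no difference when that conf is null:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.mapred.JobConf

// `conf` is the Configuration recovered from the broadcast on the executor.
def getJobConf(conf: Configuration, shouldCloneConf: Boolean): JobConf =
  if (shouldCloneConf) {
    new JobConf(conf) // clone branch: NPE here if conf is null
  } else {
    conf match {
      case jc: JobConf => jc                  // already a JobConf, reuse it
      case other       => new JobConf(other)  // still copies: NPE if null
    }
  }
```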
Also tried HDI 3.6 with Spark 2.3.0, but there was an issue with the Guava jar version being 24.1.1, so it was throwing an error because the method was not found, at this line:
https://github.com/qubole/spark-acid/blob/f445eeef4416ee27192905e0e69a43076db7b2b1/src/main/scala/com/qubole/spark/datasources/hiveacid/rdd/Hive3Rdd.scala#L51
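If a workaround idea for the Guava clash is useful: one common approach is to shade Guava when building the assembly jar, so the cluster's 24.1.1 and the version the jar was compiled against cannot collide. A sketch assuming an sbt-assembly build (adjust to whatever build is actually used):

```scala
// build.sbt fragment, with the sbt-assembly plugin enabled:
// relocate Guava classes inside the fat jar.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.common.**" -> "shadedguava.com.google.common.@1").inAll
)
```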
Can you guys please help me with this?
Thanks and Regards,
Vinay K L