Content-details Job Failure Production Diksha Environment #37
Unanswered
prabhunaveen asked this question in Issues
Replies: 0 comments
@kumarks1122
We are continuously facing execution issues with the Content Details job in production; sometimes the job takes 14 hours to complete, and it has also failed multiple times this month.
Below is the job execution error log:
```
22/09/04 18:09:15 ERROR FileFormatWriter: Aborting job d30ccf5d-c0f5-44b7-a98c-158f29fad711.
org.apache.hadoop.fs.azure.AzureException: com.microsoft.azure.storage.StorageException: The specified blob does not exist.
	at org.apache.hadoop.fs.azure.AzureNativeFileSystemStore.rename(AzureNativeFileSystemStore.java:2482)
	at org.apache.hadoop.fs.azure.NativeAzureFileSystem$FolderRenamePending.execute(NativeAzureFileSystem.java:413)
	at org.apache.hadoop.fs.azure.NativeAzureFileSystem.rename(NativeAzureFileSystem.java:1997)
	at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.renameOrMerge(FileOutputCommitter.java:440)
	at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.mergePaths(FileOutputCommitter.java:432)
	at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.mergePaths(FileOutputCommitter.java:428)
	at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.commitJobInternal(FileOutputCommitter.java:362)
	at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.commitJob(FileOutputCommitter.java:334)
	at org.apache.spark.internal.io.HadoopMapReduceCommitProtocol.commitJob(HadoopMapReduceCommitProtocol.scala:182)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:220)
	at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:188)
	at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:108)
	at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:106)
	at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:131)
	at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:180)
	at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:218)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:215)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:176)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:132)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:131)
	at org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:989)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
	at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:989)
	at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:438)
	at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:415)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:293)
	at org.ekstep.analytics.framework.util.DatasetExt.copyMergeFile(DatasetUtil.scala:103)
	at org.ekstep.analytics.framework.util.DatasetExt.saveToBlobStore(DatasetUtil.scala:75)
	at org.sunbird.analytics.sourcing.SourcingMetrics$.$anonfun$saveReportToBlob$3(SourcingMetrics.scala:190)
	at scala.collection.immutable.List.map(List.scala:286)
	at org.sunbird.analytics.sourcing.SourcingMetrics$.saveReportToBlob(SourcingMetrics.scala:189)
	at org.sunbird.analytics.sourcing.ContentDetailsReport$.generateTenantReport(ContentDetailsReport.scala:116)
	at org.sunbird.analytics.sourcing.ContentDetailsReport$.$anonfun$process$3(ContentDetailsReport.scala:75)
	at org.sunbird.analytics.sourcing.ContentDetailsReport$.$anonfun$process$3$adapted(ContentDetailsReport.scala:74)
	at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
	at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
	at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
	at scala.collection.TraversableLike.map(TraversableLike.scala:238)
	at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
	at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
	at org.sunbird.analytics.sourcing.ContentDetailsReport$.process(ContentDetailsReport.scala:74)
	at org.sunbird.analytics.sourcing.ContentDetailsReport$.execute(ContentDetailsReport.scala:61)
	at org.sunbird.analytics.sourcing.ContentDetailsReport$.$anonfun$main$1(ContentDetailsReport.scala:39)
	at org.ekstep.analytics.framework.util.CommonUtil$.time(CommonUtil.scala:538)
	at org.sunbird.analytics.sourcing.ContentDetailsReport$.main(ContentDetailsReport.scala:39)
	at org.ekstep.analytics.job.JobExecutor$.main(JobExecutor.scala:16)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at optional.Application.callWithOptions(application.scala:251)
	at optional.Application.callWithOptions$(application.scala:223)
	at org.ekstep.analytics.job.JobExecutor$.callWithOptions(JobExecutor.scala:8)
	at optional.Application.main(application.scala:258)
	at optional.Application.main$(application.scala:255)
	at org.ekstep.analytics.job.JobExecutor$.main(JobExecutor.scala:8)
	at org.ekstep.analytics.job.JobExecutor.main(JobExecutor.scala)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: com.microsoft.azure.storage.StorageException: The specified blob does not exist.
	at com.microsoft.azure.storage.StorageException.translateException(StorageException.java:89)
	at com.microsoft.azure.storage.core.StorageRequest.materializeException(StorageRequest.java:305)
	at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:175)
	at com.microsoft.azure.storage.blob.CloudBlob.delete(CloudBlob.java:1123)
	at org.apache.hadoop.fs.azure.StorageInterfaceImpl$CloudBlobWrapperImpl.delete(StorageInterfaceImpl.java:294)
	at org.apache.hadoop.fs.azure.AzureNativeFileSystemStore.safeDelete(AzureNativeFileSystemStore.java:2322)
	at org.apache.hadoop.fs.azure.AzureNativeFileSystemStore.rename(AzureNativeFileSystemStore.java:2479)
	... 74 more
22/09/04 18:11:17 INFO SparkUI: Stopped Spark web UI at http://54.255.154.146:4040
[ERROR] [09/04/2022 18:11:17.903] [scruid-actor-system-akka.actor.default-dispatcher-229] [akka.stream.Log(akka://scruid-actor-system/system/StreamSupervisor-2)] [scruid-load-balancer] Upstream failed.
(akka.stream.AbruptTerminationException: Processor actor [Actor[akka://scruid-actor-system/system/StreamSupervisor-2/flow-0-2-mapAsyncUnordered#-1032283]] terminated abruptly)
22/09/04 18:11:17 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
```
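For context (this note is an addition, not part of the original report): the trace shows the job aborting inside `FileOutputCommitter`'s job-commit rename of the temporary output directory on Azure blob storage, where rename is implemented as copy-then-delete and is not atomic, so a retried delete can hit a blob that is already gone ("The specified blob does not exist"). One commonly discussed mitigation is switching to commit algorithm v2, which moves task output into place at task commit instead of in one large job-commit merge. A sketch only, not a verified fix for this job; the driver class comes from the trace, while the jar name and `--model` argument are illustrative assumptions:

```shell
# Sketch: enable FileOutputCommitter algorithm v2 to shrink the commit-time
# rename window on non-atomic stores such as wasb://. Jar name and job
# arguments below are hypothetical placeholders.
spark-submit \
  --class org.ekstep.analytics.job.JobExecutor \
  --conf spark.hadoop.mapreduce.fileoutputcommitter.algorithm.version=2 \
  --conf spark.speculation=false \
  analytics-job-driver.jar --model ContentDetailsReport
```

Note the trade-off: with algorithm v2, a job that fails mid-commit can leave partially visible output, so downstream consumers of the report files should tolerate or detect incomplete runs.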
cc: @reshmi-nair @rhwarrier @anandvarada @srajasimman