Storm-HDFS is not working in HDP2.2 #28

Open
SachinHadoop opened this issue May 29, 2015 · 0 comments

@SachinHadoop

I am getting the following exception while executing a sample Storm-HDFS topology in local mode:

20311 [Thread-9-hdfs_bolt] WARN org.apache.hadoop.ipc.Client - Exception encountered while connecting to the server : java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name
20328 [Thread-15-test1 spout] INFO backtype.storm.daemon.task - Emitting: test1 spout default [i am at two with nature]
20345 [Thread-11-hdfs_bolt] WARN org.apache.hadoop.ipc.Client - Exception encountered while connecting to the server : java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name
20365 [Thread-9-hdfs_bolt] ERROR backtype.storm.util - Async loop died!
java.lang.RuntimeException: Error preparing HdfsBolt: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name; Host Details : local host is: "xxxx.yyyy.com/00.00.00.001"; destination host is: "abc.yyyy.com":8020;
at org.apache.storm.hdfs.bolt.AbstractHdfsBolt.prepare(AbstractHdfsBolt.java:96) ~[storm-hdfs-0.1.2.jar:na]
at backtype.storm.daemon.executor$fn__3441$fn__3453.invoke(executor.clj:692) ~[storm-core-0.9.3.jar:0.9.3]
at backtype.storm.util$async_loop$fn__464.invoke(util.clj:461) ~[storm-core-0.9.3.jar:0.9.3]
at clojure.lang.AFn.run(AFn.java:24) [clojure-1.5.1.jar:na]
at java.lang.Thread.run(Thread.java:744) [na:1.7.0_45]

    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_45]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_45]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_45]
    at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_45]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.0.2.2.0.0-2041.jar:na]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.0.2.2.0.0-2041.jar:na]
    at com.sun.proxy.$Proxy15.create(Unknown Source) ~[na:na]
    at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1726) ~[hadoop-hdfs-2.6.0.2.2.0.0-2041.jar:na]
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1668) ~[hadoop-hdfs-2.6.0.2.2.0.0-2041.jar:na]
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1593) ~[hadoop-hdfs-2.6.0.2.2.0.0-2041.jar:na]
    at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:397) ~[hadoop-hdfs-2.6.0.2.2.0.0-2041.jar:na]
    at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:393) ~[hadoop-hdfs-2.6.0.2.2.0.0-2041.jar:na]
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-2.6.0.2.2.0.0-2041.jar:na]
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:393) ~[hadoop-hdfs-2.6.0.2.2.0.0-2041.jar:na]
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:337) ~[hadoop-hdfs-2.6.0.2.2.0.0-2041.jar:na]
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:908) ~[hadoop-common-2.6.0.2.2.0.0-2041.jar:na]
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:889) ~[hadoop-common-2.6.0.2.2.0.0-2041.jar:na]
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:786) ~[hadoop-common-2.6.0.2.2.0.0-2041.jar:na]
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:775) ~[hadoop-common-2.6.0.2.2.0.0-2041.jar:na]
    at org.apache.storm.hdfs.bolt.HdfsBolt.createOutputFile(HdfsBolt.java:126) ~[storm-hdfs-0.1.2.jar:na]
    at org.apache.storm.hdfs.bolt.AbstractHdfsBolt.prepare(AbstractHdfsBolt.java:93) ~[storm-hdfs-0.1.2.jar:na]
    ... 4 common frames omitted
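For context, the bolt in the sample is set up along the usual storm-hdfs lines. The snippet below is only illustrative (the fs URL, output path, delimiter, and rotation size are placeholders, not my exact values), but prepare() fails at the same createOutputFile() call shown in the trace:

// Illustrative sketch only -- fs URL, output path, delimiter, and rotation size are placeholders.
import org.apache.storm.hdfs.bolt.HdfsBolt;
import org.apache.storm.hdfs.bolt.format.DefaultFileNameFormat;
import org.apache.storm.hdfs.bolt.format.DelimitedRecordFormat;
import org.apache.storm.hdfs.bolt.format.FileNameFormat;
import org.apache.storm.hdfs.bolt.format.RecordFormat;
import org.apache.storm.hdfs.bolt.rotation.FileRotationPolicy;
import org.apache.storm.hdfs.bolt.rotation.FileSizeRotationPolicy;
import org.apache.storm.hdfs.bolt.rotation.FileSizeRotationPolicy.Units;
import org.apache.storm.hdfs.bolt.sync.CountSyncPolicy;
import org.apache.storm.hdfs.bolt.sync.SyncPolicy;

public class HdfsBoltSketch {
    public static HdfsBolt buildBolt() {
        // Write pipe-delimited records, sync every 1000 tuples, rotate files at 5 MB.
        RecordFormat format = new DelimitedRecordFormat().withFieldDelimiter("|");
        SyncPolicy syncPolicy = new CountSyncPolicy(1000);
        FileRotationPolicy rotationPolicy = new FileSizeRotationPolicy(5.0f, Units.MB);
        FileNameFormat fileNameFormat = new DefaultFileNameFormat().withPath("/tmp/storm/");

        // prepare() opens the first output file against the NameNode; that is the call
        // that dies with the Kerberos principal error above.
        return new HdfsBolt()
                .withFsUrl("hdfs://abc.yyyy.com:8020")
                .withFileNameFormat(fileNameFormat)
                .withRecordFormat(format)
                .withRotationPolicy(rotationPolicy)
                .withSyncPolicy(syncPolicy);
    }
}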

I was trying to execute it as a Java jar with the following command:
java -cp storm-.1.jar:lib/*:lib/log4j.properties promo.storm.sample.StormSampleTopology
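
Since the topology is launched with plain java -cp rather than storm jar on a configured gateway node, I suspect the cluster's core-site.xml / hdfs-site.xml (which would normally carry dfs.namenode.kerberos.principal) are not on the classpath. Below is a minimal sketch of what the HDFS client apparently needs before HdfsBolt.prepare() runs, assuming the cluster is kerberized; the principal and keytab values are placeholders:

// Hedged sketch, assuming a kerberized HDP 2.2 cluster; principal and keytab values are placeholders.
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class HdfsClientLoginSketch {
    public static void login() throws IOException {
        // Normally these settings come from core-site.xml / hdfs-site.xml found on the classpath.
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        conf.set("dfs.namenode.kerberos.principal", "nn/_HOST@YYYY.COM"); // placeholder realm

        // Log in before the topology/bolt touches HDFS.
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab(
                "storm@YYYY.COM",                        // placeholder principal
                "/etc/security/keytabs/storm.keytab");   // placeholder keytab path
    }
}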

My lib folder has the following jar files:

