Hadoop Error

Posts: 3
Joined: Tue Feb 10, 2015 6:01 am

Hadoop Error

Postby digambar.borse » Tue Feb 10, 2015 6:02 am


I'm getting the following error when running my Map-Reduce program with Hadoop version 2.2. I'm linking my Map-Reduce program against the jars in the following path.

Exception in thread "main" org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4
at org.apache.hadoop.ipc.Client.call(Client.java:1107)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
at com.sun.proxy.$Proxy2.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:111)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:213)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:180)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1514)

Can you please suggest a solution?

Finance Junkie
Posts: 709
Joined: Wed Apr 09, 2014 6:28 am

Hadoop Error

Postby edupristine » Tue Feb 10, 2015 8:16 am

Hi, this is an application-side error. One solution you can try is to set the HADOOP_PREFIX environment variable to point to the directory where you have a version of Hadoop 2.x.x installed. For example: export HADOOP_PREFIX=/path/to/hadoop-2.2.0
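
A slightly fuller sketch of the same idea (the install path below is only a placeholder; use wherever your Hadoop 2.2.0 tarball is actually unpacked):

# point HADOOP_PREFIX at the Hadoop 2.x installation
export HADOOP_PREFIX=/path/to/hadoop-2.2.0
export PATH=$HADOOP_PREFIX/bin:$PATH

# check which version the client now picks up
hadoop version

If you build the job in Eclipse, the jars on the project's build path should also come from that same 2.2.0 installation (they live under $HADOOP_PREFIX/share/hadoop/common, hdfs, mapreduce and yarn). The "Server IPC version 9 cannot communicate with client version 4" message typically appears when Hadoop 1.x client jars are still on the classpath while the cluster is running Hadoop 2.x.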

Posts: 3
Joined: Tue Feb 10, 2015 6:01 am

Hadoop Error

Postby digambar.borse » Sun Feb 15, 2015 4:57 am

Setting HADOOP_PREFIX doesn't help.

Just a note: I'm getting this error while running the program from the Eclipse IDE.

Running it from the command line gives a different error:
15/02/15 09:25:22 INFO mapreduce.Job: Job job_1423971188386_0003 failed with state FAILED due to: Application application_1423971188386_0003 failed 2 times due to AM Container for appattempt_1423971188386_0003_000002 exited with exitCode: -1000 due to: File file:/usr/local/hadoop/hadoop-2.2.0/hadoop_temp/nm-local-dir/usercache/root/appcache/application_1423971188386_0001/hadoop_temp/nm-local-dir/nmPrivate/container_1423971188386_0003_02_000001.tokens does not exist
.Failing this attempt.. Failing the application.

Any suggestion?



