With a working cluster already set up (CentOS 6.6 + Hadoop 2.7 + HBase 0.98 + Spark 1.3.1), I was debugging a Spark job that reads from HBase, using IntelliJ on a Windows 7 machine. Running it failed immediately with:
```
15/06/11 15:35:50 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
    at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:356)
    at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:371)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:364)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
    at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:272)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:790)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:760)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:633)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2001)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2001)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2001)
    at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:207)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:218)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:154)
    at SparkFromHbase$.main(SparkFromHbase.scala:15)
    at SparkFromHbase.main(SparkFromHbase.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
```
A look at the Hadoop source turns up this snippet:
```java
public static final String getQualifiedBinPath(String executable)
    throws IOException {
  // construct hadoop bin path to the specified executable
  String fullExeName = HADOOP_HOME_DIR + File.separator + "bin"
      + File.separator + executable;

  File exeFile = new File(fullExeName);
  if (!exeFile.exists()) {
    throw new IOException("Could not locate executable " + fullExeName
        + " in the Hadoop binaries.");
  }
  return exeFile.getCanonicalPath();
}

private static String HADOOP_HOME_DIR = checkHadoopHome();

private static String checkHadoopHome() {
  // first check the Dflag hadoop.home.dir with JVM scope
  String home = System.getProperty("hadoop.home.dir");

  // fall back to the system/user-global env variable
  if (home == null) {
    home = System.getenv("HADOOP_HOME");
  }
  ...
}
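The failure mode can be reproduced in isolation. This small demo (names are mine, not Hadoop's) simulates what happens when neither the `hadoop.home.dir` property nor the `HADOOP_HOME` variable is set — `checkHadoopHome()` effectively yields `null`, and Java string concatenation turns the null reference into the literal text "null":

```java
public class NullHomeDemo {
    public static void main(String[] args) {
        // What checkHadoopHome() effectively returns when nothing is configured:
        String home = null;
        // String concatenation with a null reference produces the text "null",
        // which is exactly the bogus path in the exception message.
        String fullExeName = home + "\\bin\\winutils.exe";
        System.out.println(fullExeName); // -> null\bin\winutils.exe
    }
}
```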
So this is clearly a HADOOP_HOME problem: if HADOOP_HOME resolves to null, fullExeName inevitably becomes null\bin\winutils.exe. The fix is simple: set the environment variable. If you don't want to restart your machine, you can set it in code instead:
```java
System.setProperty("hadoop.home.dir", "E:\\Program Files\\hadoop-2.7.0");
```
Note: E:\\Program Files\\hadoop-2.7.0 is the path where I unpacked Hadoop on my own machine.
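One detail worth stressing: the stack trace above shows the property is read from `Shell`'s static initializer, so it must be set before the first Hadoop class loads, i.e. before the `SparkContext` is created. A minimal sketch of the driver (the commented-out Spark calls are illustrative; the class name is hypothetical):

```java
public class SparkFromHbaseDriver {
    public static void main(String[] args) {
        // Must run before any Hadoop class is touched: Shell caches
        // HADOOP_HOME_DIR in a static initializer, so setting the
        // property after the SparkContext exists has no effect.
        System.setProperty("hadoop.home.dir", "E:\\Program Files\\hadoop-2.7.0");

        // SparkConf conf = new SparkConf().setAppName("SparkFromHbase");
        // SparkContext sc = new SparkContext(conf); // Shell now finds the home dir
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```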
Run it again and you may well hit the same error, and at this point you might want to blame me. Actually, I saw this coming: look inside your hadoop-x.x.x/bin directory and you will find there is no winutils.exe in it at all.
So here is the next step: you can download one from GitHub, at a well-known address.
URL: https://github.com/srccodes/hadoop-common-2.2.0-bin
Don't worry about the version mismatch — I used it with the latest hadoop-2.7.0 without any problems. Once downloaded, drop winutils.exe into your hadoop-x.x.x/bin directory.
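After copying the binary over, a quick sanity check (class name hypothetical) that mirrors the lookup in the Hadoop source above confirms the file is where Hadoop will look:

```java
import java.io.File;

public class WinutilsCheck {
    public static void main(String[] args) {
        // Same resolution order as checkHadoopHome(): JVM property first,
        // then the HADOOP_HOME environment variable.
        String home = System.getProperty("hadoop.home.dir",
                                         System.getenv("HADOOP_HOME"));
        File exe = new File(home, "bin" + File.separator + "winutils.exe");
        System.out.println((exe.exists() ? "OK: " : "MISSING: ") + exe.getPath());
    }
}
```

If this prints MISSING, Spark's `SparkContext` will fail with the same IOException as before.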
At this point the problem should be solved. If it still isn't, you're a rare case — feel free to add me on QQ!
Original post: http://www.cnblogs.com/hyl8218/p/5492450.html