1. Copy hdfs-site.xml and core-site.xml from Hadoop's etc/hadoop/ directory into Spark's conf/ directory.
2. In Spark's conf/ directory, rename spark-defaults.conf.template to spark-defaults.conf (`mv spark-defaults.conf.template spark-defaults.conf`), since Spark reads its default properties from conf/spark-defaults.conf.
3. Add the following line to spark-defaults.conf (adjust the paths to match your actual installation), as verified in the sketch after this list:
spark.files file:///usr/local/soft/spark-1.6.0/conf/hdfs-site.xml,file:///usr/local/soft/spark-1.6.0/conf/core-site.xml
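With the two Hadoop files distributed via spark.files, executors pick up fs.defaultFS from core-site.xml, so HDFS paths resolve without an explicit hdfs://host:port prefix. Below is a minimal Scala smoke test for this setup; the class name HdfsConfigCheck and the path /tmp/sample.txt are placeholders, not part of the original post, so point the path at any file that actually exists on your HDFS.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Smoke test: if hdfs-site.xml/core-site.xml were picked up, a bare path
// (no hdfs://namenode:port prefix) is resolved against fs.defaultFS.
object HdfsConfigCheck {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("HdfsConfigCheck")
    val sc = new SparkContext(conf)

    // "/tmp/sample.txt" is a placeholder; use a file that exists on HDFS.
    val lines = sc.textFile("/tmp/sample.txt")
    println(s"line count: ${lines.count()}")

    sc.stop()
  }
}
```

If the configuration was not picked up, the same bare path is interpreted against the local filesystem (or fails to resolve), so a successful count against a file that only exists on HDFS confirms the setup.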
Reference link: https://blog.csdn.net/sunhaoning/article/details/62214728
Original article: https://www.cnblogs.com/ydk-XL/p/11918331.html