
Spark distributed environment --- slave node fails to start (solved)

Posted: 2017-11-30 19:03:26
Running start-slaves.sh on the master fails to bring the Worker up on the soyo-slave01 node:

soyo@soyo-VPCCB3S1C:~$ start-slaves.sh
soyo-slave01: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local2/spark/logs/spark-soyo-org.apache.spark.deploy.worker.Worker-1-soyo-slave01.out
soyo-slave01: failed to launch: nice -n 0 /usr/local2/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://soyo-VPCCB3S1C:7077
soyo-slave01:   /usr/local2/spark/bin/spark-class: line 71: /usr/lib/jvm/java-8-openjdk-amd64/bin/java: No such file or directory
soyo-slave01: full log in /usr/local2/spark/logs/spark-soyo-org.apache.spark.deploy.worker.Worker-1-soyo-slave01.out
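
The log says exactly what is wrong: at line 71, spark-class tried to run /usr/lib/jvm/java-8-openjdk-amd64/bin/java, and that path does not exist on the slave. A quick way to confirm this on soyo-slave01 (a sketch; output omitted):

# The java binary spark-class tried to run is missing
soyo@soyo-slave01:~$ ls -l /usr/lib/jvm/java-8-openjdk-amd64/bin/java
# Find the java binary that is actually installed
soyo@soyo-slave01:~$ readlink -f "$(which java)"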
Solution:
On the soyo-slave01 node, edit ~/.bashrc so that JAVA_HOME points at the JDK actually installed there (this Ubuntu 14.04 machine does not use the default OpenJDK path). After that the worker starts normally.
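
A minimal sketch of the fix, assuming the real JDK lives at /usr/lib/jvm/jdk1.8.0_151 (a placeholder path; use the directory reported by readlink above, minus the trailing /bin/java):

# In ~/.bashrc on soyo-slave01
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_151   # placeholder; your JDK dir may differ
export PATH=$JAVA_HOME/bin:$PATH

# Reload the environment, then retry from the master
soyo@soyo-slave01:~$ source ~/.bashrc
soyo@soyo-VPCCB3S1C:~$ start-slaves.sh

Since start-slaves.sh launches the worker over ssh, JAVA_HOME can also be set in the slave's conf/spark-env.sh (under /usr/local2/spark/conf here); Spark's launch scripts source that file themselves, so it works regardless of which shell startup files ssh reads.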

Original post: http://www.cnblogs.com/soyo/p/7930205.html
