hadoop - Can't start master and slave, strange thing named "bogon" in the log
I downloaded the new pre-built Spark for Hadoop 2.2 file. Following the documentation, I want to launch the master on a single machine. After untarring the file, I entered sbin and ran start-master, and hit a strange problem. Here is the log:
Spark Command: /Library/Java/JavaVirtualMachines/jdk1.7.0_55.jdk/Contents/Home/bin/java -cp :/opt/spark-0.9.0-incubating-bin-hadoop2/conf:/opt/spark-0.9.0-incubating-bin-hadoop2/assembly/target/scala-2.10/spark-assembly_2.10-0.9.0-incubating-hadoop2.2.0.jar -Dspark.akka.logLifecycleEvents=true -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip bogon --port 7077 --webui-port 8080
========================================
log4j:WARN No appenders could be found for logger (akka.event.slf4j.Slf4jLogger).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" org.jboss.netty.channel.ChannelException: Failed to bind to: bogon/125.211.213.133:7077
    at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:391)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:388)
    at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
What is "bogon"? And where does the IP 125.211.213.133 (which is not my IP) come from? What is the problem here?
"bogon" comes command line provided. forgot replace parameter --ip
local ip of host.
When using sbin/start-master.sh, if no IP is provided, the hostname reported by the machine is used (from start-master.sh):
if [ "$spark_master_ip" = "" ]; spark_master_ip=`hostname` fi
If the reported hostname is not right, you can tell Spark which IP to use by setting an environment variable:
SPARK_MASTER_IP=172.17.0.1 start-master.sh
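Alternatively (a sketch, assuming the standard conf layout of the pre-built distribution; replace 172.17.0.1 with your host's actual local IP), you can persist the setting in conf/spark-env.sh so every invocation of the start scripts picks it up:

cd /opt/spark-0.9.0-incubating-bin-hadoop2
cp conf/spark-env.sh.template conf/spark-env.sh        # only if conf/spark-env.sh does not exist yet
echo 'export SPARK_MASTER_IP=172.17.0.1' >> conf/spark-env.sh
sbin/start-master.sh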