Thursday, November 1, 2018

Not all workers appear on the Spark master web UI

We have set up a Spark standalone cluster. Everything seems fine, but not all of the workers appear on the Spark master web UI.

We start Spark as follows:

[hadoop@master spark-2.0.2-bin-hadoop2.7]$ sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /home/hadoop/spark-2.0.2-bin-hadoop2.7/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-master.telegis.out
master.telegis: starting org.apache.spark.deploy.worker.Worker, logging to /home/hadoop/spark-2.0.2-bin-hadoop2.7/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-master.telegis.out
slave1.telegis: starting org.apache.spark.deploy.worker.Worker, logging to /home/hadoop/spark-2.0.2-bin-hadoop2.7/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-slave1.telegis.out
slave2.telegis: starting org.apache.spark.deploy.worker.Worker, logging to /home/hadoop/spark-2.0.2-bin-hadoop2.7/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-slave2.telegis.out

A Worker process shows up in the output of the jps command on every node.
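One way to check which workers actually registered with the master, independent of the web page, is to query the master's JSON endpoint (http://master.telegis:8080/json on a default standalone setup) and filter the workers on their state field. A minimal sketch; the sample dict below is illustrative of the response shape, not actual output from this cluster:

```python
import json

def alive_workers(master_json: dict) -> list:
    """Return the hosts of workers the master considers ALIVE."""
    return [w["host"] for w in master_json.get("workers", [])
            if w.get("state") == "ALIVE"]

# Sample response shaped like the master's /json output
# (hostnames taken from the cluster above; states are illustrative).
sample = {
    "url": "spark://master.telegis:7077",
    "workers": [
        {"host": "master.telegis", "state": "ALIVE"},
        {"host": "slave1.telegis", "state": "ALIVE"},
        {"host": "slave2.telegis", "state": "DEAD"},
    ],
}

print(alive_workers(sample))  # → ['master.telegis', 'slave1.telegis']
```

A worker that runs (visible in jps) but never registers, or registers under the wrong address, will be missing or DEAD in this list even though its process is alive.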

The Spark master web UI shows (screenshot not reproduced) that only some of the workers are listed.

In addition, if we submit a job in yarn-cluster mode, only one worker does any work.
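For the YARN case, note that a yarn-cluster submission does not go through the standalone workers at all: executors are requested from YARN NodeManagers, and with a small default executor count the whole job can land on a single node. A sketch of composing a submission that explicitly asks for multiple executors (the flag names are the standard spark-submit ones; app.py and the resource sizes are placeholders, not taken from the post):

```python
# Compose a spark-submit invocation for yarn-cluster mode.
# --num-executors controls how many executors YARN allocates,
# which in turn determines how many nodes can receive work.
def yarn_submit_cmd(app: str, num_executors: int) -> list:
    return [
        "spark-submit",
        "--master", "yarn",
        "--deploy-mode", "cluster",
        "--num-executors", str(num_executors),  # spread work across nodes
        "--executor-cores", "2",                # placeholder sizing
        "--executor-memory", "2g",              # placeholder sizing
        app,
    ]

print(" ".join(yarn_submit_cmd("app.py", 3)))
```

Whether this helps depends on the cluster: if only one NodeManager is healthy, YARN will still schedule everything on that node.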



