How to get a spark job's metrics?


You can get Spark job metrics from the Spark History Server, which displays information about:
- A list of scheduler stages and tasks
- A summary of RDD sizes and memory usage
- Environmental information
- Information about the running executors
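
Once the History Server is running (see the steps below), the same information is also exposed through its REST API, which is handy if you want to collect the metrics programmatically. A minimal sketch, assuming the default port 18080 and placeholder host name and application ID:

# List the applications known to the History Server (default port 18080)
curl http://history-server-host:18080/api/v1/applications

# Per-job, per-stage, and per-executor metrics for one application;
# app-20170101000000-0001 is a placeholder application ID
curl http://history-server-host:18080/api/v1/applications/app-20170101000000-0001/jobs
curl http://history-server-host:18080/api/v1/applications/app-20170101000000-0001/stages
curl http://history-server-host:18080/api/v1/applications/app-20170101000000-0001/executors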

1. Set spark.eventLog.enabled to true before starting the Spark application. This configures Spark to log Spark events to persistent storage.
2. Set spark.history.fs.logDirectory to the directory that contains the application event logs to be loaded by the History Server.
3. Start the History Server by executing: ./sbin/start-history-server.sh
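
For example, steps 1 and 2 are typically set in conf/spark-defaults.conf before the application is submitted. The hdfs:///spark-logs path below is a placeholder; both spark.eventLog.dir (where applications write their event logs) and spark.history.fs.logDirectory (where the History Server reads them) should point to the same existing directory:

# conf/spark-defaults.conf
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs:///spark-logs
spark.history.fs.logDirectory    hdfs:///spark-logs

# Create the log directory once, then start the History Server
hdfs dfs -mkdir -p /spark-logs
./sbin/start-history-server.sh

The History Server web UI is then available on port 18080 by default.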

Please refer to the following link for more information:
http://spark.apache.org/docs/latest/monitoring.html