
Terminating a Spark step in AWS


That's easy:

yarn application -kill [application id]

You can list your running applications with:

yarn application -list
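
For example, a typical sequence on the cluster's master node is to list the running applications, copy the application id from the output, and pass it to the kill command (the application id and job name below are just placeholders):

yarn application -list
# prints one line per application, roughly:
# application_1492234567890_0001  my-spark-job  SPARK  hadoop  default  RUNNING  ...
yarn application -kill application_1492234567890_0001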


You can also kill the application from the Resource Manager web UI (in the links at the top right under cluster status). In the Resource Manager, click on the application you want to kill, and on the application's page there is a small "kill" link (top left) you can click to kill the application.

Obviously you can also SSH into the cluster and run the yarn commands above, but this way is faster and easier for some users.
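
If you do go the SSH route, a minimal sketch would be to connect to the EMR master node and run the same yarn command there (the key file, master node DNS, and application id are placeholders for your own cluster's values):

ssh -i ~/my-key.pem hadoop@ec2-xx-xx-xxx-xx.compute-1.amazonaws.com
yarn application -kill application_1492234567890_0001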