How to use two versions of spark shell?

Tags: hadoop
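
A minimal sketch of one common approach, assuming the two Spark distributions are unpacked into separate, hypothetical directories such as /opt/spark-2.4.8 and /opt/spark-3.5.1 (version numbers and paths are illustrative only): point SPARK_HOME at the installation you want and invoke that installation's own bin/spark-shell.

```bash
# Sketch under assumed install paths; each Spark release is self-contained,
# so two unpacked distributions can live side by side.

# Run the shell from the first installation:
export SPARK_HOME=/opt/spark-2.4.8            # assumed location of version A
"$SPARK_HOME/bin/spark-shell" --version       # prints the version and exits

# Run the shell from the second installation, only SPARK_HOME changes:
export SPARK_HOME=/opt/spark-3.5.1            # assumed location of version B
"$SPARK_HOME/bin/spark-shell" --version

# Optional convenience aliases, e.g. in ~/.bashrc:
alias spark-shell-2='SPARK_HOME=/opt/spark-2.4.8 /opt/spark-2.4.8/bin/spark-shell'
alias spark-shell-3='SPARK_HOME=/opt/spark-3.5.1 /opt/spark-3.5.1/bin/spark-shell'
```

If both shells should talk to the same Hadoop cluster, both installations can point HADOOP_CONF_DIR at the same Hadoop configuration directory; whichever spark-shell you launch then picks up that cluster configuration.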