Hadoop's HDFS with Spark
Apache Spark is independent of Hadoop. Spark can read from many different data sources (including HDFS) and can run either on its own standalone cluster or on an existing resource management framework (e.g., YARN or Mesos).
So if you're only interested in Spark, there is no need to install Hadoop.