
Apache Spark: Memory Management and Graceful Degradation

Many of the concepts of Apache Spark are straightforward and easy to understand, but a few are badly misunderstood. One of the greatest misunderstandings of all is the belief that “Spark is only relevant for datasets that fit into memory; otherwise it will crash”.

This misconception is understandable, since Spark is often described as “Hadoop, but using RAM more efficiently”. It is nevertheless a mistake: when a dataset does not fit into memory, Spark does not crash, it degrades gracefully by spilling data to disk.