What is Executor Memory in a Spark application?

Every executor in a Spark application has the same fixed heap size and the same fixed number of cores. The heap size is referred to as the Spark executor memory, which is controlled with the spark.executor.memory property or the --executor-memory flag of spark-submit. In Spark's standalone mode, an application by default launches one executor on each worker node, so the executor memory is essentially a measure of how much of the worker node's memory the application utilises.
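As a minimal sketch (assuming Spark's Scala API and an illustrative value of 4g, chosen here only as an example), the executor memory can be set programmatically through SparkConf when building the session:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Set the executor heap size via the spark.executor.memory property.
// The "4g" value is only an example; size it according to the worker node's RAM.
val conf = new SparkConf()
  .setAppName("ExecutorMemoryExample")
  .set("spark.executor.memory", "4g")

val spark = SparkSession.builder().config(conf).getOrCreate()

The same setting can equally be passed at submission time, for example spark-submit --executor-memory 4g application.jar, where application.jar stands for your packaged Spark application.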