SparkContext is the entry point to Spark's core functionality. It represents the driver program's connection to the cluster, which it accesses through a cluster manager such as YARN or Spark's own standalone cluster manager. SparkContext runs in the driver of the Spark application and is used to create RDDs, accumulators, and broadcast variables on that cluster.
Only one SparkContext may be active per JVM; hence, it is important to call the stop() method on the current SparkContext before creating another one.
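As a minimal sketch in Scala, the following shows a SparkContext being created, used to build an RDD, an accumulator, and a broadcast variable, and then stopped. The application name and the local master URL are illustrative assumptions; in a real deployment the master would point at the cluster manager (for example, "yarn").

import org.apache.spark.{SparkConf, SparkContext}

// Illustrative configuration: "ExampleApp" and "local[*]" are placeholders.
val conf = new SparkConf()
  .setAppName("ExampleApp")
  .setMaster("local[*]")

val sc = new SparkContext(conf)

// Create an RDD, an accumulator, and a broadcast variable on the cluster.
val rdd    = sc.parallelize(1 to 100)
val acc    = sc.longAccumulator("errorCount")
val lookup = sc.broadcast(Map("a" -> 1, "b" -> 2))

println(rdd.sum())

// Stop this context before creating another SparkContext in the same JVM.
sc.stop()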