Local Mode
Spark local mode is also called pseudo-cluster mode, because a single JVM plays the roles of driver, executor and master. It is convenient for development and testing, but should never be used in production.
// master URL formats
local    -> runs with a single worker thread (no parallelism)
local[*] -> uses as many worker threads as there are logical cores available to the JVM
local[n] -> uses n worker threads (ideally, set n to the number of cores available to the JVM)
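A quick way to confirm how many worker threads a given master URL provides is to inspect the default parallelism of the resulting context. The snippet below is a minimal sketch; the master value local[2] and the app name are arbitrary choices for illustration.

// Sketch: start a local session and check its parallelism
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("local-mode-check")   // arbitrary app name
  .master("local[2]")            // 2 worker threads, chosen for illustration
  .getOrCreate()

// In local mode this reflects the number of worker threads,
// e.g. 2 for local[2], or the number of logical cores for local[*]
println(spark.sparkContext.defaultParallelism)

spark.stop()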
Setting the Spark master in local mode
// Spark Submit
spark-submit --class <MainClassOfYourApp> --master local[*] <path-to-your-jar-file>
// Programmatic
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("your-app-name")
  .master("local[*]")
  .getOrCreate()
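Once the session is created, a quick smoke test confirms that jobs run inside the single local JVM. This is a minimal sketch using the spark value defined above; the range size of 100 is arbitrary.

// Run a trivial job on the local session, then release its resources
println(spark.range(100).count())  // prints 100
spark.stop()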