scala - Increase Spark memory when using local[*]


How do I increase Spark memory when using local[*]?

I tried setting the memory like this:

  val conf = new SparkConf()
    .set("spark.executor.memory", "1g")
    .set("spark.driver.memory", "4g")
    .setMaster("local[*]")
    .setAppName("MyApp")

but I still get:

  MemoryStore: MemoryStore started with capacity 524.1 MB

Does it have something to do with:

  .setMaster("local[*]")
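It does: with `local[*]` the executor runs inside the driver JVM, so `spark.executor.memory` has no effect, and the driver's heap is fixed before your code runs, so setting `spark.driver.memory` from inside the program is too late. A minimal sketch (object name is illustrative, not from the original) showing how to confirm the heap the JVM actually got, which is the ceiling the MemoryStore works within:

```scala
// Sketch: in local[*] the executor lives in the driver JVM, so the only
// memory that matters is the heap that JVM was started with
// (e.g. via spark-submit --driver-memory 4g, or `sbt -mem 4096`).
object HeapCheck {
  def main(args: Array[String]): Unit = {
    // Runtime.maxMemory reports the heap available to this JVM;
    // setting "spark.driver.memory" after startup cannot raise it.
    val maxHeapMb = Runtime.getRuntime.maxMemory / (1024L * 1024L)
    println(s"JVM max heap: $maxHeapMb MB")
  }
}
```

If the printed heap is still the default (~1 GB), any in-code memory setting is being ignored and the flag has to move to the launch command.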

I was able to solve this by running SBT with:

sbt -mem 4096 

However, the MemoryStore is only about half the size of the heap. Still looking into where that fraction is configured.
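That "half" is consistent with Spark 1.x's defaults, assuming that is the version in play: storage capacity is roughly heap × `spark.storage.memoryFraction` (default 0.6) × `spark.storage.safetyFraction` (default 0.9), i.e. about 0.54 of the heap. A hedged arithmetic sketch (object and method names are illustrative):

```scala
// Sketch of the Spark 1.x storage-memory arithmetic. The 0.6 and 0.9
// defaults are the documented values of spark.storage.memoryFraction
// and spark.storage.safetyFraction; heap figures are assumptions.
object StorageFraction {
  def storageCapacityMb(heapMb: Double,
                        memoryFraction: Double = 0.6,
                        safetyFraction: Double = 0.9): Double =
    heapMb * memoryFraction * safetyFraction

  def main(args: Array[String]): Unit = {
    // ~970 MB of usable heap on a default ~1 GB JVM gives ≈ 524 MB,
    // which lines up with the "524.1 MB" log line in the question.
    println(f"~1 GB heap -> ${storageCapacityMb(970)}%.1f MB")
    // With sbt -mem 4096: 0.54 × 4096 ≈ 2212 MB, roughly half of 4 GB.
    println(f" 4 GB heap -> ${storageCapacityMb(4096)}%.1f MB")
  }
}
```

So the MemoryStore being about half the heap is expected behavior, not a misconfiguration; raising `spark.storage.memoryFraction` would change it, at the cost of memory available for execution.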

