scala - Increase Spark memory when using local[*] -
How do I increase Spark memory when using local[*]?
I tried setting the memory like this:
val conf = new SparkConf()
  .set("spark.executor.memory", "1g")
  .set("spark.driver.memory", "4g")
  .setMaster("local[*]")
  .setAppName("MyApp")
but I still get:
MemoryStore: MemoryStore started with capacity 524.1 MB
Does this have something to do with:
.setMaster("local[*]")
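It does, indirectly. In local[*] mode the executor runs inside the driver JVM, so spark.executor.memory has no effect, and spark.driver.memory is only read when the JVM is launched (e.g. by spark-submit), so setting it on SparkConf after the application has started does nothing. A minimal sketch to confirm this, assuming only the standard JVM Runtime API (no Spark needed):

```scala
// Prints the JVM's actual max heap. If SparkConf's
// "spark.driver.memory" had taken effect, this would be ~4 GB;
// with a default sbt/JVM launch it is far smaller, which is why
// the MemoryStore capacity in the log stays at ~524 MB.
object HeapCheck {
  def maxHeapMb: Long = Runtime.getRuntime.maxMemory / (1024L * 1024L)

  def main(args: Array[String]): Unit =
    println(s"JVM max heap: ${maxHeapMb} MB")
}
```

Since the heap is fixed at launch, the memory has to be raised from outside the program: via spark-submit's --driver-memory flag, or (as below) by giving sbt a bigger heap.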
I was able to solve this by running SBT with:
sbt -mem 4096
However, the MemoryStore is only about half that size, and I'm still looking into where that fraction comes from.
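The "half" is explainable if this is the legacy Spark memory manager (pre-1.6), whose defaults are spark.storage.memoryFraction = 0.6 and spark.storage.safetyFraction = 0.9, giving the MemoryStore roughly 54% of the heap (newer Spark versions use a different unified-memory formula, so the exact numbers are an assumption here). A quick sketch of the arithmetic:

```scala
// Estimate the MemoryStore capacity under the assumed legacy
// defaults: memoryFraction (0.6) * safetyFraction (0.9) = 0.54.
object MemoryStoreEstimate {
  def storeCapacityMb(heapMb: Double): Double =
    heapMb * 0.6 * 0.9 // ~54% of the heap is given to the MemoryStore

  def main(args: Array[String]): Unit = {
    // sbt -mem 4096 gives a ~4096 MB heap, so the MemoryStore lands
    // at roughly 2.2 GB -- "about half", matching what was observed.
    println(f"estimated MemoryStore: ${storeCapacityMb(4096)}%.1f MB")
  }
}
```

Working backwards, the original 524.1 MB log line implies a heap of about 524.1 / 0.54 ≈ 970 MB, i.e. the default ~1 GB JVM heap, which is consistent with the SparkConf settings having been ignored.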