java - Resource management on Spark standalone -


I'm running Spark 1.4.0 and have a cluster of 10 executors with 4 cores each (40 cores in total).

I have 5 applications (and more in the future) that I want to run, submitting them using a scheduler (each application runs every 2-5 hours):

- 2 applications are more important, and I want them to have 50% of the resources
- 2 applications should run with 25% of the resources
- 1 application should run with 10% of the resources

The total number of cores is currently 40, but it might change from time to time if I add more slaves, and I don't want to change the submit script every time I add a slave.

I'm not sure how to configure the spark-submit call so that it won't give me this message:

org.apache.spark.scheduler.TaskSchedulerImpl - Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

Any ideas, anyone?

From the current (1.4.0) Spark documentation:

The standalone cluster mode currently only supports a simple FIFO scheduler across applications. However, to allow multiple concurrent users, you can control the maximum number of resources each application will use. By default, it will acquire all cores in the cluster, which only makes sense if you run just one application at a time. You can cap the number of cores by setting spark.cores.max in your SparkConf.

If you want more advanced control, you may wish to consider using YARN or Mesos.
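For example, a minimal sketch (assuming the 40-core total and the 50% share from the question; the application and class names are placeholders) of capping one application's cores via SparkConf. The same property can also be passed to spark-submit with --conf spark.cores.max=20, so the cap can live outside the application code if you prefer:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class ImportantApp {
    public static void main(String[] args) {
        // Cap this application at 20 of the cluster's 40 cores (the 50% share).
        // Equivalent command-line form:
        //   spark-submit --conf spark.cores.max=20 ...
        SparkConf conf = new SparkConf()
                .setAppName("important-app")
                .set("spark.cores.max", "20");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // ... job logic here ...

        sc.stop();
    }
}

The less important applications would get proportionally smaller values (e.g. 10 and 4 cores for the 25% and 10% shares). Note that spark.cores.max is an absolute number, not a fraction, so these values would still need updating if the cluster size changes.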

