Class SparkExecutionContext.SparkClusterConfig

- java.lang.Object
  - org.apache.sysds.runtime.controlprogram.context.SparkExecutionContext.SparkClusterConfig

Enclosing class:
- SparkExecutionContext

public static class SparkExecutionContext.SparkClusterConfig
extends Object

Captures relevant Spark cluster configuration properties, e.g., memory budgets and degree of parallelism. This configuration abstracts legacy (< Spark 1.6) and current configurations and provides a unified view.
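The legacy/current split corresponds to Spark 1.6's move to unified memory management, where `spark.memory.fraction` replaced the older `spark.storage.memoryFraction`/`spark.shuffle.memoryFraction` split. As an illustration only (the class below, its property defaults, and the budget formula are assumptions for the sketch, not SystemDS's actual implementation), a data-memory budget might be derived from Spark-style configuration properties like this:

```java
// Hypothetical sketch only: class name, defaults, and formula are
// illustrative assumptions, not SystemDS's actual implementation.
import java.util.HashMap;
import java.util.Map;

public class ClusterConfigSketch {
    private final Map<String, String> conf = new HashMap<>();

    public void set(String key, String value) {
        conf.put(key, value);
    }

    /** Parses Spark-style size strings such as "4g", "512m", "64k" into bytes. */
    public static long parseMemory(String s) {
        s = s.trim().toLowerCase();
        char unit = s.charAt(s.length() - 1);
        long mult = 1;
        if (unit == 'g')      mult = 1024L * 1024 * 1024;
        else if (unit == 'm') mult = 1024L * 1024;
        else if (unit == 'k') mult = 1024L;
        String num = Character.isDigit(unit) ? s : s.substring(0, s.length() - 1);
        return Long.parseLong(num) * mult;
    }

    /**
     * Unified memory model (Spark >= 1.6): aggregate executor memory
     * scaled by spark.memory.fraction (default 0.6).
     */
    public long getDataMemoryBudget() {
        long execMem = parseMemory(conf.getOrDefault("spark.executor.memory", "1g"));
        int numExec  = Integer.parseInt(conf.getOrDefault("spark.executor.instances", "1"));
        double frac  = Double.parseDouble(conf.getOrDefault("spark.memory.fraction", "0.6"));
        return (long) (numExec * execMem * frac);
    }

    public static void main(String[] args) {
        ClusterConfigSketch cfg = new ClusterConfigSketch();
        cfg.set("spark.executor.memory", "4g");
        cfg.set("spark.executor.instances", "8");
        System.out.println(cfg.getDataMemoryBudget()); // ~0.6 of 32 GiB
    }
}
```

Caching such derived values explains the `refresh` flags on the getters below: the budget is computed once and recomputed only on demand.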
Constructor Summary

Constructors:
- SparkClusterConfig()

Method Summary

All Methods / Instance Methods / Concrete Methods:
- void analyzeSparkConfiguation(org.apache.spark.SparkConf conf)
- void analyzeSparkConfiguationLegacy(org.apache.spark.SparkConf conf)
- long getBroadcastMemoryBudget()
- long getDataMemoryBudget(boolean min, boolean refresh)
- int getDefaultParallelism(boolean refresh)
- int getNumExecutors()
- String toString()
Method Detail

getBroadcastMemoryBudget
public long getBroadcastMemoryBudget()

getDataMemoryBudget
public long getDataMemoryBudget(boolean min, boolean refresh)

getNumExecutors
public int getNumExecutors()

getDefaultParallelism
public int getDefaultParallelism(boolean refresh)

analyzeSparkConfiguationLegacy
public void analyzeSparkConfiguationLegacy(org.apache.spark.SparkConf conf)

analyzeSparkConfiguation
public void analyzeSparkConfiguation(org.apache.spark.SparkConf conf)
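A degree-of-parallelism getter like getDefaultParallelism plausibly mirrors Spark's own behavior: when `spark.default.parallelism` is unset, coarse-grained scheduler backends fall back to max(total executor cores, 2). A minimal sketch of that rule (the fallback logic and property defaults here are assumptions modeled on Spark's documented behavior, not SystemDS's exact code):

```java
// Illustrative sketch: the property keys are real Spark configuration
// names, but the derivation below is an assumption, not SystemDS's code.
import java.util.HashMap;
import java.util.Map;

public class ParallelismSketch {
    public static int defaultParallelism(Map<String, String> conf) {
        // An explicit setting wins, as in Spark itself.
        if (conf.containsKey("spark.default.parallelism"))
            return Integer.parseInt(conf.get("spark.default.parallelism"));
        // Otherwise derive total cores as executors x cores per executor,
        // with Spark's documented floor of 2.
        int executors = Integer.parseInt(conf.getOrDefault("spark.executor.instances", "1"));
        int cores = Integer.parseInt(conf.getOrDefault("spark.executor.cores", "1"));
        return Math.max(executors * cores, 2);
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put("spark.executor.instances", "4");
        conf.put("spark.executor.cores", "8");
        System.out.println(defaultParallelism(conf)); // prints 32
    }
}
```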