- All Superinterfaces:
- java.util.function.Function<org.apache.spark.SparkConf,org.apache.spark.api.java.JavaSparkContext>
- All Known Implementing Classes:
- DefaultSparkContextProvider
public interface SparkContextProvider
extends java.util.function.Function<org.apache.spark.SparkConf,org.apache.spark.api.java.JavaSparkContext>
A Function that builds a JavaSparkContext from a given SparkConf.
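The interface is just a `Function<SparkConf, JavaSparkContext>`, so an implementation supplies an `apply` method that turns a configuration into a context. The sketch below models that shape; the `SparkConf` and `JavaSparkContext` classes here are minimal stand-ins (assumptions, so the example compiles without Spark on the classpath), and `LocalSparkContextProvider` is a hypothetical implementation, not part of the library.

```java
import java.util.function.Function;

// Stand-in for org.apache.spark.SparkConf (assumption: real class has
// fluent setMaster/setAppName; only the pieces needed here are modeled).
class SparkConf {
    private String master = "local[*]";
    private String appName = "app";
    SparkConf setMaster(String m) { this.master = m; return this; }
    SparkConf setAppName(String n) { this.appName = n; return this; }
    String getMaster() { return master; }
}

// Stand-in for org.apache.spark.api.java.JavaSparkContext.
class JavaSparkContext {
    final SparkConf conf;
    JavaSparkContext(SparkConf conf) { this.conf = conf; }
}

// The shape documented above: a Function from SparkConf to JavaSparkContext.
interface SparkContextProvider extends Function<SparkConf, JavaSparkContext> {
}

// Hypothetical implementation: pin the master to a local cluster
// (e.g. for tests) before constructing the context.
class LocalSparkContextProvider implements SparkContextProvider {
    @Override
    public JavaSparkContext apply(SparkConf conf) {
        return new JavaSparkContext(conf.setMaster("local[2]"));
    }
}

public class Main {
    public static void main(String[] args) {
        SparkContextProvider provider = new LocalSparkContextProvider();
        JavaSparkContext ctx = provider.apply(new SparkConf().setAppName("demo"));
        System.out.println(ctx.conf.getMaster()); // prints local[2]
    }
}
```

Because the interface extends `java.util.function.Function`, a lambda such as `SparkContextProvider p = conf -> new JavaSparkContext(conf);` also satisfies it.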