Interface SparkContextProvider

  • All Superinterfaces:
    java.util.function.Function<org.apache.spark.SparkConf, org.apache.spark.api.java.JavaSparkContext>

  • All Known Implementing Classes:
    DefaultSparkContextProvider

    public interface SparkContextProvider
    extends java.util.function.Function<org.apache.spark.SparkConf, org.apache.spark.api.java.JavaSparkContext>

    A Function that produces a JavaSparkContext from a given SparkConf.
    • Method Summary

      • Methods inherited from interface java.util.function.Function

        andThen, apply, compose
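
The interface adds no methods of its own; it simply fixes the type parameters of java.util.function.Function, so any lambda or method reference from a configuration to a context satisfies it. The sketch below illustrates that shape with self-contained stand-in classes (Conf and Context are hypothetical substitutes for org.apache.spark.SparkConf and org.apache.spark.api.java.JavaSparkContext, used here only so the example compiles without a Spark dependency):

```java
import java.util.function.Function;

// Hypothetical stand-in for org.apache.spark.SparkConf.
class Conf {
    private final String appName;
    Conf(String appName) { this.appName = appName; }
    String appName() { return appName; }
}

// Hypothetical stand-in for org.apache.spark.api.java.JavaSparkContext.
class Context {
    private final Conf conf;
    Context(Conf conf) { this.conf = conf; }
    Conf conf() { return conf; }
}

// Mirrors the shape of SparkContextProvider: an interface that only
// specializes Function's type parameters (configuration in, context out).
interface ContextProvider extends Function<Conf, Context> {}

public class ProviderExample {
    public static void main(String[] args) {
        // Because the interface declares no extra methods, a constructor
        // reference is already a complete provider implementation,
        // analogous to what a DefaultSparkContextProvider might do.
        ContextProvider provider = Context::new;
        Context ctx = provider.apply(new Conf("demo-app"));
        System.out.println(ctx.conf().appName());
    }
}
```

A caller that needs a context never constructs one directly; it takes a provider and calls apply(conf), which also makes the context creation easy to swap out in tests.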