case class GlobalConfig(kryoClasses: Option[Seq[String]] = None, sparkOptions: Option[Map[String, String]] = None, enableHive: Boolean = true, memoryLogTimer: Option[MemoryLogTimerConfig] = None, shutdownHookLogger: Boolean = false, stateListeners: Seq[StateListenerConfig] = Seq(), sparkUDFs: Option[Map[String, SparkUDFCreatorConfig]] = None, pythonUDFs: Option[Map[String, PythonUDFCreatorConfig]] = None, secretProviders: Option[Map[String, SecretProviderConfig]] = None, allowOverwriteAllPartitionsWithoutPartitionValues: Seq[DataObjectId] = Seq()) extends SmartDataLakeLogger with Product with Serializable

Global configuration options

kryoClasses

Classes to register for Spark Kryo serialization.

sparkOptions

Options to set on the Spark session.

enableHive

Enable Hive support for the Spark session.

memoryLogTimer

Enable periodic memory usage logging; see MemoryLogTimerConfig for detailed configuration options.

shutdownHookLogger

Enable a shutdown hook logger to trace the cause of a shutdown.

stateListeners

State listeners to register for receiving events during the execution of a SmartDataLake job.

sparkUDFs

UDFs to register in the Spark session. Registered UDFs are available in Spark SQL transformations and in expression evaluation, e.g. in the configuration of ExecutionModes.

pythonUDFs

Python UDFs to register in the Spark session. Registered Python UDFs are available in Spark SQL transformations, but not in expression evaluation.

secretProviders

SecretProviders to register.

allowOverwriteAllPartitionsWithoutPartitionValues

List of exceptions for partitioned DataObject ids that are allowed to overwrite all partitions of a table if no partition values are set. This is used to override/avoid the protective error raised when using SDLSaveMode.OverwriteOptimized or SDLSaveMode.OverwritePreserveDirectories. Define it as a list of DataObject ids.
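To illustrate how these parameters fit together, a hypothetical sketch of constructing a GlobalConfig directly in Scala (in practice these values usually come from the `global` section of the SDLB HOCON configuration; the class name and DataObject id below are made up for illustration):

```scala
// Hypothetical sketch: a GlobalConfig using a few of the options described above.
val config = GlobalConfig(
  // classes to register for Spark Kryo serialization (example class name)
  kryoClasses = Some(Seq("org.example.MyEvent")),
  // options applied to the Spark session
  sparkOptions = Some(Map("spark.sql.shuffle.partitions" -> "2")),
  enableHive = false,
  // allow overwriting all partitions of this (hypothetical) DataObject
  // without specifying partition values
  allowOverwriteAllPartitionsWithoutPartitionValues = Seq(DataObjectId("my-partitioned-table"))
)
```

All other parameters keep their defaults (e.g. no memory log timer, no state listeners, no UDFs).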

Linear Supertypes
Serializable, Serializable, Product, Equals, SmartDataLakeLogger, AnyRef, Any

Instance Constructors

  1. new GlobalConfig(kryoClasses: Option[Seq[String]] = None, sparkOptions: Option[Map[String, String]] = None, enableHive: Boolean = true, memoryLogTimer: Option[MemoryLogTimerConfig] = None, shutdownHookLogger: Boolean = false, stateListeners: Seq[StateListenerConfig] = Seq(), sparkUDFs: Option[Map[String, SparkUDFCreatorConfig]] = None, pythonUDFs: Option[Map[String, PythonUDFCreatorConfig]] = None, secretProviders: Option[Map[String, SecretProviderConfig]] = None, allowOverwriteAllPartitionsWithoutPartitionValues: Seq[DataObjectId] = Seq())


Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. val allowOverwriteAllPartitionsWithoutPartitionValues: Seq[DataObjectId]
  5. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  6. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  7. def createSparkSession(appName: String, master: Option[String], deployMode: Option[String] = None): SparkSession

Create a Spark session using the settings from this global configuration.
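As a hypothetical usage sketch (the application name and master URL below are assumptions, not part of this API's documentation):

```scala
// Hypothetical sketch: derive a SparkSession from a GlobalConfig.
val globalConfig = GlobalConfig(
  sparkOptions = Some(Map("spark.sql.shuffle.partitions" -> "2")),
  enableHive = false
)

// createSparkSession applies sparkOptions, kryoClasses and enableHive
// to the session builder before creating the session.
val session = globalConfig.createSparkSession(
  appName = "myJob",
  master = Some("local[*]")   // deployMode defaults to None
)
```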

  8. val enableHive: Boolean
  9. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  10. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  12. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  13. val kryoClasses: Option[Seq[String]]
  14. lazy val logger: Logger
    Attributes
    protected
    Definition Classes
    SmartDataLakeLogger
    Annotations
    @transient()
  15. val memoryLogTimer: Option[MemoryLogTimerConfig]
  16. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  17. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  18. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  19. val pythonUDFs: Option[Map[String, PythonUDFCreatorConfig]]
  20. val secretProviders: Option[Map[String, SecretProviderConfig]]
  21. val shutdownHookLogger: Boolean
  22. val sparkOptions: Option[Map[String, String]]
  23. val sparkUDFs: Option[Map[String, SparkUDFCreatorConfig]]
  24. val stateListeners: Seq[StateListenerConfig]
  25. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  26. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  27. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  28. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
