Package io.smartdatalake.workflow.action.customlogic

package customlogic

Type Members

  1. trait CustomDfCreator extends Serializable

    Interface to define custom logic for DataFrame creation
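
    A minimal sketch of an implementation follows; the method name exec and its parameters are assumptions for illustration and are not documented on this page, so check the trait's members for the actual signature.

      import org.apache.spark.sql.{DataFrame, SparkSession}

      class MyDfCreator extends CustomDfCreator {
        // Hypothetical method: name and parameters are assumptions, not taken from this page.
        def exec(session: SparkSession, config: Map[String, String]): DataFrame = {
          import session.implicits._
          // Build a small DataFrame; a real creator would typically read an external source.
          Seq(("a", 1), ("b", 2)).toDF("key", "value")
        }
      }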

  2. case class CustomDfCreatorConfig(className: Option[String] = None, scalaFile: Option[String] = None, scalaCode: Option[String] = None, options: Option[Map[String, String]] = None) extends Product with Serializable

  3. class CustomDfCreatorWrapper extends CustomDfCreator

  4. trait CustomDfTransformer extends Serializable

    Interface to define a custom Spark-DataFrame transformation (1:1)
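
    A minimal sketch of a 1:1 transformer follows; the transform method name, its parameters and the key column used are assumptions for illustration, not taken from this page.

      import org.apache.spark.sql.{DataFrame, SparkSession}
      import org.apache.spark.sql.functions.col

      class DeduplicateTransformer extends CustomDfTransformer {
        // Hypothetical method: name and parameters are assumptions, not taken from this page.
        def transform(session: SparkSession, options: Map[String, String],
                      df: DataFrame, dataObjectId: String): DataFrame = {
          // Example 1:1 transformation: drop duplicates and keep rows with a non-null key column.
          df.dropDuplicates().filter(col(options.getOrElse("filterColumn", "key")).isNotNull)
        }
      }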

  5. case class CustomDfTransformerConfig(className: Option[String] = None, scalaFile: Option[String] = None, scalaCode: Option[String] = None, sqlCode: Option[String] = None, options: Map[String, String] = Map()) extends Product with Serializable

    Configuration of a custom Spark-DataFrame transformation between one input and one output (1:1)

    className: Optional class name from which to load the transformer code
    scalaFile: Optional file from which the Scala code for the transformation is loaded
    scalaCode: Optional Scala code for the transformation
    sqlCode: Optional SQL code for the transformation
    options: Options to pass to the transformation
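
    For illustration, the case class could be instantiated as follows; the SQL text, the referenced table name and the class name are placeholders, not values documented here.

      // Inline SQL variant; other parameters keep their defaults.
      val sqlTransformer = CustomDfTransformerConfig(
        sqlCode = Some("select key, count(*) as cnt from input group by key")
      )

      // Class-based variant with options passed through to the transformer.
      val classTransformer = CustomDfTransformerConfig(
        className = Some("com.example.DeduplicateTransformer"),
        options = Map("filterColumn" -> "key")
      )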

  6. trait CustomDfsTransformer extends Serializable

    Interface to define a custom Spark-DataFrame transformation (n:m). Similar to CustomDfTransformer, but supports multiple inputs and outputs.
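
    A sketch of an n:m transformer follows, assuming inputs and outputs are passed as maps keyed by DataObject id; the method signature and the ids used are assumptions for illustration.

      import org.apache.spark.sql.{DataFrame, SparkSession}

      class JoinTransformer extends CustomDfsTransformer {
        // Hypothetical method: name and parameter types are assumptions, not taken from this page.
        def transform(session: SparkSession, options: Map[String, String],
                      dfs: Map[String, DataFrame]): Map[String, DataFrame] = {
          // Example n:m transformation: join two inputs and emit one output, keyed by DataObject id.
          val joined = dfs("orders").join(dfs("customers"), "customerId")
          Map("ordersEnriched" -> joined)
        }
      }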

  7. case class CustomDfsTransformerConfig(className: Option[String] = None, scalaFile: Option[String] = None, scalaCode: Option[String] = None, sqlCode: Map[DataObjectId, String] = Map(), options: Map[String, String] = Map()) extends Product with Serializable

    Configuration of a custom Spark-DataFrame transformation between several inputs and outputs (n:m)

    className: Optional class name from which to load the transformer code
    scalaFile: Optional file from which the Scala code for the transformation is loaded
    scalaCode: Optional Scala code for the transformation
    sqlCode: Optional map of DataObjectId and corresponding SQL code
    options: Options to pass to the transformation
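
    An illustrative construction, assuming DataObjectId can be imported from io.smartdatalake.config.SdlConfigObject; the ids, SQL and options below are placeholders.

      import io.smartdatalake.config.SdlConfigObject.DataObjectId

      // sqlCode keys each output DataObjectId to the SQL statement producing it.
      val dfsTransformer = CustomDfsTransformerConfig(
        sqlCode = Map(
          DataObjectId("ordersEnriched") ->
            "select o.*, c.name from orders o join customers c on o.customerId = c.id"
        ),
        options = Map("runMode" -> "full")
      )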

  8. trait CustomFileTransformer extends Serializable

    Interface to define custom file transformation for CustomFileAction
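
    A sketch of a pass-through file transformer follows, assuming the trait hands over Hadoop file streams and the configured options; the method name, parameters and return type are assumptions, not documented on this page.

      import org.apache.hadoop.fs.{FSDataInputStream, FSDataOutputStream}
      import org.apache.hadoop.io.IOUtils

      class PassThroughFileTransformer extends CustomFileTransformer {
        // Hypothetical method: name, parameters and return type are assumptions, not taken from this page.
        def transform(options: Map[String, String],
                      input: FSDataInputStream, output: FSDataOutputStream): Option[Exception] = {
          try {
            // Copy the input file to the output unchanged; the buffer size is arbitrary.
            IOUtils.copyBytes(input, output, 4096, false)
            None
          } catch {
            case e: Exception => Some(e)
          }
        }
      }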

  9. case class CustomFileTransformerConfig(className: Option[String] = None, scalaFile: Option[String] = None, scalaCode: Option[String] = None, options: Map[String, String] = Map()) extends Product with Serializable

    Configuration of custom file transformation between one input and one output (1:1)

    className: Optional class name from which to load the transformer code
    scalaFile: Optional file from which the Scala code for the transformation is loaded
    scalaCode: Optional Scala code for the transformation
    options: Options to pass to the transformation

  10. class CustomFileTransformerWrapper extends CustomFileTransformer with SmartDataLakeLogger
