If you have problems with the getting started guide, note that there's a separate troubleshooting section for that.

Windows: missing winutils

Error: Could not locate executable null\bin\winutils.exe in the Hadoop binaries

The winutils.exe executable cannot be found.

  • Download the Hadoop winutils binaries.
  • Extract the binaries for the desired Hadoop version into a folder (e.g. hadoop-3.2.2\bin).
  • Set the HADOOP_HOME environment variable (e.g. HADOOP_HOME=...\hadoop-3.2.2). Note that the binary files need to be located at %HADOOP_HOME%\bin!
  • Add %HADOOP_HOME%\bin to the PATH variable (see the example below).
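
A minimal sketch of these steps on the Windows command line, assuming Hadoop 3.2.2 extracted to C:\hadoop-3.2.2 (both version and location are assumptions, adjust them to your setup):

  rem assumption: the winutils binaries were extracted to C:\hadoop-3.2.2\bin
  setx HADOOP_HOME "C:\hadoop-3.2.2"
  rem setx only affects new processes; also extend PATH for the current session
  set PATH=%PATH%;C:\hadoop-3.2.2\bin
  rem open a new terminal, then verify that winutils is found
  winutils.exe ls C:\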

Windows: /tmp/hive is not writable

RuntimeException: Error while running command to get file permissions
Change to %HADOOP_HOME%\bin and execute winutils chmod 777 /tmp/hive.
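
For example, assuming HADOOP_HOME is set up as described above and /tmp/hive resides on the current drive:

  cd /d %HADOOP_HOME%\bin
  winutils.exe chmod 777 /tmp/hive
  rem verify the new permissions
  winutils.exe ls /tmp/hive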

Windows: winutils.exe is not working correctly

winutils.exe - System Error The code execution cannot proceed because MSVCR100.dll was not found. Reinstalling the program may fix this problem.

Other errors are also possible:

  • Similar error message when double clicking on winutils.exe (Popup)
  • Errors when providing a path to the configuration instead of a single configuration file
  • ExitCodeException exitCode=-1073741515 when executing SDL even though everything ran without errors

Install the Microsoft Visual C++ Redistributable Package (x86 or x64).

Java IllegalAccessError (Java 17)

Symptom: Starting an SDLB pipeline fails with the following exception:

java.lang.IllegalAccessError: class$ (in unnamed module @0x343570b7) cannot access class (in module java.base) because module java.base does not export to unnamed module @0x343570b7

Solution: Java 17 is more restrictive regarding module exports, and unfortunately Spark uses classes from packages that are not exported. Packages can be exported manually: to fix the above exception, add --add-exports java.base/<package>=ALL-UNNAMED to the java command line (see also Stack Overflow).
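
For illustration, with Spark the package reported in this error is typically sun.nio.ch; that package name is an assumption here, so use whichever package your own error message mentions:

  # assumption: the error message names sun.nio.ch as the non-exported package
  java --add-exports java.base/sun.nio.ch=ALL-UNNAMED <your usual SDLB arguments>

If editing every launch command is impractical, the same option can also be passed through the JDK_JAVA_OPTIONS environment variable (supported since Java 9).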

Resources not copied

Tests fail due to missing or outdated resources, or the execution starts but cannot find the specified feeds. IntelliJ might not copy the resource files to the target directory.

Execute the Maven goal resources:resources (mvn resources:resources) manually after you change any resource file.
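
For example, run from the root of the affected module (the additional resources:testResources goal is only needed if test resources are outdated as well):

  # copies src/main/resources to target/classes
  mvn resources:resources
  # optionally also refresh test resources (copied to target/test-classes)
  mvn resources:testResources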

Maven compile error: tools.jar

Could not find artifact at specified path ...

Hadoop/Spark has a dependency on the tools.jar file which is installed as part of the JDK installation.

Possible Reasons:

  1. Your system does not have a JDK installed (only a JRE).
    • Fix: Make sure a JDK is installed and that your PATH and JAVA_HOME environment variables point to the JDK installation (see the check below).
  2. You are using JDK 9 or higher. tools.jar has been removed as of JDK 9.
    • Fix: Downgrade your JDK to Java 8.
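
A quick way to check which Java installation is being picked up:

  rem Windows: version on the PATH and location of JAVA_HOME
  java -version
  echo %JAVA_HOME%

  # Unix equivalent
  java -version
  echo $JAVA_HOME

A Java 8 JDK contains lib/tools.jar below JAVA_HOME; a plain JRE and JDK 9+ do not.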

How can I test Hadoop / HDFS locally?

Local Hadoop binaries are required when using local:// URIs, when working with file permissions on Windows, or for certain actions.

  1. Download your desired Apache Hadoop binary release.
  2. Extract the contents of the Hadoop distribution archive to a location of your choice, e.g., /path/to/hadoop (Unix) or C:\path\to\hadoop (Windows).
  3. Set the environment variable HADOOP_HOME=/path/to/hadoop (Unix) or HADOOP_HOME=C:\path\to\hadoop (Windows), as sketched after this list.
  4. Windows only: Download a Hadoop winutils distribution corresponding to your Hadoop version and extract the contents to %HADOOP_HOME%\bin.
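
A short sketch of steps 2 and 3, reusing the placeholder locations above (adjust them to where you actually extracted Hadoop):

  # Unix
  export HADOOP_HOME=/path/to/hadoop
  export PATH=$PATH:$HADOOP_HOME/bin
  hadoop version    # should print the version you extracted

  rem Windows
  setx HADOOP_HOME "C:\path\to\hadoop"
  rem after step 4, winutils.exe must be located in %HADOOP_HOME%\bin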