No Error Recorded in HAF Logs When an Error Occurs During Spark Execution
Symptom
An error occurs during Spark execution but no error is recorded in HAF logs. The Spark error information is as follows:
Failed to create task.
Key Process and Cause Analysis
The HAF_CONFIG_PATH environment variable is not configured for the Spark worker/executor processes.
Conclusion and Solution
- Open the Spark configuration file:
vi /usr/local/spark/conf/spark-defaults.conf
- Press i to enter insert mode, then add spark.executorEnv.HAF_CONFIG_PATH path to the file, where path is the etc/ directory under the installation path of HAF on the host node. For example:
spark.executorEnv.HAF_CONFIG_PATH /home/omm/omnidata-install/haf-host/etc/
- Press Esc, type :wq!, and press Enter to save the file and exit.
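As an alternative to editing the file interactively, the steps above can be scripted. The sketch below appends the setting only if it is not already present; the configuration and HAF paths are the examples from this guide and should be adjusted for your installation (the demo lines operate on a temporary copy so the snippet is safe to run as-is).

```shell
# Paths from this guide; adjust for your installation.
CONF="/usr/local/spark/conf/spark-defaults.conf"
HAF_ETC="/home/omm/omnidata-install/haf-host/etc/"

# Demo only: work on a temporary file instead of the real config.
# Remove the next line to edit the actual spark-defaults.conf.
CONF="$(mktemp)"

# Append the executor environment variable only if it is missing (idempotent).
grep -q '^spark.executorEnv.HAF_CONFIG_PATH' "$CONF" || \
  printf 'spark.executorEnv.HAF_CONFIG_PATH %s\n' "$HAF_ETC" >> "$CONF"

# Verify the setting was written.
grep '^spark.executorEnv.HAF_CONFIG_PATH' "$CONF"
```

Restart the Spark application after changing spark-defaults.conf so that new executors pick up the environment variable.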
Parent topic: Troubleshooting