Installing the Spark UDF Plugin
- Perform the operations in this section only when OmniOperator UDFs are used.
- Spark has been installed by following the instructions in the sections from Downloading the Spark Extension Plugin Package through Installing Spark.
- Place the JAR packages on which the UDFs depend in the /user/hive-udf directory of HDFS (see the first sketch after this list).
- The /user/hive-udf directory can be customized.
- You must provide the JAR packages on which the UDFs depend yourself.
- Register the Hive UDFs on the management node of the cluster, as shown in the second sketch below.
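The following is a minimal sketch of uploading a UDF dependency JAR to HDFS using the Hadoop FileSystem API. The local path /opt/udf-jars/my-udf.jar and the JAR name are placeholders for your own artifacts; the /user/hive-udf target directory is the default from this guide and can be customized. The same result can be achieved on the command line with `hdfs dfs -mkdir -p` and `hdfs dfs -put`.

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

object UploadUdfJars {
  def main(args: Array[String]): Unit = {
    // Load the cluster configuration (core-site.xml/hdfs-site.xml on the classpath).
    val fs = FileSystem.get(new Configuration())

    // Target directory in HDFS; /user/hive-udf is the default used in this guide.
    val udfDir = new Path("/user/hive-udf")
    if (!fs.exists(udfDir)) fs.mkdirs(udfDir)

    // Copy a locally built UDF JAR into HDFS.
    // The local path and JAR name are placeholders.
    fs.copyFromLocalFile(new Path("/opt/udf-jars/my-udf.jar"),
                         new Path(udfDir, "my-udf.jar"))

    fs.close()
  }
}
```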
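Registration of a Hive UDF can then be done from a Hive-enabled Spark session with a CREATE FUNCTION statement that points at the uploaded JAR. The sketch below assumes a hypothetical function name my_udf and implementing class com.example.udf.MyUdf; substitute your own.

```scala
import org.apache.spark.sql.SparkSession

object RegisterHiveUdf {
  def main(args: Array[String]): Unit = {
    // Hive support is required so that CREATE FUNCTION is persisted
    // in the Hive metastore rather than being session-local.
    val spark = SparkSession.builder()
      .appName("RegisterHiveUdf")
      .enableHiveSupport()
      .getOrCreate()

    // Register a permanent Hive UDF backed by the JAR uploaded to HDFS.
    // Function name, implementing class, and JAR path are placeholders.
    spark.sql(
      """CREATE FUNCTION my_udf
        |AS 'com.example.udf.MyUdf'
        |USING JAR 'hdfs:///user/hive-udf/my-udf.jar'""".stripMargin)

    // Quick sanity check that the function resolves and runs.
    spark.sql("SELECT my_udf('test')").show()

    spark.stop()
  }
}
```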
Parent topic: Using OmniOperator on Spark