(Optional) Installing the Spark UDF Plugin
The Spark UDF plugin is required only if OmniOperator UDFs are used.
Prerequisites
Spark has been installed by following the instructions in Installing Spark.
Installing the Plugin
- Upload the JAR packages on which the UDFs depend to the /user/hive-udf directory in HDFS.
- The /user/hive-udf directory can be customized.
- You must provide the JAR packages on which the UDFs depend.
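The upload step above can be sketched as follows. This is a minimal example assuming a working HDFS client on the node; the JAR file name my-hive-udf.jar is a placeholder for your own UDF JAR package, and the target directory matches the default /user/hive-udf from this guide.

```shell
# Create the HDFS directory if it does not exist (the path can be customized).
hdfs dfs -mkdir -p /user/hive-udf

# Upload the UDF dependency JAR; "my-hive-udf.jar" is a placeholder
# for the JAR package you provide yourself.
hdfs dfs -put my-hive-udf.jar /user/hive-udf/

# Verify that the JAR is present in HDFS.
hdfs dfs -ls /user/hive-udf
```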
- Register Hive UDFs on the management node of the cluster.
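The registration step can be performed with standard Hive DDL, for example from beeline or spark-sql on the management node. In this sketch, my_udf, com.example.udf.MyUDF, and the JAR path are placeholders; substitute your own function name, implementing class, and the HDFS location you chose in the previous step.

```sql
-- Register a permanent Hive UDF backed by a JAR stored in HDFS.
-- "my_udf" and "com.example.udf.MyUDF" are placeholders for your
-- function name and implementing class.
CREATE FUNCTION my_udf
  AS 'com.example.udf.MyUDF'
  USING JAR 'hdfs:///user/hive-udf/my-hive-udf.jar';
```

Once registered, the function can be called from Spark SQL queries by name, e.g. `SELECT my_udf(col) FROM tbl;`.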
Parent topic: Using on Spark