
(Optional) Installing the Spark UDF Plugin

You need to install the Spark UDF plugin only if you intend to use OmniOperator UDFs.

Prerequisites

Spark has been installed by following the instructions in Installing Spark.

Installing the Plugin

  1. Upload the JAR packages on which the UDFs depend to the /user/hive-udf directory in HDFS.
    • The /user/hive-udf directory can be replaced with a custom directory.
    • You must provide the JAR packages on which the UDFs depend.
  2. Register Hive UDFs on the management node of the cluster.

    For details about how to register UDFs, see Integration with Hive UDFs/UDAFs/UDTFs.
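As a sketch, the two steps above might look like the following when run on the cluster's management node. The JAR name my-udfs.jar, the function name my_udf, and the class com.example.MyUdf are placeholders; substitute your own artifacts, and refer to Integration with Hive UDFs/UDAFs/UDTFs for the authoritative registration procedure.

```shell
# Step 1: upload the dependency JARs to HDFS
# (the /user/hive-udf directory can be customized)
hdfs dfs -mkdir -p /user/hive-udf
hdfs dfs -put my-udfs.jar /user/hive-udf/

# Step 2: register the Hive UDF, for example via spark-sql;
# the function name, class, and JAR path below are placeholders
spark-sql -e "CREATE FUNCTION my_udf AS 'com.example.MyUdf' USING JAR 'hdfs:///user/hive-udf/my-udfs.jar';"
```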