
(Optional) Uninstalling the UDF Plugin

Unless otherwise stated, perform the following steps on the management node only.

If you no longer need the OmniOperator software after uninstalling the UDF plugin, uninstall OmniOperator by following the instructions in Uninstalling the Software.

  1. Delete the /opt/omni-operator/hive-udf directory on the management and compute nodes.
    rm -rf /opt/omni-operator/hive-udf
    
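Because the directory must be removed on every node, the deletion can be scripted from the management node. The sketch below is a hypothetical helper, not part of the product: it assumes passwordless SSH from the management node, and the compute-node hostnames are supplied by the caller.

```shell
# Sketch: delete the hive-udf directory locally and on each compute node.
# Assumes passwordless SSH from the management node to the compute nodes.
remove_udf_dir() {
    udf_dir="$1"
    shift
    rm -rf "$udf_dir"                     # management node (local)
    for node in "$@"; do                  # each compute node
        ssh "$node" "rm -rf '$udf_dir'"
    done
}
```

Example invocation (node names are placeholders): remove_udf_dir /opt/omni-operator/hive-udf agent1 agent2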
  2. In the /opt/omni-operator/conf/omni.conf file, update the following content:
    1. Open /opt/omni-operator/conf/omni.conf.
      vi /opt/omni-operator/conf/omni.conf
      
    2. Press i to enter the insert mode and update the UDF configuration.
      # <----UDF properties---->
      # false indicates row-by-row expression processing and true indicates batch expression processing.
      #enableBatchExprEvaluate=false
      # UDF trustlist file path
      #hiveUdfPropertyFilePath=./hive-udf/udf.properties
      # Hive UDF JAR file directory
      #hiveUdfDir=./hive-udf/udf
      
    3. Press Esc, type :wq!, and press Enter to save the file and exit.
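The vi edit above can also be performed non-interactively. The sketch below is a hypothetical helper that comments out the three UDF keys with sed; the key names are taken from the configuration content shown above.

```shell
# Sketch: comment out the three UDF properties in omni.conf without
# opening vi. The key names match the UDF properties block shown above.
disable_udf_properties() {
    conf_file="$1"                        # e.g. /opt/omni-operator/conf/omni.conf
    sed -i \
        -e 's/^enableBatchExprEvaluate=/#&/' \
        -e 's/^hiveUdfPropertyFilePath=/#&/' \
        -e 's/^hiveUdfDir=/#&/' \
        "$conf_file"
}
```

Example invocation: disable_udf_properties /opt/omni-operator/conf/omni.conf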
  3. Optional: If the OmniOperator UDF plugin was used with Spark, perform Packaging and Uploading the OmniOperator Installation Package again.
  4. Use vi to open the ~/.bashrc file and delete UDF environment variables from LD_LIBRARY_PATH.
    1. Open the ~/.bashrc file.
      vi ~/.bashrc
      
    2. Press i to enter the insert mode and delete ${JAVA_HOME}/jre/lib/aarch64/server from LD_LIBRARY_PATH to update environment variables.
    3. Press Esc, type :wq!, and press Enter to save the file and exit.
    4. Update the environment variables.
      source ~/.bashrc
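Step 4 can likewise be scripted. The sketch below is a hypothetical helper that assumes the entry appears in ~/.bashrc literally as ${JAVA_HOME}/jre/lib/aarch64/server, joined to the rest of LD_LIBRARY_PATH by a colon; check your file first if it is written differently.

```shell
# Sketch: strip the UDF-related entry from LD_LIBRARY_PATH in a shell
# rc file. Assumes the entry is written literally as
# ${JAVA_HOME}/jre/lib/aarch64/server with an adjacent colon.
remove_udf_ld_path() {
    rc_file="$1"                          # e.g. ~/.bashrc
    sed -i \
        -e 's|:${JAVA_HOME}/jre/lib/aarch64/server||g' \
        -e 's|${JAVA_HOME}/jre/lib/aarch64/server:||g' \
        "$rc_file"
}
```

After running it on ~/.bashrc, reload the environment with source ~/.bashrc as in step 4.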