
Compiling TensorFlow Serving

TensorFlow Serving 2.15 provides the inference service and must be properly installed before the test.

  1. Prepare the TensorFlow Serving compilation environment by following instructions in "Configuring the Compilation Environment" in the TensorFlow Serving Porting Guide.
  2. Download the optimization patches.
    git clone -b v2.15.0-2512 https://gitcode.com/boostkit/tensorflow.git sra-tensorflow
    git clone -b v2.15.1-2512 https://gitcode.com/boostkit/tensorflow-serving.git sra-serving
    
  3. Download open source TensorFlow 2.15.0 and TensorFlow Serving 2.15.1.
    git clone -b v2.15.0 https://github.com/tensorflow/tensorflow.git
    git clone -b 2.15.1 https://github.com/tensorflow/serving.git
    
  4. Integrate the optimization patches into the open source TensorFlow and TensorFlow Serving directories respectively.
    cp /path/to/sra-tensorflow/0001-boostsra-tensorflow.patch /path/to/tensorflow/
    cp /path/to/sra-serving/0001-boostsra-tensorflow-serving.patch /path/to/serving/
    cd /path/to/tensorflow && patch -p1 < 0001-boostsra-tensorflow.patch
    cd /path/to/serving && patch -p1 < 0001-boostsra-tensorflow-serving.patch
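    A quick way to sanity-check this step is patch's --dry-run mode, which reports whether a patch would apply without modifying any files. The snippet below is a self-contained sketch that uses a throwaway file and a generated patch as stand-ins for the real TensorFlow sources and 0001-boostsra-tensorflow.patch:

    ```shell
    set -e
    # Create a throwaway working tree (stand-in for /path/to/tensorflow).
    workdir=$(mktemp -d)
    cd "$workdir"
    printf 'hello\n' > demo.txt
    # Generate a one-line patch (stand-in for the boostsra patch file).
    printf 'hello patched\n' > demo.txt.new
    diff -u demo.txt demo.txt.new > demo.patch || true   # diff exits 1 on differences
    # --dry-run checks applicability without changing demo.txt.
    patch --dry-run demo.txt < demo.patch && echo "patch OK"
    ```

    If the dry run succeeds, rerun the same command without --dry-run to apply the patch for real.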
    
  5. Go to the serving directory.
    cd /path/to/serving/
    
  6. Set the Bazel path to the directory containing the Bazel binary compiled in "Configuring the Compilation Environment" in the TensorFlow Serving Porting Guide, and create a directory for storing build dependencies.
    export BAZEL_PATH=/path/to/bazel/bazel-6.5.0/output
    export DISTDIR=$(pwd)/download
    mkdir -p $DISTDIR
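    Before running the build script, it can be worth confirming that these variables point at real artifacts. The check below is a sketch; the BAZEL_PATH default mirrors the layout used above and is an assumption about your environment:

    ```shell
    # Sanity-check the build environment from the step above (sketch).
    BAZEL_PATH=${BAZEL_PATH:-/path/to/bazel/bazel-6.5.0/output}   # assumed layout
    if [ -x "$BAZEL_PATH/bazel" ]; then
        echo "bazel found: $("$BAZEL_PATH/bazel" --version)"
    else
        echo "bazel not found under $BAZEL_PATH" >&2
    fi
    DISTDIR=$(pwd)/download
    mkdir -p "$DISTDIR"
    [ -d "$DISTDIR" ] && [ -w "$DISTDIR" ] && echo "DISTDIR ready"
    ```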
    
  7. Run the build script to compile TensorFlow Serving.
    sh compile_serving.sh --tensorflow_dir /path/to/tensorflow --features gcc12
    

    In this command, /path/to/tensorflow specifies the patched TensorFlow source directory, and gcc12 indicates that GCC 12.3.1 is used for compilation.

    The build produces the TensorFlow Serving binary tensorflow_model_server at /path/to/serving/bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server.
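    Once the build finishes, a quick smoke test is to ask the binary for its version; recent TensorFlow Serving releases print the ModelServer and TensorFlow library versions for --version, but treat the exact flag as an assumption for your build. A sketch that degrades gracefully when the binary is absent:

    ```shell
    # Smoke-test the build result (sketch; path copied from the build output above).
    SERVER=/path/to/serving/bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server
    if [ -x "$SERVER" ]; then
        "$SERVER" --version          # prints ModelServer and TensorFlow versions
        result="server present"
    else
        result="server missing"      # expected until the build has completed
    fi
    echo "smoke test: $result"
    ```

    A fuller check would start the server against a SavedModel, for example with --port, --model_name, and --model_base_path.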

    The compile command in the build script compile_serving.sh is as follows. Table 1 describes some parameters.
    bazel --output_user_root=$BAZEL_COMPILE_CACHE build -c opt --distdir=$DISTDIR --override_repository=org_tensorflow=$TENSORFLOW_DIR \
        --copt=-march=armv8.3-a+crc --copt=-O3 --copt=-fprefetch-loop-arrays --copt=-Wno-error=maybe-uninitialized \
        --copt=-Werror=stringop-overflow=0 \
        --define tflite_with_xnnpack=false tensorflow_serving/model_servers:tensorflow_model_server
    Table 1 Parameters of the compile command in the build script compile_serving.sh

    --output_user_root
        Bazel build cache directory. The default value is /path/to/serving/output. You can set a custom path using the BAZEL_COMPILE_CACHE environment variable:
        export BAZEL_COMPILE_CACHE=/path/to/your/cache_dir

    --distdir
        Directory for storing TensorFlow Serving compilation dependencies. It provides a reliable local fallback when downloading a third-party package fails.

    --override_repository
        Uses the directory specified by --tensorflow_dir as the local TensorFlow repository instead of fetching the upstream sources.
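    The --distdir mechanism can also be pre-seeded for offline builds: Bazel matches archives placed in that directory by file name and verifies them against the declared checksum, so dependencies downloaded once can be reused. A minimal sketch (the commented archive name is an illustrative placeholder, not a real TensorFlow Serving dependency):

    ```shell
    # Pre-seed DISTDIR so Bazel can resolve third-party archives offline (sketch).
    DISTDIR=$(pwd)/download
    mkdir -p "$DISTDIR"
    # Example only: drop previously downloaded archives in as-is, no renaming needed.
    # cp /media/usb/some-dependency-1.0.tar.gz "$DISTDIR"/
    ls -1 "$DISTDIR" | wc -l   # number of cached archives available to Bazel
    ```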