Introduction

This document describes how to port the inference serving framework TensorFlow Serving (TF-Serving) to openEuler 22.03 LTS SP3 on Kunpeng 920 series processors and how to deploy a test model for stress testing.

Overview

TF-Serving is a high-performance serving system for deploying and scaling machine learning models, enabling model prediction and inference in production environments. It supports multiple model types and integrates easily with TensorFlow-trained models.
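For context on how clients interact with a deployed model, TF-Serving exposes a REST prediction endpoint of the form `http://<host>:<port>/v1/models/<model>:predict`, which accepts a JSON body containing an `instances` list. The sketch below builds such a request; the host, port, and model name (`half_plus_two`, a common TF-Serving example model) are illustrative assumptions, not values mandated by this guide.

```python
import json

def build_predict_request(host, model_name, instances, port=8501):
    """Return the URL and JSON body for a TF-Serving REST predict call.

    Port 8501 is TF-Serving's default REST port; the host and model
    name here are placeholders for illustration.
    """
    url = f"http://{host}:{port}/v1/models/{model_name}:predict"
    body = json.dumps({"instances": instances})
    return url, body

# Example: a request against a locally deployed test model.
url, body = build_predict_request("localhost", "half_plus_two", [1.0, 2.0, 5.0])
print(url)   # http://localhost:8501/v1/models/half_plus_two:predict
print(body)  # {"instances": [1.0, 2.0, 5.0]}
```

The same payload can be sent with any HTTP client (for example, `curl -d "$body" "$url"`) once the server is running, which is how the stress tests later in this guide exercise the model.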

For more information about TF-Serving, see the TF-Serving official website.

Programming languages: Python/C++

Brief description: an open source inference serving framework

Open source license: Apache-2.0

When using open source software, comply with the applicable open source license.

Recommended Software Version

TF-Serving 2.15.0