Configuring the Environment
Before performing model inference using TensorFlow Serving, configure the BIOS, create and activate the conda environment, and install the dependencies.
BIOS Configuration Description
Before the test, configure the BIOS environment. For details about the configuration items, their descriptions, menu paths, and recommended values, see Table 1.
| Configuration Item | Configuration Description | Menu Path | Recommended Value |
|---|---|---|---|
| Power Policy | Configures the mode of the power policy item. | | Performance |
| SMT2 | Sets the Simultaneous Multithreading Technology (SMT) mode. | | Enabled |
| Support Smmu | Sets the mode of the System Memory Management Unit (SMMU). | | Disabled |
| CPU Prefetching Configuration | Sets the CPU prefetching mode. | | Enabled |
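After rebooting with the settings in Table 1, two of them can be cross-checked from the running OS. The sketch below uses standard Linux tooling (`lscpu` and the sysfs IOMMU directory); these commands are an assumption of a typical Linux environment, not part of the BIOS procedure itself.

```shell
#!/bin/sh
# Sketch: cross-check BIOS settings from Linux after reboot.
# With SMT2 enabled, lscpu reports 2 threads per core.
lscpu | grep -i 'thread(s) per core' || echo "lscpu: SMT info not found"

# With Support Smmu disabled, no SMMU/IOMMU devices are exposed in sysfs.
if [ -d /sys/class/iommu ] && [ -n "$(ls -A /sys/class/iommu 2>/dev/null)" ]; then
  echo "SMMU/IOMMU devices present"
else
  echo "no SMMU/IOMMU devices visible (consistent with Support Smmu = Disabled)"
fi
```

The exact output depends on the platform; the checks only confirm what the firmware exposed to the kernel.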
Creating a conda Environment
- Create a conda virtual environment for the inference phase.

  ```shell
  conda create -n model_zoo_infer python=3.11
  ```

- Activate the conda environment after it is created.

  ```shell
  conda activate model_zoo_infer
  ```

- Install the inference dependency.

  ```shell
  pip install tensorflow==2.15.0
  ```
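As a quick sanity check after the steps above, the snippet below verifies the environment without importing TensorFlow (which can be slow to load). The expected values are taken from the setup steps; the helper name `check_env` is illustrative.

```python
import importlib.util
import sys

# Values from the setup steps above: Python 3.11, package "tensorflow"
REQUIRED_PYTHON = (3, 11)
REQUIRED_PACKAGE = "tensorflow"

def check_env():
    """Return True if the active interpreter matches the expected setup."""
    ok = True
    if sys.version_info[:2] != REQUIRED_PYTHON:
        print(f"warning: expected Python {REQUIRED_PYTHON}, "
              f"got {sys.version_info[:2]}")
        ok = False
    # find_spec locates the package without importing it
    if importlib.util.find_spec(REQUIRED_PACKAGE) is None:
        print(f"warning: {REQUIRED_PACKAGE} is not installed "
              f"in this environment")
        ok = False
    return ok

check_env()
```

Run it inside the activated `model_zoo_infer` environment; any warning indicates the corresponding step above did not take effect.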
Parent topic: Model Inference on TensorFlow Serving