Function Description
Before starting automatic tuning, obtain the parameter file required by the tuning task, select the application to be tuned, and set or check the parameters used for tuning.
Command Function
Generates a configuration template file for the parameter space and application scenario. After completing the template file, you can run the devkit kat train -t task.json -p param.json command to start automatic tuning.
Syntax
devkit kat template [-h] [-l {0,1,2,3}] [-o <dir>] (-g | -c <file>)
Parameter Description
| Parameter | Option | Description |
|---|---|---|
| -h/--help | - | Obtains help information. This parameter is optional. |
| -l/--log-level | 0/1/2/3 | Log level. The default level is 2 (WARNING). This parameter is optional. |
| -o/--output | - | Output path of the generated template file. If you do not set it, a directory named in the format template_Application-name_YMD_HMS is generated in the current directory. This parameter is optional. |
| -g/--generate | - | Enters the interactive user interface and generates a simple template file for the application. The template contains only mandatory parameters. This parameter is optional. |
| -c/--convert | - | Converts a simple template file to a custom template file with all parameters. This parameter is optional. |
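The (-g | -c <file>) group in the syntax is mutually exclusive: a single invocation either generates a new template interactively or converts an existing one, never both. To illustrate how such a command line behaves, here is a minimal Python argparse sketch that mirrors the syntax above; it is an illustration only, not the tool's actual implementation.

```python
import argparse

# Hypothetical mirror of the `devkit kat template` syntax, for illustration only.
parser = argparse.ArgumentParser(prog="devkit kat template")
parser.add_argument("-l", "--log-level", type=int, choices=[0, 1, 2, 3],
                    default=2, help="log level (default: 2, WARNING)")
parser.add_argument("-o", "--output", metavar="<dir>",
                    help="output path of the generated template file")
mode = parser.add_mutually_exclusive_group(required=True)
mode.add_argument("-g", "--generate", action="store_true",
                  help="enter the interactive UI and generate a simple template")
mode.add_argument("-c", "--convert", metavar="<file>",
                  help="convert a simple template to a custom template")

# Parsing `-g -o /opt/template` selects generate mode and records the output dir.
args = parser.parse_args(["-g", "-o", "/opt/template"])
print(args.generate, args.output, args.log_level)
```

Passing both -g and -c in the same invocation is rejected by the mutually exclusive group, matching the (-g | -c <file>) notation in the syntax line.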
Example
The following steps use Spark as an example.
- Generate a simple template.
devkit kat template -g -o /opt/template
The -g parameter displays the interactive user interface. Set the basic parameters required by a template file. The -o /opt/template parameter indicates the path to the generated template file.
- Select Spark.
Press the ↑ or ↓ key to select an application, and press Enter to confirm the selection.
Figure 1 Selecting an application
- Select an application version.
Press Enter to go to the version list and select the required version. Press Space to switch between the options in [], where Y indicates that the version is selected.
Figure 2 Selecting a version
Figure 3 Selecting a version (2)
- Select a parameter type.
Press Esc to return to the upper-level directory, press ↓ to select ParamSpaces, and press Enter to go to the parameter type list.
Spark contains the application parameters, System contains the system parameters, and Kunpeng uarch contains the microarchitecture parameters.
You can tune only the system or microarchitecture parameters without selecting the application parameters.
Figure 4 Selecting a parameter type
- Select detailed parameters.
After selecting a parameter type, press Enter to go to the parameter setting screen and select detailed parameters. The application and system parameters can be customized. You can enter / to enter the search mode and view the detailed parameters. You can also press Page Up and Page Down to view them.
Figure 5 Selecting detailed application parameters
- Save the configuration to generate a template file.
After the configuration is complete, press s to save the template file and press any key to return to the previous screen. If you do not specify a file directory, a template file named template_Application-name_YMD_HMS is generated in the current directory.
Figure 6 Generating a simple template file
- Exit the interactive user interface.
Press q to exit, and then press y to confirm. The directory for saving the template file is displayed on the terminal.
Command output:
[2025-02-12 09:45:56 UTC] [KAT] [message] - The path of the saved file is as follows:: ['/opt/template/template_Spark_20250212_020125']
- View the generated simple template file.
ls /opt/template/template_Spark_20250212_020125
Command output:
param_Spark.json task_Spark.json
The template directory contains two files:
- Task parameter file (task_Spark.json): contains the host, application path, and pressure test information required for executing a task.
- Application parameter file (param_Spark.json): contains detailed information about the parameters in each selected parameter space.
You can view or modify the selected application and system parameters in the param_Spark.json file in the template file directory.
The names of the generated task parameter file and application parameter file vary according to the selected parameter type. For example:
- If you select Spark, the two file names are task_Spark.json and param_Spark.json.
- If you select Spark and System, the two file names are task_Spark_System.json and param_Spark_System.json.
- If you select Spark, System, and Kunpeng uarch, the two file names are task_Spark_System_Kunpeng_uarch.json and param_Spark_System_Kunpeng_uarch.json.
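The naming rule above can be expressed compactly: the names of the selected parameter spaces are joined with underscores, and spaces inside a name such as Kunpeng uarch also become underscores. A small Python sketch of that rule, assuming joining and space replacement are the only transformations applied (as the examples in this section suggest):

```python
def template_file_names(spaces):
    """Derive task/param file names from the selected parameter spaces.

    Assumes names are joined with '_' and internal spaces become '_',
    matching the examples in this section.
    """
    suffix = "_".join(name.replace(" ", "_") for name in spaces)
    return f"task_{suffix}.json", f"param_{suffix}.json"

print(template_file_names(["Spark", "System", "Kunpeng uarch"]))
# → ('task_Spark_System_Kunpeng_uarch.json', 'param_Spark_System_Kunpeng_uarch.json')
```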
- View the application parameter file.
cat /opt/template/template_Spark_20250212_020125/param_Spark.json
The param_xxxx.json file is generated after you select System on the interactive user interface.
Figure 7 Application parameter file
- View and complete the task parameter file.
vim /opt/template/template_Spark_20250212_020125/task_Spark.json
Set mandatory parameters in the template file based on your requirements, such as the environment information, application server, performance test tool server, and pressure test metrics. For details about the parameters, see Parameters of a Simple Template File.
A simple template file has the following content:

- Save the task parameter file and exit.
After completing the basic configuration, press Esc, type :wq!, and press Enter to save the file and exit.
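Before running devkit kat train with the completed file, it can help to confirm that every mandatory key is actually present. The sketch below shows one way to do that in Python; the key names ("hosts", "application", "pressure_test") are placeholders chosen for illustration, not the tool's actual schema, so substitute the mandatory keys listed in Parameters of a Simple Template File.

```python
import json

# Hypothetical pre-flight check run before `devkit kat train`.
# The key names below are placeholders, not the actual KAT schema.
REQUIRED_KEYS = {"hosts", "application", "pressure_test"}

def missing_keys(task_json_text):
    """Return the mandatory keys absent from a task parameter file."""
    task = json.loads(task_json_text)
    return sorted(REQUIRED_KEYS - task.keys())

sample = '{"hosts": {}, "application": {}}'   # pressure_test left unset
print(missing_keys(sample))   # → ['pressure_test']
```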
- If a simple template does not meet your requirements, convert it to a custom template.
A custom template file contains all task and application commands, making it easier to adjust the overall tuning task process.
devkit kat template -c /opt/template/template_Spark_20250212_020125/task_Spark.json
If the conversion is successful, the following information is displayed:
[2024-12-30 13:53:27 UTC] [KAT] [ info ] - /opt/template/template_Spark_20250212_020125/task_Spark.json has been converted to /opt/template/template_Spark_20250212_020125/task_Spark_custom.json
The name of the custom template file is suffixed with _custom.
A custom template file has the following content:
