
Generating a Template File

Before starting automatic tuning, obtain the parameter file required by the tuning task, select the application to be tuned, and set or check the parameters used for tuning.

Command Function

Generates a configuration template of the parameter space and application scenario for the Kunpeng AutoTuner. After a template is generated, you can run the devkit kat train -t task.yaml -p param.yaml command to start automatic tuning.

Comments in the YAML files must be written in UTF-8 encoding.
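A minimal illustration of a UTF-8 comment in a template file (the key and value below are hypothetical, not taken from an actual template):

```yaml
# Comments must be UTF-8 encoded; non-ASCII text such as "调优" is allowed here.
log_level: 2  # hypothetical key, for illustration only
```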

Syntax

devkit kat template [-h] [-l {0,1,2,3}] [-o <dir>] (-g | -c <file>)
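Per the syntax above, -g and -c are mutually exclusive and one of the two must be supplied. Typical invocations look like the following (the paths and file name are taken from the example later in this section):

```
# Generate a simple template interactively and write it to /opt/template
devkit kat template -g -o /opt/template

# Convert a simple template to a custom template, with DEBUG-level logging
devkit kat template -c task_Spark_System.yaml -l 0
```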

Parameter Description

Table 1 Parameter description

  • -h/--help
    Option: -
    Description: Obtains help information. This parameter is optional.

  • -l/--log-level
    Option: 0/1/2/3
    Description: Log level. This parameter is optional. The default level is 2 (WARNING).
      0: DEBUG
      1: INFO
      2: WARNING
      3: ERROR

  • -o/--output
    Option: <dir>
    Description: Path to the generated template file. This parameter is optional.

  • -g/--generate
    Option: -
    Description: Enters the interactive user interface and generates a simple template file for the application. The template contains only mandatory parameters. Either -g or -c must be specified; the two options are mutually exclusive.

  • -c/--convert
    Option: <file>
    Description: Converts a simple template file to a custom template file that contains all parameters. Either -g or -c must be specified; the two options are mutually exclusive.

NOTE:
  • For a simple template file, you only need to configure the host, application path, and pressure test information required for task execution.
  • A custom template file contains all task and application commands. If you need to customize an application, the configuration information in a simple template may not match your specific requirements. If this happens, convert the simple template to a custom template with full execution information so that you can adapt the automatic tuning process.

Example

The following steps use Spark as an example.

  1. Generate a simple template.
    devkit kat template -g -o /opt/template

    The -g parameter displays the interactive user interface. Set the basic parameters required by a template file. The -o /opt/template parameter indicates the path to the generated template file.

  2. Select Spark.

    Press the up or down arrow key to select an application, and press Enter to confirm the selection.

    You can use the tool to generate a simple template for database and big data scenarios, and then quickly modify its configuration items to start tuning. Alternatively, you can create a custom template (select Custom) and adjust its content to meet specific tuning requirements.

    Figure 1 Selecting an application
  3. Select an application version.

    Press Enter to go to the version list and select the required version. Press Space to switch between the options in [], where Y indicates that the version is selected.

    Figure 2 Selecting a version (1)
    Figure 3 Selecting a version (2)
  4. Select a parameter type.

    Press Esc to return to the upper-level menu, press the down arrow key to select ParamSpaces, and press Enter to go to the parameter type list.

    Spark indicates the application parameters, System indicates the system parameters, and Kunpeng uarch indicates the microarchitecture parameters.

    You can select either system or microarchitecture parameters for tuning without selecting application parameters.

    The system parameter spaces of Hive, Spark, and Flink support JVM parameters. After the Xmx, Xmn, and Xms parameters are automatically tuned, the tuned values are written to the /tmp/kat_java_params.log file. To apply the tuned JVM parameters, read the file content and splice it into the application startup command.
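The handoff can be sketched as follows. This is a hypothetical illustration only: the log file contents are simulated here, and the spark-submit flags are an assumption about how the tuned options might be spliced into a real startup command.

```shell
# Simulate the file that the tuner delivers (contents are illustrative only).
printf -- '-Xms4g -Xmx8g -Xmn2g\n' > /tmp/kat_java_params.log

# Read the tuned JVM options and splice them into the application start command.
JVM_OPTS="$(cat /tmp/kat_java_params.log)"
START_CMD="spark-submit --driver-java-options \"${JVM_OPTS}\" --class MyJob app.jar"
echo "${START_CMD}"
```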

    Figure 4 Selecting application parameters

    Press Space to switch between the options in [ ], where Y indicates that the parameter type is selected.

    Figure 5 Selecting parameter types
  5. Select detailed parameters.

    After selecting a parameter type, press Enter to go to the parameter setting screen and select detailed parameters. The application and system parameters can be customized. Press / to enter search mode and look up specific parameters, or press Page Up and Page Down to scroll through them.

    Figure 6 Selecting application parameters
  6. Save the configuration to generate a simple template file.

    After the configuration is complete, press s to save the simple template file to the specified directory. If you do not specify a directory, a folder named template_<application name>_<date>_<time> (for example, template_Spark_20250515_063634) is generated in the current directory.

    The names of the generated task parameter file and application parameter file depend on the selected application and parameter types. For example, if you select the Spark application with the system and microarchitecture parameters, the generated files are named task_Spark_System_Kunpeng_uarch.yaml and param_Spark_System_Kunpeng_uarch.yaml.

    Press any key to return to the screen before saving the configuration.

    Command output:

     Spark configuration file saved successfully: {'path': '/opt/template/template_Spark_20250515_063634'}
    
  7. Exit the interactive user interface.

    Press q to exit, and then press y to confirm. The directory for saving the template file is displayed on the terminal.

    Command output:

    [2025-05-15 06:38:31 UTC] [KAT] [message] - The path of the saved file is as follows: ['/opt/template/template_Spark_20250515_063634']
    
  8. View the generated simple template file.
    ls /opt/template/template_Spark_20250515_063634/

    Command output:

    param_Spark_System.yaml  task_Spark_System.yaml
    
    • Task parameter file (task_Spark_System.yaml): contains the host, application path, and pressure test information required for executing a task.
    • Application parameter file (param_Spark_System.yaml): contains detailed information about the parameters in each selected parameter space.

    You can view or modify the selected application and system parameters in the param_Spark_System.yaml file in the template file directory.

  9. View the application parameter file.
    cat /opt/template/template_Spark_20250515_063634/param_Spark_System.yaml

    The param_xxxx.yaml file contains the application parameters (including the default parameters) selected on the interactive user interface. The application parameter file has the following content:

  10. View and complete the task parameter file.
    vim /opt/template/template_Spark_20250515_063634/task_Spark_System.yaml

    Set mandatory parameters in the template file based on your requirements, such as the environment information, application server, performance test tool server, and pressure test metrics. For details about the parameters, see the task template YAML file, which has the following content:

  11. Save the task parameter file and exit.

    After completing the basic configuration, press Esc, type :wq!, and press Enter to save the file and exit.

  12. If a simple template does not meet your requirements, convert it to a custom template.

    A custom template file contains all task and application commands, making it easier to adjust the overall tuning task process.

    devkit kat template -c /opt/template/template_Spark_20250515_063634/task_Spark_System.yaml

    Command output:

    [2025-05-15 06:53:49 UTC] [KAT] [ info  ] - /opt/template/template_Spark_20250515_063634/task_Spark_System.yaml has been converted to /opt/template/template_Spark_20250515_063634/task_Spark_System_custom.yaml
    

    The name of the custom template file is suffixed with _custom.

    A custom template file has the following content: