
get-mlperf-inference-sut-configs

Automatically generated README for this automation recipe: get-mlperf-inference-sut-configs

Category: MLPerf benchmark support

License: Apache 2.0

  • Notes from the authors, contributors and users: README-extra

  • CM meta description for this script: _cm.json

  • Output cached? False

Reuse this script in your project

Install MLCommons CM automation meta-framework

Pull CM repository with this automation recipe (CM script)

cm pull repo mlcommons@cm4mlops

cmr "get mlperf inference sut configs sut-configs" --help

Run this script

Run this script via CLI
cm run script --tags=get,mlperf,inference,sut,configs,sut-configs [--input_flags]
Run this script via CLI (alternative)
cmr "get mlperf inference sut configs sut-configs" [--input_flags]
Run this script from Python
import cmind

r = cmind.access({'action':'run',
              'automation':'script',
              'tags':'get,mlperf,inference,sut,configs,sut-configs',
              'out':'con',
              ...
              (other input keys for this script)
              ...
             })

if r['return'] > 0:
    print(r['error'])
Run this script via Docker (beta)
cm docker script "get mlperf inference sut configs sut-configs" [--input_flags]

Script flags mapped to environment

  • --configs_git_url=value → CM_GIT_URL=value
  • --repo_path=value → CM_SUT_CONFIGS_PATH=value
  • --run_config=value → CM_MLPERF_SUT_NAME_RUN_CONFIG_SUFFIX=value
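The flag-to-environment mapping above can be sketched as a plain dictionary. The helper below is hypothetical (not part of CM) and only illustrates how a script input flag's value ends up in the corresponding CM_* environment key:

```python
# Hypothetical sketch of the flag -> environment-key mapping listed above.
FLAG_TO_ENV = {
    "configs_git_url": "CM_GIT_URL",
    "repo_path": "CM_SUT_CONFIGS_PATH",
    "run_config": "CM_MLPERF_SUT_NAME_RUN_CONFIG_SUFFIX",
}

def flags_to_env(flags):
    """Translate known input flags into the environment keys CM would set."""
    return {FLAG_TO_ENV[name]: value
            for name, value in flags.items()
            if name in FLAG_TO_ENV}

print(flags_to_env({"run_config": "default"}))
# {'CM_MLPERF_SUT_NAME_RUN_CONFIG_SUFFIX': 'default'}
```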

Default environment

These keys can be updated via --env.KEY=VALUE or env dictionary in @input.json or using script flags.

  • CM_SUT_CONFIGS_PATH: ``
  • CM_GIT_URL: ``
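As a sketch of how `--env.KEY=VALUE` arguments fill the env dictionary, here is a hypothetical parser (not CM's actual implementation); the path value is only an example:

```python
def parse_env_flags(args):
    """Collect --env.KEY=VALUE command-line arguments into an env dict (sketch)."""
    env = {}
    for arg in args:
        if arg.startswith("--env.") and "=" in arg:
            # Split on the first '=' so values may themselves contain '='.
            key, _, value = arg[len("--env."):].partition("=")
            env[key] = value
    return env

print(parse_env_flags(["--env.CM_SUT_CONFIGS_PATH=/tmp/configs"]))
# {'CM_SUT_CONFIGS_PATH': '/tmp/configs'}
```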

Script output

cmr "get mlperf inference sut configs sut-configs" [--input_flags] -j
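The `-j` flag prints the result as JSON. A minimal sketch of consuming that output, assuming only that the object carries the `return` key (and `error` on failure) shown in the Python example above:

```python
import json

def check_result(raw):
    """Parse CM's JSON output and raise on a non-zero return code (sketch)."""
    r = json.loads(raw)
    if r.get("return", 0) > 0:
        raise RuntimeError(r.get("error", "unknown error"))
    return r

# Example with a fabricated success payload:
print(check_result('{"return": 0}'))  # {'return': 0}
```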