preprocess-mlperf-inference-submission

Automatically generated README for this automation recipe: preprocess-mlperf-inference-submission

Category: MLPerf benchmark support

License: Apache 2.0

  • CM meta description for this script: _cm.json
  • Output cached? False

Reuse this script in your project

Install MLCommons CM automation meta-framework
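The CM framework itself is distributed as the cmind package on PyPI; assuming a Python 3 environment with pip available, a typical installation is:

pip install cmind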

Pull CM repository with this automation recipe (CM script)

cm pull repo mlcommons@cm4mlops

cmr "run mlc mlcommons mlperf inference submission mlperf-inference processor preprocessor preprocess" --help

Run this script

Run this script via CLI
cm run script --tags=run,mlc,mlcommons,mlperf,inference,submission,mlperf-inference,processor,preprocessor,preprocess [--input_flags]
Run this script via CLI (alternative)
cmr "run mlc mlcommons mlperf inference submission mlperf-inference processor preprocessor preprocess " [--input_flags]
Run this script from Python
import cmind

r = cmind.access({'action':'run',
                  'automation':'script',
                  'tags':'run,mlc,mlcommons,mlperf,inference,submission,mlperf-inference,processor,preprocessor,preprocess',
                  'out':'con',
                  # ... (other input keys for this script) ...
                 })

if r['return']>0:
    print(r['error'])
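
Assuming the usual CM convention that CLI input flags become keys of the access dictionary, the two script flags documented below could be passed from Python as in this sketch (the directory path and submitter name are placeholder values):

r = cmind.access({'action':'run',
                  'automation':'script',
                  'tags':'run,mlc,mlcommons,mlperf,inference,submission,mlperf-inference,processor,preprocessor,preprocess',
                  'out':'con',
                  'submission_dir':'/path/to/mlperf_submission',  # placeholder path
                  'submitter':'MyOrg'})                           # placeholder submitter name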
Run this script via Docker (beta)
cm docker script "run mlc mlcommons mlperf inference submission mlperf-inference processor preprocessor preprocess" [--input_flags]

Script flags mapped to environment

  • --submission_dir=value → CM_MLPERF_INFERENCE_SUBMISSION_DIR=value
  • --submitter=value → CM_MLPERF_SUBMITTER=value
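
For example, a CLI invocation that sets both environment variables through these flags might look like this (the directory path and submitter name are placeholder values):

cm run script --tags=run,mlc,mlcommons,mlperf,inference,submission,mlperf-inference,processor,preprocessor,preprocess --submission_dir=/path/to/mlperf_submission --submitter=MyOrg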

Native script being run

No run file exists for Windows


Script output

cmr "run mlc mlcommons mlperf inference submission mlperf-inference processor preprocessor preprocess " [--input_flags] -j