# get-mlperf-inference-results

Automatically generated README for this automation recipe: **get-mlperf-inference-results**

* Category: **MLPerf benchmark support**
* License: **Apache 2.0**
* Notes from the authors, contributors and users: *README-extra*
* CM meta description for this script: *_cm.json*
* Output cached? *True*
## Reuse this script in your project

### Install MLCommons CM automation meta-framework

### Pull CM repository with this automation recipe (CM script)

```bash
cm pull repo mlcommons@cm4mlops
```

### Print CM help from the command line

```bash
cmr "get results inference inference-results mlcommons mlperf" --help
```
## Run this script

### Run this script via CLI

```bash
cm run script --tags=get,results,inference,inference-results,mlcommons,mlperf[,variations]
```

### Run this script via CLI (alternative)

```bash
cmr "get results inference inference-results mlcommons mlperf [variations]"
```

### Run this script from Python

```python
import cmind

r = cmind.access({'action': 'run',
                  'automation': 'script',
                  'tags': 'get,results,inference,inference-results,mlcommons,mlperf',
                  'out': 'con',
                  ...
                  (other input keys for this script)
                  ...
                 })

if r['return'] > 0:
    print(r['error'])
```

### Run this script via Docker (beta)

```bash
cm docker script "get results inference inference-results mlcommons mlperf" [variations]
```
## Variations

* Group "**source-repo**"

  <details>
  <summary>Click here to expand this section.</summary>

  * `_ctuning`
    - ENV variables:
      - GITHUB_REPO_OWNER: `ctuning`
  * `_custom`
    - ENV variables:
      - GITHUB_REPO_OWNER: `arjunsuresh`
  * `_go`
    - ENV variables:
      - GITHUB_REPO_OWNER: `GATEOverflow`
  * `_mlcommons` (default)
    - ENV variables:
      - GITHUB_REPO_OWNER: `mlcommons`
  * `_nvidia-only`
    - ENV variables:
      - GITHUB_REPO_OWNER: `GATEOverflow`
      - NVIDIA_ONLY: `yes`

  </details>

### Default variations

`_mlcommons`
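As a sketch of selecting a variation from the `source-repo` group (assuming CM is installed and the `cm4mlops` repository has been pulled), append the variation tag with its leading underscore to the tag list:

```shell
# Fetch the inference results from the GATEOverflow fork with only NVIDIA results
# (per the table above, this sets GITHUB_REPO_OWNER=GATEOverflow and NVIDIA_ONLY=yes):
cmr "get results inference inference-results mlcommons mlperf _nvidia-only"
```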
## Default environment

These keys can be updated via `--env.KEY=VALUE`, the `env` dictionary in `@input.json`, or using script flags.

* CM_GIT_CHECKOUT: `master`
* CM_GIT_DEPTH: `--depth 1`
* CM_GIT_PATCH: `no`
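For example, a minimal sketch of overriding one of these defaults on the command line (assuming CM is installed; the branch name `main` is illustrative and may not exist in every source repository):

```shell
# Check out a branch other than the default "master" when cloning the results repo:
cm run script --tags=get,results,inference,inference-results,mlcommons,mlperf \
    --env.CM_GIT_CHECKOUT=main
```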
## Versions

Default version: `v3.1`

* `v2.1`
* `v3.0`
* `v3.1`
* `v4.0`
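A sketch of pinning one of the listed versions instead of the default (assuming CM is installed; `--version` is the standard CM script flag for version selection):

```shell
# Request the v4.0 results instead of the default v3.1:
cmr "get results inference inference-results mlcommons mlperf" --version=v4.0
```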
## Script output

```bash
cmr "get results inference inference-results mlcommons mlperf [variations]" -j
```