# llm --help

Let's help `--help` help devs.

Target: <200ms to print `--help`. Several important LLM CLI tools take multiple seconds. PRs welcome ❤️

| library | cold | warm (10 runs) | version | measured on |
|---|---|---|---|---|
| `vllm --help` | 18115ms | 8157ms | 0.19.0+cpu | 2026-04-14T14:02Z |
| `sglang --help` | 13130ms | 5464ms | v0.5.10.post1 | 2026-04-14T13:58Z |
| `VLMEvalKit --help` | 13244ms | 5338ms | v0.2 | 2026-04-14T14:04Z |
| `tensorrt-llm --help` | 6517ms | 2183ms | 1.2.0 | 2026-04-14T13:57Z |
| `datasets --help` | 3241ms | 975ms | 4.8.4 | 2026-04-14T13:51Z |
| `llm --help` | 1174ms | 539ms | 0.30 | 2026-04-14T13:51Z |
| `openai --help` | 1069ms | 535ms | 2.31.0 | 2026-04-14T13:52Z |
| `hf --help` | 1351ms | 389ms | 1.10.2 | 2026-04-14T13:50Z |
| `langchain-cli --help` | 844ms | 257ms | 0.0.37 | 2026-04-14T13:52Z |
| `lm-eval --help` | 751ms | 251ms | 0.4.11 | 2026-04-14T13:53Z |
| `llama.cpp --help` | 16ms | 14ms | b8784 | 2026-04-14T13:53Z |
| `ollama --help` | 14ms | 12ms | 0.20.7 | 2026-04-14T13:50Z |
| `transformers --help` | 1ms | 0ms | 5.5.4 | 2026-04-14T13:53Z |
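The exact harness behind these numbers isn't shown here, but warm timings of the kind in the table can be approximated by running `<tool> --help` repeatedly and taking the fastest sample (cold timings would additionally require a fresh filesystem cache). A minimal sketch, assuming nothing beyond the Python standard library; `time_help` is a hypothetical helper, not part of any of the tools above:

```python
import statistics
import subprocess
import sys
import time

def time_help(cmd, runs=10):
    """Run `cmd --help` several times; return (fastest, median) in milliseconds.

    The fastest sample approximates the "warm" number: caches are hot
    after the first run, so the minimum reflects steady-state startup cost.
    """
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(
            cmd + ["--help"],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
        samples.append((time.perf_counter() - start) * 1000)
    return min(samples), statistics.median(samples)

if __name__ == "__main__":
    # Example: time the current Python interpreter's own --help.
    fastest, median = time_help([sys.executable], runs=10)
    print(f"warm: {fastest:.0f}ms  median: {median:.0f}ms")
```

Tools like `hyperfine` do the same job with proper warmup runs and statistics, and are a better fit for reproducible numbers.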