jp6/cu128/: vllm versions


Because this project isn't listed in this index's mirror_whitelist, no releases from the root/pypi mirror are included.

Latest version on this stage: 0.7.3+cu128

A high-throughput and memory-efficient inference and serving engine for LLMs

Index      Version       Documentation
jp6/cu128  0.7.3+cu128
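To install a release from this index, pip can be pointed at its simple API. devpi serves each index's PEP 503 simple pages under the path /<user>/<index>/+simple/. The server hostname below is a placeholder (hypothetical); only the index path jp6/cu128 and the version 0.7.3+cu128 come from this listing:

```shell
# Install vllm from the jp6/cu128 devpi index.
# <server> is a placeholder hostname (hypothetical); substitute the
# actual devpi server address for your deployment.
pip install --index-url "https://<server>/jp6/cu128/+simple/" "vllm==0.7.3+cu128"
```

Using --index-url (rather than --extra-index-url) ensures pip resolves vllm only from this index, which matters here since root/pypi releases are deliberately excluded.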