jp6/cu128/: vllm versions

Simple index

Because this project isn't in the mirror_whitelist, no releases from root/pypi are included.

Latest version on stage is: 0.8.6

A high-throughput and memory-efficient inference and serving engine for LLMs

Index       Version        Documentation
jp6/cu128   0.8.6
jp6/cu128   0.8.5+cu128
jp6/cu128   0.8.4+cu128
jp6/cu128   0.8.3+cu128
jp6/cu128   0.7.4+cu128
jp6/cu128   0.7.3+cu128
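
For reference, wheels from an index like this are normally installed by pointing pip at the index and pinning one of the listed versions. The sketch below is illustrative only: the index URL placeholder and the chosen version are assumptions, and the actual URL is the one behind the "Simple index" link above.

    # Hypothetical install command; replace <simple-index-url> with the URL
    # of this index's Simple index page:
    #   pip install "vllm==0.8.5+cu128" --extra-index-url <simple-index-url>
    #
    # Once a wheel from this index is installed, the local version tag
    # (e.g. +cu128) can be confirmed from Python:
    import vllm
    print(vllm.__version__)   # expected to print e.g. "0.8.5+cu128"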