jp6/cu128/: xgrammar-0.1.15 metadata and description
Efficient, Flexible and Portable Structured Generation
Field | Value |
---|---|
author | MLC Team |
classifiers | |
description_content_type | text/markdown |
keywords | machine learning, inference |
license | Apache 2.0 |
project_urls | |
provides_extras | test |
requires_dist | |
requires_python | <4,>=3.9 |
Because this project isn't in the mirror_whitelist, no releases from root/pypi are included.
File | Tox results | History |
---|---|---|
xgrammar-0.1.15-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl | | |
xgrammar-0.1.15-cp312-cp312-linux_aarch64.whl | | |
xgrammar-0.1.15-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl | | |
XGrammar
Efficient, Flexible and Portable Structured Generation
News
- [2025/02] XGrammar has been officially integrated into Modular's MAX.
- [2025/01] XGrammar has been officially integrated into TensorRT-LLM.
- [2024/12] XGrammar has been officially integrated into vLLM.
- [2024/12] We presented research talks on XGrammar at CMU Catalyst, Berkeley SkyLab, MIT HANLAB, THU IIIS, SJTU, Ant Group, SGLang Meetup, Qingke AI, Camel AI. The slides can be found here.
- [2024/11] XGrammar has been officially integrated into SGLang.
- [2024/11] XGrammar has been officially integrated into MLC-LLM.
- [2024/11] We officially released XGrammar v0.1.0!
Overview
XGrammar is an open-source library for efficient, flexible, and portable structured generation. It supports general context-free grammars, enabling a broad range of output structures, and applies careful system optimizations for fast execution. XGrammar features a minimal, portable C++ backend that can be easily integrated into multiple environments and frameworks. It is co-designed with the LLM inference engine, enabling zero-overhead structured generation during LLM inference.
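As an illustration of how this fits into a decoding loop, the minimal Python sketch below compiles the built-in JSON grammar and uses a token bitmask to constrain sampling. It follows the names in XGrammar's documented Python API (TokenizerInfo, GrammarCompiler, GrammarMatcher, and the bitmask helpers), but exact signatures may differ between versions; `model_id` and the random logits are placeholders standing in for a real model's forward pass.

```python
import torch
import xgrammar as xgr
from transformers import AutoConfig, AutoTokenizer

# Placeholder model; any Hugging Face tokenizer/config pair works the same way.
model_id = "meta-llama/Llama-3.2-1B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
config = AutoConfig.from_pretrained(model_id)

# Describe the tokenizer to XGrammar, then compile a grammar once and reuse it.
tokenizer_info = xgr.TokenizerInfo.from_huggingface(tokenizer, vocab_size=config.vocab_size)
compiler = xgr.GrammarCompiler(tokenizer_info)
compiled = compiler.compile_builtin_json_grammar()  # constrain output to valid JSON

# A matcher tracks the grammar state for one generation stream.
matcher = xgr.GrammarMatcher(compiled)
bitmask = xgr.allocate_token_bitmask(1, tokenizer_info.vocab_size)

# One step of the decoding loop: mask tokens that would violate the grammar,
# pick a token from the masked logits, then advance the matcher with it.
logits = torch.randn(1, config.vocab_size)  # stand-in for real model logits
matcher.fill_next_token_bitmask(bitmask)
xgr.apply_token_bitmask_inplace(logits, bitmask)
next_token = int(torch.argmax(logits, dim=-1))
matcher.accept_token(next_token)
```

In a real engine the logits come from the model at every step and the bitmask is refilled after each accepted token, while the compiled grammar can be cached and shared across requests.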
Get Started
Please visit our documentation to get started with XGrammar.