vLLM is an inference and serving engine for large language models (LLMs). Versions 0.8.0 up to but excluding 0.9.0 have a denial-of-service (ReDoS) vulnerability that causes the vLLM server to crash if an invalid regex is provided while using structured output. This vulnerability is similar to GHSA-6qc9-v4r8-22xg/CVE-2025-48942, but for a regex instead of a JSON schema. Version 0.9.0 fixes the issue.
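Before the fix, a request that used the structured-output (guided decoding) feature with a malformed regex could raise an uncaught exception inside the engine and take the whole server down. The snippet below is a minimal sketch of the general mitigation pattern only: validating the user-supplied pattern at request time and turning a failure into a client error rather than a server crash. The `validate_guided_regex` helper is hypothetical, Python's `re` module stands in for whatever grammar backend actually compiles the pattern, and the real change is the one in the linked patch commit and pull request.

```python
import re


def validate_guided_regex(pattern: str) -> None:
    """Reject an invalid guided-decoding regex up front
    (hypothetical helper, not vLLM's actual API)."""
    try:
        re.compile(pattern)
    except re.error as exc:
        # Convert the failure into a client-facing error (e.g. HTTP 400)
        # instead of an uncaught exception that crashes the engine.
        raise ValueError(f"invalid guided_regex: {exc}") from exc


if __name__ == "__main__":
    validate_guided_regex(r"\d{3}-\d{4}")   # well-formed pattern passes
    try:
        validate_guided_regex("[unclosed")  # malformed pattern is rejected
    except ValueError as err:
        print(err)
```

The key point is where the exception is caught: rejecting the pattern in the request-handling path keeps the error scoped to the offending client, whereas letting it propagate into the engine core affects every user of the server.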
References
| Link | Resource |
|---|---|
| https://github.com/vllm-project/vllm/commit/08bf7840780980c7568c573c70a6a8db94fd45ff | Patch |
| https://github.com/vllm-project/vllm/issues/17313 | Issue Tracking |
| https://github.com/vllm-project/vllm/pull/17623 | Issue Tracking, Patch |
| https://github.com/vllm-project/vllm/security/advisories/GHSA-9hcf-v7m4-6m2j | Vendor Advisory |
History
24 Jun 2025, 17:40
| Type | Values Removed | Values Added |
|---|---|---|
| New CVE | | |
Information
Published : 2025-05-30 19:15
Updated : 2025-06-24 17:40
NVD link : CVE-2025-48943
Mitre link : CVE-2025-48943
CVE.ORG link : CVE-2025-48943
Products Affected
vllm
- vllm
CWE
CWE-248 : Uncaught Exception