Denial of Service Vulnerability in vLLM Inference Engine by vLLM Project
CVE-2025-48942

6.5 MEDIUM

Key Information:

Status: Published
Vendor: vLLM Project
CVE Published: 30 May 2025

What is CVE-2025-48942?

A vulnerability in vLLM, an inference and serving engine for large language models, allows an attacker to crash the vLLM server by sending an invalid JSON schema as a guided decoding parameter to the /v1/completions API. The issue affects all versions of vLLM from 0.8.0 up to, but not including, 0.9.0. Users of affected versions should upgrade to 0.9.0 or later to mitigate this risk and ensure uninterrupted service.
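As a rough illustration of the attack surface, the sketch below constructs a request body of the kind described above: a completion request carrying a malformed schema in the guided decoding field. The parameter name `guided_json` and the model name are assumptions for illustration; on an affected version (>= 0.8.0, < 0.9.0), posting a payload like this to `/v1/completions` could crash the server rather than return a validation error.

```python
import json

# Hypothetical payload sketch (parameter and model names assumed):
# a /v1/completions request whose guided-decoding schema is not a
# valid JSON schema. On vLLM >= 0.8.0, < 0.9.0 such input could
# crash the server instead of being rejected with a 4xx response.
payload = {
    "model": "example-model",  # placeholder model name
    "prompt": "Hello",
    "guided_json": {"type": "not-a-real-json-schema-type"},  # invalid schema
}

# Serialize as it would be sent in the HTTP request body.
body = json.dumps(payload)
```

Fixed versions validate guided decoding parameters up front, so malformed schemas are rejected rather than propagated into the engine.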

Affected Version(s)

vllm >= 0.8.0, < 0.9.0
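The affected range above can be checked programmatically. The minimal sketch below compares plain dotted version strings; it assumes simple `X.Y.Z` versions and does not handle pre-release suffixes (a real deployment could use the `packaging` library instead).

```python
def parse(version: str) -> tuple:
    """Parse a plain dotted version string like '0.8.5' into a tuple."""
    return tuple(int(part) for part in version.split("."))

def is_affected(version: str) -> bool:
    """True if the given vLLM version is in the vulnerable range
    >= 0.8.0, < 0.9.0 (assumes plain X.Y.Z version strings)."""
    return parse("0.8.0") <= parse(version) < parse("0.9.0")
```

For example, `is_affected("0.8.5")` is true, while `is_affected("0.9.0")` is false, since 0.9.0 contains the fix.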

CVSS v3.1

Score: 6.5
Severity: MEDIUM
Confidentiality: None
Integrity: None
Availability: High
Attack Vector: Network
Attack Complexity: Low
Privileges Required: Low
User Interaction: None
Scope: Unchanged

Timeline

  • Vulnerability published: 30 May 2025

  • Vulnerability reserved