Input Validation Flaw in vLLM Serving Engine from vLLM Project
CVE-2025-48944

6.5 MEDIUM

Key Information:

CVE Published: 30 May 2025

What is CVE-2025-48944?

The vLLM inference and serving engine contains an input validation flaw in versions 0.8.0 up to, but not including, 0.9.0. When a request is sent to the OpenAI-compatible /v1/chat/completions endpoint, unexpected or malformed values in the 'pattern' and 'type' fields can crash the inference worker, because the backend compiles or parses these fields without validating them first. A crashed worker stops serving requests entirely and must be restarted. Version 0.9.0 fixes the flaw, so affected deployments should upgrade.
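The fix in 0.9.0 amounts to rejecting malformed field values before they reach the regex/schema machinery. The sketch below is illustrative only (it is not vLLM's actual code, and the function and field names are hypothetical): it validates a user-supplied 'pattern' by attempting to compile it and a 'type' against an allow-list, returning an error instead of letting an exception take down the worker.

```python
import re

# Hypothetical allow-list of JSON-schema-style type names; the real set
# accepted by vLLM's structured-output support may differ.
ALLOWED_TYPES = {"string", "integer", "number", "boolean", "object", "array"}

def validate_guided_params(params: dict) -> tuple[bool, str]:
    """Return (ok, message) for user-supplied 'type'/'pattern' fields.

    Illustrative sketch of the validation the advisory describes as
    missing: check inputs up front so a bad value produces a client
    error rather than an unhandled exception in the worker.
    """
    type_ = params.get("type")
    if type_ is not None and type_ not in ALLOWED_TYPES:
        return False, f"unsupported 'type': {type_!r}"

    pattern = params.get("pattern")
    if pattern is not None:
        try:
            re.compile(pattern)  # reject invalid regexes before use
        except re.error as exc:
            return False, f"invalid 'pattern': {exc}"

    return True, "ok"
```

With validation in place, a payload like `{"pattern": "["}` (an unterminated character class) is rejected with a descriptive message instead of crashing the process.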

Affected Version(s)

vllm >= 0.8.0, < 0.9.0
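To check whether an installed version falls in the affected half-open range [0.8.0, 0.9.0), a simple tuple comparison suffices. This helper is a minimal sketch that assumes plain X.Y.Z version strings (pre-release suffixes like "0.8.5rc1" would need a real version parser such as the `packaging` library):

```python
def is_affected(version: str) -> bool:
    """Return True if a plain X.Y.Z vLLM version string is in [0.8.0, 0.9.0)."""
    parts = tuple(int(p) for p in version.split(".")[:3])
    return (0, 8, 0) <= parts < (0, 9, 0)
```

For example, 0.8.5 is affected, while 0.7.3 and the patched 0.9.0 are not.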

CVSS V3.1

Score: 6.5
Severity: MEDIUM
Confidentiality: None
Integrity: None
Availability: High
Attack Vector: Network
Attack Complexity: Low
Privileges Required: Low
User Interaction: None
Scope: Unchanged

Timeline

  • Vulnerability published: 30 May 2025

  • Vulnerability reserved