Memory Corruption Vulnerability in llama.cpp by ggml-org
CVE-2026-21869
8.8 HIGH
What is CVE-2026-21869?
The llama.cpp library, an inference engine for a wide range of large language models, fails to properly validate the 'n_discard' parameter parsed from JSON input. If a negative value is supplied, token evaluation performs out-of-bounds memory writes, which can crash the process and may potentially be escalated to remote code execution. As of now, no fix is available for this issue.
Affected Version(s)
llama.cpp <= 55d4206c8
