Integer Overflow Vulnerability in llama.cpp by ggml-org
CVE-2026-33298
What is CVE-2026-33298?
CVE-2026-33298 is an integer overflow vulnerability in llama.cpp, the C/C++ large language model (LLM) inference library, specifically in the ggml_nbytes function, which computes a tensor's memory footprint from its dimensions. An attacker can craft a GGUF model file whose tensor dimensions cause the size calculation to wrap around, so ggml_nbytes returns a far smaller value than the true size (e.g., 4 MB instead of a true size measured in exabytes). When the application allocates a buffer based on the undersized result and then processes the tensor data, the mismatch produces a heap-based buffer overflow, corrupting memory and potentially enabling remote code execution (RCE). This poses a significant risk to organizations relying on llama.cpp for machine learning inference, as exploitation could compromise the integrity and security of their systems.
Potential impact of CVE-2026-33298
- Remote Code Execution (RCE): The most critical impact of this vulnerability is the potential for remote code execution. Exploitation could allow attackers to execute arbitrary code on the affected system, leading to unauthorized access to sensitive data and systems.
- Memory Corruption: The integer overflow can lead to memory corruption, which not only affects the immediate application but could also destabilize other services on the same system, leading to a wider range of operational issues and increased downtime.
- Data Integrity Risks: The vulnerability can be exploited to manipulate data and processes within the application, potentially leading to data breaches or corruption. This risk is particularly concerning for organizations using llama.cpp in environments where data accuracy and integrity are paramount, such as in financial services or healthcare.

Affected Version(s)
llama.cpp < b7824
