CVE-2024-42477

High

Description

llama.cpp provides LLM inference in C/C++. An unvalidated `type` member in the `rpc_tensor` structure can cause a `global-buffer-overflow`, which may lead to memory data leakage. The vulnerability is fixed in release b3561.
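For context, a minimal sketch of the issue, assuming the RPC server uses the deserialized `type` field to index a fixed-size per-type lookup table. The enum, traits table, and `deserialize_tensor` shape below are hypothetical stand-ins, not the actual llama.cpp code:

```cpp
// Sketch only: an attacker-controlled rpc_tensor.type that is never
// range-checked can index past a fixed-size per-type table, reading
// out-of-bounds (global) memory.

#include <cstddef>
#include <cstdint>
#include <cstdio>

// Hypothetical stand-ins for ggml's type enum and per-type traits table.
enum tensor_type : uint32_t { TYPE_F32, TYPE_F16, TYPE_Q8_0, TYPE_COUNT };

struct type_traits { const char * name; size_t type_size; };

static const type_traits TRAITS[TYPE_COUNT] = {
    { "f32",  4 },
    { "f16",  2 },
    { "q8_0", 34 },
};

// Deserialized from the RPC wire format; `type` is attacker-controlled.
struct rpc_tensor { uint32_t type; uint64_t ne[4]; };

bool deserialize_tensor(const rpc_tensor & t) {
    // Mitigation along the lines of the b3561 fix: reject out-of-range
    // type values before they are used as an array index.
    if (t.type >= TYPE_COUNT) {
        fprintf(stderr, "invalid tensor type %u\n", t.type);
        return false;
    }
    printf("tensor type: %s (%zu bytes)\n",
           TRAITS[t.type].name, TRAITS[t.type].type_size);
    return true;
}

int main() {
    rpc_tensor bad = { 12345u, {1, 1, 1, 1} };  // out-of-range type from the wire
    return deserialize_tensor(bad) ? 0 : 1;
}
```

Whatever the exact code path, the takeaway is the same: any field received over the RPC interface must be range-checked before it is used as an array index.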

References

https://github.com/ggerganov/llama.cpp/security/advisories/GHSA-mqp6-7pv6-fqjf

https://github.com/ggerganov/llama.cpp/commit/b72942fac998672a79a1ae3c03b340f7e629980b

Details

Source: Mitre, NVD

Published: 2024-08-12

Updated: 2024-08-15

Risk Information

CVSS v2

Base Score: 7.8

Vector: CVSS2#AV:N/AC:L/Au:N/C:N/I:N/A:C

Severity: High

CVSS v3

Base Score: 7.5

Vector: CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H

Severity: High