
CVE-2025-32444

Severity: CRITICAL
CVSS Score: 10.0
Published: 2025-04-30
Updated: 2025-05-28

Description

vLLM is a high-throughput, memory-efficient inference and serving engine for LLMs. Versions from 0.6.5 up to, but not including, 0.8.5 that use the mooncake integration are vulnerable to remote code execution because they use pickle-based serialization over unsecured ZeroMQ sockets. The vulnerable sockets listened on all network interfaces, increasing the likelihood that an attacker could reach them to carry out an attack. vLLM instances that do not use the mooncake integration are not affected. This issue has been patched in version 0.8.5.
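A minimal stand-alone sketch (not vLLM or mooncake code) of why pickle over an untrusted socket amounts to remote code execution, and why a schema-constrained format avoids it. The `Exploit` class and the `eval` payload are illustrative assumptions; a real attacker could substitute any importable callable, such as `os.system`.

```python
import json
import pickle

# CWE-502 pattern: pickle invokes the callable returned by __reduce__
# at load time, so deserializing attacker-supplied bytes runs
# attacker-chosen code.
class Exploit:
    def __reduce__(self):
        # Illustrative payload; any importable callable works here.
        return (eval, ("40 + 2",))

payload = pickle.dumps(Exploit())   # what an attacker would send
result = pickle.loads(payload)      # attacker-chosen code runs here
assert result == 42

# Safer pattern: JSON (or another schema-constrained serializer) cannot
# encode callables, so deserializing it cannot trigger code execution.
safe = json.dumps({"prompt": "hello", "max_tokens": 8})
assert json.loads(safe)["prompt"] == "hello"
```

This is why the Python documentation warns never to unpickle data from an untrusted or unauthenticated source.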

CVSS Metrics

Vector: CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:C/C:H/I:H/A:H
Attack Vector: Network
Attack Complexity: Low
Privileges Required: None
User Interaction: None
Scope: Changed
Confidentiality: High
Integrity: High
Availability: High
Weaknesses: CWE-502 (Deserialization of Untrusted Data)
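The network attack vector above stems from the vulnerable sockets listening on all interfaces. A minimal sketch of the difference in exposure, using plain TCP sockets from the standard library rather than the actual ZeroMQ sockets:

```python
import socket

# Binding to 0.0.0.0 listens on every network interface, so the port is
# reachable from the network; binding to 127.0.0.1 keeps the socket
# reachable only from the local host.
exposed = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
exposed.bind(("0.0.0.0", 0))        # port 0: let the OS pick a free port
exposed.listen()

local_only = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
local_only.bind(("127.0.0.1", 0))   # loopback only: not network-reachable
local_only.listen()

exposed_addr = exposed.getsockname()[0]
local_addr = local_only.getsockname()[0]
exposed.close()
local_only.close()

assert exposed_addr == "0.0.0.0"
assert local_addr == "127.0.0.1"
```

Restricting the bind address limits exposure but does not remove the underlying deserialization flaw; upgrading to 0.8.5 does.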

Metadata

Primary Vendor: vLLM
Published: 2025-04-30
Last Modified: 2025-05-28
Source: NIST NVD
Note: Verify all details with official vendor sources before applying patches.

Affected Products

vllm : vllm (vendor : product)

