A vulnerability in the LangChainLLM class of the run-llama/llama_index repository, version v0.12.5, allows for a Denial of Service (DoS) attack. The stream_complete method runs the LLM in a separate thread and retrieves the result via the get_response_gen method of the StreamingGeneratorCallbackHandler class. If the thread terminates abnormally before _llm.predict executes, there is no exception handling for this case, and get_response_gen loops indefinitely waiting for output that never arrives. The condition can be triggered by providing an input of an incorrect type, which causes the worker thread to terminate while the calling process keeps running indefinitely.
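The following is a minimal, self-contained sketch of the failure pattern, using hypothetical stand-in names rather than the library's actual code: a worker thread raises before it ever signals completion, so the consumer of the streaming generator blocks forever on an empty queue.

```python
import queue
import threading

class StreamingHandler:
    """Simplified stand-in for StreamingGeneratorCallbackHandler."""
    def __init__(self):
        self._queue = queue.Queue()
        self._done = object()  # sentinel marking end of stream

    def on_token(self, token):
        self._queue.put(token)

    def on_end(self):
        self._queue.put(self._done)

    def get_response_gen(self):
        # Blocks forever if the producer thread dies before on_end().
        while True:
            item = self._queue.get()
            if item is self._done:
                break
            yield item

def stream_complete(prompt):
    handler = StreamingHandler()

    def run_llm():
        # Stands in for _llm.predict(); a wrongly typed input raises
        # before any token or the end sentinel reaches the queue.
        raise TypeError("prompt must be a string")
        handler.on_end()  # never reached

    threading.Thread(target=run_llm, daemon=True).start()
    return handler.get_response_gen()

# Consuming the generator hangs: the worker died without queueing
# the sentinel, so get_response_gen() waits indefinitely.
# for token in stream_complete(12345):
#     print(token)
```

A mitigation consistent with CWE-755 would be to wrap the worker body in try/except/finally so that completion (or the error itself) is always signalled to the queue, letting get_response_gen terminate instead of spinning; this is an illustrative remediation, not the project's published fix.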
References
Configurations
No configuration.
History
20 Mar 2025, 10:15
Type | Values Removed | Values Added
---|---|---
New CVE | |
Information
Published : 2025-03-20 10:15
Updated : 2025-03-20 10:15
NVD link : CVE-2024-12704
Mitre link : CVE-2024-12704
CVE.ORG link : CVE-2024-12704
Products Affected
No product.
CWE
CWE-755
Improper Handling of Exceptional Conditions