CVE-2026-1839: HuggingFace Transformers allows for arbitrary code execution in the `Trainer` class
A vulnerability in the Hugging Face Transformers library allows arbitrary code execution through the `Trainer` class. The `_load_rng_state()` method in `src/transformers/trainer.py` (line 3059) calls `torch.load()` without the `weights_only=True` parameter. The issue affects all versions of the library that support `torch>=2.2` when run with PyTorch versions below 2.6, since the `safe_globals()` context manager provides no protection in those versions. An attacker who supplies a malicious checkpoint file, such as a crafted `rng_state.pth`, can execute arbitrary code when the file is loaded. The issue is resolved in version v5.0.0rc3.
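The attack class can be illustrated without PyTorch: `torch.load()` deserializes checkpoints with Python's `pickle` module, and unpickling attacker-controlled bytes can invoke an arbitrary callable via `__reduce__`. Below is a minimal stdlib-only sketch of that mechanism; the class and function names are hypothetical and stand in for a crafted `rng_state.pth`, and the payload calls a benign function rather than, say, `os.system`.

```python
import pickle

ran = []

def attacker_function(msg):
    # Stands in for arbitrary attacker code (a real exploit might
    # return (os.system, ("...",)) from __reduce__ instead).
    ran.append(msg)

class MaliciousPayload:
    # pickle calls __reduce__ when serializing; on deserialization it
    # invokes the returned callable with the returned arguments.
    def __reduce__(self):
        return (attacker_function, ("pwned",))

# `blob` plays the role of a malicious checkpoint file's bytes.
blob = pickle.dumps(MaliciousPayload())

# Merely loading the untrusted bytes executes the attacker's callable,
# which is what torch.load() without weights_only=True permits.
pickle.loads(blob)
```

Passing `weights_only=True` to `torch.load()` restricts deserialization to a safe allowlist of types, which is the change the fix introduces; per the advisory, wrapping the call in `safe_globals()` is not an adequate substitute on PyTorch versions below 2.6.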
References
- github.com/advisories/GHSA-69w3-r845-3855
- github.com/huggingface/transformers
- github.com/huggingface/transformers/commit/03c8082ba4594c9b8d6fe190ca9bed0e5f8ca396
- github.com/huggingface/transformers/releases/tag/v5.0.0rc3
- huntr.com/bounties/3c77bb97-e493-493d-9a88-c57f5c536485
- nvd.nist.gov/vuln/detail/CVE-2026-1839