Original Article Text
Malicious Blender model files deliver StealC infostealing malware

A Russian-linked campaign is delivering the StealC V2 information stealer through malicious Blender files uploaded to 3D model marketplaces such as CGTrader.

Blender is a powerful open-source 3D creation suite that can execute Python scripts for automation, custom user interface panels, add-ons, rendering, rigging tools, and pipeline integration. When the Auto Run option is enabled, a Python script embedded in a .blend file runs automatically on open; for example, opening a character rig can automatically load its facial controls and custom UI panels with the required buttons and sliders. Despite the potential for abuse, users often enable Auto Run for convenience.

Researchers at cybersecurity company Morphisec observed attacks using malicious .blend files whose embedded Python code fetches a malware loader from a Cloudflare Workers domain. The loader then retrieves a PowerShell script that downloads two ZIP archives, ZalypaGyliveraV1 and BLENDERX, from attacker-controlled IP addresses. The archives unpack into the %TEMP% folder and drop LNK files in the Startup directory for persistence. They then deploy two payloads: the StealC infostealer and an auxiliary Python stealer, likely used for redundancy.

Morphisec reports that the StealC sample used in this campaign is the latest variant of the malware's second major version, which Zscaler researchers analyzed earlier this year. The latest StealC has expanded data-stealing capabilities, supporting exfiltration from a broader range of applications and services. Although the malware has been documented since 2023, newer releases continue to evade antivirus products: Morphisec notes that no security engine on VirusTotal detected the StealC variant it analyzed.

Because 3D model marketplaces cannot scrutinize the code in user-submitted files, Blender users are advised to exercise caution with files sourced from such platforms and to consider disabling the automatic execution of code. In Blender, open Edit > Preferences and uncheck the 'Auto Run Python Scripts' option. 3D assets should be treated like executable files: only trust publishers with a proven track record, and test everything else in a sandboxed environment.
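For users who prefer to toggle the setting from a script rather than the preferences dialog, the sketch below shows one way to check and disable the option through Blender's Python API. This is a minimal illustration, not something from the Morphisec report; it assumes a recent Blender release where the setting is exposed as use_scripts_auto_execute, and property names may differ in older versions.

```python
# Minimal sketch (assumption: recent Blender releases expose the preference
# as use_scripts_auto_execute): check and disable "Auto Run Python Scripts"
# from Blender's Python console or a trusted startup script.
import bpy

prefs = bpy.context.preferences.filepaths
if prefs.use_scripts_auto_execute:
    # Equivalent to unchecking the option under Edit > Preferences
    prefs.use_scripts_auto_execute = False
    bpy.ops.wm.save_userpref()  # persist the change across sessions

print("Auto Run Python Scripts enabled:", prefs.use_scripts_auto_execute)
```

Blender can also be launched with the --disable-autoexec command-line flag to override the preference for a single session, which is useful when opening a file you do not fully trust.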
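When a downloaded asset needs to be examined before it is trusted, the embedded text datablocks can be listed without letting them run. The sketch below is a hypothetical triage script (the file name inspect_blend.py and the argument handling are assumptions for illustration); it relies on the use_scripts option of wm.open_mainfile and the Text.use_module flag, and should still be run inside a sandboxed or disposable environment, as the article recommends.

```python
# Hypothetical triage sketch: list the embedded text datablocks in an
# untrusted .blend file without allowing them to execute. Intended to be
# run in background mode, e.g.:
#   blender --background --disable-autoexec --python inspect_blend.py -- suspect.blend
import sys
import bpy

# Everything after "--" on the Blender command line is passed through to the script
blend_path = sys.argv[sys.argv.index("--") + 1]

# use_scripts=False prevents scripts registered in the file from running on load
bpy.ops.wm.open_mainfile(filepath=blend_path, use_scripts=False)

if not bpy.data.texts:
    print("No embedded text blocks found.")

for text in bpy.data.texts:
    flag = " [registered to run on load]" if text.use_module else ""
    print(f"Embedded text block: {text.name}, {len(text.as_string())} chars{flag}")
```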
Daily Brief Summary
A Russian-associated operation is distributing StealC V2 malware via malicious Blender files on 3D model marketplaces like CGTrader, targeting users of the open-source 3D creation suite.
The attack exploits Blender's Auto Run feature, using Python scripts embedded in .blend files to fetch a malware loader from a Cloudflare Workers domain.
The loader retrieves a PowerShell script, which downloads two ZIP archives containing the StealC infostealer and an auxiliary Python stealer (likely included for redundancy) and establishes persistence via shortcut files dropped in the Startup folder.
Researchers from Morphisec noted that this StealC variant, which no security engine on VirusTotal detected, expands the malware's data-stealing capabilities and continues to challenge antivirus solutions.
Users are advised to disable Blender's auto-execution of scripts and treat 3D assets as executable files, utilizing sandboxed environments for safer testing.
This campaign underscores the importance of cautious file handling and the need for improved scrutiny of user-submitted content on digital marketplaces.