ComfyUI is an open-source, node-based generative artificial intelligence program that allows users to generate images from text prompts. It uses Stable Diffusion as the base model for its image capabilities, combined with other tools such as ControlNet, Latent Consistency Models (LCM), and low-rank adaptation (LoRA), with each tool represented as a node in the program.
ComfyUI was released on GitHub in January 2023. According to comfyanonymous, the creator, a major goal of the project was to improve on existing software designs in terms of the user interface. The creator had been involved with Stability AI, but by 3 June 2024 that involvement had ended, and the creator and the core developers had formed an organization called Comfy Org. In July 2024, Nvidia announced support for ComfyUI within its RTX Remix modding software. As of July 2024, the project had 41,800 stars on GitHub.
ComfyUI's main feature is its node-based interface. Each node performs a function such as "load a model" or "write a prompt". A default node group is included with the program. When a prompt is queued, a highlighted frame appears around the currently executing node, starting from "load checkpoint" and ending with the final image and its save location. Workflows can be saved to a file, allowing users to reuse node workflows and share them with other users. Workflows are stored in JSON format and can be embedded in the generated images. Users have also created custom extensions to the base system, such as AnimateDiff, which aims to create videos.
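The JSON workflow described above is essentially a serialized node graph. The sketch below is a hypothetical, simplified fragment in the style of ComfyUI's API-format workflows (node ids mapping to a node type and its inputs, with links written as [source node id, output index] pairs); the node names and helper function are illustrative, not taken from ComfyUI's codebase.

```python
import json

# Hypothetical sketch of an API-style workflow: each key is a node id,
# "class_type" names the node, and inputs that come from another node are
# written as [source_node_id, output_index] pairs.
workflow = {
    "1": {"class_type": "CheckpointLoaderSimple",
          "inputs": {"ckpt_name": "model.safetensors"}},
    "2": {"class_type": "CLIPTextEncode",
          "inputs": {"text": "a watercolor fox", "clip": ["1", 1]}},
    "3": {"class_type": "KSampler",
          "inputs": {"model": ["1", 0], "positive": ["2", 0], "seed": 42}},
}

def upstream(wf, node_id):
    """Return the ids of nodes whose outputs feed into node_id."""
    deps = set()
    for value in wf[node_id]["inputs"].values():
        if isinstance(value, list):  # a [source_id, output_index] link
            deps.add(value[0])
    return deps

# Round-trip through JSON, as a workflow saved to a file would be.
restored = json.loads(json.dumps(workflow))
print(sorted(upstream(restored, "3")))  # nodes the sampler depends on
```

Because the whole graph is plain JSON, sharing a workflow amounts to sharing this one structure, which is also why it can be embedded in an image's metadata.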
In June 2024, a hacker group called "Nullbulge" compromised an extension of ComfyUI to add malicious code to it. The compromised extension, called ComfyUI_LLMVISION, was used for integrating the interface with AI language models GPT-4 and Claude 3, and was hosted on GitHub. Nullbulge hosted a list of hundreds of ComfyUI users' login details across multiple services on its website, while users of the extension reported receiving numerous login notifications. vpnMentor conducted security research on the extension and claimed it could "steal crypto wallets, screenshot the user’s screen, expose device information and IP addresses, and steal files that contain certain keywords or extensions".
Nullbulge's website claimed the group targeted users who committed "one of our sins", which included AI-art generation, art theft, promoting cryptocurrency, and other forms of theft from artists, such as from Patreon. The group claimed to be "a collective of individuals who believe in the importance of protecting artists' rights and ensuring fair compensation for their work" and stated its belief that "AI-generated artwork is detrimental to the creative industry and should be discouraged".