Let's Destroy the E-THOT Industry Together!
A developer shares a complete, open-source pipeline to create convincing AI personas, aiming to devalue the market.
A developer posting under the name roychodraws has gone viral by open-sourcing a sophisticated, local AI workflow explicitly designed to create hyper-realistic virtual influencers. Dubbed an effort to 'destroy the E-THOT industry,' the project aims to devalue human-centric online persona markets by making AI-generated alternatives both indistinguishable from real creators and trivially easy to produce. The complete pipeline is shared publicly on TikTok, Reddit, and GitHub, inviting others to replicate and expand on the work.
The technical stack is a meticulous combination of existing tools. It starts with the Wan Animate workflow for initial video generation, augmented with a custom low-rank adaptation (LoRA) model to keep facial features consistent across shots. Post-processing in Adobe After Effects applies targeted Lumetri color adjustments to strip the over-saturation and excess contrast that typically give AI footage away, followed by a paid plugin called Beauty Box for shine removal. Finally, the video is upscaled to 4K using the open-source Seed VR2 Upscaler, interpolated for smooth motion, and finished with subtle lens and motion blurs for cinematic realism.
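The grading step can be approximated outside After Effects. The sketch below implements a generic contrast/saturation reduction on an RGB frame with NumPy; the blend formulas and the `contrast=0.5` / `saturation=0.2` values are illustrative assumptions, not the developer's exact Lumetri settings.

```python
import numpy as np

def grade(frame, contrast=0.5, saturation=0.2):
    """Reduce contrast and saturation of an RGB frame with values in [0, 1].

    The defaults are illustrative stand-ins for the workflow's reported
    "-50 contrast, -80% saturation" Lumetri adjustments.
    """
    f = frame.astype(np.float64)
    # Contrast: pull every channel value toward mid-gray (0.5).
    f = 0.5 + (f - 0.5) * contrast
    # Saturation: pull each pixel toward its own grayscale average.
    gray = f.mean(axis=-1, keepdims=True)
    f = gray + (f - gray) * saturation
    return np.clip(f, 0.0, 1.0)

# Toy 2x2 frame: pure red, pure green, mid-gray, white.
frame = np.array([[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]],
                  [[0.5, 0.5, 0.5], [1.0, 1.0, 1.0]]])
out = grade(frame)
```

The point of the sketch is the order of operations the article describes: flatten dynamic range first, then desaturate, so the over-vivid look associated with AI output is muted before upscaling.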
The developer's manifesto argues that by democratizing the ability to create flawless virtual beings, the perceived value of human creators in certain online spaces will plummet. This provocative stance has sparked intense debate about the ethics of AI, automation's impact on digital labor, and the future of online identity. While presented as a social experiment, the release provides a tangible blueprint for high-fidelity synthetic media creation accessible to anyone with capable hardware.
- The workflow uses Wan Animate with a custom face-consistency LoRA to generate the base animation.
- Post-processing in After Effects applies specific color grading (-50 contrast, -80% saturation) and uses the Beauty Box plugin to remove AI 'shine'.
- The final video is upscaled to 4K using the open-source Seed VR2 Upscaler and interpolated for smooth motion.
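The interpolation step in the pipeline uses a dedicated motion-aware model, but the underlying idea of raising frame rate by synthesizing in-between frames can be shown with a naive linear cross-fade. This is only a toy stand-in: real interpolators warp pixels along estimated motion vectors rather than blending.

```python
import numpy as np

def interpolate_frames(frames, factor=2):
    """Insert (factor - 1) linearly blended frames between each pair.

    A naive cross-fade illustration; production interpolators (like the
    one in the article's workflow) estimate motion instead of blending.
    """
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        for i in range(factor):
            t = i / factor
            out.append((1 - t) * a + t * b)  # blend from a toward b
    out.append(frames[-1])
    return out

# Two 1-pixel "frames": black then white; doubling the rate
# inserts a mid-gray frame between them.
frames = [np.zeros((1, 1, 3)), np.ones((1, 1, 3))]
doubled = interpolate_frames(frames, factor=2)
```

Doubling a two-frame clip this way yields three frames, with the inserted frame halfway between its neighbors.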
Why It Matters
The release demonstrates a near-professional, widely accessible pipeline for synthetic media, accelerating debates over AI's impact on digital labor and online identity.