Image & Video

Anyone else using LTX locally on Mac via Draw Things? Here’s a WWII-style short I made.

All AI processing ran on a Mac via Draw Things, with no cloud reliance.

Deep Dive

A Reddit user has demonstrated the growing power of local AI video creation by producing a WWII-inspired short film entirely on a Mac. The creator, u/coberholzer, used OpenAI's Images 2 to generate a series of still images—starting with a "dog man in a glass box"—then animated them using Lightricks' LTX 2.3 model, running locally through the Draw Things app. The resulting clips were edited and stitched together in DaVinci Resolve, while the music came from Suno and the voiceover and sound effects from ElevenLabs. The entire workflow ran without cloud services, highlighting the increasing feasibility of running advanced diffusion models on consumer hardware.

The user acknowledged that character consistency across frames was imperfect, but noted that a planned turnaround sheet could fix that. They expressed excitement about future releases of LTX and Draw Things, saying the tools are making image-to-video generation "more accessible to Mac users." The post sparked discussion in the AI community about local video generation, with many praising the creative potential of combining open-source and commercial AI tools on a laptop. This hands-on experiment underscores a broader trend: professionals and hobbyists can now produce high-quality AI content without expensive cloud credits.

Key Points
  • Used OpenAI Images 2 for stills, LTX 2.3 via Draw Things for motion, DaVinci Resolve for editing.
  • Music generated in Suno; voiceover and sound effects in ElevenLabs.
  • All processing ran locally on a Mac, with no per-frame cloud API costs.

Why It Matters

Local AI video generation on Mac is now viable, lowering barriers for indie creators.