
COEY Cast Episode 156
Netflix VOID Fills the Gap in Post
Episode Overview
04/07/2026
Netflix just open sourced VOID, a video inpainting model built for a very real production problem: fixing footage that is almost usable. The conversation digs into how VOID removes people or objects and rebuilds motion, shadows, and scene logic so shots feel physically believable instead of obviously patched. That makes it a serious tool for editors, agencies, and brand teams trying to avoid reshoots, roto work, and cleanup chaos. The bigger takeaway goes beyond one model. AI video is shifting from flashy demo culture to workflow utility, where consistency, speed, and cost matter more than cinematic bragging rights. There is also a real trust question here, because better cleanup tools make provenance, disclosure, and human review a lot more important.


Episode Transcript
Hunter: It is Tuesday, April seventh, twenty twenty-six, and this is COEY Cast, the show that was assembled by a stack of machines with supervision issues. I’m Hunter.
Riley: And I’m Riley. Happy World Health Day, happy National Beer Day, and honestly happy No Housework Day, which feels like the most visionary holiday on the calendar.
Hunter: That one might be the real innovation. Also, yes, this episode was fully stitched together with AI tools, automations, synthetic voices, probably a workflow hanging on by one webhook, and we left the weirdness in on purpose.
Riley: If one of us suddenly sounds too confident, blame the robots. But let’s get into it, because the big story is actually super practical for once. Netflix open-sourced VOID, and this thing is not another shiny text-to-video flex. It’s a post-production weapon.
Hunter: Yeah. That’s what grabbed me. VOID is a video inpainting model, but the important part is that it tries to remove people or objects and then repair the footage in a physically believable way. Not just blur the problem away. If a removed object would’ve changed motion, debris, shadows, interaction, it tries to rebuild the scene like that thing was never there.
Riley: Which is kind of insane. It’s less “make me a dragon commercial” and more “please get that random tourist out of my perfect beach shot without making the water look haunted.”
Hunter: Exactly. And for editors, agencies, internal brand teams, this is immediate value. You don’t need a philosophical manifesto to understand it. You shot a clean take except there’s a boom mic, a logo you can’t show, a person walking through frame, a product prototype that changed, or a car in the background that ruins the vibe. Normally that becomes reshoot pain, roto pain, compositor pain, money pain.
Riley: Finance team just sat up in a swivel chair like, wait, did someone say fewer reshoots?
Hunter: Oh, they definitely heard it. But the first winners are probably creatives and post teams, because they get flexibility back. They can save shots that were almost usable. Then ops wins because the timeline shrinks. Then finance shows up late pretending they believed in innovation the whole time.
Riley: Mmm, I kind of disagree. I think finance wins emotionally first, but creative wins operationally first. Because the second a company hears “AI can fix the footage,” suddenly every busted take becomes, oh just patch it in post. Which is both a blessing and a curse.
Hunter: That’s fair. The danger is expectation inflation. Teams hear magic and assume every bad asset is now recoverable.
Riley: Right, and that’s how you end up with “Can we turn this vertical iPhone clip from the conference lobby into a premium brand film by Thursday?” Like babe, no. AI is powerful, but it’s not a witness protection program for bad production.
Hunter: And that gets to the trust question. Where’s the line between incredibly useful cleanup and, great, now no one trusts video evidence? Because once tools like VOID get better, the difference between harmless post cleanup and reality alteration gets blurrier.
Riley: Yeah. We are not in cute little blemish remover territory anymore. If a tool can remove an object and also reconstruct motion and scene logic, that’s a very different thing from cloning out a trash can in Photoshop.
Hunter: Totally. But I do think context matters. In commercial post-production, this is normal-ish. We’ve had object removal, compositing, cleanup, ADR, all of that forever. The new part is speed and accessibility. The real issue isn’t that creators will use tools like this. They will. It’s whether audiences, platforms, and legal systems get better at provenance and disclosure.
Riley: Yes. This is why every practical AI conversation eventually backs into trust infrastructure. Watermarking, metadata, chain of custody, platform labels, newsroom standards, campaign approvals. Boring stuff, but boring stuff is how society survives cool tools.
Hunter: Well said. And that’s why the open-source angle matters too. Netflix releasing this under an open license broadens experimentation, which is great, but it also means the capability spreads fast.
Riley: Although let’s be real, open source does not magically mean easy. From what people are saying, VOID is not exactly “download on your laptop during lunch and become Industrial Light and Magic by dinner.” It reportedly wants serious GPU memory.
Hunter: Yeah, this is not hobby-tier for most teams if you want to self-host it right now. So when people say open source equals democratized, the honest translation is more like democratized for teams with technical talent, infrastructure, or patience.
Riley: Or one extremely caffeinated IT person who accidentally becomes the company’s entire AI cinema department.
Hunter: That person exists. Their Slack status is always yellow.
Riley: Always. But speaking of practical instead of theatrical, Google’s Veo three point one Light is getting a lot of chatter for a similar reason. Not because it’s the prettiest demo in the universe, but because people are framing it as usable. Faster. Lower friction. More operational.
Hunter: And I think that shift is the story. We’ve spent so much time in this industry ranking who made the most cinematic sixty-second moonlit samurai ad. Meanwhile the teams actually shipping content are asking, can I generate variations quickly, keep style consistent enough, and do it at a cost that doesn’t make my budget cry?
Riley: Thank you. The era of winning the vibe Olympics might be ending. Or at least getting demoted. Because if I’m a marketing team, I may care less about absolute peak fidelity and more about whether I can make ten decent variants for paid social, explainers, product loops, internal videos, and landing page motion assets without setting the building on fire.
Hunter: That’s the right optimization target. Speed, cost, and consistency now matter more than one hero clip. Not that quality doesn’t matter, it does, but operationally reliable beats visually stunning if you’re running campaigns at volume.
Riley: Wait, hold up. I’d argue consistency might be the sleeper winner there. Because speed without consistency just creates faster chaos. If every output has a different face, lighting mood, and object logic, then congratulations, you automated more cleanup.
Hunter: That makes sense. Fast inconsistency is just a new bottleneck. So maybe the hierarchy is consistency first, then speed, then cost depending on team size.
Riley: Yeah, because creators can sometimes absorb cost if it saves time, and enterprise teams can sometimes absorb time if it protects brand consistency. But nobody wants flaky outputs at scale.
Hunter: And that ties into what we’ve been seeing across the ecosystem lately. Google has been pushing video closer to business workflows through tools tied into its broader stack. Runway keeps chasing production-ready multi-shot consistency. OpenAI rumors keep circling bigger context windows and stronger computer use, which matters because the real opportunity is systems that can plan, execute, and hand off. Microsoft is making transcription, voice, and image generation feel more like infrastructure than novelty. So this week’s pattern is pretty clear.
Riley: AI is growing up. Slightly. It still posts cringe sometimes, but it is growing up.
Hunter: The center of gravity is moving from wow demos to pipeline utility.
Riley: Which I love, because creators and marketers do not need another model that wins a film festival in a lab. They need one that survives fifteen revision requests, three stakeholder opinions, and a weird legal note.
Hunter: Exactly. And on the business side, the Anthropic story fits that same bigger pattern. The headlines around more TPU capacity, more enterprise scale, more distribution, more revenue acceleration, all of that says the model race is increasingly about compute supply chain, not just clever research.
Riley: Yeah, this part is less sexy but super important. It’s giving railroads. It’s giving cloud geopolitics. It’s giving, surprise, the future of your marketing automation stack may depend on who locked up enough chips.
Hunter: And if you’re a company building an AI roadmap, that should change your thinking. You can’t just ask, which model is best today. You have to ask, what happens if pricing changes, access gets constrained, latency spikes, policies shift, or one vendor decides your workflow is not a priority?
Riley: Or if your whole team quietly becomes dependent on like three giant vendors and a prayer.
Hunter: Right. Which is why I keep coming back to hybrid strategy. Use the best closed models where they’re clearly best. But keep some open tools, controllable tools, and modular automation in the stack where you can. Not because open source is always cheaper or easier, but because optionality matters.
Riley: Also because open source can be the sandbox. Even when you don’t deploy it at scale, it helps teams learn faster. You get to test weird workflows, privacy-sensitive tasks, local runs, custom pipelines. It’s not just about saving money. It’s about learning without asking permission every five minutes.
Hunter: That’s a great way to put it. The advantage is control and experimentation, not just cost. Though again, somebody still has to own it. Otherwise your so-called agent strategy becomes vibes, YAML files, and one internal doc everyone pretends to understand.
Riley: Oh my gosh, yes. “We have an autonomous content engine.” Okay, what does it do. “Well, currently it sends itself to a dead webhook and then apologizes.”
Hunter: We’ve all seen some version of that. And this is where leadership needs to stop dodging the hard questions. What should AI own versus assist? Where does human review stay mandatory? How do we preserve brand judgment? How do we train junior talent if the first draft is automated? What’s our policy for synthetic media and evidence?
Riley: And honestly, what are we optimizing for as a company? More content? Better content? Faster approvals? Lower cost? Because if leadership can’t answer that, they’re just buying AI like it’s merch.
Hunter: That’s the real test. Not whether you’ve adopted the latest thing, but whether the thing improves the workflow you actually have.
Riley: Which brings us back to VOID. I think that’s why people are excited. It solves a real problem in a real part of production. It’s not abstract. It’s not just benchmark theater. It’s “we have footage, it’s flawed, can AI save the shot?” That’s a very different category of useful.
Hunter: Yeah. It feels like one of the clearest examples lately of AI changing media operations from the inside out. Not replacing creativity. Expanding the rescue options around it.
Riley: And making post teams a little more dangerous, in a good way.
Hunter: In a very good way. Also maybe a slightly alarming way.
Riley: A little alarming is kind of the whole genre now.
Hunter: Fair. Alright, that is our show for today. Thanks for hanging out with us on COEY Cast.
Riley: And if you’re celebrating No Housework Day, we fully support doing less and automating more.
Hunter: Head over to COEY.com slash resources for AI news, updates, and more.
Riley: And subscribe so you don’t miss the next episode. Catch you later.




