Very interesting discrepancy in the attached example:<p>- "removing the kettlebell" led to removing the visual representation of the kettlebell as well as the deformation it makes on the pillow<p>- "removing the hands" removed the child's hands from the tops, but did not then lead to the tops falling over!<p>Others, like the colliding cars, are in some weird gray area between the two.<p>One should note that as these tools proliferate, there is a lot of artistic expression we are giving up to these imprecise natural-language parsing engines.
This would make economic sense for a ton more "Choose Your Own Adventure" content.<p>I can imagine watching Bandersnatch and getting rid of the game developer in frame 1. The remaining 90 minutes: his dad having a quiet, stress-free Tuesday.
Woah, this is absolutely sick! Ten-years-ago me would have been surprised that something so small can encode all the world knowledge necessary to make this plausible. That they'd make this openly available is a dream.
Anyone who's ever had a breakup will thank you.
CogVideoX continues to be an academic powerhouse model. So many papers built on this little thing.
Really weird comments here. It's a VFX technique for cinematography, one of many of its kind (e.g. support-wire removal). Cinematography in general is about showing something that doesn't exist, unless it's a documentary. Your <i>only</i> reaction, apparently, is to cry censorship. That says a lot about the current Overton window, and I think it's something you should reflect on.
No, I get it, and the benefits of putting what would have required a professional VFX team at (eventually) the fingertips of every amateur filmmaker are amazing.<p>It’s also alarming what’s possible when reassembling photons <i>en masse</i> becomes commoditized.
To be honest, if this is what we see in open access, then something closed must already exist. I don't want to start another conspiracy theory - I just want to point out the theoretical possibility - but if it has been created, it has probably already been tested and possibly even used widely. So, in conclusion, if a government or agency wanted to use this technique for censorship, they would most likely have tried it already.<p>It's a bit late to ring the bells once the city walls have been taken. In other (shorter) words: if someone wanted to use this for censorship, they would already have tried it, so it's either too late or too early. Interesting as that possibility is (though neither good nor fun), it remains only a possibility.<p>...I hope that one day everyone will finally learn that most technologies are neither evil nor good; they're neutral, they're just tools. And most (no, not all) of them were also created with good intentions.
You don't really need AI to do this; it can be done just fine with traditional techniques, it's just labor-intensive. Hell, you can probably find a dude on Fiverr to do it right now for a couple hundred, YMMV.
The act of editing existing footage to mask reality with non-reality is just that—the action of making something real less real. It can be used for filmmaking (and obviously often is), but it can be used for anything else too.<p>The issue is how easily these tools can (and will) enable the worst faith actors and actions. It's not controversial (or shocking) to think the current situation for deep-fakes or deceptive edits is historically awful. It's equally reasonable to think these tools are going to do less good for VFX houses in Hollywood, and more evil in authoritarian regimes or chaotic social media networks. You're right that the Overton window has moved—but we ought to blame a White House that disseminates deepfakes, not commenters on HackerNews.<p>Tools aren't just tools, nothing exists in a vacuum. A plane is a useful means of transportation, but that doesn't mean everyone should be rushing into the cockpit. It's pretty plain to me how a tool that streamlines doctoring footage to such a useful (and deceptive) degree is just a recipe for disaster. Considering the benefit to society is...slightly eased CGI work (?), I'll easily label this a net-negative for us all.
Given the current situation in the world, why do you not think about how technology can be misused, especially if it's VFX?<p>I guess I have seen more AI-generated spam ads than content created for the intended purpose.
> Given the current situation in the world, why do you not think about how technology can be misused, especially if it's VFX?<p>I think the parent's point was that while you <i>need</i> to think about how this technology can be misused, that's not the <i>only</i> thing you <i>need</i> to and <i>can</i> think about.
But wouldn't using this for, e.g., removing a support wire result in making Superman fall down?
Yes, but imagine how bad Stalin's reign of terror would have been with modern cinematographic techniques.
Yes, but why are we required to imagine <i>only</i> this? By this logic, why don't we try to imagine how bad Stalin's (or Hitler's, or anyone's) reign of terror would have been with state-controlled social media? With a state-censored Internet (that's current Russia, by the way)? With modern communication technologies, and so on?<p>Hell yeah, let's go back to the caves just because science, technology and progress can actually empower everyone, including dictators, maniacs, psychopaths, predators and "bad people", not just "good" people (the definition of "good" varies, but let's assume we're using common principles of goodness).
Yay! More tools for faking stuff!
I don't see any demos. Did I miss something? Not interested in running a Colab.
The idea of applying this modern magic to history & art is horrifying. The dream of Minitrue!<p>Presumably Netflix wants to erase smoking from its back catalog or some other bit of papier-mâché Stalinism.<p>Oh well, neat bit of auto-regressive theater.
This will save production houses a lot of money, considering each country may have different censorship rules.
It also lets them quickly censor things the West doesn't like, as a client state of particular nations in the Middle East.<p>Use a VPN sometime if you have doubts that this happens in Western media, including the US.
Why China?<p>I see this being used in many countries, especially in the West.<p>We've crashed headfirst into a dystopian future. But worry not! The crash can be edited out of reality.
So basically, they'll replace a Coke can with a Red Bull or similar, depending on who pays for ads in the video? What else are they gonna use it for?
Removing film crew, boom mics, and stray props from a scene would surely be useful to studios. It may even enable some shots that previously would have been impossible due to the positioning of cameras, etc.
Ultimately, we could get 4K remasters of old movies and shows where it's currently not worth redoing the FX.
I can see this being one of many AI tools for video editors. Combined with a handful of other tricks, an SFX shop should see tremendously higher productivity.