AI could help create the next generation of blockbusters


Over the next few Tuesdays, The Verge’s flagship podcast, The Vergecast, will air a miniseries dedicated to the use of artificial intelligence in industries that are often overlooked, hosted by Verge senior reporter Ashley Carman. This week, the series focuses on AI in the video world.

More specifically, we’re exploring how AI is being used as a tool to help people streamline the process of creating video content. Yes, this could mean that software plays a bigger role in the very human act of creativity, but what if machine learning tools, instead of replacing us, could support our work?

This is the pitch from Scott Prevost, VP of Adobe Sensei, Adobe’s machine learning platform, for the company’s AI products. “Sensei was founded on our firm belief that AI will democratize and enhance human creativity, but not replace it,” says Prevost. “Ultimately, it enables creators to do things they may not have been able to do before, but also to automate and speed up some of the everyday, repetitive tasks that are part of the creative process.”

Adobe has already built Sensei’s capabilities into its current products. Last fall, the company released a feature for Photoshop called Neural Filters that can remove artifacts from compressed images, change the lighting in a photo, or even alter a person’s face to turn a frown into a smile, for example, or adjust their “facial age.” From the user’s point of view, all of this is done by simply moving a few sliders.

Neural Filters from Adobe. Image: Adobe

Adobe also offers features like Content-Aware Fill, built into its video effects software After Effects, that can seamlessly remove objects from video, a task that would take hours or even days to do manually. Prevost told a story about a small team of documentary filmmakers who, reviewing their footage, discovered unwanted smudges on the image from a dirty camera lens. With Content-Aware Fill, the team was able to remove the blemishes from the video after identifying the object in just a single frame. Without software like Adobe’s, the team would have had to edit thousands of frames individually or reshoot the footage from scratch.

Content-Aware Fill from Adobe for video. GIF: Adobe

Another Adobe feature, Auto Reframe, uses AI to reformat video for different aspect ratios, keeping important objects in the frame that a standard static crop might cut off.

Adobe’s Auto Reframe feature. GIF: Adobe

Technology in this space is advancing significantly for consumers, but also for big-budget professionals. While AI video editing techniques like deepfakes haven’t really made it to the big screen just yet (most studios still rely on traditional CGI), one place where directors and Hollywood studios are beginning to use AI is dubbing.

A company called Flawless, which specializes in AI-powered VFX and filmmaking tools, is currently working on a product called TrueSync, which uses machine learning to create realistic, lip-synced visuals of actors across multiple languages. Scott Mann, co-CEO and co-founder of Flawless, told The Verge that this technique works significantly better than traditional CGI for reconstructing an actor’s mouth movements.

“You’re training a network to understand how a person speaks, so the mouth movements of an ooh and aah, the various visemes and phonemes that make up our language, are very person-specific,” says Mann. “And that’s why it takes so many details in the process to get something really authentic that speaks the way this person spoke.”

One example Flawless shared that really stood out was a scene from the movie Forrest Gump, dubbed so that Tom Hanks’ character speaks Japanese. The character’s emotion is still present, and the end result is definitely more believable than a traditional overdub, since the mouth movements are synchronized with the new dialogue. At points, you almost forget there is a different voice actor behind the scenes.

But as with any AI that changes an industry, we also have to think about the impact on jobs.

If someone creates, edits, and publishes their own projects, Adobe’s AI tools should save them a lot of time. But in larger production houses, where each role is delegated to a specific specialist (retouchers, colorists, editors, social media managers), these teams could end up being downsized.

Adobe’s Prevost believes the technology will shift jobs rather than eliminate them entirely. “We believe that some of the production work creatives used to do, they won’t do as much of,” he says. “They could become more like art directors. We believe it actually allows people to focus more on the creative aspects of their work and explore that wider creative space, while Sensei does some of the more mundane work.”

Will video teams end up being scaled down?

Scott Mann of Flawless has a similar view. While the company’s technology may reduce the need for screenwriters on translated films, it could open doors to new employment opportunities, he argues. “I would honestly say this role is kind of a director. What you are doing is conveying that performance. But I think with technology, and really with this process, it’s going to be about taking that side of the industry and growing that side of the industry.”

Do script supervisors become directors? Or do photo retouchers become art directors? Maybe. But what we can already see today is that many of these tools are merging workflows from different points in the creative process. Audio mixing, color grading, and graphics are all becoming part of multipurpose software. So instead of specializing in one particular creative skill, working in visual media may increasingly demand generalists in the future.

“I think the lines between images and videos and audio and 3D and augmented reality are slowly becoming blurred,” says Prevost. “There used to be people who specialized in images and people who specialized in video, and now you see people working in all of these media. So we believe that Sensei will play a big role in connecting these things together in a meaningful way.”
