Marvel Studios made history in December when the trailer for its 2019 film Avengers: Endgame racked up a record-breaking 289 million online views in its first 24 hours. Arguably more remarkable than the trailer's popularity, however, is the technology behind the film: Marvel used artificial intelligence to capture human faces and recreate them on digital characters. And while A.I. is making its mark in Hollywood through visual effects, or VFX, companies are also experimenting with machine learning to analyze film content in fascinating ways.

"Viewers will not only be able to search and find video based on an actor or genre, but on more complex things like tone of a scene or societal message of the clip," says Arvin Patel, chief intellectual property officer at entertainment technology company TiVo. By turning to A.I. to automate more and more of the work behind the scenes, Hollywood stands to benefit from some pretty transformative innovation.

Here are four companies using A.I. to change the way Hollywood does business.

Digital Domain

One of the most exciting ways A.I. is changing film production involves computer-generated imagery, or CGI. While the motion-capture suit has long been Hollywood's favorite tool for creating CGI characters that mirror the movements of real actors, motion-capture technology provides VFX artists with only basic visual reference points. At Los Angeles-based production company Digital Domain, co-founded by Titanic director James Cameron in 1993, VFX artists are using proprietary A.I. to transfer human performances onto their CGI characters far more efficiently.

"We can actually take actors' performances--and especially facial performances--and exactly transfer them to digital characters," says Darren Hendler, director of Digital Domain's appropriately titled Digital Human Group, a division the company created just last year. "In the past, doing that would have required building a full CG version of the actor's head and getting it to look photo-real." Digital Domain used this technology to recreate the facial expressions of actor Josh Brolin's Thanos in 2018's Avengers: Infinity War and in next year's Avengers: Endgame. An eight-foot supervillain with a condition called the Deviant Syndrome, Thanos has a menacing, mutant-like appearance that's unique among the characters in the film. 

Though Hendler expects the technology to become much more mainstream, reducing VFX costs for both studio movies and independent films, he doesn't anticipate it putting people out of work. "The amount of people required to make a feature film is definitely getting to be less, but at the same time there are more feature films in production, so I don't know that there's any net loss," he says. "It's just a massive improvement in quality, streamlining and throughput."

IBM

In 2016, IBM used A.I. to create an alternate movie trailer for 20th Century Fox's science-fiction movie Morgan, which follows an artificially created humanoid. To teach the company's supercomputer Watson which scenes make for a good trailer, IBM researchers had Watson "watch" 100 movies and their corresponding trailers. After identifying patterns in the visuals and sounds of those 100 trailers, Watson watched Morgan and quickly suggested 10 scenes to include in the trailer. A film editor used nine of them, turning out the new trailer in a single day, far faster than the typical process for cutting a trailer.
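IBM hasn't published Watson's trailer pipeline, but the underlying recipe--score scenes on audiovisual features learned from existing movie/trailer pairs, then rank a new film's scenes--can be sketched as a simple classifier. The feature names, training data, and logistic model below are illustrative assumptions, not Watson's actual components.

```python
import numpy as np

# Toy sketch of trailer-scene selection: learn, from movies whose trailers
# already exist, which scene features predict "trailer-worthy," then rank the
# scenes of a new film. All data here is synthetic and for illustration only.

rng = np.random.default_rng(1)

FEATURES = ["audio_energy", "visual_motion", "face_count", "music_tension"]
n_train = 2000  # scenes pooled from the reference movies

X = rng.normal(size=(n_train, len(FEATURES)))
# Synthetic labels: 1 if the scene made it into its movie's trailer.
true_w = np.array([1.2, 0.8, 0.5, 1.5])
y = (rng.random(n_train) < 1 / (1 + np.exp(-X @ true_w))).astype(float)

# Plain logistic regression fit by gradient descent (no external ML library).
w = np.zeros(len(FEATURES))
for _ in range(500):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - y) / n_train

def rank_scenes(scene_features: np.ndarray, top_k: int = 10) -> np.ndarray:
    """Return indices of the top_k scenes most likely to fit a trailer."""
    scores = scene_features @ w
    return np.argsort(scores)[::-1][:top_k]

morgan_scenes = rng.normal(size=(120, len(FEATURES)))  # hypothetical scene features
print(rank_scenes(morgan_scenes))
```

Once the model is trained, ranking an entire film's scenes is nearly instant, which is why the editor's job shifts from hunting for candidates to choosing among them.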

"Where IBM has made advances is in developing improved capabilities for computers to understand things like vision and speech and language," says John R. Smith, IBM Research's head of A.I. Tech at the company's T. J. Watson Research Center. "Over time, the computer will be able to get deeper and deeper information out of that content." 

Last year, IBM worked with Disney to teach Watson how to identify the 10 most emotional moments featuring the robot character C-3PO from Star Wars. Because C-3PO can't move his face, however, Watson wasn't able to derive any information from facial expressions. "We had to look at a lot of information including body gestures, signals of agitation in voice inflection, and other sounds in the scene that could be indicators that it was an emotional moment," says Smith. "A.I. has the potential for helping us understand content better."
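Smith's description suggests a straightforward late-fusion approach: score each candidate moment on several non-facial signals, then combine the scores. A toy version, with made-up modality names and weights rather than anything IBM has disclosed, might look like this:

```python
import numpy as np

# Minimal sketch of multimodal emotion scoring: when faces carry no signal
# (C-3PO can't emote facially), fuse per-moment scores from other modalities.
# Modality names, weights, and data are assumptions for illustration.

rng = np.random.default_rng(2)
n_moments = 300  # candidate moments across the films

signals = {
    "body_gesture_agitation": rng.random(n_moments),
    "voice_inflection": rng.random(n_moments),
    "scene_sound_intensity": rng.random(n_moments),
}
weights = {
    "body_gesture_agitation": 0.4,
    "voice_inflection": 0.4,
    "scene_sound_intensity": 0.2,
}

# Weighted late fusion, then pick the 10 highest-scoring moments.
emotion_score = sum(w * signals[k] for k, w in weights.items())
top10 = np.argsort(emotion_score)[::-1][:10]
print("Most emotional moments:", top10)
```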

Arraiy

Founded in 2016, Palo Alto, California-based Arraiy has developed proprietary A.I. that, among other things, can seamlessly add photorealistic CGI objects to scenes, even when both the camera and the object itself are moving. One of the reasons this has always been so challenging is that as the camera moves, the angle from which the viewer sees the CGI object also has to change. Instead of having VFX artists alter the appearance of the CGI object one frame at a time, 24 times per second, Arraiy's A.I. software does it automatically.

"What would otherwise require a whole team of people to turn out, one artist can do at his or her desk," says Arraiy co-founder Ethan Rublee, adding that the company's software can be trained by VFX artists to automate an infinite number of specific use cases. "When you train a network to do these things, you train it once, and then it works basically in real time for any number of frames." Arraiy has demonstrated its technology on several short-form projects, including the Black Eyed Peas music video for Street Living, in which the company superimposed band members' mouths over images from the civil rights era.

Arraiy raised more than $10 million in a Series A round led by Lux Capital and Softbank Ventures in March and is bringing its software to market in early 2019. "You no longer have to be a multibillion-dollar studio to have access to the very best visual effects," says Shahin Farshchi, a partner at Lux Capital. "You can now be a smaller studio or even an independent artist and be able to get the same level of visual effects that you otherwise see in the Iron Man and X-Men movies."

Sinemia

Founded in 2014 by engineer Rıfat Oğuz, Los Angeles-based Sinemia is a subscription-based movie ticketing service that lets subscribers see three movies in theaters per month for $8.99, among other pricing plans. Formerly a competitor to movie subscription service MoviePass--before MoviePass ran out of money--Sinemia uses A.I. to recommend specific film titles to its subscribers. Unlike Netflix, which gathers data only on the movies you watch on its platform, Sinemia's mobile app asks users which movies they liked or disliked, including films that are no longer in theaters. After collecting data on enough titles, the company cross-references your responses with those of other subscribers to predict which films you'll enjoy.

"The A.I. tries to find your movie soul mate," Oğuz says. "Because you're so similar to these people, you're most likely going to also love this movie ... They're almost like your teacher."

Sinemia has subscribers in countries including Australia, Canada, and the U.K., and recently became cash-flow positive in the U.S., according to Oğuz. If subscribers trust the company's movie recommendation A.I. enough to go to the movies more often, Sinemia could help reverse declining theater attendance, which fell 5.8 percent to 1.24 billion tickets sold in the U.S. and Canada in 2017, the lowest figure since 1992, according to data from researcher Box Office Mojo.

Perhaps most interesting about the use of A.I. in Hollywood is that industry experts don't expect the technology to eliminate jobs the way other kinds of automation have.

"The common thread you hear from those commingling A.I. into the creative filmmaking process is that A.I. takes more and more of the mundane tasks away from the human, allowing them to focus more time and energy on their creative skills," TiVo's Patel says. "When you cast your sights far into the future, there are many significantly transformative possibilities."

Published on: Dec 19, 2018