Characters in movies and TV series live according to particular laws of space and time. When the audience is shown a scene from the hero's childhood, the director normally casts a child with a similar appearance. But what if an adult character needs to appear as they did in their youth? That's where AI de-aging technology comes in.
Let's see how film heroes are "reverted" to their youthful appearances and what difficulties producers have to overcome in order to shoot a short scene.
Until de-aging technology was sufficiently developed, the best actor transformations were created by makeup artists. The art of makeup is one of the oldest techniques for changing an actor's apparent age.
The fact that movies were previously shot on grainy film also helped hide makeup tricks. Because the resolution of old film stock is much lower, viewers could not see the fine details of an actor's face.
However, with modern digital, high-resolution movie production, makeup alone is no longer enough. Viewers want to see a genuine picture without compromising quality. That's where de-aging technology comes to the fore.
In short, de-aging is a 3D effect technology used to make an actor look younger. To do so, post-production studios typically edit a digital image or apply computer-generated imagery (CGI) overlays or touch-ups to the necessary scenes.
When you see de-aged actors, you are almost always looking at CGI built with visual effects (VFX). Actors are filmed with tiny tracking markers attached to their faces so that the movements of a 3D model can be synchronized with the movements of the actor's head.
Then an invisible 3D version of the head is built and locked to those movements. On top of this head, an editor applies "patches" from a point cloud, which act as filters for the skin in the original image. Sometimes these "patches" are taken from footage of a young stunt double, who tries to repeat the original actor's movements, playing the same scene on the same set and under the same lighting.
Then an editor performs a computer "facelift": tightening the chin, drawing the cheekbones more clearly, and reducing the ears and nose, which keep growing throughout an actor's life.
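To make the idea of a facial point cloud more concrete, here is a minimal sketch of per-frame landmark tracking using the open-source MediaPipe library. This is not the proprietary pipeline the studios use; the file name plate_footage.mp4 and the downstream rig are assumptions, and the sketch only illustrates how a dense set of face points can be extracted from ordinary footage to drive a digital head.

```python
import cv2
import mediapipe as mp

mp_face_mesh = mp.solutions.face_mesh

# Hypothetical source footage; in production this would be the filmed plate.
cap = cv2.VideoCapture("plate_footage.mp4")

with mp_face_mesh.FaceMesh(static_image_mode=False,
                           max_num_faces=1,
                           refine_landmarks=True) as face_mesh:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV reads frames as BGR.
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            landmarks = results.multi_face_landmarks[0].landmark
            # Each landmark carries normalized x, y plus a relative depth z:
            # this per-frame point cloud is what a solver could use to keep
            # a digital head locked to the actor's movements.
            points = [(lm.x, lm.y, lm.z) for lm in landmarks]

cap.release()
```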
Watch the video to see Will Smith showing how the technology works in his recent movie Gemini Man:
The standard for de-aging an actor is now considered to be the Kurt Russell scene from Guardians of the Galaxy Vol. 2. Perhaps this was the first major motion picture where the rejuvenating effects on an actor were done so well that unsuspecting viewers never noticed them. And it was all made possible with a combination of makeup and professional CGI.
Few people know this, but de-aging technology is often combined with a screen double cast to resemble a de-aged version of the actor. This was precisely the case with Kurt Russell. Here's Kurt standing next to his young double, Aaron Schwartz.
First, the made-up Kurt Russell acted the scene. His gestures and facial expressions were then repeated by Aaron Schwartz, whose youthful skin was digitally "applied" to Russell. Processing all the CGI for the scene took a couple of months.
Another high-profile example of the technology's use is the recently acclaimed film The Irishman. The leading actors, Robert De Niro and Al Pacino, 76 and 79 years old respectively, played 40-year-old men in the movie. However, De Niro was de-aged differently than in Guardians of the Galaxy.
The actor flatly refused to work with a capture helmet and facial markers. So, specifically for this film, the ILM studio team built its own software, which allowed the production team to "disassemble" the actor's facial expressions from the footage without visible markers.
Filming was carried out with three cameras on a single rig: two additional cameras on either side of the primary camera captured the action in infrared. The ILM software then applied this data to a digital model of the actor's head.
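ILM's actual markerless-capture software is proprietary, but the geometric principle behind the two extra cameras is classic stereo triangulation: the same facial feature seen from two offset viewpoints pins down its position in 3D. Here is a minimal sketch using OpenCV; the projection matrices and pixel coordinates are invented placeholders, not values from the production.

```python
import numpy as np
import cv2

# Hypothetical 3x4 projection matrices for the two infrared "witness"
# cameras, as would be obtained from a prior stereo calibration.
P_left = np.hstack([np.eye(3), np.zeros((3, 1))])
P_right = np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])

# Matching 2D positions (pixels) of the same facial features as seen by
# each camera in one frame; shape is (2, N) for N tracked points.
pts_left = np.array([[312.0, 220.0], [298.0, 305.0]]).T
pts_right = np.array([[287.0, 221.0], [274.0, 306.0]]).T

# Triangulate into homogeneous 3D coordinates, then normalize.
pts_4d = cv2.triangulatePoints(P_left, P_right, pts_left, pts_right)
pts_3d = (pts_4d[:3] / pts_4d[3]).T

print(pts_3d)  # 3D feature positions to apply to the digital head model
```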
Since The Irishman did not use younger stunt doubles, many viewers noted that the de-aged De Niro and Pacino nevertheless moved like old men.
You can watch more examples of how this technology is applied in cinema in this video:
Have you heard anything about the upcoming film Finding Jack, where the famous James Dean will be "brought back to life" using modern CGI technology?
Obviously, this is not the same as de-aging actors who are still alive, but the feat is made possible using almost the same technology. The producers' task is to create an entirely new role for the deceased actor.
At the same time, they will not use a stunt double for this but simply transfer the star's acting style to the new character. What will happen if the movie succeeds?
Will this usher in an era of "eternal actors", when Arnold Schwarzenegger or Brad Pitt can stay on screen forever by transferring the rights to their likenesses to Warner Bros or Disney? Maybe. Only time will tell.
In the meantime, de-aging technology remains expensive; only large Hollywood companies can afford it. A digital head model costs about $1 million, and each animated frame, depending on its complexity, runs from $30,000 to $100,000.
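Taking those quoted figures at face value, a quick back-of-the-envelope calculation shows why the technique is limited to big-budget productions (the scene length and frame rate here are illustrative assumptions):

```python
# Rough cost of a de-aged sequence, using the figures quoted above.
digital_head_model = 1_000_000                 # one-time cost per actor, USD
cost_per_frame_low, cost_per_frame_high = 30_000, 100_000

fps = 24                                       # standard cinema frame rate
seconds = 60                                   # a single one-minute scene
frames = fps * seconds                         # 1,440 frames

low = digital_head_model + frames * cost_per_frame_low
high = digital_head_model + frames * cost_per_frame_high
print(f"${low:,} to ${high:,}")                # $44,200,000 to $145,000,000
```

At those rates, even a single one-minute de-aged scene lands in the tens of millions of dollars.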
Perhaps these technologies will become cheaper over time, but so far, they are hardly affordable for mass movie production.
Be that as it may, with modern CGI and automated dialogue replacement (ADR) technologies, a future of filmmaking in which almost anyone can become an actor is not far off. And this raises a number of ethical questions.
As a service that allows for mimicking any person's speech and synthesizing an unlimited amount of audio content based on personal speech, Respeecher takes AI ethics seriously. Every day, our clients in Hollywood and around the world work with voice cloning technology that requires a special ethical approach.
Modern technology is prone to abuse as a tool in the hands of bad actors. One example of this use of artificial intelligence in video production is the deepfake. Deepfakes are an artificial intelligence-based image synthesis technique used to combine and overlay existing images and videos onto source pictures or videos.
In the vast majority of cases, generative adversarial networks (GANs) are used to create such videos. One part of the network, the generator, learns from real photographs of a specific subject and creates new images, literally "competing" with the second part, the discriminator, until the discriminator starts confusing the copies with the originals.
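To illustrate that adversarial game, here is a minimal, self-contained GAN training loop in PyTorch. The network sizes are toy-scale, and random noise stands in for the batches of real photographs so the sketch runs on its own; a real face-synthesis GAN is vastly larger and trains on thousands of images of the target subject.

```python
import torch
import torch.nn as nn

LATENT, IMG = 64, 32 * 32  # noise vector size, flattened "image" size

generator = nn.Sequential(      # the "forger": noise -> fake image
    nn.Linear(LATENT, 256), nn.ReLU(),
    nn.Linear(256, IMG), nn.Tanh(),
)
discriminator = nn.Sequential(  # the "detective": image -> real/fake logit
    nn.Linear(IMG, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
)

loss_fn = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

for step in range(1000):
    real = torch.randn(32, IMG)   # stand-in for a batch of real photos
    noise = torch.randn(32, LATENT)
    fake = generator(noise)

    # 1) Train the discriminator to tell real photos from the fakes.
    d_loss = loss_fn(discriminator(real), torch.ones(32, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(32, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Train the generator to make the discriminator call its fakes "real".
    g_loss = loss_fn(discriminator(fake), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

Training ends when the discriminator can no longer reliably separate generated images from real ones, which is exactly the "confusing the copy with the original" point described above.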
Methods for capturing facial motion and transferring it onto a target video were introduced in 2016. They allow fake facial expressions to be imposed on an existing 2D video in real time.
Although the technology mostly serves good purposes, there have been cases when deepfakes were maliciously used to replace certain image elements with desired ones. In 2017, deepfakes were notoriously used to create fake pornographic videos of celebrities, as well as for revenge porn.
Needless to say, companies with access to such technologies are responsible for what orders they fulfill and with whom they cooperate. Pornography may not be the biggest social problem created by deepfake technology. Imagine if unscrupulous politicians or states used such technologies to create fake news feeds.
However, there are many examples of this technology being used for good purposes. Deepfakes have been used in museums: at the Salvador Dalí Museum in Florida, the exhibition Dalí Lives was organized to honor the famous artist's 115th birthday.
The museum used an artificially generated likeness of the artist, which communicated with visitors, told them stories about his paintings and life, took selfies, and emailed the photos to visitors.
Deepfake technology also opens up numerous possibilities in education. AI-generated content can bring historical figures into the classroom for a more engaging lesson.
For studios working in film production, deepfake-style solutions solve several real problems: creating content featuring famous actors with their youthful appearance, re-creating actors who have passed away, or using an actor's original childhood appearance for a role long after they've grown into adults.
Coupled with artificial voice synthesis, such technology creates new possibilities for filmmakers and screenwriters, while also reducing production costs and creating content that viewers are eager to see.
As with any technology, from video recording and generating speech to genome experimentation and nuclear power, the problem is not with the technology itself, but rather with how people use it. The case of deepfake is no exception.
Today, the professional community, represented by the purveyors of these technologies, stands guard over its tools and adheres to strict ethical guidelines. Whether we are technology providers or users, it is our responsibility to ensure that these technologies are used for good.