Introduction
2023 marks the fiftieth anniversary of CGI's first use in movies to create visual effects (VFX). Since then, it has grown in scale and sophistication, making ever-more fantastical movies not just possible but believable. As computers and graphics technologies become more sophisticated and accessible, and as new frontiers in computer graphics, mixed reality, and AI converge to create powerful new capabilities, we the viewers can expect to be wowed like never before.
Why CGI?
VFX has been around almost as long as cinema itself. Long before computers, directors used physical effects like puppetry, animatronics, stunt work, and makeup to great effect. Cinematographers, editors, and directors invented techniques like forced perspective and miniature close-ups to create some of the most iconic moments in cinema.
However, no matter how talented the VFX artist, details like background vegetation could never look realistic enough with practical effects, giving cinema at best a theater-like quality that required viewers to buy into the illusion. When poorly done, the results were downright laughable.
CGI was used sparingly from the 1970s through the 1990s. Contrary to popular belief, the first Star Wars film (Episode IV: A New Hope) had fantastic VFX, but most of it was practical, with CGI used in only a couple of scenes. Fast-forward to 1984's The Terminator, which combined practical effects, animatronics, rudimentary CGI, and stop-motion animation. The huge strides made in computer graphics were on full display when its sequel, Terminator 2: Judgment Day, released in 1991 with jaw-dropping CGI that holds up even today. By the time the 2000s rolled around, cheaper, faster computers, sophisticated VFX software, and improved techniques made full-blown CGI extravaganzas possible.
The most memorable examples include the creatures from The Lord of the Rings (2001); the tiger from Life of Pi; various monsters like Godzilla, dragons, and outsized moths; the transforming metallic life-forms of the Transformers franchise; and of course 2008's Iron Man, which set off the MCU juggernaut. Later movies like Avatar and Gravity would have been impossible without CGI. It gets downright spooky when cutting-edge, though controversial, techniques like face superimposition, de-aging, deep-fakes, and even literal resurrection of the dead make it possible to film scenes without the actors ever being present on set!
There were also shocking bloopers, like the Scorpion King in 2001's The Mummy Returns, the superhero suit in 2011's Green Lantern (a CGI effect so bad that Ryan Reynolds famously lampooned the film in Deadpool 2!), and Henry Cavill's mustache fiasco in 2017's Justice League. These counter-examples illustrate that CGI is not a panacea for every situation. Used well, it is a game-changer; applied in broad strokes, without sufficient skill and money, it is worse than no CGI at all!
Conclusion
CGI in film may have been around for 50 years, but it is still very much in its adolescence. Despite the challenges of creating high-quality CGI and the need for talented artists and technicians, its role in films and animation is only likely to grow in the coming years. As technology continues to advance and CGI becomes more sophisticated, we can expect to see even more incredible visual effects and animated characters on the big screen.
As computing power improves, so too will photorealism, and with it the temptation to do away with actors, or at least reduce their involvement, especially during stunts. Ultimately, however, CGI, even CGI powered by AI, is a tool; the real artistry remains with the creator. As the technology continues to evolve, it is exciting to imagine how tool and creator might one day arrive at a truly collaborative relationship in film-making!