The Otherworldly Explained: Types of Modern Visual Effects

Visual effects (VFX) have become increasingly common in film and other audiovisual content in recent years. With every blockbuster movie we see, filmmakers and visual effects artists only seem to improve and innovate upon existing technologies.

Watching behind-the-scenes footage of our favorite science fiction or fantasy films exposes us to the level of creativity and technology available today. It often gives us a quick glimpse of the amount of work it takes to make something out of practically nothing: near-realistic depictions of creatures, objects, and phenomena that did not exist at the time of filming.

In this article, I am going to walk you through the different types of visual effects, how they are done, and how they work together to produce images that often leave us in awe. Hopefully, this gives us a new level of understanding and appreciation for all of the unsung artists whose work we consume and enjoy in our daily lives.

Image credit: Variety YouTube channel

What Are Visual Effects?

The popular understanding of visual effects is anything applied to a piece of media that looks fantastical, out of this world, or absolutely impossible to replicate in real life. This is true to an extent, as these are the most visible forms and uses of visual effects.

Visual effects involve the creation or manipulation of imagery in order to produce scenes, objects, creatures, and situations that would be impossible or too expensive to shoot in live action. They often combine computer generated images and effects with live action footage, aiming to produce some degree of realism.

Visual effects are not limited to highly stylized, out-of-this-world imagery; they can be as simple and subtle as removing an object from the frame, changing the color of the sky, or adding specks of dust to a surface.

Before the advent of modern computers and digital media, filmmakers were already creating visual effects for their films, although through more laborious and limited processes. They made use of cel animation, stop motion, miniature models, and rotoscoping to produce the effects they felt their films needed.

With the emergence and popularization of the modern computer, software for film editing, 3D modeling, and visual effects soon followed, allowing filmmakers and artists to generate complete scenes and visual effects on a single device and combine them with any existing footage they may have.

This opened the door to a lot of experimentation, both artistic and scientific, and in recent years we continue to see the emergence of new software, new techniques, and new technologies that allow for more creativity, easier processes, and new viewing experiences for audiences all over the world.

Basic Types of Visual Effects

Visual effects is an umbrella term covering many different techniques. In major productions, a visual effects team can consist of dozens of artists working on different aspects and steps of the visual effects pipeline.

The different techniques involved in visual effects can be grouped into three major types: computer-generated imagery (CGI), motion capture, and compositing. Of course, not every production has to make use of all of these methods. It is up to the filmmakers to decide which technologies best suit their concept and creative vision.

Computer Generated Imagery

Computer-generated imagery, or CGI, is a blanket term for any digitally manufactured graphics, which can either work on their own or be added to a live-action scene. CGI can be 2-dimensional or 3-dimensional, but 3D imagery is the most commonly used form.

It can cover effects such as rain, fire, and smoke, or fully rendered 3D models of objects or creatures inserted into a scene. CGI can be as in-your-face as a fire-breathing dragon, or as subtle as adding water droplets to a windowpane.

Creating computer-generated imagery usually involves several steps, from modeling to coloring and texturing, and even rigging if the CGI element needs to be movable and animated. The end result is usually worth it, and contributes a lot to achieving a level of realism even in the most fantastical of situations.

Image created by Vitaly Bulgarov

Its origins actually stem from the 1950s and 1960s, even before the advent of digital video and filmmaking. Early computer-assisted animation was used in the opening credits of Alfred Hitchcock's Vertigo (1958), and the field only grew from there, with many artists experimenting with the medium. It started out with simple lines and shapes, which, with the help of advancing technology, evolved into the highly realistic 3D models and effects we know today.

Motion Capture

Motion capture is the act of digitally recording an actor's movements, whether specific to the face (sometimes called performance capture) or covering the entire body, for the purpose of transferring and applying them to a computer-generated element.

This is usually done with specialized motion capture suits covered in distinct markers, which make it easier for visual effects artists to track the actors' movements and translate them onto the computer-generated element that needs them.
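
As a rough illustration of what that translation involves, the toy sketch below derives a single joint rotation from two hypothetical tracked markers per frame. Everything here (the marker names, positions, and rest pose) is made up for illustration; real mocap solvers involve multi-camera triangulation, filtering, and retargeting onto full skeletons.

```python
# Toy illustration only: turn tracked marker positions into one joint rotation.
import numpy as np

def bone_rotation(parent_marker, child_marker, rest_direction):
    """Angle (degrees) between the captured bone direction and its rest pose."""
    direction = child_marker - parent_marker
    direction = direction / np.linalg.norm(direction)
    rest = rest_direction / np.linalg.norm(rest_direction)
    cos_angle = np.clip(np.dot(direction, rest), -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle))

# Hypothetical per-frame positions of a shoulder marker and an elbow marker.
shoulder = np.array([[0.0, 1.5, 0.0]] * 3)
elbow = np.array([[0.3, 1.2, 0.0], [0.3, 1.3, 0.1], [0.3, 1.4, 0.2]])
rest = np.array([0.0, -1.0, 0.0])  # arm hanging straight down in the rest pose

for frame, (s, e) in enumerate(zip(shoulder, elbow)):
    print(f"frame {frame}: upper-arm rotation = {bone_rotation(s, e, rest):.1f} degrees")
```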

This technology has opened up so many doors for visual effects and filmmaking. More and more non-human characters can be realistically depicted without compromising the quality of the acting, and many fantastical stories and concepts can be brought to the screen without endangering any actors or crew members.

Modern motion capture emerged in the 1990s and matured well into the 2000s, with characters such as Jar Jar Binks from Star Wars Episode I: The Phantom Menace and Gollum from The Lord of the Rings film series being popular examples.

Compositing

Compositing is usually the last step of any visual effects process. It is the act of putting together 2D or 3D elements from different sources and combining them into one shot, making it look as if everything was shot in the same place at the same time.

This step is made easier by shooting the actors and sets in front of green screens or blue screens, a technique known as chroma keying, which allows the compositor to isolate the necessary elements and transfer them to a different shot. The end goal of a compositor is to make everything look as seamless as possible.
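
To make the idea concrete, here is a minimal sketch of a green-screen keyer in Python with NumPy: it measures how much greener each pixel is than its red and blue channels and turns that into an alpha matte. The frame data and the strength parameter are assumptions for illustration; production keyers (in Nuke, After Effects, and the like) also handle spill suppression, edge detail, and noise.

```python
# A minimal green-screen keyer, assuming 8-bit RGB frames as NumPy arrays.
import numpy as np

def simple_green_key(frame_rgb, strength=2.0):
    """Return an alpha matte: 1.0 for foreground pixels, 0.0 for pure green screen."""
    rgb = frame_rgb.astype(np.float32) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    greenness = g - np.maximum(r, b)    # how much greener the pixel is than red/blue
    alpha = 1.0 - strength * greenness  # very green -> alpha drops toward 0
    return np.clip(alpha, 0.0, 1.0)

# A made-up 1x2 "frame": one green-screen pixel, one skin-toned pixel.
frame = np.array([[[20, 230, 30], [200, 160, 140]]], dtype=np.uint8)
print(simple_green_key(frame))  # roughly [[0.0, 1.0]]
```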

The Basic Visual Effects Pipeline

Although most of the work is heavily concentrated in the post-production stage, visual effects are actually involved in every step of the production process, from pre-production to post-production. A lot of work goes into visual effects production, requiring artists with different areas of expertise to work together to efficiently accomplish the project at hand.

Here is a basic visual effects pipeline, from pre-production to post-production, and how different visual effects artists and techniques are involved. These steps and practices may differ depending on a director's creative process, budget constraints, time constraints, and many other factors.

Pre-Production

Pre-production is the part of the filmmaking process that takes place before the actual shoot or production stage. It is where all of the preparation is done, from conceptualizing the film and writing the script to gathering funds for production and hiring the right people. Visual effects artists are heavily involved during this stage as well.

Concept Art

Once the script is finalized, at least for the time being, concept art is usually created, especially for highly stylized films or projects with heavy fantasy or science fiction elements. This ensures that the entire team has a solid grasp of the film's concept and visuals, making for smoother communication and minimizing misunderstandings.

Image source: Feng Zhu Design YouTube channel

Artists create models or concepts of what the world, the characters, and the different elements of the film should look like, coming up with coherent and solid visuals that would serve as a reference for the entire production and visual effects team.

Storyboarding

Storyboarding is a practice not unique to visual effects work, but is commonly done across all film and TV productions. A storyboard is a series of drawings or sketches that illustrate the different shots and sequences of an entire movie.

Having a storyboard makes it easier for the team to understand and decide what the film should look like visually, allowing them to structure the different elements of the film and each shot accordingly, and to coordinate across the different departments.

Sometimes, the storyboard is taken one step further by creating an animatic, a rough animation of the film based on the storyboard. An animatic creates a clearer picture of the film: it is no longer just a series of static pictures but a rough movie in itself, with elements such as timing, rhythm, sound, and dialogue becoming clearer and easier to understand.

Pre-visualization

Pre-visualization, or pre-vis, takes storyboarding and animatics even further. Here, 3D artists and modelers create low-poly models of entire scenes, including both the computer-generated and the real elements in the picture. Sometimes this is also animated, resulting in a low-poly version of the film being made.

This step helps the entire team, not only the visual effects artists, figure out what to do and how to work together. It helps with deciding which camera angles are best and how to build the sets. Here, the different departments and team members help each other figure out which parts can be done physically and which parts of the film will need visual effects work.

Pre-visualization is an essential step for high-budget and high-stakes projects, especially those with heavy visual effects and CGI, as production can be very expensive. This ensures that the film is well planned out and everyone knows what to work on.

Some examples of films which used pre-visualizations and animatics are the Iron Man films, Star Wars films, and The Grand Budapest Hotel.

Research and Development

The terms research and development seem to be out of place in a creative project–we usually hear these words when talking about science and academia, not our favorite fantasy films. However, research and development is a very important step in films requiring a lot of visual effects work.

In this stage, the team decides what sort of technology should be used to achieve the look and effects the film needs, taking into consideration the production's budget, time constraints, and other factors. The team can research and use existing technologies they believe would work for their production, or develop a new technology altogether.

Companies like Weta Digital develop their own internal renderers and FX simulation software to help them create some of the best VFX in the world. Eventually, some of these tools even make it to the public, such as Mari or Pixar's RenderMan.

LAIKA, for example, developed a new 3D printing technology during the production of Kubo and the Two Strings, one that did not exist at the time. They used it to print vibrant and durable 3D figures that produced realistic mouth movements for their puppets. This development earned them an Academy Scientific and Engineering Award in 2016.

Production

The production stage is when the shoot itself is conducted. All of the crew's preparation comes into play here. For the visual effects team, however, the work does not end here.

In visual effects-heavy projects, production is usually set in a studio, with large green screens or blue screens serving as backdrops. These could also have markers on them in the form of little dots for easier tracking. Shooting in a studio with green or blue screens instead of on location allows for more control of the final look, but will require a lot of visual effects work in order to make the footage look realistic.

If motion or performance capture is needed for the film, it is shot during this stage, making use of specialized suits and markers on the face and/or body to make visual effects artists’ job easier.

Reference photos, footage, and notes are also taken during production: which props, cameras, lenses, and lights were used, the tripod height, and other details. Scans and surveys of the sets are also useful. How big is the set? How high are the platforms? How were the cameras and lights positioned? This information will be useful during post-production, especially for the matchmoving, lighting, and compositing artists.

However, a new technology has emerged in visual effects: using very large LED panels as backdrops instead of green and blue screens. This technology, called Virtual Production, reduces the effort needed for visual effects and compositing during post-production, as most of the background replacement and background effects are already in the footage. It also gets rid of the green cast that can fall on actors and objects when shooting with a green screen.

Virtual Production is not yet as widespread, but it is growing extremely fast and is projected to be a viable alternative to green and blue screens in future productions. It emerged from video game technology and has already been used in productions such as The Mandalorian, which premiered in 2019.

Post-Production

Post-production is where the bulk of the visual effects team's efforts are needed. This is the part of the process that happens directly after the shoot, and it involves film editing, sound design, and of course, visual effects. After post-production, the film is ready for marketing and release.

Matchmoving

Matchmoving is a necessary step in visual effects work. Since most visual effects are computer-generated images that need to be combined with existing live footage, these computer-generated elements need something to anchor to, somewhere to be placed. This is where the matchmover comes in.

Matchmoving involves tracking features in the footage to reconstruct the movement of the real camera as a virtual camera, creating a 3D space where computer-generated elements can be placed and moved as needed. This serves as the foundation for all other visual effects and CGI work in the film.

This stage requires the matchmover to be as precise and detail-oriented as possible, especially when working with clips that have a lot of camera movement, as the reconstructed 3D environment needs to be exact. If it comes out a few degrees off kilter, the shot might not achieve the desired level of realism, jarring the audience out of the world of the film.
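
For a sense of what the solved camera is used for, here is a minimal sketch of the pinhole projection that places a CG point into a frame once a matchmove has produced a camera rotation, translation, and focal length. All the numbers are hypothetical; real solves come from tracking tools such as 3DEqualizer, SynthEyes, or Blender's motion tracker.

```python
# A minimal pinhole projection through a hypothetical solved camera.
import numpy as np

def project(point_world, R, t, focal, width, height):
    """Project a 3D world-space point into 2D pixel coordinates."""
    p_cam = R @ point_world + t                         # world space -> camera space
    x = focal * p_cam[0] / p_cam[2]                     # perspective divide
    y = focal * p_cam[1] / p_cam[2]
    return np.array([width / 2 + x, height / 2 - y])    # origin at the image center

# Made-up solved camera for one frame, plus a CG point anchored in the tracked scene.
R = np.eye(3)                          # camera orientation from the matchmove solve
t = np.array([0.0, -1.5, 4.0])         # camera translation from the solve
cg_point = np.array([0.5, 1.5, 2.0])   # where the CG element sits in the 3D space
pixel = project(cg_point, R, t, focal=1200.0, width=1920, height=1080)
print("the CG element lands at pixel", pixel.round(1))
```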

Read more about Matchmoving in my other articles:

3D Modeling 

Although we might have already encountered 3D modeling during the pre-visualization stage in pre-production, 3D modeling in post-production could be quite different. The 3D models produced during this stage will be the ones used in the final output. 

In this stage, the 3D modelers will need to produce high-poly models of characters, objects, or any other elements needed in the scenes, based upon the concept art and the agreed visuals and aesthetics of the project. 

3D models are most commonly used for visual effects and CGI because they are the type of graphics most easily blended into live-action footage; when textured and lit correctly, they can move seamlessly with the scene and look quite realistic.
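
For anyone curious what a polygon model actually is underneath the software, here is a tiny sketch of a hypothetical low-poly cube as raw data: vertex positions plus faces that index into them. High-poly production models are the same idea with millions of faces, plus UVs, normals, and other attributes.

```python
# A polygon model reduced to its essentials: vertices and faces.
import numpy as np

vertices = np.array([
    [0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0],   # bottom four corners
    [0, 0, 1], [1, 0, 1], [1, 1, 1], [0, 1, 1],   # top four corners
], dtype=float)

faces = [                             # each face is a quad of indices into the vertex list
    (0, 1, 2, 3), (4, 5, 6, 7),       # bottom and top
    (0, 1, 5, 4), (3, 2, 6, 7),       # front and back
    (1, 2, 6, 5), (0, 3, 7, 4),       # right and left
]

print(len(vertices), "vertices,", len(faces), "faces")   # a very low-poly model
```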

Texturing

3D models as they are would not be realistic enough to be placed in a live-action scene without standing out terribly, which is why texturing is an essential part of modeling and visual effects.

The high-poly 3D model is colored and textured to make it look as realistic and natural as possible (if that is the desired look). Texture artists use different brushes with different effects, going over the model and pushing and pulling at its surface to create detail and texture.
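
As a rough illustration of that pushing and pulling, the sketch below is a toy "push" brush that displaces vertices near a brush center along their normals with a smooth falloff. The function and its parameters are assumptions for illustration; dedicated sculpting and texturing tools such as ZBrush, Mudbox, or Mari are far richer.

```python
# A toy "push" brush operating on per-vertex positions and normals.
import numpy as np

def push_brush(vertices, normals, center, radius, strength):
    """Displace vertices near `center` outward along their normals, fading with distance."""
    distances = np.linalg.norm(vertices - center, axis=1)
    falloff = np.clip(1.0 - distances / radius, 0.0, 1.0) ** 2   # soft brush edge
    return vertices + normals * (strength * falloff)[:, None]

# A made-up patch of surface: four vertices on a flat plane, all facing up.
patch = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0], [1.0, 0.0, 1.0]])
normals = np.tile([0.0, 1.0, 0.0], (4, 1))
bumped = push_brush(patch, normals, center=np.array([0.0, 0.0, 0.0]), radius=1.5, strength=0.2)
print(bumped.round(3))   # vertices closest to the brush center rise the most
```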

Sometimes, especially if not a lot of modeling work is required, or if the production is not as large scale or high budget, one artist could do both the modeling and the texturing parts of the process.

Rigging

Rigging is an essential step of the visual effects process, especially if the 3D models are to be animated. Rigging is the construction of a “skeleton” for the 3D model, complete with different segments and joints necessary for movement.

The more animated or detailed a character or object needs to be, the more work the rigger has to do. A rig involves not only a skeleton but also calculating and implementing skin weights for every part of the 3D model (bones, muscles, hair, and so on) that needs to move in order to produce a natural effect.

The smoother and more fluid the required movement, the more detailed and meticulous the rigger has to be. If the rig does not work as desired, the animator might need to send it back to the rigger to fix any issues.
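
Those skin weights feed a deformation step often called linear blend skinning, where each vertex follows a weighted mix of its joints. Here is a minimal sketch of that blend, assuming each joint's 4x4 transform already describes its motion relative to the bind pose; real rigs add corrective shapes, constraints, and many more joints.

```python
# Minimal linear blend skinning over hypothetical joints and weights.
import numpy as np

def skin_vertices(rest_vertices, joint_transforms, weights):
    """rest_vertices: (N, 3); joint_transforms: (J, 4, 4); weights: (N, J), rows sum to 1."""
    ones = np.ones((len(rest_vertices), 1))
    homogeneous = np.hstack([rest_vertices, ones])        # (N, 4) positions
    posed = np.zeros_like(rest_vertices)
    for j, transform in enumerate(joint_transforms):
        moved = homogeneous @ transform.T                 # where joint j alone would put each vertex
        posed += weights[:, j:j + 1] * moved[:, :3]       # blend by this joint's skin weight
    return posed

# Two joints: one stays still, the other lifts everything it influences by one unit in Y.
identity = np.eye(4)
lift = np.eye(4)
lift[1, 3] = 1.0
verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
w = np.array([[1.0, 0.0], [0.3, 0.7]])                    # second vertex mostly follows the lifting joint
print(skin_vertices(verts, np.stack([identity, lift]), w))   # -> [[0, 0, 0], [1, 0.7, 0]]
```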

Animation

Animation is the stage where the 3D computer generated and rigged elements are moved, positioned, and manipulated into what is required in the script. Animators make the different elements “act” according to their roles in the film. This is the stage where the different models come to life.

For films making use of motion capture, this is also the stage where the motion capture data is integrated with whatever model is used in the final film, blending the actor's performance with the computer-generated element that becomes their on-screen counterpart.
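
At its core, hand-keyed animation comes down to setting values on a few keyframes and letting the software interpolate the in-betweens. The sketch below shows that idea with a hypothetical jaw-opening channel and a common ease-in, ease-out curve; in production, animation curves are shaped far more finely in a graph editor.

```python
# Keyframe interpolation with smoothstep easing over made-up keyframes.
import numpy as np

def ease_in_out(t):
    """Smoothstep easing: slow start, slow stop."""
    return t * t * (3.0 - 2.0 * t)

def interpolate(keyframes, frame):
    """keyframes: sorted list of (frame_number, value) pairs."""
    frames, values = zip(*keyframes)
    if frame <= frames[0]:
        return values[0]
    if frame >= frames[-1]:
        return values[-1]
    i = np.searchsorted(frames, frame) - 1
    t = (frame - frames[i]) / (frames[i + 1] - frames[i])
    return values[i] + (values[i + 1] - values[i]) * ease_in_out(t)

# A hypothetical channel: a creature's jaw opens between frames 10 and 20.
jaw_open = [(0, 0.0), (10, 0.0), (20, 35.0), (30, 35.0)]
for f in (5, 12, 15, 18, 25):
    print(f"frame {f}: jaw angle = {interpolate(jaw_open, f):.1f} degrees")
```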

FX and Simulation

Effects (FX) and simulation is the stage for what audiences usually perceive as visual effects, such as rain, explosions, fire, and dust. It also covers more minuscule, atmospheric effects.

This requires great attention to detail, as most of these effects involve working with particles and making sure they move naturally, as close to reality as possible. The FX artist needs a good understanding of physics and how small things move under wind, water, or explosions.
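
Under the hood, most particle effects boil down to integrating forces over time for a large number of points. Here is a minimal sketch of that loop, with gravity plus a made-up gust of wind stepped once per film frame; real FX packages such as Houdini simulate millions of particles with collisions, drag, and turbulence on top of this.

```python
# A tiny particle step: gravity plus wind, integrated with simple Euler steps.
import numpy as np

rng = np.random.default_rng(seed=7)
positions = rng.uniform(-1.0, 1.0, size=(1000, 3))   # a dust burst around the origin
velocities = rng.normal(0.0, 0.5, size=(1000, 3))

gravity = np.array([0.0, -9.8, 0.0])
wind = np.array([2.0, 0.0, 0.0])                      # a steady sideways gust
dt = 1.0 / 24.0                                       # one film frame

for frame in range(48):                               # two seconds of motion
    velocities += (gravity + wind) * dt
    positions += velocities * dt

print("average particle position after 2 s:", positions.mean(axis=0).round(2))
```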

Rotoscoping

Rotoscoping for visual effects involves tracing over footage to create mattes, which can then be combined with or swapped out for other elements and pieces of footage. This is commonly used when an element needs to be placed onto a different backdrop, or vice versa.

Because the rotoscoper will be working with video, rotoscoping will need to be done frame by frame, especially if the character, object, or the camera is moving in the shot. This will ensure precise edges and fluid movement. It is quite a tedious and meticulous process, but when done well, it helps a lot with maintaining the reality of the world of the film.

Aside from its use in visual effects, rotoscoping could also be used as an animation style involving drawing over footage frame by frame in order to imitate lifelike and realistic movement. It is quite an old method, and was used in special effects way before the invention of the modern computer, digital video, and computer generated imagery.
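
In code terms, a roto matte for one frame is just a traced shape rasterized into a black-and-white mask. The sketch below does that for a few hypothetical hand-traced outlines using Pillow; real roto tools work with animated splines and feathered edges rather than hard-edged polygons.

```python
# Rasterize hand-traced outlines (made up here) into per-frame mattes.
from PIL import Image, ImageDraw

width, height = 1920, 1080

# Hypothetical traced outlines for three consecutive frames (the shape drifts right).
traced_outlines = [
    [(800, 300), (1100, 320), (1080, 900), (780, 880)],
    [(810, 300), (1110, 320), (1090, 900), (790, 880)],
    [(820, 300), (1120, 320), (1100, 900), (800, 880)],
]

for frame_number, outline in enumerate(traced_outlines):
    matte = Image.new("L", (width, height), 0)         # start with a fully black mask
    ImageDraw.Draw(matte).polygon(outline, fill=255)   # fill the traced region with white
    matte.save(f"roto_matte.{frame_number:04d}.png")
```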

Lighting

Lighting for visual effects is quite similar to lighting done in real life, except that instead of using physical lights on a set, the visual effects lighting artist uses 3D software and virtual lights to light a scene.

The information, reference photos, and videos gathered during the production stage can be very useful here, as the lighting artist can replicate and mimic the actual lights used during the shoot.

However, a visual effects lighting artist does not have to simply replicate whatever lights were used on set; they can use whatever techniques they believe enhance the shot while still looking natural and believable. This affords the artist a level of creative control.
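
Virtual lights ultimately drive shading math like the simple diffuse (Lambertian) rule sketched below, where brightness falls off with the angle between the surface and the light. The positions and colors are made up for illustration; production renderers layer physically based materials, shadows, and global illumination on top of this.

```python
# Minimal diffuse shading of one surface point by one virtual light.
import numpy as np

def diffuse_shading(point, normal, light_position, light_color, surface_color):
    """Brightness falls off with the angle between the surface normal and the light."""
    to_light = light_position - point
    to_light = to_light / np.linalg.norm(to_light)
    n = normal / np.linalg.norm(normal)
    intensity = max(np.dot(n, to_light), 0.0)   # surfaces facing away get no light
    return surface_color * light_color * intensity

point = np.array([0.0, 0.0, 0.0])
normal = np.array([0.0, 1.0, 0.0])              # surface facing straight up
key_light = np.array([2.0, 3.0, 1.0])           # a virtual light placed to match the set
shaded = diffuse_shading(point, normal, key_light,
                         light_color=np.array([1.0, 0.95, 0.9]),
                         surface_color=np.array([0.6, 0.5, 0.4]))
print("shaded color:", shaded.round(3))
```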

Compositing

Compositing is one of the final steps in the visual effects production process. Here, the compositor puts together all of the different computer generated elements and effects needed for the film, as well as the live action footage itself, into a single shot.

This step needs an eye for detail, as well as the ability to make minor corrections in order to make sure that the final output looks realistic and natural. Visual effects usually aim for a very high level of realism, in order for the audience to easily accept the world of the story, no matter how outlandish it may be.
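
The workhorse of this step is the "over" operation: a rendered element with an alpha channel is layered over the live-action plate pixel by pixel. Here is a minimal sketch using straight (unpremultiplied) alpha and made-up 2x2 images; compositing packages such as Nuke chain thousands of these operations, with color management and premultiplication handled carefully.

```python
# The classic "over" operation at the heart of compositing.
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Place a foreground with straight alpha over a background."""
    a = fg_alpha[..., None]
    return fg_rgb * a + bg_rgb * (1.0 - a)

# Hypothetical 2x2 images: a half-transparent red CG element over a grey plate.
cg = np.full((2, 2, 3), [1.0, 0.0, 0.0])
alpha = np.full((2, 2), 0.5)
plate = np.full((2, 2, 3), 0.5)
print(over(cg, alpha, plate))   # each pixel blends halfway toward red
```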

The Avatar VFX sizzle reel is still very much representative of many of the steps in the pipeline and gives a great overview of what can be done. Check it out below before we finish off this article.

Conclusion

It is amazing to see the heights that a combination of artistic sensibility and scientific innovation has reached, and what it has afforded filmmakers and audiences everywhere. The final output is fascinating in itself, but it is even more fascinating to see the level of effort, dedication, and skill that these artists put into their work in order to do a concept justice and to give audiences a viewing experience they will not forget.