Falcon’s Director of Technology Saham Attends SIGGRAPH 2018

“The technological renaissance is coming…”

For the past several years, we have witnessed an explosion of technology across computing, display technologies, and optics. At the same time, these technologies, combined with older ones, have given birth to new products and use cases whose full potential we have barely begun to scratch the surface of. This year, I was able to join my peers at SIGGRAPH 2018, held in Vancouver. This annual convergence of everyone working in computer graphics has historically focused on film and animation, with a component of research and development from academia and advanced training for graphics professionals. Having attended for more than 15 years, I have seen the event scale up and down as the professional CG industry has gone through many phases.

But this year was different…

I started my week early by visiting the academic portion of the conference, where user groups from universities and companies present experimental techniques and software that solve problems most of us didn’t even know existed. It is here, very often, that technologies are born before they have a clear path or definition. Alongside the academic program, there is a whole exhibition floor dedicated to emerging technologies. Here, ideas and concepts have been developed to the point where there is something tangible, or software that works just well enough to show something, well… amazing. You can see everything from advancements in holographic display technologies to new forms of virtual production tools, and even software frameworks that can transcend industries.

For the past several years, with the advent of AR/VR, AI, GPU computing, and display tech, as well as advancements in software, most people in the field have seen a convergence of industry tools: a cross-pollination of software pipelines and workflows that even five years ago no one would have imagined possible, given the constraints of hardware, software, and cost. Real-time game engines used for architectural visualization are one example. Hollywood hardware and software that used to cost millions of dollars to procure now costs a tenth of that and runs on desktop computers, dramatically lowering the barrier to entry and giving even more creative and technical people the means to build something wonderful.

It was also evident that the industry was listening. Everyone is trying to crack a unique nut falling from the same tree. Having the opportunity to chat with the vendors about the problems we face and are trying to solve, and about their roadmaps, was a breath of fresh air. Case in point: we were invited to The Foundry’s VIP dinner to talk about the future of NUKE with leaders from across the VFX industry, each commenting on their everyday challenges.

Enter Nvidia (mic drop)

One of the biggest announcements during the conference came from one of its largest sponsors, Nvidia. Nvidia has changed the way nearly every user of computer graphics works and interfaces with computers for the last 10+ years, and this year it unveiled something that will be pivotal to the level of realism we can expect from computer graphics in the coming years. Nvidia didn’t just unveil an updated piece of hardware; it introduced a brand-new type of GPU core that works in tandem with its existing technologies (CUDA cores, Tensor cores) to accelerate ray-trace computations and help pave the way for real-time ray tracing, the Holy Grail of computer graphics. Ray-tracing hardware isn’t new, but this is the first time the industry is mature enough that software and APIs, such as Microsoft’s DXR and Vulkan extensions, are being developed in tandem with the hardware to leverage it. Ray tracing has been used in Hollywood effects for many years, though via offline rendering, where a single frame could take days to complete. You may remember one of the earliest mainstream examples of ray-traced rendering in Terminator 2: Judgment Day, when the T-1000 turned into liquid metal.
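To make those “ray-trace calculations” concrete: at its core, ray tracing means firing a ray through each pixel and testing it against the scene’s geometry. The toy Python sketch below is my own illustration, not Nvidia’s implementation (real RT hardware accelerates ray-triangle and bounding-volume tests rather than spheres), but it shows the kind of intersection math that has to run billions of times per second:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere intersection,
    or None on a miss. `direction` is assumed to be unit length."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                   # the ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0  # nearer of the two roots
    return t if t >= 0.0 else None

# One ray fired from the origin straight down -z at a sphere 5 units away.
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # -> 4.0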


When I say Holy Grail, I do want to emphasize that ray tracing itself is nothing new; it has always been a very computationally expensive method of getting accurate, lifelike reflections and physically accurate light behavior. In film, where nothing needs to run in real time, renderers calculate every photon or ray of light to achieve more physically accurate results. This means a single frame can take several hours, or even days, to render; since those frames are not needed for real-time playback, you simply wait for the computer to finish its calculations. In real time, however, you are required to rasterize, or “draw,” a frame 60 to 90 times per second. At 90 frames per second, you effectively need to render each frame in about 11 milliseconds. Game engines typically must bake down or simplify a lot of things to achieve these speeds, which often limits the amount of realism one can expect. Now, with the RT cores in the all-new Nvidia GPUs, creators will soon have access to a physical piece of hardware whose sole purpose is to handle ray-trace calculations. Combine that with Nvidia’s answer to AI (artificial intelligence), the Tensor cores, and we are seeing machines being taught to reduce noise in an image, or approximate results so well, that what might have taken days to compute by brute force now happens in real time or near real time. And we’re just scratching the surface.
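Here is a quick back-of-the-envelope sketch of that frame budget in Python. The ray-throughput figure is an assumed order of magnitude for illustration, not a measured spec:

```python
TARGET_FPS = 90
FRAME_BUDGET_MS = 1000 / TARGET_FPS   # ~11.1 ms per frame
WIDTH, HEIGHT = 1920, 1080            # assume a 1080p render target
PIXELS = WIDTH * HEIGHT

# Hypothetical throughput of a GPU with dedicated ray-tracing hardware
# (order of magnitude only, for illustration).
RAYS_PER_SECOND = 10e9

rays_per_frame = RAYS_PER_SECOND / TARGET_FPS
rays_per_pixel = rays_per_frame / PIXELS

print(f"Frame budget:   {FRAME_BUDGET_MS:.1f} ms")
print(f"Rays per frame: {rays_per_frame:,.0f}")
print(f"Rays per pixel: {rays_per_pixel:.1f}")   # a few dozen at best
```

Under those assumptions you get only a few dozen rays per pixel, far fewer than the hundreds or thousands of samples an offline film renderer accumulates per pixel. That gap is exactly why AI denoising on the Tensor cores matters: it cleans up a sparse, noisy ray-traced image instead of brute-forcing more rays.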

Hyper REALity

With all this computational technology at our fingertips, the democratization of creation tools and software, and the increasing amount of computing power available on affordable home PCs and in the cloud, we are quickly approaching a point where virtual and augmented reality will encroach on hyper-reality. The industries that will benefit from these advances are countless, and media and entertainment will, as usual, be among the first to leverage them. It also helps that the primary demographic for Nvidia’s consumer GPUs is gamers, and that Nvidia typically works very closely with developers across the board, building frameworks and tools so they can efficiently leverage the hardware and achieve the best possible demonstrations of the technology.

Optics and display technologies have both flourished in the last few years. Ten years ago, the promise was at-home stereo 3D TVs; bringing the theatrical experience home was promising but fizzled, as most people didn’t want stereo at home. Now, with the second coming of VR/AR HMDs all trying to produce the most realistic graphics for users, very complex optical systems must fit within a head-mounted piece of hardware. Eye tracking and multifocal displays are just now entering the market, letting developers leverage these technologies to create even more immersive experiences. Being able to dynamically change the story or action, or even procedurally generate content to match your gaze, is going to be an immense tool for storytellers. Imagine a game that purposely draws your attention to a certain part of a virtual room to deliver the best scare in a horror scene, or a fairy that behaves more like an Easter egg than part of the true storyline. The possibilities are endless.
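As a minimal sketch of how such a gaze-driven trigger might work (my own illustrative Python, not any particular engine’s or eye tracker’s API): each frame, the application reads a gaze direction and fires an event when the player looks near a target.

```python
import math

def _normalize(v):
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

def looking_at(gaze_dir, target_dir, threshold_deg=5.0):
    """Return True when the gaze direction falls within
    threshold_deg degrees of the direction toward a target."""
    g, t = _normalize(gaze_dir), _normalize(target_dir)
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(g, t))))
    return math.degrees(math.acos(dot)) <= threshold_deg

# Hypothetical per-frame check: fire the scare only once the player
# actually looks toward the dark corner of the virtual room.
gaze = (0.0, 0.10, -0.99)        # sample reading from the eye tracker
to_corner = (0.05, 0.12, -0.99)  # direction from camera to the corner

if looking_at(gaze, to_corner):
    print("trigger: jump scare")  # stand-in for a real event system
```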

Can you feel it?

Combine these technologies with touch outputs and physical input methods, and all bets are off. Haptics technology has been growing slowly over the years and is leading to some exciting use cases. Several vendors this year were demoing hardware that makes you feel bullet hits or touch in midair, and some that even change the way you taste! Google has been working on a technology that lets a user rub their fingers together to provide scroll or selection input, without wearing any extra hardware. Microsoft announced the release of a new, completely open-source framework that allows the HoloLens to detect hands as inputs, turning any surface into a tactile input surface.

It takes a bird’s-eye view to see how all these different technologies can culminate in a holistic, grand experience. Very few companies in the market are taking it upon themselves to bridge these technologies into something truly meaningful. Taking all these exciting new developments and finding a solution to a problem, or even solving a problem no one knew existed, is something we at Falcon’s Creative Group have a knack for accomplishing.

The next few years are going to be very exciting, and I welcome the changes they will bring to our professional workspaces and entertainment spaces. The ability to immerse oneself and let go of the grip of reality is going to lead to some spectacular experiences, whether at work or at play. All the pieces of the puzzle are starting to take shape, and we at Falcon’s are very excited to play and to see the pieces come together.

Onward.
