In this interview we put our questions to Jon Wadelton, NUKE Product Manager at The Foundry. The interview focuses on The Foundry as a company and on NUKE, a powerful compositing product that delivers unparalleled speed and a first-class feature set unrivalled in the desktop market. With NUKE you can create VFX in a new way using camera technologies, 3D assets and particle systems. We would like to thank Jon once again for taking the time to answer our questions, as well as the rest of the crew at The Foundry.
At the bottom of this article you can find a short video about NUKE, as well as a gallery that includes Jon Wadelton at IBC 2011 in Amsterdam, where he presented the new features in NUKE 6.3.
Q: Today The Foundry is a world-leading innovator of visual effects and image processing technologies. What was behind forming the company? What was the reason?
A: The Foundry was formed in 1996 by movie special effects specialists Bruno Nicoletti and Simon Robinson. At the time there were no commercial plug-in solutions available for Flame or for high-end 2D visual effects. Seeing a gap in the market, they started life at The Foundry developing and selling plug-in tools for post-production teams. They were the first developers of Sparks on Discreet Logic’s Flame systems. From there the company grew and continued to develop award-winning technology that filled the gaps digital artists needed.
Q: If you compare the company at its beginning back in 1996 and now, what has changed the most?
A: The growth of the company has been explosive. We went from a staff of about 50 people in 2009 to 130 this year. The company's success comes down to continuing the original idea of finding gaps in the market and delivering products that help artists. Whether it is gaps in compositing, painting or stereo, The Foundry has continued to deliver game-changing products for the VFX industry.
Q: Your product contains AMPAS Sci-Tech Award® winning technology. Can you explain it a little bit to our readers?
A: In 2007, the Academy of Motion Picture Arts and Sciences awarded a Sci-Tech Award® to The Foundry's development team for the FURNACE image processing suite. The company's widely adopted, high-end compositing system NUKE is also based on AMPAS Sci-Tech Award® winning technology. Optical flow, or motion estimation, is an area in which The Foundry has pioneered a new generation of algorithms that enable the tracking of every pixel in a frame to subsequent and preceding frames.
Launched in 2002, The Foundry’s FURNACE was the first comprehensive toolset to deploy these new algorithms, with the retiming tool KRONOS rapidly becoming a market-leading technology for speeding up and slowing down footage. Today over 30 plug-ins within the FURNACE package take advantage of advanced motion estimation technology, providing digital visual effects artists with sophisticated tools to tackle everyday compositing issues as diverse as wire and rig removal, grain reduction, dust busting, image stabilization, super resolution, auto rotoscoping, image segmentation, flicker removal, image stitching, adding motion blur, tracking and generating depth mattes, amongst many others.
The Foundry was the only British company to be honoured with a 2006 Academy Award® and was one of only four companies to be awarded the Scientific and Engineering Award Academy Plaque®.
FURNACE enjoys an enviable reputation, having been used on a host of high profile feature films including Casino Royale, X-Men 3 The Last Stand, The Da Vinci Code, Charlie and the Chocolate Factory, Batman Begins, King Kong, The Lord of The Rings Trilogy, Poseidon and Superman Returns.
Q: How long did it take to develop NUKE and what inspired you? How many people participated in the development?
A: NUKE was originally written at Digital Domain in around 1993 as their in-house compositing tool. It started out as basically just a command-line tool that could read text scripts to perform compositing operations. I believe the first feature it was used on was True Lies back in 1994. From then it has continued to evolve. By the time Digital Domain worked on Titanic in 1997 it had the basics of the graphical node-based tree view you see in the latest NUKE now. In 2007 The Foundry acquired NUKE and made some major updates, including adding Python support and a modern Qt interface. Since then NUKE has gone on to become the standard compositor in the film industry today. It’s hard to say how many people in total have worked on NUKE over the years; it all adds up!
Q: NUKE was used in various movies including Avatar, District 9 and The Dark Knight. Which use of NUKE are you most proud of?
A: I’m always proud whenever I see NUKE used on any movie, or indeed commercial. I am always amazed by what artists push NUKE to do to produce these amazing shots. I guess a moment that made me proud recently was when visiting Framestore: they showed me a hero shot from an upcoming major film where they did all the particle effects with the new particle system in NUKEX 6.3. The shot looked amazing. It took us a long time to get the particle system in and working ‘right’, so it was very satisfying to see how great the result could be when it was put into the hands of talented artists.
Q: In the description of NUKE on your webpage you write: NUKE is the world’s most powerful node-based compositor. Can you explain this to our readers?
A: NUKE has from the very beginning been designed to be scalable and powerful. It processes images using full 32-bit floating point pixel information. This means you can manipulate images through many compositing and effects nodes without degradation of the image. It is also scalable in terms of image sizes and compositing tree sizes. We often see movie studios with NUKE projects containing hundreds and even thousands of image processing nodes to produce a final image. NUKE also handles large image sizes very well. Large matte paintings of 50K projected onto 3D geometry are not uncommon.
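To illustrate why full 32-bit float processing matters, here is a minimal sketch in plain Python/NumPy (not NUKE code; the pixel values and gains are made up): pushing a dim pixel very dark and then bringing it back up destroys it in an 8-bit pipeline, while a float pipeline preserves it.

```python
import numpy as np

# Hypothetical illustration: a dim pixel run through a darken node and then
# a brighten node, once with 8-bit quantisation between steps and once in
# 32-bit float. Not NUKE code; values are arbitrary.
pixel = 0.01                      # dim pixel, normalised 0..1

# 8-bit pipeline: re-quantised to 0..255 after every operation
p8 = round(pixel * 255)           # 3
p8 = round(p8 * 0.05)             # darken hard -> rounds down to 0
p8 = round(p8 / 0.05)             # brighten back -> still 0, detail is gone

# 32-bit float pipeline: intermediate precision is kept between nodes
p32 = np.float32(pixel)
p32 = p32 * np.float32(0.05)
p32 = p32 / np.float32(0.05)

print(p8 / 255)      # 0.0   -> the pixel value has been destroyed
print(float(p32))    # ~0.01 -> the original value survives the round trip
```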
Q: What unique technologies does NUKE use?
A: NUKE has a powerful multi-channel workflow that allows you to work on multiple image streams at once in the compositing tree. This also extends to working with stereo images. Multiple stereo images can flow down one pipe in the compositing tree. This reduces clutter and makes for a cleaner, more intuitive workflow.
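As a rough illustration of the multi-channel idea, the sketch below (hypothetical Python, not NUKE's API; the channel names are invented) treats a pixel as a set of named channels travelling down one pipe, with a node touching only the channels it cares about.

```python
# Hypothetical sketch of a multi-channel pixel: many named image streams
# (colour, depth, motion vectors) carried together down a single pipe.
pixel = {
    "rgba.red": 0.5, "rgba.green": 0.4, "rgba.blue": 0.3, "rgba.alpha": 1.0,
    "depth.Z": 12.0,
    "motion.u": 0.1, "motion.v": -0.2,
}

def grade(pixel, gain, channels):
    """Apply a gain only to the selected channels; the rest pass through."""
    return {name: (value * gain if name in channels else value)
            for name, value in pixel.items()}

# Grade the colour channels; depth and motion ride along untouched.
print(grade(pixel, 1.2, channels={"rgba.red", "rgba.green", "rgba.blue"}))
```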
NUKE also has a new Deep compositing system, which uses technology developed by Weta Digital on Avatar. Traditionally, when compositing CG renders together, the 3D guys need to render what’s called a holdout in one of the images where the other image will be inserted. As an example, think of a scene in Avatar where Jake’s avatar character is running through the forest. The 3D guys need to render the forest and then insert Jake’s avatar into the forest scene. In order to do this they need to render a ‘hole’ or holdout in the forest where Jake will go. This is all fine if you never need to move Jake, but what if the director needs to put the Jake character in another spot, or use a different take? It means, of course, that you need to render the forest again with the Jake holdout in a different place. On a movie like Avatar this re-render takes hours per frame. The Deep compositing system solves this problem. Instead of rendering the forest with the holdout, the 3D guys render some extra information about the scene called Deep data. This Deep data allows us to composite the Jake character into the forest on the fly in NUKE. This means there are no extra renders and many hours saved.
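The core idea can be sketched in a few lines of Python. This is a simplified, hypothetical illustration (not Weta's or NUKE's actual implementation, and the sample values are invented): each deep pixel keeps a list of (depth, colour, alpha) samples rather than one flattened value, so a new element can be dropped in at any depth and the pixel re-flattened without any new 3D render or holdout.

```python
# Hypothetical deep-compositing sketch: one deep pixel is a list of
# (depth, colour, alpha) samples instead of a single flattened value.

def flatten(samples):
    """Composite samples front-to-back with the standard 'over' operator."""
    colour, alpha = 0.0, 0.0
    for depth, c, a in sorted(samples, key=lambda s: s[0]):
        colour += (1.0 - alpha) * c * a
        alpha += (1.0 - alpha) * a
    return colour, alpha

# Deep samples for one pixel of the pre-rendered forest (values made up):
# semi-transparent leaves in front, a solid trunk behind.
forest = [(2.0, 0.2, 0.3), (8.0, 0.1, 1.0)]

# Insert the character at depth 5, between the leaves and the trunk,
# purely in the compositor: no holdout, no re-render of the forest.
character = (5.0, 0.8, 1.0)
print(flatten(forest + [character]))

# If the director moves the character, just change its depth and re-flatten.
print(flatten(forest + [(1.0, 0.8, 1.0)]))
```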
Q: Recently you released KATANA 1.0. When do you plan to release a new version of NUKE? Can you reveal something about the next release, such as new features?
A: Yes, we have a major new release of NUKE coming out in our second quarter next year. A major thing we’re looking at for the next release is harnessing the GPU for image processing. We will have some of our compute-intensive algorithms such as Denoise, zDefocus, and motion estimation running on the GPU. We’re also looking at some tight integration with a new timeline application, HIERO.
Q: Your partners are movie studios like Warner Bros, The Moving Picture Company and many more. How does this collaboration work?
A: Yes, we work very closely with movie studios. In fact most products at The Foundry actually started out their lives ‘in production’ as in-house tools at major effects houses: NUKE from Digital Domain, KATANA from Sony Pictures Imageworks and MARI from Weta Digital. Both MARI and KATANA continue to be co-developed, with teams working at The Foundry and at Weta and Sony respectively.
In addition to our close partners, we also work closely with other studios to help solve problems they might have during production. Sometimes it’s as simple as a studio having a problem with a shot and sending us the footage, and The Foundry’s image processing research team seeing if they can solve that problem. We recently did some work on Tron: Legacy for just this sort of issue. We value this sort of interaction very highly as it gives us real-world problems to solve, which we then eventually fold back into our products for future releases.
Q: What are your plans for the future?
A: One thing that will be happening for sure is more data sharing between all our applications. For instance, set up a 3D render scene in KATANA, and automatically re-create a comp for that scene in NUKE. Or, from NUKE, import the KATANA 3D scene and ‘on demand’ request a new pass or matte for a 3D element. I see this sort of data sharing as a huge time saver, especially in smaller houses.
The major thing we’re working on is future-proofing NUKE for future advances in hardware. Our new ‘Blink’ framework, which you’ll see in NUKE next year, enables us to write image processing algorithms that can potentially run on any new hardware that becomes available. For instance, CPUs came out with a technology called ‘SSE’ a few years back which enabled some algorithms to run up to four times faster. Software (including NUKE) has been slow to adopt it, as it involves re-writing the algorithms and having a separate path for SSE vs non-SSE. Then you throw a GPU into the mix, which is different again, and now you have three paths you need to maintain. Our ‘Blink’ framework is designed to fix all that. We write the algorithm once and Blink sorts out whether it runs on the CPU, SSE or GPU.
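To make the "write once, run anywhere" idea concrete, here is a toy Python sketch (not the actual Blink API; all names are invented) of the general pattern: the algorithm is expressed once as a per-pixel kernel, and a separate dispatch layer decides whether it runs as a plain scalar loop or through a vectorised path standing in for SSE or the GPU.

```python
import numpy as np

# Toy illustration of a single-source kernel with multiple execution paths.
# Not Blink; just the general pattern described above.

def brightness_kernel(pixel, gain):
    """The algorithm, written once, in terms of a single pixel."""
    return pixel * gain

def run_scalar(image, gain):
    # Plain CPU path: one pixel at a time.
    return [[brightness_kernel(p, gain) for p in row] for row in image]

def run_vectorised(image, gain):
    # Vectorised path (standing in for SSE/GPU): the same kernel applied to
    # whole arrays at once; NumPy dispatches to SIMD under the hood.
    return brightness_kernel(np.asarray(image, dtype=np.float32), gain)

def run(image, gain, backend="vectorised"):
    # The framework, not the algorithm author, chooses the execution path.
    if backend == "scalar":
        return run_scalar(image, gain)
    return run_vectorised(image, gain)

img = [[0.1, 0.2], [0.3, 0.4]]
print(run(img, 2.0, backend="scalar"))
print(run(img, 2.0, backend="vectorised"))
```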