
An Inside Look at ‘Micro Monsters with David Attenborough’

Meet a menagerie of curious critters in Micro Monsters, a new VR documentary from Alchemy Immersive and living legend David Attenborough. The five-part series is a continuation of Attenborough’s original production of Micro Monsters, which debuted in 2013 and pioneered macroscopic filming to give audiences a never-before-seen view of Earth’s mightiest insects. Micro Monsters in VR explores the real-life superpowers of arthropods from a perspective only immersive media can deliver.



Alchemy Immersive, in partnership with Oculus from Facebook, produced Micro Monsters using specially developed 3D stereoscopic camera rigs, 180° live-action capture, and newly developed VFX compositing techniques. Audiences will see all of our tiny neighbors’ minutiae and alien-like detail through an incredibly high frame rate (60fps) and spatial sound design.

Over five episodes, viewers will explore a near-invisible world of conflict and community that’s both mesmerizing and monstrous. Watch a scorpion and a centipede fight to the death, an aphid cloning itself, and an army of green ants building incredible structures. Avoid the ambush of a Trapdoor spider and experience the Portia spider’s deadly lullaby. With the aid of live-action close-ups and computer graphics, audiences can even watch a caterpillar’s transformation from inside its cocoon and discover a beetle’s secret chemical weapon.

We spoke with Elliot Graves, Director of Micro Monsters and Creative Lead at Alchemy Immersive, and Eric Cheng, Immersive Media Lead at Oculus, to hear more about Micro Monsters and get an inside peek at the production.

Behind-the-Scenes Photo Galleries

Episode: The Duel at Dawn
"While featuring a gruesome battle over territory between a centipede and a scorpion, The Duel at Dawn gives us a glimpse of some of the most elusive scorpion behavior: the mating ritual. Filming this sequence under UV light brought this hidden practice to light. When you see the scorpions light up and dance together, it’s a pretty special moment, so we set the sequence to tango music!" — Elliot Graves

"David Attenborough on location for Micro Monsters. Next to him the monstrous but magical RED 3D cinema camera rig."
"Attenborough does a PTC with the rig, this time fitted with “The Cube.” Using a prism to converge two lenses, it enabled macroscopic 3D filming."
"Bug wranglers worked within specially constructed sets, allowing for natural behavior amongst various arthropods. Here, butterfly wings are shot in hyper macro detail."
"Attenborough watches the crafty leaf cutter ants carry leaves to their underground nest, which can be 30 meters long and contain 8 million individuals!"
"The crew film with Attenborough inside a more extensive set before going out on location to capture the arthropod’s wild behaviors."

Episode: The Trap Door
"Featuring both the Praying Mantis and the Trap Door spider, this episode is all about entrapment! I always loved the sequence where the under-dog bombardier beetle escapes the Mantis’ snare by firing acid from its back end! Revealing the intricate workings of this through a CGI breakdown was a nice milestone for the production. It was also this point we refined and finalized the incredible artwork you see across the series." — Elliot Graves

"DOP Daniel Bury and the ZCAM K2 Pro in the Blue Mountains, Australia. Produced from London, these shots were used as GV across the series."
"Director Elliot and Editor Richard test different options to showcase the Portia spiders’ infamous plucking during an offline edit session."
"COVID lockdown begins in the UK, forcing the team into remote work. Our daily standup and producer Vianney’s dreaded GANT!"
"Frame.io became integral to running reviews and keeping momentum. Here, you can see a CG animation of the tunnel in our offline export."
"Creative Iona is working on the fantastical episode titles and descriptions. Leaving our physical whiteboards for a virtual Microsoft Planner board was a little tricky!"

Episode: The Feast & Flight
"For me, this episode is the most magical. Having the opportunity to see the metamorphosis of a butterfly up close and personal is just awe-inspiring. Set extension work on this episode was the most challenging due to fast-moving shots and bright backgrounds. In the end, we had to create an additional toolset within our VFX pipeline to accommodate it. We’re most proud of the final flight sequence!" — Elliot Graves

"Oliver from 1.618 Digital worked tirelessly to produce a stunning immersive sound design. Using high-res audio streaming software, we could review sessions remotely."
"Artists work on compositing the 3D transparent webs into the tunnel sequence. With heavy 8K renders, it was an awkward task when we discovered issues!"
"The talented Richard Lester (editor) brings episode four to life in an online session. Props to Premiere Pro for handling everything we threw at it."
"The artwork of a Nuke script! Created by Tim Baier, it allowed for the intelligent set extension of over 100 3D source shots to 180 VR."
"Matt’s tunnel geometry, screens, and lights in 3DS. Choreographing the various flying screens proved to be the impossible challenge in this shot."

Can you tell us about the work involved in bringing this project to VR? 

Elliot Graves: What makes Micro Monsters unique is that we used traditional stereoscopic cinematic techniques, but made that footage available inside VR. This means that you get the chance to see these incredible bug environments in immense macro detail — something not possible with 180° and 360° cameras.


We created a custom toolset within our post pipeline that would allow us to extend, project, and manipulate the image intelligently, all while simultaneously ensuring the 3D depth and disparity were correct. This process also accommodated vector-based stabilization and camera solving across the project’s 100+ shots. Stabilizing and camera-locking the footage was key to ensuring smooth HMD playback with no motion sickness for viewers. We were super fortunate to have the support of ZOO VFX, working alongside very talented artists inside Nuke 12.0 to ensure that even at 8K, the image looked perfect.

The other challenge we faced was creating 3D 180° renders of the Earth and underground tunnel introduction sequences. At 8K 60fps, every angle and detail is visible, especially because the screen is only inches from your face! This meant ensuring our designs, textures, and compositing pipeline were sufficient to produce detailed imagery that would stand up into the future. A lot of attention was devoted to the finer details within each scene, which were then rendered and comped offline instead of in real time.

What’s it like working with an 8K video for VR? How has the process evolved over the years? 

Elliot Graves: Working with 8K immersive VR media wasn’t as challenging as we’d braced for. Many aspects of working at 3D 8K were carried over or upgraded from 4K and 6K pipelines, such as ProRes proxies within the edit or 3D 180° post-processing. Within the edit, our biggest challenge was the frame rate and building a workstation that would give us smooth playback of at least 4K at 60 frames per second in the suite. With 8K playback on Quest 2, every single pixel of resolution is perceivable at a greater quality than our edit monitors could show. This meant we spent almost every edit session in the headset, rather than relying on the monitors.


Another challenge of the delivery spec was producing the CGI content used in our virtual scenes. Despite our tunnel scene taking two weeks to render on the farm, processing the render jobs wasn’t too challenging; the real fun started with QA and ensuring there were no errors! With time being of the essence, we opted to render multiple passes from each Max scene and composite them in additional steps via Nuke, allowing a greater margin for error and additional creative flexibility. As Alchemy has been using real-time engines on other projects, we were keen to explore the feasibility of producing these scenes in real time; however, with the lockdown and reduced interaction between the team, a proven method with guaranteed results came out on top.

Eric Cheng: It’s important to understand why 8K/60fps was the resolution and framerate target. Quest 2’s display resolves about 20 pixels per degree (PPD), which means 180° content requires video about 3600 pixels wide (7200 pixels for 360°, and stereoscopic 180° content requires 2 × 3600 pixels). 8K is the resolution that saturates Quest 2, and it can also be scaled down a bit for Quest, where it still looks great.
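Cheng’s arithmetic can be sanity-checked with a few lines of code. This is a back-of-the-envelope sketch, not an official spec: the 20 PPD figure comes from his comment above, and the helper function name is ours.

```python
# Rough check of the resolution targets discussed above.
# Assumption: the display resolves ~20 pixels per degree (PPD), per the interview.

def required_width_px(coverage_deg: float, ppd: float, stereo: bool = False) -> int:
    """Horizontal pixels needed for video to match the display's pixel density.

    Stereoscopic content doubles the width because each eye gets its own view
    (e.g. packed side by side in the delivered video frame).
    """
    per_eye = coverage_deg * ppd
    return int(per_eye * (2 if stereo else 1))

PPD = 20  # approximate pixels-per-degree of the Quest 2 display

print(required_width_px(180, PPD))               # 3600 px for mono 180°
print(required_width_px(360, PPD))               # 7200 px for mono 360°
print(required_width_px(180, PPD, stereo=True))  # 7200 px for stereo 180° (2 × 3600)
```

The stereo 180° figure of 7200 pixels is why an "8K" (7680-wide) master is the resolution that saturates the display.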

We’re also constantly chasing 60fps for immersive media. Hollywood films are generally shot and produced at 24fps, which gives content that “film” look. In VR headsets, motion at 24fps and 30fps exhibits distracting artifacts, while motion at 60fps is smooth. Unfortunately, very few cameras suitable for immersive video offer both high resolution and a high frame rate, so it’s a hard standard to achieve.

What’s something about VR and immersive filmmaking that still surprises you?

Elliot Graves: VR and immersive filmmaking still catch me by surprise. Each time I think our team has solved a problem or created a solution, another creative or technical challenge pops up. While creative processes, pipelines, and working methods are slowly becoming standardized, I believe the constant drive to improve our projects leaves us exposed (even if it’s what we secretly love about the industry). I have to admit, though, working on such an ambitious project in the middle of a global pandemic is not something I’ll be wishing for anytime soon! The biggest surprise in producing Micro Monsters was how it restored my faith in live-action immersive media over 6DOF real-time experiences.


With the boom of UE4, real-time production, and gaming inside VR, storytellers often go down that road too. In the past 12 months, I feel a stigma has attached itself to live-action VR, perhaps because of image quality or narrative shortcomings. When we experienced the first offline edits of Micro Monsters, I was blown away by how I simply fell into the story, without any desire to interact, move, or complete an objective in 6DOF. The series truly transports you into a world that’s inaccessible in real life, something I believe is key to immersive content. When you have a praying mantis the size of a car, inches from your eyes, there’s an awe-inspiring emotion that kicks in and holds your absolute attention.

Eric Cheng: For me, there is continuous reinforcement that spending as much time as possible in headset leads to better-quality content. With Micro Monsters, we were fortunate because Elliot, Vianney, and the extended team at Atlantic Productions were in headset constantly during the production process (and they also have incredible attention to detail). We were able to refine the episodes until the production's technical aspects faded away; the story is all that is left, which is the goal!

