Taking a bite out of the VFX for Shark Attack 360

Simon Percy from Little Shadow talks to TVBEurope about how the animation studio used a mobile virtual production setup to create a VFX shark lab for the second season of the Disney+ series

Arrow Media’s Shark Attack 360 returns for a second season this week, as it investigates why sharks bite people.

The show uses cutting-edge VFX technology to analyse data and understand, in forensic detail, the reasons behind shark attacks.

The producers worked with Little Shadow to create the effects for the series. Simon Percy, director at Little Shadow, talks to TVBEurope about the company’s work on the show.

How did Little Shadow get involved with Shark Attack 360?

We first worked with Arrow Media on season one of Shark Attack 360, when we were brought in late in the production to handle a series of visual effects clean-ups, blood augmentation, and experiment explainers. We completed those tasks quickly over the Christmas period and must have made an impression on the Arrow Media team because, in May 2023, we received a call to pitch for all the CGI in season two.

It was clear from the outset that Arrow Media had big ambitions for series two and really wanted to raise the bar. So, our initial task was to deliver a pitch that would meet those expectations and then exceed them. Being a studio that is all about the visuals, we created a two-minute sizzle reel and a series of bespoke images and development films that showed the scope of our vision. We were elated that our ambition was rewarded: we were brought onto the Arrow Media team and made welcome.

What were you tasked with?

We were tasked with developing and delivering the visual effects for the series, including creating realistic and interactive virtual sharks, integrating them into a hybrid virtual production setup, and ensuring a smooth production pipeline that could handle the project’s demands.

From the start, executive producer Nick Metcalfe and series producer Laura Offer made it clear that there needed to be a new post/VFX pipeline, one that would not only create greater flexibility within the Lab but also enable more impressive, cinematic shark encounters. Linked to that was the desire to make the presenter/collaborator experience within the Lab more immersive by enhancing its use to explain and investigate the attacks. And finally, the shoot and post process needed to become more efficient, avoiding pickups, shot redos, or delays in seeing previs in the edits.

Challenge accepted!

How did you go about delivering that?

From the outset, our desire was to have a digital replica of our shoot location, and from the moment that location was chosen, we leapt into action. One of our trusted partners, Lidar Lounge, captured a lidar scan of the set to create a 1:1 replica of the multi-level structure, allowing shot planning, previs, and final shots to be crafted ahead of the shoot and the start of edits.

We built the pipeline around our hybrid virtual production technology. This method did not involve a full LED volume; instead it used our bespoke virtual production setup, combining camera tracking, live green screen keying, monitoring, and recording in one box. That brought all the high-end VP tools to bear, allowing augmented reality (AR) elements, such as our sharks, to be placed into the live scene in real time.
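
To illustrate the principle (the show used Blackmagic hardware for the live key, not code like this), the key-plus-AR step boils down to two per-frame operations: pull a matte off the green screen, then layer the keyed talent and the real-time CG renders with standard "over" composites. A minimal NumPy sketch, with every name and threshold purely illustrative:

```python
import numpy as np

def chroma_key_matte(frame, key_rgb=(0.0, 1.0, 0.0), threshold=0.4, softness=0.1):
    """Distance-based green screen key: 0 where keyed out, 1 where kept.

    frame: float32 RGB image with values in [0, 1].
    """
    dist = np.linalg.norm(frame - np.asarray(key_rgb), axis=-1)
    # Soft ramp around the threshold so fine edges (hair etc.) blend.
    return np.clip((dist - threshold) / softness, 0.0, 1.0)

def over(fg, fg_alpha, bg):
    """Standard 'over' composite of a foreground onto a background."""
    a = fg_alpha[..., None]
    return fg * a + bg * (1.0 - a)

def composite_frame(camera_frame, virtual_lab, shark_rgb, shark_alpha):
    """Per frame: key the presenter off the green screen, place them over
    the rendered virtual lab, then lay the AR shark render on top."""
    talent_matte = chroma_key_matte(camera_frame)
    scene = over(camera_frame, talent_matte, virtual_lab)
    return over(shark_rgb, shark_alpha, scene)
```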

Intricate planning with the DoP, director, and producers was critical to pre-visualise each sequence ahead of shooting, giving us a clear 3D template for every scene while maintaining flexibility for adjustments on the fly. This is where our mobile VP setup came into its own: it allowed the previs to inform lighting and camera setups well ahead of the shoot, meaning shot setups were incredibly quick. More importantly, it gave Diva a comfort screen that displayed the sharks and other key props accurately in frame in real time, ensuring correct eye lines and appropriate heights for hand gestures and pointing.

This process allowed shots to be framed and takes to be set up and run through rapidly. We could easily capture the raw clean footage, a matte pass, and a composite of the scene with the sharks and props, which enabled the rushes and full-scene composites to be edited within minutes of the camera cards being turned over. A major benefit was ensuring all the material was shot correctly, eliminating the need for reshoots and avoiding the long wait for 3D composites that typically follows weeks after the shoot. Double tick against the brief's challenge.

One of our goals was to enhance the energy, realism, and motion of the sharks, so we spent considerable time improving the detail and accuracy of the shark models. This included retexturing, re-topologising, and adding a new animation rig that allowed for a greater range of subtle motions. We created new mouth, jaw, and eye controls layered on top of a soft body system to simulate the near-cloth-like behaviour of shark skin during violent movements. Additionally, we developed a particle simulation system to generate bubbles through the gills and mouths during attacks.
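
The interview doesn't specify how these simulations were authored, so purely as a sketch of the idea, here is how a soft body skin layer and a gill-driven bubble emitter might be set up through Blender's Python API; the object and vertex-group names ("Shark", "Gills") and all values are hypothetical:

```python
import bpy

# Rough sketch (not the studio's actual setup): layer a soft body pass
# and a bubble particle emitter onto a rigged shark mesh.
shark = bpy.data.objects["Shark"]  # hypothetical object name

# Soft body on top of the animation rig: a high goal weight keeps the skin
# pinned to the rigged pose, with enough slack to wobble in violent motion.
sb = shark.modifiers.new(name="SkinWobble", type='SOFT_BODY')
sb.settings.use_goal = True
sb.settings.goal_default = 0.9   # mostly follow the rig
sb.settings.goal_spring = 0.6    # stiffness of the snap back to the pose
sb.settings.mass = 0.3

# Bubble emitter driven by a vertex group painted around the gills/mouth.
shark.modifiers.new(name="Bubbles", type='PARTICLE_SYSTEM')
psys = shark.particle_systems[-1]
psys.vertex_group_density = "Gills"  # hypothetical vertex group
ps = psys.settings
ps.count = 2000                      # number of bubbles emitted
ps.lifetime = 60                     # frames before a bubble pops
ps.normal_factor = 0.5               # initial push away from the surface
ps.effector_weights.gravity = -0.2   # negative gravity weight: bubbles rise
```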

Our work began with the great white shark, focusing on the mouth region to ensure accurate animation of its distinctive overbite. We needed to balance the jaw’s movement with the anatomical structure of the shark. These challenges varied across different shark species, such as hammerheads, tiger sharks, and bull sharks. 

Additionally, we were tasked with creating internal anatomy for some sharks, so we developed a generic system for organs while allowing for variations in muscles and liver size specific to each shark.
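
As a toy illustration of that "generic organs, per-species variation" idea (all names and scale values below are placeholders, not data from the show):

```python
from dataclasses import dataclass

@dataclass
class SpeciesAnatomy:
    """Per-species tweaks applied on top of one shared organ setup.
    Values are illustrative placeholders, not measurements."""
    muscle_scale: float = 1.0
    liver_scale: float = 1.0

BASE_ORGANS = {"liver": 1.0, "muscles": 1.0, "gills": 1.0, "stomach": 1.0}

SPECIES = {
    "great_white": SpeciesAnatomy(muscle_scale=1.3, liver_scale=1.1),
    "hammerhead":  SpeciesAnatomy(muscle_scale=1.0, liver_scale=0.9),
    "bull_shark":  SpeciesAnatomy(muscle_scale=1.2, liver_scale=1.0),
}

def build_anatomy(species: str) -> dict:
    """Instantiate the generic organ set, then apply species overrides."""
    organs = dict(BASE_ORGANS)
    tweak = SPECIES[species]
    organs["muscles"] *= tweak.muscle_scale
    organs["liver"] *= tweak.liver_scale
    return organs
```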

Finally, we conceptualised CORAL (Computational Oceanic Research and Analysis Logic): a cutting-edge large language model paired with an innovative holographic display system. This AI-based technology was able to access global information, analyse data, and even examine the internal anatomy of sharks, functioning like a high-tech colour MRI scanner and allowing us to delve into the intricate details of shark anatomy for accurate representations.

Once in post, the shots were turned over to us and we were unleashed into a well-organised process of shot turnarounds.

Utilising the lidar scan of the location, we were able to seamlessly blend the real-world photography and virtual lab into single shots, allowing Diva and the digital sharks to co-exist in one versatile investigative space.  
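
That blend works because the tracked camera's pose and lens can be replayed over the 1:1 lidar replica: any point in the virtual Lab then projects to the same pixel the real camera photographed. A minimal pinhole-camera sketch of that relationship, where the rotation, translation, and lens values are placeholders standing in for the tracking data:

```python
import numpy as np

def project(point_world, R, t, f_px, cx, cy):
    """Project a 3D world-space point through a tracked pinhole camera.

    R, t   : camera rotation matrix and translation (from the tracker).
    f_px   : focal length in pixels (from the lens data).
    cx, cy : principal point (roughly the image centre).
    """
    p = R @ point_world + t      # world space -> camera space
    x = f_px * p[0] / p[2] + cx  # perspective divide to pixel coords
    y = f_px * p[1] / p[2] + cy
    return np.array([x, y])

# Rendering the CG camera with the same R, t, and lens means a point
# measured on the lidar scan lands exactly where the real set appears in
# the plate, which is what lets photography and the virtual Lab blend.
```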

The Lab was later filled with the open ocean, murky riverbeds, shipwrecks, and even the Earth and Moon. All of these played a pivotal role in the investigation process. Together, Diva, her team, and the Lab's bespoke AI system, CORAL, were able to review news stories, search locations, collate data, pinpoint trends, and even scan a shark's anatomy.

We had so much fun creating the shots and were given such freedom to create the most engaging and cinematic shots we could to support the investigations.

Tell us about all the technology you used to deliver the project.

We used a range of standard off-the-shelf tools along with some custom-built systems.

  • Virtual production box: 
    • We utilised a custom-built virtual production (VP) box that housed a purpose-built PC with an NVIDIA RTX 4090 GPU and AMD Threadripper CPU, a Blackmagic DeckLink 4K Pro with frame sync, integrated with Blackmagic's 4K live keyer, two HyperDeck recorders, a switcher, and monitors. This setup was designed for mobility, flexibility, and quick playback, with real-time asset generation and comping adjustments on shoot days.
  • Motion Tracking: 
    • Zeiss Ncam camera tracking system for precise tracking of camera movements.
  • Animation and Modelling Software: 
    • Lidar Lounge's scan of the Lab location, an 8.6 million-polygon set model that was brought into Blender.
    • Additional modelling was done in Blender.
    • Maya for rigging and shark animation.
    • Substance Painter and Photoshop for texture creation.
    • Blender for final texturing, shaders, lighting, and environment creation.
  • Rendering Engines: 
    • Unreal Engine 5.3 for real-time previs, lighting and shot planning, including camera lenses and locations.  
    • Real-time rendering of sharks and CORAL for on-set scene integration.
  • Editing and Compositing: 
    • DaVinci Resolve for initial edits, supplied from the Avid edits as AAF files. We held a copy of the rushes for the duration of the project to avoid big file transfers and lost time, which proved hugely effective.
    • Adobe After Effects for master shot comps.
    • Mocha Pro for roto, shot tracking, and clean-ups.
    • Nuke for additional compositing and clean-up work.
  • Additional Tools: 
    • KeenTools GeoTracker for additional motion tracking in Blender.
    • JangaFX EmberGen for fluid and smoke effects.
    • Frame.io for shot review and feedback.
  • Final output:
    • Adobe Premiere for final output and QC checks.

All these tools were integrated into a pipeline that was stress-tested before the first test shoot day. The process ran smoothly, with AI tools assisting with fast, efficient roto and keying work.

How do the graphics help tell the story of the series to the audience?

The graphics were essential in creating an immersive experience, allowing viewers to see and understand the sharks’ behaviours and environments interactively. The virtual lab became a character in its own right, with sharks swimming alongside the presenters, interacting with the environment, and providing a dynamic backdrop for the investigations. 

Did you take inspiration from other TV series or films when developing the graphics?

Yes, we are massive fans of films and series and always keep an eye on the amazing work being created by other studios. Our starting point for any project involves researching and reviewing as much reference material as possible to fully immerse ourselves. In this instance, we drew inspiration from various sources: the underwater cinematic techniques in films like The Shallows and 47 Metres Down, numerous documentaries (many of them Sharkfest-related), and incredible footage caught on camera. Any excuse to watch Jaws again was welcome. All these elements influenced our approach to shark animation and underwater scenes.

Additionally, the concept of a virtual assistant like CORAL was inspired by the AI in The Time Machine remake, although we chose a more ethereal, non-personified representation.

How long did the project take and how many members of the team were involved?

The project took approximately eight months to complete. At its peak, the team consisted of 16 artists, including model creators, animators, scene setup specialists, and compositors. On average, eight to nine artists were working on the project at any given time, supported by a VFX producer and a creative director.

What was the biggest challenge of working on Shark Attack 360?

The biggest challenge was managing the complex and elaborate camera moves required for the hybrid virtual production setup. Some scenes involved intricate tracking and roto work to seamlessly blend live footage with virtual backgrounds, especially when dealing with fine details like hair and close interactions with virtual sharks. The quick turnaround and the volume of shots (256 in total) also added to the challenge, but our collaborative approach and robust pipeline helped us manage these effectively.

Simon Percy

What are you most proud of achieving? And what do you hope the audience takes from the series?

We are most proud of developing a new and efficient pipeline for Arrow Media, which allowed for seamless integration of virtual and real elements. We achieved what we set out to achieve. This enabled us to create visually stunning and engaging sequences, like the immersive underwater shipwreck scene and the seamless transitions between virtual and real sets. We hope the audience finds the series both educational and captivating, gaining a deeper understanding of shark behaviours and marine environments through the innovative visual storytelling techniques we employed.

Season two of Shark Attack 360 is released on Disney+ and Hulu today.