
Building the Aardman AR trails

  • Writer: Russ Morris
  • Feb 6
  • 9 min read



Over the last 18 months or so, one of the main projects I've been involved with at Aardman is overseeing and developing a number of Augmented Reality (AR) trails based on Aardman IP: Shaun the Sheep, Lloyd of the Flies and, of course, Wallace and Gromit.


As I'm coming up to the end of my time at Aardman, and after recently completing development on the Wallace and Gromit AR trail, I thought it would be a good opportunity to give an overview of how we approached building the trails from a technical perspective, improved reusability of our code, and created a technological base that allowed the content of the apps to take centre stage from a production perspective.


Before we get started, let me set the scene...


Shaun the Sheep: Hide & Sheep


For those who might not be aware, augmented reality apps generally allow users to see virtual objects, characters or environments 'projected' onto the 'real' world when viewed through a camera feed on the device.


A 'marker' for Hide & Sheep. Users scan the image with the Hide & Sheep app to make Shaun appear.

Hide & Sheep was one such app. Visitors to various venues and locations worldwide could scan special images (colloquially referred to as 'markers') that would make Shaun appear as though he was right there with you.


Hide & Sheep proved to be a hit for the venues that hosted trails and what I assume happened after the successful launch of the app is that a lot of important people got together, discussed important people-y things like ROI, and decided it would be a great idea to do more AR trail apps.


This is where I come in.


The Three Trails


Sometimes, every now and again, you are struck with a moment of pure creative brilliance. The subtlety of the above joke is just, simply put, delicious.

The decision was made to create three new AR trails: one for Aardman newbie Lloyd of the Flies, a new Shaun the Sheep trail, and lastly, getting the clay legends themselves, Wallace and Gromit, to appear in their own AR trail.


Now, what tends to happen when new projects are announced internally is that someone excitedly explains how great it is that we're making new apps, everyone releases party poppers, cranks the stereo up to 11, and dances around the room. Just as the fervour reaches peak 'aren't we all the greatest', the person responsible for actually building the apps chimes in with some gibberish about tech stacks, target devices, and CMSs. Normally, it's the producer who pops this figurative balloon (and who would have already unplugged the stereo before anyone got the volume above three), but the producer was off that day—so I got to spoil the fun. Hi. Yes, that's me. I spoil the fun.



So yeah, there would be three new apps building on the success of Hide & Sheep and introducing new Aardman IP and AR features. The three planned apps were:


  • Bug Hunt – an AR app based on Lloyd of the Flies

  • Fun with the Flock – a new Shaun the Sheep AR trail

  • All Systems Go – an AR app based on Wallace and Gromit


What Would Be New?


Content and IP aside, the main difference between these new apps and Hide & Sheep would be a greater focus on advanced AR features, allowing users to interact with Aardman characters in fun new ways. Previously, Hide & Sheep used a simple 'marker' scanning feature—Shaun appeared where the marker was scanned. The limitation of this method is that the AR experience is always bound to the marker image itself. If the device can't see the image, it won't display the content. Also, venues had to place the images at the correct heights, as the app assumed the markers were a certain distance from the ground. If the marker was placed incorrectly, the content could look like it was floating.

We still wanted to use markers to blend the physical and digital spaces, but instead of placing AR content solely via markers, the new apps would utilise plane detection (which identifies horizontal and vertical surfaces). This would allow content to be placed more naturally, solving the 'floating' marker issue.
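In AR Foundation terms, plane detection is handled by an `ARPlaneManager` on the AR session origin. As a rough, illustrative sketch (assuming the AR Foundation 4/5-style API, where the manager raises a `planesChanged` event; the component name is mine, not from the actual apps):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Minimal sketch: listen for newly detected planes and log whether each
// is horizontal or vertical. Illustrative only, not production code.
[RequireComponent(typeof(ARPlaneManager))]
public class PlaneListener : MonoBehaviour
{
    ARPlaneManager planeManager;

    void Awake() => planeManager = GetComponent<ARPlaneManager>();

    void OnEnable() => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
            Debug.Log($"New {plane.alignment} plane: {plane.trackableId}");
    }
}
```

Each detected `ARPlane` reports its alignment, which is what lets an app distinguish floors and tables from walls when deciding where content may be grounded.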



Grounding content requires both technological reliability and user understanding. Firstly, the device must be modern enough to support a smooth experience—some older devices technically support grounding, but the experience isn't great. Secondly, the app must clearly guide the user through the grounding process. We implemented customised patterns to identify detected surfaces, along with additional graphics and messaging to indicate whether a location was suitable.
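The actual placement step in AR Foundation typically boils down to a raycast from a screen tap against the detected planes. A minimal sketch of that idea, with illustrative names (this is not the trails' real code):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch of the grounding step: raycast a screen tap against detected
// planes and, on a hit, snap a content anchor to the surface pose.
public class GroundingPlacer : MonoBehaviour
{
    public ARRaycastManager raycastManager;
    public Transform contentAnchor; // illustrative: root of the AR content

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        var touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // PlaneWithinPolygon restricts hits to the detected plane's boundary.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            var pose = hits[0].pose; // closest hit first
            contentAnchor.SetPositionAndRotation(pose.position, pose.rotation);
        }
    }
}
```

Because the content inherits the plane's pose rather than a marker's, it sits on the real surface regardless of where a venue hung its images—which is precisely what fixes the 'floating' problem.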


We knew this complexity could be tricky for some users, but we believed the improved overall experience was worthwhile.


Techno Trousers interacting believably with the 'ground' of the 'real' world.

Additional AR Features


We also leveraged face-tracking technology to create content that could interact directly with users by detecting faces and overlaying features.
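In AR Foundation, face tracking is mostly configuration: an `ARFaceManager` spawns a prefab per detected face, and the prefab carries the overlay content. A small illustrative sketch (again assuming the 4/5-style API; not the apps' actual code):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: react when faces appear. The visual overlay itself lives in the
// prefab assigned to the ARFaceManager in the inspector.
[RequireComponent(typeof(ARFaceManager))]
public class FaceOverlayLogger : MonoBehaviour
{
    void OnEnable() => GetComponent<ARFaceManager>().facesChanged += OnFacesChanged;
    void OnDisable() => GetComponent<ARFaceManager>().facesChanged -= OnFacesChanged;

    void OnFacesChanged(ARFacesChangedEventArgs args)
    {
        foreach (var face in args.added)
            Debug.Log($"Face detected: {face.trackableId}");
    }
}
```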





These were all mainly content challenges - the technical side of this stuff is relatively straightforward with modern libraries like AR Foundation, which wraps the two platform-specific systems, ARKit (for Apple devices) and ARCore (for Android devices), and lets you get up and running really quickly. The big challenge for these apps was knowing that multiple apps would need to be created in a short space of time and then supported for a long period. That's why I spoiled the fun: there was no time to lose and lots of technical planning to work out...


Technical Planning for Three New Apps


When beginning to plan the new trails, it was clear that to create three unique apps—each with its own IP, bespoke content, and design—the foundation needed to be:


A) Reliable

B) Consistent

C) There is no C, it just felt like there should be one

D) Obviously not a D

E) This is just getting silly



Reliability and consistency

To start building up the reliability and consistency we needed to go back to the drawing board. We knew we had to (and had planned to) improve our overall tooling for creating apps in Unity.


For any app to be reliable it needed a good framework: a content-agnostic set of tools that we knew inside out, and that could take on the repetitive heavy lifting of game and app development in Unity. We had some tools, but they either needed a refresh, an update, a fresh coat of paint, a brand new set of tyres, a bin to be thrown into, or a fresher, younger model. So our framework would definitely need looking at. Add it to the list. There were also the requirements of the AR trails to consider. What would the AR trails be and how would we make them? And how would we make them when viewed through the lens of reliability and consistency across all three apps?


The Approach: Separating Form & Content

Like most projects I tackle, I started by separating form (the way we build something) and content (what the user sees). In this case, the form is the technology: Unity, AR Foundation, the devices we were targeting, and the locations where the apps would be played. The content is the brand, the types of games you would want to make, the animations we would want to play, and the user interfaces we wanted to build. My goal as a development lead is to marry these two together: to listen to the creative leads, understand what they want, and put the systems, technology and processes in place that allow that vision to come to life. For these particular projects, the context of multiple apps needing to be created and supported meant the focus on form was especially important. So important, in fact, that not only did I need to separate form and content, I needed to break the form itself down further.


The form for these apps needed to be broken down into the following:

  • The underlying Unity framework that the apps would be built on.

  • The AR technology that would be used

  • Tools and logic that would support the requirements of an AR trail


The framework


To ensure reliability and consistency, we reviewed how we had built past projects. What worked? What didn't? How could we improve state management, UI handling, and CMS integration? From this analysis, we developed a framework to support future apps.

Key components included:


  • State management (with visual tools for app flow)

  • UI management (to streamline interface design)

  • Localisation (always be ready to localise!)

  • Analytics (to control data capture at different levels)

  • Remote data handling (integrated with Directus but not exclusively tied to it)
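To give a flavour of the state-management piece, here is a hypothetical sketch of the kind of API such a module might expose: named app states with enter/exit hooks, driven by a central machine. None of these names come from the real Aardman framework.

```csharp
using System.Collections.Generic;

// Sketch of a framework-style app state machine. Each screen or phase of
// the app (splash, location entry, AR view...) is a named state.
public abstract class AppState
{
    public abstract string Name { get; }
    public virtual void Enter() { }
    public virtual void Exit() { }
}

public class AppStateMachine
{
    readonly Dictionary<string, AppState> states = new Dictionary<string, AppState>();
    AppState current;

    public void Register(AppState state) => states[state.Name] = state;

    // Exit the current state (if any) before entering the next one.
    public void GoTo(string name)
    {
        current?.Exit();
        current = states[name];
        current.Enter();
    }
}
```

A visual tool for app flow, as mentioned above, would then just be an editor view over the registered states and the transitions between them.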


The AR module

The second stage required us to know the technology we would use to create the AR apps like the back of our hand. We chose AR Foundation for these trails, as it gave us all the functionality we needed to deliver the content we planned to make on the platforms we were targeting.


The only way to fully understand the capabilities (and limitations, that's the most important bit) of anything is to start using it. So we started building prototypes to understand how AR Foundation liked and didn't like to be used.

That wasn't it though; just knowing how it worked wasn't actually the goal. The goal was to build on this knowledge and shape AR Foundation in a way that suited us. We began building tools around AR Foundation - to better manage content, to dynamically create tracked images, and to define our content types and core libraries that took advantage of it. This formed the basis of our AR module: a set of tools we could rely on to deliver AR content in our apps. It also allowed us, if we ever needed to, to completely decouple from AR Foundation. Our tools consumed AR Foundation as a library; if we decided to switch to Vuforia later, we could.
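One common way to get that decoupling is to have the app talk to a small interface of your own, with one adapter per AR provider behind it. A hypothetical sketch of the shape such an interface might take (all names here are mine, not the real module's):

```csharp
using System;
using UnityEngine;

// App code depends only on this interface. One adapter wraps AR Foundation;
// a Vuforia adapter could be swapped in later without touching app code.
public interface IArProvider
{
    event Action<DetectedSurface> SurfaceDetected;
    event Action<string> MarkerScanned; // marker name/id

    void StartTracking();
    void StopTracking();
}

// Provider-neutral description of a detected surface.
public struct DetectedSurface
{
    public Pose Pose;
    public bool IsHorizontal;
}
```

The trade-off is that your interface can only expose the intersection of features you actually need, but for a trail app that surface area is small and well understood.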

The AR Trails module

The final piece of the puzzle was to think about the AR trails themselves as a tech challenge. We knew we had a framework to build a Unity app on and an AR module to create AR content - but with three planned apps, what would the module for the trails themselves look like? Which features would be shared across all three, and what space did we need to leave for bespoke content? We identified that what made the trails unique was the requirement to be associated with specific locations, a progression and unlock system, and a similar app flow with common screens (such as a location entry screen). With this in mind, we built an 'AR trail module' into which we could insert bespoke content.
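The shared progression and unlock system could be as simple as a set of location ids per trail, persisted between sessions. A hypothetical sketch (using Unity's `PlayerPrefs` for brevity; all names are illustrative, not the real module's):

```csharp
using System.Collections.Generic;
using System.Linq;
using UnityEngine;

// Sketch of a trail-agnostic progression model: each trail is a set of
// location ids, and scanning a location unlocks it.
public class TrailProgress
{
    readonly string trailId;
    readonly List<string> locationIds;
    readonly HashSet<string> unlocked = new HashSet<string>();

    public TrailProgress(string trailId, IEnumerable<string> locationIds)
    {
        this.trailId = trailId;
        this.locationIds = locationIds.ToList();
        // Restore previously unlocked locations.
        foreach (var id in this.locationIds)
            if (PlayerPrefs.GetInt(Key(id), 0) == 1) unlocked.Add(id);
    }

    string Key(string id) => $"{trailId}.{id}.unlocked";

    public void Unlock(string id)
    {
        if (unlocked.Add(id)) PlayerPrefs.SetInt(Key(id), 1);
    }

    public bool IsComplete => unlocked.Count == locationIds.Count;
}
```

Because the model knows nothing about any particular IP, the same code can back Shaun, Lloyd, and Wallace and Gromit trails alike, with only the location data differing.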

The Aardman Unity framework is really a series of modules in itself. Whilst presented as a group here, parts like analytics and localisation were their own features; we just shipped them with the framework for ease at the time.

Overview of the three main components of our trail apps: the Unity framework, AR module and AR trail module.

At the end of this stage we had the three main components required to (and remember, this is the whole point) reliably and consistently create three new trail apps, knowing that all the required features were covered.

Building the apps


So now we could build the apps, right? Well, in truth we were doing the above work at the same time as building prototypes and trying out ideas - refining the content and features of the new trails. The team that created the apps was small, so we really didn't have time to dedicate a few months exclusively to building frameworks and modules; we had to build those alongside prototyping features and ideas. This meant again separating form and content. The form was taking shape separately, but the content also had to be developed separately. Alongside building the frameworks and modules, I was also creating a barebones app where prototypes and designs for the apps could be quickly built, tested, shipped and shared with other members of the team. We could test ideas without committing too much time to setting up complex UIs before they were finalised. It was a very bootstrapped process - but again, it allowed us to iterate quickly on ideas.



In all honesty, the first of the three apps was really the guinea pig for the others. We really had to trust that the process we were putting in place would work, and use all of our experience to remind ourselves of that. The bulk of the first app was built as very separate features and scenes; only towards the final stages of development, when the framework, modules and the content for the first app were finalised, did we start putting it all together as its own app. It was mostly a nomadic set of features and ideas that we packaged together with the final frameworks and modules - and I'm happy to say that it worked as intended.

You can find out more about the AR trails at the following links...


Bug Hunt


Fun with the Flock


Wallace & Gromit: All Systems Go


 
 
 