Note: This article was originally published on LinkedIn. If you enjoy my article, please click through and consider sharing on social media and connecting with me!
Evolving AR from the tabletop to the great outdoors. Bringing the streets of 1941 Lochee to your street. Emulating a museum experience using just a tablet. These are just some of the challenges I worked on – and I’ve developed a framework to help you do the same and more.
Following this framework allowed us to get close to emulating the kind of interactive storytelling experience we studied, for a fraction of the development time and cost of an actual museum exhibit.
Here’s what you can expect from this article:
- I identified 9 major steps that can be applied iteratively to progress from a basic to a fully immersive AR game.
- I’ll explain my process for arriving at this framework and then introduce the framework itself, along with a detailed breakdown of how I applied it.
- I’ll reflect on the suitability of this framework and its application and how it can potentially be improved and adapted for further use.
- Plus: I’ll share a design breakdown I produced of a state-of-the-art digital interactive museum exhibit.
- Plus: I’ve shared links to great resources to help you do the same using Unity AR Foundation.
A New Business Model And A Passion Project
In mid-2020, I started my first official project at InGAME: designing and developing a prototype historical and cultural Augmented Reality experience for the iPad.
Our partner and challenge-holder, TPLD, had come to us with an idea they had for a new business model. They wanted to explore creating mixed-reality experiences for museums and the tourism sector. They wanted us to help them further define and de-risk this idea, so they could turn it into a business model and eventually start a new venture.
They had in mind a perfect way to validate this idea – a little passion-project-prototype, recreating 1941 Tipperary, Dundee, based on one of the challenge holder’s own childhood memories!
Goals
Our objective was to create an immersive and interactive 3D VR / AR application that gives people the sense of what it was like to live, work and socialise in 1941 in an area of Dundee known locally as Tipperary.
We needed to develop a process or framework which could successfully produce a project meeting the following criteria:
1. Recreating 1941 Tipperary, Lochee, Dundee, optimized for iPad 2020
2. Telling fictional stories based on authentic historic material and artefacts
3. Producing a working AR technical demo for the iPad 2020
4. Allowing the player to play the role of a fictional character in the chosen setting
5. Providing an experience close to that of state-of-the-art digital museum experiences
6. Immersing players in this recreation of the chosen setting
7. Exploring outdoors AR
8. Exploring Unity AR Foundation’s available features
My Process For Designing This Framework
We needed a viable, straightforward, and reliable process to design this experience. There were some unknowns and challenges that helped form this process.
First, the type of experience had to be considered. There have been projects about applying Games and AR technologies for cultural museum experiences, such as experiences that allow users to view, process, and even curate cultural artefacts (Pollalis et al., 2017 - HoloMuse: Enhancing Engagement with Archaeological Artifacts through Gesture-Based Interaction with Holograms.).
However, projects like that are often limited in scale and functionality to single objects on a table or on a shelf and don’t easily provide opportunities for immersive storytelling.
The nature of the experience we wanted to give our users seemed best suited to outdoors AR. This led us to prefer developing our own ad-hoc process for designing the game. A few challenges I considered are:
1. What kind of experience are we trying to design for users?
2. What’s currently possible with AR on our target platform?
3. What kind of data and artefacts do we have about the story we want to tell?
Of course, in addition to this, there are some general questions like project scope, available resources, and timeline.
The answers to these questions helped form a process that naturally led to informing the AR Experience Design Document.
Validating And Understanding The Need - Uncovering Problems With Traditional Museum Experiences
The COVID pandemic has caused major disruptions for museums, most obviously by greatly reducing footfall, while also highlighting the need to update infrastructure to support current needs.
In museums, often the volume of information can be overwhelming and tiring. Improvements are needed in the manner and medium of delivering this information.
There are three major factors that make digital AR experiences for museums an attractive proposition:
1. Location and accessibility
What we’re exploring here is similar to the concept of “distributed museums”. By not limiting museum experiences to the physical museum space, we make experiences available to online audiences and make them more sustainable across platforms, communities, and time.
(Proctor, 2017 - 'The Museum as Distributed Network, a 21st Century Model').
Historical parks and monuments make for good targets for digital or online museums – due to being physical sites of interest that may be lost over time.
2. Cost of designing and developing a digital museum exhibit
The Smithsonian found the direct costs ranged from $25 to $6,500,000 per exhibit, and that their costs are similar to those at other museums across the United States (Smithsonian Institution Office of Policy and Analysis, 2002 - The cost and funding of exhibitions).
Comparatively, the development of an AR app may cost between a few thousand dollars for a straightforward AR app to $300,000 for a feature-rich one (Lavrentyeva, 2021 - Augmented reality cost: key factors and real-world examples.).
3. Better experiences with increased immersion
Mixed Reality offers a unique tool for people to study the past and get an “immersive” feeling.
Today’s high-end devices are likely to become affordable and mass-produced within a few years, which makes them good targets for R&D work like this.
Step 1: Understanding Existing Experiences We Wanted To Emulate: Design Fundamentals Of Interactive Digital Museum Experiences
I started by looking at state-of-the-art museum experiences, to understand what they were offering. I wanted to break down their design and see if that could be replicated using AR.
Several state-of-the-art digital museum experiences focus on adding a vivid narrative in which visitors assume the role of an interesting character that suits the experience.
The experiences are designed like linear story-based gameplay and are also similar to theme park experiences and rides. Game-like role-playing and storytelling is brought to life using a mix of custom hardware and large screens. There is strong use of interactive menus on physical displays, pre-recorded voice and video, and immersive displays that make you feel like you are on-site, living the lives of the people in the story.
These experiences are likely to have multi-million-dollar budgets.
Design Breakdown: Ideum: Penguin Chill Habitat At The Albuquerque BioPark
Source: https://ideum.com/portfolio/penguin-chill
Ideum designed and built exhibitory that complements the experience of watching live penguins by adding a vivid narrative in which visitors assume the role of researchers bound for the Antarctic.
I’d describe it as a high-budget game or experience with multiple hardware/technology working together. The experience flows like linear gameplay or a theme park ride.
After some critical analysis of a video of the experience online, here’s what I could glean:
- A clear narrative with role-playing: Ideum chose the narrative for this experience to take the player through a guided role-playing experience. Visitors take the role of researchers on an expedition on the icy shores of Tierra del Fuego.
- Setting the context: The experience starts with an interactive menu with videos that introduce you to researchers.
- AI NPC interactions: The exhibit lets you “make virtual video calls to real-life contributing scientists to learn more about their research”. Basically, leveraging human-like AI interactions to support storytelling.
- Recreating realistic equipment: The science stations are built to look like a shipboard command centre.
- Using real, live data: The installation includes several displays with real live scientific data.
- Multi-player interactions: One part of the experience has a large flat screen with an interactive (Touch-based) game that supports multiple players at the same time.
- Real artefacts: They mixed in authentic displays as well as materials collected from the site.
- Immersive “wow” moments: Ideum transformed this area into the observation deck of an offshore research ship. They give you a view from the bridge of the ship through multiple screens to make it feel like you’re looking out at the real ocean. Another installation shows the view from inside a ship through a fake porthole.
- Fun motion-based game for kids: The player moves their body to steer penguins through obstacles in the search for food.
- Multiple languages: Localization is important for any commercial interactive experience, and has always been a core part of museum experiences.
- Selfie station: Perhaps most well-used by apps like Instagram and Snapchat, but superimposing digital images on or around users is a common use of AR.
- Immersive environmental art: Rockwork, rustic architectural finishes, and a dynamic ceiling light sculpture that uses near real-time NOAA solar wind data to simulate the Southern Lights (aurora australis), all work together to create an immersive sense of place.
Since the opening, Penguin Chill has been a tremendous success; in fact, a recent KRQE news report notes that attendance at the BioPark increased by 80% over the same period last year. The Albuquerque Journal recently reported that this exhibition has helped make the BioPark "the state’s No. 1 most popular attraction."
In a way, this is an AR experience, but it makes use of multiple devices and physical installations for “augmenting reality” and providing immersion rather than a mobile device.
This also reminded me of how The Mandalorian was filmed using real-time visualization techniques in Unreal Engine 4.
Step 2: Using What The SDK Can Offer As A Guideline
It’s a commonly stated and accepted fact that constraints help with creativity. Looking through Unity’s AR Foundation documentation and demos helped me understand what was achievable on current AR platforms. But beyond that, it also gave me another lens to look at my problems through. After understanding each feature available, it was easy to extend each feature into a potential piece of content that could be added to our AR design.
Unity AR Foundation: https://unity.com/unity/features/arfoundation
AR Foundation Samples: https://github.com/Unity-Technologies/arfoundation-samples/
(Advanced) AR Foundation Demos: https://github.com/Unity-Technologies/arfoundation-demos
Documentation: https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.1/manual/index.html
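Before relying on any of these features, a session availability check is the usual first step. Here’s a minimal sketch of that pattern, following Unity’s AR Foundation 4.x documentation (the component names come from the package; the script name and field wiring are illustrative, not from our project):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ARSupportCheck : MonoBehaviour
{
    [SerializeField] ARSession session;   // assigned in the Inspector

    IEnumerator Start()
    {
        if (ARSession.state == ARSessionState.None ||
            ARSession.state == ARSessionState.CheckingAvailability)
        {
            // Ask the platform (ARKit on iPad) whether AR is available.
            yield return ARSession.CheckAvailability();
        }

        if (ARSession.state == ARSessionState.Unsupported)
        {
            Debug.Log("AR is not supported on this device.");
        }
        else
        {
            // Start the session; managers like plane detection can then run.
            session.enabled = true;
        }
    }
}
```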
Step 3: Historical Knowledge and Artefacts of 1941 Lochee: A 3D Environment Recreation In Unity
We had 3 major sources of historical knowledge and artefacts for our project.
1. Photos for 3D recreation in Unity
2. Information and stories from Historians
3. Stories of life at that time and place from our partner
The Outcome: My Framework For Immersive Outdoors AR Experiences
How I Applied This Framework To Bring 1941 Lochee To Life
Visualization:
This part is straightforward and is basically how a lot of game development starts.
- 3D environment recreation of 1941 Lochee, Dundee by our 3D Artist based on authentic photos and descriptions from historians.
- Imported and implemented in Unity for full-scale AR use.
- Optimized for performance on the target platform.
Movement:
The next step is to make sure players can navigate the environment. Our design was for outdoors AR, and I used Unity AR Foundation to make this happen.
The player can load up the game and then walk around physically to navigate the now full-scale streets of 1941 Lochee.
You may need some secondary method of movement as well – for debug purposes, we had a button that moves the player forward in the direction they’re facing.
Since we’re navigating virtual streets with large houses, we guide the player with visual and audio cues, and we stop them from walking into walls: when they get too close to any wall, the screen fades to black with an animated instruction to move their device back.
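Both movement helpers can be sketched roughly as below. This is a minimal illustration under my own assumptions about the scene setup (an ARSessionOrigin rig, wall colliders on a dedicated layer, and a full-screen CanvasGroup for the black fade); the names and tuning values are placeholders, not the project’s actual code:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class DebugMovement : MonoBehaviour
{
    [SerializeField] ARSessionOrigin sessionOrigin;
    [SerializeField] Camera arCamera;
    [SerializeField] CanvasGroup fadeOverlay;   // black screen + "step back" prompt
    [SerializeField] float stepSize = 0.5f;
    [SerializeField] float wallWarnDistance = 0.75f;

    // Hooked up to a debug UI button: nudges the rig (and therefore the
    // player's viewpoint) forward along the camera's heading, ignoring pitch.
    public void StepForward()
    {
        Vector3 flatForward = arCamera.transform.forward;
        flatForward.y = 0f;
        sessionOrigin.transform.position += flatForward.normalized * stepSize;
    }

    void Update()
    {
        // Fade to black when the camera gets too close to any virtual wall.
        bool nearWall = Physics.CheckSphere(arCamera.transform.position,
                                            wallWarnDistance,
                                            LayerMask.GetMask("VirtualWalls"));
        fadeOverlay.alpha = Mathf.MoveTowards(fadeOverlay.alpha,
                                              nearWall ? 1f : 0f,
                                              Time.deltaTime * 2f);
    }
}
```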
Signposting And Mark-up:
This is essentially Level 1 of Augmented Reality design. Based on our source material, we had ideas for objects that would be interesting to interact with and contribute to the story. I developed a simple interactive item script and an AR-friendly UI system.
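A simple interactive item script can take a shape like the following sketch: tap an object that has a collider, and it triggers its story content. This is an assumed minimal version (the class and field names are illustrative, and the real app would open an AR-friendly UI panel rather than log text):

```csharp
using UnityEngine;

// Attached to any tappable story object in the scene.
public class InteractiveItem : MonoBehaviour
{
    [SerializeField] string itemName;
    [TextArea] [SerializeField] string storyText;

    public void Interact()
    {
        // Placeholder: the real app would show an AR-friendly UI panel here.
        Debug.Log($"{itemName}: {storyText}");
    }
}

// Lives on a single manager object and routes taps to items.
public class TapInteractor : MonoBehaviour
{
    [SerializeField] Camera arCamera;

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Cast a ray from the tap into the virtual scene.
        Ray ray = arCamera.ScreenPointToRay(touch.position);
        if (Physics.Raycast(ray, out RaycastHit hit, 10f) &&
            hit.collider.TryGetComponent(out InteractiveItem item))
        {
            item.Interact();
        }
    }
}
```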
Storytelling And Roleplaying:
This is where the experience starts to become a “game”. Our design is centred around allowing the player to play as a child who has just arrived in Tipperary from Clydebank due to frequent bombings during the war. Other interactions are based on this central idea.
Creation And Persistence:
Creation and persistence are great ways to exploit the AR platform and draw the player further into the game world. I did this by allowing the player to play a hop-scotch game called “boxies” by using the touchscreen to draw on the ground. Whatever the player draws is anchored in place, and does not move with the device. It will stay there as long as the application is active, and is visible again if the player looks back at the place where they drew.
This of course happens when prompted by wee Robert’s cousins – keeping with the theme of storytelling and roleplaying!
Here’s a little tutorial to help you explore this on your own: Unity AR Foundation - AR Draw With AR Anchor Manager by Dilmer Valecillos https://www.youtube.com/watch?v=kcqcUxVQu0o
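The core loop behind this kind of anchored drawing looks roughly like the sketch below, assuming AR Foundation 4.x and a small prefab for each chalk mark (the names are illustrative). Each mark is raycast onto a detected ground plane and then world-locked with an ARAnchor component so it stays put as the device moves:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class AnchoredDrawing : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] GameObject chalkMarkPrefab;  // e.g. a small quad or line segment

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);

        // Raycast the touch against detected real-world planes (the ground).
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose pose = hits[0].pose;
            GameObject mark = Instantiate(chalkMarkPrefab, pose.position, pose.rotation);
            // Adding an ARAnchor world-locks the mark so it doesn't drift
            // with the device; it stays visible when the player looks back.
            mark.AddComponent<ARAnchor>();
        }
    }
}
```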
Onboarding:
In games, onboarding is done in the form of tutorials. A proper tutorial is even more essential when you’re designing for an unfamiliar and novel platform, like AR on the iPad in our case.
Our experience goes a step beyond familiar AR experiences, which often simply require you to point your phone camera at a flat surface. This demo re-creates a street and is therefore meant to be used outdoors on a street, ideally at an intersection.
To set up for the intended experience, the player needs to:
- Find a street and stand in the middle, preferably at an intersection
- Point the camera forwards looking down the street
- Use the UI to place the left-side row of virtual buildings overlaying the footpath to their left
- Adjust it
- Use the UI to place the right-side row of virtual buildings overlaying the footpath to their right
- Adjust this one
And finally overlay a scan of the photo used to re-create this street over their own street and see the magic happen!
We use a bespoke step-by-step UI for our tutorial to help the player do this.
AR plane detection is key to how we help the player set up the scene. I found this helpful for understanding how to implement this: How to Disable and Enable AR Plane Detection With Unity3d and AR Foundation? By Dilmer Valecillos https://www.youtube.com/watch?v=CHmv0XObIcc
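The toggling pattern from that tutorial boils down to something like this sketch (assuming AR Foundation 4.x; the method names and wiring here are illustrative). Once both building rows are placed, detection is switched off and the plane visualisations are hidden so they don’t break immersion:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlaneDetectionToggle : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;

    // Called once the player has placed both rows of virtual buildings.
    public void DisablePlaneDetection()
    {
        planeManager.enabled = false;                 // stop scanning for new planes
        foreach (ARPlane plane in planeManager.trackables)
            plane.gameObject.SetActive(false);        // hide existing plane meshes
    }

    public void EnablePlaneDetection()
    {
        planeManager.enabled = true;
        foreach (ARPlane plane in planeManager.trackables)
            plane.gameObject.SetActive(true);
    }
}
```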
Adapting To The Real World:
This step is mostly fulfilled by what I described above – our experience overlays virtual rows of houses onto the sides of your own real-life street.
Blending With The Real World:
AR should mix real world and game world elements. One way this is often done is in the form of a portal you step into – but the context of the experience is important, and we weren’t working on something fantasy or sci-fi.
I believe it’s more important to show that the virtual world is reactive to your presence, which helps give that feeling of being in AR.
I decided to use a car for our own game. A car drives towards you, honks, and then swerves to avoid you. Again, our goal here is to make the player feel the mix of the virtual and the real and feel immersed in the experience.
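The car’s behaviour can be sketched as a simple steering script like the one below. This is an assumed minimal version (class name, angle, and speeds are placeholders): drive at the player, honk inside a trigger distance, then veer aside:

```csharp
using UnityEngine;

public class SwervingCar : MonoBehaviour
{
    [SerializeField] Transform player;        // the AR camera's transform
    [SerializeField] AudioSource hornAudio;   // 3D spatial horn sound
    [SerializeField] float speed = 4f;
    [SerializeField] float swerveDistance = 6f;

    bool swerving;
    Vector3 heading;

    void Start()
    {
        // Initial heading: straight at the player.
        heading = (player.position - transform.position).normalized;
    }

    void Update()
    {
        if (!swerving &&
            Vector3.Distance(transform.position, player.position) < swerveDistance)
        {
            swerving = true;
            hornAudio.Play();
            // Veer off at an angle instead of hitting the player.
            heading = Quaternion.AngleAxis(35f, Vector3.up) * heading;
        }

        heading.y = 0f;
        transform.position += heading.normalized * speed * Time.deltaTime;
        transform.rotation = Quaternion.LookRotation(heading);
    }
}
```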
Leveraging Existing Artefacts:
Besides using photos to re-create the environment as well as re-creating object-based interactions from that time (Such as a radio and some food items), I took this a step further by overlaying the photograph we used to re-create our main street and giving the illusion of this photo coming to life in AR. To keep this simple, I just did this through a UI image and transparency.
Audio For AR:
Audio for AR tends to get overlooked, and I wanted to make sure we were using it to an extent at least. A few ways we did this:
- Voice-acting for the player character, his aunt, and his cousins.
- Using Audio to direct attention, provide guidance, and ensure proper pacing and progression.
- Assisting with “blending” by allowing the player to hear the car’s engine and horn, with 3D spatial sound.
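Getting 3D spatial sound like the car’s engine and horn is mostly a matter of configuring Unity’s AudioSource. A minimal sketch (the distance values are placeholders, not our project’s tuning):

```csharp
using UnityEngine;

// Attach to the car (or any sound-emitting object) alongside an AudioSource.
public class SpatialAudioSetup : MonoBehaviour
{
    void Awake()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.spatialBlend = 1f;                          // fully 3D, positional
        source.rolloffMode = AudioRolloffMode.Logarithmic; // natural distance falloff
        source.minDistance = 1f;                           // full volume within 1 m
        source.maxDistance = 25f;                          // fades out by ~25 m
        source.dopplerLevel = 1f;                          // pitch shift as the car passes
    }
}
```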
Transmedia Elements:
Besides audio, I explored the idea of adding photographs or videos in an AR experience. Matthew Bett at Abertay introduced us to a process he developed that allows us to have low-cost human videos in AR rather than having to go through the expensive and time-consuming process of modelling and animating human characters. The process involves filming and then using video editing to cut out the background, then rendering this video onto a plane in Unity. I just used an available placeholder clip for testing this as we couldn’t film anything due to COVID restrictions.
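Rendering a video clip onto a plane in Unity follows a standard VideoPlayer setup, sketched below. I’m assuming a quad in the scene and pre-keyed footage with the background already removed (the component wiring is illustrative, not Matthew Bett’s actual pipeline):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Attach to a quad facing the player; the clip plays onto the quad's material.
public class VideoCharacter : MonoBehaviour
{
    [SerializeField] VideoClip characterClip;   // pre-keyed footage, background cut out

    void Start()
    {
        var player = gameObject.AddComponent<VideoPlayer>();
        player.clip = characterClip;
        player.renderMode = VideoRenderMode.MaterialOverride;
        player.targetMaterialRenderer = GetComponent<Renderer>();
        player.targetMaterialProperty = "_MainTex";
        player.isLooping = true;
        player.Play();
    }
}
```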
AR Photography:
Photography is a popular use of AR and was even leveraged by the Ideum exhibition I described earlier. Unity’s AR occlusion enables foreground objects in the camera feed to be drawn on top of the AR elements, rather than having the AR elements drawn over them. This technology is based on computer vision and is still experimental, so it doesn’t perfectly cut out foreground objects.
The result is that you can take a photo of someone standing in 1941 Lochee streets – even if imperfectly!
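Enabling this in AR Foundation 4.x comes down to requesting human segmentation on the AROcclusionManager, roughly as sketched here (a minimal version; the script name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Attach alongside the AROcclusionManager on the AR camera.
public class HumanOcclusionSetup : MonoBehaviour
{
    void Start()
    {
        // Requesting human depth and stencil lets people in the camera feed
        // occlude the virtual buildings instead of being drawn over.
        var occlusion = GetComponent<AROcclusionManager>();
        occlusion.requestedHumanDepthMode = HumanSegmentationDepthMode.Best;
        occlusion.requestedHumanStencilMode = HumanSegmentationStencilMode.Best;
    }
}
```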
Our Results
The outcome of this R&D project is an experimental prototype outdoors AR application that emulates the kind of role-playing and story-telling experience provided by state-of-the-art digital museum experiences. The app places the user in the shoes of a young boy arriving in Lochee, Dundee in 1941. The streets of Lochee appear on your iPad, overlaid on the real street in front of you using AR technology.
Following our framework allows us to get close to emulating the kind of interactive storytelling experience we studied, for a fraction of the development time and cost.
Going back to the goals stated at the start of this article:
1. We recreated 1941 Tipperary, Lochee, Dundee
2. We’re telling fictional stories based on authentic historic material and artefacts
3. We produced a working AR technical demo for the iPad 2020
4. This experience allows the player to play the role of a fictional character in the chosen setting
5. We’re providing an experience close to that of state-of-the-art digital museum experiences
6. We’re immersing players in this recreation of the chosen setting
7. We explored how to make outdoors AR work
8. We explored Unity’s AR Foundation and applied the features best suited to our need
Discussion And Future Work
There are several challenges when it comes to developing digital 3D experiences that are meant to replace in-person physical experiences. A few that come to mind are:
- A lack of real face-to-face communication
- A learning curve and usage difficulty
- Development still requires skills that are not easily available, and is costly
- Simulating complex tasks is challenging, such as those involving complicated machines or realistic human expressions
The application I developed tries to address some issues – like improving the learning curve and UX through a step-by-step tutorial and simple UI and interactions.
We can apply our initial methodology – combining our breakdown of the target experience we’re trying to emulate with available AR features and existing artefacts – to add a few more steps to the framework. These steps would hypothetically make our experience even stronger and more immersive and get it very close to the target experience we wanted to emulate. These are more advanced steps and would take a significant amount of time, skills, and investment to achieve, which is why we didn’t focus on them in our work. They could be a basis for future research:
1. Integration with social media platforms and the internet
2. Networking – Interaction with other players and co-op experiences
3. Integration with physical IoT devices or beacons
4. Multiple I/O devices – Physical panels, projections, screens
I’ve aimed to keep the framework specific to AR design for outdoors experiences, but also not specific to our museum-storytelling use case. This should allow for easy applicability to future applications as well, but the only way to confirm this is to apply it.
I also couldn’t get to more interesting and complex ways of blending the real and virtual using features like plane detection and geometry mapping and then combining them with gameplay, physics, and virtual meshes.
Immediate And Future Impact – A New Venture For TPLD And Growth For The Dundee Cluster
The challenge-holder for this R&D project, Dundee-based business TPLD, intended to use this application as a proof of concept to find interested commercial partners. They will also fund a local development team to develop the application further.
They have formed a new commercial venture, Floyen, influenced by the output of this project. Floyen’s goal is to secure funding as well as to adapt and develop their new process and IP for new clients.
So for us – this framework did exactly what we needed it to! I believe there’s a ton of room for innovation in the large-scale, outdoors AR space and it’s a very exciting area to work on.
If the AR Design Framework helped you think about how you could apply AR for your own applications, please let me know. I’d be happy to see it put to further use.
I’m working on an academic paper about this project as well with more details, so if you’d like to see that when it’s done, please feel free to reach out to me.