
Designing the SPYSCAPE museum experience

Case study
COMPANY
SPYSCAPE
ROLE
UX
PLATFORM
Digital interactives
TEAM SIZE
50+
COMPANY
Bright Little Labs
ROLE
Project Manager
UX/UI
PLATFORM
Website
TEAM SIZE
5+

Overview

SPYSCAPE is a new interactive spy museum in New York where, through a series of immersive interactive challenges, visitors can find out which type of spy they could be.

The results of the challenges are saved on an RFID band which the visitor is handed at the beginning of the experience.
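The RFID-keyed storage of challenge results might be sketched as follows. This is a minimal illustration under assumptions: the real SPYSCAPE data model is not public, and the band UID, challenge names, and store used here are all hypothetical.

```python
# Hypothetical sketch: each visitor's RFID band UID indexes an in-memory
# store of per-challenge scores accumulated as they move through the museum.

results_store = {}  # band UID -> {challenge name: score}

def save_result(band_uid, challenge, score):
    """Record a challenge score against the visitor's RFID band."""
    results_store.setdefault(band_uid, {})[challenge] = score

# A visitor taps their band after completing the Deception Challenge:
save_result("04:A3:2F:1C", "deception", 0.82)
print(results_store["04:A3:2F:1C"])  # {'deception': 0.82}
```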

Goals

Create a series of digital interactives where the visitor completes spy-themed challenges.
Each challenge should measure specific visitor skills, which an algorithm compares at the end to calculate which spy role they match best.
Each interactive should balance educational and entertaining content.
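The role-matching step described above could work along these lines. This is an illustrative sketch only: the actual SPYSCAPE algorithm, role names, and skill dimensions are not public, so every name and number here is a hypothetical stand-in.

```python
# Hypothetical role profiles: each spy role is an ideal score (0-1) per
# skill measured by the challenges. The visitor is matched to the role
# whose profile is closest to their own scores (squared-distance metric).

SPY_ROLES = {
    "analyst":      {"observation": 0.9, "deception": 0.3, "risk": 0.2},
    "agent":        {"observation": 0.6, "deception": 0.8, "risk": 0.9},
    "cryptologist": {"observation": 0.8, "deception": 0.2, "risk": 0.1},
}

def best_match(visitor_scores):
    """Return the role whose ideal profile is closest to the visitor."""
    def distance(profile):
        return sum((profile[skill] - visitor_scores.get(skill, 0.0)) ** 2
                   for skill in profile)
    return min(SPY_ROLES, key=lambda role: distance(SPY_ROLES[role]))

print(best_match({"observation": 0.7, "deception": 0.9, "risk": 0.8}))
# -> "agent"
```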

THE CHALLENGE

One of the biggest challenges was location: the museum was based in New York, while the entire design team was in London.

This meant that during user testing we had to recreate the museum experience as faithfully as possible, to put the interactives in context and gather more accurate user feedback.

My Role

As part of the in-house creative team, I worked as an interactive UX designer, building the interactive challenges of the main museum exhibition from concept to production.

I worked closely with the lead UX designer to develop the experience of each interactive challenge, as well as how they worked together to create a cohesive experience.

My tasks included low- and high-fidelity prototyping, off-site and on-site user testing, user flows, wireframes, QA testing, and assisting with content filming and voice recording.

I was fortunate to be part of a wonderful team: a creative director, project managers, visual designers, developers, motion graphics designers, copywriters, content creators and sound designers, as well as hackers, ex-spies and many other administrative roles and contractors in London and New York.

Process
From the moment I joined I was intrinsically involved in all development processes: concept sketches, persona creation, team presentations, planning, user flows, wireframes, user testing, and QA of software and hardware.
We worked in an agile environment with weekly sprints and daily stand ups, which allowed us to efficiently track our progress and resolve issues that were blocking us.

SPY CHALLENGES

Six interactive challenges were created for the permanent museum exhibition. The theme for each one was set by the narratives of the galleries they were located in.
All digital interactives were developed following the same process. Below is a breakdown of one of them, the Deception Challenge.

Concept development

From very early sketches we moved on to a lo-fi paper prototype to present and discuss the gameplay with the wider team. Through this iterative process we designed the first version of the game and were confident enough to start creating flows and wireframes.

Prototyping

From an early stage it was important to test the gameplay as accurately as possible. For this reason, we quickly moved on from paper prototypes to click-through prototypes.
We created dummy content where necessary and used physical props to recreate the final physical conditions as closely as possible.

Production

User flows and wireframes

As the features of the digital challenges were defined, we created user flows. These also included technical information about the other elements of the experience (sound, lighting, etc.), which made it easy to understand what was happening at every single stage of the game.
Similarly, the wireframes were created with a high level of detail, containing not only the UI elements but also information about behaviour, duration, sound effects and lighting conditions, which helped guide the work of the visual and motion designers, developers and hardware engineers.
Section of the Deception Challenge user flow
Section of the Deception Challenge wireframes

User testing

Regular user testing sessions were held in our offices. The digital challenges were designed to educate, so it was key to know whether visitors would leave with the key learning points we had planned.
Each game also had an intended atmosphere. To test this more accurately, a carpenter was commissioned to build a wooden model of the interactive booth, which was then assembled using the same hardware that would be used in the museum. This contributed to both the user testing and the software and hardware QA.

FINAL Visual Design

Implementation

Documentation

Throughout the construction of the museum there were many parties involved (architects, AV suppliers, lighting and sound designers...). I put together requirement documents for the digital interactives and galleries, highlighting the content displayed at every stage. This was necessary for all parties involved to understand how the environment was taking shape.
As the visitor was at the heart of the project, these documents also included the user journey through the gallery and digital interactives, so everyone could understand what the visitor would see at every stage.

Content production

As part of the game, the user had to observe a person being interrogated and determine, through body-language cues, when they were lying. A film production company was hired to shoot the interrogation film. For the narration throughout the gameplay, we worked with various voice artists.

I closely assisted all content production to make sure that the requirements were being followed and that the correct tone of voice was being used each time.
All content was recorded in English, Spanish and Chinese. I was in charge of the Spanish translation of the game and led the recording sessions with the voice artists.

Hardware and software QA

Given the bespoke nature of the experience, rigorous testing of the hardware and software was key. From early stages of development I was involved in thoroughly testing the software and the hardware features.
For this interactive, the hardware included a one-way mirror, a pulse sensor and physical buttons. The software included a facial recognition app, lighting control and the main game.

ITERATE

User testing on location

In the last month prior to the museum opening, I travelled to New York with other members of the team to visit the museum.

While there, we organised user testing sessions with groups of 50+ testers. This gave us vital information about the performance of the interactive challenges. Fortunately, only small tweaks were required.

I also carried out thorough QA on each module of each interactive to test both hardware and software.

Post opening

Once the museum opened we were able to truly observe how visitors experienced the space and what the common issues were. During busy times, long queues formed to access the Deception Challenge, which affected the overall experience.

On one hand, the duration of the interactive was too long; on the other, the positioning of the 12 booths meant staff could not manage the queues efficiently. Time was allocated to propose solutions for both issues:
Improve queue management
Reduce length of challenge

Queue management app

The goal was to create a simple app where staff could quickly see which booths were free or about to be.
Method:
  • Understand the needs of the members of staff.
  • Produce wireframes to pass on to developers.
  • Deploy to the museum, test and iterate where necessary.
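The booth-status logic at the heart of such an app could be sketched as follows. This is an assumed design for illustration only: the real staff app, its data model, and the challenge's actual duration are not public.

```python
# Hypothetical sketch: each booth records when a session starts; given an
# assumed fixed session length, staff can see at a glance which of the 12
# Deception booths are free, busy, or about to free up.

SESSION_LENGTH = 300.0  # assumed challenge duration, in seconds

class Booth:
    def __init__(self, booth_id):
        self.booth_id = booth_id
        self.session_start = None  # None means the booth is free

    def start_session(self, now):
        self.session_start = now

    def status(self, now, soon=60.0):
        """Return 'free', 'busy', or 'ending soon' (<= `soon` s left)."""
        if self.session_start is None:
            return "free"
        remaining = SESSION_LENGTH - (now - self.session_start)
        if remaining <= 0:
            return "free"
        return "ending soon" if remaining <= soon else "busy"

booths = [Booth(i) for i in range(1, 13)]  # the 12 Deception booths
booths[0].start_session(now=0.0)
print(booths[0].status(now=250.0))  # 50s left -> "ending soon"
```

In a real deployment the booths would report session starts over the network and the app would poll or subscribe to those events; the status logic itself stays this simple.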

Reduction of game length

The goal was to analyse the current version of the game to see if its duration could be reduced without compromising the experience.
Method:
  • Calculate timings for each part of the interactive and propose reductions.
  • Produce video examples in After Effects to visualise those changes and present them to stakeholders for approval.
  • Produce new assets if required and communicate the changes to developers.
  • Deploy to the museum, test and iterate where necessary.
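The first step of the method above, the timing budget, can be reduced to simple arithmetic. The segment names and durations below are hypothetical stand-ins; the real timings of the Deception Challenge are not public.

```python
# Hypothetical timing budget: sum the per-segment durations of the
# interactive, apply the proposed cuts, and report the saving before
# producing video mock-ups of the shortened version.

current = {  # seconds per segment (illustrative values)
    "intro": 45, "tutorial": 60, "interrogation": 180,
    "questions": 90, "results": 45,
}
proposed_cuts = {"intro": 15, "tutorial": 20, "questions": 20}

current_total = sum(current.values())
new_total = current_total - sum(proposed_cuts.values())
print(f"current: {current_total}s, proposed: {new_total}s, "
      f"saved: {sum(proposed_cuts.values())}s")
```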

Conclusion

The extensive user testing really helped us understand most possible edge cases and work to fix them. However, as user testing happened outside the context of the museum, we knew there would be some issues we wouldn't be aware of until opening. The great level of communication and collaboration among all team members allowed a very complex project to be developed to a high standard in a short period of time.
