The Phone Box Project

Overview

The Phone Box project is an experimental virtual reality experience that aims to show how mental health issues can leave people feeling isolated and trapped, using visual techniques that capitalise on the immersive capabilities of modern VR devices.

We decided to explore virtual reality because it is an emerging technology that provides a sense of scale and place you cannot achieve with a monitor. Similarly, we wanted a tactile method of interaction that felt natural and required very little in the way of a learning curve.

We tackled creating this experience in stages, split across the length of the project.

Project Management

To help us manage the project, we opted to use several tools to document and communicate efficiently.

We have been using Telegram to maintain constant communication with each other and share work and ideas. It has proven essential to coordinating work and helping us to deliver on time.

I set up Confluence early in the project so that we could have a dynamic design document that functions like a wiki. This has let us plan out the myriad of different elements in one place that everyone can easily navigate, and keep track of changes.

For source control, we opted to use GitHub. This has meant we have been able to rapidly share our work and roll back to earlier versions when parts of the code have broken.

Concept

When we were first looking at ideas for the direction our project would take, we very quickly settled on the idea of a phone box. At this point we were considering creating a physical box that would be the interaction space for players. We considered creating an escape room style experience within this physical space, as it would limit the player to their immediate surroundings and could be highly curated.

There were problems with these early concepts, such as the construction and storage of a physical box on the scale of an average phone box. Though we very much wanted to keep the phone box as a play space, we started to think about how this could be achieved virtually.

It was at this point we started thinking about VR as a possible solution. We could achieve the sensation of being inside a box without the problems of storage or the cost of building and adapting a physical container. Room-scale VR also fitted well with the concept of an enclosed space of fixed size, and we could designate floor space for players to move freely within the virtual environment.

None of us had attempted a VR project, though a couple of us had experimented briefly with the technology. We very much wanted to explore the medium and its capabilities, so settled on it as the method of delivering our experience.

Once we had settled on how we intended to deliver the experience, we decided to build a narrative around mental health and the concept of reaching out to people on the phone. We would use VR's innate sense of space and scale to trigger the feeling of being trapped in a claustrophobic space, generating a duality between the mental health crisis the player character is experiencing and the player's own perception of the scene.

Prototyping/VR Exploration

Once we had settled on the idea, we had to figure out how we would actually handle the project logistically. I had only touched on VR briefly before, so I started looking at how we would drive this experience. There were several areas to consider:

  • The target VR hardware/device
  • The engine we would use to create the experience
  • The framework/backend API for virtual reality
  • How to harness VR to deliver the visuals we want to achieve
  • Player interaction/interfacing
  • How these could all be brought together

For hardware we decided to work with the Oculus Quest 2. One of the primary reasons behind this decision was that it requires no external sensors or hardware beyond the headset itself. It is self-contained and highly portable, with the headset operating wirelessly alongside its two controllers. It also has a relatively streamlined pipeline for building to the device and is highly compatible with most game engines, as Facebook, who produce the headset, develop for it continuously.

Quest 2 Hardware

In terms of game engine, I explored both Unity and Unreal, as both have working VR/XR solutions implemented out of the box. I spent a while reading the documentation and experimenting with custom-built VR controllers, but quickly realised that there are a lot of technical nuances that could significantly impact the development time of the project. Creating the interaction systems needed to drive the experience we were aiming for could have eaten up the entire length of the project, with no guarantee that our work would be in a playable state when it came time for the project to be submitted.

The tight development schedule meant we had limited time to solve the VR implementation problem, so at this point I started looking at third-party frameworks. Both Unity and Unreal have custom VR frameworks created by other developers that build upon the functionality built into the engine, expanding its capabilities and toolkit. Both engines had notable frameworks. Unreal has the VR Expansion Project, a plugin that solves a myriad of problems, such as locomotion, camera occlusion when the VR camera moves into a surface, and player collision tracking. Unity had two similar options of note: VRTK and Hurricane VR (HVR). I experimented with these in both engines.

HVR Framework

All three implementations have merit; however, in the end the decision was made collectively to go with Unity and Hurricane VR. The reasoning behind the decision was that the team was already familiar with Unity, the Android build process for Unreal is an obscure nightmare of broken file dependencies, HVR had working implementations of keypads that could be adapted into the buttons for our phone system, and Unity is much easier to use with source control.

With our engine and VR framework decided, we started to look at the components we would need to build out the experience we wanted to achieve. The core element driving player interactions came included with HVR in the form of a pre-configured locomotion system, complete with animated hands and scripts to handle picking up objects.

The central VR interaction is the player picking up the phone and dialling numbers, so this had to be robust and intuitive enough that players would fundamentally understand the action with little to no instruction. By placing HVR grab points on the phone handset we were able to immediately start testing the action of picking up and holding the phone. The handset uses positional audio to play sound, similar to a real-world device.

Prototype Phone Interactions

The dial pad uses HVR button components, which allow us to assign Unity events to button presses. We use this to concatenate the pressed digits into a string, which is then compared against the phone numbers we want to trigger events. If the dialled string matches one in the list, an event is triggered; in this case we use it to trigger our dialogue system.
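The matching logic itself is simple. The real version is a Unity C# script wired to HVR button events, but the idea can be sketched like this (class and number strings here are illustrative, not from the actual project):

```python
# Sketch of the dial-pad logic: each button press appends a digit and
# the running string is checked against the registered phone numbers.
class DialPad:
    def __init__(self, numbers_to_events):
        # Map of dialable number strings to event callbacks.
        self.numbers_to_events = numbers_to_events
        self.dialled = ""

    def press(self, digit):
        """Called once per button press; fires the event when the
        dialled string matches a registered number."""
        self.dialled += digit
        if self.dialled in self.numbers_to_events:
            event = self.numbers_to_events[self.dialled]
            self.dialled = ""  # reset, ready for the next call
            event()            # e.g. start the dialogue system
            return True
        return False

# Usage: dialling a registered number triggers its event.
calls = []
pad = DialPad({"0771234": lambda: calls.append("first_call")})
for d in "0771234":
    pad.press(d)
```

In Unity the `press` equivalent would simply be the method assigned to each button's UnityEvent in the inspector.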

To handle the dialogue, we adapted a custom dialogue system created by one of our group so that it not only displays the dialogue text but also lays the foundations for triggering environmental changes, such as starting rain or controlling ambient effects.
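One way to picture how a dialogue line can carry environmental triggers alongside its text (a sketch only; the actual system is a bespoke Unity C# component, and these field names are my own):

```python
# Sketch: dialogue lines carry optional trigger names that are looked
# up in a table of environment handlers when the line is played.
from dataclasses import dataclass, field

@dataclass
class DialogueLine:
    speaker: str
    text: str
    triggers: list = field(default_factory=list)  # e.g. ["start_rain"]

def play_line(line, environment):
    """Build the display text, then fire any environmental triggers
    the line carries."""
    shown = f"{line.speaker}: {line.text}"
    for trigger in line.triggers:
        handler = environment.get(trigger)
        if handler:
            handler()
    return shown

effects = []
env = {"start_rain": lambda: effects.append("rain")}
out = play_line(
    DialogueLine("Caller", "It's been raining all week.", ["start_rain"]),
    env,
)
```

Keeping the triggers as data on the line means writers can attach effects to dialogue without touching code.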

The final system we needed to experiment with was a way of attaching the phone handset to the phone unit. It did not make sense to hard-model a cable running between the two, as the phone had to be free to move but also hang naturally. We did not want to use the Unity joint system, because we were building for VR and needed a performant solution. To solve this problem we used the third-party plugin Obi-Rope, which allows real-time simulation of ropes and cables. Importantly, it uses a particle-based solver to simulate the rope itself, allowing for better performance.

Obi-Rope

These separate elements, when brought together, gave us the foundation to start crafting the experience we wanted to make, while saving the development time that would have been required to create our own bespoke elements.

Creating Our Experience

Having solved the basic technical challenges of working in VR, we started to really think about the direction we wanted to take the experience in. We knew the player was a character in a phone box, reaching out to people, but we were unsure of the format that would take.

We started to think about the characters the player would be reaching out to, and that began to shape the story of the player character. We settled on the idea that the player character was the survivor of a car crash that killed his girlfriend. We had wanted to keep our main character gender-neutral so that anyone who played could project themselves into the role, but that proved difficult from a writing point of view, so we had to scrap the idea. We also researched mental health and depression so that we would approach the topic sensitively and with respect.

Pre-shader visual development test.

Visually there was a lot of discussion about whether we should go for stylised or realistic graphics. There were arguments for both, with the main argument for realism being that we didn't want to approach such a serious subject with a cute and colourful look. In the end, however, we decided to go with a cel-shaded style, because we felt that not attempting realism would make players more accepting of the environment and allow greater focus on the narrative we wanted to tell.

To achieve this cel-shaded look we used two sets of shaders: Quibli and Flat Kit. Though we could have written our own, we again decided that crafting bespoke shaders would not be worth the time investment, with no guarantee of achieving the outcome we were looking for. We did, however, create one shader that controls the rendering outside of a certain radius, because we wanted the player to immediately focus on, and understand the importance of, the phone box in a potentially cluttered scene.

Early visual test.

By creating a shader that let us control the lighting and rendering in all the areas that were not important, we could draw the player towards the phone box itself. It also helps us capitalise on the sense of place and scale inherent to VR, fostering the feeling of claustrophobia we wanted, so that the player feels as trapped as their character.
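The core of a radius-based focus effect like this is just a distance falloff. Sketched outside shader code (the real version lives in a Unity shader; the smoothstep blend and radii here are assumptions, not the project's actual values):

```python
# Sketch: how much "focus" (full lighting) a point receives based on
# its distance from the phone box.
def focus_factor(distance, inner_radius, outer_radius):
    """Returns 1.0 inside inner_radius (fully lit), 0.0 beyond
    outer_radius (darkened/flattened), with a smooth blend between."""
    if distance <= inner_radius:
        return 1.0
    if distance >= outer_radius:
        return 0.0
    # Smoothstep between the two radii for a soft edge.
    t = (outer_radius - distance) / (outer_radius - inner_radius)
    return t * t * (3.0 - 2.0 * t)
```

In the shader, this factor would scale the lit colour per fragment, so everything beyond the outer radius reads as unimportant at a glance.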

Final visual pass.

The music we are using was composed by my friend Sam McDonnell. We wanted a soundtrack that conveyed misery and anguish, and Sam produced two loopable tracks for us. The first is a lighter version of our theme; the second works a cello into it to drive home the feeling of grief more heavily. By switching between the two, we can control the musical side of the narrative.
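One common way to switch between two loopable stems without an audible jump is an equal-power crossfade. A sketch of the volume curves (illustrative only; the actual mixing would happen on Unity audio sources, and the project may simply cut between the tracks):

```python
# Sketch: equal-power crossfade between the lighter theme and the
# cello version. t=0.0 plays only the lighter theme, t=1.0 only the
# cello track.
import math

def crossfade_volumes(t):
    """Cosine/sine curves keep combined perceived loudness roughly
    constant throughout the fade."""
    light = math.cos(t * math.pi / 2)
    cello = math.sin(t * math.pi / 2)
    return light, cello
```

Because both tracks loop and share a theme, fading `t` over a bar or two lets the grief register shift under the dialogue without drawing attention to itself.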

Testing

This project has required constant testing and iteration. We had little in the way of VR experience, so have had to learn new best practices. Some of the tricks and tools we would readily apply to games on a monitor do not translate well to virtual spaces; for instance, taking control of the player camera and moving it even slightly can leave a person feeling very ill very quickly. We have had to continually test and devise new solutions to problems as they arose.

With the dialogue we ran into a whole collection of issues. Unity's screen-space UI does not work in VR, so the UI has to exist in world space. This led to problems, and discussions about where the UI should actually be placed, as the player could easily be facing the wrong way and not see the dialogue at all. We eventually placed it next to the phone unit, as that was the one place players consistently returned their gaze to. Likewise, figuring out where to display the number to dial was difficult; in the end we displayed it on the back of the hand, as that seemed most intuitive.

Dialogue system.

We also had to repeatedly test methods of locomotion. Two options were available to us: smooth and teleport. We initially included both, with players usually opting for smooth locomotion as it offered a greater degree of finesse. Leaving teleport in proved problematic, however, with players accidentally triggering it and suddenly changing location, which could be a very jarring experience. This necessitated its removal from the final build.

Some elements were incredibly robust early on, requiring only quality-of-life improvements. The phone unit and handset themselves worked almost immediately. We did have to make the buttons on the dial pad bigger, however, as players were hitting buttons they did not mean to press.

Thoughts & Conclusions

We set out to achieve a lot with this project, both in terms of the content and creating it in virtual reality. Though I think we met our technical goals, it came at the cost of where we wanted to be thematically.

Due to the technical hurdles we found ourselves having to overcome, especially in terms of correctly implementing a VR control solution that ‘felt right’, we had to cut a lot of content we wanted to include. We only managed to get one telephone call into the final build, when we had actually been aiming for three or four. Similarly, a lot of the environmental effects we had wanted to include we just didn’t have time for.

Visually I think we have achieved an experience that is striking and absolutely conveys the bleakness we wanted to conjure. The phone box draws players in and conveys its importance, with players immediately being able to locate it and understand how to use it with little to no instruction.

The project has required me to think outside my usual methods and find new solutions to problems I would normally know how to resolve, and I can definitely employ these new foundational skills in future VR work.
