I’ve recently completed a Udacity Virtual Reality Foundations Nanodegree and learned a lot about building and optimising applications for mobile VR, so I decided to enrol in the High Immersion course to get more hands-on practice and start building experiences using both rotational and positional tracking for devices like the Oculus Rift and HTC Vive.
The first project of the course was related to VR design and consisted of exploring all the tasks involved in the creation of a complete application, including:
- sketching the experience
- creating Unity scenes and deploying to a mobile device
- user testing
- iterating using the feedback received during the previous step
Before starting to code with Unity, I needed to identify the final goal of the game through a statement of purpose:
Puzzler is a Virtual Reality mobile application for new VR users which challenges them to solve a familiar type of puzzle in a new way.
With that in mind, I identified a persona to keep as a reference while developing the game:
Gilbert is an accountant. He is married, has two children and likes relaxing by playing video games and using a Virtual Reality device. Usually, he arrives home around 8 p.m. and enjoys time with the family until the kids go to bed. Then, Gilbert and his wife watch some TV and chat about what happened during the day. In his free time at the weekends, he likes to take his bike for a quick ride. His favourite quote is “don’t waste time”. He is new to Virtual Reality and prefers mobile devices (ref. photo from https://randomuser.me).
After describing the game persona, I was ready to start working on the initial sketches.
The game environment is a mysterious castle which the user can explore in VR:
After selecting the Start button, the user is transported into the castle to play the game:
I took the environment already provided with the project as a starting point, but I realised it was missing some details that would make it more compelling and exciting to explore in VR, so I added some extra buildings and castles in the sketches:
Great! I was then ready to start putting together all the pieces and build the Unity project.
Identifying the right scale
I fired up Unity and imported the project assets: the first task was to determine the correct scale for the objects.
I deployed the scene to my Oculus Go, compared the in-game door with my apartment door, and adjusted the scale until I reached the desired result:
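The idea behind the scale check can be sketched as a simple calculation. This is a hypothetical illustration in Python (the actual project is C# in Unity, and the 2 m door height and function name are assumptions, not values from the project): measure the imported model against a real-world reference and scale the whole environment uniformly.

```python
# Assumption: a typical apartment door is roughly 2 metres tall.
REAL_DOOR_HEIGHT_M = 2.0

def uniform_scale_factor(model_door_height_m):
    """Hypothetical helper: factor to apply uniformly to the environment
    so that doors (and everything else) feel life-sized in the headset."""
    return REAL_DOOR_HEIGHT_M / model_door_height_m

# If the imported castle door measures 2.5 m, shrink everything by 0.8.
print(uniform_scale_factor(2.5))
```

Scaling the whole environment by one factor keeps the proportions between objects intact, which matters in VR because the user judges size against their own body.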
Then I proceeded to create a first version of the internal game environment:
After this, it was time to start some user testing to check if I was on the right track! First of all, I needed to identify if:
- the scale was appropriate
- the dimensions of the room were adequate
After two users tested the app with the VR device, it was clear that the scale worked quite well, but the room size needed to be increased. I modified the castle and started adding some lights and the game orbs:
Great! I performed a new user testing session, and the dimensions and lights received positive feedback.
The users were also able to describe the mood of the environment as that of an old castle full of mystery and legendary items, so I decided to move on to the next step: UI creation.
At this point, I needed to create two panels allowing the users to start and restart the game, and came up with the following:
Before proceeding further, I needed some additional user testing to verify I was on the right track, and immediately realised some modifications were needed. The users confirmed that:
- the text was legible
- the panels were too big
- the black colour didn’t fit well with the environment
I performed some changes and ended up with a new design for the UI:
Then I modified the location of the panels and put them in the start/end positions:
At this stage, I added game movement using ground raycasting to transport the user from the start position to the play area: these transitions can cause simulator sickness in some users, so I needed to test whether the movement speed was adequate.
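The movement itself boils down to advancing the camera rig towards the raycast hit point at a constant, configurable speed, so the pace can be tuned down if testers report disorientation. Here is a minimal platform-agnostic sketch in Python (the real project uses C# in Unity; the function name, 2 m/s speed and 60 fps frame rate are illustrative assumptions):

```python
def step_towards(position, target, speed, dt):
    """Advance `position` towards `target` by at most `speed * dt` metres
    this frame; snap to the target once it is within a single step."""
    delta = [t - p for p, t in zip(position, target)]
    dist = sum(c * c for c in delta) ** 0.5
    if dist <= speed * dt:
        return list(target)          # arrived: snap to the destination
    scale = speed * dt / dist
    return [p + c * scale for p, c in zip(position, delta)]

# Simulate frames until the user reaches the play area 10 m away.
pos, target = [0.0, 0.0, 0.0], [0.0, 0.0, 10.0]
frames = 0
while pos != target:
    pos = step_towards(pos, target, speed=2.0, dt=1 / 60)
    frames += 1
print(frames)   # roughly 300 frames: 10 m at 2 m/s and 60 fps
```

Keeping the speed constant (no acceleration) is one common way to reduce the vestibular mismatch that triggers simulator sickness; lowering the `speed` value is then a single-parameter fix when testers complain.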
After some personal and user testing I noted that:
- it was clear how to activate the start button using gaze
- the speed of movement was too high, causing some disorientation
I modified the game to slow down the pace, then continued by adding the final game mechanics for selecting the orbs and playing the game, and adjusted the external environment with mountains and a bigger castle (using assets from the Mega Fantasy Props Pack):
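Gaze activation, as used for the start button, is typically implemented as a "fuse" timer: the button fires only after the user's gaze has rested on it for a dwell time, which avoids accidental activations on headsets without controllers. A hypothetical sketch of that logic in Python (the project itself is C# in Unity; the class name and one-second dwell time are my assumptions):

```python
class GazeButton:
    """Hypothetical gaze-activated button with a dwell ("fuse") timer."""

    def __init__(self, dwell_time=1.0):
        self.dwell_time = dwell_time   # seconds of sustained gaze required
        self.gaze_timer = 0.0
        self.activated = False

    def update(self, is_gazed_at, dt):
        """Call once per frame with the gaze-raycast result and delta time."""
        if self.activated:
            return
        if is_gazed_at:
            self.gaze_timer += dt
            if self.gaze_timer >= self.dwell_time:
                self.activated = True
        else:
            self.gaze_timer = 0.0      # looking away resets the fuse

start = GazeButton(dwell_time=1.0)
for _ in range(90):                    # 1.5 s of sustained gaze at 60 fps
    start.update(True, 1 / 60)
print(start.activated)                 # True: the fuse completed
```

Resetting the timer the moment the gaze leaves the button is what makes a quick glance harmless while a deliberate stare still triggers the action.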
Breakdown of the project
At this point, the game functionalities were completed, and I was ready to test the experience end-to-end. After deploying and starting the application on my Oculus Go, the initial UI loaded correctly and, after gazing at the start button, I was transported to the mysterious play area:
Puzzler played a sequence of 5 orbs that I had to memorise and repeat correctly to win the game:
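The core rule of the game is easy to state: the puzzle shows a random sequence of orbs, and the player wins only by repeating it exactly, in order. A minimal Python sketch of that logic (the actual implementation is C# in Unity; the function names, orb count and use of a seeded generator are illustrative assumptions):

```python
import random

def new_sequence(num_orbs=5, sequence_length=5, rng=random):
    """Pick which of the orbs lights up at each step of the puzzle."""
    return [rng.randrange(num_orbs) for _ in range(sequence_length)]

def check_attempt(sequence, attempt):
    """The player wins only if every selection matches, in order."""
    return attempt == sequence

seq = new_sequence(rng=random.Random(42))   # seeded for a repeatable example
wrong = seq[:-1] + [(seq[-1] + 1) % 5]      # last selection is off by one

print(check_attempt(seq, list(seq)))        # True: repeating it exactly wins
print(check_attempt(seq, wrong))            # False: one wrong orb loses
```

In the actual game, each entry in the sequence would map to an orb GameObject that lights up and plays a sound, while the player's gaze-selected orbs build up the `attempt` list.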
And pressing the restart button allowed me to restart the game from the beginning.
I enjoyed working on Puzzler: the main focus of this project was applying a design workflow to a Virtual Reality application, and the importance of user testing for these types of apps became clear to me, as they present many more challenges than “traditional” user interfaces. In particular, simulator sickness and performance optimisation are probably the most challenging aspects of achieving high-quality VR apps.
I have published this project on GitHub to start collecting additional feedback and to add extra games/environments in the future: any input on this work would be very much appreciated.
Links to additional work
A maze — A VR application using Unity and the Google VR SDK where the user traverses a maze environment using 2D and 3D UI, waypoint based navigation, procedural animation, interactive objects, spatial audio, particle effects and persistent storage of session data. https://github.com/davidezordan/A-Maze
Build an Apartment — Explore a modern apartment in VR with high-performance lighting, custom materials, and animation using Unity. https://github.com/davidezordan/Build-an-apartment
Analysing visual content using HoloLens, Computer Vision APIs, Unity and the Mixed Reality Toolkit — A GitHub project for exploring the environment using a mixed reality device and computer vision APIs. https://www.davidezordan.net/blog/?p=8234