Testing in a VR world

Previously, I’ve written a couple of posts about how to get yourself started in VR, in which I promised some stories about testing in this world.

Why do I call it a world instead of an application? Because the goal of virtual reality is to create realistic synthetic worlds that we can inspect and interact with. There are different ways of interacting with these worlds depending on the type of device.

Testing an application in VR is similar to testing any other application, although we have to take into account some particularities of the environment. Later on we will look at different kinds of testing and think about the additional steps they require in VR, but first let’s review the characteristics of the different types of devices.

Types of devices:

Phone-based devices (Google Cardboard or Daydream) – allow you to insert your phone (or tablet) into the device so you can run a VR app on it.

This is possible because most smartphones nowadays come with a gyroscope: a sensor that measures the phone’s rotation and, together with the accelerometer, is used to determine its orientation.

Some Cardboards (or other plastic versions) have a button or a separate controller for triggering actions on the screen (as is the case for Daydream), but the click is usually not performed on the object itself. Instead, it is done anywhere on the screen while the gaze is fixed on the object. If the device has no button or clicker, the developer has to rely on other information for interaction, such as the gaze entering and exiting objects, or the length of time the user has been looking at an object.
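To make this concrete, here is a minimal sketch of such a dwell-based “gaze click” in Unity C#. The class name, the dwell time and the OnGazeSelect message are my own illustrative choices, not part of any particular SDK:

```csharp
using UnityEngine;

// Hypothetical gaze-dwell selector (names are mine, not from any SDK).
// Attach it to the VR camera: it casts a ray straight ahead and, when the
// same object stays under the gaze for `dwellSeconds`, it treats that as a click.
public class GazeDwellSelector : MonoBehaviour
{
    public float dwellSeconds = 2f;  // how long the user must keep looking at an object
    public float maxDistance = 10f;  // how far the gaze ray reaches

    private GameObject current;      // object currently under the gaze
    private float timer;             // how long it has been under the gaze

    void Update()
    {
        if (Physics.Raycast(transform.position, transform.forward,
                            out RaycastHit hit, maxDistance))
        {
            if (hit.collider.gameObject != current)
            {
                current = hit.collider.gameObject; // gaze entered a new object
                timer = 0f;
                return;
            }

            timer += Time.deltaTime;
            if (timer >= dwellSeconds)
            {
                // Treat the dwell as a click: notify the object, which can
                // implement an OnGazeSelect method to react.
                current.SendMessage("OnGazeSelect", SendMessageOptions.DontRequireReceiver);
                timer = 0f;
            }
        }
        else
        {
            current = null; // gaze exited all objects
            timer = 0f;
        }
    }
}
```

Any object with a collider and an OnGazeSelect method would then react to being looked at for a couple of seconds, without needing a button at all.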

Cardboard VR device – Picture credit mentatdgt

Computer-connected devices (HTC Vive, Oculus Rift, Samsung Gear VR…) – generally come with (at least) a headset and a handset, and have a high-resolution, low-persistence OLED display embedded in the headset, so you don’t need to connect it to an external screen. They detect further movement: not just the movement of the head, but also movement around the room and hand gestures. How this is done depends on the device itself.

We have moved from being able to detect the user’s head movement (with reasonably sized devices), to using sound, to using hand gestures… so testing VR applications is getting more complicated, as it now requires testing multiple inputs. The handsets usually have menu options as well.

Before going on, I’d like to mention AR. AR is about adding virtual elements to the real world; with AR we do not create the world. However, AR has a lot in common with VR, starting with the development platforms, so testing the two would be very similar.

We have talked about the hardware devices on which VR applications run, but we should also talk about the software with which the applications are written.

Samsung Gear VR + one handset

Development platforms:

Right now there are two main platforms for developing in VR: Unity and Unreal, and you can also find some VR web apps. Most things done with Unity use C# to control the program. Unreal feels a bit more drag-and-drop than Unity.

Besides this, if you are considering working on a VR application, you should also take into account the creation of the 3D objects, which is usually done with tools such as Blender; you can also find some already created online.

But what’s different about a VR application for testers?

Tests in VR applications:

VR applications have some specifics that we should be aware of when testing. A good general approach to testing in VR is to think about what could make people uncomfortable or make the experience difficult.

For example, sound can be very important: when done well, it creates very realistic experiences, making you look where the action is happening or helping you find hidden objects.

Let’s explore each of the VR testing types and list the ways we can ensure quality in a virtual world. I am assuming you know what these types of testing are about, so I won’t define them in depth, but I will give examples and talk about the barriers in VR.

Usability testing:

It ensures that the customer can use the system appropriately. There are additional checks when testing in VR, such as verifying that the user can see and reach objects comfortably and that they are aligned appropriately.

We are not all built the same way, so we may need a configuration step before the application starts so that users can interact properly with the objects. For example, the objects around us may not be easily seen or reached by all our users, as our arms are not the same length.

You should also check that colors, lighting and scale are realistic and match the specifications. These do not only affect quality; they can change the experience completely. For example, maybe we want the scale of the scene to be bigger than the user to give the feeling of shrinking.

It is also important to verify that movement does not cause motion sickness. This is a particularly important concern for VR applications: it occurs when what you see does not line up with what you feel, and you start feeling uncomfortable or dizzy. Everyone has a different threshold for this, so it is important to make sure the app will not cause it when used for a long time. For example, ensure that motions are slow, place the user in a cabin-like area where the things immediately around them are static, maintain a high frame rate and avoid blurry objects.

Sitting experience – Picture credit rawpixel.com
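As a small illustration of the frame-rate point above, here is a minimal Unity C# sketch that logs a warning whenever a frame exceeds a comfort budget. The component name and the ~11 ms threshold (roughly 90 Hz) are my own assumptions, not values from any headset SDK:

```csharp
using UnityEngine;

// Minimal sketch: warn whenever a frame takes longer than a comfort budget,
// so long sessions on the target device can be checked for dropped frames.
public class FrameBudgetWatcher : MonoBehaviour
{
    public float budgetMilliseconds = 11.1f; // ~90 fps target (illustrative)

    void Update()
    {
        float frameMs = Time.unscaledDeltaTime * 1000f;
        if (frameMs > budgetMilliseconds)
            Debug.LogWarning($"Frame took {frameMs:F1} ms (budget {budgetMilliseconds} ms)");
    }
}
```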

If there is someone on your team who is particularly sensitive to motion sickness, that person would be the best one to take on the tester role for the occasion. In my case, I asked my mother for help; she was not used to any similar experience at all and was very confused about how the whole thing worked.

Accessibility testing

This is a subset of usability testing that ensures the application can be used appropriately by people with disabilities, such as hearing impairments or color blindness, by older users, and by other disadvantaged groups.

Accessibility is especially important in VR, as there are more considerations than in other applications: mobility, hearing, cognition, vision and even smell.

For mobility, think about the height of the users, hand gestures, range of motion, the ability to walk, duck, kneel or balance, speed, orientation…

To include users with hearing issues, subtitles for the dialogue are a must, and they should be easily readable. The position of the subtitles should tell the user where the sound is coming from. When a VR experience requires speech from the user, it would be nice if the user could also reply with some form of visual communication.
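As a rough illustration of positioning subtitles towards the sound, here is a hypothetical Unity C# helper; the class and field names, and the use of a world-space TextMesh, are my own illustrative choices rather than any standard approach:

```csharp
using UnityEngine;

// Hypothetical helper: place a world-space subtitle between the player and the
// speaker, so its position hints at where the sound is coming from.
public class DirectionalSubtitle : MonoBehaviour
{
    public Transform player;        // usually the VR camera
    public Transform speaker;       // the object emitting the dialogue audio
    public TextMesh subtitleLabel;  // world-space text shown to the user
    public float distanceFromPlayer = 2f;

    public void Show(string line)
    {
        subtitleLabel.text = line;

        // Put the label a couple of metres in front of the player,
        // in the direction of the speaker, facing away from the player.
        Vector3 toSpeaker = (speaker.position - player.position).normalized;
        subtitleLabel.transform.position = player.position + toSpeaker * distanceFromPlayer;
        subtitleLabel.transform.rotation = Quaternion.LookRotation(toSpeaker);
    }
}
```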

There are different degrees of visual impairment, and we should take this into account. It is important that objects have good contrast and that the user can zoom into them if they are too far away. Sound is also a very important part of the experience, and ideally the user should be able to move around and interact with objects based on sound alone.

I realized how different the experience could be depending on the user just by asking my mother to help me test one of my apps. She usually wears glasses to read, so from the very beginning she could not see the text as clearly as I could.

I mentioned before that in VR it is possible to interact with objects by focusing the camera on them for a period of time. This is a simple alternative to clicking that does not require hand gestures, for people who have difficulty using them.

There are many sources online about how to make fully accessible VR experiences, and I am sure you can come up with your own tests.

Integration testing

Its purpose is to ensure that the entire application functions as it should in the real world and meets all requirements and specifications.

To test a VR application, you need to know the targeted hardware, the intended users and other design details that we cover with the other types of testing.

Also, in VR everything happens in 360 degrees across three axes, so controlling camera movement is crucial in order to automate tests.

Besides, there may be multiple object interactions around us that we also need to verify, such as collisions, visibility, sounds, bouncing…

There are currently some testing tools, some of them within Unity, that can help us automate things in VR, but most are designed from a developer’s perspective. That’s one more reason to make sure developers are writing good unit tests for the functions behind the functionality and, when possible, for the objects and prefabs. In particular, unit tests should focus on three main aspects: the code, the object interactions and the scenes. If the application uses a database, or an API that controls things that then change in VR, we should still test those as usual. These tests will lighten the integration testing phase.

Unity Test Runner
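As an example of what such a unit test could look like, here is a minimal sketch of a test for the Unity Test Runner. The scenario and names are illustrative (and the class would need to live in a test assembly): it checks that a “gaze” ray cast from a camera position actually hits an object placed straight in front of it.

```csharp
using NUnit.Framework;
using UnityEngine;

// Minimal sketch of a Unity Test Runner test (illustrative names and scenario).
public class GazeRaycastTests
{
    [Test]
    public void ForwardRay_HitsObjectPlacedInFrontOfCamera()
    {
        // Arrange: a stand-in for the camera and a cube five metres ahead of it.
        var cameraRig = new GameObject("CameraRig").transform;
        var target = GameObject.CreatePrimitive(PrimitiveType.Cube);
        target.transform.position = cameraRig.position + cameraRig.forward * 5f;
        Physics.SyncTransforms(); // make sure the physics scene sees the new collider

        // Act: cast the "gaze" ray.
        bool hitSomething = Physics.Raycast(cameraRig.position, cameraRig.forward,
                                            out RaycastHit hit, 10f);

        // Assert: the ray hit exactly the object we placed.
        Assert.IsTrue(hitSomething);
        Assert.AreEqual(target, hit.collider.gameObject);

        // Clean up the objects created for the test.
        Object.DestroyImmediate(cameraRig.gameObject);
        Object.DestroyImmediate(target);
    }
}
```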

Things are changing rapidly in this area, as many people have understood the need for automation in VR. When I started with Unity, I did not know the testing tools existed and tested most things manually, but there are some automated recording and playback tools around.

Performance testing

This is the process of determining the speed or effectiveness of a system.

In VR, the scale, a high number of objects, different materials and textures, and the number of lights and shadows can all affect system performance. Performance also varies between devices, so the best thing to do is to check on the supported ones. This is a bit expensive, which is why some apps focus on only one platform.

Many of my first apps ran perfectly well on the computer but would not even start on my phone.

It is important to find a good balance so the application is both attractive and responsive. Good performance is also what makes the experience realistic and immersive. But sometimes, in order to improve performance, we have to give up other things, such as the quality of materials or lights, which in turn makes the experience less realistic.

In the case of Unity, the Profiler tool gives you some idea of the performance, but there are many other tools you can use. In VR, we need to keep an eye on the following data: CPU usage, GPU usage, rendering performance, memory usage, audio and physics. For more information on this, you can read this article.

Unity Profiler
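As a sketch of how some of these numbers could be collected on a device build, outside the Profiler window, here is a small illustrative Unity C# component; the sampling interval and component name are my own choices:

```csharp
using UnityEngine;
using UnityEngine.Profiling;

// Minimal sketch: periodically log frame time and memory figures similar to
// those shown in the Profiler window, so they can be gathered from a device.
public class PerformanceSampler : MonoBehaviour
{
    public float intervalSeconds = 5f; // how often to log a sample (illustrative)
    private float nextSample;

    void Update()
    {
        if (Time.unscaledTime < nextSample) return;
        nextSample = Time.unscaledTime + intervalSeconds;

        long allocated = Profiler.GetTotalAllocatedMemoryLong();
        long reserved  = Profiler.GetTotalReservedMemoryLong();
        Debug.Log($"Frame: {Time.unscaledDeltaTime * 1000f:F1} ms, " +
                  $"allocated: {allocated / (1024 * 1024)} MB, " +
                  $"reserved: {reserved / (1024 * 1024)} MB");
    }
}
```

These logs could then be pulled from the device (or sent somewhere central) to spot trends over a long session.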

You can also check for memory leaks, battery usage, crash reports, network impact… and use any other performance tools available on the different devices. Some of these take a snapshot of the performance over time and send it to a database for analysis, or raise alerts when something spikes so you can get logs and investigate the issue, while others are installed directly on the device and run on demand.

Last but not least, VR applications can be multiplayer (as is the case with VRChat), so we should verify how many users can connect at the same time and still share a pleasant experience.

Security testing

This ensures that the system cannot be compromised by hacking.

This sort of testing is also important in VR, and as the platforms evolve, new attacks could come to life. Potential future threats include theft of virtual property, especially with the evolution of cryptocurrency and the monetization of applications.

Other testing

Localization testing: as with any other application, we should make sure that proper translations are available for the different markets and that we are using appropriate wording for them.

Safety testing: there are two main safety concerns with VR (although you might think of others).

1. Can you easily know what’s happening around you?

Immersive applications are the goal of VR, but we are still living in a physical world. Objects around us could be harmful, and not being aware of alarms such as a fire or carbon monoxide alarm could have catastrophic results. Being able to disconnect easily when an emergency occurs is vital in VR applications. We should also make sure we remind the user, before they are immersed, to move nearby objects out of the way.

Smoke could go unnoticed – Picture credit Carlos Espinoza

Every time I ask someone to try out an app with the mobile device, they start walking and I have to stop them before they hit something. And that is with the mobile device, not the fully immersive experience.

Even I, being quite aware of my surroundings, tried some devices that include sound, hit my hand on several objects around me, and could not hear what the person who had handed me the device was telling me.

2. Virtual assaults in VR:

When you test an application in which users can interact with each other in VR, the distance allowed between users could make them feel uncomfortable. We should think about this too when testing VR.

Luckily, I haven’t experienced any of this myself, but I have read a lot from other people talking about this issue. Even in some online play-throughs of VRChat, you can see how people break through the players’ comfort zones.

Testing WITH VR: there are tools being developed in VR for many different purposes, such as emulating real scenarios. Testing could be one of them: we could, for example, have an immersive VR tool to teach people how to test by example. I have created a museum explaining the different types of testing; maybe the next level could be a VR application with issues that users have to find.

What about the future?

Picture credit Harsch Shivam

We have started to see the first wireless headsets and some environmental experiences, such as moving platforms and smell sensations.

We can expect the devices to get more and more accurate and complete, and therefore there will be more things to test and to take into account. We can also expect the devices to get more affordable over time, which will grow the market.

Maybe someday we will find it hard to differentiate between what’s real and what’s a simulation… maybe… we are already in one.

Virtual Reality starter pack (and VR Udacity nanodegree experience)

If you are a regular reader of my blog, you are probably expecting a testing story. This is not exactly that, but have some patience, as I will link it to testing in upcoming posts.

Virtual reality is a field I have always been curious about, since I was a little lynx, before it was even possible to bring it to users (as the devices were not exactly portable back then).

On the other hand, I was looking to do a nanodegree from Udacity to learn something new and keep myself up to date. When you have been working as a developer for a while (especially as a developer in test), you need to keep up to date. I once heard feedback about a candidate interviewed by some friends: he did not really have 10 years of experience, he had 1 year of experience repeated 10 times, and this can easily happen to anybody. To avoid it, what I do and recommend is simple: keep learning new stuff.

And so, I decided to take the VR nanodegree course at Udacity.

Advice if you are considering the VR nanodegree course:

The first thing you need to know about virtual reality is that you will need a device for it; otherwise you won’t be able to test anything you do.

The second thing you need to know about VR: if you want to work on multiple platforms, you also need multiple devices. This might seem obvious, but most current technologies (think about mobile, for example) have emulators that let you deploy and test on different devices. For VR, this is not there yet (at the time of writing, and to my knowledge).

So, if you are planning on getting into the nanodegree course and into actual VR development, get ready to purchase an HTC Vive or an Oculus Rift, unless you are lucky enough to be able to borrow one, or unless you prefer to take the speciality on cinematics and 360 recording. I ended up picking that speciality. Not that I did not want to spend the money on a cool VR device that would also let me play cool games, but I had recently moved countries (and continents, in fact) and did not want to carry much stuff around with me.

One more thing to take into account: VR devices come with minimum computer specifications, so you might also need to upgrade your computer for the device to work properly.

Lucky for us, in VR we can also develop for mobile, which only requires a cheap device into which to slot your phone (you could even build your own). You can’t do as many things as you can with hand controllers and body movement detectors, but you can still do some cool things. For the first modules of the nanodegree, this is all you need, and the course provides a Cardboard to the students (which is great because it has a click option that some other devices lack).

However, there is another thing you could get stuck on, although I think there are some workarounds, but it would at least slow you down: you cannot directly develop for an iOS phone from a Windows machine; you have to do it from a Mac.

In terms of content, I would advise you to be interested in multimedia and gaming if you decide to go through the course.

Feelings about the course itself

I actually really enjoyed the course (except for the speciality I was forced to take because of the lack of devices). I think the content is quite good and the projects are challenging and leave room for creativity.

It’s also great to network with other people with interest in VR.

In terms of testing in VR, there is currently no module about it, but they do explain many things about performance and what a good VR application should look like, so I believe this content is covered across the course.

Where should I start if I want to learn more about VR?

First of all, I think you can do this course for free if you don’t mind not getting the degree (you cannot access the evaluations). That could be a good starting point, and you could always join for the assessments afterwards, which might save you a bit of time and money.

However, I’d say the best way to get started is to actually try a device and some apps. On Android you can download the ‘Cardboard’ and ‘Expeditions’ apps. You can also look for VR apps or games in your phone’s app store (whatever your phone’s OS). Another way could be checking Steam (with a more expensive device), YouTube, or even GitHub to see someone’s code. For example, you can check out mine.

Last but not least, you can also install Unity, as its in-editor preview can give you an idea of what the world will look like, and start playing around. There is plenty of documentation about it. Another good tool to check out is Unreal; you don’t need as many development skills with that one.

What next?

So, you have checked out some VR apps and devices. You might even have created some small apps. The next step is to be able to tell whether your apps (or someone else’s) are of good quality (this is a test-focused blog, after all). For this, we should keep in mind some new considerations for every type of testing, but that’s… well… another story.