

With VR experiences, there's as much "feel" as engineering involved.

With two-dimensional experiences (on a flat screen), it's harder to induce emotions such as fear, insecurity, and nausea, while in VR this is (almost too) simple to achieve.

Because of this, we love tools that let us quickly prototype VR experiences and try them out in VR, then take those learnings back to the bench and iterate to improve.

Why use Unity for Prototyping VR?

Unity is great for quickly scaffolding the build for a VR or AR application, with built-in integration with Vuforia for AR and Oculus or OpenVR on the VR side. This means you can skip the rigmarole of searching through various websites for SDKs and get down to the fun part: actually building your application. You can, however, go to other VR providers such as HTC and Google for their SDKs and install them to build for those platforms.

Stepping through what you need to do

Once you've got your project set up to build for your platform of choice, you can start building your app. Depending on what you're building, you'll probably need different starting points.

For example, if you were building a Vuforia app you'd want to start out with their specialised camera, which already has all the scripts in place to get you going. From there you'd build out your application, adding in any assets, scripts, videos, and so on. Once you're set up in Vuforia with your image target, you're good to go; you can even use your machine's built-in camera to test if you don't want to build and deploy to another device.

VR projects are usually more complex, but fortunately, Unity doesn't make them any more arduous to develop than a normal app. The only major differences are the type of camera you use and the control scheme, if your app has one.

By default, enabling VR in Unity doesn't change much about the camera object, except that you can no longer move it directly, as the headset takes care of its transform. So if you want to move the camera via scripts, you'll need to put it inside a container object and move that using whatever controls you'd like your VR game to have.
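A minimal sketch of that container pattern, assuming a parent "rig" GameObject with the VR camera as its child (this script is illustrative and needs the Unity runtime; the input axes are just the default keyboard/gamepad ones, so substitute whatever control scheme your game uses):

```csharp
using UnityEngine;

// Attach this to the camera's parent ("rig") object, not the camera itself.
// The headset drives the camera's local transform; scripts move the rig.
public class CameraRigMover : MonoBehaviour
{
    public float speed = 2f; // movement speed in metres per second

    void Update()
    {
        // Read the default input axes as an example control scheme.
        float h = Input.GetAxis("Horizontal");
        float v = Input.GetAxis("Vertical");

        // Move the whole rig; the headset's tracking is applied on top.
        transform.Translate(new Vector3(h, 0f, v) * speed * Time.deltaTime);
    }
}
```

Because the headset overwrites the camera's own local position every frame, translating the camera directly would be fighting the tracking; translating the parent keeps both working together.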

By default, Unity sets the camera to render to "both" eyes, but if you want you can separate that out by having two cameras, one set to the left eye and the other to the right. This allows greater customisation of what you show the player: for example, you could have a scene where only the left eye can see certain aspects, or a HUD or other screen overlay shown only over the right eye.
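A sketch of that two-camera setup, using Unity's `Camera.stereoTargetEye` property (the `RightEyeOnly` layer name here is a hypothetical example you'd create yourself in the editor):

```csharp
using UnityEngine;

// Assigns each of two cameras to a single eye, and hides a layer
// from the left eye so its contents appear only in the right.
public class PerEyeCameras : MonoBehaviour
{
    public Camera leftEye;
    public Camera rightEye;

    void Start()
    {
        // Instead of the default StereoTargetEyeMask.Both, split the eyes.
        leftEye.stereoTargetEye = StereoTargetEyeMask.Left;
        rightEye.stereoTargetEye = StereoTargetEyeMask.Right;

        // Example: objects on a hypothetical "RightEyeOnly" layer are
        // culled from the left eye's view, e.g. for a right-eye-only HUD.
        leftEye.cullingMask &= ~(1 << LayerMask.NameToLayer("RightEyeOnly"));
    }
}
```

The same culling-mask trick works in reverse for left-eye-only content; each camera just stops rendering whichever layers you mask out.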

Even in VR, the entire Unity engine is still available, including the physics and lighting engines, meaning that a VR game that uses gravity and other physics-related features is as simple to implement as a non-VR app.

Enabling a specific type of VR is as simple as selecting which device you're planning to develop for, and you can even choose multiple. If a device is not in the list but has a Unity SDK, just fetch and install that SDK; it should then be added to the list, making it possible to build for that target platform.

Closing Words

All in all, Unity makes a great staging ground for VR apps and games: it lets you build your game for numerous devices and provides the framework to make the whole process quick and smooth, while still giving you the full power of the Unity engine. It's even possible to go through a whole prototyping cycle without VR, making your game using normal techniques and then adding VR at the very end, without a huge number of changes (depending on the game, of course).

Because Unity can build for multiple platforms, even if you're developing for hardware you don't necessarily have, you can still get the groundwork in by developing for something you do. Google Cardboard is a great example: you can just use your phone to get a basic idea of what your app will look like, then build it for your actual intended device.

About the author

Stuart Muckley


I've been a programmer and IT enthusiast for 30 years (since the ZX Spectrum) and concentrated on AI (neural nets and genetic algorithms) at university. My principal skills are in Enterprise and Solution Architecture and managing effective developer teams.

I enjoy the mix between technical and business aspects: how technology enables, how that (hopefully) improves profit/EBITDA and reduces cost-per-transaction, the impact upon staff, how to remediate go-live and handover, and risk identification and mitigation. My guiding principle is Occam's Razor: simplicity is almost always the best option, reducing complexity, time to build, organisational stress, and longer-term costs.

