Vuforia / MergeVR Integration

This post outlines the steps required to integrate Vuforia for Digital Eyewear with the MergeVR SDK in Unity. The result is an Augmented Reality demo app that runs in the MergeVR headset on your Android device. It recognizes an image marker and displays a 3D object on top of that marker, lets the user trigger a virtual button on the object – then enter VR mode and move around the VR scene using the MergeVR headset's capacitive input buttons.

https://developer.qualcomm.com/software/vuforia-augmented-reality-sdk

http://mergevr.com/

 

Requirements

Android mobile device

Unity3d – version 4.6.*

Vuforia – Download the core Vuforia Unity package – vuforia-unity-5-0-5 (you must be a registered Vuforia developer before you can download it) and the eyewear samples – vuforia-samples-eyewear-unity-5-0-5

MergeVR – Download the latest Merge SDK Unity package – MergeVR_version_v062

 

Steps

 

  • Create a new empty Unity Project (4.6.*)
  • Import the Merge SDK Unity package (MergeVR_version_v062.unitypackage)
  • Import the Vuforia core Unity package (vuforia-unity-5-0-5.unitypackage)
  • Import the Vuforia Digital Eyeware sample AR/VR package from the vuforia-samples-eyewear-unity-5-0-5.zip file (arvr-5-0-5.unitypackage)
  • In the Scenes folder, find the scene ‘Vuforia-3-AR-VR’ – duplicate it (Edit -> Duplicate) and rename the new scene ‘DemoVRAR’
  • Open ‘DemoVRAR’ scene.

  • If you ran the scene to try it out, stop it before continuing.

 

  • Find the MergeVR -> Prefabs folder. Drag the MergeCameraController and MergeSDK prefabs into the scene Hierarchy.

 

  • Expand the MergeCameraController tree.

 

  • Click on ARCamera under UserHead and view the Inspector Pane. Check (enable) the ‘Bind Alternate Camera’ property.

 

  • Drag the MergeCameraController root transform to the ARCamera property ‘Central Anchor Point’. Drag MergeCameraRight to ‘Right Camera’ (Vuforia will pop up a dialog asking to add Vuforia components – accept it for both the right and left cameras), then drag MergeCameraLeft to ‘Left Camera’.

 

  • On the ARCamera ‘Viewer’ property, select ‘Other Viewer’ and enter 1 for the Viewer ID.

 

  • Save the scene, then build and run it on Android. You should now have a working AR app that recognizes the Stones image marker and displays the 3D mountain object. If you focus your gaze on the virtual ‘VR’ button for 2 seconds, the app transitions you into the full VR scene where you can look around; to exit the VR scene, look straight down and focus on the ‘AR’ button for 2 seconds. (A sketch of this gaze-and-dwell idea follows below.)
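The sample implements this AR/VR switch with its own gaze timer. As an illustration only (not the sample's code), a gaze-and-dwell trigger in Unity boils down to a forward raycast from the camera plus a timer; the ‘ModeButton’ tag and the UnityEvent below are hypothetical placeholders you would wire up yourself. Attach the script to the camera so transform.forward is the gaze direction.

using UnityEngine;
using UnityEngine.Events;

public class GazeDwellButton : MonoBehaviour {

    public float dwellTime = 2.0f;          // seconds the gaze must stay on the button
    public string buttonTag = "ModeButton"; // hypothetical tag on the AR/VR switch object
    public UnityEvent onDwellComplete;      // wired up in the Inspector (e.g. switch AR/VR mode)

    float gazeTimer = 0f;

    void Update () {
        // Cast a ray straight out of the centre of the camera (the user's gaze).
        Ray gaze = new Ray (transform.position, transform.forward);
        RaycastHit hit;

        if (Physics.Raycast (gaze, out hit) && hit.collider.CompareTag (buttonTag)) {
            gazeTimer += Time.deltaTime;
            if (gazeTimer >= dwellTime) {
                onDwellComplete.Invoke (); // dwell finished – trigger the mode switch
                gazeTimer = 0f;
            }
        } else {
            gazeTimer = 0f;                // looked away – reset the dwell timer
        }
    }
}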

 

  • We now have a working AR/VR app. Next we need to make a few modifications to this scene so it runs correctly in the MergeVR headset and lets us use the headset's capacitive touch buttons to interact with the VR world.

 

  • The MergeVR headset needs the Android device's camera on the right side to work in AR mode. Since the generic Vuforia demo doesn't support this, we have to make a few modifications to the MergeVR code to handle the change.

 

  • To move the viewport to the correct position when the camera is on the right, open the ‘MergeScreenManager.cs’ script in MergeVR -> Scripts. In the function ‘SetViewPortResolutionAndPostion’ replace this line

 

viewportYpos = viewportBottom;

 

with

 

viewportYpos = viewportBottom+(Screen.height-viewportHeight);

 

This shifts the viewport up by the unused screen height, so it sits at the top of the screen instead of the bottom. Then, in the ‘MergeCameraController.cs’ script in MergeVR -> Scripts, comment out the following lines in the function AndroidGyroTracking

 

/*
if (Input.deviceOrientation==DeviceOrientation.LandscapeRight || Input.deviceOrientation==DeviceOrientation.LandscapeLeft)
    currentOrientation=Input.deviceOrientation; //only change on either full landscape

if (currentOrientation==DeviceOrientation.LandscapeRight) {
    androidGyroRotation = new Quaternion (-Y, X, Z, -W);
    androidGyroRotation *= Quaternion.Euler(180f,180f,0);
}
else
*/

 

leaving

 

androidGyroRotation = new Quaternion (-Y, X, Z, -W); //default

 

  • Now build and run the app – the viewports will be aligned correctly and the VR scene will work as well.

 

  • Now we need to add interaction with the VR scene (movement in this example) using the capacitive touch buttons on the MergeVR headset.

 

  • Add a new script called MergeEyeCustom to the MergeCameraController game object in your scene:

 

using UnityEngine;
using System.Collections;

public class MergeEyeCustom : MonoBehaviour {

    public float speed = 1.5f;
    public float jumpSpeed = 10.0f;
    public float gravity = 10.0f;
    public bool allowJump = false;

    Vector3 moveDirection = Vector3.zero;
    float ydirection = 0f;
    float xdirection = 0f;

    // Use this for initialization
    void Start () {

    }

    // Update is called once per frame
    void Update () {

        if (Merge.MSDK.isControllerConnected ()) {
            // A Merge controller is connected – read its analog axes directly.
            ydirection = Merge.MSDK.getController ().GetAxis ("Vertical");
            xdirection = Merge.MSDK.getController ().GetAxis ("Horizontal");
        } else {
            // Use arrow keys in the editor – or the capacitive buttons if present on the MergeVR headset
            if (Input.GetKey (KeyCode.UpArrow) || Merge.MergeInput.GetInput (1))
                ydirection = -1;
            else if (Input.GetKey (KeyCode.DownArrow) || (!allowJump && Merge.MergeInput.GetInput (0)))
                ydirection = 1;
            else
                ydirection = 0;

            if (Input.GetKey (KeyCode.RightArrow))
                xdirection = 1;
            else if (Input.GetKey (KeyCode.LeftArrow))
                xdirection = -1;
            else
                xdirection = 0;
        }

        // Build the movement vector from the camera's forward and right directions,
        // dropping the vertical component so movement stays on the ground plane.
        moveDirection = new Vector3 (transform.forward.x * ydirection * speed, 0f, transform.forward.z * ydirection * speed);
        moveDirection += new Vector3 (transform.right.x * xdirection * speed, 0f, transform.right.z * xdirection * speed);

        transform.position += moveDirection * Time.deltaTime;
    }
}

 

  • Now you can use the left and right MergeVR headset buttons to move around the scene once you are in VR mode.

 

  • As an exercise you can do the following to polish the example:

      • remove or modify one of the gaze cursors

      • optimize rendering performance

      • add a rigid body or first person controller for more realistic collisions in the VR scene (see the sketch below)
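As a starting point for that last exercise, here is a minimal sketch (not part of the MergeVR SDK) of how the movement above could be reworked around a Unity CharacterController so walls and terrain block the player. It assumes you add a CharacterController component to the MergeCameraController game object, and it uses the default arrow-key axes as a stand-in for the Merge input so the snippet stays self-contained.

using UnityEngine;

public class MergeEyeCollisionMove : MonoBehaviour {

    public float speed = 1.5f;
    public float gravity = 10.0f;

    CharacterController controller;

    void Start () {
        // Assumes a CharacterController component has been added to this game object.
        controller = GetComponent<CharacterController> ();
    }

    void Update () {
        // The same Merge input handling as MergeEyeCustom could go here;
        // arrow keys / default axes are used as a placeholder.
        float forwardInput = Input.GetAxis ("Vertical");
        float strafeInput = Input.GetAxis ("Horizontal");

        Vector3 move = transform.forward * forwardInput * speed + transform.right * strafeInput * speed;
        move.y = -gravity;                 // constant downward pull keeps the controller grounded

        // Move respects colliders, so the player can no longer pass through geometry.
        controller.Move (move * Time.deltaTime);
    }
}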

Initial Impressions of Unity3D

After evaluating 3D engines for iOS development, I've decided we're going to go with Unity3D.

I would prefer a native Objective-C engine, but the ability to deliver for multiple platforms is very attractive (Unity can deploy to Android, iOS, Mac, Linux, Xbox 360, and the Wii).

Unity development so far has been straightforward – scripts can be written in C# or JavaScript. I've created several 3D scenes and used the extensive resources from the forum and Unity Answers to create a ship that uses realistic physics to move in orbit. The Unity Asset Store has thousands of additional resources for Unity3D development; initially I'm using atmospheric planets and Vectrosity for line drawing.
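The "realistic physics to move in orbit" part boils down to applying an inverse-square attraction toward the central body every physics step and letting the Rigidbody integrate it. This is a generic sketch of that idea (the class name and constant are made up), not the actual Torchships code.

using UnityEngine;

public class SimpleOrbitGravity : MonoBehaviour {

    public Transform centralBody;              // e.g. the planet the ship orbits
    public float gravitationalConstant = 50f;  // hand-tuned; the central body's mass is folded in

    Rigidbody body;

    void Start () {
        body = GetComponent<Rigidbody> ();
    }

    void FixedUpdate () {
        Vector3 toCenter = centralBody.position - transform.position;
        float distance = toCenter.magnitude;

        // Inverse-square attraction toward the central body; give the ship a suitable
        // sideways starting velocity and it settles into an orbit.
        Vector3 force = toCenter.normalized * gravitationalConstant * body.mass / (distance * distance);
        body.AddForce (force);
    }
}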

Unity3d scene

We playtested the game as turn-based and as real-time multiplayer – I was initially planning on a turn-based approach using Apple's Game Center for iOS, but the game plays much better in real time. We evaluated several multiplayer networking solutions and are going to go with Photon Cloud – it is very well integrated with Unity, well priced, and should be able to scale.
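For a sense of how lightweight the Unity integration is, a minimal connect-and-join flow with the classic Photon Unity Networking (PUN) package looks roughly like the sketch below; the exact callback names depend on the PUN version you import.

using UnityEngine;

public class PhotonQuickStart : MonoBehaviour {

    void Start () {
        // Connects to the Photon Cloud using the app id configured in PhotonServerSettings.
        PhotonNetwork.ConnectUsingSettings ("v1.0");
    }

    void OnJoinedLobby () {
        // Once in the lobby, try to join any open room.
        PhotonNetwork.JoinRandomRoom ();
    }

    void OnPhotonRandomJoinFailed () {
        // No open room exists yet – create one and wait for opponents.
        PhotonNetwork.CreateRoom (null);
    }

    void OnJoinedRoom () {
        Debug.Log ("Joined room with " + PhotonNetwork.playerList.Length + " player(s).");
    }
}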

3D engines for iOS Development

I’m evaluating several different 3d engines for our planned iOS space combat game Torchships.

I found good overviews at Never Read Passively, Open Source iPhone game engine comparison, and The Commercial iPhone Game Engine Comparison (3D and 2D).

This week I’m digging more into Unity and cocos3d to get a better understanding of the pros and cons. Unity looks very nice but I’m not sure I want to go with a non-iOS-native solution.

Here is the list of 3D engines I’ve looked at so far…


UDK – Unreal Development Kit for iOS – PC & iOS

The same toolset used to make Gears of War and Infinity Blade.

+ Beautiful graphics

+ Industry Standard, high performance

+ Completely integrated development system

– Development on PC only

– Proprietary scripting

Unity 3D – web, Flash, iOS, Android, PC & Mac, Wii, PS3, Xbox 360

Very popular 3D engine that supports a large number of platforms.

+ Large developer base and very active community

+ Very nice 3D graphics, physics, and particle effects

+ Integrated editor and asset pipeline

+ Largest number of platforms available

+ JavaScript or C#

+ Mac native development

- Not a native iOS app, so all new iOS features may not be available right away

Marmalade

Engine to create richer apps and games on iOS, Android and other platforms.

+ Based on C++

- Based on C++ 🙂

Shiva3d

3D game engine with a graphical editor to create applications and games for Windows, MacOS, Linux, iPhone, iPad, Android, BlackBerry QNX, WebOS, Marmalade and Wii

SIO2

An OpenGL ES based cross-platform 2D and 3D game engine for iOS, Android, MacOS and Windows

cocos3d

An extension to cocos2d. A sophisticated 3D application development framework for the iOS platform.

+ Open source

+ Based on and can integrate with the very powerful and popular cocos2d library

+ Native iOS

ISGL 3D

iSGL3D (iOS Scene Graph Library) is a 3D framework for the iPhone, iPad and iPod touch written in Objective-C, enabling the creativity of developers to flourish in a 3D world without the complexities of OpenGL.

+ Based on OpenGL

+ iOS native

- Author is not actively supporting it and is looking for new contributors