Vuforia / MergeVR Integration

This post outlines the steps required to integrate Vuforia for Digital Eyewear with the MergeVR SDK in Unity. The result is an augmented reality demo app that runs in the MergeVR headset on your Android device. The app recognizes an image marker, displays a 3D object on top of that marker, and lets the user trigger a virtual button on the object. You can then enter VR mode and move around the VR scene using the MergeVR headset's capacitive input buttons.

https://developer.qualcomm.com/software/vuforia-augmented-reality-sdk

http://mergevr.com/

 

Requirements

Android mobile device

Unity3d – version 4.6.*

Vuforia – Download the core Vuforia Unity package – vuforia-unity-5-0-5 (you must be a registered Vuforia developer to download it) and the eyewear samples vuforia-samples-eyewear-unity-5-0-5

MergeVR – download the latest Merge SDK Unity package – MergeVR_version_v062

 

Steps

 

  • Create a new empty Unity Project (4.6.*)
  • Import the Merge SDK Unity package (MergeVR_version_v062.unitypackage)
  • Import the Vuforia core Unity package (vuforia-unity-5-0-5.unitypackage)
  • Import the Vuforia Digital Eyeware sample AR/VR package from the vuforia-samples-eyewear-unity-5-0-5.zip file (arvr-5-0-5.unitypackage)
  • In the Scenes folder find the Scene ‘Vuforia-3-AR-VR’ – Duplicate it (Edit->Duplicate) and rename the new scene ‘DemoVRAR’
  • Open ‘DemoVRAR’ scene.
  • Stop running the scene (if you haven’t already)

 

  • Find the MergeVR -> Prefabs folder. Drag the MergeCameraController and MergeSDK prefabs into the scene Hierarchy.

 

  • Expand the MergeCameraController tree.

 

  • Click on ARCamera under UserHead and view the Inspector Pane. Check (enable) the ‘Bind Alternate Camera’ property.

 

  • Drag the MergeCameraController root transform to the ARCamera ‘Central Anchor Point’ property. Drag MergeCameraRight to ‘Right Camera’ (Vuforia will pop up a dialog asking to add Vuforia components – confirm this for both the right and left cameras), then drag MergeCameraLeft to ‘Left Camera’.

 

  • On ARCamera property ‘Viewer’ select ‘Other Viewer’ and enter 1 for Viewer ID.

 

  • Save the scene, then build and run it on Android. You should now have a working AR app that recognizes the stones image marker and displays the 3D mountain object. If you focus your gaze on the virtual ‘VR’ button for 2 seconds, the app transitions into the full VR scene where you can look around; to exit the VR scene, look straight down and focus on the ‘AR’ button for 2 seconds.

 

  • We now have a working AR/VR app. Next we need to make a few modifications to this scene so it runs correctly in the MergeVR headset and lets us use the capacitive touch buttons on the headset to interact with the VR world.

 

  • When an Android device sits in the MergeVR headset, its camera ends up on the right side of the headset in AR mode. The generic Vuforia demo doesn’t account for this, so we have to make a few modifications to the MergeVR code to handle the change.

 

  • First, to move the viewport to the correct position when the camera is on the right: open the ‘MergeScreenManager.cs’ script in MergeVR -> Scripts. In the function ‘SetViewPortResolutionAndPostion’, replace this line

 

viewportYpos = viewportBottom;

 

with

 

viewportYpos = viewportBottom + (Screen.height - viewportHeight);

 

and in the ‘MergeCameraController.cs’ script (also in MergeVR -> Scripts), comment out the following lines in the function AndroidGyroTracking:

 

/*
if (Input.deviceOrientation == DeviceOrientation.LandscapeRight || Input.deviceOrientation == DeviceOrientation.LandscapeLeft)
    currentOrientation = Input.deviceOrientation; // only change on either full landscape

if (currentOrientation == DeviceOrientation.LandscapeRight) {
    androidGyroRotation = new Quaternion (-Y, X, Z, -W);
    androidGyroRotation *= Quaternion.Euler (180f, 180f, 0);
}
else
*/

 

leaving

 

androidGyroRotation = new Quaternion (-Y, X, Z, -W); //default

 

  • Now build and run the app – the viewports will be aligned correctly and the VR scene will be working as well.

 

  • Now we need to add interaction with the VR scene (movement, in this example) using the MergeVR capacitive touch buttons on the headset.

 

  • Add a new script called MergeEyeCustom to the MergeCameraController game object in your scene:

 

using UnityEngine;
using System.Collections;

public class MergeEyeCustom : MonoBehaviour {

    public float speed = 1.5f;
    public float jumpSpeed = 10.0f;   // unused in this simple example
    public float gravity = 10.0f;     // unused in this simple example
    public bool allowJump = false;

    Vector3 moveDirection = Vector3.zero;
    float ydirection = 0f;
    float xdirection = 0f;

    // Use this for initialization
    void Start () {

    }

    // Update is called once per frame
    void Update () {

        if (Merge.MSDK.isControllerConnected ()) {
            // Read movement from a connected Merge controller
            ydirection = Merge.MSDK.getController ().GetAxis ("Vertical");
            xdirection = Merge.MSDK.getController ().GetAxis ("Horizontal");
        } else {
            // Use arrow keys in the editor – or the capacitive buttons if present on the MergeVR headset
            if (Input.GetKey (KeyCode.UpArrow) || Merge.MergeInput.GetInput (1))
                ydirection = -1;
            else if (Input.GetKey (KeyCode.DownArrow) || (!allowJump && Merge.MergeInput.GetInput (0)))
                ydirection = 1;
            else
                ydirection = 0;

            if (Input.GetKey (KeyCode.RightArrow))
                xdirection = 1;
            else if (Input.GetKey (KeyCode.LeftArrow))
                xdirection = -1;
            else
                xdirection = 0;
        }

        // Move forward/back and strafe relative to where the user is looking (vertical component ignored)
        moveDirection = new Vector3 (transform.forward.x * ydirection * speed, 0f, transform.forward.z * ydirection * speed);
        moveDirection += new Vector3 (transform.right.x * xdirection * speed, 0f, transform.right.z * xdirection * speed);

        transform.position += moveDirection * Time.deltaTime;

    }
}

 

  • Now you can use the left and right MergeVR headset buttons to move around the scene once you are in VR mode.

 

  • As an exercise, you can do the following to polish the example:

 

  • remove or modify one of the gaze cursors
  • optimize rendering performance
  • add a rigid body or first person controller for more realistic collisions in the VR scene (a sketch of one approach follows below)
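
For that last item, one option is to move the camera rig with a CharacterController instead of writing transform.position directly, so the player collides with scene geometry and stays on the ground. The sketch below is an illustrative variation, not part of the MergeVR SDK: it assumes you add a CharacterController component to the MergeCameraController game object, and the class name, speed, and gravity values are placeholders to tune for your scene.

using UnityEngine;

// Sketch: collision-aware movement using a CharacterController instead of
// setting transform.position directly. Assumes a CharacterController component
// on the same game object (e.g. the MergeCameraController).
public class MergeEyeCustomCC : MonoBehaviour {

    public float speed = 1.5f;
    public float gravity = 10.0f;

    CharacterController controller;

    void Start () {
        controller = GetComponent<CharacterController> ();
    }

    void Update () {
        // Replace these with the headset/arrow-key input used in MergeEyeCustom.
        float ydirection = Input.GetAxis ("Vertical");
        float xdirection = Input.GetAxis ("Horizontal");

        // Horizontal movement relative to where the user is looking.
        Vector3 move = transform.forward * ydirection * speed + transform.right * xdirection * speed;
        move.y = -gravity; // constant downward pull so the controller stays grounded

        // Move() resolves collisions against scene colliders, unlike setting transform.position.
        controller.Move (move * Time.deltaTime);
    }
}

The only real difference from MergeEyeCustom is the final line: the CharacterController resolves collisions against the scene rather than letting the camera pass through geometry.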

New iOS social game ‘Just Picture It’ launched in the App Store


The social image sharing game that we developed in partnership with Mason Software Company has just been launched in the Apple App Store – it is called Just Picture It.

The iOS game uses Parse.com and Amazon AWS as the primary back-end components for storing data and sharing images.

It is free and quite a bit of fun to play with friends – please download and give it a try…

Initial Impressions of Unity3D

After evaluating 3D engines for iOS development, I’ve decided we’re going to go with Unity3D.

I would prefer a native Objective-C engine, but the ability to deliver for multiple platforms is very attractive (Unity can deploy to Android, iOS, Mac, Linux, Xbox 360, and the Wii).

Unity development so far has been straightforward. Scripts can be written in C# or JavaScript, and I’ve created several 3D scenes, using the extensive resources from the forum and Unity Answers to build a ship that moves in orbit with realistic physics. The Unity Asset Store has thousands of additional resources for Unity3D development. Initially I’m using atmospheric planets and Vectrosity for line drawing.

Unity3d scene
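
As an illustration of the orbital motion mentioned above, here is a minimal sketch of the kind of script involved. It is not the project’s actual code: the class and field names, the planet mass, and the gravitational constant are placeholder assumptions. It applies an inverse-square pull toward a planet each physics step via Rigidbody.AddForce; give the ship a sideways starting velocity and it falls into an orbit.

using UnityEngine;

// Illustrative sketch: inverse-square gravity pulling a ship's rigidbody toward a planet.
public class OrbitalGravity : MonoBehaviour {

    public Transform planet;          // the body to orbit (placeholder)
    public float planetMass = 1000f;  // arbitrary units
    public float G = 0.1f;            // "gravitational constant" tuned to the scene scale

    Rigidbody body;

    void Start () {
        body = GetComponent<Rigidbody> ();
        body.useGravity = false; // we supply our own gravity
    }

    void FixedUpdate () {
        Vector3 toPlanet = planet.position - transform.position;
        float distSqr = toPlanet.sqrMagnitude;
        // F = G * m1 * m2 / r^2, directed toward the planet
        Vector3 force = toPlanet.normalized * (G * planetMass * body.mass / distSqr);
        body.AddForce (force);
    }
}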

We play-tested the game as both turn-based and real-time multiplayer. I was initially planning on a turn-based approach using Apple’s Game Center for iOS, but the game plays much better in real time. We evaluated several multiplayer networking solutions and are going to go with Photon Cloud – it is very well integrated with Unity, reasonably priced, and should be able to scale.
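
For context, getting connected to Photon Cloud from Unity takes only a few lines. The sketch below is a rough example assuming the classic Photon Unity Networking (PUN) plugin of that era; callback names and signatures vary between PUN versions, and the game-version string and room options are placeholders.

using UnityEngine;

// Rough sketch: connect to Photon Cloud and join (or create) a random room.
// Assumes the classic PUN plugin; API details differ between PUN versions.
public class PhotonConnect : MonoBehaviour {

    void Start () {
        // "v1.0" is a placeholder game version string
        PhotonNetwork.ConnectUsingSettings ("v1.0");
    }

    void OnJoinedLobby () {
        PhotonNetwork.JoinRandomRoom ();
    }

    void OnPhotonRandomJoinFailed () {
        // No open room was found, so create one
        PhotonNetwork.CreateRoom (null);
    }

    void OnJoinedRoom () {
        Debug.Log ("Joined room: " + PhotonNetwork.room.name);
    }
}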

iOS Game – Torchships

One of the very nice new features of iOS 5 is that Game Center will handle all of the mechanics and hosting for multiplayer turn-based games. This creates an opportunity to build a very feature-rich multiplayer game without a lot of infrastructure costs.

Accordingly, we’re going to design and develop a 3D turn-based space combat game for the iPhone and iPad with realistic physics. The game will be ad-supported and free to download and play. We will release rapid iterations and add features as we go…

  • Using iSGL3D for the 3D graphics.
  • 4 gameplay modes – tutorial/training, hotseat (same device), peer-to-peer (Wi-Fi/Bluetooth via GameKit), and Game Center hosted.
  • Realistic Newtonian physics.
  • Weapons include gamma-ray lasers, missiles, and kinetic launchers.
  • Terrain – terran planets, moons, asteroid fields, ring systems, and dust and debris clouds.

The working title for the game is Torchships.

If you’re interested in following our progress, follow this blog (see the right-hand pane of this page to sign up).

If you’d like to help beta-test Torchships, please sign up at our TestFlight page – you will need an iPad, iPhone, or a recent-generation iPod touch to participate.