Now that we've made a conscious decision to work in VR, today I finally had the chance to play around with VR in Unity.
Today we're going to explore setting up and using Google's VR SDK. You might think that setting up VR would be an extremely complex process, but having gone through it myself, I can say that starting out is simpler than you would think!
Here’s our objective for today:
Setting up support for Google Cardboard on Unity
Going through Unity’s Tutorial
Let’s get started!
Step 1: Setting up Google Cardboard on Unity
For today, I'll be following Google's official documentation for getting started with VR on Unity.
The nice thing about the Google VR SDK is that we can re-use most of the prefabs and scripts used with Google Cardboard on Google Daydream as well. That's 2 different platforms for the price of one.
Step 1.1: Install Unity
At this point, I'm going to assume that we all have a reasonably recent version of Unity (5.6+).
To be able to run our VR app, we're going to need Android Build Support. If you don't have that installed already, re-run the Unity installer and choose to include Android Build Support during the installation process.
Step 1.2: Adding the Google VR SDK
After we have Unity set up correctly with Android Build Support, we need to get Google’s VR assets.
Download Google’s VR SDK here.
We’re looking to download the .unitypackage.
After we have the package downloaded, it’s time to add it to a Unity project.
For our case, we’re going to create a new project to play around with.
In Unity create a New Project (File > New Project…)
Once inside our new project, import everything from the package that we downloaded. In Unity we can import by going to Assets > Import Package > Custom Package.
Step 1.3: Configuring Unity to Run VR
Now that we have imported everything we need, the last thing to do is change some of our settings in Unity so we can run our game.
Change Our Build Setting to Build and Run Android Applications
The first thing we need to do is get Unity to run our game project on an Android platform.
Open the Build Settings by going to File > Build Settings.
Select Android and hit Switch Platform
Wait for Unity to finish re-importing our assets for the new platform
Change Our Player Settings to Support VR
The next thing we need to do is to change our Player Settings so that we can support the specific VR SDK that we want. In our case, it’s going to be Google Cardboard.
In Build Settings, next to Switch Platform, we have Player Settings, select it.
In Player Settings, enable Virtual Reality Supported and then add Cardboard to our Virtual Reality SDKs
Finally, in Minimum API Level, select API level 19 as the minimum Android version that players' devices must run. Google Cardboard requires a minimum of API level 19, and the Google Daydream Viewer requires a minimum of level 24.
Once we have everything installed, we can finally get started on taking a first look at working with VR!
Step 2: Looking Through the Unity Tutorial
Now that everything is configured, we can officially start looking through Google’s SDK Basics.
I went through the SDK basics while also going through the GVRDemo scene.
In our new project go to Assets > GoogleVR > Demos > Scenes and open GVRDemo
Google provides prefabs and scripts that will take care of the VR features for you. These are all located in Assets > GoogleVR > Prefabs and Assets > GoogleVR > Scripts. Here's a breakdown of what they and the scripts attached to them do:
GvrEditorEmulator prefab – Allows us to control our camera in the editor like how we might control it with our headset. Hold the Alt key to rotate your view around the camera.
GvrControllerMain prefab – Gives us access to the Daydream controller; with Google's controller API we can implement actions to interact with the game
GvrEventSystem prefab – Enables us to use Google’s input pointer system. Specifically, how our gaze/controller interacts and selects objects.
GvrPointerGraphicRaycaster script – This script is like a normal Graphic Raycaster that we would attach to a UI canvas so that we can interact with our UI using our input devices (gaze or controller)
GvrPointerPhysicsRaycaster script – This script shoots out a raycast directly from the middle of our screen to select something when we decide to click. We should attach this to our main camera. We must also attach Unity's Event Trigger system to each object we want to interact with when we select it.
GvrControllerPointer prefab – This is the Daydream controller. It gives us an arm asset to imitate our controller. This prefab must be a sibling of the Main Camera object where we attached our GvrPointerPhysicsRaycaster
GvrReticlePointer prefab – This is the Google Cardboard’s gaze controller. It creates a dot in the middle of our screen which we use to select objects that are in the game. For this prefab to work we must make it a child of the Main Camera game object.
There are quite a few other prefabs and scripts, but at a high level, these are the basics we'll need to make a VR game.
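Instead of dragging the prefabs into the scene, the same essentials can also be attached from a script. Here's a minimal sketch of that idea; the component names come from the SDK described above, but this particular setup script is my own illustration, not something from Google's package:

```csharp
using UnityEngine;

// A sketch: ensure the Cardboard essentials exist at startup.
// Assumes the Google VR SDK package has been imported into the project.
public class VrSceneSetup : MonoBehaviour {
  void Awake() {
    // The physics raycaster lets gaze clicks hit 3D objects with colliders.
    Camera mainCamera = Camera.main;
    if (mainCamera != null &&
        mainCamera.GetComponent<GvrPointerPhysicsRaycaster>() == null) {
      mainCamera.gameObject.AddComponent<GvrPointerPhysicsRaycaster>();
    }
  }
}
```

In practice the demo scene wires all of this up with prefabs in the hierarchy, which is the approach we'll look at next.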
Let’s see this in action with the GvrDemo scene!
Step 2.1: Looking at the demo scene
When we open up GvrDemo, here’s what we see:
I suggest that you explore the scene and look at the objects in our hierarchy yourself, but here's a high-level summary of what we have in the hierarchy that's relevant to just the Google Cardboard (the scene includes Daydream assets too):
GvrEditorEmulator for us to emulate head movement in VR
GvrEventSystem for Unity to detect our VR inputs when we select an object
Inside Player > Main Camera, we have our GvrPointerPhysicsRaycaster script which allows us to use Google’s raycasting system for 3D objects
Inside the Floor Canvas game object, we have the GvrPointerGraphicRaycaster for us to interact with the UI.
Finally, inside Player > Main Camera > GvrReticlePointer, we have our gaze cursor for Google Cardboard that we use to interact with the game world.
The main point of this game is to click on the cube that appears in the game. When we click on the cube, it’ll be randomly moved somewhere else in the game.
The interesting part of all of this is how we can trigger the code with our Gaze.
Let’s look at the Cube and Unity’s Event Trigger system.
The Event Trigger system is a way for Unity to recognize any action taken on the game object that the Event Trigger is attached to.
An action is something like:
OnPointerClick
OnPointerEnter
OnPointerExit
In our example, OnPointerClick will be triggered whenever we click on an object that has the Event Trigger attached to it.
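These same events can also be received without an Event Trigger component, by implementing Unity's pointer interfaces from UnityEngine.EventSystems directly on a script. This is a sketch of my own to show the idea, not code from the demo; the gaze pointer fires these handlers the same way a mouse pointer would:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: respond to pointer events (gaze or controller) in code.
// Attach to any game object with a collider.
public class GazeResponder : MonoBehaviour,
    IPointerEnterHandler, IPointerExitHandler, IPointerClickHandler {
  public void OnPointerEnter(PointerEventData eventData) {
    Debug.Log("Gaze entered " + gameObject.name);
  }

  public void OnPointerExit(PointerEventData eventData) {
    Debug.Log("Gaze left " + gameObject.name);
  }

  public void OnPointerClick(PointerEventData eventData) {
    Debug.Log("Clicked " + gameObject.name);
  }
}
```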
Here’s the teleport script:
// Copyright 2014 Google Inc. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
using UnityEngine;
using System.Collections;
[RequireComponent(typeof(Collider))]
public class Teleport : MonoBehaviour {
  private Vector3 startingPosition;
  public Material inactiveMaterial;
  public Material gazedAtMaterial;

  void Start() {
    startingPosition = transform.localPosition;
    SetGazedAt(false);
  }

  public void SetGazedAt(bool gazedAt) {
    if (inactiveMaterial != null && gazedAtMaterial != null) {
      GetComponent<Renderer>().material = gazedAt ? gazedAtMaterial : inactiveMaterial;
      return;
    }
    GetComponent<Renderer>().material.color = gazedAt ? Color.green : Color.red;
  }

  public void Reset() {
    transform.localPosition = startingPosition;
  }

  public void Recenter() {
#if !UNITY_EDITOR
    GvrCardboardHelpers.Recenter();
#else
    GvrEditorEmulator emulator = FindObjectOfType<GvrEditorEmulator>();
    if (emulator == null) {
      return;
    }
    emulator.Recenter();
#endif  // !UNITY_EDITOR
  }

  public void TeleportRandomly() {
    Vector3 direction = Random.onUnitSphere;
    direction.y = Mathf.Clamp(direction.y, 0.5f, 1f);
    float distance = 2 * Random.value + 1.5f;
    transform.localPosition = direction * distance;
  }
}
We can skip over the details of what the code does; the important thing I want to bring attention to is the set of public functions that are available:
SetGazedAt()
Reset()
Recenter()
TeleportRandomly()
Where are these called?
Well, if you look back at our Event Trigger that’s created in Cube, we set 3 event types:
Pointer Enter
Pointer Exit
Pointer Click
Then whenever any of these events occur, we call our public function.
In this example, when we look at our cube, we trigger the Pointer Enter event and call the SetGazedAt() function with gazedAt set to true.
When we look away, we trigger the Pointer Exit event and call SetGazedAt() with gazedAt set to false.
Finally, if we were to click on the cube, we would trigger the Pointer Click event and call TeleportRandomly() to move our cube to a new location.
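The Inspector wiring above can also be expressed in code using Unity's EventTrigger API, which makes the mapping between event types and the Teleport functions explicit. This is an illustrative sketch of my own, assuming a Teleport component sits on the same cube:

```csharp
using UnityEngine;
using UnityEngine.Events;
using UnityEngine.EventSystems;

// Sketch: programmatically wire the same three pointer events that the
// demo scene sets up in the Inspector on the Cube.
[RequireComponent(typeof(Teleport))]
public class CubeEventWiring : MonoBehaviour {
  void Start() {
    Teleport teleport = GetComponent<Teleport>();
    EventTrigger trigger = gameObject.AddComponent<EventTrigger>();

    AddEntry(trigger, EventTriggerType.PointerEnter, () => teleport.SetGazedAt(true));
    AddEntry(trigger, EventTriggerType.PointerExit, () => teleport.SetGazedAt(false));
    AddEntry(trigger, EventTriggerType.PointerClick, teleport.TeleportRandomly);
  }

  static void AddEntry(EventTrigger trigger, EventTriggerType type, UnityAction action) {
    EventTrigger.Entry entry = new EventTrigger.Entry { eventID = type };
    entry.callback.AddListener((data) => action());
    trigger.triggers.Add(entry);
  }
}
```

Either approach works; the Inspector route the demo uses is just the no-code version of the same registration.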
Conclusion
It's surprising how uncomplicated this whole process has been so far! I'm sure there are a lot more things to consider once we dive deeper into Unity; however, for today, I think the progress we've made is sufficient.
Tomorrow, we're going to look at how we can get the demo app to run on a phone that supports Google Cardboard (which I assume at this point is 99% of you guys here).
Day 33 | 100 Days of VR | Day 35