Introduction to VR with Unity

Create an immersive Virtual Reality experience on iPhone/Android Cardboard or VR Device with Unity.
4.3 (192 ratings)
1,695 students enrolled
Created by Fred Moreau
Last updated 4/2019
English [Auto-generated]
Current price: $11.99 Original price: $159.99 Discount: 93% off
30-Day Money-Back Guarantee
This course includes
  • 7.5 hours on-demand video
  • 38 articles
  • 29 downloadable resources
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What you'll learn
  • Create a 3D VR project targeting a device as simple as iOS/Android cardboard.
  • Create immersive VR experiences with panoramic videos.

  • Create interactive VR gameplay with advanced Unity features, including Ray Casting and Navigation (Pathfinding).

  • Create interactive heads-up 3D user interfaces.
  • Add support for Game Controllers and Cardboard "Screen Touch" button.
  • Take advantage of Ambisonic Audio files.
  • Use Unity Remote to test things in the Editor.
  • Bypass Unity XR SDKs and use the Gyroscope, to test things in the Editor with Unity Remote.
  • Take advantage of Unity's Events to trigger actions on interactive objects, including loading scenes.
  • Use Unity's Animator State Machine along with Collider Triggers, to trigger animations when passing by.
Course content
108 lectures 07:40:47
+ Introduction
5 lectures 16:41

A short introduction of the author and a brief overview of the content.

Preview 01:53

VR is spreading fast, and Unity is doing its best to make its support as generic as possible.

VR is somewhat like a recipe, with mandatory and optional ingredients.

To follow this course, you'll need:

  • A Windows PC or Mac Computer, matching the minimum requirements to run Unity
  • Unity 2017.3 or a later version. You can use the free version; every feature we'll use is available in it.
  • An iPhone or Android phone with a Cardboard system, or a VR device you can develop with. Don't plan on using a PSVR, unless you have a Sony Developer account.
    • If you want to develop on iPhone,
      • you'll need at least an iPhone 5, running iOS 8 or later.
      • You'll also need an Apple iOS Developer account, a Mac computer, and the latest version of Xcode installed.
    • If you'd rather target Android,
      • you'll need a phone running Android 4.1 or later with a gyroscope, compatible with Google Cardboard or DayDream (see Google Daydream hardware requirements).
      • You will also have to install the Android SDK for Windows or MacOS.
    • If you plan on developing for a VR device, such as Gear VR, Oculus Rift, HTC Vive, or MS Hololens, make sure the hardware is supported on your development platform. Most of this hardware is not yet fully supported on Mac.
  • For Cardboard users, I'll also touch on using a wireless controller to implement interactivity and navigation.
    • If you develop on iPhone, make sure to get an MFI (Made for iOS) game controller.
    • If you have a non-MFI controller, such as a cheap remote or iCade device, I'll touch on hacking these with iOS.
    • If you develop on Android, make sure the controller is supported.
  • Any other VR-related device can be used. I'll do my best to touch on them as I get to test them. First on the list is the 3D Rudder, a foot controller.

I'll keep a list of hardware for reference here.

Preview 03:05

An overview of current and upcoming content.

Preview 01:00

Let's begin by creating a new project and having a quick overview of the Unity Editor's interface.

Project Setup & Editor Overview.

One of the first things to set up in a project is the target platform in the Build Settings, as well as the project's publisher info in the Player Settings.

(iPhone/Android/PC/Mac) Build & Player Settings.
+ VR Lobby
8 lectures 29:53

Let's begin the creation of our VR Lobby scene by placing the camera and adding a few objects.

Creating and placing Game Objects.

To differentiate objects, let's assign them a few materials with different colours.

Creating and assigning Materials.

Let's now import a texture to use as a panoramic background.

Importing textures.

To use the texture as a background, we need to create a special material, called a Skybox.

Creating and assigning a SkyBox Material.

To take advantage of automatic head tracking and stereo rendering, we simply need to add XR SDKs to the Player Settings.

Adding XR SDKs to use head tracking and stereo camera rendering.

Everything's now set up; let's build our project with Xcode and test it on an iPhone.

(iPhone) Building and running the VR test scene on the device.

To build your project as an APK you can push to your Android device, you need to:

  1. Go to Unity's Preferences, then External Tools
  2. In the Android section, click on Download next to the SDK path, and install Android Studio
  3. Then browse for the SDK root folder you just installed.
  4. Then install jdk1.8.0_161 from the link provided in the resources (not using the download button).
  5. Then browse to /Library/Java/JavaVirtualMachines/jdk1.8.0_161.jdk/Contents/Home.
  6. You shouldn't have to, but you can also install android-ndk-r13b using the download button, and browse to its location.
  7. Then run Android Studio, and click "Configure".
  8. Under SDK Tools, expand the SDKs and uncheck 25.0.3, then click on Apply.

Unity should now be able to build the project.

(Android) Build & Run.

Everything's now set up; let's build and run our project on Windows or macOS.

(Windows/Mac) Build & Run.
+ Adding a 360 audio/video background
4 lectures 10:56

If you've skipped the intro videos, this video will update you as to what we've done.

Preview 01:32

Let's begin with importing 360 video files.

Importing Video Files.

Unity's built-in Video Player allows us to play back video files on different objects.

Adding a Video Player to play back video content.

To render the video in our Skybox background, we're going to need an intermediate texture, called a RenderTexture.

Creating a Render Texture to *link* the Video Player to the Skybox Material.
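The Video Player / Render Texture / Skybox chain can also be wired in code. Here is a minimal sketch, assuming the Render Texture and a Panoramic skybox material have already been created (class, field names and the `_MainTex` property are illustrative assumptions, not the course's actual code):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hypothetical helper: links a Video Player to the Skybox through a Render Texture.
public class SkyboxVideo : MonoBehaviour
{
    [SerializeField] VideoPlayer videoPlayer;      // assumed assigned in the Inspector
    [SerializeField] RenderTexture renderTexture;  // the intermediate texture
    [SerializeField] Material skyboxMaterial;      // a panoramic skybox material

    void Start()
    {
        // Have the Video Player draw every frame into the Render Texture...
        videoPlayer.renderMode = VideoRenderMode.RenderTexture;
        videoPlayer.targetTexture = renderTexture;
        // ...and have the skybox sample that same texture as its panorama.
        skyboxMaterial.SetTexture("_MainTex", renderTexture);
        videoPlayer.Play();
    }
}
```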
+ Adding interactivity : what am I looking/pointing at?
15 lectures 01:54:37

A VR experience relies very much on what you're looking at, or pointing at with a remote controller.

In this section, we're going to introduce Unity scripting and C# standards, to have objects react to their position relative to the screen or the camera.

Preview 02:26

Let's begin by adding a few text objects above our selection items.

Scene Setup.

Using Prefabs allows us to manage similar objects within and across several scenes.

Using Prefabs.

Let's create our first C# script and set up MonoDevelop for proper C# standards.

Creating a new C# script and setting up our development environment.

"using", "public", "class", "MonoBehaviour", "void", what does it all mean ?

Overview of the MonoBehaviour and C# scripting.

Let's begin by accessing the object's Renderer component to change the colour of its Material.

Accessing and modifying another component.
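As a taste of what this looks like, here is a minimal sketch of accessing a sibling component from a script (the class name and colour animation are illustrative, not the course's actual code):

```csharp
using UnityEngine;

// Minimal sketch: fetch the Renderer sitting on the same Game Object
// and tint its Material.
public class ColourChanger : MonoBehaviour
{
    Renderer myRenderer;

    void Awake()
    {
        // GetComponent looks up another component on this Game Object.
        myRenderer = GetComponent<Renderer>();
    }

    void Update()
    {
        // Note: accessing .material instantiates a per-object copy of the Material.
        myRenderer.material.color = Color.Lerp(Color.red, Color.green,
                                               Mathf.PingPong(Time.time, 1f));
    }
}
```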

Let's use a Gradient property instead of a simple Color.

Using a Gradient object.

Calling methods such as gradient.Evaluate() on every frame isn't optimal performance-wise. Let's use a C# property to call it only when the position changes.

Using C# Properties to optimise the refreshing of values.
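The idea can be sketched like this: the property's setter re-evaluates the gradient only when the value actually changes (names and details are illustrative assumptions):

```csharp
using UnityEngine;

public class ObjectFocus : MonoBehaviour
{
    public Gradient gradient;   // colours from "out of focus" to "in focus"
    Renderer myRenderer;
    float focus = -1f;

    // The property body runs only when Focus is assigned, so the gradient
    // is evaluated on change, not on every frame.
    public float Focus
    {
        get { return focus; }
        set
        {
            if (Mathf.Approximately(value, focus)) return;
            focus = value;
            myRenderer.material.color = gradient.Evaluate(focus);
        }
    }

    void Awake() { myRenderer = GetComponent<Renderer>(); }
}
```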

In this lecture, we're going to reference another object's Transform component, that we'll use later to measure the object's relative position.

Referencing another object's (Transform) component.

Often, there are things we do all the time. Templates are great for keeping our code consistent and typing it faster.

Using Code Snippets (Templates) to easily and quickly add more properties.

To avoid losing track of what's left to do, Tasks allow us to quickly log remaining work with comments using special keywords.

Using MonoDevelop Tasks.

To have a better understanding of Vector maths, we're going to use some visual debug features and draw lines in the Scene View.

Vectors and Visual Debug features.

If 10 means 1 and 30 means 0... Wait! It's easy: the Mathf helper class does it all for you.

More maths, with inverse linear interpolations.
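Concretely, Unity's Mathf.InverseLerp does the "10 means 1, 30 means 0" remapping in one call. A small sketch (the class and method are illustrative):

```csharp
using UnityEngine;

public class FocusMaths : MonoBehaviour
{
    // If an angle of 10° means fully in focus (1) and 30° means out of focus (0):
    public float FocusFromAngle(float angle)
    {
        // InverseLerp returns how far "angle" sits between the two bounds,
        // clamped to 0..1 — and the bounds can be given in reverse order.
        return Mathf.InverseLerp(30f, 10f, angle);
        // 10 gives 1, 20 gives 0.5, 30 gives 0.
    }
}
```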

We're now going to add Unity Events to the Object Focus component, so that we can connect its values to other values in the Editor.

Using Unity Events to drive any other value.
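A serializable UnityEvent with a float argument is what makes the Inspector wiring possible. A minimal sketch, with illustrative names:

```csharp
using UnityEngine;
using UnityEngine.Events;

// A serializable event carrying a float lets us connect the focus value
// to any other value (a colour, a UI slider...) directly in the Inspector.
[System.Serializable] public class FloatEvent : UnityEvent<float> { }

public class FocusEvents : MonoBehaviour
{
    public FloatEvent onFocusChanged;

    public void SetFocus(float focus)
    {
        // Every listener wired up in the Inspector receives the new value.
        onFocusChanged.Invoke(focus);
    }
}
```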

While the interpolation has so far been linear, we could ease it in and out ourselves; instead, we're going to use Unity's built-in Animation Curves, which not only make this easier but also provide more control.

Using Animation Curves to ease in and out the linear interpolation.
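In code, an AnimationCurve field shows up in the Inspector as an editable curve, and easing is a single Evaluate call (names here are illustrative):

```csharp
using UnityEngine;

public class EasedFocus : MonoBehaviour
{
    // Edited visually in the Inspector; EaseInOut gives a smooth
    // start and end between (0,0) and (1,1).
    public AnimationCurve easing = AnimationCurve.EaseInOut(0f, 0f, 1f, 1f);

    public float Ease(float linearValue)
    {
        // Remap the linear 0..1 value through the curve.
        return easing.Evaluate(linearValue);
    }
}
```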
+ Previewing actions on interactable items.
4 lectures 38:33

To easily trigger actions on the object that is the most in focus, we're going to create a new manager script to look after all items.

Object Focus Manager.

A Singleton is a design pattern where a component has a unique instance that can easily be accessed at any time.

Implementing the manager as a Singleton.
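A common Unity take on the Singleton pattern looks like this (a minimal sketch; the course's actual manager has more responsibilities):

```csharp
using UnityEngine;

// Minimal Unity-style singleton: one instance, reachable from anywhere.
public class ObjectFocusManager : MonoBehaviour
{
    public static ObjectFocusManager Instance { get; private set; }

    void Awake()
    {
        // If another instance already exists, this duplicate destroys itself.
        if (Instance != null && Instance != this) { Destroy(gameObject); return; }
        Instance = this;
    }
}
// Any other script can then reach it as ObjectFocusManager.Instance.
```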

Methods and members declared static are shared amongst all instances and a lot easier to access from other components.

Static methods.

Let's add events to change the display of the object currently in focus.

Focus Events.
+ Triggering actions.
5 lectures 32:00

Now that we have access to the currently highlighted object, we're going to trigger actions on it. Let's begin with adding a Progress Bar image.

Preview 05:30

If you're using the Cardboard with no game controller attached, then a single touch on the screen, using the Cardboard button, is our only option to trigger an event.

Handling touch screen inputs.
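Since the Cardboard button simply presses the screen, Unity reports it as a regular touch. A minimal sketch of polling it (names are illustrative):

```csharp
using UnityEngine;

public class TouchTrigger : MonoBehaviour
{
    void Update()
    {
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);
            // The touch phase tells us whether the press just started or ended.
            if (touch.phase == TouchPhase.Began)
                Debug.Log("Trigger pressed");
            else if (touch.phase == TouchPhase.Ended)
                Debug.Log("Trigger released");
        }
    }
}
```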

Let's make the connections now between the Input touch and the currently highlighted item.

Action Trigger.

And to wrap this up, we need to properly cancel the trigger when the user looks away.

Cancelling the Trigger.

Now that the input system is in place, let's use it to load another scene.

Loading a Scene.
+ Supporting other input types.
5 lectures 37:28

Let's begin with making sure our gamepad is properly connected and recognised by Unity, using Unity Remote along with the Game Pad.

Handling wireless gamepads.

Unity supports monitoring keyboard keys and game pad buttons in two different ways: using key codes or Input Settings.

Let's add support for key codes first.

Using Keycodes.
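Polling a gamepad button by key code is a one-liner. A small sketch (button mappings vary per controller and platform, so JoystickButton0 is an assumption, not a guarantee):

```csharp
using UnityEngine;

public class GamePadButtons : MonoBehaviour
{
    void Update()
    {
        // JoystickButton0 is typically the "A" button on common gamepads.
        if (Input.GetKeyDown(KeyCode.JoystickButton0))
            Debug.Log("Button pressed");
        if (Input.GetKeyUp(KeyCode.JoystickButton0))
            Debug.Log("Button released");
    }
}
```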

Whenever several classes share a fair amount of mechanics, Object Oriented Programming, and especially inheritance, allows us to gather the common code in a parent class. This not only makes it easier to implement the mechanics in different components, but also easier to maintain.

Using abstract classes to gather common mechanics.
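The shape of the idea can be sketched as follows: the parent class owns the shared logic, and each child only answers one question (class names are illustrative, not the course's actual code):

```csharp
using UnityEngine;

// The shared mechanics live in the abstract parent class...
public abstract class InputManager : MonoBehaviour
{
    // ...while each child only has to answer "is the button down right now?"
    protected abstract bool GetButtonDown();

    protected virtual void Update()
    {
        if (GetButtonDown())
            Debug.Log("Action triggered");
    }
}

// A concrete child is now tiny.
public class KeycodeInputManager : InputManager
{
    public KeyCode key = KeyCode.JoystickButton0;
    protected override bool GetButtonDown() { return Input.GetKeyDown(key); }
}
```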

With all the input mechanics now part of a parent class, implementing it in the Game Pad controller component becomes very simple.

Adding events to the GamePad Manager.

Sometimes it's easier to use Unity's built-in Input class than key codes. Again, using our parent class makes it super easy to implement a different version.

Using Unity's Input class to support VR Device controllers.
+ XR Specifics.
1 lecture 09:45

Unless you're using Room Scale Tracking, we have no idea of the cardinal direction the Gyroscope will be initialised in. We're going to use XR features, Scene Manager Events and make the object Persistent so that we "recenter" the orientation on every new scene load event.

Recentering the sight when a scene is loaded.
+ Cardboard Specifics.
2 lectures 09:25

When using the Cardboard, you can't preview the XR tracking in the Editor. To work around this and be able to preview things with Unity Remote, we're going to use Unity's Gyroscope native support instead.

Using the Gyroscope instead of XR tracking when on Cardboard in the Editor.
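A minimal sketch of driving a camera from the gyroscope; the quaternion conversion shown is a commonly used one for mapping the right-handed gyro attitude into Unity's left-handed space (not necessarily the exact code from the lecture):

```csharp
using UnityEngine;

public class GyroCamera : MonoBehaviour
{
    void Start()
    {
        // The gyroscope must be enabled explicitly.
        Input.gyro.enabled = true;
    }

    void Update()
    {
        // Common conversion from the right-handed gyro attitude
        // into Unity's left-handed coordinate space.
        Quaternion q = Input.gyro.attitude;
        transform.localRotation = Quaternion.Euler(90f, 0f, 0f) *
                                  new Quaternion(q.x, q.y, -q.z, -q.w);
    }
}
```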

When using the Cardboard, the current XR SDKs don't provide any feedback in the Editor. Now that we have enabled tracking using the Gyroscope, let's go a little further and add support for stereoscopic rendering.

Stereoscopic Rendering in the Editor when using the Cardboard.
+ Navigating 3D Worlds
24 lectures 02:17:33

Let's begin with a quick overview of the ways we can navigate a 3D world with Unity.

Preview 02:43

To experiment with scene navigation, we're going to use an environment from the Asset Store.

Preview 02:21

Let's begin with fixing issues and cleaning the scene a bit.

Scene setup.

Before we dash into coding, let's have a look at our strategy.

Preview 04:11

Now, let's begin with a simple vector looking forward..

Looking forward...

Now that we have the heading direction, we're going to find the intersection with the first Collider we hit in the scene.

Using Ray Casting to find an intersection.
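A minimal ray-casting sketch: cast from the camera along its forward vector and read back the hit (the class name and 100-unit range are illustrative):

```csharp
using UnityEngine;

public class Sight : MonoBehaviour
{
    void Update()
    {
        // Cast a ray from this object, along the direction it's facing.
        Ray ray = new Ray(transform.position, transform.forward);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, 100f))
        {
            // hit.point is where the ray meets the first Collider;
            // hit.collider tells us which object was hit.
            Debug.DrawLine(ray.origin, hit.point, Color.green);
        }
    }
}
```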

Layers allow us to ignore sets of objects, so that the ray can look through certain categories of objects, such as characters, vehicles, etc.

Ignoring intersections with set layers.

Colliders, when set as Triggers, are usually not meant to block navigation. Let's let the ray cast look through them.

Looking through trigger colliders.

We're now going to "bounce" off walls and obstacles to avoid going to close to them.

Bouncing off walls..

Let's use Ray Casting again to find a position on the ground.

More Ray Casting ...

Navigation, part of the "AI" features of game development, allows us to navigate a 3D world using precomputed navigation data to find the shortest, easiest route from point A to point B. Setting it up in Unity is fairly simple.

Baking a NavMesh.

With the NavMesh now available, we're going to locate the closest position within the NavMesh, from the ground position we have.

Finding a position within the NavMesh.
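This lookup maps directly to NavMesh.SamplePosition. A minimal sketch, with illustrative names and a 2-unit search radius chosen arbitrarily:

```csharp
using UnityEngine;
using UnityEngine.AI;

public class NavMeshDestination : MonoBehaviour
{
    // Returns true if a point on the NavMesh exists close enough
    // to the ground position we ray-casted earlier.
    public bool TryFindDestination(Vector3 groundPosition, out Vector3 destination)
    {
        NavMeshHit hit;
        if (NavMesh.SamplePosition(groundPosition, out hit, 2f, NavMesh.AllAreas))
        {
            destination = hit.position;  // closest point on the NavMesh
            return true;
        }
        destination = groundPosition;
        return false;
    }
}
```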

In this lecture we are going to use Pro Builder (free extension available from the Asset Store) to create a 3D cone object.
You can import the Reticle-Cone package if you want to skip this part.

Then we'll code a custom shader to display a 2D reticle texture with alpha, slightly off its render position. This is known as Decal rendering.
You can also use the shader file provided in the resources.

Creating a 3D reticle object.

Let's now position the reticle object at the estimated destination, and deactivate it when we don't have one.

Positioning the reticle.

We're now going to set up a NavMesh Agent, and give it the target location for destination, with a method we can easily test in the Editor.

Setting up the Nav Agent.
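Once the agent exists, sending it somewhere is one call. A small sketch (names are illustrative; exposing the method publicly is what makes it easy to test from the Editor or wire to events):

```csharp
using UnityEngine;
using UnityEngine.AI;

[RequireComponent(typeof(NavMeshAgent))]
public class Navigator : MonoBehaviour
{
    NavMeshAgent agent;

    void Awake() { agent = GetComponent<NavMeshAgent>(); }

    // Public, so it can be called from an event or while testing in the Editor.
    public void GoTo(Vector3 destination)
    {
        // The agent computes a path on the baked NavMesh and starts moving.
        agent.SetDestination(destination);
    }
}
```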

With this new component and the previous components we've made in earlier chapters, we're now going to be able to put together a First Person Camera Navigation system.

Setting up the navigation with XR devices.

When using a VR device, such as the HTC Vive, we can track controller positions and rotations. Let's use this to aim at a target position with the controller instead of the Camera.

Using VR "Nodes" (controllers).

Sometimes we want to force-disable room-scale tracking. This is useful in car racing games, for example: you don't want the player's head to poke out through the car's roof.

Disabling XR Camera position tracking.

Now that we can navigate our environment, it'd be nice to open doors instead of just walking through them. And this is going to be fairly simple with Animator, and Trigger Colliders.

Preview 09:43

Let's begin with detecting when an object enters and leaves the Trigger Collider.

Trigger detection.
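Trigger detection boils down to two callbacks Unity invokes for us. A minimal sketch (the class name is illustrative; it requires a Collider with "Is Trigger" ticked on the same Game Object):

```csharp
using UnityEngine;

public class DoorTrigger : MonoBehaviour
{
    // Called once when another Collider enters this trigger volume.
    void OnTriggerEnter(Collider other)
    {
        Debug.Log(other.name + " entered the trigger");
    }

    // Called once when that Collider leaves again.
    void OnTriggerExit(Collider other)
    {
        Debug.Log(other.name + " left the trigger");
    }
}
```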

It's now as simple as changing the state of the Animator parameter when an object enters and leaves the Trigger.

Animator parameter change.

As we may have other objects in the scene, we may want to filter the trigger so that it only triggers the Animation with a given object. We can easily do this with tags.

Filtering with tags.

Let's add a few optimisations to make the component even more generic, and apply the changes to all doors.

Applying changes to all doors, and adding a few optimisations.

We can also use Unity Events, to trigger any action.

Triggering Events.
Requirements
  • Unity 2017.3 or later.
  • If targeting iPhone: an Apple iOS Developer account, a Mac computer and Xcode.
  • If targeting Android: the Android SDK (free).
  • As an option, if targeting iPhone: an MFI (Made For iPhone) bluetooth Game Controller.
  • As an option, if targeting Android: an Android bluetooth Game Controller.
  • If targeting Windows/Mac: a supported VR device (Oculus Rift, HTC Vive, Microsoft HoloLens).

This course aims to help anyone willing to learn Unity create VR experiences.

No previous programming experience is required, and most of the principles covered in the course will help future programmers wrap their head around programming basics.

It features a self-learning approach: every topic comes in on a need-to-know basis.

Most of the course examples can be done with the simplest hardware.

Whether you want to experiment with a simple Android or iPhone cardboard, add a remote game controller, or go for pro hardware, the principles, techniques and code you'll take away from this course will help you deliver a full VR experience, fast!

Who this course is for:
  • VR enthusiasts willing to learn Unity
  • Unity developers willing to learn VR features