- 5 hours on-demand video
- 3 articles
- 4 downloadable resources
- Full lifetime access
- Access on mobile and TV
- Certificate of Completion
- Learn all the fundamentals of ARCore, from motion tracking to plane detection and light estimation.
- Measure distances in augmented reality with high accuracy.
- Fly a drone in AR without having to worry about crashing it into a tree.
- Transport yourself into another world with an AR portal.
- Immerse yourself in building your own planetary solar system in augmented reality.
- Use light estimation to create real-time horror-movie effects.
- Fly a dragon in AR.
- Go from virtual to reality with 4D volumetric hologram sequences.
This lecture will guide you through a brief introduction to what augmented reality is and what it is not.
In this lecture we’ll talk about the three fundamental technologies behind ARCore. Before delving into ARCore development, let’s first see what’s under the hood of Google’s augmented reality SDK and what makes these cool experiences possible.
In this lecture we’ll learn how to get started with ARCore and Wikitude in Unity, as well as set up our Android devices.
Today we are going to create our first Hello ARCore app and deploy it to our Android device. The app searches for a plane based on coplanar feature points and overlays Andy the Android onto that plane. The purpose is to test whether we can get bare-bones ARCore functionality working on a supported device.
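The course builds this in Unity C#; purely to illustrate the idea of coplanar feature points forming a plane, here is a minimal least-squares plane fit in Python (the function name and sample data are my own, not ARCore’s API):

```python
import numpy as np

def fit_plane(points):
    """Fit a plane to 3D feature points via least squares (SVD).

    Returns (centroid, unit normal). The singular vector with the
    smallest singular value is normal to the best-fit plane.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)

# Roughly coplanar feature points on a floor (y close to 0)
pts = [(0, 0.0, 0), (1, 0.01, 0), (0, 0.0, 1), (1, -0.01, 1), (0.5, 0.0, 0.5)]
center, n = fit_plane(pts)
print(center, n)  # normal points along +/- y for a horizontal plane
```

ARCore does considerably more than this (clustering, polygon boundaries, tracking over time), but the core notion is the same: points that fit one plane well get merged into a detected plane.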
Today we are going to play around with just the motion tracking component of ARCore. We will build our AR app from scratch, because the best way to understand an SDK is to look at its core components. In the app we build in this lecture, digital content will not be anchored to planes but will instead float in mid-air, since we are only using motion tracking and not the environmental understanding component. We’ll cover that in the next lecture. Okay, so let’s get started with our ARCore motion tracking app.
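To give a feel for what motion tracking buys you, here is a small Python sketch (names and conventions are mine, not ARCore’s) of spawning content relative to the tracked camera pose: given the camera’s position and rotation, a virtual object can be pinned a fixed distance in front of it with no plane anchor at all.

```python
import numpy as np

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, xyz = q[0], np.asarray(q[1:])
    v = np.asarray(v, dtype=float)
    t = 2.0 * np.cross(xyz, v)
    return v + w * t + np.cross(xyz, t)

def place_in_front(cam_pos, cam_rot, distance=1.0):
    """World position `distance` metres along the camera's forward axis.

    Assumes Unity's convention of +z as the camera's local forward.
    """
    return np.asarray(cam_pos, dtype=float) + rotate(cam_rot, [0.0, 0.0, distance])

# Identity rotation: the object lands 1 m straight ahead on the z axis
print(place_in_front([0, 1.5, 0], (1.0, 0, 0, 0)))  # [0.  1.5 1. ]
```

Because motion tracking continuously updates the camera pose, the spawned object appears fixed in space as you walk around it, even without any detected plane.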
Today we are going to build upon our last video on point and plane detection and add some hit-testing functionality. If you remember, in the HelloAR app, when we touched the screen we instantiated and anchored Andy the Android onto the plane. This uses a concept called ray casting to find where the touch coordinates on the screen intersect the plane we detected. To make this tutorial a bit more interesting, let’s add physics and mesh colliders to our plane as well as to our virtual objects so that they interact with each other.
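ARCore’s hit-test API performs this test against detected planes for you. To show the underlying math, here is a minimal ray-plane intersection sketch in Python (function and variable names are my own):

```python
import numpy as np

def raycast_plane(origin, direction, plane_point, plane_normal):
    """Return the point where a ray intersects a plane, or None.

    Solves dot(origin + t*direction - plane_point, plane_normal) = 0 for t.
    """
    d = np.asarray(direction, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    denom = d.dot(n)
    if abs(denom) < 1e-8:          # ray runs parallel to the plane
        return None
    t = (np.asarray(plane_point, dtype=float) - origin).dot(n) / denom
    if t < 0:                      # plane lies behind the ray origin
        return None
    return np.asarray(origin, dtype=float) + t * d

# Camera at 1.6 m height looking down-forward at the floor plane y = 0
hit = raycast_plane([0, 1.6, 0], [0, -1, 1], [0, 0, 0], [0, 1, 0])
print(hit)  # [0.  0.  1.6]
```

In the app, the ray is built from the touch point on the screen, and the returned hit point is where Andy (or a cube or sphere) gets anchored.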
Today will be a short tutorial on how to read pixel intensity values from the camera image to get an average light estimation value, and how to use that value to show or hide your 3D object. In this case we have a scary monster with glowing red eyes who creeps closer and closer to your camera every time you switch off your light switch.
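The course does this in C# on the ARCore camera image, but the arithmetic itself is simple. Here is the idea sketched in Python with NumPy (the threshold value is my own tuning guess, not from the course):

```python
import numpy as np

BRIGHTNESS_THRESHOLD = 0.25   # illustrative tuning value

def average_brightness(gray_image):
    """Mean intensity of an 8-bit grayscale frame, normalised to 0..1."""
    return float(np.mean(gray_image)) / 255.0

def monster_visible(gray_image, threshold=BRIGHTNESS_THRESHOLD):
    """Show the monster only when the room is dark."""
    return average_brightness(gray_image) < threshold

dark_frame = np.full((480, 640), 20, dtype=np.uint8)     # lights off
bright_frame = np.full((480, 640), 200, dtype=np.uint8)  # lights on
print(monster_visible(dark_frame), monster_visible(bright_frame))  # True False
```

In the real app you would read the estimate every frame and toggle the monster’s renderer, which is exactly the show/hide behaviour the lecture builds.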
Afraid of crashing your drone? In this tutorial I’ll show you how to take a drone-flying experience into augmented reality with ARCore in Unity. Taking apps into augmented reality is really quite simple once you get the hang of ARCore: you develop the game functionality in Unity as usual, then add one simple Unity package to instantiate your app in augmented reality. Let’s take a look at how easy this is.
In this tutorial I’ll be showing you how to implement your own holograms in augmented reality. The holographic models are 4D volumetric sequences captured using volumetric capture technology, and one of the main companies specializing in this is 4Dviews. Now why would you want a 4D model or volumetric sequence in your app? For a number of reasons. Firstly, instead of manually animating a 3D character to perform a complex sequence, you can record the same sequence using real people in a volumetric capture studio, which results in a much more realistic experience. For example, animating a Shaolin monk to perform complex movements would be challenging and time-consuming, whereas capturing the real deal performing those moves would be far simpler. This is all provided you have access to a volumetric capture system near you. At the moment these studios are scarce, but just like any other technology, studios like these could start popping up everywhere, or systems may even become cheap enough for you to buy your very own 4D volumetric capture rig.
It’s not only 4Dviews innovating in this space: Intel is investing millions, if not billions, into volumetric capture technology. If you watch their video on filmmaking in volumetric video, you capture a scene once and can thereafter choose any camera angle you desire, just as if you were dragging a camera around a 3D scene in Unity. This is great not only for filmmaking but also for sports, with Intel’s Free-D technology. This amazing tech allows the viewer to watch any angle of a sporting event in real time and really immerse themselves in the action. Another application off the top of my head is an AR business card, where you project a 4D hologram of yourself demonstrating or discussing a product. If you are a yoga instructor, for example, you could show your clients a pose and let them view it from any angle to better understand how to perform it. The world is your oyster here: you can apply this amazing technology to tourism, business, the arts, and engineering, to name a few.
Okay, so let me show you how to create your own 4D volumetric app using a demo model from 4Dviews.
Hey guys and welcome back. In this tutorial I’ll be showing you how to implement vertical plane detection using the latest ARCore 1.2 SDK. I will show you how to hang a virtual picture frame on your wall, which is a really simple task. So to make this lecture a bit more interesting, I will also show you how to integrate basic gesture control scripts into your main ARSceneController script. We will cover tap to spawn, which you have been doing throughout this course, pinch to zoom to make the picture larger or smaller, and rotating the image using a single touch. There are better and more robust gesture control scripts online and in the Unity Asset Store; however, the purpose here is to show you how to implement your own gesture control scripts so that you have a better fundamental understanding of how commercial assets work.

Before we get started, remember that you will require ARCore 1.2 or later, because vertical plane detection was only launched in this version at the Google I/O 2018 developer conference. You can also use assets such as Lean Touch or Finger Touch from the Unity Asset Store.
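The math behind pinch to zoom and single-touch rotation is compact. Here is a Python sketch of the two calculations (function names and the degrees-per-pixel factor are my own choices for illustration; in the course they live in C# touch handlers):

```python
import math

def pinch_scale(prev_touch_a, prev_touch_b, touch_a, touch_b, current_scale):
    """Scale by the ratio of current to previous finger separation."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    prev = dist(prev_touch_a, prev_touch_b)
    if prev == 0:                 # fingers started at the same point
        return current_scale
    return current_scale * dist(touch_a, touch_b) / prev

def drag_rotation(delta_x, degrees_per_pixel=0.25):
    """Map a horizontal single-finger drag to a yaw rotation in degrees."""
    return delta_x * degrees_per_pixel

# Fingers move from 100 px apart to 200 px apart: the frame doubles in size
print(pinch_scale((0, 0), (100, 0), (0, 0), (200, 0), 1.0))  # 2.0
```

Each frame you feed in the previous and current touch positions and apply the result to the picture frame’s transform, which is exactly what the lecture’s gesture scripts do.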
Want to learn how to build your OWN Spiderman EDITH Glasses AI prototype? Watch this tutorial series to see how I prototype my own glasses with an Intelligent AI.
Project EDITH Glasses DIY - Phase 1 : AI Face Detection (Spider-Man Far From Home) Hey, welcome back! I am really excited to start this new series, in which we will try to reproduce the tech from Spider-Man Far From Home: the EDITH glasses.
Not sure if you guys have seen the movie, but spoiler alert … essentially Tony Stark hands down his super-smart AI glasses called EDITH, which stands for Even Dead, I’m The Hero … classic Tony. These glasses give Peter Parker access to all of Stark tech and also have smart augmented reality capability.
This series will try to push the boundaries of existing technology and see how many features we can fit into this project. Acknowledgements to some really cool people who have worked on EDITH glasses: The Hacksmith and JLaservideo.
So before we dive into our approach to this technology, let’s check what has already been done. We are going to take augmented reality to the next level!
Let’s see how to make smart AI glasses in this video.
Hey guys and welcome back. In the last lecture we made our attempt at face detection in Phase 1 of the EDITH glasses DIY prototype. Essentially, we used Unity with OpenCV to perform AI face detection using a ResNet CNN. In this video we are going to build on top of this prototype and upgrade it with facial recognition.
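The course uses OpenCV for Unity in C#, but the same ResNet SSD face detector exists in plain OpenCV, where it produces a (1, 1, N, 7) output blob. Here is a Python sketch of how that raw output is typically parsed into pixel-space boxes (the helper name and synthetic data are mine):

```python
import numpy as np

def parse_detections(detections, frame_w, frame_h, conf_threshold=0.5):
    """Convert an SSD face detector's (1, 1, N, 7) output into pixel boxes.

    Each row is [image_id, class_id, confidence, x1, y1, x2, y2] with
    coordinates normalised to the 0..1 range.
    """
    boxes = []
    for det in detections[0, 0]:
        conf = float(det[2])
        if conf < conf_threshold:
            continue  # drop low-confidence false positives
        x1, y1, x2, y2 = det[3:7] * np.array([frame_w, frame_h, frame_w, frame_h])
        boxes.append((conf, int(x1), int(y1), int(x2), int(y2)))
    return boxes

# Synthetic output: one confident face, one low-confidence false positive
fake = np.zeros((1, 1, 2, 7), dtype=np.float32)
fake[0, 0, 0] = [0, 1, 0.98, 0.25, 0.25, 0.5, 0.5]
fake[0, 0, 1] = [0, 1, 0.10, 0.6, 0.6, 0.7, 0.7]
print(parse_detections(fake, 640, 480))  # one box survives the threshold
```

The surviving boxes are what the prototype hands to the recognition stage and to the HUD overlay that draws each person’s identity.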
As you can see from the demo, after training the model we are able to recognize the faces of three people and instantiate a graphical UI for each person’s identity. This brings us close to the Spider-Man E.D.I.T.H. glasses functionality that we saw in the movie. We are going to take augmented reality to the next level!
Hey guys and welcome back. In this tutorial, I’m going to show you how to add AI object detection functionality to our EDITH glasses prototype. What is cool about this phase is that we can detect multiple objects simultaneously and in real time, which makes it really easy to integrate the detections with a heads-up display (HUD) GUI animation.
If you missed the previous phases of Project EDITH: we prototyped face detection and facial recognition to identify people, similar to how it was done in Spider-Man Far From Home.
In this video, Phase 3, we are going to continue with object detection using a MobileNet AI model and OpenCV for Unity. There was an option to use YOLO, but it is more processing-intensive, and we have to keep processing power in the back of our minds should we wish to port all of this to a mobile platform like Android.
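When a detector returns several overlapping boxes for the same object, a standard cleanup step is non-maximum suppression. Here is a plain NumPy sketch of the idea (not the OpenCV for Unity API, which ships its own NMSBoxes helper):

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / float(area_a + area_b - inter)

def non_max_suppression(boxes, scores, iou_threshold=0.4):
    """Keep the highest-scoring box from each cluster of overlapping boxes."""
    order = np.argsort(scores)[::-1]
    keep = []
    while len(order) > 0:
        best = order[0]
        keep.append(int(best))
        # discard remaining boxes that overlap the winner too much
        order = [i for i in order[1:] if iou(boxes[best], boxes[i]) < iou_threshold]
    return keep

boxes = [(0, 0, 100, 100), (5, 5, 105, 105), (200, 200, 300, 300)]
scores = [0.9, 0.8, 0.95]
print(non_max_suppression(boxes, scores))  # [2, 0]
```

Without this step, the HUD would draw two or three jittery labels per object; with it, each detected object gets one stable box.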
E.D.I.T.H Glasses - Phase 4 Tutorial : Build your own JARVIS and EDITH AI (Spider-Man Far From Home) Hey guys and welcome back, I hope you enjoyed that little demo of Project EDITH Phase 4! This demo shows off an AI assistant capability, whereby we demo a design and development application using our EDITH glasses.
We were able to call on JARVIS or EDITH to assist us in the design of the EDITH smart glasses. From the demo, there are a few cinematic enhancements that you should be aware of. First off, the quality of the two text-to-speech (TTS) voices from IBM Watson differs.
Both of these voices come from the IBM Watson TTS engine. The better-sounding one is produced using an enhanced deep neural network (DNN) for text-to-speech synthesis, whereas the other uses traditional methods.
- Unity 3D SDK and basic experience with Unity
- ARCore-compatible phone (Samsung Galaxy S7 and later, Google Pixel and later) - you can get a refurbished S7 for $220 on Amazon
- $25 worth of required Unity assets - the bill of materials can be found in the FREE preview lectures or on GitHub
- $20 Optional Unity Assets - Total Required + Optional = $45
- ARCore 1.0 or 1.1 recommended, but also compatible with ARCore 1.2
Code updated for ARCore 1.x - ARCore now works cross-platform with ARKit
Do you want to learn the new ARCore 1.x SDK in Unity? This course will teach you all the fundamentals of ARCore augmented reality in the shortest time, so that you can get started developing your own augmented reality apps. In the app section, we show you how to apply the basics to more advanced apps, for either gaming or productivity.
I am an AR developer with a master’s degree in electronic engineering, and I have over 34,000 students on Udemy. This course is designed to help you understand the fundamentals of augmented reality using ARCore in Unity through practical and easy-to-understand labs. The class covers these capabilities, including getting started, simple and multiple target detection, smart terrain, as well as Leap Motion integration. You will learn all the fundamentals through practice as you follow along with the training. Together we will build a strong foundation in AR development in Unity with this training for beginners.
This course will enable you to:
We designed this course for anyone who wants to learn the state of the art in Google’s ARCore SDK without the steep learning curve. By the end of the course, you will understand the fundamentals of ARCore’s core functionality, from motion tracking to point and plane detection and even light estimation. Thereafter you will soon be developing your own amazing augmented reality apps for real-world applications.
In this course, we will begin with the fundamentals section by creating a simple Hello AR app that will help you quickly get something working.
Next, we start with the first core functionality, motion tracking, where you will learn how spatial tracking works in AR.
The second core functionality of ARCore is point and plane detection: you will learn how feature points are detected and how coplanar feature points form into planes.
Then we build a simple hit-testing app where you can spawn randomly colored cubes or spheres with a simple touch using ray casting.
Then we wrap up the basics with simple light estimation, making a monster appear or disappear at the flick of a physical light switch.
Then we move on to the exciting part of the course where we create some really amazing ARCore 1.4 apps.
In the first lecture you will build your own accurate measuring app using ARCore - an app with a lot of practical uses.
Thereafter we learn how to traverse within an AR scene by building a dragon that can move and fly in any direction as well as breathe fire.
Next, we take flying to another level by building a drone in augmented reality. It’s great practice to fly in AR before buying a real drone.
For education, we demonstrate how you can build your own solar system and how to use touch ray casting to bring up information on the planets, moons, and stars.
If this is not out of this world, then a portal app will definitely transport you to another place.
Thereafter we look at how we can import and use 4D volumetric sequences or volumetric holograms with ARCore to bring a sense of realism into your AR apps.
- Apply shadows using shaders for AR objects
- Perform vertical plane detection using ARCore 1.2 - 1.4
- Create a video wall using Augmented Images with ARCore 1.2 - 1.4
- Implement Cloud Anchors to experience multiplayer games using Photon
You will learn all this, and build many more exciting apps, in this course.
I’m sure by now you are really excited to get started with ARCore. However, it is important to note that before you enroll in this course, there are some training materials (Unity assets) and hardware you need to purchase beforehand to make this course the best learning experience. See the video for details.
Personal help within the course
I donate my time to regularly hold office hours with students. During the office hours you can ask me any question you want, and I will do my best to help you. The office hours are free; I don’t try to sell anything.
Students can start discussions and message me with private questions. I answer 99% of questions within 24 hours. I love helping students who take my courses and I look forward to helping you.
I regularly update this course to reflect the current AR landscape.
Get a Career Boost with a Certificate of Completion
Upon completing 100% of this course, you will be emailed a certificate of completion. You can show it as proof of your expertise and that you have completed a certain number of hours of instruction.
If you want to get a job or freelancing clients, a certificate from this course can help you appear as a stronger candidate for augmented reality work.
The course comes with an unconditional, Udemy-backed, 30-day money-back guarantee. This is not just a guarantee, it's my personal promise to you that I will go out of my way to help you succeed just like I've done for thousands of my other students.
Let me help you get fast results. Enroll now by clicking the button, and let me show you how to Dominate ARCore with 9+ AR apps.
- People who want to learn ARCore
- Students who want to get started in Augmented Reality and make really cool apps
- People who want to catch on to the latest game development trends with AR