The Complete Spark AR Course: Build 10 Instagram AR Effects

Become a Spark AR Developer by Building 10 Augmented Reality Camera Effects for Instagram using Spark AR Studio.
Bestseller
4.4 (173 ratings)
602 students enrolled
Created by Ryan Hafner
Last updated 6/2020
Language: English
Captions: English [Auto]
Price: $39.99
30-Day Money-Back Guarantee
This course includes
  • 7 hours on-demand video
  • 16 articles
  • 11 downloadable resources
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What you'll learn
  • Spark AR Studio - the most advanced augmented reality creation tool available
  • Create and publish your own augmented reality camera effects for Instagram (and Facebook)
  • Build augmented reality camera effects for your brand or clients
  • Be one of the first developers to join the AR revolution, an industry expected to be worth $165 billion by 2024
Course content
122 lectures • 07:10:32 total length
+ Spark AR Studio - Introduction
3 lectures 03:19

Welcome to the Complete Spark AR Course. In this video, we lay out the structure of the course, including an overview of the different sections and the effects you'll be creating.

Preview 02:37

This short video covers the software and hardware required to participate in this course. It also includes recommendations for other tools and skills that will help you progress more quickly.

Preview 00:33
Student Success
00:09
+ Spark AR Studio - Getting Started
2 lectures 12:54

This lecture goes over the process of downloading and installing Spark. Then, we explore the Spark interface to get familiar with the controls, menus, and navigation.

This lesson assumes zero knowledge of Spark AR Studio. If you are already familiar with the software, feel free to skip this section.

Preview 12:38
Troubleshooting Tips
00:16
+ AR Camera Effect #1: Masks
5 lectures 36:17
Intro
00:27

This lecture covers the Face Tracker capability and the FaceMesh object. The FaceTracker allows content to be tracked to a user's face. The FaceMesh is a built-in 3D asset that can be used as an overlay for a tracked face.

Face Mesh
01:57

Learn how we use Materials in Spark. Create, adjust, and compare the different types of Materials available.

Materials
13:26

Learn how to import and use .png or .jpg images as Textures in your Spark project.

Textures
05:30

A quick look at programming with the Patch Editor - Spark's visual programming interface. We'll create the ability for users to tap the screen to change the Material that's currently visible on their face.

Patch Editor
14:57
+ AR Camera Effect #2: X-Ray
9 lectures 34:29
Intro
00:24

Introduction to 3D objects/meshes and programs for downloading, creating, editing, and exporting them for use in Spark.

3D Formats & Software
01:38

In this lecture, we import a 3D object from Sketchfab, Spark's integrated third-party 3D library. We then position our model to fit our scene and give it a new Material.

3D Objects
06:26

Learn about 2D elements and how to organize them into Layers to create backgrounds or overlays.

Layers, Canvas, & 2D Elements
02:48

In this lecture, we explore occlusion and how to use it to selectively hide/reveal elements of our scene.

Occlusion
04:55

This video continues our exploration of the Patch Editor through animating our occluder plane. We also look at Null Objects, which are invisible containers that allow us to group multiple objects together for use in animation.

Animation Patch Flow & Null Objects
07:08

In this lecture, we learn how to selectively prevent movement of certain scene objects by counteracting some of our FaceTracking transforms.

Face Rotation
04:24

To wrap up our effect, we use the Patch Editor to link the rotation of one of our scene objects to the Mouth Open Face Gesture, which tracks our user's mouth openness.

Face Gesture - Open Mouth
03:56
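
The lecture builds this link with patches, but for reference the same mouth-openness connection can also be made in script. Below is a minimal sketch assuming the newer async scripting API and the Face Tracking capability; the object name 'skullJaw' and the 0.6 radian multiplier are placeholders, not values from the lecture.

    const Scene = require('Scene');
    const FaceTracking = require('FaceTracking');

    (async function () {
      // 'skullJaw' is a placeholder - use the name of the scene object you want to rotate
      const jaw = await Scene.root.findFirst('skullJaw');

      // mouth.openness is a ScalarSignal that goes from 0 (closed) to 1 (fully open)
      const openness = FaceTracking.face(0).mouth.openness;

      // Drive the rotation (in radians) directly from the openness signal
      jaw.transform.rotationX = openness.mul(0.6);
    })();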

This lecture examines the Project Settings menu, including our Effect Destination (Facebook vs Instagram) and our Effect Capabilities.

Project Settings & Capabilities
02:50
+ AR Camera Effect #3: Emoji
8 lectures 32:54
Intro
00:20
Intro
02:04

Combining our knowledge of Materials with some new Lighting techniques, we create a cartoonish aesthetic for a 3D object.

Cartoon Aesthetic & Lighting
03:49

In this lecture, we learn how to use Planes to add more cartoon content to our scene.

Planes
07:36

We add a HandTracker object to our scene, and use it to anchor elements to our user's hand.

HandTracker
03:52

Learn how to use Face Gesture Patches to tie Positions, Rotations, or Scales to Blinking or Eye Openness.

Face Gesture - Blinking
07:28

Learn how to use Facial Landmark Patches to tie Positions, Rotations, or Scales to your user's eyebrows.

Facial Landmark - Eyebrows
03:14

Here, we discuss Blocks, which are sections of a project that we can group and export for use in the same or other projects.

Blocks
04:31
+ AR Camera Effect #4: Error
10 lectures 33:03
Intro
00:27
Intro
00:28

In this lecture, we convert a GIF to an Image Sequence and a Sprite Sheet to compare the two methods for adding dynamic textures to our scene.

Image Sequences & Sprite Sheets
04:23

Here, we learn how to control our Image Sequences using the Patch Editor, including how to reverse a GIF.

Frame Transition Patch Flow
01:36

In this video, we add a Particle System to our scene and look at how to adjust its settings to achieve the effect we want.

Particles
05:09

Use Patches to control the Particle Emitter.

Particle Trigger Patch Flow
02:19

Add Custom Instructions to an effect to prompt users to interact with the scene in specific ways.

Custom Instructions
02:15

In this lecture, we'll look at importing Patch Groups created by other people. We'll use two shaders (covered in detail in the next section) to create a glitch effect.

Patch Groups & Intro to Shaders
11:33

Spark AR lets you manipulate user audio. We'll take a quick look at the Audio Delay Patch as an example of how this can be done.

Audio Patch Flow
02:25
Saving Tips & Outro
02:28
+ AR Camera Effect #5: Shaders
20 lectures 01:25:52
Intro
00:25
Intro
00:16

How do Shaders work? What do they even do? The answers to these questions, and more, next.

Shader Theory - Color Manipulation
04:48

In this video, we apply some of our new color manipulation theory to create a Black & White filter.

Color Manipulation Examples - Black & White Filter
02:19

Continuing our color manipulation examples, we invert our camera feed's color profile.

Color Manipulation Examples - Inverse
03:05

The Spark AR Community Facebook page is a great resource for novice and experienced AR Creators alike.

Intro to the Spark AR Community
00:48

In this lecture, we create a Chromatic Aberration filter.

Color Manipulation Examples - Chromatic Aberration
13:58

Here, we export our Chromatic Aberration Patch Group for use in other projects.

Creating Exportable Patch Assets
02:41

The Reduce Channels Patch Group is provided in your effect resources.

Color Manipulation Examples - Reduce Channels
01:04

We'll go over the second half of Shader Theory, which involves reassigning pixel colors based on the pixel's position in an image.

Shader Theory - UV Manipulation
03:33

We take a quick look at the Pixelate and VCR Glitch Patch Groups used in the Error Effect Tutorial.

UV Manipulation Examples - Pixelate & VCR Glitch (Dynamic Text)
08:24

This marks the halfway point. Get yourself a glass of water and/or a snack and get ready to continue grinding through this section.

Intermission
02:05

We'll finish our UV Manipulation examples by creating the Solarize Patch Group.

UV Manipulation Examples - Solarize
03:15

A Look-Up Table, or LUT, is a special Shader that applies the colors from one texture to the UV coordinates of another. In this video, we use a LUT Patch Group with a simple 2-color Gradient that we create.

Hybrid Examples - Intro to LUTs & Simple 2 Color Gradient
04:13

Continuing our LUT filters, we'll use a texture to recolor our camera feed.

Hybrid Examples - Image-based Gradient
01:21

Finishing up our LUT filters, we'll create an animated Gradient using HSV Color Space to recolor our camera feed.

Hybrid Examples - Intro to HSV Color Space & Animated Gradient
12:10
Scripting - NativeUI Picker (Code update 4/2020 & Patch update 6/2020)
02:45

In this lecture, we'll explore the NativeUI Picker, which allows the user to select from our 10 filters. I've included some NativeUI example code in case you're unable to get it from Spark's website.

Scripting - NativeUI Picker
15:54
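
If you can't pull the sample code from Spark's website, the sketch below shows the general shape of a NativeUI Picker script, assuming the post-4/2020 async scripting API. The texture names and the 'filterIndex' patch input are placeholders; the lecture's code wires up all 10 filters.

    const NativeUI = require('NativeUI');
    const Textures = require('Textures');
    const Patches = require('Patches');

    (async function () {
      // Placeholder icon names - match them to the textures in your Assets panel
      const icons = await Promise.all([
        Textures.findFirst('iconBlackWhite'),
        Textures.findFirst('iconInverse'),
        Textures.findFirst('iconSolarize'),
      ]);

      const picker = NativeUI.picker;
      picker.configure({
        selectedIndex: 0,
        items: icons.map(texture => ({ image_texture: texture })),
      });
      picker.visible = true;

      // Forward the selected index to a patch input named 'filterIndex' (placeholder)
      await Patches.inputs.setScalar('filterIndex', picker.selectedIndex);
    })();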

We'll quickly look at the Mix Patch Group to add a grain to our VCR Glitch.

Mix Patch Flow
01:55
Outro
00:53
+ AR Camera Effect #6: Segmentation
8 lectures 32:41
Intro
00:16
Intro
00:51

In this lecture, we learn how to separate our user from their background.

Segmentation
02:20

Create background and overlay layers for our effect. We'll also be adding to our segmentation effect to create an outline around our user.

Backgrounds & Overlays (Layering)
05:02

We'll see how we can use Null Objects to make objects orbit around our user's head. We'll then use Occlusion to hide those objects as they pass behind our user.

Head Occlusion
07:15

This video will show how to properly export animated 3D models for use in Spark.

3D Animations - Export Settings for Blender
04:10

Once we have our animated 3D model imported, we'll need to use the Patch Editor to control the animation.

3D Animations - Animation Patch Flow
12:04
Outro
00:43
+ AR Camera Effect #7: Like Me
14 lectures 56:03
Intro
00:28
Intro
00:52

Here, we'll use the PlaneTracker to add a 3D scene to our back-facing camera.

PlaneTracker
00:45

Add a glow effect to a 3D object using a textured plane.

3D Glow Effect
02:25

To view some effects, you'll need to use Spark's Send to Device feature, which we'll discuss in this lecture.

Previewing Effect on Device
00:51

Shadows can go a long way in making an effect seem more real, especially when using the PlaneTracker. We'll look at different techniques for creating shadows, including dynamic shadows for animated 3D content.

Shadows
05:54

In this lecture, we look at creating Touch Gestures with Patches. This is a temporary substitute for video content that will be uploaded soon.

Manipulator Patch Flow
03:23

To allow our user to manipulate our PlaneTracker scene, we'll need to set up a script for creating Touch Gestures.

Scripting - Touch Gestures - Setup
05:00

In this video, we'll begin with the least complicated of the 3 Touch Gestures - the Two-Finger Rotate Gesture.

Scripting - Touch Gestures - Rotate Gesture
04:58
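
As a reference point, the rotate handler usually looks something like the sketch below (closely modeled on Spark's own World AR sample script). It assumes the Touch Gestures capability is enabled and uses a placeholder null-object name.

    const Scene = require('Scene');
    const Reactive = require('Reactive');
    const TouchGestures = require('TouchGestures');

    (async function () {
      // 'sceneContainer' is a placeholder for the null object holding your PlaneTracker content
      const container = await Scene.root.findFirst('sceneContainer');

      TouchGestures.onRotate().subscribe((gesture) => {
        // Pin the current rotation, then add the (inverted) two-finger twist on top of it
        const lastRotationY = container.transform.rotationY.pinLastValue();
        container.transform.rotationY = Reactive.add(lastRotationY, gesture.rotation.mul(-1));
      });
    })();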

Continuing our Touch Gesture script, we'll add Pinch-to-Scale. We'll also look at ways to limit the amount our user can scale our scene.

Scripting - Touch Gestures - Pinch to Scale Gesture
05:43

In this lecture, we'll create a simple Pan gesture, which allows our user to move around our PlaneTracker scene. Then, we'll look at a method for improving this Gesture using a sensitivity value and the DeviceMotion Module.

Scripting - Touch Gestures - Pan Gesture
12:44

Similar to our 3D object glow effect, we'll look at ways to create a glowing material for a FaceMesh.

FaceMesh Glow Effect
03:36

We'll finish this effect with some particles triggered by our user opening their mouth.

Particles
06:48
Outro
02:36
+ AR Camera Effect #8: DDR Game
12 lectures 22:51
Intro
00:29
Intro
00:59

In this section, we work from a mostly finished project file. So, to start, we'll walk through the Scene Hierarchy and the general setup of our scene.

Scene Setup
02:36

We covered Bridging in the Shaders effect, but here we'll go into more detail on how to use it to pass data between the Patch Editor and our script.

Patch Script Bridging
02:26
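
In case it helps to see both directions at once, here is a minimal bridging sketch; the variable names 'score' and 'gameOver' are placeholders and have to match the variables you expose between the Patch Editor and the script in the script asset's properties.

    const Patches = require('Patches');
    const Diagnostics = require('Diagnostics');

    (async function () {
      // Patch Editor -> script: read a scalar produced by the patch graph
      const score = await Patches.outputs.getScalar('score');
      Diagnostics.watch('Current score', score);

      // Script -> Patch Editor: push a boolean for the patch graph to consume
      await Patches.inputs.setBoolean('gameOver', false);
    })();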

Spark requires audio to be formatted as a .m4a file. We'll go through methods for converting audio formats so they import into Spark successfully.

Audio - Converting to m4a
00:57

Once we have our audio imported, we'll need to set up Patches to be able to play each sound effect when needed.

Audio - Audio Patch Flow
02:57

Now that our scene is fully set up, we can begin going through the Script. Here, we'll be importing the necessary modules and organizing our scripting assets.

Scripting Walkthrough - Importing Modules & Initializing Variables
02:02

In this video, we'll look at the variables being passed between our Script and Patch Editor.

Scripting Walkthrough - Gestures, Script Patch Bridging & Animations
01:30

The main part of our Script is the game logic, which handles creating and displaying prompts to our user, keeping score, and determining if the user has won or lost. This is Part 1.

Scripting Walkthrough - Game Logic
02:37

This is Part 2.

Scripting Walkthrough - Game Logic 2
03:21
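
To give you a feel for the structure before you watch, the scoring loop boils down to something like this plain-JavaScript sketch (not the course's actual script; the thresholds and names are placeholders):

    const WIN_SCORE = 10;   // placeholder win threshold
    const MAX_MISSES = 3;   // placeholder lose threshold
    let score = 0;
    let misses = 0;

    // Called whenever the player taps an arrow; currentPrompt is set by the prompt-spawning code
    function onPlayerInput(direction, currentPrompt) {
      if (direction === currentPrompt) {
        score += 1;
      } else {
        misses += 1;
      }

      if (score >= WIN_SCORE) {
        endGame(true);
      } else if (misses >= MAX_MISSES) {
        endGame(false);
      }
    }

    function endGame(won) {
      // In the real effect, this is where the win/lose state gets sent to the Patch Editor
    }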

We'll learn about the Persistence Module, which lets us save data on our user's device. In this effect, we'll use it to save and load high score data for our game. We'll also look at other potential uses for this capability.

Scripting Walkthrough - Persistence - Saving and Loading Scores
01:35
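
A minimal sketch of the save/load pattern is below, assuming the Persistence capability is enabled; the 'highScore' key is a placeholder and, as the outro covers, it has to be whitelisted in the capability's settings.

    const Persistence = require('Persistence');
    const Diagnostics = require('Diagnostics');

    const storage = Persistence.userScope;   // data stored per user, per effect
    const KEY = 'highScore';                 // placeholder key - must be whitelisted

    (async function () {
      // Load any previously saved data; fall back to null on the first run
      const saved = await storage.get(KEY).catch(() => null);
      const best = saved ? saved.score : 0;
      Diagnostics.log('Best score so far: ' + best);

      // Save a plain, JSON-serializable object back to the device
      await storage.set(KEY, { score: Math.max(best, 42) });   // 42 is just example data
    })();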

In addition to our normal outro information, we'll also take a look at the Persistence Module in the Capabilities section to see how to whitelist data keys.

Outro
01:22
Requirements
  • No Spark AR Studio experience required, although a basic understanding of 3D/2D asset creation tools and JavaScript will help you progress more quickly
Description

Augmented Reality has become a foundational feature of today’s top social media platforms and is expected to be worth $165 billion by 2024. This focus on Social AR is expected to increase as Facebook/Instagram, Snapchat, and TikTok continue to compete for the world’s attention. This expectation has only been strengthened by Facebook/Instagram’s recent public rollout of Spark AR Studio - the only software tool available for creating Augmented Reality effects on their platforms.

Now is the time to get involved in this emerging industry, and position yourself as one of only a few people in the world capable of creating viral AR content for Facebook and Instagram. Social AR requires active participation from your audience, creating personalized content that resonates more strongly within their network. Whether you’re looking to sell effects to brands, grow your own following with more engaging content for your audience, or you just want to make fun content for your friends, Facebook and Instagram AR effects are an entirely new and exciting creative medium for you to stand out in your career, as an influencer, and as an artist.

This course is the only one of its kind, walking you all the way from learning the fundamentals of the Spark AR program to creating professional-level effects, regardless of your background or prior experience. Your guide will be Ryan Hafner, an experienced industry professional who previously worked directly with Snapchat to launch new features on their AR platform.

See what some of our students are saying:

★★★★★ “After struggling through some Youtube tutorials on specific features, I went searching for an all-in-one step-by-step course. This was the only one I found and it turned out to be exactly what I was looking for.” - Hugh Osumi

★★★★★ “As a freelance graphic designer, I was looking for ways to stand out to potential clients. Once I finished this course, I started offering AR and got immediate interest from new clients.” - Tori Ratsep

★★★★★ “It really builds up quick. I started out with 0 knowledge of AR or programming but by the end of this course I was making crazy stuff.” - Dean Krupla


Throughout our 7+ hours of course content, we’ll create 10 AR Camera Effects. In doing so, you’ll learn how to:

  • Track a user’s face in the camera feed and how to use FaceTracking to anchor 2D/3D content, like face masks and images, in your scene

  • Use Materials and Textures to create interesting visual experiences

  • Create, import, and animate 3D content, including rigged animations and BlendShapes

  • Use the Patch Editor and JavaScript to program your effects to be more dynamic and interactive

  • Use Segmentation to separate your user from their background and place them in an entirely new environment

  • Create an interactive game, complete with sound effects and saved high scores

  • Manipulate user audio

  • Use the PlaneTracker to place a 3D scene in your user’s back camera view

  • Optimize, publish and promote your finished effects

2D, 3D, and other project assets will be provided for each effect. No experience with 3D modeling, image editing, or programming is required, although a basic understanding of these skills and programs will help you progress more quickly.

If you want to develop a valuable new skillset in an industry poised to explode in the coming months, or if you’re looking for a new creative outlet for your art, this is the course for you. Expand your digital media toolkit and level up from static 2D and 3D content to interactive and dynamic 3D experiences.

Facebook and other major companies have already confirmed they are developing AR glasses, which will allow people to see your AR creations without ever needing to hold up a phone - this is the future of digital media.


Take this course and start learning!

Who this course is for:
  • Creatives, artists, brands, and agency professionals who want to build new experiences using the latest technologies