In the previous blog, we saw how to set up our Unity project for AR and Android deployment. If you haven't checked it out already, do that before continuing, as it's a prerequisite for this one. In this blog, we'll learn about face detection: we'll use the device's front camera to scan a face and apply a material to it.
Let's start by setting up our scene for face detection. Create an AR Session and an AR Session Origin in the scene, then add the AR Face Manager component to the AR Session Origin. On the AR Camera's AR Camera Manager component, set the Facing Direction to User so the front camera is used.
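If you prefer wiring this up from code instead of the Inspector, here's a minimal sketch that requests the user-facing camera through the AR Camera Manager. The class name FaceTrackingSetup and the serialized cameraManager field are just placeholders for this example:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Minimal sketch: select the front camera from code instead of the Inspector.
// Assumes "cameraManager" is wired to the ARCameraManager on the AR Camera.
public class FaceTrackingSetup : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;

    void Start()
    {
        // Face detection runs on the selfie (user-facing) camera.
        cameraManager.requestedFacingDirection = CameraFacingDirection.User;
    }
}
```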
The AR Face Manager component requires us to assign a Face Prefab, so let's see how to create and add one.
We can create a Face Prefab in just two steps:

1. In the Hierarchy window, right-click → XR → AR Default Face. This creates a GameObject with the AR Face component and a face mesh already configured.
2. Drag the AR Default Face GameObject into the Project window to turn it into a prefab, then delete the instance from the Hierarchy.
With that, we have created a Face Prefab. But the Mesh Renderer has a default material that is orange in color. If you're not a fan of that color, you can change it in a few steps: create a new Material in the Project window, give it the color or texture you want, and assign it to the Material slot of the Face Prefab's Mesh Renderer.
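You could also swap the material at runtime instead of editing the prefab. Here's a small sketch of that idea; FaceMaterialSwapper and the customFaceMaterial field are hypothetical names, and the script is meant to sit on the Face Prefab itself:

```csharp
using UnityEngine;

// Sketch: replace the default orange material at runtime.
// "customFaceMaterial" is a placeholder you assign in the Inspector.
[RequireComponent(typeof(MeshRenderer))]
public class FaceMaterialSwapper : MonoBehaviour
{
    [SerializeField] Material customFaceMaterial;

    void Awake()
    {
        // The AR Default Face prefab draws the tracked face mesh through a MeshRenderer.
        GetComponent<MeshRenderer>().material = customFaceMaterial;
    }
}
```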
Before we build and test the application, there is one more step: referencing the Face Prefab. To do that, select the AR Session Origin GameObject in the Hierarchy window, then drag and drop the AR Default Face prefab into the Face Prefab field of the AR Face Manager component.
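The same reference can also be assigned from a script. Here's a rough sketch of that alternative; FacePrefabBinder and its two serialized fields are placeholder names for this example:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: do the Face Prefab wiring in code. Assumes "faceManager" references
// the AR Face Manager on AR Session Origin and "faceVisualizerPrefab" is the
// AR Default Face prefab from the Project window.
public class FacePrefabBinder : MonoBehaviour
{
    [SerializeField] ARFaceManager faceManager;
    [SerializeField] GameObject faceVisualizerPrefab;

    void Awake()
    {
        // Only affects faces created after this point, so assign it before tracking starts.
        faceManager.facePrefab = faceVisualizerPrefab;
    }
}
```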
Now, let’s build and test the application.
Once it's built, the application will run on the device and we can test it. Since we set the camera facing direction to User, the front camera is activated and starts looking for a face. When a face is detected, the FaceMaterial is rendered on top of it and updates with the facial movements as well.
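To confirm that detection is actually firing on the device, we can listen to the AR Face Manager's facesChanged event and log what happens. FaceDetectionLogger is just an illustrative name here:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: log face tracking events to verify detection works on the device.
public class FaceDetectionLogger : MonoBehaviour
{
    [SerializeField] ARFaceManager faceManager;

    void OnEnable()  => faceManager.facesChanged += OnFacesChanged;
    void OnDisable() => faceManager.facesChanged -= OnFacesChanged;

    void OnFacesChanged(ARFacesChangedEventArgs args)
    {
        foreach (ARFace face in args.added)
            Debug.Log($"Face detected: {face.trackableId}");
        foreach (ARFace face in args.removed)
            Debug.Log($"Face lost: {face.trackableId}");
        // args.updated fires whenever the tracked mesh changes with facial movement.
    }
}
```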
Face detection on its own isn't very useful yet, but there are a few cool things we can build on top of it. We can play videos on top of the detected face or even make objects follow it. In the next blog, we'll learn exactly that! We can also use it to build a Snapchat-like application that lets users try different filters on their faces.
If you've enjoyed the insights shared here, why not spread the word? Share the post with your friends and colleagues who might also find it valuable.
Your support means the world to us and helps us create more content you'll love.