ARKit 3 Face Tracking Tutorial

Since its appearance on the market, Augmented Reality has let us create immersive and exciting experiences for mobile users. Used in both large business systems and everyday apps, AR can save us time: we can check how our dream sofa would look in our apartment, whether a selected paint color will match the interior, or even quickly try on new makeup.

ARKit 3 requirements & setup

AR face tracking with ARKit 3 requires the A11 Bionic chip introduced with the iPhone X, so we’ll need that device or a newer one for our sample to work. To access all the features of ARKit 3, such as multiple face tracking, we’ll need Unity 2019.1 or later for our project, along with the following packages from Unity’s Package Manager:

  1. AR Foundation
  2. AR Subsystems
  3. AR Face Tracking for all the required face tracking scripts
  4. XR Legacy Input Helpers for the Tracked Pose Driver script
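
Once installed, the dependencies end up in the project’s Packages/manifest.json. A minimal sketch of the relevant entries is below; the version numbers are only illustrative, so use whichever verified versions the Package Manager offers for your Unity release:

{
  "dependencies": {
    "com.unity.xr.arfoundation": "3.0.1",
    "com.unity.xr.arsubsystems": "3.0.0",
    "com.unity.xr.arkit-face-tracking": "3.0.1",
    "com.unity.xr.legacyinputhelpers": "2.1.4"
  }
}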

AR session objects

When the project is properly set up, we can start creating the necessary AR session objects. First off, we create a GameObject in our scene, call it ARSession, and add two components to it: an AR Session script and an AR Input Manager script. Next, we create the AR camera and attach three components to it:

  • Tracked Pose Driver (in the Tracked Pose Driver, we change the Pose Source to Color Camera and tick the Use Relative Transform setting)
  • AR Camera Manager
  • AR Camera Background
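
The same setup can also be done from code. The sketch below is our own illustration rather than part of the tutorial flow (ARSceneBootstrap is a name we made up); it mirrors the editor steps above using the standard AR Foundation and XR Legacy Input Helpers components:

using UnityEngine;
using UnityEngine.SpatialTracking;
using UnityEngine.XR.ARFoundation;

// Hypothetical bootstrap helper that recreates the editor setup at runtime.
public class ARSceneBootstrap : MonoBehaviour
{
    private void Awake()
    {
        // The ARSession GameObject with its two components.
        var sessionObject = new GameObject("ARSession");
        sessionObject.AddComponent<ARSession>();
        sessionObject.AddComponent<ARInputManager>();

        // The AR camera with its three components.
        var cameraObject = new GameObject("ARCamera");
        cameraObject.AddComponent<Camera>();

        var poseDriver = cameraObject.AddComponent<TrackedPoseDriver>();
        poseDriver.SetPoseSource(TrackedPoseDriver.DeviceType.GenericXRDevice,
            TrackedPoseDriver.TrackedPose.ColorCamera);
        poseDriver.UseRelativeTransform = true;

        cameraObject.AddComponent<ARCameraManager>();
        cameraObject.AddComponent<ARCameraBackground>();
    }
}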

The AR Glasses Prefab

We’ll have to create a new GameObject; we called ours simply GlassesPrefab. First, we attach two components to it. The first one is an ARFace component, which will provide us with the detected face point data. The second is our own script, ARGlassesController, which is responsible for positioning the glasses according to the data updated by ARFace.

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

[RequireComponent(typeof(ARFace))]
public class ARGlassesController : MonoBehaviour
{
    // Singleton handle so UI scripts can reach the controller.
    // Static members can't be serialized, so no SerializeField here.
    public static ARGlassesController Instance { get; private set; }

    [field: SerializeField]
    public Transform ModelTransform { get; private set; }

    [field: SerializeField]
    public Material FrameMaterial { get; private set; }

    private ARFace ARFaceComponent { get; set; }

    private const string MATERIAL_COLOR_SETTING_NAME = "_Color";
    // Index of the face mesh vertex the glasses are anchored to
    // (a point near the bridge of the nose).
    private const int AR_GLASSES_PLACEMENT_VERTICE_INDEX = 16;

    public void ChangeFrameColor (Color color)
    {
        if (FrameMaterial != null)
        {
            FrameMaterial.SetColor(MATERIAL_COLOR_SETTING_NAME, color);
        }
    }

    protected virtual void Awake ()
    {
        if (Instance == null)
        {
            Instance = this;
        }

        ARFaceComponent = GetComponent<ARFace>();
    }

    protected virtual void OnDestroy ()
    {
        Instance = null;
    }

    protected virtual void OnEnable ()
    {
        // Reposition the glasses whenever the face data or session state changes.
        ARFaceComponent.updated += TryToUpdateModelStatus;
        ARSession.stateChanged += TryToUpdateModelStatus;
        TryToUpdateModelStatus();
    }

    protected virtual void OnDisable ()
    {
        ARFaceComponent.updated -= TryToUpdateModelStatus;
        ARSession.stateChanged -= TryToUpdateModelStatus;
    }

    private void TryToUpdateModelStatus (ARFaceUpdatedEventArgs eventArgs)
    {
        TryToUpdateModelStatus();
    }

    private void TryToUpdateModelStatus (ARSessionStateChangedEventArgs eventArgs)
    {
        TryToUpdateModelStatus();
    }

    private void TryToUpdateModelStatus ()
    {
        // Show the glasses only while a face is tracked, and snap them
        // to the anchor vertex of the detected face mesh.
        bool isFaceVisible = GetFaceVisibility();
        ModelTransform.gameObject.SetActive(isFaceVisible);

        if (isFaceVisible)
        {
            ModelTransform.localPosition = ARFaceComponent.vertices[AR_GLASSES_PLACEMENT_VERTICE_INDEX];
        }
    }

    private bool GetFaceVisibility ()
    {
        return enabled && ARFaceComponent.trackingState != TrackingState.None && ARSession.state > ARSessionState.Ready;
    }
}

ARKit 3: ARFace data and face detection

What happens is very simple: once the ARFaceManager (which lives alongside our session objects, with GlassesPrefab assigned as its Face Prefab) detects a face, it instantiates the prefab and sends new ARFace data. Since our controller is connected to the updated event of ARFace, the script updates the position of the glasses whenever a new set of face point coordinates is detected.
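
If you want to verify this flow on the device, you can subscribe to the manager’s facesChanged event. The logger below is a hypothetical debugging helper of our own, not one of the tutorial’s scripts:

using UnityEngine;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(ARFaceManager))]
public class FaceTrackingLogger : MonoBehaviour
{
    private ARFaceManager FaceManager { get; set; }

    private void Awake ()
    {
        FaceManager = GetComponent<ARFaceManager>();
    }

    private void OnEnable ()
    {
        FaceManager.facesChanged += OnFacesChanged;
    }

    private void OnDisable ()
    {
        FaceManager.facesChanged -= OnFacesChanged;
    }

    private void OnFacesChanged (ARFacesChangedEventArgs eventArgs)
    {
        // Each added ARFace corresponds to one instantiated GlassesPrefab.
        foreach (var face in eventArgs.added)
        {
            Debug.Log($"Face detected: {face.trackableId}");
        }

        foreach (var face in eventArgs.removed)
        {
            Debug.Log($"Face lost: {face.trackableId}");
        }
    }
}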

Color controller

We want to change the color of the frames using a ColorController script attached to buttons in a simple UI. That’s why we made ARGlassesController a singleton: whenever it’s present in the scene, the ColorController can access it.

using UnityEngine;

public class ColorController : MonoBehaviour
{
    [field: SerializeField]
    public Color Color { get; private set; }

    // Called from a UI Button's OnClick event.
    public void SetFrameColor ()
    {
        if (ARGlassesController.Instance != null)
        {
            ARGlassesController.Instance.ChangeFrameColor(Color);
        }
    }
}
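
To wire it up, we attach a ColorController to each color button, pick the desired Color in the inspector, and point the button’s OnClick event at the SetFrameColor method. The null check means the buttons simply do nothing until a face has been detected and a glasses prefab instantiated.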

Conclusion

I hope the above instructions prove helpful in preparing your AR face tracking app. This is just a simple example, but ARKit Face Tracking can be used to create much more complex applications, not only for fashion and cosmetics but also for industrial purposes. Apple already gives us access to very advanced and powerful tools, and we’re excited to see how this technology evolves in the future.
