How to Add AR Image Tracking Using Xcode

PS: With screenshots and sample code!


Hello everyone, today I am going to walk you through a tutorial on how to add AR image tracking using Xcode.

Quick note: This is most suitable for those who are already comfortable working with Xcode but are rather new to AR.

Let's get started with setting up the project.

Step 1: Launch Xcode and create a new project

[Screenshot: Screen Shot 2021-03-06 at 9.36.39 PM.png]

Step 2: Follow the red circles to set up the project, then click Next

[Screenshot: Screen Shot 2021-03-06 at 9.36.52 PM.png]

Step 3: Name your project and fill in any additional information. After that, the project should look like this:

[Screenshot: Screen Shot 2021-03-06 at 9.37.29 PM.png]

Adding assets to the Project

Step 1: To add a video or image to be displayed, follow the picture below. In this example, I already have a video named pirate.

[Screenshot: Screen Shot 2021-03-06 at 9.38.28 PM.png]

Step 2: Add a SceneKit assets folder (.scnassets) by following the screenshots below

[Screenshot: Screen Shot 2021-03-06 at 10.01.26 PM.png]

[Screenshot: Screen Shot 2021-03-06 at 10.03.43 PM.png]

[Screenshot: Screen Shot 2021-03-06 at 10.03.57 PM.png]

Step 3: Add a Scene (.scn file) inside the .scnassets folder

[Screenshot: Screen Shot 2021-03-06 at 10.06.15 PM.png]

You can create whatever you want in this 3D environment; for now, I will just create a Plane.

[Screenshot: Screen Shot 2021-03-06 at 10.06.29 PM.png]

[Screenshot: Screen Shot 2021-03-06 at 10.07.15 PM.png]
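If you prefer code over the Scene editor, you could also build an equivalent node programmatically. This is only a sketch under my own assumptions: the node name "card" and the child name "video" match the names the ViewController code later in this article looks up, and the plane dimensions are placeholder values.

```swift
import SceneKit
import UIKit

// Sketch: build the "card" node in code instead of in the .scn editor.
// The names "card" and "video" are assumptions that must match the
// childNode(withName:) lookups in the ViewController.
func makeCardNode() -> SCNNode {
    let card = SCNNode()
    card.name = "card"

    // Placeholder size in meters; match it to your reference image
    let plane = SCNPlane(width: 0.09, height: 0.16)
    plane.firstMaterial?.diffuse.contents = UIColor.white

    let video = SCNNode(geometry: plane)
    video.name = "video"
    // SCNPlane faces +Z by default; rotate it to lie flat on the detected image
    video.eulerAngles.x = -.pi / 2

    card.addChildNode(video)
    return card
}
```

Whether you build the scene in the editor or in code, what matters is that the node names line up with what the delegate code expects.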

Step 4: Add an AR Resources folder inside Assets.xcassets

[Screenshot: Screen Shot 2021-03-06 at 10.25.11 PM.png]

After that, go ahead and add all the images you want AR to recognize. Make sure to set each image's physical size in the inspector, since ARKit uses it for tracking.

Screen Shot 2021-03-06 at 10.25.36 PM.png

Edit ViewController.swift with the following lines of code

You can use these lines as a reference.

Main functions:
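For context, all the functions below live inside the ViewController class that Xcode's ARKit app template generates. A minimal sketch of that shell (assuming the storyboard's ARSCNView is connected to the sceneView outlet):

```swift
import UIKit
import SceneKit
import ARKit
import SpriteKit
import AVFoundation

// Sketch of the surrounding class, based on Xcode's ARKit app template.
// The storyboard is assumed to contain an ARSCNView wired to this outlet.
class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    // viewDidLoad, viewWillAppear, viewWillDisappear, and the
    // renderer(_:didAdd:for:) delegate method below go inside this class.
}
```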

First, in the viewDidLoad function, initialize the sceneView

    override func viewDidLoad() {
        super.viewDidLoad()

        // Set the view's delegate
        sceneView.delegate = self

        // Show statistics such as fps and timing information
        sceneView.showsStatistics = true

        // Create a new scene
        let scene = SCNScene(named: "art.scnassets/card.scn")!

        // Set the scene to the view
        sceneView.scene = scene
    }

In the viewWillAppear function, create a session configuration and run the session

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Create a session configuration
        let configuration = ARImageTrackingConfiguration()

        // Load the reference images to track from the AR Resources asset group
        guard let arReferenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) else { return }
        configuration.trackingImages = arReferenceImages

        // Run the view's session
        sceneView.session.run(configuration)
    }

In the viewWillDisappear function, pause the app session

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)

        // Pause the view's session
        sceneView.session.pause()
    }

    // MARK: - ARSCNViewDelegate

The renderer(_:didAdd:for:) delegate method does the main work: ARKit calls it when one of the reference images is detected, and here we attach the card scene and the video to the new anchor's node

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARImageAnchor else { return }

        guard let card = sceneView.scene.rootNode.childNode(withName: "card", recursively: false) else { return }
        card.removeFromParentNode()
        node.addChildNode(card)

        card.isHidden = false

        guard let videoURL = Bundle.main.url(forResource: "pirate", withExtension: "mp4") else { return }
        let videoPlayer = AVPlayer(url: videoURL)

        let videoScene = SKScene(size: CGSize(width: 720.0, height: 1280.0))

        let videoNode = SKVideoNode(avPlayer: videoPlayer)
        videoNode.position = CGPoint(x: videoScene.size.width / 2, y: videoScene.size.height / 2)
        videoNode.size = videoScene.size
        videoNode.yScale = -1
        videoNode.play()

        videoScene.addChild(videoNode)

        guard let video = card.childNode(withName: "video", recursively: true) else { return }
        video.geometry?.firstMaterial?.diffuse.contents = videoScene

    }
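One thing to be aware of: AVPlayer plays the video once and then stops. If you want the video to loop while the image is tracked, you could observe the player item's end-of-playback notification. This is a small optional addition, not part of the tutorial code above:

```swift
// After creating videoPlayer in renderer(_:didAdd:for:), restart the
// video whenever it reaches the end so it loops on the card.
NotificationCenter.default.addObserver(
    forName: .AVPlayerItemDidPlayToEndTime,
    object: videoPlayer.currentItem,
    queue: .main
) { _ in
    videoPlayer.seek(to: .zero)
    videoPlayer.play()
}
```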

Other functions you might want to use in the future:

    func session(_ session: ARSession, didFailWithError error: Error) {
        // Present an error message to the user

    }
    func sessionWasInterrupted(_ session: ARSession) {
        // Inform the user that the session has been interrupted, for example, by presenting an overlay

    }
    func sessionInterruptionEnded(_ session: ARSession) {
        // Reset tracking and/or remove existing anchors if consistent tracking is required

    }

Some explanation:

Basically, this app scans for the images in the AR Resources folder and, when one of them is recognized, displays the scene from art.scnassets on top of it.
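If you ever need to recognize more than one image at the same time, ARImageTrackingConfiguration has a maximumNumberOfTrackedImages property (the default is 1). A small sketch, reusing the same AR Resources group as above:

```swift
let configuration = ARImageTrackingConfiguration()

// Allow up to two reference images to be tracked simultaneously
configuration.maximumNumberOfTrackedImages = 2

if let images = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) {
    configuration.trackingImages = images
}

sceneView.session.run(configuration)
```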

Finally: Test the app on an iPhone device

Disclaimer: I am not completely sure about this, but you will likely need a developer account and a registered device to test the app, since AR needs a real camera and will not work in the simulator.

More: Face tracking with AR using Xcode
