ARKit

Overview

ARKit is an Apple framework for rendering augmented reality experiences on iOS and iPadOS. It uses a device's front- and rear-facing cameras together with its motion sensors.

RealityKit is another Apple framework that is often used with ARKit. It renders 3D images and simulates changes to them. RealityKit can be used to add virtual objects to an ARKit view.

ARKit and RealityKit use the USDZ file format to describe 3D objects. The USDZ file format was created by Pixar. Many USDZ files can be downloaded from the Apple Developer website.

Resources

Florian Schweizer (@FloWritesCode) has created a series of excellent YouTube videos on ARKit.

Reality Composer

"Reality Composer" is an iOS and macOS app that allows users to "easily prototype and produce AR experiences directly with no prior 3D experience". See RealityComposer.

Object Capture

The "Object Capture" API that is part of the RealityKit framework processes a series of object photos taken from many angles and produces a 3D model in the USDZ format.

Apple provides the HelloPhotogrammetry Xcode project that uses the Object Capture API. It can be downloaded at Creating a Photogrammetry Command-Line App. Building this project creates a command-line app that generates a model file from a collection of model photos.

To build this app:

  1. Double-click the file HelloPhotogrammetry.xcodeproj to open the project in Xcode.
  2. Select the top entry in the Project Navigator.
  3. Select the top target.
  4. Select the General tab.
  5. Select a value in the "Team" dropdown.
  6. Select File ... Project Settings...
  7. In the dialog that appears, change the "Derived Data" dropdown to "Custom Location".
  8. Enter the path to your home directory or another easily located directory.
  9. Click the scheme dropdown to the left of the device dropdown at the top of the Xcode window.
  10. Select "Edit Scheme..."
  11. In the dialog that appears, click "Run" in the left nav.
  12. Click the "Info" tab.
  13. Change the "Build Configuration" dropdown from "Debug" to "Release".
  14. Click the "Close" button.
  15. Select Product ... Build For ... Running
  16. Wait for the build to finish.
  17. Copy or move the file named "HelloPhotogrammetry" found below the specified directory in Build/Products/Release to a directory in your PATH.

Models can be produced at four detail levels: Reduced, Medium, Full, and Raw (for custom workflows). The Reduced and Medium levels are good for use in web and mobile apps, while the Full and Raw levels are intended for high-end usage. Medium detail models can be viewed with AR Quick Look on macOS. To display multiple scans in the same scene, use the Reduced level.

To run this app:

  1. From a Terminal window enter HelloPhotogrammetry. This will output usage instructions.
  2. Enter HelloPhotogrammetry {images-dir-path} {output-file-path} to generate a .usdz file from the images in a given directory.
  3. To specify a detail level, add the -d {level} option where {level} is preview, reduced, medium, full, or raw.

To create your own app for reading images and producing .usdz files, write code similar to the following:

import RealityKit

// Create a session that reads images from a directory.
let url = URL(fileURLWithPath: "/tmp/photos/", isDirectory: true)
let session = try! PhotogrammetrySession(
    input: url,
    configuration: PhotogrammetrySession.Configuration()
)

// Prepare to process outputs.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress:", fraction)
        case .requestComplete(_, let result):
            if case .modelFile(let url) = result {
                print("Output at", url)
            }
        case .requestError(_, let error):
            print("Error:", error)
        case .processingComplete:
            print("Completed")
        default:
            print("Unhandled case")
        }
    }
}

// Initiate processing of the input photos.
// This requests two models to be produced.
try! session.process(requests: [
    .modelFile(url: URL(fileURLWithPath: "/tmp/models/reduced.usdz"), detail: .reduced),
    .modelFile(url: URL(fileURLWithPath: "/tmp/models/medium.usdz"), detail: .medium)
])

Writing Code

An Xcode project that uses ARKit can be created in two ways: as a SwiftUI "App" or as an "Augmented Reality App", which does not use SwiftUI. Only the first option is considered here.

Currently ARKit views are UIKit views and are not directly compatible with SwiftUI. The ARView must be wrapped in a UIViewRepresentable in order to use it from SwiftUI.

  1. In Xcode, create a SwiftUI project.
  2. Select the top entry in the Project Navigator.
  3. Select the main target.
  4. Select the "Info" tab.
  5. Hover over any row and click the "+" button to add a new key.
  6. For the key, enter "Privacy - Camera Usage Description" (NSCameraUsageDescription). For the value, describe for users why camera access is needed. For example, "Needed for augmented reality view".
  7. See code in the example project ARKitDemo.
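A minimal version of the wrapper described above can be sketched as follows. The type names ARViewContainer and ContentView are illustrative, not part of any Apple API.

import ARKit
import RealityKit
import SwiftUI

// Minimal SwiftUI wrapper for an ARView.
struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        // Start world tracking using the rear camera.
        arView.session.run(ARWorldTrackingConfiguration())
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {
        // Nothing to update in this minimal example.
    }
}

struct ContentView: View {
    var body: some View {
        ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}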

ARView Configuration

To change the configuration of an ARView session, pass a configuration object to the run method of the ARSession
object stored in the session property of the ARView object. Supported configuration types include ARWorldTrackingConfiguration, ARFaceTrackingConfiguration, ARImageTrackingConfiguration, ARObjectScanningConfiguration, ARBodyTrackingConfiguration, and ARGeoTrackingConfiguration.
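For example, to run a session that tracks the world and detects horizontal planes (a minimal sketch, assuming an existing ARView named arView):

import ARKit
import RealityKit

// Configure world tracking with horizontal plane detection.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal]

// `arView` is assumed to be an existing ARView instance.
arView.session.run(configuration)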

Important RealityKit Types

There are many ways to load entities. One option is to use a USDZ file.

To load an entity from a file and place it at an anchor:

// Entity.load throws, so the optional returned by try? must be unwrapped.
if let entity = try? Entity.load(named: "usdz-file-name") {
    anchor.addChild(entity)
}

The Apple Augmented Reality Tools web page contains links for downloading the tools "Reality Composer" (iPhone and iPad only) and "Reality Converter".

Reality Composer is described below.

The Reality Converter macOS app "converts common 3D file formats such as .obj, .gltf, .fbx, and USD to USDZ. It also lets you view the converted model, customize its material properties with your own textures, and edit the file's metadata."

Files produced by these apps can be loaded into an ARKit app in the same way as USDZ files.

Entities can be created using the MeshResource class provided by RealityKit. This class has the static methods generateBox, generatePlane, and generateSphere. For example, to create and display a blue box:

let block = MeshResource.generateBox(size: 1) // 1 meter on all sides
let material = SimpleMaterial(color: .blue, isMetallic: false)
let entity = ModelEntity(mesh: block, materials: [material])
let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(entity)
scene.addAnchor(anchor) // `scene` is the ARView scene (arView.scene)

Getting Started

CaptureSample App

Apple provides the CaptureSample project that implements an iOS app for capturing photos of a real-world object in preparation for generating a USDZ model file. The project can be downloaded from Taking Pictures for 3D Object Capture.

I was able to run this app and capture images, but the images were not saved anywhere so I could not use them to create a USDZ model file. Consider using the app PhotoCatch in the iOS App Store.

To build and run the CaptureSample app:

  1. Double-click the file CaptureSample.xcodeproj to open the project in Xcode.
  2. Select the top entry in the Project Navigator.
  3. Select the top target.
  4. Select the General tab.
  5. Select a value in the "Team" dropdown.
  6. Attach an iPhone to the computer with a USB cable.
  7. Select the iPhone from device dropdown at the top of the Xcode window.
  8. Press cmd-r to build and run the app on the iPhone.

Pre-built models

Apple provides some pre-built USDZ models at AR Quick Look.

Viewing Models

Double-clicking a .usdz file opens and displays the model in Xcode using "AR Quick Look". Drag on the image to rotate it around any axis and see all sides. Some models appear in shades of black when viewed in Xcode, but will be in color when imported into the "Reality Composer" app.

Guidelines for Creating Models

When creating models from real-world objects:

Example Project

See ARKitDemo.