Introduction to AR Mobile Development
Augmented Reality (AR) mobile development combines real-world environments with computer-generated content through smartphone cameras and sensors. This technology enables applications to overlay digital information, 3D models, animations, and interactive elements onto the physical world, creating immersive and contextual user experiences. With billions of AR-capable smartphones already in users’ hands, mobile AR represents the most accessible and widespread form of augmented reality, powering applications across gaming, retail, education, navigation, entertainment, and enterprise sectors.
Core AR Mobile Frameworks Comparison
| Framework | Platforms | License | Key Features | Learning Curve | Best For |
|---|---|---|---|---|---|
| ARKit | iOS 11+ | Free with Apple Developer account | Face tracking, people occlusion, motion capture, LiDAR support, spatial audio | Moderate | High-quality iOS apps, Apple ecosystem integration |
| ARCore | Android 7.0+, iOS (limited) | Free | Environmental understanding, motion tracking, light estimation, anchors | Moderate | Android applications, cross-platform applications |
| AR Foundation | iOS + Android (via Unity) | Free with Unity | Combines ARKit & ARCore capabilities, unified API, face/image/object tracking | Moderate | Cross-platform games and applications requiring Unity features |
| Vuforia | iOS, Android, UWP, Unity | Commercial | Image targets, multi-targets, object recognition, VuMarks, ground plane detection | Moderate-High | Enterprise applications, image recognition needs |
| Wikitude | iOS, Android, Smart Glasses | Commercial | Geolocation, image tracking, object tracking, instant tracking | Moderate | Location-based AR, tourism applications |
| 8th Wall | Web browsers | Commercial | WebAR, image targets, face effects, world tracking | Low-Moderate | Web-based AR experiences without app installation |
| Zappar | iOS, Android, Web | Commercial | Marker-based AR, face tracking, simplified development | Low | Quick AR experiences, marketing campaigns |
| Apple RealityKit | iOS 13+ | Free with Apple Developer account | Photorealistic rendering, physics simulation, animation, spatial audio | Moderate-High | High-fidelity iOS AR experiences |
Features Comparison Matrix
| Feature | ARKit | ARCore | AR Foundation | Vuforia | Wikitude | 8th Wall |
|---|---|---|---|---|---|---|
| Plane Detection | ✓✓✓ | ✓✓ | ✓✓✓ | ✓✓ | ✓✓ | ✓✓ |
| Image Tracking | ✓✓ | ✓✓ | ✓✓ | ✓✓✓ | ✓✓✓ | ✓✓ |
| Object Tracking | ✓✓ | ✗ | ✓✓ | ✓✓✓ | ✓✓ | ✗ |
| Face Tracking | ✓✓✓ | ✓ | ✓✓ | ✓ | ✓✓ | ✓✓ |
| Body Tracking | ✓✓ | ✗ | ✓ | ✗ | ✗ | ✗ |
| Light Estimation | ✓✓✓ | ✓✓ | ✓✓ | ✓ | ✓ | ✓ |
| Persistent Anchors | ✓✓ | ✓✓ | ✓✓ | ✓ | ✓ | ✓ |
| Multi-User AR | ✓✓ | ✓✓ | ✓✓ | ✗ | ✓ | ✓ |
| Occlusion | ✓✓✓ | ✓✓ | ✓✓ | ✓ | ✓ | ✓ |
| Depth API | ✓✓✓ | ✓✓ | ✓✓ | ✓ | ✓ | ✗ |
| Geo-location | ✗ | ✗ | ✗ | ✓ | ✓✓✓ | ✓ |
| Web Support | ✗ | ✗ | ✗ | ✓ | ✓✓ | ✓✓✓ |
Legend: ✓✓✓ (Advanced), ✓✓ (Good), ✓ (Basic), ✗ (Not Supported)
Development Environment Setup
ARKit (iOS) Setup
Requirements:
- Mac with Xcode 12+ (latest recommended)
- iOS device with A9 processor or newer (iOS 11+)
- Apple Developer account
Project Configuration:
- Create new Xcode project or open existing
- Add a Camera Usage Description (NSCameraUsageDescription) to Info.plist
- Import ARKit and SceneKit/RealityKit
- Configure AR session and view
Key Components:
- ARSession: Manages AR processing
- ARConfiguration: Defines tracking type (world, face, etc.)
- ARView or ARSCNView: Displays the camera feed with overlays
- ARAnchor: Attaches content to real-world positions
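A minimal session setup, assuming a plain ARSCNView created in code (the class name here is illustrative), might look like this:

```swift
import UIKit
import ARKit

// Minimal ARKit bootstrap: run world tracking when the view appears and pause
// the session when it disappears to release the camera.
class MinimalARViewController: UIViewController {
    let arView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()  // an ARConfiguration subclass
        arView.session.run(configuration)                   // ARSession starts processing
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        arView.session.pause()
    }
}
```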
ARCore (Android) Setup
Requirements:
- Android Studio 4.0+ (latest recommended)
- ARCore-supported Android device (Android 7.0+)
- Google Play Services for AR installed on device
Project Configuration:
- Add the ARCore dependency to build.gradle: dependencies { implementation 'com.google.ar:core:1.25.0' }
- Add camera permissions to AndroidManifest.xml
- Add the AR required feature tag
- Configure AR session and view
Key Components:
- Session: Manages AR processing
- Config: Defines tracking capabilities
- Anchor: Attaches content to real-world positions
- Trackable: Represents detected features (planes, points)
Unity AR Foundation Setup
Requirements:
- Unity 2019.3+ (latest LTS recommended)
- iOS or Android build support
- Compatible device for testing
Project Configuration:
- Install AR Foundation package via Package Manager
- Install platform-specific packages (ARKit XR Plugin, ARCore XR Plugin)
- Create new AR Foundation scene
- Add AR Session and AR Session Origin components
Key Components:
- ARSession: Manages AR processing
- ARSessionOrigin: Transforms trackables into Unity world space
- ARPlaneManager: Detects and manages planes
- ARAnchorManager: Creates and manages anchors
- ARRaycastManager: Performs raycasts against tracked features
Core AR Concepts & Implementation
Plane Detection & Content Placement
ARKit (Swift)
// Configure AR session for plane detection
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]
arView.session.run(configuration)
// Handle detected planes
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
// Create visual representation of plane
let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
height: CGFloat(planeAnchor.extent.z))
let planeNode = SCNNode(geometry: plane)
planeNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)
planeNode.eulerAngles.x = -.pi / 2
// Add plane visualization to scene
node.addChildNode(planeNode)
}
// Place content on plane via tap
@IBAction func handleTap(_ sender: UITapGestureRecognizer) {
let tapLocation = sender.location(in: arView)
let hitTestResults = arView.hitTest(tapLocation, types: .existingPlaneUsingExtent)
guard let hitResult = hitTestResults.first else { return }
let position = SCNVector3(hitResult.worldTransform.columns.3.x,
hitResult.worldTransform.columns.3.y,
hitResult.worldTransform.columns.3.z)
// Create and place 3D content
let modelNode = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0))
modelNode.position = position
arView.scene.rootNode.addChildNode(modelNode)
}
ARCore (Java)
// Configure AR session for plane detection
Config config = new Config(session);
config.setPlaneFindingMode(Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL);
session.configure(config);
// Handle detected planes
@Override
public void onDrawFrame(GL10 gl) {
Frame frame = session.update();
Collection<Plane> planes = frame.getUpdatedTrackables(Plane.class);
for (Plane plane : planes) {
if (plane.getTrackingState() == TrackingState.TRACKING) {
// Create or update plane visualization
// ...
}
}
}
// Place content on plane via tap
// Pass in the Frame obtained from the current onDrawFrame() call; calling
// session.update() again outside the render loop is not safe.
private void handleTap(Frame frame, MotionEvent e) {
List<HitResult> hitResults = frame.hitTest(e);
for (HitResult hit : hitResults) {
Trackable trackable = hit.getTrackable();
if (trackable instanceof Plane &&
((Plane) trackable).isPoseInPolygon(hit.getHitPose())) {
// Create anchor at hit position
Anchor anchor = hit.createAnchor();
// Create renderable and attach to anchor
// ...
break;
}
}
}
Unity AR Foundation (C#)
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using System.Collections.Generic;
public class ARPlacementManager : MonoBehaviour
{
[SerializeField] private GameObject placedPrefab;
[SerializeField] private ARRaycastManager raycastManager;
[SerializeField] private ARPlaneManager planeManager;
private List<ARRaycastHit> hits = new List<ARRaycastHit>();
void Awake()
{
raycastManager = GetComponent<ARRaycastManager>();
planeManager = GetComponent<ARPlaneManager>();
}
void Update()
{
// Handle touch input
if (Input.touchCount > 0)
{
Touch touch = Input.GetTouch(0);
if (touch.phase == TouchPhase.Began)
{
if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
{
Pose hitPose = hits[0].pose;
Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
}
}
}
}
}
Image Tracking
ARKit (Swift)
// Configure AR session for image tracking
let configuration = ARWorldTrackingConfiguration()
// Create AR reference images
guard let referenceImages = ARReferenceImage.referenceImages(
inGroupNamed: "AR Resources", bundle: nil) else {
fatalError("Missing AR resources")
}
configuration.detectionImages = referenceImages
arView.session.run(configuration)
// Handle detected images
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
guard let imageAnchor = anchor as? ARImageAnchor else { return }
let plane = SCNPlane(
width: imageAnchor.referenceImage.physicalSize.width,
height: imageAnchor.referenceImage.physicalSize.height
)
let planeNode = SCNNode(geometry: plane)
planeNode.eulerAngles.x = -.pi / 2
// Add content on detected image
let contentNode = SCNNode()
// Configure content node...
planeNode.addChildNode(contentNode)
node.addChildNode(planeNode)
}
ARCore (Java)
// Configure AR session for image tracking
Config config = new Config(session);
// Create augmented image database
AugmentedImageDatabase imageDatabase = new AugmentedImageDatabase(session);
// Add images to database
try {
InputStream inputStream = getAssets().open("my_image.jpg");
Bitmap bitmap = BitmapFactory.decodeStream(inputStream);
imageDatabase.addImage("image_name", bitmap, 0.1f); // 0.1m width
} catch (IOException e) {
Log.e(TAG, "I/O exception loading augmented image", e);
}
config.setAugmentedImageDatabase(imageDatabase);
session.configure(config);
// Handle detected images
@Override
public void onDrawFrame(GL10 gl) {
Frame frame = session.update();
Collection<AugmentedImage> updatedImages = frame.getUpdatedTrackables(AugmentedImage.class);
for (AugmentedImage image : updatedImages) {
if (image.getTrackingState() == TrackingState.TRACKING) {
// Get image center pose
Pose centerPose = image.getCenterPose();
// Create or update anchored content
// ...
}
}
}
Unity AR Foundation (C#)
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using System.Collections.Generic;
public class ImageTrackingManager : MonoBehaviour
{
[SerializeField] private GameObject[] trackedPrefabs;
private Dictionary<string, GameObject> spawnedObjects = new Dictionary<string, GameObject>();
private ARTrackedImageManager trackedImageManager;
void Awake()
{
trackedImageManager = GetComponent<ARTrackedImageManager>();
}
void OnEnable()
{
trackedImageManager.trackedImagesChanged += OnTrackedImagesChanged;
}
void OnDisable()
{
trackedImageManager.trackedImagesChanged -= OnTrackedImagesChanged;
}
void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs eventArgs)
{
// Handle added tracked images
foreach (ARTrackedImage trackedImage in eventArgs.added)
{
UpdateImage(trackedImage);
}
// Handle updated tracked images
foreach (ARTrackedImage trackedImage in eventArgs.updated)
{
UpdateImage(trackedImage);
}
// Handle removed tracked images
foreach (ARTrackedImage trackedImage in eventArgs.removed)
{
if (spawnedObjects.TryGetValue(trackedImage.referenceImage.name, out GameObject prefab))
{
Destroy(prefab);
spawnedObjects.Remove(trackedImage.referenceImage.name);
}
}
}
void UpdateImage(ARTrackedImage trackedImage)
{
string imageName = trackedImage.referenceImage.name;
Vector3 position = trackedImage.transform.position;
GameObject prefab = null;
foreach (GameObject go in trackedPrefabs)
{
if (go.name == imageName)
{
prefab = go;
break;
}
}
if (prefab == null)
{
Debug.LogWarning($"No prefab found for {imageName}");
return;
}
if (spawnedObjects.TryGetValue(imageName, out GameObject spawnedObject))
{
spawnedObject.transform.position = position;
spawnedObject.transform.rotation = trackedImage.transform.rotation;
spawnedObject.SetActive(trackedImage.trackingState == UnityEngine.XR.ARSubsystems.TrackingState.Tracking);
}
else
{
GameObject newObject = Instantiate(prefab, position, trackedImage.transform.rotation);
spawnedObjects[imageName] = newObject;
}
}
}
Face Tracking
ARKit (Swift)
// Configure AR session for face tracking
let configuration = ARFaceTrackingConfiguration()
arView.session.run(configuration)
// Handle detected faces
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
guard let faceAnchor = anchor as? ARFaceAnchor else { return }
// Create face mesh
let faceMesh = ARSCNFaceGeometry(device: arView.device!)
let faceNode = SCNNode(geometry: faceMesh)
// Add face content (e.g., masks, effects)
// ...
node.addChildNode(faceNode)
}
// Update face mesh
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
guard let faceAnchor = anchor as? ARFaceAnchor,
let faceGeometry = node.childNodes.first?.geometry as? ARSCNFaceGeometry else { return }
// Update face geometry with new face data
faceGeometry.update(from: faceAnchor.geometry)
// Access face tracking data
let smileValue = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0.0
// ...
}
ARCore (Java)
// Configure AR session for face tracking
Config config = new Config(session);
config.setAugmentedFaceMode(Config.AugmentedFaceMode.MESH3D);
session.configure(config);
// Handle detected faces
@Override
public void onDrawFrame(GL10 gl) {
Frame frame = session.update();
Collection<AugmentedFace> faces = session.getAllTrackables(AugmentedFace.class);
for (AugmentedFace face : faces) {
if (face.getTrackingState() == TrackingState.TRACKING) {
// Access face mesh
FloatBuffer vertices = face.getMeshVertices();
FloatBuffer normals = face.getMeshNormals();
FloatBuffer textureCoords = face.getMeshTextureCoordinates();
ShortBuffer triangleIndices = face.getMeshTriangleIndices();
// Access facial regions
Pose nosePose = face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP);
Pose foreheadLeftPose = face.getRegionPose(AugmentedFace.RegionType.FOREHEAD_LEFT);
// ...
// Update face mesh rendering
// ...
}
}
}
Unity AR Foundation (C#)
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using System.Collections.Generic;
public class FaceTrackingManager : MonoBehaviour
{
[SerializeField] private GameObject faceMaskPrefab;
private ARFaceManager faceManager;
private Dictionary<ARFace, GameObject> faceMasks = new Dictionary<ARFace, GameObject>();
void Awake()
{
faceManager = GetComponent<ARFaceManager>();
}
void OnEnable()
{
faceManager.facesChanged += OnFacesChanged;
}
void OnDisable()
{
faceManager.facesChanged -= OnFacesChanged;
}
void OnFacesChanged(ARFacesChangedEventArgs eventArgs)
{
// Handle added faces
foreach (ARFace face in eventArgs.added)
{
GameObject maskInstance = Instantiate(faceMaskPrefab, face.transform);
faceMasks.Add(face, maskInstance);
}
// Handle updated faces
foreach (ARFace face in eventArgs.updated)
{
if (faceMasks.TryGetValue(face, out GameObject maskInstance))
{
// Update mask with blend shapes
UpdateMask(maskInstance, face);
}
}
// Handle removed faces
foreach (ARFace face in eventArgs.removed)
{
if (faceMasks.TryGetValue(face, out GameObject maskInstance))
{
Destroy(maskInstance);
faceMasks.Remove(face);
}
}
}
void UpdateMask(GameObject mask, ARFace face)
{
    // Blend shape values (smile, blink, etc.) come from the platform face
    // subsystem rather than from ARFace itself; on iOS this is ARKit's.
#if UNITY_IOS && !UNITY_EDITOR
    var faceSubsystem = (UnityEngine.XR.ARKit.ARKitFaceSubsystem)faceManager.subsystem;
    using (var blendShapes = faceSubsystem.GetBlendShapeCoefficients(
               face.trackableId, Unity.Collections.Allocator.Temp))
    {
        foreach (var blendShape in blendShapes)
        {
            // e.g. blendShape.blendShapeLocation == ARKitBlendShapeLocation.MouthSmileLeft
            // Update mask appearance based on blendShape.coefficient (0..1)
            // ...
        }
    }
#endif
}
}
Performance Optimization Techniques
Asset Optimization
3D Model Optimization:
- Reduce polygon count (aim for <50K triangles for complex models)
- Use LOD (Level of Detail) for distant objects
- Simplify collision meshes
- Optimize UV mapping
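As a sketch of the LOD point above, SceneKit can swap in simpler geometry automatically as a node recedes; `medium` and `low` here are assumed to be pre-decimated versions of the full model:

```swift
import SceneKit

// Attach progressively simpler geometry that SceneKit swaps in automatically
// as the node moves away from the camera.
func applyLevelsOfDetail(to node: SCNNode, medium: SCNGeometry, low: SCNGeometry) {
    node.geometry?.levelsOfDetail = [
        SCNLevelOfDetail(geometry: medium, worldSpaceDistance: 1.5),  // beyond ~1.5 m
        SCNLevelOfDetail(geometry: low, worldSpaceDistance: 3.0)      // beyond ~3 m
    ]
}
```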
Texture Optimization:
- Use appropriate texture sizes (1024×1024 or smaller when possible)
- Implement texture compression (ASTC for iOS, ETC2 for Android)
- Use texture atlases for related elements
- Consider procedural textures for certain effects
Animation Optimization:
- Reduce keyframe density
- Use blend shapes sparingly
- Consider procedural animations for simple movements
- Implement animation LOD
Rendering Optimization
Shader Optimization:
- Use mobile-optimized shaders
- Reduce shader complexity and instructions
- Minimize texture lookups
- Batch materials with similar shaders
Draw Call Reduction:
- Implement static and dynamic batching
- Use mesh combining for static elements
- Reduce material count
- Consider occlusion culling for complex scenes
Lighting Optimization:
- Use baked lighting where possible
- Limit real-time lights (1-2 max for mobile)
- Use light probes for dynamic objects
- Use the AR framework's light estimation to match virtual lighting to the real environment, as sketched below
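A minimal sketch of that last point with ARKit and SceneKit, assuming an SCNLight already exists in the scene:

```swift
import ARKit
import SceneKit

// Drive a SceneKit light from ARKit's per-frame light estimate so virtual
// content roughly matches the brightness and colour temperature of the room.
func updateLighting(_ light: SCNLight, from session: ARSession) {
    guard let estimate = session.currentFrame?.lightEstimate else { return }
    light.intensity = estimate.ambientIntensity            // ~1000 lumens = neutral
    light.temperature = estimate.ambientColorTemperature   // in Kelvin
}
```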
AR-Specific Optimizations
Tracking Optimization:
- Limit simultaneous tracking features (planes, images, etc.)
- Implement tracking quality checks before rendering
- Prioritize tracking stability over feature quantity
- Consider disabling unnecessary tracking features
Content Placement:
- Place content at appropriate scale (avoid tiny details)
- Position content at comfortable viewing distances
- Implement content stabilization techniques
- Consider human ergonomics for interactive elements
Battery & Thermal Management:
- Implement frame rate limiting when appropriate
- Reduce processing during inactive periods
- Optimize CPU/GPU workload balance
- Monitor and respond to thermal state
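One way to cover the thermal and frame-rate points on iOS is to watch ProcessInfo's thermal state and cap SceneKit's render rate when the device runs hot; a sketch (the class name is illustrative):

```swift
import Foundation
import ARKit
import SceneKit

// Watch the system thermal state and cap the render rate when the device runs
// hot, restoring 60 fps once it cools down again.
final class ThermalThrottler: NSObject {
    private let arView: ARSCNView

    init(arView: ARSCNView) {
        self.arView = arView
        super.init()
        NotificationCenter.default.addObserver(
            self, selector: #selector(thermalStateChanged),
            name: ProcessInfo.thermalStateDidChangeNotification, object: nil)
    }

    @objc private func thermalStateChanged() {
        switch ProcessInfo.processInfo.thermalState {
        case .serious, .critical:
            arView.preferredFramesPerSecond = 30   // halve the frame rate under load
        default:
            arView.preferredFramesPerSecond = 60
        }
    }
}
```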
User Experience Best Practices
Onboarding & Guidance
First-Time Experience:
- Provide clear, visual AR setup instructions
- Guide users to scan environment before content appears
- Indicate device movement requirements
- Keep initial experience simple and rewarding
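On iOS 13+, much of this first-run guidance is already handled by ARKit's built-in coaching overlay; a minimal sketch:

```swift
import UIKit
import ARKit

// ARKit's coaching overlay walks the user through the "move your phone to
// scan" step and hides itself automatically once tracking is good.
func addCoachingOverlay(to arView: ARSCNView) {
    let coachingOverlay = ARCoachingOverlayView()
    coachingOverlay.session = arView.session
    coachingOverlay.goal = .horizontalPlane          // what the user should find first
    coachingOverlay.activatesAutomatically = true
    coachingOverlay.frame = arView.bounds
    coachingOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    arView.addSubview(coachingOverlay)
}
```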
Motion Guidance:
- Use visual cues to indicate required movements
- Implement progressive guidance that fades with proficiency
- Provide feedback for detection progress
- Consider accessibility for users with limited mobility
Error Recovery:
- Detect and communicate tracking issues
- Provide clear steps to restore tracking
- Implement graceful degradation of experience
- Maintain user context during interruptions
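A sketch of how tracking problems can be detected and surfaced on iOS via ARSession's observer callbacks; TrackingIssueReporter and its showBanner closure are stand-ins for the host app's own hint UI:

```swift
import ARKit

// Detect degraded tracking and tell the user how to recover. The showBanner
// closure is a stand-in for whatever hint UI the app already has.
final class TrackingIssueReporter: NSObject, ARSessionDelegate {
    var showBanner: (String?) -> Void = { _ in }

    func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
        switch camera.trackingState {
        case .limited(.excessiveMotion):
            showBanner("Move the device more slowly")
        case .limited(.insufficientFeatures):
            showBanner("Point the camera at a well-lit, textured surface")
        case .normal:
            showBanner(nil)   // hide any recovery hint
        default:
            break
        }
    }

    func sessionInterruptionEnded(_ session: ARSession) {
        // Resetting tracking avoids drifting anchors if the device moved
        // significantly while the session was interrupted.
        session.run(ARWorldTrackingConfiguration(),
                    options: [.resetTracking, .removeExistingAnchors])
    }
}
```

Assign an instance as the session's delegate (arView.session.delegate) to receive these callbacks.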
Interaction Design
Input Methods:
- Design for single-handed operation when possible
- Implement comfortable gesture interactions
- Consider alternative input methods (voice, device motion)
- Maintain consistent interaction patterns
Content Manipulation:
- Use intuitive gestures (pinch to scale, drag to move)
- Implement snapping and alignment aids
- Provide visual feedback for interactions
- Consider physics-based interactions for natural feel
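As an example of pinch to scale, a minimal SceneKit handler might look like the sketch below; `selectedNode` stands in for whatever object the user last tapped:

```swift
import UIKit
import SceneKit

// Incremental pinch-to-scale: apply only the delta since the last callback so
// scaling feels continuous rather than jumping.
final class PinchScaleHandler: NSObject {
    var selectedNode: SCNNode?

    @objc func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        guard let node = selectedNode, gesture.state == .changed else { return }
        let s = Float(gesture.scale)
        node.scale = SCNVector3(node.scale.x * s, node.scale.y * s, node.scale.z * s)
        gesture.scale = 1   // reset so the next callback applies only its own delta
    }
}
```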
Spatial Design:
- Account for limited field of view
- Design for variable user environments
- Consider physical world interaction and occlusion
- Maintain appropriate content scale relative to real world
Visual Design & Feedback
AR Integration:
- Match lighting and shadows with real environment
- Implement proper occlusion with real-world objects
- Consider depth cues and spatial relationships
- Design for variable backgrounds and lighting conditions
UI Design:
- Place UI elements at comfortable viewing positions
- Implement world-space UI when contextually appropriate
- Ensure readability against variable backgrounds
- Consider ergonomics of extended AR sessions
Feedback Mechanisms:
- Provide visual confirmation for user actions
- Implement progress indicators for loading content
- Use spatial audio for off-screen guidance
- Consider haptic feedback for interactions
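On iOS, a lightweight haptic confirmation for a placement or snap action can be as simple as:

```swift
import UIKit

// Lightweight haptic confirmation when the user places or snaps an object.
func confirmPlacementHaptic() {
    let generator = UIImpactFeedbackGenerator(style: .medium)
    generator.prepare()        // reduces latency on the first impact
    generator.impactOccurred()
}
```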
Testing & Debugging Strategies
Device Testing
Device Selection:
- Test on minimum supported specifications
- Cover representative device range (not just high-end)
- Include both iOS and Android for cross-platform apps
- Test various form factors (phone vs. tablet)
Environment Testing:
- Test in various lighting conditions
- Validate performance in different space sizes
- Test with different surface types and textures
- Consider outdoor vs. indoor testing
User Testing:
- Observe first-time users without guidance
- Test with users of various AR experience levels
- Consider accessibility testing
- Gather qualitative feedback on comfort and usability
AR-Specific Debugging
Tracking Visualization:
- Implement debug visualization of detected planes
- Visualize feature points and tracking quality
- Display camera intrinsics in debug mode
- Create heat maps of tracking reliability
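With an ARSCNView, ARKit's built-in debug options cover the first two items (plus a frame-rate HUD); for example:

```swift
import ARKit

// Render the raw feature points and the world origin so you can see what
// ARKit is actually tracking, plus SceneKit's fps/render-time HUD.
func enableTrackingDebug(on arView: ARSCNView) {
    arView.debugOptions = [ARSCNDebugOptions.showFeaturePoints,
                           ARSCNDebugOptions.showWorldOrigin]
    arView.showsStatistics = true
}
```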
Performance Monitoring:
- Track and display frame rate
- Monitor CPU/GPU/memory usage
- Implement performance logging
- Create automated performance testing
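A rough frame-time logger can be built from the timestamps ARKit already delivers with every frame; a sketch, where the logging subsystem name is a placeholder:

```swift
import ARKit
import os

// Flag slow frames using the timestamps ARKit delivers with every ARFrame.
final class FrameTimeMonitor: NSObject, ARSessionDelegate {
    private var lastTimestamp: TimeInterval = 0
    private let log = Logger(subsystem: "com.example.arapp", category: "performance")

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        defer { lastTimestamp = frame.timestamp }
        guard lastTimestamp > 0 else { return }
        let deltaMs = (frame.timestamp - lastTimestamp) * 1000
        if deltaMs > 33 {   // slower than ~30 fps
            log.warning("Slow AR frame: \(deltaMs) ms")
        }
    }
}
```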
Common Issue Troubleshooting:
- Poor surface detection: Check lighting, texture, camera cleanliness
- Jittery content: Implement smoothing, check tracking quality
- Content drift: Validate scale, use anchors appropriately
- Battery drain: Profile and optimize render and physics loops
App Size & Distribution Optimization
Size Reduction Techniques
Asset Compression:
- Implement texture compression appropriate for platform
- Use mesh compression for 3D models
- Consider runtime download for optional content
- Optimize audio assets
Code Optimization:
- Remove unused code and libraries
- Implement code stripping
- Consider modular loading for optional features
- Optimize third-party dependencies
Platform-Specific Optimizations:
- Use App Thinning for iOS (slicing, bitcode, on-demand resources)
- Implement Android App Bundles and Dynamic Delivery
- Consider split APKs for ARCore dependency
- Optimize for 64-bit architecture
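For iOS on-demand resources, large AR assets can be tagged in Xcode and fetched only when the user opens the AR experience; a sketch, where "ar-models" is a hypothetical resource tag:

```swift
import Foundation

// Download a tagged on-demand resource pack (e.g. large 3D models) only when
// the user opens the AR experience; keep the request alive while the assets
// are in use, since releasing it ends access.
final class ARModelPackLoader {
    private let request = NSBundleResourceRequest(tags: ["ar-models"])

    func load(completion: @escaping (Bool) -> Void) {
        request.beginAccessingResources { error in
            DispatchQueue.main.async {
                completion(error == nil)   // assets now resolvable via Bundle.main
            }
        }
    }
}
```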
AR App Store Considerations
App Store (iOS):
- Clearly indicate AR capabilities in description
- Include AR-specific screenshots and preview videos
- Use AR App Preview format to demonstrate functionality
- Implement App Store optimization for AR keywords
Google Play (Android):
- Use “AR” badge in Google Play listing
- Include “AR Required” or “AR Optional” designation
- Optimize store listing with AR demonstrations
- Consider AR-specific tags and categories
AR Quick Look (iOS):
- Prepare USDZ files for AR Quick Look integration
- Implement Universal Links for AR Quick Look
- Consider web-based AR previews
- Optimize models for AR Quick Look performance
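Presenting a bundled USDZ model with AR Quick Look needs only a QLPreviewController and a data source; a sketch, where "Chair.usdz" is a placeholder asset:

```swift
import UIKit
import QuickLook

// Present a bundled USDZ model in AR Quick Look; the system supplies the
// object/AR toggle, placement, and sharing UI.
final class ModelPreviewPresenter: NSObject, QLPreviewControllerDataSource {
    func present(from viewController: UIViewController) {
        let preview = QLPreviewController()
        preview.dataSource = self
        viewController.present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        let url = Bundle.main.url(forResource: "Chair", withExtension: "usdz")!
        return url as NSURL   // NSURL conforms to QLPreviewItem
    }
}
```

Keep a strong reference to the presenter while the preview is on screen, since QLPreviewController holds its data source weakly.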
Security & Privacy Considerations
Camera Data Handling:
- Clearly communicate camera usage purpose
- Process camera data locally when possible
- Implement secure transmission for remote processing
- Consider privacy implications of environment scanning
User Data Protection:
- Minimize collection of identifiable environment data
- Implement proper data retention policies
- Consider anonymization of spatial data
- Clearly disclose data usage in privacy policy
Location Data Management:
- Request location permissions only when necessary
- Implement appropriate location accuracy levels
- Consider privacy implications of AR geolocation
- Allow users to opt out of location features
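A sketch of deferring the location prompt until a geo-anchored feature is actually opened, using the iOS 14+ CoreLocation API (the class name is illustrative):

```swift
import CoreLocation

// Defer the location prompt until the user opens a geo-anchored AR feature,
// and request only the accuracy that feature actually needs.
final class GeoARPermissions {
    private let manager = CLLocationManager()

    func requestIfNeeded() {
        guard manager.authorizationStatus == .notDetermined else { return }   // iOS 14+
        manager.desiredAccuracy = kCLLocationAccuracyHundredMeters
        manager.requestWhenInUseAuthorization()
    }
}
```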
Future-Proofing AR Applications
Emerging Hardware Support:
- Design for headset/glasses transition
- Consider depth sensor integration
- Prepare for wider FOV displays
- Implement scalable content for varying capabilities
Technology Evolution:
- Design for cloud anchors and persistent AR
- Consider multi-user AR experiences
- Prepare for improved environmental understanding
- Implement modular tracking systems
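For persistent AR on iOS today, ARKit world maps can be archived and restored across sessions; a sketch of the save side, where `url` is assumed to point into the app's Documents directory:

```swift
import ARKit

// Archive the current world map so previously placed anchors can be restored
// in a later session (the basis of persistent and shared AR in ARKit).
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else { return }
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true) {
            try? data.write(to: url)
        }
    }
}
```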
Platform Changes:
- Follow platform deprecation timelines
- Maintain awareness of ARKit/ARCore roadmaps
- Consider cross-platform abstraction layers
- Prepare for Web AR evolution
Resources for AR Mobile Development
Documentation & Learning
- Apple ARKit Documentation
- Google ARCore Documentation
- Unity AR Foundation Manual
- Vuforia Developer Portal
- Wikitude SDK Documentation
Development Tools
- Reality Composer (iOS)
- AR Scene Validator (Android)
- Unity AR Companion App
- Spark AR Studio (Facebook)
- Lens Studio (Snapchat)
Community & Support
- AR/VR Developer Forums (Unity)
- ARKit Developer Forum (Apple)
- ARCore Developer Community (Google)
- Stack Overflow AR Tags
- Reddit r/augmentedreality
By understanding and implementing these frameworks, techniques, and best practices, developers can create compelling and effective augmented reality experiences for mobile platforms that engage users and leverage the unique capabilities of AR technology.
