ARKit vs. ARCore: The Ultimate AR Development Cheatsheet

Introduction

Augmented Reality (AR) technology overlays digital content onto the real world, creating interactive experiences that blend virtual elements with physical environments. Apple’s ARKit and Google’s ARCore are the two leading AR development frameworks, each providing tools for mobile AR experiences on their respective platforms. This cheatsheet provides a comprehensive comparison to help developers choose the right framework and leverage the strengths of each platform for creating compelling AR applications.

Core Concepts & Fundamentals

Key AR Terminology

| Term | Definition |
|---|---|
| World Tracking | Technology that maps the physical environment and tracks device movement within it |
| Plane Detection | Identification of flat surfaces like floors, tables, and walls |
| Anchor | Reference point in 3D space that maintains its position relative to the real world |
| Hit Testing | Process of casting a ray from the device to determine where it intersects with detected surfaces |
| Light Estimation | Analysis of real-world lighting to realistically illuminate virtual objects |
| Occlusion | Technique allowing real-world objects to appear in front of virtual content |
| Session | AR experience instance that manages tracking state and AR resources |
| Tracking State | Current quality of environment mapping (normal, limited, not available) |
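
Several of these terms map directly onto API surface. As a minimal Swift sketch (a standalone delegate object, not part of any later example), tracking-state changes can be observed like this:

// Sketch: observing tracking state changes via the ARSession delegate
import ARKit

final class TrackingStateObserver: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
        switch camera.trackingState {
        case .normal:
            print("Tracking normal: safe to place content")
        case .limited(let reason):
            print("Tracking limited: \(reason)") // .initializing, .excessiveMotion, .insufficientFeatures, .relocalizing
        case .notAvailable:
            print("Tracking not available")
        }
    }
}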

Platform Comparison at a Glance

| Feature | ARKit (Apple) | ARCore (Google) |
|---|---|---|
| Initial Release | June 2017 (iOS 11) | March 2018 |
| Latest Major Version | ARKit 6 (iOS 16 and later) | ARCore 1.40 |
| Supported Devices | iPhone 6s and newer, iPad (5th gen) and newer | 1,000+ Android device models meeting minimum specs |
| Development Environment | Xcode | Android Studio |
| Primary Languages | Swift, Objective-C | Java, Kotlin |
| Primary 3D Engines | RealityKit, SceneKit, Unity, Unreal | Sceneform (deprecated), Filament, Unity, Unreal |
| Cloud Anchors | No first-party equivalent (shared world maps instead) | Yes (persistent and shared experiences) |
| Face Tracking | Yes (TrueDepth camera required) | Yes |
| Body Tracking | Yes | Limited |
| Hand Tracking | Via the Vision framework | Limited |

Framework Architecture & Components

ARKit Architecture

// Basic ARKit session setup
import ARKit
import RealityKit

class ViewController: UIViewController, ARSessionDelegate {
    @IBOutlet var arView: ARView!
    
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        
        // Start AR session
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]
        configuration.environmentTexturing = .automatic
        
        arView.session.delegate = self
        arView.session.run(configuration)
    }
    
    func session(_ session: ARSession, didFailWithError error: Error) {
        // Handle session failures
    }
    
    func sessionWasInterrupted(_ session: ARSession) {
        // Handle session interruptions
    }
}

ARCore Architecture

// Basic ARCore session setup
import com.google.ar.core.ArCoreApk
import com.google.ar.core.Config
import com.google.ar.core.Session
import com.google.ar.core.exceptions.CameraNotAvailableException

class MainActivity : AppCompatActivity() {
    private var arCoreSession: Session? = null
    
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        
        // Check AR availability (the result can be transient; if so, re-check after a short delay)
        val availability = ArCoreApk.getInstance().checkAvailability(this)
        if (availability.isSupported) {
            try {
                arCoreSession = Session(this)
                val config = Config(arCoreSession)
                config.planeFindingMode = Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL
                config.lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
                arCoreSession?.configure(config)
            } catch (e: Exception) {
                // Handle session creation failure
            }
        }
    }
    
    override fun onResume() {
        super.onResume()
        try {
            arCoreSession?.resume()
        } catch (e: CameraNotAvailableException) {
            // Handle camera not available
        }
    }
    
    override fun onPause() {
        super.onPause()
        arCoreSession?.pause()
    }
}

Feature Comparison in Detail

Environment Understanding

| Feature | ARKit (Apple) | ARCore (Google) |
|---|---|---|
| Plane Detection | Horizontal and vertical surfaces | Horizontal and vertical surfaces |
| Scene Reconstruction | Detailed mesh via LiDAR Scanner (iPad Pro 2020+, iPhone 12 Pro and later Pro models) | Scene mesh via the Depth API (limited devices) |
| Object Detection | Yes, with ML integration | Yes, with ML integration |
| Image Recognition | Up to 100 reference images | Up to 1,000 reference images per database |
| Environment Texturing | Automatic environment texturing for realistic reflections | Limited environment reflection maps |
| Geometry Awareness | Advanced with LiDAR | Basic with the ARCore Depth API |
| People Occlusion | Full-body occlusion on newer devices | Limited people segmentation |
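
Scene reconstruction and people occlusion are hardware-gated, so they are best enabled conditionally. A minimal ARKit sketch along these lines (configuration only, no rendering):

// Sketch: enable mesh reconstruction and people occlusion only where supported
import ARKit

func makeWorldConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()

    // LiDAR-backed scene mesh (iPad Pro 2020+ and Pro-model iPhones)
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // People occlusion requires an A12 chip or newer
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }

    return configuration
}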

Tracking Capabilities

| Feature | ARKit (Apple) | ARCore (Google) |
|---|---|---|
| World Tracking Accuracy | Very high, especially with LiDAR | Good on supported devices |
| Tracking Recovery | Fast recovery after interruption | Moderate recovery speed |
| Motion Tracking | 6DoF (degrees of freedom) | 6DoF (degrees of freedom) |
| Extended Tracking | Yes, persistent AR between sessions via ARWorldMap (sketch below) | Limited persistence without Cloud Anchors |
| Location Anchors | Yes, geo-anchors in supported cities | Yes, Geospatial API with near-worldwide coverage |
| Tracking in Low Light | Better on LiDAR devices | Limited in low-light conditions |
| Relocalization | Visual localization in supported locations | Visual Positioning System backed by Google Maps data |
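
ARKit's extended tracking rests on ARWorldMap, which can be archived and restored across sessions. A minimal sketch, assuming the file URL and real error handling are supplied by the app:

// Sketch: persisting and restoring an ARWorldMap between sessions
import ARKit

func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let worldMap = worldMap else { return }
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                        requiringSecureCoding: true) {
            try? data.write(to: url)
        }
    }
}

func restoreWorldMap(into session: ARSession, from url: URL) {
    guard let data = try? Data(contentsOf: url),
          let worldMap = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                                 from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = worldMap // relocalize against the saved map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}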

Real-World Integration

| Feature | ARKit (Apple) | ARCore (Google) |
|---|---|---|
| Light Estimation | HDR environment probes, spherical harmonics | Environmental HDR, directional light |
| Physics Simulation | Built-in physics in RealityKit | Requires a third-party physics engine |
| Audio Spatialization | Spatial audio support | Basic audio positioning |
| Video Texturing | Yes | Yes |
| Real-World Scale | High accuracy | Good accuracy |
| Object Occlusion | Advanced with LiDAR | Basic depth-based occlusion |
| Environment Probes | Automatic environment cubemaps | Manual environment cubemaps |
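
On the ARKit side, the light-estimation values above are exposed per frame through ARLightEstimate. A minimal sketch that simply reads them out:

// Sketch: reading ARKit's per-frame light estimate
import ARKit

final class LightEstimateReader: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let estimate = frame.lightEstimate else { return }
        // Ambient intensity in lumens (~1000 is neutral) and color temperature in Kelvin
        let intensity = estimate.ambientIntensity
        let temperature = estimate.ambientColorTemperature
        // Feed these into your renderer's lights, e.g. an SCNLight in SceneKit
        print("intensity: \(intensity), temperature: \(temperature)")
    }
}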

Human Interaction Features

| Feature | ARKit (Apple) | ARCore (Google) |
|---|---|---|
| Face Tracking | 52 blend shapes, detailed tracking (TrueDepth camera) | Basic face mesh via the Augmented Faces API |
| Body Tracking | Full skeleton tracking | Limited body pose detection |
| Hand Tracking | Hand pose and finger tracking via the Vision framework | Basic hand detection |
| Eye Tracking | Yes, on supported devices | Limited |
| Emotion Detection | Yes, via blend shapes | Limited |
| Avatar Creation | Memoji, detailed avatars | Basic avatars |
| Motion Capture | Full-body motion capture | Limited |
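
ARKit delivers its 52 blend-shape coefficients on every ARFaceAnchor update. A minimal sketch (TrueDepth devices only; jawOpen is just one example coefficient):

// Sketch: reading blend shape coefficients from an ARFaceAnchor
import ARKit

final class FaceTracker: NSObject, ARSessionDelegate {
    func start(on session: ARSession) {
        guard ARFaceTrackingConfiguration.isSupported else { return } // TrueDepth required
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Coefficients range 0.0-1.0; e.g. jawOpen can drive an avatar's mouth
            if let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue {
                print("jawOpen: \(jawOpen)")
            }
        }
    }
}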

Shared/Multi-User Experiences

| Feature | ARKit (Apple) | ARCore (Google) |
|---|---|---|
| Framework Support | MultipeerConnectivity, RealityKit synchronization | Cloud Anchors, Google Firestore |
| Real-Time Sharing | Yes | Yes |
| Persistence | Limited without iCloud integration | Yes, with Cloud Anchors (up to 365 days) |
| User Identification | Basic | Basic |
| Collaborative Mapping | Yes, with shared world mapping | Limited; requires Cloud Anchors |
| Cross-Platform | Limited | Yes; ARCore Cloud Anchors also work on iOS |
| Scaling | Primarily local network | Global, via cloud infrastructure |
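
On iOS, shared experiences typically pair ARKit's collaboration data with a transport such as MultipeerConnectivity. A minimal sketch of the ARKit half, with the networking send closure left as an assumption:

// Sketch: enabling ARKit collaboration and relaying its data over your own transport
import ARKit

final class CollaborationHandler: NSObject, ARSessionDelegate {
    var send: (Data) -> Void = { _ in } // plug in MultipeerConnectivity here (assumed)

    func start(on session: ARSession) {
        let configuration = ARWorldTrackingConfiguration()
        configuration.isCollaborationEnabled = true // stream mapping data to peers
        session.delegate = self
        session.run(configuration)
    }

    // ARKit emits collaboration data as the shared map evolves; forward it to peers
    func session(_ session: ARSession, didOutputCollaborationData data: ARSession.CollaborationData) {
        if let encoded = try? NSKeyedArchiver.archivedData(withRootObject: data,
                                                           requiringSecureCoding: true) {
            send(encoded)
        }
    }

    // When data arrives from a peer, hand it back to the local session
    func receive(_ encoded: Data, into session: ARSession) {
        if let data = try? NSKeyedUnarchiver.unarchivedObject(
            ofClass: ARSession.CollaborationData.self, from: encoded) {
            session.update(with: data)
        }
    }
}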

Development Workflow & Tools

ARKit Development Process

  1. Environment Setup

    • Install latest Xcode
    • iOS device with A9 processor or newer
    • Apple Developer account
  2. Project Configuration

    • Create new AR project in Xcode
    • Add camera usage description in Info.plist
    • Configure AR capabilities in project settings
  3. Development Loop

    1. Configure AR session (a runtime support check is sketched after this list)
    2. Place virtual content
    3. Handle user interactions
    4. Test on device (simulator not supported)
    5. Debug with AR debug options
    
  4. Testing Tools

    • Reality Composer for prototyping
    • AR Quick Look for testing 3D models
    • Xcode AR debugging tools
  5. Deployment

    • App Store submission
    • TestFlight for beta testing
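
As referenced in the development loop above, a quick runtime check avoids starting a session on unsupported hardware. A minimal sketch:

// Sketch: verifying AR support before starting a session
import ARKit

func startIfSupported(on session: ARSession) {
    guard ARWorldTrackingConfiguration.isSupported else {
        // World tracking needs an A9 chip or newer; show a graceful fallback UI instead
        return
    }
    session.run(ARWorldTrackingConfiguration())
}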

ARCore Development Process

  1. Environment Setup

    • Install Android Studio
    • ARCore supported Android device
    • Google Play Services for AR
  2. Project Configuration

    • Add ARCore dependency to build.gradle
    • Add camera permission in AndroidManifest.xml
    • Configure AR features in app settings
  3. Development Loop

    1. Check ARCore availability
    2. Create and configure AR session
    3. Set up renderers (OpenGL ES, Sceneform, etc.)
    4. Manage session lifecycle
    5. Test on device (emulator support limited)
    
  4. Testing Tools

    • ARCore Scene Viewer
    • Sceneform SDK (deprecated but still usable)
    • Android Studio AR debugging tools
  5. Deployment

    • Google Play Store submission
    • Google Play App Testing

Implementation Examples

ARKit: Placing a Virtual Object on Detected Plane

// Using RealityKit
import ARKit
import RealityKit

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!
    
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        
        // Start AR session
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        arView.session.run(configuration)
        
        // Setup tap gesture
        let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap))
        arView.addGestureRecognizer(tapGesture)
    }
    
    @objc func handleTap(_ sender: UITapGestureRecognizer) {
        let tapLocation = sender.location(in: arView)
        
        // Raycast to find where tap intersects with detected plane
        let results = arView.raycast(from: tapLocation, 
                                   allowing: .estimatedPlane, 
                                  alignment: .horizontal)
        
        if let firstResult = results.first {
            // Create anchor at hit location
            let anchor = AnchorEntity(world: firstResult.worldTransform)
            
            // Load and place the 3D model (skip placement gracefully if the asset is missing)
            guard let modelEntity = try? Entity.load(named: "toy_robot") else { return }
            modelEntity.scale = [0.1, 0.1, 0.1] // Scale down the model
            
            // Add model to anchor
            anchor.addChild(modelEntity)
            
            // Add anchor to scene
            arView.scene.addAnchor(anchor)
        }
    }
}

ARCore: Placing a Virtual Object on Detected Plane

// Using Sceneform (though deprecated, still commonly used)
import com.google.ar.core.HitResult
import com.google.ar.core.Plane
import com.google.ar.sceneform.AnchorNode
import com.google.ar.sceneform.rendering.ModelRenderable
import com.google.ar.sceneform.ux.ArFragment
import com.google.ar.sceneform.ux.TransformableNode

class MainActivity : AppCompatActivity() {
    private lateinit var arFragment: ArFragment
    private var modelRenderable: ModelRenderable? = null
    
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        
        arFragment = supportFragmentManager.findFragmentById(R.id.ar_fragment) as ArFragment
        
        // Load 3D model
        ModelRenderable.builder()
            .setSource(this, R.raw.toy_robot)
            .build()
            .thenAccept { renderable -> modelRenderable = renderable }
            .exceptionally { 
                Log.e("MainActivity", "Unable to load model", it)
                null
            }
        
        // Set up tap listener
        arFragment.setOnTapArPlaneListener { hitResult: HitResult, plane: Plane, motionEvent: MotionEvent ->
            if (modelRenderable == null) return@setOnTapArPlaneListener
            
            // Create anchor at hit location
            val anchor = hitResult.createAnchor()
            val anchorNode = AnchorNode(anchor)
            anchorNode.setParent(arFragment.arSceneView.scene)
            
            // Create transformable node for model
            val modelNode = TransformableNode(arFragment.transformationSystem)
            modelNode.setParent(anchorNode)
            modelNode.renderable = modelRenderable
            modelNode.select()
        }
    }
}

Performance Optimization Tips

ARKit Performance Tips

  • CPU Optimization

    • Use preferredFramesPerSecond to limit unnecessary processing
    • Only enable necessary detection features
    • Reduce polygon count for 3D models
    • Use LOD (Level of Detail) for complex models
  • Memory Management

    • Ensure proper resource disposal
    • Load assets asynchronously (see the sketch after these tips)
    • Use compressed textures
    • Implement proper view lifecycle management
  • Battery Considerations

    • Pause AR session when app goes into background
    • Reduce rendering quality when battery is low
    • Use sceneUnderstanding only when needed
  • ARKit-Specific Tips

    // Optimize plane detection performance
    configuration.planeDetection = [.horizontal] // Only detect horizontal planes if vertical not needed
    
    // Use automatic environment texturing selectively
    configuration.environmentTexturing = .manual // Control when environment textures are updated
    
    // Optimize image tracking
    configuration.maximumNumberOfTrackedImages = 1 // Only track as many images as needed
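
The memory-management bullets above recommend asynchronous loading. A minimal RealityKit sketch using the Combine-based loader (assuming a bundled "toy_robot" asset):

// Sketch: loading a model asynchronously so the main thread stays responsive
import RealityKit
import Combine

final class AssetLoader {
    private var cancellable: AnyCancellable?

    func loadRobot(into anchor: AnchorEntity) {
        cancellable = ModelEntity.loadModelAsync(named: "toy_robot")
            .sink(receiveCompletion: { completion in
                if case .failure(let error) = completion {
                    print("Model load failed: \(error)")
                }
            }, receiveValue: { model in
                anchor.addChild(model) // attach only once loading completes
            })
    }
}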
    

ARCore Performance Tips

  • CPU Optimization

    • Use appropriate Config.UpdateMode (LATEST_CAMERA_IMAGE vs. BLOCKING)
    • Reduce GL draw calls
    • Implement frame throttling when appropriate
    • Use simplified collision meshes
  • Memory Management

    • Manage texture resources carefully
    • Implement proper Activity lifecycle handling
    • Use texture compression
    • Release resources when not in view
  • Battery Considerations

    • Pause session in onPause()
    • Implement battery-aware quality settings
    • Use focused hit testing instead of continuous raycasting
  • ARCore-Specific Tips

    // Optimize plane detection performance
    config.planeFindingMode = Config.PlaneFindingMode.HORIZONTAL // Only detect horizontal planes if vertical not needed
    
    // Optimize light estimation
    config.lightEstimationMode = Config.LightEstimationMode.DISABLED // Disable if not needed
    
    // Configure camera focus
    config.setFocusMode(Config.FocusMode.AUTO) // Use FIXED for mostly-static scenes
    

Common Challenges & Solutions

| Challenge | ARKit Solution | ARCore Solution |
|---|---|---|
| Poor Tracking | Ensure good lighting; move the device slowly; use ARCoachingOverlayView (sketched below) | Check TrackingFailureReason; implement a motion-tracking guide; display tracking state to the user |
| Object Placement Inaccuracy | Use ARRaycastQuery with target alignment; enable LiDAR for accurate placement | Filter hit-test results; implement placement indicators; average multiple raycasts |
| Unstable Anchors | Use ARWorldMap for persistence; implement anchor refinement | Use Cloud Anchors; implement anchor refinement logic |
| Lighting Inconsistency | Use probe-based lighting; apply custom shaders for light adaptation | Implement custom shaders; use LightEstimate with intensity compensation |
| Poor Performance | Reduce polygon count; use RealityKit's automatic LOD; implement occlusion culling | Use instanced rendering; reduce draw calls; implement frustum culling |
| Battery Drain | Pause the AR session when not in focus; reduce frame rate; optimize hit testing | Manage the session lifecycle; use Config.UpdateMode.BLOCKING; throttle frame processing |
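
The ARCoachingOverlayView mentioned for poor tracking takes only a few lines to wire up. A minimal sketch:

// Sketch: adding ARKit's built-in coaching overlay to guide users during tracking loss
import ARKit
import RealityKit
import UIKit

func addCoachingOverlay(to arView: ARView) {
    let coachingOverlay = ARCoachingOverlayView()
    coachingOverlay.session = arView.session
    coachingOverlay.goal = .horizontalPlane        // what the user should help find
    coachingOverlay.activatesAutomatically = true  // shows itself when tracking degrades
    coachingOverlay.frame = arView.bounds
    coachingOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    arView.addSubview(coachingOverlay)
}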

Cross-Platform Development Strategies

Unity AR Development

// Unity example with AR Foundation (works with both ARKit and ARCore)
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class ARPlacement : MonoBehaviour
{
    [SerializeField] private GameObject placementPrefab;
    [SerializeField] private ARRaycastManager raycastManager;
    [SerializeField] private ARPlaneManager planeManager;
    
    private static List<ARRaycastHit> hits = new List<ARRaycastHit>();
    
    void Awake()
    {
        raycastManager = GetComponent<ARRaycastManager>();
        planeManager = GetComponent<ARPlaneManager>();
    }
    
    void Update()
    {
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);
            
            if (touch.phase == TouchPhase.Began)
            {
                if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
                {
                    Pose hitPose = hits[0].pose;
                    Instantiate(placementPrefab, hitPose.position, hitPose.rotation);
                }
            }
        }
    }
}

Platform-Specific Optimization

| Strategy | Implementation |
|---|---|
| Feature Detection | Check at runtime which AR features are available on the device (see the sketch below) |
| Asset Adaptation | Use different LOD models based on device performance |
| Configuration Tuning | Adjust session configuration based on platform capabilities |
| Fallback Mechanisms | Implement degraded experiences for devices with limited capabilities |
| UI Adaptation | Adapt UI for different device form factors and input methods |
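
On the ARKit side, the feature-detection strategy might look like the following minimal sketch (capability flags only; the fallbacks themselves are app-specific):

// Sketch: detecting optional AR capabilities at runtime and adapting
import ARKit

struct ARCapabilities {
    let worldTracking = ARWorldTrackingConfiguration.isSupported
    let faceTracking = ARFaceTrackingConfiguration.isSupported
    let bodyTracking = ARBodyTrackingConfiguration.isSupported
    let geoTracking = ARGeoTrackingConfiguration.isSupported
    let sceneMesh = ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh)
}

let caps = ARCapabilities()
if !caps.sceneMesh {
    // Fall back to plane-based occlusion instead of mesh occlusion
}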

Framework Comparison for Cross-Platform Development

| Framework | Pros | Cons | Best For |
|---|---|---|---|
| Unity AR Foundation | Single codebase; good feature parity; popular ecosystem | Performance overhead; sometimes lags behind native updates | Most cross-platform AR apps |
| Unreal Engine | High visual quality; Blueprint visual scripting; good performance | Steeper learning curve; larger app size | Graphically intensive AR experiences |
| React Native + AR | JavaScript development; web-developer friendly; faster iteration | Performance limitations; limited AR feature access | Simple AR visualization apps |
| Flutter + AR | Dart language; fast development; good UI toolkit | Emerging AR support; less mature AR plugins | UI-focused AR applications |

Resources for Further Learning

Official Documentation

  • ARKit: https://developer.apple.com/documentation/arkit
  • ARCore: https://developers.google.com/ar

