Introduction
Augmented Reality (AR) technology overlays digital content onto the real world, creating interactive experiences that blend virtual elements with physical environments. Apple’s ARKit and Google’s ARCore are the two leading AR development frameworks, each providing tools for mobile AR experiences on their respective platforms. This cheatsheet provides a comprehensive comparison to help developers choose the right framework and leverage the strengths of each platform for creating compelling AR applications.
Core Concepts & Fundamentals
Key AR Terminology
Term | Definition |
---|---|
World Tracking | Technology that maps the physical environment and tracks device movement within it |
Plane Detection | Identification of flat surfaces like floors, tables, and walls |
Anchor | Reference point in 3D space that maintains its position relative to the real world |
Hit Testing | Process of casting a ray from the device to determine where it intersects with detected surfaces |
Light Estimation | Analysis of real-world lighting to realistically illuminate virtual objects |
Occlusion | Technique allowing real-world objects to appear in front of virtual content |
Session | AR experience instance that manages tracking state and AR resources |
Tracking State | Current quality of environment mapping (normal, limited, not available) |
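To make the last two terms concrete, here is a minimal Swift sketch of how ARKit reports tracking state through the session delegate (ARCore exposes the equivalent via Camera.getTrackingState() on each frame):

// Reacting to tracking-state changes via the ARSessionDelegate protocol
func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
    switch camera.trackingState {
    case .normal:
        break // Full-quality tracking; safe to place virtual content
    case .limited(let reason):
        print("Tracking limited: \(reason)") // e.g. .insufficientFeatures, .excessiveMotion
    case .notAvailable:
        print("Tracking not available") // Hide or freeze virtual content
    }
}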
Platform Comparison at a Glance
Feature | ARKit (Apple) | ARCore (Google) |
---|---|---|
Initial Release | June 2017 (iOS 11) | March 2018 |
Latest Version | ARKit 6 (iOS 16 and later) | ARCore 1.40 |
Supported Devices | iPhone 6s and newer, iPad (5th gen) and newer | 1000+ ARCore-certified Android device models |
Development Environment | Xcode | Android Studio |
Primary Languages | Swift, Objective-C | Java, Kotlin |
Primary 3D Engines | RealityKit, SceneKit, Unity, Unreal | Sceneform (deprecated), Filament, Unity, Unreal |
Cloud Anchors | No first-party service (shared experiences via collaborative sessions) | Yes (persistent and shared experiences) |
Face Tracking | Yes (TrueDepth camera required) | Yes |
Body Tracking | Yes | Limited |
Hand Tracking | Yes (via the Vision framework) | Limited |
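Rather than hard-coding a device list, support can be verified at runtime. A minimal Swift sketch using ARKit's capability flags (ARCore's equivalent is the ArCoreApk.checkAvailability() call shown in the session-setup example below):

// Query ARKit capability support before starting a session
if ARWorldTrackingConfiguration.isSupported {
    // World tracking available (requires an A9 chip or newer)
}
if ARFaceTrackingConfiguration.isSupported {
    // Face tracking available (TrueDepth camera)
}
if ARBodyTrackingConfiguration.isSupported {
    // Body tracking available (A12 Bionic or newer)
}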
Framework Architecture & Components
ARKit Architecture
// Basic ARKit session setup
import ARKit
import RealityKit
class ViewController: UIViewController, ARSessionDelegate {
@IBOutlet var arView: ARView!
override func viewDidAppear(_ animated: Bool) {
super.viewDidAppear(animated)
// Start AR session
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]
configuration.environmentTexturing = .automatic
arView.session.delegate = self
arView.session.run(configuration)
}
func session(_ session: ARSession, didFailWithError error: Error) {
// Handle session failures
}
func sessionWasInterrupted(_ session: ARSession) {
// Handle session interruptions
}
}
ARCore Architecture
// Basic ARCore session setup
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import com.google.ar.core.ArCoreApk
import com.google.ar.core.Config
import com.google.ar.core.Session
import com.google.ar.core.exceptions.CameraNotAvailableException
class MainActivity : AppCompatActivity() {
private var arCoreSession: Session? = null
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
// Check AR availability
val availability = ArCoreApk.getInstance().checkAvailability(this)
if (availability.isSupported) {
try {
arCoreSession = Session(this)
val config = Config(arCoreSession)
config.planeFindingMode = Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL
config.lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
arCoreSession?.configure(config)
} catch (e: Exception) {
// Handle session creation failure
}
}
}
override fun onResume() {
super.onResume()
try {
arCoreSession?.resume()
} catch (e: CameraNotAvailableException) {
// Handle camera not available
}
}
override fun onPause() {
super.onPause()
arCoreSession?.pause()
}
}
Feature Comparison in Detail
Environment Understanding
Feature | ARKit (Apple) | ARCore (Google) |
---|---|---|
Plane Detection | Horizontal and vertical surfaces | Horizontal and vertical surfaces |
Scene Reconstruction | LiDAR Scanner enables detailed mesh creation (iPad Pro 2020+, iPhone 12 Pro+) | Scene mesh via Depth API (limited devices) |
Object Detection | Yes, with ML integration | Yes, with ML integration |
Image Recognition | Up to 100 reference images | Up to 1000 reference images |
Environment Texturing | Automatic environment texturing for realistic reflections | Limited environment reflection maps |
Geometry Awareness | Advanced with LiDAR | Basic with ARCore Depth API |
People Occlusion | Full body occlusion on newer devices | Limited people segmentation |
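On ARKit, the LiDAR scene mesh and people occlusion from the table above are opt-in configuration flags. A minimal Swift sketch, guarded so it degrades gracefully on devices without LiDAR (assumes an arView property as in the earlier setup code):

// Enable scene reconstruction and people occlusion where supported
let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh // LiDAR-backed scene mesh
}
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    config.frameSemantics.insert(.personSegmentationWithDepth) // People occlusion
}
arView.session.run(config)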
Tracking Capabilities
Feature | ARKit (Apple) | ARCore (Google) |
---|---|---|
World Tracking Accuracy | Very high, especially with LiDAR | Good on supported devices |
Tracking Recovery | Fast recovery after interruption | Moderate recovery speed |
Motion Tracking | 6DOF (degrees of freedom) | 6DOF (degrees of freedom) |
Extended Tracking | Yes, persistent AR between sessions | Limited persistence without Cloud Anchors |
Location Anchors | Yes, with geo-anchors in supported cities | Yes, with the Geospatial API wherever Street View coverage exists |
Tracking in Low Light | Better with LiDAR devices | Limited in low light conditions |
Relocalization | Visual Positioning System in supported locations | Visual Positioning System with Google Maps |
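ARKit's location anchors require an explicit availability check, since geo-tracking only works where Apple has imagery coverage. A minimal Swift sketch (again assuming an arView property):

// Check geo-tracking availability at the current location before running
ARGeoTrackingConfiguration.checkAvailability { available, error in
    guard available else { return } // Fall back to plain world tracking
    DispatchQueue.main.async {
        self.arView.session.run(ARGeoTrackingConfiguration())
    }
}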
Real-World Integration
Feature | ARKit (Apple) | ARCore (Google) |
---|---|---|
Light Estimation | HDR environment probes, spherical harmonics | Environmental HDR, directional light |
Physics Simulation | Built-in physics in RealityKit | Requires third-party physics engine |
Audio Spatialization | Spatial audio support | Basic audio positioning |
Video Texturing | Yes | Yes |
Real-World Scale | High accuracy | Good accuracy |
Object Occlusion | Advanced with LiDAR | Basic depth-based occlusion |
Environment Probes | Automatic environment cubemaps | Manual environment cubemaps |
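Light-estimation values are read per frame and can be applied to virtual light sources. A minimal Swift sketch of the ARKit side (ARCore delivers the analogous data through Frame.getLightEstimate()):

// Read ARKit's per-frame light estimate in an ARSessionDelegate callback
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    guard let estimate = frame.lightEstimate else { return }
    let intensity = estimate.ambientIntensity          // Lumens; ~1000 is neutral indoor light
    let temperature = estimate.ambientColorTemperature // Kelvin; ~6500 is daylight white
    // Apply intensity/temperature to your virtual lights here
    _ = (intensity, temperature)
}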
Human Interaction Features
Feature | ARKit (Apple) | ARCore (Google) |
---|---|---|
Face Tracking | 52 blend shapes, detailed tracking (TrueDepth camera) | Basic face mesh with ARCore Face API |
Body Tracking | Full skeleton tracking | Limited body pose detection |
Hand Tracking | Detailed hand pose, finger tracking | Basic hand detection |
Eye Tracking | Yes, on supported devices | Limited |
Emotion Detection | Yes, via blend shapes | Limited |
Avatar Creation | Memoji, detailed avatars | Basic avatars |
Motion Capture | Full body motion capture | Limited |
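ARKit's blend shapes arrive as per-anchor coefficients between 0 and 1. A minimal Swift sketch reading a smile coefficient (assumes a session running ARFaceTrackingConfiguration):

// Read face blend-shape coefficients from updated anchors
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let faceAnchor as ARFaceAnchor in anchors {
        let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
        _ = smile // 0 = neutral, 1 = full smile; drive avatars or emotion logic from this
    }
}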
Shared/Multi-User Experiences
Feature | ARKit (Apple) | ARCore (Google) |
---|---|---|
Framework Support | MultipeerConnectivity, RealityKit Synchronization | Cloud Anchors, Google Firestore |
Real-time Sharing | Yes | Yes |
Persistence | Limited without iCloud integration | Yes, with Cloud Anchors (up to 365 days) |
User Identification | Basic | Basic |
Collaborative Mapping | Yes, with shared world mapping | Limited, requires Cloud Anchors |
Cross-Platform | Limited | Yes, ARCore Cloud Anchors also work on iOS |
Scaling Limitations | Local network primarily | Global with cloud infrastructure |
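On the ARKit side, multi-user sessions are built from collaboration data exchanged over a transport such as MultipeerConnectivity. A minimal Swift sketch of the ARKit half only; the networking layer (the hypothetical sendToPeers) is left out:

// Enable collaboration and forward collaboration data to peers
let config = ARWorldTrackingConfiguration()
config.isCollaborationEnabled = true
arView.session.run(config)

func session(_ session: ARSession, didOutputCollaborationData data: ARSession.CollaborationData) {
    // Serialize and send to peers, e.g., over MultipeerConnectivity
    if let payload = try? NSKeyedArchiver.archivedData(withRootObject: data, requiringSecureCoding: true) {
        // sendToPeers(payload) // Hypothetical networking call
    }
}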
Development Workflow & Tools
ARKit Development Process
Environment Setup
- Install latest Xcode
- iOS device with A9 processor or newer
- Apple Developer account
Project Configuration
- Create new AR project in Xcode
- Add camera usage description in Info.plist
- Configure AR capabilities in project settings
Development Loop
1. Configure AR session
2. Place virtual content
3. Handle user interactions
4. Test on device (simulator not supported)
5. Debug with AR debug options
Testing Tools
- Reality Composer for prototyping
- AR Quick Look for testing 3D models
- Xcode AR debugging tools
Deployment
- App Store submission
- TestFlight for beta testing
ARCore Development Process
Environment Setup
- Install Android Studio
- ARCore supported Android device
- Google Play Services for AR
Project Configuration
- Add ARCore dependency to build.gradle
- Add camera permission in AndroidManifest.xml
- Configure AR features in app settings
Development Loop
1. Check ARCore availability
2. Create and configure AR session
3. Set up renderers (OpenGL ES, Sceneform, etc.)
4. Manage session lifecycle
5. Test on device (emulator support limited)
Testing Tools
- ARCore Scene Viewer
- Sceneform SDK (deprecated but still usable)
- Android Studio AR debugging tools
Deployment
- Google Play Store submission
- Google Play App Testing
Implementation Examples
ARKit: Placing a Virtual Object on Detected Plane
// Using RealityKit
import ARKit
import RealityKit
class ViewController: UIViewController {
@IBOutlet var arView: ARView!
override func viewDidAppear(_ animated: Bool) {
super.viewDidAppear(animated)
// Start AR session
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal]
arView.session.run(configuration)
// Setup tap gesture
let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap))
arView.addGestureRecognizer(tapGesture)
}
@objc func handleTap(_ sender: UITapGestureRecognizer) {
let tapLocation = sender.location(in: arView)
// Raycast to find where tap intersects with detected plane
let results = arView.raycast(from: tapLocation,
allowing: .estimatedPlane,
alignment: .horizontal)
if let firstResult = results.first {
// Create anchor at hit location
let anchor = AnchorEntity(world: firstResult.worldTransform)
// Load and place the 3D model (assumes a "toy_robot" asset in the app bundle)
let modelEntity = try! ModelEntity.loadModel(named: "toy_robot")
modelEntity.scale = [0.1, 0.1, 0.1] // Scale down the model
// Add model to anchor
anchor.addChild(modelEntity)
// Add anchor to scene
arView.scene.addAnchor(anchor)
}
}
}
ARCore: Placing a Virtual Object on Detected Plane
// Using Sceneform (though deprecated, still commonly used)
import android.os.Bundle
import android.util.Log
import android.view.MotionEvent
import androidx.appcompat.app.AppCompatActivity
import com.google.ar.core.HitResult
import com.google.ar.core.Plane
import com.google.ar.sceneform.AnchorNode
import com.google.ar.sceneform.rendering.ModelRenderable
import com.google.ar.sceneform.ux.ArFragment
import com.google.ar.sceneform.ux.TransformableNode
class MainActivity : AppCompatActivity() {
private lateinit var arFragment: ArFragment
private var modelRenderable: ModelRenderable? = null
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
arFragment = supportFragmentManager.findFragmentById(R.id.ar_fragment) as ArFragment
// Load 3D model
ModelRenderable.builder()
.setSource(this, R.raw.toy_robot)
.build()
.thenAccept { renderable -> modelRenderable = renderable }
.exceptionally {
Log.e("MainActivity", "Unable to load model", it)
null
}
// Set up tap listener
arFragment.setOnTapArPlaneListener { hitResult: HitResult, plane: Plane, motionEvent: MotionEvent ->
if (modelRenderable == null) return@setOnTapArPlaneListener
// Create anchor at hit location
val anchor = hitResult.createAnchor()
val anchorNode = AnchorNode(anchor)
anchorNode.setParent(arFragment.arSceneView.scene)
// Create transformable node for model
val modelNode = TransformableNode(arFragment.transformationSystem)
modelNode.setParent(anchorNode)
modelNode.renderable = modelRenderable
modelNode.select()
}
}
}
Performance Optimization Tips
ARKit Performance Tips
CPU Optimization
- Use preferredFramesPerSecond to limit unnecessary processing
- Only enable necessary detection features
- Reduce polygon count for 3D models
- Use LOD (Level of Detail) for complex models
Memory Management
- Ensure proper resource disposal
- Load assets asynchronously
- Use compressed textures
- Implement proper view lifecycle management
Battery Considerations
- Pause the AR session when the app goes into the background (see the sketch after this list)
- Reduce rendering quality when battery is low
- Enable sceneUnderstanding only when needed
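As a sketch of the pausing pattern, the ARKit counterpart of the ARCore onPause() lifecycle code shown earlier might look like this (assuming a UIKit view controller with an arView outlet):

// Pause the AR session when the view disappears to save battery
override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    arView.session.pause()
}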
ARKit-Specific Tips
// Optimize plane detection performance
configuration.planeDetection = [.horizontal] // Only detect horizontal planes if vertical isn't needed

// Use automatic environment texturing selectively
configuration.environmentTexturing = .manual // Control when environment textures are updated

// Optimize image tracking
configuration.maximumNumberOfTrackedImages = 1 // Only track as many images as needed
ARCore Performance Tips
CPU Optimization
- Use appropriate Config.UpdateMode (LATEST_CAMERA_IMAGE vs. BLOCKING)
- Reduce GL draw calls
- Implement frame throttling when appropriate
- Use simplified collision meshes
Memory Management
- Manage texture resources carefully
- Implement proper Activity lifecycle handling
- Use texture compression
- Release resources when not in view
Battery Considerations
- Pause session in onPause()
- Implement battery-aware quality settings
- Use focused hit testing instead of continuous raycasting
ARCore-Specific Tips
// Optimize plane detection performance
config.planeFindingMode = Config.PlaneFindingMode.HORIZONTAL // Only detect horizontal planes if vertical isn't needed

// Optimize light estimation
config.lightEstimationMode = Config.LightEstimationMode.DISABLED // Disable if not needed

// Use an appropriate camera focus mode
config.setFocusMode(Config.FocusMode.AUTO)
Common Challenges & Solutions
Challenge | ARKit Solution | ARCore Solution |
---|---|---|
Poor Tracking | • Ensure good lighting<br>• Move device slowly<br>• Use ARCoachingOverlayView | • Check TrackingFailureReason<br>• Implement motion tracking guide<br>• Display tracking state to user |
Object Placement Inaccuracy | • Use ARRaycastQuery with target alignment<br>• Enable LiDAR for accurate placement | • Filter hit test results<br>• Implement placement indicators<br>• Average multiple raycasts |
Unstable Anchors | • Use ARWorldMap for persistence<br>• Implement anchor refinement | • Use Cloud Anchors<br>• Implement anchor refinement logic |
Lighting Inconsistency | • Use probe-based lighting<br>• Apply custom shaders for light adaptation | • Implement custom shaders<br>• Use LightEstimate with intensity compensation |
Poor Performance | • Reduce polygon count<br>• Use RealityKit’s automatic LOD<br>• Implement occlusion culling | • Use instanced rendering<br>• Reduce draw calls<br>• Implement frustum culling |
Battery Drain | • Pause AR session when not in focus<br>• Reduce frame rate<br>• Optimize hit testing | • Implement session lifecycle management<br>• Use Config.UpdateMode.BLOCKING<br>• Throttle frame processing |
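The ARCoachingOverlayView mentioned above takes only a few lines to wire up. A minimal Swift sketch (assumes an arView property):

// Guide users through tracking initialization with ARKit's built-in overlay
let coachingOverlay = ARCoachingOverlayView()
coachingOverlay.session = arView.session
coachingOverlay.goal = .horizontalPlane        // Coach toward finding a horizontal plane
coachingOverlay.activatesAutomatically = true  // Appears whenever tracking degrades
coachingOverlay.frame = arView.bounds
coachingOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
arView.addSubview(coachingOverlay)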
Cross-Platform Development Strategies
Unity AR Development
// Unity example with AR Foundation (works with both ARKit and ARCore)
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
public class ARPlacement : MonoBehaviour
{
[SerializeField] private GameObject placementPrefab;
[SerializeField] private ARRaycastManager raycastManager;
[SerializeField] private ARPlaneManager planeManager;
private static List<ARRaycastHit> hits = new List<ARRaycastHit>();
void Awake()
{
raycastManager = GetComponent<ARRaycastManager>();
planeManager = GetComponent<ARPlaneManager>();
}
void Update()
{
if (Input.touchCount > 0)
{
Touch touch = Input.GetTouch(0);
if (touch.phase == TouchPhase.Began)
{
if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
{
Pose hitPose = hits[0].pose;
Instantiate(placementPrefab, hitPose.position, hitPose.rotation);
}
}
}
}
}
Platform-Specific Optimization
Strategy | Implementation |
---|---|
Feature Detection | Check at runtime which AR features are available on the device |
Asset Adaptation | Use different LOD models based on device performance |
Configuration Tuning | Adjust session configuration based on platform capabilities |
Fallback Mechanisms | Implement degraded experiences for devices with limited capabilities |
UI Adaptation | Adapt UI for different device form factors and input methods |
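The feature-detection strategy from the table can be expressed as a single configuration builder. A minimal Swift sketch of the pattern on the ARKit side (ARCore apps would branch on ArCoreApk availability and Config options instead):

// Build the best configuration the current device supports
func makeBestConfiguration() -> ARConfiguration {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh // LiDAR devices get full meshing
    }
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentation) {
        config.frameSemantics.insert(.personSegmentation) // Newer chips get occlusion
    }
    config.planeDetection = [.horizontal] // Baseline feature on all supported devices
    return config
}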
Framework Comparison for Cross-Platform Development
Framework | Pros | Cons | Best For |
---|---|---|---|
Unity AR Foundation | • Single codebase<br>• Good feature parity<br>• Popular ecosystem | • Performance overhead<br>• Sometimes lags behind native updates | Most cross-platform AR apps |
Unreal Engine | • High visual quality<br>• Blueprint visual scripting<br>• Good performance | • Steeper learning curve<br>• Larger app size | Graphically intensive AR experiences |
React Native + AR | • JavaScript development<br>• Web developer friendly<br>• Faster iteration | • Performance limitations<br>• Limited AR feature access | Simple AR visualization apps |
Flutter + AR | • Dart language<br>• Fast development<br>• Good UI toolkit | • Emerging AR support<br>• Less mature AR plugins | UI-focused AR applications |
Resources for Further Learning
Official Documentation
- ARKit Documentation: https://developer.apple.com/documentation/arkit
- ARCore Documentation: https://developers.google.com/ar
Learning Resources
ARKit Learning
ARCore Learning
Community Resources
Forums & Communities
Twitter Accounts to Follow
- ARKit Team: @ARKitTeam
- Google AR: @GoogleARVR
- AR/VR Developers: @ARVRDevelopers
- Unity AR: @Unity3dAR