The Ultimate ARCore Basics Cheat Sheet: Build AR Apps for Android

Introduction

ARCore is Google’s platform for building augmented reality experiences on Android devices. Using the device’s camera and motion sensors, it provides three key capabilities: motion tracking, environmental understanding, and light estimation. Together these let a mobile device determine its position relative to the world around it, recognize surfaces and objects, and overlay digital content on the real environment, so developers can create immersive AR applications that blend virtual content seamlessly with the real world.

Core ARCore Concepts & Components

1. Motion Tracking

Description: Allows the device to understand and track its position relative to the world.

Key Components:

  • Visual Inertial Odometry (VIO): Combines camera images with IMU data (accelerometer and gyroscope)
  • Feature points: Visually distinct features that ARCore identifies and tracks
  • Pose: Represents device position and orientation in 3D space (combination of rotation and translation)

Implementation:

// Get the pose of the camera relative to the world
Pose cameraPose = frame.getCamera().getPose();

// Extract position from pose
float[] translation = new float[3];
cameraPose.getTranslation(translation, 0);
// translation[0] = x, translation[1] = y, translation[2] = z

// Extract rotation from pose
float[] quaternion = new float[4];
cameraPose.getRotationQuaternion(quaternion, 0);
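
For rendering, you usually need the camera’s view and projection matrices rather than the raw pose. A short sketch using the Camera matrix getters (the near/far plane values here are arbitrary):

// Get matrices for rendering virtual content from the camera's viewpoint
float[] viewMatrix = new float[16];
float[] projectionMatrix = new float[16];
frame.getCamera().getViewMatrix(viewMatrix, 0);
frame.getCamera().getProjectionMatrix(projectionMatrix, 0, 0.1f, 100.0f);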

2. Environmental Understanding

Description: Enables detection of horizontal and vertical surfaces, and estimation of lighting.

Key Components:

  • Plane detection: Identifies flat surfaces (horizontal and vertical)
  • Hit testing: Determines where a ray from the camera intersects with planes
  • Light estimation: Analyzes camera images to estimate lighting conditions

Implementation:

// Get all planes detected by ARCore
for (Plane plane : frame.getUpdatedTrackables(Plane.class)) {
    // Check plane type
    if (plane.getType() == Plane.Type.HORIZONTAL_UPWARD_FACING) {
        // Use the plane...
    }
}

// Perform hit test
List<HitResult> hitResults = frame.hitTest(screenX, screenY);
for (HitResult hit : hitResults) {
    Trackable trackable = hit.getTrackable();
    if (trackable instanceof Plane && ((Plane) trackable).isPoseInPolygon(hit.getHitPose())) {
        // Handle the hit on plane
        Pose hitPose = hit.getHitPose();
        // Place objects at hitPose
    }
}

// Estimate lighting (getPixelIntensity() requires
// Config.LightEstimationMode.AMBIENT_INTENSITY)
LightEstimate lightEstimate = frame.getLightEstimate();
float lightIntensity = lightEstimate.getPixelIntensity();

3. Anchors and Trackables

Description: References to real-world positions that ARCore tracks over time.

Key Components:

  • Anchor: Fixed location and orientation in the real world
  • Trackable: Object that ARCore can track (planes, points, images, etc.)
  • Augmented Images: 2D images that ARCore can recognize and track

Implementation:

// Create an anchor at a hit location
Anchor anchor = hit.createAnchor();

// Create an anchor directly from a pose
Anchor poseAnchor = session.createAnchor(pose);

// Attach a renderable to an anchor (using Sceneform)
AnchorNode anchorNode = new AnchorNode(anchor);
anchorNode.setParent(arSceneView.getScene());
TransformableNode model = new TransformableNode(arSceneView.getTransformationSystem());
model.setParent(anchorNode);
model.setRenderable(modelRenderable);

4. Augmented Images

Description: Enables detection and tracking of 2D images in the real world.

Implementation:

// Create an image database
AugmentedImageDatabase augmentedImageDatabase = new AugmentedImageDatabase(session);

// Add images to the database
Bitmap bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.my_image);
int index = augmentedImageDatabase.addImage("image_name", bitmap);

// Configure the session to use the database
Config config = session.getConfig();
config.setAugmentedImageDatabase(augmentedImageDatabase);
session.configure(config);

// Track augmented images
for (AugmentedImage augmentedImage : frame.getUpdatedTrackables(AugmentedImage.class)) {
    if (augmentedImage.getTrackingState() == TrackingState.TRACKING) {
        // Image is being tracked - use its pose
        Pose pose = augmentedImage.getCenterPose();
    }
}
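
Detection is faster, and pose scale is correct from the first frame, if you supply the printed image’s physical width. A sketch using the addImage overload that takes a width in meters (the 0.15 m value is hypothetical):

// Supplying the physical width (in meters) of the printed image speeds up
// detection and gives ARCore the correct scale immediately
float imageWidthMeters = 0.15f;
int widthAwareIndex = augmentedImageDatabase.addImage("image_name", bitmap, imageWidthMeters);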

ARCore Development Process

1. Project Setup

Steps:

  1. Check device compatibility with ArCoreApk.getInstance().checkAvailability() (see the sketch below)
  2. Add ARCore dependencies to your project
  3. Configure manifest with required permissions and features

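A minimal availability check, following the pattern from Google’s documentation; arButton is a hypothetical button that launches the AR experience:

// Check whether this device supports ARCore; re-query while the result is transient
void maybeEnableArButton() {
    ArCoreApk.Availability availability = ArCoreApk.getInstance().checkAvailability(this);
    if (availability.isTransient()) {
        // The check may query Google Play services; ask again shortly
        new Handler(Looper.getMainLooper()).postDelayed(this::maybeEnableArButton, 200);
    }
    if (availability.isSupported()) {
        arButton.setVisibility(View.VISIBLE);
    } else {
        // Unsupported device, or ARCore is unavailable
        arButton.setVisibility(View.INVISIBLE);
    }
}
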
Required Dependencies:

dependencies {
    // ARCore library
    implementation 'com.google.ar:core:1.35.0'
    
    // Optional: Sceneform for 3D rendering (deprecated but still useful)
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.17.1'
    implementation 'com.google.ar.sceneform:core:1.17.1'
}

Manifest Configuration:

<manifest>
    <!-- Camera permission -->
    <uses-permission android:name="android.permission.CAMERA" />
    
    <!-- Indicates AR requirement -->
    <uses-feature android:name="android.hardware.camera.ar" android:required="true" />
    
    <application>
        <!-- ARCore meta-data -->
        <meta-data android:name="com.google.ar.core" android:value="required" />
        
        <!-- Activities and other components -->
    </application>
</manifest>
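
The manifest above declares an AR Required app, which only installs on ARCore-supported devices and installs ARCore automatically. If AR is an optional feature of your app, use the optional value instead (a sketch; the rest of the manifest is unchanged):

<!-- AR Optional: installs on all devices; ARCore is not installed automatically -->
<meta-data android:name="com.google.ar.core" android:value="optional" />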

2. Session Setup

Steps:

  1. Initialize ARCore session
  2. Configure session features
  3. Handle camera permission

Implementation:

private Session session;
private boolean sessionConfigured = false;

private void setupSession() {
    if (session == null) {
        try {
            session = new Session(this);
            
            // Configure the session
            Config config = session.getConfig();
            config.setUpdateMode(Config.UpdateMode.LATEST_CAMERA_IMAGE);
            config.setPlaneFindingMode(Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL);
            config.setLightEstimationMode(Config.LightEstimationMode.ENVIRONMENTAL_HDR);
            session.configure(config);
            
            sessionConfigured = true;
        } catch (UnavailableException e) {
            // Handle exceptions based on type
            handleSessionException(e);
        }
    }
}

private void handleSessionException(UnavailableException e) {
    String message;
    if (e instanceof UnavailableArcoreNotInstalledException) {
        message = "Please install ARCore";
    } else if (e instanceof UnavailableApkTooOldException) {
        message = "Please update ARCore";
    } else if (e instanceof UnavailableSdkTooOldException) {
        message = "Please update this app";
    } else if (e instanceof UnavailableDeviceNotCompatibleException) {
        message = "This device does not support AR";
    } else {
        message = "AR is not available on this device";
    }
    Toast.makeText(this, message, Toast.LENGTH_LONG).show();
}
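
The code above covers steps 1 and 2 but not step 3. A minimal sketch of camera permission handling plus the ARCore install prompt in onResume(), assuming an Activity using the androidx ActivityCompat/ContextCompat helpers (REQUEST_CAMERA is an arbitrary request code):

private static final int REQUEST_CAMERA = 0;
private boolean installRequested = false;

@Override
protected void onResume() {
    super.onResume();
    // Step 3: request the camera permission if not yet granted
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(
                this, new String[] {Manifest.permission.CAMERA}, REQUEST_CAMERA);
        return;
    }
    try {
        // Prompt the user to install or update ARCore if necessary
        if (ArCoreApk.getInstance().requestInstall(this, !installRequested)
                == ArCoreApk.InstallStatus.INSTALL_REQUESTED) {
            installRequested = true;
            return; // onResume() runs again after the install flow completes
        }
        setupSession();
        session.resume();
    } catch (UnavailableException e) {
        handleSessionException(e);
    } catch (CameraNotAvailableException e) {
        session = null;
    }
}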

3. Frame Processing Loop

Description: Core loop for processing camera frames and updating AR content.

Implementation:

@Override
public void onDrawFrame(GL10 gl) {
    if (session == null) return;
    
    try {
        // Obtain the current frame from ARCore session
        session.setCameraTextureName(textureId);
        Frame frame = session.update();
        Camera camera = frame.getCamera();
        
        // Check tracking state
        if (camera.getTrackingState() == TrackingState.TRACKING) {
            // Process detected planes
            processPlanes(frame);
            
            // Process anchors
            processAnchors();
            
            // Process touch events and create objects
            processTouches();
            
            // Update rendering
            updateRendering(frame);
        }
    } catch (Exception e) {
        Log.e(TAG, "Exception on the OpenGL thread", e);
    }
}

private void processPlanes(Frame frame) {
    Collection<Plane> updatedPlanes = frame.getUpdatedTrackables(Plane.class);
    for (Plane plane : updatedPlanes) {
        if (plane.getTrackingState() == TrackingState.TRACKING) {
            // Handle newly detected or updated planes
        }
    }
}

Common ARCore Workflows

Placing Virtual Objects

Steps:

  1. Detect user tap on screen
  2. Perform hit test against detected planes
  3. Create anchor at hit location
  4. Attach 3D model to the anchor

Implementation:

private void handleTap(MotionEvent tap) {
    if (frame != null) {
        // Perform hit test
        List<HitResult> hitResults = frame.hitTest(tap);
        
        for (HitResult hit : hitResults) {
            Trackable trackable = hit.getTrackable();
            
            // Check if hit was on a plane
            if (trackable instanceof Plane && ((Plane) trackable).isPoseInPolygon(hit.getHitPose())) {
                // Create anchor
                Anchor anchor = hit.createAnchor();
                
                // Create anchor node (with Sceneform)
                AnchorNode anchorNode = new AnchorNode(anchor);
                anchorNode.setParent(arSceneView.getScene());
                
                // Create model node
                TransformableNode model = new TransformableNode(arSceneView.getTransformationSystem());
                model.setParent(anchorNode);
                model.setRenderable(modelRenderable);
                model.select();
                
                break; // Only place one object per tap
            }
        }
    }
}

Cloud Anchors (Shared AR)

Description: Allows multiple devices to share AR experiences by anchoring content to the same real-world locations.

Steps:

  1. Host an anchor to the cloud
  2. Receive cloud anchor ID
  3. Share ID with other devices
  4. Resolve the anchor on other devices

Implementation:

// Enable Cloud Anchors on the session before hosting or resolving
Config config = session.getConfig();
config.setCloudAnchorMode(Config.CloudAnchorMode.ENABLED);
session.configure(config);

// Host anchor: hostCloudAnchor() returns a new anchor immediately, but
// hosting completes asynchronously, so poll the anchor's state each frame
private Anchor cloudAnchor;

private void hostAnchor(Anchor anchor) {
    cloudAnchor = session.hostCloudAnchor(anchor);
}

// Call once per frame (e.g., after session.update()) until hosting finishes
private void checkCloudAnchorState() {
    if (cloudAnchor == null) return;
    Anchor.CloudAnchorState state = cloudAnchor.getCloudAnchorState();
    if (state == Anchor.CloudAnchorState.SUCCESS) {
        String cloudAnchorId = cloudAnchor.getCloudAnchorId();
        // Share cloudAnchorId with other users via your preferred method
        // (Firebase, custom server, etc.)
    } else if (state.isError()) {
        // Handle hosting/resolving errors
    }
}

// Resolve shared anchor: also asynchronous; poll its state as above,
// then attach content once it reaches SUCCESS
private void resolveAnchor(String cloudAnchorId) {
    cloudAnchor = session.resolveCloudAnchor(cloudAnchorId);
    // When resolved: AnchorNode anchorNode = new AnchorNode(cloudAnchor);
    //                anchorNode.setParent(arSceneView.getScene());
}

Augmented Faces

Description: Enables face tracking and augmentation with 3D assets.

Implementation:

// Configure session for face tracking (Augmented Faces requires a session on
// the front-facing camera; see the camera config sketch after the loop below)
Config config = session.getConfig();
config.setAugmentedFaceMode(Config.AugmentedFaceMode.MESH3D);
session.configure(config);

// Process faces in each frame
for (AugmentedFace face : frame.getUpdatedTrackables(AugmentedFace.class)) {
    if (face.getTrackingState() == TrackingState.TRACKING) {
        // Get face mesh
        FloatBuffer vertices = face.getMeshVertices();
        FloatBuffer normals = face.getMeshNormals();
        FloatBuffer textureCoords = face.getMeshTextureCoordinates();
        
        // Get face pose
        Pose facePose = face.getCenterPose();
        
        // Get specific face regions
        Pose nosePose = face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP);
        Pose foreheadPose = face.getRegionPose(AugmentedFace.RegionType.FOREHEAD_LEFT);
        
        // Update face mesh or add 3D accessories
        updateFaceMesh(face);
    }
}
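
Face tracking runs on the front-facing camera. A sketch of selecting a front-facing camera config, assuming a recent ARCore release where CameraConfigFilter.setFacingDirection is available (older code passed Session.Feature.FRONT_CAMERA when creating the session):

// Select a front-facing camera config before enabling face tracking
CameraConfigFilter filter = new CameraConfigFilter(session);
filter.setFacingDirection(CameraConfig.FacingDirection.FRONT);
List<CameraConfig> cameraConfigs = session.getSupportedCameraConfigs(filter);
if (!cameraConfigs.isEmpty()) {
    session.setCameraConfig(cameraConfigs.get(0));
}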

Comparison of AR Rendering Frameworks

Framework              | Compatibility  | Ease of Use | Features                           | Best For
Sceneform (deprecated) | ARCore 1.0+    | High        | Basic 3D rendering, physics        | Simple AR apps
Filament               | All Android    | Medium      | High-performance rendering         | Graphics-intensive apps
OpenGL ES              | All Android    | Low         | Full control, custom shaders       | Custom rendering solutions
Unity AR Foundation    | Cross-platform | High        | Complete ecosystem, visual editor  | Games, complex interactions
Unreal Engine          | Cross-platform | Medium      | High-fidelity graphics             | Photorealistic AR

Common Challenges and Solutions

Challenge: Tracking Loss

Solutions:

  • Ensure sufficient lighting
  • Encourage user to scan environment slowly
  • Detect textured surfaces
  • Implement graceful handling of tracking state changes, as in the snippet below:

if (camera.getTrackingState() == TrackingState.PAUSED) {
    // Show UI guidance based on tracking failure reason
    TrackingFailureReason reason = camera.getTrackingFailureReason();
    switch (reason) {
        case INSUFFICIENT_LIGHT:
            showMessage("Too dark! Move to a better lit area.");
            break;
        case EXCESSIVE_MOTION:
            showMessage("Moving too fast! Slow down.");
            break;
        case INSUFFICIENT_FEATURES:
            showMessage("Can't find visual features. Try a more textured surface.");
            break;
        default:
            showMessage("Tracking lost. Move device slowly.");
    }
}

Challenge: Performance Optimization

Solutions:

  • Reduce polygon count in 3D models
  • Use level-of-detail (LOD) techniques
  • Limit number of virtual objects
  • Optimize lighting and shadows
  • Use batching for similar objects

// Configure the session for performance
Config config = session.getConfig();
// Choose performance mode based on device capability
if (isHighEndDevice()) {
    config.setPlaneFindingMode(Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL);
    config.setLightEstimationMode(Config.LightEstimationMode.ENVIRONMENTAL_HDR);
} else {
    config.setPlaneFindingMode(Config.PlaneFindingMode.HORIZONTAL);
    config.setLightEstimationMode(Config.LightEstimationMode.AMBIENT_INTENSITY);
}
session.configure(config);

Challenge: Lighting and Shadows

Solutions:

  • Use ARCore light estimation
  • Apply environmental HDR where available
  • Implement shadow receivers on detected planes
  • Adjust material properties based on light intensity

// Get light estimate from current frame
LightEstimate lightEstimate = frame.getLightEstimate();

// For basic light intensity (Config.LightEstimationMode.AMBIENT_INTENSITY)
if (lightEstimate.getState() != LightEstimate.State.NOT_VALID) {
    float lightIntensity = lightEstimate.getPixelIntensity();
    // Apply to materials (0.5-2.0 is typical range)
    float scaledIntensity = Math.max(0.5f, Math.min(2.0f, lightIntensity));
    material.setFloat("light_intensity", scaledIntensity);
}

// For Environmental HDR (requires Config.LightEstimationMode.ENVIRONMENTAL_HDR)
if (lightEstimate.getState() == LightEstimate.State.VALID) {
    // Get main directional light
    float[] mainLightDirection = lightEstimate.getEnvironmentalHdrMainLightDirection();
    float[] mainLightIntensity = lightEstimate.getEnvironmentalHdrMainLightIntensity();
    // Get spherical harmonics coefficients for ambient lighting
    float[] sphericalHarmonics = lightEstimate.getEnvironmentalHdrAmbientSphericalHarmonics();

    // Apply to rendering engine
    updateEnvironmentalLighting(mainLightDirection, mainLightIntensity, sphericalHarmonics);
}

Best Practices

User Experience

  • Provide clear instructions for scanning environment
  • Show visual feedback for detected surfaces
  • Design intuitive gestures for interaction
  • Consider accessibility for diverse users
  • Implement graceful fallbacks for tracking issues

Technical Implementation

  • Use lifecycle-aware components
  • Implement proper session pause/resume (see the sketch below)
  • Release resources when not in use
  • Cache 3D models to reduce loading times
  • Use coroutines or RxJava for async operations
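
A minimal pause sketch matching the session setup shown earlier, assuming a GLSurfaceView field named surfaceView:

@Override
protected void onPause() {
    super.onPause();
    if (session != null) {
        // Pause the GL surface first so it stops querying the paused session
        surfaceView.onPause();
        session.pause();
    }
}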

AR Design Principles

  • Keep virtual objects within user’s field of view
  • Consider scale and perspective
  • Maintain proper occlusion with real-world objects (see the Depth API sketch below)
  • Design for varying lighting conditions
  • Use spatial audio for immersion
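
For the occlusion bullet above, one option is ARCore’s Depth API (a sketch, assuming a device that supports depth; the renderer must also consume the depth image):

// Enable automatic depth where supported so real-world geometry
// can occlude virtual content
Config config = session.getConfig();
if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
    config.setDepthMode(Config.DepthMode.AUTOMATIC);
}
session.configure(config);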

Resources for Further Learning

Official Documentation

  • ARCore developer documentation: https://developers.google.com/ar
  • ARCore Android SDK and samples on GitHub: https://github.com/google-ar/arcore-android-sdk

Community Resources

  • Google Codelabs for ARCore
  • ARCore Developer Discord
  • Stack Overflow [arcore] tag
  • Medium articles on ARCore development
  • YouTube tutorials by Google Developers

Tools

  • Scene Viewer for AR prototyping
  • Sceneform Asset Studio (deprecated but useful)
  • Poly 3D asset library (shut down; alternative asset sources now needed)
  • Blender for 3D modeling
  • Reality Converter for USDZ conversion

Remember that ARCore is continuously evolving with new features and capabilities. Stay updated with the latest releases and best practices to create compelling AR experiences.
