Practice and reinforce the concepts from Lesson 6
Set up an AR-enabled camera that detects horizontal and vertical surfaces, provides visual feedback for plane detection, and prepares the foundation for placing virtual objects in augmented reality environments.
An AR camera application that visualizes detected surfaces with color-coded overlays and provides real-time feedback about the AR session state (visualizing tracked feature points is left as an extension exercise). This forms the foundation for all AR experiences.

Install required packages for AR functionality:
cd M3-Activity-06
npx expo install expo-gl
npm install expo-three three@0.160.0 @react-three/fiber
💡 Tip: AR requires a development build. Run npx expo run:android to build and test on your device. Expo Go doesn't support AR features.
Build a utility to manage AR session lifecycle:
// utils/ARSession.js
export class ARSession {
  constructor() {
    this.isRunning = false;
    this.planes = new Map();
    this.anchors = [];
  }

  async initialize(gl) {
    try {
      // Rough capability check; a production app would query ARCore/ARKit
      // availability directly rather than relying on a GL extension
      if (!gl.getExtension('WEBGL_depth_texture')) {
        throw new Error('Device does not support AR');
      }
      this.isRunning = true;
      console.log('AR Session initialized');
      return true;
    } catch (error) {
      console.error('AR initialization failed:', error);
      return false;
    }
  }

  pause() {
    this.isRunning = false;
  }

  resume() {
    this.isRunning = true;
  }

  dispose() {
    this.isRunning = false;
    this.planes.clear();
    this.anchors = [];
  }

  updatePlanes(detectedPlanes) {
    // Update tracked planes, keyed by id so re-detections replace stale data
    detectedPlanes.forEach(plane => {
      this.planes.set(plane.id, {
        id: plane.id,
        type: plane.type,
        polygon: plane.polygon,
        center: plane.center,
        extent: plane.extent,
      });
    });
  }

  getPlanes() {
    return Array.from(this.planes.values());
  }

  addAnchor(position, rotation) {
    const anchor = {
      id: Date.now().toString(),
      position,
      rotation,
      timestamp: Date.now(),
    };
    this.anchors.push(anchor);
    return anchor;
  }
}
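A quick sketch of how the session's bookkeeping behaves. The class body is condensed here so the snippet runs on its own, and the plane objects are hypothetical sample data, not real tracker output:

```javascript
// Condensed ARSession bookkeeping (plane map + anchors) for illustration.
class ARSession {
  constructor() {
    this.planes = new Map();
    this.anchors = [];
  }
  updatePlanes(detectedPlanes) {
    // Keyed by id, so re-reporting a plane updates it rather than duplicating it
    detectedPlanes.forEach(p => this.planes.set(p.id, p));
  }
  getPlanes() {
    return Array.from(this.planes.values());
  }
  addAnchor(position, rotation) {
    const anchor = { id: Date.now().toString(), position, rotation, timestamp: Date.now() };
    this.anchors.push(anchor);
    return anchor;
  }
}

const session = new ARSession();
session.updatePlanes([{ id: 'a', type: 'horizontal' }]);
session.updatePlanes([{ id: 'a', type: 'horizontal' }, { id: 'b', type: 'vertical' }]);
console.log(session.getPlanes().length); // 2 — plane 'a' was updated, not duplicated

const anchor = session.addAnchor({ x: 0, y: 0, z: -1 }, { x: 0, y: 0, z: 0 });
console.log(session.anchors.length); // 1
```

Because planes live in a Map, the tracker can report the same surface every frame without the session accumulating duplicates.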
Build the main AR camera view with Three.js:
// components/ARCamera.js
import React, { useRef, useState } from 'react';
import { View, StyleSheet, Text } from 'react-native';
import { GLView } from 'expo-gl';
import { Renderer } from 'expo-three';
import * as THREE from 'three';
import { ARSession } from '../utils/ARSession';

const ARCamera = ({ onPlanesDetected, onSessionReady }) => {
  const [arState, setArState] = useState('Initializing...');
  const arSessionRef = useRef(null);
  const rendererRef = useRef(null);
  const sceneRef = useRef(null);
  const cameraRef = useRef(null);

  const onContextCreate = async (gl) => {
    // Setup Three.js scene
    const scene = new THREE.Scene();
    sceneRef.current = scene;

    // Setup camera (near plane at 1 cm so close-up AR content isn't clipped)
    const camera = new THREE.PerspectiveCamera(
      75,
      gl.drawingBufferWidth / gl.drawingBufferHeight,
      0.01,
      1000
    );
    camera.position.z = 0;
    cameraRef.current = camera;

    // Setup renderer
    const renderer = new Renderer({ gl });
    renderer.setSize(gl.drawingBufferWidth, gl.drawingBufferHeight);
    rendererRef.current = renderer;

    // Add lighting
    const ambientLight = new THREE.AmbientLight(0xffffff, 0.6);
    scene.add(ambientLight);
    const directionalLight = new THREE.DirectionalLight(0xffffff, 0.8);
    directionalLight.position.set(1, 1, 1);
    scene.add(directionalLight);

    // Initialize AR session
    const arSession = new ARSession();
    arSessionRef.current = arSession;
    const initialized = await arSession.initialize(gl);
    if (initialized) {
      setArState('Looking for surfaces...');
      onSessionReady?.(arSession);
    } else {
      setArState('AR not supported');
      return;
    }

    // Add grid helper to visualize coordinate system
    const gridHelper = new THREE.GridHelper(10, 10, 0x888888, 0x444444);
    gridHelper.position.y = -1;
    scene.add(gridHelper);

    // Animation loop
    const render = () => {
      requestAnimationFrame(render);
      if (arSession.isRunning) {
        // In a real implementation, you'd get AR frame data here and update
        // the camera position/rotation based on device movement. For now,
        // report whatever planes the session is tracking.
        onPlanesDetected?.(arSession.getPlanes());
      }
      renderer.render(scene, camera);
      gl.endFrameEXP();
    };
    render();
  };

  return (
    <View style={styles.container}>
      <GLView
        style={styles.glView}
        onContextCreate={onContextCreate}
      />
      <View style={styles.statusBar}>
        <Text style={styles.statusText}>{arState}</Text>
      </View>
    </View>
  );
};

const styles = StyleSheet.create({
  container: {
    flex: 1,
  },
  glView: {
    flex: 1,
  },
  statusBar: {
    position: 'absolute',
    top: 50,
    left: 20,
    right: 20,
    backgroundColor: 'rgba(0, 0, 0, 0.7)',
    padding: 15,
    borderRadius: 10,
  },
  statusText: {
    color: 'white',
    fontSize: 16,
    textAlign: 'center',
  },
});

export default ARCamera;
Create visual representations of detected planes:
// components/PlaneVisualization.js
import * as THREE from 'three';

export class PlaneVisualization {
  constructor(scene) {
    this.scene = scene;
    this.planeMeshes = new Map();
  }

  updatePlanes(planes) {
    // Remove planes that are no longer detected
    const currentPlaneIds = new Set(planes.map(p => p.id));
    this.planeMeshes.forEach((mesh, id) => {
      if (!currentPlaneIds.has(id)) {
        this.scene.remove(mesh);
        this.planeMeshes.delete(id);
      }
    });

    // Add or update plane visualizations
    planes.forEach(plane => {
      if (this.planeMeshes.has(plane.id)) {
        this.updatePlaneMesh(plane);
      } else {
        this.createPlaneMesh(plane);
      }
    });
  }

  createPlaneMesh(plane) {
    // Use a unit plane and size it via mesh.scale, so updatePlaneMesh can
    // resize it later without rebuilding the geometry
    const geometry = new THREE.PlaneGeometry(1, 1);

    // Choose color based on plane type: teal for horizontal, red for vertical
    const color = plane.type === 'horizontal' ? 0x4ECDC4 : 0xFF6B6B;
    const material = new THREE.MeshBasicMaterial({
      color: color,
      opacity: 0.3,
      transparent: true,
      side: THREE.DoubleSide,
    });
    const mesh = new THREE.Mesh(geometry, material);

    // Position and size the mesh to match the detected plane
    mesh.position.set(plane.center.x, plane.center.y, plane.center.z);
    mesh.scale.set(plane.extent.width || 1, plane.extent.height || 1, 1);

    // Rotate to match plane orientation (a real AR framework supplies a full
    // pose; here only horizontal planes are laid flat)
    if (plane.type === 'horizontal') {
      mesh.rotation.x = -Math.PI / 2;
    }

    // Add wireframe outline (linewidth is ignored by most WebGL
    // implementations, so lines render 1px regardless)
    const edges = new THREE.EdgesGeometry(geometry);
    const line = new THREE.LineSegments(
      edges,
      new THREE.LineBasicMaterial({ color: color })
    );
    mesh.add(line);

    this.scene.add(mesh);
    this.planeMeshes.set(plane.id, mesh);
  }

  updatePlaneMesh(plane) {
    const mesh = this.planeMeshes.get(plane.id);
    if (!mesh) return;

    // Update position
    mesh.position.set(plane.center.x, plane.center.y, plane.center.z);

    // Update scale if dimensions changed
    mesh.scale.set(plane.extent.width || 1, plane.extent.height || 1, 1);
  }

  clear() {
    this.planeMeshes.forEach(mesh => {
      this.scene.remove(mesh);
    });
    this.planeMeshes.clear();
  }
}
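The add/update/remove reconciliation in updatePlanes is the heart of this class, and it can be exercised on its own with a mock scene and placeholder "meshes" (no THREE required; the object shapes here are illustrative only):

```javascript
// The plane-reconciliation pattern from PlaneVisualization, isolated.
const scene = {
  children: new Set(),
  add(obj) { this.children.add(obj); },
  remove(obj) { this.children.delete(obj); },
};
const planeMeshes = new Map();

function reconcile(planes) {
  const currentIds = new Set(planes.map(p => p.id));

  // Remove visualizations for planes the tracker no longer reports
  planeMeshes.forEach((mesh, id) => {
    if (!currentIds.has(id)) {
      scene.remove(mesh);
      planeMeshes.delete(id);
    }
  });

  // Create placeholder "meshes" for newly detected planes
  planes.forEach(plane => {
    if (!planeMeshes.has(plane.id)) {
      const mesh = { planeId: plane.id };
      scene.add(mesh);
      planeMeshes.set(plane.id, mesh);
    }
  });
}

reconcile([{ id: 'floor' }, { id: 'wall' }]);
reconcile([{ id: 'floor' }]); // 'wall' lost tracking and is cleaned up
console.log(planeMeshes.size, scene.children.size); // 1 1
```

Keeping the Map and the scene in lockstep this way means the visualization never leaks meshes when surfaces drop out of tracking.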
Add helpful instructions for users:
// components/ARInstructions.js
import React, { useState, useEffect } from 'react';
import { View, Text, StyleSheet, Animated } from 'react-native';

const ARInstructions = ({ planesDetected, onComplete }) => {
  const [fadeAnim] = useState(new Animated.Value(1));
  const [step, setStep] = useState(0);

  const instructions = [
    {
      icon: '📱',
      text: 'Point your camera at the floor or a table',
      subtitle: 'Move slowly to help detect surfaces',
    },
    {
      icon: '✅',
      text: 'Surface detected!',
      subtitle: 'Tap on a surface to place objects',
    },
  ];

  useEffect(() => {
    if (planesDetected && step === 0) {
      setStep(1);
      // Fade out after showing success message
      setTimeout(() => {
        Animated.timing(fadeAnim, {
          toValue: 0,
          duration: 1000,
          useNativeDriver: true,
        }).start(() => {
          onComplete?.();
        });
      }, 2000);
    }
  }, [planesDetected]);

  const currentInstruction = instructions[step];

  return (
    <Animated.View style={[styles.container, { opacity: fadeAnim }]}>
      <Text style={styles.icon}>{currentInstruction.icon}</Text>
      <Text style={styles.text}>{currentInstruction.text}</Text>
      <Text style={styles.subtitle}>{currentInstruction.subtitle}</Text>
      {step === 0 && (
        <View style={styles.loadingDots}>
          <View style={styles.dot} />
          <View style={styles.dot} />
          <View style={styles.dot} />
        </View>
      )}
    </Animated.View>
  );
};

const styles = StyleSheet.create({
  container: {
    position: 'absolute',
    bottom: 100,
    left: 20,
    right: 20,
    backgroundColor: 'rgba(0, 0, 0, 0.8)',
    borderRadius: 15,
    padding: 20,
    alignItems: 'center',
  },
  icon: {
    fontSize: 48,
    marginBottom: 10,
  },
  text: {
    color: 'white',
    fontSize: 18,
    fontWeight: 'bold',
    textAlign: 'center',
    marginBottom: 8,
  },
  subtitle: {
    color: '#cccccc',
    fontSize: 14,
    textAlign: 'center',
  },
  loadingDots: {
    flexDirection: 'row',
    marginTop: 15,
    gap: 8,
  },
  dot: {
    width: 8,
    height: 8,
    borderRadius: 4,
    backgroundColor: '#4ECDC4',
  },
});

export default ARInstructions;
Put it all together in the main app:
// App.js
import React, { useState } from 'react';
import { StyleSheet, View, TouchableOpacity, Text } from 'react-native';
import ARCamera from './components/ARCamera';
import ARInstructions from './components/ARInstructions';

export default function App() {
  const [planesDetected, setPlanesDetected] = useState(false);
  const [showInstructions, setShowInstructions] = useState(true);
  const [arSession, setArSession] = useState(null);

  const handlePlanesDetected = (planes) => {
    if (planes.length > 0 && !planesDetected) {
      setPlanesDetected(true);
    }
  };

  const handleSessionReady = (session) => {
    setArSession(session);
  };

  return (
    <View style={styles.container}>
      <ARCamera
        onPlanesDetected={handlePlanesDetected}
        onSessionReady={handleSessionReady}
      />
      {showInstructions && (
        <ARInstructions
          planesDetected={planesDetected}
          onComplete={() => setShowInstructions(false)}
        />
      )}
      <View style={styles.controls}>
        <TouchableOpacity
          style={styles.button}
          onPress={() => setShowInstructions(true)}
        >
          <Text style={styles.buttonText}>ℹ️</Text>
        </TouchableOpacity>
        <View style={styles.statusBadge}>
          <Text style={styles.statusDot}>
            {planesDetected ? '🟢' : '🔴'}
          </Text>
          <Text style={styles.statusLabel}>
            {planesDetected ? 'Ready' : 'Scanning'}
          </Text>
        </View>
      </View>
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: 'black',
  },
  controls: {
    position: 'absolute',
    bottom: 30,
    left: 20,
    right: 20,
    flexDirection: 'row',
    justifyContent: 'space-between',
    alignItems: 'center',
  },
  button: {
    backgroundColor: 'rgba(255, 255, 255, 0.9)',
    width: 60,
    height: 60,
    borderRadius: 30,
    justifyContent: 'center',
    alignItems: 'center',
    shadowColor: '#000',
    shadowOffset: { width: 0, height: 2 },
    shadowOpacity: 0.3,
    shadowRadius: 3,
    elevation: 5,
  },
  buttonText: {
    fontSize: 28,
  },
  statusBadge: {
    flexDirection: 'row',
    alignItems: 'center',
    backgroundColor: 'rgba(0, 0, 0, 0.7)',
    paddingHorizontal: 15,
    paddingVertical: 10,
    borderRadius: 20,
  },
  statusDot: {
    fontSize: 12,
    marginRight: 8,
  },
  statusLabel: {
    color: 'white',
    fontSize: 16,
    fontWeight: '600',
  },
});
Run the app on your device:
npx expo run:android

Problem: "AR not supported" error
Solution: Verify your device is on the ARCore supported devices list. Ensure you're testing on a physical device, not an emulator. Check that Google Play Services for AR is installed and updated.

Problem: Surfaces not detecting
Solution: Ensure good lighting conditions. Move the camera slowly in a sweeping motion. Point at textured surfaces (plain white walls are hard to track). Try pointing at the floor.

Problem: App crashes on launch
Solution: Make sure you're using a development build, not Expo Go. Verify all dependencies are installed correctly. Check that the expo-gl version matches your Expo SDK version.

Problem: Black screen instead of camera view
Solution: Check that camera permissions are granted. Verify the GLView is properly sized with flex: 1. Ensure the renderer is initialized before the render loop starts.

Problem: Three.js objects not appearing
Solution: Check that objects are within the camera's view frustum. Verify lighting is added to the scene. Make sure renderer.render() is being called in the animation loop.
For advanced students:
Feature Point Visualization: Display the feature points ARCore is tracking as small dots in the AR view
Plane Measurement: Show dimensions (width x height) on detected planes
Session Recording: Save AR session data (detected planes, camera poses) and replay them later for debugging
Multi-Plane Types: Detect and distinguish between horizontal, vertical, and angled planes with different visualizations
Performance Metrics: Display FPS, number of planes tracked, and memory usage overlay
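For the performance-metrics idea, the FPS figure can be derived from frame timestamps, independent of any AR API. A minimal sketch (the function name and window size are illustrative choices):

```javascript
// Rolling FPS estimate over the last N frame timestamps (milliseconds).
function createFpsCounter(windowSize = 30) {
  const times = [];
  return function tick(nowMs) {
    times.push(nowMs);
    if (times.length > windowSize) times.shift();
    if (times.length < 2) return 0;
    // (frames rendered) / (elapsed time), converted from ms to seconds
    const elapsedMs = times[times.length - 1] - times[0];
    return ((times.length - 1) / elapsedMs) * 1000;
  };
}

// Simulate frames arriving every 16 ms (roughly 60 fps)
const tick = createFpsCounter();
let fps = 0;
for (let t = 0; t <= 160; t += 16) fps = tick(t);
console.log(Math.round(fps)); // 63
```

In the app, you would call tick(Date.now()) inside the animation loop and render the result into an overlay like the existing status bar.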
In this activity, you:
Installed AR-related packages and set up a development build
Built an ARSession utility that manages session lifecycle, tracked planes, and anchors
Created an AR camera view with expo-gl and Three.js, complete with lighting and a grid helper
Visualized detected planes with color-coded, semi-transparent overlays
Added animated on-screen instructions that guide users through surface detection
Assembled the app with a status badge that shows scanning progress
In the next lesson, you'll learn how to place 3D objects on detected surfaces. You'll work with 3D models, handle user tap interactions, and create anchors to keep objects stable in the AR world.