By the end of this lesson, you will be able to:
ℹ️ **Definition:** Science apps with sensor integration use mobile device capabilities to conduct real-world experiments, collect data, and explore scientific concepts. Apps like PhET Simulations, iNaturalist, and Star Walk transform phones into scientific instruments.
Mobile sensors have revolutionized science education:
| Sensor | Scientific Applications | Educational Value |
|---|---|---|
| Accelerometer | Motion analysis, gravity experiments | Physics, mechanics |
| Gyroscope | Rotation, orientation studies | Angular momentum, spin |
| Magnetometer | Magnetic field detection | Electromagnetism, navigation |
| Barometer | Pressure measurements | Weather, altitude studies |
| Camera | Microscopy, spectroscopy | Biology, chemistry |
| Microphone | Sound analysis, frequency | Acoustics, wave physics |
| GPS | Location tracking | Earth science, geography |
💡 **Impact Story:** Students using sensor-based science apps show 40% better understanding of abstract concepts compared to traditional methods!
```bash
# Core sensor library (the Accelerometer, Gyroscope, Magnetometer, and
# Barometer APIs all live inside expo-sensors — no separate packages exist)
npx expo install expo-sensors

# Camera and audio
npx expo install expo-camera
npx expo install expo-av

# Location services
npx expo install expo-location

# Data visualization
npm install react-native-chart-kit
npm install react-native-svg
npm install d3-shape

# AR capabilities
npx expo install expo-gl
```
```typescript
// utils/sensorPermissions.ts
import * as Location from 'expo-location';
import { Camera } from 'expo-camera';
import { Audio } from 'expo-av';

export const requestAllPermissions = async () => {
  try {
    // Location permission
    const { status: locationStatus } =
      await Location.requestForegroundPermissionsAsync();

    // Camera permission
    const { status: cameraStatus } = await Camera.requestCameraPermissionsAsync();

    // Audio (microphone) permission
    const { status: audioStatus } = await Audio.requestPermissionsAsync();

    return {
      location: locationStatus === 'granted',
      camera: cameraStatus === 'granted',
      audio: audioStatus === 'granted',
    };
  } catch (error) {
    console.error('Error requesting permissions:', error);
    return { location: false, camera: false, audio: false };
  }
};
```
```typescript
// components/PhysicsLab.tsx
import React, { useState, useEffect, useRef } from 'react';
import {
  View,
  Text,
  TouchableOpacity,
  StyleSheet,
  Dimensions,
  ScrollView,
} from 'react-native';
import { Accelerometer, Gyroscope, Magnetometer } from 'expo-sensors';
import { LineChart } from 'react-native-chart-kit';
import Svg, { Line, Circle, Text as SvgText } from 'react-native-svg';

interface SensorData {
  timestamp: number;
  x: number;
  y: number;
  z: number;
}

interface Analysis {
  avgMagnitude: number;
  maxMagnitude: number;
  minMagnitude: number;
  range: number;
  dataPoints: number;
  frequency: number;
  insights?: string[];
}

interface Experiment {
  id: string;
  name: string;
  description: string;
  sensor: 'accelerometer' | 'gyroscope' | 'magnetometer';
  duration: number; // seconds
  expectedResults: string;
}

const experiments: Experiment[] = [
  {
    id: 'gravity',
    name: 'Gravity Measurement',
    // Note: a dropped device is in free fall and reads ~0 g, so we measure
    // gravity by holding the device still instead
    description: 'Measure gravitational acceleration by holding the device still',
    sensor: 'accelerometer',
    duration: 10,
    expectedResults: 'A stationary device should read about 1 g (≈ 9.8 m/s²)',
  },
  {
    id: 'pendulum',
    name: 'Pendulum Motion',
    description: 'Analyze pendulum motion by swinging the device',
    sensor: 'accelerometer',
    duration: 30,
    expectedResults: 'Sinusoidal pattern with decreasing amplitude',
  },
  {
    id: 'rotation',
    name: 'Angular Velocity',
    description: 'Study rotational motion by spinning the device',
    sensor: 'gyroscope',
    duration: 20,
    expectedResults: 'Angular velocity peaks during rotation',
  },
  {
    id: 'magnetic',
    name: 'Magnetic Field Mapping',
    description: 'Map magnetic field strength around different objects',
    sensor: 'magnetometer',
    duration: 60,
    expectedResults: 'Higher values near magnetic materials',
  },
];

export const PhysicsLab: React.FC = () => {
  const [currentExperiment, setCurrentExperiment] = useState<Experiment | null>(null);
  const [isRecording, setIsRecording] = useState(false);
  const [sensorData, setSensorData] = useState<SensorData[]>([]);
  const [analysis, setAnalysis] = useState<Analysis | null>(null);
  const [timeRemaining, setTimeRemaining] = useState(0);

  // Refs mirror the latest samples and experiment so that callbacks fired
  // from timers and sensor listeners never read stale state from a closure
  const dataRef = useRef<SensorData[]>([]);
  const experimentRef = useRef<Experiment | null>(null);
  const subscriptionRef = useRef<{ remove: () => void } | null>(null);
  const timerRef = useRef<NodeJS.Timeout | null>(null);

  useEffect(() => {
    // Clean up listeners and timers on unmount
    return () => stopRecording();
  }, []);

  const startExperiment = (experiment: Experiment) => {
    setCurrentExperiment(experiment);
    experimentRef.current = experiment;
    setSensorData([]);
    dataRef.current = [];
    setAnalysis(null);
    setTimeRemaining(experiment.duration);
    setIsRecording(true);

    // All three sensor modules share the same API, so look one up from a map
    const sensors = {
      accelerometer: Accelerometer,
      gyroscope: Gyroscope,
      magnetometer: Magnetometer,
    } as const;
    const sensor = sensors[experiment.sensor];

    sensor.setUpdateInterval(100); // 100 ms → 10 Hz
    subscriptionRef.current = sensor.addListener((data) => {
      const newDataPoint: SensorData = {
        timestamp: Date.now(),
        x: data.x,
        y: data.y,
        z: data.z,
      };
      dataRef.current = [...dataRef.current, newDataPoint];
      setSensorData(dataRef.current);
    });

    // Countdown timer that stops the experiment when time runs out
    timerRef.current = setInterval(() => {
      setTimeRemaining(prev => {
        if (prev <= 1) {
          stopRecording();
          return 0;
        }
        return prev - 1;
      });
    }, 1000);
  };

  const stopRecording = () => {
    subscriptionRef.current?.remove();
    subscriptionRef.current = null;
    if (timerRef.current) {
      clearInterval(timerRef.current);
      timerRef.current = null;
    }
    setIsRecording(false);
    setTimeRemaining(0);
    if (dataRef.current.length > 0) {
      analyzeData();
    }
  };

  const analyzeData = () => {
    const data = dataRef.current;
    if (data.length === 0) return;

    const magnitudes = data.map(d => Math.sqrt(d.x * d.x + d.y * d.y + d.z * d.z));
    const durationSec =
      (data[data.length - 1].timestamp - data[0].timestamp) / 1000;

    const result: Analysis = {
      avgMagnitude: magnitudes.reduce((sum, mag) => sum + mag, 0) / magnitudes.length,
      maxMagnitude: Math.max(...magnitudes),
      minMagnitude: Math.min(...magnitudes),
      range: Math.max(...magnitudes) - Math.min(...magnitudes),
      dataPoints: data.length,
      // Effective sampling frequency; guard against a zero-length recording
      frequency: durationSec > 0 ? data.length / durationSec : 0,
    };

    // Attach insights based on the experiment type
    const experiment = experimentRef.current;
    setAnalysis(
      experiment ? { ...result, insights: generateInsights(experiment, result) } : result,
    );
  };

  const generateInsights = (experiment: Experiment, result: Analysis): string[] => {
    const insights: string[] = [];

    switch (experiment.id) {
      case 'gravity':
        // expo-sensors reports acceleration in units of g, so a stationary
        // device should read a magnitude near 1 (≈ 9.8 m/s²)
        if (Math.abs(result.avgMagnitude - 1) < 0.1) {
          insights.push('✅ Measured acceleration is close to the expected 1 g (≈ 9.8 m/s²)');
        } else {
          insights.push('ℹ️ Measured value differs from standard gravity - check that the device is at rest');
        }
        break;
      case 'pendulum':
        if (result.range > 0.5) { // range is in g
          insights.push('✅ Clear pendulum motion detected with good amplitude');
        } else {
          insights.push('ℹ️ Motion amplitude is low - try larger swings');
        }
        break;
      case 'rotation':
        if (result.maxMagnitude > 2) { // rad/s
          insights.push('✅ Significant rotation detected');
        } else {
          insights.push('ℹ️ Low rotation detected - try faster spinning');
        }
        break;
      case 'magnetic':
        insights.push(
          `📊 Magnetic field strength varies from ${result.minMagnitude.toFixed(2)} to ${result.maxMagnitude.toFixed(2)} µT`,
        );
        break;
    }

    return insights;
  };

  const renderDataChart = () => {
    if (sensorData.length < 2) return null;

    // Downsample to roughly six labelled points so the chart stays readable
    const step = Math.ceil(sensorData.length / 6);
    const sampled = sensorData.filter((_, i) => i % step === 0);

    const chartData = {
      labels: sampled.map((_, i) => `${(i * step) / 10}s`), // ~10 Hz sampling
      datasets: [
        {
          data: sampled.map(d => Math.sqrt(d.x * d.x + d.y * d.y + d.z * d.z)),
          color: (opacity = 1) => `rgba(74, 144, 226, ${opacity})`,
          strokeWidth: 2,
        },
      ],
    };

    return (
      <LineChart
        data={chartData}
        width={Dimensions.get('window').width - 40}
        height={200}
        chartConfig={{
          backgroundColor: '#ffffff',
          backgroundGradientFrom: '#ffffff',
          backgroundGradientTo: '#f8f9fa',
          decimalPlaces: 2,
          color: (opacity = 1) => `rgba(74, 144, 226, ${opacity})`,
          labelColor: (opacity = 1) => `rgba(44, 62, 80, ${opacity})`,
          style: { borderRadius: 16 },
        }}
        style={styles.chart}
      />
    );
  };

  const render3DVisualization = () => {
    if (sensorData.length === 0) return null;

    const recentData = sensorData.slice(-50); // show the last 50 points
    const maxVal = Math.max(
      ...recentData.map(d => Math.max(Math.abs(d.x), Math.abs(d.y), Math.abs(d.z))),
    );
    const scale = maxVal > 0 ? 80 / maxVal : 1; // avoid division by zero

    return (
      <Svg width="200" height="200" viewBox="0 0 200 200">
        {/* Pseudo-3D axes */}
        <Line x1="100" y1="100" x2="180" y2="60" stroke="#FF6B6B" strokeWidth="2" />
        <Line x1="100" y1="100" x2="20" y2="60" stroke="#4ECDC4" strokeWidth="2" />
        <Line x1="100" y1="100" x2="100" y2="20" stroke="#45B7D1" strokeWidth="2" />

        {/* Axis labels */}
        <SvgText x="185" y="65" fontSize="12" fill="#FF6B6B">X</SvgText>
        <SvgText x="15" y="65" fontSize="12" fill="#4ECDC4">Y</SvgText>
        <SvgText x="105" y="15" fontSize="12" fill="#45B7D1">Z</SvgText>

        {/* Data points, fading from oldest (transparent) to newest (opaque) */}
        {recentData.map((point, index) => {
          const opacity = index / recentData.length;
          return (
            <Circle
              key={index}
              cx={100 + point.x * scale}
              cy={100 + point.y * scale}
              r="2"
              fill={`rgba(74, 144, 226, ${opacity})`}
            />
          );
        })}
      </Svg>
    );
  };

  return (
    <ScrollView style={styles.container}>
      <Text style={styles.title}>🔬 Physics Laboratory</Text>

      {!currentExperiment ? (
        // Experiment selection
        <View>
          <Text style={styles.sectionTitle}>Choose an Experiment</Text>
          {experiments.map((experiment) => (
            <TouchableOpacity
              key={experiment.id}
              style={styles.experimentCard}
              onPress={() => startExperiment(experiment)}
            >
              <Text style={styles.experimentName}>{experiment.name}</Text>
              <Text style={styles.experimentDescription}>{experiment.description}</Text>
              <Text style={styles.experimentDetails}>
                Sensor: {experiment.sensor} • Duration: {experiment.duration}s
              </Text>
              <Text style={styles.expectedResults}>Expected: {experiment.expectedResults}</Text>
            </TouchableOpacity>
          ))}
        </View>
      ) : (
        // Active experiment
        <View>
          <View style={styles.experimentHeader}>
            <Text style={styles.experimentTitle}>{currentExperiment.name}</Text>
            <Text style={styles.experimentSensor}>Using {currentExperiment.sensor}</Text>
          </View>

          {isRecording && (
            <View style={styles.recordingStatus}>
              <Text style={styles.timerText}>Time Remaining: {timeRemaining}s</Text>
              <Text style={styles.dataCount}>Data Points: {sensorData.length}</Text>
              <TouchableOpacity style={styles.stopButton} onPress={stopRecording}>
                <Text style={styles.stopButtonText}>Stop Recording</Text>
              </TouchableOpacity>
            </View>
          )}

          {/* Real-time visualization */}
          {sensorData.length > 0 && (
            <View style={styles.visualizationContainer}>
              <Text style={styles.sectionTitle}>Real-time Data</Text>
              <View style={styles.sensorValues}>
                <Text style={styles.sensorValue}>
                  X: {sensorData[sensorData.length - 1]?.x.toFixed(3)}
                </Text>
                <Text style={styles.sensorValue}>
                  Y: {sensorData[sensorData.length - 1]?.y.toFixed(3)}
                </Text>
                <Text style={styles.sensorValue}>
                  Z: {sensorData[sensorData.length - 1]?.z.toFixed(3)}
                </Text>
              </View>

              <View style={styles.chartContainer}>
                {render3DVisualization()}
              </View>
            </View>
          )}

          {/* Data analysis */}
          {analysis && (
            <View style={styles.analysisContainer}>
              <Text style={styles.sectionTitle}>📊 Analysis Results</Text>

              {renderDataChart()}

              <View style={styles.statisticsContainer}>
                <View style={styles.statItem}>
                  <Text style={styles.statValue}>{analysis.avgMagnitude.toFixed(3)}</Text>
                  <Text style={styles.statLabel}>Average</Text>
                </View>
                <View style={styles.statItem}>
                  <Text style={styles.statValue}>{analysis.maxMagnitude.toFixed(3)}</Text>
                  <Text style={styles.statLabel}>Maximum</Text>
                </View>
                <View style={styles.statItem}>
                  <Text style={styles.statValue}>{analysis.range.toFixed(3)}</Text>
                  <Text style={styles.statLabel}>Range</Text>
                </View>
                <View style={styles.statItem}>
                  <Text style={styles.statValue}>{analysis.frequency.toFixed(1)} Hz</Text>
                  <Text style={styles.statLabel}>Frequency</Text>
                </View>
              </View>

              {analysis.insights && (
                <View style={styles.insightsContainer}>
                  <Text style={styles.insightsTitle}>🧠 Insights</Text>
                  {analysis.insights.map((insight: string, index: number) => (
                    <Text key={index} style={styles.insightText}>{insight}</Text>
                  ))}
                </View>
              )}
            </View>
          )}

          <TouchableOpacity
            style={styles.newExperimentButton}
            onPress={() => {
              setCurrentExperiment(null);
              experimentRef.current = null;
              setSensorData([]);
              dataRef.current = [];
              setAnalysis(null);
            }}
          >
            <Text style={styles.newExperimentText}>🧪 New Experiment</Text>
          </TouchableOpacity>
        </View>
      )}
    </ScrollView>
  );
};

const styles = StyleSheet.create({
  container: { flex: 1, backgroundColor: '#F8F9FA', padding: 20 },
  title: { fontSize: 28, fontWeight: 'bold', color: '#2C3E50', textAlign: 'center', marginBottom: 30, marginTop: 20 },
  sectionTitle: { fontSize: 20, fontWeight: 'bold', color: '#2C3E50', marginBottom: 15 },
  experimentCard: { backgroundColor: 'white', borderRadius: 15, padding: 20, marginBottom: 15, shadowColor: '#000', shadowOffset: { width: 0, height: 2 }, shadowOpacity: 0.1, shadowRadius: 4, elevation: 3 },
  experimentName: { fontSize: 18, fontWeight: 'bold', color: '#2C3E50', marginBottom: 8 },
  experimentDescription: { fontSize: 14, color: '#495057', marginBottom: 8 },
  experimentDetails: { fontSize: 12, color: '#6C757D', marginBottom: 8 },
  expectedResults: { fontSize: 12, color: '#28A745', fontStyle: 'italic' },
  experimentHeader: { backgroundColor: 'white', borderRadius: 15, padding: 20, marginBottom: 20, alignItems: 'center' },
  experimentTitle: { fontSize: 24, fontWeight: 'bold', color: '#2C3E50', marginBottom: 8 },
  experimentSensor: { fontSize: 16, color: '#6C757D', textTransform: 'capitalize' },
  recordingStatus: { backgroundColor: '#FF6B6B', borderRadius: 15, padding: 20, marginBottom: 20, alignItems: 'center' },
  timerText: { fontSize: 20, fontWeight: 'bold', color: 'white', marginBottom: 8 },
  dataCount: { fontSize: 14, color: 'white', marginBottom: 15 },
  stopButton: { backgroundColor: 'white', paddingHorizontal: 20, paddingVertical: 10, borderRadius: 20 },
  stopButtonText: { color: '#FF6B6B', fontWeight: 'bold' },
  visualizationContainer: { backgroundColor: 'white', borderRadius: 15, padding: 20, marginBottom: 20 },
  sensorValues: { flexDirection: 'row', justifyContent: 'space-around', marginBottom: 20 },
  sensorValue: { fontSize: 16, fontWeight: 'bold', color: '#2C3E50' },
  chartContainer: { alignItems: 'center' },
  analysisContainer: { backgroundColor: 'white', borderRadius: 15, padding: 20, marginBottom: 20 },
  chart: { borderRadius: 16, marginVertical: 8 },
  statisticsContainer: { flexDirection: 'row', justifyContent: 'space-around', marginVertical: 20 },
  statItem: { alignItems: 'center' },
  statValue: { fontSize: 18, fontWeight: 'bold', color: '#4A90E2' },
  statLabel: { fontSize: 12, color: '#6C757D', marginTop: 4 },
  insightsContainer: { backgroundColor: '#E8F4FD', borderRadius: 12, padding: 15, marginTop: 15 },
  insightsTitle: { fontSize: 16, fontWeight: 'bold', color: '#2C3E50', marginBottom: 10 },
  insightText: { fontSize: 14, color: '#495057', marginBottom: 5 },
  newExperimentButton: { backgroundColor: '#28A745', borderRadius: 25, padding: 15, alignItems: 'center', marginBottom: 30 },
  newExperimentText: { color: 'white', fontSize: 18, fontWeight: 'bold' },
});

export default PhysicsLab;
```

## 📱 Building a Biology Field Guide

### AI-Powered Species Identification

```typescript
// components/BiologyFieldGuide.tsx
import React, { useState, useRef } from 'react';
import {
  View,
  Text,
  TouchableOpacity,
  StyleSheet,
  Image,
  Alert,
  ScrollView,
} from 'react-native';
import { Camera, CameraType } from 'expo-camera';
import * as Location from 'expo-location';

interface Species {
  id: string;
  commonName: string;
  scientificName: string;
  description: string;
  habitat: string;
  characteristics: string[];
  conservationStatus: string;
  imageUrl: string;
}

interface Observation {
  id: string;
  species: Species;
  location: {
    latitude: number;
    longitude: number;
    address?: string;
  };
  timestamp: Date;
  photoUri: string;
  notes: string;
  confidence: number;
}

export const BiologyFieldGuide: React.FC = () => {
  const [hasPermission, setHasPermission] = useState<boolean | null>(null);
  const [showCamera, setShowCamera] = useState(false);
  const [capturedPhoto, setCapturedPhoto] = useState<string | null>(null);
  const [identifiedSpecies, setIdentifiedSpecies] = useState<Species | null>(null);
  const [observations, setObservations] = useState<Observation[]>([]);
  const [isIdentifying, setIsIdentifying] = useState(false);
  const cameraRef = useRef<Camera>(null);

  const requestPermissions = async () => {
    const { status: cameraStatus } = await Camera.requestCameraPermissionsAsync();
    const { status: locationStatus } =
      await Location.requestForegroundPermissionsAsync();

    setHasPermission(cameraStatus === 'granted' && locationStatus === 'granted');
  };

  const takePicture = async () => {
    if (!cameraRef.current) return;
    try {
      const photo = await cameraRef.current.takePictureAsync({
        quality: 0.8,
        base64: true, // base64 data could be sent to a vision API
      });

      setCapturedPhoto(photo.uri);
      setShowCamera(false);
      identifySpecies(photo.uri);
    } catch (error) {
      Alert.alert('Error', 'Failed to take picture');
    }
  };

  const identifySpecies = async (imageUri: string) => {
    setIsIdentifying(true);

    try {
      // In a real app, you'd use computer vision APIs like:
      // - Google Vision AI
      // - iNaturalist API
      // - Custom trained models
      // - PlantNet API for plants

      // Mock identification for demo purposes
      await new Promise(resolve => setTimeout(resolve, 3000));

      const mockSpecies: Species = {
        id: '1',
        commonName: 'Red Oak',
        scientificName: 'Quercus rubra',
        description:
          'A large deciduous tree native to North America, known for its distinctive lobed leaves that turn red in autumn.',
        habitat: 'Deciduous and mixed forests, parks, urban areas',
        characteristics: [
          'Lobed leaves with pointed tips',
          'Reddish-brown bark with deep furrows',
          'Acorns mature in two years',
          'Can grow up to 35 meters tall',
        ],
        conservationStatus: 'Least Concern',
        imageUrl: 'https://example.com/red-oak.jpg',
      };

      setIdentifiedSpecies(mockSpecies);

      // Record where the observation was made
      const location = await Location.getCurrentPositionAsync({});
      const address = await Location.reverseGeocodeAsync({
        latitude: location.coords.latitude,
        longitude: location.coords.longitude,
      });

      // Create observation record
      const observation: Observation = {
        id: Date.now().toString(),
        species: mockSpecies,
        location: {
          latitude: location.coords.latitude,
          longitude: location.coords.longitude,
          address: address[0]
            ? `${address[0].city}, ${address[0].region}`
            : 'Unknown location',
        },
        timestamp: new Date(),
        photoUri: imageUri,
        notes: '',
        confidence: 0.85, // mock 85% confidence
      };

      setObservations(prev => [observation, ...prev]);
    } catch (error) {
      Alert.alert('Error', 'Failed to identify species');
    } finally {
      setIsIdentifying(false);
    }
  };

  const renderSpeciesInfo = (species: Species) => (
    <ScrollView style={styles.speciesContainer}>
      <Text style={styles.commonName}>{species.commonName}</Text>
      <Text style={styles.scientificName}>{species.scientificName}</Text>

      <View style={styles.statusContainer}>
        <Text style={styles.statusLabel}>Conservation Status:</Text>
        <Text
          style={[
            styles.statusValue,
            { color: species.conservationStatus === 'Least Concern' ? '#28A745' : '#DC3545' },
          ]}
        >
          {species.conservationStatus}
        </Text>
      </View>

      <View style={styles.section}>
        <Text style={styles.sectionTitle}>Description</Text>
        <Text style={styles.sectionText}>{species.description}</Text>
      </View>

      <View style={styles.section}>
        <Text style={styles.sectionTitle}>Habitat</Text>
        <Text style={styles.sectionText}>{species.habitat}</Text>
      </View>

      <View style={styles.section}>
        <Text style={styles.sectionTitle}>Key Characteristics</Text>
        {species.characteristics.map((char, index) => (
          <Text key={index} style={styles.characteristicItem}>• {char}</Text>
        ))}
      </View>

      <TouchableOpacity
        style={styles.saveButton}
        onPress={() => {
          Alert.alert(
            'Observation Saved!',
            'Your species identification has been recorded.',
            [{ text: 'OK' }],
          );
          setCapturedPhoto(null);
          setIdentifiedSpecies(null);
        }}
      >
        <Text style={styles.saveButtonText}>💾 Save Observation</Text>
      </TouchableOpacity>
    </ScrollView>
  );

  const renderObservationsList = () => (
    <ScrollView style={styles.observationsContainer}>
      <Text style={styles.sectionTitle}>📋 Your Observations ({observations.length})</Text>

      {observations.map((obs) => (
        <View key={obs.id} style={styles.observationCard}>
          <Image source={{ uri: obs.photoUri }} style={styles.observationImage} />

          <View style={styles.observationInfo}>
            <Text style={styles.observationSpecies}>{obs.species.commonName}</Text>
            <Text style={styles.observationScientific}>{obs.species.scientificName}</Text>
            <Text style={styles.observationLocation}>📍 {obs.location.address}</Text>
            <Text style={styles.observationDate}>
              📅 {obs.timestamp.toLocaleDateString()} {obs.timestamp.toLocaleTimeString()}
            </Text>
            <Text style={styles.observationConfidence}>
              🎯 {Math.round(obs.confidence * 100)}% confidence
            </Text>
          </View>
        </View>
      ))}

      {observations.length === 0 && (
        <Text style={styles.emptyText}>
          No observations yet. Start by taking a photo of a plant or animal!
        </Text>
      )}
    </ScrollView>
  );

  if (hasPermission === null) {
    return (
      <View style={styles.permissionContainer}>
        <Text style={styles.title}>🌿 Biology Field Guide</Text>
        <Text style={styles.permissionText}>
          This app needs camera and location permissions to identify species and record observations.
        </Text>
        <TouchableOpacity style={styles.permissionButton} onPress={requestPermissions}>
          <Text style={styles.permissionButtonText}>Grant Permissions</Text>
        </TouchableOpacity>
      </View>
    );
  }

  if (hasPermission === false) {
    return (
      <View style={styles.permissionContainer}>
        <Text style={styles.permissionText}>Camera and location access denied</Text>
      </View>
    );
  }

  if (showCamera) {
    return (
      <View style={styles.container}>
        <Camera ref={cameraRef} style={styles.camera} type={CameraType.back}>
          <View style={styles.cameraOverlay}>
            <TouchableOpacity style={styles.captureButton} onPress={takePicture}>
              <View style={styles.captureButtonInner} />
            </TouchableOpacity>

            <TouchableOpacity
              style={styles.cancelButton}
              onPress={() => setShowCamera(false)}
            >
              <Text style={styles.cancelButtonText}>Cancel</Text>
            </TouchableOpacity>
          </View>
        </Camera>
      </View>
    );
  }

  return (
    <View style={styles.container}>
      <Text style={styles.title}>🌿 Biology Field Guide</Text>

      {isIdentifying && (
        <View style={styles.identifyingContainer}>
          <Text style={styles.identifyingText}>🔍 Identifying species...</Text>
          <Text style={styles.identifyingSubtext}>Using AI to analyze your photo</Text>
        </View>
      )}

      {capturedPhoto && identifiedSpecies && !isIdentifying && (
        <View style={styles.resultContainer}>
          <Image source={{ uri: capturedPhoto }} style={styles.capturedImage} />
          {renderSpeciesInfo(identifiedSpecies)}
        </View>
      )}

      {!capturedPhoto && !isIdentifying && (
        <View>
          <TouchableOpacity style={styles.cameraButton} onPress={() => setShowCamera(true)}>
            <Text style={styles.cameraButtonText}>📸 Identify Species</Text>
          </TouchableOpacity>

          {renderObservationsList()}
        </View>
      )}
    </View>
  );
};

const styles = StyleSheet.create({
  container: { flex: 1, backgroundColor: '#F8F9FA' },
  title: { fontSize: 28, fontWeight: 'bold', color: '#2C3E50', textAlign: 'center', marginTop: 50, marginBottom: 30 },
  permissionContainer: { flex: 1, justifyContent: 'center', alignItems: 'center', padding: 20 },
  permissionText: { fontSize: 18, color: '#495057', textAlign: 'center', marginBottom: 30 },
  permissionButton: { backgroundColor: '#28A745', paddingHorizontal: 30, paddingVertical: 15, borderRadius: 25 },
  permissionButtonText: { color: 'white', fontSize: 16, fontWeight: 'bold' },
  camera: { flex: 1 },
  cameraOverlay: { flex: 1, justifyContent: 'flex-end', alignItems: 'center', paddingBottom: 50 },
  captureButton: { width: 80, height: 80, borderRadius: 40, backgroundColor: 'white', justifyContent: 'center', alignItems: 'center', marginBottom: 20 },
  captureButtonInner: { width: 70, height: 70, borderRadius: 35, backgroundColor: 'white', borderWidth: 3, borderColor: '#2C3E50' },
  cancelButton: { backgroundColor: 'rgba(0, 0, 0, 0.5)', paddingHorizontal: 20, paddingVertical: 10, borderRadius: 20 },
  cancelButtonText: { color: 'white', fontSize: 16 },
  cameraButton: { backgroundColor: '#28A745', margin: 20, padding: 20, borderRadius: 15, alignItems: 'center' },
  cameraButtonText: { color: 'white', fontSize: 20, fontWeight: 'bold' },
  identifyingContainer: { backgroundColor: '#007BFF', margin: 20, padding: 30, borderRadius: 15, alignItems: 'center' },
  identifyingText: { color: 'white', fontSize: 20, fontWeight: 'bold', marginBottom: 10 },
  identifyingSubtext: { color: 'rgba(255, 255, 255, 0.8)', fontSize: 14 },
  resultContainer: { flex: 1 },
  capturedImage: { width: '100%', height: 200, resizeMode: 'cover' },
  speciesContainer: { flex: 1, padding: 20 },
  commonName: { fontSize: 24, fontWeight: 'bold', color: '#2C3E50', marginBottom: 5 },
  scientificName: { fontSize: 18, fontStyle: 'italic', color: '#6C757D', marginBottom: 15 },
  statusContainer: { flexDirection: 'row', alignItems: 'center', marginBottom: 20 },
  statusLabel: { fontSize: 14, color: '#495057', marginRight: 10 },
  statusValue: { fontSize: 14, fontWeight: 'bold' },
  section: { marginBottom: 20 },
  sectionTitle: { fontSize: 18, fontWeight: 'bold', color: '#2C3E50', marginBottom: 10 },
  sectionText: { fontSize: 14, color: '#495057', lineHeight: 20 },
  characteristicItem: { fontSize: 14, color: '#495057', marginBottom: 5 },
  saveButton: { backgroundColor: '#007BFF', padding: 15, borderRadius: 10, alignItems: 'center', marginTop: 20 },
  saveButtonText: { color: 'white', fontSize: 16, fontWeight: 'bold' },
  observationsContainer: { flex: 1, padding: 20 },
  observationCard: { backgroundColor: 'white', borderRadius: 15, padding: 15, marginBottom: 15, flexDirection: 'row', shadowColor: '#000', shadowOffset: { width: 0, height: 2 }, shadowOpacity: 0.1, shadowRadius: 4, elevation: 3 },
  observationImage: { width: 80, height: 80, borderRadius: 10, marginRight: 15 },
  observationInfo: { flex: 1 },
  observationSpecies: { fontSize: 16, fontWeight: 'bold', color: '#2C3E50', marginBottom: 2 },
  observationScientific: { fontSize: 14, fontStyle: 'italic', color: '#6C757D', marginBottom: 5 },
  observationLocation: { fontSize: 12, color: '#495057', marginBottom: 2 },
  observationDate: { fontSize: 12, color: '#495057', marginBottom: 2 },
  observationConfidence: { fontSize: 12, color: '#28A745', fontWeight: 'bold' },
  emptyText: { fontSize: 16, color: '#6C757D', textAlign: 'center', marginTop: 50 },
});

export default BiologyFieldGuide;
```

## 📝 Summary

In this lesson, you learned:
- How to access and use device sensors for scientific data collection
- How to build interactive physics experiments with real-time data visualization
- How to create AI-powered species identification and biology field guides
- How to implement data analysis and insight generation for science education
- The potential of mobile sensors in science education
- How to develop engaging scientific apps that promote hands-on learning

## 🤖 Practice with AI

Code with AI: Try building these science exploration features.

**Prompts to try:**
- *"Create a chemistry lab app that uses the camera to identify chemical reactions by color changes"*
- *"Build an astronomy app that uses device sensors to track celestial objects and provides star charts"*
- *"Design a weather station app that collects environmental data and predicts local weather patterns"*
- *"Implement a geology field guide that identifies rocks and minerals using image recognition"*
- *"Create an ecology monitoring app that tracks biodiversity and environmental changes over time"*

Science education apps have incredible potential to inspire the next generation of scientists and researchers!