Self, segmented.
- SAM2
- ONNX
- WASM
- Web Worker
- Pre-encoded
About me · by demonstration
Self-taught engineer from Caracas. I think in pixels, vectors and stack traces. Below, four small things that actually run — instead of a paragraph telling you they could.
Where it started.
Born in Caracas in '96. First touched a computer in a smoke-filled cyber café between brownouts, and never let go.
import { useState, useEffect, useRef } from 'react';
import { StyleSheet, Text, View, TouchableOpacity } from 'react-native';
import { Pedometer } from 'expo-sensors';

export default function App() {
  const [steps, setSteps] = useState(0);    // live count since open (minus resets)
  const [today, setToday] = useState(null); // today's total from the OS history API
  const [msg, setMsg] = useState('');
  const raw = useRef(0);  // cumulative count straight from the sensor
  const base = useRef(0); // offset applied when "Reset session" is tapped

  useEffect(() => {
    let sub;
    (async () => {
      // Bail out early on simulators / devices without a step counter.
      const ok = await Pedometer.isAvailableAsync();
      if (!ok) { setMsg('Pedometer unavailable'); return; }
      const p = await Pedometer.requestPermissionsAsync();
      if (!p.granted) { setMsg('Permission denied'); return; }

      // Today's total, midnight to now (CMPedometer / Health history).
      const end = new Date(), start = new Date();
      start.setHours(0, 0, 0, 0);
      try {
        const r = await Pedometer.getStepCountAsync(start, end);
        setToday(r.steps);
      } catch {
        // History queries can fail on some devices; live counting still works.
      }

      // watchStepCount reports a cumulative total since subscription, so
      // reset by subtracting an offset rather than zeroing the sensor.
      sub = Pedometer.watchStepCount(r => {
        raw.current = r.steps;
        setSteps(r.steps - base.current);
      });
    })();
    return () => sub && sub.remove();
  }, []);

  const reset = () => {
    base.current = raw.current;
    setSteps(0);
  };

  return (
    <View style={s.c}>
      <Text style={s.t}>Step Tracker</Text>
      <View style={s.card}>
        <Text style={s.l}>Since opened</Text>
        <Text style={s.v}>{steps}</Text>
      </View>
      {today !== null && (
        <View style={s.card}>
          <Text style={s.l}>Today total</Text>
          <Text style={s.v}>{today}</Text>
        </View>
      )}
      <TouchableOpacity style={s.b} onPress={reset}>
        <Text style={s.bt}>Reset session</Text>
      </TouchableOpacity>
      {msg ? <Text style={s.e}>{msg}</Text> : null}
    </View>
  );
}

const s = StyleSheet.create({
  c: { flex: 1, backgroundColor: '#0a0a0a', padding: 24 },
  t: { color: '#fff', fontSize: 28, fontWeight: '700' },
  card: { backgroundColor: '#171717', borderRadius: 16, padding: 20 },
  l: { color: '#888', fontSize: 13, textTransform: 'uppercase' },
  v: { color: '#fff', fontSize: 56, fontWeight: '800' },
  b: { backgroundColor: '#fff', borderRadius: 999, padding: 14 },
  bt: { color: '#000', fontWeight: '700' },
  e: { color: '#f87171' },
});
Expo Go
Open Expo Go on iOS / Android, tap Scan QR Code.
expo · react native · pedometer · tap card behind to swap
Uses the expo-sensors Pedometer for real-time step events and pulls today's total from the Health / CMPedometer history APIs. Open the QR code with Expo Go on a phone — the count ticks up as you walk.
End of demos
More of the same kind of work — shaders, AI, mobile, the small details — lives across the rest of the site. Keep scrolling.
Get weekly articles in your inbox on how to grow your skills and enhance teamwork.
11 min read
A React-Native-first take on the classic R3F ring configurator: GLB loading on device, four metal materials, gesture-driven rotation, Zustand state, and ARKit / ARCore preview — all behind one Expo build.
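The gesture-driven rotation mentioned above comes down to one pure function: turn a pan gesture's horizontal delta into a yaw angle. A minimal sketch — the `rotationFromPan` name and the `sensitivity` / `damp` knobs are illustrative, not the article's actual API:

```javascript
// Map a horizontal pan delta (px) to a yaw angle (radians).
// `sensitivity` converts pixels to radians; `damp` can soften the gesture.
function rotationFromPan(startYaw, dxPixels, { sensitivity = 0.01, damp = 1 } = {}) {
  const yaw = startYaw + dxPixels * sensitivity * damp;
  // Wrap into [0, 2π) so the stored angle stays bounded across many gestures.
  const TWO_PI = Math.PI * 2;
  return ((yaw % TWO_PI) + TWO_PI) % TWO_PI;
}
```

Feeding the running yaw back in as `startYaw` on each gesture event keeps the ring's state a single number, which is exactly the kind of thing a Zustand store holds comfortably.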
23 min read
Segmentation is the substrate for nearly every AI photo workflow worth shipping in 2026 — inpainting, object swaps, controlled generation. Here is how to make it feel instant on the web by splitting SAM2 in two: the heavy image encoder runs in a notebook on the user's hardware, and the lightweight mask decoder runs in their browser.
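On the browser half, the decoder's job starts with packing the user's clicks into prompt tensors. SAM-style ONNX decoders take point coordinates plus labels (1 = foreground, 0 = background); this hypothetical helper sketches that packing, assuming coordinates must be rescaled from CSS pixels to the encoder's input resolution (1024 here is an assumption, not the article's exact figure):

```javascript
// Pack UI clicks into SAM-style decoder inputs: a flat [x0, y0, x1, y1, ...]
// Float32Array of coordinates scaled to the encoder's resolution, plus a
// parallel labels array (1 = include this region, 0 = exclude it).
function packPointPrompts(clicks, displaySize, encoderSize = 1024) {
  const scaleX = encoderSize / displaySize.width;
  const scaleY = encoderSize / displaySize.height;
  const coords = new Float32Array(clicks.length * 2);
  const labels = new Float32Array(clicks.length);
  clicks.forEach((c, i) => {
    coords[i * 2] = c.x * scaleX;
    coords[i * 2 + 1] = c.y * scaleY;
    labels[i] = c.positive ? 1 : 0;
  });
  return { coords, labels };
}
```

Typed arrays like these can then be handed to the ONNX runtime as tensors, and transferred to a worker without copying.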
4 min read
A working recipe for shipping image segmentation in a tab — Web Workers, WASM, pre-encoded embeddings, and the small things that decide whether the demo is fast or felt-fast.
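One of those small things is keeping the worker round-trip tidy: when overlapping mask requests are in flight, each response has to resolve the right promise. A minimal sketch of id-based request/response correlation, written against an injected `post` function so it can stand in for `worker.postMessage` — the names are illustrative, not the demo's actual code:

```javascript
// Correlate requests with responses by id. Wire `post` to
// worker.postMessage and call `handle` from the worker's onmessage.
function createRpc(post) {
  let nextId = 0;
  const pending = new Map();
  return {
    call(payload) {
      const id = nextId++;
      return new Promise((resolve, reject) => {
        pending.set(id, { resolve, reject });
        post({ id, payload });
      });
    },
    handle(msg) {
      const p = pending.get(msg.id);
      if (!p) return; // stale or unknown id: ignore
      pending.delete(msg.id);
      msg.error ? p.reject(new Error(msg.error)) : p.resolve(msg.result);
    },
  };
}
```

The worker side just echoes the `id` back with the decoded mask; dropping responses whose id is older than the latest request is a cheap way to keep fast mouse movement from rendering stale masks.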
1 min read
A short tour of the new MDX-powered writing setup, complete with syntax-highlighted code blocks rendered by Shiki at build time.