ML
- Reinstall react-native-fast-tflite + react-native-nitro-modules and
register the fast-tflite Expo plugin in app.json
- Wire model.ts to the real native module: dynamic require plus lazy,
cached loadTensorflowModel; softmax/argmax on the output tensor; build
Detection using the project's 0-100 confidence convention. Fall back to
mockDetection on any load or inference failure so the app never breaks.
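The softmax/argmax postprocessing above can be sketched as plain TypeScript. The `Detection` shape and the label list here are illustrative stand-ins, not the project's real types:

```typescript
// Sketch (assumed shapes): turn raw model logits into a Detection using
// softmax + argmax, scaling confidence to the project's 0-100 convention.
type Detection = { label: string; confidence: number }; // confidence: 0-100

const LABELS = ["classA", "classB", "classC"]; // hypothetical labels

function softmax(logits: number[]): number[] {
  const max = Math.max(...logits); // subtract max for numerical stability
  const exps = logits.map((x) => Math.exp(x - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

function toDetection(logits: number[]): Detection {
  const probs = softmax(logits);
  let best = 0;
  for (let i = 1; i < probs.length; i++) {
    if (probs[i] > probs[best]) best = i;
  }
  return { label: LABELS[best], confidence: Math.round(probs[best] * 100) };
}
```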
- Align preprocessing input size to 256x256 to match the Python
MobileNetV2 export.
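A minimal sketch of that preprocessing step, flattening RGBA pixel bytes into a normalized Float32Array input. The [-1, 1] normalization is an assumption (a common MobileNetV2 convention); it must match whatever the Python export used:

```typescript
// Sketch: convert RGBA bytes to a size*size*3 Float32Array model input,
// normalized to [-1, 1] (assumed convention; match the Python export).
const INPUT_SIZE = 256; // aligned with the MobileNetV2 export

function toModelInput(rgba: Uint8Array, size = INPUT_SIZE): Float32Array {
  const out = new Float32Array(size * size * 3);
  for (let i = 0; i < size * size; i++) {
    out[i * 3] = rgba[i * 4] / 127.5 - 1;         // R
    out[i * 3 + 1] = rgba[i * 4 + 1] / 127.5 - 1; // G
    out[i * 3 + 2] = rgba[i * 4 + 2] / 127.5 - 1; // B (alpha dropped)
  }
  return out;
}
```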
Scanner UX
- Preload the TFLite model on Scanner mount to avoid the ~1-2s decode hit
on first capture
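The preload can be sketched as a module-scoped cached promise: Scanner kicks off the load on mount, and the first capture awaits the same promise instead of paying the load cost. `loadModel` below is a hypothetical stand-in for the real loader:

```typescript
// Sketch of the preload pattern: cache the load promise at module scope so
// repeated calls (mount, first capture) share one in-flight load.
type Model = { run: (input: Float32Array) => number[] }; // illustrative shape

let modelPromise: Promise<Model> | null = null;

function preloadModel(loadModel: () => Promise<Model>): Promise<Model> {
  if (!modelPromise) modelPromise = loadModel(); // only the first call loads
  return modelPromise;
}
```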
- Add a flip-front/back camera control with a toast warning that the rear
camera gives better results
- Show a full-screen analyzing skeleton overlay while inference runs
- Memoize ConfidenceMeter color into a single computed value
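The single computed color can be sketched as a pure mapping from the 0-100 confidence to a meter color, suitable for wrapping in `useMemo`. The thresholds and hex values are illustrative, not the project's actual palette:

```typescript
// Sketch: one pure function computes the meter color from confidence (0-100).
// Thresholds and colors are hypothetical.
function meterColor(confidence: number): string {
  if (confidence >= 75) return "#22c55e"; // high: green
  if (confidence >= 40) return "#eab308"; // medium: yellow
  return "#ef4444"; // low: red
}
```

In the component this would be memoized as, e.g., `const color = useMemo(() => meterColor(confidence), [confidence]);`.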
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>