AR Validation Workflow
Executive Summary
WEFA now has repo-level validation support for the AR-first rollout:
- Android Chrome remains the only immersive-ar target.
- iPad/iPhone remain premium garden-tabletop targets with an optional live backdrop.
- QA builds can now capture session evidence directly from the app and export it from the Profile screen.
This document describes what the repo can prove automatically, what still needs physical hardware, and how QA should collect evidence for a release candidate.
QA Build Switch
Set VITE_QA_VALIDATION=true at build time to enable validation capture outside local dev mode.
With that flag enabled:
- diagnostics remain visible in packages/app/src/routes/Root.tsx
- the Profile screen exposes the validation report panel in packages/app/src/views/Profile.tsx
- session reports are captured through packages/app/src/modules/validation-session.tsx
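The gating logic can be thought of as a simple predicate over the Vite environment: capture is always on in local dev and otherwise requires the QA flag. This is an illustrative sketch, not the repo's actual implementation; the helper name and the exact precedence between `DEV` and the flag are assumptions.

```typescript
// Hypothetical helper; the real gating in the repo may differ.
// Vite exposes VITE_-prefixed variables as strings on import.meta.env.
type ViteEnv = { DEV?: boolean; VITE_QA_VALIDATION?: string };

export function isValidationCaptureEnabled(env: ViteEnv): boolean {
  // Local dev always captures; QA builds require the explicit flag.
  return Boolean(env.DEV) || env.VITE_QA_VALIDATION === "true";
}
```

In app code this would be called with `import.meta.env` as the argument.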
What Gets Captured
Validation reports currently capture:
- AR session entry requested, succeeded, and failed
- first useful render timing
- placement ready and placement confirmed timing
- re-anchor starts and cancels
- camera permission request, grant, deny, fail, or unavailable
- camera backdrop toggles
- model load timing
- orientation changes
- app hide/resume interruptions
- offline and reconnect events
- FPS and JS heap estimates for the session
Reports are stored in browser local storage and can be exported as JSON from the Profile diagnostics view.
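A minimal sketch of what such a report could look like as data. The interface names, event names, and field layout below are assumptions for illustration, not the actual shape produced by validation-session.tsx; only the general idea (timestamped events serialized to JSON for export) comes from this document.

```typescript
// Illustrative report shape; the repo's real schema may differ.
interface ValidationEvent {
  name: string;                      // e.g. "ar-session-entry-succeeded" (assumed name)
  at: number;                        // ms since session start
  detail?: Record<string, unknown>;  // optional payload (timings, FPS samples, etc.)
}

interface ValidationReport {
  sessionId: string;
  startedAt: string;                 // ISO timestamp
  events: ValidationEvent[];
}

export class ValidationSession {
  private readonly t0 = Date.now();
  private readonly report: ValidationReport;

  constructor(sessionId: string) {
    this.report = {
      sessionId,
      startedAt: new Date(this.t0).toISOString(),
      events: [],
    };
  }

  record(name: string, detail?: Record<string, unknown>): void {
    this.report.events.push({ name, at: Date.now() - this.t0, detail });
  }

  // Serialized form a Profile-style export panel could hand to testers.
  toJSON(): string {
    return JSON.stringify(this.report);
  }
}
```

A local-storage-backed version would persist `toJSON()` under a stable key and read it back on the Profile screen.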
Current Automation Evidence
The repo can currently provide trustworthy automation evidence for:
- experience routing and AR/tabletop branching
- immersive gameplay dock behavior with mock WebXR
- nurture/evolution immersive overlay actions
- current Garden mode-switching wizard behavior
- onboarding, offline shell, and non-immersive gameplay flows
Primary evidence files:
- packages/app/e2e/game-immersive-ar.spec.cjs
- packages/app/e2e/garden-modes.spec.cjs
- packages/app/src/components/garden/ARViews.test.tsx
- packages/app/src/components/game/GameCanvas.immersive.test.tsx
- packages/app/src/components/game/GameCanvas.experience.test.tsx
Hardware-Only Evidence
Automation is still not enough to prove:
- real hit-test stability
- physical placement speed and correction rate
- true camera passthrough quality
- thermal or memory degradation on long sessions
- Safari live-backdrop comfort on iPad/iPhone
- recovery after real mobile backgrounding and reconnect
Those must still be collected on hardware using the Android and Apple matrices defined in the rollout plan.
QA Collection Workflow
For each release candidate:
- Run the automated validation suite.
- Build or serve the app with VITE_QA_VALIDATION=true.
- Execute the required Android or Apple hardware flow.
- Open Profile and export the validation JSON report.
- Attach the exported JSON, device model, browser version, and tester notes to the release evidence package.
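A sketch of the build step with the flag set. The npm script and workspace names are assumptions taken from common Vite monorepo layouts; check the repo's package.json for the real commands.

```shell
# Enable validation capture for this QA build.
export VITE_QA_VALIDATION=true

# Hypothetical build/serve commands; uncomment and adjust in a real checkout:
# npm run build --workspace=packages/app
# npm run preview --workspace=packages/app

echo "validation capture flag: $VITE_QA_VALIDATION"
```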
Recommended tester notes:
- exact device model
- OS version
- browser version
- whether the run was cold cache or warm cache
- any visible tracking loss, mis-taps, or thermal throttling
Known Remaining Gaps
The repo still has one explicit validation-program gap:
- packages/app/e2e/multiplayer-two-page.spec.cjs still skips when realtime transport is unavailable in the environment
That means two-context realtime sync remains environment-gated rather than a deterministic, release-proving check.