Hello Face

Launch the smallest maintained Vizij runtime app, prove the face is live, and connect the visible behavior back to the runtime skeleton.

Reader outcome

After this page, you should be able to launch the maintained Hello Face app, confirm that the runtime is actually live, trigger visible behavior with the mouse and keyboard, and point to the small runtime skeleton that makes the app work.

What you need

  • install dependencies once with pnpm install
  • work from the vizij-web workspace root

Proof state

  • tutorial-fullscreen-face runs locally
  • the face renders and reaches a stable ready state
  • mouse movement changes gaze and number keys trigger visible pose changes
  • you can point to the asset bundle, provider, and interaction hooks in code

Interactive reference

The source docs remain the canonical lesson. Use this interactive surface when it strengthens the page.

Starting State

Use the vizij-web workspace root:

cd /home/chris/Code/Semio/vizij_ws/vizij-web

This walkthrough uses apps/tutorial-fullscreen-face, which is the smallest maintained Vizij runtime app in the workspace.

What You Need

Install dependencies once:

pnpm install

Start the tutorial app:

pnpm run dev:tutorial-fullscreen-face

Open the local Vite URL that appears in the terminal.

Quick term bridge

Keep these four labels distinct while you work through the first run:

Term                | What it means on this page                            | What it does not mean yet
face artifact       | the bundled sample face being loaded                  | your own authored face
runtime bundle      | the app-facing bundle handed to VizijRuntimeProvider  | the whole application shell
app shell           | the tutorial app that hosts the face and hooks        | a deployed operator endpoint
deployment endpoint | a later Deploy concern with an exposed control path   | this first-run tutorial surface

What Success Looks Like Up Front

Before you inspect any code, know what a healthy first run should look like:

  1. the page opens without a blank canvas or crash
  2. the face becomes visible after loading finishes
  3. moving the mouse changes eye gaze
  4. pressing the number keys changes visible facial poses

If the face never appears, stop and use Validation Checkpoints or Troubleshooting Matrix before continuing.

Walkthrough

1. Launch the maintained runtime tutorial

From vizij-web, run:

pnpm run dev:tutorial-fullscreen-face

When the browser opens, wait for the face to settle into its ready state.

Expected result:

  1. you may briefly see a loading or initialization message
  2. the face appears centered on screen
  3. the app stops looking transitional and starts behaving like a live face surface

Current visual anchor:

Tutorial Fullscreen Face
the settled ready-state view of the smallest maintained runtime tutorial.

Use this still to confirm the baseline face state before you test motion or hotkey-driven changes.

2. Prove that the face is live, not static

Do two checks immediately:

  1. move the mouse across the viewport and watch the eyes follow
  2. press the number keys and watch visible expression or pose changes trigger

Expected result:

  1. gaze moves continuously with pointer movement
  2. pose changes feel discrete and key-driven
  3. repeated interactions continue working, which proves the runtime is actively staging inputs rather than showing a prerecorded asset

This is the first useful Vizij confidence test. A rendered face is not enough. The face has to respond.

Motion anchor:

Fullscreen Demo Motion
live gaze steering and hotkey-triggered pose changes proving that the face is not static.

The still screenshot above proves the settled ready state. This loop is the stronger proof that the surface is live: gaze keeps steering and hotkey-triggered expressions come and go through the runtime.

3. Inspect the runtime skeleton in code

Open these files:

  1. vizij-web/apps/tutorial-fullscreen-face/src/FaceApp.tsx
  2. vizij-web/apps/tutorial-fullscreen-face/src/hooks/useMouseGaze.ts
  3. vizij-web/apps/tutorial-fullscreen-face/src/hooks/usePoseHotkeys.ts

In FaceApp.tsx, find the three pieces that define the app:

const assetBundle: VizijAssetBundle = {
  namespace: "fullscreen-face",
  glb: {
    kind: "url",
    src: faceAssetUrl,
    aggressiveImport: true,
  },
  pose: {
    stageNeutralFilter: (_id, path) => !path.includes("/color/"),
  },
};
 
export function FaceApp() {
  return (
    <VizijRuntimeProvider assetBundle={assetBundle} autostart>
      <VizijRuntimeHud />
      <FaceRuntime />
    </VizijRuntimeProvider>
  );
}

What each piece is doing:

  1. assetBundle tells Vizij what face bundle to load
  2. VizijRuntimeProvider owns loading, controller registration, and runtime state
  3. FaceRuntime and VizijRuntimeFace render and control the resolved face

If you understand those three pieces, you understand the core shape of the maintained Hello Face path.
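The stageNeutralFilter in the bundle above is just a predicate over an id and a path string, deciding which paths participate in the staged neutral pose. A minimal sketch of that behavior, using made-up sample paths rather than real asset paths:

```typescript
// Sketch: stageNeutralFilter is a plain predicate (id, path) => boolean.
// This mirrors the filter in the asset bundle above; the sample path
// strings below are illustrative, not real bundle paths.
type StageNeutralFilter = (id: string, path: string) => boolean;

const stageNeutralFilter: StageNeutralFilter = (_id, path) =>
  !path.includes("/color/");

// Pose-like paths pass the filter; color paths are excluded.
console.log(stageNeutralFilter("a", "face/pose/smile"));   // true
console.log(stageNeutralFilter("b", "face/color/cheeks")); // false
```

The filter receives every candidate path, so excluding a family of paths (here, anything under /color/) keeps those values out of the staged neutral state without touching the rest of the bundle.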

4. Connect the visible behavior to the maintained hooks

Open useMouseGaze.ts and usePoseHotkeys.ts.

You are looking for two different control patterns:

  1. mouse gaze writes eye-position inputs continuously while the pointer moves
  2. pose hotkeys animate named pose-weight paths up and back down

You do not need to memorize every line yet. You do need to notice that both behaviors are driven through runtime APIs, not through one-off DOM tricks.

Expected result:

  1. the gaze hook explains why the eyes follow the mouse
  2. the hotkey hook explains why number keys trigger expression changes
  3. the code lines up cleanly with the behavior you already saw in the browser
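Both hooks reduce to small pure mappings before anything is written into the runtime. A hedged sketch of the two shapes, with invented function names (normalizeGaze, poseRampWeight) standing in for the real hook internals:

```typescript
// Sketch of the two control patterns; the real hooks write the
// resulting values through Vizij runtime APIs, not shown here.

// Continuous pattern: map a pointer position in the viewport to a
// gaze value in [-1, 1] per axis (viewport center maps to 0, 0).
function normalizeGaze(
  clientX: number,
  clientY: number,
  width: number,
  height: number,
): { x: number; y: number } {
  return {
    x: (clientX / width) * 2 - 1,
    y: (clientY / height) * 2 - 1,
  };
}

// Discrete pattern: a hotkey triggers a pose weight that ramps up and
// back down over a fixed duration (triangular envelope, peak at t = 0.5).
function poseRampWeight(elapsedMs: number, durationMs: number): number {
  const t = Math.min(Math.max(elapsedMs / durationMs, 0), 1);
  return t < 0.5 ? t * 2 : (1 - t) * 2;
}

console.log(normalizeGaze(400, 300, 800, 600)); // { x: 0, y: 0 } (center)
console.log(poseRampWeight(250, 1000));         // 0.5 (halfway up)
console.log(poseRampWeight(500, 1000));         // 1   (peak)
```

The continuous mapping runs on every pointer move, while the ramp is sampled per animation frame after a keypress; that difference is exactly what makes gaze feel fluid and pose changes feel discrete.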

5. Name the maintained runtime pattern

At this point, you should be able to say:

  1. this app loads one existing face bundle
  2. the runtime provider resolves the face and keeps track of readiness
  3. small hooks stage real runtime input writes
  4. the face is rendered by the same runtime stack other Vizij apps build on

That is the entire reason this app is the first maintained route through the guidebook.

Why This Page Matters

Hello Face is not trying to teach all of Vizij.

It is trying to remove the first doubt:

  1. can I run a real Vizij face locally
  2. can I make it do something visible
  3. can I find the code path that produced what I just saw

Once those are true, later control, integration, and deployment pages have something solid to build on.

What This Page Is Not Proving Yet

This page proves a live runtime surface. It does not yet prove:

  1. that you understand path semantics in detail,
  2. that you have an application integration shell of your own,
  3. that you have authored or customized the face,
  4. that you have a deployment endpoint an operator can drive.

Choose Your Next Route

The canonical next step is First Control Interactions.

Use one of these branch points after that:

If your next goal is…                                   | Open this next
understand the visible interactions before going deeper | First Control Interactions
get to a player shell and keep the fast route           | Minimal Web Player after Control
start owning the face and its behavior                  | Tweak an Existing Face after Control

Fast Recovery If It Fails

Use these shortcuts instead of guessing:

  1. if dependencies are missing or stale, rerun pnpm install
  2. if the page loads but the face never appears, use Validation Checkpoints
  3. if the face appears but does not respond, continue to First Control Interactions and compare the expected behavior there
  4. if the app shows an error state, use Troubleshooting Matrix

Continue to First Control Interactions.

That page uses the same app, but it slows down and explains what the two maintained interaction patterns are actually doing.