AR Hand Assessment Tool (H.A.T.)

Enabling patients of all abilities to assess upper extremity function for rehabilitation

──────────────────────────────────────────

MY ROLE

Product Design, Unity Prototyping, Research

TIMELINE

5 months, Sep - Jan 2025

TEAM

1x Engineer

CONTRIBUTION

Designed in Figma and implemented in Unity in collaboration with an engineer

Led 3 independent user testing sessions and 1 co-design session

IMPACTS

Worked as the sole designer and co-authored a research paper

Presented the final product to local OTs and hand clinics

PROBLEM

Measuring hands is hard.

Measuring a patient's hand function, such as range of motion (ROM) - how far each finger joint can flex or extend - is difficult.

Right now, clinicians measure it with hand-held goniometers, which have some problems:

1. Tedious and time-consuming

2. Prone to human error

Hand-held goniometers to measure ROM

3. No visual feedback of the patient's progress

SOLUTION

HAT is an AR tool that assists patients with hand rehabilitation by recording and tracking their progress.

ACCESSIBILITY FEATURES

1) Multiple modalities for instructions

Visual + text + sound (instructions voiced in a British accent)

2) Hand detection toggle

More intuitive than a recording timer; created in Unity

3) Buttons that automatically proceed

Counts down for 5 s in case the patient has a hard time pressing buttons (a minimal sketch follows this list)
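To illustrate the auto-proceed behaviour, here is a minimal Unity C# sketch of a button that confirms itself after a countdown. The component and field names (AutoProceedButton, countdownLabel) are placeholders for illustration, not the shipped HAT implementation.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Minimal sketch: the "Next" button confirms itself after a short countdown,
// so patients who struggle to press UI targets still progress to the next step.
public class AutoProceedButton : MonoBehaviour
{
    [SerializeField] private Button nextButton;        // the regular "Next" button
    [SerializeField] private Text countdownLabel;      // shows the remaining seconds
    [SerializeField] private float countdownSeconds = 5f;

    private void OnEnable()
    {
        StartCoroutine(CountdownThenProceed());
    }

    private IEnumerator CountdownThenProceed()
    {
        float remaining = countdownSeconds;
        while (remaining > 0f)
        {
            countdownLabel.text = Mathf.CeilToInt(remaining).ToString();
            remaining -= Time.deltaTime;
            yield return null;                          // update once per frame
        }
        nextButton.onClick.Invoke();                    // proceed automatically
    }
}
```

The same pattern could also respect the hand detection toggle, for example by pausing the countdown whenever no hand is currently tracked.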

SO WHY AR?

We want to create an AR tracking tool with the Leap Motion Tracker so that:

1. Patients can record progress themselves

2. Patients can understand the effects of therapy

3. Insurance companies are incentivized to keep paying for treatment, since the Leap device is relatively cheap

Initial user flow

Questions arise:

Participants are supposed to hold one pose for 2 s, but they need to have the correct hand orientation first (e.g. palm facing towards the camera) - see the sketch after these questions.

Which is more effective: an image tutorial or a video tutorial?

How do people interact with the Leap device, and where should it be placed?

How do users engage with the skip button, if there is one?
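One plausible way to frame that hold check, sketched in Unity C#: the palm must roughly face the camera before a 2 s timer starts accumulating, and the timer resets if the orientation is lost. The palm data source, thresholds, and names here are assumptions for illustration, not the values used in the study.

```csharp
using UnityEngine;

// Rough sketch of the "hold a pose for 2 s" check discussed above.
public class PoseHoldCheck : MonoBehaviour
{
    [SerializeField] private Transform trackingCamera;            // camera/tracker the palm should face
    [SerializeField] private float holdSeconds = 2f;              // required hold duration
    [SerializeField] private float orientationToleranceDeg = 30f; // how far off the palm may point

    private float heldTime;

    // Call once per frame with the tracked palm position and outward palm normal.
    public bool UpdateHold(Vector3 palmPosition, Vector3 palmNormal)
    {
        Vector3 toCamera = (trackingCamera.position - palmPosition).normalized;
        bool facingCamera = Vector3.Angle(palmNormal, toCamera) < orientationToleranceDeg;

        heldTime = facingCamera ? heldTime + Time.deltaTime : 0f;  // reset if orientation is lost
        return heldTime >= holdSeconds;                            // true once the pose has been held
    }
}
```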

The Leap Motion Controller is an optical hand-tracking module that captures the movements of hands. It's small, fast, and accurate.
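Because the tracker reports per-joint hand data, a joint angle can in principle be estimated straight from the tracked positions. Below is a rough Unity C# sketch of that idea, assuming the three joint positions are already pulled from the tracking data; it is illustrative geometry, not HAT's actual measurement pipeline.

```csharp
using UnityEngine;

public static class RomMath
{
    // Angle at a joint, taken between the two bone segments that meet there.
    // 180 degrees means the finger is fully straight at that joint.
    public static float JointAngleDegrees(Vector3 proximal, Vector3 joint, Vector3 distal)
    {
        Vector3 toProximal = proximal - joint;
        Vector3 toDistal = distal - joint;
        return Vector3.Angle(toProximal, toDistal);
    }

    // Flexion reported as deviation from a straight finger (0 = straight).
    public static float FlexionDegrees(Vector3 proximal, Vector3 joint, Vector3 distal)
    {
        return 180f - JointAngleDegrees(proximal, joint, distal);
    }
}
```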

DESIGN QUESTION

How might we design an AR product to guide patients with hand injuries to record different hand poses, and visualize progress to help them understand therapy outcomes?

-> GOAL


Develop an accessible product that guides patients (or clinicians with their patients) to record and analyze ROM using commercially available hand-tracking technologies like the Leap Motion Tracker.

User Testing 1: Low-fidelity Prototyping

To explore these questions, I used Lego and tutorial images/videos on my laptop to test out ideas with 5 participants.

See research plan and notes here

Participants tended to make more mistakes with the video tutorial (e.g. starting the recording in the wrong hand orientation, or holding an incorrect hand orientation)

Participants tended to make fewer mistakes with the image tutorial

High-fidelity Prototyping Iterations

Based on User Test 1, I confirmed these design directions:

  1. Use images instead of a video tutorial to reduce mistakes

  2. No “Skip” button, because it’s not accessible and voice control is not in our scope

  3. Update language and images for more intuitive instructions

To be continued…