
Designing a Modular, Gamified Training System for Complex Decision-Making

Summary

WIT FORCE is an interactive training system designed to reinforce and extend the U.S. Army’s Weapons Intelligence Course.

I designed an interaction-driven learning system under real-world constraints, expanding an existing video-based platform into a modular, gamified experience.

The challenge required balancing realism, usability, cost, and technical limitations while translating complex, high-stakes training into something engaging, scalable, and effective.

This project demonstrates my approach to designing interaction-driven systems using gamification, feedback loops, and behavioral mechanics to guide user decision-making in complex environments.

Context

The Weapons Intelligence School trains soldiers to investigate incidents, analyze evidence, and identify threats in operational environments—essentially “CSI on the battlefield.”

The existing curriculum was dense and multi-disciplinary, heavily instructor-led and difficult to practice outside structured training in the classroom.

The project goal was to improve engagement and retention, enable self-directed practice, reduce reliance on in-person instruction, and introduce scenario-based learning.

Core Challenge

The training required learners to conduct interviews, analyze physical evidence, and reconstruct complex systems. They also had to operate specialized tools and make decisions under pressure.

All of this had to remain accurate to real-world procedures, fit within budget and timeline constraints, and support users with varying technical familiarity.

Constraints & Environment

This project operated under layered constraints:

These constraints shaped nearly every design decision.

My Role

I led the design of the system end-to-end, including:

My role focused on translating complex domain knowledge into structured, interactive systems that could be implemented within real-world constraints.

I operated at the system level—defining how learning, interaction, and feedback worked together across the entire experience.

Approach

To address the complexity of the training, I focused on three core design strategies:

Mission Structure

1. Modular System Design

I designed a system composed of reusable modules that could function in two modes:

Each module could operate independently or be embedded within a larger experience, enabling scalable content creation and flexible learning paths.
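As a purely illustrative sketch of how such dual-mode modules might be structured (the interface, class, and field names here are hypothetical, not taken from the actual system):

```typescript
// Hypothetical sketch: a module that can run standalone or be
// embedded inside a larger scenario that shares context with it.
interface ScenarioContext {
  evidenceCollected: string[];
  priorDecisions: string[];
}

interface ModuleResult {
  passed: boolean;
  decisions: string[];
}

interface TrainingModule {
  id: string;
  runStandalone(): ModuleResult;
  runEmbedded(context: ScenarioContext): ModuleResult;
}

// A minimal concrete module: an evidence-tagging exercise.
class EvidenceTaggingModule implements TrainingModule {
  id = "evidence-tagging";

  runStandalone(): ModuleResult {
    // Standalone mode simply starts from an empty context.
    return this.runEmbedded({ evidenceCollected: [], priorDecisions: [] });
  }

  runEmbedded(context: ScenarioContext): ModuleResult {
    // Embedded mode builds on what the learner has already done,
    // so the same module fits anywhere in a larger scenario.
    const decisions = [...context.priorDecisions, "tagged-fragment"];
    return { passed: true, decisions };
  }
}
```

Because the embedded entry point receives all shared state as a parameter, the module itself stays stateless and can be reused across scenarios without modification.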

The system was designed with modular, lightweight interactions that could translate effectively to mobile and constrained environments if needed.

2. Interaction-Driven Learning


Instead of passive content, the system centered on:

Key design principles:

3. Selective Fidelity & Abstraction

A critical part of the design was determining what to simulate and what to simplify.

Examples:

This ensured:


Key Design Decisions

Choosing Video + Interaction over Full 3D

We deliberately avoided building a fully 3D experience.

Reasons:

Instead, we:

Result: A more believable, accessible, and cost-effective experience.

Designing for Mixed User Experience Levels

Users ranged from technically inexperienced to highly proficient.

Solutions:

Data-Driven Architecture

We implemented a data-driven system that allowed:

This enabled scalable development and long-term maintainability.
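One way to realize a data-driven setup like this (an illustrative sketch only; the actual schema is not shown in this case study) is to define each scenario as plain data that a generic engine interprets, so authors can add content without touching code:

```typescript
// Hypothetical scenario schema: authors edit data, not code,
// so new scenarios require no changes to the engine itself.
interface Choice {
  label: string;
  next: string | null; // id of the next step, or null to end
}

interface ScenarioStep {
  id: string;
  video: string;   // narrative clip shown before the decision
  prompt: string;  // decision presented to the learner
  choices: Choice[];
}

const scenario: ScenarioStep[] = [
  {
    id: "arrival",
    video: "clips/arrival.mp4",
    prompt: "You arrive at the incident site. What is your first action?",
    choices: [
      { label: "Secure the scene", next: "search" },
      { label: "Interview witnesses", next: "interview" },
    ],
  },
  {
    id: "search",
    video: "clips/search.mp4",
    prompt: "Mark the areas you want to search.",
    choices: [{ label: "Proceed to analysis", next: null }],
  },
  {
    id: "interview",
    video: "clips/interview.mp4",
    prompt: "Choose a line of questioning.",
    choices: [{ label: "Proceed to analysis", next: null }],
  },
];

// The engine only needs generic lookup and traversal logic.
function getStep(id: string): ScenarioStep | undefined {
  return scenario.find((s) => s.id === id);
}
```

Keeping branching logic inside the data (the `next` pointers) rather than the engine is what makes content creation scale independently of development.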

Prioritizing Learning Impact

Not all features warranted equal investment.

Example:

Focus: maximize learning impact while minimizing unnecessary complexity.

Interaction Design & Core Loop

Each training scenario followed a structured loop:

  1. Narrative setup (video)
  2. Contextual decision or interaction
  3. Investigation or task-based modules
  4. Evidence processing and analysis
  5. Outcome + feedback
  6. After-action review
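As a hypothetical illustration, the six phases above could be driven by a simple fixed-sequence state machine (the phase names mirror the list; the code itself is an assumption, not the actual implementation):

```typescript
// Sketch: the scenario loop as an ordered sequence of phases.
type Phase =
  | "narrative"
  | "decision"
  | "investigation"
  | "analysis"
  | "feedback"
  | "after-action";

const PHASES: Phase[] = [
  "narrative",
  "decision",
  "investigation",
  "analysis",
  "feedback",
  "after-action",
];

// Returns the phase that follows `current`, or null once the
// after-action review completes and the loop can restart.
function nextPhase(current: Phase): Phase | null {
  const i = PHASES.indexOf(current);
  return i < PHASES.length - 1 ? PHASES[i + 1] : null;
}
```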

This loop:

Example Systems

Interview System (Behavioral Interaction)


Reconstruction System (Analytical Thinking)

Search & Discovery (Spatial Awareness)

Each system targeted a different cognitive skill.

Tradeoffs & Design Tensions

Key tensions included:

Examples:

While full narrative branching would have increased realism, I constrained branching to key decision points to maintain scope and ensure learners could quickly iterate and learn from failure.

Outcomes

The final system enabled scalable, reusable training modules and introduced self-directed learning outside of formal instruction.

By combining immediate feedback with after-action review, learners could identify gaps, iterate quickly, and build confidence in decision-making.

The platform supported both structured curriculum delivery and independent practice—expanding how the program could be taught and experienced.

The system ultimately supported training for hundreds of soldiers, across both independent study and classroom instruction.

Why This Approach Worked
