Case Study · Behavioral Research · Mixed Methods · PhD Research

Can you prove that a design decision changes human motivation? I built a lab in two weeks to find out.

With no budget, no team, and no allocated time, I built a behavioral research lab inside a Fortune 500 company — then used it to conduct both industry-grade exploration for CES and a statistically rigorous doctoral experiment. The finding: spatial interaction design measurably shifts motivation quality (p=.03). The method: transferable to any emerging technology.

p=.03
Statistically significant
design effect on motivation
2 wk
From concept to
complete data collection
$0
Budget allocated
(foam board + projectors)
Fast Company
World-Changing Ideas 2020
Company
Harman International (Samsung) — Huemen Design Group
Role
Senior UX Researcher & PhD Candidate
Context
Autonomous vehicle experience design + Doctoral dissertation
Recognition
Fast Company 2020 · CES Presentation · Published Dissertation
Autonomous Vehicle UX Research
01
The Opportunity

When a VP says "if you know what you are doing, go ahead"

As a researcher at Harman International — a major Samsung subsidiary and supplier to luxury automotive manufacturers — I was invited into a meeting with a strategic problem: senior leadership needed to envision autonomous vehicle experiences for their annual CES showcase, but the initiative had no budget, no team, and no formal structure.

The initial scope was secondary research on technology trends. But I recognized the limitation — we weren't uncovering anything surprising. What we needed was primary research, but stakeholders couldn't envision how to collect data from sketches still in development.

During a meeting at the company's design headquarters with the VP — a renowned, award-winning designer — I proposed building a low-fidelity prototype to collect user feedback. His response was direct: "If you know what you are doing, go ahead."

My strategic decision: use the same experimental lab to serve both rapid industry insights and rigorous doctoral research — demonstrating that industry and academic research can exist in productive synergy.

Why this matters beyond autonomous vehicles

The automotive context is the specific scenario, but the capabilities demonstrated are universal: building research infrastructure from zero resources, running experiments that produce statistically defensible evidence, and doing it at speed. These transfer directly to:

Fintech & Digital Banking

Privacy vs. social features in financial apps — the same spatial design question this study answers

Health & Wellness Tech

How interface design affects motivation for long-term behavior change — the exact dependent variable tested here

Any Emerging Technology

Validating experiences for products that don't fully exist yet — provotyping's core purpose

02
Industry Exploration

Stage 1: Five interaction modes for a technology that doesn't exist yet

I built an immersive autonomous vehicle simulation using "smoke and mirrors" prototyping — foam board, projectors, screens, speakers, and cameras creating experiences that feel real without building actual technology. Results were ready within two weeks — from initial meeting to lab construction to complete data collection.

Using my doctoral provotyping methodology, I created five interaction modes exploring the transition from current driving culture to fully autonomous experiences:

Low-fidelity autonomous vehicle prototype setup
Low-fidelity prototype lab: projectors, foam board, and edited visualizations creating an immersive simulation — built with zero budget

Detox Mode

Zero information

No data displayed. Challenged car culture expectations — could users tolerate a blank screen?

Pure Ride

Minimal: speed + ETA

Testing the absolute minimum information threshold for user comfort in autonomous transit.

AR / VR Modes

Contextual to immersive

Tourist, Shopping, and Deep Dream modes testing information density from augmented to fully virtual.

Five interaction modes mapped on provotyping tool
Five interaction modes mapped across the provotyping framework — systematically exploring the information spectrum

Key qualitative findings

Finding 01

Transition design, not final design

Provotyping reveals what people need to transition from the current paradigm — not what they'll ultimately need. Participants demanded car data (speed, health) despite full autonomy making it irrelevant.

"Does not like not being able to see information. Wants to know what is going on." — PAR 6
Finding 02

Efficiency dominates over emotion

Despite emotional design framing, participants prioritized time management. Even when freed from driving, efficiency remained the dominant value.

"Time management is the most valuable thing for her. 'A car that helps me use my time is a really good thing.'" — PAR 2
Finding 03

Voice interaction complicates group authority

In shared vehicles without designated drivers, voice raises control questions — tone becomes a resource for authority, potentially reducing individual motivation.

Finding 04

Data privacy in shared spaces

Participants appreciated personalization but feared residual data in non-owned vehicles. Transparency about data deletion is critical for shared technology acceptance.

Detox Mode participant feedback
Detox Mode analysis: participant feedback mapped across privacy concerns, control needs, pain points, and new ideas
03
The Experiment

Stage 2: Does spatial design measurably change motivation quality?

Building on the exploratory phase, I designed a controlled mixed methods experiment — the core of my doctoral dissertation — testing whether interaction design attributes can function as controlled independent variables to test behavioral theories.

Experimental design

Independent variable: Proxemics (spatial interaction design) manipulated through three prototypes:

  • Prototype A — Personal/Intimate: Individual phone screens, private decisions
  • Prototype B — Group/Social: Collaborative touchscreen, shared decisions
  • Prototype C — Public: Immersive screens + voice control, collective decisions

Dependent variable: Self-Determination Theory motivation regulations (intrinsic vs. extrinsic)

Method: Repeated measures ANOVA + semi-structured interviews (n=8)
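The repeated-measures analysis described above can be sketched in a few lines. This is a minimal illustration, not the study's actual analysis script: the data shape (8 participants × 3 prototype conditions) matches the design, but the scores are synthetic and the function name is my own.

```python
import numpy as np
from scipy import stats

def repeated_measures_anova(scores):
    """One-way repeated-measures ANOVA.

    scores: (n_subjects, k_conditions) array, e.g. SRQ-E
    external-regulation scores per participant per prototype
    (illustrative layout, not the study's raw data).
    Returns (F, df_condition, df_error, p).
    """
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    # Partition total variability: conditions, subjects, residual error.
    ss_cond = n * ((scores.mean(axis=0) - grand) ** 2).sum()
    ss_subj = k * ((scores.mean(axis=1) - grand) ** 2).sum()
    ss_total = ((scores - grand) ** 2).sum()
    ss_err = ss_total - ss_cond - ss_subj
    df_cond, df_err = k - 1, (k - 1) * (n - 1)
    f = (ss_cond / df_cond) / (ss_err / df_err)
    p = stats.f.sf(f, df_cond, df_err)
    return f, df_cond, df_err, p
```

Removing the subject sum of squares from the error term is what makes the design repeated-measures: between-person differences in baseline motivation don't inflate the error against which the condition effect is tested.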

Why this design matters for industry

This isn't academic abstraction. The question — "does making an interface more private or more social change how motivated people feel?" — applies to every product with social features, shared experiences, or privacy decisions.

The experimental rigor means the answer isn't "we think so" or "users told us" — it's "we can prove it with p=.03." That's the difference between a research opinion and evidence a VP can stake a product decision on.

Three prototype conditions
Three experimental conditions: personal (phone), social (touchscreen), and public (immersive + voice) — manipulating spatial interaction as a controlled variable
04
The Results

Statistical significance: privacy design reduces external regulation

Participants felt significantly less externally regulated (motivated by rewards/punishments) in the personal/intimate condition compared to the social condition. In other words: private interfaces shifted motivation toward more self-determined engagement — critical for sustaining long-term behavior change.

p = .01
External Regulation ANOVA: F(2,7) = 6.30 — Overall effect of spatial design on motivation regulation was statistically significant
p = .03
Personal vs. Social comparison — Private interfaces produced measurably higher autonomous motivation than shared interfaces
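The pairwise Personal-vs-Social comparison reported above is the kind of follow-up a paired test handles. A hedged sketch with made-up scores (the study's raw data is not reproduced here; arrays and scale are illustrative only):

```python
import numpy as np
from scipy import stats

# Synthetic external-regulation scores for 8 participants,
# measured in both conditions (values are invented for illustration).
personal = np.array([2.1, 2.8, 3.0, 2.5, 3.2, 2.7, 2.9, 2.4])
social   = np.array([2.9, 3.1, 3.6, 3.0, 3.5, 3.4, 3.3, 3.1])

# Paired t-test: each participant experienced both prototypes,
# so we compare within-person differences, not group means.
t, p = stats.ttest_rel(personal, social)
```

Because every participant rides in every prototype, the paired design cancels out stable individual differences, which is what makes a sample of eight capable of producing a significant result.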

What this means practically: When you design an interface to feel more private and personal, users make more self-determined decisions. When you make it social and public, external pressure increases. Neither is inherently better — but designers need to know which they're creating and why.

The key nuance: some individuals benefited from positive peer pressure. This means the design recommendation isn't "always go private" — it's "provide adjustable privacy controls" and understand the psychological tradeoffs of each decision.

ANOVA statistical results
Repeated measures ANOVA results across four Self-Determination Theory motivation regulation types

Mixed methods integration: why both quantitative and qualitative matter

The power of this research lies not in any single method, but in their strategic integration. Quantitative data showed which design had statistical impact. Qualitative data explained why and how. I used extreme quantitative scores to strategically select which participant comments to examine — making qualitative analysis efficient and targeted rather than exhaustive.
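The extreme-score sampling step above can be sketched as a small selection routine. The column names and participant IDs here are hypothetical, not taken from the dissertation:

```python
import pandas as pd

def select_extremes(sum_scores, n=2):
    """Pick the n lowest- and n highest-scoring participants so
    qualitative (interview) analysis can focus on the cases most
    likely to explain the quantitative pattern.

    sum_scores: DataFrame with 'participant' and 'score' columns
    (illustrative names, not the study's actual schema).
    """
    ranked = sum_scores.sort_values("score")
    # Head = lowest sum scores, tail = highest; both extremes matter.
    return pd.concat([ranked.head(n), ranked.tail(n)])
```

The design choice is deliberate: rather than coding every transcript equally, the statistical distribution nominates the participants whose interviews are most informative, keeping the qualitative pass targeted.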

Mixed methods sum scores analysis
Sum scores visualization: quantitative patterns guided which qualitative data to examine — making analysis both rigorous and efficient
AR Mode feedback and privacy concerns
Qualitative analysis revealing privacy concerns and authority dynamics across interaction modes
05
Impact & Reflection

What this proved — about the research, and about how I work

Fast Company World-Changing Ideas 2020
Honorable Mention, Transportation Category
CES Presentation
Interactive visual journey showcasing human-centered automotive innovation
Published Doctoral Dissertation
Illinois Institute of Technology, 2023
UMA CES presentation
UMA concept visualization and CES presentation materials

What I learned

Zero-budget research is a methodology choice, not a limitation

"Smoke and mirrors" prototyping — foam board, projectors, screens — produced insights that challenged the assumptions of an award-winning design team. Methodological clarity matters more than resource availability.

Industry and academic rigor aren't enemies

The same lab served both rapid CES insights and a peer-reviewed doctoral experiment. The key was designing research that works at multiple levels of rigor simultaneously — not choosing between speed and depth.

Mixed methods makes both traditions stronger

Using quantitative scores to guide qualitative analysis is dramatically more efficient than analyzing all data equally. Statistical patterns point you toward the right participant stories to examine.

Design decisions have measurable psychological effects

This is the core contribution: proving that interaction design attributes can function as controlled variables to test behavioral theories. UX decisions aren't preferences — they're interventions with measurable outcomes.

The capability this demonstrates

Moving fluidly between rapid qualitative insights for product iteration, rigorous quantitative validation when strategic decisions need statistical confidence, and mixed methods integration when neither tradition alone suffices. This combination doesn't require autonomous vehicles — it works with any emerging technology where you need to validate experiences before they fully exist.

Methods & Tools

Research approaches

Provotyping Methodology · Mixed Methods Design · Controlled Experimentation · Repeated Measures ANOVA · Semi-structured Interviews · Grounded Theory Analysis · Low-fidelity Prototyping · User Testing

Theoretical frameworks

Self-Determination Theory (SDT) · Proxemics Theory · Research Through Provocation · Exercise Self-Regulation (SRQ-E) · Persuasive Technology Design · Mixed Methods Integration

Stakeholder management

C-Suite Presentation (VP Innovation) · CES Event Coordination · Academic-Industry Translation · Design Team Collaboration · Cross-functional Research Communication · Dissertation Committee Engagement

What makes this distinctive

Zero-budget lab construction · Dual industry + academic output · Statistically significant findings · Design variables as experimental controls · Award-winning industry outcome

Explore more work

See other projects demonstrating strategic research and systems thinking

← Back to All Projects
Next project → What happens when AI joins the room