4  Empathize and Define

4.1 Learning Objectives

By the end of this chapter, you will be able to:

  • Conduct User Research: Apply observation, interview, and ethnographic research techniques to understand IoT users
  • Create Empathy Maps: Synthesize user research into Says-Thinks-Does-Feels quadrants that reveal insights
  • Identify Pain Points: Categorize user frustrations as functional, emotional, financial, or social
  • Develop User Personas: Create detailed representations of target users for design reference
  • Write Problem Statements: Use POV and HMW frameworks to create actionable, user-centered problem definitions
  • Define Success Metrics: Establish measurable criteria for evaluating solution effectiveness

Key Concepts

  • Empathy Map: A four-quadrant tool capturing what users Say, Think, Do, and Feel; reveals the gap between stated and actual user needs
  • User Interview: A structured conversation technique using open-ended questions to uncover real user behaviors, motivations, and frustrations — not opinions or feature requests
  • Pain Point Categories: Functional (task-related), emotional (frustration, anxiety), financial (cost, waste), social (status, relationships) — categorizing pain points reveals where solutions add the most value
  • Persona: A composite user archetype built from interview data, representing a segment of real users; used to keep design decisions user-centered throughout development
  • POV (Point of View) Statement: A user-centered problem definition format: “[User] needs [need] because [insight]” — the anchor for all subsequent design work
  • HMW (How Might We): A question format that reframes problems as opportunities; “HMW help elderly users remember medication without anxiety?” invites solution generation
  • Insight: A non-obvious pattern discovered from user research that reframes how the team understands the problem

In 60 Seconds

The empathize and define phases transform vague “we need an IoT device” ideas into precise user-centered problem statements; by conducting interviews, building empathy maps, and using the HMW and POV frameworks, teams ensure they are solving real problems rather than building technology for its own sake.

Design methodology gives you a structured, proven process for creating IoT systems from initial concept to finished product. Think of it like following a recipe when cooking a complex meal – the methodology tells you what to do first, how to handle each step, and how to bring everything together into a successful final result.

“Empathize means really understanding what someone else experiences,” said Sammy the Sensor. “If we are building a smart garden system, we do not just sit at our desks imagining what gardeners need. We go to actual gardens, watch people water their plants, and ask them what frustrates them. Maybe they hate carrying heavy watering cans, or they forget to water when traveling.”

Max the Microcontroller showed how to organize findings: “We use an empathy map with four sections: What do users SAY? What do they THINK? What do they DO? And what do they FEEL? A gardener might SAY ‘I water every day’ but we OBSERVE they actually forget half the time. The gap between what people say and do is where the real insights hide.”

“Then we Define the problem,” added Lila the LED. “We write a problem statement like: ‘Busy professionals who love gardening need a way to keep plants healthy despite irregular schedules, because forgetting to water causes plants to die and makes them feel guilty.’ That sentence guides everything we build!” Bella the Battery agreed, “A well-defined problem is half the solution!”

4.2 Prerequisites

4.3 Stage 1: Empathize

4.3.1 Understanding Your Users

The Empathize stage requires stepping outside your assumptions and entering the user’s world. For IoT products, this means understanding:

  • Physical context (where they’ll use the device)
  • Technical context (existing devices and connectivity)
  • Social context (who else is affected)
  • Emotional context (frustrations, fears, desires)

User Research Techniques

1. Observation: Watch users in their natural environment without interfering.

Example: For a smart home thermostat, observe how families interact with heating/cooling:

  • Do they check temperature often?
  • Who controls the thermostat?
  • When do comfort complaints happen?
  • What workarounds have they created?

2. Interviews: Ask open-ended questions that reveal needs and frustrations.

Good questions:

  • “Walk me through your morning routine…”
  • “Tell me about the last time [topic] frustrated you…”
  • “What would make this easier?”
  • “How do you currently solve this problem?”

Bad questions:

  • “Would you use a smart device that…?” (leading)
  • “Do you want feature X?” (yes/no, not insightful)

3. Empathy Mapping: Organize observations into four quadrants (a data-structure sketch follows this list):

Figure 4.1: Empathy Map Example: Understanding Elderly Medication Management Needs

Empathy Map for Smart Pill Bottle User: Four quadrants reveal the full picture of user experience. What users SAY (“I forget if I took my pills”) differs from what they FEEL (embarrassed, anxious). This emotional dimension is critical for IoT product design - the solution must address feelings, not just functional needs.

Use this blank template to map your own user research findings:

Figure 4.2: Blank empathy map for your user research. Fill each quadrant with findings from observations and interviews.

4. Contextual Inquiry: Observe users performing actual tasks in their real environment while asking questions.

5. Journey Mapping: Document the complete user experience across time and touchpoints.
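
To make the empathy-map quadrants concrete in software, here is a minimal sketch of how a team might record findings (illustrative Python; the EmpathyMap class is our own construct, not a standard research tool):

```python
from dataclasses import dataclass, field

@dataclass
class EmpathyMap:
    """One user segment's research findings, organized into four quadrants."""
    user: str
    says: list[str] = field(default_factory=list)    # direct quotes from interviews
    thinks: list[str] = field(default_factory=list)  # beliefs inferred by the researcher
    does: list[str] = field(default_factory=list)    # behaviors seen during observation
    feels: list[str] = field(default_factory=list)   # emotions expressed or displayed

gardener = EmpathyMap(
    user="Busy professional who gardens",
    says=["I water every day"],
    thinks=["Healthy plants reflect well on me"],
    does=["Actually forgets to water about half the time"],
    feels=["Guilty when plants die"],
)

# The say-do gap is where insights hide: compare quotes against observations.
for quote, behavior in zip(gardener.says, gardener.does):
    print(f"SAYS {quote!r} vs DOES {behavior!r}")
```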

4.3.2 Identifying Pain Points

Functional Pain Points:

  • Tasks that are difficult or time-consuming
  • Features that don’t work as expected
  • Missing capabilities users need

Emotional Pain Points:

  • Frustration with complexity
  • Anxiety about safety/security
  • Embarrassment about relying on technology

Financial Pain Points:

  • High costs of existing solutions
  • Hidden ongoing expenses
  • Wasted resources (energy, time)

Social Pain Points:

  • Difficulty sharing information with others
  • Privacy concerns with connected devices
  • Accessibility challenges for some users

Example: Smart Pill Bottle Pain Point Analysis

| Category | Pain Point | Current Solution | Opportunity |
|---|---|---|---|
| Functional | Forgetting to take pills | Phone alarms (easily dismissed) | Contextual reminder tied to bottle |
| Emotional | Anxiety about memory | Worry silently | Gentle confirmation without judgment |
| Social | Family worry | Daily check-in calls | Automatic “all good” notification |
| Financial | Expensive pill organizers | $40+ weekly dispensers | Simpler, affordable add-on |
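
Tagging each pain point with its category (and an estimated severity) shows at a glance where a solution would add the most value. A minimal sketch (Python; the severity scores are illustrative guesses, not research data):

```python
from collections import Counter

# (category, pain_point, severity 1-5) -- severities are illustrative guesses
pain_points = [
    ("functional", "Forgetting to take pills", 5),
    ("emotional",  "Anxiety about memory", 4),
    ("social",     "Family worry", 3),
    ("financial",  "Expensive pill organizers", 2),
]

# Which categories dominate, weighted by severity?
weight = Counter()
for category, _, severity in pain_points:
    weight[category] += severity

for category, score in weight.most_common():
    print(f"{category:10s} severity-weighted score: {score}")
```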

4.4 Stage 2: Define

4.4.1 Creating Problem Statements

Transform empathy insights into focused problem statements that guide solution development.

POV (Point of View) Framework:

[User] needs [need] because [insight].

Components:

  • User: Specific person type, not “people” or “everyone”
  • Need: A verb-based need, not a solution
  • Insight: The surprising “why” that emerged from research

Example POV Statements:

Weak: “Users need a smart pill bottle because they forget to take medication.”

Strong: “Elderly people living independently need a simple way to confirm they took today’s medication because they fear burdening family with check-in calls and doubt their own memory.”

Why it’s better:

  • Specific user (elderly living independently, not just “users”)
  • Need framed as action (confirm they took medication)
  • Insight reveals emotional depth (fear of burdening, self-doubt)

“How Might We” (HMW) Statements

Transform POV into ideation prompts using “How Might We”:

From POV: “Elderly people living independently need a simple way to confirm they took today’s medication because they fear burdening family.”

HMW Statements:

  1. How might we make confirmation automatic without user effort?
  2. How might we reassure family without feeling like surveillance?
  3. How might we preserve dignity while providing reminders?
  4. How might we make the solution obvious even for technology-hesitant users?

Why HMW Works:

  • “How” invites exploration (not yes/no)
  • “Might” gives permission to experiment
  • “We” creates collaborative ownership
  • Scoped enough to guide, open enough to inspire

4.4.2 Defining Success Metrics

User-Centered Metrics:

  • Task completion rate (e.g., medication taken on time)
  • User satisfaction scores (NPS, CSAT)
  • Adoption rate (users who continue using after 30 days)
  • Error rate (mistakes or failed interactions)

Technical Metrics:

  • Device reliability (uptime percentage)
  • Response time (latency for notifications)
  • Battery life (for mobile/wearable devices)
  • Data accuracy (sensor precision)

Business Metrics:

  • Customer acquisition cost
  • Monthly recurring revenue
  • Churn rate
  • Support ticket volume

Example: Smart Pill Bottle Success Metrics

| Metric Type | Metric | Target | How to Measure |
|---|---|---|---|
| User | Medication adherence | >90% | Bottle sensor data |
| User | User satisfaction | >4.5/5 | Monthly survey |
| Technical | Alert delivery success | >99% | Server logs |
| Technical | Battery life | >6 months | Field testing |
| Business | 30-day retention | >80% | Usage analytics |
| Business | Support tickets | <5% of users | Helpdesk data |
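
With targets defined up front, evaluating a prototype against them becomes mechanical once data arrives. A minimal sketch (Python; the measured values are invented placeholders, not real field data):

```python
# Each metric: (measured_value, target, comparison direction).
# Targets come from the table above; measured values are made-up placeholders.
metrics = {
    "Medication adherence":   (0.93,  0.90, ">"),
    "User satisfaction (/5)": (4.6,   4.5,  ">"),
    "Alert delivery success": (0.995, 0.99, ">"),
    "Battery life (months)":  (7.2,   6.0,  ">"),
    "30-day retention":       (0.78,  0.80, ">"),
    "Support ticket rate":    (0.04,  0.05, "<"),
}

for name, (measured, target, direction) in metrics.items():
    ok = measured > target if direction == ">" else measured < target
    print(f"{'PASS' if ok else 'FAIL'}  {name}: {measured} (target {direction}{target})")
```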

4.5 Knowledge Check

4.6 Worked Example: Empathize and Define for a Smart Water Leak Detector

Scenario: A home insurance company wants to reduce water damage claims (averaging $11,000 per incident, 2023 Insurance Information Institute data) by offering policyholders a subsidized smart water leak detector. The product team has 8 weeks to define the problem before prototyping begins.

Phase 1: Empathize – User Research (2 weeks)

The team conducted contextual inquiry with 18 homeowners who had experienced water damage in the past 3 years:

| Research Method | Participants | Key Finding |
|---|---|---|
| In-home interviews | 18 homeowners | 14 of 18 knew their water main location but had never tested their shutoff valve |
| Observation (basement/utility rooms) | 18 homes | 11 of 18 had water heaters with visible corrosion; only 2 had any leak detection |
| Insurance claim review | 200 claims | 72% of damage occurred while homeowners were away (vacation, work) |
| Plumber interviews | 5 professionals | Average leak-to-discovery time without detection: 8-14 days for slow leaks |

Empathy Map (synthesized from 18 interviews):

| Quadrant | Findings |
|---|---|
| SAYS | “I never think about water damage until it happens.” “The insurance claim process was a nightmare.” “I’d pay to avoid going through that again.” |
| THINKS | “My house is too old for smart devices.” “These gadgets probably need a plumber to install.” “What if it gives false alarms at 3 AM?” |
| DOES | Checks for leaks only after hearing dripping. Places towels under suspicious pipes. Turns off water main only when leaving for 2+ weeks. |
| FEELS | Anxious about hidden leaks (especially in walls). Frustrated by insurance deductibles ($1,000-$5,000). Skeptical about “smart home” complexity. |

Critical gap discovered: Users SAY “I check for leaks regularly” but OBSERVATION shows they only notice active dripping. Slow seepage behind walls, under water heaters, or at supply line connections goes undetected for days to weeks.

Phase 2: Define – Problem Framing (1 week)

Pain Point Analysis:

| Category | Pain Point | Severity | Current Workaround |
|---|---|---|---|
| Functional | Cannot detect slow leaks behind walls | Critical | None – discovered only when visible damage appears |
| Functional | No remote shutoff when away from home | High | Ask neighbor to check; turn off main before vacation |
| Emotional | Anxiety about “ticking time bomb” pipes in older homes | High | Avoid thinking about it |
| Financial | $1,000-$5,000 deductible even with insurance | High | None |
| Social | Embarrassment about water damage to neighbors below (condos) | Medium | Apologize; pay for repairs |

POV Statement (weak first attempt): “Homeowners need a smart water leak detector.”

Problems: vague user (“homeowners”), solution-prescribing (“smart water leak detector”), no insight.

POV Statement (refined): “Homeowners aged 40-65 with houses over 15 years old need to detect and stop water leaks within minutes of onset, because slow leaks cause $11,000 average damage over 8-14 days of undetected seepage, and 72% of major water damage occurs while residents are away from home.”

Why this is better: specific user (age + home age), verb-based need (detect and stop), quantified insight (damage cost, detection delay, away-from-home statistic).

HMW Statements (generated 12, top 4 shown):

  1. How might we detect water where it should never be, even behind walls or under floors?
  2. How might we automatically stop water flow the moment a leak starts, without requiring the homeowner to be present?
  3. How might we make leak detection work in older homes without requiring a plumber for installation?
  4. How might we turn leak prevention into a financial incentive rather than an expense?

Success Metrics Defined:

| Metric | Target | Measurement |
|---|---|---|
| Leak detection time | < 5 minutes from onset | Lab testing with calibrated drip rates |
| False alarm rate | < 1 per month per household | 90-day field trial with 200 homes |
| Self-installation success | > 85% of users complete setup without help | Unboxing study with 30 users |
| Insurance claim reduction | > 40% fewer water damage claims | 12-month controlled trial with insurer |
| User retention at 6 months | > 90% still have device active | Device telemetry analytics |

Outcome: This empathize-define phase redirected the product concept. The team originally planned a simple moisture sensor pad. The research revealed that (a) detection alone is insufficient – automatic shutoff is essential because 72% of damage occurs while away, (b) installation anxiety is the primary adoption barrier, not price, and (c) the insurance discount (HMW #4) became the core go-to-market strategy, with the insurer subsidizing $60 of the $89 device cost in exchange for projected claim-reduction savings of $275 per policyholder over 5 years (derived below).

Insurance economics drive IoT adoption decisions. Average water damage claim: $11,000. With 2% annual claim rate and $60 subsidy, insurer breaks even when:

\(\text{Subsidy per policy} < \text{Claim rate reduction} \times \text{Avg claim cost}\)

\(\$60 < \Delta_{rate} \times \$11,000\)

Required reduction: \(\Delta_{rate} > \frac{\$60}{\$11,000} = 0.00545 = 0.545\%\)

If leak detectors reduce claims from 2.0% to 1.5% (25% relative reduction = 0.5 percentage points):

\(\text{Savings per policy} = 0.005 \times \$11,000 = \$55/year\)

Over 5 years: \(5 \times \$55 = \$275\) saved vs \(\$60\) subsidy = $215 net profit per policy. With 200,000 policyholders, that is $43M in net savings over 5 years on a $12M subsidy investment.

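The break-even logic above is easy to turn into a reusable calculation. A minimal sketch (Python; the figures mirror the example above, and insurer_break_even is our own illustrative helper):

```python
def insurer_break_even(subsidy: float, avg_claim: float,
                       base_rate: float, reduced_rate: float,
                       years: int, policyholders: int) -> dict:
    """Insurer economics for subsidizing a leak detector."""
    required_reduction = subsidy / avg_claim        # minimum rate drop to break even
    actual_reduction = base_rate - reduced_rate     # percentage-point drop achieved
    savings_per_policy_year = actual_reduction * avg_claim
    net_per_policy = years * savings_per_policy_year - subsidy
    return {
        "required_rate_reduction": required_reduction,        # 0.00545 -> 0.545%
        "savings_per_policy_year": savings_per_policy_year,   # $55
        "net_per_policy": net_per_policy,                     # $215 over 5 years
        "portfolio_net": net_per_policy * policyholders,      # $43M for 200k policies
    }

print(insurer_break_even(subsidy=60, avg_claim=11_000,
                         base_rate=0.020, reduced_rate=0.015,
                         years=5, policyholders=200_000))
```

Running it reproduces the $55/year, $215 net per policy, and $43M portfolio figures from the example.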

4.7 Quantitative Benchmarks for User Research

How much user research is enough? Industry data provides guidance:

4.7.1 Sample Sizes That Actually Work

| Research Method | Minimum Sample | Optimal Sample | Diminishing Returns At | Source |
|---|---|---|---|---|
| Usability testing (qualitative) | 5 users | 8-12 users | 15 users (95% of issues found) | Nielsen Norman Group |
| Contextual inquiry (observation) | 8 sessions | 15-20 sessions | 25 sessions | Beyer & Holtzblatt |
| In-depth interviews | 12 participants | 20-30 participants | 40 participants | Guest et al. (2006) |
| Diary studies | 10 participants × 7 days | 15-20 × 14 days | 25 × 14 days | – |
| A/B testing (quantitative) | 500 per variant | 2,000 per variant | Depends on effect size | – |
| Survey (quantitative validation) | 100 responses | 300-500 responses | 1,000+ responses | – |
The 5-user rule: Jakob Nielsen’s research shows that 5 users find approximately 85% of usability issues. The 6th through 15th users primarily find duplicate issues. However, this applies only to usability testing of a single user segment. If your product serves 3 distinct segments (e.g., elderly patients, family caregivers, and medical staff), you need 5 users per segment = 15 users minimum.

Nielsen’s 5-user rule rests on probability theory. With average detection rate \(p = 0.31\) per tester, the probability that N testers collectively find a usability issue is:

\(P_{found}(N) = 1 - (1 - p)^N\)

For 5 testers: \(P_{found}(5) = 1 - (1 - 0.31)^5 = 1 - 0.1564 = 0.8436 \text{ (84.4\%)}\)

For 15 testers: \(P_{found}(15) = 1 - (1 - 0.31)^{15} = 1 - 0.0038 = 0.9962 \text{ (99.6\%)}\)

Diminishing returns calculation: Going from 5 to 15 testers triples cost ($300 → $900 in stipends) but adds only about 15 percentage points of coverage (84.4% → 99.6%). The marginal contribution of tester \(N\) is \(p(1-p)^{N-1}\), which falls from 31% for tester 1 to roughly 0.2% for tester 15.

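The curve behind the 5-user rule takes only a few lines to reproduce. A minimal sketch (Python, using the p = 0.31 average detection rate from the text):

```python
P_DETECT = 0.31  # average probability a single tester surfaces a given issue

def p_found(n: int, p: float = P_DETECT) -> float:
    """Probability that at least one of n testers finds a given issue."""
    return 1 - (1 - p) ** n

def marginal(n: int, p: float = P_DETECT) -> float:
    """Additional coverage contributed by tester number n."""
    return p * (1 - p) ** (n - 1)

for n in (1, 5, 10, 15):
    print(f"{n:2d} testers: {p_found(n):6.1%} of issues found; "
          f"tester {n} alone adds {marginal(n):5.2%}")
```

Running it reproduces the 84.4% and 99.6% figures above.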

4.7.2 Research Budget Allocation

A common question from product teams: “What percentage of the project budget should go to user research?”

| Project Phase | % of Total Budget | Activities | IoT-Specific Considerations |
|---|---|---|---|
| Discovery/Empathize | 8-12% | Interviews, observations, competitive analysis | Include home visits (IoT devices live in physical spaces) |
| Define | 3-5% | Synthesis workshops, persona creation | Test with non-technical users (50%+ of IoT buyers) |
| Prototype | 5-8% | Usability testing per iteration | Test physical + digital together (not just the app) |
| Post-launch | 2-4% ongoing | Analytics review, support ticket analysis | Monitor device abandonment rates (30-day, 90-day) |
| Total | 18-29% | | |

Worked example: budget allocation for a $200K IoT project following the 18-29% research guideline. Baseline: $100K labor (50%), $40K hardware/certifications (20%), $60K remaining (30%).

Minimum research budget (18%): $200K × 0.18 = $36,000, allocated by phase:

  • Discovery/Empathize: $200K × 0.10 = $20,000 (10 users × $2K each for interviews + analysis)
  • Define: $200K × 0.04 = $8,000 (2-week synthesis sprint)
  • Prototype testing: $200K × 0.065 = $13,000 (3 test rounds × 5 users)

Note these mid-range phase percentages sum to $41,000 (20.5%), slightly above the 18% floor but well within the 18-29% guideline.

Optimal research budget (25%): $200K × 0.25 = $50,000

If research prevents one major pivot ($46,800, see the pivot-cost breakdown below), then at a 35% pivot probability the expected avoided cost is \(0.35 \times \$46,800 \approx \$16,380\), recovering about 45% of the minimum $36,000 research budget from pivot avoidance alone, before counting any rework savings.

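These percentage splits are easy to sanity-check in code. A minimal sketch (Python; the phase percentages are the mid-range values used above, and research_budget is our own illustrative helper):

```python
def research_budget(total: float, phases: dict[str, float]) -> None:
    """Print per-phase research budgets and check against the 18-29% guideline."""
    spent = 0.0
    for phase, pct in phases.items():
        amount = total * pct
        spent += amount
        print(f"{phase:20s} {pct:5.1%} -> ${amount:,.0f}")
    share = spent / total
    print(f"{'Total research':20s} {share:5.1%} -> ${spent:,.0f} "
          f"({'within' if 0.18 <= share <= 0.29 else 'outside'} 18-29% guideline)")

# Mid-range allocation for the $200K example project
research_budget(200_000, {
    "Discovery/Empathize": 0.10,
    "Define": 0.04,
    "Prototype testing": 0.065,
})
```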

ROI of user research: A study by the IEEE (2018) across 60 software projects found that every $1 invested in user research during the discovery phase saved $10-$100 in development rework later. For IoT products with physical hardware, the savings are even larger because hardware changes after tooling are 10-50x more expensive than software changes.

User research ROI for IoT hardware. Discovery phase: 10 interviews × 2 hrs × $150/hr = $3,000. Without research, probability of major pivot after PCB: 35%.

Cost of post-PCB pivot:

  • PCB redesign: $2,500 (engineering) + $1,200 (fab) = $3,700
  • Component changes: $800
  • Testing re-validation: 40 hrs × $120/hr = $4,800
  • Timeline delay: 6 weeks × $25K/month burn = $37,500
  • Total pivot cost: $46,800

Expected cost without research: \(0.35 \times \$46,800 = \$16,380\)

With research (reduces pivot risk to 8%): \(\text{Expected cost with research} = \$3,000 + 0.08 \times \$46,800 = \$6,744\)

Net savings: \(\$16,380 - \$6,744 = \$9,636\), a 321% return on the $3,000 research investment.

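The same expected-value comparison works for any research-versus-pivot tradeoff. A minimal sketch (Python; the inputs mirror the figures above, and research_roi is an illustrative helper, not a library function):

```python
def research_roi(research_cost: float, pivot_cost: float,
                 p_pivot_without: float, p_pivot_with: float) -> dict:
    """Expected-cost comparison of doing vs. skipping discovery research."""
    without = p_pivot_without * pivot_cost                     # $16,380
    with_research = research_cost + p_pivot_with * pivot_cost  # $6,744
    savings = without - with_research                          # $9,636
    return {
        "expected_cost_without": without,
        "expected_cost_with": with_research,
        "net_savings": savings,
        "roi_on_research": savings / research_cost,            # 3.21 -> 321%
    }

print(research_roi(research_cost=3_000, pivot_cost=46_800,
                   p_pivot_without=0.35, p_pivot_with=0.08))
```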

4.7.3 When to Stop Researching and Start Building

| Signal | Action |
|---|---|
| You hear the same pain points from 3+ unrelated users | Pain point is real – define it |
| Empathy map quadrants are filling with consistent data | Synthesis is converging – move to Define |
| New interviews produce no new insights (saturation) | Stop interviewing, start defining |
| Stakeholders are quoting user research in meetings | Research has penetrated the team – leverage it |
| You can predict what the next user will say | You have enough qualitative data – validate quantitatively |
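
Saturation (row three above) can be tracked quantitatively: log the distinct insight codes each interview produces and stop when the cumulative count flattens. A rough sketch (Python; the insight codes are invented for illustration):

```python
# Insight codes tagged per interview during synthesis (illustrative data).
interviews = [
    {"forgets_watering", "guilt", "travel_gap"},
    {"forgets_watering", "heavy_cans"},
    {"guilt", "travel_gap", "cost_worry"},
    {"heavy_cans", "forgets_watering"},
    {"travel_gap"},
]

seen: set[str] = set()
for i, codes in enumerate(interviews, start=1):
    new = codes - seen
    seen |= codes
    print(f"Interview {i}: {len(new)} new insight(s) ({len(seen)} total)")
# When several consecutive interviews yield 0-1 new insights, move to Define.
```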

4.8 Common Pitfalls

Pitfall: Confirmation Bias in User Research

The Mistake: Asking leading questions like “Would you use a smart device that does X?” or only interviewing users who already like your concept. You hear what you want to hear, not what users actually need.

Why It Happens: Teams are emotionally invested in their ideas. Interview questions unconsciously lead users toward confirming the concept. Researchers nod enthusiastically at positive feedback, causing users to agree more. Negative feedback is dismissed as “that user doesn’t get it.”

The Fix: Use neutral questions: “Tell me about the last time you…” instead of “Do you want…?” Include skeptics in user research. Have someone outside the team review interview transcripts for bias. Ask “What would make you NOT use this?” to surface objections.

Pitfall: Solving the Wrong Problem (Defining Too Early)

The Mistake: Creating a problem statement that’s actually a solution in disguise. Example: “Users need a Bluetooth-connected device” - this prescribes Bluetooth before understanding if connectivity is even needed.

Why It Happens: Engineers naturally think in solutions. The excitement of building overshadows the discipline of understanding. Stakeholders pressure teams to “get building” before research is complete.

The Fix: Test your problem statement: does it mention any specific technology? If yes, rewrite it. Use the formula “[User] needs [need] because [insight]”, where the need must be a verb phrase, not a noun or product. A simple automated check is sketched below.
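
One way to enforce this is to lint draft POV statements for technology words before review. A rough heuristic sketch (Python; the TECH_WORDS list and pov_red_flags helper are our own, purely illustrative):

```python
# Technology terms that usually signal a solution in disguise (illustrative list).
TECH_WORDS = {"bluetooth", "wifi", "app", "sensor", "cloud",
              "smart", "device", "ai", "dashboard", "wearable"}

def pov_red_flags(statement: str) -> list[str]:
    """Return technology words found in a draft POV statement."""
    cleaned = statement.lower()
    for ch in ",.-":
        cleaned = cleaned.replace(ch, " ")
    return sorted(TECH_WORDS.intersection(cleaned.split()))

weak = "Users need a Bluetooth-connected smart device because they forget pills."
strong = ("Elderly people living independently need a simple way to confirm "
          "they took today's medication because they fear burdening family.")

print(pov_red_flags(weak))    # ['bluetooth', 'device', 'smart'] -> rewrite it
print(pov_red_flags(strong))  # [] -> no technology prescribed
```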

4.9 Summary

  • User Research Techniques: Observation, interviews, empathy mapping, contextual inquiry, and journey mapping reveal user needs that users themselves may not articulate directly
  • Empathy Map Quadrants: SAYS (direct quotes), THINKS (inferred beliefs), DOES (observable behaviors), FEELS (emotions) provide a complete picture of user experience
  • Pain Point Categories: Functional (tasks), emotional (feelings), financial (costs), and social (relationships) pain points guide comprehensive problem understanding
  • POV Statement Formula: “[User] needs [need] because [insight]” ensures problem statements remain user-centered and avoid prescribing solutions
  • HMW Statements: “How Might We” transforms problem statements into ideation prompts that invite creative exploration while maintaining focus
  • Success Metrics: User-centered, technical, and business metrics should be defined before prototyping to enable objective evaluation of solutions

4.10 Concept Relationships

Empathize and Define in the Design Thinking Flow

Empathize → Define Connection:

  • Raw observations (Says/Does) → Insights (Thinks/Feels) → Problem Statement
  • User quotes → Pain points → HMW questions → Solution space

Define → Ideate Gateway:

  • Weak problem statement (“users want technology”) → Scattered, unfocused ideas
  • Strong problem statement (“[specific user] needs [verb action] because [insight]”) → Targeted, relevant solutions

Common Patterns:

  • 5 user interviews → Pattern emerges (3+ users share pain point) → Valid problem
  • Empathy map gap: Says ≠ Does → Hidden insight → Breakthrough innovation
  • POV insight section → Becomes design constraint (“must preserve dignity” = no invasive monitoring)

4.11 See Also


4.12 What’s Next

Continue to Ideate, Prototype, and Test to learn brainstorming techniques, prototype fidelity levels, and user testing methods that validate your problem definition with real users.
