::: {style="overflow-x: auto;"}

---
title: "Understanding People and Context"
difficulty: beginner
---

8.1 Overview

This chapter series explores how to understand users and their contexts for effective IoT design. The most sophisticated IoT technology will fail if it doesn’t align with how people actually live, work, and think.

MVU: Minimum Viable Understanding

Core concept: IoT success depends on observing how real people behave in real contexts - not on designer assumptions about user needs or technical elegance.

Why it matters: The “curse of knowledge” causes engineers to overestimate user sophistication, leading to products that work technically but fail practically.

Key takeaway: If you haven’t watched real users struggle with your device in their actual environment, you don’t understand your users yet.

“Here is something surprising,” said Sammy the Sensor. “The hardest part of building an IoT device is not the technology – it is understanding the PEOPLE who will use it!” Max the Microcontroller agreed, “Engineers like me tend to think everyone is tech-savvy. But my grandma still thinks Wi-Fi comes from the wall like electricity. If I design for myself, she will never figure out our smart home system.”

“That is why user research exists,” explained Lila the LED. “You go to people’s actual homes and watch them struggle with their current gadgets. You interview them about their daily routines. You discover that the thing they care about most is not fancy features – it is that the device just WORKS without thinking about it.”

Bella the Battery shared a golden rule: “Watch what people DO, not just what they SAY. People say they want 50 features, but when you watch them, they only use three. Understanding this saves me energy because Max does not need to power features nobody actually uses!”

8.2 Chapter Series

This topic is covered in six focused chapters:

8.2.1 1. User Research Fundamentals

Estimated time: 15-20 minutes

  • Why user research matters for IoT success
  • The curse of knowledge and assumption-based design
  • Stated preferences vs. revealed behavior
  • How context shapes user interaction

8.2.2 2. Research Methods

Estimated time: 20-25 minutes

  • Research method selection framework
  • Contextual inquiry techniques (2-4 hour observation)
  • Interview best practices
  • Sample size guidelines for qualitative research
  • Lab testing vs. field research tradeoffs

8.2.3 3. Personas and Journey Maps

Estimated time: 15-20 minutes

  • Creating evidence-based personas
  • Primary, secondary, and anti-personas
  • User journey mapping process
  • Identifying pain points and intervention opportunities
  • Common persona mistakes to avoid

8.2.4 4. Context Analysis

Estimated time: 20-25 minutes

  • Five dimensions of context: Physical, Social, Temporal, Technical, Cultural
  • Documenting environmental and situational factors
  • Accessibility considerations
  • Context-aware system design principles
  • Balancing automation with user control

8.2.5 5. Pitfalls and Ethics

Estimated time: 15-20 minutes

  • Common design pitfalls: context inference errors, privacy creep, sampling bias
  • Ethical research principles: informed consent, privacy, compensation
  • Research quality checks: avoiding confirmation bias and leading questions
  • Privacy vs. personalization tradeoffs

8.2.6 6. Quizzes and Assessment

Estimated time: 15-20 minutes

  • Quick knowledge check quiz
  • Interactive knowledge checks with feedback
  • Comprehensive review questions
  • Resources for further learning

8.3 Learning Objectives

By completing this chapter series, you will be able to:

  • Conduct User Research: Apply observation, interviews, and contextual inquiry methods appropriate for IoT
  • Analyze Context of Use: Identify and document environmental, social, and situational factors affecting device usage
  • Create User Personas: Develop evidence-based personas that represent target user groups and their needs
  • Map User Journeys: Document user experiences across touchpoints to identify pain points and opportunities
  • Evaluate Cultural Factors: Assess cultural, social, and accessibility factors in IoT design decisions
  • Distinguish assumption-based design from evidence-based design: Use systematic research methods instead of designer assumptions about user needs

Context-aware IoT design means creating systems that adapt to the user’s current situation. Think of a good hotel concierge who adjusts recommendations based on whether you are traveling for business or pleasure. Context-aware IoT devices sense factors like time of day, location, and user activity to provide the right experience at the right moment.

8.4 Prerequisites

Before diving into this series, you should be familiar with:

8.5 Key Concepts at a Glance

| Concept | Description |
|---|---|
| Curse of Knowledge | Once you understand something, you can’t imagine not understanding it |
| Stated vs. Revealed | What users say they want often differs from what they actually do |
| Contextual Inquiry | 2-4 hour observation of users in their natural environment |
| Primary Persona | Main user type that drives design decisions |
| Five Dimensions | Physical, Social, Temporal, Technical, Cultural contexts |
| Minimum Viable Data | Collect only what’s needed; process locally when possible |

8.6 Knowledge Check

Scenario: Designing a smart lighting system for a multi-generational home (elderly parents, adult children, young kids).

Traditional (Context-Blind) Design:

  • Single brightness level for all users
  • Manual control via touchscreen app
  • Same behavior at all times of day
  • Result: Elderly users can’t see screen controls, kids leave lights on, adults forget to dim at bedtime

Context-Aware Design Process:

Step 1: Five Dimensions of Context Analysis

Physical Context:

  • Elderly bedroom: Low ambient light at night, difficulty seeing touchscreen in dark
  • Kids’ playroom: High activity, hands often dirty/wet from crafts
  • Living room: Variable lighting needs (bright for reading, dim for TV watching)
  • Kitchen: Wet hands while cooking, need instant control

Social Context:

  • Shared spaces: Family members have conflicting preferences (Dad wants bright, Mom wants dim)
  • Privacy: Bedroom lights shouldn’t be controlled by kids’ tablets
  • Guest mode: Guests shouldn’t need app/account to use basic lighting

Temporal Context:

  • Morning (6-9am): Gradual brightening to aid waking
  • Evening (7-10pm): Gradual dimming to promote sleep
  • Night (10pm-6am): Minimal light for safety (bathroom trips) without disrupting sleep
  • Vacation mode: Random patterns to simulate occupancy

Technical Context:

  • Elderly parents: Don’t have smartphones, struggle with apps
  • Adult children: Comfortable with voice control, automation
  • Kids: Can operate physical switches but forget to turn off lights
  • Network: Intermittent Wi-Fi in basement (needs fallback control)

Cultural Context:

  • Accessibility: 25% of elderly have arthritis (can’t press small buttons)
  • Vision impairment: 40% of household needs 18pt+ text for readability
  • Language: Multi-lingual household (English + Spanish labels needed)

Step 2: Design Decisions Based on Context

| Context Insight | Design Decision | Implementation |
|---|---|---|
| Elderly can’t use touchscreen app in dark | Multi-modal control (voice + physical switches) | “Alexa, dim bedroom lights”; large tactile wall switches (44mm diameter) |
| Kids forget to turn off lights | Auto-off after 30 min of no motion | PIR motion sensors in playroom, living room |
| Morning gradual wake-up | Circadian rhythm automation | Lights auto-brighten 6-7am (0% → 70%), synced to sunrise time |
| Shared space conflicts | Per-user profiles detected via phone proximity | Dad’s presence → 90% brightness, Mom’s presence → 60% brightness (Bluetooth detection) |
| Wet hands while cooking | Voice control + motion-activated task lighting | “Turn on kitchen lights” works with dirty hands; under-cabinet lights auto-on when approaching counter |
| Guest usability | Physical switches work without app | Wall switches always functional (even if internet/hub down) |
| Night bathroom trips | Low-intensity pathway lighting | Motion-triggered 5% dim lights (hallway, bathroom) 10pm-6am—enough to see, not wake fully |

Step 3: Implement Context-Aware Rules

Rule 1: Circadian Rhythm Automation

IF time = 6:00am - 9:00am
  THEN brightness = time-based gradient (0% → 100% over 3 hours)
  AND color temperature = 2700K (warm white, gentle wake)

IF time = 7:00pm - 10:00pm
  THEN brightness = time-based gradient (100% → 30% over 3 hours)
  AND color temperature = 2000K (very warm, promote melatonin)

Rule 2: Contextual Overrides

IF user manually adjusts lights
  THEN pause automation for 2 hours
  THEN resume automatic schedule

IF user says "Movie mode"
  THEN dim living room to 10%, disable motion sensors
  THEN auto-restore after 3 hours OR when user says "Lights on"

Rule 3: Multi-User Presence Detection

IF bedroom occupied by elderly parent only
  THEN max brightness 80% (reduce glare sensitivity)
  AND enable night-light mode (5% dim from 10pm-6am)

IF kids present in playroom
  THEN enforce auto-off after 30 min of no motion
  AND disable voice control (kids say "Alexa turn off" as prank)
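The three rules above can be sketched as one schedule-lookup function. This is a minimal illustration in Python using the chapter's example thresholds; the function name, the daytime 5000K level (taken from the circadian note later in this chapter), and the return format are illustrative assumptions, not a product API.

```python
from datetime import time

def circadian_target(t: time) -> tuple[float, int]:
    """Return (brightness 0.0-1.0, color temperature in Kelvin) for a time of day.

    Implements Rule 1 above: a 3-hour morning ramp up, a 3-hour evening
    ramp down, and a 5% night-light level in between.
    """
    minutes = t.hour * 60 + t.minute
    if 6 * 60 <= minutes < 9 * 60:                    # morning: 0% -> 100% over 3 h
        return (minutes - 360) / 180, 2700             # warm white, gentle wake
    if 9 * 60 <= minutes < 19 * 60:                    # daytime: assumed full bright
        return 1.0, 5000                               # cool white for alertness
    if 19 * 60 <= minutes < 22 * 60:                   # evening: 100% -> 30% over 3 h
        return 1.0 - 0.7 * (minutes - 1140) / 180, 2000
    return 0.05, 2000                                  # night: pathway light only

# Rule 2 (manual-override pause) would be handled by the caller:
# skip consulting this schedule for 2 hours after a manual adjustment.
```

For example, at 7:30am the schedule is halfway through the morning ramp, so the function returns 50% brightness at 2700K.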

Step 4: Accessibility Features

For Elderly Users:

  • Physical wall switches with large tactile buttons (44mm × 44mm)
  • High contrast labels (black text on white background, 18pt font)
  • Voice control via smart speaker (no smartphone required)
  • Preset scenes (“Reading mode” = 100% brightness, task light on)

For Children:

  • Motion sensors prevent “forgot to turn off” waste
  • Voice commands work but can’t override parent settings
  • Fun names for scenes (“Playtime” instead of “Scene 3”)

For Visually Impaired:

  • Audible feedback when lights adjusted (“Bedroom lights set to 70%”)
  • High contrast indicators on switches (LED ring glows when lights on)

Results After Implementation:

| Metric | Before (Manual Control Only) | After (Context-Aware) | Improvement |
|---|---|---|---|
| Elderly satisfaction | 45% (frustration with app) | 89% (love voice control + physical switches) | +44% |
| Lights left on accidentally | 6 hours/day average | 0.5 hours/day (auto-off works) | -92% |
| Manual adjustments per day | 24 (family constantly tweaking) | 4 (automation handles most scenarios) | -83% |
| Energy usage | 100% baseline | 68% (auto-dimming + auto-off) | -32% |
| Guest usability | 30% could operate (needed app) | 95% could operate (physical switches) | +65% |

Key Insights from Context Analysis:

  1. Physical context drives interface choice: Elderly users in dark bedrooms need voice control, not touchscreen apps
  2. Temporal context enables automation: Circadian rhythm rules eliminate 83% of manual adjustments
  3. Social context requires per-user settings: Shared spaces need presence detection to serve conflicting preferences
  4. Technical context demands fallback: Physical switches work when Wi-Fi/hub fail (critical for elderly who can’t troubleshoot)
  5. Cultural context (accessibility) benefits everyone: Large buttons designed for arthritis are also easier for everyone to use

Key Insight: Context-aware design transforms smart lighting from “another gadget to manage” to “invisible automation that just works.” By analyzing all five dimensions of context (physical, social, temporal, technical, cultural), the system anticipates needs rather than requiring constant manual control. The elderly parents never open an app—lights just work correctly based on context.

Context-aware automation efficiency: A family manually adjusting lights 24 times/day spends \(T_{manual} = 24 \times 15 \text{ sec} = 360 \text{ seconds} = 6 \text{ minutes/day}\). Context-aware automation (circadian rhythm + presence detection) reduces adjustments to 4/day, saving \(T_{saved} = (24 - 4) \times 15 = 300 \text{ sec/day}\). Annually: \(300 \text{ sec/day} \times 365 \text{ days} = 109{,}500 \text{ sec/year} = 30.4 \text{ hours/year}\) (dividing by 3,600 sec/hour). Over 5-year device lifetime, that’s 152 hours of saved interaction time, plus 32% energy savings from auto-dimming/off.
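The arithmetic above can be verified in a few lines; all values come straight from the example.

```python
def seconds_saved_per_day(before: int, after: int, sec_per_adjustment: int = 15) -> int:
    """Daily interaction time saved when automation absorbs manual adjustments."""
    return (before - after) * sec_per_adjustment

daily = seconds_saved_per_day(24, 4)      # 300 seconds/day
annual_hours = daily * 365 / 3600         # about 30.4 hours/year
lifetime_hours = annual_hours * 5         # about 152 hours over a 5-year lifetime
```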

Multi-user conflict probability: In a 4-person household with 6 shared IoT devices (thermostat, TV, lights, music, security, doorbell), if each person-device interaction has \(P_{conflict} = 0.3\) (30% chance of disagreement), and assuming 4 interactions per day across 6 devices (24 total interaction events), the probability of zero conflicts per day is \((1 - 0.3)^{24} = 0.7^{24} \approx 0.0002\) (0.02%). Context-aware per-user profiles (presence detection + preference learning) reduce conflicts to \(P_{conflict}' = 0.05\) (5%), achieving \((1 - 0.05)^{24} = 0.95^{24} \approx 0.29\) (29% zero-conflict days) — a 1,500× improvement in conflict-free operation.
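The conflict figures follow directly from the independence assumption (each of the 24 daily interaction events conflicts with fixed probability, independently). A quick check:

```python
def p_zero_conflicts(p_conflict: float, interactions: int = 24) -> float:
    """Probability that none of `interactions` independent events conflict."""
    return (1.0 - p_conflict) ** interactions

before = p_zero_conflicts(0.30)   # ~0.0002: conflict-free days are vanishingly rare
after = p_zero_conflicts(0.05)    # ~0.29: about 29% of days are conflict-free
improvement = after / before      # roughly 1,500x
```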

Circadian rhythm automation impact: Human alertness peaks in early afternoon and dips at night. Smart lighting that tracks natural circadian patterns (2700K warm white at 6am, 5000K cool white at 2pm, 2000K very warm at 10pm) supports melatonin production. Research shows 2000K evening light vs 5000K increases melatonin by 40-60%, helping sleep onset. For elderly users with sleep disorders, properly-timed warm lighting can reduce sleep onset latency by \(\Delta T = 15-25 \text{ minutes}\).

Accessibility cost-benefit: Large tactile buttons (44mm vs 22mm) cost +$1.50/unit but increase elderly usability from 45% to 89% (+44%). For a 10,000-unit deployment, this is \(\$15{,}000\) upfront cost. Setup support call reduction: 62% to 12% of users need help (50% reduction × 10,000 × $25/call = $125,000 saved) — an 8.3× ROI on accessibility investment.
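The cost-benefit arithmetic, spelled out (all figures are the chapter's example numbers, not industry benchmarks):

```python
units = 10_000
button_cost_delta = 1.50                  # larger tactile buttons, extra cost per unit
support_call_cost = 25.00
help_rate_before, help_rate_after = 0.62, 0.12

upfront = units * button_cost_delta                           # $15,000
calls_avoided = (help_rate_before - help_rate_after) * units  # 5,000 fewer calls
savings = calls_avoided * support_call_cost                   # $125,000
roi = savings / upfront                                       # ~8.3x return
```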

Presence detection accuracy vs. latency: Bluetooth proximity detection achieves \(P_{detect} = 0.95\) (95% accuracy) with \(\tau = 2-5 \text{ sec}\) latency. PIR motion sensors achieve \(P_{detect} = 0.92\) with \(\tau < 1 \text{ sec}\). For smart lighting, the \(3\text{-}4 \text{ sec}\) BLE latency is acceptable (user doesn’t notice delay walking into room). For smart locks, \(\tau > 2 \text{ sec}\) is unacceptable (user standing at door with groceries) — require PIR or UWB (\(\tau < 0.5 \text{ sec}\)).
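The latency-budget reasoning can be expressed as a simple filter over the quoted figures. The method list and numbers come from the paragraph above (UWB accuracy is not quoted, so it is left unstated); this is an illustrative sketch, not a recommendation engine.

```python
# (method, worst-case latency in seconds, detection accuracy where stated)
PRESENCE_METHODS = [
    ("BLE proximity", 5.0, 0.95),
    ("PIR motion", 1.0, 0.92),
    ("UWB", 0.5, None),               # accuracy not quoted in the text
]

def adequate_methods(latency_budget_s: float) -> list[str]:
    """Methods whose worst-case latency fits the application's budget."""
    return [name for name, lat, _acc in PRESENCE_METHODS if lat <= latency_budget_s]

lighting_ok = adequate_methods(5.0)   # lighting tolerates a few seconds of delay
lock_ok = adequate_methods(2.0)       # a lock needs sub-2-second response
```

With these budgets, all three methods qualify for lighting, while only PIR and UWB qualify for a smart lock.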

Use this framework to systematically analyze context dimensions for your IoT application:

8.6.1 Five-Dimension Context Analysis Template

For each dimension, answer the key questions and document design implications:

1. Physical Context

| Factor | Questions to Ask | Example Findings | Design Implications |
|---|---|---|---|
| Environment | Where will device be used? (Indoor/outdoor, home/office/factory, temperature/humidity) | Smart thermostat in kitchen (temperature swings from cooking, humid from dishwasher steam) | Waterproof enclosure (IP54), temperature sensor calibration for local heat sources |
| Lighting | What are lighting conditions? (Bright/dim, consistent/variable, natural/artificial) | Elderly bedroom (very dark at night, user can’t see small text) | High contrast display (white on black), large text (18pt+), voice control |
| Noise | What is ambient noise level? (Quiet/loud, consistent/variable, competing voices) | Kitchen (70-85 dB from appliances, family conversations) | Far-field microphones, noise cancellation, visual feedback (not just audio) |
| Physical Access | Can users easily reach device? (Mounted high/low, behind furniture, in narrow spaces) | Smart smoke detector on 10ft ceiling | Minimal interaction required (test via app, not physical button), battery life 10+ years |
| User Posture | What position are users in? (Standing/sitting/lying, stationary/moving, one hand/two hands) | Smart lock used while carrying groceries | Large touchpad (tap to unlock), voice control (“unlock front door”) |

2. Social Context

| Factor | Questions to Ask | Example Findings | Design Implications |
|---|---|---|---|
| User Presence | How many people interact? (Single user/multiple, private/public space) | Smart home hub in living room (family of 4 + guests) | Multi-user profiles, guest mode (no account needed for basic features) |
| Privacy Concerns | Who can observe interaction? (Private bedroom vs. shared office) | Smart speaker in private bedroom | Mute button (disable microphone), indicator light (shows when listening) |
| Social Norms | What behaviors are socially acceptable? (Voice commands in public, device visibility) | Wearable health tracker in professional office | Discreet form factor (not flashy), silent notifications (vibration not sound) |
| Authority/Hierarchy | Who has control rights? (Parent vs. child, manager vs. employee) | Family smart home (parents want control, kids want freedom) | Tiered permissions (parents override kids’ settings, kids can’t disable safety features) |

3. Temporal Context

| Factor | Questions to Ask | Example Findings | Design Implications |
|---|---|---|---|
| Time of Day | How does usage vary by hour? (Morning rush vs. evening relaxation) | Smart coffee maker (used 6-7am, user is groggy and rushed) | One-button brew, minimal configuration, loud audible alert when ready |
| Duration | How long is interaction? (Glance vs. extended use) | Fitness tracker (glanced at wrist for 2 seconds, 20× per day) | Glanceable info only (steps, heart rate), detailed stats in app |
| Frequency | How often is it used? (Once per day vs. every hour) | Smart thermostat (adjusted 2-3× per day) | Learn from manual adjustments, auto-schedule, reduce need for frequent interaction |
| Urgency | How time-sensitive is action? (Emergency alert vs. routine task) | Smart smoke alarm (life-threatening emergency) | Loud alarm (85+ dB), visual strobe, auto-call emergency services, NO confirmation dialog |
| Seasonal Patterns | How does usage change with seasons? (Winter heating vs. summer cooling) | Smart irrigation system (unused in winter, daily in summer) | Auto-adjust schedule by season, winterization mode (disable watering when temp <5°C) |

4. Technical Context

| Factor | Questions to Ask | Example Findings | Design Implications |
|---|---|---|---|
| Connectivity | What network access is available? (Reliable Wi-Fi, cellular, intermittent, offline) | Smart garage door opener (spotty Wi-Fi in garage) | Local control works offline, cloud sync when connection available, BLE fallback |
| Devices Owned | What devices do users have? (Smartphone, tablet, smartwatch, none) | Elderly user with flip phone (no smartphone) | Physical remote control, landline voice interface, no app requirement |
| Tech Literacy | How comfortable with technology? (Power user vs. technophobe) | Grandparent with low tech literacy | Setup wizard (step-by-step), pre-configured defaults, no jargon (“Auto mode” not “Thermostat setpoint scheduling”) |
| Existing Ecosystem | What platforms already in use? (Apple HomeKit, Google Home, Alexa, none) | User has 15 Alexa devices | Prioritize Alexa integration, voice control primary interface |
| Power Availability | Battery or mains? (Charging frequency acceptable?) | Outdoor sensor on fence post (no power outlet) | Battery-powered (5+ year life), solar panel optional, ultra-low-power radio (LoRa not Wi-Fi) |

5. Cultural Context

| Factor | Questions to Ask | Example Findings | Design Implications |
|---|---|---|---|
| Language | What languages do users speak? (Primary, secondary, literacy level) | Multi-lingual household (English + Spanish) | UI supports both languages, voice recognition for both, pictographic icons where possible |
| Accessibility | What disabilities must be accommodated? (Vision, hearing, motor, cognitive) | 25% of target users have arthritis | Large buttons (44pt touch targets), voice control, no precise gestures required |
| Cultural Norms | Are there cultural taboos or preferences? (Colors, symbols, interaction styles) | Device for Middle Eastern market (right-to-left text, certain colors culturally significant) | RTL UI layout, avoid culturally inappropriate colors/symbols |
| Regulations | What legal requirements apply? (Privacy laws, accessibility standards, safety certs) | Device sold in EU (GDPR compliance required) | Explicit consent for data collection, user can delete data, privacy by design |

8.6.2 Context Analysis Deliverables

After analyzing all five dimensions, create:

1. Context Summary Document

  • One-page overview of key contextual constraints
  • Prioritized list of “must address” vs. “nice to have” contextual factors

2. User Journey Map (Context Overlay)

  • Standard journey map + contextual factors at each touchpoint
  • Example: “6:30am (groggy), kitchen (noisy), wet hands (cooking) → need voice control, not touchscreen”

3. Design Decision Log

  • Every design choice traces back to context analysis finding
  • Example: “We chose voice control over touchscreen app BECAUSE elderly users in dark bedroom can’t see screen (Physical Context finding #3)”

4. Context Testing Plan

  • How will you validate design works in real contexts?
  • Example: “Test smart lock with 5 users carrying groceries (not clean-hands lab test)”

8.6.3 When to Skip Context Analysis (Risk Assessment)

Low Risk (Context analysis optional):

  • Internal tool for tech-savvy users in controlled environment (office workers at desks)
  • One-time use device (event wristband used for 3 hours)
  • Forgiving failure mode (missed notification won’t cause harm)

High Risk (Context analysis CRITICAL):

  • Safety-critical device (medical alert, smart lock, smoke alarm)
  • Vulnerable users (elderly, disabled, children)
  • Extreme environments (outdoor, industrial, high noise/light/temp variance)
  • Long deployment (5+ years, can’t easily fix design mistakes)

Key Insight: Context analysis reveals WHY certain design choices are correct. Without context understanding, you build “solutions” that don’t match real-world usage patterns. A touchscreen thermostat makes sense in a lab but fails when elderly users try to adjust it in a dark bedroom with arthritic fingers at 3am.

Common Mistake: Assuming Users Have Single, Consistent Context

The Mistake: Designing IoT systems assuming users always interact in the same context (e.g., “smart home users are always at home with their phone nearby”), missing the dynamic, multi-context reality of user behavior.

Why It Fails:

Users exist in multiple contexts throughout the day, and IoT systems that work perfectly in one context fail catastrophically in another.

Real-World Example: Smart Door Lock

Designed For (Assumed Context):

  • User at home, smartphone in pocket, hands free, Wi-Fi available
  • Expected interaction: Approach door → phone auto-detects → door unlocks

Actual Contexts Encountered:

| Context | Frequency | System Behavior | User Experience |
|---|---|---|---|
| Groceries in both hands | 40% of entries | Phone in pocket, can’t tap screen | Lock doesn’t auto-detect (needs app open), user stuck outside with melting ice cream for 2 minutes |
| Phone battery dead | 15% of entries | No Bluetooth connection | Locked out, no backup physical key, calls locksmith ($150) |
| Guest/visitor | 10% of entries | No app installed | Can’t enter—no physical keypad or traditional key option |
| Power outage | 5% of entries | Hub offline, no internet | Auto-unlock fails, no manual override |
| Cold weather (-10°C) | 20% of entries (winter) | Touchscreen unresponsive, battery drains faster | Lock malfunctions, user frustrated |

Consequences:

  • 35% customer return rate within 30 days
  • 12-minute average customer support calls
  • $200/unit in support costs (more than device price)
  • Reputational damage (“Smart lock locked me out in rain”)

How to Fix It: Design for Context Spectrum, Not Single Context

Step 1: Map Full Context Spectrum

Create a context matrix showing all possible usage scenarios:

| Scenario | Hands | Phone | Network | Weather | Urgency | Design Must Support |
|---|---|---|---|---|---|---|
| Happy path | Free | Charged, in pocket | Wi-Fi + BLE | Mild | Normal | Auto-unlock via app (works) |
| Carrying groceries | Full | In pocket | Available | Mild | Moderate | Hands-free unlock (voice, proximity) |
| Phone dead | Free | Dead battery | Unavailable | Mild | High | Physical backup (keypad or key) |
| Guest arrival | Free | No app | Available | Mild | Low | Guest code (no app required) |
| Emergency | Free/full | May not have | May not | Any | Critical | Physical key override |
| Winter | Gloves on | In pocket | Available | -15°C | High | Cold-resistant touchscreen OR voice |

Step 2: Design for Degraded Contexts

Multi-Modal Access (Ranked by Context Reliability):

| Access Method | Works When | Cost | Implementation |
|---|---|---|---|
| 1. Auto-unlock | Phone nearby, BLE active, hands free | Included | Geofence + Bluetooth proximity |
| 2. Voice unlock | Hands full, phone nearby | +$0 (uses phone) | “Hey Siri, unlock front door” |
| 3. Keypad PIN | Phone dead, guest access | +$15 (hardware) | 6-digit PIN entry, weather-sealed buttons |
| 4. NFC card/fob | Phone dead, no voice | +$5 (card) | Tap NFC card to reader |
| 5. Physical key | Total system failure | +$2 (key cylinder) | Traditional deadbolt backup |

All five methods must work to handle the full context spectrum.
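Sketched in Python, selecting among these five methods is a simple priority chain that walks the table's reliability ranking. The function name and the context flags are hypothetical, chosen only to illustrate the degradation order.

```python
def pick_unlock_method(ctx: dict) -> str:
    """Return the first usable access method, in reliability order.

    `ctx` holds booleans describing the current situation; missing keys
    default to the pessimistic value except the keypad, which is assumed
    functional unless stated otherwise.
    """
    if ctx.get("phone_on") and ctx.get("ble_active") and ctx.get("hands_free"):
        return "auto-unlock"
    if ctx.get("phone_on"):
        return "voice unlock"          # hands can stay full
    if ctx.get("keypad_usable", True):
        return "keypad PIN"            # survives a dead phone; works for guests
    if ctx.get("has_nfc_card"):
        return "NFC card"
    return "physical key"              # backstop for total system failure

# e.g. groceries in both hands, phone in pocket:
# pick_unlock_method({"phone_on": True, "hands_free": False}) -> "voice unlock"
```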

Step 3: Test in All Contexts

Standard Usability Test (Fails to Catch Issues):

  • Lab setting, user given phone, told to unlock door
  • Result: 95% success rate (misleading)

Context-Aware Usability Test (Reveals Real Issues):

| Test Condition | Scenario | Pass Rate (Before Fix) | Insight |
|---|---|---|---|
| Baseline | Hands free, phone in pocket | 95% | Works in ideal conditions |
| Carrying bag | Hands full, phone in bag | 20% | Auto-unlock fails when phone buried |
| Dead battery | Phone turned off | 0% | No backup, total lockout |
| Wearing gloves | Winter gloves on | 30% | Touchscreen doesn’t register |
| Guest | No app, no account | 0% | No guest access method |

After Redesign (Multi-Modal + Backups):

| Test Condition | Pass Rate (After Fix) | Method Used |
|---|---|---|
| Hands free | 98% | Auto-unlock |
| Hands full | 92% | Voice “unlock door” |
| Dead phone | 88% | Keypad PIN or NFC card |
| Winter gloves | 95% | Voice OR keypad (large buttons) |
| Guest | 100% | Temporary PIN sent via SMS |

Step 4: Provide Context-Aware Fallbacks

Smart System Behavior:

IF auto-unlock fails for 30 seconds
  THEN prompt user: "Say 'unlock front door' or enter your PIN"

IF user approaching in winter (temp < 0°C)
  THEN increase auto-unlock detection range (5m → 10m)
  AND reduce proximity timeout (10s → 5s)

IF multiple failed unlock attempts
  THEN send alert: "Having trouble? Use backup PIN or call support"

Key Insight: Systems designed for a single “happy path” context fail when context changes—which happens in 60-80% of real-world interactions. Users carry groceries, phones die, weather is extreme, guests visit, emergencies happen. Designing for the full context spectrum requires multiple interaction modes and graceful degradation when primary methods fail. The extra cost ($20 for keypad + NFC) prevents $150 lockouts and $200 support costs.

:::

8.7 Concept Relationships

Understanding people and context is the foundation for all human-centered IoT design:

In 60 Seconds

This chapter covers understanding people and context: the core concepts, practical design decisions, and common pitfalls IoT practitioners must understand in order to build effective, reliable connected systems.

Common Pitfalls

Adding too many features before validating core user needs wastes weeks of effort on a direction that user testing reveals is wrong. IoT projects frequently discover that users want simpler interactions than engineers assumed. Define and test a minimum viable version first, then add complexity only in response to validated user requirements.

Treating security as a phase-2 concern results in architectures (hardcoded credentials, unencrypted channels, no firmware signing) that are expensive to remediate after deployment. Include security requirements in the initial design review, even for prototypes, because prototype patterns become production patterns.

Designing only for the happy path leaves a system that cannot recover gracefully from sensor failures, connectivity outages, or cloud unavailability. Explicitly design and test the behavior for each failure mode, and ensure devices fall back to a safe, locally functional state during outages.

8.8 What’s Next

  • Start Series: User Research Fundamentals – Why user research is essential for IoT success
  • After Series: User Experience Design – Applying research insights to UX design decisions
  • Prerequisites: Design Model for IoT – Understanding design frameworks and methodology