1612  Energy Optimization Worksheets and Assessment

1612.1 Energy Optimization Worksheets and Assessment

This section provides a stable anchor for cross-references to energy optimization assessment across the book.

1612.2 Learning Objectives

By the end of this chapter, you will be able to:

  • Calculate Context-Aware Battery Life: Compute battery life for devices with adaptive power management
  • Apply Mixed Usage Analysis: Calculate average current for devices with multiple operating contexts
  • Evaluate ACE System Savings: Quantify energy savings from cache hits and inference
  • Understand Key Energy Concepts: Master context awareness, adaptive techniques, and decision-making frameworks
  • Pass Comprehensive Assessment: Demonstrate mastery of context-aware energy management concepts

1612.3 Prerequisites

Before diving into this chapter, you should be familiar with:

  • Basic power, energy, and battery-capacity units (µA, mA, mW, mWh, mAh)
  • Power states, duty cycling, and average-current calculations
  • The ACE approach to context sensing (caching, rule-based inference, and sensing planning)

1612.4 Context-Aware Battery Life Worksheet

Scenario: Fitness tracker with adaptive context-aware power management

1612.4.1 Step 1: Identify Power States

State Current Duration Duty Cycle
Deep Sleep 10 µA 55 sec 91.7%
Accelerometer Sampling 0.5 mA 2 sec 3.3%
Heart Rate Sensing 5 mA 2 sec 3.3%
BLE Transmission 15 mA 1 sec 1.7%

Context: User is sleeping (detected by ACE system)

1612.4.2 Step 2: Calculate Average Current (Normal Mode)

I_avg = (I_sleep × t_sleep + I_accel × t_accel + I_hr × t_hr + I_ble × t_ble) / t_total
I_avg = (10µA × 55s + 0.5mA × 2s + 5mA × 2s + 15mA × 1s) / 60s
I_avg = (0.55 mA·s + 1 mA·s + 10 mA·s + 15 mA·s) / 60 s
I_avg = 26.55 mA·s / 60 s ≈ 0.443 mA

1612.4.3 Step 3: Context-Aware Optimization (Sleep Detected)

When ACE detects “sleeping” context, adjust sampling:

  • Accelerometer: Reduce to 1 sample every 5 minutes
  • Heart Rate: Reduce to 1 sample every 5 minutes
  • BLE: Batch and send every 30 minutes

New duty cycle (5-minute cycle):

State Current Duration Duty Cycle
Deep Sleep 10 µA 298 sec 99.3%
Accelerometer Sampling 0.5 mA 0.5 sec 0.17%
Heart Rate Sensing 5 mA 1 sec 0.33%
BLE Transmission 15 mA 0.5 sec 0.17%

I_avg_sleep = (10µA × 298s + 0.5mA × 0.5s + 5mA × 1s + 15mA × 0.5s) / 300s
I_avg_sleep = (2.98 mA·s + 0.25 mA·s + 5 mA·s + 7.5 mA·s) / 300 s
I_avg_sleep = 15.73 mA·s / 300 s ≈ 0.052 mA
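
The duty-cycle arithmetic in Steps 2 and 3 can be checked with a few lines of code. The following is a minimal Python sketch, assuming the two state tables above; the function and variable names are illustrative.

# Minimal sketch: duty-cycled average current (values from the tables above).
# All currents in mA, durations in seconds; names are illustrative.

def avg_current_ma(states):
    """Weighted average current over one cycle: sum(I * t) / total time."""
    total_time = sum(t for _, t in states)
    charge = sum(i * t for i, t in states)   # mA·s accumulated per cycle
    return charge / total_time               # mA

normal = [          # (current_mA, duration_s), 60 s cycle
    (0.010, 55),    # deep sleep, 10 µA
    (0.5,    2),    # accelerometer sampling
    (5.0,    2),    # heart-rate sensing
    (15.0,   1),    # BLE transmission
]

sleeping = [        # 300 s cycle after ACE detects the “sleeping” context
    (0.010, 298),
    (0.5,   0.5),
    (5.0,   1),
    (15.0,  0.5),
]

print(f"Normal mode:   {avg_current_ma(normal):.3f} mA")    # ≈ 0.443 mA (Step 2)
print(f"Sleep context: {avg_current_ma(sleeping):.3f} mA")  # ≈ 0.052 mA (Step 3)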

1612.4.4 Step 4: Battery Life Comparison

With 200mAh battery:

Normal Mode:

Life = 200mAh / 0.443mA = 451 hours = 18.8 days

Context-Aware Sleep Mode:

Life = 200mAh / 0.052mA = 3,846 hours = 160 days = 5.3 months

Energy Savings: 8.5× battery life extension!

1612.4.5 Step 5: Mixed Usage Analysis

Assuming user sleeps 8 hours/day (33% of time):

I_avg_mixed = (I_normal × 16h + I_sleep × 8h) / 24h
I_avg_mixed = (0.443mA × 16 + 0.052mA × 8) / 24
I_avg_mixed = (7.088 + 0.416) / 24 = 0.313 mA

Battery Life: 200mAh / 0.313mA = 639 hours = 26.6 days
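
As a starting point for the “Your Turn” exercise below, here is a minimal Python sketch of the battery-life and mixed-usage arithmetic in Steps 4 and 5; the 200 mAh capacity and the 16 h/8 h usage split are the assumptions stated above, and you can swap in your own currents and split.

# Minimal sketch: battery life from average current, plus a mixed-usage estimate.
# Assumes a 200 mAh battery and the average currents computed in Steps 2-3.

BATTERY_MAH = 200.0
I_NORMAL_MA = 0.443   # normal mode (Step 2)
I_SLEEP_MA = 0.052    # context-aware sleep mode (Step 3)

def battery_life_days(capacity_mah, i_avg_ma):
    """Ideal battery life: capacity / average current, converted to days."""
    return capacity_mah / i_avg_ma / 24.0

# Weighted average current for 16 h normal + 8 h sleep per day.
i_mixed = (I_NORMAL_MA * 16 + I_SLEEP_MA * 8) / 24

print(f"Normal:      {battery_life_days(BATTERY_MAH, I_NORMAL_MA):.1f} days")  # ≈ 18.8 days
print(f"Sleep mode:  {battery_life_days(BATTERY_MAH, I_SLEEP_MA):.1f} days")   # ≈ 160 days
print(f"Mixed usage: {battery_life_days(BATTERY_MAH, i_mixed):.1f} days")      # ≈ 26.7 days (Step 5 rounds to 26.6)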

Your Turn: Calculate for your own context-aware scenario!

1612.5 ACE System Energy Savings Quiz

Question 1: Which code snippet correctly implements context-aware power management for sensor sampling based on battery level?

  • Explanation: Choice B correctly implements adaptive sampling with battery-aware rate adjustment. It reduces sampling rate progressively: 10% of base rate when critical (<15%), 50% when low (<30%), full rate when normal. This balances energy conservation with maintaining context awareness. Choice A ignores battery state entirely. Choice C completely disables sensing below 50%, causing severe functionality loss. Choice D incorrectly tries to adjust sensor power rather than sampling rate. The ACE adaptive sensing pattern uses scale factors based on BatteryLevel enum (CRITICAL: 0.1, LOW: 0.25, MEDIUM: 0.5, HIGH: 0.8, FULL: 1.0) to systematically reduce energy consumption as battery depletes.
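
For reference, here is a minimal Python sketch of the battery-aware sampling pattern this explanation describes. The BatteryLevel scale factors follow the values quoted above, while the percentage thresholds and function names are illustrative assumptions, not the actual ACE API or the original answer choice.

# Illustrative sketch of battery-aware sensor sampling (not the actual ACE API).
from enum import Enum

class BatteryLevel(Enum):
    CRITICAL = 0.1   # scale factors quoted in the explanation above
    LOW = 0.25
    MEDIUM = 0.5
    HIGH = 0.8
    FULL = 1.0

def battery_level(percent):
    """Map a battery percentage to a BatteryLevel (thresholds are assumptions)."""
    if percent < 15:
        return BatteryLevel.CRITICAL
    if percent < 30:
        return BatteryLevel.LOW
    if percent < 60:
        return BatteryLevel.MEDIUM
    if percent < 90:
        return BatteryLevel.HIGH
    return BatteryLevel.FULL

def sampling_rate_hz(base_rate_hz, percent):
    """Scale the base sampling rate down as the battery depletes."""
    return base_rate_hz * battery_level(percent).value

print(sampling_rate_hz(50, 12))   # 5.0 Hz  (CRITICAL: 10% of base rate)
print(sampling_rate_hz(50, 80))   # 40.0 Hz (HIGH)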

Question 2: An IoT device runs a context-aware app that requests 200 context attribute queries per day. Without ACE, each query requires direct sensing at 50 mW for 1 second. With ACE (70% cache/inference hit rate), what is the daily energy savings?

Explanation: Without ACE: 200 queries × 50 mW × 1 s = 10,000 mW·s = 10 J ≈ 2.78 mWh per day. With ACE: a 70% hit rate means only 30% of queries require sensing. Energy = 200 × 0.30 × 50 mW × 1 s = 3,000 mW·s ≈ 0.83 mWh. Savings = 2.78 − 0.83 ≈ 1.94 mWh (roughly 7 J) per day, a 3.3× reduction in sensing energy. On devices where context sensing dominates the power budget, that reduction translates into a nearly proportional battery-life extension, which is why context-aware energy management is critical for battery-powered IoT devices. The 70% hit rate comes from combining cache hits (40%) and inference from rules (30%).
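
This arithmetic can be reproduced with a short Python sketch under the question’s stated assumptions (200 queries/day, 50 mW for 1 s per sensed query, 70% hit rate):

# Daily sensing energy with and without ACE (question's assumptions).
QUERIES_PER_DAY = 200
SENSE_POWER_MW = 50.0     # direct sensing power per query
SENSE_TIME_S = 1.0        # sensing time per query
HIT_RATE = 0.70           # cache (40%) + rule inference (30%)

def daily_energy_mwh(hit_rate):
    """Daily sensing energy in mWh for a given cache/inference hit rate."""
    sensed_queries = QUERIES_PER_DAY * (1.0 - hit_rate)
    mw_seconds = sensed_queries * SENSE_POWER_MW * SENSE_TIME_S
    return mw_seconds / 3600.0

without_ace = daily_energy_mwh(0.0)        # ≈ 2.78 mWh (10 J)
with_ace = daily_energy_mwh(HIT_RATE)      # ≈ 0.83 mWh
print(f"Savings: {without_ace - with_ace:.2f} mWh/day")   # ≈ 1.94 mWh/day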

Question 3: A smart home system uses 6 context attributes. Cache hit rate is 40%, inference from rules provides another 30%, and 30% require sensing. Compared to always sensing, what’s the energy reduction?

Explanation: Combined cache + inference serves 70% of requests without sensing: 40% from cache hits + 30% from rule-based inference. Only 30% require actual sensor activation. If each sensing operation costs E energy, ACE costs 0.3E per request vs 1.0E for always-sensing systems → 70% reduction. This matches ACE research results showing 60-80% energy savings in practice. The key innovation is sharing cached data across apps AND inferring context from correlated attributes using learned rules, not just caching alone.

1612.6 Key Concepts Reference

Context Awareness:

  • User behavior: Activity, location, patterns
  • Environmental: Light, temperature, noise, motion
  • System state: Battery, network, CPU load
  • Application: Use case, user preferences
  • Social: Proximity to others, group activity

Adaptive Techniques:

  • Dynamic voltage/frequency scaling (DVFS)
  • Sensor duty-cycling: Sampling frequency adjustment
  • Network adaptation: Protocol selection, batch size
  • Computation offloading: Local vs. cloud trade-offs
  • Display optimization: Brightness, refresh rate

Decision Making:

  • Context classification: Determine current state
  • Energy prediction: Estimate operation cost
  • Cost-benefit analysis: Utility vs. energy trade-off
  • Optimization: Select best strategy
  • Feedback: Monitor and adapt

Machine Learning in Context:

  • On-device models: Low latency, privacy-preserving
  • Edge processing: Reduce data transmission
  • Transfer learning: Reuse pre-trained models
  • Federated learning: Collaborative model improvement

Implementation Patterns:

  • Event-driven: React to context changes
  • Predictive: Anticipate future needs
  • Reactive: Respond to immediate demands
  • Cooperative: Collaborate with other devices

Key Metrics:

  • Energy efficiency: Task completion per joule
  • User experience: Quality, latency, reliability
  • Response latency: Decision time + execution
  • Adaptation overhead: Energy cost of optimization

1612.7 Comprehensive Assessment

Question 1: The Rule Miner in ACE requires minimum support=5% and confidence=60%. You observe: “Running=True AND Driving=True” in 2 of 100 observations, and 2 of 40 Running=True cases. What happens?

Explanation: Support = (both occur) / (total observations) = 2/100 = 2%, which is below the 5% threshold, so the rule is rejected. Confidence = 2/40 = 5% is also far below the 60% threshold, but support is the first filter. Low support means the pattern is too rare to be reliable for inference. ACE needs patterns that occur frequently enough (support) AND have strong conditional probability (confidence) to safely use for energy-saving inference without excessive false positives.
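
A small Python sketch of this two-stage filter, using the counts from the question (the constant names are illustrative):

# Association-rule check for "Running=True AND Driving=True" (counts from the question).
TOTAL_OBSERVATIONS = 100
BOTH_TRUE = 2            # Running=True AND Driving=True observed together
ANTECEDENT_TRUE = 40     # Running=True observed

MIN_SUPPORT = 0.05
MIN_CONFIDENCE = 0.60

support = BOTH_TRUE / TOTAL_OBSERVATIONS    # 0.02
confidence = BOTH_TRUE / ANTECEDENT_TRUE    # 0.05

accepted = support >= MIN_SUPPORT and confidence >= MIN_CONFIDENCE
print(f"support={support:.0%}, confidence={confidence:.0%}, accepted={accepted}")
# support=2%, confidence=5%, accepted=False -> rule rejected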

Question 2: What is the main advantage of body-biasing in fully-depleted SOI transistors for IoT processors?

Explanation: Body-biasing applies voltage to the transistor body to dynamically adjust threshold voltage (Vt). Higher Vt reduces leakage exponentially (ideal for sleep modes) but slows switching. Lower Vt increases performance but also increases leakage. FD-SOI enables efficient body-biasing because the thin insulator layer provides strong electric field control. IoT chips can use high-Vt in sleep (nanoamp leakage) and low-Vt when active (fast computation). This adaptive technique is crucial for achieving both multi-year battery life AND sufficient performance for IoT workloads.

Question 3: A wearable device needs to decide between local ML inference (DSP: 20 mW for 100ms) and cloud inference (Wi-Fi TX: 200 mW for 50ms + RX: 150 mW for 20ms). The device also has context: “Battery=15%, NetworkQuality=Good, UserActivity=Meeting”. What should a context-aware energy manager do?

Explanation: Local DSP energy: 20 mW × 0.1s = 2 mJ. Cloud energy: (200 × 0.05) + (150 × 0.02) = 10 + 3 = 13 mJ. Context-aware decision: Battery=15% is LOW, so energy conservation is the critical priority. Even though cloud is faster (70ms vs 100ms) and network is good, the 6.5× energy difference favors local processing. UserActivity=Meeting suggests low latency is valuable, but the 30ms difference is acceptable. If the battery were >50%, the manager might offload for a faster response. This demonstrates multi-objective optimization: weigh energy, latency, and context to make situation-appropriate decisions.
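
The energy comparison can be sketched in a few lines of Python; the low-battery threshold and the latency/energy trade-off rule below are illustrative assumptions, not a published policy:

# Local vs. cloud inference energy in mJ = power (mW) x time (s), from the question.
local_energy_mj = 20 * 0.100                    # DSP: 20 mW for 100 ms -> 2 mJ
cloud_energy_mj = 200 * 0.050 + 150 * 0.020     # Wi-Fi TX + RX -> 13 mJ

battery_percent = 15

# Illustrative policy: below a low-battery threshold, always take the
# lower-energy option; otherwise accept a bounded energy penalty for latency.
if battery_percent < 30:
    choice = "local" if local_energy_mj <= cloud_energy_mj else "cloud"
else:
    choice = "cloud" if cloud_energy_mj <= 2 * local_energy_mj else "local"

print(choice, local_energy_mj, cloud_energy_mj)   # local 2.0 13.0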

Question 4: Your ACE system shows: “Driving” sensor costs 50 mW, “AtHome” costs 100 mW. Rule: “Driving=True → AtHome=False” (confidence 92%). App requests “AtHome” status. Cache shows “Driving=True” from 90 seconds ago (cache duration=300s). What’s the best sensing plan?

Explanation: Best plan: Use cached “Driving=True” (still valid at 90s < 300s) and infer “AtHome=False” using the high-confidence rule (92%) with ZERO energy cost. This is optimal because: (1) cached value is fresh, (2) rule confidence is high (>60% threshold), (3) saves 100 mW by avoiding GPS sensing. The 8% error risk is acceptable for most applications. Alternative plans cost more: Direct sensing (100 mW), re-sense Driving (50 mW unnecessary), or verify (100 mW redundant). ACE’s Sensing Planner evaluates all options and selects minimum energy path.
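
A minimal Python sketch of the plan comparison this explanation describes; the costs restate the question’s numbers, and the plan structure is illustrative rather than ACE’s actual Sensing Planner:

# Candidate plans for answering an "AtHome" query (costs in mW from the question).
CACHE_DURATION_S = 300
cache_age_s = 90                 # "Driving=True" cached 90 s ago
RULE_CONFIDENCE = 0.92           # Driving=True -> AtHome=False
MIN_CONFIDENCE = 0.60

plans = []
if cache_age_s < CACHE_DURATION_S and RULE_CONFIDENCE >= MIN_CONFIDENCE:
    # Cached antecedent is fresh and the rule is trusted: infer at zero cost.
    plans.append(("infer AtHome=False from cached Driving=True", 0))
plans.append(("re-sense Driving, then infer", 50))
plans.append(("sense AtHome directly", 100))

best = min(plans, key=lambda plan: plan[1])
print(best)   # ('infer AtHome=False from cached Driving=True', 0)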

Question 5: Why can’t we reduce transistor leakage current to zero by maximizing threshold voltage?

Explanation: Threshold voltage (Vt) controls the gate voltage needed to turn transistors ON. High Vt: electrons need more energy to flow → low leakage (good for sleep) but slow switching speed (bad for computation). Low Vt: fast switching (good for performance) but high leakage (bad for battery). There’s a fundamental trade-off: subthreshold leakage falls exponentially with Vt (roughly ∝ e^(−Vt/(n·V_T))), while drive strength falls as (Vdd − Vt)². IoT chips use multiple-Vt domains: high-Vt for always-on logic (RTC), low-Vt for performance-critical blocks (CPU), with body-biasing to dynamically adapt. Zero leakage would mean infinitely slow circuits!
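
Using the simplified proportionalities above with illustrative constants (the supply voltage and subthreshold slope below are assumptions, not values from a specific process), a quick Python sketch makes the trade-off concrete:

# Illustrative only: relative leakage and drive strength vs. threshold voltage,
# using the simplified models from the explanation above.
import math

VDD = 0.8      # assumed supply voltage (V)
N_VT = 0.04    # assumed n * kT/q, subthreshold slope factor ~40 mV

def rel_leakage(vt):
    """Subthreshold leakage model: I_leak proportional to exp(-Vt / (n*V_T))."""
    return math.exp(-vt / N_VT)

def rel_drive(vt):
    """Drive strength model: proportional to (Vdd - Vt)^2."""
    return (VDD - vt) ** 2

for vt in (0.3, 0.5):
    print(f"Vt={vt} V: leakage x{rel_leakage(vt) / rel_leakage(0.3):.3f}, "
          f"drive x{rel_drive(vt) / rel_drive(0.3):.2f}")
# Raising Vt from 0.3 V to 0.5 V cuts leakage ~150x but drive strength to ~0.36x.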

Question 6: Which context triggers can effectively enable adaptive power management in IoT devices? (Select all that apply)

Explanation: Effective context triggers for adaptive power management include: Motion (A) enables activity-based adaptation like reducing sampling when stationary. Time-based patterns (B) allow scheduled low-power modes during predictable idle periods. Location changes (D) enable place-aware policies like high-performance at work, power-saving at home. Color preferences (C) don’t provide actionable context for power management. Context-aware systems use physical and temporal triggers that correlate with actual device usage patterns and user needs to optimize energy consumption while maintaining service quality.

1612.8 Short Answer Quiz

Test your knowledge with these quick review questions:

1. What is the main problem with continuous context sensing?
   A. Low accuracy
   B. High battery drain
   C. Slow processing
   D. Limited storage
2. In shared context sensing, what happens when App2 requests a context that was recently sensed by App1?
   A. App2 must sense again
   B. Cached value is returned without sensing
   C. Both apps sense simultaneously
   D. Request is denied
3. In association rule mining, what does “confidence” measure?
   A. Frequency of pattern occurrence
   B. P(consequent | antecedent)
   C. Total number of rules
   D. Cache hit rate
4. What is the role of the Sensing Planner in ACE?
   A. Store sensor data
   B. Find cheapest sequence to infer target attribute
   C. Calibrate sensors
   D. Compress data
5. If a rule has support=8% and confidence=40%, what does this mean?
   A. Rule appears in 8% of observations, correct 40% of time when antecedent is true
   B. Rule is 8% accurate overall
   C. 40% of data supports the rule
   D. Rule has 48% total accuracy
6. In MAUI, when would code offloading to cloud be avoided?
   A. When Wi-Fi is available
   B. When 3G network transmission costs exceed local execution costs
   C. When latency is not critical
   D. When remote server is available
7. What is the advantage of using GPU vs CPU for keyword spotting on mobile?
   A. Lower cost hardware
   B. 21× faster and more energy efficient for parallel tasks
   C. Better network connectivity
   D. Simpler programming
8. How much energy savings can the ACE system typically achieve?
   A. 10-20%
   B. 30-40%
   C. 60-80%
   D. 90-95%
9. What happens in “cross-app context correlation”?
   A. Apps share code
   B. Apps infer one attribute from others learned by different apps
   C. Apps run simultaneously
   D. Apps use the same sensors
10. Which mobile SoC component is best for audio signal processing?
    A. CPU
    B. GPU
    C. DSP (Digital Signal Processor)
    D. Memory controller

Answers: 1-B, 2-B, 3-B, 4-B, 5-A, 6-B, 7-B, 8-C, 9-B, 10-C

1612.9 Chapter Summary

Context-aware energy management enables IoT devices to dynamically adapt operation based on real-time understanding of user, environment, and system state. Rather than static power budgets, context-aware systems optimize for each specific situation, achieving energy savings of 60-80% or more while maintaining user experience.

Techniques include dynamic voltage/frequency scaling, adaptive sensor sampling, intelligent network usage, and computation offloading. Machine learning enables sophisticated context understanding and prediction, from keyword spotting on device to predictive application behavior. The key is implementing adaptation efficiently so the optimization cost doesn’t negate energy savings.

Key takeaways from this module:

1. Duty Cycling: Periodic wake/sleep cycles reduce average power; effectiveness depends on achieving low sleep current
2. ACE System: Combines caching, inference, and rule mining to avoid 70% of sensor operations
3. Code Offloading: Wi-Fi offloading often saves energy; cellular offloading often wastes energy due to tail power
4. Heterogeneous Computing: Match tasks to appropriate processors for maximum energy efficiency
5. Context-Aware Adaptation: Battery level, network type, and user activity all influence optimal strategy

1612.10 What’s Next

The next section covers Hardware and Software Optimisation, which addresses how to systematically improve system performance, power efficiency, and resource utilization through hardware-software co-design. Optimization requires understanding interactions between hardware capabilities and software implementation strategies.