Development Kit (DevKit): Pre-assembled board combining microcontroller, sensors, and connectivity for rapid prototyping without custom PCB design.
Shield: Plug-in expansion board adding capabilities (Wi-Fi, motor driver, sensor) to a base development board via standard pin headers.
STEMMA QT / Qwiic: Standardised 4-pin I2C connector enabling daisy-chaining of compatible sensors without soldering.
GPIO Mapping: Documentation of which microcontroller pins connect to which peripheral functions on a development board.
Datasheet: Manufacturer’s technical document specifying a component’s electrical characteristics, timing diagrams, and application circuits.
Breadboard Prototype: Solderless first-stage prototype allowing rapid component layout changes before committing to PCB design.
Reference Design: Manufacturer-provided schematic and PCB layout demonstrating how to correctly integrate a chip or module.
29.3 Introduction
Computer vision, wireless communication, and energy harvesting represent advanced IoT capabilities that enable sophisticated applications. AI kits bring machine learning to the edge for real-time object detection and image classification. Wireless kits enable mesh networks and long-range sensor deployments. Energy harvesting kits eliminate battery replacement for maintenance-free operation. This chapter explores leading platforms in these domains.
For Beginners: AI, Wireless, and Energy Kits
Computer Vision/AI Kits add “eyes” to your IoT devices. They typically include:

- Camera sensors
- GPU or TPU acceleration
- Pre-trained ML models
- Python/TensorFlow support

Wireless Communication Kits connect devices across distances. They typically include:

- Radio modules (XBee, LoRa, cellular)
- Mesh networking capability
- Configuration software
- Antennas

Energy Harvesting Kits power devices without batteries. They typically include:

- Solar panels
- Vibration harvesters
- Power management circuits
- Energy storage (supercapacitors)
Example: Jetson Nano detects objects in video at 30 fps. XBee creates a mesh network across a warehouse. EnOcean powers a switch from button press energy alone.
Sensor Squad: Eyes, Wings, and Sunshine
“Three superpowers in one chapter!” cheered Lila the LED. “AI kits give devices eyes to see, wireless kits give them wings to communicate across distances, and energy harvesting kits let them run on sunshine and vibrations!”
Sammy the Sensor was fascinated by the AI cameras. “The Jetson Nano can recognize objects at 30 frames per second? That is like giving a security camera a brain that knows the difference between a person and a cat!” Max added, “And Google Coral does it with even less power using a special TPU chip. Perfect for edge AI where you need instant decisions without sending data to the cloud.”
Bella the Battery was most excited about energy harvesting. “Imagine never needing to be replaced! EnOcean switches harvest energy just from pressing a button. Solar panels charge supercapacitors during the day. Some kits even harvest energy from vibrations in factory machines. For me, that is the dream – infinite power from the environment!”
“And wireless kits like XBee and LoRa create mesh networks,” said Max. “XBee covers a building, LoRa covers a whole farm. They let sensors talk to each other across huge distances without any existing infrastructure.”
29.4 Computer Vision and AI Kits
29.4.1 OpenMV Cam H7 Plus
The OpenMV Cam H7 Plus is a self-contained machine vision module that puts camera sensing and AI processing on a single board the size of a postage stamp. Built around the STM32H743 microcontroller with an OV5640 camera sensor, it runs MicroPython natively and supports TensorFlow Lite models – all for around $75.
The development workflow is refreshingly simple: the OpenMV IDE provides a live camera preview, code editor, and one-click deployment. Pre-trained models for common tasks (face detection, colour tracking, AprilTag recognition) work out of the box, while custom TensorFlow Lite models can be trained externally and deployed to the device. An onboard IMU sensor adds motion awareness for robotics applications.
This makes the OpenMV ideal for visual inspection on production lines, barcode and QR code reading at warehouse stations, gesture recognition interfaces, and adding basic vision to hobby robots. Its standalone operation (no PC or network required) and fast prototyping cycle (edit code, see results immediately on the live preview) lower the barrier to computer vision projects dramatically.
The limitations reflect its microcontroller heritage: image resolution is constrained, AI models must be small enough for the available RAM, and without GPU acceleration, complex neural networks run at less than 1 frame per second. For simple vision tasks, however, the OpenMV’s price-to-capability ratio is hard to beat.
29.4.2 NVIDIA Jetson Nano Developer Kit
The NVIDIA Jetson Nano brings GPU-accelerated deep learning to the edge in a credit-card-sized form factor. Its quad-core ARM CPU paired with a 128-core NVIDIA Maxwell GPU and 4 GB RAM can run object detection models like YOLOv5 at 30+ frames per second – performance that was reserved for desktop workstations just a few years ago.
The development environment runs full Ubuntu Linux with CUDA/cuDNN acceleration, supporting TensorFlow, PyTorch, and NVIDIA’s specialised DeepStream SDK for multi-stream video analytics. The JetPack SDK bundles drivers, libraries, and sample applications into a single installer. A MIPI CSI camera connector supports industry-standard camera modules, while a GPIO header enables integration with sensors and actuators.
This makes the Jetson Nano the platform of choice for AI-powered security cameras, autonomous robot navigation, industrial quality inspection (detecting defects at production-line speed), and smart city analytics (vehicle counting, pedestrian tracking, traffic flow analysis). Critically, NVIDIA’s product line (Nano, Xavier NX, Orin) shares the same software stack, so prototypes built on the Nano can scale to production on more powerful hardware without rewriting code.
The power draw of 10W (20x the OpenMV) makes battery operation challenging, the $100–200 price point puts it above educational budgets, and the Linux-based setup requires comfort with command-line configuration that simpler platforms avoid.
29.4.3 Google Coral Dev Board
The Google Coral Dev Board takes a different approach to edge AI than the Jetson Nano: instead of a general-purpose GPU, it uses a dedicated Edge TPU (Tensor Processing Unit) coprocessor optimised specifically for neural network inference. Paired with an NXP i.MX 8M SoC, 1 GB RAM, Wi-Fi/Bluetooth, and a MIPI CSI camera connector, it delivers inference performance comparable to the Jetson Nano at 2–4W rather than 10W.
Development uses Mendel Linux with TensorFlow Lite and a Python API. Google provides a library of pre-compiled models (object detection, image classification, pose estimation, speech recognition) that run immediately on the Edge TPU, and custom models trained in TensorFlow can be compiled for the TPU using Google’s toolchain.
The Coral excels where power efficiency meets performance requirements: battery-powered AI cameras, always-on speech recognition devices, and edge inference nodes that must process data locally without cloud connectivity. Google’s ML ecosystem (Colab for training, TFLite for deployment, Coral for inference) provides an end-to-end workflow that simplifies the train-deploy-infer cycle.
The specialisation that gives Coral its efficiency also limits flexibility. Only TensorFlow Lite models run on the Edge TPU (no PyTorch, no custom CUDA kernels). The $150 price point and narrow use case (inference only, not training) mean it suits projects with clearly defined ML workloads rather than open-ended AI experimentation.
29.4.4 Computer Vision Kit Comparison
| Feature | OpenMV Cam H7 | Jetson Nano | Coral Dev Board |
|---|---|---|---|
| Processor | STM32H7 | ARM + GPU | i.MX 8M + TPU |
| AI Acceleration | None | 128-core GPU | Edge TPU |
| Price | ~$75 | $100–200 | ~$150 |
| Power | 0.5 W | 10 W | 2–4 W |
| Framework | TFLite Micro | TF, PyTorch | TFLite |
| FPS (YOLO) | <1 | 30+ | 30+ |
| Best For | Simple vision | Complex AI | Fast inference |
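As a sanity check when comparing platforms, the table's throughput and power figures can be folded into a rough frames-per-watt metric. The script below uses the table's numbers, approximating the OpenMV's "<1 fps" as 1 fps and taking 3 W as a representative point in the Coral's 2–4 W range:

```python
# Rough fps-per-watt efficiency from the comparison table above.
# OpenMV's "<1 fps" is rounded up to 1 fps; Coral uses ~3 W (within 2-4 W).
platforms = {
    "OpenMV Cam H7": {"fps": 1, "watts": 0.5},
    "Jetson Nano": {"fps": 30, "watts": 10.0},
    "Coral Dev Board": {"fps": 30, "watts": 3.0},
}

efficiency = {name: p["fps"] / p["watts"] for name, p in platforms.items()}
for name, fpw in sorted(efficiency.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {fpw:.1f} fps/W")
```

On this crude measure the Coral's Edge TPU comes out well ahead on efficiency even though the Jetson Nano matches its raw throughput, which is exactly the trade-off the text describes.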
29.5 Knowledge Check
Quiz 1: Computer Vision Kits
29.6 Wireless Communication Kits
29.6.1 Digi XBee3 Development Kit
The Digi XBee3 Development Kit provides a comprehensive platform for building mesh wireless networks. The kit includes multiple XBee3 modules (supporting Zigbee, BLE, 802.15.4, and cellular variants), USB interface boards for PC configuration, breadboard adapters for prototyping, and antennas optimised for each frequency band.
What distinguishes XBee from simpler radio modules is its built-in mesh networking stack. Nodes automatically discover neighbours, establish routes, and heal the network when nodes fail – capabilities that would require months of custom firmware development on raw radio modules. The XCTU configuration tool provides a graphical interface for network setup and a real-time network analyser that visualises topology and signal strength. For programmatic control, XBee3 supports MicroPython directly on the module and an API mode for integration with external microcontrollers.
XBee modules are the industry standard for sensor networks, remote monitoring, and multi-hop communication where self-healing mesh topology is essential. Their proven reliability (deployed in millions of commercial products) and excellent range (up to 1.5 km line-of-sight with high-gain antennas) make them a safe choice for professional applications.
The trade-offs are higher per-module cost than commodity radio chips, API complexity inherited from legacy protocol versions, and tight coupling to Digi’s proprietary ecosystem (mixing XBee with non-XBee mesh nodes is difficult).
29.6.2 LoRa Development Kit
LoRa (Long Range) development kits enable wireless communication across kilometres rather than metres, opening up applications that short-range protocols like Bluetooth and Zigbee cannot reach. A typical kit includes LoRa transceiver modules (based on the Semtech SX1276 or SX1262 chipset), Arduino-compatible development boards, antennas, and optionally a gateway for LoRaWAN network connectivity.
Development uses familiar Arduino libraries for device-side programming, with The Things Network providing a free community infrastructure for LoRaWAN connectivity. Devices can also communicate point-to-point using raw LoRa without a gateway, or use AT commands for simple serial integration with any microcontroller.
LoRa kits suit long-range sensor deployments (soil moisture across a farm, water level in remote reservoirs), smart city infrastructure (parking sensors, waste bin monitoring), and remote telemetry where cellular coverage is unavailable or subscription costs are prohibitive. The combination of kilometre-range communication with microamp-level sleep current enables multi-year battery operation – a 2,000 mAh coin cell can power a LoRa sensor transmitting hourly for 5+ years.
The bandwidth trade-off is significant: LoRa’s maximum data rate of around 50 kbps (and typically 0.3–11 kbps in LoRaWAN configurations) limits it to small sensor payloads. A gateway is required for LoRaWAN network access, and European regulatory duty cycle limits (1% transmit time) constrain how frequently devices can send.
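The interaction between airtime and the 1% duty-cycle limit can be made concrete with the standard Semtech time-on-air formula (published in Semtech's LoRa design guides). The SF7/125 kHz settings and 10-byte payload below are illustrative choices, not values from the text:

```python
import math

def lora_airtime_s(payload_bytes: int, sf: int = 7, bw_hz: int = 125_000,
                   cr: int = 1, preamble_syms: int = 8,
                   explicit_header: bool = True, ldro: bool = False) -> float:
    """Time-on-air for one LoRa frame (Semtech SX127x formula).

    sf: spreading factor 7-12; cr: coding-rate index 1-4 (4/5 .. 4/8).
    """
    t_sym = (2 ** sf) / bw_hz                    # symbol duration in seconds
    h = 0 if explicit_header else 1
    de = 1 if ldro else 0
    payload_syms = 8 + max(
        math.ceil((8 * payload_bytes - 4 * sf + 28 + 16 - 20 * h)
                  / (4 * (sf - 2 * de))) * (cr + 4), 0)
    return (preamble_syms + 4.25 + payload_syms) * t_sym

# A 10-byte sensor reading at SF7 / 125 kHz takes ~41 ms on air.
airtime = lora_airtime_s(10)

# EU duty-cycle limit: 1% transmit time, i.e. 36 s of airtime per hour.
max_msgs_per_hour = int(3600 * 0.01 / airtime)
print(f"airtime {airtime * 1000:.1f} ms, up to {max_msgs_per_hour} msgs/hour")
```

At higher spreading factors (longer range) the airtime grows exponentially, so the same duty-cycle budget permits far fewer transmissions, which is why LoRa deployments send small payloads infrequently.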
29.6.3 Wireless Kit Comparison
| Feature | XBee3 | LoRa |
|---|---|---|
| Range | 100 m–1.5 km | 2–15 km |
| Data Rate | Up to 250 kbps | Up to 50 kbps |
| Topology | Mesh | Star (LoRaWAN) |
| Power | Low | Very low |
| Price (module) | $20–40 | $10–25 |
| Protocol | Zigbee, 802.15.4 | LoRa/LoRaWAN |
| Best For | Mesh networks | Long-range, low-power |
Figure 29.1: Decision flowchart for selecting wireless communication kits based on range, topology, and power requirements.
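The flowchart's branching can be sketched as a small selection function. This is a simplified stand-in for the figure, with thresholds drawn from the comparison table rather than from the figure itself:

```python
def select_wireless_kit(range_m: float, needs_mesh: bool,
                        data_rate_kbps: float) -> str:
    """Simplified wireless-kit decision logic (illustrative thresholds).

    XBee3 reaches ~1.5 km with up to 250 kbps and mesh support;
    LoRa reaches 2-15 km but only ~50 kbps in a star topology.
    """
    if range_m > 1500:
        return "LoRa"        # only LoRa covers multi-km links
    if needs_mesh:
        return "XBee3"       # self-healing mesh is XBee's strength
    if data_rate_kbps > 50:
        return "XBee3"       # LoRa tops out around 50 kbps
    return "LoRa"            # short-range, low-rate: lowest power wins

# Example: farm-wide soil sensors, 5 km links, tiny infrequent payloads.
print(select_wireless_kit(5000, needs_mesh=False, data_rate_kbps=0.3))
```

Real selection also weighs price, regulatory duty cycles, and ecosystem lock-in, but the range/topology/data-rate split above captures the first-order decision.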
29.7 Energy Harvesting Kits
Power Budgets

Before exploring energy harvesting, it is worth working through a power budget: how much energy a device consumes on average versus how much its environment can supply.
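The core of any power budget is the duty-cycled average current. A minimal sketch follows; the sensor profile (2 µA sleep, 40 mA radio burst) and the CR2477 coin cell's nominal 1,000 mAh capacity are illustrative assumptions:

```python
def average_current_ua(sleep_ua: float, active_ma: float,
                       active_s: float, period_s: float) -> float:
    """Duty-cycled average current in microamps."""
    active_ua = active_ma * 1000.0
    return (sleep_ua * (period_s - active_s) + active_ua * active_s) / period_s

def battery_life_years(capacity_mah: float, avg_ua: float) -> float:
    """Runtime from capacity, ignoring self-discharge and derating."""
    hours = capacity_mah * 1000.0 / avg_ua
    return hours / (24 * 365)

# Hypothetical sensor: 2 uA sleep, 40 mA radio burst for 1 s every 10 min.
avg = average_current_ua(sleep_ua=2, active_ma=40, active_s=1, period_s=600)
print(f"average draw: {avg:.1f} uA")   # ~68.7 uA
print(f"CR2477 (1000 mAh): {battery_life_years(1000, avg):.1f} years")
```

Notice that the 1-second radio burst dominates the budget: the device sleeps 99.8% of the time yet the transmit current sets the average. Shortening or batching transmissions is usually the highest-leverage optimisation, and energy harvesting becomes viable once the average draw falls into the tens of microamps.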
29.7.1 EnOcean Development Kit
The EnOcean Development Kit represents a fundamentally different approach to IoT power: instead of batteries, devices harvest energy from their environment. The kit includes solar-powered sensor modules, kinetic energy modules (powered by the mechanical energy of pressing a button), temperature sensors, and wireless switches – all communicating via the EnOcean wireless protocol.
The technology is production-proven, not experimental. EnOcean switches are deployed in millions of buildings worldwide, harvesting just enough energy from a button press (50–100 microjoules) to transmit a wireless signal. Solar-powered temperature sensors use indoor lighting to charge a supercapacitor, then transmit readings periodically without any battery at all. Development uses the EnOcean protocol stack with API libraries and gateway integration for connection to building management systems.
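To see why 50–100 microjoules per press is enough, compare it against the energy cost of a short sub-GHz radio burst. The transmit power and burst duration below are order-of-magnitude illustrations, not EnOcean specifications:

```python
# Order-of-magnitude energy cost of one short radio burst.
tx_power_mw = 10.0   # illustrative total transmit-path power
burst_ms = 1.0       # short telegrams are on the order of a millisecond

energy_per_burst_uj = tx_power_mw * burst_ms   # mW x ms = microjoules
harvested_uj = 50.0                            # low end of a button press

bursts = harvested_uj / energy_per_burst_uj
print(f"{energy_per_burst_uj:.0f} uJ per burst allows "
      f"{bursts:.0f} transmissions per press")
```

Even at the low end of the harvested range, a single press funds several redundant transmissions, which is how battery-free switches achieve reliable delivery.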
This makes EnOcean the go-to platform for battery-free building automation sensors, self-powered light switches (no wiring needed – just stick on the wall), and maintenance-free deployments where replacing batteries in thousands of sensors would be impractical. The industrial-quality hardware and ISO/IEC 14543-3-10 standardisation provide the reliability guarantees that commercial building projects require.
The drawbacks are cost (modules are significantly more expensive than battery-powered alternatives), limited processing capability (energy harvesting provides microwatts, constraining computation), and a proprietary ecosystem that does not interoperate with Zigbee, Thread, or other mesh protocols.
29.7.2 SparkFun Energy Harvesting Kit
Where EnOcean provides a polished, production-ready energy harvesting solution, the SparkFun Energy Harvesting Kit takes the opposite approach: it provides raw components for experimentation and learning. The kit includes solar panels in various sizes, a vibration harvester (piezoelectric element), a thermoelectric generator (Peltier module used in reverse), energy storage circuits (supercapacitors and charge controllers), and buck/boost converters for voltage regulation.
Everything is Arduino-compatible and open source, with energy measurement tools that let students quantify exactly how much power each source produces under different conditions. You can measure that a small indoor solar panel produces 50 microwatts under office lighting, that a vibration harvester on a running motor generates 200 microwatts, and that a thermoelectric generator across a 10 °C temperature differential delivers 10 milliwatts – turning abstract datasheet numbers into concrete, measurable experience.
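Those measured figures translate directly into charge times for an energy store. A sketch using a small supercapacitor follows; the 0.47 F capacitor value and the assumption of lossless charging are illustrative (real charge circuits lose 20–30%):

```python
# Time to charge a supercapacitor from each harvesting source,
# assuming ideal (lossless) charging.
cap_f = 0.47        # assumed supercapacitor value
v_target = 3.3      # typical storage voltage

energy_j = 0.5 * cap_f * v_target ** 2   # E = 1/2 C V^2, ~2.56 J

sources_w = {                 # power levels measured with the kit (see text)
    "indoor solar": 50e-6,
    "vibration (motor)": 200e-6,
    "thermal (dT = 10 C)": 10e-3,
}
for name, watts in sources_w.items():
    hours = energy_j / watts / 3600
    print(f"{name}: {hours:.1f} h to full charge")
```

The spread is striking: the thermoelectric source fills the capacitor in minutes, while indoor solar takes the better part of a day, which is why harvesting designs size their storage and duty cycle around the weakest expected source.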
The kit excels as a research and educational platform: energy harvesting courses, self-powered sensor prototypes, and remote monitoring feasibility studies where the question is “can this environment provide enough energy?” rather than “deploy 1,000 units next quarter.”
The experimental nature means power output is inconsistent (dependent on environmental conditions that change hourly), integration requires electronics expertise (matching source impedance, managing charge cycles, handling intermittent power), and the resulting systems are not production-ready without significant engineering effort.
29.7.3 Energy Harvesting Comparison
| Feature | EnOcean | SparkFun Kit |
|---|---|---|
| Energy Sources | Solar, kinetic | Solar, vibration, thermal |
| Maturity | Production-ready | Experimental |
| Price | $100–300 | $50–100 |
| Protocol | EnOcean (proprietary) | Open (Arduino) |
| Power Output | µW to mW | µW to mW |
| Best For | Production deployment | Research/learning |
29.7.4 Energy Harvesting Sources
| Source | Power Output | Best Application |
|---|---|---|
| Indoor Solar | 10–100 µW/cm² | Building sensors |
| Outdoor Solar | 1–10 mW/cm² | Outdoor sensors |
| Vibration | 10–1000 µW | Industrial monitoring |
| Thermal (ΔT = 10 °C) | 10–100 mW | Machine monitoring |
| RF Harvesting | 1–100 µW | Passive RFID, NFC |
| Kinetic (button) | 50–100 µJ/press | Switches, controls |
Putting Numbers to It
Solar panel sizing for a sensor node drawing 100 µA average requires accounting for panel efficiency and winter insolation. A 50 mm × 50 mm (25 cm²) panel at 15% efficiency under 800 W/m² outdoor light delivers roughly 800 W/m² × 0.0025 m² × 0.15 ≈ 300 mW at peak.

Even after derating for winter light levels, the panel provides roughly 150× margin over the load, sufficient for cloudy periods and charge/discharge losses (round-trip efficiency typically 70–80%). Smaller 10 cm² panels ($1–2) can sustain µW-level loads indefinitely.
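The sizing above can be reproduced numerically. In the sketch below, the 3.3 V supply voltage and the winter/cloud derating factor are illustrative assumptions, not values from the text:

```python
# Peak harvest from a 50 mm x 50 mm panel outdoors.
area_m2 = 0.05 * 0.05            # 25 cm^2
irradiance_w_m2 = 800.0
efficiency = 0.15
harvest_mw = irradiance_w_m2 * area_m2 * efficiency * 1000   # 300 mW

# Load: 100 uA average at an assumed 3.3 V supply.
load_mw = 100e-6 * 3.3 * 1000    # 0.33 mW

peak_margin = harvest_mw / load_mw          # ~900x at peak sun
winter_margin = peak_margin * 0.17          # assumed winter/cloud derating
print(f"peak margin ~{peak_margin:.0f}x, winter margin ~{winter_margin:.0f}x")
```

The raw peak margin is closer to 900×; it is the winter derating (here assumed to cut available light to roughly a sixth of peak) that brings the usable margin down to the ~150× quoted above.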
29.8 Kit Selection Guide

29.8.1 Computer Vision Selection

The three computer vision platforms serve distinct niches along a complexity-cost spectrum. OpenMV is the right choice when your vision task is straightforward (colour tracking, QR code reading, simple object counting), you need standalone operation without a PC, your budget is around $75, and you prefer MicroPython’s simplicity. Jetson Nano becomes necessary when the application demands complex AI models (YOLOv5, ResNet, custom CNNs), real-time performance at 30+ fps, the full Linux ecosystem for development flexibility, or multi-camera setups. Coral occupies the middle ground: choose it when you need Jetson-class inference speed but at 2–4W instead of 10W, your workflow is already built around TensorFlow Lite, or Google Cloud integration is part of your architecture.
29.8.2 Wireless Selection
The wireless kit choice depends primarily on range and topology. XBee is the right platform when your deployment requires self-healing mesh networking (warehouse monitoring, building-wide sensor networks), you need multi-protocol flexibility (Zigbee, BLE, 802.15.4 on the same module family), or professional reliability is non-negotiable. LoRa becomes the clear winner when distances exceed 1 km, ultra-low power consumption is critical for multi-year battery life, data payloads are small and infrequent (a few bytes every few minutes), or the deployment is in a rural or agricultural setting where no cellular infrastructure exists.
29.8.3 Energy Harvesting Selection
Energy harvesting kit selection splits cleanly between deployment stage and learning stage. EnOcean suits production deployments where proven technology, industrial quality, building automation integration, and standards compliance matter more than cost. SparkFun suits research, experimentation, and education where the goal is understanding energy harvesting principles across multiple source types (solar, vibration, thermal) rather than deploying a finished product.
29.9 Advanced Quiz Questions
Quiz 3: Advanced Selection Scenarios
29.10 Worked Example: From Jetson Nano Prototype to Production at Verkada
Verkada, a building security company, used the prototyping-to-production pipeline to develop their AI-powered security camera line. Their journey illustrates the cost, timeline, and engineering decisions involved in transitioning from a development kit to a shipping product.
Phase 1: Proof of Concept (Jetson Nano, 4 weeks, $500)
The engineering team used 3 Jetson Nano kits ($150 each) with Raspberry Pi Camera Module V2 ($25) to prototype real-time person detection. Using NVIDIA’s pre-trained PeopleNet model on the Jetson Nano, they achieved 28 fps at 1080p with 92% detection accuracy. Total hardware cost: $525. Development time: 4 weeks (1 engineer).
Phase 2: Prototype Refinement (Xavier NX, 8 weeks)

The Nano’s 10W power draw and passive cooling were inadequate for an enclosed outdoor camera (thermal throttling above 40C ambient). The team moved to the Xavier NX ($399), which provided the same inference performance at 15W but with hardware-accelerated video encode (needed for recording) and support for 2 simultaneous camera streams. They added LoRaWAN connectivity via a Semtech SX1262 module ($12) for low-bandwidth alert notifications when Wi-Fi was unavailable.
Phase 3: Custom Hardware (Ambarella CV25, 12 weeks)

The transition from NVIDIA’s Jetson platform to Ambarella’s CV25 was the most significant engineering decision. The Ambarella chip provided equivalent neural network inference performance for person detection at $18 vs. $399, but required 12 weeks of firmware porting – the TensorFlow models had to be recompiled for Ambarella’s CVflow architecture, and the video pipeline had to be rewritten from NVIDIA’s DeepStream SDK to Ambarella’s AMBA SDK.
Phase 4: Manufacturing Ramp (8 weeks, $350,000)
NRE (non-recurring engineering): $180,000 for PCB design, firmware, and mechanical engineering
Tooling: $85,000 for injection mould (enclosure + lens housing)
Certification: $45,000 (FCC, CE, UL)
First production run: 5,000 units at $54 BOM + $18 assembly = $72/unit
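The figures above imply a fully loaded unit cost once one-time costs are folded in. A quick check, amortising NRE, tooling, and certification over only the first production run (which overstates the long-run cost, since those costs spread across every future unit):

```python
# Amortise one-time costs from the Verkada example over the first run.
nre = 180_000            # PCB design, firmware, mechanical engineering
tooling = 85_000         # injection mould
certification = 45_000   # FCC, CE, UL
units = 5_000
bom = 54.0
assembly = 18.0

one_time_per_unit = (nre + tooling + certification) / units   # $62.00
loaded_cost = bom + assembly + one_time_per_unit              # $134.00
print(f"${one_time_per_unit:.2f} one-time/unit, "
      f"${loaded_cost:.2f} fully loaded")
```

At 5,000 units the one-time costs nearly double the $72 build cost; at 50,000 lifetime units their share falls to $6.20 per unit, which is why the kit-to-production transition only pays off at volume.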
Key Decision Points for Kit-to-Production Transition:
| Decision | Prototyping Phase | Production Phase | Why It Changes |
|---|---|---|---|
| Processor | Use dev kit SoC | Switch to application-specific SoC | 10–20× cost reduction at volume |
| Software | Use vendor SDK (DeepStream, JetPack) | Port to production platform | Licensing, power, cost constraints |
| Connectivity | USB dongles, breakout boards | Integrated wireless SoC | Size, reliability, cost |
| Enclosure | 3D printed or off-the-shelf | Injection moulded | $8 → $1.20 at 5,000+ units |
| Power | USB/barrel jack, external supply | Custom PMIC, PoE integrated | Reliability, form factor |
Timeline Summary: Proof of concept (4 weeks) → Prototype refinement (8 weeks) → Custom hardware (12 weeks) → Manufacturing ramp (8 weeks) = 32 weeks total from first prototype to production units. The prototyping kits saved an estimated 6-8 weeks of initial development by providing working hardware and pre-trained models on day one.
Interactive Quiz: Match Kit Categories
29.11 Summary
OpenMV Cam H7 Plus provides affordable standalone machine vision with MicroPython programming, suitable for simple object detection, barcode reading, and robotics vision applications
NVIDIA Jetson Nano offers powerful GPU-accelerated edge AI with full TensorFlow/PyTorch support, enabling real-time object detection (30+ fps) for autonomous robots and industrial inspection
Google Coral Dev Board delivers fast ML inference with Edge TPU acceleration at lower power consumption, ideal for TensorFlow Lite deployments requiring efficient edge processing
Digi XBee3 enables professional mesh networking with Zigbee/802.15.4/BLE support, self-healing topology, and excellent reliability for sensor networks requiring multi-hop communication
LoRa Development Kits provide km-range communication with ultra-low power consumption, perfect for agricultural monitoring, smart city, and remote telemetry applications
EnOcean offers production-ready energy harvesting for battery-free sensors in building automation, while SparkFun kits enable energy harvesting research and experimentation with multiple sources
In 60 Seconds
AI kits (OpenMV, Jetson Nano, Coral) bring computer vision and ML inference to the edge at different points on the cost-power-performance curve; wireless kits (XBee3, LoRa) connect sensors across buildings and kilometres via mesh or star topologies; and energy harvesting kits (EnOcean, SparkFun) power devices from light, motion, and heat for maintenance-free operation.
29.12 Concept Relationships
Prerequisites:
- Specialized Prototyping Kits Overview - understanding kit ecosystem architecture
- Edge Computing - AI and ML concepts for computer vision kits
- Wireless Protocols - LoRa, Zigbee, and mesh networking fundamentals

Related Concepts:
- Kit Selection Guide - framework for choosing AI/wireless kits
- Energy-Aware Considerations - power budgeting for energy harvesting systems
- Computer Vision - AI algorithms for vision applications