Part 2: Acoustic raw sensor data collection and labeling using SensiML Toolkit
In part 1 of this multi-part blog series, we discussed the advantages of transforming merely connected IoT devices into truly smart IoT edge AI devices. The former, which encompass the majority of present-day gadgets, can convey real insight only when coupled with companion cloud computing services or smartphone applications. In contrast, truly smart IoT edge devices carry self-contained sensor processing and ML firmware capable of delivering useful application insight autonomously. We covered the benefits of this autonomy and proposed an example application: a smart door lock fully equipped to listen for and recognize an array of audible events as a self-contained smart device. In part 2, we'll start developing this smart door lock by collecting and labeling a machine learning training dataset with the SensiML Data Studio application.
Our envisioned smart door lock will be armed with an algorithm capable of recognizing when a visitor comes knocking on the front door. To this, we'll add intelligence for detecting someone inserting a key into the deadbolt or door handle lock. Going further, we'll add the ability to detect the deadbolt locking or unlocking. From here, the sky's the limit for what we might choose to add. How about listening for the sound of a thief attempting to pick our door lock with tools or drill the lock barrel? Or the sound of breaking glass when a less sophisticated thief simply smashes the sidelight or door glass to reach through and open the lock from the inside? Our smart lock might even be trained to recognize a secret keyphrase and the voice signature of a latchkey child, who could then enter without needing a key or access code. You get the idea.
These features can be implemented in the upfront product design or added post-release as firmware updates to the model, providing extensibility and ongoing end-user value from the same sensor hardware and embedded microcontroller. When the SensiML Toolkit is coupled with an AI-accelerated processor, like the Silicon Labs MG24 and BG24 SoC families, even advanced neural network models can be used while maintaining the low-power characteristics and long battery life users expect from IoT devices. This software-based extensibility of AI at the IoT edge can transform fixed-function IoT gadgets into sensing platforms that drive entirely new business models and revenue opportunities.
So let’s get started collecting some insightful AI training data!
SensiML Data Studio makes it easy to capture data from the xG24 Dev Kit or your own custom PCB following our published interface spec. Streaming data or existing data provided in CSV or WAV file formats is easily ingested for labeling, model building, and code generation.
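For recordings made outside the tool, getting files into a consistent format first saves headaches later. As a minimal sketch (the file names and the use of SciPy here are our own illustration, not a SensiML requirement), here is one way to normalize a raw recording into the 16 kHz mono WAV format used throughout this project:

```python
# Minimal sketch: normalize a raw recording into 16 kHz mono 16-bit WAV
# before import. File names ("knock_raw.wav", etc.) are illustrative.
import numpy as np
from scipy.io import wavfile
from scipy.signal import resample_poly

TARGET_RATE = 16_000  # project-wide sample rate (16 kHz)

rate, audio = wavfile.read("knock_raw.wav")

# Collapse stereo to mono by averaging channels.
if audio.ndim > 1:
    audio = audio.mean(axis=1)

# Resample to 16 kHz if the recording used a different rate.
if rate != TARGET_RATE:
    audio = resample_poly(audio, TARGET_RATE, rate)

# Write as 16-bit PCM, a common format for WAV import.
wavfile.write("knock_16k_mono.wav", TARGET_RATE, audio.astype(np.int16))
```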
Capturing Door Knocking Events
After connecting the Silicon Labs xG24 Dev Kit to our PC running SensiML Data Studio, we'll proceed to collect some example data for knocking sounds. We might have chosen to use an actual door, but for convenience we constructed a demo door kit consisting of a slab of wood with a keyed deadbolt lock. In practice, your training dataset should include a variety of doors to 'teach' the AutoML engine the different knock sounds generated by metal, wood, and fiberglass doors.
Smart Door Lock Demo “Door”
A well-designed data collection effort would also consider the different types of 'knocks'. The classic knock gesture consists of a closed fist rapping knuckles against the door. But do we want to recognize other styles of knocking equivalently, or as distinct classification types? Some doors are equipped with a metal knocker that produces its own unique sound. Pounding on the door might indicate a more urgent or even threatening event. These decisions are yours to make based on the application's intent. Events you wish to distinguish need distinct classification outputs from the model, and thus unique segment classification labels for training. Events you choose not to distinguish share the same classification output and therefore the same segment classification label. Even so, such distinctions can benefit from metadata labeling, which annotates the dataset with segment attributes that could prove useful later for partitioning variations in the modeling process.
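To make that distinction concrete, here is one illustrative way to think about classification labels versus metadata for this project (all names here are hypothetical and ours, not a SensiML API):

```python
# Classification labels: the events the model must tell apart.
# Each distinct output class gets its own segment label.
CLASSIFICATION_LABELS = [
    "knock",            # any knock style we choose not to distinguish
    "key_insert",
    "deadbolt_lock",
    "deadbolt_unlock",
    "unknown",          # catch-all for everything else
]

# Metadata: attributes recorded per segment or file, not model outputs.
# Useful later for partitioning the dataset during modeling.
segment_metadata = {
    "door_material": "wood",    # wood / metal / fiberglass
    "knock_style": "knuckles",  # knuckles / knocker / pounding
    "location": "demo_rig",
}
```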
Labeling Our Segments Accurately and with Speed
SensiML supports both types of labeling, at a level as simple or as detailed as you choose, with well-considered workflows that ensure accuracy and ease the tedium of custom dataset development. Specific capabilities for automating dataset labeling of time-series data in SensiML Data Studio include:
- Unlimited labeling of one to many segments within an individual data file
- Fixed or variable segment sizes as appropriate
- Manual or procedural segment definition with customizable parametric segmentation algorithms
- Session management of dataset segmentations to retain manual segment labels in addition to procedural segmentation strategies
For our smart lock 'knocking' labels, we'll use short, fixed segments of 300 ms duration. Since the dataset consists of 16 kHz sampled monaural audio, this equates to segment lengths of 4,800 samples, which we define in the Settings menu to ensure every knocking training segment has the same length.
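As a quick sanity check on that arithmetic, here is a small Python sketch (assuming the 16 kHz mono WAV prepared earlier) that derives the 4,800-sample segment length and slices a recording into non-overlapping 300 ms windows:

```python
# 16,000 samples/s * 0.300 s = 4,800 samples per segment.
from scipy.io import wavfile

SAMPLE_RATE = 16_000                             # Hz
SEGMENT_MS = 300                                 # fixed segment duration
SEGMENT_LEN = SAMPLE_RATE * SEGMENT_MS // 1000   # = 4800 samples

rate, audio = wavfile.read("knock_16k_mono.wav")
assert rate == SAMPLE_RATE, "recording must match the project sample rate"

# Slice into back-to-back fixed-length segments.
segments = [
    audio[i : i + SEGMENT_LEN]
    for i in range(0, len(audio) - SEGMENT_LEN + 1, SEGMENT_LEN)
]
print(f"{len(segments)} segments of {SEGMENT_LEN} samples each")
```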
Segments can be defined in detail within files and across files. With industry-leading time-series data labeling tools, SensiML Data Studio provides both the precision and the workflow automation to handle accurate dataset labeling efficiently.
To really speed up the process, we can define an auto-labeling session that uses a procedural segment definition to automatically locate and lay down 300 ms labeled segment regions across all of our many recorded knock events (sketched below). Usability features like this can make the difference between high-quality and low-performing edge models. Most edge AI software tools leave this task to the user, requiring custom scripting, file conversions, and manual data manipulation. At best this slows project execution; all too often it leads to poor-quality results, as developers have limited time to devote to developing and testing MLOps tasks and workflows.
Autosegmenting makes fast work of defining and labeling large numbers of blocks within and across many files programmatically.
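Data Studio's segmenters are configured interactively with their own parameters, so the sketch below is not SensiML's implementation; it is only a rough Python illustration of what a threshold-style auto-segmenter does conceptually: scan the waveform for an amplitude burst, then drop a fixed 4,800-sample segment at each detected event.

```python
# Rough illustration of threshold-based auto-segmentation (NOT the
# SensiML algorithm): find amplitude bursts and emit fixed 300 ms windows.
from scipy.io import wavfile

def auto_segment(audio, seg_len=4800, threshold=2000, refractory=8000):
    """Return (start, end) sample indices of candidate knock segments."""
    starts = []
    i = 0
    while i < len(audio) - seg_len:
        if abs(int(audio[i])) > threshold:   # amplitude burst: likely a knock
            starts.append(i)
            i += refractory                  # skip ahead so one knock = one segment
        else:
            i += 1
    return [(s, s + seg_len) for s in starts]

rate, audio = wavfile.read("knock_16k_mono.wav")
for start, end in auto_segment(audio):
    print(f"candidate knock segment: samples {start}-{end}")
```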
“Keying In” on More Events of Interest
We're off to a good start, having captured a variety of knocking sound segments spanning door constructions and knocking techniques. Now we'll add capability by teaching our model to recognize key insertion and extraction sounds. With SensiML's streamlined tools, adding such event insights is straightforward: the resulting edge AI library takes the complexity out of handcrafting algorithms and heuristics to detect these acoustic signatures, freeing your team to focus on the application logic that adds value to your product. In the case of the key insertion event, simple heuristics layered on the AI library's output can make our door lock truly smart. For instance, multiple insertions without a successful locking or unlocking of the deadbolt might trigger an alert to a vacation rental landlord that their weekend renter is having access trouble. By handling the signal processing and AI for you, SensiML leaves your team free to focus on what your customers will value most.
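As a hypothetical sketch of that kind of application-layer heuristic (the class names, thresholds, and notifier function here are our own illustration, not part of any SDK), consider:

```python
# Hypothetical heuristic on top of the model's classification stream:
# several "key_insert" events with no successful lock/unlock in between
# raises an access-trouble alert. Names and thresholds are illustrative.
from collections import deque
import time

WINDOW_S = 60        # look-back window in seconds
MAX_INSERTS = 3      # insertions without success before alerting

recent_inserts = deque()

def on_classification(label, now=None):
    """Call with each class label emitted by the recognition model."""
    now = now if now is not None else time.time()
    # Drop insert events that fell out of the look-back window.
    while recent_inserts and now - recent_inserts[0] > WINDOW_S:
        recent_inserts.popleft()
    if label == "key_insert":
        recent_inserts.append(now)
        if len(recent_inserts) >= MAX_INSERTS:
            send_access_trouble_alert()   # hypothetical notifier
            recent_inserts.clear()
    elif label in ("deadbolt_lock", "deadbolt_unlock"):
        recent_inserts.clear()            # success resets the counter

def send_access_trouble_alert():
    print("Alert: repeated key insertions without a successful lock/unlock")
```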
A picture is worth a thousand words: with easily synchronized video from selectable input sources, users can ensure the intent and methodology used during the data acquisition phase are preserved along with the source sensor datasets.
Pulling It All Together
The process for capturing and labeling our deadbolt lock and unlock events is no different from the steps above, and we could add further acoustic events useful for security, such as glass-breakage sounds or drills attempting to defeat the lock. SensiML makes it possible to organize a project for efficient, data-driven ML development across a collaborative team. Such a team might consist of multiple test technicians capturing raw sensor data in the field across various locations while domain experts centrally label the data. With cloud-based project synchronization, individual team members work locally and contribute to a common dataset through the SensiML cloud.
Data Studio’s Project Explorer provides an overall dashboard that makes it easy to understand what data has been collected, what labeling has been completed, and what remains to be done. Additionally, it is easy to see at a glance what annotations have been made, group files logically by metadata, and see which files have synchronized video to help with offline labeling in cases where audio playback or sensor waveforms alone are not enough to grasp the full picture.
SensiML automatically handles all the recordkeeping and project synchronization for managing your ML datasets, so you can work quickly, efficiently, and without mistakes. This includes descriptive file-naming nomenclature, subject metadata, synchronized video annotation, multi-user file and project syncing, and other details.
In Our Next Installment: Building a Working Edge AI Model from our Smart Lock Dataset
Stay tuned for the next installment in this series, where we will construct a working IoT edge recognition model from the sound events collected above. Check back soon to see this exciting next step, or sign up to be notified when it's published.
Part 1: The plan to create an acoustic-aware smart door application
Part 3: Edge AI model generation for the AI-accelerated SiLabs xG24 Dev Kit
Part 4: Profiling the performance of the AI-accelerated EFR32MG24 model
Part 5: Using data augmentation to enhance model accuracy (coming soon)
Learn more about SensiML’s Accelerated AI Solution with Silicon Labs