Today we are excited to announce a new partnership with Microchip Technology®. As a reminder, SensiML has worked with Microchip to enable developers to quickly build smart sensing IoT devices using a combination of the SAM-IoT WG development kit and the SensiML Analytics Toolkit. This partnership is important on a number of fronts, so we wanted to give you a little more context and information beyond the press release.
Microchip had been working with some alternative edge AI tools and was interested in evaluating our AutoML-based smart sensing solution for their microcontroller platforms as well. At that point, we didn’t know which parameters mattered most to them, but soon after they began evaluating our tools they came back very impressed with the size of the models we produced: SensiML delivered models more accurate than even the prevailing AI framework, which had a much larger memory footprint. In fact, their own testing revealed that the SensiML model bested their performance expectations, as well as those of the alternative AI framework approach, while consuming a mere 10KB of memory!
Of course, a small memory footprint combined with a high degree of accuracy made our solution perfect for edge IoT applications and ultra-compact embedded IoT use cases, in which, not surprisingly, Microchip has a great deal of interest. To them, our results meant that not only could we run on their 32-bit processors, but that 16-bit and 8-bit processors are now viable platforms for ML-based sensor processing as well.
That was a big revelation for them, and important to us because of course there are millions of Microchip controllers already deployed around the globe. Being able to support high-quality AI/ML implementations on those microcontrollers means thousands of legacy designs can potentially be upgraded with local intelligence. That increases their value to the end-user and likely increases the longevity of those designs in the field.
In addition to our solution having a small footprint, the power consumption is extremely low, further enhancing the value for edge applications which are often battery-powered.
Beyond the benefits enjoyed at run-time, SensiML tools also make constructing models during the development phase vastly easier and faster than alternative means. Prior to the emergence of ML approaches for the smallest of processors, developers seeking to process sensor data locally in the embedded environment were left to devise custom code of their own. Cloud AI tools are built for platforms with many orders of magnitude more compute and memory resources, so they do not produce results relevant to embedded IoT designs. Instead, successful IoT sensor processing previously required one of three compromises:
- Reduced expectations – Simple sensors, simplistic insights (ex. connected industrial temperature threshold and limit switch alarms)
- Reliant processing – Depending on raw data transmission to a remote device such as a smartphone, gateway, or the cloud for analysis (ex. cloud-based predictive maintenance for industrial use cases)
- Custom algorithm development – Large investments of development time and expertise to hand-code application-specific algorithms that perform well in constrained embedded systems (ex. digital pedometer algorithms to infer steps walked from accelerometer data)
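To make the third compromise concrete, here is a minimal sketch of the kind of hand-coded, application-specific algorithm it describes: counting steps from accelerometer data with simple threshold-crossing peak detection. The function name, threshold, and debounce values are illustrative assumptions, not SensiML or Microchip code.

```python
import math

def count_steps(samples, threshold=1.2, min_gap=10):
    """Count steps in a sequence of (ax, ay, az) accelerometer samples (in g).

    A step is registered each time the acceleration magnitude rises above
    `threshold`, with at least `min_gap` samples between steps to debounce.
    Both tuning values are hypothetical and would need per-device calibration.
    """
    steps = 0
    last_step = -min_gap
    for i, (ax, ay, az) in enumerate(samples):
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > threshold and i - last_step >= min_gap:
            steps += 1
            last_step = i
    return steps

# Synthetic walk: a spike every 25 samples over a 1 g gravity baseline.
data = [(0.0, 0.0, 1.0)] * 100
for i in range(0, 100, 25):
    data[i] = (0.0, 0.0, 1.5)
print(count_steps(data))  # 4 spikes -> 4
```

Even a toy like this hides tuning work (thresholds, debounce windows, orientation handling) that historically consumed significant engineering time, which is exactly the cost AutoML is meant to remove.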
With the SensiML Analytics Toolkit, we leverage the strengths of AutoML (using machine learning to devise machine learning applications) to automate model construction and reduce the tedium and complexity of compromise option #3. The result is that IoT products built using SensiML as part of the development tool chain can truly shine in their ability to provide rich application insights right where the data is collected.
To facilitate this, Microchip and SensiML worked to further streamline the process for users of the MPLAB X IDE with tighter integration for data ingest into the SensiML data-driven AutoML workflow. Specifically, by interfacing with the MPLAB X Data Visualizer, we make it easy to read in a wide array of sensor data directly from the MCU. Using the Data Visualizer debugging tool, users can capture sensor data exactly as the MCU sees it and direct the desired sensor streams out to the SensiML Data Capture Lab. That approach simplifies a developer’s upfront configuration effort because every sensor that can be interfaced to a PIC or SAM microcontroller becomes available for AI model training and testing.
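The ingest idea above can be sketched on the host side: firmware streams raw sensor readings out a debug/serial channel, and a tool parses them into per-channel series ready for labeling and training. The newline-delimited CSV framing below is a hypothetical stand-in, not the actual Data Visualizer or Data Capture Lab protocol.

```python
def parse_sensor_stream(raw, channels=("ax", "ay", "az")):
    """Parse newline-delimited CSV sample lines into per-channel lists.

    Malformed frames (wrong field count, non-numeric data) are dropped
    rather than aborting the capture, since serial links can glitch.
    """
    series = {name: [] for name in channels}
    for line in raw.strip().splitlines():
        fields = line.split(",")
        if len(fields) != len(channels):
            continue  # skip frames with the wrong number of fields
        try:
            values = [float(f) for f in fields]
        except ValueError:
            continue  # skip frames with corrupted numbers
        for name, value in zip(channels, values):
            series[name].append(value)
    return series

# Example capture with one corrupted frame in the middle.
captured = "0.01,-0.02,0.98\n0.03,0.00,1.01\nbad,frame\n0.02,-0.01,0.99\n"
series = parse_sensor_stream(captured)
print(len(series["az"]))  # 3 good frames survive
```

The point of the tighter MPLAB X integration is that developers do not have to write or maintain plumbing like this themselves; the capture path from MCU to training data is handled by the tools.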
We hear that customers love their Microchip tools, and so now they can keep using them and easily add AI/ML features to their designs. Overall, a great fit for us, for Microchip, and for our mutual customers.