Building a Model

The Model Building part of the Analytics Studio uses SensiML’s AutoML to build a model while giving you control over the trade-offs that matter for your device. For example, an algorithm that detects your events with 100% accuracy may also use more of your device’s resources. By tweaking parameters in the Build Model tab, you may find an algorithm that uses half as many resources while still reaching 98% accuracy. You can also configure SensiML’s AutoML process to maximize accuracy while fitting within a desired memory constraint. This is a powerful concept that can save you a lot of time and money.
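
To make the constraint idea concrete, the sketch below is a minimal, hypothetical Python example (not the SensiML API): it picks the most accurate candidate model that still fits a RAM budget. The model names and numbers are invented purely for illustration:

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Candidate:
        name: str
        accuracy: float   # validation accuracy, in percent
        sram_bytes: int   # estimated RAM footprint on the device

    def best_within_budget(candidates: List[Candidate],
                           sram_budget: int) -> Optional[Candidate]:
        """Return the most accurate candidate that fits the memory budget."""
        feasible = [c for c in candidates if c.sram_bytes <= sram_budget]
        return max(feasible, key=lambda c: c.accuracy) if feasible else None

    # Hypothetical candidates -- every number here is invented for illustration.
    candidates = [
        Candidate("model_a", accuracy=100.0, sram_bytes=24_000),
        Candidate("model_b", accuracy=98.0, sram_bytes=12_000),
        Candidate("model_c", accuracy=91.0, sram_bytes=6_000),
    ]

    print(best_within_budget(candidates, sram_budget=16_000))  # picks model_b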

Let’s look at a screenshot and dive a little deeper.

../../_images/analytics-studio-model-building2.png
  1. Click Add New Pipeline and create a new pipeline called My Pipeline.

  2. Select the All Classes query from the query list.

  3. Select the Segmenter: Windowing and set the window size to 100. Windowing segmentation works well with continuous events.

    Note: 100 refers to the window size in samples, so picking Windowing(100) on 100 Hz data gives a 1-second window, meaning you will get a new classification every second. See the windowing sketch just after this list.

  4. Click Optimize and the Analytics Studio will automatically build you a model to detect your events.

  5. This is where SensiML’s AutoML finds the features needed to build an algorithm that will run on your device. Depending on the amount of sensor data, the seed, and the parameters you selected, this step can take some time.

  6. Once the pipeline is complete, it will display 5 models in the Auto Sense Results view.
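
As referenced in step 3, here is a minimal sketch of windowing segmentation in plain Python (not the SensiML segmenter itself). It splits a 100 Hz signal into non-overlapping 100-sample windows, so each window covers one second and would yield one classification:

    SAMPLE_RATE_HZ = 100
    WINDOW_SIZE = 100  # samples per window

    def window_segments(samples, window_size=WINDOW_SIZE):
        """Yield consecutive, non-overlapping windows of window_size samples."""
        for start in range(0, len(samples) - window_size + 1, window_size):
            yield samples[start:start + window_size]

    # Five seconds of placeholder single-axis sensor data at 100 Hz.
    signal = [0.0] * (5 * SAMPLE_RATE_HZ)

    for i, segment in enumerate(window_segments(signal)):
        # In the real pipeline each segment is passed to feature extraction and
        # classification, producing one classification per 1-second window.
        print(f"window {i}: {len(segment)} samples = "
              f"{len(segment) / SAMPLE_RATE_HZ:.1f} s")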

../../_images/analytics-studio-auto-sense-results2.png

There are several summary statistics for each model. You can use this information to select a model that fits your device’s resources while providing the level of accuracy your application needs. Keep in mind that there is typically an accuracy vs. resource usage trade-off: the more resources you allocate to the model, the higher the accuracy that can be achieved.
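
The exact statistics shown depend on your pipeline, but classification metrics such as overall accuracy and per-class sensitivity are commonly derived from a confusion matrix on held-out data. The sketch below is a generic, hypothetical illustration of how those two numbers are computed; the class names and counts are made up and it is not tied to the Analytics Studio output:

    from typing import Dict

    # Confusion matrix as nested dicts: confusion[true_label][predicted_label] = count.
    # Class names and counts are hypothetical, for illustration only.
    confusion: Dict[str, Dict[str, int]] = {
        "event_a": {"event_a": 48, "event_b": 2},
        "event_b": {"event_a": 5, "event_b": 45},
    }

    labels = list(confusion)
    total = sum(sum(row.values()) for row in confusion.values())
    correct = sum(confusion[label][label] for label in labels)
    print(f"accuracy: {100.0 * correct / total:.1f}%")

    # Sensitivity (recall) per class: correct predictions / actual occurrences.
    for label in labels:
        actual = sum(confusion[label].values())
        print(f"sensitivity[{label}]: {100.0 * confusion[label][label] / actual:.1f}%")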