Live Q&A - Tiny Machine Vision: behind the scenes
Presented by Lorenzo Rizzello
Live Q&A with Lorenzo Rizzello, following his talk titled 'Tiny Machine Vision: behind the scenes'
Make your IoT device feel, hear and see things with TinyML
Presented by Jan Jongboom
Many IoT devices are very simple: just a radio sending raw sensor values to the cloud. But this limits the usefulness of a deployment. A sensor can report that it saw movement in front of it, but not what it saw. Or a sensor might notice that it's being moved around, but not whether it's attached to a vehicle or just being carried. The reason is simple: to know what happens in the real world you need lots of data, and sending all that data over your IoT network quickly drains your battery and racks up your network bill.
How can we do better? In this talk we'll look at ways to draw conclusions from raw sensor data right on the device, from signal processing to running neural networks at the edge. It's time to add some brains to your IoT deployment. You'll learn:
- What is TinyML, and how can your sensors benefit from it?
- How signal processing can help you make your TinyML deployment more predictable and better performing.
- How you can start making your devices feel, hear and see things - all running in realtime on Cortex-M-class devices.
- Hands-on demonstrations: from initial data capture on real devices, to building and verifying TinyML models, to deployment on device.
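The signal-processing step mentioned in the list above can be sketched in a few lines. This is an illustrative example, not code from the talk: it shows how a raw sensor window is reduced to a small, predictable feature vector before it ever reaches a classifier.

```python
# Illustrative sketch of a TinyML-style signal-processing front end:
# reduce a raw accelerometer window to a few spectral features, so the
# classifier sees a compact, predictable input instead of raw samples.
import numpy as np

def extract_features(window, sample_rate_hz=100):
    """Reduce a raw 1-D sensor window to [RMS energy, dominant frequency]."""
    window = window - np.mean(window)               # remove DC offset
    spectrum = np.abs(np.fft.rfft(window))          # magnitude spectrum
    freqs = np.fft.rfftfreq(len(window), d=1.0 / sample_rate_hz)
    rms = np.sqrt(np.mean(window ** 2))             # overall signal energy
    peak_freq = freqs[np.argmax(spectrum[1:]) + 1]  # dominant frequency, skipping the DC bin
    return np.array([rms, peak_freq])

# Example: a pure 2 Hz vibration sampled at 100 Hz for one second
t = np.arange(100) / 100.0
features = extract_features(np.sin(2 * np.pi * 2 * t))
print(features)  # RMS of a unit sine ≈ 0.707, dominant frequency = 2.0 Hz
```

On a Cortex-M device the same idea would typically use a fixed-point FFT (e.g. from CMSIS-DSP) rather than NumPy, but the pipeline shape is the same.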
Want to Reduce Power in Always-on IoT Devices? Analyze First
Presented by Tom Doyle
Hundreds of millions of portable smart speakers are listening for a wake word. Millions more acoustic event-detection devices are listening for window breaks, baby cries or dog barks. Consumers appreciate how easy it is to use their always-on listening devices – but the battery drain that results from continuously processing all sounds in their environment? Not so much.
The problem is that these battery-powered IoT devices are notoriously power-inefficient in the way they handle sound data. Relying on the age-old “digitize-first” system architecture, they digitize all incoming sensor data as soon as they enter the device; the data are then processed for relevance and, in some cases, sent to the cloud for further analysis and verification. Since 80-90% of all sound data are irrelevant in most always-listening IoT devices, the digitize-first approach wastes significant battery life.
This session will show attendees how an “analyze first” edge architecture that uses analogML at the front end of an always-listening device eliminates the wasteful digitization and processing of irrelevant data, delivering unprecedented power and data efficiency in IoT devices.
Session attendees will:
- Understand that while most of today’s machine learning is implemented digitally, machine learning can also be implemented in ultra-low-power programmable analog blocks (analogML) so that feature extraction and classification can be performed on a sensor’s native analog data.
- Understand that the power problem for IoT devices is really a problem of the device treating all data as equally important and that determining which data are important earlier in the signal chain — while the data are still analog — reduces the amount of data that are processed through higher-power digital components. This approach saves up to 10x in system power in IoT devices.
- Learn how to integrate this new analogML edge architecture with sensors and MCUs from leading semiconductor suppliers into current and next-generation IoT devices.
Tiny Machine Vision: behind the scenes
Presented by Lorenzo Rizzello
Tiny devices, like the ones suitable for low-power IoT applications, are now capable of extracting meaningful data from images of the surrounding environment.
Machine vision algorithms, even Deep Learning powered ones, need only a few hundred kilobytes of ROM and RAM to run. But what optimizations are involved in executing on such constrained hardware? What is actually possible, and how does it really work?
In this session, we will focus on the capabilities available for Cortex-M microcontrollers, starting from the user-friendly environment provided by Edge Impulse to train Machine Learning models and deploy them to the OpenMV Cam H7+.
We will guide attendees through a straightforward example that illuminates the inner workings, so they can get a grasp of the technologies and frameworks involved. Attendees will walk away understanding the basic principles and able to apply them not just to Cortex-M devices but beyond.
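To make the "few hundred kilobytes" claim above concrete, here is a quick memory-budget estimate. The parameter count and input size are assumptions chosen for illustration, not figures from the talk; the key point is that int8 quantization stores one byte per weight.

```python
# Illustrative memory budget for a tiny vision model (assumed figures):
# int8 quantization stores each weight in a single byte.
params = 250_000                 # assumed parameter count of a small CNN
weights_kb = params * 1 / 1024   # int8 -> 1 byte per weight (ROM)
input_kb = 96 * 96 * 1 / 1024    # 96x96 grayscale frame, 1 byte per pixel (RAM)
print(round(weights_kb), round(input_kb))  # -> 244 9  (~244 KB weights, ~9 KB frame buffer)
```

Both figures fit within the ROM and RAM budgets of a high-end Cortex-M part such as the one on the OpenMV Cam H7+, which is what makes on-device vision feasible at all.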
Server and Edge AI for Tackling IIoT Data Glut
Presented by Altaf Khan
Cloud-based IIoT servers are receiving too much data, far too frequently, from an increasing number of edge devices. We present a complementary pair of AI solutions for reducing the data sent from the sensor and for efficiently processing it when it reaches the cloud server. The AI deployed at the sensor intelligently extracts insights from raw data with the help of inexpensive microcontrollers while operating on µWs of battery power. The server-side AI translates the insights received from a multitude of edge devices into decisions rapidly while employing a minimum of resources. The result is a low-latency, high-throughput cloud-based IIoT system.
Live Q&A - Want to Reduce Power in Always-on IoT Devices? Analyze First
Presented by Tom Doyle
Live Q&A with Tom Doyle following his talk titled 'Want to Reduce Power in Always-on IoT Devices? Analyze First'
Live Q&A - Make your IoT device feel, hear and see things with TinyML
Presented by Jan Jongboom
Live Q&A with Jan Jongboom following his talk titled 'Make your IoT device feel, hear and see things with TinyML'
Live Q&A - Server and Edge AI for Tackling IIoT Data Glut
Presented by Altaf Khan
Live Q&A with Altaf Khan following his talk titled 'Server and Edge AI for Tackling IIoT Data Glut'