Key information:
Lattice SensAI Stack – Accelerating Integration of Flexible Inferencing at the Edge
Artificial intelligence/machine learning silicon solutions for emerging smart factory, city, home and mobile applications challenge designers building computing solutions at the edge. These new designs must be flexible, low power, small in form factor and low cost, without compromising performance.
Machine learning requires two phases of computing workload: first, the machine or system must learn a new capability from existing data; second, the machine must apply that capability to new data by identifying patterns and performing tasks. The first phase is highly compute-intensive and usually happens in a data centre. The second phase, called inferencing, happens close to the source of the data, at the edge.
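The two-phase split above can be illustrated with a minimal sketch (NumPy, with hypothetical synthetic data, not any Lattice tooling): a linear model is fitted on existing data in the compute-heavy "data centre" phase, and only the learned weights are then applied to new samples in the lightweight "edge" inferencing phase.

```python
import numpy as np

# Phase 1 ("data centre", compute-intensive): learn weights from existing data.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))             # existing training data
true_w = np.array([2.0, -1.0, 0.5, 3.0])   # hypothetical ground truth
y = X @ true_w                             # labels for the training data
w, *_ = np.linalg.lstsq(X, y, rcond=None)  # "training": least-squares fit

# Phase 2 ("edge", lightweight): apply the learned weights to new data.
x_new = np.array([1.0, 0.0, -1.0, 2.0])    # a new sample seen at the edge
prediction = x_new @ w                     # inferencing: a single dot product
```

The point of the sketch is the asymmetry: training touches the whole dataset, while inferencing on one new sample reduces to a few multiply-accumulates, which is exactly the kind of workload that fits a small, low-power device.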
One way of efficiently performing the computational tasks needed for inferencing is to take the parallel processing capability of FPGAs and use it to accelerate neural network performance. Using lower density FPGAs specifically optimised for low power operation enables designers to meet stringent performance and power limitations imposed on the network edge.
Introduced in May 2018, Lattice Semiconductor’s sensAI® is a comprehensive development ecosystem that simplifies the task of building flexible inferencing solutions optimised for the edge. It is a complete technology stack combining modular hardware kits, neural network IP cores, software tools, reference designs and custom design services via a partner ecosystem. Designed to speed deployment of AI into fast-growth consumer and industrial IoT applications, such as mobile, smart home, smart city, smart factory and smart car products, it is optimised to provide ultra-low power (under 1 mW to 1 W), small package sizes (5.5 mm² to 100 mm²) and low cost ($1 to $10 in high volume), combining the production benefits of ASICs with the flexibility of FPGAs to support evolving algorithms and interfaces (MIPI CSI-2, LVDS, GigE, etc.) and enable tailored performance.
In September 2018, Lattice expanded the sensAI stack with additional features for edge devices that are battery-operated or thermally constrained, and therefore require flexible, low-power, always-on, on-device AI. These included new IP cores, reference designs and hardware development kits optimised to deliver improved accuracy, scalable performance and power, and ease of use, while still consuming only a few milliwatts.
Lattice’s sensAI stack includes:
• Modular Hardware Platforms – the ECP5™ device-based Video Interface Platform (VIP), which supports MIPI CSI-2, eDP, HDMI, GigE Vision, USB3, etc. and includes Lattice’s Embedded Vision Development Kit; and the iCE40 UltraPlus™ device-based Mobile Development Platform (MDP), including an image sensor, microphones, and compass/pressure/gyro sensors. New iCE40 UltraPlus development platforms, including the Himax HM01B0 UPduino Shield and the DPControl iCEVision Board, were added in September 2018.
• IP Cores – Convolutional Neural Network (CNN) accelerators (for the iCE40 UltraPlus FPGA and for ECP5 FPGAs) and a Binarized Neural Network (BNN) accelerator.
• Software Tools – a neural network compiler tool mapping Caffe/TensorFlow models to FPGAs, Lattice Radiant™ design software and Lattice Diamond® design software.
• Reference Designs – human presence detection, face detection, key phrase detection, hand gesture recognition, object counting, face tracking, and speed sign detection.
• Design Services – the sensAI partner ecosystem continues to expand worldwide, with design service partners and IP partners delivering custom solutions for broad market applications in smart homes, smart factories, smart cities and smart cars.
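To illustrate why the binarised networks mentioned above suit small, low-power FPGAs: when weights and activations are constrained to +1/-1, a multiply-accumulate collapses into XNOR plus popcount on bit-packed values, which maps onto FPGA logic far more cheaply than hardware multipliers. The following sketch (plain Python, hypothetical 8-bit vectors; not Lattice's IP, just the underlying arithmetic) shows the equivalence.

```python
# Binarised dot product: each bit encodes +1 (bit = 1) or -1 (bit = 0).
# XNOR marks positions where the two vectors agree, so the dot product is
# matches - mismatches = 2 * popcount(xnor) - n.
N = 8
acts    = 0b10110100                       # packed activations (hypothetical)
weights = 0b10010110                       # packed weights (hypothetical)

xnor = ~(acts ^ weights) & ((1 << N) - 1)  # 1 where the bits match
popcount = bin(xnor).count("1")            # number of matching positions
dot_bnn = 2 * popcount - N                 # XNOR-popcount dot product

# Reference: the same dot product with explicit +1/-1 values.
unpack = lambda v: [1 if (v >> i) & 1 else -1 for i in range(N)]
dot_ref = sum(a * w for a, w in zip(unpack(acts), unpack(weights)))
assert dot_bnn == dot_ref
```

Because the inner loop becomes pure bitwise logic, wide binarised layers fit into the LUT fabric of even a small device like the iCE40 UltraPlus, which is the trade BNN accelerators exploit: reduced numerical precision in exchange for milliwatt-level power.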

As industry adopts machine learning technology, a number of factors including latency, privacy and network bandwidth limitations are pushing computing to the edge. Analysts expect 40 billion IoT devices at the edge between now and 2025 (source: Semico Research), and edge devices with AI are expected to show 110% CAGR unit growth over the next five years (source: Semico Research).
As a machine learning inferencing technology stack, sensAI enables the rapid integration of on-device sensor data processing and analytics in edge devices, addressing an unmet need for low-cost, ultra-low-power AI silicon solutions that can be rolled out rapidly across a diverse range of applications.
