On Device AI

Move past the Edge, run AI right On Device
Today, AI is moving away from the cloud and is being deployed closer to the data source. This movement is often referred to as Edge AI. Tomorrow, AI models will be run on device, giving birth to On Device AI. Icii is facilitating this shift.
The distinctions between the Cloud, the Edge, and the Device are often muddied. On Device AI focuses on deploying AI models to resource-limited devices, typically the very device that collects the data.
On Device AI is especially attractive when the following constraints are present:
  • Real-Time – An AI response is needed quickly, and often continually
  • Latency – Remote AI processing returns results too slowly to be effective
  • Bandwidth – The amount of data to transmit exceeds what available connections can carry (a back-of-the-envelope example follows this list)
  • Internet Connectivity – There is limited or no internet connection available
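As a back-of-the-envelope illustration of the latency and bandwidth constraints, consider streaming raw camera frames to a remote server for processing. The figures below (a 1080p camera at 30 frames per second and a 10 Mbit/s uplink) are illustrative assumptions, not measurements from any particular deployment:

    # Can raw sensor data realistically be shipped off-device?
    # All figures are illustrative assumptions.
    frame_width, frame_height = 1920, 1080   # 1080p camera
    bytes_per_pixel = 3                      # 8-bit RGB
    frames_per_second = 30

    raw_bits_per_second = frame_width * frame_height * bytes_per_pixel * 8 * frames_per_second
    uplink_bits_per_second = 10_000_000      # assumed 10 Mbit/s uplink

    print(f"Raw video rate: {raw_bits_per_second / 1e6:.0f} Mbit/s")          # ~1493 Mbit/s
    print(f"Assumed uplink: {uplink_bits_per_second / 1e6:.0f} Mbit/s")
    print(f"Shortfall: {raw_bits_per_second / uplink_bits_per_second:.0f}x")  # ~149x too much data

Under these assumptions the sensor produces roughly 150 times more data than the link can carry, before any network latency is even counted. Processing on the device removes the need to move that data at all.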
For On Device AI, the following challenges must be overcome:
  • Size – The compute platform must fit into a small embedded device
  • Power – Power consumption must be minimized to conform to limited power budgets
  • Performance – AI inferences require many billions of computations every second (a rough estimate follows this list)
  • Data Input – Needs to connect to various sensors and IO standards
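To put the performance challenge in numbers, the sketch below estimates the sustained compute rate for a compact vision model running on live video. The per-inference count (roughly 300 million multiply-accumulates, in the range of a MobileNetV2-class network) and the 30 frames-per-second rate are assumptions used purely for illustration:

    # Rough estimate of the compute rate a real-time vision model demands.
    # Both figures are illustrative assumptions.
    macs_per_inference = 300e6   # ~300 million multiply-accumulates per frame
    frames_per_second = 30       # real-time video

    macs_per_second = macs_per_inference * frames_per_second
    ops_per_second = macs_per_second * 2   # one MAC = one multiply + one add

    print(f"{macs_per_second / 1e9:.1f} billion MACs per second")        # 9.0
    print(f"{ops_per_second / 1e9:.1f} billion operations per second")   # 18.0

Even a compact model needs billions of operations every second, sustained, inside the device's size and power budget.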
To meet these constraints and challenges, Icii embraces field-programmable gate arrays (FPGAs). An FPGA is an embeddable chip that operates as a reconfigurable hardware circuit: microcontrollers (MCUs) and graphics processing units (GPUs) run software, while FPGAs run circuits. Using FPGAs, On Device AI solutions can run within a tiny footprint, consume less power, achieve lower latencies, and deliver high-throughput, real-time performance. The issue with FPGAs? Although they meet the needs of On Device AI, they have often been overlooked because of their challenging design process.
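As a sketch of why a reconfigurable circuit can keep up with that demand, the figures below assume a modest, fully pipelined multiply-accumulate array: 256 parallel MAC units clocked at 200 MHz, accepting new operands every cycle. These parameters are illustrative assumptions, not a description of any specific Icii design:

    # Illustrative throughput of a pipelined MAC array implemented on an FPGA.
    # Array width and clock rate are assumed example values.
    mac_units = 256      # parallel multiply-accumulate units in the fabric
    clock_hz = 200e6     # 200 MHz fabric clock, one result per unit per cycle

    array_macs_per_second = mac_units * clock_hz
    workload_macs_per_second = 9e9   # demand from the estimate above

    print(f"Array throughput: {array_macs_per_second / 1e9:.1f} GMAC/s")   # 51.2
    print(f"Headroom over the workload: {array_macs_per_second / workload_macs_per_second:.1f}x")  # ~5.7x

Because the circuit is laid out for exactly this dataflow, that throughput comes without the overhead of fetching and executing software instructions on an MCU or GPU.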
That’s why Icii is such a big deal. Icii solves the FPGA design challenge for AI. Our easy-to-use (and happy) Yeti takes an AI model and quickly returns a ready-to-use FPGA design that performs your AI inferencing.