ICP Releases DevKit for Inference at the Edge

March 15, 2019 Alix Paultre

Industrial Computer Products (ICP) has released edge hardware featuring high-performance, flexible expansion options and pre-installed, ready-to-use software tools. Its TANK-870AI inference system offers software developers hardware based on a 6th/7th-generation Intel Core i5/i7 or Xeon E3 CPU with up to 32 GB of pre-installed RAM and 1 TB of SSD mass storage. In addition to numerous common interfaces, two PCIe x8 slots are available to extend functionality and boost the system with FPGA- and VPU-based AI accelerators.

Provided with the open-source Open Visual Inference and Neural Network Optimization (OpenVINO) toolkit, the system enables pre-trained convolutional neural network (CNN) models to be deployed at the edge. The integrated C++ Inference Engine API and Model Optimizer support multiple frameworks, including Caffe, TensorFlow, MXNet, and ONNX, and a wide range of compatible CNN topologies such as AlexNet and SqueezeNet can be used.
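One detail worth knowing when feeding images to such an inference engine: CNN runtimes like OpenVINO's Inference Engine typically expect input tensors in NCHW layout (batch, channels, height, width), while decoded images usually arrive as HWC. A minimal, pure-Python sketch of that conversion (real pipelines would use NumPy or OpenCV; the helper name here is illustrative, not part of any toolkit API):

```python
# Sketch: converting a decoded HWC image to the NCHW layout
# that CNN inference engines such as OpenVINO's typically expect.
# Pure Python for illustration; production code would use numpy.

def hwc_to_nchw(image):
    """Convert one HWC image (nested lists) to NCHW with batch size 1."""
    h = len(image)
    w = len(image[0])
    c = len(image[0][0])
    # Gather each channel plane, then stack planes channel-first.
    chw = [[[image[y][x][ch] for x in range(w)] for y in range(h)]
           for ch in range(c)]
    return [chw]  # prepend the batch dimension

# Tiny 2x2 "RGB image" as nested lists
img = [[[1, 2, 3], [4, 5, 6]],
       [[7, 8, 9], [10, 11, 12]]]
batch = hwc_to_nchw(img)
```

After conversion, `batch` has shape (1, 3, 2, 2): one batch entry, three channel planes, each a 2x2 grid.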

The Intel Media SDK provides optimized, hardware-accelerated decoding, processing, and encoding of video data before and after it passes through the Inference Engine. In addition, the cloud-based web editor Arduino Create is available as a plug-in for Intel platforms running the Ubuntu 16.04 operating system.
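The decode → infer → encode flow that the Media SDK accelerates can be sketched as a simple per-frame pipeline. All function names below are hypothetical stubs standing in for Media SDK and Inference Engine calls, so only the control flow is shown:

```python
# Hypothetical sketch of a decode -> infer -> encode video pipeline.
# decode(), infer(), and encode() stand in for Media SDK / Inference
# Engine calls; here they are stubs so the control flow is runnable.

def decode(frame_bytes):
    # The Media SDK would turn a compressed frame into raw pixels here.
    return list(frame_bytes)

def infer(pixels):
    # The Inference Engine would run the CNN here; the stub just
    # returns a placeholder label.
    return "object" if sum(pixels) > 0 else "empty"

def encode(pixels, label):
    # The Media SDK would re-encode the frame, e.g. with the label
    # overlaid; the stub pairs the raw bytes with the label.
    return bytes(pixels), label

def run_pipeline(frames):
    results = []
    for frame in frames:
        pixels = decode(frame)
        label = infer(pixels)
        results.append(encode(pixels, label))
    return results

out = run_pipeline([b"\x01\x02", b"\x00\x00"])
```

The point of the hardware-accelerated stages is that decode and encode run on fixed-function media blocks, leaving the CPU and AI accelerators free for the inference step in the middle.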

For more information, visit www.icp-deutschland.de.
