A month or so ago we reviewed the ADLINK I-Pi rapid industrial prototyping platform on Dev Kit Weekly, and, to be honest, it looked an awful lot like this. Same plastic and aluminum cage, same SMARC carrier and module form factor, same I/Os for the most part. Except there’s this fan down here that wasn’t on the I-Pi. Hmmm.
Well, that fan is there because the Vizi-AI targets AI and computer vision workloads, and therefore hosts a whole lot more compute performance. Instead of an Arm-based Rockchip SoC, the SMARC module on the Vizi-AI hosts an Intel Atom x5-E3940 SoC as well as Intel® Movidius Myriad X VPUs designed to chew up and spit out deep learning workloads.
In support, the board features 4 GB of high-speed low-power DDR4 memory, with options up to 8 GB, to assist in the frequent memory accesses involved with AI inferencing.
But so what? You’re new to AI and just getting started. How could you possibly be far enough along to start running machine learning workloads on a target device?
Well, there are a couple of software components in the Vizi-AI kit that help with just that. The first is the Intel OpenVINO toolkit, a development suite that takes AI models trained in popular frameworks like Caffe, MXNet, and TensorFlow and optimizes them for execution on Intel compute platforms. Yes, that includes both VPUs like the Myriad X and Atom CPUs, as well as FPGAs, graphics units, and so on, if you choose to go with a different SMARC processor module but still need to do some deep learning.
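For a sense of what that optimization step looks like in practice: the OpenVINO workflow converts a trained model into the toolkit's Intermediate Representation (an .xml/.bin pair) before deployment. A minimal sketch using the toolkit's Model Optimizer CLI might look like the following; the input model filename and output directory here are placeholders, not anything specific to the Vizi-AI kit:

```shell
# Sketch: convert a trained TensorFlow frozen graph to OpenVINO IR.
# Assumes the OpenVINO development tools are installed and on PATH.
# FP16 is the precision typically targeted for Myriad X VPU deployment.
mo --input_model frozen_inference_graph.pb \
   --data_type FP16 \
   --output_dir ir/
```

The resulting IR can then be loaded by the OpenVINO runtime and dispatched to the CPU or the VPU at inference time without retraining the model.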
But going even further, what happens once you’ve started inferencing? What if, for example, your Vizi-AI-based computer vision platform doesn’t recognize something while it’s inferencing out at the edge?
Well, here the kit has you covered too, courtesy of other components in the ADLINK Edge Vision software stack. Besides the Model Manager, ADLINK Edge Vision contains a pre-packaged AIoT networking platform based on the DDS communications standard that pushes data to an open yet secure data river.
To get your hands on one of these, just fork over $199 to Arrow and you’ll be inferencing in no time. Or, you could take a shot at being one of five lucky winners who will have one sent to them for free by your friends here at Embedded Computing Design. All you have to do is fill out the form below.
And, as always, tune in next week for another edition of Dev Kit Weekly.
By Brandon Lewis