Movidius unveiled a “Fathom” USB stick and software framework for integrating accelerated neural network processing into embedded and mobile devices.
On April 28, Movidius announced availability of the USB-interfaced “Fathom Neural Compute Stick,” along with an underlying Fathom deep learning software framework. The device is billed as “the world’s first embedded neural network accelerator,” capable of allowing “powerful neural networks to be moved out of the cloud, and deployed natively in end-user devices.”
The Fathom USB stick was one of two solutions introduced in conjunction with this week’s Embedded Vision Summit in Silicon Valley that allow advanced AI algorithms to run directly on mobile or embedded devices without relying on cloud processing. The other is Qualcomm’s Snapdragon Neural Processing Engine, the company’s first deep learning software development kit for devices based on its Snapdragon 820 SoCs.
Both solutions should come in handy in medical, automotive, drone, robotics, and other applications where fast, reliable response times and privacy are priorities. Eventually, technologies such as these will enable deep learning to run directly on embedded, mobile, and IoT devices without requiring access to a cloud-based neural network.
Fathom Neural Compute Stick
The Fathom USB stick takes much of the power of the company’s Myriad 2 reference board (shown farther below) and squeezes it into a USB stick. Equipped with the MA2x5x Myriad 2 “Vision Processing Unit” (VPU) system-on-chip found on early Project Tango reference platforms, the Fathom USB stick enables quick and easy development of accelerated neural network processing for mobile or embedded devices. The Myriad 2 processor can run fully trained neural networks at under 1 Watt of power, claims Movidius.
Fathom Neural Compute Stick PCB
Neural networks can “significantly outperform” traditional computing approaches to tasks like language comprehension, image recognition, and pattern detection, says Movidius. Fathom can be used in applications such as object recognition, natural speech understanding, and autonomous navigation for cars, drones, and robots.
Myriad 2 VPU block diagram (left) and reference board
When connected to a PC, the Fathom Neural Compute Stick acts as a neural network profiling and evaluation tool. The stick supports major deep learning frameworks such as Caffe and TensorFlow, translating networks built in them to a Myriad 2 format for rapid prototyping. The development environment requires a PC running 64-bit Ubuntu Linux 14.04 LTS or 16.04 LTS, with Python 3 or later.
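Those host requirements are easy to gate on programmatically. The snippet below is a hypothetical standard-library helper (not part of the Fathom tools) that checks for a 64-bit Linux host running Python 3; distro- and Ubuntu-version checks are omitted for brevity:

```python
import platform
import sys

def host_supported() -> bool:
    """Rough check of the documented Fathom host requirements:
    64-bit Linux with Python 3 or later. (Hypothetical helper;
    Ubuntu-specific version checks are omitted.)"""
    is_64bit = platform.architecture()[0] == "64bit"
    is_linux = sys.platform.startswith("linux")
    has_py3 = sys.version_info >= (3,)
    return is_64bit and is_linux and has_py3

print("host OK" if host_supported() else "host not supported")
```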
The Fathom SDK “converts trained offline neural networks into embedded neural networks running on the ultra-low power Myriad 2 VPU,” says the company. Fathom enables testing over multiple inferences to determine network accuracy over larger datasets, collects per-layer statistics, and validates network models with native hardware precision. When running complex neural networks like GoogLeNet on the high-end MA2450 Myriad 2 model, Fathom is claimed to process 15 inferences per second at FP16 precision.
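Taken together, the claimed figures imply a simple per-inference budget: 15 inferences per second leaves roughly 67 ms per inference, and at about 1 Watt that is roughly 67 mJ per inference. The plain-numpy sketch below (not the Fathom SDK) works out those figures and illustrates what casting FP32-trained weights down to the FP16 precision used on the VPU looks like:

```python
import numpy as np

# Per-inference budget implied by Movidius's claims:
# ~15 inferences/second on GoogLeNet at roughly 1 Watt.
INFERENCES_PER_SEC = 15
POWER_WATTS = 1.0

latency_ms = 1000.0 / INFERENCES_PER_SEC              # ~66.7 ms per inference
energy_mj = POWER_WATTS / INFERENCES_PER_SEC * 1000.0  # ~66.7 mJ per inference
print(f"budget: {latency_ms:.1f} ms and {energy_mj:.1f} mJ per inference")

# FP16 precision: weights trained offline in FP32 are stored and
# computed in half precision, trading a little accuracy for throughput.
np.random.seed(0)
weights_fp32 = np.random.randn(1000).astype(np.float32)
weights_fp16 = weights_fp32.astype(np.float16)
max_err = float(np.max(np.abs(weights_fp32 - weights_fp16.astype(np.float32))))
print(f"max FP32->FP16 rounding error: {max_err:.2e}")
```

The rounding error stays tiny relative to typical weight magnitudes, which is why half precision is a common fit for embedded inference even though training is usually done in FP32.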
The Fathom USB stick can serve as a compact reference platform for developing applications that will run on a Myriad 2 VPU SoC within a mobile or embedded device. Alternatively, the Fathom USB stick can be used as a “discrete neural compute accelerator” peripheral, to “enhance [a device’s] neural compute capabilities by orders of magnitude,” says the company.
Fathom video overview
The Movidius Fathom Neural Compute Stick and Fathom SDK are available now for “qualified customers” at an unstated price. Additional details may be found in the company’s Fathom Neural Compute Stick news release.