
Google kit uses RPi Zero and Coral Accelerator for machine learning

Feb 23, 2021 — by Jeff Child

Google Creative Lab’s Alto project pairs the Coral USB Accelerator with the Raspberry Pi Zero SBC to demonstrate easy-to-understand machine learning through an open-source mini robot that you build yourself.

Google Creative Lab has unveiled a project called Alto, a “teachable object using the Coral USB Accelerator.” The name stands for “A Little Teachable Object.” Alto is designed to give users a basic understanding of machine learning, combining the Coral USB Accelerator and a Raspberry Pi so that users can easily add machine learning to their own hardware projects.


Google’s Alto GitHub repository contains all of the instructions and files required to build an Alto from scratch. Alto is completely open source: the code and templates for the project are freely available. Google notes that Alto is not an official Google product, but rather a collaborative effort between Google Creative Lab and its partners at RRD Labs.

Building Alto (left) and Alto assembled

We’ve covered a number of Google AI-based education kits over the years. And Raspberry Pi SBCs have been along for the ride the whole way. An early example in 2016 was Google’s “Project Bloks” educational platform for kids built around a Raspberry Pi Zero. And in 2017 Google released its Python-based Google Assistant SDK, a kit for the AI-infused Google Assistant voice agent. That was followed by an AIY Projects kit for voice/AI projects on the Raspberry Pi. In 2018, the company built on the AIY strategy with new versions of its AIY Voice Kit and AIY Vision Kit, both shipping with a Raspberry Pi Zero WH.

Alto keeps machine learning simple. The robot has a camera on its front, and an arm and a button on each side. Alto uses the camera to observe the world around it. The buttons start a learning session, and Alto points with an arm when it sees something it has learned to recognize. Alto can learn two classes of things; when it recognizes one of them, it points with the corresponding arm. The more you teach Alto about the same object, the better it becomes at recognizing that thing.

Alto recognizing the difference between two objects and pointing to them with different arms

Alto has three main elements: a Raspberry Pi Zero SBC with a camera attachment, a Coral USB Accelerator for accelerating on-device machine learning, and some simple electronics for its user interface.

The Raspberry Pi Zero runs Raspberry Pi OS. It is responsible for interfacing with the user’s connected hardware (via GPIO), the camera module, and the Coral Edge TPU (via USB). The OS starts automatically and acts as host for the software application that runs Alto. The system accelerates all ML inferencing (and some of the learning) by delegating ML graph execution to the Edge TPU on the Coral USB Accelerator. The Edge TPU is an ML coprocessor that enables low-powered computers like the Raspberry Pi to run advanced ML workloads at far higher performance than running them on the CPU alone. All of the ML tasks performed by Alto are done on-device (offline) using the Edge TPU.
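For readers who want a feel for what that delegation looks like, the snippet below is a minimal sketch (not Alto’s own code) of running a TFLite model on the Coral USB Accelerator from Python using the tflite_runtime package and the libedgetpu delegate; the model path is a placeholder, and a real model must be compiled for the Edge TPU.

    # Minimal sketch (not Alto's actual code): run a TFLite model on the Coral
    # USB Accelerator from Python. The model path is a placeholder; a real model
    # must be compiled for the Edge TPU.
    import numpy as np
    from tflite_runtime.interpreter import Interpreter, load_delegate

    interpreter = Interpreter(
        model_path="model_edgetpu.tflite",                          # placeholder path
        experimental_delegates=[load_delegate("libedgetpu.so.1")],  # hand the graph to the Edge TPU
    )
    interpreter.allocate_tensors()

    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]

    # Feed one (dummy) input tensor and read back the result.
    frame = np.zeros(inp["shape"], dtype=inp["dtype"])
    interpreter.set_tensor(inp["index"], frame)
    interpreter.invoke()
    result = interpreter.get_tensor(out["index"])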

Coral software for the Edge TPU does not officially support the Raspberry Pi Zero: the Zero’s SoC implements ARMv6, while Coral officially requires ARMv8. That said, the Edge TPU Runtime is open source and has been successfully compiled for the Raspberry Pi Zero.

The Alto software application is written in Python. It receives input from the Raspberry Pi camera module, prepares it for classification by the Edge TPU, and drives the user-interface electronics. Google says the application uses a “k-nearest neighbor classifier model (k-NN) to identify the proximity of a given image to others in its learned dataset.”
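As a rough illustration of that image pipeline (the real application may capture and preprocess frames differently), the sketch below grabs one RGB frame from the Pi camera module via the picamera library and pushes it through an interpreter loaded as in the previous snippet, assuming a quantized feature-extractor model whose output serves as the embedding vector.

    # Illustrative only: capture one frame and turn it into an embedding vector.
    # Assumes "interpreter" was created as in the previous snippet, with a
    # quantized (uint8) feature-extractor model whose output is the embedding.
    import numpy as np
    import picamera

    def get_embedding(interpreter, size=(224, 224)):
        inp = interpreter.get_input_details()[0]
        out = interpreter.get_output_details()[0]
        with picamera.PiCamera(resolution=size) as camera:
            frame = np.empty(size + (3,), dtype=np.uint8)
            camera.capture(frame, "rgb")          # raw RGB frame from the camera module
        interpreter.set_tensor(inp["index"], np.expand_dims(frame, 0))
        interpreter.invoke()
        return interpreter.get_tensor(out["index"]).flatten()  # 1-D embedding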

When Alto is learning, it calculates the embedding of the incoming data from the image sensor in its model and assigns it a label – in the case of Alto, this label is either its left or right arm. When Alto is in recognition mode, the embeddings of data frames from the image sensor are determined and their proximity to other known embeddings is calculated. If these are within a certain distance of a labelled embedding, then Alto has recognized something, and will point at it with its corresponding arm.
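A toy version of that learn/recognize loop might look like the following; the distance threshold and the value of k are arbitrary choices for illustration, not values taken from the Alto code.

    # Toy k-NN over embeddings, mirroring the behavior described above.
    import numpy as np

    known = []   # list of (embedding, label) pairs; labels are "left" or "right"

    def teach(embedding, label):
        """Learning mode: store the embedding under the chosen arm's label."""
        known.append((np.asarray(embedding, dtype=float), label))

    def recognize(embedding, k=3, threshold=1.0):
        """Recognition mode: return the majority label of the k nearest stored
        embeddings, or None if nothing is within the distance threshold."""
        if not known:
            return None
        dists = sorted(
            (np.linalg.norm(np.asarray(embedding, dtype=float) - e), lbl)
            for e, lbl in known
        )
        nearest = dists[:k]
        if nearest[0][0] > threshold:
            return None                                   # nothing close enough
        labels = [lbl for _, lbl in nearest]
        return max(set(labels), key=labels.count)         # arm to point with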

The simple external elements of the Alto robot (left)
and the complete set of electronics to build Alto (right)

According to Google Creative Lab, the electronics for Alto are designed to be as flexible and hackable as possible. The design is based around a single through-hole prototyping board that can be assembled easily by hand. This board connects to the GPIO pins of the Raspberry Pi Zero and breaks out headers for the other component parts of Alto: two servos, two buttons, and one LED. The design and bill of materials (BOM) include a prefabricated USB breakout board with a discrete power regulation circuit that ensures smooth delivery of adequate power to Alto during normal operation.
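For a sense of how few moving parts are involved, here is a hedged sketch of driving that interface from Python with the gpiozero library; the pin numbers are assumptions for illustration, not Alto’s actual pinout (the schematics in the GitHub repo are the authority there).

    # Illustrative wiring sketch only: the BCM pin numbers are assumptions,
    # not Alto's documented pinout.
    from gpiozero import AngularServo, Button, LED

    left_arm = AngularServo(17)    # assumed pins for the two arm servos
    right_arm = AngularServo(27)
    left_btn = Button(23)          # assumed pins for the two buttons
    right_btn = Button(24)
    status = LED(25)               # assumed pin for the single LED

    def point(arm):
        """Swing one arm forward to 'point' and return the other to rest."""
        arm.angle = 60
        (right_arm if arm is left_arm else left_arm).angle = 0

    status.on()                                        # simple status indicator
    left_btn.when_pressed = lambda: point(left_arm)    # e.g. react to a button press
    right_btn.when_pressed = lambda: point(right_arm)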

Further Information

All the information, including instructions, files, and helpful images, is available on the Alto GitHub page. A nice summary of what Alto is and what you can do with it is provided in this SlashGear article.
