
Intel’s “Euclid” module is a brain, vision, sensor, and hotspot for robots

Aug 20, 2016 — by Eric Brown — 6,615 views

Intel demoed a “Euclid” robotics compute module running Ubuntu on a quad-core Atom, and equipped with a RealSense 3D cam, WiFi hotspot, and various sensors.

At the Intel Developer Forum (IDF) in San Francisco this week, Intel showed off a prototype of its Euclid robotics controller, equipped with a stereo depth-sensing Intel RealSense camera and running an Ubuntu/ROS stack. Designed for researchers, makers, and robotics developers, the device is a self-contained, candy-bar sized compute module ready to pop into a robot. It’s augmented with a WiFi hotspot, Bluetooth, GPS, and IR, as well as proximity, motion, and barometric pressure sensors. There’s also a snap-on battery.

Euclid robotics compute module (left) and its snap-on battery pack

According to Sarang Borude, an Interaction Designer on the Experience Design/Development Team of Intel’s Perceptual Computing unit, the device comes preinstalled with Ubuntu 14.04 and Robot Operating System (ROS) Indigo. When it’s released in Q1 2017, it will likely run Ubuntu 16.04 with ROS Kinetic, the latest version.

Euclid module and battery pack, snapped together

On top of this OS layer, there’s a software stack that “really makes the device easy to use,” said Borude. “You can use this device without any other software installation. Usually a PC is married to the robot, but what we’re bringing is plug and play.”

Euclid module’s USB 3.0 and micro-HDMI ports (left) and battery pack interface

The Euclid module’s built-in Intel ZR300 RealSense camera features a wide-FoV 640 x 480-pixel RGB camera element, along with depth and accelerometer-gyroscope motion sensors. These features enable the acquisition of high-quality, high-density depth data at up to 60fps, says Intel. Other features include USB 3.0 and micro-HDMI ports, as well as a separate charging port for the battery.

Unlike two other Linux-driven, RealSense-enabled developer products announced this week at IDF — the Joule computer-on-module for IoT and Aero Compute Board controller for drones — Euclid was demonstrated as a preliminary proof of concept without detailed specs or a price.

UP board based RealSense Robotic Development Kit

Intel did not disclose which quad-core Atom SoC is inside Euclid, so it’s unclear whether it uses the new “Broxton” T5700/T5500 SoC found on the Joule. Since Euclid appears to be a more robot-deployable version of Intel’s RealSense Robotic Development Kit, which is based on Aaeon’s community-backed UP board, it may instead run on a stripped-down version of the UP. In that case, it would use an Atom x5-Z8350 “Cherry Trail” SoC, which has four 14nm cores clocked at 1.44GHz, or 1.92GHz burst.

Using Euclid

Euclid can be used as a full, autonomous “brain” with sensing capabilities, or as a smart sensor controlled by a more powerful computer, said Borude. In the second configuration, you can offload vision processing onto Euclid or access raw data from its sensors. You can also transfer Arduino sketches or ROS code to Euclid over the WiFi connection. In the case of Arduino sketches, Euclid passes them on, over USB, to an Arduino controller embedded in the robotic target. Euclid can be accessed and controlled by a web app from a desktop or from Android and iOS mobile devices.
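In the smart-sensor configuration, a host PC would ordinarily point its ROS environment at the module’s ROS master and subscribe to its topics over the WiFi link. Intel hasn’t published Euclid’s hostname or topic layout, so the names below are placeholders; this is simply how a standard ROS Indigo network setup is wired:

```shell
# On the host PC: point ROS at the module's master over the WiFi hotspot.
# "euclid.local" and the topic name are placeholders, not published by Intel.
export ROS_MASTER_URI=http://euclid.local:11311
export ROS_IP=$(hostname -I | awk '{print $1}')

# List what the module is publishing, then stream raw depth data from it
rostopic list
rostopic echo /camera/depth/image_raw
```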

Selecting a scenario for transferring to Euclid via the web app

On the show floor, Borude demonstrated a Euclid-driven robot running an obstacle avoidance course (see short video clip on YouTube). During the Euclid segment of an IDF keynote (see video farther below), the Euclid-enabled robot was demoed with a follow-me application, uploaded via a web-based utility. Both demos used a custom-made, dual-motor mobile robot that is activated by placing the Euclid device in a cradle formed by the bot’s outstretched hands. Intel will release instructions on how to 3D print the robot.

Demo robot holding its Euclid “brain” in its outstretched arms, while performing a collision avoidance scenario

The Arduino circuitry is located in the robot, not the Euclid device itself, which passes Arduino sketches along to the robot. Intel has previously integrated Arduino circuitry on the Galileo and Edison boards. In this case, however, it was presumably left out with the assumption that developers will be using Euclid to add more autonomous characteristics to existing Arduino-driven bots.

Euclid control screens: selecting a scenario (left) and checking the status of a ROS node

Indeed, Euclid is designed to work interchangeably with different robots that offer ROS and/or Arduino support. You can quickly develop “scenarios” for different applications or robots “by selecting different ROS nodes/packages already on the device from the web interface,” said Borude. For example, for the follow-me scenario, you would combine camera, person tracking, follower, and robot communications nodes. You could then quickly turn this scenario into an app by “clicking a few checkboxes in the web interface,” he added.
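In ordinary ROS practice, a scenario like the follow-me example boils down to a launch file that starts the selected nodes together. The package and node names below are invented for illustration; Intel has not published Euclid’s actual node or package names:

```xml
<!-- Hypothetical follow-me scenario; all names are illustrative -->
<launch>
  <!-- RealSense camera driver publishing RGB and depth streams -->
  <node pkg="realsense_camera" type="camera_node" name="camera" />

  <!-- Detects and tracks a person in the depth stream -->
  <node pkg="person_tracking" type="tracker_node" name="person_tracker" />

  <!-- Converts the tracked position into velocity commands -->
  <node pkg="follower" type="follower_node" name="follower" />

  <!-- Serial link to the Arduino controller in the robot base -->
  <node pkg="robot_comm" type="serial_node" name="robot_comm">
    <param name="port" value="/dev/ttyUSB0" />
  </node>
</launch>
```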

Remotely viewing real-time video and data from Euclid’s RealSense camera and sensors via the web app

To move this app to, say, a Turtlebot instead of the custom Euclid robot, you would simply replace the Arduino-oriented, serial robot communication node with a Turtlebot node. “We use ROS’s dynamic reconfiguration capability to change parameters in the ROS nodes to make the application suitable for a specific robot,” said Borude.
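The swap Borude describes amounts to replacing one node in a scenario definition and overriding a few parameters for the new robot. Here is a plain-Python sketch of that bookkeeping, not Intel’s implementation; all node names and parameters are hypothetical, and in a real system the parameter changes would go through ROS’s dynamic_reconfigure at runtime:

```python
# Minimal sketch: a "scenario" as a named list of ROS nodes plus parameters.
# Node/package names are illustrative; Euclid's real ones are not published.

def make_follow_me(comm_node, params):
    """Build a follow-me scenario around a robot-specific communications node."""
    return {
        "nodes": ["camera", "person_tracker", "follower", comm_node],
        "params": dict(params),
    }

def retarget(scenario, old_node, new_node, param_overrides):
    """Swap one node for another and apply per-robot parameter overrides,
    mirroring what dynamic reconfiguration does at runtime."""
    nodes = [new_node if n == old_node else n for n in scenario["nodes"]]
    params = {**scenario["params"], **param_overrides}
    return {"nodes": nodes, "params": params}

# Scenario for the custom demo robot, driven over serial to its Arduino
custom = make_follow_me("arduino_serial_comm", {"max_speed": 0.4})

# Same application retargeted at a Turtlebot: only the communications
# node and the tuning parameters change; vision nodes stay the same.
turtlebot = retarget(custom, "arduino_serial_comm", "turtlebot_comm",
                     {"max_speed": 0.25})

print(turtlebot["nodes"])
print(turtlebot["params"])
```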

You can use pre-installed ROS packages or your own, and there’s support for programming in high-level languages such as Node.js. The web UI lets you remotely monitor how a scenario is running, as well as display the raw camera or data streams used by an application to help debug it.

Intel CEO Brian Krzanich introduces Euclid at IDF 2016


— with additional reporting by Rick Lehrbaum



4 responses to “Intel’s “Euclid” module is a brain, vision, sensor, and hotspot for robots”

  1. vishwa says:

What’s the weight of this module, with and without the battery? And the power consumption?

    • LinuxGizmos says:

As mentioned in this post, the Euclid device Intel showed at IDF was a prototype, and as a result, Intel did not state what processor, RAM, and flash it contained, noting that these would most likely change by the time it reaches production. I held it in my hand for the photos you see in the post, and based on my recollection I’m guessing it weighed around 100-125g and was roughly 140 x 40 x 15mm in size, not including the battery pack. The battery pack probably added another 75-100g. No information was available regarding battery mAh or how long the device could run from its battery, again probably due to the device’s prototype status.

  2. Amit Moran (intel) says:


The device is open for pre-order.

    If you are interested, check out

    Let me know what you think.

    Amit Moran
