Hot SoCs: Tegra beats PS3; 6-GPU Exynos; 4.5W Haswell?
Jul 26, 2013 — by Eric Brown [updated Aug. 9] — This week saw a flurry of news about new mobile processor developments that will significantly impact the Android and mobile Linux worlds. Nvidia unveiled Project Logan, a Tegra SoC (system-on-chip) with faster graphics than a PlayStation 3 at a third the power of the iPad 4's GPU; Samsung revealed a new Exynos 5 SoC with a six-core Mali GPU; and Intel confirmed that a tablet-focused "4.5 Watt" Haswell-based SoC is on the way.
The high-end system-on-chips (SoCs) that drive our mobile devices have grown so complex that it’s difficult to compare performance without citing benchmarks. No longer can you get a rough guesstimate from merely counting cores and GPUs, and comparing clock rates.
Mobile SoCs are mixing different types of processor cores in the same device, packing in multiple powerhouse GPU cores, and in the case of the Qualcomm-based Motorola X8 announced earlier this week, adding specialized coprocessors for natural language and contextual computing.
Below, we take a quick look at the latest SoCs from Nvidia, Samsung, and Intel…
Nvidia’s mobile Kepler GPU to drive Project Logan
When you think of Nvidia, you think graphics. Yet the GeForce GPUs found in its mobile Tegra SoCs have trailed competitors such as Qualcomm's Adreno technology. That appears set to change in the next-generation "Project Logan" version of the Tegra due next year, which will most likely be called the Tegra 5.
At the SIGGRAPH show in Anaheim, California this week, Nvidia demonstrated mobile Kepler, a new version of its Kepler graphics processing technology that will drive the graphics in the Project Logan Tegra. When it arrives in mobile devices in the first half of 2014, Project Logan with mobile Kepler will offer better graphics performance than the Nvidia GeForce-driven PlayStation 3, and even better than Nvidia's desktop GeForce 8800 GTX graphics card, claims the company.

Mobile Kepler relative performance
The mobile Kepler GPU will integrate 192 CUDA graphics cores, compared to the 2,688 cores in the desktop-targeted GeForce GTX Titan graphics card and the 768 in the laptop-class GTX 765M. Still, that's more than twice as many cores as the 72-core GeForce GPU built into the Tegra 4 SoC. The results can be seen in the demonstration video below of a realistic — and rather deranged — "Ira" digital model of a human head generated in real time.
Mobile Kepler GPU-based demo of the "Ira" human head simulation
The mobile Kepler GPU also adds a new low-power inter-unit interconnect and mobile optimizations to the Kepler technology found in the Titan. As a result, it draws considerably less power, and uses a third of the power of the Imagination Technologies PowerVR SGX 554MP4 GPU in the iPad 4, claims Nvidia.
Mobile Kepler provides far broader API support than earlier Tegras, including support for Khronos' new OpenGL 4.4 graphics specification, as well as OpenGL ES 3.0 and DirectX 11. It also introduces new rendering and simulation techniques, including advanced tessellation, compute-based deferred rendering, anti-aliasing and post-processing, and physics and simulation functionality. These advances will drive new mobile applications in computational imaging, computer vision, augmented reality, and speech recognition, while enabling games that offer "more detailed, fully interactive virtual worlds not previously possible on mobile devices," says Nvidia.
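To put that API claim in concrete terms, the snippet below is a minimal, vendor-neutral sketch of how an application might confirm OpenGL ES 3.0 support at runtime before enabling these newer features. It assumes an EGL/GLES context is already current and is not specific to Nvidia's driver.

```c
/* Minimal sketch: query the GLES driver for its version and renderer at
 * runtime.  Assumes an EGL context is already current on the calling thread. */
#include <stdio.h>
#include <GLES3/gl3.h>

void report_gl_capabilities(void)
{
    const char *version  = (const char *)glGetString(GL_VERSION);   /* e.g. "OpenGL ES 3.0 ..." */
    const char *renderer = (const char *)glGetString(GL_RENDERER);  /* GPU name reported by the driver */

    GLint major = 0, minor = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);   /* queryable from ES 3.0 onward */
    glGetIntegerv(GL_MINOR_VERSION, &minor);

    printf("Renderer: %s\nVersion:  %s (%d.%d)\n", renderer, version, major, minor);

    if (major >= 3)
        printf("OpenGL ES 3.0 features (instancing, transform feedback, etc.) are available.\n");
}
```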
Project Logan will likely use the same 28nm fabrication and 4+1 core arrangement as the Tegra 4, which combines four Cortex-A15 cores with a low-power helper core, according to an AnandTech report. Meanwhile, The Verge says the Project Logan version of the Tegra will be incorporated in a next-generation version of Nvidia’s Android-based Project Shield handheld game console.
Samsung adds six-core Mali GPU to Exynos 5
On July 23, Samsung unveiled a next-generation version of its Exynos 5 Octa processor, called the Exynos 5420. The Exynos 5, which shipped in the international version of the Samsung Galaxy S4 Android smartphone, was the first SoC to adopt ARM's big.LITTLE architecture, deploying four Cortex-A15 cores along with four lower-power Cortex-A7 cores.

Samsung's Exynos 5 Octa boasts ARM big.LITTLE
The new model adds a six-core Mali-T628 MP6 GPU, providing twice the graphics performance of the PowerVR SGX544MP3 found in the previous Exynos 5410 model and supporting resolutions up to WQXGA (2560 x 1600 pixels), claims Samsung. The Mali GPU supports OpenGL ES 3.0 and full-profile OpenCL 1.1.
The SoC's Cortex cores are still fabricated at 28nm, but the Cortex-A15 cores have been bumped from 1.6GHz to 1.8GHz, and the Cortex-A7 cores from 1.2GHz to 1.3GHz, for an overall performance increase of about 20 percent, claims Samsung. The Exynos 5420 offers a memory bandwidth of 14.9 gigabytes per second from dual-channel LPDDR3 memory running at 933MHz, enabling "an industry-leading fast data processing and support for full HD Wi-Fi display," says the company. Meanwhile, power consumption is said to be further reduced by an on-chip Mobile Image Compression (MIC) block, although no specific figures were offered.
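As a rough sanity check on that 14.9GB/s figure (our arithmetic, not Samsung's), the peak bandwidth follows directly from the stated memory configuration, assuming the usual layout of two 32-bit LPDDR3 channels:

```c
/* Back-of-the-envelope check of the quoted 14.9GB/s peak memory bandwidth,
 * assuming two 32-bit LPDDR3 channels running double data rate at 933MHz. */
#include <stdio.h>

int main(void)
{
    const double io_clock_hz   = 933e6;              /* LPDDR3 I/O clock */
    const double transfers     = io_clock_hz * 2.0;  /* DDR: two transfers per clock */
    const double bytes_per_xfr = (2 * 32) / 8.0;     /* dual 32-bit channels = 8 bytes */

    printf("Peak bandwidth: %.1f GB/s\n", transfers * bytes_per_xfr / 1e9);  /* ~14.9 GB/s */
    return 0;
}
```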
Presumably, the "coherence" bug that plagued the original Exynos 5 has been fixed in the new version. The bug, which occurred when switching between the Cortex-A15 and Cortex-A7 cores, caused serious performance and battery life problems.
It does not appear, however, that the 5420 supports the new Global Task Scheduling (GTS) model recently introduced by ARM and Linaro, which enables all eight cores in an Octa processor to operate simultaneously for peak performance. GTS also improves switching performance, extends battery life, and enables asymmetrical big.LITTLE core configurations, says Linaro.
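The practical upshot of GTS is that Linux sees all eight cores at once and can place threads on either cluster. The sketch below illustrates the idea using the standard Linux affinity API; the core numbering is a hypothetical example for illustration, not a statement about the Exynos 5420's actual topology.

```c
/* Illustrative sketch only: with Global Task Scheduling, all eight cores are
 * exposed to Linux at once, so a thread can be steered toward the big or
 * LITTLE cluster with an ordinary affinity mask.  The numbering below
 * (0-3 = Cortex-A7, 4-7 = Cortex-A15) is an assumed example. */
#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>

static int pin_to_big_cluster(void)
{
    cpu_set_t mask;
    CPU_ZERO(&mask);
    for (int cpu = 4; cpu <= 7; cpu++)   /* assumed Cortex-A15 cores */
        CPU_SET(cpu, &mask);

    /* 0 = calling thread; returns -1 on failure (e.g. cores offline) */
    return sched_setaffinity(0, sizeof(mask), &mask);
}

int main(void)
{
    if (pin_to_big_cluster() != 0)
        perror("sched_setaffinity");
    else
        puts("Thread restricted to the (assumed) Cortex-A15 cluster");
    return 0;
}
```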
The Exynos 5420 is currently sampling, and will enter mass production in August, says Samsung. This should enable new Exynos-based Android phones, phablets, and tablets to arrive in time for the holiday season.
Intel confirms 4.5W Haswell due by year’s end. Really?
In a brief July 23 blog announcement, Intel confirmed rumors that a new “4.5W” mobile version of its “Haswell” architecture 4th Generation Core processors will ship by the end of the year, targeted at tablets. No details were offered on the clock speed, number of cores, or targeted operating systems. Yet previously, Intel execs suggested to the press that the lower-power Haswell processors would be dual-core models, and that they would run on Android tablets in addition to Windows 8 devices.

Haswell microarchitecture diagram
When Intel released its first batch of 4th Generation Core processors in early June, it cited dual-core Core i7 TDPs as low as 15 Watts, down from 17W on comparable Ivy Bridge 3rd Generation processors. At the time, Intel also announced that new 4th Generation Core models due this year would lower power consumption by up to 50 percent compared to Ivy Bridge, down to a low 6W TDP. Later, it was rumored that the power consumption might actually get down to 4.5W, and now Intel has confirmed that both 6W and 4.5W parts will ship in the second half of 2013, aimed at fanless 2-in-1 hybrid laptop and tablet designs.
A 4.5W Core processor would certainly give mobile device vendors a second viable x86-based option, in addition to the lower-power, 22nm-fabricated Silvermont Atom processors. Silvermont is due by the end of the year in tablet-focused "Bay Trail" SoCs, followed by a smartphone-oriented Merrifield SoC in 2014. Silvermont is claimed to offer about 3x the peak performance of current Atoms, or up to 5x lower power at equivalent performance.
But as it turns out, Intel has invented a new term for characterizing the power consumption of its processors, and has been rather sloppy about making that term clear in its statements and presentations. Thanks to a reader comment below, we were alerted to the new Intel jargon for processor power consumption — "Scenario Design Power" (SDP) — and have updated this section accordingly.
Below is Intel's definition of SDP, quoted from the Mobile 4th Generation Intel Core Processor Family Datasheet (PDF download).
“Scenario Design Power (SDP) is a usage-based design specification, and provides an additional reference design point for power constrained platforms. SDP is a specified power level under a specific scenario workload, temperature, and frequency. Intel recommends setting POWER_LIMIT_1 (PL1) to the system cooling capability (SDP level, or higher). While the SDP specification is characterized at Tj of 80° C, the functional limit for the product remains at TjMAX. Customers may choose to have the processor invoke TCC Activation Throttling at 80° C, but is not required. The processors that have SDP specified can still exceed SDP under certain workloads, such as TDP workloads. TDP power dissipation is still possible with the intended usage models, and protection mechanisms to handle levels beyond cooling capabilities are recommended. Intel recommends using such thermal control mechanisms to manage situations where power may exceed the thermal design capability.”
And guess what: Intel's mobile processors seem much more competitive with ARM's parts, from a power perspective, when the Intel chips' power consumption is stated in SDP rather than TDP terms, which is where that 4.5W Haswell figure comes from.
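For readers wondering what the POWER_LIMIT_1 (PL1) setting referenced in Intel's definition actually is, the rough sketch below shows one way to read the configured PL1 value on a Linux x86 machine through the msr driver. The register layout follows Intel's public documentation; treat it as an illustration rather than a supported power-measurement tool, and note that it requires root and the 'msr' kernel module.

```c
/* Rough sketch: read the package power-limit MSR via /dev/cpu/0/msr and
 * decode the long-term limit (PL1) that Intel's SDP definition refers to.
 * MSR 0x606 holds the power units, MSR 0x610 the package power limits. */
#include <stdio.h>
#include <stdint.h>
#include <fcntl.h>
#include <unistd.h>

#define MSR_RAPL_POWER_UNIT  0x606
#define MSR_PKG_POWER_LIMIT  0x610

static int read_msr(int fd, uint32_t reg, uint64_t *val)
{
    return pread(fd, val, sizeof(*val), reg) == sizeof(*val) ? 0 : -1;
}

int main(void)
{
    int fd = open("/dev/cpu/0/msr", O_RDONLY);   /* needs root + msr module */
    if (fd < 0) { perror("open /dev/cpu/0/msr"); return 1; }

    uint64_t unit, limit;
    if (read_msr(fd, MSR_RAPL_POWER_UNIT, &unit) ||
        read_msr(fd, MSR_PKG_POWER_LIMIT, &limit)) {
        perror("pread");
        return 1;
    }

    double watts_per_unit = 1.0 / (1 << (unit & 0xf));   /* power units: 1/2^n W */
    double pl1 = (limit & 0x7fff) * watts_per_unit;      /* PL1 lives in bits 14:0 */

    printf("PL1 (long-term package power limit): %.1f W\n", pl1);
    close(fd);
    return 0;
}
```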
Comments

I can't believe you guys fell for Intel's marketing. Haswell is not 4.5W TDP but SDP, a new metric from Intel for running special processors at 800MHz. The TDP is 11.5W and power draw can go beyond 25W.
Guilty as charged! We certainly did get roped in by that. SDP stands for "scenario design power," and appears to be a new Intel marketing term, designed to compete with ARM's low TDPs.
Quoting from TheInquirer.net:
“Now Intel has told The INQUIRER that it should have been clearer in Skaugen’s presentation that the 7W figure for the Core i5 3339Y chip was not TDP but SDP. Adam King, director of notebook product marketing told The INQUIRER, ‘The fact [is] that we didn’t really specify in Kirk’s [Skaugen] keynote when we put SDP numbers up there with TDP numbers without clarification and that’s really what I think has caused the questioning, which is fair.'” (source)
Intel's official definition of SDP, from its Mobile 4th Generation Intel Core Processor Family Datasheet, is quoted in full in the updated section above.
So Intel cheats in benchmarks (AnTuTu) with overaggressive compiler optimizations, and also makes the power numbers look better with SDP rather than TDP…
mmmmh seems like they now see ARM as their biggest competition ;-)
(they used similar methods back when AMD had the better products)
The sad thing is that it works for them – the majority of the press does not notice this and just repeats what Intel's marketing is saying and writes about "Intel crushing ARM" or similar.
I think the next Intel products will not be better than current ARM offerings.
But ARM will see very serious competition from Intel in the future now that they are all in…
(good that they did not go all in back when Atom was released – this gave ARM time to grow)
NVidia are notorious b*******ters about their next, upcoming, latest & greatest Tegra products, and they always fail to live up to expectations when they actually reach reality.