Note: I wrote this post back in September 2009, before Tegra started showing up in many devices. I think many of my points still stand, so I’m reproducing it here for archival purposes. Enjoy.
The first device to feature nVidia’s much-hyped “Tegra” technology, the Zune HD, has been out for a few weeks now. I figured it’s worth spending some time reflecting on the technology, and what it means for the future of mobile computing. I’ve been eagerly following Tegra since 2008 because of its potential to revolutionize mobile devices; it promised to deliver powerful computing, video and 3D performance all with a very low power footprint. And given its relatively smooth implementation in the Zune HD, I believe nVidia has accomplished what they initially set out to do.
What’s behind Tegra?
In short, the Tegra is what’s referred to as a “system on a chip”. It’s the combination of a processor, graphics chip, memory, and several other components that go into a typical computing system — the difference being that Tegra fits all of those components onto a single chip. Its minuscule size, along with its impressively low power consumption (it can decode HD video while drawing only 1 W of power!), makes Tegra an ideal technology for mobile devices. You can check out further technical details on Wikipedia.
How is it different from other solutions?
There currently isn’t any other technology that can take on Tegra toe-to-toe. Due to its system on a chip design, Tegra can be placed in cellphones and similarly sized portable media devices like the Zune HD.
The closest competing solution would probably be Intel’s Atom line of processors; however, Atom still requires a separate chipset to handle graphics, memory, and storage. This currently makes it impossible for Intel to get the chip into mobile products, though it remains ideal for netbook-sized devices. Intel is eventually planning to go with a system on a chip design for future Atom products, so nVidia won’t be alone in this segment for too long.
Why should I care about Tegra?
The biggest draw for Tegra is its ability to do a lot of work while drawing very little power. Since improvements in battery technology move at a far slower pace than other tech, hardware manufacturers are often forced to figure out ways to get more performance without killing battery life. Tegra accomplishes this by consolidating many components onto one chip. It’s also what allows the Zune HD to be so generous with its use of 3D (among other visual flourishes) while not taking a huge battery hit.
The Zune HD also provides a glimpse at what Tegra is capable of when power isn’t an issue — in particular, when it’s connected to a dock and used as a media center for HDTVs. We’ve also seen demos of netbooks running Tegra that can handle HD video without issue, even when sending the video to a large HDTV. In comparison, my Intel Atom-based netbook can’t play 720p video without occasionally hiccuping. Tegra will ultimately allow for more capable mobile devices, and will give us computing power to spare.
What do other companies need to do to compete?
The mobile space forces hardware manufacturers to think smarter, because they can’t simply increase the processor speed to get more performance — they also have to consider battery life and the portability of the hardware. Tegra has made nVidia king of the hill, and now everyone else has to catch up. As I mentioned above, Intel’s looking at upgrading Atom to a system on a chip design to compete with nVidia, but there hasn’t been much word from other potential players. AMD, the rival processor company to Intel (and owner of ATI), has been mum on this segment.
With rumors swirling that Nintendo may be using Tegra in its next DS, coupled with the fact that Microsoft will surely want to stick with it for their future portable devices (phones, or otherwise), it’s clear that nVidia is onto something. I just hope other chip manufacturers sit up and take notice; as I’ve written about before, competition is always a good thing.