I’ve been doing a fair amount of experimenting, mostly using xbmc’s svn trunk code, comparing the playback quality, cpu usage, codec support, etc. of Broadcom Crystal HD-based decoding and nVidia VDPAU-based decoding with assorted video files I’ve got around the house. Open-source crystalhd support under Linux is quite new, as some of you may know, but the chip and driver have actually been around for quite some time, lurking in assorted set top boxes, if I understand correctly. So far, the chip and driver seem pretty robust (not surprising, if they have indeed been in set top boxes for a while now), and definitely highly capable. nVidia’s graphics cards have been able to do video decoding for probably at least as long, but until a year ago or so, only the Windows drivers supported that functionality. So VDPAU is relatively new as well, though its usage is a bit more established than crystalhd’s in the open-source world. (Note, of course, that while there’s an open-source libvdpau, the only way to provide a working decoder backend for it is via binary-only drivers from nVidia or S3, the latter of which I know nothing about, so for open-source purists, crystalhd is a win here.)
The question many people have asked me (on mythtv irc and mailing lists, primarily) is ‘how does it compare to VDPAU?’. Well, it compares quite favorably. I’ve got a development box with both an nVidia Quadro FX 1700 and a Broadcom Crystal HD card in it that I’ve been poking at the past few nights. For fully spec-compliant h.264 streams, there’s little or no difference in the quality of the decoders, with maybe a *slight* (and possibly only perceived and subjective) edge to the Crystal HD decoder. They’re both quite good. Like, ‘you can’t tell this is a desktop computer playing back the video and not a commercial blu-ray player’ good. At the moment, less-than-100%-spec-compliant h.264 clips tend to give the Crystal HD fits, while they play fine via VDPAU. However, most of that is on the software side, and can/will be remedied as development progresses.
As for mpeg2 (aka High-Definition TV in the US), the Crystal HD does support that as well, but most work has been focused on h.264 to date. The mpeg2 decoding via Crystal HD in xbmc isn’t quite there yet, but Scott has been working on it. A few days ago, 1080i mpeg2 content had a nasty green cast to it, along with alternating light and dark stripes. The green cast is gone now, but the stripes persist — it also looks like either fields are playing out of order or being drawn in not quite the right locations, visually manifesting as the picture oscillating up and down a pixel or two. Both 720p and 1080i lose sync after a bit as well, and then the player thrashes about trying to get them back in sync (or something). But they’re known issues, and progress is being made.
I have to add this data point though: while playing back some 1080p h.264 video decoded by the Crystal HD this past weekend, my wife said “wow, that looks so good! why does it look so good? I don’t remember it looking that good before!” (more or less).
So why should you choose one over the other? It depends on a few things… If you’re in the camp that is morally opposed to binary-only software, such as the nVidia graphics driver, the Crystal HD route is far less objectionable. It *does* require a binary-only firmware image that must be loaded onto the card, but otherwise, the driver and interface library are 100% open-source (GPL driver, LGPL library). The driver is in the Linux kernel staging tree, and the library is making its way into Linux distributions. The Crystal HD isn’t dependent on any particular graphics chipset either. I’ve got a Lenovo ThinkPad T61 with Intel X3100 graphics and a Mac Mini with an Intel GMA950, neither of which I could replace the graphics chipset in. Both have mini PCIe slots, though, and now both can play back 1080p h.264 using the Crystal HD, whereas before, not so much.

Note, however, that there are higher cpu and memory requirements for using the Crystal HD to decode video than for VDPAU. This is the downside of it not being tied to a particular graphics chipset. Because it’s a separate device, there is occasional pixel format conversion (nv12 to yv12, yuv to rgb, etc.) and buffer copying (from the Crystal HD to the graphics card) that has to be done, rather than everything occurring strictly within the graphics chip. That said, even a lowly 1GHz AppleTV with 256MB of RAM can play back 1080p h.264 with a Crystal HD (it’s just pretty close to maxed out). The chances of having a PCI Express-capable chipset with a CPU slower than that are minimal. Single-core Atom systems, particularly in netbooks, are one of the main targets for the Crystal HD.
Back to the hardware for a moment… To make room for the Crystal HD in my T61, I yanked the worthless-under-Linux Intel TurboMemory card. In the Mac Mini, I yanked the wifi card, since I have wired GbE to it. If you have a system with an available PCIe slot, it’s probably easier to simply drop an nVidia graphics card in, partially because the Crystal HD is only generally available in a mini PCIe card format right now, meaning you’d need an adapter to use one in a PCI Express system w/o a mini PCIe socket (but these do exist). Remember the above-mentioned development system w/the Quadro FX 1700 in it? It’s actually a tower with no mini PCIe socket; I have a PCIe x1 Crystal HD card courtesy of Broadcom, it’s just not available anywhere on the open market at this time. They’d like for it to be, along with an expresscard variant (for laptops without an internal mini PCIe socket to spare), but the mini PCIe version is likely to be the one most in demand by far, so only time (and market conditions) will tell what options become available.