H.264, CPUs and the future

Thursday, July 10th, 2008

There have been several developments recently that have gotten me a bit excited. Anyone who reads this blog semi-regularly knows I’ve got a thing for H.264 (otherwise known as MPEG-4 AVC). Having been around just before DivX hit the big time (and having successfully predicted its rise by launching DivX Digest), I’m getting the same feeling with H.264, only this time I think it will be even more widespread in terms of usage. That’s mainly because H.264 is much more versatile than DivX, and there’s a lot more industry support: mega companies like Apple, CE firms like Sony, Internet brands such as YouTube, and now even video technology companies like DivX Inc are embracing it.

But enough about H.264 for now. Let’s first look at the more mundane, but still somewhat interesting, news regarding the latest range of CPUs. Intel is just about ready to launch its new “Nehalem” CPU range in Q4 2008, which looks set to redefine CPU performance once again. But for now, there are a few new CPUs from both AMD and Intel, and some price movements, at least here in Australia. Starting with AMD, there are the new Phenom X3 (triple-core) and X4 (quad-core) chips. The original Phenoms were a bit of a disappointment, especially up against the Intel Core 2 Duo range. But these new X3/X4 Phenoms are a different proposition: they give genuine competition in the market dominated by the Intel E8200 and E8400, and they finally provide a reasonable price/performance upgrade path from AMD’s ageing Athlon range. The X4 9750 comes at an excellent price, and the 9850 occupies the region (both price and performance wise) between the E8400 and E8500. The X3s are all very competitively priced too. Remember, you get an extra core or two with the Phenom over the E8400/E8500 as well, so while performance is similar for current-day apps, future apps that take advantage of the multiple cores will run better on the X3s/X4s.

Intel took action promptly, as always, and lowered the prices of several CPUs, notably the E8400, E8500 and the Q9300. The Q9300 is now in direct competition with the X4 9850, both in price and performance, but the 9850 has a slight edge in both areas. AMD still lacks upper-end processors to compete with Intel though, and Nehalem looks set to make everything else look decidedly old hat.

On a related note, I’m also running a poll on which types of CPU people are using … the results could be interesting.

But enough about CPUs, let’s talk a little about their replacement. Yes, the replacement for the humble CPU is just around the corner. Perhaps replacement is the wrong word; supplement is probably better. Nvidia has been touting its new CUDA architecture, which allows GPUs to take on processing tasks normally handled by the CPU. For those of you who have kindly offered your spare CPU power to Folding@Home, you might already be aware that there is a version of the client software that can use your GPU for added processing performance. The same principle can also be applied to video encoding, and Nvidia recently demonstrated H.264 encoding on its new GTX range of GPUs. These are now dubbed GP-GPUs (General Purpose GPUs), the “GP” meaning they can handle tasks other than graphics processing. Because GPUs have an architecture built for doing huge numbers of small, similar operations in parallel, and video processing happens to be exactly that kind of work, video encoding, particularly the very processor-intensive H.264 encoding, can benefit greatly from GP-GPUs.
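To illustrate why this kind of work maps so well onto a GPU, here’s a minimal CUDA sketch of my own (a toy example, not Nvidia’s encoder; it only assumes the standard CUDA runtime). It scores every 8x8 block of a 1080p frame with a sum of absolute differences (SAD), one of the basic building blocks of motion estimation in H.264 encoders, with each GPU thread handling one block so tens of thousands of blocks are scored in parallel:

```cuda
// Toy example: per-block SAD on the GPU, one thread per 8x8 block.
// Not a real encoder, just a sketch of the data-parallel style that
// makes GP-GPU video encoding fast.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void sad8x8(const unsigned char *ref, const unsigned char *cur,
                       unsigned int *sad, int width, int blocksPerRow,
                       int numBlocks)
{
    int block = blockIdx.x * blockDim.x + threadIdx.x;
    if (block >= numBlocks) return;

    int bx = (block % blocksPerRow) * 8;   // top-left pixel of this 8x8 block
    int by = (block / blocksPerRow) * 8;

    unsigned int sum = 0;
    for (int y = 0; y < 8; ++y)
        for (int x = 0; x < 8; ++x) {
            int d = (int)ref[(by + y) * width + (bx + x)]
                  - (int)cur[(by + y) * width + (bx + x)];
            sum += (d < 0) ? -d : d;       // absolute difference
        }
    sad[block] = sum;
}

int main()
{
    const int width = 1920, height = 1080;             // one 1080p luma plane
    const int blocksPerRow = width / 8;
    const int numBlocks = blocksPerRow * (height / 8);  // 32,400 blocks
    const size_t frameBytes = (size_t)width * height;

    unsigned char *dRef, *dCur;
    unsigned int *dSad;
    cudaMalloc(&dRef, frameBytes);
    cudaMalloc(&dCur, frameBytes);
    cudaMalloc(&dSad, numBlocks * sizeof(unsigned int));
    cudaMemset(dRef, 0x80, frameBytes);                 // flat grey "reference" frame
    cudaMemset(dCur, 0x90, frameBytes);                 // slightly brighter "current" frame

    // Score all 32,400 blocks in one launch.
    int threads = 256;
    int grid = (numBlocks + threads - 1) / threads;
    sad8x8<<<grid, threads>>>(dRef, dCur, dSad, width, blocksPerRow, numBlocks);
    cudaDeviceSynchronize();

    unsigned int first;
    cudaMemcpy(&first, dSad, sizeof(first), cudaMemcpyDeviceToHost);
    printf("SAD of first 8x8 block: %u\n", first);      // 64 pixels * 16 = 1024

    cudaFree(dRef); cudaFree(dCur); cudaFree(dSad);
    return 0;
}
```

Each thread does a tiny, independent piece of work and the GPU runs thousands of them at once, which is exactly the trick that lets a GTX-class card chew through video encoding.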

Let’s look at some specific examples. A 2-hour HD movie will require 10 hours of encoding on a 1.6 GHz dual-core system with integrated graphics. The same movie will take 5 hours and 33 minutes on a 3 GHz quad-core system, also with integrated graphics. But pair that slower 1.6 GHz dual-core system with a GTX 280 and the same movie takes only 35 minutes to encode, thanks to the GTX 280’s GP-GPU. 10 hours down to 35 minutes, roughly a 17x speedup … now that’s what I call acceleration!

Of course, GPUs are great for certain tasks, but not so good for others, and CPUs will still be the centerpiece of computing for some time yet. But their importance will be diminished if GPUs can suddenly take on tasks traditionally reserved for the CPU. Should Intel and AMD start to worry? Maybe not, but they will definitely be doing some deep thinking on this issue.