HD Video Decoding on GPUs with VLC 1.1.0
by Ganesh T S on June 25, 2010 4:35 AM EST - Posted in Home Theater, HTPC
It is time for HTPC enthusiasts to rejoice! Videolan announced the availability of VLC 1.1.0 a couple of days back. VLC's popularity soared in the mid-2000s, when standard definition videos were all the rage and CPUs were powerful enough to decode them easily. Over the last few years, many people have built up large libraries of high definition videos, and one of the complaints against VLC was that all of its built-in codecs relied entirely on CPU horsepower for decoding. Even powerful modern multi-core processors can have trouble decoding HD videos [Clarification: 'trouble' with CPU decoding might mean dropped frames, stutters, sudden spikes in CPU usage, the CPU fan kicking in, and so on. These issues are more noticeable with single-threaded decoder implementations].
HTPC users with GPUs capable of accelerating HD video decode initially relied on the bundled commercial software (from Cyberlink / ArcSoft / Corel). However, the bloatware and container restrictions imposed by these players led enthusiasts to open source projects such as Media Player Classic - Home Cinema (MPC-HC). These tapped into the GPU's capabilities using the DXVA / DXVA2 APIs on Windows and VAAPI on Linux. The extent of support provided through these APIs depended on the GPU vendor: historically, Nvidia has provided much better support than ATI, while Intel lagged behind until late last year. This is evident from one of the popular blog posts used as a reference by people wanting to get DXVA working on their GPUs. Users of MPC-HC also had to deal with external codec packs such as CCCP, and a large number of options had to be set up correctly in order to get GPU decoding to work. There was an urgent need for the big player in this space to come to the party, and Videolan has done exactly that with the 1.1.0 release of the VLC Media Player.
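For readers who would rather drive the new decode path from their own code than from the VLC GUI, the sketch below shows one plausible way to do it through libVLC. It is only a sketch under stated assumptions: the --ffmpeg-hw switch is the one VLC 1.1.x uses to enable its experimental GPU decoding (DxVA2 on Windows, VAAPI on Linux), the file name is hypothetical, and error handling is minimal.

```c
/* Minimal libVLC sketch: play a clip with VLC 1.1.x's experimental
 * GPU-accelerated decoding enabled (illustrative, POSIX-flavoured). */
#include <stdio.h>
#include <unistd.h>
#include <vlc/vlc.h>

int main(void)
{
    /* assumed switch: tells the avcodec module to try DxVA2 / VAAPI decoding */
    const char *const vlc_args[] = { "--ffmpeg-hw" };

    libvlc_instance_t *vlc = libvlc_new(1, vlc_args);
    if (!vlc) {
        fprintf(stderr, "failed to create libVLC instance\n");
        return 1;
    }

    /* hypothetical clip; substitute any H.264 / VC-1 HD file */
    libvlc_media_t *media = libvlc_media_new_path(vlc, "sample-1080p.mkv");
    libvlc_media_player_t *player = libvlc_media_player_new_from_media(media);
    libvlc_media_release(media);

    libvlc_media_player_play(player);
    sleep(30);  /* let it run; a real player would have an event loop (Sleep() on Windows) */

    libvlc_media_player_stop(player);
    libvlc_media_player_release(player);
    libvlc_release(vlc);
    return 0;
}
```

Building against libvlc with something like gcc vlc_hw.c $(pkg-config --cflags --libs libvlc) should be enough to try it; in the regular VLC GUI, the equivalent checkbox should live under Tools > Preferences > Input & Codecs.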
However, all is not well yet in VLC land. Videolan supplied the caveat that the experimental GPU acceleration works only on Nvidia GPUs for now. They cited trouble with the ATI drivers and the lack of access to an Intel IGP as the reasons for not being able to support non-Nvidia platforms with confidence. With a core developer team of just five people, most of whom are not Windows developers, it is hard to find fault with that reasoning.
At the end of our testing, we found some unexpectedly good things, but there was some disappointment as well. Before going into the details, let us take a look at the test bed and test suite we used for the analysis.
74 Comments
MGSsancho - Friday, June 25, 2010 - link
What software did you use to test DXVA compatibility? Also, if possible, where can we get a hold of it? :)
Per Hansson - Friday, June 25, 2010 - link
It is "DXVA Checker"You can doiwnload it here;
http://bluesky23.hp.infoseek.co.jp/en/index.html
barniebg - Friday, June 25, 2010 - link
Come on, the most important benefit of using a GPU to decode video is the fact that you can apply hardware deinterlacing. VLC deinterlacing is nowhere near even remotely comparable to any GPU.
MGSsancho - Friday, June 25, 2010 - link
I am disappointed as well but applaud VLC for being another competitor in this very important arena.
CSMR - Friday, June 25, 2010 - link
Deinterlacing is a legacy concept relating to content produced with CRTs in mind. It is not important in the modern world. If you have content that is interlaced, your encoding software should deal with it, or else download a better version.
probedb - Friday, June 25, 2010 - link
What about those of us that don't want to re-encode video? Or that play DVDs back from the drive? De-interlacing is still very much required.
mckirkus - Friday, June 25, 2010 - link
DVD content is stored as progressive (480p). On an old CRT/tube TV, the DVD player interlaces the content (480i) so it is compatible with the TV. I can't think of any digital content that is stored in interlaced format these days.
mckirkus - Friday, June 25, 2010 - link
(ok, no edit button, non-HD TV is interlaced, as is 1080i broadcast ATSC). I should have said DVDs and Blu-ray are not interlaced.
flanger216 - Sunday, June 27, 2010 - link
Swing and a miss, #2. Try again, please. TONS of DVDs are interlaced, both from PAL and NTSC regions. Plenty of content (HDV and tape sources, for starters) has never been anything other than interlaced, right from camera acquisition, and is directly encoded from the interlaced source to an interlaced DVD... for obvious reasons.
Many film-based DVDs released prior to 2000 or so are also interlaced, because they were encoded from old cable, laserdisc and VHS masters. Also, heaps of low-budget and foreign (especially Asian) DVDs are made from interlaced masters, due to old or subpar equipment, or simply because interlaced workflows are often cheaper.
And NO, your "encoding software" should NOT "deal with it." Deinterlacing prior to encoding gives you the following options: you can deinterlace to half-resolution and encode w/ a good bitrate but with poor quality, or you can deinterlace to full-resolution, but that'll require a doubled frame-rate and, obviously, doubled file-sizes. Interlaced sources should always be encoded to interlaced targets and deinterlaced during playback, preferably by a high-quality temporal/spatial filter @ a doubled frame-rate. Realistically speaking, you're only going to get that from a GPU (or a smokingly fast CPU running one of the newer software deinterlacers).
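To make the "full resolution, doubled frame rate" option concrete, here is a minimal illustrative sketch (not anyone's actual implementation) of the crudest possible bob deinterlacer for a single 8-bit luma plane: each interlaced frame is split into its two fields, the missing lines of each field are filled by averaging the neighbouring lines, and two progressive frames come out for every interlaced frame that goes in. The buffer layout and function name are assumptions; real players use motion-adaptive temporal/spatial filters (yadif, vector-adaptive deinterlacing on GPUs) rather than plain line averaging.

```c
#include <stdint.h>
#include <string.h>

/* Naive "bob" deinterlacer for one 8-bit luma plane (illustrative only).
 * src:  interlaced frame, width*height bytes, top field on even lines.
 * out0: progressive frame built from the top field (even lines).
 * out1: progressive frame built from the bottom field (odd lines).
 * Every input frame yields two output frames, i.e. a doubled frame rate. */
static void bob_deinterlace(const uint8_t *src, uint8_t *out0, uint8_t *out1,
                            int width, int height)
{
    for (int y = 0; y < height; y++) {
        uint8_t *own   = (y % 2 == 0) ? out0 : out1;  /* field this line belongs to */
        uint8_t *other = (y % 2 == 0) ? out1 : out0;  /* field missing this line    */

        /* the line exists in its own field: copy it through unchanged */
        memcpy(own + y * width, src + y * width, width);

        /* the other field has no sample here: average the lines above and
         * below, which belong to that field (clamped at the frame edges) */
        const uint8_t *above = src + (y > 0 ? y - 1 : y + 1) * width;
        const uint8_t *below = src + (y < height - 1 ? y + 1 : y - 1) * width;
        for (int x = 0; x < width; x++)
            other[y * width + x] = (uint8_t)((above[x] + below[x] + 1) / 2);
    }
}
```

Because the two fields were captured at different instants, playing the two output frames back at the field rate (50 or 59.94 fps) preserves the original motion, which is exactly what gets lost when the fields are collapsed into a single progressive frame before encoding.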
WHY do people write things that are flatly untrue?
electroju - Monday, June 28, 2010 - link
I agree, but all the DVD movies that I have are interlaced. Yes, even my latest movie from 2007 is interlaced. I am sure that Blu-ray and HD-DVD are interlaced as well. Like you said, this interlaced content has to be played at double the frame rate and de-interlaced to view correctly on a progressive screen, though a 3:2 pull-up also has to be used to keep to the 24 fps frame rate of the movie, and this adds distortion. Not all codecs are compatible with interlaced content, so the video has to be deinterlaced. This means double the frame rate, and adding a 3:2 pull-up if it needs it.
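As an aside on the 3:2 cadence mentioned above, the arithmetic is simple enough to show in a few lines (purely illustrative, not tied to any particular player): 24 film frames per second are spread over NTSC's roughly 60 fields per second by alternately holding a frame for 2 fields and then 3, and inverse telecine ("pullup") recovers the original 24 fps by discarding the repeated fields.

```c
#include <stdio.h>

/* Illustrative 2:3 pulldown arithmetic: film at 24 fps mapped onto NTSC's
 * ~60 fields per second, then recovered by inverse telecine. */
int main(void)
{
    const double film_fps    = 24.0;  /* progressive film frames per second     */
    const double ntsc_fields = 60.0;  /* NTSC fields per second (actually 59.94) */

    double fields_per_frame = ntsc_fields / film_fps;  /* 2.5: the 2,3,2,3 cadence  */
    double interlaced_fps   = ntsc_fields / 2.0;       /* 30 interlaced frames/sec  */

    printf("average fields per film frame: %.1f\n", fields_per_frame);
    printf("resulting interlaced rate:     %.1f frames/s\n", interlaced_fps);
    printf("inverse telecine recovers:     %.1f frames/s\n", film_fps);
    return 0;
}
```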
There are dozens of deinterlacing algorithms, and no single one suits every kind of content. The only programs that I know of that include most of the deinterlacing algorithms are dscaler and tvtime.
None of the GPUs that I know of actually increases the frame rate, so you are back where you started. To do it right, the post-processing has to be done on the CPU, though there is a compromise between loading the CPU to 100% without using the GPU, or using the GPU with only a fraction of the CPU being utilized.