NVIDIA Optimus - Truly Seamless Switchable Graphics and ASUS UL50Vf
by Jarred Walton on February 9, 2010 9:00 AM EST

ASUS UL50Vf Overview
The ASUS UL50Vf is essentially the Optimus version of the UL50Vt, and the UL50Vt is the 15.6" version of the UL80Vt we liked so much. To be honest, we are a lot more interested in the ASUS UL30Jc—a 13.3" Optimus CULV laptop with an optical drive (some models will even ship with Blu-ray support in the near future). Here are the specifications for the UL50Vf.
ASUS UL50Vf Specifications |
Processor | Intel Core 2 Duo SU7300 (2x1.3GHz, 45nm, 3MB L2, 800FSB, 10W); overclockable to 1.73GHz/1066FSB (Turbo33)
Chipset | Intel GS45 + ICH9M
Memory | 2x2GB DDR3-1066 (max 2x4GB)
Graphics | NVIDIA GeForce G210M 512MB (16 SPs, 606/1468/1580 core/shader/RAM clocks) and Intel GMA 4500MHD IGP, switchable via NVIDIA Optimus
Display | 15.6" glossy LED-backlit 16:9 768p (1366x768)
Hard Drive(s) | 320GB 5400RPM HDD
Optical Drive | 8x DVDR SuperMulti
Networking | Gigabit Ethernet, Atheros AR9285 802.11n
Audio | HD Audio (2 stereo speakers, headphone/microphone jacks)
Battery | 8-cell, 15V, 5600mAh (84Wh); "Up to 12 Hours"
Front Side | None
Left Side | Headphone/microphone jacks, 2 x USB, HDMI, flash reader (MMC/MS/MS Pro/SD), cooling exhaust, AC power connection
Right Side | 1 x USB 2.0, optical drive (DVDRW), Gigabit Ethernet, VGA, Kensington lock
Back Side | None
Operating System | Windows 7 Home Premium 64-bit
Dimensions | 15.4" x 10.4" x 1.05" (WxDxH)
Weight | 5.2 lbs (with 8-cell battery)
Extras | Webcam, 103-key keyboard with 10-key, flash reader (MMC/MS/MS Pro/SD), multi-touch touchpad, brushed aluminum cover, ExpressGate OS (8-second boot)
Warranty | 2-year global warranty, 1-year battery pack warranty, 1-year accidental damage, 30-day zero bright dot LCD
Pricing | $800 MSRP
Obviously, the motherboard required some changes to support Optimus; specifically, ASUS was able to remove the multiplexers and extra signal routing used in the UL50Vt design. However, those changes are all internal, and you can't see any difference looking at the exterior. Specifications remain the same as the UL50Vt/UL80Vt, and performance is virtually the same as the UL80Vt we tested. (There will be some minor differences due to the change in LCD size and the use of different drivers, but that's about it.)
Pretty much everything we had to say about the UL80Vt applies to the UL50Vf. The features are great, and Optimus makes it even better. You can overclock the CPU by 33% to improve performance, or you can run the CULV processor at stock speed and improve battery life. Unlike Optimus, changing the CPU speed doesn't happen on the fly (unfortunately), but it is a little easier than what we experienced with the UL80Vt. This time, instead of requiring a full system reboot, enabling/disabling Turbo33 only requires the system to enter suspend mode. In that sense, Turbo33 is sort of like switchable graphics gen2: it requires manual user intervention and takes 10 to 15 seconds to shift modes. Ideally, we would like to be able to switch the overclock without suspending, and even better would be the option to enable overclocking on AC power and disable it on DC power.
The UL50Vf carries over the aluminum cover on the LCD lid along with the glossy interior plastic and LCD. It also uses the same 1366x768 LCD resolution. Considering the larger chassis, we feel ASUS would have been better off increasing the LCD resolution slightly (1440x900 or 1600x900 would have been good), and we would have also appreciated a faster dGPU. With Optimus allowing the GPU to switch on/off as needed and a 15.6" chassis, we feel ASUS should have been able to get something like the GT 335/325M into the UL50Vf. After all, Alienware is managing to cram similar hardware into an 11.6" chassis with the M11x!
Before we get to the tests, we did encounter a few minor glitches during testing. First, we couldn't get x264 decode acceleration to work with the dGPU using Media Player Classic - Home Cinema. We could set the application to load on the discrete graphics, but MPC-HC apparently didn't know how to talk to the Optimus GPU and ended up running off the IGP. Since the GMA 4500MHD was more than capable of handling our 720p and 1080p x264 files, we're not too concerned with this issue. Another glitch is that CPU-Z refused to work; it would hang at the graphics detection stage. This isn't so much a problem with Optimus as a need for changes to CPU-Z—and very likely some other low-level tools that talk directly to the graphics hardware. (We didn't try any overclocking or tweaking of the GPU on the UL50Vf, but we suspect it might be a bit trickier than normal.)
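As a rough illustration of how an application can ask for the discrete GPU under Optimus: NVIDIA documents an exported variable, NvOptimusEnablement, that its Optimus drivers look for when a process launches. Support for it depends on the driver release, so treat the following as a minimal sketch rather than a guaranteed fix for something like MPC-HC:

```cpp
// Minimal sketch: exporting this symbol from the .exe hints to the Optimus
// driver that the process should run on the high-performance (discrete) GPU.
// A value of 0x00000001 requests the dGPU; omitting the symbol (or using 0)
// leaves the decision to the driver's profiles. Harmless on non-Optimus systems.
extern "C" {
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
}

int main()
{
    // ...normal application and rendering code; nothing else changes.
    return 0;
}
```

End users can get a similar effect without recompiling by assigning an application to the discrete GPU in a custom profile, which is what we did for MPC-HC, although in this case the player still fell back to the IGP for decoding.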
Finally, when using the dGPU and playing games, we periodically noticed a slight glitch where the screen would flicker black for a frame. We couldn't come up with a repeatable test, but the problem may be related to the Copy Engine transferring incorrect data. It was not limited to any one title, though it occurred most frequently during our Empire: Total War testing (usually at least once every 60 seconds). It would hardly be surprising to find a few bugs in the NVIDIA drivers, and most likely this is one of them. We didn't find the occasional "flicker" to be a serious issue, and at present we don't have enough information to say what's causing it. We'll do some additional testing to determine whether the problem is limited to specific games or shows up across the board.
We've run an abbreviated set of tests with the UL50Vf. As mentioned, performance is virtually identical to the UL80Vt, the primary difference being the ability to immediately switch between discrete and integrated graphics as necessary. We will highlight both the old UL80Vt and the UL50Vf in our charts for comparison; you can see additional performance results for the UL80Vt in our previous review. All tests were conducted with the default graphics settings, so the discrete GPU is used when Optimus deems it beneficial and the IGP is used in all other cases. The gaming and general performance tests are run with Turbo33 engaged (33% CPU overclock) while battery testing was conducted at stock CPU speed.
49 Comments
jkr06 - Saturday, February 27, 2010 - link
From all the articles I read, one thing is still not clear to me. I have a laptop with a Core i5 (which has an IGP) and an NVIDIA 330M. So can I utilize the Optimus solution with just software, or do the laptop manufacturers specifically need to add something to truly make it work?

JonnyDough - Friday, February 19, 2010 - link
was a wasted effort. Seems sort of silly to switch between two GPUs when you can just use one powerful one and shut off parts of it.

iwodo - Thursday, February 18, 2010 - link
First, the update: they should definitely set up something like the Symantec or Panda cloud database, where user input is stored, shared, and validated worldwide. The amount of games that needs to be profiled is HUGE, unless there is a certain simple and damn clever way of catching games. Inputting every single game's / needed app's exe name sounds insane to me. There has to be a much better way to handle this.

I now hope Intel will play nice and get a VERY power-efficient iGPU inside Sandy Bridge to work with Optimus, instead of botching even more transistors for GPU performance.
secretanchitman - Saturday, February 13, 2010 - link
any chance of this going to the macbook pros? im hoping when they get updated (soon i hope), it will have some form of optimus inside. if not, something radeon based.

strikeback03 - Thursday, February 11, 2010 - link
As I don't need a dGPU, I would like to see a CULV laptop with the Turbo33 feature but without any dGPU in order to save money. Maybe with Arrandale.

jasperjones - Wednesday, February 10, 2010 - link
Jarred,

As you imply, graphics drivers are complex beasts. It doesn't make me happy at all that Optimus now makes them even more complex.
Optimus will likely require more software updates (I don't think it matters whether they are called "driver" or "profile" updates).
That puts you even more at the mercy of the vendor. Even prior to Optimus, it bothered me that NVIDIA's driver support for my 3 1/2 year old Quadro NVS 110m is miserable on Win 7. But, with Optimus, it is even more critical to have up-to-date software/driver support for a good user experience! Furthermore, software solutions are prone to be buggy. For example, did you try to see if Optimus works when you run virtual machines?
Also, I don't like wasting time installing updates. Why can't GPUs just work out of the box like CPUs?
Lastly, these developments are completely contrary to what I believe are necessary steps towards more platform independence. Will NVIDIA ever support Optimus on Linux? While I suspect the answer is yes, I imagine it will take a year and a half at the very least.
obiwantoby - Wednesday, February 10, 2010 - link
I think it is important to note that the demo video, even though it is a .mov, works in Windows 7's Windows Media Player. It works quite well, even with hardware acceleration.

Keep encoding videos in h.264; it works on both platforms in their native players.
No need for Quicktime on Windows, thank goodness.
dubyadubya - Wednesday, February 10, 2010 - link
"Please note that QuickTime is required" FYI Windows 7 will play the mov file just fine so no need for blowtime. Why the hell would anyone use a codec that will not run on XP or Vista without Blowtime is beyond me. For anyone wanting to play mov files on XP or Vista go get Quicktime alternative.beginner99 - Wednesday, February 10, 2010 - link
Did I read that right, HD video is always decoded on the dGPU even if the (Intel) IGP could deal with it? I mean, it sounds nice, but is there also an option to prevent certain apps from using the dGPU?

Or preventing the usage of the dGPU completely, like when one really needs the longest battery time possible? -> some users like to have control themselves.

Intel's IGP might offer worse quality with its video decode feature (but who really sees that on a laptop LCD?), but when travelling the whole day and watching movies, I would like to use as little power as possible.
JarredWalton - Wednesday, February 10, 2010 - link
It sounds like this is really just a case of the application needing a "real profile". Since I test x264 playback using MPC-HC, I had to create a custom profile, but I think that MPC-HC detected the GMA 4500MHD and decided that was perfectly acceptable. I couldn't find a way to force decoding of an .mkv x264 video within MPC-HC, but other video playback applications may fare better. I'll check to see what happens with WMP11 as well tomorrow (once I install the appropriate VFW codec).