Investigating NVIDIA's BatteryBoost with MSI GT72
by Jarred Walton on October 23, 2014 9:00 AM EST

BatteryBoost: Gaming Battery Life x 3
First things first, let's talk about the typical FPS (Frames Per Second) before we get to BatteryBoost. Plugged in, GRID Autosport at our 1080p High settings will average around 145 FPS, making it a good candidate for BatteryBoost. Perhaps more importantly, even when running on battery power GRID Autosport is able to average nearly 120 FPS (117 to be precise). Obviously, enabling BatteryBoost FPS targets will result in the average FPS equaling the target – 30, 40, 50, or 60 is what we tested – and testing without BatteryBoost but with VSYNC will result in a 60 FPS average.
As for the other games, Tomb Raider at High (not the benchmark, but actually running the full game for our battery test) gets around 107 FPS on AC power but drops to 70-73 FPS on battery power, so we wouldn't expect nearly as much of a benefit from BatteryBoost, especially at the 60FPS target. Borderlands: The Pre-Sequel falls roughly between those two, getting 135-140 FPS on AC power and dropping to around 88 FPS on battery power.
If BatteryBoost is simply benefiting from lower FPS, our VSYNC results should be the same as the BatteryBoost 60FPS results, but as we'll see in a moment that's not the case. Figuring out exactly what NVIDIA is doing is a bit more complex, and we'll discuss it further on the next page. First, let's start with GRID Autosport and run detailed tests at 10FPS intervals to see how battery life scales.
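As an aside, NVIDIA hasn't documented exactly how BatteryBoost enforces its frame rate target (it lives in the driver and coordinates with CPU/GPU power states), but the core idea of a frame rate cap is easy to sketch. The C++ snippet below is a minimal, hypothetical illustration – not NVIDIA's code – of a 30FPS limiter: the loop sleeps off whatever remains of each ~33ms frame budget instead of immediately starting the next frame, which is what lets the hardware drop into lower power states.

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

// Stand-in for a game's per-frame update/draw work (~7ms here, i.e. ~145 FPS uncapped).
static void RenderFrame() { std::this_thread::sleep_for(std::chrono::milliseconds(7)); }

int main()
{
    using clock = std::chrono::steady_clock;
    const double targetFps = 30.0;
    const auto frameBudget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / targetFps)); // ~33.3 ms per frame at 30 FPS

    auto nextDeadline = clock::now() + frameBudget;
    for (int frame = 0; frame < 300; ++frame) // roughly ten seconds of "gameplay"
    {
        RenderFrame();
        // Sleep off the unused portion of the frame budget. A GPU capable of ~145 FPS
        // finishes each frame early, so most of each interval is spent idle here --
        // and idle time is where the power savings come from.
        std::this_thread::sleep_until(nextDeadline);
        nextDeadline += frameBudget;
    }
    std::puts("done");
    return 0;
}
```

In practice the driver has more levers to pull than a simple sleep (clock/voltage management, reduced pre-rendering, and so on), which is presumably part of why BatteryBoost at 60FPS outlasts plain VSYNC at 60FPS in our results.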
Interestingly, while BatteryBoost on its own is able to do a good job of improving battery life – the GT72 goes from just 55 minutes without BatteryBoost to 112 minutes with a 30FPS target – tacking on VSYNC adds a bit more battery life on top of that. Our best result is at 30FPS with VSYNC enabled, where the GT72 manages 124 minutes, just surpassing NVIDIA's target of two hours. Of course, you'll probably want to stop a few minutes earlier to make sure your game progress is saved (if applicable), and 30FPS isn't the best gaming experience. Moving to higher FPS targets, BatteryBoost offers diminishing returns, but that's more or less expected. Even at 60FPS, however, BatteryBoost still manages 90 minutes compared to 70 minutes without BatteryBoost but with VSYNC.
Given VSYNC appears to help even when BatteryBoost is enabled, for our remaining tests we simply left VSYNC on (except for the one non-BatteryBoost test). We ended up running four tests: no BatteryBoost and without VSYNC, no BatteryBoost but with VSYNC, and then BatteryBoost at 60 and 30 FPS targets with VSYNC. Here's the same data from the above chart, but confined to these four test results.
GRID shows a nice, almost linear progression going from 30FPS to 60FPS to no BatteryBoost with VSYNC, and then finally to fully unconstrained performance. What's interesting is that the other two games we tested don't show this same scaling…
Borderlands: The Pre-Sequel has lower frame rates by default, so BatteryBoost isn't able to help quite as much. Normal performance on battery power without VSYNC results in 54 minutes of gaming, which is pretty similar to the result with GRID Autosport. That actually makes sense, as in both games we're basically running the system as fast as it will go. Putting a 60FPS cap into effect via VSYNC only improves battery life by a few minutes, while tacking on BatteryBoost with a 60FPS target gets us up to 62 minutes. Since we're starting at just under 90FPS with no frame rate cap, the smaller gains in battery life with a 60FPS target aren't a surprise, but the very modest 15% improvement is less than I expected. Dropping to a 30FPS target, we're not quite able to get two hours, but we come quite close at 112 minutes – essentially double the battery life compared to running at full performance.
Last is Tomb Raider, and as the game with the lowest starting FPS (on battery power) I expected to see the smallest gains in battery life. Interestingly, battery life without BatteryBoost and VSYNC starts at 57 minutes, slightly more than the other two games, but Tomb Raider is known for being more of a GPU stress test than something that demands a lot of the CPU, so perhaps the Core i7-4710HQ just doesn't need to work as hard. Turning on VSYNC does almost nothing (the one minute increase is basically within the margin of error), and BatteryBoost targeting 60FPS is only slightly better (six minutes more than without BatteryBoost). Once we target 30FPS, the end result is about the same as Borderlands TPS: 113 minutes, just missing a 100% improvement in battery life.
Just for kicks, I ran a separate test with Tomb Raider using 1080p and Normal quality with the BatteryBoost 30FPS setting to see if I could get well over two hours by further reducing image quality. While there's still a lot going on that requires power from the system – remember we're dealing with a 45W TDP CPU and around a 100W maximum TDP GPU, plus various other components like the motherboard, LCD, storage, and RAM – at these moderate quality settings I was able to get 125 minutes out of Tomb Raider.
In essence, the less work the GPU has to do and the higher the starting frame rates, the more likely BatteryBoost is to help. It's little wonder then that NVIDIA's discussion of BatteryBoost often makes mention of League of Legends. The game is definitely popular, and what's more it's fairly light on the GPU. By capping FPS at 30 it's easy to see how such a light workload can reach into the 2+ hour range. Interestingly, with Tomb Raider managing 2.08 hours at Normal quality, and given the GT72 uses an ~87 Wh battery, that means the power draw of the notebook during this test is only around 41-42W – not bad for a notebook with a theoretical maximum TDP (under AC power) of roughly 150W.
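For reference, that estimate is simply the battery capacity divided by the measured runtime (ignoring any capacity reserved by the low-battery cutoff):

$$ P_{avg} \approx \frac{E_{battery}}{t} = \frac{87\ \text{Wh}}{2.08\ \text{h}} \approx 41.8\ \text{W} $$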
26 Comments
WinterCharm - Thursday, October 23, 2014 - link
So it's essentially V-sync to 30 fps :P

III-V - Thursday, October 23, 2014 - link
It's a bit more than that. Read the article.

spencer_richter - Tuesday, November 25, 2014 - link

It's not as good as the top laptops on the market (see the rankings at ). For example the ASUS ROG G750JM-DS71 is a lot better for gaming. http://www.consumer.com/

nathanddrews - Thursday, October 23, 2014 - link
With regular, old, dumb v-sync, additional frames are still rendered by the GPU, but select frames are only delivered from the frame buffer when ready to be synchronized to the monitor - it's not very efficient. BatteryBoost attempts to render only 30fps (or whatever the target is) to save power, and appears to succeed... somewhat.

looncraz - Thursday, October 23, 2014 - link
Not on my system for the games I've tried. VSync reduces my GPU usage while reducing frame rate. Then again, I've only tried a few games... But my own rendering engine accumulates (and even merges) changes until the next rendering time window, as directed by either the screen refresh or processing capability (i.e. the render control thread doesn't initiate a frame render until the monitor can show it if VSync is enabled, or immediately once the last frame is completed if it is disabled). There just isn't a logical reason to do it any other way.
nathanddrews - Thursday, October 23, 2014 - link
I wonder if power usage is at all related to the "pre-render max frames" setting?

OrphanageExplosion - Thursday, October 23, 2014 - link
Assuming there are BatteryBoost profiles for each game, couldn't it simply be dialling down quality settings where you're not likely to be able to tell the difference between, say, high quality shadows and normal quality shadows?

JarredWalton - Thursday, October 23, 2014 - link
Note that I did not run with the "recommended" BatteryBoost settings for the various games; I ran with specific settings and kept those constant. GeForce Experience does have suggestions that sometimes match my settings...and sometimes not. :-)

inighthawki - Thursday, October 23, 2014 - link
By default, at least on Windows, this is not true. When vsync is enabled, frames are queued to be presented at a particular interval. They are never discarded. This queue has a max height - typically 3 frames, but normally configurable by the game. After three frames, any present calls by the game will be blocked on the thread until a VBlank occurs and a frame is consumed.

It is possible to get the behavior you're referring to if the target operating system supports it, and the game uses triple buffering. In this case, you can have a front buffer being displayed while the other two back buffers are used in a ping-pong fashion. At the vblank, the OS can choose to use the most recently fully rendered frame. Windows chooses not to do this for the exact power reasons described above. The advantage of doing it this way is you reduce a minor amount of latency in exchange for keeping your GPU pegged at 100% utilization.
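To visualize the queueing behavior described above, here's a minimal C++ sketch of a bounded present queue: a "game" thread that can finish a frame every 5ms but blocks once three frames are queued, and a "display" thread that consumes one frame per ~16ms vblank. The numbers and thread structure are made up for illustration – this is a toy model, not the actual Windows/DXGI presentation path – but it shows why a fast renderer ends up pacing itself to the refresh rate once the queue fills.

```cpp
#include <chrono>
#include <condition_variable>
#include <cstddef>
#include <cstdio>
#include <mutex>
#include <queue>
#include <thread>

// Toy model of a vsync'd present queue (not the real Windows presentation path).
constexpr std::size_t kMaxQueuedFrames = 3;                      // typical pre-rendered frames limit
constexpr auto kVBlankInterval = std::chrono::milliseconds(16);  // ~60 Hz refresh

std::mutex m;
std::condition_variable cv;
std::queue<int> presentQueue;
bool done = false;

// "Game" thread: renders as fast as it can, but its Present() blocks once the queue is full.
void GameThread()
{
    for (int frame = 0; frame < 60; ++frame)
    {
        std::this_thread::sleep_for(std::chrono::milliseconds(5)); // "render" (~200 FPS capable)
        std::unique_lock<std::mutex> lock(m);
        cv.wait(lock, [] { return presentQueue.size() < kMaxQueuedFrames; });
        presentQueue.push(frame); // queued, never discarded
        cv.notify_all();
    }
    std::lock_guard<std::mutex> lock(m);
    done = true;
    cv.notify_all();
}

// "Display" thread: consumes exactly one queued frame per vblank.
void VBlankThread()
{
    while (true)
    {
        std::this_thread::sleep_for(kVBlankInterval);
        std::unique_lock<std::mutex> lock(m);
        if (!presentQueue.empty())
        {
            std::printf("scanned out frame %d\n", presentQueue.front());
            presentQueue.pop();
            cv.notify_all(); // unblocks the game thread's Present()
        }
        else if (done)
        {
            return;
        }
    }
}

int main()
{
    std::thread game(GameThread), vblank(VBlankThread);
    game.join();
    vblank.join();
    return 0;
}
```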
nathanddrews - Friday, October 24, 2014 - link
Since I have v-sync usually set to 96Hz, 120Hz, or 144Hz, I guess I never realize the power-saving benefits.