ASUS UL80Vt-A1 Thin and Light Notebook Review
Power Consumption and Battery Life
Power Consumption
At idle, the ASUS UL80Vt with Intel’s CULV processor consumed only 8 watts while using the integrated graphics, and enabling the NVIDIA GeForce G210M video card added just 4 watts, bringing the grand total to 12 watts at idle. To test the notebook under load, we took readings from our Seasonic power meter while running 3DMark06, again with both the integrated and the discrete graphics. With the integrated graphics, the notebook consumed only 18 watts; once we switched on the video card, that figure nearly doubled to 33 watts. For comparison, an energy-saver light bulb draws about 26 watts to emit the same amount of light as a 100-watt bulb. If you are running on the integrated graphics, the notebook is using less power than the light bulb you’re reading by! Even if you switch on the discrete graphics for gaming or watching movies, you still can’t complain about this notebook’s energy usage. I have to give props to ASUS for coming out with such an energy-efficient machine that doesn’t force you onto a cramped keyboard where you hit multiple keys at once, or leave you squinting at a tiny screen.
Battery Life
To test the battery life, we let the computer sit at idle with no screensaver and measured how long it took to drop into standby mode. To get an idea of how long the battery would last under a workload, we played a movie in the same fashion. The results are shown in the graph below.
Results: With Intel’s new CULV processor, the ASUS UL80Vt blew everything away when it comes to idle battery life! Almost 13 hours at idle is unbelievable. For the UL80Vt we left the automatic, power-detecting graphics switching enabled, so the system switched from the Mobile Intel 4 Series Express graphics to the NVIDIA GeForce G210M video card for the movie playback. That switch clearly taxed the system, as we got less than 50% of the maximum battery life out of it. Still, over 5 hours of battery while playing a movie is nothing to sneeze at, and for many users the quality improvement of the discrete graphics is worth the battery sacrifice. While using the integrated graphics, the characters looked as if they belonged in a paint-by-number picture; even with the NVIDIA card the picture still wasn’t crystal clear, but it certainly gave the video quality quite a boost.