PRIME and battery usage: sometimes it's not what it seems

PRIME is a technology used to manage hybrid graphics. It was meant to be a resource saver: you can configure the system to use only the integrated GPU and offload rendering to the dedicated GPU whenever it is needed. But that is not what happened in my real-life situation.

What happened?

I was not happy with how much battery my laptop was draining. Render offload via prime-run, delegating GPU work to my dGPU (an NVIDIA MX150), drained a lot, but so did everyday tasks (terminal, browser) running on the iGPU (Intel, i915 driver). Overheating was also a problem, so I started to investigate.
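
For context, prime-run is a tiny wrapper script (shipped by Arch's nvidia-prime package) that sets NVIDIA's render-offload environment variables before launching the program. A rough sketch of what it does (compare with the actual script at /usr/bin/prime-run on your system):

```shell
#!/bin/sh
# Route GLX (and Vulkan) rendering to the NVIDIA GPU, then run the program.
__NV_PRIME_RENDER_OFFLOAD=1 \
__GLX_VENDOR_LIBRARY_NAME=nvidia \
__VK_LAYER_NV_optimus=NVIDIA_only \
exec "$@"
```

So `prime-run firefox` is essentially Firefox launched with those variables set.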

Running powertop -t 3 showed that Firefox was draining as much as 800 mW per process (tab) in the Power Est. column, while the i915 module alone was using about 150 mW. That got me wondering whether using the dedicated GPU to render would improve things, but it didn't. Launching Firefox with prime-run reduced the power usage per tab a little (opening the same websites), but the Intel module was still draining almost the same amount (145 mW) while the nvidia module was using 35 mW.
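
Before comparing numbers, it's worth confirming that render offload is actually in effect; glxinfo (from mesa-utils) reports which GPU is answering the GLX calls:

```shell
# A default run should report the Intel iGPU as the renderer:
glxinfo | grep "OpenGL renderer"
# Under render offload it should report the MX150 instead:
prime-run glxinfo | grep "OpenGL renderer"
```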

Another thing that bugged me was that my system always started with a lot of RAM already in use (900 MB) on a simple i3 setup. Could it be that my iGPU was already allocating a lot of that memory to itself?

After that, I decided to try some 3D, and Veloren was the game I chose. These were the metrics captured with my status bar and powertop:

  • Using the i915 driver:

    • Driver drain reached 400 mW.
    • 1.9 GB RAM used.
    • Expected battery duration after a full charge was 1:28.
  • Using the nvidia driver with prime-run:

    • The i915 driver was still draining approximately 200 mW, sometimes spiking to 400 mW.
    • The nvidia driver was using about 50 mW, spiking to 80 mW once in a while.
    • 1.7 GB RAM used.
    • About 140 MB of GDDR used.
    • Expected battery duration after a full charge was 1:12.
    • nvidia temperature reached 73 °C.
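
The dGPU-side numbers (GDDR in use and temperature) can also be watched directly with nvidia-smi, which ships with the proprietary driver:

```shell
# One-shot query of dGPU memory in use and core temperature, as bare CSV;
# repeat it (or add `-l 1` to loop every second) while the game runs.
nvidia-smi --query-gpu=memory.used,temperature.gpu --format=csv,noheader
```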

I know this doesn't seem like a fair test, comparing a dGPU with an iGPU while running a game. The point I was trying to make was: using render offload didn't really reduce resource consumption at all. On the contrary, it seemed to waste resources by doubling energy consumption, due to the binding the render offload feature creates between the two modules.

Using Nvidia only. Let's see what happens.

I decided to try a new approach and use the Nvidia graphics only. Setup was pretty straightforward, and after rebooting I launched Firefox with the same sites: Power Est. was about 570 mW per process. Good news. Let's try Veloren again:

  • Using the nvidia driver only:
    • The nvidia driver was using about 120 mW.
    • Expected battery duration after a full charge was 1:47.
    • 1.3 GB RAM used.
    • About 140 MB of GDDR used.
    • nvidia temperature reached 67 °C.
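
For the curious, "Nvidia only" here means making X treat the dGPU as the primary GPU. A minimal sketch of the Xorg snippet (an assumption about the general recipe, not my literal config; the file layout varies by distro, and the BusID must match your own lspci output):

```
Section "Device"
    Identifier "NVIDIA Card"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
EndSection
```

On Optimus laptops where the panel is wired to the iGPU, the session usually also needs `xrandr --setprovideroutputsource modesetting NVIDIA-0` followed by `xrandr --auto` early on.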

That’s a lot better. I also noticed that my system was using only 500 MB of RAM after a fresh start (a 400 MB difference).
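
An easy way to reproduce that fresh-start comparison is to check used memory right after login:

```shell
# Print the "used" column of free's Mem row, in MB:
free -m | awk '/^Mem:/ {print $3 " MB used"}'
```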

Lesson learned: try things out for yourself when it comes to power management.

Bonus round:

Other things that changed on my system after a weekend spent optimizing energy usage:

  • Not using bumblebee-status anymore. It’s a great bar, full of useful modules, but it was creating some weird spikes in power usage. I migrated to i3status-rust, and now my bar isn’t even listed among the top 20 power offenders.
    • All previous tests were done using the same bar.
  • telegram-desktop is a mess in terms of power and CPU usage, and I’m seriously considering ditching it and using its web version only. Firefox is software I already use, so there’s nothing to lose.
  • Try to get used to a lightweight browser like qutebrowser while on battery, and stop using my bookmark sync of choice. I still have to test this, since the lack of video acceleration could be a caveat.
  • Find out why, while using specific software, pulseaudio goes crazy and spikes to 4 W of estimated power usage.
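
On the pulseaudio front, one thing I still want to test (an assumption, not a verified fix) is disabling timer-based scheduling, a common culprit for audio-related wakeups:

```
# /etc/pulse/default.pa — load udev detection with timer scheduling off:
load-module module-udev-detect tsched=0
```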