Massive GPU Power Draw

I posted about this issue long ago, when the game was still in its infancy (pre patch 1). I’m a little saddened that the latest update still hasn’t addressed the very high GPU power draw, but I’m curious to see what it’s doing for people with other video cards.

Before, I only looked at the GPU load, but now I’m measuring overall power draw for the card itself. My GTX 1080 is nearly maxed on its power draw, sitting at an average of 170 watts but sometimes shooting over 180 as things load in. This is when running the game at 60 FPS. The load kicks in right at the title screen too (which seems odd to me).

I remember some people reporting 90-100% GPU usage on past builds while using cards like the RTX 3070 & RTX 4090. So I’m curious if anyone in here with a newer card than mine can use GPU-Z & report what your board power draw is. I’m curious if it goes over mine or if this is just a limitation of an older card (which shockingly holds its own with modern titles like Dragon’s Dogma 2).
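If anyone wants to report a number without installing GPU-Z, board power can also be read straight from the NVIDIA driver. A minimal sketch, assuming an NVIDIA card with `nvidia-smi` on the PATH; the `parse_power_draw` helper is just for illustration:

```python
import subprocess

def parse_power_draw(line: str) -> float:
    """Convert one line of nvidia-smi power output (e.g. '170.25 W') to watts."""
    return float(line.strip().split()[0])

def read_board_power() -> float:
    """Query the current board power draw in watts via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_power_draw(out.splitlines()[0])
```

Running `read_board_power()` in-game should give roughly the same number GPU-Z shows for board power.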

Lots of misunderstanding to unpack here.

Games do not directly affect power draw. The GPU will draw power based on its workload and its configuration.

Workload depends on the game settings: resolution, refresh rate, and various graphics settings. Configuration is not only the hardware itself (a 1080 in this case) but how it’s set up. If you, for example, overclock the card, it’ll draw more power to ostensibly do more work in the same amount of time.

In this case, you have a pretty dang old GPU that looks to be fully utilized while running the game, so it’s drawing as much power as it can given the throttling rules applied to it. Something worth mentioning here is that 170W is not that close to max: a GTX 1080 can pull up to 225W if you really want it to saturate the 8-pin connector plus the PCIe rail.
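To put numbers on that headroom, here's the quick arithmetic (taking the 225W figure above as the practical board limit for a single 8-pin GTX 1080, which is a rough assumption rather than an official spec):

```python
def pct_of_limit(draw_w: float, limit_w: float) -> float:
    """What fraction of the board power limit a given draw represents, in percent."""
    return 100.0 * draw_w / limit_w

# ~76% of the limit at the reported 170W average, 80% at the 180W spikes
print(round(pct_of_limit(170, 225), 1))  # 75.6
print(round(pct_of_limit(180, 225), 1))  # 80.0
```

So even the spikes leave roughly 20% of the board's power budget untouched.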

Regardless, you are also running a card whose stock clock is 1733 MHz at 1835 MHz in this screenshot, so this is an overclocked card (I assume factory OC’d), and the power profile seems to be reflecting that.


All this is to say the following:

• This isn’t a problem. This is how we expect this to operate
• That card is old, and seeing it tapped out here is unsurprising
• The game has nothing to do with the power profile of the card

And if you’d like a point of comparison, my 3090, which I have pretty aggressively undervolted, pulls about 250W at 1440p with an uncapped frame rate, but by just turning voltage and clock up a bit I could easily make that 300W+.

It seems pretty unusual for my card (and apparently others’) to undergo constant 100% usage only with this game. I’ve left the game at its stock settings for this, but I’ve also gone so far as to turn everything to minimum at 30 FPS. The 30 FPS cap is the only thing that reduces the GPU load, down to something around 70% (which is still obscenely high compared to anything else I’ve ever had it do, modern or old).

For comparison: when I play Dragon’s Dogma 2 at max settings (giving me around 30 FPS, but gorgeous graphics), my card doesn’t go much over 30% usage. It’s like that for all modern games when set to 1440p, max graphics, at 60+ FPS.

Not really sure what this game is doing that no other game does, but something is really weird about how the GPU is handling stuff for this game.

I’ve spoken with a friend who has done a bunch of freelance game development (small & big studio) & his only thought is that this game is using unoptimized code.

This isn’t necessarily wrong on its face, but in this case, it’s nonsense. ‘Modern games’ are not created equal, and a discussion about sim vs. render needs to take place to get into the weeds about where problems may show up for a given title on a specific system.

This is going to quickly spiral into a discussion about what it means to render a frame, what’s actually expensive to render, and how bottlenecks manifest in the render/sim pipeline, and it’s not interesting at all.

But that said, the above is still true, and while I don’t work on the game and am not privy to its technical specifics, this very much looks like expected behavior. There isn’t anything obviously wrong with the game or your card here.

I feel like this happens for a lot of Unity games, even purely 2D Unity games. My PC draws more power & runs hotter in a 2D Unity game than in a 3D Unreal game.