New thing: what is with the insane power draw that Intel and Nvidia especially are giving their chips these days? It's like they're chasing the last possible bit of performance, energy efficiency be damned. Latest example: Nvidia's new RTX 4090.
A German tech site ran a comparison* across ~20 games of performance at 450 W (the official TDP), 350 W, and 300 W - and the bloody stupid thing is being run FAR above its sweet spot. Honestly: anyone who runs this thing at the stock TDP is a bloody moron.
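(If you want to try this yourself, capping the power limit is one line: sudo nvidia-smi -pl 350. Below is a minimal Python sketch of the same thing via Nvidia's official NVML bindings - the nvidia-ml-py package, imported as pynvml. It needs root/admin, and the 350 W figure is just the middle value from the test above, not a recommendation.)

import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# NVML reports power limits in milliwatts
default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(gpu)
print(f"default power limit: {default_mw / 1000:.0f} W")

# cap the card at 350 W (raises NVMLError_NoPermission without root/admin)
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, 350_000)
new_mw = pynvml.nvmlDeviceGetPowerManagementLimit(gpu)
print(f"new power limit: {new_mw / 1000:.0f} W")

pynvml.nvmlShutdown()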
[numbers are normalized across those ~20 games and expressed as % of max - i.e. the 99.9 values should read as 100%]
How is it acceptable for this product to deliver a measly 3% performance increase for a whopping ~30% increase in power draw? It's so bloody stupid, it's beyond belief. Even worse: even without manually lowering the TDP, this card still has by far the best performance per watt of any current GPU - they could've built an absolute efficiency juggernaut, but no.
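To put numbers on that claim (using this post's own rough figures - ~100% normalized performance at 450 W and ~97% at 350 W; see the linked test for the exact data):

# perf is normalized performance (% of max), keyed by power limit in watts
# figures are the rough ones from the rant above, not exact test data
perf = {450: 100.0, 350: 97.0}

extra_power = (450 - 350) / 350 * 100                    # ~28.6% more power...
extra_perf = (perf[450] - perf[350]) / perf[350] * 100   # ...for ~3.1% more performance

eff_450 = perf[450] / 450   # ~0.222 perf-points per watt
eff_350 = perf[350] / 350   # ~0.277 perf-points per watt

print(f"{extra_power:.1f}% more power for {extra_perf:.1f}% more performance")
print(f"perf/W at 350 W is {eff_350 / eff_450 - 1:.0%} better than at 450 W")

So dropping the limit by 100 W costs ~3% performance and buys roughly 25% better performance per watt.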
This kinda shit makes me all sorts of mad. It's not just the increased energy cost that people have to pay (admittedly not much of an argument with a 2000€ GPU), it also makes the entire system larger than it needs to be, louder than it needs to be, and leads to a bunch of other issues down the line (like some of the new 12VHPWR power connectors just MELTING when put under any sideways stress at all, like from closing the side panel of your case maybe????). Ugh.
* link for anyone curious - it's all in German though:
https://www.computerbase.de/2022-10/nvidia-geforce-rtx-4090-review-test/