Nvidia RTX 4000 series

Am I right in understanding that the general consensus is that the 4080 feels more like a higher-tier 3000-series card, because its pricing basically scales with the performance of that line, rather than offering a jump in performance for the same cost?
 
More or less, yes. That seems to be the case across the lineup, with the expected diminishing returns on the increased cost as you move up to the 4090.

Hysterical people in tin foil hats claim this is a careful 4D chess move by Nvidia to get rid of excess 30-series stock. I just think they saw what scalpers did and adjusted prices accordingly.
 

Let's see how the AMD RX7000 series cards perform. Even though I would not buy an AMD card, they should drive down the prices of the RTX3000 series cards. If an RTX3090 hits $600 used, I'll probably get one as an upgrade. My monitor is only 2.5K resolution, so there's no need for the RTX4000 cards.
 

You already bought AMD graphics in the Steam Deck, so why not a card?
 
The Steam Deck is a semi-custom APU, and I play mostly in SteamOS anyway. I've had a pretty bad experience with AMD's Windows drivers. Even on the Steam Deck, when I first installed Win10 on the SD card, the Deck couldn't boot into Windows after installing the APU driver, and I had to wipe the SD card and start the process over. It turned out my Win10 image was old and the APU driver wasn't compatible with it. Drivers shouldn't be Windows-version dependent like that, and if they have to be because of newer features, the installer should have warned me about installing the APU driver on an unsupported version of Win10. I'm willing to deal with this kind of issue on a $530 device running a secondary OS. I'm not willing to deal with it on a $600-700 GPU on my primary OS.
 
[attached image: 1668936043339.png]
 
666W !!!!!!!!
666w? They lost the opportunity to call it, like, "The Power of Lucifer Edition" or something.
 
Possibly a 4090 Ti. It's so tall they may be stacking the outputs vertically rather than in line. Possibly 800 W.

[attached image: 1675139171848.png]
 
That’s just stupid…
 

At that point the computer becomes a daughterboard for the GPU. Nvidia's really out there trying to out-NetBurst Intel.
 
IMO that was already true for the 4090/4080... and for a huge bunch of other processors, GPUs, etc.
Nobody seems to give a shit about power consumption or efficiency, even when it reaches frankly absurd levels (a 600+ W GPU?????). Usually you'd say bumping power consumption up by 30% to get a net 3% increase in FPS is a pretty dumb trade - but that's exactly what Nvidia has been doing with the 4000 series cards. Those things are capable of incredible efficiency if you actually limit their power draw to maybe 300 W or even less! But no, they have to get the last few FPS out of them, because that seems to be the only thing that sells.
Honestly the entire industry is garbage at this point. Morons. It makes me mad.
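
To put rough numbers on that trade-off, here's a quick sketch using the 30%-less-power / 3%-fewer-FPS figures from the post; the 450 W / 100 FPS baseline is just an assumed starting point for illustration, not a measurement:

```python
# Rough illustration of the power-limit trade-off described above.
# The baseline numbers are assumed; only the 30% / 3% ratios come from the post.
baseline_power_w = 450.0   # assumed stock power draw
baseline_fps = 100.0       # assumed stock frame rate

limited_power_w = baseline_power_w * (1 - 0.30)  # 30% lower power limit
limited_fps = baseline_fps * (1 - 0.03)          # ~3% fewer FPS

stock_eff = baseline_fps / baseline_power_w      # FPS per watt, stock
limited_eff = limited_fps / limited_power_w      # FPS per watt, power-limited

print(f"stock:   {stock_eff:.3f} FPS/W")
print(f"limited: {limited_eff:.3f} FPS/W "
      f"({(limited_eff / stock_eff - 1) * 100:.0f}% better perf-per-watt)")
```

With those assumed numbers the power-limited card comes out roughly 39% better in FPS per watt, which is the kind of gap being complained about here.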

Yeah... and Nvidia seems to be out-NetBursting Intel even though Intel themselves are trying to do exactly that with their current processor lineup as well! A 250 W mainstream desktop CPU, anyone? YEAH, WHY NOT. STUPID CLOWNS
 
Total wattage doesn't tell the whole story though, at least with CPUs. For both Intel and AMD (with a couple of exceptions), the latest processors are more efficient than the previous generation: they draw more power at full load, but they complete a job faster and use less total energy for the same workload.

I suspect that isn't the case for these newer CPUs, however.
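
As a toy illustration of that "higher peak draw, less total energy per job" argument - all the wattages and runtimes below are invented, only the energy = power × time reasoning matters:

```python
# Toy comparison: energy used to finish the same workload.
# All numbers are made up for illustration, not benchmark results.
old_power_w, old_time_s = 150.0, 100.0   # assumed previous-gen chip
new_power_w, new_time_s = 250.0, 50.0    # assumed current-gen chip (faster, hungrier)

old_energy_j = old_power_w * old_time_s  # energy = power * time
new_energy_j = new_power_w * new_time_s

print(f"old gen: {old_energy_j:.0f} J for the job")
print(f"new gen: {new_energy_j:.0f} J for the job")
# Despite the higher full-load draw, the faster chip can use less total energy.
```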
 
the latest processors are more efficient than the previous generation.
Oh, totally true, you're right!
However, especially with GPUs (but I suspect with CPUs as well), they're just run far outside their sweet spot to get more computing power out of them. They could be far more efficient still if their power consumption were reined in a bit. I've actually seen tests/benchmarks of exactly that for the 4000-series Nvidia GPUs, showing the numbers I quoted above (30% less power draw for 3% fewer FPS), and IIRC something similar for the new 7000-series AMD CPUs, where artificially limiting the 105 W TDP class to 65 W (I know TDP isn't the whole story anymore, and those 250 W Intel CPUs only draw that for a minute or so at most) slowed them down by single-digit percentages as well, making the 65 W operating point far more efficient.

Point being: if you want the most efficient CPU & GPU in your system, you'd have to get a high-end chip and then actively limit its power draw by 30-40% - which I don't think is something the average user is likely to do.
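
For the GPU half of that, a minimal sketch of how one might apply such a cap on an Nvidia card is below, assuming nvidia-smi is installed and the script runs with admin/root rights; the 300 W figure is just the example value from earlier in the thread, and your card's supported range may differ (it can be checked with nvidia-smi -q -d POWER):

```python
# Minimal sketch: cap an Nvidia GPU's power limit via nvidia-smi.
# Assumptions: nvidia-smi is on the PATH and this runs with admin/root rights.
# The 300 W target is only an example; check your card's supported range first.
import subprocess

TARGET_WATTS = 300  # example value from the thread, not a universal recommendation

# On Linux, enabling persistence mode first may help the setting stick.
subprocess.run(["nvidia-smi", "-pm", "1"], check=False)

# Set the board power limit (in watts).
subprocess.run(["nvidia-smi", "-pl", str(TARGET_WATTS)], check=True)
```

The CPU side is usually done through the motherboard's power-limit settings instead, which is firmware-specific rather than scriptable in a uniform way.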
 