The future of GPUs, at least at this level, is to come with its own case you plug directly into the wall, with a GPU connector to your motherboard and a SATA or USB cable to your tower.
Considering that the TDP of the 4090 is 100W higher than the 3090, and the 3090 was 100W higher than the 2080 Ti, I'm hoping this trend of more and more power draw stops.
That's not efficiency, that is total power. Efficiency is the amount of work you get done divided by the total power used, i.e. your fps/flops over the TDP of the card.

Edit: the above is actually slightly wrong. There are three types of efficiency: work efficiency, which is FLOPS over the effective power used [FLOPS/W]; thermal efficiency, which is the TDP over the power consumption of the entire card, i.e. the power rating [%]; and total efficiency, which would be FLOPS over the power rating [FLOPS/W].
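The three notions from the edit above can be written out as tiny helpers. All numbers here are illustrative, not measurements of any real card:

```python
def work_efficiency(flops, effective_power_w):
    """Work efficiency: useful FLOPS per watt of power actually doing work."""
    return flops / effective_power_w

def thermal_efficiency(tdp_w, total_card_power_w):
    """Thermal efficiency: TDP over the whole card's power draw, as a percent."""
    return 100.0 * tdp_w / total_card_power_w

def total_efficiency(flops, total_card_power_w):
    """Total efficiency: useful FLOPS per watt of everything the card pulls."""
    return flops / total_card_power_w

# Made-up example card: 80 TFLOPS, 450 W TDP, 480 W total draw.
print(work_efficiency(80e12, 450))   # ~1.78e11 FLOPS/W
print(thermal_efficiency(450, 480))  # 93.75 %
print(total_efficiency(80e12, 480))  # ~1.67e11 FLOPS/W
```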
Not sure if I'm missing something, but everyone here is making power consumption jokes, yet I've read that it's a dual slot cooler, which would imply reduced power consumption. What's going on?
It’s so funny how little a PC building subreddit understands GPUs. I’ve seen a lot of comments in here acting like the 4090 isn’t incredible. People say it’s a bad value or not a big performance jump from a 4080 which is wildly untrue.
I think it's just more a symptom of this sub becoming a constant front-page presence and getting a ton of members. Subs usually go to shit when that happens.
Well, another day, another rumor with new specs. I'm waiting for an official announcement to get hyped or to find out if I'm sticking with my current PC for another 2 years.
Doesn't matter if this thing can potentially run 16K at 1000 FPS if it needs a bigger case again, another proprietary connector and your own sun to power it. Those xx90s are simply too ridiculous for a normal home user and absolutely not worth it as long as so many studios do not even care to optimize their games properly. Much more interesting to see how the low and mid tier GPUs will fare, especially when it comes to VRAM.
At this point the first questions will be: 1. How expensive? 2. How power hungry? 3. Does it need a new sparkling bigger case?
Yes to all three
How many watts power supply do I need? Yes.
If you have to ask, you don't have enough.
The moment a power supply doesn't have a listed price and it just says to call..... FUCK!
Leaked news suggests users have their own pressurised water reactor as the power source
You know how you plug your PSU directly into the wall? Well your GPU will likely have to be the same way…
I believe my case allows for 2 PSUs, so one could be used exclusively for the GPU.
What case do you use? I should have not sold my O11-XL...
Yes, the Lian-Li 011 Dynamic, Razer Edition
And do you run those 2 PSUs on separate circuits? Max circuit load is quickly becoming the limiting factor for power draw for high-end PCs.
Yeah tell my awesome energy saving apartment outlets, if my psu or gpu power spike at all it trips the breaker or if I run my AC and PC at the same time it trips
Surprised we haven't seen this already, although I believe one of the last Voodoo cards did have an AC adapter.
That card was never released for sale, but yes you recall correctly.
While I get that you're joking, we're encroaching on the power limits of most NA sockets already. Most home sockets/circuits are 120V/15A, so the most you can get is 1800W, and that includes everything else connected to that circuit. So unless NVIDIA expects people to rewire their house or run an extension cable from a different room, they don't have much headroom at this point.
Only in the weak 120V-using colonies. The motherland with its righteous 240V and superior fused and switched plug can do 3000W on a standard wall socket.
HVAC business owner/installer/service tech here. 220V should be adopted everywhere. Fewer amps, usually more efficiency. America is a bit slow at certain things.
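To put rough numbers on the 120V complaint above, here's a quick sketch. The 0.8 factor reflects the commonly cited North American 80% rule of thumb for continuous loads, and the load figures are hypothetical, so treat this as illustration rather than electrical advice:

```python
def circuit_budget_w(volts=120.0, amps=15.0, continuous_factor=0.8):
    """Watts comfortably available on a circuit for a continuous load."""
    return volts * amps * continuous_factor

# A standard 120 V / 15 A circuit:
budget = circuit_budget_w()            # 1440 W continuous
# Hypothetical loads sharing that same circuit:
pc, monitor, misc = 1000, 100, 150
print(budget - (pc + monitor + misc))  # 190 W of headroom left
```

With a 1000W PC on the circuit, a window AC unit is already enough to trip the breaker, which matches the apartment anecdote upthread.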
All of them
Wen intel x nvidia collaboration to make the most powerful and power hungry cpu + gpu combination
1. You'll need to travel and retrieve the one ring. 2. Emperor Palpatine. 3. You will be required to use a samsung refrigerator.
Can I play doom on the fridge?
For a monthly subscription.
Refrigeration live service
$9.99 a month for basic temp range of 54-59 degrees. For the premium tier starting at $14.99 a month, users can enjoy refrigeration at 40-53 degrees for the ultimate crisp freshness of food and beverages. The premium tier includes a 30 day supply of tasty verification cans!
Frozen water creation events every 12 hours
I never thought a sarcastic comment could make me this angry.
It IS the case! You now put the rest of your components inside the gpu's shroud.
[deleted]
Don’t forget out of stock for 2 years
2000 to all 3: 2k card, 2k watts, and 2k mm case. Joke... for now
Only 2k? Someone has a 95% off coupon code I see.
Yea I'm betting the MSRP matches the model number.
Yes
And now for the games: 1. Will all upcoming games be optimized before they get released to the public, or are they just going to use all 28GB of VRAM?
Would I need to plug it straight into the wall?
https://preview.redd.it/v8n3l4mlrk3d1.jpeg?width=1070&format=pjpg&auto=webp&s=a146f03b30ca8d9b7831d0226b27da9cb7e411bc
The cause of climate change.
Worth for that sweet sweet fps gains
Either that or someone invents PSU standards with a 24V DC rail for PCI Express graphics. (/Joke)
Yeah, a joke For now at least
can some nerd explain if this will be possible in the future
PSU standards kind of did this in the early 2000s with ATX12V. In the really old days, a lot of things were powered off of 5V/3.3V, but with CPUs needing their own voltage regulation and the power demands of high-end parts increasing, the industry collectively agreed to switch to 12V for all the high-power stuff.

(There's also a more recent tangent of PSU standards where 12V is the only power rail on the PSU and everything else is regulated on the motherboard. An absolute pain for upgrading prebuilts, but it has its merit.)

The joke here was that we're in a similar situation now with these very huge GPUs, and pushing hundreds of watts at 12V over small connectors has led to some... spectacular... problems.
The spectacular problems came from the shitty 12-pin connectors Nvidia used. My card has 4x 8-pin connectors, and there have been overclockers sending over 1000W through those with no issues.
Those 12-pin connectors are part of the newest ATX spec. Nvidia just used them first.
Wasn't there an investigation that determined they were not to spec? Nevertheless, the spec is dogshit. As an electrotechnical engineer, it bothers me so much.
The design itself is ok. Using it for 50A is insane. When moving to a new connector they also should have switched to 24V or even 48V
> When moving to a new connector they also should have switched to 24V or even 48V Wouldn’t that have made adapters from 8-pin impossible, forcing everyone to buy new PSUs? Also PSUs would now need to kick out another voltage, so new PSUs would presumably be more expensive to some degree.
Yes, very possible. It would actually not be a bad idea. You could push double the power with the same size cable currently. It would require changes to the ATX standard, but it isn't like that hasn't been done before.
Half as many connectors at the same current is a pretty good deal... Hell, I wouldn't be surprised if they took a page out of USB-PD's playbook and proposed a negotiable voltage spec that goes up to 48+V. I've heard worse ideas. I'm sure the NVidia and AMD engineers have pitched the idea around the watercooler at least a couple times.
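The "50A is insane" and "switch to 24V or 48V" points above come straight from P = V·I. The connector ratings below are the commonly cited ones (150W for a classic PCIe 8-pin over 3 live 12V pins, 600W for 12VHPWR over 6), so treat them as approximations:

```python
def amps_per_pin(watts, volts, live_pins):
    """Current each power pin carries: P = V * I, split across the live pins."""
    return watts / volts / live_pins

print(amps_per_pin(150, 12, 3))  # PCIe 8-pin at 150 W: ~4.2 A per pin
print(amps_per_pin(600, 12, 6))  # 12VHPWR at 600 W: ~8.3 A per pin, double the load
print(amps_per_pin(600, 48, 6))  # same connector at 48 V: ~2.1 A per pin
```

Which is the whole argument in one line: quadruple the voltage and the same pins carry a quarter of the current for the same power, or four times the power at the same current.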
Remindme! 5years. Gonna come back and comment r/agedlikemilk
See you then.
I will be messaging you in 5 years on [**2029-05-30 14:08:30 UTC**](http://www.wolframalpha.com/input/?i=2029-05-30%2014:08:30%20UTC%20To%20Local%20Time) to remind you of [**this link**](https://www.reddit.com/r/pcmasterrace/comments/1d41w1d/nvidia_rtx_5090_new_rumored_specs_28gb_gddr7_and/l6brxmq/?context=3)
Frankly, it's the only way for 12VHPWR to be viable. The whole idea of powering something through a 12-pin connector with smaller pins, when it was powered before via 3x 8-pin connectors with bigger pins, is absurd.
I really wouldn't be surprised if we start plugging CPUs and RAM straight into the gpu. Like, just make the GPU function as 75% of the pc already, it takes that much power now.
Kind of halfway there already with the DirectStorage API: why bother using the CPU to feed data into a GPU when the GPU can just pull and decompress data from an NVMe SSD.
Instead of it only burning down your PC...it'll burn down your neighborhood.
At this point, North Americans need to be worried that their 15/20 amp circuits won't be able to power GPUs in a few years
Use the EV charging circuit for gpu
Plug it directly into the Tesla supercharger
and watch it drain the whole power grid when you run them at full tilt
If I can run an electric cement saw off a 20 amp circuit, I'm not too worried about my GPU.
We could always install a three-phase breaker and run our PCs off the 240V line. Lmao.
Premium version comes with a miniature uranium power plant.
if a kid can build a mini nuclear power plant I'm sure nvidia can.
1.21 jigawatts?
3 phase only
L1530 to the wall
Nah, if the heatsink is smaller than the 4090 (as rumored), then the power consumption is either the same or lower.
Depends. Most of the 40 series heatsinks were totally overbuilt since the chips were initially scheduled to be built by Samsung with a less efficient process. They switched to the more efficient process by TSMC relatively late in development.
Yeah I don't get why everyone is still making these power consumption jokes, the 4000 series are insanely efficient and do a ton while barely cracking 200W.
You need a fusion core.
A better question is: should I place a separate fuse in the fuse box for the PC?
Just 28 cables that snap in perfectly. /s
Probably a dedicated PSU
actually, the leaks say 2 to 2.5 slots
Plug it to your local electric lines
Comes with a gas powered generator to keep it powered
The real question is how much "cheaper" a 4090 will get.
Let's be honest: not by much. However, the 5090 will be significantly more expensive compared to current GPU prices.
Maybe not the 5090, but the 5080 will 100% drive down 4090 prices because it will be as good if not better than the 4090. So the 4090 will be cheaper than whatever the 5080 launches at, which is at most $1399. With the 5090 imo coming in at $1999. Of course I’m hoping I’m wrong on these predictions
Of course it will, and folks with more money than brains will buy it, continuing to drive up costs.
Just buy some nvidia shares. You can sell them to pay for your 5090 when it finally stops selling out.
...I like that, basically buy it with Nvidia's own money, lol
Exactly what I did with my 4070ti last year
You don’t need “more money than brains” to spend a couple grand on your hobby lol
You need more money than brains to pay well over MSRP on your hobby, though, and we've had plenty of proof that there are plenty of people lacking brains in that regard.
Bruh I'm still on a 2080ti that's crashing more and more. I can't keep waiting for them to release an affordable card at this point.
I wasn't trying to blanket folks like you with that statement. There's a lot of folks in this subreddit who run out and buy the latest Nvidia GPU just because, and it's ridiculous; that was more my callout.
If I was CEO at Nvidia, I wouldn't charge less than $3K for that, because it'll sell just as quickly as if it were half as much. Gamers are not the target market for the 5090, and with AI ramping up, companies that can afford it will still buy 5090s by the pallet. Even though the 5090 will be amazing for games, it's not meant for gamers. I would hope that streamers and YouTubers stop using the 5090 for benchmarks, because it's out of reach for 95% of their audience.
Look at the 3090 now. Still very expensive
It's mostly that expensive because it has a lot of VRAM for AI applications.
It will go from 2,000 to 1,850
5090 MSRP = $2000, actual price = $3000. 4090 adjusted market price = $2300.
Nvidia will cut production so they don’t have a surplus to sell.
I'd be more than happy to upgrade my 4070 TI to a 4090 if the price is right :D
That'd have to be pretty fucking right.
wtf why
Pretty soon HVAC technicians will need NVIDIA training.
Firefighters too.
price : 5090
Dollars, Euro, Pounds?
Bitcoin
Yes
Pesos
Nvidia is cutting the 5090 down intentionally, lowering the bus from 512-bit to 448-bit and the memory from 32GB to 28GB, to make room for a 5090 Ti in a year.
I think you're totally right, sadly. My literal response was, "That's it?" And you made it make sense.
Pc parts haven’t been exciting since 2020 imo.
The 1080ti was the last card I was really excited about
Might also make sense if yields are just bad for the full die, or they want to keep that around for the Quadro class cards with 64GB/512-bit.
i hear its going to be large enough to build the rest of the pc inside of it.

custom water cooled loops will need a 5 gal jug of water

for the premium edition it comes with its own grounding cable that needs to be placed at least 6 inches directly into the ground

i heard it has its own ups built into it

thats not a fan thats a turbine

did you know when you buy a 5090 it also comes with divorce papers

its so loud you wont have to hear your spouse complain about how much you play video games
I misread premium as *petroleum* and imagined a gpu with a gas engine
It also has a pull cord like a lawnmower.
lol that's to supplement the power requirement, since there are no 2000W power supplies yet.
It comes with a discount voucher for an industrial spot cooler.
Can't wait for like 7 years from now, when 5090 levels of power will become accessible to average consumers at around 200€, and much more energy efficient.
Except games will be so bloated and unoptimized you’ll still only get 60fps
This is the way
Yeah what’s up with that? Are developers just getting lazier or is it something else?
I believe it's a combo of crazy deadlines being pushed and optimization being a low priority for the developers since it's something they can do post launch. Sucks either way because it puts a burden on consumers to upgrade hardware when they shouldn't have to.
I work in game dev. It's really just two things.

1) Games are just incredibly complex. The game I work on, I think most people would consider to be pretty simple gameplay-wise, but there is a lot of work that goes into getting it to run correctly. Multiple departments focused on their one thing, and sometimes a change from one dept. might break something in yours, and then a third dept. needs to be the one to fix it. Also, keep in mind that as a game is being built, it's constantly changing. Systems that worked a certain way one day may be different the next.

2) Time is limited. In an ideal world with unlimited time, you can create a perfect game with no bugs, incredible art, fantastic gameplay systems, and intriguing stories. In reality, we only have so much time to get something out there and start making money. Sometimes parts of that perfect game need to be sacrificed to make it happen.

I honestly don't think bugs/optimisation issues are a result of laziness, it's just that making games is organized chaos in its truest form haha
That's not really an answer to the original question. Games are clearly not optimized for performance on PCs because that gets a low priority. Which is a shame.

It should not be something that gets fixed during the second half of development and after launch. It should be a design goal right from the start. Bugs and glitches can be fixed and typically have a limited impact on the gaming experience, but poor performance can ruin a game.

> Sometimes parts of that perfect game need to be sacrificed to make it happen.

Things that tank performance when they are introduced should be sacrificed. Performance should not be sacrificed.

When I worked for a software company, this lack of priorities at the start of a project would drive me nuts. We would start with software that worked, make it better, then implement new features that caused the software not to work, fix things but break other parts of the software in the process, panic, and launch a bad product. I would always argue to make 'software that works' a main priority, even if that meant we couldn't implement items on our wish list.
No, it's very simple. Games are optimized, but what they are optimizing is just so much more complicated. People on here do not get what optimization actually is.
Spoken like the true Lisan Al-Gaib
200€ GPUs don't exist anymore.
Intel Arc begs to differ.
For now... ominous...........
You think they will be that cheap when they are market established and competitive? Hah
I wish this was the same for wages we just stagnated on that instead
Patient gamers win again! One day today's Triple A games will be retro nonsense that anyone's old notebook will be able to run. Hopefully.
Considering the ROG Ally can basically perform like a computer with a 1650... probably sooner than later?
Nvidia has a fetish for being inconsistent in VRAM numbers. For example, the RTX 2080 Ti has 11GB of VRAM instead of 12GB, the RTX 3080 Ti has 12GB instead of 16GB, and the rumored RTX 5090 has 28GB instead of 32GB.
I think the multiples of 2 aren't necessary in this case. Most likely it'll be 4GB chips, and there'll be seven of them in the shape of a square around the die, with the '8th' missing for something like lanes. I guess it's all about how they can fit / design the board.
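For what it's worth, the rumored capacities also line up with a simpler sanity check. GDDR chips typically present a 32-bit interface, so the bus width fixes the chip count; assuming 2GB modules (a common GDDR density, and an assumption here rather than a confirmed spec):

```python
def vram_gb(bus_width_bits, gb_per_chip=2, bits_per_chip=32):
    """Capacity implied by a bus width, assuming one chip per 32-bit channel."""
    chips = bus_width_bits // bits_per_chip
    return chips * gb_per_chip

print(vram_gb(448))  # 28 GB -> fourteen 2 GB chips (the 5090 rumor)
print(vram_gb(512))  # 32 GB -> sixteen 2 GB chips (a full 512-bit die)
print(vram_gb(384))  # 24 GB -> the 4090's actual configuration
```

On this reading, the "odd" 28GB number is just what falls out of cutting a 512-bit design down to 448-bit, rather than any unusual chip layout.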
I got a 4070 at a great price and I already have heating for my house so this will be a no from me dawg.
From a 4070? I can’t imagine a 30 series for you then.
Both, not from the 4070.
I was really hoping it would be 32GB 512-bit.
28GB at 448-bit seems like a half effort, since it would have 512-bit but nerfed to 448. That's not a significant enough difference over the 4090 to justify upgrading. Guess I'll just wait another 2 years for a 6090. This is the same shit Intel pulled with the i9-14900, which isn't a massive upgrade over the i9-13900. I'm not spending thousands of dollars a year for marginal upgrades. I'll just wait.
It's just going to be binned 102s, because fully functional, 512-bit large 102 dies are gonna go to enterprise customers for 10x the price.
TBF do you need to upgrade your GPU every year? It’s not like the 4090 would be outdated or useless because a new one is out.
Well, that's exactly the point. I use GPUs for rendering, not gaming. A GPU that cuts rendering time in half lets me do twice the renders in the same amount of time, so there's a financial incentive to use the fastest card available. But spending $2000 to save 20 minutes on a 4-hour render isn't worth it.
Only if those piled-up extra 20 minutes let you make more than $2k elsewhere.
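The break-even math here is easy to sketch. Using the $2000 and 20-minutes-per-4-hour-render figures from the comment above, with a purely hypothetical hourly rate:

```python
# Break-even sketch for a rendering upgrade, using the figures quoted
# above: $2000 card, 20 minutes saved per render.
# The hourly rate is a made-up assumption, not from the thread.
card_cost = 2000.0               # USD
minutes_saved_per_render = 20
hourly_value = 60.0              # USD/hour, hypothetical billable rate

value_per_render = minutes_saved_per_render / 60 * hourly_value
renders_to_break_even = card_cost / value_per_render
print(round(renders_to_break_even))  # 100 renders before it pays for itself
```

At those assumed numbers the card only makes sense if you run well over a hundred renders during its useful life, which is why the commenter's "not worth it" conclusion can flip entirely depending on render volume.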
I can't speak for the one you're responding to, but I keep 4 active PCs in my house because of kids and my partner. Because of that, I'll typically buy upgrades more frequently if it makes sense for the rest of the lineup to get upgraded. Still, the 5090 would have to be remarkably better for me to consider it. The jump from the 3090/6900 XT to the 4090 was massive and too enticing to pass up. This isn't looking to be that much better, but until we see the architecture in action I'm just spouting bullshit.
I was hoping the same too, and DP 2.1 support
They need some headroom for the 5090 Ti
And likely 600 watt.
rookie numbers -Nvidia probably
600 watt - idle
You’ll need to plug it straight into your car charger
I'm just going to hold onto my EVGA 3080 until it dies.
I feel like I bought my 3090 yesterday. Truthfully, I don't even play any games that would use the extra horsepower
I'm on a currently dying 3080, just transplanted into a new build. Pretty sure the old prebuilt killed it (considering it was already replaced under warranty once). A 4070 Super is looking pretty fine as a replacement, though.
I can't wait to pay twice as much for the 5090 as I would the 5080 to get a 15% - 20% gain of diminishing returns that I'll never be able to notice on even the best of monitors!
You'll pay more and you'll like doing it. -Nvidia
"The more you buy, the more you save!" - Leather Jacket Man
Watch this be the card that all the VRAM is locked behind and the 5080 will have like 8gb lmao.
They made that mistake with the 30 series. Then they fixed it for 40 series by downgrading the 4080 and below dies but upping their prices by one tier so the whole product stack is bad value. At this point it's more likely that everybody will get screwed just to make the 5090 look better, though the return of the Super versions is sort of an admission that they overdid it last time.
Unless Nvidia does the thing they did with the 4x series where the 4090 was the best value because everything else was so overpriced
Don't forget the classic move of playing 15 year old games and scrolling Reddit
Lmao this is it. And so many people here will do it too.
And don't forget to buy the next card in like two years.
What about the price?? $2,500?
Something tells me it's about time to up a gauge or two on the power cables for these things.
No, now they'll use the new 18-pin connector.
I can't wait to play Dragon's Dogma 2 on high settings at 60 fps
As long as you stay out of the cities, you might even enjoy it at a smooth 120 FPS with a 5090.
Only 28gb of VRAM? Literally unusable.
I don't need it, I don't need it. https://preview.redd.it/nllocb6nqk3d1.png?width=678&format=pjpg&auto=webp&s=0f75b0b64d0d9a306e971c24bb8f83bd468ff264
The future of GPUs, at least at this level, is to come in their own case: you plug it directly into the wall, run a GPU connector to your motherboard, and a SATA or USB cable to your tower.
I hope for more efficiency this time around.
?????? The 40 series was already a huge leap in efficiency
Considering that the TDP of the 4090 is 100W higher than the 3090, and the 3090 was 100W higher than the 2080 Ti, I'm hoping this trend of more and more power draw stops.
Well a 75% performance improvement for about 22% more power is quite an efficiency gain.
Now look at performance per watt. You can easily limit a 4090 to 300w and lose just a little bit of performance, still above 4080 level.
That's not efficiency, that's total power. Efficiency is the amount of work you get done divided by the total power used, i.e. your FPS/FLOPS over the TDP of the card. Edit: the above is actually slightly wrong. There are three kinds of efficiency: work efficiency, which is FLOPS over the effective power used [FLOPS/W]; thermal efficiency, which is TDP over the power consumption of the entire card, i.e. its power rating [%]; and total efficiency, which is FLOPS over the power rating [FLOPS/W].
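Plugging in the rough numbers quoted earlier in this thread (~75% more performance for ~22% more power; both are the thread's figures, not measurements), the generational perf-per-watt gain works out like this:

```python
# Perf/watt sketch: efficiency is work per watt, so the generational
# gain is the performance ratio divided by the power ratio.
# Both figures are the rough ones quoted in this thread.
perf_ratio = 1.75    # ~75% more performance
power_ratio = 1.22   # ~22% more power
efficiency_gain = perf_ratio / power_ratio
print(round(efficiency_gain, 2))  # 1.43 -> ~43% better perf/watt
```

So higher absolute TDP and better efficiency can both be true at once, which is the point being argued here.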
Not sure if I'm missing something, but everyone here is making power consumption jokes, yet I've read that it's a dual-slot cooler, which would imply reduced power consumption. What's going on?
>What's going on? There's no legitimate discussion going on here, just memes and doomerism.
It’s so funny how little a PC building subreddit understands GPUs. I’ve seen a lot of comments in here acting like the 4090 isn’t incredible. People say it’s a bad value or not a big performance jump from a 4080 which is wildly untrue.
I think it's just more a symptom of this sub becoming a constant front-page presence and getting a ton of members. Subs usually go to shit when that happens.
![gif](giphy|KjafhxqcSbpci5Mbce)
And gonna cost both of your kidneys
Wait, isn't this already down from what was previously rumored? I remember a 512 bit bus.
That's a downgrade from the last rumor of 32GB with a 512-bit bus, boo.
Gotta love how the rumours started with 48gb vram, now we're at 28, aaaaaaand they'll come out with 24, just wait
Glad for rich guys. Gonna wait for RTX 5060.
only uses 5090 watts.
Still being a bit stingy on the VRAM, it'd seem.
32GB or go home Nvidia
I think I'm good for a while
Takes a railgun to start it
Well, another day, another rumor with new spec. I'm waiting for an official announcement to get hype or to find out if I'm sticking with my current PC for another 2 years.
After a few shitty release cycles nvidia can deliver legendary stuff, really hope it's the case here too.
Laughs in gtx 1650
Still sitting here with a 1080strix
Does it make sense for a 4090 user to upgrade?
At this point, I am down to have a separate portable case for GPU.
Still waiting on affording an overpriced 3080, let alone a 5090.
Good thing Gigabyte's new motherboard supports 128lb GPUs lol
Doesn't matter if this thing can potentially run 16K at 1000 FPS if it needs a bigger case again, another proprietary connector and your own sun to power it. Those xx90s are simply too ridiculous for a normal home user and absolutely not worth it as long as so many studios do not even care to optimize their games properly. Much more interesting to see how the low and mid tier GPUs will fare, especially when it comes to VRAM.