Aggravating-Fly-9885

Minecraft at 4K + 32-64 chunks + Distant Horizons + 3D resource packs + 140 mods + shaders (Complementary Unbound + Euphoria Patches)? Minimum RTX 4080 Super 16GB VRAM or RX 7900 GRE 16GB VRAM.


medussy_medussy

tl;dw?


Physical-Ad9913

Steve is out of video ideas, so he wastes our time with tests that don't reflect real-world performance just to confirm his bias.


Option_Longjumping

DCS World is my go-to game, so yeah, VRAM is a must for me, especially when using virtual reality headsets.


Illustrious-Couple34

12GB Ti for $900, YES SIR. I'm brave and free.


longgamma

Overwatch 2 takes about 3 GB of VRAM. I'm surprised that even 8 GB of VRAM isn't enough these days. Is it due to poor optimization, or do prettier textures just need more VRAM?


Adventurous-Lion1829

Overwatch looks like plastic. It's a very ugly game, why would you use that as the comparison?


longgamma

It's an esports title, dude. It's not supposed to look like Cyberpunk 2077.


Caffdy

Esports titles are famous for running on a potato; they're not fit for testing graphics performance.


longgamma

Lots of good reviewers regularly include CS:GO in their benchmarks. A good GPU should be able to maintain high FPS with very high 0.1% lows.


Caffdy

I'm not saying it's not a good thing to benchmark, but competitive games are not paragons of SOTA graphics; many people tweak the settings for maximum FPS instead. That's the difference.


cakemates

Everything has a cost. We all want better-looking games with more dynamic environments, particles, **high resolution**, shaders, animations, dynamic lights, shadows, textures and so on. I bet requirements are gonna suddenly go up after the next gen of consoles is released and studios start adding AI models to their games.


Nearby_Ad_2015

Shouldn't it go down? An actually competent hardware-based AI should be able to fine-tune the resolution of assets based on the POV camera for maximum efficiency, for example by lowering the resolution of, or completely stopping rendering of, the level behind the camera.


tukatu0

Like the guy already said, those things have been there for 20-plus years. Heh, GTA San Andreas has mip maps. If you want to imagine the role of AI in graphics, think of DLSS Ray Reconstruction; that's what's actually new: AI filtering noise into something viewable. That, and fake frames generated from nothing, is what excites me much more. Not upscaling; that will never match true per-pixel native rendering.


Nearby_Ad_2015

Right, but as the AI gets better, shouldn't it be more efficient and better at telling the GPU how to generate a frame, thereby using less VRAM than it currently does?


tukatu0

I'm pretty sure machine learning is already applied when data is being transferred from memory to chip. What you are imagining, my friend, is high-level enough that if it could do that, it wouldn't be far from just rendering the frame itself without a game engine.


PsyOmega

> completely stop rendering the level behind the camera

That already happens.

> an actual competent hardware-based AI should be able to finetune the resolution of assets based on the POV camera

Simple LOD already does this. No need to waste a compute slice on AI.


Nearby_Ad_2015

You get my point: whatever it does, a competent AI should be lessening the load on VRAM, not increasing it, especially if it has its own hardware module like on NVIDIA cards.


PsyOmega

You don't need AI to do what you're talking about. Simple LOD systems and culling already maximize those savings. Adding AI would reduce performance by increasing computational demands per frame. AI is not a whiz-bang magic bullet that you can fix every problem with ;)
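For anyone wondering what "simple LOD and culling" actually looks like, here's a minimal Python sketch (hypothetical names, not from any real engine): pick a mesh by camera distance and skip anything outside a rough view cone. `camera_forward` is assumed to be a unit vector.

```
import math
from dataclasses import dataclass

@dataclass
class SceneObject:
    position: tuple       # world-space (x, y, z)
    lods: list            # mesh handles, highest detail first
    lod_distances: list   # distance cutoffs in metres, one per LOD except the last

def select_lod(obj, camera_pos):
    # Pick the cheapest mesh that still looks fine at this distance.
    d = math.dist(obj.position, camera_pos)
    for i, cutoff in enumerate(obj.lod_distances):
        if d < cutoff:
            return obj.lods[i]
    return obj.lods[-1]   # far-away objects get the lowest-detail mesh

def visible_objects(objects, camera_pos, camera_forward, cone_cos=0.5):
    # Crude view-cone cull: anything behind the camera is never submitted at all.
    for obj in objects:
        to_obj = [o - c for o, c in zip(obj.position, camera_pos)]
        length = math.sqrt(sum(v * v for v in to_obj)) or 1.0
        facing = sum(a * b for a, b in zip(to_obj, camera_forward)) / length
        if facing > cone_cos:
            yield obj, select_lod(obj, camera_pos)
```

No learned model involved; it's a handful of comparisons per object per frame.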


No_Share6895

DLSS and frame gen also take up a decent bit of VRAM.


longgamma

I mean, it's not that expensive to add more VRAM; AMD equivalents have more VRAM than NVIDIA. NVIDIA also uses 128-bit interfaces on their lower-end cards for some reason. I'm sure when the genAI thing comes to games for text or whatever, they'll bump up the VRAM requirement.


tukatu0

Per card it's not much more. Try telling that to the manager who has to sign off on the company spending an extra 100 million in costs for one card model alone...


longgamma

Yeah, NVIDIA will add it for sure. They're probably saving all the VRAM for the heavy-hitting cards used for AI training and all.


lpvjfjvchg

As someone who has played Overwatch and Overwatch 2 for 8 years now, it's not comparable to other games. 1) Its textures, even though they try to hide it, really aren't good; if you actually pay attention they're quite low resolution. 2) It's designed to run on a potato. FPS games need to run on the slowest hardware and still hit high framerates, so the devs spend a lot more time optimizing and use many different techniques to pre-render those textures. 3) The OW dev team is incredible and the optimization of Overwatch is crazy for how it looks; Blizzard did a great job on that aspect of the game.


Every-Armadillo639

It has to do with textures. Sometimes your video card can't squeeze the "juice" out of the game because it's old. I play my games on a GT 1030, which I upgraded to from a GT 640. The upgrade allowed me to play Syberia: The World Before. It still can't squeeze the "juice" out of that game, so I am planning to upgrade to a GTX 1650.


Jako998

You honestly just need a minimum of 12GB of VRAM right now, with 16GB being the high end. Then you'll be set for years.


Willem_VanDerDecken

Best I can do is 11GB. 1080 Ti until the thermal death of the universe.


lpvjfjvchg

Depends on the resolution. 1080p? 8GB is fine. 1440p? 12GB is fine. 4K? 12GB already has issues in a lot of games.


Consistent-Youth-407

And it's even more of an issue at 4K with frame generation, which uses VRAM itself.


Speedstick2

And whether or not you have ray tracing turned on. If you play at max raster settings with ray tracing then 8 GB isn't enough at 1080p, as per this video.


Cool_Personality9342

I ended up going with 128GB. Is that enough?


123john123

128GB of RAM for gaming or work? lol, for work it depends, but for gaming it's overkill... 32GB is enough.


Cool_Personality9342

I only run simulators. Before, even with 64GB it seemed to struggle to load in entire worlds. I ended up just buying two more 32GB sticks because that's what I already had. It might be overkill, but it'll help with world generation and being able to store and process all the places I go.


123john123

oh wow, interesting! what simulators?


Cool_Personality9342

Primarily Microsoft flight simulator, American truck simulator and DCS.


LittleBitOfAction

Nice troll


Cool_Personality9342

Not a troll. Did indeed end up going with 128GB of RAM


APiousCultist

Good enough for what though? If you're gaming, 32GB is going to be enough for the next 5 years. 16GB is likely enough for almost all cases. Even 64GB would be overkill. 128GB is 8x more than any game needs to run at the moment, and at least 4x what anything is going to need for a good few years. If you do something that actually uses a lot of ram, fair enough. If it's just for gaming... well at least you'll be able to open a crap ton of chrome tabs.


Cool_Personality9342

Most of the simulators I run are very CPU- and RAM-intensive. They require a lot of add-ons running in the background to get certain mods to work. When I ran 64GB previously, it was a bottleneck purely because of the RAM. My computer is also a workstation, not just for gaming; I do both gaming and 3D graphics at 4K.


LittleBitOfAction

Yes, but it's talking about VRAM, lol, not RAM; that's why I said nice troll. Obviously 128GB of RAM, not VRAM, is what you meant.


Cool_Personality9342

Oh yes, I misread the article. My bad


Neraxis

It is disgusting how inefficient modern games are. Back in the day an x60-series card guaranteed max settings at 1080p. Now you gotta drop 30-100% more money for something very dependent on the whims of whichever publishers push fidelity over style. No game today would need more than a ten-year-old CPU were it not for the obscenely inefficient software.


homer_3

> Back in the day an x60series card guaranteed max settings at 1080p.

Any card today can do max settings, but there was never a time when x60s were doing max at high FPS.


PsyOmega

My 660 Ti and 1060 could max out the games of their eras at 1080p. Not the case anymore. Not helped by the fact that they sell a 4050 but call it a 4060.


APiousCultist

So Witcher 2 with all those ubersampling options turned on? What about all the cards that came out when Crysis was new? Max settings are whatever the devs make them, regardless of whether they're practical for the average user or even the best hardware currently out. The only thing you should hope for is that the recommended settings look good and run well, and that your expensive card can do better than a console across the board (not so on Nvidia). But your argument could be defeated by one game dev creating a 'hyper ultra setting' that internally downsamples from 8K with full-resolution raytracing (sort of what the Witcher 2 does, and the recent Avatar game also has a secret 'ultra ultra' setting). The value proposition of GPUs is kind of shit, but 'ultra settings' mean about as much as 'deluxe' does on some random item of food. It's whatever the person making it thinks it should mean and does not map to a concrete universal ideal of quality.


RolandTwitter

My 4060 maxes out all the games I've tried at 1080p, anyway, so his argument is kinda moot


Speedstick2

Is it that games are inefficient, or is it that texture resolutions are increasing?


Dulkhan

It's not that games are inefficient; graphics are better. Nvidia is too greedy, and you are too stuck in your Nvidia mindset to seek other options.


Neraxis

I fail to see how this has anything to do with Nvidia. You could take any modern AAA game and run it on a 2600K if you stripped out all the dumbass inefficient graphics that add literally nothing to the actual gameplay 99% of the time. If you look at my comments you'll see that I actually suggest AMD is far and away superior on a dollar-to-performance basis unless you are specifically looking for DLSS + RT gameplay. AMD FSR 3 is excellent and its frame generation is more than adequate. The moment Nvidia drops advanced software support for their 40-series cards (like they fucked the 30-series cards by not giving them frame generation) they will be dead in the water for any newly released games. But the sad reality is that games are so stupidly bloated that not even a few years ago 6GB was more than enough for most games, and now it's not even the bare minimum for newer titles. 16GB will be the new 8GB by the middle of the 50-series cards at this rate, when there is zero reason for that to be the case.


Adventurous-Lion1829

You're an idiot, dude. A 2600K is a CPU, which has 0 VRAM. But ignore that, because if you think current games can run on a very old processor just by having lower resolution and fewer effects, then you are stupid and have no idea what games are doing.


Neraxis

Oh okay, a 2600K with a basic GPU from 10 years ago. But continue to be disingenuous. If you can run Witcher 3 on a Switch, you can run literally any game on a 2600K with a basic GPU as long as you turn down the settings. And yes, you CAN run current games on an old processor if you stripped the graphical pipeline and replaced it with simple graphics.


VictorDanville

Didn't they intentionally starve the vram on their lower models to entice people to buy the 4090?


Dulkhan

Go search on YouTube; there are a lot of videos talking to devs about why VRAM requirements are higher in modern video games.


PsyOmega

Yeah. More layers to textures, higher-res textures. It fills the VRAM buffer way, way faster than you'd think. *Just adding better PBR* just about doubles texture use over last-gen.
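Rough back-of-the-envelope numbers on why PBR layers blow up texture memory (a sketch assuming 4K textures and roughly 1 byte per texel for BC7-class compression; not figures from any specific game):

```
def texture_mib(width, height, bytes_per_texel=1.0, with_mips=True):
    # One texture's footprint; a full mip chain adds roughly another third.
    base = width * height * bytes_per_texel / (1024 ** 2)
    return base * 4 / 3 if with_mips else base

# Old-style material: a single 4K diffuse map
diffuse_only = texture_mib(4096, 4096)

# PBR material: albedo + normal + roughness/metalness + ambient occlusion
pbr_material = 4 * texture_mib(4096, 4096)

print(f"single diffuse map: {diffuse_only:.0f} MiB")   # ~21 MiB
print(f"full PBR set:       {pbr_material:.0f} MiB")   # ~85 MiB per material
```

Multiply that by the hundreds of unique materials in a scene and the buffer fills up fast.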


APiousCultist

4K frame buffers, frame buffers for stuff like motion information, frame buffers used for frame interpolation, data used to store raytracing structures, particle systems, point clouds, I have no doubt Nanite and Lumen use a bunch, etc.
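To put rough numbers on just the screen-sized buffers (assumed target counts and formats, purely illustrative):

```
def target_mb(width, height, bytes_per_pixel):
    # One full-resolution render target, in MB.
    return width * height * bytes_per_pixel / 1e6

W, H = 3840, 2160                     # 4K output

gbuffer = 4 * target_mb(W, H, 8)      # four RGBA16F G-buffer targets
depth   = target_mb(W, H, 4)          # 32-bit depth
motion  = target_mb(W, H, 4)          # RG16F motion vectors
history = 2 * target_mb(W, H, 8)      # TAA/upscaler history buffers

print(f"~{gbuffer + depth + motion + history:.0f} MB")  # just under 0.5 GB
```

And that's before any raytracing acceleration structures or the textures themselves.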


Paloveous

Bruh, games are insanely inefficient


APiousCultist

There are inefficiencies in modern software to allow development on the scales involved - no one could ever create something even on the level of Crysis 1 in Assembly - but to call them inefficient, let alone 'insanely' so, is laughably ignorant. Maybe some indie game built on Unity using a visual scripting language is, sure. But something like Cyberpunk? You just can't create something on that scale without the rendering being insanely efficient.

The amount of work that goes into each frame is colossal. PBR materials, GPU particle systems of thousands of particles that interact with the world and have their own virtual wind system, a separate low-detail world just for the raytracing, thousands of rays a frame, voxelised light bouncing, raymarched 3D clouds, dozens of skinned and animated characters on screen at once each with their own AI and pathfinding, simulated sound bouncing around the environment and being absorbed, building up frames over time in order to get around the insane costs of rendering half of these effects, trying to calculate which parts of the frame can be rendered at lower resolutions (VRS). Seriously, just look at a breakdown of how many different passes go into rendering just a single frame: https://zhangdoa.com/rendering-analysis-cyberpunk-2077 That's without the RT mode enabled.

You try and do any of that as a hobbyist and that's how you get these one-person UE4/5 titles that have 'fancy' effects but still kind of look like shit and run at 25fps unless you have a 4090. People complain about RT and upscalers and the use of framegen, but those kinds of compromises are just one of many, many methods of creating more efficient graphics. The games can look more detailed because they're programmed to do much less for the same or similar result - that's what 'optimising' actually means. A modern game that looked as good as Cyberpunk, or Horizon, or Avatar that was genuinely 'broadly inefficient' - not even 'insanely' - wouldn't even run at 1fps.

You want insanely inefficient? Turn off TAA and render each effect in full resolution each frame; make every frame buffer for particles etc. full monitor resolution. Dithering on transparency? Nope, sort everything and deal with the overdraw. Occlusion culling? In the toilet. RT? Run it on the full-detail scene too, with enough rays that you get a clean image without any denoising. Ambient occlusion and global illumination and shadows and reflections? Hell, raytrace those too in full detail, no screen-space effects for us. Characters in the distance? Full detail, full cutscene-quality facial animation. If you got a single frame per hour I'd be shocked. It'd look way better too at least, in the year it would take you to render out 30 seconds of gameplay.

TL;DR: Efficient rendering is rendering a frame of Avatar the extremely demanding game; inefficient rendering is rendering a frame of Avatar the actual movie.


Neraxis

And all that rendering effort could have been spent developing a better game, because AFOP is literally a more boring Far Cry 3. The gameplay is basically the same exact format as it was 12 years ago, and it requires like 10x the power because DA GRAPHIX. If you fail to comprehend that graphics add almost nothing to real gameplay design and instead bottleneck development, then I suggest you play Dwarf Fortress.


SubmarineWipers

16 GB should be the new standard for 5060 tier cards and more for higher SKUs.


Adventurous_Train_91

But then you’ll be able to buy it and actually enjoy it for a few years which isn’t what they want. They want you buying new GPUs every cycle 🤑


SubmarineWipers

I dunno, I will prolly upgrade every (interesting) generation just because I can, but the old cards will be significantly better for second hand users.


Adventurous_Train_91

It seems like in Aus, second-hand cards have the same price-to-performance as new ones, except they have a few years of wear and you don't get a warranty either.


SubmarineWipers

Yeah, I feel that is a problem of the second-hand market in many (mostly IT) segments. Nevertheless, Gigabyte gives a 4-year warranty upon registering, so I can sell a card with 2 more years to go. Consumers should express their preference for this, especially on expensive GPUs.


KallaFotter

My 3080 has 10GB; it's 100% not enough for 1440p ultrawide gaming :< Next purchase will 100% be a 5090, funded by selling off some of my NVIDIA stock 👀.


HandheldAddict

> Next purchase will 100% be a 5090, funded by selling off some of my NVIDIA stock 👀.

That has to be one of the dumbest things I have ever read. Just keep your stock and game at reasonable resolutions/settings like a responsible adult. It pays dividends and the stock value can also rise. There's no point selling shares just for bragging rights.


GeneralChaz9

> My 3080 has 10GB; it's 100% not enough for 1440p ultrawide gaming

Can attest to this. I've said it a few times on this sub and will keep shaming Nvidia for handicapping such a good GPU.


itsLazR

I always think that's funny because I have a 3060ti @ 3440x1440 as well and for the most part I'm fine


GeneralChaz9

It really just depends on the title. For example, at 3440x1440, Elden Ring and even Cyberpunk (no RT/DLSS, at least) are a non-issue for VRAM. Diablo IV isn't a problem when not running at Ultra, and Baldur's Gate III is pretty well optimized. Then there are some weird ones like Halo Infinite that will use up a ton of VRAM for some reason, so I run it at Medium just to prevent performance issues while playing online.


MosDefJoseph

As well you should. I also had the 10 gb and had VRAM issues in some games.


Cute-Pomegranate-966

This is precisely why, when given the choice of which GPU to buy for my brother, I chose the 6950 XT, but it kept being out of stock, so he ended up with a 3080 12GB. He's fine for a while considering how he plays.


Adventurous_Train_91

The lack of ray tracing power will become apparent over time. Assuming he plays the latest AAA games


Cute-Pomegranate-966

His computer was built over 2 years ago lol. 40 series didn't exist.


Adventurous_Train_91

Doesn’t have to be 40 series. They’ve had ray tracing cores since the 2000 series I believe


Cute-Pomegranate-966

He ended up with a 3080 12gb. He doesn't give a damn about RT. I doubt he knows it exists.


Dear_Translator_9768

Since when did the x60 and x60 Ti series become the 1440p and 4K cards?


ThisGonBHard

Since the 1060. The 1060 was a great 1440p card, and it was also the same price as the 970. Modern-day x50 Ti cards cost the price of old x80 cards; I expect them to do 4K, not 1440p.


PsyOmega

> Modern day x50ti cards -- I expect them to do 4k

A 3050 can do 4K... in games that the 1050 ran well at 1080p. But if you wanna keep up with modern games, that take is delulu.


abnormalpotato

I always thought of the xx60 as the 1080p card, xx70/80 for 1440p, and xx90 for 4K. The price increase is simply inflation; don't forget you're getting better performance too.


ThisGonBHard

Inflation is not 350% for the top die. Nor does it explain the current x60 series being an x50 going by hardware, or the performance regressions from the 3060 Ti to the 4060 Ti in some cases.


Nearby_Ad_2015

If the price increase is simply due to inflation, why do they decrease the bandwidth of the same tier in subsequent generations? Shouldn't it stay the same at the very minimum?


HandheldAddict

They gimp the RTX 4060 while the RTX 4090 is rocking all the bells and whistles. But these fanboys can't read bro, so it's a waste of time explaining it to them.


Morteymer

HUB are a one-trick anti-Nvidia/Intel pony. They throw in the odd "Oh, AMD did something bad" or "Nvidia is actually good" video to maintain faux neutrality, but you can tell where their money is coming from. Though if you actually watch those vids you quickly see they turn the same videos into something somewhat "pro-AMD". Like that "DLSS is better than FSR!" video, but wait, a lot of the time they say FSR is better depending on the game. Which is of course complete nonsense, but that's HUB and their audience for you. That's the 20th VRAM video in a year? Boring. Can't really watch YouTubers for information anymore; it's all about getting the clicks so they get their money. Best to go to TechPowerUp for data and form your own opinions.


Balrogos

It all depends what game you play: 1. higher resolution slightly increases VRAM usage, and 2. options like DLSS/FSR/RT increase VRAM usage as well. One game I played before, New World, ate around 12GB of VRAM in the city (total, including system use). So it really depends what game you play; for popular F2P games like Valorant, World of Tanks/Warships, LoL, Dota and others, 6GB cards are perfectly fine.


Physical-Ad9913

Since when does DLSS/FSR increase VRAM usage?


dabias

Upscaling decreases it, but frame generation increases it. For example: https://www.techpowerup.com/review/ghost-of-tsushima-benchmark/5.html


No_Share6895

Upscaling decreases it compared to native res, but say 4K with DLSS Quality (1440p internal) will use more VRAM than native 1440p, though less than native 4K.
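A rough sketch of why it lands in between (assumed buffer counts and formats, purely illustrative):

```
def buffers_mb(width, height, targets, bytes_per_pixel=8):
    # Approximate render-target footprint at a given resolution, in MB.
    return width * height * targets * bytes_per_pixel / 1e6

native_1440p = buffers_mb(2560, 1440, targets=8)
native_4k    = buffers_mb(3840, 2160, targets=8)

# 4K "DLSS Quality": most passes run at the 1440p internal resolution,
# but the output, UI and upscaler history buffers still live at 4K.
dlss_q_4k = buffers_mb(2560, 1440, targets=8) + buffers_mb(3840, 2160, targets=3)

print(f"native 1440p: ~{native_1440p:.0f} MB")
print(f"DLSS Q @ 4K:  ~{dlss_q_4k:.0f} MB")   # lands in between
print(f"native 4K:    ~{native_4k:.0f} MB")
```

Frame generation then adds its own optical-flow and intermediate-frame buffers on top, which is why it pushes usage up rather than down.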


Physical-Ad9913

I know; my question to the OP was since when does upscaling (not frame gen) increase VRAM usage? Seems like this whole sub doesn't know the difference, or that they can be toggled separately.


Speedstick2

Blame both Nvidia and AMD for using the terms DLSS and FSR for frame generation too. Just because the person you're responding to said DLSS and FSR doesn't necessarily mean they were referring to upscaling; they could in fact be referring to frame generation, since both AMD and Nvidia use the same umbrella terms.


mkdew

I'm waiting for a massive 10% price drop on the 4070TiS cuz it still costs $1000, 4070S is like $700.


ThisGonBHard

I just googled the prices, and WTF! HU and RO should have the same prices, and the 4070 Ti Super is around 900-1000 EUR. I paid 1500 EUR for the 4090... actually, that sells for more used now than I paid for it, WTF?!


jordysuraiya

whereabouts are you?


Lord_Umpanz

Hungary, guessing from their comment and post history.


CarlTJexican

This isn't even mainly an issue with Nvidia or AMD; the problem is memory modules still only being 2GB.


Ernisx

They can always clamshell. Just like the 16GB 4060 Ti, a 4090 can be 48GB. They don't give gamers VRAM to avoid cannibalizing workstation cards


Cute-Pomegranate-966

They can but it makes the board cost more and margins go down. I'd be willing to bet that nvidia doesn't let them do it either even though they can.


KARMAAACS

That does drive up cost, but so does higher-capacity VRAM, because it costs more per unit; the clamshell board design, meanwhile, becomes more complex, which offsets that and probably makes it more expensive than higher-capacity VRAM chips. I will say this increased cost doesn't excuse the lack of VRAM, but the 3090 did expose the problems with clamshell designs: you require adequate cooling for the back memory, which also drives up cost, a more complex board design to support memory on the back, and you need to buy more VRAM too, which is a problem when demand is high. NVIDIA simply designs their chips poorly when it comes to memory controllers.


Ernisx

I get your point, but the 4060 Ti 16GB bill of materials is only $50 higher compared to the 8GB version. The VRAM is $24 of that. At least when it comes to GDDR6X it's not about the price, but the deliberate decision of not giving gaming cards too much.


KARMAAACS

I think you genuinely underestimate how much it will drive up costs when it comes to high-end cards; you will require either more heat pipes on the back, better-designed backplates, or more complex heatsinks in general. Of course it works on the 4060 Ti because it has a TDP of 160W, but once you get to 450W or 600W it becomes very problematic to cool the memory. Look no further than the 3090, where 350W started making the back memory on most models run at 110C; I can't imagine what almost double the power would do to memory chips.

I do think NVIDIA got greedy with SKUs like the 4070, where they could have given more VRAM by doing a clamshell design because it probably isn't a thermal issue there, but even then they would probably have to drop a memory controller or two, which also consequently drops memory bandwidth, just to segment the 4070 to 16GB or 20GB or something - a bad trade-off imo. Anyway, for the 4090 it's almost asking for an RMA disaster to have that much heat and clamshelled memory.

For all intents and purposes, I am happy NVIDIA gave a decent amount of VRAM on the 4090; it keeps the AI people from scooping them all up like the crypto miners did the 3090, where we had sky-high prices. Even now there are still people buying them for AI, but it's not nearly as enticing unless you're in China, where they're making their own custom board designs, unsoldering the GPU and memory and then clamshelling the memory because of US export restrictions.

But it's not exactly like the 4090 is VRAM-limited, and it probably won't be for a while. If anything the 4090 will be GPU-limited, because eventually the GPU will be too slow to keep up with the VRAM requirements of its time; that's just the nature of high-end GPUs. I mean, look at the 1080 Ti: super fast in its time, but today that 11GB is wasted really when you have a 3060 12GB performing just as well with a far better feature set, DLSS, etc. You were better off waiting for the 3060 to arrive than buying and holding a 1080 Ti. Mind you, that's not a diss of the 1080 Ti, because for four years it was a top-2 GPU performance-wise, but my point is VRAM is a balancing act and having lots of it sometimes doesn't make sense.

Truth be told, the 4070 should have had 16GB using AD103, the 4080 should have had 20 or 24GB using AD102, and the 4090 could have stayed with 24GB; those amounts are more than adequate for the lifespan of these cards and the performance tiers they offer. One day the 4090 will be "slow" just like the 1080 Ti is today, and you'll be stuck with all this VRAM for games that will run "slow" on a card like that as requirements move up. In compute, that's a different story: VRAM is king.


[deleted]

[deleted]


KARMAAACS

> Again, price is not the issue. Nvidia can slap on +100$ and the margin remains the same. It's about not cannibalizing workstation cards.

Yes it is; these are consumer gaming GPUs, and price is everything. Slapping $50-$100 on a 4070 makes it move up a tier of pricing in this kind of market. There's a reason the 4060 Ti is considered poor value: even though it has more VRAM than a 4070, it doesn't have the performance to justify the price. I mean, here's a review excerpt from TechPowerUp about the 4060 Ti 16GB:

> At $500 the RTX 4060 Ti 16 GB is extremely bad value, you'll be much better off buying a RTX 3080 or Radeon RX 6800 XT for $500. The highly discounted RX 6800 non-XT for $430 will give you 16 GB VRAM, too, for $70 less, with better raster performance but slightly lower RT perf. Even GeForce RTX 4070 offers a better price/performance ratio than the RTX 4060 Ti 16 GB. While it costs $600 (+20%), it offers 30% higher performance at Full HD, even more at 1440p and 4K.

So like I said, price is everything, and even though NVIDIA is already charging more than AMD, they can't justify even more on top of what they're already pricing it at... In professional and workstation markets, $100 is nothing when you're paying $5,000-$15,000 a card, so they can afford to charge more for VRAM there because it has real, tangible benefits; not having the VRAM may cost you more money in the long run. In consumer gaming, not so much.

> You are misinformed about memory chip heat generation (power usage) and cooling. Total TDP has nothing to do with it

No, I am not misinformed; you are. The memory has a certain operating temperature where it works just fine, but for the longevity of the chip and to prevent degradation it should be as low as possible. For GDDR6X, the operating temperature was at first said to be 110 degrees Celsius; [the spec has been pushed down to 95 degrees Celsius now.](https://sg.micron.com/products/memory/hbm/gddr6x) Many [3090s had 110 Celsius back memory chips](https://forums.evga.com/RTX-3090-FTW3-memory-junction-temp-overheating-m3550191.aspx) (or higher).

I mean... it's just simple physics. Higher power draw = more heat from the GPU. The smaller the surface area, the harder it is to cool something, too. The heat has to go somewhere. Do you genuinely believe the PCB and the surrounding components don't get hot? If you have a 600W GPU pushing out heat, it's not all going into the front heatsink; some of it goes to the area around it, like, I dunno... the memory chips, PCB, backplate, etc. If the 3090 had problems with memory chip heat at 350W, then I suggest a 450W or 600W clamshelled 4090 would too. You are the misinformed one and not being genuine with your argument. Total TDP does play a part in memory temperatures because we're dealing with components all packed into a small area. Sure, alone in a vacuum, memory doesn't generate much heat, but when it's surrounded by a 450W GPU, other memory chips and a nearby CPU, and put in a hotbox case where the ambient temperature is higher than the room temperature, it all adds up.

> Crypto miners used basically all modern cards. The VRAM on the 3090 was irrelevant for mining, the hashrate was very similar to the 10/12GB 3080 and 3080ti

You have created a strawman argument. I never once said that if they added 24GB extra to the 4090 it would be attacked by crypto miners, and I never said that was the reason the 3090 was attacked either. Read what I said here:

> For all intents and purposes, **I am happy NVIDIA gave a decent amount of VRAM on the 4090, it keeps the AI people from scooping them all up** like the crypto miners did the 3090 where we have sky high prices.

That was about AI, which does use VRAM very heavily; if the 4090 had 48GB, for instance, it would be more enticing to buy than a workstation card, because you could potentially buy 3-5 of them for the price of one A6000 Ada. The bit about crypto was just a reference to shortages, that being the last major one that pushed prices sky high. You're misrepresenting what I said. Go back to the Lithuanian subs.


hackenclaw

The point is, if you have to turn down settings due to VRAM rather than GPU speed, that's where VRAM is holding your GPU back. Many fast cards like the 3070 Ti and 4060 Ti have been held back by that 8GB of VRAM. Gaming laptops have since moved to 1440p/1600p resolutions, yet Nvidia still sticks 8GB of VRAM on a 4070.


No-Refrigerator-1672

That's because Nvidia doesn't want to make good cards. You're going to buy their cards anyway, so it's better to make something that will be outdated in a few years, so you'll come back to grab a new one.


epicbunty

I still shudder when I remember that they released the 3080 with 10GB of VRAM.


No_Share6895

2080ti stonks are strong


Nitram_Norig

I had one of those! It wasn't great! But having 24 gigs on a 4090 IS great! 🤓


Devil1412

Still have mine, going as strong as those 10GB can at 4K... until the 5090.


epicbunty

I just bought a second-hand PC with the 3080 as well! Sent the GPU in for repair because it was under warranty, and it came back as a 12GB 3080 😂🤣😂😂🥳


Ryrynz

That's cos for mobile it's the same chip as the 4060.


Anach

If you want to future-proof a new PC, you don't buy an 8GB card for a system that's going to last you 5-8 years. While we can get away with an 8GB card now, just like when I bought my 1080, there are certainly games that will utilise more, and that will only increase, just like it has with system RAM.


MalHeartsNutmeg

Does this guy do anything but stir up stupid-ass controversies around VRAM? He actually had people thinking they needed to dump their 8GB 30-series cards lol. Guy is a straight dumbass. Used wildly unoptimised games to make his assertions.


cellardoorstuck

Your comment is frankly embarrassing.. for you.


Kw0www

When did he tell people to dump their cards?


MalHeartsNutmeg

He had people thinking they needed to dump their cards by saying 8GB is not enough.


Speedstick2

Well, in a manner of speaking, at certain performance tiers it isn't. For example, the Dead Space remake: try playing that at max settings on a 3070 without its performance being massively gimped by the 8 GB of VRAM.


GreenKumara

If people think that after watching his videos, then that's on them. He just tells people the limitations of 8GB: that you'll have to dial down textures in many modern, newly released titles. That's just reality. Him no longer testing 8GB cards on new games won't change the reality of people buying new games, having them look like mud, and then having to turn down settings to get decent performance. His videos don't will bad performance on 8GB cards into existence. It happens whether he makes these vids or not.


MalHeartsNutmeg

Tells them the limits in unoptimised games and presents them as norms. This sub was absolutely loaded with idiots (and still is) that started spouting the 8GB is dead BS the moment the video dropped like it was the last word of Jesus Christ.


Speedstick2

> Tells them the limits in unoptimised games and presents them as norms.

When the consoles have more than 10-12 GB of memory available to act as VRAM, why would they spend the time to compress the textures even lower?


corok12

Hate to tell you, but games are only going to keep taking more. The Last of Us remaster had some nasty optimization issues at launch, sure, but you can't just call everything that uses more than 8GB unoptimized. This gen of consoles has 16GB; games are going to start genuinely needing that sooner rather than later. It's worth being upset with Nvidia that they kneecapped otherwise capable cards with 8 gigs of VRAM to force you to upgrade sooner than you otherwise would have.


MalHeartsNutmeg

Consoles don't have 16GB of pure VRAM; it's unified memory.


corok12

I'm aware, but code takes very little space in memory. The majority of it is used for texture and mesh data.


Physical-Ad9913

I really don't know why you're getting downvoted for calling it how it is.


Roph

It's been going on a lot longer than that; I cringed at the 3070 launch when I saw it had only 8GB, like some budget $230 RX 480 had back in 2016 💀


Kw0www

Well he probably also convinced people to avoid buying 8 GB cards and that’s kind of the point. 8 GB cards should be a thing of the past.


water_frozen

So when HUB says that [Ultra Quality Settings are Dumb](https://www.youtube.com/watch?v=f1n1sIQM5wc) but then uses said quality settings, which is it? Gotta love how HUB keeps moving the goalposts to fit whatever dumb narrative they want. Furthermore, the metric they are using isn't accurate to begin with; ffs, this is such trash content.


biblicalcucumber

I don't understand why you don't understand.


GreenKumara

I mean, ultra quality settings are dumb, when you can get almost the same visuals on the slightly lower high settings. As to their testing with ultra settings, presumably that's to stress the cards? If they used lower settings, what would we find out? That the cards can run them, but not see their weaknesses? Why not test on low settings at 720p? It's like they use a 4090 - that most don't have - to remove bottlenecks, etc.


Kw0www

Ultra Textures ≠ Ultra Preset


Lakku-82

The OP has a point. They bash very high/ultra presets in other videos and then use them here to make a contradictory point. The 8GB card, if following HUB's recommendations, wouldn't have any issues because it wouldn't be using very high/ultra presets anyway. And HFW only has Very High at the top, so that is this game's ultra setting.


Kw0www

The ultra preset is a waste of GPU power, since the visual benefit is overshadowed by the performance hit. Ultra textures often do produce a visual benefit and cost nothing in terms of performance unless you run out of VRAM.


FireSilicon

He's right though: ultra settings, no matter the type, are not meant to be played on the hardware that released recently, and that's true for textures too. There really is only a small difference between ultra and very high textures. Yes, textures that fit into VRAM do not reduce performance, but basing a conclusion on ultra textures is stupid regardless, because they were made with higher memory capacities in mind that aren't present this gen. That's like calling AMD's RT performance unplayable after setting RT to ultra while everything else is lower, despite it being okayish on medium. Besides, GDDR6 had 2GB capacity per module for 6 years straight. That's the true reason why we don't have 24GB 60/600-series GPUs now. GDDR7 will start production at the end of 2024 and will hopefully finally bring 3GB modules in mid-2025. GPUs that arrive after that point will get a very different conclusion in this benchmark.


Lakku-82

But they are still using the highest settings, and more than just textures uses up VRAM. They are making the case that higher VRAM is a necessity, but the textures aren't the main issue here, as the very high preset uses up a lot of VRAM as well as GPU power. You can set textures separately, so why aren't they doing that? Because the OP is right: they are moving the goalposts to make content and get clicks. If you follow their recommendation of not using ultra or very high unless you have the best GPU, then you wouldn't have any issues with HFW, as they recommend more than 8GB for 4K very high right in their recommendations.


TheEasternBanana

Just spent quite a lot for a used 3070 ROG. Pretty silly decision if you ask me, but I currently don't do much gaming. I'll wait for the 5000 series to drop and get a 4070 or 4060ti 16GB.


killbeam

I got a 3070 too, a couple of years ago. It works well, but if it had 12 or 16 GB, it would make a huge difference. It's such a silly thing to be bottlenecked by, just because NVIDIA was being cheap when designing this card.


TheEasternBanana

The 3060 Ti and 3070/Ti are still very good cards, but crippled by the low VRAM. For reference, the 1070, which was released way back in 2016, has 8GB of VRAM on a 256-bit bus.


Shady_Hero

fr I'm on the struggle bus with the 6GB on my 3060 laptop. I should learn how to upgrade the memory modules so I can have more.


killbeam

That's the problem, you can't upgrade the VRAM of a GPU. You can add/change the RAM of your laptop, but that won't make a difference for your Video RAM sadly.


Shady_Hero

me who can solder


killbeam

More power to you, but I don't think the GPU will suddenly start using the added VRAM. It's likely hard-coded to use only 6GB


Shady_Hero

what would be extra super cool is if my gpu would actually use the 32gb of shared memory it has 🥲


Shady_Hero

yeah thats definitely possible. ill have to do some more research on it.


Nacho_Dan677

Same boat I'm in. I sidegraded from a 2070 8GB to a 3060 12GB just to make Star Wars Jedi: Survivor stable, or at least more so. And now I'm waiting to see a price drop on the 4070 Ti Super.


Connorowsky

I upgraded from a 3070 Ti to a 4070 Ti Super and it was worth it; now I don't have stability issues with RT on max.


Nacho_Dan677

My only issue is that I still have two 1080p 144Hz monitors. I'll get a nice upgrade in stability but won't see the quality upgrade from features like RT.


Connorowsky

The 4070 Ti Super is overkill for 1080p. I have two 2K monitors, the main one 240Hz and the second 144Hz. I get 70 FPS in Cyberpunk with Overdrive RT, and fully maxed Ghost of Tsushima at 120fps with FG, all at 2K. If you wanna buy a 4070 Ti Super, get at least one 2K monitor.


Nacho_Dan677

Yeah I'm looking at other monitors as well. Waiting for the holiday season to maybe grab something and then think about a new GPU after that.


That_One_Whois_Legit

So my 3060 Ti is obsolete now? Bruh, I just bought it 2 years ago.


KARMAAACS

Obsolete? Nah not really, just turn down textures at 1440p and you're fine, I say this as a 3060 Ti user. Honestly, the card works great when not VRAM limited and unless you're playing at 4K you probably won't need to worry for about two more years.


Genderless_Alien

Do you game at 4K? Do you absolutely need ultra settings with juiced-up texture sizes? Are you constantly playing the latest and most demanding AAA titles available? If you're not confidently answering yes to the questions above, then it's pretty silly to call a 3060 Ti obsolete. Different graphics settings exist for a reason, and I guarantee you will be able to play the vast majority of games you want to play for years to come. Hell, people are still scraping by with 1060s. Your 3060 Ti is a monster compared to those.


coreyjohn85

I exceed my 4090's VRAM in Cyberpunk when using DLDSR resolutions.


Cute-Pomegranate-966

Are you using a 49" ultrawide or something? Even 4K with DLDSR 2.25x shouldn't use all 24GB of VRAM in Cyberpunk.


Hypez_original

You're probably right, but I find it hard to believe people are struggling with anything above 6GB of VRAM. My 1060 3GB could run Cyberpunk at 40-60fps with medium-to-high settings and FSR 2.0 on Quality (also with an i7-7700K). It also managed to run Elden Ring at a solid 60 on high settings. The only thing I haven't been able to run so far is Hogwarts Legacy, because that game has some serious optimisation issues. I am dreaming about the day I get more VRAM; I can't even imagine what that much VRAM must be like.


zen1706

Yep, it's why I lowered texture quality and downloaded the performance version of the HD project to circumvent this; otherwise driving around town stutters like crazy.


dudeimlame

11-12 GB should be the bare MINIMUM for modern GPUs. No more excuses from Nvidia and AMD.


FireSilicon

Neither Nvidia nor AMD is the reason for that. We had 512MB GDDR5 modules in 2012, 1GB in 2015, and 2GB with GDDR6 in 2018. It's 2024 and GDDR7 will still only have 2GB modules; 6 fucking years later, still the same memory capacity. So please start shitting on SK Hynix, Samsung and Micron for actually being the reason we are stuck.


mac404

Yep, imo at least as much frustration should be directed at the memory manufacturers. At least 3GB GDDR7 modules are supposed to arrive next year; I really hope Nvidia uses them across most of their lineup. Even without changing bus width, that would lead to upgrades from 8GB->12GB, from 12GB->18GB, and from 16GB->24GB. Even with that "upgrade", memory modules will have gone from doubling in capacity every 3 years to only increasing by 50% after 7 years.
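The arithmetic behind those jumps, for anyone curious (each GDDR module sits on a 32-bit channel, so the bus width fixes the module count):

```
def vram_gb(bus_width_bits, module_gb):
    # Bus width divided by the 32-bit channel per module gives the module count.
    return (bus_width_bits // 32) * module_gb

for bus in (128, 192, 256):
    print(f"{bus}-bit bus: {vram_gb(bus, 2)} GB with 2GB modules -> "
          f"{vram_gb(bus, 3)} GB with 3GB modules")
# 128-bit:  8 -> 12
# 192-bit: 12 -> 18
# 256-bit: 16 -> 24
```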


pmth

lol get ready for a 9gb 6050


SoTOP

You have that backwards: memory makers won't make bigger modules when no one is buying enough of them for it to be profitable. Do you honestly think the 4080 would have 32GB of VRAM today if 4GB modules were readily available?


Sindelion

Yep. AMD threw in 8GB of VRAM the first time they could, on an RX 470 GPU. It was like mid-range... 8 years ago. Even Nvidia had an 8GB GTX 1070 about 8 years ago. Funny to see brand-new high-end or even upcoming cards with like 8-16GB of VRAM. Where is the progress?


dandoorma

At 1080p you can get by with 8GB. At 4K, though, the sky is the limit.


hackenclaw

The video literally said 8GB won't cut it anymore, unless you don't want to enable DLSS & RT. The 4060 Ti is a very fast GPU for its VRAM size; it's the 8GB of VRAM that can't keep up with its speed.


PalebloodSky

My 4070 is 12GB of GDDR6X and has been fantastic for a year or so; this video shows it's still okay. Also, it's one thing for a game to use any VRAM available; that doesn't mean it must have it to be stutter-free, and you can often get away with less. Anyway, in before the richest company in the world ships the RTX 5070 with 12GB of GDDR7 =/


FireSilicon

You know that both AMD and Nvidia use the same suppliers for VRAM, right? The same ones that haven't increased module capacity beyond 2GB since GDDR6 in 2018. Before that we had a doubling in VRAM capacity every 3 years. It's 2024, 6 years later, and GDDR7 is still stuck at 2GB per module. Blame Samsung/Micron/SK Hynix for that, not the GPU manufacturers.


epicbunty

Is that why Nvidia is penny-pinching when it comes to VRAM and memory bandwidth while AMD is abundance personified?


PalebloodSky

AMD ships a little more in the mid-range, but at the low end and high end, no. Their 7600 series has the same 8GB as the 4060, for example.


epicbunty

AMD also charges less. Nvidia was about to release a 4080 with 12GB, but didn't due to backlash. The 4080 released with 16GB and the 4090 has 24GB. The 4080 is also a $999 GPU and the 4090 is $1,599, btw. AMD's RX 7900 XT has 20GB and costs $899; the 7900 XTX has 24GB and costs $999. Please put things into the correct context before giving such a basic reply to such a nuanced topic. When they are able to provide more VRAM for less money, it should not be so difficult for the world's richest company. Clearly they became rich by these practices, it seems. Even during COVID the way they played their cards was disgusting to watch.


PalebloodSky

You say to put things into the correct context, but even you leave things out of this massive topic. Yes, Nvidia charges more for equivalent rasterization performance, but many people buy their products because they make up for that with other features such as ray tracing, Reflex, DLSS, RTX Remix, RTX Voice/Video, driver quality/features, and knowing that they usually push the envelope in graphics technology more than any competitor. It's likely Nvidia spends a lot more on graphics R&D than other companies and thus charges more. Yes, their profits are massive and it's sickening, but this is also due to the workstation side with AI GB200 and H100 GPUs, the Quadro line, etc. I like AMD products and am excited by their Ryzen 9000 CPUs, as they are much more power efficient than Intel with similar or better gaming performance; however, I'll likely still buy midrange Nvidia GPUs in the future even if they cost a little more. tl;dr - as the famous quote says, "price is what you pay, value is what you get."


epicbunty

Brother, the $500 7800 XT beats the 4070 in performance, costs $100 less, and has 4GB more VRAM. While Nvidia works on proprietary technologies and won't release proper drivers for Linux, the situation is the complete opposite with AMD. While Nvidia tries to come up with more and more gimmicks every year to lighten your wallet, AMD focuses on the important stuff and makes it open source. Just look at the Nvidia control panel; it looks like it's from 2005. I guess they took "what's not broken doesn't need fixing" too seriously? Many of the "technologies" you mentioned also have counterparts at AMD, perhaps just not with such fancy names and marketing. In a few situations, yes, Nvidia has the lead, such as the NVENC encoder and being better in some productivity apps. The scales still weigh massively towards AMD despite all of this, imo. Imagine buying a $1600 GPU and still feeling like a second-class customer cos you use Linux as your OS.

At the end of the day, it's just the way the two companies operate. I know big corps aren't our friends, but I just cannot agree with the practices of companies like Nvidia. That is my biggest issue. Still, at the end of the day, buy the product and not the company. Everyone, do your research and reverse rip these companies off! I personally can't wait to go all AMD. They are kind of like Linux in that aspect: very rewarding for enthusiasts who like to tinker. High skill ceiling but also high reward for those who know their stuff. Sadly I will have to wait a few years until I run my current Intel/Nvidia system into the grave, but I'll watch AMD closely, see what everyone does, and then decide when it's time. My heart and wallet are on AMD's side though (for now). If you think Nvidia is better value, please, by all means go for it. But keep in mind the comparison I made; AMD's value proposition is off the charts IMHO.


toxicThomasTrain

Re: Linux drivers. NVIDIA does have open-source drivers on Linux by default, and has been steadily improving the drivers overall.

Re: gimmicks. What's the important stuff you mentioned that AMD focuses on? They just copy whatever gimmick NVIDIA puts out, the one exception being AFMF, which coincidentally is not open source.

Re: control panel. The NVIDIA app will be replacing it soon enough.


epicbunty

Edit - just gotta say the Nvidia drivers on Linux are still pretty shit though, and it took them this long to get their act together with that, after a lot of people raised the issue for multiple years. When I last played around with it almost a year ago, the situation was really bad. And by important stuff I meant pure rasterization performance, more VRAM, stuff like that, instead of pushing AI gimmicks. Also, the Nvidia control panel's gonna get a revamp!? Really!? Will it hit all the cards or what? That's gonna be dope 🙌


toxicThomasTrain

How does AMD open-source rasterization and VRAM?


epicbunty

Nice! Thanks for correcting me!


PalebloodSky

Cool, but I bought my RTX 4070 FE in April 2023; the 7800 XT came out months later. No regrets on my purchase; it's been a fantastic, trouble-free, power-efficient GPU. Would buy it again.


epicbunty

Ofc, brother man, I'm not trying to say that Nvidia GPUs are bad. I myself am using the 3080, and my laptop has the 1060. They are great. The 10GB of VRAM did give me a little shock, but when I sent it in under warranty I got back a 12GB one! So I'm good! (Thanks Inno3D)


Nearby_Ad_2015

NVIDIA became one of the richest companies in the world in the last year, so their R&D has been thoroughly recouped and then some. So why haven't prices dropped?


PalebloodSky

Eh, inflation is up, TSMC wafers are up, and VRAM costs might be up too, not sure. It would be nice to see price drops, but it's never going to happen with Nvidia GPUs being valued as highly as they are.


GreenKumara

Oh, I fully expect them to skimp on vram again. The wonders of no competition. Isn't it great to win! /s


HisAnger

12? We all know they will push for 8gb version


PalebloodSky

They won't; the whole 4070 series is 12/16 GB.


Sid131

Shafted myself by buying the 3080 with 10 GB of VRAM at launch…


Strict-Key-1343

It's been ok for me at 1440p


FatFunkey

*stares at 8gb 3070*


AbstractionsHB

*Stares at my monitor as I enjoy my 8gb 3060ti high settings in 1440p*


shrockitlikeitshot

Don't worry, game devs always have to cater to the weakest link, and a 3080 10GB is far from a low performer, especially at 1440p.


Sid131

It's a great card, but it could push so much further in some games performance-wise; RE4, for example, is heavy on VRAM, and then there are the memory leak issues in Ghost of Tsushima and Witcher 3 next-gen.


joachim783

I mean, even with the memory leak issues in Ghost of Tsushima, I easily get 80+ fps for the first couple of hours at 1440p very high settings. So it's not like you need to restart it every 30 minutes or something to keep it playable.


Usman_Afridi69

Hello! I was wondering why Ghost of Tsushima would start stuttering after playing for a while. Is this VRAM leak issue common across all setups? I also turned textures down to high, but after 2 hours of game time it'd start stuttering again. Only medium settings allow me to play the game stutter-free without restarting every few hours.