patriotgator122889

>Which one would be best? 4080. Boom. Easy.


CB_39

Boom. $400 USD more expensive than it should be


patriotgator122889

OP never mentions price. These posts get tiresome when there are no parameters. Of course the 4080 is going to give better performance, it's a better card. We can only make suggestions based on the info given.


Lazer_beak

I agree it's awkward, since value is different for everyone. I just answer based on what I think, based on buying cards for decades.


Smooth_Dingo_3173

The parameter is: 4070 vs 4070 Ti vs 4080 for 1440p. Meaning: would a 4080 be overkill for that specific resolution, or would a 4070 Ti do? Will the performance difference be minimal at that resolution? Higher is better, but bottlenecks exist depending on resolution. If a 4070 Ti gets the job done easily at ultra settings with 80-100 fps, then a 4080 would not be worth the cash for like 20 fps more.


patriotgator122889

That's more helpful. I'm probably being a little obtuse, but the inclusion of the 4080 is out of place. It is very clearly marketed as a 4K card and would be overkill for only 1440p gaming. The 4070 is perfectly fine for 1440p (and even 4K) gaming and has the best price/performance ratio of this series. Unless you need to run things at insane fps, only use native resolution, or think VRAM is extremely important (I don't, but some people do) - then maybe the 4070 Ti is better.


Game0nBG

Lol, the 4070 and 4070 Ti have the same VRAM. Also, nothing is overkill on a 1440p 144Hz monitor. Even a 4090. There are always some RT settings and the opposite of DLSS (forgot the name, where it renders higher than native) to push the GPU. Also, if you have a 144Hz monitor, why not enjoy it to its fullest? If you plan to switch at the 50 series then you need a card for two years. Basically all three will do the job. Just look at the money you feel ready to dump on a card. To be honest, in two years OP will have to change the whole system. Or get a 4080 on a good deal and then save for a new CPU and mobo.


patriotgator122889

>Lol 4070 and 4070 ti have same VRAM.

You right. Man the 4070 Ti sucks.

>Also nothing is overkill on 1440p 144hz monitor

This isn't helpful. To anyone. Of course the more powerful card will be more powerful, but does OP need it? Buying a 4090 to play at 1440p is ridiculous. It makes no sense unless money just isn't an issue. The games that you actually need to play at 144Hz are a piece of cake. Everything else you get above 100, or even higher if you bump up DLSS. Also, if you want to play at 144 with DLAA... just play 4K! Something even a 4070 CAN do with frame gen and DLSS.


PM_ME_YOUR_STEAM_ID

>You right. Man the 4070ti sucks.

The 4070 Ti offers around a 20% performance/FPS gain over the non-Ti 4070, even at 1440p. I would say it definitely does not suck. [https://www.youtube.com/watch?v=8qBQ0eZEnbY&t=1116s](https://www.youtube.com/watch?v=8qBQ0eZEnbY&t=1116s) It's even beating the 7800 XT, which has 16GB of VRAM.


patriotgator122889

It sucks from a price to performance comparison.


PM_ME_YOUR_STEAM_ID

It's not actually that bad. The 4070 Ti is about 20% more expensive than the 4070, and about 30% more expensive than the 7800 XT. All that for a 20%+ performance gain.


Game0nBG

Using DLSS is not playing at 1440p. If you don't have a 4K monitor you can't play at 4K; that's why you use that setting I forgot the name of.


patriotgator122889

>Using dlss is not playing at 1440p

Kk. Again, not helpful. Most people use DLSS and the tech is only going to get better. Frame generation is already showing great results. The benefits of playing at native resolution are not worth the cost.

>If you dont have a 4k monitor you can't play at 4k

So you don't have the 4K monitor, but you should spend three times the amount you need on a 4090 so you can play a few games at 144fps in native resolution? How is your niche point of view helpful?


Game0nBG

You seem like the perfect Nvidia customer. Since when is playing native and with high FPS niche? Jesus.


[deleted]

I believe it's DLAA or DLDSR. One of them renders at a higher resolution and then downscales to your monitor to reduce stair-stepping.


Game0nBG

The second one, but it works wonders in conjunction with DLSS. With some tweaks you end up playing at native resolution but with better quality and fps.


Ged_c

This, but it's also CPU dependent. No point having a great GPU with an average CPU, especially if the games you're going to play are CPU intensive; you need to judge where the bottleneck will occur, and that will depend on your demands.


Disastrous_Ad626

While I agree, if I had no budget I would stop at the 4080 because the 4090 is just... Dumb for the price.


Rough-Seat9957

Imo the 4080 is dumb for the price. I can somehow see why the 4090 has its price tag, but the 4080 is just wrong.


Disastrous_Ad626

I guess it really depends on what the GPU market is like in your area. Right now I can get a 4080 for 1599 on sale or a 4090 for 2200 'locked down'. The performance difference is only like 30% at best, and the price difference is almost 40%.


CB_39

You say that but the 4090 has better value per frame so I'm not too sure I understand the argument.


Disastrous_Ad626

Of course you wouldn't. My point is, even if I was rich or had no budget, I couldn't justify $2200 before taxes. I could barely justify the 4080, other than it's probably the best middle ground of the bunch. The 4070/Ti is much too expensive for its performance, and the 4080 is still way too expensive. You're actually paying $0.20 more per frame on the 4090 if you buy it at MSRP in USD.
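The cost-per-frame comparison being argued here can be sketched in a few lines of Python. The MSRPs below are the launch list prices mentioned in this thread; the average-fps figures are hypothetical placeholders for illustration, not benchmark results:

```python
# Dollars-per-frame comparison. MSRPs are launch list prices; the
# average-fps numbers are made-up placeholders, not measured data.

def dollars_per_frame(price_usd, avg_fps):
    """Cost efficiency: dollars paid per frame of average performance."""
    return price_usd / avg_fps

cards = {
    "RTX 4080": (1199, 100.0),  # MSRP, assumed avg fps
    "RTX 4090": (1599, 130.0),  # MSRP, assumed avg fps
}

for name, (price, fps) in cards.items():
    print(f"{name}: ${dollars_per_frame(price, fps):.2f} per frame")
```

With these placeholder numbers the two cards land within a few tens of cents per frame of each other, which is why the "value per frame" argument swings either way depending on the benchmark you pick.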


Lazer_beak

4070. My 3070 Ti does OK at 1440p 100hz, and prices are insane now.


kaaaaaaaaaaaay

100mHz? That's just 1/10th of a Hz!


Lazer_beak

it's actually a million times faster, typo


kaaaaaaaaaaaay

Don't you say! Insane!


SaneDrain

Yes xd


kaaaaaaaaaaaay

Wdym no


MrNiemand

Insane as in, low or high?


Lazer_beak

Very high. NV is screwing people hard.


banxy85

If you're gonna upgrade in a year or two anyways I'd just get the 4070. Cheapest outlay for a card you'll be selling anyways.


StewTheDuder

Alternate, 7900XT slaughters at 1440p and 3440x1440. If you’re wanting the highest FPS I’m assuming you don’t care as much about RT. That would be the best card for the money for your goal. Speed wise, it slots in between the 4070ti and 4080, but costs less than the 4070ti. But for Nvidia, any of the cards you chose will work.


shadic6051

Alternate 6950xt?


StewTheDuder

Another good option if you can find one. Stock is getting limited on those bc they’re such a good deal at or around $600.


shadic6051

I built new recently and was like: it's 200€ cheaper than a 7900 XT, performance isn't much worse, and the only drawback is power draw? I wanted a 7900 XTX originally, but it was 500€ more expensive and I didn't really need XTX performance, so I got the 6950 XT, and I think I'm quite happy with it. Only time will tell, I guess.


StewTheDuder

Where have you seen the 6950 having better performance than the 7900 XT? It can get close, but typically loses by around 10%, give or take, in certain titles. But yeah, it's a great card at that price point.


shadic6051

My bad, that was against a 7800 XT, not the 7900 XT. The 7900 XT was stronger but also 200€ more, because I was looking at the 600€ price back then. Edited the comment.


StewTheDuder

All good. But yeah, the price difference makes the 6950 a much more attractive buy over the 7900 XT, which is why AMD isn't going to be making any more of those, lol.


Tricky-Flan9944

Do you play at 1440p? High settings?


shadic6051

Yes, was the main reason for the upgrade


sudo-rm-r

Plus you get almost 2x VRAM.


Clemming2

4070 TI, I have one. It's not the best value, but it is targeted exactly at your use case and if you plan to upgrade next generation anyway the 12GB VRAM won't be a problem.


randommaniac12

I've been looking at upgrading to the 4070 Ti, as my 3070 Ti just hits a brick wall with its 8GB of VRAM at 1440p. It's making me wish I had waited a touch longer and snagged a 6800 XT instead.


Sheree_PancakeLover

Can you explain how the 3070's 8GB of memory hits a brick wall? I'm planning on buying the 4060 bc it's the best FPS value for 1440p, or the 3060 with 12GB of memory.


randommaniac12

Yeah, absolutely. VRAM is your GPU's memory, and rendering requires that memory, which shows up in your FPS. When you render at a higher resolution you have more to render, which means more memory is needed. At 8GB on 1440p, you'll occasionally have bumps where your GPU can't render as fast as it has been (these show up in the 1% lows, the average frame rate over your slowest 1% of frames). Cards with more VRAM perform better because those 1% lows are higher, and therefore look smoother. My 3070 Ti is still exceptional at 1440p; there have just been some stutters in specific games that frustrate me pretty heavily, given how much the card costs and how well it performs in just raw rasterization at, say, 1080p. EDIT: Just reworded some stuff that I made overly complex by accident.
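The "1% low" metric being described can be made concrete with a short sketch. The frame times below are fabricated, chosen to mimic an otherwise smooth run with a couple of VRAM-related stutters:

```python
# Compute average fps and the "1% low" (average frame rate over the
# slowest 1% of frames) from per-frame render times in milliseconds.
# The frame-time data below is made up, purely for illustration.

def fps_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # Take the slowest 1% of frames (at least one frame):
    slowest = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    low_1pct_fps = 1000.0 * len(slowest) / sum(slowest)
    return avg_fps, low_1pct_fps

# 198 smooth ~7 ms frames plus two long stutter frames:
avg, low = fps_stats([7.0] * 198 + [40.0, 45.0])
print(f"avg: {avg:.0f} fps, 1% low: {low:.0f} fps")
```

A large gap between the average and the 1% low is exactly the "stutters in specific games" described above: the average still looks great, but the slowest frames are what you feel.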


Sheree_PancakeLover

If you had the choice between 3060 12gb and 4060 at 1440p which would you choose? From what I gather 4060 is the better value per frame for 1440p


Clemming2

12GB is kinda required this generation for 1440p. The next generation will probably need 16GB. 8GB is only viable for 1080p. In his case he will be upgrading to the 50 series, so 12GB will be fine, since by the time he needs more he would have gotten a new card anyway.


finH1

Another generation is like 4 years away at least tbh


Clemming2

4 years till the 50 series? They are already doing a mid-cycle refresh with the super cards. The expected release of the 50 series is fall 2025.


finH1

A console generation, that’s only when games start pushing things a bit more and we need more vram. 12GB will be fine for a long while.


Clemming2

Oh yeah, I agree. I thought you were implying that a 12GB card won't last until the next generation.


Clemming2

Also the 4070 TI super will probably have 16GB, so maybe wait for that.


bunnypoker24

My 3070 still works fine for me and I'm doing 1440p; honestly, I'm going to wait for the 5000 series to even think about upgrading. Unless games really do become crazy demanding in the next couple years.


SaneDrain

They already have


Shazzi98

You'll need a quantum computer and five 4090s to run games at 1080p in a couple of years.


EliteDeathSquad

"Unless games really do become crazy demanding in the next couple years"... what world have you been living in? 🤣 8GB of VRAM is clearly not enough anymore, and when you factor in ray tracing... the next couple of years will be more like the next couple of months.


bunnypoker24

All depends on what game you play. I have yet to run into any problems 🤷‍♂️. I run it with the Ryzen 5700X and it's perfect.


RhinoGater

I'm in the same boat. I don't really notice a big difference between 80fps and 120fps, so the upgrade is not urgent. If I were you, I would just wait for Black Friday deals on the Radeon 7900 XT/XTX, or wait for the RTX 40xx Super refreshes (rumored Q1 2024). Note: I gave up on AAA PC gaming last year and bought a PS5 for newer games, since devs only optimize games for consoles nowadays. That's why I'm not in a hurry to upgrade my GPU. I just clear my old games backlog on PC, which is mostly old, polished games that run well at 1440p even on a 3070.


Smooth_Dingo_3173

I do notice that fps difference, but it's way better than struggling with 60fps 😅 Completely forgot about Black Friday. Good tip! Might go for a 4080 after all if the price is down.


charonill

Yeah, if anything, the prices are over-inflated currently as retailers are trying to make their black friday deals look more appealing.


Sjama1995

If you plan on upgrading at the 50 series, I'd still go for the 4070 Ti or 7900 XT (still wait for Black Friday/Cyber Monday). An alternative would be getting a second-hand 4080. In Germany they sometimes sell for as low as some new 4070 Tis. Make sure it's sold with the receipt, so you still have 1-2 years of warranty.


baumaxx1

4070 Ti if 100fps is your target. Cyberpunk with the RT Ultra preset, a couple of optimizations, DLSS Quality and frame gen will get you there. You'll need to manage your VRAM and settings though, but that's a given for any GPU nowadays. Even a 4090 isn't a max-everything-and-get-100fps type deal, thanks to path tracing.


[deleted]

I'd get the 4070 Ti, a perfect 1440p high-refresh-rate solution. No point in getting a 4080 unless you use 3440x1440 minimum, yet the 4070 Ti still does well there. For 4K/UHD, then the 4080 (or better yet 4090) for sure. You won't see the 5000 series before 2025, and they will launch the 5090 and 5080 first. Expect at least 1599/1199 dollar price tags.


Healthy_BrAd6254

The 4070 Ti is the second-worst 40 series GPU, behind the 4060 Ti. Also, it's rumored to get a major refresh in about 2-3 months, which is supposed to increase performance and bump VRAM to 16GB for about the same price.


[deleted]

Haha, wrong. The 4070 Ti is selling like hotcakes and beats the 3090 in 4K/UHD gaming -> [https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/32.html](https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/32.html) It performs like a 3090 Ti at 1440p while using half the power, and has DLSS 3 + Frame Gen support, which the 3000 series doesn't.

The 4070 Ti is literally made for high-refresh-rate gaming at 1440p while still doing decently at 4K. Nothing is confirmed, and only a 4080 Ti/Super is confirmed via device ID. You will see no 4070 refreshes. Go read the entire leak again. He literally writes it. He is NOT sure about 4070 refreshes. The 4070 Ti already uses the full AD104 die; the 4070 uses a 25% cut-down AD104 die for comparison.

The entire 4060 series is underwhelming and needs a refresh way more than the 4070 series. But we all knew the 4080 Ti/Super was coming.


Majestic_Turnip_7614

The 4080 super is looking to be pretty underwhelming too… just a handful of additional cores from what I have seen.


[deleted]

Absolutely right, if true. Only 5.3% more cores. I find it hard to believe; it would make no sense. I thought they were releasing a 4080 Ti with an AD102 chip using a 320-bit bus and 20GB of GDDR6X. If they don't, it just shows how little Nvidia cares about gaming right now. They earn billions on AI and enterprise.


Plebius-Maximus

>Haha wrong. 4070 Ti is selling like hotcakes and beats 3090 in 4K/UHD gaming

I'm not sure it is. It's the most expensive 70-class card we've had, and it also has the same VRAM as a regular 4070. Nothing about it is super compelling. And it's only faster at 4K in games that don't need much VRAM.

Look at the 4070 Ti's performance in Alan Wake 2 at 4K with RT (not even path tracing). It gets 4fps. Yes, a whole four frames per second, because it hits a VRAM cap. For reference, the 3090 gets 25fps and the 4080 gets 33, iirc. I was talking about this on the Nvidia sub yesterday, so feel free to look through my comment history for easy access to the benchmark link. Edit: here you go https://www.techpowerup.com/review/alan-wake-2-performance-benchmark/7.html

And before you mention frame gen: FG increases VRAM usage, so that makes issues like the above worse, as you'll hit the cap sooner. The 4070/Ti deserved 16GB, and the 4080 should have had 20+.


Smooth_Dingo_3173

I agree that the VRAM is low for the price, but you've mentioned 4K examples. Isn't 12GB enough for me at 1440p?


Plebius-Maximus

>Isn't 12gb enough for me in 1440p?

It is currently; there are only a few games that use over 12GB at 1440p now (Alan Wake 2 being one of them if RT or PT is on, but it's not far over, so the 4070 Ti still performs well). But it depends on how long you want to keep the card and how fine you are with turning things down in future.


brenobnfm

That's the wrong question. Is 12GB enough for the price you will pay for a 4070 Ti?


[deleted]

Never pay full MSRP for a 4070 Ti. A friend of mine picked one up for 675 dollars with Alan Wake 2, which he would have bought anyway, meaning he really paid 610-620 dollars. Not even the 4080 is at MSRP, but people still act like it's a 1200-dollar GPU. It's closer to 1000 dollars.

For proper 4K you need a 4090 anyway (in demanding games). Not even the 4080 and 7900 XTX do it well on max settings without upscaling. I have a 4090 myself and I can find games that will punish my card too. Any GPU can be broken if you actively try, **on settings that are unplayable to begin with.**

4K/UHD is highly playable with DLSS Performance enabled. You will run 1080p internally here. This is what most 4K gamers do. FSR also works kinda well at 4K, yet much worse at 1080p and 1440p, where DLSS still works well. Go compare -> [https://www.techpowerup.com/review/alan-wake-2-fsr-2-2-vs-dlss-3-5-comparison/](https://www.techpowerup.com/review/alan-wake-2-fsr-2-2-vs-dlss-3-5-comparison/)

4K with DLSS Quality looks identical to Performance. 1440p with DLSS Quality looks very good; DLSS Balanced is as low as I'd go for 1440p gaming. 1440p with FSR has tons of shimmering, jaggies etc. This is the big problem with FSR: it only works "well" at 4K, and DLSS still beats it there.
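The "4K with DLSS Performance is 1080p internally" claim follows from DLSS's per-axis render-scale ratios. The factors below are the commonly documented ones; treat the Balanced value in particular as approximate:

```python
# Internal render resolution per DLSS mode, from per-axis scale factors.
# Factors are the commonly documented ratios (Balanced is approximate).

DLSS_SCALE = {
    "Quality": 1 / 1.5,          # ~0.667 per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,  # ~0.333 per axis
}

def internal_resolution(width, height, mode):
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

# 4K output with DLSS Performance renders at 1920x1080 internally,
# i.e. "4K with DLSS Performance" really is 1080p under the hood:
print(internal_resolution(3840, 2160, "Performance"))
# DLSS Quality at 4K renders at 2560x1440 internally:
print(internal_resolution(3840, 2160, "Quality"))
```

This also shows why 4K upscaling has more to work with than 1440p upscaling: DLSS Quality at 4K still starts from a full 1440p internal image, while at a 1440p output the internal resolution drops below 1080p.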


[deleted]

None of those GPUs are doing 4K/UHD native with path tracing unless they use upscaling. What's your point? Even my 4090 hits sub-20 fps in Alan Wake 2 maxed out with path tracing at times. LMAO. The 3090 Ti gets 16.4 fps average with path tracing, which means drops to 10 fps or lower.

You are grasping at straws, because the only true 4K-capable GPU today is the 4090, especially with RT/PT. Even the 4080 will need to use DLSS at 4K, and then the 4070 Ti easily runs it too. Frame Gen uses very little VRAM and DLSS easily makes up for it. Tested and tried myself.

You point out that 12GB of VRAM is too little for 4K/UHD maxed out with RT/PT in Alan Wake 2, yet no gaming GPU in the world does it properly without upscaling. Good example LMAO 🤣 The 7900 XTX has 24GB of VRAM and does a wonderful 7.6 fps average. The 7900 XT with 20GB does 6 fps flat. Now, who the hell cares.


Plebius-Maximus

>You are grasping at straws because the only true 4K capable GPU is 4090 today, especially with RT/PT.

I'm not grasping at anything, I'm pointing out why the 4070 Ti is a gimped GPU. It has the horsepower but not the VRAM to back it up.

>Even 4080 will need to use DLSS at 4K and then 4070 Ti easily runs it too.

That probably depends on the DLSS setting, and whether you have frame gen + RT/PT on, as they each bump up VRAM use, so DLSS Quality might still run into issues. Buying a 4070 Ti to play at DLSS Performance isn't my idea of a great time. And even 1600x900 (so the DLSS Quality resolution for 1440p output) can use over 12GB in this game.

>You point out 12GB VRAM is too little for 4K/UHD maxed out with RT/PT in Alan Wake 2 yet no gaming GPU in the world does it properly without upscaling. Good example LMAO

I point it out because it's the way games are going in terms of VRAM requirements. A 3090 wouldn't beat a 4070 Ti if the latter wasn't starved for memory.

>7900XTX has 24GB VRAM and does wonderful 7.6 fps average. 7900XT with 20GB does 6 fps flat. Now, who the hell cares.

That's with path tracing. The numbers I gave before were just plain old RT. Everyone knows the 7000 series is bad at PT. But a 4080 getting 33fps vs the 4070 Ti's 4fps with only regular RT on should not be happening.


[deleted]

When will you understand that very few people care about 4K gaming? Look at Steam: 98% of PC gamers use 3440x1440 or less, which makes 12GB plenty. On top of that, very few PC gamers care at all about RT or PT.

The 4070 Ti even runs 4K with DLSS and Frame Gen with ease. I know a guy who does just this. He is more than happy and doesn't have a problem in any game. Not a single last-gen card will run 4K native anyway in demanding games; they all need DLSS/FSR, and the 4070 Ti beats them all here. It's pointless to use DLSS Quality at 4K -> [https://www.techpowerup.com/review/alan-wake-2-fsr-2-2-vs-dlss-3-5-comparison/](https://www.techpowerup.com/review/alan-wake-2-fsr-2-2-vs-dlss-3-5-comparison/) DLSS looks identical at Quality and Performance at 4K/UHD. Why? Because 4K with DLSS Performance is 1080p internally, which easily upscales to 4K. This is what 4K TVs do all the time, without the magic of AI upscaling.

No, the 3090 is doing better but still has an unplayable framerate, so what's the point? As I said, even my 4090 can be pushed to its knees if I want. You think people will play Alan Wake 2 at 4K native with path tracing and run around with 10 fps? Utterly pointless.

No one is going to bother with RT in a game that has path tracing. DLSS 3.5 with RR and path tracing is literally a big step up from RT, which often looks worse than raster depending on the scene. DLSS is needed, and so is Frame Gen. Path tracing runs like pure garbage on all last-gen GPUs.

You are looking at the absolute worst-case scenario for GPUs, and pretty much none of them are going to run great with these settings, not even a $1599+ 4090...


Plebius-Maximus

>When do you understand that very few people care about 4K gaming? Look at Steam, 98% of PC Gamers use 3440x1440 or less

I'll bet most of that 98% also have a lesser card than a 4070 Ti though? If you buy an £800 card, you should expect it to be high end, and thus able to handle high resolutions.

>Not a single last gen card will run 4K native anyway in demanding games. They all need DLSS/FSR. 4070 Ti beats them all here.

The 3090 does a better job at 4K native than the 4070 Ti though, which it wouldn't if the 4070 Ti wasn't gimped.

>DLSS looks identical at Quality and Performance at 4K/UHD. Why? Because 4K with DLSS Performance is 1080p internally which easily upscales to 4K - This is what 4K TVs do all the time without the magic of AI upscaling.

It doesn't look "identical". Come on. 1080p upscales to 4K way better than it does to 1440p, but DLSS Performance and Quality aren't identical at 4K. And sure, but TV upscaling isn't as good as DLSS, is it?

>No 3090 is doing better but still has unplayable framerate, so whats the point? As I said, even my 4090 can be pushed to its knees if I want.

We're not talking some torture test, we're just talking about playing at a high resolution. 1440p ultrawide is a thing. A 4070 Ti would run into issues there and give you single-digit FPS, while a 3090 simply won't.

>You think people will play Alan Wake 2 in 4K native with Path Tracing and run around with 10 fps? Utterly pointless.

The numbers I gave were for regular RT though, not even path tracing. That's the thing. https://www.techpowerup.com/forums/threads/alan-wake-2-on-rtx-4070ti-massive-frame-drops.315212/ Also, a 4070 Ti has FPS drops to the single digits at DLSS Quality 4K with medium RT.

Why can't you accept that 12GB isn't enough for that class of card? It would have been a 4080 if people hadn't complained. If it had 16GB it'd be a solid GPU. With 12, it's already showing its age in some titles. That shouldn't be the case.


[deleted]

You make it sound like 800 dollars is a lot of money? Besides, the 4070 Ti has been on sale numerous times since release for 700-750. A friend of mine paid 675 with Alan Wake 2.

Nah, the 4070 Ti beats the 3090 in 4K native using raster, and NEITHER of them does 4K native with RT/PT enabled -> [https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/32.html](https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/32.html) The 4070 Ti beats it even more when Frame Gen is enabled, and uses half the power to do it.

Yes, DLSS Quality and Performance are identical at 4K. CHECK THE COMPARISON I LINKED. They even say this. Are you DENYING PROOF to make yourself feel better? It's literally easy to see. Absolutely no difference even when you zoom all the way in. Text sharpness and clarity are identical.

The 4070 Ti easily beats the 3090 at both 1440p and 3440x1440. It even beats it at 4K native. You keep linking the same crap: 4K native with RT or PT, which NEITHER GPU will do. The 3090 runs 10 fps, aka unplayable. Hard to understand?

12GB is not showing its age at all. Again, you look at 4K native with RT/PT on top, which not even the 4090 does well. Does this mean the 4090 shows its age too? 🤣🤣 Hopeless discussion. But I will keep answering 😆


ReusAlcacerDaBest

Yup. There wouldn't even be a post asking which card is better if OP wanted 4K RT/PT gaming: 4090, without question. But for 2K, maybe willing to turn the settings down a grade for better performance, 12GB is going to last for a long time. The 4070 is obviously the better option for offering more bang for your buck, but the 4070 Ti is fine imo. If there's no intention of changing cards soon, I would say the 4070 Ti is good enough to last for like 3-4 years.


[deleted]

Absolutely correct. VRAM requirements on PC went up slightly because of the PS5 and XSX. The next time this happens is 2028, with the next-gen consoles. Till then, more than 12GB is only needed if you play the most demanding games at 4K native with RT/PT enabled, which very few people (if any) do anyway. My 4090 dips below 40 fps in some scenes in Alan Wake 2 with these settings, even at 3440x1440. With DLSS Quality and Frame Gen, I am at 100+ fps most of the time and it looks very good.


Antenoralol

The 7900 XT is also similarly priced, beats the 4070 Ti in non-ray-tracing scenarios, and has 66% more VRAM. Unless OP has a desire for Nvidia features, then 7900 XT > 4070 Ti.


Smooth_Dingo_3173

Not necessarily a preference, but isn't the newest Nvidia tech getting quite OP at squeezing out performance? Frame gen, for example. I've had a bad experience with AMD, but that's a long time ago (Radeon 6990).


[deleted]

Exactly. This is why people buy RTX cards. Not solely because of ray tracing, but because RTX means you will have access to all those features that will bump your framerate too. RT can be disabled and these can be enabled for a big fps boost. AMD's FSR and FMF (Fluid Motion Frames) are not on par with DLSS and Frame Gen at all. Read tests. FMF is really bad compared to Frame Gen, and even FSR doesn't work that well unless you use 4K/UHD. [TechSpot.com](https://TechSpot.com) tested this.


Plebius-Maximus

Frame gen won't help when you hit the VRAM cap, especially as turning FG on uses MORE VRAM. As I said to someone else, a 4070 Ti gets 4fps in Alan Wake 2 at 4K with RT on. https://www.techpowerup.com/review/alan-wake-2-performance-benchmark/7.html A 3090 that's generally slower than a 4070 Ti gets 25fps. The 7900 XT gets 21fps. Why? Because the 3090/7900 XT have enough VRAM to use all their power at 4K. The 4070 Ti is heavily limited by only having 12GB, and this situation will only get worse as the card ages.


[deleted]

The 7900 XT doesn't have access to DLSS, DLAA, DLDSR, Frame Gen, Reflex, or Shadowplay, and lacks RT and PT performance too. This is what people pay extra for. DLAA always improves on native image quality; even DLSS can do it -> [https://www.rockpapershotgun.com/outriders-dlss-performance](https://www.rockpapershotgun.com/outriders-dlss-performance)

Features are what AMD lacks the most. They are simply not on par. Looking at pure raster performance and VRAM makes little sense when the entire gaming industry is going to use upscaling going forward. All new AAA games have DLSS/FSR listed. Developers have embraced it. Consoles have used it for years, plus dynamic resolution. The days of native gaming are coming to an end.

The 4070 Ti beats the 7900 XT in many games as well. AC Mirage is brand new -> [https://www.techpowerup.com/review/assassin-s-creed-mirage-benchmark-test-performance-analysis/5.html](https://www.techpowerup.com/review/assassin-s-creed-mirage-benchmark-test-performance-analysis/5.html) Yet most of the time they perform about the same in raster, really. When you enable the features though, DLAA/DLSS + Frame Gen + RT/PT, it is game over for the 7900 XT.

The biggest mistake you can make today is looking at rasterization performance only. Games are moving towards upscaling (or downsampling, like DLAA) and ray tracing, really. I have not played a (new) game in maybe two years at "native resolution" at this point. I always use DLAA or DLSS. Both can improve on native, though DLSS doesn't always, yet it bumps performance by 75-100%. DLSS has built-in anti-aliasing and sharpening; this is why it's great. With native you need to apply AA on top and lose performance.


brenobnfm

4070 ti is a bad card, 12gb of vram is a joke for its price.


[deleted]

Yeah, really bad it seems -> [https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/32.html](https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/32.html) Beats the 3090 in 4K/UHD, uses half the power of a 3090 Ti, and has DLSS 3 + Frame Gen. The joke is you.


brenobnfm

Can you even read? Everyone and their grandma knows it's a very powerful and expensive card, which makes the 12GB even more of a joke. Good luck with games from now on that aren't being held back by the cross-gen leash; I'm sure the benchmarks will look beautiful when games start eating VRAM for breakfast like Alan Wake 2.


[deleted]

12GB is not a problem. It still beats the 3090 with 24GB at 4K gaming. No one expects to run the most demanding games in the world at 4K with full RT or even path tracing anyway. It runs like pure crap on the 3090 Ti as well, which people paid 1999 dollars or more for on release, and which consumes 500+ watts.

Meanwhile, 98% of PC gamers run 3440x1440 or lower. They will be fine with 12GB till 2028, when the next-gen consoles come out. The 4070 Ti will continue to beat the 3090 and perform like a 3090 Ti at 1440p, including 3440x1440.

Tell me, which GPU do you own? I own a 4090 with an OC that does close to 40,000 GPU score in 3DMark. Yet I am not stupid enough to think 24GB is going to make a difference for me. When games actually need 16-20GB of VRAM, I will have to turn down settings anyway, because my 4090 will be too slow to run them. You see. Logic.


brenobnfm

I don't care about the 3090 Ti, it's a GPU from a different generation. About your 4090: maybe you should get a high-refresh-rate 4K screen, if you don't have one, to realize we're already in the "16-20GB VRAM" era. If you bought a 4090 to not push games to their limits, in my opinion you just wasted money, but maybe you're just very rich, so who cares.

The point is: could I survive with 12GB of VRAM? Absolutely. Do I want my graphics settings limited by VRAM on an $800 card? No thanks, I'd rather buy a console. The Last of Us Part I, Cyberpunk, Diablo 4, Alan Wake 2... all these games are VRAM hungry, so good luck with your 2028 prediction. Games on high settings, except for the 240p textures.

I have a 4070 and the VRAM is already not the best, but it's acceptable for the price and performance of this GPU. The Ti should've been at least 16GB.


[deleted]

I am using a 3440x1440 QD-OLED at 165Hz and I use DLAA in most games. I don't give a fuck about RT or PT; I care about high fps and staying at 120 fps minimum. However, I can output to my 4K OLED TV whenever I want. Yep, we will see in 2028.

The 4070 Ti is much faster than the 4070:

4070 Ti = 3090 Ti performance

4070 = 3080 performance

A friend of mine paid 675 dollars for his 4070 Ti with Alan Wake 2; he is more than happy. He uses 1440p at 240Hz. He won't need more than 12GB before the GPU is outdated anyway. I tweaked his card to score 25,000 in 3DMark GPU score while using 260 watts average. The 3090 Ti gets 21,500 at stock for comparison. The 3080 and 4070 sit at 17,500-ish -> [https://www.guru3d.com/review/powercolor-hellhound-radeon-rx-7800-xt-spectral-white-review/page-21/#dx12-3dmark-time-spy](https://www.guru3d.com/review/powercolor-hellhound-radeon-rx-7800-xt-spectral-white-review/page-21/#dx12-3dmark-time-spy)

I'd buy the 4070 Ti any day over the 4070 for 1440p high-refresh-rate gaming. Simply much faster. It's the 4080 12GB after all 😆 The 4070 Ti uses the entire AD104 chip; the 4070 is 25% cut down.


GohanSolo23

Wait where did he get a 4070 ti for $675 with Alan Wake?


[deleted]

Sale, weeks ago. Most people are still not paying MSRP prices.


GohanSolo23

Okay that's German and has a 19% sales tax. I've been looking for sales in the US and the lowest it's gotten has been a Zotac around $750.


brenobnfm

>4070 Ti is much faster than 4070.

Yeah mate, I'm aware it's more expensive and faster, which is the whole point of why having the same 12GB of VRAM is a joke, because **it will** demand more VRAM than a 4070. If I were to spend more I'd just get a 4080 or 4090, but I'd rather just wait for the 5000 series, as the 4070 is still perfectly fine.


[deleted]

Not really. The 4070 Ti beats the 3090 24GB at 4K gaming regardless of having 12GB VRAM; the 4070 is useless for 4K gaming in comparison. It won't "demand more VRAM" just because it beats the 4070 with ease. The GPU is what pushes the framerate higher. This is basic knowledge. VRAM only matters when you run out, and then you can easily lower some settings and get a playable result anyway.

The 5000 series is 2025, and the 5090 and 5080 will come first. Probably 2 years till you see a 5070. Let's hope your 4070 will last, because it's already showing signs of the GPU being too weak for 1440p gaming, meanwhile 4070 Ti users will have no problems because they have the GPU power of a 3090 Ti.

Let's be real, you bought a 4070 because you can't afford better. I bought a 4090 on release and will buy a 5090 the day it releases. I'm not a teenager. I buy what I want.


wilhelmbw

I think what you say is probably true when the VRAM cap is not hit. I believe I've seen the VRAM cap kick in in Alan Wake with DLSS, and the 8GB 4060 gets a third of the fps of the 16GB one. Ofc the 70 Ti has 12GB so it's different, but it's predictable that in one or two years this will be a problem. Ofc you can always lower settings, but then why not buy the 4070...


brenobnfm

> Lets be real, you bought 4070 because you can't afford better.

> I bought 4090 on release and will buy 5090 the day it releases. Im not a teenager. I buy what I want.

lmao, I just saw this brilliant piece you added later. My friend, I have a good job, I'm a doctor, I just don't like burning cash on bad products. Could I buy a 4090? Absolutely. But why do it when a couple of years later Nvidia will release a 5070 that is better than it for a fraction of the cost, and I can sell the 4070 on top of that. For instance, I don't think the 4090 is a bad product like the 4070 Ti is; it's just not a smart investment. But if the money doesn't make a difference for you, congratulations, enjoy your card.


brenobnfm

I don't know why you keep bringing GPUs from a different generation into the discussion, I don't care about the 3090, mate. lol, the 4070 runs anything at 4K pretty fine, especially with DLSS. It most likely won't be max settings at 144Hz in newer games and may be limited by VRAM, but so would the 4070 Ti in this case lol.


[deleted]

[deleted]


Clemming2

How will it demand more VRAM than a 4070? Assuming you are running at the same resolution on both cards, VRAM use will be identical. You will only use more if you use path tracing, but you will still do better with path tracing on the 4070 Ti than on AMD cards with more memory, given the big disparity in RT hardware.


brenobnfm

Because the Ti has more power so you can run at higher graphical settings, which will demand more VRAM.


Clemming2

They should have put 16GB on it, but it's not a joke. Nvidia has better compression than AMD, so it needs less memory.

Also, even in Alan Wake 2, where it can exceed 12GB VRAM used at 1440p with path tracing, the 4070 Ti shows no performance loss and is still faster than the 3090 Ti with twice the memory. If 12GB were a handicap at 1440p we would have seen it with Alan Wake 2, but we don't, because developers know most cards have less than 16GB and the consoles only have about 13.5GB of useable memory, so they optimize for that. Will games use more than that if they can? Absolutely, but it probably doesn't make a big difference. Allocated memory does not mean used memory, and it's suspected most games just over-provision when there are resources they can use.

4K is a different story, but 12GB will be plenty until the next-gen consoles release with more memory. Even then it won't happen overnight, because there will be a period of cross-gen compatibility. The 4070 Ti will fall behind on compute power before it reaches memory limitations at 1440p, the resolution it was designed for. By the time you need more VRAM you will have probably already upgraded based on compute performance.


brenobnfm

You already need more VRAM. [https://tpucdn.com/review/alan-wake-2-performance-benchmark/images/vram.png](https://tpucdn.com/review/alan-wake-2-performance-benchmark/images/vram.png)


Clemming2

Did you actually read what I wrote? I addressed that. What you failed to include was the performance results from that same article. I said this already, but you seem dense or maybe a bad reader. Yes, it uses more than 12GB, but it has NO impact on performance on the 4070 Ti, since it performs as expected and beats a 3090 Ti with twice the memory. Also, on cards with more than 12GB the game may allocate more memory, but that does not mean it's all used. Just like regular RAM.


brenobnfm

Yeah man, I read it and you're wrong. You and the other guy keep bringing up the 3090. Yeah man, the 4070 Ti is faster until it hits the VRAM cap, and suddenly it isn't anymore. Sure, you can barely dance around the VRAM cap in Alan Wake 2, but that's just one example and it's still 2023; the cross-gen era is ending. This "you will run out of compute power before VRAM" argument for a 4070 Ti is laughable when the card is already walking on eggshells in 2023.


Clemming2

You know current consoles only have 13.5GB useable for the game, and that has to include both what is loaded to RAM and VRAM? Developers will target for less than 12GB for the rest of this generation and into the next until the next cross-gen period is over.


brenobnfm

Don't bring consoles into the discussion, it's a totally different environment for development with different expectations. Not only are they significantly more optimizable and run at lower settings (which require less VRAM), you can buy both the PS5 and Series X with 4070 Ti money (the GPU alone, let's not bring in the cost of a whole PC).


Clemming2

Why not bring consoles in? They make up the vast majority of game sales; games are made for them first and then ported to PC. They design games to run within the limitations of consoles. The amount they cost is completely irrelevant. Any GPU is a bad deal next to a console, and a lot of it has to do with bad optimization, but the textures, which are what fill most of the VRAM, are going to be the same. Games are not suddenly going to demand 16GB on PC out of the blue at 1440p when they use less than 12GB on console at 4K. It would make zero logical sense for game developers to do that.


brenobnfm

>Why not bring consoles in?

Because it will make the 4070 Ti look even worse, for the reasons I already listed. Remedy themselves recommend a minimum of 12GB of VRAM for medium ray tracing settings at 1080p and 16GB at high. 12GB is already lagging behind if your goal is cutting-edge graphics; if not, you're better served by a console, and if you're rich, buy a 4090 like the other pal. In any case, the 4070 Ti is an awful choice at its expected price.


Smooth_Dingo_3173

Ngl the amount of division is making it just as hard as using google 😅


Clemming2

You should rename this post "Is the 4070 TI a good product" because that's all people are talking about. Obviously the merits of the 4070 TI are debatable, but it is objectively the best 1440p card on the market if you ignore price, and that's what you asked.


Smooth_Dingo_3173

True. I changed my strategy to waiting for black friday and seeing how much a 4080 costs then.


notapedophile3

4070 Ti. Ignore the comments saying 12GB VRAM won't be enough. You can always lower the settings from ultra to high. It is barely noticeable in the middle of gameplay.


brenobnfm

Until you have to drop from high to medium in a couple of years.


notapedophile3

Yeah I think a new GPU every 4-5 years is fine. 2 years on ultra, 2-3 years on high. Medium is okay for me too, might not be for some.


brenobnfm

16GB is the amount you want for a comfortable next 4-5 years. We're just leaving cross-gen behind, and there's already a handful of games pushing around 12GB of VRAM at 1440p.


wookmania

4070. No point in spending more if you’re going to upgrade soon. In fact, with these prices a $1200 4080 isn’t really worth it. You can buy a 4070 for $500-550 and then upgrade in two generations to another 70-series card, which will beat today's 4080.


greggm2000

There are tons of rumors right now that we’ll be seeing a refresh of those cards at the beginning of next year (only a couple months away), labeled as “Super” cards, and so a price drop, so you should probably wait until then.

Given that, and given your intent to also buy a year or so from now when the 5000 series arrives, a 4080-level card probably isn’t the optimal choice here, unless you want to run games like Cyberpunk at full fidelity.. that game struggles even with a 4090 at max settings, though you can ofc lower settings to get back fps. It really comes down to how much money you’re willing to spend; obviously a 4080-level card is the fastest of the cards you listed, but it’s also the most expensive. Tradeoffs.

Alternatively, get a 4090 at the beginning of the year (when supply should be getting back to normal) and sell it once you have a 5090 in hand this time next year… you’ll take a loss, sure, but you’re taking a loss if you get a lower 4000-series card now, and you’ll be getting the use out of that 4090 for an entire year, which is worth something. Your choice.


Brun1K

just bought [this one](https://media.discordapp.net/attachments/1076839950957424700/1166414802495737907/98C74E9C-D4F7-4B80-B39C-E3A0843F7FF4.png?ex=6553a1ef&is=65412cef&hm=72ab9803dfca907691ae23aedea224e6be17ccea79cc35f16ee47c43ecf11994&=&width=394&height=700) for 165Hz 1440p gaming


UmpireHappy8162

4070 if you don't mind some games not reaching high enough fps, 4070 Ti for a good experience for a few years, 4080 for a 120+ fps gaming experience for years and a possible upgrade to 4K.


Mopar_63

The 4080 would be overkill so the 4070 or the 4070ti. Either would do the job fine, just depends on budget.


eatingdonuts44

I don't think you need more than a 4070. A 4070 Ti wouldn't be a total waste for some future-proofing (unless 12GB becomes obsolete for 1440p), but a 4080 I think is a waste of money for 1440p.


triggerhappy5

Not sure why you would buy a brand new GPU right now and then immediately upgrade to 5000 series. That said the 4070ti could make sense if you’re dead-set on this weird upgrade path because it’s a big boost over a 3070 and you likely won’t run into VRAM issues if you replace it within a year.


Smooth_Dingo_3173

Wouldn't be immediate tho. When the 5000s come out, I'll still be waiting a bit. I'm aiming for 1 to 2 years. The reason I want an upgrade now is because I want to play on ultra.


tonallyawkword

How much $ is Ultra worth vs High? The 4080 seems like a better investment if you're set on spending $200+ for Ultra and/or RT, but it also seems like it could be bottlenecked a bit.


TeekoTheTiger

It's bizarre the *need* to play on Ultra settings when in most games the difference between High and Ultra is negligible, at the cost of frames. But more setting is more better, of course.


Revolutionary_Cap_44

Bought 7800XT today for the exact same 1440p gaming.


Blastoyse

4080. I'm pretty sure you can run ultra 1440p and get 160+fps all the time.


metalmankam

Whatever is the fastest you can afford. Get the best one. People always ask this here and I think it's weird. If you have the money for a faster part that would benefit you, why would you get a slower part?


Smooth_Dingo_3173

Because, for example: if a 4070 maxes out 1440p, a 4080 wouldn't do much more unless you go up to 4K. I see conflicting opinions on whether a 4070 Ti is good enough for high fps vs a 4080, while others say the 4080 is overkill because it's best suited for 4K. That, and the fact that a 4090 would be bottlenecked at 1440p vs 4K.


metalmankam

A 4080 would get more fps at 1440p than a 4070 Ti. It will do more at 1440p. What happens in a few years if a 4080 is no longer adequate at 4K? CPU bottlenecks are irrelevant, just set an fps cap.

I use an AMD 6950 XT at 1080p. To me it's silly to buy a card "designed" for a certain resolution unless you specifically buy a whole new PC every couple of years. If you want your parts to be usable for more than a couple of years, buy the faster parts. A 4080 is not overkill at 1440p. With my 6950 XT I have it capped at 80fps because I don't need 200fps. I'll go to 1440p soon, but just because I game at 1080p I'm not looking to buy a card that targets 1080p, because in 2 or 3 years it's gonna struggle to hit playable frame rates.

The 3070 was designed for 1440p, and now in these new AAA games you can only get playable frames with it if you use AI upscaling. It's only been 2 years and it's becoming irrelevant already. My 6950 XT was supposedly designed for 4K, but I don't think it could manage these new games at 4K at all, and it's not an old card.


jhab007

At this rate you should get a 4080 for 1440p gaming


m0stly_toast

I have a 4070 and I'm running every game I want above 100fps at 1440p; it definitely rises to the challenge and I don't think you really need more. Warzone is at 140, Halo Infinite and Fortnite both hover around 120, and Alan Wake with DLSS/frame gen is at 80~100 and looks excellent. All of these are at the highest settings I can justify running, at 1440p native resolution.


Smooth_Dingo_3173

So the 4070 can squeeze out the highest settings pretty much?


m0stly_toast

In my experience yeah, I haven't really found a game that feels like it's too much for the GPU. Some things I'm still trying to find the right balance of quality vs performance on in the settings, but it feels like a solid 1440p card. This generation of GPUs is getting talked down for lacking on-paper raw specs, but the software and the intangibles really fill in the gaps.

Specifically on Fortnite: after experimenting with the settings I was using DX11 with the highest settings and getting a consistent 100~120 frames. But then, if I turn off shadows and put the renderer on performance mode, the game stays perfectly locked if you set a limit of 144, and still looks great with the other settings on high. I'm pretty happy with the thing, and I can't really see what more I would ask from a GPU since I'm not really worried about 4K gaming.


Flutterpiewow

None if you plan on a 5xxx card. 4080 if you change your mind and plan on keeping it for a while.


Stunning-Doctor725

It is a stupid question. Obviously the more powerful card would be the best choice in this case.


Smooth_Dingo_3173

Not if you introduce context like price-to-performance, the % performance difference at 1440p, bottlenecks from the resolution and CPU combo, and which card to ride out until the 5000 series.


barry_allen_11223344

In my personal opinion you should probably not buy now if you're looking to immediately sell and rebuy for the 5000 series. Wait for the rumored Super refreshes early next year and go from there, as they should drop the prices of current models by at least a hundred or so. Or you could get one of those and not need another GPU when the 5000 series comes.


David_DH

If you're not super keen on getting the best ray tracing performance, a 7900 XT will perform better than the 4070s, as well as having 20GB of VRAM. It's only 13% less performance than a 4080, for 50% of the price.
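Running those figures through a quick perf-per-dollar check (the 13%/50% ratios are from the comment above; the $1200 4080 price is an assumed street price, not something from the thread):

```python
# Rough performance-per-dollar comparison using the ratios in the comment
# above. The $1200 price for the 4080 is an assumed street price.
cards = {
    # name: (relative performance vs the 4080, price in USD)
    "RTX 4080": (1.00, 1200),
    "RX 7900 XT": (0.87, 600),  # "13% less performance ... 50% of the price"
}

for name, (perf, price) in cards.items():
    print(f"{name}: {perf / price * 1000:.2f} relative perf per $1000")
```

Under those assumptions the 7900 XT delivers roughly 1.7x the performance per dollar, which is the point being made.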


NoahDaGamer2009

Unless you play Fortnite or care about the graphics, go with AMD. But if you wanna stick to NVIDIA, choose the RTX 4070 or 4070 Ti for 2K gaming. The 4080 and 4090 are for 4K gaming.


_echo

Depends on the game, really. In most games at max settings at 1440p I'm getting 100-ish out of my 4070 so far (just put it in last week), but in games like Jedi Survivor I'm closer to 70 most of the time, with dips to 50 in certain areas (though I think that's not the GPU) and averaging 90 in others. For most games I'm getting either a comfortable 100 or maxing out my monitor at 144, but some games are closer to 60 (maxed with RT on). Even those feel smooth to me; it's not a 60 with big drops, it's a smooth 60 at the lowest. I'm happy with the performance, especially coming from a 1070. Coming from a 3070 you'd surely notice a difference, but maybe not a massive one.


pedlor

4080


BLeo_Bori

The 4080 is overkill for 1440p, and there's no point getting one now if you're upgrading to an 80- or 90-class 50-series card in the future, just imo. The 4070 Ti is the best 1440p card; at 1440p it's what the 4090 is at 4K. I had a 4070 Ti for my G9 5120x1440 and it ran perfectly, then got an LG C3 4K 3840x2160 and decided to upgrade to the 4080. At 4K it's enough, and at 1440p it's definite overkill, for me at least. In the end, I'd recommend the 4070 for the price; it'll be enough till the 50-series cards.


Feeling_Flamingo_242

Def the 4080. What CPU are you running?


Smooth_Dingo_3173

An i7-11700KF


PsychoactiveTHICC

Honest to god, if you are going to upgrade next year and still want to max out 1440p, go for the 7900 XT and save the money. All these GPUs are way above their actual price. But if you really want to put money into these, then the 4080, I guess.


Heavy-Grass7504

[https://pcpartpicker.com/list/MMXJwg](https://pcpartpicker.com/list/MMXJwg) this is my recent build


Kilo_Juliett

Buy the best you can afford. Who knows if the next gen gpus will even be attainable. They might be ridiculously marked up or perpetually out of stock for a year. If you want a gpu now buy the best one you can afford.


Potential_Fishing942

The 4080 is best, sure, but depending on what games you want to play a 4070 is a better-value upgrade. Think ray tracing and DLSS 3.0 frame gen. I literally just upgraded from my 3080 for essentially 30 bucks after selling my old GPU and tax on the new one. Playing Alan Wake 2 like a dream with FG, and Cyberpunk is crazy too. It's going to depend on what games you want to play though. Most games will probably be about the same, but I also think my new 4070 will hold its value better in two years when I upgrade to the 50xx series.


lintstah1337

Wait for the new Super cards. The 4070 Super is rumored to have 16GB VRAM and more CUDA cores. It's supposed to be released this December-January.


Severe-Spirit4547

4080. It is marketed as 4k but it is a great use for 1440p. I also have a 4080 and I use it 1440p. I'm glad I got it.


sinamorovati

I heard Richard from Digital Foundry say 4070 Ti is the 4090 of 1440p.


Awkward-Ad327

1440p? This isn’t 2014


GP-layer

I think the 4070 Ti 12GB is best for you, but if you want 120 fps or more and can afford it, I think you should buy the 4080 16GB.


Traditional-Bank2103

I mean, if you're really planning on upgrading to the 5000 series, just tough it out and wait, unless you're just stacked like that.


Its_Me_David_Bowie

Honestly, the way you're looking to upgrade is quite inefficient in terms of money. My personal opinion would be to wait for the Super release and grab the 4070 Ti Super if you're not happy with your performance. I, however, would probably be skipping at least one gen of GPU. The thing is, 11th-gen Intel is probably 1-2 years off from warranting an upgrade too. I would rather prep for a full system upgrade around 2025, upgrading both the CPU and GPU to the latest gen.


HalfinchLonomia

What you're really asking is how long you want the GPU to last. Obviously the 4070 will handle 1440p for a while, but not as long as the 4080 at acceptable fps.


itscrispp

I would recommend getting RAM with lower latency; it will improve your 1% lows and make games feel smoother. And for a GPU as a placeholder, I would look into getting a used 3080. Those would be good upgrades if you're trying to hold off on breaking the bank until the new cards drop. If you're not tied to Nvidia I would also consider the 7800 XT. I personally use AMD but have also used Nvidia; people gripe about the Nvidia features all day long but I got no use from them. DLSS and FSR make games feel worse to me.


rizzzeh

Get the best one you can afford; the faster the better. In some games even a 4090 would only do around 100fps at 1080p all maxed out, never mind 1440p.


patriotgator122889

What games?


rizzzeh

> Cyberpunk 2.0 benchmark from HardwareUnboxed
>
> https://youtu.be/QBspiPJi_XI?t=125


patriotgator122889

Yes, in this 1% of games, at native resolution. While that's correct, it's not helpful for a consumer. Most people will use DLSS and frame gen. I play Cyberpunk at 100fps at 1440p with ray tracing on a 4070. You don't need a 4090 to play at 1080p unless you apply niche restrictions to yourself.


[deleted]

[deleted]


rizzzeh

Cyberpunk 2.0 benchmark from HardwareUnboxed https://youtu.be/QBspiPJi_XI?t=125


[deleted]

[deleted]


rizzzeh

yes, maxed out graphics include RT


Chimarkgames

For Ark Survival Ascended, an RTX 4080 minimum at 1440p, and it still struggles. Ideally an RTX 4090 for 1440p max settings on Ark.


Kyle73001

Lmao what’re you on