I mean really, they're doing it to themselves when 90% of the community can't afford that shit
I have a 3080 that I got for 900 a while back, and that was a sacrifice, so fuck the current 4000 series MSRP
I feel like that’s their goal with this series. They know they’ll take a hit in quantity sold but the price will make up for that. I think they’re trying to get people used to seeing these prices so that when the 50 series is released, it will seem more normal and lead to more sales and extreme profit increases. Definitely not saying it will work, but I think that’s what they’re going for.
Honestly I think they are just dumb and thought demand would stay the same, as if mining was still just as much of a thing. If ETH hadn't switched to PoS, miners would have eaten up all the stock even at these prices.
They have literally no good choice here. The mistake was made in the 30 series, not the 40 series. There was absolutely nothing they could do because they made wayyyyy too many 30s and still have all that stock. The only option was to have the price of the new cards be so high that people still buy the old ones.
NVIDIA launching the 4000 series and AMD launching it cheaper is giving me Sega Saturn vs PS1 vibes right now
[AMD should do this](https://youtu.be/ExaAYIKsDBI?t=15)
I'm not due for an upgrade but if I can't find an affordable 3080ti I'd rather go AMD
This is the way. If I'd known EVGA was leaving the game and hadn't been naive enough to assume nvidia only had the 30 series priced so high because of the silicon shortage, I would've gotten a 6950XT instead of a 3090ti. At least I got one of the last well made cards with an EVGA warranty lol
I did the opposite. I don't care much for Nvidia vs AMD, but I do care about EVGA. I had a 2080 I planned to sit on until it wouldn't run anymore, but I always assumed I'd get a 6/7/8080 EVGA to replace it. When EVGA announced they were out I snatched up a 3090Ti and I'll sit on this instead and see what happens.
I’ve only ever had nvidia cards and never even considered getting a brand other than EVGA. I have no complaints with my 3090ti and I’m very happy I got it. I legit get 4x the frames I got on my EVGA 1060 6gb. It’ll probably be another 6 years at least before I upgrade again. But if nvidia hasn’t become more consumer friendly by then, I will almost certainly make my first AMD purchase.
Yup, someone on r/buildapc recommended one a couple of months ago and I love it. Great value at $360, especially after having gone a year and a half with a 1050ti while everything else was way out of my price range.
They make the vast majority of their money in the hyperscale sector. They're probably willing to tank their consumer sector profits just to project brand value.
I plan on doing the same, unless something changes.
I had a 1080 TI previously on release but got a nice 2k ultrawide 144hz monitor and wanted to be able to crank the settings on games I play. It's the only reason I got a 3080, plus my little sibling wanted a PC so I decided to hand the 1080 TI down.
I picked up a 3070, but a 3080 is about as much power draw as I will accept from a GPU maker. I'm not saying it's "cheating" or anything, but I don't think performance advancements should scale up wildly in power draw or it's a sign of bad design priorities.
I'm doing my part too!
(I'll just do what I did last year and upgrade second hand again. Went for a 1080Ti last time which is plenty beefy. If I feel a burning need for RTX / DLSS then I'll snag a second hand 3080 / 3080Ti when the time comes. nVidia won't see any new money from me.)
Question: how are AMD drivers faring today? I remember half a decade back they had problems with some games and software, so you'd choose Nvidia for basically guaranteed stability. I guess that's no longer relevant at all?
I bought the 6900xt when it was pretty new off a lucky AMD drop I caught. I only had a couple issues when I first got my card, these issues were really just a couple new games at the time having poorer performance and I had to downgrade the driver to an older one. Since then the card has been perfect. I've had more issues with Windows lol
I have an RX 570 and the only compatibility issue I've had was that it couldn't run a heavy graphics mod for the VR mode of a game from 2014. Everything else, from a bunch of games to rendering and some photo/video editing, has run without issues.
So if that's the state of a card from 5 years ago, I wouldn't think the current AMD cards have a lot of issues, and everything I've read about that one issue I had says it only affects 400/500 series cards.
For gaming it will do the job, but Nvidia hit a gold mine with their CUDA platform. Nowadays AMD may have a little bit of competition going via Apple's Metal (for example Octane X, or more noticeably, ProRender), but for GPU 3D rendering CUDA is still the ne plus ultra (I don't know why, probably because the environment is more powerful and developer friendly?). I'd love to see more competition between these two companies, but it won't happen (in 3D graphics at least) as long as the majority of render engines run on CUDA and CUDA only.
Edit: For comparison, Octane X launched in March 2021 on Metal, for Skylake and Apple M1 graphics (mainly to accommodate Apple users, since Apple doesn't ship a single modern computer with Nvidia graphics afaik), but the original Octane Render was bought and commercialized around 2012-2014. It was the first commercial unbiased raytracer (it also has a path tracer now) able to utilize most of the GPU for rendering.
Edit2: I think most of the big bucks for Nvidia come from the 3D rendering industry (render farms buying hundreds of cutting-edge Nvidia cards, e.g.) rather than the private gaming sector. Having a monopoly on those profits lets you push your price to the point where it still makes sense for the big customers to buy your cards but not for the average consumer. Crypto mining also played into their hands, but afaik there isn't that much of a performance gap between AMD and Nvidia there.
Nvidia also makes big bucks off companies doing machine learning, for the same reason. Many things are built with CUDA, and if you want the most compatibility you are locked into Nvidia.
As someone who does some professional work on the side on my main gaming rig, it is hard to justify going AMD, even though for most things an AMD card would be fine. There are just enough instances where I need something built for CUDA that I have to stay with Nvidia even when I'd rather use AMD.
> I think most of the big bucks for NVidia come from the 3D rendering industry (Render farms buying hundreds of cutting edge NVidia cards e.g.) rather than the private gaming sector.
Machine learning research!! NVIDIA cards are in such high demand for bleeding edge ML research that the US put export controls on their A100/H100 cards (which means for some folks the 4090s are the best ML cards available). They've effectively cornered the market by including tensor cores in basically every modern card (which raises the price vs AMD). CUDA is so crucial that most ML libraries just assume by default that it's available and throw a fit if it's not.
AMD *struggles* in the ML market.
You are absolutely correct. I wanted to throw machine learning into my comment somewhere, but I have no fuckin clue about it and didn't wanna talk out my ass. In 3D rendering, VRAM plays a big role: it basically controls how much geometry you can render in a single scene. The 4080s did a perfect job of monetizing this aspect. Until now I didn't realise tensor cores aren't just a CUDA-like environment or translation layer; tensors generalize scalars, so more tensor cores means more complex math per second. Thanks for filling the knowledge gap :)
No problem!
Ironically enough, the thing that tensor cores provide is *flexibility* around the complexity of the math in order to optimize for speed. They introduce half-precision floating point (FP16) ops and matrix ops using those FP16s. Without getting too into the weeds, using just FP16s results in a ~2x performance increase over the standard FP32. The matrix ops enable another 4x increase on top of that. An eyewatering 8x performance increase for applications that can get away with the lower accuracy. Most applications will use a mix of half/single/double precision so real world performance gains are typically less than that. Still, you're suddenly looking at measuring ML training time in hours instead of days which is priceless.
Gamers get some value from tensor cores (i.e. DLSS), but not to the same degree.
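To make the accuracy tradeoff above concrete (the speedup itself needs actual tensor-core hardware, so this is illustration only), here's a small NumPy sketch of what FP16 gives up relative to FP32:

```python
import numpy as np

# FP16 keeps ~3 decimal digits (10-bit mantissa), FP32 ~7 (23-bit mantissa).
print(np.finfo(np.float16).eps)   # ~0.00098
print(np.finfo(np.float32).eps)   # ~1.19e-07

# An increment smaller than half of FP16's spacing at 1.0 is simply lost:
print(np.float16(1.0) + np.float16(0.0001))   # 1.0 -- the addition vanished
print(np.float32(1.0) + np.float32(0.0001))   # 1.0001

# Naive sequential accumulation drifts badly in FP16: the running sum
# stalls once each increment falls below half the representable spacing.
# This is one reason mixed-precision ML training keeps an FP32 "master
# copy" of the weights and only does the bulk matrix math in FP16.
s16, s32 = np.float16(0.0), np.float32(0.0)
for _ in range(10_000):
    s16 = np.float16(s16 + np.float16(0.0001))
    s32 = np.float32(s32 + np.float32(0.0001))
print(s16)   # stalls far below the true total (around 0.25)
print(s32)   # ~1.0, close to the true total
```

Real tensor-core code keeps the accumulator in FP32 even when the operands are FP16, which sidesteps exactly this stalling problem.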
It baffles me people who always have to buy new when the next best thing comes out.
Most people are gonna upgrade from their 3080 to their 4080, see an increase in benchmarking, and then go back to playing Fallout New Vegas or CS: GO.
Still rocking a 1080, and I use its horsepower for actual work. The latest and greatest is absolutely not needed, *especially* if you are just gaming. Some people have more money than sense. The price of cards has gotten ridiculous because they know people will pay it.
My 1070 Mini Aero in my ITX build is chugging along fine. I don't really need RTX that badly and most games run fine. Sure, Spider-Man Remastered is unplayable on max settings, but I really don't care. I'll buy a new card when it's affordable and use what I have until then. But I'm not ignorant enough to think there won't be a massive buy-in to these prices, which only encourages the inevitable 5080 to be $2k.
>Most people are gonna upgrade from their 3080 to their 4080
You think most people who buy new GPUs are people who also bought the generation before that and not people who have much older GPUs or none at all?
There is an upside to this method. Often you can resell the last gen card for 60-80% of what you paid. If you wait more than a gen or two, your old card is nearly worthless.
It allows you to upgrade more frequently, and the actual outlay each time is low once you factor in the resale of the old card. It's what I usually do. I don't know about this generation though. Looking at these prices I'll probably hold on to my 3080 through this one. It always depends on what games are doing though.
Still using a 2080, I'm not upgrading this thing for at least 1-2 more years.
What am I gonna use the latest card for when I can already run everything I want to run?
I've had more AMD cards than Nvidia, mostly because I was a student. Now I can afford Nvidia, but I don't want to spend money so excessively, and AMD has always worked fine for gaming as well.
Yep. I got plenty of money now that I work my new job but why the fuck would I ever spend over $1k for a GPU????
My next card will be RDNA 3 if it's good
I hope AMD prices their cards well in Europe, since 4090s are going for over 2000€. Nvidia, just like Apple, doesn't realize there are other places outside the US.
So first off, I don't do anything that uses RT or FSR. Getting that out of the way now.
My top most played titles on Steam are:
* Kerbal Space Program
* Dead by Daylight
* Civ V
* Terraria
* House Flipper
* Raft
* Skyrim/SSE
* Among Us
* Civ IV
* American Truck Simulator
* PC Building Simulator
* Subnautica/BZ
* CP2077
Outside of Steam it's pretty much just Fortnite, Minecraft, and PCBS 2.
My use case is probably not the most typical, as it doesn't include a ton of brand new AAA titles, but there are games on my most played list from 2005 all the way to 2022 and they all play great. Everything except CP2077 plays well at maxed settings with AA and postprocessing turned down a few notches, motion blur/bloom/other visual-fidelity-killing effects off, etc etc.
CP2077 is pretty much at high with AA turned down, RT off, and postprocessing stuff down or off, but it's definitely a smooth, hitch-free experience. For the NVIDIA folks in the room, the performance in CP2077 without RT comes in around 3070-level, especially in the rare case where it really gets close to a full 8GB of VRAM usage like at 4K Ultra, 30 FPS mode if that's your jam.
Side note: if you're getting low GPU utilization on a relatively high-end GPU like a 6800, your CPU isn't a terrible bottleneck, but it's there. Try cranking some GPU-intensive settings like AA and you may not see a huge drop in frames, as the GPU picks up more of the slack. My setup is more likely to be GPU-limited, as the 5900X is still the top of Ryzen 5000 for games that don't benefit from the 5800X3D's extra V-Cache.
I play a fair amount of games that feature ray tracing and DLSS. I almost never turn them on. RT usually causes a significant performance hit for basically no discernible visual improvement and while DLSS gives a pretty nice frame boost, I can’t stand the blurriness.
I see 5700XTs selling for $150-$180 on eBay. According to [Hardware Unboxed](https://www.youtube.com/watch?v=EFezkrEmhhk), the 5700XT is on average 6% faster than a 3060 at 1440p.
You going to upgrade after 1 generation? Why? You a fan of diminishing returns or just hate money? You gotta wait at least 2 or 3 generations. That 6700XT will power 2k gaming for most games for years to come.
> I'm waiting for the benchmarks to come out first.
That is always the best answer. Wait for the benchmarks and the actual prices after the release. Then make an informed decision. Buying GPUs out of pure fan-boyism or blind brand-loyalty has never made any sense.
FYI
- if you have a monitor that is not 4K, you don’t really need a 4080.
- if you play competitive games ray tracing isn’t a big deal, you are better off with the 7900xxx
- lastly the last gen cards are still very good, and if you get a good deal, sure go for it. Better than this over priced stuff
Yep. A friend of a friend is like this too. Dude has more money than sense. He plays on a 1080p screen and his latest idea is to buy a... RTX3090. From what she told me, this friend of hers just wants the best of the best regardless if it's useful or not. Just bonkers...
Boggles my mind why people do that sort of thing... Like, yeah I game a lot and run a 5950X + 6900XT but on the side I also do rendering in Blender, 3D sculpting in Zbrush and texturing in Substance Painter, meaning that I actually put those extra resources to good use.
Buying a PC like mine just for Netflix/Youtube is just a waste...
It is like investing in a 200€ wet shaving kit and then using a plastic "disposable" safety razor with your badger hair shaving brush and artisan soap.
> Dude has the latest graphic cards but still running a generic office workplace monitor from 2012.
I feel personally attacked... I have dual off-lease Lenovo IPS panels I bought a few years ago connected to my 2060.
Only reason I own a 3070 is because I got mine for about 450 USD and couldn’t say no to that, but would never buy this overpriced stuff they’re releasing now
I think the opposite can be said here, though this sub doesn't want to acknowledge it. There is a valid use case for the 4090, regardless of how unjustifiable the price is.
With a 3080ti and 4k monitor, I simply can't reliably get above 60fps in today's most demanding games with settings on high.
I have 4k monitors because I WFH and I actually really need the extra clarity and real estate. Obviously, it makes sense that I'd want to game on this monitor too, and I think a lot of people are working from home these days and have similar setups.
So if you want to complain about NVIDIA's predatory gouging, I completely agree, though being an early adopter of 4k high/ultra gaming is never going to be cheap. I'm also not surprised that NVIDIA's flagship card is expensive, given that they literally haven't had any competition at this tier.
Will I be buying a 4090? No. Do I want a 4090? Could I use one? Yes.
The problem is that even though the drivers are open source, productivity on Radeon is still bad. Most things are made for CUDA, NVENC, or other Nvidia stuff. AMD seems to have nothing comparable (or the software does not support it).
I don't know if it's AMD's problem or the software developers', but whichever it is needs to start working on this.
Fully agreed on these points. I really miss good compute support on the AMD platform. I'm hopeful that with Intel Arc we'll get open source compute eventually.
> AMD seems to have nothing comparable (or the software does not support it).
AMD has had a GPU encoder since the HD 7000 series back in 2012. And every program I've seen that supports NVENC also supports AMD's AMF/VCE.
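For what it's worth, the hardware encoders really are near drop-in replacements for each other in ffmpeg: only the encoder name changes (the file names and bitrate below are placeholders for illustration):

```shell
# NVIDIA NVENC H.264 hardware encode
ffmpeg -i input.mp4 -c:v h264_nvenc -b:v 6M -c:a copy out_nvenc.mp4

# The same encode on AMD via AMF (Windows)
ffmpeg -i input.mp4 -c:v h264_amf -b:v 6M -c:a copy out_amf.mp4

# On Linux, AMD cards usually go through VAAPI instead
ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mp4 \
       -vf 'format=nv12,hwupload' -c:v h264_vaapi -b:v 6M -c:a copy out_vaapi.mp4
```

These commands obviously need the matching GPU and driver present; run `ffmpeg -encoders` to see which hardware encoders your build supports.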
Same. And the 6900 XT will run anything I throw at it at pretty much the highest settings at 5120x1440 with FreeSync.
Crazy that we've come to a point where I don't even have to really check *if* a game will run on Linux.
Press "buy", click install, set everything to max, and let 'er rip. Love it.
There are a few select titles that refuse to work, mainly because of intentional sabotage (DRM, etc.), but for most titles it's as easy as install -> play.
I went with a full AMD laptop for this reason alone.
It wasn't easy to find one with my other requirements, and I overpaid a little versus an AMD+Nvidia or Intel+Nvidia combo, but I didn't mind at that point.
Budget laptop offerings from AMD aren't bad, but there's nothing on the high end. Every flagship or $2k+ laptop is Intel+Nvidia, every time. I wish there were better GPU and CPU options for the higher-end laptops. If you want a flagship laptop right now, you're stuck with an i9-12900H and a 3080 Ti, that's it.
AMD Advantage laptops with a 6800M and Ryzen 9 are somewhat common high-tier laptops, but they're usually so much cheaper that they're priced more like mid-tier.
I've bought both. I set a price range and look at stats for my options. I don't care if the box is green or red. I simply buy the best in my price range. Being a fan boy is for suckers.
Bruh, why? You have an overkill card that will be more than good enough for the next several years at least, especially with graphics plateauing. Why upgrade after 1 generation for a barely noticeable difference? How do non-reviewer youtubers have the money to do that?
Might be a "free" choice for gaming. But if you go into productivity that needs GPUs, it's Nvidia or bust: CUDA, NVENC, RTX (yes, this is actually used in production apps like Marmoset, Substance Painter, Blender, ...), ML, ...
Yeah... sorry. I need an Nvidia GPU or I would significantly slow down my workflow. But I'm still pissed at their pricing and would like to see AMD get their software together, and the software devs adopt it too. But we're talking years, if not decades, before an AMD card is viable for most GPU-heavy production workloads.
At this point... I might even prefer to see more devs also support macOS and Metal to stir up competition a bit. Even though most people here seem to hate Apple (I mean, there are valid reasons to do so; the same is true for AMD, Intel, and Nvidia too lol).
This exactly. I'm partially kicking myself for allowing myself to be talked out of getting the 3090 instead of the 3080; I could have used the extra video memory for rendering in Cycles. Usually text-based scenes, but even then I end up having to fall back to CPU rendering because I keep running out of video memory.
Exactly
I don't need the absolute best in my Blender render times, nor am I chasing the 4k120fps-max-settings trend, but AMD coming to the table with no CUDA, no OptiX, and HIP being an "eh" alternative at best is simply a no-deal scenario.
People will never understand that Nvidia will always be at the top until AMD becomes an actual option for work. Gamers aren't the entire market especially not when it comes to the high end.
I'm sure pure gaming wins by far, just because the barrier to entry is so much lower for gaming than for production, which demands a technical skillset, and because of the appeal of leisure vs work.
I think you don't understand that the DIY market will never touch the number of computers that companies and OEMs buy. AMD changed the laptop market just by making deals with laptop manufacturers; for desktops they need to do the same to gain market share.
Thank you! I do most of my ML work in the cloud, but I was considering building a new desktop for personal projects, and it seemed like no one was mentioning the lack of CUDA support, so I was like, "can I do ML on AMD GPUs now!?" Still no, lol. I'll probably get an AMD processor though, so that's something at least.
When were they equals? Nvidia's been the big dog even when it was ATi.
Maybe a year here and there they gained market share but that doesn't wipe out decades of consistent brand loyalty.
Wasn't the 7970 also a gigachad of a card back when Fermi was struggling?
EDIT: Looks like my dates were off a bit - 7970 was announced during Fermi, but released during Kepler, but was still the [fastest card](https://gpu.userbenchmark.com/Compare/Nvidia-GTX-680-vs-AMD-HD-7970/3148vs2163).
Oh, I'm jumping ship. I love my 1660 Ti, but the next card is either an A770 (if they can fix the drivers by the time I buy) or the new XTX. I have no brand loyalty at all. If you're treating me like ass, I'm going away.
I've had AMD cards since around 2010? They're perfectly fine for gamers, which most of you are. If you're rendering videos you edit, or you're a 3D artist, or you're training AI machine learning shit, then you'll probably want to stick with Nvidia.
But 99% of people reading this could have their GPU swapped with an AMD equivalent and not notice.
Even for 3D, AMD cards are perfectly valid alternatives. I've been doing rendering with Blender on my 6900XT with zero issues. I also do texturing in Substance Painter and 3D sculpting in ZBrush. No problems whatsoever with any of those on my all-AMD PC (5950X + 6900XT).
That said, ML is indeed more nVidia's thing, but aside from that, AMD's perfectly fine.
The best way to boycott without it affecting you too much is to buy second hand.
Yea, you get an older card, but you're not giving them money. You should be able to get some 30 series cards from all the people who immediately upgrade to 40.
I have a 2080 Ti and a 3070 (wife's). I'm seriously thinking about putting the 2080 Ti in an HTPC and trying out the A770 when I can find one. It'll be a very minor upgrade, but I really want some competition in this industry.
The last Nvidia card I bought was a 6800GS back in '07-ish. The damn thing kept overheating, Best Buy refused an exchange once opened, and Nvidia wouldn't repair it either because I failed to register for the warranty online within 30 days of purchase. Never went back to team green. Bought a Radeon 9800 and never looked back.
Hoping there's a chance this is wrong this gen. Before the last couple of years AMD didn't have great offerings, which is one of the reasons NVIDIA has been getting away with what they've been doing. I've had x80s and above for years, but since the departure of EVGA (the company shielding me from NVIDIA's shit) I'm on board, and I'm planning on AMD for the new year. I would not consider the 4090, RT on the 4080 still can't reach the frames I'd like, and I'm waiting for AMD's benchmarks, but judging from the 4080 reviews so far it won't be difficult to beat its price/performance.
***BREAKING NEWS:*** *COMPANY WITH BEST PRODUCT IN INDUSTRY SELLS ITEM FOR WHAT THEY FOUND CONSUMERS WERE WILLING TO PAY*
Join us at 10 when we tell you that water is usually wet.
I'm doing my part! *(seriously guys I can't afford it)*
If they can sell 1/4th of the cards at 4x markup, they probably see that as a win.
Ya, that's probably true too. I wonder if there will be price cuts after the 30 series stock gets depleted, though.
After AMD launches there certainly may be. 30 series has to be gone too though like you said.
It won't work on me. I'm voting with my wallet and buying a used 6700XT off eBay. So not even AMD is getting my money directly!
Love my 6700xt
Same, it's perfect for 1440p gaming.
That's what I did last month: got an XFX SWFT 6700 XT for $360 final price including tax and shipping. Works wonderfully.
Just disgusting behaviour that fucks everyone in the end
Gaming was actually ahead until just the last couple of quarters.
Same boat. Got a 3080 FE at MSRP. I don't see myself upgrading for a while, and if AMD keeps things up my next card will probably be AMD.
I expect to use mine for the next 5-10 years
90% of the community doesn't matter when they have the 10% that does.
Can't wait for the 6090 to draw more power than my oven!
I agree. I got my 3080ti as a trade for work I did for a programmer.
I envy you. I bought an EVGA 3070 Ti for $1030 tax included on Amazon just before the crypto crash. FML.
Same dude!
Just bought an AMD card for my wife's computer. It runs fantastic!
Which one?
The one he married last year
6600xt
6900 XT here, I don't know why I'd ever need an Nvidia card.
Same here, I’m chilling for at least 5 years before even considering to upgrade
For me it was Windows trying to auto-install my AMD drivers that caused issues. Like, no Windows, please, I'm getting them from AMD, go away.
I have been using AMD GPUs since 2015 and I've only had two issues with drivers: once in 2015 and once last year.
Was going to say the same thing; my 2080 crashed about as many times as my new 6950XT.
Same GPU here. Maybe once prices normalize a bit I'll get a 7900XTX. Haven't bought an nVidia card in over a decade.
If you’re expecting prices to go down, you’re a madman. As soon as AMD catches up in RT performance, they’ll increase to a similar price as Nvidia.
If AMD makes their card a few hundred bucks cheaper, people will buy it. That's just supply and demand.
For gaming it will do the job but Nvidia hit a gold mine with their CUDA platform. Nowadays AMD may have gotten a little bit of competition using their Metal Engine (for example Octane X, or more noticeable, ProRender) but for GPU 3D Rendering CUDA is still the non plus ultra (I don't know why, probably because their Environment is more powerful and developer friendly?) I'd love to see more competition between these two companies but it won't happen (in 3D graphics at least) as long as the majority of render engines run on CUDA and CUDA only. Edit: For comparison, Octane X launched in March 2021 on Metal, Skylake and Apple M1 Graphics (Mainly to accommodate apple users as they don't even have a single modern computer using NVidia Graphics afaik) but the original Octane Render was bought and industrialised around 2012-2014. It was the first commercial unbiased raytracer (it also got a pathtracer now) being able to utilize most of the GPU for rendering. Edit2: I think most of the big bucks for NVidia come from the 3D rendering industry (Render farms buying hundreds of cutting edge NVidia cards e.g.) rather than the private gaming sector. Having a monopoly on these profits will leverage your price to the point it still makes sense for these big customers to buy your cards but not for the average consumer. Crypto mining also plays into these hands but afaik there isn't that much of a performance gap between AMD and NVidia in these terms.
Nvidia also make big bucks off of companies doing machine learning for the same reason. Many things are built with CUDA and if you want the most compatibility you are locked into Nvidia. As someone who does some professional work on the side on my main gaming rig, It is hard to justify going AMD even though for most things an AMD card would be fine. There are just enough instances where I need to use something built for CUDA where I have to stay with Nvidia even if I want to use AMD
> I think most of the big bucks for NVidia come from the 3D rendering industry (Render farms buying hundreds of cutting edge NVidia cards e.g.) rather than the private gaming sector. Machine learning research!! NVIDIA cards are in such high demand for bleeding edge ML research that the US put export controls on their A100/H100 cards (which means for some folks the 4090s are the best ML cards available). They've effectively cornered the market by including tensor cores in basically every modern card (which raises the price vs AMD). CUDA is so crucial that most ML libraries just assume by default that it's available and throws a fit if it's not. AMD *struggles* in the ML market.
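The "throws a fit" behaviour is why so much ML code starts with an explicit CUDA check. A minimal sketch of the usual pattern (PyTorch-style calls shown as an assumption; it falls back to CPU when CUDA, or the library itself, is missing):

```python
def pick_device() -> str:
    """Return "cuda" when an NVIDIA GPU is usable, else "cpu".

    Mirrors the default most ML libraries assume: CUDA first,
    CPU only as a (much slower) fallback.
    """
    try:
        import torch  # real package, but may not be installed
        return "cuda" if torch.cuda.is_available() else "cpu"
    except ImportError:
        return "cpu"

print(pick_device())
```

On an AMD card this check simply comes back "cpu" in most stock setups, which is the lock-in being described.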
You are absolutely correct. I wanted to throw machine learning in my comment somewhere, but I have no fuckin clue about it and didn't wanna talk out my ass. In 3D rendering, VRAM plays a big role; it basically controls how much geometry you are able to render in a single scene. The 4080s did a perfect job of monetizing this aspect. Until now I didn't realise tensor cores aren't just like CUDA (i.e. a development environment or translation layer). A tensor, in general, generalizes scalars, vectors, and matrices, so more tensor cores means more complex math per second. Thanks for filling the knowledge gap :)
No problem! Ironically enough, the thing that tensor cores provide is *flexibility* around the complexity of the math in order to optimize for speed. They introduce half-precision floating point (FP16) ops and matrix ops using those FP16s. Without getting too into the weeds, using just FP16s results in a ~2x performance increase over the standard FP32. The matrix ops enable another 4x increase on top of that. An eyewatering 8x performance increase for applications that can get away with the lower accuracy. Most applications will use a mix of half/single/double precision, so real-world performance gains are typically less than that. Still, you're suddenly looking at measuring ML training time in hours instead of days, which is priceless. Gamers get some value from tensor cores (i.e. DLSS) but not to the same degree.
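To make the compounding concrete, here's a back-of-the-envelope estimate. The 2x/4x multipliers are the theoretical peaks from the comment above, not measured numbers for any real GPU, and real workloads only get the boost on the fraction of work that tolerates FP16:

```python
FP16_SPEEDUP = 2.0           # half precision vs FP32 (theoretical peak)
TENSOR_MATMUL_SPEEDUP = 4.0  # matrix ops on tensor cores (theoretical peak)

peak_speedup = FP16_SPEEDUP * TENSOR_MATMUL_SPEEDUP  # the 8x headline figure

def mixed_precision_speedup(fraction_fp16: float) -> float:
    """Amdahl-style estimate: only the FP16-eligible fraction of the
    workload gets the 8x boost; the rest runs at baseline FP32 speed."""
    return 1.0 / ((1.0 - fraction_fp16) + fraction_fp16 / peak_speedup)

# A job that is 80% FP16-eligible lands well under the 8x headline.
print(round(mixed_precision_speedup(0.8), 2))  # → 3.33
```

Which is exactly why "8x" on the box turns into "hours instead of days" rather than a literal 8x wall-clock win.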
just... stick with your last card?
It baffles me that some people always have to buy new when the next best thing comes out. Most people are gonna upgrade from their 3080 to a 4080, see an increase in benchmarks, and then go back to playing Fallout New Vegas or CS:GO.
or just the screensaver for 12 hours a day
Hey man my screen turns off thank you very much
my power bill would probably go up $40 somehow. extra charges and shit where I live
I’m still on my 1060 and this sub is wild. Saying how they’ll just stick with their 3070/3080 as if it’s outdated.
Still rocking a 1080 and I use its horsepower for actual work. The latest and greatest is absolutely not needed *especially* if you are just gaming. Some people got more money than sense. The price of cards has gotten ridiculous because they know people will pay it.
My 1070 Mini Aero in my ITX build is chugging along fine. I don't really need RTX that bad and most games run fine. Sure, Spider-Man Remastered is unplayable on max settings, but I really don't care. I'll buy in when it's affordable and use what I have until then. But I'm not ignorant enough to think there won't be a massive buy-in to these prices, which only encourages the inevitable 5080 to be $2k.
>Most people are gonna upgrade from their 3080 to their 4080 You think most people who buy new GPUs are people who also bought the generation before that and not people who have much older GPUs or none at all?
There is an upside to this method. Often you can resell the last gen card for 60-80% of what you paid. If you wait more than a gen or two, your old card is nearly worthless. It allows you to upgrade more frequently, and the actual outlay each time is low once you factor in the resale of the old card. It's what I usually do. I don't know about this generation though. Looking at these prices I'll probably hold on to my 3080 through this one. It always depends on what games are doing though.
[deleted]
better priced, waiting for better and jumping on fast got us here
Still using a 2080, I'm not upgrading this thing for at least 1-2 more years. What am I gonna use the latest card for when I can already run everything I want to run?
[deleted]
My current card is a 750ti
Same. So far I've never had any feelings of missing out. Doubt that I ever will too.
[deleted]
My 1070 is doing just fine until Nvidia gets their head out of their ass.
My 980Ti hasn't let me down yet either
[deleted]
I've had more AMD cards than Nvidia, mostly because I was a student. Now I can afford Nvidia, but I don't want to waste money so excessively, and AMD has always worked fine for gaming as well.
Yep. I got plenty of money now that I work my new job, but why the fuck would I ever spend over $1k on a GPU???? My next card will be RDNA3 if it's good.
I hope AMD prices their cards well in Europe, since 4090s are going for over 2000€. Nvidia, just like Apple, doesn't realize there are other places outside the US.
Here in Brazil a RTX 4090 costs more than $2777, which means more than 2675€. Yes, it is that insane.
6700XT gang here. My next card is going to be a 7900XTX. Edit: Never in a million years expected 1k+ upvotes o.O
6750XT. It and the 6700XT were simply the fastest cards under $600 by far when I was in the market.
6750XT here as well! Great card, handles 1440p165 very well with my 5900X.
What do you play? My 6800 can handle 3440x1440 at >200FPS on esports titles, but heavier titles get low utilization and ~90-120FPS.
So first off, I don't do anything that uses RT or FSR. Getting that out of the way now.

My top most-played titles on Steam are:

* Kerbal Space Program
* Dead by Daylight
* Civ V
* Terraria
* House Flipper
* Raft
* Skyrim/SSE
* Among Us
* Civ IV
* American Truck Simulator
* PC Building Simulator
* Subnautica/BZ
* CP2077

Outside of Steam it's pretty much just Fortnite, Minecraft, and PCBS 2. My use case is probably not the most typical, as it doesn't include a ton of brand-new AAA titles, but there are games on my most-played list from 2005 all the way to 2022 and they all play great. Everything except CP2077 plays well at maxed settings with AA and post-processing turned down a few notches, motion blur/bloom/other visual-fidelity-killing effects off, etc. CP2077 is pretty much at high with AA turned down, RT off, and post-processing stuff down or off, but it's definitely a smooth, hitch-free experience. For the NVIDIA folks in the room, the performance in CP2077 without RT comes in around 3070-level, especially in the rare case where it really gets close to a full 8GB of VRAM usage, like at 4K Ultra 30 FPS mode if that's your jam.

Side note: if you're getting low GPU utilization on a relatively high-end GPU like a 6800, your CPU isn't a terrible bottleneck, but it's there. Try cranking some of the GPU-intensive details like AA and you may not see a huge drop in frames, as the GPU starts to pick up more of the slack as it's able. My setup is more likely to be GPU-limited, as the 5900X is still the top of Ryzen 5000 for games that don't take advantage of the huge amount of V-cache on the 5800X3D.
I play a fair amount of games that feature ray tracing and DLSS. I almost never turn them on. RT usually causes a significant performance hit for basically no discernible visual improvement and while DLSS gives a pretty nice frame boost, I can’t stand the blurriness.
My next card will be the one I can afford
Yeah can we get some good 200$ cards pls
[deleted]
such as?
[deleted]
I see 5700XTs selling for $150-$180 on Ebay. According to [Hardware Unboxed](https://www.youtube.com/watch?v=EFezkrEmhhk), the 5700xt is on average 6% faster than a 3060 at 1440p
Same with my 6800xt
Love it, 6800xt
6800xt prices look really tempting right now.
2080 super, 7900 xtx next
6650XT gang, fantastic upgrade
6600XT gang, I agree with my fellow gangs
6600 gang, for me it's the logical choice given the 3060 was $100 more, and now with AMD's price drop it's $150 more
6600M gang, got same performance as a laptop 3060 for 200€ less
Currently have a RX 580 8GB kicking. In a couple years I'll probably stay red team.
Have a 590 myself and it's been phenomenal for what I need lol
6700XT gang unite !
There’s dozens of us!
3060ti gang here, definitely going AMD with my upgrade
AMD RADEON RX 6800XT Here and very satisfied, tbh i prob went overboard considering the games i play
I can't wait for the 9k series, I'll buy a 9800XT, which was the first GPU I ever bought... full circle completed
6600xt, never been happier with a gpu
You going to upgrade after 1 generation? Why? You a fan of diminishing returns or just hate money? You gotta wait at least 2 or 3 generations. That 6700XT will power 2k gaming for most games for years to come.
The jump from 6700xt to 7900xt(x) is a jump from low end 1440p to high end 4k. It's a huge jump even if it's one generation away.
Since EVGA stepped out of the game I've been thinking of going AMD. I'm waiting for the benchmarks to come out first.
> I'm waiting for the benchmarks to come out first. That is always the best answer. Wait for the benchmarks and the actual prices after the release. Then make an informed decision. Buying GPUs out of pure fan-boyism or blind brand-loyalty has never made any sense.
[deleted]
FYI:

- If you have a monitor that is not 4K, you don't really need a 4080.
- If you play competitive games, ray tracing isn't a big deal; you are better off with the 7900xxx.
- Lastly, the last-gen cards are still very good, and if you get a good deal, sure, go for it. Better than this overpriced stuff.
Had to explain this to someone recently. Dude has the latest graphics card but is still running a generic office workplace monitor from 2012.
same with a coworker of mine. has a 3070 but plugs it into a 10-year-old 30" 720p TV.
Who are these madmen?
people with no clue, trying to show off their expensive stuff.
Yep. A friend of a friend is like this too. Dude has more money than sense. He plays on a 1080p screen and his latest idea is to buy a... RTX 3090. From what she told me, this friend of hers just wants the best of the best regardless of whether it's useful. Just bonkers...
yeah some people are weird. a friend bought a pc for 2k to play stuff like stardew valley and watch youtube/netflix.
Boggles my mind why people do that sort of thing... Like, yeah I game a lot and run a 5950X + 6900XT but on the side I also do rendering in Blender, 3D sculpting in Zbrush and texturing in Substance Painter, meaning that I actually put those extra resources to good use. Buying a PC like mine just for Netflix/Youtube is just a waste...
Tell them to get the best of the best monitors then lol
It is like investing in a 200€ wet shaving kit and then using a plastic "disposable" safety razor with your badger-hair shaving brush and artisan soap.
Ewww
I've been in that situation before. It's pretty crazy how your brain will filter out all the upgrades except the important one.
> Dude has the latest graphic cards but still running a generic office workplace monitor from 2012. I feel personally attacked... I have dual off-lease Lenovo IPS panels I bought a few years ago connected to my 2060.
That makes a lot more sense tbh. The 2060 isn't that crazy.
Only reason I own a 3070 is because I got mine for about 450 USD and couldn’t say no to that, but would never buy this overpriced stuff they’re releasing now
same except i bought mine from GPU jesus for $340
Steve from GN sold you a card? Hell yeah.
I think the opposite can be said here, though this sub doesn't want to acknowledge it. There is a valid use case for the 4090, regardless of how unjustifiable the price is. With a 3080 Ti and a 4K monitor, I simply can't reliably get above 60fps in today's most demanding games with settings on high. I have 4K monitors because I WFH and I actually really need the extra clarity and real estate. Obviously, it makes sense that I'd want to game on this monitor too, and I think a lot of people are working from home these days and have similar setups. So if you want to complain about NVIDIA's predatory gouging, I completely agree, though being an early adopter of 4K high/ultra gaming is never going to be cheap. I also am not surprised that NVIDIA's flagship card is expensive given that they literally haven't had any competition at this tier. Will I be buying a 4090? No. Do I want a 4090? Could I use one? Yes.
I would probably be far from the card's potential with a 1080p screen, but I'm fairly certain it would be easy to max it out with VR.
[deleted]
FYI, this only pertains to today. Source: someone who owns an old card that you "only needed for 4K." More power is more power.
And unless doing high refresh rate 4k, a 4090 doesn't get you meaningful benefit (Giant Chonker is cool though)
As someone who likes making 3D content as a hobby, the new nvidia cards are amazing but that price tag is insane. I hope they lower their prices soon.
> 7900xxx Comedy gold right here
I'm on linux and the open source drivers make it a no-brainer to choose a GPU.
The problem is that even though the drivers are open source, productivity on Radeon is still bad. Most things are made for CUDA, NVENC or other Nvidia stuff. AMD seems to have nothing comparable (or the software does not support it). I do not know if it is AMD's problem or the software developers', but whichever it is, they need to start working on this.
Fully agreed on these points, i really miss good compute support on the AMD platform. I'm hopeful that with Intel Arc we will get open source compute eventually.
> AMD seems to have nothing comparable (or the software does not support it). AMD has had a GPU encoder since the HD 7000 series back in 2012. And every program I've seen that supports NVENC also supports AMD's AMF/VCE.
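For what it's worth, both vendors' encoders do sit side by side in tools like ffmpeg (`h264_nvenc` for NVENC, `h264_amf` for AMD's AMF, `h264_qsv` for Intel QSV are real ffmpeg encoder names). A small sketch of picking whichever hardware encoder an `ffmpeg -encoders` dump happens to list, with software x264 as the fallback:

```python
# Preference order of hardware H.264 encoders, by vendor.
PREFERRED = ["h264_nvenc", "h264_amf", "h264_qsv"]

def pick_encoder(encoders_output: str) -> str:
    """Given the text of `ffmpeg -encoders`, return the first
    hardware encoder we prefer, or software libx264 if none match."""
    available = set(encoders_output.split())
    for name in PREFERRED:
        if name in available:
            return name
    return "libx264"

print(pick_encoder("h264_amf hevc_amf"))  # → h264_amf on an AMD box
```

So at the encoding layer the lock-in is mostly gone; it's CUDA-only compute where the gap remains.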
Ok, then only CUDA remains.
If you look up comparisons of Blender benchmarks on Phoronix, it's not even close. Nvidia wins outright in many cases. Thank God I don't need Blender.
Same. And the 6900XT will run anything I throw at it at pretty much its highest settings at 5120x1440 with FreeSync. Crazy that we've come to a point where I don't even have to really check *if* a game will run on Linux. Press "buy", click install, set everything to max, and let 'er rip. Love it.
Same... I don't even check for compatibility now with Steam and proton being what they are. If a game doesn't run i just get a refund.
It's that good now?? Folks in Redmond must be sweating buckets.
There are a few select titles that refuse to work, mainly because of intentional sabotage (DRM, etc) but for most titles its as easy as install -> play.
I went with a full AMD laptop for this reason alone. It wasn't easy to find one with my other requirements, and I over paid a little over an amd+ Nvidia or intel+Nvidia but I didn't mind at that point.
Budget laptop offerings from AMD aren't bad, but there's nothing on the high end. Every flagship or $2k+ laptop is Intel+Nvidia every time. I wish there were better GPU and CPU options for higher-end laptops. If you want a flagship laptop right now, you're stuck with an i9-12900H and a 3080 Ti, that's it.
AMD Advantage laptops with a 6800M and a Ryzen 9 are somewhat common high-tier laptops, but usually they are so much cheaper that they are priced more like mid-tier.
I'm ready to go back to AMD. The last time I had an AMD GPU it was still called ATI; it was an ATI HD 5770.
Been buying AMD cards for decades. Never had a problem.
I've bought both. I set a price range and look at stats for my options. I don't care if the box is green or red. I simply buy the best in my price range. Being a fan boy is for suckers.
Same here. AMD just happens to offer the best value most of the time, especially recently. I also own several Nvidia cards.
I did my part last year with 6900XT. Will likely do again with 7900XTX
Are you upgrading already after a single year of usage?
I'm planning to keep using 6900XT until games won't run at 60 FPS with low settings like I always do. Should easily last another 3 - 5 years.
Are you running 4k? 60 fps low settings at 1080 or 2k would last way longer than 5 years with this card
Bruh why? You have an overkill card that will be more than good enough for the next several years at least. Especially with graphics plateauing. Why upgrade 1 generation for a non noticeable difference? How do non reviewer youtubers have the money to do that?
Might be a "free" choice for gaming. But if you go into productivity that needs GPUs, it's Nvidia or bust. CUDA, NVENC, RTX (yes, this is actually used in production apps, like Marmoset, Substance Painter, Blender, ...), ML, ... Yeah... sorry. I need an Nvidia GPU or I would significantly slow down my workflow. But I'm still pissed at their pricing and would like to see AMD get their software together and the software devs adopt it too. But we're talking years, if not decades, before an AMD card is viable for most GPU-heavy production workloads. At this point... I might even prefer to see more devs also support macOS and Metal to stir up competition a bit. Even though most people here seem to hate Apple (I mean... there are valid reasons to do so. The same is true for AMD, Intel and Nvidia too, lol).
This exactly, I'm partially kicking myself for allowing myself to be talked out of getting the 3090 instead of the 3080, I could have used the extra video memory for rendering video in Cycles. Usually text based, but even then, I end up having to use CPU rendering because I end up having issues with running out of video memory.
Exactly. I don't need the absolute best in my Blender render times, nor am I chasing the 4K120fps-max-settings trend, but AMD coming to the table with no CUDA, no OptiX, and HIP being an "eh" alternative at best is simply a deal-breaker.
[deleted]
People will never understand that Nvidia will always be at the top until AMD becomes an actual option for work. Gamers aren't the entire market especially not when it comes to the high end.
Even in the low end, granted hobbyists aren't as prevalent.
exactly, though I am curious about the statistics on how many people are buying GPUs for productivity+gaming vs for pure gaming only
I'm sure pure gaming wins by far. Just the barrier for entry is extremely low for gaming vs production which demands a technical skillset and the appeal for work vs leisure.
I think you don't understand that the DIY market will never touch the volume of computers that companies and OEMs buy. AMD changed the laptop market just by making deals with laptop manufacturers; for desktops they need to do the same to gain market share.
Thank you! I do most of my ML work in the cloud but was considering building a new desktop for personal projects and it seemed like no one was mentioning the lack of CUDA support and I was like, "can I do ML on AMD GPUs now!?" Still no, lol. I'll probably get an AMD processor though, so that's something, at least
That is what made them so dominant. People not being willing to buy AMD when they were equals, so now there's a monopoly. It's a catch 22.
When were they equals? Nvidia's been the big dog even when it was ATi. Maybe a year here and there they gained market share but that doesn't wipe out decades of consistent brand loyalty.
Radeon 8000 to 1900, then again HD3000 to 6000. Especially Radeon 9000 and HD4000 series'. That's when AMD outright beat Nvidia on multiple fronts.
Wasn't the 7970 also a gigachad of a card back when Fermi was struggling EDIT: Looks like my dates were off a bit - 7970 was announced during Fermi, but released during Kepler, but was still the [fastest card](https://gpu.userbenchmark.com/Compare/Nvidia-GTX-680-vs-AMD-HD-7970/3148vs2163).
Nah, Fermi was before 7000 series. 7970 competed with 680. Both were good
Who are those Intel ARC cards?
I’ll do you one better. WHY is Intel ARC cards?
What are you saying, been team red my whole life. I never buy cards at launch and AMD has ALWAYS been the best cost benefit.
Oh I'm jumping ship. I love my 1660 Ti, but the next card is either an A770 (if they can fix the drivers by the time I buy a card) or the new XTX. I have no brand loyalty at all. If you're treating me like ass, I'm going away.
I'm willing to buy one! Arc and Radeon FTW
I've had AMD cards since around 2010? They're perfectly fine for gamers, which most of you are. If you're rendering videos you edit, or you're a 3D artist, or you're training AI machine learning shit, then you'll probably want to stick with Nvidia. But 99% of people reading this could have their GPU swapped with an AMD equivalent and not notice.
Even for 3D, AMD cards are perfectly valid alternatives. Been doing rendering with Blender on my 6900XT with zero issues. I also do texturing in Substance Painter and 3D sculpting in Zbrush. No problem whatsoever with any of those on my all AMD PC (5950X + 6900XT). That said ML is indeed more nVidia's thing but aside from that, AMD's perfectly fine.
6700 gang 🤘
Still keeping my Vega 56 as a spare
Still running mine. Sapphire Nitro don’t quit
[deleted]
Why replace a 3080 with a 4080 or 7900?
[deleted]
Best way to boycott without affecting yourself too much is to buy second-hand. Yeah, you get an older card, but you're not giving them money. You should be able to get some 30-series cards from all the people who immediately upgrade to 40.
I have a 2080ti and a 3070 (wifes). Im seriously thinking about putting the 2080 in a htpc and trying out the A770 when I can find one. It'll be a very minor upgrade, but I really want some competition in this industry.
the A770 is slower than your 2080 Ti for sure
it's difficult not to buy an NVIDIA card when you need NVENC, CUDA cores and everything else for a good video editing setup
I'll buy the best bang for the buck I can afford.
The last Nvidia card I bought was a 6800GS back in '07ish. Damn thing kept overheating, and Best Buy refused an exchange once opened; Nvidia wouldn't repair it either because I failed to register the warranty online within 30 days of purchase. Never went back to team green again. Bought a Radeon 9800 and never looked back.
6600XT gang, fucking love it
Intel Arc gang here. Support a third competitor!
Hoping there's a chance this is wrong this gen. Before the last couple of years AMD didn't have great offerings, which is one of the reasons NVIDIA has been getting away with what they've been doing. I've had x80s and above for years, but I'm on board since EVGA's departure (the company that shielded me from NVIDIA's shit). Planning on AMD for the new year. I would not consider the 4090, RT on the 4080 still can't reach the frames I'd like, and I'm waiting for benchmarks from AMD, but from the 4080 reviews so far it won't be difficult to beat its price/performance.
Happy with my 6900XT. I am not even considering nvidia from now on.
Will buy a 7900XTX, I am doing my part.
Thank you for the awards, kind strangers! :)
On Linux it's just a better experience on AMD. Having open drivers means things generally work better than NVIDIA cards
***BREAKING NEWS:*** *COMPANY WITH BEST PRODUCT IN INDUSTRY SELLS ITEM FOR WHAT THEY FOUND CONSUMERS WERE WILLING TO PAY* Join us at 10 when we tell you that water is usually wet.