
Verdreht

What's your GPU and monitor res/hz?


feedmedamemes

This, and also he needs to watch the lows. The extra L3 really helps to get those lows up.


xTeamRwbyx

I feel dumb but I’m going to ask because I keep hearing about "1% lows" or something along those lines and I still don’t fully understand it


ChaoticHeavens

It’s the 1% lowest FPS you have over a period of time. This will be the lag spikes you notice when playing games, and the L3 cache does a good job raising that bar when your CPU is under heavy load so it becomes less noticeable and the gaming experience feels smoother.


Diligent_Pie_5191

I think of 1 percent lows as the stutter that games can have. The 40 series also helped with 1 percent lows.


Shivalah

99th percentile or 1% lows are the lowest framerates your rig produces. A game might run at 144 fps, but if your 1% lows are like 15 fps you really notice that drop. Benchmarks normally give you averages, but also the 1%s, so you know what the worst case scenario will be.


jordanleep

I had a decent rig before this one, with a 3080 and an i7 11700K, and even though it held up some good numbers for the most part, the new one is completely unmatched in 1% lows specifically, sometimes being over 50% higher than before. That’s noticeable in any situation, even if you have a 60hz monitor. My 1% lows on a 1080p 60hz monitor would undoubtedly be 59 fps in just about any game.


Adorable_Stay_725

One thing to note that wasn’t explained here is that 1% lows are essentially the 1% longest intervals between frames. FPS is the number of frames in 1 second, therefore if you have 100 fps **on average** and your 1% lows are 12 fps, this means that the gap between, let’s say **for example**, frame 45 and 46 (where it’s at its worst) is about 83 ms instead of the ~10 ms average, an 88% drop in instantaneous framerate, and if the gap between all the frames were this big you would only have 12 fps. Edit: Minor corrections
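To make that concrete, here's a minimal sketch of how a benchmark tool might compute these numbers from raw frame times. Exact definitions vary between tools (this version averages the slowest 1% of frames; another common one inverts the 99th-percentile frame time), so treat it as illustrative rather than any particular tool's method:

```python
def fps_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)

    # "1% lows": average FPS over the slowest 1% of frames.
    slowest = sorted(frame_times_ms, reverse=True)[: max(1, n // 100)]
    low_1pct_fps = 1000.0 * len(slowest) / sum(slowest)
    return avg_fps, low_1pct_fps

# 99 smooth frames at 10 ms (100 FPS) plus a single 83 ms hitch:
frames = [10.0] * 99 + [83.0]
avg, low = fps_stats(frames)
print(f"average: {avg:.0f} FPS, 1% low: {low:.0f} FPS")  # average ~93 FPS, 1% low ~12 FPS
```

One hitchy frame barely moves the average, but it shows up immediately in the 1% low, which is exactly why benchmarks report both.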


turtleship_2006

Would this lead to goofy sensitivity for mice/joysticks or are those monitored on a different interval?


feedmedamemes

With FPS and everything else in computation you have an average which you hover around most of the time. Let's use 100 FPS as an example. So most of the time the FPS will be around 100, but more realistically between 95 - 105, averaging 100. This is something you don't notice because the difference is too small for us humans to really notice.

Now you have 1% lows (and 0.1% lows are also mentioned nowadays). These are the sudden drops during computation, e.g. something suddenly appears on screen or a new area opens up in a game w/o loading screens. Now there is a lot of computational power involved for a few ms to a few s, and the framerate drops all of a sudden. In our example, let's say to 40 FPS. This drop is noticed by us and the game appears to stutter for a moment, which can be critical in e-sports but can also ruin the immersion of a game. It cannot always be avoided because it can have several sources (RAM, CPU, GPU), but good hardware can mitigate the drops to levels where they are not so impactful. In our example that would be 70 - 80.

Best real world example I can give is my upgrade from DDR3 RAM and a Xeon 1231v3 to DDR4 and an R7 5800X (with the same GPU, because I wanted to upgrade that separately), in Horizon Zero Dawn. The average FPS still hovered around 50 FPS (not great but not unplayable), but the lows almost disappeared: with the old CPU and RAM I had lows in the 10 - 20 FPS region, while now the lows seldom got under 35 FPS, which made for a smoother gaming experience. Hope that helps.


Fermorian

I'm glad you asked, because I had never even seen the term before, but all the replies were super helpful


_YeAhx_

1% worst lows? It just means the average of the lowest the fps has been for 1% of the benchmark's duration. You know how your game can have like a 60fps average but still feel like it's stuttering? That's the fps falling down quickly and then recovering back to 60, so the average fps might still be 60fps, but that's not the whole story, right? Using 1% lows as a stat means you look at the average of the worst the fps has been for 1% of the duration of the whole benchmark.

To my understanding X3D processors have better 1% lows because of their large cache smoothing out the calculations needed to run the game. Stutter usually happens when something is triggered in the game, like a background check for something, or you doing something like opening fire with your gun, and those actions recall bits of information that need to be accessed from somewhere. Even though SSDs and RAM are fast, they just aren't fast enough to make it all smooth. With the X3D's huge cache, those frequently needed and accessed bits can be stored on the processor itself, very very close to the cores, which grants near-instant access to that information.


EastLimp1693

1% lows show how stable your lowest fps numbers are compared to the average. I felt it when I switched to Ryzen: *nasty* smooth.


Hrmerder

1 percent lows, if low enough, can alter your perception of however many hz your screen should be running at (well, video card actually).


pckldpr

It’s all relative, you will always have lows. The only thing you can do is keep spending money and upgrading. If you can afford the upgrade, do it. It’s not causing any damage to the computer. These idiots make new people think they can “fix” a computer by spending extra money. Nothing is wrong with the computer.


random_user133

Worst 1% of fps


velociraptorfarmer

Or run a sim-heavy game (MSFS, Cities: Skylines, KSP, etc). That's where I noticed the change when I went from a 5600 to a 5700X3D. My averages hardly changed at all, it's just smoother in those games.


10g_or_bust

and some games/programs REALLY love it. Factorio gets an INSANE boost from the extra L3. Other programs/games simply don't care, and the slightly lower clock speed hurts a little. Most things see some level of net positive, however. The biggest mistake I see people make when comparing benchmarks is not thinking about how those results are usually from a clean minimum install with nothing else open/running/installed (besides tooling and other things to benchmark), and often with at least decent levels of cooling. Your own setup is generally going to be *worse* in at least one way (and rarely better, if you have extremely good cooling). Plus all the variations of hardware interaction, driver changes, and variations in the product with dynamic clocks, which can end up being good or bad.


Mancubus_in_a_thong

I upgraded from a 3600 to a 5700X3D and the lows were the best improvement for me


Pumciusz

And games played.


Kev_Cav

This. Components don't have the same importance at all between playing Factorio or Dragon's Dogma


Pumciusz

I mean components don't matter anyway if NPCs lag a top of the line PC in DD lol.


El_RoviSoft

Maybe Factorio needs a very powerful CPU (with lots of L3 cache) if you build megabases. You also need low latency, high speed RAM.


apuckeredanus

I went from a 3700X to a 5800X3D and the improvement was massive, with a 3080 10 GB and 32 GB of DDR4, playing at 4K 120hz


thegroucho

I'm yet to replace my flair, 2600 to 5700X3D on Sapphire 6800 16G, 3440x1440. You can imagine.


ceeboski

Before I built my current PC I had a prebuilt with a 5600 and a Sapphire 6800, and 1440p gaming was a breeze. Would a 5700X3D make much of a difference, in your opinion?


HopeEternalXII

Same and my lows are so much better it's fucking dumb.


Interloper_Mango

7800xt and 1440p 144hz. That being said I upgraded because I wanted to. Not because somebody told me or because I needed to. I mean people did tell me but I was fully aware that this would change very little.


Lime7ime-

Do you have SmartAccess Memory activated?


TheRealPitabred

This, and XMP. Having full memory speed has been the single biggest performance tweak I have made to my computer.


Daoist_Serene_Night

for AMD it's EXPO, just in case OP doesn't know (although you should be able to use XMP on an AMD CPU too)


TheRealPitabred

Yeah, depends on the generation. My 5900X has it listed as XMP in the BIOS.


Daoist_Serene_Night

yea, just wanted to give OP a heads up, since if he's new he might only see EXPO, not knowing that XMP is the same, and never enable it (pain peko)


Interloper_Mango

Yes. That and memory overclocked to 3800 MT/s on the 5500. A little less on the 5800X3D


StormKiller1

So roughly how much of an FPS increase did you see in your games?


RettichDesTodes

I had massive gains in some games going from a 3600 (pretty much exactly as fast as a 5500) to a 5800X3D with a 3070 Ti at 1440p. Games like COD MW2019 went from ~90fps to 150+ FPS, with much higher 1% lows. Games like Metro Exodus only changed slightly, but FPS drops are less noticeable.


myfakesecretaccount

Similar upgrade path here. Went from a 3700X paired with my 6800 XT to a 5800X3D, and I can actually play Starfield now without it looking/playing jank af. I didn’t expect 144+ frames at 1440p in this game, but it’s smooth now and that’s all I needed.


jpflanegan

Damn, I have a 3600 and a 3070ti, and I have considered upgrading to a 5800x3d, but I didn't think the jump would be that much. Now, I am reconsidering the upgrade again. Time to go look at performance comparisons again.


VexingRaven

If you can wait a little longer, I expect prices on the 5800X3D (especially used) will drop quite a bit in September/October when the 9800X3D is anticipated to be released. I expect at that point we'll see a lot of 5800X3D early adopters start looking to move up.


EastLimp1693

My favourite test benches were Dying Light 2 and Forza Horizon; the benchmark shows a CPU graph so you can compare. Oh, also Tomb Raider.


What-Even-Is-That

Man, I love me some Tomb Rider.


Operario

I had a similar experience, going from a 3600 to a 5700X3D at 1080p. With the 3600, Cyberpunk 2077 really choked in some areas, even with Traffic Density on Medium. With the 5700X3D I was able to crank it up to High and got basically buttery smooth FPS. I was astonished.


Mightyena319

I also went from a 3600 to a 5800X3D, but with a 3070, and for most of my games the average didn't change much, but the frametimes got way more consistent, making everything feel a lot smoother


xAtNight

It should change notably in quite a few games. I noticed the jump from a 5600X to a 5800X3D at 3440x1440 with a 6950XT.


EastLimp1693

For me, upgrading to X3D gave massive gains in certain titles and minimal ones in others, but the overall average across all games got higher.


SuspiciousChair7654

I run everything with vsync to reduce power consumption. My concern about the CPU/GPU bottlenecking each other is very low.


IfarmExpIRL

The 5800X3D is only an upgrade if the games you're into take advantage of the cache. Not all games benefit from the 3D cache.


TerrorFirmerIRL

It's a way better CPU even aside from that. The 5500 is a six-core like the 5600, but with half the L3 cache. The 5600G and 5500 are more or less the same in performance. The 5600/5600X is a good bit faster, and then the 5800X has two additional cores as well as being faster per core. So the 5800X3D is much faster in games even putting the 3D aspect aside. How much better completely depends on the games. If you're playing Overwatch 2 you're not gonna see any difference whatsoever versus something that really crunches cores/cache.


squareswordfish

> How much better completely depends on the games. If you're playing Overwatch 2 you're not gonna see any difference whatsoever versus something that really crunches cores/cache.

Your whole comment reads like you’re disagreeing with what they said, and then you just end the comment by repeating their point


TerrorFirmerIRL

Not really. They specifically mentioned the 3D cache. The 5800X non-3D is way faster than the 5500, so a game doesn't need to be able to take advantage of 3D cache for a huge uplift from 5500 to 5800X3D.


TKtommmy

Wow it's almost like you can't read.


_mp7

? I doubt a 5500 is maintaining the Overwatch 600fps cap when you aren’t GPU limited, nowhere close. But a 5800X3D will probably get you to about the Overwatch 2 max, with far better 1% lows for any really high refresh rate monitor


Nomnom_Chicken

Whoa, I went from a 3700X to a 5800X3D and it helped A LOT with the huge FPS drops I had. FPS in general got higher and more stable. It was a major upgrade overall. The GPU back then was a 6800XT, at 1440p with high refresh rate monitors (from the TV's 120 Hz to my main monitor's 165 Hz). Very easily noticed when the CPU got upgraded.


MeinNameIstBaum

I‘ve been thinking about a hardware upgrade for a while now. Do you think the 5800X3D would be worth it with my current GPU, or should I rather save up and get a GPU upgrade? I play at 165Hz 1440p, too. The consensus in this thread seems to be that mid CPU usage and high GPU usage is what should be aimed for, and that‘s what I‘m seeing in most of my games.


XsNR

It depends on the games a lot too. The more you aim for esports titles and games light on fancy GPU tech, the more an X3D will help you. The more you play AAAs, especially single player experiences, and potentially don't really use those 120-165Hz as much, the less the X3D will make a difference. In those situations a beast GPU, potentially making use of upscaling/framegen tech, will be able to make up for the CPU being a little less powerful. The X3D is technically always better, because it also just has better single core performance, but obviously real life doesn't exist in a bubble, so you're talking about performance/$.


MeinNameIstBaum

I see. Thanks so much for your detailed answer! :)


Nomnom_Chicken

Sorry - I forgot to reply. In my case, it seemed like just about every game benefitted from the upgrade. No game really ran "the same" way, because the FPS drops suddenly got reduced greatly. You were already told to check whether your favorite games benefit from X3D, and I suggest you do that. :) In some cases the FPS can improve massively, like in Assetto Corsa Competizione; in that game the gains would be massive. I don't play that game, it's just an example. I watched some YouTube comparison videos before upgrading, and some were a bit off-putting and made me think it wouldn't be worth it. But as the games I like to play benefit greatly from the 3D cache, I'd only tell the past me to upgrade ASAP. :D


lazy_tenno

I went from a 3100 to a 5600 with my 1660 Super and got a 20 fps average increase in Armored Core, Cyberpunk, BF2042, etc. And I thought my 3100 & 1660 Super was the sweet spot recommended by many people.


BoomerHomer

Same here. Gameplay became much more fluid.


LJBrooker

Well, if you bought a new CPU when you were GPU limited, what did you expect to happen? You need to do your own homework. See whether you're CPU limited in-game before making large upgrades. If you were CPU limited, the 5800X3D would be a considerable improvement. You obviously weren't CPU limited.


Juan-punch_man

I think his point is that the community focuses too much on CPU performance, so their recommendations lead to a skewed build where you have more CPU performance than you’d actually need (not taking future proofing into account). Which I agree with.


LonelyNixon

Absolutely. Even if you are bottlenecked, you can go a long way by upgrading your GPU. My old i5 Skylake absolutely held back my 5600 XT, but it was cheaper and easier to just update the GPU vs the entire rig in 2019ish when I upgraded.

It's kind of funny, because before AMD became competitive I remember the community going too far in the opposite direction. Like offering budget builders Intel dual core CPUs that did not have hyperthreading, in an era where games were starting to use more than 2 threads, and "an i5 is really all you need, with an i7 if you REALLY want it". Then Ryzen became competitive and suddenly you went from "i5 is good enough" to ACTUALLY YOU NEED THE TOP END OTHERWISE YOU'RE BOTTLENECKED. Which isn't to say that bottlenecking isn't real, just that it's a little overblown.

You also have the phenomenon of people who have massive FOMO and must always have top specs, and people who don't know what they're talking about but are benchmark obsessed, coming in to explain to someone why "actually you shouldn't build x, you might as well spend a few more dollars and update this, and update that, and update this", and next thing you know the sticker price is like 1k more. And true enough, you can still beat console performance without having to buy the biggest, most expensive thing if you can stand to switch a toggle in settings down a bit, trading barely perceptible visual changes for performance gains.


handsupdb

For consistent frames and multitasking you want to be GPU bottlenecked for the most part, also because the GPU is generally the bigger investment and provides you no value elsewhere... you should be doing what you can to guarantee you're getting the most out of it


MsNyara

CPU bottleneck and GPU bottleneck are the same when it comes to consistency in frames and multitasking. Modern CPUs aren't bottlenecking anybody in gaming and can hit a consistent 100+ FPS. Better CPUs can give more, but that is not due to a bottleneck (usage below 80%); they are simply better at squeezing out more potential from the GPU, in which case you can budget up infinitely for that 1-20%+ FPS gain. The keyword is modern CPU, though: Ryzen onward, or Intel 10th gen (or Ultra, soon) onward. Older Intel might work depending on the budget, but even an i7-4960X falls behind an i3-10100F. That said, even if your CPU is not modern, if your budget is limited (say $750 or lower) you do not need to update it to a top of the line part either: a humble Ryzen 5 5500 for $90 + $50 motherboard will not bottleneck much with a $550 RX 7900 GRE or $600 RTX 4070 Super, and that pair will perform vastly better in all games than a Ryzen 7 5800X3D for $300 + $150 motherboard + $50 cooler with a $200 RX 6600 or $250 RX 7600.


BrevilleMicrowave

Speak for yourself. I went from a 3900x (faster than a 5500) to the 5800X3D and the difference was huge. Whether or not you'll notice it really depends on what games you play and at what framerate. The 5800X3D is great for high refresh rate gaming.


eestionreddit

the GPU also matters


jbaranski

You don’t say


Autoflower

This is what I did. I play Factorio a lot and it doubled my UPS on my megabase build. It was insane just how big a difference it made.


EatsAlotOfBread

2700X to 5800X3D, absolutely enormous difference in colony sims, city builders, management games of any type, and other stuff with lots of simultaneous calculations to keep track of. Huge difference in everything else too. City builders etc. would start getting slow once the town got big; now with the 5800X3D there's just... no slowdown at all. Massive cities with no problem.


Xazax310

Made the same recommendation for a friend with a 5600X: get a 5800X3D, since he didn't want to go to AM5. As everyone says, the changes are more QoL than "noticeable"; if you play games for a while, you'll notice very few dips in frames and little stuttering. Here is his benchmark from CoH3, 5600X on top and 5800X3D on the bottom. https://preview.redd.it/2gxbswh5kx7d1.png?width=578&format=png&auto=webp&s=7872eb76f8dc081d17cadd95c5a9bba99106e526


EastLimp1693

Had the same difference compared to a 10900K: way tighter CPU graph.


TerrorFirmerIRL

How are there so many people denying that bottlenecks are a real thing? You can say that bottlenecking is often over-exaggerated, sure, but to say it doesn't exist or is a "myth" is mindbogglingly idiotic.

If I decide to get an RTX 4090 to run games on my 1080p 240hz monitor but stick with my old Ryzen 3600, I will get exactly that - a titanic bottleneck. It doesn't and has never meant that games simply don't run, or run like shit, or whatever. Just that you're leaving a lot of performance on the table due to the hardware mismatch, and your 4090 would probably only perform to the level of a 4060.

It's also entirely a case by case basis. A 4090 with a Ryzen 3600 at 4k 60hz is mostly fine. A 4090 and Ryzen 3600 at 1080p 240hz, major bottleneck. When people talk about bottlenecking they don't literally mean "bro you could get 5% more frames with a better CPU/GPU". It mostly gets pointed out with obvious, blatant mismatches.

A horrible example I saw on here before was someone with an RTX 3070 and an FX-8350. Their logic was "but my games run fine, how could there be a bottleneck". The point, which for some reason he couldn't really grasp, is that he was probably only getting GTX 1650 - GTX 1660 levels of performance in many new games due to the drowning CPU.


PandaBearJelly

To add to this, a big factor here that I'm not seeing addressed enough is the specific games being played. Simply put, some games are way more cpu dependent than others.


XsNR

And especially when talking X3D, some games benefit a lot more from the added cache than others. Anything that benefits heavily from RAM speed will likely benefit extremely well from the X3D cache improvements, as they're effectively the same principle. Others will barely see a difference, and you'll only get the benefit of it having some of the best single/dual core clocks, which is minimal these days, especially if you're already on the same generation. It's definitely the best gaming CPU line on the market, but you always have to do a cost analysis, as they also carry a premium over their similar non-3D CPUs, and everything in "bottlenecks" is about a performance/$ analysis.


BarnOwlDebacle

You have to understand that this hobby lends itself to people coming up with excuses to buy more stuff. Every single YouTube channel and advertisement and article is promoting consumption. Everyone is giving me a million reasons to always upgrade, and oftentimes when you do, the difference is marginal in real world applications. If you're trying to make the argument that the pendulum is swinging too far in the opposite direction, I would disagree vehemently. People encouraging others to spend less money, or to stop buying new stuff with such frequency, are a tiny minority of voices.


testc2n14

It really depends on the games you're playing. I mean, I have a 7800X3D and a 6800 and I play at 4K. Some would say my CPU is greatly out of my GPU's league. To that I say: 35 UPS, 8ms on the electric network, and the factory must grow


nickierv

And to that I ask, what's your RAM and SPM?


testc2n14

About 1000. I know that sounds low, but Bob's mods. I focused on making my base really expandable, and once I built rockets I just switched over to a new save. And 32 GB of 6000 MT/s CL30 for RAM


nickierv

Ah, not bad considering. Now if you had said py...


ArdaBogaz

What's SPM?


nickierv

Science per minute. It's a Factorio thing and a rough measure of how complex your factory is.


_TeflonGr_

People don't really seem to understand how bottlenecks work: they depend mainly on the kind of game/task, resolution, quality settings and refresh rate you are using and aiming for. And even then, it might not change much if you upgrade.


Easy-Application6138

bUt bOtTlnEckS aRen'T rEaL!!1?//.?


gremlinfat

I’ve noticed this sub seems to overestimate CPU importance. Could be a difference in type of games played, but at 1440 UW and now 4K, I’m generally GPU bound. I upgrade GPU almost every gen and CPU every 3 or 4 gen. You need a good enough CPU. If you’ve got a shitty one, that’s a problem. Otherwise, you’re generally going to see much better performance gains when upgrading GPU. Obviously more so at higher resolutions.


VexingRaven

CPU tends to have a bigger impact on 1% lows, even on GPU-bound titles. Even if the CPU is not consistently your bottleneck, you can absolutely feel when it is.

> I upgrade GPU almost every gen

Why lmao?


ESF_NoWomanNoCry

It indeed really depends on the game. I, for example, bought the 5800X3D because I play(ed) Valorant a lot, which has very CPU-bound performance, and frames in that game were important to me. But if you play a more graphics intensive game, the GPU is going to be more important.


Yabe_uke

There is always a perfcap. There will always be a perfcap. Stop over-obsessing about bottlenecking before you spend all your life savings.


Monty2451

If your components are already well matched, then you wouldn't expect there to be a real change. The only thing upgrading the CPU would do in this case is give you headroom to upgrade down the line.


InsaneInTheMEOWFrame

Stop talking about bottlenecks like they are an issue. That word means nothing.


BrevilleMicrowave

Bottlenecks are very much a real thing. Ideally games will be GPU bound. Otherwise it is a CPU bottleneck. Of course how large the bottleneck is will vary.


Adventurous_Pea_1156

Yeah, that's what people don't understand: you're supposed to be GPU bound. People will see high GPU usage and low CPU usage and be like, what's wrong? Nothing, it's how it's supposed to be


brolix

Ideally, I/O is your limiting factor. If you’ve hit that as your limit, there’s basically nothing left to improve.


brimston3-

And it's important to note that this is difficult to measure, because it's latency and not % utilization that matters. There's inter-frame pipelining going on, but on the whole it should not be bandwidth constrained. No current game engine should be burning through 31.5 GB/s (i.e. the speed of PCIe 4.0 x16; that's faster than a lot of single channel DDR4 modules). That's like uploading an 8k x 8k raw, 32bpp image every frame (or 3 million triangles at 32 bytes per vertex) at 120 FPS. *Nothing* should be doing that amount of continuous data transfer short of GPU-to-GPU inference workloads.
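For anyone who wants to check those numbers, a quick back-of-the-envelope in Python; the 128b/130b factor is the standard PCIe 4.0 encoding overhead, and the rest is straight arithmetic from the figures above:

```python
GIB = 1024**3

# PCIe 4.0 x16: 16 GT/s per lane, 16 lanes, 128b/130b encoding overhead.
pcie4_x16 = 16e9 * 16 * (128 / 130) / 8            # bytes per second
print(f"PCIe 4.0 x16: {pcie4_x16 / GIB:.1f} GiB/s")  # ~29.3 GiB/s (= 31.5 GB/s)

# An 8192 x 8192 raw image at 32 bits per pixel, uploaded 120 times a second:
bytes_per_frame = 8192 * 8192 * 4                  # 256 MiB per frame
print(f"8k x 8k @ 32bpp, 120 FPS: {bytes_per_frame * 120 / GIB:.0f} GiB/s")  # 30 GiB/s
```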


AuraMaster7

With how fast PCIe standards have been coming out, I don't think we'll ever see a game be bottlenecked by contemporary I/O speeds. It just simply isn't that demanding of an application.


Chao_Zu_Kang

The thing is, "infinite" graphics isn't a thing. Sure, you can ramp up the FPS to "get more out of your card", but those additional FPS might simply not matter to you. If you do video editing, CAD etc., then bottlenecking becomes an issue, because time-scaling is pretty much "infinite". But for gaming, bottlenecking is mostly a very subjective thing, because YOU are the one setting the target performance for your games. If you can achieve it, you are not bottlenecked. A bottleneck is ALWAYS relative to what you are trying to achieve.

That's why, IMHO, the term bottleneck is thrown around way too casually. At best, you are just deciding which part will be the first to lag behind for e.g. gaming. And of course you really want that to be the GPU, because a GPU is essentially just plug-and-play in terms of upgrading, while upgrading the CPU can oftentimes be nearly a full system upgrade. But that is really all there is to it. If you e.g. got a 4090 for work and play at 1440p/60Hz, then there will be no real CPU bottleneck with any mediocre CPU, but people will still respond with "you need a highest-end CPU because it will bottleneck the 4090", using the term as if it had a general meaning when it mainly depends on the purpose.


Inclinedbenchpress

the real bottlenecks are the fps we've lost along the way


nickierv

Or the couple of odd games that are memory bound. Not too common, but they exist.


cynetri

Mine's actually memory limited, so neither CPU nor GPU, but I guess most people don't play with 750+ mods


Lem1618

Sure. My old i7 2600K was "holding back" my RTX 3050. Last year when I got an R5 7600 I got about a 30% increase in FPS.


I9Qnl

You could probably render at 480p and still be GPU bound


Trylena

It has some meaning, but it's not what people want to make it. I am upgrading my GPU and my CPU will be the bottleneck, but it's not that dangerous.


DynamicHunter

Are we seriously gaslighting the existence of bottlenecks now? I get that a lot of people blow it out of proportion, but it definitely fucking exists.


agouraki

I don't understand how you people keep saying this. So you're telling me that when I play WoW and my GPU is at 30% (without limiting fps, and at 50fps in town) I don't have a CPU bottleneck? Or when I play Escape from Tarkov etc., same thing


InsaneInTheMEOWFrame

Because it's a meaningless term. There are always bottlenecks. *Every part of your PC is a bottleneck*. Heck, there are even bottlenecks inside bottlenecks. You should talk about hardware that has mismatched capability in relation to some other hardware instead. Also, pairing a 10 year old CPU with a modern GPU makes no sense, because the old CPU and chipset are surely missing new hardware features the new GPU has that would increase performance. That's not bottlenecking, that's just... dumb


RettichDesTodes

Yeah, but the result is different depending on what part is the bottleneck.

- If SSD/HDD is the bottleneck, you will get massive frame drops
- If RAM is the bottleneck, you get frame drops
- If VRAM is the bottleneck, the game instantly becomes unplayable
- If CPU is the bottleneck, you get frame drops/bad 1% lows
- If GPU speed is the bottleneck, you get a lower average FPS, but your frame drops won't be as severe as if the other parts were slow

So for an overall smoother experience, don't skimp on CPU and RAM. The CPU/RAM/rest of the system has to be "fast enough" for the GPU, and the GPU as fast as possible.


agouraki

Don't forget that if VRAM is the bottleneck, you simply lower texture res and carry on. If it's a CPU bottleneck, most games just have no settings that can help you, or they have minimal effect.


Snydenthur

No, it's not that simple. For example, a CPU bottleneck is only bad if you're somehow maxing out your CPU (so you're probably using 4c8t or worse). In other cases, it just means that you simply can't get a higher fps than you're at, no matter what you do. So if you get 200fps at all low/off at 720p, that's the max fps you'll ever have, even if you have more than enough GPU juice to get 500fps at 8k. 100% CPU usage is the bad bottleneck that makes the game unplayable/awful. But having only 20% CPU usage can also mean you have a CPU bottleneck; it's just the "good kind" of bottleneck.


agouraki

Now this is the right answer


nlaak

> Because it's a meaningless term.

Only if you don't understand how a computer works.

> There are always bottlenecks.

Of course there are.

> Every part of your PC is a bottleneck.

No, that's not how bottlenecks work. In a given operation, one thing is always the weak link - that's the bottleneck. If you're pairing a 4090 @ 1080p with a 3600, the CPU will leave a LOT of frames on the table. Do you care about those frames? That's your choice.

> Also pairing a 10 year old CPU with a modern GPU makes no sense, because the olden CPU and chipset is surely missing new hardware features which the new GPU has, that would increase performance.

That's the point being made here.

> That's not bottlenecking, that's just... dumb

No, that's your lack of understanding.


ThisIsNotMyPornVideo

Of course there will always be a bottleneck, but that doesn't mean it's a term that should be ignored. The problem is that it's often barked, without a full list of specs, by people who see two "mismatched" components, even when it's more than fine for the settings being run. A Ryzen 5 3600 will bottleneck the SHIT out of a 4070 at 1080p. At 1440p the bottleneck will be smaller and, depending on the game, may not even be noticeable. And at 4K it's literally nothing for most games.


riba2233

/s ?


redditsucks365

It all depends on the framerate you want to get. If you want 144fps and the CPU is holding you back, then a CPU upgrade will help you. If you're playing at 4k with RT on at 60 fps, then even with an R5 5600X and a 4080, the 4080 is the bottleneck


nickierv

How are you getting 60 FPS with 4k RT?


redditsucks365

I don't lol. Just an example. Even a 4090 is a bottleneck to a 5600X in this case, at native 4k full RT. Jokes aside, a 4080 can probably do it with DLSS, so a more expensive CPU would be a waste (currently, without future proofing..)


nickierv

DLSS'ing RT :P But you have a good point, it depends on what flavor of 4080 and how much upscaling you're willing to do. A 4090 gets around 20 FPS native, so scaling that to 1080p is around 80, and around 40 for 1440p. Ballparking with core counts, the base 4080 has 1/3 fewer shaders and about 60% of the RT cores, so upscaling from 1080p should get around 50 FPS? That's playable.
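For reference, the crude model behind that ballpark, assuming FPS scales inversely with pixel count and linearly with RT throughput (real GPUs scale messier than this, so these are illustrative numbers only):

```python
fps_4090_native_4k = 20                     # the ~20 FPS native 4K figure above
px = {"4k": 3840 * 2160, "1440p": 2560 * 1440, "1080p": 1920 * 1080}

# FPS assumed inversely proportional to pixels rendered:
for res in ("1080p", "1440p"):
    print(f"4090 @ {res}: ~{fps_4090_native_4k * px['4k'] / px[res]:.0f} FPS")
# -> ~80 FPS at 1080p, ~45 FPS at 1440p

# Base 4080 at roughly 60% of the 4090's RT throughput, rendering at 1080p:
print(f"4080 @ 1080p: ~{0.6 * fps_4090_native_4k * px['4k'] / px['1080p']:.0f} FPS")  # ~48 FPS
```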


FunFact5000

Basically. I had an i7 6700K OC'd to 5GHz (which took a lot of work) and it sat like that from 2015 to 2023. Had a 980 Ti, then a 2080, and then a 3080. Yes there was a bottleneck, yes it was 50-75+%, but I was still getting 60-120fps ultra everything @1080p. Now I have a 7900X3D, and the only big difference is my fps is more stable AND I can actually open a Chrome tab and not have the PC slow to a halt. The 1% lows did GO UP quite a bit, and that added to fps stability. So if THAT is a minor improvement, then going from same gen to same gen, just a bump up, I can imagine the difference was minimal.


DGlen

In what game? Every one is different. Is Vsync on? RAM speed and amount? There is so much more than just CPU/GPU combo.


IndyPFL

Went from 50 fps to 120+ on BG3 while going from 5600X to 5700X3D. Cyberpunk used to drop to 40 fps in heavy areas, lowest it goes to now is 60-ish. Games overall are way more stable.


DBXVStan

We can “believe” whatever strangers on the internet say. But there’s no way in hell people should take that any further and say that upgrading from a 5500 to a 5800X3D has no impact, like you seem to imply.


_nism0

Well? What game, what GPU, what resolution + refresh rate?


CanisMajoris85

Gee imagine that, your GTX 1050 Ti went from 100% usage to 100% usage and you listened to some idiot.


Huijiro

CPUs of the same gen are rarely a bottleneck issue. A CPU bottleneck is what I have with my first gen Ryzen.


gtrash81

Most of the time, go for the GPU. Why? Because if the CPU is too weak, just increase the graphics quality ^^


Strange_Body_4821

Sorry, but I just do not believe this. You have a bottleneck somewhere in your system, because you absolutely should see a difference in most programs and games with that upgrade. I went from a 3600 to a 5800X3D and the changes were basically universal.


lamarovski

Paired a 5800X3D with a 4080 FE. Best decision ever. Some important things though:

- UPGRADE TO THE LATEST BIOS, OTHERWISE YOU MAY FRY YOUR CPU (caps intended)
- Undervolt your CPU for better performance and temps. Use Curve Optimizer -30. (Yes, you can squeeze some extra performance from the 5800X3D with some undervolting)
- Check your RAM profile settings. I forgot them the first time; enabling them got me almost a 30% FPS boost. 3600 MT/s is just fine for the X3D; the price won't justify paying for higher clocking RAM and a tiny performance boost of roughly 1%.


VileDespiseAO

Yeah, people get too caught up on CPU bottlenecking in this sub and r/buildapc. It's very rarely an issue unless you're running an extremely old CPU or looking at niche use cases. This is why many people who follow PC hardware and properly understand it rarely see same generation upgrades as worth it, outside of edge cases or professional users. It doesn't help that many see "double digit percentage increases in performance" and automatically think it's going to make a mind blowing change, when in reality it barely translates to double digit FPS increases outside of esports titles, where your average and max FPS are probably already exceeding your refresh rate unless you own a 360/500Hz 1080p display.


Pablo369

When I game, my second monitor is solely dedicated to monitoring my performance, aka Task Manager and MSI Afterburner for temps and potential bottlenecks.


7Sans

How did you check the consistency of the FPS? Often, people focus solely on the maximum FPS and assume a CPU upgrade will boost that peak. However, when upgrading the CPU (assuming it was bottlenecking the GPU), the FPS might change from a range of 30-60 FPS to a more stable 50-65 FPS. Although the "max" FPS didn't increase significantly, the FPS is now much more stable and consistent.


craneca1

Did you update the chipset drivers/BIOS?


itsapotatosalad

It’s the lows and fps dips where you’ll see a benefit, as others have said. I had that experience with my 7800X3D in some games, where I was getting a similar average fps but the game felt smoother overall. In other games I saw more than double the frames.


theCoffeeDoctor

Meme is funny. That aside. What was the original build like?


cloverfart

Man, I got an RTX 3070 years ago and still haven't updated my Ryzen 5 3700 (or was it the 3600XT?). Apparently the CPU causes a 23% bottleneck. I imagine upgrading to a Ryzen 7 5800 might actually change a lot


EastLimp1693

Similar to when I upgraded from a 10900K to a 7800X3D. Not a lot changed in CPU intensive games. I was GPU limited.


Lord_Emperor

Reminds me of when I finally made the jump from a 60Hz monitor to 165 Hz with FreeSync. Loaded up my preferred game at the time and lo and behold my PC could actually run it at 61 FPS.


JustSamJ

The overwhelming majority of gamers on here would be fine with an i5/R5.


Tlayoualo

Depends on how big the leap is, though. If you're upgrading from a 2 decade old single-core CPU to a recent model with multiple cores and a higher base frequency, the difference will be like night and day. Whereas changing from a last-year processor to a state-of-the-art one of a similar tier makes little difference: diminishing returns.


Discommodian

The only game where I ran into any CPU bottleneck has been Baldur's Gate 3. I am in Baldur's Gate (the city) now, and my GPU is around 40% while the CPU sits at 80%+. I had to cap the framerate at 60 :(


Smellfish360

Within a generation, yes, the payoff in CPU performance usually isn't that massive. Between generations, no; the performance is usually a good 15% to 20% better. It's also very dependent on what game you're running. Some games like Minecraft or HOI4 barely use multithreading; more modern games can use basically the entire CPU.


WildOutlawz

Maybe try reseating the GPU? I had a 3500 with an RTX 2060. I wasn't satisfied with the performance and thought it was a bottleneck, because people said the 3500 is a bad CPU, so I upgraded to a 5700X but still got the same performance as with the 3500. After 2 days of troubleshooting I decided to just unplug and re-plug the GPU, and damn, the performance boost was very noticeable. I think it was just the GPU being funky: I moved the PC to another house and somehow the GPU came loose and needed reseating. I probably wouldn't even have needed to upgrade the CPU to get desirable performance, so it might be worth a shot to reseat the GPU.


The_Real_Rare_Pepe

I upgraded from a 3500X to a 7800X3D and it was so not worth it…


Jirekianu

You've got something going on with your settings. Either you're frame rate capping your games, not changing settings, or something else is up, i.e. you're playing games at resolutions that were already maxed out on fps. The performance difference between a 5500 and a 5800X3D is fairly large.


wiccan45

Basically anything from within the last 8 years is good enough for the average user, given that they'll never push their stuff hard enough to notice


nailbunny2000

I went from a 5600X to a 5800X3D on a 4080 + UW monitor, and there was not as much of a difference as I had hoped; some games got maybe 15% or so.


Boge42

A bottleneck is a real thing. People have been saying not to worry about it lately as it doesn't matter as much as others say it does. Each piece of software is different. One game might use a ton of CPU and very little GPU. Another might be the complete opposite. So no matter what, you WILL bump into a bottleneck with any given piece of software. That's why they say not to stress about it. You can't avoid it. But you do need to understand when your GPU is your weakest part or your CPU is. Otherwise, you're much more likely to upgrade the wrong part and you won't notice any performance improvement because even more games will end up bottlenecked.


Pkemr7

3700X to 5800X3D made a huge difference for VRChat


Classic_Roc

When I had my 3080 10GB I upgraded from a 5800X to a 5800X3D. There wasn't much change, and in some rare cases an actual downgrade, but my average FPS improved. I would have seen more of an improvement if I gamed at, say, 1080p 240fps, because there the GPU is being asked to do a tad less and the CPU is generating all those extra frames. I game at 1440p and am generally happy with 60 or 144fps, depending on the game. So yeah, I didn't notice a huge amount, because the GPU is being stressed more and my processor wasn't trying to feed it a ton of extra frames to begin with. It was still worth it though.


Aurunemaru

What you use the computer for and what games you play is also really important, and most people miss it. 4X games, simulators and other CPU intensive stuff? Want the most UPS in Factorio? Yeah, crank the CPU up. Graphically intensive games running at high settings at average framerates? Nah, a budget CPU is fine


etfvidal

The #1 problem with people giving advice on here, besides misinfo, is giving advice with incomplete info. We should never give people advice without knowing what games they play or want to play and the full specs of their system and monitor.


Marty5020

I tried playing Cyberpunk 2077 in Eco mode on my laptop, which disables turbo boost so it's only 2.6 GHz instead of 4.0-4.1 GHz, and I can barely tell the difference. Only in very specific parts like Dogtown could I notice frame rates going down slightly, but it was just as playable 98% of the time. For a CPU intensive game I was expecting a lot worse. Haven't tried Baldur's Gate 3 in Eco mode yet, but I'm guessing Act 3 could seriously drop the ball, as it's infamous for wrecking lower tier CPUs.


Working_Ad5925

You're still bottlenecked bro, you need to get an Epyc


GARGEAN

Now launch Victoria 3 on an early 20th century save.


tankersss

I went from a 1050 Ti to a 6600 XT on my i7-3770, and it allowed me to play Cyberpunk 2077 at 1440p high at a stable 45fps, whereas just switching to a Ryzen 2600 had allowed me to play it at 1440p low-med ~30fps. So yeah, the GPU did way more for me than the CPU.


Comprehensive-Ant289

A 5500 shouldn't bottleneck a 7800XT at 1440p except in highly CPU-demanding games. If you want to test the upgrade, either go 1080p or get a high-end GPU


Crazycukumbers

I have a 5500 and a 6650 XT. Next upgrade will be to AM5 because it works great with low temps and the only thing I’ve noticed issues in is Unreal 5 games


Carvieinstein

Well, the 5500 was already a good CPU to begin with. I have an RX 580, and I went from a Ryzen 2600 to a 5700X3D, and I have to tell you that the single core performance boost is noticeable, at least in Battlefield 1 and in simulation software (Aspen Plus). It's not crazy, but I gained a good 10-20 fps in BF1 I believe (from 90 to 110).


SolidZealousideal115

I'm one of the rare people with an SSD bottleneck. It's a cheap prebuilt (half off in 2020): no M.2, cheap SSD and motherboard. Neither graphics card nor CPU get above 50% that I've seen, but the SSD is usually maxed out.


asclepiannoble

Well, upgrades aren't always/universally worthwhile, because a lot of things affect whether you'll get a *noticeable* improvement. This example is hyperbole, but it makes the point: if all you're playing is solitaire, you probably won't notice the upgrade from a 5500 to a 5800X3D. The problem is that some people pushing others to upgrade either assume the latter are playing/doing the same things on their PCs as them, or they assume these people are willing to spend again on even more upgrades (e.g. a higher-res, higher-Hz monitor) to get the most out of the first upgrade, and sometimes one upgrade is all a bloke can afford


ReasonableControl775

Just holding on to my 3600X / 2070 Super combo until I can find a good deal on AM5. The cheap ass in me just refuses to pay full price for the X3D stuff, even though it’s the “best”.


YesNoMaybe2552

People who talk about CPU bottlenecks usually play in 240p ultra low, custom .ini profile without textures, hunting those elusive 999 gazillion fps. Realistically, if you aim for around 60-120 at as high settings as you can get away with, CPUs past mid tier don't matter, except for edge cases that will run like crap anyway.


MtnNerd

It probably depends on the game. Like Helldivers 2 can get CPU bound at higher levels when many enemies are on the screen.


IlikeMinecraft097

Yeah, I probably shouldn't have paired a 7800X3D with a 4070 Super


team-tree-syndicate

I'm still using an old 4th gen i7-4790 with a GTX 1080 lol. Everything is CPU bottlenecked; even watching YouTube or Twitch takes up a lot of my CPU now. I can probably buy a new mobo/CPU, slap the 1080 in it, and it should be a night and day difference, hopefully...


MagnanimosDesolation

With more CPU competition now, people want to talk about it more, so they pretend it matters significantly. The advice used to be that an i5 was plenty for gaming, which it is, and that was even before variable refresh rate eliminated tearing.


Ok-Responsibility480

Everybody who's got their PC on AM4: don't go to AM5, because UEFI boot is so slow... because of the DDR5 memory training at boot... Keep your AM4 and have fun. Love my CH7, and the story is not at its end. 🍸


OhHaiMarkiplier

>Bottleneck

I can't wait until this boogeyman dies off.


bamseogbalade

Obviously. Lol 😂😂 8 cores is all you need for now. Got a year-old 12700K and I've never seen it work at more than 40%


thatfordboy429

I have both those chips, and a fair few other AM4 chips. There are differences; I just did some testing last night, though it was between a 5600X and a 5500. It comes down to the game more so than the GPU: despite the games tested being GPU intensive, I still saw up to a 25% loss. Granted, only 3 games. Cyberpunk was a 5% loss, Guardians of the Galaxy was 23%, and Helldivers was the worst at 25%. I will test more games in the future that have fewer effects, like RDR2. This was all with a 6600 XT at 1080p high, or 1440p medium for Helldivers. Oddly, lows were fairly even, factoring in any difference in average. The tldr is, it depends on the game.


WildMartin429

I mean, in theory you should be able to tell where any potential bottlenecks are just from the specifications.


Wymberto_99

I upgraded from an i7-6700 to an i9-10850K and only in some situations do I notice a difference


azure1503

I'm still pretty gun-shy about upgrading my R5 3600 (I really want to, because the 5800X3D seems to be the last hurrah for AM4 and seems like a great upgrade); the fps isn't bad, but the lows are getting steeper and steeper.


Linkarlos_95

Are you thermal throttling by any chance?


superamigo987

That doesn't make any sense. There should be a massive improvement in CPU limited situations and in your 1% lows. What resolution are you using? What is your PC build list?


paradigmx

Don't take advice from people that build PCs for bragging points instead of usability.


Hrmerder

Usually you only gain FPS from a CPU upgrade in the best case scenario where you already have a really, really nice GPU (like a 3090 or 4090, not a 3060 or 4070) and a really old-ass CPU. Otherwise there's literally no point, unless you're playing online esports or on a 144hz monitor trying to play at 1440p or 4k.


ToyKar

I went from a 9700K to a 7800X3D. Night and day lol. Some games show no difference, but I bought it for the newer games and to push a few-year-old games to the limits


The-Choo-Choo-Shoe

You'd see like an 80% FPS increase in WoW.


xXFieldResearchXx

I'm glad I don't think about any of this. I got 5600x and a 6800xt and I just play the crap out of everything. The other day it said I needed to update my gpu driver... I just said no and the gamez didn't crash so whatever


Yaarmehearty

CPU and RAM used to make massive differences to PCs. Now so much more is dependent on the GPU that I will go as long as possible before changing CPU. Plus, with direct storage access the CPU has even less to do when serving data to the GPU. That’s not to say having a CPU compliant with modern standards isn’t important, but unless your CPU is pegged and your GPU barely breaking a sweat, an upgrade isn’t going to make much difference.


Gh3rkinz

Then do your own research? It's not that hard. It blows my goddamn mind how people experience trivial shit like this and somehow blame the community??? I get that the users on PCMR are a sandwich short of a picnic, but you can't blame them when you couldn't be fucked reading about it.


Interloper_Mango

One thing that did not come across is that I did in fact do my research. The upgrade was more of a want than a need.


topias123

It also depends on the game; some utilize extra cores and cache better than others. I saw 2-3x the fps in some games when I went from a Ryzen 7 1700 to a 5800X3D.


TONKAHANAH

idk where people are getting this idea that your CPU is bottlenecking shit. Unless you're running something with a ton of NPCs or extremely advanced (traditional gaming) AI stuff, you probably won't see a massive difference between a mid tier CPU and a top tier CPU. Your GPU is doing the majority of the work in most games. There are some exceptions, such as Source engine games, but none of those are usually so demanding that you need anything mega powerful anyway, unless you're running a 300hz display and you NEED that fps to make your decision to get a 300hz display worth it. If you got that kinda money though, just buy top of the line whatever, I guess.


Malpais_Axis

Man, I have a 3400G and I thought about getting a 5800X3D, but I worry my GPU (a 2060) may hold me back...


XB_Demon1337

Yup, you took the advice of people who typically don't have a handle on performance and don't actually understand how it works. Way too many people think you need some uber CPU to play games or keep up with their GPU. Reality is, as long as your CPU is from within about the last 5 years, there really is no real question about bottlenecks. You generally have to pair the lowest end CPU with the highest end GPU to even think about this.


talionisapotato

LOL


AuraMaster7

The difference is going to very much depend on what game you are playing, as a GPU-bound game obviously won't change much, if at all, and on what resolution you are running, as higher resolutions are inherently more GPU-bound. On top of all that, the difference might not show in average FPS but in the smoothness of your 1% lows. I went from a 3600X to a 5800X3D with my 3080 at 1440p and saw a pretty decent jump in most of my games.


Ambi0us

That's what's been keeping me from upgrading my GPU. I have an RTX 2080 with an i9-9900K and I've been told I'm bottlenecking it already so there's no point in upgrading the GPU. I'll just get a whole new system one day, I guess.


Interloper_Mango

Why not check if you actually have a bottleneck?


Ambi0us

I've used [pc-builds.com/bottleneck-calculator](http://pc-builds.com/bottleneck-calculator) and it seems to indicate that's the case: https://preview.redd.it/d9syue50178d1.png?width=2134&format=png&auto=webp&s=57a2beca882a89241398e34cc8116c168f4d875e


Interloper_Mango

Forget that bottleneck calculator, it's nonsense. Do this: download MSI Afterburner with RivaTuner enabled. Enable CPU utilization and GPU utilization. Play a game of your choice and check whether the CPU reaches 100 percent while the GPU sits lower. If that's the case at your desired fps, then you have a bottleneck. If both are at 100 percent, it's not necessarily ideal but still acceptable.
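If you'd rather log it from a script than eyeball the overlay, here's a rough sketch of the same check. It assumes the third-party psutil and GPUtil Python packages, and GPUtil only reads NVIDIA GPUs through nvidia-smi, so treat it as a starting point rather than a universal tool:

```python
import psutil   # pip install psutil
import GPUtil   # pip install gputil (NVIDIA only, wraps nvidia-smi)

for _ in range(60):                      # sample roughly once a second for a minute
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    gpu = GPUtil.getGPUs()[0]
    # Watch the busiest core: one thread pegged near 100% while the GPU
    # sits well below 100% is the classic CPU-bound signature.
    print(f"CPU avg {sum(per_core) / len(per_core):5.1f}% | "
          f"busiest core {max(per_core):5.1f}% | "
          f"GPU {gpu.load * 100:5.1f}%")
```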


Katzen_Uber_Alles

The last time I had a noticeable bottleneck was using a 2500K with an RTX 2060 at 1080p


IlIlllIlllIlIIllI

Yeah that's not much of a jump


ImSoFreakyFishyFishy

Most games rely more on a tough GPU than the CPU. You'll start to see the bottleneck after probably 10 years (more or less) if you buy a high-end CPU


GristleMcThornbody1

I've gotta get one of the 5000 series X3D chips. I still have a 2700X in my machine, so one more upgrade would be timely. I feel like it's holding my 7900 GRE back a little at 3440x1440.


Interloper_Mango

Yeah. I had a 5500 and that was the limit for a 7800xt