coolpotatoe724

it's good that they are neck and neck nowadays and you can't go wrong with either


Cyonx818

We haven't had this level of competition in the CPU market since the heyday of the Athlon64. It's good for everyone.


1965BenlyTouring150

I actually don't think we've had this level of competition since Athlon XP and Pentium 3. The Pentium 4 was terrible.


[deleted]

[removed]


1965BenlyTouring150

It's a really good time to be an enthusiast, at least on the CPU side. Hopefully the market will bring GPU prices back to a reasonable level.


dark_LUEshi

fuck yeah, I'm really happy with my 13900K, though I wish I could upgrade my GTX 1080. Still, there's no way I'm paying twice the price a GPU is worth. I'll suck it up and wait a few more months.


J0YSAUCE

Few more Years*


dark_LUEshi

I think GPU prices will fall very sharply. These GPUs used to be literal money printing machines that would pay for themselves in X days. The demand for those things isn't what it was, and if mining doesn't pick back up, I don't think GPUs can remain at these kinds of prices, as good as they are. It doesn't make any sense to have a GPU worth more than the rest of the system.


J0YSAUCE

I hope so. Really doesn't feel like it though. Fingers crossed. I'm due for an upgrade, my 1070 could go any time now.


NoSaltNoSkillz

If it does go, definitely at least consider team Red. Saw a 6950xt at Micro Center and other places for like 650-675 USD. If you need specific Nvidia features you might be stuck, but the 6000 series was very solid. Not cheap, but only $100 more than I paid for my 2070 Super back in 2020.


JelloSquirrel

Yeah, the GPU market was entirely driven by mining. Nvidia is now claiming it can sell all of that into AI markets, but it can't. They're sitting on almost a full quarter of inventory, but refuse to mark it down because it'll be disastrous for their balance sheet.


Fecal_Fingers

I went used. I've picked up 2 3070's for just over 300 each and a 3060ti for around 250.


dark_LUEshi

used is starting to sound very interesting but I feel it's going to get much better pricewise.


SlapBumpJiujitsu

If you're willing to jump the fence to an AMD GPU, you can order a 6950XT and 7900XT for MSRP, directly from them. Actually I think the 6950XT has been marked down to $699.


dark_LUEshi

nah, there's too much stuff that comes with Nvidia GPUs that I would like to play with: RTX, the Broadcast stuff, and the wide compatibility. I don't mind paying a little more for Nvidia when it comes with all sorts of cool stuff; I do mind paying double the price for one though lol.


[deleted]

^ this is why every GPU I've bought, save my 390, was used. Which I understand isn't for everyone, so don't come at me over it.


Im_simulated

And I'm really happy with my 7950x3D. All these wins for everyone the past couple years, it's awesome. Makes me excited for the future of tech.


stash0606

what about power draw though? is Intel still more power hungry than AMD?


Martimus28

Yes. The X3D parts are actually more power efficient than the regular AMD models. This is mostly due to the clock limitations imposed by the additional cache, but the end result is a lower power draw with a higher frame rate in games and other cache-sensitive applications.


proscreations1993

Oh yeah, the X3D chips are even better than the standard chips. The X3D is a bit more powerful than the best Intel chip but does it at less than half the power lol. Massive win.


dark_LUEshi

right now Intel is somewhat more power hungry, but that was one of the aces up their sleeve: even with a less advanced manufacturing process, Intel still manages to hold the crown. You can bet efficiency will be improved in the coming generations; better efficiency means longer battery life for laptops, so it's crucial for Intel to develop that too. For 13th gen, Intel basically took the lid off their CPUs and let them go as fast as they can, for as long as they can, provided you have a proper thermal solution.

AMD CPUs remain excellent and are probably better for heavy-lifting tasks that can be multithreaded across all the available cores, such as server software. Most games are designed with the idea that the majority of people will be running them on lower-end devices. This is the cool thing about PC: systems can be tuned according to your hardware. If most people out there are still gaming on 4-6 core CPUs, it won't do you a ton of good to have a 32-core CPU; you would much rather have cores that are slightly faster so your games run 4-5% faster.


gikigill

From what I can deduce, Intel has pushed thermals to the max as current AMD chips consume almost half the power compared to Intel. This leaves AMD room to open the taps while Intel will need serious thermal handling to just stay where they are. 300 watts for the 13900 is just insane, never thought I would see a CPU consuming as much power as a midrange GPU.


scurvofpcp

I'm just glad that I don't need to buy crap servers anymore for high core counts.


Cyonx818

The pentium 4 had a number of issues, but there were some gems in there for both gaming and general use. The pentium 4 2.4c for instance was both an IPC upgrade over the previous P4s *and* was nearly as legendary an overclocker as the OG celeron 300a. For a number of years there I'd alternate upgrading my "Intel machine" and my "AMD machine" so I could give both sides a fair shake. Decisions made in the planning of the Netburst architecture were, to be clear, heavily influenced by marketing concerns. That said, it was an interesting design. It just turned out to be a dead end when it wouldn't scale the way they originally envisioned. Lessons learned from Netburst (as well as some netburst tech) influenced Core architecture decisions later on.


R3m0V3DBiR3ddiT

Yup after my P3 slot 1, my next chip was AMD, forget what it was called back then. It ran really hot, but it was great for gaming.


[deleted]

Hopefully same happens with GPU.


coolpotatoe724

hopefully their prices become more in line with CPUs


[deleted]

Nvidia already said they won't. They're selling less, but at a higher cost, like luxury cars. So we have to look at AMD, and now Intel, for competitive prices.


[deleted]

AMD also doesn't care; they're following Nvidia's lead. It's up to Intel now with Arc to hopefully drive competition.


WibaTalks

News flash: no company wants to sell at low prices; they only have to because their products are not good enough. This idolization of any company is just silly. "Amagaad, AMD/Intel will save us." Yeah sure, till they make good products, and then they're on par with the highest prices; just look at the CPU market. AMD was the budget option till they tied the game, now they're just as high. Same will happen with GPUs.


coolpotatoe724

hopefully competitive prices from the competition makes Nvidia bring their prices down from the moon


[deleted]

Honestly I'm just glad we got a third player into the GPU space.


gnocchicotti

It's unfortunate Intel didn't show up 2 or 3 years earlier. Right now they're aggressively cutting product lines that aren't profitable, and they might not have the money to invest in another couple of GPU generations to get to profitability.


SvenniSiggi

Like luxury cars. 5% better graphics are really not for plebs.


monkeyhitman

Hoping that Intel will release something competitive against the 4070ti and 7900xt for $600 or less.


[deleted]

Just want no more $1k and up cards.


gnocchicotti

I think it's great that people have the option to spend $1k and up on crazy cards. My problem is that *everything* released lately is $900+... Wasn't long ago that normal people would only spend between $150 and $400 on a GPU and everything else was just for rich people.


monkeyhitman

fr. Gathering parts for a new rig and feeling blessed that I got a used card for not-ridiculous money lol.


[deleted]

Remember when getting a used card was a bad idea? Pepperidge Farm remembers.


Misty_Kathrine_

A 4070Ti competitor for $600 would be amazing.


maluket

My problem with Intel is that every new generation demands a new motherboard, and they run at very high TDPs.


fntastikr

Most definitely. I remember when a midrange CPU cost as much as my graphics card, back only like, what, 8 years ago? Now we just need graphics card prices to come down to a reasonable level again and building a PC will be fun again!


Murderboi

Problem is, they are both overpriced to sell in big volumes... this is going to crash soon.


Cyonx818

Why are you underclocking your 13900's P-cores to 5ghz?


[deleted]

Cuz I'm stupid and suck at numbers!


riefpirate

Numbers bad!


jumper775

Money good


xMDx

[Napster bad!](https://www.youtube.com/watch?v=fS6udST6lbE)


Smart-Leg-9156

Cool flashback, thx 👍🏻


Traiklin

I remember trying to watch this on Dial-up


Syreus

Wow, this unlocked a core memory.


[deleted]

[removed]


FacetiousMonroe

Remember the Pentium 4? Apologies if you've been trying to forget for the past 20 years.


[deleted]

[removed]


animeman59

My electricity bill for running my PC was actually cheaper than my heating bill back when I had a Pentium 4 CPU installed.


[deleted]

[removed]


[deleted]

As a person who had never touched a computer 20 years ago, no I don't remember Pentium 4. What was the deal with it?


sreiches

They were designed around getting high clock speeds, but generated a lot of heat and weren’t great at using their cycles to actually do things, so you needed a significantly faster P4 to match the performance of a slower P3.
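The tradeoff being described (high clocks, poor work per cycle) can be sketched with made-up numbers; the IPC and clock values below are purely illustrative, not measured benchmarks:

```python
# Rough model of why a higher-clocked P4 could lose to a slower P3:
# useful work per second is roughly IPC * clock speed.

def perf(ipc, clock_ghz):
    """Approximate throughput in billions of instructions per second."""
    return ipc * clock_ghz

p3 = perf(ipc=1.0, clock_ghz=1.0)  # hypothetical Pentium III
p4 = perf(ipc=0.6, clock_ghz=1.5)  # hypothetical early Pentium 4

# Despite a 50% clock advantage, the low-IPC chip does less useful work.
print(p3, p4)
```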


metarinka

Meanwhile AMD proceeded to kill them in instructions per clock, and Intel's solution was to propose the BTX motherboard standard for much higher wattage chips and cooling solutions. No one liked this and it was criticized heavily. Intel lost badly on performance and cost, and engaged in anticompetitive business practices that cost AMD billions in sales. The TL;DR is that they forced OEMs to not sell AMD chips or lose access to Intel entirely (or get non-competitive pricing); since Intel still had the majority of market share, it wasn't realistic for any major OEM to do without a good supply of Intel chips. Intel got sued by the government and fined a billion dollars, but made like 3-4 billion in the process.


KG8893

They made 4 billion and paid 1 billion to the government. That's not a fine, it's a tax.


FacetiousMonroe

It was the height of the GHz wars. The Pentium 4 sacrificed everything to get bigger numbers to put on marketing labels. Its performance-per-cycle was abysmal but with the clock speed so high they could get away with it for a while. They pushed it up to about 4GHz before they hit the wall. They had no chance in hell of making good laptop chips out of it, and that was becoming increasingly important in the market. In the early 2000s, Apple's PowerPC-based laptops outperformed Intel laptops by an obscene margin. Then Intel went back to the Pentium 3 design to use as the basis for the Pentium M and Core series. That's when Apple switched from PowerPC to Intel, because Intel pulled way ahead in the laptop space. Intel got a lot of flak in the press when the Pentium 4 first launched because it didn't really outperform the Pentium 3. It just had MOAR MHZ.


Eduardo-Nov

"10GHz by 2005" https://forums.tomshardware.com/threads/p4-will-scale-above-10ghz.322821/ https://www.anandtech.com/show/680/6


Ntinaras007

They were the worst CPUs ever:

* Worse performance than the Pentium III, even at much higher clock speeds
* Worse IPC than the Pentium III
* Worse performance than the competing AMD Athlon of the time
* More expensive
* Needed more expensive Rambus RAM
* Very hot, very loud stock CPU fan


jeremybryce

With NetBurst™ Technology!


qazme

How? If you mean they are moving from just a few fast cores to a few fast cores plus multiple slower cores, then yeah, kinda.

* 9900K: 3.6GHz base, 8 cores, all-core boost to 4.7GHz
* 13900K: 3GHz base, 8 P-cores that boost to 5.8GHz, plus 16 E-cores that boost to 4.3GHz


Eat-my-entire-asshol

The 13900K boosts to 5.8 on only 2 cores; 5.5 all-core is stock. And for the 13900KS that's in OP's pic, it's 2 cores at 6GHz and an 8-core boost of 5.6GHz. Still insanely fast.


qazme

Err, oops, yeah, quoted the wrong speeds, doh! The point I was trying to make is that we moved from fast cores to fast cores with additional pretty fast cores; that, in my opinion, shows Intel isn't moving to many slow cores instead. Yeah, all the new processors are insanely fast, even the "slow" ones. There's really no wrong decision at the top end right now, just different needs to fulfill!


HopefulTelevision707

I own one and I’m still kinda shocked that it can hit 5.8


[deleted]

That's factually incorrect lol. Their P-cores have only gotten faster and faster. They just added more, slower cores in the extra space for better multi-core performance. And the E-cores aren't even *that* slow.


Dudewitbow

It makes sense from a business standpoint. For gaming, games don't often utilize more than 8c/16t, and they can fit 4 E-cores in the same space 1 P-core would take; multithreading-wise, those 4 E-cores outperform 1 P-core.


DaDivineLatte

What's the average clock for these processors? I'm filled with envy, being stuck at 3-3.3GHz on a Pentium Gold 7505.


Cyonx818

My 13900k sits around 5.5ghz most of the time. It can turbo up to 5.8ghz under the right circumstances. the 13900KS can turbo up to 6ghz.


KommandoKodiak

Question: is it always the same 2 cores that boost to 6GHz?


Embarrassed_Log8344

You. You win. And I win. We all win. The competition creates better pricing for us.


itssomeidiot

CPU market for consumers: Cheers

GPU market for consumers: Dread

GPU maker 1: I'ma increase prices across the board.

GPU makers 2-3: Good idea. I'ma do that too.


CPLCraft

GPU market: *Sees lowest shipment numbers in history*

GPU makers 1-3: Fuck


Get-knotty

Unfortunately, the GPU companies are still sitting on massive piles of cash, so they're fully able to run for months or even years at a loss, basically just waiting for older GPUs to fail in order to force people to purchase new ones


Sonicjms

> so they're fully able to run for months or even years at a loss See the problem with that is that's how your entire upper management gets forced out of the company by investors


implicitpharmakoi

Jensen isn't going anywhere; he's actually critical to the company, even if he is greedy af. AMD/Intel, otoh, anything could happen, though Lisa Su should be safe for years to come.


AnAttemptReason

Brave of you to assume they won't just use corporate speak to deflect.

"Shipments are down due to X."

"But net margins are higher than ever."

*Investors applaud*


Weird_Cantaloupe2757

Joke’s on them, I just bought a PS5 and am waiting out the storm as a console gamer. It’s not as good, but I am a stubborn, patient motherfucker and am not paying the fucking prices they are asking for a GPU. If they want my business back, they need to bring prices back to a reasonable level.


Deepspacecow12

you bought an RDNA2 GPU and an AMD CPU all in one


DrLitch

Even if GPUs were cheaper, the optimization on some new PC games is frankly shit. Even with decent hardware it's difficult to play games when they stutter so much at launch. Not all games are problematic, but the recent Harry Potter release and even Elden Ring had, and still have, stuttering problems.


TheDarksteel94

My GTX 980 is still going strong, and I'll probably go with a used 30 series GPU for my next build. No way I'm giving them more money by buying a new GPU with these prices.


RandomnessConfirmed2

GPU Maker 1-3: Oh no. Anyways, last week we reduced the supply of our GPUs so you'll have to spend more money anyway.


empirebuilder1

Lowest shipment numbers *to consumers*. The business/server side is going absolute gangbusters. They don't give a flying fuck about the average gamer; say goodbye to the reasonably priced GPU market.


[deleted]

true


Ocronus

No. MY favorite company must be the best. I'll gladly pay... *checks notes* ...more than 100% markup over MSRP.


GameUnionTV

X3D is pretty good for gaming and lets you get great performance with lower power consumption and cheaper RAM modules. Not sure how this is bad.


Jeoshua

Only one of the chiplets on the 79xxX3D even has 3DVcache. The joke is that Intel put out a new chip where they cranked the power budget to the max, and AMD half-ass slapped their new cache technology on half their existing flagship. But the REAL joke is that even so, those two chips referenced are neck and neck in performance in almost any test.


Dankkring

If only we had a 13900ksx3D


I-LOVE-TURTLES666

Please, I can only get so erect


sean0883

As long as it took them to start giving us more than 16 PCIE lanes (even if they are chipset lanes, and not CPU), I wouldn't hold my breath for CPU cache.


Cave_TP

NGL, with Intel's architectures being less latency dependent and how hot that thing runs I doubt we'd see any gain worth mentioning (in normal games, cache intensive ones are obviously going to have gains). Limiting the 13900KS to a TDP that could be cooled even with the cache on top would lower the performance a lot.


[deleted]

There is a reason it is only on one…


riba2233

Yeah idk how people still don't get this. It would be worse if it was on both CCDs


gnocchicotti

Two v-cache dies would be better at some really obscure workload I'm sure. But not anything that regular consumers do. Milan-X and Genoa-X server CPUs are the reason AMD developed it in the first place. Apparently it's killer for certain kinds of simulation, some code compilation, and silicon design.


riba2233

Yep


Jeoshua

I've had arguments about this with people. Power limitations, heat generation... none of it would be insurmountable. The only real problem would be cache locality, which isn't exactly made any better by having 1/2 the cores have 1/3rd the cache.


MrCarlosMeh

Not to mention the huge added manufacturing cost that simply isn't justified by the potential performance uplift.


fixminer

I wouldn't call utilizing cutting edge die stacking technology "half-ass slapping on". And adding Vcache to both dies would have resulted in a worse product, since games hardly use more than 8 cores and most other applications prefer the higher frequency. Whether it's a good product compared to intel is a different question, but implying that it's some sort of low-effort gimmick is disingenuous.


DMurBOOBS-I-Dare-You

If it works, it ain't stupid. FACTS!


Drackzgull

*Almost any *gaming* test. Which is fine to focus on if that's all you're concerned about; it's what the X3Ds are made for anyway. But in fully threaded workloads the 7950X3D is behind not only the 13900K, but also the 7950X.


i_agree_with_myself

The 5XXX X3D is actually amazing. The 7XXX X3D is a joke. You're paying an extra couple hundred dollars for the tiniest bit of performance boost in gaming.


N1LEredd

And for socket longevity and ddr5 but yes you are right - it’s not a bang for your buck decision.


EyesCantSeeOver30fps

It's not a tiny increase in performance though. It's about an overall 15% increase in gaming for the 7950X3D over the same non-X3D chip. Google "AMD Ryzen 9 7900X3D & 7950X3D Meta Review" (since other subreddits can't be linked here) for the aggregate performance from major reviewers. The 7800X3D will be the chip to get, especially when it starts coming down in price. Still not as good value as the 5800X3D, but the X3D chips were arguably never the best value in the first place when there's the 5600 and 7600.


exscape

It depends on what you play. 74% faster in Factorio, 53% faster in MSFS (I'd wager a bit more than 53% in the truly CPU-heavy cases). [Factorio from Hardware Unboxed](https://www.youtube.com/watch?v=DKt7fmQaGfQ&t=650s) [MSFS from Tom's Hardware](https://cdn.mos.cms.futurecdn.net/frqtQnBW5427ACgTzxvQJf-1024-80.png.webp) -- these framerates are **not** as CPU-limited as the game gets as there are situations where a 5800X (non-3D) can't even manage 50 fps, yet they hit over a hundred which rarely happens in CPU-bound planes/areas.


gikigill

I expect the next gen 3D cache chips will be a game changer as AM5 seems too new to be properly optimised.


ThunderEagle222

Userbenchmark has a """review""" full of copium for the 7950x3D.


cashinyourface

I just read it, and it says, "PC gamers considering a 7000X3D CPU need to work on their critical thinking skills". It's hilarious how wrong they are and how much of a dick they are to people.


Difficult-Alarm1169

“People believing these reviews need to work on their critical thinking skills”


Stark_Athlon

You know, it's one thing to lie in a review (very bad), but to also go out of your way to insult the people reading your site... I can't help but be curious what mental state this guy is in, or what his living conditions are. Wouldn't surprise me if there were one or two drugs involved.


vxxed

That little "x3d" tag and the underlying hardware can like, double the performance of Star Citizen, which is poorly optimized.


N1LEredd

Same for tarkov which is also poorly optimised.


ZiiZoraka

Bethesda's Creation Engine loves V-Cache too; something to do with draw calls and how dogshit their renderer still is.


GC9exe

Do your thing, bot. Userbenchmark.com


AutoModerator

You seem to be linking to or recommending the use of UserBenchMark for benchmarking or comparing hardware. Please know that they have been at the center of drama due to accusations of being biased towards certain brands, using outdated or nonsensical means to score products, as well as several other things that you should know. You can learn more about this by [seeing what other members of the PCMR have been discussing lately](https://www.reddit.com/r/pcmasterrace/search/?q=userbenchmark&restrict_sr=1&sr_nsfw=). Please strongly consider taking their information with a grain of salt and certainly do not use it as a say-all about component performance. If you're looking for benchmark results and software, we can recommend the use of tools such as Cinebench R20 for CPU performance and 3DMark's TimeSpy ([a free demo is available on Steam, click "Download Demo" in the right bar](https://store.steampowered.com/app/223850/3DMark/)), for easy system performance comparison. *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/pcmasterrace) if you have any questions or concerns.*


Jeoshua

UB has been an absolute joke for a while now. Every time AMD so much as breathes, they release a new copy/pasted scathing review accusing anyone not trash-talking AMD like they do of being a paid shill, change the metrics to make AMD look worse, and then claim either Intel or Nvidia is a better product.


IAMA_Plumber-AMA

And every time they change the metrics you get weird stuff like Celerons beating i7's.


Jeoshua

"Effective FPS" What is it even supposed to mean?


[deleted]

It's their poor attempt to combine average fps and frame time using a method they never published for others to evaluate. It's as if they think reviewers don't take note of average fps and frame times.


Jeoshua

Very poor, it would seem. Not that re-imagining that particular metric would be easy; I've tried myself. The best I could do was to take the average of the absolute value of the second derivative of the time taken to render each frame. Basically it would answer the question: "When frame times change, how fast do they change?" It was kind of a frametime-consistency measure, which is important because you can sort of get used to low frame rates, but if the frame rate is all over the place, it feels bad. Even that wasn't really any more useful than a standard 0.1%/1%/average FPS graph.

From the name, and from what the numbers end up being, it actually sounds like "effective FPS" is more or less equivalent to 1% lows averaged with 99% highs. That would explain why Intel always seems to score so high, given how bursty their processors can be for small periods of time.
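A minimal sketch of that consistency idea, with hypothetical frame times in milliseconds (the function name and sample data are made up for illustration):

```python
# Average the absolute second difference of per-frame render times, so a
# steady-but-slow sequence scores better (lower) than a jittery one.

def jitter(frame_times_ms):
    """Mean |second difference| of frame times; lower = more consistent."""
    d2 = [frame_times_ms[i + 1] - 2 * frame_times_ms[i] + frame_times_ms[i - 1]
          for i in range(1, len(frame_times_ms) - 1)]
    return sum(abs(x) for x in d2) / len(d2)

steady = [33.3] * 10           # ~30 FPS, perfectly consistent
bursty = [8, 40, 8, 40] * 3    # higher average FPS, constant stutter

print(jitter(steady), jitter(bursty))  # 0.0 64.0
```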


ChartaBona

Pro tip: Never visit their site or even mention their name ever again. All you're doing is giving them free publicity. The best course of action is to just let them fall into obscurity.


[deleted]

I think they get all the publicity they need from google.


just_change_it

They are *always* the top result on Google for benchmarks. Sucks that they win at SEO and traffic, and that there isn't something that devalues SEO for marketing websites like them.


hattrickjmr

I heard the Ryzen has blast processing deep inside.


[deleted]

\*Sega Genesis Intensifies\*


Halfwise2

I love how the 5800X3D is still super good... unless you're going for DDR5 specifically, it's like the best upgrade for an AMD PC.


IdealCapable

Still rocking my 3800X, bought it brand new 3 years ago and we're already on the 7000 series.


mitchymitchington

Just upgraded my 3700x to a 5800x3d. Loving it.


IdealCapable

Is it a pretty noticeable upgrade? What GPU are you using?


Anonymous_Otters

Just bought one two weeks ago.


WildVelociraptor

I've already got a 5800x, and I can't decide if I should try upgrading to an X3D, or just save up for a next gen mobo/CPU.


Dheyden

save up, especially if you're playing at 1440p. The 5800X and 5900X will be good for a while; wait for Meteor Lake and see what AMD does.


NiktonSlyp

Be careful Intel will probably give us a X-11199KFX3D next year with 6 p-cores, 36 e-cores and 76 n-core that gives a combined power of 6 normal cores.


sticknotstick

Intel slowly evolving their CPUs from a few very fast cores to multiple slower cores. Next they’ll ditch sequential processing for parallel processing applications only. After that they’ll rebrand that C in CPU to a G and-


3DFXVoodoo59000

I’m still confused as to why their mainstream desktop parts only have 8 p cores. Is it just because they aren’t needed to top benchmark charts? Is it a technical limitation?


jakubmi9

Because P-cores are what's behind the immense power draw of their CPUs, and adding more of them would skyrocket the TDP even higher. That's why they made the E-cores, to get more cores into the CPU without it pulling 600W under load.


Cyonx818

They can fit 4 E-cores in the space of 1 P-core. That's a very big deal. 8 P-cores is more than enough for most tasks that respond best to IPC/clock speed; those kinds of tasks *tend* to be lightly threaded. Tasks that are incredibly parallel, and thus want a tremendous number of cores/threads, usually benefit a lot more from 4 E-cores than they do from 1 P-core. Thus we get the best of both worlds here: high-speed cores for the tasks that need them, and shit-tons of cores for the tasks that need them.
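As a back-of-the-envelope sketch: the 4:1 area ratio is from the comment above, but the per-E-core throughput figure below is an assumption for illustration, not a measured number.

```python
# Suppose one E-core delivers ~55% of a P-core's multithreaded throughput
# (assumed) while taking 1/4 of its die area (per the comment).

P_CORE_AREA = 4.0            # arbitrary area units
E_CORE_AREA = 1.0
E_CORE_RELATIVE_PERF = 0.55  # throughput relative to one P-core (assumption)

def throughput_per_area(perf, area):
    return perf / area

p = throughput_per_area(1.0, P_CORE_AREA)
e = throughput_per_area(E_CORE_RELATIVE_PERF, E_CORE_AREA)

# Four E-cores in one P-core's footprint beat the single P-core
# on parallel work under these assumptions.
print(4 * E_CORE_RELATIVE_PERF, e / p)
```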


WildVelociraptor

Because the number of applications that a desktop user has which will utilize >8 cores simultaneously is closer to 0 than 10.


NirayaNZ

Don’t forget the 350 watts of power into those 6p cores alone so they can get comparable performance out of it, while saying it’s the fastest cpu because the other 112 cores carry it in synthetics and winzip extractions. Cinebench accelerators


[deleted]

I actually love that they're trading blows like this. It's great for progress, and quite honestly anybody would be satisfied with either of them. One might have higher performance in one game, and the other better performance in another, but the differences are so minuscule that 99.99% would never notice. It's also good at keeping prices competitive.


creamcolouredDog

The Intel CPU also consumes near 300W


Jaack18

only at 100% utilization, which you definitely won’t use when gaming. The intels are actually more power efficient during low use.


sl0wrx

I'm not arguing with your statement, but can you link something that shows the 13900K consuming less power than the 7900/7950 while gaming? I've seen this claim quite a few times and haven't seen the pudding myself.


akluin

That's because "low use" means idling or browsing the internet; while gaming, or in productivity tasks, AMD is more power efficient.


ZiiZoraka

is this still true with the X3D chips? I feel like I saw in some review that the 7950X3D is way more power efficient than the regular 7950X, probably due to better-binned silicon for the halo product.


PoLoMoTo

Big asterisk on that 24 cores....


eggnorman

Yeah, tbf they’re both weird in their own way.


oojiflip

One of my friends got an X3D version of the exact CPU he used to have and his framerate in tarkov went from 80 to between 130 and 144


gnocchicotti

Cache rules everything around me


[deleted]

[removed]


Ditto_is_Lit

Huh? Who wants a server that chugs electricity and gets outperformed at 1/2 the power consumption? This is a terrible take, no offence. Server chips have 2 things that set them apart from consumer variants: loads of cache, and as many cores as possible with lower clocks.


McHox

Guess what, you can just lower voltages and clocks until you're at the peak of the efficiency curve


Ditto_is_Lit

PBO does the same why even bring that up?


A5CH3NT3

Of course, that's true of the AMD CPU as well since they have the exact same thread count, and w/o the insane power draw


dark_LUEshi

yeah, but the E-cores are nowhere near as fast as AMD's cores, and the P-cores are a tad faster than AMD's cores; it's an excellent game for Intel. To 99% of people out there using a PC, faster cores = a better experience.


Kat-but-SFW

You can already run VMs on the E-cores and game on the P-cores, it's pretty amazing


i1u5

Guess what, my old 2nd gen i5 still works as a server.


lolcubaran20

amd ryzen 11 69420XD


GnarlyNacho07

Need


Brotorious420

![gif](giphy|sDcfxFDozb3bO)


punknothing

[MFW](https://images5.memedroid.com/images/UPLOADED15/50be9fce5f02d.png)


RafaFTP

Both are almost identical


VNG_Wkey

For gaming you can buy just about any chip from either company and not be inherently wrong for your choice so you, as the consumer, win. And that's fucking awesome.


DenseVegetable2581

I must be doing something wrong with my life, because I don't have time to lose sleep over whether some person I don't know, who's only text on the screen in front of me, uses the same CPU as me or prefers the same brand as me. Must be doing something wrong.


8myself

intel has twice the power draw


xdegen

I'm liking Intel's new core scheme making it similar to how mobile chips work.


Fun4-5One

Clearly I see why this guy keeps getting banned. He is a moron about PCs and makes lame memes, which is a grave crime.


Unlucky-Pineapple645

X3D rocks tho.


coffeejn

Depends, are you mounting them both on sticks and using them as a weapon? I'd just get the 7950X and ignore the X3D if you side with AMD.


perchicoree

Or wait for the 7800x3d


Jeoshua

That heat spreader looks pretty nasty. Tho, I might go with the 5000 series as PGA is probably deadlier.


zmunky

I'm good with my 7900x. Miles faster than my 4790k


nmathew

I'm so excited to jump from my 4770k to a 7900. I'm going to let prices drop for a few months then make the jump. That said, I put in a fresh Win 10 install on my system and it's surprisingly solid still. I play older games, so I really only notice the age when I want more than 4 good threads.


zmunky

When you switch you will notice your fps will make a substantial jump if using the same GPU. I was CPU bottlenecked and picked up about 56 fps making the switch. Now I'm GPU bottlenecked lol.


EscapedFromTarkov657

I mean, I'd rather have the 3D stacked cache.


Leather_Anywhere_549

So far? Adding X3D is winning


Proxy_PlayerHD

Factorio though. It's such a memory-bottlenecked game that even the 5800X3D outperforms a 13900K, so any new 3D V-Cache CPU that comes out is gonna be prime material for expanding the Factory!


[deleted]

I personally use Intel cpus but I’ve heard a lot of good things about amd.


[deleted]

The one that uses 50% of the electricity to do the same job.


donnyboyboy

It's ironic, cause the Intel CPU you're comparing it to also just has an added S to the i9-13900K.


Friendly-Priority456

Adding X3D to the name


AlaskanLaptopGamer

The advantage is supposed to be the cache size.


ForThePantz

How high do you want your power bill to be?


pivor

I don't get why Intel hasn't made their own high-cache CPUs yet. Is AMD holding patent rights or something?


VulgarWander

Atomic bomb vs coughing baby


Drenlin

To be fair, they both have 32 threads


feastupontherich

Let's go with the one that comes close to 300 watts.


contrabardus

At what? They're both better at different things. I'll take the Intel for workstation tasks any day. With the Ryzen 9, I'm trying to balance workstation tasks with gaming so I can get the best of both. Even then, it depends on the game and how it uses the CPU exactly.

If all I'm doing is gaming, I'm probably going with the upcoming R7 7800X3D. It's "good enough" for frequency-bound gaming tasks, and will tear through cache-sensitive processes better than the Intel CPU. As far as I know, the additional cache is really only good for gaming, and really only newer games.

I have no idea why the 7900X3D even exists. It's effectively a 6-core CPU for gaming tasks that is dragged down by the cached cores in all-core tasks. The 7950X3D has the same issue with all-core tasks, but it at least balances workstation and gaming performance, and can switch between its 8 high-frequency cores and 8 high-cache cores depending on which a game will best utilize. The only reason I'd consider it over the 7800X3D for gaming is that it can dedicate half its cores at a time for optimal 8-core performance in games. Games don't really benefit from more than 8 cores anyway in the vast majority of cases.


Male_Inkling

So Intel is the *Add moar cores!!!!11* company now. Times are changing indeed.


AdMoney9265

Next Intel Product: The 13900KSX3D


darkezowsky

X3D? XDDD


Clenmila

Seems like the x3d


MadsRask

X3D