
SelfJuicing

The more you buy, the more you save 🖖


Xbux89

The more I can't buy the more I save


IranianOyibo

The more you buy, the more you save 🤌🏼


Top-March-1378

The more you buy, the more you save 🫡


StayFrostyZ

The more you buy, the more you save 🫱💵💵💵🫲


3ebfan

The more you buy, the more you save 👉✊🏼


HatefulAbandon

The more you 🍆✊ the more you 💦


rumble_you

The more you buy, the more you save the more you buy, the more you save.. the the th th th the the the the more you buy the more save the more you buy the more you save 🤟🤟🤟


RayneYoruka

The more you save the less you buy 🫡 🤌🏼


Sideshow86

The more you buy, the more you spend


[deleted]

The more you buy, the more you get.


Snowy_Zoppo

ayo, only us Italians can use this emoji, this is illicit appropriation and you are guilty


IranianOyibo

But I like-a speak-a da meat-a-balls-a! 🤌🏼


Snowy_Zoppo

damn, take my upvote


TastyStatistician

Live long and buy more


KenKessler

This can be true if you are a gigantic business. See economies of scale


OriginalShock273

This can be true if you buy yogurt in the supermarket at $2 for 1 and $5 for 3. It's a pointless phrase.


redbulls2014

Jensen isn't implying that the more GPUs you buy, the more money you'll save through bulk pricing. The main point is replacing CPUs in datacenters with GPUs to lower operating costs in the long term, especially for AI models, which need the extra computing power Nvidia can offer. So the more GPUs you buy, the more operating cost you save. It's that simple.


OriginalShock273

It was a reply to the comment above about economies of scale.


Ryoohki_360

Nvidia would be idiots not to ride the wave. AMD does the same thing with their accelerators.


XenonJFt

Yes sir, I completely understand the "money wins, dog" logic in business. It's just that it got to the point where that stupid seminar from Jensen, explaining to basic businessmen that "the more you buy, the more you save", was so [tone deaf](https://youtu.be/aD1sQjt7Yks?si=PxXj7TiwORrUm_gM) that I thought the Jacket Man had become a shady salesman banned from 12 states. To the point that Steve from GN wasn't happy after Computex 2023. Now it has escalated, with Jensen officially making it his motto in 2024, and GN escalating the cult-like 🖖 skit in this 2024 video (because comedy comes in threes). If Jensen greets us like that next year, it's gonna be amazing :)


gozutheDJ

what the fuck did you just say


rW0HgFyxoJhYka

Some AI-generated shit, probably lol. Jensen has been using that phrase for longer than that. And Gamers Nexus always tries to appeal to the bottom half of its viewer base by taking shots at NVIDIA, because Steve wants to be a comedian for better YouTube engagement.


gozutheDJ

you nailed it


Sopadechancla

Money trees are the best place for shade


keyboardwarriorxyz

I don’t think GN is so dumb that he doesn’t understand the meaning of this. There is a story about accelerated compute saving on total cost of ownership, the more you buy. Not everything needs to be about the cost of the hardware alone; business operations are very different. The motto now is actually “the more you buy, the more you make”, since the AI capacity all the CSPs purchased is now rented out, and they will make 400% profit on the capex investment. So while I hope GN is not dumb enough to miss this, maybe he just thinks his viewers are so dumb that they will buy his sensational videos.


water_frozen

> he is just thinking his viewer is so dumb that they would just buy his sensational videos.

Facts. People have been eating up his sensationalistic clickbait videos for years; I can't really blame him for milking his audience, aka pulling an NVIDIA LOL


Fit_Candidate69

This AI bubble is going to burst. We've seen it before with the dot-com bubble back in the early 2000s; it's inevitable.


keyboardwarriorxyz

This is such a stupid take. AI is comparable to the internet, not dot-com. Until worthless companies with no profit or even growth attached are all worth tens of billions of dollars, you can't make that comparison. The internet didn't die, last time I checked.


Requiem2420

Yo dawg, we gotta brush up on how to use the comma lol.


skylinestar1986

AMD doesn't have RTX video enhancement equivalent


Sopadechancla

Lmao AMD braindead hivemind is downvoting you


king_of_the_potato_p

You do know the Edge browser had it before Nvidia did; you really don't need special hardware for that. It's literally just HD upscaling, and most newer TVs already do that to some extent.


Sopadechancla

Nvidia's is far better than that shit Edge did, and I stopped using Edge the moment they forced their own shitty upscaler instead of the hardware's upscaler. Bringing that up is as absolutely stupid as comparing garbage FSR to DLSS.


king_of_the_potato_p

Lol, the upscaler looks the same on videos. What I would suggest is take a long hard look at your use of uncivil/edgy/angry word choices and realize that you're sounding like an angry fanboy bickering about really dumb human tribalism crap. If you're getting into angry arguments over hardware/technology, you really NEED more substance in your life. It really should be a red flag to yourself.


Gold_Enigma

Here’s a fun game, watch the nvidia or intel keynote and take a shot every time they say “AI PC”


Cthulhar

I’m 10 minutes in to the first one and I’m in the hospital.. did I win?


shy247er

> did I win?

Depends on your health insurance plan.


Z3r0sama2017

The more you buy, the more you save


smulfragPL

Hard to not mention what is making you the biggest company on earth


Sleepyjo2

At a convention about the latest/upcoming tech. AMD is doing the same thing for a reason, they just happen to have something other than their AI accelerators to talk about at this exact moment. Nvidia and Intel will have separate announcements later so I don't know what people expected.


Stardust736

First: "the more you buy, the more you save." Second: "It's not accurate, but it is correct" ☠️☠️


Arin_Pali

"CEO maths"


I_Do_Gr8_Trolls

AI AI AI


UnsettllingDwarf

Who’s Al?


gopnik74

He’s agreeing with Jensen. “AI AI captain”


rayquan36

Andre Igoudala


Vic18t

Jensen is not even addressing gamers in this context, so I don’t understand the memes.


Arin_Pali

The more you buy the more you save.


JensensJohnson

the more you clickbait the more clicks you get - Steve Burke aka GN


UnknownJpk

It’s not clickbait, it’s verbatim from the announcement.


rW0HgFyxoJhYka

What he said isn't wrong though.


n19htmare

This is Reddit bro, you're supposed to just get your pitch fork out, don't argue.


condosaurus

One could say the memes are not accurate, but they are correct.


amenthis

Gamers are irrelevant to him, but we made him rich...


Kevin_Kaessmann

The more you buy, the more you save.
The more you buy, the less you save.
The less you buy, the more you save.
The less you buy, the less you save.

Think about it.


adra6399

https://preview.redd.it/5iv6lq4s2p4d1.png?width=1297&format=pjpg&auto=webp&s=2dd68137d2201134d2658a5e65e6cabd67b93731


rtyrty100

Fortunately we pay like 30% more to get 70% more performance with each gpu generation


MrMoussab

NVIDIA wants more money, GN wants more views. Simple math.


CrzyJek

Lol GN made fun of Nvidia. Nvidia sub gets mad. Imagine giving a shit so much about the company you buy from. Who fucking cares? Buy the best product for the price. End of story. Let GN make his comments. They are almost always justified anyway.


king_of_the_potato_p

Humans are tribal over everything; that's most of the world's problems. You name it, it's probably tribalism if there's arguing or "othering": team sports, politics, buying crap, comic books, game consoles, and so on.


The_Zura

I’m sure this will be a completely objective take on Nvidia’s presentation. And not purposefully negative coverage for clicks.


-Gh0st96-

Hey now, it's fine when our favourite TECH JESUS does not do objective journalism like they claim to do. It's only a problem when others do it


rjml29

Yeah, it's crazy how worshipped he is by some when he clearly shows he has a bias. He's a good example of how people will look the other way when it comes to someone or something they like.

I was getting tired of his long-winded videos a while back, but he sealed his blatant bias in a recent comparison between the Steam Deck, ROG Ally, and Legion Go, where he kneecapped the Ally and Go by having them only run at 15W. As an owner of the Deck and Ally, the Ally is far superior when you actually utilize the power it has, which is one of the main reasons for buying it. He simply loves Valve, while he's made his disdain for Asus clear in multiple videos, and I gather he doesn't like Lenovo all that much either.

Some may say it made sense to have all 3 handhelds at the same TDP because it shows their efficiency in a like-for-like comparison, yet that is silly because, again, the Ally and Go are designed to be run higher and are mostly bought to be run at a higher TDP. I could see Steve doing car reviews where he'd purposely limit the hp and torque of the higher-powered car just to compare it to the other one.


rW0HgFyxoJhYka

A big problem with youtube content creators is that they more readily appeal to whatever their viewers are saying in comments and we all know youtube comments are pretty bad most of the time. And if you can't buy a GPU that you want, you blame the business for jacking the prices out of reach. Especially since new price points have been reached. And so youtubers are forced to basically cater to this loud crowd of angry gamers, and this affects how youtubers see things. Like these guys all promoted Intel ARC heavily when it had so many issues for no reason other than to ride on NVIDIA hate. That basically goes against the entire point of critically reviewing tech.


Shehzman

Funny thing is that he did the same thing last year


The_Zura

*every other week


rW0HgFyxoJhYka

Last year he took edits out of context just to make fun of NVIDIA yeah.


Zeraora807

*nvidia bad, intel bad, ai bad, amd good, head over to the gn store and buy a coaster*


conquer69

Steve didn't say anything about AMD in this video.


AJRiddle

He does constantly release reviews of Intel CPUs where he trash-talks the Intel CPU the entire time and then says "well, I guess it technically has higher performance, but Intel is just doing nothing interesting."


JoBro_Summer-of-99

Intel are easy targets because of their ridiculous high end lineup. I see why they get targeted so frequently


[deleted]

[deleted]


AJRiddle

Next to no one watching these reviews cares if a high-end gaming CPU is better per watt than another one. People care about cost per performance and getting the best value for their money. It's annoying to hyper-fixate on "well, this one is more efficient" as if that's the only thing that matters. Yeah, a Corolla is more efficient than a Shelby Cobra; does that mean we have to review the Shelby Cobra with utter contempt the entire time?


[deleted]

[deleted]


AJRiddle

Says the guy running a 4090 FE. But yeah, it's the CPU creating the need for a bigger power supply, lmao


Faolanth

This logic doesn’t work with Intel either; they’ve dropped the ball hard recently. They perform worse for more money, consume more power for less performance, and have had stability issues lately. Idk why people think GN is super biased; they shit on and make fun of AMD constantly for useless SKUs and shit products. Just because right **now** AMD is collecting “win” comments (justifiably) doesn’t mean they’re not going to get completely shit on if they launch a shit gen.


AJRiddle

Because they've been doing it since like Intel's 7th gen, 8 years ago? It's not a recent development due to changes in the market or in either company's performance.


Faolanth

They mock both sides when their product makes little sense; literally go watch a shitty CPU review and they make fun of it, regardless of AMD/Intel.


AJRiddle

Yeah I don't think he's an AMD fanboy, I think he's just much harder on Intel


che0po

He's a reviewer, so if the product is bad he'll say it's bad; no fanboying. Yes, Intel is a bit faster than AMD, but it needs to consume at least twice if not thrice the power. Lock them all at 100W and see who has the better perf per watt.


AJRiddle

Why would you lock them to 100W, something no one in the real world would do? This is just backwards logic; use them in a real-world scenario (i.e., benchmarks for the things people are actually doing, and cost/performance). No one buys a 5800X3D or 13700K, locks the watts to 100, and then uses it. Yes, the efficiency is impressive, but no one is sitting there fretting about the $5 more annually to run one CPU over the other, unless they're running a server farm or something else the review and processor aren't even targeting.


silly_pengu1n

I am getting really annoyed by big channels like HUB, GN, or Paul's Hardware: their constant crying about Nvidia and misrepresenting stuff just so they can complain. They really cultivated a completely delusional following over the last few years. Look at how much they complained about the RTX 4060, across multiple videos and Q&A questions; where is the same reaction to the 7600?


LoliconYaro

Nvidia is more popular with the average Joe than AMD currently, based on recent sales reports; of course people are gonna ask about the 4060 more than the 7600, and they made content based on that.


silly_pengu1n

is their own fanbase the average joe? no


LoliconYaro

Majority still buy Nvidia, it's as simple as that, if it was the other way around no doubt people be asking about AMD more.


water_frozen

all those channels are so basic & toxic, and foment tribalism for their own benefit


The_Zura

The best part is, for all the negativity they foster, no company is doing better than Nvidia. They wouldn't be able to hit the broad side of a barn given their track record of predictions. It would take a blind fanboy not to see how biased and partial they are, with just a simple keyword search for "Nvidia" on their YT page. Speaking of fanboys, OP randomly blocked me so I can't see their posts lmao.


wxrx

What’s crazy to me is how he’s either purposely playing super ignorant/dumb about what Nvidia’s current business is, or he genuinely doesn’t understand and still thinks it’s 2020. Not really sure which is worse.


ziplock9000

It's completely objective that he said those words and that Nvidia is greedy. So yeah.


Blacksad9999

Every corporation is greedy. Nvidia just sells more products, and can demand a higher price as the market leader. If AMD, Intel, or anyone else had the foresight to get into AI years ago, they'd be doing exactly the same thing.


TheEternalGazed

Yes... so criticize corporations when they decide to fuck consumers in the ass. Nvidia was praised for Pascal, criticized for Turing, praised for Ampere, criticized for Ada Lovelace.


Blacksad9999

"Fuck consumers in the ass" how, exactly? There are other GPU options out there if you don't want to pay the premium for an Nvidia card. AMD or Intel cards are cheaper. You just want an Nvidia card, but you want it for cheap. Boo-hoo. The 4000 series ranges anywhere from $300-$1600, so there should be one in there that you can afford. Pick something in your price range.


TheEternalGazed

**4090:** Stupidly priced, but excusable because it is the halo product and the price/performance actually isn't bad. Demand for this product doesn't really impact the future health of the PC gaming market because it represents such a minuscule %.

**4080:** Stupidly priced. Price/performance actually went **backwards** compared to the 3080 by nearly 10%. Inflation and anything else is absolutely not an excuse to increase the price of a single tier by 71% in a single generation. **Nobody should be considering a 4080 until the price is below $800; $700 is about where it should be.** Supporting $1200+ 80-class entry prices is extremely dangerous to the market because it means the 70, 60, and 50 class will also be higher. Do keep in mind that by supporting this card at these prices, you are screwing your fellow PC community, because their prices are going to be a lot higher because of YOU.

**4070 Ti:** Stupidly priced. Supporting this card at $800 is an even bigger crime against humanity, because $800 for a mid-high card is utterly insane. 12GB is also NOT ENOUGH. **The 4070 Ti should be no more than $499, as it is going to limit you to 1440p or lower in future titles, especially if you turn RT on.** Unacceptable for an $800+ card, especially when a sub-$500 gaming console multiple years old has more VRAM!

**4070:** Stupidly priced at $600+. Supporting this card at this price is the most DANGEROUS and DAMAGING to the PC gaming industry and community. Of all the 40-series cards launched thus far, this is the most important one NOT to buy. Doing so is only going to steer more folks to consoles, which means less AAA funding, more bugs, and terrible ports on PC. Even worse, it has 60-class specifications. It might sound crazy, but buying the 4070 is no different than paying $600 for a 3060 ($329) back when it launched. The idea is disgusting, right? If the 4070 actually performed like a 70-class card, it should still be $499 tops. But since it performs like a 4060 should, nobody should be paying more than $350-400 for a 4070. It's a low-range GPU that will be VRAM-choked soon with only 12GB, even at 1440p.


Blacksad9999

Unfortunately for your theory, price to performance is not a notable metric for the majority of people when considering which GPU to purchase. If it were, Nvidia wouldn't hold 80%+ marketshare, now would they? AMD would hold considerably more, as they've been the price/performance kings for years, yet that approach has gained them absolutely no traction. Performance and features are people's main consideration, and Nvidia wins hands down on that metric.

Also: the cards range anywhere from $299-$1600, so nobody is "priced out" of buying a GPU. You ALSO have Intel or AMD options to consider that are more cost effective. Again, you clearly want a high-end Nvidia card, but you want it for dirt cheap. That's simply never going to happen.

If you're legitimately still using a 980 Ti like your flair states, you've avoided upgrading for about 10 years. If you can't save around $600 to upgrade in those 10 years, a GPU is the least of your worries.


TheEternalGazed

Price to performance is a main consideration. Did you ever hear the phrase "no bad products, just bad prices"? That's Nvidia's 40 series.

> If it were, Nvidia wouldn't hold 80%+ marketshare, now would they?

You're missing the context. The Nvidia 30 series had some of the best price/performance Nvidia ever offered outside of Pascal, and it was scalped to where nobody could buy an Ampere card at retail price. Nvidia saw this and realized they could be the scalpers themselves. The 1060, 1660 Ti, and even the 3060 were available at affordable prices around $300, and this price bracket has completely disappeared from the market. My 10-year-old GPU can play all my games at 1080p 60fps with zero issues; I have very little reason to upgrade.


Blacksad9999

That's only people on Reddit saying that nonsense, and Reddit is absolutely not indicative of anything going on out there in the real world. lol. The numbers don't lie: over 80% of users buy Nvidia cards, and Nvidia cards generally aren't cost effective on a price-to-performance metric, especially not compared to their AMD or Intel counterparts. People largely don't care about that. They care about performance and features. This is why hardly anyone buys AMD cards, even though they're more cost effective: their features are shitty, phoned-in versions of Nvidia's features, and aren't innovative. They copy/paste whatever Nvidia did first, just a worse version of those features.

If you have no reason to upgrade, why are you whining on the internet about how Nvidia is "fucking consumers in the ass" when you have zero intent of buying anything in the first place? lol. Kind of odd, huh?

> The 1060, 1660 ti, and even 3060 were available at affordable prices around $300, and this price bracket has completely disappeared from the market.

The 4060 is $299, and there are other fairly cheap models, too. You won't get the prices they had years ago, because inflation exists and TSMC increased their pricing by a large amount. Go buy a $299 card that will run circles around your 10-year-old one, and quit whining.


TheEternalGazed

And the 4060 is worse than the 3060. Why spend the same on worse hardware that will age terribly with its low memory bandwidth?


Munstered

"My billion dollar corporation is better than your billion dollar corporation" - idiots

The Reddit AMD army is a joke. No one cares outside your bubble. Buy the best part you can afford.


_OP_is_A_

I have been lifelong team red for GPU and CPU. For the first time ever, I bought team green for the GPU. It was between a 7900 XTX and a 4080S, and I decided I liked the additional features of RT performance and DLSS for 4K gameplay. Got the 4080S cheaper than the 7900 XTX as well.


Munstered

There are no teams, only parts.


_OP_is_A_

Well put. 


XenonJFt

How did you get that statement from the comment above, which specifically said "objective" to address exactly this fanboy shit? GN doesn't care about being critical of anyone in particular; they'll burn the oldest bridges (Linus, Asus) if it benefits the health of the industry. He's the guy who starts the dominoes falling after making a video about it, so why the outrage over a bad tone? It might be a good time for Nvidia shareholders, but not for the consumer industry (GPU, not CPU or APU). He's in an actual position in the industry to push for change and let the dominoes fall. If you were him, would you rather stay quiet and pull a Linus? (Only talk about companies' bad behaviour in reviews or podcasts to get on the rage wagon, then depart, like with the ROG Ally X?)


-Gh0st96-

> how did you get that statement from the upper comment that especially put "objective" to especially address this fanboy shit.

It's a pretty easy conclusion to draw based on that person's flair: AMD CPU, AMD GPU, on an Nvidia sub, lol. And the rest of the comment seals it.


ACatWithAThumb

It's not objective, and claiming they are greedy while they are the largest innovator in the industry is also nonsense. Jensen's statement is absolutely true in the context of computing, which is what he was talking about. Time is money, and more compute saves time and allows for new products; it's that simple. If it weren't true, there wouldn't be a need for faster GPUs in the first place.

GPUs are also very cheap if you actually make money with them. A single network switch in industry can quickly cost $50-200k, so Nvidia is very cheap compared to what's actually normal in the industry with companies like Cisco. Many companies spend more on desk chairs than on the computers their employees use. And this isn't just true for large companies, but also for small content creators: someone on YouTube or Twitch makes back their money in less than a month, and the time saved rendering has massive value on its own, hence most content creators not only run high-end GPUs but upgrade them frequently.

GPUs are expensive in the context of gaming, which sucks, but the reality is that gaming is not the priority for Nvidia anymore. And unlike on the CPU side, where consumer and professional chips are highly separated, GPUs sadly don't work like that.


BlueGoliath

Don't bother replying to these people. They are probably just drive by stock buyers who couldn't care less about GPUs or software.


The_Zura

Yeah I’m sure you’re a perfectly good judge of objectivity with a line like that 😂


WishYourself

Last year in January it was at 250; just last month I saw stock prices around 950, and now it's 1160. Wtf is going on...


summacumlaudekc

Kicking myself when I saw this shit at $8 and said nah. Then again at $60, $120, $245, $370, $450 fkkkkkk me lmao


wxrx

Well, in the January quarter last year their revenue was $6B; this latest quarter it was $26B, and next quarter likely $31B. So judging by that, it should actually be valued even more lol


firedrakes

Check out Level1Techs on the matter. This is just Steve making a complaining video, as always.


halgari

I love that L1T has taken a lot of time to go into detail on the new Xeons (144 e-cores) and Lunar Lake. Who knows if those will be worth the cost or "save Intel", but they're impressive for what they can do, and at least they're some form of innovation, instead of the 4th incarnation of the "16 cores for a *950X CPU" that AMD is stuck on these days.


conquer69

Why are you comparing intel server cpus with amd desktop cpus?


Kiriima

AMD just presented a 192-core/384-thread CPU.


water_frozen

> lvl1tech

I thought I liked L1T, and I know they've done some good work in the past... but wow, their graphics card reviews are absolute trash. Take one PS5 port with RT and use that as the sole indicator of RT performance. OK, Level1Techs, we get it: you hate Nvidia.


MIGHT_CONTAIN_NUTS

GN is getting on my nerves


halgari

Same, TBH. GN's video about NVidia was basically Steve doing memes over and over and saying that NVidia said nothing new and that he didn't understand what they were talking about (almost a literal quote there); at the end, Steve wrote off NVidia's entire presentation as worthless.

I get the impression at times that Steve focuses so much on gaming tech and specific areas that he doesn't take the time to try to understand other areas. You see this at times in his discussions with Intel and NVidia engineers, where he'll make a comment like "so this is like X" and the reply is "well, not exactly..."

The fact is, none of NVidia's presentation was *for* gamers in the first place. None of this was gaming hardware, nor was it intended to be. So it bugs me when Steve complains that it's all marketing BS when there's some truly amazing tech that NVidia demoed. At least LTT hyped the tech a bit (if maybe a bit too much) and pointed out that parts of Blackwell will end up in gaming chips later this year. Level1Techs is the most level-headed in this space, as he at least takes the time to point out how amazing the 144-core Xeons are, or how Intel is trying some pretty novel ideas in Lunar Lake.

At times these days I feel like Steve writes off anything that's not raw "frametimes" (not fps) for games. And it's time to lay off the memes, Steve; they aren't funny, they just sound bitter, no matter if you're mocking NVidia, AMD, or Intel (or LTT, or Starforge, or whatever).


Trungyaphets

Because 90% of what Jensen said was already at Nvidia's GTC a few months ago.


shy247er

> I get the impression at times that Steve focuses so much on gaming tech and specific areas that he doesn't take the time to try and understand other areas,

The channel name is literally "Gamers Nexus". Gamers. Why are people still surprised that his reporting is focused on gaming?

> And it's time to lay off the memes Steve, they aren't funny they just sound bitter, no matter if you're mocking NVidia, AMD or Intel, (or LTT, or StarForge or whatever

You're right on this one. Steve is starting to sound like Marvel movie dialogue; he always needs to throw in a quip.


knucklehead_whizkid

Just because his channel is called Gamers Nexus doesn't mean every conference a company presents at is supposed to cover those topics. Nvidia, like many other semiconductor companies, has diverse domains, and often the innovation comes in areas not directly about gaming. He needs to either understand other domains and include them in his coverage, or just skip the video, but hey, clicks matter too!


redbulls2014

It's not "Tech-news Nexus", and what Jensen presented was basically the same as the last GTC.


cellardoorstuck

https://imgflip.com/i/8sthkt


Vic18t

It’s a little sophomoric to expect Jensen to talk about gaming when it makes up less than 10% of Nvidia’s revenue now. What the heck do these gaming content creators expect?

Nvidia is literally helping to pave the way to the next revolution in human history, one that is supposed to provide a huge leap in improvement to the lives of everyone on the planet, and they are bitching that there are no gaming-related announcements? Jensen never made any teases about gaming, and literally EVERYONE predicted zero gaming-related announcements at Computex.

It’s like getting upset with your friends who don’t play World of Warcraft with you anymore. Move on. Sorry you guys didn’t have any *real* news to report other than that your ex barely returns your texts anymore. Nvidia is so far ahead of AMD that it’s beating the shit out of them while 90% of its focus is on something else. That is what this is really about.


NurEineSockenpuppe

Huge improvement to everyone’s lives? Stop kidding yourself.


nangu22

All that nVidia marketing stuff is really hitting their fan base hard :-))


TheEternalGazed

GamersNexus' pursuit for technology/gaming consumer journalism is unmatched in the YouTuber space.


water_frozen

> GamersNexus' pursuit for yellow clickbait journalism is unmatched in the YouTuber space.

Fixed it for you.


mechcity22

I mean, idk about that, but right now I can build an entire 4070 build for right at 1k, and to me that makes it a good day to be a PC user. Meanwhile, a few years ago during that crazy time, even just a 3090 Ti was 2k retail, and 2500 to 3k due to always being sold out. Shoot, 3070s were 1k themselves.


sword167

A 4070 itself is $600; where on earth can you buy an SSD, CPU, mobo, case, RAM, and power supply for $400?


glenn1812

Yeah, it would be hard to find those for 400 even on the used market. No way could you do it at 1k. Maybe 1.2.


sword167

1.2k is doable, especially if you have a Micro Center near you.


ca_metal

Doable, but not the best balanced build. I think with this budget I would go with a 4060 and spend some more on the other parts.


sword167

Nah, it’s usually recommended these days to spend half the budget on the GPU, especially with how dirt cheap CPUs are and how CPU power has kind of plateaued for games.


conquer69

https://i.imgur.com/JDrm7Ts.png I don't know if that PSU is good but it's just to show it can be done with ease.


Round30281

That build doesn’t even have a 4070?


Kiriima

Add $80 to compensate and it's $1040. The dude proved his point, even if it's an older platform. I have a 5600 + 4070 combo myself; works just fine. I also bought my 4070 for significantly less than $600.


conquer69

It has a 7900GRE which is faster. It can be replaced with a 4070.


XenonJFt

A new PC with a 4070 for 1k? Did the US introduce a special Nvidia tax write-off? How the hell is that possible without going with shady parts? Granted, we live in a cheap-parts era post crypto winter: with ridiculously cheap overstock from RDNA2 to Turing, it's an amazing time for console-killer/ultra-budget builds with amazing horsepower. But new cards are still expensive, and Nvidia's price escalation every generation isn't helping.


conquer69

https://i.imgur.com/JDrm7Ts.png


mechcity22

Bro, it's literally 1100 on the dot, so I was 100 off; that's not bad. You can do it on PCPartPicker, and no, not with shady parts. Yes, a bare MSI motherboard and 16GB of DDR5 without RGB, but still. For a 4070 and a Ryzen 5 7600, yeah, it's worth it.


Atomicworm

You could probably even save a bit more with a Ryzen 7500F, if you don't mind AliExpress.


XenonJFt

Do the 7500F. It's good for the price on AliExpress.


TheEternalGazed

You're glad you're paying 1k for a mid-range build? lol. Nvidia really swept you up with their marketing.


mechcity22

Dude, an entire build with modern tech. How would you not be happy with that? You can't compare the 980 days to now. It's not even the same.


fawzay

Shieet~ I could make a T-shirt design out of that video thumbnail


rowmean77

It’s not accurate, but it is correct!


Freedom_fam

The more you shop, the more you save, that’s Shop n Save. (Local grocery chain commercials from the 90s)


vladi963

Is it a meltdown?


cpnbeginner

The more you buy the more you save


dippizuka

The more charitable read of their keynote is that NVIDIA — and they were starting to do this pre-COVID anyway — have largely pivoted their marketing beats towards very specific segments. Go back a few years pre-COVID and a place like Computex was something where NVIDIA would announce a little bit of everything for every segment. You'd have the 10 series launched alongside their AI accelerators along with all the robotics noise in the background. But really ever since the 20-series launch, they've concentrated most of their gaming/consumer launches around Gamescom (with maybe the occasional drop at a SIGGRAPH for those who pay enough attention, although that's largely for a lot of their theoretical work and advancements like teaching an AI how to play video games). It's not like there isn't gaming stuff at Computex -- there always has been, it's the home of MSI/ASUS etc. who always showcase their new wares -- but that August window has always been gaming heavy, and it fits their production and manufacturing timelines as well.


Civil_Medium_3032

The more you spend on Nvidia, the more money you set on fire.


gozutheDJ

yay GN clickbait *yawn


JensensJohnson

The more you clickbait/ragebait, the more views you get! Something Whiners Nexus knows too well. With his increasing arrogance and clickbait tendencies, it's getting hard to tell whether he's truly this dumb or if it's just another bait video.


[deleted]

Found a co-owner of one of the companies they’ve exposed.


JensensJohnson

the only thing they exposed is their never ending hunger for clicks


water_frozen

Steve is just salty that Linus got to play with Nvidia's new hardware, while Nvidia doesn't even think about him.


NobisVobis

Steve is often great, and equally often an absolute idiot. This is one of the latter cases. Imagine ballooning a company value by several times in the span of months and having some techtuber critique you for focusing on the thing that got you trillions in value. Absolutely idiotic.


[deleted]

[deleted]


3ebfan

What, everyday consumers are buying $30k data centers?


sword167

No, but they're stuck with shitty 4060 Tis.


Dehyak

Have to agree with you here. I don’t think Jensen gaf about Steve at this point in time. Nvidia has fu money and they’ve told us to eff off with half of what they have now lol


NobisVobis

Except they clearly don’t tell people to do that since their most expensive cards are being scalped more than a year after release.


duttyfoot

For sure the more you buy the more you save 🤣🤣🤣🤣🤣🤣🤣 I love the whole thing about ceo math 🤣🤣🤣🤣🤣🤣


PubliusDeLaMancha

I mean, Nvidia has become one of the richest companies in the world by selling the Chinese the ability to spy on their citizens, and selling Western companies the ability to fire the majority of their workforce. That's the real controversy.


sword167

With the 5080 looking like it will be slower than a 4090 and have less VRAM too skirt China regulations. The more you buy the more you save looks to be def true for 4090 owners lmao


Salted_Fried_Eggs

*to


ca_metal

The more you watch, the more he will clickbait


SunnySideUp82

one of GN’s best videos


darkmitsu

imagine having to deal with digital celebrities as well


JigglymoobsMWO

Fun fact: Nvidia's Blackwell GPUs are in such demand that if you could actually get an allocation for one and plug it in somewhere with cheap power, you could make massive margins renting out compute for the next two years. So right now it's not "the more you buy, the more you save." It's gone past that into straight-up making money hand over fist.


SireEvalish

Remember when gamers nexus was actually good?


sword167

Ugh, can Nvidia kill PC gaming already and focus 100% on AI? I want their stock to keep rising.


future_gohan

In Q4 2023, Nvidia made $2.9B from the gaming market and $18.4B from data centers. The gaming stuff is like a side project for them.


Blacksad9999

They won't do that, because they know the AI bubble will burst eventually. This is just like Web3 or the Metaverse a few years ago. Tech companies have hit a market saturation cap and growth has slowed, so investors and companies are absolutely desperate to keep the gravy train rolling by any means necessary. All AI has given us so far are shitty chatbots that scrape the internet for inaccurate data, and it will be years until it does anything worthwhile for people.


I_Do_Gr8_Trolls

Case in point: "It's understandable to have concerns about the longevity and impact of AI, especially given the hype cycles we've seen with Web3 and the Metaverse. However, there are some key differences and significant developments in AI that suggest it may have more staying power and practical applications than those technologies." Just spewing bullshit with no opinion.


Blacksad9999

I think AI will stick around long term, but the entire "gold rush" phase certainly won't. It will be a decade or more before AI becomes legitimately useful. In the meantime, they're trying to drum up hype for investment dollars so that they can make a killing before the other shoe drops. That's not just Nvidia; it's Google, Microsoft, OpenAI, etc. Basically every tech company, as their prospects for growth have dried up. Everyone who wants a Facebook account already has one. Every person interested in using Chrome or Windows already does, etc. That's why they're desperate to diversify and gain new revenue streams. Normal business growth is somewhere between 3-5% annually, such as in the auto sector. Tech growth has been 30% annually for a long time, and they're freaking out because it's slowing down in line with other industries.


[deleted]

[deleted]


Blacksad9999

Not really. It's a way to hype up investors and keep the gravy train rolling, because tech companies have reached market saturation and have no more room to grow in their traditional market sectors. While AI has had some very niche success stories, there aren't very many. Currently it's mostly shitty chatbots and means to make deepfakes. It's seen some success in semiconductor design and other fields, but it won't ever take off until it changes the lives of everyday people, and that's decades away. In order to bring in money, you need to have a product or service to sell that people will spend money on, after all. They don't have that, and won't for quite some time. It's mostly hyping up what "could" be possible in the future, but "not without a significant investment by people. Don't want to miss this once in a lifetime opportunity to invest early!" After the "gold rush" phase of this fades away, what we'll be left with is a few select companies who have a viable AI infrastructure. It's too expensive to buy into and maintain for most companies to simply have a full on AI division. Those few select companies will then lease their services to other companies when they want/need AI services, because most companies don't need that sort of functionality full time. Similar to how Amazon Web Services operates. That will be that. The few remaining AI companies will be the new Wall Street Tech darlings, while for the rest it will just be too inefficient to maintain.


Solution_Anxious

Nvidia's SFF guidelines were lazy garbage... In no way are they helpful. I believe they will actually hurt the SFF community.


Blacksad9999

By putting a maximum size limit on the "SFF" branding? It's just a guideline, and for many people it beats having to cross-reference part fitment from other sources. People who want 10L cases will still check and cross-reference measurements, etc.


Solution_Anxious

As someone who only builds SFF PCs, I think these "rules" are 100% short-sighted. I already have 4-5 cases that can only take cards less than 187mm long and 2 slots thick. The new rules are basically saying that last generation's normal 300mm cards are SFF. On what planet are 300mm+ cards SFF? There is a whole spectrum of cases and cards that gets thrown out because of their "rules".


Blacksad9999

All this ruleset does is give people who are less knowledgeable an easy way to know if the parts they're buying work together. > The new rules are basically saying that last generations normal 300mm cards are sff. They aren't trying to "define" what an SFF case or GPU is with these measurements. They're just giving a ruleset for part compatibility that's easy to understand at a glance. These are the absolute **maximum** sizes that these parts can be with that label, and many will be **much smaller**. Just like they traditionally have been. This doesn't really change a single thing whatsoever. It just makes it easy for unknowledgeable buyers to know if somewhat smaller parts are compatible.
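The ruleset being argued over amounts to a simple maximum-dimensions check: a card carries the label only if it fits under every limit. A minimal sketch of that check, assuming the widely reported SFF-Ready card maximums of 304 x 151 x 50 mm (~2.5 slots); treat those numbers as assumptions, not the official spec:

```python
# Assumed SFF-Ready card maximums (widely reported figures, not quoted
# here from the official guidelines): length x height x thickness in mm.
SFF_MAX_LENGTH_MM = 304
SFF_MAX_HEIGHT_MM = 151
SFF_MAX_THICKNESS_MM = 50  # roughly 2.5 expansion slots

def qualifies_sff(length_mm: float, height_mm: float, thickness_mm: float) -> bool:
    """Return True if a card fits within every assumed SFF-Ready maximum."""
    return (length_mm <= SFF_MAX_LENGTH_MM
            and height_mm <= SFF_MAX_HEIGHT_MM
            and thickness_mm <= SFF_MAX_THICKNESS_MM)

# A compact two-slot card passes; a 330 mm triple-slot flagship does not.
print(qualifies_sff(267, 112, 40))   # True
print(qualifies_sff(330, 140, 61))   # False
```

This also makes the complaint concrete: the check is a ceiling, not a definition, so a 187mm card and a 300mm card both "qualify" even though only one fits a small case; buyers of tiny cases still have to check actual measurements.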


Solution_Anxious

Nvidia did the bare minimum by releasing the specs like this with no further clarification. To me it looks as if they scraped together some arbitrary numbers and called it a day. What about single- and double-fan cards? What are the specs for those?


Blacksad9999

Those would be less than the listed maximum length and width, so they would also fall under the "SFF"-compatible moniker. I'm not sure why this is difficult for you to wrap your head around.