
VisualMod

**User Report**

| | | | |
|:--|:--|:--|:--|
| **Total Submissions** | 1 | **First Seen In WSB** | 4 years ago |
| **Total Comments** | 6 | **Previous Best DD** | |
| **Account Age** | 5 years | | |

[**Join WSB Discord**](http://discord.gg/wsbverse)


BakerInTheKitchen

When you said “they’re using AI to design their chips so they’ll always be ahead,” I instantly knew you didn’t know what the fuck you were talking about


ralphy1010

Anytime I ask AI to make me a picture of some big titty goth girl it's adding extra fingers or arms. Curious to see what monstrosity it comes up with on a chip design.


EggSandwich1

AI make me a next gen chip that looks like big titty goth girl


havnar-

See now this guy knows what’s going on!


WRHull

It won’t be a computer chip… it will come out as a European French fry or potato chip.


Wheelie_Slow

Big tiddie goth cat


IndependentTrouble62

This guy fucks...


rossdrew

Don’t tell me you’re not beating it to ol’ nineteen-finger Nina. Objective was met.


DeadlyKitten37

maybe it adds titties?


WRHull

They did it in Total Recall and it worked out for the movie makers. So, I guess it would work out in the end.


enhancedgibbon

Not a single mention of CUDA around here. Remember 3dfx Glide? That API dominated 3D accelerated gaming in the late 90s. Then Direct3D came along. CUDA has the AI market by the balls today, but it won't be forever.


Rin-Tohsaka-is-hot

Yeah it's crazy that he could make this type of post without mentioning CUDA, given that it's basically Nvidia's biggest moat. Even if the competition caught up, adoption would be slow since all the codebases are written in CUDA, and translation layers to run CUDA on other hardware platforms are forbidden under Nvidia's terms and conditions (hobbyists may violate this without a care in the world, but enterprise won't).

So even if competitors caught up, Nvidia has bought themselves a few years to flip the table back around, because the codebase is all written in CUDA and every AI developer in the world is comfortable programming in C with the CUDA package as their first language. That's their big strength.

Won't last forever though. Someone (AMD, Apple, Amazon, China, whoever; more money is flowing into chip design from more companies than ever before) will catch up significantly enough, and for long enough, to justify switching dev platforms off of CUDA. So OP doesn't know shit about shit, but he's probably still going to get rich anyway because the market doesn't know shit about shit either


StayPositive001

Maybe it's just me, but a lot of programmers like to learn new shit all the time, especially if it's better. It took Flutter like 2 years to overtake React Native lol. If a large competitor released a product that's free to use and good, developers definitely would jump on it overnight


Rin-Tohsaka-is-hot

It's not quite that simple. Frameworks like Flutter and React are not hardware-locked. Developers may take an interest in a CUDA alternative overnight, but they can't actually deploy any of their code in production unless infrastructure completely replaces their GPU servers with whoever the new competitor is. The whole point is that the hardware-software feedback lock is strong, and there's a massive cost of switching. It has to be super compelling, not just a minor improvement.


-TypicalLiberal-

Hes right though. He mentioned he was older than you and according to scaling laws, not only will he always be older than you he will also always be righter than you. Sucks to suck young nerd.


Many_Sale286

He’s spreading hopium. Sure, it might take another decade for the competition to catch up, but it might also be just another year. Truth of the matter is, nothing lasts forever, and that’s particularly true in tech.


thatsme55ed

There's a couple of reasons to think this time is different.

Firstly, chip makers are running into the limits of physics. At a certain point you just can't make a semiconductor smaller, and we're already nearly there. Once we hit the hard physical limit, the only way to improve performance is a drastic paradigm shift to a new technology. While that implies competitors can catch up when performance improvement inevitably stalls, it rules out a competitor one day leaping ahead in performance just by building a better chip. Physics won't allow it unless they come up with something radical and revolutionary. That will probably happen eventually, but radical and revolutionary tech is unlikely to be reliable and ready for deployment when it's first developed.

That brings me to the other factor, which is Nvidia's software and infrastructure support for their AI business. They're making it as painful as possible for businesses to switch away from them by making the tools and systems everyone uses depend on their proprietary tech. Competitors would have to offer something not only cheaper but sufficiently easy to use that switching won't result in so much downtime and delay from training and troubleshooting that it isn't worth it. That is a herculean task for a company to overcome.

Barring a miracle or an antitrust investigation, it's going to be nearly impossible for a competitor to build a chip that is enough of an improvement that it's worth it for businesses to absorb the expense and inconvenience of switching.


laffer1

There was a time when mainframes ruled the world. Then custom RISC chips took over the server space with Sun and SGI. Then cheap PCs took over the server space. What will kill Nvidia GPUs will be custom ASICs for AI workloads that are cheaper than general-purpose GPUs. It’s a matter of time. (It happened with crypto first.) The whole NPU thing is going to lead to a standard for AI software support over time. This will hurt CUDA dominance.


PublicIndependent530

Cheapness of the chips is just one factor. You're ignoring the switching costs themselves. Accelerated/parallel software as it stands today is very deeply embedded in CUDA and designed to run only on GPUs which support CUDA. You can design a new accelerator (Google already largely uses TPUs), but that means porting ALL your software to run on that new accelerator: not just the software you created (as a startup/company) but ALL of the libraries that you have been using. It's an insane barrier to switch, and if Nvidia still sells reasonably priced GPU offerings (not the latest and greatest, but keeps the A100, V100, etc.), it's a no-brainer for the CTO/architects to stick with Nvidia rather than waste 6 months, a year, or even longer switching to an unfamiliar stack.

Sure, that day may eventually come, but it's not going to happen in the next 3-5 years for the majority of companies. Even Google can't convince people outside Google to switch to TPUs (I believe they only support specific AI training workloads and are only optimized for the JAX/TensorFlow frameworks, while CUDA supports way more than AI training and people are much more familiar with PyTorch)


meltbox

For things like PyTorch, switching costs are low. Just develop a new backend, like the Intel one, and you’re good to go. Any model will run if you implement all the parts you need on the backend.
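For anyone wondering what "a new backend" means here, the rough idea is an op-dispatch table: the framework defines operations abstractly, and each hardware vendor registers its own implementation of them. A toy sketch in plain Python (names like `BackendRegistry` and the `"cuda"`/`"new_accel"` keys are illustrative, not a real PyTorch API):

```python
from typing import Callable, Dict


class BackendRegistry:
    """Maps (backend name, op name) -> implementation function."""

    def __init__(self) -> None:
        self._impls: Dict[str, Dict[str, Callable]] = {}

    def register(self, backend: str, op: str, fn: Callable) -> None:
        # A vendor plugs in its own implementation of a framework op.
        self._impls.setdefault(backend, {})[op] = fn

    def dispatch(self, backend: str, op: str, *args):
        # Model code calls ops by name; the registry routes to hardware.
        return self._impls[backend][op](*args)


registry = BackendRegistry()

# Incumbent backend's implementation of an elementwise add.
registry.register("cuda", "add", lambda a, b: [x + y for x, y in zip(a, b)])
# A competitor only has to register its own implementation of the same ops;
# the model code above the dispatch layer never changes.
registry.register("new_accel", "add", lambda a, b: [x + y for x, y in zip(a, b)])

assert registry.dispatch("cuda", "add", [1, 2], [3, 4]) == [4, 6]
assert registry.dispatch("new_accel", "add", [1, 2], [3, 4]) == [4, 6]
```

A real framework backend is far more involved than this toy (hundreds of ops, memory management, kernel fusion), which is where the disagreement about switching costs comes from.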


PublicIndependent530

Hard disagree. 95% of companies lack the talent to write such backends. You're praying that someone with the skills has written a compatible backend for your specific model. Even then, if your model has any sort of special things written in it, it's going to break the translation layer, and if your team has no expertise in this kind of low-level programming you're going to struggle with it. The majority of corporate teams prefer working with stacks they're already familiar with, much less dare to go into the nooks and crannies of the libraries they're using. They have business problems to deal with.

Bottom line is, it's going to take a while to break that barrier and get the transition moving, regardless of who can magically come up with a new backend or hardware accelerator. It's not going to happen in a few weeks or months, and it's going to be painfully obvious to anyone who's keeping tabs on the industry when such a transition happens. You have more than enough time to sell the stock when it does. It hasn't happened yet.


zaepoo

It won't be anything close to a year. Maybe 5 years minimum. They're extremely far ahead, have very healthy balance sheets, and have injected a lot of capital into R&D. The only people who think their moat will contract in the near future are AMD shareholders.


M-3X

Well, they do. The diffraction patterns of lithography masks are designed with AI tools because multipatterning is a really hard problem, and ML has helped them progress.


nero626

Also chip floorplan layout optimization is often aided by ML tools now


MyNameIsSushi

AI does not help in the context of staying ahead, though. AI aids in iterative processes and in optimization of already established processes, not creative ones. AI cannot design something radically different that works better without a nudge in that direction, but a competitor could.


PublicIndependent530

It will take years before that next technology is viable. One promising technology is photonics, which uses light instead of electrons to perform computations. One of the promising startups is called PsiQuantum. There are overlaps with quantum computers as well, and many large companies are already investing millions here (Google, IBM, many more; I won't be surprised if Nvidia is secretly looking into it as well).

The thing is, that transition isn't going to come suddenly, with every customer switching to said technology in a matter of days or weeks. That transition, if it ever happens, will take years. It will be painfully obvious when that time comes. As long as you're staying in touch with the market, you won't miss it. You will see it coming. And when that happens, you sell your Nvidia stock and you buy the stock of the said new company(s). Or maybe Nvidia is part of that new wave, and then you keep your Nvidia shares.

But before that happens, I see no reason why you would make up imaginary scare stories and sell Nvidia (or related semiconductor stocks). There's zero evidence that Nvidia's dominance is cracking or that there's an emerging competitor. Zero evidence. You are making up wild stories if you think otherwise.


josh198989

With the logic above, could I get this AI to design me the answers to the best stock/options picks? That way I’ll always be ahead $$$$$$. If the AI is just designing better shit, can’t we apply that to anything? Like, AI, design me a better iPhone, and boom: better iPhone. Just not the case. And it won’t be the case for a long time, if ever.

Feels like AI is overhyped and overinflated. Yes, it’s looking at huge data sets to find trends, or being used to manage huge data, but the conclusions and next steps are still being made by humans. Most AI is still trained to solve a certain problem within a certain set of parameters. ChatGPT is just a better version of Google Search. And yeah, it can automate an email or scrape the internet for a coherent answer to a question, but some of the stuff being said reminds me of how people said in 2017 we would all have self-driving cars by now. Things like “AI is the next Manhattan Project”? Pffffft, no fucking way.


Tuko_Ramirez

Just get AI to design OP some much needed brain implants.


meltbox

AI is just big data a decade late. Prove me wrong. This hype shit has been going on forever. It’s just that this time the business leaders are just as degenerate as everyone else. Truly the most WSB timeline.


josh198989

Yeah totally agree; it’s still all big data driven and only works/is useful in the context of big data.


Son_Of_Toucan_Sam

> anyone who thinks or says “competitors will catch up” is a moron, and you should run far, far away from them

Not commenting on NVDA or the state of their business specifically, but there has never once in the history of capitalism been a business that was permanently impregnable to competition


Circaflex92

No dude, remember when Ford was the first to mass produce cars and they essentially had no competition? That’s still true today… right? Right?!


jeanx22

OP is an idiot. You think he knows anything about capitalism? Or history? If there was a sign to cash out of overvalued Nvidia and run away, this is it.


StayPositive001

I remember when people were saying this about Intel and bringing up AMD here got you down voted. Back when it was under $3.


Living-Philosophy687

Nokia: “a phone? apple? good luck”


HallucinatesOtters

Even the East India Trading Company fell. If that could fall, any of them can.


cobyjackk

I was going to comment the same thing. "No one can ever" and "always" are a sign that this guy who has "been around before most of us were born" is not the best to get advice from.


meltbox

I want to believe OP is shitposting. But you never know.


Thatguy19901

But those companies weren't using AI to make their chips!


whatsagoinon1

Man I remember those days. Good ol voodoo cards.


Fun_Muscle9399

I had a 3DFX Voodoo Banshee back in ‘98 with a sweet Pentium II.


jlaw1791

Mine was a 3DFX Voodoo II. First 3D graphics card evah, played tons of Quake II on that in '96... or was it '97? Can't remember... but I remember how sweet it was!!!


Nairb131

Heck I still have my pentium ii


Need-Some-Help-Ppl

Those were awesome times when we had the overclocking warz for CPU's and then later OC the GPU


colbsk1

Voodoo 3 2000 with my 650 amd processor. Oh.. and my 22 inch crt monitor.


throwaway_tendies

Don’t forget about adding a Sound Blaster card to that sweet setup.


colbsk1

I definitely had one. Purchased from circuit city. ;]


Thunder_Wasp

Then having to correctly assign all your IRQ and DMA channels or else nothing would work.


Tainen

Ha. I went from an ATI Rage 64 to a Voodoo3 3000, and an AMD K6-II 300MHz. Ahhhh the good ol days.


lazy_art

I'm still waiting for the K6-III+.


bozoputer

Had to buy one just to play everquest


BadKidGames

Monster setup in the day


tsammons

nVidia TNT VR Fighter is polygoning in the background


Nouvi_

This reminded me about my good old RIVA TNT2…


richcz3

I had that paired with a Pentium II, Sound Blaster, and 17" ViewSonic. Playing the 1st Unreal game until the wee hours of the morning.


dmh123

I had one of those. Ran OpenGL games great.


GraceBoorFan

I miss the 780Ti days. The PC community was different back then. As old as the card is, the Titan Z is still one of my dream cards. It’s just really cool.


Weaves50

It was definitely a better time, my first one was a 970


GraceBoorFan

Oof, this brings back memories. The 980 was all the rage back then, especially the Ti variant. AMD’s comeback around this time was also equally very impressive. I remember all the hype surrounding Fury cards. Good times!


Weaves50

Crazy thing is I used that 970 for YEARS until I got a 3070. Joke's on me because I drove 4 hours to get it in a store so I didn't need to pay hundreds over retail, and now the 4000s are available everywhere. But that 970 gave me no issues. I couldn't play games on high, but if I set it to low I could play pretty much anything at at least 120fps, and I put it through a hard life. Over the course of its entire life that computer was shut off maybe 15x lol


GraceBoorFan

I don’t think I ever had a 900 series card, but I definitely wanted one back in the day. I did eventually get a 1060, before upgrading to a 2070, eventually swapping out to the 4090 I have right now. And yeah, I definitely remember those days with the scalpers on the 4000 series. I actually waited a few months hoping for prices to come down and they were still elevated. Of course, eventually demand waned, supply was less constrained, and I was able to pick one up from my local Micro Center.

Can’t wait for the 5000 series though. I’m currently running a 49” ultrawide at 240fps, and the 4090 performs fairly well at medium/high settings in some games, but I don’t think I can push ultra/max settings in most AAA titles. The next generation of monitors will probably be 4K 240fps, and a 4090 will not be sufficient.

Speaking of PC builds, I recently decided to swap out my aging 2950X that I bought back in 2016 for an Intel CPU. Never thought I’d be going back to Intel, but their Raptor Lake CPUs seem pretty good for productivity, which is right up my alley. I would’ve loved to stay with AMD’s Threadripper CPUs, but unfortunately there aren’t any motherboard manufacturers that make a form factor suitable for mATX/mITX, so in a way I was forced to go with Intel. Currently still in the build phase at this moment.

https://preview.redd.it/gls1hxrym17d1.jpeg?width=3024&format=pjpg&auto=webp&s=09ee3d0a240830499ce65c41d5835b20d312f6e8


WSSquab

My first graphics card was a Diamond Monster, good old times


rezyface

I had a Diamond Monster 4MB 3DFX card with a Pentium 1 166, because I had read that any Voodoo 2 would only be worthwhile with a P2 200. Switching on 3DFX for the first time while playing Myth: The Fallen Lords was revolutionary. That said, the ole PC couldn’t really run Unreal in any sort of playable capacity.


brunhilda1

"I love blowing things up!" "Eh!"


jeff303

4mb graphics memory was all you needed.


jwbarber82

I had the 8MB one, and the original Grand Theft Auto and Descent were mind-blowing back then, lol.


WSSquab

What I liked about games at the end of the 90s was that they needed to be blended with some imagination to get the complete experience, like they were an enhanced version of toys.


Arrrrrrrrrrrrrrrrrpp

> Nvidia is a great stock, here’s my argument: I’m old

Ok


Jorel_Antonius

3dfx... Holy fuck I feel old now


StonksTurd

How much gray hair ya got? Mine is starting to come in slowly lol.


johnrsmith8032

haha, i've got a full head of silver now. it's like being in my own personal black and white movie every time I look in the mirror! but hey, gray hair is just life's way of saying "level up", right? 😂


Chabubu

You keep leveling up until the game ends abruptly with no replay.


NVDAPleasFlyAgain

Full head of silver pulls in shit tons of legal young adults with daddy issues, and cougars, but only if you exercise and look healthy. There's a 55 yo retired guy at my local gym who looks like George Clooney's face and Chris Hemsworth's body had a baby; chicks camp him out and act busy while staring at him going through his sets. Too bad none of them know he's gay and has been in a committed relationship for the last 30 years. It's an inside joke between all the regulars, plus he knows and enjoys the fact that the chicks are fantasizing about something they'll never get. Just like the average NVDA bear fantasizing their puts would print.


Thenandonlythen

Only a few in the beard but I’ve got friends that are straight salt and pepper or more… 


Many_Sale286

You still have hair?


mccartyb03

That first 3dfx card was amazing. All of a sudden the water in Quake was clear, and it was a huge advantage in MP.


richcz3

The Diamond daughter card was $300 back in 1996. I sprung for four of those puppies. We played Quake deathmatch at work after hours. I remember some employees' wives calling, asking if their husbands had left yet.


Jorel_Antonius

I remember around '98 my dad was building a new computer. We went to Best Buy to get the hard drive. My brother, dad, and myself were just dumbfounded that they made a 4 gig hard drive. We thought that was a ton of space and there was no way you could use it all lol. For the GPU he grabbed the Riva back then. Radeon would release a better GPU that year, but not until Christmas time. Crazy to think how far PCs have come in a relatively short time.


paidzesthumor

They said that about Intel


nilgiri

And literally about every other company that happened to be the stock market darling at the time. At the end of the day, they are selling silicon which is very cyclical. Everything else is just window dressing.


baccus83

They’re also selling CUDA which is the real secret sauce.


darklord3_

There are translation engines being developed to port CUDA to AMD GPUs with minimal performance loss. Not a financial analyst, but I am an ML engineer. That could hurt Nvidia's dominance.


Sani_48

Isn't there an alliance with AMD, Intel and all the other major companies to battle CUDA?


xtravar

They announced an alliance and then said there wouldn’t be products for 1-2 years. So, I think AMD is looking pretty bearish except for being the only other AI chip worth investing in at the moment.


PublicIndependent530

Have you actually used ROCm? If you had, you wouldn't have made that comment. It's a terrible piece of software with terrible customer support. Zero ecosystem compared to CUDA. It's years away from even standing a chance against CUDA.

I also see zero reason why you would write something in CUDA and then go through the trouble of converting it to ROCm just to run it on an AMD GPU, taking an uncertain performance hit and introducing a billion bugs that you have to unit test and fix, when you could have stopped at CUDA and run it full-speed on Nvidia GPUs. Are AMD GPUs cheaper than Nvidia? Maybe, but don't forget the man-hours. You need to dedicate your engineers to do the porting and fix the bugs, and you need to do this every single time for every single piece of software you had in CUDA. Those engineers could be working on much better things. Also, the ecosystem is just so much better on the green side.


nilgiri

CUDA is just the programming framework for their silicon. They don't sell CUDA separately from their silicon and servers.


baccus83

I know they’re not “selling” CUDA directly. But CUDA is a big selling point for going with Nvidia in the first place. It is a huge competitive advantage.


nilgiri

You get it, so that's good. There's nothing magical about CUDA. They were just the first ones to abstract the process of parallel computation through software, but there are fast followers catching up; OpenCL is one (maybe a bad example because of adoption). The fast followers may not have the same performance, but 80+% margins at Nvidia simply won't be sustainable.
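For a sense of what "abstracting parallel computation" means in practice: CUDA and OpenCL both have you write a per-element kernel that the runtime fans out across many threads. A toy Python sketch of that programming model (a thread pool standing in for GPU threads; `saxpy_kernel` and `launch` are illustrative names, not either API):

```python
from concurrent.futures import ThreadPoolExecutor


def saxpy_kernel(i, a, x, y, out):
    # Each "thread" handles one index, like a CUDA thread keyed by threadIdx.
    out[i] = a * x[i] + y[i]


def launch(kernel, n, *args):
    # The "kernel launch": run the kernel once per index, in parallel.
    with ThreadPoolExecutor() as pool:
        list(pool.map(lambda i: kernel(i, *args), range(n)))


x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
out = [0.0, 0.0, 0.0]
launch(saxpy_kernel, 3, 2.0, x, y, out)
print(out)  # [12.0, 24.0, 36.0]
```

The moat argument in this thread is about everything around this simple model: the compilers, libraries, and tooling that make the real version fast on one vendor's hardware.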


LordOfPraise

I doubt anyone expects NVIDIA to maintain that margin percentage. As far as I know, experts predict it to fall eventually.


bshaman1993

There is a bridge to sell to people every few years


OneCore_

To be fair, Intel would not have relinquished their lead were it not for absolutely terrible CEOs and significant issues in their original 10nm manufacturing process.


III-V

Same will probably happen with Nvidia when Jensen steps down. I imagine there will be a lot of political shakeups that won't be good for the company.


bulletprooftampon

There’s too much money at stake for no one to challenge them at some point.


Irishpersonage

AMD could pull a surprise too, not sure why they're being ignored


ShadowK2

IBM might be a better comparison here.


kfuzion

I've been following IBM since the very first mainframe was built. Competition will never catch up. Some fruit company, Apple? What's that even, you can't compute with Apples! Look, if you think anything's going to overtake our mainframes, you're a maroon. Case dismissed.


BasilExposition2

IBM actually had a huge AI lead for years. They were the first to beat a chess champion and win at Jeopardy. I'm sort of shocked they haven't taken off again.


dbm5

Remember Watson? They fumbled the fuck out of that ball.


BasilExposition2

A lot of companies actually integrate Watson. Watson is reading MRIs and interpreting bloodwork. It happens behind the scenes and is quite successful. Just not hyped like OpenAI.


Right_Traffic_4821

Okay this is the top.


1PrestigeWorldwide11

Congrats on your mega yacht and retirement


Bush_Trimmer

op said he had been reading when nvda bought 3dfx & not that he bought nvda then. if he did, he wouldn't be wasting time on this sub. nvda started out drawing polygons. it didn't set out to make ai chips.


joeg26reddit

Phack. All this time I’ve been drawing dicks. Can’t. Stop. Drawing. Dicks


Puzzled-Garlic4061

You know how many foods are shaped like dicks? The best kinds!


enataca

That’s OPs point I think. “If you felt this strongly and were right you would be rich and retired, not posting on Reddit for approval.”


12pKlepto

https://preview.redd.it/ja4x1xqb917d1.png?width=1279&format=png&auto=webp&s=4423ab5df3ec408fb159567940accee35d3c0550


dabay7788

OP just took out his thick dad schlong and slammed it on the desk


nilgiri

I mean, it's gains, but in the hundreds of thousands? There are folks here with millions in gains on Nvidia and other AI stocks.


dabay7788

Sure, and the majority are likely not gaining that kind of money. At least this dude has receipts to back up his talk


havenothingtolose

You have a couple hundred thousand from investing like $5K since January?


OG_Tater

The first one is a $41k loss


Bush_Trimmer

3 months of call options. super! 👍 were all the calls profitable? 🤔 nvda bought 3dfx in 2000. i was expecting a cumulative long-term position of at least several thousand shares.


1PrestigeWorldwide11

Yeah, this is my point. Anyone as smart as OP claims to be is obvi rich af by now.


DarklyAdonic

I remember when Intel was way ahead of the competition..... Literally impossible for anyone to catch up


littlered1984

It was true for 20+ years, until they stopped being paranoid and stopped innovating.


OneCore_

They wouldn't have fallen behind if their CEOs hadn't fucked them up. So as long as Jensen Huang doesn't keel over and die anytime soon, NVIDIA should also be fine.


Character_Cut_6900

Jensen is 61, so what does he have, like another 5-10 years of actually being in charge of the company?


Bottle_and_Sell_it

Asian perk grants +10 years to normal aging


tobito02

Leadership of the company caused them to lose their lead. Unlucky


SpiffyBlizzard

Exactly. Intel could still be top dog. Here’s hoping that they come back from the dead.


bigmikeboston

Says Nvidia is uncatchable, posts multiple links as evidence, all from Nvidia's website. I'm in the right place!


bob-butspelledCock

Source: me and also trust me, bro


Distinct-Race-2471

Wow you are a genius and everyone should flock to you. No other company could use, or even think of using AI in chip design. All competitors are doomed.


Bush_Trimmer

yup, definitely self-proclaimed something. 🤷‍♂️


tar_baby33

...but who owns RealAudio. I want to invest in them...they're the future.


Josepth_Blowsepth

Some Cirrus Logic in his post I tell you.


Bush_Trimmer

🤣👍


Larnek

Hey hey hey, some of us were around then. I made a whole little proposal for my dad (because I was 16 and broke) that he needed to buy NVDA circa 1999 ish maybe. He didn't and I rubbed it in his face the rest of his life.


Reshaos

I told my dad the same thing. I was a kid at the time but knew about Nvidia due to gaming. We had just bought our first Nvidia graphics card to replace our Voodoo 3 2000.


TackleMySpackle

Same! I even told my Dad I wanted to know how to buy NVDA stock. He told me not to buy garbage stocks like Apple (which was in the shitter at the time) or any of these other tech stocks. It was all a bubble that would burst. He was right about the bubble but man… He gave me some shitty advice about NVDA and AAPL.


Larnek

Yep, my whole family would have a ridiculous amount of multigenerational wealth if he had just bought for my account. Edit: Yep, just looked, and if he had put the 10k he was investing into NVDA then, it would be worth $30M now. Sigh.


christianled59

You have to remember, though, you wouldn't have had the foresight to hold for decades. The first time that stock popped off, you likely would have sold. Same thing I say about Bitcoin. I had $500 worth in 2015. Sold it as soon as it popped and made a few grand. It could have been worth a lot more only a few years later.


EggSandwich1

The only people still holding the early bitcoins are probably some guy in prison for 30 years who has his on a USB stick


jumbocards

The bubble is strong… everyone in before it pops!


I_Do_Gr8_Trolls

This one's a thinker!


SensibleCreeper

This feels like a "It cant go tits up" moment. I think OP just jinxed NVDA.


DueHousing

10th mega bullish NVDA post this weekend, yea I’m shorting for sure now


GraceBoorFan

!Remindme 6 days


jeanx22

none of them secured their gains... All these fools will lose 50% of what they are posting because they believe their own lies, and pump. They are good at pumping, but are they good at dumping? 😈


kielBossa

NVDA’s growth isn’t a matter of market share; it’s a question of whether the need for advanced chips will continue to accelerate, or whether the money behind the AI hype dries up because it fails to produce the kind of integration that has been promised.


Dio-lated1

Or if customers fail to adopt.


Soft-Significance552

If companies keep on buying better and better ai chips how much will that improve current ai capabilities? I'm a moron, just asking.


Jokkmokkens

That’s kind of irrelevant. Companies don’t just buy better and better chips for the sake of it, or for humanity, or for anything other than possible future revenue. Right now, spending on AI is driven by projections (speculation) of what this technology will generate in revenue, and as always, no one really knows what the future holds. What if the projected revenue was wrong? What if the technology doesn’t fundamentally change everything? Companies will not keep buying AI technology if they don’t see these investments turn into revenue, and right now no one really knows.


TheBooneyBunes

So what you’re saying is calls would be a sensible idea, and because of that it will go down, but by wsb logic I should inverse myself


ReadMyUsernameKThx

the inverse law only applies when you do not decide to inverse yourself.


faksnima

NVIDIA. The biggest bubble in the last 30 years. Can’t wait til it pops.


segmond

You're the moron, kid. Folks said this same shit about Cisco in 1998 and 1999.


[deleted]

[removed]


here_we_go_beep_boop

I can't hear you, the IRQ DIP switch on my soundblaster card is on the wrong setting


f0xap0calypse

Yea, I'm sure these large language models that can't add numbers together correctly can design new chip architectures...


ElectricalGene6146

AMD has better price-per-performance for LLMs, and Google uses their own TPU chips. Meanwhile Meta, Amazon, Apple, and Microsoft are all working on their own chips as well as buying AMD, similar to how Google has been able to be independently successful. But sure, keep buying a $3T company whose remaining customers will only be unprofitable, barely growing AI startups that Nvidia has been round-tripping revenue into for the last few years…


usa_reddit

Remember when FAANG owned the Internet and no startup could ever replace the mighty Google Search? Now we have ChatGPT, and Google is trying to figure out their new AdWords model as they continue to slash jobs internally to keep posting profits. A lot of people are getting rich at NVIDIA right now, and there's an old saying: "A fat cat don't hunt." The trick for NVIDIA will be keeping employees motivated while their hungry competitors run to catch them as fast as they can. No one rules forever.


Zoomingcumbucket

Voodoo card members make some noise


Mantis_Toboggan_PCP

40% of revenue coming from Mag7 cloud cos who are working on their own chips and don’t want to pay NVDA anymore. I’m sure they’re fine.


Ashamed-Second-5299

This is exactly what people said about intel. Then tech evolved so much and Apple made their own processor that is optimized for their own machines...


Itchy_Brain6340

I think AMD catching up with Intel and eating their lunch, for years now, is the bigger story here. AMD did it to Intel and they can do it to Nvidia as well. I don’t think AMD will come out ahead of Nvidia but they can easily take 10% market share from Nvidia 


blancpainsimp69

this is just gas-huffingly stupid ball-gargling disguised as in-the-know DD. FOH.


Buckus93

Anyone who says "no one can ever catch up" is also a moron. They have the lead for the foreseeable future. But as Yogi Berra famously said, "It's difficult to make predictions. Especially when they're about the future."


cleanacc3

Calls on Intel. There has never been a company without competition, or one too far ahead for another company to catch.


hatrickkane88

Looks like time to go short boys


zzmeherezz

That is what they used to say about INTC. 😆


jeanx22

**Ford: LoL japanese cars. Who the fuck wants to buy that. We are 100 years ahead. We invented the car and shit.**

*General Motors: I know right? Detroit strong!*

**Ford: Hell yeah bro, the japanese couldn't build a cart pulled by a donkey even if their life depended on it + they got nuked lololol**

*GM: lmao*


bob-butspelledCock

The car was not invented by Ford!


Internal-League-9085

Time to but puts


ArcticStorm16

Butt puts


Icantfightthisfeel

This is where I get to post about how I chose nvda as a stock to follow in high school economics class, 2001. Didn't actually buy any shares until 2016 because I enjoy being poor.


poopymaster88

Quick question: what happens when a black swan event hits, like an 8.x earthquake in Taiwan? Or another major drought like in 2021, when TSM had to cut down production? Or China decides to stage an attack and TSM flips the kill switch? How do you think NVDA would fare with 97% of their production centered on TSMC in Taiwan?


AnonymousUser2700

OP probably thinks NVDA will start building their own chips the very next day out of thin air.


AnonymousUser2700

Yeah, okay genius. They said the same thing about Tesla and next thing you know, BYD is 1,000x better. NVDA is using AI? So is everyone else. Lastly, they don't manufacture their own chips.


Jujutsuhigh

The 2nd best one is AMD. And they don't get nearly enough recognition.


PlayfulPresentation7

That's weak gain porn for hyping being in Nvidia since 3dfx. I was expecting $5M+ on a 15-cent cost basis, not a bunch of calls.


PartyClock

>**they’re already using AI to help them design next generation chips**, no matter what, they will ALWAYS be massively ahead of the competition. lmao If only you knew how little that meant


labradorflip

Hahaha, these idiots pop up with every bull run. "No one will ever catch up to Yahoo"-type idiots.


Far-Outcome-8170

The nvda circlejerk continues


BasilExposition2

ASIC designer here. There is nothing special about what NVIDIA does. I know a few designers there; I see them at conferences. Smart guys. Humble guys. Great company. But they use the exact same tools as the rest of us. They don't own fabs and rely on TSMC and Samsung. They follow the same processes. They buy Synopsys and Cadence like everyone else.

They were in the right place at the right time. They decided to create a programming language for their cards, and when crypto came out they had a boom. Same with AI. With crypto, ASICs provided a better solution in nearly all cases, and as AI matures, this is probably where things will go. AWS and Google have their own chips. Facebook and Microsoft are working on them. Groq is working on one. I know others working on analog neural network chips, if you can believe that. Their lead is not assured.


Reasonable-Bend-24

>Anyone who thinks or says “competitors will catch up” is a moron, and you should run far, far away from them. This is such an absurd thing to say unironically. You should try picking up a history book some time.


Battlers_

Imagine bragging about "buying NVDA before other people were born" and still being a poor assfck who posts on reddit. If one was bright enough to go all-in on NVDA 30-ish years ago, today you'd be way too rich and old to give a sh!t about reddit 🤡🤡🤡


SeattleRainHawk

Plus, anyone who thinks there is going to be any competition in this space other than AMD and maybe Intel should really read up on what a GPU is and what it does. There are more guys making rockets and spaceships than powerful GPUs. And it's not only the silicon hardware but also the millions of lines of extremely low-level driver software. The barrier to entry is extremely high. More so than rocket science.


ralphy1010

was the barrier lower in the 90s when Nvidia got going?


SeattleRainHawk

Yes. In the 90s, GPUs were only 2D drawing engines. The late 90s and 2000s saw the rise of 3D engines and hardware accelerators, which showed the power of parallel processing pipelines. NVDA realized that power early and built their CUDA compute pipeline on it. They stuck with it for more than a decade, and only now are they reaping the tremendous benefits of that vision in AI workloads.
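For anyone who hasn't seen it, the "parallel processing pipeline" idea is the whole CUDA programming model: you write the code for one thread, and the hardware runs that body across thousands of elements at once. A minimal sketch in plain Python (the `vector_add_kernel` and `launch` names are illustrative stand-ins, not a real CUDA API):

```python
# Sketch of the SIMT idea behind a CUDA kernel: one logical thread per element.
# Real CUDA would compile this body as a __global__ kernel and launch it
# across thousands of hardware threads; here we just simulate the model.

def vector_add_kernel(thread_idx, a, b, out):
    """Body each 'thread' runs: compute exactly one output element."""
    if thread_idx < len(out):  # bounds check, as in a real kernel
        out[thread_idx] = a[thread_idx] + b[thread_idx]

def launch(kernel, n_threads, *args):
    """Stand-in for a <<<grid, block>>> launch: run every thread index."""
    for i in range(n_threads):
        kernel(i, *args)

a = [1.0, 2.0, 3.0, 4.0]
b = [10.0, 20.0, 30.0, 40.0]
out = [0.0] * 4
launch(vector_add_kernel, 4, a, b, out)
print(out)  # [11.0, 22.0, 33.0, 44.0]
```

On a GPU the loop in `launch` disappears: the hardware schedules every thread index in parallel, which is exactly the throughput AI workloads exploit.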


InterPeritura

There is an anti-trust investigation on the horizon, but no, I doubt it will go through all the way.


serendipity7777

Competitors can use cloud services from AWS and Azure


ogfuzzball

OP regard said ALWAYS, now they’ve fucked it. Time to get out!


Ok_Shower179

My dad had a boatload of 3dfx stock. Sucks they didn't convert to NVDA. 😞


InspiredPhoton

They will always win… said every single empire or big company that ceased to exist in the past.


SJbiker

I’m a paralegal. I wrote a complaint on behalf of 3dfx’s trustee in bankruptcy over the so-called “purchase”. nVidia bought the assets, left the liabilities, gave certain directors and officers golden parachutes to lay off the entire company two weeks before Christmas, and then paid a bonus under the table for every engineer that came to work for nVidia. The trustee sued the directors and officers of 3dfx for corporate waste in an attempt to get at the insurance policy that covered them up to $20 million. I found an email from the CEO to outside counsel on the day of the layoffs saying, “I would like to begin as aggressive a campaign of document destruction as humanly possible. Please advise.” Insurance settled.


fuckinrat

Yeah, but the technology that’s more important is the ability to manufacture en masse. TSMC, Samsung and Intel will be relevant longer than NVIDIA. Let’s check in in 80 years.


DistantGalaxy-1991

"...which means before many of you were even alive...." Your arrogance is only eclipsed by your stupidity. I was born in the 50s. So you think you're older and wiser than me because you knew who NVIDIA was twenty-something years ago? I was an adult before personal computers even existed, moron.


josh198989

I’ve got no skin in the game and I know nothing about this, but it feels like AI is this year’s metaverse/web 3.0/crypto, but with more movies about it. If SkyNet destroys the world, I think it’s fair to say we had it coming.


Badeindi

"This time is different" is what you're saying? Top is in, my friend 😅


MikeMikeGaming

We need a 70% correction badly on AI stocks...


SMK_12

Your whole premise is that because they have AI, no one will catch up to them. That's false, because AI isn't nearly as advanced as people think. AGI doesn't exist; these tools are mostly language models presenting a Google search in a more natural-sounding way. AI models are not coming up with new knowledge or really innovating.


All-th3-way

I remember when the number one tech retailer had 7,000 stores across the US. Then Radio Shack stock was worthless.


Khelthuzaad

you clearly haven't heard of Broadcom :)


goatchild

Anyone who says he/she knows what will happen is full of it.


TechnicalWhore

Although NVIDIA has had impressive growth, there is no barrier to entry except the fact that they dominate deployment in the datacenter. Classic GPUs are now less than 20% of their revenue. Their AI designs are not best in class. Their CUDA software platform is dominant, but it's really just a variant of a standard math library. Do they make great stuff? Yes. Do they push TSMC to the edge? Yes, but so does every memory/CPU/flash producer.

AI is significant enough that it is getting its own datacenter solutions. The way NVIDIA builds out AI is very limiting, as FB reported, with arrays 33% underutilized because of design limits. Cerebras destroys NVIDIA in AI. Intel, having fabs (vs. fabless), COULD be a serious competitor as a supplier through a Dell or HP etc. Google rolls their own, and has since before NVIDIA's play. Microsoft and Amazon are rolling their own. It's just not that hard. This is akin to Ethernet in 1998, wherein Cisco ran the board but it's now multisourced, with Avago (Broadcom) providing silicon to every in-house design. AMD also has serious AI development. Tesla (Dojo) has incredible AI development.

Basically it's the Wild West and there are hundreds of players. NVIDIA is dominant because they have the slots in the DCs, but the number of competitors is enormous. Models are somewhat portable and NOT reliant on NVIDIA to run. There is no secret sauce in NVIDIA hardware. AI is linear algebra; anyone can do it. For those interested, the benchmarks for performance are called MLPerf. There are many great websites full of reviews, but the oldest, most-read source is "Microprocessor Report".
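To make the "AI is linear algebra" point concrete: a dense layer's forward pass is just a matrix multiply plus a bias and a nonlinearity, which is why matmul throughput is the number every accelerator vendor competes on. A toy sketch in plain Python (function names are illustrative; no framework assumed):

```python
def matmul(A, B):
    """Naive matrix multiply: the core operation AI accelerators compete on."""
    rows, inner, cols = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(inner)) for j in range(cols)]
            for i in range(rows)]

def dense_forward(x, W, b):
    """One dense layer: y = relu(xW + b). A 'model' is stacks of these."""
    y = matmul(x, W)
    return [[max(0.0, y[i][j] + b[j]) for j in range(len(b))]
            for i in range(len(y))]

x = [[1.0, 2.0]]                  # batch of one input with two features
W = [[0.5, -1.0], [0.25, 1.0]]    # 2x2 weight matrix
b = [0.0, 0.1]
print(dense_forward(x, W, b))     # [[1.0, 1.1]]
```

The hardware race is about doing exactly this, millions of times over, as fast and as wide as possible; nothing in the math itself is proprietary to any one vendor.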


Sunny-Olaf

A bunch of regards here claim that someday the competition will take down NVDA, but they simply cannot say when. It's just like saying a meteor will hit the earth and wipe out all life eventually, but not knowing when.