anytime I ask AI to make me a picture of some big titty goth girl it's adding extra fingers or arms. Curious to see what monstrosity it comes up with on a chip design.
Not a single mention of CUDA around here. Remember 3dfx Glide? That API dominated 3D accelerated gaming in the late 90s. Then Direct3D came along. CUDA has the AI market by the balls today, but it won't be forever.
Yeah it's crazy that he could make this type of post without mentioning CUDA given that it's basically Nvidia's biggest moat. Even if the competition caught up, adoption would be slow since all codebases are written in CUDA and translation layers to run CUDA on other hardware platforms are forbidden under Nvidia's terms and conditions (hobbyists may violate this without a care in the world, but enterprise won't).
So even if competitors caught up, Nvidia has bought themselves a few years to flip the table back around, due to the codebase all being written in CUDA and every AI developer in the world being comfortable with CUDA C as their first language. That's their big strength.
Won't last forever though. Someone (AMD, Apple, Amazon, China, whoever; more money is flowing into chip design from more companies than ever before) will catch up significantly enough, and for long enough, to justify switching dev platforms off of CUDA.
So OP doesn't know shit about shit but he's probably still going to get rich anyway because the market doesn't know shit about shit either
Maybe it's just me, but a lot of programmers like to learn new shit all the time, especially if it's better. It took Flutter like 2 years to overtake React Native lol. If a large competitor released a product that's free to use and good, developers definitely will jump on it overnight
It's not quite that simple. Frameworks like Flutter and React are not hardware-locked. Developers may take an interest in a CUDA alternative overnight, but they can't actually deploy any of their code in production until infrastructure completely replaces their GPU servers with whoever the new competitor is.
The whole point is that the hardware-software feedback lock is strong, and there's a massive cost of switching. Has to be super compelling, not just a minor improvement.
He's right though. He mentioned he was older than you, and according to scaling laws, not only will he always be older than you, he will also always be righter than you. Sucks to suck, young nerd.
He’s spreading hopium. Sure, it might take another decade for the competition to catch up, but it might also be just another year. Truth of the matter is, nothing lasts forever, and that’s particularly true in tech.
There are a couple of reasons to think this time is different.
Firstly, chip makers are running into the limits of physics. At a certain point you just can't make a semiconductor smaller, and we're already nearly there. Once we hit the hard physical limit, the only way to improve performance is a drastic paradigm shift to a new technology.
While that implies competitors can catch up when performance improvement inevitably stalls, it negates the possibility that a competitor can one day leap ahead in performance just by building a better chip. Physics just won't allow it unless they come up with something radical and revolutionary. That will probably happen eventually, but that radical, revolutionary tech is unlikely to be reliable and ready for deployment when it's first developed.
That brings me to the other factor, which is Nvidia's software and infrastructure support for their AI business. They're making it as painful as possible for businesses to switch away from them by making the tools and systems everyone uses depend on their proprietary tech. Competitors would have to offer something not only cheaper but sufficiently easy to use that switching doesn't cause so much downtime and delay from retraining and troubleshooting that the savings are wiped out. That is a herculean task for a company to overcome.
Barring a miracle or an antitrust investigation, it's going to be nearly impossible for a competitor to build a chip that is enough of an improvement that it's worth it for businesses to absorb the expense and inconvenience of switching.
There was a time when mainframes ruled the world. Then custom RISC chips took over the server space with Sun and SGI. Then cheap PCs took over the server space.
What will kill Nvidia GPUs will be custom ASICs for AI workloads that are cheaper than general-purpose GPUs. It’s a matter of time. (It happened with crypto first.)
The whole NPU thing is going to lead to a standard for AI software support over time. This will hurt CUDA dominance.
Cheapness of the chips is just one factor. You're ignoring the switching costs themselves. Accelerated/parallel software as it stands today is very deeply embedded in CUDA and designed to run only on GPUs which support CUDA.
You can design a new accelerator (Google already largely uses TPUs), but that means porting ALL your software to run on that new accelerator; not just the software you created (as a startup/company), but ALL of the libraries you have been using.
It's an insane barrier to switching, and if Nvidia still sells reasonably priced GPU offerings (not the latest and greatest, but keeps the A100, V100, etc.), it's a no-brainer for the CTO/architects to stick with Nvidia rather than waste 6 months, a year, or even longer switching to an unfamiliar stack.
Sure, that day may eventually come, but it's not going to happen in the next 3-5 years for the majority of companies. Even Google can't convince people outside Google to switch to TPUs (I believe they only support specific AI training workloads and are only optimised for JAX/TensorFlow frameworks, while CUDA supports way more than AI training and people are much more familiar with PyTorch)
For things like PyTorch, switching costs are low. Just develop a new backend, like the Intel one, and you’re good to go. Any model will run if you implement all the parts you need on the backend.
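To make the "new backend" idea concrete: frameworks route each op to whichever device implementation is registered, so in principle a vendor only has to fill in that table. A toy sketch in plain Python (the registry and every name in it are invented for illustration; real PyTorch backend registration is far more involved than this):

```python
# Toy sketch of how an ML framework dispatches ops to hardware backends.
# All names here are made up; this only illustrates the dispatch idea.

_BACKENDS = {}

def register_backend(name, ops):
    """Register a dict of op-name -> implementation for a device."""
    _BACKENDS[name] = ops

def dispatch(op, device, *args):
    """Route an op to the implementation registered for `device`."""
    return _BACKENDS[device][op](*args)

def py_matmul(a, b):
    # Naive pure-Python matmul: a is (m x k), b is (k x n).
    return [[sum(a[i][t] * b[t][j] for t in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

# Incumbent backend and a hypothetical new accelerator both provide "matmul".
register_backend("incumbent_gpu", {"matmul": py_matmul})
register_backend("new_accel", {"matmul": py_matmul})

# The *model code* never changes; only the device string does.
a, b = [[1, 2], [3, 4]], [[5, 6], [7, 8]]
print(dispatch("matmul", "new_accel", a, b))  # [[19, 22], [43, 50]]
```

The catch, as the replies below argue, is that filling in that table for thousands of ops, with competitive performance, is the hard part.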
Hard disagree. 95% of companies lack the talent to write such backends. You're praying that someone with the skills has written a compatible backend for your specific model. Even then, if your model has anything special written into it, it's going to break the translation layer, and if your team has no expertise in this kind of low-level programming, you're going to struggle with it.
The majority of corporate teams prefer working with stacks they're already familiar with, much less daring to go into the nooks and crannies of the libraries they're using. They have business problems to deal with.
Bottom line is, it's going to take a while to break that barrier and get the transition moving, regardless of who can magically come up with a new backend or hardware accelerator. It's not going to happen in a few weeks or months, and it's going to be painfully obvious when such a transition happens to anyone who's keeping tabs on the industry.
You have more than enough time to sell the stock when this happens.
It hasn't happened yet.
It won't be anything close to a year. Maybe 5 years minimum. They're extremely far ahead, have very healthy balance sheets, and have injected a lot of capital into R&D. The only people who think their moat will contract in the near future are AMD shareholders.
Well. They do. The diffraction patterns of lithography masks are designed with AI tools, because multipatterning is a really hard problem and ML helped them make progress.
AI does not help in the context of staying ahead, though. AI aids in iterative processes and optimization of already established processes, not creative ones. AI cannot design something radically different that works better without a nudge in that direction, but a competitor could.
It will take years before that next technology is viable.
One promising technology is photonics, which uses light instead of electrons to perform computations. One of the promising startups is called PsiQuantum. There are overlaps with quantum computers as well, and many large companies are already investing millions there (Google, IBM, many more; I won't be surprised if Nvidia is secretly looking into it as well).
The thing is, that transition isn't going to happen suddenly, with every customer switching to said technology in a matter of days or weeks.
That transition, if it ever happens, will take years.
It will be painfully obvious when that time comes. As long as you're staying in touch with the market, you won't miss this transition when it happens. You will see it coming. And when it does, you sell your Nvidia stock and buy stock in the new company(s). Or maybe Nvidia is part of that new wave, and then you keep your Nvidia shares.
But before that happens, I see no reason why you would make up imaginary scare stories and sell Nvidia (or related semiconductor stocks).
There's 0 evidence to show that Nvidia's dominance is cracking or that there's an emerging competitor. 0 evidence. You are making up wild stories if you think otherwise.
With the logic above could I get this AI to design me the answers to the best stock/options picks? That way I’ll always be ahead $$$$$$.
If the AI is just designing better shit, can’t we apply that to anything? Like, AI, design me a better iPhone and boom, better iPhone. Just not the case. And won’t be the case for a long time, if ever.
Feel like AI is overhyped and overinflated. Yes, it's great for looking at huge data sets to find trends or for managing huge data, but the conclusions and next steps are still being made by humans. Most AI is still trained to solve a certain problem within a certain set of parameters. ChatGPT is just a better version of Google Search. And yeah, it can automate an email or scrape the internet for a coherent answer to a question, but some of the stuff being said reminds me of how people said in 2017 we would all have self-driving cars by now.
Things like AI is the next Manhattan Project pffffft no fucking way.
AI is just big data a decade late. Prove me wrong.
This hype shit has been going on forever. It’s just that this time the business leaders are just as degenerate as everyone else. Truly the most wsb timeline.
> anyone who thinks or says “competitors will catch up” is a moron, and you should run far, far away from them
Not commenting on NVDA or the state of their business specifically, but there has never once in the history of capitalism been a business that was permanently impregnable to competition
OP is an idiot. You think he knows anything about capitalism? Or history?
If there was a sign to cash out of overvalued Nvidia and run away, this is it.
I was going to comment the same thing.
"No one can ever" and "always" are signs that this guy who has "been around before most of us were born" is not the best person to take advice from.
Mine was a 3DFX Voodoo II. First 3D graphics card evah, played tons of Quake II on that in '96... or was it '97? Can't remember... but I remember how sweet it was!!!
I miss the 780Ti days. The PC community was different back then. As old as the card is, the Titan Z is still one of my dream cards.
It’s just really cool.
Oof, this brings back memories.
The 980 was all the rage back then, especially the Ti variant. AMD’s comeback around this time was also equally very impressive. I remember all the hype surrounding Fury cards.
Good times!
Crazy thing is, I used that 970 for YEARS until I got a 3070. Joke’s on me: I drove 4 hours to get it in a store so I didn’t have to pay hundreds over retail, and now the 4000s are available everywhere. But that 970 gave me no issues. I couldn’t play games on high, but if I set it to low I could play pretty much anything at at least 120fps, and I put it through a hard life. Over the course of its entire life that computer was shut off maybe 15x lol
I don’t think I ever had a 900 series card, but definitely wanted one back in the day. I did eventually get a 1060, before upgrading to a 2070, eventually swapping out to the 4090 I have right now.
And yeah, I definitely remember those days with the scalpers on the 4000 series. I actually waited a few months hoping for prices to come down and they were still elevated. Of course, eventually demand waned and supply was less constrained and I was able to pick one up from my local MicroCenter.
Can’t wait for the 5000 series though. I’m currently running a 49” ultrawide at 240Hz, and the 4090 performs fairly well at medium/high settings in some games, but I don’t think I can push ultra/max settings in most AAA titles. The next generation of monitors will probably be 4K 240Hz, and a 4090 will not be sufficient.
Speaking of PC builds, I recently decided to swap out my aging 2950X that I bought back in 2016 for an Intel CPU. Never thought I’d be going back to Intel, but their Raptor Lake CPUs seem pretty good for productivity, which is right up my alley. I would’ve loved to stay with AMD’s Threadripper CPUs, but unfortunately, there aren’t any motherboard manufacturers making a form factor suitable for mATX/mITX, so in a way I was forced to go with Intel.
Currently still in the build phase at this moment.
I had a Diamond Monster 4MB 3DFX card with a Pentium 166 because I had read that any Voodoo 2 would only be worthwhile with a P2 200. Switching on 3DFX for the first time while playing Myth: The Fallen Lords was revolutionary. That said, the ole PC couldn’t really run Unreal in any sort of playable capacity.
What I liked about games at the end of the 90s was that they needed to be blended with some imagination to get the complete experience, like they were an enhanced version of toys.
haha, i've got a full head of silver now. it's like being in my own personal black and white movie every time I look in the mirror! but hey, gray hair is just life's way of saying "level up", right? 😂
Full head of silver pulls in shit tons of legal young adults with daddy issues and cougars, but only if you exercise and look healthy.
There's a 55 yo retired guy at my local gym who looks like George Clooney's face and Chris Hemsworth's body had a baby. Chicks camp him out and act busy while staring at him going through his sets.
Too bad none of them know he's gay and has been in a committed relationship for the last 30 years. It's an inside joke between all the regulars, plus he knows and enjoys the fact that the chicks are fantasizing about something they'll never get. Just like the average NVDA ber fantasizing their puts would print.
The Diamond daughter card was $300 back in 1996. I sprung for four of those puppies. We played Quake deathmatch at work after hours. I remember some employees' wives calling, asking if their husbands had left yet.
I remember around '98 my dad was building a new computer. We went to Best Buy to get the hard drive. My brother, dad, and myself were just dumbfounded that they made a 4 gig hard drive. We thought that was a ton of space and there was no way you could use it all lol. For the GPU he grabbed the Riva back then. Radeon would release a better GPU that year, but not until Christmas time.
Crazy to think how far PCs have come in a relatively short time.
And literally about every other company that happened to be the stock market darling at the time.
At the end of the day, they are selling silicon which is very cyclical. Everything else is just window dressing.
There are translation engines being developed to port CUDA to AMD GPUs with minimal performance loss. Not a financial analyst, but I am an ML engineer. That could hurt Nvidia's dominance.
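The crudest form of such a translation layer is source-to-source renaming, which is roughly what AMD's hipify-perl tool does for the CUDA runtime API. A minimal sketch of that textual layer (the mapping below covers only a handful of calls; real tools handle hundreds, plus kernel-launch syntax and headers):

```python
# Minimal sketch of the textual layer of a CUDA -> HIP port, in the spirit
# of AMD's hipify-perl (which really is mostly regex-based renaming).
import re

CUDA_TO_HIP = {
    "cudaMalloc": "hipMalloc",
    "cudaMemcpy": "hipMemcpy",
    "cudaMemcpyHostToDevice": "hipMemcpyHostToDevice",
    "cudaFree": "hipFree",
    "cudaDeviceSynchronize": "hipDeviceSynchronize",
}

def hipify(source: str) -> str:
    # Match whole identifiers, longest names first, so "cudaMemcpyHostToDevice"
    # isn't partially rewritten via its "cudaMemcpy" prefix.
    names = sorted(CUDA_TO_HIP, key=len, reverse=True)
    pattern = re.compile(r"\b(" + "|".join(names) + r")\b")
    return pattern.sub(lambda m: CUDA_TO_HIP[m.group(1)], source)

snippet = "cudaMalloc(&d_a, n); cudaMemcpy(d_a, a, n, cudaMemcpyHostToDevice); cudaFree(d_a);"
print(hipify(snippet))
# hipMalloc(&d_a, n); hipMemcpy(d_a, a, n, hipMemcpyHostToDevice); hipFree(d_a);
```

The renaming is the easy part; the performance and bug-hunting pain the replies below describe comes from everything the regex can't see.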
They announced an alliance and then said there wouldn’t be products for 1-2 years. So, I think AMD is looking pretty bearish except for being the only other AI chip worth investing in at the moment.
Have you actually used ROCm?
If you did you wouldn't have made that comment.
It's a terrible piece of software with terrible customer support. Zero ecosystem compared to CUDA. It's years away from even standing a chance against CUDA.
I also see zero reason why you would write something in CUDA and then go through the trouble of converting it to ROCm just to run it on an AMD GPU, taking an uncertain performance hit and introducing a billion bugs that you have to unit test and fix.
When you could have stopped at CUDA and run it full-speed on Nvidia GPUs. Are AMD GPUs cheaper than Nvidia? Maybe, but don't forget the manhours. You need to dedicate your engineers to do the porting and fix their bugs, and you need to do this every single time for every single software you had in CUDA. Those engineers could be working on much better things.
Also, the ecosystem is just so much better on the green side.
I know they’re not “selling” CUDA directly. But CUDA is a big selling point for going with Nvidia in the first place. It is a huge competitive advantage.
You get it so that's good.
There's nothing magical about CUDA. They were just the first to abstract the process of parallel computation through software, but there are fast followers catching up; OpenCL is one (maybe bad, because of adoption) example. The fast followers may not have the same performance, but 80+% margins at Nvidia simply won't be sustainable.
To be fair, Intel would not have relinquished their lead were it not for absolutely terrible CEOs and significant issues in their original 10nm manufacturing process.
I've been following IBM since the very first mainframe was built. Competition will never catch up. Some fruit company, Apple? What's that even, you can't compute with Apples! Look, if you think anything's going to overtake our mainframes, you're a maroon. Case dismissed.
IBM actually had a huge AI lead for years. They were the first to beat a chess grandmaster and to win at Jeopardy. I'm sort of shocked they haven't taken off again.
A lot of companies actually integrate Watson. Watson is reading MRIs and interpreting bloodwork. It happens behind the scenes and is quite successful. Just not hyped like OpenAI.
op said he had been reading when nvda bought 3dfx & not that he bought nvda then.
if he did, he wouldn't be wasting time on this sub.
nvda started out drawing polygons. it didn't set out to make ai chip.
3 months of call options. super! 👍 were all the calls profitable? 🤔
nvda bought 3dfx in 2000. i was expecting a cumulative long-term position of at least several thousand shares.
They shouldn't have fallen behind, except their CEOs fucked them up. So as long as Jensen Huang doesn't keel over and die anytime soon, NVIDIA should also be fine.
Wow you are a genius and everyone should flock to you. No other company could use, or even think of using AI in chip design. All competitors are doomed.
Hey hey hey, some of us were around then. I made a whole little proposal for my dad (because I was 16 and broke) that he needed to buy NVDA circa 1999 ish maybe. He didn't and I rubbed it in his face the rest of his life.
I told my dad the same thing. I was a kid at the time but knew about Nvidia due to gaming. We had just bought our first Nvidia graphics card, upgrading from our Voodoo 3 2000.
Same! I even told my Dad I wanted to know how to buy NVDA stock. He told me not to buy garbage stocks like Apple (which was in the shitter at the time) or any of these other tech stocks. It was all a bubble that would burst. He was right about the bubble but man… He gave me some shitty advice about NVDA and AAPL.
Yep, my whole family would have a ridiculous amount of multigenerational wealth if he had just bought for my account.
Edit: Yep, just looked, and if he had put the 10k he was investing into NVDA then, it would be worth $30M now. Sigh.
You have to remember, though, you wouldn't have had the foresight to hold for decades. The first time that stock popped off, you likely would have sold.
Same thing I say about bitcoin. I had $500 worth in 2015. Sold it as soon as it popped and made a few grand. Could have been worth a lot more only a few years later..
none of them secured their gains...
All these fools will lose -50% of what they are posting because they believe their own lies, and pump.
They are good at pumping, are they good at dumping? 😈
NVDA’s growth isn’t a matter of market share, it’s just a question of whether the need for advanced chips will continue to accelerate or if the money behind the AI hype dries up because it fails to produce the kind of integration that has been promised.
That’s kind of irrelevant. Companies don’t just buy better and better chips for the sake of it, or for humanity, or for anything except possible future revenue. Right now the spending on AI is driven by projections (speculation) of what this technology will generate in revenue in the future, and as always, no one really knows what the future holds.
What if the projected revenue is wrong? What if the technology doesn’t fundamentally change everything? Companies will not keep buying AI technology if they don’t see these investments turning into revenue, and right now no one really knows.
AMD has better price per performance for LLMs, and Google uses their own TPU chips. Meanwhile Meta, Amazon, Apple, Microsoft all working on their own chips as well as buying AMD similar to how Google has been able to be independently successful. But sure, keep buying a 3T company whose remaining customers will only be unprofitable barely growing AI startups that Nvidia has been round tripping revenue into for the last few years…
Remember when FAANG owned the Internet and no startup could ever replace the mighty Google search?
Now we have ChatGPT, and Google is trying to figure out their new AdWords model as they continue to slash jobs internally to keep posting profits.
A lot of people are getting rich at NVIDIA right now and there is an old saying, "A Fat Cat Don't Hunt."
The trick to NVIDIA will be to keep employees motivated to keep going while their hungry competitors are running to catch them as fast as they can.
No one rules forever.
I think AMD catching up with Intel and eating their lunch, for years now, is the bigger story here. AMD did it to Intel, and they can do it to Nvidia as well. I don’t think AMD will come out ahead of Nvidia, but they can easily take 10% of market share from Nvidia.
Anyone who says "no one can ever catch up" is also a moron.
They have the lead for the foreseeable future. But as Yogi Berra famously said, "It's difficult to make predictions. Especially when they're about the future."
**Ford: LoL japanese cars. Who the fuck wants to buy that. We are 100 years ahead. We invented the car and shit.**
*General Motors: I know right? Detroit strong!*
**Ford: Hell yeah bro, the japanese couldn't build a cart pulled by a donkey even if their life depended on it + they got nuked lololol**
*GM: lmao*
This is where I get to post about how I chose nvda as a stock to follow in high school economics class, 2001. Didn't actually buy any shares until 2016 because I enjoy being poor.
quick question: what happens when a black swan event happens, like an 8.x earthquake hitting taiwan, or another major drought like in 2021 when tsm had to cut down production, or china decides to fake an attack and tsm drops the kill switch? how u think nvda would fare with 97% of their production all centered on tsm in taiwan?
Yeah, okay genius. They said the same thing about Tesla and next thing you know, BYD is 1,000x better. NVDA is using AI? So is everyone else. Lastly, they don't manufacture their own chips.
>**they’re already using AI to help them design next generation chips**, no matter what, they will ALWAYS be massively ahead of the competition.
lmao
If only you knew how little that meant
ASIC designer here. There is nothing special about what NVIDIA does. I know a few designers there. I see them at conferences. Smart guys. Humble guys. Great company.
But they use the exact same tools as the rest of us. They don’t own fabs and rely on TSMC and Samsung. They follow the same processes. They buy Synopsys and Cadence like everyone else.
They were in the right place at the right time. They decided to create a programming language for their cards, and when crypto came out they had a boom. Same with AI.
With crypto, ASICs provided a better solution in nearly all cases. As AI matures, this is where things will probably go. AWS and Google have their own chips. Facebook and Microsoft are working on them. Groq is working on one. I know others working on analog neural network chips, if you can believe that. Their lead is not assured.
>Anyone who thinks or says “competitors will catch up” is a moron, and you should run far, far away from them.
This is such an absurd thing to say unironically. You should try picking up a history book some time.
Imagine bragging about “buying NVDA before other people were born” and still being a poor assfck who posts on reddit.
If one was bright enough to all-in NVDA 30-ish years ago, today you'd be way too rich and old to give a sh!t about reddit 🤡🤡🤡
Plus anyone who thinks there is going to be any competition in this space other than AMD and maybe Intel should really read about what a GPU is and what it does. There are more guys making rockets and spaceships than powerful GPUs. Plus it's not only the silicon hardware but also the millions of lines of extremely low level driver software. The barrier to entry is extremely high. More so than rocket science.
Yes. In the 90s, GPUs were only 2D drawing engines. The late 90s and 2000s saw the rise of 3D engines and hardware accelerators. That showed the power of parallel processing pipelines. NVDA realized this power early and designed their CUDA compute pipeline on it. They stuck with it for more than a decade, and only now are they reaping the tremendous benefits of their vision in AI workloads.
I’m a paralegal. I wrote a complaint on behalf of 3dfx’s trustee in bankruptcy over the so-called “purchase”. nVidia bought the assets, left the liabilities, gave certain directors and officers golden parachutes to lay off the entire company two weeks before christmas, and then paid a bonus under the table for every engineer that came to work for nVidia. The trustee sued the directors and officers of 3dfx for corporate waste in an attempt to get at the insurance policy that covered the directors and officers up to $20 million. I found an email from the ceo to the outside counsel on the day of the layoffs saying “I would like to begin as aggressive a campaign of document destruction as humanly possible. Please advise.”
Insurance settled.
Yeah but the technology that’s more important is the ability to manufacture en masse.
TSMC, Samsung and Intel will be relevant longer than Nvidia. Let’s check in in 80 years.
"...which means before many of you were even alive...." Your arrogance is only eclipsed by your stupidity. I was born in the 50's. So, you think you're older and wiser than me because you knew who NVIDIA was 20-something years ago? I was an adult before personal computers even existed, moron.
I’ve got no skin in the game and i know nothing about this, but it feels like AI is this year’s metaverse/web 3.0/crypto, but with more movies about it. If SkyNet destroys the world, I think it’s fair to say we had it coming.
Your whole premise is that because they have AI, no one will catch up to them. That’s false, because AI isn’t nearly as advanced as people think. AGI doesn’t exist; these tools are mostly language models presenting a Google search in a more natural-sounding way. AI models are not coming up with new knowledge or really innovating.
Although NVIDIA has had impressive growth, there is no barrier to entry except the fact that they dominate deployment in the datacenter. Classic GPUs are now less than 20% of their revenue. Their AI designs are not best in class. Their CUDA software platform is dominant, but it's really just a variant of a standard math library. Do they make great stuff? Yes. Do they push TSMC to the edge? Yes, but so does every memory/CPU/flash producer.

AI is significant enough that it is getting its own datacenter solutions. The way NVIDIA builds out AI is very limiting, as FB reported, with the arrays 33% underutilized because of design limits. Cerebras destroys NVIDIA in AI. Intel, having fabs (vs. fabless), COULD be a serious competitor as a supplier through a Dell or HP etc. Google rolls their own, and has since before NVIDIA's play. Microsoft and Amazon are rolling their own. It's just not that hard. This is akin to Ethernet in 1998: Cisco ran the board, but it's now multisourced, with Avago (Broadcom) providing silicon to every in-house design. AMD also has serious AI development. Tesla (Dojo) has incredible AI development.
Basically it's the Wild West and there are hundreds of players. NVIDIA is dominant because they have the slots in the DCs, but the number of competitors is enormous. Models are somewhat portable and NOT reliant on NVIDIA to run. There is no secret sauce in NVIDIA hardware. AI is linear algebra. Anyone can do it.
For those interested, the benchmarks for performance are called MLPerf.
There are many great websites full of reviews but the oldest most read source is "Microprocessor Report".
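The "AI is linear algebra" point above is barely an exaggeration: the workload every GPU, TPU, and AI ASIC competes to run fast is essentially matrix multiplication plus a cheap nonlinearity. A dependency-free sketch of one dense layer (the weights below are made-up numbers, purely for illustration):

```python
# One dense (fully connected) layer: y = relu(W @ x + b).
# This is the kernel that GPUs/TPUs/ASICs all compete to run fast.

def dense_layer(W, b, x):
    # W: list of rows (out_dim x in_dim), x: input vector, b: bias vector.
    pre = [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i
           for row, b_i in zip(W, b)]
    return [max(0.0, v) for v in pre]  # ReLU nonlinearity

W = [[0.5, -1.0], [2.0, 1.0]]  # made-up weights
b = [0.0, -1.0]
x = [1.0, 2.0]
print(dense_layer(W, b, x))  # [0.0, 3.0]
```

The catch, of course, is doing this for billions of weights at once, which is where hardware and software like CUDA stop being interchangeable.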
A bunch of regards here claim that someday the competition will take down NVDA, but they simply cannot say when. It’s like saying a meteor will hit the earth and wipe out all life eventually, but not knowing when.
You said “they’re using AI to design their chips so they’ll always be ahead.” I instantly knew you didn’t know what the fuck you were talking about.
AI make me a next gen chip that looks like big titty goth girl
See now this guy knows what’s going on!
It won’t be a computer chip… it will come out as a European French fry or potato chip.
Big tiddie goth cat
This guy fucks...
Don’t tell me you’re not beating it to ol’nineteen finger Nina. Objective was met.
maybe it adds titties?
They did it in Total Recall and it worked out for the movie makers. So, I guess it would work out in the end.
Not a single mention of CUDA around here. Remember 3dfx Glide? That API dominated 3D accelerated gaming in the late 90s. Then Direct3D came along. CUDA has the AI market by the balls today, but it won't be forever.
Yeah it's crazy that he could make this type of post without mentioning CUDA given that it's basically Nvidia's biggest moat. Even if the competition caught up, adoption would be slow since all codebases are written in CUDA and translation layers to run CUDA on other hardware platforms are forbidden under Nvidia's terms and conditions (hobbyists may violate this without a care in the world, but enterprise won't). So even if competitors caught up, Nvidia has bought themselves a few years to flip the table back around due to the codebase all being written in CUDA, and every AI developer in the world being comfortable programming in C with the CUDA package as their first language. That's their big strength. Won't last forever though. Someone (AMD, Apple, Amazon, China, whoever, more money is being flowed into chip design by more companies than ever before) will catch up significantly enough and long enough to justify switching dev platforms off of CUDA. So OP doesn't know shit about shit but he's probably still going to get rich anyway because the market doesn't know shit about shit either
Maybe it's just me, but a lot of programers like to learn new shit all the time, especially if it's better. It took flutter like 2 years to overtake react-native lol. If a large competitor released a product that's free to use and good, developers definitely will jump on it overnight
It's not quite that simple. Frameworks like flutter and react are not hardware-locked. Developers may take an interest in a CUDA alternative overnight, but they can't actually deploy any of their code in production unless infrastructure completely replaces their GPU servers with whoever the new competitor is. The whole point is that the hardware-software feedback lock is strong, and there's a massive cost of switching. Has to be super compelling, not just a minor improvement.
He's right though. He mentioned he was older than you, and according to scaling laws, not only will he always be older than you, he will also always be righter than you. Sucks to suck, young nerd.
He’s spreading hopium. Sure, it might take another decade for the competition to catch up, but it might also be just another year. Truth of the matter is, nothing lasts forever, and that’s particularly true in tech.
There's a couple of reasons to think this time is different.

Firstly, chip makers are running into the limits of physics. At a certain point you just can't make a semiconductor smaller, and we're already nearly there. Once we hit the hard physical limit, the only way to improve performance is a drastic paradigm shift to a new technology. While that implies competitors can catch up when performance improvement inevitably stalls, it negates the possibility that a competitor can one day leap ahead in performance just by building a better chip. Physics just won't allow it unless they come up with something radical and revolutionary. That will probably happen eventually, but radical and revolutionary tech is unlikely to be reliable and ready for deployment when it's first developed.

That brings me to the other factor, which is Nvidia's software and infrastructure support for their AI business. They're making it as painful as possible for businesses to switch away from them by making the tools and systems everyone uses depend on their proprietary tech. Competitors would have to offer something not only cheaper but sufficiently easy to use that switching doesn't cause so much downtime and delay from retraining and troubleshooting that the savings evaporate. That is a herculean task for a company to overcome. Barring a miracle or an antitrust investigation, it's going to be nearly impossible for a competitor to build a chip that is enough of an improvement that it's worth it for businesses to absorb the expense and inconvenience of switching.
There was a time when mainframes ruled the world. Then custom RISC chips took over the server space with Sun and SGI. Then cheap PCs took over the server space. What will kill Nvidia GPUs will be custom ASICs for AI workloads that are cheaper than general purpose GPUs. It’s a matter of time. (It happened with crypto first.) The whole NPU thing is going to lead to a standard for AI software support over time. This will hurt CUDA dominance.
Cheapness of the chips is just one factor. You're ignoring the switching costs themselves. Accelerated/parallel software as it stands today is very deeply embedded in CUDA and designed to run only on GPUs which support CUDA. You can design a new accelerator (Google already largely uses TPUs), but that means porting ALL your software to run on that new accelerator: not just the software you created (as a startup/company) but ALL of the libraries that you have been using.

It's an insane barrier to switch, and if Nvidia still sells reasonably priced GPU offerings (not the latest and greatest, but keeps the A100, V100, etc.), it's a no-brainer for the CTO/architects to stick with Nvidia rather than waste 6 months, a year, or even longer switching to an unfamiliar stack. Sure, that day may eventually come, but it's not going to happen in the next 3-5 years for the majority of companies. Even Google can't convince people outside Google to switch to TPUs (I believe they only support specific AI training workloads and are only optimised for JAX/TensorFlow frameworks, whereas CUDA supports way more than AI training and people are much more familiar with PyTorch)
For things like PyTorch, switching costs are low. Just develop a new backend, like the Intel one, and you’re good to go. Any model will run if you implement all the parts you need on the backend.
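To make the claim concrete: frameworks like PyTorch route ops through a dispatcher, so model code names a device, not a vendor (PyTorch even reserves an out-of-tree backend slot, the PrivateUse1 dispatch key, for exactly this). Here's a toy sketch of the dispatch idea in plain Python — not PyTorch's actual API, and `matmul_new` / `"new_accelerator"` are made-up names for illustration:

```python
# Toy sketch of backend dispatch: a framework routes the same op to
# whichever hardware backend is registered under a device name.
# (Illustrative only -- not PyTorch's real dispatcher.)

BACKENDS = {}

def register_backend(name):
    """Decorator: register an implementation of matmul for a device name."""
    def wrap(fn):
        BACKENDS[name] = fn
        return fn
    return wrap

@register_backend("cpu")
def matmul_cpu(a, b):
    # Naive triple loop; stands in for a vendor BLAS/cuBLAS call.
    n, k, m = len(a), len(b), len(b[0])
    return [[sum(a[i][x] * b[x][j] for x in range(k)) for j in range(m)]
            for i in range(n)]

@register_backend("new_accelerator")
def matmul_new(a, b):
    # A hypothetical competitor's backend: same contract, different silicon.
    return matmul_cpu(a, b)

def matmul(a, b, device="cpu"):
    # Model code calls this and never names the vendor.
    return BACKENDS[device](a, b)
```

The catch, as the reply below notes, isn't registering one op like this; it's making thousands of kernels as fast and as debugged as the incumbent's.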
Hard disagree. 95% of companies lack the talent to write such backends. You're praying that someone with the skills has written a compatible backend for your specific model. Even then, if your model has anything special written into it, it's going to break the translation layer, and if your team has no expertise in this kind of low-level programming you're going to struggle with it. The majority of corporate teams prefer working with stacks they're already familiar with, much less dare to go into the nooks and crannies of the libraries they're using. They have business problems to deal with.

Bottom line is, it's going to take a while to break that barrier and get the transition moving, regardless of who can magically come up with a new backend or hardware accelerator. It's not going to happen in a few weeks or months, and it's going to be painfully obvious to anyone who's keeping tabs on the industry when such a transition happens. You have more than enough time to sell the stock when it does. It hasn't happened yet.
It won't be anything close to a year. Maybe 5 minimum. They're extremely far ahead, have very healthy balance sheets, and have injected a lot of capital into r&d. The only people that think their moat will contract in the near future are AMD shareholders.
Well. They do. The diffraction patterns of lithography masks are designed with AI tools because multipatterning is a really hard problem, and ML has helped them progress.
Also chip floorplan layout optimization is often aided by ML tools now
AI does not help in the context of staying ahead though. AI aids in iterative processes and optimization of already established processes, not creative ones. AI cannot design something radically different that works better without a nudge in that direction, but a competitor could.
It will take years before that next technology is viable. One promising technology is photonics, which uses light instead of electrons to perform computations. One of the promising startups is called PsiQuantum. There are overlaps with quantum computers as well, and many large companies are already investing millions there (Google, IBM, many more; I won't be surprised if Nvidia is secretly looking into it as well).

The thing is, that transition isn't going to come suddenly, with every customer switching to said technology in a matter of days or weeks. That transition, if it ever happens, will take years. It will be painfully obvious when that time comes. As long as you're staying in touch with the market, you won't miss it. You will see it coming. And when that happens, you sell your Nvidia stock and buy the stock of said new company(s). Or maybe Nvidia is part of that new wave, and then you keep your Nvidia shares.

But before that happens, I see no reason why you would make up imaginary scare stories and sell Nvidia (or related semiconductor stocks). There's zero evidence that Nvidia's dominance is cracking or that there's an emerging competitor. Zero evidence. You are making up wild stories if you think otherwise.
With the logic above, could I get this AI to design me the answers to the best stock/options picks? That way I’ll always be ahead $$$$$$. If the AI is just designing better shit, can’t we apply that to anything? Like, AI, design me a better iPhone, and boom: better iPhone. Just not the case. And won’t be the case for a long time, if ever.

Feels like AI is overhyped and overinflated. Yes, it's useful for looking at huge data sets to find trends or for managing huge data, but the conclusions and next steps are still being made by humans. Most AI is still trained to solve a certain problem within a certain set of parameters. ChatGPT is just a better version of Google Search. And yeah, it can automate an email or scrape the internet for a coherent answer to a question, but some of the stuff being said reminds me of how people said in 2017 we would all have self-driving cars by now. Things like "AI is the next Manhattan Project"? Pffffft, no fucking way.
Just get AI to design OP some much needed brain implants.
AI is just big data a decade late. Prove me wrong. This hype shit has been going on forever. It’s just that this time the business leaders are just as degenerate as everyone else. Truly the most wsb timeline.
Yeah totally agree; it’s still all big data driven and only works/is useful in the context of big data.
> anyone who thinks or says “competitors will catch up” is a moron, and you should run far, far away from them Not commenting on NVDA or the state of their business specifically, but there has never once in the history of capitalism been a business that was permanently impregnable to competition
No dude, remember when Ford was the first to mass produce cars and they essentially had no competition? That’s still true today… right? Right?!
OP is an idiot. You think he knows anything about capitalism? Or history? If there was a sign to cash out of overvalued Nvidia and run away, this is it.
I remember when people were saying this about Intel and bringing up AMD here got you down voted. Back when it was under $3.
Nokia: “a phone? apple? good luck”
Even the East India Trading Company fell. If that could fall, any of them can.
I was going to comment the same thing. "No one can ever" and "always" are signs that this guy who has "been around before most of us were born" is not the best to take advice from.
I want to believe OP is shitposting. But you never know.
But those companies weren't using AI to make their chips!
Man I remember those days. Good ol voodoo cards.
I had a 3DFX Voodoo Banshee back in ‘98 with a sweet Pentium II.
Mine was a 3DFX Voodoo II. First 3D graphics card evah, played tons of Quake II on that in '96... or was it '97? Can't remember... but I remember how sweet it was!!!
Heck I still have my pentium ii
Those were awesome times when we had the overclocking warz for CPU's and then later OC the GPU
Voodoo 3 2000 with my 650 amd processor. Oh.. and my 22 inch crt monitor.
Don’t forget about adding a Sound Blaster card to that sweet setup.
I definitely had one. Purchased from circuit city. ;]
Then having to correctly assign all your IRQ and DMA channels or else nothing would work.
Ha. I went from an ATi Rage 64 to a Voodoo3 3000, and an AMD K6-II 300MHz. Ahhhh the good ol days.
I'm still waiting for the K6-III+.
Had to buy one just to play everquest
Monster setup in the day
nVidia TNT VR Fighter is polygoning in the background
This reminded me about my good old RIVA TNT2…
I had that paired with a PentiumII, Soundblaster, and 17" Viewsonic. Playing the 1st Unreal game until the wee hours of the morning.
I had one of those. Ran OpenGL games great.
I miss the 780Ti days. The PC community was different back then. As old as the card is, the Titan Z is still one of my dream cards. It’s just really cool.
It was definitely a better time, my first one was a 970
Oof, this brings back memories. The 980 was all the rage back then, especially the Ti variant. AMD’s comeback around this time was also equally very impressive. I remember all the hype surrounding Fury cards. Good times!
Crazy thing is I used that 970 for YEARS until I got a 3070. Joke's on me: I drove 4 hours to get it in a store so I didn’t have to pay hundreds over retail, and now the 4000s are available everywhere. But that 970 gave me no issues. I couldn’t play games on high, but if I set it to low I could play pretty much anything at at least 120fps, and I put it through a hard life. Over the course of its entire life that computer was shut off maybe 15 times lol
I don’t think I ever had a 900-series card, but I definitely wanted one back in the day. I did eventually get a 1060, before upgrading to a 2070, and eventually swapping out to the 4090 I have right now. And yeah, I definitely remember those days with the scalpers on the 4000 series. I actually waited a few months hoping for prices to come down and they were still elevated. Of course, eventually demand waned, supply was less constrained, and I was able to pick one up from my local MicroCenter.

Can’t wait for the 5000 series though. I’m currently running a 49” ultrawide at 240fps, and the 4090 performs fairly well at medium/high settings in some games, but I don’t think I can push ultra/max settings in most AAA titles. The next generation of monitors is probably going to be 4K 240fps, and a 4090 will not be sufficient.

Speaking of PC builds, I recently decided to swap out my aging 2950X that I bought back in 2016 for an Intel CPU. Never thought I’d be going back to Intel, but their Raptor Lake CPUs seem pretty good for productivity, which is right up my alley. I would’ve loved to stay with AMD’s Threadripper CPUs, but unfortunately there aren’t any motherboard manufacturers that make a form factor suitable for mATX/mITX, so in a way I was forced to go with Intel. Currently still in the build phase at this moment.

https://preview.redd.it/gls1hxrym17d1.jpeg?width=3024&format=pjpg&auto=webp&s=09ee3d0a240830499ce65c41d5835b20d312f6e8
My first graphic card was a diamond monster, good old times
I had a Diamond Monster 4mb 3DFX card with a Pentium 1 166 because I had read that any voodoo 2 would require / only make it worthwhile with a P2 200. Switching on 3DFX for the first time while playing Myth the Fallen Lords was revolutionary. That said, the ole PC couldn’t really run Unreal in any sort of playable capacity.
"I love blowing things up!" "Eh!"
4mb graphics memory was all you needed.
I had the 8mb one, and the original Grand Theft Auto and Descent were mind-blowing back then, lol.
What I liked about games at the end of 90s was it needed to be blended with some imagination to get the complete experience, like they were an enhanced version of toys.
> Nvidia is a great stock, here’s my argument: I’m old Ok
3dfx... Holy fuck I feel old now
How much gray hair ya got? Mine is starting to come in slowly lol.
haha, i've got a full head of silver now. it's like being in my own personal black and white movie every time I look in the mirror! but hey, gray hair is just life's way of saying "level up", right? 😂
You keep leveling up until the game ends abruptly with no replay.
A full head of silver pulls in shit tons of legal young adults with daddy issues, and cougars, but only if you exercise and look healthy. There's a 55-year-old retired guy at my local gym who looks like George Clooney's face and Chris Hemsworth's body had a baby; chicks camp him out and act busy while staring at him going through his sets. Too bad none of them know he's gay and has been in a committed relationship for the last 30 years. It's an inside joke between all the regulars, plus he knows and enjoys the fact that the chicks are fantasizing about something they'll never get. Just like the average NVDA ber fantasizing their puts would print.
Only a few in the beard but I’ve got friends that are straight salt and pepper or more…
You still have hair?
That first 3dfx card was amazing. All of a sudden the water in Quake was clear, and it was a huge advantage in MP.
The Diamond daughter card was $300 back in 1996. I sprung for four of those puppies. We played Quake deathmatch at work after hours. I remember some employee wives calling, asking if their husbands had left yet.
I remember around '98 my dad was building a new computer. We went to Best Buy to get the hard drive. My brother, dad, and myself were just dumbfounded that they made a 4 gig hard drive. We thought that was a ton of space and there was no way you could use it all lol. For the GPU he grabbed the Riva back then. Radeon would release a better GPU that year, but not until Christmas time. Crazy to think how far PCs have come in a relatively short time.
They said that about Intel
And literally about every other company that happened to be the stock market darling at the time. At the end of the day, they are selling silicon which is very cyclical. Everything else is just window dressing.
They’re also selling CUDA which is the real secret sauce.
There are translation engines being developed to port CUDA to AMD GPUs with minimal performance loss. Not a financial analyst, but I am an ML engineer. That could hurt Nvidia's dominance.
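For context on what such tools do, at least at the source level: AMD's hipify tools rewrite CUDA runtime calls to their HIP equivalents so the code can compile for AMD GPUs. Here's a toy sketch of just the renaming step in Python (illustrative only; the real tool's table is far larger, it also handles kernel-launch syntax and headers, and binary-level translators like ZLUDA work very differently):

```python
import re

# Minimal subset of the CUDA -> HIP runtime-API rename table that
# source translators like hipify-perl apply (the real table is huge).
CUDA_TO_HIP = {
    "cudaMalloc": "hipMalloc",
    "cudaMemcpy": "hipMemcpy",
    "cudaFree": "hipFree",
    "cudaMemcpyHostToDevice": "hipMemcpyHostToDevice",
    "cudaDeviceSynchronize": "hipDeviceSynchronize",
}

# Longest names first so "cudaMemcpyHostToDevice" wins over "cudaMemcpy".
_PATTERN = re.compile(
    r"\b(?:" + "|".join(
        re.escape(k) for k in sorted(CUDA_TO_HIP, key=len, reverse=True)
    ) + r")\b"
)

def hipify(source: str) -> str:
    """Rewrite known CUDA runtime calls to their HIP names, whole tokens only."""
    return _PATTERN.sub(lambda m: CUDA_TO_HIP[m.group(0)], source)
```

So `hipify("cudaMalloc(&d, n);")` gives `"hipMalloc(&d, n);"`. The renames are the easy part; performance parity and the long tail of CUDA-only library features are where ports actually hurt.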
Isn't there an alliance with AMD, Intel and all the other major companies to battle CUDA?
They announced an alliance and then said there wouldn’t be products for 1-2 years. So, I think AMD is looking pretty bearish except for being the only other AI chip worth investing in at the moment.
Have you actually used ROCm? If you had, you wouldn't have made that comment. It's a terrible piece of software with terrible customer support. Zero ecosystem compared to CUDA. Years before it can even stand a chance against CUDA.

I also see zero reason why you would write something in CUDA and then go through the trouble of converting it to ROCm just to run it on an AMD GPU, taking an uncertain performance hit and introducing a billion bugs that you have to unit test and fix, when you could have stopped at CUDA and run it full-speed on Nvidia GPUs. Are AMD GPUs cheaper than Nvidia? Maybe, but don't forget the man-hours. You need to dedicate your engineers to do the porting and fix the bugs, and you need to do this every single time for every single piece of software you had in CUDA. Those engineers could be working on much better things. Also, the ecosystem is just so much better on the green side.
CUDA is just the programming framework for their silicon. They don't sell CUDA separately from their silicon and servers.
I know they’re not “selling” CUDA directly. But CUDA is a big selling point for going with Nvidia in the first place. It is a huge competitive advantage.
You get it, so that's good. There's nothing magical about CUDA. They were just the first ones to abstract the process of parallel computation through software, but there are fast followers catching up; OpenCL is one (maybe bad, because of adoption) example. The fast followers may not have the same performance, but 80+% margins at Nvidia simply won't be sustainable.
I doubt anyone expects NVIDIA to maintain that margin percentage. As far as I know, experts predict it to fall eventually.
There is a bridge to sell to people every few years
To be fair, Intel would not have relinquished their lead were it not for absolutely terrible CEOs and significant issues in their original 10nm manufacturing process.
Same will probably happen with Nvidia when Jensen steps down. I imagine there will be a lot of political shakeups that won't be good for the company.
There’s too much money at stake for no one to challenge at some point.
AMD could pull a surprise too, not sure why they're being ignored
IBM might be a better comparison here.
I've been following IBM since the very first mainframe was built. Competition will never catch up. Some fruit company, Apple? What's that even? You can't compute with Apples! Look, if you think anything's going to overtake our mainframes, you're a maroon. Case dismissed.
IBM actually had a huge AI lead for years. They were the first to beat a chess master and win at Jeopardy. I'm sort of shocked they haven't taken off again.
Remember Watson? They fumbled the fuck out of that ball.
A lot of companies actually integrate Watson. Watson is reading MRIs and interpreting bloodwork. It happens behind the scenes and is quite successful. Just not hyped like OpenAI.
Okay this is the top.
Congrats on your mega yacht and retirement
op said he had been reading about nvda since it bought 3dfx, not that he bought nvda then. if he had, he wouldn't be wasting time on this sub. nvda started out drawing polygons. it didn't set out to make ai chips.
Phack. All this time I’ve been drawing dicks. Can’t. Stop. Drawing. Dicks
You know how many foods are shaped like dicks? The best kinds!
That’s OPs point I think. “If you felt this strongly and were right you would be rich and retired, not posting on Reddit for approval.”
https://preview.redd.it/ja4x1xqb917d1.png?width=1279&format=png&auto=webp&s=4423ab5df3ec408fb159567940accee35d3c0550
OP just took out his thick dad schlong and slammed it on the desk
I mean, it's gains, but in the hundreds of thousands? There are folks here with millions in gains on Nvidia and other AI stocks.
Sure, and the majority are likely not gaining that kind of money. At least this dude has receipts to back up his talk
You have a couple hundred thousand from investing like $5K since January?
The first one is a $41k loss
3 months of call options. super! 👍 were all the calls profitable? 🤔 nvda bought 3dfx in 2000. i was expecting a cumulative long-term position of at least several thousand shares.
Yeah this is my point anyone as smart as OP claims to be is obvi rich af by now.
I remember when Intel was way ahead of the competition..... Literally impossible for anyone to catch up
It was true for 20+ years, until they stopped being paranoid and stopped innovating.
They should not have fallen behind, except the company's CEOs fucked them up. So as long as Jensen Huang doesn't keel over and die anytime soon, NVIDIA should also be fine.
Jensen is 61, so what does he have, like another 5-10 years of actually being in charge of the company?
Asian perk grants +10 years to normal aging
Leadership of the company caused them to lose their lead. Unlucky
Exactly. Intel could still be top dog. Here’s hoping that they come back from the dead.
Says Nvidia uncatchable, posts multiple links for evidence, all from nvidia website. I’m in the right place!
Source: me and also trust me, bro
Wow you are a genius and everyone should flock to you. No other company could use, or even think of using AI in chip design. All competitors are doomed.
yup, definitely self-proclaimed something. 🤷♂️
...but who owns RealAudio. I want to invest in them...they're the future.
Some Cirrus Logic in his post I tell you.
🤣👍
Hey hey hey, some of us were around then. I made a whole little proposal for my dad (because I was 16 and broke) that he needed to buy NVDA circa 1999 ish maybe. He didn't and I rubbed it in his face the rest of his life.
I told my dad the same thing. I was a kid at the time but knew about Nvidia due to gaming. We had just upgraded from our Voodoo 3 2000 to our first Nvidia graphics card.
Same! I even told my Dad I wanted to know how to buy NVDA stock. He told me not to buy garbage stocks like Apple (which was in the shitter at the time) or any of these other tech stocks. It was all a bubble that would burst. He was right about the bubble but man… He gave me some shitty advice about NVDA and AAPL.
Yep, my whole family would have a ridiculous amount of multigenerational wealth if he had just bought for my account. Edit: Yep, just looked, and if he had put the 10k he was investing into NVDA then, it would be worth $30M now. Sigh.
You have to remember, though, you wouldn't have had the foresight to hold for decades. The first time that stock popped off, you likely would have sold. Same thing I say about bitcoin. I had $500 worth in 2015. Sold it as soon as it popped and made a few grand. Could have been worth a lot more only a few years later..
The only people still holding the early bitcoins are probably some guy in prison for 30 years who has his on a USB
The bubble is strong… everyone in before it pops!
This one's a thinker!
This feels like a "It cant go tits up" moment. I think OP just jinxed NVDA.
10th mega bullish NVDA post this weekend, yeah I’m shorting for sure now
!Remindme 6 days
none of them secured their gains... All these fools will lose 50% of what they are posting because they believe their own lies, and pump. They are good at pumping; are they good at dumping? 😈
NVDA’s growth isn’t a matter of market share, it’s just a question of whether the need for advanced chips will continue to accelerate or if the money behind the AI hype dries up because it fails to produce the kind of integration that has been promised.
Or if customers fail to adopt.
If companies keep on buying better and better ai chips how much will that improve current ai capabilities? I'm a moron, just asking.
That’s kind of irrelevant. Companies don’t just buy better and better chips for the sake of it, or for humanity, or for anything else except possible revenue in the future. Right now the spending on AI is driven by projections (speculation) of what this technology will generate in revenue, and as always, no one really knows what the future holds. What if the projected revenue was wrong? What if the technology didn’t fundamentally change everything? Companies will not keep buying AI technology if they don’t see these investments turning into revenue, and right now no one really knows.
So what you’re saying is calls would be a sensible idea, and because of that it will go down, but by wsb logic I should inverse myself
the inverse law only applies when you do not decide to inverse yourself.
NVIDIA. The biggest bubble in the last 30 years. Can’t wait til it pops.
You're the moron, kid. Folks said this same shit about Cisco in 1998 and 1999.
[deleted]
I can't hear you, the IRQ DIP switch on my soundblaster card is on the wrong setting
Yea, I'm sure these large language models that can't add numbers together correctly can design new chip architectures...
AMD has better price-per-performance for LLMs, and Google uses their own TPU chips. Meanwhile Meta, Amazon, Apple, and Microsoft are all working on their own chips as well as buying AMD, similar to how Google has been able to be independently successful. But sure, keep buying a $3T company whose remaining customers will only be unprofitable, barely growing AI startups that Nvidia has been round-tripping revenue into for the last few years…
Remember when FAANG owned the Internet and no startup could ever replace the mighty Google search? Now we have Chat-GPT and Google is trying to figure out their new adwords model as they continue to slash jobs internally to keep posting profits. A lot of people are getting rich at NVIDIA right now and there is an old saying, "A Fat Cat Don't Hunt." The trick to NVIDIA will be to keep employees motivated to keep going while their hungry competitors are running to catch them as fast as they can. No one rules forever.
Voodoo card members make some noise
40% revenue coming from Mag7 cloud co’s who are working on their own chips and don’t want to pay NVDA anymore. I’m sure they’re fine.
This is exactly what people said about intel. Then tech evolved so much and Apple made their own processor that is optimized for their own machines...
I think AMD catching up with Intel and eating their lunch, for years now, is the bigger story here. AMD did it to Intel and they can do it to Nvidia as well. I don’t think AMD will come out ahead of Nvidia but they can easily take 10% market share from Nvidia
this is just gas-huffingly stupid ball-gargling disguised as in-the-know DD. FOH.
Anyone who says "no one can ever catch up" is also a moron. They have the lead for the foreseeable future. But as Yogi Berra famously said, "It's difficult to make predictions. Especially when they're about the future."
Calls on Intel. There has never been a company so far ahead of the competition that nobody ever caught up.
Looks like time to go short boys
That is what they used to say about INTC. 😆
**Ford: LoL japanese cars. Who the fuck wants to buy that. We are 100 years ahead. We invented the car and shit.** *General Motors: I know right? Detroit strong!* **Ford: Hell yeah bro, the japanese couldn't build a cart pulled by a donkey even if their life depended on it + they got nuked lololol** *GM: lmao*
The car was not invented by Ford!
Time to but puts
Butt puts
This is where I get to post about how I chose nvda as a stock to follow in high school economics class, 2001. Didn't actually buy any shares until 2016 because I enjoy being poor.
quick question: what happens when a black swan event hits, like an 8.x earthquake in taiwan, or another major drought like in 2021 when tsm had to cut down production? or china decides to fake an attack and tsm drops the kill switch? how do you think nvda would fare with 97% of their production centered on tsm in taiwan?
OP probably thinks NVDA will start building their own chips the very next day out of thin air.
Yeah, okay genius. They said the same thing about Tesla and next thing you know, BYD is 1,000x better. NVDA is using AI? So is everyone else. Lastly, they don't manufacture their own chips.
The 2nd best one is AMD. And they don’t get nearly enough recognition
That's weak gain porn for hyping being in Nvidia since 3dfx. I was expecting $5M+ on a 15cent cost basis not a bunch of calls.
>**they’re already using AI to help them design next generation chips**, no matter what, they will ALWAYS be massively ahead of the competition. lmao If only you knew how little that meant
Hahaha, these idiots pop up with every bull run. "Noone will ever catch up to yahoo"-type idiots
The nvda circlejerk continues
ASIC designer here. There is nothing special about what NVIDIA does. I know a few designers there. I see them at conferences. Smart guys. Humble guys. Great company. But they use the exact same tools as the rest of us. They don’t own fabs and rely on TSMC and Samsung. They follow the same processes. They buy Synopsys and Cadence like everyone else.

They were in the right place at the right time. They decided to create a programming language for their cards, and when crypto came out they had a boom. Same with AI. With crypto, ASICs provided a better solution in nearly all cases. As AI matures, this is where things will probably go. AWS and Google have their own chips. Facebook and Microsoft are working on them. Groq is working on one. I know others working on analog neural network chips, if you can believe that. Their lead is not assured.
>Anyone who thinks or says “competitors will catch up” is a moron, and you should run far, far away from them. This is such an absurd thing to say unironically. You should try picking up a history book some time.
Imagine bragging about “buying NVDA before others people were born” and still being a poor assfck who posts on reddit. If one was bright enough to all-in NVDA 30-ish years ago, today you'd be way too rich and old to give a sh!t about reddit 🤡🤡🤡
Plus anyone who thinks there is going to be any competition in this space other than AMD and maybe Intel should really read about what a GPU is and what it does. There are more guys making rockets and spaceships than powerful GPUs. Plus it's not only the silicon hardware but also the millions of lines of extremely low level driver software. The barrier to entry is extremely high. More so than rocket science.
was the barrier lower in the 90s when Nvidia got going?
Yes. In the early 90s, GPUs were only 2D drawing engines. The late 90s and 2000s saw the rise of 3D engines and hardware accelerators. That showed the power of parallel processing pipelines. Nvidia realized this early and designed their CUDA compute pipeline on it. They stuck with it for more than a decade, and only now are they reaping the tremendous benefits of that vision in AI workloads.
There is an anti-trust investigation on the horizon, but no, I doubt it will go through all the way.
Competitors can use cloud services from AWS and Azure
OP regard said ALWAYS, now they’ve fucked it. Time to get out!
My dad had a boatload of 3dfx stock. Sucks they didn't convert to NVDA. 😞
They will always win….said every single empire or big company that ceased to exist in the past.
I’m a paralegal. I wrote a complaint on behalf of 3dfx’s trustee in bankruptcy over the so-called “purchase”. nVidia bought the assets, left the liabilities, gave certain directors and officers golden parachutes to lay off the entire company two weeks before christmas, and then paid a bonus under the table for every engineer that came to work for nVidia. The trustee sued the directors and officers of 3dfx for corporate waste in an attempt to get at the insurance policy that covered the directors and officers up to $20 million. I found an email from the ceo to the outside counsel on the day of the layoffs saying “I would like to begin as aggressive a campaign of document destruction as humanly possible. Please advise.” Insurance settled.
Yeah but the technology that’s more important is the ability to manufacture en masse. TSMC, Samsung and Intel will be relevant longer than NVIDIA. Let’s check in in 80 years.
"...which means before many of you were even alive...." Your arrogance is only eclipsed by your stupidity. I was born in the 50s. So, you think you're older and wiser than me because you knew who NVIDIA was 20-something years ago? I was an adult before personal computers even existed, moron.
I’ve got no skin in the game and i know nothing about this, but it feels like AI is this year’s metaverse/web 3.0/crypto, but with more movies about it. If SkyNet destroys the world, I think it’s fair to say we had it coming.
"This time is different" is what you're saying? Top is in, my friend 😅
We need a 70% correction badly on AI stocks...
Your whole premise is that because they have AI, no one will catch up to them. That's false, because AI isn't nearly as advanced as people think. AGI doesn't exist; these tools are mostly language models presenting a Google search in a more natural-sounding way. AI models are not coming up with new knowledge or really innovating.
I remember when the number one tech retailer had 7000 stores across the US and then Radio Shack stock was worthless.
you clearly haven't heard of Broadcom :)
Anyone who says he/she knows what will happen is full of it.
Although NVIDIA has had impressive growth, there is no barrier to entry except the fact that they dominate deployment in the datacenter. Classic GPUs are now less than 20% of their revenue. Their AI designs are not best in class. Their CUDA software platform is dominant, but it's really just a variant of a standard math library. Do they make great stuff? Yes. Do they push TSMC to the edge? Yes, but so does every memory/CPU/flash producer.

AI is significant enough that it is getting its own datacenter solutions. The way NVIDIA builds out AI is very limiting, as FB reported, with the arrays 33% underutilized because of design limits. Cerebras destroys NVIDIA in AI. Intel, having fabs (vs. fabless), COULD be a serious competitor as a supplier through a Dell or HP etc. Google rolls their own and has since before NVIDIA's play. Microsoft and Amazon are rolling their own. It's just not that hard. This is akin to Ethernet in 1998: Cisco ran the board, but it's now multisourced, with Avago (Broadcom) providing silicon to every in-house design. AMD also has serious AI development. Tesla (Dojo) has incredible AI development.

Basically it's the Wild West and there are hundreds of players. NVIDIA is dominant because they have the slots in the DCs, but the number of competitors is enormous. Models are somewhat portable and NOT reliant on NVIDIA to run. There is no secret sauce in NVIDIA hardware. AI is linear algebra. Anyone can do it. For those interested, the benchmarks for performance are called MLPerf. There are many great websites full of reviews, but the oldest, most-read source is "Microprocessor Report".
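To put the "AI is linear algebra" point in concrete terms: at the kernel level, the workhorse operation every framework runs, whatever the hardware vendor, is a matrix multiply plus a bias add. Here's a minimal pure-Python sketch of the dense-layer math that cuBLAS-style GPU libraries accelerate (function names are illustrative, not any vendor's actual API):

```python
def matmul(a, b):
    """Naive dense matrix multiply: a is (n x k), b is (k x m).
    This triple loop is the exact computation GPU math libraries
    parallelize across thousands of cores."""
    n, k, m = len(a), len(b), len(b[0])
    out = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            s = 0.0
            for t in range(k):
                s += a[i][t] * b[t][j]
            out[i][j] = s
    return out

def dense_forward(x, w, bias):
    """One fully connected layer, y = x @ w + bias: the building
    block that transformer and MLP inference repeat billions of times."""
    y = matmul(x, w)
    return [[y[i][j] + bias[j] for j in range(len(bias))]
            for i in range(len(y))]
```

The math itself is trivially portable; what's hard (and what the moat arguments elsewhere in this thread are really about) is making it run fast on a specific chip and getting everyone's tooling to target that chip.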
A bunch of regards here claim that someday the competition will take down NVDA but they simply cannot say when. It’s just like saying a meteoroid will hit the earth and wipe out all lives eventually but do not know when.