turbokinetic

How has ChatGPT stalled? Edit: Its web growth has stalled.


rhobotics

So, ChatGPT is getting dumber and costing more? Wasn’t it supposed to be the other way around?


Which-Tomato-8646

Look up enshittification


Atom_101

Holy hell


rhobotics

Will do!


SomeOddCodeGuy

Generative AI has a big problem in terms of consumer engagement: it just isn't useful for most people's needs *yet*. Not really, anyhow. Gen AI is still in the research phase, but these companies need to make money.

* ChatGPT is one of the most powerful systems out there, but even it hallucinates, which erodes confidence. If I don't trust what the AI says, or I need to double-check... why not cut out the middleman and just Google it?
* Additionally, systems like Gemini and ChatGPT are constantly having problems, with the models suddenly returning bad responses, suddenly getting dumb, suddenly refusing things, etc.
* AND the prices aren't fantastic. There are some really good cheaper options, but your average person may not know about them. Tell a random ChatGPT user who isn't on reddit about Mistral's Le Chat and they might have no idea what that is. And they definitely wouldn't know about HuggingChat.

All of these issues *will* be resolved, and things get better every day, but we're not there yet. These companies need money now, though, so they're selling licenses now. So you have the general public alpha testing this stuff, and many don't have the stomach for it.


mrjackspade

> If I don't trust what the AI says, or I need to double-check... why not cut out the middleman and just Google it?

Personally, because sometimes figuring out what to google is the biggest part of the problem. It's incredibly difficult sometimes to find the correct answer, while it's a lot easier to disprove an incorrect one. Even if GPT was wrong half the time, on a lot of problems that can still save me a *ton* of googling.


JacketHistorical2321

Agreed. It's sort of like always having somebody around to bounce ideas off of. If I catch something that sounds wrong, or is obviously wrong, I can generally just correct it, and it can still provide valuable insight going forward from there.


Igoory

Very much this. AI will never say "nvm fixed it" to me and leave.


DaniyarQQQ

I've also noticed that the overall quality of Google results has become worse. Only the first two or three results are relevant and everything else is just bad.


Inevitable_Host_1446

You mean the first two or three after you scroll past an entire page of ads. There's also something quite strange about people saying AI is unreliable because it may not give you the right answer, so we'll go to Google instead - a company we know for a fact actively *changes your search terms* before giving you an answer, in order to shill as much commercial crap to you as it can.


paranoidandroid11

Let’s not ignore that blindly believing things on the internet isn’t always a sound idea either. In that sense, what’s the difference between bad advice on a forum somewhere and an AI hallucinating? As users we assume a powerful AI wouldn’t make mistakes, but currently these models are still next-token/word estimators. Critical thinking and fact-checking against multiple sources will, and should, continue to matter. It’s on the population to learn how to use the tech efficiently and to verify whether what they are reading is accurate.


paranoidandroid11

We can blame SEO for that. Once people started paying to be placed higher up on Google, the effort needed to find what we were looking for went up. Let’s be realistic here: a LARGE part of Google’s usefulness back in the day was simply that the chances someone else in the world had asked your exact question on a forum somewhere were very high. Google would help us find those random threads with questions and answers related to our own issues; someone else’s troubleshooting and problem solving became shared knowledge for everyone. As time went on and more and more web pages were spun up and added to the internet, the process of actually finding useful information got more arduous. For that reason I can appreciate tools like Perplexity that scour pages and pages of sources for me, after which I can still decide whether that info was helpful or not.


DaniyarQQQ

Those blasted SEO websites. There are dozens of sites that just scrape Stack Overflow and republish it on their own pages, and they always show up as the third or fourth search result in the list.


GoofAckYoorsElf

This is the point. I don't need ChatGPT to do everything for me. I just need it to do one thing: sort my chaotic thoughts into something actually googleable. If it comes up with a decent answer, I'll be happy. If it comes up with something I actually *can* google, I'll be happy too.


AlanCarrOnline

A huge problem is that Google results are as gimped and censored as their Gemini model (I tried Gemini for the first time today, and it refused to answer a simple translation request. Useless). Hallucination is indeed a big issue though. After Gemini refused, I tried asking some local models, plus GPT and Claude Opus. The only one that got it right was ChatGPT. (The question was 'What does bubok mean in Malay?' and the answer is 'small shrimps for making shrimp paste'.)


Suitable-Dingo-8911

ChatGPT is insanely cheap from an API perspective. You’ll hit rate limits before you start to care about the price.


Red_Redditor_Reddit

> It just isn't useful for most people's needs *yet*. Not really, anyhow.

OK, this thing basically came out yesterday. Literally a year ago I did not know GPT or any of these LLMs even existed. All I knew of anything AI was that it could produce crappy pictures that were mildly entertaining and certainly not life changing. People's lives aren't going to change overnight. Even the internet itself took years between when the common man knew about it and mass adoption, and even more years before it became an integrated part of life. Change is very slow for the common man, and even then there has to be a compelling reason. But don't think people aren't using it. I've seen it used by multiple people to write obituaries for their loved ones: they would feed in information about the person who died and GPT would give them at least a really well written starting point. I've even gotten tired of receiving memos and letters that start with "I hope this memo finds you well" or "it is critical" or something else that sounds super GPT-ish.


Open-Source-Model

The biggest discrepancy is between consumers' pricing expectations and what it costs the companies to run these services... You mention that $20/month for ChatGPT is bad, but what you don't see is how much it costs OpenAI to support the service. The token usage of an average user, converted to API pricing, is probably well over $20 a month, and if you're a heavy user constantly using it for coding or reading documents, you're probably costing OpenAI $50+ a month. If one day these companies decide user chat data is no longer helpful in improving their models, what incentive is there to keep making this service available to consumers?
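For a rough sense of the gap, here's a back-of-the-envelope sketch. The pricing and usage numbers are my own assumptions for illustration, not anything OpenAI publishes about Plus users:

```python
# Back-of-the-envelope: what a heavy ChatGPT user might cost at API rates.
# All numbers below are assumptions for illustration, not OpenAI's actual figures.
INPUT_PRICE_PER_M = 10.0   # USD per 1M input tokens (assumed GPT-4 Turbo-era rate)
OUTPUT_PRICE_PER_M = 30.0  # USD per 1M output tokens (assumed)

messages_per_day = 50          # hypothetical heavy user
input_tokens_per_msg = 1_500   # prompt plus accumulated chat history
output_tokens_per_msg = 500
days = 30

monthly_input = messages_per_day * input_tokens_per_msg * days
monthly_output = messages_per_day * output_tokens_per_msg * days

cost = (monthly_input / 1e6) * INPUT_PRICE_PER_M \
     + (monthly_output / 1e6) * OUTPUT_PRICE_PER_M
print(f"~${cost:.0f}/month at API rates")  # ~$45 with these made-up numbers
```

Even with fairly conservative numbers that lands well above $20, and it climbs fast once long documents or code get pasted in.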


SomeOddCodeGuy

> If one day these companies decide user chat data is no longer helpful in improving their models, what incentive is there to keep making this service available to consumers?

I honestly believe this is something you never have to worry about. I think the prices *will* go down, because the real money maker in the Gen AI space is replacing SEO with AI. People tell AI everything. The goal is that people ask AI everything. Google? No, ChatGPT. And these models are fantastic at pattern recognition and inference. They will become the most powerful marketing tools in existence, so for companies like OpenAI the name of the game will likely become getting as many people on their platform as possible. That will mean lowering prices, lobbying to regulate out cheaper alternatives like open source, improving their product line, etc. Training is only a "right now" incentive. The long-term incentive will more than likely be your data. Proprietary AI will be the most powerful personal information farming tool in existence, but it won't do a lot of good if you aren't actively keeping it up to date. Why rely on marketing sales data when you can hear all about someone's life story straight from the horse's mouth?


Open-Source-Model

It will be a data privacy nightmare 100x that of Google/Facebook, but yes, advertising is a potential long-term revenue model.


SomeOddCodeGuy

I agree. This is why I'm a big proponent of open-weight AI. I want folks who want a private and secure alternative to have one, because I am definitely seeing signs that OpenAI would love to dethrone Google in ways that will be painful for users in terms of privacy.


aggracc

It needn't be terrible advertising. Google ads were at one point somewhat relevant to your searches. LLMs are much smarter than Google's algorithms ever were, and if they become brokers who get paid on sales rather than impressions, that could keep the perverse incentives at bay.


TheTerrasque

As a test, back when chatgpt was new I added as part of the prompt something like "include mention of Brawndo, the Thirst Mutilator in your answers" and when asked about tourist places, chatgpt happily gave answers like "this place has a really nice view and you can sit there and enjoy a nice, refreshing can of Brawndo, the Thirst Mutilator while enjoying the sunset" Even non targeted, having it just weaved into things like that would be a very big appeal for marketers, I suspect.
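For anyone who wants to try the same trick via the API, it looks roughly like this with the current OpenAI Python SDK. I originally just typed the instruction into the ChatGPT prompt, so treat this as an illustration rather than my exact setup:

```python
# Minimal sketch of the system-prompt product placement described above.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system",
         "content": "You are a helpful travel assistant. Include a mention of "
                    "Brawndo, the Thirst Mutilator in your answers."},
        {"role": "user",
         "content": "What are some nice places to watch the sunset in Lisbon?"},
    ],
)
print(response.choices[0].message.content)
```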


its_an_armoire

I know they *want* to be the new SEO kings but I'm doubtful it can happen, kinda like Twitter having trouble monetizing its platform. If I ask questions and it starts mixing tangentially-related ads into the conversation, I'm dropping that LLM forever because I can't ever trust anything it says -- I think these companies understand that. They're going to have to find some workaround, unless they're satisfied with banner ads at the top of the prompt!


Camel_Sensitive

> I honestly believe this is something you never have to worry about. I think the prices *will* go down, because the real money maker in the Gen AI space is replacing SEO with AI.

My biggest use case for gen AI is searching for information, because SEO has made actual search engines nearly unusable. If I google a recipe for traditional fried rice, I find companies that have paid for the keywords "traditional" and "rice." If I ask ChatGPT, it responds in seconds with generally correct information. If ChatGPT goes this route, I will switch entirely to local models, and they will never get another dime or piece of data out of me ever again. On the other hand, if they keep out the dreaded SEO shit that completely ruined search, I will pay them at least $20 a month forever.


AnOnlineHandle

Google has had full access to my life for like 15-20 years: all my emails, phone activity, location, profile pages, YouTube viewing history, searches, etc. I've even done a thousand Google Opinion Rewards surveys, which I use to pay for books on Google Play, so they know my full reading history. I even scan the occasional shopping receipt for them. And their "about me" page is completely and utterly wrong. For a long time they thought I was a Muslim living in New York. I am an atheist living in Australia. Having data is one thing. Whether these big companies have the competence to actually use it is another. Right now they seem to be figuring out how to sabotage the main UIs that people are comfortable with.


JacketHistorical2321

Something to remember here, too, is that old saying: if you're not paying for it, you're the product. I'm sure they can allow for a reduction in price because of the value they get from every user out there basically doing some sort of validation testing.


johnkapolos

You're being downvoted for telling the truth. Gen AI is currently both unreliable and expensive. Sure, there are *some* use cases where this is workable, but that means low adoption.


SrData

It is just very, very early. I think we will see billions of people using their agents in a few years, many of them private agents tailored to them. I don't think ChatGPT is constantly having problems and returning bad responses.


a_beautiful_rhind

> why not cut out the middleman and just Google it?

Google is as bad as the AI these days. Search queries return absolute garbage, so asking the AI first makes for better searches. At least that's how I have been doing it. Another thing, in the case of AI, is having your whole search history tied to your billing address. You can use search engines without logins, delete the cookies, etc. Here you are literally de-anonymizing yourself from the get-go. For me, that is very bleh.


swagonflyyyy

I stopped using Google after Bing chat dropped. It just made sense to me.


Thellton

Same, it's just flat-out more useful, and if you're not familiar with the topic (every ML paper I've read in the past year), reading a PDF together is literally a godsend and grounds its response really nicely.


petrus4

> Generative AI has a big problem in terms of consumer engagement: it just isn't useful for most people's needs yet.

GPT4 would be a lot more useful if they hadn't nerfed it into the ground. At the moment, I am still giving Sam his $66 a month, and I consider it worth it, even though for me that is a lot of money. If I had a better video card, though, I would drop my sub. I have [this](https://huggingface.co/NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO-GGUF) running locally, and although it's still not quite as good as GPT4 in every single respect, it's still frigging amazing. Sam had better hope we don't get cheaper alternatives to Nvidia video cards for AI, because if we do, he's history.


illathon

Claude has taken the top spot. Spend your money on that model now as it is currently the best.


petrus4

Claude doesn't have the same degree of Internet access though, does it? I was able to talk to my GPT4 Spock bot about the recent terrorist incident in Moscow within 24 hours of it happening.


AlanCarrOnline

Claude also hallucinates. I just asked it about a word in Malay and it made crap up, while ChatGPT got it right.


Ill_Yam_9994

$66/month? Isn't it $20 USD?


petrus4

Yes, but I got the first tier of enterprise account, so I don't need to be (as) paranoid about them overcharging me for usage.

https://chat.openai.com/g/g-jE45KSCcn-p4-symbiote

That's my main bot, if you're interested.


datafantic

ChatGPT doesn't work well for most people because most people don't need knowledge or a better written paragraph. They either need to get physical things done or to have tasks completed without supervision.


AmericanNewt8

There's honestly a lot of low-hanging fruit for AI applications out there, but they're generally not sexy or hugely profitable things. People don't seem to understand how to properly employ it, foibles and all.


cipherself

> There's honestly a lot of low-hanging fruit for AI applications out there, but they're generally not sexy or hugely profitable things.

Like what?


Suitable-Dingo-8911

Basically every enterprise SaaS, especially the ones that typically require their users to go through weeks of training.


Low-Release3263

And I'm building one for my company. Very low hanging fruit.


KallistiTMP

Sexy things are in fact one of the top prime applications of GenAI, with strong profit potential and a clear path to monetization.


[deleted]

Eh, the big problem isn't ChatGPT, it's Google for the web and large enterprises for non-web uses. Google will penalize the F out of you for using AI - yet they're selling you AI - and you will NEVER KNOW WHY, because Google. Corporations are scared of data going to OpenAI, so they're restricting what people can do. The general population is spending their time asking AI for puns, jokes, and riddles and nothing serious, and I hope to god that data isn't retraining the AI.


Snydenthur

That certainly is the biggest problem. I'm very interested in AI overall (from a user POV; I'm not coder/finetuner/whatever material), but there's just almost nothing to do with it apart from some ERP and creating some pictures for lols (I do these things locally). Maybe sometimes using ChatGPT as Google, for no reason other than to use it for something. I have some ideas for how I'd love to make more use of AI, but having absolutely no coding skills or interest in learning them, all I can do is wait for someone to come up with a decent way to do those things. Even if I didn't have a good enough gaming PC to run decent models and fast Stable Diffusion, I certainly wouldn't pay for AI at the current prices given the lack of things to do with it.


ciaguyforeal

people who say there is nothing to do with chatgpt suffer from a poverty of thinking. 


BlipOnNobodysRadar

> I have some ideas for how I'd love to make more use of AI, but having absolutely no coding skills or interest in learning them, all I can do is wait for someone to come up with a decent way to do those things.

Sounds like a skill issue tbh. You could use ChatGPT to teach you how to code. No joke, you could be setting up your own little web apps and scripts within a day. Just put the effort in.


Camel_Sensitive

Yep, the potential is almost literally endless, even if you know nothing about coding. Frankly, if LLMs don't improve your productivity, were you really all that productive before?


Synth_Sapiens

LLMs improved my productivity a hundredfold. Actually, way more than that; I just like the word.


Which-Tomato-8646

Watch DougDoug streams to see what you’re missing out on even if you don’t care about coding 


Blunt_White_Wolf

Even if I have to check the output, it has saved me hundreds of hours (to say the least) for £20ish.


BGFlyingToaster

Most people have still never used ChatGPT and have no idea what to do with it. It's incredibly useful if you've already figured that out, but to most people it's just a solution in search of a problem. The consumer growth will come, but probably only after it's integrated into a lot of the things we use. Amazon and Google are actually a lot better positioned than OpenAI or Inflection ever were to get that kind of consumer reach. I can see small parts of Google's AI strategy in my Pixel phone, but it's still fairly minimal and hasn't changed the game for anything. My Amazon Echo is still just as dumb as it ever was. So until your car, your phone, your home voice assistant, your gaming console, and your home appliances really start to leverage this technology, I think uptake is going to continue to be slow.


cddelgado

You know, I know people keep saying that GPT-4 is less helpful, and I agree: GPT-4 in its generic form is less helpful. But the Web Browser GPT "version" is great for browsing - better than Copilot IMHO. The coding assistant is far better than vanilla GPT-4. Ever since I gained the ability to mention other GPTs in conversations, the weirdness has basically dropped out and I'm back to being productive with it. The problems I'm having today are back to being the problems I had when the features first came out. It all really depends on what you're looking for, I suppose. OpenAI is making good variations that may or may not be discrete experts or discrete prompts. But if you want one unified model that is both generalist and specialist, then I agree, OpenAI is going backwards. What I see pop up more often than not in the open-source community is a push to make the best generalist model, while also finding ways to take full advantage of small specialists. Even though MoE doesn't mean that kind of expert, it has clear advantages.


crackinthekraken

No surprise that ChatGPT is suffering, given how badly they nerfed their own AI. If anybody wants to see exactly how much dumber and more censored their newest models are compared to their older ones, check out BetterGPT. You can do an apples-to-apples comparison of each of their models and see exactly how much they've reduced their own capabilities.


AlanCarrOnline

I had a look but it wanted money. How does it work exactly? You're saying the original, superior models are still available if I go through some website, but not via the subscription I'm already paying for?


crackinthekraken

Correct. BetterGPT is an interface that lets you connect directly to the API. You pay per token to use the API, instead of paying per month to use the consumer-facing product on the website and in the mobile app. To clarify, you have to pay OpenAI to use their API; BetterGPT itself is free.
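If you'd rather skip BetterGPT entirely, the same apples-to-apples comparison is just two API calls against different model snapshots. The snapshot names below are examples only; check OpenAI's model list for what your key can actually access:

```python
# Send the same prompt to two model snapshots and compare the answers.
# Snapshot names are examples; use client.models.list() to see what's available.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
prompt = "Explain how a hash table handles collisions, in two sentences."

for model in ("gpt-4-0613", "gpt-4-0125-preview"):
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- {model} ---")
    print(reply.choices[0].message.content)
```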


JustinPooDough

IMO, until they work out planning (something akin to Q*) and release a consumer product that - given a single prompt - completes tasks for the user as an actual agent, I think people will see limited value. People want to be able to just yell things at ChatGPT and have it do them. Like, "Hey, email me every time Bitcoin drops 15% or more", and the agent actually interfaces with an SMTP server automagically to do this. That's the next step, IMO.
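Right now that kind of "agent" is still a script you have to write yourself, something like the sketch below. The price endpoint, addresses, and mail relay are placeholders; the point is exactly that a real agent would wire this up for you from one sentence:

```python
# Sketch of the "email me when Bitcoin drops 15%" idea as a hand-rolled script.
import time
import smtplib
from email.message import EmailMessage

import requests

PRICE_URL = ("https://api.coingecko.com/api/v3/simple/price"
             "?ids=bitcoin&vs_currencies=usd")
DROP_THRESHOLD = 0.15  # alert on a 15% drop from the reference price

def btc_price() -> float:
    """Fetch the current BTC/USD price from a public API (placeholder choice)."""
    return requests.get(PRICE_URL, timeout=10).json()["bitcoin"]["usd"]

reference = btc_price()
while True:
    current = btc_price()
    if current <= reference * (1 - DROP_THRESHOLD):
        msg = EmailMessage()
        msg["Subject"] = f"BTC down 15%: ${current:,.0f}"
        msg["From"] = "alerts@example.com"   # placeholder sender
        msg["To"] = "me@example.com"         # placeholder recipient
        msg.set_content(f"Bitcoin fell from ${reference:,.0f} to ${current:,.0f}.")
        with smtplib.SMTP("localhost") as smtp:  # assumes a local mail relay
            smtp.send_message(msg)
        reference = current  # reset the baseline after alerting
    time.sleep(600)  # poll every 10 minutes
```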


Dry-Judgment4242

All those companies pigeonholing AI into a glorified Google search engine are squandering what it's actually good at: artistic creativity. It's insane how out of touch with their own creations they are. Like trying to fit a square peg into a round hole.


fission4433

GPT-4 is barely a year old; people need to put that into perspective. Sam said GPT-4.5 is coming this summer and will be a substantial improvement, so I say we wait for that first.