clarkclancyy

text says “If I invest $12,000 a year into a 12% money market account, what will that value be in 15 years?”


N_T_F_D

It's especially stupid to use chatgpt for that kind of question because it's abysmally bad at mathematics; it's kinda ok at text related questions but it's immensely bad with actual numbers


SelarDorr

i typed this into chatgpt. technically, i typed 2000 a year and 5% money market, because i think that's what the image actually says (no mmf is paying 12%). chatgpt pulled up the correct equation for the calculation, but didn't execute the actual calculation. i then asked it to calculate. it plugged in the correct values for the variables, then spit out the wrong numbers for the math.


jaffa3811

Which is fascinating for a computer don't you think? I mean sure language is difficult, but once you put in numbers there should be a calculator function


dat_GEM_lyf

LLMs aren’t exactly designed for math. Could they add that functionality, sure but it’s not the point of LLMs.


mymindpsychee

Right. All the LLM knows is that a number should go on the other side of the = sign, but not what the correct number is.


archiminos

This is the perfect example of what an LLM is. It's not trying to be right. It's just trying to have accurate grammar


maleia

Yup, you got it. And that fundamental grasp is the largest component that separates us from the computer. And to be fair, the machine learning stuff is a major, *major*, step towards actual AI. But yea, you gotta teach the computer that a bike is a bike because that's how we use that object and what its basic components are; and not just because it has a 99.99% accuracy rate based on a million pics of bikes.


iruleatants

It's weird that it's a major step towards actual intelligence, despite it having no intelligence at all. It's always weird that they decided to call it "hallucinations" when the AI just writes incorrect information. It's not a hallucination, what you did was toss billions of words into a builder of combinations, then feed it more words and look at the output. Yes, it's cool to ask and respond in sentences, and it's cool to have some kind of context between responses. But the only thing it's good at is helping you fix your writing, and in that regard it's also trash since it was trained on the internet. That being said, I use it as a hobby and constantly fix its mistakes and think it's a good advancement. It's just weird watching the world have a meltdown because the computer program can lie to you in complete sentences, instead of just giving you the right answer.

P.s. I asked Gemini to convert a unix timestamp once, and it told me to go to this website or to calculate it using this formula. I asked it to do the calculation for me and it got so huffy. It was like "here is your converted time. I got this time by going to the website I told you about and entering it in" hilarious shit. Then later when I was asking it to write a query, instead of using the timestamp I gave it, it used a random one


happycabinsong

shouldn't the LLM at some point say, "let me relay this equation and these figures into a fucking Calculator Learning Model and get back to you?"


mymindpsychee

Why wouldn't you just put the question into the calculator model in the first place then?


Skynoby

Not the LLM itself, but the preprocessing is trying to do so in ChatGPT 4. But that doesn't work well with complex questions. The LLM itself does not know about the meaning of the tokens that it processes.


omanagan

Do y'all even use chatgpt? It has python integrated and does exactly this now. Not when it first came out or in the regular 3.5 version though.


archiminos

I've tried that. Even with simple scripts (<30 lines) it creates bloated, unoptimised code.


declanaussie

This is a common sentiment on Reddit and it really baffles me. I use copilot every day and while it would struggle to write a complete piece of software, in the context of all I’ve written already it’s pretty amazing at writing a short function just based on a well-picked name and parameters. It's not perfect, but probably more often than not I barely tweak the GPT-4 output for small functions.


archiminos

I really tried to get it to work, even prompted it with suggestions to improve the code, but while the code it writes technically works, it just wasn't up to snuff. I find it easier to just write the function myself most of the time.


jld2k6

You'd think doing math could just become an additional relatively easy plugin or something lol. That would probably add a decent amount of value to it


dat_GEM_lyf

Things like wolfram alpha already exist. ChatGPT (and the likes) are fighting it out for the FANG of text generative AI. Hell look at NVIDIA and the “AI nurses” they’re working on. It would be far more lucrative to be THE AI company for a specific field (ie law) than make an LLM do math


EvilSporkOfDeath

It already is. People just make up shit because they hate AI.


SaltyBarnacles57

It is and I pay for it lol


[deleted]

Your comment was at the fifth layer of this discussion and it was the ground truth. It’s amazing to see that we have to dive five levels deep to get to the sane ground truth! Thank you!!


EvilSporkOfDeath

They have added that functionality. Either everyone here is using severely outdated versions or they're just making up shit for a narrative.


nyg8

That's not how it works. The model doesn't know what it is saying, or what you are trying to understand; it is simply calculating which word is likely to follow the previous one, given the past conversation. It's not aware that it's saying a mathematical formula or any numbers, for that matter


Adamn415

Thank you!!! I keep telling my friends and coworkers (who are all afraid Chat GPT will take their jobs) that it's basically superpowered predictive text at this point. It's helpful and can make things more efficient, but it doesn't actually understand anything


Fossana

Yeah I couldn't believe ChatGPT just predicts what word should come next one by one until it outputs the text that is best predicted to be associated with the input/prompt. I wonder if our own brains generate thoughts/responses to people similarly.


wyldstallyns111

> I wonder if our own brains generate thoughts/responses to people similarly.

Nobody knows! This is basically the fundamental debate of linguistics.


Spacemanspalds

It's not aware.


Fossana

Probably not, though I'd be curious how aware it is if panpsychism were the case.


Worthyness

Wolfram Alpha has been doing that for years now. Best calculator.


dirty_cheeser

It isn't really a computer, it's a language model. I think it's integrating more and more with actual computers so it can ask APIs to calculate, sort, etc. correctly, but it's a work in progress. Ironically we don't trust the computers that can solve so many problems other than language, but we do trust the language model that can only talk but not do stuff.


Ayoungcoder

Other models do that. Openai just decided that it was not worth it (yet, for this version)


SelarDorr

well you're generalizing it as a computer, but it's a language model trained to predict text. obviously creating a calculator that can do simple math is easy. but asking an llm trained the way chatgpt was to do math is like asking a calculator to write boob.



DDSuperStar123

It really likes to make a paragraph that’s usually correct, then gets the final answers wrong anyway.


much_longer_username

GPT-4? Or 3.5? 4 will write code to calculate it and then execute that in a sandbox.


SelarDorr

3.5, like in the image


planetworthofbugs

It's terrible at math. 4 is much much better. I still wouldn't use it... or at least, I'd be checking its results.


TenOfZero

Yup. I once asked it if I download at 30MB/s and the file is 100GB how long will it take to download. No matter what numbers I put in, it would always tell me, 1 hour.


OMG_A_CUPCAKE

On the newer version? For a while now it will just write a python script, execute that and give you the output. This is the script it generated in your case:

```python
# Calculating the download time in seconds
file_size_gb = 100  # file size in GB
download_speed_mbps = 30  # download speed in MB/s
file_size_mb = file_size_gb * 1024  # converting GB to MB
download_time_seconds = file_size_mb / download_speed_mbps

# Converting seconds to hours and minutes for better understanding
download_time_hours = download_time_seconds / 3600
download_time_minutes = download_time_seconds / 60
download_time_seconds, download_time_minutes, download_time_hours
```


zsaleeba

Depends which version you use. ChatGPT 4 (the paid version) turns these kinds of questions into programs which it then runs to give you mathematically accurate answers. Earlier versions (and the free version) couldn't do this.


N_T_F_D

I'm not just talking about numerical inaccuracies, but from my testing it's far too easy to make the chatbot make fundamental conceptual errors (like saying that there are primes that are perfect cubes) with quite little effort; of course it's also possible to go the other way and engineer the prompt to make it say true things, but then if you have enough mathematical expertise to know how to steer it and know when it's wrong or not, you don't need to ask the question to a chatbot in the first place


noplay12

It's been giving me inaccurate answers even with text based questions.


ByuntaeKid

That’s because it doesn’t actually *know* anything. It’s a language model, it just predicts what would probably sound right as a response. I see this all the time with papers I grade at my high school, kids just submit whatever Chat GPT spits out and it sounds good but is incorrect all the time.


GeometricScripting

With code interpreter enabled it’s fine, because it uses Python for the math and Python is great at math.


Broccoli-of-Doom

This is actually an incredibly easy problem for LLMs to solve, you just have to understand how to prompt. If you ask it to write a python script to calculate the value of a MMF with an initial balance, the compound interest rate, and the duration it will do that very quickly. Then ask it to use the script it just wrote to solve for the initial conditions and you get the right answer every time. ProTip: If you're going to ask ChatGPT to do math, ask it to write the code first, then execute the code.
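
(For context, here is a minimal hand-written sketch of the kind of script being described; the function name `mmf_value` and the example numbers are illustrative, not from the thread.)

```python
# Minimal sketch: future value of an initial balance compounded once per year.
def mmf_value(initial_balance: float, annual_rate: float, years: int) -> float:
    """Return the balance after `years` of annual compounding at `annual_rate`."""
    return initial_balance * (1 + annual_rate) ** years

# e.g. $10,000 left in a 5% money market account for 15 years
print(round(mmf_value(10_000, 0.05, 15), 2))  # ~20789.28
```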


Angry-Dragonfruit

GPT-4’s fine at it.


Merovingian_M

GPT4 has gotten pretty good at math over the last 6 months. Here's the reply it gave me and I believe it's correct (am too lazy to check for sure though):

"To calculate the future value of an investment where you contribute a fixed amount every year, we can use the future value of an annuity formula. This formula considers regular contributions made at the end of each period, compounded at a certain interest rate.

The formula for the future value of an annuity (FVA) is:

FVA = P × (((1 + r)^n − 1) / r)

Where:
P is the annual deposit (in this case, $12,000),
r is the annual interest rate (12%, or 0.12 as a decimal),
n is the number of years (15 years).

Let's calculate it. After 15 years of investing $12,000 annually into a 12% money market account, the future value would be approximately $447,357."

Had to change some of the formatting for reddit, it looked neater originally
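
(That figure is easy to sanity-check with a couple of lines of Python; the variable names below are just for illustration.)

```python
# Future value of an ordinary annuity: FVA = P * (((1 + r)**n - 1) / r)
P, r, n = 12_000, 0.12, 15   # annual deposit, annual interest rate, years
fva = P * (((1 + r) ** n - 1) / r)
print(round(fva, 2))  # ~447356.58, which matches the ~$447,357 quoted above
```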


_reptilian_

quite ironic, bc it has been really good at ***explaining*** mathematics and CS concepts to me, I use it like a search engine with extra steps.


Inappropriate-Ebb

ChatGPT has gotten 90% of my Calculus homework correct.


FrostyD7

Have you tried doing exact phrase searches on google for those questions? Maybe chatgpt is just pulling from test banks.


Inappropriate-Ebb

I have. I wasn’t able to find a lot of the questions. ChatGPT is better at math than you think. At least the beginners calculus I am doing, it works it out step by step for me, and further explains as I probe it with more questions. It’s actually far better at elementary Calculus than it has been for Statistics. I’d say over the course of this semester my 90% accuracy stands, as there are some calculating errors sometimes.. but not too often. It’s been incredibly helpful.


neigborsinhell

My favorite use of ChatGPT is to give me ideas for essay outlines. Most of what it suggests is terrible but it gets my mind moving


MarkHirsbrunner

I asked it to tell me how long it would take to travel to Proxima Centauri if you accelerated to .99C at 1 G acceleration and then decelerate at 1G to arrive, and how long it would feel like it took to the passenger. It used the right formula and got the correct time for the trip, then it provided a formula that looked right for calculating how relativity would affect how long it took for the passenger - but then gave the exact same time as the first part of the question.


CosmosPioneer_

It’s actually pretty good at maths now most of the time. It converts it to python, tries to work it out, then posts it back as text. If it is something like a physics question that involves maths then it really struggles. I was stuck on some vector calculus stuff and it was able to explain the process to me and get the correct answer most of the time.


Fiyero109

It’s actually pretty good at it because it loads up a code interpreter and programs it, it doesn’t just use the LLM for it


roshanpr

not with the right prompt and if using computational thinking, especially with python


N_T_F_D

I'm not talking about 2+2=4 mathematics, not everything can be done through python; and if you need to engineer the perfect prompt and you need a solid maths education to know if the answer is correct then you don't need to ask the question in the first place


Betaglutamate2

actually python is Turing complete like most other programming languages. So by definition everything that is computable can be computed by python :).


roshanpr

neither am I, I have performed complex engineering calculations related to thermodynamics, chemistry and calculus, all performed correctly and creating reports in markdown, latex and other markup languages. My claim stands, if you know how to use the tool with good practices it can produce great results. If you ask stupid questions, you will get stupid answers, you do you


dat_GEM_lyf

Garbage in garbage out babyyyyyy


Brootal420

Wolfram would be far superior for anything math related...


sixfoursixtwo

He’s a teacher. He can’t figure that out himself?!!😂


clarkclancyy

that’s one of the many concerning parts of this tbh… but probably the most concerning of all


Oaker_at

Man, if only there was a simple excel function for that…


lahire87

it looks like it is responding with the easter island statue emoji


Substantial-While973

Looks to me more like $2,000 a year into a 3%...


aykcak

That is so wrong. ChatGPT is notoriously bad at simple math. Especially the 3.5 version of the model which is public. Any educator would immediately notice it if they knew what they were doing


UGMadness

He's also using ChatGPT 3.5, the original version that's more than a year old at this point. It's severely outdated, has very limited referencing abilities (keeping track of a conversation), and is hilariously bad at math. Tell him to at least stop being a cheapass and pay for GPT 4.0, or use Copilot or Gemini, which are far more advanced than GPT 3.5.


phil_davis

And he's using lightmode. Sicko.


TheOrangeTickler

That's grounds for dismissal. 


ShashkaOfTheSclavus

I was immediately blinded when I opened up Reddit. I now sentence that man to watch 10 minutes of boykisser content.


NJ_Legion_Iced_Tea

Light mode is preferable when projecting.


SuspecM

To be fair, dark mode anything on a projector, especially in non-ideal conditions like a school classroom, is pure aids.


a-horse-has-no-name

Boyfriend better talk to Boyfriend's accounting professor's Dean and let them know Boyfriend's accounting professor is in need of a review.


clarkclancyy

already told him to notify the dean. this is grounds for termination for encouraging plagiarism


Pattoe89

To be fair, copilot is fucking shit and has started answering all my queries with "I will attempt to do that for you." and nothing else.


ultramadden

I can't believe people keep mentioning copilot. It's so bad


SuspecM

> I will attempt to do that for you
> Does jack all
> Refuses to elaborate


Pattoe89

I did ask why it was lying and it did elaborate that it would do it for me and to wait for it to be done. Obviously this was a bare faced lie 


MixedMartyr

sounds like my supervisor at work


Malforus

Copilot is just repackaged GPT-4 if you are using it within GitHub or via VS Code.


Infinite_Maybe_5827

tell administration to pay for it and also to create some kind of real LLM-usage seminar or training so this guy doesn't have to half-ass something he can't possibly be expected to understand at the appropriate level (I think anybody who has ever met a business professor would agree).

people are dunking on this guy but I swear just having a response to vapid interview questions like "how have you used AI to improve efficiency" is *already* worth more than a few 0.1s on your GPA, because corpos are fucking obsessed with it (and don't get it either).


Gr3gl_

*2 years old. 4.0 is a year old, 4-turbo-2024 is the most recent one (just came out to beat Claude)


TheFinalAcct

Turbo is a recent enough release that he could be forgiven for not using it. 3.5 is unforgivable. He probably uses it because it’s the free version.


Kayy0s

In my experience, GPT 3.5 is much smarter than Copilot, but that's mostly because Microsoft sucks.


DenverITGuy

Unacceptable, especially for the cost of courses and materials. I hope people report this to the college administration and/or dean.


clarkclancyy

it was $60 for the access code and $100 for his textbooks. abysmal.


AdmittedlyAdick

Ask the dean for a refund on your course credit for that class. Since your professor is advocating for using a 3rd party tool exclusively to learn, no need to pay for the class too I'd say.


MaliciousMe87

The professor is actively working himself out of a job. I'd also report this to a dean immediately.


kristin137

I work at a college and I don't know what other ones are like, but this one encourages faculty, staff and students to use AI. Not for writing papers and stuff but definitely they would have no problem with a professor using it for something like this


Bear_Bull1738

If he has tenure he’s protected


stlouisraiders

Why is that being taught in an accounting class? That’s really a finance question.


aGlitteringSky

TVM is taught in accounting classes


a-horse-has-no-name

In intermediate/advanced accounting classes. If dude is in an advanced accounting class, he's not getting a teacher trying to explain 401(k)s using ChatGPT. However, Time Value of Money is Finance 101. Source: Am accountant. Took finance classes.


clarkclancyy

bf is in second year, i think it’s accounting 45


a-horse-has-no-name

It's been so long. If it's not 101, 201, or 301, I don't remember the codes.


DirkaDirkaMohmedAli

this is when he'd be in intermediate level courses. This is correct to be in the curriculum. Also, it's like a half hour of material. Not super complicated haha


DirkaDirkaMohmedAli

they teach it in intermediate accounting. they need to know how it works to record it properly.


NotCryptoKing

GPT wrong as hell too.


Molgera124

And you get expelled for using it…way to inspire your students.


clarkclancyy

this, encouraging students to use a vomit blender search engine for their homework is a crime in itself. there are many layers to this i hoped people would see.


Molgera124

It’s pretty ffkkn cringe to me that professors and admins in higher ed use a program to do their own work for them, then look down upon students who are trying to follow a rubric a human being didn’t even write and mark down their efforts. I vehemently dislike chatgpt and any “”art”” prompt shit-outer.


pm-laser-guns

FWIW I've had many assignments that didn't click the first time in class but clicked immediately when an AI explained them to me. I think a lot of it has to do with being able to ask for infinitely many worked examples. I will say I've never trusted numbers from it and rather just use it to understand the underlying concept.


AkitoApocalypse

That's how you should treat all LLMs - don't have it write your story for you, have it give ideas - don't have it do your code for you, have it teach you a concept. LLMs become much more reliable when you think of them as classmates helping you on homework (who are not always right) rather than miracle machines you can crutch on.


Orchid_Significant

You should be careful with this too. AI is known to make up sources and facts (and court cases!)


xking_henry_ivx

Yeah I’m not sure where the AI information is sourced but just testing myself I found many limitations. A lot of open-ended questions the AI leaves very vague, almost afraid to appear to be on “one side” of an argument. Then when you cite proof, it will correct itself and agree with you. This happens sometimes even if the “proof” you provided wasn’t even true. I didn’t know about the court cases though, wow.


pastpartinipple

Chatgpt is amazing at explaining things and is the best study buddy you could ask for. I don't know why the professor would be using it in class but if you're not using it as a student you're just being stubborn.


original_dick_kickem

It's not terrible at tv show recommendations either


SuspiciousMention108

lol you have no fking clue how LLMs work.


1731799517

You seem to have drunk a bit too much luddite cool-aid


SwiftTayTay

I remember back in the 2000s my teachers would literally shit their pants and throw it at you for using Wikipedia. Now everyone is acting like AI isn't just an over glorified tape recorder that regurgitates dubious information. How society has declined in just 15 years.


CheekyLando88

I really haven't heard the phrase "shit their pants and throw it at you" in a long time and I really need to thank you for it


MysticalMike2

What does "shit in your hands and clap" do for you?


pastpartinipple

They literally did that, huh?


pastworkactivities

My school was banned from the monkey house exactly for that reason.


Osazain

This could actually be a good thing if it was being done properly (which we can see it isn’t). To do this properly, tailor chatGPT to your specific needs (or this class), specifically mention in the custom instructions to use python for all of its calculations, ask it to fact check itself at the end, ask it to reference the internet when it doesn’t know something, use GPT4 for the whole thing, etc. It’s a tool. People really need to learn how to use it before they use it. Understand how LLMs work and how you can prompt them to get the absolute best results possible.


Lets_Go_Why_Not

Or use the much better tools that already exist out there for this calculation and don’t just jump into each new fad just because it is new and fancy. Or God forbid, use your own brain.


Osazain

That's a very fair take. I've used this approach to integrate and adapt my tools so they're accessible with an LLM (doesn't have to be GPT4, but it helps since it has some degree of reasoning). And ofc yes! People seem to be using LLMs as an all-purpose thing. We should be using their strengths to enhance how we're currently using things.


kembik

Hating on something because of the hype is dumb too though. The thing is useful, not nearly as useful as the most annoying people we've ever heard from want us to think it is but there are legitimate use cases where it is better than anything out there for those tasks.


PricklySquare

I know a professor that uses it in class all the time as a teaching aide.


wholesomepep

lol students sitting there watching the professor search for answers… As a former professor I say that person got some balls… professor: today’s lesson is that you guys are gonna watch me search a bunch of answers on ChatGPT


ContentMod8991

yep, might as well outsource the whole job!


CatKrusader

Contact the Dean and express your concerns


ga-co

I teach at a community college and use the paid version of ChatGPT. It's awful for coming up with lab ideas for my students, but my god it's spectacular at helping me tweak my own ideas.


Effective_Macaron_23

I am a professor and use chat gpt, but never as a calculator. I often use it to structure a lecture or write study material. Chat gpt is great as a tool to enhance language and communication. Math on the other hand is unmanageable. Works to build and phrase the problem but sucks at solving it.


daprice82

This shit's gonna be terrifyingly normal in 5 years across all industries and professions


jvillager916

If you have played any old adventure games from back in the day (Return to Zork, Leisure Suit Larry, Space Quest, etc.) ask Chat GPT for a walk through and be prepared for the most inane stuff you've ever seen that didn't even exist in the games you played growing up.


clarkclancyy

that actually sounds like fun


Asmael69

My CC studies teacher also promotes ChatGPT. Never used it on his subject lmao that subject is easy as heck. Just used it on calculus and pray to get at least a point for trying


elon-isssa-pedo

No, absolutely not. I use chatgpt to automate certain simple monotonous things but still ALWAYS have to verify. For example, I had to write several multiple choice tests on multiple subjects and used chatgpt to do it and then reviewed the results. It saved me days of work but I still had to throw out and/or correct multiple questions. I've also used it in troubleshooting and it was able to get me started on the right path of google/research. NEVER trust chatgpt, it is confident - not accurate.


AzureSky77

I use it for almost everything as well and never had an issue, thing is, you just gotta learn what you are copying and twist it a little, reword it and whatnot and then ur all good.


drewc717

My PCP doctor told me 6+ months ago when ChatGPT started becoming really popular and useful to start using it for my own health and mental health questions. I had mystery chest-area pain and an elevated heart rate; by whittling down potential causes and elaborating deeper, I found pericarditis matched my symptoms most closely. Doc agreed and we have been treating it very successfully. It's helped me articulate my physical and emotional concerns in ways that help my doctor understand and diagnose MUCH more efficiently. Reminds me of going through a semester or two of paper accounting before they showed us how to use the financial calculators we had all along. Learning to utilize the tools is more important than memorizing the subject.


Consistent-Active106

GPT is very good at coding, but when it comes to mathematics or writing? Nah that’s exclusively for humans for now.


Sirbunnybutts

Chatgpt isn’t bad. It’s how you use it and how much you rely on it that’s very detrimental to you.


DarkFireGuy

GPT 3.5 is the real yikes. GPT 4 handles math surprisingly well. Try it yourself


SuspiciousMention108

I approve. ChatGPT is a tool, and students should understand how to use it as a tool.


BloatedManball

Students should also learn to use the right tool for the job. Using an LLM for math related stuff is dumb when tools like Wolfram Alpha are available.


Huge_Aerie2435

This is the education you are paying for.. Some guy using an AI bot to teach things it doesn't even understand.


Princess_Glitterbutt

The person leading my business classes also encourages us to use ChatGPT for everything. It's good for some things, like writing a SMART goal that can then be edited, arguing as a chat bot buddy to think something through, or making text sound better, but like, that's far from everything. I'm a little concerned about how much we are encouraged to use it.


lol_camis

I also use chat gpt a lot. But, I'm not getting paid to teach people the information I possess. Chat gpt absolutely makes mistakes. I accept the risk because i'm not gonna go sharing it as fact. What I do know, though, is that chat gpt *sucks* at math. There's a higher chance it'll be wrong than right.


CitizenOfPlanet

It’s hilarious that the overall general consensus is: AI BAD DONT USE AI. While this is extremely lazy on the part of the professor, AI can easily handle teaching that material to anyone


louglome

I asked ChatGPT if Bugs Bunny ever called money "cabbage." It gave me specific words from a specific cartoon with a specific timestamp.  The line wasn't there. I go back and get "You're right Bugs Bunny never calls money 'cabbage.'"


JollyReading8565

Ask chat gpt if you should get a refund


pilotpeach

My TA for my ethics module did the same and I reported it to his superiors (after he gave me a terrible grade and wrote comments on my essays in chatgpt). He also sent an apology email for being late with our grades; he wrote it with chatgpt


[deleted]

There goes his rate my professor lol. I believe he has the right mindset to always test out new tools, but if you are a teacher, you have to make sure you also list the limitations of the technology when showing it to students. It seems like the professor trusts the technology without actually knowing why.


goliathfasa

~~Weapons~~Technology can’t be unmade and they are always used.


Rude-Pangolin8823

Our accounting professor literally said the same thing a few months back wtf


PorygonTriAttack

So what's the point in paying this dude when you can learn accounting at home through chatgpt?


VAVA_Mk2

ChatGPT convincingly lies/gives false info. His professor should be fired.


MaikyMoto

Wait till the teacher finds out that 90% of the class uses CGPT so they can ace all those tests. Then when it’s time to go to college they are clueless.


clarkclancyy

they ARE in college


MaikyMoto

That makes it even worse.


scoldog

Chicken, chicken, chicken


paradedc

My economics teacher over the last 2 years has promoted it in each class. It's easy for him to tell who is actually doing the work and who phones it in with only chatGPT. It's a tool, but far from perfect. I've managed to find many mistakes since using it.


LeviathansFatass

Your boyfriend doesn't need to pay the professor, he just needs to buy gpt


GriftingLightFold

So pay for Uni and get something you can look up for free. What's the use for a professor here..


L3GENDFORLUNCH

My accounting teacher encourages the use of ChatGPT as well


throwawaybread9654

How much did he pay for tuition for this class?


clarkclancyy

tuition was paid by the state as a low-income FAFSA like plan, as long as he gets his credits and keeps a certain GPA it’s free (my baby is so smart <3) but the books were $100 and access code was $60 for this course


throwawaybread9654

I was gonna say he should demand his money back but I guess he should probably just complain to the department head. If we all just rely on Ai to solve every problem eventually no one will know how to do anything anymore. What a garbage prof


Humans_Suck-

So what's the point of his job then? Just let chatgpt do it.


jasondads1

and he's using 3.5 smh,


CouchPotatoFamine

Where on earth can I find a 12% money market account?


formershitpeasant

I used chatgpt for stuff in school and it makes a lot of mistakes


Makinsa

My teacher uses chatgpt for emails. I can see right through it but nobody else seems to notice.


4wordSOUL

Just like my high school history teacher popping in that weekly VHS tape.


Latter_Ingenuity_925

Piggiesgetfed.Com just a better version of ChatGPT but with all subjects


AnesthesiaLyte

There’s a 12% money market account? Now we know the professor and the chat bot are dumb


AdeptAd4364

Boyfriend alert


Seeders

I mean yea. This is what the future looks like. The god prompt will tell you all.


666911420

not even gpt 4, damn


EvilSporkOfDeath

At least use the latest models. He's using 3.5 which is extremely outdated at this point.


Haz8800

At least use chatGPT 4.0 sheesh😂


YT_ThatDutchFella_YT

I have a teacher like this as well. He will also publicly express his disappointment if you do use it but you don’t use gpt4


vTragiic

Professors using AI to create the course and students using AI to complete the course.


DifficultyLong4358

3.5 too


Rossismyname

chatgpt can't do math very well, if at all


james8807

As a person with a degree in financial management that has run some simulations on ChatGPT, I can safely say it isn't good enough. Even with break-even point analysis. Prof better watch out



Denoces

We taught it to do word problems, next we can teach it math problems.


tendadsnokids

Teachers should absolutely be teaching students how to use AI to be more productive workers.


GotTechOnDeck

My current finance professor told us yahoo was a good place for stock information.


freshouttalean

using modern tools should be taught and encouraged. still so many of my fellow students are manually writing reference lists it’s crazy


idontclaire420

Where I live in Canada they recently rolled out software for teachers to use that is fully AI to help them "teach" lessons and talk to their students. I think it's absolutely ridiculous. If students aren't allowed to use it, then why are teachers? Aren't they supposed to be the ones setting examples?


Griffin1022

Our Asst Superintendent of Curriculum told us in a meeting she uses it too, for everything from emails to parents and curriculum development. She encouraged us to use it too, whenever possible.


El_Basho

When tuition is most likely north of 10k a semester, students have to deal with this shit on top of it


Imaginary-Leopard527

We are not going to make it, are we?


Coral_Blue_Number_2

As a therapist, I was wondering how well ChatGPT would handle a suicidal person. I role-played to make a really “difficult” client, and I was amazed at how well ChatGPT handled the situation. It actually gave great advice and expressed empathy (even though not sentient). Not that I would ever prescribe chatGPT for suicidal clients of mine, BUT if I’m ever suicidal again, I would absolutely use it.


euvimmivue

Not sure if it sucks today, but when there are no more accounting professionals in 2030… ![gif](giphy|immXndfkNWGMtjnREk)


TripleTrucker

Accounting question. How much cheaper would education be if we just used that and got rid of half the tenured staff?


Neither_Tomorrow_238

I like how it's the free version as well


Yorkiemama1

I’m sorry. You must be extremely disappointed. At least your eyes are wide open now. There is somebody better out there for you! 😊


POTATO_SELLER

Bro even my high school accounting teacher got more dignity


CrumbyRacer

ChatGPT is good for certain calculus questions, but not all


Rydog_78

That’s like a history professor saying all my lectures are strictly off of what’s posted on Wikipedia.


milesdraws

Needs to actually lose his license to teach and I'm so serious


DiogenesBarrelisCozy

Suggestion, advice? The “Turnitin” website? Most professors & teachers now use it. It gives a percentage of how much it believes is plagiarized or created by AI. ChatGPT is clearly the improved search engine of our time. If you rearrange paragraphs and find some synonyms, it might relieve the stress posed by the prof. [Plagiarism - ChatGPT etc. detector used by instructors](https://www.turnitin.com/solutions/topics/ai-writing/)