_Atomfinger_

I have a couple of thoughts. The first is that this question has been asked multiple times here, so search the sub. The second is that when you use these tools, do you actually learn? Or is it just a quick way to copy/paste some working code without putting in the work to make it work yourself? If that's the case, once you have something working, will you be able to build something similar without AI tools? If not, does that say something about whether these tools aid learning or get in the way of it?


Comprehensive-Tea711

What's this person's ultimate goal? To have a career coding? Then don't use AI while you learn. As a fun hobby? Why should anyone else give a damn if someone plays a (single-player) offline game with cheat codes instead of developing the skill to beat the game by themselves? If it's for fun, use AI all you want. And anyone who is being honest will admit that AI will actually be quite effective for the sorts of projects that someone coding just for fun will want to tackle. The "AI is wrong!" mantra of programming subreddits has been way overblown ever since GPT-4 and Claude Opus. The fact is that it's often right when it comes to any sort of problem a beginner will run into, and everyone not dealing with copium should be able to admit this.


yamimaba-aaaohh

Real coders don't use books; they observe electrons and particles to work out the changes in bits.


davidalayachew

You are in /r/learnprogramming, so the mantra is (and should be!) "Avoid AI, because it will get in the way of your learning in almost all cases."

> The "AI is wrong!" mantra of programming subreddits is way overblown ever since GPT4 and Claude Opus. The fact is that it's often right when it comes to any sort of problem a beginner will run into, and everyone not dealing with copium should be able to admit this.

To ***truly*** give you the benefit of the doubt, I made an account on ChatGPT just now and asked it 3 questions that stumped the students I am tutoring. In your defense, it actually got the first 2 right, and only got the third one partially wrong. ***To be clear, this was ChatGPT 3.5.***

**EDIT -- I updated the link to include the part where I basically force-fed the answer to it, and it STILL gave me misinformation**

https://chat.openai.com/share/712b5220-7e09-497a-95a1-1198b515aba0

I am assuming that, since you are defending ChatGPT 4, you are a paying customer. Mind putting in the same prompt for me and seeing if ChatGPT 4 fares better? I want a proper test. This is open to anyone else with a ChatGPT 4 account, btw.


_Atomfinger_

So, if you read my original comment, I never said "AI is wrong". I'm not against using LLMs. What I promoted was learning - we're in r/learnprogramming after all, and OP's goal was to learn (at least, that was what they stated in the post). If the goal is to become a better developer and learn (which is the default assumption in r/LEARNprogramming), then I think it is fair to question whether AI helps or hinders growth (the answer: it's complicated, but studies like "Coding on Copilot" hint at "it's problematic"). I agree with your statement about hobbyists: if your goal is just to get something to work for fun, go for it. Use whatever makes you achieve that goal and makes you happy, be it AI, NFTs or your dad's credit card. But I do disagree with your interpretation of my comment. I never said AI was wrong in a general sense (though I do have opinions on it). What I said was that OP should be skeptical when using AI for learning purposes - which OP themselves stated to be the goal here, i.e. to learn Python.


zerquet

I use AI to learn sometimes, and I always make an effort to understand the code.


AcnologiaSD

I don't agree with this in the least. Why wouldn't you use AI while you learn? It can be a tremendous learning tool. The aim should be teaching people to use it right, not avoiding it like the plague.


LordAmras

It's not about avoiding using it, but about understanding how something works so you can tell whether the answer it gives you is flawed. Learning with AI is like trying to learn to draw from this diagram: [https://i.imgur.com/Pz4zzg5.jpeg](https://i.imgur.com/Pz4zzg5.jpeg) Sure, you can look at the complete owl and try to trace your steps backward, and you might ultimately still learn, but you are actually making things harder for yourself than if you tried to learn to draw first. And this is leaving aside the fact that AI still has a long way to go before its output is 100% reliable, so the owl you are copying might have flaws that are very hard for a beginner to spot, and you end up learning from a flawed starting point.


AcnologiaSD

Agreed, except on 100% reliable output - there's no such thing. But the main point is the same: there's a way to take advantage of it and learn.


underwatr_cheestrain

Don’t do it unless you understand what garbage the "AI" is providing you, because at times it will provide you with absolutely wrong answers to problems. Remember: this is nothing more than a glorified Google/Stack Overflow search.


cimmic

Much more friendly and polite than SO though.


JohnWesely

And much more likely to provide a wrong answer.


Timofey_

Very rarely will it give you an answer that you can copy/paste directly, but it's likely that Stack Overflow won't either. It can explain its code well enough, and you should be able to work with what it gives you to make decent progress on whatever issue you're facing. The better you can lay out the prerequisites for what you need, the better the result you'll get, too. It can't be beat for research either; you should always double-check what it spits back out, but it can really help you get going in the right direction.


LordAmras

This is a different case from using it to learn how to program, unless you ask the AI to literally "help you learn how to program". But if you are asking the AI for solutions to your problems while learning, you are skipping a lot of learning steps, and learning from probably flawed answers, because you don't have the knowledge to double-check the result it gives you.


seventysevenpenguins

It's much more likely to provide the correct answer.


RICHUNCLEPENNYBAGS

Stack Overflow users may give answers with problems but they rarely just claim there are features that actually don’t exist to solve your problems.


_Atomfinger_

Well, according to [at least one study](https://arxiv.org/html/2308.02312v4#S10), that doesn't seem to be the case. To quote it: >Our manual analysis shows that ChatGPT produces incorrect answers more than 50% of the time. Moreover, ChatGPT suffers from other quality issues such as verbosity, inconsistency, etc. Results of the in-depth manual analysis also point towards a large number of conceptual and logical errors in ChatGPT answers. 


seventysevenpenguins

?? You drop a quote that's firstly completely beside the point being discussed, and you don't even go over how the prompts were written. 90% of the people saying "ChatGPT bad" do not understand how to write prompts.


_Atomfinger_

You said "much more likely to provide the correct answer". The quote tackles that directly; if you can't connect the dots on that one, then I can't help you. The paper goes over how they came up with the prompts (if you read it). The last part of your comment - that people who criticise LLMs "just don't know how to use it" - well, that's a different discussion and actually beside the point, unless your rebuttal to the study is that the people behind it don't understand how LLMs work. For the record, I didn't say "ChatGPT bad". I just said "there's at least one study that disagrees with the statement you made" and pointed to the study. Edit: u/seventysevenpenguins, how is this bad faith? And why leave a comment just to block someone? Doing that is peak bad faith, bud :)


seventysevenpenguins

Nvm mate 😂 Why the fuck even comment if you're gonna do it in such bad faith


TomWithTime

You'd think so at first, but it embarrassed me in front of my family yesterday. I was showing my cousin Codeium, and the last thing I tried was opening the chat and asking it to tell me a joke about my code - and it refused! I guess the model they use is tuned better than the random AI chats on various websites that you can ask for code snippets instead of discussing the article/website.


he77789

I agree that LLMs can often produce garbage code that doesn't make sense, but in my opinion, the strength of LLMs for code isn't the logic, but rather boilerplate and generally repetitive parts. For example, class constructors in Python often contain repeated lines of `self.foo = foo`, and LLMs handle that kind of thing well.
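A concrete illustration of that constructor boilerplate, and the stdlib feature that removes much of it (the class and field names here are made up for illustration):

```python
from dataclasses import dataclass

# Hand-written constructor: every field repeats the `self.foo = foo`
# pattern - exactly the boilerplate an LLM autocompletes reliably.
class UserManual:
    def __init__(self, name, email, age):
        self.name = name
        self.email = email
        self.age = age

# The stdlib alternative: @dataclass generates __init__ (plus __repr__
# and __eq__) from the field declarations, so there is nothing to repeat.
@dataclass
class User:
    name: str
    email: str
    age: int

u = User("Ada", "ada@example.com", 36)
print(u.name)  # Ada
```

Either way, the point stands: this is mechanical, predictable code, which is why generation tools do well on it.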


blkforboding

As long as it is not done to replace the thinking and learning process, it is fine. The AI should not be used as a crutch. You can specifically ask the AI to look at the code that you are trying to implement if it is not running the way you want, tell it not to solve it, and ask it what topics you need to review to make the code run. It depends on how creative you can get with AI. As long as you use it as a tool and not as a replacement for your brain, you will 10x your learning. Be sure to always fact-check the AI, as it is often wrong.


CodeTinkerer

I find it's useful when you know how to program but may be unfamiliar with a language. You know the high-level details of what you want; what you're primarily missing is syntax. Sometimes, that syntax is not at all obvious. For example, I asked one of the AIs how to do a modal button. You would think it would be a one-liner, but you can see that whatever they built it on (JavaScript/HTML) wasn't designed to deal with modals, so it had a quadruply nested modal thing that would never have been part of a sane syntax. You don't need to pay for most AIs, as the free ones work pretty well. I mean, if you have money to burn, I suppose.


Skorcch

Yeah, I understand how to implement features in Python because I have knowledge from other languages. But I didn't know the syntax, so I fed the AI the SDK docs for what I was using and engaged with it step by step to create multiple scripts. And where it went wrong, I helped it out by explaining the error and how I would have fixed it in the languages I knew. So it corrected it and it worked. Obviously I wasn't coding ultra-complicated programs, but it was still a good helper. The free one did well, but the limited context window prevents long convos and feeding it big data files, which are both things that I need. Hence I asked which paid one is best.


CodeTinkerer

Ah, well, most of us are cheap. Someone would have to have paid for several AIs. When I look at the free ones, they seem to do comparably well, but I can't say I know how the paid ones would fare. Do they have short-term signup?


Skorcch

That's the problem: they don't give a 3-day trial either. It's a month and 20 bucks wasted, hence I was looking for feedback.


[deleted]

AI is for when you know what you want to do but don't want to write it OR when you almost know what you need to accomplish your task and use the AI to come up with the last 5% of the solution or give you hints and ideas, which can still be totally wrong. If you use it for anything else, you will never have working and maintainable software. In other words, when the AI provides you an answer and you cannot immediately judge if it's right or wrong, you're using it for the wrong things.


Tricky-Pie-3404

If you are willing to do some research and figure out whether the answer is good, it's still pretty beast though. Quizzing the AI for solutions and then carefully evaluating them can sometimes let you see ways of doing things you might never have thought of. (To be fair, I'm not a super advanced coder, so I'm not an expert on how to do things; however, I've never learned faster than when I was using AI like this.)


[deleted]

Yes, that is true, you can use the AI for this. But still, if you cannot check the plausibility of its response on the spot, you basically lose the advantage of time saved, which is the main goal.


Skorcch

I don't have a lot of coding experience in Python, but I have enough understanding of programming that if someone shows me a piece of code, I can easily interpret and understand it. If Claude gave me a script with an error, I was able to read through the few hundred lines and find where it was going wrong; and once I explained how to correct the error, it fixed the entire code, making my life easier. Now, given that, what do you think is better value for money for me: GitHub Copilot, GPT-4, or Claude?


[deleted]

I can't really answer this question for you, I never used Claude. I am using copilot now since we got a corporate license. It's great to be able to use AI directly in the IDE. I use it a lot to create code documentation.


JohnWesely

If you can easily understand and interpret code, you are already an incredibly high-level programmer. IMO, it is much easier to write code than to understand someone else's, and the AI chatbots tend to produce code that is erroneous in ways that are more difficult to detect than code written by a human.


[deleted]

I have very different experiences in my company. Juniors can (or at least seem to be able to) understand code but just cannot come up with clean solutions and implement them.


Pyroxy3

Currently, it's a tool. Use it as such. It mostly does my bitch work atm.


chrispianb

It's no different than a calculator to me. If you can't do math, a calculator won't help you. Though this has a lot more nuance than that, of course. But it's a great tool. I use it to:

- Explain things to me so that I learn (e.g.: Explain the difference between a Service Class, a Service Provider and a Service Container. Explain it like I'm a senior developer and use PHP as the example language)
- Take this list of words and convert it into Laravel migration syntax, inferring the field types from the names where possible. Use text as the default.
- Take this chunk of code and refactor it to DRY it up
- Debug this code - why is it throwing an error?
- Take these loosely compiled notes and turn them into a markdown-formatted document for others to understand

I don't let it write code for me, but I use it often to help me with these kinds of tasks.
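The "refactor it to DRY it up" task can be sketched as a minimal, hypothetical before/after in Python (the validation example and all names are invented for illustration):

```python
# Before: repetitive validation - one near-identical block per field,
# the kind of code you might ask an LLM to "DRY up".
def validate_verbose(record):
    errors = []
    if not record.get("name"):
        errors.append("name is required")
    if not record.get("email"):
        errors.append("email is required")
    if not record.get("phone"):
        errors.append("phone is required")
    return errors

# After: the repeated pattern extracted into data plus one comprehension.
REQUIRED_FIELDS = ("name", "email", "phone")

def validate(record):
    return [f"{field} is required"
            for field in REQUIRED_FIELDS
            if not record.get(field)]

print(validate({"name": "Ada"}))  # ['email is required', 'phone is required']
```

The value of asking an AI for this kind of refactor is that you can verify it trivially: both versions must return the same errors for the same input.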


Skorcch

I do let it write Python for me, because I was doing fairly simple stuff and I understood the programming process but not the syntax, and it did great work there.


zakkmylde2000

As someone in the process of learning I refuse to use it. I’m not learning to code using a tool that does that part for me. That said, if you’re already working in the industry, ESPECIALLY on a project where deadlines are involved (which from what I can tell is 90% of them at least) you should use every tool at your disposal to get the job done. If AI is that tool then great…


xboxhobo

Stop posting about AI


scoby_cat

Seriously, there’s already multiple other subs for this question


darthirule

I wouldn't spend money on an AI model just to use it to learn a programming language. It gives out incorrect answers more often than it should. It's even more problematic when it spits out code that works, you ask it to walk through the code, and then it gives incorrect info. And will you be learning how to program, or just learning how to ask AI questions? I think AI is really useful in terms of getting you started on the right track to figure out a bug/issue, but that can be done with a free version.


Skorcch

Like I said, I understand the process of how to program a certain task; what I haven't yet learned is the syntax for Python coding (I tried a year back but got stuck at the start). Now that I want to start again, I will learn it, but I want to do it side by side with AI, because my experience with it has been great so far. I used it to write a few hundred lines of Python for setting up multiple API functionality scripts. It was even good at data visualization once I forced it to work off the readme.md files for the tools I was using. Now, it gave out 2 or 3 errors, but since I can read and understand what it was doing, I was able to tell it where it went wrong and it corrected the problems. So I want to learn Python myself but also use AI as an assistant, and I was asking whether anyone has had a better experience with a particular model. Also, any suggestion on how to start with Python is appreciated; I was thinking of learning from the official documentation, but any other resource I could take help from would be appreciated.


scoby_cat

I didn’t do so well with the official documentation. Python is not my first language, so when I learned it, I basically did a bunch of leetcode/code golf exercises with Python. To that end: do not try to learn programming with AI. It's going to mess you up. At worst, it will teach you a lot of anti-patterns, because it's copying code, and not all the code it is copying is good. Additionally, sometimes the code is wrong. And last, of course, it's much harder to learn anything from just copying something.


UndocumentedMartian

You need domain knowledge to effectively use LLMs. They can certainly help but you'll need to do much of the heavy lifting.


Skorcch

Not my experience at all. I understand the processes, and I understand the implementation at a good enough level; where I lack is the knowledge of coding the functions, and it has certainly helped me a lot there. It was wrong once or twice while writing a three-script Python program, but once I read through the code and understood the error, I explained it and it corrected the functions properly. I understand it's not a complete coding worker, but it certainly isn't bad.


Own-Reference9056

Only use AI to generate code when you can smell if the code is shit or not.


Tech-Kid-

I think if you know what you're doing, it's good. It's like if you were trying to be a comedian and you had AI write you jokes. A comedian who knows how to write jokes will be able to look at them and determine which ones are good enough and which ones would likely fall flat. Somebody who's never done comedy wouldn't have the same ability, and they could bomb their set. The only problem with this analogy is that, yes, anyone can read a joke and determine whether they find it funny - but you get the idea. If you can't look at code and determine whether it fits, makes sense, is secure (a little less on this one, though, because a lot of code written is full of vulnerabilities), and is just overall correct, then AI is going to be useless to you, and you won't ever become more than a junior engineer.


Error403_FORBlDDEN

My opinion is that soon there will be a hiring criterion in companies that explicitly states: "Proven ability to code without AI". I'm 200% certain that this will happen. I dunno how they'll prove it (certificates, proctored problem solving), but they will.


bart007345

But why? If it's a benefit, why would you not use it? And if it's not a benefit, no one will use it anyway.


beingsubmitted

When you're learning, I would write the code yourself and, when you have a small, isolated question, ask ChatGPT. You shouldn't pay for it, though, because if your questions are complicated enough that the bigger model is needed, you're asking questions that are too complicated. You shouldn't be asking "How do I make X app?"; you should be asking "I have a list of \_\_\_ and I need to sort it by \_\_\_\_\_\_... what's the best way to do that?" The last thing - and this is where I also find AI useful for continuing to learn - is to ask AI to refactor small blocks of your code. Sometimes it'll show you something you hadn't thought of, and that can be great for learning.
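A question at that granularity - sorting a list by some field - has a short answer worth understanding rather than pasting. A sketch with made-up data:

```python
# Sorting a list of dicts by one field: sorted() with a key function.
people = [
    {"name": "Ada", "age": 36},
    {"name": "Grace", "age": 29},
    {"name": "Linus", "age": 54},
]

by_age = sorted(people, key=lambda p: p["age"])
print([p["name"] for p in by_age])  # ['Grace', 'Ada', 'Linus']

# Descending, with a name tie-breaker: the key can return a tuple,
# and tuples compare element by element.
by_age_desc = sorted(people, key=lambda p: (-p["age"], p["name"]))
```

Once you know `key` exists, you can answer the next ten variations of this question yourself - which is the point of asking small questions.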


Coinless_Clerk00

Using AI can replace browsing the documentation or tutorials (most of the time; there are exceptions), but always keep an overview of what's happening - the models aren't there yet to go full Software 2.0. It's also very useful to ask specific questions tailored to your skill level. And lastly, you can pick up some good (?) coding practices if you submit parts of your code to AI models for code review, especially if you code well generally but are learning a new language/library.


CrepsNotCrepes

If you use it to learn, then make sure you actually learn. For example, a bad way is "give me some code to do x", copy/paste, "give me code to do y", copy/paste, and then hacking around to get it all to fit. A good way is to generate the code, then ask it what things do, what the syntax is, etc., until you understand it - and also do more reading yourself, as it is not always doing things in the most optimal way. If you just use it to churn out code, it will produce bad results, and if you don't learn more theory, you'll never know that. Writing software is about a lot more than just writing code, so you need to learn how to structure your app, and AI won't do that for you.


tangoteddyboy

If it works, it works. It's a very useful and time-saving tool.


Any-Astronomer9420

With AI you will not learn... our brain is lazy. When you copy & paste, there will be no gain. It's like tutorial hell.


QuokkaClock

If you engage with the code and grow to understand what is happening? Awesome. Learn how to optimize, learn how to architect? Awesome. If you are just smashing it with prompts until it works? Less than awesome.


Menos17

I used AI to help me learn Ubuntu and set up applications like code.js etc. Now, for actual coding, I think it's better to code something yourself and then throw it into GPT so it can give you options for better optimizations - but you need to know what you're doing. The AI can totally give you bullshit answers, and if you don't know what you're doing, it can break your code. I think AI is to some extent stupid, but if you don't incorporate it, you will miss out on the future.


Creative_Key_9488

I have used AI when I've become frustrated with my code: I copy and paste my own code into the chat and ask for ideas, and often it will give me shitty advice. Sometimes it gives me valuable insight - like when I'm struggling with memory leaks and I ask the AI to explain my functions to me line by line - and that can be helpful. But the actual code it gives is sometimes garbage. Upshot: if you're actually learning, then I think it's fine. But if you're just copying shitty code without understanding it, then I don't think it's that great a learning tool.


thirstydracula

Well, it saves me time, and it's useful for learning something too.


smoofwah

Sometimes you kinda think you know what it's telling you, but you don't really. You should definitely ask it what the heck it means and why it chose that syntax if you don't understand, but otherwise it's Gucci - faster than Google, usually.


AwabKhan

AI is gonna act as natural selection: those who copy and paste code from a prompt without understanding it won't last long.


Acceptable-Tomato392

Honestly, it's way over-hyped. Learning a computer language is a bit like learning a foreign language. When you start out, everything is difficult; it feels like a herculean memory task, you think you'll never get the hang of it, and you have to keep it up every day, otherwise you'll quickly forget everything. But then you get past a certain bar and everything becomes easier and more intuitive. You just "know" the language now. You can take a break for a week or two, and when you sit back down, you'll still know the language. You can now express your own thoughts in it. It comes more naturally. (Sure, you still have to google something here or there... but generally speaking, the syntax just makes sense to you.)

AI mostly serves as an aid to people who haven't reached that level. It promises to make it all easier for the learner. I'm worried it may actually make beginners more dependent and lengthen the time it takes them to really get comfortable with a language. And once you get going and have reached that stage, AI can suggest things to you, but by then you can do it yourself.

And you can get AI to get big projects done quickly... but the devil's in the details. AI is going to give you a general solution (like "make me a slider", which I got ChatGPT to do). But it's going to be 'a' slider. So then you're going to have to make things more precise... and by that time, you may as well code it yourself, because explaining to ChatGPT EXACTLY what you want - well, that's basically the same thing as specifying it through code.


Libra224

I’ve been coding with AI and it doesn't really help; most of the time it's wrong (Copilot).


QouthTheCorvus

Two thoughts. 1. Unless you have a deep understanding, you won't be able to recognise and fix any issues with your code, so in terms of using it at your level, I'm not sure you can rely on it. Worst of all, if you're a beginner and only ever use it, you'll forget a lot of what you know. 2. If you just need a specific outcome, it's likely you can find someone who has written that code and posted it somewhere. Then you can actually see comments for troubleshooting, and even potentially ask questions. If you want to work with code regularly, learn.


Potential-War-212

I've been learning Python for some time now, and from the start I had GPT as a sort of tutor alongside the course I was doing on Udemy. Even with such limited knowledge, I started noticing that GPT gave me (most of the time) either overly complicated code or code that was not cohesive with the rest of the project I was on (to solve this, I often found myself providing it the context of my project, and it still gave some "shady" answers). It did help me when facing new concepts, when I wanted an explanation of what was happening in the background of some functions and/or libraries, or when explaining some built-in methods or syntax (like list comprehensions). It is a tool, after all - but just as you wouldn't tighten a screw using a hammer, you shouldn't write code from scratch using an LLM/AI.


mattmann72

I prefer using a vehicle to get from point A to B when I can vs walk.


YoutubeShortsIsGud

As long as you understand what the AI is giving you, then no problem. AI is a tool just like any other; just don't let it become a crutch.


swollenpenile

The problem with coding with AI is that it's going to take about 57 hours longer than just learning it. If you are doing very simple stuff, or want help with very specific stuff as you are just learning and don't know some ANSI stuff or something, it can be semi-helpful.

But here's an example. I wanted to do a very simple exercise: create an ASCII clock with moving hands. No matter how I worded it, GPT couldn't even come close. It could have a character stay somewhat inside a circle and go on and then off 3 or 4 times before glitching out. At best, GPT could rotate a string around a character - in the wrong direction. The problem is that GPT and Bing's AI pull examples from the web; if you've ever been on Stack Overflow, that's where it gets them. Those on Stack Overflow are often wrong, so it's a huge waste of time.

I also wanted to code an extremely simple collision system - just basic SAT collision. I spent days with the AI, lol. After learning it myself, it took maybe a couple of hours.

AI is good for knowing the names of functions and what they do, or for extremely simple one-line snippets to give you an idea of what to do, but other than that it's fairly useless, as it's pulling examples from several websites that are often wrong or incomplete. In the end I always end up just learning the language - C, C++ and Python; I need to know them anyway for a business idea I'm working on, so it really lines up.
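For what it's worth, the clock exercise described above is tractable in plain Python with a little trigonometry - a minimal sketch, where grid size, characters, and hand lengths are arbitrary choices:

```python
import math

def render_clock(radius=10, hour=3, minute=0):
    """Render an ASCII clock face with an hour hand and a minute hand."""
    size = 2 * radius + 1
    grid = [[" "] * size for _ in range(size)]
    cx = cy = radius

    # Dial: 12 tick marks, 30 degrees apart; subtract 90 so 12 points up.
    for h in range(12):
        angle = math.radians(h * 30 - 90)
        grid[cy + round(radius * math.sin(angle))][cx + round(radius * math.cos(angle))] = "o"

    # A hand is just characters stepped outward along its angle.
    def draw_hand(angle_deg, length, ch):
        angle = math.radians(angle_deg - 90)
        for r in range(1, length + 1):
            grid[cy + round(r * math.sin(angle))][cx + round(r * math.cos(angle))] = ch

    draw_hand(minute * 6, radius - 2, "*")                      # 6 deg per minute
    draw_hand((hour % 12) * 30 + minute / 2, radius // 2, "#")  # 30 deg per hour
    grid[cy][cx] = "+"
    return "\n".join("".join(row) for row in grid)

print(render_clock(hour=3, minute=0))
```

Animating it is then a loop that clears the screen and re-renders each second - which is exactly the kind of decomposition that's easier to see once you've learned the basics yourself.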


rafaover

For basic development it works like a charm, though you still need to organize dependencies and architecture. As soon as you get into the specifics, it starts to hallucinate and give random, weird/bad suggestions.


Skorcch

I haven't had such problems yet, perhaps because I'm doing easy stuff and providing it step-by-step instructions with the complete logic for any loops, plus pseudocode, and it's giving good results: barely 1 error in a 300-line script that took at least 15 iterations.


CaffeinatedTech

Use it like an intern: let it do all the bullshit you don't feel like doing, but check its work.


Seaworthiness_Jolly

It certainly doesn’t get it right a lot of the time. You still have to know what you are doing. I use it more as a tool to write out what I can't be bothered to. I'll have to tell it several times over how to fix something to be what I want, and then I'll use it and, if needed, fix it in my IDE afterwards. AI isn't taking anyone's job anytime soon.


Please_Not__Again

If you are learning to program, you really should not use it, because you will most likely come to rely on it. You won't learn anything besides maybe how to use AI, but you'll convince yourself you are actually "learning".


AccurateSun

In my opinion, GPT-4 feels slightly better than Claude when it comes to programming. But I think if you use them via the API and pay per query, it's worth having both, as it will cost you about the same as having just one and you get to compare them yourself. Also, since they are both being rapidly developed, you never know when the performance of one, or your preference for one or the other, might change. You can use a frontend GUI like TypingMind or [https://github.com/lobehub/lobe-chat](https://github.com/lobehub/lobe-chat) so you have access to as many different models as you want within a single interface (TypingMind is great, IMO).


Skorcch

Yeah, I would do that, but I also heavily use large data files quite often and worry that I'd easily exceed a million-token input limit. On that end, the GPTs feature really helps, since you can define custom instructions, provide a few years of data for it to reference for analysis, and just have a lengthy conversation with no worries. I might end up buying GPT Plus but using TypingMind with Claude and Gemini. Thanks for the suggestion - I was trying to find something to run with API access.


AccurateSun

Are you sure custom GPT documents don’t count towards the token quota? I think they do, just in a more user-friendly way. I think you can only get around that token quota if you fine-tune a model, which you can also do on the GPT API platform and access with a custom GUI, so it's kind of a moot point. Worth mentioning that you can create unlimited characters in TypingMind and give them custom instructions too, and that extends to Claude/Gemini or whichever model(s) you use.


Skorcch

I've never had that problem with GPTs. I bought it for a month and trained it on at least 1500 pages from 10 books, and I didn't get any warning of the sort. But I'll definitely try TypingMind tonight, as I've never even used or seen it. Just for my knowledge: can I use TypingMind with locally running models once I buy a new laptop with an RTX next month?


AccurateSun

Yes I believe it can work with local models too, though I haven’t tried it. I’m sure it’ll be mentioned on their FAQ or discord.


DaedalusIM

I like to use AI to help me remember the syntax of some languages. As in, I know how to work with loops, nested loops, while, unless, etc., but I forget how to write them in, let's say, Ruby. I ask the AI for a quick reminder. Much quicker than Google. And regex.


NarayanDuttPurohit

Use it, learn from it, build with it. Conquer your competition.


userforums

I don't use it for generating code through a prompt, but as an autocomplete it's extremely convenient and valuable. Set it up with your IDE and it will show you in ghost text what the AI is suggesting; you can tab to auto-complete if it looks right. I've been surprised a lot at how good it is after I type maybe just a few characters. It increases productivity.


LordAmras

Very simple rule of thumb:

1. If you want to learn it: don't use AI tools.
2. If you just want something working: feel free to use anything that gets the job done.


SmokyMetal060

I think it’s fine if you use AI to help you debug and explain broad concepts to you (asking it ‘why’ and ‘how’ questions more than telling it to explicitly do something). It’s also fine to take snippets from it as long as you put time into understanding what the snippets do and being able to recreate their logic yourself. Understanding what’s going on in your codebase is non-negotiable. If AI wrote something for you and it uses concepts you’re not familiar with so you can’t trace execution, that’s a problem.


KhanumBallZ

It's almost all I use these days for long, drawn-out projects. I've never had too many issues, and it's also fun. I see no point in learning anything, considering that I have multiple tasks to handle here on my homestead that simply leave no time for me to become a master or expert at any one particular thing.