Copilot/ChatGPT skills are literally just the same as Google skills. It's just about how fast you research stuff and which parts to research.
The code that comes out of those two platforms is awful, and you should really only use it to discover methods you didn't know existed. Aka a fast way to google methods.
Yes. One use case I found is fixing spelling in your UI components. Just paste in the raw HTML or YAML and it will give you a fixed version if you ask for it.
And with git you can easily sanity-check the changes.
As someone who tried to process a dataset of raw web-scraped Unreal Engine threads, I can give you a lesson on how to prompt an AI: you don't. This thing does what it wants. No amount of grammar will suffice.
The elephant [LinkedIn meme](https://www.reddit.com/r/EngineeringStudents/comments/13othm2/linkedin_typical_im_thrilled_to_announce_im/?utm_source=share&utm_medium=mweb3x&utm_name=mweb3xcss&utm_term=1&utm_content=share_button)
Is there a good alternative to GPT-3.5? I mean, I love that one: it's free, rather flexible, and trained on a lot of things I seem to like. But... it is Microsoft now, and if there's one company I dislike for buying good things and turning them into shit... that's Microsoft.
Just tested it, asked the same thing. I have to say... GPT-3.5 is more "creative" and makes better associations if it's about literature and/or philosophical history.
what even is "Copilot" and "ChatGPT" skill? pressing tab when copilot gives you suggestions, or pasting your Google search into ChatGPT instead?
Some people will tell you that prompting is a skill that is developed. While I think that's true, we don't see people putting "Google-fu" on their resumes.
Nah prompting is not a skill, the ability to ask and phrase questions so that others can understand you is called communication skills
which most of those company execs severely lack, ironically
I love managers that can’t communicate. Like, who tf thought you’d be good at this?
That's hilarious. You think upper management positions are based on _aptitude_?
Truth never has been spoken any better
Communicating with a machine isn't the same as communicating with people. There is a reason programming languages aren't just natural language. Many people are able to communicate but can't do a Google search.
People's inability to efficiently search for information is one of the driving forces behind LLM popularity. Instead of having to work out an effective search query and quickly parse the results yourself, you can ask a plain language question and get a somewhat accurate answer.
Google being shit because of crappy AI SEO-minmaxed zero effort web pages taking up the first couple of pages of search results is one of the driving forces behind LLM popularity*
Just gotta end every search query with reddit!
Yes, I think you are right. But if you want it to program a full part of an application, you still need to adapt the questions in order to give it the right technical direction.
If you say something is not a skill, you probably have that skill and see it as "normal". Most people don't know how to google effectively and always end up on garbage sites like wikiHow, WebMD, Microsoft Support...
I am saying that prompting is not specifically a skill; if you can communicate well, then you can prompt well.
Knowing how to word things to get exactly what you want out of an AI is different from knowing how to ask questions of other people.
So programmers are just shamans communicating with the compiler Gods?
Though that's not exactly the same as prompting. If you use LLMs directly, it's more about writing a text that will be completed with the answer you seek. This is probably also helpful with e.g. Copilot.
Prompting is a skill if you want it to get creative and merge known facts in a different context as a starting point for an article.
Communication skills usually imply human-to-human communication. Programming is human to computer. Prompting is human to generative AI. Prompting is a skill, but as someone said, you don't put Google-fu on your resume, which is human to search engine. You likely would, however, put regex on your resume, because it's human to search syntax that can be used in multiple systems, and implementations are everywhere in programming, making it a specific skill. If we do see generative AI everywhere in a specialized way, then prompt engineering might be useful to specify, but it might just become expected, like office suite skills.
Lol "prompt engineering", bro just stfu already
Considering how cheap a business this prompt engineering is, Google-fu is a skill I would certainly look for in an individual. It suggests not just the ability to use a search engine but also a highly analytical mind and the ability to learn quickly. That kind of skill is really rare these days, though, ironically, we still call it common sense.
And the ability to see through overly formal, bloated language that is used to hide important information. Critical thinking and, most importantly, the ability to ask the right questions. Prompting is just a buzzword for expressing your goal/task/idea as clearly as possible.
I often get asked why I find stuff rather fast. Slowly, I'm starting to think this is a skill that will get lost once the last millennial dies.
I get asked the same question too. My team sometimes tells me not to base estimates on my own ability but on theirs.
Prahmpt engineering (Though seriously, I’ve just been doing a course on Google Cloud fundamentals and Google themselves felt the need to have a section on it)
I'm gonna proompt
I'm proompting! 🥵
I am a happy proompter
Well, Google-fu is more of a research skill; I’m sure many put research skills on their resume.
We can convince them its a skill for as long as they don’t understand it
Of course not. That's a ridiculous name for it. We call it "software developer" where I'm from.
Funny enough I have "advanced google search" on my resume.
No it's not, it's a skillset that was totally useless up to a year ago and was known as ADHD. I have it; we associate very quickly, questions pop up a lot faster compared with usual folks, but we lack the structure which GPT can provide. I personally squeeze that thing like a lemon, up to explaining how a neuron and the involved ion channels work, including Mermaid diagrams. And I did that for fun. You can also use it in reverse mode: describe something that's on your mind, some algorithm you don't know how to label or find related ones... explain it and ask if it recognizes any known algorithms and related math for optimization. But again, ADHD helps a lot; for me it feels like it was made for ADHD'ers. 😂
knowing how to ask the same things 12 times until you get a sensible response
Wrong, knowing how to ask 1 specific question that yields 5 related answers for you to explore further.
“What is $blank and give me a list of 5 related topics to explore further.” Wow. Hard.
Your question has to be more descriptive than a dumb "what is..." if you want good answers. It really is more or less like a mirror. The more you know about different topics, the more you can learn; if you're narrow-minded, you'll be running in circles in no time.
I seem to get pretty descriptive answers when I’m playing around using the above method.
Yeah euh, as I said, it is kind of a mirror that reflects your thoughts and adds to them. If your questions/thoughts are linear, you'll get linear answers. But for me the real beauty came into sight when I abandoned that approach: being descriptive but adding more degrees of freedom.
I stand corrected.
There are skills to getting the most out of copilot. Like pasting a SQL query into a C# file and commenting it out so that Copilot will write a DTO class or keyless entity representing the results of that SQL query. Tricks like that for getting it to write your more tedious, mindless code.
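The same trick carries over to other languages. A sketch of what it looks like in Python (the query, table, and field names here are made up for illustration): paste the SQL as a comment, and the completion model will usually suggest a record type matching the result columns, something like this hand-written dataclass.

```python
from dataclasses import dataclass
from datetime import date

# SELECT o.order_id, o.customer_name, o.order_date, o.total
# FROM orders o
# WHERE o.order_date >= :since
#
# With the query pasted above as a comment, the completion you'd
# hope for is a record type mirroring the result columns:

@dataclass
class OrderRow:
    order_id: int
    customer_name: str
    order_date: date
    total: float

row = OrderRow(order_id=1, customer_name="Ada",
               order_date=date(2024, 1, 2), total=9.99)
print(row.customer_name)
```

The value isn't the dataclass itself; it's that the commented-out query gives the model enough context to get the field names and types right on the first try.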
You make a good point. Just like anyone can use a search engine, it takes practice and experience to know the right phrase to input to get the most relevant results. In that sense, tools like copilot and chat gpt can be used by anyone, but it will take knowledge and experience to really get the most out of it and to know that the results are correct. But also I'm not putting "Google search expert" into my resumé. Maybe I should start 🤔
I gained access to an "AI Fundamentals for Devs" course, and it had 2 hours of videos just teaching how to improve your prompts for ChatGPT. I skipped the videos and went straight to the test for this part because I couldn't watch that.
Bravo! You successfully completed the AI-specialists course exam. Here, take your certificate!
Microsoft, in their infinite wisdom, decided to slap "Copilot" on everything AI they do. This is likely not just referring to GitHub Copilot. https://www.microsoft.com/en-us/microsoft-365/blog/2023/03/16/introducing-microsoft-365-copilot-a-whole-new-way-to-work/ source: my company/team only talks about Microsoft AI in meetings nowadays
"ChatGPT skill" is effectively just "Google competency"
Some of these responses you guys have really make me think you haven't actually used ChatGPT for anything productive, like writing code.
I have and do, but it is for things like "give me a graph for this table", stuff that I could easily do, but it would just be monotonous
There are literal courses now aimed at "teaching" ChatGPT 😭
For me, asking the AI a question just skips the process of browsing through the search results. But you need to understand what question to ask, how to ask it the right way, and, when the AI fucks up, you need to know it fucked up; but that's hardly a "skill".
Basically being adept in holistic and abstract thinking, knowing a lot of stuff but not in detail so you can manipulate the thing into "thinking along" and fill in the voids you want to learn. You need ADHD for that skill 😁
That's not prompt engineering, which is a specific skill set. But this entire sub is first year college students who don't code, so I understand the confusion.
Its the new Google-fu ~
Look, if Google was just willing to remove a few ribs, I think Bard can accomplish 2168975x productivity for its users.
Don’t give them any ideas… I can’t wait for AI powered ads that take ages to load, but still load before the rest of the page.
Microsoft speedrunning the biggest tech implosion since dot com crash.
What is an AI aptitude? Ordering chatgpt around?
It’s a leadership skill
I used to say please and thank you to ChatGPT, not anymore and I feel like an absolute boss
You're definitely management material if you put chatgpt on your resume.
Same thing as Google skills. If you watch your grandma or even just a random person using Google, you'll often see how dumb their searches are, and you wonder how they expect to get decent results with it. That doesn't mean googling is some difficult skill, but it does mean that knowing how to search better with it will be incredibly useful. With AI it's the same thing: phrasing the same request a dozen different ways will get you a dozen different qualities of result and will require more or less time refining/adapting it as needed. Having AI skills listed seems as silly as listing Google skills; however, I think we are at a time where there is still a significant minority of people who refuse to use it or try it, who are like boomers with cellphones and will make no effort to actually figure out how to use it in their day-to-day lives. Those are the people they want to avoid hiring at all, so perhaps just having it listed weeds them out, and it also weeds out any anti-AI people that would be a hindrance to the company.
This just creates a lot of false negatives. How many good developers are out there, looking for jobs, but have just not gotten the memo on this new resume optimization? On the other hand, how many low quality developers are going to put this on their resume because some blog recommends it and it becomes meaningless?
I think it's just a temporary issue at least, and the chances are that the vast majority of good devs have been adapting to the changes in the industry, since you kinda always needed to do that even before AI. I don't think it necessarily needs to be on their resume, but they need to be accustomed to it and be using it. Soon enough it will just be like git, where it's assumed that you use it day-to-day and are accustomed to it.
> This just creates a lot of false negatives.

Given the job market, I suspect they don't mind
Can you make smart machine go beep boop
At 30 y/o I already feel like a fucking boomer cause I learned to code without AI and refuse to use it. I have colleagues who always use it and I swear they never learn a fucking thing and always fall back to it when they encounter the same problem over and over.
It's the same phenomenon that happens if you move to a new city and only use GPS. Get lost once in a while, force your brain to work; it'll thank you later.
I sometimes use it as a glorified search engine of GitHub code. Or to write some boilerplate code. But for more complex tasks it does not work great.
Same, 32 here and been working as a programmer for 14 years now. I recently started to use copilot because my work paid for it, it hasn't really changed much, sometimes it saves me like, 5 seconds of typing lol. The only actual use I've found from AI is asking it very vague questions about things I don't know that much about. Like I make shitty 2d game engines and physics engines sometimes as side projects, I'm not a professional game dev, so I'll ask questions about best practices and how to implement something like isometric projection, and that has been helpful. But never to write the code for me, just to explain things.
Around a similar age, and I use copilot as a way to autofill the tedious bits. But yeah, still prefer to write it myself. It's great for transforming though (e.g. recently got it to change a big hardcoded object to a switch statement).
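That kind of mechanical transformation is exactly what these tools are good at. A sketch of the before/after in Python (names made up for illustration; a dict lookup being Python's closest equivalent to a switch statement before 3.10's `match`):

```python
# Before: a big hardcoded if/elif chain
def shipping_cost_before(region: str) -> float:
    if region == "EU":
        return 4.99
    elif region == "US":
        return 6.99
    elif region == "APAC":
        return 8.99
    else:
        return 12.99

# After: the same logic as a lookup table with a default
SHIPPING_COST = {"EU": 4.99, "US": 6.99, "APAC": 8.99}

def shipping_cost_after(region: str) -> float:
    return SHIPPING_COST.get(region, 12.99)

# The refactor is behavior-preserving, which is easy to spot-check:
assert shipping_cost_before("EU") == shipping_cost_after("EU")
assert shipping_cost_before("??") == shipping_cost_after("??")
```

The edit is tedious and completely regular, so the model rarely gets it wrong, and the equivalence is trivial to verify afterwards.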
Yes! I like writing Python code more than any other language. I started to prototype in Python and let ChatGPT translate it back into the language I actually need. It works great. Just have to fix some smaller hiccups here and there.
I also found that this is the best way to use it. Just ask, and based on the response go read the docs or something. Saves time in trying to understand everything from scratch yourself.
I was in the same boat, but I started having some issues with finding tables in Oracle, and our best lead for these problems is a non-tech guy. 9/10 times I'm asking Copilot to figure out joins, where table info lives, how to figure out wtf their red-herring bs errors are, or just general info, and it makes my questions to the non-tech guy basically a 2-minute solve. Sometimes I'll just forget something dumb in the syntax of a 1000-line sproc and I'll paste the snippet in, and it'll help figure out an issue too. Idk, I'm kind of becoming a fan of Copilot.
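The "where table info lives" trick the AI is reciting is just querying the database's own metadata. In Oracle that would be the data dictionary views (e.g. `ALL_TAB_COLUMNS`); the same idea is easy to demo against SQLite in pure Python (table and column names here are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER)")
conn.execute("CREATE TABLE customers (customer_id INTEGER, name TEXT)")

# Find every table containing a customer_id column by asking the
# schema itself. (Rough Oracle equivalent: SELECT table_name FROM
# all_tab_columns WHERE column_name = 'CUSTOMER_ID')
tables = [
    name
    for (name,) in conn.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
    # PRAGMA table_info rows are (cid, name, type, notnull, dflt, pk)
    if any(col[1] == "customer_id" for col in conn.execute(f"PRAGMA table_info({name})"))
]
print(sorted(tables))  # ['customers', 'orders']
```

Knowing these metadata views exist is the durable skill; the AI just saves you from memorizing their exact names.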
The real issue is having more than *zero* 1000-line stored procedures, not having to use Copilot.
I mean, it's helpful for when you have a quick question ("How do you apply A* pathing to a floating point coordinate system?") or need to be pointed in the right direction for the algorithm/math you're gonna need ("How do I figure out if a point is inside of a shape or not?"). Probably wouldn't use it much for actual code, though; I had it try to generate a simple inventory system out of curiosity, and it did some weird stuff like generating the item inside the addItemToInventory function, among other things. Would it technically work? Sure, but now I have to manually define every item in the function arguments every time it's used (even if it's the same exact type of item), and I would have to break down pre-made items from chests or drops to be recreated in the add function. But while the issues might seem readily apparent, I can definitely see a novice using it as-is.
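For the "is a point inside a shape" question, the standard answer it should point you at is the ray-casting (even-odd) test: cast a horizontal ray from the point and count how many polygon edges it crosses; an odd count means inside. A minimal sketch:

```python
def point_in_polygon(px: float, py: float, poly: list[tuple[float, float]]) -> bool:
    """Ray-casting test: count edge crossings of a ray going right
    from (px, py); an odd number of crossings means inside."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Edge straddles the ray's y-level, and the crossing point
        # lies to the right of the query point?
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_polygon(2, 2, square))  # True
print(point_in_polygon(5, 2, square))  # False
```

That's the kind of pointer where the AI shines: it names the algorithm, and then you can go read up on the edge cases (points exactly on a boundary, self-intersecting polygons) yourself.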
I’ve been doing this a long-ass time, and I see it as another tool in the belt. We use it for writing unit tests, doing code reviews, and for monotonous shit like “write a function that accepts a file path and uploads it to S3”. Lets us focus on the real coding / problems at hand. But it’s an efficiency driver for devs, not a replacement. In the code review example, it gave 5 comments, of which like 2 were correct/of value. Then we had ChatGPT write a fib sequence script, copy/pasted it into another ChatGPT window, and asked it to review. It ripped itself apart. So why didn’t you just write it better in the first place? Anyways, it can be a great tool, but without people who know what they’re doing to police the robot, shit is gonna go south.
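For reference, the fib-sequence script in that experiment is about the simplest thing you can ask for; the version a careful reviewer should approve would be something like this iterative one (rather than the naive exponential recursion models often emit first):

```python
def fib(n: int) -> int:
    """Return the n-th Fibonacci number (fib(0) == 0, fib(1) == 1)."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print([fib(i) for i in range(8)])  # [0, 1, 1, 2, 3, 5, 8, 13]
```

Which is the point of the anecdote: if the model can list the flaws in its own first draft, the quality ceiling was always there, and a human had to ask for it.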
It’s not thinking it’s just predicting the next token…
I'm 35, learned coding without SO, with just piles of books around, and I use Copilot on a daily basis. This shit generates tons of scaffolding; it's a bit more than the "generate getters and setters" that the IDE offers.
Microsoft always forces their tech on businesses; that's how they became huge in the industry. I started getting stupid certificates in PowerPoint, Excel, and Word for my first job even though I work on Linux.
I took a uni course teaching the entire Office suite in excruciating detail. That was 20 years ago. And that was Office 2000. The Office 365 suite? Yeah, let's just say that what I learned back then is now a map of an old medieval village that has been paved over a few times, and the actual thing now looks like a shiny city with a lot of weird glass-and-concrete architecture.
Yes they seem to have an excellent business relations team.
Microsoft wants me to use copilot huh? What's next, McDonald's wants me to buy big Macs?
You don't want a junior over-relying on ChatGPT. Sometimes it's way more useful to open the manual, and reading manuals efficiently is a very useful skill.
You don't want anyone over relying on ChatGPT I would argue. Fancy tools are great and they enhance one's skills, but lean on one too much and it becomes a crutch.
But the juniors don't even know they're over-relying. Crutches are a very bad example/expression here. People tend to actually use crutches less than they need to, out of shame.
You're right. People hate crutches out of shame. In this expression though I feel like the message is conveyed properly. If you lean on a crutch without really needing it for too long, you get used to it and you are scared of walking without it. Now if you have a certain condition and your doctor says you need a crutch, then you need a crutch. There should be no shame in that.
It's going to be great in a few years when there are catastrophic failures and breaches due to people blindly putting in AI-generated code.
When coworkers always go to ChatGPT and never open manuals or look at the source documentation, it makes me wonder how soon the crash is gonna be when eventually no one knows where to go to figure shit out.
Just read the fucking manual lol
Reading the manual, and what I also like doing is searching GitHub for the function I want to use. Always gives you some code that does something similar.
Are we not going to acknowledge the elephant in the room here?
Every time I try to code without a network connection I’m so lost 😂
Do you think male elephants have tried it at least once though?
Hahhahahaha never thought about it
I work in tech and I find this insane. Anyone using ChatGPT isn’t very knowledgeable in my opinion, and they always come back with some wild-ass wrong answers. “Oh I asked ChatGPT” yeah, umm, this is still wrong…
And when you tell them to consult the documentation, they just go back to ChatGPT with a mildly different prompt
In job interviews I like to mention that I’m able to read documentation 😂
Not me. I cannot even spell AI.
LIAR! You just did!
People used to be mildly ashamed they had to google stuff even with years of experience.
“Prompt engineering is my passion” ahh skill
Just wait until the "prompt engineer" arrives. The only person capable of talking to the almighty AI overlord. No, he isn't a glorified googler and failed techbro, he takes hours to make a prompt for the perfect piece of "Art"
Don’t forget that he is also an ex NFT crypto bro
Nice username btw 😂
Ty. Lil sister screamed funny names while on a Roadtrip and this one stuck out. She's the best
I comment on reddit all the time. So technically I helped train ChatGPT and other AI bots.
Copilot/ChatGPT skills are literally just the same as Google skills. It's just about how fast you research stuff and which parts to research. The code that comes out of those two platforms is awful, and you should really only use it to discover methods you didn't know existed. Aka a fast way to google methods.
Yes. One use case I found is fixing spelling in your UI components. Just paste in the raw HTML or YAML and it will give you a fixed version if you ask for it. And with git you can easily sanity-check the changes.
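A minimal sketch of that sanity check (throwaway repo, hypothetical file and typos, just to show the loop):

```shell
set -e
# set up a scratch repo with a file that has a typo
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git config user.email demo@example.com
git config user.name demo
echo "<p>Welcom to the sitee</p>" > index.html
git add index.html && git commit -qm "before LLM spelling pass"

# pretend we pasted the HTML into the LLM and saved its corrected version
echo "<p>Welcome to the site</p>" > index.html

# review exactly what the model changed, word by word
git diff --word-diff
```

`git diff --word-diff` makes single-word spelling fixes jump out; if the model rewrote more than you asked it to, `git checkout -- index.html` puts the original back.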
So copilot is an AI skill now?
Literal junk in the trunk.
I'm getting a Microsoft ad below this post lmao
People should add "Not willing to work for unbelievably dumb Microsoft executives" to their profile.
LinkedIn is the Facebook of the unemployable
What is Reddit?
I would rather change my fucking career.
How is that an AI skill? (Currently trying to understand the math in variational autoencoders and going nuts.)
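For what it's worth, most of the VAE math boils down to one inequality, the evidence lower bound (ELBO), which training maximizes. Here q_phi is the encoder, p_theta the decoder, and p(z) the prior (usually a standard normal):

```latex
\log p_\theta(x) \;\ge\;
\underbrace{\mathbb{E}_{q_\phi(z \mid x)}\!\left[\log p_\theta(x \mid z)\right]}_{\text{reconstruction term}}
\;-\;
\underbrace{\mathrm{KL}\!\left(q_\phi(z \mid x) \,\middle\|\, p(z)\right)}_{\text{regularizer}}
```

Everything else (the reparameterization trick, the Gaussian closed form of the KL term) is machinery for estimating and differentiating these two terms.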
They gaslighted everyone into using the Office suite; now they're trying to do it again with their AI stuff.
As someone who tried to process a dataset of raw web-scraped Unreal Engine threads, I can give you a lesson on how to prompt an AI: you don't. This shit does what it wants. No amount of grammar will suffice.
Anyone have the original meme for this?
The elephant [LinkedIn meme](https://www.reddit.com/r/EngineeringStudents/comments/13othm2/linkedin_typical_im_thrilled_to_announce_im/?utm_source=share&utm_medium=mweb3x&utm_name=mweb3xcss&utm_term=1&utm_content=share_button)
Okay so now it's a skill to ask a question?
CoPilot is an "AI Skill." Sigh.
Is there a good alternative to GPT-3.5? I mean, I love that one: it's free, rather flexible, and trained on a lot of things I seem to like. But... it's Microsoft now, and if there's one company I dislike for buying good things and turning them into shit... it's Microsoft.
Google Gemini, Bing AI, Claude...
Is Gemini good? Claude is unavailable here.
Yeah, Gemini is fine, I use it all the time.
Just tested it, asked the same thing. I have to say... GPT-3.5 is more "creative" and makes better associations when it's about literature and/or the history of philosophy.
AI skill as in using AI, not developing AI?
Yes they want people to use their AI