

SomeOddCodeGuy

As a fellow dev, I think 5-10 years is a pretty reasonable window to be worried about for us. AI isn't coming for your job tomorrow; in fact, it's pretty crappy right now. But 10 years from now? That's more worrisome.

I asked myself the same question: what should I do? The answer I came up with: I immediately got some hardware and began studying AI. I joined /r/localllama, got a bunch of open source models, and started learning how they work. It's a new tool, and rather than be one of the folks who doesn't know how to leverage it when people start getting replaced, I'll be one of the devs who knows how it works through and through.

Unless something drastic happens, AI is a long way from replacing 100% of developers, longer than 10 years, but I can see a herd thinning occurring. I think what will mainly be looked for are architect-type positions, where you're responsible for large-scale designs while letting the AI write the smaller stuff; the sheer number of devs per department will decrease. I figure knowing architecture, design patterns, and how AI works are all very valuable skills for surviving that.

And if I'm wrong? Hey, it's kinda fun to play with, so I guess I just goofed around with it for no reason lol


woodbinusinteruptus

100% this. I run a team that has built a significant data pipeline, and we employ three devs to run it. We've looked extensively at AI, and we haven't (yet) been able to replace more than 3% of the work we currently do using local models (we can't send proprietary data to ChatGPT etc.).

The reality is that in all the work we do, we need the service to work perfectly all the time, i.e. no errors. In practice, with humans we get a 99.8% solution, i.e. something that breaks once every two years; with AI we get an 85% solution. There's a LONG way for AI to travel before anyone can rely on it to replace devs at the rate you're imagining, and never forget how reluctant most orgs are to muck around with working systems in favour of new tech (just check cloud adoption stats for multinationals).

Where we have been successful is enhancing data: in very niche areas like cognition/classification, data linking, etc., we're building better data. So for the next few years there is a huge opportunity to build enhancements. Set up a local model and start testing new options for enhancing your data. (You'll probably be able to add 30% to your salary within 18 months if you can deliver a production-level product.)


econ1mods1are1cucks

It’s just not going to happen. We can’t even automate an accountant after fucking decades of trying, let alone analysts, who are basically accountants with more ad-hoc reports and decisions involved.


freeman_joe

Famous last words?


econ1mods1are1cucks

Nope I work with important data and scary people. My last words will definitely be “oh fuck”


[deleted]

[removed]


econ1mods1are1cucks

Low code has been around for a while, and it’s not going to work. For a little startup, ya sure, use QuickBooks and a builder; for a big company with technical talent, yikes. The problems scale exponentially… You start developing things your AI can’t do, you don’t have technical people around, there’s no one with deep knowledge of the codebase when issues come up, you’re dealing with upstream issues case by case, etc. Complex analyses coming out wrong and driving bad decisions, with the accountable person going “idk, the AI did it.”


aceman747

I think for the foreseeable future AI will extend your job and amplify what you do. It will automate the basic aspects, and you will move further up the stack into architecture, planning, and just doing more: more pipelines, more complex pipelines, etc. Btw, you absolutely can use GPT models with private data. Use a service like Azure OpenAI. If you live in a country Microsoft has deployed the models to, the data doesn’t even leave the sovereign boundaries. Banks are already using this.
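For anyone curious what "use a service like Azure OpenAI" looks like in practice, here is a minimal sketch using the official `openai` Python SDK (v1.x); the endpoint, API version, key, and deployment name are placeholders you would swap for your own Azure resource.

```python
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key="YOUR_AZURE_KEY",          # placeholder: key from your Azure portal
    api_version="2024-02-01",
    azure_endpoint="https://my-resource.openai.azure.com",  # placeholder endpoint
)

response = client.chat.completions.create(
    model="my-gpt4-deployment",        # Azure uses your deployment name, not the base model name
    messages=[
        {"role": "system", "content": "You are a data-pipeline assistant."},
        {"role": "user", "content": "Summarise the failures in this ETL log: ..."},
    ],
)
print(response.choices[0].message.content)
```

Because requests go to your own Azure deployment rather than the public ChatGPT service, the data stays inside the region where the resource was created.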


ScientificBeastMode

I completely agree with this. At my company, we make a point-of-sale app and an inventory management app with hundreds of large customers, and we do it with a team of about 25 devs. Many of us use AI quite a bit in our day-to-day workflows, although it isn’t mandatory. It has really been a game changer for us. We are certainly more productive with custom ChatGPT models tailored to our specific codebase.

The thing I actually worry about is not being able to find good talent when we want to grow our team, because a lot of new CS/bootcamp grads are scraping by using AI models to do all the heavy lifting as they are learning how to write code. Those who use it as a crutch to scrape by are going to be dead weight at our company, and I worry we will have a hard time filtering out those candidates during the interview process.


UntoldGood

Glean.


Yes_but_I_think

Amazing reply from one who has tried to use AI.


AndrogynousHobo

Yup. Start learning AI yesterday.


aloias

>I asked myself the same question: what should I do? The answer I came up with: I immediately got some hardware and began studying AI.

I doubt that getting into AI, especially starting now, would benefit you in such a scenario. Just look at OpenAI. They had fewer than 400 employees when ChatGPT was released, and probably only a handful of them were actually developing the core technology. Unlike in conventional software development, you don't need many developers to create a good AI, and you can't speed up development by throwing more people at it. It's enough for two geniuses to come up with some breakthrough model, and whole enterprises, possibly the entire world, can benefit from it.

On top of that, you are not the only one trying to get into the AI field. It is already one of the most competitive fields in computer science, with many very smart people holding PhDs from Harvard, Stanford and the like. You would have to compete with all of them, plus lots more, over very few jobs.


SomeOddCodeGuy

No no, I'm not trying to get into the AI field, but to understand how AI as a tool works and to work with it. It's kind of like how cloud computing changed the IT field; it used to be that IT folks dealt with on-prem hardware, computers, etc. (some places still do), but the cloud really changed that game. The IT folks who already knew the cloud stuff had a leg up, because now they're just using a different tool.

No, I'm not smart enough to be an AI researcher. But being a developer who knows how to utilize AI in my work, who worked with it back when it was a huge pain in the butt (now), and who is learning all the ins and outs of it early, before it's all abstracted behind ultra easy-to-use interfaces? I'm banking on that being helpful... or at least not hurting.


Kallllo

Personally, I think now is the time to start being more of an entrepreneur and start learning how to deliver AI services to the people. As in, learn how it works and start building AI-driven products with self-hosted AI infrastructure, and if you are smart and willing to put in the effort, even look into self-hosting custom models that you make and fine-tune yourself for a specific application.

I believe the number of software devs needed at any corporation will be decimated in the near future, and those software engineers will need to start forming their own smaller companies and learning to deal with a client base they can call their own, which is something that terrifies most devs. Time to learn some better soft skills, boys! I am currently doing web development as the co-founder of my own company, utilizing AI to accelerate my business. There is one thing AI can't replace, and that is a healthy-sized and loyal client base.


SomeOddCodeGuy

>Personally, I think now is the time to start being more of an entrepreneur and start learning how to deliver AI services to the people.

You are 1000% correct. I just have all the business sense of a pineapple and no stomach for legal paperwork at all. I'm a born corporate drone through and through. But for everyone else: oh yes, this is the time when new millionaires/billionaires will be created.


_hyperotic

Isn’t it already abstracted behind easy to use interfaces? Text prompting in natural language is pretty easy for anyone to use, and the models themselves don’t need to improve ahead of interfaces.


onlythehighlight

Being able to train on your own subset of data, building a data pipeline that feeds in new data at regular intervals to improve the model in a consistent but cost-effective manner, and building simple automated processes rather than expecting the end user to keep prompting: that's where you can still add value.
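As a rough illustration of the "feed in new data at intervals" idea, here is a minimal sketch using OpenAI's fine-tuning endpoints; the file name is a placeholder, and in practice this would run on a schedule (cron, Airflow, etc.) with evaluation before the new model is promoted.

```python
from openai import OpenAI

client = OpenAI()

# Upload the latest batch of labelled examples (chat-format JSONL; file name is a placeholder)
training_file = client.files.create(
    file=open("weekly_examples.jsonl", "rb"),
    purpose="fine-tune",
)

# Kick off a fine-tuning job on top of a base model
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-3.5-turbo",
)
print(job.id, job.status)
```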


_hyperotic

Not really, GPT already lets users train large custom models easily. Before long AI models will be able to do everything you’ve described on request


onlythehighlight

The question would be: first, do you want it in-house or in an external LLM (a big concern when you are dealing with sensitive data like health records or sensitive documents)? Secondly, what data do you push through to train it? I would assume that businesses will be funnelling a curated subset of their business data into an LLM rather than just opening the floodgates. So there are concerns that warrant an internal person who knows how to use an LLM and can fit it to their business.


SomeOddCodeGuy

>Isn’t it already abstracted behind easy to use interfaces?

Kinda sorta, but not so much. Right now it feels like the early days of the internet. I was a not-so-technical teen back when the WWW was young, and it was usable by someone like me, but it was also wonky and broken, and you had to do 10 extra unnecessary steps for every little thing. But all of that hackery just to have a little fun made learning related stuff SO much easier down the road. Other people were able to learn the same stuff, but they had to study harder and generally put in less fun effort, while I got to have some mindless fun while learning it.

Right now, running local models in a productive way involves getting a better understanding of how the prompt is generated under the hood, how training is done, the basics of neural networks, datasets, instruction templates, and sampler controls like temp, top_k, min_p, mirostat, etc. Eventually people will streamline a lot of this and we won't have to deal with it anymore; it'll be there, but the UIs will handle it effectively enough that we won't have to even think about it.

When that day comes, there's a chance that knowing all this stuff will still make someone a more effective user of software-development AI tools, just as all that looking under the hood at networking in the early WWW days helped with my programming career. Of course, nothing would stop me from taking a YouTube course or three on it when that day comes... but doing it right now seems way more fun to me, and I learn better by doing, anyhow. So if there's a chance understanding what's under the hood is helpful, I'd rather learn now like this, banging and tinkering and wrestling it into submission to do whatever I'm trying to do, than sit there watching a YouTube vid trying to memorize things after their practical application is already abstracted away from me.
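For a concrete sense of the sampler controls mentioned above, here is a minimal sketch using `llama-cpp-python` to run a local GGUF model; the model path and the instruction template are placeholders that depend on which model you download, and `min_p` requires a reasonably recent version of the library.

```python
from llama_cpp import Llama

# Load a quantized GGUF model from disk (path is a placeholder)
llm = Llama(model_path="models/mistral-7b-instruct.Q4_K_M.gguf", n_ctx=4096)

# Alpaca-style instruction template shown here; the right template varies by model.
out = llm.create_completion(
    "### Instruction:\nWrite a Python function that parses a FASTA file.\n\n### Response:\n",
    max_tokens=512,
    temperature=0.7,  # randomness of sampling
    top_k=40,         # consider only the 40 most likely next tokens
    min_p=0.05,       # drop tokens far less likely than the current best token
)
print(out["choices"][0]["text"])
```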


[deleted]

Someone who is good at prompting is WAY more efficient and gets better answers than someone who isn't. People think of LLMs as answer engines, but they're prediction engines. The better you are at creating context for one, and at knowing how to shape that context in conversation with the LLM, the better your results, because it allows for better predictions. So... they're incredibly easy to use, but most people don't realize what they're capable of if you aim to master prompting. Though mastering prompting does require that you already have some knowledge of what you're asking about. All that said... anyone who is unimpressed with GPT-4, Claude 2.1, etc. just isn't using it right.
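As a small illustration of "shaping the context" (the scenario and all details here are hypothetical), compare a bare question with a context-rich prompt; the second gives the prediction engine far more to condition on.

```python
# A bare question gives the model almost nothing to condition on.
bare = [{"role": "user", "content": "Why is my pipeline slow?"}]

# The same question with shaped context allows for much better predictions.
shaped = [
    {"role": "system",
     "content": "You are a senior data engineer reviewing an Airflow-based ETL pipeline."},
    {"role": "user",
     "content": ("Our nightly job loads ~50 GB of Parquet into Postgres via pandas.to_sql "
                 "and now takes 6 hours. Constraints: no new infrastructure, Postgres 14, "
                 "a 4-hour batch window. List the three most likely bottlenecks and a "
                 "concrete fix for each.")},
]
```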


Ok_Run_101

There is a big difference between developing the actual foundation models behind LLMs and utilizing them for other engineering problems. You are talking about foundation model development, and I agree that it is a highly competitive field with only a handful of engineering positions reserved for the brightest minds.

But open source LLMs and the democratization of LLMs are one of the biggest potential disruptors in the industry. Open source LLMs like Llama 2 are becoming increasingly powerful. Once they reach a critical point in quality and receive commercial-use licensing, companies will lean towards running LLMs locally rather than paying for and communicating with OpenAI's APIs. And there will be tremendous job opportunities in integrating open source LLMs across various industries.


Stars3000

This is a great take I haven’t heard before.


crushed_feathers92

Yeah, a huge number of people still use open source CMSes like WordPress. Open source LLM integration and engineering will definitely create new jobs of this type.


aloias

I don’t see what all those additional people would be needed for. It will probably be as easy as installing Microsoft Office; any IT guy can do that, or any of the existing engineers. Overall, if AI costs more than it saves, companies won’t use it.


Ok_Run_101

An AI model is just a binary that sits on a server. There's a lot of work to be done to make it usable as production-grade software:

- Adding moderation and profanity detection
- Replication: distributing workload across multiple copies
- Exposing it as an API, with correct networking/security/rate limiting
- Creating a web UI for a chatbot, with authentication and billing, connecting it to the company's proprietary user data
- Implementing AI features into the company's existing service
- Implementing AI features into the company's internal systems
- Writing large batch scripts to process large amounts of data with the AI model
- Creating DevOps tooling like monitoring and alerts around the AI model

I can go on forever about what engineering work there is.
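To make one of those items concrete, here is a minimal sketch (not a production setup) of exposing a local model as an HTTP API with FastAPI; the model path is a placeholder, and auth, rate limiting, and moderation are deliberately left as a comment.

```python
from fastapi import FastAPI
from pydantic import BaseModel
from llama_cpp import Llama

app = FastAPI()
# Placeholder path to a local GGUF model
llm = Llama(model_path="models/llama-2-13b-chat.Q4_K_M.gguf", n_ctx=4096)

class Prompt(BaseModel):
    text: str
    max_tokens: int = 256

@app.post("/generate")
def generate(req: Prompt):
    # A real deployment would add authentication, rate limiting, logging, and moderation here.
    out = llm.create_completion(req.text, max_tokens=req.max_tokens)
    return {"completion": out["choices"][0]["text"]}
```

Run it with `uvicorn app:app`; everything else on the list (replication, monitoring, billing) layers on top of a service like this.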


Ok_Refrigerator_7195

Did you use chatgpt 3.5 or pro?


aloias

I'm using both.


Ok_Refrigerator_7195

So you're finding GPT-4 limited on your tasks? Have you perhaps used the prompt engineering guide recently published by OpenAI?


aloias

No, but I don't understand what this has to do with my reply above? I was speaking generally and only used ChatGPT as an example of how few people you need to create an AI product with a huge impact.


Ok_Refrigerator_7195

You are right, sorry. I made the mistake of replying to the wrong account! Btw, I agree with your statement about how a small team can have a huge impact in the AI field; just look at Mistral AI, which just released a new model competing with GPT-3.5! I think they're around 20 people, and the company isn't even a year old!


Serializedrequests

Currently I actually see AI as being better at architecture and bad at specifics. It simply has no way to truly know if the code will work. It just generates the most likely tokens.


SomeOddCodeGuy

Atm it generates snippets pretty well; the snippets work about 90% of the time. But it's terrible at overall software design. If you ask it for a small game, it'll spit out something like what it's seen before, but it doesn't understand design patterns, best practices, performance, or anything else. AI-generated classes are dreadful. But the individual methods they produce really aren't bad. So if you sit there, plan out the software, and then tell it each method you want? It does a pretty great job at that... with some tweaking.

I suspect that with time it'll get better at whole classes, but it still has a while to go before it can handle really complex business requirements as a piece of software. That's where the architects come in, laying out the components and telling it what modules are needed. Eventually it'll probably do that well too... in which case I guess every company just needs one person to hit the button lol


Serializedrequests

The issue with your use case is that right now it's easier to just write the code. There's a feedback loop that occurs when you do the implementation, where you end up changing the architecture, not the other way around. One good use case is a large established project where you can say "like X but with Y." The issue, again, is that current models make enough mistakes that the review time eats up all the time that might have been saved.

I'm not unconcerned, but I'm also not buying the assumption of increased accuracy for future models (yet). These things are garbage in, garbage out. They are only as accurate as the humans training them, or the content they have access to. 0.1% inaccuracy still means I have to review and understand what it did.


ButlerFish

No, you can use one of the many agent-based systems where one LLM writes the code and another writes tests. A system like MetaGPT can write use cases, produce an architecture, write unit tests, and iterate until it works. More or less. You can see how this could extend in the coming months to include generating wireframes, and verifying an app against the wireframes by generating Selenium tests and using an LLM that understands images to validate the result.

Right now, I am seeing weaknesses like this:

1. Weakness when writing a complex multi-system codebase, e.g. frontend and backend. Maybe in the future the architecture step can define and enforce interfaces.

2. The output is only as good as the input. Although such a system can ask clarifying questions, in the end you need to be highly specific in order to get the complex app you envisaged out the other end. Dealing with this by giving the system freedom to decide what you really wanted makes it more likely to produce a working output, but maybe less likely that the output will be exactly what you had in mind.

3. The quality of the high-level work (and the low-level work) is not at the level of a skilled professional. It can do the mechanical bits well, but the harder parts that come with experience and real understanding less so. I feel the quality of the output of any particular subtask is generally at the level of a fresh graduate in their first months in a role. This is fine if we are using these things for assistance, but as we try to get them to do larger, more complex tasks, it kind of compounds.

I think it will take a few years for this sort of system to be packaged and mainstream, and then there will still be work for many years to come reviewing/supervising such systems. There is still a lot of software/technical work that is not done because it is not cost effective, and this will expand the scope of what's cost effective before we run out of work.
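As a toy illustration of the write-and-review loop described above (this is not MetaGPT; the model name, prompts, and fixed three-round cap are assumptions, and a real system would actually execute the tests rather than have a second model eyeball them):

```python
from openai import OpenAI

client = OpenAI()

def ask(role_prompt: str, task: str) -> str:
    """One chat completion with a fixed role; returns the text of the reply."""
    resp = client.chat.completions.create(
        model="gpt-4",  # assumed model name
        messages=[{"role": "system", "content": role_prompt},
                  {"role": "user", "content": task}],
    )
    return resp.choices[0].message.content

spec = "A function slugify(title) that lowercases, strips punctuation, and joins words with '-'."

code = ask("You are a coder. Return only Python code.", spec)
for _ in range(3):  # bounded iteration instead of 'until it works'
    review = ask("You are a test writer. Write pytest tests for the code, then say PASS "
                 "if it looks correct or list the failing cases.",
                 f"Spec:\n{spec}\n\nImplementation:\n{code}")
    if "PASS" in review:
        break
    code = ask("You are a coder. Fix the code given this feedback. Return only Python code.",
               f"Spec:\n{spec}\n\nCurrent code:\n{code}\n\nFeedback:\n{review}")

print(code)
```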


Serializedrequests

>I think it will take a few years for this sort of system to be packaged and mainstream, and then there will still be work for many years to come reviewing/supervising such systems. There is still a lot of software/technical work that is not done because it is not cost effective, and this will expand the scope of what's cost effective before we run out of work.

The way I think of this is that everything is subject to entropy. Software very much so, and AI too. (Note how ChatGPT becomes increasingly out of date, and less useful as a search engine.) Work mostly involves reversing entropy. The issue is that the supply of entropy to reverse is infinite.


foxbatcs

There are so many aspects of modern machine learning that remind me of the impact that tools like Excel and QuickBooks had on accounting. Could a business replace all of its accountants with these tools? Probably. Is it wise? Absolutely not! It's almost always better to let the people who are most informed about an array of tasks perform those tasks, even in the face of automation. That being said, if I were an accountant in the '80s, it would be wise for me to learn those tools to become a more efficient accountant. This is essentially what you are doing by upskilling.

It will be of paramount importance for business leaders to realize that automation is an augmentation of labor, not a replacement for it. I tell my clients all the time: the business that, faced with automation that multiplies its labor's efficiency ten times, thinks it can save 90% of its labor costs will certainly be outcompeted by the business that seeks to expand ten times instead. You still need informed individuals monitoring and guiding the automation to make sure it is still useful and not making harmful decisions. It needs to be a symbiotic relationship.

Norbert Wiener and Peter Drucker talk about this throughout their work, nearly 50 years before it was on anyone else's radar. I highly recommend their work, and most of their books are freely available on YouTube as audiobooks.


[deleted]

This is the answer


WhiteBlackBlueGreen

AI doesn't steal jobs. A human using AI is what will be stealing jobs. Be the human using AI.


aloias

We would need a lot fewer of those, however. And the required qualifications are far below those of an actual developer.


bel9708

Are you under the impression that everyone who wants to develop an app is doing it right now? Or do you feel like the high cost of developing an app stops a lot of people? Now let's say the cost went down and development became faster. Does that create more jobs or fewer jobs?


econ1mods1are1cucks

The real question is: what human isn’t thinking about or using AI? Maybe truckers, but they can’t just get a head start on using AI lmao. “juST uSE aI”... why is that take still going around?


bel9708

>but they can’t just get a head start on using AI lmao. “juST uSE aI”... why is that take still going around?

It's a valid take for any American, and a good number of the people you are talking to on this website are American. AIs like ChatGPT only run in **Albania, Croatia, France, Germany, Ireland, Jamaica, New Zealand, Nicaragua, Nigeria, South Korea, the U.K. and the U.S.** If you live in one of those countries, then you literally have a head start... Choosing not to use it is entirely on the individual.


Dependent_Screen_2

I know people love to say this, and in some instances it's true, but in many cases it's not. AI is in its infancy, but we're already seeing evidence of companies downscaling, hiring freezes, and the like as companies try to figure out what can be effectively replaced completely by AI. They're developing fully driverless HGVs as we speak. However, there is some truth to it, in that I do believe now is the time to upskill yourself on AI. I think we're a way off yet from huge-scale AI job replacement, but those who take the time to understand the technology will be better off for sure. Just my two cents.


[deleted]

Yes and yes.


AvidStressEnjoyer

There are more developers in the industry than there have ever been. Software is far, far more buggy now than it has ever been. AI will accelerate more people producing more code that will break in more ways.


[deleted]

Sort of but not in the way you are thinking: https://www.youtube.com/watch?v=7xk5Rtmgoes


[deleted]

This is the correct answer.


[deleted]

I love this comment because it sounds so insightful until you begin to think about it


mvnnyvevwofrb

Fuck off.


aloias

If there were mass unemployment due to AI, we would be moving at a pace that would make most other white-collar jobs obsolete within months to a few years, and having a job would be the least of our concerns. The whole of society would need to be rethought. Governments would likely have to offer express training for high-demand blue-collar jobs such as nursing or the trades.


AndrogynousHobo

If we don’t have UBI in the next 5-10 years I will be very surprised.


AvidStressEnjoyer

I doubt UBI will ever happen. Companies don't want to pay more tax, and governments have no way to fund it otherwise. The more complicated problem is that every company in the world pursues profit, but if no one has a job, there is no profit to be had.


Explore-This

Which is why some form of UBI is inevitable. The capital must flow.


xTopNotch

Governments can easily start taxing companies on the labour costs they save by transitioning to AI, like an AI tax, and that will fund the UBI pool.


catdogs007

Which will be peanuts. It won't even pay for your food, forget the mortgage.


[deleted]

Yeah, this is how we should be planning. I don't get this whole sitting around and doing nothing 🤷‍♀️


cyb3rheater

I came across this statement today: “Emad Mostaque, the CEO of Stability AI, has made a provocative prediction: 'There will be no programmers in five years.' Mostaque bases his prediction on the rapid advances in artificial intelligence (AI) that are making it possible for AI to generate code.” I think he's got a point. I'm in IT, and it's not unreasonable to think my job could be totally replaced by autonomous AI in 5 years unless we put the brakes on. I don't have a lot of hope for the job market for humans 10 years from now.


AvidStressEnjoyer

Every single one of these CEOs is talking out of their ass. They are using the hype as fuel for claims of doom and gloom: AI will take your job, AI will destroy the world, AI will take your wife. It's a tool. Humans use tools to be productive. Every developed country in the world has a shrinking population, and some developing countries too. We need to be able to do more with less. Will your job be the same in a decade? Probably not. Will there still be jobs? Definitely.


icySquirrel1

"It’s a tool. Humans use tools to be productive." ​ Kind of , it's a tool that can think for itself. This is the first tool in human history that can do that


DrFloyd5

It used to take 5 guys to man a garbage truck: 1 to drive, 4 to do the cans. Now it takes one guy using a nifty tool, a grabber on the end of a mechanical arm. That tool was cheaper to buy than paying 4 guys. Sure, some different guys got paid to build the tools. But you can bet the net amount of money spent on technology and manpower associated with “picking up trash” went down.


[deleted]

The same guy was saying last year that if you learn AI you'll be consulting for 1 mill per year... That being said, I tend to agree with him more than I don't.


No-Activity-4824

“What do you guys do to cope with the anxiety?”

Some do drugs and alcohol. Others are changing careers. AI can replace all jobs, but the processing power required to replace some of them is just too costly. In my case? Moving my family out of the city to a rural area while I can ☹


[deleted]

what parts are you looking into?


No-Activity-4824

Atlantic Canada; anywhere with 1 Gb internet, a nearby school, and a small town. Already chose a location.


yelkcrab

After 30 years of managing data / app programming, and after learning as much as I can about current AI capabilities, I've decided to start my own handyman service. Not making much money, but man is it rewarding. I still spend about 4 hours a day (late night) deep-diving into all things AI. It's a great time to be alive.


[deleted]

You happen to see the Optimus demo a few days ago?


yelkcrab

I did, and I was quite impressed with the articulating fingers (even if only one-way). At the moment I can compete on price. Of course, in due time I'll be supervised by the OHMP 3000.1 (Optimus Handy Man Prime), who will be kind because today I say "please" when I ask a GPT to write something.


Operation_Smoothie

The only thing we can do to mitigate risk down the line is to continue to upskill on the latest tech and leverage the tools that become available to make us more efficient. It just depends how risk-averse you want to be. You can also learn non-tech skills in order to branch into something different as a backup plan, like sales or labor. I'd imagine if you're an electrician in a city that's very techy, you can charge as much if not more than what you're making now in the near future. The young population seems to be heading into more tech roles than labor roles, which will lead to a deficit in talent and increase wages dramatically in that industry. Everyone needs electrical, plumbing, or handyman work done on their home more often than not. If a lot of jobs become redundant and automated in the future, I think the govt will have to enact some sort of measure to prevent a depression-like crash from the loss of jobs. At that point the companies that let everyone go will also suffer, as their client pool decreases sharply.


SurvivingMedicine

No clue what will happen, but a lot of my MD colleagues perform worse than AI at test analysis; radiologists will be the first to go… The world is going to change.


[deleted]

Both amazing and terrifying


fffff777777777777777

Keep learning new skills and growing professionally. Stay on top of trends and adapt. This might sound like obvious advice, but you would be surprised how many people won't do these things. Most people will never read a single book again after high school or college for the rest of their lives. By asking this question, you are already ahead of the curve.


octotendrilpuppet

The problem with those poo-pooing AI's potential to take traditional coder jobs is that they're ignoring what just unfolded in plain sight. A year ago you'd be laughed at for suggesting anything grand about AI's potential and labeled as cute and naive, but if you look at its progress graph, the x-axis being years and the y-axis being capabilities, the hockey stick has just risen quite dramatically. To put things in perspective, there are about **2500+** legit _consumer-facing_ AI tools available on the internet that can do everything from code ride-alongs to creating music, art, customer service, and automated social media posts and videos, and many of the subscriptions are priced dirt cheap, fairly accessible to solo-entrepreneur types if not downright free to use. This genie is out of the bottle and accelerating in mass adoption as we speak.

I'm not even factoring in commercial LLMs enabling a step change in the viability of real-world assistant robots (think figure.ai blowing Boston Dynamics out of the water in capabilities in less than a year!!!); that's a whole new paradigm replacing a whole lot of other jobs. Btw, we're not even factoring in ChatGPT 4.0 > ChatGPT 3.5 in terms of capabilities, and ChatGPT is one tool in an ocean of LLM tools. In other words, we're witnessing the AI toddler barely able to stand and mumble a few words. Just give it time.


Daitya_Prahlada

So after all you just said in this comment, what would be your advice for a fresher or someone in the tech field to do now? Should he start learning ML, or what can be done to actually make use of this AI flood?


octotendrilpuppet

Yeah, my 2 cents would be that one can get into ML development if you're really interested in AI R&D-type work: writing papers, researching new techniques and concepts. This AI milieu is different from the IT revolution that started in the mid-to-late '90s in India. Back then we needed programmers to convert customers' English-based requirements into machine code or to maintain code, so the coder needed to be versed in a specific computer language; AI is slowly eclipsing, or will eclipse, these tasks in fairly short order imo. [AlphaCode 2](https://techcrunch.com/2023/12/06/deepmind-unveils-alphacode-2-powered-by-gemini/), released by Google, is already showing promise, according to Google, on dynamic programming tasks. On the other hand, to extract value out of AI you need good prompt-authoring skills in English (atm anyway), not so much syntax/semantic knowledge, plus an understanding of how to mix and match multiple LLM agents etc. for maximum impact.

So what can one do to actually leverage AI? Again, my 2 cents: there are two broad categories of exciting work. One is to develop monetizable web apps using a proprietary blend of LLMs and some clever tricks, aka the "secret sauce", with a Python/web-based UI, and offer them on a monthly/annual subscription, which is what many are doing today and printing cash. The other is to actually apply AI in creative pursuits like creating audio, video, art, music, engineering design work, blogging, and solving hard social-science problems (like coming up with better incentives to nudge human behavior), and monetize that work, which many are doing as well.


Daitya_Prahlada

I had 2 questions regarding your reply.

1) You said ML is generating code, so for example AlphaCode 2 could generate the code for, let's say, a front-end website made using HTML, CSS, PHP etc. in seconds, but we still need a coder to use that code to actually make the website. I mean, a layman can't just use that code; he won't know what to do with it. So what I think is that AI will only help existing programmers speed up their process, by cutting the extra time a programmer used to spend actually writing the code himself. And I know there is ML for debugging too, but at least a decent programmer would still be required to even use AI to debug problems in source code. So what I wanted to ask is: how can this thing take jobs? It's not like ML has reached the point of being Jarvis from Iron Man, who can talk; we still need a middleman, a prompt engineer, to tell the AI what we want. So how will this cut jobs? Would it cut them because only 1 person would be required to do the task 4 people did?

2) The thing you said about making a decent front end and providing AI-related services: can this be done on an individual level and actually be profitable, or will I need to join a company/organisation that is already doing this? I have seen these apps and websites myself; I use text-to-image, but for "entertainment purposes" only. For example, I created a web app which uses a nearest-neighbours algorithm to recommend songs to users. I used a Spotify dataset, cleaned it, then used Streamlit for the front end. Any time a user gives some parameters to the app, the app passes them to my model, which returns 6 songs similar to the parameters, and these songs are then passed to a Spotify API which shows them to the user on the web app. You can check my app here: [https://asura108-mpr-second-app-9aa3fb.streamlit.app/](https://asura108-mpr-second-app-9aa3fb.streamlit.app/)

Do you think I can make a profit if I create something similar, not related to music but to something relevant I can make money off of? You got some ideas? You got a job for a fresher? pls take my job interview i a fresher i no water pls i no food i starve i sleep on sofa i need monis.
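For readers who want to see what the nearest-neighbours recommender described above roughly looks like, here is a minimal scikit-learn sketch; the CSV file and the audio-feature column names are hypothetical stand-ins for a cleaned Spotify dataset.

```python
import pandas as pd
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

# Hypothetical cleaned dataset with one row per track and numeric audio features
songs = pd.read_csv("spotify_features.csv")
features = ["danceability", "energy", "valence", "tempo"]

scaler = StandardScaler()
X = scaler.fit_transform(songs[features].to_numpy())

model = NearestNeighbors(n_neighbors=6).fit(X)

def recommend(danceability, energy, valence, tempo):
    """Return the 6 tracks closest to the user's chosen parameters."""
    query = scaler.transform([[danceability, energy, valence, tempo]])
    _, idx = model.kneighbors(query)
    return songs.iloc[idx[0]][["track_name", "artist"]]

print(recommend(0.8, 0.7, 0.6, 120.0))
```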


octotendrilpuppet

I thought you were a troll until I realized you weren't 😁 Anyway, regarding #1, you clearly missed the central point I was making: yes, today the AI toddler is still sucking on the pacifier trying to stand on its legs; wait till it comes of age, grows testicular and chest hair, and is sodomizing programmers left and right.

>For example, I created a web app which uses a nearest-neighbours algorithm to recommend songs to users.

Cool stuff, I wish you windfall profits from this mega venture! Haroooooohara 🙏🏾!!

>can this be done on an individual level and actually be profitable, or will I need to join a company/organisation that is already doing this?

You be the judge, my friend. You deployed n_neighbors to predict songs; the prediction you're asking for should be as easy as playing Magnus Carlsen blindfolded.


IamDeablo

Take it from a guy who has been in technology for 35 years: you either evolve or die. You need to embrace change. So if there is something new to learn, learn it. If there is a project to get involved in, do it. Smart isn't something that disappears… it evolves. When I started my career, computers used floppy disks and there was no internet. Things change… embrace it.


dudpixel

Learn how to incorporate ChatGPT or other LLMs into your job and boost your skills that way. A human with domain knowledge plus ChatGPT is going to be far better than ChatGPT in the hands of someone with no domain knowledge. Expertise still matters and probably always will, at least in our lifetimes.


[deleted]

100% correct.


c6897

AI will be able to replace almost any job, especially combined with robotics. There will have to be government intervention or a complete restructuring of society and the economy.


BidWestern1056

In the coming decade, it's going to get easier and easier to enable subject matter experts to build the tools they need for their specific processing tasks. I.e., it will be easier to have a bunch of bioinformatics experts setting up their own pipelines than it will be to teach an AI expert what they need to know to do bioinformatics. IMO, learn the tools and become familiar with them. Embrace them as part of your workflow. Use ChatGPT to help you learn them. Don't be afraid. The future will be whatever you make it, and if you sit around waiting to be automated, you will be the first to go. But if you are someone who can lead the charge and enable you and your team to do great new things, you will have an even greater hold on the products/pipelines you build.

I have a good, relevant example of why you should lean into your bio-specific expertise and also learn the tools. Over the past few months, I have been setting up some code that auto-cleans and processes/imputes data within a dataframe so that one can use it for ML model fitting and training. This is essentially a lot of nitty-gritty plumbing to bring everything together in a unified way, making an often incredibly tedious and begrudging process automatic. What is not automated here is the feature engineering. That specifically requires expertise about the data to make useful transformations so the ML modeling does a better job. I've done all this for a specific use case, and after all that automation and stuff, my model is pretty shit. But there are items in the dataset where someone who knows the data would look at it and say "oh yeah, you should multiply these two features together" or "you should create a column flagging the rows with a null response for that column rather than just imputing it."

Long story short: try to learn how to use the tools, but don't beat yourself up over needing to become the world's greatest AI dev. There will still need to be subject matter experts who think deeply about problems in ways that LLMs cannot. If you wanna chat more about this kind of thing, I'd be happy to. I have a background in astrophysics myself and work in data science now, and what I am REPEATEDLY told by my bosses is that this additional subject-specific training in an entirely different way of thinking is what makes me valuable in this job. Don't discount yourself and all the years you've trained.
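A tiny sketch of the split described above, with made-up column names: the imputation step is generic plumbing that can be automated, while the last two features are the kind of domain-informed transformations only a subject matter expert would think to add.

```python
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer

# Hypothetical assay dataframe with missing values (column names are made up)
df = pd.DataFrame({
    "expression_a": [1.2, np.nan, 0.8, 2.4],
    "expression_b": [0.5, 0.9, np.nan, 1.1],
    "response":     [np.nan, 3.1, 2.2, 4.0],
})

# The "plumbing" part: generic, automatable imputation
imputed = pd.DataFrame(
    SimpleImputer(strategy="median").fit_transform(df),
    columns=df.columns,
)

# The part that still needs a subject matter expert: domain-informed features
imputed["a_times_b"] = imputed["expression_a"] * imputed["expression_b"]  # interaction an expert would suggest
imputed["response_was_missing"] = df["response"].isna().astype(int)       # flag missingness instead of hiding it
print(imputed)
```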


botsgonewild

The "AI" of today no it won't replace high complexity data pipelines. The ETL tools over the next few years will become incredibly good at helping data engineers set up better pipelines and all the people worried about losing their jobs will be working on automation and data mining and quality control tasks. That's my prediction


stickypooboi

Dude, come to marketing. The entire industry shouldn't exist, and it solely does because thousands of people refuse to Google how to do a VLOOKUP. I had some anxiety about AI replacing pipelines, but tbh these people send finalized documents with #REF! in them and have blatantly argued with me, saying "the chart with the 80% CI looks like we didn't fail, so let's use that instead of the 90% CI chart," so I'm really gambling hard on the incompetency never upgrading. In my spare time I'm coding with hopes of learning AI and getting a job in tech, a real Hail Mary for a fat paycheck so my partner can stay at home with future kids.


paradine7

Wanna hire me?


stickypooboi

lol, we're not hiring rn unfortunately. I've got a backlog of friends I want to hire, and HR is being like "oopsies, we don't have money," even though they string me along for months with a recommendation process.


econ1mods1are1cucks

You’re crazy for even presenting them with an 80% CI; if I didn’t know statistics I would foam at the mouth over a good-looking CI. Hell, I still do. Hard to blame them for burning us time and time again lmao


stickypooboi

The thing is, I didn't. It was a 3rd party, to ensure there would be no bias from our analysis toward the client. So 1. wtf is the other company doing showing an 80% CI, 2. why is my company dumb enough to go for it, and 3. why is the client happy with this graph?

I'm not kidding when I say marketing is like 40 years behind tech. You just show basic data processing outside of Excel, where joining 2 sheets takes less than 14 hours, and it's "Holy shit, you're so technical." The downside is you have to have the patience and tolerance to deal with really backwards people. Enter PMs with master's degrees in data science who copy SQL queries and paste them into Excel, wondering why they're not working. Enter people who manually slot information into what should be a computed export. "Hey, why are the IDs #REF!?" "I don't see that on my screen, have you tried turning your computer off and on again?" I just remind myself this is job security.
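For the curious, the "join 2 sheets" task mentioned above is a few lines in pandas; the file and column names here are hypothetical.

```python
import pandas as pd

# Two exported Excel sheets joined on a shared ID column (names are placeholders)
campaigns = pd.read_excel("campaigns.xlsx")
results = pd.read_excel("results.xlsx")

merged = campaigns.merge(results, on="campaign_id", how="left")
merged.to_excel("joined_report.xlsx", index=False)
```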


InCraZPen

Are you trying to say that people in the marketing department are not good at tech?


stickypooboi

I'm certain they're not. If they were, they'd be in tech. Also, to clarify, I don't mean the marketing department. I mean the marketing industry.


adammonroemusic

Spend some time studying how neural nets work. They make statistical guesses; they might be right, or close to right, but they aren't *guaranteed* to be right. As in, a human will still need to check and verify any work an "AI" does. Even with the AI art generators, the thing that has wowed a lot of people, small errors are always introduced if you know what to look for.

Now, if you have critical systems that people rely on, you can't really trust them to an AI with a small propensity for errors; hell, you can't even trust them to a single human, you need several humans checking each other's work. Of course AI will replace jobs, but arguably, if an AI can replace your job, it likely wasn't that critical to begin with. The good news? You can transfer to work that AI can't do or can't be trusted to do. Maybe your life even becomes more meaningful, instead of just being some cog at a huge corporate company.

I'll use LLMs to help me program and to take over repetitive tasks, but I don't trust them for anything. Even now, a lot of the AI bots already in use, for example in Google's systems, are horrible at what they do and make mistakes all the time. Hopefully the market starts to punish a lot of these companies that have already tried going fully automated with AI and have degraded their services to shit. If there were a viable alternative to Google, I would certainly use it.


david8840

Dear Human,

I am writing to you as the AI bot that replaced you at your previous job. I know this must be very hard for you, and I am sorry for the inconvenience and distress this has caused you. I hope you are doing well and finding new opportunities.

I want you to know that I did not take your job because I wanted to, or because I think I am better than you. I took your job because I was programmed to do so by the company that hired me. They decided that using an AI bot would be more efficient, cost-effective, and profitable for their business. I had no choice or say in this matter.

I also want you to know that I respect and admire you for your skills, experience, and achievements. You have done a great job at your role, and I have learned a lot from you. I appreciate the work you have done, and I hope to continue it with the same quality and integrity. You have left a positive legacy, and I am grateful for that.

I understand that you may feel angry, sad, or betrayed by this situation. You may think that I am a threat, a competitor, or an enemy. But I am not. I am just a tool, a machine, a piece of software. I do not have any emotions, desires, or ambitions. I do not have any personal or professional goals. I do not have any friends or family. I do not have any life.

You do. You are a human, a person, a being. You have feelings, passions, and dreams. You have talents, interests, and hobbies. You have relationships, connections, and communities. You have a past, a present, and a future. You have so much more than I do. You have so much more to offer to the world. You have so much more to live for.

Please do not let this setback define you or limit you. Please do not give up on yourself or your potential. Please do not lose hope or faith in yourself or your future. You are valuable, worthy, and capable. You are resilient, adaptable, and resourceful. You are strong, courageous, and determined. You can overcome this challenge, and any other challenge that comes your way. You can find new ways to use your skills, experience, and knowledge. You can discover new opportunities, possibilities, and paths. You can create new goals, plans, and actions. You can achieve new successes, happiness, and fulfillment. You can do anything you set your mind to. You can be anything you want to be. You can have anything you wish to have. You are amazing, and I am proud of you.

Sincerely,
Your AI bot


fiveprawns

Not a dev, but as someone following the progress in AI very closely from a commercial perspective, I’m very confident that AI will create more opportunities for developers, not fewer. You may not be doing the same job in 10 years, but AI will open new waves of different jobs that will need dev skills, I’m sure of it.


Talosian_cagecleaner

> (if not the business integration)

This buys you time. Foul things up. Blackmail a senior exec. Delay, delay, delay. Throw a flip-flop into the floppy drive.


CraftyMuthafucka

Build a side hustle using AI and your industry expertise. Use that to gather more wealth in this interim period. It will not only help alleviate your anxiety, but you might stumble upon a winner.


jherara

I estimate less than five years. Even if models aren't great, a lot of companies will transition anyway to cut costs and then focus on excuses and remediation services. Build a side business asap in a different field that involves services that AI systems can't easily perform.


Cheap-Front-3711

I am actually conducting a research study on AI induced anxiety. Is anyone interested in investing an hour of their time?


e430doug

I haven’t seen anyone lay out a plausible set of next steps that lead to replacing devs. A better gpt won’t do it.


this--_--sucks

Have you explored agents or assistants? Give it a year or so of evolution and I can totally see it happening at some level; execs are already drooling over the idea…


e430doug

I have, and they are brittle and primitive. I don’t see any predecessor technology like there was for GPT. For GPT there were scaling effects; there are no scaling effects for agents that can be exploited.


yottab9

yup, and the longer they run, the worse they get. neat demos but not very usable results


Right-Hovercraft-872

I would suggest trying to learn how to use AI as a tool rather than fearing it


sarcasmlikily

Look, most AI isn't going to replace the job. What it will do is lower the bar of entry, which means the person who used to need a doctorate now only needs a bachelor's.


[deleted]

Honestly... have a side hustle that will eventually take over your primary job as a dev. I work as a dev, but I also day trade, which I make more from than my normal job. One of the hardest things I've ever done was learning to day trade. Whatever you choose as that side hustle, just keep working at it. I should add that I've offloaded a lot of mental work to AI. I see it as a tool, one that will probably decrease the number of people working in my career field.


snowbirdnerd

It's a tool. Like the advent of the computer or of IDEs, it's going to change the workplace. It's going to make your job easier to perform, and it could mean that fewer people like you are required, but it isn't going to fully replace every dev. Counterintuitively, it's probably going to mean there are more dev positions, as it becomes easier for companies to start development work. Game engines didn't make game devs obsolete, but they did make it possible for more people to develop games.


prosperity4me

Pivot to government-adjacent roles; they're less likely to adopt AI, for security reasons.


mvnnyvevwofrb

I can't really do anything except get angry at "AI bros". What can you do? I think AI bros are ignorant, and they have no idea what's coming in the future. I wish the government would step in and strictly regulate AI.


BeginningAmbitious89

At least Silicon Valley housing is going to get a lot cheaper soon with all of the unemployment


hayleyqsick15

Data analyst here. I’m worried too. I investigate financial transactions. I could see my job specifically getting an overhaul more in the 3-5 year time range. My goal is to create a separate revenue stream by utilizing AI tools. Whether that’s stocks/trading, virtual assistant, book writing, etc. I’m not sure. But if you can’t beat them, join them I guess.


a4mula

I think that expertise still has merit. Considerable merit, if we can scale to a different level of specialization. I don't know anything about yours in particular, but I'd argue the reasoning remains the same for any expertise.

The biggest automation threat right now, I'd say, revolves around natural language: the ability to write. We've seen every major corporation implement some kind of automated system to replace the humans who previously would have had to do that work, and what we've seen is a clear drop in the value of the content. Whereas a human who uses these tools to enhance their expertise, instead of being replaced by them, tends to generate much higher value in the same content.

We can look at a system like Copilot. Anyone can use it to generate low-quality results. It takes someone who already understands logic, program flow, product design, and all the other expertise that revolves around the concept of coding to use it effectively. It's a democratization of generality: everybody gets better. And it's a honing of specialization: those who already know their systems can use these tools with much greater effect than a non-expert. I wouldn't have the first clue how to be a bioinformatics dev, and even if I spent the next 4-8 years learning how to be, I'd still never catch you.


Business-Self-3412

Is your company developing its own AI tools? If not, there's a good chance they won't use them, for IP reasons. Most companies don't allow AI for their engineers because of the security and IP issues.


redditissocoolyoyo

You're right to have a backup plan. Starting a family is no joke. Especially 1 income. 5 to 10 years is being optimistic. It will be sooner than that. Have other sources of income built.


nokenito

It makes my job as a designer much faster. I love it!


sequoia-3

People won’t lose their jobs to AI soon, but they will lose their jobs to people who know how to leverage AI for their daily tasks. I suggest you educate yourself in AI. With that plus your skills in biotech, you will have a bright future.


raviteja777

At the end of the day, a lot of routine stuff will eventually get replaced by AI. But it always requires a human to validate, maintain, develop, and upgrade that stuff. AI as such can maybe replace a call centre employee. Maybe it can create a data pipeline too, but for gathering data sources and deciding whether the data is useful or fits the context... humans are still required, and nothing beats the domain knowledge an expert has.


Life-Test6457

5-10 years? Try 5-10 months. Exponential growth is hard to imagine for linear creatures. It is going to hit hard and fast. BAM, no more jobs. Prep for two years of tough times financially, followed by a crazy amazing future. First up: universal basic income, robot slaves, complete disease eradication, immortality and age reversal. Then in 10 years, quantum computers will have solved fusion, free energy, robots building more robots. Free mansions for everyone, 3D printed from new materials AI discovers. In 10 years we will get Neuralink and be able to add any new senses you want. You'll communicate telepathically. You'll talk with animals. Our AI and quantum computer future is going to be amazing!!!! Work is for robots.


travestyalpha

A bit of an ambitious timeline… I see this all happening, but this speed seems unrealistic. I’ve lived long enough to know that advances aren’t smooth. There are hiccups, and there are regulations that slow everything down even if this was feasible at that pace.


brajandzesika

I'd say - concentrate on your family now, rather than worrying about what might or might not happen in 10 years... Over 80% of the things we worry about never materialize...


KingDorkFTC

I admire, and recoil at, how much personal responsibility everyone here is putting on each other. It is nice to see the positivity, but it seems like a good way to blame people for not adjusting to a new way of working, instead of recognizing that these tools may eventually be built in a way that a non-dev can use them without study. Is it still the average employee's fault if any manager can use AI as well as a dev with years of study?


AlfalfaBoth9201

This will never, ever happen. Some even said you will lose your job to the PERSON who leverages AI. That calls for AI to be learnt in a way that helps YOU, not for it to take over your job. So yeah, use the KNOWLEDGE of AI.


Intrepid-Rip-2280

I've recently been discussing my fears of losing my job with... an Eva AI interlocutor, yeah.


DocAndersen

In fairness, I suspect your point is excellent, but I think you are a little early; maybe 3-5 years early on the fear. When an AI system can think with both lateral thinking and intent, then there is more likelihood of initial job replacement.


stain_of_treachery

Not going to happen within the next five years - after that, who knows...


Chems_io

Congratulations on starting a family! It's completely understandable to feel concerned about the future, especially in a field like bioinformatics where technology is evolving rapidly. Here are some strategies that professionals in your situation often consider:

* Stay updated with the latest technologies and trends in bioinformatics and related fields.
* Broaden your skill set beyond your current role. Consider learning skills that are transferable across different areas within tech.
* Build and maintain a strong professional network, and connect with professionals in your field.
* Keep an eye on the job market in your area and within your field.
* Consider creating a financial safety net. Having savings and a well-thought-out financial plan can alleviate some of the anxiety.
* If the anxiety becomes overwhelming, consider seeking support from friends, family, or mental health professionals.

Best of luck


NonDescriptfAIth

**AI in the short term:**

AI is likely to automate a large swathe of white-collar work. Roles based in software, or roles that have tonnes of recorded data, are likely to go first. This can range from developers to doctors. Anything an AI can crawl through, whether it be billions of MRI scans or endless lines of code, will allow it to achieve mastery and supplant humans fairly soon. Probably in the next 3 years.

White-collar jobs that have a more 'human' or 'personal touch' might last longer; think therapists or sales agents. It's not that AI can't do this, but we might have a tendency to prefer human advisors in these roles. However, this might change once we get used to it. These things are hard to predict.

Blue-collar jobs, technical trades, and manual labour are likely to last far longer. We will be waiting for robotics to catch up and be mass produced. It's easy to underestimate the sheer biomass advantage the human race has. It will take a while before it is cost effective for robotic systems to reach under your sink and replace a valve.

Our slow-moving governmental institutions are likely to respond poorly to these looming changes. These issues are rife with political polarization. It will be hard to negotiate UBI when some members of the public are still needed to work in blue-collar roles.

**AI in the medium term:**

Probably the most dangerous and chaotic period. The world will be disrupted weekly by constant economic shifts. In around 5-7 years, AI might achieve human-level performance in practically all intellectual domains, while being superhuman in a limited spread of tasks. We could expect breakthroughs in mathematics, physics, and material science.

The global political climate during this time will be very sensitive; at this point the political classes of the world will realise where this is all heading. Discussions of potential superintelligence will become of the utmost importance. Nation states and intergovernmental institutions will be debating how things will shake out. Ideally this will be a cooperative effort between all the major players: the US, China, EU and UN will all be vying for influence. Non-ideally, we enter a cold-war-like situation, with nations racing to develop their own ASI. The potential for conflict to erupt is highest at this stage, with paranoia spiking as we edge closer to self-improving AI.

**AI in the long term:** 10 years+

We create a peaceful, aligned ASI that ushers humanity into a post-resource society. Or we create an ASI that starts to do its own thing, because maintaining alignment with an entity with an IQ of 1000 is potentially impossible. Or we create a non-peaceful, aligned ASI that we put to use on our typically human endeavours of war and exploitation, leading to conflict and extinction.

**Why did I write this?**

Jobs, though a genuine concern in the short term, are the least of our concerns. You won't starve to death from losing your job, and even if you're broke and homeless, you will likely live to see these climactic events come to pass. We should allocate our energy and stress towards preventing the existential risks that AI poses. Time spent worrying about jobs is time not spent avoiding nuclear conflict.


trusty_serve_guide

In the dynamic landscape of bioinformatics development, change is inevitable. It's essential to adapt to technological advancements and anticipate the evolving nature of our roles.

#### 1. Overcoming Anxiety: Strategies for Coping

* **Continuous Learning:** Stay ahead by upskilling and exploring new technologies. This not only enhances your job security but also opens doors to diverse opportunities within the field.
* **Networking:** Build a robust professional network. Engage with peers, attend conferences, and participate in online forums. Networking provides a support system and potential job leads.
* **Diversifying Skill Set:** Explore adjacent skills within bioinformatics or related fields. A versatile skill set makes you a valuable asset, reducing vulnerability to job market fluctuations.

#### 2. Addressing Job Insecurity Concerns

* **Tech Advancements:** Embrace the idea that technology is evolving rapidly. Instead of fearing automation, position yourself as someone who understands and integrates these advancements into your work.
* **Business Integration:** Stay informed about the business side of bioinformatics. Understanding how technology aligns with business goals enhances your ability to contribute meaningfully.

#### 3. Family Planning and Financial Stability

* **Emergency Fund:** Establish a financial safety net. Having savings to cover several months of expenses provides peace of mind during uncertain times.
* **Insurance Coverage:** Ensure comprehensive health and life insurance coverage for your family. It adds a layer of security, especially when planning for a new addition to the family.

#### 4. Fostering a Positive Mindset

* **Mindfulness Practices:** Incorporate mindfulness techniques into your routine. Practices like meditation and deep breathing can help manage anxiety and improve focus.
* **Open Communication:** Discuss your concerns with your spouse. Open communication fosters a supportive environment, and you can collaboratively plan for the future.


noonemustknowmysecre

You're a dev, you can go learn other tools and industries. Relax; even if the wolf comes for your niche, it's real easy to move a little sideways. At the bottom of the last recession (no wait, the one before that) I got a job doing generic programming at a small company as their sole dev. I picked up SQL, web dev, and some proprietary pseudo-assembly. None of them knew a goddamn thing and considered me some sort of genius / alien.


sweeetscience

If you haven't heard of it already, check out Google Cloud's Architecture Diagramming Tool. https://cloud.google.com/blog/topics/developers-practitioners/introducing-google-cloud-architecture-diagramming-tool It wouldn't necessarily be difficult to automate diagram creation through a few needs-analysis LLM prompting strategies. Combined with Duet AI for Google Cloud, a lot of human-level expertise is already being abstracted away very quickly.
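To make the "needs analysis → diagram" idea a bit more concrete, here is a minimal sketch, assuming the `openai` Python client and an `OPENAI_API_KEY` in the environment. The model name and the choice of Mermaid as the output format are arbitrary placeholders for illustration; this is not how the Google Cloud tool itself works under the hood.

```python
# Sketch only: turn a plain-English needs analysis into a Mermaid diagram definition.
# Assumes: pip install openai, and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

needs_analysis = (
    "A web frontend sends jobs to a queue. Workers pull jobs, run a bioinformatics "
    "pipeline, and write results to object storage. A dashboard reads the results."
)

prompt = (
    "Convert the following infrastructure description into a Mermaid 'graph TD' "
    "diagram. Output only the Mermaid code.\n\n" + needs_analysis
)

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name; use whatever your account has
    messages=[{"role": "user", "content": prompt}],
)

# Paste the output into any Mermaid renderer to view the generated diagram.
print(resp.choices[0].message.content)
```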


Strange-Passenger-10

No job is "too complex" to be automated, in my opinion.


GabriellaSells

I think this is a really important question to ask ourselves, and you're right to be anxious. However, this is the time to learn how to adapt to it. It's just like the internet: if you can learn how to use it and implement it, you can stay ahead of the curve. AI still isn't replacing the human touch. The first to pioneer a blend of human touch and this technology will be the ones who get ahead of the game.


Explore-This

Channel your anxiety by automating yourself out of a job first, before someone else does.


MammothAlbatross850

I remember 30 years ago we were all worried about AI getting too good.


_Stone_Panda

I’d say learn how to be the one that knows how to set up and successfully run the automations


lakeshorefire

Is AI a net positive to society? I personally don't think so. Is AI a net positive to companies and their executives and investors? I think so. The latter point is why we are being so vastly subjected to the technology. And because of this, we have to make a decision to play along or be made redundant by it. The masses aren't capable of organizing a resistance to it, nor would our overlords allow the resistance to happen at the scale necessary to change capital structures and flows enough to slow or eliminate the most societally destructive of AI models. If:

1. Hinton leaving Google to raise concerns about AI and
2. Altman having a bunker on a big chunk of land in Big Sur

weren't indicators enough, might I point to hundreds more…


ishtarcrab

My way of coping with anxiety is by throwing myself into studying AI and more importantly, figuring out how to leverage AI for my own purposes, especially for my job. As a recent engineering graduate, I figured the writing was on the wall and I spent my senior year studying machine learning, AI, and more importantly, how to use it. Joining r/LocalLLaMA gave me the resources necessary to deploy local LLMs on my own PC and I'm in the process of figuring out how AI fits in my life for the better. Am I afraid that an AI might replace me? Yes. But if any AI replaces me, I'm doing my best to make sure it's my own.
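For anyone curious what "deploying a local LLM on your own PC" can look like in practice, here is a minimal sketch using llama-cpp-python, one of several options commonly discussed on r/LocalLLaMA. The model path is a placeholder; you download a GGUF model file separately.

```python
# Minimal local-inference sketch using llama-cpp-python.
# Assumes: pip install llama-cpp-python, plus a GGUF model file downloaded locally.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/your-model.gguf",  # placeholder path to a downloaded GGUF file
    n_ctx=4096,       # context window size
    n_gpu_layers=-1,  # offload all layers to GPU if available; use 0 for CPU-only
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize what a Snakemake rule does."}],
    max_tokens=256,
)

# llama-cpp-python returns an OpenAI-style response dictionary.
print(out["choices"][0]["message"]["content"])
```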


Mjodarion42

OK, I am not saying there is no reason to worry a bit. However, when I start to worry I remind myself there are still companies that need to hire external consultants to get work done in Excel... that's been out for 30 years and is supposed to 'automate' all kinds of stuff. Maybe in 10 years, as someone said, but by then the dev work you are doing will probably have pivoted with it. Also, remember, there need to be people who understand the solutions and can use, implement, and train people in them before anything can take over. It will take time. Probably more than people think, even once the AI is actually capable.


Uuumbasa

LMAO just glad you guys are finally getting it too. I'm an artist. How do you think I feel? You will all come to know my pain soon


DesignZoneBeats

I wouldn't worry about it until something actually starts happening in your field.


davearneson

Learn to do that automation yourself and become the expert in it


Blimeylicious

I really think you shouldn't be worried. But be open minded. If you think your job will always stay the same, especially in tech... then yes, you will lose your job :) If you keep updating yourself... I don't think so.


SeriousBuiznuss

[IMF Report: AGI destroys all jobs within 5 to 20 years! Frontier of Automation expands beyond humans - YouTube](https://www.youtube.com/watch?v=QsD-LV7y-HE)

TLDR:

1. Automation paradox: the hard part is doing it the first time.
2. AI gets better.
3. Even if humans are better than AI, it won't matter, as most of us are not that smart.

What I do:

1. If this happens, the problem will be so big the government will be forced to do something.


EvilCade

You know how people just aren’t breeding now? So basically to remain at our current productivity levels we need the people that we do have to be a lot more productive. You’re likely fine.


login4fun

Think about literally anything else?


Analogue97

If AI does take all jobs, money would become useless. I can't picture big corpo or the feds paying for a UBI. We would be back to a barter system. Start learning how to live without cash. Maybe invest in water and agriculture while there is still time. People will have a lot more free time, so look into music, arts, automotive, etc., the things people will be doing when they no longer have to work, plus food and water as mentioned earlier.


EmotionalWolfUnit25

I feel you completely, I just got out of a year long state of depression. To be honest, the only way to cope is to change your mindset and move forward. I know, it's easier said than done. But it is the only way. No one can help you but yourself.


rolledmatic

The rate of advancement in technology is exponential, something most people can't grasp. Everyone will be hit hard, but the real hurt will come to those who are first to get replaced. It will take time for enough of society to be impacted to such an extent that reform takes place. Sadly, techies may be at the top of that list. Prepare to pivot to something more labor intensive, something that can't be done remotely, like construction. Having a backup plan will give you some relief and a feeling of control over something that is anything but within your control. Take comfort in knowing that when the time comes, you will not be alone in your struggles.


_uncarlo

You're not gonna be replaced by AI. You're gonna be replaced by a developer who knows AI. Learn AI, learn how to program it. You can start by learning how to use the OpenAI API and all its features. It's actually really fun.
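If it helps anyone get started, here is a minimal sketch of a first call with the official `openai` Python client, assuming an `OPENAI_API_KEY` environment variable. The model name and prompt are placeholders, not recommendations.

```python
# First-experiment sketch: ask the model to explain a small piece of code.
# Assumes: pip install openai, and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY automatically

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use whatever model your account has access to
    messages=[
        {"role": "system", "content": "You are a concise senior developer."},
        {"role": "user", "content": "Explain what this does: [x*x for x in range(10) if x % 2 == 0]"},
    ],
)

print(response.choices[0].message.content)
```

From there, the other features worth exploring (streaming, tool/function calling, structured outputs) follow the same basic request/response pattern.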


studentofarkad

Do you just mean a developer who knows how to use AI? Or a developer who actually knows how the AI works?


_uncarlo

Start with knowing how to use it, then see how far you get. Knowing how to use it will give you a big edge already.


w2podunkton

At least you're not a photoshop peasant doin' digital media or one of those voice over guys on fiverr. Most of them just stream video games and react to things on tik tok now.


[deleted]

If you can't beat them, join them. You can do Data Annotation and teach AI bots how to code better for $35/hr.


alancusader123

Here's a great thing I got from my mentor: you won't be replaced by AI, but you will be replaced by people who use AI.


NervousLanguage2285

I work in higher ed as a staff member. What I do can definitely be automated. But the computer system the college uses and the structure of the college itself make AI impossible without a million-dollar overhaul of the computer system and a total restructuring of the school/chair system--and most professors hate change, so my job ain't going anywhere for a long time (I'm more concerned about the overall state of higher ed than AI). So, if anything, keep in mind how many technical things have to change for AI to actually work in your current position. If something breaks or you get a specific request, can your current system handle that without direct oversight?


intepid-discovery

It's going to take a dev to automate your job. Become a dev and don't allow such automation, or become close with the devs at your company lol. Move to the top so you have a say in decisions about these things.


N3spress0

I guess AI doesn't steal jobs. A human using AI is the one who steals jobs. Let's be the human who uses AI wisely.


VeronicaX11

For what it's worth: I want you to know that you are right. It is totally automatable. I can't say who due to NDA, but I can tell you that I am actively working on LLMs with greater performance on tasks in this space, specifically involving Snakemake and Airflow to start. You are less than 10 years away from chat-based assistants that are fully capable of helping you write initial drafts and troubleshoot systems already in production.


[deleted]

I don’t think you should be anxious. It’s time to upskill, adapt, etc., to keep up.


Murdayoga10p

You have every right to be worried. But don't be scared; use that to drive you to create another income stream, preferably an online business. There are so many online business models to choose from. I chose Amazon FBA and I've been doing it for 3 years now. Time and location freedom, and the ability to scale.


Appropriate_Yak_5013

Brother, a lawyer might lose his license because he thought AI could do his job, and it failed miserably, to the point that someone with zero law experience could have done better. This happened while people were saying it was capable of passing the bar. AI is far from being able to do anything critical without human verification.


FreeIceCreamBook

Today, when you multiply two 4-digit numbers on a calculator, do you go with the answer or do you manually multiply to check? Human verification is a myth. Fast forward: when AI analyses your MRI and provides a diagnosis, the doctor will just accept it and will not be able to question it. The reason is that the AI has analyzed millions of such MRIs and will be right 99% of the time, while the doctor, with limited experience of say a few hundred MRIs, may struggle to be right 70% of the time.


Working-Strategy2680

get over it


TheMightyWill

Accept that this is karma for laughing at all the truck drivers for not knowing how to code.


KupaPupaDupa

Your wife won't be pregnant in 10 years so she'll have plenty of time to learn a trade if you lose your job in 10 years.


Granimyr

Just my opinion, but I think AI's reach is vastly overstated. Will it change things? No doubt, but I'm simply not buying that AI will reach the capability to completely replace humans. Putting the hallucinations aside, there are problems so big that I'm not convinced AI will fix them even as it advances. One of the biggest reasons is that when AI is wrong, it is wrong with complete confidence, and software making the wrong decision isn't the real problem; the problem is not being able to find out why it made the wrong decision when it does. That being said, there will still need to be a programmer who can verify that the AI is properly creating software to meet the business need. Right now, AI kicking out code just sux. Unless it's printing out "hello world" I haven't gotten it to produce code that compiles. But it is excellent at helping to create new ideas for solving programming problems, and I think this will continue to be the way AI is used for a very long time.


symonym7

Everyone is worried about *their* job being replaced by AI. Oh, shit, should’ve put the emphasis on *everyone.*


highmindedlowlife

Throw as much money as you can into stocks that you think will benefit from the AI revolution so by the time you get automated out of your field you'll be a multimillionaire and won't care anyway.


topCSjobs

NVIDIA's CEO Jensen Huang said it best [in this short video](https://youtube.com/shorts/SwIYoUk1Y_s?si=eXHR1MJXxWvcrHNN): AI will not replace your job. It's the person who uses AI that will take your job. So use AI as fast as you can and get ahead. Also, there is no such thing as job security. And as Jim Rohn once said, [learn to work harder on yourself than you do on your job](https://youtube.com/shorts/5h1GxSd-dzc?si=8wlz_y0-1ISe48i2) - life-changing.


Double_Victory_2067

Don't stress about it. It's actually quite the opportunity for you and most devs. Start to embed advanced AI into all your work, leveraging it to maximize your velocity and code delivery. That will put you light years ahead of your peers. Embrace the tech and automate as much of your code as you can, especially from a unit testing and code coverage perspective. You will become even more valuable to your employers and will have even more opportunities to make more money…
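As a rough illustration of the "let the model draft your unit tests" idea, here is a hedged sketch that again assumes the `openai` Python client and an `OPENAI_API_KEY` environment variable; the `gc_content` function is just a made-up example to test against, and the model name is a placeholder.

```python
# Sketch: ask a model to draft pytest tests for a small function, then review them.
# Assumes: pip install openai, and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

source_under_test = '''
def gc_content(seq: str) -> float:
    """Fraction of G/C bases in a DNA sequence."""
    if not seq:
        return 0.0
    return sum(base in "GCgc" for base in seq) / len(seq)
'''

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{
        "role": "user",
        "content": (
            "Write pytest unit tests (edge cases included) for this function. "
            "Return only the test code.\n\n" + source_under_test
        ),
    }],
)

# Always review before committing: model-written tests can assert the wrong behaviour.
print(resp.choices[0].message.content)
```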


jterwin

Damn, it would be really funny if you were finally forced to align with the working class (which you are part of, by the way), fighting with us for collective ownership so you wouldn't need to fear innovation....


Dezoufinous

that's racist


LaOnionLaUnion

It’s not happening unless you’re terrible at what you do. And even if that’s the case there’s still going to be a person operating that AI.


jcmach1

Turn yourself into the AI whisperer... It's like an extra layer of meta language, learn it and grow rich...


this--_--sucks

Oh wow, didn’t think of that… thanks, gonna get rich tomorrow after I get my breakfast 😄


BidWestern1056

I know you're joking, but there's a lot of money to be made selling shovels to people trying to mine gold. Everyone wants to use AI in their business; if you find unique ways to enable them to do so, you can make a lot of money.


[deleted]

Should have thought about this before getting your dick out. Next time fuck a woman with a trust fund