Until Devin, or whatever dumb name they give their next AI, actually gets a robotic body, kicks my door down, and takes my laptop from my lifeless hands... I will continue programming and learning to code. We humans taught AI everything it knows, not everything WE know or are creatively capable of coming up with. There is a big difference.
"We humans taught AI everything it knows, not everything WE know or are creatively capable of coming up with" Sorry if I'm being ignorant, but isn't that just like a parent thinking their kid can't go astray just because the parent raised them right?
🤣🤣🤣🤣 Your comment had me in stitches, because for more than a decade I've been getting called to fix issues here and there. Issues that no amount of googling or Stack Overflowing will help anyone find the answer to. I have been writing the solutions to those issues in each company's knowledge base, which either demands a login to access or isn't connected to the internet at all. Good luck, AI, finding out how to do those.
@@Black-mz4wy No, I think you missed my point. A child is capable of learning organically and creating a path separate from its parents. There is a reason we call it artificial intelligence. It's not about "going astray". Even when an AI "goes astray", it can only do that within the context of what it knows, or hallucinate what it thinks it's supposed to know. Look up AI hallucinations.
@@brainites 😆👏👏 This right here!!! I don't think people understand the types of complex problems and, most importantly, complex mistakes that humans are capable of creating. Things that require another human to unravel. Spicy take: I think the folks who want AI to kill coding are looking for a way to feel better about their own laziness. I'm in a third-world country, coding my arse off, while folks (especially in my age group) bitch and moan about AI every other day. If they are that worried, they should learn Python and get into AI 😆
When the office computer was first released, everyone was making doomsday predictions that office jobs were essentially dead, because one person on a computer could do the work of 5 typists with typewriters. Newsflash: more people than ever work in offices today. You just don't understand economics.
There's a difference between replacing software engineers and eliminating the *need* for many software engineers. This is what people who dismiss AI can't grasp. Junior roles will be effectively wiped out, since these AI tools can do their jobs, so getting your foot in the door will be incredibly difficult. Additionally, the layoffs created a pool of thousands of experienced unemployed devs; how can a person just starting out get a chance? The second thing they'll hit is your salaries: if these AI tools make your job easier, why should I compensate you the same as if they didn't exist? Besides, Devin is just a first-generation tool of its kind; they will develop more advanced versions that can do all the things you listed at the beginning of the video.
There are a lot of false or wrong points in what you wrote. First, the layoffs had nothing to do with AI. Second, the fact that someone's job becomes easier because they have tools to help them doesn't mean they'll get paid less; you fail to account for scale. If I can do X tasks in Y amount of time, and AI tools now let me do X tasks in Y/10 time, that just means I can do 10X tasks in Y time, which translates to more profit for the employer. The fact that a lot of juniors won't find a job is something engineers in other professions have experienced in the past; it's a function of supply vs. demand and the economy, and has nothing to do with AI. Now, at the end of the day, we should be worried and have to rethink things philosophically, including whether AGI should even be pursued to begin with, but that has nothing to do with what you wrote and certainly isn't in the near future. The idea that we are only a few years away from AGI is laughable.
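The scale argument above can be sketched with toy numbers (the work week and per-task times below are my own illustration, not from the thread):

```python
# Toy throughput model: fixed working time, variable time per task.
def tasks_completed(minutes_available: float, minutes_per_task: float) -> float:
    """How many tasks fit into a fixed block of working time."""
    return minutes_available / minutes_per_task

# X tasks in Y time: 10 tasks in a 2400-minute work week at 240 min each.
baseline = tasks_completed(2400, 240)
# AI tooling cuts each task to Y/10 of the time: 24 minutes per task.
with_ai = tasks_completed(2400, 24)
print(baseline, with_ai, with_ai / baseline)  # 10.0 100.0 10.0
```

Same hours paid, ten times the output, which is the commenter's point: the tooling changes throughput, not necessarily the paycheck.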
A friend who works in AI dev went to a conference last week where one of the talks was titled "Is AGI imminent?", and the speaker just walked on stage, said "no", and walked off.
I think that the appetite for software is just about infinite. I'm not worried about "the end of software engineering." There will be a hiring blitz in 2026, as the companies that are just now getting started, in our new context, start to ramp up massively.
Yes, and they will pay us $2-4K a month per person, rather than paying for a swarm of agents at, let's say, not $20 but $200 a month... with the tech of 2026: a GPT-5-level LLM (if not 6, if not a whole different framework), plus GROQ-2-level speed, built into their codebases, integrated with all the platforms and tools on the web... shall I continue?
@@avivolah9401 First answer me this: How much professional software engineering have you done? I need to assess a baseline of understanding about what software engineers actually do.
I agree with most of this, except for the "nobody's job is going away for the foreseeable future." I'm not really worried that AI will be able to do software development as well as software developers yet. I'm worried that lots of managers who want to jump on the hot new technology will thoughtlessly and lazily fire, or stop hiring, junior devs because they think AI can do it all for them, only to realize that: 1) it can't; 2) because they only hired senior devs, the senior devs became overworked, and fewer people will seek those senior roles because it's not worth it; and 3) because they didn't hire any junior devs and give them the experience needed to grow into senior roles, there is now a dwindling pool of senior devs to replace the retiring ones. The risk to job security isn't that AI is at the point where it can do the job; it's that management is often all too happy and eager to do whatever they think will save a few bucks.
Until AI can design algorithms better than a human, it ain't taking our jobs. And I don't think many corporations are going to entrust the serious aspects of their multi-million dollar businesses to hallucinating large language models. Also, AI still needs us/our data to train it. Without examples to learn from, AI models can't do the right thing.
I'm curious to see how well this video will age. I'll revisit it in a year and see if my perspective remains unchanged. My prediction is that it won't. The advancements in the AI field are truly remarkable, and there seems to be no sign of slowing down or hitting any limits. While no one can predict the future with certainty, if I were to make a wager, I'd bet that within 5 years or less, we'll witness the emergence of software developed entirely by non-human entities, with computers communicating directly with one another. This approach makes a lot more sense, considering that humans often struggle with writing software efficiently. Even some of the most skilled engineers face challenges, requiring numerous iterations and testing, yet bugs still surface in production.
You can't run massive amounts of data, such as hospitals or government institutions, entirely on AI. It's a booster for those who are in the field. I bet you may not have worked in the programming space. This video is quite accurate, because he has been in the domain of software engineering.
@@austinejuma6160 And this is the most intriguing thing: did you realize how many programmers with 10+ years of experience are becoming YouTubers? This guy's second video was about Copilot. They are monetizing their knowledge before it becomes obsolete; they know it. You can tell because every new AI that is announced as 50% better than the one before is only met with "well, it can't do this and that", as if it will be the last AI ever. They never add the "yet", because then they couldn't sell courses or get sponsored by courses in their area.
Maybe, but I bet the AI will need a supercomputer to run, and it is much cheaper to hire a human for the job. We have nothing even close to human-brain-level computing.
2:07 Please make a whole video about it and call it "There's No Such Thing as AGI". I see you like strong titles, so that might work. But mostly, somebody has to explain how text prediction works, why it can't magically become sentient, and why it's not supposed to in the first place. I don't mind your clickbait, because in the first seconds you made it clear and delivered good content, but I'm going crazy with all the YouTubers and AI companies abusing the term AGI for hype, funding, and clicks, while the audience is getting very bad information and becoming paranoid.
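For the shortest possible version of "how text prediction works": a language model scores candidate next tokens and emits a likely one. Here is a deliberately tiny bigram sketch over a toy corpus of my own; real LLMs use learned neural networks over huge vocabularies, not raw counts, but the input-output contract is the same kind of thing:

```python
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate".split()

# Count which word follows which: the entire "knowledge" of this model.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent follower of `word` in the corpus."""
    return bigrams[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat": it follows "the" twice, "mat" once
```

There is no understanding anywhere in that loop, only frequencies over training text, which is the commenter's point about why "sentience" is the wrong frame.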
This is literally one of the best videos I have ever watched. And yes, if we ever manage to achieve AGI (which is still far off), then it is a "nail in the coffin" for every job.
1. Software engineering requires a mental model of the codebase and the ability to reason about it. AI cannot reason or solve problems on its own.
2. Software engineers need to manage the codebase operationally and address issues that arise after code implementation. AI cannot handle these tasks.
3. Software engineers evaluate the information they find online and weigh the cost of implementation. AI cannot evaluate information or solve resulting problems.
4. Verifying the correctness of solutions is difficult. It is unclear how AI agents can be trusted to solve problems.
5. AI agents are currently co-pilots, not in the driver's seat.
6. Progress in AI is not exponential. There is a significant leap between large language models (LLMs) and artificial general intelligence (AGI).
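Point 4 above is worth a concrete sketch. A hypothetical agent-written function (the names below are invented for illustration) can pass every test it is checked against and still be wrong:

```python
def agent_sort(xs):
    """A plausible-looking sort an agent might produce."""
    return sorted(set(xs))  # bug: silently drops duplicate elements

# The verification suite is green...
assert agent_sort([3, 1, 2]) == [1, 2, 3]
assert agent_sort([]) == []

# ...yet the function is wrong on inputs the tests never tried.
print(agent_sort([2, 1, 2]))  # [1, 2], not the correct [1, 2, 2]
```

"Trusting the agent" here really means trusting whoever wrote the checks, which is why verification remains a human problem.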
Regarding point #2: you can argue that the reason a human being is able to build a model of the world around them is that they were essentially "trained" for years to understand it by their parents, teachers, and peers, or through self-training using skills others trained them to perform in their earlier years (e.g. you were taught to read, so you can read numbers, so you can then be trained to do math, and so on). If you take a human being who was raised by a pack of wolves in a jungle without any typical "human training", they wouldn't be able to tell you much of anything about the human world. The only actual difference is that LLMs don't have the free will to train themselves on whatever they want to. Yet.
Well, if a company has a program written with an AI owned by Microsoft, for example, does that mean the company has to pay royalties to Microsoft for that generated program?
Nope, but of course it depends on the licensing deal. Legislation definitely needs to revisit everything copyright-related. AFAIK, for the time being, output from language models, for example, is not copyrightable.
As a 1st-year computer engineering student who LOVES programming but doesn't know much about how professional applications are made operationally, I'm glad this field is way more than just monkeying code, haha, because I'm much more inclined to talk to people and bounce around thoughts/ideas/memories, etc.
No, but the big companies that used to snatch up all the new grads have been firing their juniors like crazy for 2+ years now, and slowing hiring to a trickle or freezing it entirely. Executives have started to tell their boards the hiring freeze is "because generative AI allows us to do more with less". Whether that's actually true or not, who knows: we've known since the 60's that small teams of seniors can be more productive than large teams of inexperienced devs, so it could easily just be that effect. Either way, the 2010's job market where it was "take a two month boot camp and make six figures doing Rails apps" is over for good.
@@fennecbesixdouze1794 Certainly companies can use engineers more productively. Actually, startups do more with fewer engineers. But it has almost zero correlation with AI.
Devin and AI won't replace you, but they will make you 10x more productive and successful in what you do. AI-assisted employees are the future and will be a requirement in most upcoming jobs. Ride the wave to the future and start learning it now.
Whether or not generative AI is actually good at programming, the big companies that used to snatch up all the new grads have been firing their juniors like crazy for 2+ years now, and slowing hiring to a trickle or freezing it entirely. Executives have started to tell their boards the continued hiring freezes are justified "because generative AI allows us to do more with less". Whether it's actually true that generative AI "allows" companies to staff fewer engineers, who knows? It may not really matter. We've known since the 60's that small teams of seniors can be more productive than large teams of inexperienced devs, so it could easily just be that effect. Whatever executives tell their boards, companies that have been firing their juniors have been doing gangbusters: making more profits and better products faster. Either way you cut it, the 2010's job market where it was "take a two month boot camp and make six figures doing Rails apps" is over for the foreseeable future, and young people should know that before deciding to switch majors or jump careers for a React bootcamp.
You’ve just described the publishing industry. They are all saying the same sorts of things. In reality, books can and will be written by LLMs. The marketing and editing and nuanced work will be done by people in the short term but eventually all this work will be automated as well.
I'll be honest, I gave up on SE and SD. I do understand the field is not at risk at all, BUT I just don't want to deal with all that "are you this, are you that". It's just too much hassle, so I'll go into either management or security.
If I can ask: for someone who loves working on large-scale, complex projects in day-to-day work, which track should I go with, backend or data engineering, if both are equal for me? Thanks.
Hi Bassam, do you think a computer science student who is about to graduate and is interested in software development should continue on that path and do a Master's degree after the Bachelor's? Or would you recommend working and gaining experience first? Or maybe doing both concurrently (though this option will take time)?
Consider the following. The reason software engineering still needs humans is that there are still humans involved in the process. What would you do if you inherited a fortune? Would you still go to work? Or if they just kept paying you, would you come in? I think there are people quite like that; we call them 'entrepreneurs' or 'vocational programmers' or 'hobbyists'. We would all like to do that, but we accept that there is this system in which (mild?) pressure makes you sell your time for sustenance. If you inherit enormous amounts of money, no one will come visit you worried that you are not doing enough or about what is going to become of you.

We will be using driverless cars, made in workerless factories. Now, thank god, knowledge work is also under fire. We can all conceive of a post-work society: a political discourse in which we describe our preferences to an LLM that then fairly debates the points, because we just can't help ourselves. "We are only human" is just not good enough anymore. We need another four-letter word that we don't use around kids, the W-word... Some lunatics in the 19th century (no religious officials, btw) thought that using chloroform in childbirth was against the curse of God: "her sorrow shall be multiplied in childbirth". Unless you are really that masochistic, we can create reservations in which we live like Westerners circa now. There, like a modern-day flagellant, you can, under the supervision of AI, engage in the drudgery of work, complain, moan about the dysfunctional body politic, and have something to fill your self-structure with: achievement.

For all others, it is time to understand that as scarcity goes, so does the need for a free market. We need to really admit that we suck at organizing ourselves, at empathy, and at raising children. That is enough to keep us busy. Oversight duty for those who can could be a form of part-time jury duty.
But it is necessary that at a certain point this system is upended. Even the most egotistical billionaires (who are somehow not afraid of being replaced, though they add nothing) will realize that the game is up: this free-market, power-mongering, real-life misery monopoly will no longer work. And if they don't, I am sure we won't have anything to do but to find them and convince them of that fact.
Combine a pro programmer's expertise with explosively efficient AI software. Believe me, nobody is quitting their job. Developing software just became so much more fun. With DEVIN! Let me quote Charlie Sheen: WINNING!
Don't try to convince them, man; they are already quitting and changing careers, which means less supply and more demand in the future, lmao. Plus, these AIs, more than replacing us, will make our jobs easier. It's a win/win scenario.
I think it is hilarious to compare generative AI to calculators or computers and then also say "nobody is going to lose their job": both inventions you named, like nearly all significant inventions, severely disrupted the labor market. Generative AI will likewise be a very disruptive force for the labor market; in fact, it already is. Even if it did nothing at all to help you program, we've already seen executives continue with massive layoffs and tell their boards they were able to do it "because of generative AI". AI eliminating software jobs is not "sometime in the future when we achieve AGI"; it is already happening right now. The outlook for software engineering jobs has been plummeting consistently for the last several years; don't expect anything to change. The main argument you gave, that software engineering is more than just writing code and is also about requirements gathering and stakeholder alignment and the like, does not argue what you think it argues: in fact, LLMs are even better at those kinds of management tasks than they are at programming.
Adding more to this: if people continue to flood into boot camps hoping to jump on the gravy train that was software engineering in the mid-2010s, they will be severely disappointed by their job prospects. I really hope young people will take these warnings about the job market seriously, because the industry has been flooded for too long by low-quality boot-camp grads who think spending two months learning Rails or Flask equals endless years of high-earning employment. Whether AI is actually that good at programming or not, it is going to be used as the excuse to sweep away all the weak candidates flooding the industry, and the result, even if misattributed to AI, will be companies making more profits with fewer engineers. Half-learning a single web framework as a path to a six-figure salary had become such an unsustainable gravy train in the 2010s that it even became the "learn to code" meme, where "learning to code" (i.e. not actually learning anything deep, but just attending a boot camp) was supposed to be the meal ticket for the modern economy. This was always destined to collapse; if claims about "generative AI" are the excuse for letting it finally collapse, all the better. Either way, it is going to collapse, and people need to stop being told that the SWE gold rush will somehow continue indefinitely.
I love it when programmers say "I'm not scared of AI because it doesn't know how to do what I can do", completely forgetting that it can watch you doing your job and learn from it. And not just your job, of course: it will learn from the best. Are you better than the best?
It warms the cockles of my heart to see otherwise intelligent people so profoundly dedicated to being myopic and short-sighted. I'll also bet that these same people will be the first to be completely outraged to the point of being apoplectic when AI does in fact replace them..... and you can then make a bunch of new clickbait videos.
Isn't the point that AI is going to make lower-level skills obsolete, so you don't need coding knowledge? I don't understand the argument here: if everybody becomes an engineer, you can no longer leverage your know-how and skill to make money. And I think if the scaling laws hold, and if they keep scaling it, it could definitely do the higher-level skills too.
@@RoyaltyInTraining. No, if you know what you want, you can still point out bugs or defects in a game/service without understanding the underlying code. I'm not saying it won't help, but it's not required. Watch a bunch of GPT-4 coding game challenges where the person doesn't know anything about programming; they can still complete the game and state their preferences without knowing anything. Even in this very limited form, I think GPT-4 can only do 1.7% of SWE-bench, and an agentic version, Devin, can do 13%. And the transformer architecture scales linearly. Have you seen the Chinchilla scaling-law paper? It's very surprising. My point is, it can already do a little bit of it, and it gets generally better at downstream prediction. Plus, compared to other types of data, you can generate a bunch of usable synthetic data for coding, because it's verifiable, and you can weed out bad code or set criteria for better code, much like math. I don't know what else to say: it's not required, it's rapidly developing, and if there are some aspects that are useful, you can learn those specifically and not the other things, you know? Sorry for writing long.
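On the Chinchilla paper mentioned above: its widely quoted rule of thumb is roughly 20 training tokens per model parameter for compute-optimal training. The exact ratio varies with compute budget, and the sketch below is my own back-of-envelope, not a claim from this thread:

```python
def chinchilla_tokens(n_params, tokens_per_param=20):
    """Rough compute-optimal token count under the ~20 tokens/param heuristic."""
    return n_params * tokens_per_param

# A 70-billion-parameter model wants on the order of 1.4 trillion tokens.
print(f"{chinchilla_tokens(70e9):.2e}")  # 1.40e+12
```

The takeaway the commenter is gesturing at: model size alone isn't the lever; data (including verifiable synthetic code data) has to scale with it.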
I am an artist, and if you had asked me, before AI art existed, whether an AI would ever do what I can do, I would have said absolutely not, and I would have been right. But what I failed to understand was that AI did not need to replicate my skills; all that was required was a new paradigm in which those skills became irrelevant. This is the real threat I see to software engineers: not that AI will be able to do what they can do, but that AI will lead to a radical new paradigm in which the skills of software engineers become largely irrelevant, because the entire process of software development becomes so different that most of those skills are no longer needed. AI did not replace me as an artist; it just went around me by introducing a new way to create images. AI may not replace software engineers either, but it may introduce a new way to create software. And just as I could never have foreseen the way that AI image generators would fundamentally change the landscape of commercial illustration, a similar degree of disruption may occur in the landscape of software design. This is not a prediction as such, more an observation. The tendency to think of AI as an accelerant to existing paradigms can blind us to the reality that it may simply replace those paradigms with something entirely new and, therefore, unforeseen.
For the sake of argument, how did AI take anything from artists? Image generators are cool and all, and they are handy for certain use cases, but in what way do they compare to artists? You might argue that people can generate the images they need instead of commissioning or licensing art, but that type of artwork has already been commoditized by buy-one-get-all licensing platforms.
And if the idea is that AI can produce artifacts indistinguishable from, or close to, the work of real artists, I argue that these disposable outputs will make real art way more valuable.
@@glich.stream Money: they are taking money from artists, because many people who in the past commissioned unique artworks are now creating them themselves with AI and no longer need to pay artists. But that is not really the point I was making. My point really was this: as a 'creative', I felt very safe from AI. I was supremely confident that no mere machine would ever be able to replicate the complexity of thought and understanding required to make a compelling and coherent image. In a sense I was right: no machine yet exists that can really do that. But I failed to anticipate that an entirely new way to create images would be devised, one that did not require an artist's understanding of light, form, color theory, composition, etc. Art generators like Midjourney are not making images the way a human artist makes them, by applying this accumulated knowledge and skill; Midjourney knows nothing of these things. Yet some of the images Midjourney makes are genuinely stunning to look at, so the end result is the same. The lesson I draw from this is that existing knowledge and skill sets are not a sufficient 'moat' to keep out a technology as radically different as AI, because it represents a genuinely new paradigm. As I said, AI did not replace me; it went around me. The same thing could also happen in software development: the rise of an entirely new paradigm in how software is conceived and created. And almost by definition, such radical shifts in existing paradigms are very hard to foresee.
@@glich.stream I agree. I have decided to move out of the digital realm into real-world painting on canvas for this reason: the sheer volume of digital art created in the coming years will devalue digital imagery to the point where no one will be able to make a living producing it. The sheer cost and time advantages offered by AI art will be impossible to resist in most cases, I fear. It's a strange irony that after years of being told that AI would free up the human race to make art and music and other creative things, it's the artists and the musicians and the writers that AI came for first.
The case I'm making is that we are not there yet, and that when we are, more than software engineering will be at risk. Art is a creative endeavor, but it is also a solo game.
LLMs right now are like brilliant 10-year-old programmers. You have to act like a caring parent to guide them to improve the semantics of the code they have already produced. And if you want unit tests to go with that, well...
We will have to do the requirements gathering. That's it. Guarantee it. Devin and the like will be coding it all soon enough. I'll keep learning some anyway, since it could be a good hobby, or maybe I can teach, but industry is unlikely to need me, because it will all be robots doing all the work. Work and business as we know them will not be the same, and they won't need much software at all.
Even before I opened your video, I simply knew that it was clickbait (your examples are ones that I love :D) and that you are a real gamer who won't change careers because of some GPT/Devin hype. Also, the last time I commented on your video I was jobless; now I solved one task and got employed as a software engineer (DevOps-related), and it's been my 2nd day today! I can't say how happy I am, after an unexpected layoff, to get this opportunity. A bit scared and so excited!
Jacque Fresco already thought about this in the '80s and designed a full economy fit for our coming reality of unemployment. So: research the Resource-Based Economy. And it's not too early; we shouldn't start addressing this issue when it's already here, but in advance. It will come much faster than you think: this decade.
I bet this was quality click bait and I was right! 😂
😂 sorry.. It’s necessary
Parents don't teach everything their kid knows @@Black-mz4wy
The problem is that one person will be able to do the work of 50 programmers.
The productivity boost is real.
So? Having programmers be able to do way more is what the investors and stakeholders wanted to begin with.
Correct. The job will remain... but only for super-senior developers. Hopefully the number of companies will rise as well, etc. Who knows.
The award for best clickbait title goes to 🎉😂
I think that the appetite for software is just about infinite. I'm not worried about "the end of software engineering." There will be a hiring blitz in 2026, as the companies that are just now getting started, in our new context, start to ramp up massively.
Yes, and they will pay us $2-4K a month per person, rather than paying for a swarm of agents at, let's say, not $20 but $200 a month... with the tech of 2026: a GPT-5 (if not 6) level LLM (if not a whole different framework) plus Groq 2 levels of speed, built into their codebases, integrated with all the platforms and tools on the web... shall I continue?
@@avivolah9401 First answer me this: How much professional software engineering have you done? I need to assess a baseline of understanding about what software engineers actually do.
@@LionKimbro very little, but the youtuber describes it much better than i could anyway :)
I agree with most of this, except for the "nobody's job is going away for the foreseeable future."
I'm not really worried that AI will be able to do software development as well as software developers yet. I'm worried that lots of managers that want to jump on the hot new technology will thoughtlessly and lazily just fire or stop hiring junior devs because they think AI can do it all for them only to realize that, 1.) they can't, 2.) because they only hired senior devs, the senior devs became overworked and fewer people will seek those senior roles because it's not worth it, and 3.) because they didn't hire any junior devs and give them the necessary experience to grow into those senior roles, now there is a dwindling pool of senior devs to pull from to replace the retiring senior devs. The risk of job security isn't because AI is at the point it can do it; it's because management is often all too happy and eager to do whatever they think will save a few bucks.
Until AI can design algorithms better than a human, it ain't taking our jobs. And I don't think many corporations are going to entrust the serious aspects of their multi-million dollar businesses to hallucinating large language models. Also, AI still needs us/our data to train it. Without examples to learn from, AI models can't do the right thing.
Once AGI and robots start doing our jobs, where do we get wages to buy the products and services made by AI and robots?
That's not the correct question
"in a field that it's AI-proof" 😂
Even if ai can replace programmers perfectly (I doubt this with the reason you stated in the video), I will still code, simply because I love it.
Let's see how this video holds up over time...
I'm curious to see how well this video will age. I'll revisit it in a year and see if my perspective remains unchanged. My prediction is that it won't. The advancements in the AI field are truly remarkable, and there seems to be no sign of slowing down or hitting any limits. While no one can predict the future with certainty, if I were to make a wager, I'd bet that within 5 years or less, we'll witness the emergence of software developed entirely by non-human entities, with computers communicating directly with one another. This approach makes a lot more sense, considering that humans often struggle with writing software efficiently. Even some of the most skilled engineers face challenges, requiring numerous iterations and testing, yet bugs still surface in production.
They forget it can learn.
It will age like milk
You can't run massive data systems like hospitals and government institutions entirely on AI. It's a booster for those who are in the field. I bet you may not have worked in programming. This video is accurate because he has been in the domain of software engineering.
@@austinejuma6160 And this is the most intriguing thing: did you realize how many programmers with 10+ years of experience are becoming YouTubers? This guy's second video was about Copilot. They are monetizing their knowledge before it becomes obsolete; they know it. You can tell because every new AI announced as 50% better than the one before is only approached with "well, it can't do this and that", as if that will be the last AI ever. They never add the "yet", because otherwise they couldn't sell courses or get sponsored by courses in their area.
Maybe, but I bet the AI will need a supercomputer to run, and it is much cheaper to hire a human for the job. We have nothing even close to human-brain-level computing.
Denial, Anger, Bargaining, Depression, Acceptance. With this video, I present to you the 'Denial' phase.
LOL, oh my gosh. That was hella funny as heck. *laughs* ha :)
Thank you.
Exactly. There is no reason every gish gallop listed in this video couldn't be done by an AI.
2:07
Please, make a whole video about it and call it "There's no such thing as AGI"
I see you like strong titles so that might work, but mostly, somebody has to explain how text prediction works and why it can't become magically sentient, and why it's not supposed to in the first place.
I don't mind your clickbait, because in the first seconds you made it clear and delivered good content. But I'm going crazy with all the YouTubers and AI companies abusing the term AGI for hype, funding, and clicks, while the audience is getting very bad information and becoming paranoid.
This is literally one of the best videos I have ever watched.
And yes, if we had managed to achieve AGI (which is still far off), then it would be a nail in the coffin for any job.
1. Software engineering requires a mental model of the codebase and the ability to reason about it. AI cannot reason or solve problems on its own.
2. Software engineers need to manage the codebase operationally and address issues that arise after code implementation. AI cannot handle these tasks.
3. Software engineers evaluate the information they find online and weigh the cost of implementation. AI cannot evaluate information or solve resulting problems.
4. Verifying the correctness of solutions is difficult. It is unclear how AI agents can be trusted to solve problems.
5. AI agents are currently co-pilots, not in the driver's seat.
6. Progress in AI is not exponential. There is a significant leap between large language models (LLMs) and artificial general intelligence (AGI).
Regarding point #2: you can argue that the reason a human being is able to build a model of the world around them is that they were essentially "trained" for years to understand that world by their parents, teachers, and peers, or through self-training using skills they were taught by others in their earlier years (e.g. you were taught to read, so you can read numbers, so you can then be trained to do math, and so on). If you take a human being who was raised by a pack of wolves in a jungle without any typical "human training", they wouldn't be able to tell you much of anything about the human world. The only actual difference is that LLMs don't have free will to train themselves on whatever they want to. Yet.
Well, if a company has a program written with an AI owned by Microsoft, for example, does that mean one has to pay royalties to Microsoft for that generated program?
Nope, but of course it depends on the licensing deal. Legislation definitely needs to revisit everything copyright-related. AFAIK, and for the time being, output from language models, for example, is not copyrighted.
As a 1st year computer engineering student who LOVES programming, but doesn't know much about how professional applications are made operationally, I'm glad this field is way more than just monkeying code haha because I'm much more inclined to talk to people and bounce off thoughts/ideas/memories etc.
AI has no fear of losing its job or salary cuts. Teach it that first, so it learns to take responsibility for its faults.
Does anybody know of any company replacing software engineers with AI? We can say there is only AI-assisted software engineering.
No, but the big companies that used to snatch up all the new grads have been firing their juniors like crazy for 2+ years now, and slowing hiring to a trickle or freezing it entirely.
Executives have started to tell their boards the hiring freeze is "because generative AI allows us to do more with less". Whether that's actually true or not, who knows: we've known since the 60's that small teams of seniors can be more productive than large teams of inexperienced devs, so it could easily just be that effect. Either way, the 2010's job market where it was "take a two month boot camp and make six figures doing Rails apps" is over for good.
@@fennecbesixdouze1794 Certainly companies can use engineers more productively. Actually, startups do more with fewer engineers. But it has almost zero correlation with AI.
You hit comment jackpot with this video
Devin and AI won't replace you, but they will make you 10x more productive and successful in what you do.
AI-assisted employees will be the future and a requirement in most upcoming jobs. Ride the wave to the future and start learning it now.
Whether or not generative AI is actually good at programming, the big companies that used to snatch up all the new grads have been firing their juniors like crazy for 2+ years now, and slowing hiring to a trickle or freezing it entirely. Executives have started to tell their boards the continued hiring freezes are justified "because generative AI allows us to do more with less".
Whether it's actually true that generative AI "allows" companies to staff fewer engineers, who knows? It may not really matter. We've known since the 60's that small teams of seniors can be more productive than large teams of inexperienced devs, so it could easily just be that effect. Whatever executives tell their boards, companies that have been firing their juniors have been doing gangbusters: making more profits and better products faster.
Either way you cut it, the 2010's job market where it was "take a two month boot camp and make six figures doing Rails apps" is over for the foreseeable future, and young people should know that before deciding to switch majors or jump careers for a React bootcamp.
Amazing video! Your content is informative. I liked and subscribed!
You’ve just described the publishing industry. They are all saying the same sorts of things. In reality, books can and will be written by LLMs. The marketing and editing and nuanced work will be done by people in the short term but eventually all this work will be automated as well.
Can you elaborate more?
I'll be honest, I gave up on SE and SD. I do understand the field is not at risk at all, BUT I just don't want to deal with all the "are you this, are you that". It's just too much hassle, so I'll go into either management or security.
what you say in the video and what the title is are not related
If I can ask you: for someone who loves working on large-scale, complex projects day to day, which track should I go with, backend or data engineering, if both are equal for me? Thanks.
Hi Bassam, do you think that a computer science student who is about to graduate and is interested in the field of software development should continue on that path and do a Master's degree after his Bachelor's? Or would you recommend he work and gain experience first? Or maybe do both concurrently (though this option will take time)?
Consider the following: the reason that software engineering still needs humans is that there are still humans involved in the process.
What would you do if you inherited a fortune? Would you still go to work? Or if they would just keep paying you, would you come in?
I think there will be people quite like that; we call them "entrepreneurs" or "vocational programmers" or "hobbyists". We would all like to do that, but we accept that there is this system in which (mild?) pressure makes you sell your time for sustenance. If you inherit enormous amounts of money, no one will come to visit you worried that you are not doing enough or about what is going to become of you.
We will be using driverless cars, made in workerless factories. Now, thank god, knowledge work is also under fire. We can all conceive of a post-work society. A political discourse in which we describe our preferences to an LLM that then fairly debates the points, because we just can't help ourselves.
"We are only human" is just not good enough anymore.
We need another four-letter word that we don't use around kids, the W-word... Some lunatics in the 19th century (no religious officials, btw) thought that using chloroform in childbirth was against the curse of God: "her sorrow shall be multiplied in childbirth". Unless you are really that masochistic, we can create reservations in which we live like Westerners circa now... There, like a modern-day flagellant, you can, under the supervision of AI, engage in the drudgery of work, complain, moan about the dysfunctional body politic, and have something to fill your self-structure with: achievement.
For all others, it is time to understand that as scarcity goes, so does the need for a free market. We need to really admit that we suck at organizing ourselves, at empathy, and at raising children. That is enough to keep us busy.
Oversight duty for those who are capable could be a form of part-time jury duty. But it is necessary that at a certain point this system is upended. Even the most egotistical billionaires (who are somehow not afraid of being replaced, as they add nothing) would realize that the game is up: this free-market, power-mongering, real-life misery monopoly will no longer work. And if they don't, I am sure we won't have anything to do but to find them and convince them of that fact.
Everything you said is right; AI is an enhancement of engineers, not a replacement.
someone doesn't understand exponential advancement... this video won't age well....
I was scared by the video's thumbnail, but after watching it, I am calm again.
Combine a pro programmer's expertise with explosively efficient AI software. Believe me! Nobody is quitting their job. Developing software just became so much more fun. With DEVIN! Let me quote Charlie Sheen: WINNING!
Great video! Many points are valid, really appreciate your input.
Don't try to convince them, man; they are already quitting and changing careers, which means less supply and more demand in the future, lmao. Plus these AIs, rather than replacing us, will make our jobs easier; it's a win/win scenario.
You had me at "I'm pretty much in the Yann LeCun camp when it comes to this topic." 😍 lol
Thank you so much!
all of the points you've raised are solvable with AI
Only one word THANKS!
Stress reduced
I think it is hilarious to compare generative AI to calculators or computers and then also say "nobody is going to lose their job": both inventions you named, like nearly all significant inventions, severely disrupted the labor market.
Generative AI will likewise be a very disruptive force for the labor market. In fact, it already is.
Even if it did nothing at all to help you program, we've already seen executives continue with massive layoffs and tell their boards they were able to do it "because of generative AI".
AI eliminating software jobs is not "sometime in the future when we achieve AGI", it is already happening right now. The outlook for software engineering jobs has been plummeting consistently for the last several years and don't expect anything to change.
The main argument you gave, about how software engineering is more than just writing code, and it's also about requirements gathering and stakeholder alignment and stuff, does not argue what you think it argues: in fact LLM's are even better at those kind of management tasks than they are at programming.
Adding more to this:
If people continue to flood into boot camps hoping to jump on the gravy train that was software engineering in the mid 2010's, they will be severely disappointed by their job prospects. I really hope young people will take these warnings about the job market seriously, because the industry has been flooded for too long by low-quality boot-camp grads who think spending two months learning Rails or Flask equals endless years of high-earning employment. Whether AI is actually that good at programming or isn't, it is going to be used as the excuse to sweep away all these weak candidates flooding the industry, and the result even if misattributed to AI will be companies making more profits with fewer engineers.
Half-learning a single web framework as a path to a six-figure salary had become such an unsustainable gravy train in the 2010's that it even became the "learn to code" meme, where "learning to code" (i.e. not actually learning anything deep, but just attending a boot camp) was supposed to be the meal ticket for the modern economy. This was always destined to collapse; if claims about "generative AI" are the excuse for finally letting it collapse, all the better. Either way, it is going to collapse, and people need to stop being told that the SWE gold rush is going to continue indefinitely.
That was a very valuable and to the point thought sharing. Thank you for this.
I love it when programmers say "I'm not scared of AI because it doesn't know how to do what I can do", completely forgetting that it can watch you doing your job and learn from it. And not just from your job, of course; it will learn from the best. Are you better than the best?
But I always knew life can be so hard that you need to find multiple sources of income.
It warms the cockles of my heart to see otherwise intelligent people so profoundly dedicated to being myopic and short-sighted. I'll also bet that these same people will be the first to be completely outraged to the point of being apoplectic when AI does in fact replace them..... and you can then make a bunch of new clickbait videos.
Great take, which is really just common sense! So surprised by people overreacting to this Devin AI.
Devin and his ilk are creating the chaos that I will continue to be paid a very high salary to fix. :D
Well said, man! It would be great to learn software engineering from you! I definitely prefer interacting with humans, not machines...
Isn't the point that AI is going to make lower-level skills obsolete, so you don't need coding knowledge? I don't understand the argument here.
If everybody becomes an engineer, you still can't leverage your know-how and skill to get money.
And I think if the scaling laws hold, and if they keep scaling it, it could definitely do the higher-level skills.
Programming knowledge is *required* to be an effective engineer. You can't make good decisions without knowing what's going on under the hood.
@@RoyaltyInTraining. No, if you know what you want, you can still point out bugs or defects in a game/service without understanding the underlying code. I'm not saying it won't help, but it's not required. Watch a bunch of GPT-4 coding game challenges where the person doesn't know anything about programming; they can still complete the game and state their preferences without knowing anything. Even in this very limited form, I think GPT-4 can only do 1.7% of SWE-bench, and an agentic version, GPT-4 Devin, can do 13%. And the transformer architecture scales linearly. Have you seen the Chinchilla scaling law paper? It's very surprising. My point is, it can already do a little bit of it, and it gets generally better at downstream prediction; plus, compared to other types of data, you can generate a bunch of usable synthetic data for coding because it's verifiable, and you can weed out or apply criteria for better code, much like math. I don't know what to say; it's not required, it's rapidly developing, and if there are some aspects that are useful, you can learn those specifically and not the other things, you know? Sorry for writing long.
I am an Artist and if you had asked me- before AI Art existed- if an AI would ever do what I can do I would have said absolutely not- and I would have been right. But what I failed to understand was that AI did not need to replicate my skills, all that was required was a new paradigm in which those skills became irrelevant. This is the real threat I see to software engineers- not that AI will be able to do what they can do, but that AI will lead to a radical new paradigm in which the skills of software engineers become largely irrelevant because the entire process of software development becomes so different that most of those skills are no longer needed.
AI did not replace me as an artist; it just went around me by introducing a new way to create images. AI may not replace software engineers either, but it may introduce a new way to create software. And just as I could never have foreseen the way that AI image generators would fundamentally change the landscape of commercial illustration, a similar degree of disruption may occur in the landscape of software design.
This is not a prediction as such, more an observation. The tendency to think of AI as an accelerant to existing paradigms can blind us to the reality that it may simply replace those paradigms with something entirely new and, therefore, unforeseen.
For the sake of the argument, how did AI take anything from artists? Image generators are cool and all, they are handy for certain use cases but in what ways do they compare to artists? You might argue that people can generate images they need instead of commissioning or licensing art but this type of art work has already been commoditized by buy one get all licensing platforms.
And as for the idea that AI can produce artifacts indistinguishable from, or close to, the work of real artists: I argue that these disposable outputs will make real art way more valuable.
@@glich.stream Money. They are taking money from artists because many people who in the past commissioned unique artworks from artists are now creating unique artworks themselves with AI and no longer need to pay artists to do them. But that is not really the point I was making.
My point really was this: as a "creative" I felt very safe from AI. I was supremely confident that no mere machine would ever be able to replicate the complexity of thought and understanding required to make a compelling and coherent image. In a sense I was right: no machine yet exists that can really do that. But I failed to anticipate that an entirely new way to create images would be devised, one that did not require an artist's understanding of light, form, color theory, composition, etc. Art generators like Midjourney are not making images the way a human artist makes them, by applying this accumulated knowledge and skill; Midjourney knows nothing of these things.
Yet some of the images Midjourney makes are genuinely stunning to look at, so the end result is the same. The lesson I draw from this is that existing knowledge and skillsets are not a sufficient "moat" to keep out a technology as radically different as AI, because it represents a genuinely new paradigm. As I said, AI did not replace me; it went around me.
The same thing could also happen in software development: the rise of an entirely new paradigm in how software is conceived and created. And almost by definition, such radical shifts in existing paradigms are very hard to foresee.
@@glich.stream I agree. I have decided to move out of the digital realm into real-world painting on canvas for this reason: the sheer volume of digital art created in the coming years will devalue digital imagery to the point where no one will be able to make a living producing digital art. The sheer cost and time advantages offered by AI art will be impossible to resist in most cases, I fear. It's a strange irony that after years of being told that AI would free up the human race to make art and music and other creative things, it's the artists and the musicians and the writers that AI came for first.
The case I’m making is that we are not there yet and that when we are there, more than software engineering is at risk. Art is a creative endeavor but it is also a solo game.
LLMs right now are like 10 year old brilliant programmers. You have to act like a caring parent to guide them to improve the semantics of the code they have already produced. And if you want unit testing to go with that, well...
I just came here to comment that it's clickbait and to boost the reach. I will not finish the video though. Bye!!
Are you saying that so your new studio won't go to waste?
Just kidding man great videos ❤
Never! The new studio is gonna rock either way!
Thank you for making this video. Can you please talk more about this topic?
We will have to do the requirements gathering.
That's it.
Guarantee it.
Devin and the like will be coding it all soon enough.
I'll keep learning some anyway, as it could be a good hobby, or maybe I can teach, but "industry" is unlikely to need me because it will all be robots doing the work.
Work, business, as we know it, will not be the same, and they won't need much software at all.
Great video, brother. Many need an antidote for the AI Kool-Aid they took.
Even before I opened your video, I simply knew that it was clickbait (your examples are the kind I love :D) and that you are a real gamer who won't change careers because of some GPT/Devin hype. Also, the last time I commented on your video I was jobless; now I solved one task and got employed as a software engineer (DevOps related), and it's been my 2nd day today! Can't say how happy I am, after an unexpected layoff, to get this opportunity. A bit scared and so excited!
Congratulations on the new role!! 🙌🙌
I'm switching back to Medical Device/Software sales. People skills are one of the last things AI can't truly touch.
Knew it was going to be click bait. Down-voted.
Thanks for the clickbait~ (I was going to doubt the title so...)
Burger King.
Taste is King.
Jacque Fresco already thought about this in the 80s and designed a full economy fit for our coming reality of unemployment. So research the Resource-Based Economy.
And it's not too early: we shouldn't start addressing this issue only when it's already here, but in advance.
And it will come much faster than you think.
This decade.
Came to dislike a clickbait title
had me worried there!!!!! (great click bait)
Clickbaited 😂
Sorry 🥲 not sorry
Subscribed
I clicked like despite the sh*tty title.
I am glad I got clickbaited to this video.
I'm not dead lol
Sounds like coping!
😒😒😒😒😒😒
What? Disappointed I didn't actually quit my job and announce it in a YouTube video?
click bait
AI sucks, it's not very accurate
he is still in denial ...
Another clickbait, another dislike : )
Copium.
what a quitter
fell for the clickbait title SadgeCry