The amount of value packed into this one-hour video cannot be overstated. The best software engineers have the ability to connect business problems with a solution counterpart that just happens to be software.
Calculators are better and usually faster than humans at basic math. However, mental math is still useful: it builds intuition, can be quicker than reaching for a tool, and develops skills and deeper understanding. The more creative, longer math work still often requires humans, especially at the cutting edge. I think the same will hold true for coding. Real coding is not just two or three lines, and the longer the task, the worse the generated code gets. There is also the factor of human readability and well-structured code, which matters for maintenance and for others, and that's a human thing. What I see coming is AI-generated code with humans editing it to make it better, or humans iterating on prompts, etc. Beyond that, there are things humans may uniquely see that computers won't get: an AI may be terrible at designing an appealing UI or front end because it is not human, or may use symbols or variable names in code that humans don't understand as well. Lastly, trusting AI without thinking and know-how may lead to hallucinations in code, security bugs, etc., but humans and AI working together may be better than human or AI code alone.
Furniture is mass-manufactured; do you see carpenters not having jobs? If anything, the human aspect of their work adds a desirable quality to the product, and carpenters tend to be loaded with money as a result.
Also, something to keep in mind is that the average person is barely computer literate (not even talking about being able to reinstall an OS themselves, which is a few steps a kid could follow), so even if AI takes all the jobs at some point, chances are regular people would not be able to make proper use of it and would still require someone with more technical knowledge to help them build their whatever.
It's gonna go like driverless cars - that last 1% before you can say you're error free (at an acceptable rate) is going to be insanely hard and take ages
If you guys couldn't predict the current state of AI a few years ago, what makes you think you can predict what it will do to the job market in the next 5 years?
THANK YOU. AI is moving exponentially; we will see the next generation in less than 5 years. We are literally going to get supercomputers that cost six figures rather than millions. The profits are going to be immense: the big companies will build their own (i.e., Google/Amazon), and the smaller ones will buy Nvidia/AMD.
24:00 I heard the TigerBeetle devs describe it as sculptors vs painters. Both are making new works, but sculptors take what already exists and shape it to their vision, while painters use their tools to make something from scratch.
tbh what customers SAY they want and what they actually WANT are often two different things; that alone is a huge challenge in itself, and I think AI will have a lot of trouble with it.
AI is a tool, and it's far from perfect. I personally use it as a smart search engine and Copilot as a snippets system: I check whether the output is close to what I would write, then fix the parts that make no sense. I think generative AI will always require manual checks because it's not reasoning about the code; it's predicting a highly probable output based on the input.
I don't think AI will replace junior coders until at least 2030, and most businesses will take until 2035 to actually implement it. Once it's implemented, businesses will still want to keep some junior devs because it reduces the risk of the AI messing something up. It will be a similar situation to self-driving trucks: we will have drivers sitting in them for at least a decade after mass adoption. Risk assessment is a very real thing for businesses. No one is throwing a hail mary when they can slowly phase something in over a decade.
This is a great take; more people need to see software engineers who have been around for a while give a realistic take on this question. One of my best choices was having someone get me started with coding in my late thirties. I am not the best, but I love it and continue to learn every day. I have so many people on my channel who are noobs giving up on learning to code based on the hype they hear from YouTube developers who want to get views. If you enjoy coding, don't quit. Thank you both for this video.
The devs trying to get views are the ones whose channels rely on you thinking coding is a good path forward, not the ones essentially saying you don't need this channel anymore.
@@Icedanon Prime has had this take even before he decided to leave Netflix. He had this take even when he could've easily afforded his channel and content disappearing the very next day. He's had no financial reason to persuade an audience one way or the other.
@@Icedanon The talk around AI replacing jobs is a big view generator. Fear from experienced devs and beginners alike creates views. Yeah, it may not make sense in the long term, but it certainly does in the here and now.
I think the "AI will at some point replace junior devs" is quite short thinking (sure, a lot of managers don't think further than literally tomorrow, but that's a different topic). Every senior at some point in the past was a junior dev. So, let's say at some point an AI will straight up be able to replace every junior dev and it's not overly expensive to use. Ok, if that gets adopted at a wide scale, that pretty much means that a huge chunk of junior devs will not exist. But that also means that a huge chunk of senior devs won't have a chance to exist. Becoming a senior requires quite a lot of experience, most of the time multiple decades of doing it full time. That's not really something you can do in your free time, well, except if money isn't a problem and you can practically ignore monetary problems. For a time, that's not going to be a problem, but at some point the current senior are going to retire. So there are going to be less and less senior devs. At first that seems similar to what the financial industry with COBOL is going through, but it's quite a bigger problem. COBOL at its core is still a normal programming language, sure it's old, but at its core it's the same as e.g. JavaScript or Rust or Haskell. So at least theoretically they can get replacements from different industries. But if junior devs fall away in every industry, you can't replace the senior devs in YOUR industry because EVERY industry has this problem. So, the question at that point will be: Will AI become good enough to even replace senior devs fully before too many retired? Because if the answer is no, things are going to be interesting. And I doubt anybody can be able to predict what's going to happen then.
If a model can replace a junior dev then it's barely a stretch for it to replace a senior dev. The difference between a junior and senior is virtually zero compared to getting a model from a stage where it can't do anything useful to where it can replace a junior dev.
I got into development three weeks ago, not to solve a problem, but "to see what would happen if" - spiraled from there, and I now dream in code. I spend every waking hour thinking of things I want to try, using code. It's both a blessing and a curse. My notes are now a bunch of code snippets and terminology, and I have anxiety about a git repo I made.
20:55 The current state of LLM chatbots building code is method-by-method, function-by-function. You WILL NOT be able to ask one to build an application. HOWEVER, you can build an application with it. Think of it like writing a book: its understanding is not broad enough to create a novel, but you as the author/prompter know where you are in the story and what you want to portray next, so you give a brief description of where you're at and how it can help you jump forward a little. The key here is finding a way for you and the AI to keep as much of the past in the LLM's memory as possible. If you're writing code and you finish a package, move to another one, and then ask a question that needs to reference work it helped you with earlier, it just won't contextualize the past.
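A minimal sketch of that "carry the past along" workflow, in Python. `call_llm` here is a placeholder for whatever chat endpoint you actually use, not a real API; the point is only that the accumulated history gets re-sent with every request.

```python
# Sketch of the "keep the past in the LLM's memory" workflow described above.
# `call_llm` is a hypothetical stand-in for your provider's chat API.
def call_llm(messages: list[dict]) -> str:
    raise NotImplementedError("wire this up to your chat provider")

history = [
    {"role": "system", "content": "You are helping me build an app one function at a time."}
]

def ask(prompt: str) -> str:
    history.append({"role": "user", "content": prompt})
    reply = call_llm(history)  # the full history goes along every single time
    history.append({"role": "assistant", "content": reply})
    return reply

# Later, when work moves to another package, earlier answers are still in
# `history`, so the model can reference the code it already helped write --
# at least until the context window fills up and old turns must be summarized.
```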
We are not just experiencing a shortage of chips for producing ubiquitous AI that can code decently; we also don't have the computing architecture to meet that demand, and probably don't yet have the algorithms for it either.
Omg Yes! There are so many topics where people try to teach something and everyone skips over the most important step. It's like they all got together and decided to keep it a secret. It's like that one class you need to graduate that's really hard is taught at 8am on a Friday every single semester! (spoken in the voice and delivery of Sam Kinison)
Of course the governments want to stop low-level programming, because it gives you the freedom to do whatever you want without being held back by a runtime environment. In the future, it's entirely possible for governments and companies to restrict the runtimes and frameworks so that you have to pay for them or be special in some way to use them. You can't restrict assembly and C in the same way.
Here's a personal experience of mine that I unfortunately live out on a day-to-day basis. I work in automation: I help companies evaluate their processes and automate them, so I primarily deal with small to medium-sized companies that are just entering the growth stage. The number 1 thing I see is the absolute refusal of staff to change and learn something new. We all know that one person who's been in the position for 20 years and refuses to use Excel because she's used to her calculator. And there's no amount of coaching you can do; they refuse and there's nothing you can do about it. These people are... EVERYWHERE. Now imagine you're trying to get someone to give up their job so that an AI can take it over. Yeah, good luck. To a certain extent, we are all Luddites.
In the process of updating my ecommerce boilerplate from Next 12 to 14, and it kind of sucks. I found myself stuck a few times reworking problems I had already solved, but at the same time I'm enjoying the process of really breaking down Next and learning it on a deeper level. As shitty as the process can be, the growth and performance benefits outweigh the suck.
I've learned Lua, Bash, PHP, JavaScript, and Python, and I'm still learning more, all in just the last year. I still haven't written any C, but that's always been the seniors' job; I've only ever needed to understand C, never write it. At the pace AI is moving today, I feel future development won't need to focus on new tools. When AI can produce un-bloated code, and we don't even care to read what it's written because it works, that's when the real problems begin.
Disagree. That's like letting AI be smarter than you as a human; the goal is to use that mf and acquire something AI can't do. So I think you'll have to understand why things work exactly the way they do, and why they only work that way. Our job will be to gain an advantage over AI's solution by learning way more in less time, using AI itself.
I love that AI is my own personal tech support, office hours, TA, tutor, and pair programmer. But it isn't a person who can really be a software engineer just yet, from what I can tell.
It's always funny being in Prime's comment section. Half of the people have already grokked the point of the video and saw nothing of value, and the other half is green enough for the entire vid to be an eye-opener. YouTube should have a "finished" badge on comments for those who watched all the way through. As usual, there's stuff I knew and stuff I didn't. Same great experience!
I am a self-taught developer and entrepreneur. I began my development journey approximately 3 years ago, and I'm about 2 months off from shipping our first app. It's 100% still worth it, but I believe you need to have a specific goal *for it to be worth it (and to keep you motivated as you learn).* I learned development because I realized there's an app I wanted to turn into a reality; however, I didn't have the capital to hire developers, so I did it myself. This video is absolutely phenomenal, and for the most part I agree with the points on AI, and most of the points TJ made, especially regarding people developing skills beyond "I can write syntax." Ultimately, those are the skills that will take you further than anything. Keep up your amazing work! You've definitely earned yourself a new follower. Cheers.
I agree with at least 2038. Besides the development of the technology, and as you stated, this will have to be tested in encapsulated environments over time and through countless scenarios just to get the ball rolling. A good example of the speed that companies move at is the good old help desk. The time from overloaded sysadmin being the SPOC for support to having an actual help desk was no less than 5 years.
Another thing is that a lot of industries depend on strict safety requirements and standards. In automotive we are using, e.g., MISRA 2008 (yes, from that year). There is a newer standard from 2014, but the industry has so much inertia. There are standards that regulate the development of AI for cars, but they are still being developed. And we haven't even mentioned ensuring that AI will write safe and secure code.
And with that said, inertia comes not only from particular companies, but also from legislation and the adoption of new standards. Without those, companies wouldn't be able to say their product is safe, and any accident traced back to a dev tool that says 'Certainly!' would be a shot in the foot with a potentially huge impact on market share. No company takes that risk if the technology is not mature enough, technically but also legally.
As a person who enjoys coding more than problem solving (I enjoy structuring code and writing it), hearing an argument about how coding may get stripped away from software development and replaced by writing prompts isn't something I look forward to.
The people who gleefully and proudly say it's pointless to code should ask 3 questions: 1. What do you think is going to happen to all that legacy code? Do you think it's as simple as pasting it into a prompt and waiting? hahah 2. If AI reaches the level where it can replace ALL programmers (a complex skill), what do you think is going to happen to the rest of you? (Do you think you won't be next?) 💀 3. Do you hear hackers asking "Should I still learn how to hack?" 🤣 Heck NO! They are gearing up to wreck all of this vulnerable AI-generated code, helped by the shortage of potential devs who chickened out rather than learning to code.
GitHub’s last presentation of their new copilot concept for the future was essentially #1. You write up an issue with what you want to happen, it identifies the files to change, writes it up, etc. Feasible right this moment? Probably not, but for the future, yes.
@@lost4468yt because that guy thinks the Achilles heel of AI will be its inability to deal with legacy code, when most of the code that AI is being trained on IS or will be legacy code by the time it is good enough to reduce the developer workforce by over 50%. IRONICALLY, it will be non-legacy code that will be one of the few areas where AI will not be as useful, because although it can produce novel solutions to problems, unless they actually develop AGI, these tools still don't have the kind of agency a human being has to develop completely innovative ideas without being directly prompted to do so (and sometimes still can't even when prompted, although that will most likely get better in the future).
Even if complete replacement is years away or impossible, imagine the dynamics of the job market when the productivity of the average engineer suddenly grows by, say, 20%. The salaries in IT and the (on average) comfortable hiring process were both results of a job market dominated by employees. As soon as the balance changes, that's all gone. So even if the profession isn't practically gone, it will stop being as lucrative.
I've been writing software professionally for 20+ years, and the future of software development I see is one where I'll be competing in *communicating* with normal people about their needs for new software. Once normal people feel they can explain their ideas to an AI more successfully than to me, then AI will take my job. Until then, I see future software development as me communicating with normal people, then communicating with the AI and fixing whatever mistakes the future AI still makes. Right now the AI can do very little compared to my output, but I fully expect future AI systems to get smarter every year, and at some point they will produce better code for a clearly specified requirement spec. I'm still not sure how long it will take until AI can communicate with normal people so well that it can get the requirements directly from them and cut me out of the chain. I probably have 20+ years until retirement, and I have trouble seeing a future where my current work can't be done by a high-level AGI, which some will call ASI. And I can only hope that we switch to universal basic income (UBI) or something similar before society collapses, because so many people will be miserable otherwise.
I think it's more likely that they'll be better at communicating requirements than you, before they're better at actual implementation. The objective testing has shown that they're really good at things like law, healthcare, etc. All things that are much closer to getting requirements and interpreting them correctly.
@@lost4468yt I totally agree. Most software developers I know are not exactly stellar at communicating with normal people. As a result, AI will take over programming jobs once the AI is intelligent enough and cheap enough to run. If I understood correctly, Devin already costs hundreds of dollars to use for even simple tasks. That's not a threat to human programmers at that cost level, especially when Devin is not yet at even junior developer level. However, give it a couple of years until it's at junior developer level and costs $0.10/h to run, and getting any new human developers to senior level is going to be really, really hard because all the junior positions will be taken by AI.
The 1 book would be The C++ Programming Language, 3rd Edition, in my case. My favorite part is when Bjarne writes that you pay for what you use, and what you don't know won't hurt you, as long as you use what you do know. It was also the first time RAII was explained to me, and I almost got it too; now I know.
This video was seriously informative. I'm not a fan of reaction content, and yet I sat through the whole thing and came away with lots of helpful information.
As a novice coder tinkering around with PICO-8, going back and forth with ChatGPT has helped me get a better understanding of Lua, to the point where I can spot mistakes in the code it suggests for the things I'm trying to write. I think it all comes down to how it is being used.
ChatGPT had to try 3 times today to convert a Django ORM query into raw SQL. If I'm gonna have to spoon-feed it, I might as well discover the solution by trial and error.
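Worth noting that Django can show you the SQL it generates itself, which is often a faster starting point than prompting. A minimal sketch, assuming a Django project with a hypothetical `Book` model:

```python
# `myapp` and `Book` are hypothetical; substitute your own app and model.
from myapp.models import Book

qs = Book.objects.filter(published_year__gte=2020).order_by("-published_year")

# str(qs.query) prints the (approximate) SQL Django will run. Parameters are
# inlined without proper quoting, so treat it as a readable sketch of the
# query, not as something to execute verbatim against the database.
print(str(qs.query))
```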
I asked GPT to find methods with empty bodies across Java files. It returned some kind of nonsense regexp. Meanwhile, the Java compiler has been able to identify empty bodies for the last 25 years.
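A parser-based sketch of that task instead of a regex, for comparison. This assumes the third-party `javalang` package; the attribute names (`body`, `name`) are from memory, so double-check them against its docs before relying on it.

```python
# Sketch: find empty-bodied methods by parsing Java, not by regexing it.
# Assumes `pip install javalang`; exact API details should be verified.
import sys
from pathlib import Path

import javalang

for java_file in Path(sys.argv[1]).rglob("*.java"):
    source = java_file.read_text(encoding="utf-8", errors="ignore")
    tree = javalang.parse.parse(source)
    for _, method in tree.filter(javalang.tree.MethodDeclaration):
        # body is an empty list for `{}`, and None for abstract/interface methods
        if method.body is not None and len(method.body) == 0:
            print(f"{java_file}: empty method '{method.name}'")
```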
I believe root-causing is the true way you can distinguish a programmer from a non-programmer. I approach every problem I have like that, whether political, personal, or in other jobs I've had. If you see a problem and the first thing you do is find the source of the issue, it makes your life much easier and makes you more successful as an employee or entrepreneur.
I mean, before the tractor was invented there were 100 farmers plus cattle on a field; afterwards there were 3 people on a farm. What I am trying to say is there will still be a need for developers, but we might be more like prompters at that point. Idk, just a thought.
Honestly, odds are thousands of jobs will be lost regardless of how advanced AI gets simply because most CEOs are greedy morons. In the short term, there will be massive changes, but it's hard to predict how things go long term
The thing with AI is that it acts as a force multiplier, much like computers. Recall the significant impact that computers had across all industries when they were introduced, greatly accelerating development and productivity. Similarly, AI will not only influence software development; it will enhance everything, including the speed, efficiency, and spread of technological development. This, in turn, affects other industries, which then influence each other, impacting software development beyond merely replacing junior roles.
I was first a Neovim developer, but Neovim alone doesn't characterize my personality entirely. I just don't have the time to fiddle with an editor to the extent Vim requires right now. So I identify with both the copy-paste and the "solutions" developer, and I think that's important for some to hear.
Professor Prime, even if there were some magical optimization that allowed better-than-GPT-4.0 performance running locally on the existing GPUs inside each viewer's desktop, there would still be managerial problems for adoption that would take 5+ years. This is what you should focus on: hardware requirements are going down for the usable, lightweight output models that result from the heavyweight training on hundreds to thousands of GPUs. You don't want your point to get bogged down in the "can we even run it" conversation when that's not what you are trying to get across to these college kids about the realities of enterprise work environments and corporate bureaucracy adopting AI. ...I'm in the group that's going to be allowed to use Copilot first at our company, and even I still don't have that old tool.
Prime is inspirational, but he is also the epitome of "why are you depressed? just don't feel sad bro" when it comes to his takes on psychology and self improvement
For me, AI is most useful for helping me understand concepts in a more organic way than a simple Google search. I can inquire about a topic, get an answer, ask deeper questions like I would ask a teacher, and get more specific responses. I cannot stand reading or rereading information I already know just to find the bit of info I missed or forgot. It helped me understand errors in my code instead of googling some error code and spending hours digging through responses hoping they are accurate. If anything, I think we will all have an AI that is like a mentor: tracking your learning progress in different areas, noting where you often make mistakes, and guiding you to work on those skills. Just my take on the issue.
I think the general public doesn't understand how hard it can be to get something super specific out of an LLM. Back in the GPT-3 days it took me 86 prompts just to get a point outline of a cat (of any kind). How would a non-dev know whether the code they're reading has a bug or not, and so how would they know if they got the prompt right?
57:00 nah, watch it. To me this channel is a W because of how long it is (not really, but turning a 5-minute video into 30+ minutes and making the dense information actually easy to understand (plus skill issue) is what I love, and it also corrects my skill issue).
I've got some questions.
1. For example, Google has AI and Microsoft needs it. Does Microsoft trust Google, or vice versa? What will AI development even look like?
2. Or will companies just say, "Yes, Google, we believe you won't collect our product"?
3. Can every single company on earth afford to write their own AI? Even if they can, how are they going to get that much computing power?
4. Will the AI devs actually be cheaper than juniors?
5. What about responsibility for the AI? Will it say "Apologies" for a mess-up?
6. Okay, imagine no junior positions. How will you find mid+ devs after kicking all the juniors out? Where? Do they grow on trees? This is damn important; sooner or later those seniors will retire.
7. Imagine the day of a senior developer. Seniors won't have any time; they will constantly have buggy code that needs to be checked. Super stressful for them.
8. What about standups? It's agile, after all. How is that management crap going to work? Will the AI let you know what it did, what went wrong, and what it's going to do? Surely not!
9. How is it going to debug? Write tests?
In other words, it will never happen. Just hecking good luck if you think so.
I think Prime is wrong. GPT is amazing and the code is better than a junior's. In 5 years it will definitely beat a junior. A senior will just prompt what he needs for the current project and GPT will spit out what a junior used to do in the past. Also: companies move FAST. If their profit is threatened (by competitors), they are BLAZINGLY fast. Now, of course the question is: will the AI be that much better in 5 years than it is now? That is probably what Prime means, and yes, it's possible that it will stagnate. But I somehow doubt it.
IMO it is very much possible for AI to take over. After all, programming is a fairly new field, all things considered. I'm not worried though, because if AIs can replace programmers, they can replace most jobs. At that point, it's a matter of "who can transition to a new job the fastest?", which is a strong point of many programmers.
I'll one-up the take on "even if we had the ability to manufacture the chips": where is the electricity going to come from? Most of the grid in the US is near max capacity, and they're adding wind/solar instead of nuclear or other highly reliable sources.
86% correct code (being generous) will never replace 99% correct code. If software quality degrades to 86%, we can safely assume there will be new laws nobody wants but everybody needs. And what does AI really provide beyond snippets? AI-created tests are crap; they have to be considered an anti-pattern (you write the test before the implementation). Could an AI even translate, say, a Java solution into flawless idiomatic C#? That said, yes, exposing closed-source codebases to Copilot, letting its vendor monetize and disenfranchise those codebases, will go a long way toward changing the landscape. Copyright will be a thing of the past in 5 years, because nobody will be able to prove Copilot wasn't involved. And this will ultimately hurt devs far more than being the guy correcting generated code, being the unpaid backpropagation tool. Maybe devs are simply not as smart as they think?
It won't be AI replacing juniors outright; it'll be seniors or middleweight developers finding they can accomplish more by using AI. I've found it's great at writing config-as-code, CI/CD pipelines, and fleshing out unit tests. I don't waste nearly as much time as I used to on that kind of stuff, which frees me up to solve other problems. That said, we could likely sack all the junior devs in the industry today and build products faster regardless of AI. We'll just be screwed in the years to come when those juniors didn't become middleweights/seniors.
There's a place for copy/paste solutions as much as for engineering a solution based on grokking the problem... BUT we are not, generally, afforded the time to be engineers these days, to the point where we don't even have the time allocated to POC our solution ideas. The ask is to get the Jira ticket across the board from left to right ASAP... so we often resort to finding the copy/paste solution...
I hear these people saying "6 months and the industry is DONE!" No experience on the job, just talking out of their ass. Look, the first thing you have to do at work is typically: "Set up your computer. Get permissions to the repo, set it up, and get the code compiling." It can take a WEEK before a new employee has their work desk set up. There's not an LLM on the planet today that does this.
It is worth learning, just not for a junior position, in my opinion. Learn a full-stack framework like Next.js or Laravel (or both), learn Stripe or some payment workflow and basic server management, and start a service somehow. For most people in the global south, it's easier to do that than to get a good junior position with career development.
"Democracy is the theory that the common people know what they want, and deserve to get it good and hard." -H.L. Mencken Also the progenitor of one of my other favorite "raw" quotes: "Every normal man must be tempted, at times, to spit on his hands, hoist the black flag, and begin slitting throats."
I agree with @Prime, adoption into enterprises will be extremely hard. Let's start with the various rules around SOX, HIPAA, and financial accounting, as well as rules like "the right to be forgotten." Let's say the technology is there in 10 years; that means 10 more years of legacy code. It gets more complex when you talk about software at the government level or medical software. Will the technology only be able to handle creating new code, or will it be able to extend the legacy? I work for a company that requires all UIs to follow a design system. Will the AI be able to learn company A's design system and company B's design system? It may help start-ups get going faster, but how do you handle all the legacy? TJ DeVries is right: if you've been a developer for many years, problem solving and communication are more important than even the code. It will come; how quickly, nobody knows. How big will the impact be? Who knows? There is also a pricing problem. In 10 years, thanks to inflation, let's say a Jr. Dev costs $100K; how much will they charge for the software? It will be interesting to watch. I hope to be retired by the time all this mess comes along and can watch from the sidelines.
Big words spoken, AI mentioned, Ban happened, Prime reacted. Truly a moment in history.
react mentioned
Truly one of the videos of all time
could you want anything more?
Of all the moments of history, this was one of them.
The “saying no” part of a client request is so on point. Want something terrible? Certainly!
A guy I worked with deleted all IDs from some objects in the database because the client mediator thought that was the best thing to do (it wasn't)
starting to build stuff for myself was the moment learning journey took a huge leap forward. the amount of questions you stumble upon when actually building something is pure gold and the motivation
stepping out of tutorial hell was the best moment in my early journey getting into coding
Frankly, it also gives you the feeling that there's more purpose in what you do.
Were you building something for yourself (automating things, etc.) or recreating existing projects? I have a big problem with creating something for myself because I don't see any problems in my everyday situations :(
@@bagietmajster8589 I just built a little tool for my own needs: a tool that traverses directories looking for git repositories and lets me know if any of them are "dirty".
As I use git across all my projects and dotfiles, it's useful for me.
I'm on the verge of releasing it on PyPI just for practice.
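This isn't the commenter's actual tool, just a minimal sketch of the idea in Python, assuming `git status --porcelain` as the dirtiness check (any output means uncommitted or untracked changes):

```python
#!/usr/bin/env python3
"""Walk a directory tree, find git repositories, and report the dirty ones."""
import subprocess
import sys
from pathlib import Path

def is_dirty(repo: Path) -> bool:
    # `git status --porcelain` prints one line per changed/untracked file,
    # so any output at all means the working tree is dirty.
    result = subprocess.run(
        ["git", "-C", str(repo), "status", "--porcelain"],
        capture_output=True, text=True, check=True,
    )
    return bool(result.stdout.strip())

def find_repos(root: Path):
    # A directory containing a `.git` folder is treated as a repository.
    for gitdir in root.rglob(".git"):
        if gitdir.is_dir():
            yield gitdir.parent

if __name__ == "__main__":
    root = Path(sys.argv[1]) if len(sys.argv) > 1 else Path.home()
    for repo in find_repos(root):
        if is_dirty(repo):
            print(f"dirty: {repo}")
```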
@@bagietmajster8589 There doesn't necessarily need to be a problem. There could be an optimization that you could find.
@@bagietmajster8589 Just get a diploma. If you're from Europe it is almost free :)
Great video ♥ prime just thinks I'm smart because I keep saying he's really smart (+ unemployed)
neovim masterrrrrrr
Lovely take.
Great take with clear examples to back up your argumentation!
I also think that the video that prime is watching is great
TJ, real talk: how can you make claims about what AI will NOT do to tech jobs in the next 5 years when you say at the start of the video that you couldn't have predicted the current state of AI if asked a few years ago? Don't you see the contradiction in that logic?
Thanks Prime for ridding the community of trolls...
Also, if you're going to disagree and say that Prime's takes are wrong, fine. Explain your reasoning.
He literally gives you the opportunity to explain yourself. Most streamers don't. Most streamers just tell you you're wrong and that's it.
It's not calling you out. It's respect.
I would quite frankly say that as a streamer you can't really give your audience a lot more respect.
they admit that they wouldn't have been able to predict the current state of AI today if asked a few years ago. By that logic they are most likely wrong on their take of what AI will do to the market in the next 5 years
I don't think disagreeing is trolling
@@cornheadahh if you're disagreeing just to disagree and you can't even explain your answer, you're prolly trollin
@@shinoobie1549 What's your point? What does that have to do with my comment?
People over here are saying there is much more than just "coding" in computer engineering. That is true, but there is nothing immediately simple about writing code either. Depending on the problem, sure, you can do some static HTML that takes a couple of hours of dedicated study (which ChatGPT is perfectly capable of helping you with), or something like GPU graphics or systems programming that may take you years to master. Good luck using any AI for that. For me, ChatGPT is an alternative that works in tandem with Google to facilitate access to information; it is not doing anything by itself either.
Yeah, it's a good tool to augment your research and learning.
based take, i use it the same way
GPU programming is much easier than people think though. The program is literally given a coordinate on the screen and has to return a color for that location in RGBA, that's it. However, it's the math that's complicated (depending on what the screen is supposed to display with those color values), and ChatGPT is really bad at math.
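A toy illustration of that "coordinate in, color out" model. Real fragment shaders are written in GLSL/HLSL and run on the GPU; this is just the same shape in plain Python, writing a PPM image so it needs no dependencies:

```python
# Toy version of the fragment-shader model: each pixel coordinate goes in,
# a color comes out. (PPM is RGB only, so the alpha channel is omitted here.)
WIDTH, HEIGHT = 256, 256

def fragment(x: int, y: int) -> tuple[int, int, int]:
    # u, v are normalized coordinates in [0, 1], similar to scaled gl_FragCoord.
    u, v = x / WIDTH, y / HEIGHT
    # A simple gradient; in practice the hard part is the math that goes here.
    return int(u * 255), int(v * 255), 128

with open("gradient.ppm", "w") as f:
    f.write(f"P3\n{WIDTH} {HEIGHT}\n255\n")
    for y in range(HEIGHT):
        for x in range(WIDTH):
            r, g, b = fragment(x, y)
            f.write(f"{r} {g} {b}\n")
```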
@@anlumo1 Imo there is nothing simple about it. I studied a very math-heavy degree, and this is the branch of programming that uses it the most. Doing it efficiently is the hard part, though: you need deep knowledge about what is performant and what isn't (which changes a lot depending on whether you are working mainly on the CPU, the GPU, or both), projections, rotations in 3D (so spatial awareness), knowledge of relatively complex APIs like Vulkan...
Honestly I just use ChatGPT to argue about a problem. One thing I've noticed, though, is that it doesn't like to push back too much. It's way too easy to turn it into an echo chamber.
If a company released an AI that is actually argumentative, that would be huge for my workflow.
Another point is that AI could take over webdev, but certain industries like Defense will almost never adopt heavy AI usage. We're still programming in Ada for pete's sake. The risk of intermingling classified and unclassified data alone will almost guarantee that no LLM gets trained on gov't/defense codebases.
Not true. Lockheed runs on Azure
Unless the massive defense budget manages to create their own llm
It could maybe one day take over shitty web/mobile dev. Good stuff and innovation? Probably not.
@@awillingham Ada is a programming language lol What does azure have to do with this. Read the damn comment next time.
@@smnomad9276 I know he said Ada. The implication is that Lockheed/the military industrial complex has no problem putting secret/top secret data in the cloud now, as long as requirements are met. And Azure is a huge player in LLM providers…
Yes. Saved you an hour
Idk I thought there was some good discussion within the vid. You'd still complain if it was a 10 second vid just saying yes with no context as to why.
You must be one of the people to say “Just play the video!”
@@Dom213 respecting viewers time is a quality only rare youtubers have.
@@Pejatube What is respecting time to you?
@@Pejatube Then I feel you should just watch the original video.
I'm pretty sure people are here for prime takes and discussions rather than "hey youtuber react harder"
Great takes. Loved the problem-solving part. People have gotten used to the fact that when they come to me with "I have a problem, maybe you can help", the first thing they get is "OK, tell me what problem you're trying to solve...". It's mind-bending, the disconnect you sometimes see between "what you're trying to do" and "what problem you're trying to solve". Once that is out of the way, it's time for "OK, now show me how you tried to solve it". Let me tell you, sometimes the connection between what they're doing and what they're trying to achieve is tenuous at best...
p.s. And yeah, motivation is high for "things you want done". The last time it bit me was "I hate my keypad's macro recorder, I just want to write it in plain text by hand...". Cue firing up the hex editor, reversing their (godawful) binary format to the point where I could use it, writing a quick text>macro converter, and testing, testing, testing. When I got it working, I thought, "this sucks, I wonder if I could use an ESP32 as a BT keyboard and drive it from an app on my phone..." Just waiting for the package to arrive ;) And I so hate Belkin for not updating their n52 software past XP and selling it to Razer...
There are still many, many companies that are not fully digitalized yet. A lot of paper is pushed around and manual labor done for things that could be automated by the average developer with well-understood tech. Even if AI could do everything that is promised by marketing right now, it would take decades until it is adopted in the majority of use cases.
The more I work with AI, the more I think it is very unlikely to replace actual devs. Maybe a huge improvement is still coming, but the current state of LLMs is not capable of replacing an average developer.
That, and I think the most value comes from implementing ML tools into your workflow rather than replacing it. Let it rip on things YOU should be doing? Shit code. Consult it strategically and replace manual tasks with ML in various contexts? Money.
I think the problem, as kind of hit on in the video, is that LLMs don't think. I feel like what we'll get with AI is a faster horse; I'm referencing the Henry Ford line about what people would have said they wanted if he had asked them.
Do you think that the current state of the models is as good as they're going to get? I don't understand why people keep doing this. If you said that ANNs would be this advanced a decade ago you'd literally be laughed out of the room.
@@lost4468yt True, but the argument can also go both ways, in the sense that you can't say for sure that LLMs will improve considerably. We just can't tell where we are in AI development for sure.
@@lost4468yt I don't understand why people assume that AI progression will continue on the same trajectory with no regard for the resources required.
I am just watching this instead of studying to know if i should continue or not 😅
Same, I've been procrastinating on Discrete Math 2 😂
Go go go
Same bro😂
Half of it is just getting started. The anxiety of starting sucks but you gotta get it going. Just like with exercise.
AI is gonna be the biggest threat to the human brain.
I think there are currently two fundamental problems holding back the progression of AI:
1. These companies are running out of data sets to keep expanding their models. Companies are already trying to figure out how to create "synthetic" data to have more things to train models on.
2. The current methodologies for training models are ineffective for creating the level of "intelligence" that OpenAI, Google, and Anthropic are trying to reach. It's why we'd probably need 3x the amount of data we actually have available to further tune and add parameters to these already gigantic models. The current paradigms of AI development won't get us a junior-level developer out of GPT-5, Gemini Ultra 2, or Claude 4. They might be an advancement, but they're an advancement on a paradigm that won't work in the long run.
Bonus problem: We don't have the supply chain infrastructure to generate the amount of compute necessary for companies to replace millions of junior developers.
I feel like all these arguments are based around the idea that programmers will be replaced completely by computers/AI. More than likely they will be replaced by FEWER programmers who are really good at using AI to develop code faster. Basically, if you are not at least a "10x" developer in the next 5 years, you have no chance.
@@shinoobie1549 How so? The reality of them not having enough data is just a statement of fact. It's been reported by the Wall Street Journal, the NYT, and others. The second point has been recognized by Sam Altman and the former head of DeepMind in public interviews. Building bigger and bigger models isn't going to get us there. We need advancements in training methodologies if we're ever going to get models capable of replacing developers at scale.
You can call it whatever you want but it’s a workforce reduction enabled by AI.
The ultimate goal is the creation of artificial general intelligence which, being general, can replace every white-collar worker, not just programmers, except for workers in medicine whose jobs require hands-on caregiving to patients.
@@LiveType Right now it seems like compute is allowing these companies to squeeze every last drop out of the current designs, even though we're at rapidly diminishing returns with ever-increasing parameter counts.
I also think compute is enabling them to do more and more dynamic things with standard user prompts while also enabling more kinds of prompts without fundamental architectural changes to these models. While the models aren't really improving that much they are more functional which I think serves as a consumer stopgap until better model and training methodologies can be established.
I liken it to a service that gives you a robot as a date that looks like a person. With GPT-3 the date wasn't that smart, wasn't that entertaining to talk to, and was ugly. With GPT-4 the date was more dynamic to talk to, more attractive, and it would buy you a drink, which made it seem a lot smarter, but the truth is you just enjoyed yourself more.
GPT-5 will be much more attractive, will be able to talk dynamically about all sorts of things, and you'll be able to go to a variety of different fun places with it. You're having a lot more fun and you might even want to go on another date but you wouldn't want it as your significant other if you really got to know it.
More compute simply means more features but until we get something fundamentally different in the model - to belabor the analogy - we won't be introducing GPT-5 to our parents anytime soon.
@@hatonafox5170 you're saying the exact same thing I'm saying, bro. There will be FEWER programmers being hired as AI gets adopted more. It doesn't even have to get that much better at writing code; companies just need to find ways to implement already existing tools into their workflow. 1 good programmer/"prompt engineer" will soon replace 10 average programmers. I think that's the point all these guys are missing, because they have this strawman idea of a workforce made up entirely of bots as the replacement for developers. It's basically the same thing that happened when assemblers were first created: one programmer could do the job of 20 people because there was no longer a need for the 19 extra people who had to painstakingly convert program instructions into machine code by hand.
50:00 school ingrained into us all that we HAVE TO do it right on the first try and that failure to do so is REALLY bad. So yes, 90% of us fear failure and are stuck in "the first try needs to end in perfect results" hell.
5:16 if no one is a junior, then the number of intermediate and senior programmers will keep shrinking. AI won't replace coders until it can be just as good as senior devs.
My father got a certificate for electrical engineering for solar panels in the early 90’s. He installed his first home one last year. Almost 30 years before “it took off” enough to install on people’s homes for profit. My guess is 30 years.
you're comparing completely different fields. Remember software/software development is the same field that brought us javascript which produces a new framework every second. Change happens much faster in software than hardware
People have been installing solar panels for 10-20 years, and I dont live in a rich area. Your dad was just slow
There's entire towns with solar panels here in FL. Your dad taking 30 years to do something companies have been doing for decades is your dad's fault it has nothing to do with that industry.
Stay in school kiddo
For those who are afraid to fail: keep doing that when crossing the road, when climbing a mountain, when using a knife, when landing an airplane. But not when building software. You run it, it crashes, you fix it. That's the great thing with software: you can usually try it over and over again with very little consequence.
"If at first you don't succeed, skydiving is not for you."
except with prod, some failures have pretty serious consequences if used by millions of people (which is why many are afraid to break prod). Public indemnity, corporate liability and contractual obligations can mean breaking prod has real impact beyond temporary downtime. hence try not to experiment with releasing to prod 😅
@@elcapitan6126 that's why you don't test stuff on production builds lol
The Navy was on windows XP last time I checked. Yeah, it’s probably going to be a long transition.
I've been trying to start to learn programming for some time, and for a month I have been watching videos like this and trying out tutorials. Finally I settled for the " Bro Code " channel. Your take on intrinsic motivation is the driving factor on me learning Python. Having adhd makes intrinsic motivation such a pivotal aspect for moving forward.
I like Bro Code, great to pump that info into that grey matter.
If you are learning C++ try The Cherno on YouTube
Loved the positive vibes at 31:15, we really appreciate the amazing environment you foster.
P.s. I'm sorry for the skill issue
All the folks who make money off of other folks learning to code are bullish on learning to code. But their comment sections are full of people who can't find work....
That's what happens when you treat Joe Rando who read a C# tutorial as the same as Joe Degree who spent $100k+ on a high fallutin' degree.
Joe Degree is clearly better, but Joe Rando is cheaper and can slap together some junk code to make it seem like the project is done.
No other engineering discipline does this and they don't have these hireability problems.
If I had never seen either of you before, I would start following both of you based on this video alone. Informative and entertaining from both.
The only people who ever say “AI is going to take over and you are wrong” are either people who aren’t developers, or bad developers.
Yep. They see what AI does and it looks very similar to what they do, and they get scared. Never heard of an actual engineer being scared.
The amount of value in gems packed into this one-hour video cannot be overstated. The best software engineers have the ability to connect business problems with their solution counterpart, which just happens to be software.
That is something a lot of normal devs also don't have; it's more like a BA or PO job (sometimes requiring advice from a senior or lead dev).
Engineers just know how to associate their industry with those around them better than most
Calculators are a lot better at basic math and usually faster than humans. However, mental math is still useful: it gives people a feel for numbers, can be quicker than reaching for a tool, and builds skills and deeper understanding. The more creative, longer math work still often requires humans, especially on the cutting edge. I think the same will hold true for coding. Real coding is not just two or three lines, and the longer the task, the worse the code LLMs generate. There is also the factor of human readability and code looking good, which matters for maintenance and for other readers, and that's a human thing. What I see coming is code generated by AI with more editing by humans to make it better, humans playing with prompts, etc. Next, there are things humans may uniquely see that computers won't get: AI may be horrible at designing a UI or front end that's appealing because it isn't human, or it may use symbols or variable names that humans don't understand as well. Lastly, trusting AI without thinking and know-how may lead to hallucinations in code, security bugs, etc., but working together may be better than human or AI code alone.
Furniture is mass-manufactured; do you see carpenters not having jobs? If anything, the human aspect of their work adds a desirable quality to the product, and carpenters tend to be loaded with money as a result.
Every 30 years life as we know it will be unrecognizable. nothing moves in a straight line. we truly are amazing !
also something to keep in mind is that the average person is not quite literate in terms of using a computer (not even talking about being able to reinstall one themselves which is like several steps a kid could do), so even if AI takes all the jobs at some point, chances are regular people would be too dumb to make proper use of it and would still require someone with more technical knowledge to help them build their whatever
But what's stopping AI from just taking over those steps?
ai will not take over your job, some guy using ai will
It's gonna go like driverless cars - that last 1% before you can say you're error free (at an acceptable rate) is going to be insanely hard and take ages
Except that technology is never error-free hence why maintainers are always in demand.
I don't even think it needs to be all jobs...30% would make things uncomfortable
1hr to answer that💀
Fr! No way I’m watching all of it
4x engineer
lmao fr. feel like this topic has been covered so extensively
he's 10x engineer
lol,not gonna watch all of that, what was his answer?
Funny how I'm rewatching this and there are a lot of sprinkled-in good practices and characteristics of a software engineer. Good value! 👍
60 years ago when high level languages were first invented they said that programmers would soon be out of a job.
If you guys couldn't predict the current state of AI a few years ago, what makes you think you can predict what it will do to the job market in the next 5 years?
the only good take in these comments
THANK YOU
AI is moving exponentially, we will see the next generation in less than 5 years.
We are literally going to get supercomputers that cost 6 figures rather than millions
The profits are going to be immense; the big companies will build their own, i.e. Google/Amazon, and the smaller ones will buy Nvidia/AMD.
I personally see AI threatening many consultant jobs more so than fulltime engineers.
Smaller firms will still use single person freelance shops
Missed opportunity "we used to have horses and donkeys, now we have caml"
24:00 I heard the TigerBeetle devs describe it as sculptors vs painters. Both are making new works, but sculptors take what is and shape it to their vision, vs painters who use tools to make something from scratch.
tbh what customers SAY they want and what they actually WANT are often two different things, that alone is a huge challenge in itself and I think AI will have a lot of trouble with that.
AI is a tool, and it's far from perfect. I personally use it as a smart search engine, and Copilot as a snippets system; I check whether the output is close to what I would do, then fix the parts that make no sense. I think generative AI will always require manual checks because it's not reasoning about the code, it's predicting a highly probable output based on input.
Another point, we underestimate ourselves as humans and I think we can and will adapt to any changes.
lol just finished reading Blood in the Machine; how ironic that we're starting to see our own tech the way the Luddites did
I don't think AI will replace junior coders until at least 2030, and most businesses will take until 2035 to actually implement it. Once implemented, businesses will still want to keep some junior devs because it reduces the risk of the AI messing something up. It will be a similar situation with self-driving trucks: we will have drivers sitting in them for at least a decade after mass adoption. Risk assessment is a very real thing for businesses. No one is throwing a Hail Mary when they can slowly phase something in over a decade.
This is a great take; more people need to see other software engineers who have been around for a while to give a realistic take on this question.
My best choice was someone who got me started with coding in my later thirties. I am not the best, but I love it and continue to learn every day.
I have so many people on my channel who are noobs giving up on learning to code based on the hype they hear from YouTube developers who want to get views.
If you enjoy coding, don't quit. Thank you both for this video.
The devs trying to get views are the ones whose channels rely on you thinking coding is a good path forward. Not the ones essentially saying you don't need this channel anymore.
@@Icedanon Prime has had this take even before he decided to leave Netflix. He had this take even when he could've easily afforded his channel and content disappearing the very next day. He's had no financial reason to persuade an audience one way or the other.
@@headlights-go-up I'm not saying he is. I was just trying to point out that it makes more sense the opposite way that the op suggested
@@Icedanon The talk around AI replacing jobs is a big view generator. Fear from devs, experienced and beginners alike, creates views.
Yeah it may not make sense in the long term, but it certainly does in the here and now.
@@Lazlo-os1pu literally everything with ai is a view generator. That's no excuse.
I think "AI will at some point replace junior devs" is quite short-sighted thinking (sure, a lot of managers don't think further than literally tomorrow, but that's a different topic).
Every senior at some point in the past was a junior dev.
So, let's say at some point an AI will straight up be able to replace every junior dev and it's not overly expensive to use.
Ok, if that gets adopted at a wide scale, that pretty much means that a huge chunk of junior devs will not exist. But that also means that a huge chunk of senior devs won't have a chance to exist. Becoming a senior requires quite a lot of experience, most of the time multiple decades of doing it full time. That's not really something you can do in your free time, well, except if money isn't a problem and you can practically ignore monetary problems.
For a time, that's not going to be a problem, but at some point the current senior are going to retire. So there are going to be less and less senior devs.
At first that seems similar to what the financial industry with COBOL is going through, but it's quite a bigger problem.
COBOL at its core is still a normal programming language, sure it's old, but at its core it's the same as e.g. JavaScript or Rust or Haskell.
So at least theoretically they can get replacements from different industries.
But if junior devs fall away in every industry, you can't replace the senior devs in YOUR industry because EVERY industry has this problem.
So, the question at that point will be: Will AI become good enough to even replace senior devs fully before too many retired? Because if the answer is no, things are going to be interesting. And I doubt anybody can be able to predict what's going to happen then.
If a model can replace a junior dev then it's barely a stretch for it to replace a senior dev. The difference between a junior and senior is virtually zero compared to getting a model from a stage where it can't do anything useful to where it can replace a junior dev.
I got into development three weeks ago, not to solve a problem, but "to see what would happen if" - spiraled from there, and I now dream in code. I spend every waking hour thinking of things I want to try, using code. It's both a blessing and a curse. My notes are now a bunch of code snippets and terminology, and I have anxiety about a git repo I made.
3 weeks ago... Burnt out yet with that energy?
20:55 The current state of LLM chatbots building code is method by method, function by function. You WILL NOT be able to ask one to build an application. HOWEVER, you can build an application with it.
Think of it like writing a book. Its understanding is not broad enough to create a novel, but you as the author/prompter know where you are in the story and what you want to portray next, so you give a brief description of where you're at and how it can help you jump forward a little.
The key here is finding a way for you and the AI to keep as much of the past in the LLM's memory as possible. If you're writing code and you finish a package and move to another one, but ask a question that needs to reference work it helped you with earlier, it just won't contextualize the past.
Exactly, it can write a script, not an application. The applications it creates are copy-pasted code from GitHub.
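For what it's worth, here is a minimal sketch of the workflow described above: keeping the past in the model's "memory" by re-sending the running conversation with every request. It assumes the openai Python package and an API key in the environment; the model name, system prompt, and example prompts are placeholders, not anything from the video.

```python
# Minimal sketch: build code function by function while carrying the whole
# conversation forward, so earlier work stays in the model's context.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
history = [
    {"role": "system", "content": "You are helping me build a CLI tool, one function at a time."}
]

def ask(prompt: str) -> str:
    """Send a prompt along with the full conversation so far, and remember the reply."""
    history.append({"role": "user", "content": prompt})
    reply = client.chat.completions.create(model="gpt-4o", messages=history)  # model name is a placeholder
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer

print(ask("Write a function that lists all git repositories under a directory."))
print(ask("Now add a flag to that same function that skips hidden directories."))
```

The obvious limit is the context window: once the history grows past it, the earlier packages fall out of "memory" again, which is exactly the problem the comment above describes.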
We are not just experiencing a shortage of chips to produce ubiquitous AI that can code decently; we also don't have the computing architecture to meet that demand, and probably don't yet have the algorithms to do it.
Omg Yes! There are so many topics where people try to teach something and everyone skips over the most important step. It's like they all got together and decided to keep it a secret. It's like that one class you need to graduate that's really hard is taught at 8am on a Friday every single semester! (spoken in the voice and delivery of Sam Kinison)
Of course the governments want to stop low level programming, because it gives you freedom to do whatever you want without being held back by a runtime environment. In the future, it's entirely possible for governments and companies to restrict the runtimes and frameworks so that you have to pay for them or be special in some way to use them. Can't restrict assembly and C in the same way.
Here's a personal experience of mine that I unfortunately live out on a day-to-day basis. I work in automation: I help companies evaluate their processes and automate them. I primarily deal with small to medium-sized companies that are just going into the growth stage. The number 1 thing I see is the absolute refusal of staff to change and learn something new. We all know that one person who's been in the position for 20 years and refuses to use Excel because she's used to her calculator. And there's no amount of coaching you can do; they refuse and there's nothing you can do about it. These people are... EVERYWHERE. Now imagine you're trying to get someone to give up their job so that an AI can take it over. Yeah, good luck. To a certain extent, we are all Luddites.
The idea of replacing junior devs is insanely short-sighted. You don't get senior engineers without junior engineers.
Furthermore, if you replace juniors, eventually you will have no engineers.
16:59 :D gave me a giggle, good video so far, came by to say hi.
in the process of updating my ecommerce boilerplate from next 12 to 14 and it kind of sucks. Found myself stuck a few times reworking problems I already solved but at the same time I am enjoying the process of really breaking down next and learning it on a deeper level. As shitty as the process can be the growth and performance benefits outweigh the suck.
I've learned Lua, bash, php, Javascript, python, and still learning more, in just the last year. Still haven't written any C yet but it's always been the seniors jobs. I've only ever needed to understand C but never write it yet. By the pace of AI today, I feel it won't be needed for future development to focus on new tools. When AI can make un-bloated code, and we don't even care to read what it's written because it works, then the real problems begin.
Disagree. That's like letting AI be smarter than you as a human; the goal is to use that mf and acquire something that AI can't do. So I think one will have to understand why things work exactly the way they do, and why they only work that way. Man's job will be to gain an advantage over AI's solution by learning way more in less time, using AI itself.
15 minutes video, 60min for Prime to watch it, 4h for me to watch it. Damn, I forgot to record myself to watch it to keep this rolling.
I love that AI is my own personal tech support, office hours, TA, tutor, and pair programmer.
But it isn’t a person who can really be a software engineer just yet from what I can tell.
Its always funny being in Prime's comment section. Half of the people have already grokked the point of the video and saw nothing of value, and the other half is green enough for the entire vid to be an eye-opener.
YouTube should have a "finished" badge on comments for those who watched all the way through.
As usual, there's stuff I knew and stuff I didn't. Same great experience!
I am a self-taught developer and entrepreneur. I began my development journey about 3 years ago, and I'm roughly 2 months away from shipping our first app.
It's 100% still worth it, but I believe you need to have a specific goal *for it to be worth it (and to keep you motivated as you learn).* I learned development because I realized there's an app that I wanted to turn into a reality, however, I didn't have the capital to hire developers. So I did it myself.
This video is absolutely phenomenal, and for the most part, I agree with the points on AI, and most of the points TJ made, especially regarding people developing out skills beyond "I can write syntax." Ultimately, those are the skills that will take you further than anything.
Keep up your amazing work! You've definitely earned yourself a new follower.
Cheers.
I agree with at least 2038. Besides the development of the technology, and as you stated, this will have to be tested in encapsulated environments over time and through countless scenarios just to get the ball rolling.
A good example of the speed that companies move at is the good old help desk.
The time from overloaded sysadmin being the SPOC for support to having an actual help desk was no less than 5 years.
Another thing is that a lot of industries depend on strict safety requirements and standards. In automotive we are using, e.g., MISRA 2008. Yes, it only came off this year; there is a newer standard from 2014, but the industry has so much inertia. There are a lot of standards regulating the development of AI for cars, but they are still being developed. And we haven't even mentioned ensuring that AI will write safe and secure code.
And with that said, inertia comes not only from particular companies but also from legislation and the adoption of new standards. Without that, companies won't be able to say their product is safe, and any accident traced back to devs who just accept a "Certainly!" will shoot them in the foot, with a potentially huge impact on market share. No company takes that risk if the technology is not mature enough, both technically and legally.
As a person who enjoys coding more than problem solving (I enjoy structuring code and writing it), hearing an argument about how coding may get stripped away from software development and replaced by writing prompts isn't something I look forward to.
The people who gleefully and proudly say it's pointless to learn to code should ask themselves 3 questions:
1. What do you think is going to happen to all that legacy code? Do you think it's as simple as pasting it into a prompt and waiting? Hahah
2. If AI reaches the level where it is able to replace ALL programmers (a complex skill) what do you think is going to happen to the rest of you? (Do you think you won't be next) 💀
3. Do you hear hackers asking "Should I still learn how to hack?" 🤣 Heck NO! They are gearing up to wreck all of this vulnerable AI generated code, and shortage of potential devs who chickened out, rather than learning to code.
GitHub’s last presentation of their new copilot concept for the future was essentially #1. You write up an issue with what you want to happen, it identifies the files to change, writes it up, etc.
Feasible right this moment? Probably not, but for the future, yes.
1) what do you mean? The models are perfectly capable of reading legacy code, just as they can read your code? Why would this be any different?
@@lost4468yt the irony is that they are being trained on this legacy code
@@shinoobie1549 they're being trained on all sorts of code? Why is it ironic that they'd have legacy code in the training data?
@@lost4468yt because that guy thinks the Achilles heel of AI will be its inability to deal with legacy code, when most of the code that AI is being trained on IS legacy code, or will be by the time it is good enough to reduce the developer workforce by over 50%.
IRONICALLY, it will be non-legacy code that is one of the few areas where AI will not be as useful, because although it can produce novel solutions to problems, unless they actually develop AGI these tools still don't have the agency of a human being to develop completely innovative ideas without being directly prompted to do so (and sometimes still can't even when prompted, although that will most likely get better in the future).
Even if complete replacement is years away or impossible, imagine the dynamics of the job market when the productivity of an average engineer suddenly grows by, say, 20%.
The salaries in IT and the (on average) comfortable hiring process were both results of a job market dominated by employees.
As soon as the balance changes, that's all gone. So even if practically the profession is not gone, it will stop being as lucrative.
I've been writing software professionally for 20+ years and I see future of software development that I'll be competing in *communicating* with normal people about their needs for new software. Once normal people feel that they can more successfully explain their ideas to AI than to me, then AI will take my job. Until then, I see future software development as I communicate with normal people and I then communicate with the AI and fix possible mistakes that future AI still makes. Right now, the AI can do pretty little compared to my output but I'm fully expecting future AI systems will be smarter and smarter every year and at some point the future AI will produce better code for a clearly specified requirement spec. I'm still not sure how long it will take until AI can communicate with normal people so well that AI can get the requirements directly from the normal people so they can cut me off the chain.
I probably have 20+ years until retirement, and I have trouble seeing a future where my current work cannot be done by a high-level AGI, which some will call ASI. And I can only hope that we switch to universal basic income (UBI) or something similar, because otherwise so many people will be miserable that society collapses.
I think it's more likely that they'll be better at communicating requirements than you, before they're better at actual implementation. The objective testing has shown that they're really good at things like law, healthcare, etc. All things that are much closer to getting requirements and interpreting them correctly.
@@lost4468yt I totally agree. Most software developers I know are not exactly stellar at communicating with normal people. As a result, AI will take over programming jobs once the AI is intelligent enough and cheap enough to run. If I understood correctly, Devin already costs hundreds of dollars to use for even simple tasks. That's not a threat to human programmers at that cost level, especially when Devin is not yet even at junior developer level.
However, give it a couple of years and it's at junior developer level and costs $0.10/h to run and trying to get any new human developers to senior level is going to be really really hard because all junior positions will be taken by AI.
Much needed video! Thank you very much.
The 1 book would be The C++ Programming Language, 3rd Edition, in my case. My favorite part is when Bjarne writes: you pay for what you use, and what you don't know won't hurt you.
As long as you use what you do know.
Also, that was the first time RAII was explained to me, and I almost got it too. Now I know.
Imagine if there was a succinct way to ask the LLM/Copilot/GPT to get the application the way you want it....
this video was seriously informative, im not a fan of reaction content and i sat through the whole thing and came away with lots of helpful information
oh, unreal tournament mentioned! it was so good, bro
as a novice coder tinkering around with pico-8, going back and forth with chatgpt has helped me get a better understanding of lua, to the point where i can spot mistakes in the code it suggests from the processes i am looking to write.
i think it is all down to how it is being used.
ChatGPT had to try 3 times today to convert a Django ORM query into raw SQL. If I'm gonna have to spoon-feed it, I might as well discover the solution by trial and error.
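As a concrete illustration of the kind of round trip meant here (the Order model, app name, and fields are hypothetical), Django itself can print the SQL behind a queryset, which is sometimes quicker than spoon-feeding an LLM:

```python
# Hedged sketch: a made-up Django ORM query and the raw SQL it roughly maps to.
from django.db.models import Sum
from myapp.models import Order  # hypothetical app and model

qs = (
    Order.objects
    .filter(status="paid")
    .values("customer_id")
    .annotate(total=Sum("amount"))
)

# Django can show the SQL it would run for this queryset:
print(qs.query)

# Roughly equivalent raw SQL (actual table/column names depend on the models):
# SELECT customer_id, SUM(amount) AS total
# FROM myapp_order
# WHERE status = 'paid'
# GROUP BY customer_id;
```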
I asked GPT to find methods with an empty body across Java files. It returned some kind of nonsense regexp. Meanwhile, the Java compiler has been able to identify an empty body for the last 25 years.
I believe root-causing is the true way to distinguish a programmer from a non-programmer. I approach every problem I have like that, whether political, personal, or in other jobs I've had. If the first thing you do when you see a problem is find the source of the issue, it makes your life much easier and you more successful as an employee or entrepreneur.
I mean, before, there were 100 farmers plus cattle on a field; when the tractor was invented, there were 3 people on a farm. What I am trying to say is there will still be a need for developers, but we might be more like prompters at that point. Idk, just a thought.
I really like when you open conversations with viewers!
Honestly, odds are thousands of jobs will be lost regardless of how advanced AI gets simply because most CEOs are greedy morons. In the short term, there will be massive changes, but it's hard to predict how things go long term
The thing with AI is that it acts as a force multiplier, much like computers. Recall the significant impact that computers had across all industries when they were introduced, significantly accelerating development and productivity. Similarly, AI will not only influence software development; it will enhance everything, including the speed, efficiency, and spread of technological development. This, in turn, affects other industries, which then influence each other, impacting software development beyond merely replacing junior roles.
12:50 we’re solving “real or perceived problems”. That hit different
I was the first Neovim developer, but it doesn't characterise my personality entirely just with neovim. I just don't have the time to fiddle with an editor to the extent vim requires right now.
So I identify with both the copy-paste and "solutions" developer, and I think that's important for some to hear.
Professor Prime, even if there were some magical optimization that allowed better-than-GPT-4.0 performance running locally on existing GPUs inside each viewer's desktop, there would be managerial problems for adoption that would take 5+ years. This is what you should focus on: hardware requirements are going down for the usable lightweight output models resulting from heavyweight training on hundreds to thousands of GPUs. You don't want your point to get bogged down in the "can we even run it" conversation when that's not what you are trying to get across to these college kids about the realities of enterprise work environments and corporate bureaucracy adopting AI. I'm in the group that's going to be allowed to use Copilot first at our company, and even I still don't have that old tool.
Prime is inspirational, but he is also the epitome of "why are you depressed? just don't feel sad bro" when it comes to his takes on psychology and self improvement
For me, AI is most useful for helping me understand concepts in a more organic way than simply Googling. I can inquire about a topic, get an answer, ask deeper questions like I would ask a teacher, and get more specific responses. I cannot stand reading or rereading information I already know just to find the bit of info I missed or forgot. It has helped me understand errors in my code instead of googling some error code and spending hours digging through responses hoping they are accurate. If anything, I think we will all have an AI that is like a mentor: tracking your learning progress in different areas, noting where you often make mistakes, and guiding you to work on those skills. Just my take on the issue.
i think the general public don't understand how hard it can be to get something super specific out of an llm, back in gpt3 days it took me 86 prompts just to get a point outline of a cat (of any kind), like how would a non-dev know if the code they're reading has a bug or not, so how would they know if they got the prompt right?
For starters, the way Devin is developed forces it to check its own bugs. That's a good methodology to build on.
53:53 More importantly, he implied that he has more than two legs...
You nailed it. The bottleneck of AI is the prompt.
57:00 nah, watch it. To me this channel is a W because of how long it is; taking a 5-minute video to 30+ minutes and making the dense information actually easy to understand is what I love, and it also corrects my skill issue.
I've got some questions.
1. First of all, say Google has the AI and Microsoft needs it. Does Microsoft trust Google, or vice versa? What will the AI developers look like?
2. Or will companies just say, "Yes, Google, we believe you won't collect our product"?
3. Can every single company on earth afford to write their own AI? Even if they can, how are they going to get that much computing power?
4. Will the AI devs be cheaper than juniors?
5. What about the responsibility of AI? Will it just say "Apologies" for a mess-up?
6. Okay, imagine no junior positions. So how will you find mid+ devs after kicking all the juniors out? Where? Do they grow on trees, or what? This is damn important; sooner or later those seniors will retire.
7. Imagine the day of a senior developer. Seniors won't have any time; they will constantly have buggy code that needs to be checked. Super stressful for them.
8. What about standups; it's agile after all. How is that management crap going to work? Will the AI let you know what it did, what went wrong, and what it's going to do? Surely not!
9. How is it going to debug? Write tests?
In other words, it will never happen. Just hecking good luck if you think so.
I think Prime is wrong. GPT is amazing and the code is better than a junior's. In 5 years it will definitely beat a junior. A senior would just prompt what he needs for the current project, and GPT will spit out what a junior used to do in the past.
Also: Companies move FAST. If their profit is threatened (by competitors), they are BLAZINGLY fast.
Now, of course the question is: Will the AI be that much better in 5 years than it is now, and this is what Prime probably means and yes, it's possible that it will stagnate. But I somehow doubt that.
IMO it is very much possible for AI to take over. After all, programming is a fairly new field, all things considered.
I'm not worried though, because if AIs can replace programmers, they can replace most jobs. At that point, it's a matter of "who can transition to a new job the fastest?", which is the strong point of many programmers.
Only worth it if you're very high IQ and smarter than ChatGPT. I'd say 90% of people are not, including myself.
We can discuss about when the AI would surpass a human at coding, but everyone can agree that C++ is NSFW.
I’ll one up the take on, “even if we had the ability to manufacture the chips.” Where is the electricity going to come from, most of the grid in the US is near Max capacity and they’re adding wind/solar instead of Nuke or other highly reliable sources.
86% (being generous) correct code will never replace 99% correct code. If software quality degrades to 86%, we can safely assume, there will be new laws nobody wants but everybody needs.
And what does AI really provide beyond snippets? AI created tests are crap, they have to be considered an anti-pattern (you write the test before the implementation). Could an AI even translate say a Java solution to flawless idiomatic C#?
That said, yes, exposing closed-source codebases to Copilot, letting its vendor monetize and disenfranchise those codebases, will go a long way toward changing the landscape. Copyright will be a thing of the past in 5 years, because nobody will be able to prove Copilot wasn't involved.
And this will ultimately hurt devs far more than being the guy correcting generated code, being the unpaid backpropagation tool.
Maybe devs are simply not as smart as they think?
Not Safe For Work coming full circle
Bringing up the chat and asking them what they mean is so based it hurts.
It won't be AI replacing juniors outright, it'll be seniors or middleweight developers finding they can accomplish more by using AI. I've found it's great at writing config as code, ci/cd pipelines, and fleshing out unit tests. I don't waste nearly as much time as I used to on that kind of stuff. Frees me up to solve other problems.
That said we could likely sack all the junior devs in the industry today and be able to build products faster regardless of AI. We'll just be screwed in the years to come when those juniors didn't become middleweights/seniors.
There's a place for copy/paste solutions as much as for engineering a solution based on grokking the problem. BUT we are not, generally, afforded the time to be engineers these days, to the point where we don't even have the time allocated to POC our solution ideas. The ask is to get the Jira ticket across the board from left to right ASAP... so we often resort to finding the copy/paste solution...
I dislike trolls when they state something but never defend their stance and then leave. Coward.
I hear these people saying "6 months and the industry is DONE!" No experience on the job, just talking out of their ass.
Look, the first thing you have to do at work is typically: “Set up your computer. Get permissions to the repo, set it up, and get the code compiling.” It can take a WEEK before a new employee has their work desk set up.
There’s not an LLM on the planet today, that does this.
It is worth learning, just not for a jr position in my opinion. Learn a full-stack framework like Next.js or Laravel (or both), learn Stripe or some payment workflow and basic server management, and start a service somehow. It's easier to do that than to get a good jr position with career development, for most people in the Global South.
"Democracy is the theory that the common people know what they want, and deserve to get it good and hard."
-H.L. Mencken
Also the progenitor of one of my other favorite "raw" quotes:
"Every normal man must be tempted, at times, to spit on his hands, hoist the black flag, and begin slitting throats."
I agree with @Prime, adoption into enterprises will be extremely hard. Let's start with various rules around SOX, HIPAA, and financial accounting, as well as rules like "the right to be forgotten". Let's say the technology is there in 10 yrs; that means 10 more years of legacy code. It gets more complex when you talk about software at government levels or medical software. Will the technology only be able to handle creating new code, or will it be able to extend the legacy? I work for a company that requires all UIs to follow a design system. Will the AI be able to learn company A's design system and company B's design system? It may help start-ups get going faster, but how do you handle all the legacy?
TJ DeVries is right: if you are a developer, especially for many years, problem solving and communication are more important than even the code.
It will come; how quickly, nobody knows. How big will the impact be? Who knows? There is also a pricing problem: in 10 yrs, thanks to inflation, let's say a Jr. Dev costs $100K, how much will they charge for the software?
It will be interesting to watch, I hope to be retired by the time all this mess comes along and can watch from the sidelines.