I have an interesting take: if we make AGI, it will be VERY VERY expensive per request (like OpenAI o3 is now). That means you can't waste requests on it, which means an expert human will have to decide what the best use of the model is, i.e. what requests to give it. Even if the model somehow prompts itself, we'll still need a lot of human programmers to verify the model's output is actually good, because no one wants to blindly trust software written by someone else.
In the worst case, AGI will be too expensive to run at global scale for at least 20-30 years, so coding still isn't dead.
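A rough back-of-envelope sketch of the cost argument above. Every number here is an illustrative assumption (made up for the example, not real pricing or usage data); the point is only that per-request cost times global demand explodes fast:

```python
# Back-of-envelope: can an expensive AGI-class model serve global demand?
# All figures below are illustrative assumptions, not real numbers.
cost_per_request = 20.0        # assumed dollars per hard reasoning request
requests_per_user_per_day = 5  # assumed usage per person
users = 1_000_000_000          # assumed global user base

daily_cost = cost_per_request * requests_per_user_per_day * users
print(f"${daily_cost:,.0f} per day")  # $100,000,000,000 per day
```

With these (hypothetical) inputs the bill is $100B per day, which is why an expert has to triage which requests are worth sending at all.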
This is probably one of the most interesting comments I've seen, and I totally agree.
Although I think AGI will gradually get cheaper, simply because to improve the model they need users and a ton of feedback, so I don't know how that would look in the long run.
I think it isn't coding itself that will be most valued but rather prompting. That's definitely an untouched gem that most people overlook.
Making a company that purely focuses on this might be one of the best opportunities for the next 5 years.
What do you think?
I think coding makes you dead...😊
does it? haha
Depends on the project. Money isn't everything in life. Find a balance between a meaningful and a well-paid job.
Some companies pay "fees for the struggle".
@@tomasprochazka6198 I totally agree - but lots of people love coding, I like it too tbh, but not in too high doses haha.
Lots of people talk about jobs being "meaningful" or not; I think the better question is: "does it align with your goals?"
Alignment is far more important than meaning, in my opinion at least.
what do you think?
@@tomasprochazka6198 "fees for the struggle"... i really wish
Unfortunately, you have it backwards. Let AI do the rote parts of coding - boilerplate, the mechanical typing - while you do the thinking and guiding. If you don't understand the deep parts, then you'll end up working in just the natural language layer.
Most importantly, you're putting your ideas out there and testing them. That's all the work.
That's exactly what I said though... (you need to know how to code so that you can let AI code & iterate on that code) - maybe I wasn't clear.
This is exactly what is going to happen in the short term
You're saying what every SWE has been saying for the past 2 years: LLMs are only powerful with someone to nudge them and provide the right context.
That will never be accomplished by an MBA with zero technical knowledge building large and complex systems. Engineering isn't going anywhere, and the folks who say it is have never built anything meaningfully complex in their lives.
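A minimal sketch of what that "nudge and context" actually looks like in practice. `build_prompt` is a hypothetical helper (not any real library's API); the message shape just follows the common chat-model convention of system/user roles. The engineering judgment lives in choosing which code and which constraints to pass in - the model never picks these itself:

```python
# Sketch: the context an engineer curates before any LLM call.
# build_prompt is a hypothetical helper; all names here are illustrative.
def build_prompt(task: str, relevant_code: str, constraints: list[str]) -> list[dict]:
    """Assemble the curated context: which code matters and which
    invariants must hold. This selection is the engineer's job."""
    system = (
        "You are a coding assistant. Follow the constraints exactly:\n"
        + "\n".join(f"- {c}" for c in constraints)
    )
    user = f"Task: {task}\n\nRelevant code:\n{relevant_code}"
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

messages = build_prompt(
    task="Add retry logic to fetch_user()",
    relevant_code="def fetch_user(uid): ...",
    constraints=["Do not change the function signature", "Max 3 retries"],
)
print(messages[0]["content"])
```

Knowing that `fetch_user()` is the relevant code, and that its signature must not change, is exactly the technical knowledge the comment says can't be skipped.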
But this gives top engineers a surreal advantage over the majority of engineers, which will impact jobs overall.
Also, a smart MBA now needs less effort and time to train themselves to think like a top engineer, so barriers will go down.