I saw someone observe that at ChatGPT's current level of competence, it's less like having one really good programmer and more like having a bunch of mediocre ones. I could see it becoming a force multiplier for capable devs that would reduce the opportunities for junior devs, but it looks like it'll be a while before it can really do everything.
It can give you 2-3 months' worth of coffee drinking and playing MK with your boys in 5 minutes... I would not manage my wife if I had to work hard instead of MK tournaments.
One of the big problems I see with it is that it's extremely confident in its answers. And when it's wrong, which does happen a fair bit, it's extremely, confidently wrong.
Even mediocre programmers can generally test and debug code. ChatGPT in its current state can't even run code on proprietary or development hardware or emulate it. It just writes mediocre code, which is the least demanding part of programming. It must be fun to debug. A more specialized system is needed if you intend to replace actual programmers.
I don't think it'll even reduce junior dev opportunities. Junior devs don't solve really simple coding questions; they work in already really complex codebases. ChatGPT has no way of knowing about the company's business logic or how their internal libraries work. To do that, they'd explicitly have to train their own model with their own data and hire an ML engineering team to manage it, and I don't think many businesses could make the argument that that would be cost-effective.
It's regurgitating info it found elsewhere, without the experience to know when that information is entirely correct. But that will be improved quickly.
The real danger of such tools is that in 20-40 years we're going to end up with "programmers" who barely understand how anything works around them, with doctors who don't know anatomy, and so on. People just don't understand that we have one, only one, heavy burden of our existence we cannot pass on to the machines. It's our duty to learn, to think, and to make educated decisions. The moment we give those up to machines, we're on the road to degradation. But we will do so willingly, because thinking and learning are hard, and our inner animal doesn't like that. Watch the movie "Idiocracy", especially the hospital scene. That is, roughly, our future.
@@williamdrum9899 There is a big difference between automating routine mindless operations and automating thinking. Various compilers and interpreters serve various needs. Some provide higher levels of abstraction and thus more automation (when its cost is acceptable). But even the most abstract languages still require a complete understanding of the problem at hand, and of the way it must be solved. The compiler/interpreter merely spares you the mostly mindless translation of your insights into the machine code of multiple hardware platforms.
I think of it as an assistance tool. All of us are already using Stack Overflow and copying code from there. That didn't replace devs either; you'll still need professionals who know how to piece the code together, test it, and evaluate it for maintainability, subsequent bug fixing, and readability. The AI itself doesn't know jack-shit about that, and it'll be a long time until it does. It's simply an evolution of knowing how to formulate your questions properly for Googling.
I have two pieces of feedback. Have you tried telling the AI that there's a bug? I did, and it fixed it 😃 Also, the thing I really like about ChatGPT is, like you said, that it's a very good tool. I asked it "could I approximate the distance from an eNodeB to a UE, using only RSSI and RSRQ values?" It's a technical question, a very hard physics problem. I don't like physics; I'm an embedded software engineer looking to learn a bit about a field. Not only did ChatGPT explain to me why this was impossible in general, it also actually gave me an implementation in C where I *could* approximate distance. This is a level of assistance that Stack Overflow couldn't offer you, since you don't know what exactly you don't know. ChatGPT does. It's quite amazing technology.
@@vanish3408 No you don't, you just shove in your code and type in the error. Even if your project spans multiple files, it will still detect the problem.
I so appreciate your channel, as I've said many times before. You actually understand the low-level engineering necessary, and you are correct: it is a tool, and it can help people accomplish more complicated algorithms.
I agree with your conclusions. Trying to use an AI to create a program would be like trying to translate a novel using Google translate. You'll get close but in the end it'll be a mess, though it might be a lot closer and easier to fix that mess than it would be to translate from scratch.
Agreed, but the trend seems to be that AI gets better at an exponential rate; its current capability might be way less than its potential in the future.
How about this: I could get the essence (all the inferences and deductive logic) from all the Sherlock Holmes novels in 3 seconds. Shall we now take your 'computer translation' analogy for its argumentative value?
A lot of people severely overestimate just how fast we're going to go from here. This AI revolution has happened a lot of times already: language (Google Translate), self-driving (still WIP), chess (Deep Blue), Go (DeepMind), coding (GitHub Copilot). Every time, people worried that this would now kill everything and that in a couple of years it would beat out humans. Some of these things are already a decade old with virtually zero improvement. AI technology will plateau, and breaking through that is insanely difficult.
Did you just use chess/Go in your list of examples to illustrate AI hitting some kind of plateau? You do realize people took the same concepts and demonstrated them in StarCraft/Dota? You do realize that solving the game was not the actual purpose... and that AI completely crushes humans in those games?
In the short term? I agree. In the long term? Probably not. Technologies often follow the pattern of there being a massive amount of hype and expectation around them while they're in their infancy then the technology fails to meet our expectations within our minuscule attention spans. The thing is though, just because you're not looking doesn't mean there isn't progress being made. So many things we've dismissed have surpassed even the most optimistic person's predictions and have become fundamental to our modern way of life.
Nah. You aren't taking into account industries outside of tech. Art and entertainment industry workers are freaking out about Stable Diffusion. SD allows people to use copyrighted artwork as a dataset and mimic any artist's style impressively accurately. This AI is already taking jobs like book illustrator, logo designer, and concept artist, jobs that give artists a financial cushion. Big gaming companies and animation studios have already let go of some of their concept artists in favor of AI, an alternative that's cheaper and quicker.
I see ChatGPT as a place where I can ask more complex questions than with a search engine. I then pick out things from the response and look them up.
@@scratchy996 How do you know? I've been watching the development of AI since the '70s. There were many times when people said: "Hey, wait another 2 years." Nothing happened. And if you understand the model behind it, you know that there is a lot of work left to do. Simply fine-tuning and more money will not help. This requires fundamental advances in technology, and that may take decades more. It is unlikely that someone will come up with a breakthrough technology within 2 years.
I tried it and am impressed, but... it often lies about properties and NuGet packages that don't exist. For more complex cases, the code it generates is usually wrong in some way and requires an experienced engineer to debug and fix. But the fact that it gives new ideas for solving complex business logic problems let me write an application with complex geometry that I wouldn't have been able to do without it.
Using ChatGPT will (most of the time) be like using a calculator for math questions: on one side the problem gets solved, but on the other you won't learn anything from it. To me, ChatGPT looks like a problem solver that solves the wrong problem.
I've been using it for a bit. If you ask it to solve something, it will likely explain the subject for you; if not, you can ask it to explain. I'm having fun and gaining knowledge so far. The obvious downside is that it's a bit slow to respond.
Not only that, but once you've solved some complex problem and fed it into the association engine, you're essentially donating it to the engine's owners.
It’s great for learning, incredible even. BUT it IS overhyped. It won’t replace anybody but it can make a programmers life much easier if there are simple things they need, for example code snippets etc…
It requires a different set of skills to use. Personally, I'm not concerned with being replaced by AI because it is my intention to learn to use these tools so I'm not obsolete. Fun fact, the tool wants to teach you how to use it.
True, not right now. However, I see that everyone keeps forgetting this thing literally came out this year. It's still in beta, as it states, and it could become extremely competent and capable of creating advanced code at the same professional level as a human coder. AI could definitely achieve what humans can.
@@Cikus26 lol I've been hearing that forever. We were supposed to have full self-driving cars by now, and what exists is still very buggy, still requires a human driver to take over, or only works in specific locations. And driving is a relatively simple task; even a low-IQ human can do it. People see AI write some simple code snippets in a vacuum and think that makes it able to replace a programmer. When it can understand and add to an existing large codebase, or debug errors, then I might be worried. But that's several orders of magnitude more complex.
We had a discussion at uni, and the short answer is no lol. When talking with the clients of a certain domain for which you want the app to be made, most of the time the clients don't even know what they want and can't express it. So this would be a no from me, yet.
They can ask ChatGPT to write a technical document based on their "great ideas", and you'll get tricked into seriously building stuff they will change their mind about in a month... at least you could spot this shit and watch YouTube at work, waiting for their "final" vision, which won't be final.
Yeah, but a lot of less technical people can now do that, so competition is way higher. The competitive advantage of a 'professional' programmer is way lower, with a way lower salary. But I think there is a similar or even higher need for technical people; it's just not programming/dev. It feels like many people are in self-denial now, not just devs. The tech can easily be tweaked to fix any weakness it's not addressing at the moment. Not to mention this is just the first iteration.
@@fintech1378 For sure. But not everything is coding; much of it is problem solving on never-before-seen data and connecting dots from different industries and fields. Remember that the NN needs a large enough training set, and not everything can be expressed as simply as a training set for the NN. Sure, some easy problems will be handled, and it may aid programmers in autocompletion tasks, but what about problems no one has solved yet?
It will certainly change how we view programming as a profession. It might push some "coders" into becoming full-fledged developers or engineers fitting pieces together: high-level thinking. The ones who can use these tools effectively are the ones who will be lucky. It will probably never work for giving you solutions to low-level programming problems. There will always be developers making sure the code fits together according to some design. ChatGPT cannot handle the evolution of software applications.
1:18 - ChatGPT doesn't "know how to program"; it has access to a vast amount of programming examples. It is excellent for "boilerplate" code - you know, set up a listen socket with appropriate error-handling code, traverse a tree, and such.
Indeed. I don't know how to do socket programming with C at all. With ChatGPT I might just type in the exact same question and have a good 'starter', then learn the code backwards and make it 'mine' along the way.
In my opinion, in 10-15 years it could replace mediocre programmers; we will have to specialize more and more, a bit like it has always been with the advance of technology. I think this is a good thing.
@@ithinkthereforeitalk935 Elaborate how, and bring a real example? A mediocre Android dev will build me a basic Android e-commerce app (maybe using Kotlin for the actual app and connecting it to Firebase for the backend). Now ask ChatGPT to make you an e-commerce Android app: it will say you need a programmer, and already an Android developer (in this specific example). If you've started to break things down, then you are already within the field of development, and in this example you are an Android developer, because you know what to do, how to do it, and what to do with the answer (the code).
@@ko-Daegu You just need to watch more videos about that AI's capabilities. It's constantly learning new things from people and keeps improving as we speak, getting better and better with every passing day. The genie is out of the bottle and now there's no stopping it. It already got a perfect score in a Google interview for software engineers. And it definitely can build cool apps (literally in minutes) with a little help from devs. So soon companies will be able to replace an army of devs (including thousands of aspiring junior developers) with a handful of skilled programmers whose job is just telling the AI what to do next. The scariest thing is that it's not even the best AI out there; other companies have their own toys. ChatGPT on GPT-3 is just the first harbinger of the changes to come in the field of software development.
That's a free model for beta testing and general use. There are already expensive, specialized, and advanced AI for enterprises cutting the costs of hiring more people.
Thing is, we have been actively trying to automate our own jobs, but in the process there are more jobs that require more people to adjust to these changes. A framework exists to automate someone else's job, but then next thing you know it creates more issues, like a never-ending scheme.
Until AI can actually think and process information the way we humans do, things like ChatGPT seem more like a slightly fancier search engine or Stackoverflow helper to me. You have to know how to ask it the right kinds of code questions to get a useful answer. Until tools like ChatGPT can understand and translate customer requirements into working computer code, I don’t see how it can replace human software engineers. Being a successful software engineer requires more than coding skills, it requires tons of analytical and soft skills to be able to ultimately deliver what the customer is looking for.
ChatGPT won't be able to, since they won't update it to that point. ChatGPT runs on GPT-3, which is years old now; by the time GPT-4 or a newer Codex is released, it will already be far better than most programmers.
It already does. Integrate it with your issue tracking and build system, and it works better than an average programmer. The only limitation is artificial - it can't modify large projects.
It actually can turn requirements into code. It's not artificial "intelligence" just yet, but it is very useful for what it is, and it's no different from a search engine in that you also have to know how to use it. If you don't know how to properly use a search engine, say Google, you won't get the results you want as easily. Also, what kind of psychopath would want a sentient program? Wtf is wrong with you? Skynet, is that you?
Even if ChatGPT is not competent enough to code anything a senior programmer can, the question is whether it will improve to that level soon. My main argument against GPT replacing software developers soon is that software development is more than just code. It requires a lot of human interaction and soft skills.
Very few people are saying chatGPT will replace anyone. The point is AI progress has exploded over the past few years and pretty much came out of nowhere. In 2017 the most impressive thing AI had ever done was beat a human at Go. In 2019, GPT-2 released which was basically a funny "look at how dumb the AI is" tool. Now 3 years later we have chatGPT. A tool you can have a discussion with on pretty much any topic as long as you don't go too in-depth, and which produces (mostly) accurate, and entirely readable responses to questions, as well as simple code. It has rudimentary problem solving abilities, and can perform creative tasks such as answering questions in the form of a poem, or in a humorous style, designing the outline for a game, or writing parodies of popular songs. There is no way to know how far this will progress in the next year (unless maybe if you work at the companies making them), let alone 3, let alone 10. It's coming for people's jobs and the only ones it won't mostly automate aren't jobs that we can list now - it will just be whoever gets luckiest (or unluckiest as the case may be when the lack of available jobs requires some form of UBI to be implemented). This isn't just a case of your job being mostly automated. This is a case where the potential for societal change on a global scale over just the next decade or two could be unimaginable to us now.
GPT and the underlying AI technology were introduced in mid-2017, and in just 5 years they reached this point, growing non-stop so far. People are being so naive about this. WE HAVE SEEN NOTHING YET.
OK, five years ago nobody thought ChatGPT was possible. Now we have ChatGPT. What things will look like five years from now is hard to imagine, but there is a chance that ChatGPT will evolve and become powerful enough to replace junior and mid-level programmers.
Forty years ago I spent hours (months) "training" an OCR (scanner) to create its vocabulary to spell-check a document. Today, a scanned receipt can complete a tax form.
Thank you for this video. I've been learning to code for almost a year and plan on going to college soon. The fear-mongering over this has been insane. I think most people do it for the algorithm views, and it's nice to see an honest YouTuber.
@@LowLevelTV Not everyone has the same talents and abilities, and when the only ability a worker owns is replaced by a computer, there will be a lot of injustice, and socialism will rise from the dust of the crumbling world of capitalism.
Which is what a non-programmer will ask: it will ask it to make me an Android app that does X, and it will tell you that you need a programmer in Java or Kotlin who can make the app and test it. This is what it told me. A programmer will ask about writing a specific function, and will know where to put it and whether it's correct or not.
AI is not special at all. It's just a tool for saving data and retrieving it back accurately. AI has no creativity; it's completely stupid, but only real developers can understand that.
ML is a tool for approximating any measurable function given training data. I don't believe there's anything magical about our creativity. Instead, the combination of our experiences, knowledge, and brain structure is the foundation of our "creativity". All of these are possible in ML.
If I'm going to be honest, from playing around with this myself: while it wouldn't replace programmers, it seems pretty good at assisting with basic code. Even if there is a mistake, a programmer could notice and fix it, which could take less time than writing out the code themselves.
I think snippets and common code generation are safer and more useful in this regard. While it might be tempting to shove the boring parts of coding onto the AI, you pay with potentially hard-to-spot bugs stemming from syntax or libraries you aren't really that familiar with yourself.
Of course it won't replace devs in its current version, but it might get exponentially better in the near future (other AI companies might come up with similar products as well, and it might become pervasive everywhere at the infrastructure layer), and then it may replace the majority of programmers. Even though it would take longer to replace senior devs, the dev job might get 'commoditized' and salaries pushed significantly lower, with much higher competition/supply than demand. It doesn't need to 'completely replace' the dev job to make the career prospects of a programmer much grimmer and less attractive financially.
@@hiphopheaven Everything is biased. Not giving the right answer, even only from time to time, means it can't be used in critical applications unsupervised. In summary: another tool for the playgrounds.
Once you go through a few professional projects, you'll get the perspective required to evaluate ChatGPT. Practically everyone praising it as some sort of programmer replacement is either extremely junior or has zero programming experience. Programming and coding aren't the same thing, for starters; neural-net simulators have no concept of coherent logic or aware reflection like a human does. If it improves by a few orders of magnitude, it could become a powerful tool for a programmer, but as it is, it can't really be trusted to produce more than a mediocre and insidiously wrong Stack Exchange substitute. This isn't an AI to begin with but an association engine, and comparatively little research has been done on the fronts required for building an actual AI, where the intelligence part isn't glossed over to grab headlines and quick cash. The reason such an AI can make a simulacrum of a photo is that that is an association problem, where it can pull data others generated to construct close-enough replicas; it will not be capable of intelligent design. It might replace the simple boilerplate tasks programmers do in a few years, though.
@@ZuwyFPS I don't think ChatGPT on GPT-3 will be replacing anyone but the least invested developers, and not directly, only tangentially, if it ever becomes reliable enough for intermediate-to-advanced devs to increase their productivity beyond the need for front-end devs. But also take into consideration that I'm a game developer myself and don't have much experience with front-end dev, but rather game and graphics programming, where the results of ChatGPT in testing in its current form proved rather underwhelming; I actually had high hopes for it initially, but alas no. Whether the next big association-engine project will be vastly better and replace some devs is another question that nobody can truly answer yet.
I took on the challenge of writing a simple graphics engine using only ChatGPT. It coded for me: loading meshes, skinned meshes, models (kind of wrong here), shaders (Phong), the painter's algorithm, materials, AABB trees, GJK, primitive collisions in 3D, torque, momentum, acceleration, etc. All I did was assemble it (it took me a lot of time to make everything work properly), fix some classes, ask again and again for rewrites, or formulate a new question so it could give me the proper result. It worked fine, but it took me about the same time as if I had Googled all of it.
Also, I tried to code the fundamental matrix and essential matrix for epipolar geometry with it. I couldn't make it work properly (mixing it with my own code to record from my cameras), but I think it gave me almost all the correct steps. So anyway, I think I would use ChatGPT in cases where Google fails me.
ChatGPT itself is not the beginning. But when Microsoft said they want to integrate it into Bing, making much bigger tech companies feel threatened, that was the beginning of the problems. In current news, Google and Apple are trying to build something that could "compete" with OpenAI, and it doesn't feel very good.
I personally don't like full code-writing software, since it's pretty invasive. It writes code that doesn't follow my code-style principles and introduces bugs I might never find, since it uses functions and syntax I might not be 100 percent familiar with. I think great IntelliSense with a more advanced code-snippet library might be the peak just before writing code directly from your ideas with brain-computer interfaces.
The argument still exists: is coding art or science. Art can't be disproved, science can. Further, science improves itself, where art is one-of-a-kind. Which airplane do you want to be in?
@@bob-ny6kn Programming is an art, but not in the form of painting; rather in the form of balancing objects on each other. The difference is that you can't argue a painting is drawn wrong, but in the art of balancing things there are right ways to balance objects (standing) and wrong ways (falling apart). Programming is similar. There is not one right solution; you can balance a set of objects on each other correctly in different ways. But there are wrong solutions to a programming problem, aka bugs. If your program sometimes does not work correctly, it's objectively wrong, without any way of arguing against that. To wrap back around to the topic: AI is known to write notoriously difficult-to-find bugs.
People have been overhyping it so much. Even the essays it generates are at the level of a high schooler, and in such a cookie-cutter format too. It can be useful for an initial start or for research, but yeah, like you said, no jobs threatened.
I don't think it would replace programmers even if we released something that always produced correct results without any bugs and could handle limitless complexity, for the simple fact that in 99% of real problems, describing the problem is often more tedious than just sitting down and programming a solution (this is even the case for non-technical folks; nobody likes describing problems). The problems we deal with are just too complex, and it's often easier to write something down, test it, think about it, and tweak it than to code the perfect solution right on the first try.
It's a whole lot easier to ask something to code for you than to do it yourself. You can just make a list of requirements and have it create the thing for you. For non-coders it's far easier, because they can't code.
@@switchdeck9164 I can't give you a link because YouTube comments with links in them "disappear mysteriously" ;) but if you search for "commit strip a comprehensive and precise spec", there's a very valid comment on what you're saying.
I wouldn't be surprised if you could've told it about the mistake it made and it might've been able to fix it. It can retain conversation in the same session, so it should be able to piece together where things apply with enough information.
It can do that in most cases for sure, but you have to be aware of the problem in the first place, and many bugs are not obvious just from reading code, especially if you are not an experienced programmer
@@ciandoyle3076 Yeah, I noticed it doesn't do super well with semantic errors (fine as far as the computer is concerned, but behaving incorrectly). I'd get into this annoying endless loop of troubleshooting, and the only thing that stops it is troubleshooting the old-fashioned way, where you know what you're doing in the code, even coming up with an alternative, which you often need to know what you're doing to devise! It would correct some mistakes, and in correcting others it would re-introduce the original mistakes it made. Mileage definitely varies. For a lot of stuff I tried it with, I would've been better off doing most of the coding myself with maybe some occasional help here and there, rather than help with everything as a whole, since that often prolonged things, especially dealing with the character limit.
I am happy that you guys think your jobs are safe. Here are some further insights from the bot itself:

Here are some potential risks that could be caused by the integration of AI in a capitalistic democracy:

- Job displacement: As AI-powered automation becomes more advanced, it could displace large numbers of workers across various industries, leading to widespread unemployment and economic inequality.
- Income inequality: If AI primarily benefits large corporations and the wealthy, it could exacerbate existing income inequality and lead to a further concentration of wealth and power among a small elite.
- Political manipulation: AI algorithms and social media bots could be used to manipulate public opinion and influence elections, undermining the democratic process.
- Discrimination: AI systems that make decisions based on biased data sets, or are trained using biased algorithms, can perpetuate or even amplify existing discrimination against marginalized groups.
- Lack of accountability: If AI systems make decisions that harm individuals or society, it may be difficult to hold any particular person or organization accountable for those actions.
- Loss of privacy: AI systems that collect and analyze large amounts of personal data could be used to invade people's privacy and monitor their activities.
- Dependence: AI-enabled automation could make society too dependent on technology, which may become a single point of failure for crucial infrastructure such as transportation, communication, and even financial systems.
- Weaponization: Advances in AI technology could also make it possible to develop autonomous weapons that select and engage targets without human intervention, raising the risk of accidental or unintended military engagements.

These are just a few examples, but there are many other potential risks associated with the integration of AI in a capitalistic democracy.
It's important to consider these risks and take steps to mitigate them as AI becomes increasingly prevalent in our society.
The question isn't whether it will replace all programmers, but which programmers. It will definitely cost some people their jobs, and very soon. It's hard to tell, but I suspect web and front-end devs will be the first to go. Combining AI-generated graphic design with its ability to put together programs from plain English makes that obvious. But I think the people who should be most worried are blissfully unaware. Here are some examples: teachers, accountants, office workers, data entry, administrators, planners, authors, assistants, sales reps, tech and other support, etc. The list goes on.
Front-end can be extremely complex depending on what's asked; sometimes it's harder than the backend work I have to do. Try making any kind of complex animation or effect and get back to me... try a bouncing ball on the screen, see how you go.
@@FilterChain Not necessarily hard for AI eventually. It can put together sprites no problem, and the markup coding stuff can totally be obsoleted. It might not be immediate, but frankly it's much easier to replace any sort of computer data-entry work with an AI than, for example, police or firefighters or chefs. Any software engineer has to be honest about that.
I was fearful of being a dev 10 years ago because of Wix etc., and guess what, I made the best choice... I think it's easy to be fearful about job security, but skills will give you the cutting edge no matter what.
No need to worry, everyone. Once AI can go after the creatives (of which programming is one, in a way), the only jobs left for humans will be menial tasks and hard labour, both very low-paying. The middle class will all but disappear, and we will only have the insanely rich who own the AI and everybody else, who can barely afford to feed themselves. Then the global economy, capitalism itself, and the modern way of life will collapse, because nobody has the money to consume anymore. That could happen; or we keep our jobs for the most part; or we fundamentally replace our economic system with something that isn't capitalism. We'll be fine.
All for the sake of "progress", baby. It's crazy how society has been skating the fine line between a utopia and a dystopia, but now I think the picture's much clearer.
But it can easily replace teachers and Stack Overflow, especially for beginners. It's great for beginners who are into programming. I used to go on Stack Overflow and it depended on luck whether somebody had enough time to answer my noob question, and Google only shows websites, not your particular problem. But now ChatGPT is like a personal teacher you can get ideas from: just copy-paste your code and ask a question like you'd ask a teacher. It's so efficient, a life saver.
Ok, so I haven't programmed for years; I have almost forgotten everything. Nevertheless, I decided to do an indie game just for the fun of it, using Unity and programming help from ChatGPT. I've spent literally 12 hours trying to get it to write a proper "find me the nearest object with a set tag, then move another object towards it". Its code, although it always compiles, doesn't do what I'm asking it to do, and it's quite frustrating. Whenever you tell it to alter something in a function, it breaks something else. No, it won't replace programmers yet. Not by a long shot.
One thing you can do: when it does something wrong or not to your liking, because it's an AI and it learns, you can tell it what's wrong and it'll redo it for you. And probably in the future it will become more and more informed, with fewer errors.
The problem is that students increase their programming ability by overcoming hurdles for themselves. Seeing someone else overcome the hurdle (including AI tools) yields far less of a gain. It is a bit like going to the gym and watching someone else pump iron: you may learn a bit about how to use the equipment, but it won't make you any stronger. If GPT can solve the educational programming exercises students do to learn how to program, how are they going to learn to spot when it gives them bad code? It is a great tool for expert programmers, not for most students.
I disagree. ChatGPT is going to be to learning programming what WolframAlpha was to learning math. WolframAlpha's step-by-step solutions are a great learning aid; sure, it can be used to cheat, but it's also like having a TA with you at all times. Just because some lazy students would abuse it doesn't mean students who actually want to learn wouldn't use it to improve their studies.
@@dunmermage Having a TA with you at all times would be a bad thing. Having too much help makes it more difficult to develop skills, especially problem solving (programming is more than coding). It is like having a spotter that holds 50% of the weight of the bar all the time. It may look like you can bench press 100kg - but in reality you can't. I've been teaching programming since 1994 and I have seen how the profusion of overly-helpful online resources has made things more difficult for a lot of students. I'm not saying the tools are a bad thing, just that they are best avoided by students *until* they have understood the fundamentals.
It won't replace anyone. Companies hire junior developers knowingly taking a hit to productivity because of their potential to become good developers. It could be used as another tool for assisting developers, but the idea that it will replace them is just outright dumb.
It would significantly speed up development, which means much larger projects are now possible for large companies, and small companies can punch above their weight class. It is a tool that needs to be driven by an expert, but I still think at least half of all coders will be redundant, though they will probably change to a more AI-oriented career.
I agree with this. I used to think AI would make it so jobs would just be gone, but people still have to know how the AI works. I know someone who works somewhere with machines; the machines are doing things that people don't have to do anymore, but there's maintenance that checks up on the machines to ensure they're still working. It's like moving up a level: the way writing source code abstracts assembly, asking an AI to write code still requires you to word things correctly, just like how you need to talk in certain ways with people to get info. Some people might understand biology better than mechanics, and some might understand electronics better than metallurgy.

Something you can try is telling it certain things. ChatGPT works by tokenizing words and using probability to guess what words would come next. There are videos that talk about a pre-prompt, basically a prompt that ChatGPT starts with that's hidden from the user, and there are videos talking about ways to get it to translate things, which it can do. There are some Computerphile videos about ChatGPT and neural networks; if you're interested, you can give them a watch. I asked ChatGPT how it worked and it mentioned tokenizing and subwords. I looked up a video about tokenizing and subwords, and it turns out that is indeed a thing. We just have to fact-check ChatGPT, just like we need to fact-check our fellow humans every now and then.

ChatGPT also introduced me to piezoelectricity, of which quartz is one example: compressing it enough can start an electrical signal. I found that interesting. A thing that helps me: if ChatGPT answers a question in a way I don't understand, I say "So it's like..." and then attempt an analogy of how I think it works. ChatGPT will either say I'm correct or try to correct me.
I remember at one point I was worried about asking leading questions by accident, so I asked it to tell me if I asked a leading question and to give examples of how to phrase the question so it's not leading. It's interesting to try to think like it and guess how it might respond to a thing you say, like how before saying something to a person, you might think the sentence, guess how the person would respond, then say the sentence and see if your guess is correct. That's how I tend to talk: I think of the sentence before saying it aloud; that way I get the whole sentence out instead of putting in longer pauses while I think of the next word. I wait for a pause after what I think is another person's complete sentence, only for them to say I interrupted them; I let them finish, then try to say my sentence. Because of accidental interruptions from not waiting long enough, I have learned to think the full sentence before saying it so that long pauses are avoided. I don't want to cause other people the confusion others cause me. This has drifted significantly from the original point; maybe I'll copy this comment into ChatGPT and see how it responds. I'm curious what you think of this. I'm sorry for the wall of text; if you have formatting ideas, please respond and I'll edit this comment accordingly.
I find it very helpful as a teacher and tool. I've been using it to learn Python, and even to create a game, a fun little program, generate exercises for me, etc. I've noticed it is possible to use it to write some solid code without understanding all of it. It's not perfect, no one is claiming that, but it sure makes for a good teacher and a helpful tool for many different tasks, even besides coding. Like recipes, helping with writing, creating statements, steelmanning arguments, brainstorming, philosophising, even psychological assistance. I'd like to add music to that, but its musical abilities seem barely existent and you have to jump through quite a few hoops to get any help with actually writing music. Lyrics work great though, and poetry is decent, though it's got issues with rhyming. Sometimes it might not properly understand what you're asking and just keep giving you the same few variations of the wrong answer. All in all, it is versatile, interesting, very helpful, enjoyable, and undeniably a step forward for all to see.
ChatGPT is not that good at this stage. I asked it a math question that went like: "If a cube is divided into 27 smaller cubes of dimension 1 unit each, what is the volume of a sphere that could be put into it touching the center of all its faces?" Couldn't believe it was not able to solve its own question!
The idea that AI will replace experienced programmers is bogus. Sure AI can be used to automatically solve complex problems such as programming, but how would an inexperienced programmer who only knows how to ask an AI for an instant solution know if what they receive is optimal? If the 'programmer' has no idea how the code even works, how would they even know it could have been written differently such that it used less memory or ran faster? Should we just ask it to "Make the whole program you wrote for me run faster and use less RAM, please. Because I don't even know which part is unoptimized lol." You don't even know if what it's giving you is the best solution, unless you already have a meticulously optimized, human-written version to compare it to, anyway. As with all inventions of hyper-convenience, it will be abused to the point that the people who use it become totally dependent on it and unable to function without it, and actual experienced programmers will still be needed to ensure their do-my-work-for-me-bot keeps on working.
I have tried using it to help me with my C++ code and I was not impressed either. I want to be clear, I don't think it's "not as good as people say" or simply "overhyped", I genuinely think it lowers my performance. When I tried to have it write simple functions, it was wrong over 50% of the time. Maybe it's useful if all the programming you do is just copy pasting from StackOverflow but if you know the language and are a fast typer, you can easily be more productive without it.
Most jobs aren't about typing fast. They're about understanding a business and its customers' needs to drive sales. That is something that AI can figure out better than humans. Your use of it is just training it to replace you. Forget even tech debt (maintenance jobs requiring years of domain expertise); AI can clean that up and maintain the code by itself. The trajectory is set to a dystopian future. Don't feed the beast. Boycott it.
@@mecanuktutorials6476 In theory all those things are possible, but from my experience current AI really isn't there. ChatGPT doesn't even understand the context of one big class, let alone a multi-million-line project.
@@MsJavaWolf it’ll get there. It needs to understand the language compiler rules but I don’t see this as particularly challenging since the input is all there. Feed it the source code and it can summarize it for you even today and make minor touch ups too. It’s just a matter of who is dumb enough to automate themselves out of a job. There’s several other tools that refactor already without changing logic. Hooking those up to ChatGPT and getting it to run regression tests would actually be devastating on its own. Small “improvement” will cause massive disruption to programming jobs. But it’s not just theoretical, it’s at the doorsteps. Because of how expensive devs are, there is a huge demand for this.
Idk if that's relieving for some, but I once asked ChatGPT to write a simple drawRect(uint x, uint y, uint width, uint height) function in OpenGL 4, and the bot got it wrong even after 4 responses or so. First, it didn't convert the coordinates to the [-1, 1] range used in OpenGL. Then, over three attempts, it only used deprecated functions that were removed in 3.3. I guess that's probably because most sources available on the Internet are for 3.0 or below.
I've personally been using this quite a bit as a learning tool for programming. I don't ask it to straight up write something that works; instead, I ask it for an explanation of how to write something, and maybe ask it for an example. So far I have learned what a breadth-first search algorithm is, how it works, and how to make one myself using multiple different approaches, including both recursive and non-recursive functions. Yes, it makes errors when writing code. Sometimes it even gets confused about what language you initially started asking questions about (if there was one). Sometimes it gets some of the features and limitations of that language completely wrong. But these errors only reinforce the need to read the code it spits out and understand it, instead of blindly copying it and using it in your own application.
I used it today to implement a feature from a third-party library. It gave a code example with explanations that seemed correct, but I was having trouble seeing the expected outcome in the browser. I had some doubts when I didn't see the documentation resemble the examples it created. Then I began to interrogate it by asking, "Are you sure this is correct? The documentation does not show that 'context' is a required prop." It replied, starting with "You are correct, 'context' is not a prop... sorry for the confusion", and spat out an amended version of the code. And again it didn't work. This continued a few more times until I realized that it was making up random props that did not exist. I would not put blind faith in anything it outputs, whether it's code or facts. We're gonna start saying soon: "Don't believe everything you hear from AI."
It's good enough for making regular expressions. It's a tool to save time, and it just keeps getting better. Real programming requires creativity, so it's really not a replacement for creative coding.
I sure hope you are right, but how exactly do you pull out your "at least a decade" estimate? Surely, given even GPT-3 and Copilot, ChatGPT would have been difficult to anticipate even a year ago...
Not really. While it is a step up from Copilot in terms of making a whole program from scratch and adding comments, it is a step downwards in every other way, especially in making a program that runs.
@@tjgdddfcn To me that's not relevant, because my code usually doesn't run as intended on the very first try, I usually have to make small adjustments, which seems comparable. Whats impressive, is that for well enough documented topics, its able to parse and summarise a body of knowledge, with not perfect, but scarily high accuracy. The fact that you can ask it something like "can you give me a geometric interpretation of complex conjugation" and it can retrieve that from its knowledge representation... It is even able to relate similar concepts and compose different concepts, hopefully this stays in a mostly guided fashion...
@@johanngambolputty5351 Thing is those errors most of the time come from the use of functions that dont exist of using a type to do a thing that that type cannot, those types of errors tend also result in errors in the whole structure of the program or atleast a part of it. While yes it is good at explaining topics, it only is good at explaining very documented topics. Also, the way it writes in plain english without grammar errors is nothing to be scared of also, we had that technology for like 7 years already
@@tjgdddfcn I am maybe at ease that it's not a threat to programming right now, that it's probably just a hyper-Google for the time being, and actually pretty useful. I wouldn't go as far as saying it can "explain" things itself, but it certainly seems able to retrieve explanations it's seen. This seems to imply an ability to combine multiple explanations in a consistent way, and no past method really gave me that impression, not as convincingly (even if it's still just an illusion). But conceptually, if we accept the premise that it can do some impressive stuff with sufficient guidance, and if it were allowed to learn along the way, then it's potentially only an automated guidance system away from general intelligence? If it can lead itself to its own conclusions and consistency-check them, suddenly it's doing much worse than just stealing programming jobs... Though the guidance system still needs an objective, and training a reward model for that would probably be much harder. And it would probably still have a fairly limited view of the world through a text-only datastream.
@@johanngambolputty5351 It's not about how well documented it is, but how many examples it can put the answer together from. Because it actually has no understanding of the programming logic.
It's Excel 2.0. It buffs the baseline of productivity and brings together data for easy analysis, but doesn't actually do anything in its own right. It's not intelligence at all; it's mimicry. Unfortunately it seems to have fooled a lot of people already who only pay attention to the surface of things. Prediction: it will create a spiral of mediocrity, feeding on itself as people over-rely on it. It will be both Excel 2.0 and an Icarus 2.0, crashing the hype train of AI more than a robot hand breaking a child chess player's finger.
I like how you go "Don't freak out this AI isn't that good", and then go on to say we'll be replaced in 10 years which is a sooner estimate than most people give. I still don't really feel better as someone just starting CS.
Finally, someone who talks sense about this. You know, it's one thing to generate code to get the umpteenth Fibonacci number, but complex coding involving thousands of lines of properly tested code, with its intricacies, using tools, libraries and frameworks judiciously: that's a ballpark ChatGPT isn't going to be near for a long time, if ever. Never say never, I guess, but if you're a 20-year-old newbie programmer, your job is safe until your retirement.
It's as you said: it's the small things that it can't do. Let's say you work at a company that has a code base that has been built up over the last 5 years. There is no way a normal person is going to know where to put the code or how to link it to other files; it won't know what communicates with what. There is also no way a company is going to trust an AI to build such a large app without a few developers making sure everything works and is fully optimized.
I imagine this could be the basis for a framework that would let us work in sync, from anywhere in the world, without knowing each other. I ask a question; ChatGPT finds a bunch of people in the related field and translates it to them; one finds a solution; then it sends it to a group to verify it. We could repeat these steps a few times, which should also help filter identical ideas. Then it gives me back the answer. Later, if someone else asks the same question, it should give a summary of the process: do this, don't do that, and why...
Hi, I've been a programmer for 15+ years. I used ChatGPT and I was able to get it to write EVERYTHING perfectly. It scared me, because now I feel like I have to plan ahead, since this kind of coding will be replaced. This is only the beginning; it's going to advance. Everyone is just trying to not face the reality and play it down, but I'm worried. I do believe we are done, slowly, within 5 years. We'll see what doors it opens up though; maybe it makes us way more powerful, able to create WAY BETTER applications. Why not, right? No one has a crystal ball for the future, but with ChatGPT I got it to write everything, and really fast.
Btw, I think they are bottlenecking and throttling its capabilities, because now I'm noticing it says it can't do stuff anymore that I got it to do before. This is not by chance.
I've had it give me straight-up incorrect information before. I asked it the correct answer to a test question about multiplying things in C, where it was given a function to find the y value of a y = mx + b function, with two int pointers and an int as inputs that had to be multiplied together. It told me the correct answer was pointer1 * pointer2 + b instead of *pointer1 * *pointer2 + b, stating that C auto-dereferences when multiplying, even though I tried it out in C and that did not work, and the answer key also said it was wrong. I tried to tell it it was wrong, but it kept insisting that I was the one who was wrong and there must be something else that was the issue, even though the answer it picked just wasn't right.
Question: Does it also support the reverse: Code -> human language, i.e. you input an algorithm in some programming language and it tells you in high-level terms what the algorithm does?
Programming languages are made for humans; an AI doesn't need them. So I'd say the future isn't an AI generating code for humans, but the AI itself solving problems instead of humans producing code.
Agree, this is a new tool that will make programming easier. Maybe it will grow into some new high-level language. No-code programming has been developing for some time already.
Like you said, FYI, if there is a bug in the code, you can tell ChatGPT about the bug, and it will apologize and usually try its best to fix the code for you. Sometimes it works, sometimes not. Still far from replacing human programmers, but a very useful tool for programmers...
Embedded systems and low-level development will never be automated away. As for web development, there are already Wix and Honeycode for automation or codeless development. You can't do that for ES or LLD.
I'm a lawyer and a programmer. I used ChatGPT to make a legal document for my country (Colombia), and the output was a basic format document. I know that kind of structure, and without a doubt I presume that somehow ChatGPT got that information from a website. I mean, ChatGPT doesn't understand the concept in a deep way, but it knows how to find information about the concept. That said, when I ask for code, this is why ChatGPT never answers me about the specifics: because, again, it doesn't understand the concept, but it's great at looking up the response.
I saw someone observe that with ChatGPT's current level of competence, it's less like having one really good programmer and more like having a bunch of mediocre ones. I could see it becoming a force multiplier for capable devs that would reduce the opportunities for junior devs, but it looks like it'll be a while before it can really do everything.
it can give you 2-3 months worth of coffee drinking and playing MK with your boys in 5 minutes...
i would not manage my wife if i had to work hard instead of MK tournaments
One of the big problems I see with it is that it's extremely confident in its answers. And when it's wrong, which does happen a fair bit, it's extremely, confidently wrong.
@@TheArrowedKnee Agreed, although plenty of people do this as well.
Even mediocre programmers can generally test and debug code. ChatGPT in its current state can't even run code on proprietary or development hardware or emulate it. It just writes mediocre code, which is the least demanding part of programming. It must be fun to debug. A more specialized system is needed if you intend to replace actual programmers.
I don't think it'll even reduce junior dev opportunities. Junior devs don't solve really simple coding questions; they work in already really complex code bases. ChatGPT has no way of knowing about the company's business logic or how their internal libraries work. To do that, they'd have to explicitly train their own model on their own data, and hire an ML engineering team to manage it, which I don't think many businesses could argue would be cost-effective.
We should be fine, i asked it the difference between Kotlin and Java, and it told me Java was a dynamically typed programming language.
It's regurgitating info it found elsewhere, without the experience to know when that information is entirely correct. But that will be improved quickly.
In 5 years basic programming should be so easy that literally anybody could get it…
@@bryte1321 lol no
@@AnnasVirtual Is "hallucinating" the proper word here?
@@bryte1321 nah, programming isn't for everyone
you'd surprised to see the amount of people who have coding jobs yet can't program for shit
The real danger of such tools is that in 20-40 years, we're going to end up with "programmers" who barely understand how anything works around them, doctors who don't know anatomy, and so on. People just don't understand that we have one, only one, heavy burden of our existence we cannot pass on to the machines. It's our duty to learn, to think, and to make educated decisions. The moment we give those up to machines, we're on the road to degradation. But we will do so willingly, because thinking and learning are hard, and our inner animal doesn't like that. Watch the movie "Idiocracy", especially the hospital scene. That is, roughly, our future.
Doesn't this also apply to compilers and interpreted languages?
@@williamdrum9899 There is a big difference between automating routine mindless operations and automating thinking. Various compilers and interpreters serve various needs. Some provide higher levels of abstraction and thus more automation (when the cost of it is acceptable). But even the most abstract languages still require complete understanding of the problem at hand and the way it must be solved. The compiler/interpreter merely spares you the mostly mindless translation of your insights into the machine code of multiple hardware platforms.
The AI can be used to explain everything, every line of code, etc., so I don't think that will be the case.
Read E. M. Forster's 1909 short story "The Machine Stops". It's become a very scary tale in light of the last 10 or so years.
@@Finaggle Thanks for the suggestion! A very fitting story, indeed.
I think of it as an assistance tool.
All of us are already using StackOverflow and copying code from there.
This didn't replace devs either; you'll still need professionals who know how to piece the code together, test it, and evaluate it for maintainability, subsequent bug fixing, and readability.
The AI itself doesn’t know jack-shit about that, and it’ll be a long time until it will.
It’s simply an evolution of knowing how to formulate your questions properly for Googling.
I have two pieces of feedback. Have you tried telling the AI that there's a bug? I did, it fixed it 😃
Also, the thing I really like about ChatGPT is, like you said, that it's a very good tool. I asked it "could I approximate the distance from an eNodeB to a UE using only RSSI and RSRQ values?" It's a technical question, a very hard physics problem. I don't like physics; I'm an embedded software engineer looking to learn a bit about a field. Not only did ChatGPT explain to me why this was impossible, it also actually gave me an implementation in C where I *could* approximate distance. This is a level of assistance that Stack Overflow couldn't offer you, since you don't know what exactly you don't know. ChatGPT does. It's quite amazing technology.
holy crap
"Why isn't it possible? It's just not."
Telling the AI that there's a bug would require a human being able to spot it
@@vanish3408 not necessarily. You can just run into a bug, and describe what broke. This will require some testing but I think it could work
@@vanish3408 No you don't, you just shove in your code and type in the error. Even if your project spans multiple files, it will still detect the problem.
just be careful when you ask it to make a "killer app"
Just maximize the output of our paper clip factory!
That's just dumb, the bot isn't autistic.
I so appreciate your channel as I said many times before. You actually understand the low-level Engineering necessary and you are correct, it is a tool, it can help those accomplish more complicated algorithms.
I agree with your conclusions. Trying to use an AI to create a program would be like trying to translate a novel using Google translate. You'll get close but in the end it'll be a mess, though it might be a lot closer and easier to fix that mess than it would be to translate from scratch.
Agree, but it seems the trend is that AI gets better at an exponential rate; its current capability might be way less than its potential in the future.
@@fintech1378 meaning ChatGPt won’t replace anyone
But the next thing might or get really close to it
How about this: I could get the essence (all the inferences and deductive logic) from all the Sherlock Holmes novels in 3 seconds. Shall we now take your "computer translation" analogy for its argumentative value?
What I meant is that it does something no translator, or even the author himself, could do, even if I paid them.
@@ko-Daegu remember that it is in beta, the potential is endless and it could become extremely capable
I think anyone interested in Technology, should be encouraged to study Electrical Engineering.
boring
@@rongarlnield417 no
computer science better
Computer science mixed to electronics rocks imo
@@trya2l that's the way
I read a comment that said "as long as there's data, there will always be programmers" and that was probably the best take I've read on this topic.
A lot of people severely overestimate just how fast we're gonna go from here. This AI revolution has happened a lot of times already: language (Google Translate), self-driving (still WIP), chess (DeepMind), the game Go, coding (GitHub Copilot). Every time, people worried that this would kill everything and that in a couple of years it would beat out humans. Some of these things are already a decade old with virtually zero improvement. AI technology will plateau, and breaking through that is insanely difficult.
And there have been times when we have underestimated. Like DALL-E 2.
Did you just use chess/go in your list of examples to illustrate AI hitting some kind of plateau?
You do realize people took the same concepts and demonstrated them in starcraft/dota?
You do realize that solving the game was not the actual purpose?
...and that AI completely crushes humans in those games...?
In the short term? I agree. In the long term? Probably not. Technologies often follow the pattern of there being a massive amount of hype and expectation around them while they're in their infancy then the technology fails to meet our expectations within our minuscule attention spans. The thing is though, just because you're not looking doesn't mean there isn't progress being made. So many things we've dismissed have surpassed even the most optimistic person's predictions and have become fundamental to our modern way of life.
Nah. You aren't taking into account industries outside of just tech. The art/entertainment industry workers are freaking out about Stable Diffusion. SD allows people to use copyrighted artwork as a dataset and mimic any artist's style impressively accurately. This AI is already taking jobs like book illustrators, logo designers, and concept artists, jobs that give artists a financial cushion. Big gaming companies and animation studios have already let go of some of their concept artists in favor of AI, an alternative that's cheaper and quicker.
Yeah AI sucks at chess now. Analog is the future.
I see ChatGPT as a place where I can ask more complex questions than with a search engine.
I then pick out things from the response and search them up.
That's now. But wait two more years.
@@scratchy996 How do you know? I've been watching the development of AI since the 70s. There were many times when people said: "Hey, wait another 2 years." Nothing happened. And if you understand the model behind it, you know that there is lots and lots of work ahead. Simply fine-tuning and more money will not help; this requires fundamental advances in technology, and that may take more decades. It is unlikely that someone will come up with a breakthrough technology within 2 years.
@@scratchy996 You sound like 343 Guilty Spark in Halo 3 when Johnson is about to fire the rings... "Just a few more days."
I tried it and am impressed, but... it often lies about properties and NuGet packages that don't exist. For more complex cases, the code it generates is usually wrong in some way and requires an experienced engineer to debug and fix. But the fact that it gives new ideas for solving complex business logic problems made it possible for me to write an application with complex geometry that I wouldn't have been able to do without it.
Using ChatGPT (most of the time) will be like using a calculator for math questions: on one side the problem gets solved, but on the other you won't learn anything from it. To me, ChatGPT looks like a problem solver who solves the wrong problem.
I've been using it for a bit. If you ask it to solve something, it will likely explain the subject for you; if not, you can ask it to explain. I've had fun and gained more knowledge so far. The obvious downside is that it's a bit slow to respond.
I'd have to disagree; it opens programmers up to new data structures and ways of solving problems, and can create new algorithms.
Not only that, but once you've solved some complex problem and fed it into the association engine, you're essentially donating it to the owners of the engine.
Then you’re not using it right or asking the right question
It’s great for learning, incredible even.
BUT it IS overhyped. It won’t replace anybody but it can make a programmers life much easier if there are simple things they need, for example code snippets etc…
It requires a different set of skills to use. Personally, I'm not concerned with being replaced by AI because it is my intention to learn to use these tools so I'm not obsolete. Fun fact, the tool wants to teach you how to use it.
True, not right now. However, I see that everyone keeps forgetting that this thing literally came out this year. It's still in beta, as it states, and it could become extremely competent and capable of creating advanced code with the degree of professionalism a human coder has. AI could definitely achieve what humans can.
@@Cikus26 lol I've been hearing that forever. We were supposed to have full self-driving cars by now, and what exists is still very buggy and still requires a human driver to take over, or only works in specific locations. And driving is a relatively simple task, even a low IQ human can do it.
People see AI write some simple code snippets in a vacuum and think that makes it able to replace a programmer. When it can understand and add to an existing large code base or debug errors, then I might be worried. But that's several orders of magnitude more complex.
How can a tool this good, just a few moments into its life, be overhyped? Are you aware of how insanely fast the core AI technology is growing?
It will replace them once they release GPT-4, for sure.
We had a discussion at the uni, and the short answer is no, lol. When talking with the clients in a given domain for which you want the app to be made, most of the time the clients don't even know what they want and can't express it. So this would be a no from me, yet.
I agree. A big "yet", that will take a long time to catch up.
They can ask ChatGPT to write a technical document based on their "great ideas", and you'll get tricked into seriously building stuff they will change their mind about in a month... at least you could spot this and watch YouTube at work, waiting for their "final" vision, which won't be final.
Yeah, but a lot less technical people can now do that, so competition is way higher. The competitive advantage of a 'professional' programmer is way lower, with a way lower salary. But I think there is a similar or even higher need for technical people; it's just not programming/dev.
Feels like many people are in self-denial now, not just devs. The tech can easily be tweaked to address any weakness it's not handling at the moment. Not to mention this is just the first iteration.
@@fintech1378 For sure. But not everything is in coding; much of it is in problem solving on never-before-seen data and connecting dots from different industries and fields. Remember that the NN needs a large enough training set, and not everything can be expressed simply as a training set for the NN. Sure, some easy problems will be solved, and it may aid programmers in autocompletion tasks, but what about problems no one has solved yet?
can AI be taught how to talk with clients? Do we have such systems somewhere nowadays?
It will certainly change how we view programming as a profession. It might push some "coders" into becoming full-fledged developers or engineers fitting pieces together - high-level thinking. The ones who can use these tools effectively are the ones who will be lucky. It will probably never work for giving you solutions to low-level programming problems. There will always be developers making sure that the code fits together according to some design. ChatGPT cannot handle the evolution of software applications.
1:18 - ChatGPT doesn't "know how to program"; it has access to a vast amount of programming examples. It is excellent for "boiler plate" code - you know, setting up a listen socket with appropriate error handling, traversing a tree, and such.
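For context, here is a minimal sketch of the kind of listen-socket boilerplate that comment is talking about (POSIX sockets in C; the function name and the port-0 convention are just illustrative choices, not anything ChatGPT-specific):

```c
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <netinet/in.h>
#include <sys/socket.h>

/* Create a TCP socket listening on `port` (0 lets the kernel pick one).
 * Returns the listening fd, or -1 after printing the failing call. */
int make_listen_socket(unsigned short port) {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0) { perror("socket"); return -1; }

    int yes = 1;  /* allow quick restarts on the same port */
    if (setsockopt(fd, SOL_SOCKET, SO_REUSEADDR, &yes, sizeof yes) < 0) {
        perror("setsockopt"); close(fd); return -1;
    }

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof addr);
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(port);

    if (bind(fd, (struct sockaddr *)&addr, sizeof addr) < 0) {
        perror("bind"); close(fd); return -1;
    }
    if (listen(fd, SOMAXCONN) < 0) {
        perror("listen"); close(fd); return -1;
    }
    return fd;
}
```

This is exactly the sort of well-trodden pattern that appears thousands of times in public code, which is why a model reproduces it so reliably.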
"ChatGPT is bad because the computer does all the work!"
Literally every computer language more advanced than a hex editor: "First time?"
Indeed. I don't know how to do socket programming with C at all. With ChatGPT I might just type in the exact same question and have a good 'starter', then learn the code backwards and make it 'mine' along the way.
It'll also teach you how to debug other people's code!
In my opinion, in 10-15 years it could replace mediocre programmers; we will have to specialize more and more, a bit like it has always been with the advance of technology. I think this is a good thing.
Come on, it can do it NOW not in 10-15 years.
@@ithinkthereforeitalk935 Right now it can't even do a project I did in high school (Arkanoid in Java), because the length of code it can write is limited.
@@ithinkthereforeitalk935 Elaborate how, and bring a real example?
A mediocre Android dev will build me a basic Android e-com app (maybe using Kotlin for the actual app and connecting it to Firebase for the backend).
Ask ChatGPT now to make you an e-commerce Android app and it will say you need a programmer - an Android developer, in this specific example.
If you start to break things down, then you are already within the field of development, and in this example you are an Android developer, because you know what to do, how to do it, and what to do with the answer (the code).
@@ko-Daegu You just need to watch more videos about that AI's capabilities. It's constantly learning new things from people and keeps improving as we speak. It's getting better and better with every passing day. The genie is out of the bottle and now there's no stopping it.
It already got a perfect score in a Google interview for software engineers. And it definitely can build cool apps (literally in minutes) with a little help from devs.
So soon companies will be able to replace an army of devs (including thousands of aspiring junior developers) with a handful of skilled programmers whose job is just going to be telling the AI what to do next.
The scariest thing is that it's not even the best AI out there; other companies have their own toys. GPT-3 is just the first harbinger of changes to come in the field of software development.
That's a free model for beta testing and general use. There are already expensive, specialized, and advanced AIs for enterprises, cutting the cost of hiring more people.
The thing is, we have been actively trying to automate our own jobs, but in the process there are more jobs that require more people to adjust to these changes. A framework exists to automate someone else’s job, but then the next thing you know, it creates more issues - a never-ending scheme.
Until AI can actually think and process information the way we humans do, things like ChatGPT seem more like a slightly fancier search engine or Stackoverflow helper to me. You have to know how to ask it the right kinds of code questions to get a useful answer.
Until tools like ChatGPT can understand and translate customer requirements into working computer code, I don’t see how it can replace human software engineers. Being a successful software engineer requires more than coding skills, it requires tons of analytical and soft skills to be able to ultimately deliver what the customer is looking for.
ChatGPT won't be able to, since they won't update it to that point. ChatGPT runs on GPT-3, which is years old now; by the time GPT-4 or a newer Codex is released, it will already be far better than most programmers.
It already does. Integrate it with your issue tracking and build system, and it works better than an average programmer. The only limitation is artificial - it can't modify large projects.
It actually can turn requirements into code.
It's not artificial "intelligence" just yet. But it is very useful for what it is and no different from a search engine in that you also have to know how to use them.
If you don't know how to properly use a search engine, let's say google, you won't get the results you want as easily.
Also, what kind of psychopath would want a sentient program? Wtf is wrong with you?
Skynet, is that you?
Even if chatGPT is not competent enough to code anything that a senior programmer can, the question is if it will improve to that level soon.
My main argument against GPT replacing software developers soon is that software development is more than just code.
It requires a lot of human interaction and soft skills.
Very few people are saying chatGPT will replace anyone.
The point is AI progress has exploded over the past few years and pretty much came out of nowhere.
In 2017 the most impressive thing AI had ever done was beat a human at Go. In 2019, GPT-2 released which was basically a funny "look at how dumb the AI is" tool.
Now 3 years later we have chatGPT. A tool you can have a discussion with on pretty much any topic as long as you don't go too in-depth, and which produces (mostly) accurate, and entirely readable responses to questions, as well as simple code. It has rudimentary problem solving abilities, and can perform creative tasks such as answering questions in the form of a poem, or in a humorous style, designing the outline for a game, or writing parodies of popular songs.
There is no way to know how far this will progress in the next year (unless maybe if you work at the companies making them), let alone 3, let alone 10. It's coming for people's jobs and the only ones it won't mostly automate aren't jobs that we can list now - it will just be whoever gets luckiest (or unluckiest as the case may be when the lack of available jobs requires some form of UBI to be implemented). This isn't just a case of your job being mostly automated. This is a case where the potential for societal change on a global scale over just the next decade or two could be unimaginable to us now.
GPT and the new AI technology behind it were presented in mid-2017, and in just 5 years reached this point, with non-stop growth so far. People are being so naive about this. WE HAVE SEEN NOTHING YET.
Biggest plus I see is that it can actually explain stuff. But that's for learning, not for working
OK, five years ago nobody thought ChatGPT was possible. Now we have ChatGPT. What things will look like five years from now is hard to imagine, but there is a chance that ChatGPT will evolve and become powerful enough to replace junior and mid-level programmers.
I'm afraid it will be much, much weirder.
Forty years ago I spent hours (months) "training" an OCR (scanner) to build its vocabulary to spellcheck a document. Today, a scanned receipt can complete a tax form.
My friend literally showed me that ChatGPT did his actual coding job in WordPress. How is that not frightening?
It must have been quite basic; it failed me miserably. If I didn't know what it was doing, I'd have been in trouble.
I think it's a tool to fix bugs and learn with, but it's not a complete professional like software engineers are.
I've noticed it's good for making a basic template, but not for generating finished code.
Thank you for this video. I'm learning to code, almost a year now, and plan on going to college soon. The fear mongering around this has been insane. I think most people do it for the algorithm views, and it's nice to see an honest YouTuber.
Glad it was helpful!
@@LowLevelTV Not everyone has the same talents and abilities, and when the only ability a worker has is replaced by a computer, there will be a lot of injustice, and socialism will rise from the dust of the crumbling world of capitalism.
Sometimes it doesn't even produce code, it just tells you to do it yourself(eg. write an xHCI driver), which is pretty fair if the task is complex.
Which is what a non-programmer will ask.
They will ask it to make me an Android app that does X.
It will tell you that you need a programmer in Java or Kotlin who can make the app and test it.
This is what it told me.
A programmer will ask about writing a specific function and will know where to put it and whether it's correct or not.
AI is not special at all. It's just a tool for saving data and retrieving it back accurately. AI has no creativity; it's completely stupid, but only real developers can understand that.
ML is a tool for mapping any measurable function given training data. I don't believe there's anything magical about our creativity. Instead, the combination of our experiences, knowledge and brain structure forms the foundation of our "creativity". These are all possible in ML.
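To make that "mapping a function from training data" point concrete, here is a minimal sketch (my own illustration with made-up sample data, not how GPT itself is trained): fitting y = w*x + b to samples of an unknown function by gradient descent, the same principle that, scaled up enormously, underlies large language models.

```c
#include <stddef.h>

/* Learn y = w*x + b from (x, y) training pairs by gradient descent
 * on the mean squared error. Given samples of y = 2x + 1, the
 * parameters w and b converge toward 2 and 1. */
void fit_line(const double *xs, const double *ys, size_t n,
              double *w, double *b) {
    *w = 0.0;
    *b = 0.0;
    const double lr = 0.01;            /* learning rate */
    for (int step = 0; step < 20000; step++) {
        double gw = 0.0, gb = 0.0;     /* gradients of the error */
        for (size_t i = 0; i < n; i++) {
            double err = (*w * xs[i] + *b) - ys[i];
            gw += 2.0 * err * xs[i] / n;
            gb += 2.0 * err / n;
        }
        *w -= lr * gw;                 /* step against the gradient */
        *b -= lr * gb;
    }
}
```

Called with xs = {0, 1, 2, 3, 4} and ys = {1, 3, 5, 7, 9}, it recovers w ≈ 2 and b ≈ 1 from the data alone; the "creativity" debate is really about whether everything we do decomposes into mappings like this.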
To be honest, from playing around with this myself, it seems like while it wouldn't replace programmers, it's pretty good at assisting with basic code. Even if there is a mistake, a programmer could notice it and fix it, which could take less time than writing out the code themselves.
I think snippets and common code generation are safer and more useful in this regard. While it might be tempting to shove the boring parts of coding onto the AI, you pay with potentially hard-to-spot bugs that stem from syntax or libraries you yourself aren't really that familiar with.
It's a productivity multiplier like Google, instead of a replacement. At least I hope.
Of course it won't replace devs at its current version, but it might get better exponentially in the near future (other AI companies might come up with similar products as well, and it might become pervasive everywhere at the infrastructure layer), and then it may replace the majority of programmers. Even though it will take longer to replace senior devs, the dev job might get 'commoditized' and salaries can be pushed significantly lower, with much higher competition/supply than demand.
It doesn't need to 'completely replace' the dev job to make the career prospects of a programmer much grimmer/more unattractive financially.
@@fintech1378 this is exactly what I've been telling people
The cool part is that it isn't biased. If you ask "Is Rust better than C?", it will list all the pros and cons of both.
AI not biased? You ain't tested nothing yet!
It's not biased, but it doesn't always give right answers, which is even worse.
@@hiphopheaven Everything is biased. Not giving the right answer, even only from time to time, means it can't be used in critical applications unsupervised. In summary: another tool for the playgrounds.
Thank you, I got pretty worried for a moment, because I'm just getting into the profession and now ChatGPT came out.
Once you go through a few professional projects you'll get the perspective required to evaluate ChatGPT. Practically everyone praising it as some sort of programmer replacement is either extremely junior or has zero programming experience. Programming and coding aren't the same thing, for starters; neural net simulators have no concept of coherent logic or aware reflection like a human does. If it improves by a few orders of magnitude it could become a powerful tool for a programmer, but as it is, it can't really be trusted to produce more than a mediocre and insidiously wrong Stack Exchange substitute.
This isn't an AI to begin with but an association engine, and comparatively little research has been done on the fronts required for building an actual AI, where the intelligence part isn't glossed over to grab headlines and quick cash.
The reason such an AI can make a simulacrum of a photo is that that is an association problem, where it can pull data others generated to construct close-enough replicas; it will not be capable of intelligent design. It might replace the simple boilerplate tasks programmers do in a few years, though.
@@RiversJ Do you think it can replace front-end web developers? I started learning it like 2 weeks before ChatGPT came out.
@@ZuwyFPS I don't think ChatGPT-3 will be replacing anyone but the least invested developers, and not directly but only tangentially - only if it ever becomes reliable enough for intermediate to advanced devs to increase their productivity faster than the need for front-end devs grows. But also take into consideration that I'm a game developer myself and don't have much experience with front-end dev, but rather game and graphics programming, where the results of ChatGPT in its current form proved rather underwhelming in testing; I actually had high hopes for it initially, but alas, no.
Whether the next big AE project will be vastly better and replace some devs is another question that nobody can truly answer yet.
Did you try asking the bot to include reviving a restart in the code?
I feel that most channels that scare us of ChatGPT are just doing it for views and do not really believe what they say.
I took on the challenge of writing a simple graphics engine using only ChatGPT. It coded mesh loading, skinned meshes, models (kind of wrong here), shaders (Phong), the painter's algorithm, materials, AABB trees, GJK, primitive collisions in 3D, torque, momentum, acceleration, etc. All I did was assemble it (it took me a lot of time to make everything work properly), fix some classes, and ask again and again for rewrites, or reformulate the question until it gave the proper result. It worked fine, but it took me about the same time as if I had googled all of it.
I also tried to code the fundamental matrix and essential matrix for epipolar geometry with it. I couldn't make it work properly (mixed with my own code to record from my cameras), but I think it gave me almost all the correct steps. So anyway, I think I would use ChatGPT in cases where Google fails me.
ChatGPT itself is not the beginning. But when Microsoft said they want to integrate it into Bing, making much bigger tech companies feel threatened, that was the beginning of the problems. Current news: Google and Apple are trying to build something that could "compete" with OpenAI, and it doesn't feel very good.
I personally don't like full code-writing software, since it's pretty invasive. It writes code that doesn't follow my code style principles and introduces bugs I might never find, since it uses functions and syntax I might not be 100 percent familiar with. I think a great IntelliSense with a more advanced code-snippet library might be the peak, just before writing code directly from your ideas with brain-computer interfaces.
The old argument still stands: is coding art or science? Art can't be disproved; science can. Further, science improves itself, where art is one-of-a-kind. Which airplane do you want to be in?
@@bob-ny6kn Programming is an art, but not in the form of painting; rather in the form of balancing objects on each other.
The difference is that you can't argue that a painting is drawn wrong. But in the art of balancing things, there are right ways to balance objects (standing) and wrong ways (falling apart). Programming is similar. There is not one right solution - just as you can balance a set of objects on each other correctly in different ways. But there are wrong solutions to a programming problem, a.k.a. bugs. If your program sometimes doesn't work correctly, it's objectively wrong, without any way of arguing against that.
To wrap back around to the topic: AI is known to write notoriously difficult-to-find bugs.
Consider trying codeium
@@konstantink07 I tried Tabnine before and didn't like its completions. But I will definitely try this one. Thanks.
People have been overhyping it so much. Even the essays it generates are on the level of a high schooler's, and in such a cookie-cutter format too. It can be useful for an initial start or research, but, like you said, no job is threatened.
I don't think it would replace programmers even if we released something that always produced correct results without any bugs and could handle limitless complexity, because of the simple fact that in 99% of real problems, describing the problem is often more tedious than just sitting down and programming a solution (this is even the case for non-technical folks; nobody likes describing problems). The problems we deal with are just too complex, and it's often easier to write something down, test it, think about it, and tweak it than to code the perfect solution right on the first try.
It's a whole lot easier to ask something to code for you rather than doing it yourself. You can just make a list of requirements and have it create the thing for you. For non-coders it's far easier, because they can't code.
"describing the problem" IS "just sitting down there and programming a solution". ;)
@@switchdeck9164 I can't give you a link because YouTube comments with links in them "disappear mysteriously" ;) but if you search for "commit strip a comprehensive and precise spec", there's a very valid comment on what you're saying.
Once people find out about Google, no one will hire developers, because all you have to do is search for the code you need and then copy and paste it…
Nice one
Exciting times to hear a tech interviewer asking, "Where do you see yourself in 5 years?".
"A decade away"
That's going to go by reeeeal quick.
Developers will not be replaced by ChatGPT or GPT-3, but I have no doubt that future versions will. It is simply a matter of time.
GPT4 is already kinda released now.
That was not a haiku! Check it again.
you're right! the "AI" can't even handle a haiku lool!
This AI is cool
But for now it's just a tool
Programmers still rule
Fear mongering means views, and views mean money. Don't listen to any YouTuber without trying it out for yourself first.
we're done
no plz
I wouldn't be surprised if you could've told it about the mistake it made and it might've been able to fix it. It can retain a conversation within the same session, so it should be able to piece together where things apply, given enough information.
It can do that in most cases, for sure, but you have to be aware of the problem in the first place, and many bugs are not obvious just from reading the code, especially if you are not an experienced programmer.
@@ciandoyle3076 Yeah, I noticed it doesn't do super well with semantic errors (fine as far as the computer is concerned, but behaving incorrectly). I'd get into this annoying endless loop of troubleshooting, and the only thing that stops it is troubleshooting the old-fashioned way, where you know what you're doing in the code, or even coming up with an alternative - for which you often have to know what you're doing too! A lot of times I would've been better off doing most of the coding myself and getting help on occasional things rather than on everything as a whole. It would correct some mistakes, and in correcting other mistakes, it would re-introduce the original mistakes it made.
Mileage definitely varies. For a lot of stuff I tried it with, I would've been better off doing it myself with maybe some occasional help here and there, since it often prolonged things, especially when dealing with the character limit.
If ChatGPT is the future, at least that would explain why everything is going to sh!t. Because from what I see, ChatGPT is mediocre at best.
I am happy that you guys think your jobs are safe. Here are some further insights from the bot itself:
Here are some potential risks that could be caused by the integration of AI in a capitalistic democracy:
Job displacement: As AI-powered automation becomes more advanced, it could potentially displace large numbers of workers across various industries, leading to widespread unemployment and economic inequality.
Income inequality: If AI primarily benefits large corporations and the wealthy, it could exacerbate existing income inequality and lead to a further concentration of wealth and power among a small elite.
Political manipulation: AI algorithms and social media bots could be used to manipulate public opinion and influence elections, undermining the democratic process.
Discrimination: AI systems that make decisions based on biased data sets or are trained using biased algorithms can perpetuate or even amplify existing discrimination against marginalized groups.
Lack of accountability: If AI systems make decisions that harm individuals or society, it may be difficult to hold any particular person or organization accountable for those actions.
Loss of privacy: AI systems that collect and analyze large amounts of personal data could be used to invade people's privacy and monitor their activities.
Dependence: AI-enabled automation could make society too dependent on technology, which may become a single point of failure for crucial infrastructure such as transportation, communication, and even financial systems.
Weaponization: Advancements in AI technology could also make it possible to develop autonomous weapons that could select and engage targets without human intervention, raising the risk of accidental or unintended military engagements.
These are just a few examples, but there are many other potential risks associated with the integration of AI in a capitalistic democracy. It's important to consider these risks and take steps to mitigate them as AI becomes increasingly prevalent in our society.
The question isn't whether it will replace all programmers, but which programmers. It will definitely cost some people their jobs, and very soon. It's hard to tell, but I suspect web and front-end devs will be the first to go.
Combining AI-generated graphic design with its ability to put together programs from plain English makes that obvious.
But I think the people who should be most worried are blissfully unaware. Here are some examples: teachers, accountants, office workers, data entry, administrators, planners, authors, assistants, sales reps, tech and other support, etc.; the list goes on.
Front-end can be extremely complex depending on what's asked; sometimes it's harder than the backend work I have to do. Try making any kind of complex animation or effect and get back to me... try a bouncing ball on the screen, see how you go.
@@FilterChain Not necessarily hard for AI eventually. It can put together sprites no problem. The markup coding stuff can totally be obsoleted. It might not be immediate, but frankly it’s much easier to replace any sort of computer data-entry work with an AI than, for example, police or firefighters or chefs. Any software engineer has to be honest about that.
I was fearful of being a dev 10 years ago because of Wix etc., and guess what, I made the best choice... I think it's easy to be fearful about job security, but skills will give you the cutting edge no matter what.
No need to worry, everyone. Once AI can go after the creatives (of which programming is one, in a way), the only jobs left for humans will be menial tasks and hard labour, both very low-paying. The middle class will all but disappear, and we will only have the insanely rich who own the AI, and everybody else, who can barely afford to feed themselves. Then the global economy, capitalism itself, and the modern way of life will collapse, because nobody has the money to consume anymore. That could happen; or we keep our jobs for the most part; or we replace our economic system fundamentally with something that isn't capitalism. We'll be fine.
All for the sake of “progress”, baby. It's crazy how society has been skating the fine line between a utopia and a dystopia, but now I think the picture's much more clear.
But it can easily replace teachers and Stack Overflow, especially for beginners; it's great for beginners who are into programming. I used to go on Stack Overflow, and it depended on luck whether somebody had enough time to answer my noob question, and Google only shows websites, not your particular problem. Now ChatGPT is like a personal teacher you can get ideas from: just copy-paste your code and ask a question like you'd ask your teacher. It's so efficient, a life saver.
Someone who is just starting out in programming doesn't even know what questions they need to ask. That's where real teachers come in.
@@TheRealJman87 For that, YouTube will also do; I am self-taught and now making simple games with C#.
I remember when people said Napster wouldn't be the end of music sales as we knew them. You can keep resisting; it won't change anything.
OK, so I haven't programmed for years; I have almost forgotten everything. Nevertheless, I decided to make an indie game just for the fun of it, using Unity and programming help from ChatGPT. I've spent literally 12 hours trying to get it to write a proper "find me the nearest object with a given tag, then move another object towards it." Its code, although it always compiles, doesn't do what I'm asking it to do, which is quite frustrating. Whenever you tell it to alter something in a function, it breaks something else. No, it won't replace programmers yet. Not by a long shot.
If you think that ChatGPT isn't something that will replace programmers, wait 1 or 2 years.
Based on what? Just guessing?
@@jendabekCZ Based on what just happened with Stable Diffusion, for instance.
@@pedro_marques92 Can you get Stable Diffusion to draw something very specific without error? I think not.
@@FilterChain Stable Diffusion didn't exist 3 years ago; imagine 3 years into the future.
One thing you can do: when it does something wrong or not to your liking, because it's an AI and it learns, you can tell it what's wrong and it'll redo it for you. And probably in the future it will become more and more informed, with fewer errors.
How does it know you are not just lying to it? It is just as likely to become more and more misinformed and mediocre, just like the internet did.
It doesn't do that. ChatGPT doesn't learn from your interactions with it.
ChatGPT is updated by its developers. It only 'learns' in the sense that it remembers earlier parts of a conversation.
The problem is that students increase their programming ability by overcoming hurdles for themselves. Seeing someone else overcome the hurdle (including AI tools) produces far less of a gain. It is a bit like going to the gym and watching someone else pump iron: you may learn a bit about how to use the equipment, but it won't make you any stronger. If GPT can solve the educational programming exercises students do to learn how to program, how are they going to learn to spot when it gives them bad code? It is a great tool for expert programmers, not for most students.
You can say this about any interpreted language, since you don't need to know how computers work to code in them
I disagree, chatGPT is going to be to learning programming what WolframAlpha did to learning math. WolframAlpha's step-by-step solution is a great learning aid, sure it can be used to cheat but it's also like having a TA with you at all times. Just because some lazy students would abuse it, doesn't mean students that actually wants to learn wouldn't use it to improve their studies.
@@williamdrum9899 You can program using compiled languages without knowing how computers work.
@@dunmermage Having a TA with you at all times would be a bad thing. Having too much help makes it more difficult to develop skills, especially problem solving (programming is more than coding). It is like having a spotter that holds 50% of the weight of the bar all the time. It may look like you can bench press 100kg - but in reality you can't. I've been teaching programming since 1994 and I have seen how the profusion of overly-helpful online resources has made things more difficult for a lot of students. I'm not saying the tools are a bad thing, just that they are best avoided by students *until* they have understood the fundamentals.
@@usr-bin-gcc That is also true
It won't replace anyone. Companies hire junior developers knowingly taking a hit to productivity because of the potential of them becoming good developers. It could be used as another tool for assisting developers, but the idea that it will replace them is just outright dumb.
Cope
It would significantly speed up development, which means much larger projects are now possible for large companies, and small companies can punch above their weight class. It is a tool that needs to be driven by an expert, but I still think at least half of all coders will be made redundant; they will probably change to a more AI-oriented career.
I agree with this. I used to think AI would make jobs just disappear, but people still have to know how the AI works. I know someone who works somewhere with machines; the machines do things people don't have to do anymore, but maintenance workers still check on the machines to make sure they're working. It's like moving up a level: the way source code abstracts assembly, asking an AI to write code still requires you to word things correctly, just as you need to talk to people in certain ways to get information out of them. Some people understand biology better than mechanics, and some understand electronics better than metallurgy. ChatGPT works by tokenizing words and using probability to guess which words come next. There are videos about the pre-prompt, a hidden prompt ChatGPT starts with, and videos about ways to get it to translate things, which it can do; there are some Computerphile videos about ChatGPT and neural networks, if you're interested. I asked ChatGPT how it worked and it mentioned tokenizing and subwords; I looked up a video about them and it turns out that is indeed a thing. We just have to fact-check ChatGPT, like we fact-check our fellow humans every now and then. ChatGPT also introduced me to piezoelectrics, of which quartz is one example: compressing it enough produces an electrical signal, which I found interesting. One thing that helps me: if ChatGPT answers a question in a way I don't understand, I say "So it's like..." and then attempt an analogy of how I think it works, and ChatGPT will either say I'm correct or try to correct me.
I remember at one point I was worried about asking leading questions by accident, so I asked it to tell me whenever I asked a leading question and to give examples of how to phrase the question so it's not leading. It's interesting to try to think like it and guess how it might respond to something you say, like how before saying something to a person, you might think the sentence through, guess how the person would respond, then say it and see if your guess was correct. That's how I tend to talk: I think of the sentence before saying it aloud, so that I get the whole sentence out instead of pausing while I think of the next word. I wait for a pause after what I think is another person's complete sentence, only for them to say I interrupted them; I let them finish, then try to say my sentence. Because of accidental interruptions from not waiting long enough, I've learned to think the full sentence before saying it. I don't want to cause other people the confusion others cause me. This has drifted significantly from the original point; maybe I'll copy this comment into ChatGPT and see how it responds. I'm curious what you think of this. Sorry for the wall of text; if you have formatting ideas, please reply and I'll edit this comment accordingly.
I find it very helpful as a teacher and tool.
I've been using it to learn Python, and even to create a game, a fun little program, generate exercises for me, etc.
I've noticed it is possible to use it to write some solid code without understanding all of it. It's not perfect, no one is claiming that, but it sure makes for a good teacher and a helpful tool for many different tasks, even besides coding: recipes, helping with writing, creating statements, steelmanning arguments, brainstorming, philosophising, even psychological assistance. I'd like to add music to that, but its musical abilities seem barely existent, and you have to jump through quite a few hoops to get any help with actually writing music. Lyrics work great though, and poetry is decent, although it has issues with rhyming. Sometimes it might not properly understand what you're asking and just keep giving you the same few variations of the wrong answer.
All in all, it is versatile, interesting, very helpful, enjoyable, and undeniably a step forward for all to see.
How did you know the code was solid if you're a beginner? There are plenty of very good tutorials that do a far better job than ChatGPT.
ChatGPT is not that good at this stage. I asked it a math question along the lines of: "If a cube is divided into 27 smaller cubes of 1 unit each, what is the volume of a sphere placed inside it that touches the center of each of its faces?"
Couldn't believe it was not able to solve its own question!
The idea that AI will replace experienced programmers is bogus. Sure AI can be used to automatically solve complex problems such as programming, but how would an inexperienced programmer who only knows how to ask an AI for an instant solution know if what they receive is optimal? If the 'programmer' has no idea how the code even works, how would they even know it could have been written differently such that it used less memory or ran faster? Should we just ask it to "Make the whole program you wrote for me run faster and use less RAM, please. Because I don't even know which part is unoptimized lol." You don't even know if what it's giving you is the best solution, unless you already have a meticulously optimized, human-written version to compare it to, anyway. As with all inventions of hyper-convenience, it will be abused to the point that the people who use it become totally dependent on it and unable to function without it, and actual experienced programmers will still be needed to ensure their do-my-work-for-me-bot keeps on working.
I have tried using it to help me with my C++ code and I was not impressed either. I want to be clear, I don't think it's "not as good as people say" or simply "overhyped", I genuinely think it lowers my performance. When I tried to have it write simple functions, it was wrong over 50% of the time. Maybe it's useful if all the programming you do is just copy pasting from StackOverflow but if you know the language and are a fast typer, you can easily be more productive without it.
Most jobs aren't about typing fast. They're about understanding a business and its customers' needs to drive sales. That is something AI can figure out better than humans. Your use of it is just training it to replace you. Forget even tech debt (maintenance jobs requiring years of domain expertise); AI can clean that up and maintain the code by itself.
The trajectory is set to a dystopian future. Don’t feed the beast. Boycott it.
@@mecanuktutorials6476 In theory all those things are possible, but from my experience current AI really isn't there. ChatGpt doesn't even understand the context of one big class, let alone a multi million line project.
@@MsJavaWolf It'll get there. It needs to understand the language's compiler rules, but I don't see that as particularly challenging since the input is all there. Feed it the source code and it can summarize it for you even today, and make minor touch-ups too. It's just a matter of who is dumb enough to automate themselves out of a job.
There are already several tools that refactor without changing logic. Hooking those up to ChatGPT and getting it to run regression tests would be devastating on its own. Small "improvements" will cause massive disruption to programming jobs. And it's not just theoretical, it's at the doorstep: because of how expensive devs are, there is huge demand for this.
I don't know if this is relieving for anyone, but I once asked ChatGPT to write a simple drawRect(uint x, uint y, uint width, uint height) function in OpenGL 4, and the bot got it wrong even after 4 responses or so. First, it didn't convert the coordinates to the [-1, 1] range used in OpenGL. Then, three more times, it used only deprecated functions that were removed in 3.3.
I guess that's probably because most sources available on the Internet target 3.0 or below.
@ioc9owo698 change the pronoun "he" to *she to be a bit more accurate. Most bugs I've encountered are by diversity hires she/we/her
@@op8995 you didn't see anything...
Just ask it how to fix it, and it should attempt a fix. It tends to generate high-level code first; if there are any incompatibilities, just ask it about them.
I've personally been using this quite a bit as a learning tool for programming. I don't ask it to straight up write something that works; instead, I ask it for an explanation of how to write something and maybe ask for an example. So far I have learned what a breadth-first search algorithm is, how it works, and how to make one myself using multiple approaches, both recursive and non-recursive.
Yes, it makes errors when writing code. Sometimes it even gets confused about which language you initially started asking about (if there was one). Sometimes it gets some of the features and limitations of that language completely wrong. But these errors only reinforce the need to read the code it spits out and understand it, instead of blindly copying it into your own application.
remember kids,
if someone says AI might be able to do something in 10 years then you should expect it's made public within the next 10 months
What is the basis of this? Could you give some references?
You didn't use it when it was not nerfed. It's been MASSIVELY restricted.
I love the statement "Don't be afraid" in the thumbnail.
I used it today to implement a feature from a third-party library. It gave a code example with explanations that seemed correct, but I was having trouble seeing the expected outcome in the browser. I had some doubts when the documentation didn't resemble the examples it created. Then I began to interrogate it by asking, "Are you sure this is correct? The documentation does not show that 'context' is a required prop." It replied with, "You are correct, 'context' is not a prop... sorry for the confusion," then spat out an amended version of the code. And again it didn't work. This continued a few more times until I realized it was making up props that did not exist. I would not put blind faith in anything it outputs, whether code or facts. We're going to start saying soon, "Don't believe everything you hear from AI."
I am confident it will become exponentially better in the upcoming years
It's good enough for making regular expressions. It's a tool to save time, and it just keeps getting better. Real programming requires creativity, so it's really not a replacement for creative coding.
Like art?
@@dixion1000 AI art sucks and is inferior to real art in every possible way.
@@TheRealJman87 like the one who won a contest?
I made it write my english assignment, I will reply here with my grade.
I sure hope you are right, but how exactly do you arrive at your "at least a decade" estimate? Even given GPT-3 and Copilot, ChatGPT would have been difficult to anticipate even a year ago...
Not really. While it is a step up from Copilot in making a whole program from scratch and writing comments, it is a step down in every other way, especially in producing a program that actually runs.
@@tjgdddfcn To me that's not relevant, because my code usually doesn't run as intended on the very first try either; I usually have to make small adjustments, which seems comparable. What's impressive is that for well-documented topics, it's able to parse and summarise a body of knowledge with not perfect, but scarily high, accuracy. The fact that you can ask it something like "can you give me a geometric interpretation of complex conjugation" and it can retrieve that from its knowledge representation... It is even able to relate similar concepts and compose different concepts. Hopefully this stays in a mostly guided fashion...
@@johanngambolputty5351 Thing is, those errors usually come from using functions that don't exist, or using a type to do something that type can't do, and those kinds of errors tend to break the structure of the whole program, or at least part of it. While it is good at explaining topics, it is only good at explaining very well-documented ones. Also, the way it writes in plain English without grammar errors is nothing to be scared of either; we've had that technology for about 7 years already.
@@tjgdddfcn I am maybe at ease that its not a threat to programming right now, that its probably just a hyper-google for the time being and actually pretty useful.
I wouldn't go as far as saying it can "explain" things itself, but it certainly seems able to retrieve explanations it's seen, and this seems to imply an ability to combine multiple explanations in a consistent way; no past method really gave me that impression, not as convincingly (even if it's still just an illusion).
But conceptually, if we accept the premise that it can do some impressive stuff with sufficient guidance, and if it were allowed to keep learning, then it's potentially only an automated guidance system away from general intelligence? If it could lead itself to its own conclusions and consistency-check them, suddenly it would be doing much worse than just stealing programming jobs... Though the guidance system would still need an objective, and training a reward model for that would probably be much harder. And it would probably still have a fairly limited view of the world through a text-only datastream.
@@johanngambolputty5351 It's not about how well documented it is, but about how many examples it can put the answer together from, because it has no actual understanding of the programming logic.
It's Excel 2.0. It buffs the baseline of productivity and brings together data for easy analysis, but doesn't actually do anything in its own right.
It's not intelligence at all. It's mimicry. Unfortunately it seems to have fooled a lot of people already who only pay attention to the surface of things.
Prediction: it will create a spiral of mediocrity, feeding on itself as people over-rely on it. It will be both an Excel 2.0 and an Icarus 2.0, crashing the AI hype train harder than a robot hand breaking a child chess player's finger.
I am not much of a code buff, but as a writing person it does not impress me much, since its stories are very generic; only a content creator can make it work.
Same with coding
3:38 Yes, but how long until it is capable of this?
Can someone explain 2:53 to me, please? Why didn't it work?
Can this thing write PLC code? Probably not because it’s proprietary, right?
I like how you go "Don't freak out this AI isn't that good", and then go on to say we'll be replaced in 10 years which is a sooner estimate than most people give. I still don't really feel better as someone just starting CS.
Finally someone who talks sense about this.
You know, it's one thing to generate code for the umpteenth Fibonacci number, but complex coding involving thousands of lines of properly tested code, with all its intricacies, using tools, libraries and frameworks judiciously: that's a ballpark ChatGPT isn't going to be near for a long time, if ever. Never say never, I guess, but if you're a 20-year-old newbie programmer, you've got work until your retirement.
It's as you said: it's the small things it can't do. Say you work at a company with a code base that has been built up over the last 5 years. There is no way a normal person is going to know where to put the code, or how to link it to other files; it won't know what communicates with what. There is also no way a company is going to let an AI build such a large app without a few developers making sure everything works and is fully optimized.
I imagine this could be the basis for a framework that would let us work in sync, from anywhere in the world, without knowing each other:
I ask a question
ChatGPT finds a bunch of people in the related field and translates it to them
one finds a solution
it then sends the solution to a group to verify it
we could repeat these steps a few times
this should also help filter identical ideas
now it gives me back the answer
later, if someone else asks the same question,
it should give a summary of the process: do this, don't do that, and why...
Hi, I've been a programmer for 15+ years. I used ChatGPT and it was able to write EVERYTHING perfectly. It scared me, because now I feel like I have to plan ahead for coding being replaced; this is only the beginning, and it's going to advance. Everyone is just trying to avoid facing reality and playing it down, but I'm worried. I do believe we're done, slowly, within maybe 5 years. We'll see what doors it opens up, though; maybe it makes us much more powerful and able to create WAY BETTER applications, why not? No one has a crystal ball for the future, but I got ChatGPT to write everything, and really fast.
By the way, I think they are bottlenecking and throttling its capabilities, because I'm now noticing it says it can't do things I got it to do before. This is not by chance.
I think they are. It's been noticeably slower than the last few days.
"At least a decade away from that"
I've had it give me straight-up incorrect information before. I asked it about a C test question: a function computes the y value of y = mx + b, taking two int pointers and an int as inputs that have to be multiplied together. It told me the correct answer was pointer1*pointer2 + b instead of *pointer1 * *pointer2 + b, stating that C auto-dereferences when multiplying, even though I tried it in C and it did not work, and the answer key also said it was wrong. I tried to tell it it was wrong, but it kept insisting that I was the one who was wrong and there must be some other issue, even though the answer it picked just wasn't right.
Question: Does it also support the reverse: Code -> human language, i.e. you input an algorithm in some programming language and it tells you in high-level terms what the algorithm does?
Yes, and I've found it to be better at doing that than writing code from scratch.
People aren't afraid of what it is; they're afraid of what it will be capable of in 1, 2, 3, 4, 5+ years.
Programming languages are made for humans; an AI doesn't need them. So I'd say the future isn't an AI generating code for humans, but the AI itself solving problems instead of humans producing code.
The Google translate comparison feels spot on
Agreed, this is a new tool that will make programming easier. Maybe it will grow into some new high-level language; no-code programming has been developing for some time already.
what if it could run on more computing power like quantum computers?
For anyone curious about why ChatGPT can't be smart, look up the "Chinese room"; it's a thought experiment about why that isn't possible.
As you said, FYI: if there is a bug in the code, you can tell ChatGPT about the bug, and it will apologize and usually try its best to fix the code for you. Sometimes it works, sometimes not. Still far from replacing human programmers, but a very useful tool for them...
It will; it's only a matter of time.
Embedded systems and low-level development will never be automated away. As for web development, there are already Wix and Honeycode for automated or codeless development. You can't do that for embedded or low-level work.
I'm a lawyer and a programmer. I used ChatGPT to draft a legal document for my country (Colombia), and the output was a basic boilerplate document. I know that kind of structure, and I presume ChatGPT somehow got it from a website. I mean, ChatGPT doesn't understand the concept in a deep way, but it knows how to find information about the concept. That said, when I ask it for code, it never answers the specifics, because, again, it doesn't understand the concept; it's just great at looking up the response.
But at least bureaucrats, journalists and politicians can already be replaced with chatGPT and be more cost effective.
Do you know how ludicrously much easier it would be to control a narrative if you replaced journalists with a robot?