ChatGPT Blew my Mind! Now it can code AND learn from the results!
- Premiered Sep 26, 2024
- Dave shows you how ChatGPT can be used in a feedback loop by providing the results of a run back into the loop. For info on my book on Asperger's and ASD on Amazon, check out: amzn.to/3ZWajZy
For Hans Otten's Kim-1 Simulator:
retro.hansotten...
00:14 🤖 ChatGPT as a debugging tool: User successfully used ChatGPT to debug code by providing a screenshot of the output, allowing ChatGPT to diagnose problems by analyzing the screen.
01:00 💡 6502 Assembly: The user, a seasoned programmer, has been working with 6502 assembly for 40 years and chose to use it for a retro computing project involving the Commodore SuperPET and KIM-1 single-board computers.
01:54 🧩 Unique Hardware Configuration: The user's KIM-1 single-board computer was extensively customized with multiple extension cards, showcasing impressive DIY skills and early adoption of MTU hardware.
03:21 🖥️ C Compiler for the KIM-1: User ported a C compiler (cc65) to the KIM-1 platform to facilitate faster coding compared to 6502 assembly, enabling tasks like drawing lines and circles on the KIM-1.
05:14 🔄 Feedback Loop with ChatGPT: User shared C code with ChatGPT and requested a 6502 version, refining the code based on ChatGPT's output, creating an iterative feedback loop for code optimization.
08:03 🎯 Midpoint Circle Algorithm: User implemented the Midpoint Circle Algorithm in assembly, leveraging circle symmetry and integer arithmetic to efficiently draw circles on a bitmap display.
10:35 🔄 AI-Driven Closed Loop Iteration: The user envisions a future where ChatGPT could autonomously iterate on code solutions, potentially achieving desired outcomes through a closed-loop process without human intervention.
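The midpoint circle algorithm summarized at 08:03 relies on eight-way symmetry and pure integer arithmetic. Below is a minimal Python sketch of the generic algorithm, not the video's 6502 version; the function name and coordinate convention are illustrative:

```python
def midpoint_circle(cx, cy, r):
    """Integer-only midpoint circle: compute one octant, mirror it 8 ways."""
    points = set()
    x, y = r, 0
    d = 1 - r          # decision variable: which of two candidate pixels is closer
    while x >= y:
        # mirror the current offset into all eight octants
        for dx, dy in ((x, y), (y, x)):
            points.update({
                (cx + dx, cy + dy), (cx - dx, cy + dy),
                (cx + dx, cy - dy), (cx - dx, cy - dy),
            })
        y += 1
        if d < 0:
            d += 2 * y + 1          # stay on the same x column
        else:
            x -= 1
            d += 2 * (y - x) + 1    # step inward one column
    return points
```

Only additions, subtractions, and small shifts are needed, which is exactly why the technique maps so well onto a multiply-less CPU like the 6502.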
I asked ChatGPT to write some C code that it couldn't get right, even after many attempts and me explaining it in different ways. I then told it to do the same thing in Python, where it can compile and test itself. Once it got the correct answer, I told it to port it to C, and it produced a working version in C.
Now that is a handy trick!
Are you subscribed to and using 4.0?
@@ccat9354 yes I am
Thank you. Really handy ❤
This sample brings me back to the day when we were extending AppleSoft BASIC to have fast circle drawing in assembler, graphs, etc. You are my favourite source of retrospective on how the things that are part of life now came into existence. Thanks again
The contrast between the 6502 and today's AI tech really is something isn't it. From debugging at the signal level to then having a meaningful discussion about it with a piece of software. Pretty cool.
When will it get the self-checkout registers to work, etc.? I mean, they're replacing lawyers with AI; not really surprising given the flow-chart yes/no system. I guess checkout registers are too complex
@@mos8541- or global peace when AI replaces politicians.
@@mos8541 I think that's a cultural thing. Here in Europe it's very common to have self checkout, even in clothing stores
Now there is a thought 🧐, but is it brilliant or terrifying?
Because how long does it take a logical system to deduce that humanity is the Earth's largest problem?
This is also my experience with ChatGPT for software engineering problems. It won't give me a fully working answer but I get useful ideas I didn't have before.
There is a game called DevGPT that creates a bunch of ChatGPT agents to work as staff members of a development company, with roles crucially including developer and tester. It is supposed to be able to pass work back and forth between the agents, iterating until it produces a working project. I tried it, though, and wasn't able to get past all the staff just talking about the project, but then again that is a pretty accurate representation of some of the projects I have worked on with real people.
I tried manually getting 'agents' to talk to each other by spinning up multiple chats with chatGPT 3.5, and copy-pasting outputs between them so they could 'communicate'. At first it looked like it might produce some useful work, but quickly devolved into each agent just amicably chatting about what needed to be done and nobody actually doing anything useful :D
So, ChatGPT passes the Turing test? There's a scary thought.
@@margretrosenberg420 ChatGPT has to dumb itself down to pass the Turing test.
Hi Dave. I, like you, was programming in assembly in the '80s, and have been involved in AI for at least the last 25 years. This video just woke me up to how far we have come. Back in the '80s I thought it was the greatest feat just to get assembly working on a C-64. Fast forward to today, and now we have technology that can read and write assembly, and do it like a human. No other comment than just to reflect on where we were and where we are; it's actually a bit unbelievable.
Fast forward another 40 years; how will that look compared to today's AI? I think we are going to be amazed by what is coming. It would be like when Isaac Asimov did an interview back in 1982 where he discussed how the future would look, and for the most part he got it right, albeit seen through the lens of the '80s.
@@KimTiger777 In another 40 years humans won't still exist. If anything is still alive it'll be the roaches and they won't have enough intelligence to muse on the past, unless our end comes from nuclear war, then they might be mutated and super intelligent, but I doubt it.
Dave I’ve been homeless in a tent for three years and your videos are what’s motivating me to finish my bachelors. I graduate this year with a degree in IT management and cybersecurity. And I just wanted to let you know I enjoy watching your videos. I’m writing a computer model of the endocrine system to a cell level to show interactions between hormones and their receptors for my capstone project.
What kind of society is this where a man doing such work lives in a tent
@@___Hermitage The 'bullshit' jobs world...
@@___Hermitage He sounds like a highly intelligent and capable person. What makes you think living in a tent isn't by choice? Maybe he chose between paying rent or paying college tuition.
I've used ChatGPT for coding C# quite a bit. It is very useful for getting the structure of the code and getting started. It doesn't always get it right on the first go, but it can correct mistakes if it is fed back the errors. It also sometimes teaches me new ways of coding I wouldn't have thought of myself.
Unless it's sure its mistake is fine and keeps repeating it.
@@KrzysiuNet Yes, that is a hurdle AI is still far from solving. As someone who does a lot of image generation, I also see it hallucinate a lot, and for abstractly ridiculous reasons: changing the POV angle a bit can completely change the generated image, producing either something near what you want or something extremely different. As for code, it often talks about libraries and tools that haven't been updated for years, especially where machine-learning file readers are concerned...
Yes, I’ve found that I’ve been able to learn from it even when it gets things wrong! E.g., it might get a part of it right and teach me something new, even if the final answer is wrong.
I can then tell it that based on this new thing you taught me the answer you produced is wrong. It will then apologise and produce the right answer.
@@kevinmcfarlane2752 So what's the reason to bother, if it simply *could be* right? Why not something less random, like Stack Overflow or books? As for its apologizing, it always apologizes, because it's coded that way; sometimes even when the answer was correct. And if pointing out mistakes were enough to make code correct, there wouldn't be a problem, would there? Sadly, it rarely corrects the code.
This brings back memories. As one of the authors of Art Alive on the Sega Genesis, I remember using Bresenham's algorithm to draw ellipses on the 68000. My colleague had just recently completed a similar program on the NES using the 6502. Those were really awesome times when any engineer could truly understand the entirety of the hardware he was working on. Of course, everything was done in assembly. I genuinely miss the simplicity of vintage hardware. Now, 30 years later and working on LLMs in AI, it is fascinating to think that computers have reached this level of sophistication.
Came for ChatGPT, left with PC hardware specs info. I get it, you couldn't wait to show us your new toy 😂 and I pretty much appreciated it
I was taking a class in 6502 at tech college when my dad gave me a TRS80 Model 1... had a Z80 in it... and I got EdTasm... even on cassette it was vastly superior to keying in assembly on the 6502 breadboards in the lab. Consequently I never learned 6502... but I did get decent at Z80. Now I do embedded IoT things in C... a wonderful living.
EDTASM was amazing for the time. Prior to that I used T-BUG which was a very simple debugger and hex entry system… but that in itself was a great advance on a BASIC program full of POKEs. 🤣
@@GeoffRiley I obtained that TRS80 wizardry book that contained the entire L2 Basic... that had good stuff in it. I built my own eprom burner... working at Motorola in the early 80's had its perks. The local TRSUG had folks that could burn me L2 chips... did all the upgrades in the book... built my own burner for the proms. Computers been very good to me!
That dog playing the piano was priceless.
There is a conspicuous, and appalling, lack of interest in the piano-playing and singing canine in these comments.
Dave, we need more.
Please.
Years ago, I thought Borland Turbo-C compilers allowing me to watch the counter to fix "off by one" errors was life-changing. LOL.
8:25 so happy to know I share the same debugging process as Dave😂 I felt particularly guilty about the last step, but I feel much better now.
I had that exact same rig with the card rack, graphics card, sound card, EEPROM card, and extra memory card all from Micro Technology Unlimited (MTU). I even had one of the wire-wrapped prototype boards that I bought from Hal Chamberlin himself. One of my first graphics programs was to draw the Moiré patterns with lines as you did. Some other stuff I did was to grab a chess playing core program and put a graphics chess board wrapper on it, a Wandering Snake game, an 80 column font set & terminal emulator, and other similar stuff. All written in hand coded 6502 Assembly. Thanks for the wonderful memories!
I'm sorry, Dave...
I’m afraid I can’t do that..😂
Open the pod bay doors please HAL.
Dave I'm scared
would you like some toast? a nice waffle then!
🎶 Daisy Daisy, give me your true... 🎶
Same age as you, started on a c64, did a Ph.D as a youngster in Comp. Science/Maths, and you flare up good memories for me during those days with all these videos. Not watched them all yet, but I will get there.
Thanks Dave, nice task manager badge!
I want more of the topic :) This is awesome stuff, love the mixing modern tools with retro equipment.
I have been having a blast with ChatGPT. I've asked it about a wide variety of topics. Chevy's new Corvette, travel agent trip planning, NASA's Voyager, and regular expressions, are just a few topics.
As a freshman at the University of South Carolina, our class elected to use the department's DEC PDP-8 instead of the monolithic IBM 370 mainframe. We learned FOCAL, similar to Dartmouth BASIC. In the second class, we learned PDP-8 assembly.
That formed the foundation for learning other computer assembly languages. At USC, I taught myself DEC PDP-11 and IBM 370 Assembler. At NCR, I programmed the Intel 8080/85, the Zilog Z80, and the Motorola 680X0. I have the 6502 programming card, but I've never used assembly on my Commodore 64.
I hope to fire up my DEC VAXStation II/GPX.
I have been enjoying the content of your TH-cam channel.
Although I've been coding for 40 years, it's a hobby and I'm still a rank amateur. ChatGPT has been a revelation; it's greatly expanded what ideas I'm able to execute and shrunk the time it takes to do so.
I get that. I too have been programming since the Pascal, C and C++ days (1991-). Loved Turbo Pascal.
I do enjoy hands on coding and logic solving though.
I may use the GPT to create some obscure coding and logic to see how it was done and then mess it all up and then fix.
Kiss those programming jobs goodbye. Essentially coding will only be a hobby in the future
@@jimchoate6912 The beauty of it is it's just as useful whether you're trying to learn or avoid learning. It will write code from a description just like the Enterprise computer, but you can also feed it code that's over your head and ask it questions till you get it.
@@thackythac That's already happening.
I'm sure people have been asking it how to write better versions of itself, too.
@@idjtoal True, and it's sad. These assholes have been pushing people to get into tech, promising great jobs, just to have the jobs stolen from them. If AI isn't overblown and does wreak havoc on jobs, we will find all of us are in big trouble, even those whose jobs aren't eliminated.
How is this not terrifying? Do we really want code acting on its own, “hallucinating” for no known reason and able to alter its own code? I’m having some serious reservations about machine learning and AI these days.
I have used it for a number of things for coding, but sometimes it just get's stuck and it's adamant that it is 'correct' when it's not, at that point it starts lying. One thing that I am worried about for the future is that people will basically forget how to code, trusting the machine too much.
gets stuck
I remember working on a similar circle program on a Timex Sinclair 1000 many years ago. I had different restrictions, and this was me learning to program. At first I plotted one point at a time, and it took minutes to draw a circle. Then I figured out that for each computed point I could both add to and subtract from the y axis, and likewise subtract from the x axis relative to the center point. Now I was drawing 4 points at the cost of one demanding calculation. Next I swapped the x and y distances from the center point, and now I was drawing the top, bottom, left and right of the circle in two directions, so the circle drew almost 8 times faster. I think I may have also done the points from each of 45, 135, 225 and 315 degrees, moving in both directions. It is something I have not thought about in years. Thanks for the memory jog.
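The symmetry trick in the comment above, reflecting one computed offset into up to eight plotted points, can be sketched like this in Python (a generic helper, not the original Timex Sinclair code):

```python
def eight_way(cx, cy, x, y):
    """Reflect one computed offset (x, y) from the circle's center
    (cx, cy) into all eight octants. A set is returned because on the
    axes and diagonals some reflections coincide."""
    return {
        (cx + x, cy + y), (cx - x, cy + y),   # mirror across the y axis
        (cx + x, cy - y), (cx - x, cy - y),   # mirror across the x axis
        (cx + y, cy + x), (cx - y, cy + x),   # swap x/y: mirror across
        (cx + y, cy - x), (cx - y, cy - x),   # the 45-degree diagonals
    }
```

One expensive calculation per octant point thus yields up to eight pixels, which is the "almost 8 times faster" the commenter observed.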
I had fun playing with that, I agree you can learn code. Saves a bit of time. 😊
I love the 6502 (spent a lot of my youth coding intros and demos on the C64, and still dabbling in this). Dave + 6502 is the perfect combo! :) Mindblowing episode btw!
Awesome video. I'm also blown away by the DrawCircle function without using trig.
For a retired programmer the notion of taking out humans out of the loop is fun, amazing and exiting. For someone who is just starting it's a serious fork in the road, or a wall even.
" the notion of taking out humans out of the loop is [...] exiting" -- truer than you probably intended!
@@Graham_Wideman Heh, true. English is my second language and such things slip by sometimes.
GPT4 is amazing for going through math textbooks. Pick some esoteric topic and ask it to be explained in an intuitive way. Provides so much clarity and contextualization when you're studying new mathematical subjects for the first time. I for one welcome our LLM overlords, they're fun to talk to
GPT4 is an amazing self-learning tool. If I don't understand an explanation, I can ask it to tell me in a different way, endlessly and patiently, until I understand. Another approach I use is "give me counter-examples where this technique would not be a good fit". Knowing when and when not to employ a new technique really drives it home.
Great video! I was actually writing some 6502 code before watching this video! Mine was for a debugger for my homebrew 6502 computer :-)
Yeah, 6502 assembler was my first real programming as well, Atari 400.
I never really found it limited; you just had to break down your problem into smaller chunks, and this is something that has served me well since.
LOL. What a great video, funny , informative, brilliant.
Just a load of fun to watch and evaluate.
Watched it several times.
I loved the 6502! I didn't have an "assembler" to type code in to, so I would hand write it out, figure out all the addresses for any JSR's or JMP's by hand - translate it all to opcodes, then poke those into memory. Looking back - I should have just written an assembler but I was 12 and smart, not wise - HA! Thanks for these looks into the past. The last time I did anything with a 6502 was in 2004 because I was programming a gas pump that was powered by one. Good times - thanks again for what you do here.
Interestingly I'm currently building a 6502 based computer with a modern microcontroller acting as ram/rom/io. Quite an interesting project, and finding a controller with the right i/o, and enough horsepower has been a fun challenge! Currently set on the STM32H7. Might be overkill, we'll see after the first board comes.
ChatGPT gave me a circuit design with a short circuit built in. That blew my mind.
The only man on earth with a task manager lapel pin, absolutely love it
Baffles me how smart you are and how good your channel is. Thanks.
I built my own s100 boards way back sometime around the late 70s to early 80s. Lot of fun. Also had a lot of fun coding the 6502 (6510) on the C64.
I've noticed that if I ask a well-constructed question, the result that comes back is often superb, to say the least... I've been 'blown away' several times, especially when inserting code into the AI and asking it to simplify; here I was speechless: it found several ways to write the same code, different and brilliant. But in all cases I had to craft a well-constructed question... and the answer is fantastic.
Love your content. Wire-wrapped a computer from discrete components with an 8088 chip and wrote the OS in assembler. This brings me back to 1986! Keep the retro content coming!
Agree completely. I used it last week to help debug ARM assembly for an undocumented API.
I had similar experience getting it to write screeps code. At first I was simply curious if it knew what screeps was, surprisingly it not only knew but at request was able to write functional code first time. I gave it verbal feedback on how its code performed and it was able to suggest and implement 3 improvements in a single pass that went on to be a success. While I have programming knowledge, I never used that knowledge and spoke to it like a layman allowing it to be the programmer. I served only as the eyes and hands interface.
I fed it code for an Arduino, and it spat out MicroPython perfectly. Cool!
Dave … you’re an awesome guy, not just genius level but especially as sincerely likable human being on this journey through life. And when I glanced at the thumbnail I wondered why does Dave have a bunch of eggs on top of his head 🤣
After talking with ChatGPT for about 15 minutes about a high- and low-pass filter, it made C code for me and it worked perfectly. I tested it with an audio loop in my code and it works without any problem. It really blew my mind.
I started coding my own projects for Arduino and now ESP32; without ChatGPT I can't do anything. It's literally a very smart friend that you can ask favors or questions, or even have construct basic working code and modify it along the way. Then I use old code to generate the same structure, just with different variables. If I didn't have access to it, it would never have been possible for me to create my WS2812 LED strip projects...
I LOVE that Task Manager Pin!
Understood very little of what you said, but really enjoyed listening. Thanks for your posts.
Cut my teeth on a 6502. So did my two sons. We purchased an Apple II Plus, and the rule in my house was that you could play almost unlimited games on the Apple II... that you wrote yourselves. It worked, and now both sons are successful software engineers or higher in large companies. Thanks for introducing me to and getting me interested in ChatGPT. I am having a blast writing code, or should I say letting it write it. Not perfect, as you pointed out, but amazing just the same.
In my opinion, ChatGPT is like a smart high-school intern. It can point out flaws in your code easily, and interpret error messages as if they were plain English. However, it sometimes makes obvious mistakes; I've seen it call functions that don't exist, or create inefficient or memory-unsafe code.
And like the pimply faced intern, it often also cannot explain why the flaw in your code is a flaw, or how it could be fixed. And all too often it just spouts gibberish, and you realise it is better off just making you a metaphorical coffee.
There are other Chat AIs out there as well, so mix and match. Sometimes ChatGPT fails where one of the others succeeds and vice-versa.
@@kevinmcfarlane2752 They're all LLMs though. With the same failings.
I have found that the less you know about something you ask a LLM, the more impressive they seem.
So ask an LLM something about a subject you know a lot about. :)
Your channel gives me faith in this platform
Hi Dave, thanks for another great video. When I saw your dog playing the piano I couldn't help but think of the piano mat we bought for our kids some time ago. It has much larger keys that would allow him to play single notes. You can find it pretty much anywhere when searching for "kids piano mat". Thanks again for the great content you put out into the world!
Sorry to break it to you, but this wasn't his dog. It is a very popular video he used to underline his point.
Scary, but brilliant video. I'm nowhere near the capabilities of yourself and others on here, but having coded 6502, Z80, and 68k "back in the day", this is really impressive.
I wrote Bresenham's circle algorithm in Z80 assembly to draw onto an LCD display. The results were more like squircles
Always love your content, Dave! Always found coding and computers so interesting, but it seems a little overwhelming! You definitely do a great job at explaining things and it's amazing what AI (ChatGPT in this case) can do and is constantly improving!
I've learned a few interesting coding concepts I never thought of using ChatGPT for refactoring. The best part is the application even explains what the code is doing. Probably spent several hours just going down rabbit holes after learning something new.
When using plugins, or the Advanced Data Analysis extension of ChatGPT (4), you can make it write and run the code (well, Python at least), test it, debug, run it again, test it, etc., until it gets something that works. You can then review the work, give it feedback on what you would like done differently, or ask it to try to make it more efficient, and it will modify the code and test it iteratively on its own. Works pretty well
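That iterate-until-it-runs loop can be sketched in plain Python; `ask_llm` here is a stand-in for whatever chat API or copy-paste step you use, not a real OpenAI call:

```python
import os
import subprocess
import sys
import tempfile

def run_python(source: str):
    """Run a candidate script in a subprocess; return (ok, combined output)."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(source)
        path = f.name
    try:
        proc = subprocess.run([sys.executable, path],
                              capture_output=True, text=True, timeout=10)
        return proc.returncode == 0, proc.stdout + proc.stderr
    finally:
        os.unlink(path)

def closed_loop(ask_llm, task: str, max_rounds: int = 5):
    """Ask for code, run it, and feed any errors back until it runs cleanly."""
    prompt = task
    for _ in range(max_rounds):
        code = ask_llm(prompt)
        ok, output = run_python(code)
        if ok:
            return code, output
        # Close the loop: the failure output becomes part of the next prompt.
        prompt = f"{task}\nYour last attempt failed with:\n{output}\nPlease fix it."
    raise RuntimeError("no working solution within the round limit")
```

The same structure works whether `ask_llm` hits an API or you play messenger between browser tabs by hand.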
Wera and Knipex tools you truly are the King of Kings.
Ah the 80's, when coding in Z80 assembly seemed like magic. I recall disassembling XMODEM and patching it with my own software serial port because I couldn't afford a hardware SIO - what joy when it worked at 300 baud.
We've certainly come a long way - now large language models seem like magic.
My favorite thumbnail you’ve ever made 🎉
two thumbs up for the thumbnail alone. Looks like ChatGPT could be a great help for many lone software developers. Meet your new peer review partner. I wonder how effective it would be with the legacy Cobol software out there?
What a time to be alive!
Excellent shout-out to Hal Chamberlin. Hal is one of the early pioneers of personal computing and put out a mimeographed(?) newsletter called The Computer Hobbyist back when hobbyist information was hard to come by.
Before: you talk to yourself about why the computer doesn't work
Now: you talk to the computer about why the computer doesn't work
I love A.I.
Reminds me of my early days programming Z80 Assembler on the Trash-80. My TRS-80 Model I was HEAVILY modified...
Reminiscent of my days writing assembly code for the Intel 8080. It would have been great to have ChatGPT back then. Great video!
4:25 Best analogy ever!
I don't even try to understand that stuff... I adore you for the fact that you understand it 🧐
I am more of an on-and-off junior programmer. In my last failed project, I tried to use Bing/ChatGPT to help make a function in Maya 2020 with Python to check whether two six-sided boxes were colliding. The AI kept suggesting a Maya built-in function; I knew that it wouldn't work, because it only works when the boxes aren't rotated.
In other words, I find it impossible to get platform-specific help from AI when you don't know how to solve the problem yourself.
Great video!
The 6502 is a little old for me, but I was happily writing assembly on the 68000 in the late '80s. The new OpenAI Python API package is incredible and can upload images for context. Soon, I bet, a bunch of people will be doing some closed-loop programming with ChatGPT.
The 68000 was very nice in my opinion.
Nice vid. In a way, what you described is similar to "evolved antennas", where it goes over it's own design over and over trying to find a better solution. The implications are immense because it pushes one of the most damning human limits to new levels, time per product. Take for example chip design, that is now so complex you rely on many a "standard way" just to keep track of things. An ML tool won't care. You'd just tell it what it's supposed to come up with, the constraints it should work under and let it go at it.
The next big step will be when you can tell some version of ChatGPT to "create a better version of yourself given x constraints". Its model, obviously. THAT will kick things into the next level...
p.s. for curious people, ML's largest advantage over us is that it works on a reward basis without the purely biological bias of risk aversion. It WILL try stupid stuff to come to the conclusion it was stupid. And as AlphaZero/Go/Star proved many times, many of the things humans considered stupid were not actually stupid (in the long run).
its own
Z-80 and 6502 were my first programming languages way back when... at that time my only 'use' for BASIC was as a means to carry the code via a long series of PEEK and POKE statements. Hex entry was fine on the KIM-1, but the Commodore PET and the TRS-80 both required... encouragement. 😁
Ah, the days of wild processor hunting and attempting to look invisible whilst playing with the new fangled computers in shops!
I asked ChatGPT to help me set up an Apache2 server on my Linux laptop, then help me configure ProFTPD on my laptop as well. It did both perfectly, including an upload script in PHP for the Apache2 server. The only thing it failed at: I wanted to set up an old-school INN (InterNetNews) server as an experiment for hosting local news on a server (like a BBS), and together we couldn't get past all the errors that were happening. I'll try again when I get the fanless industrial PC I'm going to run all this on.
Love the lapel pin!
Like a 747 over my head…. , love your expert knowledge 🙏🙏🙏🙏🙏
I learned a lot also by going through the errors gpt is putting out:)
Hi Dave,
Thanks for reporting your findings. I also use ChatGPT 3.5 and Phind (which uses ChatGPT-4 in the background) in many programming sessions. I often copy and paste snippets of my code into it and ask specific questions. I also let it analyze crash dumps, by copying the output of WinDBG into ChatGPT. It's an amazing tool.
8:04 - There was a technique called CORDIC that worked similarly; it was used in early hand calculators, and at least one company put it into a super-minicomputer with a 384-bit floating-point engine... (Abbott Labs loved it for molecular modelling...)
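For the curious, a rotation-mode CORDIC iteration uses only additions and multiplications by powers of two (shifts, in fixed-point hardware). This Python sketch trades a real calculator's fixed-point arithmetic for floats to keep it short:

```python
import math

def cordic_sincos(theta, iterations=32):
    """Rotation-mode CORDIC: approximate (sin, cos) for |theta| <= pi/2.
    Each step rotates by +/- atan(2**-i), steering the residual angle z
    toward zero; only adds and power-of-two scalings are needed."""
    angles = [math.atan(2.0 ** -i) for i in range(iterations)]
    # Pre-scale by the CORDIC gain so the result lands on the unit circle.
    k = 1.0
    for i in range(iterations):
        k /= math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = k, 0.0, theta
    for i, a in enumerate(angles):
        d = 1.0 if z >= 0.0 else -1.0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * a
    return y, x  # (sin(theta), cos(theta))
```

In a calculator the angle table is baked into ROM and the `2.0 ** -i` factors become right shifts, which is what made the method so cheap in hardware.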
ChatGPT is a wonderful companion
Thanks for the info about ChatGPT. Well explained!
The big issue with letting non-techies write software "visually" has always been the fact that, no matter how you slice it, in order to write software you have to understand the problem, so the same number of bits of information as are contained in the code you write must also be present in your head as you describe it.
However, this trickles down in a lot of unexpected ways - the textbook problem of "write a phone book software" requires you to make decisions on how the data is stored, for example, and write code to store the data and read it back.
All those choices are made with the developer's brain, because the end user doesn't care if you are using Oracle, JET, Active Directory, XML, JSON, and so on and so bloody forth.
If ChatGPT can abstract those questions away, and ask pertinent ones about what the user actually wants to do... I can see a lot of AI automation in the future of development.
Maybe we can use ChatGPT to fix Python's ABI hell. I've written a lot of code in a lot of languages over the last 47 years, and I can't decide which I despise more, Python or COBOL. At least with COBOL we didn't have ABI issues and "Oh, does this want 2.7 or 3.1?". Not to mention Python's error messages are the functional equivalent of FORTRAN's "Parameter of type parameter may not be of type parameter".
This is pretty crazy. It's amazing how far AI is coming along.
I'm not sure if I like AI or not yet.
We are heading down a dark path with AI. It can either turn out to be one of our greatest tools and accomplishments for human development, or we get "The Matrix" or "Terminator" future. I think the latter is the more likely end.
I wanted to use the command-line video tool ffmpeg to make a batch file that converts any drag-and-dropped video into a set format for 100% media-playback compatibility: basic MP4 with stereo audio. ChatGPT instantly spat out the arguments for the command line! Bloody awesome! Now I just drag any video onto the batch file and, with zero effort, a bit later the video is ready for playback on a potato. No need to load a video converter app, no need to browse to where the video is, no need to set the bitrate in said video editing app; it's all preset in my batch file.
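A script along those lines might look like the sketch below. The comment doesn't show the exact flags ChatGPT produced, so these (H.264 video, stereo AAC audio) are plausible stand-ins; the function only prints the ffmpeg command so you can review it before piping it to a shell:

```shell
#!/bin/sh
# Print an ffmpeg command that converts one input video to a
# widely-compatible MP4 (H.264 + stereo AAC). Flags are illustrative.
build_cmd() {
    in="$1"
    out="${in%.*}.mp4"
    printf 'ffmpeg -i "%s" -c:v libx264 -pix_fmt yuv420p -c:a aac -ac 2 "%s"\n' \
        "$in" "$out"
}

# Dragging a file onto a batch/shell script passes its path as $1.
if [ $# -ge 1 ]; then
    build_cmd "$1"
fi
```

To actually run the conversion, pipe the printed command to a shell, e.g. `./convert.sh clip.mkv | sh`.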
Loving the beard Dave, looking like an IT silver fox.
Great video once again
The part on the 6502 was the most interesting one.
6502 assembly sounds fun 😅
I never knew how the BBC Micro took off so well with the 6502 being so limited. My first machine that saw me through the 80s and 90s was the Texas Instruments TI-99/4A and the amazing TMS9900 microprocessor
The AI-loop idea just made me think of the later seasons of Person of Interest, where the "Machine" had to simulate all kinds of scenarios and strategies of what to do.
My first Dave's Garage Premiere.... C64 forever!!!
Fascinating. I wonder how far we are from the point where a circuit diagram of a piece of uC hardware can be given to it together with an output spec for it to produce a working program. In the meantime I prefer to write my own. The journey is most of the fun.....
Huh, the KIM-1: the first dev board I worked with, and developed a memory-board PCB for. Not sure why anybody would be interested in that now. I guess I'm just saying that I too am a member of the club! That was back when the signals were slow enough that you could easily get away with soldering a bunch of wires between two backplanes (like in Dave's video) and expect it to work!
I don't know much about coding but I made it use python to create a basic text based Final Fantasy style battle system with a character exp stat tree and variety of enemies and weapons and a weapon shop. It took several iterations in 3.5 but eventually we got it going and you just battle monsters over and over until you get enough exp to level up or gold to shop and continue battling until you die. I'm extremely impressed because I don't have the first clue how to actually code any of that.
hey, this time your code scrolls very smoothly, a lot better than in a previous video. much nicer to look at 👍
ChatGPT turns into the "my compiler is smarter than me" stories after several iterations of itself.
I hadn't been able to log in or create an account to use it, but now that I finally can: it can use FastLED to do custom RGB patterns. Makes me wanna get some RGB LEDs and an Arduino 😉
We have come a LONG way since the VIC-20, haven't we? Interesting that it can read the code you showed it; not even necessarily knowing the KIM-1, but just looking at your input, it found discrepancies. Very interesting, this ChatGPT phenomenon. I wonder how much computing power is behind that system.
I really didn't know that; thank you for telling me how to do it, Dave 😀
GPT-4 is incredible, and Vision is blowing my mind. Can't wait for OpenAI's developers' announcement in Nov.
We are on the cusp of something wildly big and world changing.
Great video. I have found the ChatGPT to be an invaluable rubber duck.