@4:11 lol, 200 MHz by 1999. In 1999 the Pentium III came out, with a max clock rate of 1.4 GHz. Though to be fair, perhaps comparing clock rates in 1989 to Pentiums is unfair, since the FSB on a Pentium III still ran at around 100 MHz.
@KoivuTheHab lol yep that was like mine... every time I pressed the turbo button it would go to 33 MHz... thought it was stupid and didn't think about it much until I was an adult.
The 386 was a potential step forward but as a computing platform at the time it lagged sorely behind the 286, MHz for MHz, because of CISC and how the 32-bit instruction set was implemented. Yes, 32-bit was going to be the way of the future, but at that time the mainstream was solidly 16-bit, which meant that the 386 was just a waste of money.
@@ryanyoder7573 I was a techie at that time (my first ever overclock was an Intel 8088 from 4.77 MHz to the dizzying heights of 6 MHz). I switched to a 286 and over the years replaced the version I had until I eventually ended up with a Harris 25 MHz 286. I kept that all the way through the 386 generation because for my main use case (for instance WordPerfect 5.1 for text processing) the 286 was BETTER than the AMD 386 computers I built for others (even though I told them they would be better off with a cheaper 286). It wasn't until the 486 that 32-bit started to become mainstream, and I built a 486 DX50 system for myself. The problem is that Intel put the 32-bit CISC instruction set ahead of the eight- and sixteen-bit instructions; thus, because software was overwhelmingly eight- and 16-bit, running those apps on a 386 meant that every instruction had to run the 32-bit instruction set gauntlet before reaching the instructions the CPU could execute. Windows for Workgroups 3.11 could run a 32-bit version, but that was buggy as hell compared to the 16-bit version.
I'm not an architecture expert but I suspect the cost/performance difference had more to do with the bus size differences and capabilities of the more advanced 386 causing it to require more die space compared to the mature 286 design. Bus speeds were interesting back in that era; some systems used 286s on the XT platform, which only used the 8-bit ISA bus instead of the 16-bit ISA bus the 286 was designed for. These devices even needed custom IDE implementations since that standard was never meant to be used without a 16-bit bus.
I’m pretty sure Jan is materially incorrect in the opening segment. A CGA video adapter requires a lot more processing by the CPU than VGA. So putting a VGA card in that XT would likely speed up the drawing of those golf game graphics.
As I understand it, this was only the case if the card supported some kind of graphical hardware processing. Such things were very limited and non-standardized, so it is difficult to generalize, but if we have to say that newer technologies enabled more functionality then that would probably be the case. Otherwise, the general idea is that these were all just frame buffers: the CPU had to compute everything and then just send it to the memory buffer on the graphics card for display. 2-D line draws were the common function, along with some others that could help 2-D games.
Way back in 1997 I had a PII running at 333 MHz on a 0.35 μm technology node. It had 32 MB of RAM. Today, we have multi-core supercomputers in our pockets that can render 4K HDR videos on a 5 nm technology node.
My first computer was also in 1997 -- a 200 MHz Sony VAIO, also with 32 megs of RAM and a whopping 3.4 gigabyte HDD. How will I ever use all that power and storage space, I wondered as a 17-year-old. Oh how the times change...
Wonderpierrot those processors couldn't be overclocked... well, the first Pentiums could, but only through hardware, not the BIOS. What you did was use duct tape to isolate the first and third pin on the processor, but only to the next speed grade: for instance a Pentium 100 could go to 120, or a Pentium 120 to 133 MHz, etc.
@@semiborba7047, and depending how far back you go in time, you wouldn't want to overclock. Earlier OS's and software adjusted their timing based on the processor clock speed. You could potentially go higher, but it may break the software, even if the hardware could handle it.
@@fcex558 Even then, it could only do so much. Even when you lower clock speed, bus speed, and disable caches, the system may still be too fast. I remember in the 386 I used as a kid, even some really old games were still too fast if I had the turbo switch in "slow mode".
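As a rough illustration of that point (my sketch, not from the comments above; the loop counts are made-up numbers), a lot of DOS-era software paced itself with busy-wait loops hand-tuned to one CPU, so a faster or overclocked machine made everything run too fast:

#include <stdio.h>

/* The common pattern: a fixed busy-wait tuned by feel for the machine the
   programmer owned (say, a 4.77 MHz 8088). Nothing asks the hardware how
   fast it actually runs. */
static void frame_delay(void)
{
    volatile long i;
    for (i = 0; i < 20000L; i++)      /* "feels right" on the original CPU...            */
        ;                             /* ...finishes several times sooner on a 33 MHz 386 */
}

int main(void)
{
    int frame;
    for (frame = 0; frame < 10; frame++) {
        /* ...update game state, draw the screen... */
        frame_delay();                /* game speed is now tied to CPU speed */
    }
    puts("done");
    return 0;
}

The later fix was to pace against a real timer (the BIOS tick or the 8253/8254 PIT) instead of counting CPU loops, which is also roughly why turbo buttons and "slow mode" switches existed.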
You clearly didn’t have any experience with this era of computing. Now you can overclock an unlocked chip by clicking around in your bios and having a whole pile of settings auto adjust based on what you selected - things like bus speeds, timings, and multipliers. None of this existed back then, and especially without multipliers, this was an extremely difficult thing to do manually.
Lol, some people want to travel back in time to see dinosaurs, assassinate dictators, or meet their heroes, I just want to present a 2020 laptop and smartphone on this show and bring some poor Tandy marketing guy to tears
curiosity and interest, sure. but without the manufacturing, materials, physics, and other advancements to go with it, you may as well hand them schematics for a working tricorder, for all the practical good it would do them. Even reverse engineering many of the constructs they would see would likely take them nearly as long as developing them as they did in this timeline. Or maybe there was a Henry Starling type who stole some 29th century technology back in the 60s and created a temporal causality loop...
Ahhh, the times when even the next Office version could slow a PC down, forcing users to buy a new one every two years or so on a Moore's law cycle. Nah, I don't miss it.
+jaymorpheus11 Apple computers at the end of the 90s used a different architecture to Windows PCs; a 350 MHz iMac could beat a Pentium II of twice the speed.
it really is fascinating looking back at how much the culture of home computing changed. Computing ubiquity now means complete idiots use powerful machines with little idea of their inner functions, and they look back at this period like everyone was dumb and would be awestruck by the modern tools. I think they would be struck more by the culture: would you rather be playing games 24/7, or be the guy viewing the sales figures of the games? Think about it.
22:35 Guy was just about to tell us he'd invented AI but they cut to commercial instead, so he was like, ok, fuck y'all, I won't tell you how to do it, figure it out yourselves.
Consider that it was INTEL talking; the real innovators in the CPU world through the 90's - mid 2000's were AMD. Clock for clock AMD won hands down, and on price too. INTEL just had, and still has, deeper pockets to grease the PC makers to use their chips.
TheRealWitblitz O GOD don't get me started on VIA, the only thing they have ever got right has been onboard audio. I have actually owned one of the VIA PC1 C7 boards, and I was always having trouble with it either being slow as dog shit, or driver hell. Same with the Everex laptops that had the Mobile C7 chipsets: I bought one for my niece, and one for my g/f at the time, as I was able to get them for slightly over $500 from Walmart, and just about every week I was having to do tech support on those damn things, usually due to something VIA screwed up with a driver update through Windows. VIA also has about the worst Linux support I have ever seen from a company. I'm amazed they are still in business.
Commodorefan64 Many of the big innovations back in the 1990s to mid 2000s, such as copper interconnects and high speed caches, were made by RISC CPU vendors such as IBM, DEC and SGI.
Only almost a hundred times faster? Seriously? You must have a very, very weak computer. 486 processors were about 10 megaflops (able to run Doom), while in 2010 there were already 100 gigaflops CPUs (i7-980 for example). Current consumer desktop CPUs are up to 800 gigaflops, which is 80 000x faster than 486. PS5 has a million times faster GPU than the 486.
Monkey Robots Inc. Totally. Most people in these positions today are sales people who read a few lines off Wikipedia and do their best to convince everyone they are experts. You just know by the way these folks are talking that they are knowledgeable and passionate about what they are talking about.
If I had a time machine, one of the things I would have done was go back to when this video was made and walk in with an iPad Pro and see everyone's response.
Chips today have transistors so small, that if they go much smaller (more/smaller transistors the faster the CPU) quantum tunneling happens and the electrons jump the gates.
A 386 at 33 MHz is a great machine for video editing. A 5-minute-long video was rendered in just one week. What an insane speed!
MP-Tuners Productions what if it wasn't HD video?
I remember making vids with Movie Maker on a shit PC, it took 1-2 hours for a ~5 min video
192 MB RAM 800 MHz AMD CPU
I used a Commodore 64 from 1986-1992 (1 MHz, 64 KB of RAM) and then in 1992 my father bought a 386 DX 40 with 4 MB RAM ...it felt like a supercomputer in my house ....and I received an Amiga the same year.
Yes, with 275,000 transistors screaming without a heatsink.
@@willptech7565 In those days... well... a few years later... my standard output was 640x480... for CDi productions.
10 years after this episode aired, Intel had surpassed the 200 MHz target that Bill Rash stated. By 1999, it was well over 1 GHz. Amazing!
Yeah! I was just thinking that! They were at 1.7Ghz pentium
Level of competition from AMD and Cyrix pushed that ceiling upwards quite a bit with what is more commonly known as the 586 processor lineup.
@@situatedillness Nope, mid-1999 is 1 GHz.
On November 20, 2000, Intel released the Willamette-based Pentium 4 clocked at 1.4 and 1.5 GHz.
In April 2001 a 1.7 GHz Pentium 4 was launched.
Some people were still using 200MHz CPUs I suppose, but things did accelerate faster.
@@valenrn8657 Right? I thought the Pentium 3 was '99, and they were not yet at a GHz.
Lol, I remember watching this in 1989 and thinking about how far we had come. Now I'm watching this on YouTube on my cell phone, which makes these desktops look like calculators.
1989, eh?
"In 10 years from now, we expect to be shipping processors in the range of 200MHz" lol well the end of 1999 to early 2000 they got up to 1GHZ. I guess they beat their own estimates :)
You mean AMD did.
@@umeng2002 Yes, but they both did as well.
Found the fanboy! Lol
Intel was a bit off, but they also believed that they would be putting out 10ghz machines by 2010. The predictions and reactions are comical now, but the megahertz wars were never expected to end. web.archive.org/web/20200304044703/www.geek.com/chips/intel-predicts-10ghz-chips-by-2011-564808/
I worked at a place that built custom computers from 1997-1999, and I remember getting shipments of the 333MHz and 366MHz Pentium 2s, and the workers got excited about it. We may have got up to 400MHz when I was there - my memory is a bit fuzzy on that. We never got 1GHz when I was at that job. That’s not to say they didn’t exist for higher end computers.
I will always fondly remember this era in personal computing. So many small vendors adding their own special upgrade, their own flair on something new.
Well, megahertz were not the only thing that gave performance back then. The faster system bus a CPU had affected performance greatly, though many did not know this back then. The faster bus made a huge difference in computer speed and performance, so Pentiums with a 50 MHz bus were the worst, 60 MHz pretty good, and 66 MHz were the best performing computers back then for the Pentiums. True fact.
Freedom
Hahaha, I do believe I am the first person to overclock a 286/386. I'll never forget when I was a young engineer working in the radar field, I was responsible for what was called the Built In Test Equipment (BITE). I was able to get this running faster by combining 2 speeds together (5 MHz to 10 MHz) and the data stream was twice what it was. I thought my 386 could benefit from an increase in clock speed, so I looked up the schematic of my motherboard and saw that the CPU was running at 16 MHz. I called a vendor to order a 24 MHz crystal (xtal) and the guy asked me why I wanted to buy that. He told me he only sells to OEMs. I told him I wanted to test the idea of overclocking the CPU. He said he didn't think that would work. I looked at the bus for the I/O and the CPU and didn't see why it would not work. Lo and behold, I got the xtal and installed it in my motherboard. It was just plug and play. I booted my system, brought up MS Flight Simulator and wow, I saw a much better runtime in the graphics performance (remember though, this is like 1989 or 1990). I called the guy and let him know it worked. He asked me to send him instructions on how it was done and what adjustments I made with the clocks. I told him I did not set any wait states above 0. It was just remove and replace. I don't know how many years later, but I started seeing ads of vendors talking about overclocking. If I'm not the first, I was certainly in at the beginning of all of that craze. Fun stuff!!!!!
I had a new crystal for my IBM XT compatible. I think we took it to 8 MHz.
I had a 486DX40 - that soon became a DX50 as I changed the crystal on the mobo :)
Still remember our first family PC: a Gateway 2000 with a 386DX at 33 MHz, 4 MB RAM and a 35 MB HDD. Got a sound card for Christmas to play Wing Commander 2 with the voice add-on pack.
Remember driving with my Grandpa up to Sioux City to the farmhouse to get his first gateway 386 with pretty much the same specs.
What blows me away seeing this video is how much thinner glasses have become in just 35 years 22:40
Haha 😂
While Gary was irreplaceable on the show, Jan was great as well. She was very charismatic and personable, and didn't sound like she was just reading a script.
The problem is how can you take any advice from a guy that blew off IBM and the chance to make billions? While he was a smart guy for sure, shooting yourself in the foot makes you look really stupid.
@@AcydDrop IBM offered Kildall a one-time payment of $250,000. That wouldn't have ever made Kildall billions. Gates got the same deal. Gates did not become a billionaire off of IBM royalties, he never got an IBM royalties deal either.
It wasn't until COMPAQ came on the scene and Gates could sell DOS to the IBM clone market that he became a billionaire.
@@AcydDrop Gotta remember that hindsights's a bitch man.
@@AcydDrop Jan was one of the managers at Xerox PARC who saw no market for their products and basically let the rest of the industry (Apple and Jobs in particular) walk out the front door with it. Gary wasn't the only one who fucked up.
@@medes5597 Funny how you're all ignoring the fact that they are showing an XT clone and a 386SX and babbling about voice recognition and AI... and the demo for them is a game without any 3D graphics in it...
In 89 I was still pushing a couple of 8 MHz 68000 machines:
The 1040STe, which was new at the time, and an Amiga 500 with a trapdoor RAM expansion.
When I went to a 100 MHz Pentium in 96, I couldn't believe my eyes.
OH MY GOD! We had ONE golf game on a floppy we played up until 2003. The whole family would play it. And this is it! That feeling of accidentally stumbling over "That" game!
It was used as a demo of the 386's performance over the old CPU speed.
33MHz was blazing fast around 1990. My 386 had a "turbo" button to slow it down to 8MHz for when running old applications that ran too fast to be usable
4:09 "We still have a long way to go in the speeds of our microprocessors. Today, we're at 33 MHz is the maximum speed of a microprocessor that Intel manufactures. Ten years from now, we expect to be shipping microprocessors that operate at speeds up in the range of 200 MHz." This was 1990. Little did he know! Ten years later, Intel was selling Pentium III processors running nearly 1 GHz.
I totally loved the P3, especially the Tualatin core. The P4 was a big disappointment. My 1.4 GHz P3 Tualatin was faster than the 1.8 GHz P4 I replaced it with. :(
The guy from Intel expecting 200Mhz clock speeds by 99. The first Ghz CPU was released by AMD in 99.
That's about right, Intel was at 200 MHz then LOL
@@paulmichaelfreedman8334 Hey be nice. Insulting companies with disabilities isn't dignified.
The Athlon 1 GHz was released on March 6th, 2000, but fair enough. AMD beat Intel by 2 days, as the P3 1 GHz came out on March 8th, 2000.
Intel was up to 200mhz in 1995
I am still using the 16 MHz computer. What do you guys think should I upgrade it to the 25 MHz version?
This is how I get updated to the newest technologies. In the future we will probably be watching this show in small computers in the palm of our hands
I'm from the future and can confirm this.
These videos are so nostalgic and relaxing. Reminds me of a simpler time I barely got to experience.
yeah well it's fun to watch now and say hmm, so that's what was going on back then
4:14 That guy was spot on. In 1989 he said in a decade there would be 200 MHz processors; in 1997 I got my very first computer, a Gateway 200 MHz MMX technology computer.
Not quite spot on. It actually came quicker than he thought since this was filmed in 1989. The first 200 MHz processor was released in 1996, which was just 7 years after this was filmed.
#timetraveler
@@pauldavis5665 yup, in 1999 we had 400Mhz processors so 12x and not 6x
@@pauldavis5665 Good effort though, most projections are not this close.
Something is wrong, I bought my first PC with a 1Ghz processor in 2000. I'm pretty sure the year before, they were at 500Mhz to 750Mhz.
The first computer I ever used was my older brother's VIC-20. I was little and he had some old education games on it he'd boot up for me. Then the first computer we got that was specifically mine was a second-hand setup of an Amiga 1000 with expanded RAM, a discarded Apple color monitor that occasionally needed smacked on the side to make the color come in, and I used my old Panasonic boombox for the audio-out. Damn good sound too since it had a 5-band equalizer. I loved that thing.
Ignorant young me never made a backup of the Amiga Kickstart diskette so I couldn't use it anymore eventually when it went bad. Then my brother cannibalized it for my games and peripherals around 1994 and I didn't use a computer again til 1996 when I finally got a standard PC with Windows 95.
A couple years ago I bought a near-mint Amiga 500 from a family friend and got back almost every game I used to have and then some. That's just a part of my childhood I can't leave in the past. I love the Amiga.
I remember those days, there was no public internet. There were bulletin boards that you could access with a dial up modem by dialing a specific phone number over a land line. You could find the phone numbers literally on physical bulletin boards on the walls in the places that sold you the computers. Some of the computer BBSes were hosted by distant places like universities, and some were just local users much like yourself. People tended to network locally to avoid long distance charges, and people tended to restrict modem usage to off hours so as not to tie up their phone lines, or add a second line just for their modem.
Computer enthusiasts used to buy computer mags not only to read the latest news and see the latest gear, but because they contained line after line of computer code (BASIC) that people could enter into their computer, to make it perform some task that is mundane by today's standards. If you were lucky enough to be able to afford it, you could back your programs up on a 5 1/4" floppy drive or an audio cassette. If not, everything you entered disappeared the moment you turned your computer off; there were no internal hard drives. It was a big thing at the time when computer manufacturers decided to add floppy drives.
VIC20, now that's a name I've not heard for a long while. Loved your montage to the early 80s. You totally nailed it :) @@pR1mal.
That must be the same Neil Rubenking that wrote the "Pianoman" music program, which could produce pseudo-polyphonic music from the IBM PC's internal speaker.
Back in the old days a spreadsheet could slow your computer down to a crawl. Imagine that one, lol.
02:20 And 34 years after this was aired, we still don't have AI (and no, ChatGPT isn't AI, it is just an excellent interpreter over a database), but we have gotten close to good voice recognition in the last 10 years. Still, we have no good use for it, as we are still limited to the same realities as in the 60's when it was promoted - there is no use for voice commands.
Then there was the gigahertz race, and now in 2019 we have the multi-core wars.
Now it's just waiting for any program that utilizes more cores.
I have not run into anything yet that my old i5 four core can not handle with a GTX 580.
and moores law is now dead to so waa waa waaaaaaaaaaaaaaaa
Why do chips have standard sizes? Can't we start building processors that are rectangular? Or even polygonal?
Same with video cards: my 9 year old GTX 580 video card runs any new game. I used to upgrade video cards every two years.
@@adityasumanth6122 Technically yes, because if you go back in time far enough, CPUs were at one time made rectangular. We could make them circular too if we wanted to, but it's up to the manufacturer to choose the shape, not you, unfortunately. Look up the 6502 for an example; it's made both as a square and a rectangular CPU depending on which model you order.
Stew knew how to keep the show moving ahead. He does not waste a single second of air time! What an alpha male & rock boss.
Don't copy that floppy!
Put your floppy away
@@annother3350 No you.
Now the sad thing, is that it's hard to find something that could still *read* those floppies, let alone copy them. Anyone that did copy them though, still has the software. The legit users got screwed.
@@ZeroHourProductions407 I have a USB 3.5" Diskette Drive found on Amazon. Works good. Not for 5.25" floppies though.
Always best to make a hard copy of the floppy
The computer is still working on finishing drawing the Links 386 scene to this very day. Legend has it, it'll be halfway done by next century.
😂I loved that game for real though
We need a similar TV show in today's age, but instead focusing on the fast-moving area of embedded technology: microcontrollers and single-board computers.
A tv show all about UCs? bruh
There's th-cam.com/users/explainingcomputersvideos
It's not the same, but it's something.
Can't have a TV show anymore. Look where you're voicing your concern. YouTube channels are the new era of shows. It's all about finding what you like. There has to be someone making videos about it.
It's here on YouTube. Many channels, including channels dedicated to the old hardware. It's all on here.
Even on YouTube only a few hundred people would watch that, unless it's Raspberry Pi or an FPGA for emulating old games.
Boy, that brings back memories; my first computer was a 10 MHz 8088. Bought from an ad in Computer Shopper, a rag that could have been used in place of concrete building blocks.
I used to advertise in Computer Shopper. But I never sold any 8088 systems so I'm off the hook lol.
I had an 8 MHz Amstrad PC 1640 ECD with a 30 MB HD.
A very high spec 80s PC running MS-DOS 3.2 and GEM Desktop, which had no 3rd party programs to support it.
I did have 640K and an EGA screen so some games looked amazing.
There was a game Blood Money by Psygnosis that played amazingly well and looked amazing.
Also old dos games like Road Wars that were totally engaging.
Ancient dos games like SW.exe Space wars. A 2 person keyboard based game that was endlessly entertaining and was made in the 60s to run on mainframes and ported to the 8086.
This computer opened up a legacy of computing that my spectrum was never able to do.
I learnt to program on the spectrum but I learned so much more from my lowly Amstrad pc 1640.
It got me through my first year of University as well. Turbo C from Borland worked ok on it.
Took a while to compile but it worked.
Honestly I remember this computer very very fondly.
It never let me down.
It was slow at times when I asked more of it than it was ever designed for, but it always came through.
Amstrad always gets a bad rep, but I'd argue the 1512 and the 1640 were like the Spectrum in terms of introducing the youth to contemporary computing.
The 1640 in particular could be specced to a really decent standard with EGA graphics and 30MB HD.
It had a mouse as standard too.
I ended up fitting an 8087 to it too.
No idea why. Didn't need it just got one and felt like fitting it.
Fractint was my favourite program outside of games.
It made nothing of the 87 because it used Ints only.
I spent many an hour zooming in on Mandelbrots and reveling in the psychedelic results.
Luv and Peace.
It's so weird.. that group looks more like a meeting of tax auditors than a computer show.
It may say something that in later years of the series, Cheifet ditched the jacket and tie.
In the 80's, computers were the realm of the corporate world. With the mass adoption of the internet and multimedia by the mid 90's, computers lost their business suit and tie.
Tax auditor is spot on for computer users in the 80s. They were just representing their viewing audience.
Having been born in '95, I missed a lot of the earlier stuff like this, but it's amazing to see CPUs that are so low power that they don't need any cooling other than passive airflow.
yeah well that was before they figured out that to get more speed they had to crank the power up a lot, which means more heat in the chip
You can still do that today but... what's the point? I mean sure, phones are passively cooled, as are some laptops. But why passively cool a desktop CPU? There are very few reasons to.
@@t1e6x12 Yeah, it's like how having a very loud turbo on your car & revving the engine down a quiet residential neighbourhood is a guaranteed muff-magnet, but it's nothing compared to the panty-dropping power from the sound of an aftermarket cooler on your overclocked dual-core Pentium. 😉
I remember this episode airing and thinking 33MHz was blazing, how much more were we going to really need. 80386DX-33 was also pretty wicked fast with DOS type programs. You could fly through those text prompt type menus.
Then Duke Nukem happened, and it wasn't fun...
I am daily driving a 486SX-33, and so many games choke.... Still, I can see how this was an improvement over an ST.
I like the way that The Computer Chronicles shows me all the computers I missed before, as a result of missing lots of job opportunities. To all those who chose not to employ me: I'm doing fine now, because I am typing on a Windows computer that has an Intel Core processor, with the latest business productivity programs, a color laser printer, and a professional label tape printer. Now you wish you could hire me, but I am doing fine.
1:50 that woman is wrong! If the 8088 had been made to run in VGA mode, it would have in fact been faster. CGA graphics are pixel-packed, so there is a lot of bit-shifting, AND'ing, OR'ing, etc. to actually put the CGA graphics into video memory, whereas VGA in 320x200x8 mode is a single byte per pixel. No bit-fiddling required.
Yes, you can simply go
mov ax, 0a000h           ; segment of VGA memory in mode 13h
mov es, ax               ; ES:DI = destination (the screen)
mov ax, [buffer segment] ; segment of the back buffer
mov ds, ax               ; DS:SI = source (the back buffer)
mov si, [buffer offset]
xor di, di               ; DI = 0, top-left of screen memory
mov cx, 32000            ; 32000 words = 64000 bytes = 320*200 pixels
rep movsw                ; copy the whole frame in one instruction
to fill the whole screen from a background buffer in VGA mode 0x13.
In CGA you need to take each pixel, shift it around and OR it into the screen if you don't have the backbuffer set up, which makes blitting to the backbuffer a pain in the you know what.
I did program a lot of graphics stuff back in the day - and mode 13h was the easiest to work with of them all, also the most performant one, because the loop and the extra instructions needed to write to a CGA-compliant backbuffer had to be fetched byte by byte on an 8088 (or word by word on an 8086).
That more than made up for the 4x larger screen memory to be transferred. CGA was really one of the worst things ever invented.
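To make the difference concrete, here is a rough C sketch of a single pixel write in each mode (my illustration, not part of the comment above; it assumes a 16-bit real-mode DOS compiler with Borland-style far pointers and MK_FP from <dos.h>):

#include <dos.h>

/* VGA mode 13h: 320x200, one byte per pixel, linear framebuffer at segment A000h. */
void putpixel_mode13h(int x, int y, unsigned char color)
{
    unsigned char far *vram = (unsigned char far *) MK_FP(0xA000, 0);
    unsigned int offset = (unsigned int) y * 320 + (unsigned int) x;
    vram[offset] = color;                      /* a single byte store, no masking */
}

/* CGA 320x200 4-color mode: 2 bits per pixel, 4 pixels per byte, 80 bytes per row,
   even scanlines at offset 0 and odd scanlines at offset 0x2000, segment B800h. */
void putpixel_cga(int x, int y, unsigned char color)
{
    unsigned char far *vram = (unsigned char far *) MK_FP(0xB800, 0);
    unsigned int offset = (unsigned int)(y >> 1) * 80 + (x >> 2) + ((y & 1) ? 0x2000 : 0);
    int shift = (3 - (x & 3)) * 2;             /* leftmost pixel sits in the high bits */
    unsigned char mask = (unsigned char)(3 << shift);
    vram[offset] = (unsigned char)((vram[offset] & ~mask) | ((color & 3) << shift));
}

The mode 13h path is one store per pixel; the CGA path is a read-modify-write with shifts and masks for every pixel, which is exactly the bit-fiddling being described, and it is why the rep movsw trick above only works so neatly in mode 13h.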
But in text mode she is right
@referral madness
x86 assembler. I think the example would work on even an 8088.
However, it requires a VGA or an MCGA card in graphics mode 13h (320x200 @ 8 bits color depth).
@referral madness
So many... started out in Pascal in the late 80s, later x86 assembler and C/C++, then C#/F# and Haskell, now I'm into Python and Kotlin a lot.
@referral madness
Learning more languages is always an advantage. When I learned Haskell, it made me a much better C# developer.
20:11
"so what is the tradeoff for this tsr program that speeds up the disk?"
"no tradeoff at all it just makes it faster"
no RAM used, huh?
half-scam in plain daylight aired on PBS
$130 ($330 today)
2:45 "where will it all end?" xD it won't end until we master quantum computing ^^
When I switched over from my Atari 800 with DOS 2.0s to my Pentium MMX 233 MHz with Windows 95 I noticed that there was a vast speed difference. Of course it was so obvious that there would be a speed difference between the two computers.
Maria Gabriel always says her name funny.. she pauses in the middle of her last name every... single.. time...
It's like, you have to wonder how that nuance happened, as it sounds like she just stops wanting to talk for an instant, then finishes up saying her name.
I know it sounds silly to bring up but really, once you notice it, you can't un-notice it. :\
Yup. Bugs me each time. Maybe that's really how you say it? Seems really weird though.
thomasg86
Well, pronunciation is cultural and subjective - I've heard quite a few varied pronunciations of my real last name, some that make sense and some that are totally fubar. Who knows.
+Kurisu Yamato I'm Maria gape real. Now you can't unhear that. :P
+Harald Segebrecht lmao I actually can't unhear it, for realz
Well maybe she was very naughty and did this on purpose. :D
I am on YouTube on a PC with an AMD Ryzen 7 5700X running at 3700 MHz. My Amiga A1200 has got a 68060 at 50 MHz. Things have come a long way. My first PC was a 200 MHz MMX Socket 7 CPU.
Wow, in 88/89 I remember only being able to afford a turbo XT machine. A 386 was something very special; very envious of someone who had one - often we had parties held when someone was unboxing one lol. They were wicked fast though. I still can't recall so much of a visual massive speed difference just from one jump in generation of chip (the 286 didn't count since it sucked). The 386 - 486 jump wasn't as visually impressive; even though I know the 486 had a lot of processing power, the software demands increased with it so it pretty much leveled out.
wait wait wait no 386 40 mhz machine what a jip🤣🤣🤣
Computers were very expensive in the late 1980s. A 386 SX-16 MHz computer with a monitor, keyboard, 1 MB of RAM, a 40 MB HDD, and a dot matrix monitor cost about $2400.
A Microsoft mouse cost me about $100 in 1990!
I scrapped a Microsoft mouse from 1990 and got 13 cents worth of silver.
***** The only thing that was lacking in the Amiga 500 was the 8 bit sound. Other than that, it had great graphics. The animation on the games was choppy on flight simulators.
Ace1000ks1975 dot matrix monitors would have been the shit, if they existed >_>
statikreg Some offices still use them.
Ace1000ks1975 Think about what your'e saying, and then try again.
Hint: there is no such thing as a "dot matrix monitor".
Gary Kildall was the best. I found out recently that he died in 1994, so sad... I wish I could've met him.
Gary was a great regular guy
The trick IBM played on him was disgraceful. But they got their comeuppance. They totally lost control of the Market, and at least Gary lived long enough to see it happen.
26:35: Czechoslovakia, Yugoslavia, East Germany, Soviet Union... none of these countries exist anymore.
"Where will it all end". 31 years later and its still going on.
18:57 - Where is his little black hat?
This brings back so many memories. Back then computers were cool.
THAT SUPPOSED SPEED INCREASE FROM 286 TO 386 IS BOGUS. The 286 is slightly faster clock for clock than the 386SX. The idea that the 386SX was much faster than the 286 caused me to buy a 386-16 MHz machine. Let me tell you, that was the worst £650 I ever wasted! If you watch carefully, you'll see how this amazing 500% speed gain was achieved. The 'Standard' AT contains a 286 CPU. But his upgrade card contains a 386 AND a 387 FPU. Little known fact, but the 387SX FPU is actually a 287 FPU. In other words, you would have been better off speed-wise buying a 287 FPU for the standard AT. And it'll cost you half as much.
Listening to this in 2020. Always fun to see how much tech is changing :)
Listening to this in 2024..
23:24 I reckon Commodore will have a hit on their hands there and ensure the future of the company ....
Damn!!! The lady talked about Artificial Intelligence on a 33 MHz processor. It seems they were working on the same thing back in 1986.
It was all about expert systems back then as the gold standard. Now we know better that this application works only in very limited situations.
Skynet!
It was Clippy from Microsoft Office. He could tell if you were writing a letter and suggest formatting for you.
@@Martian74 Oh, yes. But Mr. Clippy was buggy too. Most people used to deactivate it just to have a glitch-free operation.
@@Martian74 Mr. Clippy seemed to be just quietly swept under the rug without people noticing he was gone.
In 1998 I got a PC with a PII celeron 333MHz and that was mid range :)
Cpu speeds reached its apex in 2008 and 2009. My I7-960 which is a 2009 cpu is still being used by me. It is slightly slower than my I5-3570k which I got in late 2013. You can do all the office work with that I7-960.
I have a AMD Phenom II 945 which I got in 2009, and it is being used in my office. It is still a pretty fast computer for a office computer, and it is more than adequate for office use.
From 1987 to 1996 there were big leaps in CPU technology. We had the 286 at 8 MHz, 286 at 12 MHz, 386SX-16, 386DX-33, 386SX-20, 386SX-25, 386SX-33, 486DX-25, 486DX-33, 486DX2-66, 486DX4-75, 486DX4-100, Pentium 60, Pentium 75, Pentium 90, Pentium 100, Pentium 120, Pentium 133, Pentium MMX 166, and Pentium 200.
From 2001 to 2014, there hasn't been a comparable leap in CPU technology. In 2001 I had a Pentium 4 at 1800 MHz, and in 2014 I have an i7-4790 at 3600 MHz. In that 13-year period the core clock only doubled; instead, the CPU manufacturers increased the number of cores to 4. The performance difference between an i7-4790 and a 1800 MHz Pentium 4 is about 6 to 8 times.
The CPU performance improvement from 1987 to 1996 was more like 16 to 20 times. CPU technology reached its zenith with transistor scaling: the more transistors they add, the bigger the CPU gets and the hotter it runs. That is the problem engineers are facing now.
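A quick back-of-the-envelope check on those ratios (an illustrative Python sketch; the MHz figures and the 6-8x / 16-20x estimates are simply the ones quoted above, treated as approximations, not benchmarks):

# Rough speedup estimates from the figures quoted in the comment above.
# Clock ratio alone ignores IPC, caches, and memory, so treat these as
# illustrations of the arithmetic, not as real benchmark numbers.

def clock_ratio(old_mhz: float, new_mhz: float) -> float:
    return new_mhz / old_mhz

# 1987 -> 1996: a 286 at 8 MHz vs a Pentium at 200 MHz
print(clock_ratio(8, 200))              # 25x by clock alone; ~16-20x real-world per the comment

# 2001 -> 2014: Pentium 4 at 1800 MHz vs i7-4790 at 3600 MHz with 4 cores
clock_gain = clock_ratio(1800, 3600)    # 2x
core_gain = 4                           # 4 cores vs 1
print(clock_gain * core_gain)           # 8x as a crude upper bound; ~6-8x quoted above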
I agree with what you're saying; however, at what point do we hit the cap? I mean, I'm sure there is a point where faster processing is only needed by the most hoggish software. Software, if anything, should be able to do more with less processing power unless they're purposefully making things hoggish; in the next 5-10 years that cap should be met for 99% of users. Unless you're the NSA having to extrapolate key words from all the surveillance they're collecting.
But for home users and even gamers there is a plateau. Same for monitors and GPUs: there's a point where the eye can't process any more detail, even though the hardware can produce more than the eye can handle.
Christopher Snow I think we already hit the cap in silicon transistor technology; however, RISC architectures could be used to improve performance with CPUs that need fewer transistors. A good example would be the ARM processors in smartphones: they use RISC designs to execute more than one instruction per cycle. As a result, you have a CPU with 1/10 to 1/20 the number of transistors of a CISC processor. It would work for software that doesn't require real-number calculations, i.e. numbers with decimal points. A RISC processor would not work for CAD programs, flight simulators, or any kind of application that requires floating-point calculations.
ARM and other RISC processors are limited in what they can do; this is why we need desktops with powerful processors and GPUs.
Ace1000ks1975 Oh I'm sure in that regard, but the real question is: for the average consumer, even a gamer, how much more processing power is actually needed? We can go from 4K to 8K monitors, but at what point does the eye not notice the difference between the two? Or in gaming, at what point is CPU/GPU processing effectively capped? I think we're getting to the endgame on all "common" user needs. The only people who will need faster and faster processing power are data centers, major corporations, or military cyber-security applications.
I know it's been said before, but this time I really do think we're hitting a hard cap. If anything, networking, wireless and otherwise, is still in need of speed; the US seriously lags the world when it comes to internet speeds and thus home networking.
Christopher Snow We need to transition from transistor-based electronics to optical electronics to break the bottleneck. Light moves much faster than electrons. It would be like transitioning from piston engines to jet engines: the fastest piston engines could propel an aircraft to 450-500 mph, while a jet engine can push an aircraft past 1000 mph, up to 1900 mph.
For transistor-based electronics, they will have to come up with better cooling systems, like liquid cooling, to clock CPUs and GPUs at higher frequencies without burning them out.
For those who can afford it, they could make motherboards with multiple CPUs; that would address the speed problem. A board like that with 3 to 4 CPUs would be out of reach for most users, costing at least $6000.
Ace1000ks1975 What I'm saying is, games are right on the verge of having ALL the processing power they will ever need for both GPUs and CPUs. The only thing that will extend needs beyond the point we're already at is virtual reality simulators.
@4:11 lol, 200 MHz by 1999. The Pentium III came out in 1999 and eventually topped out at a 1.4 GHz clock rate.
Though to be fair, perhaps comparing clock rates in 1989 to Pentiums is unfair, since the FSB on a Pentium III still ran at around 100 MHz.
You are a bit too fast. The first ever 1 GHz CPU was the Athlon, introduced March 5th, 2000. Intel was 3 days slower.
1.4 GHz was first reached in 2001.
33 megahertz - damn, that's fast!
You could have 2D characters of any size you want with that. Imagine the fighting games.
If u hit the turbo button it goes to 66 mhz
@KoivuTheHab lol yep that was like mine... every time i pressed the turbo button it would go to 33 mhz.. thought it was stupid and didn't think about it much until i was an adult.
@@TheAnonapersons Nice!
There is more speed, but there are trade-offs.
200 MHz in the next decade? Insane! I guess the CPUs will get so hot they must be cooled down somehow.
Back when processors were rated in megahertz...
@SteelRodent My i5-9600k runs at 5000 MHz
and most of us have 16 *million* kilobytes of RAM!
@SteelRodent mine 600mhz single core
@@RWL2012 i have 256mb
@SteelRodent nice!
My cpu is stable at 5.8ghz today. Crazy how the tech evolved
The 386 was a potential step forward but as a computing platform at the time it lagged sorely behind the 286, MHz for MHz, because of CISC and how the 32-bit instruction set was implemented.
Yes 32-bit was going to be the way of the future, but at that time mainstream was solidly 16-bit which meant that the 386 was just a waste of money.
? This is just not true
@@ryanyoder7573 I was a techie at that time (my first-ever overclock was an Intel 8088, from 4.77 MHz to the dizzying heights of 6 MHz). I switched to a 286 and over the years kept replacing the one I had until I eventually ended up with a Harris 25 MHz 286.
I kept that all the way through the 386 generation because, for my main use case (WordPerfect 5.1 for text processing, for instance), the 286 was BETTER than the AMD 386 computers I built for others (even though I told them they would be better off with a cheaper 286).
It wasn't until the 486 that 32-bit started to become mainstream, and I built a 486 DX-50 system for myself.
The problem is that Intel put the 32-bit CISC instruction set ahead of the 8- and 16-bit instructions. Because software was overwhelmingly 8- and 16-bit, running those apps on a 386 meant every instruction had to run the 32-bit instruction-set gauntlet before reaching the instructions the CPU could actually execute.
Windows for Workgroups 3.11 could run in a 32-bit mode, but that was buggy as hell compared to the 16-bit version.
I'm not an architecture expert, but I suspect the cost/performance difference had more to do with the bus-size differences and the extra capabilities of the more advanced 386 requiring more die space than the mature 286 design. Bus speeds were interesting back in that era: some systems used 286s on the XT platform, which only had the 8-bit ISA bus instead of the 16-bit AT (ISA) bus the 286 was designed for. These machines even needed custom IDE implementations, since that standard was never meant to be used without a 16-bit bus.
Lady was talking about AI 35 years ahead of its time.
The Hot Wheels track in the beginning would still be considered badass by today's standards. I want one for my nephews!
+AshtonColeman It is actually a "DARDA"-Track.
+tremorist Yup. Definitely had that back in the day, brought back memories.
God, those trees were drawn even when 100% occluded by trees closer to the camera!
"Desktop Publishing," haven't heard that term in a long time.
I remember having a 486 in college and my roommate only had a 386. Bragging rights, lol.
what about 386DX-40 vs 486SX-25/16...? :)
I’m pretty sure Jan is materially incorrect in the opening segment. A CGA video adapter requires a lot more processing by the CPU than VGA. So putting a VGA card in that XT would likely speed up the drawing of those golf game graphics.
As I understand it, this was only the case if the card supported some kind of graphics hardware acceleration. Such features were very limited and non-standardized, so it is difficult to generalize, but if we have to say that newer technologies enabled more functionality, then that would probably be the case. Otherwise, the general idea is that these cards were all just frame buffers: the CPU had to compute everything and then just send it to the memory buffer on the graphics card for display. 2D line draws were a common accelerated function, along with a few others that could help 2D games.
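To picture what "the CPU had to compute everything and then just send it to the frame buffer" means in practice, here is a minimal sketch (Python, purely illustrative and not tied to any real game or driver): the processor runs the line-drawing algorithm itself and only writes finished pixels into a flat buffer, which is roughly all a plain frame-buffer-style adapter of that era accepted.

# Minimal software-rasterizer sketch: the CPU computes every pixel of a line
# (Bresenham's algorithm) and writes it into a flat frame buffer, which is all
# a dumb frame-buffer-style video adapter would receive.

WIDTH, HEIGHT = 320, 200          # classic low-res mode dimensions, chosen for illustration
framebuffer = bytearray(WIDTH * HEIGHT)

def put_pixel(x: int, y: int, color: int) -> None:
    if 0 <= x < WIDTH and 0 <= y < HEIGHT:
        framebuffer[y * WIDTH + x] = color

def draw_line(x0: int, y0: int, x1: int, y1: int, color: int) -> None:
    """Bresenham line draw: all the arithmetic happens on the CPU."""
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        put_pixel(x0, y0, color)
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy

draw_line(0, 0, 319, 199, 15)     # CPU does all the work; the card just displays the buffer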
My general feeling with all these shows is that they are always in a hurry and sprint over the info... always way too hasty.
We went from this "Golf" game to PlayStation 1 in only 6 years.
So she mentioned AI in 1989. Ok.
Yeah... I didn't know *Adobe Illustrator* was a thing back then 😆
I think they mention AI in nearly every one of these shows.
@@EdwinvandenAkker LOLOLOL
@@dallas-cole artificial neural networks were first conceptualised in the 1940s, then implemented in the 1950s
Way back in 1997 I had a PII running at 333 MHz on a 0.35 μm technology node, with 32 MB of RAM. Today, we have multi-core supercomputers in our pockets that can render 4K HDR video on a 5 nm node.
My first computer was also in 1997 -- a 200Mhz Sony VAIO also with 32 megs of RAM and a whopping 3.4 gigabyte HDD. How will I ever use all that power and storage space, I wondered as a 17-year-old. Oh how the times change...
Great people of their era. Thank you for your work.
10 years later I had (and still do) my Athlon K6 at 650 MHz. What progress we had made.
"There's no way to increase the megahertz or the speed of which your computer executes instructions" - Sure there is, it's called overclocking.
Wonderpierrot, those processors couldn't be overclocked. The first Pentium could, but only through hardware, not the BIOS. What you do is use duct tape to isolate the first and third pins on the processor, but only up to the next speed grade; for instance, a Pentium 100 could go to 120, or a Pentium 120 to 133 MHz, etc.
@@semiborba7047, and depending on how far back you go in time, you wouldn't want to overclock at all. Earlier OSes and software adjusted their timing based on the processor clock speed. You could potentially go higher, but it might break the software even if the hardware could handle it.
@@fcex558 Even then, it could only do so much. Even when you lower clock speed, bus speed, and disable caches, the system may still be too fast. I remember in the 386 I used as a kid, even some really old games were still too fast if I had the turbo switch in "slow mode".
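A toy illustration of that clock-dependent timing problem (a hedged Python sketch of the general idea, not code from any actual DOS-era game): if a "delay" is just a fixed iteration count tuned for one CPU, a faster chip burns through it sooner and the whole program runs too fast.

# Toy example of clock-dependent timing: a delay implemented as a fixed
# iteration count instead of a real clock. The wall-clock time it takes
# depends entirely on how fast the CPU executes the loop.

import time

LOOPS_PER_TICK = 2_000_000   # hypothetical count tuned on the original machine

def busy_wait_tick() -> None:
    # Burns a fixed number of iterations; a faster CPU means a shorter real delay.
    for _ in range(LOOPS_PER_TICK):
        pass

start = time.perf_counter()
for frame in range(10):          # pretend these are 10 game frames
    busy_wait_tick()
print(f"10 'frames' took {time.perf_counter() - start:.3f} s on this CPU")

# The robust fix (then and now) is to pace against a real timer instead,
# e.g. sleep until the next frame deadline, so game speed no longer depends
# on the CPU clock.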
you would have had a tough time overclocking via front side bus on these chipsets. Core multipliers weren't a thing back then
You clearly didn't have any experience with this era of computing. Now you can overclock an unlocked chip by clicking around in your BIOS and having a whole pile of settings auto-adjust based on what you selected - things like bus speeds, timings, and multipliers. None of this existed back then, and especially without multipliers, this was an extremely difficult thing to do manually.
That intro looks like it was shot at Talbots Toyland in San Mateo.
Love to go back in time with something like a 28-core Xeon and go on this show with it, right after the guy with his 33 MHz chip.
Lol, some people want to travel back in time to see dinosaurs, assassinate dictators, or meet their heroes, I just want to present a 2020 laptop and smartphone on this show and bring some poor Tandy marketing guy to tears
Just like someone from 2100s would consider both absolutely obsolete and funny.
@@kvarnerinfoTV you are assuming there's been no large-scale disaster that has reduced humans to living in "caves" by then.
So let me just summarize Jan's "3 reasons why you want a faster processor":
it's too slow!
What would happen if you brought a blueprint of an R7 from the future to these design rooms. Imagine their curiosity and interest! Lol
curiosity and interest, sure. but without the manufacturing, materials, physics, and other advancements to go with it, you may as well hand them schematics for a working tricorder, for all the practical good it would do them. Even reverse engineering many of the constructs they would see would likely take them nearly as long as developing them as they did in this timeline. Or maybe there was a Henry Starling type who stole some 29th century technology back in the 60s and created a temporal causality loop...
"Copying of software is a federal offense."
*M'Kay...*
Ahhh, the times when even the next Office version could slow a PC down, forcing users to buy a new one every two years or so, on a Moore's-law cycle. Nah, I don't miss it.
Yeah, nowadays it takes a lot to bog down a modern computer, but it can still be done, thanks to scammers and their bloatware.
that 80286 upgrade to 80386 seems kinda odd... what about the crystal and 16-bit bus speed??
Maverick: "I feel the need ..."
Goose: "... the need for speed!"
10:48 can anyone tell me what type of computer case that is?? I had it as a kid and i don't remember much more about it
It's a tower case but they're just filming the top if that's what you're asking.
@@rowger8927 no, I know it's a tower. I just want to find one just like it; I had it as a kid and can't find a similar one.
By 1999 the iMac was 350-450 MHz and the PC was just approaching 1 GHz.
kek
+Nathan22587 ??
_Italics_
+PCgeek486 windows computers were more than twice as fast as apple, I guess.
+jaymorpheus11
Apple computers at the end of the '90s used a different architecture from Windows PCs;
a 350 MHz iMac could beat a Pentium II at twice the clock speed.
So many computers were sold back in the day with a 387 coprocessor as an optional add-on for an empty socket.
Maria Gabe---riel... it drives me crazy :)
Stop a GabeN Take Control Stop a GabeN
She's talking about AI and voice recognition, back when they weren't popular keywords.
0:55 don't copy that floppy.
Speaking of speed, am I the only one who thinks the hosts are trying to speed through their lines?
It really is fascinating looking back at how much the culture of home computing has changed. Computing ubiquity now means complete idiots use powerful machines with little idea of their inner workings, and they look back at this period as if everyone was dumb and would be awestruck by the modern tools. I think they would be struck more by the culture. Would you rather be the one playing games 24/7, or the guy reviewing the sales figures of the games? Think about it.
22:35 Guy was just about to tell us he'd invented AI but they cut to commercial instead, so he was like, ok, fuck y'all, I won't tell you how to do it, figure it out yourselves.
From 33 Mhz on one core in 1989 to 4000 MHz on 16 cores in 2019.
But is each core 4000 MHz, or are there 16 processors at 250 MHz...?
@@ens8502 Each core is 4000 MHz.
24:24
holy shoot. rotating CRT
12:18 I'm Maria Gayyybrreeulll _maria shuts down_
No one answered Jan's trade-off question. Surprised they didn't push on that point, as everyone claimed there were no trade-offs.
I love how he says they hope to reach 200 MHz by 1999... well, sir, in 1999 I had a 750 MHz AMD Athlon computer, so I hope you were happy! :)
Consider that it was INTEL talking. The real innovators in the CPU world through the '90s to mid-2000s were AMD; clock for clock AMD won hands down, and on price too. Intel just had, and still has, deeper pockets to grease the PC makers into using their chips.
I had a PIII 800 MHz SMP system, that's two CPUs on one board :) With a shitty VIA chipset :(
TheRealWitblitz O GOD, don't get me started on VIA; the only thing they have ever gotten right is onboard audio. I actually owned one of the VIA PC-1 C7 boards, and I was always having trouble with it either being slow as dog shit or stuck in driver hell. Same with the Everex laptops that had the Mobile C7 chipsets: I bought one for my niece and one for my g/f at the time, as I was able to get them for slightly over $500 from Walmart, and just about every week I was having to do tech support on those damn things, usually due to something VIA screwed up with a driver update through Windows. VIA also has about the worst Linux support I have ever seen from a company. I'm amazed they are still in business.
I got an AMD K6-2 500 MHz (Super Socket 7), a 586-class system - it did once run Windows 2000 SP3, not a big issue..
Commodorefan64
Many of the big innovations from the 1990s to the mid-2000s, such as copper interconnects and high-speed caches, were made by RISC CPU vendors such as IBM, DEC, and SGI.
386 was also when IBM got shafted. Compaq debuted their Deskpro with the chip before IBM had a PC with it.
I'm writing this comment on a computer that's almost a hundred times faster than anything they showed. Amazing.
Only almost a hundred times faster? Seriously? You must have a very, very weak computer. 486 processors did about 10 megaflops (enough to run Doom), while by 2010 there were already 100-gigaflop CPUs (the i7-980, for example). Current consumer desktop CPUs are up to 800 gigaflops, which is 80,000x faster than a 486. The PS5's GPU is a million times faster than a 486.
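The ratios in that reply are easy to sanity-check (a small Python sketch; the FLOPS figures are the rough ones quoted above, not measured values):

# Approximate FLOPS figures taken from the comment above, treated as rough estimates.
FLOPS_486    = 10e6     # ~10 megaflops for a 486
FLOPS_I7_980 = 100e9    # ~100 gigaflops for a 2010-era i7-980
FLOPS_MODERN = 800e9    # ~800 gigaflops for a current consumer desktop CPU

print(FLOPS_I7_980 / FLOPS_486)   # 10,000x
print(FLOPS_MODERN / FLOPS_486)   # 80,000x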
What I love about everyone here is that they actually know their stuff inside out; people today are more likely to bluff their way through.
Legitimately. People are much dumber today.
Monkey Robots Inc. Totally. Most people in these positions today are salespeople who read a few lines off Wikipedia and do their best to convince everyone they are experts. You just know by the way these folks are talking that they are knowledgeable and passionate about what they are discussing.
Is this a millennial thing?
Is 33 MHz more powerful than my wimpy 2.35 GHz i3?
Yeah it is the perfect match for windows
33 is bigger than 2.35, so it must be better! Size does matter!
If I had a time machine, one of the things I would do is go back to when this video was made, walk in with an iPad Pro, and see everyone's response.
OMG, Hauppauge!
Who remembers the Turbo button, with the LED showing either 16 or 33 LOL.
AI on a 33 MHz chip, wow, those were some high hopes.
Computer speed increases were literally speeding along, but now it's almost at a crawl and the only thing we get is multi-core CPUs.
it's all about power efficiency and mobile devices these days.
Chips today have transistors so small that if they go much smaller (more and smaller transistors mean a faster CPU), quantum tunneling happens and the electrons jump the gates.
5:55 Ron Burgundy
Back then with my Commodore 64 I dreamed of a better computer for gaming.