@@Gobble.Gobble no no, ofc I would, but I mean I'm getting a 6600 XT cus it's the only thing I can get under £400, but like if everything were normal, I probably would be able to get a 6700 XT
yeah, that Betamax mention at the end hit hard... dad was a dedicated Betamax person. I can still hear him: "Beta has better picture quality." "Great, dad, but nobody rents movies on Beta."
Takes me back to my old workstation PC I got when I was a kid from some eBay surplus that had 2 CPU slots. So of course I just bought a second CPU and had two Pentium II processors before Core 2 Duos and such were a thing. One of my first "Wow, I'm a huge nerd" moments was when I figured out how to tell it to run different programs on different CPUs, so I could run a dedicated Halo PC server on CPU 1 and the playable instance of the game on CPU 2.
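What that comment describes is setting CPU affinity by hand. On a modern system the same trick is a one-liner; here's a minimal sketch, assuming Linux (`os.sched_setaffinity` isn't available on Windows or macOS):

```python
import os

# Pin the current process (pid 0 = ourselves) to CPU 0 only --
# the modern equivalent of "run the game server on its own CPU".
os.sched_setaffinity(0, {0})

# Show which CPUs the scheduler will now use for this process.
print(os.sched_getaffinity(0))
```

On Windows the equivalent knob is Task Manager's "Set affinity" or `start /affinity` from the command line.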
@@queden1841 yup... we were much cooler then Lmao... we also built our own soundcards, and had the real authentic chip music... sampling was an act of art...
@@Guru_1092 my first ever computer was a ZX81 with 1K of internal memory... when they were first sold it was as an assemble-it-yourself kit... you got the circuits and the board and had to solder everything yourself; later they began selling preassembled ones... I guess the failure rates were quite high lol... you definitely developed a different understanding of and attachment to your computer...
But it's just that now it's built into the system, like the network card. And you're stuck with whatever the manufacturer of the mobo sticks on it, rather than being able to put in better ones. Well, you still can, but good luck finding one that's better.
@@darealboot1 Agreed. I use a Creative Labs Sound Blaster Audigy. Sound is less distorted not just for recording; playback of any sound is much nicer than the junk on the motherboard.
This brings back some memories. Still remember back in the day when there were still LAN parties, one of the guys showed up and his rig wouldn't start. Something was rattling around in his case, and it turned out his slot processor had fallen out. We were dying. One of the funniest things I have ever seen.
@@nanaki-seto LAN parties are still a thing in some circles! For my friend's 18th we all brought our computers over and hooked everyone up to a switch. You'd be surprised at how many games still have a LAN function buried in there.
@@briannem.6787 I don't need it either, but consider it a car accessory adapter; you could connect a mini vacuum cleaner and use it on the keyboard and desktop, for example
@@YOEL_44 Most accessories that connect to it are compact, like a fridge. A car fridge is smaller than a domestic fridge, making it perfect for cold gaming drinks.
Dear god, I had forgotten that lighter! My dad knew 3 things about computers: 1. how to turn it on; 2. how to turn it off (mainly because it was the same button); 3. how to light his cigarette with that thing.
That was a real PITA. Having to move a tiny jumper that I would OFTEN lose while building the system. Two drives on the same cable were told apart by a micro jumper... good thing SATA changed that with a separate cable for each drive.
IIRC in cases where you wanted to install only one for whatever reason there were dummy RDRAM sticks with 0 capacity that just completed the circuit, although don't quote me on that because it's been years since I looked into it
@@skak3000 The workstations in the office I was in the past several weeks are very slow despite using SSDs and decent 4 or 6 core 6th gen Intel CPUs. The common denominator for awful performance in all the computers was single-channel memory.
@@WLOfails The problem can be 2 things: A) You need more RAM. B) The memory speed is too slow. A simple memory upgrade from 2400 MHz to 3600 MHz can have a big impact. Ryzen loves fast memory...
@@skak3000 Intel 6th gen CPUs in workstation computers. The memory speed is 2400 MHz. One computer uses a 1x8 GB memory stick (the workload only uses around 4 GB to access patient profiles) and the other uses a 1x16 GB stick, which makes zero sense. SATA SSDs. Both hitch and lag.
Dedicated 3D accelerators, e.g. the 3DFX Voodoo cards. Those were awesome, you would connect your regular 2D graphics card VGA output to a VGA input port on the 3D accelerator, the accelerator would do its thing and render sweet accelerated 3D graphics replacing the 2D graphics, and then it would pipe it to a VGA output, and when not using 3D, it would simply pass the 2D signal directly on to the output.
Wow, I remember working in an Intel fab when they were making Pentium II-III. Our fab made half a gazillion 440BX chipsets and all sorts of L2 cache. All the geeks knew the ticket was to buy a slot-1 300MHz Celeron and overclock it to 450MHz.
There were socket adapters for slot based motherboards where there was a CPU socket on the slot. This allowed you to keep using those slot based motherboards for a while longer after the slot was dropped by Intel and AMD. This was NOT supported by Intel and AMD but the slots had enough pins to use later generation processors without taking a performance hit.
I was an IT technician and a systems builder for a university in the late 90's. When Pentium II processors came out it was like a dream come true. We suddenly didn't need to worry about thermal paste (which we called pigeon shit) or bent pins (we didn't have pads back then, it was actual pins you tried to fit into a socket and they bent A LOT) it was just like putting in a SNES cartridge. Good times.
As a non-professional builder I associate the SEC units with low reliability and the dawn of super-loud HSF units. The silver lining, I suppose, is that those dual 32mm and 40mm fan setups on 1/4"-tall aluminum heatsinks annoyed enough people that by around 2000-2001 there was a swelling "silent PC" movement, which led to the much bigger, slower, quieter fans of today that deliver more CFM at way lower decibels than the 8000+ RPM screamers of then. With PWM controllers and the displacement of spinny discs by SSDs, computers don't sound like sci-fi bladeless helicopters anymore, or dim the house lights at power-on like they sometimes used to (seriously, some computers would delay and stagger various components during power-on to avoid overtaxing the PSU).
"Sound cards" are still around, alive and kicking. Sadly Microsoft killed EAX, same as they killed DirectInput, to push the Xbox, and gave no replacement at all (and for DirectInput we got a massive downgrade). But still, if you want good sound with a set of nice speakers you really should get a sound card.
@@miloolsen4395 I do have a sound card, but it's not installed in my PC because I neither have any equipment set up that would benefit from it, nor do I actually care that much. But darn was it nice in the old games with the good effects :(
@@ABaumstumpf Microsoft killed hardware accelerated audio to make everything go through software. EAX still works, but due to that and other reasons advancements on it were killed and thus devs stopped using it. Cheaper to use software solutions anyways I'm sure. All the old games still work though, just use Alchemy and it all comes back up.
Rambus was a catastrophe in every way. The bosses at Intel got into bed with Rambus and tried to shove it down everyone's throat. Their own technical department hated RDRAM, and nobody but Intel actually got on the train. The result was that it gave chipset manufacturers like ServerWorks an open goal, taking home a lot of design wins for motherboards and high performance computing centers. The best (worst) part was when Rambus applied for a number of patents on technologies used for DDR memory that were described in the papers published by JEDEC before the final standard was nailed down, only to pop up demanding royalties on all memory built to those specifications. For a while there, Rambus was the most hated company in the computer industry.
@@spavatch They are not terminating the signal but simply connect the input to the output pins to complete the ring. Any break in the ring and nothing works. It's almost like the old Token ring network in that way.
I totally forgot about RAMBUS!! I remember having to install "blanks" in unused RAM banks otherwise it wouldn't work, and you had to pair them like you do nowadays. RAM, blank, RAM, blank, etc. That period of P2 450s and P3 550s was interesting too. I think they're the only CPUs I didn't keep any of for nostalgia! Still have my hotboi Cyrix and AMD K6s though!
The best part of the P2 450 generation was buying a celeron 300A for $80 to get 90% of the performance of that $300-400 P2-450. =D. That was my gaming rig circa-y2k, with a Riva TNT when everyone else wanted Voodoo3, and it ran like an absolute rock. After I upgraded past it, it became the home router, running two network cards and NAT software... dedicated home routers wouldn't be a thing for at least another 2-3 years after that, and I think my old 300A (which I clocked back to 300 for that part of its life) ran strong almost 24/7 for about 6-7 years before I finally decided to just replace it with a dedicated router to add wifi.
I had one of the first engineering sample boards for RAMBUS testing. For the time, that thing ran like greased lightning but boy was it a finicky beast, and hot. Also finding drivers was impossible since the board TECHNICALLY wasn't ever released to the public. I got close drivers on many parts of the board and chipset from earlier Intel boards, but crashes were common because it wasn't 100%. I was glad when DDR2 became standard and prices dropped so I could rebuild my home machine.
I think it is hilarious that Riley covered really short-lived and arguably dumb tech, and comments are filled with suggestions of older, long-lived and elegant tech. I must be getting old. Personal bug bear AGP was revolutionary in graphics card power, sure it dead-ended due to pci-x evolution, but it stuck around for the better part of a decade, and some features were carried forward to our current designs.
I remember building a PC rig when AGP was mid stride, all the most badass graphics cards yadda yadda. When I went to build a new PC several years later, I was surprised that it disappeared and then had to learn the new architecture. Then again when I did a major PC overhaul last year. But at least everything modern still fits in my 10+ year old cooler master full ATX case. I even have a slot for a floppy drive.
AGP was supposed to be the end-all-be-all of GPU slots. Unlike PCIe, AGP was a dedicated graphics bus not shared with anything else: a direct path between GPU and CPU. PCIe is shared between devices, and once you're out of lanes something has to go. This is why you have computers running 16x cards at 8x: there are just too many other devices using PCIe lanes. Truth is I think Nvidia had a ton to do with the nuking of AGP with SLI, and AMD with CrossFire. We would likely still have AGP of some sort if not for those 2 multi-GPU setups. 3dfx was working on multi-GPU configs on a single card to give SLI-like performance or better, but Nvidia had specifically targeted 3dfx to kill them off. Once they bought 3dfx they took the SLI tech and ran with it, and the AGP slot died off as a result. Fact is we probably would not have seen video cards being used for crypto mining if AGP was still around; I doubt it would work well for crypto mining.
You accidentally (I think) mentioned another dead bus. PCI-X is a thing, but it's a different thing from PCIe/PCI-E that we all use for expansion these days. PCIe is PCI Express, but PCI-X was PCI eXtended, an extension of the PCI bus mainly used in servers and workstations. It was specced to use the same slot as PCI, but was most often seen with an extended 64-bit physical slot. As for PCI-X video cards, there were some Matrox cards, and I think ATi made one, but the bus was mainly used for RAID and NICs.
This brings me back to 8th grade when I started to dabble in PCs. I tore open our family's first PC which was not being used anymore and it had the P3 socket CPU. That was my first and only experience with those. Ah, good times
@@RWL2012 Had to have been the super late 90s socket P3, when Intel once again switched the package from the slot back to the socket. My first ever experience building a system was a Pentium at 75 MHz. But I did buy a 486DX2 system off a garage sale, refurbed it, and used it for a good while longer.
YESSSS, not only could you overclock Socket A CPUs, but you could unlock some of those newfangled ATI Radeon cards! Some models could be unlocked this way to higher pipeline counts... and a few could be software-modded into Radeon 9700s!
One format that was so doomed to fail it had an expiration date... literally: the DVD rental format that after a given amount of time would wipe itself, even when never rented. Had this taken off, imagine the e-waste.
@opisex I encountered them when I was driving a truck over the road at truck stops. I thought they were a great idea (if a bit wasteful) for us truckers, since renting wasn't really an option.
I remember those days. My dad wouldn't let me keep software disks in my room, as the computer was technically his, and he only let me borrow them in the order they were needed. To install Windows 98 I first had to use a bootable floppy disk and type DOS commands to install the Oak Technologies CD-ROM device driver. After that I could use the Windows 98 disc to install Windows. But once I was in the computer class in high school I started accumulating my own software. Ah, good times.
Son Goku You were late, I guess. Before Windows we were already downloading anything we needed... BBS! Disks? For housewives only, I guess; you could still get them in 2021!
You've had that desktop for over a week? Throw that junk away, man, it's an antique! Or my favourite: You think your Commodore 64 is pretty neato. What kind of chip you got in there, a Dorito?!
My favorite game on the TI-99/4A was B-1 Bomber... which I loaded from my cassette deck (I could start loading the program, go downstairs, make and eat lunch, come back, and it was almost done loading).
I found a slot 2 machine at a waste collection site that had a 386 socket in it too. It was pretty neat and I wanted it, but you can't take stuff from waste sites.
@@damianosplay9457 Sad days, I had half a mind to grab the one I found but I like my job and didn't want to lose it over a computer that likely didn't work anyway, hah.
Remember silicon discs? Having a RAM drive to load files onto and manage or run them from was like having an SSD in 1984 (except it didn't store the data after power-off). Also, 3" double-sided 200 KB floppies were amazingly fast and quirky compared to the 5.25". Oh, and those Slot 1 Pentium IIs could certainly also have cooler issues; I recall having to replace one, as an older one had become slow from overheating too much.
I once had a Sony VAIO desktop PC which had a built-in MiniDisc drive. That was awesome. You were able to use it for music for your portable MD player, and also for backups. I really loved MiniDisc back then.
Here's a Techquickie: can you explain why YouTube often recommends or sets up as the next video something it knows I've already watched? Why would it do that? What are the chances it'll be a video I really want to watch again? Surely they'd want to keep me hooked by offering only new content.
@@emerje0 do you have a Pi-hole or ad blocker? It could be disrupting Google tracking. My setup seems to completely block YouTube history, for better or worse 🤷♂️ but no ads for me
@@ericwright8592 No, in fact I have YT Premium because I hate ads but still want to support the content I watch. But that shouldn't matter; as long as a video has the red bar going its full length, YT knows I watched it and shouldn't recommend it.
I remember taking a PC hardware class where we were taught to build and troubleshoot a PC that used a Pentium 2. Didn't even think about the CPU being a cartridge at the time.
I remember reading that one of the reasons for introducing Slot 1 was preventing competition from making compatible CPUs. There were some patents and other stuff.
It’s sad what happened to FireWire, because it’s (and TB1 & 2’s) lack of adoption is why Apple said _"fuck it! If you’re not going to adopt our superior connector, we’ll make laptops that only have said connector!"_ Also, I still have FireWire cables and drives for whenever I’m servicing older Macs.
Still use it, won't give it up. In fact, when I build a super secure system it's one of the things I like to use, because it is 100% rock-solid and I know everything about it from the inside out, including how it transfers data. I think I still have the driver key for the FireWire networking protocol they released; later on it was built into Microsoft Windows PCs, and of course Linux had it first. At least for video or audio work it was the bee's knees. However, no matter what you did, FireWire drives were way more expensive than all the rest, to say nothing of the cables, which were absolutely atrocious, or how much they charged for the FireWire interfaces. Yet even after all these years, some of these drives I have (about 20 years old now, I think) still work over the FireWire interface. I think that's why I love those data drives so much: they were the first FireWire drives I ever purchased, followed by the drives LaCie sold.
Funfact: The main reason Intel named the line of chips "Pentium" was so they could trademark a name and not have other companies, like AMD and Cyrix, simply copy their number scheme, e.g. 386, 486, 586(?).
I could be wrong, but you are somewhat correct. It was more that Intel lost a court case establishing that they could not trademark a number, and then had to go with a name.
it also was great marketing because Pentium sounded rather exciting and like a significant upgrade unlike say 286-386. AMD and Cyrix had 586 CPUs, but AMD chose to follow Intel's lead afterwards and created the Athlon brand. For Cyrix, the 586 was their last CPU I think, before they went out of business.
@@dahak777 They tied up AMD in court with their superior 386 40 MHz CPU that kept pace with a 486 SX 25 (that you'd be dumb not to overclock to a 33 Mhz, which most of them could easily handle) - They didn't have to win, they had to delay AMD enough. That 386 chip sold well into the second half of the 90s as used or secondary PCs. Hell, I'd love to have had one in 1996 even, I had a shitty SX/20 still.
@@jayhill2193 Cyrix followed up with the MII and MediaGX but got pretty stifled by the NatSemi merger, then all but killed off by the VIA buyout. The Cyrix III was the last Cyrix chip with their own design, but it got snuffed out in favor of the "VIA C3" low-power variant using CenTaur's core architecture.
Reminds me of Apple Macs back in the day when they used SCSI ports (rather than the parallel or RS-232 serial ports used by PCs) for peripherals. You had to daisy-chain your scanner, printer, etc., and install a terminator on the last device. I believe you also had to set a SCSI ID for each device.
@@LC-uh8if Not quite. RDRAM works kind of like those old Christmas tree lights: if one lamp breaks, the series is broken, and nothing works. SCSI, on the other hand, needs to be terminated at both ends of the bus, because otherwise data would bounce around causing collisions.
I was working at Intel Dupont back in 96-98 and they told us the ceramics surrounding the CPU was getting expensive. Vendors that supplied the ceramics wanted more money and the slot design was adopted to reduce cost.
Let's hope there's a permanent workaround; it's so sad to have such forced obsolescence in otherwise completely functional CPUs. I just upgraded to a Ryzen 5900X so I'm golden, but I really could have hung onto my trusty old overclocked hexa-core Socket 2011 Xeon a bit longer with little performance hit; I really just wanted more USB 3+ ports and NVMe.
Intel shot themselves in the foot with the Slot 1 Celeron 300A: with the right motherboard you could change the FSB from 66 MHz to 100 MHz for an instant 450 MHz CPU that was a beast at the time.
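The arithmetic behind that trick: the 300A's 4.5x multiplier was locked, so the front-side bus was the only knob. A quick illustrative sketch (numbers taken from the comment above, nothing else assumed):

```python
def core_clock(fsb_mhz: float, multiplier: float) -> float:
    """Core clock is simply FSB speed times the (locked) multiplier."""
    return fsb_mhz * multiplier

stock = core_clock(66.6, 4.5)       # ~300 MHz as sold
overclocked = core_clock(100, 4.5)  # 450 MHz after the 66 -> 100 FSB bump
print(stock, overclocked)
```

Since the FSB also clocked the memory and cache paths, the jump to 100 MHz sped up the whole platform, not just the core.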
@Clarissa 1986 Celeron was cheap because it did not have enough cache for workloads. Cache memory was expensive, so Celeron was OK for people who did not need best possible performance. If small cache was enough for tasks you did on your PC, then you did not notice difference to Pentium.
Actually the first slot-based CPU packages that were put into PCs were those for DEC Alpha AXP CPUs, specifically the first generation: 21064, put into DEC AlphaStation workstations. They were PC-like in almost every way, except for the CPU. They had ISA slots, they had PCI slots and an Intel FX chipset to support all this. They were introduced in late 1994. I used to own one and had equipped it with an original SB16. It outperformed my K6 (or K6-2, can't remember) in very many ways. The bus design for the 21164, EV5 was the basis for the AMD Athlon bus design.
On the rare occasions my dad allowed us to use the dial-up connection, it was something precious that you had to turn off soon after, due to the phone bill rising 🥲 Beautiful music to my ears...
I had a slot CPU (Pentium II). It was a nightmare; if I accidentally moved my PC case the computer would freeze and fail to restart. Most of the time I had to open the case and remove and reinsert the CPU.
I had several Pentium II PCs, even dual CPU ones, and it never happened to me, and I've never heard of such problems from the others so... sounds like a personal problem :P
@@spavatch some motherboards were better than others at making the cpu stable in the slot. Of course some people used 3rd party coolers which could also cause some problems with their weight and gravity.
@@bionicgeekgrrl - I think what constitutes the problem is the fact that people didn't use original CPU brackets. Most mobos had foldable plastic rails with fastening clips that held the cartridge in a safe, stable, upright (or more precisely perpendicular) position. Some guys took them off for one reason or another which... brings us to my assumption that it's more of a personal problem rather than anything else ;)
2:00 AMD K6-III was the first CPU for PC to have L2 cache on the CPU, and when it was put into a Super 7 board that had L2 cache (designed for the K6-2), that onboard cache became L3 cache and made the K6-III significantly faster than the Pentium II and IIIs running at the same clock speed. But the reason the Athlon had to be slotted was because it had L3 cache in the package, where the Pentium II and III were packaged for their massive L2 cache. Weird thing about those, is that the last Pentium III - after Intel went back to sockets - had everything integrated in the CPU and were surface mounted inside the package so OEMs could use up their last slotted motherboards
I think the best thing regarding it I saw was someone testing whether the device was even needed by taking one of their pouches and squeezing it by hand. Turns out it was faster that way.
Legit question: Why don't we go back to cpus being on their own cards? It's really annoying needing new motherboards on cpu upgrades, and different motherboards for AMD vs Intel. Graphics cards are basically plug and play anywhere.
A socket for any CPU means a socket for your chipset and some *expensive* FPGAs for adaptive routing. GPUs are co-processors running over an expansion interface, not primary processors
The card connection on a GPU is just for data throughput, and the core handles it and the memory. So in theory as long as data throughput doesn't increase beyond the limits you won't need a different connection. And if you use an earlier PCI standard on a newer card it simply caps the data throughput to the max it can handle, if I'm not mistaken. Like connecting USB 3 flash drives to a USB 2 port. Meanwhile a socket iteration often either adds pins or changes their locations which means that the connections inside the motherboard are not compatible. For example a CPU socket that supports triple channel memory needs to have pins that handle those connections to the third channel. Even if you custom-made a CPU from a previous generation that fits in that socket but only has dual channel memory, you probably wouldn't even get it to boot because everything would be crossed. The pins that were supposed to handle the third memory channel could be trying to connect to the sound card, it would be a total mess. So trying to replace the socket for a card slot without requiring a new motherboard with a new slot would be extremely hard because all the gold contacts on the slot would need to line up with the next generation and new features would be difficult to add. This isn't to say all socket changes were justified, I'm sure there's been a few that were done just for the money.
Technically speaking all modern CPUs are "in their own cards", they're just really tiny and feature lots of connecting pads/pins in a flat 2D array. Jokes aside, as Titanium Rain explained, the reason why sockets differ generation to generation is because CPUs are much more directly connected to the rest of the hardware than co-processors like a GPU. It *is* feasible to put CPUs in some sort of standardized socket, but you would either need to over-spec the connector for future-proofing (incrementing costs), create abstraction layers that allow new features to be addressed over old channels (increasing latencies and reducing performance), or change it for a new one in a few years (what CPUs do right now)
Your question assumes that CPUs being on their own PCB meant they were interchangeable. They were not. They had no advantages over modern CPUs; they were just larger and more expensive to make. As for why we don't go back: on top of the size and cost, they would be slower than modern CPU designs.
There is a good chance this might happen. AMD already have sockets that can survive many years, and there are experimental technologies to connect RAM through the PCI-E port. So there is a chance that next generation CPUs will only have power and PCI-E and nothing else.
Another huge thing that disappeared: AGP. There was a time when GPU manufacturers sold the same GPU model in both AGP and the new PCIe versions, if I remember correctly. AGP was the gold standard for gaming GPUs. PCIe was new, sometimes performed worse, sometimes better, and people didn't trust it, but with lots of improvements it eventually managed to replace AGP once and for all relatively quickly. If you found a PC in the trash and it had an AGP slot, you took it, because that's all you really needed for a gaming computer.
I made SOOOOO much money installing retainer clips for those crappy processor slots. Thank you Compaq for designing them so that with time the thermal expansion and contraction caused the CPU’s to wiggle out of their slots. Great fun! …LOL… :)
This is something he missed in his discussion of slot-based CPUs. Yes, he got it right about the L2 cache being on the board with the CPU, but he missed part of the explanation as to why it was there. As you mentioned, many Pentium and 486 motherboards had L2 cache chips on the motherboard. Some were socketed and could be upgraded, and some even came on a little board that fit in a slot itself; I believe that was referred to as COASt, or Cache On A STick.

Anyway, the real reason the Pentium II put the cache on the board with the CPU was so it could run at higher speeds than the FSB. Remember that in the 386 and early 486 era the clock speed of the CPU was the same as the FSB. With the 486 DX2 66, CPUs started to run at multiples of the FSB speed, and by the Pentium II era they ran at very high multiples of it. At that point, cache on the motherboard would have been a huge bottleneck because of the gulf between FSB speed and CPU clock speed, and would have been pointless. So a compromise between cache on the motherboard and cache actually part of the same chip was to put the cache on a board with the CPU, as on the Pentium II, or to put 2 chips on one substrate, as on the Pentium Pro. That way the cache could run at a higher speed than the FSB the motherboard ran at.

The L2 cache actually ran on what was called the "back side bus" of the processor, and on the Slot 1 Pentium II it ran at half the clock speed of the CPU. For example, on a 450 MHz Pentium II that meant the L2 cache on the back side bus ran at 225 MHz, as opposed to the 100 MHz FSB — obviously a huge performance improvement. If I'm not mistaken, the larger Slot 2 Xeon variety actually ran the back side bus and L2 cache at full speed, just as the Pentium Pro did, which further improved performance.
When they advanced to the point where they could actually put the cache in the CPU die itself, this was obviously no longer needed. Anyway, he failed to go into that detail. This wasn't a "failed technology"; it was simply something that was necessary at the time and, with advances, no longer was.
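To put numbers on the half-speed back side bus described above, here's the same arithmetic as a tiny sketch (the divisor of 2 is the Slot 1 Pentium II case; per the comment, the Slot 2 Xeon and Pentium Pro ran full speed, i.e. divisor 1):

```python
def l2_clock(core_mhz: float, divisor: float = 2) -> float:
    """Off-die L2 on the back side bus ran at core clock / divisor."""
    return core_mhz / divisor

# 450 MHz Pentium II: L2 at 225 MHz, vs. the 100 MHz FSB that
# motherboard-mounted cache would have been stuck at.
print(l2_clock(450))     # half-speed, Slot 1 Pentium II
print(l2_clock(450, 1))  # full-speed, Slot 2 Xeon / Pentium Pro style
```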
Betamax was not obsolete e-waste; it was the dominant television broadcast production medium of the 1980s and 1990s and the basis for modern electronic news gathering (ENG). If you watched TV news at all back then, you watched Beta. If you went to broadcasting school and learned to edit video, you worked in Beta. If you made independent low-budget art films, you worked in Beta.
Beta was far superior; I used to work in TV news. VHS won the consumer war though, as you had more home videos on that format. For sound recording we used the MiniDisc by Sony, which was also awesome for its time.
Micro-Channel Architecture - The "old" version of Dave's Computer Warehouse used to have an entire rack of shelving dedicated to MCA-equipped motherboards and add-on cards.
4:15 Not only did Rambus need to be installed in pairs, you also had to install 2 dummy modules in any open slots. So if you had 4 RDRAM slots but only 2 sticks, you would also need to buy 2 dummy sticks to go in the unused slots.
No plan truly survives first contact with the players. No matter how carefully you construct the scenario, someone will do something unexpected, and from there you're just improvising to create the illusion you are in total control.
@@nobodyimportant2470 I ran an entire homebrew campaign for over ten years on that very concept - my medieval spelunkers became pie sellers and delivery men...
You didn't mention discrete sound cards, perhaps the easiest component to identify. They don't really make them anymore; they make external ones now, which actually cost a lot more and aren't as easy to work with.
I remember my old PII-350 tower used to lock up every few months, as the motherboard's processor retention clips didn't match the form factor of the heatsink and the CPU used to slowly fall out...
I miss the PC cases that laid horizontally, and you put the monitor on top of them. I found them much easier to work with when fixing/upgrading/swapping components, and a more efficient use of desk-space than the tower design that's the current standard.
@@aaronmischel4552 yeah, but there was a time your game wouldn't run properly if you didn't have one, or a compatible one. Any old sound card wouldn't work.
You could do an episode on MPU-401, MT-32 and AdLib. Afaik no sound cards ship with a midi chip anymore, and anything midi is handled by software instead
The PPU. Physics Processing Unit. It was so short-lived. This was part of the physics war between Valve (processing done on the CPU; this was when CPUs were first becoming multi-core), Nvidia with PhysX processing on the GPU, and Ageia with the dedicated PPU. I think we all know who lost that war.
Another thing that completely disappeared and is almost forgotten in desktop PCs is external VRM modules. During the Pentium 1 era, many motherboards had a pin header to plug in an optional VRM board. This allowed the use of more power-hungry CPUs, or CPUs using a voltage not supported by the mobo.
It would be great if one could upgrade a PC by simply putting a CPU extension card in the motherboard's PCI Express slot, for example an ARM or RISC-V chip. FPGA cards, at least, should already be available.
It can; it's one of many forms of daughterboard, like the X670 express card from Asus (now you know why the motherboard is called a motherboard). The problem isn't hardware but software: it's completely proprietary, for proprietary hardware. And unless you're willing to spend hundreds of thousands or millions of dollars on something proprietary tailored to your needs, there's nothing you can do anyway.
Fun fact: the Nintendo 64 used Rambus RAM. You had to install Rambus in pairs, or any open slot required a Rambus terminator. The Nintendo "Jumper Pak" is just a Rambus terminator pack; the Expansion Pak memory is identical to the memory in the N64 itself.
I have an old P4 system with RDRAM in it. Forgot to mention that ALL the RAM sockets need to be filled: if you run only one pair of DIMMs on a board with 4 sockets, you have to populate the other two with continuity (dummy) modules.
I’d like to see slotted CPU form factors again using modern manufacturing. You could dedicate more space on the actual processor die to raw processing power and then have beefy large caches off to the side. Heat management would take some adjustment but I imagine it wouldn’t be much harder than what goes on in a high end graphics card
keep all the high speed components (cpu, cache, RAM, northbridge, gpu) in one card and the slow components (pci e, lpc, sata) in the mobo, along with the power supply components
Unfortunately, there is a technical reason why we stopped making them, beyond the cache having moved inside the CPU and a board for no reason being costly. The reason is electromagnetic interference. This was the era where we jumped from 300 MHz with the 1997 Pentium II to 1000 MHz with the 2000 Pentium III, the last CPU line for Slot 1 (not to mention the P4, which clocked a lot higher but delivered lower performance per clock). When you clock-scale like that, you start to run into signal integrity issues, and the cheapest way to resolve them is simply to make the paths shorter (in other words, cut off the slot, and back to PGA/LGA we go).
OMFG, a Pentium II. I had one back in the day, and if I remember right I still have one lying around in the attic. Also, they didn't use PCI; instead they used something called a SECC, or Single Edge Contact Cartridge, for the CPU, and it was easier to install the CPU that way. One of the motherboards I know of that used that CPU was the ASUS P2L97. I've been using PCs since the late '80s. My dad had a PC when I was born, so I grew up with a PC and many consoles and handhelds. I had most consoles that came out in the '80s-'90s; I even had an Odyssey 2 as a kid, plus an original Game Boy and later the Game Boy Color. Looking back, I don't know where my dad got all that money from; we weren't rich by any means, but I had a lot of stuff as if we were.
The Zip drive was fantastic for a number of years, hardly a failure. Also, I'd add the Itanic to the list, though that's more a CPU architecture than a PC part. Then there's the plethora of other flash card formats.
Don't forget sound cards. Without one your PC literally had no sound. (Yes, they still exist today, but not everyone runs a record company. It's not mandatory; that's the point.)
This is true. Most games didn't need a 3D graphics card at all; the ones that did still ran on the CPU and only got a modest visual upgrade. But without a sound card you were screwed: either no sound, or beeps.
It throws me off thinking that I ended up using a USB DAC instead of something off the motherboard or in an expansion slot. Reason: the thing has RCA outs. But still, you get higher quality via USB
I had a 30 pin SIMM for a while back in high school (balanced out with a chain wallet and a two door manual transmission car with flames painted on it) but all the chips eventually fell off. I then found out that I can use the PCB to bypass certain locks that credit cards were too flimsy to move, until I broke the PCB trying to get into the science class supply room. How'd you get a CPU on there? Drill a hole somewhere?
"The computer had faulty wiring that caused it to overheat when the spacebar was held down for too long. When the manufacturer issued a recall, they were met with opposition from children in developing countries, who were holding down the spacebar to stay warm in their unheated homes." -XKCD (paraphrased from memory)
We need an entire miniseries with Riley in front of a fire, smoking a pipe, covering a decade of computers at a time. A true historian's historian, that Riley.
If you guys could do a video on the somewhat failed format of the MiniDisc that would be awesome! Still have my portable recorder/players that I listen to religiously. Damn good audio!
* IBM's PS/2, OS/2 and Microchannel would be an obvious target. * Thin-client architecture (though I guess we eventually got there with phones and cloud computing). * NeXT. (I actually saw one in my school's computer lab, but people just stared at it like it was the Monolith... since there was nothing you could actually _do_ with it.) * Discrete math coprocessors. * Zip drives.
Graphics cards were nice. I remember when we used to be able to buy those.
I mean zotac drops are pretty good. I bought 2 gpus from there
Lmao, good one mate, also, imagine if they were at MSRP, like I'd have a 6700 xt, it's really frustrating, but oh well
@Channel name mattiyallo.. they drop once a week or sometimes once every two weeks. Just join a group that alerts you for drops
@@gordoncunliffe8038 6700xts are expensive. Just go for a 3060 or 3060ti
@@Gobble.Gobble no no, ofc I would, but I mean I'm getting a 6600 xt cus it's only thing I can get under £400, but like if everything were normal, I probably would be able to get a 6700 xt
Hearing people talk about stuff I was around for like it was ancient history is incredibly weird. I am so old.
Soon we will give interviews to share our memories of how it felt to be alive when this primitive technology was around.
Same lol
So you are saying T Rex and you both didn't get DDR technology at first....
yeah, that Betamax mention at the end hit hard....dad was a dedicated Betamax person. I can still hear him...Beta has better picture quality. Great dad, nobody rents movies on Beta.
Member when overclocking was done with pencil....
Takes me back to my old workstation PC I got when I was a kid from some ebay surplus that had 2 CPU slots. So of course I just bought a second CPU and had two pentium 2 processors before core 2 duos and such were a thing. One of my first "Wow I'm a huge nerd" moments was when I figured out how to tell it to run different programs on the different CPUs so I could say run a dedicated Halo PC server on CPU 1 and then the playable instance of the game on CPU 2.
Dope
You must have been the sickest kid at the lan party
@@queden1841 yup...we was much cooler then Lmao....
we also built our own soundcards, and had the real authentic chip music...
...sampling was an act of art...
Wow that sounds baller AF. Hell, I'd love to have a system like that even today just to tinker with it.
@@Guru_1092 my first ever computer was a zx81 with 1k internal memory....whn thay was sold it was in an assamle urself kit...u got the circuits and the board and have to solder everything urself, later thay begun selling preassembled once... i guess the fail rates was quite high lol.... u definetly developed a different understanding and attachment to ur computer...
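Assigning different programs to different CPUs, like the dual Pentium II Halo setup described a few comments up, is still doable today via processor affinity. A minimal sketch using Python's Linux-only affinity API (pid 0 means "this process"; on Windows of that era you'd have used Task Manager's "Set Affinity" or `start /affinity` instead):

```python
import os

# Pin the current process to CPU 0, leaving other CPUs free for
# another program (e.g. a dedicated game server on CPU 1).
if hasattr(os, "sched_setaffinity"):  # Linux only; absent on macOS/Windows
    os.sched_setaffinity(0, {0})      # restrict this process to CPU 0
    print(os.sched_getaffinity(0))    # the set of CPUs we may now run on
```

The OS scheduler will then never migrate the process off the pinned CPU, which is exactly the trick that kept the server and the playable game instance from fighting over one core.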
the need for a sound card. i remember having to make sure a sound blaster 16 was in all my builds 20 years ago
But it's just that now it's built into the system, like the network card. And you're stuck with whatever the manufacturer of the mobo sticks on it, rather than being able to put in better ones. Well, you still can, but good luck finding one that's better.
@@toddnolastname4485 i actually record music with a dedicated audio interface. so much better fidelity than whatever is on my current mobo indeed.
I remember when those were usually the major compatibility stumbling blocks for games back in the ol 80386 days.
@@darealboot1 Agreed. I use a Creative Labs Soundblaster Audigy. Sound is less distorted not just for recording, but playback of any sound is much nicer than the junk on the motherboard.
Still holding on to a Sound Blaster Audigy 2 ZS (and using it).
This brings back some memories. I still remember, back in the day when there were still LAN parties, one of the guys showed up and his rig wouldn't start. Something was rattling around in his case, and it turned out his slot processor had fallen out. We were dying. One of the funniest things I have ever seen.
A place I worked was complaining about a PC being noisy and running slow. The noise was the CPU fan rattling around in the bottom of the case.
How about overbaked CPUs LOL. Can't tell you how many slot CPUs I had to bake to bring back to life.
@@nanaki-seto LAN parties are still a thing in some circles! For my friend's 18th we all brought our computers over and hooked everyone up to a switch. You'd be surprised at how many games still have a LAN function buried in there.
@@innawoodsman basically all fps games have lan and it isn't buried
Honestly I think that happened to me. Not surprising when you had to carry your case and monitor on a coach or bus to the venue...!
5.25 expansion slot with integrated cigarette lighter and cupholder, that's a beauty of tech that has gone for good
lgr showed that.
I have no need for the lighter, but a cup holder? yes!
I really like the VU meters you used to get...
@@briannem.6787 I don't need it either, but consider it a car accesory adapter, you could connect a mini vacuum cleaner and use it for the keyboard and desktop for example
@@YOEL_44 Most accessories that connect to it are compact, like a fridge. A car fridge is smaller than a domestic fridge, making it perfect for cold gaming drinks.
Dear god, I had forgotten that lighter! My dad knew 3 things about computers: 1. How to turn it on. 2. How to turn it off (mainly because it was the same button). 3. How to light his cigarette with that thing.
Don't forget those master/slave settings for all your drives.
Oh my! I had forgotten them...
Not strictly hardware but fudging IRQ and DMA settings on that stupid Soundblaster clone that just wouldn't work on default.
@@grahamross6397 oh man. IRQ settings were a real pain!!
That was a real PITA. Having to move a tiny jumper that I would OFTEN lose while building the system. Two drives on the same cable were told who was who by a micro jumper. Good thing SATA changed that, with a separate cable for each drive.
Nowadays people would try to cancel Seagate or hitachi or whatever for daring to have such settings named as master/slave.
"RDRAM had to be installed in pairs"
No wonder it failed. OEMs love getting away with installing only 1 stick of RAM if they can.
IIRC in cases where you wanted to install only one for whatever reason there were dummy RDRAM sticks with 0 capacity that just completed the circuit, although don't quote me on that because it's been years since I looked into it
Why is that a problem? It makes it cheaper to upgrade RAM.
And the difference in performance is not much...
@@skak3000 The workstations in the office I was in for the past several weeks are very slow despite SSDs and decent 4- or 6-core 6th-gen Intel CPUs. The common denominator for awful performance in all the computers was single-channel memory.
@@WLOfails The problem can be 2 things:
A) You need more RAM.
B) The memory speed is too slow.
A simple memory upgrade from 2400 MHz to 3600 MHz can have a big impact. Ryzen loves fast memory...
@@skak3000 intel 6th gen cpus in workstation computers. The memory speeds are 2400 mhz. One computer uses a 1x8 gb memory stick (the workload only uses around 4 gb to access patient profiles) and the other computer uses 1x16 ram stick, which makes zero sense. Sata SSDs. Both hitch and lag.
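For context on why the single-channel setups above hurt so much: theoretical peak DRAM bandwidth scales linearly with channel count, since each DDR channel is 64 bits (8 bytes) wide. A quick back-of-the-envelope calculation (theoretical peaks only; real workloads see less):

```python
# Theoretical peak bandwidth of DDR memory:
# megatransfers/s * 8 bytes per 64-bit channel * number of channels.
def peak_gb_s(megatransfers: int, channels: int) -> float:
    return megatransfers * 1e6 * 8 * channels / 1e9

print(peak_gb_s(2400, 1))  # single-channel DDR4-2400: 19.2 GB/s
print(peak_gb_s(2400, 2))  # dual-channel doubles it: 38.4 GB/s
print(peak_gb_s(3600, 2))  # dual-channel DDR4-3600: 57.6 GB/s
```

So a second identical stick roughly doubles available bandwidth for free, which is why OEM machines shipping one big stick instead of two smaller ones feel so sluggish.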
Could basically just list everything Techmoan has ever covered.
hahaha
did he cover any PC parts though?
@@jezusmylord LGR did a lot of them.
or LGR
@@Shadowcat what is lgr
Dedicated 3D accelerators, e.g. the 3DFX Voodoo cards. Those were awesome, you would connect your regular 2D graphics card VGA output to a VGA input port on the 3D accelerator, the accelerator would do its thing and render sweet accelerated 3D graphics replacing the 2D graphics, and then it would pipe it to a VGA output, and when not using 3D, it would simply pass the 2D signal directly on to the output.
I remember my dad having an overpriced switch with a button to swap the VGA outputs haha
Back in the day we still had plenty of graphics card manufacturers that don't exist anymore or aren't relevant now, like Matrox and Diamond.
Now you can just run nglide to emulate a 3dfx
@@croozerdog guess that was for using multiple pc's on 1 monitor?
Don't forget about the development of AGP and then replacement with pcie
Fun fact: there were actually slot-to-socket adapters.
Slockets were fun for awhile. XD
@@OMightyBuggy slockets might even be a thing again with AMD moving to LGA
@@OMightyBuggy that's a word I have not heard in a very long time. I had one of those.
What you mean were.. you can still buy them XD
I have few brand new in their blisters laying around lol
more old stuff: the turbo button, physical co-processors, ISA-, PCI- and AGP-slots, IDE-ports
physical co-processors are really cool, especially the ones in the 386 times
That turbo button was probably the most misunderstood and misused feature in the history of home computing.
AT motherboards, SCSI, eSATA, Firewire, CD caddies, trackballs, the contrast slider on a laptop, IrDA, PCMCIA
@@thany3 SCSI is still around.
VLB and EISA. I've still got a Cyrix 486 with working EISA in it (SCSI, too).
When I was a kid, I really wanted a PhysX card.
oh man, that's a good one to talk about in a future episode.
PhysX: Dreadful performance for non nVidia GPUs
When I was a kid, I really wanted an AGP graphics card. Thanks, I feel old now.
@@20blog28 they were their own card to start with, then nvidia bought them.
@@20blog28 wasn't a gpu
Wow, I remember working in an Intel fab when they were making Pentium II-III. Our fab made half a gazillion 440BX chipsets and all sorts of L2 cache. All the geeks knew the ticket was to buy a slot-1 300MHz Celeron and overclock it to 450MHz.
440BX was excellent! And then came 810 😞
Yep! Celeron 300A @ 450 Alpha heatsink!
You mean 466MHz, of course ? 😂
There were socket adapters for slot based motherboards where there was a CPU socket on the slot. This allowed you to keep using those slot based motherboards for a while longer after the slot was dropped by Intel and AMD. This was NOT supported by Intel and AMD but the slots had enough pins to use later generation processors without taking a performance hit.
I was an IT technician and a systems builder for a university in the late 90's. When Pentium II processors came out it was like a dream come true. We suddenly didn't need to worry about thermal paste (which we called pigeon shit) or bent pins (we didn't have pads back then, it was actual pins you tried to fit into a socket and they bent A LOT) it was just like putting in a SNES cartridge. Good times.
A good mechanical pencil would fix those bent pins easy.
As a non-professional builder, I associate the SEC units with low reliability and the dawn of superloud HSF units. The silver lining, I suppose, is that those dual 32mm and 40mm fan setups on 1/4"-tall aluminum heatsinks annoyed enough people that by around 2000-2001 there was a swelling "silent PC" movement, which led to the much bigger, slower, quieter fans of today that deliver more CFM at way lower decibels than the 8000+ RPM screamers of then. With PWM controllers and the displacement of spinny discs by SSDs, computers don't sound like sci-fi bladeless helicopters anymore, and they don't dim the house lights on power-on either, like they sometimes used to (seriously, some computers would delay and stagger various components during power-on to avoid overtaxing the PSU).
this makes me think about: laserdisks, zip drives, sound cards, parallel ports, trackball...
"sound cards"
Are still around, alive and kicking. Sadly Microsoft killed EAX, just as they killed DirectInput, to push the Xbox, and gave no replacement at all (and for DirectInput we got a massive downgrade).
But still, if you want good sound with a set of nice speakers, you really should get a sound card.
Trackball mouses are still a thing
@@miloolsen4395 I do have a soundcard, but not installed in my PC cause i neither have any equipment set up that would benefit from it, nor do i actually care that much. But darn was it nice in the old games with the good effects :(
@@ABaumstumpf Microsoft killed hardware accelerated audio to make everything go through software. EAX still works, but due to that and other reasons advancements on it were killed and thus devs stopped using it. Cheaper to use software solutions anyways I'm sure. All the old games still work though, just use Alchemy and it all comes back up.
Sound cards still exist, they are built into the motherboard now
RDRAM also needed dummy sticks in the 'empty' slots.
Rambus was a catastrophe in every way. The bosses at Intel got into bed with Rambus and tried to shove it down everyone's throat. Their own technical department hated RDRAM, and nobody but Intel actually got on the train. The result was an open goal for chipset manufacturers like ServerWorks, who took home a lot of design wins for motherboards and high-performance computing centers.
The best (worst) part was when Rambus applied for a number of patents on technologies used in DDR memory that had been described in papers published by JEDEC before the final standard was nailed down, only to pop up demanding royalties on all memory built to those specifications.
There for a while Rambus was the most hated company in the computer industry.
Aren't they called terminators?
@@spavatch They don't terminate the signal; they simply connect the input pins to the output pins to complete the ring. Any break in the ring and nothing works. It's almost like the old Token Ring networks in that way.
@@blahorgaslisk7763 - fair point, thanks
@@blahorgaslisk7763 Nintendo used them in the N64 as well
I totally forgot about RAMBUS!!
I remember having to install "blanks" in unused RAM slots, otherwise it wouldn't work, and you had to pair them like you do nowadays. RAM, blank, RAM, blank, etc.
That period of P2 450's and P3 550's was interesting too. I think they're the only CPU's I didn't keep any of for nostalgia! Still have my hotboi Cyrix and AMD K6's though!
The best part of the P2 450 generation was buying a celeron 300A for $80 to get 90% of the performance of that $300-400 P2-450. =D. That was my gaming rig circa-y2k, with a Riva TNT when everyone else wanted Voodoo3, and it ran like an absolute rock. After I upgraded past it, it became the home router, running two network cards and NAT software... dedicated home routers wouldn't be a thing for at least another 2-3 years after that, and I think my old 300A (which I clocked back to 300 for that part of its life) ran strong almost 24/7 for about 6-7 years before I finally decided to just replace it with a dedicated router to add wifi.
I had one of the first engineering sample boards for RAMBUS testing. For the time, that thing ran like greased lightning but boy was it a finicky beast, and hot. Also finding drivers was impossible since the board TECHNICALLY wasn't ever released to the public. I got close drivers on many parts of the board and chipset from earlier Intel boards, but crashes were common because it wasn't 100%. I was glad when DDR2 became standard and prices dropped so I could rebuild my home machine.
I think it is hilarious that Riley covered really short-lived and arguably dumb tech, and comments are filled with suggestions of older, long-lived and elegant tech. I must be getting old. Personal bug bear AGP was revolutionary in graphics card power, sure it dead-ended due to pci-x evolution, but it stuck around for the better part of a decade, and some features were carried forward to our current designs.
I remember building a PC rig when AGP was mid stride, all the most badass graphics cards yadda yadda. When I went to build a new PC several years later, I was surprised that it disappeared and then had to learn the new architecture. Then again when I did a major PC overhaul last year. But at least everything modern still fits in my 10+ year old cooler master full ATX case. I even have a slot for a floppy drive.
AGP was supposed to be the end-all, be-all of GPU slots. Unlike PCIe, AGP was a dedicated graphics bus, not shared with anything else: a direct path between GPU and CPU. PCIe is shared between devices, and once you're out of lanes, something has to give. This is why you have computers running 16x cards at 8x; there are just too many other devices using PCIe lanes.
Truth is, I think Nvidia had a lot to do with the nuking of AGP via SLI, and AMD via CrossFire. We would likely still have AGP of some sort if not for those two multi-GPU setups.
3dfx was working on multi-GPU configs on a single card to give SLI-like performance or better from one card, but Nvidia had specifically targeted 3dfx to kill them off. Once they bought 3dfx, they took the SLI tech and ran with it, and the AGP slot died off as a result.
Fact is, we probably would not have seen video cards being used for crypto mining if AGP were still around; I doubt it would work well for mining.
You accidentally (I think) mentioned another dead bus. PCI-X is a thing, but it's a different thing from PCIe/PCI-E that we all use for expansion these days. PCIe is PCI Express, but PCI-X was PCI eXtended, an extension of the PCI bus mainly used in servers and workstations. It was specced to use the same slot as PCI, but was most often seen with an extended 64-bit physical slot.
As for PCI-X video cards, there were some Matrox cards, and I think ATi made one, but the bus was mainly used for RAID and NICs.
SSL accelerators/co-processor were a thing on server side back then.
I used to have one in my Mac LC II; it was a 32 MHz floating point co-processor...
Technically Xeon Phi coprocessors only just got completely discontinued a few years ago
They are still being used, just not as commonly.
@@hiddenhawk13 Yeah, I wonder what would happen if I added one to my Z800 with dual 5570 Xeons.
Buying a box of 387s and installing them in all the boxes over a weekend was a kind of fun the kids just don't appreciate these days.
This brings me back to 8th grade when I started to dabble in PCs. I tore open our family's first PC which was not being used anymore and it had the P3 socket CPU. That was my first and only experience with those. Ah, good times
Your icon is a dinosaur, but your comment makes me _feel_ like a dinosaur
wait, do you actually mean a Socket 370 (PGA370) Pentium III or a Slot 1 Pentium III...? (and yes, it was made in both!)
@@RWL2012 It had to have been the super-late-'90s socket P3, when Intel once again switched the package back to a socket instead of the slot. My first ever system build was a Pentium at 75 MHz. But I did buy a 486DX2 system off a garage sale, refurbed it, and used it for a good while longer.
@@RWL2012 there were even adapters so that you could fit a Socket 370 P3 into a Slot 1!
Ah yes the Pentium 4 days when we used leaded pencils to overclock our Socket-A cpus :)
that's graphite bro
Remember Tom's Hardware liquid cooling "how-to's"?
And this th-cam.com/video/NxNUK3U73SI/w-d-xo.html
I used conductive paint pens for rear window defogger repairs lol
YESSSS, not only could you overclock Socket A CPUs, but you could unlock some of those newfangled ATi Radeon cards! Some models could be unlocked this way to higher pipeline counts... and a few could be software-modded into Radeon 9700s!
I'm just glad you noticed there's an 'L' in 'soldering', and didn't say 'soddering' like some people do. 😊
One format was so doomed to fail it had an expiration date... literally: the DVD rental format that, after a given amount of time, would wipe itself, even when never rented. Had this taken off, imagine the e-waste.
DivX! Yup
@@LeafTheWitch Technology Connections actually has a great video on that th-cam.com/video/ccneE_gkSAs/w-d-xo.html
@opisex I encountered them when I was driving a truck over the road at truck stops. I thought they were a great idea (if a bit wasteful) for us truckers, since renting wasn't really an option.
Not e-waste, technically. Nothing e- about those DVDs.
Needing a floppy drive to install DOS and cd rom drivers, so you could install Windows.
I remember those days. My dad wouldn't let me keep software disks in my room, as they were technically his, and he only let me borrow them in the order needed. To install Windows 98 I first had to boot from a floppy disk and type DOS commands to install the Oak Technologies CD-ROM device driver. After that I could use the Windows 98 disc to install Windows. But once I was in the computer class in high school I started accumulating my own software. Ah, good times.
@@fightingfalconfan Indeed, good times !
Son Goku
You were late i guess.
Before Windows we were already downloading anything we needed...
BBS!
Disks? For housewives only, I guess; you still get them in 2021!
@@lucasrem I was late? I was 11 at the time and the internet wasn't a thing till I was 16. What's it like being 60? 😜
If your mobo is recent enough to have a USB (even 1.1) interface, maybe a USB floppy drive could work (with less money out of your wallet).
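For anyone who never had to do the floppy-then-CD dance described in this thread: the boot disk's CONFIG.SYS loaded the CD-ROM device driver, and AUTOEXEC.BAT ran MSCDEX to give the drive a letter. A rough sketch of the two files (the driver filename and the /D: device name varied by vendor; OAKCDROM.SYS was the common Oak Technologies driver mentioned above, and the drive letter here is just an example):

```
REM CONFIG.SYS -- load the CD-ROM device driver from the boot floppy
DEVICE=A:\OAKCDROM.SYS /D:MSCD001

REM AUTOEXEC.BAT -- assign the driver a drive letter (D:) via MSCDEX
A:\MSCDEX.EXE /D:MSCD001 /L:D
```

Only after both lines ran could you switch to D:, run SETUP.EXE from the Windows CD, and finally leave DOS behind.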
"It's all about the Pentiums baby!"
- Weird Al
You're using a 286 man throw that junk away it's an antique!
You make me laugh! Your windows boots up in what, a day and a half?
@@ChucksSEADnDEAD waxin my modem tryna make it go faster
You've had that desktop for over a week? Throw that junk away, man, it's an antique!
Or my favourite:
You think your Commodore 64 is pretty neato. What kind of chip you got in there, a Dorito?!
Your laptop’s a month old, well that’s great…
if you could use a nice, heavy paperweight.
I'd love to see you cover PhysX. I remember those add-in cards. 😁
THANK YOU for pronouncing "soldered" correctly, and not the "soddered" abomination.
Let's start with: cartridges that my TI-99 used lol
My favorite game on the TI-99/4A was B1 Bomber... which I loaded from my cassette deck (I could start loading the program, go downstairs, make and eat lunch, come back, and it was almost done loading).
LMAO.
Or the tape drive on my commodore haha
@@davidamoritz I spent the $150 in 1986 to get the 5 1/4" floppy drive.
@@wolfshanze5980 in 96 I spent 800 on a cdr lols it was 4x write haha
I miss when Techquickie was child Linus in front of a white background
I want them to cover Riley's failure as a tech enthusiast and a husband
@@GGP39 woah
I assume you are just saying this nostalgically. It is a much better show now.
lol, this goes so far o.O
@@GGP39 Am I missing something, or is that just harsh?
I still have my Pentium II slot processor… one day I'll show my kids how big a 200 MHz CPU was back in my time…
The CPU was actually that size due to the CPU and cache being on different chips to improve yields.
I found a Slot 2 machine at a waste collection site that had a 386 socket for it. It was pretty neat and I wanted it, but you can't take stuff from waste sites.
I broke the mount in mine :c
I've been searching for a replacement everywhere but I've been unable to find one
My buddy had the 333 mhz one. Think it was a p3. He called tech support once and they couldn't believe he had such a fast CPU cause it just came out.
@@damianosplay9457 Sad days, I had half a mind to grab the one I found but I like my job and didn't want to lose it over a computer that likely didn't work anyway, hah.
I remember my 486’s IDE controller was on a separate ISA expansion card.
Remember silicon discs? Having a RAM drive to load files onto, manage, or run them from was like having an SSD in 1984 (except it didn't keep the data after power-off). Also, 3" double-sided 200 KB floppies were amazingly fast and quirky compared to the 5.25" disks.
Oh, and those Slot 1 Pentium IIs could certainly have cooler issues too. I recall having to replace one, as it had become slow from overheating too much.
I once had a Sony Vaio desktop PC which had a built-in MiniDisc drive. That was awesome. You could use it to record music for your portable MD player, and also for backups. I really loved MiniDisc back then.
Here's a Techquickie idea: can you explain why YouTube often recommends, or queues up as the next video, something it knows I've already watched? Why would it do that? What are the chances it's a video I really want to watch again? Surely they'd want to keep me hooked by offering only new content.
I'm tired of this too. It's driving me crazy
Click the 3 dots that appear when you hover the selection, "Not Interested", "I've already watched the video.".
@@Furious321 That doesn't actually answer the question of why, though.
@@emerje0 do you have a pi-hole or ad blocker? It could be disrupting Google tracking. My setup seems to completely block TH-cam history, for better or worse 🤷♂️ but no ads for me
@@ericwright8592 No, in fact I have YT Premium because I hate ads but still want to support the content I watch. But that shouldn't matter: as long as a video has a red bar going the full length of it, YT knows I watched it and shouldn't be recommending it.
If there's one thing that holds true, it's that economics plays a large part in the success (and subsequent improvement) or failure of technology.
I remember taking a PC hardware class where we were taught to build and troubleshoot a PC that used a Pentium 2. Didn't even think about the CPU being a cartridge at the time.
I remember reading that one of the reasons for introducing Slot 1 was preventing competition from making compatible CPUs. There were some patents and other stuff.
When did the truth ever stop Linus and co. talking complete bollocks?
@@ae5668 to be fair to Linus that very reason was published in the magazines at the time
I'd mention Firewire, but I can see how it really should only be an honorable mention at best.
firewire is still used too
It’s sad what happened to FireWire, because its (and TB 1 & 2's) lack of adoption is why Apple said _"fuck it! If you’re not going to adopt our superior connector, we’ll make laptops that only have said connector!"_
Also, I still have FireWire cables and drives for whenever I’m servicing older Macs.
Still use it, won't give it up. In fact, when I build a super-secure system it's one of the things I like to use, because it is 100% rock solid and I know everything about it from the inside out, including how it transfers data.
I think I still have the driver key to use the FireWire networking protocol they released. Later on it was built into Microsoft Windows, and of course Linux had it first.
At least for video or audio work it was the bee's knees.
However, no matter what you did, FireWire drives were way more expensive than all the rest, including the sum of the cables, which was absolutely atrocious, as was how much they charged for the FireWire interfaces.
Yet even after all these years, some of these drives I have, about 20 years old now, still work over the FireWire interface. I think that's why I love a Data Drive so much: they were the first FireWire drives I ever purchased.
Followed by the drives LaCie sold.
I really want the Lightning port to be the next FireWire, but Apple doesn't want to let it go. Thankfully the EU has other plans.
Fun fact: the main reason Intel named the line of chips "Pentium" was so they could trademark a name and not have other companies, like AMD and Cyrix, simply copy their number scheme, e.g. 386, 486, 586(?).
I could be wrong, but you are somewhat correct. It was more that Intel lost a court case, in that they could not trademark a number, and then had to go with a name.
Then the K6, K6-2, and K6-III happened.
It also was great marketing, because "Pentium" sounded rather exciting and like a significant upgrade, unlike, say, 286 to 386. AMD and Cyrix had 586 CPUs, but AMD chose to follow Intel's lead afterwards and created the Athlon brand. For Cyrix, the 586 was their last CPU, I think, before they went out of business.
@@dahak777 Intel tied AMD up in court over AMD's superior 40 MHz 386 CPU, which kept pace with a 486 SX 25 (and you'd have been dumb not to overclock that to 33 MHz, which most of them could easily handle). Intel didn't have to win, they just had to delay AMD enough.
That 386 chip sold well into the second half of the 90s in used or secondary PCs. Hell, I'd have loved to have one even in 1996; I still had a shitty SX/20.
@@jayhill2193 Cyrix followed up with the MII and MediaGX but got pretty stifled by the NatSemi merger, then all but killed off by the VIA buyout. The Cyrix III was the last Cyrix chip with their own design, but it got snuffed out in favor of the "VIA C3" low-power variant using CenTaur's core architecture.
I'd love to see something on PCI-X and AGP
If I remember correctly, RDRAM had continuity modules that had to be installed in any remaining open slots on the MB, too. What a crazy system.
Reminds me of Apple Macs back in the day, when they used SCSI ports (rather than the parallel or RS-232 serial ports used by PCs) for peripherals. You had to daisy-chain your scanner, printer, etc., and install a terminator on the last device. I believe you also had to set a SCSI ID for each device.
@@LC-uh8if Not quite. RDRAM works kind of like those old Christmas tree lights: if one lamp breaks, the series is broken, and nothing works.
SCSI, on the other hand, needs to be terminated at both ends of the bus, because otherwise data would bounce around causing collisions.
I was working at Intel DuPont back in '96-'98, and they told us the ceramics surrounding the CPU were getting expensive. The vendors that supplied the ceramics wanted more money, and the slot design was adopted to reduce cost.
Next candidates for e-waste: PCs that have sufficient performance for Windows 11 but do not support Secure Boot and TPM.
Lets make a mountain of e-waste!
LTT has released a video on how to bypass this requirement (for now, at the very least)
@@vidiot5533 Yeah I know but it's a long path until the release version.
Let's hope there's a permanent workaround; it's so sad to have such forced obsolescence in otherwise completely functional CPUs. I just upgraded to a Ryzen 5900X so I'm golden, but I really could have hung onto my trusty old overclocked hexa-core Socket 2011 Xeon for a bit longer with little performance hit. I really just wanted more USB 3+ ports and NVMe.
L I N U X
Intel shot themselves in the foot with the Slot 1 Celeron 300A: with the right motherboard you could change the FSB from 66 MHz to 100 MHz for an instant 450 MHz CPU, which was a beast at the time.
@Clarissa 1986 Celerons did not have enough cache for serious tasks.
@Clarissa 1986 Celeron was cheap because it did not have enough cache for workloads. Cache memory was expensive, so Celeron was OK for people who did not need best possible performance. If small cache was enough for tasks you did on your PC, then you did not notice difference to Pentium.
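The 300A trick described in this thread is just multiplier arithmetic: the chip's 4.5x multiplier was locked, so raising the FSB raised the core clock proportionally. A quick sketch (the 4.5x figure is the 300A's real locked multiplier; the function name is just for illustration):

```python
# Celeron 300A: locked 4.5x multiplier, so core clock = FSB x multiplier.
MULTIPLIER = 4.5

def core_clock_mhz(fsb_mhz: float) -> float:
    """Core clock for a fixed-multiplier CPU at a given front-side bus speed."""
    return fsb_mhz * MULTIPLIER

print(core_clock_mhz(66.6))   # stock: ~300 MHz
print(core_clock_mhz(100.0))  # FSB overclock: 450 MHz
```

Same silicon, 50% more clock, which is why the 300A became legendary.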
I love that twenty years after some event there are still live articles about it.
Actually the first slot-based CPU packages that were put into PCs were those for DEC Alpha AXP CPUs, specifically the first generation: 21064, put into DEC AlphaStation workstations. They were PC-like in almost every way, except for the CPU. They had ISA slots, they had PCI slots and an Intel FX chipset to support all this. They were introduced in late 1994.
I used to own one and had equipped it with an original SB16. It outperformed my K6 (or K6-2, can't remember) in very many ways.
The EV6 bus design from the Alpha 21264 was the basis for the AMD Athlon bus design.
Old days were sooooo much better. I miss the 90s and the computers of the 80s.
2:14 I was really hoping BTX would be a successor to BTS
How can you not cover those sweet, sweet dial-up modems that connected to your phoneline and made such wonderful music?!?!
Rarely when my dad allowed us to use the dial-up connection it was something precious that you had to turn off soon after due to phone bill rising 🥲
Beautiful music to my ears...
Listen to the song "With You" by Linkin Park and feel nostalgic. 🙌
@@Shahaaim yuuu linkin park
I had a slot CPU (Pentium II). It was a nightmare: if I accidentally moved my PC case, the computer would freeze and fail to restart.
Most of the time I had to open the case and remove and reinsert the CPU.
But you didn't need to be a master of applying thermal paste back then, so that's a plus...
I had several Pentium II PCs, even dual CPU ones, and it never happened to me, and I've never heard of such problems from the others so... sounds like a personal problem :P
@@spavatch Some motherboards were better than others at keeping the CPU stable in the slot. Of course, some people used third-party coolers, whose weight (and gravity) could also cause problems.
@@bionicgeekgrrl - I think what constitutes the problem is the fact that people didn't use original CPU brackets. Most mobos had foldable plastic rails with fastening clips that held the cartridge in a safe, stable, upright (or more precisely perpendicular) position. Some guys took them off for one reason or another which... brings us to my assumption that it's more of a personal problem rather than anything else ;)
Did you blow on the CPU socket before you put it in, too? roflmao. Seriously, that would make for one hell of a meme.
2:00 The AMD K6-III was one of the first PC CPUs to have L2 cache on the CPU itself, and when it was put into a Super 7 board with onboard L2 cache (designed for the K6-2), that onboard cache became L3 cache and made the K6-III significantly faster than Pentium IIs and IIIs running at the same clock speed. But the reason the Athlon had to be slotted was that it carried its L2 cache in the cartridge, just as the Pentium II and III were packaged for their massive L2 cache. The weird thing about those is that the last slotted Pentium IIIs, after Intel went back to sockets, had everything integrated into the CPU and were surface-mounted inside the cartridge so OEMs could use up their last slotted motherboards.
I had 256 MB of RDRAM on an Intel 850 motherboard with a 2 GHz Pentium 4 in 2002, plus an Nvidia 5200 GPU; it was the fastest machine among my friends.
Who doesn't like a quickie of the tech persuasion?
4:39 I still get a good laugh whenever someone mentions the Juicero. What a hilarious piece of trash.
I think the best thing regarding it I saw was someone testing whether the device was even needed by taking one of their pouches and squeezing it by hand. Turns out it was faster that way.
Legit question: Why don't we go back to cpus being on their own cards?
It's really annoying needing a new motherboard for CPU upgrades, and different motherboards for AMD vs. Intel.
Graphics cards are basically plug and play anywhere.
A socket for any CPU means a socket for your chipset and some *expensive* FPGAs for adaptive routing. GPUs are co-processors running over an expansion interface, not primary processors
The card connection on a GPU is just for data throughput, and the core handles it and the memory. So in theory as long as data throughput doesn't increase beyond the limits you won't need a different connection. And if you use an earlier PCI standard on a newer card it simply caps the data throughput to the max it can handle, if I'm not mistaken. Like connecting USB 3 flash drives to a USB 2 port.
Meanwhile a socket iteration often either adds pins or changes their locations which means that the connections inside the motherboard are not compatible. For example a CPU socket that supports triple channel memory needs to have pins that handle those connections to the third channel. Even if you custom-made a CPU from a previous generation that fits in that socket but only has dual channel memory, you probably wouldn't even get it to boot because everything would be crossed. The pins that were supposed to handle the third memory channel could be trying to connect to the sound card, it would be a total mess. So trying to replace the socket for a card slot without requiring a new motherboard with a new slot would be extremely hard because all the gold contacts on the slot would need to line up with the next generation and new features would be difficult to add.
This isn't to say all socket changes were justified, I'm sure there's been a few that were done just for the money.
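The "caps the data throughput" behavior mentioned above is link negotiation: a PCIe link trains to the lowest generation and lane count that both ends support. A rough sketch (the per-lane GT/s figures are the published PCIe 1.0-4.0 rates; the function itself is illustrative, not a real API):

```python
# Published PCIe per-lane transfer rates, in GT/s, by generation.
GT_PER_LANE = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0}

def negotiated_link_gts(card_gen: int, slot_gen: int,
                        card_lanes: int, slot_lanes: int) -> float:
    """Total GT/s after the link trains down to the common denominator."""
    gen = min(card_gen, slot_gen)
    lanes = min(card_lanes, slot_lanes)
    return GT_PER_LANE[gen] * lanes

# A Gen4 x16 card in a Gen2 x16 slot runs at Gen2 x16 speeds:
print(negotiated_link_gts(4, 2, 16, 16))  # 80.0
```

That graceful fallback is exactly what a CPU socket can't do, since its pins are hard-wired to specific motherboard functions rather than a generic serial link.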
Technically speaking all modern CPUs are "in their own cards", they're just really tiny and feature lots of connecting pads/pins in a flat 2D array.
Jokes aside, as Titanium Rain explained, the reason why sockets differ generation to generation is because CPUs are much more directly connected to the rest of the hardware than co-processors like a GPU. It *is* feasible to put CPUs in some sort of standardized socket, but you would either need to over-spec the connector for future-proofing (incrementing costs), create abstraction layers that allow new features to be addressed over old channels (increasing latencies and reducing performance), or change it for a new one in a few years (what CPUs do right now)
Your question assumes that CPUs being on their own PCB made them interchangeable. They were not. They had no advantages over modern CPUs; they were just larger and more expensive to make. As for why we don't go back: a slotted design today would be larger, more expensive to make, and slower than modern CPU designs.
There is a good chance this might happen.
AMD already has sockets that survive for many years, and there are experimental technologies to connect RAM through the PCIe port.
So there is a chance that next generation CPUs will only have power and PCI-E and nothing else.
3DFX add on cards... Gotta love those daisy chained vga cables :)
Another huge thing that disappeared: AGP. There was a time when GPU manufacturers sold the same GPU model for both AGP and the new PCIe, if I remember correctly. AGP was the gold standard for gaming GPUs. PCIe was new; it sometimes performed worse, sometimes better, and people didn't trust it, but with lots of improvements it eventually managed to replace AGP for good, and relatively quickly at that.
If you found a PC in the trash, and it had an AGP slot, you took it - because that's all you really needed for a gaming computer.
I made SOOOOO much money installing retainer clips for those crappy processor slots. Thank you, Compaq, for designing them so that over time thermal expansion and contraction caused the CPUs to wiggle out of their slots. Great fun! …LOL… :)
I remember when you would install the cache on the motherboard itself. You could upgrade it without changing the CPU. Oh the good old days.
Ah yes. Those days. Not enough low mem to run programs (TY QEMM). IRQ conflicts. And the horror of having to manually TURN OFF YOUR PC! 🙃
@@acubley And add a SCSI card in there for fun.
And back when you could configure the VRAM used by GPUs, good times
This is something he missed in his discussion of slot-based CPUs. Yes, he got it right about the L2 cache being on the board with the CPU, but he missed part of the explanation of why it was there. As you mentioned, many Pentium and 486 motherboards had L2 cache chips on the motherboard. Some of them were socketed and could be upgraded, and some even came on a little board that fit in a slot itself; I believe that was referred to as COAST, or Cache On A STick. Anyway, the real reason they put the cache on the board with the CPU on the Pentium II was so it could run at higher speeds than the FSB. Remember that in the 386 and early 486 era, the clock speed of the CPU was the same as the FSB. With the 486 DX2 66, CPUs started to run at multiples of the FSB speed, and by the Pentium II era they were running at very high multiples of it. At that point, cache on the motherboard would have been a huge bottleneck because of the huge difference between FSB speed and CPU clock speed, and would have been pointless. So, a compromise between having the cache on the motherboard and having it actually part of the same chip was to put the cache on a board with the CPU, as on the Pentium II, or to put two chips on one substrate, as on the Pentium Pro. That way the cache could run at a higher speed than the FSB the motherboard ran at. The L2 cache actually ran on what was called the "back side bus" of the processor, and on the Slot 1 Pentium II it ran at half the clock speed of the CPU. For example, on a 450 MHz Pentium II, the L2 cache on the back side bus would run at 225 MHz, as opposed to the 100 MHz FSB. That was obviously a huge performance improvement. If I'm not mistaken, the larger Slot 2 Xeon variety ran the back side bus and L2 cache at full speed, just as the Pentium Pro did, which further improved performance.
When they advanced to the point where they could put the cache on the CPU die itself, this was obviously no longer needed. Anyway, he failed to go into that detail. And this wasn't really a "failed technology"; it was simply something that was necessary at the time and, with advances, no longer was.
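The back-side-bus arithmetic in the comment above, sketched out (the half-speed and full-speed dividers are the ones the comment cites for the Slot 1 Pentium II and Slot 2 Xeon; the function name is illustrative):

```python
# Back-side-bus L2 cache clock = core clock / divider.
def l2_clock_mhz(core_mhz: float, divider: int) -> float:
    """L2 cache clock for cartridge CPUs whose cache ran at a fraction of core clock."""
    return core_mhz / divider

FSB_MHZ = 100  # what motherboard-mounted cache would have been stuck at

print(l2_clock_mhz(450, 2))  # Slot 1 Pentium II 450: L2 at 225.0 MHz
print(l2_clock_mhz(450, 1))  # Slot 2 Xeon 450: full-speed L2 at 450.0 MHz
```

Either way the cartridge cache ran well above the 100 MHz a motherboard-mounted cache would have been limited to, which is the whole argument for the slot design.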
Betamax was not obsolete e-waste; it was the dominant television broadcast production medium of the 1980s and 1990s and the basis for modern electronic news gathering (ENG). If you watched TV news at all back then, you watched Beta. If you went to broadcasting school and learned to edit video, you worked in Beta. If you made independent low-budget art films, you worked in Beta.
In the consumer market VHS clearly won...
Beta was far superior; I used to work in TV news. VHS won the consumer market though, as you had more home videos on that format.
For sound recording we used to use the MiniDisc by Sony which was also awesome for its time.
Micro-Channel Architecture - The "old" version of Dave's Computer Warehouse used to have an entire rack of shelving dedicated to MCA-equipped motherboards and add-on cards.
4:15 Not only did Rambus need to be installed in pairs, you also had to install dummy modules in the open slots. So if you had 4 RDRAM slots but only 2 sticks, you also needed to buy 2 dummy sticks for the unused slots.
The N64 has 4 MB of RDRAM, or 8 MB if you use the Expansion Pak.
Just a fun fact! :)
Yup, and the PS2 had 32 MB of RDRAM hidden inside too.
"As any D&D Dungeon Master can tell you."
Dude! Why did you do us DM's dirty like that? I feel so called out.
No plan truly survives first contact with the players. No matter how carefully you construct the scenario, someone will do something unexpected, and from there you're just improvising to create the illusion that you are in total control.
@@nobodyimportant2470 I ran an entire homebrew campaign for over ten years on that very concept - my medieval spelunkers became pie sellers and delivery men...
You didn't mention discrete sound cards, perhaps the easiest component to identify. They don't really make them anymore; they make external ones now, which actually cost a lot more and aren't as easy to work with.
I remember my old PII-350 tower used to lock up every few months, as the motherboard's processor retention clips didn't match the form factor of the heatsink and the CPU used to slowly fall out...
I finally built my first Slot 1 P3 650mhz pc! Running windows 2000 on it and it's a blast. Thanks for the fun video with the history!
I miss the PC cases that laid horizontally, and you put the monitor on top of them. I found them much easier to work with when fixing/upgrading/swapping components, and a more efficient use of desk-space than the tower design that's the current standard.
There are still tons of those. Just mainly itx/ matx
I never see them available on part-sales websites. @@MycaeWitchofHyphae
Thought for sure a Soundblaster sound card would be mentioned.
Soundblaster still makes sound cards
Was expecting the same. I'm still running a sound card with a Creative-based chipset and have no plans to switch to onboard audio.
fun fact when I built my system back in 2012 I still thought sound cards were a thing and bought one of sound blasters cards. Still have it today!
They still make those!
@@aaronmischel4552 Yeah, but there was a time when your game wouldn't run properly if you didn't have one (or a compatible one). Any old sound card wouldn't work.
You could do an episode on the MPU-401, MT-32 and AdLib. AFAIK no sound cards ship with a MIDI chip anymore; anything MIDI is handled by software instead.
The PPU: the Physics Processing Unit. It was so short-lived. This was part of the physics war between Valve (processing done on the CPU, back when CPUs were first becoming multi-core), Nvidia with PhysX processing on the GPU, and Ageia with the dedicated PPU. I think we all know who lost that war.
Another thing that completely disappeared and is almost forgotten in desktop PCs is the external VRM module. During the Pentium 1 era, many motherboards had a pin header for an optional VRM board. This allowed the use of more power-hungry CPUs, or CPUs running at a voltage the mobo didn't otherwise support.
It would be great if you could simply upgrade a PC by putting a CPU extension card in the motherboard's PCI Express slot, for example an ARM or RISC-V chip. FPGA cards like that should actually already be available.
It can be done; it's one of many forms of daughterboard, like the X670 express card from ASUS (now you know why a motherboard is called a motherboard).
The problem is not hardware but software: it's completely proprietary, for proprietary hardware. And unless you're willing to spend hundreds of thousands or millions of dollars on something proprietary tailored to your needs, there's nothing you can do anyway.
My life disappeared
Same bro, same
@@perpetualcollapse Sadness
@TommyInnit 🅥 no
Rip
@@issammelzi728 pp
Please cover turbo buttons, or what I like to call "slow-ass mode".
I don't have the vid up, but I think The8BitGuy might've covered it.
@@scars2k2 nice, will check it out. Tyty
...most of which weren't even connected, but just gave a satisfying clunk!
Fun fact: the Nintendo 64 used Rambus RAM. You had to install Rambus in pairs, or any open slot required a Rambus terminator. Nintendo's "Jumper Pak" is just a Rambus terminator pack; the Expansion Pak's memory is identical to the memory in the N64 itself.
I have an old P4 system with RDRAM in it. Forgot to mention that ALL the RAM sockets need to be filled: if you run only one pair of RIMMs on a board with 4 sockets, you have to populate the other two with dummy continuity modules.
Who else ever burned their hand on one of those cpus when trying to pull them out?
I thought this video was about the current CPU and GPU shortage
Me?
Maybe it's cause u didnt see the thumbnail
same
I thought this was about the LTT office having bad organization
The shortage is eternal my friend
I’d like to see slotted CPU form factors again using modern manufacturing. You could dedicate more space on the actual processor die to raw processing power and then have beefy large caches off to the side. Heat management would take some adjustment but I imagine it wouldn’t be much harder than what goes on in a high end graphics card
keep all the high speed components (cpu, cache, RAM, northbridge, gpu) in one card and the slow components (pci e, lpc, sata) in the mobo, along with the power supply components
Unfortunately, there is a technical reason why we stopped making them, beyond the fact that the cache moved inside the CPU and that making a board for no reason is costly.
The reason is electromagnetic interference. This was the era when we jumped from 300 MHz with the 1997 Pentium II to 1000 MHz with the 2000 Pentium III, the last CPU for Slot 1 (not to mention the P4, which clocked a lot higher but delivered lower performance per clock). When you clock-scale like that, you start to run into signal integrity issues, and the cheapest way to resolve them is to simply make the paths shorter (in other words, cut off the slot, and back to PGA/LGA we go).
Back when the insides of PC cases weren't even painted
OMFG, a Pentium II. I had one back in the day, and I think I still have one lying around in the attic. Also, they didn't use PCI; instead they used something called an SECC, a Single Edge Contact Cartridge, for the CPU, and it was easier to install the CPU that way. One of the motherboards I know of that used that CPU was the ASUS P2L97. I have been using PCs since the late '80s. My dad had a PC when I was born, so I grew up with a PC and many consoles and handhelds. I had most consoles that came out in the '80s and '90s; I even had an Odyssey 2 as a kid, plus an original Game Boy and later the Game Boy Color. Looking back, I don't know where my dad got all this money, because we were not rich by any means, but I had a lot of stuff like that, as if we were.
I remember having to use a credit card to straighten bent pins on a pentium chip. I revived it but man did I sweat that day.
how about the plethora of obscure removable media formats? I'm talking ZIP drives, VCDs, microdrives
The Zip drive was fantastic for a number of years, hardly a failure. Also I'd add Itanic to the list, though that's more a CPU architecture than a PC part. Then there's the plethora of other flash card formats.
VCD is just a regular CD.
@@fungo6631 Yeah, didn't the extra content literally require a whole new player to view it?
Don't forget sound cards. Without one your PC literally had no sound.
(Yes, they still exist today, but not everyone runs a record company. It's not mandatory; that's the point.)
Unless you had a PC speaker. Without that, yeah, there was no sound.
And if you didn't buy a Sound Blaster one, your games wouldn't work.
This is true. Most games didn't need a 3D graphics card at all; the ones that did would still run on the CPU and only got a modest visual upgrade. But without a sound card you're screwed: either no sound or beeps.
It throws me off thinking that I ended up using a USB DAC instead of something off the motherboard or in an expansion slot. Reason: the thing has RCA outs. But still, you get higher quality via USB
the laserdisk story is a fun one about being too ahead of your time and overengineering, yet still really cool tech
Nice to see that Juicero is never forgotten. This contraption must be brought out again and again as a memorial of shame. :)
In an alternate timeline I watched this video on a Zune
I've got a Prescott P4 on my keychain, to help keep me warm.
I had a 30 pin SIMM for a while back in high school (balanced out with a chain wallet and a two door manual transmission car with flames painted on it) but all the chips eventually fell off. I then found out that I can use the PCB to bypass certain locks that credit cards were too flimsy to move, until I broke the PCB trying to get into the science class supply room.
How'd you get a CPU on there? Drill a hole somewhere?
"The computer had faulty wiring that caused it to overheat when the spacebar was held down for too long. When the manufacturer issued a recall, they were met with opposition from children in developing countries, who were holding down the spacebar to stay warm in their unheated homes."
-XKCD (paraphrased from memory)
@@the_kombinator Yup, just drilled a hole in it just big enough for a regular keyring, and also sanded the corners, those are sharp!
Thanos likes to see change in the PC market I guess...
Those slotted CPUs are still used in the airplanes I work on.
We need an entire mini series with Riley in front of a fire, smoking a pipe, covering a decade of computers at a time. A true historian's historian be that Riley.
If you guys could do a video on the somewhat failed format of the MiniDisc that would be awesome! Still have my portable recorder/players that I listen to religiously. Damn good audio!
When he said *As any D&D dungeon master would tell you*
I felt that!
Entire Big Bang Theory cast and crew felt that
When did he say that?
@@r3mpuh did u not watch the video?
When I talk about Voodoo 1 cards to my younger friends, they make funny faces and look like question marks :)
Once you went to Slot 1 Pentium you began speaking the language of my heart
Intel slot CPUs were awesome, especially with the slot-to-Socket-370 adapter. I still have a rebuilt Gateway running that way now.
@@jonlangfitt 🥰
* IBM's PS/2, OS/2 and Microchannel would be an obvious target.
* Thin-client architecture (though I guess we eventually got there with phones and cloud computing).
* NeXT. (I actually saw one in my school's computer lab, but people just stared at it like it was the Monolith... since there was nothing you could actually _do_ with it.)
* Discrete math coprocessors.
* Zip drives.
Hide the money, y'all. There's Pentiums around.
Why? Take the money and buy some steaks. A Pentium 4 can cook that sucker just right!