My father was a personnel manager for CDC in the 70's. He started in Minnesota and then moved to California when they built their facility in La Jolla. In 1973 when my father was working on his PhD he would go to the office on weekends to run punch cards through the computer for statistical analysis of manager self-assessments. I remember playing tic-tac-toe on the console with a guy while my dad was running those punch cards. That guy was Seymour Cray.
@marklarma4781 Well, the Superfund site is from lead smelting. Judging from arsenic findings by the CDC in some east Omaha yards, I suspect it was testing for heavy-metal contamination, just as the CDC still has a WTC group from 9/11 due to exposures after the attacks and collapses.
@marklarma4781 My dad also worked for Sperry Univac in Omaha when I was a young kid. I did not know about CDC either. He only recently told me that they had one encounter/project at one point, trying to get their system to communicate with CDC's system. Evidently, it was quite the effort to do so.
What a story! CDC was the supercomputer back then. I was a student (physics major) in the '80s, fascinated by supercomputers. The first time I saw a picture of the Cray-1, I was very excited.
I worked for a company that partnered with Cray Research to distribute their UniChem quantum mechanics software. While we were at their HQ, we saw a machine (I think it was a C90) in the process of construction and one of our scientists said something like "Oh, I see you changed the color" because it was a different color than the C90 he'd used in grad school. The response was "For thirty million dollars we'll paint it whatever color the buyer wants."
I used to test hardware for a company specialising in SAN and NAS storage; they built racks of servers and hard drives. They were showing a customer the racks, which have a door with the company logo on it. The customer remarked that the door would look great on his fridge freezer, a size match for the cabinets. A manager took the door off the server cabinet and presented it to the customer at the end of the tour.
Cray certainly made a name for himself. Maybe it's an age thing (I'm in my 50s), but Cray machines were almost like mythological creatures. The guy really had an enormous impact.
I am about half your age and Cray is a magical word to me! But I don't know of others my age who even know about Cray... or Ada Lovelace and Charles Babbage. 🤔 Swedish SAAB aeroplanes are coded in the Ada language, or at least they used to be, but it's a language that takes a lot of text to do anything simple. Nowadays we have LLMs (T9+), so that helps.
I'm not sure where I first heard of Cray supercomputers, but they've always had a sort of mythical quality to me too: not just a computer, a supercomputer. Their designs are intriguing as well; how much was aesthetic and how much functional? Some of them looked like seats at an airport or something.
This was a fantastic presentation! Well done! As an aging computer nerd who came of age in the '90s, I can say Cray still had a whole mystique around them even going into the 2000s. The legend was so strong it took a long time for people to forget the name. Them and Silicon Graphics.
For our generation I think those names will always carry their own mystique and legend, although I have to admit that after working with and owning those things later a bit of the 'magic' dissipated. I got to try VR for the first time on a SGI ONYX RE2 at a computer science fair/expo in Gothenburg Sweden called Tidsvåg Noll 2.0 back in the '90s. I decided right then that one day I'd get my hands on similar hardware myself. Well, I did, I had a deskside ONYX RE2 for a bit in the mid 2000's, very cool but it was a lot harder to work with than my younger self had imagined. I have upgraded since, and now have a Xeon based PC with a GTX-1080 to run my VR headset off of, but as a nod to the dream of my younger self it's built into a deskside PC case lol. Thermaltake Mozart TX is the case, it's not that far off the size of the ONYX hehe. If it hadn't been for that stuff I saw when I was young, I likely would have had a different life and career. I spent many years as a software developer working on 'Enterprise applications', with a focus on integration services, which is in a sense making computers talk to each other in a common language. People per se have little to do with it other than as source data input devices :-P.
Yeah, when I heard of the Cray X-MP, I told myself that one day I would use that computer. Decades later, I would manage a VAX 8800 cluster. This video is a gem of computer history; few knew these details. I never got to see the CDC mainframes, but I did get to touch the DEC Alpha with the experimental EV6 CPU at AltaVista in Palo Alto. They had 5 nodes there...
I love the detail in this video. It reminds me of a story I heard at Ohio Supercomputer Center! In 1987 OSC got a Cray X-MP/24. In 1989 they replaced it with a Cray Y-MP8/864. Both machines had eight stacked processor boards submerged in Fluorinert, and each required an outboard Heat Exchanger Unit. They made a lot of heat. The parking lot of the Supercomputer Center had a LOT of air conditioning compressor units in it, but they weren't exactly tidy; some of them were pretty crooked. That's because when they did the changeover to the Y-MP8, they didn't want any downtime, so they had the X-MP and Y-MP both online for a while. They said the parking lot looked weird because the asphalt had actually melted and got all gooey when both Crays were running.

The storage situation for supercomputers was weird. You had these fast machines, but they were hooked into '80s storage. OSC had hard drives that looked like rows of industrial washer/dryers. They had a room-sized reel-to-reel tape storage jukebox with a robot arm to retrieve tapes! In 1992, I think, OSC had just taken out the robot arm and put in a 1.2TB storage system, which required an additional Cray Y-MP 2E just to manage storage.

Supercomputers just kinda morphed into clusters. There was sorta a mental turning point for everybody when Toy Story came out in 1995 and everybody knew it was made on a render farm of 100+ SPARCstations. It made Crays seem like old news. The last Cray they had at OSC, I think, was the Cray T3E-600/LC they got in 1997, which itself was running 136 DEC Alpha chips.

A little more random backstory... I grew up in rural Ohio. In 1987 Ohio State University started Ohio Supercomputer Center with a lot of state funding. Every university had some access, and they sold time to businesses. I got a Commodore 64 for my 8th birthday and really lived/breathed computers. In 1991 I was 15 and basically the only computer nerd in Springfield, Ohio. I did weird things like begging my parents for nothing but a Turbo Pascal 5.5 compiler for my 386SX/16 for Christmas. This guy Brian Fargo made a lot of late-'80s video games, and I wrote him asking how I could make video games, and he actually wrote back!

I filled out an application to go to the Supercomputer Center's "Summer Institute" for high school kids. One question asked you to write a program which found all numbers up to 1000 which were both prime and Fibonacci numbers, in any language you picked. It was the only programming question! It seemed too easy, so I thought I should do something clever; most people knew a little BASIC at best. I wrote a super-short recursive Pascal function which counted Fibonacci numbers up to 1000, then fell out of recursion checking if they were prime. (There's a quick sketch of that puzzle right after this comment.) I was one of the 14 kids accepted. When I got there the next summer, I just wanted to know how they graded that question! So I finally found the director, Al Stutz, and asked, and he said I was actually the only one who answered it correctly LOL!

The summer of 1992 was a weird time. People didn't have the Internet. 14.4k modems were a new thing. There were a few BBSes in Dayton, Ohio I could call; everything else would be long distance. Prodigy and AOL existed, but they were basically just chatrooms and email. They weren't connected to the Internet, because the Internet was just .gov and .mil and .edu sites. There wasn't much to do. At OSC in 1992, though, they had a lot of machines hooked up to the Internet. They even had a ton of NeXT cubes! And Steve Jobs actually came to OSC to talk about them!
And OSC had a fat fibre connection! But the Internet had barely even progressed past being ARPANET. The only things to really do on the Internet were looking for pirated games or soft-core porn. There wasn't much of any WWW; you just kinda had to know the address of something. But if you did, nothing had much security. You could just FTP to wuarchive.wustl.edu and browse everything that nerds had decided to share on the network. College FTP servers were mostly scans of the Sports Illustrated Swimsuit Issue. The newest game was Wolfenstein 3D. The only network game was NetHack. Gaming didn't get big till Duke Nukem 3D, and then Quake and WarCraft 2. The Internet became a lot more interesting after I started pirating PSX games and playing Quake. Anyway, I love that this video has meaningful illustrations instead of just a buncha nearly-irrelevant stock photos.
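For the curious, the Summer Institute screening question above fits in a few lines of modern Python. This is just an illustrative sketch (the commenter's original was a recursive Pascal function, which is not reproduced here):

def fibs_up_to(limit):
    # Generate Fibonacci numbers <= limit: 1, 1, 2, 3, 5, 8, ...
    a, b = 1, 1
    while a <= limit:
        yield a
        a, b = b, a + b

def is_prime(n):
    # Trial division; plenty fast for n <= 1000
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

# All numbers up to 1000 that are both prime and Fibonacci
print(sorted({f for f in fibs_up_to(1000) if is_prime(f)}))
# prints [2, 3, 5, 13, 89, 233]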
I graduated from Illinois in early '88 in aerospace. We already had a Cray-2, and an X-MP-64 was being commissioned. I never got to play with either, but the postgrads in the department were doing their first aerodynamics simulations on them and producing some amazing detail. I was on the Illini swim team, and one of the guys who'd graduated several years earlier was back doing his PhD on the Cray systems. We discussed some of what they were doing. Two things I remember are in software and the heat issue you mentioned. What I was told is a little different, but it's still in the vein of how much heat they generated.

1) HEAT - The Crays used heaps of power and generated monstrous amounts of heat, which is why they were in a vat of coolant like you describe. At Illinois they went to the civil engineering people for help, because they pumped the coolant so hard it was eroding the circuit boards. The only people at the time who understood erosion were the guys who dealt with soil erosion, and they were civil engineers.

2) SOFTWARE - They weren't writing a lot of new simulations at that stage except in some odd areas (like wings entering stall conditions), as they already had the math worked out for a lot of things. Most engineering software is basically a number-crunching exercise, and by the late '80s they had many algorithms worked out. So in many instances they weren't writing new code but re-compiling it for multi-core vector processing, which is what the Crays were. In those days I was doing Finite Element Analysis (structural analysis) because that was my professor's forte. We didn't need to re-write anything at that time because we already had systems like NASTRAN. What we needed was that stuff re-compiled so that we could run larger models or just run them a lot faster. The guy I knew who was doing his PhD was working on exactly that: figuring out how to take existing code and decide which parts got done by which core, so the whole number-crunching exercise could take advantage of the Cray. When you look at your laptop or desktop and it says 2-core or 4-core or 8-core, that stuff works because of the work that team at Illinois did back in the late '80s.

On a side note: Marc Andreessen, who's famous for the advances in web browsers (Mosaic, Netscape, MS Explorer), started at Illinois a year after I graduated. He worked at the National Center for Supercomputing Applications (NCSA) at the University of Illinois. I think his attitude towards government-funded research totally sucks. He got all his wealth from an amazing opportunity to develop his skills, paid for by the American people and the State of Illinois. Some of the things he's said in recent years are seriously egregious, and his technology manifesto is insane to say the least.
I was at OSU in the early '90s and can confirm so much of this, except my access was always through the terminals in the Baker Systems building. The worst part was trying to find an open seat, as there were tons of people playing Netrek for hour after hour. Thank you for sharing your story here... it made my day to read it!
The original 'Tron' movie graphics were done using special CRT-based film scanners and recorders. They were connected to a Cray I/O processor with a then-unheard-of 100 megabytes of RAM. The graphics were computed on the Cray, then recorded back to film. I had the privilege of working on the scanners/recorders many years later, when they were upgraded with more RAM and 100 MB/s HiPPI I/O interfaces.
The CGI for "The Last Starfighter" was done on a Cray computer. The Cray computed a frame, displayed it on a high-resolution screen, and a camera took the shot, frame by frame, the entire way through the movie. My boss wanted me to help the company sell computer time on the Cray, so I spent several weeks in LA working on accomplishing this. It was the first movie where I knew people in the credits.
In fact, there is a Cray-1 shown in the movie. I was a student programmer at the computation center then, and our boss closed up shop and took us all to see Tron. We gave the Cray-1 a standing ovation!
You're thinking of The Last Starfighter, perhaps. TRON used much more pedestrian computer hardware - two separate systems, actually. One system that constructed objects out of solid shapes (tank battle, light cycles IIRC), and another system that could do more organic shapes like the solar sail ship.
It's unbelievable to me how much those early days of ERA sound like the modern IT and tech industry. Over-budget projects, having to raise venture capital, "meme stock"-ing yourself to the every-man, Cray acting like a "veteran engineer" straight out of college, pivoting, mergers... it never stops with this story, and it's no different today.
Cray found it funny that Apple used his machines to create the Macs while he used Macs to make his supercomputers 😆 Jobs and Cray both made jokes about it 😆
The truth is that the Cray at Apple was purchased to design the Aquarius CPU, which became a dead end. Later the cover story about using it for plastics design was widely circulated, and they did do all their R&D 3D rendering on the Cray along with injection mold design, but using it for Macs was really about trying to recoup the investment. The simulations ATG (the Advanced Technology Group at Apple) was using didn't parallelize well, and the Suns which were terminals to the Cray were not a whole lot slower than the Cray at running the same simulations. That's when they jumped whole hog into using it for plastics. I was a contractor to ATG in the mid-90s. I got to make the ancestor of what you now know as Street View.
@@verttikoo2052 Apple didn't create the ARM and they didn't come along in ARM land until 1990 with the ARM610 which powered the Newton PDA, but ARMs had already existed as fabricated chips since 1985. The Acorn Archimedes computer was released in mid 1987 with an 8 MHz ARM2 and ARM originally stood for Acorn RISC Machine. In those days Sophie Wilson was Roger. She didn't become Sophie until 1994.
@@Peter_S_ Apple indeed created ARM, and that is a fact. Apple saved two nearly bankrupt companies, Acorn and VLSI. These three companies put their IP, engineers, and other assets into the new company, and Apple injected money into it. Each of the three owned 1/3 of the company. Apple had to sell its 1/3 when Apple itself was near bankruptcy. Now Apple is back owning ARM, but only with 10%. ARM is a massive success story, and only thanks to Apple.
I really miss those times! I worked for several of the aforementioned in various capacities during their heydays from the late '70s to the mid '80s, rubbing elbows with many of the well-renowned luminaries of the time. We all felt the fire underneath us as we worked on and developed the world's fastest, most advanced computers. As a recent Stanford EE grad in my twenties, I was truly in my element. I really loved my life and what we were doing; it was an immensely satisfying time in the SV and I miss it so much!
Fun fact: they're still making supercomputers to this day in Chippewa Falls. The company was eventually bought out by HPE (Hewlett Packard Enterprise) a couple of years back, but my brother works there and it is still involved with Los Alamos too.
I can truthfully say, around half of the multi-core computer I'm happily battering away on the keyboard of comes directly from Cray innovations. Ironically, Cray being bought by HP and this computer being an HP...
I remember an old mythical rumor many, many years ago that a Cray system had *finished* an infinite loop in something like a week of continuous operation.
I used these a bit. In 1971 and 1982 I used a CDC 6400 at Kitt Peak National Observatory in Tucson. Fast machine with six peripheral processors, but its memory was very small. A lot of software ran on FORTH, which is uniquely miserly in its use of RAM. In 1987 I used a Cray-1 (or X-MP?) at the University of Toronto, by this time remotely via the Internet. The funny thing is that these days a $10 Raspberry Pi outperforms these machines (at least for pure scalar number-crunching). Massive superscalar parallelism has taken over, with GPUs adding yet more punch. But the CDC and Cray machines had a certain magic for scientists in their day.
I remember Cray being synonymous with supercomputing, even into the '90s. You couldn't mention one without the other also being considered. If I remember correctly, the movie The Last Starfighter, a pioneer in cinematic CGI, used a Cray X-MP as part of its graphics pipeline.
There is a making-of documentary for The Last Starfighter that shows the computers used and things like that. It's a great watch, with all the interviews, and Lance Guest does the voice-over for it as well as appearing in it too, of course.
I got to visit Digital Productions in Los Angeles when they were working on The Last Starfighter. As a teen computer nerd, I was blown away by the Cray X-MP they were using. The graphics (at the time) were just incredible. It's something that I remember vividly to this day.
I worked with a guy at NVIDIA who worked at Cray. He designed their memory management system. He did exactly the same thing at NVIDIA that he did at Cray, but instead of it being in a supercomputer, it’s now in a GPU. A GPU is basically a Cray computer on a chip.
I worked at Cray. It was an amazing experience. I could speak directly with the engineers. When I say directly I mean over a beer in Chippewa Falls. It was just plain mind-blowing to work with so many brilliant people.
Even though the philosophy of the single monolithic supercomputer lost to the distributed computers of today, one thing is for certain... the Cray computers looked just damn cool. :D
I grew up in Livermore and worked at Sandia Livermore in the 1980s and 1990s. Our Crays (X-MP and Y-MP) were giant money pits for the users. I remember the computing department having utter fits when other departments started getting desktop workstations like the first Sun SPARCstations and Silicon Graphics machines. I spent a full year porting all of the programs we had on the Crays to those machines so we could save huge amounts of cash. While the engineering and materials departments were moving to these small, fast machines, the computing department was buying oddball machines like Stardent (or the earlier versions from Ardent and Stellar) and continued to push for shared computing platforms.
Yeah, I used to have an old pizza-box SPARC. Great little machine; it gave my old Alpha box a run for its money, despite having a much slower processor. It's a shame what likely became of those Crays: since they were on the precursor to JWICS, they were probably literally melted down. For those challenged by the term, JWICS is the Top Secret network used by the DoD, amongst quite a few peers with narrower scopes of ignorance (nope, never heard of that, honestly! Lie detector melts down...). And don't get me started on the lie detector being a lie... or LLNL getting its reputation ruined by Teller. It takes 20 years or more to build a reputation, and 20 seconds to ruin it with bullshit. Still, it was the era of SDI vaporware, spending the Soviet Union into oblivion. And my fingers were just as dirty...
Absolutely fantastic work, extremely well presented! I am so glad the algorithm brought this to me, and I hope this video brings you a huge number of well deserved views and subs, you've certainly earned mine!
This was a bit of a trip down memory lane; I operated a couple of Crays back in the '80s, a Cray-1, then a Cray X-MP. The Cray-1 was rumoured to be serial #1, which Cray had installed at a customer's site while 'their' machine was being built.
Serial 1 was unique in that it had memory parity but not error correction. That was determined to be inadequate, so serial 2 was scrapped and subsequent machines had SECDED error correction. Serials 1 and 2 couldn't be upgraded because the frame height had to be increased to take more modules. Serial 1 ended up in the UK doing weather forecasting.
That sounds about right, I was at the UKAEA at Harwell in the early '80s when the Cray-1 came. Our workload on the Cray tended to be office hours only so we ran work for the ECMWF 10 day forecasts overnight.
In the mid-80s I was able to use the University of London's Cray (not sure which model) to process data for a research project. I was at the University of Kent in Canterbury (UK). The data chugged up JANET (the Joint Academic Network - this was before the internet); that took 20 minutes. The processing in the Cray took seconds. The results took 20 minutes to chug back down JANET to UKC. How times have changed!
No mention of Cray's attempt to achieve wafer-level circuits... which were never successful. The second digital computer I programmed was a CDC 3300. Cost $1M; filled a room; not as powerful as my phone. My first computer was an IBM 1620, the size of a desk. It took me two weeks to multiply two numbers using machine language... in 1964. I was a junior in Electrical Engineering and saw no chance of computers affecting my career. I have programmed continuously since 1968. Even did analog computing in 1966. Lots of changes. I created a language I still prefer over all others... and am the world's best, and only, programmer using it.
It is really shocking to consider how revolutionary the transistor was, especially when you consider how simple it is. They can now make transistors so small they are made with just a small handful of molecules.
Shocking? Indeed! And speaking of electricity... give me enough relays and wire, and I can make a current flow through any number of complex operations (relay ladder logic). But when you start talking about the presence, or the lack thereof, of a SINGLE electron at a gate? Are you shitting me?? 😮 That's not flipping engineering! That's some black-magic-level shit!
And they are still actively working on molecular, as in single-atom, transistors. And getting nowhere fast. Quantum effects and all. I'm experienced in everything from vacuum tube analog and digital through VLSI. Tons of fun problems to solve over the decades! And yeah, germanium vs silicon was a bit of an earth shaker, a whole voltage potential drop and all. ;) OK, a lot less leakage in silicon, to the point I jokingly call germanium circuits geranium circuits. And I still remember how to build a shift register out of a basic flip-flop.
I was introduced to computers in college, circa 1975. Some kind of mainframe in the basement of a building; punch cards, teletype card punchers, etc. I remained a computer nerd throughout my career(s) - about 40 years. Pretty much all along the way, when the name "CRAY" was mentioned it was almost as if you could hear the "Ahhhhh!" soundbite of 'angels singing' used in movies and on TV. 😄
Superscalar is when two or more instructions are dispatched on the same clock tick. The CDC 6600 was scalar, as it issued or dispatched only one instruction at a time; its multiple functional units allowed parallel execution of instructions. I have programmed the CDC 6600, the CDC 6400, and the CDC 7600. None of Cray's computers were superscalar. In the CDC 7600 the functional units were pipelined, so multiple instructions could be executing in a functional unit at once. See Thornton, "Design of a Computer: The Control Data 6600".
I worked for Cray Research, Inc. for about 10 years (1983-1993), on the IBM VM Station and then on porting various versions of UNIX to UNICOS. It was a GREAT company; I still have very fond memories. Just FYI: one issue with the Cray-2 was the burning of the fluorocarbon (e.g., a short causing arcing). The product would become poisonous, requiring immediate venting and complete draining and discarding of all the coolant.
Fluorinert. Cray was a great company, but its existence was so short and sweet. And then Seymour's untimely death in an auto crash, with his spin-off company Cray Computer. What a loss; I still wonder what he would have come up with. Now hardly anyone knows of or remembers Cray.
@@terrywilder9 Basically, it was a form of freon. So, when superheated, it'd degrade to phosgene, a really nasty substance that'd literally liquefy your lungs, and etch all circuits away due to the hydrogen fluoride level. Massively goobering the chemistry down by a lot, obviously: phosgene and hydrogen fluoride, two very unkind schoolyard bullies for the body. Or circuits. Reflected currently in Halon fire extinguishers, which will release the same noxious contents upon exposure to fire. Yeah, it'll put out the fire, and you personally. Which is why Halon was banned, and Freon banned for ozone depletion. Easily replaced though, for refrigerant purposes, with propane. Not joking. A refrigerant that's nastily flammable. Welcome to the irony of chemistry!
Very clever memory banking, designed for parallel access directly to the CPU. The Cray CPU was very complex, yet so simple, logical, and beautiful. The writing was on the wall as far as large-scale integration and where that was going. Finally Cray translated its ECL chips into one large chip and made the Cray Y-MP.
@@yxyk-fr Actually, the philosophy Seymour Cray was about was to take cheap off-the-shelf commodity components and use them to realize his vision of the fastest synchronized processor he could make for the price. It was indeed elegant, but also just really brilliant engineering vision. When he tried to do the same thing inventing a very fast gallium-arsenide processor, immersed in coolant, things did not go so well. And then he died before it came to fruition, which was a horrible tragedy in my mind; but the age of this kind of hand-made supercomputer was coming to an end, to be replaced by workstations and massively parallel Intel PC chips. @rudycramer225 - there are many levels to IT. And now it is all being replaced with AI, or so they say, consolidating to where they don't have customer support any more except for expensive contract-based relationships, and they are getting rid of moderators. Somehow someone's post or comment is judged bad and they ban people from expressing themselves on what are private companies, but we don't see that they are connected or acting in concert against the citizenry, and yet we all have to use this stuff. The incredible economies of scale are increasing and concentrating profits and laying off workers, industrializing everyone's lives, and shrinking the people at the top to these cold reptile people like Trump. Very scary and disconcerting.
I really love how you tell your stories. Using 3D animations and stylized themes to present information is truly something different from what other content creators produce. Keep up the great production!
During my sophomore year at Livermore High School in 1972, I was in a small fortran programming class. Once a week we were driven down to the CDC office just outside the Lab to run our small jobs on the 6600. We got to wander in the control room and load our little decks onto the card reader. I was in total awe of the large format pen plotter where the head was held upright and moved electromagnetically; no x-y mechanical arms. While I was in grad school at CSU, they upgraded their mainframe to a CDC Cyber 205, which is apparently related to the Star 100. It had a small microwave transmitter/receiver connection from one end of the system to the other, perhaps 15 - 20 ft or so. Mind boggling technology at the time. One of my colleagues became the guy who 'vectorized' other students programs to take advantage of the speed gains. Our professor was invited to Livermore to spend time on the Cray-1 there and soon after I actually got to see a Cray-1 during a tour at the National Bureau of Standards office in Boulder. I really enjoyed this video, it is informative, well constructed, and gave a great backstory to my fortunate little experiences with this technology.
My dad worked at Control Data in the '60s and '70s and was neck deep in all of this. Thank you for doing this; it was really great. I have a feeling it's going to make my dad's day! He worked at that place at 50th and Park, I remember him mentioning that, back when we lived over on W. Calhoun Parkway. We always heard about this mythical figure named Seymour Cray; now I kind of have an idea who he was.
My dad also worked at CDC from the late 60s to the 80s in Montreal. I remember visiting data centres with entire floors full of these machines and whirring magnetic tape drives. I always thought it was so cool.
Man, your vids are really fucking good. Your contemporaries, imo, are dwarfed by the engagement I have when watching your vids, including ones who are much more well known.
Seymour Cray is a really likable character. I like the combination of technical genius and an unfortunate tendency to bankrupt companies by being incredibly bleeding edge. Still, he invented so much of modern computing; the superscalar, pipelined, RISC machines he pioneered are really the guts of any modern CPU. And there's something fundamentally charming about the way he could see the future before everyone else and desperately wanted to get there first.
That was also thanks to Apple, who advanced Cray's ideas. People tend to forget that the ARM we know now exists only thanks to Apple. Apple saved two nearly bankrupt companies, Acorn and VLSI, and created ARM so that each company owned 1/3 of it. All the IP, engineers, etc. from these three companies formed the company, and Apple then infused money into it. The rest is massive victory ✌️ Apple later was on the brink of bankruptcy. Steve Jobs sold Apple's shares of ARM (1/3 of the company), sold shares of Adobe and lots of other stock, just to bring in enough money to survive. And Apple did survive. Apple is a massive success. Apple never abandoned ARM; they used ARM's designs all the way, until ARM ended up here and there and Apple was not an owner anymore. Then they had to secure things and build their own capability to create processors. RISC processors. Twists and turns. Now Apple is again one of the owners of ARM; if I remember right they have 10% of the company. Maybe it was that Apple never really liked Intel 🤭
Excellent work! I worked at Cray from 1981 - 1995 (in fact, my younger brother still works for the current incarnation of Cray - HPE) and spent a lot of time optimizing software to take advantage of the Cray architecture, sometimes at the assembly language level. The one thing that often struck me was how well balanced the entire system was, since just when you started to exhaust one system resource, such as the number of available vector registers, it turned out you were also close to other system limits, such as memory bandwidth and available scalar registers. It was an amazing series of machines!
That's very cool. I was too, but for IBM in Milwaukee, though sadly no Crays. I did get to work on IBM SP clusters (Scalable POWERparallel), which were the pinnacle of AIX hardware, but I'm not sure how those compared to Crays. I'm guessing the IBM analog to the Cray would have been their S/370? I really don't know if the S/370 mainframe (or, in my time at Big Blue, the S/390) was used for scientific computing, although I know the SP was. I didn't belong to that priesthood of guys who could enter the hallowed halls of the mainframes. This was all before Linux took over the supercomputer market.
I was hired out of school to work as a sysadmin on the ETA-10P; ETA was a CDC subsidiary spawned to compete with Cray. After ETA went bust, I worked as a junior sysadmin on a Cray X-MP before working with NEC SX vector supercomputers throughout the '90s. They went CMOS and fought off the microprocessors for a decade before distributed-memory machines took over. Lots of room for future videos there. I had no idea how direct the connection to the military was in founding ERA, Remington Rand, Sperry, and CDC. You did an excellent job digging this stuff up. FWIW, slight issue with the airplanes used: when discussing WW2, you used a C-130, which first flew in 1954, and when discussing airlines in the 1950s you used jet models which first flew in the 1960s... slight anachronisms. Really enjoyed the video.
Worked at CRI from 1987 to 1993. Learned a lot; it was like doing postgrad work once I started in the development division. Spent time on the Cray-2. Worked on a CALMA system to learn circuit layout on gate-array chips. Went on to learn circuit design and layout using Cadence software tools. Running SPICE simulations, using software from Meta-Software compiled on an 8-CPU Cray-2, meant that we could simulate logic circuits faster than anyone in the industry. Other parts of the company were using logic synthesis programs to design and run emulations for whole systems, enabling next-generation designs. I met extremely talented and dedicated professionals who inspired me. I left Cray shortly after the government announced budget cuts for supercomputer purchases, and apparently there was not enough money from commercial sales. The next-gen machines on the minds at CRI used thousands of processors in a parallel processing system. Indeed, all supers today are massively parallel machines.
I worked for Cray Research from 1987 to 1992. A great company whose fate was sealed by massively parallel computing and chip integration. It was an amazingly organized enterprise, and profitable until it got some competition from other technologies. What an adventure that was.
My dad got a job at an oil refinery in California once his enlistment in the Air Force ended in 1968. He would have been part of the first class of technicians training to run the refinery, but when the company found out what he had done for the Air Force (land surveying and drafting blueprints), they had him working on that part of the construction project before joining the second class of technician trainees. He ended up working the rest of his career for this refinery, which was the first computer-controlled refinery in the United States and just the second to go online in the world.

The computer was huge; in 1968 it took up two floors, each 100 feet by 100 feet. Dad would often work in the control room, which I got to see many, many times as I was growing up: at first when I went with mom to pick him up from work, then later on my own, after I got my driver's license and was asked to pick dad up myself. Of course I was never allowed to touch anything, but it was very cool to see. Thousands of buttons, dials, gauges and display screens. It looked very much like I imagined the control room of a spaceship might look.

Not long before dad retired, I was visiting when mom asked me to go pick him up from work. I was shocked by some of the changes. The control room still looked more or less the same, with all the buttons, dials and switches, but the computer room was very different. Instead of a huge room filled with rows and rows of large computer banks, it was now a large, nearly empty room with one tiny box inside, smaller than the average refrigerator. That was the entire computer that ran the entire refinery. In fact, it was two computers: one primary and one backup. That is how much computers had shrunk in the 25 years my dad worked there. And that was 25 years ago! I would not be surprised if I went there now and saw a smartphone sitting on the floor of that huge two-story space.
Probably one of the best videos I've watched on YouTube for ages... fantastic. Kudos, Archie. I've been interested in Cray for many years; this was a great presentation!
It’s pretty interesting how modern GPUs use many of the same ideas like superscalar processing with many pipelines doing vector processing. Enjoyed the video, seems very well researched.
You are thinking of the shader engines within the GPU; those are vector processors. A GPU has many more functional units for graphics in the rendering pipeline.
Excellent documentary and amazing voice over. The accent is unique, yet pronunciation is consistent and not in the least bit annoying. I could listen to this content for days.
Ever since Tron came out in 1982, which used a Cray machine, I have always liked the Cray story. This video really explained everything about how the machines came into existence.
I worked for Cray Computer in Colorado Springs in the early 1990s, for two years: the first year in assembly and wiring, the second in gallium arsenide chip production. I could see the writing on the wall, so I got a new job at a chip manufacturing plant literally across the street from Cray. The next week, former colleagues called to tell me that the place was closed and shuttered. End of an era.
This is an easy-to-understand video, with very clear CG showing the machines and the purposes they served. It is also very important that the names of the scientists involved appear. Thanks for the upload to YouTube.
I thoroughly recommend the book The Supermen as an accompaniment to this video. It covers this story in even more detail. Seymour Cray and his band of merry men were genii.
I remember being in awe of the Cray-1's 80MHz clock speed at the start of the 1980s when I began using early microprocessors. This was a great overview of the general progression of high-end computing without delving into the technical details. I remember hearing that the Cray-1 used new ECL logic chips and that the power supply rectifiers were low voltage drop germanium to minimize heat production given the enormous currents that had to be supplied at low voltages to the machine's circuits. I also remember reading that the circular arrangement of the individual columns was to minimize and equalize propagation delays between units in the backplane interconnect wiring. And now, using many of the concepts Cray developed, look at the multiprocessor power you can buy relatively cheaply in a desktop box. I guess the definition of supercomputer would always be shifting to include only the newest and fastest machines. So far has it come in just a few decades and mostly thanks to the incredible advances in the compaction of the circuitry on silicon dies. In the 1980s, 1um fabrication technology was the newest thing in transistor shrinkage. Now, clock speeds in the multi-GHz range, tens of CPU cores per die, memory into the TeraByte region etc. Just mind-blowing. Seymour is forever a legend in my mind. Thanks for this utterly watchable video!
I wasn't aware of a lot of the early UNIVAC history you relate here! In 1981 I was working on a UNIVAC 1100/42 (2 CAUs and 2 IOAUs along with an SPU). We had some really fast (for the day) fixed-head disk that we usually booted from, but we also had a FASTRAND II* drum(!) that we also could boot from. This was at Offutt AFB in Building 501 (the underground command post); I was a computer operator there and our system produced (via a Ford Philco(!) device occupying a couple of racks) vector graphics depicting force readiness and other such data for the people two floors further underground than I was. On the remote chance anyone is familiar, ADDO was our office symbol. I also had passing occasion to program on a CDC Cyber 7600, as well as operating Honeywell Level 62 and Burroughs A2 systems. --- * these had a failure mode that involved throwing the spinning drum through walls. I'm glad I didn't know that then!
A brilliant piece of work. The only thing you left out was the Cray-inspired Sun CS6400, which led directly to the E10K and onwards, perhaps the most important computer ever produced. I had the pleasure of working with the CS6400 and of course all the Sun machines.
Reading the comments... I feel they double the brilliant(!) content of the vid with the same quality: background stories, personal experiences and so on. I love that. And to emphasize: no gaming world and no LLMs without Cray's vectorization.
Superscalar execution was first used in the 80960 processors from Intel in 1988. The Pentium team copied the technique along with some of the 960 parts, like the floating-point unit. I was on that team. The Cray designs were breakthroughs, and we followed the evolution of computer architecture from them and others to create an out-of-order design with speculative execution, branch prediction, register renaming and many other mechanisms that allowed instruction execution speedups. These were done in the Pentium Pro design, which was not at all similar to the original Pentium designs. We were targeting Cray-level numerical performance as a goal and achieved it in the late 1990s, only to surpass it with much more parallelism in both multiprocessing and advanced superscalar designs. The designs of the 1990s are still the root of current microprocessors. Advances from them have addressed other bottlenecks in the system, incorporating faster memory interfaces along with multilevel caching for increasingly higher performance as silicon technology also advanced. Many make the mistake of assuming that process technology alone increased the capability of new processors, when it has actually been the combination of computer microarchitecture and silicon advances that led us to today. Thanks for the walk down memory lane... :)
Excellent video, thank you. I worked for CDC between 1975 and 1982, mostly in the IBM Plug Compatible Market (PCM) arena, but had occasion to interact with the CDC Cyber mainframe and then the Cray operation. The people I worked with were open-minded, free-spirited and driven to succeed. I consider Bill Norris to be the Elon Musk of his era. On one occasion, I was at the Arden Hills facility when a guy in a plaid shirt offered to explain how a 7600 in final checkout functioned. He demonstrated how several 'modules' could be physically removed while other engineers continued to test the system performance without it slowing or even crashing. I discovered the engineer in the plaid shirt was Seymour. A nice bloke. One thing that was overlooked, and that contributed to Cray setting up shop 'over the border', was the parallel development of the CDC STAR-100, STAR in this case referring to the STring ARray architecture. The STAR-100 was not a Cray project but that of Jim Thornton, a one-time assistant to Seymour. That machine's development also ran long, but several were built and installed at the Lawrence Livermore Labs in California. Again, thank you. You should write a book about CDC; one is sorely needed. BTW, the first computer I operated and programmed was a British EMIDEC 1100, already quite old in 1965. Look it up; it was a true generation-1 system. We ran a plastics facility that employed 7,000 people, handling payroll, chemical simulation, sales and such, with an 8K 36-bit-word machine, without disc drives. Just four 35mm sprocket-feed tape decks and an 8K drum. The CPU used two 1K Williams tube memories. Fun times.
I remember when my university (USP in Brazil) was about to buy a supercomputer. Three companies came to present their machines (for all university staff and students to see): IBM with its 3090 with vector processors, which required basically a building made for it; CDC with its ETA 10, which could be liquid-nitrogen cooled and was easily installed in the same building a traditional mainframe could be; and Cray (I didn't see Cray's presentation). The balance was tipping toward the CDC ETA 10, and then IBM offered the 3090 for free! Yeah! In the end I don't know what happened, but I am pretty sure the university didn't buy any of those (I may be wrong though). I think (my theory) the supercomputer project was scrapped and time was borrowed or purchased from nearby entities that had powerful computers (some had 3090s, like Petrobras, and others ETA 10s; I don't remember if it was INPE or ITA that had one) without the hassle of maintaining the physical machine.
Stanisław Ulam developed the hydrogen bomb without massive 1-D or 2-D computer models on a big computer. Ulam, the Polish physicist on the Manhattan Project, invented the Monte-Carlo method to predict fusion reactions for various designs of the hydrogen bomb. He integrated the Monte-Carlo process to follow the laws of nuclear physics in generating neutron fluxes, x-rays, and daughter products, accounting for the inherent statistical fluctuations in neutron chain reactions using only the upper and lower bounds of probabilities for each step. Ulam assigned UCLs and LCLs with 95 percent upper and lower probabilities to each reaction step of each hydrogen bomb configuration, and found that he could quickly predict whether a given design worked. He soon determined that Edward Teller's Spark Plug design would never work, and then designed the fusion device that would. Teller insisted that Ulam was wrong in claiming the Spark Plug configuration would not work; soon, however, Teller added a minor improvement and took full credit for Ulam's design, ignoring Ulam's seminal contribution until much later. (Dark Sun: The Making of the Hydrogen Bomb, 1995, Richard Rhodes.)
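To make the Monte-Carlo idea concrete, here is a toy sketch in Python. The 20-step "design" and its 0.9 per-step success probability are invented purely for illustration and have nothing to do with real weapon physics; the point is the method: sample randomly, estimate a probability, and bracket it with 95% bounds in the spirit of Ulam's UCL/LCL bookkeeping.

import math
import random

def one_trial(p_step=0.9, steps=20):
    # A toy multi-step chain: the 'design' works only if every step fires.
    return all(random.random() < p_step for _ in range(steps))

def estimate(n_trials=100_000):
    hits = sum(one_trial() for _ in range(n_trials))
    p_hat = hits / n_trials
    # 95% normal-approximation bounds on the estimate (UCL/LCL style)
    half_width = 1.96 * math.sqrt(p_hat * (1 - p_hat) / n_trials)
    return p_hat - half_width, p_hat, p_hat + half_width

lcl, p, ucl = estimate()
print(f"estimated P(success) = {p:.4f}, 95% bounds [{lcl:.4f}, {ucl:.4f}]")
# exact value for comparison: 0.9**20 is about 0.1216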
The dates are incredibly close. As far as I know, the first LEO machine was put into service in September 1951, whereas the first UNIVAC was put into service in March 1951. As to which was "invented" first, it really depends on how you define invented, i.e. the first test program, the first commercial delivery, etc.
Found this through a newsletter to old CDC graduates. It brought back memories of many of the early players. A wonderful introduction to the importance of the labs and Sid Fernbach in early computing. LRL Livermore (Sid Fernbach) bought the first CDC 3600, the first CDC 6600, the first CDC 7600 and the first CDC STAR-100. Seymour Cray was unique in the computer industry: first transistor computer, first RISC, first multi-functional-unit design, first instruction stack, then pipelining, first short-vector approach, and always the most aggressive packaging, plus the ability to completely stop one design approach and go to a new one. Congratulations on a well-done journey from the concepts of the '50s to the last supercomputers of the early '90s. I enjoyed 29 years on that journey, until 1989.
In the late '80s a friend who worked for Cray invited me to "family day" at Lawrence Livermore Labs. While there we visited the Cray room. If I remember correctly there was a row of 8 Cray-1s and a Cray X-MP. There was also a single Cray-2. The liquid tank of the Cray-2 was transparent and one could see bubbles emerging from the boards. As it was family day, some children were putting fingerprints all over it. Fast forward to around 2010: I took a class of students to see the Computer History Museum in Mountain View. They had a Cray-1, a Cray-2 and the Cray-3. I suspect these were donated by LLL. The Cray-3 was by comparison tiny, a bit bigger than a toaster, covered in ribbon cables.
As a very old expat Brit engineer who has spent his life in and out of computing, this has been a very interesting (and new to me) story. Of excess . . . Thanks for all your hard work. 😎
This has to be the best video I've ever seen on YouTube. I love love love the animation style, your narration, and how you structured the story. I hope you continue with videos such as these! I look forward to continued watching!
Is this not a lesson that management is the biggest overhead any company can have? And yet management believe they are the most important, and believe that cuts to the workforce rather than to themselves will help the company prosper.
I recall reading a number of years back, a blog post about someone making a recreation of one of Cray's supercomputers using an FPGA. Shows how far we've come that what once cost millions of dollars and weighed tonnes can now be implemented on a single chip that costs tens of dollars and weighs a few grams.
To think: with the small machines we have today, doing the same things years ago required a room full of huge machines. Back in the day, hard drives were the size of clothes dryers. The advent of semiconductor technology was a game changer.
It seems minicomputers are lost in the dustbin of history? When prices dropped below $20K we really started computerizing America, giving wholesalers automated inventory control and accounting while PCs were playing games.
In the '70s and '80s I worked for a machine shop in Hopkins, MN, making spindles for the disc drives of those computers. These things were massive; a 5 MB memory disc was the size of a record album. We made them by the thousands for Control Data and MPI.
I find it interesting that tech businesses have seldom made financial sense. A company makes a billion in sales, yet someone has to take minimum wage to save a project. Mind the inflation. Where was the money going that was more urgent than developing the company's flagship? This story was masterfully done. I will watch it again for sure.
"The addition of the second dimension increased the number of mathematical operations required exponentially ( 2:10 )". Uhmmm ... NO! At max quadratically. Sorry for the nitpicking. Great video and thanks for sharing!:)
Well done! As a computer engineering grad in 1994, I didn't truly appreciate the history that underpinned the industry I love so much. I worked on SGI computers at the Canadian National Research Council in 1991/1992 and didn't realize they would go on to acquire Cray Research. Fascinating!
I started my career using the Cray X-MP and its competitor, the CDC Cyber 205. As a young engineer, it was thrilling to use the most powerful computers ever built. Six years later I had about a quarter as much processing power in a small parallel processing machine on my desk. Now the CPU of my consumer-grade desktop provides 1000 times the raw power of the X-MP, and that ignores the amazing power of the GPU! This video illustrates the constant, breakneck pace of advancement in computing. What's most impressive is that somehow they managed to build vast, highly profitable companies while dealing with that incredible rate of change.
Many thanks for this video. Like most people, I have only ever seen Cray computers in movies and magazines, but there is something very cool about the look of them, and the fancy cooling systems. In the movie Sneakers, Ben Kingsley is seen sitting on one while chatting.
Wow my grandpa worked for Control Data and then Ceridian for nearly 50 years and I didn’t know too much about the history of the company. Thanks so much for making such an incredible video describing it all. I can’t wait to call him about this!!
Great documentary. Though if I could level some constructive criticism, I'd say it's weird that you didn't mention Silicon Graphics buying Cray. Sure, it's not a huge part of the story, but it is part of it.
I remember when Cray was the shizznits. And then I remember one sitting out in the open on a loading dock for over a year because nobody knew what to do with the thing. I can't even recall whatever happened to it. Probably went to the breakers.
Thanks, very well done. I have been in the IT realm pretty much all my life. I grew up with kids whose parents worked at CDC, Cray, Honeywell and other companies. One of the things that has always amazed me is how fast things change and how slowly they develop. My brother went to MIT, and I remember playing a dogfight game on a 6"x6" green screen running on a PDP-4. The screen drew so slowly that the body of the plane, made up of dashes, would bend when going around a corner. This was in the late '60s. Talking about the massively parallel designs for the later supercomputers reminded me of something my brother mentioned: some of the large computers were running into problems with the speed of electricity (light, in essence). They needed to build the computers in a way that shortened the length of the wire connections. A nanosecond is a very short amount of time, but in terms of light speed it is about a foot in length.
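That closing rule of thumb is easy to verify:

c \times 1\,\text{ns} = (2.998 \times 10^{8}\ \text{m/s}) \times (10^{-9}\ \text{s}) \approx 0.30\ \text{m} \approx 11.8\ \text{inches},

and signals in real wiring propagate somewhat slower than c, so the practical distance per nanosecond is even shorter, which is exactly why shortening wire runs mattered so much in those machines.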
I remember visiting the NOAA Physics Laboratory in Boulder CO in the late 90s. They were still using a Cray. And the largest robotic data cassette storage solution I have ever seen.
Cray and Chen: innovative masterminds who wanted to keep pushing the limits by completely redesigning from the ground up; pioneers of the industry whose designs sacrificed everything in exchange for being the latest and most powerful machines. Davis: a down-to-earth mastermind who wanted progressive upgrades that maintained compatibility with old designs and were far more accessible to more buyers, at the expense of not being cutting edge. What ultimately doomed Cray (the original company, not the modern one) was clearly the lack of compromise on plans and designs. Cray and Chen's teams could've partnered up and built one niche machine instead of two, while Davis' team continued on the main product. Sometimes the brightest minds need to be brought back to reality for a second so they don't trample on everybody else. Cray unfortunately died in his 70s in a car accident in the 1990s, despite still being just as active in the industry as he was when he was young. Chen isn't the same Steve Chen as the YouTube co-founder, obviously; this Steve Chen is Dr. Steve S. Chen, who went on to start a few more companies and is still active in the industry despite his old age! One has to wonder how old Cray would have been before he finally decided to retire; probably never, given his tireless brain and passion.
Cray was one of our customers and I got to visit their factory. I have never been so cold in my life; it was -20F outside. The factory was not all that impressive; it looked more like a start-up than a supercomputer manufacturer. They used to buy electrostatic printers from us, as they were the fastest available at the time and their diagnostics apparently dumped a lot of paper to pore over. There's one in the computer museum in Silicon Valley that you can see. The Cray vs Amdahl wars were legendary.
I like the 3D approach rather than pictures/videos. It must be pretty time-consuming though (then again, scavenging for fitting pictures can cost a lot of time too).
My father worked on UNIVAC I. They figured out how to improve the reliability, to get most of a day out of the system instead of 1 hour, by replacing all the tubes in a bank when one blew. They sold the untested leftover tubes to the hobby/used-parts market in downtown Manhattan.
You should have added something about Seymour's last company, SRC Computers. They used a liquid-cooled (still Fluorinert) Xilinx FPGA that could be reloaded in milliseconds to process a different algorithm, essentially becoming a custom processor for that specific algorithm. They were located in Colorado Springs and eventually went out of business as well.
That's interesting. I worked for Cray and we were all crushed when Seymour left for Colorado. We never knew much about the technology, that is interesting about a reconfigurable processor, thanks.
My grandfather was an engineer at NASA, where they tested the rockets. He told me about when they ordered a Cray: the order catalogue was several hundred pages of checkboxes to configure what you wanted it to be capable of. They ordered one to do some kind of geometry simulation. I'm not sure if he was around on that job long enough to use it. He later went into shipbuilding; it was a much shorter commute.
My introduction to computers began in the U.S. Navy in 1962 as an Electronics Technician (ET) assigned to one of the first three ships to have digital computers aboard. The ship had so much prototype equipment, besides the two UNIVAC AN/USQ-20 all-transistor computers, that we had civilian visitors on board almost continuously when we operated east of Hawaii in the Pacific Ocean. Among those were some who were familiar with Seymour Cray, and I was told that our computer had had input from him. This fact confused me when, much later in life, I read about the timelines you detail here; thank you for the information about his being allowed to work independently while at Control Data. The AN/USQ-20 was not a supercomputer, but it was amazingly good for its very small size at that time. During our onshore training we used a vacuum tube machine as an emulator. Of course the unit, about the size of a double-door refrigerator (air cooled by an integrated chilled-water heat exchanger, by the way), had less computing power than my iPhone.
I got to program the X-MP and Y-MP in graduate school in the 80's. After graduating I worked for Thinking Machines Corporation, whose focus was "massively parallel" computers. The competitors at the time were nCube and the Paragon. We all used pretty specialized hardware and custom-created compilers to make use of it. After a while IBM simplified the world even further when they introduced the SP1 and SP2, which were basically just a bunch of RISC workstations taped together. The interesting part is that a lot of the software and computing concepts created by Thinking Machines, IBM, and others in the late 80's are what is enabling a lot of current-day cloud computing. It may be a lot of work, but it could be a really good follow-up video.
I worked briefly in the old glider factory building (in 2022 and 2023) and found the history fascinating the entire time I was there; I wanted to learn everything I could! Not to mention I was so hoping I would find a forgotten piece of equipment somewhere, but alas, the building has been heavily stripped and modified over the years. All that remains from the old machine shop pictures are the mounts for the wooden beam ceiling.
Are you familiar with the song "Automatic" from the Pointer Sisters? The song mentions "Cray Vision"
goodness, you were distracting Cray from his work with a video game? were you an IBM mole?
Well, that's a true story. Thanks, kisses.
This is better than Henry Ford. He said that you could get any colour you want, as long as you want black...
@@savagesarethebest7251 Also sort of the opposite: an absolute mass product.
@@EVPaddywhoosh
I'm 70, and in the computer club (Atari) we discussed Cray. It was an adventurous time.
Ha! I just made a comment with pretty much the same sentiment. Cheers!
@@michaelogden5958 Great to hear. Good to know I'm not alone in my mystical view of Cray computers!
Thanks! :)
Yeah, when I heard of the Cray X-MP, I told myself: one day I will use that computer. Decades later, I would manage a VAX 8800 cluster. This video is a gem of computer history whose details few knew. The CDC mainframe I never got to see, but the DEC Alpha with the experimental EV6 CPU at AltaVista in Palo Alto I got to touch. They had 5 nodes there...
I love the detail in this video. It reminds me of a story I heard at Ohio Supercomputer Center!
In 1987 OSC got a Cray X-MP/24. In 1989 they replaced it with a Cray Y-MP8/864. Both machines had eight stacked processor boards submerged in Fluorinert and each required an outboard Heat Exchanger Unit. They made a lot of heat.
The parking lot of the Supercomputer Center had a LOT of air conditioning compressor units in it. But they weren't exactly tidy. Some of them were pretty crooked. Because when they had done the changeover to the Y-MP8, they didn't want any downtime. They had the X-MP and Y-MP both online for awhile. They said the parking lot looked weird because the asphalt had actually melted and got all gooey when both Crays were running.
The storage situation for supercomputers was weird. You had these fast machines, but they were hooked into '80s storage. OSC had hard drives that looked like rows of industrial washer/dryers. They had a room-sized reel-to-reel tape storage jukebox with a robot arm to retrieve tapes! In 1992 I think OSC had just taken out the robot arm and put in a 1.2TB storage system, which required an additional Cray Y-MP 2E to manage storage.
Supercomputers just kinda morphed into clusters. There was sorta a mental turning point for everybody when Toy Story came out in 1995 and everybody knew it was made on a render farm of 100+ SPARCstations. It made Crays seem like old news. The last Cray they had at OSC I think was the Cray T3E-600/LC they got in 1997 which itself was running 136 DEC Alpha chips.
A little more random backstory...
I grew up in rural Ohio. In 1987 Ohio State University started Ohio Supercomputer Center with a lot of state funding. Every university had some access, and they sold time to businesses.
I got a Commodore 64 for my 8th birthday and really lived/breathed computers. In 1991 I was 15 and I was basically the only computer nerd in Springfield, Ohio. I did weird things like begging my parents for nothing but a Turbo Pascal 5.5 compiler for my 386SX/16 for Christmas. This guy Brian Fargo made a lot of late '80s video games and I wrote him asking how I could make video games and he actually wrote back!
I filled out an application to go to the Supercomputer Center's "Summer Institute" for high school kids. One question asked you to make a program which found all numbers up to 1000 which were both prime and Fibonacci numbers. Pick any language. It was the only programming question! It seemed too easy, so I thought I should do something clever. Most people knew a little BASIC at best. I wrote a super-short recursive Pascal function which counted Fibonacci numbers up to 1000 then fell out of recursion checking if they were prime. I was one of the 14 kids accepted. When I got there the next summer I just wanted to know how they graded that question! So I finally found the director Al Stutz and asked, and he said I was actually the only one who answered it correctly LOL!
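For the curious, that application question is a neat little exercise. Here's a minimal Python sketch of the same computation (the original answer was a recursive Pascal function; this is just an illustration of the idea, not the actual submission):

```python
def is_prime(n):
    # Trial division is plenty for n <= 1000.
    if n < 2:
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return False
    return True

def prime_fibonacci(limit):
    # Walk the Fibonacci sequence up to `limit`,
    # keeping the terms that are also prime.
    result = []
    a, b = 1, 1
    while a <= limit:
        if is_prime(a):
            result.append(a)
        a, b = b, a + b
    return result

print(prime_fibonacci(1000))  # [2, 3, 5, 13, 89, 233]
```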
The summer of 1992 was a weird time. People didn't have the Internet. 14.4k modems were a new thing. There were a few BBS-es in Dayton, Ohio I could call. Everything else would be long distance. Prodigy and AOL existed, but they were just basically chatrooms and email. They weren't connected to the Internet. Because the Internet was just .gov and .mil and .edu sites. There wasn't much to do.
At OSC in 1992, though, they had a lot of machines hooked up to the Internet. They even had a ton of NeXT cubes! And Steve Jobs actually came to OSC to talk about them! And OSC had a fat fibre connection! But the Internet had really barely even progressed past being ARPANET. The only thing to really do on the Internet was look for pirated games or soft-core porn. There wasn't much of any WWW. You just kinda had to know the address of something. But if you did, nothing had much security. You could just FTP to wuarchive.wustl.edu and browse everything that nerds had decided to share on the network. College FTP servers were mostly scans of the Sports Illustrated Swimsuit Issue. The newest game was Wolfenstein 3D. The only network game was NetHack. Gaming didn't get big till Duke Nukem 3D and then Quake and WarCraft 2. The Internet became a lot more interesting after I started pirating PSX games and playing Quake.
Anyway, I love that this video has meaningful illustrations instead of just a buncha nearly-irrelevant stock photos.
Thank you! And thanks for sharing your personal tales, it’s very interesting! :)
Those rows of washing machines had 6 to 10 parallel 10" platters that spun at 30,000 RPM.
I graduated from Illinois in early '88 in aerospace. We already had a Cray-2, and an XMP-64 was being commissioned. I never got to play with either, but the postgrads in the department were doing their first aerodynamics simulations on it and producing some amazing detail.
I was on the Illini swim team, and one of the guys who'd graduated several years earlier was back doing his PhD on the Cray systems. We discussed some of what they were doing. Two things I remember are in software and the heat issue you mentioned. What I was told is a little different, but it's still in the vein of how much heat they generated.
1) HEAT - The Crays used heaps of power and generated monstrous amounts of heat, which is why they were in a vat of coolant like you describe. At Illinois they went to the civil engineering people for help, because they pumped the coolant so hard it was eroding the circuit boards. The only people at the time who understood erosion were the guys who dealt with soil erosion, and they were civil engineers.
2) SOFTWARE - They weren't writing a lot of new simulations at that stage except in some odd areas (like wings entering stall conditions), as they already had the math worked out for a lot of things. Most engineering software is basically a number-crunching exercise, and by the late 80s they had many algorithms worked out. So in many instances they weren't writing new code but re-compiling it for multiprocessor vector machines, which is what the Crays were.
In those days I was doing Finite Element Analysis (structural analysis) because that was my professor's forte. We didn't need to re-write anything at that time because we already had systems like NASTRAN. What we needed was that stuff re-compiled so that we could run larger models or just run them a lot faster.
The guy I knew who was doing his PhD was working on that stuff. He was working out how to take existing code and decide which parts got done by which processor, so that the whole number-crunching exercise could take advantage of the Cray. When you look at your laptop or desktop and it says 2-core or 4-core or 8-core, that stuff works in part because of the work that team at Illinois did back in the late 80s.
On a side note.
Marc Andreessen, who's famous for the advances in web browsers (Mosaic, Netscape, MS Explorer) started at Illinois a year after I graduated. He worked at the National Center for Supercomputing Applications (NCSA) at the University of Illinois.
I think his attitude towards government-funded research totally sucks. He got all his wealth from this amazing opportunity to develop his skills, paid for by the American people and the State of Illinois. Some of the things he's said in recent years are seriously egregious, and his technology manifesto is insane to say the least.
I was at OSU in the early 90s and can confirm so much of this, except my access was always through the terminals in the Baker Systems building. The worst part was trying to find an open seat, as there were tons of people playing Netrek for hour after hour. Thank you for sharing your story here... it made my day to read it!
@@rocktech7144 30,000 RPM? They didn’t seem loud! They didn’t have that high-pitched jet-engine whine like 10,000 Cheetah drives! 😂
The original 'Tron' movie graphics were done using special CRT-based film scanners and recorders. They were connected to a Cray I/O processor with then-unheard-of 100 megabytes of RAM. The graphics were computed using the Cray computer, then recorded back to film. I had the privilege of working on the scanners/recorders many years later when they were upgraded with more RAM and HiPPI 100 MB/sec. I/O interfaces.
The CGI for "The Last Starfighter" was done on a Cray computer. The Cray computed a frame, displayed it on a high-resolution screen, and had the camera take the shot. It was frame by frame the entire way through the movie. My boss wanted me to help the company sell computer time on the Cray, and I spent several weeks in LA working on accomplishing this. It was the first movie where I knew people in the credits.
In fact, there is a Cray-1 shown in the movie. I was a student programmer at the computation center then, and our boss closed up shop and took us all to see Tron. We gave the Cray-1 a standing ovation!
Pretty sure the original Tron graphics were done on a Foonly F-1 (a souped-up PDP-10). Check it out: en.wikipedia.org/wiki/Foonly
You're thinking of The Last Starfighter, perhaps. TRON used much more pedestrian computer hardware - two separate systems, actually. One system that constructed objects out of solid shapes (tank battle, light cycles IIRC), and another system that could do more organic shapes like the solar sail ship.
It's unbelievable to me how much those early days of ERA sound like modern IT and the tech industry. Over-budget projects, having to raise venture capital, "meme stock"-ing yourself to the everyman, Cray acting like a "veteran engineer" straight out of college, pivoting, mergers... it never stops with this story, and it's no different today.
Cray found it funny that Apple used his machines to create the Macs while he used the Macs to make his supercomputers 😆 Jobs and Cray both made jokes about it 😆
The truth is that the Cray at Apple was purchased to design the Aquarius CPU, which became a dead end. Later the cover story about using it for plastics design was widely circulated, and they did do all their R&D 3D rendering on the Cray along with injection mold design, but using it for Macs was really about trying to recoup the investment. The simulations ATG (the Advanced Technology Group at Apple) was using didn't parallelize well, and the Suns which served as terminals to the Cray were not a whole lot slower than the Cray at running the same simulations. That's when they jumped whole hog into using it for plastics. I was a contractor to ATG in the mid-90s. I got to make the ancestor of what you now know as Street View.
@@Peter_S_ Apple later went and created the ARM.
@@verttikoo2052 Apple didn't create the ARM and they didn't come along in ARM land until 1990 with the ARM610 which powered the Newton PDA, but ARMs had already existed as fabricated chips since 1985. The Acorn Archimedes computer was released in mid 1987 with an 8 MHz ARM2 and ARM originally stood for Acorn RISC Machine. In those days Sophie Wilson was Roger. She didn't become Sophie until 1994.
@@Peter_S_ Apple indeed created ARM, and that is a fact. Apple saved two nearly bankrupt companies, Acorn and VLSI. These three companies put their IP, engineers and other assets into the new company, and Apple injected money into it. Each of the three owned 1/3 of the company. Apple had to sell its third when Apple itself was near bankruptcy. Now Apple is back owning a piece of ARM, but only about 10%. ARM is a massive success story, and only thanks to Apple.
I really miss those times! I worked for several of the aforementioned companies in various capacities during their heydays from the late 70s to the mid 80s, rubbing elbows with many of the well-renowned luminaries of the time. We all felt the fire underneath us as we worked on and developed the world's fastest, most advanced computers. As a recent Stanford EE grad in my twenties, I was truly in my element. I really loved my life and what we were doing; it was an immensely satisfying time in the SV and I miss it so much!
Fun fact: they're still making supercomputers to this day in Chippewa Falls. They were eventually bought out by HPE (Hewlett Packard Enterprise) a couple of years back, but my brother works there, and the company is still involved with Los Alamos too.
I believe they are also very much pioneers in interconnects for such clusters.
I can truthfully say, around half of the multi-core computer I'm happily battering away on the keyboard of comes directly from Cray innovations.
Ironically, Cray being bought by HP and this computer being an HP...
And you get there by driving on Seymour Cray blvd
With a C-130 at the Allied air base in France in 1945, it seems the Crays helped pioneer some Skynet-style temporal devices too.
I remember an old mythical rumor many, many years ago that a Cray system had *finished* an infinite loop in something like a week of continuous operation.
I used these a bit. In 1971 and 1982 I used a CDC 6400 at Kitt Peak National Observatory in Tucson. A fast machine with six peripheral processors, but its memory was very small. A lot of software ran on FORTH, which is uniquely miserly in its use of RAM. In 1987 I used a Cray-1 (or X-MP?) at the University of Toronto, by this time remotely via the Internet. The funny thing is that these days a $10 Raspberry Pi outperforms these machines (at least for pure scalar number-crunching). Massive superscalar parallelism has taken over, with GPUs adding more punch yet. But the CDC and Cray machines had a certain magic for scientists in their day.
Where are you buying Raspberry Pis for 10 bucks? They're at ridiculous prices in the UK.
The Cray 2 was the sexiest computer ever built.
I remember Cray being synonymous with supercomputing, even into the '90s. You couldn't mention one without the other also being considered. If I remember correctly, the movie The Last Starfighter, a pioneer in cinematic CGI, used a Cray X-MP as part of its graphics pipeline.
th-cam.com/video/09_Hoecsv7A/w-d-xo.htmlsi=IPcEAYVBpJaO7Snb
There is a making-of documentary for The Last Starfighter that shows the computers used and things like that. It's a great watch, with all the interviews, and Lance Guest does the voice-over for it as well as appearing in it too, of course.
I got to visit Digital Productions in Los Angeles, when they were working on The Last Star Fighter. As a teen computer nerd, I was blown away by the Cray X-MP they were using. The graphics (at the time) were just incredible. It's something that I remember vividly to this day.
@@edwinvalencia3981 That must've been amazing - I wish I'd been there!
Yes, they showed Lucasfilm an X-wing fighter in digital; they weren't interested!!
This one guy seems to have pioneered many computer fundamentals that we now take for granted.
Yes, he’s like the Tesla/Edison of the computing revolution
I worked with a guy at NVIDIA who worked at Cray. He designed their memory management system. He did exactly the same thing at NVIDIA that he did at Cray, but instead of it being in a supercomputer, it’s now in a GPU. A GPU is basically a Cray computer on a chip.
Absolutely. He was way ahead of his time.
@@little_fluffy_clouds ❤❤❤
I worked at Cray. It was an amazing experience. I could speak directly with the engineers. When I say directly I mean over a beer in Chippewa Falls. It was just plain mind-blowing to work with so many brilliant people.
Even though the philosophy of single supercomputers lost to the distributed computers of today, one thing is for certain... the Cray computers looked just damn cool. :D
I grew up in Livermore, and worked at Sandia Livermore in the 1980s and 1990s. Our Crays (X-MP and Y-MP) were giant money pits for the users. I remember the computing department having utter fits when other departments started getting desktop workstations like the first Sun SPARCstations and Silicon Graphics machines. I spent a full year porting all of the programs we had on the Crays to those machines so we could save huge amounts of cash. While the engineering and materials departments were moving to these small, fast machines, the computing department was buying oddball machines like Stardent (or the earlier versions from Ardent and Stellar) and continued to push for shared computing platforms.
Yeah, used to have an old pizzabox SPARC. Great little machine, gave my old Alpha box a run for its money, despite it being a much slower processor.
It's a shame what the likely fate of those Crays was; as they were on the precursor to JWICS, it's likely they were literally melted down.
For those challenged by the term, JWICS is the Top Secret network used by the DoD, amongst quite a few peers with narrower scopes of ignorance (nope, never heard of that, honestly! Lie detector melts down...).
And don't get me started on the lie detector being a lie...
Or LLNL getting its reputation ruined by Teller. Takes 20 years or more to build a reputation, 20 seconds to ruin it with bullshit. Still, it was the era of SDI vaporware, spending the Soviet Union into oblivion.
And my fingers were just as dirty...
Absolutely fantastic work, extremely well presented! I am so glad the algorithm brought this to me, and I hope this video brings you a huge number of well deserved views and subs, you've certainly earned mine!
Glad you enjoyed it!
This was a bit of a trip down memory lane. I operated a couple of Crays back in the 80's, a Cray-1, then a Cray X-MP. The Cray-1 was rumoured to be serial #1, which Cray was installing in a customer's site while 'their' machine was being built.
Glad it brought back some good memories :)
Hell yeah, there's not a lot of people who can say they've done that outside of a museum.
Serial 1 was unique in that it had memory parity but not error correction. That was determined to be inadequate, so serial 2 was scrapped and subsequent machines had SECDED error correction. Serials 1 and 2 couldn't be upgraded because the frame height had to be increased to take more modules. Serial 1 ended up in the UK doing weather forecasting.
That sounds about right, I was at the UKAEA at Harwell in the early '80s when the Cray-1 came. Our workload on the Cray tended to be office hours only so we ran work for the ECMWF 10 day forecasts overnight.
In the mid-80s I was able to use the University of London's Cray (not sure which model) to process data for a research project. I was at the University of Kent in Canterbury (UK). The data chugged up JANET (the Joint Academic Network - this was before the internet); that took 20 minutes. The processing in the Cray took seconds. The results took 20 minutes to chug back down JANET to UKC. How times have changed!
No mention of Cray's attempt to achieve wafer-level circuits... which was never successful. The second digital computer I programmed was a CDC 3300: cost $1M, filled a room, not as powerful as my phone. My first computer was an IBM 1620, the size of a desk. Took me two weeks to multiply two numbers using machine language... in 1964. I was a junior in Electrical Engineering and saw no chance of computers affecting my career. I have programmed continuously since 1968. Even did analog computing in 1966. Lots of changes. Created a language I still prefer over all others... and am the world's best, and only, programmer using it.
Clive Sinclair also worked on the same thing back in the 80s. Sank millions into it.
> No mention of Cray's attempt to achieve wafer-level circuits.
They adapted the Cray processor architecture to a single chip processor for the Y-MP.
It is really shocking to consider how revolutionary the transistor really was. Especially when you consider how simple they are.
They can make transistors now so small they are made with just a small handful of molecules.
Shocking? Indeed!
And speaking of electricity....
Give me enough relays and wire, and I can make a current flow through any number of complex operations (relay ladder logic).
But when you start talking about the presence - or the lack thereof - of a SINGLE electron at a gate?! Are you shitting me??
😮 That's not flipping engineering! That's some black-magic-level shit!
@@rjeder57 Indeed.
And are still actively working on molecular, as in single atom transistors.
And getting nowhere fast. Quantum effects and all.
I'm experienced in everything from vacuum tube analog and digital through VLSI.
Tons of fun problems to solve over the decades!
And yeah, germanium vs silicon was a bit of an earth shaker, a whole voltage potential drop and all. ;)
OK, a lot less leakage in silicon.
To the point I jokingly call germanium circuits geranium circuits.
And still remember how to build a shift register out of a basic flip-flop.
I was introduced to computers in college, circa 1975. Some kind of mainframe in the basement of a building; punch cards, teletype card punchers, etc. I remained a computer nerd throughout my career(s) - about 40 years. Pretty much all along the way, when the name "CRAY" was mentioned it was almost as if you could hear the "Ahhhhh!" soundbite of 'angels singing' used in movies and on TV. 😄
Ah, the Cray Supercomputer, I can hardly wait to get a Cray emulator on my Raspberry Pi.
you Cray Cray!
Superscalar is when two or more instructions are dispatched at the same clock tick. The CDC 6600 was scalar, as it issued or dispatched only one instruction at a time; its multiple functional units allowed parallel execution of instructions. I have programmed on the CDC 6600, the CDC 6400, and the CDC 7600. None of Cray's computers were superscalar. In the CDC 7600 the functional units were pipelined, so multiple instructions could be executing in a functional unit at once. See Thornton, "Design of a Computer: The Control Data 6600".
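To make the scalar-versus-superscalar distinction concrete, here is a toy Python model (purely illustrative; this is not the CDC 6600's actual issue logic): a scalar machine dispatches one instruction per clock tick no matter what, while a superscalar machine can dispatch two or more independent instructions in the same tick.

```python
# Toy model: each instruction is (dest, sources). An instruction can issue
# only once its source registers were produced in an earlier cycle.
program = [
    ("r1", []),            # load constant
    ("r2", []),            # load constant
    ("r3", ["r1", "r2"]),  # r3 = r1 + r2
    ("r4", ["r1"]),        # r4 = r1 * 2
]

def cycles(program, issue_width):
    ready_at = {}  # register -> cycle in which its value becomes available
    cycle, issued_this_cycle = 1, 0
    for dest, srcs in program:
        # Wait until all sources are ready, or move to a new cycle
        # because this cycle's issue slots are already full.
        earliest = max([ready_at[s] + 1 for s in srcs], default=cycle)
        if earliest > cycle or issued_this_cycle == issue_width:
            cycle = max(cycle + 1, earliest)
            issued_this_cycle = 0
        ready_at[dest] = cycle
        issued_this_cycle += 1
    return cycle

print(cycles(program, issue_width=1))  # scalar: 4 cycles
print(cycles(program, issue_width=2))  # superscalar: 2 cycles
```

The two loads are independent, so the width-2 machine issues them together and finishes in half the cycles; a single-issue machine with multiple functional units (like the 6600) gets its overlap inside the units instead, while still dispatching one instruction per tick.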
I worked for Cray Research, Inc. for about 10 years (1983-1993), on the IBM VM Station and then on porting various versions of UNIX to UNICOS. It was a GREAT company; I still have very fond memories. Just FYI: one issue with the Cray-2 was the burning of the fluorocarbon (e.g., a short causing arcing). The product would become poisonous, requiring immediate venting and complete draining and discarding of all the coolant.
Did these machines use kel-F oils as coolant?
Fluorinert
Cray was a great company, but its existence was so short and sweet.
And Seymour's untimely death in an auto crash while running his split-off company, Cray Computer.
What a loss. I still wonder what he would have come up with.
Now, no one knows of or remembers Cray.
@@justgivemethetruth We remember.
@@terrywilder9 Basically, it was a form of freon. So, when superheated, it'd degrade to phosgene, a really nasty substance that'd literally liquefy your lungs, and etch all the circuits away due to the hydrogen fluoride level.
Massively Goobering the chemistry down by a lot, obviously.
Phosgene and hydrogen fluoride: two very unkind schoolyard bullies for the body. Or circuits.
Reflected currently in Halon fire extinguishers, which will release the same noxious contents upon exposure to fire. Yeah, it'll put out the fire and you personally.
Which is why Halon was banned and well, Freon banned for ozone depletion.
Easily replaced though, for refrigerant purposes, with propane.
Not joking. A refrigerant that's nastily flammable. Welcome to the irony of chemistry!
VM Station and MVS Station were terrific software products. The Cray team did a great job providing that. AFAIK, Cray used 3M's Fluorinert.
Until now I never really understood how pipelining and superscalar execution worked.
spoiler alert : it's not magic, it's carpaccio 😛
Very clever memory banking, designed for parallel access directly to the CPU. The Cray CPU was very complex, but it was so simple, logical and beautiful. The writing was on the wall as far as large-scale integration and where that was going. Finally Cray translated its ECL chips into one large chip and made the Cray Y-MP.
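A rough sketch of the banking idea described above (illustrative only; the bank count and busy time here are assumptions, not real Cray timings): interleaving consecutive addresses across banks lets a unit-stride vector stream touch a different bank every cycle, while an unlucky stride keeps hitting the same bank before it has recovered.

```python
# Interleaved memory banks: address -> bank = address mod NUM_BANKS.
NUM_BANKS = 8
BANK_BUSY = 4  # cycles a bank needs to recover after an access (assumed)

def stream_cycles(addresses):
    free_at = [0] * NUM_BANKS  # cycle at which each bank is next free
    cycle = 0
    for addr in addresses:
        bank = addr % NUM_BANKS
        cycle = max(cycle + 1, free_at[bank])  # stall if the bank is busy
        free_at[bank] = cycle + BANK_BUSY
    return cycle

unit_stride = list(range(64))       # hits banks 0,1,2,... round-robin
same_bank = list(range(0, 512, 8))  # stride 8: every access hits bank 0
print(stream_cycles(unit_stride))   # ~1 access per cycle: 64
print(stream_cycles(same_bank))     # serialized on one bank: 253
```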
After watching this I have no idea what these things do. I worked in IT for 30 years too!
@@rudycramer225 they do insane amounts of arithmetic and moving data around. And the "Cray way" is strikingly elegant.
@@yxyk-fr Actually, the philosophy Seymour Cray followed was to take cheap off-the-shelf commodity components and use them to realize his vision of the fastest synchronized processor he could make for the price. It was indeed elegant, but it was also just really brilliant engineering vision. When he tried to do the same thing inventing a very fast gallium-arsenide processor, immersed in coolant, things did not go so well. And then he died before it came to fruition, which was a horrible tragedy in my mind, but the age of this kind of hand-made supercomputer was coming to an end, to be replaced by workstations and massively parallel Intel PC chips.
@rudycramer225 - There are many levels to IT. And now it is all being replaced with AI, or so they say, and consolidating to where they don't have customer support any more except for expensive contract-based relationships, and they are getting rid of moderators. Somehow someone's post or comment is judged bad and they ban people from expressing themselves on what are private companies, but we can't see whether they are connected or acting in some way in concert against the citizenry, and yet we all have to use this stuff.
The incredible economies of scale are increasing and concentrating profits and laying off workers, while industrializing everyone's lives and shrinking the people at the top into these cold reptile people like Trump - very scary and disconcerting.
I really love how you tell your stories. Using 3D animations and stylized themes to present information is truly something different from what other content creators produce. Keep up the great production!
Thank you! :)
During my sophomore year at Livermore High School in 1972, I was in a small fortran programming class. Once a week we were driven down to the CDC office just outside the Lab to run our small jobs on the 6600. We got to wander in the control room and load our little decks onto the card reader. I was in total awe of the large format pen plotter where the head was held upright and moved electromagnetically; no x-y mechanical arms. While I was in grad school at CSU, they upgraded their mainframe to a CDC Cyber 205, which is apparently related to the Star 100. It had a small microwave transmitter/receiver connection from one end of the system to the other, perhaps 15 - 20 ft or so. Mind boggling technology at the time. One of my colleagues became the guy who 'vectorized' other students programs to take advantage of the speed gains. Our professor was invited to Livermore to spend time on the Cray-1 there and soon after I actually got to see a Cray-1 during a tour at the National Bureau of Standards office in Boulder. I really enjoyed this video, it is informative, well constructed, and gave a great backstory to my fortunate little experiences with this technology.
Thanks! and thanks for sharing your experience :)
My dad worked at Control Data in the 60s and 70s and was neck-deep in all of this. Thank you for doing this; it was really great. I have a feeling it's going to make my dad's day! He worked at that place at 50th and Park, I remember him mentioning that, back when we lived over on W. Calhoun Parkway. We always heard about this mythical figure named Seymour Cray; now I kind of have an idea who he was.
Glad you enjoyed it!
My dad also worked at CDC from the late 60s to the 80s in Montreal. I remember visiting data centres with entire floors full of these machines and whirring magnetic tape drives. I always thought it was so cool.
Man, your vids are really fucking good. Your contemporaries, imo, are dwarfed by the engagement I have when watching your vids, including ones who are much more well known.
Thank you! :)
Seymour Cray is a really likable character. I like the combination of technical genius and an unfortunate tendency to bankrupt companies by being incredibly bleeding edge. Still, he invented so much of modern computing: the superscalar, pipelined, RISC machines he pioneered are really the guts of any modern CPU. And there's something fundamentally charming about the way he could see the future before everyone else and desperately wanted to get there first.
That was also thanks to Apple, who advanced Cray's ideas. People tend to forget that the ARM we know now exists only thanks to Apple. Apple saved two nearly bankrupt companies, Acorn and VLSI, and created ARM so that each company owned 1/3 of it. All the IP, engineers, etc. from these three companies formed the company, and Apple then infused money into it. The rest is massive victory ✌️ Apple later was on the brink of bankruptcy. Steve Jobs sold the shares of ARM (1/3 of the company), sold shares of Adobe and lots of other stock, just to bring in enough money to survive. And Apple did. Apple is a massive success. Apple never abandoned ARM; they used ARM's designs all the way, until ARM ended up here and there and Apple was not an owner anymore. Then they had to secure things and have their own capability to create processors. RISC processors. Twists and turns. Now Apple is again one of the owners of ARM; if I remember right, they have 10% of the company. Maybe it was that Apple never really liked Intel 🤭
Edited it. Jeezuz how many typos my left thumb can do 😱
Apple always had good relations with ARM, and always all the licenses.
Excellent work! I worked at Cray from 1981 - 1995 (in fact, my younger brother still works for the current incarnation of Cray - HPE) and spent a lot of time optimizing software to take advantage of the Cray architecture, sometimes at the assembly language level. The one thing that often struck me was how well balanced the entire system was, since just when you started to exhaust one system resource, such as the number of available vector registers, it turned out you were also close to other system limits, such as memory bandwidth and available scalar registers. It was an amazing series of machines!
Thanks!
Were you in Mendota Heights? What group? I was there in the 1980s.
Used to be a data center tech for Dell, I would go into Cray in Chippewa and work on Dell storage. Walking around past the Cray racks was awesome!
That's very cool. I was too, but for IBM in Milwaukee, though sadly no Crays. I did get to work on IBM SP clusters (Scalable POWERparallel), which were the pinnacle of AIX hardware, but I'm not sure how those compared to Crays. I'm guessing the IBM analog to the Cray would have been their S/370?? I really don't know if the S/370 mainframe (or, in my time at Big Blue, the S/390) was used for scientific computing, although I know the SP was. I didn't belong to that priesthood of guys who could enter the hallowed halls of the mainframes. This was all before Linux took over the supercomputer market.
I was hired out of school to work as a sysadmin on the ETA-10P, from a CDC subsidiary spawned to compete with Cray. After ETA went bust, I worked as a junior sysadmin on a Cray X-MP before working with NEC SX vector supercomputers throughout the 90's. They went CMOS and fought off the microprocessors for a decade before distributed-memory machines took over. Lots of room for future videos there.
I had no idea how direct the connection to the military was in the founding of ERA, Remington-Rand, Sperry, and CDC. You did an excellent job digging this stuff up.
FWIW, a slight issue with the airplanes used: when discussing WW2, you used a C-130, which first flew in 1954, and when discussing airlines in the 1950's you used jet models which first flew in the 1960's... slight anachronisms.
really enjoyed the video.
Worked at CRI from 1987-1993. Learned a lot; it was like doing postgrad work once I started working in the development division. Spent time on the Cray-2. Worked on a CALMA system to learn circuit layout on gate-array chips. Went on to learn circuit design and layout using Cadence software tools. Running SPICE simulations with software from Meta-Software compiled on an 8-CPU Cray-2 meant that we could simulate logic circuits faster than anyone in the industry. Other parts of the company were using logic-synthesis programs to design and run emulations for whole systems, allowing next-generation designs. I met extremely talented and dedicated professionals who inspired me. I left Cray shortly after the government announced budget cuts for supercomputer purchases, and apparently there was not enough money from commercial sales. The next-gen machines on the minds at CRI used thousands of processors in parallel processing systems. Indeed, all supers today are massively parallel machines.
I worked for Cray Research 1987 to 1992. A great company whose fate was sealed by massively parallel computing and chip integration. It was an amazingly organized enterprise, and profitable until it got some competition from other technologies. What an adventure that was.
My dad got a job at an oil refinery in California once his enlistment in the Air Force ended in 1968. He would have been part of the first class of technicians training to run the refinery, but when the company found out what he had done for the Air Force (land surveying and drafting blueprints), they had him working on that part of the construction project before going into the second class of technician trainees.
He ended up working the rest of his career for this refinery, which was the first computer controlled refinery in the United States and just the second to go on-line in the world.
The computer was huge, in 1968 the computer took up two floors, each 100 feet by 100 feet.
Dad would often work in the control room which I got to see many many times as I was growing up, at first when I went with mom to pick him up from work, then later on my own, after I got my driver's license when I was asked to pick dad up myself.
Of course I was never allowed to touch anything, but it was very cool to go and see. Thousands of buttons, dials, gauges and display screens. It looked very much like I imagined the control room of a spaceship might look.
Not long before dad retired I was visiting when mom asked me to go pick up dad from work. I was shocked by some of the changes I saw in the control room. It still looked more or less the same, with all the buttons, dials and switches, but the computer room was very different.
Instead of a huge room filled with rows and rows of large computer banks, it was now a large, nearly empty room with one tiny box inside, smaller than the average sized refrigerator.
That was the entire computer that ran the entire refinery. In fact, it was 2 computers. One primary and one back up.
That is how much computers had shrunk in the 25 years my dad worked there. And that was 25 years ago!
I would not be surprised if I went there and saw a smartphone sitting on the floor of that two-story, 20,000-square-foot space.
The first I read about the CRAY was in the June 79 issue of Popular Science, which I subscribed to. I was 11 at the time.
Probably one of the best videos I've watched on YouTube for ages... Fantastic, kudos Archie. I've been interested in Cray for many years; this was a great presentation..!
Glad you enjoyed it :)
Yes, my thoughts exactly. I will have to watch it several times to really understand it.
It’s pretty interesting how modern GPUs use many of the same ideas like superscalar processing with many pipelines doing vector processing. Enjoyed the video, seems very well researched.
You're thinking of the shader engines within the GPU; those are vector processors. A GPU has many more functional units for graphics in the rendering pipeline.
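The comparison runs deep: the classic Cray-era vector kernel, SAXPY (y = a*x + y), is exactly the shape of work a shader engine chews through. A small sketch using NumPy (my own example, assuming NumPy is available):

```python
import numpy as np

def saxpy_scalar(a, x, y):
    # One multiply-add at a time, the purely scalar view.
    return [a * xi + yi for xi, yi in zip(x, y)]

def saxpy_vector(a, x, y):
    # One whole-array operation: the shape of work a Cray vector unit
    # (or a GPU shader engine) executes in bulk.
    return a * x + y

x = np.arange(8, dtype=np.float64)
y = np.ones(8)
print(saxpy_scalar(2.0, x, y))  # [1.0, 3.0, 5.0, ..., 15.0]
print(saxpy_vector(2.0, x, y))  # same values, computed elementwise at once
```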
Excellent documentary and amazing voice over. The accent is unique, yet pronunciation is consistent and not in the least bit annoying. I could listen to this content for days.
Thanks!
Ever since Tron came out in 1982, which used a Cray machine, I have always liked the Cray story. This video really explained everything about how the machines came into existence.
Yes!
I worked for Cray Computer in Colorado Springs in the early 1990s, for two years. First year in assembly and wiring, second year in gallium arsenide chip production. I could see the writing on the wall, and I got a new job at a chip manufacturing plant literally across the street from Cray. The next week former colleagues called to tell me that the place was closed and shuttered. End of an era.
This is an easy-to-understand video, with very clear CG and explanations of the purpose for which these machines were needed. It is also very important that the names of the scientists involved appear. Thanks for the upload to YouTube.
Thank you! Glad you liked it :)
I thoroughly recommend the book The Supermen as an accompaniment to this video. It covers this story in even more detail. Seymour Cray and his band of merry men were geniuses.
I agree, it served as the basis for this video and provides great detail, especially regarding the business side.
@@TechKnowledgeVideo The inspiration was obvious when you have read it too 😀
Thanks for the recommendation, I'll gladly read anything non-fiction and crammed full of obscure or interesting information.
I want one of these deep dives into Honeywell computers. Finding history on some of those is tough. This was well done, I enjoyed it.
Thank you! Never say never :)
I remember being in awe of the Cray-1's 80MHz clock speed at the start of the 1980s when I began using early microprocessors. This was a great overview of the general progression of high-end computing without delving into the technical details. I remember hearing that the Cray-1 used new ECL logic chips and that the power supply rectifiers were low voltage drop germanium to minimize heat production given the enormous currents that had to be supplied at low voltages to the machine's circuits. I also remember reading that the circular arrangement of the individual columns was to minimize and equalize propagation delays between units in the backplane interconnect wiring.
And now, using many of the concepts Cray developed, look at the multiprocessor power you can buy relatively cheaply in a desktop box. I guess the definition of supercomputer would always be shifting to include only the newest and fastest machines. So far has it come in just a few decades and mostly thanks to the incredible advances in the compaction of the circuitry on silicon dies. In the 1980s, 1um fabrication technology was the newest thing in transistor shrinkage. Now, clock speeds in the multi-GHz range, tens of CPU cores per die, memory into the TeraByte region etc. Just mind-blowing. Seymour is forever a legend in my mind. Thanks for this utterly watchable video!
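The wire-length discipline mentioned above is easy to quantify with a back-of-the-envelope calculation (my numbers; the propagation speed in the wiring is an assumed two-thirds of c):

```python
# Why wire length mattered at 80 MHz.
clock_hz = 80e6
period_ns = 1e9 / clock_hz      # 12.5 ns per clock
c = 0.3                         # metres per nanosecond in vacuum
signal_speed = 0.66 * c         # assumed ~2/3 c in backplane wiring
max_run_m = period_ns * signal_speed
print(f"{period_ns} ns period -> at most ~{max_run_m:.1f} m of wire per clock")
# ~2.5 m: a signal can cross only a couple of metres of wire per cycle,
# hence the C-shaped chassis keeping all backplane runs short and equal.
```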
Most cellphones are far more powerful than a Cray of yester-year. Great video huh.
I wasn't aware of a lot of the early UNIVAC history you relate here! In 1981 I was working on a UNIVAC 1100/42 (2 CAUs and 2 IOAUs along with an SPU). We had some really fast (for the day) fixed-head disk that we usually booted from, but we also had a FASTRAND II* drum(!) that we also could boot from. This was at Offutt AFB in Building 501 (the underground command post); I was a computer operator there and our system produced (via a Ford Philco(!) device occupying a couple of racks) vector graphics depicting force readiness and other such data for the people two floors further underground than I was. On the remote chance anyone is familiar, ADDO was our office symbol.
I also had passing occasion to program on a CDC Cyber 7600, as well as operating Honeywell Level 62 and Burroughs A2 systems.
---
* these had a failure mode that involved throwing the spinning drum through walls. I'm glad I didn't know that then!
Excellent documentary. I really like the graphics and animation you added to tell the story. Thank you!
Thanks, I appreciate it :)
A brilliant piece of work. The only thing you left out was the Cray-inspired Sun CS6400, which led directly to the E10K and onwards, perhaps the most important computer ever produced. I had the pleasure of working with the CS6400, and of course all the Sun machines.
I worked for Cray in the 80s and 90s. It was great hearing details about the beginnings of the company and familiar names.
Reading the comments... I feel they double the brilliant(!) content of the vid with the same quality: background stories, personal experiences and so on. I love that.
And to emphasize: no gaming world and no LLMs without Cray's vectorization.
Agreed. With few exceptions, the comments are a trove of historical info.
Superscalar execution was first used in the Intel 80960 processors in 1988. The Pentium team copied the technique along with some of the 960 parts, like the floating-point unit. I was on that team. The Cray designs were breakthroughs, and we followed the evolution of computer architecture from them and others to create an out-of-order design with speculative execution, branch prediction, register renaming and many other mechanisms that allowed instruction execution speedups. These went into the Pentium Pro design, which was not at all similar to the original Pentium designs. We were targeting Cray-level numerical performance as a goal, achieved it in the late 1990's, and then surpassed it with much more parallelism in both multiprocessing and advanced superscalar designs. These designs of the 1990's are still the root of current microprocessors. Advances from them have addressed other bottlenecks in the system, incorporating faster memory interfaces along with multilevel caching for increasingly higher performance as silicon technology also advanced. Many make the mistake of assuming that process technology alone increased the capability of new processors, when it has actually been the combination of computer microarchitecture and silicon advances that has led us to today. Thanks for the walk down memory lane... :)
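One of the mechanisms listed, register renaming, can be sketched in a few lines (a toy illustration, nothing like the real Pentium Pro circuitry): every write to an architectural register gets a fresh physical register, so only true read-after-write dependencies remain and independent chains can run out of order.

```python
# Toy register renaming: architectural registers (r1, r2, ...) are mapped
# onto an unbounded pool of physical registers (p0, p1, ...).
def rename(program):
    mapping, next_phys, out = {}, 0, []
    for dest, srcs in program:
        phys_srcs = [mapping[s] for s in srcs]  # read the current mappings
        mapping[dest] = f"p{next_phys}"         # fresh physical reg per write
        next_phys += 1
        out.append((mapping[dest], phys_srcs))
    return out

# r1 is reused for two unrelated computations (a false dependency):
prog = [("r1", []), ("r2", ["r1"]), ("r1", []), ("r3", ["r1"])]
for dest, srcs in rename(prog):
    print(dest, "<-", srcs)
# p0 <- [], p1 <- ['p0'], p2 <- [], p3 <- ['p2']
# The chains p0->p1 and p2->p3 now share no registers and can run in parallel.
```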
To see where the latest is at, check out the Cerebras wafer-scale CPU with 850,000 cores that can be clustered. It's evolved, like, wow.
Excellent video. Thank you.
I worked for CDC between 1975 and 1982, mostly in the IBM Plug Compatible Market (PCM) arena, but had occasion to interact with the CDC Cyber mainframe and then Cray operation. The people I worked with were open-minded, free spirited and driven to succeed. I consider Bill Norris to be the Elon Musk of his era. On one occasion, I was at the Arden Hills facility when a guy in a plaid shirt offered to explain how a 7600 that was in final checkout functioned. He demonstrated how several 'modules' could be physically removed while other engineers continued to test the system performance without it slowing or even crashing. I discovered the engineer in the plaid shirt was Seymour. A nice bloke.
One thing that was overlooked, and that contributed to Cray setting up shop 'over the border', was the parallel development of the CDC STAR-100, STAR in this case referring to the STring ARray architecture. The STAR-100 was not a Cray project but that of Jim Thornton, a one-time assistant to Seymour. That machine's development also ran long, but several were built and installed at the Lawrence Livermore labs in California.
Again, thank you. You should write a book about CDC. One is sorely needed.
BTW, the first computer I operated and programmed was a British EMIDEC 1100, already quite old, in 1965. Look it up; it was a true generation-1 system. We ran a plastics facility that employed 7,000 people, handling payroll, chemical simulation, sales and such, with an 8K 36-bit-word machine, without disc drives. Just four 35mm sprocket-feed tape decks and an 8K drum. The CPU used two 1K Williams-tube memories. Fun times.
Thanks! And thanks for sharing your story, very interesting :)
great video, really enjoyed it!
Glad you enjoyed it!
I remember when my university (USP in Brazil) was about to buy a supercomputer. Three companies came to make presentations (for all university staff and students to see) of their machines: IBM with its 3090 with vector processors, which required basically a building made for it; CDC with its ETA 10, which could be liquid-nitrogen refrigerated and was easily installed in the same building a traditional mainframe could be; and Cray (I didn't see Cray's presentation). The balance was tilting toward the CDC ETA 10, and then IBM offered the 3090 for free! Yeah! In the end I don't know what happened, but I am pretty sure the university didn't buy any of those (I may be wrong though). I think (my theory) the supercomputer project was scrapped, and time was borrowed or purchased from nearby entities that had powerful computers (some had 3090's, like Petrobras, and others ETA 10's; I don't remember if it was INPE or ITA that had one) without the hassle of maintaining the physical machine.
UFBA bought a 3090 in the 90s.
Stanislaw Ulam developed the hydrogen bomb without massive 1-D or 2-D computer models and a big computer: Stanisław Ulam, the Polish physicist on the Manhattan Project, invented the Monte-Carlo method to predict fusion reactions for various designs of the hydrogen bomb.
Ulam integrated the Monte-Carlo process to follow the laws of nuclear physics in generating neutron fluxes, x-rays, and daughter products. He accounted for the inherent statistical fluctuations in neutron chain reactions using only the upper and lower bounds of probabilities for each step. Ulam assigned UCLs and LCLs with 95 percent upper and lower probabilities for each reaction step for each hydrogen bomb configuration. He found that he could quickly predict whether a given design worked. He soon determined that Edward Teller's Spark Plug design would never work. Ulam then designed the fusion device that would work. Edward Teller would insist that Ulam was wrong in claiming that Teller's Spark Plug configuration would not work. Soon, however, Edward Teller would add a minor improvement and take full credit for Ulam's design. He would ignore Ulam's seminal contribution until much later. (Dark Sun: The Making of the Hydrogen Bomb, 1995, Richard Rhodes).
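For readers who haven't met the technique, here is a minimal Monte-Carlo sketch in Python (a generic branching-process toy with made-up parameters, not Ulam's actual weapons physics): run many random neutron chains and report the mean chain size with rough 95 percent bounds, in the spirit of the UCL/LCL bookkeeping described above.

```python
import random
import statistics

def chain_size(p_fission=0.4, children=2, cap=10_000):
    # Follow one chain: each free neutron triggers a fission with
    # probability p_fission, releasing `children` new neutrons;
    # `cap` cuts off the rare runaway chain.
    total, frontier = 1, 1
    while frontier and total < cap:
        frontier = sum(children for _ in range(frontier)
                       if random.random() < p_fission)
        total += frontier
    return total

random.seed(1)
trials = [chain_size() for _ in range(2000)]
mean = statistics.mean(trials)
sem = statistics.stdev(trials) / len(trials) ** 0.5
print(f"mean chain size ~ {mean:.2f}, "
      f"95% bounds ~ [{mean - 1.96 * sem:.2f}, {mean + 1.96 * sem:.2f}]")
```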
Univac wasn't the world's first commercial computer - that was LEO, the Lyons Electronic Office.
The dates are incredibly close - as far as I know the first LEO machine was put into service in September 1951, whereas the first UNIVAC was put into service in March 1951. As to which was "invented" first, it really depends on what you count as invented - i.e. the first test program, the first commercial delivery, etc.
So this is what inspired the Multivac name.
Found this through a newsletter to old CDC graduates. Brought back memories for me of many of the early players. A wonderful introduction to the importance of the labs and Sid Fernbach in early computing. LRL Livermore (Sid Fernbach) bought the first CDC 3600, the first CDC 6600, the first CDC 7600 and the first CDC Star 100.
Seymour Cray was unique in the computer industry: First Transistor computer, first RISC, first multi functional unit design, first instruction stack, then pipeline, first short vector approach, and always the most aggressive packaging and his ability to completely stop one design approach and go to a new one.
Congratulations on a well done journey from concepts in the 50's to the last super computers of the early 90's. I enjoyed 29 years on that journey until 1989.
Thanks! Glad you enjoyed it :)
Thank you very much for the effort that went into this documentation.
In the late 80's a friend who worked for Cray invited me to "family day" at Lawrence Livermore Labs. While there we visited the Cray room. If I remember correctly there was a row of 8 Cray-1's and a Cray X-MP. There was also a single Cray-2. The liquid tank of the Cray-2 was transparent, and one could see bubbles emerging from the boards. As it was family day, some children were putting fingerprints all over it. Fast forward to around 2010: I took a class of students to see the Computer History Museum in Mountain View. They had a Cray-1, a Cray-2 and a Cray-3. I suspect these were donated by LLL. The Cray-3 was by comparison tiny, a bit bigger than a toaster covered by ribbon cables.
As a very old expat Brit engineer who has spent his life in and out of computing, this has been a very interesting (and new to me) story. Of excess . . . Thanks for all your hard work. 😎
Thank you!
Well done, an amazingly researched and engaging presentation. Thank you.
Thanks!
This has to be the best video I've ever seen on YouTube. I love love love the animation style, your narration, and how you structured the story. I hope you continue with videos such as these! I look forward to continued watching!
Thanks! Yep, I’m working on a video at the moment, but they take a while :)
A thoroughly professional piece of work - research, illustrations, technical detail, and pace of delivery all very well judged. Congratulations!
Thank you so much! :)
Yes!
I didn't realize that the Cray-3 at NCAR was the only one in existence! I remember seeing it in their data center back in the '90's.
Is this not a lesson that management is the biggest overhead any company can have?
And yet management believe they are the most important, and that cuts to the workforce rather than to themselves will help the company prosper.
I recall reading, a number of years back, a blog post about someone recreating one of Cray's supercomputers using an FPGA.
Shows how far we've come that what once cost millions of dollars and weighed tonnes can now be implemented on a single chip that costs tens of dollars and weighs a few grams.
To think that with the small machines we have today, doing the same things years ago required a room full of huge machines. Back in the day, hard drives were the size of clothes dryers. The advent of semiconductor technology was a game changer.
Thanks, very interesting. I was a field service engineer maintaining CDC 6600s, then moved on to DEC VAXes and DECsystem-20s.
It seems minicomputers are lost in the dustbin of history. When prices dropped below $20K we really started computerizing America, giving wholesalers automated inventory control and accounting while PCs were playing games.
The cooling projects started with the 486 PC...
In the 70's and 80's I worked for a machine shop in Hopkins, MN making spindles for the disc drives for those computers. These things were massive; a 5 MB memory disc was the size of a record album. We made them by the thousands for Control Data and MPI.
I find it interesting that tech businesses have seldom made financial sense. Making a billion in sales yet having to take minimum wage to save a project (mind the inflation). Where was the money going that was more urgent than developing the company's flagship? This story was masterfully done. I will watch it again for sure.
Thanks :)
Great review! As someone who grew up through this development, it's fascinating to re-live all these milestones, left behind and forgotten...
Thanks!
"The addition of the second dimension increased the number of mathematical operations required exponentially ( 2:10 )".
Uhmmm ... NO! At max quadratically.
Sorry for the nitpicking. Great video and thanks for sharing!:)
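To put a number on that nitpick, here is a minimal sketch in Python (the per-axis sample count n is just an illustrative parameter) showing that going from a 1-D grid to a 2-D grid grows the operation count quadratically in n, whereas exponential growth would look like 2^n:

    # Compare operation counts, assuming one operation per grid point (illustrative only)
    for n in (10, 100, 1000):
        ops_1d = n        # a 1-D grid has n points
        ops_2d = n ** 2   # a 2-D grid has n * n points: quadratic, not exponential
        print(f"n={n}: 1D={ops_1d}, 2D={ops_2d}")
    # Truly exponential growth would be 2**n, which at n=100 is already about 1.27e30.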
Well done! As a computer engineering grad in 1994, I didn't truly appreciate the history that underpinned the industry I love so much. I worked on SGI computers at the Canadian National Research Council in 1991/1992 and didn't realize they would go on to acquire Cray Research. Fascinating!
Thanks! :)
I started my career using the Cray X-MP and its competitor, the CDC Cyber 205. As a young engineer, it was thrilling to use the most powerful computers ever built. Six years later I had about a quarter as much processing power in a small parallel processing machine on my desk. Now the CPU of my consumer-grade desktop provides 1000 times the raw power of the X-MP, and that ignores the amazing power of the GPU! This video illustrates the constant, breakneck pace of advancement in computing.
What's most impressive is that somehow they managed to build vast, highly profitable companies while dealing with the incredible rate of change.
You put an incredible amount of work into this vid and we appreciate every minute of it 👍
Thanks :)
Many thanks for this video. Like most people I have only ever seen Cray computers in movies and magazines, but there is something very cool about the look of them, and the fancy cooling systems. In the movie Sneakers, Ben Kingsley is seen sitting on one while chatting.
Glad you enjoyed it! For sure, they had very interesting designs :)
I saw Sneakers again recently and noticed it straight away!
Hats off to Cray and his team!
Wow my grandpa worked for Control Data and then Ceridian for nearly 50 years and I didn’t know too much about the history of the company. Thanks so much for making such an incredible video describing it all. I can’t wait to call him about this!!
Glad you enjoyed it :)
A video of this quality deserves a lot more than 12.5K views. I hope the YouTube algorithm bumps it up a bit in the recommendations queue.
surprisingly good video - subbed.
I'm old enough to remember Cray computers... Silicon Graphics' and Sun's heyday.
Thanks :)
Amazing compilation of facts.
Thanks
Thank you!
Great documentary. Though if I could level some constructive criticism, I'd say it's weird that you didn't mention Silicon Graphics buying Cray. Sure, it's not a huge part of the story, but it is part of it.
At some point in the future I plan on visiting the Silicon Graphics story in full, so it may appear there.
@@TechKnowledgeVideo Looking forward to seeing that video! 👍
I remember when Cray was the shizznits. And then I remember one sitting out in the open on a loading dock for over a year because nobody knew what to do with the thing. I can't even recall whatever happened to it. Probably went to the breakers.
Thanks, very well done.
I have been in the IT realm pretty much all my life. I grew up with kids whose parents worked at CDC, Cray, Honeywell and other companies.
One of the things that has always amazed me is how fast things change and how slowly they develop.
My brother went to MIT, and I remember playing a dogfight game on a 6"x6" green screen running on a PDP-4. The screen drew so slowly that the body of the plane, made up of dashes, would bend when going around a corner. This was in the late '60s.
Talking about the massively parallel designs for the later supercomputers reminded me of something my brother mentioned. Some of the large computers were running into problems with the speed of electricity (light, in essence). They needed to build the computers in a way that shortened the length of wire connections. A nanosecond is a very short amount of time, but, in terms of light speed, it is about a foot in length.
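That foot-per-nanosecond figure holds up; here is a quick back-of-the-envelope check in Python (constants only, nothing assumed beyond the vacuum speed of light):

    # How far light travels in one nanosecond
    c = 299_792_458          # speed of light, metres per second
    t = 1e-9                 # one nanosecond, in seconds
    d_m = c * t              # ~0.2998 metres
    print(f"{d_m:.4f} m = {d_m / 0.0254:.1f} inches")  # ~11.8 inches, about a foot

Signals in real wiring propagate at roughly two-thirds of that speed, which is why shortening interconnects mattered so much in those designs.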
I remember visiting the NOAA Physics Laboratory in Boulder, CO in the late '90s. They were still using a Cray, and the largest robotic data cassette storage solution I have ever seen.
Cray and Chen: innovative masterminds who wanted to keep pushing the limits by completely redesigning from the ground up, pioneers of the industry whose designs sacrificed everything in exchange for being the latest and most powerful machines.
Davis: a down-to-earth mastermind who wanted progressive upgrades that maintained compatibility with old designs and were far more accessible to more buyers, at the expense of not being cutting edge.
What ultimately doomed Cray (the original company, not the modern one) was clearly the lack of compromise on plans and designs. Cray's and Chen's teams could have partnered up and built one niche machine instead of two, while Davis's team continued to deliver the main product. Sometimes the brightest minds need to be brought back to reality for a second so they don't trample on everybody else.
Cray unfortunately died in his 70s in a car accident in the 1990s, despite still being just as active in the industry as when he was young. Chen isn't the same Steve Chen as the YouTube co-founder, obviously. This Steve Chen is Dr. Steve S. Chen, who ended up starting a few more companies and is still active in the industry despite his age! One has to wonder how old Cray would have been before he finally decided to retire; probably never, given his tireless brain and passion.
I’ve heard that it was someone other than Chen who actually got the X-MP to work, although I can’t remember the name.
I always wondered what happened to Steve Chen. He really screwed things up.
Cray was one of our customers and I got to visit their factory. I have never been so cold in my life. It was -20F outside. The factory was not all that impressive. It looked more like a start-up than a supercomputer manufacturer. They used to buy electrostatic printers from us, as they were the fastest available at the time and their diagnostics apparently dumped a lot of paper to pore over.
There's one in the computer museum in Silicon Valley that you can see.
The Cray vs Amdahl wars were legendary.
incredible video 👏👏 love the animations in this one
Thank you so much :)
Great research, great video! thank you, really enjoyed it!
I like the 3D approach rather than pictures/videos. It must be pretty time-consuming, though (then again, scavenging for fitting pictures can cost a lot of time too).
Thanks :)
My father worked on UNIVAC I; they figured out how to improve reliability, getting most of a day out of the system instead of an hour, by replacing all the tubes in a bank when one blew. They sold the untested leftover tubes to the hobby/used-parts market in downtown Manhattan.
You should have added something about Seymour's last company, SRC Computers. They used a liquid-cooled (still Fluorinert) Xilinx FPGA that could be reloaded in milliseconds to process a different algorithm, essentially becoming a custom processor specifically for that algorithm. They were located in Colorado Springs and eventually went out of business as well.
That's interesting. I worked for Cray and we were all crushed when Seymour left for Colorado. We never knew much about the technology, that is interesting about a reconfigurable processor, thanks.
My grandfather was an Engineer at NASA where they tested the rockets. He told me when they ordered a Cray and the order catalogue was several hundred pages of checkboxes to configure what you wanted it to be capable of. They ordered one to do some kind of geometry simulation. Not sure if he was around on that job long enough to use it. He later went into shipbuilding. It was a much shorter commute.
My introduction to computers began in the U.S. Navy in 1962 as an Electronics Technician (ET) assigned to one of the first three ships to have digital computers aboard. The ship had so much prototype equipment besides the two UNIVAC AN/USQ-20 all-transistor computers that we had civilian visitors on board almost continuously when we operated on the eastern side of Hawaii in the Pacific Ocean. Among those were some who were familiar with Seymour Cray, and I was told that our computer had had input from him. This fact confused me when, much later in life, I read about the timelines you detail here. Thank you for the information about his being allowed to work independently while at Control Data.
The AN/USQ-20 was not a supercomputer, but it was amazingly good for its very small size at that time. During our onshore training we used a vacuum tube machine as an emulator. Of course, though the unit was about the size of a double-door refrigerator (air-cooled by an integrated chilled-water heat exchanger, by the way), it had less computing power than my iPhone.
I got to program the X-MP and Y-MP in graduate school in the '80s. After graduating I worked for Thinking Machines Corporation, whose focus was "massively parallel" computers. The competitors at the time were nCube and the Paragon. We all used pretty specialized hardware and custom-created compilers to make use of it. After a while IBM simplified the world even further when they introduced the SP1 and SP2, which were basically just a bunch of RISC workstations taped together. The interesting part is that a lot of the software and computing concepts created by Thinking Machines, IBM, and others in the late '80s are what is enabling much of present-day cloud computing. It may be a lot of work, but it could be a really good follow-up video.
Great video.
Cray 1... then the Cray X-MP (52:00), then the Cray 2, then the Cray Y-MP, then the parallel processing machines.
I worked briefly in the old glider factory building (in 2022 and 2023) and found the history fascinating the entire time I was there; I wanted to learn everything I could! Not to mention I was so hoping I would find a forgotten piece of equipment somewhere, but alas, the building has been heavily stripped and modified over the years. All that remains from the old machine shop pictures are the mounts for the wooden beam ceiling.