Damn I wish Gary Kildall was still with us.
He is so hot
remingtonh Who?
@@peterjszerszen Don't forget about Linux, I transitioned over about 3 months ago and am really enjoying it. The command line is still a great way to work.
It's amazing to realize that someone as successful as Kildall became an alcoholic ... shows that prestige and success do not bring purpose and meaning. We have to reach higher for that. John 5:24.
Don't get in any bar fights. :(
I love how professional everything was in this video.
Same here ... it was an extremely well planned and put together program ... all their shows!
Standards seemed higher
This show had such articulate guests and hosts.
People are scatterbrained these days, especially when they're younger. I listen to so much verbal diarrhea in my day-to-day. Even professional talking heads often misspeak or stutter or struggle to get their words out.
Watching old talk shows, lectures, interviews or programs like this and you’ll notice people used to be much more put together.
I agree such a great show
@@lookoutforchris you hit the nail right on the head
The success of this show was down to Stewart and Gary respecting that the audience had some knowledge of the topics covered.
@@lookoutforchris
Couldn't have said it any better, today's "modern" world is a brain zapper.
One hell of a show. The super computer interview was first class. Could've listened to these guys talking for hours.
The device that is now showing you this video is way more powerful than the machines they are talking about. But a lot of respect for the people in the video who paved the path to get where we are today. They made it possible.
Random fact - if you read the book Jurassic Park (1990), in the beginning the government had no idea what was happening on the island and was investigating the company. They knew that the company InGen had imported three Cray X-MP supercomputers onto the island. The ones in this video were the earlier versions used to design nuclear weapons. One computer was enough to do most of the work anyone would need - yet this company named InGen had imported three of these to a remote island. The government was baffled as to why someone would need that much computing power on a remote island - little did they know the park was using them to reconstruct the genome of the dinosaurs.
Random fact - The supercomputer in the Jurassic Park film is a Thinking Machines CM-5 supercomputer, not a Cray.
Little did they also know the park boasted some silicon graphics machines as well. Screw the dinosaurs, they should have sold tickets for people to come and see the computers.
The fun thing about the technology in this video is that, over time, these supercomputers with their specialized parallel and high-performance hardware started using more standard operating systems (like, eventually, Linux...), meaning those operating systems had to add support for this kind of specialized hardware. Later on, when typical consumer hardware like desktop and phone CPUs became more like these old supercomputers (with lots of pipelining, and multiple cores), the OSes already had the support in place to run in those kinds of environments.
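Tangent, but it's neat how ordinary that support looks from user code now. A minimal sketch, assuming Python and nothing beyond its standard library: the OS reports the core count and schedules worker processes across the cores, exactly the kind of parallel plumbing that had to mature on the big machines first.

```python
# Hypothetical illustration: spreading independent work across every CPU core
# the OS reports, using only the Python standard library.
import os
from concurrent.futures import ProcessPoolExecutor

def simulate_chunk(seed: int) -> int:
    """Stand-in for a compute-heavy task (e.g. one slice of a simulation)."""
    total = 0
    for i in range(1_000_000):
        total += (i * seed) % 7
    return total

if __name__ == "__main__":
    cores = os.cpu_count() or 1              # the OS tells us how many cores exist
    with ProcessPoolExecutor(max_workers=cores) as pool:
        results = list(pool.map(simulate_chunk, range(cores)))
    print(f"ran {cores} chunks in parallel, results: {results}")
```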
Could you imagine having a show like this today?
😂 Today's computers are busy making and storing the 100,000th version of the same meme.
7:27 If you watch the SF film _The Last Starfighter_, which came out this same year, its CG is just streets ahead of anything else from around the same time. The effects were rendered on a Cray X-MP super. The cost of the computing time was a third of the entire movie budget, and the result took up a similar portion of the movie's running time.
I approve this message. (And now we can do everything from this film in real time, and even better.)
And worth every penny! Of course a modern PC and someone who knows how to use Blender would blow it out of the water nowadays.
But the CGI holds up surprisingly well and has even aged with charm, instead of the uncanniness you see even in modern movies. The models are very complex and wouldn't be easy to handle, even for a supercomputer of the time.
One of my favorite movies when I was a kid, even though I didn't see it until the 90s when it came on TV. I'm only from '89 though.
I only discovered this show a year ago. What a fantastic show it is. As a kid growing up in the 80’s I would have loved it.
I remember watching this program when it was first aired and being in awe of what they were projecting for the mid 80s.
In most cases... the practical kind, like processor speeds and RAM amounts, the projections were far surpassed. In the case of AI, etc., they usually underdelivered. That's the nature of futurism, though.
Gotta respect these people for what we have today.
It now costs about 4 cents to outpower the $15 million Cray mentioned in this episode.
Imagine building a super computer to hit 10 gigaflops and then your kids have flame wars over whose game console can perform more teraflops.
Well then I would be disappointed in myself.
Well, to put that into some perspective, the little battery-powered, pocket-sized Nintendo 3DS handheld system is 5 gigaflops. Half of what those powerful supercomputers were at the time.
Apparently computers that had the specs to run Win 98 were as powerful as a Cray 2
Gary mentions Grace Hopper at 8:55.
As I write this in 2023, NVidia has introduced their Grace Hopper superchip for AI workloads almost 4 decades after she's mentioned.
The world is so different than in 1984.
Summit, the US's new supercomputer, is more than twice as powerful as the previous world leader. The machine can process 200,000 trillion calculations per second, so yeah, we have come a hell of a long way over the last 36 years.
I could have listened to the initial conversation on super computers and algorithms for another hour or so... So interesting.
are you a super computer user?
It is easy to scoff at these early supercomputers, but thinking out loud here, they were the foundation of knowledge and experience that brought us what we have today. The road to today was built by these machines and people.
1984: Cray X-MP, 128 MB, 800 MFLOPS, $15 million (governments and megacorps only).
2023: Nvidia RTX 4090, 24,000 MB, 89,000,000 MFLOPS, $1,600 (gamers…)
2064: ????
It is still amazing how much each generation of computer systems has progressed beyond the previous one in less than a decade. Even the new supercomputers make the older supercomputers look like antiques by comparison.
It's ironic that I am casually watching this show on the internet, on a modest modern PC (i5, 2g HD and 12gb ram) that was not even dreamed of only 30 years ago. Where will this technology take us 30 years from now?
Joy of Lego I'm pretty sure they had 2GB HDD's in the 80's 😉.
I think that he means 2TB and just made a typo.
Ray Kurzweil, in his book _The Age of Intelligent Machines_, wrote about stuff like current laptops, smartphones, and tablets 30 years ago. In his opinion, in 10 years machines will be able to do everything humans can now.
@@Kynareth6
Yep, we are likely on the edge of having computers that can simulate every connection in the human brain.
@@janruudschutrups9382 By the mid-1980s, consumer desktop hard drives were available in the 10 to 20 MB range. A 2GB drive would not be sold until 1992, and even then it cost an arm and a leg. 2GB would become normal in the consumer space closer to the mid-1990s.
What an amazing episode...
Sad to see people here mocking that time; those were the pioneers, and all we have today came from there.
They aren't mocking. I think most of the people know what you're saying. It's just that we've made ridiculous amounts of progress since then
Thank goodness there are nations out there with the same values as we do.
Imagine a world where every country was like Nigeria.
It's going to become like Nigeria or Brazil soon though
Thank God we got rid of Trump, the most ignorant and arrogant president in US history
Love it. It's like a time machine.
Imagine comparing your modern computer to these antiques 🤔
15:50
40 years later and that is still a problem we're trying to solve!
Seeing this just a few hours after I installed my Ryzen 9 3950X! Time has passed!
Seeing this while I am waiting for the Ryzen 9 5950X to be back in stock! Time has passed! ;-)
"...one of the fastest supercomputers in the world, capable of 800 million floating point operations per second." - Yeah, but can it run Crysis???
Whatever smartphone you're watching this on is faster than every computer mentioned in this episode put together. By a lot.
The first guest that talked is smarter than what he projects. Everything he said wound up being correct.
PC stuff was SO expensive back then, like YIKES! expensive.
Perspective is so far off today. People think computers or tech in general is expensive, but really it’s not.
My first computer (an Olivetti M24 SP running at 10 MHz and with a 20 MByte hard drive) cost more than my first car (a nice big luxury 6-cylinder)... both bought in the same year.
Wow!
A supercomputer that can do 100 million calculations a second.
That's almost 1/12 the power of my smartphone.
Well, the smartphone is the product of all this work, plus almost 25 years more of work. It's what makes these shows so damned interesting in the modern day.
You're wrong, I think it's more like 1/15 the power of your cell phone hehe
The A13 Bionic does 1 trillion operations per second. This supercomputer is 1/10,000th the speed.
A PC would still do certain things better.
A 100 gigaflops supercomputer vs the iPhone 12's A13 chip at 5 teraflops. 50X faster.
A modern smartphone can do 800 BILLION floating point operations per second now...
I recall reading an article in New Scientist back in the 1990's about the race to build the first Teraflops supercomputer.
The latest Xbox supposedly has around 12 times that performance.
I wrote a (bad) sci-fi novel in the early 90s in which an entire planet was run by a single "Teraflop Computer" lol
Not really the same numbers. For supercomputers, the FLOPS figure is always quoted for double-precision math (64-bit), since that is the standard for high-quality simulations where the tiniest bit of accuracy makes a difference. In the world of consumer GPUs, games don't need that kind of precision. The 12 TFLOPS of the current Xbox is single precision (32-bit); its double-precision figure is only 759 GFLOPS, just short of a teraflop. So that "old" supercomputer you read about is still faster at what it was designed to do than the current Xbox. Keep in mind, though, that the crippling of double-precision performance is intentional, so that people don't buy these cheap consumer units and stack them up for serious scientific work; they want you to buy the super-expensive commercial/industrial GPUs. And those are wicked fast: the current Nvidia A100, a standard "budget" server-farm GPU, does about 10 TFLOPS double precision, for example.
With that said, modern GPUs have one massive thing those old supercomputers didn't have, and that is super-fast memory access. The Xbox has 320-bit GDDR6 memory at half a terabyte per second, which so far exceeds the old supercomputers that the throughput would make it effectively faster despite the slower computational speed in double precision. It is in fact for this reason that you can run a neural-network LLM on today's moderate-spec hardware, which would never have been close to achievable on those old supercomputers of the 90s.
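To make the precision point concrete, here's a tiny sketch (assuming NumPy is installed) of how much sooner 32-bit floats run out of digits than 64-bit ones, which is why simulation people only care about the double-precision number:

```python
# Illustration only: single vs. double precision resolution.
import numpy as np

print(np.finfo(np.float32).eps)   # ~1.2e-07 -> roughly 7 significant decimal digits
print(np.finfo(np.float64).eps)   # ~2.2e-16 -> roughly 16 significant decimal digits

# A tiny increment that one simulation step might produce:
print(np.float32(1.0) + np.float32(1e-8) == np.float32(1.0))  # True: lost in 32-bit
print(np.float64(1.0) + np.float64(1e-8) == np.float64(1.0))  # False: kept in 64-bit
```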
Pluck Gary from 1995 (a week before he dies) and put him in 2020. He'd be in frikkin tech heaven! And he'd be gobsmacked it was only 25 years. He thought the 80s tech evolution went fast...
He died in 94 I'm afraid...
I wish they had waited a year and done a special on the Cray-2 supercomputer that came out the next year, in 1985. That was something I loved as a kid; I was born the same year the Cray-2 became operational.
Hundreds of millions of calculations per second. Last year Frontier broke 1 quintillion.
Dr. Hideo Aiso is 87 now (born 1932).
I love the cut 14:19 that indicates Stewart is fluent in Japanese (he probably is).
And now a small rectangular device that fits in your pocket has way more processing power than a supercomputer back then that was the size of an entire large room. Technology evolves fast, I can only imagine what we might have 20-30 years from now.
I recall being in primary school in the early 60s when the teacher said computers cannot make a mistake... why did this stick in my mind?
Back when the people who actually developed the hardware also had to sell it. Not like today, where any marketing wanker can sell something he hasn't touched.
I remember the panic over the Japanese computer threat. Then Linux came along, Intel developed the Pentium, Donald Becker developed Beowulf cluster software, and the whole notion of the "supercomputer" was fundamentally changed.
Yes, it's quite funny to see how afraid they are of Japanese computers. In the end, the Japanese computer manufacturers just began to build Windows machines.
@@acmenipponair It was a real fear at that time since their government subsidized the cost of R&D and they had the existing manufacturing capacity, as well as customer base to deliver to. In the end, it became irrelevant since the consumer market drove high performance computing to an exponentially higher degree. Now in the current world we see total proof of this as supercomputers are just the same microprocessors in your ordinary devices, just tons of them well balanced together.
@@oldtwinsna8347 Another thing that may have contributed is that there was a bit of a trade war with Japan in the 80s over this sort of stuff. Some US government officials even smashed a Japanese radio as part of a demonstration claiming that the Japanese sold precision CNC machine tools to the USSR. Though other countries had already sold the USSR similar equipment. But even before that, the US was trying to limit the amount of imports from Japan. The strange thing is that the US never seemed to do this with China. Stuff coming from China has American brand names on it, and it seems they are OK with that, even if it de-industrialized the country. Apparently competition from Japan was a serious issue, but they had no issues with just completely shutting factories down and making stuff in China.
I'm sure glad guys like that did all that work so that I could sit here using a 100x more powerful computer to listen to them.
It's incredible that we now each hold the equivalent of a back-then supercomputer in our hands.
why?
I used a Cray to calculate stresses in structures around this time. Never saw it, but I did recently see one in a museum. I was surprised how small it looked; even so, there were masses of wires at the back. The speed-of-light problem went away when everything was put on microchips.
What types of analysis were you doing?
@@gregorymalchuk272 : Stresses in bus bodies, using NASTRAN. Buses take a hell of a pounding, on lousy road surfaces and a large difference between empty and full weights. Even had a [double decker] load case of full on top and empty downstairs
17:54 What seems to have happened is that FORTRAN has evolved to include features to take advantage of vector units and highly-parallel processing.
Now that transistors are closing in on the molecular structure of silicon, we are still at the dawn of processor-based computing. As quantum computing becomes more viable, and probable rather than merely possible, we could soon make far larger leaps than ever before. If this is supercomputing, what will my eight-year-old granddaughter see in her life?
Quantum computers are a _lot_ of hype. Don't get me wrong, they're a real technology with real applications, however popscience seems to imply that quantum computing is the next generation of computing in general, and that it will "replace" our current CPUs or whatever. That's not the case, because quantum computing does not provide any benefit to any of the tasks we normally run on "normal" computers. Instead they are able to run a whole new set of tasks that you _can't_ do on a conventional computer.
15:12 Overclocking and liquid cooling back in the day.
What a patriot, dude is like “Us Americans will come out with the best computers first” Meanwhile, Japan is kicking our ass in electronics and robots.
The US dominates the top500. China is a heavy hitter these days too. The top of the list bops around. The current top supercomputer is Japan's, the US has spots 2 and 3, and China is in spot 4. Everyone leapfrogs each other; it's always interesting to see what the next fastest machine in the world is.
(1984) Supercomputers are now video games, TODAY~!
In 2056, present day super computers will be a big joke.
When the Dreamcast is more powerful than supercomputers from 15 years before its release.
Meh, my computer is faster than those supercomputers from 36 years before its release, so no big deal.
20:43 Not long after this, I think it was, one major US university was about to choose to buy a Fujitsu super, until Government pressure made them change their minds and go back to a good old all-American Cray instead.
Suits/ties/nice hair/nice beard to use a PC
Here we are 35 years later and HPC is still dominated by Fortran...
Insert "My budget Chinese smartphone is 10x faster than these things" comment down below.
Well it's true! Lol
10X? Keep going.....
It's weird how they just nonchalantly talk about nuclear weapons
Cold war
Love this show, because it really shows the development of the IT field and computer tech from 1983 till 2002. These things were the ultra-modern stuff back then.. I can imagine what we have nowadays, not even talking about the military black projects or DARPA; they are like 50 or 60 years ahead of current technology.
DARPA is not “50 or 60 years ahead”. Moronic
Them’s were the days!
Needed a supercomputer to render 3 colours at 30fps 😂
The Cray X-MP sold for $15 million - the 1.2GB worth of drives sold for an additional million dollars. The processor in most modern smartphones is about as fast if not faster.
My cheap ass 40 dollar smartphone is 10 times faster.
I learned about Cray X-MPs from reading Jurassic Park
100,000,000 operations/second, or 100 MIPS, used to be the pinnacle of computer technology in 1984 via the Cray. Now we have an i7-4790K microcomputer which can perform 144,550,000,000 operations/second, or 144,550 MIPS. A high-end microcomputer made in 2014 is 1,445.5 times faster than a supercomputer made in 1984.
2014? what about 2019 with i7-9700k?
@@oldtwinsna8347 2022 and intel changed their entire architecture with P and E cores lol 💀
800 million operations per second... My GTX 1080 can do 9 trillion floating point operations per second... it would take 11,250 of those supercomputers to match my graphics card.
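For what it's worth, that figure is just the ratio of the two numbers quoted in the comment:

\[
\frac{9\times10^{12}\ \text{FLOPS}}{800\times10^{6}\ \text{FLOPS}} = 11{,}250
\]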
I was just thinking that... these supercomputers they are speaking of in this video are WEAK compared to what is available on the consumer market today. Hell, the GPU in my phone could run circles around what they had in '84.
But GPUs are limited to vector operations.
So what? Are you living in 1984?
Some supers today use programmable GPUs, too.
How high do you think your gaming rig would score on the list of the world's top supers (www.top500.org/)?
Wouldn’t make it anywhere near the list...
my wife's smart dildo does 20 trillion floating point operations per second
Arnold Schwarzenegger starred in _The Terminator_ at this time.
Just to be clear, the system was not 100 MHz but rather performed 100 million floating point operations per second (MFLOPS). Processor clock speed and number of operations are not necessarily the same thing.
And nowadays we're all walking around with devices in our pockets that are even more powerful. Crazy.
Supercomputers can do in excess of 30,000 trillion calculations today.
+Jason Kendall Yeah, they sure have evolved. Just imagine super computers thirty years from NOW.
Sunway TaihuLight, the current fastest system, is able to do about 93 PetaFLOPS (93,000,000,000,000,000 floating point operations per second). So that's 93 quadrillion calculations, which is roughly 3.1x the speed you've mentioned (units worked out below). I'm sure that by 2020-2025, we'll be looking at early quantum computers that might be able to scale up to the ExaFLOPS level by truncating the floating points into smaller block chains. (Though memory models are currently an issue, since we would basically need analog variability in each cell to represent the variation between 0-10, for instance, rather than 0-1.
Programming something like that is also likely going to be a bastard of a job, since the level of complexity is also going to increase by a magnitude, I'd think.)
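Putting the two quoted figures in the same units makes the 3.1x explicit:

\[
30{,}000\ \text{trillion/s} = 3\times10^{16}\ \text{FLOPS} = 30\ \text{PFLOPS},
\qquad
\frac{93\ \text{PFLOPS}}{30\ \text{PFLOPS}} = 3.1
\]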
When you say 30,000 trillion you should specify the elapsed time interval for those calculations to be completed. Heh
@@sbrazenor2 hmmm has this aged well? :)
@@nerd2544 I did mention 2020-2025, so we have a few more years to be sure. Check back in a few years. 😁👍
I really just want to hear that guy yell "Norton!!!" at the top of his lungs.
Yeah he loves motorbikes
Japan still makes the fastest super computers to this day
I like the time when everyone wore business suits.
Do feel uncomfortable when faced with diversity, or do you feel insecure about what to wear nowadays?
@@ArumesYT How’s that relevant?
@@uriituw agreed. He or she is race carding this, which is getting old really quick
Wow, 800 million floating point operations per second. They probably couldn't have imagined PFLOPS. Meanwhile I'm using 15 GFLOPS to watch YouTube.
2:10 - Only 100 million operations per second. LOL!
My i7 is doing 300,000 MIPS :-D
George Michael? Last Christmas I gave you my Supercomputer .... 😁
I loved the classic 80s intro
Noob question: is Stewart multilingual?
dual processor on the Cray? no one needs that kind of power!
Some people seem to be getting confused over the speeds of these computers.
The Cray computer was working at 100 MIPS, that is, millions of instructions per second. Not 100 MHz, which is millions of machine cycles per second. Not every instruction is processed in every cycle (the relation is written out just after this comment).
Also, in terms of catching up to Moore's law: a Cray of the time would be about as powerful as an iPad 2. And these supercomputers were NOT general purpose. They were mainly vector-processor based. That's a somewhat different form of math from what most modern processors use most often (general-purpose processors have vector units but tend to rely on their integer and floating point units).
MegaFLOPS were also mentioned. 100 megaFLOPS is equivalent to maybe a Pentium or a late 486. 10 gigaFLOPS would be more akin to a PowerPC G5 or a Pentium 4. Most modern ARM (mobile) processors are about that fast.
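For anyone wanting that relation written out, here it is with purely illustrative numbers (not the Cray's actual instruction mix):

\[
\text{MIPS} \approx f_{\text{clock}}\,[\text{MHz}] \times \text{IPC},
\qquad\text{e.g.}\quad
100\ \text{MHz} \times 0.8\ \text{instructions/cycle} = 80\ \text{MIPS}
\]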
But what matters is how fast these old supercomputers would be with today's type of applications we can relate to. For example, would you be able to create a program on a Cray to do real time h.264 decoding at HD resolution?
@@oldtwins Not likely.
On the vector processors: we have such processors in our computers today, but NOT as the main general-purpose CPUs; they're the GPUs. And there you can say that a Cray-1 is comparable to the GPU of a small smartphone.
@@acmenipponair No, we've had vector processing in CPUs ever since Intel MMX and successors like 3DNow!, SSE, AVX, and AVX-512. PowerPC had AltiVec/VMX and VSX, ARM has VFP, Neon, and SVE, SPARC had VIS...
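A rough sketch of what that vector style looks like from user code, assuming Python with NumPy, whose whole-array operations use SIMD instruction sets like those listed above under the hood where the CPU supports them:

```python
# Illustration: scalar loop vs. vector-style whole-array operation.
import numpy as np

a = np.random.rand(100_000)
b = np.random.rand(100_000)

# Scalar style: one element at a time, the way a plain integer/FP pipeline works.
c_scalar = np.empty_like(a)
for i in range(a.size):
    c_scalar[i] = a[i] * b[i] + 1.0

# Vector style: one expression over the whole arrays; the library's inner loops
# run on SIMD units (SSE/AVX/Neon, etc.) where available.
c_vector = a * b + 1.0

print(np.allclose(c_scalar, c_vector))  # True: same result, very different speed
```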
Not quite. A P4-2.53 (2002) had a performance of 0.4 GFLOPS, non-overclocked. That's about 1,000 times the speed of an 8088 (1981). Time interval: 21 years. A Ryzen 5 3600 has a raw speed of around 500 GFLOPS. So that's roughly a 1,000-fold increase in CPU calculation speed in 18 years' time. So the speed increase per time interval is itself increasing. Slightly.
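Using the comment's own round numbers, the implied doubling times back that up (back-of-envelope only):

\[
\Delta t_{2\times} = \frac{T \ln 2}{\ln 1000}:
\qquad
\frac{21 \times 0.693}{6.91} \approx 2.1\ \text{years},
\qquad
\frac{18 \times 0.693}{6.91} \approx 1.8\ \text{years}
\]

so the doubling interval did shrink a little between the two periods.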
"over a 100,000,000 operations per second"
*stares in 3990x*
You'd need a Cray X-MP to watch a YouTube video of this episode in 480p.
It would be hard pressed to show a 144p video on a 3 inch screen, even if you added a specialized graphics unit in line with the CPU's capabilities and the top end of what was available at the time. If you wanted 480i or 576i colour video at full frame rates, analogue TV was still your only choice.
Chances are you're watching this on a device several orders of magnitude more powerful than the supercomputers mentioned in this episode.
Stewart in 2023: Hold my supercomputer.
RIP
George Michael, wasn't he with Wham!
This super computer is called the "Cray"...because it's crazy fast.
The classic summary: supercomputer speeds in 1984 are the current speeds of desktop computers.
No, modern desktops are 100-10,000 times faster. A quad-processor Cray X-MP was 800 MFLOPS. Modern top desktop processors are at about 100 GFLOPS. GPUs handle 7 or more TFLOPS.
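In round numbers, from the figures quoted in that comment:

\[
\frac{100\ \text{GFLOPS}}{800\ \text{MFLOPS}} = 125\times,
\qquad
\frac{7\ \text{TFLOPS}}{800\ \text{MFLOPS}} = 8{,}750\times
\]

which is where the 100 to 10,000 range comes from.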
Good old days
What operating system do supercomputers use? And mainframes too, what operating system do they use? Thanks
What is that framed picture behind the visitors?
It's not what you think it is.
I also thought I saw what it is not, but it is not what I saw.
PS: very interesting picture indeed, but take a CLOSE look at it and you will see what it really is.
Today all the power of a supercomputer of that time is in our pocket, used to scroll memes..
it's sad that you choose to use it for that.
In this episode of Computer Chronicles we bring you the 1980s version of Fugaku
The Cray-2 supercomputer came out the following year (1985) and was liquid cooled, could perform 1.9 billion floating point operations per second, and consumed 200 kW of power! A smartphone today is hundreds of times more powerful, consumes a very small fraction of the power, and runs cool. But it's still not smart enough to correct software bugs on its own, nor can it program itself. But maybe it's better that way?
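Taking the quoted Cray-2 figures at face value, the efficiency gap is even starker than the raw speed gap (smartphone numbers vary widely, so treat this as order-of-magnitude only):

\[
\frac{1.9\times10^{9}\ \text{FLOPS}}{2\times10^{5}\ \text{W}} \approx 9{,}500\ \text{FLOPS per watt}
\]

versus a phone chip delivering tens of GFLOPS in a watt or two, which already works out to something like a million times more work per watt.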
People that comment on these videos always say some variation of "The hardware I have today is way better than what they showed!", but they leave out that without the "building blocks" of previous technology, what we have today would not exist.
@@SweetBearCub got to build the tools to build the tools.
It's so interesting to hear the one salesman talk about how the petroleum industry is investing heavily in supercomputing. I'm sure viewers at the time assumed that the petroleum industry was forward-thinking and looking to use advances in technology to benefit all of society. Obviously, in hindsight, we know that wasn't the case.
I've worked in the petroleum industry, they still do invest a lot in computers for very number crunching intensive tasks like Seismic Data Processing/Res Sim. The biggest energy companies still own their own data centers and process their own data.
However starting around 1995 onwards, most of these tasks were outsourced to data processing companies specialized in taking the gathered field data and turning them into seismic models.
The energy companies realized that they were not computer specialists and didn't want to spend the money and time to develop in house software solutions or to own IT infrastructure and personnel.
Also, most seismic data sets which required supercomputer-level power to process in 1984 can now be done in a few hours' time on a robust PC on a geophysicist's desk or on a compute server.
You’re a ridiculous koolaid drinker. Whiners about muh climate change have no solutions beyond solar and wind fairy dust which any physicist for decades could’ve told you cannot supply our energy needs. Your ilk are shooting us all in the foot because you have a toe ache.
OK, they're talking about supercomputers, in case somebody missed that.
It's funny he mentioned oil companies not knowing what they're doing with their computers. Where I work we support one of the big oil companies' servers and all I'll say is that I now -completely- understand why there have been so many spills -__-
When I watched this part of the video I just immediately thought, "wow, Hackers was a documentary!?" lol
George was really a smart guy. He had a realistic view of how things were going. John, on the other hand, was just a platitude guy.
That's because John was a sales guy. He knew enough that was required to shift units.
I have this feeling that 20 years from now, my grandkids will be laughing at me when I explain how high-tech our computers were: an Intel i7 processor at 3.7 GHz, 8 to 16 GB of RAM, 1 to 2 TB of storage, a 3440 x 1440 IPS display, etc. The same way I'm laughing at this video now.
I hope so! We're going to need some new tricks to go beyond 100x faster than where we are in 2021....perhaps fully 3-dimensional processors and/or a shift toward optical processing will get us there.
It's funny to see the fastest supercomputer in 1984 being able to do an awesome 100 million operations per second...
And today, a top cell phone is able to compute 16 trillion. So an iPhone 13 is 160 thousand times faster than this supercomputer.
Actually, an iPhone 13 is faster than the #1 supercomputer from 2001, and it's faster than ALL the computation power on Earth, the sum of ALL COMPUTERS ON EARTH (super, mainframe, enterprise, domestic) in 1980. That's amazing.
It's a marvel alright. But that iPhone is probably being used by someone watching YouTube while pooping - yet the archaic supercomputer would've been doing something more useful.
Strange they didn't bring a supercomputer to the studio ;)
Watching this on a Zen 4 CPU is hilarious.