Back at university in the early '90s, a professor told me what distinguishes a supercomputer from a workstation:
A simulation would take 4 weeks on the workstation. A computer is a supercomputer when it can do the same sim in 5 minutes...after you waited for 4 weeks to get your time slot. :)
And now, what would take supercomputers 30,000 years would take a quantum computer just shy of 2 days. It's amazing how tech advances.
Does Google Earth lag on this computer as well? :P
Should run Minecraft relatively smoothly.
*****
Can't you see I was being sarcastic? xDDD
Nick xD
Wow! Could you imagine gaming on this behemoth? HA HA, you could play Minecraft, Crysis, every damn video game known to man, all at the same time.
Nothing can run Minecraft perfectly
Maybe even above 60FPS
Norm should interview everyone, he actually listens and asks amazing questions.
12:51 ...all that equipment and still no bezel correction?
Well I'm gonna equip my computer with a bitogip with a 400000 gingolaps per second doodleflip, so it can do over 50 quadrillion zingybobs per millicat, so it's over 17698469 bingyvoxes quicker than a standard chandopoop
Hahahaha, best comment ever
Did you overclock it, or is it like that by default?
Man, you are so full of poop! '50 quadrillion zingybobs per millicat'. I'd like to see you get up to 20 jigglebigs per puttydog! It's OK to overhype things, but don't lie! 50 quadrillion zingybobs, pft! Don't make me laugh!
Farseer Flore Nahh, you'll need a zentix pigoprocessor with 65 lesnofloops on each to get up to 20 jigglebigs per puttydog, and they predict those won't come out till 2016.
Your shit is WAY funny!!! How long did you have to sit around to come up with this kind of bullshit??? Leave technology alone. Bet you could fill quite a few comedy clubs if you have more MATERIAL like this.
VERY cool video. I didn't even realize supercomputing used so much off-the-shelf consumer hardware.
Wait 20-30 years to get this in your house.
Rem You can buy all this stuff now. Workstations are really affordable and you can do all kinds of simulations and modeling or even make a cheap $300 server for your home automation. You don't need gigajiggamegaflops to do that.
***** True, but somehow having a home server doesn't sound quite as impressive as a supercomputer capable of performing billions of calculations per second xP
Oops, my mistake. What Jonny B said :P
This has to be in the top 5 videos made by the Tested team. Excellent.
My hat's off to the interviewer here. He did a GREAT job asking just the right questions to the interviewee who clearly articulated his answers that really defined supercomputers.
But can it run Crysis XP
This was one of the most interesting interviews you have done.
Fantastic interviews! As others have said, Norm is great at asking the right questions and letting the interviewees express some really interesting answers. Thanks for everyone at Tested for this!
And I can still remember finding a 286 daughterboard on sale that, when I installed it in my IBM PC, got my computer all the way up to 1 MIPS. That's one million instructions per second.
This video is very helpful. I am currently a college student taking a computer course, and this video helped me understand in detail what a supercomputer is. I learned that a supercomputer is designed to work more rapidly and efficiently and to hold a larger volume of data, and that its software is more up to date, with applications that can run on a supercomputer but not on a mainframe.
I went there and got to meet Dan in 2016. What a great guy.
Can you make a game for a supercomputer where the graphics are super advanced and it runs on 20 4K screens?
Yes, you could, but there would be no market for it and it would cost too much to make, so they won't do it.
Not to mention that at the distance you'd have to stand to see all the screens, you'd lose the ability to see them in 4K quality.
It's called Crysis 1, 2 and 3.
Maybe a madlad billionaire like Elon Musk could do it. Imagine spending thousands or millions on electricity bills alone, plus building a supercomputer just for gaming, plus developing a whole game for just one person to play.
I love how that lady doesn't just reject all the talk of "So this is just like in that movie!" or "So maybe you wanna, say, plot a zombie outbreak!"
She doesn't turn her nose up at that, like "No, this is real science, that is silly whimsical bullshit. Don't be such a troglodyte." Seems like she knows science can be creative and exciting, not all (as Danny DeVito once put it) pinky-up and sober.
As a scientific educator, she's pretty awesome.
I'm always taken by how good Norm's questions are. He's on top of every interviewee.
Yup he really is
I'm gaming for science!! :) This makes me excited for the future and I can't wait to see what other applications this could be used for. Great interview Tested!
5:28 Would the 14 petabytes of scratch storage be like a huge paging file? Because he said it was needed to keep up with all the processors.
But will it run IE?
NerdyNoob Who cares about IE?
NerdyNoob Not on Linux.
ESISTOBSTIMHAUS Yes, but only via Wine.
Harry Tsang Wine sucks on most applications...
Yes, but it will still crash a million times faster.
Can't wait for the implanted version!
Very cool. And great interviews Norm.
I bet they can do like 50K gaming.
Yes (maybe) like I guess it's playable. :/
+Alfred X It's not a graphics-intensive supercomputer; it doesn't even run normal consumer-level operating systems, so most likely it couldn't. While it does have many GPUs, its main purpose is to calculate and crunch numbers with all the cores it has.
4 Years later, we're still at around only 8K screens.
Sumit Dev lol you tell him, stupid thing to say
I love how it seems that while Norm does all the work, Will just keeps fucking around with different things in the background haha
Norm, you might want to shake the guests' hand at the end and thank them for being on the show. They ALWAYS look so confused; is this over, do I walk away, who is he talking to?
The transition from talking to the guest so intimately to reciting the "closing statement" to the camera is very abrupt. A quick handshake would make that transition much smoother and more graceful.
On the Oak Ridge National Laboratory website you can now read that "The Summit system will deliver 5-10 times the computational performance of Titan when the system is fully available to users in January 2019."
Nice work Grant.
One of your best videos
Can it run Minesweeper at 4K ultra at a decent framerate? I don't think so, mate...
That shit is crazy.
This stuff is insanely cool, huge thanks for showing this :)
It was a really good video, but I must say the music really killed it at the end. It would have been an even better video if the music had stayed in the background through the entire thing.
Absolutely Brilliant!
7:28 those things already passed through my head
Wow, glad we were prepared for this pandemic now in 2020!
"We are ahead of Moore's law"
But you aren't. Moore's law is: the number of transistors on *integrated* circuits doubles approximately every two years. Banding chips together to work on tasks doesn't mean they become integrated; they are still separate.
What he said makes complete sense: cluster computing grows faster than single-chip design.
I never said it didn't. Moore's law is about the growth of transistors within integrated circuits, and a cluster isn't an integrated circuit.
I can see the milestone he reached, but it's a bit silly to invoke a law that applies only to that kind of component. Maybe if he had used Moore's law as a comparison and not a milestone.
Don't misinterpret him by taking that out of context lol.. he was talking about the cluster itself.
Fair enough
Hmm, yeah. I tend to think of it as processing power and forget it's actually based on transistor count. So he didn't use it exactly correctly, but his meaning is there: clusters are growing faster.
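For anyone who wants to see what the actual law predicts, here's a quick Python sketch of the doubling (the starting transistor count is just an illustrative assumption, not a figure from the video):

    # Moore's law: transistor count on an integrated circuit doubles ~every 2 years.
    def moores_law(start_count, years, doubling_period=2.0):
        """Projected transistor count after `years` years."""
        return start_count * 2 ** (years / doubling_period)

    # Illustrative: a ~2.3 billion transistor chip, projected 10 years out.
    print(f"{moores_law(2.3e9, 10):.2e}")  # ~7.36e+10, i.e. a 32x increase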
I can just imagine someone bringing in a bottle of water and spilling it, and damaging billions of dollars' worth of equipment lol.
Damnit Jerry, you did it again
this is why water fountains
OMG, that would be super damage
I'd damage you ;)
*Millions. You can only spill on one rack at a time, not the whole thing, and they usually don't cost more than a billion dollars; most stay around $100 million.
So what happens if they get a blue screen?
TheTwistedbeaver And if it does it again, ring tech support
unplug plug back in! #protips
Unplug, plug back in.
They use Linux (or Unix), which means it will either work fine or just freeze.
***** Windows is a poor choice for a supercomputer... Windows is for idiots, not scientists, unlike Linux.
13:50 I wonder if they ever predicted the Coronavirus using their supercomputer pandemic exercise program
Very, very good interview!
great video guys, a lot of ground covered.
super great interviews. I love the future.
In 23 years or probably less, I'm going to own a desktop computer with the power of the supercomputer shown in this video. That's pretty cool...
perhaps
Nope. Moore's law is hitting its limit with silicon. Even if companies switch to carbon nanotubes, quantum tunneling will hit quickly, because everything is already too small. CPUs and GPUs will start to get bigger again, but then latency will take its toll. Only quantum computing can help us now, and it has not made any real steps to take us there.
This is amazing. Science and computers make the world and human knowledge better every year and month; it amazes me :)
How much heat does that produce?
It uses an average of 3 MW of power... so pretty darn hot, lol
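A quick back-of-envelope on that, since essentially all of that power ends up as heat (the electricity price below is my assumption, not a figure from the video):

    # 3 MW sustained draw; nearly all of it is dissipated as heat.
    power_mw = 3
    mwh_per_day = power_mw * 24                         # 72 MWh every day
    assumed_price_per_kwh = 0.10                        # assumed utility rate, USD
    print(mwh_per_day * 1000 * assumed_price_per_kwh)   # ~$7,200/day in electricity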
This was pretty awesome.
The Pickle Research Center is 20 minutes from my place. I should go and take a look at this
12:59 I came here for the future of computing; I didn't know she'd already seen 2020. I wonder how the giant screen was used to fight the coronavirus.
Great video!
How many billion frames per second does RuneScape run at?
Run a 1-million-player Minecraft server on that supercomputer and have a movie theater as your display setup. It would be unreal.
Thinking about how unrestricted a game development company could be with a computer like that to run their games on. It would be glorious.
It would be great if the "computer" weren't warehouse-sized. It'd be great if they had a system like the Xbox One's cloud computing, powered by one or a few of these.
"there are a multitude of ways to use that large display". i certainly know one
I for one can't wait until a cheap wallpaper screen can be applied to any wall and then have different environments in motion presented on screen so even if you have to live in a city you can virtually get away.
Can it play Mine Sweeper?
What happens to the hardware they've used after an upgrade? Do they give it back to Intel or the manufacturer?
Hey guys I need more used hardware. Gosh, give me more every year
What A *Flawless Computer*
Deep Thought came up with, "42."
I WANT THE DAMN QUESTION!!!
Have a word with Slartibartfast!
What do you get when you multiply six by nine?
If you want to understand the answer, you really need a good grasp on the question.
This visualization computer is pretty impressive. I wonder how quickly you can run RealFlow sims on it?! ;)
Wow no bezel compensation. Talk about missing data...
So, Watch Dogs system requirements? Because Ubisoft doesn't optimize their games well.
Nah, even this can't run it on ultra low at anything over 1 fps.
But can it crack WPA2?
Wow, those monitor bezels really ruin it, no? You would have thought they would have used better individual displays or projectors or something...
Gotta keep that Dell sponsorship.
So how long does it take to boot up Windows 98?
I've ordered 2 GTX Titan Blacks for my next computer build. It'll be a lot faster and better than the old computers you guys are using now. I'm cool that way. Kthxbye.
Yes, but can it read?
I can't tell if this is a joke or you are actually stupid.
He's not stupid, it's gonna be able to run Chrome fairly well, but idk about gaming with it
The machine wasn't exactly made for gaming. The super computer is still more powerful.
2 GTX Titan Blacks? It's defs built for gaming...
great clip, thx for sharing :)
Finally, 4K 60fps gaming... without a hitch, on a screen as large as a movie theater! lol
Never thought I'd see the day.
I imagine a projector would be a lot more power-efficient and just as practical to use as these 70+ 30-inch monitors. Not to mention a lot better looking.
The picture quality would not be good with a projector. I'm sure with all their money they know what they're doing.
There is no projector even close to this resolution. It doesn't matter if you have a big screen if your pixels are the size of your head, so you'd need just as many projectors as you have monitors now. On top of that, you need to make sure you don't block the light when standing in front of it, so they'd need a whole room behind it just so the projectors can project from behind. And then there's touch: I'm not sure how you would do touch on a projected screen like this.
The bezels would be gone but there definitely would be other trade-offs.
My estimate, based on 75x 1440p monitors vs. 3x 4K projectors:
2160 x 4096 x 3 = 26,542,080
1440 x 2560 x 75 = 276,480,000
276,480,000 / 26,542,080 = 10.41666667
The monitor array has roughly 10.4x (1,042%) the pixel count of three high-end 4K projectors covering the same area.
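Same math in a few lines of Python, if anyone wants to plug in their own hardware (resolutions and counts as above):

    # Pixel-count comparison: 75x 1440p monitors vs. 3x 4K projectors.
    monitor_pixels = 2560 * 1440 * 75     # 276,480,000
    projector_pixels = 4096 * 2160 * 3    # 26,542,080
    print(monitor_pixels / projector_pixels)  # ~10.42, i.e. ~1042%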
No, it comes down to resolution, not the size of the display array. They need real estate, AKA pixels, to display this much information.
While it's hard to argue with the pixel density of these screens versus a 4K projector, those pixels become useless at the distance/scale we're talking about. Even if you needed to fit more things on the "screen", you could play with the DPI to scale things down to simulate a higher resolution.
Touch screens have been done with projectors for a while, but let's be honest, a touch screen becomes more of a gimmick on a display that size.
You wouldn't be blocking the light, since the projector would of course be hanging from the ceiling, and even if it weren't, it wouldn't be an issue unless you insisted on the gimmicky touch screen.
From the arguments I've read, I still see this as a huge waste of money compared to a projector. Though my guess is that it's not so much the practical value as the proof of concept that matters here, like a demonstration of an insane resolution working in real time with the supercomputer.
Please show a video of people playing games on the 75+ monitor setup!
Just know that in 20, maybe 10 years, a system that powerful will be compact and in everyone's homes. Just look at the first computers, which had a capacity of 10 MB and took up an entire room.
How long would it take to mine 1 bitcoin?
+Brandon Connell (ZombieToad) I don't know exactly, but probably whole bitcoins in a mere day.
+Brandon Connell (ZombieToad) About 1 a day. GPUs aren't effective at all for mining BTC, but the mining farms earn $30 million a month or so with ASICs :)
+Brandon Connell (ZombieToad) You get 25 bitcoin per mining reward for the correct calculation, so that question doesn't really make a lot of sense. But they should be able to get a decent number of rewards a day; just not enough for a return on investment.
exactly
+Stef Pletinck spot on, better off looking for a free power source on a smaller scale.
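If anyone wants the rough formula behind those guesses: your expected daily haul is just your share of the network hashrate times the coins minted per day. A minimal sketch in Python (the hashrates are made-up placeholders, not this machine's real numbers; 25 BTC/block and ~144 blocks/day were the figures at the time):

    # Back-of-envelope expected BTC per day for a miner.
    def expected_btc_per_day(your_hashrate, network_hashrate,
                             block_reward=25.0, blocks_per_day=144):
        """Expected coins/day = your share of network hashpower * coins minted/day."""
        return (your_hashrate / network_hashrate) * block_reward * blocks_per_day

    # Hypothetical: 10 TH/s against a 300 PH/s network.
    print(expected_btc_per_day(10e12, 300e15))  # ~0.12 BTC/day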
It's really funny how the classic view of sci-fi super computers was a wall of flashing lights and now the real life super computers do look like wall of flashing lights. =D
Anyone else watching this now, wondering how this helped with Covid?
gaming night at a supercomputer lab... omg.
The bitcoin I could mine with this, oh my gosh.
mikegunz An unrealizable dream, just forget it...
for weak computers that is.
mikegunz Those are CPUs; you would need ASICs or GPUs to mine. CPUs now would just overheat and destroy the computer without making a dollar.
Chris Kim Sure, they built clusters of thousands of computers that can't run at 100% CPU utilization without overheating... Whether it's mining bitcoin, stock market data, or weather data, these systems could run capped out for days on end without any issues at all.
Callum Errington Maybe, but first, the power consumption would make your debt higher than your profit. Also, you don't need 100% utilization to overheat. And bitcoin has been going down the drain, spiraling downward ever since it reached $1,000 per coin.
Can it run Windows Vista?
Haha... I still have an eMachines with Windows Vista... Smh
A cluster doesn't really run one operating system, I think, but the individual servers can run Vista; it just won't allocate all the RAM it's given, so it's not making use of the powahhh
It runs Linux
the individual servers do :)
13:40 2020 Covid 19 has entered the chat...
Can I have Stampede when you guys no longer use it? I would even take Ranger if you've still got it lying around somewhere... :P
great video, i really enjoyed this one :)
Can it run Minesweeper?
No, but at least it has an originality subroutine...
Mudux Minesweeper is NEVER unoriginal
this thing could simulate real minefields
To make them more efficient, you need to stack chips in 3D rather than on typical 2D silicon. It reduces power use and is much more efficient and faster than typical chips, but it produces more heat, which can easily be solved.
... It's not easy to solve lol.
So how much RAM does it have?
Great video guys, enjoy the nerd factor :)
They should consider working with Linden Labs when it comes to virtual reality environments.
When did people start answering questions with "So"? I noticed that just two years ago, and now most people seem to do it.
Damn, now I wanna see a video of game night. I'm curious as to how it would look.
This reminds me of when I went to UCSD; they had a room like that with a lot of screens lol.
11:10 *"What is the benefit of having a screen that big."*
Marketing? Otherwise there would be no views on YouTube??
Must be a heavily modified version of Linux to make this run so smoothly. I'd love to delve into the software for this behemoth.
If you like the idea of creating a cluster, there's a video about how to do it using Raspberry Pi computers (a very cheap way of doing it). Nowhere near as powerful as the supercomputer shown in this video, mind you, but it's a good example of how clustering really works.
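To give a flavor of what those Pi-cluster videos set up, here's a minimal sketch of cluster-style work-splitting using mpi4py (assuming MPI and mpi4py are installed on every node; the summing job is just a placeholder workload):

    # Each node sums its own slice of the work; rank 0 combines the results.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()   # this node's ID (0..size-1)
    size = comm.Get_size()   # total number of nodes

    # Placeholder workload: sum 0..999999, striped across the nodes.
    n = 1_000_000
    local_sum = sum(range(rank, n, size))

    total = comm.reduce(local_sum, op=MPI.SUM, root=0)
    if rank == 0:
        print("total:", total)  # 499999500000

    # Launch across 4 processes/nodes with: mpiexec -n 4 python cluster_sum.py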
Using megawatts of power? Why not use POWER or ARM RISC-based CPUs?
I love how she just says "we do have gaming night in the lab", like there's no point pretending they don't use it for fun too... hehe
How many fps will I get in BF4 at 1080p with this one?
Zero, Battlefield can't be run on Linux.
If you could get it to run, I'd say somewhere around 999999999999999999999999999999999999 and a few more 9's fps.
How would such an amount of monitors be useful? I'm not saying it isn't, I'm just wondering what it would be used for
I can't wait for 25 years to go by....
I'm going to shit my pants if these guys allow VR demos.
I've been watching Star Trek since October of 1987.
Game night! I wish I could play Crysis 3 or Watch_Dogs on that computer!
It might actually get Ultra settings @ 60fps
Or play both at the same time
Or Ultra settings at 120fps
Can I have a monitor?
Edit: That was a really cool tour also!
I was waiting to hear the question you never asked: "What OS does it run?"
All supercomputing clusters run on Linux; there are a few distributions that specialize in managing those systems.