Designing a High Performance Parallel Personal Cluster

  • Published on 18 Dec 2024

Comments • 30

  • @Andreas-gh6is • 8 years ago +4

    The question that interests me most about such projects is whether you can get more computational power from such a cluster than from one or two PC/server-like devices that cost the same amount of money or have the same power consumption.

    • @sateforp • 7 years ago

      PCs/servers will consume much more power. The advantage of these setups shows when you can run things in parallel, which is the case for simulation (see the toy sketch below).
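
      (A toy sketch of that embarrassingly parallel case, using only Python's standard library; the worker count and the Monte Carlo workload are made up for illustration.)

          # Independent simulation runs farmed out across local workers;
          # on a cluster, each board would take a share of the runs instead.
          import random
          from multiprocessing import Pool

          def one_run(seed: int) -> float:
              # One self-contained Monte Carlo run: estimate pi from random points.
              rng = random.Random(seed)
              inside = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
                           for _ in range(100_000))
              return 4.0 * inside / 100_000

          if __name__ == "__main__":
              with Pool() as pool:                         # one worker per core
                  estimates = pool.map(one_run, range(8))  # runs are independent
              print(sum(estimates) / len(estimates))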

    • @EDToasty • 4 years ago +1

      I think the point here is that getting access to a supercomputer for research purposes can be expensive if you aren't affiliated with a university. Having a cluster environment to test things on before running your simulations on a large-scale supercomputer can be very helpful. Of course, there is also the issue of scalability. Vertical scaling gets very expensive (e.g. buying a more powerful computational device), so most people choose to scale horizontally (buying more of the same type of device and coordinating them).

  • @SetMyLife • 8 years ago +10

    The performance/cost ratio is not very clear from the presentation, although it feels like little bang for the buck. It is an interesting project, though.
    Apologies for my pessimism; academic projects simply aren't to my taste.

    • @wxfield • 8 years ago

      +Jaroslav Malec It's really not a bad attempt, though, given her original monetary constraints.

    • @sadiqmohamed681 • 8 years ago +4

      +Jaroslav Malec I haven't been able to work out all the numbers, but it looks like the computational power per unit of power input is at least a factor of 10 better than with a much faster multi-core PC. The real power of clusters comes from the amount of parallelism, and that is down to the number of nodes more than outright speed.

  • @speedsrj • 5 years ago

    Very good! I just couldn't find her on LinkedIn or Facebook!

  • @aa-xe5gq • 4 years ago

    Or should I say it's a low-cost cluster? Just curious: what script did she use to dispatch work to each board? (One common approach is sketched below.)
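
    (The talk doesn't name the dispatch script; a minimal sketch of one common approach, fanning a command out to every board over SSH, assuming key-based logins and purely hypothetical hostnames node01..node08.)

        # Hypothetical SSH fan-out: run the same command on every board.
        import subprocess

        NODES = [f"node{i:02d}" for i in range(1, 9)]  # placeholder hostnames

        def dispatch(command: str) -> None:
            # Launch one ssh process per board so they all work concurrently.
            procs = {node: subprocess.Popen(["ssh", node, command],
                                            stdout=subprocess.PIPE, text=True)
                     for node in NODES}
            for node, proc in procs.items():
                out, _ = proc.communicate()
                print(f"[{node}] {out.strip()}")

        dispatch("hostname && uptime")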

  • @prashanthb6521 • 4 years ago

    How do they communicate with each other, if not over Ethernet? (See the MPI sketch below.)
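
    (Clusters of boards like this typically do talk over Ethernet, usually with MPI layered on top; a minimal sketch, assuming mpi4py and an MPI runtime such as Open MPI are installed on every node.)

        # Minimal MPI roll call. Launch across the boards with, e.g.:
        #   mpirun --hostfile hosts -np 8 python3 hello.py
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()   # this process's id within the job
        size = comm.Get_size()   # total processes across all boards

        ranks = comm.gather(rank, root=0)  # every rank reports to rank 0
        if rank == 0:
            print(f"{size} processes up: {ranks}")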

  • @KRoc • 6 years ago

    I wonder how she interfaced with it. I would imagine you could use PuTTY and a serial connection.

    • @hfjimenezs • 6 years ago +1

      I guess she installed an SSH server on each OS.

  • @desireisfundamental • 5 years ago +1

    Bravo, Kristina. Very interesting. As for me, I'm looking into how to build a cluster with AMD and Nvidia cards and use CUDA and Vulkan together :D

  • @TunjungUtomo • 2 years ago

    So what constitutes a "supercomputer"? I feel that this term is being used inappropriately in this presentation. I know it sounds much catchier, but it distracts from the real point: she built a computing cluster from a series of inexpensive SBCs, which in itself is nothing to be ashamed of; in fact, she should be really proud of it.

  • @HendricP • 6 years ago +3

    “High performance”

  • @roidroid • 8 years ago +4

    Wouldn't a modern GPU have better computational capabilities? I thought that was where it was at.

    • @MaXwellFalstein • 8 years ago +5

      It depends on the tasks you are trying to process. Some tasks can be optimised on a GPU; some are better suited to CPUs. Newer FPGAs are the best you can get, since the hardware itself can be optimised for the software. FPGAs are what I use, and I recommend them for computational tasks. (A toy GPU-vs-CPU illustration follows below.)
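
      (A toy illustration of that split, assuming only NumPy: the first computation is uniform elementwise arithmetic, the shape of work GPUs are built for; the second is data-dependent branching, which favours CPUs.)

          import numpy as np

          x = np.random.rand(1_000_000)

          # GPU-shaped work: identical arithmetic on every element, no branches.
          y = 3.0 * x * x + 2.0 * x + 1.0

          # CPU-shaped work: per-element control flow. On a GPU, threads run in
          # lockstep, so divergent branches serialise and waste execution lanes.
          def branchy(values) -> float:
              total = 0.0
              for v in values:
                  if v < 0.25:
                      total += v
                  elif v < 0.5:
                      total -= v * v
                  else:
                      total += 1.0 / (1.0 + v)
              return total

          print(y.mean(), branchy(x[:10_000]))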

    • @Maisonier • 7 years ago +1

      The problems are the software developers, Nvidia, AMD, and Microsoft with the fucking DirectX.

    • @sateforp • 7 years ago +2

      Yeah, but a cluster of GPUs is too much $$$.

    • @--Valek-- • 7 years ago +2

      Did you not hear her when she said she was on a tight budget?

    • @MN-sc9qs • 7 years ago +1

      GPUs are horrible for algorithms that have many branch statements, which CPUs are better at.

  • @satanlover134 • 5 years ago

    4:40 I'm from Bulgaria. Raspberry Pis have GPUs; if you're going to spend more money than you would on them, you could get ThinkPads instead. I'm just saying!

  • @wilgarcia1 • 8 years ago +2

    awesome =0)

  • @whatthefunction9140 • 8 years ago +4

    yeah but can you make bitcoins on it?

    • @MartinMaisriemler • 8 years ago +4

      +Dylan T You can make bitcoins on almost anything; it's just not a good idea to do so, because ASICs are far more energy-efficient. (Toy sketch of the mining loop below.)
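
      (For the curious, a toy version of the proof-of-work search being discussed, using only Python's hashlib; an ASIC performs this exact search in silicon, orders of magnitude faster per watt than any CPU or SBC cluster.)

          # Toy proof-of-work: find a nonce whose SHA-256 digest starts with
          # `difficulty` zero hex digits. Real mining is this search at scale.
          import hashlib

          def mine(header: str, difficulty: int = 5) -> int:
              target = "0" * difficulty
              nonce = 0
              while True:
                  digest = hashlib.sha256(f"{header}{nonce}".encode()).hexdigest()
                  if digest.startswith(target):
                      return nonce
                  nonce += 1

          print(mine("demo-block-header"))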

  • @genkidama7385 • 4 years ago +1

    heh?