ICLR 2023 Workshop on Sparsity in Neural Networks - Introduction

  • Published May 22, 2024
  • The ICLR 2023 Workshop on Sparsity in Neural Networks was held in Rwanda on May 5th, 2023.
    Subtitled "On practical limitations and tradeoffs between sustainability and efficiency", the workshop brought together researchers from academia and industry with diverse expertise and points of view on network compression, to discuss how to effectively evaluate machine learning pipelines and enforce sustainability and efficiency constraints on them.
    In his introduction to the workshop, Sean Lie, Co-founder and Chief Hardware Architect at Cerebras Systems, explores the potential of sparsity to make training and using large ML models more efficient. He goes on to highlight the great sparsity research being done at Cerebras and elsewhere, and how collaboration across the industry is critical to success.
    Visit the conference website at iclr.cc/virtual/2023/workshop...
    Learn more about Cerebras' sparsity research at cerebras.net/sparsity
  • Science & Technology

Comments • 4

  • @duffy666 • 1 year ago • +3

    This is the right way. Sparsity will take off even more as it will enable new, yet unexplored, neural network architectures.

    • @CerebrasSystems • 1 year ago

      Thanks for the kind words. We couldn't agree more: there's no point in having new sparse networks without HW that can natively harness that sparsity.

    • @frankdelahue9761 • 11 months ago

      @@CerebrasSystems Does CERN use your chip for particle physics?

    • @CerebrasSystems • 10 months ago

      @@frankdelahue9761 Hi Frank, not yet!