Better, faster utilizing your GPUs for deep learning workload by Zenodia Charpy

  • Published on Jul 30, 2024
  • The use of GPUs has evolved from gaming to deep learning, geared towards, but not limited to, computer vision workloads: autonomous vehicles, medical imaging in radiomics, and video surveillance, to name a few.
    However, having powerful GPUs at your disposal is one thing; knowing how to use them to their full potential is something we are going to explore together.
    Working at Nvidia, we are dedicated to creating an ecosystem of full-cycle hardware and software solutions that enables optimization on all fronts: from plug-and-play Docker repositories maintained by us, to on-the-fly data augmentation on the GPU, to parallel model training across multiple GPUs with mixed precision for target deployment (a minimal sketch follows below).
    We have open-sourced all of these toolkits so that data scientists can quickly kick-start and get to work instead of spending precious time fixing environment installation errors, bringing supercomputing power to your fingertips.
    Today we are going to take a look at these essential toolkits, which enable deep learning practitioners and data scientists alike to train their models in parallel, iterate faster, and develop better market-ready products.
    Having worked for many years hands-on as an in-house data scientist, an external deep-learning consultant, a cloud solution architect (on Azure), and now a senior deep learning solution architect at Nvidia, Zenodia has paved her journey of seeking optimal pathways for utilizing multiple GPUs for deep learning with years of industrial experience and practical tips and tricks learned from pitfalls in real-world projects. She is on a mission to help data scientists and researchers alike accelerate their deep learning workloads with ease, taking advantage of her learnings and experience.
    Zenodia presented at the 2020 GAIA Conference, held online from Svenska Mässan, on the 27th of November, 2020.
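    As a taste of the mixed-precision, multi-GPU training mentioned above, here is a minimal sketch using PyTorch's torch.cuda.amp API. The model, data, and hyperparameters are illustrative placeholders of our own choosing; the talk does not prescribe a specific framework.

        import torch
        import torch.nn as nn

        # Minimal sketch: mixed-precision training with PyTorch AMP.
        # Model, batch size, and learning rate are illustrative stand-ins.
        device = torch.device("cuda")
        model = nn.Linear(512, 10).to(device)              # placeholder model
        optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
        loss_fn = nn.CrossEntropyLoss()
        scaler = torch.cuda.amp.GradScaler()               # rescales losses so fp16 gradients do not underflow

        for step in range(100):                            # placeholder training loop
            inputs = torch.randn(32, 512, device=device)   # fake batch
            targets = torch.randint(0, 10, (32,), device=device)
            optimizer.zero_grad()
            with torch.cuda.amp.autocast():                # forward pass runs in mixed fp16/fp32
                loss = loss_fn(model(inputs), targets)
            scaler.scale(loss).backward()                  # backward on the scaled loss
            scaler.step(optimizer)                         # unscale gradients, then step
            scaler.update()                                # adapt the scale factor

    To go multi-GPU, the same loop would typically be wrapped with torch.nn.parallel.DistributedDataParallel and launched with one process per GPU.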
  • Science & Technology
