1 BILLION row challenge in Go - 2.5 Seconds!
- Published May 8, 2024
- In this video we look at how to aggregate 1 billion rows of weather station data in as little time as possible. We start with a naive approach and optimize it down from 1 minute 30 seconds to 2.5 seconds using memory-mapped files, a custom hash map implementation, and multiple Goroutines.
Implementation: github.com/duanebester/1brc-go
Thanks to Ben Hoyt
benhoyt.com/writings/go-1brc/
00:00 Intro
00:54 Simple Implementation
09:55 Advanced - Using mmap
15:01 Custom integer parsing
22:12 Parallel processing
34:16 Custom hashmap
42:48 Results
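The custom integer parsing step (15:01) relies on every 1BRC temperature having exactly one decimal digit, so it can be read as an integer number of tenths with plain byte arithmetic instead of `strconv.ParseFloat`. A sketch of the idea (the function name is mine, not necessarily the repo's):

```go
package main

import "fmt"

// parseTemp converts a temperature like "-12.3" or "4.5"
// (always exactly one digit after the dot) into tenths of a degree.
func parseTemp(b []byte) int {
	neg := false
	if b[0] == '-' {
		b = b[1:]
		neg = true
	}
	var n int
	for _, c := range b {
		if c == '.' { // skip the dot; the digits already encode tenths
			continue
		}
		n = n*10 + int(c-'0')
	}
	if neg {
		return -n
	}
	return n
}

func main() {
	fmt.Println(parseTemp([]byte("-12.3"))) // -123 tenths
	fmt.Println(parseTemp([]byte("4.5")))   // 45 tenths
}
```

Working in integer tenths also lets the min/max/sum accumulators be plain `int`s, converting back to a float only once per station when printing.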
Hello Duane! This is some amazing stuff man! Keep making these and enlightening us! Thanks a lot!
Thanks, will do!
Thank you for blessing us🙏
Any time
As always, banger video! 🤤
You already know!
Thanks for sharing.
You bet
Nice video. However, you are calculating the average while you are meant to keep track of the mean of all the values. That means having an array in the struct to keep track of all the values seen. Subscribed!
I think “mean” in this case is the arithmetic mean, which is the same as the average: summing the values and dividing by the count (per station). My output matches the baseline output, so I feel pretty confident in the implementation.
Great content, but the autopilot is taking the fun out of it
Great point. Will disable going forward!
It takes me 1 minute just to cat the file to /dev/null
Yep, the aggregation calculations are what increase the time drastically