Million Dollar Lines of Code: an Engineering Perspective on Cloud Cost Optimization
- Published May 7, 2024
- InfoQ Dev Summit Boston, a two-day conference of actionable advice from senior software developers hosted by InfoQ, will take place on June 24-25, 2024, in Boston, Massachusetts.
Deep-dive into 20+ talks from senior software developers over 2 days with parallel breakout sessions. Clarify your immediate dev priorities and get practical advice to make development decisions easier and less risky.
Register now: bit.ly/47tNEWv
---------------------------------------------------------------------------------------------------------------------
Video with transcript included on InfoQ: bit.ly/3QGY6o7
Erik Peterson discusses the right timing and approach for engineering cost optimization and how to use cost efficiency metrics as powerful constraints that drive innovation, and engineer profit.
#CloudComputing #CostOptimization #InfoQ
---------------------------------------------------------------------------------------------------------------------
Follow InfoQ:
- Mastodon: techhub.social/@infoq
- Twitter: @infoq
- LinkedIn: /infoq
- Facebook: /infoqdotcom
- Instagram: @infoqdotcom
Excellent work.
Thanks for this talk!
DHH said they spent tons of time optimizing cloud costs and it was still overpriced - that's why they switched to their own servers.
The fact that it's so easy to make mistakes costing millions is a serious problem with the cloud.
If you look at the basic economics, cloud companies have to make a profit, so they're going to charge more for a server. And the layers of virtualization hurt performance compared to a bare server. So for many companies with steady traffic, on-premise is simpler and cheaper. For services with very variable demand, the cloud can be more cost-effective.
So what's the single character we're talking about at 20:00?
A typo in the company's URL?
It was a typo in the URL for the word "company", as the other person mentioned. That meant that when the function download_update_metadata ran, its response code was never 200, because it was hitting a bad URL. So it fell back to doing a full download and a file-based hash every single time the wake condition fired.