Processing 1 Billion Rows Per Second

  • Published 28 Jul 2021
  • Everybody is talking about Big Data and about processing large amounts of data in real time or close to real time. However, processing a lot of data does not require commercial software or NoSQL products: PostgreSQL can do exactly what you need and process A LOT of data in real time. During our tests we saw that crunching 1 billion rows of data in real time is perfectly feasible, practical and definitely useful. This talk shows which settings have to be changed inside PostgreSQL, and what we learned when processing this much data for analytical purposes; a hedged configuration sketch follows below.
  • Science & Technology
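
The description does not reproduce the talk's exact parameter list, so the following is a minimal sketch of the kind of parallel-query tuning usually involved in large analytical scans on stock PostgreSQL; the values are illustrative guesses, and the table t_demo and column grp are hypothetical names, not taken from the talk.

    -- Hedged sketch: plausible session settings for a large parallel
    -- analytical scan (real PostgreSQL GUCs, illustrative values).
    SET max_parallel_workers_per_gather = 16;  -- workers feeding one Gather node
    SET max_parallel_workers = 16;             -- overall parallel worker cap
    SET work_mem = '512MB';                    -- memory per sort/hash operation

    -- Aggregating a hypothetical billion-row table; PostgreSQL can run
    -- this as a parallel sequential scan with per-worker partial counts:
    SELECT grp, count(*)
    FROM   t_demo
    GROUP  BY grp;

With settings like these, EXPLAIN shows a Gather node collecting partial aggregates from the parallel workers, which is the mechanism behind high single-machine scan rates.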

Comments • 4

  • @mrrolandlawrence, 4 days ago

    I do love some optimisations :)

  • @victormadu1635, 13 days ago

    Good job

  • @hkpeaks, 1 year ago

    Do you mean the system can extract a CSV file at 1 billion rows per second?

    • @nixietubes, 10 months ago

      That is not what the talk is about