Postgres folks:
I have a large (1-2TB) analytics database that receives irregular (but at least daily) major data updates, each generating GBs of new data and GBs of WAL while saturating the main server's CPU.
We'd like to maintain a regular copy of this DB (refreshed at least weekly, if not a full replica) on a test server in another data center. How would you minimize network transport for this? ...
@fuzzychef I would try pgbackrest with compression.
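[A minimal sketch of what that setup could look like, assuming pgBackRest 2.x built with zstd support; the stanza name "analytics", hostnames, and paths are placeholders, not a tested config:]

    # /etc/pgbackrest/pgbackrest.conf on the primary (sketch only)
    [global]
    repo1-host=test-server.example.com   # backup repository in the remote DC
    compress-type=zst                    # zstd: good ratio at modest CPU cost
    compress-level=3
    process-max=4                        # parallel compression/transfer workers

    [analytics]
    pg1-path=/var/lib/postgresql/16/main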
@l_avrot @fuzzychef also, strange question: is putting in a bigger NIC an option? Compression is cool, but it consumes a lot of CPU relative to a bigger pipe.
@fuzzychef @l_avrot ahh, in which case pgBackrest with compression or a dark fibre.
@intrbiz @l_avrot it's not, and regardless, the bandwidth between the two data centers is metered ($$$).
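[Given metered bandwidth, a weekly refresh built on the hypothetical config above could lean on differential backups plus delta restore, so only changed files cross the wire. A sketch, assuming the "analytics" stanza from earlier:]

    # On the primary: a differential backup copies only files changed
    # since the last full backup (file-level, not block-level)
    pgbackrest --stanza=analytics --type=diff backup

    # On the test server: --delta compares checksums of existing files
    # and rewrites only the ones that differ, instead of re-copying
    # the whole 1-2TB cluster
    systemctl stop postgresql
    pgbackrest --stanza=analytics --delta restore
    systemctl start postgresql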