I think the case Alexey is hitting is:
1 A few dirtied pages
2 but, because of the hpratio, most of the data is actually zero
- indeed most of the target-page-sized chunks are zero
3 Thus the data compresses very heavily
4 When the bandwidth/delay calculation happens, we've spent a reasonable
amount of time transferring a reasonable number of pages but not
actually many bytes on the wire, so the estimate of the available
bandwidth is lower than reality.
5 The max-downtime calculation is a comparison of pending-dirty uncompressed
bytes with compressed bandwidth
(5) is bound to fail if the compression ratio is particularly high, which,
because of the hpratio, it is whenever we're only dirtying one word in an
entire host page.
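
To make (4) and (5) concrete, here's a small self-contained sketch - made-up
numbers, not QEMU's actual migration code - of how a bandwidth estimate built
from compressed wire bytes, divided into uncompressed pending bytes, inflates
the expected downtime by roughly the compression ratio:

/* Illustrative only: names and numbers are hypothetical. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* Say 1000 host pages of 64KiB got dirtied, but only one word in
     * each, so nearly every target-page-sized chunk is zero. */
    uint64_t pending_uncompressed = 1000ULL * 64 * 1024;  /* ~64MiB pending */
    double   compression_ratio    = 100.0;                /* zeros compress hard */

    /* Over the last cycle we spent 1s sending those pages, but only the
     * compressed bytes actually hit the wire. */
    double seconds_spent = 1.0;
    double bytes_on_wire = pending_uncompressed / compression_ratio;

    /* Bandwidth estimate = wire bytes / time, so it comes out about
     * compression_ratio times lower than the link's real capacity. */
    double est_bandwidth = bytes_on_wire / seconds_spent;

    /* Downtime check: uncompressed pending bytes / compressed bandwidth,
     * so the estimate is inflated by roughly the compression ratio. */
    double est_downtime = pending_uncompressed / est_bandwidth;

    printf("estimated bandwidth: %.0f bytes/s\n", est_bandwidth);
    printf("estimated downtime:  %.1f s (vs ~%.2f s if the final pass "
           "compresses the same way)\n",
           est_downtime, est_downtime / compression_ratio);
    return 0;
}

With those numbers the estimate comes out at 100s of downtime for data that
would actually take about 1s to push, so the max-downtime test keeps failing
even though convergence is easily within reach.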