Fundamentally, buffers use delay—known in networking as “latency”—in order to maximize throughput. That is, they cause packets (or customers) to wait, to take advantage of later periods when things are slow. But a buffer that’s operating permanently full gives you the worst of both worlds: all the latency and none of the give.
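To make the trade-off concrete, here is a minimal queue simulation, not from the text, with all names and parameters (the `simulate` function, tick counts, a capacity of 20, one packet served per tick) chosen purely for illustration. It assumes a FIFO buffer with fixed capacity and a constant service rate: under bursty but sustainable traffic the buffer briefly fills and drains, so waits stay short; under persistent overload it sits permanently full, throughput stays capped at the service rate, and every delivered packet waits roughly the full length of the queue.

```python
from collections import deque

def simulate(arrivals, capacity, service_per_tick):
    """Toy FIFO buffer: each tick, the given number of packets arrive
    (dropped if the buffer is full), then up to `service_per_tick`
    packets are sent. Returns (packets delivered, average wait in ticks)."""
    buffer = deque()              # stores the tick each queued packet arrived on
    sent, total_wait = 0, 0
    for t, n_arrivals in enumerate(arrivals):
        for _ in range(n_arrivals):
            if len(buffer) < capacity:
                buffer.append(t)  # queue the packet
            # else: buffer full, packet dropped
        for _ in range(service_per_tick):
            if buffer:
                total_wait += t - buffer.popleft()
                sent += 1
    return sent, total_wait / max(sent, 1)

ticks = 3000

# Bursty but sustainable traffic (3 packets every 3rd tick, 1 served per tick):
# the buffer absorbs each burst and drains before the next one, so the
# average wait is only about a tick.
bursty = [3 if t % 3 == 0 else 0 for t in range(ticks)]
print(simulate(bursty, capacity=20, service_per_tick=1))

# Persistent overload (2 packets per tick, 1 served per tick): the buffer sits
# permanently full, throughput is still capped at 1 packet per tick, and every
# delivered packet waits roughly `capacity` ticks: all the latency, none of the give.
overloaded = [2] * ticks
print(simulate(overloaded, capacity=20, service_per_tick=1))
```

In this toy model the overloaded buffer delivers no more packets per tick than the service rate allows; the only thing its fullness adds is a near-constant delay on every packet that does get through.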