Fundamentally, buffers use delay—known in networking as “latency”—in order to maximize throughput. That is, they cause packets (or customers) to wait, to take advantage of later periods when things are slow. But a buffer that’s operating permanently full gives you the worst of both worlds: all the latency and none of the give.
Brian Christian and Tom Griffiths, Algorithms to Live By: The Computer Science of Human Decisions
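
The tradeoff the passage describes can be made concrete with a toy queue simulation. The sketch below is not from the book; the `simulate` function, the one-packet-per-tick service model, and the burst pattern are illustrative assumptions. A buffer that starts empty absorbs each burst and drains during the idle ticks; the same buffer pre-filled with a standing queue delays every packet by roughly its full depth and drops the first burst outright, giving all the latency and none of the give.

```python
from collections import deque

def simulate(arrivals, capacity, prefill=0):
    """Discrete-time queue: one packet is served per tick.

    `prefill` seeds a standing queue, modeling a buffer that is
    'operating permanently full' (bufferbloat).
    """
    buf = deque(0 for _ in range(prefill))   # enqueue times of waiting packets
    waits, drops = [], 0
    for t, n in enumerate(arrivals):
        for _ in range(n):                   # n packets arrive at tick t
            if len(buf) < capacity:
                buf.append(t)
            else:
                drops += 1                   # a full buffer has no give: tail drop
        if buf:
            waits.append(t - buf.popleft())  # serve one packet; record its delay
    return waits, drops

# Bursts of 3 packets every 3 ticks: average load equals the service rate,
# so a standing queue, once present, never drains.
bursty = [3, 0, 0] * 10

w_empty, d_empty = simulate(bursty, capacity=6)             # buffer has slack
w_full,  d_full  = simulate(bursty, capacity=6, prefill=6)  # permanently full

print(f"empty start: max wait {max(w_empty)}, drops {d_empty}")
print(f"always full: max wait {max(w_full)}, drops {d_full}")
```

Under these assumptions the empty buffer reports a maximum wait of 2 ticks and no drops, while the pre-filled one reports a maximum wait of 5 ticks and drops the opening burst: the same hardware, but the standing queue adds its depth to every packet's delay without buying any extra burst absorption.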