I believe the answer you are looking for is that exponential is the worst case and linear the best. In general, when one or more packets are lost or damaged they are resent, causing a drop in throughput roughly equal to the packet payload divided by the resend time. That said, if the resend itself fails and the packet must be resent again, the trend can approach exponential.
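To make that concrete, here is a toy model (my own sketch, not tied to any particular protocol) assuming each transmission is lost independently with probability `p_loss`, and a simple doubling backoff timer for repeated resend failures:

```python
def expected_transmissions(p_loss):
    # With independent loss probability p, the expected number of sends
    # per delivered packet is a geometric series: 1 + p + p^2 + ... = 1/(1-p)
    return 1.0 / (1.0 - p_loss)

def effective_throughput(link_rate, p_loss):
    # Every extra transmission burns link capacity without delivering new
    # data, so goodput scales down by 1 / expected_transmissions.
    return link_rate / expected_transmissions(p_loss)

def backoff_delay(base_delay, attempt):
    # A doubling backoff timer: each failed resend doubles the wait,
    # which is where the near-exponential degradation comes from.
    return base_delay * (2 ** attempt)

# At 0% loss throughput is unaffected; at 50% loss, half the link is
# spent on resends; repeated failures stack delay exponentially.
print(effective_throughput(100.0, 0.0))   # full rate
print(effective_throughput(100.0, 0.5))   # half rate
print(backoff_delay(1.0, 3))              # 8x the base delay
```

The key point the model shows: a single lost packet costs you one resend (the linear case), but each *additional* failure of the same packet compounds, both in wasted transmissions and in backoff delay.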
Keep in mind, there is only a loose correlation between packet count and volume of data, so a relationship between the two only holds for perfectly spherical chickens in a vacuum (i.e. only under artificial circumstances, like a lab).