Distribution of queries per second
We need to measure the number of queries per second (QPS) our site gets, for capacity-planning purposes.
Obviously, we need to provision the site based on the peak QPS, not the average QPS. There will always be some spikes in traffic, though, where a single second sees a huge burst of queries, and it's OK if site performance degrades slightly during that time. So what I'd really like to do is estimate the *near* peak QPS from the average or median QPS. "Near peak" might be defined as the QPS at the 95th percentile of all seconds in the day, i.e. the rate that only the busiest 5% of seconds exceed.
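To be concrete, here's roughly what I mean in code (a minimal sketch; numpy is just what I'd reach for, the exact tool doesn't matter):

```python
import numpy as np

def near_peak_qps(counts, pct=95):
    """counts: one query count per second of the day.
    Returns the count at the given percentile, i.e. the QPS that
    only the busiest 5% of seconds exceed."""
    return float(np.percentile(counts, pct))
```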
My guess is that this is similar to what ISPs do when they sample your bandwidth usage over the billing period and then bill at the 95th-percentile sample, discarding the top 5% as bursts.
What we've done is analyze our logs, count the queries executed during each second of the day, sort the seconds from busiest to least busy, and graph the result. What you get is a curve that declines steeply and then flattens out near zero.
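The counting step is nothing fancy; something along these lines (I'm glossing over the actual log parsing, and seconds with zero queries don't appear at all, which if anything nudges the percentiles upward, i.e. errs on the safe side):

```python
from collections import Counter

def per_second_counts(timestamps):
    """timestamps: an iterable of Unix times, one per query in the log.
    Returns the per-second query counts sorted busiest-second-first
    (seconds that saw no queries simply don't appear)."""
    counts = Counter(int(ts) for ts in timestamps)
    return sorted(counts.values(), reverse=True)
```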
Does anyone know if there is a mathematical formula that describes this distribution?
I'd like to be able to say with some certainty that the second at the 95th percentile will see X times the average (or median) QPS.
(Experimentally, over a six-week period our data shows an average QPS of 7.3, a median of 4, and a 95th percentile of 27. But I want a better theoretical basis for claiming that we need to be able to handle roughly 4x the average amount of traffic, since 27 / 7.3 ≈ 3.7.)
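For example, one baseline I could imagine comparing against is a Poisson model with our measured mean, which is what you'd get if queries arrived independently at a constant rate all day (a rough sketch, using scipy; the numbers are ours from above):

```python
from scipy.stats import poisson

avg_qps = 7.3        # our measured six-week average
observed_p95 = 27    # our measured 95th-percentile second

# Under a constant-rate Poisson model, per-second counts would be
# Poisson(avg_qps); this is the 95th percentile that model predicts.
poisson_p95 = poisson.ppf(0.95, avg_qps)

print(poisson_p95, poisson_p95 / avg_qps, observed_p95 / avg_qps)
```

A Poisson with mean 7.3 puts its 95th percentile at about 12, or roughly 1.6x the mean, well below our observed 27, so I assume the gap comes from the arrival rate itself varying over the day, and that whatever formula describes our distribution has to account for that.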