Monday, June 8, 2009

Distribution of queries per second

We need to measure the number of queries-per-second our site gets for capacity planning purposes.

Obviously, we need to provision the site based on peak QPS, not average QPS. But there will always be some spikes in traffic where, for one particular second, we get a really huge number of queries, and it's OK if site performance degrades slightly during those spikes. So what I'd really like to do is estimate the *near* peak QPS from the average or median QPS. Near peak might be defined as the QPS at the 95th percentile of the busiest seconds of the day.

My guess is that this is similar to what ISPs do when they measure your bandwidth usage and then charge for usage over the 95th percentile.

What we've done is analyze our logs, count the queries executed during each second of the day, sort the seconds from busiest to least busy, and graph the result. What you get is a curve that declines steeply and flattens out near zero.
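In case it's useful to anyone, here's a minimal sketch of that counting step in Python. It assumes the log has already been reduced to one epoch timestamp per query (the actual log parsing is site-specific and omitted):

```python
from collections import Counter

def qps_histogram(timestamps):
    """Bucket query timestamps into whole seconds and return the
    per-second counts sorted busiest-first."""
    # timestamps: iterable of epoch seconds, one entry per query
    per_second = Counter(int(t) for t in timestamps)
    return sorted(per_second.values(), reverse=True)
```

Graphing the returned list left to right gives the steeply declining curve described above.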

Does anyone know if there is a mathematical formula that describes this distribution?

I'd like to say with some certainty that the second at the 95th percentile will get X times the number of average or median number of QPS.

(Experimentally, over a six-week period, our data shows an average QPS of 7.3, a median of 4, and a 95th percentile of 27. Since 27 is roughly 3.7 times the average of 7.3, I round up to 4x. But I want a better theoretical basis for claiming that we need to be able to handle 4x the average amount of traffic.)
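For reference, the summary statistics above can be computed from the per-second counts like so. This is a sketch: the nearest-rank rule used for the 95th percentile is just one of several common percentile conventions, and the input here is illustrative, not our real data:

```python
import statistics

def summarize(counts):
    """Return (mean, median, 95th-percentile) of a list of
    per-second query counts."""
    counts = sorted(counts)
    n = len(counts)
    # nearest-rank 95th percentile on the ascending-sorted list
    p95 = counts[min(n - 1, int(0.95 * n))]
    return statistics.mean(counts), statistics.median(counts), p95
```

Running this over every second of the six-week window yields the averages and percentiles quoted above.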

Reader Comments (1)

You might try plugging your data into something like this:

http://en.wikipedia.org/wiki/Power_law#Estimating_the_exponent_from_empirical_data

and you could certainly find some numbers that make capacity justifications feel more 'theoretical'. I am curious, however: why do you need something more theoretical than the actual data you already have? The method you described (with the caveat that those 95% peaks don't experience unacceptable degradation) uses real data and will beat any formula you could apply. Depending on how often your site code (dynamic? static?) or audience (API? read-only users? content-generating users?) changes, your load characteristics might change enough to require regular recalculation, just as you've described. At that point, having a formula only satisfies people who like formulas. :)
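For the curious, the estimator described in that article (the continuous-case maximum-likelihood fit, alpha_hat = 1 + n / sum(ln(x_i / x_min))) is short enough to sketch directly; x_min, the cutoff above which the power law is assumed to hold, has to be chosen by the user:

```python
import math

def power_law_alpha(samples, x_min):
    """MLE for the exponent of a continuous power law, fit to the
    samples at or above the chosen cutoff x_min:
        alpha_hat = 1 + n / sum(ln(x_i / x_min))
    """
    tail = [x for x in samples if x >= x_min]
    return 1.0 + len(tail) / sum(math.log(x / x_min) for x in tail)
```

Feeding in the per-second QPS counts (with a sensible x_min) would give a fitted exponent, from which tail percentiles could be derived analytically, if one really wants a formula.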

November 29, 1990 | Unregistered Commenter john allspaw
