I'm looking for pages that are slow on a large custom PHP CMS website.
I don't want to dig around in the existing, badly documented code, and found that Apache access logs can record the time taken to serve each request, in seconds and microseconds.
Conf:
LogFormat "%h %l %u %t \"%r\" %>s %b - %T/%D" timed
CustomLog /var/log/httpd/timed_access_log timed
Output:
86.132.***.*** - - [22/Jun/2010:13:25:31 +0100] "GET /menu/topscript/ HTTP/1.1" 200 183 - 0/12418
86.132.***.*** - - [22/Jun/2010:13:25:31 +0100] "GET /search/script/ HTTP/1.1" 200 266 - 0/13173
86.132.***.*** - - [22/Jun/2010:13:25:31 +0100] "GET /subscribe/script/ HTTP/1.1" 200 279 - 0/12882
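The last field is the %T/%D pair, so the first line above took 0 seconds and 12418 microseconds (about 12 ms). Listing the slowest individual requests is a one-liner (a sketch, assuming the field positions of the format above):
awk '{split($12, t, "/"); print t[2], $7}' timed_access_log | sort -nr | head -20   # microseconds and URL, slowest first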
Is there an easy way to find the slowest requests by average time?
I'd be interested in an analysis program like awstats, or an awk script (though I've never been able to get my head around awk).
Related: http://serverfault.com/questions/27642/awstats-and-time-taken-to-serve-the-request
My final solution, based on radius's answer:
awk '{cnt[$7]+=substr($12,1+match($12,"/")); i[$7]+=1}END{for (x in cnt){if (i[x] > 5) print x,i[x],cnt[x]/i[x]}}' timed_access_log | sort -k3nr | head -20
This only counts URLs with more than five requests, sorts them by descending average time, and returns the top 20 (averages are in microseconds).
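For anyone who, like me, finds the one-liner hard to read, here is the same logic spelled out as a commented script (field numbers assume the log format above):
awk '
  {
    # $7 is the request path; $12 is the %T/%D pair (seconds/microseconds)
    usec = substr($12, index($12, "/") + 1)   # keep the microseconds after the slash
    total[$7] += usec                         # total time per URL
    hits[$7]++                                # request count per URL
  }
  END {
    for (url in total)
      if (hits[url] > 5)                      # skip URLs with too few samples
        print url, hits[url], total[url] / hits[url]
  }
' timed_access_log | sort -k3nr | head -20    # column 3 is the average in microseconds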
Answer from radius:
You can try this. I hope it works; if you give me a sample of your log file I can check that it does:
awk '{cnt[$7]+=substr($12,1+match($12,"/")); i[$7]+=1}END{for (x in cnt){print x,cnt[x]/i[x]}}' access_log
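This prints each URL followed by its average service time in microseconds; $7 is the request path and $12 the %T/%D pair in the log format from the question.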
Comments:
Annan: I've added a few lines from my logs (removed the last two IP octets). Your code seems to be throwing division-by-zero errors.
radius: I corrected my code; it works for me.
Annan: That works, thanks ^_^