Load testing. Everyone talks about it but how many of you actually do it?
Some will complain it is too difficult to set up and get started with (see JMeter). Over the next few posts I'll discuss a few easy options for getting started with load testing.
Today we'll look at ApacheBench.
From the Apache docs:

> ab is a tool for benchmarking your Apache Hypertext Transfer Protocol (HTTP) server. It is designed to give you an impression of how your current Apache installation performs. This especially shows you how many requests per second your Apache installation is capable of serving.
ApacheBench comes as part of the httpd package, so if you already have Apache installed you probably have access to ab (on Windows, try ab.exe).
The command is simple:
```shell
ab -k -n 5000 -c 60 -H "Accept-Encoding: gzip,deflate" http://www.thesiteyouwanttotest.com/
```

* `ab` - main command
* `-k` - sends the KeepAlive header, which more closely mimics a browser request
* `-n` - number of requests to make
* `-c` - concurrency - how many requests to make at a time
* `-H` - custom header - more closely mimics real requests
WARNING: Be careful! You are load testing an environment, and it is easy to overwhelm a server that isn't set up for it. You don't want to run this against your production site!
Run the command and it'll spin up, test, and dump some output in your terminal:
```
Benchmarking www.thesiteyouwanttotest.com (be patient).....done

Server Software:        nginx/1.10.1
Server Hostname:        www.thesiteyouwanttotest.com
Server Port:            80

Document Path:          /
Document Length:        24170 bytes

Concurrency Level:      5
Time taken for tests:   6.247 seconds
Complete requests:      100
Failed requests:        0
Total transferred:      2440600 bytes
HTML transferred:       2417000 bytes
Requests per second:    16.01 [#/sec] (mean)
Time per request:       312.353 [ms] (mean)
Time per request:       62.471 [ms] (mean, across all concurrent requests)
Transfer rate:          381.52 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:       72   82   5.8     81    110
Processing:   163  208 160.0    183   1775
Waiting:       82  103  22.3     95    200
Total:        245  290 159.8    266   1853

Percentage of the requests served within a certain time (ms)
  50%    266
  66%    279
  75%    287
  80%    292
  90%    316
  95%    330
  98%    363
  99%   1853
 100%   1853 (longest request)
```
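The two "Time per request" lines can be confusing. Here's a quick sanity check on the arithmetic, using the numbers from the sample run above (integer shell arithmetic, so fractions are dropped):

```shell
# Numbers taken from the sample run above.
requests=100
concurrency=5
total_ms=6247   # "Time taken for tests: 6.247 seconds"

# Mean time per request, as a single client experiences it:
# concurrency * total time / requests
per_request=$(( concurrency * total_ms / requests ))   # 312 ms

# Mean time per request across all concurrent requests,
# i.e. the server's effective pace: total time / requests
across_all=$(( total_ms / requests ))                  # 62 ms

echo "${per_request} ms per request, ${across_all} ms across all"
```

Those match the 312.353 ms and 62.471 ms reported above, and "Requests per second" is just the inverse of the second figure: 100 / 6.247 ≈ 16.01.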
This tells you all sorts of useful stuff:
* requests per second
* connection times
* how long those connections took
Scribble these numbers down. Adjust the concurrency and number of requests and run it again.
Rinse and repeat until either your server falls over :) or you reach the goals you need.
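One way to script the rinse-and-repeat is a small sweep over concurrency levels. This is a sketch: the levels and URL are placeholders, and `AB` is set to `echo ab` so the script only prints the commands instead of firing real traffic. Change it to `AB=ab` (and redirect each run to a file) once you're pointed at a safe target.

```shell
# Dry-run a sweep of concurrency levels against the same URL.
# AB="echo ab" prints the commands; set AB=ab to actually run them.
AB="echo ab"
for c in 10 30 60 120; do
  $AB -k -n 5000 -c "$c" -H "Accept-Encoding: gzip,deflate" \
    http://www.thesiteyouwanttotest.com/
done
```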
Be aware all you are really testing is the server. You aren't emulating real users hitting the site. But it is easy to run and gives you some useful numbers.
At work we ran this against three environments: stage, CI, and a sandbox, and compared the data to see if any particular environment was slower than the others.
You can do this visually with a little gnuplot magic...
Modify your command and add the `-g` flag:
```shell
ab -k -n 5000 -c 60 -g /home/jim/stage.tsv -H "Accept-Encoding: gzip,deflate" https://www.thesiteyouwanttotest.com/
```
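The `-g` file is tab-separated. Its header row, per ab's gnuplot output format, is `starttime seconds ctime dtime ttime wait`, where `ttime` is the full response time in milliseconds. (Because gnuplot splits the human-readable `starttime` date into five whitespace fields, `ttime` lands in field 9, which is what the `using 9` in the plot script refers to.) You can also summarize the file directly with awk. Here's a sketch against a made-up two-row sample; in practice you'd point the awk line at your real stage.tsv:

```shell
# Build a tiny fake -g file so the awk line has something to chew on.
# The column layout matches ab's gnuplot output; the values are made up.
printf 'starttime\tseconds\tctime\tdtime\tttime\twait\n'           >  sample.tsv
printf 'Tue Jun 06 12:00:00 2017\t1496750400\t80\t120\t200\t100\n' >> sample.tsv
printf 'Tue Jun 06 12:00:01 2017\t1496750401\t85\t135\t220\t110\n' >> sample.tsv

# Average the total response time (ttime, the 5th tab-separated column).
mean=$(awk -F'\t' 'NR > 1 { sum += $5; n++ } END { printf "%d", sum / n }' sample.tsv)
echo "mean ttime: ${mean} ms"
```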
This will output a file Gnuplot can render by using a simple script:
plot.p (for gnuplot)
```gnuplot
# output as png image
set terminal png
# save file to "test-output.png"
set output "test-output.png"
# graph title
set title "- ab -n 3000 -c 30"
# nicer aspect ratio for image size
set size 1,0.7
# y-axis grid
set grid y
# x-axis label
set xlabel "request"
# y-axis label
set ylabel "response time (ms)"
plot "stage.tsv" using 9 smooth sbezier with lines title "Stage", \
     "ci.tsv" using 9 smooth sbezier with lines title "CI", \
     "sandbox.tsv" using 9 smooth sbezier with lines title "Sandbox"
```
Run `gnuplot plot.p` and you should get something like this:
This allows you to easily compare each environment.
Next post we will check out Locust.