Running Funkload Performance Tests

Note: This document covers running the Funkload performance tests on the Connexions testing environment. The tests are very specific to this environment; to run them on a different Rhaptos installation, the test config files will need to be modified. This documentation DOES NOT cover that modification.

Test Description

There are 4 performance tests.

  1. 1 user accessing 50 URLs
  2. Multiple users accessing 50 URLs
  3. 1 user accessing 500 URLs
  4. Multiple users accessing 500 URLs

The URLs were taken from the Connexions Access logs.
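
If a list like this ever needs to be regenerated, something along the following lines can pull the most frequently requested paths out of an Apache-style access log (the log path, field layout and output file name here are assumptions, not part of the buildout):

    # the 50 most requested paths from a combined-format access log
    awk '$6 == "\"GET" {print $7}' /var/log/apache2/access.log \
      | sort | uniq -c | sort -rn | head -50 \
      | awk '{print $2}' > urls-50.txt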

Installing The Test System

On each of the test servers

  • Check out the benchmark buildout
    svn co -N https://software.cnx.rice.edu/svn/cnx-qa-benchmark
    
  • Run the buildout this way
    make buildout
    

On the server running the tests

  • Check out the benchmark buildout
    svn co -N https://software.cnx.rice.edu/svn/cnx-qa-benchmark
    
  • Run the buildout this way
    make buildout
    
  • Install Sysstat / Sar
    • Sysstat is a collection of tools for advanced system performance monitoring. It makes it possible to establish a measurable baseline of server performance and to work out what led up to an issue or unexpected behavior.
    • Sar collects, reports and saves system activity information (CPU, memory, disks, interrupts, network interfaces, TTY, kernel tables, NFS, sockets, etc.).
    • Quick installation guide:
  1. Install the sysstat package:
    $ apt-get install sysstat
    
  2. Create an executable bash script named sarl in the /usr/lib/sysstat directory:
    #!/bin/sh
    # Collect system activity data with sadc until the end of the day.

    PATH=/usr/lib/sysstat:/bin:/usr/bin

    LANG=C
    # Sampling interval in seconds; defaults to 60 when no argument is given.
    TEMP=$1
    SEC=${TEMP:-60}

    # Daily data file, e.g. /var/log/sysstat/sa15 on the 15th of the month.
    DATE=`date +%d`
    DDIR="/var/log/sysstat"

    DFILE=$DDIR/sa$DATE
    cd $DDIR

    # Clear any existing data file for today before starting a new collection.
    if [ -f "$DFILE" ] ; then
       find $DFILE -mtime +1 >/dev/null 2>&1 && rm -f $DFILE >/dev/null 2>&1
    fi

    H=`date '+%H'`
    M=`date '+%M'`

    # Number of samples left until midnight at the chosen interval.
    LEFT=`expr \( 3600 \* 24 - 1 - $H \* 3600 - $M \* 60 \) / $SEC`

    # Run the collector in the background, detached from the terminal.
    ( exec sadc $SEC $LEFT $DFILE </dev/null >/dev/null 2>&1 ) &
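
  The cron job and the manual start below assume this script is executable, so mark it as such after creating it:
    $ chmod 755 /usr/lib/sysstat/sarl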
    
  3. Create a file named sysstat in the /etc/cron.d/ directory with the following content:
    00 00 * * * root /usr/lib/sysstat/sarl
    
  4. Start Sar:
    $ /usr/lib/sysstat/sarl
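  The sampling interval defaults to 60 seconds; the script also accepts an interval in seconds as its first argument, for example:
    $ /usr/lib/sysstat/sarl 30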
    
  5. You will then find the Sar logs in /var/log/sysstat/saDD, where DD is the day of the month; an example of reading these files follows the kSar notes below.
  • Install kSar
    • kSar is a graphing tool for Sar output that currently supports Linux, Mac and Solaris data. It displays Sar data as easy-to-read graphs and can export them as PDF reports of system activity.
    • To get started with kSar, go to the SourceForge.net download page and download the latest version. Since kSar is a Java application, you'll need to have Java installed -- preferably Sun Java.
    • To run kSar, all you need to do is run java -jar kSar-4.0.0.jar.
    • See the kSar documentation for more details.
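    • To sanity-check the collected data before graphing it, sar can read a daily file back on the console, and kSar can load a plain-text dump of the same data; the day number 15 and the dump file name below are just examples:
      $ sar -u -f /var/log/sysstat/sa15                       # CPU utilization
      $ LC_ALL=C sar -A -f /var/log/sysstat/sa15 > sar15.txt  # text dump for kSar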

Running the Performance Tests

Start Server Monitoring

Each server in the test environment must have the Funkload monitor running; it must be started separately on each server.

To start the monitoring:

  • SSH into the server
  • cd to the location of the benchmark buildout installation
  • Run this command:
    • make start_monitor_<server name>
    • Example when logged into paring.cnx.rice.edu:
      make start_monitor_paring
      

To stop the monitoring:

  • SSH into the server
  • cd to the location of the benchmark buildout installation
  • Run this command:
    • make stop_monitor_<server name>
    • Example when logged into paring.cnx.rice.edu:
      make stop_monitor_paring
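
Both the start and stop targets presumably wrap Funkload's monitor controller, fl-monitor-ctl. If you ever need to drive it by hand, the rough equivalents are the following (the monitor.conf file name is an assumption about the buildout layout):

    $ fl-monitor-ctl monitor.conf startd    # start the monitor as a daemon
    $ fl-monitor-ctl monitor.conf stop      # stop it again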
      

Run the Tests

The tests are run on a separate server from the test environment.

  • SSH into the server
  • cd to the location of the benchmark buildout installation
  • Running the tests against the QA servers with 50 URLs
    make qa_test
    
  • Running the benchmark against the QA servers with 50 URLs
    make qa_bench
    
  • Running the tests against the QA servers (500 URLs per user per cycle)
    make qa_test_full
    
  • Running the benchmark against the QA servers (500 URLs per user per cycle)
    make qa_bench_full
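
These make targets presumably drive Funkload's command-line runners; the commands below are only a sketch of that flow, with a hypothetical test module name and cycle set rather than the buildout's actual configuration:

    # single pass over the URLs with one user
    $ fl-run-test -v test_pages.py
    # benchmark over increasing numbers of concurrent users
    $ fl-run-bench --cycles=1:5:10 test_pages.py Pages.test_pages
    # render the collected XML results as an HTML report
    $ fl-build-report --html pages-bench.xml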