You can set up an automated system to monitor web performance using Perl and gnuplot. The script uses the LWP library to fetch web pages, and along the way it handles proxies, cookies, SSL, and login forms. Here is the basic code for getting the home page, logging in, logging out, and graphing all the times. Try to run monitoring and load testing from a machine that sits on the same LAN as the web server; that way, you know that network latency is not the bottleneck.
#!/usr/local/bin/perl -w

use LWP::UserAgent;
use Crypt::SSLeay;
use HTTP::Cookies;
use HTTP::Headers;
use HTTP::Request;
use HTTP::Response;
use Time::HiRes 'time','sleep';

# constants:
$DEBUG    = 0;
$browser  = 'Mozilla/4.04 [en] (X11; I; Patrix 0.0.0 i586)';
$rooturl  = 'https://patrick.net';
$user     = "pk";
$password = "pw";
$gnuplot  = "/usr/local/bin/gnuplot";

# global objects:
$cookie_jar = HTTP::Cookies->new;
$ua         = LWP::UserAgent->new;

MAIN: {

    $ua->agent($browser);  # This sets browser for all uses of $ua.

    # home page
    $latency = &get("/home.html");
    # verify that we got the page (the get/post helpers leave the body in $_)
    $latency = -1 unless index($_, "<title>login page</title>") > -1;
    &log("home.log", $latency);
    sleep 2;

    $content = "user=$user&passwd=$password";

    # log in
    $latency = &post("/login.cgi", $content);
    $latency = -1 unless m|<title>welcome</title>|;
    &log("login.log", $latency);
    sleep 2;

    # content page
    $latency = &get("/content.html");
    $latency = -1 unless m|<title>the goodies</title>|;
    &log("content.log", $latency);
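The listing is cut off before the helper routines it calls. Below is a minimal sketch of what the get, post, and log helpers and the gnuplot step might look like; the fetch and graph subroutines, the two-column log format, and the gnuplot commands are assumptions for illustration, not the original script's code. The sketch assumes the helpers share the global $ua and $cookie_jar, time each request with Time::HiRes, and leave the response body in $_ so the <title> checks above work.

# Hypothetical helpers -- a sketch, not the original script's subroutines.

sub get {
    my ($path) = @_;
    my $req = HTTP::Request->new(GET => "$rooturl$path");
    return &fetch($req);
}

sub post {
    my ($path, $formdata) = @_;
    my $req = HTTP::Request->new(POST => "$rooturl$path");
    $req->content_type('application/x-www-form-urlencoded');
    $req->content($formdata);
    return &fetch($req);
}

# fetch times a single request, keeps cookies, and leaves the body in $_.
sub fetch {
    my ($req) = @_;
    $cookie_jar->add_cookie_header($req);   # send any cookies we already hold
    my $start   = time;                     # Time::HiRes time, sub-second resolution
    my $res     = $ua->request($req);
    my $latency = time - $start;
    $cookie_jar->extract_cookies($res);     # remember any Set-Cookie headers
    $_ = $res->content;                     # page body for the <title> checks
    print STDERR $res->status_line, "\n" if $DEBUG;
    return $latency;
}

# &log appends one "epoch-seconds latency" pair per request; calling it
# with a leading & keeps it distinct from Perl's built-in log().
sub log {
    my ($file, $latency) = @_;
    open LOG, ">>$file" or die "can't append to $file: $!";
    printf LOG "%d %.3f\n", time, $latency;
    close LOG;
}

# graph pipes commands to gnuplot to turn each log into a PNG.
sub graph {
    foreach my $page ('home', 'login', 'content') {
        open GNUPLOT, "| $gnuplot" or die "can't run $gnuplot: $!";
        print GNUPLOT "set terminal png\n";
        print GNUPLOT "set output \"$page.png\"\n";
        print GNUPLOT "plot \"$page.log\" using 1:2 with lines\n";
        close GNUPLOT;
    }
}

Keeping each log as plain "epoch-seconds latency" pairs means gnuplot can graph a file with a single plot command, and the same files are easy to post-process later.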