After successfully compiling http-analyze, you can test-run the analyzer before installing it permanently. Just create a subdirectory for the output files and run http-analyze either on one of the sample logfiles included in the distribution (as shown below) or on your web server's logfile. For example, to create full statistics, including a frames-based interface and a 3D VRML model, in the subdirectory testd, use the following commands:
$ cd http-analyze2.4
$ mkdir testd
$ http-analyze -vm3f -o testd files/logfmt.elf
http-analyze 2.4 (IP22; IRIX 6.2; XPG4 MNLS; PNG)
Copyright 2000 by RENT-A-GURU(TM)
Generating full statistics in output directory `testd'
Reading data from `files/logfmt.elf'
Best blocksize for I/O is set to 64 KB
Hmm, looks like Extended Logfile Format (ELF)
Start new period at 01/Jan/2000
Creating VRML model for January 2000
Creating full statistics for January 2000
... processing URLs
... processing hostnames
... processing user agents
... processing referrer URLs
Total entries read: 8, processed: 8
Clear almost all counters at 03/Jan/2000
Start new period at 01/Feb/2000
No more hits since 02/Feb/2000
Creating VRML model for February 2000
Creating full statistics for February 2000
... processing URLs
... processing hostnames
... processing user agents
... processing referrer URLs
... updating `www2000/index.html': last report is for February 2000
Total entries read: 3, processed: 3
Statistics complete until 28/Feb/2000
$
To view the statistics report, start your browser and open the file testd/index.html.
For permanent installation of http-analyze, issue a make install to copy the required files into the appropriate directory. The executable is usually installed in /usr/local/bin, while the required buttons and files are placed under /usr/local/lib/http-analyze unless this has been changed by defining the HA_LIBDIR make macro during installation.
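Assuming the distribution's Makefile accepts macro overrides on the make command line (common for make-based builds, but check the installation notes shipped with the distribution for the authoritative syntax), the library directory could be redefined like this; the path /opt/http-analyze/lib is only an example:

```
$ make install HA_LIBDIR=/opt/http-analyze/lib
```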
Note that you no longer need to copy files into a new statistics output directory by hand if they have been installed in HA_LIBDIR; http-analyze now does this automatically the first time it runs on that output directory.
Following are some more examples, which assume that the analyzer has been installed permanently. The first command processes an archived logfile logYYYY/access.MM from the server's log directory to create a report for January 2000 in the directory /usr/web/htdocs/stats:
$ cd /usr/ns-home/logs
$ http-analyze -vm3f -o /usr/web/htdocs/stats log2000/access.01
The next command uncompresses the logfiles for a whole year and feeds the data via a pipe into the analyzer, which then creates a statistics report for this period. All options are passed to the analyzer through a customized configuration file specified with -c:
$ gzcat log1998/access.?.gz | http-analyze -c /usr/httpd/analyze.conf -
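The gzcat utility is not available under that name on every system; GNU gzip provides the equivalent gzip -dc (often also installed as zcat). The round trip can be verified with a dummy logfile line (the line below is only sample data, not taken from the distribution):

```shell
# gzip -dc writes the decompressed data to stdout, just like gzcat,
# so it can feed compressed logfiles into a pipe the same way
printf 'host - - [01/Jan/2000:00:00:00 +0000] "GET / HTTP/1.0" 200 512\n' > sample.log
gzip -c sample.log > sample.log.gz
gzip -dc sample.log.gz
```

On systems without gzcat, the example above becomes: gzip -dc log1998/access.?.gz | http-analyze -c /usr/httpd/analyze.conf -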
The following command creates a configuration file template with the name sample.conf. Any additional options will be transformed into the appropriate directives in the new configuration file. In this example, the server's name specified with -S is transformed into a ServerName directive and the output directory specified with -o is transformed into an OutputDir directive. All other directives are set to their respective default values. To further customize any settings, use a standard text editor.
$ http-analyze -i sample.conf -S www.myserver.com -o /usr/web/htdocs/stats
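The generated template might then contain directives along these lines (a sketch only; the exact layout and the full set of default directives are written by http-analyze itself):

```
# sample.conf -- generated configuration template (layout is illustrative)
ServerName  www.myserver.com
OutputDir   /usr/web/htdocs/stats
# ... all other directives at their default values ...
```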
To convert an old configuration file to the new format while retaining any old settings, specify its name when creating the new file. Again, command line options may be used to alter certain settings; they take precedence over definitions in the old configuration file. The following command reads the file oldfile.conf and transforms its content into a new file named newfile.conf:
$ http-analyze -c oldfile.conf -i newfile.conf
Although http-analyze can be run manually to process logfiles, it usually is executed automatically on a regular basis. On Unix systems you use the cron(1) utility, while Windows systems provide similar functionality with the AT command. To have your statistics report updated automatically, use the following scheme:
Note that all cron jobs must run with the user ID of the owner of the output directory except for rotate-httpd, which must run with the user ID of the server user. This is a sample crontab(1) for the scheme described above:
# Generate a full statistics report twice per day at 01:17 and 13:17
17 1,13 * * * /usr/local/bin/http-analyze -m3f -c /usr/httpd/analyze.conf
# Generate a short statistics report each hour except at 01:17 or 13:17
17 2-12 * * * /usr/local/bin/http-analyze -d -c /usr/httpd/analyze.conf
17 14-23 * * * /usr/local/bin/http-analyze -d -c /usr/httpd/analyze.conf
# Rotate the logfiles on the first day of a new month at 00:00
0 0 1 * * /usr/local/bin/rotate-httpd