GoAccess is an open source real-time web log analyzer and interactive viewer that runs in a browser or terminal on Linux and Unix systems.
It is well suited as a quickly deployed log file viewer, though it does not replace the recommendation to collect all events centrally and analyze them there automatically, for example for anomalies. GoAccess can be set up quickly because it supports (almost) all log formats of the common web servers, and it can be built without further dependencies if desired.
GoAccess, especially when it only displays access logs (not error logs) and only for manual evaluation, is an isolated solution. To really know what is going on company-wide, there should be centrally collected monitoring with automatically triggered alerts. That way anomaly detection can (ideally) also reveal, for example, whether the company is currently being spied on.
Subdomains and directories with cryptic names are not a strong security feature. But they can complicate an automated reconnaissance phase that might be followed by an attack. Many spy tools simply query name servers and web servers based on a list of words to guess subdomains and subdirectories.
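The wordlist-based guessing described above can be sketched in a few lines of shell. The word list and domain here are illustrative assumptions, not taken from any real tool:

```shell
# Sketch: derive candidate host names from a word list, as simple
# reconnaissance tools do before querying the name server for each one.
# A cryptic name like x7qp3.example.com would simply never be on the list.
for word in www mail admin logs intern backup test; do
    echo "$word.example.com"
done
```

Each candidate would then be resolved (e.g. with dig or host); anything not on the list is never tried.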
Download the current source code from https://goaccess.io/download (to the web server) and build it (replace x.x.x with the current version):
wget https://tar.goaccess.io/goaccess-x.x.x.tar.gz
tar -xzf goaccess-x.x.x.tar.gz
cd goaccess-x.x.x/
./configure --enable-utf8 --enable-geoip=legacy
make
mv goaccess ~/bin/ga-x.x.x
cp config/goaccess.conf ~/etc
cd ~/bin
ln -s ga-x.x.x goaccess
I deliberately refrain from using a make install here.
To keep everything properly separated, it is a good idea to create a separate folder for the logs. A subdomain logs.example.com could then host this folder.
cd ~/domains/logs.example.com
htpasswd -cB .logs.pass logs
<enter secure password>
# chmod, chown or similar if necessary
Paths can be identified with pwd -P.
~/domains/logs.example.com/.htaccess:
AuthType Basic
AuthName "restricted login"
AuthUserFile /path/to/.logs.pass
Require valid-user
~/domains/logs.example.com/logs/.htaccess:
AuthType Basic
AuthName "restricted login"
AuthUserFile /path/to/.logs.pass
Require user logs
# optional:
Options +Indexes
IndexOptions FancyIndexing
IndexOptions FoldersFirst
IndexOptions NameWidth=*
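Since Basic Auth transmits credentials merely base64-encoded, it should only be used over HTTPS. A redirect can be added at the top of the same .htaccess (a sketch, assuming mod_rewrite is available on the server):

```apache
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```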
Depending on the configuration, the logs are rotated and deleted in a certain way. Here I assume a retention of 7 days with one log file per day. If there is one log file covering all 7 days, that single file must be passed to --log-file instead.
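If the rotation compresses older logs (an assumption; the file names below are made up for the demonstration), zcat -f can combine plain and gzipped files into one stream before it is piped into goaccess:

```shell
# zcat -f passes plain files through unchanged and decompresses .gz files,
# so a mix of the current log and compressed rotated logs can be concatenated.
printf 'line1\n' > /tmp/demo_access_log
printf 'line2\n' | gzip -c > /tmp/demo_access_log.1.gz
zcat -f /tmp/demo_access_log /tmp/demo_access_log.1.gz
```

This prints line1 and line2; in the real script the combined stream would go to goaccess via stdin instead of the terminal.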
Create the file ~/bin/generate_statistics.sh:
#!/bin/sh
outdir_base=/output-path/to/the/logs
date_str_dir=$(date "+%Y")
outdir=$outdir_base/$date_str_dir
# year, week number and full date, so the weekly files sort chronologically
date_str_file=$(date "+%Y-%W--%Y-%m-%d")
out_file=$outdir/$date_str_file.html
mailto=mail@example.com
web_base=https://logs.example.com/logs
if [ ! -d "$outdir" ]; then
echo "Creating $outdir..."
mkdir -p "$outdir"
fi
# Concatenate all rotated logs and feed them to goaccess via stdin ("-");
# a process substitution like <(cat ...) would require bash, not plain sh.
cat /path/to/the/webserverlogs/access_log* | $HOME/bin/goaccess --agent-list --config-file "$HOME/etc/goaccess.conf" --output "$out_file" -
echo "New Log: $web_base/$date_str_dir/$date_str_file.html" | mail -s "example.com Log: $date_str_file" "$mailto"
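The %Y-%W--%Y-%m-%d pattern stamps each report with the year, the Monday-based week number (%W) and the full date. With GNU date (the -d option is a GNU extension) the result can be checked for a fixed day:

```shell
# Week 27 of 2024 ended on Sunday 2024-07-07
date -d "2024-07-07" "+%Y-%W--%Y-%m-%d"
# prints 2024-27--2024-07-07
```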
Have the report generated and the link sent every Sunday night:
crontab -e
50 23 * * 0 $HOME/bin/generate_statistics.sh >/dev/null 2>&1