
Log Analysis with netgo

2022 March 13 16:23 stuartscott

So you wrote a web server in Go and, like dishes at a fancy restaurant, pages are getting served.

package main

import (
	"log"
	"net/http"
)

func main() {
	// Create Multiplexer
	mux := http.NewServeMux()

	// Handle Index
	mux.Handle("/", http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "text/plain")
		w.Write([]byte("Hello World!"))
	}))

	// Serve HTTP Requests
	log.Println("HTTP Server Listening on :80")
	if err := http.ListenAndServe(":80", mux); err != nil {
		log.Fatal(err)
	}
}
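
With the server running, you can check that pages really are being served with a quick request from the command line (this assumes the server is running locally on port 80, which may require elevated privileges):

$ curl http://localhost/
Hello World!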

By now you probably have some questions: who is visiting your site, which pages are the most popular, and how is your server being used?

You could use one of the many website analytics services, but then the identity and behavior of your visitors are given away to a third party.

This article will show you how to use netgo to gain insights into your traffic while respecting your users and keeping your log data confidential.

As always, the code shown is open source and hosted on GitHub.

Step 1: Collect

The most important piece of any data analysis project is of course the data itself, and so your first step is to collect it.

netgo provides two utilities to help with this endeavor: the first configures the log package in the standard library to write to both standard output and a file so you have a permanent record; the second wraps your handlers so the request data is written to the log:

import (
	"aletheiaware.com/netgo"
	"aletheiaware.com/netgo/handler"

	// ...
)

func main() {
	// Configure Logging
	logFile, err := netgo.SetupLogging()
	if err != nil {
		log.Fatal(err)
	}
	defer logFile.Close()
	log.Println("Log File:", logFile.Name())

	// ...

	mux.Handle("/", handler.Log(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		// ...
	})))

	// ...
}

Step 2: Extract

After running your web server for a while you will have amassed a treasure trove of information in the form of a directory filled with log files. The next step is to parse these files and extract the data.

netgo includes logparser, which scans through all the log files in the given directory, extracts the request information while ignoring the rest, and populates an SQLite database for easy querying.

$ go install aletheiaware.com/netgo/cmd/logparser
$ logparser logs/

Note: parsing can take quite a few minutes - better go make a nice cup of tea ☕️
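
The dashboard described in the next step is the easiest way to explore the data, but you can also query the database directly from Go with the standard library's database/sql package. The sketch below is a minimal example that counts requests per URL; the database file name (logs.db), table name (requests), and column names are assumptions for illustration, so check logparser's output for the actual file name and schema:

package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/mattn/go-sqlite3" // SQLite driver
)

func main() {
	// NOTE: "logs.db" and the "requests" table are placeholders - check
	// logparser's output for the actual database file name and schema.
	db, err := sql.Open("sqlite3", "logs.db")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Count requests per URL, most popular first.
	rows, err := db.Query(`SELECT url, COUNT(*) AS hits FROM requests GROUP BY url ORDER BY hits DESC LIMIT 10`)
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	for rows.Next() {
		var url string
		var hits int
		if err := rows.Scan(&url, &hits); err != nil {
			log.Fatal(err)
		}
		fmt.Println(hits, url)
	}
	if err := rows.Err(); err != nil {
		log.Fatal(err)
	}
}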

Step 3: Analyze

Once all the log data is in a database you'll want to slice, dice, and visualize it so you can recognize trends and identify opportunities.

netgo includes logserver, which provides a dashboard to examine and understand your server's traffic.

$ go install aletheiaware.com/netgo/cmd/logserver
$ logserver

When you open your browser and navigate to localhost you'll see a histogram of requests over time, along with several bar charts showing which addresses have made the most requests, which URLs are most popular, and which HTTP Protocols, Methods, and Headers are most common.

The dashboard in the screenshot below shows the traffic from the first week of the new Perspective website that was announced in a previous article.

