Coming onboard at my library a few months ago, I immediately went to Google Analytics to get the details on exactly what this online beast I had inherited was all about. As I would later learn, the data I was getting from Google was likely not all that reliable.
Google Analytics measures invaluable data points: the number of users coming to the site, where they came from, what browsers and devices they used, and exactly what they were doing and for how long.
To get a better view of these users, I started off right away by creating some funnels to track likely navigation scenarios: for example, tracking how many people who landed on the home page followed the quickest path to the library hours page, and, if they went another route, how they got there.
To some, this may sound boring, but it's crucial knowledge when your aim is to improve usability and functionality on a site.
Not long after setting up my funnel reports, I was at the LITA conference, a forum populated by like-minded web librarians, where Tabatha Farney reviewed (and made the case for) click analytics. This kind of data fills in some areas Google doesn't handle all that well, namely visualizing user clicks on your website through confetti views (right) or heat maps.
Talk about an ah-ha-moment! My first order of business when I got back to the library was to get me some click analytics. I went with Crazy Egg. Then, sitting back, I let the magic happen and was soon able to review Google data and click analytics without holding a single user interview!
All this sounds wonderful, except that I soon learned that our librarian staff computers were using dynamic IPs…egad! This meant that the nice little filters in our Google Analytics reports were likely not filtering out 100% of our librarians. How many librarians were slipping through remains an open question, but I hope to know soon.
I thought long and hard about this muddying of my otherwise pristine view into my users, and then it occurred to me that I might be able to deploy a browser cookie on our staff computers that could be used to filter out the librarians. As it turned out, I could.
Doing a little research, I found that others had had similar problems with dynamic IPs and the solution was already worked out, and to my ego’s satisfaction, the method used was the very cookie-based solution I had come up with (pat on the back).
The recipe, in short:

- set a persistent cookie in the browser on each staff computer
- inside Google Analytics, add a custom filter to disregard any users with the cookie
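For the curious, here's a minimal sketch of how that cookie half could look on a staff machine. The cookie name (`staff_user`), the custom-variable name, and the slot number are my own placeholders, not something prescribed by Google; the snippet assumes the classic ga.js tracker, whose `_setCustomVar` call lets a profile filter exclude visits by custom variable value.

```javascript
// Hypothetical staff-filter sketch; cookie name "staff_user" and the
// custom variable details are illustrative placeholders.

// Pure helper: does the browser's cookie string contain our staff marker?
function hasStaffCookie(cookieString) {
  return cookieString
    .split(';')
    .map(function (c) { return c.trim(); })
    .indexOf('staff_user=1') !== -1;
}

// Browser-only glue, run on every page load.
if (typeof document !== 'undefined') {
  // One-time setup on each staff machine (far-future expiry):
  // document.cookie = 'staff_user=1; expires=Fri, 31 Dec 2038 23:59:59 GMT; path=/';

  if (hasStaffCookie(document.cookie)) {
    window._gaq = window._gaq || [];
    // Slot 1, visitor scope (1): filter on this custom variable in GA admin.
    window._gaq.push(['_setCustomVar', 1, 'visitor_type', 'staff', 1]);
  }
}
```

With the variable reported on every staff pageview, the matching GA custom filter simply excludes any visit whose custom variable value is "staff".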
Of course, all good things go to waste until you get a student worker. And so, this solution sat idle for a couple months while I managed the other gazillion projects that were falling like hail over my desk. All of this changed when I got the green light to hire a computer science student to help out. So the cookie filter project was back on.
To help test the reliability of this method, we created three instances of Google Analytics: one with no filters, one with the old IP filter, and one with the new cookie filter.
On the Crazy Egg side, things looked a little less plug-and-play. But then, as if the Cosmic Cactus wanted to signal that it really does care (if only a little prickly), I got a Crazy Egg email announcing a new custom variable feature. This worked quite nicely, although the variable feature cannot be utilized with some Crazy Egg reports, like my favorite, the Heat Map. Still, with a little tinkering, we were able to deploy another cookie filter to work with Crazy Egg.
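One generic way to make a cookie filter work with any third-party tracker, sketched below under assumptions: the cookie name (`staff_user`) and the script URL are placeholders of mine, not Crazy Egg's actual values. The idea is simply to skip injecting the tracking snippet at all on cookied staff machines, so staff clicks never reach the reports in the first place.

```javascript
// Hypothetical sketch: suppress a click-analytics snippet on staff machines.
// "staff_user" and the script URL below are placeholders, not real values.

// Pure helper: load the tracker only when the staff cookie is absent.
function shouldLoadTracker(cookieString) {
  return cookieString.split(';').every(function (c) {
    return c.trim() !== 'staff_user=1';
  });
}

// Browser-only glue: inject the tracking script for non-staff visitors.
if (typeof document !== 'undefined' && shouldLoadTracker(document.cookie)) {
  var s = document.createElement('script');
  s.async = true;
  s.src = '//example.com/tracker-snippet.js'; // placeholder URL
  document.getElementsByTagName('head')[0].appendChild(s);
}
```

The trade-off with conditional loading is that it filters at the source rather than in reporting, so there's no unfiltered baseline to compare against, which is why a report-side filter (like the custom variable approach) can be preferable when the vendor supports it.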
Proof’s in the pudding, and as pudding requires a little time to set in the cooler, we’re eager to see how our data looks once we get this running. Tomorrow, I’m announcing the cookie solution to the library staff, and I hope to start getting all staff computers cookied so we can let the filters do their work.