Self-Hosting Fathom analytics behind Apache

I’m hosting Fathom on my domain at /fathom. It runs its own web server, so I’ve done this using a reverse proxy that makes the Fathom server accessible at that virtual directory. Their self-hosting instructions do have an example configuration for using Fathom with a reverse proxy through NGINX, but not Apache.

Fortunately the idea’s the same, so adapting it for Apache doesn’t involve too much. It does mean modifying the httpd.conf (or appropriate site-specific config file), and enabling a few additional Apache modules.

The Apache modules needed are mod_proxy, mod_proxy_http, mod_substitute, and mod_proxy_html (which pulls in mod_xml2enc as a dependency). These can be enabled using a2enmod, followed by restarting Apache:

sudo a2enmod proxy proxy_http substitute proxy_html && sudo service apache2 restart
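On systems without Debian's a2enmod helper, the same modules can be loaded directly in httpd.conf. A sketch — the module file paths vary by distribution, so adjust as needed:

```apache
# Module paths are distribution-dependent; adjust to match your install.
LoadModule proxy_module modules/mod_proxy.so
LoadModule proxy_http_module modules/mod_proxy_http.so
LoadModule substitute_module modules/mod_substitute.so
LoadModule xml2enc_module modules/mod_xml2enc.so
LoadModule proxy_html_module modules/mod_proxy_html.so
```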

Now to enable the reverse proxy, I’ve added a new Location directive to the VirtualHost in my site’s config file:

<Location "/fathom">
        # Host and port of the Fathom instance (adjust for your setup)
        ProxyPass http://localhost:8080/

        # Updates redirect headers in responses from Fathom
        ProxyPassReverse http://localhost:8080/

        # Send original Host header to Fathom
        ProxyPreserveHost On

        # Enables replacing of URLs in HTML elements
        ProxyHTMLEnable On

        # Load default mapping of attribute -> HTML elements that need replacing
        Include /etc/apache2/mods-available/proxy_html.conf

        # Define the regex used to replace URLs in HTML
        ProxyHTMLURLMap ^(.*)$ /fathom/$1 [R]

        # Update the location of the /api/ calls in JavaScript responses from Fathom
        AddOutputFilterByType SUBSTITUTE application/javascript
        Substitute "s|/api/|/fathom/api/|niq"
</Location>

A little more on what each of those directives is doing:

  • ProxyPass
    Creates an entry point at the URL in the Location directive (the first line; in this case, /fathom), which in turn forwards those requests to the Fathom server defined in the ProxyPass directive.
  • ProxyPassReverse
    Replaces redirects sent from Fathom (in the Location header) with a location that the client browser can access. In other words, when Fathom tries to redirect to its internal login page, this directive tells Apache to replace that with an externally-accessible URL (/fathom/login).
  • ProxyPreserveHost On
    Fathom’s example NGINX configuration explicitly passes on the Host header from the proxy to the upstream server (Fathom), so it’s safe to assume we should too.
  • ProxyHTMLEnable On
    Include /etc/apache2/mods-available/proxy_html.conf
    ProxyHTMLURLMap ^(.*)$ /fathom/$1 [R]

    These next few directives replace URLs in various HTML elements before sending them back to the browser. For example, <img> and <script> tags will need to be updated to point to the virtual directory (subfolder). Including the proxy_html.conf file imports a default mapping of attributes to elements so that Apache knows which attributes on which elements to update.
  • AddOutputFilterByType SUBSTITUTE application/javascript
    Substitute "s|/api/|/fathom/api/|niq"

    These final directives tell Apache to update URLs in JavaScript files. This is specifically needed for Fathom’s assets/js/scripts.js file, which makes multiple requests to files in the API folder. Just a note that Substitute uses a sed-like syntax for replacing text; the trailing flags modify the match (for example, n treats the pattern as a fixed string rather than a regex).
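To get a feel for what that Substitute rule does, here’s the equivalent sed transform applied to a made-up line of JavaScript (illustrative only — the xhr call isn’t Fathom’s actual code, and Apache applies the rewrite to responses in-flight rather than via sed):

```shell
# Rewrite an /api/ path to the proxied /fathom/api/ path, sed-style.
echo 'xhr.open("GET", "/api/session", true);' \
  | sed 's|/api/|/fathom/api/|'
# → xhr.open("GET", "/fathom/api/session", true);
```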

Newly-Rebuilt Web Server

Despite the fact that this site seems abandoned, I’ve maintained it through many iterations and many web servers. For the longest time it was hosted on Dreamhost’s shared hosting, but a couple of years ago I decided it was time to move to my own VPS. I chose a Canadian host, LunaNode.

Initially I built the VPS using Webmin/Virtualmin. The idea was to have the flexibility of my own Linux server with the ease-of-administration of cPanel/WHM. I wanted to focus more on using the server than administering it (ha!). Webmin wasn’t the right solution for that, though; I found it complicated and inflexible, and I felt it was constantly working against me rather than helping me. I wanted it to do things The Right Way by default (or at least easily), but it didn’t. For example, each site should have its own Unix and MySQL accounts, should be able to use different PHP versions, and the FPM pools should run under their respective user accounts. I have no doubt Webmin could do these things, but I had decided I’d eventually rebuild the server without a control panel, so I wasn’t too keen on learning how.

Well, I’ve finally taken the time to rebuild the server just as I want it: a panel-less VPS with all the flexibility and none of the overhead. Administering it shouldn’t be much work since I know exactly how everything’s set up. I’ve had no complaints with LunaNode, so I’m happily staying there.

Here’s what I’m using:

  • LunaNode s.1 (2 cores, 1GB)
  • Ubuntu 18.04 LTS
  • Apache + HTTP/2
  • Let’s Encrypt
  • PHP 7.2 FPM (pools running under the user account – finally!)
  • MariaDB 10.3
  • Self-hosted Fathom analytics (Apache acting as a reverse proxy to Fathom’s built-in web server)
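The per-user FPM pools mentioned above look roughly like this — a sketch with an assumed site/user name (example) and Ubuntu 18.04’s default PHP 7.2 paths:

```ini
; /etc/php/7.2/fpm/pool.d/example.conf  (site and user names assumed)
[example]
; Pool runs under the site's own Unix account
user = example
group = example

; Apache talks to the pool over a per-site socket
listen = /run/php/php7.2-fpm-example.sock
listen.owner = www-data
listen.group = www-data

pm = dynamic
pm.max_children = 5
pm.start_servers = 2
pm.min_spare_servers = 1
pm.max_spare_servers = 3
```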

I thought about setting up everything through Docker (I’ve been docker-izing everything on my home server) but decided that should be ‘added’ on to a good base server setup instead. It might simplify things in the end, but I felt it would add unneeded complexity while setting up and I’d be better served to sort out the base server first.

I considered NGINX as well; I use it as a reverse proxy on my home server, but for much the same reason as Docker I decided that’s also for another time. I want to use it as a caching server eventually, so when I’m ready to set that up I’ll look into replacing Apache as well. For now, Apache is familiar and easier.

Microsoft is Shutting Down CodePlex, the Right Way?

Brian Harry:

At that time, will start serving a read-only lightweight archive that will allow you to browse through all published projects – their source code, downloads, documentation, license, and issues – as they looked when CodePlex went read-only. You’ll also be able to download an archive file with your project contents, all in common, transferrable formats like Markdown and JSON. Where possible, we’ll put in place redirects so that existing URLs work, or at least redirect you to the project’s new homepage on the archive. And, the archive will respect your “I’ve moved” setting, if you used it, to direct users to the current home of your project.

There isn’t currently any plan to have an end date for the archive.

Credit where credit is due. They could have, as other tech companies do, shut it down and directed the domain to another of their properties. But instead, by serving static and downloadable archives and allowing for easy redirecting, they’re maintaining one of the most important features of the web: the URL.

A lot of links would have been created in 11 years. Microsoft’s efforts will mean those links will still serve their purpose for the foreseeable future, just in a different way. It’s an effort more companies should make, and Microsoft deserves credit for doing it. It seems like they’re one of the only companies that actually gets the web these days.