
My very own homepage

Published 4 months ago
Abstract

An introduction to my new website. Quick explanation of the different technologies used, and my goals for the site. How I've used (and more importantly, not used) AI.

Preface

After a lot of procrastination, I finally got around to setting up my own website, and this is it.

It's not my first attempt though. I've tried numerous times to set something up with the latest and greatest tech: Next.js with the coolest Rust-based compilers, SvelteKit, and other JS frameworks. I think that's partly because I've been doing DevOps for a Node.js shop for the last 2.5 years, and partly because of all the flashy landing pages in that ecosystem.

But every single time, I've gotten caught in the web of tooling and configuration, and ended up sick of the project after spending too much time tinkering with the build (and related) processes.

So for this website, I wanted to try a new approach: using a technology I wasn't as comfortable with (Go), and writing it from the "bottom up", by which I mean not relying heavily on frameworks such as Next.js or SvelteKit. I also tried to adopt the mindset described in My Approach to Building Large Technical Projects by Mitchell Hashimoto: start every new part by getting something "visual" as early as possible. For example, while writing a parser for the Payload CMS Rich Text content, I started by supporting just the bare minimum needed to view exactly what I'd written, and then gradually expanded on it. A slightly contrived example, as it only took an hour or two in total, but it's the one that came to mind.

This has been a lot more fun than any of the other projects so far, and is probably a large contributor to the fact that this one I have in fact "finished".

Technology

As mentioned, I wanted to use something I didn't have as much experience with, and I also wanted something "easy" from an infrastructure perspective, by which I mean a simple build and deploy process. Go ticks both these boxes. My build step is just go build, the memory and CPU footprints are low in "production" (with zero real traffic and close-to-zero bot traffic), and it feels nice. I'm sure you can relate if you've ever run a lot of Node.js processes in production: it doesn't feel nice.

For the UI, I also wanted to go back to my roots (not fully, as that would be Ruby on Rails) and do server-rendered pages. So I found and followed this guide by Markus (@maragu on GitHub) on building cloud apps in Go using gomponents, his own Go component library. It's essentially a Go "DSL" that lets you write HTML in plain Go, more or less like this:

import (
	// aliased imports as commonly used in the gomponents docs
	g "maragu.dev/gomponents"
	h "maragu.dev/gomponents/html"
)

func Content() g.Node {
	return h.Div(
		h.Class("flex flex-col bg-red-400"),
		h.P(
			g.Text("Hello, World!"),
		),
	)
}

It's not rocket science, but works surprisingly well. It's also nice that it works without any configuration in all editors as I like to switch between Neovim and Zed.

An added benefit is that it doesn't require any JavaScript to view the website! That is, as long as you don't want to comment, as the Turnstile integration from Cloudflare uses JS. That was sadly a necessary addition, as several SEO bots started posting comment spam. I'm genuinely curious whether it's AI bots that have learned how to use forms, or just human click farms. I assume it's the former, but I still find it fascinating.

CMS

During the early days of the website, I was planning on writing my own basic CMS, but after sketching out what I'd have to implement to get the basic functionality I wanted (text formatting, inline pictures, code blocks, links), I decided to ditch that idea. For instance, as a hobby photographer, I wanted to be able to upload my nice pictures and have them downscaled automatically, so I could use them both in thumbnails and at large scale without requiring a gigabit connection to see them in reasonable time. So I'm now typing this into Payload CMS, which so far has been a really nice experience. It is, sadly, a Next.js project. Sadly because I wanted to use only Go/Rust/C(++) tools where possible, but I felt it was worth it for this one. It supported everything I wanted with very little configuration, is actively developed, is fast, free, and can be (and is) self-hosted! And unlike Ghost, which only supports MySQL and SQLite, it supports Postgres.

Running it myself on a VPS on Hetzner Cloud

As I didn't want to use a JS framework such as Next.js, the benefits of a hosted solution such as Vercel mostly disappear, so I thought I might as well rent a VPS and "do it myself". That has frankly also been really fun. It's been fun setting up my own Kubernetes cluster "from the ground up" (in the sense that running k3s counts as from the ground up), configuring DNS, ingress, and so on. Even though it's, well, my job, it's different when you do it on your own machine with the tools you want.

Almost getting banned from Hetzner Cloud

As I wrote on my rudimentary Twitter clone on the front page, I did get hacked a little while ago. I didn't realize that password auth for SSH was enabled by default, and I'd used a placeholder password for a "service account" that ran the Kubernetes cluster. Not my proudest moment, but a lesson learned. Even though it was a little stressful, I'm glad I had the experience and learned a couple of things about running my own VPS. I also learned why Fail2Ban is a useful service, and started automatically reporting all SSH brute-force attempts to AbuseIPDB. I've only done this for about a month, but have reported almost 3,000 IPs at the time of writing. It's frankly astonishing how much "noise" there is on the internet, and how constantly you get attacked.
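The takeaway, if you rent your own VPS: disable password authentication before doing anything else. A minimal sketch of the relevant sshd_config directives (check the OpenSSH docs for your distro's defaults, and reload sshd after editing):

	# /etc/ssh/sshd_config — allow key-based auth only
	PasswordAuthentication no
	PermitRootLogin prohibit-password

With that in place, a placeholder password is an embarrassment rather than a breach.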

AI

I'm not really a huge fan of AI. I feel like it makes me do more, but understand less (which long term I think also decreases the do part). So because of that, I developed the vast majority of the website without the use of AI. I've disabled it in Neovim and Zed, and it's been quite nice. Personally, I feel like I have to write the code to fully understand it. When I follow an online tutorial, or yoink some code from a guide, I always try to write it character by character. That's something I've been doing since before LLMs became a big thing, and I don't intend to stop.

Zed also has a really cool feature where you can set edit_predictions.mode: "subtle" in your settings.json, and then you have to hold down the option key to see the AI suggestion. I would love to see other editors adopt this idea if they haven't already. Because, while I don't love AI, I can't deny that it's incredibly nice to have in certain situations. Converting JSON objects to Go structs, doing simple refactors across the whole repo, and debugging the use of libraries are some of the cases where I've found it excels. For the last point, though, I think a huge amount of learning comes from searching for solutions to your bugs, and from training yourself to navigate documentation, troubleshoot efficiently, use a debugger, and more. So I use AI as a last resort, and will sometimes ask it to only give me a hint. When looking for a solution online, I usually learn a bunch of semi-related things before I find what I'm looking for, and that helps me understand the bigger picture.

I also refuse to use AI in helping me write. I don't like how it sounds, and I think if you expect other people to read what you write, you should write it yourself. Same goes for art, PRs, PR descriptions, and everything else.

On a tangent: if you open a PR with bad code, or more commonly, a PR description that is wrong because you had AI generate it, then when someone points out that it's wrong, saying "ah sorry, that was AI" isn't a valid excuse. You should be responsible for what you present as yours, not just what you actually write.

What's next

Come the new year, I'll be starting a new position at a company working first-hand with OpenTelemetry. So hopefully I'll find some time and motivation to write about that here as well.

Thank you for reading!