Build 100 Websites!

Howdy! The name's Andrew Quinn. I'm building 100 websites!

Why?

¯\_(ツ)_/¯ It seemed like fun!

I've been a computer guy for a long time, but even as a little kid I always saw building websites as this immensely daunting task that only a genius could pull off. Then I got a little older and got a computer of my own, and as the years went by, building websites gradually came to seem like a less daunting task.

I pretty much went through every phase of Unix geekdom you can imagine before, at the ripe age of 28, I finally admitted to myself "I just think websites are cool and I want to build some."

And here we are. Currently I work in the ops department of a very cool fintech firm, and I'm actually quite happy with that role -- but here's my LinkedIn and my GitHub profile anyway ;)

I think I might be the world's slowest maturing web developer, since I got my first job in tech in 2009 as a Linux sysadmin intern at Akamai and just kind of never stopped messing around with Linux until it accidentally became my full-time career. But for some reason, I was always terrified of making the leap into real dev work - what if I'm not smart enough? What if I can't build the product fast enough? What if it's just a bubble? (Yes, I was actually asking myself this about software in 2020. Trauma is one motherfucker of a common-sense-scrambler.)

Website #1: https://build-100-websites.fun

That's right, this right here is website #1! Its primary purpose is simply to be a technique catalogue of how I went about doing everything, what big technologies I used, etc. It is unapologetically self-indulgent that way.

Major technologies employed

Website #1's technology stack

Plain ol' HTML

Ha, it's 2023 and I wrote all the HTML for this little page by hand!

But it's good to get back to basics every now and then. I mean really basic - I've had a lot of fun nights in my life reading old websites that looked not too dissimilar from this, straight-up Web 1.0. My favorites are when you find a professor's website that looks like it might as well be the inspiration for danluu.com, but then you realize it's also the page of one of the most cited IEEE Fellows of all time. (Rest in peace, Dr. Taflove. I'll never forget what you did for me, and I wish I wasn't as mentally ill as I was when I was your student. Maybe I could have made something more of myself.)

nginx

I actually have superkuh to thank for this choice. After my first ever Hacker News submission to hit the front page, they left a comment:

You can just write HTML too. It is much simpler, easier to maintain, and more secure (using a real webserver). Make .html files, open them in a text editor, type in the HTML. Here, I'll make an example like the minimum viable hugo.

 $ sudo apt-get install nginx
 <html> <head> <title> Lorem ipsum dolor sit amet </title> </head> <body> <h1> Lorem ipsum dolor sit amet </h1> </body> </html>

And now save it as index.html in the www directory that installing nginx creates. Check it out at http://127.0.0.1/ . Go to your router and forward port 80 to the IP address of the computer running nginx to make it public.

Hey, it looked simple enough. So I tried it out. And it worked! And it was actually really fun and satisfying!

Working in ops, I'd known of nginx for a long time, but it always sounded like the kind of thing that was just a little too low-level for me to bother with. I wasn't giving it enough credit - it's quite an improvement on Apache in terms of usability. Realizing I could literally just install it, stick an index.html into the /var/www/html/ it created, and then see it in Firefox changed my tune.

There was only one small problem, and that's that nginx doesn't use UTF-8 by default. So my ¯\_(ツ)_/¯ looked like a much more menacing Â¯\_(ãƒ„)_/Â¯ until I figured out how to fix that:

  1. First find and open nginx.conf.
    1. Any time I have to edit a random config file on a Linux box these days, I just mindlessly hammer out sudo vi $(fdfind . '/' | fzf) and fuzzy-search until I find the boy; it's a lot easier than trying to use my half-remembered knowledge of the FHS to track them down.
    2. In this case, it looks like it was living at /etc/nginx/nginx.conf.
  2. Then find the http block and add charset utf-8; to it.
  3. Finally, run systemctl reload nginx.service and take a look! No more weird characters.
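In config form, the change is tiny. Here's a sketch of the relevant excerpt, assuming the stock /etc/nginx/nginx.conf layout:

```nginx
# /etc/nginx/nginx.conf (excerpt) - a sketch, assuming the stock layout
http {
    charset utf-8;   # serve text responses as UTF-8 by default

    # ... the rest of the http block stays as-is ...
}
```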

entr

There was of course one annoyance even with this bare-bones setup: I didn't want to have to actually edit this page at /var/www/html/, since that's owned by root. Half the reason I love sites like Netlify is because I can just set them up to rebuild any time I hit git push. And nginx doesn't come with live reload, like Hugo does.

So I installed an auto tab reloader extension for Firefox and set it to every 5 seconds, a poor man's live reload, but easy enough; and then I installed entr, a magical little utility that runs a shell command whenever a given file changes. (Seriously. If you've never used this thing before keep it in your back pocket. It is a game-changer when working on web stuff like this, up there with watch in a tmux window.)

I went to my build-100-websites repo, punched in

 echo index.html | sudo entr cp index.html /var/www/html 

and we were off to the races!

GitHub Pages (hosting)

Now superkuh was kind enough to give me port forwarding instructions, but I didn't want to pay for a whole VM just to put one HTML page online. At the same time, I had used Netlify and Heroku for everything of this sort my whole life, and I decided I wanted to try out GH Pages to see what all the fuss is about. I had the literal HTML sitting there, in GitHub - it was even called index.html, for crying out loud. How hard could it be?

Not hard at all, it turned out! Getting it to deploy on https://hiandrewquinn.github.io/build-100-websites/ was as close to a zero-config hosting experience as I've ever seen: Pick your branch, make sure the HTML page is named "index", and you're good.

Getting it to deploy on https://build-100-websites.fun was a little more involved, by necessity, since you have to set up your CNAME and ALIAS records and everything with a DNS provider. First, GH Pages commits a new CNAME file to the root of your Git repo containing the name of your website, as per the instructions.

I also finally got an opportunity to use dig! I've wanted to use dig ever since I read this Julia Evans article about it.

 ➜  ~ dig www.build-100-websites.fun

 ; <<>> DiG 9.18.1-1ubuntu1.3-Ubuntu <<>> www.build-100-websites.fun
 ;; global options: +cmd
 ;; Got answer:
 ;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 62786
 ;; flags: qr rd ra; QUERY: 1, ANSWER: 5, AUTHORITY: 0, ADDITIONAL: 1

 ;; OPT PSEUDOSECTION:
 ; EDNS: version: 0, flags:; udp: 65494
 ;; QUESTION SECTION:
 ;www.build-100-websites.fun.	IN	A

 ;; ANSWER SECTION:
 www.build-100-websites.fun. 227	IN	CNAME	hiandrewquinn.github.io.
 hiandrewquinn.github.io. 227	IN	A	185.199.110.153
 hiandrewquinn.github.io. 227	IN	A	185.199.109.153
 hiandrewquinn.github.io. 227	IN	A	185.199.108.153
 hiandrewquinn.github.io. 227	IN	A	185.199.111.153

 ;; Query time: 0 msec
 ;; SERVER: 127.0.0.53#53(127.0.0.53) (UDP)
 ;; WHEN: Sat Mar 04 16:15:38 EET 2023
 ;; MSG SIZE  rcvd: 156

Finally, I deleted the default A records my DNS provider slaps in there and added GitHub Pages' own A and AAAA records, letting me go to the apex domain, build-100-websites.fun, sans www.
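In zone-file terms, the end state looks roughly like this. The A records match the dig output above; the AAAA addresses are GitHub Pages' published ones at the time of writing, so double-check GitHub's docs before copying:

```
build-100-websites.fun.      A      185.199.108.153
build-100-websites.fun.      A      185.199.109.153
build-100-websites.fun.      A      185.199.110.153
build-100-websites.fun.      A      185.199.111.153
build-100-websites.fun.      AAAA   2606:50c0:8000::153
build-100-websites.fun.      AAAA   2606:50c0:8001::153
build-100-websites.fun.      AAAA   2606:50c0:8002::153
build-100-websites.fun.      AAAA   2606:50c0:8003::153
www.build-100-websites.fun.  CNAME  hiandrewquinn.github.io.
```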

Website #2: https://azure-functions-datasette.azurewebsites.net/

I know, I know, you were expecting me to go for Hugo for #2. But no! At work, I happened upon some CSVs generated by some legacy code that I figured would be really helpful to be able to access whenever I was on the corporate network, instead of rummaging around in my email to find them ('legacy' here means, in part, that these CSVs get created and mailed out to 5 or 6 people every day).

I've used Simon Willison's Datasette many times in the past and found it a simply remarkable little piece of software. I love SQLite, I love locally-hosted web apps, I love performance - what's not to love, really? But alas, the published guides on deploying Datasette all focus on public platforms: Google Cloud, Vercel, Fly.

I happen to work primarily in enterprise, and with Microsoft Azure in particular. And I think there's a really nice niche for creating tiny baked-data websites for company eyes only - so I decided to get some practice at home deploying Datasette to Azure on my own time. Just to make sure I knew what I was doing for the big time. ;)

Getting Azure set up

I had never used Azure in a personal capacity before, despite being a cloud administrator by trade. So I treated this quite a bit differently from how I usually provision resources on the job; namely, I aggressively cost-optimized.

Following Simon Willison's instructions, I created my resources in the order Resource Group, Storage Account, Function App. I actually really like the simplicity of this stack - it's similar to working solely with Lambdas and S3 in Amazon Web Services. Easy to keep track of.

Then came the bugs! Unsurprisingly, Mr. Willison does not spend as much time in Azure land as I do, and his codebase was slightly out of date. The instructions weren't clear-to-the-point-of-brainless, either, so I added a few bits of my own. Here are the results!

The cost of getting set up

After an evening or two of messing around with it, I finally got my Azure Function to work right! I was serving an honest-to-goodness serverless web app, and it wasn't even hard! For a while I loved living on the shoulders of giants... Until I saw the bill.

One cent. One whole cent a day. That's how much my little experiment was costing me, and it was entirely from storage costs. I couldn't believe it! I had an actual bill that wasn't from a DNS provider, for the first time in my life!

I knew from my work that storage is one of those bugbears you absolutely want to keep an eye on when running any kind of software business, because if you're doing things right (aka collecting a lot of data), it will only grow over time - and you'd better make sure you can afford that. I found I had on the order of 50 megabytes in blob containers for my serverless SQLite database searcher, alongside about 3 MiB of tables and 6 KiB of file shares. At current prices of $0.15 per GB, this would have explained the approximately one-cent-a-day storage costs I saw. In other words, I could reliably bet on an average SQLite database, infrequently accessed and under, say, 10 MB of data, to cost about 50 cents a month.

I figured this was pretty acceptable, actually, because I had another idea brewing...

Website #3: My blog, andrew-quinn.me!

Things started to heat up with this one, so I'm going to start logging metadata.

At a Glance

Project Status: Actively Maintained
Date Range: 2023-02 to (ongoing!)
Tech Stack: Hugo, Pagefind, Cusdis
Primary Language(s): HTML, Markdown, CSS, JavaScript (much later)
Lines of Code: 20,063
Hosting: Azure Static Web Apps (originally Netlify)
Deployment: CI/CD via GitHub Actions
Key Features: Full-text search, dark mode, comments, table of contents
License: Closed-source, all rights reserved
Source Code: Private

Hugo to the rescue

So it won't surprise any of you to learn that I have long been a fan of pandoc, especially how it can turn Markdown files into HTML files. Did you know that you can make entire websites like that!? Whoaaaaa. Software like this - pandoc plus some extra plumbing - is usually called a static site generator, or SSG, and there are even buzzwords around it for people who like making money off of that kind of thing.

SSGs are cool because sometimes a little abstraction is a good thing. Yes, we all love raw HTML around here 😼 but let's be real: if you're going to start maintaining significant amounts of actual content, you might want something that can get out of your way a little more. Ultimately I chose Hugo as my SSG - I figured I only really needed to learn one, and while Hugo had a reputation for a difficult learning curve, I figured I could amortize that cost against all future sites I built with the tool. (Sitting here in August 2025, I'd say I was right!)

SSGs are also cool because, when you take the database out of the mix entirely and only want to serve HTML, CSS, and maybe the lightest buttering of JavaScript (which I didn't even get around to on this site until summer 2025), they can usually be literally free to operate. Like, seriously, you'd have to be some kind of idiot to screw up something as simple as optimizing your GIFs and accidentally have your static site go viral on Hacker News and be buried by server bills...

I go viral on Hacker News and get buried by server bills

Oops. https://andrew-quinn.me/fzf/, published less than two months after I started the site, goes bonkers viral on Hacker News and about a week later I learn just how Netlify, my hosting provider, made its bones. In retrospect, $100 and change is an incredibly cheap amount to pay for that particular learning experience, and I promptly decided to actually use all of that Azure cloud knowledge I had accumulated working at the best cash management company in the Nordics to future-proof myself with an Azure Static Web App.

Success! I have never had another post from AQDM go remotely as viral as that first one, but if it ever happens again I will be ready.

Slouching towards the Long Now

My life gradually got busier after 2023. I switched jobs to a senior position at another company, and this tiny digital cottage of mine languished with barely any content. Virtually nothing on the blog happened in 2024. A second burst of activity in 2025, while I was on paid parental leave with my firstborn child, forced me to face the facts: I really wanted a digital home, but I didn't have much of a clue what I would actually want to do there.

So I decided, ultimately, that this was going to be a slim, slowly-updating website about me, as a personal human being. I have other websites now for any technological work I want to publicize; I want this website to slowly grow into a sprawling morass of essays and personal reflections, that the reader can walk away from thinking about things in a new light. (I am very, very far from that goal, but that is my intention!)

Website #4: Andrew Quinn's TILs!

At a Glance

Project Status: Actively Maintained
Date Range: 2023-11 to (ongoing!)
Tech Stack: Hugo, Fish Shell, fzf, GitHub Actions
Primary Language(s): Markdown, Fish Shell
Lines of Code: ~4,000
Hosting: GitHub Pages
Deployment: CI/CD via GitHub Actions
Key Features: Daily learning snippets, command-line based authoring, fzf tag management
License: Publicly-licensed
Source Code: https://github.com/hiAndrewQuinn/til (content), https://github.com/hiAndrewQuinn/til-site/ (website)

A home for little things

I learn a lot of new things every day! And not all of it is a great fit for Anki, or even for keeping in my brain permanently. Around the tail end of 2023, I began to realize that the only thing better than keeping a notebook in the open is keeping a notebook in the open which LLMs can slurp up and regurgitate back to me conveniently right when I need it.

I wanted to make writing a new TIL as frictionless as possible, so that I would actually do it. I knew it would always be a little heavier weight than a Tweet, but it didn't have to be much! I wrote a simple shell script that lets me open a terminal, type til, and immediately jot down whatever I just came across. This website is the result of that script. It's a collection of short-form articles on a wide range of topics, from binary search algorithms and Go programming to language learning and thoughts on diminishing returns.

The whole workflow is optimized for speed from my end. I wanted as little friction as possible between having a thought and publishing it. One of the really neat things that came out of this was a script to promote tag reuse - before I write anything, a script scrapes all the tags from existing posts and uses fzf to present me with a multi-select menu, making it easy to build up true "breadcrumb trails" for readers to follow over time. No more of this "my blog has 500 tags and all of them are used exactly once" nonsense!
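The tag-reuse step is simple enough to sketch in a few lines of shell. The file layout and front-matter format here are assumptions for illustration, not my actual setup:

```shell
#!/bin/sh
# Sketch of the tag-reuse step. Assumes posts live in ./posts/*.md with
# front-matter lines like:  tags: [shell, fzf, hugo]

# Scrape every tag ever used across all posts, deduplicated and sorted.
collect_tags() {
  grep -h '^tags:' posts/*.md 2>/dev/null \
    | sed 's/^tags: *//; s/[][]//g' \
    | tr ',' '\n' \
    | sed 's/^ *//; s/ *$//' \
    | sort -u
}

# Interactive step: multi-select existing tags with fzf before writing.
# Usage:  chosen=$(collect_tags | fzf --multi)
```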

Speed kills

I would eventually adapt a similar shell-based workflow with AQDM to make popping off posts there easier, but - even though I should have known better - I held off until summer 2025 to do it. That's partly why there are so many more posts over on TIL than on my actual site, beyond just the medium itself being different.

One thing I would really like to try is a script which shows me one random post from my own TIL's backlog, just so that I can remember it and decide on whether or not I want to revisit it. Many of the best ideas are reached by elaborating on previous riffs.

Fun with GitHub Actions

For such a simple blog, it has a ... sophisticated automated deployment pipeline. Look, at heart I'll always be a freaky little sysadmin, and I'm allowed to experiment in my off time.

Anytime I save a new post, a local Git hook triggers a GitHub Action, which in turn updates a second repository that contains the Hugo site, which triggers another GitHub Action to build and deploy the final website.
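The cross-repo trigger can be sketched as a workflow like this. Treat it as a hypothetical sketch: the workflow name, branch, event type, and PAT secret name are all assumptions, not my actual config:

```yaml
# .github/workflows/notify-site.yml - sketch of the content repo's workflow.
name: Notify site repo
on:
  push:
    branches: [main]
jobs:
  dispatch:
    runs-on: ubuntu-latest
    steps:
      # Fire a repository_dispatch event at the Hugo site repo, whose own
      # workflow listens for it and rebuilds/deploys the site.
      - run: |
          curl -X POST \
            -H "Authorization: Bearer ${{ secrets.SITE_REPO_PAT }}" \
            -H "Accept: application/vnd.github+json" \
            https://api.github.com/repos/hiAndrewQuinn/til-site/dispatches \
            -d '{"event_type": "content-updated"}'
```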

There's mostly no good reason to do it this way. That is to say, there is exactly one good reason to do it this way: Anyone can just plug an unordered stack of Markdown files in and get a fully-configured TIL website out. This idea was so exciting to me for about 5 minutes that I even spun up a new repo under my DBA to pursue it before I realized how much of a waste of time that would be. Wasn't I just saying "I bothered to learn Hugo so I could amortize its cost over all future static websites"? What am I doing writing, in effect, my own specialized SSG over Hugo? Nevertheless, if you want to try it out, give it a whirl!

It's a bit Rube Goldberg, but it's been a fun way to experiment with automation. It was originally hosted on a generic GitHub pages domain but was moved to its current, snazzier home at til.andrew-quinn.me in August 2025.

Website #5: Andrew's Selkouutiset Archive!

At a Glance

Project Status: Actively Maintained
Date Range: 2023-10 to (ongoing!)
Tech Stack: Hugo, GitHub Actions, shell scripts
Primary Language(s): HTML, CSS, Markdown, TOML, YAML
Lines of Code: 491
Hosting: GitHub Pages
Deployment: CI/CD via GitHub Actions
Key Features: Chronological news archive, English translations, automated daily updates, minimalist design
License: Publicly-licensed
Source Code: https://github.com/hiAndrewQuinn/selkouutiset-archive

Automatically scraping the news

YLE Selkouutiset (Finnish for "Clear News") is probably the single highest-quality source of selkosuomi on the Internet. Every single day, about 500 words of professionally written and reviewed Finnish about topical events just... appears on the Finnish equivalent of the BBC - and then it's gone! Oh, sure, you can still find all of the pages, but maddeningly, there is no way to guess what the URL for, say, the 2024-08-21 clear news would be. It's just not how they organize things.

I knew I had to intervene. For my own sake, and for the sake of everyone else learning this language. My first big break was realizing that pandoc could pretty effortlessly slice through whatever JavaScript YLE seems to build their actual news sites off of, which meant a simonw-style daily Git scrape could easily give me the raw fodder I needed to compile this codex myself. I set to work.

A sysadmin's lament

In 2023, I still wasn't totally confident that I had what it took to be A Software Developer (TM), working in Real Programming Languages. So instead of using anything approaching a sane deployment method, I used shell scripts. Shell scripts in a lovely but nonstandard shell. Much later on, I finally refactored my previous monstrosity into one single, POSIX-approved monster of a shell script, which did not attempt to overengineer the problem. This has proven much simpler to debug, and much more sparing on my Raspberry Pi's system resources.

Wait, Raspberry Pi?

Oh! Didn't I tell you? This was the first project I ever took on that uses a dedicated bit of hardware I personally own as a tool. As we speak, a tiny Raspberry Pi is running a copy of Tiny Core Linux, entirely out of RAM. Why is this a big deal? Because SD cards wear out, and the 2 years I would get out of one that I write a little bit to every day probably goes up to more like 20 years if I only ever have to read from it once a month, when I power cycle the little box.

This makes the traditional engineer part of my brain, which loves to think in logistics and resource constraints and consumption of scarce goods, very, very happy. The fact that operations on an entire filesystem sitting in RAM are incredibly fast is just part of the fun. But I didn't start out with Tiny Core Linux; I started out on ordinary Debian and just tried to minimize write cycles by ... minimizing the number of times we poll our Git repos to actually build the website. (Yeah, that wasn't the best option.)

TCL doesn't have the Fish shell in any format I care to consume. And RAMdisk-based compute is awesome, but it also comes with the necessary constraint that even the programs you install are eating significant chunks of your 4 GB or what have you. So ultimately, the switch to TCL forced me to rework my brittle little Fish shell scripts into a lean, mean, 0 ms-startup-time machine of a BusyBox shell script. Yeah - not even Bash. We are going ALL the way back. But when the work was done, my data cleaning pipeline broke a whole lot less often, I could leave the Pi going for much longer without any major issues, and virtually any major issue that remained could at least temporarily be patched over by just turning it off and on again.

Actions, Github Actions everywhere

The real magic of this project is its automation. The commit log is a testament to this, with endless "Auto-update content submodule" messages. A GitHub Action runs daily, scraping the latest articles from YLE and dumping them totally unaltered into a scrape repo. I don't think I've had to manually touch this first repo since I started it up; and yet having the canonical data here in a practically-immutable form was pure gold for the other two steps of the pipeline.

An intermediate repo actually runs our Pi-based data cleaning on a regular schedule, and this can take a few minutes (mostly because I intentionally avoided optimizing the heck out of it, in favor of a "stateless"-feeling script).

Finally, this cleaned-up repo is pulled as a submodule into the actual Hugo site, which is rebuilt and deployed automatically. It's a completely hands-off process that ensures the archive is always up-to-date.
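The submodule half of that build step looks roughly like this in a GitHub Actions workflow - a sketch only, with the action versions and job layout being assumptions rather than my exact config:

```yaml
# Sketch of the Hugo site's deploy workflow (build job only).
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      # Pull the cleaned-content repo in as a submodule alongside the site.
      - uses: actions/checkout@v4
        with:
          submodules: recursive
      - uses: peaceiris/actions-hugo@v2
        with:
          hugo-version: 'latest'
      - run: hugo --minify
```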

"Github Actions? Git submodules? Sounds an awful lot like your TIL pipeline." And you're right! It's an elaboration on that first idea. This project was my first deep dive into using submodules with Hugo and setting up a more complex, multi-repository CI/CD pipeline. It taught me a lot about the power and flexibility of GitHub Actions for content-driven websites. The result is a resource that I not only use myself for language practice but that also serves the wider Finnish learning community.

Website #6: finbug.xyz

At a Glance

Project Status: Actively Maintained
Date Range: 2025-05 to (ongoing!)
Tech Stack: HTML, CSS
Primary Language(s): HTML
Lines of Code: ~300
Hosting: GitHub Pages
Deployment: Manual
Key Features: A central hub for all my Finnish language learning software and tools
License: Publicly-licensed
Source Code: https://github.com/hiAndrewQuinn/finbug.xyz

A Home for the Bugs

As my collection of Finnish learning tools grew, I realized I needed a central place to showcase them. That's how finbug.xyz was born. It's a simple, single-page HTML site that acts as a landing page and directory for projects like the Selkouutiset Archive, the Vocabulary Masker, and command-line tools like tsk and finstem. The name is a little joke - a "bug" in the sense of an enthusiast, as well as the six-legged kind.

Building this was a nice return to basics. It's just a clean, hand-written HTML file with some shared CSS I've developed across my other sites to give them a consistent look and feel. It serves its purpose perfectly: to be a simple, fast-loading portal for anyone interested in my language learning projects.

Website #7: Andrew's Finnish Vocabulary Masker

At a Glance

Project Status: Actively Maintained
Date Range: 2025-06 to (ongoing!)
Tech Stack: HTML, CSS, JavaScript
Primary Language(s): HTML, JavaScript
Lines of Code: ~700
Hosting: GitHub Pages
Deployment: Manual
Key Features: Generates "word shape" masks, stores results in local storage, exports to CSV
License: Closed-source, all rights reserved
Source Code: https://github.com/hiAndrewQuinn/finnish-vocabulary-masker

Seeing the Shape of Words

This is a small but mighty tool born from a specific need in my language learning journey. When creating flashcards, especially for a language with rich morphology like Finnish, you sometimes want a hint that's more than nothing but less than the full answer. The Vocabulary Masker solves this by converting Finnish words into a "mask" that shows their structure (vowels, consonants, etc.) without revealing the letters themselves. For example, uudelleen might become UUx~xx~~_, helping you recall the specific synonym you're trying to practice.
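Here's a simplified sketch of the masking idea in shell. Note this uses a plain V/C (vowel/consonant) alphabet rather than the site's actual mask scheme, and the ä/ö handling assumes a UTF-8 locale:

```shell
#!/bin/sh
# Simplified word-shape masker: vowels become 'V', consonants become 'C'.
# (A sketch of the idea only - the real site uses a richer mask alphabet
# and runs in the browser as JavaScript.)
mask_word() {
  printf '%s' "$1" | sed 's/[aeiouyäö]/V/g; s/[b-df-hj-np-tv-xz]/C/g'
}

# mask_word sauna   → CVVCV
```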

The entire application runs in the browser using plain JavaScript. It stores every mask you create in your browser's local storage and even lets you download your session as a CSV file, perfect for importing into Anki or another SRS program. It's a prime example of a simple tool built to solve one problem well.

Website #8: Daily Diet Checklist

At a Glance

Project Status: Actively Maintained
Date Range: 2025-06 to (ongoing!)
Tech Stack: HTML, CSS, JavaScript
Primary Language(s): HTML, JavaScript
Lines of Code: ~170
Hosting: GitHub Pages
Deployment: Manual
Key Features: Tracks daily food items, calculates calories and cost, supports multiple users, uses local storage
License: Closed-source, all rights reserved
Source Code: https://github.com/hiAndrewQuinn/diet-checklist

Gamifying Nutrition

This is a simple web app I built for my wife and me to track our daily food intake against a baseline diet. It's a no-frills checklist of common food items we eat, and as you check them off, it updates your total calorie count and daily cost in real-time. Everything is stored in local storage, so there's no backend—it's fast, private, and works offline.

It includes a user-toggle to switch between different item lists and calorie targets, and a "Copy" button that formats the day's summary into Markdown for easy sharing or logging. It's a personal utility project that turned out to be a great little exercise in building a functional, stateful front-end application with minimal dependencies.

Website #9: taskusanakirja.com

At a Glance

Project Status: Actively Maintained
Date Range: 2025-08 to (ongoing!)
Tech Stack: Hugo, custom CSS
Primary Language(s): Markdown, HTML, CSS
Hosting: Custom (files on files.taskusanakirja.com)
Deployment: Manual
Key Features: Commercial Finnish-English dictionary, terminal interface, platform-specific downloads, offline capability, Pro license activation
License: Closed-source, all rights reserved
Source Code: Private

Pivoting to a Commercial Product

Taskusanakirja represents my first real foray into building and selling a commercial software product. The commit history tells a story of a project pivoting from a potentially open-source tool into a polished, closed-source application. In a series of commits, I systematically removed all references to the GitHub repository, build-from-source instructions, and web server functionality, replacing them with a dedicated downloads page and email-only support channels. This was a deliberate shift to position it as a commercial product from the ground up.

A significant amount of effort went into establishing a proper legal framework. I spent a lot of time drafting, refactoring, and standardizing the EULA, Terms of Service, and Privacy Policy to be consistent and clear. This included defining key user-facing policies, like a 30-day offline grace period for the Pro version, and ensuring compliance with Creative Commons licensing for the underlying dictionary data.

Focusing on Conversion and User Experience

With the business foundation in place, my focus shifted to the user-facing website and improving the download experience. I implemented a prominent, one-click download button that auto-detects the user's operating system to provide the correct binary. For the landing page, I replaced the traditional navigation bar with a hamburger menu to minimize distractions and focus attention on the download call-to-action. The downloads page itself was rewritten to provide comprehensive guidance for both non-technical and advanced users, covering everything from installation to Pro license activation. It's a real product now!

Website #10: siilikuin.com

At a Glance

Project Status: Actively Maintained
Date Range: 2025-07 to (ongoing!)
Tech Stack: Next.js, TypeScript, CSS
Primary Language(s): TypeScript, HTML, CSS
Hosting: GitHub Pages
Deployment: CI/CD via GitHub Actions
Key Features: Showcases commercial products, details a 3-phase client process, responsive design, animated cloud background, accessible color system
License: Closed-source, all rights reserved
Source Code: Private

Building a Professional Home

While my blog is my personal space, siilikuin.com is my professional one. I built it from scratch using Next.js to serve as the home for my software development business. The initial effort focused heavily on branding and aesthetics, establishing a consistent look and feel with an animated cloud background, the "Siilikuin" mascot, and a custom, accessible color system designed for WCAG AA+ contrast ratios. The site was built with reusable components from the start, such as a shared footer and call-to-action section.

Refining the Message

Over time, the site's purpose and messaging evolved significantly. It began with a broad focus on client work, featuring a detailed 3-phase client process and a headline about building software for regulated industries. However, I realized I wanted to shift the focus more toward my own products. I began a process of "toning down active client solicitation": removing the "Process" page from the navigation, changing the messaging from shipping software to delivering "actionable technical roadmaps", and eventually updating the intro to clarify that Siilikuin is "where I build software", applying that same intro to the landing page for consistency.

Pivoting to Products and Performance

The final stage of its evolution (so far) has been to pivot away from active consulting and toward being a showcase for my commercial products, with Taskusanakirja being the first one featured. This also involved a technical pivot: I configured the Next.js app for static export and set up a GitHub Actions workflow to deploy it automatically to GitHub Pages. As part of this refinement, I also invested heavily in mobile performance, ultimately disabling the cloud animations entirely on mobile devices in favor of a static gradient to resolve scrolling issues. The site now serves as a lean, fast, and clear hub for my professional work and products.
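The static-export half of that pivot boils down to a couple of lines of Next.js config. A minimal sketch - only the `output` key is essential, and the `images` setting is an assumption about what a GitHub Pages deployment typically needs:

```javascript
// next.config.js - minimal sketch of a static-export setup.
/** @type {import('next').NextConfig} */
const nextConfig = {
  output: 'export',              // emit plain HTML/CSS/JS into ./out
  images: { unoptimized: true }, // GH Pages has no image optimization server
};

module.exports = nextConfig;
```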


Comments!

Want to add a comment? Go right ahead! Simply go to the source repo for this page and make a PR to add a comment below. If I like it, I'll merge it into the site.

21:29 2024-05-31, Andrew Quinn: