Projects March 20, 2026

Building the Thing You're Reading

A full rebuild, not a refresh. The old portfolio needed to go — accessibility issues, inconsistent styles, and a half-finished Next.js version that became a visual toy instead of a finished product. This is a walkthrough of how the current site came together: the decision to build in Svelte and keep it simple, how AI changed the auditing workflow, getting performance down from an 8MB SVG to a responsive delivery system, and wiring up Sanity as a backend to support a blog that was always part of the plan.

  • Svelte
  • SCSS
  • JSON
  • Sanity

At some point I looked at my old portfolio and had to be honest with myself.

The accessibility was awful. The styles looked fine at a glance but fell apart under any real scrutiny. The overall design wasn't doing what I needed it to do. It had been built at a different point in my career and it showed.

I needed to rebuild it. Not refresh — rebuild.

I started a version in Next.js a while back.

The concept was a full SPA with an animated backdrop, layered SVGs, a card system for content display. I was playing with colors, stacking effects, chasing something that felt impressive.

It never shipped.

Every time I sat down to work on it, something pulled me sideways — another animation to try, another visual idea, another layer of complexity that had nothing to do with actually showing what I can do. I kept building a toy instead of a portfolio. And somewhere in the process the accessibility — one of the things I'd specifically wanted to fix — got worse, not better.

Starting without clear direction meant completing nothing.

I found some time. A break between other priorities. Sat down with one rule:

Keep it simple. Keep it clean.

I rebuilt from scratch in Svelte. Static site, all content stored in a JSON file, components and styles built to be reusable from the start. Placeholder routes set up early for blog and tool sections I knew were coming. The structure before the details.
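The content layer was deliberately plain. Roughly the idea, with hypothetical field names (the real JSON file's shape isn't shown here):

```javascript
// Sketch of a single-file content store. Field names like "slug",
// "title", and "tags" are illustrative assumptions, not the site's
// actual schema.
const content = {
  projects: [
    {
      slug: "portfolio-rebuild",
      title: "Building the Thing You're Reading",
      tags: ["Svelte", "SCSS", "JSON", "Sanity"],
    },
  ],
  // Placeholder sections reserved for routes that come later.
  blog: [],
  tools: [],
};

// A lookup helper a route component might call at build time.
function getProject(slug) {
  const project = content.projects.find((p) => p.slug === slug);
  if (!project) throw new Error(`No project for slug: ${slug}`);
  return project;
}
```

With everything in one object, a Svelte route only needs the slug from its URL params to pull the right entry, and adding the blog later means filling in an array that already exists.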

One thing I've learned working professionally in web development: the process that consumes the most time isn't building — it's the cleanup afterward. Every component has something slightly unique. Styles drift from the globals. What starts as a small deviation stacks up, and if you value clean, readable, DRY code, those small inconsistencies become a real problem.

My usual workflow is to catch these continuously — target a feature, review the code, extract reusable pieces, push repeated styles into global variables. It works, but it means constant context switching. Build a little, audit, build a little, audit. The audits repeat themselves because the codebase is still in motion.

Once the scaffold was in place I started integrating AI into my process — specifically for code auditing.

The shift wasn't about offloading work. It was about batching it differently.

Instead of small continuous audits I could now build further before stopping to review. Build, build, build — get the project significantly closer to completion — then audit in a larger chunk. The AI would flag redundant styles, identify code that could be broken into reusable components, and surface the kind of inconsistencies that accumulate over a long session.

Auditing in larger chunks improved both consistency and efficiency. Repeated small audits over short sections meant repeat work — a non-DRY workflow applied to the workflow itself. Doing it less frequently over more ground meant each pass was more meaningful and less likely to revisit the same decisions.

I found myself building more, auditing efficiently, and reaching a shippable product faster than I typically would.

Once I had a solid MVP I ran two focused audits.

  • Accessibility — contrast verification, labeling, the fundamentals. Straightforward but easy to skip when you're moving fast. Didn't skip it this time.
  • Performance — typically an area I'm aware of but rarely have the bandwidth to focus on properly. This time I used AI to run load tests, filter down to the key metrics, and simulate different conditions.

The codebase itself was clean — Svelte rendering static pages with lean CSS doesn't give performance issues much room to hide. The one bottleneck was the SVG oak accent in the design. An 8MB file.

The fix: responsive image delivery and conditional loading. Mobile now receives a file measured in kilobytes. Tablet gets slightly more. Laptop gets around half a megabyte. The full image only delivers to 4K displays — on the reasoning that if you're sitting at a 4K monitor at home, that bandwidth ceiling is effectively gone.
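The tier logic can be sketched as a simple lookup. The breakpoint widths and file names below are illustrative assumptions, not the site's actual assets:

```javascript
// Conditional-loading sketch: pick the smallest oak asset that covers
// the viewport. Breakpoints and file names here are hypothetical.
const OAK_VARIANTS = [
  { maxWidth: 640, src: "/oak-mobile.svg" },      // kilobytes
  { maxWidth: 1024, src: "/oak-tablet.svg" },     // slightly more
  { maxWidth: 2560, src: "/oak-laptop.svg" },     // ~0.5 MB
  { maxWidth: Infinity, src: "/oak-full.svg" },   // full 8 MB, 4K only
];

function pickOakVariant(viewportWidth) {
  // Variants are ordered smallest-first, so the first match wins.
  return OAK_VARIANTS.find((v) => viewportWidth <= v.maxWidth).src;
}
```

In Svelte this could drive an `<img src>` binding, or the same tiers could be expressed declaratively with a `<picture>` element and media queries so the browser never requests the larger files at all.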

I also ran the SVG itself through optimization tooling and made some manual edits to reduce bloat in the file structure.
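For reference, a minimal SVGO configuration of the kind used for this sort of cleanup (the exact plugin list applied to the oak file may have differed):

```javascript
// svgo.config.js -- a minimal setup; treat this as a sketch, not the
// exact configuration used on the oak asset.
module.exports = {
  multipass: true, // re-run plugins until the file stops shrinking
  plugins: [
    "preset-default",   // SVGO's standard cleanup set
    "removeDimensions", // drop width/height so CSS controls sizing
  ],
};
```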

Result: fully responsive, accessible, performant static site. I published it.

A static site with JSON data gets you to MVP quickly. But it locks the content in place, and editing it takes too much effort.

I've worked with a lot of backends — built my own from scratch at points — so I had a sense of what I wanted: something I could get running fast, something I could customize without fighting the tool.

I chose Sanity. Their content lake approach suited a project like this — I don't need precise control over a traditional database schema, and their dashboard is customizable independent of the content itself.

I built out the fields and focused on relationships between content types. The goal was to minimize re-entering data — if something is referenced across post types, it should be defined once and targeted where needed. Relationships first, content second.
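A reference-first schema might look like the sketch below, written as the plain objects Sanity accepts. The type and field names are hypothetical; only the pattern (define once, reference everywhere) reflects the actual setup:

```javascript
// Hypothetical Sanity schema sketch. A shared "tag" document is
// defined once; posts point at it by reference instead of
// re-entering the data per post type.
const tag = {
  name: "tag",
  type: "document",
  fields: [{ name: "title", type: "string" }],
};

const post = {
  name: "post",
  type: "document",
  fields: [
    { name: "title", type: "string" },
    // Portable Text body: an array of block content.
    { name: "body", type: "array", of: [{ type: "block" }] },
    // References, not copies -- edit a tag once, every post updates.
    {
      name: "tags",
      type: "array",
      of: [{ type: "reference", to: [{ type: "tag" }] }],
    },
  ],
};
```

Sanity also ships `defineType`/`defineField` helpers that add editor tooling on top of this shape, but the relationship modeling is the same either way.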

The integration came together quickly. One minor environment variable issue. One major one.

Data migration, moving content from the JSON file into the database, was where I hit the wall. At first I tried handing the problem to AI and having it perform the migration directly; there was a lot of manually entered content I did not want to re-type. That it failed wasn't AI's fault. It wasn't a practical prompt for the task, and honestly I knew that going in, but it seemed worth trying. What came back was a database full of broken references and errors. Nothing functional.

So I found the working lazy solution: write a script. A simple one that takes the schema and the JSON, matches them, and outputs a restore point file for Sanity. Clear the old data, import the new. Done.
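The script boils down to a small transform. This sketch assumes hypothetical field names and an illustrative ID scheme; only the idea of emitting an importable restore file comes from the actual workflow:

```javascript
// Migration sketch: match old JSON content to the new schema and emit
// NDJSON, the one-document-per-line format Sanity's dataset import
// accepts. Field names and the ID scheme are illustrative.
function toNdjson(jsonContent) {
  const docs = jsonContent.projects.map((p) => ({
    _id: `project-${p.slug}`, // stable IDs so references survive re-imports
    _type: "project",
    title: p.title,
    tags: p.tags,
  }));
  return docs.map((d) => JSON.stringify(d)).join("\n");
}

const ndjson = toNdjson({
  projects: [{ slug: "rebuild", title: "Rebuild", tags: ["Svelte"] }],
});
// Write `ndjson` to restore.ndjson, clear the old dataset, then import
// it with Sanity's CLI dataset-import command.
```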

A good engineer finds the laziest solution that actually works.

Sanity is wired to the Svelte frontend. Blog routes are live. I'm now in the final phase — a mix of writing and building simultaneously.

Core blog styles are complete. Still in flight: additional typography, markdown support, and PortableText integrated into Svelte for a block-style editing workflow similar to WordPress. Content gets created; features get added as the content requires them.
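For illustration only, here's a toy version of what a Portable Text renderer does with block content. The real site uses a proper PortableText-to-Svelte integration rather than string concatenation, and the block shapes below are simplified:

```javascript
// Toy Portable Text renderer: each block carries a style and an array
// of text spans. Real renderers map styles to framework components;
// this just builds HTML strings to show the block-based model.
function renderBlocks(blocks) {
  return blocks
    .map((block) => {
      const text = block.children.map((span) => span.text).join("");
      return block.style === "h2" ? `<h2>${text}</h2>` : `<p>${text}</p>`;
    })
    .join("\n");
}
```

Because content arrives as structured blocks rather than raw HTML, each block type can be swapped for a custom component later, which is what makes the WordPress-style editing workflow possible.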

The portfolio you're reading is the result.

What's coming next: a tools section pulling its data through Supabase, and whatever the data projects require when they're ready to be shown. The infrastructure is built to expand. That was the point.

The Next.js version was a better-looking toy.

This one works.
