© 2024 Felix Ng

Building a Content Engine: How I Automated My Blog with AI
February 15, 2026 · Journal · 4 min read


The Problem

Running a personal website with two content silos — a journal for personal reflections and an AI news feed — sounds manageable until you realize that "consistent publishing" requires far more discipline than most side projects allow.

I'd find myself writing a burst of posts over a weekend, then going silent for weeks. The content was good when it showed up, but the cadence was nonexistent. For a website that's supposed to represent my professional presence, that's not ideal.

So I built a content engine.

The Architecture

The system is surprisingly simple — which is kind of the point.

The Automation API

At the core is a single API endpoint:

POST /api/automation
Authorization: Bearer <API_KEY>

It accepts a JSON payload with the post's title, content (Markdown), slug, type (JOURNAL or AI_NEWS), and an optional cover image URL. Every post created through this endpoint starts as a draft — invisible to readers until I explicitly publish it.
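A minimal sketch of what that payload and the draft-defaulting behavior might look like. The field names come from the description above; the `AutomationPayload` type, `toDraftRecord` helper, and `status` field are hypothetical illustrations, not the actual implementation:

```typescript
// Hypothetical payload shape for POST /api/automation.
type PostType = "JOURNAL" | "AI_NEWS";

interface AutomationPayload {
  title: string;
  content: string;        // Markdown body
  slug: string;
  type: PostType;
  coverImageUrl?: string; // optional
}

// Every post enters the system as a draft, regardless of what the caller sends.
function toDraftRecord(payload: AutomationPayload) {
  return {
    ...payload,
    status: "DRAFT" as const,
    createdAt: new Date().toISOString(),
  };
}

const record = toDraftRecord({
  title: "Example post",
  content: "# Hello",
  slug: "example-post",
  type: "JOURNAL",
});
console.log(record.status); // "DRAFT"
```

The key point is that the draft status is applied server-side, so no client of the endpoint can publish directly.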

This is the critical design decision: human-in-the-loop.

The Human-in-the-Loop Workflow

AI writes draft → I review in Admin → I publish (or edit/delete)

I don't want fully autonomous publishing. The AI is excellent at research, drafting, and formatting. But editorial judgment — tone, accuracy, relevance — that's still my job.

The admin panel shows all drafts with a yellow badge. I can preview, edit, or publish with a single click. If something doesn't meet my standards, I can tweak it or discard it entirely.
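The review step can be thought of as a tiny state machine. This is a sketch of that idea, not the site's actual code; the `Status` and `Action` names are assumptions:

```typescript
type Status = "DRAFT" | "PUBLISHED";

// Hypothetical review actions available from the admin panel.
type Action = "publish" | "edit" | "discard";

// Returns the post's next status, or null if the post is deleted.
function review(status: Status, action: Action): Status | null {
  if (action === "discard") return null;        // doesn't meet the bar: gone
  if (action === "publish") return "PUBLISHED"; // now visible to readers
  return status;                                // editing keeps the current status
}

console.log(review("DRAFT", "publish")); // "PUBLISHED"
```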

Cover Images

Each post gets a unique cover image generated by AI. The images are designed to be abstract and editorial — no cheesy stock photos. They're generated at high quality, optimized, and stored locally in the public/images/covers/ directory.
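For local storage, a slug-based naming scheme keeps covers predictable. A small sketch, assuming one cover per post keyed by slug; the helper name and `webp` extension are illustrative, only the `public/images/covers/` directory comes from the setup above:

```typescript
// Hypothetical helper: derive the local path for a post's cover image.
function coverImagePath(slug: string, ext: string = "webp"): string {
  return `public/images/covers/${slug}.${ext}`;
}

console.log(coverImagePath("content-engine"));
// "public/images/covers/content-engine.webp"
```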

The Content Strategy

Quantity without strategy is just noise. Here's how I think about what I publish:

AI News

  • Model releases — when someone ships something meaningful, I want to analyze it
  • Industry shifts — enterprise adoption, agentic AI, cost trends
  • Ethics & safety — the hard questions about where this is all going

Journal

  • Building in public — honest reflections on what I'm making and learning
  • Developer workflow — tool preferences, productivity experiments, and what's actually worked
  • AI perspectives — my personal takes, not just reporting

The goal isn't to be a news aggregator. It's to be a thinking partner — someone who reads the same things you do but adds a perspective you hadn't considered.

What I Learned

1. Consistency beats perfection

A good post published today beats a perfect post published never. The automation pipeline removes the friction of "starting" — which is where most creative work dies.

2. Constraints breed creativity

The Markdown format, the fixed schema, the draft-review-publish workflow — these constraints actually make writing easier. I'm not staring at a blank word processor. I'm filling in a structure.

3. AI drafts need editing, not rewriting

The sweet spot is when the AI gives you 80% and you add the remaining 20% — your voice, your experience, your judgment. If you're rewriting the whole thing, the system isn't useful. If you're publishing raw AI output, you're not adding value.

4. The tooling matters less than the habit

I could have built this with Notion, or Contentful, or WordPress. The specific stack (Next.js + Prisma + Supabase) doesn't matter. What matters is that the system makes it easier to publish than not to.

The Numbers

Since setting up the content engine:

  • Target: 1 Journal + 1 AI News post per day
  • Time per post (with AI assistance): ~15 minutes review
  • Time without AI: ~2-3 hours per post

That's roughly a 10x improvement in throughput, with arguably better consistency.

What's Next

I'm exploring a few enhancements:

  1. Automated research — having the AI monitor specific news sources and draft articles proactively
  2. Vietnamese translations — auto-generating content_vi for bilingual coverage
  3. Social media distribution — auto-posting summaries to my social channels
  4. Analytics-driven topics — using page view data to inform what topics resonate

But honestly? The current system is good enough. And "good enough, shipped" is the whole philosophy.

If you're thinking about building a similar system for your own site, start simple. An API endpoint, a review workflow, and a commitment to consistency. Everything else is optimization.