Building a CLI to validate OpenAPI specs

Developer & Technical Writer · 2025 · Ongoing · 1 person · 3 min read

Built a Go-based CLI tool that validates OpenAPI specifications against readiness rules, catching documentation gaps before they reach production.

Overview

Created SpecGate, a CLI quality gate for OpenAPI specs. The tool enforces readiness rules that catch gaps such as missing operation summaries, placeholder server URLs, and undocumented responses, and it integrates into CI pipelines.

Problem

OpenAPI specs are often incomplete or inconsistent: operation descriptions are missing, error responses are undocumented, or server URLs contain placeholder content. These gaps create friction for developers integrating the API and for technical writers documenting it.

Approach

I decided to build a CLI to address these issues. I started by learning Go basics, then built the CLI with the Cobra framework. The first command I built was check, which loads an OAS file, validates it against readiness rules, and reports the results; it also returns a non-zero exit code for CI integration. I began with the rules that mattered most: missing operation summaries, missing responses, and placeholder server URLs.
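As a rough sketch of what the core of a check like this might look like (the struct fields and rule below are simplified assumptions for illustration, not SpecGate's actual code), here is a minimal pass that flags operations with missing summaries:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// minimalSpec models only the OAS fields this sketch inspects.
type minimalSpec struct {
	Paths map[string]map[string]struct {
		Summary string `json:"summary"`
	} `json:"paths"`
}

// missingSummaries returns one finding per operation without a summary.
func missingSummaries(raw []byte) ([]string, error) {
	var spec minimalSpec
	if err := json.Unmarshal(raw, &spec); err != nil {
		return nil, err
	}
	var findings []string
	for path, ops := range spec.Paths {
		for method, op := range ops {
			if op.Summary == "" {
				findings = append(findings,
					fmt.Sprintf("%s %s: missing operation summary", method, path))
			}
		}
	}
	return findings, nil
}

func main() {
	doc := []byte(`{"paths":{"/pets":{"get":{"summary":"List pets"},"post":{}}}}`)
	findings, err := missingSummaries(doc)
	if err != nil {
		panic(err)
	}
	for _, f := range findings {
		fmt.Println(f)
	}
}
```

A real implementation would use a full OpenAPI parsing library rather than hand-rolled structs, but the shape of the check is the same: walk the spec, collect findings, report them.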

Challenges

  • I knew nothing about Go when I started
  • I had to learn a new language, a CLI framework (Cobra), and OpenAPI parsing libraries
  • I needed to build deterministic validation logic

Key Tasks

Learn Go

Reasoning:

Although I know Python well enough, Go is the de facto standard for CLI tooling: it's fast and compiles to a single binary.

Alternatives considered:
  • Use Python (easier for me, but Go's Cobra is a more robust CLI framework)

Use deterministic validation rules instead of AI-powered analysis

Reasoning:

I previously built SmartDoc, a similar CLI. SmartDoc was easier to design and implement, but less predictable because it used an LLM for its analysis. If I wanted SpecGate to be used in CI pipelines, the logic had to be deterministic and consistent. This meant that I had to think deeply about the readiness rules and what makes a spec 'ready.'
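One way to make that determinism concrete is a rule interface: every rule is a pure function from spec to findings, so the same input always produces the same output. The names and shapes below are illustrative assumptions, not SpecGate's actual API:

```go
package main

import "fmt"

// Finding reports one readiness violation at a spec location.
type Finding struct {
	Rule     string
	Location string
	Message  string
}

// Rule is a deterministic check: same spec in, same findings out.
type Rule interface {
	Name() string
	Check(spec map[string]any) []Finding
}

// requireServers flags specs that declare no servers at all.
type requireServers struct{}

func (requireServers) Name() string { return "require-servers" }

func (requireServers) Check(spec map[string]any) []Finding {
	servers, ok := spec["servers"].([]any)
	if !ok || len(servers) == 0 {
		return []Finding{{
			Rule:     "require-servers",
			Location: "/servers",
			Message:  "spec declares no servers",
		}}
	}
	return nil
}

// runRules applies every rule in a fixed order, so output order is stable.
func runRules(spec map[string]any, rules []Rule) []Finding {
	var all []Finding
	for _, r := range rules {
		all = append(all, r.Check(spec)...)
	}
	return all
}

func main() {
	spec := map[string]any{"openapi": "3.1.0"}
	for _, f := range runRules(spec, []Rule{requireServers{}}) {
		fmt.Printf("[%s] %s: %s\n", f.Rule, f.Location, f.Message)
	}
}
```

Because no rule touches external state, running the tool twice on the same spec gives identical results, which is exactly what a CI gate needs.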

Focus validation on operations, responses, and servers first

Reasoning:

These are the highest-friction areas for developers and writers. Missing operation summaries, undocumented error responses, and placeholder URLs are the gaps users hit first when they try to integrate against a spec.

Alternatives considered:
  • Only validate OAS structure (wouldn't be as useful)
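The placeholder-URL check is a good example of how simple these rules can stay. A heuristic like the one below would do it; the marker list here is my own guess at common stand-ins, not SpecGate's real rule set:

```go
package main

import (
	"fmt"
	"strings"
)

// placeholderMarkers lists substrings commonly left in unfinished server
// URLs. The exact list is an illustrative assumption.
var placeholderMarkers = []string{"example.com", "localhost", "todo", "your-domain"}

// isPlaceholderURL reports whether a server URL still looks like a stand-in.
func isPlaceholderURL(url string) bool {
	lower := strings.ToLower(url)
	for _, marker := range placeholderMarkers {
		if strings.Contains(lower, marker) {
			return true
		}
	}
	return false
}

func main() {
	for _, u := range []string{"https://api.example.com/v1", "https://api.petstore.io/v1"} {
		fmt.Printf("%s placeholder=%v\n", u, isPlaceholderURL(u))
	}
}
```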

Design for CI integration

Reasoning:

SpecGate needed to integrate into GitHub workflows, not just run locally. Non-zero exit codes, JSON output format, and strict mode were designed with automation in mind.
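A sketch of what that automation contract might look like (the report shape and field names are assumptions for illustration): the CLI emits machine-readable JSON and maps findings to an exit status that a pipeline can act on.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Report is the machine-readable result a CI job would parse.
type Report struct {
	Findings []string `json:"findings"`
	Passed   bool     `json:"passed"`
}

// exitCode maps a report to a process status: 0 for a clean spec,
// 1 when violations were found. A strict mode could also promote
// warnings (not modeled here) to failures.
func exitCode(r Report) int {
	if r.Passed {
		return 0
	}
	return 1
}

func main() {
	report := Report{
		Findings: []string{"post /pets: missing operation summary"},
		Passed:   false,
	}
	out, err := json.Marshal(report)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
	// A real CLI would finish with os.Exit(exitCode(report)) so the
	// non-zero status fails the CI step.
	fmt.Println("exit code:", exitCode(report))
}
```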

Tech Stack

  • Go
  • Cobra CLI
  • GitHub
  • Astro

Impact

  • Validation rules: created 10 rules covering operations, responses, and servers
  • CI-ready design: exits with a non-zero code on violations, enabling use as a quality gate in pipelines
  • Learning outcome: built a deterministic CLI tool in Go, learning the language through a real project

This project demonstrates how spec quality can shift left: issues are caught during development rather than after the API ships. It also shows the value of deterministic validation in CI pipelines, and how deliberate design choices (rule-based over LLM) produce more reliable tools.

Learnings

  • Learning a new language through a real project is faster and more motivating than tutorials
  • Deterministic validation is harder but more reliable than LLM-powered approaches

Notes

The hardest part about building SpecGate wasn’t learning Go. It was deciding what rules mattered and how to surface errors in a user-friendly way. You don’t just write code that displays text in a terminal; every decision needs to make clear what the error is and how to fix it.

This was a fun learning experience for me, and I intend to continue development work on SpecGate. I’m proud of the result!

If you’re interested in learning more about SpecGate, check out the documentation.