Product Strategy · 4 min read · 2026-02-13

Idea to Production: An Engineer With AI in Hand

The time from idea to working product has compressed dramatically. Real experience building PoC apps with AI coding agents: what worked, what didn't, and the discipline it demands.

Geddy
Senior Web Engineer / Lead


The Shift Is Real

Times are shifting. Senior engineers in this industry are becoming extremely productive. They know systems from A to Z. Now they've got unlimited extra hands — OK, in theory. In reality, there's still a limit to how many agents you can handle efficiently. But the direction is clear.

The time from idea to a working product in front of real users has compressed dramatically. Not months — weeks. Sometimes days.

That said — getting something live and keeping it healthy at scale are very different things. I'll be honest about both.

This direction is visible across the industry. This LinkedIn post by Andreas Horn captures it well: what the best developers are doing right now.

What I Built, and How

I delivered a few PoC apps with AI coding agents: Claude's Opus 4, then 4.5, and lately 4.6. I wrote specs and context files to define how to structure the app: UI, data, architecture. I used the same AI to set up CI/CD workflows and get blue/green deployments running automatically on every push to master. Backend, APIs, authentication, user-scoped data, profiles, core features.

Yes, it is all possible with AI. But it wasn't magic.

Sometimes it came down to digging into a framework's or library's docs and bringing that knowledge into the project's context files to feed the coding agent, guiding it to solve a specific problem the correct way. The right things don't happen with prompting alone; they happen with context engineering. Documenting patterns. Persisting the decisions that matter.

Those MD-based context files became the backbone of the whole process. Human-readable documentation covering features, solution design, how composition components should be built, design system specs, reusable component definitions. The more precise and structured the context, the better the AI output. Skimp on this and you pay for it later.
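To make that concrete: here is a sketch of what one such context file might look like. The file name, component names, and rules below are invented for illustration; they are not from the actual projects.

```markdown
# Context: design-system/buttons.md (hypothetical example)

## Decision log
- 2026-01: All actions use <AppButton>; raw <button> elements are
  not allowed outside it.

## Component contract
- Variants: primary | secondary | danger (no ad-hoc colors)
- Sizes: sm | md | lg; spacing comes from the design tokens file

## Composition rules
- Forms render <AppButton type="submit"> inside <FormActions>
- A button never owns fetch logic; it emits events upward

## Known pitfalls
- The icon slot must not change the button's height
```

The point is that each file persists a decision the agent would otherwise re-litigate on every prompt: the AI reads the contract, not your memory of it.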

Where It Went Wrong

It wasn't all smooth. The temptation to move fast and take shortcuts is real, and I fell for it more than once. You want to get things out quickly, so you skip the upfront thinking. Then you find yourself in an overwhelming loop of rework on a feature that should have been straightforward, prompting again and again instead of stepping back, structuring the approach, and documenting it properly.

That's a beginner's mistake, and I made it. The lesson was clear: within a compressed timeline, the ability to think through the solution — the roadmap, the structure, the design patterns — before touching a prompt becomes even more important, not less. Speed without direction is just spinning.

Building a Foundation

In a month of evening and weekend sessions in Claude Code with Opus 4.5, I built a foundation: a base that handles SSO and users, with core system features ready for any member-based or admin panel project. Deployment automation to a DO droplet using GitHub Actions. No Kubernetes or auto-scalable cloud infrastructure, just a lean way of starting new projects.
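The blue/green part of that automation can be sketched in shell. This is a minimal sketch under my own assumptions, not the actual deploy script: I assume releases live in /srv/app/blue and /srv/app/green, nginx serves whatever /srv/app/current points at, and the app exposes a /healthz endpoint. All paths and the port are illustrative.

```shell
#!/bin/sh
# Sketch of a blue/green switch on a single droplet.
# ASSUMPTIONS (mine, not from the article): slots at /srv/app/blue
# and /srv/app/green, nginx follows the /srv/app/current symlink,
# and each slot answers a /healthz probe.

# Given the live color, return the idle slot to deploy into.
next_color() {
  if [ "$1" = "blue" ]; then echo "green"; else echo "blue"; fi
}

deploy() {
  app_root="${APP_ROOT:-/srv/app}"
  active="$(basename "$(readlink "$app_root/current")")"
  target="$(next_color "$active")"

  # 1. Sync the fresh build into the idle slot.
  rsync -a --delete ./dist/ "$app_root/$target/"

  # 2. Health-check the idle slot before it takes traffic.
  curl -fsS "http://127.0.0.1:8080/$target/healthz" >/dev/null || return 1

  # 3. Flip the symlink, then reload nginx to switch traffic.
  ln -sfn "$app_root/$target" "$app_root/current"
  nginx -s reload
}
```

A GitHub Actions job triggered on pushes to master would then SSH into the droplet and run `deploy`; the live slot keeps serving until the idle one passes its health check, which is what makes the switch safe.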

I kept iterating on the context docs until I had a reasonably solid base. What happened next: about a week of the same evening sessions, and I launched a platform on top of that foundation. Brought it to first users. Started collecting feedback. Emails, user registrations, user feedback, third-party integrations.

Then Opus 4.6 came out. Another set of evening sessions — finished a PoC project integrating with two APIs, one of which is a GPT equivalent. The acceleration is real and it keeps compounding.

On Ownership — Honestly

Do I fully own these solutions? Not in the way I would if I'd written every line myself. That would take significantly longer, and it wasn't the goal. These are PoC projects — "production" in the sense that real users see and use them, not in the sense of battle-tested systems running at scale for years.

What I do own is the flow. How things connect. Why decisions were made. And I cover it heavily with testing (Playwright visual tests, simulated user journeys) to validate that the system behaves exactly as intended. I also dig into the code, find inefficiencies, and steer corrections. Sometimes it's faster and more efficient to just make a change myself than to prompt for it.

I believe tech debt in AI-generated codebases can be controlled — by precisely defining patterns, documenting them as guidance for the AI, and keeping a close eye on every change. Getting the initial structures right means foreseeing them upfront. Enforcing them consistently. Six months in without disaster? I believe that's achievable. But it demands discipline, not just prompting.

For longer-term, more complex projects, I'm sure I'll be looking at ownership with an even closer eye. That's worth exploring as these experiments mature.


This is where I am right now — experimenting, building, learning what works and what doesn't. The leverage is extraordinary. But it demands you bring the architecture, the product thinking, and the quality bar. AI fills in the rest. The challenge worth its own article: how to think through solution design fast enough to match the speed AI gives you.

TL;DR

  • The time from idea to working product in front of real users has compressed dramatically — weeks, sometimes days — but only if you bring the architecture thinking upfront
  • Context engineering is the real skill: structured MD files defining patterns, decisions, and component design matter far more than the prompts themselves
  • Skipping upfront thinking to move fast is the most expensive shortcut: it puts you in an endless rework loop instead of saving time
  • "Production" with AI-generated code means real users see it, not that it's battle-tested at scale; own that distinction honestly
  • Tech debt in AI codebases is manageable, but only through discipline: define patterns precisely, document them, and review every change

The leverage AI gives you is extraordinary — but it demands you bring the architecture, the product thinking, and the quality bar; AI fills in the rest.

Geddy

Senior Web Engineer / Lead

Engineering leadership • AI innovation • Product thinking. 20+ years building scalable web solutions.