What I Learned Building Apps with an AI Partner

I didn’t learn math until I was 25.

I want you to sit with that for a second. Not calculus. Not statistics. Math. The kind you need to follow along in a technical meeting without nodding and pretending. I came up through energy policy, through advocacy, through sheer force of will and a refusal to believe that the people in the room with the fancy degrees were smarter than me. They just had different training.

Now I build software. Real software. Tools that cities use to track building performance standards. Measurement and verification platforms for energy programs. And I do it with an AI partner.

What happens when someone who was never supposed to build things gets access to the same capabilities as someone with a computer science degree? Turns out it’s less about technology and more about what I’ve started calling “GSD mode”: Get Shit Done.

The Old Rules Don’t Apply

For most of software history, there was a clear line. On one side: people who could code. On the other: people who had ideas but needed to hire the first group to make them real. If you wanted to build something, you either learned to code (a multi-year commitment), hired developers (expensive, and you’d have no idea if they were any good), or gave up.

I chose “none of the above.”

When I started using Claude Code and tools like OpenClaw, something clicked. I could describe what I wanted in plain English, iterate in real time, and actually build the thing. The actual thing, not a mockup or a prototype someone else would finish.

The BPStool started as a conversation. I needed a way to help cities understand building performance standards, to model different scenarios, to see what would happen if they set different thresholds. I described the problem. The AI and I started sketching out solutions. Within days, I had working code.

Was it perfect? God no. Was it functional and solving a real problem? Yes.

Partnership, Not Magic

Building with AI is like having a very smart, very fast collaborator who has read every programming textbook ever written but has never lived in your head.

The AI knows syntax. It knows patterns. It knows a hundred ways to structure a database. What it doesn’t know is why you’re building this thing, what problem you’re actually trying to solve, or what trade-offs matter to you.

That means the new bottleneck is clarity of thought.

When I’m working on the MnV platform, I can’t just say “make it work.” I have to say: “We’re tracking energy consumption across buildings, and I need to compare pre-intervention baselines to post-intervention actuals, adjusting for weather. The users are utility program managers who are smart but not technical. They need to see the savings clearly without drowning in methodology.”

The more precisely I can articulate what I want, the faster we move. Vague requests get vague results. Specific requests, with context about who’s using it and why, get solutions I can actually ship.
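To make that M&V request concrete, here is a minimal sketch of the idea: fit a weather baseline on pre-intervention data, then compare what the baseline predicts for post-period weather against what the building actually used. Everything here, from the degree-day regression to the numbers, is illustrative and not the platform's real model.

```python
# Sketch of weather-adjusted savings: baseline prediction minus actuals.
# Assumes a simple one-variable regression on heating degree days (HDD).
from statistics import mean

def fit_baseline(hdd, kwh):
    """Ordinary least-squares fit of monthly kWh against heating degree days."""
    x_bar, y_bar = mean(hdd), mean(kwh)
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(hdd, kwh)) / \
            sum((x - x_bar) ** 2 for x in hdd)
    intercept = y_bar - slope * x_bar
    return slope, intercept

def weather_adjusted_savings(slope, intercept, post_hdd, post_kwh):
    """Savings = what the baseline predicts under post-period weather,
    minus what the building actually consumed."""
    predicted = [intercept + slope * h for h in post_hdd]
    return sum(predicted) - sum(post_kwh)

# Pre-intervention months: degree days and metered consumption (made up).
pre_hdd = [100, 300, 500, 700]
pre_kwh = [1200, 1600, 2000, 2400]
slope, intercept = fit_baseline(pre_hdd, pre_kwh)

# Post-intervention months: milder weather AND lower use per degree day.
post_hdd = [90, 280, 480, 650]
post_kwh = [1000, 1300, 1600, 1900]
savings = weather_adjusted_savings(slope, intercept, post_hdd, post_kwh)
print(round(savings), "kWh saved")  # prints "1200 kWh saved"
```

The weather adjustment is the whole point: you compare actuals to what the old building would have used in this weather, not to last year's raw numbers.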

The Back and Forth

Building with AI is iterative. You don’t describe once and get a finished product. It’s a conversation.

I’ll explain what I need, the AI will build something, and I’ll realize I was wrong about half of it. That part surprised me. Or I’ll see a better approach now that I’m looking at the actual thing. So I adjust, the AI rebuilds, and we go again.

This sounds inefficient, but it’s actually how good products get made. You can’t think your way to the right solution. You have to build, see, react, and adjust. The difference is that with AI, each iteration takes minutes instead of days.

Last week I was working on a dashboard feature for tracking program performance. I described what I wanted. The first version was fine but not quite right. I said: “The data is good, but I need it structured differently. Give me the summary metrics first, then the building-level details, then the trend analysis.” New version in 90 seconds. Better, but now I could see the real issue. “Actually, what I really need is to see the outliers first, then let users drill down into why.” Another 90 seconds. There it was.

Three iterations. Maybe five minutes total. A month ago, this would have been three meetings with a development team.
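The "outliers first" ordering from that third iteration can be sketched in a few lines: rank buildings by how far their actual use deviates from target, largest deviation on top, then let users drill into the rest. The field names, targets, and buildings below are made up for illustration.

```python
# Sketch of an outliers-first dashboard ordering. Data is illustrative.
buildings = [
    {"name": "City Hall",  "target_kwh": 500, "actual_kwh": 510},
    {"name": "Library",    "target_kwh": 300, "actual_kwh": 420},
    {"name": "Rec Center", "target_kwh": 800, "actual_kwh": 640},
]

def deviation(b):
    """Fractional deviation from target; the sign shows over/under."""
    return (b["actual_kwh"] - b["target_kwh"]) / b["target_kwh"]

# Outliers first: sort by absolute deviation, biggest surprises on top.
ranked = sorted(buildings, key=lambda b: abs(deviation(b)), reverse=True)

for b in ranked:
    print(f"{b['name']}: {deviation(b):+.0%}")
# Library: +40%
# Rec Center: -20%
# City Hall: +2%
```

Sorting by absolute deviation surfaces both over- and under-performers, which is what a program manager scanning for "why" actually wants to see first.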

New Skills for a New Era

If technical skill isn’t the bottleneck anymore, what is? I’m probably missing something obvious, but these keep showing up:

Clarity of thought. You need to know what you’re trying to accomplish. Specifically. Who is the user? What problem are they facing? What does success look like? If you can’t articulate this clearly, you’ll thrash. The uncomfortable part: AI will build exactly what you ask for. Ask for the wrong thing, you get the wrong thing fast. It’s the world’s most efficient mirror for unclear thinking.

Persistence. The first version is never right. The fifth version is usually better. The fifteenth version, the one where you’ve actually used the thing and discovered all the ways your initial assumptions were wrong, that’s when it starts to feel good. Most people give up somewhere around version three. Don’t.

Taste. Knowing what “good” looks like. When the AI generates five options, which one is right for your users, your context, your constraints? This requires judgment no AI currently has. It’s the hardest skill to teach and it matters the most.

Ruthless Optimism as Strategy

I hire theater majors.

Not because they know how to code. They don’t, usually. I hire them because they know how to learn, how to empathize with an audience, and how to take direction and improvise when the scene changes. I believe in people before their resumes catch up to their potential.

AI makes this strategy even more powerful. Now I can hand someone with zero coding experience a tool that lets them build real things. The gap between “has good ideas” and “can ship product” just got a lot smaller.

Agency. The belief that you can affect outcomes. That you can learn what you need to learn, build what you need to build, and change what you need to change. Most people have been trained out of this. Told that they can’t, that they need permission, that they’re not qualified.

AI makes agency matter more, not less. Show up with initiative and you can actually build the thing. Show up waiting for instructions and nothing happens.

I choose ruthless optimism. I choose to believe I can figure it out, whatever “it” is this week. Sometimes I’m wrong and I waste three days building the wrong thing and curse at my laptop. Usually, I’m more right than I expected.

Tools That Build Tools

I didn’t expect this: I built internal tools that help me build external tools.

When I was struggling to keep track of building performance data across different cities, I built a dashboard to help me see patterns. That dashboard taught me what questions cities actually ask. Those questions shaped the BPStool. The BPStool is now teaching me what features the MnV platform needs.

Each thing I build makes the next thing easier to build. The code doesn’t transfer. The understanding does. I know more about what utility program managers need because I built something they used and watched them use it. I know more about what cities care about because I built something that forced me to model their decisions.

Building teaches you things. Every app is a conversation with your users, and every conversation teaches you something you couldn’t have learned any other way.

The people who win are the ones who learn fastest. AI handles the building speed. You still have to do the learning.

What Comes Next

We’re still early. The tools are getting better fast. What I can build today would have seemed impossible two years ago. What I’ll be able to build in two years probably seems impossible now.

But I think the biggest change is in our expectations about who gets to build things, not in the tools themselves.

The default for most of history has been: ideas are cheap, execution is expensive. Have a concept for an app? Great. Either learn to code, hire someone who can, or forget about it. The gap between vision and reality was enormous, and only certain people had the resources to cross it.

That gap is closing. Not all the way. You still need to think clearly, keep pushing, and know what good looks like. You still need to do the work. But the technical barrier that kept most people on the “ideas” side of the line is getting much lower.

Which means a lot more people are about to discover they can build things.

I didn’t learn math until I was 25. I came up through policy, not engineering. I had every reason to believe that building software wasn’t for people like me.

It turns out, it is. And if you’re reading this wondering if you could do the same thing: you probably can. The tools exist. The capability is accessible. The only question is whether you’ll believe it enough to try.

My advice? Be recklessly optimistic. Get in GSD mode. Describe what you want, iterate until it’s right, and ship the thing. You’re going to be wrong a lot. You’re going to learn faster than you thought possible.

And one day you’ll realize: oh shit, I build things now.

Anna Kelly builds climate software and believes in people before their resumes catch up. She’s currently working on building performance standards tools and measurement and verification platforms for energy programs.
