We're all centaurs now
The humanness isn't in typing every word. It's in deciding what's worth saying.
Key takeaways
- Using AI to prototype and ship code as a non-"engineer" GM felt like a transgression in 2025, but that mode of work is now industry standard.
- Delegating low-level execution to AI frees you to work at a higher level: design, strategy, direction, and philosophy.
- The same cultural anxiety about authenticity is playing out for AI-assisted writing as it did for coding.
- The human contribution is meaning, values, judgment, and direction, not typing every word.
- Embracing the centaur model (half human, half AI) with intention beats doing it reluctantly or apologetically.

Last spring, I committed what felt like a professional transgression. As a general manager at a crypto startup, I used Cursor to prototype a token detail screen—something that had been sitting in our backlog for months. Within an hour, I had a working demo. The UI was janky and didn't conform to our design system, but it existed. And that existence felt significant.
I sensed skepticism from my team. The feedback I perceived, spoken and unspoken, was that I'd broken process. Skipped important steps. The prototype showed something, sure, but it didn't represent proper team-wide thinking. It felt like they saw it as a curiosity, not a contribution.
I kept going anyway. I built a small project to generate content and documentation about tokens and asset classes, then integrated it directly into our web app as tooltips and links. This time it wasn't just a prototype—it was production code that real users would interact with. And this time, the resistance felt more pointed.
I perceived that people thought I had no place pushing code, let alone AI-generated code. I was using a "black box" to do work that shouldn't be delegated to machines, at least not by a non-"engineer". The word that kept surfacing in my mind was irresponsible. I felt like I was using a cheat code, and worse, like I might not even know enough to understand why it was wrong.
Here's the thing: I was the GM. I had the authority to push that work through. But I couldn't shake the feeling that I might not have been able to do it without that authority. And I spent months questioning whether I'd done the right thing.
The Vindication
That was April and May of 2025. This is February 2026.
In the intervening months, something shifted. AI-driven coding went from suspicious novelty to industry standard. The discourse moved from "Is this as good as humans?" to "How do we manage systems with superhuman capabilities?" The tooling improved, the models advanced, but mostly, people just… tried it. And realized it worked.
My intuition was entirely vindicated. What I'd discovered wasn't a shortcut—it was a different mode of operation. The low-level details I'd been criticized for not writing myself turned out to be exactly the kind of work that should be delegated. Because delegating them freed me to work at a higher level of abstraction, to think more strategically, more creatively.
It's not that different from managing a team. When you lead people, you don't write every line of code yourself. That doesn't make you less creative—it makes you more creative, because you're spending your cognitive resources on questions of design, strategy, direction, and most importantly, philosophy.
History Repeating
Now I'm working on a new startup. I'm building a product, developing a platform, and cultivating a public voice again. And I'm using AI to write blog posts, to express myself, to publish actively.
Last week, a friend shared feedback on one of my posts. Something in it made him feel like it was AI-generated. He described his reaction as a "brain itch"—that moment of recognition that pulls you out of the content. He sent me a link arguing that all writing should be "organic"—handwritten, unprocessed, preserving what a person actually thinks.
And immediately, I felt it again. That same self-doubt. That same shame. Maybe I am short-circuiting something essential. Maybe the creative element is lost when I'm not the one writing every sentence. Maybe I'm using another cheat code.
But then I stopped and thought about how I actually write these days.
The Real Process
My writing doesn't start with fully formed ideas waiting to be transcribed. It starts with the contours of interests and questions. When something provokes my curiosity, I open a conversation with an AI agent. I ask it to help me analyze the concept. I load an article and ask for a summary, then Q&A it, bouncing between the source material and the conversation. I ask for corrections, for synthesis, for reports.
This is a learning process. A powerful, leveraged learning process. And that report or analysis—that's essentially a blog post to myself. The jump from there to public expression is smaller than you'd think. I just need to transform it so someone without my initial context can access both the topic and my developed viewpoint.
So I work with the agent to convert the analysis into a draft. I iterate on phrasing, positioning, structure. I ask for candidates and choose between them. I provide style guidelines and refine them over time. The exact wording often isn't what I first came up with. But the ideas are mine. The judgment is mine. The direction is mine.
And crucially: I'm writing because I can do this quickly. I'm running a one-person startup. The difference between five hours and one hour on a blog post is four hours I can spend building product. Without AI assistance, I wouldn't be blogging at all—or I'd be blogging far less.
It's the same trade-off as last year: existence versus non-existence. Something good enough that gets out there versus something perfect that never happens.
The Pattern
I think writing is now going through what coding went through last year. The same cultural moment. The same questions about authenticity and responsibility. The same anxiety about what makes something "human."
And I suspect this pattern will repeat as AI penetrates more domains. Each time, we'll question whether we're losing something essential. Each time, we'll discover that what we thought was essential—the low-level execution—was actually just what was possible for us to do. And that when we delegate it, we free ourselves to work at the level where human creativity actually lives: meaning, values, judgment, direction.
The humanness isn't in typing every word. It's in deciding what's worth saying.
The Embrace
This doesn't mean anything goes. I'm not arguing for putting out work you haven't properly reviewed and approved. But there's a lot of subjectivity in what "properly" means. And especially in a startup mentality, the cost of publishing something AI-assisted and imperfect is usually lower than we think. The downside is some reputational risk. The upside is that you iterate toward quality and authenticity faster than if you'd waited for perfection.
Every piece you create with AI gets you closer to understanding how to channel yourself through the technology more effectively. The words might not all be yours, but the voice can be. And increasingly will be, as you develop confidence in directing these tools.
We need to embrace the centaur nature of this moment. Not retreat from it. Not treat it with wariness. But develop real confidence in our ability to guide these systems as extensions of ourselves.
We're all centaurs now. Half human, half AI. The question isn't whether to accept that—the integration is already happening. The question is whether we'll do it proactively, with intention, channeling our values and judgment through these tools. Or whether we'll do it reluctantly, apologetically, always wondering if we're cheating.
I spent months last year questioning my intuition. I'm not doing that this time. The work I'm putting out is work that reflects my thinking, serves my goals, and wouldn't exist without this partnership. That's enough.
The future isn't about preserving some notion of pure, unassisted human creativity. It's about becoming fluent in a new mode of creative expression—one where the human contribution is strategic direction rather than tactical execution.
And that, it turns out, is exactly where human creativity has always lived anyway.