Note: I originally sent a version of this to several founders in our portfolio who run tech or tech-adjacent companies. A few of them told me I should publish it, so here it is, lightly edited for a wider audience.


I know what you’re thinking. “Yet another AI hype piece.” I get it. I’ve been that guy rolling his eyes for the past few years.

But in the last couple of months, the tools got good. Like, actually good. And I think you have a narrow window right now where the cost of learning is low and the cost of waiting is about to get very high. If your competitors figure this out before you do, you won’t just be behind. You’ll be learning to walk while they’re sprinting.

Let me explain.

Where I’m Coming From

I love writing code. I spent the first twenty years of my career doing it professionally and I still do it for fun. A decade of that was at Goldman Sachs and Morgan Stanley building derivatives platforms, the kind of systems where a bug doesn’t just annoy users, it loses real money. After that, Group CTO of Traveloka, managing roughly 1,300 engineers at peak.

I’m also one of my own biggest customers when it comes to AI tools. My monthly subscription bill is frankly embarrassing. I’ve tested everything. And for years, my honest assessment was: interesting, but not there yet. The gap between hype and reality was wide enough that I didn’t think it was worth your time. I’m not in the business of crying wolf.

What I’m seeing now is different. Not incrementally better. Actually different. That’s why I’m writing.

The Skeptics Are Converting

I don’t care what AI companies say about their own products. I care what the builders say. And the builders are saying something they weren’t saying six months ago.

Andrej Karpathy (founding team of OpenAI, former AI Director at Tesla, Stanford PhD) dismissed AI coding tools as overhyped on the Dwarkesh Patel podcast in October 2025. His exact words: the models are “not there” and the industry is “trying to pretend like this is amazing, and it’s not. It’s slop.”

Two months later, on December 27, he posted what became one of the most-shared developer tweets of the year:

“I’ve never felt this much behind as a programmer. The profession is being dramatically refactored as the bits contributed by the programmer are increasingly sparse and between. I have a sense that I could be 10X more powerful if I just properly string together what has become available over the last ~year.”

Then in late January, he shared field notes from weeks of heavy AI coding use. He’d gone from 80% manual / 20% AI to 80% AI / 20% manual in a matter of weeks. His words: “easily the biggest change to my basic coding workflow in ~2 decades of programming and it happened over the course of a few weeks.”

David Heinemeier Hansson (DHH), creator of Ruby on Rails and co-founder of Basecamp, is one of the most opinionated people in software. In July 2025 on the Lex Fridman podcast, he described using AI to code as feeling like “competence draining out of my fingers.” He’d tried the tools and hated them. His advice at the time: don’t let AI write your code directly.

Six months later, he reversed course: “Half the resistance was simply that the models weren’t good enough yet. I spent more time rewriting what it wrote than if I’d done it from scratch. That has now flipped.” On his blog, he called them “fully capable of producing production-grade contributions to real-life code bases.”

Why This Matters to You

Most of you are CEOs who don’t write code. You probably don’t care which text editor your engineers prefer, and normally you shouldn’t. But this is different, because the math that underpins your business decisions is changing.

Some of you have a feature backlog that will never get done because you can’t hire enough engineers. Some of you can’t execute your product roadmap because a key client integration would take six months. Some of you are making build-vs-buy tradeoffs based on assumptions about engineering capacity that may no longer hold.

If AI coding tools deliver even a fraction of what people like Karpathy are describing, those assumptions break. What took a team of five might take a team of three. What took six months might take two. The competitors who figure this out first will move at a speed you can’t match, at a lower cost, shipping features to customers while you’re still scoping yours.

What Your CTO Might Not Tell You

So who should you listen to on this? I know many CTOs across the region. A good number of them work fractionally, serving multiple companies at once. I pay close attention to what they think, because their incentive structure is different from a full-time CTO’s.

A fractional CTO has no kingdom to protect. No team they built, no tech stack they championed, no set of processes they’ll have to admit are now obsolete. And their economics push them away from caution: more clients means more revenue, but only if the quality stays high. They need to be fast and good, or they don’t get rehired.

AI tools that actually work are a godsend for these people. And they’re all saying the same thing: it’s finally good enough.

Your full-time CTO is probably already thinking about this. But they can’t push AI adoption alone. They have a team managing production systems, a roadmap full of commitments, and a hundred things on fire at any given time. AI adoption will keep losing to whatever’s urgent today, unless you as CEO signal that it’s a priority and that smart failures along the way are okay.

Your CTO needs your air cover. Give it to them. There’s a reason Jamie Dimon pulled AI out of JPMorgan’s technology org entirely. His words: “We took AI and data out of technology. It’s too important.” He made it a leadership priority, not just a tech initiative. You should too.

The Objections You’ll Hear

When you bring this up internally, expect pushback. Some of it will be legitimate. Some of it won’t be.

“The code quality isn’t good enough.” This was true a year ago. It’s becoming less true by the month. The key nuance: AI-generated code written by someone who doesn’t know what they’re looking at is dangerous. But a good engineer using AI as a drafting tool will move much faster, and they still review, test, and own the output.

“It’ll create technical debt.” Possibly, if used carelessly. But technical debt comes from bad engineering decisions, not from the tool that wrote the code. A skilled engineer using AI can produce cleaner code faster than a mediocre engineer writing everything by hand. And honestly, if you’re relying on low-cost vendors to build your applications, you’re already carrying technical debt. I know. I evaluated many such vendors during the 2021 hiring frenzy.

“We tried it and it didn’t work.” When? The tools from six months ago are nothing like what’s available today. Karpathy went from skeptic to convert in two months. If your team tried it in mid-2025, they tried a different product.

“It’s going to replace us.” The unspoken fear. Worth addressing head-on: the goal isn’t fewer engineers. The goal is more output from the same team. Finally making a dent in that backlog. Revisiting those SaaS contracts you signed because building in-house was “too expensive.” Shipping the features your customers have been asking for.

The Golden Window

Big shifts still take time. The learning curve can’t be skipped. Your team will make mistakes along the way. That’s fine, as long as you start now.

Right now, you can still frame this as empowerment. “We’re giving you superpowers” gets buy-in. “We’re replacing headcount” gets resistance. And expectations are still soft. If you cut a project from six months to four, that’s two extra months your team can spend on the product roadmap. And those two months will be more productive than before, because your team is now better at using the tools.

But imagine a point in the future where your competitors have already been through this learning curve. Where they’ve compressed that same six-month project into six days. If you’re only managing four months by then, you’re not behind. You’re irrelevant. And you’ll still need to go through all the trial and error that your competitors already absorbed.

Your CTO has the same window. Right now, they can make intelligent mistakes while the bar is still forgiving. The experimentation, the false starts, the figuring-out. Far better to do that now than later, when falling behind means losing customers.

This window won’t stay open forever.

Concrete Things You Can Do

I don’t like advice that ends at “you should adopt AI.” So here’s what I’d actually suggest to a founder, starting this week. Today, even.

Squeeze the deadlines. This sounds brutal, but it’s the single most reliable way to force adoption. If a project is scoped at eight weeks, tell the team they have five. They will reach for the tools. People don’t change workflows because you send them a Slack message about AI. They change when the old way stops working.

Give your team a real budget for AI tools. I’m amazed how many companies still don’t do this. Engineers end up using free tiers or sharing pirated accounts, and then leadership complains about “code security concerns.” Think about that for a second: you’re worried about your proprietary code leaking, but you won’t pay for the plans where your data isn’t used for training. So your engineers use the free plans, where it is. Fix the incentive.

And while we’re on code secrecy: for most companies, your secret sauce isn’t your code. It’s your customer relationships, your domain knowledge, your speed of execution. Don’t let “but our code is proprietary” become an excuse to stand still.

Don’t wait for the perfect metric. You’re not going to find a clean, un-gameable way to measure AI’s impact on your engineering org. Not yet. That’s fine. Start with something imperfect. Lines of code per sprint, tickets closed, whatever. Yes, people will game it. But an imperfect metric that gets your team moving is better than no metric while you wait for the perfect one.

Pair every target metric with a counter-metric. This is important. If you measure delivery speed, also set a floor on production bugs. If you measure tickets closed, also track rollbacks. The target metric tells your team what to push on. The counter-metric keeps them honest. Without both, you’re just incentivizing people to cut corners.

As CEO, you need to communicate this clearly and repeatedly. Your team needs to hear from you, not just from the CTO, that AI adoption matters. And when someone figures out a better workflow, or ships something faster because of these tools, recognize it publicly. Engineers respond to recognition. If the only people talking about AI in your company are the tech team, adoption will stall.

Have engineers demo their AI workflows to each other. One engineer figures out a trick, shows the team, and suddenly everyone’s trying it. Peer learning spreads faster than any top-down mandate. Make it a regular thing.


The founders who move on this now won’t get everything right. But they’ll be ready when it matters. You don’t want to be the one scrambling to compress a learning curve that took your competitors months, while the world around you has already moved on.

Your call.

Ray Djajadinata
