I Know Kung Fu

May 4, 2026
By Dennis Kittrell

You might find this hard to believe, but AI has become kind of a thing around here.

Bennie published a post on our Build with AI competition last week, in which he shared that I was lucky enough to land the second-place prize. Genuinely flattered, and a real thank you to Peter F, PZ, Vadim, and Bennie for organizing it. The recognition is great. But the part that does not quite come through in the recap is what those six weeks actually felt like from the inside. Forty-plus submissions, more than ten teams, marathon demo sessions that ran out of time twice over, and a constant drumbeat of ideas where every fifth one made me think “wait, we can just… ship that?”

Two submissions that really impressed me (and are worthy of high praise): Kedar Vaijanapurkar shipped a four-tool MySQL stack (Advisor, random data generator, CleanPrompt, and a Query Reviewer), any one of which on its own would have been a strong submission. And Daniil built a leaderboard for Percona ecosystem contributors plus a vector-search prototype running on Percona’s own products, which is exactly the dogfood story we want.

There were a lot more than three projects worth backing, which is part of why a second contest round is being coordinated later this year. A lot of the entries are not waiting for it either – they are already developing into real, operational utilities (some of mine included).

The two submissions of my own that I would point to first are IBEX and percona-dk.

IBEX (Integration Bridge for EXtended systems) is a local MCP multi-tool server that connects either a local model or a Percona-owned LLM to the systems where the most valuable context actually lives: Slack, Notion, Jira, ServiceNow, Salesforce, and so on. We needed something like this because we could not point the standard Claude or ChatGPT connectors at our sensitive internal data, and most of the context that makes LLMs so valuable is precisely that kind of data.
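IBEX itself is internal, but the core pattern is simple: a local server exposes a small set of named tools, each wrapping an internal system, and the model calls them by name with arguments. Here is a minimal sketch of that dispatch pattern in Python; the tool names, arguments, and return shapes are all hypothetical stand-ins, not IBEX's real interface:

```python
import json

# Hypothetical tools, each wrapping an internal system the model cannot
# reach directly. A real MCP server declares these over the protocol and
# calls the actual APIs; these placeholders just return canned data.
def search_slack(query: str) -> list[str]:
    # Placeholder: would query the Slack API with an internal token.
    return [f"slack result for {query!r}"]

def get_jira_ticket(key: str) -> dict:
    # Placeholder: would fetch the ticket from the Jira REST API.
    return {"key": key, "status": "In Progress"}

TOOLS = {
    "search_slack": search_slack,
    "get_jira_ticket": get_jira_ticket,
}

def handle_tool_call(request: str) -> str:
    """Dispatch a JSON tool call like {"tool": ..., "args": {...}}
    to the matching wrapper and return the result as JSON."""
    req = json.loads(request)
    result = TOOLS[req["tool"]](**req["args"])
    return json.dumps({"result": result})
```

The point is the shape, not the plumbing: the model never touches Slack or Jira credentials, it only sees named tools, so the sensitive data stays behind a boundary you control.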

percona-dk is the other one. It started as a way to keep AI honest about our own products by giving the AI tools our teams use (Claude, Cursor, anything that speaks MCP) direct access to Percona’s documentation, so the answer to a question about our products comes from real docs with linked citations instead of stale training data or even scraped web results that can get things wrong. It has evolved a fair bit since the contest. The Percona Community blog and forums are now indexed alongside the docs, Perconians are getting real day-to-day value out of it, and it is starting to look like the kind of thing that could grow into a community utility (perhaps even beyond Percona docs).
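The principle behind percona-dk is easy to illustrate: every answer should be traceable to a source document. Below is a toy sketch of a docs-search tool that returns snippets paired with citation URLs. The hard-coded index and URLs are illustrative only; the real project indexes the live docs, blog, and forums properly:

```python
# Toy stand-in for a documentation index; URLs and text are illustrative.
DOCS = {
    "https://docs.percona.com/example-install":
        "How to install the server: run the package manager ...",
    "https://docs.percona.com/example-backup":
        "Taking a backup: use the backup tool with a target directory ...",
}

def search_docs(query: str) -> list[dict]:
    """Return matching snippets, each paired with the URL it came from,
    so the model can cite real docs instead of guessing from training data."""
    q = query.lower()
    return [
        {"url": url, "snippet": text[:60]}
        for url, text in DOCS.items()
        if q in text.lower()
    ]
```

Because every hit carries its URL, the model can quote the snippet and link the source, which is what keeps the answers honest.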

Those two were just the start. Once IBEX worked, I needed shared memory across LLMs, so I built that. Once I had three MCP servers running, the boilerplate got annoying, so I built CAIRN, a scaffolding tool that builds on Anthropic’s official MCP builder skill. The official skill walks you through writing a server step by step, but CAIRN spins up a complete, working project in minutes with a streamlined install wizard for non-technical users. It is now in the hands of other Perconians building their own MCP tools, and providing real value of its own. Then I learned about .mcpb files and Desktop Extensions (.dxt), packaged everything that way, and stood up an internal Claude plugin marketplace so any Perconian can install the lot from one place. Each layer opened a door I did not know existed until I was already through it. Some of those doors seemed to materialize out of thin air, opening just as Anthropic shipped new releases.

What started as a competition entry is now a small internal ecosystem. I am still a product person, not a software engineer. I am not going to pretend any of the code is pristine, and a lot of it was vibe-coded with Claude as a partner. But the architecture holds together, it works, and most of it is in daily use by people who are not me. That last part is the bit I am most proud of.

The next batch is pointed squarely at product operations. Making customer signals legible. Making internal telemetry something any teammate can talk to in plain English. The early returns are promising, and what gets me most excited is not the tech itself, it is watching people across Product, Engineering, and Support pull in the same direction with an AI colleague in the room. Turns out the interesting part of AI at work is not the model. It is the connective tissue.

I Know Kung Fu

For a product guy who does not code for a living, this era is my “I know kung fu” moment. Not because I suddenly learned to fight. Because the move set I already had – product judgment, systems thinking, customer empathy, the ability to spec a thing precisely – just got a massive upgrade. The gap between “that would be useful” and “that exists now” is short enough to cross in an evening. I do not see it getting longer again.

Thanks for reading this far. If you want more detail or want to try anything not linked here, ping me. I am happy to share more.


MySQL, PostgreSQL, InnoDB, MariaDB, MongoDB, and Kubernetes are trademarks of their respective owners.
© 2026 Percona. All Rights Reserved.