Are You a Renter or an Owner?
Raffi Krikorian’s framework for building responsible AI.
My co-founder Shannon went to Davos last month, where she heard Canadian Prime Minister Mark Carney issue a call to action that stuck with her: those in the middle must come together, collaborate, and actively distribute power. Don’t wait around for the world you wish to see — take on the world as it is.
“Collective investments in resilience,” he said, “are cheaper than everyone building their own fortresses.”
A few days later, we held our own gathering: Fast Forward’s 2026 Alumni Retreat. Over two days in San Francisco, 125 nonprofit leaders came together to share what’s working, what’s not, and what it takes to build responsibly in this moment.
I had the honor of hosting the closing conversation with Raffi Krikorian, CTO of Mozilla, who gave a rallying call to these impatiently optimistic builders. His message: we’re moving into a world where decisions are made for us, not with us. A world where we become renters, not owners, of our tech and data. And if we’re not intentional about it, that’s exactly what we’ll get. The themes echoed Carney’s message. If AI’s future is going to benefit humanity and respond to the needs of communities, it’s up to the people in that room, the “middle” players, to make it so.
After all, isn’t that the most interesting thing we can do with AI? Use our best tech not just to optimize workflows or save nominal time on tasks, but to actually solve our biggest problems?
At the Retreat, builders chose solidarity over isolation, sharing models and tooling, aligning on standards, and coordinating advocacy. Raffi’s perspective stuck with them. I hope it sticks with you too.
Whose Side Is Your Tech On?
Raffi rejects the idea that tech is neutral. Design choices embed values. Decisions about who owns the data, who controls the algorithm, and who takes responsibility when things go wrong aren’t technical questions. They’re values questions.
So he offered a framework worth keeping in your back pocket when evaluating tech:
1. Whose side is it on? Are you owning it or renting it? Are you the product, or are you using the product?
2. Do you have an exit ramp to take what you’ve built with you? Or are you locked in by design?
3. Who’s responsible when it breaks? Does the company take the blame, or do they push it onto you?
These questions matter whether you’re choosing tools or building them.
Build for the Margins
Raffi’s advice for responsible building: “Build for the margins and the center will follow.”
He pointed to the curb-cut effect. Curb cuts were designed for wheelchairs, but they benefit everyone: parents with strollers, delivery workers, travelers with luggage. The same applies to AI. Design for underserved populations, low-connectivity environments, communities at the margins, and you create better tech for everyone. Mozilla lives this.
Pragmatic, Not Purist
Raffi understands the constraints nonprofit builders face with limited resources. His advice: “Don’t be a martyr. Use the best tools you need to do your job now.” But be clear on your values and trade-offs. Always think about exit ramps. Know what compromises you’re making so you can move when better options arrive.
His final message to the room: “Live your values and push them out” — especially when the easy path pulls you away from them.
The values of the people building AI for humanity matter. Choose the world you want to live in, then choose the tech to get there.