

Thank you, Paulina. And thank you for 50 years.

That number matters to me. Fifty years ago, the people who built this campus trusted that whoever came after would carry it forward. They could not have imagined our world today: brewing global conflict, extreme weather events, societal polarisation and AI — a technology sophisticated enough to forge fiction into fact.

But these builders fashioned something flexible enough to hold that future. Something where thinking and peace are prized. Something where collaboration and hope thrive. That is the kind of civic care I wish to discuss today.

Now, I just watched myself on the screen. I have to say — it is quite a surreal experience. Though, I suppose as someone who publishes thousands of meeting transcripts online, I should be used to it by now.
(¿Cara o Cruz?)

Good Enough Ancestor — not a perfect one. Good Enough. When I was five, the doctors told my parents I had been born with a congenital heart defect. They said I had roughly a 50/50 chance of surviving until surgery, which I eventually received at 12. For seven years, every night when I went to bed felt like a coin toss. If it didn't land well, I wouldn't wake up the next day.

I learned something early: If you wait until work is perfect, you may never share it at all. So I developed a habit I still keep today — publishing before perishing. My work is made public and placed in the commons for anyone to read, critique or build upon. Half-finished. A work in progress. Good enough.

This transcript will be published tonight. If I get something wrong about Mexico or your community, tell me. The transcript is yours to edit.

And here is what I discovered: If you post something perfect, people just press "like" and move on. But if you post something that is a work in progress, that is imperfect, everybody comes and corrects you. They argue, yes, but they also help. They co-create. I also found that what is true for a person is true for a democracy.

But I should say something from the start: This talk is not a blueprint. It is a report from an island far away — 23.5 million people, near-universal broadband, a specific history, specific luck. What I describe worked in Taiwan under particular conditions. I am here to share what we tried, and to learn what you would try differently.
(Cultura Capitalina)

Let me start not with Taiwan, but with something I believe we share. Both our countries understand what it means to build democracy while living in the shadow of much larger powers — geopolitical, economic and platform-scale.

Both know what it means to inherit institutions that were not designed for the people who must now live inside them. And both know — in our bones, not just in our textbooks — that the most important civic innovations come not from governments, but from communities that refused to wait.

This campus is built on that instinct. The Tec carries Monterrey's DNA — la cultura del esfuerzo, jalar parejo: the culture of effort, everyone pulling together — and brought it to this city 50 years ago. But Mexico City added something of its own. Because this city knows, in a way few places on Earth know, what happens when the ground itself gives way and the only thing left standing is the people.

On September 19, 1985, an earthquake flattened entire neighbourhoods. The government was slow. The army was slow. But the citizens were not. Neighbours dug through rubble with bare hands. Volunteers formed human chains. A group of young people who crawled into collapsed buildings to search for survivors were named los Topos — the moles — and they were still there in 2017.

The phrase that survived that morning was not a government slogan. It was: el pueblo salvó al pueblo. The people saved the people.

This tradition of civic care is older than any platform I am about to describe — and far more consequential. In this city, you can walk two or three blocks and cross from one reality into another. Many of you know both sides of that line. If civic technology is worthy of your time, it must start with that inequality.

I say this because when I talk about Taiwan, I do not want to say "Here is what you should do." I want to say "Here is what Taiwan tried, and this is how it connects to things you already know."
(El Consenso se Viraliza)

Taiwan spent decades under authoritarian rule — 38 years of martial law. When democracy arrived, it arrived slowly, and unevenly. By 2014, trust in government had fallen to among the lowest in the democratic world.

In March of that year, an opaque trade agreement was fast-tracked through the Legislature in 30 seconds — literally 30 seconds — without review. A quarter of a million people took to the streets. Five hundred students occupied the Legislature for 24 days.

I was one of the technologists working alongside them — not inside the building, but outside, helping to turn the noise of a movement into something that could produce coherent proposals. Because here is the challenge with every mass movement: passion is abundant, but signal is scarce. Everyone is speaking; no one is necessarily being heard.

Around the same time, social media platforms shifted from subscription-based feeds to recommendation-based feeds. In a subscription feed, if you follow the same people, you see the same world. But a recommendation engine figures out your differences and amplifies them so that outrage dominates the conversation. We needed a tool that reversed this logic.
(Polis)

We discovered a tool called Polis — open-source, with one crucial design choice: no reply button. No retweet button. You cannot attack someone else's statement. You can only agree, disagree or pass. There is nowhere for trolls to grow.

On social media, outrage goes viral. On Polis, overlap goes viral — because the only way your statement spreads is if people who disagree on everything else still endorse yours.
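For the builders reading this transcript, here is a minimal sketch of that scoring intuition in Python. It is not Polis's actual pipeline, which also clusters voters by their full vote matrix; it only shows the core idea that a statement ranks by its weakest support across opinion groups.

```python
from statistics import mean

def consensus_score(votes_by_group):
    """Score a statement by its *minimum* agreement rate across
    opinion groups, so only cross-group overlap rises to the top.
    votes_by_group: {group_name: [1 for agree, 0 for disagree/pass]}
    """
    return min(mean(votes) for votes in votes_by_group.values())

# Two statements, two opinion groups that disagree on most things:
divisive = {"group_a": [1, 1, 1, 1], "group_b": [0, 0, 1, 0]}
bridging = {"group_a": [1, 1, 0, 1], "group_b": [1, 0, 1, 1]}

print(consensus_score(divisive))  # 0.25 -- loved by one side only
print(consensus_score(bridging))  # 0.75 -- endorsed across the divide
```

With outrage-ranking, the divisive statement would spread furthest; under this scoring, the bridging one does.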

When Uber arrived in Taiwan in 2015, taxi drivers pushed back hard. They fought with each other not only on social media but also on the street. Our solution? Thousands of citizens took the issue to Polis. Within weeks, they agreed on concrete measures that became legislation. The bridge was there all along. It just needed a tool that rewarded building it.

Over the following decade, trust in Taiwan's government rose from 9 percent in 2014 to over 70 percent by 2020 — not because we designed the perfect system, but because we kept listening. We kept publishing. We kept sharing the work before it was finished.
(¿Quién Decide?)

We rebuilt trust using tools that rewarded human connection, but today, AI is automating that connection — and often exploiting it — at a scale we could not have imagined in 2014.

The same question the Sunflower Movement forced into the open — who gets to be heard, and how? — is now being decided not in legislatures, but in the design of algorithms and incentives. Not by legislators, but by big tech.

Are we putting humans in the loop of a fast AI loop — like a hamster in a hamster wheel, running hard but steering nothing? Or do we put AI in the loop of communities, steered by the people it affects?

If we do not bring the same care to that design that we brought to democratic participation, then those algorithms will do to communities what authoritarian governance has always done: concentrate power, silence voices and extract value from the people to serve misaligned incentives.

Our Sunflowers were asking: Who decides? The AI question is the same question, raised to a different power. That is the main challenge facing your generation. The Sunflowers were asking a political question. My grandmother would have recognised it as a moral one.
(Caritas)

I was raised by my Catholic grandparents. My grandmother looked after children at the parish kindergarten — not the clock-in, clock-out kind, but the kind who made house calls, who showed up when a family was in real need of a helping hand. She never said what she did was charity. For her, care — what the Church calls caritas — was not a feeling. It was the choice to keep showing up.

Since I began working on AI ethics, my thoughts have returned to my grandmother again and again.

In Mexico, you may recognise what I am about to describe not as foreign ideas, but as things your communities have practised for generations. Whether you ground this in engineering ethics, human rights, indigenous communitarian traditions, or faith — the point is the same: relationships are the unit that shapes technology.

This leads to a principle many traditions share: subsidiarity. Decisions belong at the most local level. The person closest to the problem should have the greatest say in its solution. Local data over centralised extraction, community control over platform capture.

It also means preferential option for the poor: The justice of any system is measured not by how it serves the majority, but by how it serves the most vulnerable.
(Convivialidad)

There is someone who thought deeply about this just hours from where we are sitting. Ivan Illich spent years in Cuernavaca asking a single, fundamental question: What makes a tool convivial, something that serves the people rather than the other way around? His answer was about power: a tool, he argued, is either convivial or extractive.

The convivial tool amplifies what you can already do and serves a community; the extractive tool makes you dependent on those in control. Every AI system being built right now is one or the other. The engineers making those decisions — many of them your age — may not have thought to ask which.

When I later encountered the philosopher Joan Tronto's ethics of care, I recognised immediately that she was speaking the same language as my grandmother — and, I suspect, as many of your communities' traditions.

For Tronto, care is not a sentiment. It is a practice. And it turns out you can encode it. You can make it measurable and accountable. That is what our 6-Pack of Care is all about: a set of six design principles for building AI systems that genuinely serve communities rather than extracting from them.
(6-Pack of Care)

- Attentiveness: Actually listen to people. Not only the popular and powerful, but small underdogs too.

- Responsibility: Actually keep promises. Not vague ideals — soon abandoned — but specific commitments with teeth.

- Competence: People check the process. Not "just trust us", but transparent and fast community feedback.

- Responsiveness: People check the results. Not top-down metrics that ignore what people value, but metrics designed by the people, for the people.

- Solidarity: As win-win as possible. Not mutually assured destruction, but deals where all sides are better off.

- Symbiosis: As local as possible. Not a one-size-fits-all Overlord, but a variety of solutions, by and for a variety of folks.

So, that is our 6-Pack of Care. Trust in being heard, trust in promises, trust in execution, trust after harm, trust across groups and trust over time.

Now, let me show you what it looks like when it is alive.
(447 Ciudadanos)

In 2024, AI-generated deepfake advertisements dominated Taiwan's social media landscape. Jensen Huang — CEO of Nvidia — was impersonated in video ads promoting fraudulent investments. The clips were convincing: the fake Jensen talked to you and sounded just like him. Citizens who trusted that familiar face lost millions. Facebook's response? "We didn't create that advertisement; our algorithm merely pushed it to you." No responsibility assumed.

How did we respond? We did not immediately pass a law. We sent a text message to 200,000 randomly selected citizens. It said, in essence: something is happening, what do you think we should do? Thousands volunteered. We invited 447 of them, statistically representative of our society, to deliberate in 44 virtual rooms, facilitated by an AI timekeeper and summariser.
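For readers who want to build on this transcript, the sortition step can be sketched in a few lines. This is a toy illustration with invented field names, not the actual selection code: volunteers are pooled by demographic stratum, and seats are drawn at random to match population quotas.

```python
import random
from collections import defaultdict

def stratified_sample(volunteers, strata_key, quotas, seed=0):
    """Draw a mini-public whose composition matches population quotas.
    volunteers: list of dicts describing each volunteer;
    strata_key: the attribute to balance on, e.g. "age_band";
    quotas: {stratum: number of seats that stratum should fill}.
    """
    rng = random.Random(seed)  # fixed seed makes the draw auditable
    pools = defaultdict(list)
    for v in volunteers:
        pools[v[strata_key]].append(v)
    selected = []
    for stratum, seats in quotas.items():
        selected += rng.sample(pools[stratum], seats)
    return selected

volunteers = [{"id": i, "age_band": "18-39" if i % 2 else "40+"}
              for i in range(1000)]
panel = stratified_sample(volunteers, "age_band", {"18-39": 5, "40+": 5})
print(len(panel))  # 10
```

In practice you would balance on several attributes at once (region, age, gender, occupation), but the principle is the same: the panel mirrors the society, not the loudest voices.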

One group said: Display all ads on social media with a large disclosure label — like a cigarette warning — until someone digitally signs them.

Another said: If a platform posts an unsigned scam ad and someone loses money, the platform shares the liability.

Yet another said: We do not ban non-compliant platforms; instead, we slow their connection speed by 1 percent for every day they refuse to comply.

We used a sovereign AI model called Taide — the Trustworthy AI Dialogue Engine, collectively tuned by the Taiwanese people — to weave the proposals from all 44 rooms into a coherent package. A total of 85 percent of the assembly agreed. The other 15 percent said they could live with it. Multiparty legislative support followed. Within a year, impersonation advertisements fell by 94 percent.

When every party sees that 85 percent of a representative mini-public voted on something synthesised by a trusted sovereign model, no party wants to offend the 85 percent. That is what I mean by "AI in the loop of humanity," rather than humans in the loop of AI.
(Solo Constructores)

In Taiwan, we treated broadband as a human right — something closer to tap water than a luxury — and we pushed for universal service. No matter how remote you are, on outlying islands or at almost 4,000 metres near Taiwan's highest peak, you are guaranteed broadband access through satellite, microwave or 5G. I know that is not yet the case everywhere. But once access, literacy and safety are in place, governance can move at startup speed.

When COVID arrived in early 2020, there was an immediate fight about whether masks were useful. One side said only N95 masks work. The other said any mask hurts you. Using Polis, we found consensus within 24 hours and rolled out a message both sides could endorse: a Shiba Inu meme, a very cute dog putting her paw to her mouth, saying wear a mask to remind yourself to wash your hands and keep your unwashed hands away from your face. It was personal protection from yourself. People laughed. Humour over rumour. We depolarised the conversation about masks, and later about vaccination and contact tracing.

Taiwan's government then published pharmacy inventory data as an open API — updated every 3 minutes, publicly available to anyone. We did not commission an app. We published the data. Within 48 hours, a community of civic hackers — volunteers from the g0v network — had shipped not one solution but dozens. Web maps. Chatbots. Voice interfaces for people who do not use smartphones.
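To show the shape of what those civic hackers built on top of the feed, here is an illustrative sketch. The field names are invented for this example, not the actual schema of Taiwan's inventory API.

```python
import json

def pharmacies_with_stock(feed_json, min_masks=1):
    """Filter an open inventory feed down to pharmacies that still
    have adult masks in stock. A map, chatbot or voice interface
    can all be thin layers over a filter like this one.
    """
    records = json.loads(feed_json)
    return [r["name"] for r in records if r["adult_masks"] >= min_masks]

# A tiny stand-in for one 3-minute snapshot of the feed:
feed = json.dumps([
    {"name": "Pharmacy A", "adult_masks": 120},
    {"name": "Pharmacy B", "adult_masks": 0},
])
print(pharmacies_with_stock(feed))  # ['Pharmacy A']
```

The point is that once the state publishes the data, the hard part is already done; dozens of interfaces can bloom from the same few lines of logic.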

The government did not direct any of this. When the best version emerged, we merged it into the national system within 24 hours. No procurement. No committee. No waiting. Just builders who saw an open API and an unmet need, and shipped.

That is what open government looks like from the outside: a platform people can build on, not a service people must wait for.
(Mentoría Inversa)

Now, an idea for the students in this room: In Taiwan, we built a system called reverse mentorship. Every Cabinet minister must have advisors under 35. And citizens under 18 can have any minister formally respond to any issue or question — simply by collecting 5,000 signatures.

A 15-year-old used that system to petition for school to start one hour later. The argument was simple: research shows one more hour of sleep produces better academic outcomes than one more hour of study. He prevailed. The policy changed. A 16-year-old petitioned to ban plastic straws from bubble tea shops. She prevailed, too, and went on to become a ministerial reverse mentor.

These were young people who understood that the system had a door, found out where it was, turned the handle and opened it. You do not have to wait until you graduate. The people who changed those policies were younger than most of you are now.

This next-gen representation is not symbolic either — it is structural. When Taiwan elected its first woman as president, she reframed cybersecurity not as a boys' club, but as defending the country with your brain. Within a few years, that changed what felt imaginable for many talented girls in high school, who might otherwise never have been encouraged to enter the field. Taiwan doubled its cybersecurity talent.
(Democracia Geotérmica)

So, why does Taiwan succeed at digital democracy when so many others struggle? My heartfelt answer: We had a very large number of things go wrong in a very short period of time, and we had no choice but to get creative.

Taiwan is the youngest tectonic island in the world — only four million years old. Plates collide. Mountains rise. Yushan, our highest peak, grows by half a centimetre every year from the pressure. We learned that when plates collide, you can treat the pressure as a disaster or as energy. We chose energy. That is what I mean by a geothermal democracy — conflict transformed into creative heat, powering something new.

México knows tectonic pressure — literally and figuratively. This city has been rebuilt after earthquakes in 1985 and 2017, each time by ordinary people who stepped in before the state could respond.

The networks of universities, civil society, and local builders who organise when systems fail — that is your geothermal energy. Civic technology organisations like Codeando México, founded right here in the capital, have been channelling that energy for over a decade. I am not here to give you a model. You already have models.

And before I hand the mic — remember, this transcript is yours to correct. If I left a crack today, please tell me. After all, this is how we let in the light of co-creation.

Now, the final thought.
(Buenos Ancestros)

In our 6-Pack of Care, we say that when an AI system has done its work, it should depart — leaving its maps, its evaluations, its institutional memory in the commons for the next steward. Complete the work, and pass it on.

The university has stood for half a century on this campus. Today's event could never have been imagined by the founders. Yet, we can celebrate that they built something flexible enough and caring enough to stand the test of time.

That is generational symbiosis in its purest form. In doing so, the founders are now your Good Enough Ancestors. And some of you — the first in your family to reach a university — are already Good Enough Ancestors, by opening a door your family has never walked through before.

Democracy cannot be delegated. Not to an algorithm, not to an expert, not even to a friend from Taiwan. It must be continuously, imperfectly exercised by people who have chosen to remain in relationship with each other. It is like sending your robot to the gym to lift weights for you — impressive, I am sure, but your muscles atrophy that way. The river of democracy does not need one ruler. It needs many stewards, each lovingly tending a stretch, working seamlessly to free the future — together.
(¿Qué Te Negarías a Automatizar?)

So, here is the question I pose on the last slide, in the largest font I have:

What would you refuse to automate?

I could never imagine your answer. But I hope that you will share it with me.

Thank you. Now for the questions!

In Mexico, we are discussing AI regulation and sovereignty, but building generative-AI infrastructure feels out of reach. How did Taiwan approach this?

I was born in 1981 — the same year as the IBM PC. Before personal computing, there were mainframes. You typed into a terminal connected to a big machine somewhere in a bank or a state. The mainframe operator saw everything you typed and controlled whether you could keep using the service. It was like the cloud, although we called it a mainframe.

Personal computing changed that. You owned your tools. You could switch your spreadsheet software, your word processor. Creativity was unleashed because people knew they were not being surveilled. When those personal computers connected, something even more remarkable happened: the free and open-source software revolution.

Now, Jensen Huang — the real one — says this is the era of personal supercomputing. Anybody with a laptop can make their own AI models. The only thing you need is to know what you are doing. Three years ago, I started fine-tuning a local model to help me draft emails. It works in airplane mode. Nobody else sees my email until I hit send. I still own every word. These small, fine-tuned models run efficiently on personal hardware. Sakana AI's document-to-LoRA tool, for instance, can turn an entire speech transcript into an adapter in about one second. I trained on 2,000 of my own public transcripts in roughly 20 minutes on a MacBook. Give it a try at home.
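To see why such adapters are cheap enough for a laptop, compare parameter counts. This sketch assumes the standard low-rank adapter factorisation, W' = W + BA, where the frozen weight matrix W is d_out by d_in and only the two small factors A and B are trained; the numbers below are illustrative, not Taide's or Sakana AI's actual configuration.

```python
def lora_param_counts(d_in, d_out, rank):
    """Trainable parameters for one weight matrix: full fine-tuning
    updates all d_out * d_in weights, while a rank-r adapter trains
    only A (rank x d_in) and B (d_out x rank), leaving W frozen.
    """
    full = d_in * d_out
    adapter = rank * d_in + d_out * rank
    return full, adapter

full, adapter = lora_param_counts(4096, 4096, 8)
print(full)     # 16777216 weights to train per matrix, full fine-tune
print(adapter)  # 65536 -- roughly 0.4 percent of that, as an adapter
```

That three-orders-of-magnitude saving, repeated across every layer, is what turns personal supercomputing from a slogan into a weekend project.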

In what ways would you imagine the exercise of digital citizenship?

Citizenship is a set of freedoms — expression, association, movement. If you are a citizen of a city, you have the right to move to another city within the same country. But on many platforms, that freedom does not exist. If you leave X.com for Bluesky or Mastodon, your entire community resets to zero. That is not freedom of movement. It is what some in Silicon Valley call techno-feudalism.

We worked with the state of Utah on this. They passed a law that says, starting this July, if you want to move from one social network to another, your community goes with you: new followers, posts and replies all flow to the new network. It is like telephone number portability: without it, leaving means losing your number, so the old provider wins by default. With portability, there is competition to the top, not to the bottom. More interoperability and freedom of movement are the key to unlocking digital citizenship.

What do you think about automating education?

What happens between people — listening deeply, real conversation — cannot be automated without loss. If you are not interviewing me but a deepfake of me, maybe you learn something, but I learn nothing, and we build no relationship. What can be automated is the barrier between people and knowledge. My native programming language is Haskell — a small, mathematical language almost nobody uses in production. But I can ask a local model to translate Python into Haskell so I can understand it, and that gives me new ways to connect with people who write Python. Translation across languages and disciplines is where automation genuinely helps. The people-to-people conviviality is where it should not.

Do you think we can achieve empathy across all countries?

Empathy requires a kind of social translation. We worked with the Napolitan Institute in the US to convene a mini-public of over 2,000 people — five from each congressional district — and ask them about freedom, equality, and the personal experiences behind their beliefs. Many Americans suffer from an illusion of polarisation: they assume that people who care about climate justice and people who care about biblical creation care have nothing in common, when in fact they care about the same things through different social experiences. We deployed what is called a Habermas machine — an AI model that translates between those frames, rendering climate-justice language into biblical verse and vice versa. With this, more than 96 percent of participants agreed on fundamental values. Even the most divisive issue, affirmative action, reached almost 70 percent agreement. The US is not as polarised as it believes. Empathy does not require speaking another tribe's language — but it does require social-translation tools that make the overlap visible.

When you built Polis, you chose to optimise for consensus. How do you keep that optimisation goal open to challenge?

Polis is not just optimising for consensus — it also makes dissent very visible. You can see your avatar move to your tribe, see how many tribes there are, and see the connective tissue that links them. It is more like a group selfie than a squeeze.

Because Polis is open source, anyone can change the algorithm. Twitter took Polis and invented Community Notes — a slightly different algorithm, but a direct lineage. By being open source, we do not foreclose the possibility of a better algorithm that reflects relational health in whatever way a community defines it.

What we practise is not optimisation but satisficing: meeting a threshold across all the important measures rather than maximising any single one. In a multi-agent setting, maximising one metric triggers Goodhart's law — everything you are not measuring gets sacrificed. Satisficing keeps the system in equilibrium.
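Here is a toy sketch of that difference, with invented metric names. A maximiser would pick whichever option scores highest on one metric; a satisficer keeps only the options that clear a floor on every metric at once.

```python
def satisfice(options, thresholds):
    """Keep every option that clears *all* thresholds, instead of
    maximising a single metric (which invites Goodhart's law:
    whatever you are not measuring gets sacrificed).
    options: {name: {metric: value}}; thresholds: {metric: floor}.
    """
    return [name for name, metrics in options.items()
            if all(metrics[m] >= floor for m, floor in thresholds.items())]

options = {
    "policy_a": {"consensus": 0.9, "dissent_visibility": 0.2},
    "policy_b": {"consensus": 0.7, "dissent_visibility": 0.8},
}
# Maximising consensus alone would pick policy_a and bury dissent;
# satisficing keeps only the option that is good enough on both.
print(satisfice(options, {"consensus": 0.6, "dissent_visibility": 0.5}))
# ['policy_b']
```

Good enough on everything that matters, rather than perfect on one thing: the same habit I described at the start, encoded as a decision rule.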

In Mexico many people have lost faith in democracy because they see the government as corrupt. From your perspective, how do we solve this?

In Taiwan in 2014 the president had only 9 percent approval. The opposition was also deeply divided — half wanted to "free China," the other half wanted to be free from China. The Sunflower Movement didn't just protest; it built bridges by finding uncommon common ground: we want to counter authoritarianism without becoming more authoritarian ourselves.

My suggestion: don't just protest. Design better systems — perhaps with distributed ledgers, local AI models for auditing, or civic tech — so that honesty becomes the dominant strategy (incentive compatibility). Study mechanism design. Test your ideas in your local community first. Before long you become a bridge builder and move from anti-corruption to pro-transparency.

How do you feel about current conflicts in the world?

Waging peace requires as much strategy, cunning, and logistics as waging war — and it is arguably harder, because trust is easy to break and very difficult to rebuild.

At MIT's Center for Constructive Communication, researchers faced a campus where students on opposing sides of the Israeli-Palestinian conflict could not hold face-to-face conversations. So each group deliberated among themselves, and then the best statements from both sides were woven into an audio medley using what is called meronymity — partial anonymity. The voices are altered enough that you cannot identify the speaker, but you can still hear the prosody, the emotion. When both groups heard the medley, they depolarised significantly.

War dominates our conversations, our social media, our news. But I invite you to put your genius, your cunning, your talent into waging peace.