So even today, heading into 2026, you still think that the risk of extinction from AI should be a global priority alongside pandemics, nuclear war, and other societal-scale risks? You stand by that statement, correct?
