It occurred to me that these AI technologies have been positioned around optimizing toward a reward system. As you speak, Audrey, I find it interesting that we are devoting significant energy to very low-complexity solutions. We are potentially underrepresenting the true relationality and dimensionality of the systems we interact with. Culture, for example, is a massively balanced system. People who do not want to do the patient work of understanding it often seek the power of that system or try to dominate the conversation. Those of us who see what is at risk need a vision for how the reward system, and the development of AI more broadly, can deal with the complexity of the world it operates in without declaring victory too early. Is that fair?