Yeah, it sounds like a great idea, I’ll give you that. I don’t directly have any issues with anything you said, but let me bring in my perspective, which is the Yudkowskian view, named after Eliezer Yudkowsky: extreme concern about the imminent consequences of superintelligence. Just to put my cards on the table, are you familiar with Eliezer Yudkowsky’s writings?