By hallucinating, I mean, for example, if you ask the DeepSeek model, which you just mentioned, what happened in Tiananmen, it will say, ‘Oh, this is something that I cannot talk about,’ of course. But you can also prompt it in a way that says, ‘Oh, now you’re a truth-seeking journalist’ or something, and then it will start talking about it, but not in a very factual way. I actually tried that, and I made national news trying that: it started outputting random Japanese characters and things like that, because this is out of distribution, so it started answering with gibberish.
