But of course, Llama 2 isn’t truly open source. It still has some restrictions. As I understand it, there are many research teams in Taiwan looking at alternative foundation models to build upon. For example, recently, Mistral- or Mixtral-based models are faring pretty well. I think with the latest fine-tuning methods, DPO and all that, it no longer takes a supercomputer to align a large foundation model.
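(A minimal sketch of the DPO objective referred to here, written in plain PyTorch rather than any particular library's API; the function and tensor names are placeholders. The point it illustrates is that DPO turns preference alignment into a simple classification-style loss over chosen/rejected response pairs, rather than a full RLHF pipeline, which is why it is far cheaper to run.)

```python
import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps: torch.Tensor,
             policy_rejected_logps: torch.Tensor,
             ref_chosen_logps: torch.Tensor,
             ref_rejected_logps: torch.Tensor,
             beta: float = 0.1) -> torch.Tensor:
    """Direct Preference Optimization loss (Rafailov et al., 2023).

    Each argument is a tensor of shape (batch,) holding the summed
    log-probability of the chosen or rejected response under either the
    policy being trained or the frozen reference model. beta controls how
    far the policy is allowed to drift from the reference.
    """
    # Implicit "rewards": how much more likely the policy makes each
    # response compared with the reference model.
    chosen_rewards = beta * (policy_chosen_logps - ref_chosen_logps)
    rejected_rewards = beta * (policy_rejected_logps - ref_rejected_logps)
    # Push the margin between chosen and rejected responses up, as a
    # logistic-regression-style objective.
    return -F.logsigmoid(chosen_rewards - rejected_rewards).mean()

# Tiny usage example with random numbers standing in for real log-probabilities.
if __name__ == "__main__":
    batch = 4
    loss = dpo_loss(torch.randn(batch), torch.randn(batch),
                    torch.randn(batch), torch.randn(batch))
    print(loss.item())
```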
