And they are all converging on a very important idea, which is to bring the compute to where the data is, instead of asking for the data to be extracted to where the compute, the code, lives. Okay? So the idea is not that we simply trust ChatGPT or Gemini or whatever to process all this data, but rather to make a smaller model that can run on the community's infrastructure, where you can verify that it has no connection at all to the outside world while it processes the data. But it can still share its learnings through federation and so on with other sites, learning about things in the data without revealing the private, personally identifying information in the raw data. And so this kind of privacy technology is getting better every day, and I encourage you to check it out.
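The federation idea described above can be sketched in a few lines. This is a hypothetical, minimal illustration (not any specific framework's API): each site "trains" a toy model locally on its own records, and only the resulting parameter, never the raw data, is shared and averaged. The function names and data are invented for the example.

```python
# Minimal sketch of "bring the compute to the data": each site fits a
# tiny model locally, and only the model parameter leaves the site.

def local_fit(records):
    """Train locally: here the 'model' is just the mean of this
    site's values. The raw records never leave the site; only this
    summary parameter (the model update) is shared."""
    return sum(records) / len(records)

def federated_average(site_params, site_sizes):
    """Aggregate per-site parameters, weighted by each site's data size."""
    total = sum(site_sizes)
    return sum(p * n for p, n in zip(site_params, site_sizes)) / total

# Three communities, each holding private data that never moves.
site_data = [
    [2.0, 4.0, 6.0],   # site A
    [10.0, 10.0],      # site B
    [1.0, 3.0],        # site C
]

params = [local_fit(d) for d in site_data]  # computed on each site
sizes = [len(d) for d in site_data]
global_param = federated_average(params, sizes)

print(global_param)  # 36/7 ≈ 5.142857, the mean over all data combined
```

The aggregated result matches what a central model would have learned from the pooled data, yet no site ever exposed its raw records; real systems add secure aggregation and differential privacy on top of this basic pattern.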