Previously, perpetrators were easier to trace because only criminal organizations could produce such material at scale. Now, synthetic deepfakes can be generated easily: any individual with a laptop or personal computer can mass-produce such photos or videos, making it much harder to tie the material to specific criminal groups. This also undermines the traditional countermeasure of centralizing all detected CSAM (Child Sexual Abuse Material) in a single database, because the volume of newly generated content from so many independent creators outpaces what any one database can catalog.