That's a good question, because there is real nuance here! I ran into this same issue while working on similar projects. First off, it's important to understand what your obligation actually is and how data deletion is understood in practice. No one believes it is necessary to permanently remove every copy of anything, any more than it is necessary to prevent every form of plagiarism. No one is complaining that plagiarism is possible at all; we're complaining that major institutions keep doing it in ongoing disregard of the law.
Only maximalists fall into the trap of thinking about the world in a binary sense: either all in or nothing at all.
For most of us, it's about economics and risk profiles. Open source models get trained continuously over time; there won't be just one version. Saying that open source operators have some obligation to curate future training data in good faith has a long-tail impact on how the model evolves. Previously ingested PII or plagiarized data might still exist in old checkpoints, but its value, novelty, and relevance to economic life drop sharply over time. No artist or writer argues that copyright protections need to last forever; they just need livable working conditions and respect for attribution. The same goes for PII: no one claims they must be completely anonymous. They just want cybercrime taken seriously rather than abandoned in favor of one party taking the spoils of their personhood.
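To make "good-faith curation" concrete, here's a minimal sketch of what a filter pass over future training data could look like. Everything here is illustrative, not anyone's actual pipeline: the PII patterns are toy examples, and the takedown fingerprints stand in for whatever robust near-duplicate hashing a real operator would use.

```python
import hashlib
import re

# Hypothetical good-faith curation pass: drop training examples that match
# accepted takedown requests or simple PII patterns before the next
# continual-training round. All patterns and names are illustrative.

PII_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),        # US SSN-shaped strings
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # email addresses
]

TAKEDOWN_FINGERPRINTS = {
    # hashes of passages covered by accepted deletion/takedown requests
    # (placeholder values)
    "9f2c1e...", "b7a04d...",
}

def fingerprint(text: str) -> str:
    """Stand-in for a robust near-duplicate hash (e.g. shingled MinHash)."""
    return hashlib.sha256(text.encode()).hexdigest()

def curate(batch: list[str]) -> list[str]:
    """Keep only examples that pass the current policy checks."""
    kept = []
    for example in batch:
        if any(p.search(example) for p in PII_PATTERNS):
            continue  # demote: never feed flagged PII into the next round
        if fingerprint(example) in TAKEDOWN_FINGERPRINTS:
            continue  # demote: honor accepted takedown requests
        kept.append(example)
    return kept
```

Nothing about this erases old checkpoints; it just ensures the deleted material stops being reinforced, which is exactly the long-tail effect described above.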
Also, yes, there are algorithms that can control how further learning promotes or demotes certain content and connections relative to a given policy. The point isn't that any one policy is perfect; a mere willingness to adopt policies in good faith would go a long way. (Most current LLM filters are intentionally weak, so that those with money paying for API access can outright ignore them, while the vendors turn around and claim it can't be solved, too bad so sad.)
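For a flavor of what "promote or demote relative to a policy" can mean mechanically, here's a hedged sketch of a policy-weighted training step in PyTorch. The `policy_weights` tensor is a hypothetical per-example weight: 0 means fully demoted (contributes no gradient), 1 is normal, above 1 is promoted.

```python
import torch
import torch.nn.functional as F

# Sketch only: one simple way a policy can shape further learning.
# `model`, `inputs`, `labels`, and `policy_weights` are assumed names.

def policy_weighted_step(model, optimizer, inputs, labels, policy_weights):
    optimizer.zero_grad()
    logits = model(inputs)
    # Per-example losses so each example can be scaled independently.
    per_example_loss = F.cross_entropy(logits, labels, reduction="none")
    # Demoted examples (weight 0) contribute no gradient at all;
    # promoted examples (weight > 1) pull the model harder.
    loss = (policy_weights * per_example_loss).mean()
    loss.backward()
    optimizer.step()
    return loss.item()
```

This is deliberately crude; the design choice worth noticing is that the policy lives outside the model and can be tightened over time without retraining from scratch.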
Yes, it is possible to perturb and influence the evolution of a continuously trained neural network based on external policy, and they're carefully lying when they say they can't 100% control it or 100% remove things. Fine. That's not necessary, in either copyright or privacy law. It never has been.
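Perturbing an already-trained model this way is an active research area usually called machine unlearning. One published family of approaches takes gradient *ascent* steps on a "forget set" while anchoring the remaining weights near a snapshot. The sketch below is approximate by design, not a guaranteed 100% erasure, which is exactly the point: approximate is what the law actually asks for. All names and hyperparameters are illustrative.

```python
import torch
import torch.nn.functional as F

# Hedged sketch of approximate unlearning via gradient ascent on a
# forget set, with an L2 anchor toward pre-unlearning weights so the
# rest of the model's behavior stays roughly intact.

def unlearn_step(model, forget_inputs, forget_labels, anchor_params,
                 lr=1e-5, anchor_strength=1.0):
    logits = model(forget_inputs)
    forget_loss = F.cross_entropy(logits, forget_labels)
    # Penalty keeping weights near a snapshot taken before unlearning,
    # so we degrade the forget behavior without wrecking everything else.
    anchor = sum(((p - a) ** 2).sum()
                 for p, a in zip(model.parameters(), anchor_params))
    # Minimizing -forget_loss == ascending on the forget set.
    objective = -forget_loss + anchor_strength * anchor
    model.zero_grad()
    objective.backward()
    with torch.no_grad():
        for p in model.parameters():
            p -= lr * p.grad
    return forget_loss.item()
```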
I am not a lawyer, but you wouldn't be surprised to hear that the law generally asks for reasonable, good-faith effort rather than perfection.
My position is that maximalism and strict binary assumptions won't work on either end, and they don't satisfy what anyone truly wants or needs. If we're not careful about what it actually takes to move the needle, we end up agreeing with them: "it can't be done, so it won't be done."