This was just a matter of time - and there isn't really much those affected can do (or, in some cases, should do). Shutting down that service is the correct thing - but that'll only buy a short amount of time: training custom models is trivial nowadays, and both the skill and the hardware to do so are within reach of the age group in question.
So in the long term we'll see this shift to images generated at home, by kids often too young to be prosecuted - and you won't be able to stop that unless you start outlawing most AI image generation tools.
At least in Germany, the law on child/youth pornography was badly botched by incompetent populists in the government: it would send any of those parents to jail for at least a year if they took possession of one of those generated pictures. Having such a picture sent to their phone and then going to the police to file a complaint would be enough to get a prosecution started against them.
There's one blessing coming out of that mess, though: for girls who did take real pictures and had them leaked, saying "they're AI generated" is becoming a plausible way out.