Creators of AI image models for porn and celebrities are running out of easy hosting options as Civitai and Tensor.Art change their policies under pressure.
can anyone speak to why payment processors care about AI porn at all? With the duopoly of PayPal and Stripe, I’m not totally clear why the payment processors think that AI Porn will impact their bottom line in the slightest. If people take issue with the payment processors’ implicit approval of these practices, what are they gonna do? It’s not like there are any viable alternatives.
They don't wanna be sued. CC companies hate having any sex work or porn paid for with their systems. Puritan shit, sex work hate, and throwing the baby out with the bathwater trying to avoid being a processor for CSAM, which does NOT simply go hand in hand with sex work or porn, but they think it does.
They might actually just care about the moral issues involved (or at least be worried enough about pushback to fake it).
They're going to make a river of money regardless, so maybe it's not worth taking a reputational hit or risking some kind of legislation just to preserve the tiny fraction of their revenue stream that is deepfake-porn based.
That makes sense, actually. I'm guessing they have been sued over similar stuff in the past, and as you said, the small revenue stream isn't worth the cost in lawsuits.