
Just to clarify: all of our posts are marked explicit whether or not they include an explicit image, because ours is an adult account and we only want to engage with like-minded users over 18. But this layered system only works if moderators are there to enforce it and weed out users acting maliciously. I think it's reasonable to delete content where there's any doubt (for example, around the potential age of those depicted, violence, gore, etc.), but this one person reported several of our images, including some they obviously have no legal right to claim ownership of, and I hope whoever (eventually) moderates those reports won't go "scorched earth" on them and will instead take a nuanced, proper approach. We're leaving the posts in place to see how long it takes for a moderator to view them and what action they take.

Those posts are still in the alerts list, untouched. The person who flagged them clearly had no ability to claim ownership of images created by a media company and provided for affiliate use; they were a mix of adult images and affiliate posts linking to a blog. Yesterday we had several image posts flagged as "No Permission". We left them in the alerts to see what would happen, and nothing did; they just sat there until we got annoyed with seeing them and deleted them ourselves. Last week an innocent post was flagged as "Illegal", even though it didn't feature an image of a person and had no connection at all to anything illegal in the text.

I think your method for self-categorization of adult content is a good tool to have and the best option I've seen out there, but we've experienced malicious reporting of content with no action taken on it.
