Brainstorming: If we get blocked by Google Play again, and if we decide to moderate content, then how should we do it? #6257
Just thinking: how do kids' profiles/parental controls work? How about partially hiding peer review/Explore from them, or showing them only moderated content? And how would we go about detecting them: ask users during the upload process/peer review, or use some automated way of detecting it?
Does it satisfy Google if Explore is an opt-in feature, enabled through settings and turned off by default? The settings approach allows for a clear disclaimer that the content is unmoderated, and defaulting it to off means we are not presenting content to people who aren't making an informed decision about what they are consuming. If there is a way to lock the setting so minors are unable to turn it on, perhaps that would help? I agree that we need to keep the app in the Play Store. Random brainstorming: given that apps like Uber/Lyft ship a driver app and a passenger app that are totally separate from each other (but you can launch either from a link/button in the other), is there a case for splitting "Commons photo upload" from "Commons explore/moderation", so that each can have a different content rating and purpose?
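A minimal sketch of what such a default-off gate might look like, assuming the app stores settings in Android SharedPreferences; the `ExploreGate` helper and both preference keys are hypothetical names for illustration, not existing app code:

```kotlin
import android.content.Context
import androidx.preference.PreferenceManager

// Hypothetical sketch of gating Explore behind a default-off setting.
object ExploreGate {
    private const val KEY_OPT_IN = "explore_opt_in" // assumed key name
    private const val KEY_LOCKED = "explore_locked" // assumed key name

    /** Explore stays hidden unless the user has explicitly opted in. */
    fun isExploreEnabled(context: Context): Boolean {
        val prefs = PreferenceManager.getDefaultSharedPreferences(context)
        // Default is false: users who never visit settings never see
        // unmoderated content.
        return prefs.getBoolean(KEY_OPT_IN, false)
    }

    /** Whether the opt-in toggle may be changed, e.g. locked for minors. */
    fun isToggleLocked(context: Context): Boolean {
        val prefs = PreferenceManager.getDefaultSharedPreferences(context)
        return prefs.getBoolean(KEY_LOCKED, false)
    }
}
```

The UI would then hide the Explore entry point whenever `isExploreEnabled` returns false, and disable the settings toggle when it is locked.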
I can see the app is rated 3+. Since the app contains random user-generated content, especially in the Explore section, maybe we could increase the rating.
Thanks all for the great ideas! It is unfortunately difficult to know what Play considers OK policy-wise, so it is good to explore several options in parallel; we can then apply combinations of them when required. Below is one such option.

Per-category hiding

For each media item that we are about to show in Explore, we could check whether it is "safe" or not, and not display it at all (no thumbnail, not even caption, etc.) if "unsafe".

Tricky part 1: guess what is "safe" and what is not

We could check the media's categories, and consider the media "unsafe" if any of its categories has a root among:

Tricky part 2: filter out false positives

Problem: even the category "Canned food" has category "Aggression" as a parent, via a chain of relationships that individually make sense. So, even though we do not need to worry too much about false positives, if we do not filter at all then almost everything will be considered "unsafe".

Implementation

We can perform the category-parents check either in real time on the device (see the sketch after this comment), or prepare a list of "unsafe" categories ahead of time. A first attempt with PetScan:

In any case, that's a lot of manual work. If anyone is interested in doing it, you are very welcome to start (and post your results here) so that the data is ready if we suddenly need to implement per-category hiding in the future. It is non-dev work that really helps the app. Thanks a lot!
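To make the real-time option concrete, here is a minimal Kotlin sketch of a depth-limited walk up the category tree. It assumes a curated `UNSAFE_ROOTS` set (the two example entries are placeholders, not the actual list, which would come from the manual work described above) and a caller-supplied `fetchParentCategories` helper, which could be backed by the Commons MediaWiki API (`action=query&prop=categories`). The depth cap is one simple way to implement the false-positive filter from "Tricky part 2": a long chain like "Canned food" eventually reaching "Aggression" is never walked far enough to trigger a match. This is a sketch under those assumptions, not a definitive implementation.

```kotlin
// Placeholder for the hand-curated list of unsafe root categories.
val UNSAFE_ROOTS = setOf("Category:Violence", "Category:Nudity") // examples only

// Maximum number of parent-category hops; deeper walks catch more true
// positives but also more "Canned food"-style false positives.
const val MAX_DEPTH = 3

/**
 * Returns true if any of the media's categories reaches an unsafe root
 * within MAX_DEPTH hops. fetchParentCategories is assumed to return the
 * parent categories of a category, e.g. via the Commons API.
 */
fun isUnsafe(
    categories: List<String>,
    fetchParentCategories: (String) -> List<String>,
): Boolean {
    val visited = mutableSetOf<String>()
    var frontier = categories.toSet()
    repeat(MAX_DEPTH) {
        if (frontier.any { it in UNSAFE_ROOTS }) return true
        visited += frontier
        // Breadth-first step: replace the frontier with all unvisited parents.
        frontier = frontier
            .flatMap(fetchParentCategories)
            .filter { it !in visited }
            .toSet()
        if (frontier.isEmpty()) return false // ran out of parents: safe
    }
    return frontier.any { it in UNSAFE_ROOTS }
}
```

The same function also covers the precomputed-list variant: run it offline over all categories once, store the resulting "unsafe" set, and ship that to the device instead of doing network calls at display time.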
We already got blocked in August 2022 and February 2025, and this could happen again at any time.
Simple solutions would be to either remove the Explore feature entirely or leave Google Play altogether; neither is really satisfying.
The solution that has worked both times was simply to appeal and try again, hoping for the best. But we cannot take for granted that it will work again next time.
So, let's brainstorm now about the feasibility of "moderating" ("censoring"?) content, so that if it ever becomes needed, we already know what is possible and what is not.
In line with Wikimedia's approach, we will not moderate unless it is strictly necessary.