In recent times, Facebook has been widely criticized for censoring content that its policies deem inappropriate but that, in the context in which it was shared, has a completely different meaning.
Mark Zuckerberg alluded to this issue in a long manifesto he shared, touching on the Community Standards: cultural norms are changing, cultures differ around the world, and people are sensitive to different things.
With this and all the other factors in mind, Facebook is considering a system that combines the potential of AI with input from users, giving each person the freedom to set the parameters for the content they see and share (objectionable or not), rather than being governed by a single universal standard.
This does not mean there would be no content policies at all; the new system would simply offer some room to customize these filters. For example, under this system a photo of a mother breastfeeding her baby would not be automatically censored; whether it appears would depend on the settings each user has chosen for that kind of content.
Alongside this, Facebook will continue working to optimize its AI to detect and remove clearly inappropriate content and thus protect the community. And, of course, it will continue to block content that violates the law.
At the moment this is only an idea, and many questions remain about how it would work. We will see whether Facebook ends up implementing it, and how it manages the wide range of views that this new model of community standards is likely to provoke.