When news broke Monday morning that Facebook may have been artificially suppressing ‘conservative’ news stories in its feeds, it was easy to chalk it up to politics-as-usual. But, regardless of the veracity of the claims, the act raises some interesting questions about knowledge sharing in the public realm.
Facebook inhabits an interesting space in our lives. It serves simultaneously as a public forum, a networking platform, and, to an even greater extent, a powerful advertising tool, owing to the sheer number of users and the user data that accompanies them. Perhaps it is incorrect to call it a true knowledge sharing community, but as it has transformed over the years from a simple social networking tool for college students into a global platform, its role as a digital community for knowledge sharing and management has grown.
Facebook As A Knowledge Sharing Platform
Each day, users engage in topics like politics, sports, entertainment, news – any topic we can think of. At the same time, it serves as a question-and-answer forum where users can crowd-source information like cooking tips, advice on local services, or the best places to visit when traveling. Additionally, sub-communities inside the platform enable specialized knowledge sharing through groups devoted to specific topics like programming, businesses, television shows, books, or anything else that attracts even a small number of like-minded individuals. Generally, only a few explicit rules govern how content is filtered on Facebook, aside from the algorithms the company deploys to customize the news feed for each individual user. With that in mind, the possibility of Facebook dictating the kind of news that filters through to users, rather than letting users’ habits determine that content, is troubling.
Filtering Information: Pro or Con?
There are a host of issues that can be addressed regarding this artificial filtering, but we are focusing here on the impact it could have on knowledge sharing. As communication technology has progressed to the point of ubiquity, we depend on these digital communities to serve as tools that facilitate the exchange of information, as well as basic communication. A true knowledge sharing digital community is inherently populist. Regular people use the community to request and share information, and based on the kind of information being shared, or the popularity of that information, it can be filtered and used as needed. Members of the community dictate the nature of the information being shared.
However, most communities need moderators. In that capacity, moderators typically watch for things like hate speech, inappropriate or irrelevant content, or individuals who may be dominating conversation or abusing the community. In most scenarios, what moderators do not do is dictate the type of knowledge being shared by community users. If something seems pertinent and of interest to the digital community, the moderator should not filter that knowledge based on his or her own personal preferences.
Facebook doesn’t necessarily position itself as a knowledge sharing tool, but its users have continued to find ways to push the boundaries of the platform. If it is indeed true that Facebook has been filtering content based on personal preference, rather than algorithmically based on user habits, it raises all sorts of questions about how the company views its role within the community.
If it intends to play the role of moderator beyond its established content posting policies, then the company needs to be very clear about the guidelines it follows for knowledge sharing. And if it intends to impose content filters without informing its users, then we must seriously question the validity of the platform as a knowledge sharing tool.