The company has decided to invite actual users to participate in the app’s policy-making process, starting with the topic of speech policy. What better way to build trust than having your own users take part, right?
While little was known about it at the time, Meta has been keen on running tests like this because it believes they yield insights that its usual workforce can’t provide. It’s a whole new kind of perspective.
A total of three groups were gathered from five different nations to answer one question: how can the company tackle the crisis of climate-change misinformation on the Facebook app?
The question didn’t arise out of the blue. It came as more and more watchdogs put pressure on the organization and scrutinized its actions. There is plenty of misinformation surrounding the environment on the platform, and Meta knows it, but the company is struggling to combat it, and the world is noticing.
Just last year, the Guardian published a report describing how nearly 45,000 articles downplayed the threat of climate change, with some blatantly denying the crisis outright.
On that note, Meta did promise in February that it would do everything it could to curb the matter. But then another watchdog reported that Meta was struggling, labeling only about half of the posts containing misinformation linked to the ongoing climate crisis, as confirmed by NPR.
Meta had to think quickly, so it hired the Behavioural Insights Team (BIT) to let general users of the app come forward and take part in making policies. In particular, users were asked directly what the app should do about its problematic climate content: material that isn’t 100% false but is misleading, relies on poor-quality data, and in the end leads people to false conclusions.
The company has yet to spell out exactly what it means by problematic climate change information, but we can imagine it covers content such as questions over whether climate change is real at all, and if so, why we still have cold weather. Things of that sort.
No other app out there today gives its users this much power. The matter is usually left to the executives in charge and their policy-making teams, who discuss it with experts and human rights groups. But remember, there’s no transparency about what’s going on, and regular users have zero clue about the matter.
This lack of transparency is what has really undermined so many people’s confidence. When you have no idea what’s going on, it’s extremely difficult to blindly trust a policy being laid out: no one knows who made it, why it was put out in the first place, or how it will be enforced.
To get this particular experiment up and running, Meta and the BIT collaborated to bring roughly 250 people on board. These were everyday users of the Facebook app who took part virtually over two weekends. They received some education and training about the climate crisis as well as the platform’s existing policies. They were even given access to outside experts on the subject, and could likewise interact with the company’s own employees.
Facebook then got the chance to put forward a few options for solutions, and the group was allowed to debate which it found most interesting and impactful. Once a consensus was reached, the exercise was over. The firm, however, has not revealed the final decision; after all, it is an internal matter.