Facebook’s Southeast Asia director talks about hate speech and the Myanmar elections

Mizzima Editor Win Naing, left, talks with Facebook's Mr Rafael Frankel, right, about Facebook's outreach in Myanmar. Photo: Mizzima

Mizzima Editor Win Naing recently sat down in Yangon with Rafael Frankel, Facebook’s Director of Public Policy for Southeast Asia, to talk about the social media company’s approach to “policing” content on its platform, particularly in the run-up to the 2020 Myanmar elections.

Please could you tell us about yourself and Facebook’s work in Myanmar?

I have been working in and around Southeast Asia for 20 years now, and I have been to Myanmar many times over the past couple of decades. We are very focused on Myanmar ahead of the 2020 elections and on ensuring the integrity of the election on Facebook, our platform. Top of mind for us in Myanmar is safety: safety of our users, safety of the people of Myanmar. So, I am really happy to share with you today, for the first time publicly, that our proactive detection rate for hate speech on the platform in Myanmar as of Q4 2019, last quarter, is now at 85%. What that means is that we are catching 85% of hate speech posts on Facebook in Myanmar proactively, before any user reports them to us. That is out of a total of around 50,000 pieces of hate speech that we found on the platform in Myanmar in the last quarter.

Now, I want to put that in context. A year ago, when I was at the Myanmar Digital Rights Forum, I talked about how our proactive detection rate was 68% in the fourth quarter of 2018, out of a total of 55,000 pieces of hate speech that we removed. In Q1 2018, our proactive detection rate was below 20%. So, the big picture is that over the course of two years we have made tremendous improvements in proactively detecting hate speech in Myanmar and removing it from the platform, and we also see less hate speech on the platform in Myanmar than we did two years ago. The aggregate result is that it is much less likely that users in Myanmar will see hate speech on Facebook than it was a couple of years ago.

The other thing I want to say is that we are working really closely with communities, the UEC (Union Election Commission), and international election groups in Myanmar to secure the integrity of the election in a number of different ways. With communities, we are really engaged in getting their feedback on what they are seeing on the platform; we held community outreach forums across seven states in Myanmar over the last four months. We worked with the UEC to train them on how to use Facebook to boost public service announcements, so that Myanmar voters and constituents, the people who want to take part in the democratic process, can do so with the best information possible. And we are working with election groups to help train the political parties and candidates on our community standards and on account security, so they know what they can and can’t say on Facebook, and so their accounts are secure and people can’t take them over and put out information that is not correct. We are focused like a laser beam on Myanmar’s 2020 election and are working with a broad set of stakeholders here to ensure that process, because this is a collective effort; no one can do it alone. Facebook is happy to play our part together with the media, civil society, and the Union Election Commission.

You mentioned the safety of users. What do you mean by that?

We don’t want any of our users anywhere in the world, and certainly not in Myanmar, to be exposed to things like hate speech, incitement to violence, bullying, and harassment; that is a lot of where we are focused on content. We are also really focused on making sure that users are experiencing authentic communication, and what I mean by that is that they are not subject to people misleading them about who they are. So, if we see people using the platform while pretending to be someone else, we get those people off the platform. If we see people coordinating in networks to try to manipulate public opinion, we tackle those networks. That’s what we are talking about when we talk about safety.

In Myanmar, the election is coming up this year, 2020. One of our concerns is fake news, which we see as being on a par with hate speech. What measures have you taken or prepared to prevent fake news from appearing on Facebook?

So, we have a global policy on misinformation that we call remove, reduce, and inform. The remove part focuses on the most harmful misinformation on the platform: misinformation that can actually lead to violence or physical harm. That falls under our remove category, and when we find that content on our platform, we remove it. The other type of misinformation we remove is related to voter suppression, which is obviously very applicable to 2020 in Myanmar; any misinformation that tries to keep people away from voting will be removed. The other thing is public health misinformation, especially with everything going on now with COVID-19. We are focused on removing the most harmful rumours or misinformation around that.

Now we talk about reduce. That focuses on the type of misinformation that doesn’t necessarily violate our policies but is still inflammatory, spammy, or really low-quality content. Our focus is to reduce its distribution, so while it might not be totally removed from the platform, it is much less likely that users in Myanmar will see that content.

The final part is inform. That is about providing people with correct information and boosting the content that we want out there on the platform. So we are working with Myanmar media organizations to train them on how to use our platform better, and we are working with organizations like the UEC, as I mentioned, to boost correct information so that everyone can get the information they need in order to participate in the process.

You have said a lot about misinformation. How do you know what is misinformation and what is not? And are you really concerned about freedom of expression when you are dealing with misinformation?

At Facebook, we are always mindful of the balance between freedom of expression and user safety. It’s always on our minds when we develop policies and implement them. Now, we have been really clear that we do not want to be the arbiter of truth in the world. We don’t want to be deciding what’s true or what’s false; we just don’t feel we should have that kind of power or authority. What we do in the case of misinformation is rely on partners in the communities; we check with them to get verification of whether something is true or false. The other thing we do is work with third-party fact-checkers around the world. We have partners in many countries that are checking content, and they are empowered to make absolutely independent decisions to label content on a scale from true to false, with a few ratings in between. That’s how we try to treat misinformation. We are not the ones making the decision.

Now, I will say this: in emergency cases, where we think we have seen a kind of misinformation before and know that it led to harm previously, we can, if necessary, make the decision to remove that content ourselves. Those cases are very few and far between, and they are really focused only on content that can cause the worst harm.

How many cases of fake news or hate speech in Myanmar have you come across year on year?

As I mentioned, around 50,000 pieces of hate speech content were removed in Q4 2019, as opposed to 55,000 the year before that, so we consider the overall trend to be down. We can also see that our proactive detection rate is now all the way up to 85%, as opposed to less than 20% at the beginning of 2018. That has been a huge improvement in our ability to detect that content before users potentially see it. The other thing worth mentioning relates to misinformation. We have a lot of tools we can use to decrease its distribution. So, there are other types of intervention we are running in Myanmar related to inflammatory content that doesn’t really violate our community standards but that we know from experience is the kind of thing people don’t want to see. We have tools that reduce that type of content on the platform, and a few of those are operating now in Myanmar. From what we can see internally, they do appear to make a real difference: reducing the distribution of low-quality content in Myanmar and boosting the higher-quality content people want to see and engage with.

Political parties in Myanmar are using Facebook as a platform in order to be visible to voters. Do you have a limit on how much an individual, a political party, or a candidate can spend on Facebook?

We focus on political ads and transparency. We will be turning on political ads enforcement in Myanmar later this year, and we will announce that in advance. Once that happens, we are going to ensure that political ads in Myanmar are transparent. So, if any political ad runs, you will be able to see who is running it, how much they are spending on the platform, and a number of other pieces of information that we are able to show, so you can see where the content is coming from.