In the aftermath of the storming of the U.S. Capitol on January 6, much has been written about the role of social media platforms Facebook, Twitter and YouTube in amplifying misinformation about U.S. election results and the violence that followed.
I was struck by the words of Sarah Miller, director of the American Economic Liberties Project and a member of Joe Biden’s transition team, in an interview with the BBC later that month.
She said, “Facebook is widely regarded as the biggest villain among all the tech monopolies.”
A tweet from Biden’s deputy communications director Bill Russo made the point: “If you thought disinformation on Facebook was a problem during our election, just wait until you see how it is shredding the fabric of our democracy in the days after it.”
Two months later, Facebook CEO Mark Zuckerberg appeared alongside the CEOs of Twitter and Google before the US House subcommittees on communications and technology, and on consumer protection. There was rare consensus among lawmakers on both sides that social media companies needed much stricter regulation. An amendment to Section 230 of the US Communications Decency Act gained ground. The section gives platforms the ability to moderate published content and protects them from liability for what users share or post. Without the protection of Section 230, companies could be held accountable for anything their users post.
The words of Mike Doyle, chairman of the House Subcommittee on Communications and Technology, reflected the views of experts in the new and ever-evolving field of countering disinformation:
“You can take this content down. You can reduce its visibility. You can fix this. But you choose not to. You have the means. But time after time, you pick engagement and profit over the health and safety of your users.”
And it’s not just in Washington DC that Facebook has come under fire.
It is in this climate, characterized by acrimonious relations with governments and lawmakers around the world, that I asked my fellow members of the Communications and Digital Technologies Committee to invite Facebook to Parliament. It would be a forward-looking gesture, opening the door to constructive engagement in the hope of encouraging Facebook to come forward with a South Africa-focused plan to tackle disinformation, and to explain the steps it would take to limit the collection of private data.
I considered it particularly important to raise the point that Facebook should provide content moderators covering all 11 official languages ahead of this year’s local elections. Facebook relies heavily on artificial intelligence (AI) to moderate content, and AI is often unable to detect nuance in messages, least of all in South African languages other than English.
Less than an hour after the committee agreed to invite Facebook, I received a phone call from a man who told me he belonged to a company lobbying for Facebook. I was irritated, and told him I did not appreciate Facebook thinking it could hire lobbyists to speak on its behalf. If Facebook had anything to say to me, it would have to contact me directly. He assured me that Facebook would appear before the committee and that he would be in contact.
The following weeks were tense. I was worried that Facebook would pull out.
In the meantime, I was interviewed by Nick Cowen of stuff.co.za about why I had requested the Facebook meeting. I mentioned that I would like to meet Facebook executives to assure them that the purpose of the meeting was to build a constructive relationship, not to conduct an interrogation.
A few days later, I received a terse email from Nomonde Gongxeka-Seopa, Facebook’s Public Policy Officer for Southern Africa, refusing any engagement. Gongxeka-Seopa had previously served on the Council of the Independent Communications Authority of South Africa, and I had crossed paths with her on several occasions. I knew then that the Facebook ship had sailed.
A few days before the Facebook meeting, scheduled for May 25, I received a phone call from the committee chair, Boyce Maneli, informing me that Facebook had withdrawn. Facebook’s concern, he told me, was that the committee had not invited other tech giants to provide similar information. My heart sank. But, he added, Google Africa had responded positively to the committee’s request and had even submitted its presentation, which was shared with Facebook to allay its concerns.
The next day, despite being informed of the presence of Google, Facebook still refused to appear before the committee.
This week, Facebook released a statement denying that it had refused to appear before the committee, claiming it had only “postponed” the meeting because the committee had not invited other tech giants. This is a blatant lie. Facebook had been informed of Google’s participation and, despite this knowledge, still withdrew.
Facebook’s snub of Parliament, and by implication of the people of South Africa and the greater continent, is an act of unprovoked self-harm, to borrow Prince Mashele’s famous description. Perhaps with the impending international backlash it would face for its contempt for the South African Parliament in mind, Facebook issued a press release “clarifying its position”. A day late and a dollar short.
The decision to withdraw fuels the narrative that Facebook considers itself beyond reproach, and will give impetus to calls for its regulation. I may be emerging as one of the few voices in the disinformation arena opposing government regulation of social media platforms. I err on the side of caution, and against giving government the power to regulate social media, because this could open the door to the limitation of freedom of expression.
I admit to a certain idealistic naivety in my belief that self-regulation can still work if it is improved. The platforms should change their algorithms to prevent the amplification of harmful disinformation and to better protect users’ private data.
I stand on the precipice, ready to consider regulatory suggestions based on international best practices.
I intend to continue lending my voice and expertise to anti-disinformation efforts globally as well as within the country. In South Africa, the effects of Bell Pottinger linger, and I do not intend to stay on the sidelines. A team is being set up to counter disinformation during the elections, to ensure a free flow of ideas without manipulation of public discourse.
As Apple CEO Tim Cook said: “At a moment of rampant disinformation and conspiracy theories juiced by algorithms, we can no longer turn a blind eye to a theory of technology that says all engagement is good engagement, the longer the better, and all with the goal of collecting as much data as possible.”
Google’s openness should serve as an example for other Big Tech companies in South Africa. It is far more desirable to operate hand in hand than in a contentious and acrimonious space. DM
Published: 2021-05-27