VICTOR BWIRE: Media content regulation requires more discussion

The changing nature of how media content is gathered, packaged, distributed and consumed calls for a complete relook at the prevailing regulation

In Summary
  • The regulation of media and content moderation in Kenya is as problematic as in the rest of the world.
  • Regulating content is not aimed at stifling innovation and technology, but at ensuring that information shared falls within agreed international and national laws.

The regulation of media and content moderation in Kenya is as problematic as in the rest of the world.

The changing nature of how media content is gathered, packaged, distributed and consumed calls for a complete relook at the prevailing regulation and moderation approaches. These must move away from involving only media experts to a multi-stakeholder approach that draws in a wider section of the community and experts to discuss how to balance freedom of expression and regulation.

But while the debate in other parts of the globe is moving towards comprehending the changing dynamics of media and journalism operations with the advent of new media formats, and thus realigning media regulation regimes, other quarters are demanding more stringent laws, administrative control mechanisms and moral or national values arguments to justify reducing the space for free speech.

Globally, things have changed and moved on, but a few people remain stuck on criminalizing free speech using arguments about hate speech, harmful content, disinformation, foreign manipulation of information and even petty notions such as "annoying speech". Yet even with laws on hate speech and the publication of false news, very few cases meet the threshold for sustaining a charge.

Some countries have even introduced laws making administrators of groups on digital platforms such as Facebook or WhatsApp liable for content, but this is not working.

TikTok recently established an advisory group for Africa to help improve the implementation of its community rules, including requirements on content moderation such as takedowns, as a way of responding to accountability demands over content on its platform. Other groups, such as the Council on Tech and Social Cohesion, have demanded technology accountability from digital platforms so that regulators have more capacity to regulate or moderate content themselves.

Media content regulation is no longer a straitjacket affair; more discussion and inclusivity are required than is currently the case.

There is growing demand for broad-based approaches to content regulation, information integrity and the responsible use of digital platforms as communities grapple with making the most of the opportunities presented by the expanded space for freedom of expression and access to information.

Beyond the generally accepted global professional standards on the regulation of information outlets, including journalists, some issues that platform providers had thought were well handled through community rules remain disturbing and are attracting attention.

The TikTok advisory board borrows from other tested approaches, such as the national coalition on freedom of expression and content moderation formed in Kenya, which brings together civil society, academia, government, independent content regulators and media support groups.

Examples from around the world show that extreme measures for dealing with digital platforms, including social media taxes, computer and cybercrime laws, and digital media laws, stifle innovation and development; a more democratic, all-inclusive and literacy-based approach is more workable.

Regulating content is not aimed at stifling innovation and technology, but at ensuring that information shared falls within agreed international and national laws.

At the recent Internet Governance Forum in Croatia, stakeholders agreed that collaborative arrangements that ensure the inclusion of all key relevant stakeholders in content moderation are crucial to effectively tackling problematic content.

Equally, a multi-pronged approach, such as the recently launched coalition, was deemed more effective than a one-size-fits-all one. Regulatory authorities reaffirmed their commitment to protecting fundamental freedoms, advancing human rights standards, and promoting transparency and accountability in digital platform governance.

Several things came up during the forum that require consideration as we work towards improving the regulation of media content:
  • the need for cooperation and capacity building that respects national and regional differences while striving for greater coherence to avoid conflicting approaches;
  • the need for more research and for the sharing of good practices to assess the effectiveness of current safety measures on digital platforms;
  • correcting the current imbalance in regulatory and compliance efforts that disadvantages countries in the Global South; and
  • the need to develop rapid-response systems for emergencies, particularly during sensitive periods like elections, to ensure the integrity of the electoral process and address other urgent risks.

Regulators highlighted the need to develop global metrics for transparency in digital platforms and establish common standards for risk assessments, ensuring these align with the overarching principles of the guidelines.

There is a need for regulators to continue engaging in regional and global early warning systems and rapid-response mechanisms to prevent harm and protect individual safety.

Victor Bwire is the Deputy Chief Executive Officer and Programmes Manager at the Media Council of Kenya.
