BRIEF: Mis/disinformation regulation – benefits, risks, and one big gap

TLDR version: The release of an exposure draft of mis/disinformation regulation for digital platforms has already generated predictable controversy. Our view is that the announced proposals will bring greater transparency to the mis- and disinformation measures already taken by platforms.

The risk is that citizens see the Federal Government as making itself an arbiter of mis- and disinformation. The imposition of codes and mandatory standards inevitably implicates the Government and the regulator, ACMA, in platform content management policies. There is a real risk of damaged institutional credibility if yesterday’s misinformation turns out to be today’s truth. This risk must be managed.

Finally, the platforms’ internal mis/disinformation policies can also bring harm when they suppress open debate. If ACMA can direct platforms to undertake measures to suppress mis- and disinformation, it should also have the power to direct platforms to desist from arbitrary restrictions on debates on controversial issues.

Platform regulation will bring transparency

The release last month of an exposure draft of the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023 was always going to lead to controversy along well-rehearsed lines. There are indeed risks in any system to control information flows, and the draft takes steps to mitigate those through exclusions and other limits on intervention. The regulator’s focus is on encouraging good process rather than direct involvement in content moderation.

All of this must be evaluated in the light of the real and pervasive management of mis- and disinformation that social media platforms already undertake. (Ironically, the platforms’ own business models, which monetise clicks, drive a good share of this mis- and disinformation.) This mostly occurs out of sight and without any public accountability. It is not enough to say that these are private platforms that can do as they wish. Private actions often have important public implications that require policy intervention.

The draft bill gives ACMA extensive information-gathering and record-keeping powers that can bring a new level of transparency to the influence of the platforms over contentious debates. This part of the draft bill has attracted little opposition and will be highly beneficial.

Risks to the Government and regulator

The risks arise from the new powers for ACMA to impose industry codes and mandatory standards on platform providers. There is the obvious risk that such measures could be abused (intentionally or otherwise) to influence contentious debates. The focus of the ACMA powers on platform policies (rather than the content itself), plus a requirement not to unreasonably affect freedom of speech, mitigates this risk.

In any case, this risk needs to be evaluated against the real counterfactual: the current freedom of the platforms to mould debate without accountability or even transparency. ACMA is accountable to Parliament, which is accountable to the people. We think this is a much better arrangement than the system currently in place, which is arbitrarily determined by the platforms themselves.

A major risk actually runs in the other direction: ACMA involvement in codes and standards implicates the regulator in the platforms’ internal moderation policies, which are often highly controversial. What happens when “misinformation” suppressed under a code or standard later turns out to be true, or at least evolves over time to become respectable opinion? This erodes trust, and mis- or disinformation about the motives of both platforms and government gains traction.

This risk can only be managed by a parsimonious approach to intervention that carefully distinguishes mis- and disinformation from the merely controversial. Freedom of speech needs to weigh heavily in the scales if regulation is to retain wide support.

Why does this matter?

The current draft focusses on the harms due to inadequate measures to suppress mis- and disinformation. But what about the harms of overzealous measures? If the benefit of freedom of speech is one of the considerations that ACMA must weigh, why not address harms to freedom of speech as well?

Greater transparency will help to protect freedom of speech, but the current draft bill and the accompanying Guidance Note do not yet envision regulation to prevent overzealous suppression of speech by platforms. However, the mechanism is there in the proposed codes and standards. If they can impose content moderation on a particular issue, they can also discourage overzealous moderation. For instance, this might involve requiring more transparent processes and rights of appeal for users whose content is blocked.

Explicitly including such measures would also give the proposal a kind of symmetry and could help to allay fears around freedom of speech.

About Venture Insights

Venture Insights is an independent company providing research services to companies across the media, telco and tech sectors in Australia, New Zealand, and Europe.

For more information go to ventureinsights.com.au or contact us at contact@ventureinsights.com.au.