The long-awaited media and online content consultation documents have been released by the Department of Internal Affairs (DIA).
What is proposed is a dramatic revamp of the regulatory landscape for broadcast and online media.
The documents describe the proposals as platform regulation. They are, in fact, proposals for content regulation.
The original review, as part of the Department of Internal Affairs Online Safety Policy, was entitled Media and online content regulation. These documents are a step in the process. I shall briefly set out what is proposed, offer a commentary and explain why this is a content regulatory proposal.
The perceived problem is that our main pieces of legislation regulating content are more than 30 years old. Some of the core features are still relevant, such as the codes of broadcasting practice, the protection of children from age-inappropriate content and the censorship of the most abhorrent content.
The main pieces of legislation referred to are the Broadcasting Act 1989 and the Films, Videos and Publications Classification Act 1993. In addition, there are the Harmful Digital Communications Act 2015 and the Unsolicited Electronic Messages Act 2007, which have content regulatory implications.
However, it is suggested the current system can’t keep up with new technologies and it is claimed that it is time to reset the system.
The objectives of the review are:
- to achieve better consumer protection for all New Zealanders and their communities by setting safety-based outcomes and expectations for platforms;
- to better protect children, young people and other vulnerable New Zealanders;
- to reduce risk and improve safety without detracting from essential rights like freedom of expression and freedom of the press; and
- to promote a safe and inclusive content environment while remaining consistent with the principles of a free, open, and secure internet.
The discussion document seeks feedback on a possible overall approach, with some high-level detail about how this might be delivered in practice. This feedback will be used to shape further decisions about the approach, the legislation and the development of a new regulator to implement the changes.
What is proposed is a new system, with the major change being the way in which social media platforms are regulated.
That sounds somewhat limited on the face of it, but a closer examination of the materials indicates not just social media but other online platforms and other forms of media like news will be under the regulatory umbrella.
At present, broadcast media are under the jurisdiction of the Broadcasting Standards Authority. Mainstream news media are subject to supervision by a voluntary organisation, the New Zealand Media Council, which replaced the former Press Council.
Under the proposals, online and other media platforms would be brought into one cohesive framework with consistent safety standards. The objective is to make sure platforms are safe for users. The materials emphasise there is no appetite for over-regulation.
Almost immediately, a problem becomes apparent. What is meant by “safety standards”? Indeed, in these days where offence is so quickly taken and, as a result, people feel “unsafe” when they hear or see something they don’t like, a critical issue will be to define “safety” if that term is to be used.
These “safety issues” will be covered by codes of practice setting out specific safety obligations for larger or riskier platforms. These codes will be enforceable and approved by an independent regulator.
The codes will cover things like how platforms should respond to complaints and what information they should provide to users.
Regulatory efforts will focus on the areas of highest risk, such as harm to children or content that promotes terrorism. Some platforms, like social media and video-sharing services, will need to make changes to their services because they’re not currently regulated in New Zealand.
An independent regulator
What is proposed is a new independent regulator, separate from the government, to promote safety on online and media platforms. This new regulator would work with platforms to create a safer environment and would require larger or high-risk platforms to comply with codes of practice.
One issue will be to ensure the independent regulator is truly independent. In the first instance, whatever independence a regulator may have will be constrained by the language of any statute setting up the regulatory system.
Second, the regulator will be appointed by the government, which will be seeking what could be called “a safe pair of hands”. The issue of how large a grasp those hands may have will no doubt be dictated by the ideology of the government appointing the regulator.
The argument may be advanced that we already have independent regulators in the form of the chief censor and the Broadcasting Standards Authority. However, as I will discuss, the powers that the new regulator may have will be significantly wider than those held by the present incumbents.
The codes would set out the standards and processes that platforms need to manage risks to consumer safety, such as protecting children and dealing with illegal material.
Codes of practice would cover:
- processes for platforms to remove content and reduce the distribution of unsafe content;
- accessible processes for consumer complaints for particular content;
- support for consumers to make informed choices about potentially harmful content;
- how platforms will report on these measures, including on transparency; and
- how they are reducing the impact of harm from content, and their performance against codes.
The codes would be developed by industry groups, with input from and approval by the regulator, and platforms will be expected to align their terms of service and operating policies to the relevant codes of practice.
Platforms will need to have operating policies in place to meet these requirements but will have flexibility to decide how to achieve them.
We can see already that the regulator has the power to work with platforms. There is a wider power inherent in these proposals and that is to approve codes – as evidenced by the words “input from and approval by the regulator”.
This approach leaves editorial decision-making in the hands of platforms while ensuring users have greater transparency and protection.
But they will be working within constraints of codes that will ultimately be dictated by the regulator. It would be naïve to suggest all the codes will be by consensus. The way I read this is that the regulator will have the last word.
One argument advanced is that broadcasters – TV and radio – and mainstream media subject to the NZ Media Council operate within codes whereas social media does not. Although social media may be one target of the regulatory review, it seems to me that reference to platforms will mean any platform – not just social media platforms.
The new regulator would make sure social media platforms follow “codes to keep people safe”. Media services like TV and radio broadcasters would also need to follow new codes tailored to their industry.
Once again, the word “safe” is used. This word is capable of many meanings, and the law must avoid uncertainty. Left undefined, the term “safety” is best avoided.
The regulator would have the power to check information from platforms to make sure they follow the codes and could issue penalties for serious failures of compliance.
The power to issue penalties raises another important issue which demonstrates an inconsistent approach taken in the documentation.
It is made clear there is no proposal to change the definitions of what is legal or illegal.
Illegal content is covered by existing law although it is proposed the regulator would also have powers to require the quick removal of illegal material from public availability in New Zealand. These powers exist already for objectionable material.
The DIA is proposing the regulator should also have powers to deal with content that is illegal for other reasons, such as harassment or threats to kill. It is unclear what is proposed but I imagine the regulator would have the power to issue take-down orders. It should be pointed out that threats to kill and harassment are already offences under the Summary Offences Act 1981, the Crimes Act 1961, and the Harassment Act 1997.
Under this proposal, the regulator would police compliance with the codes. The regulator becomes a policeman of content.
Furthermore, the code of practice adds another layer to the determination of harmful or unsafe content. Thus, content may be judged first on the basis of legality or illegality and second on the basis of compliance with the code.
Given the regulator may have sanctioning powers, this is another way of addressing content that may not be illegal but, because of non-compliance with the code, may be de facto unlawful. This subtle but important distinction appears to have been overlooked by those preparing the materials.
The materials suggest the proposal is a deliberate shift away from the status quo of regulating content, towards regulating platforms.
That may appear to be the position but, in reality, what will be regulated is material that may compromise safety-based outcomes and expectations. It is disingenuous to suggest it is not a content regulation program because it is. Saying it is a platform regulation system is facile because what is proposed will moderate the content that platforms promulgate.
Given that what is provided by platforms is content – remembering that principally the internet is a communication system – this must, in the final analysis, be a content regulatory system.
Not platform regulation
Several concerns arise from these proposals.
The first is that this is – as I have suggested – a content control or content regulatory system. It matters not that the DIA suggests otherwise. The effect of the regulatory structure proposed clearly has content as its target.
To regulate content requires an understanding and appreciation of some of the deeper aspects or qualities of communications technology. Once these are understood, the magnitude of the task becomes apparent and the practicality of effectively achieving regulation of communications runs up against the fundamental values of Western liberal democracies.
Once we begin to understand the importance of the qualities or properties of information technology, then we begin to get some insight into Marshall McLuhan’s comment “the medium is the message”. It is the medium that should be the point of focus – not the message.
Although we may be dazzled by and focused upon the content that McLuhan suggested was the “piece of meat that attracts the lazy dog of the mind”, we can begin to get some understanding of how information technologies have changed not only our approaches toward information but also some of our fundamental behaviours to the point that even the values we may attribute to information, which underlie these behaviours, might themselves change.
The second concern is related to the first. What is proposed is a regulatory system that allows for the creation of a code or codes of conduct for platforms. Necessarily this will contain directives about content.
But it will be unclear from the enacting legislation as to the level of acceptable content. This will not be a matter for legislation but for a super-soft rulemaking arrangement that could well see the introduction of permissible levels of content and a back door entry for the regulation of so-called “hate speech”.
The third concern flows from the second. In an area as sensitive as the communication of ideas, there should be clarity and certainty about what may or may not be permissible. This proposal does not provide for that.
Finally, although the DIA has dressed up these proposals as platform regulation, as I have suggested they are not.
To reiterate: The internet is a communications system. Platforms are built onto the backbone of the internet. These platforms allow for and enhance communication. Interference with platforms becomes an interference, directly or indirectly, with communication. And what is being communicated? Content, or as it may otherwise be termed information.
Nowhere in the proposals, at least as far as I can see, is there any detailed discussion as to why these proposals constitute a justifiable limitation on the right of free expression – to impart and receive information – contained in s 14 of the New Zealand Bill of Rights Act 1990.
I would have thought it axiomatic that any discussion about the regulation of online platforms and services should include a Bill of Rights Act analysis and an explanation justifying any interference with freedom of expression. Beyond 11 passing references to the Act in the course of the 90-page discussion document, no such analysis appears.
These proposals are unwise, ill-considered and vague. They suggest the creation of a new uber-regulator – an online services regulatory tsar – under the guise of platform regulation and unclear and as-yet unexpressed codes. I have no doubt there will be pushback from the platforms. There should be pushback from any citizen who may be concerned about yet another encroachment upon freedom of expression.
David Harvey is a retired District Court judge
Ever since I became involved in the internet I have been concerned about aspects of internet regulation – both of the technology and of the content it delivers. I taught aspects of internet regulation to my Law and IT class between 2000 and 2018. I have written about it in my text internet.law.nz, in New Zealand Media and Entertainment Law, which I co-authored with Rosemary Tobin, and in online and hard-copy publications. I also co-taught a course in media regulation for a Master of Laws paper in Media Law.
My PhD study was about the impact of a communications technology – the printing press – upon law and legal culture. A full chapter of my thesis and the book that followed it, The Law Emprynted and Englysshed – The Printing Press as an Agent of Change in Law and Legal Culture 1475 – 1642, dealt with early attempts to regulate the new technology of print.
Before the content review was released, I had updated my research and rewritten the chapter on the regulation of the printing press and the printing trade. It has a number of parallels to what is proposed in this latest iteration of attempts to regulate the content of communications technology.
I will be publishing my history of attempts to regulate the printing press in six parts on Substack in mid-June. Readers will find that although some of the language of regulation has changed – sedition, heresy and treason become “objectionable” and “unsafe” – the song remains fundamentally the same. Plus ça change, plus c’est la même chose.