AFTINET's submission to the JSCOT review of the Singapore Digital Economy agreement
September 18, 2020: AFTINET's submission to the JSCOT review of the Singapore Digital Economy agreement is here. The submission notes that this is the most deregulatory agreement on digital trade that Australia has signed, and it is not clear that the exceptions preserve enough regulatory space to protect consumers and the public interest in a rapidly changing digital environment.
The DFAT National Interest Analysis (NIA) argues that the Singapore DEA is a significant agreement which the government wants to use as a model for future regulation of the digital economy. If this is the case, the agreement should be subject to the highest levels of public and parliamentary scrutiny and debate. The need for such scrutiny is reinforced by the increased profitability and market concentration of companies operating in the digital domain in the context of the COVID-19 pandemic, and the need for public interest regulation to prevent abuse of market domination and to protect consumers.
In fact, the DEA faces less scrutiny and accountability. The lack of enabling legislation and absence of a parliamentary debate and vote on the DEA means that the process for this agreement is even less transparent and accountable than previous agreements. In this sense it is a Trojan horse which may escape the detailed public and parliamentary scrutiny it deserves.
The DFAT NIA argues that the deregulatory agenda of the DEA goes further than any previous agreement and will be used as a model for the future of the digital economy. The DEA limits regulation of transfer and storage of data across borders and prohibits requirements for local facilities. Combined with provisions in the services chapter of SAFTA, it also prohibits requirements for local presence of digital trade companies doing business in Australia. The DEA also restricts regulation of source code and potentially algorithms. There are exceptions for government data, personal credit data and some health data, but these are limited. This means that companies can conduct digital trade in Australia without any local presence or local facilities and without any scrutiny of source code and algorithms.
DFAT concedes that the DEA “will impose new restrictions on Australia’s policy flexibility to impose certain measures to restrict data flows or require data localisation”. DFAT claims that adequate public interest regulation will be permitted by exceptions in the agreement.
However, this submission presents evidence that, in the context of the rapidly developing digital economy and emerging regulatory challenges identified by the Australian Competition and Consumer Commission and the Human Rights Commission, the exceptions are limited and the DEA restrictions on policy flexibility could prevent governments from regulating in the public interest.
For example, it is not clear whether DEA exceptions would be adequate to address the issues raised by Alinta Energy’s recent failure to abide by undertakings to store data in Australia, and its subsequent failure to protect personal data that was stored in Singapore and New Zealand.
The EU’s General Data Protection Regulation (GDPR) is an emerging global standard for privacy protection which Australian companies need to comply with in order to do business with Europe. Australia is negotiating a free trade agreement with the EU in which this issue is likely to emerge. Commitments made in the DEA and used as a model for other agreements do not meet EU standards, and raise the question of whether it is wise to commit to a lower privacy standard when the EU and many of its trading partners are committing to a higher one.
When Australia’s COVID-19 tracing app was launched in April 2020 amid public controversy about privacy protections, the government hastened to reassure potential users that their privacy would be protected by the data being stored in Australia. It is not clear whether the exceptions in the DEA would permit similar requirements to store personal data in Australia in future.
The government agreed to make the source code of the COVID-19 tracing app available for examination by privacy experts and followed their recommendations for changes to protect privacy. The publication of source code and its subsequent modification were made before the DEA text was made public. The Committee should ascertain whether the exceptions in the DEA would permit similar access to and modification of source code and algorithms in future.
There is increasing public concern about the mass use of facial recognition technology, reflected in the recommendations of bodies as diverse as the Parliamentary Joint Committee on Intelligence and Security and the Human Rights Commission. Both have recommended suspension of the use of this technology pending the development of robust privacy and human rights protections. The committee should examine whether the DEA gives sufficient regulatory space to enable such regulation.
The recent Victorian report on the on-demand workforce has exposed exploitation of workers through digital platforms and the need for regulatory change to protect workers’ rights. There is a risk that DEA rules will hinder this process by limiting policy space for regulatory reform and undermining government enforcement mechanisms for digital platforms.
- The Committee should thoroughly examine the DEA restrictions on regulation of transfer and storage of personal data across borders, on requirements for local facilities, and on regulation of businesses with no local presence, and whether the exceptions are adequate to protect the public interest.
- The Committee should thoroughly examine whether the DEA protects the privacy of data stored overseas to Australian standards, given the DEA lacks agreed and enforceable international standards of privacy protection.
- The DEA restricts government access to source code with some exceptions. The Committee should examine whether the exceptions are adequate and whether DEA restrictions on access to and regulation of source code would restrict privacy protection in future.
- The DEA also anticipates similar future restrictions on access to and regulation of algorithms, despite ACCC and Human Rights Commission evidence that search engine algorithms are used to reinforce market domination, and that personal data-sorting algorithms can be discriminatory. The Committee should oppose future restrictions on regulation of algorithms.
- The Committee should support the Parliamentary Joint Committee on Intelligence and Security and the Human Rights Commission recommendations to suspend mass use of facial recognition technology pending the development of a robust regulatory framework to safeguard privacy and human rights. The Committee should examine whether the DEA gives sufficient regulatory space to enable such safeguards.
- SAFTA and the DEA ignore workers’ rights, despite mounting evidence that such rights are being undermined by gig economy jobs run through digital platforms, as exposed by the recent Victorian report on the on-demand workforce, and despite the need for regulatory change to protect those rights. The Committee should examine whether the DEA gives sufficient regulatory space to enable such changes to protect workers’ rights.
- The Committee should support a review of the DEA three years after implementation to evaluate the impact on public interest regulation of provisions which prohibit limitations on offshore data storage, prohibit requirements to store data onshore or have local facilities or local presence, and limit regulation of source code and algorithms. The review should also examine whether ISDS provisions have been used by international companies engaging in digital trade.