Privacy Act Review
This Submission Paper was prepared by FinTech Australia working with, and on behalf of, its Members: over 300 FinTech Startups, VCs, Accelerators and Incubators across Australia.
Table of Contents
About this Submission
Submission Process
Privacy Act Review Discussion Paper
Introduction
Personal Information, de-identification and sensitive information
Small Business Exemption
Notice of collection of personal information
Consent to collection and use and disclosure of personal information
Additional protections for collection, use and disclosure
Control and security of personal information
Overseas data flows and third party certification
Direct right of action/Statutory Tort
Notifiable Data Breaches Scheme
About FinTech Australia
About this Submission
This document was created by FinTech Australia in consultation with its members, whose membership comprises over 400 organisation representatives.
In developing this submission, we sought the views of our members to determine and discuss key issues relating to the Privacy Act Review Discussion Paper (“Paper”).
We also particularly acknowledge the support and contribution of King & Wood Mallesons on the topics explored in this submission.
Privacy Act Review Discussion Paper
We would like to thank the Attorney-General’s Department (“AGD”) for allowing us the opportunity to respond to the Paper.
With data increasingly recognised as one of the most important resources in a modern economy, and with Australian organisations continuing to innovate in how they harness data to provide better services for consumers, FinTech Australia considers it of the utmost importance that Australia’s privacy regime is updated and refined to better address current, and future, data practices. Any changes to the Privacy Act 1988 (Cth) (“Privacy Act”) must be carefully considered so as to strike an effective balance: consumers must be able to protect their personal information, and organisations must take responsibility for how they use it, while organisations remain empowered to grow and innovate with data in order to unleash its power across the Australian economy.
FinTech Australia considers that the Privacy Act Review Discussion Paper is an important step in this direction and looks forward to future engagement and discussion on the future of privacy law in Australia.
Personal Information, de-identification and sensitive information
FinTech Australia is generally supportive of changes to the Privacy Act that will increase the consistency between Australia’s privacy regime and the General Data Protection Regulation (“GDPR”). Given the increasingly interconnected nature of Australia’s economy, and the international nature of many organisations in the Australian FinTech Industry, it is important that Australia is not perceived by international organisations as having an overall complex privacy regime that is out of step with international best practices.
However, we also stress that it is important that any changes do not overly stifle future innovation. We are already seeing major movements in how data is processed, including but not limited to artificial intelligence (“AI”) systems and algorithmic decision making, and innovative businesses must be empowered to use personal information responsibly as technology evolves, rather than being subject to disproportionate compliance obligations that reflect only the current technological environment. Countries that have successfully adopted GDPR elements into their local law have done so by tailoring requirements to reflect local nuances and by seeking to evolve the law, rather than simply replicating the GDPR in a “drag and drop” exercise.
|Proposal 2.2: Include a non-exhaustive list of the types of information capable of being covered by the definition of personal information.|
|Proposal 2.3: Define ‘reasonably identifiable’ to cover circumstances in which an individual could be identified, directly or indirectly. Include a list of factors to support this assessment.|
|Proposal 2.4: Amend the definition of ‘collection’ to expressly cover information obtained from any source and by any means, including inferred or generated information.|
FinTech Australia generally supports proposals 2.2 and 2.3 as they will provide organisations with increased clarity as to what is, and is not, personal information. However, given the speed of technological innovation compared to the process for amending the Privacy Act, it is vitally important that any codification of examples of personal information in the Privacy Act takes a cautious approach and only includes those types of personal information that are indisputably personal information to the average individual. If an overly expansive approach is taken to the list and, for example, technical information that has little to no bearing on the privacy of an individual is included, the list will have the effect of overly regulating information with no discernible privacy benefit. A supplementary non-exhaustive list may be better suited to guidance published by the Office of the Australian Information Commissioner (“OAIC”) which supports an overarching and timeless definition.
FinTech Australia does not however consider that proposal 2.4 is necessary as the Privacy Act already sufficiently captures technical and inferred information that relates to an individual who is reasonably identifiable. Although we acknowledge that a number of international privacy regimes1 have sought to expressly include some types of technical and inferred information within their respective definitions of personal data, the preferable view for Australia (and a position that aligns with the position adopted in New Zealand) is to supplement the existing definitions with clear guidelines from the OAIC as to what is, and is not, considered to be personal information (including in relation to technical information, inferred information and obfuscated data). Given the speed of technological innovation, and the rapid changes that industries are already starting to see in relation to new ways of collecting and handling data (including but not limited to advancements in AI systems), it is important that there is sufficient flexibility in what organisations should (or should not) consider to be personal information without overly stifling innovation. Furthermore, by focusing upon easily updatable guidance, the OAIC has a greater ability to provide organisations with additional detail and more flexible assessment tools and examples.
|Proposal 2.5: Require personal information to be anonymous before it is no longer protected by the Act.|
FinTech Australia acknowledges that the replacement of de-identification with the concept of anonymisation will bring the Privacy Act closer in line with the GDPR and, as noted above, we are broadly supportive of increased consistency between the Privacy Act and the GDPR.
However, noting that anonymisation is a spectrum, if Australia is to adopt an anonymisation standard it must do so in a way that:
- aligns with the requirements of the GDPR (including the reasonably likely standard) to ensure consistency across the regimes;
- expressly clarifies that anonymisation does not require that there must be “only an extremely remote or hypothetical risk of identification”; and
- is supplemented by sufficient guidelines issued by the OAIC as to what methods of anonymisation will satisfy Australia’s anonymisation standard. For example, core techniques for anonymisation such as the utilisation of synthetic data and differential privacy could be expressly called out by the OAIC as being sufficient to meet the Privacy Act’s test for anonymisation.
We note that any shift to an anonymisation standard must also be carefully considered to avoid a repeat of the situation in Europe, where there is conflicting regulatory guidance, and conflicting positions taken by regulators, as to how organisations should approach anonymised data. Although the GDPR defines anonymous data as data that “…does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable”, in practice there is conflicting regulatory guidance as to what anonymisation means: the Article 29 Working Party (now the European Data Protection Board) stated in 2007 that anonymisation can be achieved if “appropriate technical measures” were put in place to prevent reidentification of data,4 but later suggested that a significantly higher standard is required and that “Only if the data controller would aggregate the data to a level where the individual events are no longer identifiable, the resulting dataset can be qualified as anonymous.” With EU regulators still vacillating between these two positions when interpreting the GDPR,6 it is crucial that the Australian approach clarifies that a residual risk of re-identification is acceptable provided there are sufficient protections in place to protect the individual’s privacy, and that it clearly articulates the test organisations must apply in determining when the risk of re-identification is suitably remote.
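By way of illustration, one of the anonymisation techniques referred to above, differential privacy, can be sketched in a few lines. The example below is a minimal, simplified sketch only (the function name and the counting query are illustrative assumptions, not drawn from any regulatory guidance): calibrated Laplace noise is added to an aggregate statistic so that the presence or absence of any single individual’s record has only a bounded effect on the published result.

```python
import math
import random

def dp_count(records, epsilon=1.0):
    """Release a differentially private count of records.

    Laplace noise with scale sensitivity/epsilon is added to the true
    count (the sensitivity of a counting query is 1). Smaller epsilon
    means more noise and a stronger privacy guarantee.
    """
    true_count = len(records)
    # Sample Laplace(0, 1/epsilon) noise via inverse transform sampling.
    u = random.random() - 0.5
    scale = 1.0 / epsilon
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise
```

A released value such as `dp_count(customer_records)` is close to, but deliberately not exactly, the true count; repeated queries consume a “privacy budget”, which is why regulatory guidance on acceptable residual re-identification risk matters in practice.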
|Question: What would be the benefits and risks of amending the definition of sensitive information, or expanding it to include other types of personal information?|
FinTech Australia strongly argues against expanding the definition of sensitive information to include financial information (including transactional data). Not only is the existing definition of sensitive information fit for purpose, in that it captures types of information that are inherently sensitive, but any expansion of the definition to capture information that is sensitive by context, or because it is processed in a particular way, is likely to have a chilling effect on the utilisation of personal information within the financial industry. This effect reflects not only the significant increase such a proposal would cause in the number of requests for consent issued to consumers (which will result in increased consent fatigue) but also the significant impact it would have on the delivery of services, with minimal benefit to the protection of consumers’ privacy. For example, if financial data were considered to be sensitive information, and noting that consent should not be bundled, requiring separate consent for each purpose of a financial transaction would impose a significant consent burden on the consumer, given the complexity of, and interaction between, the multiple entities required to fulfil a single financial transaction.
FinTech Australia acknowledges that the Californian Privacy Rights Act (CPRA) includes limited financial details (that is, a consumer’s account log-in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credentials allowing access to an account) within the definition of sensitive personal information. However, we note that this inclusion:
- does not apply to the finance sector; and
- has significantly different impacts under the Privacy Act and the CPRA, as the CPRA does not require organisations to seek the consent of individuals when collecting and processing financial details. Rather, the CPRA allows a consumer to limit how an organisation collects and utilises sensitive personal information.
Small Business Exemption
With regard to the questions posed by the Discussion Paper on page 49 in relation to the continued existence of the small business exemption, as FinTech Australia submitted in its Submission on the Privacy Act Review, all businesses that collect, use, disclose and maintain personal information of individuals (such as their customers or clients) should be required to comply with the APPs. In our view, the purpose of collection and the volume of the data collected as part of an organisation’s practices should be the focus, rather than the revenue that it generates.
In particular, we note that start-up technology organisations are often exempt from the Privacy Act by virtue of their revenue, notwithstanding the sensitivity, volume and ease of disclosure of the personal information they facilitate. For example, even the smallest technology-based businesses could hold thousands of records of personal information and so pose a high risk to individuals if that personal information is not maintained in a compliant manner. However, noting the increased burden that compliance with the Privacy Act will place on small start-ups without data volume thresholds, consideration should also be given to promoting the development, and release, of privacy-compliant technology by larger organisations that could be pushed out to their (small) business customers to facilitate their compliance with the Privacy Act.
Notice of collection of personal information
Proposal 8.1: Introduce an express requirement in APP 5 that privacy notices must be clear, current and understandable.
Proposal 8.2: APP 5 notices limited to [specified] matters under APP 5.2…
Proposal 8.3: Standardised privacy notices could be considered in the development of an APP code, such as the OP code, including standardised layouts, wording and icons. Consumer comprehension testing would be beneficial to ensure the effectiveness of the standardised notices.
Proposal 8.4: Strengthen the requirement for when an APP 5 collection notice is required – that is, require notification at or before the time of collection, or if that is not practicable as soon as possible after collection, unless the individual has already been made aware of the APP 5 matters; or notification would be impossible or would involve disproportionate effort.
As a general position, FinTech Australia supports a refreshed approach to privacy notices that strengthens consumers’ awareness of how their personal information is being used and disclosed as transparency is key to a consumer’s continued trust in how organisations are dealing with their personal information.
In particular, our members support:
- changes that increase the suitability of collection notices and privacy policies for digital channels. Internationally, layered notices, and the inclusion of links that expand each section or otherwise link to further material containing more detailed information, are repeatedly called out as best practice by regulators.9 Expressly encouraging organisations to implement layered notices and, where appropriate, allowing organisations to provide a link to how personal information is to be dealt with will result in a significantly improved consumer experience and place the choice in consumers’ hands as to whether or not they access the information in full;
However, any amendments to Australia’s privacy notice regime should be approached carefully such that they do not impose requirements that will result in consumer “notice fatigue”. To this end, we would suggest further consideration is given to:
- when it is appropriate not to issue a collection notice (for example, where there is a de minimis collection of personal information in the course of providing services and a notice has previously been provided to the consumer for similar collection practices); and
- clarifying what would be considered impossible or would involve disproportionate effort. The concepts of impossibility and disproportionate effort cannot be approached in an arbitrary manner; rather, they should involve a balancing exercise based both on the effort required for the organisation to provide the information and the effect on the data subject if they were not provided with it.
Consent to collection and use and disclosure of personal information
FinTech Australia recognises that meaningful consent to the processing of personal information is an important basis upon which organisations should be able to rely for the processing of personal information. However, we strongly caution against any changes to the Privacy Act that increase the reliance of organisations on consent. As recognised by the United Kingdom government in the recent discussion paper “Data: a new direction”, the over-reliance on consent as a basis for processing under the GDPR “may lower protections for individuals, who suffer from ‘consent-fatigue’ in the face of a large volume of consent requests which they might accept despite not having the time or resources to assess them properly.” Similar positions have been articulated in relation to the reliance on consent as the basis for utilising cookies under the ePrivacy Directive. The Privacy Act’s current acknowledgement that consent is only required in limited circumstances has proven fit for purpose, and any expansion of the situations in which consent must be sought is not appropriate.
|Proposal 9.1: Consent to be defined in the Act as being voluntary, informed, current, specific, and an unambiguous indication through clear action.|
FinTech Australia supports an increase in the alignment between the definition of consent in the Privacy Act and under the GDPR. However, any changes to how organisations are required to approach consent must not be so narrow as to limit innovation. For example, requirements relating to de-bundling of consent should be flexible enough to allow:
- a proactive ‘one-click’ consent option for multiple purposes provided that individuals have the ability to de-select any of the options included within the ’one-click’ option; and/or
- a “soft opt-in” similar to that under the Privacy and Electronic Communications Regulations (UK). Under the PECR, individuals who recently provided personal information to a company and did not opt out of marketing messages are presumed to be happy to receive marketing about similar products or services (even if they haven’t specifically consented) provided there is a clear chance to opt out at all times.
|Proposal 9.2: Standardised consents could be considered in the development of an APP code, such as the OP code, including standardised layouts, wording, icons or consent taxonomies. Consumer comprehension testing would be beneficial to ensure the effectiveness of the standardised consents.|
FinTech Australia supports the increased standardisation of consent as it will assist in promoting informed, and meaningful, consent. However, as noted above in relation to the standardisation of notices, sufficient flexibility should be included to allow organisations the flexibility to innovate and adapt how they present information to their consumers as technology and service delivery evolves. It is also important that any standardisation requirements relating to consent must be clearly distinguishable from the notice requirements.
|Question: Is it suitable for all APP entities (not just organisations subject to the OP code) to be required to refresh or renew an individual’s consent on a periodic basis?|
As noted above, any changes to the Privacy Act that would increase the frequency and circumstances in which consent must be sought from consumers will have limited privacy benefit to the consumer and will lead to consent fatigue. Rather than requiring periodic renewal, organisations should only be required to refresh consent where there has been a material change to the purpose for which the information is being used or disclosed.
Additional protections for collection, use and disclosure
|Proposal 10.1: A collection, use or disclosure of personal information under APP 3 and APP 6 must be fair and reasonable in the circumstances.|
Proposal 10.2: Legislated factors relevant to whether a collection, use or disclosure of personal information is fair and reasonable in the circumstances.
FinTech Australia supports these proposals in principle. However, in approaching what is “fair and reasonable”, we consider it very important to ensure that:
- organisations have sufficient certainty as to what is fair and reasonable in the circumstances and that steps are taken to avoid the uncertainty in application that has been a feature of GDPR’s “legitimate interest” ground for lawful processing. For example, the UK Government has recently acknowledged that the significant uncertainty of data controllers in how to assess whether the organisation’s interests outweigh the rights of individuals (even in the face of UK ICO guidance on how to complete the Legitimate Interest Assessment) is a key factor in driving over-reliance in the UK on consent; and
- the legislated factors must be approached in a manner that ensures clarity and consistency with other obligations, and concepts, within the Privacy Act, so that there is no duplication of, or inconsistency within, the Privacy Act.
|Proposal 10.4: Define a ‘primary purpose’ as the purpose for the original collection, as notified to the individual. Define a ‘secondary purpose’ as a purpose that is directly related to, and reasonably necessary to support the primary purpose.|
It is important to our members that there is clarity for organisations about how to approach the concepts of primary purpose and secondary purpose in APP 6. Proposal 10.4 has the potential to assist in creating this clarity. However, we note that it will be important that organisations maintain the flexibility to define what their primary purpose is. If organisations are overly limited in how they may define primary purposes, there will be a disproportionate increase in the complexity of how organisations must approach the use and disclosure of personal information, and there is a risk that organisations will (similar to the situation in the UK in relation to legitimate interests, discussed above) default to consent (again raising the risk of consent fatigue). If there are concerns that sufficient clarity cannot be obtained through proposal 10.4, a practical alternative may be to allow multiple “original” purposes (with further evolution of additional bases for processing, similar to those under the GDPR).
Control and security of personal information
Option 1: APP entities that engage in the following restricted practices must take reasonable steps to identify privacy risks and implement measures to mitigate those risks…
– Direct marketing, including online targeted advertising on a large scale
– The collection, use or disclosure of sensitive information on a large scale
– The collection, use or disclosure of children’s personal information on a large scale
– The collection, use or disclosure of location data on a large scale
– The collection, use or disclosure of biometric or genetic data, including the use of facial recognition software
– The sale of personal information on a large scale
– The collection, use or disclosure of personal information for the purposes of influencing individuals’ behaviour or decisions on a large scale
– The collection, use or disclosure of personal information for the purposes of automated decision making with legal or significant effects, or
– Any collection, use or disclosure that is likely to result in a high privacy risk or risk of harm to an individual.
Option 2: In relation to the specified restricted practices, increase an individual’s capacity to self-manage their privacy in relation to that practice. Possible measures include consent (by expanding the definition of sensitive information), granting absolute opt-out rights in relation to restricted practices (see Chapter 14), or by ensuring that explicit notice for restricted practices is mandatory.
In line with our support for increased alignment between the GDPR and the Privacy Act, FinTech Australia is broadly supportive of Option 1.
Although not expressly considered by the Discussion Paper, we would also suggest that consideration be given to how the Privacy Act can be amended to lessen the uncertainty as to how organisations can ensure compliance with the Privacy Act when they are looking to deploy AI systems and/or use personal information to develop and train AI systems.
In particular, we would be keen to see provisions in the Privacy Act that support organisations utilising personal information to undertake monitoring and bias detection/correction within AI systems. That is, in order to reduce the risk of bias within an AI system, it is imperative that organisations undertake monitoring and bias detection/correction, which requires the utilisation of current and historic personal information and often sensitive information. For example, personal information is required to identify whether an AI system is replicating societal and historic discrimination (e.g. redlining poorer neighbourhoods within the insurance industry). However, it is currently difficult for organisations to utilise personal information for these purposes. For example, if an organisation needs to utilise existing sensitive information to check for bias, it must seek the consent of the individual. This in turn has been well recognised in Europe as creating bias towards the demographic of individuals who were willing to consent to their information being used for bias mitigation. We note that the UK Government is currently proposing to introduce new clauses into the Data Protection Act 2018 that specifically address the processing of personal data for bias monitoring, detection and correction in relation to AI systems. We would suggest that, when considering proposals 10 and 11, the AGD also considers similar clauses to ensure that the Privacy Act does not overly restrict how organisations may utilise data to undertake bias monitoring, detection and correction.
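By way of illustration, a simple form of the bias monitoring described above can be expressed as a disparate impact check. The sketch below is illustrative only (the group labels, data and the 0.8 threshold convention are assumptions for the example, not drawn from the Paper): it compares the rate of favourable outcomes received by a protected group against a reference group, which is exactly the kind of calculation that requires access to sensitive attributes.

```python
def disparate_impact(decisions):
    """Compute the disparate impact ratio between two groups.

    `decisions` maps a group label to a list of boolean outcomes
    (True = favourable decision, e.g. loan approved). A ratio well
    below 1.0 indicates the protected group receives favourable
    outcomes at a lower rate; the common "80% rule" convention
    flags ratios below 0.8 for further review.
    """
    rates = {group: sum(outcomes) / len(outcomes)
             for group, outcomes in decisions.items()}
    return rates["protected"] / rates["reference"]

# Hypothetical monitoring data: approval outcomes by group.
outcomes = {
    "protected": [True, False, False, True, False],  # 40% approved
    "reference": [True, True, False, True, True],    # 80% approved
}
ratio = disparate_impact(outcomes)  # 0.4 / 0.8 = 0.5, below 0.8
```

Running such a check requires retaining the sensitive group attribute alongside the decision history, which is precisely the processing that current consent requirements make difficult.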
Proposal 12.1: Introduce pro-privacy defaults on a sectoral or other specified basis.
Option 1 – Pro-privacy settings enabled by default: Where an entity offers a product or service that contains multiple levels of privacy settings, an entity must pre-select those privacy settings to be the most restrictive. This could apply to personal information handling that is not strictly necessary for the provision of the service, or specific practices identified through further…
Option 2 – Require easily accessible privacy settings: Entities must provide individuals with an obvious and clear way to set all privacy controls to the most restrictive, such as through a single click mechanism.
FinTech Australia is supportive of Option 2 as it empowers individuals to choose the privacy settings that best suit how they wish to control their personal information. However, noting the speed of technological innovation, we stress that it is important that Option 2 does not overly restrict how organisations can present privacy settings.
Overseas data flows and third party certification
|Proposal 22.1: Amend the Act to introduce a mechanism to prescribe countries and certification schemes under APP 8.2(a).|
Proposal 22.2: Standard Contractual Clauses for transferring personal information overseas be made available to APP entities to facilitate overseas disclosures of personal information.
Proposal 22.3: Remove the informed consent exception in APP 8.2(b).
Proposal 22.5: Introduce a definition of ‘disclosure’ that is consistent with the current definition in the APP Guidelines.
Proposal 22.6: Amend the Act to clarify what circumstances are relevant to determining what ‘reasonable steps’ are for the purpose of APP 8.1.
Proposal 23.1: Continue to progress implementation of the CBPR system.
Proposal 23.2: Introduce a voluntary domestic privacy certification scheme that is based on and works alongside CBPR.
FinTech Australia is supportive of additional mechanisms that will increase the alignment between the Privacy Act and international privacy regimes in relation to the cross-border transfer of personal information. In particular, we are supportive of the introduction of an independent certification scheme to monitor and demonstrate compliance with the Privacy Act. The introduction of such a scheme could provide a simple means for foreign entities to engage or interact with the Australian market. It would also assist consumers in knowing which organisations they can trust in relation to their privacy practices, and it would assist organisations by streamlining an organisation’s privacy due diligence with third-party service providers.
In addition, we note that if Standard Contractual Clauses (“SCCs”) are to be introduced into Australia, we recommend an approach that aligns with the EU Commission’s SCCs, to avoid organisations with a presence in Europe and the UK being required to enter into multiple sets of SCCs. A potential option could be to take a similar approach to that currently under consideration by the UK ICO and develop an Australian addendum to the EU Commission’s SCCs.17 Alternatively, an approach could be taken whereby the OAIC clearly specifies the minimum requirements for a data protection agreement, with those requirements aligning with the EU Commission’s SCCs.
Direct right of action/Statutory Tort
|Proposal 25: Create a direct right of action…|
Proposal 26: Statutory tort of privacy
– Option 1: Introduce a statutory tort for invasion of privacy as recommended by the ALRC Report 123.
– Option 2: Introduce a minimalist statutory tort that recognises the existence of the cause of action but leaves the scope and application of the tort to be developed by the courts.
– Option 3: Do not introduce a statutory tort and allow the common law to develop as required. However, extend the application of the Act to individuals in a non-business capacity for collection, use or disclosure of personal information which would be highly offensive to an objective reasonable person.
– Option 4: In light of the development of the equitable duty of confidence in Australia, states could consider legislating that damages for emotional distress are available in equitable breach of confidence.
FinTech Australia does not support the introduction of a direct right of action. We consider that it is more appropriate, and effective, for consumers to raise privacy concerns with the OAIC rather than to pursue court action (an outcome which would dramatically increase both the financial costs and the time frame required to reach a resolution).
However, if a direct right of action was to be introduced:
- processes must be implemented that will seek to ensure that only the most serious interferences with privacy (as determined by the OAIC) may progress to litigation, with the majority of matters instead addressed by the OAIC (through, for example, mediation or conciliation) to provide individuals and organisations with the opportunity to reach an amicable and less adversarial outcome; and
- any legislated assessment of damages must be based on criteria that balance the harm with the amount awarded and recognise alternative ways to mitigate the harm (such as enforceable undertakings).
FinTech Australia supports, in principle, the introduction of a statutory tort for the invasion of privacy that aligns with Option 1 on the proviso that any such tort is strictly limited to intentional or reckless invasions of privacy.
Notifiable Data Breaches Scheme
|Proposal 27.1: Amend subsections 26WK(3) and 26WR(4) to the effect that a statement about an eligible data breach must set out the steps the entity has taken or intends to take in response to the breach, including, where appropriate, steps to reduce any adverse impacts on the individuals to whom the relevant information relates.|
FinTech Australia supports this proposal as it will be an additional step in better equipping organisations with the ability to standardise their privacy incident responses and to increase transparency in relation to the management of privacy incidents.
More broadly, we also support increased alignment between Australia’s Notifiable Data Breaches Scheme and similar international schemes. As a result, any changes to the Notifiable Data Breaches Scheme should align with globalised standards and trends to support organisations that must comply with requirements across multiple jurisdictions, and as mentioned in the Discussion Paper, balance or negate the need for multiple notifications across regulatory entities.
About FinTech Australia
FinTech Australia is the peak industry body for the Australian FinTech Industry, representing over 300 FinTech Startups, Hubs, Accelerators and Venture Capital Funds across the nation. Our vision is to make Australia one of the world’s leading markets for FinTech innovation and investment. This submission has been compiled by FinTech Australia and its members in an effort to drive cultural, policy and regulatory change toward realising this vision. FinTech Australia would like to recognise the support of our Policy Partners, who provide guidance and advice to the association and its members in the development of our submissions:
- DLA Piper
- King & Wood Mallesons
- K&L Gates
- The Fold Legal