
COUNTDOWN TO GDPR: Parliamentary Committee Recommends Substantial Revisions to PIPEDA – Part 4 – Enforcement Powers

22/5/2018

 

(Read the original article by Kirsten Thompson, Charles Morgan and Maureen Gillis at Cyberlex)

As reported in our recent post, on February 28, 2018, the House of Commons Standing Committee on Access to Information, Privacy and Ethics tabled in the House of Commons a report entitled Towards Privacy by Design: Review of the Personal Information Protection and Electronic Documents Act. The recommendations in the Committee’s Report are also heavily influenced by the direction set in the European Union General Data Protection Regulation (“GDPR”), which comes into force this year.

We have prepared a multi-part series of posts focusing in more depth on each section of the Report.

In this post, we summarize and comment on the Committee’s findings set out in Part 4 of the Report, which addresses whether the Office of the Privacy Commissioner of Canada (“OPC”) should be given enforcement powers and what those powers should be, and explores some of the challenges associated with enhancing the OPC’s powers.

The other posts in this series are:

Part 1 – Overview and Context of the Report

Part 2 – Consent

Part 3 – Online Reputation/ “Right to be Forgotten”

Part 4 – Enforcement Powers of the Privacy Commissioner

Part 5 – Adequacy of PIPEDA under the GDPR

The Report made a number of recommendations, and consideration of the OPC’s enforcement powers was a key component. The Report’s recommendations, if implemented, could significantly expand the ability of the OPC to impose penalties (both monetary and otherwise) on private Canadian businesses and federally regulated entities, and broaden the OPC’s powers to audit such entities.

Current Enforcement Powers

PIPEDA empowers the Privacy Commissioner of Canada to investigate complaints regarding violations of the Act, which can be initiated by individuals or by the OPC itself. Generally speaking, the Privacy Commissioner’s enforcement powers reflect an ombudsman model, whereby the OPC investigates and mediates complaints under PIPEDA as a neutral third party. The Privacy Commissioner has the power to summon witnesses, administer oaths and compel production of evidence, but not to issue final orders. With the Digital Privacy Act amendments in 2015, the Privacy Commissioner can enter into a compliance agreement with an organization, pursuant to which the organization agrees to steps it will take to bring itself into compliance with PIPEDA. The Privacy Commissioner can also apply to the Federal Court for matters that remain unresolved following the foregoing process, requesting either an order requiring an organization to comply with its compliance agreement or another court order provided for under the Act.

The OPC has long called for enforcement powers (most recently in its 2016-17 Annual Report to Parliament on the Personal Information Protection and Electronic Documents Act and the Privacy Act) but the power to enter into compliance agreements has been the only notable change in its powers to date.

Enforcement Considerations

In the opinion of numerous academic and industry commentators, the limited enforcement powers currently available to the Privacy Commissioner under PIPEDA hamper the effectiveness of the OPC as a regulator. Indeed, current and past Privacy Commissioners have also proposed to the government that granting stronger enforcement powers and incentives for compliance with PIPEDA would enhance the ability of the OPC to protect individuals’ privacy rights. Enforcement options that have been recommended and considered include giving the Privacy Commissioner the ability to impose statutory damages and administrative monetary penalties and to make orders. In addition, the OPC has recommended legislative changes to empower it to take proactive steps in respect of matters such as online reputation.

In the course of its review, the Committee heard from 68 witnesses and received 12 written submissions. Many of these oral and written submissions expressed support for amending PIPEDA to grant the Privacy Commissioner broader enforcement powers, though the specifics of these recommended powers varied. The enforcement powers proposed by witnesses in their submissions to the Committee included, among others, the following:

  • granting the Privacy Commissioner broad discretionary authority to impose administrative monetary penalties or the authority to impose fines;
  • introducing a statutory right of action exercisable by individuals without a prior complaint to the OPC, supported by statutory damages;
  • establishing a maximum deterrent fine based on a percentage of the offending business’ worldwide turnover for the previous year and a second threshold amount, the greater of which would be applied (an approach consistent with the GDPR; a simple illustration of this calculation follows the list below);
  • authorizing the Privacy Commissioner to impose fines on organizations specifically in cases of substantial or systemic non-compliance with privacy obligations; and
  • empowering the Privacy Commissioner to encourage, and in some cases require, the use of privacy protection tools such as codes of conduct, privacy seals, and privacy impact assessments.
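
By way of illustration only, the “greater of a percentage of turnover or a fixed threshold” approach described above can be expressed as a simple calculation. The sketch below uses the GDPR’s own figures for its higher tier of fines (up to 4% of worldwide annual turnover or €20 million, whichever is greater); the function name and inputs are hypothetical, and any figures adopted for PIPEDA would of course depend on what Parliament eventually enacts.

```python
def maximum_fine(worldwide_annual_turnover_eur: float,
                 percentage_cap: float = 0.04,
                 fixed_threshold_eur: float = 20_000_000.0) -> float:
    """Maximum fine under a GDPR-style 'greater of' formula.

    Defaults reflect the GDPR's higher tier (4% of worldwide annual
    turnover or EUR 20 million, whichever is greater); a PIPEDA
    equivalent would use whatever figures Parliament chooses.
    """
    return max(percentage_cap * worldwide_annual_turnover_eur,
               fixed_threshold_eur)

# Example: a business with EUR 100 million in worldwide turnover.
# 4% of turnover is EUR 4 million, so the EUR 20 million threshold governs.
print(maximum_fine(100_000_000))  # 20000000.0
```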

Other suggested approaches included giving the OPC a more proactive role by enabling the Privacy Commissioner to issue advance compliance rulings regarding new technologies, thereby lessening the need for later investigation and enforcement.

Comparing Canada’s privacy legislation to that of other countries around the world, the Report noted that data protection authorities in the United Kingdom, Ireland, New Zealand, and Spain have order-making powers, with the United Kingdom and Spain also having the ability to impose fines. In the UK, fines of up to £25,000 are permitted, whereas in France, fines of up to €300,000 are allowed. Under the GDPR, as noted, fines are based on a percentage of an organization’s annual revenue.

Report Recommendations

The Committee concluded in the Report that there is a demonstrated need to grant the Privacy Commissioner enforcement powers related to PIPEDA and recommended modelling the Canadian approach after the system currently in place in the United Kingdom. Specifically, the Report recommended that PIPEDA be amended to give the Privacy Commissioner enforcement powers, including the power to make orders and impose fines for non-compliance.

In addition, the Report recommended that PIPEDA be amended to give the Privacy Commissioner broad audit powers, including the ability to choose which complaints to investigate, which follows a recommendation made by former Privacy Commissioner Jennifer Stoddart. Such powers would augment the existing powers of the Privacy Commissioner under PIPEDA to conduct audits of how organizations governed by the Act use personal information, make public any information that comes to the Privacy Commissioner’s knowledge in the performance or exercise of any of his or her duties if it is in the public interest, and coordinate with provincial counterparts in initiatives such as the development of model contracts.

Key Take-Aways

The question of granting broader enforcement powers to the Privacy Commissioner goes to the heart of the OPC’s purpose and role. If the OPC is to have an open and collaborative relationship with businesses to encourage and facilitate design of products and services that respect Canadians’ privacy rights, some fear that a stronger enforcement mandate for the OPC could deter such cooperation and openness from the business community. On the other hand, creating a body of precedents for enforcement of PIPEDA could help build greater certainty and confidence among businesses by demonstrating consistency and predictability in application of the legislation.

In the course of submissions, some concern was expressed by business community representatives that broad enforcement powers for the OPC could discourage legitimate business use of information due to fears of non-compliance and the costs of associated compliance endeavours. Seen from another perspective, however, greater enforcement powers for the Privacy Commissioner could help level the playing field between organizations acting prudently and making investments in compliance and those disregarding privacy legislation.

Order-making and other enforcement powers are not a done deal. There are significant legal risks associated with the introduction of order-making powers, including outstanding constitutional problems (chiefly concerning the division of powers).

The introduction of order-making powers, including the ability to impose monetary penalties, would create even more disparity between the Commissioner’s powers under the Privacy Act (the public sector privacy legislation which the OPC also administers) and those under PIPEDA.

Finally, the introduction of order-making powers, including the ability to impose monetary penalties, would likely require a significant overhaul of the OPC’s current institutional structure, since the OPC would then be charged not only with investigating alleged violations but also with adjudicating them. A 2011 report titled Powers and Functions of the Ombudsman in the Personal Information Protection and Electronic Documents Act: An Effectiveness Study notes that “it would seem that replacing the Office with an agency in the decentralized organizations category, and more specifically, a social regulatory agency…endowed with administrative powers (e.g. power of investigation), decision-making powers (e.g. power to make orders and impose penalties) and regulatory powers, is an option that could be given serious attention”.

Countdown to GDPR: Parliamentary Committee Recommends Substantial Revisions to PIPEDA – Part 3 – Online Reputation / “Right to be Forgotten”

15/5/2018

 

(Read the original article by Kirsten Thompson, Charles Morgan and Maureen Gillis at Cyberlex)

As reported in our recent post, on February 28, 2018, the House of Commons Standing Committee on Access to Information, Privacy and Ethics tabled in the House of Commons a report entitled Towards Privacy by Design: Review of the Personal Information Protection and Electronic Documents Act. The recommendations in the Committee’s Report are also heavily influenced by the direction set in the European Union General Data Protection Regulation (“GDPR”), which comes into force this year.

We have prepared a multi-part series of posts focusing in more depth on each section of the Report.

In this post, we summarize and comment on the Committee’s findings set out in Part 3 of the Report, which treats the issues of the “right to be forgotten”, the destruction of personal information and “privacy by design”.

The other posts in this series are:

Part 1 – Overview and Context of the Report

Part 2 – Consent

Part 3 – Online Reputation/ “Right to be Forgotten”

Part 4 – Enforcement Powers of the Privacy Commissioner

Part 5 – Adequacy of PIPEDA under the GDPR

Right to be Forgotten

The issue of online reputation has long been a topic of interest for lawyers whose practice addresses issues of defamation. However, in 2014, the topic was given novel treatment when the Court of Justice of the European Union (“CJEU”) rendered its decision on the issue of de-indexing in Google Spain v. AEPD and Mario Costeja Gonzales (“Google Spain”).

In that matter (which involved an individual who wished to have Google remove links between his name and website content related to an old bankruptcy proceeding that he had been involved in), the CJEU found that search engines, such as Google, must consider requests made by individuals to remove certain websites from the results produced when their name is searched. The case became associated with a nascent so-called “right to be forgotten”, since enshrined in the GDPR.

With this as the backdrop, the Committee examined whether PIPEDA should be amended to include an analogous, express “right to be forgotten” and, if so, what form(s) such a right should take.

The Committee’s first finding in this regard was that when online reputational damage occurs in the context of personal relationships rather than commercial transactions, PIPEDA does not apply (since the Act only applies to the collection, use and disclosure of personal information in a commercial context). Moreover, the Committee noted that the Criminal Code addresses a number of related offences, such as the publication of intimate images without consent. Accordingly, the Committee clarified that the scope of its analysis was limited to the protection of privacy and online reputation in the context of commercial transactions.

Second, the Committee noted that the “right to be forgotten” could be addressed through two distinct types of remedies: (a) the right to erasure; and (b) the right to de-indexing. The former involves the right of an individual to have his/her personal information deleted from a website; the latter involves merely de-referencing or de-indexing the website from search results that include the individual’s name (while leaving the source documents themselves in place).

Right to Erasure

As regards the right to erasure, the Committee noted that PIPEDA does not expressly contain such a right, although the principles of “consent”, “limited retention” and “accuracy” may be applied in certain circumstances to give effect to a limited right of erasure.

For example, according to Principle 4.3.8 of Schedule 1 to PIPEDA, an individual has the right to withdraw consent to the collection, use and disclosure of his/her personal information. If this is then combined with the limited retention principle, pursuant to which an organization may only retain personal information for so long as it is necessary for the fulfilment of the purposes for which it was collected, then (in some circumstances) an individual may successfully argue that, upon withdrawal of their consent, the organization that holds their information should destroy it.

Moreover, pursuant to the “accuracy” principle, organizations should provide opportunities for individuals to update and correct any inaccuracies in the information that is held about them, particularly where such information may be used to make a decision about the individual. Finally, the Committee noted the recent Federal Court decision (A.T. v. Globe24h.com) in which the court ordered the removal of personal information from a website because it determined that the information had not been collected for appropriate purposes, in violation of Section 5(3) of PIPEDA.

In this context, several of the Committee witnesses argued that PIPEDA should be amended to create a more comprehensive right of erasure (to address situations of cyberbullying or revenge porn, for example) that would be similar in scope to the right of erasure found in the GDPR. Others, however, raised substantive concerns about the potential negative impact on freedom of expression, as protected by the Charter. A representative of the Association of Canadian Archivists argued that a right of erasure must not unduly interfere with preserving the integrity of public documents.

Ultimately, the Committee expressed the view that individuals should have the right to have their personal information removed when they end a business relationship with a service provider or when the information was collected, used or disclosed contrary to PIPEDA. The Committee recommended that legislators look to the GDPR as a model for clarifying the scope of such a right. Finally, the Committee concluded that, at a minimum, young people should have the right to have information that is posted about them (by themselves or by others) taken down.

Right to De-indexing

As regards the right to data de-indexing, the Committee not only noted the above-cited Google Spain decision, but also the more recent Supreme Court of Canada decision of Google Inc. v. Equustek Solutions Inc. (discussed in our previous post, here).

Although the latter case involved a de-indexing order in the context of litigation related to the unlawful publication of confidential information and trade secrets, some Committee witnesses argued that a similar logic could be applied to justify de-indexing of personal information. In this regard, as discussed in our previous post, the Office of the Privacy Commissioner of Canada (“OPC”) argued that this form of right to be forgotten already exists in PIPEDA and that it considered it appropriate to have search engines provide the first level of review of a de-indexing request. Since this approach raised concerns regarding the proper role of the private sector in administering such remedies, Committee witnesses argued for the importance of a transparent decision-making process, and the OPC proposed a series of criteria that should be applied in relation to any de-indexing request.

Ultimately, the Committee recommended that the legislator consider including a de-indexing framework in PIPEDA and that the right be expressly recognized in cases related to personal information posted online by individuals when they were minors.

Destruction of Personal Information

Reinforcing the Committee’s discussion of the right to erasure (discussed above), certain witnesses argued that the erasure of data should be compulsory – not simply recommended – once it is no longer necessary for the purpose for which it was collected. Some argued that PIPEDA should be amended to include a clear definition of what is meant by the “destruction” of data, especially in contexts where complete destruction may be impractical (such as where traces of data may be stored in back-up storage). The Committee expressed support for such recommendations.

Privacy by Design

Finally, the Committee explored the idea of expressly introducing “privacy by design” principles into an amended PIPEDA. “Privacy by design” is meant to ensure that privacy considerations are taken into account at all stages of product development, including in relation to the design, marketing and retirement of the product.

The approach is based on the following seven foundational principles:

  1. Proactive not reactive; preventative not remedial
  2. Privacy as a default setting
  3. Privacy embedded into design
  4. Full functionality – Positive sum, not zero sum
  5. End to end security
  6. Visibility and transparency
  7. Respect for user privacy.

Key Take-Aways

Many Canadian businesses have followed the implementation of the GDPR (coming into force in May 2018) only vaguely, on the assumption that the new EU data protection regulation will not apply to them. However, it is clearly time for all Canadian businesses to start paying much closer attention to privacy developments occurring “across the pond”. This is because the GDPR not only has extra-territorial application, it is also proving to be a major source of “inspiration” as Canadian legislators turn their minds to updating Canadian data protection law. Indeed, as Canada assesses what changes to PIPEDA (and analogous provincial legislation) may ultimately be required to maintain a favourable EU “adequacy status”, the GDPR principles may well find their way directly into Canadian legislation.

Blockchain for Smart Cities

14/5/2018

 
On May 8, 2018, the BlockchainHub hosted guest speakers from as far away as Japan and Saudi Arabia. They spoke about:
  1. The emergence of “Smart Cities” in Japan and Saudi Arabia;
  2. Integration between the Internet of things (IoT) and Blockchain; and 
  3. Ways in which blockchain may be used to address privacy issues for smart cities and more generally.
 
Marek Laskowski, Co-Founder of Blockchain Lab and Professor of Information and Computing Technologies at York University, spoke about blockchain in the context of the built environment. He focused on the privacy implications of the miniaturization and ubiquity of connected devices, as well as ways to satisfy Privacy by Design principles, echoing some concerns shared by MT>3 clients. On the eve of changes such as the GDPR in Europe, Canadian mandatory privacy breach notification rules, and the increased use of “Internet of Things” devices, privacy issues for businesses and individuals are top of mind.
 
There are many potential beneficial applications for distributed ledgers, permissioned blockchains and “smart contracts”.
 
MT>3 recognizes the value in these and other emerging technologies.  We are watching them closely as we explore new approaches.