In Alberta (Information and Privacy Commissioner) v. University of Calgary, the Supreme Court of Canada rejected an attempt by the Privacy Commissioner of Alberta to view documents over which the University was claiming solicitor–client privilege. Responding to a freedom of information request by a former employee, the University of Calgary refused to produce certain records on the grounds that they were privileged. The Alberta Privacy Commissioner invoked s. 56(3) of the Alberta Freedom of Information and Protection of Privacy Act (FOIPP) to demand production of the records despite “any privilege of the law of evidence” in order to assess the validity of the privilege claim.
The Supreme Court was asked to rule on whether the language of s. 56(3) was sufficient to abrogate solicitor–client privilege. In a news release issued when leave was granted, the Alberta Privacy Commissioner cast the issue as one of access to and transparency of public institutions, and argued that the Commissioner’s ability to test a claim of privilege was fundamental to effective oversight:
This case has significant implications for upholding the access rights of Albertans. Specifically, it impacts the Commissioner’s ability to provide effective oversight when reviewing decisions made by government ministries, post-secondary institutions, school boards and municipalities, among others, in response to access to information requests.
The SCC reaffirmed that solicitor–client privilege is fundamental to the proper functioning of the Canadian legal system, and concluded that any legislation abrogating that right would require clear and explicit language. The Court held that “solicitor-client privilege cannot be set aside by inference but only by legislative language that is clear, explicit and unequivocal. In the present case, the provision at issue does not meet this standard and therefore fails to evince clear and unambiguous legislative intent to set aside solicitor-client privilege. It is well established that solicitor-client privilege is no longer merely a privilege of the law of evidence, having evolved into a substantive protection.”
This case has been followed closely by Privacy Commissioners, Law Societies and legal associations (there were 17 such interveners). The outcome is an affirmation of the importance of the protection of solicitor–client privilege and sends a strong message to legislators about what it will take to set aside that protection. It will be interesting to see whether any legislation is amended in response, or whether governments will quietly rejoice in their strengthened privilege rights when facing freedom of information requests.
Anyone familiar with electronic discovery knows the problems caused by duplicates: information bloat that drives up the cost and complexity of information management and e-Discovery. In most corporate electronic information repositories, multiple copies of records are abundant. In response, early e-Discovery software development focused on identifying and eliminating duplicate content, typically by comparing cryptographic hashes of file contents. For e-Discovery, exact-duplicate detection is now effectively a solved problem.
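The hash-comparison technique is simple enough to sketch. The following is a minimal illustration (function names and the 64 KB chunk size are our own choices, not any particular vendor’s implementation): files whose contents produce the same SHA-256 digest are exact duplicates.

```python
import hashlib
from pathlib import Path


def file_hash(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def find_duplicates(root: Path) -> dict[str, list[Path]]:
    """Group every file under `root` by content hash.

    Any group with more than one member is a set of exact duplicates.
    """
    groups: dict[str, list[Path]] = {}
    for p in sorted(root.rglob("*")):
        if p.is_file():
            groups.setdefault(file_hash(p), []).append(p)
    return {h: paths for h, paths in groups.items() if len(paths) > 1}
```

Real review platforms layer refinements on top of this (near-duplicate detection, email-thread suppression), but byte-identical copies reduce to exactly this kind of hash grouping.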
When discussing information management, we are increasingly finding that duplicates are being addressed in the same way: something to contain, if not eliminate. After all, what value is there in keeping dozens of copies of each record? Many records and document management systems are focused on the “one official copy” objective, with processes being put in place to severely restrict the duplication of content.
The “one official copy” objective fails to recognize the cause of duplicate proliferation: the people who use the information do not trust that they will be able to find what they need, when they need it, unless they organize it themselves. Putting their faith in the centrally controlled document management system is akin to throwing their records into a big pile in a room – sure, they can wade through it and, given enough time, find what they need, but who has the time or patience to do that? The desire to control one’s personal document repository is deeply ingrained in today’s corporate culture, and is one leading reason why information management plans either never get off the ground or fail when implemented.
Perhaps it is time to question the “one official copy” approach. We know that exact duplicates are easy to locate and track. We know that workers are more efficient when they have important information at their fingertips – organized in a way that makes it accessible to them. Rather than struggling to reduce or eliminate duplicates, information management systems should be designed to track and link them. Every time a copy is made – whether by attaching a file to an email and sending it to multiple recipients, saving an attachment to a local folder, or making a copy for a rainy day – the IM system should record its location, apply security as required, and aggregate the metadata each user adds. When the record is managed, all copies should be managed together, according to the record’s overall value to the organization. In the meantime, rather than punishing individual information workers by forcing them to store things in a way that, for them, is inefficient, we can benefit from the added (metadata) value they provide. As long as we know which document is the “official” document, does it really matter how many copies there are?
Unfortunately, this seemingly obvious solution to one information management challenge has been slow to emerge. While some innovative software vendors are moving in this direction, an almost insurmountable hurdle is the fact that almost everyone uses computers whose basic file-and-folder model has changed little since the original IBM PC in 1981. MS-DOS, and the Windows conventions that grew out of it, offer almost no facility to manage information beyond naming files and nesting folders.
But hope is on the horizon. The migration to the cloud (led by Office 365) is ushering in a new era. Office 365 was built from the ground up with IM in mind. Software vendors can (and some already do) manage all information created in Office 365 or stored on OneDrive. Google Docs and Apple iCloud have similar features.
As more and more organizations migrate their desktops and servers to the cloud, the challenge of managing duplicates may finally be overcome.
Eight years ago, when Wortzmans was first founded, we would scour the cases looking for a mere mention of proportionality outside Charter violations or sentencing. Now, proportionality is a hot topic for Canadian courts. Since the Supreme Court’s landmark decision in Hryniak v. Mauldin, proportionality has been a frequent discussion point in civil litigation, including the discovery process. Case after case addresses proportionality concerns.
As e-discovery lawyers, we are always considering how proportionality can be applied to our practice. This often means adopting a more ‘civil’ approach to litigation by balancing competing interests of the parties and taking a reasonable approach, instead of fighting tooth and nail for small victories.
Litigation is often impeded by excessive expense and delay, undermining litigants’ access to justice. Proportionality is now a means to address some of these issues. Since Hryniak, the zealous advocacy that we were taught in law school has been dubbed ‘Old Brain Thinking’, and lawyers today are prevailed upon to balance the costs associated with each step in the litigation against the likely outcomes. This might mean accepting a flawed Affidavit of Documents in circumstances where perfection will not assist in achieving a better (or different) outcome. It could also mean playing your trump card early in the litigation, instead of waiting for trial.
Proportionality will mean different things in different contexts and how it is applied will vary from case to case. But one thing is clear – proportionality will govern how we practice law in Canada.