The right to remember
20 June 2014
You may have read about the surprising judgment of the European Court of Justice (ECJ) on 13-May-2014 in Google v Spain, which requires, in (un)certain circumstances, that search engines should on request remove links to, and extracts of, public information about people in the European Union. We'll return to the idiocies of that judgment later in this article.
Against that background, it was heart-warming to read in the UK's Press Gazette on 19-May-2014 (and in Pinsent Masons' Out-law.com 2 days later) that a convicted fraudster had failed to persuade the UK Information Commissioner's Office (ICO) to require an online newspaper archive to remove a report of his "spent" criminal conviction.
In search of the original ICO assessment on this, we visited the ICO web site. We expected to find a press release or some kind of case note, suitably redacted to avoid identifying the complainant. After all, it is important for the UK public to understand how the ICO interprets the Data Protection Act (DPA).
But we found nothing on the case. So on 1-Jun-2014 we wrote to their "Press office", and a linguistically-challenged Lead Communications Officer called James Stanley replied:
"These assessments aren't publically (sic) published."
Not good enough, we thought. Now, you will note that the UK calls its regulator the "Information Commissioner", rather than using Hong Kong's "Privacy Commissioner" title. That's because in the UK, as in many advanced societies, there is also a Freedom of Information Act (FOIA) alongside the Data Protection Act, and the ICO is responsible for both - while in HK, there is no such legislation, only a weak Code on Access to Information. Legislation would make it easier to hold HK's government accountable - which is probably why we don't have any, although the Ombudsman has called for it.
So on 2-Jun-2014, Webb-site filed our first ever request under the UK FOIA, seeking information on the case, with the name of the complainant suitably redacted, so that this would not be a privacy issue in itself. And hey presto, after a holding response the next day, we received a full response on 11-Jun-2014 from Steven Dickinson, Lead Information Governance Officer. The FOIA works. He sent us a suitably redacted copy of the ICO's letter dated 26-Mar-2014 to Newsquest Media Group Ltd. The ICO explained the background to the letter to Webb-site:
"This complaint has been dealt with by the ICO under the terms of section 42 of the DPA, under which the ICO can make an assessment as to whether it is likely or unlikely that a data controller's processing of personal data complies with the requirements of the DPA.
The ICO does not routinely publish the outcome of cases heard under s42 DPA; the outcome is in the form of a letter to parties to the case, and any information contained is likely to be the personal data of the complainant. It is also worth mentioning, at this point, that an ICO assessment of this form is not binding on any party and has no force in law, but simply reflects the Information Commissioner's view on a particular matter, as presented to him.
We consider that it is possible to disclose some of the letter for this particular case, but with information which is the personal data of the complainant redacted."
Incidentally, if we had not been satisfied, then we could have filed a complaint under the FOIA with, guess who, the ICO. That highlights a rather silly situation in which the ICO handles complaints about its own failures to deliver information wearing its other hat under the DPA. In other words "you may complain about me to me". Fortunately, we were satisfied, but the UK Government should remove that conflict of interest by separating the roles of Privacy Commissioner and Information Commissioner.
The ICO Analysis
Turning to the assessment letter, the Case Officer analysed at length the 4 conditions needed to satisfy Section 32 of the DPA, which exempts the "special purposes" (defined in Section 3) of journalism, artistic and literary purposes from most of the DPA. Here are key extracts:
"The first condition requires that personal data are processed only for the special purposes. We take a fairly broad view on what counts as 'processing only for the purposes of journalism' to properly protect [European Convention on Human Rights] Article 10, rights to freedom of expression. If something is done with the aim of disclosing information, opinion or ideas to the public by any means, it will be for the purposes of journalism...I am satisfied that the publication of the article within your archive appears to have been made only for the purposes of disclosing information and opinion to the public" (our bold)
"The second condition requires that the processing is undertaken with a view to the publication of journalistic material. Reporting, publishing and then storing articles within [X] archives (either in paper or electronic format) would appear to satisfy this criteria (sic), so we would accept that this counts as publication of journalistic material."
"The third condition requires [X] to demonstrate that it reasonably believed that publication would be in the public interest. Our main focus here is in...ensuring that the decision in (sic) not obviously unreasonable, i.e. no reasonable person could possibly agree. You have explained that Newspaper archives have existed for hundreds of years and as previously stated, were first kept physically as an historic and public resource. Details of the report on [X] trial and conviction have been retained within your archive in line with this practice and tradition and were already available for anyone wishing to search the archives manually. ...it is my view that the article was published and retained within [X] archives in line with standard industry and [X] practice with a view to publishing in the public interest"
"The fourth condition requires [X] to demonstrate that it reasonably believed that compliance with each provision of the DPA was incompatible with the purpose of journalism. This requires you to demonstrate that [X] did not think that there was a more compliant way to get this information into the public domain....You have explained that your guiding principle is the need to maintain the integrity of the archives as far as possible in order to preserve their value as a historical record, which generally means keeping those archives indefinitely...On the basis of the above, it is my view that [X] decision to publish this article complies with the fourth condition of the DPA."
The conditions having been satisfied, the ICO closed the case, and the report on the fraudster's conviction presumably remains in the archive.
The ICO's interpretation of the law isn't really surprising. The exemption in Section 32 of the DPA follows Article 9 of the EU Data Protection Directive (the Directive) which states:
"Member States shall provide for exemptions or derogations from the provisions of this Chapter, Chapter IV and Chapter VI for the processing of personal data carried out solely for journalistic purposes or the purpose of artistic or literary expression only if they are necessary to reconcile the right to privacy with the rules governing freedom of expression."
The Directive drives legislation in the EU member states. The ECJ, in a previous ruling on 16-Dec-2008 known as Satamedia, examined the issue of freedom of speech and the media. In Finland, tax data on individuals whose income exceeds certain thresholds is public information, and newspapers publish it in lists organised by municipality, name and income bracket. The same information was transferred by the newspaper publisher to a fellow subsidiary, Satamedia, and made available over SMS, for a fee. So you could key in a neighbour's name and check whether they were declaring a reasonable amount of income relative to the lifestyle you observe, or you could look up a potential tenant and find out whether he is likely to be able to afford the rent. Presumably the main purpose of disclosing the data by law was to deter tax evasion by increasing transparency.
The ECJ stated:
"the exemptions and derogations provided for in Article 9 of the directive apply not only to media undertakings but also to every person engaged in journalism...
Secondly, the fact that the publication of data within the public domain is done for profit-making purposes does not, prima facie, preclude such publication being considered as an activity undertaken 'solely for journalistic purposes'... A degree of commercial success may even be essential to professional journalistic activity.
Thirdly, account must be taken of the evolution and proliferation of methods of communication and the dissemination of information. As was mentioned by the Swedish Government in particular, the medium which is used to transmit the processed data, whether it be classic in nature, such as paper or radio waves, or electronic, such as the internet, is not determinative as to whether an activity is undertaken 'solely for journalistic purposes'.
It follows from all of the above that activities such as those involved in the main proceedings, relating to data from documents which are in the public domain under national legislation, may be classified as 'journalistic activities' if their object is the disclosure to the public of information, opinions or ideas, irrespective of the medium which is used to transmit them."
The ECJ ruled that if the sole object of Satamedia's activities is the disclosure to the public of information, opinion or ideas, then it is 'solely for journalistic purposes' under Article 9. Whether this was the case (or whether Satamedia had other objects) was a matter for the national court to determine.
Google v Spain
Now we come to Google v Spain (or to be precise, the Spanish Data Protection Agency, AEPD). The case was triggered by a Mr Mario Costeja Gonzalez, who had defaulted on his social security payments, and as a result the Government had put his real estate up for auction and published notices about it in 1998 in a newspaper, La Vanguardia. The proceedings were later fully resolved. He asked AEPD to require the newspaper to remove the notices, but AEPD declined, taking the view that the publication was legally justified, and presumably, as in the UK ICO's assessment on Newsquest, that archives are OK.
He also asked AEPD to require Google to remove links to the web pages of the newspaper which contained these notices. On this, the AEPD agreed. Google went to the Spanish High Court, which referred key aspects of the case to the ECJ. On 25-Jun-2013, before the ECJ ruling, the Advocate General to the court issued an opinion that the Directive does not establish a general 'right to be forgotten'. He wrote:
"A newspaper publisher's freedom of information protects its right to digitally republish its printed newspapers on the internet. In my opinion the authorities, including data protection authorities, cannot censure such republishing. The Times Newspapers Ltd v. the United Kingdom (nos. 1 and 2) judgment of the European Court of Human Rights demonstrates that the liability of the publisher regarding accuracy of historical publications may be more stringent than those of current news, and may require the use of appropriate caveats supplementing the contested content. However, in my opinion there could be no justification for requiring digital republishing of an issue of a newspaper with content different from the originally published printed version. That would amount to falsification of history."
"An internet user's right to information would be compromised if his search for information concerning an individual did not generate search results providing a truthful reflection of the relevant web pages but a 'bowdlerised' version thereof."
and he concluded:
"The rights to erasure and blocking of data, provided for in Article 12(b), and the right to object, provided for in Article 14(a), of Directive 95/46, do not confer on the data subject a right to address himself to a search engine service provider in order to prevent indexing of the information relating to him personally, published legally on third parties' web pages, invoking his wish that such information should not be known to internet users when he considers that it might be prejudicial to him or he wishes it to be consigned to oblivion."
The ECJ basically disagreed with this conclusion. It wrote:
"Article 12(b) and subparagraph (a) of the first paragraph of Article 14 of Directive 95/46 are to be interpreted as meaning that, in order to comply with the rights laid down in those provisions and in so far as the conditions laid down by those provisions are in fact satisfied, the operator of a search engine is obliged to remove from the list of results displayed following a search made on the basis of a person's name links to web pages, published by third parties and containing information relating to that person, also in a case where that name or information is not erased beforehand or simultaneously from those web pages, and even, as the case may be, when its publication in itself on those pages is lawful..."
"when appraising the conditions for the application of those provisions, it should inter alia be examined whether the data subject has a right that the information in question relating to him personally should, at this point in time, no longer be linked to his name by a list of results displayed following a search made on the basis of his name, without it being necessary in order to find such a right that the inclusion of the information in question in that list causes prejudice to the data subject. As the data subject may, in the light of his fundamental rights under Articles 7 and 8 of the Charter, request that the information in question no longer be made available to the general public on account of its inclusion in such a list of results, those rights override, as a rule, not only the economic interest of the operator of the search engine but also the interest of the general public in having access to that information upon a search relating to the data subject's name. However, that would not be the case if it appeared, for particular reasons, such as the role played by the data subject in public life, that the interference with his fundamental rights is justified by the preponderant interest of the general public in having, on account of its inclusion in the list of results, access to the information in question." (our bold)
The unintended consequences of Google v Spain
The effect of Google v Spain, combined with the ECJ's earlier Satamedia decision on journalism and freedom of expression, is that it is OK to have online archives (of accurate, truthful information) but it may not be OK for search engines to show you where these articles are by indexing them. Let's explore some of the unintended consequences:
On-site search boxes
What the ECJ apparently did not consider is that media sites themselves (including Webb-site) normally have an "on-site" search box that allows readers to find content, and the technology of on-site search engines is no different from Google's; only their scope is narrower. If you have a large media organisation with multiple publications and a long-running archive, then searching that archive could be almost as good as using Google. So why is it OK to search, say, the Guardian's archive on its site, but not Google?
Or does the ECJ ruling mean that media sites must also censor their search results, even when the articles are still legally retained in their archives? The chilling effect of this would be that many media organisations simply close their archives to public searching, because unlike Google, they don't have the financial resources to throw at vetting requests for exclusion of search results. The detrimental effect on transparency that archive closure would bring is far greater than the sum of the inconveniences or embarrassment of public data. Those who cannot learn from history are condemned to repeat it.
These days, perhaps the largest repository of public domain information is Wikipedia, which certainly has its own search facility, and most of the articles contain numerous links to source material, such as articles in online media archives! We see no chance of Wikipedia, which is based outside the EU, acceding to pressure to censor its search results. Yes, people can edit and delete most of its content, but that is a whole different matter, because others can reinstate content.
It should also be relatively simple to build a web site which, although not a search engine itself, sends search requests to the web sites of multiple media archives in one go. If you could simultaneously send search requests to say, the BBC, New York Times, Guardian, Australian and a hundred other media outlets, then who needs Google to find old reports? This new site would not have any index of its own, no results to cull or censor, it would simply present results pulled in real-time from other media web sites.
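The metasearch idea above can be sketched in a few lines. This is a minimal illustration, not a real service: the search-URL templates below are assumptions for illustration, not the sites' documented interfaces, and a real front end would also parse the returned HTML into result lists.

```python
# Sketch of a federated "meta-search" front end: it keeps no index of its
# own, it simply fans a query out to the on-site search pages of several
# media archives and collects the result pages in real time.
# The search-URL templates are illustrative assumptions, not documented APIs.
from concurrent.futures import ThreadPoolExecutor
from urllib.parse import quote_plus
from urllib.request import urlopen

ARCHIVES = {
    "Guardian": "https://www.theguardian.com/search?q={q}",
    "BBC": "https://www.bbc.co.uk/search?q={q}",
    "NYT": "https://www.nytimes.com/search?query={q}",
}

def build_urls(query: str) -> dict:
    """URL-encode the query and substitute it into each site's template."""
    q = quote_plus(query)
    return {name: tmpl.format(q=q) for name, tmpl in ARCHIVES.items()}

def fetch(url: str, timeout: float = 10.0) -> str:
    """Fetch one results page; errors are returned rather than raised."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.read().decode("utf-8", errors="replace")
    except OSError as exc:
        return f"error: {exc}"

def meta_search(query: str) -> dict:
    """Query every archive concurrently; nothing is stored or indexed."""
    urls = build_urls(query)
    with ThreadPoolExecutor(max_workers=len(urls)) as pool:
        pages = pool.map(fetch, urls.values())
    return dict(zip(urls.keys(), pages))

print(build_urls("spent conviction")["Guardian"])
# → https://www.theguardian.com/search?q=spent+conviction
```

Because the site holds no index and no cached results, there is nothing for a data subject to demand be delisted: each response is assembled afresh from the archives themselves.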
And what if a new article links to an older one and repeats content in it? When the new article is indexed by Google, the old article will be one hop away from the new one. Even if search engines are required to exclude known pages from their search results, that doesn't stop people who already know the information from creating new web pages which contain the same information. They are just exercising their freedom of expression. There's no easy, automated way to accurately catch those references, so search engines would then crawl those new pages and the results would pop up again. This means that the data subject would have an endless battle to suppress information.
So while we are at it, here are the links (which we found today on Google, still there) to the 1998 notices regarding Mario Costeja Gonzalez's property which started all this. The Webb-site server is not in the EU and nor are we, so if you are watching in the EU, then look away now. The notices are on 19-Jan-1998 and 9-Mar-1998.
OK Europeans, you can look again now!
If you are an EU citizen without a role in "public life" as the ECJ puts it, then you can write to search engines and have embarrassing information removed, but if you have a role in public life, for example as a politician, then there may be a public interest in rejecting that request.
But the ECJ appears not to have considered that many public figures (with the notable exception of royalty) are not born into public life but aspire to it, for example, as politicians. So they start off as private figures, when the ECJ's ruling benefits them.
So if you are thinking of running for office in your local government, perhaps later moving up to your national congress or even the presidency, then before you start, contact all search engines and clean up your record. Get rid of that report of a conviction for employing an illegal immigrant, or drunk driving, or not paying your council taxes. Erase the press reports of cheating on your first spouse or a messy divorce in which you hid your assets. Then run for office. Your electoral opponents will not be able to find any dirt on you.
It wasn't hard for us to find a good example of this. Take Councilman Stewart Chen, of Alameda, California, who has just been subject to scrutiny in local media in "Councilman's fraud conviction surfaces". Long live the First Amendment of the US Constitution, and long live Article 27 of HK's Basic Law (when the Privacy Commissioner chooses to respect it). As China should know, without a free and open media, officials will be far less accountable.
Knowledge is power
On the other hand, those who already know the dirt, which is no longer visible in search results, will be able to use it against you once you are in public office by threatening to reveal it on public interest grounds. Google will even have to keep an index of that censored information, to tell its search engines not to crawl those pages again - so they will have a lot of valuable information that the electorate will not. Without realising it, the ECJ just made Google and anyone else who runs a search engine more powerful.
Indeed, with some resources, people could build private search engines (preferably outside the EU) to crawl the web and recover this information, making it available on subscription, or just using it for private leverage. If they publish it, then they may take the precaution of blocking the results from users on IP addresses in the EU, but citizens elsewhere in the world (and those with non-EU proxies) will be able to see it. So another unintended consequence of the ECJ's ruling is the "Great Firewall of Europe" where information that is available outside the EU will not be available inside it.
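The "Great Firewall of Europe" would be trivial to build. Here is a minimal sketch of the geo-fence such a site might use; the IP-to-country table is a stand-in for illustration, since a real service would consult a GeoIP database (our assumption, not anything the ruling specifies), and the addresses used are from the reserved documentation ranges.

```python
# Sketch of serving suppressed results only to requesters outside the EU.
# The IP-to-country lookup is stubbed; a real deployment would use a
# GeoIP database. EU membership list is as of 2014 (28 states, incl. GB).
EU_COUNTRIES = {
    "AT", "BE", "BG", "HR", "CY", "CZ", "DK", "EE", "FI", "FR", "DE",
    "GR", "HU", "IE", "IT", "LV", "LT", "LU", "MT", "NL", "PL", "PT",
    "RO", "SK", "SI", "ES", "SE", "GB",
}

# Illustrative stand-in for a GeoIP lookup (documentation-range addresses).
IP_TO_COUNTRY = {
    "203.0.113.7": "HK",
    "198.51.100.9": "DE",
}

def country_of(ip: str) -> str:
    """Resolve an IP address to a two-letter country code (stubbed)."""
    return IP_TO_COUNTRY.get(ip, "??")

def serve_results(ip: str, results: list) -> list:
    """Return the result list only to requesters outside the EU."""
    if country_of(ip) in EU_COUNTRIES:
        return []  # withheld inside the EU
    return results

print(serve_results("203.0.113.7", ["1998 notice"]))   # HK: served
print(serve_results("198.51.100.9", ["1998 notice"]))  # DE: blocked
```

Of course, as the article notes, anyone in the EU with a non-EU proxy defeats this in seconds, which is rather the point: the information divide created is both real and porous.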
Even if the whole world followed the EU's decision, private search engines (and those run by Governments) could still crawl the web and collect the information. That puts those people in a position of knowledge and power that most of us cannot afford.
The ECJ judgment contains multiple instances of "relevant" or "irrelevant" in relation to search results. But the point the court really misses is that relevancy is not a matter for the data subject to determine, nor should it be a matter for a search engine. Relevancy is in the eye of each beholder, in all the circumstances, and in accordance with her own value judgments and beliefs.
A debt default that was eventually settled, or the fact that someone is a discharged bankrupt, may be irrelevant to some people, but relevant to others. Perhaps you are considering letting an apartment to this person. Perhaps you might think, as credit agencies do, that past defaults are a statistical guide to future defaults. A past conviction for drunk driving may be irrelevant to some readers, particularly if they have occasionally broken that law without conviction themselves, but to a reader who is a Mormon or Muslim, the fact that the person was drinking at all, let alone drink-driving, may affect their assessment of his character.
Of course, the irony of this is that if a data subject truly believed that information about them is irrelevant to all possible viewers, then they wouldn't care whether it was displayed in search results or not. The fact that they are going to the effort of having it suppressed in itself suggests that they think the information will be relevant to at least some viewers if made available to them.
Like the arrow of time, public domain disclosure practically isn't reversible. Once you know something, you can't be forced to forget it, and you can't practically be stopped from repeating it, whether to a friend over coffee, or to the whole world on the web. Attempts to suppress the information, or make it harder to find, are really rather futile and as we have shown, dangerous to open society. You don't have a right to make other people forget, but you do have a right to remember.
© Webb-site.com, 2014