8.1 Any discussion about electronic evidence in the digital era must now include reference to encryption. This is an increasingly important issue for law enforcement authorities, because criminals use strong encryption to thwart legitimate investigations by officers.1 It is important to note at the outset that encryption itself is neutral. It is used for legitimate as well as prohibited purposes. Indeed, e-commerce as we know it could not exist without encryption, since it is encryption that makes purchasing on the Internet or a mobile device safe, without fear that our payment details will be intercepted during transmission. However, in this chapter we will be primarily concerned with those who use encryption to hide material.
Alisdair Gillespie, Jessica Shurson and Stephen Mason, ‘Encrypted data’, in Stephen Mason and Daniel Seng (eds.), Electronic Evidence and Electronic Signatures (5th edn, University of London 2021) 397–428.
1Although it is difficult to identify how widespread the use is. Within England and Wales, it seems relatively uncommon, with the Investigatory Powers Commissioner reporting that only 66 approvals for a notice under the Regulation of Investigatory Powers Act (RIPA) 2000, s 49 were granted in 2018: Annual Report of the Investigatory Powers Commissioner 2018, HC 67 (2020), 81.
8.2 Encryption is a form of cryptography. It is about disguising the contents of a message or file. Encryption (or enciphering) is the process by which a plaintext (or cleartext) message is disguised sufficiently to hide the substance of the content. As well as ordinary text, a plaintext message can be a stream of binary digits, a text file, a bitmap, a recording of sound in digital form, the digital images of a video or film, or any other information. When a message has been encrypted, it is known as ciphertext. The opposite procedure – that of turning the ciphertext back into plaintext – is called decryption (or deciphering). An encryption scheme usually uses a ‘key’ to encrypt and decrypt the message. Data that is encrypted properly can be virtually impossible to decrypt without the key. It is an art that has been practised for thousands of years,1 but digital technology has revolutionized it. By way of example, consider the message ‘The Eagle is Alive’. The difficulty with transmitting a message in plaintext is that anybody who sees the message knows its content. They may not, of course, know its meaning, but often the actual content of the message is itself problematic (for instance, if the message is a photograph, then you may not want people to see the image). Encryption turns the message into a code that hides the meaning. So, for example, the message ‘The Eagle is Alive’ may be shown as ‘WEK85%LSc43*4lzqnc782’. If someone obtains that message, she will have no idea what it means. Indeed, she will not even know how many characters are in the original message. Because digital media, including pictures and sound files, are simply binary data, anything can be encrypted, and the encrypted binary file appears to consist of completely random data. A more detailed technical description of encryption is provided in Chapter 1.
1A useful history can be found in Donald Davies ‘A brief history of cryptography’ (1997) 2(2) Information Security Technical Report 14.
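By way of a concrete illustration of the plaintext–ciphertext–key relationship described above, the following minimal sketch uses the Fernet construction from the widely used Python cryptography package. The message and key are, of course, only examples, and the sketch makes no claim about any particular system discussed in this chapter.

```python
# A minimal sketch of symmetric encryption and decryption using the Python
# 'cryptography' package (Fernet, an authenticated AES-based scheme).
# Install with: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # the 'key' referred to in the text
cipher = Fernet(key)

plaintext = b"The Eagle is Alive"
ciphertext = cipher.encrypt(plaintext)
print(ciphertext)                # random-looking bytes, different on every run

# Decryption requires the same key; without it, the ciphertext reveals
# essentially nothing about the content of the message.
recovered = cipher.decrypt(ciphertext)
assert recovered == plaintext
```

Run twice, the same plaintext produces different ciphertext, because the scheme incorporates a random element; to an observer without the key, the output is indistinguishable from random data.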
Methods to obtain access to encrypted data
8.3 As noted, to decrypt encrypted text and render it into plaintext requires a key. In complex environments this could be an algorithm, complex data, a dongle (a physical device with a computer chip contained within it) or even biometric measurements,1 but in most instances it is a code or password. All forms of keys are recognized in the statute that regulates the exercise of investigatory powers to acquire the means by which encrypted electronic data may be decrypted or opened.2 For the purposes of this chapter, we will be restricting our analysis to passwords, because they are the most common key used in personal encryption.
1Biometric measurements may be secure in many instances, but they are spectacularly unhelpful when information is being hidden from law enforcement authorities, because the police will have the power to take photographs and potentially pictures of one’s iris, for example. For the position in Norway, see Ingvild Bruce, ‘Forced biometric authentication – on a recent amendment in the Norwegian Code of Criminal Procedure’ (2017) 14 Digital Evidence and Electronic Signature Law Review 26.
2RIPA 2000, s 56(1).
Breaking the encryption without obtaining the key
8.4 It is not always necessary to use the key to convert the encrypted material into plaintext. Some examples include:
(1) Exploit a known flaw in the encryption scheme.1 This is also known as a ‘vulnerability attack’, where the implementation of the encryption or password protection used is flawed and susceptible to programmatic compromise.2
(2) Obtain access to the plaintext when it is in use. Some law enforcement authorities have been known to gain rapid entry into a suspect’s house when they know that the device is being used, because the contents of the device will be in an unlocked state as plaintext. Provided the device is not allowed to go into sleep mode or lock, the contents can be freely viewed and copied.
(3) Use covertly installed keylogging software to record the suspect entering the password into the computer.3
(4) Locate a separate plaintext version of the encrypted data.4
1Derek Kortepeter, ‘Modern cryptographic methods: their flaws, their subsequent solutions, and their outside threats’ TechGenix, 27 June 2016, http://techgenix.com/modern-cryptography-methods/; Casey Chin, ‘13-year-old encryption bugs still haunt apps and IoT’ Wired, 8 July 2019, https://www.wired.com/story/rsa-encryption-signature-validation-flaws/.
2In R v Kelly (Lee Paul) [2013] EWCA Crim 1893, [2018] 7 WLUK 478, the Court of Appeal upheld the decision of a judge to withhold the technique used to circumvent encryption on a mobile telephone. While the court noted that there must always be a fair trial, it also stated that ‘there is an important public interest in not disclosing information which would jeopardize the effective prevention and detection of crime’, at [33].
3U.S. v Scarfo 180 F.Supp.2d 572 (D.N.J. 2001). While this may not tell you who depressed the keys (thus proving who had control), it would provide access to the encrypted material, which, by itself, is likely to assist the wider investigation. See also Giuseppe Vaciago and David Silva Ramalho, ‘Online searches and online surveillance: the use of Trojans and other types of malware as means of obtaining evidence in criminal proceedings’ (2016) 13 Digital Evidence and Electronic Signature Law Review 88.
4Officials were able to examine draft copies of a ransom note automatically saved by word processing software on the suspect’s computer in Commonwealth of Pennsylvania v Copenhefer, 526 Pa. 555, 587 A.2d 1353 (Pa. 1991), abrogated on sentence by Commonwealth of Pennsylvania v Rizzuto, 777 A.2d 1069 (Pa. 2001).
8.5 Most encryption programs are extensively tested, which means that vulnerabilities are rare. Installing keylogging software is difficult in the era of multiple devices, high-quality firewalls and anti-virus software; the installation of such software by the authorities may itself also be illegal. The era of solid-state memory and cloud storage means that encryption can be near-instantaneous. While entering a house requires a legal warrant, it is also high-risk: if a drive is removed, or a connection to the cloud is broken, a device can be configured to re-encrypt the plaintext contents immediately. Thus, in many instances there is insufficient time to gain entry before the suspect can do any of these things.
8.6 A more productive way of breaking the encryption is to identify the key. Let us assume that the police wish to obtain the key from a person they suspect of committing a crime, whose documents are protected by encryption. The police could obtain it in the following ways:
(1) The suspect could voluntarily provide the password.
(2) The password might be written down. People frequently write down passwords (so they do not forget), which is remarkably helpful for those trying to find them.
(3) It is possible to use intelligence, including profiling, to guess what the password is, operating on the basis that the password may be something memorable about that person, such as a name or date of birth.1
(4) The use of decryption tools to break the encryption, including brute-force attacks. Software will allow, for example, every word in the dictionary to be tried as the password (a toy sketch of such a dictionary attack appears after this list). A ‘brute-force attack’ will use powerful computers to try every possible combination of a key.2 The difficulty with this method is that complex and long keys are almost impossible to break in this way.
(5) The suspect may be compelled to surrender the key.
1For instance, United States border agents successfully guessed that Michelle Lopez used her date of birth as a password: United States v Lopez, 2016 WL 7370030 (S.D. Cal. Dec. 20, 2016). In Rollo (William) v HM Advocate 1997 JC 23, 1997 SLT 958, 1996 SCCR 874, [1996] 9 WLUK 194, [1997] CLY 5753, the police succeeded in gaining access to an encrypted part of a Memomaster notebook by trying a number of combinations, one of which – the appellant’s date of birth – was successful. See Ian Grigg and Peter Gutmann, ‘The curse of cryptography numerology’ (2011) 9(3) IEEE Security & Privacy 70 for a brief foray into the failure of everything but the cryptography.
2In R v ADJ [2005] VSCA 102 the defendant claimed that he could not recall the password, and suggested possible alternatives, none of which were correct, so the police used password-cracking software that took over four months to identify the password. The encrypted partition revealed a large quantity of abusive images of children.
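The following toy sketch illustrates the dictionary attack described in point (4). It assumes, purely for illustration, that the key was derived from a password using a known key derivation function (PBKDF2) and a known salt; the password, salt and wordlist are all hypothetical.

```python
# A toy sketch of the dictionary attack described in point (4). It assumes,
# purely for illustration, that the key was derived from a password using a
# known key derivation function (PBKDF2) and a known salt.
import base64
import hashlib
from cryptography.fernet import Fernet, InvalidToken

def key_from_password(password: str, salt: bytes) -> bytes:
    # Derive a 32-byte key from the password and encode it for Fernet.
    raw = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return base64.urlsafe_b64encode(raw)

def dictionary_attack(ciphertext: bytes, salt: bytes, wordlist):
    for candidate in wordlist:
        try:
            plaintext = Fernet(key_from_password(candidate, salt)).decrypt(ciphertext)
            return candidate, plaintext   # candidate password was correct
        except InvalidToken:
            continue                      # wrong password, try the next word
    return None, None

# Demonstration with a deliberately weak password.
salt = b"example-salt"
token = Fernet(key_from_password("letmein", salt)).encrypt(b"The Eagle is Alive")
print(dictionary_attack(token, salt, ["password", "123456", "letmein"]))
```

The sketch also illustrates why the method fails against long, random passwords: the search space grows far beyond what any wordlist or brute-force enumeration can cover in realistic time.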
8.7 The fifth option – compulsion – has two possible forms. The first is torture, which is illegal in most countries; the second is a legal requirement to comply, usually backed by penal sanction. It is the latter method that countries have begun to adopt, although not without dissenting opinions.
Compelling disclosure in England and Wales
8.8 England and Wales became one of the first jurisdictions to (controversially) introduce specific powers to allow the police to compel the disclosure of a password. The powers are set out in Part III of the Regulation of Investigatory Powers Act 2000 (RIPA). Alongside RIPA, a Code of Practice is issued under the authority of the Act.1 The latest version was published in 2018.2 The Code expands on the rules and procedures set out in RIPA 2000, providing greater certainty to investigators, judges and suspects in understanding how the disclosure powers under the Act will be exercised.
1RIPA 2000, s 71(4). The Codes of Practice are released as statutory instruments and, therefore, have the force of secondary legislation.
2Investigation of Protected Electronic Information: Revised Code of Practice (Home Office 2018).
8.9 At the heart of the RIPA provisions is the concept of ‘protected information’. This is defined in s 56(1) as:
any electronic data which, without the key to the data–
(a) cannot, or cannot readily be accessed; or
(b) cannot, or cannot readily be put into an intelligible form.
8.10 Encrypted data would be the most obvious example of ‘protected information’, although the provisions in RIPA are wider than this.1 There are three powers contained in RIPA 2000 that relate to protected information:
1. The power to require disclosure of protected information in an intelligible form.2
2. The power to require disclosure of the means to either obtain access to protected information, or render the protected information into plaintext.3
3. The power to attach a secrecy provision to any disclosure requirement (a ‘tipping off’ provision).4
1In R v Spencer (Jeffrey) [2019] EWCA Crim 2240, [2019] 12 WLUK 246, the appellant had been convicted under RIPA s 53 for not providing the PIN to unlock two mobile telephones in his possession. A disclosure notice under s 49 had been presented to the appellant, who declined to provide the codes.
2RIPA 2000, s 49.
3RIPA 2000, s 49 when read in conjunction with s 50(3).
4RIPA 2000, s 54.
8.11 These powers are considered below. While the first two will lead to the disclosure of information in an intelligible form, the first differs in that it does not technically require the surrendering of the key. It suffices that the person produces the data in an intelligible form. Thus, for example, if there were other documents that were encrypted that were not relevant to the crime, the police would not see them. However, in many instances it is unlikely that the police would be content with an assurance that other documents are not relevant and, instead, they will require the key to be disclosed, which will either provide access or allow the encrypted material to be rendered intelligible. In essence, the difference is who does the decryption. In the first scenario it is the suspect, whereas in the second scenario it will be the relevant investigator, or a nominated person.
8.12 Where a suspect does not voluntarily provide her key, or where the police are unable to identify the key using the techniques discussed above, they may seek to serve a notice requiring disclosure of either the information sought or the key. The police can only do so with the permission of the National Technical Assistance Centre (NTAC).1 NTAC is a government unit that became part of GCHQ (Government Communications Headquarters) in 2006, and has specialist officers dedicated to decrypting ciphertext, including through brute-force attacks and other technical solutions. NTAC will determine whether the encryption is known to it and can be circumvented without the need to invoke RIPA 2000. Where it cannot, NTAC will determine whether the case is appropriate for an application for a notice under s 49.
1Investigation of Protected Electronic Information: Revised Code of Practice, [3.9].
8.13 The power to require disclosure applies where protected electronic information comes into the possession of an officer1 as a result of exercising a statutory power2 or by other lawful means.3 In order to serve such a notice, s 49(2) provides that a person who has been authorized to give permission must have reasonable grounds to believe:
(1) the suspect has the protected information in his possession;
(2) the imposition is necessary for a specified purpose;
(3) the imposition is proportionate; and
(4) it is not reasonably practicable to obtain the information in any other way.
1A police officer, an officer of Customs and Excise or a member of the intelligence services.
2For example, the police have the right to search any premises occupied or controlled by a person who has been arrested for an indictable offence (Police and Criminal Evidence Act (PACE) 1984, s 18).
3For example, a constable has exercised a search warrant: PACE 1984, s 8.
8.14 The person who is authorized to give permission is a circuit judge or a district judge (Magistrates’ Court).1 Judicial permission recognizes the sensitivities of compelling the disclosure of protected information and provides reassurance that there is independent scrutiny of the grounds set out above. However, it should be remembered that the judge need only have reasonable grounds to believe the criteria are met, and this is a relatively low threshold.2 The purposes mentioned above include the interests of national security, the purpose of preventing or detecting crime and the interests of the economic well-being of the United Kingdom.3 It is notable that the threshold is crime, and not serious crime, the latter being required for some other types of investigatory powers.4
1RIPA 2000, Schedule 2, paragraph 1(1).
2It is less than the civil and criminal standards of proof. It requires that there is some evidential basis to believe that something might be true, as distinct from more likely than not to be true (civil standard) or sure to be true (criminal standard).
3RIPA 2000, s 49(3).
4Intrusive surveillance (surveillance that takes place in residential premises or a private vehicle) will only be authorized if, among other things, it is for the prevention or detection of serious crime – see RIPA 2000, s 32(3)(b).
8.15 Proportionality is a concept that is now well understood by the courts. Proportionality is best thought of as requiring ‘reasonableness between the objective sought and the means used to achieve that end’.1 It requires a balance to be struck, ensuring that a measure is not disproportionate to the aim. It requires an examination of alternatives, but does not necessarily require that the least intrusive method always be chosen.2 The Code of Practice suggests several aspects of proportionality that the judge should consider:
•The extent of the proposed interference with privacy against what is sought to be achieved;
•How and why the methods to be adopted will cause the least possible interference to the subject and others;
•Whether the activity is an appropriate use of the legislation and is a reasonable way, having considered all reasonable alternatives, of obtaining the necessary result;
•What other methods, as appropriate, were either not implemented or have been employed but which were assessed as insufficient to fulfil operational objectives without the use of the proposed conduct.3
1Halsbury’s Laws (5th edn, 2018), vol 61A, para 17.
2R (on the application of Corner House Research) v Director of the Serious Fraud Office [2008] UKHL 60, [2009] 1 AC 756, [2008] 3 WLR 568, [2008] 4 All ER 927, [2008] 7 WLUK 921, [2008] Lloyd’s Rep FC 537, [2009] Crim LR 46, (2008) 158 NLJ 1149, (2008) 152(32) SJLB 29, Times, 31 July 2008, [2008] CLY 1661.
3Investigation of Protected Electronic Information: Revised Code of Practice, [3.41], bullet points in the original.
8.16 The last bullet point realistically does not add much more than the final requirement contained in the Act – that it is not reasonably practicable to obtain the information by other means. It is somewhat strange that it is included on the face of the legislation given that the consideration of alternatives is an important part of proportionality. However, its inclusion perhaps reflects the view that Parliament expects the alternatives to be seriously considered, with a s 49 notice being issued only where there is no real alternative.
8.17 A person having possession of information or a key to protected information is defined in s 56(2), RIPA 2000, as follows:
References in this Part to a person’s having information (including a key to protected information) in his possession include references–
(a) to its being in the possession of a person who is under his control so far as that information is concerned;
(b) to his having an immediate right of access to it, or an immediate right to have it transmitted or otherwise supplied to him; and
(c) to its being, or being contained in, anything which he or a person under his control is entitled, in exercise of any statutory power and without otherwise taking possession of it, to detain, inspect or search.
8.18 Three different scenarios exist under this definition:
(i) a person may possess a key if it is under his control, or
(ii) if he has an immediate right of access to it, or an immediate right to have it transmitted or supplied to him, or
(iii) if he (or a person under his control) is entitled, in exercise of any statutory power and without taking possession of it, to detain, inspect or search the thing which contains the key.
8.19 In the second and third scenarios, a person may be deemed to have a key, although he does not have the key himself. This is a fairly important provision, because the managerial officers of an organization, whatever the legal form the organization takes, are the ones responsible for the proper management of the private key, rather than the operational staff members.1 Where the relevant ciphertext is to be found on a company’s device, it would therefore make sense to serve the s 49 notice on an officer or senior manager of the organization, because she will have the power to order compliance by another.
1Ross Anderson, Security Engineering: A Guide to Building Dependable Distributed Systems (2nd edn, John Wiley & Sons 2008) para 3.7.4 for a discussion on the principles involved in this process. (Professor Anderson was updating his book as this text was being updated. Some of his book will be available as open source at https://www.cl.cam.ac.uk/~rja14/book.html for a short period before the text is published. The entire book will be made available again as open source in 2023.)
8.20 The form of a disclosure notice is set out in s 49(4), RIPA 2000. It must, among other things, describe the protected information to which the notice relates;1 specify the grounds upon which the disclosure is believed to be necessary;2 specify the time by which the notice is to be complied with,3 which must allow a reasonable period for compliance, depending on the circumstances of the case;4 and specify the disclosure required and the form and manner in which it is to be made.5 Where there is a cost to complying with a s 49 notice, s 52 provides for the Secretary of State to make an appropriate contribution towards such costs, although it is the investigating authority that ultimately bears the burden of paying them.6
1RIPA 2000, s 49(4)(b).
2RIPA 2000, s 49(4)(c).
3RIPA 2000, s 49(4)(f).
4RIPA 2000, s 49(4) proviso.
5RIPA 2000, s 49(4)(g).
6Investigation of Protected Electronic Information: Revised Code of Practice, 4.4.
Disclosure of protected information and keys
8.21 Where a person is served with a s 49 notice requiring the disclosure of protected information in an intelligible form, she may either provide the key or use it to render the encrypted material into an intelligible form,1 unless the notice states that she must surrender the key. The Code of Practice specifically notes that rendering into an intelligible form means returning the data to the state it was in before encryption was applied, even if this means that there is other protection that might prevent someone from reading it immediately.2 Consider an example:
S has a Word Document that is protected by a password. To further enhance security, S uses encryption technology on the document. The police secure permission from a judge to serve a s 49 notice. S can either remove the encryption, or supply the key to do so. However, the s 49 notice may not require him to provide the password to the Word Document.3
1RIPA 2000, s 50(1).
2Investigation of Protected Electronic Information: Revised Code of Practice, [3.16].
3Other legal powers may do so, but in any event, computer forensic programs may be able to read the data in such a file once it has ceased to be encrypted. This will depend on the version of Word used to create the document (different versions use different types of password protection) and the complexity of the password.
8.22 There may be times when the person to whom the notice is directed does not have the key, or cannot gain access to the key. In such instances, she must give up the keys she actually has, although she does not have to disclose every key in her possession if disclosure of some of them suffices to render the information intelligible.1 It follows that where a notice is to be served on a body corporate or a firm and it is obvious that more than one person may be in possession of the key, the notice should be directed to a senior officer, partner or senior employee.2 However, where it is considered that the circumstances are such that the purpose of the notice would be defeated if it were to be served on the most appropriate person (for instance, she may be the subject of an investigation), then the notice may be served on another individual.3
1RIPA 2000, s 50(3) and the effects of s 50(4), (5) and (6). See also s 50(7) and (8).
2RIPA 2000, s 49(5) and (6).
3RIPA 2000, s 49(7).
8.23 An exception is created as regards the disclosure of keys that are used for generating electronic signatures. Section 56(1) RIPA 2000 defines an ‘electronic signature’ as:
anything in electronic form which
(a) is incorporated into, or logically associated with, any electronic communication or other data;
(b) is generated by the signatory or other source of the communication or data; and
(c) is used for the purpose of facilitating, by means of a link between the signatory or other source and the communication or data, the establishment of the authenticity of the communication or data, the establishment of its integrity, or both.
8.24 Where a key is used only for this purpose, it does not have to be disclosed in response to a notice, provided it has in fact not been used for any other purpose.1 It might be useful to recall that a key pair has more than the single function of producing an electronic signature. The same key pair can be used to encrypt a message, depending on the algorithm used.
1RIPA 2000, s 49(9).
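The dual use mentioned in 8.24 can be illustrated with a short sketch: the same RSA key pair first generates a signature and is then used to decrypt a message encrypted with the corresponding public key. The sketch uses the Python cryptography package and is an illustration of the principle only, not of any particular deployed system.

```python
# A sketch of the dual use described above: the same RSA key pair first
# generates an electronic signature and is then used for confidentiality.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()
message = b"contract text"

# Use 1: generating an electronic signature (the use exempted by s 49(9)).
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = private_key.sign(message, pss, hashes.SHA256())
public_key.verify(signature, message, pss, hashes.SHA256())  # raises if invalid

# Use 2: the same key pair used to keep a message confidential. Once this
# happens, the key is no longer used 'only' for generating signatures.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = public_key.encrypt(message, oaep)
assert private_key.decrypt(ciphertext, oaep) == message
```

As the second half of the sketch shows, whether a key pair has in fact been confined to signature use is a question of how it has been used, not of how it was labelled, which is the practical difficulty discussed in the next paragraph.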
8.25 However, this exemption may be narrower than it seems. In a commercial context, where more than one person may properly have access to a key, the person served with the notice may not be able to be sure that a key, despite being intended for signature purposes, has never been used to decrypt a message encrypted with the corresponding public key (there is no disclosure obligation if the key ‘has not in fact been used for any … purpose [other than that of generating electronic signatures]’).1 Although it will be for the prosecution to prove that a key has been used for such a purpose (one that does not involve generating electronic signatures and is therefore subject to seizure), the mere assertion of this fact by the person demanding access to the key would place the recipient of the notice in the difficult position of proving a negative in order to resist the demand.
1RIPA 2000, s 49(9)(b).
Failure to comply with a notice
8.26 Where a person knowingly fails to make the disclosure required by the notice, he commits a criminal offence.1 Section 53(2) sets out an important presumption of possession of the key:
In proceedings against any person for an offence under this section, if it is shown that that person was in possession of a key to any protected information at any time before the time of the giving of the section 49 notice, that person shall be taken for the purposes of those proceedings to have continued to be in possession of that key at all subsequent times, unless it is shown that the key was not in his possession after the giving of the notice and before the time by which he was required to disclose it.
1RIPA 2000, s 53(1).
8.27 An evidential burden is placed on the recipient. This requires her to adduce ‘sufficient evidence of the fact … to raise an issue’.1 This does not mean she needs to prove that it was not in her possession. Instead, she must adduce some evidence (including through cross-examination) to show that it is not just a hypothetical argument.2 Once such evidence is adduced, the prosecution must disprove the assertion beyond all reasonable doubt.3
1RIPA 2000, s 53(3)(a).
2Ultimately it is for a judge to decide, as a matter of law, whether sufficient evidence has been adduced: see, by implication, Bratty v Attorney-General for Northern Ireland [1963] AC 386, [1961] 3 WLR 965, [1961] 3 All ER 523, [1961] 10 WLUK 5, (1962) 46 Cr App R 1 (1961) 105 SJ 865, [1961] CLY 1839.
3RIPA 2000, s 53(3)(b).
8.28 A defence exists where a person can show that ‘it was not reasonably practicable for him to make the disclosure required by virtue of the giving of the section 49 notice before the time by which he was required, in accordance with that notice, to make it’ but only if ‘[he] did make the disclosure as soon after that time as it was reasonably practicable for him to do’.1 Unlike the presumption of possession of the key, this imposes a legal burden on the defence. Thus, the defendant must prove, on the balance of probabilities, that it was not reasonably practicable to disclose the key or data in the time frame required. The defence only applies if she subsequently makes the disclosure, and does so as soon as it is reasonably practicable. Accordingly, it would not assist those who continue to refuse to disclose the key.
1RIPA 2000, s 53(4).
8.29 A person who honestly does not know the key, or cannot remember it, would not commit the offence, as she must knowingly refuse to surrender the key. If she does not remember the key, then she cannot surrender it. Whether it is credible that she has forgotten it is a matter of fact for the jury. In many instances, forensic data will be important here. While forensic software cannot say what is in an encrypted file, it can often tell when the file was last viewed. If a person had been viewing the encrypted data just before the notice under s 49 was served, it is unlikely that a jury would consider it credible that she has now forgotten the key.
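As a much simplified stand-in for this forensic point, ordinary filesystem metadata already records when a file was last read or modified, even where its contents are encrypted. Real forensic tools examine far more than this; the file name below is hypothetical.

```python
# A much simplified stand-in for the forensic point above: filesystem
# metadata records when a file was last read or modified, even where its
# contents are encrypted. The file name 'container.enc' is hypothetical.
import datetime
import os

def last_touched(path: str) -> dict:
    st = os.stat(path)
    fmt = lambda ts: datetime.datetime.fromtimestamp(ts).isoformat()
    return {
        "last_accessed": fmt(st.st_atime),    # access times may be disabled on some systems
        "last_modified": fmt(st.st_mtime),
        "metadata_changed": fmt(st.st_ctime), # creation time on Windows
    }

print(last_touched("container.enc"))
```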
8.30 The offence under s 53 is triable either in the Magistrates’ Court or the Crown Court. The penalty depends on what it is believed the encrypted data contains. Where it is a case of ‘national security’ or ‘child indecency’, the maximum penalty on conviction is five years’ imprisonment;1 otherwise it is two years’ imprisonment.2 A case is a ‘national security case’ if the application made under s 49 stated that the case was ‘in the interests of national security’.3 Similarly, a case is a ‘child indecency case’ if it was stated in the s 49 application that the applicant believed the suspect was involved in the taking, making, distribution or possession of indecent photographs of a child.4
1RIPA 2000, s 53(5), s 53(5A)(a).
2RIPA 2000, s 53(5), s 53(5A)(b).
3RIPA 2000, s 53(5B).
4RIPA 2000, s 53(6), (7).
8.31 It should be remembered that the threshold for applications under s 49 is reasonable belief. Accordingly, a suspect is at risk of the higher sentence purely because a judge is satisfied that there is reasonable belief that the encrypted material poses a threat to national security or consists of indecent photographs of children. Reasonable belief is a low threshold and is significantly below the standard of proof ordinarily required for higher sentences. For example, where there is a dispute between the prosecution and defence over the circumstances of a guilty plea, the matter is normally resolved in a ‘Newton Hearing’,1 where the prosecution must prove its version of the facts to the ordinary criminal standard.2
1R. v Newton (Robert John) [1982] 12 WLUK 57, (1983) 77 Cr App R 13, (1982) 4 Cr App R (S) 388, [1983] Crim LR 198, [1983] CLY 815.
2R. v Ahmed (Nabil) [1984] 12 WLUK 43, (1985) 80 Cr App R 295, (1984) 6 Cr App R (S) 391, [1985] Crim LR 250, [1985] CLY 828.
8.32 At issue might be whether the lower threshold can be justified. At first sight it would seem difficult to do so. However, the point of s 53 is that the police cannot decrypt data without the cooperation of the defendant. If they could prove, to the criminal standard, that the encrypted folder contained, for example, indecent images of children, then they would not need to serve a s 49 notice in the first place. While it may be difficult to justify the full criminal standard, it might be possible for the prosecution to prove the contents to a lower standard, for example on the balance of probabilities, through circumstantial evidence (email messages, IP traces etc.) that indicates the contents of the encrypted material.1 Presumably, the higher penalty can only be used where it is not known what the contents are. Consider an example:
The police believe that S is storing indecent photographs of children on an encrypted memory stick. They serve a s 49 notice, which S refuses to comply with. After proceedings under s 53 have begun, NTAC manages to gain access to the memory stick and discover that it does contain pornographic pictures, but of adults.
1The facts of Greater Manchester Police v Andrews [2011] EWHC 1966 (Admin), [2011] 5 WLUK 614, [2012] ACD 18, which will be examined later, would be a good example of this. Indecent photographs of a child were found on an unencrypted laptop. Two encrypted memory sticks were discovered alongside the computer. It could be argued that it is more likely than not (civil burden) that these sticks contained more illegal images.
8.33 The Crown Prosecution Service (CPS) may still wish to proceed with the prosecution under s 53 because the suspect has failed to comply with the s 49 notice. A literal reading of s 53 would mean that S is liable to up to five years’ imprisonment, because the grounds for seeking the notice will have included reasonable grounds for believing that S was hiding indecent photographs of children. However, as this is now known not to be true, S should only be sentenced to a maximum of two years’ imprisonment. In any event, the courts adopt a strict approach to s 53 cases. In R. v Cutler (Barry George)1 the Court of Appeal held:
[A s 53 offence is] a very serious offence because it interferes with the administration of justice and it prevents the prosecuting authorities and the police finding out what offences someone has committed.2
8.34 This is an important point. Encryption puts evidence beyond the reach of law enforcement authorities and prosecutors. It means the full extent of the criminality cannot be ascertained, and the courts must treat this seriously. A proven s 53 offence is a deliberate attempt to conceal evidence from the competent authorities, and this must merit harsh sanctions.
8.35 The seriousness of the offence is perhaps reflected in the comments of the Court of Appeal in R v Padellec (Pierre).1 The appellant entered a plea of guilty to an offence under s 53. He came to the attention of the police as a possible acquaintance of a person known to be involved in the trafficking of children. His computer (which included an encrypted folder) was recovered, and while no indecent images of children were found, search terms relating to indecent photographs were discovered. The appellant alleged that he had purchased the encrypted device in Belgium and had no knowledge of the key. Following negotiations, a basis of plea was tendered and accepted by the Crown. This was as follows:
1. The defendant accepts that he did not provide passwords as requested.
2. He did not do so because he knew he had used wiping software to remove evidence of a small number of images, which he accepts were indecent.
3. The defendant had accessed these images during the currency of internet browsing. The defendant will assert that the content of these images did not depict images of very young children. He cannot state the ages. The images did not contain scenes of sexual or any other type of violence to children.2
8.36 The importance of the third basis of plea is that it states the defendant did not obtain access to images of the very worst forms of indecent photographs of children, which would have led to more severe sentencing.1 The judge accepted the plea, but suggested that he did so with reluctance. The Court of Appeal was scathing about the basis of plea. In giving judgment, Collins J said:
It seems to us that in a case such as this, it is entirely wrong for a basis of plea to be accepted, either by the prosecution or ultimately by the judge. What it does is to enable the defendant in question to identify, to his advantage, what was or was not on the computer and to get a lesser sentence than otherwise might be appropriate. That is to enable him to dictate, wrongly, what the situation is. The whole point of requiring access is so that it can be seen what was, in fact, there. We express hope that in a situation that arose in this case, there will never again be a basis of plea accepted which is based on keeping the contents secret and the defendant saying, to his advantage, what was or was not contained.2
1At the time of this decision, the sentencing for possession of indecent photographs was subject to the definitive sentencing guideline of 2007. This created five categories of seriousness. The basis of plea would ensure that it did not fall within the highest category or contain any aggravating factors. The guideline was replaced in 2013, but the changes are irrelevant to this decision.
2[2012] EWCA Crim 1956 at [11].
8.37 If the defendant had not viewed, or stored, images that constituted the most serious examples of indecent photographs of children, then he could have proved this by allowing access to the device. Instead, the prosecution (and the judge) decided that the defendant could admit that he had looked at illegal content but could also keep the details of this illegality secret. The Court of Appeal, quite rightly, considered this an affront to justice. They stated, correctly, that in the absence of an explanation, an assumption of the worst-case basis should be made and the person sentenced accordingly. To avoid this, the defendant could simply provide access to the images to allow their proper classification. This does not breach the presumption of innocence, as the offence itself relates solely to the provision of indecent photographs of children, which the defendant conceded. Sentencing is separate from ascertaining guilt, and it must be right for the court to take into consideration the refusal to show the images to the court.
Obligations of secrecy and tipping off
8.38 There is a power to attach a secrecy provision to any disclosure requirement.1 This will require the person to whom the notice is given, and every other person who becomes aware of its contents, to keep the giving of the notice, its contents and the things undertaken in responding to it, a secret.2 Breach of this requirement is punishable by a maximum of five years’ imprisonment,3 which is a heavier sentence than that which can be imposed on someone under s 53, save where it is a national security or child indecency case. Several defences exist to this offence, including:
(1) the disclosure was effected entirely by the operation of software designed to indicate when a key to protected information has ceased to be secure, and it was not reasonably practicable to prevent this;4
(2) that the disclosure was made by or to a professional legal adviser as part of giving legal advice as to the provisions of Part III of RIPA. The disclosure must have been by or to the client or a representative of the client;5
(3) that the disclosure was made by a legal adviser in contemplation of any legal proceedings;6
(4) that the disclosure was made to a judicial commissioner, or someone authorized by a commissioner;7
(5) that the recipient neither knew, nor had reasonable grounds to suspect, that the notice contained a secrecy requirement.8
1RIPA 2000, s 54.
2RIPA 2000, s 54(1).
3RIPA 2000, s 54(4).
4RIPA 2000, s 54(5).
5RIPA 2000, s 54(6).
6RIPA 2000, s 54(7).
7RIPA 2000, s 54(9).
8RIPA 2000, s 54(10).
8.39 In all cases, a legal burden is placed on the defence: it must prove the salient facts on the balance of probabilities. Where the defence is that a disclosure has been made to, or by, a professional legal adviser, the defence does not apply where the purpose of the disclosure is to further any criminal purpose.1
1RIPA 2000, s 54(8).
8.40 It should be noted, however, that the effectiveness of the ‘tipping off’ offence is debatable. It might be possible for a person to sign off her email correspondence with a disclaimer, such as ‘I will always explain why I revoke a key, unless the UK government prevents me using the RIP Act 2000’. Using this qualification, let us assume that a correspondent revokes a key. If the correspondent is asked for the reason and she replies that she cannot give one, it is doubtful if she can be convicted of the offence of tipping off, though this is exactly what she has done. There is no suggestion that a disclosed key cannot lawfully be revoked.
8.41 It has been held, albeit in a first-instance Magistrates’ Court decision, that RIPA 2000 is the only way that the authorities can compel access to encrypted data. Lauri Love is a UK citizen who was a member of the Anonymous hacker collective.1 He was accused of hacking into US government sites and stealing information.2 The US government requested his extradition and the National Crime Agency (NCA) arrested him, exercising a warrant to seize his computers. The computers were found to be encrypted, and a notice under RIPA 2000 s 49 was served, requiring him to disclose either the key or render the information intelligible. He declined to do so. The USA sought his extradition, but this was ultimately refused by the English courts, in part because of his mental health, but also because he could be tried for the offences in England and Wales.3 To date, no criminal proceedings have been brought against him.
1An interesting discussion about Anonymous is to be found in Gabriella Coleman, Hacker, Hoaxer, Whistleblower, Spy: The Many Faces of Anonymous (Verso Books 2015).
2Love v United States [2018] EWHC 172 (Admin), [2018] 1 WLR 2889, [2018] 2 All ER 911, [2018] 2 WLUK 89, [2018] Lloyd’s Rep FC 217, [2018] ACD 33, [2018] CLY 988.
3The Computer Misuse Act 1990 allows a person to be tried for hacking where the victim was outside the territory of England and Wales. To date, no prosecution has been brought against Love.
8.42 Love applied to the Magistrates’ Court to have his computer equipment returned.1 The court declined to order its return unless he provided a detailed list of the contents of the computer, something he has refused to do. The NCA made an application that Love be directed to provide the keys to the encryption.2 The district judge, however, held that this was not a proper use of the court’s jurisdiction. The judge held that RIPA 2000 provided the statutory procedure to secure access to encrypted data. The decision of the district judge was undoubtedly the correct one. In essence, the NCA was seeking to use civil proceedings to gain access to the key, rather than rely on RIPA 2000. Had the application succeeded, the NCA could have sought to use contempt of court proceedings to require compliance with the direction, which could ultimately have led to the imprisonment of Love. However, as the judge noted, the correct avenue to enforce s 49 is to bring a prosecution for non-compliance under s 53.
1Police (Property) Act 1897, s 1.
2The application being made in pursuance of the Magistrates’ Court Rules 1981/552, r 3A(2) and the Criminal Procedure (Amendment) Rules 2016/120.
8.43 While Love defeated the application, a strange impasse now exists, because his application under the Police (Property) Act 1897 was also rejected. He refused to discuss what was encrypted, and hence the district judge held that continued retention of the equipment was required. The police have the right to retain seized material to prevent it from being concealed, lost, damaged, altered or destroyed.1 Not unreasonably, it was thought that Love might seek to destroy any incriminating evidence contained on the machines should they be returned to him.
1Criminal Justice and Police Act 2001, s 56.
8.44 The reluctance to prosecute Love is somewhat puzzling since, on the face of it, it appears that Love did breach s 49. For whatever reason, the NCA has chosen not to prosecute. As Love refuses to comply, the NCA are left with computers that they cannot obtain access to (although presumably NTAC is trying to override the encryption) and Love is allowed to ignore the requirement to surrender the encrypted information. This demonstrates that while it is often said that RIPA compels the disclosure of encryption keys, technically it does not. Ultimately, all the legislation can do is to ensure that those who refuse to disclose the key or render information intelligible can be punished. The encrypted material, however, remains beyond reach.
The privilege against self-incrimination
8.45 Compelling someone to provide a key has proven to be controversial. In most cases, the key is a password, and this password might have been thought of by the suspect, although random password generators are also used. It has been suggested that compelling the disclosure of the key infringes the common law privilege that someone cannot be compelled to incriminate herself. It is the fact that the password is the product of one’s mind, and can only be released through testimony, that raises this argument. A key could be something physical, including another piece of code, a biometric measurement or, for example, a dongle (a piece of technology usually including a chip). Requiring a person to hand over, for example, a dongle to unlock the encryption would not necessarily breach the privilege,1 but in practice the key is invariably a self-produced password. The remainder of this chapter will explore how compelling the production of the key interferes with the privilege against self-incrimination in three jurisdictions: England and Wales, the USA2 and Belgium.3 The position in Canada is considered in brief.
1It would be an object that was created independently and is not the testimony of an individual: see Saunders v United Kingdom [1996] 12 WLUK 363, [1997] BCC 872, [1998] 1 BCLC 362, (1997) 23 EHRR 213, Times, 18 December 1996, Independent, 14 January 1997, [1997] CLY 2816.
2For these purposes, federal law will be considered.
3For France, see Décision n° 2018-696 QPC du 30 mars 2018, Le Conseil constitutionnel (Constitutional Court). Translated by Pauline Martin (2018) 15 Digital Evidence and Electronic Signature Law Review 92.
8.46 While the privilege against self-incrimination has long been a creature of the common law,1 it is also considered to be a fundamental part of article 6 of the European Convention on Human Rights (ECHR).2 The Human Rights Act 1998 requires public authorities, including the police and the judiciary, to act in a way compatible with the ECHR,3 and the courts must take account of the jurisprudence of the European Court of Human Rights.4
1An excellent history of the privilege is found in Andrew Choo, The Privilege against Self-Incrimination and Criminal Justice (Hart Publishing 2014).
2Funke v France (A/256-A) [1993] 2 WLUK 374, [1993] 1 CML 897, (1993) 16 EHRR 297, [1994] CLY 2431; Saunders v United Kingdom [1996] 12 WLUK 363, [1997] BCC 872, [1998] 1 BCLC 362, (1997) 23 EHRR 213, Times, 18 December 1996, Independent, 14 January 1997, [1997] CLY 2816.
3Human Rights Act 1998, s 6(1).
4Human Rights Act 1998, s 2(1); Rosemary Pattenden, ‘Privilege against self-incrimination’ (2009) 13(1) E. & P. 69.
8.47 The first case to challenge the compatibility of Part III of RIPA was R v S (F) and A (S).1 A third party (H) was made the subject of a control order under the Prevention of Terrorism Act 2005. S, A and H conspired to circumvent this control order by arranging for H to move to a different address. This occurred, but shortly afterwards the police detected his presence. When the police arrived, H and S were in different rooms. S was alone in a room with a laptop. The password to an encrypted file had been partly entered. S was arrested and his premises searched, but nothing that contravened terrorism laws was found. However, the police could not examine the laptop because of the encryption. Later, A was arrested and a laptop was seized from him. Again, there was an encrypted folder on it, and the police were unable to gain access to the laptop.
1[2008] EWCA Crim 2177, [2009] 1 WLR 1489, [2009] 1 All ER 716, [2008] 10 WLUK 197, [2009] 1 Cr App R 18, [2009] Crim LR 191, (2008) 158 NLJ 1459, Times, 15 October 2008, [2008] CLY 711.
8.48 Neither A nor S made any comments during their interviews and did not voluntarily disclose the passwords that would unlock the encryption. Both were served with a notice under s 49 of RIPA 2000. Neither complied, and they were prosecuted under s 53. They both entered a plea of not guilty and sought a stay of prosecution, alleging that the notices themselves were incompatible with the privilege against self-incrimination and article 6 of the ECHR. The judge at first instance refused the stay, and A and S appealed to the Court of Appeal.
8.49 The members of the Court of Appeal noted that the submissions of all parties were premised on the assumption that incriminating evidence would be discovered if the laptops were examined with the encryption removed.1 There was no evidence of this, but the appellants conceded that incriminating material might be discovered. The members of the Court of Appeal noted that under both domestic and European jurisprudence, the privilege against self-incrimination was not absolute, and that there were several statutory provisions that overrode it. Thus, the first question was whether the privilege applied in these circumstances. The judge at first instance held that the key was something that existed independent of the will of the suspect. If that was true, then the privilege against self-incrimination would ordinarily not apply.2 The Court of Appeal held:
On analysis, the key which provides access to protected data, like the data itself, exists separately from each defendant’s ‘will’. Even if it is true that each created his own key, once created the key to the data remains independent of the defendant’s ‘will’ even when it is retained only in his memory, at any rate until it is changed.3
8.50 The logic behind this argument is that while the password may initially have been the product of the defendant’s mind, it has an independent status once used. If the password was guessed or identified, then the encryption would unlock irrespective of whether the defendant willed it or not. They also noted that the key is neutral. It is not by itself either exculpatory or incriminating – it is simply a piece of information. However, if the contents are incriminating, then knowledge of the key could in itself be incriminating. The court provided the example of an encrypted folder containing indecent photographs of children.1 The fact that a person knows the key – and this would be shown through complying with the s 49 notice – could be used by the prosecution to show that the offender was in possession of the photographs.2 Of course, that depends on the facts. Where the substantive offence does not rely on control, then the possession of the key may not be incriminating.
1[2008] EWCA Crim 2177 at [21].
2Possession in the context of indecent photographs of children includes showing that the offender is in control of the material: see R. v Porter (Ross Warwick) [2006] EWCA Crim 560, [2006] 1 WLR 2633, [2007] 2 All ER 625, [2006] 3 WLUK 471, [2006] 2 Cr App R 25, [2006] Crim LR 748, (2006) 103(4) LSG 28, Times, 21 June 2006, [2006] CLY 858). Having the ability to unlock the encrypted folder is unquestionably control.
8.51 Ultimately, the Court of Appeal conceded that s 49 could interfere with the privilege against self-incrimination, but noted that it would only do so if the evidence that is being shielded by encryption is itself incriminating. However, the court opined that material unquestionably exists independent of the will of the individual and, therefore, there is no question of it being protected by the privilege. Thus, the only argument that could be put forward is that it is unfair for that evidence to be put before the court due to the circumstances in which it was found (through complying with s 49).1 That being the case, the Court of Appeal held that such matters could be dealt with by the trial judge under the discretionary power to exclude prosecution evidence.2
8.52 The decision in S and A has not been universally welcomed. This is partly because the logic of the Court of Appeal strains credulity. Roberts in the Criminal Law Review observed that an encryption key, unless documented, is an ‘intangible “psychological fact”, that is to say, it is information which exists only in the suspect’s memory and that of any other person who might “know” it’.1 The Court of Appeal would argue that this is not true because the key does exist – it is recorded and used within the encryption algorithm that is unlocked by the password. However, for all practical purposes, it is not. There is only a very small chance that the password can be discovered by any other means. Indeed, if the police were able to identify the password other than through compelling the suspect to testify, there would be no need to issue a s 49 notice. As was noted above, it is almost impossible to guess a key when encryption is used properly, owing to the potentially vast number of possible combinations. Therefore, while in theory the key is independent of the will of the accused, it is, for all practical purposes, a psychological fact and, therefore, probably within the privilege. However, Roberts’ later point is perhaps the more salient. He notes that it does not matter whether the privilege was or was not engaged because, following Brown v Stott,2 English law recognizes that the privilege can be set aside by statute where it is proportionate to do so.3 Given the facts of the case, the national security implications would mean that displacement was undoubtedly proportionate.
1Andrew J. Roberts, ‘Evidence: privilege against self-incrimination – key to encrypted material’ [2009] Crim LR 191, 192. In 1993, Professor Tapper observed that the increased use of computers will lead to the position that we regress ‘to the earlier period where information reposed only in the brains of those who were party to it, and had no material form’: Colin Tapper, ‘Evanescent evidence’ (1993) 1(1) Intl J L & Info Tech 35, 40.
2[2003] 1 AC 681, [2001] 2 WLR 817, [2001] 2 All ER 97, 2001 SC (PC) 43, 2001 SLT 59, 2001 SCCR 62, [2000] 12 WLUK 108, [2001] RTR 11, [2001] HRLR 9, 11 BHRC 179, (2001) 3 LGLR 24, (2001) 145 SJLB 100, 2000 GWD 40-1513, Times, 6 December 2000, Independent, 7 February 2001, [2001] CLY 6319; Roisin Pillay, ‘Self-incrimination and Article 6: the decision of the Privy Council in Procurator Fiscal v. Brown’ (2001) 1 EHRLR 78; Roger Masterman, ‘Taking the Strasbourg jurisprudence into account: developing a “municipal law of human rights” under the Human Rights Act’ (2005) 54(4) ICLQ 907–1; Mark Berger, ‘Compelled self-reporting and the principle against compelled self incrimination: some comparative perspectives’ (2006) 1 EHLR 25; John Jackson, ‘Re-conceptualizing the right of silence as an effective fair trial standard’ (2009) 58(4) ICLQ 835; Hamish Stewart, ‘The privilege against self-incrimination: reconsidering Redmayne’s rethinking’ (2016) 20(2) E & P 95.
3The leading examination on the application of the privilege of self-incrimination is Choo, The Privilege against Self-Incrimination and Criminal Justice.
8.53 The issues were further rehearsed in Greater Manchester Police v Andrews.1 Rather than proceedings under s 53, this was an appeal from the refusal of the circuit judge to authorize a s 49 notice being served. Andrews had previous convictions for the sexual abuse of children, and was the subject of a Sexual Offences Prevention Order. Police arrested him on suspicion of breaching this order, and seized a computer and two memory sticks. Indecent photographs of children were found on the computer, but the memory sticks were encrypted. This meant that they could not be viewed. Andrews refused to provide the passwords or the software used to encrypt the devices. The police applied for permission to serve a s 49 notice, but this was refused. The judge stated that requiring Andrews to reveal the key infringed his privilege against self-incrimination, because there was no independent evidence to show that he knew what the key was.2 The judge sought to use these facts to distinguish the case from R v S (F) and A (S).3 The Divisional Court was unimpressed with the logic of the judge. It noted that, as the devices were unquestionably found in his possession, it was not unreasonable to believe that he might know of the existence of the encryption and its key.4 The court did not disagree that the privilege might be invoked, and noted once more that the privilege applied only in a limited way (repeating that the key was, in essence, neutral and simply provided access to non-privileged material that was itself incriminating), and that English law allowed it to be displaced where it was proportionate to do so.5
1[2011] EWHC 1966 (Admin), [2011] 5 WLUK 614, [2012] ACD 18.
2[2011] EWHC 1966 (Admin) at [18]–[19].
3[2008] EWCA Crim 2177, [2009] 1 WLR 1489, [2009] 1 All ER 716, [2008] 10 WLUK 197, [2009] 1 Cr App R 18, [2009] Crim LR 191, (2008) 158 NLJ 1459, Times, 15 October 2008, [2008] CLY 711.
4Greater Manchester Police v Andrews [2011] EWHC 1966 (Admin) at [21].
5[2011] EWHC 1966 (Admin) at [27].
8.54 Section 49 has not been challenged again, and so the legal position now seems relatively settled. As noted at the beginning of this chapter, the power is not exercised particularly frequently. This suggests that the police are only using it where they suspect that encryption is shielding serious criminality. That being the case, it is likely the courts would consider it proportionate for the privilege against self-incrimination to be set aside, as they did in R v S (F) and A (S).
8.55 The position in England and Wales can be usefully contrasted with the approach taken in the USA, where the Fifth Amendment protects the privilege against self-incrimination.
The Fifth Amendment privilege against self-incrimination
8.56 One of the first cases in the USA to deal with this issue was also cited in R v S (F) and A (S)1 to illustrate the point that knowledge of the password might be relevant to the privilege against self-incrimination. The case of In re Grand Jury Subpoena to Sebastien Boucher2 involved facts arising out of the search of a laptop at the US border with Canada. On 17 December 2006, Boucher and his father entered the US from Canada. A customs and border protection officer found a laptop computer in the vehicle they were travelling in. He opened the computer and switched it on without entering a password. He searched the various files in the computer and discovered approximately 40,000 images, some of which appeared to be pornographic, based on the names of the files. Boucher was asked if any of the files contained abusive images of children, to which he responded that he was not certain. The officer continued to search the files and noticed some files with names that suggested images of a minor engaging in sexually explicit conduct. He then requested the help of another officer, who determined that a number of files contained abusive images of children. Boucher was then read his Miranda rights. He told the second officer that he downloaded pornographic files and indicated that he did not intentionally download images of a minor engaging in sexually explicit conduct and deleted any such images when he came across them. Boucher was given access to the laptop and navigated to the Z drive, to which he obtained access by entering a password; the second officer did not see the password that Boucher entered. Boucher was subsequently arrested and his laptop was seized. After obtaining a search warrant, the government discovered that the Z drive was encrypted and the investigating authorities could not open the Z drive. A grand jury subpoena was issued for Boucher, directing him to ‘provide all documents, whether in electronic or paper form, reflecting any passwords used or associated with’ his seized computer.3
1R v S (F) and A (S) [2008] EWCA Crim 2177, [2009] 1 WLR 1489, [2009] 1 All ER 716, [2008] 10 WLUK 197, [2009] 1 Cr App R 18, [2009] Crim LR 191, (2008) 158 NLJ 1459, Times, 15 October 2008, [2008] CLY 711.
22009 WL 424718 (D.Vt.), reversing and remanding 2007 WL 4246473 (Mag. Ct. D.Vt.).
32007 WL 4246473 (D.Vt.) at [2].
8.57 Boucher moved to quash the subpoena because, he alleged, it violated his right not to incriminate himself under the provisions of the Fifth Amendment. Whether the privilege against self-incrimination applied in this instance depended on whether the subpoena sought testimonial communication. Both parties agreed that the contents of the laptop computer were not covered by the Fifth Amendment because they were voluntarily prepared and not testimonial in nature. The magistrate court held that requiring Boucher to enter the password would disclose both that he knew the password and that he had control over the files on the encrypted drive.1 The magistrate therefore concluded that the Fifth Amendment prevented the government from compelling Boucher to provide the password on the basis that it would compel him to display the contents of his mind and thereby incriminate himself.2 The government appealed this decision,3 arguing that it had already become aware of the existence and location of the information during the border examination (when the officer viewed the contents of some of the Z drive files and ascertained that they might consist of images or videos of a minor engaging in sexually explicit conduct). On appeal, the district court agreed. The court held that requiring Boucher to ‘provid[e] access to the unencrypted Z drive “adds little or nothing to the sum total of the Government’s information” about the existence and location of files that may contain incriminating information’, and therefore this did not constitute ‘compelled testimonial communication’ and did not breach Boucher’s Fifth Amendment right against self-incrimination.4
12007 WL 4246473 (D.Vt.) at [3].
22007 WL 4246473 (D.Vt.) at [6].
3In re Grand Jury Subpoena to Sebastien Boucher, 2009 WL 424718 (D.Vt.).
4In re Grand Jury Subpoena to Sebastien Boucher, 2009 WL 424718 (D.Vt.) at [2]–[3]. For more discussion in the US context and reference to other articles, see Aaron M. Clemens, ‘No computer exception to the constitution: the Fifth Amendment protects against compelled production of an encrypted document or private key’ (2004) 8(1) UCLA Journal of Law and Technology 1; Andrew J. Ungberg, ‘Protecting privacy through a responsible decryption policy’ (2009) 22(2) Harv J L & Tech 537; John Duong, ‘The intersection of the Fourth and Fifth Amendments in the context of encrypted personal data at the border’ (2009) 2(1) Drexel Law Review 313; David Colarusso, ‘Heads in the cloud, A coming storm: the interplay of cloud computing, encryption, and the Fifth Amendment’s protection against self-incrimination’ (2011) 17(1) Boston University Journal of Science and Technology Law 69; Adam M. Gershowitz, ‘Password protected? Can a password save your cell phone from a search incident to arrest?’ (2011) 96(4) Iowa L Rev 1125; Susan W. Brenner, ‘The Fifth Amendment, cell phones and search incident: a response to password protected?’ (2011) 96 Iowa L Rev Bulletin 78; Michael Wachtel, ‘Give me your password because Congress can say so: an analysis of Fifth Amendment protection afforded individuals regarding compelled production of encrypted data and possible solutions to the problem of getting data from someone’s mind’ (2013) 14 U Pitt J Tech & Policy 44; Andrew T. Winkler, ‘Password protection and self-incrimination: applying the Fifth Amendment privilege in the technological era’ (2013) 39(2) Rutgers Computer & Tech LJ 194; David Rassoul Rangaviz, ‘Compelled decryption & state constitutional protection against self-incrimination’ (2020) 57(1) American Criminal Law Review 157; Rafita Ahlam, ‘Apple, the government, and you: security and privacy implications of the global encryption debate’ (2021) 44(3) Fordham Int’l LJ 771; Orin S. Kerr, ‘Decryption originalism: the lessons of Burr’ (2021) 134(3) Harv L Rev 905.
8.58 The Boucher case is illustrative of compelled decryption cases in the US. A defendant’s Fifth Amendment privilege against self-incrimination is implicated when the police require a suspect to enter a passcode to unlock an encrypted device, such as a telephone or computer. United States courts tend to agree that the act of entering a passcode is testimonial, which engages the privilege against self-incrimination; the privilege is not available, however, if the police can show that the testimony would be a ‘foregone conclusion’.
8.59 This rule is based on the ‘act of production’ doctrine from Fisher v United States,1 which was developed in the context of producing documents pursuant to a subpoena. The Supreme Court in Fisher held that the Fifth Amendment privilege against self-incrimination is implicated when the government compels a suspect to produce documents and the act of production is both testimonial and incriminating.2 The act of production is neither testimonial nor incriminating, however, when it ‘adds little or nothing to the sum total of the Government’s information’ and is therefore a ‘foregone conclusion’.3
1425 U.S. 391 (1976), 96 S.Ct. 1569 (1976).
2425 U.S. 391 (1976) at 409–410.
3425 U.S. 391 (1976) at 411.
8.60 Courts have adopted and applied the act of production doctrine and its foregone conclusion exception to cases of compelled decryption. Courts differ, however, on two primary fronts as to how the foregone conclusion exception should be applied. First, courts differ on whether the police must show that they already know that the suspect knows his passcode, or whether the police must show that they already know the encrypted content of the device.1 In other words, courts differ on what constitutes the ‘testimony’ that must be a foregone conclusion. Second, courts differ on the burden of proof of this foregone conclusion: some courts have required the police to show clear and convincing evidence,2 some have required proof beyond a reasonable doubt,3 some have required a showing of facts with reasonable particularity,4 and still others seem to gloss over the required standard entirely. This section will survey some of the more influential cases, noting that these jurisprudential splits can only be resolved by the US Supreme Court.
1Compare United States v Apple MacPro Computer, 851 F.3d 238 (3rd Cir. 2017) (clarifying in dicta that the government need only show knowledge that the suspect knows the passcode or owns the device) with In re Grand Jury Subpoena Duces Tecum Dated March 25, 2011, 670 F.3d 1335 (11th Cir. 2012) (requiring the government to show knowledge of the encrypted content of the device).
2United States v Spencer, 2018 WL 1964588 (N.D. Cal. 2018).
3Commonwealth v Jones, 117 N.E.3d 702 (Mass. 2019), 481 Mass. 540 (Sup.Jud.Ct. 2019). It is worth noting that this court found that the Massachusetts State Constitution required a showing of the foregone conclusion beyond a reasonable doubt. State courts in the US may interpret their state constitutions to be more protective of individual rights than the federal US Constitution. The US Constitution is considered to guarantee the minimum amount of rights protection, which the states may strengthen through their own constitutions. Further, the court in Commonwealth v Jones does not bind any courts outside the State of Massachusetts.
4In the Matter of the Search of a Residence in Aptos, California 95003, 2018 WL 1400401 (N.D. Cal. 2018). This Magistrate Judge’s decision was overturned by the District Court in United States v Spencer, 2018 WL 1964588 (N.D. Cal. 2018). The Spencer Court clarified that the reasonable particularity standard was a substantive standard that ‘helps to ensure that any testimony at issue really is a “foregone conclusion”’. In the case of a determination concerning whether a suspect is capable of decrypting a device, it is a binary question – either he can or he cannot – rather than something that must be described by the government with reasonable particularity. Therefore, the correct evidentiary standard is clear and convincing evidence. See also In the Matter of the Decryption of a Seized Data Storage System, 2013 WL 12327372 (E.D. Wis. 2013) (holding that the government must show the foregone conclusion with reasonable particularity). Arguably, the reasonable particularity standard only makes sense when the government must show its knowledge of the contents of a device, which is why it was also used by the court in In re Grand Jury Subpoena Duces Tecum Dated March 25, 2011, 670 F.3d 1335 (11th Cir. 2012).
8.61 In the case of In re Grand Jury Subpoena Duces Tecum Dated March 25, 2011,1 law enforcement agents began an investigation in March 2010 of an individual suspected of using a YouTube.com account for sharing explicit materials involving underage girls. During the course of their investigation, officers obtained several Internet protocol (IP) addresses from which the individual had obtained access to the Internet. Three of the addresses were subsequently traced to hotels. A review of the register in each hotel revealed a common name registered at the relevant time: that of one Doe. Doe was found at a hotel in California, and the police applied for and obtained a warrant to search his room. Seven items were seized, including two laptops and five external hard drives. Examiners from the Federal Bureau of Investigation analysed the digital media but could not obtain access to some parts of the hard drives because they were encrypted with a software program called TrueCrypt.
1670 F.3d 1335 (11th Cir. 2012).
8.62 Doe refused to provide the passwords to enable the government to open and view the encrypted data, and he also refused to decrypt the data. As a result, he was served with a subpoena duces tecum, requiring him to appear before a grand jury and produce the plaintext of the encrypted files located on the hard drives of his laptop computers and the five external hard drives. Federal prosecutors offered him immunity for the act of decrypting the computer, but reserved the right to use any evidence they found on the computer against him.1 When he appeared before the grand jury, Doe invoked his Fifth Amendment privilege against self-incrimination and declined to reveal the plaintext. During the hearing, the forensic examiner testified that he could obtain access to some parts of the hard drives, but he could not know for certain whether there might be data on the encrypted part of the hard drive – indeed, he accepted that there might not be any data in the encrypted part of the drives. The district court determined that Doe’s failure to decrypt the relevant parts of the hard drives amounted to contempt of court and committed him to custody.
1670 F.3d 1335 at 1350.
8.63 On appeal, the Eleventh Circuit Court of Appeals reversed the district court decision and held that the decryption and production of the hard drives was a testimonial act, and thus the defendant could assert his Fifth Amendment privilege against self-incrimination. The court reasoned:
the decryption and production of the hard drives would require the use of the contents of Doe’s mind and could not be fairly characterized as a physical act that would be nontestimonial in nature. We conclude that the decryption and production would be tantamount to testimony by Doe of his knowledge of the existence and location of potentially incriminating files; of his possession, control, and access to the encrypted portions of the drives; and of his capability to decrypt the files.
We are unpersuaded by the Government’s derivation of the key/combination analogy in arguing that Doe’s production of the unencrypted files would be nothing more than a physical nontestimonial transfer. The Government attempts to avoid the analogy by arguing that it does not seek the combination or the key, but rather the contents. This argument badly misses the mark.1
1670 F.3d 1335 at 1346.
8.64 Further, the ‘foregone conclusion’ exception was not available to the government because it failed to show that it had knowledge of the contents of the defendant’s device. The court noted that:
nothing in the record before us reveals that the Government knows whether any files existed and are located on the hard drives; what’s more, nothing in the record illustrates that the Government knows with reasonable particularity that Doe is even capable of accessing the encrypted portions of the drives.1
1670 F.3d 1335 at 1346.
8.65 In this regard, In re Grand Jury is distinguishable from Boucher in that the government was aware of what was on Boucher’s computer because of his own actions in displaying the files to the officers.1
1For a more detailed discussion of this case, see Hanni Fakhoury, ‘A combination or a key? The Fifth Amendment and privilege against compelled decryption’ (2012) 9 Digital Evidence and Electronic Signature Law Review 81.
8.66 While there is wide agreement that the act of production doctrine and foregone conclusion exception apply to cases of compelled decryption, courts differ on what constitutes the testimony that must be a foregone conclusion. The In re Grand Jury court required the government to show that the contents of the suspect’s device would be a foregone conclusion. Interestingly, the Boucher court, while not explicit in its analysis about what testimony must be proven, found that the government knew both the contents of the device and that the suspect could decrypt it. The Third Circuit Court of Appeals considered a similar case, and went on to provide detailed guidance on what type of testimony must be shown to be a foregone conclusion – in so doing, diverging from the Eleventh Circuit in this regard.
8.67 In 2017, the Third Circuit considered the case of United States of America v Apple MacPro Computer1 in which the suspect Doe refused to decrypt hard drives that were obtained by police pursuant to a valid search warrant. Along with the hard drives, police also seized a mobile telephone and a MacPro computer. The police were able to bypass the encryption on the MacPro computer and found evidence that Doe had downloaded photographic files constituting images of a minor engaging in sexually explicit conduct. The police suspected that the files themselves were stored on the separate encrypted hard drives that Doe refused to decrypt. Doe argued that the act of decryption would violate his Fifth Amendment privilege against self-incrimination.
1851 F.3d 238 (3rd Cir. 2017).
8.68 The Third Circuit followed the Eleventh Circuit’s legal reasoning that the act of production and foregone conclusion rules applied to the compelled decryption of devices. Unlike the facts of the case before the Eleventh Circuit, however, the Third Circuit found that the testimony sought by the government from Doe was a foregone conclusion. The court reasoned: ‘the Government has provided evidence to show both that files exist on the encrypted portions of the devices and that Doe can access them.’1 Among other things, the evidence supporting this assertion was as follows: the encrypted devices were found at Doe’s residence and he did not dispute his ownership of them; analysts found evidence on the MacPro computer that the user had visited groups that had titles used in child exploitation and had downloaded images known through hashing to be images of a minor engaging in sexually explicit conduct; and Doe’s sister had witnessed Doe unlock the hard drives to view images and videos of a minor engaging in sexually explicit conduct.2 Based on these and similar facts, the magistrate had found that the testimony would be a foregone conclusion. The district court and the Third Circuit Court of Appeals both affirmed that conclusion.
8.69 Given the overwhelming amount of proof against Doe, it is difficult to ascertain whether the testimony that the Third Circuit requires to be shown to be a foregone conclusion is simply the suspect’s knowledge of the passcode or ownership of the devices, or knowledge of the contents of the device. The Third Circuit had evidence in that case that Doe owned the devices and had decrypted them previously, and that the government knew the contents of the devices, all of which qualified as a foregone conclusion. Helpfully, the court added a footnote that, although dictum, is persuasive authority for future cases in the Third Circuit:
It is important to note that we are not concluding that the Government’s knowledge of the content of the devices is necessarily the correct focus of the ‘foregone conclusion’ inquiry in the context of a compelled decryption order. Instead, a very sound argument can be made that the foregone conclusion doctrine properly focuses on whether the Government already knows the testimony that is implicit in the act of production. In this case, the fact known to the government that is implicit in the act of providing the password for the devices is ‘I, John Doe, know the password for these devices’. Based upon the testimony presented at the contempt proceeding, that fact is a foregone conclusion. However, because our review is limited to plain error, and no plain error was committed by the District Court in finding that the Government established that the contents of the encrypted hard drives are known to it, we need not decide here that the inquiry can be limited to the question of whether Doe’s knowledge of the password itself is sufficient to support application of the foregone conclusion doctrine.1
1851 F.3d 238 at 248, n 7.
8.70 Other cases have cited this dictum by the Third Circuit Court of Appeals in Apple MacPro Computer to hold that the government need only show that the suspect’s knowledge of the passcode is a foregone conclusion.1 This is consistent with an unpublished opinion of the Fourth Circuit Court of Appeals in United States of America v Gavegnano,2 in which the appellant was convicted of receipt and possession of abusive images of children stored on a laptop computer owned by the government and issued to him for the purposes of his work. One of the grounds of appeal was based on the Fifth Amendment, in that he gave the password of the laptop computer to the prosecuting authorities after meeting with his lawyer. The Fourth Circuit rejected his claim, on the basis that ‘Any self-incriminating testimony that he may have provided by revealing the password was already a “foregone conclusion” because the Government independently proved that Gavegnano was the sole user and possessor of the computer’.3
1State of Oregon v Pittman, 452 P.3d 1011 (Or.App. 2020); Commonwealth v Jones, 117 N.E.3d 702 (Mass. 2019), 481 Mass. 540 (Sup.Jud.Ct. 2019); State of Missouri v Johnson, 576 S.W.3d 205 (Mo.App. W.D. 2019); State of New Jersey v Andrews, 197 A.3d 200 (N.J.Super.A.D. 2018); United States v Spencer, 2018 WL 1964588 (N.D. Cal. 2018); State of Florida v Stahl, 206 So.3d 124 (Fla.App. 2 Dist. 2016) at 136; U.S. v Fricosu, 841 F.Supp.2d 1232 (D.Colo. 2012).
2305 Fed.Appx. 954 (4th Cir. 2009), 2009 WL 106370.
3305 Fed.Appx. 954 (4th Cir. 2009) at 956.
8.71 Nonetheless, other courts have chosen to follow the approach set out by the Eleventh Circuit, which requires the government to show, with reasonable particularity, that the contents of the device are a foregone conclusion.1 In adopting this approach, the district court in In the Matter of the Search of a Residence in Oakland, California explicitly rejected the notion that an encrypted device, in this case a telephone, is akin to a safe, and that compelling a suspect to provide a passcode is merely the same as requiring him to produce the key or combination to open a safe.2 The court reasoned that ‘[t]oday’s mobile phones are not comparable to other storage equipment, be it physical or digital, and are entitled to greater privacy protection’. Quoting the US Supreme Court’s opinion in Riley v California, the district court considered that a search of a telephone ‘would typically expose to the government far more than the most exhaustive search of a house’.3 Given the primary split between the Eleventh Circuit Court of Appeals and the Third and Fourth Circuit Courts of Appeals, it is possible that the US Supreme Court will, at some time in the future, be asked to clarify the scope of the act of production doctrine and the foregone conclusion exception in the context of the compelled decryption of devices by the use of passcodes.4 Likewise, courts seem to be split on the issue of whether the forced use of biometric measurements constitutes ‘testimony’ such that the Fifth Amendment privilege against self-incrimination will apply.
1In the Matter of the Search of a Residence in Oakland, California, 354 F.Supp.3d 1010 (N.D. Cal. 2019); Seo v State, 109 N.E.3d 418 (Ind.App. 2018), transfer granted and opinion vacated on other grounds, see Eunjoo Seo v State, 148 N.E.3d 952 (2019); Securities and Exchange Commission v Huang, 2015 WL 5611644 (E.D. Pa. 2015).
2354 F.Supp.3d 1010 at 1017.
3354 F.Supp.3d 1010 (quoting Riley v California, 573 U.S. 373 (2014), 134 S.Ct. 2473 (2014)).
4It is beyond the scope of this chapter to discuss the correct approach. For a thorough exploration of these issues, consult Orin Kerr, ‘Compelled decryption and the privilege against self-incrimination’ (2018) 97 Texas L Rev 767, and Laurent Sacharoff, ‘What am I really saying when I open my smartphone? A response to Orin S. Kerr’ (2019) 97 Texas L Rev Online 63.
8.72 Several Federal District Courts1 and at least one State Supreme Court2 have found that compelling the suspect to unlock a device, usually a telephone, with a biometric measurement such as a fingerprint or face does not constitute testimony such that the privilege against self-incrimination is implicated. As the District Court of Idaho reasoned in the case of In the Matter of the Search of: a White Google Pixel 3 XL Cellphone in a Black Incipio Case, ‘the Government agents will pick the fingers to be pressed on the Touch ID sensor, [and so] there is no need to engage in the thought process of the subject at all in effectuating the seizure’.3 The court in that case compared this act to other compelled displays of physical features that have been allowed by the US Supreme Court, including, ‘putting on a shirt to see whether it fits the defendant; providing a blood sample to test for alcohol content; submitting to the taking of fingerprints or photographs; providing a voice exemplar; and providing a handwriting exemplar’.4
1In the Matter of the Search Warrant Application for the Cellular Telephone in United States v Barrera, 415 F.Supp.3d 832 (N.D.Ill. 2019); In the Matter of the Search of: a White Google Pixel 3 XL Cellphone in a Black Incipio Case, 398 F.Supp.3d 785 (D.Idaho 2019); Matter of Search of [Redacted] Washington, District of Columbia, 317 F.Supp.3d 523 (D.D.C. 2018).
2State of Minnesota v Diamond, 905 N.W.2d 870 (Minn. 2018).
3398 F.Supp.3d 785 at [13]; for Illinois, see In the Matter of the Search Warrant Application for the cellular telephone in United States v Barrera, 415 F.Supp.3d 832 (N.D.Ill. 2019).
4398 F.Supp.3d 785 at [10–12] (internal citations omitted).
8.73 Other courts disagree, however. In United States v Wright, a federal district court in Nevada held:
First, a biometric feature is functionally the same as a passcode, and because telling a law enforcement officer your passcode would be testimonial, so too must the compelled use of your biometric feature to unlock a device. Second, unlocking a phone with your face equates to testimony that you have unlocked the phone before, and thus you have some level of control over the phone.1
1431 F.Supp.3d 1175 (D.Nev. 2020) (citing In the Matter of the Search of a Residence in Oakland, California, 354 F.Supp.3d 1010 (N.D. Cal. 2019)) (internal citations omitted).
8.74 Another Federal District Court in Illinois agreed, and cited the Eleventh Circuit opinion in the case of In re Grand Jury Subpoena Duces Tecum Dated March 25, 2011 to support its holding that ‘the connection of a fingerprint to the electronic sources that may hold contraband … does explicitly or implicitly relate a factual assertion or disclose information’.1 The court rejected the government’s claim that the Fifth Amendment does not apply to the compulsion to submit to fingerprinting, stating:
We do not believe that a simple analogy that equates the limited protection afforded a fingerprint for identification purposes to forced fingerprinting to unlock an Apple electronic device that potentially contains some of the most intimate details of an individual’s life (and potentially provides direct access to contraband) is supported by Fifth Amendment jurisprudence.2
1In re Application for a Search Warrant, 236 F.Supp.3d 1066 (N.D.Ill. 2017) at 1073.
2236 F.Supp.3d 1066 at 1073–1074.
8.75 As with compelled decryption through the use of passcodes, the testimonial nature of biometric features used for decryption requires clarification from a higher court.
Bypassing the Fifth Amendment by compelling the assistance of third parties
8.76 Given the limitations of the cases noted above regarding compelled decryption by suspects, governments increasingly seek to compel third party intermediaries, usually technology companies or communications service providers, to provide plaintext data to law enforcement authorities. Perhaps the most famous example of the US government attempting to compel a third party intermediary to decrypt a device is the litigation over an Apple iPhone seized by the FBI in Government’s ex parte application for order compelling Apple Inc. to assist agents in search1 before the district court of the Central District of California. The US government seized an iPhone 5C believed to have belonged to Syed Rizwan Farook, an alleged terrorist who perpetrated an attack which killed 14 people and injured 22 others in San Bernardino, California. The iPhone was protected by a passcode. Later generation iPhones have their contents encrypted by default, and the passcode acts as the password. Thus, without the passcode, the FBI was unable to obtain access to the contents of the device. It is also possible to set the iPhone to auto-erase the contents of the telephone if a set number of incorrect passcodes is entered.
1In the Matter of the Search of an Apple Iphone Seized During the Execution of a Search Warrant on a Black Lexus IS300, California License Plate 35KGD203, 2016 WL 618401 (C.D. Cal. 16 February 2016).
8.77 Given these obstacles, the government sought an order under the All Writs Act1 requiring Apple to assist the FBI in circumventing the encryption.2 Contrary to what was reported in most media, the order did not require Apple to break the encryption. Rather, Apple was ordered to provide reasonable technical assistance in bypassing or disabling the auto-erase function, enabling the FBI to submit an unlimited number of passcodes to the device for electronic testing, and ensuring that the device would not purposefully introduce any additional delay between passcode attempts – essentially enabling a brute-force attack.3 Apple resisted the imposition of the order, arguing that compliance would hand unparalleled powers to the government, which would render data privacy laws meaningless. It argued that any process it put in place could be exploited by others, which meant that the privacy of all its customers would be put at risk.4 Ultimately, the FBI was able to obtain access to the device with the help of an unnamed third party, and the litigation was discontinued.5 A simple arithmetical sketch following the notes below illustrates why the auto-erase function and the delays between attempts matter.
128 U.S. Code § 1651.
22016 WL 618401 (C.D. Cal. 16 February 2016).
32016 WL 618401 (C.D. Cal. 16 February 2016), Order 2.
4Tim Cook, ‘A message to our customers’ (Apple, Inc, 16 February 2016), http://www.apple.com/customer-letter/.
5Rob Crilly, ‘FBI finds method to hack gunman’s iPhone without Apple’s help’, The Telegraph (29 March 2016), http://www.telegraph.co.uk/technology/2016/03/29/fbi-finds-method-to-hack-gunmans-iphone-without-apples-help0/.
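The figures involved make the point vividly. The following Python sketch is purely illustrative: the per-attempt timing and the ten-attempt erase limit are assumptions adopted for the example, not Apple’s published security parameters.

def worst_case_hours(digits: int, seconds_per_attempt: float) -> float:
    """Worst-case time to try every numeric passcode of a given length."""
    attempts = 10 ** digits  # e.g. 10,000 possible four-digit passcodes
    return attempts * seconds_per_attempt / 3600

# Electronic submission with the artificial delays disabled
# (assume roughly 80 milliseconds per attempt):
print(f'4 digits: {worst_case_hours(4, 0.08):.2f} hours')    # ~0.22 hours
print(f'6 digits: {worst_case_hours(6, 0.08):.1f} hours')    # ~22 hours

# With an enforced delay of one hour between attempts, the same
# exhaustive search becomes impractical:
print(f'4 digits, delayed: {worst_case_hours(4, 3600) / 24:.0f} days')  # ~417 days

# And with auto-erase after ten failed attempts, at most 10 of the
# 10,000 possible four-digit passcodes can ever be tried: a 0.1 per
# cent chance of success before the evidence destroys itself.

On these assumptions, a four-digit passcode falls within minutes and a six-digit passcode within a day once passcodes can be submitted electronically without delay, whereas the same passcodes are effectively unguessable while the escalating delays and the auto-erase limit remain in force.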
8.78 In a similar case,1 the government sought an order before a New York court requiring Apple to bypass the passcode security on an Apple device on the basis that such an order would assist in the execution of a search warrant previously issued by the court. The court denied the government’s motion, on the basis that the government had failed to establish that the All Writs Act permitted the relief it sought, partly because Congress had considered legislation that would achieve the same result but had not adopted it. The judge also noted that a court, when deciding whether to take such discretionary action, was required to consider three additional factors:
1. the closeness of the relationship between the person or entity to whom the proposed writ is directed and the matter over which the court has jurisdiction;
2. the reasonableness of the burden to be imposed on the writ’s subject; and
3. the necessity of the requested writ to aid the court’s jurisdiction (which, despite the overlapping language, does not replicate the second statutory element).2
1In re Order requiring Apple, Inc, to assist in the execution of a search warrant issued by this Court, 2015 WL 5920207 (E.D.N.Y. 2015); In re Apple, Inc., 149 F.Supp.3d 341 (E.D.N.Y. 2016).
2In re Apple, Inc., 149 F.Supp.3d 341 (E.D.N.Y. 2016) at 351.
8.79 The court said that even if the statute did apply, all three discretionary factors weighed against issuing the requested writ, and that the application would be denied as a matter of discretion even if the relief were available as a matter of law.
8.80 These cases brought renewed attention to the encryption debates between law enforcement authorities who seek lawful access to plaintext data and the information and communication technology (ICT) companies who implement encryption by default for security purposes. In addition to security concerns, these companies also benefit from encryption by shifting control to the user, which limits the abilities of the companies to cooperate with the government. ICT companies can also benefit from appearing to champion user privacy, especially after the Snowden revelations in 2013.1 After these revelations, both Apple and Google announced they would begin encrypting devices by default.2 Around the same time, James Comey, then Director of the FBI, articulated concerns about the growing use of encryption:
Unfortunately, the law hasn’t kept pace with technology, and this disconnect has created a significant public safety problem. We call it ‘Going Dark’, and what it means is this: Those charged with protecting our people aren’t always able to access the evidence we need to prosecute crime and prevent terrorism even with lawful authority. We have the legal authority to intercept and access communications and information pursuant to court order, but we often lack the technical ability to do so.3
1https://en.wikipedia.org/wiki/Edward_Snowden.
2‘Don’t panic: making progress on the going dark debate’ (Berkman Center for Internet & Society, Harvard University 2016) 10, https://cyber.harvard.edu/pubrelease/dont-panic/Dont_Panic_Making_Progress_on_Going_Dark_Debate.pdf.
3James B. Comey, Federal Bureau of Investigation Director, ‘Going dark: are technology, privacy, and public safety on a collision course?’, speech delivered to the Brookings Institution (2014), https://www.fbi.gov/news/speeches/going-dark-are-technology-privacy-and-public-safety-on-a-collision-course.
8.81 To solve this ‘going dark’ problem, Reitinger argues that ‘permitting law enforcement to compel the production of keys when necessary, with judicial supervision as appropriate, is a minimal accommodation to the need for public security in a world in which criminals have an increasing array of sophisticated tools at their disposal’.1 It is difficult to think of any other aspect of evidence where a suspect is permitted wilfully to hide evidence of his criminality from law enforcement and to have this condoned by the criminal justice system. Using encryption, a person can hide thousands of abusive images of children on a device. They could obtain access to them every day but, if they took appropriate precautions,2 law enforcement authorities would find it almost impossible to prove that the offence had taken place.3 That is not in the interest of society. This is a point made by Orenstein MJ of New York in his concluding remarks in one of the cases involving Apple:
How best to balance those interests is a matter of critical importance to our society, and the need for an answer becomes more pressing daily, as the tide of technological advance flows ever farther past the boundaries of what seemed possible even a few decades ago. But that debate must happen today, and it must take place among legislators who are equipped to consider the technological and cultural realities of a world their predecessors could not begin to conceive.4
1Phillip R. Reitinger, ‘Compelled production of plaintext and keys’ (1996) U Chi Legal F 206 (footnote omitted).
2Deleting caches, recent document lists, etc.
3Keylogging software would only work if a single device was used to obtain access to the material (or the software would be required to be placed on each device) and if a regular Internet connection was used. Covert surveillance (cameras) could be installed to show the material being viewed, but law enforcement authorities would need to know which room the device was located in, and it could be difficult to obtain authorization to do so, depending on the level of intrusion this could cause (for example, if it was on a tablet, it may be necessary to have devices in each room, which could be construed as a gross invasion of privacy).
4In re Apple, Inc., 149 F.Supp.3d 341 (E.D.N.Y. 2016) at 376.
8.82 Jim Baker, former general counsel for the FBI, was responsible for leading the government’s efforts to compel Apple to decrypt the iPhone in the San Bernardino case in 2016.1 In 2019 Baker wrote that his opinion on encryption had changed in light of the serious cybersecurity threats facing the US:
All public safety officials should think of the protecting of the cybersecurity of the United States as an essential part of their core mission to protect the American people and uphold the Constitution. And they should be doing so even if there will be real and painful costs associated with such a cybersecurity-forward orientation. The stakes are too high and our current cybersecurity situation too grave to adopt a different approach.
…
In light of the serious nature of this profound and overarching [cybersecurity] threat, and in order to execute fully their responsibility to protect the nation from catastrophic attack and ensure the continuing operation of basic societal institutions, public safety officials should embrace encryption.2
1The telephone was eventually ‘unlocked’ by the Australian company Azimuth: Ellen Nakashima and Reed Albergotti, ‘The FBI wanted to unlock the San Bernardino shooter’s iPhone. It turned to a little-known Australian firm’ The Washington Post, 14 April 2021, https://www.washingtonpost.com/technology/2021/04/14/azimuth-san-bernardino-apple-iphone-fbi/.
2Jim Baker, ‘Rethinking encryption’ Lawfare (22 October 2019), https://www.lawfareblog.com/rethinking-encryption (original emphasis).
8.83 Baker’s remarks illustrate how the encryption debate has turned in recent years from the ‘false dichotomy’ between security and privacy to a discussion of competing security interests.1 ICT companies and computer scientists argue that encryption is necessary to protect users from criminals, while law enforcement authorities argue that encryption protects criminals from detection and prosecution. In reality, encryption does both. Privacy advocates and computer scientists argue that criminals will always find a way to communicate anonymously and that measures designed to allow governments to have access to keys or back doors will do more harm to regular users of these technologies.2 Computer scientists in particular have raised alarms that any government proposals for ‘exceptional access’ to encrypted systems are ‘unworkable in practice, raise enormous legal and ethical questions, and would undo progress on security at a time when Internet vulnerabilities are causing extreme economic harm’.3 These same computer scientists concluded a report analysing law enforcement proposals for exceptional access with the following observations:
This report’s analysis of law enforcement demands for exceptional access to private communications and data shows that such access will open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend. The costs would be substantial, the damage to innovation severe, and the consequences to economic growth difficult to predict.4
1Professor Susan Landau sets out the arguments about encryption very clearly in her book: Listening In: Cybersecurity in an Insecure Age (Yale University Press 2017); Encryption Working Group, Carnegie Endowment for International Peace, Center for Information on Technology Policy, Princeton University, ‘Moving the encryption policy conversation forward’ (September 2019) 3, https://carnegieendowment.org/files/EWG__Encryption_Policy.pdf.
2For more on this debate, see the essay series at Daniel J. Weitzner, ‘Perspectives on encryption and surveillance’, Lawfare, 29 November 2018, https://www.lawfareblog.com/perspectives-encryption-and-surveillance; for a historical perspective, see Danielle Kehl, Andi Wilson and Kevin Bankston, ‘Doomed to repeat history? Lessons from the crypto wars of the 1990s’ (2015), https://static.newamerica.org/attachments/3407-doomed-to-repeat-history-lessons-from-the-crypto-wars-of-the-1990s/Crypto%20Wars_ReDo.7cb491837ac541709797bdf868d37f52.pdf.
3Harold Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Matthew Green, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Michael A. Specter and Daniel J. Weitzner, ‘Keys under doormats: mandating insecurity by requiring government access to all data and communications’ [2015] Journal of Cybersecurity 1, https://academic.oup.com/cybersecurity/article-lookup/doi/10.1093/cybsec/tyv009.
4Abelson and others, ‘Keys under doormats’, 24–25.
8.84 However, the Encryption Working Group, funded by the Carnegie Endowment for International Peace and comprising former government officials, business representatives, privacy advocates, law enforcement authorities and computer scientists in the US, believes that ‘more common ground is attainable’ if the discussion between participants focuses on the individual ‘component parts’ within the larger umbrella of encryption policy.1 For example, while encryption for data in transit may raise many of the issues that concerned the computer scientists cited above, ‘some forms of access to encrypted information, such as access to data at rest on mobile phones’ may be possible.2 By debating specific types of encryption, data and devices, it may be possible to find a sensible middle-ground approach that allows law enforcement authorities to obtain access to decrypted data without endangering cybersecurity or privacy.3
1Encryption Working Group, Carnegie Endowment for International Peace, Center for Information on Technology Policy, Princeton University, ‘Moving the encryption policy conversation forward’ (September 2019) 4, https://carnegieendowment.org/files/EWG__Encryption_Policy.pdf.
2Encryption Working Group, Carnegie Endowment for International Peace, ‘Moving the encryption policy conversation forward’, 17.
3Similarly, UK officials at GCHQ have advocated for cooperation and collaboration given the lack of straightforward solutions in the security versus security debate. Ian Levy and Crispin Robinson, ‘Principles for a more informed exceptional access debate’, Lawfare, 29 November 2018, https://www.lawfareblog.com/principles-more-informed-exceptional-access-debate.
8.85 Another perspective on this debate is provided by the decision of the Canadian court in R. v Beauchamp.1 In this case, an unusual application was brought. Rather than the law enforcement agency seeking access to encrypted data, the defence sought an order to require the Crown to disclose a copy of encrypted files located on a hard drive that had been seized by the police. The Crown had not been able to decrypt the files, and as a result had no knowledge of the data that was encrypted. It was agreed that the encrypted information was both potentially inculpatory and potentially exculpatory for the accused parties. The Crown submitted that the encrypted information was beyond its control, and although it was arguably in its possession, it was not in a format that the Crown was able to view. The judge concluded that the Crown was in partial possession and control of the hard drives, but it had no knowledge of the information in the encrypted files. Smith J analysed the position as follows:
The seizure by the police of the hard drives containing encrypted information is similar to the seizure of a locked safe which the police cannot open, containing documents which include both inculpatory and exculpatory evidence. The police or Crown would clearly be in possession or control of the safe, but if they did not have the key or combination and were unable to break the safe open, then they would not have knowledge of the contents of the safe. In this case, the Crown’s control of the contents of the safe, which are known to one accused but not to the Crown, is not complete, as the Crown needs the key or combination, or in this case the password, in order to access the documents in the safe. The unique feature of this case is that the accused … has the key or password, which is necessary to complete the possession or control of the information in the safe.2
12008 CarswellOnt 2756, [2008] OJ No 1347, 171 CRR (2d) 358, 58 CR (6th) 177, 77 WCB (2d) 177; for further cases in Canada, see R. v Burke 2013 CarswellOnt 8417, 2013 ONCA 424, [2013] OJ No 2920, 107 WCB (2d) 662, 285 CRR (2d) 6, 298 CCC (3d) 396, 307 OAC 171; R. v M. 2012 CarswellMan 256, 2012 MBQB 141, [2012] MJ No 174, 101 WCB (2d) 168, 279 Man R (2d) 80, 93 CR (6th) 155; R. v Stemberger 2012 CarswellOnt 492, 2012 ONCJ 31, [2012] OJ No 221, 100 WCB (2d) 20, 254 CRR (2d) 1; see also Lex Gill, ‘Law, metaphor, and the encrypted machine’ (2018) 55 Osgoode Hall LJ 440; Steven Penney and Dylan Gibbs, ‘Law enforcement access to encrypted data: legislative responses and the Charter’ (2017) 63 McGill LJ 201.
22008 CarswellOnt 2756 at [40].
8.86 For these reasons, the application for disclosure of a copy of the encrypted files in the hard drives was refused, although the judge indicated that the applicants could, at their option, obtain disclosure of the contents if they provided the password or key to the Crown, and the Crown would then review the material. Had the application been allowed, it would have created an untenable situation. The state would have provided a file that only one party (the defence) could view. The defence would presumably extract the exculpatory evidence without giving the Crown sight of the inculpatory evidence. It is suggested that this decision struck the correct balance, namely enabling the defence to disclose the key so that both parties have access to the plaintext material.
8.87 The Court of Cassation in Belgium1 has held that Belgian law can require a criminal suspect to disclose their mobile telephone passcode without violating their right to remain silent and to not incriminate oneself, provided the investigating authority can show that the mobile telephone was detected without coercion and that the suspect knows the passcode ‘without reasonable doubt’.2 The Belgian Court interpreted the right to not incriminate oneself as guaranteed by the ECHR, the International Covenant on Civil and Political Rights, and Directive (EU) 2016/343 of 9 March 2016 on the strengthening of certain aspects of the presumption of innocence and of the right to be present at the trial in criminal proceedings.3 The court found that the qualified right against self-incrimination would not prevent authorities from gathering evidence that exists ‘independently of the will of the person who has knowledge of’ the passcode.4 It held that ‘the main purpose of the right not to incriminate oneself is to safeguard the right to a fair trial by excluding false statements made under duress’.5 Because the passcode remains unchanged regardless of whether it is communicated, the court determined that ‘[t]here is no risk of unreliable evidence’.6
1The authors thank Professor Dr Joachim Meese, Faculty of Law, Universiteit Antwerpen, Belgium and lawyer at the bar of Ghent, Belgium for his review of this section on Belgium.
2Attorney General at the Court of Appeal of Ghent v M A, 4 februari 2020 P.19.1086.N, Hof van Cassatie, tweede kamer (Court of Cassation, second chamber), English translation in (2020) 17 Digital Evidence and Electronic Signature Law Review 94. For comments on this decision, see C. Conings and R. De Keersmaecker, ‘To save but not too safe: hoogste Belgische rechters zien geen graten in het decryptiebevel voor de verdachte’ (2020) 3 Tijdschrift voor Strafrecht 163 and F. Koning, ‘Droit au silence et à ne pas s’incriminer: Quo Vadis?’ (2020) 6807 Journal des Tribunaux 204.
3OJ L 65, 11.3.2016, 1.
4Attorney General v M A, 4 februari 2020 P.19.1086.N, Hof van Cassatie, tweede kamer (Court of Cassation, second chamber), English translation in (2020) 17 Digital Evidence and Electronic Signature Law Review, 95–96.
5Attorney General v M A, 4 februari 2020 P.19.1086.N, Hof van Cassatie, 3.
6Attorney General v M A, 4 februari 2020 P.19.1086.N, Hof van Cassatie, 3.
8.88 The court found that the passcode is thus akin to the communication of biometric data, which is a permissible derogation from the right to not incriminate oneself. Even if the passcode reveals information that is subject to substantial criminal sanctions, the communication of the passcode itself only relates to accessing an ‘already discovered IT system’.1 Thus, the compelled decryption of a telephone passcode did not violate the suspect’s right to remain silent.
1Attorney General v M A, 4 februari 2020 P.19.1086.N, Hof van Cassatie, 3.
8.89 Encryption is a fundamental part of modern cybersecurity. Good quality encryption provides reassurance to users that sensitive information can be securely stored. However, it is obvious that a criminal can also use encryption to hide his actions. More than this, encryption allows material to be hidden from everyone else while remaining accessible to the possessor of the cryptographic key. Throughout history, people have hidden files or objects that they do not want law enforcement authorities to find. However, that invariably affects the ability of the owner to use the files or objects. For example, a person could send photographs or physical records out of the jurisdiction to a country with which no mutual legal assistance treaty exists. This will keep the data away from investigators, but the owners will not be able to view the photographs or records themselves. Encryption, however, means that the possessor of the key can easily open a file and, for example, look at illegal material at will, while preventing law enforcement authorities from knowing what is being looked at.
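The asymmetry described in this paragraph can be demonstrated in a few lines of code. The following Python sketch uses the open-source ‘cryptography’ package; the passphrase, message and key-derivation parameters are hypothetical choices made for illustration, not a description of any particular product’s implementation.

import base64, os
from cryptography.fernet import Fernet, InvalidToken
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def key_from_passphrase(passphrase: bytes, salt: bytes) -> bytes:
    """Derive a 32-byte symmetric key from a passphrase, much as consumer devices do."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt,
                     iterations=480_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase))

salt = os.urandom(16)
ciphertext = Fernet(key_from_passphrase(b'hypothetical passphrase', salt)).encrypt(
    b'an illustrative secret')

# Without the passphrase, the stored ciphertext is useless to an investigator:
try:
    Fernet(key_from_passphrase(b'wrong guess', salt)).decrypt(ciphertext)
except InvalidToken:
    print('wrong key: the contents remain hidden')

# The possessor of the passphrase, by contrast, recovers the plaintext at will:
print(Fernet(key_from_passphrase(b'hypothetical passphrase', salt)).decrypt(ciphertext))

The holder of the passphrase opens the file in milliseconds; to anyone else the ciphertext is, for practical purposes, indistinguishable from random data – precisely the one-sided accessibility described above.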
8.90 The philosophical basis of the right against self-incrimination is of fundamental importance in any criminal justice system. The difficulty is in establishing a balance between the right not to incriminate oneself when accused by the state, and the rights of victims and society to liberty and security. This is a difficult balance to achieve,1 and in this chapter we have described how various jurisdictions have approached the problem.
1For a different perspective, see Phillip Rogaway, ‘The moral character of cryptographic work’, Cryptology ePrint Archive, Report 2015/1162, http://web.cs.ucdavis.edu/~rogaway/papers/moral.html.
8.91 The balance between the competing security interests implicated by encryption will require continued scrutiny from courts and lawmakers as encryption technologies advance.1 These issues will continue to affect criminal investigations into a wide variety of criminal conduct.2 While the cases explored in this chapter mostly involved criminal offences concerning child sexual abuse materials, law enforcement authorities need to obtain access to encrypted evidence in offences ranging from cybercrime and organized crime to routine investigations into theft, assault and homicide. Thus, we can expect the law concerning access to encrypted data to continue to develop and evolve.
1Quantum cryptography and computing are expected to revolutionize encryption in the coming decades. This may result in unbreakable encryption and likewise the capability to break all existing encryption keys. See Ian Walden, ‘“The sky is falling!” – responses to the “going dark” problem’ (2018) 34 Computer Law & Security Review 901, 906.
2Public support for end-to-end encryption increases if child safety can be protected. While end-to-end encryption is useful for privacy, it also presents risks to child safety and means that abuse can go unnoticed online. The following papers set out new research and analysis about the implications of end-to-end encryption for child protection: National Society for the Prevention of Cruelty to Children, ‘End-to-end encryption: understanding the impacts for child safety online’ (April 2021); National Society for the Prevention of Cruelty to Children, ‘Private messaging and the rollout of end-to-end encryption: the implications for child protection’ (April 2021), both at https://www.nspcc.org.uk/about-us/news-opinion/2021/adults-support-encryption-if-children-safety-protected/; also see Derek Johnson, Erin Faulkner, Georgia Meredith and Tim J. Wilson, ‘Police functional adaptation to the digital or post digital age: discussions with cybercrime experts’ (2020) 84(5) Journal of Criminal Law 427; Tim J. Wilson, ‘Collaborative justice and harm reduction in cyberspace: policing indecent child images’ (2020) 84(5) Journal of Criminal Law 474.