Proof: the technical collection and examination of electronic evidence
Nigel Wilson, Andrew Sheldon, Hein Dries, Burkhard Schafer and Stephen Mason
9.1 This chapter addresses the challenges and methodologies associated with proving a fact with electronic evidence, and considers the measures relating to the accreditation of those performing a digital forensic analysis, together with the validation of the technologies, systems and methodologies used. It looks at how and why the correct handling, preservation and analysis of electronic evidence are critical steps in an investigation to ensure the reliability of proof. It explains how the probative value of the evidence can be affected and its reliability compromised when critical procedures or measures are not followed. It further describes the use of automation and technology solutions to enhance the efficiency of investigations, and the controls used to ensure the accuracy and forensic reliability of such investigations.
9.2 All electronic evidence exists, at its most basic level, in binary form. The binary digit (‘bit’) represents a logical state as one of two possible values, 0 or 1. Bits are in turn organized into groups of 8 called bytes. A byte can be used to represent letters of the alphabet and other characters. For example, the byte comprising the 8-bit sequence 01000001 may represent the letter ‘A’ in a word processing system, but could represent something entirely different in a video processing application. Interpretation is, therefore, relative to context and determined by the software used to interpret it. Such representation and interpretation are achieved by using multiple processing and storage layers within a digital system such as a computer, mobile telephone, GPS device or media player. These layers include hardware such as processing chips, digital cameras and networks; operating systems such as Windows, Linux, iOS and Android; application software such as word processors, email, web browsers, messaging clients and media players; and data storage such as hard disks, memory cards and cloud environments.
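The context-dependence of interpretation described above can be illustrated with a short sketch (not drawn from the chapter): the same 8-bit pattern yields different results depending on the software layer reading it.

```python
# Illustrative sketch: the byte 01000001 has no inherent meaning;
# the interpreting software supplies the context.
b = 0b01000001  # the 8-bit sequence from the example above

as_letter = bytes([b]).decode("ascii")  # text context: the letter 'A'
as_number = b                           # numeric context: the integer 65

print(as_letter)  # A
print(as_number)  # 65
```

The same pattern could equally be one sample of audio or one component of a pixel; nothing in the bits themselves discloses which reading is correct.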
9.3 Thus, when proving a fact using electronic evidence, what an individual may witness on the output such as the screen is the result of multiple phases of digital processing and interpretation performed by software. The risk is that any processing phase may be subject to error. Similarly, when specialist digital forensic software is used to preserve and present electronic evidence, it does so using a programmatic interpretation of the bits and bytes it finds, and this also may be subject to error or wrongful assumptions about the meaning of the data.
9.4 For these reasons, proof, as it applies to electronic evidence, requires more than simple reproduction of data. The process of adducing evidence and determining proof relies on the systems used and the training and experience of the people creating and using them. Both people and systems must undergo rigorous accreditation and validation to demonstrate that all such interpretations of data have been performed accurately. This is partly because of the unique nature of electronic evidence: it is extremely volatile and subject to being altered with ease, even by the simple act of switching a computer on or off.1
Nigel Wilson, Andrew Sheldon, Hein Dries, Burkhard Schafer and Stephen Mason, ‘Proof: the technical collection and examination of electronic evidence’, in Stephen Mason and Daniel Seng (eds.), Electronic Evidence and Electronic Signatures (5th edn, University of London 2021) 429–487.
1Graeme B. Bell and Richard Boddington, ‘Solid state drives: the beginning of the end for current practice in digital forensic recovery?’ (2010) 5(3) Journal of Digital Forensics, Security and Law 1; Michael Wei, Laura M. Grupp, Frederick E. Spada and Steven Swanson, ‘Reliably erasing data from flash-based solid state drives’, Proceedings of the 9th USENIX Conference on File and Storage Technologies (USENIX Association Berkeley, CA, 2011).
Accreditation of the digital forensics discipline
9.5 By their nature, investigations and examinations of electronic evidence are relatively new compared to other more established forms of forensic analysis such as fingerprinting, DNA analysis, toxicology and ballistics. Broadly speaking, electronic or digital (the terms are used interchangeably) investigations are concerned with the gathering, preservation and analysis of relevant digital data to provide both evidence and intelligence to assist with criminal investigations1 and prosecutions, and with civil and regulatory matters and proceedings.
1The evidence of digital systems can also help reconstruct what happened in an incident, for which see Mario Piccinelli and Paolo Gubian, ‘Modern ships Voyage Data Recorders: a forensics perspective on the Costa Concordia shipwreck’ (2013) 10 Digital Investigation S41.
9.6 Forensic analysis of electronic evidence draws from diverse disciplines such as electrical and electronic engineering and computer science, and includes sub-disciplines such as computer forensics, network forensics, cloud forensics, data analysis, audio and video analysis and analysis of mobile devices. Accreditation has been strongly supported from within the diverse digital forensics discipline. In the USA, the American Society of Crime Laboratory Directors/Laboratory Accreditation Board (ASCLD/LAB) formally accredited digital forensics as a discipline in 2003 together with four sub-disciplines: computer forensics, audio analysis, video analysis and image analysis. In 2016 the ASCLD/LAB was acquired by the American National Standards Institute-American Society for Quality (ANSI-ASQ) National Accreditation Board (ANAB) and the four sub-disciplines were merged into a single discipline known as digital evidence forensics.1 There are similar specialist advisory groups in Australia and New Zealand.2 In the United Kingdom, the Forensic Science Regulator was established in 2008, and included digital forensics as a specialist group. In April 2020 the Forensic Science Regulator produced updated Codes of Practice and Conduct (issue 5) across the entire forensic industry, including for digital forensics.3
1https://anab.ansi.org/about-anab; Hong Guo and Junlei Huo, ‘Review of the accreditation of digital forensics in China’ (2018) 3 Forensic Sciences Research 194, who note that over 70 forensic inspection institutions and laboratories in the US are ANAB accredited. Fred Cohen, Digital Forensic Evidence Examination (4th edn, Fred Cohen & Associates 2012); Eoghan Casey, Digital Evidence and Computer Crime: Forensic Science, Computers and the Internet (3rd edn, Academic Press 2011) 1; Alastair Irons and Anastasia Konstadopoulou, ‘Professionalism in digital forensics’ (2007) 4 Digital Evidence and Electronic Signature Law Review 45; Simson Garfinkel, Paul Farrell, Vassil Roussev and George Dinolt, ‘Bringing science to digital forensics with standardized forensic corpora’ (2009) 6 Digital Investigation S2; Yinghua Guo, Jill Slay and Jason Beckett, ‘Validation and verification of computer forensic software tools—Searching Function’ (2009) 6 Digital Investigation S12; Simson L. Garfinkel, ‘Digital forensics research: the next 10 years’ (2010) 7 Digital Investigation S64; Jason Beckett and Jill Slay, ‘Scientific underpinnings and background to standards and accreditation in digital forensics’ (2011) 8 Digital Investigation 114.
2Australia New Zealand Policing Advisory Agency, http://www.anzpaa.org.au/forensic-science/forensic-sciences/forensic-groups; Australian Forensic Science Society, http://anzfss.org/about/; National Association of Testing Authorities, https://www.nata.com.au/.
3The Forensic Science Regulator, Codes of Practice and Conduct (FSR-C-100, Issue 5, 2020), https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/880708/Codes_of_Practice_and_Conduct_-_Issue_5.pdf; the Nederlands Register Gerechtelijk Deskundigen (Netherlands Register of Court Experts) undertook a similar process in 2018, see https://lrgd.nl.
9.7 Although formal academic certification of practitioners in various disciplines and accreditation of digital forensic processes are relatively common among the digital forensic community, there is, at the time of writing, a noticeable absence of such accreditation for the process of ‘cloud forensics’ and other evidence obtained online or through, for example, social media. In this regard, academic qualifications that teach the fundamental principles of preservation, continuity, critical thinking and verification, rather than qualifications in the use of commercial tools, can be applied to ‘cloud’ forensics. However, there are a number of significant and unique challenges when dealing with data obtained online, not least of which is the fact that potential evidence held by a service provider may not be preserved in the normal manner when creating an image or snapshot of the target storage or computer platform. To this end, frameworks1 are being developed for cloud forensics evidence collection and analysis using security information and event management that draw upon research by the National Institute of Standards and Technology (NIST),2 specifically the NIST Cloud Computing Forensic Science Working Group.
1Muhammad Irfan, Haider Abbas, Yunchuan Sun, Anam Sajid and Maruf Pasha, ‘A framework for cloud forensics evidence collection and analysis using security information and event management’ (2016) 9 Security and Communication Networks 3790, https://doi.org/10.1002/sec.1538.
2Martin Herman, Michaela Iorga, Ahsen Michael Salim, Robert H. Jackson, Mark R. Hurst, Ross Leo, Richard Lee, Nancy M. Landreville, Anand Kumar Mishra, Yien Wang and Rodrigo Sardinas, NIST Cloud Computing Forensic Science Challenges (NISTIR 8006, August 2020), https://doi.org/10.6028/NIST.IR.8006.
Guidelines for handling digital evidence
9.8 Along with the benefits of consistency and uniformity arising from accreditation, numerous guidelines have been produced, premised on uniformity and standardization of procedures that are relevant to the collection and handling of electronic evidence. In 1995 the International Organization on Computer Evidence was established to provide international law enforcement authorities with a forum to facilitate the exchange of information relating to computer crime investigations and other issues relating to digital forensic investigations.1 This organization, together with several other UK authorities, including the Association of Chief Police Officers (ACPO) and the National High-Tech Crime Unit, has produced a number of guidelines for the investigation and examination of electronic evidence within a criminal context. Although various sets of guidelines have, in the main, been produced specifically for criminal investigations, nevertheless the guidelines are also of significant help to practitioners and lawyers in civil matters.2
1See also N. Dudley-Gough, ‘Digital forensic certification board’ (2006) 3(1) Digital Investigation 7; Amber Schroader and N. Dudley-Gough, ‘The Institute of Computer Forensic Professionals’ (2006) 3(1) Digital Investigation 9; note also the European Informatics Data Exchange Framework for Court and Evidence, a project running for 32 months (March 2014–October 2016), http://www.evidenceproject.eu.
2Casey, Digital Evidence, 230, indicates that the most mature and practical guidelines are those produced by ACPO.
9.9 In April 2020 the UK Forensic Science Regulator published an informational guidance document entitled Legal Obligations Issue 8.1 This provides a relatively high-level overview of the obligations placed on expert witnesses in the Criminal Justice System in England and Wales. Contemporary guidelines include documents from Australia and New Zealand,2 the United Kingdom,3 the USA,4 Europe,5 Asia6 and ISO/IEC Standards.7 Likewise, INTERPOL has also established the Global Guidelines in relation to Digital Forensics Laboratories.8
1Forensic Science Regulator Legal Obligations (FSR-I-400, Issue 8, 2020), https://www.gov.uk/government/publications/legal-obligations-issue-8.
2Australia and New Zealand Guidelines for Digital Imaging Processes (2013, ANZPAA), https://www.anzpaa.org.au/ArticleDocuments/180/2013%20Australia%20and%20New%20Zealand%20Guidelines%20for%20Digital%20Imaging%20Processes.pdf.aspx.
3UK ACPO Good Practice Guide for Digital Evidence, https://www.digital-detective.net/digital-forensics-documents/ACPO_Good_Practice_Guide_for_Digital_Evidence_v5.pdf.
4National Institute of Justice, Forensic Examination of Digital Evidence: A Guide for Law Enforcement, US Department of Justice, 2004, https://www.ncjrs.gov/pdffiles1/nij/199408.pdf.
5European Union, European Anti-Fraud Office, Guidelines on Digital Forensic Procedures for OLAF Staff, 2016, https://ec.europa.eu/anti-fraud/sites/antifraud/files/guidelines_en.pdf.
6For example, see China – Ministry of Public Security of the People’s Republic of China (2019) Rules on Collection of Electronic Data by Public Security Bureau when Handling Criminal Cases and (2016) Rules on Electronic Data Collection, Extraction and Review in Criminal Cases.
7ISO/IEC 27037:2012 [ISO/IEC 27037:2012] ‘Information technology — Security techniques — Guidelines for identification, collection, acquisition and preservation of digital evidence’ (confirmed in 2018), https://www.iso.org/standard/44381.html.
8INTERPOL, 2019 Global Guidelines by INTERPOL for Digital Forensics Laboratories, https://www.interpol.int/content/download/13501/file/INTERPOL_DFL_GlobalGuidelinesDigitalForensicsLaboratory.pdf.
9.10 As with any other form of evidence, there are a number of discrete elements that accompany the collection and handling of digital evidence. It is suggested that a digital evidence professional should, ideally, undertake her duties against the highest standards that are propounded by her peers, regardless of whether she is assisting in a criminal or civil matter. In Bilta (UK) Limited (in Liquidation) v Nazir,1 Lewison J indicated that he did not consider it an automatic requirement that parties to civil proceedings have to subject hard drives to forensic discovery techniques. It is debatable whether it is wise not to subject hard drives to forensic search techniques, as demonstrated in the case of In the matter of Stanford International Bank Limited (in liquidation), Fundora v Hamilton-Smith.2 This was an application for the discharge of the appointed Joint Official Liquidators of Stanford International Bank Limited and other parties on the basis that, among other reasons, they destroyed digital data and employed improper practices in relation to computer and electronic data.3 The precise matters in dispute were as follows:
The matters which tell [sic] to be considered can be narrowed down to the following: (a) three servers at the Montreal office of SIB were not imaged and not copied, (b) four desktops and laptops were not imaged but were securely erased, (c) the email servers and Blackberry enterprise servers were not imaged; (d) the IT specialists did not appear to have been instructed by the Liquidators to search for, collect and image the Blackberrys and data sticks.4
1[2010] EWHC 1086 (Ch), [2010] Bus LR 1634, [2010] 2 Lloyd’s Rep 29, [2010] 5 WLUK 368, [2010] CLY 420.
22–3 March and 8 June 2010, Claim Number ANUHCV2009/0149 Eastern Caribbean Supreme Court in the High Court of Justice Antigua and Barbuda; the judgment is available at http://www.eccourts.org/wp-content/files_mf/1358795765_magicfields_pdf_file_upload_1_1.pdf and the Court of Appeal decision is available at http://www.eccourts.org/wp-content/files_mf/1358779099_magicfields_pdf_file_upload_1_1.pdf; see also Stanford International Bank Ltd (In Receivership), Re [2010] EWCA Civ 137, [2011] Ch 33, [2010] 3 WLR 941, [2010] Bus LR 1270, [2010] 2 WLUK 712, [2011] BCC 211, [2010] Lloyd’s Rep FC 357, [2010] BPIR 679, [2010] CLY 1873, also known as Serious Fraud Office v Wastell, Janvey v Wastell, Stanford International Bank Ltd v Director of the Serious Fraud Office.
3Discussed at [44]–[115] of the judgment.
4At [50] of the judgment.
9.11 After considering the relevant ACPO guidelines at the material time, Thomas J decided that the action of the Joint Official Liquidators was not in accordance with standard forensic practice, and in so doing, they acted improperly. To this extent, the various guidelines put forward as best practices provide sound advice and guidance when dealing with electronic evidence, and if followed, they can serve to counter allegations that the evidence has not been gathered or dealt with properly.
9.12 In Khodorkovskiy and Lebedev v Russia,1 a case before the European Court of Human Rights, the defence raised a number of important issues challenging the electronic evidence sought to be admitted, issues that related to the volatile and mutable nature of such evidence. It alleged, among other things, that the hard drives that were seized had not been properly packed and sealed, so that it was possible to add information to them while the drives were in the possession of the General Prosecutor of the Russian Federation,2 the investigators having discovered more files on the drives than the experts had found when examining the same drives,3 that the drives and the list of files discovered by the prosecution were not attached by the General Prosecutor to the case materials, and that there was no evidence documenting the continuity of the evidence.4 In concluding that these deficiencies were not relevant, the court said:
Possible discrepancies in the documents describing the amount of data contained on the hard drives, inaccuracies as to the exact location of the computer servers, and other defects complained of may have various explanations. The Court cannot detect any manifest flaw in the process of seizing and examining the hard drives which would make the information obtained from them unfit for use at the trial.5
111082/06 and 13772/05 – [2013] ECHR 747 (25 July 2013).
211082/06 and 13772/05 – [2013] ECHR 747 (25 July 2013) at [72]. When examining the hard drives seized during the searches of 9 October 2003, the investigators discovered 4,939 more files on the drives than those examined by the experts: 11082/06 and 13772/05 – [2013] ECHR 747 (25 July 2013) at [181].
311082/06 and 13772/05 – [2013] ECHR 747 (25 July 2013) at [679].
411082/06 and 13772/05 – [2013] ECHR 747 (25 July 2013) at [678].
511082/06 and 13772/05 – [2013] ECHR 747 (25 July 2013) at [702].
9.13 In our opinion, this decision fails to emphasize the importance of professional digital forensics when seizing data in digital form. From a technical perspective, a hash value, calculated on site upon taking a forensic image of the hard drives that have been seized (or, if this is impossible, shortly thereafter), could have easily served as proof of the evidence having been untouched since it was first acquired (provided that the hash was kept securely or communicated to the defence or suspect at an early stage). However, there was no indication that the court relied upon, or the defence proffered, such evidence in support of such a conclusion.
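The hash-at-acquisition practice described above can be sketched as follows. This is an illustrative outline only, assuming a local image file; in practice the digest, the algorithm used and the time of computation would all be recorded in the contemporaneous notes.

```python
# Hypothetical sketch: computing a cryptographic digest of an acquired
# forensic image so that later recomputation can demonstrate that the
# evidence has not changed since acquisition.
import hashlib

def hash_image(path: str, algorithm: str = "sha256") -> str:
    """Return the hex digest of the file at `path`, read in chunks so
    that images larger than available memory can be processed."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# At acquisition: record the digest and disclose it (e.g. to the defence).
# At examination or trial: recompute and compare; a change to a single
# bit of the image produces a completely different digest.
```

Because the digest is short, it can be written into an exhibit log or served on the opposing party at an early stage, which is precisely the safeguard the authors suggest was absent in Khodorkovskiy and Lebedev.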
9.14 Notwithstanding the preference for an original forensic image, or a demonstration of full and complete provenance, together with contemporaneous notes beginning from the source of the evidence, it may still be possible to prove the authenticity of certain types of digital evidence beyond doubt. One example of such a method is the examination of email messages which are downloaded to a USB drive. In such circumstances, it may not be possible to establish direct continuity from the computer or server on which the original email messages were created or transmitted before they were downloaded to the USB drive. Nor may it be possible to obtain a forensic image of the senders’ or receivers’ computers. Therefore, the authenticity of the email messages and any attachments may be called into question. However, if an email message contains a DKIM1 (Domain Keys Identified Mail) signature, it is possible to establish beyond doubt that the message and any attachment is authentic and has not been modified since it was sent, regardless of how it has been handled since.2 This is because DKIM is one of the authentication methods used by mailbox providers to determine that an email was sent from a particular email account.3
1Internet Engineering Task Force (IETF) RFC 6376 – Domain Keys Identified Mail (DKIM), https://tools.ietf.org/html/rfc6376. Updated by RFCs 8301, 8463, 8553 and 8616.
2Organizations wishing to sign mail by way of DKIM will first generate two cryptographic keys. One of the keys is kept private and available to the sending server for the signing of mail, and the other is made public in the DNS (Domain Name System) for use by receiving domains in attempts to validate the signature. By using this cryptographic key exchange and the same validation mechanisms used by the original sender’s domain, it is possible to revalidate the content of the email message using the original sender domain cryptographic DKIM keys to recalculate the DKIM signature. If the DKIM signature can be reverified, not only is the content of the email message submitted for examination identical to the original message sent, including any attachments, but it also confirms the date and time of transmission, the subject line and the sender and recipient email addresses. An email message with a successfully revalidated DKIM signature can be considered to have similar veracity to a forensic image of the message.
3Even this need not be true. It depends very much on how users and senders are authenticated and how relaying is allowed on the system involved. DKIM is merely a pipeline: everything that goes in one end comes out the other signed.
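One element of the DKIM revalidation described in note 2 above, the body-hash check, can be sketched in simplified form. This is an illustrative sketch only: RFC 6376 defines the ‘bh=’ tag as a base64-encoded hash of the canonicalized body, and full verification additionally checks the cryptographic signature against the public key published in the sender domain’s DNS, which is omitted here.

```python
# Simplified sketch of the DKIM body-hash ('bh=') check, RFC 6376.
# Only the body-hash recomputation is shown; signature verification
# against the DNS-published public key is omitted.
import base64
import hashlib

def simple_body_hash(body: bytes) -> str:
    """Body hash under 'simple' body canonicalization: trailing empty
    lines are reduced to a single CRLF, then the result is SHA-256
    hashed and base64 encoded."""
    canonical = body.rstrip(b"\r\n") + b"\r\n"
    return base64.b64encode(hashlib.sha256(canonical).digest()).decode()

recorded_bh = simple_body_hash(b"Hello, world\r\n")  # value in the header
# An unchanged body (trailing blank lines are canonicalized away) matches:
assert simple_body_hash(b"Hello, world\r\n\r\n") == recorded_bh
# Any substantive alteration to the body does not:
assert simple_body_hash(b"Hello, w0rld\r\n") != recorded_bh
```

The same property underpins the forensic use of DKIM: if the recorded hash can be reproduced from the message as tendered, the body has not been altered since signing.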
9.15 However, in other cases the application of cryptographic hashing or other cryptographically sound acquisition techniques is difficult, if not impossible. Increasingly, evidence procured from large computing platforms and cloud services will require testing by digital evidence professionals.1 Another way to prove the data is to rely on a third party – such as the service provider – that controls the storage of data. This means that rather than relying on an operating procedure for acquiring evidence, the provenance and trustworthiness of the provider become relevant.
1An example is a case where Facebook was asked to provide the IP address of the poster of what is called ‘revenge porn’ (sexually explicit material typically posted after a break-up or end of a relationship). For an example see ‘Facebook ordered by Dutch court to identify revenge porn publisher’, The Guardian, 26 June 2015, https://www.theguardian.com/technology/2015/jun/26/facebook-ordered-by-dutch-court-to-identify-revenge-porn-publisher and ‘Facebook to give access to two revenge porn investigators’, NL Times, 6 November 2015, https://nltimes.nl/2015/11/06/facebook-give-access-two-revenge-porn-investigators.
Identifying electronic evidence
9.16 The first sign that something is wrong may be in the form of electronic evidence. For instance, a security administrator in a bank might consider an investigation necessary when the intrusion detection system sets off an alarm, or where the email logs indicate that a particular member of staff is receiving an excessive number of emails during the course of a day or over an extended period. The case of Miseroy v Barclays Bank plc1 is illustrative. In July 2002 a formal investigation was initiated because an employee of Barclaycard appeared to be receiving a disproportionate number of emails during the day. The audit of the emails sent and received by three employees showed that one Mr Miseroy, who was with the Fraud Prevention Department, had sent a significant number of emails. As a result, he was also included in the investigation. After a series of investigatory meetings, it was concluded that Mr Miseroy had abused the email facilities by sending out an unwarranted number of personal emails, in breach of the Group IT Security Policies regarding the use of corporate email facilities. Some of the emails he sent out included content that was derogatory, offensive and sexist, which Mr Miseroy admitted was not appropriate. The investigations also showed emails exchanged between him and a manager in a different department, in which Mr Miseroy had arranged to pass cannabis to that manager. It was also determined that Mr Miseroy disclosed confidential information regarding Barclays’ operations and customers. Mr Miseroy was summarily dismissed for gross misconduct, and the members of the tribunal accepted that his dismissal was within the range of reasonable responses of a reasonable employer in relation to the circumstances of the case.
1(Case No 1201894/2002) (18 March 2003, unreported) Bedford employment tribunal.
9.17 Such a case, where the source and reliability of evidence that something is wrong need to be assessed, will require an investigation into the facts. At such an early stage, the actions of the investigator may inadvertently change the electronic evidence itself. For instance, in the case of Aston Investments Limited v OJSC Russian Aluminium (Rusal),1 the actions of the IT administrators caused important files and information to be removed, and subsequent forensic examination ran into difficulties because of the unintended changes made to the system. This is why it is essential to have an appropriate procedure in place to deal with the way an investigation is initiated and conducted, whether by way of civil proceedings, where there is an obligation for each party to disclose documents relating to matters in question under the Civil Procedure Rules,2 or in criminal matters, where the relevant investigating authorities have both common law and statutory powers to search and seize evidence. In the criminal context, investigating police officers will be expected to have conducted themselves in accordance with recognized guides for their jurisdiction. In the United Kingdom, ACPO3 has produced the ACPO Good Practice Guide for Digital Evidence (ACPO Guide).4 The ACPO Guide sets out the four main phases for handling electronic evidence – collection, examination, analysis and reporting – and concentrates on the collection phase. A digital evidence professional should consider adopting the practices for the four phases of his investigations. With the advent of forensic triage techniques, these four phases may be augmented with an initial ‘assessment’ or ‘triage selection’ phase.
1[2006] EWHC 2545 (Comm), [2007] 1 All ER (Comm) 857, [2007] 1 Lloyd’s Rep 311, [2006] 10 WLUK 470, [2006] 2 CLC 739, [2006] Info TLR 269, Times, 31 October 2006, [2007] CLY 684.
2For a discussion of some flaws in the legal and forensic process, see Vlasti Broucek, Paul Turner and Sandra Frings, ‘Music piracy, universities and the Australian Federal Court: Issues for forensic computing specialists’ (2005) 21(1) Computer Law & Security Report 30.
3ACPO was replaced in 2015 by a new body, the National Police Chiefs’ Council. This was set up under a police collaboration agreement under the provisions of s 22A of the Police Act 1996. The acronym ACPO will continue to be used in this chapter, because the current version of the guidelines predated the formation of the National Police Chiefs’ Council.
4(March 2012, v5), http://library.college.police.uk/docs/acpo/digital-evidence-2012.pdf.
9.18 While the following discussion concentrates on matters relating to electronic evidence in the context of a criminal investigation, the reader will readily acknowledge the relevance of the discussion in the context of a civil matter when undertaking work in the disclosure phase of a civil action.1
1The tension between forensics and investigations is discussed, among other things, in Monique Mattei Ferraro and Andrew Russell, ‘Current issues confronting well-established computer-assisted child exploitation and computer crime task forces’ (2004) 1(1) Digital Investigation 7.
9.19 Once it has been established that it is necessary to seize or gather evidence in digital form, a further set of procedures should be in place to guide the digital evidence professional with respect to the scene or information itself, including the identification and seizure or acquisition of the evidence as necessary.1 Where a physical crime scene is involved, it is now a well-established practice that the scene should be photographed, or even recorded by video, and the layout of the hardware recorded in relation to the scene. The investigator then needs to determine what, if any, physical evidence, such as computers, printers, computer mice or facsimile machines, should be retained (the ACPO Guide provides a list of the types of hardware and storage devices that may need to be retained).2 It is important not to permit anybody to disturb the hardware or the network, or work on any computer. It is also advisable that the police officers engaged in searching for digital evidence be properly trained.3
1For a brief discussion about gathering evidence and issues surrounding personal privacy, see María Verónica Péez Asinari, ‘Legal constraints for the protection of privacy and personal data in electronic evidence handling’ (2004) 18(2) International Review of Law, Computers & Technology 231.
2Brian Carrier and Eugene H. Spafford, ‘Getting physical with the digital evidence process’ (2003) 2(2) International Journal of Digital Evidence.
3Although Harvey J in the District Court, Manukau, New Zealand, ruled that digital evidence was not necessarily rendered inadmissible because the accuracy of the data might have been jeopardized where a police officer, with full knowledge of the relevant guidelines, chose to ignore them. In this instance, during the search of premises a police officer switched on a computer and took 45 minutes to search various files stored on it: R v Good [2005] DCR 804. For problems when investigating mainframes and very large systems, see Matthew Pemble, ‘Investigating around mainframes and other high-end systems: the revenge of big iron’ (2004) 1(2) Digital Investigation 90.
9.20 The problem with digital evidence is the ease by which the data can be altered or destroyed. Digital devices are volatile instruments. For instance, the random access memory in a computer will contain a great deal of information relating to the state of the computer, such as the processes that are running, whether the computer is connected to the Internet and what file systems are being used. When a computer is switched off, a large part of this volatile data is immediately and irretrievably lost. Depending on the circumstances of the case being investigated, it may be very important to retain such data before the computer is switched off or simply unplugged from the electricity supply. This question is becoming increasingly important because of the ready availability of encryption utilities that are easy to use, and the increasing availability of low-cost hard disks that include whole disk encryption as a matter of course. The preservation of a forensic copy of a computer system’s RAM may be the only way of gaining investigative access to the contents of a target device whose content is encrypted with complex keys.1 Indeed, there may be occasions when great care should be taken when arresting suspects physically at a computer, because it is possible that they might switch off the computer and disrupt or delete any incriminating files before any preventative action can be taken, as in the case of Aleksei Kostap. He was arrested by members of the Serious and Organised Crime Agency, who attached handcuffs to him, but with his hands in front of his body. According to a press report, he managed to take action that caused certain databases to be deleted. It was thought the databases might have contained records of the gang’s activities. 
Apparently, while handcuffed, Kostap also acted to initiate the use of intricate layers of encryption on various computer systems, which experts were not able to decrypt.2 In addition, new developments in the methods used to store data on storage devices may cause problems in the future. Graeme B. Bell and Richard Boddington have demonstrated that:
Evidence stored on modern internal primary storage devices can be subject to a process we label ‘self-corrosion’. What is meant by this is that even in the absence of computer instructions, a modern solid-state storage device can permanently destroy evidence to a quite remarkable degree, during a short space of time, in a manner that a magnetic hard drive would not. Here, the phenomenon of solid-state drive (SSD) self-corrosion is proven to exist through experimentation using real world consumer hardware in an experimentally reproducible environment.3
1Casey, Digital Evidence, 478.
2Tom Espiner, ‘Jailed ID thieves thwart cops with crypto’, ZDNet UK (19 December 2006).
3‘Solid state drives’; Ravi Kant Chaurasia and Priyanka Sharma, ‘Solid state drive (SSD) forensics analysis: a new challenge’ (2017) 2(6) International Journal of Scientific Research in Computer Science, Engineering and Information Technology 1081; Shiva Sai Ram Marupudi, ‘Solid state drive: new challenge for forensic investigation’ (2017) 30 Culminating Projects in Information Assurance, https://repository.stcloudstate.edu/msia_etds/30.
9.21 The authors provide the observations in this chapter for the guidance and assistance of professionals involved in facilitating the proof of digital evidence.
9.22 While it may be convenient to consider preservation of ‘cloud storage’ as analogous to preservation of data on a hard disk in a computer, this is not the case. Cloud service providers make computing infrastructure available as virtualized components that can be connected using virtualized networking infrastructure. It is usually possible, given enough privileges, to quickly preserve the state and contents of a virtual machine, its volatile memory and any block storage attached to it. But that is not the whole picture. In such virtualized environments, data of material relevance such as access and event logs may be available only to the members of staff of the service provider. Therefore, identifying and preserving the contents of volatile data and attached storage accessible to the victim(s) or suspect(s) but failing to have the service provider preserve all relevant system, access and event logs would be to miss the most important facts about the evidence and thus potentially undermine the ability to prove the case.
Gathering of data following legal retention or reporting obligations
9.23 In other cases, metadata and logs are retained by service providers pursuant to a legal requirement. This is often the case for identifying information such as IP addresses, telephone numbers and related subscriber information, and in the UK this was extended to data on sites visited on the Internet.1 Where such legal obligations exist, other safeguards will typically apply, and an assumption can be made about the accuracy of the data that is provided by a service provider. Such data, and the process of access and acquisition or reporting, are often subject to different legislative requirements such as privacy legislation, safeguards and obligations set out in telecommunications legislation or data retention regimes.2 In such cases, the probative value and admissibility will increasingly depend on the reliability of the service provider. In some cases, special accreditations or security checks are required for members of staff who work with this data or analyse it. In other cases, the related infrastructure is subject to audit requirements.
1Investigatory Powers Act 2016, s 62(7) reads: ‘In this Act “internet connection record” means communications data which – (a) may be used to identify, or assist in identifying, a telecommunications service to which a communication is transmitted by means of a telecommunication system for the purpose of obtaining access to, or running, a computer file or computer program, and (b) comprises data generated or processed by a telecommunications operator in the process of supplying the telecommunications service to the sender of the communication (whether or not a person)’.
2The legal basis of data retention can affect an investigation, in particular the attribution of an IP address. It appears that the Court of Justice of the European Union understands this position, but up to this point in time has invalidated a number of data protection regimes, which has caused considerable uncertainty for investigators, for which see Joined Cases C-203/15, Tele2 Sverige, and C-698/15, Tom Watson and Others; as well as currently pending cases: C-623/17 – A request for a preliminary ruling by the Investigatory Powers Tribunal of the UK concerning data retention in terrorism cases. C-520/18, a request for a preliminary ruling by the Belgian Constitutional Court concerns the questions concerning the admissibility of a general data retention scheme; and lastly Cases C-511/18 and C-512/18, both requests for a preliminary ruling of the French Conseil d’Etat which concern the legal framework for data retention for criminal investigations and for data retention for intelligence services.
Internet of Things data and sensors
9.24 A wide variety of sensors may be present at a crime scene, reflecting the growing number of devices connected to the Internet. This expanding array of Internet of Things (IoT) devices may contain evidence that can assist in the proof of a crime. The analysis of this data, which is sometimes also held by service providers (located ‘in the cloud’), can present a unique set of problems.
9.25 Location data, health data and data from other types of sensor may not be easy to interpret and will often require contextual analysis and interpretation. For example, location data may be important in the investigation of a murder that is alleged to have occurred in a park at a certain hour. The presence of data recorded within a personal device at the north entrance, and then again at the south entrance of the park within ten minutes of each other, may, at first sight, be indicative of a suspect’s presence at the crime scene. However, it may also be indicative of a loved one slowly driving around half the park by car, in ten minutes, to see if the victim is to be found. The accuracy, measurement interval and behaviour of the services involved (which may, for example, store the closest destination, such as the park entrance gates) and evidence from other sensors (such as the connections to the car radio and navigation by way of wireless technologies like Bluetooth) may serve as corroborating evidence of such a defence.1 Each of the available sensors in modern-day IoT devices has its own accuracy, measurement interval and data storage mechanisms, and, therefore, evidential value. Detailed knowledge of, and access to, significant testing facilities is needed to stay abreast of developments in this field and to understand the behaviour of devices in the IoT ecosystem.2
1By way of example relating to the accuracy of mobile telephone locations, see Matthew Tart, Iain Brodie, Nicholas Gleed and James Matthews, ‘Historic cell site analysis – overview of principles and survey methodologies’ (2012) 8(3–4) Digital Investigation 185; R. P. Coutts and H. Selby, ‘Problems with cell phone evidence tendered to “prove” the location of a person at a point in time’ (2016) 13 Digital Evidence and Electronic Signature Law Review 76; Reg Coutts and Hugh Selby, ‘“Mobile ping data” – metadata for tracking’ (2017) 14 Digital Evidence and Electronic Signature Law Review 22; Matthew Tart, Sue Pope, David Baldwin and Robert Bird, ‘Cell site analysis: roles and interpretation’ (2019) 59(5) Science & Justice 558; Matthew Tart, ‘Opinion evidence in cell site analysis’ (2020) 60(4) Science & Justice 363; in R. v Turner (Andrew Neil) [2020] EWCA Crim 1241, [2020] 9 WLUK 308, where a mobile telephone analyst provided evidence that was tantamount to expert evidence in which the members of a jury were presented with the appearance of cell site analysis, and then invited to infer facts without any knowledge of the technical knowledge required to substantiate any conclusions – this was wrongly upheld by the Court of Appeal. On similar facts, a differently composed Court of Appeal determined the position correctly in R. v Calland (Sean Thomas) [2017] EWCA Crim 2308, [2017] 12 WLUK 706.
2Compare, for example: M. J. Sorell and K. Hovhannisyan, ‘Arkangel: investigation of children’s tracking smartwatch ecosystem. Forensic value and privacy implications’, in Proceedings of the 4th Interdisciplinary Cyber Research Workshop 2018, Tallinn University of Technology, 60–62; M. DeVries and M. J. Sorell, ‘Biometric profiling of wearable devices for medical monitoring and authentication’ in Proceedings of the 4th Interdisciplinary Cyber Research Workshop 2018, Tallinn University of Technology, 46–48; Ibrahim Baggili, Jeff Oduru, Kyle Anthony, Frank Breitinger and Glenn Mcgee, ‘Watch what you wear: preliminary forensic analysis of smart watches’, 2015 10th International Conference on Availability, Reliability and Security, IEEE Explore (2015) 10.1109/ARES.2015.39.
Gathering data through network searches
9.26 The adoption of advanced encryption has significantly decreased the value of evidence gathered through traditional police powers, such as lawfully authorized interception. The advent of end-to-end encryption has led to the introduction, in many countries, of the power to conduct remote searches. These include the use – or exploitation – of software vulnerabilities in order to gain access to the systems involved. Evidence obtained through this method is difficult to evaluate, because it is typically obtained from an end user device without the user being aware that the device was being accessed. While it may be assumed that law enforcement authorities are the only parties with legitimate access to a device, the existence of an exploitable vulnerability means that other third parties may also have gained access, making the attribution of any data and evidence found on the device more difficult. The value of this type of evidence can be improved where corroboration of the evidence is sought, and obtained, through other sources.
9.27 The process of acquiring, copying and handling electronic evidence should be carried out to the highest standards, regardless of whether the source is a hard disk, mobile device or cloud-based resources. Several commonly applied best practices and principles are relevant to this process and the four principles of handling computer-based electronic evidence as set out in the ACPO Guide illustrate the importance of the data collection phase of this process:1
Principle 1: No action taken by law enforcement agencies, persons employed within those agencies or their agents should change data which may subsequently be relied upon in court.
Principle 2: In circumstances where a person finds it necessary to access original data, that person must be competent to do so and be able to give evidence explaining the relevance and the implications of their actions.
Principle 3: An audit trail or other record of all processes applied to digital evidence should be created and preserved. An independent third party should be able to examine those processes and achieve the same result.
Principle 4: The person in charge of the investigation has overall responsibility for ensuring that the law and these principles are adhered to.
1See Casey, Digital Evidence, 471 for a further discussion of documentation and a sample preservation form.
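The audit trail required by Principle 3 can be sketched in code. The following Python fragment is an illustrative assumption, not any standard tool: each log entry is chained to its predecessor by a SHA-256 hash, so an independent third party can recompute the chain and detect any later alteration of the record.

```python
import hashlib
import json
from datetime import datetime, timezone

def append_audit_entry(trail, action, actor, detail):
    """Append an action record whose integrity is tied to the previous entry."""
    prev_hash = trail[-1]["entry_hash"] if trail else "0" * 64
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": actor,
        "action": action,
        "detail": detail,
        "prev_hash": prev_hash,
    }
    # Hash the canonical JSON form of the entry, chaining it to its predecessor.
    payload = json.dumps(entry, sort_keys=True).encode("utf-8")
    entry["entry_hash"] = hashlib.sha256(payload).hexdigest()
    trail.append(entry)
    return entry

def verify_trail(trail):
    """An independent examiner recomputes every hash to detect tampering."""
    prev_hash = "0" * 64
    for entry in trail:
        if entry["prev_hash"] != prev_hash:
            return False
        payload = {k: v for k, v in entry.items() if k != "entry_hash"}
        digest = hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode("utf-8")
        ).hexdigest()
        if digest != entry["entry_hash"]:
            return False
        prev_hash = entry["entry_hash"]
    return True

trail = []
append_audit_entry(trail, "acquire", "examiner-1", "imaged suspect disk")
append_audit_entry(trail, "hash", "examiner-1", "computed SHA-256 of the image")
assert verify_trail(trail)
```

Because each entry's hash covers the previous entry's hash, silently editing or removing any step breaks every subsequent link in the chain, which is precisely the property that allows a third party to ‘examine those processes and achieve the same result’.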
9.28 The problem of what types of electronic evidence and hardware to seize and retain can be compounded where a computer or an entire system of computers is linked to a network, and the sources of electronic evidence exist in a number of separate geographical locations. In such circumstances, and before taking any action, it will be necessary to ascertain whether it is possible or feasible to shut the network down. In most instances, this will not be an option. The investigators will need to be aware of the range of original data that might be required, should they be presented with such a situation. This will include establishing the topology of the network that is to be investigated for the data, especially if a system administrator will not cooperate. For instance, it will probably be necessary to establish the number of computers on a network, and the various types of network connection such as the Internet, cellular data networks and wireless connections that are available on the network. In the case of cloud-based resources where the user is not cooperative or unaware of the investigation, it will be necessary to engage with the cloud service provider’s team to obtain the required information and, with the appropriate authorities, gain access to the data of interest.
9.29 Professor Casey posits two empirical laws of electronic evidence collection that ought to be high on the agenda:
Empirical Law of Digital Evidence Collection and Preservation 1: If you only make one copy of digital evidence, that evidence will be damaged or completely lost.
Empirical Law of Digital Evidence Collection and Preservation 2: A forensic acquisition should contain at least the data that is accessible to a regular user of the computer.1
1Casey, Digital Evidence, 481.
9.30 To ensure a complete copy of a disk is obtained, Professor Casey recommends taking a bitstream copy of the electronic evidence.1 As a result, the copy will include information that will normally enable a digital evidence professional to reconstruct deleted files, depending on the storage technology that was used. In circumstances where the volume of digital data is so large or its storage is so complex that copying it in its entirety is not possible,2 it is generally accepted that copies of selected data may be made, provided the data that is copied can be shown to be an accurate and exact duplicate of the data that is the subject of copying, frequently referred to as ‘first in time evidence’. Many methods exist for achieving this, including the use of proprietary ‘logical evidence file formats’ of common forensic tools.
1Casey, Digital Evidence, 482. A bitstream copy or image is a ‘sector-by-sector’ or ‘bit-by-bit’ copy of a computer’s hard drive. In non-technical terms, a bitstream image is a set of files which also preserves latent data. The image can be used to create an exact copy of a hard drive. A bitstream image is readable by most tools that the digital forensics professional will use.
2This is reflected in the Supplementary Attorney General’s Guidelines On Disclosure: Digitally Stored Material (14 July 2011), para 12.
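By way of illustration, the essence of a bitstream copy, reading every byte and computing an acquisition hash in the same pass, can be sketched as follows. This is a simplified assumption for explanation only: a real acquisition would read the raw device through a hardware write blocker, and the paths and chunk size here are arbitrary.

```python
import hashlib

CHUNK = 1 << 20  # read 1 MiB at a time; every byte is copied, allocated or not

def bitstream_image(source_path, image_path):
    """Copy the source byte for byte and compute an acquisition hash in one pass."""
    sha256 = hashlib.sha256()
    with open(source_path, "rb") as src, open(image_path, "wb") as img:
        while True:
            chunk = src.read(CHUNK)
            if not chunk:
                break
            img.write(chunk)      # exact duplicate, including 'deleted' sectors
            sha256.update(chunk)  # reference value for later verification
    return sha256.hexdigest()
```

Because the hash is taken while the image is written, the same value can later be recomputed over the image file to demonstrate that it remains an exact duplicate of the first in time evidence.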
9.31 In addition to Professor Casey’s empirical laws of electronic evidence collection, there are two fundamental principles to be observed by a digital evidence professional when copying electronic evidence:
(i) The process of making the image should not alter the first in time evidence. This means that appropriate steps should be taken to ensure that the process used to take the image should not write any data to the original medium.
(ii) The process of copying data should produce an exact copy of the first in time evidence. Such a reproduction should allow the specialist to investigate the files in the way that they existed on the original medium.1
1Troy Larson, ‘The other side of civil discovery: disclosure and production of electronic records’, in Eoghan Casey (ed), Handbook of Computer Crime Investigation: Forensic Tools and Technology (Academic 2007), 35.
9.32 To ensure the first in time evidence and the copy are the same, the data should undergo a hashing process, described below. The reason for establishing hash values for data, including the time and date stamps of each file, is that this information will serve as a reference for checking the authenticity or veracity of the files after they have been copied.
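A minimal sketch of such a reference record, assuming Python's standard hashlib and illustrative field names of my own choosing, might look like this:

```python
import hashlib
import os

def file_record(path):
    """Hash a file and note its timestamps and size as an acquisition reference."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    st = os.stat(path)
    return {"sha256": h.hexdigest(), "mtime": st.st_mtime, "size": st.st_size}

def copy_is_authentic(copy_path, reference):
    """Re-hash the copy and compare it against the recorded reference value."""
    return file_record(copy_path)["sha256"] == reference["sha256"]
```

The record is created at the moment of collection; any later comparison that fails immediately alerts the examiner that the copy no longer matches the first in time evidence.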
9.33 The quality of digital files that are copied can be crucial. In the case of The Gates Rubber Company v Bando Chemical Industries Limited,1 Schlatter MJ commented on the evidence of two digital evidence experts. The judge was impressed by the ‘credentials, experience and knowledge’ of Bando’s expert Wedig, and indicated in his decision that he relied on his opinions. As Gates failed to obtain an expert in a timely fashion, much less weight was placed on its expert.2 Gates’ expert Voorhees also failed to undertake appropriate measures to secure the first in time evidence. Schlatter MJ’s judgment is quoted more fully to illustrate this point:
Gates argued that Voorhees did an adequate job of copying the Denver computer. Wedig persuaded me, however, that Voorhees lost, or failed to capture, important information because of an inadequate effort. In using Norton’s Unerase, Voorhees unnecessarily copied this program onto the Denver computer first, and thereby overwrote 7 to 8 percent of the hard drive before commencing his efforts to copy the contents.
Wedig noted that information which is introduced into a computer is distributed, in a random manner, to space which is not being used, or to space which contains a deleted file and is therefore available for use. To use Norton’s Unerase, it was unnecessary for Voorhees to copy it onto the hard drive of the Denver computer. By doing so, however, the program obliterated, at random, 7 to 8 percent of the information which would otherwise have been available. No one can ever know what items were overwritten by the Unerase program.
Additionally, Voorhees did not obtain the creation dates of certain of the files which overwrote deleted files. This information would have assisted in determining the deletion date of some files. If a deleted file has been overwritten by a file which was created prior to the Gates litigation, for example, Bando would be relieved of suspicion as to that file. Thus, failure to obtain the creation dates of files represented a failure to preserve evidence which would have been important to Bando in its efforts to resist Gates’ motions for default judgment.
Wedig pointed out that Voorhees should have done an ‘image backup’ of the hard drive, which would have collected every piece of information on the hard drive, whether the information was allocated as a file or not. Instead, Voorhees did a ‘file by file’ backup, which copies only existing, nondeleted files on the hard drive. The technology for an image backup was available at the time of these events, though rarely used by anyone. Wedig testified that Gates was collecting evidence for judicial purposes; therefore, Gates had a duty to utilize the method which would yield the most complete and accurate results. I agree with Wedig. In these circumstances, Gates failed to preserve evidence in the most appropriate manner. Gates’ failure to obtain an image backup of the computer is a factor which I have weighed against Gates as I considered a number of the claims which Gates has asserted.3
1167 F.R.D. 90 (D.Colo. 1996).
2167 F.R.D. 90 (D.Colo. 1996) at 111(a).
3167 F.R.D. 90 (D.Colo. 1996) at 112(a) and (b).
9.34 Although the tools and techniques used by digital evidence professionals are constantly changing and improving, the comments made by the judge in this case illustrate a very clear point: when electronic evidence is copied, the techniques that are used ought to comply with the highest possible standards for the evidence to have any probative value in legal proceedings. However, it must be emphasized that there will be occasions when the investigator is faced with a unique situation such that she can only apply her knowledge to the best of her ability in seizing data in as forensic a way as possible. One example would be a live banking system. The system might be stored on hundreds of servers in a data centre that is the size of a football field, and the data will be changing every second. No existing guidelines cover such an eventuality, which is why the investigator must make decisions based on principles of good practice.1
1For a sample imaging procedure, see Larson, ‘The other side of civil discovery’, 36–37; Barbara Guttman, James R. Lyle and Richard Ayers, ‘Ten years of computer forensic tool testing’ (2011) 8 Digital Evidence and Electronic Signature Law Review 139.
9.35 An examination of the surrounding area of the scene, including any materials that are likely to be relevant to disclosure or a criminal investigation, is also important. For instance, in the case of R. v Pecciarich1 the police seized a number of documents, catalogues and a scrapbook of newspaper articles concerning trials of sexual assault and proposed legislation dealing with abusive images of children. In this instance, the material constituted real evidence. It was also considered, as Sparrow J determined, to be circumstantial evidence to support the allegations that Pecciarich distributed abusive digital images of children, which were found on his computer and hardware devices. The relevance of materials found at the scene, including fingerprints and DNA samples taken directly from hardware devices, may become more obvious once the digital evidence professional has examined the electronic evidence in detail.
11995 CarswellOnt 504, [1995] OJ No 2238, 22 OR (3d) 748, 26 WCB (2d) 603.
9.36 Preceding the investigation and examination of electronic evidence by digital evidence professionals is a technique known as ‘forensic triage’,1 which has received considerable attention within the forensic practitioner and law enforcement communities. Digital forensic triage is the term used to cover a range of processes, methodologies, software and hardware that can be used to enable people to prioritize their digital forensic investigations more effectively. Forensic triage is not suitable for every case, and must be used by appropriately trained practitioners in conjunction with a suitable risk assessment. Indeed, there are direct comparisons to be drawn in this regard between law enforcement processes and the medical profession. Applying the triage process, a police officer, trained in the use of a breath test meter, can use such a device to make informed decisions about a driver suspected of being intoxicated. The officer does not need to be an expert in the science embodied in the device but, instead, simply needs to be appropriately trained to configure, use and interpret the results it provides, and decide how best to take the investigation forward.
1Marcus K. Rogers, James Goldman, Rick Mislan, Timothy Wedge and Steve Debrota, ‘Computer forensics field triage process model’ (2006) 1(2) Journal of Digital Forensics, Security and Law 19.
9.37 The UK Defence Science and Technology Laboratory has reviewed various digital forensic triage methods: software, hardware and processes. These evaluations, although often focused on establishing whether individual tools meet the claims made by their publishers, also test the effectiveness of the technology in preserving the integrity of the target media, in correctly identifying specific digital artefacts and in producing results that would withstand scrutiny when verified using other forensic techniques. The outcomes of these independent tests are made available to police and other authorities under various classification restrictions, allowing them to form opinions about the suitability of each tool for the given scenarios.
9.38 Digital forensic triage technologies and methods are in their infancy,1 and must take account of the need for appropriate training and accreditation. Similarly, suitable risk assessment is required in order to minimize the omission of relevant data. It could be argued that by not performing a full forensic examination of every piece of digital media found, vital evidence may be lost. An important consideration when employing digital triage techniques is therefore the need to balance the rapid identification of material of interest against the consequence of stopping further analysis, in the knowledge that such a process may fail to identify exculpatory material or material of more significance.2 Through the use of manual and automated techniques, digital forensic triage has the potential for beneficial, non-degrading application in a wide range of digital forensics matters, particularly those that involve classification, from abusive images of children (from which the digital triage process originated)3 through to copyright matters.4
1Dr Faye Mitchell, ‘The use of artificial intelligence in digital forensics: an introduction’ (2010) 7 Digital Evidence and Electronic Signature Law Review 35.
2Vacius Jusas, Darius Birvinskas and Elvar Gahramanov, ‘Methods and tools of digital triage in forensic context: survey and future directions’ (2017) 9(4) Symmetry 1, 49.
3Jusas, Birvinskas and Gahramanov, ‘Methods and tools of digital triage in forensic context: survey and future directions’, 51.
4David McLelland and Fabio Marturana, ‘A digital forensics triage methodology based on feature manipulation techniques’, 1st IEEE International Workshop on Secure Networking and Forensic Computing (SNFC2014), ICC2014, Sydney, Australia.
9.39 In the context of cloud computing, forensic triage techniques may be highly effective and appropriate, and can use the functionality of the cloud environment to standardize and automate a method of deployment. For instance, Amazon Web Services1 suggest a programmatic use of triage as part of an automated incident response methodology. The inclusion of forensic controls such as hashing and comprehensive audit logging are available as standard options when such tasks are performed. Once such techniques and capabilities have been provisioned by practitioners with the appropriate skills and knowledge, any user with appropriate credentials can use them.
1AWS Security Incident Response Guide published June 2020 by Amazon Web Services, https://d1.awsstatic.com/whitepapers/aws_security_incident_response.pdf.
Preserving electronic evidence
9.40 Electronic evidence in particular needs to be validated if it is to have any probative value. A digital evidence professional will typically need to copy the contents of a number of disks or storage devices. To prove that the electronic evidence has not been altered from its source copy, it is necessary to put in place checks and balances to prove that the duplicate evidence is identical to its source. A method used to prove the integrity of source data at the time the evidence was collected is known as electronic fingerprinting or ‘hashing’.1 The electronic fingerprint uses a cryptographic technique that is capable of being associated with a single file, a floppy disk or the entire contents of a hard drive. A digital evidence professional should use software tools that are relevant to the task.2 Such software tools will invariably incorporate a program that causes a checksum operation called a hash function to be applied to the source file or disk that is being copied. When a hash function is applied to digital data, the result is called a hash value, as it is calculated against the content of the data. The hash function is a one-way function, and is the mathematical equivalent of a secret trapdoor. For the purposes of understanding the concept, this algorithm is easy to compute in one direction and difficult to compute in the opposite direction.3 The hash function is used to verify that a source file or the copy of a file has not changed. If either file has been altered in any way, the two hash values will not be the same, and the investigator will be alerted to the discrepancy.
1Ovie Carroll and Mark Krotoski, ‘Using “digital fingerprints” (or hash values) for investigations and cases involving electronic evidence’ (2014) 62 US Attorney’s Bulletin 44.
2This is not what occurred in State of Connecticut v Julie Amero (Docket number CR-04-93292; Superior Court, New London Judicial District at Norwich, GA 21; 3, 4 and 5 January 2007) – for a detailed analysis of this case, see Stephen Mason (gen ed), International Electronic Evidence (British Institute of International and Comparative Law 2008), xxxvi–lxxv; compare with the actions of the digital evidence professional David Hendricks in Krause v State, 243 S.W.3d 95 (Tex.App. 2007), 2007 WL 2004940.
3It has yet to be proven that a truly one-way mathematical function exists: see Fred Piper, Simon Blake-Wilson and John Mitchell, Digital Signatures Security & Controls (Information Systems Audit and Control Foundation 1999), 16.
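The practical effect described above, that any alteration, however small, produces an entirely different hash value, can be demonstrated directly. A short illustrative sketch using SHA-256:

```python
import hashlib

original = b"The quick brown fox jumps over the lazy dog"
altered  = b"The quick brown fox jumps over the lazy cog"  # one letter changed

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(altered).hexdigest()

# A single-character change yields an unrelated digest (the 'avalanche
# effect'), which is what alerts the investigator to a discrepancy.
assert h1 != h2
matching_positions = sum(a == b for a, b in zip(h1, h2))
```

On average only about one in sixteen hexadecimal positions of the two digests will coincide, and then only by chance: nothing of the original digest survives the alteration.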
9.41 There are many possible hashing algorithms that can be used to establish forensic veracity. For many years the MD5 (Message Digest 5) algorithm was used, but research conducted by Xiaoyun Wang and Hongbo Yu showed that it was possible to create two files with different content that produced the same MD5 value.1 The implications of this possibility quickly led to some debate in the forensic community. One common interpretation was that MD5 could no longer be trusted because an analyst might wrongly identify an innocent file as a known file (the identification issue) or deliberately modify a file and change its hash value back to the original (the verification issue). Another hypothesis was that a suspect could make all his bad files have the hash values of known system files, thereby avoiding detection. While theoretically possible, it is practically very difficult to achieve an MD5 hash collision, and doing so requires considerable computational time for files larger than a few hundred bytes. According to Stevens and others:
It is important to note that the hash value shared by the two different files is a result of the collision construction process. We cannot target a given hash value, and produce a (meaningful) input bit string hashing to that given value. In cryptographic terms: our attack is an attack on collision resistance, not on preimage or second preimage resistance. This implies that both colliding files have to be specially prepared by the attacker … Existing files with a known hash that have not been prepared in this way are not vulnerable.2
1Xiaoyun Wang and Hongbo Yu, ‘How to break MD5 and other hash functions’, http://merlot.usc.edu/csac-f06/papers/Wang05a.pdf; Arjen Lenstra, Xiaoyun Wang and Benne de Weger, Colliding X.509 Certificates (version 1.0, 1 March 2005), http://eprint.iacr.org/2005/067.pdf; the earliest research is Hans Dobbertin, ‘The status of MD5 after a recent attack’ (1996) 2(2) RSA Laboratories’ CryptoBytes 1, 3–6.
2Marc Stevens, Arjen K. Lenstra and Benne de Weger, ‘Vulnerability of software integrity and code signing applications to chosen-prefix collisions for MD5’ (30 November 2007), http://www.win.tue.nl/hashclash/SoftIntCodeSign/.
9.42 In mathematical terms, an MD5 hash is 128 bits wide, and therefore the probability of two files having the same MD5 value is 2^-128. Put another way, the chance of finding two files with the same MD5 value is one in just over 3.4 x 10^38, that is, once in 340 billion, billion, billion, billion comparisons. By contrast, an SHA-1 hash is 160 bits wide, and so the probability decreases to 2^-160, about 6.8 x 10^-49, or once in nearly 1.5 x 10^48 comparisons. In other words, in realistic terms it is very hard to produce a ‘doctored copy’ of a larger digital evidence set that has the exact same MD5 or SHA-1 hash value as the ‘original’ while still being ‘believable’. However, it is not impossible, as the recent practical technique for generating an SHA-1 collision for PDF documents has demonstrated. It took the equivalent processing power of 6,500 years of single-CPU computations and 110 years of single-GPU computations, but resulted in a (believable) ‘doctored copy’ with a hash that was equal to a known original.1
1Marc Stevens, Elie Bursztein, Pierre Karpman, Ange Albertini and Yarik Markov, ‘The first collision for full SHA-1’ (27 February 2017), https://shattered.io/static/shattered.pdf; John Leyden, Thomas Claburn and Chris Williams, ‘“First ever” SHA-1 hash collision calculated. All it took were five clever brains ... and 6,610 years of processor time’, The Register, 23 February 2017.
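The figures above can be checked directly. The following Python sketch (illustrative only, and assuming the idealized case of a uniformly distributed hash output) computes the random-match probability for each digest width:

```python
from fractions import Fraction

def random_match_probability(bits: int) -> Fraction:
    """Probability that two unrelated inputs share the same n-bit digest,
    assuming the hash output is uniformly distributed."""
    return Fraction(1, 2 ** bits)

md5_p = random_match_probability(128)    # MD5 digest: 128 bits
sha1_p = random_match_probability(160)   # SHA-1 digest: 160 bits

# 2**128 is just over 3.4 x 10**38 possible MD5 values;
# 2**160 is roughly 1.5 x 10**48 possible SHA-1 values.
print(f"MD5:   one in {float(2 ** 128):.2e}")
print(f"SHA-1: one in {float(2 ** 160):.2e}")
```

Note that these are probabilities for a single random comparison; an attacker deliberately constructing collisions (the ‘birthday’ setting) faces a far smaller search space, which is why the attacks described above are feasible at all.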
9.43 The result of this debate is that, although the chance of an MD5 or SHA-1 collision is remote, best practice suggests creating two hash values for every file or forensic image when used for comparison. If only a single hash algorithm is used, SHA-256 would be better than MD5 or SHA-1. Using both MD5 and SHA-1 instead of a single SHA-256 is mathematically more robust. A further reason for this approach is that, although there are no national or international standards that require SHA-256 in digital forensics, using it instead of MD5/SHA-1 would immediately render unusable all global child sexual exploitation image databases, which use MD5 and SHA-1 values. Furthermore, MD5 and SHA-1 are still used and accepted by every law enforcement authority worldwide to perform the three essential forensic functions: to identify known indecent images, to exclude known files such as those in the National Software Reference Library hash keeper list, and to verify that files have not been changed. In the light of the recent successful collision attack on SHA-1, this practice may need to be reviewed. It is therefore advisable to retain first-in-time copies of any files that are to be identified, in order to be able to recalculate hashes as algorithms become deprecated and new ones are introduced. For the purpose of detecting changes using digital signatures, SHA-1 and MD5 should be considered unreliable and deprecated, as usage of SHA-1 began to be phased out in the technology community in 2017.1 Since digital signatures are usually only valid for a limited time period, this is less of a problem, although even with MD5, issues have been identified since at least 2004, and they still persist.2
1Google Security Blog, ‘Announcing the first SHA1 collision’, 23 February 2017, https://security.googleblog.com/2017/02/announcing-first-sha1-collision.html.
2Xiaoyun Wang, Denggou Feng, Xuejia Lai and Hongbo Yu, ‘Collisions for hash functions MD4, MD5, HAVAL-128 and RIPEMD’, Cryptology ePrint Archive Report 2004/199, 16 August 2004, Institute of Software, Chinese Academy of Sciences, https://eprint.iacr.org/2004/199.pdf; Fahmida Y. Rashid, ‘Oracle to Java devs: stop signing JAR files with MD5’, InfoWorld, 19 January 2017, http://www.infoworld.com/article/3159186/security/oracle-to-java-devs-stop-signing-jar-files-with-md5.html.
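The dual-hash practice described above can be illustrated with Python’s standard hashlib module. This is a sketch of the principle, not a prescribed forensic tool; the digests shown are the well-known published test vectors for the input ‘abc’:

```python
import hashlib

def dual_hash(data: bytes) -> dict:
    """Compute MD5 and SHA-1 digests side by side, as suggested for
    comparison work, plus SHA-256 as a more robust modern alternative."""
    return {
        "md5": hashlib.md5(data).hexdigest(),
        "sha1": hashlib.sha1(data).hexdigest(),
        "sha256": hashlib.sha256(data).hexdigest(),
    }

digests = dual_hash(b"abc")
print(digests["md5"])   # 900150983cd24fb0d6963f7d28e17f72
print(digests["sha1"])  # a9993e364706816aba3e25717850c26c9cd0d89d
```

A forger would have to produce a file matching both values simultaneously, which is considerably harder than defeating either algorithm alone.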
9.44 Hashing technology, although useful in cases where absolute certainty is required, is not the best way to recognize pre-existing material, especially in relation to images, video and sound recordings. Child abuse imagery, for example, is often ‘marketed’ using different logos placed inside the image. While the majority of the content is unaltered, the mere addition of a logo will invalidate any hash value that was previously calculated. For this reason, numerous technologies exist that do not rely on establishing an exact match (by way of a file hash), but rather are capable of comparing content to establish the proximity of a file to the content of a known previous copy of the material sought. These filtering algorithms usually provide an indication, at a preset level of certainty, that material matches a previously observed copy. The result is a percentage of likeness, rather than an exact match (which is what is achieved by file hashes).1 Note that the results from these algorithms, unlike hashes, are therefore less usable as a single source of evidence. However, due to the increased use of online filtering,2 their acceptance is likely to increase.
1Well-known technologies include Microsoft’s PhotoDNA, used for identifying altered image material, and the SIFT algorithm, used, for example, in technology applied by Facebook to filter illegal uploads.
2For example, see articles 15 and 17 of the Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC OJ L 130, 17.5.2019, 92–125 requiring such filters for copyright purposes on larger sharing platforms.
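The principle behind such similarity matching can be shown with a deliberately simplified ‘average hash’: one bit per pixel, set when that pixel is brighter than the image mean. Real systems such as PhotoDNA are far more sophisticated and robust; the pixel values and threshold below are invented purely for illustration:

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set when the pixel is
    brighter than the image mean. Illustrates the principle only."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def likeness(h1, h2):
    """Percentage of matching bits: a degree of similarity, not an
    exact match as produced by a file hash."""
    same = sum(1 for a, b in zip(h1, h2) if a == b)
    return 100.0 * same / len(h1)

original = [[10, 200], [220, 30]]
with_logo = [[10, 200], [220, 160]]   # one 'pixel' altered by an overlay
print(likeness(average_hash(original), average_hash(with_logo)))  # 75.0
```

Whereas changing a single pixel changes an MD5 or SHA-1 value completely, here the altered image still scores 75 per cent likeness, which is exactly the behaviour these filtering systems rely upon.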
9.45 For those professionals experienced in criminal matters, the concept of the continuity of custody (also known as the chain of evidence) is well established. However, the continuity of custody, in both civil and criminal matters, should be considered very carefully with respect to electronic evidence. The reason for taking particular care with electronic evidence is because it is easy to alter. It is necessary to demonstrate the integrity of the evidence and to show that it cannot have been tampered with after being seized or copied. There is another reason for being meticulous about ensuring the continuity of electronic evidence and that its custody is correctly recorded: in a case involving a number of items of hardware and more than one computer, it will be necessary to ensure that there is a clear link between the hardware and the electronic evidence copied from the hardware. In this respect, the record should address such issues as who collected the evidence, how and where it was collected, the name of the person who took possession of the evidence, how and where it was stored, the protection afforded to the evidence while in storage and the names of the people who removed the evidence from storage, including the reasons for removing the evidence from storage.1 Due to the increased use of online storage and services, access and custody of records may also be of relevance in the phases before they become evidence in a legal matter or dispute, especially in cases where applying cryptographic safeguards is less practical.
1Warren G. Kruse II and Jay G. Heiser, Computer Forensics Incident Response Essentials (Addison-Wesley 2002), 6–11.
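The record-keeping requirements set out above can be sketched as a simple data structure. The field names below are illustrative assumptions only, not any national standard for custody records:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CustodyEntry:
    """One event in the continuity-of-custody record for an exhibit."""
    exhibit_id: str   # label attached to the physical item
    action: str       # e.g. 'collected', 'stored', 'removed'
    person: str       # who performed or authorized the action
    location: str     # where the action took place
    reason: str       # why (especially for removal from storage)
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

log = [
    CustodyEntry("EXH-001", "collected", "A. Smith", "subject's office",
                 "seizure under warrant"),
    CustodyEntry("EXH-001", "stored", "B. Jones", "evidence store 2",
                 "secure storage pending imaging"),
]
```

The point of such a structure is that every question listed above (who collected, who stored, who removed and why) can be answered from the record itself, without reliance on recollection.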
Transporting and storing electronic evidence
9.46 Consideration should be given to the methods by which any hardware and digital evidence is transported and stored.1 Computers need to be protected from accidentally booting up, and care should be taken to ensure that hardware is clearly marked to prevent people from using the equipment unwittingly. Loose hard drives, modems, keyboards and other such materials should be placed in anti-static or aerated bags to prevent them from being damaged or their data being corrupted. Storage conditions should be appropriate. Hardware and electronic evidence should be protected from dirt, humidity, fluids, extremes of temperature and strong magnetic fields. It is possible for data to be rendered unreadable if the storage media upon which the electronic evidence is contained are stored in a damp office or overheated vehicle. In many forensic storage facilities, special data safes protect evidence from fire risk. These safes are designed to withstand heat, and keep digital media at an acceptable temperature for longer periods of time during a fire.
1Philip Turner, ‘Unification of digital evidence from disparate sources (Digital Evidence Bag)’ (2005) 2(3) Digital Investigation 223.
9.47 More recently, the availability of sophisticated file storage and server systems has resulted in systems that can store and manage data as ‘objects’, directed and controlled by policies with corresponding automated move, copy, delete and replication functions. In many cases these systems are also distributed geographically, providing better failover (the ability to automatically switch to a reliable backup system) and availability. While these features should not adversely affect the evidential veracity of stored data, it is essential for meeting evidential continuity that comprehensive access controls and audit logs are maintained at all times. Furthermore, the geographically dispersed storage of data may increasingly lead to questions of jurisdiction.
Cloud computing and online services
9.48 In the same vein, evidence is increasingly stored on publicly accessible, network-based services. Both cloud computing (the use of, often shared or virtualized, computing or storage resources available through the Internet) and the online delivery of services (software, infrastructure or platform as a service) are rapidly becoming more popular. Forensic investigation of these sources of evidence is inherently complex,1 and is likely to force forensic standards involving the concept of ‘original evidence’ or ‘first in time evidence’ to become outdated or impracticable. In consequence, cloud forensics is emerging as a new aspect of computer forensics.2
1Eoghan Casey, ‘Cloud computing and digital forensics’ (2012) 9(2) Digital Investigation 69; M. Taylor, J. Haggerty, D. Gresty and R. Hegarty, ‘Digital evidence in cloud computing systems’ (2010) 26(3) Computer Law & Security Review 304.
2Stephen Mason and Esther George, ‘Digital evidence and “cloud” computing’ (2011) 27(5) Computer Law & Security Review 524; Ian Walden, ‘Law enforcement access in a cloud environment’, Legal Studies Research Paper No 74/2011, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1781067; Giuseppe Vaciago, ‘Remote forensics and cloud computing: an Italian and European legal overview’ (2011) 8 Digital Evidence and Electronic Signature Law Review 124.
9.49 At the same time, the use of free or low-cost ‘cloud storage’ adds a challenging complication to the process of preserving digital evidence. At the heart of this technology is the concept that a user can upload and store data and software applications to ‘the cloud’, which can then be accessed from anywhere using any device with an Internet connection. In reality, such ‘cloud storage’ can consist of many thousands (or tens of thousands) of mass storage devices (arrays of high capacity hard disks) located in many different physical locations, all connected to a storage management system software via the Internet. It is often the case that, in order to ensure that users’ data is available at all times and to protect it against loss caused by disk failure or interruptions in network connectivity, many copies of the users’ data are spread across many redundant storage nodes that are physically and geographically separated from one another. Furthermore, many unrelated users share the same cloud storage facilities. In these ‘multi-tenanted’ systems, the management of such data is essentially automatic and controlled by the storage management system software rather than human managers. The implications for forensic preservation of such data may not be readily apparent.1 It follows that because of the geographically distributed nature of such systems, issues of legal jurisdiction may also arise when seeking to preserve or obtain the data with the cooperation of the cloud operators.
1For a general overview of some of the issues, see the entire issue of IAnewsletter, (2011) 14(1), entitled ‘Cyber forensics in the cloud’, https://www.csiac.org/wp-content/uploads/2016/02/Vol14_No1.pdf.
9.50 One method of securing access to such data is to request that the user provides details of her account to enable suitably authorized investigators to log into the relevant account and forensically copy all pertinent data to disk, or, more efficiently, copy from the user’s ‘cloud’ to a storage location on a ‘forensic cloud’. The forensic process should include the creation of hash values (discussed above) for every file or object and the use of automatic or manual logging of each action to create a contemporaneous note for all actions undertaken. Additionally, it may be prudent, with the appropriate legal authorization, to change the access credentials of the original storage in order to prevent any deliberate or inadvertent changes from being made to it.1 In such circumstances, principles 2 and 3 of the ACPO should be considered:
Principle 2: In circumstances where a person finds it necessary to access original data, that person must be competent to do so and be able to give evidence explaining the relevance and the implications of their actions.
Principle 3: An audit trail or other record of all processes applied to digital evidence should be created and preserved. An independent third party should be able to examine those processes and achieve the same result.
1For example, see the powers of preservation prescribed by the Convention on Cybercrime ETS No.185 (Budapest, 23/11/2001), articles 16 (domestic) and 29 (preservation in view of international cooperation).
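The process described in 9.50, hashing every copied object and keeping a contemporaneous log of each action, might be sketched as follows. The function name and log format are illustrative assumptions; in-memory byte strings stand in for the objects retrieved from the user’s cloud storage:

```python
import hashlib
from datetime import datetime, timezone

def copy_with_audit(objects: dict, log: list) -> dict:
    """'Copy' a set of named objects, recording a SHA-256 hash and a
    timestamped log line for every action (cf. ACPO principle 3: an
    audit trail that a third party could examine and reproduce)."""
    preserved = {}
    for name, data in objects.items():
        digest = hashlib.sha256(data).hexdigest()
        preserved[name] = (data, digest)
        log.append(f"{datetime.now(timezone.utc).isoformat()} "
                   f"copied {name} sha256={digest}")
    return preserved

audit_log: list = []
copies = copy_with_audit({"mail/inbox.mbox": b"example data"}, audit_log)

# Re-hashing the preserved copy must reproduce the logged digest.
data, digest = copies["mail/inbox.mbox"]
assert hashlib.sha256(data).hexdigest() == digest
```

Because anyone can re-run the hash over the preserved copy and compare it with the logged value, an independent third party can verify that the data has not changed since acquisition, which is precisely what principle 3 requires.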
9.51 Furthermore, reasonable precautions must be taken to ensure that any changes are kept to a minimum, the changes are noted and recorded, and the person conducting the acquisition process is fully aware of the effect of her actions. For those occasions when permission to obtain access to the data from the suspect is not forthcoming, it is then essential, if possible, to preserve a copy of the volatile memory in the computer (that has access to the data), so that it is possible to search for any remaining data relating to the account.
9.52 Data can be deleted on a remote server or cloud storage before it can be secured.1 In such complex scenarios as described above, the role of forensic triage becomes increasingly important, because it allows the investigators to evaluate the scene contemporaneously, and to identify the data, seek the appropriate authority to search and seize the data (if such an order or warrant has not been obtained, or if the order or warrant under which the search is being conducted does not cover the materials that have been found) and secure the online data before anybody who might be under suspicion (or their accomplices) gets the opportunity to destroy it remotely. It is in such circumstances that conducting a preliminary risk assessment is essential to success.
1For a discussion of the complexities of recovering data from modern operating systems and file systems, see Geoff H. Fellows, ‘The joys of complexity and the deleted file’ (2005) 2(2) Digital Investigation 89.
9.53 Throughout this phase of any investigation, the emphasis will be on the digital evidence professional to make informed decisions as to what data or equipment to seize and retain in any given set of circumstances.1 Depending on the circumstances of the case, consideration has to be given to the possibility that the person at the centre of the investigation might be framed for personal or political reasons.2 It will also be necessary to give reasons for seizing and retaining the property, and it will be essential to ensure that the entire procedure is properly documented. The documentation relating to electronic evidence is important. Standard operating procedures such as those described in the ACPO Guide, as noted above, should be followed. A record should be kept of every item seized, every action performed that may affect electronic data on every item, and exhibit labels should be attached to every physical item retrieved.
1The prosecution failed to analyse the evidence from the family computer effectively in the case of the death of Casey Marie Anthony in 2011, for which see Craig Wilson, ‘Digital evidence discrepancies – Casey Anthony trial, 11 July 2011’, http://www.digital-detective.net/digital-evidence-discrepancies-casey-anthony-trial/; Tony Pipitone, ‘Cops, prosecutors botched Casey Anthony evidence’, Clickorlando.com, 28 November 2012, http://www.clickorlando.com/news/cops-prosecutors-botched-casey-anthony-evidence; Jose Baez and Peter Golenbock, Presumed Guilty: Casey Anthony: The Inside Story (BenBella Books, updated edition, 2013), 46, 180–183, 211, 346–348, 365, 368–371, 400, 426–428; Jeff Ashton and Lisa Pulitzer, Imperfect Justice: Prosecuting Casey Anthony (William Morrow 2011), 105, 239, 277, 291–292, 298, 315.
2John Leyden, ‘Child abuse frame-up backfires on stalker’, The Register, 6 April 2010, in which Ilkka Karttunen broke into the Essex home of a woman he wanted to be with, downloaded abusive images of children on to her computer, then stole the hard drive and sent it into the police with a note identifying the owner; for a similar example, see ‘Handyman jailed for planting porn on boss’s computer’, BBC News London, 23 September 2010.
9.54 There are occasions when the physical hardware cannot be seized, because it is too large, it is not physically located in the jurisdiction or even in a single jurisdiction, or where seizing it would cause an organization to cease functioning. In such circumstances, the electronic evidence will have to be copied. As a result, greater care must be exercised when such electronic evidence is retrieved and copied for the first time. The range of electronic evidence that might need to be copied will include audit trails, data logs (for applications, Internet access1 and firewall traffic, to name a few), biometric data, metadata from applications, file systems,2 intrusion detection reports and contents of databases and files. Given the nature of the evidence to be copied, the integrity of the evidence that is copied and its subsequent history becomes paramount.3 Data pertaining to the integrity of these copies and their creation should be retained wherever possible.
1For an interesting discussion, see Dr Richard Clayton, ‘Online traceability: who did that? Technical expert report on collecting robust evidence of copyright infringement through peer-to-peer filesharing’ (Consumer Focus 2012), http://www.cl.cam.ac.uk/~rnc1/Online-traceability.pdf.
2Florian Buchholz and Eugene Spafford, ‘On the role of file system metadata in digital forensics’ (2004) 1(4) Digital Investigation 298.
3The volume of digital evidence is causing problems in respect of the methodologies around the collection of evidence, as discussed in the US context by Erin E. Kenneally and Christopher L. T. Brown, ‘Risk sensitive digital evidence collection’ (2005) 2(2) Digital Investigation 101; Simon Attfield and Ann Blandford, ‘E-disclosure viewed as “sensemaking” with computers: the challenge of “frames”’ (2008) 5 Digital Evidence and Electronic Signature Law Review 62; Daniel R. Rizzolo, ‘Legal privilege and the high cost of electronic discovery in the United States: should we be thinking like lawyers?’ (2009) 6 Digital Evidence and Electronic Signature Law Review 139.
9.55 Another way of dealing with this challenge is to request the cooperation of the service provider to retrieve evidence from its systems. This, however, often leads to jurisdictional issues. Thus, the need for better guidance on the issues arising out of cloud computing is becoming clearer. The Council of Europe has established a working group to address this issue and explore solutions in relation to access for criminal justice purposes to evidence stored on servers in the cloud and in foreign jurisdictions, including through the process of mutual legal assistance.1 The preparation of a second Additional Protocol to the Budapest Convention on Cybercrime seeks to urgently address the need for solutions ‘for a more efficient criminal justice response to cybercrime and other crime involving electronic evidence in accordance with data protection and other safeguards’.2 The shared nature of many of the services involved also creates significant issues surrounding the privacy aspects of the enhanced jurisdiction proposed. Furthermore, direct access to such data raises questions regarding the safeguards that need to be applied before such access is permitted.3
1http://www.coe.int/en/web/cybercrime/ceg.
2Chair, Cybercrime Convention (T-CY), Preparation of the 2nd Additional Protocol to the Budapest Convention on Cybercrime – State of Play, (23 June 2019), 4; first complete draft text of the 2nd Additional Protocol to the Convention on Cybercrime on enhanced cooperation and disclosure of electronic evidence (Draft Protocol, v2, 12 April 2021), https://rm.coe.int/2nd-additional-protocol-budapest-convention-en/1680a2219c.
3See https://www.eff.org/deeplinks/2019/11/council-europe-shouldnt-throw-out-our-privacy-rights-just-speed-police-access.
Analysis of electronic evidence
9.56 A digital evidence professional is not only required to obtain and copy electronic evidence that has a high probative value, but must also provide an analysis of that evidence. The analysis of the evidence will involve reviewing the content of the data and the attributes of the data. This exercise may also include, but will not be limited to, looking for and recovering deleted files and other data that may be hidden on the disk, checking logs for activity and checking slack space or unallocated space1 for residual data. Failure to assess the electronic evidence can lead to false assumptions, as in the case of Liser v Smith.2 The facts of the case were not in dispute. The victim was shot after leaving work on the night of 5 May 2000. By Monday 8 May, it was known that the victim’s bank card had been used to withdraw US$200 from a Bank of America branch about 20 minutes after the murder, approximately one mile from where the body was found. According to the electronic evidence, the withdrawal occurred at 1.47 am on 6 May. The Bank of America ATM also had a video surveillance tape, which was subsequently retrieved by the police.
1Slack space is a part of a block or cluster of a filesystem that is used for another file, but that is not entirely overwritten by it. The block may then contain remnants of the file that was previously there. Unallocated space consists of blocks or clusters of the filesystem that were once used for a file but, upon deletion of that file, are no longer referenced in the filesystem’s allocation table. They will contain the original content of the file until they are (fully) overwritten.
2254 F.Supp.2d 89 (D.D.C. 2003).
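The definition of slack space given in the footnote above can be demonstrated in a few lines. The 16-byte ‘cluster’ and its contents are an invented simplification of real filesystem block sizes (typically 4,096 bytes or more):

```python
CLUSTER = 16  # real filesystems use e.g. 4096-byte clusters

# A cluster that once held a (now deleted) file...
cluster = bytearray(b"SECRET-DOCUMENT!".ljust(CLUSTER, b"\x00"))

# ...is reallocated to a new, shorter file that only partly overwrites it.
new_file = b"hello"
cluster[:len(new_file)] = new_file

# The tail of the cluster (the 'slack') still holds remnants of the old file.
slack = bytes(cluster[len(new_file):])
print(slack)  # b'T-DOCUMENT!'
```

This is why an examination that reads only the live files, and never the slack and unallocated areas, can miss residual data of considerable evidential value.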
9.57 The bank manager informed the police that there would be a discrepancy of up to 15 minutes between the time indicated on the surveillance tape and the actual time of the withdrawal. When the tape was viewed, there was no ATM activity recorded at 1.47 am. The closest transaction that occurred was at 1.52 am, when a black male wearing a white t-shirt (the accused Jason Liser) was recorded as standing before the machine. While the evidence seemed to lead to the conclusion that Liser as the man recorded at 1.52 am was one of the killers, the evidence contained on the surveillance video did not warrant such an assumption. Other pictures from the videotape showed black males other than Liser using the ATM at 1.56 am and 2.05 am, and a black female using the machine at 2.04 am. Copies of these pictures were provided to the court. All of them were grainy and poorly photocopied. However, of relevance was that both of the men in question appeared, like Liser, to have been wearing white t-shirts and to be relatively young.
9.58 In August 2000, about three months after the murder, the police decided to put out a press release and a copy of the photograph of the man recorded as standing at the ATM at 1.52 am. Liser was subsequently recognized and arrested for the murder. He was held for less than a week, because the police decided, at this late point in time, to carry out an experiment at the aforesaid ATM machine and its video surveillance facilities. The result of the experiment led the police to conclude that the discrepancy was greater than the 15-minute gap the bank had stated. Liser was subsequently released. It is instructive to note the comments made in the Memorandum Opinion by the judge:
While this issue is a close one, the Court is not ready to conclude that it was objectively reasonable under the circumstances of this investigation for the police to rely solely on the bank’s representations about the time discrepancy without attempting to verify that information by empirical (or other) means. The crucial point here is that this was not a fast moving investigation in which the officers were called upon to make snap judgments based on limited information. Far from it. Detective Smith had the surveillance tapes within a week after the murder; at that early date he had been told by the branch manager that the time on the tape could be off by up to fifteen minutes … Plaintiff was not, however, arrested until August, three months later. During this lengthy interval, neither Detective Smith nor anyone on his team made any further attempt to verify the estimation about the length of the gap. They had no further contact with anyone at Bank of America, especially its security personnel, who might have had more accurate information about the camera’s timer … They did not inspect the camera itself. Nor did they attempt [to] use the ATM themselves to compare real time against tape time.
In short, despite the fact that the tape was their central lead as to the identity of the murderer, the investigators did nothing to pin down exactly how far off the video clock was, at least not before plaintiff was arrested. [Footnote 3: The fact that the police finally sought to verify the information – and quickly and readily learned that it was inaccurate – after Liser’s arrest certainly does not help their cause. That such an [sic] simple test was not done in the three months preceding the arrest, and if done would have cast serious doubt on the propriety of that arrest, suggests an investigative sloppiness that at least casts doubt on whether the initial arrest was actually supported by probable cause.] Instead, Detective Smith and his team chose to rely solely on a single, untested statement from the bank manager. Such reliance might well have been unassailable had the investigators been making an on-the-spot determination as to whether probable cause existed to arrest plaintiff in the first frantic days after the murder. But in the circumstances of the deliberate, slowly unfolding investigation that ensued, during which the officers should have had ample time to pursue leads and to check facts, their failure to verify the length of the gap on the video stands in a rather different light. Their conduct appears more sloppy than reasoned, the product of carelessness rather than craft. The Court is thus unable to say with certainty that this crucial mistake was ultimately a permissible one, or that prudent investigators would necessarily have conducted themselves as defendants did here.1
1United States District Court for the District of Columbia No 00-2325 (ESH) 26 March 2003 before Ellen Segal Huvelle DJ, at 11–12.
9.59 Compare this case with the murder of Denise Mansfield, who was found bound and strangled in her home on 29 June 2002. It was thought that she had been dead since 22 June. The police investigation centred on a surveillance camera that recorded images of people using an ATM, owned by the Sun Trust Bank. This ATM was used to withdraw US$200 from the victim’s bank account at 2.30 pm on 22 June, using her debit card. Three women (Virginia Shelton, her daughter Shirley and one of her daughter’s friends, Jennifer Starkey) were subsequently arrested. They were identified as using the machine between 2.28 pm and 2.33 pm the same day. The women did not dispute using this particular ATM. They were subsequently released after three weeks. After they were arrested, it came to light that it was assumed the clocks on the transaction computer and the ATM were synchronized. This was not correct. The women had used the ATM earlier than the time stamp on the video recording. It was reported that police officers had these records in their possession on the day they arrested the women, but it was not clear if they had examined the records before making the arrests. It was not until the father of one of the women obtained a copy of the relevant records that the women were released.1
1Ruben Castaneda, ‘Mistaken arrests leave Pr. George’s murder unsolved’, washingtonpost.com, 22 June 2003, https://www.washingtonpost.com/archive/politics/2003/06/22/mistaken-arrests-leave-pr-georges-murder-unsolved/8e6257de-22c6-4e73-894f-0e71f7ad9b2c/.
9.60 Both Liser v Smith and the Mansfield murder cases are good examples of the failure to fully test the electronic evidence, in particular, the time. No clock is accurate. This can be important in terms of assessing evidence in digital form.1 In the legal context, Lord Hoffmann observed in DPP v McKeown (Sharon), DPP v Jones (Christopher)2 that ‘The clock, although no doubt physically in the same box as the computer, is something which supplies information to the computer rather than being part of the processing mechanism’.3 It might have been correct that the clock was one hour out because of the difference in time zones, but clocks in computers are not always accurate. Clocks on facsimile machines may also be far from accurate, and so the following comments by Burton J (President) in Woodward v Abbey National plc (No 2), J P Garrett Electrical Limited v Cotton,4 which imply that the data recorded by the fax logs at the offices of the Employment Appeal Tribunal are accurate as a matter of ‘common sense’, cannot be correct:
[I]t must make common sense to accept the accuracy, as I believe there to be, of the record of receipt in the fax log of the [Employment Appeals Tribunal (EAT)], and not to accept either uncertain evidence about the accuracy of the sender’s machine or some kind of speculation as to electronic receipt short of the record in the EAT fax log.5
1The first voice in the play Under Milk Wood by Dylan Thomas, referred to ‘slow clocks, quick clocks’ at [60], and the narrator in The Time Regulation Institute by Ahmet Hamdi Tanpınar (Penguin Classics 2014), translated by Maureen Freely and Alexander Dawe, [11], tells the reader that ‘Everyone knows that a watch or clock is either fast or slow. For timepieces, there is no third state.’ Dr John C. Taylor invented, designed and gave the Corpus Chronophage to Corpus Christi College in Cambridge, England. It is a mechanical clock designed to demonstrate the principle of relative time, doing the unexpected, and is only accurate once every five minutes. The Chief Scientist for Time Services at the US Naval Observatory, Dr Demetrios Matsakis, is responsible for precise time determination and the management of time dissemination. To achieve this, there is a USNO Master Clock that is in turn based on a system of a number of independently operating cesium atomic clocks and hydrogen maser clocks, all of which automatically compare with each other, so that the rate does not change by more than about 100 picoseconds (0.000 000 000 1 seconds) per day: https://www.usno.navy.mil/USNO/time/master-clock/precise-time-and-the-usno-master-clock.
2[1997] 1 WLR 295, [1997] 1 All ER 737, [1997] 2 WLUK 386, [1997] 2 Cr App R 155 (HL), (1997) 161 JP 356, [1997] RTR 162, [1997] Crim LR 522, (1997) 161 JPN 482, (1997) 147 NLJ 289, Times, 21 February 1997, Independent, 7 March 1997, [1997] CLY 1093.
3[1997] 1 All ER 737 at 754d.
4[2005] 4 All ER 1346, [2005] 7 WLUK 814, [2005] ICR 1702, [2005] IRLR 782, [2005] CLY 1244.
5[2005] IRLR 782, [14]. See his further comment on both cases in Woodward v Abbey National plc, J P Garrett Electrical Limited v Cotton (26 July 2005, unreported) (UKEATPA/0534/05/SM and UKEATPA/0030/05/DZM), and similar comments on the same point in Clark v Midland Packaging Limited [2005] 2 All ER 266, [2005] 2 WLUK 317, [2014] CLY 1057, also known as Midland Packaging Limited v Clark. In R v Good [2005] DCR 804 the clock in the computer was running 42 minutes and 30 seconds behind the actual time.
9.61 A more realistic comment on the accuracy or otherwise of clocks was made by Smart AJ in the case of R v Ross Magoulias,1 where the identity of the appellant centred on the recordings made by an ATM and a security video:
It is a notorious matter of fact that reliable clocks or timing devices may show slightly different times. A clock may gain or lose ever so slightly, and it may be some days before the difference becomes noticeable. When setting a clock or timing device there might be a very small error. Perhaps the clock from which the timing device is set is slightly astray. It is exceedingly well known that the timing of differing clocks needs to be synchronised if pinpoint accuracy is required. It is beyond argument that both [the victim] and the appellant attended the service station on 7 July 2001. She can be seen on the video tape for about three minutes (18.37.18 to 18.40.25 according to the video tape timing device). That cannot be disputed. Nor can it be disputed that the appellant attended at the ATM and withdrew $50 (18.40.59 according to the ATM timing device). As earlier pointed out there was no direct evidence available to the jury that the timing mechanisms were not synchronised. If there had been the video tape would have recorded a person (the appellant) withdrawing $50 from the appellant’s account at 18.40.59 (bank record time). The video does not show anybody near the ATM at that time. Thus there was no room for any presumption to operate in any useful way.2
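The synchronization point made in Magoulias reduces to simple arithmetic: once the offset of a device clock from a reference clock has been measured, recorded times can be corrected. A minimal sketch, in which the five-minute offset is invented for illustration (no such offset was established in the case itself):

```python
from datetime import datetime, timedelta

def corrected_time(device_time: datetime,
                   measured_offset: timedelta) -> datetime:
    """Convert a device timestamp to reference time, given a measured
    offset (positive when the device clock runs ahead of reference)."""
    return device_time - measured_offset

# E.g. an ATM clock found, by later testing, to run 5 minutes fast:
atm_stamp = datetime(2001, 7, 7, 18, 40, 59)
real_time = corrected_time(atm_stamp, timedelta(minutes=5))
print(real_time)  # 2001-07-07 18:35:59
```

The arithmetic is trivial; the evidential difficulty, as Liser and Mansfield show, lies in actually measuring the offset before conclusions are drawn from the timestamps.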
9.62 A clock can also help reveal the truth when somebody attempts to alter electronic evidence. In the case of Shaun Richards, who was caught speeding on 1 June 2009, Richards attempted to prove his innocence by driving the same route (without speeding) in January 2010 and used his satellite navigation data (whose date he had doctored on his computer to 1 June 2009) as proof of his innocence. However, he had forgotten about the clock change from British Summer Time to Greenwich Mean Time, which meant that there was a one hour difference in the time for the doctored data. After this was discovered, Richards was imprisoned for four months for perverting the course of justice.1
1‘Devon driving instructor jailed for sat-nav speed fraud’ BBC News Devon, 13 January 2011.
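The one-hour discrepancy that betrayed the doctored data is a straightforward consequence of the London time zone: the same wall-clock time corresponds to different UTC instants depending on whether British Summer Time is in force. A minimal Python illustration (the particular dates are chosen purely for the example):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

london = ZoneInfo("Europe/London")

# The same wall-clock time maps to different UTC offsets depending on
# whether British Summer Time (late March to late October) is in force.
summer = datetime(2009, 6, 1, 12, 0, tzinfo=london)   # BST, UTC+1
winter = datetime(2010, 1, 15, 12, 0, tzinfo=london)  # GMT, UTC+0

print(summer.utcoffset())  # 1:00:00
print(winter.utcoffset())  # 0:00:00
```

Data recorded in January but re-dated to June would therefore be internally inconsistent by exactly one hour, which is what exposed the fraud.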
9.63 There may be occasions where, in the absence of proof, an intelligent assumption that comments recorded on a document have a certain meaning might be accepted by an adjudicator, even when it is possible that the comments are capable of other meanings. In particular, the failure to offer an explanation to rebut the assumed meaning of the content of a digital document submitted in evidence may lead to a finding against the party adducing the evidence, as in Hedrich v Standard Bank London Limited.1 The case concerned a wasted costs order, which was based on breach of the duty owed by a solicitor to the court to perform his duty as an officer of the court in promoting the cause of justice. Ward LJ took particular care in assessing the conflicting evidence, because of the complexity of the facts. The bank sought to have its costs paid by the claimants’ solicitors, Messrs Zimmers. The bank was required to establish a strong prima facie case to succeed, and as part of its case, it sought to prove Zimmers were in receipt of an email on a date before Zimmers claimed that they had actual sight of the evidence. The bank relied on the following relevant text of that email:
No virus found in this incoming message.
Checked by AVG Free Edition.
Version: 7.1.362/Virus Database: 267.12.8/162-Release Date: 05/11/2005.2
1[2008] EWCA Civ 905, [2008] 7 WLUK 916, [2009] PNLR 3, [2009] CLY 386.
2[2008] EWCA Civ 905 at [70].
9.64 In the absence of evidence from a digital evidence professional, the inference the bank sought to draw from this information was that Zimmers received notification of this particular email in May 2005, to counter the claim that Zimmers did not see it until the trial was under way in December 2005. This was highly relevant, because the bank was asking the court to order Zimmers to pay costs of £342,917.08. In meeting this argument, the barrister for Zimmers, Graeme McPherson QC, conducted some research on the Internet for an alternative explanation for the printed date of 11 May 2005. Ward LJ accepted the following offered explanation, although there was no evidence of the truth of it:
Mr McPherson’s researches [sic] on the internet gave him an alternative explanation. He told us that the first line showed, as it states, that no virus had been detected. The second line indicates that the means of checking was by the AVG Free Edition, which is a free virus detection software programme marketed as AVG. The third line identifies the version of AVG’s software and the crucial date upon which the Bank relies is simply, as is stated on the e-mail, the date of the release of that particular version of the software. We have no evidence that this is the true explanation: we only have Mr McPherson’s word that his researches [sic] on the internet produced that answer. It may have been a moment of inspiration by counsel but for my part it has a compelling ring of truth and I have no reason to think that it is unreliable. It destroys that part of the Bank’s case.’ 1
1[2008] EWCA Civ 905 at [71].
9.65 It would have been wise of the bank to establish the meaning of this information, given the evidential hurdle it had to overcome to prove its case. It would not have taken a digital evidence professional long to establish whether the information showed the date to be the release date of that particular version of the software. It might have been open to the court to ask the parties to seek an opinion on this issue before reaching a conclusion, but given the nature of the proceedings, in particular the rule that where there is room for doubt the respondent lawyers are entitled to the benefit of it, it is not surprising that the court did not take the matter any further, and accepted the alternative explanation.1
1There was a similar point raised in State of Connecticut v Julie Amero, but the digital evidence professional for the prosecution failed to even consider looking for malicious software: Stephen Mason, International Electronic Evidence, xxxvi–lxxv.
9.66 A further observation of relevance is that, in itself, the electronic evidence may not be conclusive. The case of Mogford v Secretary of State for Education and Skills1 illustrates this point. Mr Mogford appealed against a decision of the Secretary of State for Education and Skills to include his name in the list maintained under the provisions of the Education (Restriction of Employment) Regulations 2000 (SI 2000/2419) that prevented him from being employed as a teacher under the provisions of regulation 5(1)(c). The Secretary of State made this decision because abusive images of children, text files, emails relating to this material and bookmarks with links to websites containing abusive images of children had been found on Mogford’s computer. Mogford denied that he was responsible for this material. The members of the Tribunal were satisfied that the Secretary of State proved on a balance of probabilities that either Mogford was solely responsible for the materials found on the computer, or that he participated with others in obtaining this material, and he knew that it was on his machine. The reasons given included:
(1) Inconsistencies in Mogford’s evidence. He frequently changed his story. He told the interview team that he was visiting his girlfriend on the weekend 25–27 April 1997, then changed his story before the members of the Tribunal, indicating that three people had stayed at his house that weekend. Mogford also said in the interview that one RS had helped set up his Internet link. In evidence to the Tribunal, RS denied this. And Mogford gave evidence to the effect that one P set up the Internet for him.
(2) There was no attempt to find P, or indeed either of the other two friends whom Mogford claimed were with him that weekend. That he failed to take steps to ask his friends to corroborate his story was held by the members of the Tribunal as being consistent with the fact that his version of events was not credible.
1[2002] EWCST 11(PC) (26 June 2002).
9.67 Consideration was also given to the timing of the file system activity, especially the activity that occurred close to midnight on 27 April 1997 and showed access to a series of websites depicting abusive images of children, and the members of the Tribunal carefully examined the evidence presented by the digital evidence professional who sought to link access to such websites to Mogford. The electronic evidence showed that Mogford had created a spreadsheet that contained details of earnings from private lessons, and this spreadsheet was closed down at 00.28 on 27 April 1997. Mogford denied that he had closed down this spreadsheet, claiming that he had opened his spreadsheet at some other time earlier, had failed to close it down, and someone else had shut down his computer, thereby closing the spreadsheet in the process. The members of the Tribunal articulated the importance of this item of evidence and the explanation offered by Mogford as follows:
It is our interpretation of the evidence that Mr M must have been using the computer at this time, either alone or with someone else, surfing the net and finding child pornography sites and text messages, and therefore when closing down the computer his spreadsheet would have been closed. The spreadsheet would have been of no interest to his friends, and he himself said in evidence that it was unlikely that he would have opened the spreadsheet and left it for a couple of days. We can only infer that he was working on the spreadsheet earlier that evening or the previous day.1
1[2002] EWCST 11(PC) (26 June 2002) at [25].
9.68 The observations noted above illustrate the importance of understanding the nature of digital data.1 The aim should be to test the accuracy of the evidence and to ask if the conclusions are correct, rather than making decisions based on an imperfect analysis of the available evidence. It should never be assumed that because evidence is in electronic form, that it must therefore be correct and impervious to being tested to prove whether it is accurate or false. The important point to note is that questions of the accuracy and quality, together with the nature and quantum, of electronic evidence are contextual.
1The British Computer Society Expert Panels: Legal Affairs Expert Panel Submission to the Criminal Courts Review (March 2000), http://www.computerevidence.co.uk/Papers/LJAuld/BCSComputerEvidenceSubmission.htm.
9.69 Ideally, a digital evidence professional will require not only an in-depth knowledge of the operating system she is to investigate, but also a number of proprietary tools for the performance of the investigation and analysis of digital evidence. The types of tool to be used will depend on the operating system being examined and whether the investigation is of networks, hand-held devices, embedded systems or wireless networks.1 Because these tools and techniques are highly technical, the reader is encouraged to become familiar with them by referring to appropriate practitioner texts,2 including those discussing their limitations.3 The tools used can, naturally, be the subject of cross-examination, and the underlying scientific methodology and structure of such tools can also be questioned.4 In this section, the aim is to illustrate why and how tools are used in the context of the Windows operating system, partly because it is so widely used.
1W. Jansen and R. Ayers, ‘An overview and analysis of PDA forensic tools’ (2005) 2(2) Digital Investigation 120.
2Brian Carrier, ‘Defining digital forensic examination and analysis tools using abstraction layers’ and James R. Lyle, ‘NIST CFTT: testing disk imaging tools’ (2003) 1(4) International Journal of Digital Evidence; A. D. Irons, P. Stephens and R. I. Ferguson, ‘Digital investigation as a distinct discipline: a pedagogic perspective’ (2009) 6(1–2) Digital Investigation 82; Bradley Schatz, Digital Evidence: Representation and Assurance, PhD submitted to the Information Security Institute, Faculty of Information Technology, Queensland University of Technology (October 2007), http://eprints.qut.edu.au/16507/1/Bradley_Schatz_Thesis.pdf.
3For instance, see SWGDE (Scientific Working Group on Digital Evidence), Establishing Confidence in Digital Forensic Results by Error Mitigation Analysis (1.5, 5 February 2015).
4Erin Kenneally, ‘Gatekeeping out of the box: open source software as a mechanism to assess reliability for digital evidence’ (2001) 6 (13) Virginia Journal of Law and Technology 1; Eric Van Buskirk and Vincent T. Liu, ‘Digital evidence: challenging the presumption of reliability’ (2006) 1 Journal of Digital Forensic Practice 19; Lei Pan and Lynn M. Batten, ‘Robust performance testing for digital forensic tools’ (2009) 6(1–2) Digital Investigation 71; SWGDE Recommended Guidelines for Validation Testing, Version 1.1 (January 2009); Fred Cohen, Julie Lowrie and Charles Preston, ‘The state of the science of digital evidence examination’, in Gilbert Peterson and Sujeet Shenoi (eds) Advances in Digital Forensics VII, 7th IFIP WG 11.9 International Conference on Digital Forensics, Orlando, FL, USA, 31 January–2 February 2011 (Springer 2011); Computer Forensic Tool Testing Handbook (National Institute of Standards and Technology 2012); Jeremy Leighton John, Digital Forensics and Preservation (Digital Preservation Coalition 2012).
9.70 Automated tools are necessary to perform a forensic examination of a computer economically. However, the digital evidence professional should understand the process used by the tool to perform the relevant tasks. This is because it may be necessary to explain the process to a court, or the specialist may be required to carry out the analysis without the aid of a tool, because the use of a tool in any given situation may not be appropriate. These are issues that lawyers may well need to take cognizance of in the future.1 For instance, it is not clear that practitioners themselves are familiar with some tools, and some may question the worth of early versions.2 This is because it seems that such tools are tested informally, rather than formally proven to be correct. It has therefore been suggested that such tools should be tested formally.3 In an effort to enhance the veracity of evidence adduced from a forensic examination, it is becoming common practice within forensic laboratories to use what is known as ‘dual tool’ verification techniques. Simply put, an analyst will perform an examination using one piece of forensic software and, where data of potential relevance is identified, will use a second tool, produced by a different vendor, to perform the same examination and compare the results. If they match, more weight can be given to the accuracy of the data. However, it must be emphasized that such techniques are not a replacement for critical thinking or experimentation.4
1For an example of where tools were the topic of judicial scrutiny in Australia, see Bevan v The State of Western Australia [2010] WASCA 101, (2010) 202 A Crim R 27 and Bevan v The State of Western Australia [2012] WASCA 153, 2012 WL 3298167. These cases are discussed in more detail in Chapter 5 on the presumption that computers are ‘reliable’.
2Eoghan Casey, ‘Network traffic as a source of evidence: tool strengths, weaknesses, and future needs’ (2004) 1(1) Digital Investigation 28.
3Lyle, ‘NIST CFTT: testing disk imaging tools’; Matthew Gerber and John Leeson, ‘Formalization of computer input and output: the Hadley model’ (2004) 1(2) Digital Investigation 214; Ibrahim M. Baggili and Richard Mislan, ‘Mobile phone forensics tool testing: a database driven approach’ (2007) 6(2) International Journal of Digital Evidence; David Byers and Nahid Shahmehri, ‘A systematic evaluation of disk imaging in EnCase 6.8 and Li En 6.1’ (2009) 6(1–2) Digital Investigation 61; SWGDE Recommended Guidelines for Validation Testing, Version 2.0 (5 September 2014).
4See also Eoghan Casey, ‘The increasing need for automation and validation in digital forensics’ (2011) 7(3–4) Digital Investigation 103; Joshua I. James and Pavel Gladyshev, Challenges with Automation in Digital Forensic Investigations (Digital Forensic Investigation Research Group University College Dublin), http://arxiv.org/pdf/1303.4498.pdf; mistakes were made in the case of Casey Marie Anthony in 2011, and one tool that was used did not give correct results, although once the designer was aware of the error, he informed the police immediately: Craig Wilson, ‘Digital evidence discrepancies – Casey Anthony trial, 11 July 2011’; Pipitone, ‘Cops, prosecutors botched Casey Anthony evidence’; Baez and Golenbock, Presumed Guilty; Ashton and Pulitzer, Imperfect Justice; Ivar Friheim, ‘Practical use of a dual tool verification in computer forensics’, 2016, UCD Dublin (minor thesis).
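The comparison step in dual tool verification can be sketched in a few lines. This is a simplified illustration of the principle only; the function name and report layout are invented for the example, and real laboratories compare far richer artefact records than bare file names:

```python
def dual_tool_compare(results_a, results_b):
    """Compare the artefact lists produced by two independent tools.
    Matching entries gain corroboration; anything reported by only
    one tool calls for further scrutiny, not automatic rejection."""
    a, b = set(results_a), set(results_b)
    return {
        "corroborated": sorted(a & b),
        "only_tool_a": sorted(a - b),
        "only_tool_b": sorted(b - a),
    }

report = dual_tool_compare(["file1.jpg", "file2.doc"],
                           ["file2.doc", "file3.txt"])
print(report["corroborated"])  # ['file2.doc']
```

The discrepancy lists are as important as the matches: as the Sinclair example below shows, what one tool fails to report may be the most significant evidence of all.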
9.71 It should also be noted that the software in forensic tools is far from impartial or infallible.1 Indeed, the users of forensic tools may themselves not be aware that some tools do not carry out as detailed an examination as they think. Jonathan Zdziarski commented on a problem encountered with forensic tools in his 2014 blog post ‘An example of forensic science at its worst: US v. Brig. Gen. Jeffrey Sinclair’:
I worked from a physical image dump created by a commercial forensics tool, and three reports from various tools which, as it would turn out, appeared to be misreporting (or at best ‘under explaining’) at least some of data that the case would later hinge on. What the tools didn’t report turned out to be much more interesting than what they did … As my findings would later reflect, the commercial tools that had been used to initially evaluate the evidence on the device had either misreported key evidence, or failed to acknowledge its existence entirely … All you need to know from a technical perspective is right here: some of the types of information that these commercial tools were (and likely still are) misreporting is significant. Evidence and timestamps of a device erasure event. Evidence of a backup restore event. Application usage dates. Application deletion events and timestamps. File access times. This, and many other types of artifacts are often either completely overlooked by numerous commercially sold, expensive-as-hell tools, or in the case of at least one tool – seemingly made up data. All of these came into play in this case.2
1Stephanie J. Lacambra, Jeanna Matthews and Kit Walsh, ‘Opening the black box: defendant’s right to confront forensic software’ (2018) Champion 28, https://www.eff.org/files/2018/07/30/champion_article_-_lacambra_forensic_software_may_2018_07102018.pdf.
2https://www.zdziarski.com/blog/?p=3717. This remains the case in 2021, and is one reason to undertake verification with more than one tool – but it is also true that most commercial telephone tools either fail to identify data present on the device or under-report it.
9.72 Before obtaining access to a computer, it is essential that the investigator is familiar with the underlying operating systems, file systems and applications. By understanding the file systems, the digital evidence professional will be aware of how information is arranged, which in turn enables her to determine where information can be hidden, and how such information can be recovered and analysed. In order to establish answers to questions such as ‘Who might have had access to a computer or system?’, ‘Which files would they have been able to look at?’ and ‘Was it possible for an unauthorized outsider to obtain access to the computer from the Internet?’, the digital evidence professional should understand the nature of user accounts and profiles, and the control mechanism that determines which files a user is permitted to access upon logging into a system.
9.73 To acquire the data on a hard disk installed in a computer, an investigator will, in most cases, prefer to remove the hard disk from the computer and attach it to a specialist ‘write-protected’ interface that is attached, in turn, to an ‘imaging’ device capable of copying the forensic image stored on the media on to a previously cleaned (and verified as clean) storage device. Such interfaces are commonly referred to as ‘write blockers’, and the imaging capability may be performed by specifically designed imaging hardware or by a standard computer running imaging software. However, in some circumstances removal of the hard disk from a computer may not be possible or advisable, in which case it is common to leave the hard disk installed in the host computer and obtain access to it using the procedures described in the following paragraphs.
9.74 To avoid altering any evidence on a computer, it is necessary to bypass the operating system. When the power supply is switched on, the basic input and output system (BIOS) will carry out a power-on self-test (POST) before looking for the operating system. After the BIOS is activated and before the POST has completed its cycle, it is possible to interrupt the process. Most computers are programmed to expect the operating system to be found on a floppy disk, hard disk, compact disc or a device attached to the Universal Serial Bus (USB). As a result, the system looks at these locations in the order set out in the Complementary Metal Oxide Semiconductor (CMOS) configuration tool. The CMOS chip retains the date, time, hard drive parameters and other details relating to configuration while the main power is switched off. By looking at the CMOS tool between the POST and the computer being fully powered up, the digital evidence professional is able to determine where the computer will look for the operating system: for instance, a floppy disk, a hard disk or a compact disc. With this knowledge, the investigator is able to pre-empt the search for the operating system on the computer and provide an alternative operating system from another disk. It is common for this alternative operating system to be a variant of the Linux operating system that is designed to allow storage devices to be viewed in ‘Read Only’ mode. By interrupting the normal boot-up process in this way, the evidence on the hard drive remains intact and unaltered, thereby permitting the content to be copied in the state it was in when the computer was switched off. Various techniques and tools (such as an evidence acquisition boot disk) can be used to intercept this process, and the precise technique depends on the circumstances of each case.
9.75 Once the computer is booted from a suitable tool, the program can then make a sector-by-sector copy of the electronic evidence. Some tools will acquire the data and undertake an integrity check at regular intervals. There is some technical discussion about whether the tools that undertake these tasks take an exact copy of the disk, even though all of the information is copied from the disk. One reason for this is that the data may be arranged in a different manner in a proprietary file format. Professor Casey suggests this is not as important as ensuring that the integrity of the evidence is maintained, a view that must be correct. He also suggests that at least two copies be made with different tools.1 From a practical point of view, this may not always be possible because of time constraints and the absence of storage media.
1Casey, Digital Evidence, 480.
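The sector-by-sector copying and periodic integrity checking described above can be sketched in outline. This is a simplified illustration of the principle, not any particular tool's implementation; the function name, block size and hashing interval are assumptions chosen for the example:

```python
import hashlib

CHUNK = 4096  # a sector-aligned block size; real tools often use larger buffers

def image_device(src_path, dst_path, interval=1024):
    """Copy src to dst block by block, hashing as we go.

    Returns the overall SHA-256 digest of the source together with a
    list of per-interval digests, mimicking the periodic integrity
    checks some imaging tools embed in their image formats.
    """
    overall = hashlib.sha256()
    interval_hashes = []
    current = hashlib.sha256()
    blocks = 0
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            block = src.read(CHUNK)
            if not block:
                break
            dst.write(block)
            overall.update(block)
            current.update(block)
            blocks += 1
            if blocks % interval == 0:
                interval_hashes.append(current.hexdigest())
                current = hashlib.sha256()
    if blocks % interval:
        interval_hashes.append(current.hexdigest())
    return overall.hexdigest(), interval_hashes
```

Recomputing the overall digest of the copy and comparing it to the recorded value is what allows a party later to demonstrate that the image has not been altered since acquisition.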
9.76 A number of forensic imaging tools, such as EnCase and FTK, have used the expression ‘Logical Evidence Files’ to describe files that, instead of being an image of an entire hard disk, are copies of specific data accessible in the device’s file systems (that is, the contents of a specific directory or directories). This technique has significant advantages where it is impractical to image an entire drive because of the amount of data to be copied or because of time constraints. It should be noted that file hashing and image hashing techniques are still used to ensure the integrity of the data that is collected.1
1Michael Cohen, Simson Garfinkel and Bradley Schatz, ‘Extending the advanced forensic format to accommodate multiple data sources, logical evidence, arbitrary information and forensic workflow’ (2009) 6(1) Digital Investigation S57; Da-Yu Kao, Shiuh-Jeng Wang and Frank Fu-Yuan Huang, ‘SoTE: strategy of Triple-E on solving Trojan defense in cyber-crime cases’ (2010) 26(1) Computer Law & Security Review 52.
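The file-level hashing that accompanies a logical acquisition can be illustrated with a short sketch. The function name and manifest layout are invented for the example, and this is not the container format used by EnCase or FTK; it simply shows how each collected file can later be verified against a recorded digest:

```python
import hashlib
import os

def hash_tree(root):
    """Build a manifest of SHA-256 digests for every file under root,
    as a logical (file-level) acquisition would, so that each
    collected file can later be verified against the manifest."""
    manifest = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                # Hash in blocks so large files do not exhaust memory.
                for block in iter(lambda: f.read(65536), b""):
                    h.update(block)
            manifest[os.path.relpath(path, root)] = h.hexdigest()
    return manifest
```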
9.77 When the electronic evidence has been copied, the data can be viewed in raw format (examining the contents of the file in binary, hexadecimal or another format that displays the literal file contents as expressed in bits) or logically (using a viewer or program suitable for processing the file at hand). It is usually necessary to view the data through a tool; human beings need the binary code, which resides on a disk or in a disk image, to be interpreted before the data can be viewed and interrogated in a sensible manner. In many tools for viewing raw data, the data can be viewed in hexadecimal form on one side of the screen and in plaintext (ASCII or Unicode) on the other side of the screen. Depending on the tool used, the data can be examined and analysed. For instance, a tool can recover slack space and compare files to determine if there are any differences to be observed.1 Viewing data in logical view enables the user to examine it as represented by the file system. This way of looking at the data permits the user to analyse it in a different way, but it does not show the underlying information that is visible when using the physical method. Both forms of viewing data have their limitations, and it is also important to be aware that data can be misinterpreted. There is some debate about the best way of examining digital evidence, but the emphasis should be on verifying the accuracy of the evidence by using different tools.
1Note also that the volume of images that need to be reviewed and searched is increasing, and tools are being developed for this purpose: Paul Sanderson, ‘Mass image classification’ (2006) 3(4) Digital Investigation 190.
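The side-by-side hexadecimal and printable-text view described in 9.77 can be reproduced with a short routine. This is a generic sketch of what such raw-data viewers display, not the output of any specific tool:

```python
def hexdump(data, width=16):
    """Render bytes in the side-by-side hex / printable-ASCII layout
    used by most raw-data viewers; non-printable bytes appear as '.'."""
    lines = []
    for offset in range(0, len(data), width):
        chunk = data[offset:offset + width]
        hex_part = " ".join(f"{b:02x}" for b in chunk)
        text_part = "".join(chr(b) if 32 <= b < 127 else "." for b in chunk)
        lines.append(f"{offset:08x}  {hex_part:<{width * 3}} {text_part}")
    return "\n".join(lines)

print(hexdump(b"A\x00\x01Hello, world!"))
```

The byte 0x41 appears as ‘A’ in the text column, echoing the point made at the start of this chapter that the meaning of a byte depends entirely on the context in which it is interpreted.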
9.78 An increasing number of people delete the content of their computer hard drives in anticipation of legal action or after legal action has begun.1 For instance, in the case of L C Services Limited v Brown,2 Andrew Brown, the sales director of LC Services, was found to be in breach of the fiduciary duty he owed to LC Services. He also breached the terms of his services agreement and misused confidential information belonging to LC Services. It appeared that Mr Brown altered or re-installed the operating system on his computer on 1 October 2003, at the time the claimants were pursuing disclosure of documents from the defendants. A digital evidence professional was subsequently able to retrieve the residue of the text of the relevant database in dispute, and the remains of a number of emails sent by Mr Brown. The content of these emails showed that he was in breach of his fiduciary duties to LC Services.3
1Ewa Huebner, Derek Bren and Cheong Kai Wee, ‘Data hiding in the NTFS file system’ (2006) 3(4) Digital Investigation 211; Dan H. Willoughby Jr, Rose Hunter Jones and Gregory R. Antine, ‘Sanctions for e-discovery violations: by the numbers’ (2010) 60(3) Duke Law Journal 789.
2[2003] EWHC 3024 (QB), [2003] 12 WLUK 391.
3Bruce J. Nikkel, ‘Forensic acquisition and analysis of magnetic tapes’ (2005) 2(1) Digital Investigation 8; Mayank R. Gupta, Michael D. Hoeschele and Marcus K. Rogers, ‘Hidden disk areas: HPA and DCO’ (2006) 5(1) International Journal of Digital Evidence.
9.79 There are several techniques that can be used to recover data that has been deleted. This can be done manually or through the use of tools, depending on the complexity of the problem faced by the specialist. For instance, some tools use a bit-for-bit copy of a disk to reconstruct the file system, including any files marked as deleted in the file allocation table, master file table or their equivalents. However, where files are fragmented and have been partially overwritten, it may be necessary to recover them by hand. A typical technique to recover deleted files (often called ‘carving’) involves searching unallocated space and swap files for such information as headers and footers. Although there are many types of file that can be recovered (carved) in this way with an appropriate tool, such as graphic files, word processing and executable files, recovery is limited to those files whose headers have not been deleted.1
1Paul Alvarez, ‘Using Extended File Information (EXIF) file headers in digital evidence examination’ (2004) 2(3) International Journal of Digital Evidence.
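A minimal version of the header/footer carving described in 9.79 might look like the following sketch, which scans a buffer of unallocated space for JPEG start-of-image and end-of-image markers. Real carvers must cope with fragmentation, embedded thumbnails and false positives, none of which this toy example addresses:

```python
# JPEG files begin with the start-of-image marker ff d8 ff and end
# with the end-of-image marker ff d9.
JPEG_HEADER = b"\xff\xd8\xff"
JPEG_FOOTER = b"\xff\xd9"

def carve_jpegs(unallocated):
    """Naive header/footer carving: find each JPEG header in the
    buffer and cut at the next footer. A file whose header has been
    overwritten cannot be found this way, as the text notes."""
    carved = []
    pos = 0
    while True:
        start = unallocated.find(JPEG_HEADER, pos)
        if start == -1:
            break
        end = unallocated.find(JPEG_FOOTER, start + len(JPEG_HEADER))
        if end == -1:
            break
        carved.append(unallocated[start:end + len(JPEG_FOOTER)])
        pos = end + len(JPEG_FOOTER)
    return carved
```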
9.80 A number of tools are available that are capable of removing, bypassing or recovering passwords. Some tools can guess passwords if the encryption keys are small enough, and where it is not possible to obtain a password, it is sometimes possible to search for unencrypted versions of the data in other areas of the hard disk.1 Passwords can be used simply to provide access control to unencrypted data, can be the ‘key’ that decrypts encrypted data, and can even be the ‘key’ that decrypts the actual key that is used to decrypt encrypted data. The methods used to bypass passwords or ‘crack’ the code needed to decrypt encrypted data are many and varied, but in general stronger encryption algorithms and larger ‘keys’ mean that very long processing times are required to gain access to the data, if indeed they can be accessed at all. Depending on the processing power available, it may be impossible to reveal the passphrase or gain access to encrypted materials in a realistic time frame. The techniques used to attempt to obtain access to encrypted or password-protected data are discussed in Chapter 8 on encrypted data. The increased use of encryption on file systems, particularly those of mobile telephones, poses problems and has led to significant debate and developments in the field of police powers.
1Eoghan Casey, ‘Practical approaches to recovering encrypted digital evidence’ (2002) 1(3) International Journal of Digital Evidence; Christopher Hargreaves and Howard Chivers, ‘Recovery of encryption keys from memory using a linear scan’, Proceedings of the 2008 Third International Conference on Availability, Reliability and Security, 2008, 1369–1376; Eoghan Casey, Geoff Fellows, Matthew Geiger and Gerasimos Stellatos, ‘The growing impact of full disk encryption on digital forensics’ (2011) 8(2) Digital Investigation, 129.
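The simplest of the password-guessing techniques mentioned above, a dictionary attack against a stored hash, can be sketched as follows. The unsalted SHA-256 scheme shown is chosen purely for simplicity of illustration; it demonstrates why short or common passwords fall quickly while a large key or passphrase space makes exhaustive search impractical:

```python
import hashlib

def dictionary_attack(target_hash, wordlist):
    """Hash each candidate password and compare it with the stored
    digest. The cost grows with the size of the candidate space,
    which is why strong passphrases resist this approach."""
    for word in wordlist:
        if hashlib.sha256(word.encode()).hexdigest() == target_hash:
            return word
    return None

stored = hashlib.sha256(b"letmein").hexdigest()
print(dictionary_attack(stored, ["password", "123456", "letmein"]))  # letmein
```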
9.81 One of the most significant difficulties faced by digital evidence professionals with computers and devices that are connected to a network such as the Internet, or a series of computers or devices that are connected in an organization, is the possibility that a hacker or malicious employee might enter the system without authority and undertake a series of actions that causes an innocent person to be accused of doing something he did not do.1 This is where data logs can help. Two types of log, the application log and system event log, contain information about how users have used the computer. Scrutinizing these logs, either manually or with a tool, can help to obtain a clearer picture about the activities that took place on the system, although consideration must be given to the integrity of the logs themselves. Note that logs may also be present at other levels in the network, such as on a fileserver, an Internet proxy or a firewall. The availability of such logs may, however, vary a great deal. A typical problem in this area is the shared use of a single public IP address for Internet traffic by many different local users. These users will typically have their own, locally distributed (private) IP address. A setup like this is known as NAT (Network Address Translation) since it requires translation of the local user’s (private) IP addresses to the public IP address and vice versa. Network-based logs only rarely contain enough data to identify the individual user, however.2
1Srinivas Mukkamala and Andrew H. Sung, ‘Identifying significant features for network forensic analysis using artificial intelligence techniques’ (2003) 1(4) International Journal of Digital Evidence; Bruce J. Nikkel, ‘Domain name forensics: a systematic approach to investigating an internet presence’ (2004) 1(4) Digital Investigation 247; Bruce J. Nikkel, ‘Improving evidence acquisition from live network sources’ (2006) 3(2) Digital Investigation 89; Eoghan Casey and Aaron Stanley, ‘Tool review – remote forensic preservation and examination tools’ (2004) 1(4) Digital Investigation 284; Omer Demir, Ping Ji and Jinwoo Kim, ‘Packet marking and auditing for network forensics’ (2007) 6(1) International Journal of Digital Evidence.
2Hein Dries-Ziekenheiner and Iljitsch van Beijnum, ‘Allocation and use of IP addresses’, Study for the European Commission (December 2010, SMART 2010/14), http://bookshop.europa.eu/en/allocation-and-use-of-ip-addresses-pbKK0113063/.
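Why a shared public IP address alone cannot identify a user can be shown with a toy NAT lookup. The table contents and function name here are invented for the example; real investigations also need the translated source port and an accurate timestamp from the gateway's translation logs, which, as the text notes, are only rarely retained:

```python
# Hypothetical NAT translation records (all values are assumptions):
# public source port -> internal (private) host behind the shared
# public address 203.0.113.7 at the moment of interest.
NAT_TABLE = {
    40001: "10.0.0.5",
    40002: "10.0.0.9",
}

def resolve_user(public_port):
    """A public IP shared via NAT identifies only the gateway; the
    translated source port (with a timestamp, in real logs) is needed
    to reach an individual internal host."""
    return NAT_TABLE.get(public_port)

print(resolve_user(40002))  # 10.0.0.9
```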
9.82 In addition, when a user uses his computer, a digital trace of those actions is left across a range of data logs and files.1 A data log is capable of containing any type of data, depending on what the system is programmed to capture.2 For instance, if a file is downloaded from the Internet, a date and time stamp will be added to the file to demonstrate when the file was downloaded onto the computer. When the file is moved, opened or modified, the time and date stamps will be altered to reflect these changes. In addition, the metadata can also help provide more information about the file, such as the location where it was stored on the disk, the printer on which the file was printed and the time and date the file was created. When a file is printed, the computer tends to store the print job in a temporary file until the printer has the capacity to print it. Once the command to print has been passed to the temporary store, the user can continue to work with the application – for instance, she can continue to type a new document while the previous document is waiting to be printed. The temporary print store retains valuable information, such as the name of the file to be printed, the type of application used, the name of the printer, the purported name of the person whose file is to be printed and the data itself. In addition, a date and time stamp is added to the file to show when the file was printed. It should be noted, however, that the date and time stamp can be altered, which means it is important to ensure that the date and time stamp is corroborated by other methods.3
1In relation to intrusion detection systems, see Peter Sommer, ‘Intrusion detection systems as evidence’ [2002] 3 CTLR 67; Vlasti Broucek and Paul Turner, ‘Intrusion detection: issues and challenges in evidence acquisition’ (2004) 18(2) International Review of Law, Computers & Technology 149; Jean-Marc Dinant, ‘The long way from electronic traces to electronic evidence’ (2004) 18(2) International Review of Law, Computers & Technology 173.
2Erin E. Kenneally, ‘Digital logs – proof matters’ (2004) 1(2) Digital Investigation 94.
3Karen Kent and Murugiah Souppaya, Guide to Computer Security Log Management (2006), Special Publication 800-92 at 2.1.3 fourth bullet point, http://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-92.pdf.
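The timestamp behaviour described above can be illustrated with a short Python sketch (Python is used for illustration; the file examined is a temporary stand-in created for the demonstration). Note that the third timestamp is platform dependent: on Unix-like systems `st_ctime` records the last metadata change, while on Windows it records creation time – one reason why such values must always be interpreted in context and corroborated.

```python
# A minimal sketch, in Python, of reading the file metadata discussed
# above. The file is a temporary stand-in created for the demonstration.
# Caution: st_ctime is the last metadata change on Unix but the creation
# time on Windows, so these values must be interpreted in context.
import os
import tempfile
from datetime import datetime, timezone


def _utc(ts):
    return datetime.fromtimestamp(ts, tz=timezone.utc)


def file_timestamps(path):
    """Return the modified, accessed and changed/created times of a file."""
    st = os.stat(path)
    return {
        "modified": _utc(st.st_mtime),
        "accessed": _utc(st.st_atime),
        "changed_or_created": _utc(st.st_ctime),
    }


# Demonstrate on a throwaway file.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"sample document")
    name = f.name
stamps = file_timestamps(name)
os.unlink(name)
```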
9.83 When a person obtains access to the Internet, a range of data is created and retained on a computer or device, including the websites that have been visited, the contents a user has viewed and the data sources accessed.1 Some systems, both in the network and on customer premises, also include a log of the times and dates of the Internet session and details of the device or connection that was used (such as the modem, network card or physical network port in the access network). With more services available online, it is important to be able to rely on information provided by network operators for investigation purposes. Typical information requests involve IP addresses, subscriber details and possibly payment information. Internet access logs may, furthermore, provide information as to where and how users were connected to a service, and may identify others involved in the same investigation. Finally, it is interesting to observe that CCTV systems are gradually being replaced by systems that use Internet Protocol (IP) technologies and wireless IP, which will in turn cause additional expense and increase the legal complexity (where the camera is capturing images in one country, and these images are being recorded or stored in another country) in obtaining access to such systems for the purposes of litigation or criminal proceedings.2 The types of information available include those noted below.
1Yeong Zee Kin, ‘Computer misuse, forensics and evidence on the Internet’ (2000) 5(5) Communications Law 153; Vivienne Mee, Theodore Tryfonas and Iain Sutherland, ‘The Windows Registry as a forensic artefact: illustrating evidence collection for Internet usage’ (2006) 3(3) Digital Investigation 166.
2Fanny Coudert, ‘Towards a new generation of CCTV networks: erosion of data protection safeguards?’ (2009) 25(2) Computer Law & Security Review 145.
9.84 Browser cache When viewing a page on the Internet, the browser takes and retains copies of all the elements that make up the page, such as graphics and HTML text. This store of copies is called a cache. The computer or device gives the page a date and time stamp at the time the page was downloaded. The reason for this is that when the page is visited again, the cached file is used by the computer or device in place of obtaining access to the same page online, and the date and time stamp is then updated. Another item of information created and logged in some browser history databases is the number of times a web page was visited. It must not be assumed, however, that the user actually viewed a web page merely because the computer or device has recorded it. This is because some websites, in particular those promoting pornography, will redirect a browser to different websites, and may even make unauthorized changes to the computer or device.1 It is possible to recover these cached files, even if they have been deleted. Recovered files can provide such information as when the computer or device was used to obtain access to web-based email, when sites were visited and whether purchases were made or financial transactions undertaken.
1Daniel Bilar, ‘Known knowns, known unknowns and unknown unknowns: anti-virus issues, malicious software and internet attacks for non-technical audiences’ (2009) 6 Digital Evidence and Electronic Signature Law Review 123.
9.85 Cookies Many websites keep track of visits by users to their sites by placing this information in files on the users’ computers or devices called cookies. If cookies have not been disabled, the information in the cookie directory can help with an investigation. As with websites included in the temporary cache file, it does not follow that a user necessarily visited all of the websites recorded in the cookie directory merely because the cookies are present on the computer or device. Some advertisements on a website may place a cookie on the user’s computer or device, even though the user did not click on the advertisement or view the website it promotes. Further, where the user’s browser has been redirected without his permission, cookies can be added to the directory without the knowledge of the user.
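By way of illustration, the record kept for a single cookie can be sketched in Python. The sample below uses the tab-separated ‘Netscape’ cookies.txt format (still written by tools such as curl and wget; modern browsers mostly keep cookies in SQLite databases instead), and the cookie values are invented:

```python
# A minimal sketch of parsing one cookie record in the tab-separated
# 'Netscape' cookies.txt format. The sample line is invented.
from datetime import datetime, timezone


def parse_cookie_line(line):
    """Split a Netscape-format record into its seven named fields."""
    domain, subdomains, path, secure, expiry, name, value = (
        line.rstrip("\n").split("\t")
    )
    return {
        "domain": domain,
        "include_subdomains": subdomains == "TRUE",
        "path": path,
        "secure_only": secure == "TRUE",
        "expires": datetime.fromtimestamp(int(expiry), tz=timezone.utc),
        "name": name,
        "value": value,
    }


sample = ".example.com\tTRUE\t/\tFALSE\t1893456000\tsessionid\tabc123"
cookie = parse_cookie_line(sample)
```

Even so, as the paragraph above notes, the presence of such a record proves only that the cookie was set, not that the user deliberately visited the site that set it.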
9.86 Private browsing, VPN proxies and Tor In order to provide Internet users with more privacy, several browser manufacturers have introduced ‘incognito’ or ‘private browsing’ modes in their browser software. In these modes, no Internet history, cache entries or cookies (or any other artefacts) remain after the Internet session. This means it will be harder (if not impossible) to retrieve a reliable indication of a user’s Internet usage and surfing behaviour from the information present on the local computer system. In practice, other systems, such as access logs kept by service providers or by the websites browsed, may still be able to identify the user by her IP address.1
1For which see United States v Bandy, Slip Copy, 2021 WL 414830.
9.87 In order to further enhance user privacy and anonymity, services such as Tor (The Onion Router) and VPNs (Virtual Private Networks) are available that allow users to hide the origins of their connection to the services they use. In the case of Tor, this is achieved through a network of nodes operated by volunteers, who anonymize connections to the Internet by providing a route across three or more anonymous nodes (including what are called an entry node and an exit node) on behalf of a Tor user. Since no logs are kept at any of the intermediary Tor nodes, this provides a relatively high level of anonymity. Similarly, VPNs and proxies can be used to connect to the Internet via a predetermined ‘hop’ in the network. Provided the VPN origin is not logged, this may effectively make tracing users by their network addresses impossible. The use of other information and identifiers remains possible, however, so various other measures may still reveal the users’ actual names, addresses and Internet activities.
9.88 Email and instant messaging Email has become a dominant method of communication for the vast majority of organizations, although text-based ‘chat’ is used increasingly by individuals, and especially on smartphones. Nevertheless, a great deal of evidence can be discovered from email communications. Some software programs store email in plaintext files, while others use proprietary formats that will require the digital evidence professional to use a number of tools in order to read the messages. Other email systems utilize online storage only and leave very little communications data on the filesystem of computers or devices. It is sometimes possible to recover email messages that have been deleted but have not been removed from the email files.1 Where it is impossible or difficult to restore emails from a single computer or device, it might be possible to track email traffic through the network it has travelled.2 Organizations are beginning to recognize the importance of their email communications, and many larger organizations have archives of email communications that can be investigated in the event of electronic disclosure or electronic discovery requests.
1See the criminal case of R. v Khan (Adeel) [2015] EWCA Crim 1816, [2015] 11 WLUK 550, [2016] 1 Cr App R (S) 47 where the only evidence was of screen shots of email messages, and screen shots of email messages were also adduced in Cole v Carpenter [2020] EWHC 3155 (Ch), [2020] 11 WLUK 318 regarding a dispute over the sale of a work of art by Pablo Picasso, known as ‘Le Sauvetage’ or ‘The Rescue’ (this was an application by the defendants for permission to make a contempt application against the claimant); see also Vorotyntseva v Money-4 Ltd (t/a Nebeus.com) [2018] EWHC 2596 (Ch), [2018] 9 WLUK 501.
2Eoghan Casey, Troy Larson and H. Morrow Long, ‘Network analysis’ in Eoghan Casey (ed) Handbook of Computer Crime Investigation Forensic Tools and Technology, 234–239.
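Tracing an email through the network it has travelled typically begins with the ‘Received’ headers, which each relaying mail server prepends to the message. A minimal Python sketch, using an invented message, is set out below; note that headers added before the message reached a trustworthy relay may have been forged by the sender:

```python
# A minimal sketch of listing the 'Received' headers of an email using
# Python's standard library. The message below is invented. Each relay
# prepends its own Received header, so the list runs from the final
# server back towards the origin; headers added before the message
# reached a trustworthy relay may have been forged by the sender.
from email import message_from_string

raw = """\
Received: from mx.example.org (mx.example.org [203.0.113.7])
    by mail.recipient.example; Mon, 1 Mar 2021 10:00:02 +0000
Received: from sender-pc (unknown [198.51.100.23])
    by mx.example.org; Mon, 1 Mar 2021 10:00:01 +0000
From: alice@example.org
To: bob@recipient.example
Subject: test

Hello
"""

msg = message_from_string(raw)
hops = msg.get_all("Received")  # most recent relay first
```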
9.89 Instant messaging, meanwhile, has become the default method of communication for many people. This presents problems for the investigator. It is not only used on local desktop systems (where the technology is increasingly also used in business environments), but has also seen a major surge in use on mobile devices in recent years. Following the Snowden revelations in 2013 of mass international surveillance by the NSA,1 many instant messaging programs in widespread use have introduced end-to-end encryption, meaning that intermediaries do not have access to plaintext messages, but merely to an encrypted version of those messages. Each connected device has a unique public key and a private key that is unknown to the intermediary. In practice, this means that the only place where such communications can be viewed and decrypted to a readable format is the end user’s device.
1David Cole, ‘After Snowden: Regulating technology-aided surveillance in the digital age’ (2016) 44 Capital University Law Review 677.
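The effect of end-to-end encryption on an intermediary can be shown with a deliberately simplified Python sketch. This is not real cryptography – production systems use public-key protocols such as the Signal protocol – but it demonstrates the essential point: the intermediary relays only ciphertext, and only a party holding the key can recover the plaintext:

```python
# A toy illustration (not real cryptography) of the end-to-end principle:
# the intermediary sees only ciphertext, and only a holder of the key can
# recover the plaintext. A SHA-256-based keystream stands in for the
# public-key protocols actually used by messaging applications.
import hashlib


def keystream_xor(key, data):
    """XOR data against a keystream derived from the key (demo only)."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))


key = b"secret shared by the two endpoints only"
plaintext = b"meet at noon"
ciphertext = keystream_xor(key, plaintext)  # all the intermediary relays
recovered = keystream_xor(key, ciphertext)  # requires the endpoint key
```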
9.90 Mobile applications that are used for instant messaging typically include the ability to send photographs and videos. Social networks and mobile Internet messaging have become the default communication method used by children.1 This creates an increased workload for investigators of child abuse-related cases, especially where they may need to view a home computer, as well as a multitude of other devices, to help determine why a child might have left home, or how he or she got into contact with a certain adult, for instance.2 Another challenge is that these programs increasingly offer features that allow the user to determine a set time and date to destroy any images sent. Therefore, images are no longer stored on the filesystem of the device or telephone by default, but temporary copies only are displayed for a short period of time, after which they are deleted. This leaves fewer artefacts and creates further challenges in criminal investigations involving abusive images and children, particularly in relation to practices such as sexting, the sending of sexual images and messages, and grooming, where adults lure children typically for sexual abuse by acting as persons of the same age.
1Sonia Livingstone, Leslie Haddon, Anke Görzig and Kjartan Ólafsson, ‘Risks and safety on the internet: the perspective of European children: full findings and policy implications from the EU Kids Online survey of 9–16 year olds and their parents in 25 countries’, EU Kids Online 2011 (LSE 2012), http://eprints.lse.ac.uk/33731/1/Risks%20and%20safety%20on%20the%20internet%28lsero%29.pdf.
2Harlan Carvey, ‘Instant messaging investigations on a live Windows XP system’ (2004) 1(4) Digital Investigation 256; Mike Dickson, ‘An examination into MSN Messenger 7.5 contact identification’ (2006) 3(2) Digital Investigation 79; Mike Dickson, ‘An examination into Yahoo Messenger 7.0 contact identification’ (2006) 3(3) Digital Investigation 159; Paul Sanderson, ‘Identifying an existing file via KaZaA artefacts’ (2006) 3(3) Digital Investigation 174; Mike Dickson, ‘An examination into AOL Instant Messenger 5.5 contact identification’ (2006) 3(4) Digital Investigation 227; Jessica Reust, ‘Case study: AOL instant messenger trace evidence’ (2006) 3(4) Digital Investigation 238.
9.91 Voice over Internet Protocol (known as VoIP) is another computer-to-computer technology that has expanded rapidly, and will need to be considered when conducting an investigation.1 In contrast to the old telephony system (often referred to as POTS or Plain Old Telephone Service), Internet-based calls can be made fairly anonymously, and it is easy to deceive a person into thinking that a telephone number is genuine (called ‘spoofing’), especially the number of the party initiating the call.2 This makes telephone numbers increasingly unreliable as identifiers. The risk of wrongfully attributing the source of a telephone call on the basis of its originating telephone number has increased greatly, especially since many VoIP providers allow spoofing of outbound calls as a service feature, and special services have emerged that specialize in spoofing calls for various purposes. In most cases the connection will be encrypted, which means that the data packets flowing between the caller and the recipient of a VoIP call are not in decipherable voice form and, if intercepted midway, cannot be reconstructed into meaningful evidence.
1Xinyuan Wang, Shiping Chen and Sushil Jajodia, ‘Tracking anonymous peer-to-peer VoIP calls on the Internet’, Proceedings of the 12th ACM Conference on Computer and Communications Security (2005), 81–91.
2Richard Clayton, ‘Can CLI be trusted?’ (2007) 12(2) Information Security Technical Report 74, https://www.cl.cam.ac.uk/~rnc1/cli.pdf.
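The unreliability of the calling number can be illustrated with the SIP protocol used by much VoIP equipment: the ‘From’ header of a SIP INVITE is supplied by the caller’s own equipment and is not authenticated by default. The Python sketch below parses an invented INVITE fragment; the display name and number it extracts are simply whatever the caller chose to assert:

```python
# A minimal sketch of reading the caller identity asserted in a SIP
# INVITE. The fragment is invented; the 'From' header is set by the
# caller's equipment and is not authenticated by default, so the values
# extracted here prove nothing about the true origin of the call.
import re

invite = """\
INVITE sip:bob@example.net SIP/2.0
Via: SIP/2.0/UDP 198.51.100.5:5060
From: "Your Bank" <sip:+442071234567@example.net>;tag=a1b2
To: <sip:bob@example.net>
Call-ID: 12345@198.51.100.5
"""


def claimed_caller(sip_message):
    """Extract the (unverified) display name and number from 'From'."""
    m = re.search(r'^From:\s*(?:"([^"]*)")?\s*<sip:([^@>]+)@', sip_message, re.M)
    return (m.group(1), m.group(2)) if m else None


name, number = claimed_caller(invite)
```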
9.92 Digital and online wealth A special category of data is related to financial investigations and digital evidence. Once a small field with limited overlap with digital forensics, the advent of cryptocurrencies such as Bitcoin, electronic money such as PayPal, and many other types of digital assets and wealth stored online has increased the need for specialized investigations into electronic evidence pertaining to wealth that is accessible through computer systems. It should be noted that, in contrast to electronic money, where a database containing a ledger (denominated in fiat currency such as the pound and the euro) is typically stored with a service provider, cryptocurrencies make it possible to store value in local wallets hosted on software present on a computer system or device. A special property of these currencies is that the cryptographic values in the wallet can be copied, making it possible to spend the currency from any one of the copies made. This complicates search and seizure for this type of evidence. Digital evidence professionals will need to have a good knowledge of the way in which such wallets are stored, as well as of the most common online services related to the various types of financial activity that can be conducted online, in order to carry out financial investigations into data proficiently and, indeed, forensically. From a legal perspective, it should be noted that the existing international standards make frequent use of the reversal of the burden of proof in cases of ‘unexplained wealth’. These are cases where a predicate offence can be proven, yet significantly more ‘unexplained’ wealth is found to be present following a financial investigation. In such cases the burden of proof regarding title to such wealth can be shifted to the suspect.1
1This is sometimes called ‘extended confiscation’ (if such wealth is seized) and in the UK is implemented in the regime for UWO (Unexplained Wealth Orders) in s 6 (which only applies to England and Wales) of the Proceeds of Crime Act 2002.
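The point that a wallet can be copied, leaving two equally spendable instances, can be illustrated in Python. The ‘wallet’ below is a dummy file standing in for real key material; comparing cryptographic hashes, as here, is also how a digital evidence professional would verify that a seized copy is faithful to the original:

```python
# A minimal sketch showing that a 'wallet' file can be copied byte for
# byte: the hashes of original and copy match, so either instance could
# be used to spend the funds. The key material here is a dummy; hash
# comparison of this kind is also used to verify that a seized copy is
# faithful to the original.
import hashlib
import os
import shutil
import tempfile


def sha256_of(path):
    with open(path, "rb") as fh:
        return hashlib.sha256(fh.read()).hexdigest()


# Create a stand-in wallet file containing dummy key material.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\x01\x02dummy-private-key-material\x03\x04")
    original = f.name

copy = original + ".copy"
shutil.copyfile(original, copy)
identical = sha256_of(original) == sha256_of(copy)
os.unlink(original)
os.unlink(copy)
```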
9.93 The findings, and any conclusions made by the digital evidence professional, will be set out in a report. Whether prepared for criminal or civil proceedings, the report should include a range of information that is pertinent to the case, including, but not limited to:
(1) Notes prepared during the examination phase of the investigation.
(2) Details about the way in which the investigation was conducted.
(3) Details about the continuity of custody.
(4) The validity of the procedures used.
(5) Details of what was discovered, including, but not limited to:
(a) Any specific files or data that were directly related to the investigation.
(b) Any further files or data that may support the conclusions reached by the specialist. This will include the recovery of any deleted files and the analysis of any graphic files.
(c) The types of search conducted, such as key word searches, and the programs searched.
(d) Any relevant evidence from the Internet, such as emails and the analysis of websites visited and log files.
(e) Indications of names that might demonstrate evidence of ownership of software, such as with whom the software was registered.
(f) Whether there was any attempt to hide data in any way, and if so, what methods were used.
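The keyword searches mentioned in item (5)(c) can be sketched in Python. The file names, contents and keywords below are invented; in practice the text would be extracted from a forensic image, including recovered deleted files, and the hit counts recorded in the report:

```python
# A minimal sketch of a keyword search of the kind recorded in a report.
# File names, contents and keywords are invented; in practice the text
# would be extracted from a forensic image, including recovered files.
import re

files = {
    "letter.txt": "Payment sent to the account as agreed.",
    "notes.txt": "No relevant content here.",
    "mail.txt": "Confirm the payment and the account number.",
}
keywords = ["payment", "account"]


def keyword_hits(files, keywords):
    """Count case-insensitive whole-word hits per keyword per file."""
    report = {}
    for name, text in files.items():
        counts = {
            kw: len(re.findall(r"\b%s\b" % re.escape(kw), text, re.I))
            for kw in keywords
        }
        report[name] = {k: v for k, v in counts.items() if v}
    return report


hits = keyword_hits(files, keywords)
```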
9.94 Professor Casey refers to the following principles to guide the preparation of forensic reports: observation, hypothesis, prediction, experimentation/testing and conclusion.1 Following from these principles, the report needs to reflect how the examination was conducted and what data were recovered. It may be that the digital evidence professional will have to give evidence about the conduct of the examination and the validity of the procedures and tools used. Essential to any report will be the conclusions reached by the professional. Where an opinion is offered, the basis for that opinion should be set out. Consideration should also be given to rates of error, including the origin and timing of events that had been recorded, whether the digital evidence professional took care when reaching conclusions where data were lost, whether the professional was aware that digital evidence can be fabricated, and whether the professional evaluated the evidence based ‘on the reliability of the system and processes that generate the records’.2
1Casey, Digital Evidence, 204.
2Eoghan Casey, ‘Error, uncertainty, and loss in digital evidence’ (2002) 1(2) International Journal of Digital Evidence.
9.95 As pointed out by Professor Sommer, it is important to be aware that digital evidence professionals have to use a variety of techniques to cope with the wide diversity of hardware and software encountered. Reliability is one factor to take into account. Another factor is the degree of reliance on the conclusions reached by a digital evidence professional. The digital evidence must be interpreted, and care should be taken to ensure the underlying rationale is sustainable.1
1Peter Sommer, ‘Digital footprints: assessing computer evidence’ [1998] Crim LR Special Edition 65 and 69.
9.96 Assumptions should not form part of any report (except in Australia1) by a digital evidence professional, as occurred in some cases relating to the investigations by the UK police under the name Operation Ore. In this operation, police forces in the UK investigated and prosecuted over 7,000 people for offences relating to the possession of abusive images of children and secured over 2,000 convictions.2 The operation was instigated after the conviction of Thomas and Janice Reedy (the Landslide trial, named after their company) in the United States for operating a website selling access to abusive images of children.3 After the trial, a copy of the database recording details of the payments received by Landslide was shared with a number of police forces across the world. This information formed the initial evidence for the purposes of the investigations that subsequently took place. There was evidence to suggest that stolen credit card numbers were used fraudulently to ‘buy’ access to the illegal websites hosted by Reedy, who tried without success to prevent this.4 Some of those prosecuted claimed that they did not use their credit cards to obtain access to abusive images of children, as in the case of Dr Paul Grout. No abusive images of children were found on his computers. He produced alibi evidence to demonstrate that at the time of the alleged links to the Landslide website, he was not at a computer terminal. The case was withdrawn from the jury.5 On occasion, it was also assumed that if a credit card number was in the Landslide database, the person whose number it was had therefore paid for abusive images of children. Brian Cooper used his credit card to buy bicycle parts from a US website. His card details were obtained by Akip Anshori, an Indonesian, who successfully subscribed to the Landslide website until Mr Cooper alerted his credit card provider to the unauthorized payments. The police failed to find any abusive images of children on his computers.6
1Nigel Wilson, ‘Expert evidence in the Digital Age in Australia’ (2012) 31(2) Civil Justice Quarterly 216.
2For an outline (notwithstanding that the content may not be entirely accurate), see https://en.wikipedia.org/wiki/Operation_Ore.
3United States of America v Reedy, 304 F.3d 358 (5th Cir. 2002), 2002 WL 1966498.
4Duncan Campbell, ‘Sex, lies and the missing videotape’, PC Pro (June 2007), 18–21; Supplementary memorandum by Mr Jim Gamble dated 1 June 2007 submitted to the Science and Technology Committee – Fifth Report (Session 2006–07, 24 July 2007) (the evidence is published in Vol II (HL Paper 165-II)), where Mr Gamble challenges some of the assertions made by Mr Campbell.
5‘Invisible predator’, BBC, Inside Out – Yorkshire & Lincolnshire, 4 October 2004.
6Campbell, ‘Sex, lies and the missing videotape’, 19.
9.97 A similar case involved Jeremy Clifford, who was charged with making and being in possession of indecent images of children. The images were found in the temporary cache folder with random names such as ‘FX7RA’. Such images generally appear as advertisements, and the user will not necessarily have clicked on them, nor will she be aware that they are on her machine. At his trial, Clifford was acquitted when the prosecution offered no evidence. Although he failed in his first legal action for malicious prosecution and misfeasance in public office,1 his appeal succeeded,2 and the police were subsequently found liable.3 It transpired that the police and the digital evidence professional had made a number of erroneous assumptions about the Landslide databases, the evidence of Internet browsing and site visit history on Clifford’s machine.4
1Clifford v The Chief Constable of the Hertfordshire Constabulary [2008] EWHC 3154 (QB), [2008] 12 WLUK 568.
2Clifford v The Chief Constable of the Hertfordshire Constabulary [2009] EWCA Civ 1259, [2009] 12 WLUK 16.
3Clifford v The Chief Constable of the Hertfordshire Constabulary [2011] EWHC 815 (QB), [2011] 4 WLUK 7.
4[2009] EWCA Civ 1259 at [67]–[76].
9.98 Great care must be given to the nature of the technical evidence, as demonstrated by the case of R. v O’Shea (Anthony David),1 a case that also centred on the Landslide database. The case had been publicized by the media as a public enquiry into the entire operation conducted by the police. It was not. It was an appeal against conviction by one man on the main ground that new evidence from one Bates, described as a computer expert, based on a forensic examination of the Landslide records, suggested that a third party had misappropriated the appellant’s identity. The members of the Court of Appeal held that there was no evidence to support Bates’ suggestion that the Landslide webmaster had access to the appellant’s personal data that were used in the transactions, that there was no evidence to prove that the hypothetical fraudulent webmaster had obtained access to the Freeserve proxy servers to assume the appellant’s identity, and noted the appellant had checked his credit card statements regularly and not challenged these transactions (he had challenged the debiting of his credit card account in relation to other amounts that were similar to those in question in this case).2 Describing this additional evidence as ‘mere assertion, unsupported by any published or other material or any reasoning,’3 the members of the Court of Appeal concluded that the appellant’s conviction was safe and dismissed the appeal.
1[2010] EWCA Crim 2879, [2010] 12 WLUK 150; Stephen Mason, ‘Digital evidence: beware of assuming too much’ (2011) 22(2) Comps & Law 36.
2[2010] EWCA Crim 2879 at [50]–[59].
3[2010] EWCA Crim 2879 at [43].
9.99 A prosecution in Wales in 2015 offers an illustrative case study of what can go wrong when the police do not conduct a careful investigation and the prosecution fails to understand the weakness of the evidence upon which the charges are preferred. A number of nurses working at the Princess of Wales Hospital in Bridgend were indicted on charges relating to the alleged falsification of patient notes regarding blood glucose levels. Professor Thimbleby, an expert witness for the defence, discussed the evidence in detail1 and outlined the correct, systematic procedure to be observed by the nurses.
1Harold Thimbleby, ‘Misunderstanding IT: hospital cybersecurity and IT problems reach the courts’ (2018) 15 Digital Evidence and Electronic Signatures Law Review 11; Professor Angela Hopkins, ‘Review of the blood glucometry investigations in Abertawe Bro Morgannwg University Health Board: establishing lessons learned’ (ABM University Health Board, June – September 2016), http://www.wales.nhs.uk/sitesplus/documents/863/4.5%20Blood%20Glucometry.pdf.
9.100 The central record system had no records of many of the tests and their results that the nurses had written on the paper notes for each patient. Because of this discrepancy, the police concluded that the nurses had written down fictitious readings and had not bothered to do their job. As an aside, the nurses could not always undertake the actions as set out above, because of problems with the software, and also because the software sometimes had difficulty reading the patient’s identity number. It turned out that a practical solution was to type 000 on the glucometer keyboard, to scan the nurse’s own barcode in order for the glucometer to accept the data to be input as relating to a valid patient, or to type in the name of the patient manually – although this last action would not prevent the nurse from misspelling the patient’s name. The glucometer accepted any of these methods of working around the failure of the software and would give a correct blood glucose reading. However, the hospital system rejected this data, with the consequence that manual intervention was required for the data to be added to the central database – which might not happen, or might introduce further errors.
9.101 On analysing the prosecution evidence – which was in the form of a CD of Excel spreadsheets and, at a later date, XML files of data logged on blood glucometers – it was discovered that the relevant data were not present. The prosecution asserted that because the data were not present, it followed that the nurses had fabricated the tests: if they had actually done the tests, the data would have been present in the spreadsheets.
9.102 The prosecution needed to prove that it was the failure of the nurses to input data that caused the data to be missing from the central database – that is, that the absence of data proved fabrication, rather than any other possibility. The police and the prosecution lawyers assumed that the glucometers and hospital IT systems were reliable, even though they knew the systems required human intervention. The police did not question the management of the data, and there was no evidence about the day-to-day management of the data. The prosecution also claimed that the devices were accurate as blood glucose meters. This was not relevant. The relevant issue was whether the glucometers reliably transmitted test data to the hospital’s patient record system. It did not appear that the police or the prosecution bothered to research this topic – if they had, they would have discovered a number of relevant articles referring to the issues noted by Professor Thimbleby regarding the practical problems of the device and of getting the data to the central computer.1 The judge concluded that the prosecution evidence was unreliable and therefore excluded it.2 The prosecution’s response was to offer no evidence.3 In consequence, the nurses were acquitted.
1Ksenia Tonyushkina and James H. Nichols, ‘Glucose meters: a review of technical challenges to obtaining accurate results’ (2009) 3(4) Journal of Diabetes Science and Technology 971; Suzanne Austin Boren and William L. Clarke, ‘Analytical and clinical performance of blood glucose monitors’ (2010) 4(1) Journal of Diabetes Science and Technology 84; James H. Nichols, ‘Blood glucose testing in the hospital: error sources and risk management’ (2011) 5(1) Journal of Diabetes Science and Technology 173; David C. Klonoff, ‘Point-of-care blood glucose meter accuracy in the hospital setting’ (2014) 27(3) Diabetes Spectrum 174.
2Ruling in R v Cahill; R v Pugh 14 October 2014, Crown Court at Cardiff, T20141094 and T20141061 before HHJ Crowther QC (2017) 14 Digital Evidence and Electronic Signature Law Review 67.
3‘Nurses cleared of wilful neglect at Princess of Wales Hospital in Bridgend’ South Wales Evening Post, 14 October 2015, http://www.southwales-eveningpost.co.uk/nurses-cleared-wilful-neglect-princess-wales/story-27983645-detail/story.html; ‘Princess of Wales Hospital nurse neglect trial collapses’ BBC News, 14 October 2015, http://www.bbc.co.uk/news/uk-wales-south-east-wales-34527845.
Anti-forensics and interpretation of evidence
9.103 As with all fields of forensic analysis, computer forensics is part of a continuous game of catch-up between investigators and criminals. Just as criminals quickly started to wear gloves once fingerprint evidence had reached the awareness of the wider public, computer criminals too began to use tools to hide or alter the traces of their activities. Anti-computer forensics has become the term for the possible countermeasures that criminals may take to prevent, delay or invalidate computer forensic efforts, a problem increasingly recognized by the research community.1 Deletion of data, as a classic anti-forensic technique, may serve as an initial example to illustrate some of the issues with which computer crime investigations are increasingly confronted. In the early days of the Internet, software that securely wiped data from all parts of the computer was the preserve of experts, or of governmental organizations with special security needs. Today, tools that irretrievably delete files are easily obtainable for free from various sources, and can be used quickly and reliably even by comparatively computer-illiterate users.2 This example not only illustrates the proliferation of anti-forensic tools, it also highlights some of the complexities involved. Most anti-forensic tools are ‘dual nature’ tools, just as many hacking tools are. They have legitimate uses and are often even officially recommended, if not legally mandated, for instance, to protect the security and privacy of sensitive data. Computer software is typically ‘purpose neutral’. In other words, what works as a protection against criminals trying to obtain access to credit card details also works as a protection from the police trying to obtain access to private emails; what works for system administrators seeking to detect misuse of a computer by an employee also works for criminals obtaining access to commercially sensitive secrets.
This has implications for the legal responses to anti-computer forensics, and also for the probative weight of evidence affected by any counter measures that were used by a suspect, and is further discussed below.
1Chris B. Simmons, Danielle L. Jones and Lakisha L. Simmons, ‘A framework and demo for preventing anti-computer forensics’ (2011) 11(1) Issues in Information Systems 366; R. Harris, ‘Arriving at an anti-forensics consensus: examining how to define and control the anti-forensics problem’ (2006) 3(S) Digital Investigation S44.
2Andy Jones and Christopher Meyler, ‘What evidence is left after disk cleaners?’, (2004) 1(3) Digital Investigation 183; Laurent Simon and Ross Anderson, ‘Security analysis of android factory resets’, http://www.cl.cam.ac.uk/~rja14/Papers/fr_most15.pdf.
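The overwrite-then-delete approach taken by such wiping tools can be sketched in Python. This is illustrative only: on journaling file systems and on SSDs with wear-levelling, an in-place overwrite is not guaranteed to destroy every copy of the data, which is one reason deleted material can sometimes still be recovered:

```python
# A minimal sketch of the overwrite-then-delete technique used by file
# wiping tools. Illustrative only: journaling file systems, backups and
# SSD wear-levelling mean an in-place overwrite does not guarantee that
# every copy of the data is destroyed.
import os
import tempfile


def wipe(path, passes=1):
    """Overwrite a file in place with random bytes, then remove it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())
    os.remove(path)


# Demonstrate on a throwaway file.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"draft to be destroyed")
    target = f.name

wipe(target)
gone = not os.path.exists(target)
```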
9.104 As noted above, the social context is a crucial determinant for the interpretation of electronic evidence. In the early days of the Internet, finding that a suspect had acquired the specialist knowledge necessary to operate (or maybe even write) the software for a cleaning tool could be prima facie evidence that he had tried to hide traces of illegal activity. This inference is no longer sound, because secure cleaning of deleted data has become a standard operating procedure in many organizations to prevent data security breaches, and default settings on popular free tools such as CCleaner allow the effortless routine destruction of deleted files every time a computer is shut down.
9.105 The legal system and police investigators have reacted in several ways to this new reality. One approach is through technology – developing new investigative tools that either look for other types of data not yet protected by countermeasures or are in some other way capable of undoing the damage of anti-forensic tools. However, this need to react rapidly to developments in the anti-computer forensic field can cause problems for the legal system, where rules on the admissibility of scientific evidence often require extensive testing and acceptance in the scientific community, supported by publication in peer-reviewed journals, together with robust methods of calibration, standardized procedures, and accepted minimum criteria for training and proficiency with the new tools.1 What is important to note for criminal prosecutions is that electronic evidence can serve a dual purpose: it can either directly support the prosecution’s case, or it can be indirect evidence that the suspect took actions to hide some form of criminal activity – which in turn may also be direct evidence that he committed one of the various statutory offences that have been created to prevent the destruction or spoliation of data.
1For the US, see Christopher V. Marsico, ‘Computer evidence v. Daubert: the coming conflict’ (2004) CERIAS Tech Report 2005-17; the issue was also discussed in the context of anti-forensics and the use of the ‘Evidence Eliminator’ programme in State of Ohio v Starner, Slip Copy, 2009 WL 3532306 (Ohio App. 3 Dist.); Barbara Guttman, James R. Lyle and Richard Ayers, ‘Ten years of computer forensic tool testing’ (2011) 8 Digital Evidence and Electronic Signature Law Review 139; Computer Forensics Tool Testing (CFTT) Project, https://www.nist.gov/content/computer-forensics-tool-testing-cftt-project; DigitalCorpora.org, http://digitalcorpora.org.
9.106 With all this in mind, the following is an overview of the various approaches to anti-computer forensics, and the effects they have on the availability, reliability and interpretation of electronic evidence. Anti-computer forensics is understood here as any technique, hardware tool or software that prevents or delays the forensic analysis of a data carrier, and negatively affects the existence, amount, authenticity or quality of evidence from a computer or device. There are at least five different subcategories of anti-forensics: data destruction, data tampering, data hiding, trail obfuscation and attacks against the computer forensic tools themselves.
9.107 Data destruction is the most obvious and most widely discussed anti-forensics measure and has created a considerable legal and technological debate.1 Unlike a physical object or piece of paper, which can be destroyed effectively, a document in electronic form is much more difficult to obliterate completely. In general terms, clicking the ‘delete’ icon on a computer merely removes the pointer to the data. The document or data remains, and it is possible to retrieve this data in certain circumstances, even if it is partly overwritten.2 However, disk cleaning utilities that overwrite or ‘shred’ data have become increasingly available and easy to use for even unsophisticated users. These software-based tools write patterns of pseudo-random combinations of 1s and 0s (in other words, meaningless data) on to all of the sectors on a hard drive. Such tools typically also include a setting to wipe free space and unallocated or ‘slack’ space, which is where older ‘deleted’ data often reside. Slack space occurs when data is split between clusters on the hard disk. As files only rarely and by chance fill up every cluster, some space remains. Cleaning software also deletes much of the metadata that accumulates from using the computer – it wipes and cleans old file entries, recently used file lists and many other artefacts, including custom locations.
1For an early article on this topic, see Matthew J. Bester, ‘A wreck on the info-bhan: electronic mail and the destruction of evidence’ (1998) 6 CommLaw Conspectus 75.
2Nucleus Information Systems v Palmer [2003] EWHC 2013 (Ch), [2003] 7 WLUK 636, where employees used software in an attempt to overwrite the data on computers owned by the company before they were returned; R v Smith (Graham Westgarth), R v Jayson (Mike) [2002] EWCA Crim 683, [2002] 3 WLUK 178, [2003] 1 Cr App R 13, [2002] Crim LR 659, Times, 23 April 2002, [2002] CLY 819, in which Jayson deleted a number of abusive images of children that were subsequently recovered; Prest v Marc Rich & Company Investment AG [2006] EWHC 927 (Comm), [2006] 3 WLUK 109, where it was alleged the claimant deliberately deleted documents on his laptop computer; R. v Porter (Ross Warwick) [2006] EWCA Crim 560, [2006] 1 WLR 2633, [2007] 2 All ER 625, [2006] 3 WLUK 471, [2006] 2 Cr App R 25, [2006] Crim LR 748, (2006) 103(4) LSG 28, Times, 21 June 2006, [2006] CLY 858, where it was held that it is a matter for the members of a jury to determine whether files were in the ‘possession’ of the accused, where the accused placed the files in the recycle bin, and the recycle bin was then deleted – the files were incapable of being recovered (and thus viewed) without the use of specialist forensic techniques and equipment provided by the US Federal Government which was not available to the public; R. v Grout (Philip) [2011] EWCA Crim 299, [2011] 3 WLUK 5, [2011] 1 Cr App R 38, (2011) 175 JP 209, [2011] Crim LR 584, [2011] CLY 780, where the day before the appellant’s arrest, he reformatted his computer, so that his computer contained no MSN history of any kind before that date.
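The overwriting technique used by these cleaning utilities can be sketched in a few lines of Python. This is an illustrative sketch only (the function name and default pass count are assumptions), and a deliberately naive one: a simple file-level overwrite does not touch slack space, journal copies or blocks relocated by SSD wear levelling.

```python
import os
import secrets

def secure_overwrite(path: str, passes: int = 3) -> None:
    """Overwrite a file with pseudo-random bytes before unlinking it.

    Illustrative sketch only: real disk-cleaning utilities must also deal
    with slack space, journaling file systems and SSD wear levelling,
    which this naive file-level approach does not reach.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # meaningless pseudo-random data
            f.flush()
            os.fsync(f.fileno())  # push the pattern down to the storage device
    os.remove(path)  # finally remove the directory entry as well
```

Merely calling `os.remove` would only delete the directory entry; the overwrite passes are what make the former content unrecoverable from the file's own sectors.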
9.108 In practice, a person might delete emails and files as a matter of routine, and the organization might fail to realize that it has backup copies of all the relevant data,1 or the organization might have backup data to deal with situations where data is deleted, whether inadvertently or deliberately. For instance, in Noble Resources SA v Gross,2 Mr Gross attempted to delete SMS messages that might have incriminated him. Several thousand of these messages were recovered from various places: from backups of his personal mobile telephone and the BlackBerry of the person to whom the messages were sent. Copies were also found in a backup file on his laptop computer shortly before trial; they were also on the forensic image of his laptop taken by his forensic experts, and on a CD of his personal files that he only disclosed during the course of the trial. Mrs Justice Gloster DBE said: ‘with the assistance of one Jimmy Weston, an IT expert, Mr. Gross had deliberately changed the time settings on the laptop to conceal the fact that he himself had made the deletions; and that the last recorded logon time with his user ID reflected this’.3
1As in Fiona Trust & Holding Corporation v Privalov [2010] EWHC 3199 (Comm), [2010] 12 WLUK 346, (2011) 108(3) LSG 17.
2[2009] EWHC 1435 (Comm), [2009] 6 WLUK 558.
3[2009] EWHC 1435 (Comm) at [54].
9.109 Data destruction adds a great deal of complexity to both civil litigation and the investigation of alleged crimes. On occasion, a party may have a reasonable suspicion that the other party might intend to delete files, or has already deleted files, although the technical issues relating to such allegations can serve to confuse.1 In United States of America v Triumph Capital Group, Inc.,2 McCarthy, the CEO and controlling shareholder of Triumph, Spadoni, Triumph’s Vice President and General Counsel, together with a number of others, were accused of a variety of offences relating to racketeering, including bribery, obstruction of justice and witness tampering. It came to the notice of the US government that Spadoni was alleged to have purchased a software program to purge his computer of incriminating evidence. Triumph was ordered to deliver up the relevant computer for forensic tests. The tests revealed that relevant data had been deleted, and the deleted files were recovered. A search of the recovered Internet cache files revealed evidence of other offences. This caused the investigator to obtain a further warrant to search and seize evidence of the further crimes. In L C Services v Brown,3 the operating system on Brown’s computer had been changed or re-installed at the time the claimants were pursuing disclosure of documents by the defendants, but a digital evidence professional was able to recover the remains of email communications. The recovered evidence was sufficient to incriminate him, and he was held liable for breach of fiduciary duties to the claimants, his former employer.
1The decision by the Supreme Court of Delaware in the case of Genger v TR Investors, LLC, 26 A.3d 180 (2011), 2011 WL 2802832, upholding a finding of spoliation by the trial judge, was examined in detail in Daniel B. Garrie and Bill Spernow, ‘Legally correct but technologically off the mark’ (2010) 9(1) Northwestern Journal of Technology & Intellectual Property 1, in which the authors took the view that the judges failed to understand what had occurred in technical terms.
2211 F.R.D. 31 (D.Conn. 2002).
3[2003] EWHC 3024 (QB) at [53] and [54].
9.110 Where there is a reasonable suspicion that a party might delete files, as in the proceedings leading up to divorce in the case of Ranta v Ranta,1 it may be possible to obtain an order to prevent a party from deleting, removing or uninstalling any programs, files or folders.2 Sanctions may follow for deleting files, depending on the seriousness of the action, where a party deliberately wipes hard drives after a court has ordered their production, as in Electronic Funds Solutions v Murphy.3 Furthermore, it is not inconceivable for a court to order a party to search for relevant documents in backup tapes and archives and to provide information about data that have been deleted.4
12004 WL 504588 (Conn.Super.).
2See Takenaka (UK) Ltd and Corfe v Frankl [2001] EWCA Civ 348, [2001] 3 WLUK 163, [2001] EBLR 40, [2001] CLY 1819, where patterns of online behaviour were analysed to establish whether it was more likely that defamatory emails were sent to the defendant’s wife, and used to show that certain pieces of software were used in close proximity to each other and therefore made it more likely that the suspect had sent the emails; L C Services v Brown [2003] EWHC 3024 (QB) at [60] and [68]; Douglas v Hello! Ltd (No 3) [2003] EWHC 55 (Ch), [2003] 1 All ER 1087 (Note), [2003] 1 WLUK 554, [2003] EMLR 29, (2003) 100(11) LSG 34, (2003) 153 NLJ 175, Times, 30 January 2003, [2003] CLY 390; Crown Dilmun v Sutton [2004] EWHC 52 (Ch), [2004] 1 WLUK 467, [2004] 1 BCLC 468, [2004] WTLR 497, (2004) 101(7) LSG 34, Times, 5 February 2004, [2004] CLY 456; LTE Scientific Ltd v Thomas [2005] EWHC 7 (QB), [2005] 1 WLUK 38; Prest v Marc Rich & Company Investment AG [2006] EWHC 927 (Comm), [2006] 3 WLUK 109; Sectrack NV v Satamatics Ltd [2007] EWHC 3003 (Comm), [2007] 12 WLUK 558; Noble Resources SA v Gross [2009] EWHC 1435 (Comm) at [53] and [57]–[58]; First Conferences Services Ltd v Bracchi [2009] EWHC 2176 (Ch), [2009] 8 WLUK 249; note also Crowson Fabrics Limited v Rider [2007] EWHC 2942 (Ch), [2007] 12 WLUK 602, [2008] IRLR 288, [2008] FSR 17, [2008] CLY 1280; Rybak v Langbar International Ltd [2010] EWHC 2015 (Ch), [2010] 7 WLUK 288. For the USA, see Shira A. Scheindlin and Kanchana Wangkeo, ‘Electronic discovery sanctions in the twenty-first century’ (2004) 11(1) Mich Telecomm Tech L Rev 71; Arista Records, L.L.C. v Tschirhart, 241 F.R.D. 462 (2006), 2006 WL 2728927; Willoughby and others, ‘Sanctions for e-discovery violations’; Charles W. Adams, ‘Spoliation of electronic evidence: sanctions versus advocacy’ (2011) 8(1) Mich Telecomm Tech L Rev 1.
3134 Cal.App.4th 1161 (2005), 36 Cal.Rptr.3d 663 (Cal. Ct. App. 2005).
4Zhou v Pittsburg State University, 2003 WL 1905988 (D.Kan.); in relation to digital audio files (including case law), see Alan F. Blakley, ‘Digital audio files in litigation’ (2007) 2(1) Journal of Legal Technology Risk Management 1.
9.111 As indicated above, the use of these tools has been the result of legal requirements to ensure data security and privacy protection, which means that increasingly they come with official guarantees that promise that the wiped data cannot be reconstructed by criminals1 – and as a side effect, the police cannot reconstruct the data either. For instance, to provide legal entities with the assurance that they comply with the law, such programs typically allow default settings that erase data automatically every time a computer is shut down, or every time someone tries to obtain access to a file without the password. This makes it increasingly problematic to infer criminal intent to hide data when evidence of disk cleaning is found.
1For instance, Richard Kissel, Andrew Regenscheid, Matthew Scholl and Kevin Stine, Guidelines for Media Sanitization (NIST Special Publication 800-88, Revision 1, December 2014), http://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-88r1.pdf.
9.112 The physical destruction of a computer, including the hard drive, will ensure the data is lost (short of investing in costly reconstruction and recovery services), as in Strasser v Yalamanchi.1 In this case, it was claimed that a hard drive containing relevant data had been severely damaged by lightning, and an employee saw fit to dispose of the computer as a result. In response to the extensive pre-trial actions and the failure to provide an adequate reason for the destruction of the computer while litigation was under way, the trial judge subsequently instructed the members of the jury that the negligent destruction of evidence might be inferred from the failure of the appellant to preserve and maintain evidence. The appeal court upheld the decision.
1783 So.2d 1087 (Fla.App. 4 Dist. 2001), 2001 WL 195056.
9.113 Generally speaking, the most secure way to prevent computer forensics is the physical destruction of the hard drive, or short of that, degaussing in a strong magnetic field. Degaussing with an approved degausser is, for some highly sensitive military or national security applications, the required method of data destruction.1 As discussed above, in comparison to the deliberate attempt at destruction to prevent others from obtaining evidence, paper copy files of underlying source documents may be destroyed for perfectly legitimate reasons, and reliance might subsequently be made on the version held in electronic form. This tends to occur when organizations attempt to reduce the cost of storage of paper documents but fail to consider the cost of electronic storage and the need to deal with old data when a system is upgraded. In the case of Heveafil Sdn. Bhd. v United States,2 the US Department of Commerce refused to accept a copy of a database containing a bill of materials stored on a computer diskette as a means of verifying the cost information in an anti-dumping investigation into extruded rubber. Heveafil claimed that the database held on the diskette had been taken from the mainframe, that it used the previous version in the course of normal business and that the database on the diskette contained an exact duplicate of the database developed on the mainframe computer. In an appeal from the US Court of International Trade, the Court of Appeals for the Federal Circuit accepted the argument by the Department of Commerce that it could reject the data on the diskette as not having been properly authenticated, and a finding of adverse inference was admissible in the circumstances. The assertions by Heveafil were not sufficient, because it failed to provide evidence of the veracity of the contents of the diskette, such as explanations of how the copy was made.
The company had merely copied data from the mainframe and then deleted the original data as well as the underlying paper versions. In doing so, it failed to provide a trail of evidence to demonstrate the procedures undertaken to establish the veracity of the diskette copy.
1https://www.nsa.gov/Portals/70/documents/resources/everyone/media-destruction/NSAEPLMagneticDegaussers%20June2019.pdf?ver=2019-07-03-090458-077.
258 Fed.Appx. 843, 2003 WL 1466193 (Fed.Cir.); 25 ITRD 1128.
9.114 As mentioned above, the social context can be crucial in interpreting electronic evidence. While clicking on the delete icon does not actually destroy evidence, it can nevertheless be seen as an intentional attempt to destroy evidence and pervert the course of justice. The opposite question also arises: under what circumstances can the law interpret a user’s failed attempt to destroy a file as a sign that he wanted to rid himself of possession of an illegal item? An innocent user who accidentally downloads an illegal picture, or finds one on a second-hand computer, may think that by deleting the item he has successfully rid himself of it. The law on possession of illegal material may or may not take the same view, if, for an average user, it is very easy to recover the item in question, and it is thus possible to use the ‘recycle bin’ as a convenient hidden storage space.
9.115 All the methods of data deletion described above have been developed for data stored on traditional magnetic media. But increasingly, new storage media look set to challenge anti-forensic measures and also thwart the efforts of investigators. With traditional magnetic storage media, ‘bad sectors’ can create inaccessible parts of the hard drive that are ‘accidentally’ protected from many cleaning utilities. Solid-state drives (SSDs), unlike traditional magnetic discs such as hard disk drives, do not have any moving mechanical components but use integrated circuits to store data persistently. Solid-state drives pose new problems for the recovery of data, because they store data in ways that are much more non-linear and complex than those of traditional hard disk drives.1 However, programs such as Parted Magic claim to provide safe data cleaning for SSDs.
1Bell and Boddington, ‘Solid state drives’.
9.116 Several new file systems increase data permanence either by design – to prevent accidental data loss – or by accident. For instance, journaling file systems record write operations in a number of different locations, which means data ‘leftovers’ may exist in places ‘outside’ the nominal file storage location. RAID and anti-fragmentation techniques may also result in file data being written to multiple locations. In SSDs, writing to the same part of the drive over and over again will ‘wear it out’ prematurely. To counteract this, SSDs incorporate a technology called ‘wear levelling’, which relocates blocks of data between the time when they are originally written and the time when they are overwritten. This has the effect of preventing the ‘true’ erasure of data.
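The evidential consequence of wear levelling can be illustrated with a toy model in which ‘overwriting’ a logical block in fact writes the new data to a fresh physical cell, leaving the old copy in place. The class and data structures below are hypothetical simplifications of real SSD firmware, used purely to show why a user-level overwrite may not erase the underlying data:

```python
# Toy model of SSD wear levelling: a logical-to-physical mapping layer
# redirects every write to a fresh flash cell, so the 'overwritten' data
# survives in the raw flash until garbage collection eventually runs.
class ToySSD:
    def __init__(self, physical_blocks: int = 8):
        self.physical = [None] * physical_blocks  # raw flash cells
        self.mapping = {}                         # logical block -> physical index
        self.next_free = 0

    def write(self, logical: int, data: bytes) -> None:
        self.physical[self.next_free] = data      # always use a fresh cell
        self.mapping[logical] = self.next_free    # repoint the logical block
        self.next_free += 1

    def read(self, logical: int) -> bytes:
        return self.physical[self.mapping[logical]]

ssd = ToySSD()
ssd.write(0, b"original secret")
ssd.write(0, b"XXXXXXXXXXXXXXX")  # the user 'overwrites' logical block 0

# The drive now serves up the new data through the normal interface...
assert ssd.read(0) == b"XXXXXXXXXXXXXXX"
# ...but a forensic read of the raw flash still finds the old copy:
assert b"original secret" in ssd.physical
```

Real controllers are vastly more complex, but the asymmetry shown here is the point: the logical view and the physical medium can diverge, to the benefit of the investigator and the frustration of the data destroyer.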
9.117 From a legal and evidential perspective, it is necessary to have some knowledge of the differences these storage media entail for data deletion and data retrieval to interpret correctly the findings of the digital evidence professional. The easier it is to securely delete data with off-the-shelf, easily customizable tools, the less convincing is the inference of an intentional attempt to hide evidence. Finding evidence for the deletion of data from traditional hard drives is therefore different from evidence of deletion from new and more advanced storage systems, where data erasure requires specialist knowledge and considerable efforts.
9.118 The question that remains is what inferences, if any, can be drawn from the absence of evidence if data have been successfully deleted. A defence lawyer may want to argue that according to the prosecution case, some traces of illegal activity ought to have been found on his client’s computer, using the absence of such evidence as an argument to undermine the prosecution case. How convincing the argument is may well depend on the type of storage medium used and the nature of file systems employed. As noted with wear levelling, there are also increasingly automated ‘housekeeping operations’ being carried out by computers on files. In the past, finding that an illegal file, say of images of child sexual abuse, had been moved and copied to several places on a hard drive would have been evidence that the suspect knew of, and knowingly handled, the file in question. Increasingly, this inference depends on the storage medium, and if a number of copies at different parts of the drive existed, it is possible that these could have been the result of automated actions by the computer. Finally, for several legal purposes, a party may have to prove that it either took all reasonable steps to delete certain files, for instance in an action for damages after a data security breach, or that it made every reasonable effort to produce data, for instance, in response to a court order as part of the disclosure or discovery process. The type of evidence required to document that all reasonable steps were taken to either securely delete the data, or to recover lost data, will depend on the precise nature of the storage medium.
9.119 A separate way of destroying data at the filesystem level is the deletion of filesystem-wide encryption keys. Mobile telephones and several desktop operating systems increasingly feature encrypted filesystems that use a private key for unlocking the data in the filesystem. This key has to be unlocked and made available for encryption and decryption each time the computer or telephone is booted, turned on after a long period of inactivity, or after the key memory-retention period has expired. Data written to the persistent filesystem is encrypted using a unique (system-specific, locally generated) private key, which is in turn secured (and unlocked upon demand) using a PIN, swipe pattern or fingerprint. Upon unlocking the telephone, this key is decrypted to enable full access. Destroying the private key, however, makes it virtually impossible to retrieve the data on the telephone, provided the cryptography and the implementation of this feature are done to exacting security standards. A modern smartphone may then destroy all data if a certain number of attempts are made to unlock the private key with an incorrect fingerprint or access code.1
1A good example is the implementation of this system in iOS for Apple smartphones. It is described extensively in the iOS 9.3 or later security guide (May 2016), https://www.apple.com/business/docs/iOS_Security_Guide.pdf.
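The principle of ‘crypto-erase’ described in the paragraph above can be sketched as follows. The cipher here is a deliberately simplified toy (a SHA-256 keystream combined by XOR), not the vetted cryptography a real device uses, and all names are illustrative; the point is only that once the key is destroyed, the stored ciphertext becomes useless:

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from the key.

    Toy construction (SHA-256 in counter mode) purely for illustration;
    real filesystem encryption uses vetted ciphers such as AES-XTS.
    """
    blocks = []
    counter = 0
    while sum(len(b) for b in blocks) < length:
        blocks.append(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return b"".join(blocks)[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt by XOR with the keystream (the operation is symmetric)."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# The device generates a random filesystem key at first boot...
fs_key = secrets.token_bytes(32)
plaintext = b"contacts, messages, photographs"
ciphertext = xor_cipher(fs_key, plaintext)

# ...and while the key is available, decryption is trivial:
assert xor_cipher(fs_key, ciphertext) == plaintext

# 'Crypto-erase': discarding the key (for example, after too many failed
# unlock attempts) leaves only ciphertext, unreadable without the key.
fs_key = None
```

Destroying a 32-byte key is far faster and more certain than overwriting many gigabytes of flash, which is why modern devices erase themselves this way.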
9.120 Tampering with electronic evidence is not new. An early example of erasing part of a tape recording and re-recording part of a conversation occurred in the UK in 1955.1 In R v Sinha (Arun Kumar),2 medical data recorded on a computer was altered after the death of a patient, giving rise to a charge of perverting the course of justice. In the case of Freemont (Denbigh) Ltd v Knight Frank LLP,3 one witness concocted evidence by creating documents in the form of a series of notes of discussions, which included statements that had not been made during the course of the discussions,4 and to avoid detection, had the hard drives of older computers destroyed when the firm upgraded its computer systems.5 Attempts to adduce fraudulent evidence before a court are rare, but increasing.6 For instance, Bruce Hyman, who had been a prominent British television and radio producer before qualifying as a barrister later in life, created a false judgment for a friend. His deception was uncovered and he was subsequently convicted for perverting the course of justice and sentenced to a term of imprisonment of twelve months and ordered to pay £3,000 to his victim in compensation and Crown expenses of £3,745 – the first barrister to be so convicted, and he was subsequently disbarred by the Bar Standards Board.7 In another case in Japan, a prosecutor altered electronic evidence in a case he was investigating, and was subsequently convicted and imprisoned for 18 months.8
1‘Recording as testimony to truth’ [1955] Crim LR 2, [1954] SJ 98, 794.
2[1994] 7 WLUK 34, [1998] Masons CLR 35, [1995] Crim LR 68 (CA), Times, 13 July 1994, Independent, 1 August 1994, [1994] CLY 1137.
3[2014] EWHC 3347 (Ch), [2014] 10 WLUK 398, [2015] PNLR 4, [2015] CLY 1796.
4Although the judge did not have to determine precisely how the evidence was concocted, and he considered the possibility of amended computer files at [56], he concluded on other evidence that the evidence was concocted, for which see [116], [123] and [140].
5At [56]–[60].
6Premier Homes and Land Corporation v Cheswell, Inc., 240 F.Supp.2d 97 (D.Mass. 2002), 2002 WL 31907329 for fabrication of an email; People v Superior Court of Sacramento County, 2004 WL 1468698 (Cal.App. 3 Dist.) for fabrication of letters on a computer after the event; ISTIL Group Inc v Zahoor [2003] EWHC 165 (Ch), [2003] 2 All ER 252, [2003] 2 WLUK 476, [2003] CP Rep 39, Independent, April 7, 2003, [2003] CLY 451 for a forged document; Fiona Trust & Holding Corporation v Privalov [2010] EWHC 3199 (Comm), [2010] 12 WLUK 346, (2011) 108(3) LSG 17 for a forged and backdated agreement and employment contract; for forged emails, Apex Global Management Ltd v FI Call Ltd [2015] EWHC 3269 (Ch), [2015] 11 WLUK 248; in a criminal context, see R v Brooker [2014] EWCA Crim 1998, also cited as AG’s Ref: 071 of 2014, R v B (R C A) (2014) (available in the LexisNexis electronic database), where Brooker sent text messages from a second mobile telephone in her possession, claiming that her boyfriend sent them.
7Angella Johnson, ‘How my barrister forged evidence against my husband – and now faces jail’ The Mail, 8 September 2007; Steven Morris, ‘Barrister becomes first to be jailed for perverting justice’ The Guardian, 20 September 2007; Simon de Bruxelles, ‘Barrister jailed for trying to frame man with fake e-mail’ Timesonline, 20 September 2007.
8Hironao Kaneko, case translation and commentary in ‘Heisei 22 Nen (Wa) 5356 Gou’ (2012) 9 Digital Evidence and Electronic Signature Law Review 109.
9.121 However, it is conceivable, given the ease with which electronic data is manipulated and altered, that attempts will be made in the future to falsify and alter documents even before a trial ever takes place, or to create vast swathes of ‘evidence’ of a complete set of legal proceedings. This happened in Islamic Investment Company of the Gulf (Bahamas) Ltd v Symphony Gems NV.1 As explained in the judgment of Hamblen J:2
From the end of October 2010 until December 2013 [the lawyer] conducted fictitious litigation for RM. That litigation involved fictitious hearings before the Commercial Court and the Court of Appeal; purported judgments of those courts; purported sealed court orders; a purported hearing transcript; purported skeleton arguments; purported correspondence with court officials and the Claimant’s solicitors, Norton Rose; the fictitious instruction and engagement of various counsel, and telephone conferences involving the impersonation of his senior partner and of leading counsel. None of this reflected reality. Throughout that period there was in fact no contact with Norton Rose or the court.
9.122 Even such mundane matters as proof of parking violations have been subject to the alteration of electronic evidence. Kevin Maguire had parked his car in Market Place in Bury town centre, Greater Manchester at 7.15 am on 31 August 2003. He returned at 5 pm to find he had been given a parking ticket at 9.15 am. Normally there were no restrictions on a Sunday, and when he parked his car, there were no signs to indicate there were any temporary restrictions in place. There were no signs because the NCP staff did not put them up on the previous night as there was a high likelihood that the signs could be pulled down or damaged by revellers overnight. In fact, the signs were put up after Mr Maguire had parked his car. When Mr Maguire complained to the NCP, it was asserted that he had parked illegally and he was sent a photograph of his parked car, which was dated 30 August 2003. Mr Maguire appealed against the parking fine. It transpired that one Gavin Moses, a member of the NCP staff, had altered the date on the digital photograph from 31 August to 30 August, so that it appeared that Mr Maguire had parked illegally. Mr Maguire was cleared of illegal parking and was awarded costs. Gavin Moses subsequently entered a plea of guilty when he was prosecuted for perverting the course of justice, and was sentenced to 150 hours of community service.1
1BBC News online news item, ‘“Fit up” parking warden sentenced’, 28 January 2005, http://news.bbc.co.uk/1/hi/england/manchester/4216539.stm. A further article was published by a Manchester website dated 27 January 2005, but the web page is no longer active.
9.123 In Singapore, a solicitor, Rudy Lim, altered the monthly salary on his payslip from DLA Piper Singapore Pte Ltd to read $65,000 rather than $25,000, for use in salary negotiations with his prospective employer. The description of his method is set out in the judgment:1
‘The Accused testified that he first created Exhibit P2 in his laptop computer some time between 12 and 14 November 2006. He was travelling in Jakarta at the time, and carried a soft copy of the DLA Piper logo in his laptop for preparing marketing materials. He created a document in the word-processing programme, Word, by typing out the text and numbers of the false payslip. He cut and pasted the DLA Piper logo onto the Word document. He then copied the image of the company stamp (with the office manager’s signature) from his original payslip … using software from Adobe, and electronically affixed the image onto the Word document. During this time, the Word document existed only in soft copy. When he returned to Singapore, he printed out the Word document on 14 November 2006, then scanned it into the Xerox machine so that a “pdf” version of the false payslip would be created. He wanted to convert it from Word document format into “pdf” because the former was “editable”, while the latter was a “fixed format”. He then emailed the resulting document … to [his prospective employer].’
1PP v Rudy Lim [2010] SGDC 174 at [17].
9.124 Considerably more attention will have to be paid to demonstrate the integrity of electronic data in the future, which in turn will help substantiate the claim for authenticity to reflect the reliability of the data.1 In all of these cases, the changes to the data were carried out manually. Anti-computer forensics increasingly provides tools to alter data automatically, and in particular the crucial metadata, thus diminishing the evidential value of the data that can be recovered. ‘Backtrack’ or ‘Transmogrify’, for instance, can change the extension of files by turning .exe (application) files into .docx (Word document) files, thereby hiding their malicious character. ‘Timestomp’ can change the timestamps of files, the metadata that records the creation and alteration of a file.2 Randomizers can automatically generate random file names, and criminals can use tools that replace Roman letters with identical-looking Cyrillic ones. Both approaches defeat data-mining techniques that look for ‘known bad files’ or signatures of known illegal images. Many of these tools were developed by software developers who wanted to test the reliability of common forensic tools such as EnCase. Vincent Liu, one of the most prolific developers of tools with anti-forensic implications, concludes that the ‘unfortunate truth’ is that the presumption of reliability is ‘unjustified’ and the justice system is ‘not sufficiently sceptical of that which is offered up as proof.’3
1According to the conclusions on page 56 of Report of Digital Forensic Analysis (26 March 2012) by Stroz Friedberg and submitted as evidence in the case of Paul D. Ceglia v Mark Zuckerberg, Individually, and Facebook, Inc., 600 Fed.Appx. 34 (2015), Stroz Friedberg determined that it had ‘found direct and compelling digital forensic evidence that the documents relied upon by Mr. Ceglia to support his claim were forged’, http://cdn.arstechnica.net/wp-content/uploads/2014/08/strozreport.pdf.
2Hamid Jahankhani and Elidon Beqiri, ‘Digital evidence manipulation using anti-forensic tools and techniques’ in Hamid Jahankhani, David Lilburn Watson and Gianluigi Me (eds) Handbook of Electronic Security and Digital Forensics (World Scientific Publishing Co Pte Ltd 2010), 411–427.
3Van Buskirk and Liu, ‘Digital evidence’, 25.
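The mechanics of such alterations are trivially simple, a point the following Python sketch illustrates (a sketch of the principle only, not the code of the named tools; the file names and the minimal executable stub are invented for illustration). Renaming a file’s extension leaves its content, and hence its identifying ‘magic number’, untouched, while a single standard library call rewrites the timestamps an examiner might otherwise rely upon:

```python
import os
import tempfile

def disguise_and_backdate():
    """Create an 'executable', rename it to .docx, and back-date its mtime."""
    with tempfile.TemporaryDirectory() as d:
        exe_path = os.path.join(d, "payload.exe")
        with open(exe_path, "wb") as f:
            f.write(b"MZ" + b"\x00" * 62)  # minimal stub carrying the EXE magic number

        # 'Transmogrify'-style disguise: the extension changes, the bytes do not.
        doc_path = os.path.join(d, "report.docx")
        os.rename(exe_path, doc_path)
        with open(doc_path, "rb") as f:
            magic = f.read(2)

        # 'Timestomp'-style alteration: rewrite access and modification times.
        backdated = 946684800  # 2000-01-01 00:00:00 UTC
        os.utime(doc_path, (backdated, backdated))
        mtime = int(os.stat(doc_path).st_mtime)
    return magic, mtime

magic, mtime = disguise_and_backdate()
assert magic == b"MZ"       # renaming did not alter the content
assert mtime == 946684800   # the recorded time now predates the file's true creation
```

An examiner who relies only on file extensions and timestamps would be misled on both counts; the magic number in the file’s first bytes survives the disguise, which is one reason signature-based file identification remains standard forensic practice.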
9.125 Other tools have legitimate objectives such as privacy protection. For instance, to prevent companies from obtaining data about individual behaviour when using a search engine, software can be used to generate a large number of random queries that create random noise.1 A record of keyword searches can also have evidential value in a criminal trial, for instance to establish the interest of a suspect in certain poisons or drugs. In such cases, these tools can be used to cast doubt on the reliability of the log data that documents the searches carried out on a suspect’s computer. Where the search terms may have been automatically generated, any inference that the user of the machine intentionally searched for a specific term becomes problematic.
1Ye Shaozhi, Felix Wu, Raju Pandey and Hao Chen, ‘Noise injection for search privacy protection’ (2009) UC Davis Postprints, http://escholarship.org/uc/item/08k1004m.
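The principle can be sketched as follows (a toy illustration of our own, not the technique described in the cited paper; the decoy vocabulary and the example query are invented). Real queries are interleaved with randomly chosen decoys, so the resulting log alone cannot show which terms the user intended:

```python
import random

# Invented decoy terms; a real tool would draw from a much larger corpus.
DECOY_VOCABULARY = ["weather", "recipes", "football", "gardening", "flights"]

def interleave_with_noise(real_queries, noise_ratio=3, rng=None):
    """Return a shuffled log mixing each real query with 'noise_ratio' decoys."""
    rng = rng or random.Random(0)  # seeded for reproducibility of the sketch
    log = []
    for query in real_queries:
        log.append(query)
        for _ in range(noise_ratio):
            log.append(rng.choice(DECOY_VOCABULARY))
    rng.shuffle(log)  # order no longer reveals which entry was typed by the user
    return log

log = interleave_with_noise(["arsenic toxicity"], noise_ratio=4)
assert "arsenic toxicity" in log   # the real query is still present in the log...
assert len(log) == 5               # ...but four of the five entries are machine-generated
```

The incriminating term still appears in the log, but the log by itself no longer supports the inference that a human typed it, which is precisely the evidential difficulty described above.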
9.126 Lastly, a specific type of falsification should be given consideration. In many countries around the world, state security and intelligence services operate malware (see also the paragraph on the use of legal intrusion, which is increasingly used as a police power) that is capable not only of surveillance of its targets but, with a simple addition, of planting falsified data. The presence, use, or even evidence of the existence of such tools creates significant technical challenges.1 In cases where unscrupulous people acting on behalf of a government have a political motive to use such tools, consideration should be given to the possibility that such falsifications may have been used.
1See, for example, the case of Hacking Team, an Italian supplier of such capabilities that was found to have made a long list of rather ‘unethical’ and often illegal sales to governments: Andy Greenberg, ‘Hacking Team breach shows a global spying firm run amok’, Wired, 6 July 2015, https://www.wired.com/2015/07/hacking-team-breach-shows-global-spying-firm-run-amok/; Patrick Howell O’Neill, ‘The fall and rise of a spyware empire’, MIT Technology Review, 29 November 2019, https://www.technologyreview.com/2019/11/29/131803/the-fall-and-rise-of-a-spyware-empire/; for another example of such software, see the Israeli-made Pegasus malware, traced by the Toronto-based Citizen Lab (at the Munk School of the University of Toronto): Bill Marczak, John Scott-Railton, Sarah McKune, Bahr Abdul Razzak and Ron Deibert, ‘HIDE AND SEEK tracking NSO group’s Pegasus spyware to operations in 45 countries’, 18 September 2018, https://citizenlab.ca/2018/09/hide-and-seek-tracking-nso-groups-pegasus-spyware-to-operations-in-45-countries/.
9.127 Tampering with and destroying data works best when the criminal no longer needs the data. For possession crimes such as the possession of illegal images, this is not possible. Hiding the data rather than destroying or altering it therefore becomes an important objective. Cryptography is the best known anti-forensic method to hide data from third parties. Due to its importance as a dual use technology with important roles for privacy and data security, and also because of the complex legal issues involved with cryptography, this is considered in the chapter on encrypted data.
9.128 Another well-known method of hiding data is steganography. Steganography is the method of hiding a message inside a digital object, which may be a graphic, a picture, a film or a sound clip. The sender is able to hide a message in a seemingly innocuous file, and the recipient can retrieve the message upon receipt. Other methods used to hide data include writing data to slack space or space that has not been allocated for use, hiding data on a hard drive in a secret partition, and the transmission of data under the cover of transmission protocols. Various types of commercial and free software are available to perform steganography on data. It can be relatively difficult to detect hidden data within a file, and the communication can be even more difficult to uncover if the message has been compressed and encrypted before being hidden in the carrier. At present, it is unlikely that many investigators will undertake a routine examination for hidden data.1
1Brent T. McBride, Gilbert L. Peterson and Steven C. Gustafson, ‘A new blind method for detecting novel steganography’ (2005) 2(1) Digital Investigation 50; a wide range of references on this topic is provided in Gary C. Kessler, ‘An overview of steganography for the computer forensics examiner’ (2004) 6(3) Forensic Science Communications; for an update of this article to February 2015, see http://www.garykessler.net/library/fsc_stego.html; Rachel Zax and Frank Adelstein, ‘FAUST: forensic artifacts of uninstalled steganography tools’ (2009) 6(1–2) Digital Investigation 25.
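The most common variant, least-significant-bit (LSB) steganography, can be illustrated with a short sketch (our own simplification, operating over a raw byte buffer that stands in for uncompressed pixel data; real tools work on actual image, audio and video formats). Each byte of the hidden message is spread across the lowest bits of eight carrier bytes, a change far too small to be perceptible:

```python
def hide(carrier: bytes, message: bytes) -> bytes:
    """Embed the message, bit by bit, into the least significant bit of each carrier byte."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(carrier):
        raise ValueError("carrier too small for message")
    out = bytearray(carrier)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite only the lowest bit
    return bytes(out)

def reveal(carrier: bytes, length: int) -> bytes:
    """Recover 'length' hidden bytes from the carrier's least significant bits."""
    out = bytearray()
    for i in range(length):
        byte = 0
        for carrier_byte in carrier[i * 8:(i + 1) * 8]:
            byte = (byte << 1) | (carrier_byte & 1)
        out.append(byte)
    return bytes(out)

pixels = bytes(range(256)) * 4             # stand-in for uncompressed pixel data
stego = hide(pixels, b"meet at dawn")
assert reveal(stego, 12) == b"meet at dawn"
# The carrier is visually near-identical: each byte differs by at most 1.
assert all(abs(a - b) <= 1 for a, b in zip(pixels, stego))
```

Because every altered byte changes by at most one unit of intensity, statistical analysis rather than visual inspection is required to detect the embedding, which is why routine examination for hidden data remains uncommon.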
9.129 There are now various tools available that facilitate the hiding of data in places on the hard drive that are less likely to be inspected. In this sense, they are the mirror images of the deletion tools discussed above. Deletion tools aim to securely delete any trace of an incriminating file, regardless of where on the computer a copy may be hiding. Conversely, ‘Slacker’ breaks up a file and stores the individual pieces in the slack space left at the end of files, making them look like random noise to forensic tools – imagine the digits of a stolen credit card number scattered, two at a time, across the unused space at the end of many legitimate files. Slacker then enables the data to be reassembled as required.1 One of the problems with these tools is that they develop faster than it is possible to train digital evidence professionals, and, even more importantly, faster than the development of sound, tested and agreed standards. This not only makes the detection of evidence more difficult; it also raises issues about the admissibility of the opinion evidence of forensic experts.
1Hal Berghel, ‘Hiding data, forensics, and anti-forensics’ (2007) 50(4) Communications of the ACM 15.
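The amount of space available for such hiding follows directly from cluster-based allocation, as the following sketch illustrates (a calculation of the principle, not the ‘Slacker’ tool itself; the cluster size is an assumption, as it varies by file system and volume). A file system allocates storage in whole clusters, so a file rarely fills its final cluster exactly, and the leftover bytes are never shown by ordinary file operations:

```python
CLUSTER_SIZE = 4096  # a common NTFS cluster size (an assumption for illustration)

def slack_bytes(file_size: int, cluster_size: int = CLUSTER_SIZE) -> int:
    """Bytes left unused at the end of the file's final allocated cluster."""
    remainder = file_size % cluster_size
    return 0 if remainder == 0 else cluster_size - remainder

# A 10,000-byte file occupies three 4,096-byte clusters (12,288 bytes),
# leaving 2,288 bytes of slack in which fragments of data could be hidden.
assert slack_bytes(10_000) == 2_288
assert slack_bytes(8_192) == 0   # an exact multiple of the cluster size leaves no slack
```

Multiplied across the thousands of files on a typical volume, this leftover space amounts to a substantial covert storage area that standard file-level examination never touches.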
Attacks against computer forensics
9.130 Arguably, the latest addition to the inventory of anti-computer forensics is attacks against the investigator’s tools. As noted above, digital forensics is highly dependent on software tools. To create evidence that is admissible, these tools have to be evaluated and tested, and the results ideally published in openly available, peer-reviewed scientific publications. Indeed, some of the most popular tools are open source: that is, their source code is freely available. One of the benefits of this approach is not only a high degree of transparency when it comes to assessing the reliability of data generated by these tools, but also the ability for security professionals to improve them and to adapt them to local situations.1 However, it also enables criminals to develop tools that interfere directly with the evidence collection process and infiltrate the software that tries to analyse a suspect’s computer. This can be done either by undermining the integrity of the data that is collected, for instance by changing the hash value of the bit copy that the software creates (thus violating the continuity of evidence by casting reasonable doubt on the authenticity of the copy), or by forcing the analysis tool either to overlook incriminating data or to report misleading information about it.2 Given such possibilities, it cannot be right to equate such a tool with, say, a photocopier.3
1Kenneally, ‘Gatekeeping out of the box’.
2Chris K. Ridder, ‘Evidentiary implications of potential security weaknesses in forensic software’ (2009) 1(3) International Journal of Digital Crime and Forensics 80.
3Williford v State of Texas, 127 S.W.3d 309 (Tex.App.-Eastland 2004), 2004 WL 67560.
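The integrity mechanism that such an attack targets can be illustrated briefly (a schematic Python sketch, not the output of any forensic suite; the all-zero buffer stands in for an acquired disk image). An examiner hashes the bit-for-bit copy at acquisition and re-hashes it whenever it is relied upon; any alteration, even of a single bit, produces a different digest:

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Return the hexadecimal SHA-256 digest of the supplied data."""
    return hashlib.sha256(data).hexdigest()

disk_image = bytes(1024)                      # stand-in for an acquired bit copy
acquisition_hash = sha256_digest(disk_image)  # recorded at the time of imaging

# Verification at a later date: an unmodified copy matches the recorded digest.
assert sha256_digest(disk_image) == acquisition_hash

# A single flipped bit, such as an anti-forensic tool might introduce, is detected.
tampered = bytearray(disk_image)
tampered[512] ^= 0x01
assert sha256_digest(bytes(tampered)) != acquisition_hash
```

It is precisely because this check is so sensitive that an attack able to manipulate the hashing step itself, rather than the data, strikes at the foundation of the continuity of evidence.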
9.131 Trail obfuscation combines the deliberate attempt at tampering, deleting and hiding data with the taking of measures to frustrate investigations, conceal identities and evade enforcement actions.1 In many investigations, the data held on the suspect’s computer or device is only one part of the prosecution’s case. The other, equally important, set of data will come from the Internet and relate to the suspect’s browsing behaviour, or the victim’s computer or device in the case of a hacking offence: the origin of the data, the websites visited, and the activities undertaken. Obfuscating the trail that such activities leave behind on the Internet is therefore an important aspect of anti-computer forensics. It includes various anonymity-protection tools such as VPNs or anonymous remailers to hide browsing activity, or the use of spoofed or zombified accounts when sending malicious emails or spam, or the launch of a denial of service attack. ‘Zombified accounts’, as discussed in more detail below, demonstrate a specific side effect of anti-computer forensics. One way for a criminal to hide illegal activities is to take over the computer of a third party, for instance, after inserting a Trojan horse program, discussed in more detail below, and using this third party machine to carry out illegal activities. This not only hides the true perpetrator from the investigators, it also creates data that can falsely incriminate an innocent party.2
1In the civil context, see EMI Records Ltd v British Sky Broadcasting Ltd [2013] EWHC 379 (Ch), [2013] Bus LR 884, [2013] 2 WLUK 812, [2013] ECDR 8, [2013] Info TLR 133, [2013] FSR 31, Times, 23 April 2013, [2013] CLY 1752.
2Mukkamala and Sung, ‘Identifying significant features for network forensic analysis’; Nikkel, ‘Domain name forensics’; Nikkel, ‘Improving evidence acquisition from live network sources’; Casey and Stanley, ‘Tool review – remote forensic preservation and examination tools’; Demir and others, ‘Packet marking and auditing for network forensics’.
9.132 The range of tasks performed by such malicious software is probably only restricted by the imagination of the person who creates the program. In a number of cases in the criminal courts where people have been accused of being in possession of abusive images of children on their computers, the accused have raised the defence that some form of malicious software caused the data to be downloaded to their computers or enabled a third party to obtain access to their computers without the permission of the computers’ owners.1 In the case of R v Caffrey,2 the defendant was charged with causing unauthorized modification of computer material under s 3(1) of the Computer Misuse Act 1990. The prosecution alleged that the defendant sent a deluge of electronic data from his computer to a computer server operated in the Port of Houston, Texas, USA, the effect of which was to cause the computer at the Port of Houston to shut down. The defendant claimed, in his defence, that unknown hackers obtained control of his computer and then launched a number of programs to attack the computer at the Port of Houston. The forensic examiner for the prosecution could not find any evidence of a Trojan horse on the computer. The defence claimed that it was impossible for every file to have been tested, and that the Trojan horse file might have had a facility to destroy itself, leaving no traces of having resided on his computer. The forensic examiner for the prosecution disputed that, stating that a Trojan horse would leave a trace on the computer. The jury acquitted Mr Caffrey.3
1R v Schofield (April 2003, unreported), Reading Crown Court, and R v Green (October 2003, unreported), Exeter Crown Court.
2(October 2003, unreported), Southwark Crown Court.
3Esther George, ‘Casenote’ (2004) 1(2) Digital Investigation 89; Susan Brenner, Brian Carrier and Jef Henninger, ‘The Trojan horse defense in cybercrime cases’ (2004) 21 Santa Clara High Tech LJ 1; the first Trojan horse case in the People’s Republic of China was prosecuted in 2009: Jihong Chen, ‘The first “Trojan horse” case prosecuted in China’ (2010) 7 Digital Evidence and Electronic Signature Law Review 107; Alex Xia and Julia Peng, ‘First “Trojan horse” case prosecuted for illegal invasion of computer systems in China’ (2009) 25 Computer Law & Security Review 298.
9.133 It should be noted that just because an individual may have such materials on his computer, it does not follow that he was responsible for downloading them. It is important for any digital evidence professional to report on findings within the context of what the technology is capable of doing. For instance, it is possible to introduce malicious software through web pages without the permission of the website owner. When a person visits a website, software could redirect the computer to undesirable websites, and the computer will automatically download unwanted material onto the temporary cache file of the computer without the user’s permission or knowledge.1
1For which, see Bilar, ‘Known knowns’ (in which the author illustrates the ease by which third parties can obtain control of computers without the authority of the owner or user); Megan Carney and Marc Rogers, ‘The Trojan made me do it: a first step in statistical based computer forensics event reconstruction’ (2004) 2(4) International Journal of Digital Evidence.
9.134 A Trojan horse is a malicious software program containing hidden code that is designed to conceal itself in a computer as if it were legitimate software. When activated, the software will perform an operation that is not authorized by the user, such as the destruction of data (including the entire hard drive), the collection of data on a computer and its transmission to a third party without the user being aware of what is happening, the counteraction of security measures installed on a computer, and the instruction of the computer to perform tasks such as taking part in a denial of service attack, or permitting the creator of the program to obtain access to the computer. Just like the other large group of malware, viruses, Trojan horses pose a Janus-faced conundrum for computer forensics. Finding a virus or a Trojan infection can be direct evidence amounting to an unauthorized modification of computer systems. At the same time, this can also be indirect evidence that the computer at the centre of an investigation has been tampered with and that the crime scene is ‘contaminated’.
9.135 The dual use nature of many of the tools used for anti-computer forensics has been noted above. On the one hand, these tools protect our privacy against criminals, but they also protect the privacy of criminals from police investigations. A similar analysis applies to spyware such as Trojan horses. On the one hand, they allow criminals to obtain access to credit card details or passwords. On the other, they have the potential to allow the police to obtain access to the activities of criminals – that is, if the police succeed in planting such a program on the suspect’s computer. Attempts to use malware for investigative purposes have caused legal controversy in some countries. In Germany, the Constitutional Court ruled against such clandestine surveillance after prosecutors applied for warrants to permit their use. In the discussion before the court, evidence was also given by computer specialists about the security and evidential implications of these ‘Federal Trojans’. To work efficiently, they must not be detected by commercial anti-virus software. This can only be achieved either by the tacit collaboration of the anti-virus software vendors, or by using the ingenuity of programmers employed by the police. In either case, the result will be malware that cannot be easily detected. One obvious danger is that criminals can get hold of and in turn hijack the code for this ‘official’ malware once it has been planted on their machines, which would in effect give them a ‘master key’ for all computer systems. In such an event, it would become much easier for the defence to mount ‘Caffrey style’ arguments, and all computers could become crime scenes with compromised integrity.1
1Wiebke Abel and Burkhard Schafer, ‘The German Constitutional Court on the right in confidentiality and integrity of information technology systems – a case report on BVerfG, NJW 2008, 822’ (2009) 6(1) SCRIPT-ed 106.
9.136 A final complication is created by the desire to protect users against malware. The use of Trojans lies at the heart of distributed denial of service attacks, a significant threat to the functioning of the Internet. Preventing malware has therefore become a high priority for police and commerce. Ordinary users, who often fail to take appropriate steps to protect their computers and devices against interference by criminals, are the weakest link. The Trusted Computing Initiative is one possible answer to this problem. It would allow a coalition of software and hardware developers much more direct access to computers, ensuring that all their defence mechanisms work as specified, and that no unauthorized program is run on them. While this approach is promising in its potential to reduce computer criminality, it carries several challenges for the interpretation of electronic evidence. Since computer forensic tools too are essentially a form of ‘spyware’, common forensic applications may not work any longer in a trusted computing environment. Even worse, the philosophy of trusted computing is premised on the belief that, to protect the user against criminal activities, the security of the computer or device is improved if control over it is determined not just by the user but also by organizations. This means that the number of people and organizations that at any given time would have access to users’ computers and the data held therein would increase considerably, especially if the keys to users’ computers and their devices are compromised. This could in turn cast doubt on the reliability and authenticity of the data found on a computer or device during a criminal investigation. At the moment, lawyers assume, often naively, that data found on a suspect’s computer or device must have been put there by the person in physical control of the machine (typically, the owner); this inference would look increasingly doubtful in a trusted computing environment.1
1Yianna Danidou and Burkhard Schafer, ‘Trusted computing and the digital crime scene’ (2011) 8 Digital Evidence and Electronic Signature Law Review 111.
An intellectual framework for analysing electronic evidence
9.137 However, as we have seen, despite these differences, evidence in digital form shares important features with other types of evidence. Eyewitness evidence, forensic trace evidence such as DNA and proof by document can all provide the basis for analogical reasoning to determine the evidentiary value of an item of digital evidence, if we are aware of the limitations of this analogy. The digital evidence professional, however, has a different job from that of a DNA analyst or a forensic entomologist and, in particular, deals with mathematical abstractions rather than empirical objects. Therefore, findings will not normally be in the form of matching probabilities or other quantifiable, generalized statements.1 ‘Universal’ theories of evidence are regrettably either rare, or too abstract to be of much practical value. However, the ‘hierarchy of propositions’ promoted by the Forensic Science Service in the UK has the potential to provide such a framework, which can also help to illuminate further the distinguishing features of electronic evidence and what they mean for practice. To interpret evidence, the digital evidence professional (or the judge) has to consider propositions that represent respectively the prosecution or defence, or the pursuer or defendant. Evidential weight can only be ascertained if the propositions from both sides are weighed, and the increase or decrease in likelihood of each is assessed. Several studies have shown, with examples, how a hierarchical analysis can help in the evaluation of heterogeneous evidence, from eyewitnesses to DNA.2 The nature of electronic evidence is such that, on a like-for-like comparison and allowing for the machine-mediated nature of electronic evidence, the evidence will be several steps further removed from the reasoning associated with traditional evidence. All these steps have to be explored and the counterfactuals examined before electronic evidence can be said to be proved.
1A potential problem for jurisdictions that follow the US decision in Daubert v Merrell Dow Pharmaceuticals, Inc., 509 U.S. 579 (1993), 113 S.Ct. 2786 that requires that experts report confidence values and error rates, something that rarely applies in computer forensics.
2I. W. Evett, G. Jackson and J. Lambert, ‘More on the hierarchy of propositions: exploring the distinction between explanations and propositions’ (2000) 40(1) Science & Justice 3.
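In quantitative terms, this framework asks for a likelihood ratio: the probability of the evidence under the prosecution proposition divided by its probability under the defence proposition. The following sketch uses invented numbers purely to illustrate the structure of the reasoning, not figures from any case or study:

```python
def likelihood_ratio(p_e_given_hp: float, p_e_given_hd: float) -> float:
    """Weight of evidence E: P(E | prosecution proposition) / P(E | defence proposition)."""
    return p_e_given_hp / p_e_given_hd

# E: an incriminating file found on the suspect's computer.
# Hp: the suspect downloaded it; Hd: malware placed it there.
lr = likelihood_ratio(0.95, 0.01)
assert lr > 1  # on these illustrative figures, E supports the prosecution proposition

# A plausible Trojan defence raises P(E | Hd) and collapses the weight of the evidence,
# even though nothing about E itself has changed.
lr_with_trojan = likelihood_ratio(0.95, 0.5)
assert lr_with_trojan < lr
```

The sketch makes the chapter’s point concrete: the weight of an item of electronic evidence depends not on the prosecution proposition alone, but on how plausible the competing explanation is, and anti-forensic tools operate precisely by inflating the probability of the evidence under the defence proposition.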
Conclusions and future considerations
9.138 The widespread use of computers, the Internet, mobile telephones and smartphones means that most lawyers now have to deal with electronic evidence.1 The increased use of specialized law enforcement capacities and advanced security and intelligence capabilities, alongside the more traditional criminal justice system, has created a field that is in constant flux. The weighing of the probative value of the evidence can be straightforward only in simple cases, for which ample precedent and standards exist, but not where the parties challenge the data or where new and innovative methods are in play. In these cases, the court often has to rely upon digital evidence professionals. This implies the need for a thorough analysis of the merits of each piece of data given in evidence. For this reason, lawyers must familiarize themselves with electronic evidence and understand not only the need to scrutinize the qualifications and conclusions of digital evidence professionals, but also the need to scrutinize the very evidence that they present and the manner in which it was obtained.
1Graeme Horsman and Lynne R. Conniss, ‘Investigating evidence of mobile phone usage by drivers in road traffic accidents’ (2015) 12 Digital Investigation S30.
9.139 Cloud computing and trusted computing affect the way digital evidence professionals obtain evidence, which means that great care must be taken over how such evidence is obtained, which will doubtless be the subject of careful cross-examination.1 In addition, the methods used by attackers in the digital environment will mean it is increasingly necessary to take into consideration the use of rarer techniques to obtain evidence in the future.2
1Stephen Mason, ‘Trusted computing and forensic investigations’ (2005) 2(3) Digital Investigation 189: this article is merely an introduction to the topic that includes relevant references, and see also Stephen Mason, ‘Trusting your computer to be trusted’ (2005) Computer Fraud & Security 7, with a number of additional references; see also a thesis in partial fulfilment of the requirements for the degree of Masters in Forensic Information Technology submitted to the graduate faculty of Computing and Mathematical Sciences at Auckland University of Technology by Michael E. Spence, ‘Factors influencing digital evidence transfer across international borders: a case study’ (2010), http://aut.researchgateway.ac.nz/handle/10292/1187; Ian Walden, ‘Law enforcement access in a cloud environment’ (Legal Studies Research Paper No 74/2011), http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1781067; Josiah Dykstra and Alan T. Sherman, ‘Acquiring forensic evidence from infrastructure-as-a-service cloud computing: exploring and evaluating tools, trust, and techniques’ (2012) 9 Digital Investigation S90.
2Kris Harms, ‘Forensic analysis of System Restore points in Microsoft Windows XP’ (2006) 3(3) Digital Investigation 151.
9.140 In response to these developments, anti-computer forensics has emerged over the last decade as a significant challenge to the investigation of crimes involving the use of computers and computer-like devices. The arms race between criminals and investigators on the one hand, and the dual use nature of the tools that permit and prevent digital investigations on the other, have created a highly complex interaction that requires careful reflection on the nature of electronic evidence in any individual case, a reflection that has to be constantly updated as new tools emerge. While this chapter posits various principles and standards for handling and analysing electronic evidence, technological advancements will undoubtedly create new challenges and conflicts for the process of collecting, evaluating and examining electronic evidence in a legal setting in the near future.