Digital forensics (sometimes known as digital forensic science) is a branch of forensic science encompassing the recovery, investigation, examination, and analysis of material found in digital devices, often in relation to mobile devices and computer crime. The term "digital forensics" was originally used as a synonym for computer forensics but has expanded to cover investigation of all devices capable of storing digital data. With roots in the personal computing revolution of the late 1970s and early 1980s, the discipline evolved in a haphazard manner during the 1990s, and it was not until the early 21st century that national policies emerged.
Digital forensics investigations have a variety of applications. The most common is to support or refute a hypothesis before criminal or civil courts. Criminal cases involve the alleged breaking of laws that are defined by legislation and enforced by the police and prosecuted by the state, such as murder, theft, and assault against the person. Civil cases, on the other hand, deal with protecting the rights and property of individuals (often associated with family disputes), but may also be concerned with contractual disputes between commercial entities where a form of digital forensics referred to as electronic discovery (ediscovery) may be involved.
Forensics may also feature in the private sector, such as during internal corporate investigations or intrusion investigations (a special probe into the nature and extent of an unauthorized network intrusion).
The technical aspect of an investigation is divided into several sub-branches related to the type of digital devices involved: computer forensics, network forensics, forensic data analysis, and mobile device forensics.[1] The typical forensic process encompasses the seizure, forensic imaging (acquisition), and analysis of digital media, followed with the production of a report of the collected evidence.
As well as identifying direct evidence of a crime, digital forensics can be used to attribute evidence to specific suspects, confirm alibis or statements, determine intent, identify sources (for example, in copyright cases), or authenticate documents. Investigations are much broader in scope than other areas of forensic analysis (where the usual aim is to provide answers to a series of simpler questions), often involving complex time-lines or hypotheses.
Prior to the 1970s, crimes involving computers were dealt with using existing laws. The first computer crimes were recognized in the 1978 Florida Computer Crimes Act,[2] which included legislation against the unauthorized modification or deletion of data on a computer system. Over the next few years, the range of computer crimes being committed increased, and laws were passed to deal with issues of copyright, privacy/harassment (e.g., cyber bullying, happy slapping, cyber stalking, and online predators), and child pornography. It was not until the 1980s that federal laws began to incorporate computer offences. Canada was the first country to pass legislation in 1983. This was followed by the US Federal Computer Fraud and Abuse Act in 1986, Australian amendments to their crimes acts in 1989, and the British Computer Misuse Act in 1990.
The growth in computer crime during the 1980s and 1990s caused law enforcement agencies to begin establishing specialized groups, usually at the national level, to handle the technical aspects of investigations. For example, in 1984, the FBI launched a Computer Analysis and Response Team and the following year a computer crime department was set up within the British Metropolitan Police fraud squad. As well as being law enforcement professionals, many of the early members of these groups were also computer hobbyists and became responsible for the field's initial research and direction.
One of the first practical (or at least publicized) examples of digital forensics was Cliff Stoll's pursuit of hacker Markus Hess in 1986. Stoll, whose investigation made use of computer and network forensic techniques, was not a specialized examiner. Many of the earliest forensic examinations followed the same profile.
Throughout the 1990s, there was high demand for these new, and basic, investigative resources. The strain on central units led to the creation of regional, and even local, groups to help handle the load. For example, the British National Hi-Tech Crime Unit was set up in 2001 to provide a national infrastructure for computer crime, with personnel located both centrally in London and with the various regional police forces (the unit was folded into the Serious Organised Crime Agency (SOCA) in 2006).
During this period, the science of digital forensics grew from the ad-hoc tools and techniques developed by these hobbyist practitioners. This is in contrast to other forensics disciplines, which developed from work by the scientific community. It was not until 1992 that the term "computer forensics" was used in academic literature (although prior to this, it had been in informal use); a paper by Collier and Spaul attempted to justify this new discipline to the forensic science world. This swift development resulted in a lack of standardization and training. In his 1995 book, High-Technology Crime: Investigating Cases Involving Computers, K. Rosenblatt wrote the following:
Since 2000, in response to the need for standardization, various bodies and agencies have published guidelines for digital forensics. The Scientific Working Group on Digital Evidence (SWGDE) produced a 2002 paper, Best Practices for Computer Forensics; this was followed in 2005 by the publication of an ISO standard (ISO 17025, General requirements for the competence of testing and calibration laboratories). A European-led international treaty, the Convention on Cybercrime, came into force in 2004 with the aim of reconciling national computer crime laws, investigative techniques, and international co-operation. The treaty has been signed by 43 nations (including the US, Canada, Japan, South Africa, the UK, and other European nations) and ratified by 16.
The issue of training also received attention. Commercial companies (often forensic software developers) began to offer certification programs, and digital forensic analysis was included as a topic at the UK specialist investigator training facility, Centrex.
In the late 1990s, mobile devices became more widely available, advancing beyond simple communication devices, and were found to be rich sources of information, even for crimes not traditionally associated with digital forensics. Despite this, digital analysis of phones has lagged behind that of traditional computer media, largely due to the proprietary nature of the devices.
Focus has also shifted onto internet crime, particularly the risk of cyber warfare and cyberterrorism. A February 2010 report by the United States Joint Forces Command concluded the following:
The field of digital forensics still faces unresolved issues. A 2009 paper, "Digital Forensic Research: The Good, the Bad and the Unaddressed" by Peterson and Shenoi, identified a bias towards Windows operating systems in digital forensics research. In 2010, Simson Garfinkel identified issues facing digital investigations in the future, including the increasing size of digital media, the wide availability of encryption to consumers, a growing variety of operating systems and file formats, an increasing number of individuals owning multiple devices, and legal limitations on investigators. The paper also identified continued training issues, as well as the prohibitively high cost of entering the field.
See main article: List of digital forensics tools. During the 1980s, very few specialized digital forensic tools existed. Consequently, investigators often performed live analysis on media, examining computers from within the operating system using existing sysadmin tools to extract evidence. This practice carried the risk of modifying data on the disk, either inadvertently or otherwise, which led to claims of evidence tampering. A number of tools were created during the early 1990s to address the problem.
The need for such software was first recognized in 1989 at the Federal Law Enforcement Training Center, resulting in the creation of IMDUMP [3] (by Michael White) and in 1990, SafeBack [4] (developed by Sydex). Similar software was developed in other countries; DIBS (a hardware and software solution) was released commercially in the UK in 1991, and Rob McKemmish released Fixed Disk Image free to Australian law enforcement. These tools allowed examiners to create an exact copy of a piece of digital media to work on, leaving the original disk intact for verification. By the end of the 1990s, as demand for digital evidence grew, more advanced commercial tools such as EnCase and FTK were developed, allowing analysts to examine copies of media without using any live forensics. More recently, a trend towards "live memory forensics" has grown, resulting in the availability of tools such as WindowsSCOPE.
More recently, the same progression of tool development has occurred for mobile devices; initially investigators accessed data directly on the device, but soon specialist tools such as XRY or Radio Tactics Aceso appeared.
See main article: Digital forensic process.
A digital forensic investigation commonly consists of three stages: the acquisition (or forensic imaging) of exhibits, analysis, and reporting.
Acquisition does not normally involve capturing an image of the computer's volatile memory (RAM) unless this is done as part of an incident response investigation. Typically, the task involves creating an exact sector-level duplicate (or "forensic duplicate") of the media, often using a write blocking device to prevent modification of the original. However, the growth in size of storage media and developments such as cloud computing have led to more use of 'live' acquisitions, whereby a 'logical' copy of the data is acquired rather than a complete image of the physical storage device. Both the acquired image (or logical copy) and the original media/data are hashed (using an algorithm such as SHA-1 or MD5), and the values are compared to verify that the copy is accurate.
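As a minimal illustration of the verification step, the Python sketch below hashes both the source media and the acquired image and compares the digests. The device and file paths are hypothetical, and in practice the source would be read through a write blocker by a dedicated imaging tool; this only shows the hashing and comparison logic.

```python
import hashlib

def hash_stream(path, algorithm="sha1", chunk_size=1 << 20):
    """Hash a file or device in chunks so arbitrarily large images fit in memory."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as source:
        while chunk := source.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical paths: the source device (read via a write blocker) and the acquired image file.
original_hash = hash_stream("/dev/sdb")
image_hash = hash_stream("evidence/sdb.dd")

print("verified" if original_hash == image_hash else "MISMATCH: copy not verified")
```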
An alternative (and patented) approach (that has been dubbed 'hybrid forensics'[5] or 'distributed forensics'[6]) combines digital forensics and ediscovery processes. This approach has been embodied in a commercial tool called ISEEK that was presented together with test results at a conference in 2017.[5]
During the analysis phase an investigator recovers evidence material using a number of different methodologies and tools. In 2002, an article in the International Journal of Digital Evidence referred to this step as "an in-depth systematic search of evidence related to the suspected crime." In 2006, forensics researcher Brian Carrier described an "intuitive procedure" in which obvious evidence is first identified and then "exhaustive searches are conducted to start filling in the holes."
The actual process of analysis can vary between investigations, but common methodologies include conducting keyword searches across the digital media (within files as well as in unallocated and slack space), recovering deleted files, and extracting registry information (for example, to list user accounts or attached USB devices).
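Because keyword searches cover unallocated and slack space as well as files, forensic tools typically scan the raw image bytes rather than relying on the file system. The sketch below is a simplified illustration of that idea; the image path and search terms are hypothetical, and production tools add encodings, indexing, and file-system awareness.

```python
KEYWORDS = [b"invoice", b"password"]   # hypothetical search terms
CHUNK = 1 << 20                        # read the image 1 MiB at a time
OVERLAP = max(len(k) for k in KEYWORDS) - 1

def search_image(path, keywords=KEYWORDS):
    """Report absolute byte offsets of keyword hits anywhere in a raw image,
    including unallocated and slack space, by scanning the raw bytes."""
    hits = set()
    offset = 0          # bytes consumed from the image so far
    tail = b""          # carry-over so hits spanning chunk boundaries are found
    with open(path, "rb") as img:
        while chunk := img.read(CHUNK):
            data = tail + chunk
            base = offset - len(tail)      # absolute offset of data[0]
            for kw in keywords:
                start = 0
                while (pos := data.find(kw, start)) != -1:
                    hits.add((kw.decode(), base + pos))
                    start = pos + 1
            tail = data[-OVERLAP:]
            offset += len(chunk)
    return sorted(hits, key=lambda hit: hit[1])

for keyword, position in search_image("evidence/sdb.dd"):   # hypothetical image path
    print(f"{keyword!r} found at byte offset {position}")
```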
The evidence recovered is analyzed to reconstruct events or actions and to reach conclusions, work that can often be performed by less specialized staff. When an investigation is complete, the data is presented, usually in the form of a written report, in layperson's terms.
Digital forensics is commonly used in both criminal law and private investigation. Traditionally it has been associated with criminal law, where evidence is collected to support or oppose a hypothesis before the courts. As with other areas of forensics this is often a part of a wider investigation spanning a number of disciplines. In some cases, the collected evidence is used as a form of intelligence gathering, used for other purposes than court proceedings (for example to locate, identify or halt other crimes). As a result, intelligence gathering is sometimes held to a less strict forensic standard.
In civil litigation or corporate matters, digital forensics forms part of the electronic discovery (or eDiscovery) process. Forensic procedures are similar to those used in criminal investigations, often with different legal requirements and limitations. Outside of the courts digital forensics can form a part of internal corporate investigations.
A common example is an investigation following an unauthorized network intrusion. A specialist forensic examination into the nature and extent of the attack is performed as a damage limitation exercise, both to establish the extent of any intrusion and in an attempt to identify the attacker. Such attacks were commonly conducted over phone lines during the 1980s, but in the modern era they are usually propagated over the Internet.
The main focus of digital forensics investigations is to recover objective evidence of a criminal activity (termed actus reus in legal parlance). However, the diverse range of data held in digital devices can help with other areas of inquiry.
One major limitation on a forensic investigation is the use of encryption; this disrupts the initial examination, where pertinent evidence might otherwise be located using keyword searches. Laws compelling individuals to disclose encryption keys are still relatively new and controversial. However, solutions for brute-forcing passwords or bypassing encryption are increasingly available; on smartphones or PCs, for example, bootloader techniques can be used to acquire the contents of the device first and then brute-force them to find the password or encryption key. It is estimated that about 60% of cases involving encrypted devices go unprocessed because there is no way to access the potential evidence.[7]
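The brute-force approach mentioned above can be sketched as a simple dictionary attack: candidate passwords are run through the same key-derivation step as the device and compared against a stored verifier. Everything here is illustrative, including the key-derivation parameters, salt, and wordlist; real devices use their own derivation schemes, and practical attacks rely on specialized, hardware-accelerated tools.

```python
import hashlib

def derive_key(password: str, salt: bytes, iterations: int = 100_000) -> bytes:
    # Hypothetical KDF settings; real devices use their own parameters and algorithms.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)

def dictionary_attack(target_key: bytes, salt: bytes, wordlist_path: str):
    """Try each candidate in the wordlist until one derives the target key."""
    with open(wordlist_path, encoding="utf-8", errors="ignore") as wordlist:
        for line in wordlist:
            candidate = line.strip()
            if derive_key(candidate, salt) == target_key:
                return candidate
    return None

# Hypothetical values standing in for data recovered during acquisition.
salt = bytes.fromhex("a3f1c2d4e5b60718")
target = derive_key("letmein", salt)   # stands in for the verifier read from the device

print(dictionary_attack(target, salt, "wordlist.txt"))
```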
The examination of digital media is covered by national and international legislation. For civil investigations in particular, laws may restrict the abilities of analysts to undertake examinations. Restrictions against network monitoring or the reading of personal communications often exist. During criminal investigations, national laws restrict how much information can be seized. For example, in the United Kingdom seizure of evidence by law enforcement is governed by the PACE Act. Early in the field's history, the "International Organization on Computer Evidence" (IOCE) was one agency that worked to establish compatible international standards for the seizure of evidence.
In the UK, the same laws covering computer crime can also affect forensic investigators. The 1990 Computer Misuse Act legislates against unauthorized access to computer material. This is a particular concern for civil investigators who have more limitations than law enforcement.
An individual's right to privacy is one area of digital forensics which is still largely undecided by courts. The US Electronic Communications Privacy Act places limitations on the ability of law enforcement or civil investigators to intercept and access evidence. The act makes a distinction between stored communication (e.g. email archives) and transmitted communication (such as VOIP). The latter, being considered more of a privacy invasion, is harder to obtain a warrant for. The ECPA also affects the ability of companies to investigate the computers and communications of their employees, an aspect that is still under debate as to the extent to which a company can perform such monitoring.
Article 8 of the European Convention on Human Rights asserts similar privacy limitations to the ECPA and limits the processing and sharing of personal data both within the EU and with external countries. The ability of UK law enforcement to conduct digital forensics investigations is legislated by the Regulation of Investigatory Powers Act.
See main article: Digital evidence.
When used in a court of law, digital evidence falls under the same legal guidelines as other forms of evidence, as courts do not usually require more stringent standards. In the United States, the Federal Rules of Evidence are used to evaluate the admissibility of digital evidence. The United Kingdom PACE and Civil Evidence acts have similar guidelines, and many other countries have their own laws. US federal laws restrict seizures to items of obvious evidential value, something that is acknowledged as not always being possible to establish with digital media prior to an examination.
Laws dealing with digital evidence are concerned with two issues: integrity and authenticity.
The ease with which digital media can be modified means that documenting the chain of custody from the crime scene, through analysis and, ultimately, to the court (a form of audit trail) is important to establish the authenticity of evidence.
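As a simple illustration of such an audit trail (the fields, file names and storage format below are assumptions for the sketch, not a prescribed standard), each handling event for an exhibit can be recorded with a timestamp, handler, action, and the exhibit's current hash, so later tampering can be detected:

```python
import csv
import hashlib
from datetime import datetime, timezone

def record_custody_event(log_path, exhibit_path, handler, action):
    """Append one chain-of-custody entry; the exhibit hash lets later readers
    confirm the evidence file is unchanged since this event was logged."""
    sha256 = hashlib.sha256()
    with open(exhibit_path, "rb") as exhibit:
        for chunk in iter(lambda: exhibit.read(1 << 20), b""):
            sha256.update(chunk)
    with open(log_path, "a", newline="") as log:
        csv.writer(log).writerow([
            datetime.now(timezone.utc).isoformat(),
            handler,
            action,
            exhibit_path,
            sha256.hexdigest(),
        ])

# Hypothetical usage: logging the hand-over of an acquired image to an examiner.
record_custody_event("custody_log.csv", "evidence/sdb.dd",
                     "Examiner A", "received from seizing officer")
```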
Attorneys have argued that, because digital evidence can theoretically be altered, it is unreliable. US judges are beginning to reject this theory; in the case of US v. Bonallo, the court ruled that "the fact that it is possible to alter data contained in a computer is plainly insufficient to establish untrustworthiness." In the United Kingdom, guidelines such as those issued by ACPO are followed to help document the authenticity and integrity of evidence.
Digital investigators, particularly in criminal investigations, have to ensure that conclusions are based upon factual evidence and their own expert knowledge. In the US, for example, Federal Rules of Evidence state that a qualified expert may testify “in the form of an opinion or otherwise” so long as:
The sub-branches of digital forensics may each have their own specific guidelines for the conduct of investigations and the handling of evidence. For example, mobile phones may be required to be placed in a Faraday shield during seizure or acquisition to prevent further radio traffic to the device. In the UK, forensic examination of computers in criminal matters is subject to ACPO guidelines. There are also international approaches to providing guidance on how to handle electronic evidence. The "Electronic Evidence Guide" by the Council of Europe offers a framework for law enforcement and judicial authorities in countries that seek to set up or enhance their own guidelines for the identification and handling of electronic evidence.
The admissibility of digital evidence relies on the tools used to extract it. In the US, forensic tools are subjected to the Daubert standard, where the judge is responsible for ensuring that the processes and software used were acceptable.
In a 2003 paper, Brian Carrier argued that the Daubert guidelines required the code of forensic tools to be published and peer reviewed. He concluded that "open source tools may more clearly and comprehensively meet the guideline requirements than would closed-source tools."
In 2011, Josh Brunty stated that the scientific validation of the technology and software associated with performing a digital forensic examination is critical to any laboratory process. He argued that "the science of digital forensics is founded on the principles of repeatable processes and quality evidence therefore knowing how to design and properly maintain a good validation process is a key requirement for any digital forensic examiner to defend their methods in court."
One of the key issues relating to validating forensic tools is determining a 'baseline' or reference point for tool testing/evaluation. There have been numerous attempts to provide an environment for testing the functionality of forensic tools, such as the Computer Forensic Tool Testing (CFTT) programme developed by NIST.[8]
To allow for the different environments in which practitioners operate, there have also been many attempts to create a framework for customizing test/evaluation environments.[9] [10] [11] These resources focus on a single target system or a limited number of them. However, they do not scale well when attempts are made to test/evaluate tools designed for large networks or the cloud, which have become more commonplace in investigations over the years. As of 2024, the only framework that addresses the use of remote agents by forensic tools for distributed processing/collection is that developed by Adams.[12]
Digital forensics investigation is not restricted to retrieving data from computers alone; small digital devices (e.g., tablets, smartphones, flash drives) are now extensively used in the commission of crimes. Some of these devices have volatile memory while others have non-volatile memory. Sufficient methodologies are available to retrieve data from volatile memory; however, there is a lack of detailed methodology or a framework for data retrieval from non-volatile memory sources.[13] Depending on the type of device, media, or artifact, digital forensics investigation is branched into various types.
See main article: Computer forensics.
The goal of computer forensics is to explain the current state of a digital artifact, such as a computer system, storage medium or electronic document.[14] The discipline usually covers computers, embedded systems (digital devices with rudimentary computing power and onboard memory) and static memory (such as USB pen drives).
Computer forensics can deal with a broad range of information, from logs (such as internet history) through to the actual files on the drive. In 2007, prosecutors used a spreadsheet recovered from the computer of Joseph Edward Duncan to show premeditation and secure the death penalty. Sharon Lopatka's killer was identified in 2006 after email messages from him detailing torture and death fantasies were found on her computer.
See main article: Mobile device forensics.
Mobile device forensics is a sub-branch of digital forensics relating to the recovery of digital evidence or data from a mobile device. It differs from computer forensics in that a mobile device will have an inbuilt communication system (e.g. GSM) and, usually, proprietary storage mechanisms. Investigations usually focus on simple data such as call data and communications (SMS/email) rather than in-depth recovery of deleted data. SMS data from a mobile device investigation helped to exonerate Patrick Lumumba in the murder of Meredith Kercher.
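The simple data referred to above can often be read directly from a file-system extraction with standard tools. The sketch below lists SMS messages from an extracted copy of an Android telephony database; the path and schema are assumptions based on a typical mmssms.db layout and vary by device and OS version, so this is illustrative rather than a universal recipe.

```python
import sqlite3
from datetime import datetime, timezone

# Hypothetical path to an SMS database recovered from a file-system extraction.
conn = sqlite3.connect(
    "extraction/data/com.android.providers.telephony/databases/mmssms.db")

# Column names assume a typical Android schema: address, date (ms since epoch), body, type.
for address, date_ms, body, msg_type in conn.execute(
        "SELECT address, date, body, type FROM sms ORDER BY date"):
    when = datetime.fromtimestamp(date_ms / 1000, tz=timezone.utc)
    direction = "sent" if msg_type == 2 else "received"
    print(f"{when:%Y-%m-%d %H:%M:%S} {direction} {address}: {body}")
```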
Mobile devices are also useful for providing location information, either from inbuilt GPS/location tracking or via cell site logs, which track the devices within their range. Such information was used to track down the kidnappers of Thomas Onofri in 2006.
See main article: Network forensics.
Network forensics is concerned with the monitoring and analysis of computer network traffic, both local and WAN/internet, for the purposes of information gathering, evidence collection, or intrusion detection.[15] Traffic is usually intercepted at the packet level and either stored for later analysis or filtered in real time. Unlike other areas of digital forensics, network data is often volatile and rarely logged, making the discipline often reactive.
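As a minimal illustration of packet-level interception (assuming the third-party scapy library, a hypothetical interface name, and sufficient privileges to capture traffic), packets can be sniffed, summarized, and written to a capture file for offline examination:

```python
# Requires the third-party scapy package and typically root/administrator privileges.
from scapy.all import sniff, wrpcap

# Capture 100 TCP packets on a hypothetical interface for later analysis.
packets = sniff(iface="eth0", filter="tcp", count=100)

for pkt in packets:
    print(pkt.summary())          # one-line description of each intercepted packet

wrpcap("capture.pcap", packets)   # store the traffic so it can be examined offline
```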
In 2000, the FBI lured computer hackers Aleksey Ivanov and Gorshkov to the United States for a fake job interview. By monitoring network traffic from the pair's computers, the FBI identified passwords allowing them to collect evidence directly from Russian-based computers.
See main article: Forensic data analysis.
Forensic data analysis is a branch of digital forensics that examines structured data with the aim of discovering and analyzing patterns of fraudulent activity resulting from financial crime.
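One widely used screening technique in this area is a first-digit (Benford's law) test, which flags sets of financial figures whose leading-digit distribution departs from the expected logarithmic pattern. A minimal sketch follows; the transaction amounts are placeholders, and a real analysis would apply statistical tests rather than eyeballing the frequencies.

```python
import math
from collections import Counter

def first_digit_distribution(amounts):
    """Compare observed leading-digit frequencies with Benford's expected values."""
    digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a]
    counts = Counter(digits)
    total = len(digits)
    for d in range(1, 10):
        observed = counts.get(d, 0) / total
        expected = math.log10(1 + 1 / d)
        print(f"digit {d}: observed {observed:.3f}  expected {expected:.3f}")

# Placeholder transaction amounts; in practice these come from the ledger under review.
first_digit_distribution([1042.50, 2318.00, 187.25, 1999.99, 3120.00, 118.40, 1423.00])
```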
Digital image forensics (or forensic image analysis) is a branch of digital forensics that deals with examination and verification of an image's authenticity and content.[16] These can range from Stalin-era airbrushed photos to elaborate deepfake videos.[17] [18] This has broad implications for a wide variety of crimes, for determining the validity of information presented in civil and criminal trials, and for verifying images and information that are circulated through news and social media.[17] [19] [20] [18]
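One simple technique from this area is error level analysis (ELA), which resaves a JPEG at a known quality and highlights regions that recompress differently, a possible sign of local editing. The sketch below uses the Pillow imaging library; the file names and quality setting are illustrative, and ELA output requires expert interpretation rather than being proof of manipulation on its own.

```python
from PIL import Image, ImageChops, ImageEnhance

def error_level_analysis(path, quality=90):
    """Resave the image as JPEG and amplify the per-pixel differences."""
    original = Image.open(path).convert("RGB")
    original.save("resaved.jpg", "JPEG", quality=quality)
    resaved = Image.open("resaved.jpg")
    diff = ImageChops.difference(original, resaved)
    # Scale up the (usually faint) differences so inconsistent regions stand out visually.
    max_diff = max(px for extrema in diff.getextrema() for px in extrema) or 1
    return ImageEnhance.Brightness(diff).enhance(255.0 / max_diff)

error_level_analysis("questioned_photo.jpg").save("ela_result.png")
```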
See main article: Database forensics.
Database forensics is a branch of digital forensics relating to the forensic study of databases and their metadata. Investigations use database contents, log files and in-RAM data to build a timeline or recover relevant information.
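A simple illustration of timeline building from database contents follows; the database file, tables, and columns are hypothetical, and real investigations also draw on transaction logs and in-memory data. Timestamps scattered across tables are merged into a single ordered sequence of events:

```python
import sqlite3

# Hypothetical evidence copy of a SQLite database, with hypothetical tables and columns.
conn = sqlite3.connect("evidence/app.db")

events = []
for table, time_col, detail_col in [
        ("logins",  "login_time", "username"),
        ("orders",  "created_at", "order_id"),
        ("deletes", "deleted_at", "record_id"),
]:
    query = f"SELECT {time_col}, {detail_col} FROM {table}"
    events += [(timestamp, table, detail) for timestamp, detail in conn.execute(query)]

# Merge all rows into a single chronological timeline.
for timestamp, source, detail in sorted(events):
    print(f"{timestamp}  {source:<8} {detail}")
```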
See main article: IoT Forensics. IoT forensics is a branch of digital forensics that aims to identify and extract digital information from devices belonging to the Internet of Things, for use as a potential source of evidence in forensic investigations.[21]