Stochastic forensics is a method of forensically reconstructing digital activity that lacks artifacts, by analyzing emergent properties resulting from the stochastic nature of modern computers.[1] [2] [3] Unlike traditional computer forensics, which relies on digital artifacts, stochastic forensics does not require artifacts and can therefore recreate activity which would otherwise be invisible.[3] Its chief application is the investigation of insider data theft.[1] [2] [4]
Stochastic forensics was invented in 2010 by computer scientist Jonathan Grier to detect and investigate insider data theft.[2] Insider data theft has been notoriously difficult to investigate using traditional methods, since it does not create any artifacts (such as changes to file attributes or the Windows Registry).[3] [5] Consequently, industry demanded a new investigative technique.
Since its invention, stochastic forensics has been used in real world investigation of insider data theft,[6] been the subject of academic research,[7] and met with industry demand for tools and training.[2] [8]
Stochastic forensics is inspired by the statistical mechanics method used in physics.[2] [6] Classical Newtonian mechanics calculates the exact position and momentum of every particle in a system. This works well for systems, such as the Solar System, which consist of a small number of objects, but it cannot be used to study something like a gas, which contains an intractably large number of molecules. Statistical mechanics, by contrast, does not attempt to track individual particles, but only the properties which emerge statistically. It can therefore analyze complex systems without needing to know the exact position of their individual particles.
Likewise, modern day computer systems have far too many possible states to be analyzed exactly. Stochastic forensics therefore examines the properties of a computer's activity which emerge statistically, rather than attempting to reconstruct every individual event.
Stochastic forensics' chief application is detecting and investigating insider data theft. Insider data theft is often done by someone who is technically authorized to access the data and who uses it regularly as part of their job. It therefore creates no artifacts and makes no changes to file attributes or the Windows Registry.[5] Consequently, unlike external computer attacks, which by their nature leave traces of the attack, insider data theft is practically invisible.[3]
However, the statistical distribution of filesystems' metadata is affected by such large scale copying. By analyzing this distribution, stochastic forensics is able to identify and examine such data theft. Typical filesystems have a heavy-tailed distribution of file access. Copying in bulk disturbs this pattern and is consequently detectable.[1] [2]
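As a rough illustration of this idea (it is not a tool or algorithm described in the cited sources), the following Python sketch scans a directory tree's last-access timestamps and flags time windows in which an unusually large share of files were read, the kind of spike a bulk copy would leave in an otherwise heavy-tailed access pattern. The window size and threshold are arbitrary assumptions chosen for demonstration.

```python
"""Illustrative sketch: flag possible bulk-copy events by finding time
windows in which an unusually large share of files were last accessed.
Window size and threshold are assumptions, not values from the sources."""
import os
import sys
from collections import Counter
from datetime import datetime, timezone

WINDOW_SECONDS = 3600        # bucket access times into one-hour windows (assumed)
SUSPICIOUS_FRACTION = 0.20   # flag a window touching >= 20% of files (assumed)

def access_times(root):
    """Yield the last-access timestamp (atime) of every regular file under root."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                yield os.stat(path).st_atime
            except OSError:
                continue  # unreadable or vanished file; skip it

def flag_bulk_access(root):
    """Return (window start, file count) pairs for suspicious access windows."""
    times = list(access_times(root))
    if not times:
        return []
    # Count how many files fall into each fixed-length access-time window.
    buckets = Counter(int(t // WINDOW_SECONDS) for t in times)
    total = len(times)
    flagged = []
    for bucket, count in sorted(buckets.items()):
        if count / total >= SUSPICIOUS_FRACTION:
            start = datetime.fromtimestamp(bucket * WINDOW_SECONDS, tz=timezone.utc)
            flagged.append((start, count))
    return flagged

if __name__ == "__main__":
    root = sys.argv[1] if len(sys.argv) > 1 else "."
    for start, count in flag_bulk_access(root):
        print(f"{count} files last accessed in the hour starting {start:%Y-%m-%d %H:%M} UTC")
```

In practice, an analyst would compare any such spike against known legitimate activity, since authorized operations that read many files (for example, backups or antivirus scans) can disturb the distribution in a similar way.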
Drawing on this, stochastic forensics has been used to successfully investigate insider data theft where other techniques have failed.[1] [2] [3] [6] Typically, after stochastic forensics has identified the data theft, follow-up using traditional forensic techniques is required.[6]
Stochastic forensics has been criticized as only providing evidence and indications of data theft, and not concrete proof. Indeed, it requires a practitioner to "think like Sherlock, not Aristotle." Certain authorized activities besides data theft may cause similar disturbances in statistical distributions.[1] [6]
Furthermore, many operating systems do not track access timestamps by default, which makes stochastic forensics not directly applicable to them. Research is underway on applying stochastic forensics to these operating systems as well as to databases.[2]
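For example, most Linux distributions mount filesystems with the relatime or noatime options, which restrict or suppress last-access updates. The following Linux-only sketch (an illustration assuming a readable /proc/mounts, not a procedure from the cited sources) lists mounts whose options limit the usefulness of access timestamps.

```python
"""Illustrative sketch: list Linux mounts whose options (noatime/relatime)
limit last-access timestamp tracking. Linux-specific; relies on /proc/mounts."""

def atime_limited_mounts(mounts_path="/proc/mounts"):
    """Return (mountpoint, fstype, options) for mounts with restricted atime."""
    limited = []
    with open(mounts_path) as f:
        for line in f:
            fields = line.split()
            if len(fields) < 4:
                continue
            _device, mountpoint, fstype, options = fields[:4]
            opts = options.split(",")
            if "noatime" in opts or "relatime" in opts:
                limited.append((mountpoint, fstype, options))
    return limited

if __name__ == "__main__":
    for mountpoint, fstype, options in atime_limited_mounts():
        print(f"{mountpoint} ({fstype}): {options}")
```

On Windows, NTFS last-access updates are likewise often disabled; the current setting can be checked with the command "fsutil behavior query disablelastaccess".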
Additionally, in its current state, stochastic forensics requires a trained forensic analyst to apply and evaluate. Guidance Software and others have called for the development of tools to automate stochastic forensics.[2]