Incremental computing explained

Incremental computing, also known as incremental computation, is a software feature which, whenever a piece of data changes, attempts to save time by only recomputing those outputs which depend on the changed data.[1] [2] [3] When incremental computing is successful, it can be significantly faster than computing new outputs naively. For example, a spreadsheet software package might use incremental computation in its recalculation features, to update only those cells containing formulas which depend (directly or indirectly) on the changed cells.

When a tool can automatically apply incremental computing to a variety of different pieces of code, that tool is an example of a program analysis tool for optimization.

Static versus dynamic

Incremental computing techniques can be broadly separated into two types of approaches:

Static approaches attempt to derive an incremental program from a conventional program P using, e.g., either manual design and refactoring, or automatic program transformations. These program transformations occur before any inputs or input changes are provided.

Dynamic approaches record information about executing program P on a particular input (I1) and use this information when the input changes (to I2) in order to update the output (from O1 to O2). The figure shows the relationship between program P, the change calculation function ΔP, which constitutes the core of the incremental program, and a pair of inputs and outputs, I1, O1 and I2, O2.
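A dynamic approach can be sketched with a small memoizing structure (the class and names below are hypothetical, chosen only to illustrate the idea): the run on input I1 records per-element results, and when the input changes to I2, only the entries that actually differ are recomputed to produce O2.

```python
# Hypothetical sketch of a dynamic approach: run P on input I1 while
# memoizing per-element sub-results, then reuse the memo table when the
# input changes to I2, recomputing only the entries that differ.

def expensive(x):
    return x * x  # stands in for a costly per-element computation

class IncrementalMap:
    """Maintains output = [expensive(x) for x in inputs] incrementally."""
    def __init__(self, inputs):
        self.inputs = list(inputs)
        self.memo = {i: expensive(x) for i, x in enumerate(self.inputs)}

    def update(self, index, new_value):
        # The change-calculation step: only the changed slot is redone.
        if self.inputs[index] != new_value:
            self.inputs[index] = new_value
            self.memo[index] = expensive(new_value)

    def output(self):
        return [self.memo[i] for i in range(len(self.inputs))]

m = IncrementalMap([1, 2, 3])   # I1 -> O1 == [1, 4, 9]
m.update(1, 5)                  # I2 differs from I1 only at index 1
# m.output() -> [1, 25, 9]; only expensive(5) was recomputed
```

Real systems such as self-adjusting computation[2] generalize this idea from a flat list of slots to arbitrary dynamic dependence traces.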

Specialized versus general-purpose approaches

Some approaches to incremental computing are specialized, while others are general-purpose. Specialized approaches require the programmer to explicitly specify the algorithms and data structures that will be used to preserve unchanged sub-calculations. General-purpose approaches, on the other hand, use language, compiler, or algorithmic techniques to give incremental behavior to otherwise non-incremental programs.[4]

Static methods

Program derivatives

Given a computation

C = f(x1, x2, ..., xn)

and a potential change xj := Δxj, we can insert code before the change occurs (the pre-derivative) and after the change (the post-derivative) to update the value of C faster than rerunning f. Paige has written down a list of rules for formal differentiation of programs in SUBSETL.[5]
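A minimal hand-written derivative, not one of Paige's formal rules, can illustrate the idea for f = sum: after x[j] changes, the post-derivative adjusts C by the difference instead of rerunning the whole sum.

```python
# Illustrative "program derivative" for f = sum. The function name
# apply_change is hypothetical; the point is that the maintenance code
# runs in O(1) where recomputing f from scratch would take O(n).

x = [3, 1, 4, 1, 5]
C = sum(x)                     # C = f(x1, ..., xn), computed once

def apply_change(j, delta):
    """Apply the change x[j] += delta and maintain C incrementally."""
    global C
    x[j] += delta              # the input change
    C += delta                 # post-derivative: update C directly

apply_change(2, 10)            # x becomes [3, 1, 14, 1, 5]
# C == 24 == sum(x), without ever rerunning sum over all of x
```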

View maintenance

In database systems such as DBToaster, views are defined with relational algebra. Incremental view maintenance statically analyzes relational algebra to create update rules that quickly maintain the view in the presence of small updates, such as insertion of a row.[6]
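The flavor of a derived update rule can be shown with a grouped-sum view. The rule below is written by hand purely for illustration; a system like DBToaster would derive it automatically from the view's relational-algebra definition.

```python
# Sketch of incremental view maintenance for the view
#   V = SELECT key, SUM(amount) FROM t GROUP BY key
# The update rule for "INSERT (key, amount)" touches exactly one group,
# so the materialized view is maintained in O(1) per insertion rather
# than being recomputed from the base table.

from collections import defaultdict

view = defaultdict(int)        # materialized view: key -> SUM(amount)

def on_insert(key, amount):
    # Delta rule derived (here, by hand) from the view definition.
    view[key] += amount

on_insert("a", 10)
on_insert("b", 5)
on_insert("a", 7)
# view == {"a": 17, "b": 5}
```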

Dynamic methods

Incremental computation can be achieved by building a dependency graph of all the data elements that may need to be recalculated, and their dependencies. The elements that need to be updated when a single element changes are given by the transitive closure of the dependency relation of the graph. In other words, if there is a path from the changed element to another element, the latter may be updated (depending on whether the change eventually reaches the element). The dependency graph may need to be updated as dependencies change, or as elements are added to, or removed from, the system. It is used internally by the implementation, and does not typically need to be displayed to the user.
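The transitive-closure step can be sketched directly. In this hypothetical example, the graph is stored as forward dependency edges, reverse edges are built from them, and a changed element's dependents are found by graph traversal.

```python
# Minimal dependency-graph sketch: when one element changes, the set of
# elements that may need recomputation is the transitive closure of the
# dependency relation, found here by depth-first traversal of reverse edges.

deps = {                       # element -> elements it depends on
    "b": ["a"],
    "c": ["b"],
    "d": ["a"],
}
rdeps = {}                     # reverse edges: element -> its dependents
for node, sources in deps.items():
    for s in sources:
        rdeps.setdefault(s, []).append(node)

def dependents_of(changed):
    """All elements reachable from `changed` via dependency edges."""
    seen, stack = set(), [changed]
    while stack:
        for d in rdeps.get(stack.pop(), []):
            if d not in seen:
                seen.add(d)
                stack.append(d)
    return seen

# Changing "a" may require updating b and d (direct) and c (via b):
# dependents_of("a") == {"b", "c", "d"}
```

In a real system each visited element would be re-evaluated and propagation could stop early when a recomputed value turns out to be unchanged.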

Capturing dependencies across all possible values can be avoided by identifying a subset of important values (e.g., aggregation results) across which dependencies can be tracked, and incrementally recomputing the remaining dependent variables. This balances the amount of dependency information to be tracked against the amount of recomputation to be performed upon input change.[7]
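A toy version of this trade-off (names and rules here are invented for illustration, not taken from GraphBolt) tracks dependencies only for one aggregate and recomputes cheap dependent values from it on demand.

```python
# Coarse-grained dependency tracking: only the aggregate `total` is
# maintained incrementally; dependent quantities such as the mean are
# recomputed from the aggregate instead of carrying their own
# fine-grained dependency edges.

values = {"x": 2, "y": 3, "z": 5}
total = sum(values.values())      # the one tracked intermediate value

def update(name, new):
    global total
    total += new - values[name]   # incremental maintenance of the aggregate
    values[name] = new

def mean():
    # Recomputed on demand from the tracked aggregate (cheap), rather
    # than tracked through dependencies of its own.
    return total / len(values)

update("y", 7)
# total == 14; mean() == 14 / 3
```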

Partial evaluation can be seen as a method for automating the simplest possible case of incremental computing, in which an attempt is made to divide program data into two categories: that which can vary based on the program's input, and that which cannot (and the smallest unit of change is simply "all the data that can vary"). Partial evaluation can be combined with other incremental computing techniques.
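A standard toy illustration is specializing exponentiation: the exponent is treated as data that cannot vary and is fixed up front, while the base remains ordinary input. (This closure-based sketch only mimics specialization; a real partial evaluator would emit residual code with the loop unrolled.)

```python
# Partial evaluation viewed as the simplest incremental split: the
# exponent n is "static" data fixed before any dynamic input arrives,
# and the returned residual function handles only the "dynamic" base.

def specialize_power(n):
    """Return a residual function computing x**n with n fixed."""
    def power_n(x):
        result = 1
        for _ in range(n):     # loop bound is static, known at
            result *= x        # specialization time
        return result
    return power_n

cube = specialize_power(3)     # specialization happens once
# cube(2) == 8, cube(5) == 125
```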

With cycles in the dependency graph, a single pass through the graph may not be sufficient to reach a fixed point. In some cases, complete reevaluation of a system is semantically equivalent to incremental evaluation, and may be more efficient in practice if not in theory.[8]
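Iterating to a fixed point can be sketched as repeated re-evaluation until nothing changes. The two mutually dependent cells and their update rules below are hypothetical, and the loop assumes the rules converge.

```python
# Fixed-point evaluation for a cyclic dependency: cells "a" and "b"
# depend on each other, so a single propagation pass is not enough.
# We reapply the update rules until the state stops changing.

def step(state):
    a, b = state["a"], state["b"]
    # Cyclic rules; min(..., 10) guarantees the iteration converges.
    return {"a": min(b + 1, 10), "b": min(a + 1, 10)}

state = {"a": 0, "b": 0}
while True:
    new = step(state)
    if new == state:           # fixed point reached
        break
    state = new
# state == {"a": 10, "b": 10}
```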

Existing systems

Compiler and language support

Frameworks and libraries

Applications

See also

Notes and References

  1. Magnus Carlsson. "Monads for incremental computing." Proceedings of the Seventh ACM SIGPLAN International Conference on Functional Programming, pp. 26–35. ACM, New York, 2002. doi:10.1145/581478.581482. ISBN 1-58113-487-8.
  2. Umut A. Acar. Self-Adjusting Computation. Ph.D. thesis, 2005.
  3. Camil Demetrescu, Irene Finocchi, Andrea Ribichini. "Reactive Imperative Programming with Dataflow Constraints." Proceedings of the 26th ACM International Conference on Object-Oriented Programming Systems Languages and Applications (OOPSLA 2011), pp. 407–426. ACM, 2011. doi:10.1145/2048066.2048100. ISBN 978-1-4503-0940-0. arXiv:1104.2293.
  4. Yan Chen, Joshua Dunfield, Matthew A. Hammer, Umut A. Acar. "Implicit self-adjusting computation for purely functional programs." ICFP '11, pp. 129–141. Archived 2016-10-30 at https://web.archive.org/web/20161030185650/http://repository.cmu.edu/cgi/viewcontent.cgi?article=3549&context=compsci. Retrieved 2018-03-12.
  5. Robert Paige. Formal Differentiation: A Program Synthesis Technique. UMI Research Press, 1981. ISBN 978-0-8357-1213-2.
  6. Yanif Ahmad, Oliver Kennedy, Christoph Koch, Milos Nikolic. "DBToaster: Higher-order Delta Processing for Dynamic, Frequently Fresh Views." Proceedings of the VLDB Endowment 5(10), pp. 968–979, 2012-06-01. doi:10.14778/2336664.2336670. ISSN 2150-8097. arXiv:1207.0137.
  7. Mugilan Mariappan, Keval Vora. "GraphBolt: Dependency-Driven Synchronous Processing of Streaming Graphs." European Conference on Computer Systems (EuroSys '19), pp. 25:1–25:16, 2019. doi:10.1145/3302424.3303974.
  8. Kimberley Burchett, Gregory H. Cooper, Shriram Krishnamurthi. "Lowering: A static optimization technique for transparent functional reactivity." ACM SIGPLAN Symposium on Partial Evaluation and Semantics-Based Program Manipulation, pp. 71–80, 2007. ISBN 978-1-59593-620-2.
  9. Matthew A. Hammer, Umut A. Acar, Yan Chen. "CEAL." Proceedings of the 2009 ACM SIGPLAN Conference on Programming Language Design and Implementation (PLDI '09), p. 25, 2009. doi:10.1145/1542476.1542480. ISBN 978-1-60558-392-1.
  10. Thomas Reps, Tim Teitelbaum. "The synthesizer generator." Proceedings of the First ACM SIGSOFT/SIGPLAN Software Engineering Symposium on Practical Software Development Environments (SDE 1), pp. 42–48, 1984. doi:10.1145/800020.808247. ISBN 978-0-89791-131-3.
  11. "Adapton: Programming Language Abstractions for Incremental Computation." adapton.org. Retrieved 2016-10-07.
  12. Diptikalyan Saha, C. R. Ramakrishnan. "Incremental Evaluation of Tabled Prolog: Beyond Pure Logic Programs." Practical Aspects of Declarative Languages, Lecture Notes in Computer Science 3819, pp. 215–229, 2005. doi:10.1007/11603023_15. ISBN 978-3-540-30947-5. ISSN 0302-9743.
  13. Matthew Hammer, Khoo Phang, Michael Hicks, Jeffrey Foster. "ADAPTON: Composable, Demand-Driven Incremental Computation." PLDI 2014.