Dijkstra's algorithm explained

Class: Search algorithm, greedy algorithm, dynamic programming[1]
Data structure: Graph; usually used with a priority queue or heap for optimization
Worst-case time: \Theta(|E|+|V|\log|V|) (with a Fibonacci heap; see Running time below)

Dijkstra's algorithm is an algorithm for finding the shortest paths between nodes in a weighted graph, which may represent, for example, a road network. It was conceived by computer scientist Edsger W. Dijkstra in 1956 and published three years later.[2] [3] [4] Dijkstra's algorithm finds the shortest path from a given source node to every other node.[5] It can be used to find the shortest path to a specific destination node, by terminating the algorithm after determining the shortest path to the destination node. For example, if the nodes of the graph represent cities, and the costs of edges represent the average distances between pairs of cities connected by a direct road, then Dijkstra's algorithm can be used to find the shortest route between one city and all other cities. A common application of shortest path algorithms is network routing protocols, most notably IS-IS (Intermediate System to Intermediate System) and OSPF (Open Shortest Path First). It is also employed as a subroutine in algorithms such as Johnson's algorithm.

The algorithm uses a min-priority queue data structure for selecting the shortest paths known so far. Before more advanced priority queue structures were discovered, Dijkstra's original algorithm ran in \Theta(|V|^2) time, where |V| is the number of nodes.[6] Fredman & Tarjan (1984) proposed a Fibonacci heap priority queue to optimize the running time complexity to \Theta(|E|+|V|\log|V|). This is asymptotically the fastest known single-source shortest-path algorithm for arbitrary directed graphs with unbounded non-negative weights. However, specialized cases (such as bounded/integer weights, directed acyclic graphs, etc.) can be improved further. If preprocessing is allowed, algorithms such as contraction hierarchies can be up to seven orders of magnitude faster.

Dijkstra's algorithm is commonly used on graphs where the edge weights are positive integers or real numbers. It can be generalized to any graph where the edge weights are partially ordered, provided the subsequent labels (a subsequent label is produced when traversing an edge) are monotonically non-decreasing.[7]

In many fields, particularly artificial intelligence, Dijkstra's algorithm or a variant offers a uniform cost search and is formulated as an instance of the more general idea of best-first search.

History

Dijkstra thought about the shortest path problem while working as a programmer at the Mathematical Center in Amsterdam in 1956. He wanted to demonstrate the capabilities of the new ARMAC computer.[8] His objective was to choose a problem and a computer solution that non-computing people could understand. He designed the shortest path algorithm and later implemented it for ARMAC for a slightly simplified transportation map of 64 cities in the Netherlands (he limited it to 64, so that 6 bits would be sufficient to encode the city number). A year later, he came across another problem advanced by hardware engineers working on the institute's next computer: minimize the amount of wire needed to connect the pins on the machine's back panel. As a solution, he re-discovered Prim's minimal spanning tree algorithm (known earlier to Jarník, and also rediscovered by Prim). Dijkstra published the algorithm in 1959, two years after Prim and 29 years after Jarník.[9] [10]

Algorithm

The algorithm requires a starting node and, for each node N, maintains a distance between the starting node and N. Dijkstra's algorithm starts with infinite distances and tries to improve them step by step (a runnable sketch in Python follows the list below):

  1. Create a set of all unvisited nodes: the unvisited set.
  2. Assign to every node a distance from start value: for the starting node, it is zero, and for all other nodes, it is infinity, since initially no path is known to these nodes. During execution, the distance of a node N is the length of the shortest path discovered so far between the starting node and N.[11]
  3. From the unvisited set, select the current node to be the one with the smallest (finite) distance; initially, this is the starting node (distance zero). If the unvisited set is empty, or contains only nodes with infinite distance (which are unreachable), then the algorithm terminates by skipping to step 6. If the only concern is the path to a target node, the algorithm terminates once the current node is the target node. Otherwise, the algorithm continues.
  4. For the current node, consider all of its unvisited neighbors and update their distances through the current node; compare the newly calculated distance to the one currently assigned to the neighbor and assign the smaller one to it. For example, if the current node A is marked with a distance of 6, and the edge connecting it with its neighbor B has length 2, then the distance to B through A is 6 + 2 = 8. If B was previously marked with a distance greater than 8, then update it to 8 (the path to B through A is shorter). Otherwise, keep its current distance (the path to B through A is not the shortest).
  5. After considering all of the current node's unvisited neighbors, the current node is removed from the unvisited set. Thus a visited node is never rechecked, which is correct because the distance recorded on the current node is minimal (as ensured in step 3), and thus final. Repeat from step 3.
  6. Once the loop exits (steps 3–5), every visited node contains its shortest distance from the starting node.
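
The numbered steps translate almost line-for-line into code. The following is a minimal Python sketch (an illustration added here, not part of the original article): it follows the steps literally, scanning the unvisited set for the minimum rather than using a priority queue, and it assumes a graph given as a dict mapping each node to a dict of neighbor-to-weight pairs. The example graph is likewise illustrative.

    import math

    def dijkstra(graph, start):
        """graph: dict mapping node -> {neighbor: weight}; weights must be non-negative."""
        dist = {node: math.inf for node in graph}   # step 2: all distances start at infinity
        dist[start] = 0                             # ... except the starting node
        prev = {node: None for node in graph}       # predecessor on the best known path
        unvisited = set(graph)                      # step 1: the unvisited set

        while unvisited:
            # Step 3: select the unvisited node with the smallest distance.
            current = min(unvisited, key=dist.get)
            if dist[current] == math.inf:
                break                               # only unreachable nodes remain
            # Step 4: relax the distances of the current node's unvisited neighbors.
            for neighbor, weight in graph[current].items():
                if neighbor in unvisited:
                    alt = dist[current] + weight
                    if alt < dist[neighbor]:
                        dist[neighbor] = alt
                        prev[neighbor] = current
            unvisited.remove(current)               # step 5: mark the current node visited

        return dist, prev                           # step 6: shortest distances from start

    # Example: six nodes, undirected edges listed in both directions.
    graph = {
        'a': {'b': 7, 'c': 9, 'f': 14},
        'b': {'a': 7, 'c': 10, 'd': 15},
        'c': {'a': 9, 'b': 10, 'd': 11, 'f': 2},
        'd': {'b': 15, 'c': 11, 'e': 6},
        'e': {'d': 6, 'f': 9},
        'f': {'a': 14, 'c': 2, 'e': 9},
    }
    dist, prev = dijkstra(graph, 'a')
    print(dist)  # {'a': 0, 'b': 7, 'c': 9, 'd': 20, 'e': 20, 'f': 11}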

Description

The shortest path between two intersections on a city map can be found by this algorithm using pencil and paper. Every intersection is listed on a separate line: one is the starting point and is labeled (given a distance of) 0. Every other intersection is initially labeled with a distance of infinity. This is done to note that no path to these intersections has yet been established. At each iteration one intersection becomes the current intersection. For the first iteration, this is the starting point.

From the current intersection, the distance to every neighbor (directly-connected) intersection is assessed by summing the label (value) of the current intersection and the distance to the neighbor and then relabeling the neighbor with the lesser of that sum and the neighbor's existing label. I.e., the neighbor is relabeled if the path to it through the current intersection is shorter than previously assessed paths. If so, mark the road to the neighbor with an arrow pointing to it, and erase any other arrow that points to it. After the distances to each of the current intersection's neighbors have been assessed, the current intersection is marked as visited. The unvisited intersection with the smallest label becomes the current intersection and the process repeats until all nodes with labels less than the destination's label have been visited.

Once no unvisited nodes remain with a label smaller than the destination's label, the remaining arrows show the shortest path.

Pseudocode

In the following pseudocode, dist is an array that contains the current distances from the source to other vertices, i.e. dist[u] is the current distance from the source to the vertex u. The prev array contains pointers to previous-hop nodes on the shortest path from source to the given vertex (equivalently, it is the next-hop on the path from the given vertex to the source). The code u ← vertex in Q with minimum dist[u] searches for the vertex u in the vertex set Q that has the least dist[u] value. Graph.Edges(u, v) returns the length of the edge joining (i.e. the distance between) the two neighbor-nodes u and v. The variable alt on line 14 is the length of the path from the source node to the neighbor node v if it were to go through u. If this path is shorter than the current shortest path recorded for v, then the distance of v is updated to alt.

     1  function Dijkstra(Graph, source):
     2
     3      for each vertex v in Graph.Vertices:
     4          dist[v] ← INFINITY
     5          prev[v] ← UNDEFINED
     6          add v to Q
     7      dist[source] ← 0
     8
     9      while Q is not empty:
    10          u ← vertex in Q with minimum dist[u]
    11          remove u from Q
    12
    13          for each neighbor v of u still in Q:
    14              alt ← dist[u] + Graph.Edges(u, v)
    15              if alt < dist[v]:
    16                  dist[v] ← alt
    17                  prev[v] ← u
    18
    19      return dist[], prev[]

To find the shortest path between vertices source and target, the search terminates after line 10 if u = target. The shortest path from source to target can be obtained by reverse iteration:

    S ← empty sequence
    u ← target
    if prev[u] is defined or u = source:      // Proceed if the vertex is reachable
        while u is defined:                   // Construct the shortest path with a stack S
            insert u at the beginning of S    // Push the vertex onto the stack
            u ← prev[u]                       // Traverse from target to source

Now sequence S is the list of vertices constituting one of the shortest paths from source to target, or the empty sequence if no path exists.
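
In Python, the same reverse iteration can be written against the dist/prev maps produced by the sketch in the Algorithm section above (again an added illustration, not original text):

    def shortest_path(prev, source, target):
        """Rebuild one shortest path from source to target out of the prev map."""
        if prev.get(target) is None and target != source:
            return []                 # target is unreachable from source
        path = []
        u = target
        while u is not None:          # walk predecessor links back to the source
            path.insert(0, u)         # push u onto the front of the sequence
            u = prev.get(u)
        return path

    # Continuing the earlier example: shortest_path(prev, 'a', 'e') -> ['a', 'c', 'f', 'e']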

A more general problem is to find all the shortest paths between source and target (there might be several of the same length). Then instead of storing only a single node in each entry of prev[], all nodes satisfying the relaxation condition can be stored. For example, if both r and source connect to target and they lie on different shortest paths through target (because the edge cost is the same in both cases), then both r and source are added to prev[target]. When the algorithm completes, the prev[] data structure describes a graph that is a subset of the original graph with some edges removed. Its key property is that if the algorithm was run with some starting node, then every path from that node to any other node in the new graph is the shortest path between those nodes in the original graph, and all paths of that length from the original graph are present in the new graph. Then to actually find all these shortest paths between two given nodes, a path finding algorithm on the new graph, such as depth-first search, would work.

Using a priority queue

A min-priority queue is an abstract data type that provides 3 basic operations: add_with_priority(), decrease_priority() and extract_min(). As mentioned earlier, using such a data structure can lead to faster computing times than using a basic queue. Notably, Fibonacci heap or Brodal queue offer optimal implementations for those 3 operations. As the algorithm is slightly different in appearance, it is mentioned here, in pseudocode as well:

    function Dijkstra(Graph, source):
        create vertex priority queue Q

        dist[source] ← 0                       // Initialization
        Q.add_with_priority(source, 0)         // associated priority equals dist[·]

        for each vertex v in Graph.Vertices:
            if v ≠ source
                prev[v] ← UNDEFINED            // Predecessor of v
                dist[v] ← INFINITY             // Unknown distance from source to v
                Q.add_with_priority(v, INFINITY)

        while Q is not empty:                  // The main loop
            u ← Q.extract_min()                // Remove and return best vertex
            for each neighbor v of u:          // Go through all v neighbors of u
                alt ← dist[u] + Graph.Edges(u, v)
                if alt < dist[v]:
                    prev[v] ← u
                    dist[v] ← alt
                    Q.decrease_priority(v, alt)

        return dist, prev

Instead of filling the priority queue with all nodes in the initialization phase, it is possible to initialize it to contain only source; then, inside the if alt < dist[v] block, the decrease_priority() becomes an add_with_priority() operation whenever the node is not already in the queue.

Yet another alternative is to add nodes unconditionally to the priority queue and to instead check after extraction (u ← Q.extract_min()) that it isn't revisiting, or that no shorter connection was found yet in the if alt < dist[v] block. This can be done by additionally extracting the associated priority p from the queue and only processing further if p == dist[u] inside the while Q is not empty loop.[12]
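
Python's heapq module is an array-based binary heap without a decrease-key operation, so it is a natural fit for both of these alternatives: start the queue with only the source, push a new entry on every improvement, and discard stale entries after extraction. A minimal sketch, added here for illustration and using the same assumed graph format as in the Algorithm section:

    import heapq
    import math

    def dijkstra_heap(graph, source):
        """Lazy-deletion Dijkstra: duplicate heap entries are skipped on extraction."""
        dist = {node: math.inf for node in graph}
        dist[source] = 0
        prev = {node: None for node in graph}
        heap = [(0, source)]                        # queue initially holds only the source
        while heap:
            p, u = heapq.heappop(heap)              # extract_min with its priority p
            if p != dist[u]:
                continue                            # stale entry: u was already settled
            for v, weight in graph[u].items():
                alt = dist[u] + weight
                if alt < dist[v]:
                    dist[v] = alt
                    prev[v] = u
                    heapq.heappush(heap, (alt, v))  # push instead of decrease-key
        return dist, prev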

These alternatives can use entirely array-based priority queues without decrease-key functionality, which have been found to achieve even faster computing times in practice. However, the difference in performance was found to be narrower for denser graphs.[13]

Proof

To prove the correctness of Dijkstra's algorithm, mathematical induction can be used on the number of visited nodes.

Invariant hypothesis: For each visited node v, dist[v] is the shortest distance from source to v, and for each unvisited node u, dist[u] is the shortest distance from source to u when traveling via visited nodes only, or infinity if no such path exists. (Note: we do not assume dist[u] is the actual shortest distance for unvisited nodes, while dist[v] is the actual shortest distance for visited ones.)
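
Stated symbolically, writing \delta(source, v) for the true shortest-path distance and Visited for the set of nodes visited so far (notation introduced here for brevity), the hypothesis reads:

    \forall v \in Visited:    dist[v] = \delta(source, v)
    \forall u \notin Visited: dist[u] = \min \{ \mathrm{len}(P) : P \text{ a } source\text{-}u \text{ path with all intermediate nodes in } Visited \}

where the minimum of the empty set is taken to be \infty.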

Base case

The base case is when there is just one visited node, source. Its distance dist[source] is defined to be zero, which is the shortest distance, since negative weights are not allowed. Hence, the hypothesis holds.

Induction

Assuming that the hypothesis holds for k visited nodes, to show it holds for k + 1 nodes, let u be the next visited node, i.e. the node with minimum dist[u]. The claim is that dist[u] is the shortest distance from source to u.

The proof is by contradiction. If a shorter path were available, then this shorter path either contains another unvisited node or it does not. In the first case, let w be the first unvisited node on that shorter path; by the induction hypothesis, dist[w] is at most the length of the path's prefix ending at w, which (edge weights being non-negative) is less than dist[u], contradicting the choice of u as the unvisited node with minimum distance. In the second case, the shorter path travels via visited nodes only, so by the induction hypothesis its length would already be reflected in dist[u], again a contradiction.

For all other visited nodes v, dist[v] is already known to be the shortest distance from source, because of the inductive hypothesis, and these values are unchanged.

After processing u, it is still true that for each unvisited node w, dist[w] is the shortest distance from source to w using visited nodes only: any shorter path that did not use u would already have been found, and if a shorter path used u, dist[w] would have been updated when processing u.

After all nodes are visited, the shortest path from source to any node v consists only of visited nodes. Therefore, dist[v] is the shortest distance.

Running time

Bounds of the running time of Dijkstra's algorithm on a graph with edges E and vertices V can be expressed as a function of the number of edges, denoted |E|, and the number of vertices, denoted |V|, using big-O notation. The complexity bound depends mainly on the data structure used to represent the set Q. In the following, upper bounds can be simplified because |E| is O(|V|^2) for any simple graph, but that simplification disregards the fact that in some problems, other upper bounds on |E| may hold.

For any data structure for the vertex set Q, the running time is \Theta(|E| \cdot T_{dk} + |V| \cdot T_{em}), where T_{dk} and T_{em} are the complexities of the decrease-key and extract-minimum operations in Q, respectively.

The simplest version of Dijkstra's algorithm stores the vertex set Q as a linked list or array, and edges as an adjacency list or matrix. In this case, extract-minimum is simply a linear search through all vertices in Q, so the running time is \Theta(|E| + |V|^2) = \Theta(|V|^2).

For sparse graphs, that is, graphs with far fewer than |V|^2 edges, Dijkstra's algorithm can be implemented more efficiently by storing the graph in the form of adjacency lists and using a self-balancing binary search tree, binary heap, pairing heap, Fibonacci heap or a priority heap as a priority queue to implement extracting minimum efficiently. To perform decrease-key steps in a binary heap efficiently, it is necessary to use an auxiliary data structure that maps each vertex to its position in the heap, and to update this structure as the priority queue changes. With a self-balancing binary search tree or binary heap, the algorithm requires \Theta((|E|+|V|)\log|V|) time in the worst case; for connected graphs this time bound can be simplified to \Theta(|E|\log|V|). The Fibonacci heap improves this to \Theta(|E|+|V|\log|V|).
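
To make the position-map idea concrete, here is a small indexed binary min-heap in Python (an illustrative sketch, not from the article): the pos dictionary maps each item to its index in the heap array, which is what lets decrease-key find and re-sift an item in O(log n) instead of searching the whole heap.

    class IndexedMinHeap:
        """Binary min-heap with a position map so decrease-key runs in O(log n)."""

        def __init__(self):
            self.heap = []    # list of (key, item) pairs in heap order
            self.pos = {}     # item -> index of its pair in self.heap

        def _swap(self, i, j):
            self.heap[i], self.heap[j] = self.heap[j], self.heap[i]
            self.pos[self.heap[i][1]] = i
            self.pos[self.heap[j][1]] = j

        def _sift_up(self, i):
            while i > 0:
                parent = (i - 1) // 2
                if self.heap[i][0] >= self.heap[parent][0]:
                    break
                self._swap(i, parent)
                i = parent

        def _sift_down(self, i):
            n = len(self.heap)
            while True:
                smallest = i
                for child in (2 * i + 1, 2 * i + 2):
                    if child < n and self.heap[child][0] < self.heap[smallest][0]:
                        smallest = child
                if smallest == i:
                    break
                self._swap(i, smallest)
                i = smallest

        def add_with_priority(self, item, key):
            self.heap.append((key, item))
            self.pos[item] = len(self.heap) - 1
            self._sift_up(len(self.heap) - 1)

        def decrease_priority(self, item, key):
            # The position map finds the item in O(1); the caller must only decrease keys.
            i = self.pos[item]
            self.heap[i] = (key, item)
            self._sift_up(i)

        def extract_min(self):
            # Assumes the heap is non-empty, as guaranteed by Dijkstra's main loop.
            min_key, min_item = self.heap[0]
            last = self.heap.pop()
            del self.pos[min_item]
            if self.heap:
                self.heap[0] = last
                self.pos[last[1]] = 0
                self._sift_down(0)
            return min_item

Used as the queue Q, these three methods correspond directly to the operations named in the priority-queue pseudocode above.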

When using binary heaps, the average case time complexity is lower than the worst case: assuming edge costs are drawn independently from a common probability distribution, the expected number of decrease-key operations is bounded by \Theta(|V|\log(|E|/|V|)), giving a total running time of O(|E| + |V|\log(|E|/|V|)\log|V|).

Practical optimizations and infinite graphs

In common presentations of Dijkstra's algorithm, initially all nodes are entered into the priority queue. This is, however, not necessary: the algorithm can start with a priority queue that contains only one item, and insert new items as they are discovered (instead of doing a decrease-key, check whether the key is in the queue; if it is, decrease its key, otherwise insert it). This variant has the same worst-case bounds as the common variant, but maintains a smaller priority queue in practice, speeding up queue operations.[14]

Moreover, not inserting all nodes in a graph makes it possible to extend the algorithm to find the shortest path from a single source to the closest of a set of target nodes on infinite graphs or those too large to represent in memory. The resulting algorithm is called uniform-cost search (UCS) in the artificial intelligence literature[15] [16] and can be expressed in pseudocode as follows:

    procedure uniform_cost_search(start) is
        node ← start
        frontier ← priority queue containing node only
        expanded ← empty set
        do
            if frontier is empty then
                return failure
            node ← frontier.pop()
            if node is a goal state then
                return solution(node)
            expanded.add(node)
            for each of node's neighbors n do
                if n is not in expanded and not in frontier then
                    frontier.add(n)
                else if n is in frontier with higher cost
                    replace existing node with n

Its complexity can be expressed in an alternative way for very large graphs: when C* is the length of the shortest path from the start node to any node satisfying the "goal" predicate, each edge has cost at least \varepsilon, and the number of neighbors per node is bounded by b, then the algorithm's worst-case time and space complexity are both in O(b^{1+\lfloor C^*/\varepsilon \rfloor}).

Further optimizations for the single-target case include bidirectional variants, goal-directed variants such as the A* algorithm, graph pruning to determine which nodes are likely to form the middle segment of shortest paths (reach-based routing), and hierarchical decompositions of the input graph that reduce routing to connecting the source and target to their respective "transit nodes" followed by shortest-path computation between these transit nodes using a "highway".[17] Combinations of such techniques may be needed for optimal practical performance on specific problems.[18]

Optimality for comparison-sorting by distance

As well as simply computing distances and paths, Dijkstra's algorithm can be used to sort vertices by their distances from a given starting vertex. In 2023, Haeupler, Rozhoň, Tětek, Hladík, and Tarjan (one of the inventors of the 1984 Fibonacci heap) proved that, for this sorting problem on a positively weighted directed graph, a version of Dijkstra's algorithm with a special heap data structure has a runtime and number of comparisons that is within a constant factor of optimal among comparison-based algorithms for the same sorting problem on the same graph and starting vertex but with variable edge weights. To achieve this, they use a comparison-based heap whose cost of returning/removing the minimum element from the heap is logarithmic in the number of elements inserted after it rather than in the number of elements in the heap.[19] [20]

Specialized variants

When arc weights are small integers (bounded by a parameter C), specialized queues can be used for increased speed. The first algorithm of this type was Dial's algorithm for graphs with positive integer edge weights, which uses a bucket queue to obtain a running time O(|E|+|V|C). The use of a Van Emde Boas tree as the priority queue brings the complexity to O(|E|+|V|\log C/\log\log|V|C). Another interesting variant based on a combination of a new radix heap and the well-known Fibonacci heap runs in time O(|E|+|V|\sqrt{\log C}). Finally, the best algorithms in this special case run in O(|E|\log\log|V|) time and O(|E|+|V|\min\{(\log|V|)^{1/3+\varepsilon},(\log C)^{1/4+\varepsilon}\}) time.
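
As an illustration of the bucket-queue idea behind Dial's algorithm, here is a short Python sketch (added here, reusing the dict-of-dicts graph format from earlier sketches). Real implementations use a circular array of only C + 1 buckets; for simplicity this sketch allocates one bucket per possible tentative distance.

    import math

    def dial(graph, source, C):
        """Dial's algorithm for positive integer edge weights in 1..C."""
        dist = {node: math.inf for node in graph}
        dist[source] = 0
        # Tentative distances never exceed C * |V| for weights in 1..C.
        buckets = [[] for _ in range(C * len(graph) + 1)]
        buckets[0].append(source)
        for d, bucket in enumerate(buckets):    # scan buckets in increasing distance
            for u in bucket:
                if d != dist[u]:
                    continue                    # stale entry: u settled at a smaller d
                for v, w in graph[u].items():
                    # w >= 1, so new entries always land in a later bucket.
                    if d + w < dist[v]:
                        dist[v] = d + w
                        buckets[d + w].append(v)
        return dist

    # With the example graph from the Algorithm section: dial(graph, 'a', 15)
    # reproduces the same distances as dijkstra(graph, 'a').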

Related problems and algorithms

Dijkstra's original algorithm can be extended with modifications. For example, sometimes it is desirable to present solutions which are less than mathematically optimal. To obtain a ranked list of less-than-optimal solutions, the optimal solution is first calculated. A single edge appearing in the optimal solution is removed from the graph, and the optimum solution to this new graph is calculated. Each edge of the original solution is suppressed in turn and a new shortest path is calculated. The secondary solutions are then ranked and presented after the first optimal solution.
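
A sketch of this edge-suppression scheme in Python, reusing the dijkstra_heap and shortest_path helpers from the pseudocode sections above (illustrative only; for an undirected graph stored with both edge directions, both directions would need to be removed):

    import copy

    def ranked_paths(graph, source, target):
        """Optimal path first, then alternatives found by suppressing each of its edges."""
        dist, prev = dijkstra_heap(graph, source)
        best = shortest_path(prev, source, target)
        alternatives = []
        for u, v in zip(best, best[1:]):        # each edge (u, v) on the optimal path
            reduced = copy.deepcopy(graph)
            del reduced[u][v]                   # suppress this one edge
            d2, p2 = dijkstra_heap(reduced, source)
            path = shortest_path(p2, source, target)
            if path:
                alternatives.append((d2[target], path))
        alternatives.sort()                     # rank secondary solutions by length
        return [best] + [p for _, p in alternatives]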

Dijkstra's algorithm is usually the working principle behind link-state routing protocols. OSPF and IS-IS are the most common.

Unlike Dijkstra's algorithm, the Bellman–Ford algorithm can be used on graphs with negative edge weights, as long as the graph contains no negative cycle reachable from the source vertex s. The presence of such cycles means that no shortest path can be found, since the label becomes lower each time the cycle is traversed. (This statement assumes that a "path" is allowed to repeat vertices. In graph theory that is normally not allowed. In theoretical computer science it often is allowed.) It is possible to adapt Dijkstra's algorithm to handle negative weights by combining it with the Bellman–Ford algorithm (to remove negative edges and detect negative cycles): Johnson's algorithm.

The A* algorithm is a generalization of Dijkstra's algorithm that reduces the size of the subgraph that must be explored, if additional information is available that provides a lower bound on the distance to the target.

The process that underlies Dijkstra's algorithm is similar to the greedy process used in Prim's algorithm. Prim's purpose is to find a minimum spanning tree that connects all nodes in the graph; Dijkstra is concerned with only two nodes. Prim's does not evaluate the total weight of the path from the starting node, only the individual edges.

Breadth-first search can be viewed as a special case of Dijkstra's algorithm on unweighted graphs, where the priority queue degenerates into a FIFO queue.

The fast marching method can be viewed as a continuous version of Dijkstra's algorithm which computes the geodesic distance on a triangle mesh.

Dynamic programming perspective

From a dynamic programming point of view, Dijkstra's algorithm is a successive approximation scheme that solves the dynamic programming functional equation for the shortest path problem by the Reaching method.[21] [22] [23]

In fact, Dijkstra's explanation of the logic behind the algorithm is a paraphrasing of Bellman's Principle of Optimality in the context of the shortest path problem.

Notes

  1. Controversial; see Sniedovich, Moshe (2006). "Dijkstra's algorithm revisited: the dynamic programming connexion". Control and Cybernetics. 35: 599–620, and the Dynamic programming perspective section below.
  2. Richards, Hamilton. "Edsger Wybe Dijkstra". A.M. Turing Award. Association for Computing Machinery. Retrieved 16 October 2017. "At the Mathematical Centre a major project was building the ARMAC computer. For its official inauguration in 1956, Dijkstra devised a program to solve a problem interesting to a nontechnical audience: Given a network of roads connecting cities, what is the shortest route between two designated cities?"
  3. Frana, Phil (August 2010). "An Interview with Edsger W. Dijkstra". Communications of the ACM. 53 (8): 41–47. doi:10.1145/1787234.1787249.
  4. Dijkstra, E. W. (1959). "A note on two problems in connexion with graphs". Numerische Mathematik. 1: 269–271. doi:10.1007/BF01386390.
  5. Mehlhorn, Kurt; Sanders, Peter (2008). "Chapter 10. Shortest Paths". Algorithms and Data Structures: The Basic Toolbox. Springer. doi:10.1007/978-3-540-77978-0. ISBN 978-3-540-77977-3. http://people.mpi-inf.mpg.de/~mehlhorn/ftp/Toolbox/ShortestPaths.pdf
  6. Schrijver, Alexander (2012). "On the history of the shortest path problem". Optimization Stories. Documenta Mathematica Series. 6: 155–167. doi:10.4171/dms/6/19. ISBN 978-3-936609-58-5. http://ftp.gwdg.de/pub/misc/EMIS/journals/DMJDMV/vol-ismp/32_schrijver-alexander-sp.pdf
  7. Szcześniak, Ireneusz; Jajszczyk, Andrzej; Woźna-Szcześniak, Bożena (2019). "Generic Dijkstra for optical networks". Journal of Optical Communications and Networking. 11 (11): 568–577. arXiv:1810.04481. doi:10.1364/JOCN.11.000568.
  8. "ARMAC". Unsung Heroes in Dutch Computing History. 2007. Archived from the original on 13 November 2013 at https://web.archive.org/web/20131113021126/http://www-set.win.tue.nl/UnsungHeroes/machines/armac.html.
  9. Prim, R. C. (1957). "Shortest connection networks and some generalizations". Bell System Technical Journal. 36 (6): 1389–1401. doi:10.1002/j.1538-7305.1957.tb01515.x. Archived from the original on 18 July 2017 at https://web.archive.org/web/20170718230207/http://bioinfo.ict.ac.cn/~dbu/AlgorithmCourses/Lectures/Prim1957.pdf.
  10. Jarník, V. (1930). "O jistém problému minimálním" [About a certain minimal problem]. Práce Moravské Přírodovědecké Společnosti. 6: 57–63. (in Czech)
  11. Fu, Michael C. (2013). "Dijkstra's Algorithm". In Gass, Saul I.; Fu, Michael C. (eds.). Encyclopedia of Operations Research and Management Science. Springer. doi:10.1007/978-1-4419-1153-7. ISBN 978-1-4419-1137-7.
  12. Observe that p < dist[u] cannot ever hold because of the update dist[v] ← alt when updating the queue. See https://cs.stackexchange.com/questions/118388/dijkstra-without-decrease-key for discussion.
  13. Chen, M.; Chowdhury, R. A.; Ramachandran, V.; Roche, D. L.; Tong, L. (2007). Priority Queues and Dijkstra's Algorithm – UTCS Technical Report TR-07-54 – 12 October 2007. Austin, Texas: The University of Texas at Austin, Department of Computer Sciences.
  14. Felner, Ariel (2011). "Position Paper: Dijkstra's Algorithm versus Uniform Cost Search or a Case Against Dijkstra's Algorithm". Proc. 4th Int'l Symp. on Combinatorial Search. Archived from the original on 18 February 2020 at https://web.archive.org/web/20200218150924/https://www.aaai.org/ocs/index.php/SOCS/SOCS11/paper/view/4017/4357. In a route-finding problem, Felner finds that the queue can be a factor 500–600 smaller, taking some 40% of the running time.
  15. pp. 75, 81.
  16. Sometimes also least-cost-first search: Nau, Dana S. (1983). "Expert computer systems". Computer. IEEE. 16 (2): 63–85. doi:10.1109/mc.1983.1654302.
  17. Wagner, Dorothea; Willhalm, Thomas (2007). "Speed-up techniques for shortest-path computations". STACS: 23–36.
  18. Bauer, Reinhard; Delling, Daniel; Sanders, Peter; Schieferdecker, Dennis; Schultes, Dominik; Wagner, Dorothea (2010). "Combining hierarchical and goal-directed speed-up techniques for Dijkstra's algorithm". J. Experimental Algorithmics. 15: 2.1. doi:10.1145/1671970.1671976.
  19. Haeupler, Bernhard; Hladík, Richard; Rozhoň, Václav; Tarjan, Robert; Tětek, Jakub (2024). "Universal Optimality of Dijkstra via Beyond-Worst-Case Heaps". arXiv:2311.11793 [cs.DS].
  20. Brubaker, Ben (25 October 2024). "Computer Scientists Establish the Best Way to Traverse a Graph". Quanta Magazine.
  21. Sniedovich, M. (2006). "Dijkstra's algorithm revisited: the dynamic programming connexion". Journal of Control and Cybernetics. 35 (3): 599–620. Online version of the paper with interactive computational modules.
  22. Denardo, E. V. (2003). Dynamic Programming: Models and Applications. Mineola, NY: Dover. ISBN 978-0-486-42810-9.
  23. Sniedovich, M. (2010). Dynamic Programming: Foundations and Principles. ISBN 978-0-8247-4099-3.
