Beam search explained

In computer science, beam search is a heuristic search algorithm that explores a graph by expanding the most promising nodes in a limited set. Beam search is a modification of best-first search that reduces its memory requirements. Best-first search is a graph search that orders all partial solutions (states) according to some heuristic. In beam search, however, only a predetermined number of best partial solutions are kept as candidates.[1] It is thus a greedy algorithm.

Details

Beam search uses breadth-first search to build its search tree. At each level of the tree, it generates all successors of the states at the current level, sorting them in increasing order of heuristic cost.[2] However, it only stores a predetermined number, β, of best states at each level (called the beam width). Only those states are expanded next. The greater the beam width, the fewer states are pruned. With an infinite beam width, no states are pruned and beam search is identical to best-first search.[3] Conversely, a beam width of 1 corresponds to a hill-climbing algorithm. The beam width bounds the memory required to perform the search. Since a goal state could potentially be pruned, beam search sacrifices completeness (the guarantee that an algorithm will terminate with a solution, if one exists). Beam search is not optimal; that is, there is no guarantee that it will find the best solution.
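A minimal Python sketch of this procedure is given below. The `successors`, `heuristic`, and `is_goal` callables are assumptions made for illustration, not a fixed interface; the essential point is that only the β lowest-cost states survive at each level.

```python
import heapq

def beam_search(start, successors, heuristic, is_goal, beam_width):
    # Assumed helpers (illustrative only): successors(state) yields children,
    # heuristic(state) estimates cost (lower is better), is_goal(state) tests
    # whether a state is a goal.
    beam = [start]                        # states kept at the current level
    while beam:
        for state in beam:
            if is_goal(state):
                return state
        # Generate all successors of every state in the current beam.
        candidates = [child for state in beam for child in successors(state)]
        # Keep only the beam_width best states (lowest heuristic cost);
        # everything else is pruned, which is why completeness is sacrificed.
        beam = heapq.nsmallest(beam_width, candidates, key=heuristic)
    return None                           # goal pruned or no states left
```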

Uses

Beam search is most often used to maintain tractability in large systems that lack sufficient memory to store the entire search tree. For example, it has been used in many machine translation systems.[4] (The state of the art now primarily uses neural machine translation methods, especially large language models.) To select the best translation, each part is processed, and many different ways of translating the words appear. The top translations according to their sentence structures are kept, and the rest are discarded. The translator then evaluates the translations according to a given criterion, choosing the translation that best meets the goals.
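The sketch below illustrates how beam search can be used for this kind of decoding, keeping only the highest-scoring partial translations at each step. The scoring function `log_prob(prefix, token)` is a hypothetical assumption (returning the log-probability of appending `token` to the partial output `prefix`); this is an illustration of the idea, not the method of any particular system.

```python
def beam_search_decode(vocab, log_prob, beam_width, max_len, eos="</s>"):
    # Each hypothesis is (partial output, cumulative log-probability).
    beams = [([], 0.0)]
    for _ in range(max_len):
        candidates = []
        for prefix, score in beams:
            if prefix and prefix[-1] == eos:       # finished hypothesis
                candidates.append((prefix, score))
                continue
            for token in vocab:                    # expand with every token
                candidates.append((prefix + [token],
                                   score + log_prob(prefix, token)))
        # Keep only the beam_width highest-scoring partial translations.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return max(beams, key=lambda c: c[1])[0]       # best hypothesis found
```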

History

The Harpy Speech Recognition System (introduced in a 1976 dissertation[5]) was the first use of what would become known as beam search.[6] While the procedure was originally referred to as the "locus model of search", the term "beam search" was already in use by 1977.[7]

Variants

Beam search has been made complete by combining it with depth-first search, resulting in beam stack search[8] and depth-first beam search, and with limited discrepancy search,[9] resulting in beam search using limited discrepancy backtracking (BULB). The resulting search algorithms are anytime algorithms that find good but likely sub-optimal solutions quickly, like beam search, then backtrack and continue to find improved solutions until convergence to an optimal solution.

In the context of local search, local beam search is a specific algorithm that begins by selecting β randomly generated states and then, at each level of the search tree, considers β new states among all the possible successors of the current ones, until it reaches a goal.[10] [11]
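A minimal sketch of local beam search under these assumptions (hypothetical `random_state`, `successors`, `value`, and `is_goal` helpers) might look like this:

```python
def local_beam_search(random_state, successors, value, is_goal, beta, max_iters=1000):
    # Assumed helpers (illustrative only): random_state() draws a random state,
    # successors(s) yields neighbours, value(s) scores a state (higher is
    # better), is_goal(s) tests for a goal.
    beam = [random_state() for _ in range(beta)]   # beta random initial states
    for _ in range(max_iters):
        for s in beam:
            if is_goal(s):
                return s
        # Pool the successors of all current states, then keep the beta best.
        pool = [child for s in beam for child in successors(s)]
        if not pool:
            break
        beam = sorted(pool, key=value, reverse=True)[:beta]
    return max(beam, key=value)            # best state found if no goal reached
```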

Since local beam search often ends up on local maxima, a common solution is to choose the next β states in a random way, with a probability dependent on the heuristic evaluation of the states. This kind of search is called stochastic beam search.[12]
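The selection step of the local beam search sketch above could be made stochastic along the following lines. The softmax weighting is only one possible choice and is an assumption here; the text only requires the selection probability to depend on the heuristic evaluation.

```python
import math
import random

def stochastic_selection(candidates, value, beta):
    # Instead of keeping the beta best successors, draw beta of them at random
    # with probabilities proportional to exp(value(s)) (softmax weighting,
    # chosen here purely for illustration).
    weights = [math.exp(value(s)) for s in candidates]
    return random.choices(candidates, weights=weights, k=beta)
```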

Other variants are flexible beam search and recovery beam search.[11]

Notes and References

  1. "Beam search". Free On-line Dictionary of Computing. Retrieved 2024-03-27.
  2. "BRITISH MUSEUM SEARCH". bradley.bradley.edu. Retrieved 2016-04-11.
  3. Norvig, Peter (1992). Paradigms of Artificial Intelligence Programming: Case Studies in Common LISP. Morgan Kaufmann. p. 196. ISBN 9781558601918.
  4. Tillmann, C.; Ney, H. (2003). "Word reordering and a dynamic programming beam search algorithm for statistical machine translation". Computational Linguistics. 29 (1): 97–133. doi:10.1162/089120103321337458.
  5. Lowerre, Bruce T. (1976). The Harpy Speech Recognition System (PhD dissertation). Carnegie Mellon University.
  6. Ow, Peng Si; Morton, Thomas E. (1988). "Filtered beam search in scheduling". International Journal of Production Research. 26 (1): 35–62. doi:10.1080/00207548808947840.
  7. Defense Technical Information Center (1977-08-01). DTIC ADA049288: Speech Understanding Systems. Summary of Results of the Five-Year Research Effort at Carnegie-Mellon University. p. 6.
  8. Zhou, Rong; Hansen, Eric (2005). "Beam-Stack Search: Integrating Backtracking with Beam Search". ICAPS. pp. 90–98. http://www.aaai.org/Library/ICAPS/2005/icaps05-010.php (dead link; archived 2021-04-20 at https://web.archive.org/web/20210420205518/http://www.aaai.org/Library/ICAPS/2005/icaps05-010.php).
  9. Furcy, D.; Koenig, S. (2005). "Limited discrepancy beam search". Proceedings of the 19th International Joint Conference on Artificial Intelligence. Morgan Kaufmann. pp. 125–131. https://dl.acm.org/doi/abs/10.5555/1642293.1642313
  10. Lazebnik, Svetlana. "Local search algorithms". University of North Carolina at Chapel Hill, Department of Computer Science. p. 15. Archived 2011-07-05 at https://web.archive.org/web/20110705070334/http://www.cs.unc.edu/~lazebnik/fall10/lec06_local_search.pdf
  11. Bhattacharyya, Pushpak. "Beam Search". Indian Institute of Technology Bombay, Department of Computer Science and Engineering (CSE). pp. 39–40. Archived 2018-11-21 at https://web.archive.org/web/20181121123057/https://www.cse.iitb.ac.in/~cs344/2011/slides/cs344-beam-search-2feb11.pptx
  12. Parker, James (2017-09-28). "Local Search". University of Minnesota. p. 17. Archived 2017-10-13 at https://web.archive.org/web/20171013150401/http://www-users.cselabs.umn.edu/classes/Fall-2017/csci4511/slides/week4/9.28.17.pdf