Code review

Code review (sometimes referred to as peer review) is a software quality assurance activity in which one or more people check a program, mainly by viewing and reading parts of its source code, either after implementation or as an interruption of implementation. At least one of the persons must not have authored the code. The persons performing the checking, excluding the author, are called "reviewers".[1] [2]

Although direct discovery of quality problems is often the main goal,[3] code reviews are usually performed to reach a combination of goals, such as improving code quality, finding defects, and transferring knowledge among developers.[4]

This definition of code review distinguishes it from related software quality assurance techniques such as static code analysis, self-checks, testing, and pair programming: in static code analysis the main checking is performed by an automated program; in self-checks only the author checks the code; in testing the execution of the code is an integral part; and pair programming happens continuously during implementation rather than as a separate step.[1]

Review types

There are many variations of code review processes, some of which are detailed below. Additional review types are part of IEEE 1028.

IEEE 1028-2008 lists the following review types: management reviews, technical reviews, inspections, walk-throughs, and audits.[5]

Inspection (formal)

Historically, the first code review process that was studied and described in detail was called "Inspection" by its inventor, Michael Fagan.[6] This Fagan inspection is a formal process with multiple participants and multiple phases, carried out carefully and in detail. Formal code reviews are the traditional method of review, in which software developers attend a series of meetings and review code line by line, usually using printed copies of the material. Formal inspections are extremely thorough and have been proven effective at finding defects in the code under review.[6]

Regular change-based code review (Walk-throughs)

In recent years, many industry teams have introduced a more lightweight type of code review in which the scope of each review is based on the changes to the codebase made in a ticket, user story, commit, or some other unit of work.[7] Furthermore, rules or conventions embed the review task into the development process (e.g., "every ticket must be reviewed"), commonly as part of a pull request, instead of explicitly planning each review. Such a review process is called "regular, change-based code review".[1] There are many variations of this basic process. A 2017 survey of 240 development teams found that 90% of the teams use a review process that is based on changes (if they use reviews at all), and 60% use regular, change-based code review.[3] Most large software companies, such as Microsoft,[8] Google,[9] and Facebook, follow a change-based code review process.
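To make the notion of review scope concrete, the hypothetical Python sketch below derives the scope of a change-based review from a single commit by asking git which files it touched. It assumes git is installed and the script runs inside a repository; the function name and the use of HEAD are illustrative, not part of any cited process.

```python
import subprocess

def changed_files(commit: str, repo: str = ".") -> list[str]:
    """Return the files touched by `commit` -- the scope that a
    change-based review of that unit of work would cover."""
    result = subprocess.run(
        ["git", "-C", repo, "show", "--name-only", "--pretty=format:", commit],
        capture_output=True, text=True, check=True,
    )
    return [line for line in result.stdout.splitlines() if line.strip()]

if __name__ == "__main__":
    # "HEAD" stands in for the ticket or commit under review.
    for path in changed_files("HEAD"):
        print(path)
```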

Efficiency and effectiveness of reviews

Capers Jones' ongoing analysis of over 12,000 software development projects showed that the latent defect discovery rate of formal inspection is in the 60-65% range. For informal inspection, the figure is less than 50%. The latent defect discovery rate for most forms of testing is about 30%.[10] [11] A code review case study published in the book Best Kept Secrets of Peer Code Review contradicted the Capers Jones study, finding that lightweight reviews can uncover as many bugs as formal reviews but were faster and more cost-effective.[12]
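As an illustrative calculation (the combination assumes the stages act independently; it is not a result from the cited studies), successive removal stages compound: if formal inspection removes about 60% of latent defects and testing then removes about 30% of those that remain, the pair removes roughly 72% overall.

```python
def combined_removal(*rates: float) -> float:
    """Overall defect-removal efficiency of successive stages, each
    catching a fraction `rate` of the defects that survive the
    previous stages (assumes the stages act independently)."""
    remaining = 1.0
    for rate in rates:
        remaining *= 1.0 - rate
    return 1.0 - remaining

# Figures from the text: inspection ~60%, testing ~30%.
print(f"{combined_removal(0.60, 0.30):.0%}")  # -> 72%
```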

Studies have found that up to 75% of code review comments address software evolvability/maintainability rather than functionality,[13] [14] [15] [16] suggesting that code reviews are an excellent tool for software companies with long product or system life cycles.[17] Consistent with this, one study found that less than 15% of the issues discussed in code reviews are related to bugs.[18]

Guidelines

The effectiveness of code review has been found to depend on the speed of reviewing. Code review rates should be between 200 and 400 lines of code per hour.[19] [20] [21] [22] Inspecting and reviewing more than a few hundred lines of code per hour for critical software (such as safety-critical embedded software) may be too fast to find errors.[23]
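As a back-of-the-envelope illustration of this guideline (the 1,000-line change is a made-up example), the snippet below turns the recommended rates into a review-time estimate:

```python
def review_hours(loc: int, rate_low: int = 200, rate_high: int = 400) -> tuple[float, float]:
    """Estimated review time range in hours for `loc` lines of code
    at the guideline rate of 200-400 lines per hour."""
    return loc / rate_high, loc / rate_low

# A hypothetical 1,000-line change calls for 2.5 to 5 hours of review.
fastest, slowest = review_hours(1000)
print(f"{fastest:.1f} to {slowest:.1f} hours")
```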

Supporting tools

Static code analysis software reduces the burden on the developer of reviewing large chunks of code by systematically checking source code for known vulnerabilities and defect types.[24] A 2012 study by VDC Research reported that 17.6% of the embedded software engineers surveyed were using automated tools to support peer code review and 23.7% expected to use them within two years.[25]
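To illustrate the kind of mechanical check such tools take off the reviewer's plate, here is a minimal sketch (a toy example, not any particular product) that uses Python's ast module to flag one well-known defect type, mutable default arguments:

```python
import ast

SOURCE = """
def append_item(item, bucket=[]):   # defect: mutable default argument
    bucket.append(item)
    return bucket
"""

def find_mutable_defaults(source: str) -> list[str]:
    """Flag function parameters whose default value is a mutable
    literal (list, dict, or set) -- a classic Python defect type."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            for default in node.args.defaults + node.args.kw_defaults:
                if isinstance(default, (ast.List, ast.Dict, ast.Set)):
                    findings.append(
                        f"line {default.lineno}: mutable default in {node.name}()"
                    )
    return findings

for finding in find_mutable_defaults(SOURCE):
    print(finding)
```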

Notes and References

  1. Baum, Tobias; Liskin, Olga; Niklas, Kai; Schneider, Kurt (2016). "A Faceted Classification Scheme for Change-Based Industrial Code Review Processes." In 2016 IEEE International Conference on Software Quality, Reliability and Security (QRS), pp. 74–85. doi:10.1109/QRS.2016.19. ISBN 978-1-5090-4127-5.
  2. Kolawa, Adam; Huizinga, Dorota (2007). Automated Defect Prevention: Best Practices in Software Management. Wiley-IEEE Computer Society Press, p. 260. ISBN 978-0-470-04212-0.
  3. Baum, Tobias; Leßmann, Hendrik; Schneider, Kurt (2017). "The Choice of Code Review Process: A Survey on the State of the Practice." In Product-Focused Software Process Improvement, Lecture Notes in Computer Science, vol. 10611, pp. 111–127. doi:10.1007/978-3-319-69926-4_9. ISBN 978-3-319-69925-7.
  4. Baum, Tobias; Liskin, Olga; Niklas, Kai; Schneider, Kurt (2016). "Factors Influencing Code Review Processes in Industry." In Proceedings of the 2016 24th ACM SIGSOFT International Symposium on Foundations of Software Engineering (FSE 2016), pp. 85–96. doi:10.1145/2950290.2950323. ISBN 978-1-4503-4218-6.
  5. IEEE Standard for Software Reviews and Audits. IEEE Std 1028-2008, August 2008, pp. 1–53. doi:10.1109/ieeestd.2008.4601584. ISBN 978-0-7381-5768-9.
  6. Fagan, Michael (1976). "Design and code inspections to reduce errors in program development." IBM Systems Journal, 15(3), 182–211. doi:10.1147/sj.153.0182.
  7. Rigby, Peter; Bird, Christian (2013). "Convergent contemporary software peer review practices." In Proceedings of the 2013 9th Joint Meeting on Foundations of Software Engineering, pp. 202–212. doi:10.1145/2491411.2491444. ISBN 978-1-4503-2237-9.
  8. MacLeod, Laura; Greiler, Michaela; Storey, Margaret-Anne; Bird, Christian; Czerwonka, Jacek (2017). "Code Reviewing in the Trenches: Challenges and Best Practices." IEEE Software, 35(4), 34. doi:10.1109/MS.2017.265100500.
  9. Sadowski, Caitlin; Söderberg, Emma; Church, Luke; Sipko, Michal; Bacchelli, Alberto (2018). "Modern code review: A case study at Google." In Proceedings of the 40th International Conference on Software Engineering: Software Engineering in Practice, pp. 181–190. doi:10.1145/3183519.3183525. ISBN 978-1-4503-5659-6.
  10. Jones, Capers (June 2008). "Measuring Defect Potentials and Defect Removal Efficiency." Crosstalk, The Journal of Defense Software Engineering. Archived from the original 2012-08-06: https://web.archive.org/web/20120806092322/http://www.crosstalkonline.org/storage/issue-archives/2008/200806/200806-0-Issue.pdf
  11. Jones, Capers; Ebert, Christof (April 2009). "Embedded Software: Facts, Figures, and Future." Computer, 42(4), 42–52. doi:10.1109/MC.2009.118.
  12. Cohen, Jason (2006). Best Kept Secrets of Peer Code Review (Modern Approach. Practical Advice.). Smart Bear Inc. ISBN 978-1-59916-067-2.
  13. Czerwonka, Jacek; Greiler, Michaela; Tilford, Jack (2015). "Code Reviews Do Not Find Bugs: How the Current Code Review Best Practice Slows Us Down." In 2015 IEEE/ACM 37th IEEE International Conference on Software Engineering, vol. 2, pp. 27–28. doi:10.1109/ICSE.2015.131. ISBN 978-1-4799-1934-5.
  14. Mantyla, M. V.; Lassenius, C. (2009). "What Types of Defects Are Really Discovered in Code Reviews?" IEEE Transactions on Software Engineering, 35(3), 430–448. doi:10.1109/TSE.2008.71.
  15. Bacchelli, A.; Bird, C. (May 2013). "Expectations, outcomes, and challenges of modern code review." In Proceedings of the 35th IEEE/ACM International Conference on Software Engineering (ICSE 2013).
  16. Beller, M.; Bacchelli, A.; Zaidman, A.; Juergens, E. (May 2014). "Modern code reviews in open-source projects: which problems do they fix?" In Proceedings of the 11th Working Conference on Mining Software Repositories (MSR 2014).
  17. Siy, Harvey; Votta, Lawrence (2004-12-01). "Does the Modern Code Inspection Have Value?" unomaha.edu. Archived 2015-04-28: https://web.archive.org/web/20150428192217/http://csalpha.ist.unomaha.edu/~hsiy/research/sm.pdf
  18. Bosu, Amiangshu; Greiler, Michaela; Bird, Chris (May 2015). "Characteristics of Useful Code Reviews: An Empirical Study at Microsoft." In 2015 IEEE/ACM 12th Working Conference on Mining Software Repositories.
  19. Kemerer, C. F.; Paulk, M. C. (2009). "The Impact of Design and Code Reviews on Software Quality: An Empirical Study Based on PSP Data." IEEE Transactions on Software Engineering, 35(4), 534–550. doi:10.1109/TSE.2009.27.
  20. "Code Review Metrics." Open Web Application Security Project. Archived 2015-10-09: https://web.archive.org/web/20151009202719/https://www.owasp.org/index.php/Code_Review_Metrics
  21. "Best Practices for Peer Code Review." Smart Bear Software. Archived 2015-10-09: https://web.archive.org/web/20151009202810/http://smartbear.com/all-resources/articles/best-practices-for-peer-code-review/
  22. Bisant, David B. (October 1989). "A Two-Person Inspection Method to Improve Programming Productivity." IEEE Transactions on Software Engineering, 15(10), 1294–1304. doi:10.1109/TSE.1989.559782.
  23. Ganssle, Jack (February 2010). "A Guide to Code Inspections." The Ganssle Group.
  24. Balachandran, Vipin (2013). "Reducing human effort and improving quality in peer code reviews using automatic static analysis and reviewer recommendation." In 2013 35th International Conference on Software Engineering (ICSE), pp. 931–940. doi:10.1109/ICSE.2013.6606642. ISBN 978-1-4673-3076-3.
  25. "Automated Defect Prevention for Embedded Software Quality." VDC Research, 2012-02-01.