Open-source software assessment methodologies explained

Several methods have been created to define an assessment process for free/open-source software. Some focus on aspects such as the maturity, durability, and strategy of the organisation behind the open-source project itself; other methodologies add functional criteria to the assessment process.

Existing methodologies

There are more than 20 different OSS evaluation methods.[1]

Comparison

Comparison criteria

Stol and Babar have proposed a comparison framework for OSS evaluation methods. Their framework lists criteria in four categories: criteria related to the context in which the method is to be used, the user of the method, the process of the method, and the evaluation of the method (e.g., its validity and maturity stage).

The comparison presented below is based on an alternative set of criteria, which appear as the rows of the comparison chart.

Comparison chart

Criteria | OSMM (Capgemini) | OSMM (Navica) | QSOS | OpenBRR | OpenBQR | OpenSource Maturity Model (QualiPSo OMM)
Seniority | 2003 | 2004 | 2004 | 2005 | 2007 | 2008
Original authors/sponsors | Capgemini | Navicasoft | Atos Origin | Carnegie Mellon Silicon Valley, SpikeSource, O'Reilly, Intel | University of Insubria | QualiPSo project, EU Commission
License | Non-free license, but authorised distribution | Assessment models licensed under the Academic Free License | Methodology and assessment results licensed under the GNU Free Documentation License | Assessment results licensed under a Creative Commons license | Creative Commons Attribution-Share Alike 3.0 License | Creative Commons Attribution-Share Alike 3.0 License
Assessment model | Practical | Practical | Practical | Scientific | Practical | Scientific
Detail levels | 2 axes on 2 levels | 3 levels | 3 levels or more (functional grids) | 2 levels | 3 levels | 3 levels
Predefined criteria | Yes | Yes | Yes | Yes | Yes | Yes
Technical/functional criteria | No | No | Yes | Yes | Yes | Yes
Scoring model | Flexible | Flexible | Strict | Flexible | Flexible | Flexible
Scoring scale by criterion | 1 to 5 | 1 to 10 | 0 to 2 | 1 to 5 | 1 to 5 | 1 to 4
Iterative process | No | No | Yes | Yes | Yes | Yes
Criteria weighting | Yes | Yes | Yes | Yes | Yes | Yes
Comparison | Yes | No | Yes | No | Yes | No
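Most of the methods compared above share the same basic mechanics: each criterion is scored on a small ordinal scale, the scores are weighted according to the evaluator's priorities, and the weighted totals of the candidate projects are compared. The following minimal sketch (in Python) illustrates that generic weighted-scoring step; the criteria names, weights, and the 0-to-2 scale (similar to the scale QSOS uses) are illustrative assumptions, not the specification of any particular methodology.

    # Generic weighted-scoring sketch, loosely modelled on the methods compared above.
    # Criteria, weights, and scores are illustrative assumptions only.

    def weighted_score(scores, weights):
        """Weighted average of per-criterion scores; 0.0 when the total weight is 0."""
        total_weight = sum(weights.get(criterion, 0) for criterion in scores)
        if total_weight == 0:
            return 0.0
        weighted_sum = sum(score * weights.get(criterion, 0)
                           for criterion, score in scores.items())
        return weighted_sum / total_weight

    # Evaluator-defined weights: higher means more important in this adoption context.
    weights = {"maturity": 3, "activity": 2, "documentation": 1, "license": 3}

    # Per-project scores on a 0-to-2 ordinal scale (QSOS-like grids use 0 to 2).
    candidates = {
        "project_a": {"maturity": 2, "activity": 1, "documentation": 2, "license": 2},
        "project_b": {"maturity": 1, "activity": 2, "documentation": 1, "license": 2},
    }

    # Rank the candidate projects by their weighted score.
    for name, scores in sorted(candidates.items(),
                               key=lambda item: weighted_score(item[1], weights),
                               reverse=True):
        print(f"{name}: {weighted_score(scores, weights):.2f}")

In practice the methods differ mainly in what the chart captures: whether the scale and the meaning of each score value are fixed in advance (a strict scoring model, as in QSOS) or left to the evaluator (a flexible one), and whether the scoring is repeated iteratively at increasing levels of detail.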

Notes and References

  1. Stol, K.-J., Babar, M.A.: A Comparison Framework for Open Source Software Evaluation Methods. In: Proc. IFIP WG 2.13 International Conference on Open Source Systems (OSS 2010), IFIP AICT, vol. 319, pp. 389–394 (2010)
  2. Zahoor, A., Mehboob, K., Natha, S.: Comparison of Open Source Maturity Models. Procedia Computer Science 111, 348–354 (2017). doi:10.1016/j.procs.2017.06.033
  3. Woods, D., Guliani, G.: Open Source for the Enterprise: Managing Risks, Reaping Rewards. O’Reilly Media, Inc., Sebastopol (2005)
  4. Taibi, D., Lavazza, L., Morasca, S.: OpenBQR: A framework for the assessment of OSS. In: International Conference on Open Source Development, Adoption and Innovation (OSS 2007), pp. 173–186 (2007). doi:10.1007/978-0-387-72486-7_14. Archived copy: https://web.archive.org/web/19700101010101/http://flosshub.org/sites/flosshub.org/files/OpenBQR.pdf
  5. CORDIS | European Commission (web site)
  6. Del Bianco, V., Lavazza, L., Morasca, S., Taibi, D., Tosi, D.: Quality of Open Source Software: The QualiPSo Trustworthiness Model. In: Proc. 5th IFIP WG 2.13 International Conference on Open Source Systems (OSS 2009), Skövde, Sweden, June 3–6 (2009)
  7. Del Bianco, V., Lavazza, L., Morasca, S., Taibi, D., Tosi, D.: The QualiPSo approach to OSS product quality evaluation. In: Proc. IEEE International Workshop on Free Libre Open Source Software (FLOSS), co-located with ICSE, ACM/IEEE (2010)
  8. Taibi, D.: Towards a Trustworthiness Model for Open Source Software: How to Evaluate Open Source Software. LAP LAMBERT Academic Publishing (2010)
  9. QualOSS – CETIC (web site). cetic.be, European Commission. Retrieved 2019-01-11
  10. Koponen, T., Hotti, V.: Evaluation framework for open source software. In: Proc. Software Engineering and Practice (SERP), Las Vegas, Nevada, USA, June 21–24 (2004)
  11. Sung, W.J., Kim, J.H., Rhew, S.Y.: A Quality Model for Open Source Software Selection. In: Proc. Sixth International Conference on Advanced Language Processing and Web Information Technology, Luoyang, Henan, China, pp. 515–519 (2007)
  12. Atos Origin: Method for Qualification and Selection of Open Source software (QSOS) version 1.6, Technical Report (2006)
  13. Cabano, M., Monti, C., Piancastelli, G.: Context-Dependent Evaluation Methodology for Open Source Software. In: Proc. Third IFIP WG 2.13 International Conference on Open Source Systems (OSS 2007), Limerick, Ireland, pp. 301–306 (2007)
  14. Ardagna, C.A., Damiani, E., Frati, F.: FOCSE: An OWA-based Evaluation Framework for OS Adoption in Critical Environments. In: Proc. Third IFIP WG 2.13 International Conference on Open Source Systems, Limerick, Ireland, pp. 3–16 (2007)