OW2 MRL Methodology Overview
General Structure of the Approach
The final objective of the methodology is to provide a simple indicator of the market readiness of OW2 projects. To assess open source projects from a market perspective, it collects and processes project information from three angles:
- Process and best practices
- Project attributes
- Market capabilities
Because each kind of information is specific, project information is collected in three stages, each characterized by its own method; we call them Stages 1, 2 and 3:
- Stage 1: a checklist of 50 best practices, grouped in 12 categories, collected through a form completed by project leaders;
- Stage 2: a set of 15 data points automatically collected from the project's development environment;
- Stage 3: an expert assessment of the project's market capabilities, based on a structured interview with the project leader covering 48 business-related situations grouped in seven categories.
This approach provides three kinds of results:
- the first is an overview of how a project is run; this result is provided directly by the first stage of data collection;
- the second is an evaluation of five complex project attributes, computed by combining information collected in Stages 1 and 2;
- the third and final result is the market readiness score, obtained by combining the attributes with the third-stage business-oriented assessment.
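To make the stages and results more concrete, here is a minimal sketch of the kind of data involved; the field names, categories and example values are assumptions made for illustration, not the actual OW2 collection forms or tooling.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ProjectData:
    """Hypothetical shape of the information gathered for one project."""
    # Stage 1: 50 yes/no best-practice checks grouped in 12 categories (from the form)
    best_practices: Dict[str, List[bool]]
    # Stage 2: 15 data points pulled automatically from the development environment
    dev_metrics: Dict[str, float]
    # Stage 3: expert scores for 48 business-related situations grouped in 7 categories
    market_capabilities: Dict[str, List[int]]

# Example instance with made-up categories and values.
example = ProjectData(
    best_practices={"documentation": [True, True, False], "licensing": [True, True]},
    dev_metrics={"commits_last_year": 420, "active_committers": 7},
    market_capabilities={"positioning": [3, 4], "support_offering": [2, 3]},
)
```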
Check out the sections below for more details.
- Stage 1: Best practices verification
- Stage 2: Project attributes
- Stage 3: Market capabilities
- Market Readiness Levels
Applicability
Our approach is both top-down and bottom-up: top-down because we use our unscientific business wisdom to pre-define the OW2 Market Readiness Levels, and bottom-up because we use objective data collected from the projects to match them with the pre-defined levels. Given this approach, we will gradually update the definition of the OW2 Market Readiness Levels, with the help of feedback from OW2 project leaders and based on the data collection and analysis we are able to produce.
What we describe here is the methodology as it is applied to the OW2 code base, and the way the method is applied is determined by the specifics of the OW2 projects. As such, several characteristics may not be applicable elsewhere, in particular:
- the type of data collected: the same data must be collected for all projects, so the instantiation of the method is determined by the smallest set of data common to all projects; this set may vary from one code base to another.
- the normalisation thresholds: the way values are reduced to a common set of comparable values depends on the amplitude of the values collected, which may also vary from one code base to another.
The OW2 MRL methodology is a pragmatic, bottom-up approach; it does not pretend to be universal. The general approach can nevertheless be applied to any open source code base, provided the method is adjusted to the available data and to the amplitudes of the measured values in the target code base.
Computation
An essential step in creating a composite index is to transform the data in a way that abstracts away their specific dimensions and gives them comparable distributions. Once normalised, data of a heterogeneous nature can be combined and computed together.
The OW2 MRL methodology applies a simple normalisation approach resulting in a scale with discrete values 0, 1, 2, 3, 4 and 5. In some cases, such as Stages 1 and 3, normalisation is trivial since the 0 to 5 scale is embedded in the data collection approach.
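As an illustration of this kind of normalisation, the sketch below maps a raw Stage 2 metric onto the discrete 0 to 5 scale using thresholds; the metric and the threshold values are invented for the example, and the real thresholds depend on the amplitude of the values observed across the code base (see Applicability above).

```python
from bisect import bisect_right

def normalise(value, thresholds):
    """Map a raw value onto the 0-5 scale: the score is the number of
    thresholds the value reaches or exceeds (five thresholds give 0..5)."""
    return bisect_right(sorted(thresholds), value)

# Hypothetical thresholds for a metric such as "commits over the last year".
commit_thresholds = [10, 50, 150, 400, 1000]

print(normalise(420, commit_thresholds))  # -> 4
```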
The OW2 code base is sufficiently manageable that outliers are handled by hand, on a case-by-case basis, after verification with project leaders.
With all values normalised to the same scale and expressed in the same unit, results are obtained by simply calculating their average.
Weighting strategy: there is no weighting within any given operation; each variable enters the operation with a weight of 1. There is, however, some weighting within the whole methodology, in two ways. First, some variables are used several times, which gives them greater influence on the result. Second, the final MRL score balances market capabilities on one hand against best practices and attributes on the other, thus expressing the market orientation of the methodology by granting extra weight to market capabilities.
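The arithmetic below sketches this aggregation and weighting strategy on made-up scores; the attribute and capability names, and the values, are hypothetical, but the calculation follows the description above: plain averages within each operation, with market capabilities balanced against best practices and attributes at the final step.

```python
from statistics import mean

# Hypothetical normalised scores (0-5); the real attribute and category names
# used by OW2 are not reproduced here.
best_practices_score = 3.0   # overall Stage 1 result
attributes = {"activity": 4, "quality": 3, "community": 4, "governance": 2, "visibility": 3}
market_capabilities = {"positioning": 3, "licensing": 4, "support": 2, "ecosystem": 3,
                       "sales": 2, "marketing": 3, "strategy": 4}

# Within each operation every variable enters with a weight of 1: a plain average.
attributes_score = mean(attributes.values())              # (4+3+4+2+3) / 5 = 3.2
capabilities_score = mean(market_capabilities.values())   # 21 / 7 = 3.0

# The final score balances market capabilities against best practices and
# attributes, so the capabilities side carries half of the total weight.
practices_and_attributes = mean([best_practices_score, attributes_score])  # 3.1
mrl_score = mean([practices_and_attributes, capabilities_score])           # ~3.05
print(mrl_score)
```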
The dendrogram below shows the overall structure of the model, with examples of values to illustrate the simple calculations.
Limitations
We think the approach is sound and genuine; however, it is in the nature of an exercise of this kind to be burdened with imperfections.
- We define the OW2 MRL by looking at the market hierarchy of open source projects - not a very scientific approach here, only our own modestly expert and artisanal crafting of the hierarchy.
- The OW2 MRL approach should evolve in future versions after being submitted to the scrutiny of the OW2 project leaders and their users.
- As explained in the Applicability section above, there are also limitations related to the applicability of the methodology.
- At this point, the method still needs to be refined, especially from the statistical point of view. Our model may be rustic; it still needs to be tested for robustness and sensitivity.
References
A great deal of work has already been done on maturity and readiness models. As an introduction, here is a short list of useful links. Feel free to recommend more resources.
- On the RFC vocabulary: https://tools.ietf.org/html/rfc2119
- On comparable evaluation methodologies: https://en.wikipedia.org/wiki/Open-source_software_assessment_methodologies
- On maturity models: https://en.wikipedia.org/wiki/Maturity_model
- On the NASA TRL: http://www.spacewx.com/Docs/ISO_FDIS_16290_(E)_review.pdf
- On TRL in a business context: https://serkanbolat.com/2014/11/03/technology-readiness-level-trl-math-for-innovative-smes/
- On composite indicators: http://publications.jrc.ec.europa.eu/repository/bitstream/JRC31435/EUR%2022155%20EN.pdf (see appendix A)