Specify how measurement data will be analyzed and reported.
Specifying the analysis procedures in advance ensures that appropriate analyses will be conducted and reported to address the documented measurement objectives (and thereby the information needs and objectives on which they are
based). It also provides a check that the necessary data will in fact be collected.
Typical Work Products
1. Analysis specifications and procedures
2. Data analysis tools
Subpractices
1. Specify and prioritize the analyses that will be conducted and the reports that will be prepared.
Early attention should be paid to the analyses that will be conducted and to the manner in which the results will be reported. These should meet the following criteria:
· The analyses explicitly address the documented measurement objectives
· Presentation of the results is clearly understandable by the audiences to whom the results are addressed
Priorities may have to be set within available resources.
2. Select appropriate data analysis methods and tools.
Refer to the Select Measures and Analytic Techniques and Apply Statistical Methods to Understand Variation specific practices of the Quantitative Project Management process area for more information about the appropriate use of
statistical analysis techniques and understanding variation, respectively.
Issues to be considered typically include the following:
· Choice of visual display and other presentation techniques (e.g., pie charts, bar charts, histograms, radar charts, line graphs, scatter plots, or tables)
· Choice of appropriate descriptive statistics (e.g., arithmetic mean, median, or mode)
· Decisions about statistical sampling criteria when it is impossible or unnecessary to examine every data element
· Decisions about how to handle analysis in the presence of missing data elements
· Selection of appropriate analysis tools
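As one illustration of the missing-data decision above, a minimal sketch of a simple listwise-deletion policy (the threshold, function name, and data are hypothetical examples, not prescribed values):

```python
# Sketch: one common policy for handling missing data elements in a
# measurement set -- drop missing entries if they are rare, otherwise
# refuse to analyze. The 20% threshold is an illustrative choice.

def analyze_with_missing(values, max_missing_ratio=0.2):
    """Return the arithmetic mean of present values, or raise if too many are missing."""
    missing = sum(1 for v in values if v is None)
    if missing / len(values) > max_missing_ratio:
        raise ValueError("too many missing data elements; results would be unreliable")
    present = [v for v in values if v is not None]
    return sum(present) / len(present)

effort_hours = [12.0, None, 9.5, 11.0, 10.5]  # one missing data element
print(analyze_with_missing(effort_hours))     # mean of the four present values: 10.75
```

Whether to delete, impute, or reject is itself a decision that should be recorded in the analysis specification, since it affects how results are interpreted.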
Descriptive statistics are typically used in data analysis to do the following:
· Examine distributions on the specified measures (e.g., central tendency, extent of variation, or data points exhibiting unusual variation)
· Examine the interrelationships among the specified measures (e.g., comparisons of defects by phase of the product’s lifecycle or by product component)
· Display changes over time
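The descriptive-statistics uses above can be sketched with Python's standard `statistics` module; the defect counts and the two-standard-deviation cutoff for "unusual variation" are hypothetical illustrations:

```python
import statistics

# Sketch: descriptive statistics for a specified measure
# (hypothetical defect counts per product component).
defects_per_component = [3, 5, 4, 6, 5, 21, 4]

mean = statistics.mean(defects_per_component)      # central tendency
median = statistics.median(defects_per_component)  # robust central tendency
stdev = statistics.stdev(defects_per_component)    # extent of variation

# Flag data points exhibiting unusual variation (here: more than two
# sample standard deviations from the mean; the cutoff is illustrative).
unusual = [x for x in defects_per_component if abs(x - mean) > 2 * stdev]
print(median, unusual)  # the 21-defect component is flagged
```

Note how the median (5) is less affected than the mean by the single extreme value, which is one reason the choice among mean, median, and mode belongs in the analysis specification.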
3. Specify administrative procedures for analyzing the data and communicating the results.
Issues to be considered typically include the following:
· Identifying the persons and groups responsible for analyzing the data and presenting the results
· Determining the timeline for analyzing the data and presenting the results
· Determining the venues for communicating the results (e.g., progress reports, transmittal memos, written reports, or staff meetings)
4. Review and update the proposed content and format of the specified analyses and reports.
All of the proposed content and format are subject to review and revision, including analytic methods and tools, administrative procedures, and priorities. The relevant stakeholders consulted should include intended end users,
sponsors, data analysts, and data providers.
5. Update measures and measurement objectives as necessary.
Just as measurement needs drive data analysis, clarification of analysis criteria can affect measurement. Specifications for some measures may be refined further based on the specifications established for data analysis procedures.
Other measures may prove to be unnecessary, or a need for additional measures may be recognized.
The exercise of specifying how measures will be analyzed and reported may also suggest the need for refining the measurement objectives themselves.
6. Specify criteria for evaluating the utility of the analysis results and for evaluating the conduct of the measurement and analysis activities.
Criteria for evaluating the utility of the analysis might address the extent to which the following apply:
· The results are (1) provided on a timely basis, (2) understandable, and (3) used for decision making.
· The work does not cost more to perform than is justified by the benefits that it provides.
Criteria for evaluating the conduct of the measurement and analysis might include the extent to which the following apply:
· The amount of missing data or the number of flagged inconsistencies is beyond specified thresholds.
· There is selection bias in sampling (e.g., only satisfied end users are surveyed to evaluate end-user satisfaction, or only unsuccessful projects are evaluated to determine overall productivity).
· The measurement data are repeatable (e.g., statistically reliable).
· Statistical assumptions have been satisfied (e.g., about the distribution of data or about appropriate measurement scales).
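The first of these conduct criteria can be made operational as a simple threshold check; a minimal sketch, in which the record layout, function name, and threshold values are all hypothetical examples rather than prescribed limits:

```python
# Sketch: evaluating the conduct of measurement against specified thresholds
# for missing data and flagged inconsistencies.

def conduct_ok(records, max_missing=0.05, max_flagged=0.02):
    """Return True if missing-data and inconsistency rates stay within thresholds."""
    n = len(records)
    missing_rate = sum(1 for r in records if r.get("value") is None) / n
    flagged_rate = sum(1 for r in records if r.get("inconsistent", False)) / n
    return missing_rate <= max_missing and flagged_rate <= max_flagged

records = [{"value": 10}, {"value": None}, {"value": 12, "inconsistent": True}]
print(conduct_ok(records))  # False: 1/3 missing exceeds the 5% threshold
```

Recording such thresholds in advance makes the evaluation of measurement conduct objective rather than ad hoc.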