Evaluation of the Meteorological Service of Canada (MSC) Transition Project

July 2008


3.0 EVALUATION DESIGN

3.1 Purpose and Scope

The purpose of the evaluation was to assess the success, performance and relevance of the MSC modernization efforts (MSC Transition Project). The evaluation will be used to help the WES Board and the department’s Executive Management Committee decide on next steps and the future direction of modernization efforts.

More specifically, the objectives of the evaluation were to examine and make recommendations with respect to

  1. relevance: whether the Transition Project goals were consistent with federal priorities and addressed actual needs;
  2. success: the extent to which the Transition Project has met its intended outcomes;
  3. cost-effectiveness / alternatives: whether the most appropriate and cost-effective means were used to achieve those outcomes; and
  4. design and delivery: whether the Transition Project was delivered in the best possible way.

The evaluation covered the five-year Transition Project period from fiscal year 2003–2004 through 2007–2008.11 Prior developments, contextual factors and other issues arising in the period directly preceding the Transition Project were also considered and documented. Initiatives not funded under the Transition Project were outside the scope of the evaluation.

3.2 Evaluation Methodology and Approach

In 2005, an evaluation framework for the MSC Transition Project was approved by the Assistant Deputy Minister, MSC. That framework included a logic model, performance measures and an evaluation strategy.

This evaluation uses the 2005 logic model and performance measures and builds upon the evaluation issues and sources of information presented in the evaluation framework. The evaluation issues, evaluation questions, statements of what should be observed, indicators, sources of information and methods of collection are presented in Annex B. A table demonstrating how the seven expenditure review questions are addressed through the evaluation questions can be found in Annex C.

The evaluation involved the use of multiple lines of evidence. These methodological approaches are described in Table 2.

Table 2: Methodological Approaches
Document and File Review

An in-house review and analysis of documents related to the Transition Project was conducted. Documents reviewed included project files, progress and final reports, Board documents, key correspondence records, relevant data systems and analyses, and financial information. Where warranted, certain topics were identified for more in-depth research.

A list of key documents is presented in Annex D.

Secondary Data and Reports

Although not part of the original plan for this evaluation, this methodology was used to collect performance and financial data because of difficulties in finding evidence of reporting against some project activities, outputs, outcomes and resources (budgeted and expended).

The supplemental information was provided by Environment Canada interviewees, either by email or in reports they prepared for this purpose. This information is referred to as secondary data and reports throughout this evaluation report.

Media Scan

A media scan was conducted to provide further information on stakeholder perspectives regarding the impacts and effects of the initiative. The scan used Green InSight as the primary search tool for articles about the MSC; the results were then filtered to retain only articles pertaining to Transition Project issues. Because the Green InSight archives are limited, this search covered only articles from January 2005 to September 2007. Articles from 2003 and 2004 were located through Globe and Mail and Google News searches. A total of 29 articles were found and analyzed.
Survey Review

Four surveys completed for the MSC by independent consultants were reviewed as part of this evaluation. The surveys comprised two national public opinion surveys (referred to as Survey I in the question matrix in Annex B) and two MSC Transition employee surveys (referred to as Survey II in the question matrix in Annex B).12

The following surveys were reviewed.
  • National Survey on Meteorological Products and Services - 200213
  • National WES Products and Services Survey 200714 (draft)
  • Meteorological Service of Canada Transition Employee Survey 200315
  • Meteorological Service of Canada Transition Employee Survey 200416
Interviews

To obtain feedback from key partners17 and stakeholders, a total of 58 interviews were conducted in-house with key informants from the following categories:
  • Environment Canada program staff (20)
  • Environment Canada senior management (11)
  • Environment Canada internal partners (5)
  • Federal partners and stakeholders (5)
  • Non-federal partners and stakeholders (10)
  • Provincial and municipal stakeholders (6)
  • International partners and stakeholders (1)
The master list of interview questions is presented in Annex E. Interview guides containing questions appropriate for each stakeholder group were drawn from the master list. Some questions were addressed only to small sub-groups of specialists who, for the most part, were involved in the design and/or implementation of the project for much of its duration.
Facilitated Workshops with Environment Canada Employees

Three workshops, with 47 participants representing staff and managers, were conducted to assess and contrast the perspectives of program managers and staff from various parts of the organization on topics related to strengthening linkages between production, science and service.18

The three workshops were conducted with participants in Dartmouth, Montreal and Edmonton. Seven participants in the Edmonton workshop attended via teleconference from Winnipeg.19

Ratings for the Evaluation Issues and Questions

A summary of ratings for the evaluation issues and questions is presented in Annex F. The ratings are based on a judgement of whether the findings indicate that intended results were achieved,20 mostly achieved,21 that progress was made but attention is needed,22 or that little progress was made and attention is needed.23

A rating of “too early to say” means that, while immediate outcomes may have been achieved, it is too early to observe the long-term impacts resulting from those outcomes. The N/A column identifies items for which a rating is not applicable.

Limitations of the Evaluation

There are four specific limitations associated with this evaluation. The evaluation mitigates these limitations by relying on the qualitative views24 of stakeholders.

  1. While the project collected information at the activity level, information on how these activities and outputs contributed to the achievement of immediate and ultimate outcomes was not consistently tracked; as a result, the full performance story of the project’s achievements could not be reported.
  2. Many of the activities and outputs are only beginning to show results (e.g. the operational results of forecasting can take years to appear); hence the evaluation often had to rely on information provided in interviews rather than in reports.
  3. Funding provided under MSC’s Transition Project complemented A-Base funding and other targeted ongoing meteorological programs. Therefore, it was difficult at times for EC employees to distinguish the impacts of the Transition Project from the impacts of other programs.
  4. The financial information system does not specifically identify the amount of Treasury Board Secretariat (TBS) funds received and, after 2005–2006, does not distinguish between the use of TBS funds and the use of other A-Base funds. It was therefore not possible to give a complete picture of the allocation and use of TBS and departmental funds.


11 Unless otherwise noted, data available up until February 22, 2008 were used to address coverage for the 2007–2008 fiscal year.

12 Note: Although an MSC Transition employee survey was also being completed for 2007, results from this survey were not available in time to be incorporated into this evaluation.

13 Completed by Decima Research Inc.

14 Completed by EKOS Research Associates Inc.

15 Completed by Environics Research Group.

16 Completed by Environics Research Group.

17 In this report, the word “partner” is used in a general sense rather than in its legal sense.

18 Strengthened linkages between production, science and service is one of the ultimate outcomes identified on the logic model for the MSC Transition (See Figure 1, Section 2.2).

19 Given the locations of the workshops, it was noted in the final report from the workshops that “views of program managers and staff from Storm Prediction Centres in Vancouver, Toronto and Gander are not included in this research.” (Goss Gilroy Inc., Report on Workshops for the MSC Transition Project Evaluation: Final Report, p. 3–4.)

20 Labelled as √ in summary table in Annex F and as Achieved in the Findings section.

21 Labelled as ~√ in summary table in Annex F and as ~Achieved in the Findings section.

22 Labelled as Progress made, attention needed in the Findings section.

23 Labelled as Little progress, attention needed in the Findings section.

24 “Qualitative views” refers to opinions.