Evaluation of the Improved Climate Change Scenarios Program

Final Report
November 2010


3.0 Evaluation Design

3.1 Purpose and Scope

This evaluation assessed the relevance and performance of the Improved Climate Change Scenarios Program from 2007–2008 to 2009–2010. Its purpose was to provide information to senior management for decision making and to provide results for the Adaptation Theme evaluation, to be rolled up to the CAA level in the fall of 2010. Five key issues were examined, as presented in the Evaluation Matrix in Annex 2.

This evaluation assessed the research and associated program activities that were to be resourced through the incremental funding for the Improved Climate Change Scenarios Program under the Adaptation Theme. Data were collected from August through October 2009, with some follow-up interviews conducted in November 2009 to capture the impact of organizational changes within ASTD that occurred after the first phase of the evaluation had been completed. Additional documents were requested throughout the analysis and reporting phases to fill data gaps and to capture information on the Program's ongoing progress towards the achievement of expected outcomes. Because implementation of the Improved Climate Change Scenarios Program was in its early stages at the time of data collection and analysis, assessment of the Program's performance focused on activities, outputs and immediate outcomes that were measurable at that time. Longer-term outcomes (intermediate and final outcomes) were examined for evidence of progress towards their achievement.

3.2 Evaluation Approach and Methodology

3.2.1 Methods

Data were collected through a review of documents and key informant interviews to address all evaluation issues described in the Evaluation Matrix in Annex 2.

Document Review – The Program was requested to provide documents demonstrating its relevance and performance. The evaluation team reviewed key documents including Speeches from the Throne, federal budgets, Reports on Plans and Priorities (RPPs), Departmental Performance Reports (DPRs), peer-reviewed research papers and presentations, planning documents, commissioned reports, documents related to stakeholder consultations and meetings, financial information, and quantitative data from participant evaluations of training sessions. (A complete list of documents reviewed is presented in Annex 3.) Evaluators reviewed these documents and compiled the data in a source document, which was then analyzed to address each evaluation question.

Key Informant Interviews – A total of 50 key stakeholders were interviewed from several stakeholder groups.

Names and contact information for all interviewees were provided by the Program. Semi-structured interviews were conducted in the interviewee's language of choice, using an interview guide tailored to each category of key informant. Interviews were conducted either in person or by phone.

3.3 Limitations and Challenges

There were some limitations and challenges to this evaluation.