
Evaluation of the Class Grant and Contribution Program to Support Research, Awareness and Learning in Space Science and Technology

On this page

  1. Acronyms used in the report
  2. Executive summary
  3. Overview of the Program
  4. Purpose and scope
  5. Methodology
  6. Relevance
  7. Performance
  8. Efficiency
  9. Conclusion
  10. Management response and action plan
  11. References
  12. Appendix 1 – Logic Models

For the period from to

Prepared by the Audit and Evaluation Directorate

Project #20/21 02-01

© Her Majesty the Queen in Right of Canada, as represented by the Minister of Innovation, Science and Industry, .

ISBN: 978-0-660-43536-7

https://www.asc-csa.gc.ca/eng/terms.asp

Acknowledgements

This evaluation was made possible through the contribution and collaboration of many people. We wish to thank everyone who participated in interviews and data collection, provided information, and responded to inquiries.

Acronyms used in the report

A&L
Awareness and Learning
AO
Announcement of Opportunity
CaNoRock
Canada-Norway Student Sounding Rocket
CEGC
Centre of Expertise for Grants and Contributions
CFI
Canada Foundation for Innovation
CIHR
Canadian Institutes of Health Research
CSA
Canadian Space Agency
CSDC
Canadian Satellite Design Challenge
DFO
Fisheries and Oceans Canada
DND
Department of National Defence
DRF
Departmental Results Framework
EOADP
Earth Observation Application Development Program
G&C
Grants and Contributions
GBA Plus
Gender-based Analysis Plus
HQP
Highly qualified personnel
IIRB
Integrated Investment Review Board
ISED
Innovation, Science and Economic Development Canada
NPO
Not-for-profit organization
NRC
National Research Council Canada
NRCan
Natural Resources Canada
NSERC
Natural Sciences and Engineering Research Council of Canada
PAA
Program Alignment Architecture
PC
Parks Canada
PIP
Performance Information Profile
PIS
Performance Indicator Survey
PMS
Performance Measurement Strategy
R&D
Research and development
S&T
Science and Technology
SCDP
Space Capacity Development Program
SE
Space Exploration
SEP
Space Exploration Program
SST
Space Science and Technology
STD
Space Technology Development
STEDiA
Science, Technology and Expertise Development in Academia
STEM
Science, technology, engineering, and mathematics
SU
Space Utilization
SUP
Space Utilization Program
TBS
Treasury Board Secretariat
TRL
Technology readiness level

1. Executive summary

This report presents the results of the evaluation of the Class Grant and Contribution Program to Support Research, Awareness and Learning in Space Science and Technology (Class G&C Program or the Program) at the Canadian Space Agency (CSA or Agency). This is the second evaluation of the Program in its current form. It was carried out by the CSA's Audit and Evaluation Directorate, with the support of Goss Gilroy Inc., between and . It was completed in accordance with the Treasury Board Policy on Results (TBS, 2016a) and Policy on Transfer Payments (TBS, 2008a). It was also conducted as prescribed in the CSA's Five-Year Evaluation Plan (CSA, multiple years).

The Class G&C Program was established in . It was designed to support research, knowledge development and innovation in the CSA's priority areas and enhance Canadians' awareness of and participation in space-related disciplines and activities. It has two components. The Research component provides organizations with financial assistance for science and technology (S&T) research and development (R&D), capacity building, and space-related information gathering, research, and studies. The Awareness and Learning (A&L) component provides funding to individuals and organizations involved in activities and initiatives that support space-related awareness, knowledge development and participation in learning activities.

The Program supports the Agency's three Departmental Results Framework (DRF) programs (CSA, 2018a): the Space Capacity Development Program (SCDP), the Space Utilization Program (SUP), and the Space Exploration Program (SEP). Its funds are disbursed through Announcements of Opportunity (AOs) and unsolicited proposals.

The purpose of this evaluation is to assess the Program's relevance, effectiveness and efficiency over the period from to (six years), including gender-based analysis plus (GBA Plus). The evaluation used multiple data collection methods, including a document review, a literature review, two online surveys (internal and external), internal group interviews, and external individual interviews with key informants.

Relevance

The objectives of the Class G&C Program are aligned with federal government priorities and departmental strategic outcomes. Through the Program's two components, the CSA's involvement in space-related learning, awareness, innovation, and research is consistent with core federal responsibilities. The Program continues to address a demonstrable need, as it plays a unique role in the Canadian space sector and in international collaboration. However, some needs remain unmet, including the need for recurrent funding opportunities, a less cumbersome application process, and better alignment with communities' needs.

Performance

Overall, the Class G&C Program's outputs are being delivered, and the immediate, intermediate and final outcomes are being achieved. Space-related knowledge, capacity development, and collaboration have increased. The number of AOs is significantly higher than in the previous period, though the number of proposals submitted and their success rate vary from year to year. Nevertheless, more projects are being funded, and efforts have been initiated to make funding more accessible. However, operational and performance data could be better structured if the available tools were harnessed, and some information is missing from the central database. There are also some opportunities for improvement in the Program's performance measurement due to some undefined targets and missing or unmeasured indicators.

Efficiency

Many operational processes and work methods are well-established, but some facets should be improved to make the Program even more efficient, enhance synergies, and be innovative: share more information about the evaluation and selection processes for greater transparency, clarify and better communicate roles and responsibilities to enhance collaboration, and offer training, greater coordination and harmonization to support operations more effectively.

On the basis of the key evaluation findings, the following actions are recommended to improve the accessibility and efficiency of the CSA's Class G&C Program:

  1. Establish regular funding opportunities with greater sensitivity to the needs of the diverse client base, while increasing harmonization and coordination between branches and recipients.
  2. Clarify the rules and requirements regarding departmental collaboration with G&C recipients, and inform stakeholders.
  3. Use a single operational database for the Program's administration and management, and monitor data quality, continuity and completeness.
  4. Explore the possibility of using standardized tools to streamline the application process, such as using a staged application process.
  5. Ensure that systematic feedback is provided for all funding applications.
  6. Communicate the roles and responsibilities of the Centre of Expertise for Grants and Contributions (CEGC) to the G&C Program's user branches to ensure a common understanding and meet the branches' needs for the services and expertise they require.
  7. In updating performance measurement, ensure that there are CSA logic model indicators for each of the Program's components and client groups, and that specific targets are agreed upon for the Program.

2. Overview of the Program

The Class G&C Program was established in , but it was based on a program that had 12 components.Footnote 1 The current G&C Program was designed to support research, knowledge development and innovation in the CSA's priority areas and enhance Canadians' awareness of and participation in space-related disciplines and activities.

The Class G&C Program is an umbrella programFootnote 2 that supports the CSA's three DRF programs, which is why it is not listed in the DRF program inventory (CSA, 2018a). The Program has its own terms and conditions and is subject to the Policy on Transfer Payments (TBS, 2008a). It was designed in accordance with the CSA's Program Alignment Architecture (PAA), which was replaced by the DRF in .

Figure 1 – Where the Class G&C Program fits into the CSA's DRF

The Class G&C Program supports the three DRF programs: the Space Capacity Development Program (SCDP), the Space Utilization Program (SUP) and the Space Exploration Program (SEP).

The Program has two components:

  1. Research, and
  2. Awareness and Learning (A&L).

The Research component provides organizations with financial assistance for science and technology (S&T) research and development (R&D), capacity building, and space-related information gathering, research, and studies. The A&L component provides funding to individuals and organizations involved in activities and initiatives that support space-related awareness, knowledge development and participation in learning activities.

The objectives of the two components described in the Program's Terms and Conditions (CSA, ) are as follows:

The target eligible client groups are as follows:

The Class G&C Program's funding is disbursed either through a competitive process (referred to as solicited proposals) in response to Announcements of Opportunity (AOs) or through unsolicited proposals. Each AO functions as a mini-program in its own right, with its own criteria.

Governance, roles and responsibilities

There are four levels of actors involved in the Program's governance. According to the Program's Terms and Conditions (CSA, ), there is no lead authority for the Program, although it falls under the responsibility of the Chief Financial Officer. The following is an overview of the roles and responsibilities (the first three levels are described in part in the Program's Terms and Conditions (CSA, )):

  1. Branches

    The branches – Space Science and Technology (SST),Footnote 3 Space Utilization (SU), and Space Exploration (SE) – determine which initiatives will be funded to achieve the expected outcomes. They carefully assess, monitor and report on the risks associated with their sector's use of the Program. They liaise directly with recipients and monitor the feasibility, eligibility and progress of G&C projects. They are in charge of the Program's operations and administration.

  2. G&C Centre of Expertise (CEGC)

    The CEGC is responsible for providing G&C expertise to the CSA's branches. It provides advice, guidance and direction on G&C management. It also monitors, reviews and reports on G&C. It provides standardized tools for AO development and approvals. It supports the G&C decision-makers and steering committees. It reports to the Finance Directorate.

  3. G&C Steering Committee

    The G&C Steering CommitteeFootnote 4 provides oversight of G&C management and governance. It is co-chaired by the Chief Financial Officer and Director General, Corporate Services, and one of the Directors General (at the time of the evaluation, the Director General SST was Co-Chair). It provides CSA-wide strategic guidance and advice on the use of funding.

  4. Integrated Investment Review Board (IIRB)

    The IIRB provides the sound management (governance and approval) needed to ensure that all CSA expenditures are made and controlled with a view to optimizing resources and expected results, including those of the G&C Program.

Program resources

Total funding disbursed by the CSA under the Class G&C Program during the evaluation period was $117,906,570, with more than half spent on contributions. Almost all of the funds were used in the Research component (98%). The SCDP accounted for the largest proportion of funds, followed by the SU and the SEP. For more details on funding allocation and the number of agreements, see the Performance section.

Figure 2 – Percentage of funding disbursed by type of transfer payment and DRF program

By type of transfer payment
            Grants  Contributions
Percentage  43%     57%

By DRF program
            SCDP   SUP    SEP   Other (see note a)
Percentage  64.5%  26.5%  8.7%  0.2%

In , changes were made in the funding of the Program and the CEGC. First, there was an increase in the funds available for the Program, in the volume of agreements signed, and in the proportion of funds disbursed in the form of contributions to companies (instead of contracts). Indeed, funding disbursements grew by 113% during the evaluation period (33% increase for grants, 247% for contributions), reaching a total of $23.5 million in -. For comparison, funding disbursements in - totalled $7.8 million.

Figure 3 – Amounts disbursed by type of transfer payment and year (in $)
- - - - - -
Grants 6,955,536 6,263,509 9,146,443 8,674,322 10,423,649 9,284,634
Contributions 4,097,804 10,501,603 11,870,329 10,507,215 15,975,628 14,205,899
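
As a rough cross-check of the growth figures cited above, the grant and contribution amounts for the first and last years shown in Figure 3 reproduce the reported increases of roughly 33%, 247% and 113%. The short Python sketch below is illustrative only (the variable names are ours); small differences reflect rounding in the published figures.

    # Illustrative check of the growth rates cited above, using the Figure 3 amounts
    # for the first and last fiscal years of the evaluation period (years elided in the source).
    grants_first, grants_last = 6_955_536, 9_284_634
    contrib_first, contrib_last = 4_097_804, 14_205_899

    def growth(first, last):
        """Percentage increase from the first year to the last year."""
        return (last - first) / first * 100

    print(f"Grants:        {growth(grants_first, grants_last):.0f}%")    # ~33%
    print(f"Contributions: {growth(contrib_first, contrib_last):.0f}%")  # ~247%
    print(f"Total:         {growth(grants_first + contrib_first, grants_last + contrib_last):.0f}%")  # ~113%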

Second, in , the CEGC was moved from the SST Branch to Corporate Services (under the Finance Directorate). The resources allocated to the CEGC increased by 62% during the evaluation period, and the number of full-time equivalents (FTEs) grew by 49%.

Table 1 – Actual expenditures of the CEGC
Expenditures - - - - - -
Salaries $339,784 $306,648 $329,462 $433,736 $440,231 $570,001
Operations & Maintenance $43,533 $103,253 $58,349 $12,753 $39,519 $49,661
Total $383,317 $409,901 $387,811 $446,489 $479,750 $619,662
FTEs 3.20 3.10 3.25 3.96 4.23 4.77

Operating budget data is available only for the CEGC, as the branches do not report this financial information for the Class G&C Program separately within their respective programs. The previous evaluation (CSA, 2017c) reported that there was no information on full-time equivalents (FTEs), salaries, and expenditures specifically for G&C initiatives in the branches. The evaluation concluded (but did not recommend) that data pertaining to the Class G&C Program's full administrative costs should be tracked so that information about the magnitude of the Program's costs would be available.

Previous evaluation of the Program

In the previous evaluation, whose action plan was completed, a number of conclusions were drawn regarding the Program's relevance, performance and efficiency, and the following recommendations were made:

  1. review the Program's terms and conditions to determine whether the A&L component remains aligned with the CSA's priorities;
  2. for both solicited and unsolicited proposals, standardize the application, selection, and feedback processes and clearly communicate them to the Canadian space community; and
  3. review the Program's performance measurement strategy and data capture, collection and storage processes, and standardize the process for identifying ranked lists of funding priorities applicable to all G&C initiatives (CSA, 2017c).

3. Purpose and scope

The evaluation of the Class G&C Program addresses the key evaluation issues specified by the Treasury Board of Canada Secretariat's Directive on Results (TBS, 2016b): relevance, effectiveness and efficiency, and gender-based analysis plus (GBA Plus).Footnote 6 It is also consistent with the Policy on Transfer Payments (TBS, 2008a), which requires that all G&C programs be evaluated every five years. In addition, it was conducted as prescribed in the CSA's Five-Year Evaluation Plan. Its goal is to provide a neutral, evidence-based assessment. This is the second evaluation of the Program as defined in the terms and conditions, and it covers both components (Research and A&L). The period covered by the evaluation is , to , a six-year period.

Table 2 – Evaluation questions
Criteria Questions
Relevance
  1. To what extent is the Program aligned with federal government priorities and departmental strategic outcomes?
  2. To what extent are the Program's activities aligned with the federal government's core responsibilities?
  3. Does the Program continue to meet a demonstrable need, and is it responsive to the needs of Canadians?
Performance
  1. To what extent were the Program's expected outputs achieved?
  2. To what extent were the Program's expected immediate outcomes achieved?
  3. To what extent were the Program's expected intermediate outcomes achieved?
  4. To what extent were the Program's expected final outcomes achieved?
Efficiency
  1. To what extent is the Program delivering outputs and achieving outcomes as efficiently as possible?
  2. To what extent is the use of resources in executing the Program as cost-effective as possible?
GBA Plus
  1. What are the Program's impacts on GBA Plus groups?

Performance measurement and indicators

The evaluation period covers the Performance Measurement Strategy (PMS) for the A&L component, the PMS for the Research component, and the Performance Information Profiles (PIPs) for the three DRF programs (CSA, ; ; 2017b). Since there are no specific targets for the G&C Program in the PIPs, the PMS indicators along with their targets were used for the Research component. Nevertheless, the previous evaluation (CSA, 2017c) indicated that there was a need to update the Program's PMS with output-related performance indicators, baseline data and targets. For targets that were not defined or where the data did not provide a direct response to the indicator, the - baseline year was used. For the A&L component, the PMS indicators were used, but without specific targets because none was specified. In addition, no baseline year was used since this component was paused following the program review. For the (A&L) and (Research) PMS logic models, see Appendix 1.

4. Methodology

The evaluation was conducted by the CSA's Audit and Evaluation Directorate, with the assistance of Goss Gilroy Inc., between and . An advisory group composed of representatives from the various sectors of the CSA was formed to provide advice, guidance and general direction throughout the evaluation process. The advisory group also provided ongoing feedback on various aspects of the evaluation and the deliverables, including the final report, and assisted in collecting and providing data.

Literature review: The evaluation of the Class G&C Program is based in part on a careful analysis of various documents, including public reports, national academic publications and government publications. This literature review was primarily intended to document the Program's relevance, but it was also used to support performance information in some instances.

Document and quantitative data review: Administrative, operational, and performance data extracted from the Unitas database,Footnote 7 selected data from performance reports in Excel format,Footnote 8 financial data extracted from SAP, A&L-funded student and organization reports, and other internal documents were studied. The main purpose of reviewing these documents and data was to evaluate the Program's performance, but various elements of its efficiency and GBA Plus were also examined.

Internal and external surveys: Two surveys were conducted. One was administered to CSA employees, managers and members of the CSA's senior management. The other was distributed to principal investigators and students who submitted one or more G&C funding proposals (whether funded or not) during the evaluation period. The purpose of the surveys was to collect information about the Program's relevance and efficiency and about performance (Program outputs and A&L outcomes) and GBA Plus.

The internal survey's response rate was similar across the various sectors of the CSA, but slightly lower than expected. However, the survey had exactly the same number of respondents as the internal survey in the previous evaluation (CSA, 2017c).

The external survey's response rate was lower than expected. This may be due to a number of factors, including the COVID-19 pandemic, the fact that the survey was launched just after the Performance Indicator Survey (PIS) and during the end-of-term period for universities, and the large number of non-recipients surveyed. Although potential respondents were informed that the evaluation survey would not overlap with the PIS, some may have chosen not to respond. However, the number of respondents was higher than in the previous evaluation (CSA, 2017c). The main characteristics of the external survey's respondents are as follows: 68% male, 22% female, 0.4% non-binary; 4% self-identified as persons with disabilities; 0% self-identified as Indigenous; and 16% self-identified as members of a visible minority.

Table 3 – Internal survey response rates
Respondent category (Branch)  No. of people surveyed  No. of respondents (%)  Response rate
Space Utilization 22 10 (19%) 45%
Space Exploration 26 13 (24%) 50%
Space Science and Technology 34 18 (33%) 53%
OthersFootnote b 27 13 (24%) 48%
Total 109 54 (100%) 50%
Table 4 – External survey response rates (by component; by recipient/non-recipient; by transfer payment type)
Respondent category  No. of people surveyedFootnote c  No. of respondents (%)  Response rate
Research component 457 176 (78%) 39%
A&L component 475 50 (22%) 11%
Both components (Research, A&L) 2 0 (0%) 0%
Total, components 934 226 (100%) 24%
Recipients 219 89 (39%) 41%
Non-recipients 597 79 (35%) 13%
Both (recipients, non-recipients)Footnote d 118 58 (26%) 49%
Total, recipients and non-recipients 934 226 (100%) 24%
Grants 635 121 (54%) 19%
Contributions 219 83 (37%) 38%
Both (G&C) 20 8 (4%) 40%
Not specified in the database 60 14 (6%) 23%
Total, G&C 934 226 (100%) 24%
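
For reference, the response rates in Tables 3 and 4 are simply the number of respondents divided by the number of people surveyed. A minimal illustrative Python sketch using the Table 3 counts (our variable names; percentages match the table after rounding):

    # Illustrative: response rate = number of respondents / number of people surveyed (Table 3).
    surveyed = {"Space Utilization": 22, "Space Exploration": 26,
                "Space Science and Technology": 34, "Others": 27}
    respondents = {"Space Utilization": 10, "Space Exploration": 13,
                   "Space Science and Technology": 18, "Others": 13}

    for branch, n_surveyed in surveyed.items():
        rate = respondents[branch] / n_surveyed * 100
        print(f"{branch}: {rate:.0f}%")            # 45%, 50%, 53%, 48%

    total_rate = sum(respondents.values()) / sum(surveyed.values()) * 100
    print(f"Total: {total_rate:.0f}%")             # ~50%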

Internal interviews and group discussions: The group interviews helped to provide an in-depth understanding of the Program's activities and the various roles in the G&C cycle (in the branches, Corporate Services or the CEGC, for example) and to corroborate and clarify information obtained through other data sources. A total of 28 individuals from different sectors in the CSA were consulted through 11 individual and group interviews.

External interviews, including the comparative study: External interviews were conducted with three third-party organizations that received A&L funding and were not included in the external survey, and with eight other departments and agencies that have class and non-class G&C programs.

The funded organizations were interviewed to gather information about relevance, performance and efficiency. The other departments and agencies were interviewed to identify good practices and points of comparison with the Program. The following departments and agencies were interviewed:

Table 5 – Number of interviews and respondents by category
Respondent category  No. of respondents  No. of interviews
Internal
  Space Utilization  4
  Space Exploration  5
  Space Science and Technology  8
  OthersFootnote e  11
  Subtotal (internal)  28  11
External
  A&L recipient organizations  3  3
  Other departments and agencies  8  8
Total  39  22

Case studies: Some mini-case studies were carried out as part of this evaluation. They covered AOs of different types and sizes. The aim was to reflect the variety of AOs designed and published by the three branches (SST, SU, SE) and to compare the different characteristics of AOs (length of application period, amount and duration of funding, etc.).

Limitations

The main limitations encountered in the evaluation have to do with the data.

  • At the beginning of the evaluation period, some non-funded proposals were not entered into the database. At that time, Unitas was not widely used, and documents from unsuccessful proposals were simply saved in Livelink. Since those documents were not reviewed, some non-recipients could not be reached for the external survey, and for some AOs, there is a slight overestimation of proposal success rates as some proposals were not included in the analysis of the number of proposals submitted. The number of proposals missing from the database was not determined but is estimated to be small.
  • The PIS collects information from the progress and final reports required for progress and performance evaluation under the Policy on Transfer Payments (TBS, 2008a). Reports are requested, submitted and evaluated via the Unitas system platform (progress and final reports for grants; Medium Format for contributionsFootnote 10) and are improved from year to year. The response rates are very high (96% for the evaluation period). However, the frequent changes in the PIS make it difficult to compare data on an annual basis. This problem was noted in the previous evaluation (CSA, 2017c). In addition, between and , recipients of SCDP Space Technology Development (STD) contributions completed a different report from the PIS called the Final Report – Performance Measures, which collected information about key performance indicators for contribution-funded projects. The data was reported in aggregate (for the entire project), not annually as with the PIS, but it was compiled and available in Excel format. For this reason, STD contribution data is presented separately in this report but aggregated where possible.
  • A small-scale regional analysis was performed using the geographic location of recipients and non-recipients available in the Unitas database, the summary project-team composition data reported in the PIS (to the best of each principal investigator's knowledge), and the responses to the demographic questions in the external survey.
  • At the CSA, Unitas is the main centralized information systemFootnote 11 for G&C operations, contracts, MOUs, and Canadian aerospace industry directory information, and SAP is the reference system for financial data and a number of official lists, such as federal electoral districts and business numbers. However, SAP does not distinguish between solicited and unsolicited project agreements, and data cannot be extracted from SAP by AO. Although a view of SAP financial data is available in Unitas for operational purposes, source data (SAP) was used directly when financial data was examined.

5. Relevance

The objectives of the Class G&C Program are aligned with the federal government's priorities and the CSA's strategic outcomes. The Program is a pillar of the Agency, enabling it to support Canada's space ambitions. The CSA's involvement in space-related learning, awareness, innovation and research through the Class G&C Program is consistent with the federal government's core responsibilities as defined in the Canadian Space Agency Act and the CSA's Departmental Results Framework. In addition, the Class G&C Program continues to address a demonstrable need. The majority of respondents felt that there would be a gap if the CSA's Program did not exist, as it plays a unique role in Canada in the development of space sector capabilities, the advancement of space science and technology, collaboration between stakeholders, and Canada's international presence. Nevertheless, there are needs not met by the Program in its current form.

The Program's relevance was evaluated on the basis of the alignment of its objectives with government priorities and the federal government's core responsibilitiesFootnote 12 and the extent to which it continues to address a demonstrable need in Canada.

Alignment between the Program's objectives and government priorities

The Program supports "knowledge development and innovation in the CSA's priority areas while increasing the awareness and participation of Canadians in space-related disciplines and activities" (CSA, 2021d). It is a transfer payment program designed to provide funding in three main areas of activity: research, space awareness, and space learning (CSA, 2021e). As described above, the Program is composed of a Research component and an A&L component.

Figure 4 – Objectives of the Class G&C Program by area of activity and component

Research component
Research

Support research and development in the CSA's priority areas and targeted knowledge development and innovation to sustain and enhance Canada's capacity to use space to meet the country's future needs and priorities.

A&L component
Space awareness

Raise awareness of Canadian space science and technology by increasing the interest of Canadian youth and educators and their participation in related activities.

Space learning

Provide learning opportunities to Canadian students, educators and physicians in various space-related disciplines.

Alignment with federal priorities

The Program's objectives and activities are aligned with Canada's Space Policy Framework (), in which the government recognizes that space is an important issue and that "[i]t is essential to the national interest [...] that Canada maintain a robust, technologically superior and commercially competitive space industry" (GC, ). Specifically, the Framework identifies five key priorities and core principles to guide Canadian space activities:

  1. ensure Canada's national sovereignty, security and prosperity through the effective utilization of space (e.g., satellite surveillance);
  2. support the domestic space industry (e.g., market the most advanced new technologies that address national interests);
  3. maintain and strengthen partnerships;
  4. support and advance Canadian expertise in selected technology niches (e.g., robotics and telecommunications); and
  5. inspire and motivate Canadian youth to pursue careers in science, technology, engineering and math (STEM) with the aim of developing and sustaining an educated and skilled workforce, by working with industry, universities and colleges (GC, ).

Similarly, the Space Strategy for Canada, announced by the Minister of Innovation, Science and Industry in , recognizes the space sector as a strategic national asset and reiterates the government's commitment to ensuring that Canada remains a spacefaring nation (CSA, 2019a). Accordingly, "Canada seeks to create a vibrant and sustainable space sector [...] that sets a new vision for Canadian space exploration, sees increased partnership with industry to create the jobs of the future, leverages the power of space to inspire youth, and harnesses the potential of space to solve [...] challenges for Canadians" (CSA, 2019a). The Strategy sets out the following five key federal priorities:

  1. ensure that Canada remains a leading spacefaring nation by joining the Lunar Gateway mission;
  2. inspire the next generation of Canadians to reach for the stars;
  3. harness space to solve everyday challenges for Canadians;
  4. position Canada's commercial space sector to help grow the economy and create the jobs of the future; and
  5. ensure Canada's leadership in acquiring and using space-based data to support science excellence, innovation and economic growth (CSA, 2019a).

By supporting space science and technology research, awareness and learning in Canada, the Class G&C Program is aligned with the government's space priorities. For example, AOs are consistent with the federal priorities outlined in Canada's Space Policy Framework (GC, ) and the Canadian Space Strategy (CSA, 2019a), such as Lunar Exploration Analogue Deployment (LEAD), Student Participation in the International Astronautical Congress (IAC), or R&D for Multi Earth Observation Satellite Data Integration (EOADP).

In addition, the government noted in Budget  that aerospace "is an important driver of Canada's innovation economy" and that it will continue to explore opportunities to support Canadian capacity, innovation and jobs in the Earth observation satellite sector as it provides critical services that Canadians rely on and creates high-quality jobs in Canada (GC, 2021a).

Lastly, GBA Plus was incorporated into the CSA's G&C program in response to the federal government's increasing emphasis on the principles of equity, diversity and inclusion. For example, the AO to support student participation in the IAC in specified that for the final selection, the CSA would "consider the applicants having the highest final scores [and] could also take into consideration factors such as a balanced grants distribution across Canada as well as a diversified representativeness among the four designated groups" (CSA, 2019b).

Alignment with departmental strategic outcomes

The objectives of the Class G&C Program are aligned with the CSA priorities set out in the departmental plans and specifically in the DRF (CSA, 2018a):

  • DRF1. Space research and development advance science and technology;
  • DRF2. Canadians engage with space;
  • DRF3. Space information and technologies improve the lives of Canadians;
  • DRF4. Canada's investments in space benefit the Canadian economy.

The Program contributes to these objectives by supporting the development of science and technology, fostering the continual development of a critical mass of researchers and highly qualified personnel (HQP) in Canada, and supporting information gathering and space-related studies and research (CSA, 2021d). Specifically, the Program provides financial support to

  1. organizations to conduct space-related R&D activities in CSA priority areas, in order to sustain and enhance Canada's capacity to use space to meet the country's future needs and priorities (CSA, );
  2. organizations developing initiatives to increase the interest and participation of youth and educators in the Canadian space program; and
  3. postsecondary students and educators who wish to participate in educational events to increase their knowledge and gain experience in advanced space-related educational disciplines (CSA, ).

While the Class G&C Program has its own terms and conditions, it is used to design funding opportunities that are aligned with the objectives of the CSA's three DRF programs, which in turn are aligned with departmental strategic outcomes. Consequently, the majority of employees surveyed said they believed that the Program's objectives were aligned with the CSA's priorities. Key informants stated in interviews that CSA sectors work together to develop AOs that are aligned with departmental priorities, that AO selection criteria are formulated and reviewed to ensure alignment, and that applicants must demonstrate the link between their proposals and the Program's objectives and the CSA's priorities. However, some employees and members of senior management noted that the Program's medium- and long-term strategic planning could be strengthened to ensure that G&C activities are more directly aligned with the CSA's objectives, since G&C planning can be challenging (see the Efficiency section).

The inclusion of GBA Plus in the G&C and the AO writing guide illustrates the CSA's commitment to advancing equity, diversity and inclusion, consistent with the CSA's Policy on Gender-Based Analysis Plus (CSA, 2017a) and the principles of the Dimensions Charter,Footnote 13 signed by the CSA's President in .

Alignment with federal government's core responsibilities

The Canadian Space Agency Act states that the CSA's mission is "to promote the peaceful use and development of space, to advance the knowledge of space through science and to ensure that space science and technology provide social and economic benefits for Canadians" (s. 4). The Program contributes to the CSA's mission because its objectives and activities are aligned with two major functions assigned to the CSA in its founding legislation. It is the CSA's responsibility to "plan, direct, manage and implement programs and projects relating to scientific or industrial space research and development and the application of space technology" (s. 5(2)(b)). The CSA may also

"make grants and contributions in support of programs or projects relating to scientific or industrial space research and development and the application of space technology, including projects designed to develop, test, evaluate or apply new or improved processes, products, systems or information relating to space science and technology with a view to determining the commercial potential of that science and technology, but not including any programs or projects relating solely to the commercial exploitation of space science or technology" (s. 5(3)(c)).

Under these provisions, the CSA supports space-related R&D in Canada and is empowered to design and support initiatives for space-related research, study and information gathering (CSA, 2017c). In addition, in keeping with its mandate to advance the knowledge of space through science, the CSA encourages Canadians to participate in awareness and learning activities to enhance Canada's expertise in major space niches and support the development of a critical mass of space researchers and HQP. The Class G&C Program is one way in which the CSA can use the unique appeal of space to encourage students to pursue STEM education and careers and promote space science and technology literacy in Canada (CSA, ). The Program thus contributes to ensuring Canada's presence in space, in accordance with the CSA's core responsibility identified in the DRF (CSA, 2018a).

Continued need for the Program

Data collected through the document review, the internal CSA employee survey and the external survey of funding recipients and non-recipients indicates that the Program continues to meet a demonstrable need in Canada. In the surveys, 94% of CSA employees and 97% of recipients and non-recipients said they felt very strongly that there was a continued need in Canada for the CSA's Class G&C Program, as it is a unique framework in the Canadian ecosystem to support space R&D and space S&T awareness and learning. Among external respondents, the rates are 97.7% for the Research component and 94% for the A&L component. Both components of the Program are in high demand and receive numerous funding applications. This observation is consistent with the results of previous evaluations, which stated that the Program "provides the financial support in areas in which other sources of funding do not exist" (PWGSC, ) and "is the only federal program entirely dedicated to the development of the space sector" (CSA, 2017c).

Specifically, the data collected in the evaluation showed that the Program primarily addresses the following three interrelated needs.

Advancing space science and technology. The majority of recipients and non-recipients stated that the CSA's Program is a unique framework for supporting space R&D in Canada, and that non-space-specific programs would be hard-pressed to fulfil that role. The Program stimulates the implementation of research projects, knowledge development, and space-related technology advances for the benefit of Canadians. A number of internal and external respondents noted that the CSA's Program played a beneficial role by providing funding in areas where funding is limited (e.g., planetary science research), offered a unique opportunity to test prototype space platforms, and funded space-related R&D in the early stages of technology development, which may be of limited interest to industry and difficult to justify for organizations that have to focus on short-term needs. Its role in Earth observation was also repeatedly emphasized by respondents.

When external respondents were asked what happened to their project(s) that were not funded by the Research component (113 respondents), 32% reported that their project (or at least one of their projects) did not go ahead. Of the respondents who had at least one non-funded project, only 3% said that their non-funded project was completed as planned, and about 26% responded that their project was completed, but its size and/or timing was affected. Those projects used funding from other federal programs (e.g., DND's Innovation for Defence Excellence and Security (IDEaS) program, the NRC's Industrial Research Assistance Program (IRAP)), NPOs, internal resources, universities, provinces, and/or international funders. In addition, 35% of respondents with at least one non-funded project stated that their project might or might not go forward depending on future funding opportunitiesFootnote 14. For more information on the Program's complementarity with other federal programs, see the Efficiency section.

Building the Canadian space sector's capacity. The CSA's Program is the only program that provides dedicated funding to organizations that play a key role in the Canadian space sector. In so doing, it enhances the innovativeness, competitiveness and vitality of Canadian space companies, which in turn has a positive impact on the Canadian economy. It also plays an important role in the development of space science and technology HQP in Canada, as it funds unique awareness and learning opportunities for Canadians (e.g., training the next generation of Canadian scientists and engineers who will work on crewed space missions). The majority of A&L recipients felt that without the Program, they would not have been able to participate in space-related activities.

Supporting collaboration and ensuring Canada's presence on the international scene. Consistent with federal priorities and expected outcomes, the Program creates and supports unique opportunities for collaboration at the national and international levels between federal agencies, academia, industry and NPOs. In addition, many external respondents stated that the Program enhanced Canada's reputation in certain important niches (e.g., space robotics) and strengthened Canada's international presence. The data collected indicates that without the Program, academia and industry would have greater difficulty in creating opportunities for space collaboration and education (e.g., access to National Aeronautics and Space Administration and European Space Agency infrastructure) and that space companies would face additional challenges in accessing the international market.

Unmet needs and areas for improvement

Although the target groups' satisfaction with the Program is high (92% of recipients and 54% of non-recipients) and the Program meets the majority of needs, it is important to note that there are needs that are not met by the Program in its current form. While the CSA holds regular consultations to identify the target groups' needs, employees indicated that the Program's terms and delivery had changed little since and that the Program could be made more responsive to current needs, particularly as regards the A&L component. This component is in high demand: the success rate of A&L solicited proposals is only 16% (for more information, see the Performance section). The results of the survey of funding recipients and non-recipients reflect this, with 67% of respondents indicating that the Research component was strongly aligned with their needs, compared with 52% for the A&L component. The target groups' satisfaction rate was higher for the Research component (82%) than for the A&L component (52%).

Figure 5 – Alignment between the Program and the needs of the target groups that responded to the survey (in percentage)
Research component A&L component
High 67 52
Moderate 18.2 28
Low 8.5 8
Non-existent 2.3 4

Internal key informants indicated that there was room for increased coordination and cooperation between CSA sectors, and between the CSA and external stakeholders, in designing and implementing funding opportunities that would be more responsive to communities' needs, such as a streamlined application process, recurring AOs, and AOs timed to suit the circumstances of the targeted individuals. For more details, see the Efficiency section.

In addition, the evaluation identified the continued need to make the space sector more inclusive. Although GBA Plus was incorporated into G&C, funding opportunities were considered accessible and equitable by half of the internal and external respondents. Specifically, employees and funding applicants expressed concern that the Research component was unable to fund researchers outside an institutional context, and that the current processes might inadvertently make it difficult for new players and small businesses to access the Program. For example, it was suggested that funding opportunities reserved for underrepresented groups be established; that applicants' names be withheld from peer review to avoid unconscious bias; that the number of space-related awareness, training and collaboration opportunities be increased, with a greater emphasis on A&L; and that ways of supporting projects differently (e.g., access to facilities and mentoring) be explored.

Lastly, the data collected indicated that it would be beneficial to clarify and better communicate the technology readiness levels (TRLs) and the types of activities that can be supported by the Program, within the limits of the CSA's mandate. Some internal respondents stated that the CSA could better meet the needs of target groups that receive no funding at a particular TRL, while others stated that the assistance provided by the CSA was already within the limits of its powers under its founding legislation.

6. Performance

Overall, the Program is meeting the targets with respect to outputs and outcomes. The number of proposals submitted and the success rate varied from year to year, but the number of AOs increased overall. On the other hand, the number of projects receiving funding (new and ongoing projects) increased every year, with projects being funded for more than one year. The majority of recipients were universities, but a growing number of private companies received funding. There were some recurring players, but efforts had been initiated to make funding more accessible. Space-related knowledge, capacity development, and collaboration increased. However, operational and performance data could be better structured if the available tools were harnessed, and some information was missing from the central database. The previous evaluation also identified the need to improve the data entry process to ensure the validity of the data.

Expected outputs delivered

The Program delivers the expected outputs based on the logic model for each component (Appendix 1). However, information and data on outputs (e.g., publication of AOs, production of progress and final reports by funding recipients) were not readily available or complete.

The funding priority list is established by each branch and consists of the published AOs. AOs are defined in accordance with DRF program priorities and available budgets, consistent with the CSA's broader mandate and departmental priorities, and generally in consultation with the community and other branches, as noted above. The people interviewed for this evaluation considered this approach to be effective. In addition, the audit report (CSA, 2020a) found that funding opportunities were appropriately planned. Nevertheless, a number of internal and external respondents stated in their open-ended responses that there was room for improvement in external engagement and consultation with stakeholders and the various scientific and research communities in order to identify priority themes, be more needs-oriented, and develop AOs accordingly. The previous evaluation suggested that the CSA should standardize the process for developing funding priority lists that apply to all AOs (CSA, 2017c). It was also noted that programs were considering new areas of interest and that branches were taking steps in their AOs to promote representation of all groups (GBA Plus).

During the evaluation period, there was a gradual increase in the number of AOs published: 49 in total for the Research component and 11 for the A&L component, a significant increase for the Research component over the total of 10 for the previous evaluation period (five years) (CSA, 2017c). Some AOs are recurring, with minor changes from year to year but no fixed date, while other AOs are unique. The AO information in Unitas did not always indicate the component or the branch; there are few mandatory fields in the Unitas system. In addition, there is no clear list of unsolicited proposals in the central database: unlike solicited proposals, unsolicited proposals are not regarded as a process and therefore are not systematically saved in the database.Footnote 15

Figure 6 – Number of AOs by component and year
- - - - - -
Research 7 6 5 9 11 11
A&L 1 0 2 2 1 5

Regarding communications, about half of the employees and funding applicants surveyed indicated that there was room for improvement in engaging with potential recipient communities and raising the Program's profile, for example by informing the community more proactively about funding opportunities (an expanded communications strategy, targeted distribution lists) and by having at least some regular recurring AOs on pre-set dates. In addition, even though the AOs are posted on the CSA website, the information could be made more easily accessible (from the home page, with searchable information based on filters instead of a keyword search). In fact, the - report on Canadian academic capacity in space research (J.E. Halliwell Associates Inc., ) also noted that academic institutions would like more effective communication between the CSA and the academic community, and found that communications between the CSA and research communities could be better structured and more direct. The report noted that academic institutions would welcome greater interaction with the CSA, including a clear point of contact with the Agency, and suggested broader engagement across Canada. To that end, the study recommended the establishment of formal liaison roles, regional meetings between the CSA and universities to discuss strategic issues, and the strengthening and expansion of the work of the CSA's scientific advisory committees.

The 1,315 solicited proposals received were evaluated and ranked within each branch, based on the applicant's eligibility and the criteria defined for each AO. A ranked list of eligible applications is produced for each AO within the branches (some of this information was entered into Unitas, and this became more systematic during the evaluation period). The number of proposals received varied over the years but showed an upward trend overall for both components. According to the available data, the overall success rate for G&C solicited proposals during the evaluation period, as calculated during the evaluation, was 38% (total of 495 proposals selected). The success rate was higher for contributions (44%) than for grants (34%), reflecting the low proportion of applications accepted in the A&L component (16%), which is in high demand. The success rate varied depending on the type of opportunity and the number of proposals received for each AO. Because the number of AOs varied from year to year and the number of proposals received was different for each AO (depending on the type of AO and/or for various reasons, such as time of year or duration of posting), it was difficult to obtain an annual portrait or make a comparison across AOs and even across branches. However, the overall picture over the evaluation period provides a general idea. It is also important to keep in mind that the success rate is based on the data available in Unitas; some non-funded proposals are not listed in the database, and some proposals are not linked to an AO, which results in slightly higher success rates. In addition, the total number of unsolicited proposals received cannot be estimated with the Unitas data because only funded proposals are included (most of them are there, though some are missing).

Figure 7 – Success rate for solicited proposals by funding type and component, - to - (number of proposals)
              Grants  Contributions  Research  A&L
Not selected  556     264            404       416
Selected      289     206            415       80
Success rate  34%     44%            51%       16%
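
The success rates above are calculated as the number of proposals selected divided by the total number of proposals recorded in Unitas (selected plus not selected). A minimal illustrative Python sketch using the Figure 7 counts (our variable names; rates match the published figures after rounding):

    # Illustrative recalculation of the Figure 7 success rates: selected / (selected + not selected).
    counts = {              # category: (selected, not selected)
        "Grants":        (289, 556),
        "Contributions": (206, 264),
        "Research":      (415, 404),
        "A&L":           (80, 416),
    }
    for category, (selected, not_selected) in counts.items():
        rate = selected / (selected + not_selected) * 100
        print(f"{category}: {rate:.0f}%")          # 34%, 44%, 51%, 16%

    # Overall: 495 of the 1,315 solicited proposals were selected.
    print(f"Overall: {495 / 1315 * 100:.0f}%")     # ~38%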

For solicited proposals in the Research component, universities had a higher success rate (58%) than private companies (45%). At least one proposal was submitted during the evaluation period from each province and territory (see figure below). The success rate was similar across the provinces and territories (52% on average) but varied widely (0-100%) for regions with fewer proposals (between 1 and 10 proposals).

Figure 8 – Success rate for solicited Research proposals by applicant category
              Private companies  Universities  Others (see note f)
Not selected  231                158           15
Selected      191                219           5
Success rate  45%                58%           25%
Figure 9 – Success rate for solicited Research proposals by province and territory (number of proposals)
              ON   QC   AB   BC   SK   MB   NL   NS   NB   YT   NT    PE    NU
Not selected  164  108  41   43   17   9    9    5    5    2    0     0     1
Selected      176  110  46   36   19   11   5    5    4    1    1     1     0
Success rate  52%  50%  53%  46%  53%  55%  36%  50%  44%  33%  100%  100%  0%

Provinces and territories with fewer than 10 proposals submitted: NB 9, YT 3, NT 1, PE 1, NU 1.

The number of proposals submitted and the number of proposals selected by region was representative of the Canadian space sector.

Figure 10 – Distribution of submitted and selected Research proposals (- to -) relative to the total Canadian space workforce by region (percentage)
                              Ontario  Quebec  Prairies  Atlantic  B.C.  Territories (see note g)
Proposals submitted           41.51    26.62   17.46     4.15      9.65  0.61
Proposals selected            42.41    26.51   18.31     3.61      8.67  0.48
Space workforce (see note h)  42.30    33.26   10.10     7.67      6.67  0

For solicited A&L proposals, primarily from university students, Ontario accounted for almost half of the applications, but Manitoba had the highest success rate (24%). There were no proposals from the three territories.

Figure 11 – Success rates for solicited A&L proposals by region, - to - (number of proposals)
               ON   QC   BC   AB   MB   NL   SK   NS  PE  NB
Not selected   186  88   50   46   17   9    6    7   3   3
Selected       43   16   8    7    4    1    1    0   0   0
Success rates  23%  18%  16%  15%  24%  11%  17%  0%  0%  0%

Provinces and territories with fewer than 10 proposals submitted: SK 7, NS 7, PE 3, NB 3.

During the evaluation period, there were 463 new signed agreements, a significant increase from the 195 agreements for the Research component in the previous evaluation period (five years) (CSA, 2017c). Ontario (44%) and Quebec (23%) were the top recipients year after year (AB 12%, BC 10%, SK 4%, MB 4%), while some regions had only one signed agreement during the period (PE, NT and YT in -). The majority of agreements signed during the evaluation period were for solicited proposals (89%) and research proposals (83%). The number of signed agreements based on unsolicited proposals decreased over the period from 13 per year to 1 per year (based on available data).

Figure 12 – Number of new agreements by proposal type, component and year

By proposal type (number of agreements signed)
          Solicited  Unsolicited
Research  351        33
A&L       73         6

By component (number of agreements signed)
          -   -   -   -   -   -
Research  78  55  56  58  66  71
A&L       14  0   22  12  10  21

By year (number of agreements signed)
             -   -   -   -   -   -
Solicited    79  49  72  63  70  91
Unsolicited  13  6   6   7   6   1

There was no significant difference in the geographic distribution of agreements between the beginning of the evaluation period and the end, even though the branches were more aware of GBA Plus issues, including regional diversity. The fact that Ontario and Quebec accounted for a large proportion of the proposals submitted and agreements funded in both the Research and A&L components is attributable to the fact that most space-sector companies are based in those two provinces, which also have a large number of postsecondary institutions. According to the State of the Canadian Space Sector Report  (CSA, 2019c), "[t]he majority of STEM employees [in the space sector] can be found in Ontario and Quebec, which accounted for 43% (2,486 FTEs) and 26% (1,510 FTEs) of Canada's STEM workforce, respectively."

Figure 13 – Number of new agreements by region and year

By year
    ON  QC  AB  BC  SK  MB  NS  NL  NB  PE  NT  YT
-   48  21  11  6   2   3   0   1   0   0   0   0
-   24  11  7   10  1   1   0   1   0   0   0   0
-   33  11  17  4   9   3   1   0   0   0   0   0
-   26  20  3   6   2   3   3   2   1   1   1   1
-   33  14  9   10  2   4   2   1   1   0   0   0
-   39  27  8   9   2   3   1   1   2   0   0   0

By region
                      ON   QC   AB  BC  SK  MB  NS  NL  Others (NB, PE, NT, YT)
Number of agreements  203  104  55  45  18  17  7   6   7

Nearly 61% of the research projects for which funding agreements were signed during the evaluation period were carried out by university researchers, compared with about 38% by private companies, which means that more private companies received funding than in the previous evaluation period. Few researchers from other types of organizations (e.g., NPOs, colleges, research centres) applied to the Program or received funding (less than 1%). For A&L, the CSA supported, via AOs, 73 students who attended conferences under the SCDP's STEDiA initiative,Footnote 16 and it funded six unsolicited projects from four universities and two NPOs.

According to financial data, the Agency disbursed $117,906,570 in G&C funding during the evaluation period under 488 agreements, a mix of new agreements and ongoing agreements funded over more than one year (74%). Of those agreements, 440 were for research projects (98% of the budget) and 48 for A&L projects or activitiesFootnote 17 (2% of the budget). The SCDP disbursed most of the funding (both in amount and in number of agreements), in the form of both grants and contributions.

Figure 14 – Amounts disbursed by year (new and ongoing agreements)
- - - - - -
Number of agreements 142 151 171 182 207 243
Amount disbursed $11,053,445 $16,764,274 $21,016,866 $19,181,536 $26,399,276 $23,490,533

Figure 15 – Amounts disbursed by type of transfer payment and program (new and ongoing agreements)

Contributions
SCDP SUP SEP
Number of contribution agreements 122 45 10
Amount disbursed $51,464,544 $12,997,501 $2,696,433

Grants
SCDP SUP SEP
Number of grant agreements 176 68 64
Amount disbursed $24,632,870 $18,305,490 $7,578,313

Regarding progress and final reports, almost all recipients submitted their reports annually during the evaluation period (96% on average) via the PIS. The completion rate of the new "Medium Format" report for contributions was somewhat lower (69%) in its first year, as earlier reports had been submitted in another format, such as Word or Excel (STD contributions). As these reports are mandatory under the Policy on Transfer Payments (TBS, 2008a), recipients who did not use the PIS submitted their reports in another format. Students submit a report in Word or PDF format after their activity. The timeframe for assessing reports is governed by service standards, as reports must be assessed before payments are made to recipients (for contributions). The service standards for payments are four weeks for grants and six weeks for contributions. Under the report assessment process, reports are assessed by the scientific or technical authority and then by the program authority. However, report assessment information does not appear to have been entered correctly or consistently into the central database in all cases, which makes the data unusable for analysis. Assessment of student reports is not done in Unitas at this time. It was noted during the evaluation, however, that the PIS was sometimes perceived as just a performance survey, since it is also used for contracts, and that its connection with the obligations in G&C agreements under the Policy on Transfer Payments (TBS, 2008a) was overlooked.

Expected immediate outcomes achieved

The Program is achieving its immediate outcomes: the data clearly show an increase in knowledge, in activities and focus on space, and in access to partnership and collaboration. However, some indicators (e.g., the proportion of leveraged funds) are not measured directly in the PIS, which makes the data difficult to analyze. The information collected in reports from students and organizations offering learning activities was useful for analysis. However, the questions asked in those reports are not standardized and are mostly open-ended.

There was an increase in knowledge through research projects conducted in the space S&T priority areas: the number of new and ongoing projects rose by 48%, and the number of projects reporting an increase in knowledge rose by 121%. The most frequently reported achievements were a technological or scientific breakthrough (65%; 95% for STD contributions) and the use of satellite data (65%). For the A&L component, 89% (16/18) of the external survey respondents indicated that attending conferences or learning events increased their awareness and knowledge of space-related science, technology or issues. Analysis of a sample of 42 STEDiA reports (42/75; 21 women, 21 men) indicates a high level of satisfaction among funded students. In addition, recipient organizations that conducted learning activities stated in interviews that, based on the information gathered through their surveys and informal feedback, they strongly believed those activities foster knowledge development and increased awareness among participants (both K-12 and postsecondary students).

There was increased emphasis on space in universities, private companies and NPOs due to the signing of new agreements and the advent of new players. During the evaluation period, a total of 146 new research G&C agreements (for both solicited and unsolicited proposals) were signed with 65 different private companies, 233 new agreements with 37 different universities, and 5 new agreements with 2 NPOs, 2 colleges/schools/CEGEPs, and 1 medical centre/hospital. For reference, the State of the Canadian Space Sector Report (CSA, 2019c) indicates that in , 74 space companies were engaged in R&D activities. With regard to learning opportunities, the availability and use of the space theme increased as 11 AOs provided opportunities for students to attend conferences or learning events and three new A&L projectsFootnote 18 were carried out by third-party organizations for younger students (K-12 (Grade 12 is equivalent to Secondary V in Quebec)) and postsecondary students. However, a few external and internal interviewees indicated that the CSA could do more to promote the focus on space by supporting college students, helping postsecondary students to attend a wider variety of events, increasing support to NPOs for the delivery of A&L activities for youth, and fine-tuning the coordination of activities between the CSA (e.g., Junior Astronauts campaign) and other STEM stakeholders.

New recipients entered the space research field. Interviewees noted that because of its specific focus on space, the Program served a relatively small community of potential recipients. Internal respondents and interviewees expressed concern that the main recipients were established organizations already familiar with the CSA's G&C processes. However, even though the space sector is a small community and some organizations are well-established, more than half of the private companies (34/65), and more than half of the university researchersFootnote 19 identified as principal investigators under agreements signed with university recipients (88/143 individuals), had only one agreement during the evaluation period. In addition, 15% of the organizations that signed an agreement during the period were receiving funding for the first time. Nevertheless, a number of recipients were repeat players (two or more agreements during the period: universities 38%, private sector 48%).

Figure 16 – Proportion of Research component recipients with one, two, or three or more signed agreements, - to - (percentage)
1 agreement 2 agreements 3 agreements or more
University researchers 62 27 11
Private companies 52 19 29

Only two student recipients were funded more than once (2/71; 73 grants). In addition, 56% of the projects on average each year reported bringing new actors into the field of space research. In - and -, research teams had nearly 2,000 members who were new to space work (an average of 6.2 new members per team in the two years).

Partnerships were formed or maintained, organizations had access to international collaboration, and funding was leveraged. Most research projects (87%) were collaborative in nature, involving the maintenance of existing partnerships or the formation of new ones (79% for STD contributions). There was a 101% increase over the period in the number of organizations (Canadian and foreign) involved in funded projects (recipients and partners directly involved in the projects). In general, partnerships with universities (Canadian and foreign) were the most common, followed by partnerships with the private sector, federal entities and foreign research centres.

  • About 50 different Canadian universities, 160 different foreign universities in 28 countries, 60 different Canadian private companies and 24 different federal organizations (other than the CSA) were involved in direct partnerships.
  • About 150 foreign universities, 70 foreign research centres, 51 private companies and 40 foreign companies were involved in other partnerships (not directly related to the research team).Footnote 20

In addition, reports from students who received funding from the CSA to attend a conference showed that they were particularly enthusiastic about the networking that takes place at space agency conferences and events. A few students noted that there could be greater opportunities for CSA-supported participants to network with each other at conferences and events in order to expand connections among those Canadian students and opportunities for future collaboration.

The number of projects reporting leveraged funding (CSA funding leveraged other funding), including international funding, increased over time, but the proportion declined (from 67% to 42%). A total of 48% of the projects indicated that they had obtained leveraged funding. However, the configuration of the PIS made it impossible to determine the amount of funding obtained. Assistance stackingFootnote 21 is part of the financial audit requirements for contributions, and that information is provided via a Word form. However, because stacking information is not aggregated or compiled, it was not accessed for the evaluation. Moreover, the proportion of leveraged funds, despite being included in the departmental plan indicators for -, - and - (CSA, 2017d; 2018c; 2019e), is not included in the results reports for those years (CSA, 2018b; 2019d; 2020b). This points directly to the need to develop electronic surveys, reports and forms with good programming logic to avoid errors and ambiguities, and the need to save data directly into a common database for easy access, analysis and reporting.

Expected intermediate outcomes achieved

Intermediate outcomes flow directly from the immediate outcomes described above. The availability of space-related knowledge and information in priority areas increased; space S&T capacity in targeted sectors increased; and there was more collaboration, both multidisciplinary and between institutions.

Over the evaluation period, the increased dissemination of information and knowledge was maintained (slight fluctuation over six years). An average of 615 publications and 1,097 presentations per year were generated by research projects, according to PIS data.Footnote 22 Articles were the main form of publication: 79% of reports completed by funding recipients indicated peer-reviewed articles acknowledging CSA funding, and 41% of reports indicated peer-reviewed articles made possible by CSA funding (31% of STD contribution projects). In addition, 86% of PIS reports and the final reports for 44% of STD contribution projects indicated that presentations had taken place. The data, however, do not speak to the "reach" of the dissemination effort.

Figure 17 – Number of publications and presentations reported annually (PIS)
- - - - - -
Publications 613 592 676 526 678 602
Presentations 1,166 1,055 1,033 1,061 1,237 1,032

HQP development increased. PIS data show an increase in the number of HQPFootnote 23 over time, with a spike in -, though that was probably due to the introduction of the Medium Format PIS for contributions. Excluding the last year of the evaluation period, an average of 421 HQP were involved in research projects each year. The project teams were composed mainly of technicians (22%) and graduate students (master's and doctorate; 22%), for an annual average of 14.5 members per project team (all categories). There was an increase in participation by all major groups during the evaluation period. Those values are primarily attributable to universities, which accounted for 90.1% of the PIS data (private companies 7.8%, NPOs 1.7%, and others less than 1%) over the six-year evaluation period. The training and development of HQP was one of the things that both external and internal respondents liked most about the Program. Moreover, the results of the PIS final reports (grants) confirmed the strengthening of student capabilities through research projects: almost all projects involving students reported opportunities for the student participants to acquire science skills and general research skills (teamwork, confidence, flexibility, communication, ethical conduct, etc.).

Figure 18 – Portrait of the composition of research project teams (PIS)
Graduate students Technicians Undergraduate students Scientists (all disciplines) Engineers Postdoctoral students Others
Percentage 22 22 15 14 13 12 2

The Other category includes management, administration, college or CEGEP students, and health professionals.

Of the funded postsecondary students who participated in the external survey, some (39%) stated that they were now graduates of space-related programs (mostly bachelor's degree programs). The - report on Canadian academic capacity in space research (J.E. Halliwell Associates Inc., ) identified nearly 1,800 university students and researchers working in space-related fields, roughly double the number in the - inventory. Much of that increase was in new fields and emerging research interests and opportunities. The report concluded that there were many more non-traditional space-related fields and more multidisciplinary space-related fields in the - inventory. This information provides some context concerning the growth of space-related research fields but does not indicate whether the CSA's G&C funding can be considered to have contributed to that trend. Since the amount of funding for A&L activities was modest, A&L could not have contributed substantially to an increase in HQP. Moreover, the number of students trained in research projects probably exceeds the number in the A&L component and, as one internal respondent noted, these two groups may actually overlap (e.g., students supported by AOs under the STEDiA initiative also being involved in research projects).

With regard to gender composition, in - and -, about 30% of grant-funded project team members were women, with a few individuals identified as gender-fluid, non-binary and/or two-spirited.Footnote 24 This is comparable to the composition of the space workforce.

  • According to the State of the Canadian Space Sector Report (CSA, 2019c), in , Canadian space companies hired 741 employees, 26% (196) of whom were women and 74% (545) were men.
  • According to Statistics Canada (), 30% of STEM graduates were women.

However, the figure was lower for contribution-funded projects (18% (n=42) in PIS reports in -; 14% (n=73) for STD contributions over the period).

Multidisciplinary collaboration increased between - and -. In total, 51% of PIS reports indicated multidisciplinary research, while 35% of STD contribution projects indicated involvement with partner organizations and team members from multiple disciplines. Collaboration between institutions (partnerships) also increased (see the subsection on expected immediate outcomes achieved). In fact, access to networking, partnerships and collaboration was one of the things that both external and internal respondents liked most about the Program.

For students and youth, interest in space-related disciplines was supported. STEDiA-funded students surveyed indicated that their participation heightened their interest in space research or space-related fields (83% of respondents) and encouraged or enabled them to participate in additional space-related activities or training to a large extent (61%). Although based on a small sample (23%, 18 student recipients/80 students (recipients and non-recipients) surveyed), these observations are also supported in the sample of post-conference reports analyzed (56%, 42/75). Although there is no reference list of fields of study targeted by the CSA, the activities carried out under the STEDiA initiative tend to encourage students from a wide range of disciplines to pursue a career in space.

In addition, although not systematically reported, the A&L-funded organizations interviewed found, through a survey of past participants,Footnote 25 that their activity programming had helped heighten participants' interest in space and encouraged them to take part in other space-related activities; some past participants had gone on to higher education and careers in the space field.

Lastly, the elementary, secondary and postsecondary student target audience was reached through S&T learning activities and materials.

  • Eighty university studentsFootnote 26 were selected through the STEDiA initiative to attend conferences.
  • Some 70,000 elementary school students and nearly 49,000 secondary school students completed the Let's Talk Science Living Space program, including students from schools in underrepresented communities, such as Indigenous jurisdictions, and in remote communities (e.g., in northern Canada (Northwest Territories and Yukon)).
  • The Canada-Norway Student Sounding Rocket (CaNoRock) – a partnership between the University of Alberta, the University of Calgary, the University of Saskatchewan, the Royal Military College, the University of Oslo and the Andøya Space Centre in Norway – provided 60 Canadian students with the opportunity to participate in weeks of training in Norway.
  • The Canadian Satellite Design Challenge Management Society's eponymous challenge (CSDC) reached as many as 800 university students through four organized challenges.

The CaNoRock program spawned additional developments in educational activities at participating universities, including programming at the graduate level to create a pathway for students to continue their studies in a space-related discipline.

Expected final outcomes achieved

There is no indicator associated with the Research component's final outcomeFootnote 27 in the PMS (CSA, ), but the PMS indicates that all intermediate outcomes should contribute to the long-term outcome. Canada should therefore have the capacity to conduct space R&D and have sufficiently advanced space knowledge and information to meet national needs and priorities (CSA, ). The PIS Final Report, which is completed only by grant recipients, collects data on the final impacts of funded projects.

In addition, it is very difficult to measure the A&L component's final outcome indicatorsFootnote 28 with the available data because the two indicators defined in the PMS (CSA, ) – "[number] and proportion of recipients that report [...] subsequent selection for internship [or] employment in space-related disciplines [or] subsequent provision of services in space-related disciplines" and "level of awareness of targeted audience reported by recipients" – are not part of the information collected in the reports from recipient students or organizations. However, the external survey and interviews administered as part of the evaluation did provide some data. Nevertheless, fostering continued development in space-related disciplines should help ensure a critical mass of HQP in areas relevant to CSA priorities, and that critical mass of HQP should be available for future internships, jobs or service delivery in space-related disciplines (CSA, ).

Therefore, based on the intermediate outcomes, some of the PIS Final Report data, external survey and interview data, and the Program's relevance discussed above, the evaluation concludes that the following elements contribute to meeting national space-related needs and priorities, sustaining and strengthening the capacity to conduct space R&D and operations, and increasing S&T awareness:

  • The Program addresses the needs of recipients in the field of space R&D in Canada and is aligned with federal priorities.
  • Completed projects will continue to generate publications and presentations.Footnote 29
  • The majority of projects involved a significant number of university students, who made up 56% of the research teams and contributed to the development of their expertise and skills.
  • Partnerships formed after project completion will persist.
  • Funded university students sought (44%) or obtained (22%) employment in a space-related field, and some CaNoRock and CSDC participants pursued graduate studies and careers in the space field.
  • After attending a conference, most funded university students (89%) had an increased level of awareness and knowledge of S&T and space science.
  • The activities carried out by A&L-funded organizations raised awareness among youth and postsecondary students on various space-related topics and career options in the space field.

It is important to keep in mind that the G&C Program's final outcomes contribute to the achievement of the CSA's strategic outcomes and ultimately to its core responsibility (see the Relevance section).

However, a few internal and external interviewees noted that there were concerns about the continuity of funding. The previous evaluation noted that, according to key informants and recipients, the lack of continuity of funding in areas targeted by previous AOs had reduced the ability of some R&D projects to reach a higher TRL and exploit their knowledge acquisition potential. In addition, one interviewee noted that to create a continuum between youth interest in STEM, postsecondary education in a space-relevant discipline, and space-related employment, it is necessary to maintain A&L activities over time. A number of internal and external survey respondents pointed out the A&L component's importance in the space ecosystem.

Lastly, little information is available on the impacts of CSA funding on the various GBA Plus groups. Only 9% of external survey respondents could confirm that the funding they received benefited various groups in some way.

7. Efficiency

Many of the processes and procedures are well-established and working smoothly. However, there are some areas that need improvement to make the Program even more efficient and to increase synergies. Although improvements have been made in the Program over the past six years, some of the observations in this evaluation are similar to those made in the previous evaluation.

Administration, management and planning of the Program

The administration and management of each AO and agreement fall within the purview of the branches responsible for the activities, not the Program. One of the things that both internal and external respondents liked most was the Program's flexibility, both in the types or themes of projects funded and during project implementation.

Most external respondents were satisfied or very satisfied (78%) with the Program and appreciated the increase in funding opportunities. From a regional perspective, the level of satisfaction was similar in all provinces/territories.

Figure 19 – Overall satisfaction with the Program by region (percentage)
ON QC BC AB NL SK MB NS NB NT
Satisfied 74 81 76 93 83 73 100 60 100 100
Dissatisfied 21 16 15 0 8 27 0 40 0 0
Don't know 5 4 9 7 8 0 0 0 0 0

In particular, recipients were highly satisfied with the Program's administration: payments, amendments and follow-ups were handled properly and in a timely manner (83% to 89%); the support provided for the implementation of the agreement or during the agreement was adequate (85%); and the reporting requirements were very or extremely reasonable (72%; higher for contributions (80%) than for grants (69%)). This is probably due, in part, to the posting of service standards on the CSA website, which helps manage timeliness expectations. Following the audit report (CSA, 2020a), an annual reminder has been sent to the branches regarding the importance of capturing information in the Unitas system to meet obligations under various policies (e.g., Policy on Transfer Payments (TBS, 2008a), Policy on Service and Digital (TBS, ), TBS's Management Accountability Framework). All service standards were met in -. However, although there is an automated report that extracts service standard results from the Unitas database, previously published results cannot be replicated: missing data was added manually to the extraction files and other data was corrected there, while the missing or erroneous data at the source (in the central database) was never added or corrected.

According to external respondents, the reporting requirements were similar to or better than (63%) those of other programs (federal, provincial or non-governmental organization programs), indicating that the online reporting platform (PIS) was convenient and the Agency provided flexibility and requested reasonable information. That being said, some respondents mentioned that the reports asked for more detailed information than those of other programs. The reporting process was described as straightforward by the majority of external respondents and was one of the things they liked most about the Program.

CSA employees were somewhat less satisfied with the Program (61% satisfied or very satisfied) overall than people outside the Agency. Employees were less satisfied with the Program's administration: only 13% of internal respondents agreed that operational processes were optimal; 44% indicated that there was insufficient capacity (number of staff, budget and expertise) in their area to do the job; and 48% said that their team had the necessary tools and technical resources. With respect to operational processes, a common concern raised by employees surveyed and interviewed was the complexity of the processes and the cumbersome governance. The previous evaluation also identified the problem of cumbersome internal processes. A few internal interviewees specifically suggested that agreement approval should be delegated to a lower level and that the possibility of taking a risk-based approach should be explored. To streamline governance, a new charter for delegating agreement approval to a lower organizational level was approved during the evaluation and has been in effect since . Internal respondents also suggested more standardization and coordination between different branches to streamline internal processes, as working in silos was one of the things they liked least about the Program. Under the Policy on Service and Digital (TBS, ), Class G&C Program processes may be reviewed in the near future. That review will highlight discrepancies between current processes and the theoretical processes documented in the G&C Toolbox developed by the CEGC and identify specific things to be improved in order to harmonize and streamline common processes while reviewing the RACI grid.Footnote 30

With regard to capacity across the Agency, employees noted limited G&C expertise, staff turnover/shortages, and a bottleneck in the CEGC (the CEGC receives more requests than it can handle). CSA employees also linked these resource issues to the cumbersome nature of G&C processes (consultation processes, documents to be completed, etc.), regardless of agreement scope, and to the growth of G&C. Both the amount of G&C funding and the number of agreements signed have more than doubled while there are also new AO formats, such as challenges, which require more time to develop. Regarding technical tools and resources, several elements are already in place, such as the G&C Toolbox on Livelink and the Unitas system, but as mentioned in the Performance section and suggested by internal respondents, some elements need to be improved: a central database, electronic forms, and enhancements to Unitas and the PIS. One key informant raised the possibility of G&C training to ensure that employees fully understand the Program's framework and rules.

On a related note, G&C growth may present a higher risk to the Agency. Although that risk is not documented in the Corporate Risk Profile - (CSA, 2021b), it is noted in the - Business Plan that the G&C risk framework will be reviewed (CSA, 2021c). In , the CEGC developed a guide and form for the assessment of recipient riskFootnote 31 based on the type of transfer payment and the level of monitoring required, as required by the Directive on Transfer Payments (TBS, 2008b). AO risks are also assessed using the same tools as potential recipient risks. Since the risk assessment is done on a project-by-project basis, a recipient may have multiple risk levels (different type of project, complexity of activities, financial value, etc.). Information about risk levels is not collected centrally in the CSA. In addition, as a result of the audit report (CSA, 2020a), the CEGC developed a recipient monitoring and verification plan as required by the Directive on Transfer Payments (TBS, 2008b) and audited two contribution recipients in -.

Using the Class G&C Program as the primary instrument for delivering G&Cs, with each branch designing and managing AOs that reflect its own priorities, the CSA's priorities and the needs of its community, is an approach unique to the Agency. Other departments and agencies that have a class G&C program use it to complement other, more specific G&C programs. While this is seen as a good approach, some internal respondents acknowledged that the Agency tended to work in silos, but stated that there were opportunities for coordination and standardization across the Agency. The G&C Steering Committee acts in an advisory capacity and provides a forum for information-sharing and a degree of coordination (the work of the CEGC and the Committee has led to some standardization of processes), but according to some respondents, the Committee does not have a clear mandate, does not really provide strategic direction, and does not have a relevant level of membership. One internal interviewee felt that the Committee could be used for higher-level discussions concerning ongoing developments and provide an opportunity for greater coordination in planning. New terms of reference for the G&C Steering Committee were approved in , and it became an Advisory Committee.

The Program's medium- and long-term planning is limited, and that is a challenge for the CSA and some other departments and agencies interviewed. Nearly half (43%) of internal survey respondents were unable to say whether the Program's medium- or long-term planning was effective; only 9% were fully convinced that it was. Internal interviewees explained that there was very little medium- or long-term planning within the Program; planning was primarily done in each branch, and sometimes even for each specific initiative rather than for a whole branch, although the annual strategic retreat was used to set priorities across the branches. In addition, not all branches are able to engage in long-term planning because they are influenced by external factors, such as the CSA's external partners (other government departments and space agencies). While there is a G&C Steering Committee (Advisory Committee since ), employees interviewed indicated that there was no overview of all AOs being developed and launched (e.g., no global dashboard, no roadmap). In fact, the lack of a common vision and long-term planning was one of the things employees surveyed said they liked least about the Program.

Planning was discussed in interviews with other departments and agencies, and most stated that it was a challenge in general. While some departments and agencies do some strategic planning for G&Cs, in many cases, long-term planning is only done by the specific branch or program. Nonetheless, employees interviewed and surveyed suggested greater coordination and planning across branches (opportunities to design cross-sector AOs to avoid duplication of effort and confusion of applicants, such as the AO on data analysis), with a timeline for certain recurring funding opportunities and five-year approval for example; they also recommended strengthening G&C planning and management capability to focus on integrated branch-level planning. Some suggested the development of a holistic approach to all G&C.

According to the Program's Terms and Conditions, governance is managed by the G&C Steering Committee; harmonization and standardization are the responsibility of the CEGC; and the branches are responsible for the Program's delivery (the branches manage the AOs and agreements). However, the Terms and Conditions also state that "[a]t the implementation level, working groups will be established to work on a thematic basis across CSA to identify the activities needed to deliver outputs and attain the specified outcomes" (CSA, ). These groups' key roles and responsibilities are to identify opportunities for service improvement; streamline, standardize, and harmonize application processes; introduce risk management practices; and improve stakeholder involvement (CSA, ). A few interviewees suggested the possibility of creating a unit in charge of coordinating G&C operations, with due regard for the fact that the branches know the needs and understand the potential recipients. A few internal respondents also suggested that establishing a community of practice might help advance collaboration across the Agency. In general, respondents were looking for an opportunity for the different branches to share more information and gain insight into G&C activities across the CSA.

Best practices of other departments and agencies in administration, management and planning:

  • Adoption of public service standards for acknowledging receipt and evaluating proposals;
  • Effective risk-based governance approaches or thresholds for approvals and proportionate oversight;
  • Establishment of a G&C community of practice, working groups (e.g., finance, risk, and user experience) and/or a G&C funding coordination group.

G&C Centre of Expertise

Relocation of the CEGC to Corporate Services was a positive step, according to internal respondents; some agreed that the change was beneficial (35%, very or extremely), while others had no opinion on the subject (37%). Most respondents also felt that removing the CEGC from SST made sense from a neutrality perspective. The CEGC's contribution was one of the things respondents liked about the Program. Responses to the internal survey pointed to a lack of resources at the CEGC: 32% of employees selected "moderately" when asked if the CEGC had sufficient resources to provide timely service. The number of full-time equivalents (FTEs) continued to decline at the beginning of the evaluation period compared with the previous evaluation period; it then rebounded starting in -, reaching in - a level equivalent to the - level, while the number of AOs and agreements and the available G&C funding increased significantly. However, there is no information on whether the level of resources allocated to the CEGC is commensurate with its roles and responsibilities. In any case, having a small staff leaves little room for innovation, program improvement, seizing opportunities for collaboration, and strategic thinking.

Figure 20 – Relocation of the CEGC to Finance was beneficial
Not at all Slightly Moderately Very Extremely Don't know
Percentage 2 9 17 22 13 37
Figure 21 – CEGC resources are sufficient to provide timely service
Not at all Slightly Moderately Very Extremely Don't know
Percentage 15 26 32 7 0 20

For every dollar spent on funded projects and activities, it cost two cents to operate the CEGC,Footnote 32 less than in the previous evaluation period (seven cents from - to -): although CEGC resources and FTEs increased during the evaluation period, the funding budget for agreements increased even more (see the worked calculation after Table 6). This decrease in the ratio of operating costs to disbursements raises questions about whether the level of resources allocated to the CEGC is appropriate in relation to its roles and responsibilities.

Table 6 – CEGC spending and amount disbursed to funded projects
- - - - - -
CEGC expenditures $383,317 $409,901 $387,811 $446,489 $479,750 $619,662
G&C agreement expenditures $11,053,340 $16,765,113 $21,016,771 $19,181,537 $26,399,276 $23,490,533
Ratio 0.03 0.02 0.02 0.02 0.02 0.03
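
As a rough check of the two-cent figure, the following is a worked calculation based solely on the Table 6 totals (the footnoted methodology may differ slightly):

$$\frac{\text{total CEGC expenditures}}{\text{total G\&C agreement expenditures}} = \frac{\$2{,}726{,}930}{\$117{,}906{,}570} \approx 0.023,$$

or roughly two cents of CEGC operating cost per dollar disbursed to funded projects.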

Even though most employees surveyed (67%) considered the CEGC to be very or extremely valuable, most of them (59%) were unsure of exactly what the CEGC's function was:

  1. act as a service and support to the branches (e.g., support AO design and agreement development; develop templates; standardize processes);
  2. provide expert advice (e.g., provide general guidance; provide expertise to address specific cases); or
  3. play an oversight role to ensure compliance (e.g., compliance with TBS standards).

Some senior management and branch interviewees felt that the CEGC's focus on compliance was not an effective use of resources and added to the governance structure already in place. Senior management respondents said they would like to see the CEGC play a more strategic (or long-term) planning role in thinking about the use of G&Cs (e.g., reviewing the Program's Terms and Conditions). Branch respondents viewed the CEGC as a service and wanted it to provide advice and more support for implementing AOs and developing agreements. The previous evaluation identified the need to review and better communicate the obligations, roles and responsibilities of the CEGC and the branches' G&C managers and to ensure that the level of resources allocated to the CEGC is commensurate with its roles and responsibilities.

G&C centres of expertise in other departments and agencies:
  • Assistance in drafting G&C agreements and implementing projects;
  • Support in interpretation, policy, negotiations, design compliance (Treasury Board submissions, model agreements, including general standards clauses) and governance;
  • Development of standardized materials;
  • Management of the class G&C program used primarily to fund unsolicited proposals that do not fit into regular G&C programs;
  • Assistance in evaluating and reviewing program terms and conditions and modernizing the G&C infrastructure;
  • Contribution to reporting and program evaluation;
  • Acting as the driving force behind the community of practice.

Use of funds

The use of Program funds is appropriate. The actual percentages of funding spent on grants (43%) and contributions (57%) were close to the forecasts (39% and 61%, respectively). There was a 7% variance for all branches combinedFootnote 33 (SST 9%, SU 1%, SE 8%) between planned and actual funding over six years (between 3% and 22% underspending in four of the six years; 3% overspending in two of the six years). The constraints most frequently cited to explain these variances included uncertainty about available funding, a lack of predictable AOs, and a lack of coordination between the branches and stakeholders. However, most employees surveyed (61%) said they could not tell whether the proportion of funds disbursed through contributions versus grants was appropriate, because the decision on the type of transfer payment (grant or contribution) is directly related to the results of the risk assessment for the AO or unsolicited proposal.

Figure 22 – Overview of planned and actual funding (in dollars)
- - - - - -
Planned funding 11,932,853 21,630,140 20,396,357 21,568,000 25,596,000 24,988,000
Actual funding 11,053,340 16,765,113 21,016,771 19,181,537 26,399,276 23,490,533

Table 7 – Variation between planned and actual funding by main branch
- - - - - -
SST 22% 29% 2% 13% −4% 3%
SU −12% 3% −22% 12% 6% 16%
SE −2% 30% 10% 0% −27% 20%

Figure 23 – G&C spending by DRF program (in $)
SCDP SUP SEP
Grants 24,632,870 18,305,490 7,578,313
Contributions 51,464,544 12,997,501 2,696,433

In addition, 30% of internal survey respondents felt that the CSA funded too few G&C recipients, 35% indicated that they were unable to answer, and only 19% felt that the CSA was reaching the right type and number of recipients. The success rates of proposals presented in the Performance section point to a strong demand for the available funding. Moreover, the small number of funded applicants, insufficient funding, and short-term funding were among the things that internal and external respondents liked least about the Program. A number of internal survey respondents suggested funding projects of different sizes and a greater diversity of applicants. Employees interviewed also indicated that the CSA could achieve a greater impact with its funding and could produce just as many significant results using moderate funding with grants (smaller amounts as opposed to contributions). This suggests the need for a broader discussion of the type and size of funding and its overall impact, which could be measured over a longer term (e.g., a few years after an agreement ends).

The AO approach

Soliciting proposals through AOs is an effective approach. Half of the external survey respondents (50%) and most internal survey respondents (65%) considered it effective to design and manage funding opportunities that reflect the needs of the communities, the priorities of the DRF programs (and, by extension, of the CSA) and the objectives of the G&C Program. However, a common concern internally and externally was that funding opportunities were ad hoc and unpredictable. The need for greater predictability and harmonization to facilitate internal planning was also identified in the previous evaluation. The - report on Canadian academic capacity in space research (J.E. Halliwell Associates Inc., ) also documented comments favouring more frequent and predictable FASTFootnote 34 AOs. Predictable AOs would make opportunities more accessible and impactful, improve the experience of funding recipients, and simplify the CSA's internal planning and operations. Some external respondents and a few staff members recommended more coordination and harmonization across the Agency to make AOs more effective for both applicants and employees. To date, some standardization of processes for establishing AOs has occurred through the work of the CEGC and with the support of the G&C Steering Committee, but more could be done, according to respondents: greater predictability and stability in the criteria for recurring AOs, and the creation of AOs that cover common themes across DRF programs rather than multiple specific opportunities (e.g., on satellite data analysis).

Good practices of other departments and agencies regarding AOs:

  • Predictable, regular calls for proposals;
  • Sufficient time to prepare proposals (and avoiding conflicting or unfavourable timing for the community).

While more than half of both internal (52%) and external (57%) survey respondents considered G&C opportunities accessible, some internal (20%) and external (14%) respondents felt that there were barriers to access for various groups (based on geographic location, language, ethnicity, sex or gender, physical or intellectual ability, and/or other identity factors). Those respondents suggested that more consideration be given to certain organizations and types of researchers (e.g., small businesses,Footnote 35 start-ups, early-career researchers, underrepresented groups (e.g., women)) in designing AOs (format, criteria, academic and financial schedules, etc.). Some suggested that support, such as access to training, be provided in the application process. Some recent AOs have already targeted smaller players, and discussions and developments are underway at the Agency on how to overcome potential barriers to access and ensure diversity among G&C recipients, following approval of the evaluation report on GBA Plus implementation in (CSA, 2021f).

The unsolicited process

Responses across information sources show that there is little awareness of the unsolicited proposal process. External survey respondents indicated that they were unaware of the process or had never considered it (46%). Only 17% agreed that the unsolicited proposal process was "very or extremely" adequate, and 26% rated it as "slightly or not at all" adequate. Only 7% of internal respondents agreed that the CSA's approach to managing unsolicited proposals was adequate, and almost half of respondents (49%) said they were unaware of it (some indicated that they had never experienced it or thought such a process existed). For teams that accepted unsolicited proposals, the process was administratively cumbersome and inconsistent. The previous evaluation indicated that the adoption of standardized, transparent application, selection and feedback processes for unsolicited proposals would address the problems associated with them, and that having a list of funding priorities for each branch would ensure alignment between unsolicited projects and CSA priorities. However, a few respondents said that this process should be used more as an innovation driver for non-traditional projects.

Interesting fact about other departments and agencies:

Other departments and agencies that have a class G&C program use it to fund unsolicited proposals. Each specific program that uses AOs has its own terms and conditions.

Other formats

Since G&Cs are for the benefit of Canadians, it is important to identify the needs of potential recipients in order to better support R&D, including basic research, as noted in the Relevance section. In addition, the evolution of the space sector, the changing international marketplace, and the rapid growth of the commercial sector, as described in the Space Strategy for Canada (CSA, 2019a), mean that the Agency needs to be innovative in the ways it does business. External survey respondents suggested other funding formats to increase efficiency, accessibility and collaboration: joint AOs with other organizations, specific AOs to access CSA services or infrastructure (i.e., in-kind contributions),Footnote 36 mentoring, challenges and competitions (such as the Deep Space Food Challenge or the Deep Space Healthcare Challenge), scholarships, research chairs, and so on. In fact, the - Business Plan (CSA, 2021c) mentions the development of a framework to implement the Guide to Departmental Collaboration with Recipients of Grants and Contributions (TBS, ) in managing G&C initiatives. While open calls for proposals were suggested as one of several alternative opportunities by internal interviewees, some other departments and agencies mentioned that open calls create planning challenges and tend to be resource-intensive for staff.

Complementarity and collaboration with other programs

The majority of funding applicants surveyed viewed the CSA's Class G&C Program as complementary to other programs (61%) and not redundant (76%). The differentiating factor is the CSA's emphasis on space. This is one of the things that both external and internal respondents liked most about the Program. Of CSA employees surveyed who were familiar with other programs (23/54, or 43%), 65% indicated that there was overlap with other programs, but more than half (52%) of those respondents indicated that there were opportunities to collaborate with those programs to boost the impact of the funding, including NRC (IRAP), NSERC and CFI. Some collaborations had already taken place and others were underway, but some respondents acknowledged that certain administrative requirements could curb opportunities and that collaboration required an investment of time and resources. A few respondents noted that interdepartmental cooperation was a broader policy discussion currently underway (in particular with the interdepartmental committee on G&C), but that the Agency tried not to duplicate programs offered in other departments and agencies. This is one of the expected results of the Policy on Transfer Payments (TBS, 2008a): "Collaboration exists within and among departments to harmonize transfer payment programs and standardize their administration, when appropriate." Nevertheless, the complexity of interdepartmental relationships was one of the things that internal respondents liked least about the Program. With respect to collaboration, the evaluation found that the recommended new Canadian Research and Development Classification (CRDC) standard (StatCan, ) used "by the federal granting agencies and Statistics Canada to collect and disseminate data related to research and development in Canada" was not part of the information collected in Unitas; internal CSA research topic categories are used instead. One of the CRDC's purposes is to identify opportunities for collaboration to optimize research efforts and improve outcomes and to improve reporting on the combined contributions of Canada's research and science organizations.

Good practices of other departments and agencies regarding collaboration:

  • More regular or ongoing collaboration with other departments (not just ad hoc).

Application submission, evaluation and selection processes

Overall, the application submission, evaluation and selection processes are appropriate but could be improved in some respects. Most internal and external survey respondents who were familiar with other programs indicated that the G&C processes of other organizations were better than or about the same as the CSA's and generally felt that the CSA's processes were more cumbersome or complex than those of other organizations (such as NSERC, DND, NRC or CFI). The previous evaluation indicated that standardizing the application submission and selection processes, including feedback, would optimize resources for both the solicited and unsolicited processes.

Table 8 – Overall comparison of the processes of the CSA's Class G&C Program with those of other programs
Processes in general (internal survey: 23/54) Application process (external survey: 85/226) Reporting process (external survey: 41/226)
Others are better (the CSA is worse) 37% 36% 31%
Similar 58% 28% 43%
Others are worse (the CSA is better) 5% 27% 20%

With regard to the application process, the CSA mostly required proposals on paper or USB drive during the evaluation period. Most external respondents stressed that applications should be submitted online via a user-friendly, simple and efficient portal. In fact, one of the goals of Canada's Digital Government Strategy (GC, 2021b) is to replace thousands of inaccessible and inconvenient PDF forms with modern, user-friendly web-based versions so that information can be submitted easily and securely online. Under the CSA's new Digital Transformation Strategy (CSA, 2021a), the Program's processes will be reviewed from a client-oriented perspective and an online, end-to-end service approach. This is also part of a GBA Plus focus on equal access regardless of place of residence. Because of the COVID-19 pandemic, the CSA permitted electronic submissions for opportunities.Footnote 37 However, there are no plans at this time to make this approach permanent, comprehensive and mandatory.

External respondents were not entirely satisfied with the application process: only half considered the effort and time required to complete a funding application to be very or extremely reasonable (51%; the proportion was obviously lower for non-recipients (33%)). In particular, respondents described the submission process as too cumbersome, time-consuming, detailed, repetitive, complex and/or disproportionate to the amount of funding offered; they also said there was not enough time to submit applications. This is one of the things respondents liked least about the Program. However, internal survey respondents and interviewees and external survey respondents indicated that the submission process could be improved with, for example, a better interface and a staged application process. Using letters of intent as an initial application step for some AOs is an effective approach for reducing the burden on applicants (who can find out in the first stage whether their project is eligible) and on reviewers (who have fewer complete proposals to evaluate in the second stage). This approach is a good practice used by other departments and agencies and was also suggested by some external respondents. In addition, a staged application tool is already in place in the centralized Unitas system (along with a two-stage AO guide in the G&C Toolbox), the first stage being screening for applicant eligibility and project eligibility, but the process and the associated service standards have not yet been established. As a result, the tool is seeing little use.

Good practices of other departments and agencies regarding application submission, evaluation and selection processes:

  • Online submission and efficient information management system for applicants;
  • Staged application process (letter of intent followed by a full proposal);
  • Fully transparent selection process (detailed submission guide, evaluation grid, composition of the review team, final scoring of proposals, including a list of successful applicants);
  • Effective use of external reviewers;
  • Systematic integration of equity/diversity/inclusion and GBA Plus considerations through initial evaluation criteria and adequate resources and training for reviewers;
  • Systematic feedback to all applicants.

Regarding the evaluation and selection processes, proposals are evaluated overall on the basis of applicant eligibility and the criteria grid established for each AO. The evaluation of unsolicited proposals is based on certain criteria in the Applicant's GuideFootnote 38 (e.g., applicant eligibility, project eligibility, relationship with the CSA's priorities and the Program's objectives) and depends on a decision by the Directorates based on other factors (e.g., budget availability, schedule, alignment with other CSA initiatives, collaboration with other departments and agencies).

Proposals are generally evaluated by CSA staff from the relevant branch and other branches (internal peer review, a panel of two or three reviewers), but SST, SU and SE use external reviewers for some of their AOs. In some cases, the ranked list of proposals is reviewed by senior management for possible adjustment based on strategic overall selection factors (geographic distribution, priorities, recipient profile, etc.) known as "soft criteria," normally stated in the AO, which constitutes a second stage in the selection process. This varies from branch to branch and from AO to AO. The final selection of proposals is approved by the IIRB or, under the new agreement approval delegation charter that came into effect in , at a lower organizational level. Internal survey results (56%) confirm that this approach to selecting recipients is appropriate: respondents who had confidence in the selection processes used for AOs felt that they were comprehensive, fair and rigorous. In the recent Evaluation of the Implementation of Gender-Based Analysis Plus at the Canadian Space Agency (CSA, 2021f), the two-stage selection approach is identified as a best practice at the Agency. However, comments indicate that the cumbersome and subjective nature of the process remains a concern. Moreover, external respondents were not entirely satisfied with the selection process: only about half reported that the selection process was very or extremely clear (53%) and very or extremely fair (55%) and agreed very or extremely strongly that the CSA responds to applications in a timely manner (48%), with satisfaction obviously higher for recipients and lower for non-recipients. A few internal interviewees acknowledged that the CSA did not always clearly indicate how proposals were evaluated. Internal interviews also indicated that considering diversity and inclusion in the evaluation of proposals was a challenge for the branches (how to properly consider demographic variables and how they should intersect with other selection criteria). As noted in the previous evaluation, internal survey respondents and interviewees and external survey respondents indicated that the CSA could improve its evaluation and selection processes by, for example, making them more transparent (including clearly explaining and communicating the process and the scoring), using recurring, clear evaluation criteria that are built in and communicated from the outset, making greater use of external reviewers and peer review, and providing systematic feedback to all applicants. There is a feedback component in the Unitas system's proposal evaluation module: when the reviewers complete the proposal evaluation summary, it can be annotated with positive and constructive comments and then generated as a document for transmission to the applicant (it is not sent automatically, however).

Systems and tools

Good practices of other departments and agencies regarding tools and systems:
  • Centralized documentation and information;
  • Efficient, user-friendly database for reporting and internal research.

Even though the CSA has a main central database for recording G&C information (Unitas) and a portal where funded research projects complete their progress reports and final reports (PIS), the evaluation showed that the Agency still has multiple internal databases or files and manual data capture and reconciliation approaches (using Excel), which makes it difficult to analyze information. In addition, there is no quality assurance or quality control on the data. The information about AOs, proposals, progress reports and final reports came from multiple sources and required significant clean-up. Nevertheless, efforts are being made to improve the tools: continuous improvement of the PIS and Unitas (including a new Unitas community of practice in ) and updating of some documents in the G&C Toolbox. According to other departments and agencies, an effective system should have certain features, summarized in Table 9.

Table 9 – Necessary features of an effective system, according to other departments and agencies

Features
  • Support both standardization and flexibility in the information required for an AO in the form of modules to accommodate different types of opportunities and client groups.
  • Cover the application submission and evaluation processes from beginning to end and the production of progress reports and final reports, standardizing the user experience for clients and reviewers.
  • Provide business intelligence for analysis of fund distribution, equity/diversity/inclusion tracking data, etc.
  • Support operational activities effectively.
  • Meet government security and privacy requirements for information-sharing.

Outcomes and impacts

Since the Class G&C Program is a stand-alone program, it has its own performance measurement and indicators that must be measured, as specified in the Program's Terms and Conditions. However, when a G&C program forms only a portion of a program in the program inventory, as is the case at the Agency, the PIPs need to clearly identify the performance information approach specific to the transfer payment program (TBS, ). The Program's results are reported by the CEGC, but the branches also report the results for DRF programs, which are aggregated across their sectoral activities. The branches, which are responsible for administering their own AOs, tend to view the Program as a funding mechanism. This perception is reinforced by the presence of a "Funding Mechanism" form in the G&C Toolbox, a decision support tool for choosing between a transfer payment (AO) and a contract (request for proposal). In addition, a few respondents described a disconnect between what is expected of the Program (what is evaluated, measured and reported) and what the branches are attempting to accomplish in their own programs using G&C.

Nevertheless, a harmonization effort is currently underway with the PIPs update: six Class G&C Program indicators have been associated with DRF indicators already included in the PIPs of the three DRF programs. This new results framework is expected to be completed in -. The result will be fewer indicators and more direct alignment with the programs' objectives, an area that a few internal respondents (survey and interviews) reported as problematic (e.g., misalignment between STD objectives and the Program's objectives). However, some internal survey respondents and interviewees indicated that the CSA could do more to tell the story of the long-term impact of its investments. In this regard, the STD initiative under the SCDP provides information about the business potential or impact one year and five years after the end of the project, which is not possible using the PIS (short term, during the agreement or just after the end of the agreement).

Regarding unintended outcomes of the Program, 38% of external survey respondents indicated that their funded research projects had unexpected impacts, most of which were positive. The unintended outcomes included the development of new collaborations, connections that led to subsequent opportunities or potential opportunities, and leveraging of other resources. Employees surveyed commented on both positive unintended outcomes (additional impact of data, space mission precursor projects, and a higher profile for Canada) and, to a lesser extent (21%), negative unintended outcomes (unconscious bias and a tendency to support the same clients, inadequate funding that drives researchers into other fields).

Regarding impacts on and benefits for various groups, information is rather limited. Only 9% of external survey respondents were able to confirm that the funding they received benefited various groups (based on location, language, ethnicity, sex or gender, physical or intellectual ability, and/or other identity factors), while 35% of internal respondents said that the Program had different impacts on different groups. In addition, Let's Talk Science, through its Living Space program, provided learning activities to students in schools in underrepresented and remote communities, including Indigenous communities.

8. Conclusion

The Class G&C Program helps the CSA support Canada's space ambitions and plays a unique role in Canada in the development of space sector capabilities, the advancement of space S&T, collaboration among stakeholders, and Canada's presence on the world stage. The information gathered for this evaluation demonstrates that the Program achieved its intended outcomes over the past six years, thanks in part to an increase in the budget and the number of AOs and, consequently, in the number of projects and activities that received funding. Nevertheless, some needs remain unmet, including the high demand for the A&L component and the need for AOs to be released at more opportune times. In addition, some operational factors (administrative processes, data organization and compilation) need to be improved to make the Program more efficient, innovative and collaborative. The Policy on Service and Digital (TBS, ) calls on federal organizations to be more agile and innovative in the way they do business. While there have been efforts to improve various areas of the Program over the past six years, some of the findings are similar to those of the previous evaluation. The following are the five main themes that emerged from the evaluation of the relevance, performance and efficiency of the Class G&C Program and for which recommendations are made.

Funding opportunities

The number of AOs and the annual amount of funding awarded under the Class G&C Program increased over the past six years. However, the AOs could be better planned, harmonized and made recurring, and the application process could be streamlined where possible, for example through a staged process (letter of interest followed by a full proposal). In fact, such a process is already in place in the central system, but it needs some fine-tuning. These changes would increase the impact of AOs, make them more accessible, improve the applicant experience, and simplify internal planning and operations. Predictable AOs might also facilitate participation by external reviewers. The Program could also be more responsive to the needs of communities as regards the timing of AOs, access to facilities and mentoring (in the form of collaboration), support for certain technology readiness levels, collaboration with other departments and agencies, and more sustained interactions with the various client groups. This could benefit some groups, such as small businesses and early-career researchers, although some opportunities have already been developed specifically for them. Steps should be taken to facilitate access and ensure diversity among G&C recipients from a GBA Plus perspective. Harmonization of AOs will also require greater coordination between the branches. Also, more information about the unsolicited process could be shared within the CSA's sectors and with the various communities.

Tools, data and operational processes

A centralized information system and a number of tools were introduced and/or improved, and their use expanded over the past six years. The Unitas system offers a great deal of flexibility and is used in other contexts since it is the Agency's client relationship management system. Although a noticeable effort is being made to enter data at the various stages of the processes, some elements are still not created systematically or entered correctly, and because there are few mandatory fields in the system, a lot of data can be omitted. In addition, although a link has been created from SAP to Unitas, the extracted financial data differs because of the way the data is coded in SAP and the limitations of that coding. Also, frequent changes in the PIS and the fact that its fields are formatted as text make the data difficult to analyze. However, the student and organization reports for the A&L component are being revised so that they can be standardized and posted online. Service standards are also updated and published appropriately, although some data is incorrect or missing from the database. Hence, a directive to use the central system for all G&C processes, more systematic data quality assurance, and a revision of the data format would provide better-quality data, leading to richer, more direct, and longer-term information about AOs, branches and programs, recipients and non-recipients, and service standards, which would be useful for reporting and decision support.

There are a number of tools available for G&C operations; they can be found in the G&C Toolbox in Livelink. Some have been updated, but others are still pending, mainly because of a lack of resources but also because they are not consistent with user needs, objectives and requirements. The tasks of creating or updating tools could be broken down into stages, and key users should be involved early in the creation or update process. The G&C Toolbox also includes the operational process steps and the RACI grid, although these processes are not aligned with data entry in Unitas (or in SAP). Information-gathering forms such as the recipient risk assessment form would be better online (instead of in Excel format) so that data can be collected centrally.

Lastly, the CSA allowed electronic submission of proposals for some opportunities in because of the pandemic, but a permanent and comprehensive approach needs to be implemented, and the interface and intuitiveness of the online application process could be improved, particularly from a user perspective.

Evaluation and selection process

The Class G&C Program's evaluation and selection processes are appropriate, but when compared with those of other departments and agencies, they could be improved with the addition of systematic feedback to all applicants (there is already a feedback component in the Unitas system's proposal evaluation module). In addition, when AOs are posted, the CSA should specify how strategic overall selection factors, such as geographic location or demographic factors, will be used. The composition of the review team (e.g., age, position and level, area of expertise) could also be communicated to enhance the transparency of the process. The involvement of external reviewers, which would be facilitated by recurring and planned AOs, would also make the processes more transparent.

Lastly, unsolicited processes are seldom used and are perceived as less transparent. Unsolicited proposals are considered at the CSA's discretion and may be accepted on an exceptional basis. More information about the unsolicited process and the evaluation criteria could probably be shared within the branches and with the various communities.

Roles, responsibilities and coordination

The CEGC is an important function in the delivery of the Class G&C Program, but its roles and responsibilities need to be clarified and better communicated. More resources were provided in ,Footnote 39 which will afford more opportunity for innovation, strategic thinking, collaboration and improvement of the Program.

The branches have stated a need for G&C and GBA Plus expertise and training and for greater coordination and harmonization to support operations more effectively. Greater coordination and harmonization of operations could be facilitated by a community of practice, one or more working groups, or even the establishment of a dedicated operational group, which is a good practice followed in other departments and agencies. Also, since the Program is under the responsibility of the Chief Financial Officer, the addition of an operational working group in the branches might provide a structure for accountability and information sharing, which would lead to a better overview of G&C activities across the CSA. More synergies between the different sectors of the Agency are already in evidence: there is now an SST employee who also works with the Communications team on A&L projects under the SCDP.

The new terms of reference of the G&C Advisory Committee, which used to be a steering committee, will undoubtedly foster greater coordination between the branches. In addition, governance was partially streamlined with the recent delegation of agreement approval authority to a lower organizational level.

Performance measurement

The Class G&C Program has its own Terms and Conditions, and it needs its own performance measurement. The PIPs for the three DRF programs replaced the Program's PMSs but did not indicate the approach for measuring the Program's specific performance and did not include a specific target for the Program. The current PIPs update process will associate six indicators for the Program with the three PIPs, including one indicator for the A&L component (number of students involved in projects). This is a step toward aligning the Program with the CSA's recently approved logic model and thus toward more direct alignment with the objectives of the DRF programs. However, specific targets should be identified for the Class G&C Program. In addition, there are no indicators for the A&L youth client group, although the CSA's new logic model contains two indicators that could be associated with it. Also, the PIPs do not indicate how the Program's efficiency is measured. Adding efficiency measures to the PIPs (such as the means used to deliver the Program, facilitating factors at different stages of the Program's life cycle, and processes implemented to improve the efficiency of the Program's activities) would make it possible to develop questions and indicators relating to program efficiency. Consequently, since there will be fewer indicators for the Class G&C Program, the PIS should be revised to ensure that only the necessary information is requested from G&C recipients.

Lastly, while there is no need to change the Terms and Conditions of the Class G&C Program, Treasury Board's upcoming update of the Policy on Transfer Payments may provide an opportunity to update them as required, for example by making them more comprehensive regarding R&D, innovation and commercial capability in view of the growing commercial space sector market, by making them more responsive to the needs of the various client groups, and by revising the governance and accountability information they contain.

On the basis of the key evaluation findings described above, the following actions are recommended to improve the accessibility and efficiency of the CSA's Class G&C Program:

  1. Establish regular funding opportunities with greater sensitivity to the needs of the diverse client base, while increasing harmonization and coordination between the branches and recipients.
  2. Clarify the rules and requirements regarding departmental collaboration with G&C recipients, and inform stakeholders.
  3. Use a single operational database for the Program's administration and management, and monitor data quality, continuity and completeness.
  4. Explore the possibility of using standardized tools to streamline the application process, such as using a staged application process.
  5. Ensure that systematic feedback is provided for all funding applications.
  6. Communicate the CEGC's roles and responsibilities to the G&C Program's user branches to ensure a common understanding and meet the branches' needs for the services and expertise they require.
  7. In updating performance measurement, ensure that there are CSA logic model indicators for each of the Program's components and client groups, and that specific targets are agreed upon for the Program.

9. Management response and action plan

Management response and action plan details
Each entry below presents the recommendation, the leads, the management response, the action plan and the timeline.
Recommendation 1:
Establish regular funding opportunities with greater sensitivity to the needs of the diverse client base, while increasing harmonization and coordination between the branches and recipients.
  1. Programmatic DGs
    And
  2. Executive Director, Communications and Public Affairs

    In collaboration with:

    • DG Policy
      And
    • Chief Financial Officer and DG Corporate Services
CSA senior management concurs with this recommendation.
  1. G&C user directors will work on a multi-year plan and schedule for Class G&C Program initiatives to establish regular funding opportunities.
  2. They will incorporate a pilot approach to harmonization and coordination of initiatives to foster the emergence of a comprehensive G&C vision.
  3. With the support of Policy and the CEGC, G&C user sectors will update client needs.
Recommendation 2:
Clarify the rules and requirements regarding departmental collaboration with G&C recipients, and inform stakeholders.
  1. Chief Financial Officer and DG Corporate Services
CSA senior management concurs with this recommendation. The CEGC will complete the CSA's Guide to Departmental Collaboration with Recipients of Grants and Contributions and associated tools. It will provide training to CSA users as needed.
Recommendation 3:
Use a single operational database for the Program's administration and management, and monitor data quality, continuity and completeness.
  1. Programmatic DGs
    And
    Executive Director, Communications and Public Affairs
  2. Chief Financial Officer and DG Corporate Services

    In collaboration with:

    • Chief Information Officer
CSA senior management concurs with this recommendation.
  1. G&C user sectors will use Unitas as their G&C management database. A communication to this effect will be sent to all users.
  2. Training will be offered to all sectors.

    The CEGC will work with Information Technology to improve the input fields in Unitas with a view to improving quality control.

Recommendation 4:
Explore the possibility of using standardized tools to streamline the application process, such as using a staged application process.
  1. Programmatic DGs
    And
    Executive Director, Communications and Public Affairs
  2. Chief Financial Officer and DG Corporate Services

    In collaboration with:

    • Programmatic DGs
      And Executive Director, Communications and Public Affairs
CSA senior management concurs with this recommendation.
  1. G&C user sectors will be encouraged to use the existing staged application process whenever possible.
  2. The CEGC will survey G&C user satisfaction to determine if using the existing tools in stages has streamlined the process. If not, other tools will be considered.
Recommendation 5:
Ensure that systematic feedback is provided for all funding applications.
  1. Programmatic DGs
    And
    Executive Director, Communications and Public Affairs
  2. Chief Financial Officer and DG Corporate Services

    In collaboration with:

    • Chief Information Officer
CSA senior management concurs with this recommendation.
  1. G&C user sectors will use Unitas, specifically the feedback module.
  2. The CEGC will review the feedback process for applications. In collaboration with Information Technology, the CEGC will update and adjust some of the standard management modules and tools in Unitas, specifically regarding automated feedback.
Recommendation 6:
Communicate the CEGC's roles and responsibilities to the G&C Program's user branches to ensure a common understanding and meet the branches' needs for the services and expertise they require.
  1. Chief Financial Officer and DG Corporate Services

    In collaboration with:

    • Programmatic DGs
      And
    • Executive Director, Communications and Public Affairs
CSA senior management concurs with this recommendation. The CEGC and directors involved in G&C management will work together to clarify, document and communicate roles and responsibilities.
Recommendation 7:
In updating performance measurement, ensure that there are CSA logic model indicators for each of the Program's components and client groups, and that specific targets are agreed upon for the Program.
  1. Programmatic DGs

    In collaboration with:

    • Executive Director, Programs and Integrated Planning
      and
    • Chief Financial Officer and DG Corporate Services
  2. DG Policy

    In collaboration with:

    • Programmatic DGs
CSA senior management concurs with this recommendation.

All points of this recommendation are being implemented through replacement of the Performance Measurement Strategy with Performance Information Profiles (PIPs) and the continued improvement of G&C tracking and reporting tools.

  1. With the support of Programs and Integrated Planning, G&C user sectors will update the alignment of the PIPs and their targets for the Program's two components.
  2. G&C user sectors and Policy will collect project data annually to facilitate multi-year compilation of the Class G&C Program's results and for data completeness and reporting purposes.

10. References

Internal documents

  • Canadian Space Agency (CSA, multiple years) Five-Year Evaluation Plan
  • Canadian Space Agency () Terms and Conditions of the Class Grant and Contribution Program to Support Research, Awareness and Learning in Space Science and Technology
  • Canadian Space Agency (2021a) Digital Transformation Strategy
  • Canadian Space Agency (2021b) Corporate Risk Profile -
  • Canadian Space Agency (2021c) - Business Plan
  • Canadian Space Agency (2018a) Departmental Results Framework
  • Canadian Space Agency (2017a) Policy on Gender-Based Analysis Plus
  • Canadian Space Agency (2017b) Performance Information Profiles
  • Canadian Space Agency () Performance Measurement Strategy for the Class Grant and Contribution Program – Research Component
  • Canadian Space Agency () Performance Measurement Strategy for the Class Grant and Contribution Program
  • J.E. Halliwell Associates Inc. () Canadian Academic Capacity in Space-related Research /

Public documents

11. Appendix 1 – Logic Models

Awareness and Learning Component Logic Model () - Text version

A&L Component Logic Model. From left to right, and from bottom to top:

  • Activities
    • (A1) Prioritization: Determine needs of space program; Assess requirements of target group; Rank funding priorities (arrow to A2 and Output 1).
    • (A2) Promotion and Solicitation: Prepare and issue announcements; Promote program (website); Receive proposals from target group (arrow to A3 and Output 2).
    • (A3) Evaluation: Screen proposals for eligibility; Assess compatibility of proposals with areas of priority; Evaluate and rank proposals (arrow to A4 and Output 3).
    • (A4) Funding: Verify continued eligibility of successful applicants; Issue Grant and Contribution Agreements; Disburse funding to successful applicants (arrow to A5 and Output 4).
    • (A5) Administration: Receive progress/final reports/performance reports/completed surveys; Assess reports against Grant and Contribution Agreements (arrow to Output 5).
  • Outputs
    • (Output 1) Ranked list of funding priorities (arrow to all immediate outcomes).
    • (Output 2) Announcements (arrow to all immediate outcomes).
    • (Output 3) List of eligible applications (arrow to all immediate outcomes).
    • (Output 4) Grant and Contribution Agreements; Financial assistance (arrow to all immediate outcomes).
    • (Output 5) Assessment of progress/final reports (arrow to all immediate outcomes).
  • Immediate Outcomes
    • (Outcome 1) Increased knowledge and skills in space-related disciplines among target audience (Medicine Elective, ISU, CFISU, students in laboratories, at workshops, conferences, competitions) (arrow to Outcome 3 and Outcome 4).
    • (Outcome 2) Increased availability and use of space theme in learning opportunities and materials related to science and technology (NFP orgs, school boards, educators) (arrow to Outcome 5).
  • Intermediate Outcomes
    • (Outcome 3) Sustained interest in space-related disciplines among target audience (arrow to Outcome 6).
    • (Outcome 4) Increased Canadian Highly Qualified personnel active in space-related disciplines (arrow to Outcome 6).
    • (Outcome 5) Target audience is reached through learning activities and materials related to science and technology (arrow to Outcome 7).
  • Long-term Outcomes
    • (Outcome 6) Increased/sustained capacity to conduct or support space-related research and/or operations (arrow to Outcome 8).
    • (Outcome 7) Increased awareness of science and technology among target audience (arrow to Outcome 8).
  • CSA Strategic Outcome
    • (Outcome 8) Canada's presence in space meets the needs of Canadians for scientific knowledge, space technology and information.

Research Component Logic Model () - Text version

Research Component Logic Model. From left to right, and from bottom to top:

  • Activities
    • (A1) Prioritization: Determine needs of space program; Assess requirements of target group; Rank funding priorities (arrow to A2 and Output 1).
    • (A2) Promotion and Solicitation: Prepare and issue announcements; Promote program (website); Receive proposals from target group (arrow to A3 and Output 2).
    • (A3) Evaluation: Screen proposals for eligibility; Assess compatibility of proposals with areas of priority; Evaluate and rank proposals (arrow to A4 and Output 3).
    • (A4) Funding: Verify continued eligibility of successful applicants; Issue Grant and Contribution Agreements; Disburse funding to successful applicants (arrow to A5 and Output 4).
    • (A5) Administration: Receive progress/final reports/performance reports/completed surveys; Assess reports against Grant and Contribution Agreements (arrow to Output 5).
  • Outputs
    • (Output 1) Ranked list of funding priorities (arrow to all immediate outcomes)
    • (Output 2) Announcements (arrow to all immediate outcomes)
    • (Output 3) Ranked list of eligible applications (arrow to all immediate outcomes)
    • (Output 4) Grant and Contribution Agreements; Financial assistance (arrow to all immediate outcomes)
    • (Output 5) Assessment of progress/final reports (arrow to all immediate outcomes)
  • Immediate Outcomes
    • (Outcome 1) Increased knowledge from research projects in priority space science and technology areas (arrow to Outcome 4 and Outcome 5)
    • (Outcome 2) Maintained and/or increased space focus in universities, post-secondary institutions, not-for-profit and for-profit organizations (arrow to Outcome 5)
    • (Outcome 3) Partnerships established and/or sustained; Partner contribution leveraged; Access to international collaboration for Canadian organizations (arrow to Outcome 5 and Outcome 6)
  • Intermediate Outcomes
    • (Outcome 4) Increased availability of space-related knowledge and information in priority areas (arrow to Outcome 7)
    • (Outcome 5) Increased space-related science and technology capacity in targeted areas (arrow to Outcome 7)
    • (Outcome 6) Increased multi-disciplinary and/or institutional collaboration (arrow to Outcome 7)
  • Long-Term Outcomes
    • (Outcome 7) Canadian space-related research and development responds to national needs and priorities.
  • CSA Strategic Outcome
    • (Outcome 8) Canada's presence in space meets the needs of Canadians for scientific knowledge, space technology and information.