Performance & Results Hub
2025 Reporting
This page provides information and resources for 2025 Technical Reporting.
Please contact us with any questions or suggestions at: performanceandresults@cgiar.org.
Reporting is the stage of the Adaptive Management Cycle where results — including W3/bilateral contributions — are compiled, quality-assured, and used to develop Annual Technical Reports and other key reporting products, such as the CGIAR Results Dashboard and the Portfolio Narrative.
This phase ensures results are accurate, consistent, and accessible for learning, accountability, and decision-making.
The templates, timelines, and guidance on this page will help you complete your reporting accurately and on time.
29 July 2025
Technical Reporting engagement plan now available:
Engagement Plan: 2025 Technical Reporting (Full Word document)
Engagement Plan: 2025 Technical Reporting (PowerPoint summary)
30 June 2025
CGIAR Impacts in Agrifood Systems: Evidence and Learnings from 2022-24 (Type 2) report is available here.
2024 Portfolio Narrative report is available here.
2024 Learning & Optimization report and 2025 Action Plan will be available here at the end of August.
The CGIAR Technical Reporting Arrangement 2025-30 is available here.
2024 Initiatives, Impact Platforms and Science Group Projects annual technical (Type 1) reports are available here.
2024 Portfolio Practice Change (Type 3) report is available here.
Reporting on Outputs: Tuesday, 23 September 2025, 10:00am and 3:00pm (CET)
Reporting on Outcomes: Tuesday, 30 September 2025, 10:00am and 3:00pm (CET)
Quality Assurance process: Tuesday, 21 October 2025, 10:00am and 3:00pm (CET)
Performance and Results Management System (PRMS) deep-dive: Thursday, 13 November 2025, 10:00am and 3:00pm (CET)
Meeting recording (10am CET session)
Meeting recording (3pm CET session)
Frequently Asked Questions
Where can I find the key dates for Technical Reporting for 2025?
An overall Technical Reporting timeline can be found on this page of the P&R Hub. Upcoming key reporting dates and deadlines are also presented on this page. Any updates or changes to the reporting timeline will be communicated via performanceandresults@cgiar.org.
Can I retrieve a result deleted by mistake in the PRMS Reporting Tool?
In this instance, please write to prmstechsupport@cgiar.org with a request to retrieve it, providing information on the result for ease of identification (number if known, title, type and which Program/Accelerator entered it).
In the PRMS Reporting Tool, can I download a PDF for a result that is still being edited?
Yes. The PDF function is always available. The system will generate the PDF with the information that you have entered when you press the button, even if the result entry is not fully complete at the time.
What should I do if I encounter a bug in the PRMS Reporting Tool?
Please ensure you create technical tickets for the bugs you encounter by sending an email to prmstechsupport@cgiar.org. This will help the PRMS team to track requests and prioritize them.
What is the minimum number of pieces of evidence I can report?
A minimum of one and a maximum of six pieces of evidence are accepted by the system. Submit evidence in order of relevance, starting with the piece that most directly supports the result, and use the text box to indicate where the relevant information can be found within each source (e.g., page number, slide number, or table number).
How and where do I submit evidence for results?
Evidence for results can be submitted in the PRMS Reporting Tool. A maximum of six pieces of evidence can be submitted per result. However, this does not apply to:
Knowledge products; they are stored on CGSpace and do not require additional evidence uploads in the PRMS.
Capacity sharing for development results; these are exempt from multiple evidence uploads due to the associated time/resource burden and potential unresolved General Data Protection Regulation (GDPR) issues. By submitting a capacity sharing for development result, it is understood that supporting evidence is available and can be provided upon request if a sub-sample is needed.
To avoid issues with evidence access during quality assessment, all evidence that is not publicly accessible must be uploaded to the PRMS.
Additional information:
Evidence links and file uploads are both possible in the PRMS.
All links provided should be publicly accessible. All CGIAR publications should be shared using a CGSpace link.
Links to SharePoint, OneDrive, Google Drive, Dropbox, and other file storage platforms are not allowed. If you do not have a CGSpace or other public link available, use the “Upload file” option to upload your evidence to the PRMS repository.
For confidential evidence, select “Upload file” and then “No” to indicate that it should not be public.
If you add an evidence link, or indicate that the file being uploaded to the PRMS is public:
You confirm that the file is publicly accessible.
You confirm that all intellectual property rights related to the file have been observed. This includes any rights relevant to the document owner’s Center affiliation and any specific rights tied to content within the document, such as images.
You agree to the file link being displayed on the CGIAR Results Dashboard.
If you indicate that the file being uploaded to the PRMS is NOT public:
You confirm that the file should not be publicly accessible.
The file will not be accessible through the CGIAR Results Dashboard.
The file will be stored in the PRMS and will only be accessible by CGIAR staff (e.g. quality assurance assessors) with the repository link.
Documents uploaded to the PRMS will be view-only and cannot be edited.
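As an illustration only, the checks below sketch the evidence rules above in Python; the PRMS Reporting Tool applies its own validation, and the function and field names here are hypothetical.

from urllib.parse import urlparse

# Hypothetical sketch of the evidence rules above; the PRMS Reporting Tool
# performs its own validation, and these field names are illustrative only.
BLOCKED_HOSTS = ("sharepoint.com", "onedrive.live.com", "drive.google.com", "dropbox.com")

def check_evidence(items):
    """items: a list of dicts such as {"link": "https://example.org/evidence"} for
    public links or {"uploaded_file": "report.pdf", "is_public": False} for uploads."""
    if not 1 <= len(items) <= 6:
        raise ValueError("Submit between one and six pieces of evidence per result.")
    for item in items:
        link = item.get("link")
        if link:
            host = urlparse(link).netloc.lower()
            if any(host == h or host.endswith("." + h) for h in BLOCKED_HOSTS):
                raise ValueError("File-storage links are not allowed; use the 'Upload file' option instead.")
        elif not item.get("uploaded_file"):
            raise ValueError("Each piece of evidence needs a public link or an uploaded file.")
    return True

For example, a single SharePoint link would be rejected, while an uploaded file marked as not public would pass.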
Can I select multiple locations for a single result? For example, country and sub-national?
It is possible to select sub-national as an option, and then make multiple inputs. Sub-national inputs are available any time a result is mapped to a country (e.g. a regional result where a country is also specified).
In addition:
When country is selected, multiple countries can be selected, unless the selection makes up a specific region, set of regions, or global location, in which case “region” or “global” should be selected.
When regional is selected, multiple regions can be selected, but if the selections include every region, “global” should be selected.
Virtual is presented as an option only for capacity sharing for development results. It should be selected only if the output relates completely to virtual training. For blended virtual and in-person training, select the geographic location where most of the in-person training took place.
For knowledge products, use the geographic location, pulled from CGSpace, to indicate where the research was conducted or where the subject of the paper is focused.
For innovation development, choose the location where the innovation has been developed, not where there is potential for development.
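A minimal sketch of these selection rules, assuming illustrative regional groupings (the actual country and region lists come from the PRMS Reporting Tool, and the function name is hypothetical):

# Illustrative only: regional groupings and function names are assumptions,
# not part of the PRMS Reporting Tool.
REGIONS = {
    "East Africa": {"Ethiopia", "Kenya", "Tanzania", "Uganda"},
    "South Asia": {"Bangladesh", "India", "Nepal", "Pakistan"},
}

def collapse_geoscope(countries=None, regions=None):
    """Apply the rules above: a full region becomes 'region'; all regions become 'global'."""
    countries = set(countries or [])
    regions = set(regions or [])
    if regions and regions == set(REGIONS):          # every region selected
        return ("global", None)
    for name, members in REGIONS.items():
        if countries and countries == members:       # countries make up a full region
            return ("region", [name])
    if regions:
        return ("region", sorted(regions))
    return ("country", sorted(countries))

print(collapse_geoscope(countries=["Kenya", "Ethiopia", "Tanzania", "Uganda"]))
# -> ('region', ['East Africa'])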
What is CGSpace?
CGSpace is a repository of agricultural research outputs and results produced by several CGIAR Centers, Initiatives, Platforms, Science Programs and Accelerators and the CGIAR System Office. It indexes reports, articles, press releases, presentations, videos, policy briefs, etc.
Who should add knowledge products to CGSpace? The lead Center or the Program/Accelerator?
Currently, you should rely on Center staff to upload knowledge products to CGSpace. Researchers should use their Center’s current knowledge management system to inform their Center library, communications, knowledge management, data, or curation managers about knowledge products to be added to CGSpace.
How can metadata information in CGSpace be updated?
You should ask your Center library team to ensure that the CGSpace record is accurate. The PRMS Reporting Tool will then refresh the data at the end of the reporting cycle, re-syncing all CGSpace links. There is therefore no need for you to re-sync any links in the PRMS Reporting Tool to update information.
Can CGSpace records be updated during QA periods?
CGSpace is always open for updates, which can be made at any time. During the QA process, assessor comments will reflect the data available in CGSpace at the time of assessment. If the relevant CGSpace record has been updated since the assessor's review, Programs/Accelerators can simply accept the assessor's comments; there is no need to respond.
Is it allowed to tag legacy publications (originating from Initiative outputs) only to the Initiatives acknowledged in the publication, or can they also be tagged to the successor Science Program(s) and/or Accelerator(s), even if these are not explicitly mentioned in the acknowledgements?
Yes, during this transition period between research portfolios, dual tagging is allowed, both to the original Initiative (as noted in the acknowledgements) and to the corresponding new Science Program(s) and/or Accelerator(s), even if they are not explicitly mentioned in the publication. This applies when requested by the researcher and aims to ensure continuity and alignment with the evolving research structure.
Who reviews and approves knowledge products that are not peer-reviewed journal articles, such as blog articles and working papers?
Knowledge products other than peer-reviewed journal articles should be reviewed using processes developed by Centers or Programs/Accelerators. The quality assurance team assumes that all knowledge products have gone through a standard review process before being reported.
Each Center knowledge management team can confirm Center guidelines for knowledge products. Please refer to the CGIAR Open and FAIR Data Assets Policy signed by all Centers.
How can I make a CGSpace entry limited access? And what types of knowledge products would need restricted access in CGSpace?
A curator can limit access to a knowledge product when submitting the material to CGSpace. If the access status for an entry needs to be changed after submission, the curator will need to contact a CGSpace administrator (Abenet Yabowork or Alan Orth) to make that change. It is possible to limit a document to CGIAR users (via Active Directory login) as well as to limit access generally until a certain date.
Published restricted articles can be entered into CGSpace. This is different from confidential evidence, where restrictions are imposed by a scientist on SharePoint or any other IT-recommended storage system at Center level. For confidential evidence, quality assurance assessors will require access to the confidential link.
Who will be contacted for the QA of knowledge products?
If you tag a knowledge product with several Programs/Accelerators, that product will be represented equally among those tagged (e.g., in exports or the Results Dashboard). However, the first Program/Accelerator reporting a knowledge product will be responsible for addressing any comments from a quality assessor (e.g., a publication that was erroneously tagged as ISI by the library team in CGSpace can be questioned and the record should be updated in CGSpace during the assessment process).
Should I report knowledge products in multiple languages separately?
Knowledge products in multiple languages should not be reported separately, unless necessary to evidence the theory of change (ToC), for example if pathways are differentiated by actors or geographic scope and therefore require the output in different languages.
How is the FAIR score for knowledge products calculated?
FAIR (findability, accessibility, interoperability, and reusability) scores were introduced to align reporting with the CGIAR Open and FAIR Data Assets Policy. These scores are derived from existing CGSpace metadata to minimize data entry efforts, with equal weight assigned to each criterion.
If you wish to enhance the FAIR score for a knowledge product, liaise with your Center’s knowledge management team to implement improvements.
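As a worked illustration of the equal-weight calculation (the actual sub-scores are derived automatically from CGSpace metadata; the 0-1 scale and function below are assumptions made for this sketch):

def fair_score(findability, accessibility, interoperability, reusability):
    """Average the four FAIR criteria with equal weight; each is assumed to be scored 0-1."""
    criteria = [findability, accessibility, interoperability, reusability]
    if not all(0.0 <= c <= 1.0 for c in criteria):
        raise ValueError("Each FAIR criterion must be scored between 0 and 1.")
    return sum(criteria) / len(criteria)

# A record that fully meets two criteria and half-meets the other two:
# (1 + 1 + 0.5 + 0.5) / 4 = 0.75
print(fair_score(1.0, 1.0, 0.5, 0.5))  # 0.75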
Do I report trainees only when they have finished their course?
Yes, both long-term and short-term training programs must be completed before reporting (to avoid reporting the same trainee multiple times across years).
How do I report the gender ratios if I am unable to determine this for the capacity sharing result I am reporting?
There is an option to enter numbers for “unknown” when gender disaggregation numbers are unavailable. There is also an option to enter numbers for non-binary trainees.
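A minimal sketch of the disaggregation fields described above, using hypothetical field names (the PRMS Reporting Tool defines the actual data-entry form):

from dataclasses import dataclass

# Hypothetical field names for illustration only.
@dataclass
class TraineeCounts:
    female: int = 0
    male: int = 0
    non_binary: int = 0
    unknown: int = 0   # use when gender disaggregation is unavailable

    @property
    def total(self) -> int:
        return self.female + self.male + self.non_binary + self.unknown

# Example: 40 trainees whose gender breakdown could not be determined.
print(TraineeCounts(unknown=40).total)  # 40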
How do I determine if I should report an innovation as an output (innovation development) or an outcome (innovation use)?
Innovations can be reported at both the output and outcome level. The development of an innovation (at various stages of design, testing and validation) is an output. The scaling or use of an innovation is an outcome. You may first report an innovation at the output level. If the innovation advances and starts to be used, it should then be reported as an outcome.
Do I need to update innovations that were submitted during previous reporting periods?
Yes, all innovations that were previously reported need to be updated or validated. Upon confirming that the innovation development is “active”, the submitter is asked to validate or update data already in the PRMS Reporting Tool and provide additional information on missing or new data fields. When innovation development is “inactive”, the submitter is asked to indicate the reason for the inactive status.
When and how do I report the readiness of an innovation?
Detailed guidelines for selecting the readiness level are available (updated October 2024), and CGIAR has developed an innovation readiness calculator to determine the level. The Innovation team proposes the readiness score and provides evidence to support it; this is reviewed by the CGIAR quality assurance team. Note that at the output level, you will have only a generic innovation readiness score for the core innovation: if it is level 7 in Kenya, level 3 in Peru, and level 5 in India, only the highest score (level 7) is retained.
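For the retention rule above, a one-line illustration (the scores come from the example in the answer; the variable names are hypothetical):

# Only the highest country-level score is retained as the generic readiness score.
scores_by_country = {"Kenya": 7, "Peru": 3, "India": 5}
generic_readiness = max(scores_by_country.values())
print(generic_readiness)  # 7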
What is an innovation profile and who develops these?
Innovation profiles are summaries of innovations reported in the PRMS and can be downloaded as automatically generated PDFs. Teams may also choose to create more polished versions of these profiles with support from their Center’s graphic designers. These enhanced versions can feature high-quality images, engaging visuals, and customized formatting. To support this process, the IPSR team can provide a sample of a polished profile as a reference.
Should I report an innovation that is at an early stage, when there is a chance that it may not develop further?
To set benchmarks and to demonstrate and track progress over time, it is important to track work across the portfolio at early as well as later stages of innovation development and use. If your Program/Accelerator/Project has invested time and financial resources into an idea for an innovation, it should be reported in the PRMS.
Can innovations reported in 2022 or 2023 (but not updated in 2024) be packaged and reported at the innovation use level using the IPSR approach?
Yes, you may include inactive innovations in an innovation package report in PRMS.
When should I mark an innovation as inactive or discontinued (versus no evidence of progress for the reporting year)?
“Inactive/discontinued” should be selected when no investment was made in advancing the innovation in the reporting year.
How do I report a previously submitted innovation that has been continued with additional investment, but with no significant changes in data?
This should be categorized as an "active/continued" innovation. Data fields can be reviewed and updated where necessary, but it is also acceptable to resubmit the 2024 data as is. Submitters may be prompted to provide data on new innovation data fields.
Are “inactive/discontinued” innovations considered submitted results?
No, they are not considered submitted results. Once an Innovation Development result is tagged as inactive, the submit button is disabled.
What will happen with results that are labelled “inactive/discontinued” in the PRMS Reporting Tool?
Inactive/discontinued innovations will still be visible in the PRMS Reporting Tool. All innovations (active or inactive) will be automatically replicated every year and be available to be updated once a new reporting phase is opened.
Do “inactive/ discontinued” innovations go through the quality assessment (QA) process?
Inactive innovations will not go through the QA process.
Can innovations marked "inactive/discontinued" be reopened, edited and submitted as continued/active?
Yes, inactive/discontinued innovations can be reopened, edited and submitted as continued/active.
How should I distinguish between the testing and validation readiness levels for innovations, and what kinds of evidence are required to justify each?
Levels 4, 6, and 8 represent testing stages, where the innovation is actively evaluated for its ability to achieve specific impacts under varying conditions:
Level 4 involves testing in a fully controlled environment,
Level 6 in semi-controlled conditions, and
Level 8 in uncontrolled, real-world settings.
For these testing levels, the appropriate evidence should include data and documentation from the ongoing trials or studies that will demonstrate the innovation’s performance under the specified conditions:
Level 4 requires controlled environment testing evidence, such as lab reports or experimental data.
Level 6 needs evidence from semi-controlled conditions, like pilot study results where not all variables are regulated.
Level 8 calls for field trial data or initial user feedback to demonstrate performance in real-world settings.
In contrast, levels 5, 7, and 9 are validation stages, where evidence from prior testing is used to confirm that the innovation can achieve the desired impact under specific conditions:
Level 5 confirms readiness based on results from fully controlled tests,
Level 7 validates readiness in semi-controlled environments, and
Level 9 establishes readiness in uncontrolled, real-world contexts, with limited or no involvement of CGIAR.
For these validation levels, evidence should provide conclusive reports that the innovation has successfully met impact criteria based on previous testing:
Level 5 requires a validation report that confirms performance in a controlled environment.
Level 7 needs documentation, like a summary of pilot results, showing validation in semi-controlled settings.
Level 9 requires evidence of validation in real-world conditions with limited or no involvement of CGIAR, such as field data.
In summary, testing levels require evidence of active evaluation under specific conditions, while validation levels require confirmation that testing results prove the innovation’s readiness for impact.
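A compact restatement of the testing/validation distinction above as a lookup table; this is a summary aid only, and the authoritative definitions remain the CGIAR readiness guidelines and calculator:

# Summary of levels 4-9 as described above (stage, setting, typical evidence).
READINESS_LEVELS = {
    4: ("testing",    "fully controlled environment",      "lab reports or experimental data"),
    5: ("validation", "fully controlled environment",      "validation report confirming performance"),
    6: ("testing",    "semi-controlled conditions",        "pilot study results"),
    7: ("validation", "semi-controlled conditions",        "summary of pilot results"),
    8: ("testing",    "uncontrolled, real-world settings", "field trial data or initial user feedback"),
    9: ("validation", "uncontrolled, real-world settings", "field data with limited or no CGIAR involvement"),
}

def describe_level(level: int) -> str:
    stage, setting, evidence = READINESS_LEVELS[level]
    return f"Level {level}: {stage} in {setting}; typical evidence: {evidence}."

print(describe_level(6))
# Level 6: testing in semi-controlled conditions; typical evidence: pilot study results.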
How do I determine if I should report an innovation as an output (innovation development) or an outcome (innovation use)?
Innovations can be reported at both the output and outcome levels. The development of an innovation (at various stages) is an output. The scaling, uptake, and use of an innovation can be reported as an outcome. In the PRMS, the two results can be linked.
What are the two ways to report innovation use results (outcomes)?
You can report innovation use results (outcomes) via a non-IPSR pathway or via the IPSR pathway.
Non-IPSR pathway/reporting single innovation use: Any innovation use result can be reported using the non-IPSR pathway. In 2025 this pathway is largely unchanged from 2024, with a small change in specifying users and disaggregating youth, if applicable. The non-IPSR pathway records only (i) innovation use type, (ii) innovation use quantity, and (iii) evidence to support innovation use reporting (see the sketch after this list).
IPSR pathway/reporting innovation bundle/package use: The IPSR pathway is the advanced way to report innovation use as part of an innovation bundle/package. Only those innovations that have been reported and quality assured (at the output level) in the PRMS can be reported as part of an innovation package (outcome level). A data template is available on the P&R Hub here. The IPSR pathway supports innovation teams and partners to co-design innovation bundles and packages, assesses these packages to identify key bottlenecks and opportunities, and creates a starting point for developing scaling strategies.
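As an illustration of the three data points captured by the non-IPSR pathway (the field names are hypothetical and do not mirror the actual PRMS schema):

from dataclasses import dataclass, field

# Hypothetical structure for illustration only; the PRMS defines the actual fields.
@dataclass
class InnovationUseResult:
    use_type: str                                  # (i) innovation use type
    use_quantity: int                              # (ii) innovation use quantity
    evidence: list = field(default_factory=list)   # (iii) supporting evidence links or uploads

example = InnovationUseResult(
    use_type="people using the innovation",
    use_quantity=1200,
    evidence=["https://example.org/adoption-survey-report"],
)
print(example.use_quantity)  # 1200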
How do I package an innovation using IPSR that has not yet been reported and quality assessed in the PRMS?
Report this as innovation development at the output level before the submission deadline for the QA process. Once the QA process is finalized, this will enable innovation package reporting. To collect the required data, an IPSR workshop can be organized.
Which results can be reported for innovation use/IPSR pathway?
As part of the IPSR pathway, only innovations that have been reported and QA-ed at output level as innovation development can be selected for reporting of innovation use at outcome level. As part of the non-IPSR pathway, any innovation use can be reported.
Why should I choose the IPSR pathway?
Using the IPSR pathway allows you to demonstrate innovation/scaling synergies and collaboration with other CGIAR Initiatives and partners around innovation packages. It is recommended to report innovation use through the IPSR pathway as this supports the development of:
Scaling ambition (aligned with End of Initiative outcomes): an agreed-upon statement that includes information on where, with whom, for whom, and by when innovation scaling will contribute to outcomes and impacts.
Innovation Package: enabling conditions that will support achieving the scaling ambition in a specific context.
Scaling Readiness assessment: insight into how ready the innovation package is for scaling and what the key bottlenecks and opportunities are.
An initial light Scaling Strategy (a more mature, structured Scaling Strategy is forthcoming), developed through multi-stakeholder discussions and documenting the best ways forward for scaling an innovation.
The IPSR pathway collects all the data that is collected through the non-IPSR pathway, but also generates valuable learning for CGIAR and partners to develop innovation and scaling strategies. Furthermore, the IPSR pathway can be used to show progress towards impact at scale against a scaling ambition, rather than just providing current innovation use numbers. Using the IPSR pathway also gives you access to Scaling Challenges/Funds.
Is there guidance on adding complementary innovations/enablers/solutions as part of innovations use – IPSR pathway reporting?
Yes, please see this guidance note, which provides details on adding complementary innovations/enablers/solutions.
Q&A coming soon.