Last updated: 10 March 2023
Here we provide details about how our teams of volunteers work to search for, select, appraise and display evidence reviews and overviews in order to inform decision-making for environmental management.
How we search for reviews and overviews for CEEDER
To find environmental evidence reviews, the CEEDER Editorial Team conducts searches in bibliographic platforms/databases including CAB Direct, Scopus, and Web of Science Core Collection. These searches are updated at weekly and monthly intervals. We also use search engines to capture non-commercially published ('grey') literature; these searches are updated every three months.
How we select reviews and overviews for CEEDER
The records obtained by searching are screened by the CEEDER Screening Team of trained volunteers to retain reviews and overviews that meet the eligibility criteria.
Reviews and overviews are first screened on title and abstract, independently by two reviewers. Where their decisions conflict, a third reviewer makes the final decision to keep or discard the synthesis. Records that pass this stage are then screened on the full-text document against a further set of criteria.
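The dual-screening rule described above can be sketched as a small decision function. This is an illustrative sketch only; the function name and signature are assumptions, not part of the CEEDER workflow tooling.

```python
from typing import Optional


def screening_decision(reviewer_a: bool, reviewer_b: bool,
                       third_reviewer: Optional[bool] = None) -> bool:
    """Return True if a record is retained after title/abstract screening.

    Two reviewers screen independently; if they agree, their shared
    decision stands. If they disagree, a third reviewer's decision
    is final. (Hypothetical helper for illustration.)
    """
    if reviewer_a == reviewer_b:
        return reviewer_a
    if third_reviewer is None:
        raise ValueError("Discrepancy: a third reviewer's decision is required")
    return third_reviewer
```

For example, two "include" votes retain the record, while a split vote defers to the third reviewer: `screening_decision(True, False, third_reviewer=False)` discards it.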
When screening at full text, primary metadata are also extracted, such as the PICO/PECO structure of the review question, details of the PICO or PECO key elements, whether the article is a review or an overview, and the general question it addresses. This information is stored in the CEEDER database to help users find relevant reviews.
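The metadata fields listed above could be represented as a simple record, for instance as a Python dataclass. The field names below are illustrative assumptions; the actual CEEDER database schema may differ.

```python
from dataclasses import dataclass


@dataclass
class ReviewMetadata:
    """Illustrative record of the metadata captured at full-text screening."""
    title: str
    synthesis_type: str               # "review" or "overview"
    question: str                     # general question addressed by the article
    population: str = ""              # P of PICO/PECO
    intervention_or_exposure: str = ""  # I (intervention) or E (exposure)
    comparator: str = ""              # C
    outcome: str = ""                 # O
```

A screener would fill one such record per retained article, which can then be indexed and searched by users of the database.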
How we appraise reviews and overviews
Four rounds of appraisal are organised each year. A table of selected full-text reviews and overviews is made available to CEEDER Review College members, who rate the reliability of each evidence review using CEESAT.
Two Review College members apply CEESAT independently to each evidence review or overview to minimise bias and ensure consistent ratings. Disagreements in rating are resolved by the CEEDER Editorial Team.