The CEE Database of Evidence Reviews (CEEDER) is a new open access Evidence Service, still under development but available for users to trial. Please read the brief overview below.
We welcome suggestions that will help us develop and improve this service. Please provide feedback via our questionnaire.
- Use CEESAT to critically appraise an Evidence Review or Evidence Overview
- Checklist for Editors and Peer Reviewers of evidence reviews
What is CEEDER?
CEEDER is an open access evidence service to help evidence consumers find reliable evidence reviews and syntheses to inform their decision making. The CEEDER project team collates all forms of evidence reviews and syntheses relevant to environmental management, as they are published or otherwise become available globally. The CEEDER database lists available (commercially published and ‘grey’) syntheses of primary research (e.g. critical reviews, meta-analyses, systematic reviews, rapid reviews) conducted to assess evidence on a specific question of environmental policy or management relevance. Most importantly, the database provides an independent assessment, conducted to preset quality criteria (see CEESAT below), of the reliability of each synthesis with respect to its use in decision making. Presentation of the assessment is tailored to the needs of decision makers and other evidence consumers in governmental, non-governmental and private sectors as well as the general public.
The aims of CEEDER are not only to guide evidence consumers to reliable reviews but also to provide tools to authors, editors and peer reviewers for improving the reliability of future reviews in the environmental sector (see Checklist for editors and peer reviewers).
© 2021. CEEDER is licensed under a CC BY-SA 4.0 licence.
CEEDER contains research syntheses relevant to decision making in all areas of environmental management and policy. The service is global in scope and we collate the reviews using regular systematic searches of relevant databases and search engines. The service started with reviews published in 2018. Collated reviews are checked against eligibility criteria. The database contains ‘evidence reviews’, which claim to make some form of quantitative assessment of impacts on the environment or of the effectiveness of interventions of direct interest to management or policy, and will soon include ‘evidence overviews’, which claim to have collated and/or mapped the existing evidence on environmental impacts and interventions.
Purely descriptive reviews or ‘expert’ opinion articles are not included unless the authors claim to provide a measure of effect. The specificity of the question varies from broad global issues to precise cause and effect relationships in single species or restricted areas. Human wellbeing is included when there is also a significant environmental component in the question.
The boundary of subject scope between environmental management and other sectors is somewhat subjective. At present the following areas are not specifically covered:
- Animal veterinary science, nutrition and behaviour
- Plant physiology, nutrition, improvement, and growth regulation
- Engineering and construction
- Biotechnology and bioengineering
- Human health, education, social welfare and social justice
- Toxicology

Estimates of change without investigation of cause are also excluded.
How the service can be used
Faced with a decision or question that might be informed by evidence, the user can search for relevant syntheses using a simple keyword system (a guide to searching will be provided). Syntheses of potential relevance will be listed together with their ratings for reliability. The user can then choose the most relevant and reliable syntheses, taking note of their limitations. Links will be provided to the location of each synthesis article. (Please note: CEEDER cannot provide access to the full text of articles for copyright reasons, so a subscription may be required.)
Using CEESAT criteria as an indicator of reliability of a review
Each synthesis is appraised for its rigour and reliability on an objective four-point scale using our CEESAT tool. Research syntheses gain credit both for rigorous conduct and transparent reporting and for the quality of the primary data and the synthesis itself. Consequently, it is not strictly scientific merit that is being judged, but the reliability and utility of the synthesis findings for decision making. Syntheses are scored in the context of the question(s) they address.
The CEESAT checklist provides a point-by-point appraisal of the confidence that can be placed in the findings of an evidence review, by assessing the rigour of the methods used in the review, the transparency with which those methods are reported, and the limitations imposed on synthesis by the quantity and quality of available primary data. Note that CEESAT does not distinguish between reviews that did not employ methodology that reduces the risk of bias and increases the reliability of findings, and reviews that may have employed such methodology but did not report it.
The CEESAT criteria were developed to critically appraise reviews in terms of transparency, repeatability and risk of bias. For each of 16 elements, a review is rated using four categories (Gold, Green, Amber, Red) of review methodology. Gold equates to the top standard currently recognised in review conduct by CEE, and Red is regarded as unreliable. When you find a review relevant to your evidence needs, you can either use the scores as a whole to judge the review's reliability or look at particular elements that matter for the context in which you are working. Although the categories are also given “scores” from 1–4, using total or mean scores to compare review reliability is not necessarily meaningful and we advise against this in any context except a crude “eyeballing”. It may be more important to understand which elements of a review score Red or Amber and may therefore be deficient.
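To illustrate why element-wise inspection can be more informative than an overall score, here is a minimal sketch of how a CEESAT-style appraisal might be inspected programmatically. The numeric mapping (Gold = 4 down to Red = 1) follows the description above; the element names, example ratings and helper functions are purely illustrative assumptions, not part of CEEDER or the actual CEESAT checklist.

```python
# Illustrative sketch only: element names and ratings below are
# hypothetical, not actual CEESAT elements or CEEDER data.

CATEGORY_SCORE = {"Gold": 4, "Green": 3, "Amber": 2, "Red": 1}

def flagged_elements(ratings):
    """Return the elements rated Red or Amber, i.e. possible weak points."""
    return {elem: cat for elem, cat in ratings.items()
            if CATEGORY_SCORE[cat] <= 2}

def mean_score(ratings):
    """Crude overall score; suitable only for rough 'eyeballing'."""
    return sum(CATEGORY_SCORE[cat] for cat in ratings.values()) / len(ratings)

# A hypothetical review: strong on most elements but weak on two.
example = {
    "Review question":      "Gold",
    "Search strategy":      "Green",
    "Eligibility criteria": "Green",
    "Critical appraisal":   "Red",
    "Data extraction":      "Amber",
    "Synthesis method":     "Gold",
}

# The mean (~2.83, between Amber and Green) looks respectable,
# yet two individual elements may make the findings unreliable.
print(round(mean_score(example), 2))
print(flagged_elements(example))
```

The point of the sketch is that a middling average can mask a Red rating on a critical element, which is why the per-element view is recommended over totals or means.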
CEEDER is a free service provided by CEE thanks to the many volunteers who make up the Editorial Board and Review College, whom we gratefully acknowledge.