
Study found that effect sizes from randomised and non-randomised controlled trials may differ in some circumstances and that these differences may be associated with factors confounded with design


The full text of this issue is available as a PDF document from the Toolkit section on this page.

Abstract

OBJECTIVES

To determine whether randomised controlled trials (RCTs) lead to the same effect size and variance as non-randomised studies (NRSs) of similar policy interventions, and whether these findings can be explained by other factors associated with the interventions or their evaluation.

DATA SOURCES

Two RCTs were resampled to compare randomised and non-randomised arms. Comparable field trials were identified from a series of health promotion systematic reviews and a systematic review of transition for youths with disabilities. Previous methodological studies were sought from 14 electronic bibliographic databases (Applied Social Sciences Index and Abstracts, Australian Education Index, British Education Index, CareData, Dissertation Abstracts, EconLIT, Educational Resources Information Centre, International Bibliography of the Sociological Sciences, ISI Proceedings: Social Sciences and Humanities, PAIS International, PsycINFO, SIGLE, Social Science Citation Index, Sociological Abstracts) in June and July 2004. These were supplemented by citation searching for key authors, contacting review authors and searching key internet sites.

REVIEW METHODS

Analyses of previous resampling studies, replication studies, comparable field studies and meta-epidemiology investigated the relationship between randomisation and effect size of policy interventions. New resampling studies and new analyses of comparable field studies and meta-epidemiology were strengthened by testing pre-specified associations supported by carefully argued hypotheses.
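Purely as an illustration (not drawn from the report's data), the design question the review probes — whether a non-randomised comparison arm can shift the apparent effect size through baseline confounding — can be sketched with simulated outcomes. The effect-size measure (Cohen's d), the sample sizes, and the assumed 0.3 SD baseline imbalance in the non-randomised arm are all hypothetical choices for this sketch:

```python
import random
import statistics

def cohens_d(treatment, control):
    """Standardised mean difference with pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = statistics.stdev(treatment), statistics.stdev(control)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

random.seed(42)

# Simulated outcome scores with a true intervention effect of +0.5 SD.
treated = [random.gauss(0.5, 1.0) for _ in range(200)]

# Randomised control arm: drawn from the same population as the treated group.
randomised_control = [random.gauss(0.0, 1.0) for _ in range(200)]

# Non-randomised comparison arm: baseline shifted by -0.3 SD (confounding).
nonrandomised_control = [random.gauss(-0.3, 1.0) for _ in range(200)]

d_rct = cohens_d(treated, randomised_control)
d_nrs = cohens_d(treated, nonrandomised_control)
print(f"RCT effect size:  {d_rct:.2f}")
print(f"nRCT effect size: {d_nrs:.2f}")
```

Under these assumptions the non-randomised comparison inflates the estimated effect by roughly the size of the baseline imbalance, which is the kind of design-confounded difference the resampling and meta-epidemiological analyses were designed to detect.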

RESULTS

Resampling studies offer no evidence that the absence of randomisation directly influences the effect size of policy interventions in a systematic way. Prior methodological reviews and meta-analyses of existing reviews comparing effects from RCTs and non-randomised controlled trials (nRCTs) suggested that effect sizes from RCTs and nRCTs may indeed differ in some circumstances and that these differences may well be associated with factors confounded with design. No consistent explanations were found for randomisation being associated with changes in effect sizes of policy interventions in field trials.

CONCLUSIONS

From the resampling studies we have no evidence that the absence of randomisation directly influences the effect size of policy interventions in a systematic way. At the level of individual studies, non-randomised trials may lead to different effect sizes, but this is unpredictable. Many of the examples reviewed and the new analyses in the current study reveal that randomisation is indeed associated with changes in effect sizes of policy interventions in field trials. Despite extensive analysis, we have identified no consistent explanations for these differences. Researchers mounting new evaluations need to avoid allocation bias; new policy evaluations should therefore adopt randomised designs wherever possible.
