Long ago the social work profession embraced the notion that we can and should build an empirical evidence base to inform (but not dictate) practice, program development, and policy. Social work research output has grown rapidly over the last few decades, yet the fruits of this labor are still gathered up with largely unscientific (haphazard) review methods. Scientific methods of research synthesis are available, but are widely misunderstood and underutilized in social work and related disciplines. Empirical evidence shows that unscientific review methods increase bias and error and can lead to the wrong conclusions. Thus, we cannot have a scientific evidence base for social work without scientific syntheses of results of relevant research.
One of the Grand Challenges for Social Work involves building and maintaining comprehensive and reliable summaries of empirical evidence to inform practice, program development, and policy. This will require (a) building knowledge and skills to conduct and utilize scientific syntheses and (b) developing the standards, incentives, and infrastructure needed to reduce bias and error in the reporting, dissemination, and synthesis of research results. Further, all of the Grand Challenges for Social Work ought to be informed by scientific research syntheses, rather than haphazard research reviews.
Basic principles of science apply to research reviews as well as to primary studies: if we care about the reliability and validity of results, then our sampling procedures, data collection methods, and analytic methods matter. These principles hold regardless of whether we are studying individuals, families, or previous studies.
Most published research reviews fall far short of current international, interdisciplinary standards for research syntheses. Narrative reviews of convenience samples of published studies are affected by well-known biases, and do not provide the comprehensive, accurate summaries of empirical evidence needed to inform practice, program development, and policy. Most of the so-called “systematic reviews” and meta-analyses that appear in top social work and psychology journals do not meet evidence-based criteria for the conduct, reporting, and critical appraisal of research reviews.
This Grand Challenge for Social Work involves building an infrastructure that truly supports the scientific synthesis and dissemination of credible research results of all kinds. This should not be limited to reviews of RCTs or quantitative studies, nor should it be limited to questions about intervention effects (questions about the nature, prevalence, and correlates of various conditions; consumer preferences; and other topics are relevant for practice, program development, and policy as well).
The infrastructure needed for scientific synthesis involves: (a) education about synthesis methods (e.g., in doctoral programs), (b) incentives for scientific syntheses and disincentives for haphazard reviews (in universities and journals), (c) prospective registration of primary studies (required by IRBs or funders), and (d) full reporting of research results (including null and negative results). Similar steps are well underway in medicine and related fields, where substantial improvements in the transparency and accuracy of research reviews have been achieved over the last 10 to 20 years.
Through greater collaboration, we can close the knowledge and resource gaps that have impeded progress toward scientific research synthesis.