In many ways, I find that Stephen King's description of the fictional setting of his "Dark Tower" series, "the world has moved on," applies to aspects of research. While I acknowledge the interest in Big Data, that is not what I have in mind now. My current concern is the thing variously referred to as a "literature review" or "review of the literature," and its associated variations. To put it bluntly, I believe reviews of literature are increasingly exercises in process rather than relevant, useful contributions to knowledge.

I acknowledge that approaches to reviewing (findings from) prior research have been undergoing refinement for years. Meta-analyses and other (quantitative) pooled-results reports, along with meta-syntheses and other (qualitative) integrated-results reports, have been around for decades. There are systematic reviews, mostly identified by use of a systematic, thorough but not typically exhaustive search and selection process; scoping reviews, characterized by a narrower scope; and even narrative reviews, which tend to draw on a (preference-based? convenience-based?) sample of research summaries. There are increasing alternatives in software programs and applications to support these efforts, including general qualitative data analysis software (QDAS) programs, although Microsoft Excel can often be made to do the same work. Conventionally, reviews of prior literature are either used to build rationales for new studies or comprise free-standing reports.

As anyone who has used library database programs or database aggregators can attest, the number of hits (i.e., articles displayed following a keyword or other type of query) has increased exponentially since the turn of the century. People use various criteria to limit the number of sources that have to be screened, including publication date and attributes of the research such as publication language, study design, and participant characteristics, among others.
Sometimes these criteria are arbitrary, sometimes they are logical, and sometimes they seem mostly to represent the easiest way to greatly reduce the sample. But I think the biggest limitation to the usefulness of, or potential interest in, any review of literature is its reliance on published research, usually peer-reviewed published research. This may make a strange kind of sense, since the aim is often to use the results to justify another research study that will itself be published in a peer-reviewed journal. There are ways to search for "gray" (unpublished) reports, although often any contacts are limited to people working in academic institutions and/or people working on government- or foundation-funded research, because funding searches are another way to identify relevant work. But I know there is good stuff out there that is not published, not done as part of employment by an academic institution, not funded through federal agencies or large foundations, and that otherwise just slips under everyone's radar. (How do I know this? Because I have happened upon things; I describe a couple of examples below.)
I have been dissatisfied with my own efforts to conduct any free-standing review of research for a few years now. I have also seen the review of literature treated as the "go-to" approach for myriad exploratory or investigative processes. It is heavily associated with reliance on the traditional steps of the "scientific method," which emphasize logic over learning, planning over iteration, and reductive rather than expansive thinking. Problems associated with literature reviews in particular include the quality of the work available, the effort required to identify and screen sources, and the need for substantial human resources to do anything reasonably well. Comprehensive reviews of literature, including pooled and integrated methods, work best when done quickly, in a well-organized way, and when addressing an emergent topic and/or one with a limited body of work: tens to hundreds rather than thousands of publications.

My solution in recent years has been to generally avoid these projects and engage only in mini-reviews: focused searches to identify the best-quality examples of relevant work that can be summarized to introduce, contextualize, and rationalize my own novel research. This is best done while planning and/or proposing one's own work, but it should also be updated when writing up a research report, because often someone else has seen the same need for research that inspired your work and published theirs while you were still doing yours. Some years ago, I was working with a student to investigate impact factors, and I came upon a persuasive argument to abandon publication/citation-based methods and instead investigate literal impact: that is, who has used the research results? If your study is an intervention, has there been additional uptake? Are there local, regional, or larger-scale changes in outcomes or other trends that might be attributed to your study?
Outside of funded research and evaluation efforts, I think no one knows what happens with a lot of research after it is published, other than the number of times it is cited. And I'm not sure most people even check to see whether they are being cited in a complimentary, or at least neutral, way rather than being held up as an example of really bad research! But I have carried around this idea of practical "impact" for some time now. I think to some extent that doing good work, such as conducting research to improve or refine beneficial programs, is more important than publishing papers. But for most academic faculty or institutional researchers (who, working with, without, or as doctoral students, contribute most published research reports), reports count for a lot and practice not so much.

In a recent issue of Mother Earth News, there is a really nice citizen science report describing the results of using alternative (natural, organic) treatments for basil plants, emphasizing leaf litter (Bowman, 2022). The researchers conducted a systematic trial and described their results. This engaging and informative report is probably not publishable in a scholarly journal without a lot of revision and additional content, including a substantial review of prior research, and even then, odds are that editor and peer input would result in a rejection or a substantially altered work that may no longer be engaging. And it certainly would not be as likely to reach the target audience: people who grow organically, or aspire to grow organically successfully. I acknowledge this report does have a sort of review of literature (i.e., the author describes using a database search to find two articles, neither about basil and neither described).
The study also received research funding, from the USDA via the University of Minnesota, but this is the type of small grant not likely to be identified by a researcher as a source of research; nor would Mother Earth News, a popular rather than academic publication, be included in most people's searches. I had a couple of thoughts about this report. One is that I want to use it as an example of citizen science in the research methods classes I teach, to show students that it is possible to do good work outside of an academic or other institutional setting. The other is that I wonder how much other useful research is out there: in popular media, on websites (for example, something like this: https://www.lgbtqohio.org), in internal agency presentations or meeting notes, and wherever else. It seems like things that reflect good practice, like the Mother Earth News article, ought to be citable and cited.

And this made me start thinking about the entire approach to reviews of literature, which are biased toward published research and so, by necessity, are biased toward work done by certain types of people, working for certain types of employers, and often with certain types of priorities. What might it look like to conduct and write up a review of research (as opposed to literature) that represents a diligent search across multiple types of sources, to capture work published in popular media, work shared on websites, and work found elsewhere? The internet goes some way toward facilitating this, although there are still low-tech things that will be completely missed. But it is a lot more doable than, for instance, 20 years ago. I, for one, would welcome the chance to edit (in my current role as Editor-in-Chief of the Ohio Journal of Public Health), or review, a novel, forward-thinking, postmodern approach to a review of literature that reflects diverse sources!

Reference

Bowman, D. (2022, August/September). Leaf litter to the research: Free fertilizer from your backyard. Mother Earth News, Issue 313, 42-45.

Photo by me, Franklin Street, Kent, Ohio, USA. Color, contrast, and light aspects altered with Photos for Mac, version 7.
I am Sheryl L. Chatfield, Ph.D., C.T.R.S. I am a member of the faculty in the College of Public Health at Kent State University. I also co-coordinate the Graduate Certificate in Qualitative Research and am a member of the Design Innovation Team at Kent State.