Editorial

The importance of strong fundamentals in scientific methodology


Rogério Souza1,2

DOI: http://dx.doi.org/10.1590/S1806-37562018000500005

In recent decades, there has been considerable growth in the scientific literature. Although that growth has not been uniform across the different fields of science, a persistent trend can be seen in the various databases available. In PubMed, for example, the annual growth rate exceeded 5% between 1997 and 2006.(1) More recently, increases in the number of online journals and the publication of collections of abstracts presented at conferences, as well as expanded access to databases, might have further influenced such growth.

However, such growth is not free from bias; quite the opposite. A few years ago, an article published in The Economist, entitled "Unreliable research - Trouble at the Lab,"(2) called attention to numerous problems with the state of the scientific literature at that time, such as the low reproducibility of published studies and biases related to the exclusive publication of studies showing positive results, potentially influenced by funding sources. In addition, within the academic environment, the increasing pressure to publish results in low-quality articles and in the "salami slicing" phenomenon, the practice of dividing a single research project into multiple articles, which not only reduces their relevance but also increases redundancy.(3)

One of the mechanisms to minimize, or at least discourage, many of the current biases in the scientific literature is to provide researchers with more solid training in the fundamentals of scientific research. Instruction in those precepts, covering not only good practices in clinical research but also the associated technical aspects, should first be offered in undergraduate courses and should continue throughout the researcher's academic life, as continuing education.

In terms of the technical aspects, the entire rationale for the design of a study should be understood, from the development of the main research question(4) to the critical analysis of the methodology used and its limitations, as well as the appropriate use and interpretation of the various statistical tests. Graduate programs tend to focus on those aspects, because their main purpose is to prepare professors and researchers through training built around specific disciplines. However, such initiatives seem insufficient, given the breadth of the scientific environment and the limited scope of those disciplines.

Scientific journals also play a relevant, albeit less explored, role in this process, not only by creating mechanisms to identify and prevent biases associated with the scientific publishing process but also by disseminating the best practices to be followed. In both respects, there is a pressing need for scientific journals to improve their performance. First, they should be able to identify biases. It is well established that the peer review process, despite its various positive qualities, is unable to identify such biases, and the lack of alternative models that do not significantly delay publication has perpetuated this limitation of one of the most common editorial processes. One enormous opportunity that scientific journals have yet to take advantage of is the dissemination of methodological concepts. Few scientific journals in the field of internal medicine have sections dedicated to discussing the fundamentals of scientific research. The potential gains from disseminating this type of knowledge are quite significant, not only in terms of improving the training of researchers but also in terms of strengthening the critical thinking of readers in general, which can, over time, function as a mechanism to improve the quality of the available scientific research.

Over the past four years, what the JBP has specifically been doing is publishing a series of articles on continuing education in scientific methodology,(5) addressing topics ranging from how to structure a research project(4,6-8) to the proper interpretation of different types of studies.(9-11) We are now investigating the impact that the publication of that series has had on the JBP readership. In general terms, however, those articles are already being used as a point of reference by researchers in the field.

The dissemination of methodological concepts addresses only one small aspect of the larger problem. Continuing education obviously plays an important role, although other initiatives are needed in order to improve the scientific research scenario over the next few years. Funding agencies might have to take more direct action in that regard. Audits, mandatory reporting of the formal aspects of methodology, and the requirement that results be analyzed in a more conclusive fashion are measures that could be added to the current project review format without significantly increasing the bureaucracy involved in the submission and review processes.

Any interventions in the scientific publishing process should be agreed upon by consensus among the members of academia, funding agencies, scientific journals, and even readerships. Otherwise, the organic growth in the scientific literature will not be accompanied by a similar growth in quality.

REFERENCES

1. Larsen PO, von Ins M. The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index. Scientometrics. 2010;84(3):575-603. https://doi.org/10.1007/s11192-010-0202-z
2. The Economist [homepage on the Internet]. London: The Economist; c2018; [updated 2013 Oct 18; cited 2018 Oct 10]. Unreliable research. Trouble at the lab; [about 27 screens]. Available from: https://www.economist.com/briefing/2013/10/18/trouble-at-the-lab
3. Sasaki K, Tan S. Publication ethic (1) "salami slicing". J Hepatobiliary Pancreat Sci. 2018;25(6):321. https://doi.org/10.1002/jhbp.561
4. Patino CM, Ferreira JC. Developing research questions that make a difference. J Bras Pneumol. 2016;42(6):403.
5. Souza R. 2016 - a second step. J Bras Pneumol. 2016;42(1):5-6. https://doi.org/10.1590/S1806-37562016000100001
6. Ferreira JC, Patino CM. Types of outcomes in clinical research. J Bras Pneumol. 2017;43(1):5. https://doi.org/10.1590/s1806-37562017000000021
7. Patino CM, Ferreira JC. What is the importance of calculating sample size? J Bras Pneumol. 2016;42(2):162. https://doi.org/10.1590/S1806-37562016000000114
8. Ferreira JC, Patino CM. Choosing wisely between randomized controlled trials and observational designs in studies about interventions. J Bras Pneumol. 2016;42(3):165. https://doi.org/10.1590/S1806-37562016000000152
9. Ferreira JC, Patino CM. What does the p value really mean? J Bras Pneumol. 2015;41(5):485. https://doi.org/10.1590/S1806-37132015000000215
10. Ferreira JC, Patino CM. Understanding diagnostic tests. Part 1. J Bras Pneumol. 2017;43(5):330. https://doi.org/10.1590/s1806-37562017000000330
11. Patino CM, Ferreira JC. Understanding diagnostic tests. Part 2. J Bras Pneumol. 2017;43(6):408. https://doi.org/10.1590/s1806-37562017000000424
