Guest Column

Towards Six Sigma science

To improve the quality of science, we need to bring rigour to the processes that enable science.

Rejection rates for research papers submitted to top science journals are as high as 80-95%, according to a 2018 analysis of 400 papers submitted to the Journal of Obstetrics and Gynecology of India. The top three reasons cited for rejection were "poor methodology", "no new information" and "poor scientific content". This may suggest that the effort to keep published content pristine is working well. But it could equally point to the poor quality of scientific research itself, and that is a problem.

Global spending on R&D is at a record $1.7 trillion. In India, taxpayers fund R&D to the tune of $30 billion. If much of this spending results in output that journals find unacceptable, it is a matter of concern. That is compounded by the lack of reproducibility of published science: as a 2018 paper in the National Science Review notes, in cancer biology, almost 90% of published results were not reproducible. It is disturbing that even science that has been through the gauntlet of peer review can be so unreliable. Self-corrective mechanisms in science may fix this problem sooner or later. But there is a period during which the findings of a flawed study can cascade through the scientific community, spawning generations of research studies that are congenitally flawed.


One measure of the quality of science is the number of citations a published paper receives. A high citation score reflects the contemporary relevance of the work - and peers' opinion of the significance and reliability of the reported results. It is, however, a subjective metric, heavily influenced by the gloss of the author's seniority and popular standing. Furthermore, a citation score is generated post hoc: it does nothing to ensure the quality of research.

THE 'SCIENTIFIC METHOD'

To improve the output of scientific research, we must intervene in the manner of its production: the scientific process itself. The 'scientific method' has defined the principles to be followed in scientific research, but this is mostly a folksy, cottage industry-style approach that has remained untouched by the quality control developments seen in other sectors. Quality became an aspirational standard for the manufacturing sector in the latter half of the 20th century. The efforts of people like W. Edwards Deming elevated quality to totemic levels for the global automobile sector. Tools for process improvement, such as the Six Sigma standard, have been widely adopted.

Some scientists may recoil in horror at the mention of process improvement in the same breath as science, perhaps owing to a misperception that toning up the process of science will abridge freedom and creativity. At one level, science is not a commodity; scientific research is about charting a course with no prior knowledge of the destination. But the process of science is a well-understood commodity: good experimental design, proper controls, robust statistics, and logical reasoning. The journey of science may be unpredictable, but that is all the more reason why the ship on which scientists embark on this journey, and the tools of navigation they use (the scientific method), should be certifiably sea-worthy.

SWAMI SUBRAMANIAM: A clinical pharmacologist and neuroscientist, Dr Swami Subramaniam is CEO of Ignite Life Science Foundation.

To improve the quality of science, we need to bring rigour to the processes that enable it. As with an industrial process, the scientific process can be broken down into its components, and each component studied for ways to ensure better quality. The backbone of this will continue to be peer review, in which senior scientists with proven track records sit in judgement over the quality of the science. But the peer review process can be systematised. For example, checking and validating the experimental design and the use of controls must be made mandatory in every review. Likewise, reviewers must confirm that the proposed statistical analysis is appropriate. Just as companies run simulations on processes before deployment, the outcomes of experiments can be anticipated so that the analysis and interpretation can be tested for robustness before the real experiment gets under way. The funding process, too, can be improved by shifting the focus from the bureaucratic accounting and financial aspects of a project to an evaluation of the quality of the ideation and of the experimental approach proposed.
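One way to read the suggestion about anticipating outcomes is as a simulation-based check of the planned analysis: before any data are collected, the researchers (or a reviewer) can generate data from the planned experiment under an assumed effect size and see how often the proposed statistical test would actually detect it. The sketch below is purely illustrative; the two-group design, the sample size, the effect size and the choice of Welch's t-test are assumptions made for the example, not prescriptions from this column.

```python
# A minimal sketch of 'anticipating outcomes': simulate a planned two-group
# experiment under an assumed effect size, run the planned analysis on each
# simulated dataset, and estimate how often it detects the effect.
# All numbers below are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n_per_group = 20        # planned sample size per arm (assumed)
assumed_effect = 0.5    # assumed difference between group means, in SD units
alpha = 0.05            # planned significance threshold
n_simulations = 10_000  # number of simulated replications of the experiment

detections = 0
for _ in range(n_simulations):
    control = rng.normal(loc=0.0, scale=1.0, size=n_per_group)
    treated = rng.normal(loc=assumed_effect, scale=1.0, size=n_per_group)
    # The planned analysis: Welch's t-test comparing the two arms.
    _, p_value = stats.ttest_ind(treated, control, equal_var=False)
    if p_value < alpha:
        detections += 1

power = detections / n_simulations
print(f"Estimated power with n={n_per_group} per group: {power:.2f}")
# A low estimate signals that the design, not the eventual interpretation,
# needs revision before the real experiment gets under way.
```

A check of this kind does not constrain what the experiment may discover; it only tells the researchers, in advance, whether the proposed design and analysis are capable of answering the question being asked.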

One of the bugbears of peer review is the scarcity of scientists willing to commit time and effort to it, given that reviewers are not adequately compensated. If we are to get leading scientists to give quality attention to peer reviews, some system of compensation - in cash or kind - must be worked out. Peer reviewers should also be trained and certified in the review process, so that they give balanced attention to all aspects of a proposal rather than being swayed by their own biases.

HARNESSING 'METASCIENCE'

It is time to put metascience - the use of scientific methodology to study science in order to improve its quality - to good use. Given the pivotal role that science has to play in the success of nations, the scientific enterprise must be improved. As the physician-scientist John Ioannidis put it, "Science is the best thing that has happened to human beings... but we can do it better." India does not have a choice. As an emerging nation aspiring to be a leader in science, we have to do science better.
 
