Almost to the day a year ago I wrote an article on this blog about the corruption of scientific inquiry. I used my own experiences with peer-reviewed research in economics and political economy, and pointed to what it means when social scientists in general, and economists in particular, fail or even cease to apply basic, well-tested methods of peer review and the relentless pursuit of better knowledge. While bad research in physics, chemistry and biology can have long-term negative consequences for the work of engineers, medical professionals and the pharmaceutical industry, people often overlook the consequences of subpar research in political science, political economy and economics.
In fact, the consequences of bad social-science research can be immediate, severe and systemic. The entire European economic crisis was escalated from a normal, albeit relatively serious, downturn in the business cycle to a depression as a result of economic policies founded on bad, poorly designed and poorly executed research.
However, the European crisis is only one example of many. The consequences of poorly executed research in economics are ultimately rooted in how graduate students are educated today. As I noted a year ago:
I have often wondered if the disappointing state of economics, both as an academic discipline and as a science, is due to the manic obsession of economists with multivariate regressions, second-degree differential equations and Cobb Douglas utility functions - or if it is part of a broader trend of corruption of modern science in general. One does not exclude the other of course, but with all the junk - pardon my Danish - that is being churned out by academic journals these days one cannot help wondering how we manage to keep the tabs of quality on it all.
One of the many problems with how economists are "brought up" these days is this emphasis on technical skills. The more time graduate students spend on mastering those, the less they will spend on theory, methodology and on developing the skills of systemic thinking that were so prevalent among economists in the first half of the 20th century.
As a result of the technicalization of economics, graduate students never develop the ability to actually pursue scientific inquiry. Their research, which builds their career from assistant professor to tenured faculty, takes the form of "the effects of X on Y". This type of question may be interesting in itself, but it quickly becomes trivial from a truly scholarly viewpoint. After all, do you need years of graduate studies at a reputable university to learn how to answer such questions?
No, you don't. Yet our graduate schools have become assembly lines for highly advanced correlative statisticians. As an unintended, but certainly accepted, consequence, the prevailing wisdom of what it means to do research is one of fragmentation and a narrow scope of what questions to ask - and not to ask.
A friend, a retired professor of economics, mentioned that at the universities where he taught during his academic career, the policy these days is to have every graduate student who has completed course work co-author three peer-reviewed papers and then defend their dissertation. That is, he said, how all grad students at those universities earn their doctorate these days.
This might seem like a trivial issue for the outside viewer. It is, however, one of the keys to understanding how a scientific discipline breaks down and loses its scholarly content. My own dissertation was a monograph - unheard of today - which forced me to comprehensively understand a systemic issue (in my case macroeconomic stability) all the way from highly theoretical contributions on the foundations of economics, through complex methodological writings on probability, uncertainty and prices, to what all this meant at the ground level in political practice.
I was no genius back then (nor will I ever be...). I merely followed a tradition of scholarly training and education that goes back centuries. I wrote my dissertation in the same tradition that far brighter minds had done, from the founders of economics to those who modernized it and made it applicable to a monetary, industrial economy.
However, this traditional form of scholarly growth has distinct merits that these days are being pushed to the margin. It is not just in economics and political economy (though I will continue to brag by referring to my upcoming book with Routledge, The Rise of Big Government, as an example of how to put systemic thinking to work) but as revealed in an article in Nature from May last year, the fragmentation of scientific thinking is working its way into the natural sciences as well.
In the natural sciences, reproducibility of experiments is a benchmark method for evaluating research. Nature reported that an alarming share of published research is irreproducible:
More than 70% of researchers have tried and failed to reproduce another scientist's experiments, and more than half have failed to reproduce their own experiments. Those are some of the telling figures that emerged from Nature's survey of 1,576 researchers who took a brief online questionnaire on reproducibility in research.
The article opens a frightening window into the production process of modern natural-science research; fortunately, in a survey that the Nature article reports, 88 percent of scientists admitted that there was a significant or slight reproducibility crisis in their disciplines. That number is grounds for some hope of a self-correcting process in the sciences, although one has to ponder the question asked by Michael Guillen, Ph.D. and former science editor at ABC News, in a recent article at The Global Warming Policy Forum:
Leaving unspoken this elephant-sized question: If we aren’t able to trust the published results of science, then what right does it have to demand more money and respect, before making noticeable strides toward better reproducibility?
Although reproducibility in the natural sciences and systemic thought in the social sciences are two different components of scholarly thought, they have one important thing in common. The subject of the natural sciences - nature in its broad form - lends itself to laboratory tests, hence the basic requirement of reproducibility. The social sciences, on the other hand, study the entirety of social human existence; the sum total of the outcomes of human interaction - social, moral, cultural, economic - is in many ways larger than the sum of the actions of individuals. It is this amended whole, larger than the sum of its parts, that constitutes the system we refer to as "society".
In order to understand human society, one cannot begin with laboratory-level studies. The reproducibility method, which is essential to the natural sciences, is alien to the foundation of social sciences. That does not mean it is irrelevant or misplaced within the social sciences - there is good use for it in psychology and microeconomics - but it does not help the social scientist build an understanding of the systemic properties of human society.
The breakdown of the reproducibility method in the natural sciences is a direct threat to the integrity of those sciences. By the same token, ignorance of systemic thought in the social sciences has already compromised the integrity of economics, and is on its way to doing the same with political science.
There are institutional reasons for why we as a society are weakening our scholarly intelligence. We have turned far too many of our academic institutions into skills factories. An academic program in economics today has more in common with an advanced vocational institution than a traditional place of scientific inquiry. Given the Nature article, and Dr. Guillen's comments in his Global Warming Policy Forum article, it is entirely possible that something similar is happening to the natural sciences.