In a follow-up to the last post on scientific misconduct, I think it is already becoming apparent that the incidence of "polishing" scientific data is on the rise. By "polishing" I mean omitting experiments that don't seem to fit ("didn't work"), or normalizing data so they "look nicer" even though the raw data already showed the same effect. Another example: a researcher, asked after a presentation why the background noise in his data was so low, answered that they had increased the contrast in the images "for illustrational purposes only".
This might seem innocuous at first glance, and surely doesn't involve any misconduct or ethically questionable behavior.
What it does give ample evidence of, however, is the current trend to eliminate the immensely important notion of caution and self-criticism from scientific practice. Mind you, this is not yet pervasive or dominant, but it is a trend!
If you have to hype your results and make the data "look nice" in order to sell them well, you are not only on a slippery slope, but you may also miss important alternative, less "flashy" explanations. In addition, editors and referees then have to cut through the hyperbole to get to the firm conclusions the paper actually allows.
The current "pressure-cooker" environment not only breeds cheats, but also leads to a dangerus development in which important control experiments are left out and to a research culture in which uncritical superlatives are the norm and not cautious optimism. The current scientific funding situation is rewarding hyperbole and excessive celebrization of "superstar" researchers and punishing cautious, tedious, meticulous, long-term research. Reward and punishment usually work very well in operant conditioning situations to shape behavior, so we shouldn't be surprised if future generations of "superstar" scientists don't quite live up to the hype in the long run.
Posted on Monday 29 January 2007 - 14:55:11