In a follow-up to the last post on scientific misconduct, I think it is already becoming apparent that the incidence of "polishing" scientific data is on the rise. By "polishing" I mean the omission of experiments that don't seem to fit ("didn't work"), or the normalization of data so they "look nicer" even though the raw data already showed the same effect. Another example is a researcher who, when asked after a presentation why the background noise in his data was so low, answered that they had increased the contrast in the images "for illustrational purposes only".
This might seem innocuous at first glance, and by itself it certainly doesn't constitute misconduct or ethically questionable behavior.
What it does give ample evidence of, however, is the current trend of eliminating the immensely important habits of caution and self-criticism from scientific practice. Mind you, this is not yet pervasive or dominant, but it is a trend!
If you have to hype your results and make the data "look nice" in order to sell them well, you are not only on a slippery slope, but you may also miss important alternative, less "flashy" explanations. In addition, editors and referees then have to cut through the hyperbole to get to the firm conclusions the paper actually supports.

The current "pressure-cooker" environment not only breeds cheats, it also leads to a dangerous development in which important control experiments are left out, and to a research culture in which uncritical superlatives, rather than cautious optimism, are the norm. The current scientific funding situation rewards hyperbole and the excessive celebration of "superstar" researchers while punishing cautious, tedious, meticulous, long-term research. Reward and punishment usually work very well to shape behavior in operant conditioning situations, so we shouldn't be surprised if future generations of "superstar" scientists don't quite live up to the hype in the long run.
Posted on Monday 29 January 2007 - 14:55:11
science   politics   science politics   misconduct   

