Re: funding lab with bitcoin donations
by niko on 04/06/2013, 20:24:22 UTC
Board: Beginners & Help
Quote
Hey all,

New here - howdy! I am a neuroscientist with a lab focused on finding treatments for brain disorders like Alzheimer's disease, stroke and traumatic brain injury. Given the current paucity of funding opportunities for science, I have decided to set up a lab bitcoin address to accept crowd-sourced donations for everyday lab expenses to keep the lab running between larger grant acquisitions... If anyone is interested, I'd be happy to set up a "donations" page thanking individuals for helping to support our science.

lab web address: www.behavioralneuroscience.org
lab BTC address: 1KxqK9w8uH8gvYNxDAg4FS8Soij2RwnhWE

Thanks!
Rich

--
Rich Hartman, PhD
Associate Professor
School of Behavioral Health
Loma Linda University

First of all, kudos for the brave, pioneering idea! Are there any examples of crowdfunded research at universities? I certainly have never heard of anything.

Next, to help people weigh supporting your research against spending their coins elsewhere, would you please share your thoughts on the following (taken from "The Truth Wears Off" by J. Lehrer):

Quote
In the late nineteen-nineties, John Crabbe, a neuroscientist at the Oregon Health and Science University, conducted an experiment that showed how unknowable chance events can skew tests of replicability. He performed a series of experiments on mouse behavior in three different science labs: in Albany, New York; Edmonton, Alberta; and Portland, Oregon. Before he conducted the experiments, he tried to standardize every variable he could think of. The same strains of mice were used in each lab, shipped on the same day from the same supplier. The animals were raised in the same kind of enclosure, with the same brand of sawdust bedding. They had been exposed to the same amount of incandescent light, were living with the same number of littermates, and were fed the exact same type of chow pellets. When the mice were handled, it was with the same kind of surgical glove, and when they were tested it was on the same equipment, at the same time in the morning. 

The premise of this test of replicability, of course, is that each of the labs should have generated the same pattern of results. “If any set of experiments should have passed the test, it should have been ours,” Crabbe says. “But that’s not the way it turned out.” In one experiment, Crabbe injected a particular strain of mouse with cocaine. In Portland the mice given the drug moved, on average, six hundred centimetres more than they normally did; in Albany they moved seven hundred and one additional centimetres. But in the Edmonton lab they moved more than five thousand additional centimetres. Similar deviations were observed in a test of anxiety. Furthermore, these inconsistencies didn’t follow any detectable pattern. In Portland one strain of mouse proved most anxious, while in Albany another strain won that distinction. 

The disturbing implication of the Crabbe study is that a lot of extraordinary scientific data are nothing but noise. The hyperactivity of those coked-up Edmonton mice wasn’t an interesting new fact—it was a meaningless outlier, a byproduct of invisible variables we don’t understand. The problem, of course, is that such dramatic findings are also the most likely to get published in prestigious journals, since the data are both statistically significant and entirely unexpected. Grants get written, follow-up studies are conducted. The end result is a scientific accident that can take years to unravel.
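To put some rough numbers on the quoted passage, here is a quick back-of-the-envelope sketch (mine, not from the article) of just how extreme the Edmonton result was relative to the other two labs. It takes Lehrer's "more than five thousand centimetres" as 5000 cm, which is an approximation on my part:

```python
# Sketch: how far the Edmonton cocaine result sits from the other two
# labs, using the extra distances (cm) quoted in the article.
from statistics import mean, stdev

# "more than five thousand" approximated as 5000 cm
distances = {"Portland": 600, "Albany": 701, "Edmonton": 5000}

# Treat Portland and Albany as the baseline pair
baseline = [distances["Portland"], distances["Albany"]]
m, s = mean(baseline), stdev(baseline)

# How many baseline standard deviations away is Edmonton?
z = (distances["Edmonton"] - m) / s
print(f"Portland/Albany mean: {m:.1f} cm, sd: {s:.1f} cm")
print(f"Edmonton sits about {z:.0f} sd above the other two labs")
```

With only two baseline points the standard deviation is itself very noisy, which is part of Crabbe's point: with samples this small, a single "invisible variable" can make one lab's result look like a dramatic discovery.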