Ok. This is my last post before I go to bed. Look at the opportunity cost of this discussion alone - bright minds who should be solving structures or developing algorithms - anything! - debating this instead.
However - as someone else remarked - will (a) anyone care about >90% of the structures in 50 years?
And (b) even if they do, is this continual improvement even worthwhile? I am always depressed by how little a model changes from the initial build to the final one, even when the R-free drops from 35% to 23%. All that work - and my biological interpretation would have been almost the same at the beginning as at the end.
The structures aren't important in themselves. It's the story they tell. So to me this is an effort to fix what ain't broke.
Adrian
Sent from my iPhone
On 28 Oct 2011, at 01:15, Michel Fodje <[log in to unmask]> wrote:
> Every dataset costs money to produce. Is it more cost-effective to expect that those wishing to use the data repeat the expenditure by repeating the experiments? To exaggerate the point, imagine a world without published research articles: would it be more expensive to do science, or less? We should not simply dismiss an idea just because we think today that "640K is more memory than anyone will ever need".
>
>
>> On Oct 27, 2011, at 3:47 PM, Adrian Goldman wrote:
>> 2) I agree with Susan. In a time of limited funding, is this the most important use of money?
>