On Thu, 12 Jun 1997, Mr C A Rusbridge wrote:
> Of course, many advocate going even further, and making the material
> freely available via the Internet. There are obvious problems in this. If
> you retain current models of peer review and the journal status hierarchy,
> there has to be a way to pay the costs of organisation, editorial,
> publishing etc. Perhaps up-front fees for publication with no user charges
> might work in the coming world, but getting there from here is not an easy
> road to map out. If you like the idea of a virtually free pre-print
> system, how does the quality assurance of peer review etc fit into this?

I've reviewed the odd paper and so have a few others round here, and we've
never been paid for it (unless you consider that the pay we get as
researchers covers it, though personally reviewing and reading is
something I do in bed or in the bath so I'd rather not think that
Loughborough University are paying for my bath time :-)). So if the
quality assurance of peer review is free to the paper publishers, why
should it cost anything for "virtually free pre-print"?

It then comes down to paying for editorial staff and servers to stick
the journals on. As dissemination of research information is one of the
goals of all this academic stuff we do, and we're actually already being
paid to do the dissemination by Universities and research grants (eLib is
expecting us to spend some of our time disseminating, right? :-) ) and
lots of academics are already editors of paper publications, I don't see
the first of these as a real problem. Of course the editing would have
to be done using networked cooperative working tools which some people
find easier to adjust to than others.

As for the second, the hardware and software needed to run an academic
preprints service can be quite small. I've got a widely known and popular
multicast comms archive and quite a few mailing list archives running on
my workstation. I don't even notice the slight additional load most of
the time. With departments increasingly putting P200s with >1GB hard
discs on people's desks when they buy new machines, I can't see CPU
cycles or discs as a big problem. Hell, I've even got my _own_personal_
machines running here on the campus network and supporting production
services (it's often easier to do this than get a grant or hassle people
for upgrades and you can get useful machines out of the trash these days).
I can't really believe that I and the other guys round here who do similar
things (for I am not alone in this) are oddballs and this doesn't happen
elsewhere in academia.
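To give a feel for just how little machinery is involved: at its most
basic, a preprint archive is a directory of files behind an HTTP server.
Here's a throwaway sketch using only the Python standard library (the
file name and abstract text are made up for illustration; this is not
what actually runs on my workstation):

```python
# A preprint "server" at its most minimal: a directory of papers
# served over HTTP, using nothing but the standard library.
import functools
import http.server
import os
import tempfile
import threading
import urllib.request

# A hypothetical archive directory containing one "paper".
archive = tempfile.mkdtemp()
with open(os.path.join(archive, "paper-001.txt"), "w") as f:
    f.write("Abstract: peer review costs the publisher nothing.\n")

# Serve that directory on an ephemeral port on localhost.
handler = functools.partial(
    http.server.SimpleHTTPRequestHandler, directory=archive)
server = http.server.HTTPServer(("127.0.0.1", 0), handler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any reader can now fetch the preprint with a plain HTTP GET.
body = urllib.request.urlopen(
    f"http://127.0.0.1:{port}/paper-001.txt").read().decode()
server.shutdown()
print(body)
```

The point being that the whole thing fits in a corner of one desktop
machine's resources, which is the scale of hardware problem we're
talking about.
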
And take a look at the RFC process - as well as the experimental,
historical and informational categories of RFCs that anyone can write,
we've got the three classes of standards-track RFC (Proposed Standard,
Draft Standard, and full Internet Standard) that are all subjected to
intensive peer review. These are free to the user. People, companies
and institutions are prepared to expend time and resources on these
documents because they believe them to be useful and that they will
further the cause that the participants are interested in (and maybe make
them some cash through new products or new research grants or whatever).
They don't do it to extract a few shekels from each punter that wants to
read them. A "compare and contrast" with the ITU/ISO and the resulting
popularity of IETF protocols relative to the ISO ones is left as an
exercise for the reader. :-)

Tatty bye,
Jim'll
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
Jon "Jim'll" Knight, Researcher, Sysop and General Dogsbody, Dept. Computer
Studies, Loughborough University of Technology, Leics., ENGLAND. LE11 3TU.
* I've found I now dream in Perl. More worryingly, I enjoy those dreams. *