A few days ago I sent the following to lis-link:
----------------------------------------------------------------------------------------
I would be very interested in lis-linkers' experiences of 2 particular
monitoring methods:
1. Slips of paper wrapped around journals which are broken by the
user -
Were these an accurate indicator of use?
Did they discourage use?
Were they frequently defaced or damaged?
2. Questionnaires inside journals -
What questions did you ask?
Did you request users' names or initials?
Do you think it was an accurate indicator?
Were questionnaires defaced or results rendered worthless?
---------------------------------------------------------------------------------------
I received 10 replies. I am summarising them in two parts:
1: Preferences for the different means of reviewing journal usage
2: Advice on best practice in reviewing journal usage. None of this
advice is my own; rather, it is a summary of the good ideas received in
the replies.
1 - PREFERENCES
***Slips of paper which are broken by the user***
This has been tried by only 1 respondent. They favoured this as the
most effective way of checking if a journal had been used at all.
***Questionnaires inside or on the covers of journals***
This has been tried by 7 respondents.
4 thought the results unhelpful:
“[slips] fell off, were torn off and generally ignored”
“Librarianship lecturers subverted the system by ticking everything in
sight several times”
2 thought the results offered a little information
1 thought the survey was reasonably effective:
“not a perfect method, but I accumulated results that were useful”
3 other methods were suggested:
***Re-shelving counts***
2 university libraries supported this method: counting the frequency
with which a journal is reshelved as a guide to usage.
***Copyright forms***
A hospital authority library gathered usage statistics based on the forms
completed by users for each photocopy.
***Cancellation list***
This was used by 3 respondents (2 in health authorities, 1 in a
university), and endorsed by all of them. A list of current journals is sent
to all members of staff, or a list of subject-relevant journals is sent to all
teams. Staff or teams then rank the journals in some way according to
their desirability.
Ranking methods included:
Straightforward ranking in order of usefulness
3 levels – essential, useful, not useful
4 levels – core title, major topic of interest, background interest, cancel
The returned data was treated in different ways to decide on a list of
journals to cancel: through meta-analysis, by cancelling everything with
no votes, or by taking ratings by library staff into account when making
a decision.
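As a hypothetical illustration of the "chop everything with no votes" rule, the tally could be done with a short script (journal names and replies below are invented examples, not data from the survey):

```python
from collections import Counter

# Each staff member's return: the titles they marked "essential" or
# "useful" under the 3-level ranking scheme. Invented examples.
returns = [
    ["Journal A", "Journal B"],
    ["Journal A", "Journal C"],
    ["Journal B"],
]

# The full current subscription list circulated to staff.
subscriptions = ["Journal A", "Journal B", "Journal C", "Journal D"]

# Count how many staff gave each title a positive rating.
votes = Counter(title for reply in returns for title in reply)

# Candidates for cancellation: titles that received no votes at all.
cancel = [title for title in subscriptions if votes[title] == 0]
print(cancel)  # prints ['Journal D']
```

In practice the library would combine such a tally with library staff ratings, as the respondents suggested, rather than cancel on zero votes alone.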
***Combination***
3 respondents suggested that some combination of two or more of the
above measures was most effective.
2: BEST PRACTICE
If you are following the ***Questionnaire*** approach, follow these
tips:
a. Use some popular journals as controls. This will allow you to test the
reliability of the process: if periodicals which you *know* to be popular
receive low scores, or lower scores than some more specialised journals,
you have an idea how much salt to take the results with.
b. Try to attach slips to journals securely!
c. Write the journal name on the back of the slip because it will still fall
off anyway.
d. Accept that this will be very time-consuming.
e. Remember just how large a pinch of (Cheshire) salt you're taking
these results with, and try other methods simultaneously.
f. If you are deciding between the questionnaire approach and the
slips-around-the-journal approach, remember that the latter is more
effective at identifying zero-usage journals, while questionnaires are
better at identifying very high-usage journals, given limited staff time.
A combination of the two will offer more balanced results.
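The control-journal check in tip (a) can be sketched as a quick sanity test (titles and tick counts below are invented for illustration):

```python
# Hypothetical questionnaire tick counts per journal.
scores = {
    "Popular Weekly": 3,    # a control: known to be heavily used
    "General Review": 2,    # a control: known to be heavily used
    "Niche Quarterly": 9,   # a specialised title
}

# Controls are the journals we already *know* to be popular.
controls = ["Popular Weekly", "General Review"]

control_scores = [scores[t] for t in controls]
other_scores = [s for t, s in scores.items() if t not in controls]

# If known-popular controls score below the specialised titles, the
# survey results should be taken with a large pinch of salt.
suspect = max(control_scores) < max(other_scores)
print(suspect)  # prints True: controls under-scored, so be wary
```

Here the controls score 3 and 2 against a specialised title's 9, so the check flags the survey as unreliable.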
If you are using ***re-shelving counts***, follow these tips:
a. Discourage users from reshelving periodicals themselves through
adequate signage.
b. Make your job easier by encouraging users to place used periodicals
on a single trolley, clearly marked and strategically placed.
c. Try other methods simultaneously.
When you are sending out ***cancellation lists***, simultaneously:
a. Give staff information on subscription prices; it will encourage them
to be responsible.
b. Remind them of the institution's mission and/or aims regarding the
journals collection.
c. Remind them of the possibility of obtaining articles in cancelled
journals by inter-library loan. You can even remind them here of the
availability of abstracts in the library.
d. Offer an invitation to suggest new journals.
Then harry them for their replies, and/or warn them that failure to reply
will lead to cancellation with no comebacks.
I hope this is informative. If anyone has queries, or would be interested
in our own experiences come June, let me know.
Thanks again to everyone who replied – Stephen Howe, Nicky
Matthews, Julie Hitchen, Chris Smart, Linda Berube, Chris Martindale,
John Makin, Christine Bradshaw, Claire Ryan and Claira Bannon.