Thanks for your suggestions, Cathy - will try to add them to my own
practice - here's something we prepared earlier... (Its complexity
reflects the size of our institution.)
Best,
Sandra
Malone, Cathy wrote:
> I wanted to add my two pennyworth to the evaluation thread...
>
> Like many others in this group I have found this thread to be very relevant as we are increasingly asked to justify the services we provide.
> It is also very timely as I have been focusing for a few months now on our internal record keeping of study support sessions. I have been experimenting with getting the students more involved in completing these records and have been re-examining the purpose of these sessions. How do I measure success? And what exactly is it we want to track & record?
> I think student satisfaction is an important part of the picture (there's another thread here re significance of affective needs in student focused support) but it seems a leap to move from this to retention figures. I feel as if I am missing a stage somehow.
> With the record keeping I came to the conclusion that in these sessions I wanted to influence students' ideas, their understanding and conceptions of learning and (usually) writing, and to influence or change their practices (academic, literacy, study). Giving students time to write in study support, and requiring them to complete part of the sheet themselves, proved very illuminating. There's also the advantage of getting the students writing in the session, which I think can be very empowering: it alters the dynamic and gives you an immediate insight into their perspective; you find out just where they're at. For multiple users it also provides a record and an informal evaluation of how they are proceeding (or not!). Is anything changing as a result of this input, these discussions? Or are their questions and practices unaltered over 5 weeks? There's a real sense with some students that you can track progress and map increasing articulacy and awareness of themselves in the process, whereas with others there's a sense of stagnation and a jotting down of buzzwords.
>
> I have whittled a number of questions down to asking (& requiring the students to write the answers):
>
> What's your question?
>
> Current practice
>
> What will you do next?
>
> They are semi-guided through these questions. I hope to use this more consistently with a wider group of students next year and to use the results to inform the development of different forms of evaluation for different users: one-off users compared to multiple or regular users, workshop attendees compared to users of 1:1 support.
> I doubt any of this is particularly innovative but it is providing me with very valuable qualitative data which unpacks student engagement in our services and allows us to examine what they take away from our sessions rather than focus on what is delivered to them.
>
> So that's an overview of what we're doing at the moment and, as you can see, a work in progress.
>
> Looking forward to reading the project.
> Regards
>
> Cathy Malone
> Sheffield Hallam
>
>
> ________________________________________
> From: learning development in higher education network [[log in to unmask]] On Behalf Of Julia Braham [[log in to unmask]]
> Sent: Thursday, September 03, 2009 6:06 PM
> To: [log in to unmask]
> Subject: Re: Evaluation Project
>
> Hello,
> This is an interesting question and very timely. There has been some discussion on the lists over summer about 'impact evaluation of study support' and evaluation of learning to learn modules, but the response to Nick Hooper's call in June suggests that we are all still skirting around this question.
> For some reason we rarely begin an academic year at Leeds thinking strategically about evaluation - it is usually towards the end of the year when we try to make sure we have captured plenty of 'how was it for you' evidence to populate annual reports. Our evaluation mechanisms seem relatively simplistic and still focus on questionnaires given to students attending workshops and drop-ins. We have tried to find out from students the extent to which attending workshops may have helped them change their practice, and what they found useful (or not) about the workshop, and we use their responses to adapt our delivery - if they are consistent enough!
> A few years ago we distributed an end-of-year evaluation to students who had attended our drop-in sessions. One question asked if students felt that attending drop-in sessions had encouraged them to remain on their course. The responses to this were significant enough to be converted into FTEs and inserted into a report that argued that our drop-in facility had maintained x amount of income for the university. The extent to which this figure was quoted (and not interrogated) was surprising.
> When we launch a new learning resource or website, we usually run focus groups to capture impressions and again, take account of responses in resource development. We ask different questions and seek different answers depending on who needs to use the information our evaluation provides us with. We know that our students want to see the continuation of one-to-one drop-in sessions and that they find them effective, but others ask questions about their cost effectiveness.
> I would suggest, from the responses to the email question, that many other institutions are the same as us. We know that learning development is an activity that is difficult to evaluate, and we all seem to be unsure how to grasp the nettle to produce meaningful and reliable data (whatever that is), and then know how best to use the information that it provides.
> Does anyone feel that they are using more sophisticated evaluation mechanisms? Are there any more innovative examples of current evaluation practice being used? It will be interesting to see the outcome of the Queens project, but I can see why the Project Leaders would benefit from a clearer understanding of what we are all currently doing to evaluate our delivery.
>
> Julia Braham
> Senior Academic Skills Adviser
> Skills@Library
> University of Leeds
>
> ________________________________________
> From: learning development in higher education network [[log in to unmask]] On Behalf Of Paula Moran [[log in to unmask]]
> Sent: 17 August 2009 14:25
> To: [log in to unmask]
> Subject: Evaluation Project - Small Grant Funding
>
> Hi
> Project DEVELOPMENT OF A GENERIC EVALUATION PLAN AND EVALUATION
> TOOLS FOR LEARNING DEVELOPMENT ACTIVITY
>
> We are currently working on our literature review for the above project and
> after initial discussions on our first draft with our mentor we think it would be
> very beneficial to include a section that scopes current practice within the
> LDHEN network.
> I would be very grateful if you would be willing to provide me with information
> on the rationale and/or practice you use for learning development activities.
> Also it would be useful if you could highlight what you have experienced as
> most effective and any issues you have experienced in the area of evaluation
> of learning activities which may include one-to-one provision, workshops and
> resources.
>
> Any feedback on the above or other information you think would be relevant
> will be much appreciated.
>
> Best wishes
>
> Paula
>
--
Sandra Sinfield
University Teaching Fellow
_______________________________________________________________________
Coordinator LDU & LearnHigher CETL www.learnhigher.ac.uk
LC-M10 London Metropolitan University, 236-250 Holloway Road, N7 6PP.
(020) 7 133 4045
www.londonmet.ac.uk/ldu
_______________________________________________________________________
Companies Act 2006 : http://www.londonmet.ac.uk/companyinfo