Hi folks,
It might be interesting to hear from colleagues who gather feedback from students about their learning experience on a course how they USE the information so generated, e.g.
-- Do they share it with the teaching colleagues involved in the course?
-- Do they find themselves in a position to use it in improving the course (and maybe other courses also), either for the present students or future ones?
-- Do they share the feedback with the students who provided it (and with their colleagues who didn't), and let them know what is happening as a result?
-- What problems (if any) do they encounter in trying to use the information gathered?
Hi Hannah,
Happy to talk about how we do evaluation at UoB:
· We have two main feedback streams, one for program-based initiatives, and the other for ad-hoc or one-off sessions:
o To measure initiatives we have a “start-of” and “end-of” version, with questions around confidence in, and perceptions of, academic skills and motivation to develop them. We compare the overall trend between start and end in a rudimentary progression analysis.
o One-off sessions and “Feedback Fortnight” (an annual two-week period, usually in late November, when we solicit feedback from the students who come to our drop-in service) have a single questionnaire which focuses more on the purpose and motivation for their visit and how they heard about us (to check the effectiveness of our marketing).
o Both questionnaires check whether the students feel they have learned anything new, how effective they felt the session(s) were, and their opinion of the tutor’s approach. Whenever asking about sessions, we also ask about the effectiveness of any materials used.
· We do indeed gather both qualitative and quantitative data.
· We generally collect our data in hard copy, but we use a product called Evasys, which allows for semi-automated creation and scanning of 200 double-sided questionnaires in ~3 minutes. It provides an electronic report with bar charts, averages and the number of respondents per question (for the quantitative questions), plus a scan of all the qualitative responses in a single PDF. (As a side note, I have recently looked into whether it would be difficult to do electronic distribution through Evasys, as we have that option; so far it seems very easy, but we haven’t rolled it out for any courses as yet. The primary concern is that putting the link on a VLE or mailing list would garner fewer responses, so we’re considering using electronic distribution purely for when the ‘end of initiative’ sessions are woefully under-attended, as a means of getting a response rate above 20% relative to the ‘start of initiative’ questionnaire.)
In terms of response rate for hard copies, we generally get 95% of the attending students to respond, for either initiatives or single sessions, although a fair number of those (no percentages here, but I’ve QC’ed many stacks of these forms, as I’m pretty much the questionnaire muggins of the department :)) will have no responses to any of the questions overleaf when we print double-sided. Feedback Fortnight is generally 100% of those we ask (I’ve never heard of a student refusing a request to fill one in), but this very much relies on the number of student drop-ins and the memory of the team member seeing that student.
As you might expect, given the bystander effect, the fewer students there are in a group being asked to fill in a questionnaire, the higher the response rate.
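The “start-of/end-of” comparison Barry describes above can be sketched in a few lines. This is a minimal illustration of that kind of rudimentary progression analysis, not anything produced by Evasys; all scores below are hypothetical 1-5 Likert responses invented for the example.

```python
# Sketch of a simple start-vs-end progression comparison.
# The scores are hypothetical, standing in for responses to a
# question like "How confident are you in your academic skills?"
from statistics import mean

start_scores = [2, 3, 3, 2, 4, 3, 2, 3]  # start-of-initiative questionnaire
end_scores = [4, 4, 3, 4, 5, 4, 3, 4]    # end-of-initiative questionnaire

start_avg = mean(start_scores)
end_avg = mean(end_scores)
shift = end_avg - start_avg  # positive = reported confidence rose overall

print(f"Start mean: {start_avg:.2f}")
print(f"End mean:   {end_avg:.2f}")
print(f"Shift:      {shift:+.2f}")
```

Note that this compares group-level means rather than matched individual responses, which fits the anonymous, paper-based collection described above.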
If you (or anyone else) have any questions, I’d be happy to answer them (to the best of my ability, at least).
Kind regards,
Barry Poulter
MA, TEFL-i, AFHEA
Professional and Academic Development Trainer
Professional and Academic Development Team
University of Bedfordshire
Luton
01582 489027 (ext. 9027)
Visit the University of Bedfordshire Repository for free, open access to our research.
From: learning development in higher education network [mailto:[log in to unmask]] On Behalf Of Hannah Jones
Sent: 20 July 2018 14:13
To: [log in to unmask]
Subject: Re: Gathering Student Feedback on In-Sessional
Hi Jennie,
Thank you very much for your response - extremely useful and interesting.
Would you be able to tell me more about your electronic register - is this part of an automated booking system? I'd love to know which system you use for this.
Hannah
From: Jennie Blake <[log in to unmask]>
Sent: 20 July 2018 11:50:11
To: Hannah Jones; [log in to unmask]
Subject: RE: Gathering Student Feedback on In-Sessional
Hi Hannah!
We gather a lot of data. :) For face-to-face feedback, our ideal is:
1) Gather “quick impact” information via iPads. These are a few questions that take no more than two minutes to answer. They cover general evaluation (did you like it?) and touch on impact (will you change your behaviour? do you feel your understanding has improved?). We have a quite high response rate for this, as we stand there encouraging them (it’s certainly over 80%, and I think it’s probably closer to 95-100%).
2) We send out a longer feedback survey to all attendees. This goes much further into evaluation of the session and also asks for feedback on additional topics of interest and how the attendee heard about the service. The response rate for this is much lower (between 13% and 18%, although sometimes over 20-25%), as is typical for an online survey, but the data is useful. More qualitative data is gathered through this one. It is anonymous. We also see enough students (5,000+) per year to make the low response rate less of a problem.
3) We gather data on every attendee when they check in via our electronic register. This automatically pulls their school, faculty and student ID. We can then use the student ID to gather more information if necessary (EU/Home student/etc).
4) Finally, we sometimes do additional interviews and focus groups for evaluation and research. These all feed in to the work we do as well.
We have found all of this data very useful. We used it to run an impact study recently that showed a correlation between attendance at multiple face-to-face sessions and getting a first at the end of the student’s degree programme! We also regularly use the feedback to identify areas for new development and to inform the refresh and review of sessions.
Happy to answer any questions if people want further detail,
Jennie Blake
Jennie Blake | SFHEA | Learning Development Manager | Alan Gilbert Learning Commons | The University of Manchester Library | The University of Manchester | Oxford Road | Manchester | M13 9PP | Tel x7759811 (internal) 07876847305 (external)
From: learning development in higher education network [mailto:[log in to unmask]] On Behalf Of Hannah Jones
Sent: 20 July 2018 11:00
To: [log in to unmask]
Subject: Gathering Student Feedback on In-Sessional
Dear All,
Here at the Centre for English Language and Foundation Studies at Bristol, we are thinking about changing the way we gather student feedback on our in-sessional academic language & literacy provision for PGT students. I was hoping to gather some insights and ideas from people working in other contexts as part of this review.
If any of you would be kind enough to provide responses to the following questions, I would be very grateful:
If anyone is interested, these are my answers:
Thank you very much in advance for your responses – I’ll be happy to collate results and share with anyone interested.
Best wishes,
Hannah Jones
To unsubscribe from the LDHEN list, click the following link:
https://www.jiscmail.ac.uk/cgi-bin/webadmin?SUBED1=LDHEN&A=1