Dear All,

 

First, many thanks to everyone who replied (on or off the mailing list) to my email about benchmarking RDM training a while ago. I apologise profusely that it took me so long to summarise all the replies.

 

In short: several people expressed their interest in agreeing on some sort of feedback form / benchmarking mechanism. I therefore suggest that we start a mini working group aiming to agree on a minimal set of questions we would like to ask in our RDM training feedback forms, which could help us benchmark our training. I have just made a request to set up a new Jisc mailing list to facilitate our conversations. If you would like to be added to that list, please do let me know.

 

I have also summarised all the responses received so far here (feel free to edit): https://docs.google.com/document/d/1s57Eop4hViExTpqBxaAUsRI1W-IXekZPxIHwgLLRBLg/edit?usp=sharing and I’m copying the summary below my email.

 

Thank you again for all your help – I find the List (all of you) wonderfully supportive and helpful.

 

All best,

 

Marta

 

Benchmarking Research Data Management training

 

Ideas and thoughts about the assessment of RDM training, collected through posts to the [log in to unmask] mailing list

 

 

Before the workshop

Feedback after workshops

Resources

 

Before the workshop

Thoughts on branding and advertising for RDM training

 

Titles of RDM / Scholarly Communication training

 

Use clear titles: state, right in the workshop title, what specifically attendees will learn and why it matters. Following this practice at one of the institutions resulted in higher workshop attendance than before. Examples:

·        From “Introduction to Authors’ Rights” and “You Know What You Write, But Do You Know Your Rights?” (yawn) to “What to Know Before You Submit to a Journal, or Sign Its Contract”

·        From “Why and How to Deposit to CUNY Academic Works” (repository workshop) to “Your Google Scholar Profile: Why to Create It and How to Fine-Tune It.”

Advertising the content

Specifying in the workshop description exactly which topics will (and will not) be covered helps to avoid disappointment. The purpose and aims of the workshop can also be reiterated at the beginning of the session, and the feedback form can then ask whether the workshop met attendees’ expectations.

 

Many respondents said that attendees indicated a preference for discipline-specific training. This suggests that both the content and the training description should be tailored to the audience: in terms of language (Arts and Humanities researchers might not be comfortable with the term ‘data’, and words like ‘sources’ or ‘digital information’ might be preferred), in terms of the content itself (discipline-specific examples, tools and resources), and in terms of the desired learning outcomes.

 

Feedback after workshops

General advice

 

Making feedback forms as generic as possible allows benchmarking against other training provided by the same organisation. Generic feedback could also make it possible to compare feedback across independent organisations.

 

A question was raised as to whether feedback on RDM training from US and European organisations can be compared. My initial reaction is that it can, as long as the content is comparable and the feedback is generic enough, but I would welcome thoughts from other members of the List.

 

Several respondents discussed the advantages and disadvantages of on-paper versus online feedback from workshops, summarised below.

 

 

On-paper feedback

·        Advantages: higher response rate

·        Disadvantages: hand-written responses need to be transcribed to a digital format; participants don’t have the time to reflect on their learning and rush their feedback (less time spent on open questions)

Online feedback

·        Advantages: responses are available digitally straight away; can be used to follow up with participants some time after the workshop (long-term learning outcomes)

·        Disadvantages: low response rate

 

One respondent said that they were now reverting to on-paper feedback due to the very low response rate for online feedback.

 

Questions asked in feedback forms

According to the guidance accompanying DataONE’s Education EVAluation (EEVA) tool, questions asked in feedback forms should address all four levels of Kirkpatrick’s evaluation model:

·        Reaction (a measure of customer satisfaction)

·        Learning (extent to which participants change attitudes, increase knowledge, and/or increase skill as a result of attending a program)

·        Behavior (the extent to which a change in behavior has occurred because someone attended a training program)

·        Results (measuring the final results that occurred because a person attended a training session)

 

Behaviour-change questions are difficult to address immediately after a workshop, and can easily be confounded with Results.

 

The questions most commonly asked in feedback forms by those who responded on the mailing list are:

·        Questions about participants' background

·        To what extent did this workshop meet your expectations / needs? (multiple choice question)

·        What was good / effective about the session?

·        How could the session be improved?

·        What changes do you plan to make to your practice as a result of the session?

·        Would you recommend this workshop to your colleagues? (Yes / No / Not sure)

 

Long-term behavioural change

There were many follow-up questions about assessing long-term behavioural change resulting from training attendance. Several questions about long-term follow-up surveys were put to the list, but no one replied to say that they were doing any longer-term follow-up studies with their participants.

 

Some people noticed a correlation: people who attended workshops tend to use institutional repositories more often (as measured by the number of data deposits).

 

Many respondents indicated that they had been thinking about asking follow-up questions months later, but feared that the response rate would be extremely low and that the cost of conducting these surveys would be disproportionately high relative to the potential gain.

 

Resources

Bill Michener’s slides from IDCC17 about DataONE’s Education Evaluation tools:

http://www.dcc.ac.uk/sites/default/files/documents/IDCC17~/Michener_IDCC_EVA_Presentation_compressed-2.pdf

DataONE’s Education Evaluation tools:

http://www.dataone.org/education-evaluation

 

From Masud Khokhar:

·        When DMAOnline (http://dmao.info/) runs as a production SaaS (May/June 2017), we would be able to provide sector-wide training benchmarks based on data provided by institutions. Our thinking is to allow institutions to add their RDM training details, internally compare their similarity based on algorithms and user-initiated feedback, and provide quantitative (and some qualitative) data back at sector level on training outcomes in a more systematic fashion. We are still a few months away from that but are working on it at full speed.

Marta Teperek, PhD

Research Data Facility Manager

Cambridge University Library / Research Operations Office

e-mail: [log in to unmask]

tel. 01223 333138 (Friday - Wednesday)

tel. 01223 761652 (Thursday)

Twitter: @martateperek

 

You can arrange a meeting with me here

 

 

Sign up to our monthly Research Data Management newsletter: http://www.data.cam.ac.uk/newsletter/signup

 

From: Research Data Management discussion list [mailto:[log in to unmask]] On Behalf Of Marta Teperek
Sent: 31 March 2017 23:29
To: [log in to unmask]
Subject: Re: Benchmarking RDM training

 

Dear All,

 

Just wanted to thank everyone who responded to my email.

I’m going to be at RDA next week, but I will consolidate all the replies, post a summary here, and get in touch with the people who expressed an interest in collaborating on this.

 

Best wishes,

 

Marta

 

 

Marta Teperek, PhD

Research Data Facility Manager

Cambridge University Library / Research Operations Office

e-mail: [log in to unmask]

tel. 01223 333138 (Friday - Wednesday)

tel. 01223 761652 (Thursday)

Twitter: @martateperek

 

You can arrange a meeting with me here

 

 

Sign up to our monthly Research Data Management newsletter: http://www.data.cam.ac.uk/newsletter/signup

 

From: Research Data Management discussion list [mailto:[log in to unmask]] On Behalf Of WHYTE Angus
Sent: 28 March 2017 11:50
To: [log in to unmask]
Subject: Re: Benchmarking RDM training

 

 

I just wanted to flag up some work in DataONE that Bill Michener presented at IDCC17. His slides are available here:

http://www.dcc.ac.uk/sites/default/files/documents/IDCC17~/Michener_IDCC_EVA_Presentation_compressed-2.pdf

 

They have developed best practices and guidelines for RDM training evaluation, and a survey tool called EEVA.

http://www.dataone.org/education-evaluation   

 

best wishes,

 

Angus

 

Dr Angus Whyte

Snr Institutional Support Officer

Digital Curation Centre

University of Edinburgh

+44-131-650-9986

skype: angusawhyte

 

The University of Edinburgh is a charitable body, registered in Scotland, with registration number SC005336.

 

 

From: Research Data Management discussion list <[log in to unmask]> on behalf of Marta Teperek <[log in to unmask]>
Reply-To: Research Data Management discussion list <[log in to unmask]>
Date: Monday, 27 March 2017 at 08:34
To: "[log in to unmask]" <[log in to unmask]>
Subject: Benchmarking RDM training

 

Good morning everyone,

 

Our Cambridge team has been providing a lot of training to the research community, and we systematically collect feedback to ensure that our training constantly improves.

However, I would really like to be able to compare the training we provide with training across the sector. So I have two questions:

 

1.      Is anyone aware of any publicly available benchmarks for RDM training? I’d really like to know how good (or bad) our training is compared with sector-wide averages.

2.      Would anyone be interested in comparing training results? What I have in mind is sharing the feedback collected so far so that we can compare and improve our training provision, and perhaps also agreeing on or developing some standards for training assessment, so that such comparisons are easier in the future. If you’re interested, please email me off the list ([log in to unmask]).

 

Best wishes and I look forward to hearing from you,

 

 

Marta

 

Marta Teperek, PhD

Research Data Facility Manager

Cambridge University Library / Research Operations Office

e-mail: [log in to unmask]

tel. 01223 333138 (Friday - Wednesday)

tel. 01223 761652 (Thursday)

Twitter: @martateperek

 

You can arrange a meeting with me here

 

 

Sign up to our monthly Research Data Management newsletter: http://www.data.cam.ac.uk/newsletter/signup