Thanks Bridget, you hit the nail on the head with your comment; this is
exactly the issue I'm interested in. If, as you say, museum learning is
about open-ended discovery, do learning outcomes have to be loosely
expressed? And if they are loosely expressed, to what extent is it
possible, during development, to test the efficacy of a design?

In another reply Mark Elsom-Cook suggested that people generally
simplify the problem down to one of two alternatives: specifying
objectives so trivial that they are measurable but largely meaningless,
or asking people to give their opinion about how effective the site
might be for a specified target audience (thanks Mark). But in an
environment where, as Joe Cutting suggests (below - thanks Joe), funding
is increasingly dependent on evidence of effectiveness, how viable is
either of these strategies?

It seems to me that even if a learning outcome is expressed as loosely
as "develop skills in comparison of objects", the designer still needs
to think through what range of skills is likely to be relevant, and then
how to scaffold the development of those skills. Maybe what then gets
tested during development is this scaffolding, not the learning outcome
itself. So if the activity were intended to develop comparative skills,
it might include identifying features or attributes, classifying or
clustering features, using tables or interaction matrices to pair up
features, or rating or weighting scales. During formative testing one
might then look at which techniques the visitor uses, their rationale
for selecting certain techniques, what conclusions they are able to draw
from using them, and how well those conclusions are supported by
evidence derived from the activity. At the end the visitor may still not
be able to make very powerful comparisons, but there would be some
evidence of how easy they found the built-in scaffolding to use and how
helpful they found it, which the designer could use to revise the
activity and/or reassure the budget holders that the product is a good
investment.
Regards Stephen
-----Original Message-----
From: Museums Computer Group [mailto:[log in to unmask]] On Behalf Of
Bridget McKenzie
Sent: 01 December 2006 11:16
To: [log in to unmask]
Subject: Re: Learning design
Hello
This is an interjection, I hope a little bit relevant, just to raise a
thought about instructional learning design in a context that favours
constructivist learning.
Museum learning is about open-ended discovery, so learning outcomes will
often be loosely expressed, e.g. 'develop skills in comparison of
objects' or 'develop tolerance of other cultures'. Therefore it's harder
to measure the success of your strategies. You may be able to discern
changes in learners if you evaluate a sustained project, including
museum visits, experimental activities and use of the linked web
resources. Or formative evaluation can work if the resources are seen
being used in a discovery learning context. If you haven't got the
time/money to do this, you can only guess that open-ended tools would be
successful in reaching the desired outcomes if people seem to like them
and are using them.
It could be tempting to design more and more instructional or didactic
web tools, simply because you know you're supposed to evaluate them and
it's easier to assess whether users 'got it' or not. Much better to
support an expansion of e-learning as dialogue, so that discovery
happens online in interactions between people with questions, people
with insights and cultural artefacts. The proof of that kind of pudding
is definitely in the eating, and not only divined in an expensive
evaluation.
Bridget
----- Original Message -----
From: "Joe Cutting" <[log in to unmask]>
To: <[log in to unmask]>
Sent: Friday, December 01, 2006 10:34 AM
Subject: Learning design
> Stephen,
> >>
> Hi, does anyone have experience of using user-centred design to create
> either exhibitions/displays and/or web sites please? I would be really
> interested to find out how useful you found this approach.
> >>
> Dear colleagues, have you built learning activities into your Museum
> web site? Was it important for you to be able to demonstrate the
> effectiveness of your designs? Did you test the designs during
> development, and/or after they were completed?
> >>
>
> This is pretty much established best practice and becoming more
> widespread, as funders are demanding summative evaluation on
> exhibitions and projects after completion. They're also demanding that
> institutions specify objectives and audiences upfront with their
> funding applications - although I would agree that these can easily
> get lost along the way.
> It tends to be much easier to do formative evaluation on exhibits and
> micro-sites than whole-institution sites, due to the difficulty of
> specifying audiences and objectives for a whole-institution site.
> The amount of evaluation done tends to depend on the attitudes and
> resources of the institution, but if you're looking into this, your
> main issues are going to be:
> Institutions don't tend to release the results of evaluation unless
> they're completely positive. This is for a variety of political
> reasons connected with the press and funders.
> There's also very little movement by institutions to publish anything
> about their development methods and experiences - in general it's just
> not seen as a priority. There are honourable exceptions like the
> Tate's multimedia tour page
> http://www.tate.org.uk/modern/multimediatour/re_keyfindings.htm.
> There are also bits and pieces by Ben Gammon scattered around the web,
> like this one:
> http://ukupa.org.uk/events/presentations/science_museum.pdf
> (google "Ben Gammon" for more).
>
> Given that there's a lot of work going on, your problem is unlikely to
> be finding examples - it's much more likely to be getting people to
> tell you about them in any detail, particularly if user testing wasn't
> done or produced "bad" results.
> My recommendation would be either to get hold of a list of recent
> projects from a major funder like HLF or Wellcome and then go and
> interview a selection of the project managers, or to pick one project
> which did use a lot of user testing and study it in detail.
> To give you an idea, one project I worked on last year did around 20
> formative evaluation studies in 4 months of development, so there's a
> lot to get your teeth into.
>
> All the best with your project
>
> Joe
>
> Joe Cutting
> Computer exhibits and installations
> www.joecutting.com
> The Fishergate Centre, 4 Fishergate, York, YO10 4FB
> 01904 624681
>
> As of 30th October 2006 I have a new office so
> please note my new address and phone number
> **************************************************
> For mcg information and to manage your subscription to the list, visit
> the
> website at http://www.museumscomputergroup.org.uk
> **************************************************
>
>