Hi James,
Late to the party, but on dates: we've developed a fairly expressive syntax for data entry of historical dates, which is a frequent issue for us. Here's a paste of the help text for our bog-standard date field:
Date format: dd-mm-yyyy. e.g. '25-10-1970' (on that day), '10-1970' (sometime that month) or '1970' (sometime that year).
Date ranges: e.g. '1970 to 1975' (started in 1970 and finished in 1975).
Uncertainty: e.g. 'c. 1972' (circa 1972), '?1972' (probably in 1972 but might be another year), '?1970 to 1975' (sometime between 1970 and 1975).
As you can see, where possible we try to allow for the use of relatively natural language phrases. We store this information in a way that allows us to perform searches which tolerate, or are aware of, uncertainty where it exists.
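To give a flavour, the syntax above might be parsed into earliest/latest bounds along these lines. This is a rough sketch only, not our actual Django implementation, and the circa padding of five years either side is an arbitrary choice for illustration:

```python
import calendar
from datetime import date

CIRCA_YEARS = 5  # arbitrary padding; each project should agree its own value


def _bounds(text):
    # '1970' -> whole year, '10-1970' -> whole month, '25-10-1970' -> one day
    parts = [int(p) for p in text.split('-')]
    if len(parts) == 1:
        (y,) = parts
        return date(y, 1, 1), date(y, 12, 31)
    if len(parts) == 2:
        m, y = parts
        return date(y, m, 1), date(y, m, calendar.monthrange(y, m)[1])
    d, m, y = parts
    return date(y, m, d), date(y, m, d)


def parse_date(text):
    """Return (earliest, latest, qualifier) for strings like
    '25-10-1970', '10-1970', '1970', '1970 to 1975', 'c. 1972', '?1972'."""
    text = text.strip()
    qualifier = 'exact'
    if text.startswith('c.'):
        qualifier, text = 'circa', text[2:].strip()
    elif text.startswith('?'):
        qualifier, text = 'uncertain', text[1:].strip()
    if ' to ' in text:
        start, end = text.split(' to ', 1)
        earliest, _ = _bounds(start.strip())
        _, latest = _bounds(end.strip())
    else:
        earliest, latest = _bounds(text)
    if qualifier == 'circa':  # widen the window by the agreed padding
        earliest = earliest.replace(year=earliest.year - CIRCA_YEARS)
        latest = latest.replace(year=latest.year + CIRCA_YEARS)
    return earliest, latest, qualifier
```

Storing the earliest/latest pair alongside the original string is what makes the uncertainty-aware searching straightforward.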
In some projects we've used more advanced versions of this, adding, for example, authority lists of keywords which correspond to named historical periods and so on.
Happy to make available the code for this (implemented in Django) if anyone's interested.
All best
Paul
-----Original Message-----
From: Museums Computer Group [mailto:[log in to unmask]] On Behalf Of James Morley
Sent: 04 November 2013 19:40
To: [log in to unmask]
Subject: Re: [MCG] Displaying imprecise date and location information - best practice and examples?
Thanks everyone for the answers and additional thoughts. What a can of worms!
Joe - looks interesting. Have you come across date_parse (see
http://stackoverflow.com/questions/15350309/heuristic-fuzzy-date-extraction-from-the-string),
which is supposedly very good on short pieces of text - for example if you have a date field, but the values are inconsistently formatted? I was trying it, though, on longer text input such as "The Prime Minister, Mr Winston Churchill, lights a cigar as he and General Montgomery set out in a jeep to go inland. Photograph taken during the Prime Minister's short visit to the Normandy coast, 12 June 1944." and was surprised at how frequently it failed to spot even something as obvious as that. It also always tries to return a precise date, which rather defeats the object of what we're talking about here.
Ken - didn't realise you were on here :) Decimal places are something I had thought of, and I wonder if there are any other examples in mainstream use?
I'm guessing you could do something roughly similar using trailing zeros on Unix timestamps, but I can certainly see a few problems - for example, if something were located precisely at 51.000000000, 1.00000000, or happened at exactly 09/09/01 @ 1:46:40am UTC (I'll let you look that one up!). I personally prefer the approach of Flickr and Google, which both store a precise measurement together with a measure of accuracy/granularity/zoom.
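As a sketch of that 'precise value plus granularity' idea - the labels and interval widths below are hypothetical, not Flickr's or Google's actual scheme:

```python
from datetime import datetime, timedelta

# Hypothetical granularity labels mapped to the half-width of the
# interval a stored timestamp should be taken to cover.
GRANULARITY = {
    'day': timedelta(days=1),
    'month': timedelta(days=31),
    'year': timedelta(days=366),
    'decade': timedelta(days=3660),
}


def implied_interval(stamp, granularity):
    """Expand a precise stored timestamp into the search interval its
    granularity implies, so a record stamped 'June 1944' still matches
    a query for 12 June 1944."""
    half = GRANULARITY[granularity]
    return stamp - half, stamp + half
```

The stored value stays precise (and sortable); the granularity label tells you how seriously to take it.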
Sarah - that sounds like the most pragmatic solution I've heard, though I can think of scenarios where you'd probably have to be over-cautious and thereby lose some of the relative precision other solutions might offer (e.g. 'post-War' - 1945-19..?)
Florian - really interesting examples, thanks. Lots of food for thought!
Richard - I've done a few experimental things in OpenLayers using aggregate pins, custom markers to depict accurate vs vague locations, etc. The other possible approach is to show and hide pins of differing levels of accuracy at different zoom levels, but I've yet to find anything really elegant. And your last point, I think, is proven by this discussion!
Food for thought.
James
On Mon, Nov 4, 2013 at 5:29 PM, Richard Light <[log in to unmask]>wrote:
> It's an interesting reflection on the quality of machine-processible
> data that we are generating as a community, if it's necessary to use
> natural language processing or regular expressions to extract usable
> date ranges from CMS [1] data. The advice to record earliest and
> latest date in a consistent processible format has been around since
> the 1970s, and remains in SPECTRUM to this day.
>
> Of course, you still have to distinguish between duration of an event
> and uncertainty over when exactly it happened. The CIDOC CRM SIG have
> provided guidance on date recording [2] which aims to address this
> issue as "ongoing throughout" vs. "at some time within". When looking
> into a possible design pattern for dates in a Linked Data context [3],
> I found that the W3C date types allow you to express less precise
> dates, e.g. "year" [4]. This might be useful.
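For reference, the three W3C XML Schema date types in play map directly onto the precision of an ISO-style date string (the type names are from XML Schema Part 2; the helper itself is just a sketch):

```python
def xsd_date_type(value):
    """Map an ISO-style date string to the W3C XML Schema type of
    matching precision: '1944' -> xsd:gYear, '1944-06' -> xsd:gYearMonth,
    '1944-06-12' -> xsd:date. The key is simply how many
    hyphen-separated parts the string has."""
    return {1: 'xsd:gYear', 2: 'xsd:gYearMonth', 3: 'xsd:date'}[
        len(value.split('-'))
    ]
```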
>
> For places, there is in principle the option of using a bounding box
> or boundary-line to indicate the size/scope of the place (which isn't
> quite the same thing as imprecision, of course). Geonames has the former (e.g.
> [5]); Ordnance Survey the latter [6], at least for administrative
> units in the area it covers. Of course, having this information, and
> getting to use it in an interactive context, are two different
> challenges. Has anyone implemented bounding-box or boundary-line-based "pins", e.g. in OpenLayers?
>
> One problem I am finding with geolocation is that when you convert
> descriptive place name data to coordinates, every object from the "same"
> place ends up with exactly the same coordinates. So to add to the
> spurious precision of the place itself, you have a spurious collocation of objects.
> I am trying to address this issue by generating a single "pin" with a
> group description of the objects it contains, unless there is only one.
> [7] (Apart from other considerations, popular web mapping
> applications will often only display one pin for a single coordinate,
> so one is rather forced into this approach.)
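The single-pin grouping described there might be sketched like this, assuming plain dict records with hypothetical 'lat', 'lon' and 'name' keys:

```python
from collections import defaultdict


def group_pins(objects):
    """Collapse objects that geocoded to identical coordinates into one
    pin each, since popular mapping libraries will otherwise stack (and
    effectively hide) all but one of them."""
    by_coord = defaultdict(list)
    for obj in objects:
        by_coord[(obj['lat'], obj['lon'])].append(obj)
    return [
        {'lat': lat, 'lon': lon,
         # a lone object keeps its own name; groups get a count instead
         'label': items[0]['name'] if len(items) == 1
                  else '%d objects at this location' % len(items)}
        for (lat, lon), items in by_coord.items()
    ]
```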
>
> The requirement to display collections data on maps and in timelines
> is perhaps a useful wake-up call as regards the quality of our data.
>
> Richard
>
> [1] i.e. Collections Management System, not the other one
> [2] http://www.cidoc-crm.org/docs/How_to%20implement%20CRM_Time_in%20RDF.pdf
> [3] http://light.demon.co.uk/wordpress/?p=600 - all comments welcome, BTW
> [4] http://www.w3.org/TR/xmlschema-2/#isoformats
> [5] http://ws.geonames.org/get?geonameId=2644667
> [6] http://data.ordnancesurvey.co.uk/datasets/os-linked-data/about
> [7] http://light.demon.co.uk/wordpress/?page_id=446&modes_query={Word%20search}=*{oakham}&page_id=537
>
>
> On 04/11/2013 13:16, James Morley wrote:
>
>> Hi David
>>
>> I've been looking at it from that angle too, though not quite in as
>> much detail as it sounds like you have! And I've also been thinking
>> about the 'middle' bit that you allude to - once you've deciphered
>> them, how you might store these things in a unified way.
>>
>> I'm working on a prototype crowdsourcing toolkit for cultural
>> collections and will be presenting the concept in a quick open-mic
>> session at next week's UKMW13. The aim is not just to give users
>> tools, but also tap into natural language processing (e.g.
>> OpenCalais, Alchemy, Zemanta) to suggest tags, locations, dates,
>> people, events etc that users can then verify (or otherwise!).
>>
>> Give me a shout if you'd like to have a look at this in a day or two,
>> when I should have something ready for testing.
>>
>> Cheers, James
>>
>>
>> ---
>> James Morley
>> www.jamesmorley.net / @jamesinealing
>> www.whatsthatpicture.com / @PhotosOfThePast
>> www.apennypermile.com / @APennyPerMile
>>
>>
>>
>> On Mon, Nov 4, 2013 at 11:54 AM, David Croft
>> <[log in to unmask]>
>> wrote:
>>
>>> I've been working on this problem on and off for a while now, but
>>> from the other side as it were: trying to extract the dates that the
>>> record author meant from what they actually wrote.
>>> There are a LOT of different date formats out there and I've yet to
>>> see a really good solution.
>>> I'm coming at this problem from a software angle, trying to decode
>>> dates automatically, so my desires for date formats may be different to yours.
>>> But I really, really, really wish that date information was stated
>>> explicitly and consistently.
>>>
>>> Plenty of collections use modifiers like 'circa', 'early' or 'first
>>> half', but then don't use these consistently.
>>> In one record 'late 20th century' means 1950 to 2000; in another
>>> it will mean 1975 to 2000.
>>> These sort of date modifiers never seem to get explicitly defined
>>> for the collection which means that what one collection means by
>>> 'circa' is different to what another collection means.
>>> The modifiers also mean different things applied to different dates:
>>> 'circa 1950' may mean 1945 to 1955, but is 'circa 1950s' 1950 to 1959,
>>> or 1945 to 1965?
>>> There are lots of records with dates like '80s' where you just have
>>> to assume the century information, or '1940-50s' where you assume it
>>> means 1940 to 1959.
>>>
>>> So for me, the best way is just to provide the upper and lower
>>> bounds for the date period in full, i.e. not 'circa 1955' but instead
>>> '1950/1/1 to 1959/12/31'.
>>> Or if that's not possible, define exactly what you mean by 'circa',
>>> 'late', 'early' etc and make that information available where anyone
>>> looking at your records can see it.
>>> For example, are you going to use the word 'circa', or just put a
>>> 'c' on the front of the date, i.e. 'c1950'?
>>> If there are two dates in a field does the circa apply to just the
>>> first one or both? i.e. is 'circa 1950 to 1960' the same as 'circa
>>> 1950 to circa 1960'?
>>> If you are saying 'circa 19th century' do you mean up to 25 years
>>> either side? 50 years? 75?
>>> Software can decode any format you use as long as we know what the
>>> rules are.
>>>
>>> P.S.
>>> There are some truly interesting date fields out there and I've been
>>> keeping a list as part of my really tricky testing data.
>>> Some of my favourites are '25 feb ?', 'circa pre world war two',
>>> 'early or late 19th or 20 century' and 'c18-1 to c--01?'
>>>
>>> David
>>>
>>> ****************************************************************
>>> website: http://museumscomputergroup.org.uk/
>>> Twitter: http://www.twitter.com/ukmcg
>>> Facebook: http://www.facebook.com/museumscomputergroup
>>> [un]subscribe: http://museumscomputergroup.org.uk/email-list/
>>> ****************************************************************
>>>
> --
> *Richard Light*
>
>