...continued from previous message
From: Peter McKeague, RCAHMS [mailto:[log in to unmask]]
*The OS Positional Accuracy Improvement Programme
The level of precision offered by modern technology, such as differential
GPS, is far higher than existing mapped detail. GPS is accurate to
millimetres, whereas existing cartographic detail at basic scale (1:1250) is
mapped to tolerances of 0.4m in urban areas, 1.1m in urban and rural
areas (1:2,500 scale) and 4.1m for mountain and moorland (1:10,000 scale).
Inevitably there are problems fitting higher accuracy work into old detail
that has been surveyed using various methods over the past 150 years. The
OS acknowledge the impact of new technology within their basic scale rural
mapping and are undertaking a positional accuracy improvement programme in
1:2,500 areas (www.ordnancesurvey.co.uk, following the quick link to the
positional accuracy improvement programme). Their web pages outline the impact of this
programme on their Landline product. Where detail has changed the OS will
issue a link file with every improved 1:2,500 map tile that is re-supplied to
customers holding a maintenance agreement. The file is in comma separated
value (.csv) format and contains the co-ordinates (in metres) of both the
original position and the new position of selected points together with the
feature code of the selected points for use in commercial transformation
packages. This is fine for a one-off transformation of digitally held data
which is subsequently maintained and updated exclusively through a GIS. So
which data should be transformed? GPS data, captured to the OSGB36
standard, is absolute and should not require transformation. EDM survey and aerial
photographic transcription detail, both reliant on establishing local mapped
control, will need transformation if that data was gathered using chart copy
or unimproved Landline data. But should point data within the NMRS database
be updated? As described earlier, the accuracy of the information supplied
to the NMRS/SMRs is founded upon a 1:10,000 paper-based map and not the
1:2,500 maps, but this is likely to change as digital mapping becomes more
readily available and affordable. Data transformation may be largely
inappropriate given the accuracy described; however, there is nothing to be
lost in testing the data if the user has access to the transformation
process.
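The handling of a re-supplied link file might be sketched as follows. This is a minimal Python illustration only: the column order shown (feature code, old easting, old northing, new easting, new northing) is an assumption, the sample rows are invented, and a commercial transformation package would apply proper rubber-sheeting rather than this crude nearest-link shift.

```python
import csv
import io
import math

# Invented sample rows; the real OS column layout may differ.
# Assumed columns: feature code, old easting, old northing,
# new easting, new northing (all coordinates in metres).
SAMPLE_LINK_FILE = """\
0001,312500.0,673200.0,312501.2,673199.1
0002,312640.0,673310.0,312641.0,673309.4
0003,312775.0,673450.0,312775.9,673448.8
"""

def load_links(text):
    """Parse link records into (old_point, shift) pairs."""
    links = []
    for code, oe, on, ne, nn in csv.reader(io.StringIO(text)):
        oe, on, ne, nn = map(float, (oe, on, ne, nn))
        links.append(((oe, on), (ne - oe, nn - on)))
    return links

def shift_point(point, links):
    """Move a digitised point by the shift of its nearest link point:
    a crude stand-in for a commercial transformation package."""
    _, shift = min(links, key=lambda l: math.dist(point, l[0]))
    return (point[0] + shift[0], point[1] + shift[1])

links = load_links(SAMPLE_LINK_FILE)
print(shift_point((312510.0, 673205.0), links))
```

A one-off batch run over a digitally held dataset would loop this over every point, after which the data is maintained exclusively against the improved mapping.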
*OS MasterMap™
In November 2001, the OS will launch a new product, MasterMap, based on the
concept of the Digital National Framework. The map will radically change
the way we work with data, as polygons will represent real-world features
such as buildings, fields or plantations. Each object in the map will have
its own unique reference number, the TOpographic IDentifier or TOID. The
TOID has no intelligence other than to identify a specific object. The
representation of areas as polygon features enables quick and simple spatial
searches through a GIS (which sites lie within, or within x metres of, the
object). However, MasterMap offers a much more sophisticated solution to
presenting individual datasets within a GIS. This is particularly true when
sharing information across an office network or over the internet. A joint
research project between the OS and the RCAHMS has explored the potential of
data association. Linking the TOID to unique identifiers within the
external dataset creates an explicit, unambiguous link between the two
datasets. At its simplest level, archaeological interest could be noted in
objects within the MasterMap so that other users become aware of their
presence. They need not see that the location of an archaeological record
is less accurate than the mapped detail or that the full extents of a site
have not been established. More sophisticated links can be made and
information retrieved through network connections to the relevant
databases or over the internet to web pages.
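The data-association idea amounts to a lookup from TOIDs to record numbers. The sketch below uses invented identifiers throughout (real TOIDs are OS-issued, and these NMRS numbers are not genuine records); a production system would hold the links in the relevant databases rather than in memory.

```python
# All identifiers below are invented for illustration.
toid_links = {
    "osgb1000000001": ["NT00NE 1"],
    "osgb1000000002": ["NT00NE 2", "NT00NE 3"],
}

def has_archaeological_interest(toid):
    """Tell a map user only that interest exists, without exposing
    the record's positional accuracy or unestablished extents."""
    return toid in toid_links

def linked_records(toid):
    """Follow the explicit TOID link through to the full records."""
    return toid_links.get(toid, [])

print(has_archaeological_interest("osgb1000000001"))
print(linked_records("osgb1000000002"))
```

The two functions separate the flag (safe to share widely) from the retrieval (restricted to users with access to the underlying database).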
Just as each object in the MasterMap has its own unique identifier, the
RCAHMS is exploring applying its existing unique identifier from the NMRS
database to individual features within other graphic layers. This will
enable a user, sometime in the future, to retrieve the database record from
site area extent, surveyed information or aerial photograph transcription
detail. Further identifiers could link detail from different layers to
entries in events tables. Thus a user could select records for all Roman
Forts, retrieving the database entries and the site area extents, as well as
any additional information such as survey or transcription detail. Such
intricate cross-referencing of data may seem like overkill when working with
relatively small datasets, but handling digital data from archaeology is
still in its infancy. Cross layer retrieval may become important as
web-based applications take off enabling remote users to access all
information without first having to know which layers to search across.
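The cross-layer retrieval described above can be sketched as a query keyed on the shared identifier. Every identifier, site and layer entry here is invented, and the placeholder strings stand in for real geometry and transcription detail.

```python
# Invented records and layers keyed on a shared NMRS-style identifier.
database = {
    "NT00NE 1": {"name": "Example Fort", "type": "Roman Fort"},
    "NT00NE 2": {"name": "Example Cairn", "type": "Cairn"},
}
layers = {
    "site_extent": {"NT00NE 1": "<polygon>", "NT00NE 2": "<polygon>"},
    "survey": {"NT00NE 1": "<plan>"},
    "transcription": {"NT00NE 2": "<cropmark plot>"},
}

def select_by_type(site_type):
    """Retrieve the database entry plus every layer feature sharing
    its identifier, so the user need not know which layers to search."""
    results = {}
    for nmrs_id, record in database.items():
        if record["type"] == site_type:
            results[nmrs_id] = {"record": record}
            for layer, features in layers.items():
                if nmrs_id in features:
                    results[nmrs_id][layer] = features[nmrs_id]
    return results

print(select_by_type("Roman Fort"))
```

A single selection of "Roman Fort" thus returns the database entry, the site extent and the survey detail together, without the user specifying the layers.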
*Metadata
Within the GIS community, discussions on metadata have focused on discovery
level metadata. However, standards such as the Dublin Core or the National
Geospatial Data Framework (www.askGIraffe.org.uk), and other broadly
similar standards in use at this level, represent only the very tip of an
information pyramid describing what the data are. Discovery level metadata
provide no more than pointers to what may be in a particular dataset; they
do not describe or document the structure of the data within a particular
layer. Schemas should be created describing both the structure of
attributes attached to a layer and content of the individual fields. The
MIDAS manual, describing data standards for Monument Inventories for
England, performs such a function. The application of thesauri to
individual entries within a field documents the data
further. These standards are already applied to many SMR
databases in England and should migrate with negligible difficulty to the
GIS environment. Metadata should also address the character of data. It
should document how and why data was collected, whether the dataset is
homogeneous or heterogeneous in origin and describe factors affecting the
positional accuracy of the data. Similar documentation should be developed
for other related spatial datasets. Metadata describing the method of
survey, type of equipment used and even the local survey control (perhaps
incorporated within the spatial data itself) should accompany digital files
to enable users to evaluate the positional accuracy of the data.
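Such accompanying metadata could be captured in a simple structured record alongside the spatial file. The field names below are illustrative assumptions only, not drawn from the Dublin Core or NGDF element sets, and the example values are invented.

```python
from dataclasses import dataclass

# Illustrative schema: field names are assumptions, not an
# established metadata element set.
@dataclass
class SpatialMetadata:
    title: str                    # discovery-level pointer
    survey_method: str            # how and why the data was collected
    equipment: str                # e.g. differential GPS, EDM
    local_control: str            # survey control the work was fitted to
    positional_accuracy_m: float  # estimated tolerance in metres
    homogeneous: bool             # single-origin or mixed dataset

record = SpatialMetadata(
    title="Earthwork survey of an example site",
    survey_method="EDM traverse",
    equipment="total station",
    local_control="unimproved 1:2,500 Landline detail",
    positional_accuracy_m=1.1,
    homogeneous=True,
)
print(record.positional_accuracy_m)
```

With the local control and tolerance recorded explicitly, a later user can judge whether the dataset needs transformation after a positional accuracy improvement.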
*Summary
With the exception of positional accuracy, data standards applied to layers
held within a GIS should be no different from those governing databases or
survey methodology. Positional accuracy is governed by so many undocumented
variables (working practices, available equipment, skills such as the
selection of appropriate local control or map-reading ability, and so on)
that it would be impossible to document each and every one, especially with
legacy information. Metadata can help and should be in place for all newly
created positional datasets. The danger inherent in a GIS is that the
system appears to confer absolute accuracy upon each and every dataset. As modern
surveying methodologies condition the mind to accept absolute accuracy,
appreciation of factors governing legacy data may diminish. For now, most
users are aware that their datasets were created for different purposes to
different standards. Data association through the OS Digital National
Framework (MasterMap) may redress some concerns about the positional
accuracy of point-based information, in particular, by linking that
information to an object, or objects, within the DNF. As real-world objects
start to represent the point-based datasets, flagging, but not displaying,
heritage information in this manner may also allay concerns about sharing data
across networks.