>
> However, I think that (in the context of the encoding convention suggested),
> strictly speaking, this is what the introduction of encoding schemes in
> "Qualified Dublin Core"
> actually does: it "redefines" the content model for e.g. dc:subject to allow
> the use of an encoding scheme, which in this convention is denoted using an
> XML attribute.
I don't think of encoding schemes as redefining, but as specifying content.
One issue is whether it should be possible to read a dataType like
doi or msc from a record, or whether one should rely on global data typing.
Have you considered union types along with the use of xsi:type (or some other
localizable typing trick)?
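To make the xsi:type suggestion concrete, here is a hypothetical schema sketch (none of these type or element names come from an actual DCMI schema): declare the element with a general base type, derive named types such as doi and msc from it by restriction, and let each record select one locally via xsi:type. The value patterns are illustrative guesses, not normative definitions of DOI or MSC syntax.

```xml
<!-- Hypothetical fragment: per-record data typing via xsi:type -->
<xs:simpleType name="doi">
  <xs:restriction base="xs:string">
    <xs:pattern value="10\.\d+/.+"/>
  </xs:restriction>
</xs:simpleType>

<xs:simpleType name="msc">
  <xs:restriction base="xs:string">
    <xs:pattern value="\d{2}[A-Z]\d{2}"/>
  </xs:restriction>
</xs:simpleType>

<!-- the element itself is declared with the general base type -->
<xs:element name="subject" type="xs:string"/>
```

A record could then carry its own typing:

```xml
<subject xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:type="doi">10.1000/182</subject>
```

Since xsi:type must name a type validly derived from the declared type, restrictions of xs:string work here without touching the element declaration.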
>
> From another perspective, "Qualified Dublin Core" is an "application
> profile". Applications could also redefine the content models for elements
> in the dc: and dcterms: namespaces.
It might be necessary in some situations [for instance with technical content
coded in MathML] to use mark-up in the content (it would be unnatural
to expect that all content could or should be defined by strings which one
then wants to parse by non-XML rules).
In such a case you may want the content not to be analyzed in the sense
of metadata semantics. As I mentioned previously, the xlink:href trick
may not work in all instances.
There is some additional subtlety with xlinks: they might create
RDF-harvestable triples as such, and that could lead you into problems
you may want to escape from.
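If one does want foreign mark-up such as MathML inside an element's content, but without committing to analyzing it, the usual device is a wildcard content model (the type name here is hypothetical):

```xml
<!-- Hypothetical: mixed content, foreign mark-up carried but not analyzed -->
<xs:complexType name="freeMarkupType" mixed="true">
  <xs:sequence>
    <xs:any namespace="##other" processContents="lax"
            minOccurs="0" maxOccurs="unbounded"/>
  </xs:sequence>
</xs:complexType>
```

With processContents="skip" the embedded mark-up would not be validated at all; with "lax" it is validated only if a declaration for it happens to be available.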
>
> I was trying to make this explicit in the XML Schema by redefining/reusing
> the "unqualified" form, but (I think) it could also be done by just having a
> completely separate independent set of complexTypes in the schema for the
> qualified form (so dc11q.xsd never references dc.xsd etc)
>
> > There seems to be no sample schema matching the examples in the actual
> > text.
>
> There are some "application schemas" pointed to in the second column of the
> table near the end of
>
> http://www.ukoln.ac.uk/metadata/dcmi/dcxml/examples.html
>
> They may be "over-permissive" but I think they import schemas for the
> relevant namespaces in Andy's examples.
That's true, but they don't cover the examples.
>
> > 6) What is the model of interoperation behind a plethora of intertwined
> > XML schemas, one redefining the other?
>
> I think the root of some of the messiness is explained above, and my
> fixation with application profiles.... Some of this would be removed if the
> content models for e.g. unqualified and qualified forms of elements were set
> up independently.
>
> So where there are xs:redefines in the present schemas, the content models
> would be specified in full. That removes a lot of the
> complexity/interdependency. I'll look at this approach.
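[For comparison, the xs:redefine construction under discussion looks roughly like this, assuming dc.xsd defines a complexType "elementType" with simple content (names hypothetical); within xs:redefine the new definition must extend or restrict the component of the same name:

```xml
<!-- Hypothetical: dc11q.xsd redefining the unqualified content model -->
<xs:redefine schemaLocation="dc.xsd">
  <xs:complexType name="elementType">
    <xs:simpleContent>
      <xs:extension base="elementType">
        <xs:attribute name="scheme" type="xs:QName" use="optional"/>
      </xs:extension>
    </xs:simpleContent>
  </xs:complexType>
</xs:redefine>
```

Spelling the content model out in full instead removes exactly this dependency on dc.xsd.]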
>
> > 7) What a piece of software is supposed to do, when it receives a record
> coded
> > in a schema, which re-defines the content model for DC stuff
> differently than
> > it is used to?
>
> Applications _will_ choose different content models for DC elements, won't
> they? Those models would have to be supplied in complexTypes in
> application-specific XML schema, and if those instances are exchanged and
> require validation
An HTML application (a browser, for instance) renders only the things it is
supposed to know from the DTD, but it will not reject an instance which has
additional mark-up.
One question is what a DC-compliant application might do with a record which
contains additional mark-up. Does it reject the record? Or does it apply some
XSLT to the record to "make it valid"; if so, what is the strategy of such an
XSLT?
Turned around the other way: suppose you're ready to receive qualified records,
maybe with other namespaces mixed in.
Now an unqualified record comes in: do you reject it?
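One way to avoid rejecting the unqualified record is a single content model in which qualification is optional, so an unqualified record is simply a qualified record with no qualifiers (a sketch, names hypothetical):

```xml
<!-- Hypothetical: one content model serving both forms -->
<xs:complexType name="SubjectType">
  <xs:simpleContent>
    <xs:extension base="xs:string">
      <xs:attribute name="scheme" type="xs:QName" use="optional"/>
    </xs:extension>
  </xs:simpleContent>
</xs:complexType>
```

Under such a model both `<subject>Geometry</subject>` and `<subject scheme="MSC">53C25</subject>` validate against the same type, and the receiver never has to choose between rejecting and transforming.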
Best,
rs
>
> Pete
>