Richard Jones wrote:-
> If metadata is going to take off universally on the Web, then it is clear
> that the information has to be automatically generated. A key part of this
> is how to generate sensible keywords and summaries to act as a surrogate
> for the original document.
Sorry, but I disagree. 'Automatic generation' of subject content is exactly what
the robot search engines do, and it is dissatisfaction with the results they
produce that is the impetus for metadata. If the robot search engines were
infallible, there would be no need for metadata at all.
> If this is a problem that anyone is interested in following up, you might
> like to check out the two content analysis technology demos on our Web
> site. In both cases they accept a URL and return either a set of keywords
> and phrases, or a hyperlinked summary. See
>
> http://www.intext.com.au
I just tried your Precision keyword generator on my home page at:-
http://www.poulter.demon.co.uk/
'Home page' was one of the keywords recommended along with some overly vague
ones ('page', 'web page', 'published', 'open letter', 'co.uk' etc) and a
worrying one ('child porn'). In fact, if I were the litigious sort, I might be
more than a little upset with that last one :-)
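
The vague keywords are no surprise. I have no idea what algorithm the Intext
demo actually uses, but the simplest approach a robot indexer could take, i.e.
plain term-frequency counting with no stoplist or weighting, produces exactly
this kind of output. A minimal sketch (the sample text is invented for
illustration, not taken from my page):

```python
# Illustrative only: naive term-frequency keyword extraction, with no
# stoplist, stemming, or weighting. Frequent-but-vague words dominate.
from collections import Counter
import re

def naive_keywords(text, n=5):
    """Return the n most frequent words in the text, most common first."""
    words = re.findall(r"[a-z']+", text.lower())
    return [w for w, _ in Counter(words).most_common(n)]

sample = ("This page is my home page. The page was published on the web. "
          "See my web page for an open letter published on this page.")
print(naive_keywords(sample))  # 'page' comes out on top
```

Words like 'page' and 'published' win simply because they recur, which is why
human-assigned metadata, rather than a cleverer counter, is the answer.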
--
Alan Poulter, Lecturer phone/fax:01509 223061/223053
Dept. of Information and Library Studies mailto:[log in to unmask]
Loughborough University mailto:[log in to unmask]
LE11 3TU mailto:[log in to unmask]
UK http://info.lboro.ac.uk/departments/ls/staff/apoulter/