Hi Sally,
This is an area that I'm interested in. We use a number of different tools
for analysing our sites - some of these are listed in the programme for the
Benchmarking Your Web Site workshop I gave last year:
http://www.ukoln.ac.uk/web-focus/events/workshops/webmaster-2001/materials/parallel-benchmarking/
It might be worth reminding people that it's not just a question of which
software you use to check aspects of Web site quality, such as broken
links, HTML compliance, etc.; there is also the issue of ensuring that
errors are fixed and that they do not return.
This is QA ... and part of what the QA Focus post is encouraging people to
do (this is a new post held by myself and Ed Bremner, a colleague from ILRT).
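To make the "fixed and does not return" point concrete: one simple approach is to record the broken links found on each check and compare the current run against the previous one, so that regressions are flagged separately from new breakage. The sketch below (in Python, purely illustrative - the page content, link paths and function names are made up, and the actual HTTP checking step is left out) shows the idea:

```python
# Illustrative sketch: extract links from a page and compare the
# current set of broken links against the set recorded on the last
# run, so problems that have "come back" stand out from new ones.
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect href attributes from <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links


def regression_report(broken_now, broken_last_run):
    """Split current breakage into 'new' and 'returned' problems.

    broken_last_run is the set recorded after the previous check;
    anything appearing in both runs has returned (or was never fixed).
    """
    new = sorted(set(broken_now) - set(broken_last_run))
    returned = sorted(set(broken_now) & set(broken_last_run))
    return {"new": new, "returned": returned}


# Example with made-up links (a real tool would fetch each link
# and record the failures before calling regression_report):
page = '<a href="/ok.html">ok</a> <a href="/gone.html">gone</a>'
links = extract_links(page)
report = regression_report(broken_now=["/gone.html"],
                           broken_last_run=["/gone.html", "/old.html"])
```

Run on a schedule, a comparison like this turns a one-off link check into an ongoing QA process: the "returned" list is the part that tells you a fix didn't stick.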
I'm interested in hearing what others are currently doing because I'll be
running a session on this at the Strathclyde IWM conference
(http://www.ukoln.ac.uk/web-focus/events/workshops/webmaster-2002/).
Regards
Marieke
> -----Original Message-----
> From: List for the UK HE community to discuss all aspects of managing an
> institut [mailto:[log in to unmask]] On Behalf Of Sally
> Justice
> Sent: 17 April 2002 08:20
> To: [log in to unmask]
> Subject: Web analysis software
>
>
> What do people recommend for software to analyze the Web for broken
> links, out-of-date pages, most-searched words in metadata, hits, etc.,
> across multiple servers? We only have analog installed right now, but it
> is not very useful and we need more powerful tools for a range of tasks
> right across all the servers.
>
> Also, has anyone taken out the paid version of Google and, if so, does
> it bring huge advantages?
>
> The main Web site uses Digital UNIX ES40, Apache 1.3.12,
> mysql-3.22.30, php-3.0.5 and, at the moment, the free Google Search for
> Unis.
>
> If you prefer to email me, I can summarize for the list if people are
> interested.
> [log in to unmask]
>
> Thanks
> Sally
>
>