> > Alan sent me some NDF URL stuff a day or two before he left. It's been
> > sitting in my inbox since then. I pushed it up my priority list today
> > after prompting from Malcolm, so I had a look at it and decided I didn't
> > really see why it was doing things the way it was. There was one change
> > which could easily have modified the "normal" behaviour of the NDF library.
> >
> > One question which could do with answering is where, and for how long,
> > has this modified NDF library been in use? Anyone know?
>
> I'm not sure that anyone has tried them other than Alan. Al only found
> them two days ago after prompting from Malcolm about additional convert
> patches.
It's been used by Brian and Alan for web services, grid and other
associated demos. I don't believe it's ever been leaked to users, but
Brian may know more...?
> > I presume someone else (Malcolm?) has the related patches for convert?
> > It would probably help me to see why the changes have been done as they
> > have if I could see the related convert scripts.
>
> Al sent Alan's Convert patches to Malcolm; they basically retrieve the
> remote file and put it in a temp space so that it can be converted
> locally. I'm sure Al can send them to you.
Attached! I'm a bit unsure why Alan used some Java to retrieve the file;
looking at his URLFetcher class, it doesn't seem to do any deep magic. I'm
presuming this call can be removed and replaced with wget (or something
more standard) in the URL2NDF CONVERT script.
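For what it's worth, a minimal sketch of what that replacement might look like. This is my own invention, not anything from Alan's patches: `fetch_url` and the curl fallback are hypothetical names, and I'm assuming only that wget or curl is on the path.

```shell
#!/bin/sh
# Hypothetical stand-in for the "java uk.ac.starlink.util.URLFetcher" call:
# fetch a URL into a named local file using wget, falling back to curl.
fetch_url() {
    url="$1"
    dest="$2"
    if command -v wget > /dev/null 2>&1; then
        # -q: quiet, -O: write to the named destination file
        wget -q -O "$dest" "$url"
    elif command -v curl > /dev/null 2>&1; then
        # -f: fail on HTTP errors, -s: silent, -o: destination file
        curl -f -s -o "$dest" "$url"
    else
        echo "!! Neither wget nor curl available to fetch $url" >&2
        return 1
    fi
}
```

The script would then call something like `fetch_url $hostdir$name URL$name` in place of the java invocation.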
> > How urgent is the inclusion of these NDF mods?
>
> Not urgent, it was just that having the convert patches sitting in a
> tarball on Al's [dodgy] laptop, seemingly without anyone knowing they
> exist, did not seem like a long-term solution.
The laptop spent most of last night flashing binary messages on its LEDs.
I've managed to revive it (again), but it's definitely not happy.
Hopefully I managed to grab all the important files off it, but you never
know...
> not be forgotten about. As a bare minimum it would be nice to get them
> into CVS on a branch so that we know where they are [that can only happen
> once NDF is in cvs]
I think this is the least we need to do; it's interesting functionality
and would instantly give us something to talk about (advertise). Something
along the lines of "Hey look, you can just type a URL into any Starlink
Classic application and magically it knows what to do with it!" would be
a pretty good thing to be able to say...
So it basically comes down to how much breakage putting this into a
mainstream NDF distribution would do...!?
Al.
# The preceding line must be left blank
# Name:
# url2ndf
#
# Purpose:
# NDF on-the-fly conversion script for URLs.
# (This version handles ^type separately from ^name i.e. no dummy .url type)
#
# Invocation:
# url2ndf ^dir ^name ^type ^fxs ^ndf
#
# Arguments:
# ^dir (Given)
# The protocol, hostname and directory portion of the URL
# (ftp://ftp.starlink.rl.ac.uk/pub/ajc/ for example)
# ^name (Given)
# The name of the file. For foreign files, this will include the file
# extension
# ^type (Given)
# The foreign file extension (".fits" for example)
# ^fxs (Given)
# The foreign extension specifier (e.g a FITS image extension
# specifier such as "[2]")
# ^ndf (Given)
# The name of the NDF to be created
#
# Method:
# The file is obtained from the remote site using the Java class
# java.net.URL, and placed in the current working directory. If the file
# extension is not ".sdf" but is one of the allowed types (currently FITS
# or GIF) the file is then converted to an NDF with the supplied ^ndf name
# using the standard CONVERT conversion utilities.
#
#
# Deficiencies:
# The process therefore only works for protocols handled by the Java URL
# class and certain file types handled by CONVERT (easily extended in the
# case statement below).
#
# The classpath to the Starlink util package URLFetcher class is hardwired
# as $WEB_INF/classes.
#-
#?echo "^hostdir " $1
#?echo "^name " $2
#?echo "^type " $3
#?echo "^fxs " $4
#?echo "^ndf " $5
# If there is no extension, assume .sdf
# Set name to the full filename
if [ -z "$3" ]; then
type=".sdf"
else
type="$3"
fi
name=$2$type
#?echo 'name is ' $name
#?echo 'type is ' $type
# Construct the full URL
hostdir=$1
# Get the format of files with the given extension
# Also tells whether the file is to be got and/or converted
convert="yes"
get="yes"
case $type in
.fit | .fits | .FIT) fmt="FITS"
;;
.gif) fmt="GIF"
;;
.sdf) convert="no"
;;
*) echo \!\! Cannot convert $type files
convert="no"
get="no"
;;
esac
# Get the file to the current working directory
if [ "$get" = "yes" ]; then
#?echo java URLFetcher $hostdir$name URL$name
java uk.ac.starlink.util.URLFetcher \
  $hostdir$name URL$name > Z_err_$name 2>&1 || {
  echo ""
  echo "!! ** Error fetching $hostdir$name"
  sed 's/^/! /' Z_err_$name
  echo "! ** Returning to main application"
  echo ""
  rm -f Z_err_$name URL$name
  exit 1
}
rm -f Z_err_$name
fi
# Now convert the file if necessary
# End up with required NDF and remove the intermediate file
if [ "$convert" = "yes" ]; then
#?echo $CONVERT_DIR/convertndf from "$fmt" "./" "URL${name%.*}" "$type" "$4" $5
$CONVERT_DIR/convertndf from "$fmt" "./" "URL${name%.*}" "$type" "$4" $5
rm -f URL$name
fi
# End of url2ndf
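The Deficiencies section notes that new file types are easily added in the case statement. As a sketch of that, here is the type dispatch pulled out into a function with a hypothetical JPEG entry added; the `classify_type` name and the assumption that the local CONVERT release understands a "JPEG" format are mine, so check before relying on it.

```shell
#!/bin/sh
# Sketch: url2ndf's extension dispatch as a function, with a hypothetical
# JPEG entry. Sets fmt, convert and get just as the inline case does.
classify_type() {
    type="$1"
    fmt=""
    convert="yes"
    get="yes"
    case $type in
    .fit | .fits | .FIT) fmt="FITS" ;;
    .gif)                fmt="GIF" ;;
    .jpg | .jpeg)        fmt="JPEG" ;;   # assumed: CONVERT handles JPEG
    .sdf)                convert="no" ;;
    *) echo "!! Cannot convert $type files"
       convert="no"
       get="no" ;;
    esac
}
```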
setenv NDF_FORMATS_IN "${NDF_FORMATS_IN},URL(.url),URL(.URL)"
setenv NDF_FROM_URL \
'scripts/url2ndf \
"^dir" "^name" "^type" "^fxs" "^ndf"'