Yippee!
Put the file in and got it back! Unfortunately, I couldn't get
my file back using srmcp; I could retrieve it with globus-url-copy,
though. The same thing happened when I was testing Graeme Stewart's
<[log in to unmask]> endpoint. He might know the answer.
Sorry for posting to <[log in to unmask]>; we've had
more than enough private conversation by now.
Words written by `Matt Doidge' on 31 May 2005 at 16:12:53 +0100 prompted:
> oh, my endpoint is;
> fal-pygrid-20.lancs.ac.uk
> please be gentle with it :-D
>
> cheers,
>
> matt
Regards.
--
Jiri
++ basename dteam/20050531-164253.txt
+ /opt/d-cache/srm/bin/srmcp -debug=true -webservice_protocol=https -x509_user_proxy=/tmp/x509up_u33032 -gsissl=true srm://fal-pygrid-20.lancs.ac.uk:8443/pnfs/lancs.ac.uk/data/dteam/20050531-164253.txt file://./20050531-164253.txt
SRM Configuration:
debug=true
gsissl=true
help=false
pushmode=false
userproxy=true
buffer_size=2048
tcp_buffer_size=0
config_file=/home/mencak/.srmconfig/config.xml
glue_mapfile=/opt/d-cache/srm//conf/SRMServerV1.map
webservice_path=srm/managerv1.wsdl
webservice_protocol=https
gsiftpclinet=globus-url-copy
protocols_list=http,gsiftp
save_config_file=null
srmcphome=/opt/d-cache/srm/
urlcopy=/opt/d-cache/srm//bin/url-copy.sh
x509_user_cert=/root/jm/k5-ca-proxy.pem
x509_user_key=/root/jm/k5-ca-proxy.pem
x509_user_proxy=/tmp/x509up_u33032
x509_user_trusted_certificates=/root/jm/.globus/certificates
retry_num=20
retry_timeout=10000
wsdl_url=null
use_urlcopy_script=false
connect_to_wsdl=false
from[0]=srm://fal-pygrid-20.lancs.ac.uk:8443/pnfs/lancs.ac.uk/data/dteam/20050531-164253.txt
to=file://./20050531-164253.txt
Tue May 31 16:51:49 BST 2005: starting SRMGetClient
Tue May 31 16:51:49 BST 2005: SRMClient(https,srm/managerv1.wsdl,true)
Tue May 31 16:51:49 BST 2005: connecting to server
Tue May 31 16:51:49 BST 2005: connected to server, obtaining proxy
SRMClientV1 : connecting to srm at httpg://fal-pygrid-20.lancs.ac.uk:8443/srm/managerv1
Tue May 31 16:51:51 BST 2005: got proxy of type class org.dcache.srm.client.SRMClientV1
SRMClientV1 : get: surls[0]="srm://fal-pygrid-20.lancs.ac.uk:8443/pnfs/lancs.ac.uk/data/dteam/20050531-164253.txt"
SRMClientV1 : get: protocols[0]="http"
SRMClientV1 : get: protocols[1]="dcap"
SRMClientV1 : get: protocols[2]="gsiftp"
SRMClientV1 : get, contacting service httpg://fal-pygrid-20.lancs.ac.uk:8443/srm/managerv1
doneAddingJobs is false
copy_jobs is empty
Tue May 31 16:51:55 BST 2005: srm returned requestId = -2147483630
Tue May 31 16:51:55 BST 2005: sleeping 1 seconds ...
Tue May 31 16:51:56 BST 2005: FileRequestStatus with SURL=srm://fal-pygrid-20.lancs.ac.uk:8443/pnfs/lancs.ac.uk/data/dteam/20050531-164253.txt is Ready
Tue May 31 16:51:56 BST 2005: received TURL=gsiftp://fal-pygrid-20.lancs.ac.uk:2811//pnfs/lancs.ac.uk/data/dteam/20050531-164253.txt
doneAddingJobs is false
copy_jobs is not empty
copying CopyJob, source = gsiftp://fal-pygrid-20.lancs.ac.uk:2811//pnfs/lancs.ac.uk/data/dteam/20050531-164253.txt destination = file://./20050531-164253.txt
GridftpClient: connecting to fal-pygrid-20.lancs.ac.uk on port 2811
Tue May 31 16:51:56 BST 2005: fileIDs is empty, breaking the loop
GridftpClient: gridFTPClient tcp buffer size is set to 1048576
GridftpClient: gridFTPRead started
GridftpClient: parallelism: 10
GridftpClient: waiting for completion of transfer
GridftpClient: gridFtpWrite: starting the transfer in emode from /pnfs/lancs.ac.uk/data/dteam/20050531-164253.txt
setting file request -2147483629 status to Done
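For what it's worth, the transcript above follows the usual SRM v1 get flow: the client submits the SURL with its preferred protocols, gets back a request ID, polls until the file request is Ready and carries a TURL, then hands that TURL to a GridFTP transfer. A minimal sketch of that polling loop, where `client` is a hypothetical stand-in for the web-service proxy (not the real org.dcache.srm.client.SRMClientV1 API), and the retry count and 1-second sleep mirror the `retry_num=20` setting and "sleeping 1 seconds" lines in the log:

```python
import time

class SRMError(RuntimeError):
    pass

def srm_get(client, surl, protocols=("http", "dcap", "gsiftp"),
            retries=20, poll_interval=1.0):
    """Poll an SRM v1 get request until the file is Ready, then return its TURL.

    `client` is a hypothetical object (an assumption, not the dCache API):
    get(surls, protocols) must return a request id, and status(request_id)
    must return a list of per-file dicts with "state", "turl", and
    optionally "error" keys.
    """
    request_id = client.get([surl], list(protocols))
    for _ in range(retries):
        file_status = client.status(request_id)[0]
        if file_status["state"] == "Ready":
            # e.g. gsiftp://fal-pygrid-20.lancs.ac.uk:2811/pnfs/...
            return file_status["turl"]
        if file_status["state"] == "Failed":
            raise SRMError(file_status.get("error", "unknown failure"))
        time.sleep(poll_interval)  # the log shows 1 s sleeps between polls
    raise SRMError("file request never became Ready")
```

In the failing case above the SRM negotiation itself clearly succeeds (the TURL comes back Ready), so the hang is presumably in the subsequent GridFTP leg rather than in this loop.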