The name attribute specifies the destination file name
(or a relative path to the file, as in the example); however, keep
in mind that the destination might be remote.
To give you an example:
xrdcp input.meta4 root://hostname//path/dir
and say the name attribute is subdir/file.txt
the effective destination will be:
root://hostname//path/dir/subdir/file.txt
However, if you override the destination file name in the xrdcp command:
xrdcp input.meta4 root://hostname//path/dir/file.txt
the name attribute will be ignored.
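For reference, a minimal metalink (.meta4) file with a relative name attribute could look like this (hostnames and paths below are just placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<metalink xmlns="urn:ietf:params:xml:ns:metalink">
  <file name="subdir/file.txt">
    <!-- lower priority value = preferred source -->
    <url priority="1">root://host1//store/file.txt</url>
    <url priority="2">root://host2//backup/file.txt</url>
  </file>
</metalink>
```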
Hope this makes it more clear :-)
From: Adrian Sevcenco
Sent: 01 April 2019 13:22
To: Michal Kamil Simon; [log in to unmask]
Subject: Re: xrootd python :: CopyProcess - stop the queue if a job succeed
On 4/1/19 1:10 PM, Michal Kamil Simon wrote:
> Hi Adrian,
> Thanks for providing an example metalink, indeed
> it seems there is a problem with parsing a metalink
> that contains URLs with ALICE tokens. I'll provide a fix for it.
Thanks a lot!!
> Regarding the destination, you are supposed to specify
> possible sources in the metalink, but it must not contain
> the destination, which has to be specified as for any other
> copy job.
metalink:file elements MUST have a "name" attribute, which contains
the local file name to which the downloaded file will be written.
This is why i said that the destination should not be required...
but of course it is not hard for me to make sure i use the same
destination file name as the one written in the meta4 file
Thanks a lot!
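A quick sketch of how the name attribute could be read out of a meta4 file from python, so the cli destination matches it (standard library only; the file contents and hostnames below are made up):

```python
import xml.etree.ElementTree as ET

# RFC 5854 metalink namespace
METALINK_NS = "{urn:ietf:params:xml:ns:metalink}"

def parse_meta4(text):
    """Return (name attribute, source URLs sorted by priority)."""
    root = ET.fromstring(text)
    file_el = root.find(METALINK_NS + "file")
    urls = file_el.findall(METALINK_NS + "url")
    # per RFC 5854 a lower priority value means a preferred source
    urls.sort(key=lambda u: int(u.get("priority", "999999")))
    return file_el.get("name"), [u.text for u in urls]

meta4 = """<?xml version="1.0" encoding="UTF-8"?>
<metalink xmlns="urn:ietf:params:xml:ns:metalink">
  <file name="subdir/file.txt">
    <url priority="2">root://host2//backup/file.txt</url>
    <url priority="1">root://host1//store/file.txt</url>
  </file>
</metalink>"""

name, urls = parse_meta4(meta4)
```

The returned name can then be appended to whatever destination directory is passed to the copy job.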
> From: Adrian Sevcenco
> Sent: 31 March 2019 00:09
> To: Michal Kamil Simon; [log in to unmask]
> Subject: Re: xrootd python :: CopyProcess - stop the queue if a job succeed
> On 3/28/19 5:31 PM, Michal Kamil Simon wrote:
>> Hi Adrian,
> Hi Michal!
>> If (for now) you simply want to download a file having different
>> replicas (with
>> different names, etc.) a metalink is what you need. Have a look at:
> cool! it is perfect for my needs
>> you can create a metalink containing different URLs and assign priorities
>> to them. You could e.g. generate the metalink from python and then use a
>> local metalink as source in the CopyProcess.
>> Once we release 4.9.1 you will also be able to do a multi-source download
>> using a metalink.
>> However, this won't work for uploads.
> of course!
> I made one for testing,
> but it seems that there are problems :
> 1. the url part is not parsed properly
> 2. the usage requires a destination (but that is already included in
> the metalink) - so maybe if the source is a meta4 file the destination
> could be taken from the meta file and no longer be required on the cli?
> I am sure that there is a problem with the url parsing because the same
> url works for a normal add_job copy request
> Additional question for when it will be working :
> for the python part - will add_job take this .meta4 file as its source?
> Thank you!!
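The generate-the-metalink-from-python approach could be sketched roughly like this (standard library only; the URLs are placeholders, and the commented CopyProcess calls follow the xrootd python bindings as I understand them):

```python
import xml.etree.ElementTree as ET

NS = "urn:ietf:params:xml:ns:metalink"

def build_metalink(name, urls):
    """Build a metalink (.meta4) document listing replicas in priority order
    (first URL gets priority 1, i.e. the preferred source)."""
    ET.register_namespace("", NS)
    root = ET.Element("{%s}metalink" % NS)
    file_el = ET.SubElement(root, "{%s}file" % NS, {"name": name})
    for prio, url in enumerate(urls, start=1):
        url_el = ET.SubElement(file_el, "{%s}url" % NS,
                               {"priority": str(prio)})
        url_el.text = url
    return ET.tostring(root, encoding="unicode")

doc = build_metalink("subdir/file.txt",
                     ["root://host1//store/file.txt",
                      "root://host2//backup/file.txt"])

# Then write `doc` to e.g. input.meta4 and use it as the source of a copy job:
#
#   from XRootD import client
#   process = client.CopyProcess()
#   process.add_job("/local/path/input.meta4", "root://hostname//path/dir")
#   process.prepare()
#   process.run()
```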
>> From: Adrian Sevcenco
>> Sent: 28 March 2019 16:03
>> To: Michal Kamil Simon; [log in to unmask]
>> Subject: Re: xrootd python :: CopyProcess - stop the queue if a job succeed
>> On 3/28/19 4:35 PM, Michal Kamil Simon wrote:
>> > Hi Adrian,
>> > My impression (correct me if I'm wrong) is that you are trying to do
>> a multi-source transfer,
>> > right?
>> well, that would have been the optimal result, but at this moment i was
>> just trying
>> to download a file while trying all replicas
>> > If so, you need to specify in the CopyProcess the sourceLimit, and as
>> a source you need
>> > to specify a metalink containing all the replicas, or a manager (in
>> this case the client will
>> > figure out what the replicas are by itself by using locate).
>> my case is a little special ... the same file has multiple replicas
>> that have _different_ physical names
>> and authz envelopes ...
>> as a client i have all required info.. i know for my logical file name
>> that the actual links are :
>> how can i create/what is the format of a metalink?
>> can i somehow put together the remote links from above?
>> would the same mechanism work for upload?
>> > I just pushed a patch for the client that is enabling multi-source
>> download in python bindings
>> > (63e9604) and will port it for 4.9.1.
>> great! thanks a lot!!!
>> > Cheers,
>> > Michal
>> > ________________________________________
>> > From: [log in to unmask] [[log in to unmask]] on behalf of Adrian Sevcenco [[log in to unmask]]
>> > Sent: 28 March 2019 14:22
>> > To: [log in to unmask]
>> > Subject: xrootd python :: CopyProcess - stop the queue if a job succeed
>> > Hi! In my use case (ALICE), to download a file, i get all replicas
>> > and i add them as copy jobs to a CopyProcess ..
>> > I would like to stop the CopyProcess when a job is successful ..
>> > Is there a way to do this?
>> > Maybe in MyCopyProgressHandler.end() to check if results['status'] has
>> > a "status: 0" and cancel the whole CopyProcess?
>> > Is there a way to stop a CopyProcess? or another way to take into
>> account the possibility
>> > of multiple replicas?
>> > Thank you!
>> > Adrian
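For anyone finding this thread later, a rough sketch of the idea in the question above (pure python; the class and field names are my own, and the begin/end/update/should_cancel interface follows the bindings' CopyProgressHandler as I understand it):

```python
class StopOnSuccessHandler:
    """Remembers when a job has succeeded so the remaining queued jobs
    (other replicas of the same file) can be cancelled. With the xrootd
    python bindings an object like this would be passed as the handler
    to CopyProcess.run()."""

    def __init__(self):
        self.succeeded = False

    def begin(self, jobId, total, source, target):
        pass

    def end(self, jobId, results):
        # In the real bindings results['status'] is an XRootDStatus
        # object; a plain dict stands in for it here, for illustration.
        if results.get("status", {}).get("status") == 0:
            self.succeeded = True

    def update(self, jobId, processed, total):
        pass

    def should_cancel(self, jobId):
        # Returning True asks the copy process to cancel this job.
        return self.succeeded

# illustration with a fake result dict
handler = StopOnSuccessHandler()
handler.end(1, {"status": {"status": 0}})
```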
>> > ########################################################################
>> > Use REPLY-ALL to reply to list
>> > To unsubscribe from the XROOTD-L list, click the following link:
>> > https://listserv.slac.stanford.edu/cgi-bin/wa?SUBED1=XROOTD-L&A=1