Hi Adrian,

Regarding 1): From what I see, unfortunately the API for specifying the number of parallel copy jobs
is not exposed in the Python bindings.
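
In the meantime, a workaround is to limit concurrency yourself by submitting the jobs in batches, with one CopyProcess per batch. This is only a sketch: the `client.CopyProcess` usage in the comments assumes the current pyxrootd API, while the batching helper itself is plain Python:

```python
def batched(jobs, batch_size):
    """Yield successive slices of at most batch_size jobs."""
    for i in range(0, len(jobs), batch_size):
        yield jobs[i:i + batch_size]

# Hypothetical usage with pyxrootd (not executed here):
#
#   from XRootD import client
#   for batch in batched(all_jobs, 8):
#       process = client.CopyProcess()
#       for source, target in batch:
#           process.add_job(source, target)
#       process.prepare()
#       process.run()

jobs = [("root://src//f%d" % i, "/data/f%d" % i) for i in range(10)]
print([len(b) for b in batched(jobs, 4)])  # prints [4, 4, 2]
```

Each batch only starts after the previous one has finished, so at most batch_size copies run at once.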

Regarding 2): Although there is a sourcelimit argument in the Python bindings, from what I checked
in the source code it is ignored, so in practice this functionality is not exposed in the Python bindings either.
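
Until sourcelimit works, one way to handle multiple sources per file is a simple fallback loop: try each source in turn and stop at the first successful copy. A sketch, where `copy_fn` stands in for whatever actually performs the transfer (for example a small wrapper around `client.CopyProcess` — the pyxrootd usage is an assumption, not shown here):

```python
def copy_with_fallback(sources, dest, copy_fn):
    """Try each source URL in order; return the first one that copies OK.

    copy_fn(source, dest) should return True on success, False on failure.
    Returns the winning source URL, or None if every source failed.
    """
    for source in sources:
        if copy_fn(source, dest):
            return source
    return None

# Example with a fake copy_fn in which only the second replica is alive:
alive = {"root://site2//store/file.root"}
fake_copy = lambda src, dst: src in alive
winner = copy_with_fallback(
    ["root://site1//store/file.root", "root://site2//store/file.root"],
    "/data/file.root",
    fake_copy,
)
print(winner)  # prints root://site2//store/file.root
```

This avoids adding several jobs with the same destination, so the parallelism of CopyProcess is not involved at all.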

If you wish, I can expose those features in the Python bindings; they could potentially be released
in 4.10.0. Could you please create a corresponding feature request in our GitHub
(https://github.com/xrootd/xrootd) so I don't forget?

Cheers,
Michal
________________________________________
From: [log in to unmask] [[log in to unmask]] on behalf of Adrian Sevcenco [[log in to unmask]]
Sent: 26 March 2019 13:19
To: xrootd-l
Subject: python xrootd :: CopyProcess batch size + multiple sources problem

Hi! I want to use CopyProcess to copy multiple files,
and I have seen that all(?) copies are done in parallel,
each with a default of parallelchunks=8.

I have some questions:
1. How can I set the number of parallel copies?
If I add 100 copy jobs, will they all run in parallel?

2. For each file I have multiple sources.
Is there a way to download the file using a list of
sources?

Or should I just add all sources to CopyProcess with the same destination
name, so that with the default force=False all subsequent copy jobs will simply fail?
But in that case I do not want the parallel download feature of CopyProcess, do I?

Thank you for any ideas!
Adrian



########################################################################
Use REPLY-ALL to reply to list

To unsubscribe from the XROOTD-L list, click the following link:
https://listserv.slac.stanford.edu/cgi-bin/wa?SUBED1=XROOTD-L&A=1

