Hi Michal, here is the output of version 4.8.2 with the stalling. The test setup is one redirector, one client and one data server. I restarted the redirector and the data server right before the copy job.

REDIRECTOR:

root@glogin1 build # tail -f /var/log/xrootd/xrootd.log
180419 13:27:25 18556 XrdSched: scheduling transit protocol anchor in 3600 seconds
Config warning: 'xrootd.prepare logdir' not specified; prepare tracking disabled.
180419 13:27:25 18570 XrdXeq: async callback thread started
180419 13:27:25 18571 XrdXeq: Admin traffic thread started
------ xrootd protocol initialization completed.
------ xrootd [log in to unmask]:1094 initialization completed.
180419 13:27:25 18561 XrdSched: running main accept inq=0
180419 13:27:25 18569 XrdInet: Connected to glogin1.iup.uni-bremen.de:3121
180419 13:27:25 18569 cms_ClientMan: Connected to suspended glogin1.iup.uni-bremen.de v 3
180419 13:27:25 18569 Hookup glogin1.iup.uni-bremen.de qt=178ms rw=2
180419 13:28:55 18569 Receive glogin1 0 bytes on 0
180419 13:28:55 18569 setStatus glogin1.iup.uni-bremen.de sent resume event
180419 13:28:55 18569 cms_setStatus: Manager glogin1.iup.uni-bremen.de resumed
180419 13:36:57 18562 XrdSched: Now have 3 workers
180419 13:36:57 18562 XrdSched: running main accept inq=0
180419 13:36:57 18732 XrdXeq: Worker thread started
180419 13:36:57 18561 XrdInet: Accepted connection from [log in to unmask]
180419 13:36:57 18561 XrdProtocol: matched protocol xrootd
180419 13:36:57 18561 ?:18@qc01 XrdPoll: FD 18 attached to poller 0; num=1
180419 13:36:57 18561 ?:18@qc01 XrootdProtocol: 0000 req=login dlen=97
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdResponse: 0000 sending 16 data bytes
180419 13:36:57 18561 XrootdXeq: schroete.119547:18@qc01 pvt IPv4 login
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdProtocol: 0100 req=stat dlen=17
180419 13:36:57 18561 schroete.119547:18@qc01 ofs_stat: fn=/xrootd/schroete/
180419 13:36:57 18569 Receive glogin1 19 bytes on 3071
180419 13:36:57 18569 Decode glogin1 redirects schroete.119547:18@qc01 to 192.168.16.122:1094 /xrootd/schroete/
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdProtocol: 0100 rc=-256 stat /xrootd/schroete/
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdProtocol: 0100 redirecting to 192.168.16.122:1094
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdResponse: 0100 sending 18 data bytes; status=4004
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdProtocol: 0100 req=open dlen=51
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdProtocol: 0100 open unmat /xrootd/schroete//hdscia.txt.001?oss.asize=12353536
180419 13:36:57 18561 schroete.119547:18@qc01 ofs_open: 102-40644 fn=/xrootd/schroete/hdscia.txt.001
180419 13:36:57 18569 Receive glogin1 4 bytes on 4095
180419 13:36:57 18569 Decode glogin1 delays schroete.119547:18@qc01 5 /xrootd/schroete/hdscia.txt.001
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdProtocol: 0100 stalling client for 5 sec
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdResponse: 0100 sending 4 data bytes; status=4005
180419 13:36:57 18561 schroete.119547:18@qc01 ofs_close: use=0 fn=dummy
180419 13:37:02 18561 schroete.119547:18@qc01 XrootdProtocol: 0100 req=open dlen=51
180419 13:37:02 18561 schroete.119547:18@qc01 XrootdProtocol: 0100 open unmat /xrootd/schroete//hdscia.txt.001?oss.asize=12353536
180419 13:37:02 18561 schroete.119547:18@qc01 ofs_open: 102-40644 fn=/xrootd/schroete/hdscia.txt.001
180419 13:37:02 18569 Receive glogin1 19 bytes on 5119
180419 13:37:02 18569 Decode glogin1 redirects schroete.119547:18@qc01 to 192.168.16.122:1094 /xrootd/schroete/hdscia.txt.001
180419 13:37:02 18561 schroete.119547:18@qc01 XrootdProtocol: 0100 redirecting to 192.168.16.122:1094
180419 13:37:02 18561 schroete.119547:18@qc01 XrootdResponse: 0100 sending 18 data bytes; status=4004
180419 13:37:02 18561 schroete.119547:18@qc01 ofs_close: use=0 fn=dummy
180419 13:37:02 18561 XrootdXeq: schroete.119547:18@qc01 disc 0:00:05
180419 13:37:02 18561 schroete.119547:18@qc01 XrdPoll: FD 18 detached from poller 0; num=0
180419 13:40:25 18732 XrdSched: running underused thread monitor inq=0
180419 13:40:25 18732 XrdSched: 3 threads; 1 idle
180419 13:40:25 18732 XrdSched: scheduling underused thread monitor in 780 seconds

DATA SERVER:

root@dw ~ # tail -f /var/log/xrootd/xrootd.log
180419 13:27:28 15566 XrootdAioReq: Adding 18 aioreq objects.
180419 13:27:28 15566 XrootdAio: Adding 18 aio objects; 4096 pending.
180419 13:27:28 15566 XrdSched: scheduling xrootd protocol anchor in 3600 seconds
180419 13:27:28 15566 XrdSched: scheduling transit protocol anchor in 3600 seconds
Config warning: 'xrootd.prepare logdir' not specified; prepare tracking disabled.
180419 13:27:28 15592 XrdXeq: Event notification sender thread started
180419 13:27:28 15595 XrdXeq: Admin traffic thread started
------ xrootd protocol initialization completed.
------ xrootd [log in to unmask]:1094 initialization completed.
180419 13:27:28 15573 XrdSched: running main accept inq=0
180419 13:36:57 15574 XrdSched: Now have 3 workers
180419 13:36:57 15574 XrdSched: running main accept inq=0
180419 13:36:57 15672 XrdXeq: Worker thread started
180419 13:36:57 15573 XrdInet: Accepted connection from [log in to unmask]
180419 13:36:57 15573 XrdProtocol: matched protocol xrootd
180419 13:36:57 15573 ?:7@qc01 XrdPoll: FD 7 attached to poller 0; num=1
180419 13:36:57 15573 ?:7@qc01 XrootdProtocol: 0000 req=login dlen=97
180419 13:36:57 15573 sec_getParms: qc01.iup.uni-bremen.de sectoken=&P=unix
180419 13:36:57 15573 schroete.119547:7@qc01 XrootdResponse: 0000 sending 23 data bytes; status=0
180419 13:36:57 15573 schroete.119547:7@qc01 XrootdProtocol: 0000 req=auth dlen=21
sec_PM: Using unix protocol, args=''
180419 13:36:57 15573 schroete.119547:7@qc01 XrootdResponse: 0000 sending OK
180419 13:36:57 15573 XrootdXeq: schroete.119547:7@qc01 pvt IPv4 login as schroete
180419 13:36:57 15573 schroete.119547:7@qc01 XrootdProtocol: 0100 req=stat dlen=17
180419 13:36:57 15573 schroete.119547:7@qc01 ofs_stat: fn=/xrootd/schroete/
180419 13:36:57 15573 schroete.119547:7@qc01 XrootdProtocol: 0100 rc=0 stat /xrootd/schroete/
180419 13:36:57 15573 schroete.119547:7@qc01 XrootdResponse: 0100 sending 38 data bytes
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdProtocol: 0100 req=open dlen=51
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdProtocol: 0100 open unmat /xrootd/schroete//hdscia.txt.001?oss.asize=12353536
180419 13:37:02 15573 schroete.119547:7@qc01 ofs_open: 102-40644 fn=/xrootd/schroete/hdscia.txt.001
180419 13:37:02 15573 schroete.119547:7@qc01 oss_Open_ufs: fd=2048 flags=202 mode=40644 path=/xrootd/schroete/hdscia.txt.001
180419 13:37:02 15573 schroete.119547:7@qc01 ofs_fstat: fn=/xrootd/schroete/hdscia.txt.001
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdResponse: 0100 sending 47 data bytes; status=0
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdProtocol: 0100 req=write dlen=12353536
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdProtocol: 0100 fh=0 write 12353536@0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write 262144@0 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
[... 45 further oss_AioWrite/oss_AioWait pairs, 262144 bytes each, offsets 262144 through 11796480, all rc=0 ...]
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write 262144@12058624 started; aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write 32768@12320768 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for schroete.119547:7@qc01; rc=0 result=32768 aiocb=0x1658fd0
180419 13:37:02 15672 XrdSched: Now have 4 workers
180419 13:37:02 15672 XrdSched: running aio request inq=0
180419 13:37:02 15672 schroete.119547:7@qc01 XrootdResponse: 0100 sending OK
180419 13:37:02 15677 XrdXeq: Worker thread started
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdProtocol: 0100 req=close dlen=0
180419 13:37:02 15573 schroete.119547:7@qc01 ofs_close: use=1 fn=/xrootd/schroete/hdscia.txt.001
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdProtocol: 0100 close rc=0 fh=0
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdFile: closing w /xrootd/schroete/hdscia.txt.001
180419 13:37:02 15573 schroete.119547:7@qc01 ofs_close: use=0 fn=dummy
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdResponse: 0100 sending OK
180419 13:37:02 15573 XrootdXeq: schroete.119547:7@qc01 disc 0:00:05
180419 13:37:02 15573 schroete.119547:7@qc01 XrdPoll: FD 7 detached from poller 0; num=0

CLIENT (debug 2):

schroete@qc01 ~ $ xrdcp --debug 2 hdscia.txt.001 xroot://glogin1//xrootd/schroete/
[2018-04-19 13:36:57.277728 +0200][Debug ][Poller ] Available pollers: built-in
[2018-04-19 13:36:57.277832 +0200][Debug ][Poller ] Attempting to create a poller according to preference: built-in
[2018-04-19 13:36:57.277861 +0200][Debug ][Poller ] Creating poller: built-in
[2018-04-19 13:36:57.277888 +0200][Debug ][Poller ] Creating and starting the built-in poller...
[2018-04-19 13:36:57.278089 +0200][Debug ][Poller ] Using 1 poller threads
[2018-04-19 13:36:57.278108 +0200][Debug ][TaskMgr ] Starting the task manager...
[2018-04-19 13:36:57.278180 +0200][Debug ][TaskMgr ] Task manager started
[2018-04-19 13:36:57.278198 +0200][Debug ][JobMgr ] Starting the job manager...
[2018-04-19 13:36:57.278335 +0200][Debug ][JobMgr ] Job manager started, 3 workers
[2018-04-19 13:36:57.278356 +0200][Debug ][TaskMgr ] Registering task: "FileTimer task" to be run at: [2018-04-19 13:36:57 +0200]
[2018-04-19 13:36:57.278482 +0200][Debug ][PostMaster ] Creating new channel to: glogin1:1094 1 stream(s)
[2018-04-19 13:36:57.278517 +0200][Debug ][PostMaster ] [glogin1:1094 #0] Stream parameters: Network Stack: IPAuto, Connection Window: 120, ConnectionRetry: 5, Stream Error Widnow: 1800
[2018-04-19 13:36:57.279011 +0200][Debug ][TaskMgr ] Registering task: "TickGeneratorTask for: glogin1:1094" to be run at: [2018-04-19 13:37:12 +0200]
[2018-04-19 13:36:57.279471 +0200][Debug ][PostMaster ] [glogin1:1094] Found 1 address(es): [::ffff:192.168.16.124]:1094
[2018-04-19 13:36:57.279511 +0200][Debug ][AsyncSock ] [glogin1:1094 #0.0] Attempting connection to [::ffff:192.168.16.124]:1094
[2018-04-19 13:36:57.279570 +0200][Debug ][Poller ] Adding socket 0x16cda00 to the poller
[2018-04-19 13:36:57.279867 +0200][Debug ][AsyncSock ] [glogin1:1094 #0.0] Async connection call returned
[2018-04-19 13:36:57.279977 +0200][Debug ][XRootDTransport ] [glogin1:1094 #0.0] Sending out the initial hand shake + kXR_protocol
[2018-04-19 13:36:57.281634 +0200][Debug ][XRootDTransport ] [glogin1:1094 #0.0] Got the server hand shake response (type: manager [], protocol version 310)
[2018-04-19 13:36:57.281665 +0200][Debug ][XRootDTransport ] [glogin1:1094 #0.0] kXR_protocol successful (type: manager [], protocol version 310)
[2018-04-19 13:36:57.283464 +0200][Debug ][XRootDTransport ] [glogin1:1094 #0.0] Sending out kXR_login request, username: schroete, cgi: ?xrd.cc=de&xrd.tz=1&xrd.appname=xrdcp&xrd.info=&xrd.hostname=qc01.iup.uni-bremen.de&xrd.rn=v4.8.2, dual-stack: false, private IPv4: true, private IPv6: false
[2018-04-19 13:36:57.283977 +0200][Debug ][XRootDTransport ] [glogin1:1094 #0.0] Logged in, session: 010000006c4800001200000002000000
[2018-04-19 13:36:57.284003 +0200][Debug ][PostMaster ] [glogin1:1094 #0] Stream 0 connected.
[2018-04-19 13:36:57.284036 +0200][Debug ][Utility ] Monitor library name not set. No monitoring
[2018-04-19 13:36:57.286195 +0200][Debug ][PostMaster ] Creating new channel to: 192.168.16.122:1094 1 stream(s)
[2018-04-19 13:36:57.286253 +0200][Debug ][PostMaster ] [192.168.16.122:1094 #0] Stream parameters: Network Stack: IPAuto, Connection Window: 120, ConnectionRetry: 5, Stream Error Widnow: 1800
[2018-04-19 13:36:57.286803 +0200][Debug ][TaskMgr ] Registering task: "TickGeneratorTask for: 192.168.16.122:1094" to be run at: [2018-04-19 13:37:12 +0200]
[2018-04-19 13:36:57.286895 +0200][Debug ][PostMaster ] [192.168.16.122:1094] Found 1 address(es): [::ffff:192.168.16.122]:1094
[2018-04-19 13:36:57.286924 +0200][Debug ][AsyncSock ] [192.168.16.122:1094 #0.0] Attempting connection to [::ffff:192.168.16.122]:1094
[2018-04-19 13:36:57.286968 +0200][Debug ][Poller ] Adding socket 0x80014c0 to the poller
[2018-04-19 13:36:57.287166 +0200][Debug ][AsyncSock ] [192.168.16.122:1094 #0.0] Async connection call returned
[2018-04-19 13:36:57.287205 +0200][Debug ][XRootDTransport ] [192.168.16.122:1094 #0.0] Sending out the initial hand shake + kXR_protocol
[2018-04-19 13:36:57.288258 +0200][Debug ][XRootDTransport ] [192.168.16.122:1094 #0.0] Got the server hand shake response (type: server [], protocol version 310)
[2018-04-19 13:36:57.288290 +0200][Debug ][XRootDTransport ] [192.168.16.122:1094 #0.0] kXR_protocol successful (type: server [], protocol version 310)
[2018-04-19 13:36:57.289853 +0200][Debug ][XRootDTransport ] [192.168.16.122:1094 #0.0] Sending out kXR_login request, username: schroete, cgi: ?xrd.cc=de&xrd.tz=1&xrd.appname=xrdcp&xrd.info=&xrd.hostname=qc01.iup.uni-bremen.de&xrd.rn=v4.8.2, dual-stack: false, private IPv4: true, private IPv6: false
[2018-04-19 13:36:57.290306 +0200][Debug ][XRootDTransport ] [192.168.16.122:1094 #0.0] Logged in, session: 01000000c43c00000700000001000000
[2018-04-19 13:36:57.290322 +0200][Debug ][XRootDTransport ] [192.168.16.122:1094 #0.0] Authentication is required: &P=unix
[2018-04-19 13:36:57.290347 +0200][Debug ][XRootDTransport ] [192.168.16.122:1094 #0.0] Sending authentication data
[2018-04-19 13:36:57.306809 +0200][Debug ][XRootDTransport ] [192.168.16.122:1094 #0.0] Trying to authenticate using unix
[2018-04-19 13:36:57.308135 +0200][Debug ][XRootDTransport ] [192.168.16.122:1094 #0.0] Authenticated with unix.
[2018-04-19 13:36:57.308157 +0200][Debug ][PostMaster ] [192.168.16.122:1094 #0] Stream 0 connected.
[2018-04-19 13:36:57.308995 +0200][Debug ][Utility ] CopyProcess: 2 jobs to prepare
[2018-04-19 13:36:57.309066 +0200][Debug ][Utility ] Creating a classic copy job, from file://localhost/home/scia/schroete/hdscia.txt.001 to xroot://glogin1:1094//xrootd/schroete//hdscia.txt.001
[2018-04-19 13:36:57.309178 +0200][Debug ][Utility ] Opening file://localhost/home/scia/schroete/hdscia.txt.001 for reading
[2018-04-19 13:36:57.309233 +0200][Debug ][File ] [0x16ce690@file://localhost/home/scia/schroete/hdscia.txt.001] Sending an open command
[2018-04-19 13:36:57.310681 +0200][Debug ][File ] [0x16ce690@file://localhost/home/scia/schroete/hdscia.txt.001] Open has returned with status [SUCCESS]
[2018-04-19 13:36:57.310703 +0200][Debug ][File ] [0x16ce690@file://localhost/home/scia/schroete/hdscia.txt.001] successfully opened at localhost, handle: 0x220d, session id: 1
[2018-04-19 13:36:57.310791 +0200][Debug ][Utility ] Opening xroot://glogin1:1094//xrootd/schroete//hdscia.txt.001?oss.asize=12353536 for writing
[2018-04-19 13:36:57.310863 +0200][Debug ][File ] [0x16d3500@xroot://glogin1:1094//xrootd/schroete//hdscia.txt.001?oss.asize=12353536] Sending an open command
[2018-04-19 13:36:57.312306 +0200][Debug ][TaskMgr ] Registering task: "WaitTask for: 0x0x16d3bc0" to be run at: [2018-04-19 13:37:02 +0200]
[2018-04-19 13:37:02.278686 +0200][Debug ][TaskMgr ] Done with task: "WaitTask for: 0x0x16d3bc0"
[2018-04-19 13:37:02.310076 +0200][Debug ][File ] [0x16d3500@xroot://glogin1:1094//xrootd/schroete//hdscia.txt.001?oss.asize=12353536] Open has returned with status [SUCCESS]
[2018-04-19 13:37:02.310129 +0200][Debug ][File ] [0x16d3500@xroot://glogin1:1094//xrootd/schroete//hdscia.txt.001?oss.asize=12353536] successfully opened at 192.168.16.122:1094, handle: 0x0, session id: 1
[2018-04-19 13:37:02.310233 +0200][Debug ][File ] [0x16ce690@file://localhost/home/scia/schroete/hdscia.txt.001] Sending a read command for handle 0x220d to localhost
[2018-04-19 13:37:02.327820 +0200][Debug ][File ] [0x16d3500@xroot://glogin1:1094//xrootd/schroete//hdscia.txt.001?oss.asize=12353536] Sending a write command for handle 0x0 to 192.168.16.122:1094
[11.78MB/11.78MB][100%][==================================================][2.356MB/s]
[2018-04-19 13:37:02.437643 +0200][Debug ][File ] [0x16d3500@xroot://glogin1:1094//xrootd/schroete//hdscia.txt.001?oss.asize=12353536] Sending a close command for handle 0x0 to 192.168.16.122:1094
[2018-04-19 13:37:02.441380 +0200][Debug ][File ] [0x16d3500@xroot://glogin1:1094//xrootd/schroete//hdscia.txt.001?oss.asize=12353536] Close returned from 192.168.16.122:1094 with: [SUCCESS]
[2018-04-19 13:37:02.441485 +0200][Debug ][File ] [0x16ce690@file://localhost/home/scia/schroete/hdscia.txt.001] Sending a close command for handle 0x220d to localhost
[2018-04-19 13:37:02.441595 +0200][Debug ][File ] [0x16ce690@file://localhost/home/scia/schroete/hdscia.txt.001] Close returned from localhost with: [SUCCESS]
[11.78MB/11.78MB][100%][==================================================][2.356MB/s]
[2018-04-19 13:37:02.441830 +0200][Debug ][JobMgr ] Stopping the job manager...
[2018-04-19 13:37:02.442219 +0200][Debug ][JobMgr ] Job manager stopped
[2018-04-19 13:37:02.442265 +0200][Debug ][TaskMgr ] Stopping the task manager...
[2018-04-19 13:37:02.442379 +0200][Debug ][TaskMgr ] Task manager stopped
[2018-04-19 13:37:02.442410 +0200][Debug ][Poller ] Stopping the poller...
[2018-04-19 13:37:02.442594 +0200][Debug ][TaskMgr ] Requesting unregistration of: "TickGeneratorTask for: 192.168.16.122:1094"
[2018-04-19 13:37:02.442627 +0200][Debug ][AsyncSock ] [192.168.16.122:1094 #0.0] Closing the socket
[2018-04-19 13:37:02.442658 +0200][Debug ][Poller ] <[::ffff:192.168.16.170]:33020><--><[::ffff:192.168.16.122]:1094> Removing socket from the poller
[2018-04-19 13:37:02.442703 +0200][Debug ][PostMaster ] [192.168.16.122:1094 #0] Destroying stream
[2018-04-19 13:37:02.442729 +0200][Debug ][AsyncSock ] [192.168.16.122:1094 #0.0] Closing the socket
[2018-04-19 13:37:02.442760 +0200][Debug ][TaskMgr ] Requesting unregistration of: "TickGeneratorTask for: glogin1:1094"
[2018-04-19 13:37:02.442783 +0200][Debug ][AsyncSock ] [glogin1:1094 #0.0] Closing the socket
[2018-04-19 13:37:02.442807 +0200][Debug ][Poller ] <[::ffff:192.168.16.170]:45172><--><[::ffff:192.168.16.124]:1094> Removing socket from the poller
[2018-04-19 13:37:02.442847 +0200][Debug ][PostMaster ] [glogin1:1094 #0] Destroying stream
[2018-04-19 13:37:02.442871 +0200][Debug ][AsyncSock ] [glogin1:1094 #0.0] Closing the socket

On 19.04.2018 at 10:06, Michal Kamil Simon wrote:
> Hi Heiko,
>
> That's interesting, could you give me more details on your scenario,
> are you using xrdcp from a script or XrdCl C++ API (or Python bindings)?
>
> Could you also provide client side logs from a run when you observed
> stalling?
>
> Cheers,
> Michal
> ________________________________________
> From: [log in to unmask] [[log in to unmask]] on behalf of Heiko Schröter [[log in to unmask]]
> Sent: 18 April 2018 20:26
> To: [log in to unmask]
> Cc: [log in to unmask]; [log in to unmask]
> Subject: Re: Stalling client when copying files (xrdcp 4.8.2)
>
> This stalling does not occur with the 4.8.3-rc1.
>
> On 17.04.2018 at 19:24, Heiko Schröter wrote:
>> Hello,
>>
>> we observe that when copying a file the client is stalled for some time:
>>
>> 180417 19:17:28 17252 schroete.97360:7@qc08 XrootdProtocol: 0100
>> stalling client for 5 sec
>>
>> Sometimes it is for 10 sec, and this gets repeated without a
>> recognizable pattern.
>>
>> The client is not stalled when the copied file is removed at once and
>> recopied.
>>
>> It looks like a similar issue to this one:
>> https://listserv.slac.stanford.edu/cgi-bin/wa?A2=ind1203&L=XROOTD-L&P=R598&1=XROOTD-L&9=A&I=-3&J=on&d=No+Match%3BMatch%3BMatches&z=4
>>
>> Is this a settable parameter, or something we did wrong in our setup?
>>
>> We have one redirector and 12 data servers on a 10 GBit network.
>> Client access is very limited because this is a test setup.
>>
>> Regards
>> Heiko

--
-----------------------------------------------------------------------
Dipl.-Ing. Heiko Schröter
Institute of Environmental Physics (IUP)   phone: ++49-(0)421-218-62092
Institute of Remote Sensing (IFE)          fax:   ++49-(0)421-218-62070
University of Bremen (FB1)
P.O. Box 330440                            email: [log in to unmask]
Otto-Hahn-Allee 1
28359 Bremen
Germany
-----------------------------------------------------------------------

########################################################################
Use REPLY-ALL to reply to list

To unsubscribe from the XROOTD-L list, click the following link:
https://listserv.slac.stanford.edu/cgi-bin/wa?SUBED1=XROOTD-L&A=1
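PS: for anyone reading the logs above, the stall is the generic kXR_wait flow: the redirector answers the first kXR_open with status 4005 (kXR_wait) plus a wait time in seconds (the "sending 4 data bytes; status=4005" line), the client registers a WaitTask, sleeps, and reissues the open, which then gets status 4004 (kXR_redirect) to the data server. A minimal sketch of that retry loop, purely illustrative and not XrdCl's actual code (function and variable names here are made up; only the status codes come from the protocol):

```python
import time

# XRootD protocol status codes seen in the logs above.
kXR_ok = 0
kXR_redirect = 4004
kXR_wait = 4005

def open_with_wait(send_open, max_retries=5, sleep=time.sleep):
    """Reissue an open while the server answers kXR_wait.

    send_open() is a hypothetical helper returning (status, payload):
    for kXR_wait the payload is the number of seconds to wait,
    for kXR_redirect it is the target host:port.
    """
    for _ in range(max_retries):
        status, payload = send_open()
        if status == kXR_wait:
            sleep(payload)         # 5 seconds in the log above
            continue
        return status, payload     # kXR_redirect / kXR_ok / error
    raise TimeoutError("server kept stalling the open")

# Example mirroring the log: first open stalled for 5 s, second redirected.
responses = iter([(kXR_wait, 5),
                  (kXR_redirect, "192.168.16.122:1094")])
slept = []
status, target = open_with_wait(lambda: next(responses), sleep=slept.append)
```

Here the client waits once and then follows the redirect, which matches the 5-second gap between the two open requests in the redirector log.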