XROOTD-L Archives (XROOTD-L@LISTSERV.SLAC.STANFORD.EDU)
LISTSERV 16.5, April 2018

Subject: Re: Stalling client when copying files (xrdcp 4.8.2)
From: Heiko Schröter <[log in to unmask]>
Reply-To: Support use of xrootd by HEP experiments <[log in to unmask]>
Date: Thu, 19 Apr 2018 13:42:26 +0200
Content-Type: text/plain
Parts/Attachments: text/plain (629 lines)

Hi Michal,

here is the output of the 4.8.2 version with stalling. The test setup is
one redirector, one client, and one data server.
I restarted the redirector and the data server right before the copy job.

REDIRECTOR:
root@glogin1 build # tail -f /var/log/xrootd/xrootd.log
180419 13:27:25 18556 XrdSched: scheduling transit protocol anchor in
3600 seconds
Config warning: 'xrootd.prepare logdir' not specified; prepare tracking
disabled.
180419 13:27:25 18570 XrdXeq: async callback thread started
180419 13:27:25 18571 XrdXeq: Admin traffic thread started
------ xrootd protocol initialization completed.
------ xrootd [log in to unmask]:1094 initialization completed.
180419 13:27:25 18561 XrdSched: running main accept inq=0
180419 13:27:25 18569 XrdInet: Connected to glogin1.iup.uni-bremen.de:3121
180419 13:27:25 18569 cms_ClientMan: Connected to suspended
glogin1.iup.uni-bremen.de v 3
180419 13:27:25 18569 Hookup glogin1.iup.uni-bremen.de qt=178ms rw=2
180419 13:28:55 18569 Receive glogin1 0 bytes on 0
180419 13:28:55 18569 setStatus glogin1.iup.uni-bremen.de sent resume event
180419 13:28:55 18569 cms_setStatus: Manager glogin1.iup.uni-bremen.de
resumed
180419 13:36:57 18562 XrdSched: Now have 3 workers
180419 13:36:57 18562 XrdSched: running main accept inq=0
180419 13:36:57 18732 XrdXeq: Worker thread started
180419 13:36:57 18561 XrdInet: Accepted connection from
[log in to unmask]
180419 13:36:57 18561 XrdProtocol: matched protocol xrootd
180419 13:36:57 18561 ?:18@qc01 XrdPoll: FD 18 attached to poller 0; num=1
180419 13:36:57 18561 ?:18@qc01 XrootdProtocol: 0000 req=login dlen=97
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdResponse: 0000
sending 16 data bytes
180419 13:36:57 18561 XrootdXeq: schroete.119547:18@qc01 pvt IPv4 login
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdProtocol: 0100
req=stat dlen=17
180419 13:36:57 18561 schroete.119547:18@qc01 ofs_stat: fn=/xrootd/schroete/
180419 13:36:57 18569 Receive glogin1 19 bytes on 3071
180419 13:36:57 18569 Decode glogin1 redirects schroete.119547:18@qc01
to 192.168.16.122:1094 /xrootd/schroete/
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdProtocol: 0100
rc=-256 stat /xrootd/schroete/
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdProtocol: 0100
redirecting to 192.168.16.122:1094
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdResponse: 0100
sending 18 data bytes; status=4004
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdProtocol: 0100
req=open dlen=51
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdProtocol: 0100 open
unmat /xrootd/schroete//hdscia.txt.001?oss.asize=12353536
180419 13:36:57 18561 schroete.119547:18@qc01 ofs_open: 102-40644
fn=/xrootd/schroete/hdscia.txt.001
180419 13:36:57 18569 Receive glogin1 4 bytes on 4095
180419 13:36:57 18569 Decode glogin1 delays schroete.119547:18@qc01 5
/xrootd/schroete/hdscia.txt.001
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdProtocol: 0100
stalling client for 5 sec
180419 13:36:57 18561 schroete.119547:18@qc01 XrootdResponse: 0100
sending 4 data bytes; status=4005
180419 13:36:57 18561 schroete.119547:18@qc01 ofs_close: use=0 fn=dummy
180419 13:37:02 18561 schroete.119547:18@qc01 XrootdProtocol: 0100
req=open dlen=51
180419 13:37:02 18561 schroete.119547:18@qc01 XrootdProtocol: 0100 open
unmat /xrootd/schroete//hdscia.txt.001?oss.asize=12353536
180419 13:37:02 18561 schroete.119547:18@qc01 ofs_open: 102-40644
fn=/xrootd/schroete/hdscia.txt.001
180419 13:37:02 18569 Receive glogin1 19 bytes on 5119
180419 13:37:02 18569 Decode glogin1 redirects schroete.119547:18@qc01
to 192.168.16.122:1094 /xrootd/schroete/hdscia.txt.001
180419 13:37:02 18561 schroete.119547:18@qc01 XrootdProtocol: 0100
redirecting to 192.168.16.122:1094
180419 13:37:02 18561 schroete.119547:18@qc01 XrootdResponse: 0100
sending 18 data bytes; status=4004
180419 13:37:02 18561 schroete.119547:18@qc01 ofs_close: use=0 fn=dummy
180419 13:37:02 18561 XrootdXeq: schroete.119547:18@qc01 disc 0:00:05
180419 13:37:02 18561 schroete.119547:18@qc01 XrdPoll: FD 18 detached
from poller 0; num=0
180419 13:40:25 18732 XrdSched: running underused thread monitor inq=0
180419 13:40:25 18732 XrdSched: 3 threads; 1 idle
180419 13:40:25 18732 XrdSched: scheduling underused thread monitor in
780 seconds
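In the redirector log above, status=4004 is a kXR_redirect response and status=4005 is a kXR_wait response: the cluster manager answers the first open with a 5-second delay ("stalling client for 5 sec"), and the client is expected to retry the open after that interval. A minimal sketch of that wait-and-retry pattern (hypothetical function names, not the actual XrdCl implementation; the simulated wait is shortened so the example runs quickly):

```python
import time

def open_with_wait(try_open, max_wait=60):
    """Retry an open that the server may stall with a wait response.

    try_open() returns either ("ok", handle) or ("wait", seconds),
    mirroring the kXR_wait (status=4005) reply seen in the log.
    """
    waited = 0.0
    while True:
        status, value = try_open()
        if status == "ok":
            return value
        if waited + value > max_wait:
            raise TimeoutError("server kept stalling the open")
        time.sleep(value)  # honor the server-requested delay
        waited += value

# Simulated server: stall once, then succeed. The real stall was 5 s;
# 5 ms here so the example runs quickly.
responses = iter([("wait", 0.005), ("ok", "filehandle")])
print(open_with_wait(lambda: next(responses)))  # prints: filehandle
```

This matches the client trace further below, where a "WaitTask" is registered for exactly 5 seconds after the first open attempt.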


DATA SERVER:
root@dw ~ # tail -f /var/log/xrootd/xrootd.log
180419 13:27:28 15566 XrootdAioReq: Adding 18 aioreq objects.
180419 13:27:28 15566 XrootdAio: Adding 18 aio objects; 4096 pending.
180419 13:27:28 15566 XrdSched: scheduling xrootd protocol anchor in
3600 seconds
180419 13:27:28 15566 XrdSched: scheduling transit protocol anchor in
3600 seconds
Config warning: 'xrootd.prepare logdir' not specified; prepare tracking
disabled.
180419 13:27:28 15592 XrdXeq: Event notification sender thread started
180419 13:27:28 15595 XrdXeq: Admin traffic thread started
------ xrootd protocol initialization completed.
------ xrootd [log in to unmask]:1094 initialization completed.
180419 13:27:28 15573 XrdSched: running main accept inq=0
180419 13:36:57 15574 XrdSched: Now have 3 workers
180419 13:36:57 15574 XrdSched: running main accept inq=0
180419 13:36:57 15672 XrdXeq: Worker thread started
180419 13:36:57 15573 XrdInet: Accepted connection from
[log in to unmask]
180419 13:36:57 15573 XrdProtocol: matched protocol xrootd
180419 13:36:57 15573 ?:7@qc01 XrdPoll: FD 7 attached to poller 0; num=1
180419 13:36:57 15573 ?:7@qc01 XrootdProtocol: 0000 req=login dlen=97
180419 13:36:57 15573 sec_getParms: qc01.iup.uni-bremen.de sectoken=&P=unix
180419 13:36:57 15573 schroete.119547:7@qc01 XrootdResponse: 0000
sending 23 data bytes; status=0
180419 13:36:57 15573 schroete.119547:7@qc01 XrootdProtocol: 0000
req=auth dlen=21
sec_PM: Using unix protocol, args=''
180419 13:36:57 15573 schroete.119547:7@qc01 XrootdResponse: 0000 sending OK
180419 13:36:57 15573 XrootdXeq: schroete.119547:7@qc01 pvt IPv4 login
as schroete
180419 13:36:57 15573 schroete.119547:7@qc01 XrootdProtocol: 0100
req=stat dlen=17
180419 13:36:57 15573 schroete.119547:7@qc01 ofs_stat: fn=/xrootd/schroete/
180419 13:36:57 15573 schroete.119547:7@qc01 XrootdProtocol: 0100 rc=0
stat /xrootd/schroete/
180419 13:36:57 15573 schroete.119547:7@qc01 XrootdResponse: 0100
sending 38 data bytes
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdProtocol: 0100
req=open dlen=51
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdProtocol: 0100 open
unmat /xrootd/schroete//hdscia.txt.001?oss.asize=12353536
180419 13:37:02 15573 schroete.119547:7@qc01 ofs_open: 102-40644
fn=/xrootd/schroete/hdscia.txt.001
180419 13:37:02 15573 schroete.119547:7@qc01 oss_Open_ufs: fd=2048
flags=202 mode=40644 path=/xrootd/schroete/hdscia.txt.001
180419 13:37:02 15573 schroete.119547:7@qc01 ofs_fstat:
fn=/xrootd/schroete/hdscia.txt.001
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdResponse: 0100
sending 47 data bytes; status=0
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdProtocol: 0100
req=write dlen=12353536
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdProtocol: 0100 fh=0
write 12353536@0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@0 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@262144 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@524288 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@786432 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@1048576 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@1310720 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@1572864 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@1835008 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@2097152 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@2359296 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@2621440 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@2883584 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@3145728 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@3407872 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@3670016 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@3932160 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@4194304 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@4456448 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@4718592 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@4980736 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@5242880 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@5505024 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@5767168 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@6029312 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@6291456 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@6553600 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@6815744 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@7077888 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@7340032 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@7602176 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@7864320 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@8126464 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@8388608 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@8650752 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@8912896 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@9175040 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@9437184 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@9699328 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@9961472 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@10223616 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@10485760 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@10747904 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@11010048 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@11272192 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@11534336 started; aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@11796480 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658fd0
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
262144@12058624 started; aiocb=0x1658ef8
180419 13:37:02 15573 schroete.119547:7@qc01 oss_AioWrite: Write
32768@12320768 started; aiocb=0x1658fd0
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=262144 aiocb=0x1658ef8
180419 13:37:02 15580 oss_AioWait: write completed for
schroete.119547:7@qc01; rc=0 result=32768 aiocb=0x1658fd0
180419 13:37:02 15672 XrdSched: Now have 4 workers
180419 13:37:02 15672 XrdSched: running aio request inq=0
180419 13:37:02 15672 schroete.119547:7@qc01 XrootdResponse: 0100 sending OK
180419 13:37:02 15677 XrdXeq: Worker thread started
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdProtocol: 0100
req=close dlen=0
180419 13:37:02 15573 schroete.119547:7@qc01 ofs_close: use=1
fn=/xrootd/schroete/hdscia.txt.001
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdProtocol: 0100 close
rc=0 fh=0
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdFile: closing w
/xrootd/schroete/hdscia.txt.001
180419 13:37:02 15573 schroete.119547:7@qc01 ofs_close: use=0 fn=dummy
180419 13:37:02 15573 schroete.119547:7@qc01 XrootdResponse: 0100 sending OK
180419 13:37:02 15573 XrootdXeq: schroete.119547:7@qc01 disc 0:00:05
180419 13:37:02 15573 schroete.119547:7@qc01 XrdPoll: FD 7 detached from
poller 0; num=0
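The alternating aiocb addresses (0x1658ef8 / 0x1658fd0) in the data-server log show double buffering: while one 262144-byte chunk is in flight, the next is started in the other control block, so at most two writes are outstanding at once. The 12353536-byte file works out to 47 full chunks plus a 32768-byte tail, matching the last two writes in the log. A rough sketch of that pattern (illustrative Python, not the actual OSS aio code; a 2-chunk file keeps the demo short):

```python
from concurrent.futures import ThreadPoolExecutor

CHUNK = 262144  # write size seen in the oss_AioWrite lines

def aio_write_file(data, write_chunk):
    """Write data in CHUNK-sized pieces with at most two writes in
    flight, alternating between two slots like the two aiocb
    addresses in the log."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        pending = []
        offset = 0
        while offset < len(data):
            piece = data[offset:offset + CHUNK]
            if len(pending) == 2:        # both slots busy: reap the oldest
                pending.pop(0).result()
            pending.append(pool.submit(write_chunk, offset, piece))
            offset += len(piece)
        for f in pending:                # drain the remaining writes
            f.result()

written = {}
aio_write_file(b"x" * (2 * CHUNK + 32768),
               lambda off, piece: written.__setitem__(off, len(piece)))
print(sorted(written.items()))
# prints: [(0, 262144), (262144, 262144), (524288, 32768)]
```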


CLIENT debug 2:
schroete@qc01 ~ $ xrdcp --debug 2 hdscia.txt.001
xroot://glogin1//xrootd/schroete/
[2018-04-19 13:36:57.277728 +0200][Debug  ][Poller            ]
Available pollers: built-in
[2018-04-19 13:36:57.277832 +0200][Debug  ][Poller            ]
Attempting to create a poller according to preference: built-in
[2018-04-19 13:36:57.277861 +0200][Debug  ][Poller            ] Creating
poller: built-in
[2018-04-19 13:36:57.277888 +0200][Debug  ][Poller            ] Creating
and starting the built-in poller...
[2018-04-19 13:36:57.278089 +0200][Debug  ][Poller            ] Using 1
poller threads
[2018-04-19 13:36:57.278108 +0200][Debug  ][TaskMgr           ] Starting
the task manager...
[2018-04-19 13:36:57.278180 +0200][Debug  ][TaskMgr           ] Task
manager started
[2018-04-19 13:36:57.278198 +0200][Debug  ][JobMgr            ] Starting
the job manager...
[2018-04-19 13:36:57.278335 +0200][Debug  ][JobMgr            ] Job
manager started, 3 workers
[2018-04-19 13:36:57.278356 +0200][Debug  ][TaskMgr           ]
Registering task: "FileTimer task" to be run at: [2018-04-19 13:36:57 +0200]
[2018-04-19 13:36:57.278482 +0200][Debug  ][PostMaster        ] Creating
new channel to: glogin1:1094 1 stream(s)
[2018-04-19 13:36:57.278517 +0200][Debug  ][PostMaster        ]
[glogin1:1094 #0] Stream parameters: Network Stack: IPAuto, Connection
Window: 120, ConnectionRetry: 5, Stream Error Widnow: 1800
[2018-04-19 13:36:57.279011 +0200][Debug  ][TaskMgr           ]
Registering task: "TickGeneratorTask for: glogin1:1094" to be run at:
[2018-04-19 13:37:12 +0200]
[2018-04-19 13:36:57.279471 +0200][Debug  ][PostMaster        ]
[glogin1:1094] Found 1 address(es): [::ffff:192.168.16.124]:1094
[2018-04-19 13:36:57.279511 +0200][Debug  ][AsyncSock         ]
[glogin1:1094 #0.0] Attempting connection to [::ffff:192.168.16.124]:1094
[2018-04-19 13:36:57.279570 +0200][Debug  ][Poller            ] Adding
socket 0x16cda00 to the poller
[2018-04-19 13:36:57.279867 +0200][Debug  ][AsyncSock         ]
[glogin1:1094 #0.0] Async connection call returned
[2018-04-19 13:36:57.279977 +0200][Debug  ][XRootDTransport   ]
[glogin1:1094 #0.0] Sending out the initial hand shake + kXR_protocol
[2018-04-19 13:36:57.281634 +0200][Debug  ][XRootDTransport   ]
[glogin1:1094 #0.0] Got the server hand shake response (type: manager
[], protocol version 310)
[2018-04-19 13:36:57.281665 +0200][Debug  ][XRootDTransport   ]
[glogin1:1094 #0.0] kXR_protocol successful (type: manager [], protocol
version 310)
[2018-04-19 13:36:57.283464 +0200][Debug  ][XRootDTransport   ]
[glogin1:1094 #0.0] Sending out kXR_login request, username: schroete,
cgi:
?xrd.cc=de&xrd.tz=1&xrd.appname=xrdcp&xrd.info=&xrd.hostname=qc01.iup.uni-bremen.de&xrd.rn=v4.8.2,
dual-stack: false, private IPv4: true, private IPv6: false
[2018-04-19 13:36:57.283977 +0200][Debug  ][XRootDTransport   ]
[glogin1:1094 #0.0] Logged in, session: 010000006c4800001200000002000000
[2018-04-19 13:36:57.284003 +0200][Debug  ][PostMaster        ]
[glogin1:1094 #0] Stream 0 connected.
[2018-04-19 13:36:57.284036 +0200][Debug  ][Utility           ] Monitor
library name not set. No monitoring
[2018-04-19 13:36:57.286195 +0200][Debug  ][PostMaster        ] Creating
new channel to: 192.168.16.122:1094 1 stream(s)
[2018-04-19 13:36:57.286253 +0200][Debug  ][PostMaster        ]
[192.168.16.122:1094 #0] Stream parameters: Network Stack: IPAuto,
Connection Window: 120, ConnectionRetry: 5, Stream Error Widnow: 1800
[2018-04-19 13:36:57.286803 +0200][Debug  ][TaskMgr           ]
Registering task: "TickGeneratorTask for: 192.168.16.122:1094" to be run
at: [2018-04-19 13:37:12 +0200]
[2018-04-19 13:36:57.286895 +0200][Debug  ][PostMaster        ]
[192.168.16.122:1094] Found 1 address(es): [::ffff:192.168.16.122]:1094
[2018-04-19 13:36:57.286924 +0200][Debug  ][AsyncSock         ]
[192.168.16.122:1094 #0.0] Attempting connection to
[::ffff:192.168.16.122]:1094
[2018-04-19 13:36:57.286968 +0200][Debug  ][Poller            ] Adding
socket 0x80014c0 to the poller
[2018-04-19 13:36:57.287166 +0200][Debug  ][AsyncSock         ]
[192.168.16.122:1094 #0.0] Async connection call returned
[2018-04-19 13:36:57.287205 +0200][Debug  ][XRootDTransport   ]
[192.168.16.122:1094 #0.0] Sending out the initial hand shake + kXR_protocol
[2018-04-19 13:36:57.288258 +0200][Debug  ][XRootDTransport   ]
[192.168.16.122:1094 #0.0] Got the server hand shake response (type:
server [], protocol version 310)
[2018-04-19 13:36:57.288290 +0200][Debug  ][XRootDTransport   ]
[192.168.16.122:1094 #0.0] kXR_protocol successful (type: server [],
protocol version 310)
[2018-04-19 13:36:57.289853 +0200][Debug  ][XRootDTransport   ]
[192.168.16.122:1094 #0.0] Sending out kXR_login request, username:
schroete, cgi:
?xrd.cc=de&xrd.tz=1&xrd.appname=xrdcp&xrd.info=&xrd.hostname=qc01.iup.uni-bremen.de&xrd.rn=v4.8.2,
dual-stack: false, private IPv4: true, private IPv6: false
[2018-04-19 13:36:57.290306 +0200][Debug  ][XRootDTransport   ]
[192.168.16.122:1094 #0.0] Logged in, session:
01000000c43c00000700000001000000
[2018-04-19 13:36:57.290322 +0200][Debug  ][XRootDTransport   ]
[192.168.16.122:1094 #0.0] Authentication is required: &P=unix
[2018-04-19 13:36:57.290347 +0200][Debug  ][XRootDTransport   ]
[192.168.16.122:1094 #0.0] Sending authentication data
[2018-04-19 13:36:57.306809 +0200][Debug  ][XRootDTransport   ]
[192.168.16.122:1094 #0.0] Trying to authenticate using unix
[2018-04-19 13:36:57.308135 +0200][Debug  ][XRootDTransport   ]
[192.168.16.122:1094 #0.0] Authenticated with unix.
[2018-04-19 13:36:57.308157 +0200][Debug  ][PostMaster        ]
[192.168.16.122:1094 #0] Stream 0 connected.
[2018-04-19 13:36:57.308995 +0200][Debug  ][Utility           ]
CopyProcess: 2 jobs to prepare
[2018-04-19 13:36:57.309066 +0200][Debug  ][Utility           ] Creating
a classic copy job, from
file://localhost/home/scia/schroete/hdscia.txt.001 to
xroot://glogin1:1094//xrootd/schroete//hdscia.txt.001
[2018-04-19 13:36:57.309178 +0200][Debug  ][Utility           ] Opening
file://localhost/home/scia/schroete/hdscia.txt.001 for reading
[2018-04-19 13:36:57.309233 +0200][Debug  ][File              ]
[0x16ce690@file://localhost/home/scia/schroete/hdscia.txt.001] Sending
an open command
[2018-04-19 13:36:57.310681 +0200][Debug  ][File              ]
[0x16ce690@file://localhost/home/scia/schroete/hdscia.txt.001] Open has
returned with status [SUCCESS]
[2018-04-19 13:36:57.310703 +0200][Debug  ][File              ]
[0x16ce690@file://localhost/home/scia/schroete/hdscia.txt.001]
successfully opened at localhost, handle: 0x220d, session id: 1
[2018-04-19 13:36:57.310791 +0200][Debug  ][Utility           ] Opening
xroot://glogin1:1094//xrootd/schroete//hdscia.txt.001?oss.asize=12353536
for writing
[2018-04-19 13:36:57.310863 +0200][Debug  ][File              ]
[0x16d3500@xroot://glogin1:1094//xrootd/schroete//hdscia.txt.001?oss.asize=12353536]
Sending an open command
[2018-04-19 13:36:57.312306 +0200][Debug  ][TaskMgr           ]
Registering task: "WaitTask for: 0x0x16d3bc0" to be run at: [2018-04-19
13:37:02 +0200]
[2018-04-19 13:37:02.278686 +0200][Debug  ][TaskMgr           ] Done
with task: "WaitTask for: 0x0x16d3bc0"
[2018-04-19 13:37:02.310076 +0200][Debug  ][File              ]
[0x16d3500@xroot://glogin1:1094//xrootd/schroete//hdscia.txt.001?oss.asize=12353536]
Open has returned with status [SUCCESS]
[2018-04-19 13:37:02.310129 +0200][Debug  ][File              ]
[0x16d3500@xroot://glogin1:1094//xrootd/schroete//hdscia.txt.001?oss.asize=12353536]
successfully opened at 192.168.16.122:1094, handle: 0x0, session id: 1
[2018-04-19 13:37:02.310233 +0200][Debug  ][File              ]
[0x16ce690@file://localhost/home/scia/schroete/hdscia.txt.001] Sending a
read command for handle 0x220d to localhost
[2018-04-19 13:37:02.327820 +0200][Debug  ][File              ]
[0x16d3500@xroot://glogin1:1094//xrootd/schroete//hdscia.txt.001?oss.asize=12353536]
Sending a write command for handle 0x0 to 192.168.16.122:1094
[11.78MB/11.78MB][100%][==================================================][2.356MB/s]
[2018-04-19 13:37:02.437643 +0200][Debug  ][File              ]
[0x16d3500@xroot://glogin1:1094//xrootd/schroete//hdscia.txt.001?oss.asize=12353536]
Sending a close command for handle 0x0 to 192.168.16.122:1094
[2018-04-19 13:37:02.441380 +0200][Debug  ][File              ]
[0x16d3500@xroot://glogin1:1094//xrootd/schroete//hdscia.txt.001?oss.asize=12353536]
Close returned from 192.168.16.122:1094 with: [SUCCESS]
[2018-04-19 13:37:02.441485 +0200][Debug  ][File              ]
[0x16ce690@file://localhost/home/scia/schroete/hdscia.txt.001] Sending a
close command for handle 0x220d to localhost
[2018-04-19 13:37:02.441595 +0200][Debug  ][File              ]
[0x16ce690@file://localhost/home/scia/schroete/hdscia.txt.001] Close
returned from localhost with: [SUCCESS]
[11.78MB/11.78MB][100%][==================================================][2.356MB/s]
[2018-04-19 13:37:02.441830 +0200][Debug  ][JobMgr            ] Stopping
the job manager...
[2018-04-19 13:37:02.442219 +0200][Debug  ][JobMgr            ] Job
manager stopped
[2018-04-19 13:37:02.442265 +0200][Debug  ][TaskMgr           ] Stopping
the task manager...
[2018-04-19 13:37:02.442379 +0200][Debug  ][TaskMgr           ] Task
manager stopped
[2018-04-19 13:37:02.442410 +0200][Debug  ][Poller            ] Stopping
the poller...
[2018-04-19 13:37:02.442594 +0200][Debug  ][TaskMgr           ]
Requesting unregistration of: "TickGeneratorTask for: 192.168.16.122:1094"
[2018-04-19 13:37:02.442627 +0200][Debug  ][AsyncSock         ]
[192.168.16.122:1094 #0.0] Closing the socket
[2018-04-19 13:37:02.442658 +0200][Debug  ][Poller            ]
<[::ffff:192.168.16.170]:33020><--><[::ffff:192.168.16.122]:1094>
Removing socket from the poller
[2018-04-19 13:37:02.442703 +0200][Debug  ][PostMaster        ]
[192.168.16.122:1094 #0] Destroying stream
[2018-04-19 13:37:02.442729 +0200][Debug  ][AsyncSock         ]
[192.168.16.122:1094 #0.0] Closing the socket
[2018-04-19 13:37:02.442760 +0200][Debug  ][TaskMgr           ]
Requesting unregistration of: "TickGeneratorTask for: glogin1:1094"
[2018-04-19 13:37:02.442783 +0200][Debug  ][AsyncSock         ]
[glogin1:1094 #0.0] Closing the socket
[2018-04-19 13:37:02.442807 +0200][Debug  ][Poller            ]
<[::ffff:192.168.16.170]:45172><--><[::ffff:192.168.16.124]:1094>
Removing socket from the poller
[2018-04-19 13:37:02.442847 +0200][Debug  ][PostMaster        ]
[glogin1:1094 #0] Destroying stream
[2018-04-19 13:37:02.442871 +0200][Debug  ][AsyncSock         ]
[glogin1:1094 #0.0] Closing the socket


On 19.04.2018 10:06, Michal Kamil Simon wrote:
> Hi Heiko,
>
> That's interesting, could you give me more details on your scenario,
> are you using xrdcp from a script or XrdCl C++ API (or Python bindings)?
>
> Could you also provide client side logs from a run when you observed
> stalling?
>
> Cheers,
> Michal
> ________________________________________
> From: [log in to unmask] [[log in to unmask]] on behalf of Heiko Schröter [[log in to unmask]]
> Sent: 18 April 2018 20:26
> To: [log in to unmask]
> Cc: [log in to unmask]; [log in to unmask]
> Subject: Re: Stalling client when copying files (xrdcp 4.8.2)
>
> This stalling does not occur with the 4.8.3-rc1.
>
>
> On 17.04.2018 19:24, Heiko Schröter wrote:
>> Hello,
>>
>> we observe that when copying a file the client is stalled for some time.
>>
>> 180417 19:17:28 17252 schroete.97360:7@qc08 XrootdProtocol: 0100
>> stalling client for 5 sec
>>
>> Sometimes it is for 10 sec, and this repeats without a recognizable
>> pattern.
>>
>> The client is not stalled when the copied file is removed at once and
>> recopied.
>>
>> It looks like a similar issue as this one:
>> https://listserv.slac.stanford.edu/cgi-bin/wa?A2=ind1203&L=XROOTD-L&P=R598&1=XROOTD-L&9=A&I=-3&J=on&d=No+Match%3BMatch%3BMatches&z=4
>>
>>
>> Is this a settable parameter, or something we did wrong in our setup?
>>
>> We have one redirector and 12 data servers on a 10 Gbit network. Client
>> access is very limited because this is a test setup.
>>
>> Regards
>> Heiko
>>

--
-----------------------------------------------------------------------
Dipl.-Ing. Heiko Schröter
Institute of Environmental Physics (IUP) phone: ++49-(0)421-218-62092
Institute of Remote Sensing (IFE) fax: ++49-(0)421-218-62070
University of Bremen (FB1)
P.O. Box 330440 email: [log in to unmask]
Otto-Hahn-Allee 1
28359 Bremen
Germany
-----------------------------------------------------------------------

########################################################################
Use REPLY-ALL to reply to list

To unsubscribe from the XROOTD-L list, click the following link:
https://listserv.slac.stanford.edu/cgi-bin/wa?SUBED1=XROOTD-L&A=1
