HPS-SOFTWARE Archives

HPS-SOFTWARE@LISTSERV.SLAC.STANFORD.EDU

HPS-SOFTWARE June 2017

Subject: Re: [Hps-analysis] Fwd: [Clas_offline] Fwd: ENP consumption of disk space under /work
From: Nathan Baltzell <[log in to unmask]>
Reply-To: Software for the Heavy Photon Search Experiment <[log in to unmask]>
Date: Thu, 1 Jun 2017 13:23:19 -0400
Content-Type: text/plain
Parts/Attachments: text/plain (170 lines)

Here’s the most relevant usage (a sketch for regenerating this kind of summary follows the listings):

649G	mrsolt/
570G	sebouh/
459G	mc_production/
228G	holly
159G	mccaky/
78G	rafopar/
45G	omoreno/
44G	spaul
39G	fxgirod
34G	jeremym

data/engrun2015:
3.2T	tweakpass6
50G	tweakpass6fail
64G	tpass7
2.4G	tpass7b
39G	tpass7c
6.5G	t_tweakpass_a
373G    pass6/skim
201G    pass6/dst

data/physrun2016:
3.5T	pass0
690G	feeiter4
94M	feeiter0
327M	feeiter1
339M	feeiter2
338M	feeiter3
15G	noPass
24G	pass0_allign
52G	pass0fail
4.5G	tmp_test
281G	tpass1
11G	upass0
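
For reference, here is a minimal Python sketch that could regenerate
per-directory summaries like those above (roughly the equivalent of
"du -sh */"); the top-level path is an illustrative assumption, not
the actual layout:

  #!/usr/bin/env python
  # Sketch: summarize apparent per-directory usage, similar to "du -sh */".
  # The starting path below is hypothetical.
  import os

  def dir_size(path):
      """Sum apparent file sizes under path, skipping unreadable entries."""
      total = 0
      for root, dirs, files in os.walk(path, onerror=lambda e: None):
          for name in files:
              try:
                  total += os.lstat(os.path.join(root, name)).st_size
              except OSError:
                  pass
      return total

  def human(nbytes):
      """Render a byte count with a short unit suffix (K, M, G, T)."""
      size = float(nbytes)
      for unit in ("B", "K", "M", "G", "T"):
          if size < 1024:
              return "%.0f%s" % (size, unit)
          size /= 1024.0
      return "%.1fP" % size

  TOP = "/work/hallb/hps"  # hypothetical starting point
  for entry in sorted(os.listdir(TOP)):
      full = os.path.join(TOP, entry)
      if os.path.isdir(full):
          print("%s\t%s/" % (human(dir_size(full)), entry))

Note this sums apparent file sizes rather than allocated blocks, so the
numbers can differ slightly from what du reports.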




On Jun 1, 2017, at 11:05, Stepan Stepanyan <[log in to unmask]> wrote:

> FYI, we need to move files.
> 
> Stepan
> 
>> Begin forwarded message:
>> 
>> From: Harut Avakian <[log in to unmask]>
>> Subject: [Clas_offline] Fwd: ENP consumption of disk space under /work
>> Date: June 1, 2017 at 5:01:24 PM GMT+2
>> To: "[log in to unmask]" <[log in to unmask]>
>> 
>> 
>> 
>> 
>> Dear All,
>> 
>> As you can see from the e-mail below, keeping all of our work disk space requires some additional funding.
>> Option 3 will inevitably impact farm operations by removing ~20% of the space from Lustre.
>> 
>> We can also choose something between options 1) and 3).
>> Please review the contents and move at least 75% of what is in /work/clas to either /cache or /volatile.
>> The current Hall-B usage includes:
>> 550G    hallb/bonus
>> 1.5T    hallb/clase1
>> 3.6T    hallb/clase1-6
>> 3.3T    hallb/clase1dvcs
>> 2.8T    hallb/clase1dvcs2
>> 987G    hallb/clase1f
>> 1.8T    hallb/clase2
>> 1.6G    hallb/clase5
>> 413G    hallb/clase6
>> 2.2T    hallb/claseg1
>> 3.9T    hallb/claseg1dvcs
>> 1.2T    hallb/claseg3
>> 4.1T    hallb/claseg4
>> 2.7T    hallb/claseg5
>> 1.7T    hallb/claseg6
>> 367G    hallb/clas-farm-output
>> 734G    hallb/clasg10
>> 601G    hallb/clasg11
>> 8.1T    hallb/clasg12
>> 2.4T    hallb/clasg13
>> 2.4T    hallb/clasg14
>> 28G    hallb/clasg3
>> 5.8G    hallb/clasg7
>> 269G    hallb/clasg8
>> 1.2T    hallb/clasg9
>> 1.3T    hallb/clashps
>> 1.8T    hallb/clas-production
>> 5.6T    hallb/clas-production2
>> 1.4T    hallb/clas-production3
>> 12T    hallb/hps
>> 13T    hallb/prad
>> 
>> Regards,
>> 
>> Harut
>> 
>> P.S. We have had crashes a few times, and they may happen again in the future, so keeping important files in /work is not recommended.
>> You can see the lists of lost files in /site/scicomp/lostfiles.txt and /site/scicomp/lostfiles-jan-2017.txt
>> 
>> 
>> 
>> -------- Forwarded Message --------
>> Subject:	ENP consumption of disk space under /work
>> Date:	Wed, 31 May 2017 10:35:51 -0400
>> From:	Chip Watson <[log in to unmask]>
>> To:	Sandy Philpott <[log in to unmask]>, Graham Heyes <[log in to unmask]>, Ole Hansen <[log in to unmask]>, Harut Avakian <[log in to unmask]>, Brad Sawatzky <[log in to unmask]>, Mark M. Ito <[log in to unmask]>
>> 
>> All,
>> 
>> As I have started on the procurement of the new /work file server, I 
>> have discovered that Physics' use of /work has grown unrestrained over 
>> the last year or two.
>> 
>> "Unrestrained" because there is no way under Lustre to restrain it 
>> except via a very unfriendly Lustre quota system.  As we leave some 
>> quota headroom to accommodate large swings in usage for each hall for 
>> cache and volatile, then /work continues to grow.
>> 
>> Total /work has now reached 260 TB, several times larger than I was
>> anticipating.  This constitutes more than 25% of Physics' share of
>> Lustre, compared to LQCD, which uses less than 5% of its disk space
>> on the un-managed /work.
>> 
>> It would cost Physics an extra $25K (total $35K - $40K) to treat the 260 
>> TB as a requirement.
>> 
>> There are 3 paths forward:
>> 
>> (1) Physics cuts its use of /work by a factor of 4-5.
>> (2) Physics increases funding to $40K.
>> (3) We pull a server out of Lustre, decreasing Physics' share of the
>> system, and use it as half of the new active-active pair, beefing it
>> up with SSDs and perhaps additional memory; this would actually shrink
>> Physics' near-term costs, but it puts higher pressure on the file
>> system for the farm.
>> 
>> The decision is clearly Physics', but I do need a VERY FAST response to 
>> this question, as I need to move quickly now for LQCD's needs.
>> 
>> Hall D + GlueX:  96 TB
>> CLAS + CLAS12:   98 TB
>> Hall C:          35 TB
>> Hall A:          <unknown, still scanning>
>> 
>> Email, call (x7101), or drop by today 1:30-3:00 p.m. for discussion.
>> 
>> thanks,
>> Chip
>> 
>> 
>> _______________________________________________
>> Clas_offline mailing list
>> [log in to unmask]
>> https://mailman.jlab.org/mailman/listinfo/clas_offline
> 
> _______________________________________________
> Hps-analysis mailing list
> [log in to unmask]
> https://mailman.jlab.org/mailman/listinfo/hps-analysis
