Looks like all the refit vertex files from tweak pass 6 are on tape already: /mss/hallb/hps/engrun2015/tweakpass6/recon/vtxRefit/

I’m deleting these files right now from /work.
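
For anyone doing the same kind of check before deleting, a minimal sketch (the /work path below is a guess; substitute the real directory). At JLab, /mss holds small stub files mirroring what is on tape, so a name-by-name comparison is a quick sanity check:

  cd /work/hallb/hps/tkp6_refit   # hypothetical /work location of these files
  for f in *; do
    # Flag anything with no matching stub under /mss.
    [ -e "/mss/hallb/hps/engrun2015/tweakpass6/recon/vtxRefit/$f" ] || echo "not on tape: $f"
  done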

-Holly

> On Dec 11, 2019, at 9:05 PM, maurik <[log in to unmask]> wrote:
> 
> Thank you Holly for cleaning that up. 
> 
> Do you know if the “tkp6_refit” files are still relevant for anyone?
> 
> - Maurik
> 
>> On Dec 11, 2019, at 8:20 PM, Holly Vance <[log in to unmask]> wrote:
>> 
>> Hi Maurik,
>> 
>> Thanks! I forgot about the lgmc_tuples (these were large MC files for 2015 vertex background studies).
>> 
>> They are now on tape (/mss/hallb/hps/production/holly/) in case anyone needs the large background MC sample for 2015 vertex studies, and they have been deleted from /work.
>> 
>> -Holly
>> 
>>> On Dec 11, 2019, at 5:20 PM, maurik <[log in to unmask]> wrote:
>>> 
>>> Dear HPS,
>>> 
>>> As was brought up in the meeting today, we need to clean up our usage of the work disk to make space for new activities. 
>>> 
>>> I made an attempt to indicate what is what in the table below, which is an annotated output of “du -s -h -c * | sort -h” on the work disk. As you can see, many of the directories hold either 2019 calibration work or 2016 data analysis work. However, there also seem to be quite a few legacy files on the disk.
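>>> 
>>> For anyone repeating this survey, a minimal sketch of that command as it was run (comments added here):
>>> 
>>>   cd /work/hallb/hps
>>>   # Per-entry totals (-s), human-readable sizes (-h), grand total (-c),
>>>   # sorted smallest to largest on the human-readable sizes (sort -h).
>>>   du -s -h -c * | sort -h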
>>> 
>>> Perhaps Bradley and Tongtong can discuss backing up and cleaning up the mc_production directory (purple). 
>>> Perhaps Holly and Sebouh can discuss backing up and cleaning up the directories marked in red. 
>>> 
>>> /work/hallb/hps disk space use (>100MB): 
>>>  
>>> 270M	phansson
>>> 325M	mskolnik
>>> 414M	gkalicy
>>> 419M	baltzell
>>> 1.2G	tvm
>>> 1.3G	holly
>>> 3.4G	uemura
>>> 4.7G	mgraham
>>> 5.6G	lmarsicano
>>> 14G	jeremym
>>> 33G	fxgirod
>>> 79G	mrsolt
>>> 84G	omoreno
>>> 171G	sebouh                 - All files are older than 1 year.    
>>> 242G	byale                  - Almost all files are older than half a year.
>>> 310G	verylg_tritrig (mrsolt) - All files are older than 1 year.
>>> 329G	mccaky                 - Mostly recent Aprime MC files.
>>> 564G	lgmc_tuples (hszumila) - MC tuples, all older than 1 year.
>>> 647G	rafopar
>>>                                - Data 182 GB - MC root files for 2016 data analysis. Recent files.
>>>                                - PhysRun2019 431 GB - looks like mostly hodoscope MC. Recent files.
>>> 951G	celentan               - FEE2019 945 GB     - Current calibration output.
>>> 984G	mc_production  (hps)   - Some current, some old, and some very old MC output:
>>>                                - 23G	tweakPass6_ApReconAtneg5mm
>>>                                - 36G	BeamTilt
>>>                                - 56G	alphaFix
>>>                                - 56G	SLAC
>>>                                - 134G	zeroBeamWidth
>>>                                - 262G	PhysicsRun2016
>>>                                - 403G	MG_alphaFix
>>> 
>>> 1015G	data           (hps)
>>>                                - 851 GB - 2015 tweakpass6 DST root files.
>>>                                - 158 GB - 2016 old passes (Calibpass4b, pass4fail, pass1_allign) 
>>> 1.3T	tkp6_refit (hszumila)  - Tuples from 2018.
>>> 2.2T	ngraf                  - Mostly FEE filtered files for 2019 calibration. Perhaps this can move to tape and /cache? (See the sketch below the table.)
>>> 
>>> 8.7T	total
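>>> 
>>> For the move-to-tape idea flagged above, a rough sketch using JLab's jput (usage from memory, so check jput's own help first; both paths are illustrative):
>>> 
>>>   # Copy the filtered FEE files to tape; stubs appear under /mss once the copy lands.
>>>   jput /work/hallb/hps/ngraf/*.root /mss/hallb/hps/ngraf/
>>>   # After verifying the stubs, the /work copies can be deleted and the files
>>>   # staged back to read-only /cache (e.g. with jcache) when needed.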
>>> 
>>> Note: file ages were determined with “find . -ctime -365”, which checks whether any files had a status change within the last 365 days. The “-atime” (access time) test does not work reliably on these disks: reading a file does not update its access time.
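>>> 
>>> For example, run against one of the directories above (-ctime is the last status-change time, the closest reliable proxy here):
>>> 
>>>   # Print regular files whose status changed within the last 365 days;
>>>   # empty output means nothing was touched in the past year.
>>>   find /work/hallb/hps/sebouh -type f -ctime -365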
>>> 
>>> Thanks,
>>> 	Maurik
>>> 
> 


########################################################################
Use REPLY-ALL to reply to list

To unsubscribe from the HPS-SOFTWARE list, click the following link:
https://listserv.slac.stanford.edu/cgi-bin/wa?SUBED1=HPS-SOFTWARE&A=1