Hi Matt,
I went ahead and modified the page. It is at:
https://confluence.slac.stanford.edu/display/hpsg/JLAB+computing+resources+accessible+to+HPS
Cheers,
Homer
On Thu, 12 Apr 2012, Homer wrote:
> Hi Matt,
>
> Some of this is documented on the HPS wiki JLAB resources pages.
> Could you please fill in the holes and correct/improve where needed.
>
> Thanks,
> Homer
>
>
> On Thu, 12 Apr 2012, Graham, Mathew Thomas wrote:
>
>>
>> Hi All...
>>
>> I thought I'd send a quick rundown of some useful stuff for traversing the
>> computing landscape (*nix) at JLAB.
>>
>> First off, if you want to log in to any of the Linux machines from a
>> non-JLAB machine, you need to go through the login portal:
>> login.jlab.org
>>
>> ...you can't really do anything from this login (really, not even copy a
>> file). You need to log into one of the interactive machines like:
>> jlablX.jlab.org (X=1-5)
>>
>> or for prepping farm work
>> ifarml64.jlab.org
>>
>> The firewall at JLAB is pretty strict. Most domains are blocked by default
>> for cvs or svn access (http seems to be ok mostly). You can get around
>> this by setting up an ssh tunnel through a more accommodating machine. For
>> example, to do this for the freehep cvs, first log into (e.g.) jlabl4 and
>> do:
>>
>> ssh -L localhost:2401:cvs.freehep.org:2401 [log in to unmask]
>>
>> ...this will give you an ssh tunnel through noric at SLAC to localhost
>> (jlabl4) on port 2401 (the default cvs pserver port). Leave this
>> connection open! Then, log in to jlabl4 again and check out your cvs package
>> through localhost, e.g.:
>>
>> cvs -d :pserver:mgraham@localhost:/cvs/lcd co lcsim
>>
>> For svn, the default port is 3690, so the tunnel command is:
>> ssh -L localhost:3690:svn.freehep.org:3690 [log in to unmask]
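>> The two recipes above differ only in the port and the repository host. A
>> minimal sketch of a reusable helper (the gateway host below is a
>> placeholder, since the real address is redacted in this archive; substitute
>> your own off-site machine, e.g. your SLAC login host):

```shell
#!/bin/sh
# Build the ssh port-forward command for a given service port and
# repository host. GATEWAY is hypothetical; replace it with your own
# accommodating off-site machine.
GATEWAY="user@gateway.example.org"

tunnel_cmd() {
    # $1 = service port (2401 for cvs pserver, 3690 for svnserve)
    # $2 = remote repository host
    echo "ssh -L localhost:$1:$2:$1 $GATEWAY"
}

tunnel_cmd 2401 cvs.freehep.org   # prints the cvs tunnel command
tunnel_cmd 3690 svn.freehep.org   # prints the svn tunnel command
```

>> Run the printed command on an interactive machine (e.g. jlabl4), leave it
>> open, then point cvs/svn at localhost as shown above.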
>>
>> HPS has a group account at JLAB called clashps. If you need access to it,
>> let FX know and he'll add your ssh public key to the list (there is no
>> password, at least as far as we are concerned). I think this is intended
>> to be the "official" user for hps stuff...we'll put code releases under
>> this user and run production from this account.
>>
>> * /u/home/clashps has 2 GB, backed up
>> * /work/clas/clashps: 1 TB of permanent storage (but not backed up)
>> * /volatile/clas/clashps: 1 TB for staging
>> * the two above are only visible from the ifarml64 machines...not jlablX
>> * /u/group/hps has 20 GB (backed up) and is visible from farm and jlabl
>> machines
>> * also mounted on clonusr3 (our online machine) at /misc/hps
>> * we need a generic hps online account on this machine...talk to Sergey
>>
>> I've installed the hps-java software in /u/group/hps/production (although
>> it's not up to date with all of the changes going on). This is probably
>> the place to keep the releases (backed up, visible all over). We should
>> discuss how this is going to work and where to put things. We can also
>> check out and build everything; I've installed maven (and netbeans), and you
>> can find them in ~clashps/bin.
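>> To use the shared tools you just need ~clashps/bin on your PATH. A minimal
>> sketch that avoids stacking duplicate entries on repeated logins (the
>> directory below is a stand-in for ~clashps/bin, which only resolves on the
>> JLAB machines):

```shell
#!/bin/sh
# Prepend a tools directory to PATH only if it is not already there.
# TOOLS is a stand-in for ~clashps/bin on the JLAB machines.
TOOLS="/home/clashps/bin"

path_prepend() {
    case ":$PATH:" in
        *":$1:"*) ;;              # already on PATH: do nothing
        *) PATH="$1:$PATH" ;;
    esac
}

path_prepend "$TOOLS"
path_prepend "$TOOLS"             # second call is a no-op
export PATH
```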
>>
>> Probably other stuff too...any questions/corrections or anything to add let
>> me know.
>>
>> Matt
>>
>> ps...I'd like to set this up so that everyone can at least read from the
>> clashps directories, but that's not the case now.
>>
>> ########################################################################
>> Use REPLY-ALL to reply to list
>>
>> To unsubscribe from the HPS-SOFTWARE list, click the following link:
>> https://listserv.slac.stanford.edu/cgi-bin/wa?SUBED1=HPS-SOFTWARE&A=1
>>
>