Cara you have re-ignited the perennial Mac v PC debate!!! You'll be asking about depositing raw diffraction data next ;)
Cheers
Ashley
Sent from my iPhone
On 23/01/2013, at 10:29 PM, Anastassis Perrakis <[log in to unmask]> wrote:
> I am of the opinion that the truth lies somewhere in between ...
>
> Here are my two cents based on personal experience ...
>
> For example, I am happy myself using a MacBook Pro, which is sufficient for all my activities, and has all software and data that I need.
> Thus, I am myself on the 'new' paradigm side, having a machine with "mainframe levels of storage and computing power" (I do not run git,
> but Time Machine on a Mac gives me the bits I need from the git idea - as far as I understand git, that is).
>
> In the department, we have about 20-25 scientists. These people need to "maintain" and be proficient in many software suites, many more than
> a traditional crystallographer (like me in my PhD time, for example) would need:
> vector design software for cloning, databases for keeping track of clones, sequence viewing software for their clones,
> interfaces for crystallisation and biophysical equipment, analysis suites like GraphPad Prism, Origin, KinTek (etc. etc.) for biophysical experiments ...
> ... and let's not forget SAXS software ... Our experience is that most of these people like to use a Windows workstation for these (the choice is free);
> others prefer a Mac, but that's not my point here. Much of that software also needs to be "maintained" by these people ...
> Also, for a variety of reasons which have to do with IT "support" restrictions, the Windows machines
> we have for them are mis-configured with ancient versions of the Windows "operating" system - still OK for many things, but not for really
> straightforward use of CCP4/Phenix ...
>
> My point here is that these people are less likely to be keen on the idea of also installing and running CCP4/Coot/Phenix/BUSTER on their machines
> (they do use PyMOL/YASARA/Chimera locally, though, since they can then copy/paste into their presentations and papers).
> So, we find it useful to keep an old-fashioned setup running in parallel: Linux boxes, hooked to Zalmans or really
> big or double LCDs, in a specific room ... People like these for data processing; all data from many years back are online, incremental backup is running, etc.
> For historical reasons we even run NFS/NIS there (I agree it's not a great choice if one were starting now).
>
> My conclusion and advice for labs or departments that have more than 5-6 people, and that do crystallography but not as their
> "full-time" business, is that besides personal PCs/Macs, a common room with a few relatively powerful machines with nice, big, double
> screens, likely also stereo, is useful for a few reasons:
>
> 1. Easier to make sure everybody is using the same software more or less
> 2. The same machine for everybody - not the situation where a new student gets a new machine at year 0, which is obsolete by graduation time at +3 years (... or +5, 6, 7 ...)
> 3. Mixing of people in the room and the ability to "look over the shoulder" of others - the point that my colleague Titia Sixma always favours,
> which has indeed proved great for teaching others and learning from others.
> 4. Centralised "real" backup, and availability of diffraction data online with fewer "mounts" ...
>
> For these machines, centralised user account information and 'home' sharing is in my view essential, as it allows one to "blindly" choose any of the
> machines that is available at the time ... and, being a Mac fan, I think Linux is better suited for that purpose, financially and practically ...
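[Editor's note: as a concrete illustration of such centralised 'home' sharing on Linux - a minimal sketch only; the hostname "crystalsrv" and the subnet are invented, and NIS or LDAP would still carry the account information itself - the server exports /home over NFS and every workstation mounts it, so any free machine looks the same to every user:]

```
# /etc/exports on the file server (hypothetical host "crystalsrv")
/home    192.168.1.0/24(rw,sync,no_subtree_check)

# /etc/fstab entry on each Linux workstation
crystalsrv:/home    /home    nfs    defaults,_netdev    0    0
```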
>
> That said, it reminds me that we need to update the OS, buy a few new machines, new LCDs ... argh.
>
> Sorry if this lecture was outside the scope of the original thread.
>
> Tassos
>
>
>
> On 23 Jan 2013, at 9:54, James Stroud wrote:
>
>> On Jan 22, 2013, at 11:20 PM, Nat Echols wrote:
>>> The real difficulty is integrating Macs into a
>>> Linux-centric environment, for example configuring NFS, NIS, etc.
>>
>> That's because NFS and NIS are antiquities left over from the days of mainframes. Distributed file systems and user information databases were designed for an environment of many workers and few machines, when the typical graphics workstation cost $50,000. These days, we argue about whether to spend an extra $200 on a $500 computer. We have moved to a new paradigm: many workers with many more machines, each machine having essentially mainframe levels of storage and computing power. In other words, instead of NFS, you should run git.
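[Editor's note: a hedged sketch of that git-instead-of-NFS idea - repository name and server path are invented for illustration. Each workstation keeps a full local copy, and a bare repository on a lab server replaces the shared mount as the synchronisation point:]

```shell
# On the lab server: a bare repository acts as the central copy
git init --bare /srv/repos/projectX.git

# On each personal machine: clone once, then work at local-disk speed
git clone ssh://labserver/srv/repos/projectX.git
cd projectX
# ... edit scripts, models, refinement logs ...
git add -A
git commit -m "today's refinement round"
git push origin HEAD    # publish; colleagues run 'git pull'
```

[Large binary diffraction images are a poor fit for plain git, of course - this suits scripts, models and logs.]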
>>
>> James