SVS will run well on just about any conventional laptop, desktop, or server. However, there are several considerations in choosing new hardware that will give you the optimal experience while mitigating budget impact.
As with any analytic application, the more hardware you throw at the problem, the faster the solution will be. However, diminishing returns soon outweigh the additional investment required.
To simplify the process, we suggest two basic configurations: a Standard configuration, adequate for most analyses performed by a single user at a time, and an Advanced configuration, best suited for multi-user scenarios, larger studies, and anyone doing copy number segmentation on a regular basis.
Standard (Moderate power, adequate for most Single Named User Licenses)
- Single dual-core processor (quad-core processors are becoming cheaper and are an attractive option)
- 32-bit Windows 7, XP, Vista, Linux, or Mac OS X 10.6
- 2-4GB of RAM
Advanced (For Universal Server Licenses, systems used for CNV segmentation, or for large projects)
- Two quad-core processors (eight cores total)
- 64-bit Windows 7, XP, Vista, Linux, or Mac OS X 10.6
  (For a server license: Windows Server 2003 64-bit, Windows Server 2008 64-bit, or 64-bit Linux)
- 8-16GB of RAM or more
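As a quick way to see where a machine falls relative to these two configurations, the sketch below reports the core count and whether the operating environment is 32- or 64-bit. This is our own illustrative script, not part of SVS; the function and variable names are invented, and total RAM is omitted because the Python standard library has no portable way to query it (a third-party package such as psutil would be needed).

```python
import os
import platform
import struct

def system_summary():
    """Report the specs the suggested configurations reference:
    logical core count, pointer width (32 vs 64 bit), and OS name.
    Note: the bitness reported is that of the running interpreter,
    which on a 64-bit OS may still be 32-bit."""
    return {
        "cores": os.cpu_count(),          # may return None on exotic platforms
        "bits": struct.calcsize("P") * 8, # 32 or 64
        "os": platform.system(),          # e.g. "Windows", "Linux", "Darwin"
    }

info = system_summary()
print(info)

# The Advanced configuration calls for eight cores and a 64-bit OS.
meets_advanced_cpu = (info["cores"] or 0) >= 8 and info["bits"] == 64
```

A machine that fails the `meets_advanced_cpu` check can still run SVS; it simply matches the Standard rather than the Advanced profile.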
- SVS will run on just about anything; performance will simply get better as you add more memory and faster components.
- We do not recommend buying the fastest processors available, as they tend to be extremely expensive for modest gain. If choosing a faster processor pushes the system's price above the cost of buying a second, lower-speed computer, you have gone too far. Choose from the faster family of processors, just not the top processors in that family.
- Solid state drives make a dramatic performance difference for file input/output but are still rather expensive and small. They are a very nice addition if you can afford one, particularly if you perform analysis on a laptop.
- We do not recommend analyzing data stored on a network drive; performance is usually poor, as most network disks cannot match local disk performance. Instead, analyze data on directly attached storage, then offload the final results to network storage for backup.
- Raw data from some large studies can approach a terabyte of storage or more, and output files can run to tens of gigabytes. A single 1TB disk for under $200 is enough for most single users to complete a study with space to spare. Institutions running many studies should consider fault-tolerant disk arrays and tape backup systems.
- Due to the types of compilers we use, Windows implementations tend to have slightly better performance than Linux.
- Where possible, select systems that support faster memory (e.g. 1333MHz instead of 667MHz) and buy that faster memory.
- The 32-bit version will run on 64-bit machines but will not take advantage of the extra resources, such as memory beyond the 32-bit address space.