
Xeon DP Computational Server Build


Onnes

Unfortunately I don't have any good photos for this, and there's really not much to see at a distance anyway, since there are no flashy gaming parts or LEDs.


TL;DR
CPUs: 2 x Intel Xeon E5-2690 v3 (2 x 12 cores, 2.6 GHz base)
CPU Coolers: 2 x Noctua NH-U12DX i4
RAM: 8 x 16 GB Kingston DDR4-2133 Registered ECC (CL15)
GPU: MSI Radeon HD 7770 (temporary)
Motherboard: ASUS Z10PE-D16
Case: Phanteks Enthoo Luxe
OS Drive: Samsung 850 Pro 1TB
Alt Drive: WD Black 2TB
Power Supply: Corsair AX860i
UPS: TBD
OS: Xubuntu 15.10

Cost: $6700 USD

Purpose:
This system is going to be used as a computational server, meaning it will mostly be accessed remotely to run numeric code. Historically, the tasks we run tend to be heavy on double precision operations and relatively light on RAM and hard drive access.

Instead of talking about why certain decisions were made, I think it's easier to explain why certain decisions were not made.

Why not a GPU box?
Because we're physicists and programming for GPUs is a pain in the ass. For the most part we end up writing our own numerics from scratch because there aren't any libraries covering exactly what we need. That makes ease of programming a major factor -- faster computation is meaningless if you spend far longer coding and debugging. And since we're physicists, most incoming students don't have significant programming experience in the first place.

Why not buy into a cluster?
Pretty much all the same reasons for not going with GPUs, plus the added issues of access controls and job scheduling.

What's with the storage?
This machine isn't meant to be used for anything but calculations, and unless we dump doubles into a file as plaintext strings, we really don't burn through space all that fast. We also don't usually need to touch the hard drive except at the beginning and end of a calculation, so drive speed isn't very limiting. I figure the future upgrade path for this machine will be a large PCIe SSD, but since those are still expensive we went with a simple 850 Pro to start.
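For a rough sense of scale (a throwaway sketch, not our actual I/O code, assuming NumPy and made-up file names): a million doubles written as raw binary is about 8 MB, while the same values printed as full-precision text come out to roughly two to three times that.

    import numpy as np

    data = np.random.rand(1_000_000)              # hypothetical result array of doubles

    data.tofile("results.bin")                    # raw binary: 8 bytes per value, ~8 MB
    np.savetxt("results.txt", data, fmt="%.17g")  # plaintext: ~18-25 bytes per value

Either way, a single SATA SSD is nowhere near the bottleneck for jobs that only read and write at the start and end.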

<Insert OS derangement syndrome here>
We already have a number of machines running Ubuntu in the office, and we went with Xubuntu for a lighter desktop. We also can't assume much Linux knowledge from users. The plan is to upgrade to the next LTS release when it arrives later this year.

Build notes:
Despite listing SSI EEB compatibility, the Phanteks case was missing one of the required standoff holes. The other nine were present, so we just left that position unsupported. The standoff diagram in the case manual was also completely wrong. The case was also cosmetically damaged in shipping, because fuck FedEx.

The ATX and CPU power sockets sit along the top edge of the motherboard, far enough from the power supply that the supplied cables couldn't be routed behind the motherboard tray. I ended up using a mess of cable ties to keep the power cables as far away from the CPU fans as I could.

At idle, the system is dead quiet. Even under full CPU load, it remains fairly quiet compared to the building ventilation. CPUs reach max core temps of around 75 °C.


Getting benchmarks to run on Linux turns out to be a pain in the ass. Out of the box, most of the Phoronix Test Suite tests either fail to install or autoconfigure improperly.

My computational server is hitting C-Ray v1.1 at 4.7 s, which puts it squarely in the 'holy shit' performance category.
