WULFFNET Hardware
Hardware Introduction
The WULFFNET hardware is composed of two server systems and fourteen
compute nodes. Since one of the main purposes of the cluster is
x-ray tomography reconstruction, two servers were chosen so that
data could be transferred onto the hard disk of one server while the
second server orchestrated the compute nodes for the reconstruction.
The two servers share a common hardware configuration, described below.
Disclaimer
None of the hardware listed below should be construed as an endorsement
by DND-CAT or any of our member institutions. This is simply what we
used for our current cluster.
Pictures
Let's face it, that's what you have come to see. And after all, a picture
is worth a thousand words. So, here are a few pictures of our cluster:
- Dr. Steve Weigand working on the Wulffnet Console.
- View of the servers and the front of the compute nodes that are located on top of the Sector 5 Bending Magnet Shielded Transport.
- View of the rear of the compute nodes showing wiring connections as well as the rack that holds the networking and communications hardware.
Networking and Communications
Wulffnet currently uses an Allied Telesyn FS716 16-port 10/100 switch
for internal communications. The consoles of each of the nodes and
servers are brought to a single "workstation" point using Power Reach
KVM Controllers. The monitor for the system is a Princeton Ultra 72 Monitor.
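To give a feel for how the machines sit behind the switch, here is a sketch of an internal /etc/hosts file that each machine could carry; the private addresses and host names below are made up for illustration, not the actual WULFFNET assignments:

    # /etc/hosts sketch -- addresses and host names are hypothetical
    127.0.0.1      localhost
    192.168.1.1    server1      # server receiving data transfers
    192.168.1.2    server2      # server orchestrating the compute nodes
    192.168.1.11   node1
    192.168.1.12   node2
    # ... nodes 3 through 14 continue in the same pattern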
Server Configuration
Each of the two servers is configured with the following hardware:
- Asus K7V Slot A Motherboard with 800 MHz Athlon Processor
- 768 MB non-ECC RAM
- 3.5" Floppy
- 1 Adaptec AHA-2940-AU SCSI Controller (for Plasmon JukeBox)
- 1 Adaptec 29160 Ultra-160-SCSI Controller
- 1 Hitachi GF-2000 Series DVD-RAM Drive
- 2 Sony SDT-10000 DDS-4 Tape Drives
- 2 IBM DDYS-T3695 36 GByte SCSI Disk Drives
- 1 Matrox MGA G400 AGP Video Card with 32 MB of Memory
- 2 D-Link DFE-530TX+ 10/100 Ethernet Cards
In addition, the two servers share a Plasmon D120 CD-R Jukebox, which
contains four 8x CD-R burners and holds 120 CD-Rs. The jukebox has two
SCSI interfaces, each controlling two of the drives, and one interface is
dedicated to each server. This jukebox
helped solve several data transfer issues for us. Some of our
experiments collect tens
of gigabytes of data per sample. While we encouraged our experimenters to ftp
the data back to their home computers, we quickly realized that data storage
at the "home lab" could be a serious bottleneck. We started burning CD's
since this was a format that could be read on Linux, Unix, Irix, Macs, PC's
etc...,
the media cost is relatively low,
software was available to control the robot and which could be easily folded into
excellent multivolume
CD-R backup software for Linux. The backup software uses
mkisofs and cdrecord to make ISO9660 file systems.
We then mount the CD and use star to verify it
(cd /mnt/cdrom; star -cPM . > /dev/null).
This combination of software and hardware solved the
transport, storage, and compatibility problems.
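To make the workflow concrete, here is a minimal shell sketch of the burn-and-verify steps; the data directory, image path, and SCSI device address (dev=0,0,0) are assumptions for the example, not our exact production settings:

    #!/bin/sh
    # Sketch of the CD-R burn-and-verify workflow.
    # DATADIR, IMAGE, and dev=0,0,0 are hypothetical values.
    DATADIR=/data/sample01        # experiment data to archive
    IMAGE=/tmp/sample01.iso

    # Build an ISO9660 image with Rock Ridge and Joliet extensions
    # so the disc is readable on Linux, Unix, Macs, and PCs.
    mkisofs -r -J -o "$IMAGE" "$DATADIR"

    # Burn the image to a CD-R at 8x (the jukebox drives are 8x burners).
    cdrecord -v speed=8 dev=0,0,0 "$IMAGE"

    # Mount the finished disc (assumes /mnt/cdrom is in /etc/fstab)
    # and read every file back with star, as described above, to verify.
    mount /mnt/cdrom
    (cd /mnt/cdrom; star -cPM . > /dev/null)
    umount /mnt/cdrom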
We did initially consider using tapes for this purpose, but we found that
we could not rely on a particular tape drive being available at the
"home lab". CD drives, however, are everywhere.
In addition to the Plasmon Jukebox, we also have a
Primera Signature III CD Color
Printer with their Conductor Autoloader so we can print designs and
information directly onto the CD.
Currently, Red Hat is our distribution of choice.
Node Configurations
The original Wulffnet consisted of eight dual-processor 450 MHz Pentium III
machines. Later we added eight 800 MHz Athlon systems. Recently
we have replaced the Pentium III machines with 1.2 GHz Athlon
systems and are now
turning the P III systems into desktop systems. The current
node configuration for
Nodes 1-8 is:
- MSI K7T Pro2-A Socket A Motherboard with 1.2 GHz Athlon Processor
- 512 MB RAM
- 3.5" Floppy
- 1 ATI Rage Pro AGP Video Card
- 1 D-Link DFE-530TX+ 10/100 Ethernet Card
and
Nodes 9-14 contain:
- Asus K7V Slot A Motherboard with 800 MHz Athlon Processor
- 512 MB RAM
- 3.5" Floppy
- 1 ATI Rage Pro AGP Video Card
- 1 D-Link DFE-530TX+ 10/100 Ethernet Card
John Quintana (jpq@northwestern.edu) / May 2001