Beowulf Project at CESDIS
The Beowulf Project at GSFC:
The Beowulf Project was started at CESDIS, which is operated for NASA by the USRA, in the summer of 1994 with the assembly of a 16-node cluster developed for the Earth and space sciences (ESS) project at the Goddard Space Flight Center. The project quickly spread to other NASA sites, to other R&D labs, and to universities around the world. The project's scope and the number of Beowulf installations have grown over the years; in fact, they appear to continue to grow at an increasing rate. This Web site attempts to provide an introduction to the Beowulf Project and a distribution mechanism for the contributions CESDIS continues to make to the project.
- Introduction and Overview
An introduction and overview of the Beowulf Project: its history, goals, philosophies, and observations.
- Beowulf Networking Drivers
One of the keys to the development of Beowulf clusters is the availability of affordable, high-performance networks to serve as "backplanes" for the cluster.
The development of network drivers for Linux continues
to be one of the primary foci for the CESDIS effort.
- Beowulf Software
The Beowulf software environment is implemented as a set of add-ons to commercially available, royalty-free base Linux distributions. This page is the distribution point for this software.
- Mailing List Archives
This Hypermail archive of the Beowulf-related mailing lists is a good place to catch up on previous discussions. These lists are automatically managed by Majordomo and have been archived at several sites, including our own. To subscribe to a mailing list, send a message containing the single word "subscribe" to the appropriate address on the following "mailto" buttons (a sample message is sketched below):
For general Beowulf discussions and questions.
For announcements about new Beowulf clusters or new software packages.
For discussions about porting software packages to Beowulf and general
For help with Majordomo (e.g. how to unsubscribe), look here.
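As a rough sketch, a subscription message looks like the following; the "To:" address is a placeholder and should be replaced by the address on the appropriate button above:

    To: <list address from the button above>
    Subject: (usually ignored by Majordomo)

    subscribe

Majordomo reads its commands from the message body, so the single word "subscribe" is all the body needs to contain.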
- Beowulf Hardware GSFC
This is a brief description of the Beowulf clusters at the Goddard Space Flight Center.
- How to Build a Beowulf
- Linux at CESDIS
Linux software being developed at CESDIS.
- Beowulf Papers and Presentations
A short collection of journal and conference papers, along with overhead slides from various presentations.
Other Beowulf Sites:
Here is a partial list of other sites that are working on Beowulf-related projects:
- Grendel Clemson University PVFS and system development
- Drexel Drexel University
- PACET George Washington University
- Stone SouperComputer Oak Ridge National Lab (ORNL) a 126 node cluster at zero dollars per node
- Naegling CalTech's Beowulf Linux cluster
- Loki Los Alamos Beowulf cluster has an especially cool logo
- Megalon Sandia's large Beowulf
- theHive Goddard Space Flight Center one of the large Beowulf clusters at Goddard
- AENEAS University of California, Irvine
- MAGI CTU Prague's Beowulf cluster used for speech recognition
- Brahma Duke University's cluster
- Topcat University of Southern Queensland's Beowulf cluster
- USM University of Southern Mississippi online discussions about bringing a Beowulf to the campus
- LoBoS NIH
- SWARM Oregon State University
- smile Smile Cluster at Kasetsart University
- Wisconsin University of Wisconsin-Milwaukee A DEC Alpha 300XL based system used to support
the LIGO Experiment within the Physics Department at UWM
- HIDRA University of Barcelona's UB-UPC Dynamical Systems Group
- DeepFlow Viscoelastic Group at Université catholique de Louvain A 16-node Alpha-PC based cluster
- Avalon Los Alamos National Laboratory Center for Nonlinear Studies
and Theoretical Division
- Hyglac JPL's High Performance Computing Group
- Collage ANL distributed rendering and visualization
- U. Louisville U. Louisville Pentium-II system running CFD codes
- Weland Memorial University Seismic Imaging Consortium
- CPGE Intel Cluster Department of Petroleum and Geosystems Engineering, The University of Texas at Austin
- wonderland Texas Institute for Computational and Applied Mathematics, University of Texas at Austin
- ERATO-I Kitano Symbiotic Systems Project 32 Pentium-II nodes, 100Base-TX, Catalyst 5505
- Enterprise Voyager and theBorg Department of Physical Chemistry, Cambridge University, UK
- Medusa Electromagnetics Laboratory, New Mexico State University
- CTF CASPUR, Italian Academic Consortium for Supercomputing Applications Rome
- Clarkson Department of Civil and Environmental Engineering used for geometric and material nonlinear finite element analysis of Geotechnical problems
- ICE Box Center for High Performance Computing, University of Utah one of many Beowulf clusters at U of Utah
- Wulfpack Johns Hopkins Medical School
- highpoint Computer Science Dept of High Point University primary use is education / research
- GALAXY Dept. of Nuclear Engineering, Korea Advanced Institute of Science and Technology Monte-Carlo particle transport simulation
- COCOA Aerospace Engineering, Penn State University 400MHz Pentium II, Fast Ethernet
- Spectra Cluster Dept Electrical and Computer Engineering, Univ of Alabama in Huntsville
- Storm rendering raytraced environments at interactive rates
- Texas Tech Tornado Dept. of Computer Science, Texas Tech University the cluster is used for both research and teaching
- NCHC PC Cluster National Center for High-performance Computing, Hsinchu, Taiwan
- Marina Case Western Reserve University
- PPME Monash Parallel Parametric Modelling Engine
- Helge Arrhenius Laboratory, Stockholm University Physical Chemistry
- Yggdrasil Institute of Theoretical Dynamics
at the University of California, Davis primarily used for computational biology
- PPCC Indiana University Pentium II's with Gigabit Ethernet: half the system runs
Windows NT and half runs Linux
- CARRIER University of Florida This cluster supports simulative and experimental
architecture, interconnect, and algorithm research
- ISCTE/ADETTI Lisbon/Portugal University rendering/ray tracing and other scientific applications
- LAMA LAMA's Materials Laboratory at UPV/EHU Monte Carlo simulations of phase transitions in condensed matter physics
- Calico Thomas Jefferson National Accelerator Facility studying the feasibility and efficiency of Lattice QCD calculations on a Beowulf cluster
- Störtebeker Institut für Technische Informatik,
Medizinische Universität zu Lübeck Research projects: fault tolerance, intelligent and
configurable networks, load management with mobile agents, mobile
robots. Application projects: medical and multimedia.
- earthdome Watson Technical Consulting runs TAOS/mpi and various meteorological software packages
such as MM5 (a mesoscale model) and CCM3.6 (a climate model)
for storm hazard risk assessments and research,
concentrating on Tropical Cyclone research
- SPARROW University of Arkansas network research and computational server
- BlackAdder Someplace in Canada A personal Beowulf Cluster built to study clusters and parallel rendering
- NA49 Hewlett-Packard/GSI/CERN Particle physics data processing
- YAC Bioinformatics, Samuel Lunenfeld Research Institute and
the University of Toronto All-atom ensemble-based protein folding simulations and
URL/HTTP-based distributed clustering
- GRAVITOR Geneva Observatory, University of Geneva used for astrophysics problems
- Tempest and Lear Department of Physics & Astronomy, University of Pittsburgh Lear is dedicated especially to large lattice QCD computations
- wiglaf Harvey Mudd College teaching and research including fluid dynamics and medical modeling applications
- Trojans Cluster Project Internet and Cluster Computing Lab at University of Southern California Cluster middleware research
- ClusterTux NCAR - National Center for Atmospheric Research MM5 (weather modelling)
- ParaStation University of Karlsruhe: Dept. of Informatics Enables high performance cluster computing by reducing latency
and increasing throughput
If you have a webpage about your cluster and
you would like to have it added to this list, send
mail to Phil Merkey.
Commercial Beowulf sites:
Part of the Beowulf philosophy is that all system software required to construct and operate a Beowulf be publicly available, and that all the hardware be COTS hardware. However, this does not mean that we are against the idea of buying software or against the idea of buying assembled hardware.
In particular, many of our ESS scientists look to
vendors for their support of Fortran 77 and Fortran 90.
There are a number of commercial sites that offer these and other products or services of interest to the Beowulf community. We offer these pointers at the request of these vendors for your convenience.
- Xtreme Machines from Paralogic Paralogic develops and sells software tools, commodity hardware, and
applications for parallel computers.
- PSSC Labs Custom configures Beowulf nodes for government, university, and
- NAG Fortran 90 Compiler for Linux
- Absoft F77 and F90 compilers for Linux
- PGI Portland Group offers F77, F90, HPF, C, and C++ compilers for Linux
- SALT Commercial Hardware Vendor in Germany
- Sybrandt Sybrandt Open Systems provides a commercial Beowulf solution
- Paralline distributes Linux clusters, high-speed networks and services
- ParTec supports the ParaStation project and sells clusters and services
- Giganet cLAN provides a native implementation of the Virtual Interface Architecture
- MPI Software Technology a commercial Beowulf alternative
- Alinka Raisin provides a software tool for installation and administration
of Beowulf-type Linux clusters
If you would like to have a pointer to your product or service added to this list,
send mail to Phil Merkey.
Contact: Phil Merkey firstname.lastname@example.org
Page last modified: 1999/10/27 13:31:38 GMT
CESDIS is operated for NASA by the USRA