Center for Integrated Research Computing
CIRC offers University researchers state-of-the-art computing technology and software, along with support for the use and implementation of these resources. CIRC currently maintains approximately 420 TeraFLOPS of high-performance computing systems, including the University of Rochester's flagship IBM Blue Gene/Q system.
1,024 nodes, 16,384 CPU cores, 16 TB RAM, 209 TeraFLOPS
The Blue Gene/Q system at the University of Rochester comprises one rack of the 209 TFLOPS IBM Blue Gene/Q massively parallel processing (MPP) supercomputer, one IBM System p (POWER7) front-end node, one IBM System p (POWER7) service node, and 4 IBM System x I/O nodes connected to a 400 TB IBM GPFS System Storage solution. The system provides 1,024 compute nodes, 16,384 CPU cores, 16 TB of RAM, and 400 TB of storage. Each node contains a 16-core A2 processor with 32 MB of cache and access to 16 GB of RAM. The Blue Gene/Q rack is cooled by a highly efficient water-cooling system.
284 nodes, 5,656 CPU cores, 21 TB RAM, 210 TeraFLOPS
The BlueHive cluster is CIRC's primary Linux cluster for demanding computations, providing approximately 210 TFLOPS of computing capacity. The system consists of 284 nodes connected by a high-speed, low-latency InfiniBand interconnect. The most recent addition to the BlueHive cluster houses two 12-core Intel Ivy Bridge processors per node (for a total of 24 cores per node), with memory ranging from 64 GB to 512 GB. A number of the 64 GB nodes have two dedicated coprocessing cards, including NVIDIA K20X (Kepler) GPUs and Intel Xeon Phi 5110P accelerators. In addition, some nodes of the cluster are dedicated to running "big data" analytics applications, such as Hadoop, with 112 TB of dedicated local storage and a total of 384 GB of RAM. The entire cluster is equipped with rear-door heat exchangers, which yield additional power and cooling savings by leveraging the water-cooling infrastructure provided for the Blue Gene/Q.
The entire BlueHive cluster has an InfiniBand-attached storage system providing almost 2 PB of configurable raw disk within a GPFS file system. 84 nodes of varying capacity have been integrated into the BlueHive cluster for faculty investigators who have purchased additional priority-based compute capacity for the environment. CIRC runs the SLURM resource scheduler and queuing system to optimize usage and to support multiple users of the BlueHive Linux cluster environment. Additionally, users can connect to BlueHive through NX-based technology, which provides a resumable graphical (GUI) session.
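To illustrate how work is typically submitted through SLURM, the sketch below shows a minimal batch script. The partition name, module name, and program name are illustrative assumptions, not CIRC's actual configuration; only the SLURM directives and commands themselves are standard.

```shell
#!/bin/bash
# Hypothetical SLURM batch script -- partition, module, and program
# names are placeholders, not CIRC's actual configuration.
#SBATCH --job-name=example-job
#SBATCH --partition=standard       # assumed partition name
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=24       # one task per core on a 24-core Ivy Bridge node
#SBATCH --time=01:00:00            # wall-clock limit of one hour
#SBATCH --output=example-%j.out    # %j expands to the job ID

module load openmpi                # assumed module name
srun ./my_mpi_program              # launch the program on the allocated cores
```

A script like this would be submitted with `sbatch job.sh`; `squeue -u $USER` then shows the job's state in the queue, and `scancel <jobid>` cancels it.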
CIRC services include:
- Access to high-performance computing and data analytic systems
- Research collaboration
- Assistance in grant writing
- Individual and group training
- Monthly symposia sessions
- Contributed hardware for priority access
In support of its mission to enable researchers to take advantage of computing resources at the University of Rochester, CIRC partners with other organizations across the University. A selection of these affiliated organizations is listed below:
- Clinical and Translational Science Institute
- Data Center
- Health Sciences Center for Computational Innovation (HSCCI)
- Information Technology
- Institute for Data Science
- Miner Library
- River Campus Libraries
- School of Arts, Sciences, and Engineering
- School of Medicine and Dentistry
- Shared Resource Laboratories
- University of Rochester Research