Research Projects: HIPerWall


Project Name: HIPerWall
Team Members: Falko Kuester, Stephen Jenks, Charlie Zender, Jean-Luc Gaudiot, Soroosh Sorooshian, Kai-Uwe Doerr, Christopher Knox, Sung-Jin Kim, Harry Mangalam, Frank Wessel, Evan Klinger, Dirk Groeneveld and Greg Dawe
Project Sponsor: National Science Foundation (NSF)

Overview

The Highly Interactive Parallelized Display Wall (HIPerWall) project provides unprecedented high-capacity visualization capabilities to experimental and theoretical researchers. Our primary focus is Earth science visualization, but collaborating researchers in fields including biomedical science and engineering also benefit from HIPerWall's capabilities. Earth science datasets often cover large areas of the planet at high resolution and depth, with many time-varying values at each grid point, and so run to many gigabytes or terabytes of data. Visualizing these multi-dimensional, time-varying datasets challenges both the computational and storage infrastructure and current display technology. With HIPerWall, researchers can see a broad view of the data and fine details concurrently, enabling collaboration and shared viewing of complex results.

A visualization cluster of high-performance commodity computers transfers and manipulates the data displayed on HIPerWall's 50 display tiles, which operate at a combined resolution of over 200 megapixels. The cluster also receives real-time simulation data from UCI's Earth System Modeling Facility, an IBM supercomputer funded by NSF in 2003, as well as from other sources. HIPerWall's ability to display extremely high-resolution datasets drives and focuses ongoing research into the management, transfer, and visualization of terabyte-scale data. While HIPerWall's hardware infrastructure matches the state of the art, the data-handling and distributed-visualization capabilities needed to support its capacity are well beyond current practice. Please visit the HIPerSpace page to learn about the second generation of this technology.
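
The architectural point behind a wall like this is that no single machine drives the full 200-megapixel surface; each display node renders only the slice of the global pixel space covered by its own tiles, so data movement and rendering load are spread across the cluster. The short Python sketch below illustrates that partitioning. It is only a sketch under stated assumptions: the helper names (Tile, tiles_for_region), the 10 x 5 tile grid, and the 2560 x 1600 per-panel resolution are illustrative choices consistent with the totals listed under System Specifications, not HIPerWall's actual middleware.

    # Minimal sketch of how a tiled display wall maps a global pixel region onto
    # per-node tiles. Names, the 10 x 5 grid, and the per-panel resolution are
    # illustrative assumptions, not HIPerWall's actual software.

    from dataclasses import dataclass

    TILE_W, TILE_H = 2560, 1600   # assumed native resolution of one 30" panel
    COLS, ROWS = 10, 5            # assumed grid, consistent with 25,600 x 8,000

    @dataclass
    class Tile:
        col: int
        row: int

        def viewport(self):
            # Rectangle (x0, y0, x1, y1) this tile covers in global pixel space.
            x0, y0 = self.col * TILE_W, self.row * TILE_H
            return (x0, y0, x0 + TILE_W, y0 + TILE_H)

    def tiles_for_region(x0, y0, x1, y1):
        # Tiles a global-space region overlaps: only the display nodes driving
        # these tiles need to fetch and render that part of the data.
        c0, c1 = max(0, x0 // TILE_W), min(COLS - 1, (x1 - 1) // TILE_W)
        r0, r1 = max(0, y0 // TILE_H), min(ROWS - 1, (y1 - 1) // TILE_H)
        return [Tile(c, r) for r in range(r0, r1 + 1) for c in range(c0, c1 + 1)]

    if __name__ == "__main__":
        # A 4000 x 3000 pixel detail window near the middle of the wall touches
        # only six of the fifty tiles; the rest of the cluster is idle for it.
        hits = tiles_for_region(11_000, 3_000, 15_000, 6_000)
        print(len(hits), "of", COLS * ROWS, "tiles involved; first covers",
              hits[0].viewport())

In a real system the same mapping would decide which display node fetches which portion of a dataset or simulation stream; the sketch only shows the bookkeeping.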

System Specifications

By the Numbers

  • Number of tiles: 50 (fully supported in networked configuration)
  • Display resolution: 25,600 x 8,000 pixels, 204,800,000 pixels total (see the consistency check after this list)
  • Number of display nodes: 25
  • Control and development nodes: 3
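
As a quick consistency check on the figures above, each 30-inch Cinema Display is natively 2560 x 1600 pixels, so arranging the 50 tiles ten wide by five tall reproduces the quoted totals exactly; the 10 x 5 layout itself is inferred from those totals rather than stated on this page.

    # Consistency check for the numbers above. The 10 x 5 arrangement is
    # inferred from the quoted 25,600 x 8,000 total, not stated on this page.
    panel_w, panel_h = 2560, 1600   # one 30" Apple Cinema Display
    cols, rows = 10, 5              # 50 tiles

    assert cols * rows == 50
    assert (cols * panel_w, rows * panel_h) == (25_600, 8_000)
    assert cols * rows * panel_w * panel_h == 204_800_000   # just over 200 megapixels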

Hardware:

  • 11 dual 2.5 GHz Power Mac G5s with NVIDIA GeForce 6800 Ultra DDL graphics
  • 17 dual 2.7 GHz Power Mac G5s with NVIDIA Quadro FX 4500 graphics
  • 56 Apple 30” Cinema Displays

Operating System

  • Mac OS X Server "Tiger"

Acknowledgments

This research is supported by the National Science Foundation (NSF), Major Research Instrumentation Program (MRI), under Award #0421554, the California Institute for Telecommunications and Information Technology (Calit2), and the Henry Samueli School of Engineering (HSSoE).

Publications

  • Nirnimesh, P. Harish, and P. Narayanan, “Garuda: A scalable tiled display wall using commodity PCs,” IEEE Transactions on Visualization and Computer Graphics, vol. 13, no. 5, pp. 864–877, 2007.

Media Coverage

Related Research Projects


Copyright

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by the author's copyright. This work may not be reposted without the explicit permission of the copyright holder.

