The growing number of imaging satellites has created a large repository of very high-resolution image datasets. NASA has released a new version of its Blue Marble Earth images at 86400x43200 pixels, with one image for each month of the year 2004; at three bytes per pixel this amounts to roughly 134 GB of uncompressed image data. Comparable amounts of data are routinely produced by instruments such as the Hubble Space Telescope or the Mars Explorer system.
Even larger datasets arise from seismic exploration in the form of 3D volumetric data, which is often viewed as a stack of images. Viewing and exploring such images is not trivial: they cannot be loaded into memory at once, so they must be split into tiles that are loaded and displayed independently. Because the image resolution far exceeds the available screen resolution, pre-filtered, downsampled versions must also be generated and stored so that the appropriate level of detail can be shown for a given view and zoom level, further increasing the amount of data that has to be managed. To show as much information as possible, multiple independent displays, either projectors or monitors, are now used simultaneously, each driven by a separate PC.

The goal of this project was to develop the tools needed to preprocess a resolution pyramid of arbitrarily sized images and to display the most relevant portion of it for any given view and zoom level. The second part was to design and implement the networking system that feeds data to the display cluster efficiently, either from a central server or via peer-to-peer mechanisms, to enable smooth and uninterrupted exploration. Our initial results were promising, but in the end the performance advantage over systems such as Google Maps or Google Earth was not large enough to make this a worthwhile topic for further work.
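As a rough illustration of the tile-pyramid idea described above, the sketch below computes which pyramid level and which tiles a display node would need for a given viewport and zoom level. The 256-pixel tile size, the power-of-two downsampling scheme, and all function names are assumptions made for illustration, not the project's actual implementation.

```python
import math

TILE_SIZE = 256      # assumed tile edge length; the actual system may differ
FULL_WIDTH = 86400   # Blue Marble full-resolution width in pixels
FULL_HEIGHT = 43200  # Blue Marble full-resolution height in pixels


def num_levels(width: int, height: int, tile: int = TILE_SIZE) -> int:
    """Count pyramid levels until the whole image fits into a single tile."""
    levels = 1
    while width > tile or height > tile:
        width = math.ceil(width / 2)
        height = math.ceil(height / 2)
        levels += 1
    return levels


def level_for_zoom(zoom: float) -> int:
    """
    Pick the pyramid level whose resolution best matches the requested zoom.
    zoom = 1.0 shows one source pixel per screen pixel (level 0);
    zoom = 0.25 shows the image at quarter size, so level 2 suffices.
    """
    level = max(0, math.floor(-math.log2(max(zoom, 1e-9))))
    return min(level, num_levels(FULL_WIDTH, FULL_HEIGHT) - 1)


def visible_tiles(view_x: int, view_y: int, view_w: int, view_h: int,
                  zoom: float):
    """
    Return (level, [(col, row), ...]) for the tiles of that level which
    intersect a viewport given in full-resolution image coordinates.
    """
    level = level_for_zoom(zoom)
    scale = 2 ** level
    level_w = math.ceil(FULL_WIDTH / scale)   # image size at this level
    level_h = math.ceil(FULL_HEIGHT / scale)
    # Map the viewport into level coordinates, clamp, then convert to tiles.
    x0 = max(0, view_x // scale) // TILE_SIZE
    y0 = max(0, view_y // scale) // TILE_SIZE
    x1 = min(level_w - 1, (view_x + view_w) // scale) // TILE_SIZE
    y1 = min(level_h - 1, (view_y + view_h) // scale) // TILE_SIZE
    tiles = [(c, r) for r in range(y0, y1 + 1) for c in range(x0, x1 + 1)]
    return level, tiles


if __name__ == "__main__":
    # Example: a 1920x1080 window viewing the image at 1/8 zoom.
    level, tiles = visible_tiles(40000, 20000, 1920 * 8, 1080 * 8, 1 / 8)
    print(level, len(tiles))   # prints: 3 45
```

One practical consequence of this scheme is that every request, whether served from a central server or a peer, refers to a fixed-size tile at a known level, which keeps caching and network transfers uniform; this is the kind of property a tiled display cluster would rely on, though the original system's exact design may have differed.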