Galveston Bay 3-D model study, channel deepening: lessons learned in management of a large modeling study
Abstract
The numerical modeling of Galveston Bay in 3-D over long time durations, with many geometry and hydrologic scenarios, coupled with the need to store all data for subsequent use in a model to predict oyster populations, produced a challenge in data and resource management. Roughly 12 years of simulation were run, generating nominally 50 gigabytes of data. The data consisted of thousands of input, output, and hotstart files submitted to and generated by the Cray Y-MP supercomputer. The computer's six processors enabled simultaneous execution of multiple jobs. Processes for managing such a large volume of input data and multiple output files, and for maintaining quality control, are presented. Backup, archiving, and transfer of results are also addressed, and visualization of 3-D results in a meaningful fashion is discussed.