Authors: Schmidt, J. H.; McAdory, R. T.; Martin, W. D.; Berger, R. C.
Date issued: 1993
Date accessioned/available: 2010-02-15
URI: http://hdl.handle.net/1969.3/23351
Pages: 2003-2007

Title: Galveston Bay 3-D model study: channel deepening: lessons learned in management of a large modeling study
Type: Conference paper (CONF)
Subjects: Computer software; Hydraulics; Mathematical techniques; Simulation

Abstract: The numerical modeling of Galveston Bay in 3-D over long time durations, with many geometry and hydrologic scenarios, coupled with the need to store all data for subsequent use in a model predicting oyster populations, posed a challenge in data and resource management. Roughly 12 years of simulation generated nominally 50 gigabytes of data, consisting of thousands of input, output, and hotstart files submitted to and generated by a Cray Y-MP supercomputer; the computer's six processors enabled simultaneous execution of multiple jobs. Processes for managing such a large volume of input data and multiple output files, and for maintaining quality control, are presented. Backup, archiving, and transfer of results are also addressed, and visualization of 3-D results in a meaningful fashion is discussed.