6 results for "Ballard, Sandy"
Search Results
2. Revisiting Gurney Fest--What's Here and What's Not.
- Author
-
Ballard, Sandy
- Subjects
CREATIVE writing, FILMMAKING, POETS laureate, CONFERENCE rooms, CANOES & canoeing
- Abstract
The article revisits the celebration of Gurney Norman's contributions at Gurney Fest held at the University of Kentucky. Topics discussed include Norman's impact on Appalachian literature and culture, the various events and speakers that highlighted his legacy, and efforts to document and archive the festival's experiences.
- Published
- 2024
3. Calculating Path-Dependent Travel Time Prediction Variance and Covariance for a Global Tomographic P-Velocity Model
- Author
-
LOS ALAMOS NATIONAL LAB NM, Hipp, Jim R, Encarnacao, Andre V, Young, Chris J, Ballard, Sandy, Chang, Marcus C, Phillips, W S, and Begnaud, Mike L
- Abstract
Several studies have shown that global 3D models of the compressional wave speed in the Earth's mantle can provide superior first P travel time predictions at both regional and teleseismic distances. However, given the variable data quality and uneven data sampling associated with this type of model, it is essential that there be a means to calculate high-quality estimates of the path-dependent variance and covariance associated with the predicted travel times of ray paths through the model. In this paper, we show a methodology for accomplishing this by exploiting the full model covariance matrix. Typical global 3D models have on the order of half a million nodes, so the challenge in calculating the covariance matrix is formidable: roughly 0.9 TB of storage for half of a symmetric matrix, necessitating an out-of-core (OOC) blocked matrix solution technique. With our approach, the tomography matrix G (which includes Tikhonov regularization terms) is multiplied by its transpose to form GᵀG, which is written in a blocked sub-matrix fashion. We employ a distributed parallel solution paradigm that solves for (GᵀG)⁻¹ by assigning blocks to individual processing nodes for matrix decomposition, update, and scaling operations. We first find the Cholesky decomposition of GᵀG, which is subsequently inverted. Next, we employ OOC matrix multiplication methods to calculate the model covariance matrix from (GᵀG)⁻¹ and an assumed data covariance matrix. Given the model covariance matrix, we solve for the travel-time covariance associated with arbitrary ray paths by integrating the model covariance along both ray paths. Setting the paths equal yields the variance for that path. (A simplified numerical sketch of this covariance propagation follows this entry.)
Published in the Proceedings of the 2011 Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies, 13-15 September 2011, Tucson, AZ, Volume I. Sponsored by the Air Force Research Laboratory (AFRL) and the National Nuclear Security Administration (NNSA). U.S. Government or Federal Rights License.
- Published
- 2011
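The abstract above describes propagating a model covariance matrix, obtained from (GᵀG)⁻¹, into travel-time variance and covariance by integrating along ray paths. Below is a minimal dense-matrix sketch of that idea in Python, assuming a simple scalar data variance and ignoring the out-of-core blocking and distributed solvers the paper relies on; all names and the toy data are illustrative, not the authors' code.

```python
import numpy as np

def model_covariance(G, data_variance=1.0):
    """Approximate model covariance as data_variance * (G^T G)^-1 via Cholesky."""
    gtg = G.T @ G                          # normal-equations matrix; regularization assumed folded into G
    L = np.linalg.cholesky(gtg)            # lower-triangular Cholesky factor
    gtg_inv = np.linalg.solve(L.T, np.linalg.solve(L, np.eye(gtg.shape[0])))
    return data_variance * gtg_inv

def travel_time_covariance(path_a, path_b, cov_model):
    """Covariance of two predicted travel times.

    path_a and path_b are sensitivity vectors whose entries are the ray-path
    lengths attributed to each model node (zero for nodes the ray never touches).
    Setting path_b equal to path_a yields the prediction variance for that path.
    """
    return path_a @ cov_model @ path_b

# Toy usage: 50 model nodes, 200 synthetic ray rows in G.
rng = np.random.default_rng(0)
G = rng.normal(size=(200, 50)) + 5.0 * np.eye(200, 50)   # keep G^T G well conditioned
C_m = model_covariance(G)
ray = np.abs(rng.normal(size=50))
print("path variance:", travel_time_covariance(ray, ray, C_m))
```

The discrete analogue of "integrating the model covariance along both ray paths" is the quadratic form path_aᵀ C_m path_b; at global scale this is the same step the paper performs with blocked out-of-core matrices rather than dense in-memory arrays.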
4. Robust, Extensible Representation of Complex Earth Models for Use in Seismological Software Systems
- Author
-
SANDIA NATIONAL LABS LIVERMORE CA, Ballard, Sandy, Hipp, Jim, and Young, Chris
- Abstract
Models of geophysically important properties of the Earth, such as seismic velocity, Q, and density, can become large and complex when those properties vary in three dimensions within the model. We have developed a system to represent the distribution of seismic properties in the Earth that can accommodate a wide range of local- to global-scale 3D Earth models with spatially variable resolution. A 2D grid of nodes is tessellated using either triangles or quadrilaterals, and a profile is defined at each 2D grid node that extends from the center of the Earth to the surface. The surface of the model corresponds to the topographic/bathymetric surface of the Earth, which is referenced to the surface of the GRS80 ellipsoid. Each profile can be separated into a number of layers defined by interfaces across which geophysical properties may be discontinuous. Within the layers between interfaces, the vertical distribution of geophysical properties may be defined by a number of continuous sublayers, by arbitrary-order polynomials, or by various types of splines. Layer thicknesses can vary laterally, and zero-thickness layers and layer pinch-outs are accommodated. The distribution of nodes is very flexible, allowing model resolution to vary over a wide range in three dimensions. In this paper, we present detailed descriptions of the software algorithms used to construct, store, and interpolate these models. (An illustrative data-structure sketch of the layered radial-profile idea follows this entry.)
Published in the Proceedings of the 30th Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies, 23-25 September 2008, Portsmouth, VA, sponsored by the National Nuclear Security Administration (NNSA) and the Air Force Research Laboratory (AFRL). The original document contains color images.
- Published
- 2008
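The entry above describes a 2D grid of nodes with a radial profile of layers at each node, where properties may be discontinuous across interfaces and are interpolated within layers. The Python sketch below illustrates that layered radial-profile idea with simple piecewise-linear radial interpolation; the class names, the two-layer toy profile, and the velocity numbers are invented for illustration and are not the authors' software (which also supports polynomials, splines, and lateral interpolation over the tessellation).

```python
from dataclasses import dataclass
from typing import List
import bisect

@dataclass
class Layer:
    name: str
    radii: List[float]       # sample radii from layer bottom to top (km); equal radii => zero-thickness layer
    velocity: List[float]    # property samples (e.g. P velocity in km/s) at those radii

    def value_at(self, radius: float) -> float:
        """Piecewise-linear interpolation of the property within the layer."""
        if radius <= self.radii[0]:
            return self.velocity[0]
        if radius >= self.radii[-1]:
            return self.velocity[-1]
        i = bisect.bisect_right(self.radii, radius) - 1
        frac = (radius - self.radii[i]) / (self.radii[i + 1] - self.radii[i])
        return self.velocity[i] + frac * (self.velocity[i + 1] - self.velocity[i])

@dataclass
class Profile:
    """Radial stack of layers beneath one 2D grid node, ordered bottom to top."""
    layers: List[Layer]

    def value_at(self, radius: float) -> float:
        for layer in self.layers:
            if layer.radii[0] <= radius <= layer.radii[-1]:
                return layer.value_at(radius)          # first matching layer wins at interfaces
        raise ValueError("radius outside profile")

# Toy profile: a simplified mantle layer beneath a crustal layer.
mantle = Layer("mantle", [3480.0, 6336.0], [13.7, 8.0])
crust = Layer("crust", [6336.0, 6371.0], [6.5, 5.8])
node_profile = Profile([mantle, crust])
print("Vp at r=6350 km:", round(node_profile.value_at(6350.0), 2), "km/s")
```

Because each layer carries its own samples, a property can jump across an interface (here Vp drops from 8.0 to 6.5 km/s across the crust-mantle boundary), and a layer whose top and bottom radii coincide pinches out to zero thickness without special handling.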
5. Implementation of a Pseudo-Bending Seismic Travel-Time Calculator in a Distributed Parallel Computing Environment
- Author
-
SANDIA NATIONAL LABS ALBUQUERQUE NM, Ballard, Sandy, Young, Chris, Hipp, Jim, Barker, Glenn, and Chang, Marcus
- Abstract
Pseudo-bending is an algorithm for calculating seismic travel time through complex 3D velocity models. The algorithm was originally proposed by Um and Thurber (1987) and later extended by Zhao et al. (1992) to account for first-order velocity discontinuities. We have modified Zhao's method of handling discontinuities by implementing a two-dimensional (2D) minimization algorithm that searches for the point on the velocity discontinuity surface where Snell's law is satisfied. Further, our implementation reduces the likelihood that the pseudo-bending algorithm will return a local minimum by starting the ray calculation from several different starting rays. Specifically, interfaces are defined that include first-order discontinuities plus additional interfaces at levels of the model where local minima might be generated. Rays are computed that are constrained to bottom in each layer between these interfaces. The computed rays might be reflected off the top of the layer, turn within the layer, or diffract along the interface at the bottom of the layer. The computed ray that is seismologically valid and that has the shortest travel time is retained. The modifications we have made to the algorithm have made it more accurate and robust, but have also made it more computationally expensive. To mitigate this impact, we have implemented our software in a distributed parallel computing environment, which makes possible the calculation of many rays simultaneously. (A toy bending-style iteration illustrating the general idea follows this entry.)
Presented at the 30th Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies, Portsmouth, VA, 23-25 September 2008. Published in the proceedings of the conference, p. 338-346, 2008. Sponsored in part by the National Nuclear Security Administration (NNSA). The original document contains color images.
- Published
- 2008
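The abstract above summarizes pseudo-bending: an initial ray is iteratively perturbed until its travel time through the 3D velocity model stops decreasing, with extra care at discontinuities and multiple starting rays to avoid local minima. The toy 2D sketch below conveys only the core bending idea, using a brute-force perpendicular perturbation of interior points rather than the analytic Um and Thurber (1987) update, and it ignores discontinuities entirely; the velocity model and step sizes are invented for the example.

```python
import numpy as np

def velocity(p):
    """Toy 2D velocity model (km/s): speed increases with depth p[1] (km)."""
    return 5.0 + 0.05 * p[1]

def travel_time(path):
    """Discretized travel time: sum of segment length over mean segment velocity."""
    t = 0.0
    for a, b in zip(path[:-1], path[1:]):
        t += np.linalg.norm(b - a) / (0.5 * (velocity(a) + velocity(b)))
    return t

def bend(path, n_iter=200, step=0.5):
    """Greedy bending: nudge each interior point perpendicular to the local ray
    direction whenever the move reduces the total travel time."""
    path = [np.asarray(p, dtype=float) for p in path]
    for _ in range(n_iter):
        for i in range(1, len(path) - 1):
            tangent = path[i + 1] - path[i - 1]
            normal = np.array([-tangent[1], tangent[0]]) / np.linalg.norm(tangent)
            best = travel_time(path)
            for delta in (step, -step):
                saved = path[i]
                path[i] = saved + delta * normal
                if travel_time(path) < best:
                    best = travel_time(path)
                else:
                    path[i] = saved        # reject the move
    return path, travel_time(path)

# Straight starting ray between a source and receiver 100 km apart at the surface.
start = [np.array([x, 0.0]) for x in np.linspace(0.0, 100.0, 11)]
bent, t = bend(start)
print("straight-ray time:", round(travel_time(start), 3), "s")
print("bent-ray time:    ", round(t, 3), "s")   # shorter: the ray dives into faster material
```

Production implementations like the one described above parallelize naturally, since each source-receiver ray can be bent independently, which is the distributed computation the authors exploit.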
6. Advances in the Integration of Large Data Sets for Seismic Monitoring of Nuclear Explosions
- Author
-
SANDIA NATIONAL LABS ALBUQUERQUE NM, Carr, Dorthe B., Lewis, Jennifer E., Ballard, Sandy, Martinez, Elaine M., Hampton, Jeff W., Merchant, Bion J., Stead, Richard J., Crown, Michelle, Roman-Nieves, Jorge, and Dwyer, John J.
- Abstract
The National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Engineering (GNEMRE) program has been integrating large sets of seismic events and their associated measurements for almost a decade to support nuclear explosion monitoring. During that time the integration process has changed significantly, generally becoming more complex and more automated as the number of events and the range of associated measurements have steadily grown. In this paper, we explain the methodology for integrating database tables from different products that are part of a Knowledge Base (KB) release. The major effort of KB integration is merging events and their associated information in Oracle database tables. We have developed a substantial foundation of structure and software to assure data integrity in the integration of diverse data sets. The structural part of this foundation utilizes Oracle data dictionary tables along with complementary custom database tables. These custom tables contain information specifically related to how KB database objects are built. Information such as descriptions and database types is stored in these custom tables so that the KB structure is easily modifiable, making it more flexible than it would be under a traditional database design. This "metadata" of the supporting structures is called the "schema schema," and all the integration tools are based on this structure. Los Alamos National Laboratory (LANL) has developed a tool, the Quality Control Tool (QCTool), to make sure the information in the product database tables is accurate and valid. QCTool checks that the database tables conform to the database structures found in the "schema schema" and that the values in the database tables are reasonable. Database tables are not merged until the errors generated by QCTool are either corrected or explained. (An illustrative sketch of such metadata-driven checks follows this entry.)
Presented at the 29th Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies Conference, Denver, CO, 25-27 September 2007. Published in the Proceedings of the conference, p. 927-934, September 2007. The original document contains color images.
- Published
- 2007
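The abstract above explains that table definitions and validity information live in metadata ("schema schema") tables, and that QCTool validates product tables against that metadata before any merge. The Python sketch below illustrates the general pattern of metadata-driven validation; the table name, column rules, and ranges are invented for the example and are not the actual NNSA KB schema or QCTool logic.

```python
# "Schema schema": metadata describing what each table's columns must look like.
SCHEMA_SCHEMA = {
    "origin": {
        "lat":   {"type": float, "min": -90.0,  "max": 90.0},
        "lon":   {"type": float, "min": -180.0, "max": 180.0},
        "depth": {"type": float, "min": -10.0,  "max": 700.0},   # km
        "evid":  {"type": int,   "min": 1,      "max": 10**12},  # event identifier
    },
}

def qc_table(table_name, rows, schema=SCHEMA_SCHEMA):
    """Validate rows against the metadata; return error strings (empty list = pass)."""
    errors = []
    for i, row in enumerate(rows):
        for col, rule in schema[table_name].items():
            if col not in row:
                errors.append(f"row {i}: missing column '{col}'")
                continue
            value = row[col]
            if not isinstance(value, rule["type"]):
                errors.append(f"row {i}: {col}={value!r} is not {rule['type'].__name__}")
            elif not rule["min"] <= value <= rule["max"]:
                errors.append(f"row {i}: {col}={value} outside [{rule['min']}, {rule['max']}]")
    return errors

# Toy usage: the second row would block the merge until corrected or explained.
rows = [
    {"lat": 35.1, "lon": -106.6, "depth": 5.0, "evid": 1001},
    {"lat": 95.0, "lon": -106.6, "depth": 5.0, "evid": 1002},   # latitude out of range
]
for problem in qc_table("origin", rows):
    print(problem)
```

Keeping the rules in data rather than in code mirrors the flexibility argument in the abstract: adding a column or changing a valid range is a metadata edit, not a change to the integration tools themselves.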