32 results for "Specht, Sebastian"
Search Results
2. The impact of the Covid-19 pandemic and government intervention on active mobility
- Author
-
Möllers, Alessa, Specht, Sebastian, and Wessel, Jan
- Published
- 2022
- Full Text
- View/download PDF
3. REHEATFUNQ (REgional HEAT-Flow Uncertainty and aNomaly Quantification) 2.0.1: a model for regional aggregate heat flow distributions and anomaly quantification.
- Author
-
Ziebarth, Malte Jörn and von Specht, Sebastian
- Subjects
- GEOTHERMAL ecology, GAMMA distributions, RANDOM variables, BAYESIAN field theory, GOODNESS-of-fit tests, DATABASES
- Abstract
Surface heat flow is a geophysical variable that is affected by a complex combination of various heat generation and transport processes. The processes act on different length scales, from tens of meters to hundreds of kilometers. In general, it is not possible to resolve all processes due to a lack of data or modeling resources, and hence the heat flow data within a region are subject to residual fluctuations. We introduce the REgional HEAT-Flow Uncertainty and aNomaly Quantification (REHEATFUNQ) model, version 2.0.1. At its core, REHEATFUNQ uses a stochastic model for heat flow within a region, considering the aggregate heat flow to be generated by a gamma-distributed random variable. Based on this assumption, REHEATFUNQ uses Bayesian inference to (i) quantify the regional aggregate heat flow distribution (RAHFD) and (ii) estimate the strength of a given heat flow anomaly, for instance as generated by a tectonically active fault. The inference uses a prior distribution conjugate to the gamma distribution for the RAHFDs, and we compute parameters for an uninformed prior distribution from the global heat flow database by Lucazeau (2019). Through the Bayesian inference, our model is the first of its kind to consistently account for the variability in regional heat flow in the inference of spatial signals in heat flow data. The interpretation of these spatial signals, in particular in terms of fault characteristics (particularly fault strength), forms a long-standing debate within the geophysical community. We describe the components of REHEATFUNQ and perform a series of goodness-of-fit tests and synthetic resilience analyses of the model. While our analysis reveals to some degree a misfit of our idealized empirical model with real-world heat flow, it simultaneously confirms the robustness of REHEATFUNQ to these model simplifications. We conclude with an application of REHEATFUNQ to the San Andreas fault in California. Our analysis finds heat flow data in the Mojave section to be sufficient for an analysis and concludes that stochastic variability can allow for a surprisingly large fault-generated heat flow anomaly to be compatible with the data. This indicates that heat flow alone may not be a suitable quantity to address fault strength of the San Andreas fault. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
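The REHEATFUNQ records in this list (items 3 and 8) describe Bayesian inference on gamma-distributed regional heat flow with a conjugate prior. The following is a minimal sketch of that idea using SciPy, not the REHEATFUNQ API itself; the heat flow values, the fixed shape parameter, and the prior parameters are illustrative assumptions (REHEATFUNQ uses the full conjugate prior of the gamma distribution rather than fixing the shape).

```python
import numpy as np
from scipy import stats

# Illustrative regional heat flow sample in mW/m^2 (assumed values, not real data)
q = np.array([52., 61., 58., 70., 66., 49., 75., 63., 57., 68.])

alpha = 6.0          # gamma shape parameter, held fixed for simplicity (assumption)
a0, b0 = 1.0, 0.01   # weakly informative Gamma(a0, b0) prior on the gamma rate beta (assumption)

# With the shape fixed, the Gamma prior on the rate beta is conjugate:
# beta | data ~ Gamma(a0 + n*alpha, b0 + sum(q))
a_post = a0 + q.size * alpha
b_post = b0 + q.sum()

# Posterior samples of beta, converted into the regional mean heat flow mu = alpha / beta
beta_samples = stats.gamma.rvs(a_post, scale=1.0 / b_post, size=100_000, random_state=0)
mu_samples = alpha / beta_samples

lo, hi = np.percentile(mu_samples, [2.5, 97.5])
print(f"posterior mean heat flow: {mu_samples.mean():.1f} mW/m^2 (95% CI {lo:.1f}-{hi:.1f})")
```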
4. Unchanged frequency of moraine-dammed glacial lake outburst floods in the Himalaya
- Author
-
Veh, Georg, Korup, Oliver, von Specht, Sebastian, Roessner, Sigrid, and Walz, Ariane
- Published
- 2019
- Full Text
- View/download PDF
5. Comprehensive data set of in situ hydraulic stimulation experiments for geothermal purposes at the Äspö Hard Rock Laboratory (Sweden).
- Author
-
Zang, Arno, Niemz, Peter, von Specht, Sebastian, Zimmermann, Günter, Milkereit, Claus, Plenkers, Katrin, and Klee, Gerd
- Subjects
- ROCK music, HORIZONTAL wells, ACOUSTIC emission, GRANITE, FLUID injection, SEISMIC arrays, CRYSTALLINE rocks, HYDRAULIC fracturing
- Abstract
In this article, a high-resolution acoustic emission sensor, accelerometer, and broadband seismometer array data set from in situ experiments performed at the Äspö Hard Rock Laboratory in May and June 2015 is made available and described in detail. The main goal of the hydraulic stimulation tests in a horizontal borehole at 410 m depth in a naturally fractured granitic rock mass is to demonstrate the technical feasibility of generating multi-stage heat exchangers in a controlled way, superior to the massive stimulations formerly applied in enhanced geothermal projects. A set of six sub-parallel hydraulic fractures is propagated from an injection borehole drilled parallel to the minimum horizontal in situ stress and is monitored by an extensive complementary sensor array implemented in three inclined monitoring boreholes and the nearby tunnel system. Three different fluid injection protocols are tested: constant water injection, progressive cyclic injection, and cyclic injection with a hydraulic hammer operating at 5 Hz frequency to stimulate a crystalline rock volume of 30 m × 30 m × 30 m at depth. We collected geological data from core and borehole logs, fracture inspection data from an impression packer, and acoustic emission hypocenter tracking and tilt data, and we quantified the permeability enhancement process. The data and interpretation provided through this publication are important steps in both upscaling laboratory tests and downscaling field tests in granitic rock in the framework of enhanced geothermal system research. Data described in this paper can be accessed at GFZ Data Services under 10.5880/GFZ.2.6.2023.004. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. Revisiting the San Andreas Heat Flow Paradox From the Perspective of Seismic Efficiency and Elastic Power in Southern California.
- Author
-
Ziebarth, Malte J., Anderson, John G., von Specht, Sebastian, Heidbach, Oliver, and Cotton, Fabrice
- Subjects
- SURFACE of the earth, PARADOX, CALORIMETRY, FRICTION, EARTHQUAKES
- Abstract
We investigate the relation between frictional heating on a fault and the resulting conductive surface heat flow anomaly using the fault's long‐term energy budget. Analysis of the surface heat flow surrounding the fault trace leads to a constraint on the frictional power generated on the fault—the mechanism behind the San Andreas fault (SAF) heat flow paradox. We revisit this paradox from a new perspective using an estimate of the long‐term accumulating elastic power in the region surrounding the fault, and analyze the paradox using two parameters: the seismic efficiency and the elastic power. The results show that the constraint on frictional power from the classic interpretation is incompatible with the accumulating elastic power and the radiated power from earthquake catalogs. We then explore four mechanisms that can resolve this extended paradox. First, stochastic fluctuations of surface heat flow could mask the fault‐generated anomaly (we estimate 21% probability). Second, the elastic power accumulating in the region could be overestimated (≥550 MW required). Third, the seismic efficiency—ratio of radiated energy to elastic work—of the SAF could be higher than that of the remaining faults in the region (≥5.8% required). Fourth, the scaled energy—ratio of radiated energy to seismic moment—on the SAF could be lower than on the remaining faults in the region (a factor 5 difference required). In the last three hypotheses, we analyze the interplay of the energy budget on a single fault with the total energy budget of the region. Plain Language Summary: When earthquakes move rock against rock, friction heats the contact surface. If this frictional resistance were like laboratory measurements of typical crustal rock, the heat would cause a considerable heat flow signature ("anomaly") at Earth's surface. For the San Andreas fault (SAF) in Southern California, such a signature has not been observed. One solution to this paradox is that the fault is weak. We approach the paradox from a new angle by using additionally the rate at which elastic energy accumulates in California. This elastic power is incompatible with the radiated power from earthquake catalogs and the maximum rate of frictional heating from the paradox if only simple assumptions are made. We call this conflict the extended heat flow paradox. Four mechanisms could individually resolve the extended paradox: randomness in regional heat flow measurements could conceal the anomaly, the elastic power on the SAF could be overestimated, the seismic efficiency (ratio of radiated energy per input work) on the SAF could be comparatively high, or the scaled energy (ratio of radiated energy per seismic moment) on the SAF could be comparatively low. A combination of multiple effects is possible. Key Points: Heat flow around the San Andreas fault is incompatible with radiated power and elastic input power under simple assumptions. Regionally, a stochastic view on heat flow and/or an overestimated total elastic power can resolve the paradox. Locally, a high seismic efficiency or a low scaled energy (radiated energy/seismic moment) on the San Andreas fault can resolve the paradox. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
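Item 6 above frames the San Andreas heat flow paradox through a long-term energy budget involving the seismic efficiency (radiated energy over elastic work). The sketch below illustrates that bookkeeping with back-of-the-envelope numbers; every value is an illustrative assumption, not an input or result of the paper.

```python
# Simple long-term fault energy budget (all numbers are illustrative assumptions)
f         = 0.6                            # laboratory-like friction coefficient
sigma_n   = 100e6                          # mean effective normal stress over the seismogenic zone [Pa]
slip_rate = 0.035 / (365.25 * 86400.0)     # 35 mm/yr expressed in m/s
length    = 200e3                          # fault segment length [m]
depth     = 15e3                           # seismogenic depth [m]

tau    = f * sigma_n                       # frictional shear stress [Pa]
P_fric = tau * slip_rate * length * depth  # long-term frictional heat power [W]

# Partition the elastic input power into radiated power and frictional heating
# (other sinks neglected in this toy budget).
P_radiated = 0.03e9                        # radiated power from a catalogue (assumption) [W]
P_elastic  = P_fric + P_radiated
efficiency = P_radiated / P_elastic        # seismic efficiency

print(f"frictional power ~ {P_fric/1e6:.0f} MW, implied seismic efficiency ~ {efficiency:.1%}")
```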
7. Comprehensive data set of in-situ hydraulic stimulation experiments for geothermal purposes at the Äspö Hard Rock Laboratory (Sweden).
- Author
-
Zang, Arno, Niemz, Peter, von Specht, Sebastian, Zimmermann, Günter, Milkereit, Claus, Plenkers, Katrin, and Klee, Gerd
- Subjects
- ROCK music, ACOUSTIC emission, GRANITE, SEISMIC arrays, HYDRAULIC fracturing, CRYSTALLINE rocks
- Abstract
In this article, a high-resolution acoustic emission sensor, accelerometer, and broadband seismometer array data set from in-situ experiments performed at the Äspö Hard Rock Laboratory in May and June 2015 is made available and described in detail. The main goal of the hydraulic stimulation tests in a horizontal borehole at 410 m depth in a naturally fractured granitic rock mass is to demonstrate the technical feasibility of generating multi-stage heat exchangers in a controlled way, superior to the massive stimulations formerly applied in enhanced geothermal projects. A set of six sub-parallel hydraulic fractures is propagated from an injection borehole drilled parallel to the minimum horizontal in-situ stress and is monitored by an extensive complementary sensor array implemented in three inclined monitoring boreholes and the nearby tunnel system. Three different fluid-injection protocols are tested: constant water injection, progressive cyclic injection, and cyclic injection with a hydraulic hammer operating at 5 Hz frequency to stimulate a crystalline rock volume of 30×30×30 m at depth. We collected geological data from core and borehole logs, fracture inspection data from an impression packer, and acoustic emission hypocenter tracking and tilt data, and we quantified the permeability enhancement process. The data and interpretation provided through this publication are important steps in both upscaling laboratory tests and downscaling field tests in granitic rock in the framework of enhanced geothermal system research. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
8. REHEATFUNQ 1.4.0: A model for regional aggregate heat flow distributions and anomaly quantification.
- Author
-
Ziebarth, Malte Jörn and Specht, Sebastian von
- Subjects
- GAMMA distributions, BAYESIAN field theory, GOODNESS-of-fit tests, RANDOM variables, COMMUNITIES
- Abstract
Surface heat flow is a geophysical variable that is affected by a complex combination of various heat generation and transport processes. The processes act on different length scales, from tens of meters to hundreds of kilometers. In general, it is not possible to resolve all processes due to a lack of data or modeling resources, and hence the heat flow data within a region are subject to residual fluctuations. We introduce the REgional HEAT-Flow Uncertainty and aNomaly Quantification (REHEATFUNQ) model, version 1.4.0. At its core, REHEATFUNQ uses a stochastic model for heat flow within a region, considering the aggregate heat flow to be generated by a gamma-distributed random variable. Based on this assumption, REHEATFUNQ uses Bayesian inference to (i) quantify the regional aggregate heat flow distribution (RAHFD) and (ii) estimate the strength of a given heat flow anomaly, for instance as generated by a tectonically active fault. The inference uses a prior conjugate to the gamma distribution for the RAHFDs, and we compute parameters for an uninformed prior from the global heat flow database by Lucazeau (2019). Through the Bayesian inference, our model is the first of its kind to consistently account for the variability of regional heat flow in the inference of spatial signals in heat flow data. The interpretation of these spatial signals, in particular in terms of fault characteristics (particularly fault strength), is a long-standing debate within the geophysical community. We describe the components of REHEATFUNQ and perform a series of goodness-of-fit tests and synthetic resilience analyses of the model. While our analysis reveals to some degree a misfit of our idealized empirical model with real-world heat flow, it simultaneously confirms the robustness of REHEATFUNQ to these model simplifications. We conclude with an application of REHEATFUNQ to the San Andreas fault in California. Our analysis finds heat flow data in the Mojave section to be sufficient for an analysis and concludes that stochastic variability can allow for a surprisingly large fault-generated heat flow anomaly to be compatible with the data. This indicates that heat flow alone may not be a suitable quantity to address fault strength of the San Andreas fault. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
9. Mixtures of ground-motion prediction equations as backbone models for a logic tree: an application to the subduction zone in Northern Chile
- Author
-
Haendel, Annabel, Specht, Sebastian, Kuehn, Nicolas M., and Scherbaum, Frank
- Published
- 2015
- Full Text
- View/download PDF
10. ICH: Strengths, Weaknesses, and Future Tasks
- Author
-
Specht, Sebastian and Klingmann, Ingrid
- Published
- 2014
- Full Text
- View/download PDF
11. Chronic Pancreatitis Is Associated With Disease-Specific Regulatory T-Cell Responses
- Author
-
Schmitz–Winnenthal, Hubertus, Pietsch, Dong–Ho Kim, Schimmack, Simon, Bonertz, Andreas, Udonta, Florian, Ge, Yingzi, Galindo, Luis, Specht, Sebastian, Volk, Christine, Zgraggen, Kaspar, Koch, Moritz, Büchler, Markus W., Weitz, Jürgen, and Beckhove, Philipp
- Published
- 2010
- Full Text
- View/download PDF
12. Die Visualisierung von Pendlerverflechtungen — eine Herausforderung [The visualization of commuter flows — a challenge]
- Author
-
Hanewinkel, Christian and Specht, Sebastian
- Published
- 2010
- Full Text
- View/download PDF
13. Ground-Motion Modeling as an Image Processing Task: Introducing a Neural Network Based, Fully Data-Driven, and Nonergodic Approach.
- Author
-
Lilienkamp, Henning, von Specht, Sebastian, Weatherill, Graeme, Caire, Giuseppe, and Cotton, Fabrice
- Abstract
We construct and examine the prototype of a deep learning-based ground-motion model (GMM) that is both fully data driven and nonergodic. We formulate ground-motion modeling as an image processing task, in which a specific type of neural network, the U-Net, relates continuous, horizontal maps of earthquake predictive parameters to sparse observations of a ground-motion intensity measure (IM). The processing of map-shaped data allows the natural incorporation of absolute earthquake source and observation site coordinates, and is, therefore, well suited to include site-, source-, and path-specific amplification effects in a nonergodic GMM. Data-driven interpolation of the IM between observation points is an inherent feature of the U-Net and requires no a priori assumptions. We evaluate our model using both a synthetic dataset and a subset of observations from the KiK-net strong motion network in the Kanto basin in Japan. We find that the U-Net model is capable of learning the magnitude-distance scaling, as well as site-, source-, and path-specific amplification effects from a strong motion dataset. The interpolation scheme is evaluated using a fivefold cross validation and is found to provide on average unbiased predictions. The magnitude-distance scaling as well as the site amplification of response spectral acceleration at a period of 1 s obtained for the Kanto basin are comparable to previous regional studies. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
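Item 13 above formulates ground-motion modeling as an image-to-image task in which a U-Net maps gridded predictor maps to an intensity measure observed only at sparse station pixels. Below is a toy two-level U-Net in PyTorch with a masked loss over sparse observations; the architecture, channel counts, and loss are illustrative assumptions, not the authors' model.

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """Toy two-level U-Net: predictor maps in, one intensity-measure map out."""
    def __init__(self, in_ch: int = 4, out_ch: int = 1):
        super().__init__()
        self.enc1 = self._block(in_ch, 16)
        self.enc2 = self._block(16, 32)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = self._block(32, 64)
        self.up2 = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.dec2 = self._block(64, 32)
        self.up1 = nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2)
        self.dec1 = self._block(32, 16)
        self.head = nn.Conv2d(16, out_ch, kernel_size=1)

    @staticmethod
    def _block(cin, cout):
        return nn.Sequential(
            nn.Conv2d(cin, cout, 3, padding=1), nn.ReLU(),
            nn.Conv2d(cout, cout, 3, padding=1), nn.ReLU(),
        )

    def forward(self, x):
        e1 = self.enc1(x)                    # (N, 16, H,   W)
        e2 = self.enc2(self.pool(e1))        # (N, 32, H/2, W/2)
        b  = self.bottleneck(self.pool(e2))  # (N, 64, H/4, W/4)
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)                 # predicted intensity-measure map

def masked_mse(pred, target, mask):
    # Loss is evaluated only where the mask marks a station observation.
    return ((pred - target) ** 2 * mask).sum() / mask.sum()

# Usage with synthetic tensors: four predictor maps on a 64x64 grid.
x      = torch.randn(2, 4, 64, 64)
target = torch.randn(2, 1, 64, 64)
mask   = (torch.rand(2, 1, 64, 64) < 0.02).float()   # ~2% of pixels carry stations
pred   = TinyUNet()(x)
print(masked_mse(pred, target, mask))
```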
14. Within- and Between-Event Variabilities of Strong-Velocity Pulses of Moderate Earthquakes within Dense Seismic Arrays.
- Author
-
Ming-Hsuan Yen, von Specht, Sebastian, Yen-Yu Lin, Cotton, Fabrice, and Kuo-Fong Ma
- Abstract
Ground motion with strong-velocity pulses can cause significant damage to buildings and structures at certain periods; hence, knowing the period and velocity amplitude of such pulses is critical for earthquake structural engineering. However, the physical factors relating the scaling of pulse periods with magnitude are poorly understood. In this study, we investigate moderate but damaging earthquakes (Mw 6-7) and characterize ground-motion pulses using the method of Shahi and Baker (2014) while considering the potential static-offset effects. We confirm that the within-event variability of the pulses is large. The identified pulses in this study are mostly from strike-slip-like earthquakes. We further perform simulations using the frequency-wavenumber algorithm to investigate the causes of the variability of the pulse periods within and between events for moderate strike-slip earthquakes. We test the effect of fault dips, and the impact of the asperity locations and sizes. The simulations reveal that the asperity properties have a high impact on the pulse periods and amplitudes at nearby stations. Our results emphasize the importance of asperity characteristics, in addition to earthquake magnitudes for the occurrence and properties of pulses produced by the forward directivity effect. We finally quantify and discuss within- and between-event variabilities of pulse properties at short distances. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
15. A Link between Machine Learning and Optimization in Ground-Motion Model Development: Weighted Mixed-Effects Regression with Data-Driven Probabilistic Earthquake Classification.
- Author
-
von Specht, Sebastian and Cotton, Fabrice
- Abstract
The steady increase of ground-motion data not only allows new possibilities but also comes with new challenges in the development of ground-motion models (GMMs). Data classification techniques (e.g., cluster analysis) produce not only deterministic classifications but also probabilistic classifications (e.g., probabilities for each datum to belong to a given class or cluster). One challenge is the integration of such continuous classification in regressions for GMM development, such as the widely used mixed-effects model. We address this issue by introducing an extension of the mixed-effects model to incorporate data weighting. The parameter estimation of the mixed-effects model, that is, of the fixed-effects coefficients of the GMMs and the random-effects variances, is based on the weighted likelihood function, which also provides analytic uncertainty estimates. The data weighting permits earthquake classification beyond the classical, expert-driven, binary classification based, for example, on event depth, distance to trench, style of faulting, and fault dip angle. We apply Angular Classification with Expectation-maximization, an algorithm to identify clusters of nodal planes from focal mechanisms, to differentiate between, for example, interface- and intraslab-type events. Classification is continuous, that is, no event belongs completely to one class, which is taken into account in the ground-motion modeling. The theoretical framework described in this article allows for a fully automatic calibration of ground-motion models using large databases with automated classification and processing of earthquake and ground-motion data. As an example, we developed a GMM on the basis of the GMM by Montalva et al. (2017) with data from the strong-motion flat file of Bastías and Montalva (2016) with ~2400 records from 319 events in the Chilean subduction zone. Our GMM with the data-driven classification is comparable to the expert-classification-based model. Furthermore, the model shows temporal variations of the between-event residuals before and after large earthquakes in the region. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
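Item 15 above (and its conference version, item 28) builds a weighted likelihood in which each record contributes to several event classes according to probabilistic (soft) classifications. The sketch below shows that core idea for two simple fixed-effects ground-motion relations fitted to synthetic data; it omits the random-effects structure of the actual method, and all data, weights, and coefficients are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Synthetic records: magnitude M, log-distance lnR, observed ln(PGA) y, and a
# soft class probability w (e.g. P(interface) from a focal-mechanism clustering step).
n   = 400
M   = rng.uniform(5.0, 8.0, n)
lnR = np.log(rng.uniform(30.0, 300.0, n))
w   = rng.uniform(0.0, 1.0, n)                    # P(class 1 = "interface")
predict = lambda M, lnR, c: c[0] + c[1] * M + c[2] * lnR
y = np.where(rng.uniform(size=n) < w,
             predict(M, lnR, [-1.0, 1.1, -1.6]),  # class 1 "truth"
             predict(M, lnR, [-2.0, 1.3, -1.9]))  # class 2 "truth"
y = y + rng.normal(0.0, 0.5, n)

def neg_weighted_loglik(theta):
    c1, c2, sigma = theta[0:3], theta[3:6], np.exp(theta[6])
    ll1 = norm.logpdf(y, predict(M, lnR, c1), sigma)
    ll2 = norm.logpdf(y, predict(M, lnR, c2), sigma)
    # Each record contributes to both class models, weighted by its class
    # probability -- the core idea of the weighted likelihood.
    return -np.sum(w * ll1 + (1.0 - w) * ll2)

res = minimize(neg_weighted_loglik, x0=np.zeros(7), method="L-BFGS-B")
print("class-1 coefficients:", np.round(res.x[0:3], 2))
print("class-2 coefficients:", np.round(res.x[3:6], 2))
print("sigma:", round(float(np.exp(res.x[6])), 2))
```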
16. Applying Conservation of Energy to Estimate Earthquake Frequencies from Strain Rates and Stresses.
- Author
-
Ziebarth, Malte J., Specht, Sebastian, Heidbach, Oliver, Cotton, Fabrice, and Anderson, John G.
- Subjects
- EARTHQUAKES, EARTHQUAKE hazard analysis, SEISMIC waves, SEISMOLOGY, GEOPHYSICS
- Abstract
Estimating earthquake occurrence rates from the accumulation rate of seismic moment is an established tool of seismic hazard analysis. We propose an alternative, fault‐agnostic approach based on the conservation of energy: the Energy‐Conserving Seismicity Framework (ENCOS). Working in energy space has the advantage that the radiated energy is a better predictor of the damage potential of earthquake waves than the seismic moment release. In a region, ENCOS balances the stationary power available to cause earthquakes with the long‐term seismic energy release represented by the energy‐frequency distribution's first moment. Accumulation and release are connected through the average seismic efficiency, by which we mean the fraction of released energy that is converted into seismic waves. Besides measuring earthquakes in energy, ENCOS differs from moment balance essentially in that the energy accumulation rate depends on the total stress in addition to the strain rate tensor. To validate ENCOS, we model, as an example, the energy‐frequency distribution around Southern California. We estimate the energy accumulation rate due to tectonic loading assuming poroelasticity and hydrostasis. Using data from the World Stress Map and assuming the frictional limit to estimate the stress tensor, we obtain a power of 0.8 GW. The uncertainty range, 0.3–2.0 GW, originates mainly from the thickness of the seismogenic crust, the friction coefficient on preexisting faults, and models of Global Positioning System (GPS) derived strain rates. Based on a Gutenberg‐Richter magnitude‐frequency distribution, this power can be distributed over a range of energies consistent with historical earthquake rates and reasonable bounds on the seismic efficiency. Key Points: Conservation of energy is used to estimate long‐term earthquake occurrence rates from geomechanical modeling of relevant processes. The elastic power is estimated using Global Positioning System and stress data, assuming linear poroelasticity and the frictional limit. In Southern California, the earthquake occurrence rates modeled from the elastic power are compatible with observed seismicity. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
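Item 16 above balances a region's elastic power against the first moment of the energy-frequency distribution. A minimal numerical sketch of that balance is given below: it distributes an assumed fraction of the 0.8 GW figure quoted in the abstract over a truncated Gutenberg-Richter distribution, using the Gutenberg-Richter energy-magnitude relation; the efficiency, b-value, and magnitude bounds are assumptions, and this is not the ENCOS implementation.

```python
import numpy as np

# Assumed illustrative values (only the 0.8 GW central value is quoted in the abstract)
P_elastic = 0.8e9            # elastic power [W]
eta       = 0.05             # average seismic efficiency (assumption)
b         = 1.0              # Gutenberg-Richter b-value (assumption)
M_min, M_max = 4.0, 8.0      # magnitude bounds of the truncated distribution (assumption)

# Radiated energy per event, Gutenberg-Richter energy relation: log10 E = 1.5 M + 4.8 (E in J)
E_of_M = lambda M: 10.0 ** (1.5 * M + 4.8)

# Magnitude-frequency density (events/yr per unit magnitude) for a-value = 0
Mgrid = np.linspace(M_min, M_max, 2001)
n0 = b * np.log(10.0) * 10.0 ** (-b * Mgrid)

# Solve for the a-value so that the long-term radiated power equals eta * P_elastic
E_rate_a0 = np.trapz(E_of_M(Mgrid) * n0, Mgrid)       # J/yr for a = 0
target    = eta * P_elastic * 365.25 * 86400.0        # J/yr available for radiation
a = np.log10(target / E_rate_a0)

rate_M6 = 10 ** (a - b * 6.0) - 10 ** (a - b * M_max) # truncated GR rate of M >= 6
print(f"a-value = {a:.2f}, events/yr with M >= 6: {rate_M6:.2f}")
```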
17. Full-waveform-based characterization of acoustic emission activity in a mine-scale experiment: a comparison of conventional and advanced hydraulic fracturing schemes.
- Author
-
Niemz, Peter, Cesca, Simone, Heimann, Sebastian, Grigoli, Francesco, von Specht, Sebastian, Hammer, Conny, Zang, Arno, and Dahm, Torsten
- Subjects
- INDUCED seismicity, HYDRAULIC fracturing, PIEZOELECTRIC detectors, CRYSTALLINE rocks, ACOUSTIC emission, SENSOR networks, FLUID injection
- Abstract
Understanding fracturing processes and the hydromechanical relation to induced seismicity is a key question for enhanced geothermal systems (EGS). Commonly, massive fluid injections, predominantly causing hydroshearing, are used in large-scale EGS, but hydraulic fracturing approaches have also been discussed. To evaluate the applicability of hydraulic fracturing techniques in EGS, six in situ, multistage hydraulic fracturing experiments with three different injection schemes were performed under controlled conditions in crystalline rock at the Äspö Hard Rock Laboratory (Sweden). During the experiments the near-field ground motion was continuously recorded by 11 piezoelectric borehole sensors with a sampling rate of 1 MHz. The sensor network covered a volume of 30×30×30 m around a horizontal, 28-m-long injection borehole at a depth of 410 m. To extract and characterize the massive, induced, high-frequency acoustic emission (AE) activity from the continuous recordings, a semi-automated workflow was developed relying on full-waveform-based detection, classification and location procedures. The approach extended the AE catalogue from 196 triggered events in previous studies to more than 19 600 located AEs. The enhanced catalogue, for the first time, allows a detailed analysis of induced seismicity during single hydraulic fracturing experiments, including the individual fracturing stages and the comparison between injection schemes. Besides the detailed study of the spatio-temporal patterns, event clusters and the growth of seismic clouds, we estimate relative magnitudes and b-values of AEs for conventional, cyclic progressive and dynamic pulse injection schemes, the latter two being fatigue hydraulic fracturing techniques. While the conventional fracturing leads to AE patterns clustered in planar regions, indicating the generation of a single main fracture plane, the cyclic progressive injection scheme results in a more diffuse, cloud-like AE distribution, indicating the activation of a more complex fracture network. For a given amount of hydraulic energy (pressure multiplied by injected volume) pumped into the system, the cyclic progressive scheme is characterized by a lower rate of seismicity, lower maximum magnitudes and significantly larger b-values, implying an increased number of small events relative to the large ones. To our knowledge, this is the first direct comparison of high-resolution seismicity in a mine-scale experiment induced by different hydraulic fracturing schemes. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
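Item 17 above compares b-values of acoustic emission catalogues between injection schemes. The snippet below shows the standard Aki/Utsu maximum-likelihood b-value estimate on a synthetic catalogue; the completeness magnitude, binning width, and the magnitudes themselves are assumptions, and this is not the paper's processing chain.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic relative AE magnitudes drawn from an exponential (GR-like) distribution
b_true, Mc, dM = 1.4, -3.5, 0.1                      # assumptions
mags = Mc + rng.exponential(scale=np.log10(np.e) / b_true, size=2000)
mags = np.round(mags / dM) * dM                      # bin magnitudes to width dM

m = mags[mags >= Mc]                                 # keep only complete part of catalogue
b_est = np.log10(np.e) / (m.mean() - (Mc - dM / 2.0))  # Aki (1965) / Utsu (1966) estimator
b_err = b_est / np.sqrt(m.size)                      # Aki's first-order uncertainty

print(f"b-value = {b_est:.2f} +/- {b_err:.2f} (true value used for simulation: {b_true})")
```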
18. A Regionalized Seismicity Model for Subduction Zones Based on Geodetic Strain Rates, Geomechanical Parameters, and Earthquake-Catalog Data.
- Author
-
Viveros, José Antonio Bayona, von Specht, Sebastian, Strader, Anne, Hainzl, Sebastian, Cotton, Fabrice, and Schorlemmer, Danijel
- Abstract
The Seismic Hazard Inferred from Tectonics based on the Global Strain Rate Map (SHIFT_GSRM) earthquake forecast was designed to provide high-resolution estimates of global shallow seismicity to be used in seismic hazard assessment. This model combines geodetic strain rates with global earthquake parameters to characterize long-term rates of seismic moment and earthquake activity. Although SHIFT_GSRM properly computes seismicity rates in seismically active continental regions, it underestimates earthquake rates in subduction zones by an average factor of approximately 3. We present a complementary method to SHIFT_GSRM to more accurately forecast earthquake rates in 37 subduction segments, based on the conservation of moment principle and the use of regional interface seismicity parameters, such as subduction dip angles, corner magnitudes, and coupled seismogenic thicknesses. In seven progressive steps, we find that SHIFT_GSRM earthquake-rate underpredictions are mainly due to the utilization of a global probability function of seismic moment release that poorly captures the great variability among subduction megathrust interfaces. Retrospective test results show that the forecast is consistent with the observations during the 1 January 1977 to 31 December 2014 period. Moreover, successful pseudoprospective evaluations for the 1 January 2015 to 31 December 2018 period demonstrate the power of the regionalized earthquake model to properly estimate subduction-zone seismicity. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
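Items 18 and 21 above convert geodetic strain rates into long-term seismicity via the conservation of moment. The sketch below uses a simplified Kostrov-type scalar mapping from strain rate to seismic moment rate and expresses it as an equivalent event rate; all parameters are illustrative assumptions and the actual SHIFT_GSRM formulation differs in detail.

```python
# Illustrative subduction-segment parameters (assumptions, not SHIFT_GSRM inputs)
mu      = 3.0e10          # shear modulus [Pa]
H       = 40e3            # coupled seismogenic thickness [m]
area    = 600e3 * 200e3   # segment surface area [m^2]
eps_dot = 3.0e-15         # characteristic strain rate [1/s]

# Kostrov-type scalar mapping from strain rate to seismic moment rate
M0_dot = 2.0 * mu * H * area * eps_dot            # [N*m / s]

# Equivalent annual number of Mw 8 events if all moment were released that way
M0_of = lambda Mw: 10.0 ** (1.5 * Mw + 9.1)       # seismic moment [N*m]
n_Mw8_per_year = M0_dot * 365.25 * 86400.0 / M0_of(8.0)

print(f"moment rate: {M0_dot:.2e} N*m/s, equivalent Mw 8 events per year: {n_Mw8_per_year:.3f}")
```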
19. Effects of finite source rupture on landslide triggering: the 2016 Mw 7.1 Kumamoto earthquake.
- Author
-
von Specht, Sebastian, Ozturk, Ugur, Veh, Georg, Cotton, Fabrice, and Korup, Oliver
- Subjects
- LANDSLIDES, LANDSLIDE hazard analysis, SEISMIC waves, EARTHQUAKES, LAND cover
- Abstract
The propagation of a seismic rupture on a fault introduces spatial variations in the seismic wave field surrounding the fault. This directivity effect results in larger shaking amplitudes in the rupture propagation direction. Its seismic radiation pattern also causes amplitude variations between the strike-normal and strike-parallel components of horizontal ground motion. We investigated the landslide response to these effects during the 2016 Kumamoto earthquake (Mw 7.1) in central Kyushu (Japan). Although the distribution of some 1500 earthquake-triggered landslides as a function of rupture distance is consistent with the observed Arias intensity, the landslides were more concentrated to the northeast of the southwest–northeast striking rupture. We examined several landslide susceptibility factors: hillslope inclination, the median amplification factor (MAF) of ground shaking, lithology, land cover, and topographic wetness. None of these factors sufficiently explains the landslide distribution or orientation (aspect), although the landslide head scarps have an elevated hillslope inclination and MAF. We propose a new physics-based ground-motion model (GMM) that accounts for the seismic rupture effects, and we demonstrate that the low-frequency seismic radiation pattern is consistent with the overall landslide distribution. Its spatial pattern is influenced by the rupture directivity effect, whereas landslide aspect is influenced by amplitude variations between the fault-normal and fault-parallel motion at frequencies <2 Hz. This azimuth dependence implies that comparable landslide concentrations can occur at different distances from the rupture. This quantitative link between the prevalent landslide aspect and the low-frequency seismic radiation pattern can improve coseismic landslide hazard assessment. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
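Items 19 and 20 above compare the landslide distribution with the observed Arias intensity. For reference, the snippet below computes the Arias intensity, Ia = pi/(2g) * integral of a(t)^2 dt, for a synthetic accelerogram; the trace itself is an assumption standing in for a recorded strong-motion time series.

```python
import numpy as np

g  = 9.81                                   # gravitational acceleration [m/s^2]
dt = 0.01                                   # sampling interval [s]
t  = np.arange(0.0, 40.0, dt)

# Synthetic accelerogram [m/s^2]: decaying sinusoid starting at t = 2 s (assumption)
acc = 0.3 * g * np.exp(-0.15 * t) * np.sin(2 * np.pi * 1.5 * t) * (t > 2.0)

arias = np.pi / (2.0 * g) * np.trapz(acc ** 2, dx=dt)   # Arias intensity [m/s]
print(f"Arias intensity: {arias:.3f} m/s")
```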
20. Effects of finite source rupture on landslide triggering: The 2016 MW 7.1 Kumamoto earthquake.
- Author
-
von Specht, Sebastian, Ozturk, Ugur, Veh, Georg, Cotton, Fabrice, and Korup, Oliver
- Subjects
- SPATIAL variation, SURFACE fault ruptures, LANDSLIDES
- Abstract
The propagation of a seismic rupture on a fault introduces spatial variations in the seismic wavefield surrounding the fault during an earthquake. This directivity effect results in larger shaking amplitudes in the rupture propagation direction. Its seismic radiation pattern also causes amplitude variations between the strike-normal and strike-parallel components of horizontal ground motion. We investigated the landslide response to these effects during the 2016 Kumamoto earthquake (MW 7.1) in central Kyūshū (Japan). Although the distribution of some 1,500 earthquake-triggered landslides as a function of rupture distance is consistent with the observed Arias intensity, the landslides are more concentrated to the northeast of the southwest-northeast striking rupture. We examined several landslide susceptibility factors: hillslope inclination, median amplification factor (MAF) of ground shaking, lithology, land cover, and topographic wetness. None of these factors can sufficiently explain the landslide distribution or orientation (aspect), although the landslide headscarps coincide with elevated hillslope inclination and MAF. We propose a new physics-based ground motion model that accounts for the seismic rupture effects, and demonstrate that the low-frequency seismic radiation pattern is consistent with the overall landslide distribution. The spatial landslide distribution is primarily influenced by the rupture directivity effect, whereas landslide aspect is influenced by amplitude variations between the fault-normal and fault-parallel motion at frequencies <2 Hz. This azimuth dependence implies that comparable landslide concentrations can occur at different distances from the rupture. This quantitative link between the prevalent landslide aspect and the low-frequency seismic radiation pattern can improve coseismic landslide hazard assessment. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
21. A Regionalized Seismicity Model for Subduction Zones Based on Geodetic Strain Rates, Geomechanical Parameters, and Earthquake-Catalog Data.
- Author
-
Viveros, José Antonio Bayona, von Specht, Sebastian, Strader, Anne, Hainzl, Sebastian, Cotton, Fabrice, and Schorlemmer, Danijel
- Abstract
The Seismic Hazard Inferred from Tectonics based on the Global Strain Rate Map (SHIFT_GSRM) earthquake forecast was designed to provide high-resolution estimates of global shallow seismicity to be used in seismic hazard assessment. This model combines geodetic strain rates with global earthquake parameters to characterize long-term rates of seismic moment and earthquake activity. Although SHIFT_GSRM properly computes seismicity rates in seismically active continental regions, it underestimates earthquake rates in subduction zones by an average factor of approximately 3. We present a complementary method to SHIFT_GSRM to more accurately forecast earthquake rates in 37 subduction segments, based on the conservation of moment principle and the use of regional interface seismicity parameters, such as subduction dip angles, corner magnitudes, and coupled seismogenic thicknesses. In seven progressive steps, we find that SHIFT_GSRM earthquake-rate underpredictions are mainly due to the utilization of a global probability function of seismic moment release that poorly captures the great variability among subduction megathrust interfaces. Retrospective test results show that the forecast is consistent with the observations during the 1 January 1977 to 31 December 2014 period. Moreover, successful pseudoprospective evaluations for the 1 January 2015 to 31 December 2018 period demonstrate the power of the regionalized earthquake model to properly estimate subduction-zone seismicity. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
22. Spatiotemporal Variations of Ground Motion in Northern Chile before and after the 2014 Mw 8.1 Iquique Megathrust Event.
- Author
-
Piña-Valdés, Jesús, Socquet, Anne, Cotton, Fabrice, and Specht, Sebastian
- Abstract
To evaluate the spatiotemporal variations of ground motions in northern Chile, we built a high-quality rock seismic acceleration database and an interface earthquake catalog. Two ground-motion prediction equation (GMPE) models for subduction zones have been tested and validated for the area. They were then used as backbone models to describe the time-space variations of earthquake frequency content (Fourier and response spectra). Consistent with previous studies of large subduction earthquakes, moderate interface earthquakes in northern Chile show an increase of the high-frequency energy released with depth. A regional variability of earthquake frequency content is also observed, which may be related to a lateral segmentation of the mechanical properties of the subduction interface. Finally, interface earthquakes show a temporal evolution of their frequency content in the earthquake sequence associated with the 2014 Iquique Mw 8.1 megathrust earthquake. Surprisingly, the change does not occur with the mainshock but is associated with an 8 month slow slip preceding the megathrust. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
23. An 8 month slow slip event triggers progressive nucleation of the 2014 Chile megathrust.
- Author
-
Socquet, Anne, Valdes, Jesus Piña, Jara, Jorge, Cotton, Fabrice, Walpersdorf, Andrea, Cotte, Nathalie, Specht, Sebastian, Ortega-Culaciati, Francisco, Carrizo, Daniel, and Norabuena, Edmundo
- Published
- 2017
- Full Text
- View/download PDF
24. Hydraulic fracture monitoring in hard rock at 410 m depth with an advanced fluid-injection protocol and extensive sensor array.
- Author
-
Zang, Arno, Stephansson, Ove, Stenberg, Leif, Plenkers, Katrin, Specht, Sebastian, Milkereit, Claus, Schill, Eva, Kwiatek, Grzegorz, Dresen, Georg, Zimmermann, Günter, Dahm, Torsten, and Weber, Michael
- Subjects
- HARD rock mining, HYDRAULIC fracturing, HEAT exchangers, MICROSEISMS, ACOUSTIC emission, ELECTROMAGNETISM
- Abstract
In this paper, an underground experiment at the Äspö Hard Rock Laboratory (HRL) is described. The main goal is to optimize geothermal heat exchange in crystalline rock mass at depth by multistage hydraulic fracturing with minimal impact on the environment, that is, with minimal seismic events. For this, three arrays with acoustic emission, microseismicity and electromagnetic sensors are installed to map hydraulic fracture initiation and growth. Fractures are driven by three different water injection schemes (continuous, progressive and pulse pressurization). After a brief review of hydraulic fracture operations in crystalline rock mass at mine scale, the site geology and the stress conditions at Äspö HRL are described. Then, the continuous, single-flow-rate and alternative, multiple-flow-rate fracture breakdown tests in a horizontal borehole at depth level 410 m are described together with the monitoring networks and their sensitivity. Monitoring results include the primary catalogue of acoustic emission hypocentres obtained from four hydraulic fractures with the in situ trigger and localizing network. The continuous versus alternative water injection schemes are discussed in terms of the fracture breakdown pressure, the fracture pattern from the impression packer results and the monitoring at the arrays. An example of multistage hydraulic fracturing with several phases of opening and closing of fracture walls is evaluated using data from acoustic emissions, seismic broad-band recordings and electromagnetic signal response. Based on our limited number of in situ tests (six) and the evaluation of three tests in Ävrö granodiorite, the acoustic emission activity in the multiple-flow-rate test with progressively increasing target pressure starts at a later stage of the fracturing process than in the conventional case with continuous water injection. The total number and magnitude of acoustic events also tend to be smaller in the progressive treatment with frequent phases of depressurization. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
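Items 17 and 24 above compare injection schemes in terms of hydraulic energy, i.e. pressure multiplied by injected volume. The snippet below integrates a synthetic pressure and flow-rate record to obtain that quantity; the time series are assumptions, not the Äspö data.

```python
import numpy as np

dt = 1.0                                                  # sampling interval [s]
t  = np.arange(0.0, 600.0, dt)

# Synthetic injection records (assumed shapes, not the Äspö data):
p = 8.0e6 + 4.0e6 * np.clip(t / 120.0, 0.0, 1.0)          # injection pressure [Pa]
q = np.where(t < 300.0, 4.0 / 60000.0, 0.0)               # flow rate [m^3/s] (4 L/min, then shut-in)

# Hydraulic energy = integral of p dV = integral of p * q dt
E_hyd = np.trapz(p * q, dx=dt)                            # [J]
V_inj = np.trapz(q, dx=dt)                                # injected volume [m^3]

print(f"injected volume: {V_inj * 1000:.1f} L, hydraulic energy: {E_hyd / 1e3:.1f} kJ")
```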
25. Designing a Cross-Border Health Atlas through Immersion in Health Services Research.
- Author
-
Specht, Sebastian
- Subjects
- MAP design, MEDICAL geography, MEDICAL care research
- Abstract
This work describes an ongoing cartographic design process in the field between medical geography and health services research. The desired result is the prototypical implementation of cartographic artefacts united under the umbrella of a "Cross-border Health Data Compass" (CHDC). The resulting web-based visual analytics software aims to support multidisciplinary cross-border healthcare research in the northern Dutch-German border region. Visualizations of consolidated and harmonized data from publicly available information sources will interact with a model of the spatial accessibility of health care facilities of the region. Medical geography describes the spatial accessibility of health care facilities as one of several dimensions of access to health care services. Of all the dimensions of access, the aspects of "accessibility" and "availability" (i.e. the presence of facilities) are spatial and distance-based and therefore empirically ascertainable (Kisteman et al. 2019). In addition, health care itself is part of a "social space" occurring in space and time. This implies that the regional socio-economic situation influences health care. For health services research these spatial dimensions are of great importance as well and therefore relevant background information. For this reason, an accessibility model built from OpenStreetMap network data will calculate potential catchment areas of hospitals and care facilities. The resulting accessibility model data will be an integral part of the cartographic visualisation of demographic and socio-economic data. Common requirements engineering techniques, qualitative study methods, and the "design by immersion" approach described by Hall (2020) provide the methods for the design process of this work (Figure 1). The latter understands the design effort of a visualisation researcher as an "immersion experience" in a specific (scientific) domain: the elicitation of cartographic requirements results from the visualisation problems of that specific domain and from reflection on the iterative and cooperative search for solutions. The design process started with a qualitative study among Dutch and German health researchers from the CBI initiative (n=9, semi-structured interviews). In the interviews, more than half of the participants stated that spatial aspects play a role, or at least a subordinate role, in their research. However, even though many collect data on socio-economic status as part of their own studies, only one researcher had ever recorded spatial information about the study subjects. Generally, the study detected an interest in spatial aspects of their research questions, but at the same time reservations about the feasibility of spatial analyses, the possible significance of the findings, and uncertainty about the availability of secondary data across the border. A number of challenges result from the findings of this first study: the transnational perspective of the CBI researchers calls for homogeneous data models from national sources and requires solutions to normalize data and classifications that are hard to compare. The variety of disciplines in the consortium calls for a wide range of spatial scales, down to spatial models using small-scale cartographic grids. The use of these grids in terms of statistical disclosure control, the soundness of disaggregation and small-area estimations, and methodological transparency pose the research questions for the CHDC and the accompanying cartographic visualizations. The presentation will give insight into the intermediate results of this design process, report on methodological and algorithmic considerations, and present the state of the visual analytics artefacts. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
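Item 25 above describes an accessibility model that derives catchment areas of health care facilities from OpenStreetMap network data. The toy sketch below computes travel times to the nearest facility on a small hand-made graph with NetworkX; the graph, facility locations, and the catchment threshold are assumptions, and a real model would build the network from OSM data.

```python
import networkx as nx

# Toy road network: nodes are places, edge attribute "minutes" is travel time.
# Names and times are made up for illustration.
G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 5), ("B", "C", 7), ("C", "D", 12),
    ("B", "E", 9), ("E", "F", 6), ("D", "F", 4),
], weight="minutes")

hospitals = {"C", "F"}   # assumed facility locations

# Travel time from every node to its nearest facility (multi-source Dijkstra)
tt = nx.multi_source_dijkstra_path_length(G, hospitals, weight="minutes")

# A 10-minute catchment: all nodes reachable from some facility within 10 minutes
catchment_10min = {node for node, minutes in tt.items() if minutes <= 10}

print("travel time to nearest facility:", tt)
print("10-minute catchment:", catchment_10min)
```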
26. Hsp42 is required for sequestration of protein aggregates into deposition sites in Saccharomyces cerevisiae.
- Author
-
Specht, Sebastian, Miller, Stephanie B. M., Mogk, Axel, and Bukau, Bernd
- Subjects
- PROTEIN research, PROTEIN folding, MOLECULAR structure of amino compounds, HEAT shock proteins, SACCHAROMYCES cerevisiae
- Abstract
The aggregation of proteins inside cells is an organized process with cytoprotective function. In Saccharomyces cerevisiae, aggregating proteins are spatially sequestered to either juxtanuclear or peripheral sites, which target distinct quality control pathways for refolding and degradation. The cellular machinery driving the sequestration of misfolded proteins to these sites is unknown. In this paper, we show that one of the two small heat shock proteins of yeast, Hsp42, is essential for the formation of peripheral aggregates during physiological heat stress. Hsp42 preferentially localizes to peripheral aggregates but is largely absent from juxtanuclear aggregates, which still form in hsp42Δ cells. Transferring the amino-terminal domain of Hsp42 to Hsp26, which does not participate in aggregate sorting, enables Hsp26 to replace Hsp42 function. Our data suggest that Hsp42 acts via its amino-terminal domain to coaggregate with misfolded proteins and perhaps link such complexes to further sorting factors. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
27. Detail or Disclosure - Towards a Visualisation of Confidentiality Related Spatial Damage to Demographic Grids.
- Author
-
Specht, Sebastian and Kramer, Bernd
- Subjects
- MUNICIPAL services, DEMOGRAPHY, MEDICAL care, CARTOGRAPHY, MAPS
- Abstract
Statistical data on demography is the basis for many population-related scientific questions, economic questions of health care, and questions of planning public services. Population data in equal-area cartographic grid cells appears to be a good basis, especially for use cases in inter-municipal contexts of administration and planning (Specht et al. 2019). Census results have been used since the 2011 census made small-scale population data for the entire Federal Republic of Germany available on a 100 m grid for the first time. Unfortunately, this data is not updated by the statistical offices. This presentation describes a use case of demographic grids implemented in a context of inter-municipal cooperation in the region of Bremen. As the calculation of population forecasts was an objective, small-scale data on migration was required. Similar to the approach in the census, demographic data and data on migration are recorded in the residents' registration offices (EMA) of the cooperating municipalities. However, since other legal frameworks apply outside the census, the process cannot be adopted as is. In the EMAs, individual-related micro-data are available, serving as a base file. Under the respective legal framework, the data is anonymised, geocoded and converted into an aggregated tabular form on site. Aggregated data may still contain individual cases worthy of protection. The higher the number of queried characteristics (region, gender, age, nationality etc.) and their differentiation (100 m grid or 1 km grid, age years or age groups, etc.), the higher the probability of encountering such cases. A number of procedures for statistical disclosure control are available, of which the SAFE procedure (Höhne 2015) (used in the 2011 census) is currently implemented in the project. As other methods or strategies are up for consideration, how can they be evaluated in a specific regional context? From the perspective of confidentiality, space is at first just one feature dimension among others, although there are approaches that explicitly take spatial interrelation into account (Young, Martin, and Skinner 2009). From a geographical point of view, however, high-resolution data, especially in sparsely populated areas, can generally be expected to show large before-and-after deviations as a result of confidentiality procedures. Depending on the subject matter, these spatial errors can have different degrees of relevance and thus be ultimately relevant for the selection of the confidentiality strategy. To support a decidedly spatial comparison of the effects of different classification, aggregation and confidentiality strategies, a set of indicators together with an interactive visualization for the project area under consideration is presented for discussion. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
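Item 27 above proposes indicators for the spatial "damage" that disclosure-control procedures inflict on demographic grids. The sketch below computes a few such before-and-after deviation indicators on a synthetic grid; the grids and the indicator choices are assumptions, not the project's actual indicator set.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 100 m population grid before and after a perturbative
# disclosure-control step (both arrays are made up for illustration).
before = rng.poisson(3.0, size=(50, 50)).astype(float)
after  = np.clip(before + rng.integers(-2, 3, size=before.shape), 0, None)

abs_dev = np.abs(after - before)               # cell-wise absolute deviation
rel_dev = abs_dev / np.maximum(before, 1.0)    # relative deviation, guarded against /0

sparse = before <= 3                           # sparsely populated cells
print("mean abs. deviation (all cells):   ", abs_dev.mean())
print("mean abs. deviation (sparse cells):", abs_dev[sparse].mean())
print("mean relative deviation:           ", rel_dev.mean())
print("share of cells changed:            ", (abs_dev > 0).mean())
```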
28. Development of ground-motion prediction equations from a weighted mixture model with data-driven weights.
- Author
-
Specht, Sebastian von and Cotton, Fabrice
- Subjects
- FOCAL planes, EQUATIONS, SUBDUCTION, MIXTURES, SURFACE fault ruptures
- Abstract
The steady increase of ground-motion data allows new possibilities but also comes with new challenges in ground-motion modelling and the development of ground-motion prediction equations. One challenge is data selection - not all data can be processed equally - and we introduce an extension of the widely used mixed effect model, i.e. a model of fixed and random effects. The extension incorporates data selection by assigning weights per event, and model parameters are inferred by weighting the record data accordingly. This advancement allows for the possibility of data classification beyond the more classical, expert-driven, binary classification based e.g. on the depth of the event, distance to the trench, style-of-faulting and dip angle of the fault plane. We apply ACE - (A)ngular (C)lassification with (E)xpectation-maximization - a method to efficiently identify clusters of nodal planes from focal mechanisms to differentiate between interface and intraslab type events. The classification is continuous, i.e. no event belongs completely to one class, and classification uncertainty is then evaluated and taken into account in the ground-motion modelling. As an example, we developed a ground-motion prediction equation from a database of approximately 2400 records from 319 events in the Chilean subduction zone. Our ground-motion model with the data-driven and reproducible classification is comparable to the expert classification based model. Furthermore, the models show temporal variations of the between-event residuals before and after large earthquakes in the region. [ABSTRACT FROM AUTHOR]
- Published
- 2019
29. Assimilating Stress and Strain in an Energy-Based PSHA Workflow.
- Author
-
Ziebarth, Malte J., Heidbach, Oliver, Cotton, Fabrice, Anderson, John G., Weatherill, Graeme, and Specht, Sebastian von
- Published
- 2019
30. Improving Strain-Rate based Forecasts.
- Author
-
Bayona, José Antonio, Specht, Sebastian, Cotton, Fabrice, Hainzl, Sebastian, and Schorlemmer, Danijel
- Subjects
- FORECASTING
- Published
- 2018
31. Best Practices for Microbial Challenge In-Use Studies to Evaluate the Microbial Growth Potential of Parenteral Biological Products; Industry and Regulatory Considerations.
- Author
-
Zamiri C, Leiske DL, Hughes P, Kirwan JP, Der E, Cox E, Warburton R, Goss M, Weiser S, Perez-Brown J, Gopalrathnam G, Liu J, Mehta SB, Shereefa S, Specht S, Aedo SJ, Goldbach P, Jia F, Kuehnle B, Page S, Voeten L, Yi L, and Zhu C
- Subjects
- Humans, United States, Drug Industry standards, Drug Industry methods, Drug Industry legislation & jurisprudence, Drug Storage standards, Patient Safety, Biological Products standards, Biological Products administration & dosage, Drug Contamination prevention & control, United States Food and Drug Administration standards
- Abstract
Microbial challenge in-use studies are performed to evaluate the potential for microbial proliferation in preservative-free single-dose biological products after first puncture and potential accidental contamination during dose preparation (e.g., reconstitution or dilution) and storage. These studies, in addition to physicochemical in-use stability assessments, are used as part of product registration to define in-use hold times in Prescribing Information and in the pharmacy manual in the case of clinical products. There are no formal guidance documents describing regulator expectations on how to conduct microbial challenge in-use studies and interpret microbial data to assign in-use storage hold times. In lieu of guidance, US Food and Drug Administration (FDA) regulators have authored publications and presentations describing regulator expectations. Insufficient or unavailable microbial challenge data can result in shortened in-use hold times, thus microbial challenge data enables flexibility for health care providers (HCPs) and patients while ensuring patient safety. A cross-industry/FDA in-use microbial working group was formed through the Innovation & Quality (IQ) Consortium to gain alignment among industry practice and regulator expectations. The working group assessed regulatory guidance, current industry practice via a blinded survey of IQ Consortium member companies, and scientific rationale to align on recommendations for experimental design, execution of microbial challenge in-use studies, and a decision tree for microbial data interpretation to assign in-use hold times. Besides the study execution and data interpretation, additional considerations are discussed including the use of platform data for clinical stage products, closed system transfer devices (CSTDs), transport of dose solutions, long infusion times, and the use of USP <797> by HCPs for preparing sterile drugs for administration. The recommendations provided in this article will help streamline biological product development, ensure consistency on assignment of in-use hold times in biological product labels across industry, and provide maximum allowable flexibility to HCPs and patients while ensuring patient safety., (© PDA, Inc. 2024.)
- Published
- 2024
- Full Text
- View/download PDF
32. Tumor infiltrating T lymphocytes in colorectal cancer: Tumor-selective activation and cytotoxic activity in situ.
- Author
-
Koch M, Beckhove P, Op den Winkel J, Autenrieth D, Wagner P, Nummer D, Specht S, Antolovic D, Galindo L, Schmitz-Winnenthal FH, Schirrmacher V, Büchler MW, and Weitz J
- Subjects
- Adenocarcinoma metabolism, Adenocarcinoma pathology, Aged, Antigens, CD metabolism, Case-Control Studies, Colorectal Neoplasms metabolism, Colorectal Neoplasms pathology, Female, Humans, Male, Neoplasm Staging, Adenocarcinoma immunology, Colorectal Neoplasms immunology, Lymphocyte Activation physiology, Lymphocytes, Tumor-Infiltrating physiology
- Abstract
Objective: To examine whether tumor-selective infiltration, activation, and cytotoxic activity of tumor infiltrating T lymphocytes (TIL) can be demonstrated in situ in colorectal cancer samples., Summary Background Data: Recent studies indicated a correlation between the presence of TIL and an improved prognosis in colorectal cancer. However, tumor-selective activation and cytotoxic activity of CD8 TIL in situ in colorectal cancer patients have not yet been examined., Methods: Tumor samples from 49 patients and corresponding normal mucosa samples from 23 patients with colorectal cancer (UICC stages II-IV) were examined for TIL. Two-color fluorescence immunohistochemistry and multicolor flowcytometric (FACS) analysis were used for quantification of CD8 T cells and measurement of their activation status (CD69-expression) and cytotoxic activity (CD107a-expression) in situ. Presence of tumor antigen-reactive T cells in tumor, blood, and bone marrow was evaluated by IFN-gamma Elispot analysis., Results: While absolute numbers of CD8 T cells were similar, CD4 T helper cells were significantly increased in tumor tissue compared with normal mucosa. There was a significantly higher proportion of activated and cytotoxically active CD8 TIL in colorectal cancer compared with normal mucosa. Increased activation, cytotoxic activity, and functional reactivity of TIL were correlated with the presence of functional tumor antigen-reactive T cells in the blood and bone marrow. The proportion of activated TIL decreased significantly with higher tumor stage., Conclusions: Tumor-selective activation and cytotoxic activity of CD8 TIL and tumor-selective migration of CD4 T helper cells were demonstrated in colorectal cancer for the first time. Our data support the immunogenicity of colorectal cancer and suggest clinical significance of tumor-specific immune responses.
- Published
- 2006
- Full Text
- View/download PDF