Search Results (870 results)
2. Nowcasting Earthquakes With Stochastic Simulations: Information Entropy of Earthquake Catalogs.
- Author
- Rundle, John B., Baughman, Ian, and Zhang, Tianjian
- Subjects
- EARTHQUAKES, EARTHQUAKE aftershocks, ENTROPY (Information theory), MACHINE learning, EARTHQUAKE hazard analysis, RECEIVER operating characteristic curves, CATALOGS, ENTROPY
- Abstract
Earthquake nowcasting has been proposed as a means of tracking the change in large earthquake potential in a seismically active area. The method was developed using observable seismic data, in which probabilities of future large earthquakes can be computed using Receiver Operating Characteristic methods. Furthermore, analysis of the Shannon information content of the earthquake catalogs has been used to show that there is information contained in the catalogs, and that it can vary in time. An important question therefore remains: where does the information originate? In this paper, we examine this question using stochastic simulations of earthquake catalogs. Our catalog simulations are computed using an Earthquake Rescaled Aftershock Seismicity ("ERAS") stochastic model. This model is similar in many ways to other stochastic seismicity simulations, but has the advantage that it has only two free parameters: one for the aftershock (Omori‐Utsu) time decay, and one for the aftershock spatial migration away from the epicenter. Generating a simulation catalog and fitting the two parameters to an observed catalog, such as California's, takes only a few minutes of wall clock time. While clustering can arise from random Poisson statistics, we show that significant information in the simulation catalogs arises from the "non‐Poisson" power‐law aftershock clustering, implying that the practice of de‐clustering observed catalogs may remove information that would otherwise be useful in forecasting and nowcasting. We also show that the nowcasting method provides similar results with the ERAS model as it does with observed seismicity. Plain Language Summary: Earthquake nowcasting was proposed as a means of tracking the change in the potential for large earthquakes in a seismically active area, using the record of small earthquakes.
The method was developed using observed seismic data, in which probabilities of future large earthquakes can be computed using machine learning methods that were originally developed with the advent of radar in the 1940s. These methods are now being used in the development of machine learning and artificial intelligence models in a variety of applications. In recent times, methods to simulate earthquakes using the observed statistical laws of earthquake seismicity have been developed. One of the advantages of these stochastic models is that they can be used to analyze the various assumptions that are inherent in the analysis of seismic catalogs of earthquakes. In this paper, we analyze the importance of the space‐time clustering that is often observed in earthquake seismicity. We find that the clustering is the origin of information that makes the earthquake nowcasting methods possible. We also find that a common practice of "aftershock de‐clustering", often used in the analysis of these catalogs, removes information about future large earthquakes. Key Points:
- Earthquake nowcasting tracks the change in the potential for large earthquakes, using information contained in seismic catalogs
- We analyze the information contained in the space‐time clustering that is observed in earthquake seismicity
- We find that "aftershock de‐clustering" of catalogs removes information about future large earthquakes that the nowcasting method uses [ABSTRACT FROM AUTHOR]
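The Omori‐Utsu ingredient of a simulator like ERAS is straightforward to sketch: the aftershock rate decays as λ(t) = K/(c + t)^p, and event times can be drawn by inverting the integrated rate. This is a minimal illustration, not the ERAS code; the constants `K`, `c`, `p`, and the horizon `T` are placeholders, not the paper's fitted values.

```python
import math
import random

def omori_utsu_times(K=5.0, c=0.1, p=1.2, T=365.0, seed=42):
    """Draw aftershock times on [0, T] days from an Omori-Utsu rate
    lambda(t) = K / (c + t)**p with p > 1, by inverse-CDF sampling."""
    rng = random.Random(seed)
    # Integrated rate over [0, T] = expected number of aftershocks.
    total = K * (c**(1 - p) - (c + T)**(1 - p)) / (p - 1)
    n = _poisson(total, rng)
    times = []
    for _ in range(n):
        u = rng.random()
        # Invert the normalized cumulative intensity for one event time.
        inner = c**(1 - p) - u * (c**(1 - p) - (c + T)**(1 - p))
        times.append(inner**(1.0 / (1 - p)) - c)
    return sorted(times)

def _poisson(lam, rng):
    # Knuth's multiplication method; adequate for modest rates.
    limit, k, prod = math.exp(-lam), 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k
```

Because p > 1 puts most of the mass near t = 0, the sampled times cluster tightly after the mainshock, which is exactly the "non‐Poisson" power‐law clustering the abstract identifies as the information source.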
- Published
- 2024
- Full Text
- View/download PDF
3. Factors Related to Fecal Corticosterone Levels in California Spotted Owls: Implications for Assessing Chronic Stress.
- Author
- TEMPEL, DOUGLAS J. and GUTIÉRREZ, R. J.
- Subjects
- SPOTTED owl, CORTICOSTERONE, PHYSIOLOGICAL stress, ANIMAL welfare
- Abstract
The California Spotted Owl (Strix occidentalis occidentalis) is under consideration for federal protection and has stimulated ecosystem-level management efforts in Sierra Nevada national forests. Because some populations are declining, we used a noninvasive fecal sampling method to estimate stress hormone (i.e., corticosterone) levels within a local population from April to August 2001. Fecal corticosterone levels were similar to those recorded in a previous study of Northern Spotted Owls (S. o. caurina) (x̄ = 80.1 ng/g dry feces, SE = 75.8). We then used an information-theoretic approach to identify factors that influence fecal corticosterone levels in Spotted Owls. Our best overall model indicated that nonbreeding owls had higher fecal corticosterone levels than breeding owls early in the breeding season and lower levels later in the breeding season. We collected few samples from breeding owls early in the breeding season, however, which may have influenced the results. Management-related factors reflecting habitat condition and proximity to roads were not correlated with fecal corticosterone. However, factors such as field storage method and sample mass were correlated with the amount of measured fecal corticosterone and should be considered in future studies. Sample vials initially stored on ice had higher levels than those stored immediately in liquid N2 (βstorage = 0.269 ln[ng/g], 95% CI = 0.026 to 0.512). Hormone metabolites were extracted from extremely small samples (0.01 g) by slightly modifying the assay protocol, but the amount of corticosterone detected increased as the sample mass decreased (βmass = −6.248 ln[ng/g], 95% CI = −8.877 to −3.620). Corticosterone levels were significantly higher in 10 cecal samples collected simultaneously with fecal samples (paired difference = 74.7 ng/g, SE = 45.0, p = 0.001 for a paired t test), so care must be taken to avoid contaminating fecal samples with cecal material. Most of the variation was unexplained by our best model (R² = 0.24), and additional factors influencing fecal corticosterone levels need to be identified. Therefore, we recommend that well-designed experiments be conducted under controlled conditions to better determine the effect of factors such as sample handling, partial sampling, and diet on fecal corticosterone levels in owls and other birds. [ABSTRACT FROM AUTHOR]
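An "information-theoretic approach" of the kind described usually means ranking candidate regression models by the Akaike information criterion (AIC). A minimal sketch of that ranking step, with made-up model names and log-likelihood values:

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: lower is better."""
    return 2 * n_params - 2 * log_likelihood

def rank_models(candidates):
    """Rank (name, log-likelihood, n_params) triples by AIC.
    Returns (name, AIC, delta-AIC relative to the best model)."""
    scored = sorted((aic(ll, k), name) for name, ll, k in candidates)
    best = scored[0][0]
    return [(name, score, score - best) for score, name in scored]
```

The delta-AIC column is what such studies typically report: models within roughly 2 units of the best are considered competitive.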
- Published
- 2004
- Full Text
- View/download PDF
4. The good, the bad, and the future: Systematic review identifies best use of biomass to meet air quality and climate policies in California.
- Author
- Freer‐Smith, Peter, Bailey‐Bale, Jack H., Donnison, Caspar L., and Taylor, Gail
- Subjects
- FOREST biomass, GOVERNMENT policy on climate change, BIOMASS, BIOMASS production, GREENHOUSE gases, AIR quality
- Abstract
California has large and diverse biomass resources and provides a pertinent example of how biomass use is changing and needs to change in the face of climate mitigation policies. As in other areas of the world, California needs to optimize its use of biomass and waste to meet environmental and socioeconomic objectives. We used a systematic review to assess biomass use pathways in California and the associated impacts on climate and air quality. Biomass uses included the production of renewable fuels, electricity, biochar, compost, and other marketable products. For those biomass use pathways recently developed, information is available on the effects—usually beneficial—on greenhouse gas (GHG) emissions, and there is some, but less, published information on the effects on criteria pollutants. Our review identifies 34 biomass use pathways with beneficial impacts on either GHG or pollutant emissions, or both—the "good." These included combustion of forest biomass for power and conversion of livestock‐associated biomass to biogas by anaerobic digestion. The review identified 13 biomass use pathways with adverse impacts on GHG emissions, criteria pollutant emissions, or both—the "bad." Wildfires are one of eight pathways found to be bad for both climate and air quality, while only two biomass use pathways reduced GHG emissions relative to an identified counterfactual but had adverse air quality impacts. Issues of high interest for the "future" included land management to reduce fire risk, future policies for the dairy industries, and full life‐cycle analysis of biomass production and use. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
5. It's in the bag? The effect of plastic carryout bag bans on where and what people purchase to eat.
- Subjects
- PLASTIC bag laws, CONSUMER behavior, GROCERY shopping, BORDERLANDS, CONSUMERS, ENVIRONMENTAL health
- Abstract
This paper examines how banning the use of plastic carryout bags at grocery stores affects where and what people purchase to eat. Using quasi‐random variation in local bag ban adoption across California and two data sources (retail scanner data and consumer survey data), I show that banning plastic carryout bags shifted some food sales away from regulated grocery stores toward unregulated grocery stores and restaurants. Specifically, I find that bag bans cause a 1.8% decline in food‐at‐home sales and a 1.9 percentage point increase in consumers' food‐away‐from‐home expenditure share. The decline in food‐at‐home sales is larger in jurisdictions more likely to experience cross‐border shopping, whereas the increase in food‐away‐from‐home expenditures is larger farther from jurisdiction borders. Together these results suggest that a small share of consumers find a way to bypass the bag bans—either by cross‐border shopping if near a border or by shifting to restaurants if not near a border. Heterogeneity analyses reveal the policy effects are strongest for those with higher incomes, those under 65 years, and those with young children, suggesting both income effects and time constraints as mechanisms behind the behavioral change. By quantifying consumer avoidance behaviors, these results enable policymakers to more accurately measure the impacts of their regulations and to understand the potential trade‐offs between their environmental and public health objectives. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
6. Drought influences habitat associations and abundances of birds in California's Central Valley.
- Author
- Goldstein, Benjamin R., Furnas, Brett J., Calhoun, Kendall L., Larsen, Ashley E., Karp, Daniel S., and de Valpine, Perry
- Subjects
- DROUGHT management, DROUGHTS, HABITATS, WATER supply, AGRICULTURE, FARMS, ECOLOGICAL niche
- Abstract
Aim: As climate change increases the frequency and severity of droughts in many regions, conservation during drought is becoming a major challenge for ecologists. Droughts are multidimensional climate events whose impacts may be moderated by changes in temperature, water availability or food availability, or some combination of these. Simultaneously, other stressors such as extensive anthropogenic landscape modification may synergize with drought. Useful observational models for guiding conservation decision‐making during drought require multidimensional, dynamic representations to disentangle possible drought impacts, and consequently, they will require large, highly resolved data sets. In this paper, we develop a two‐stage predictive framework for assessing how drought impacts vary with species, habitats and climate pathways. Location: Central Valley, California, USA. Methods: We used a two‐stage counterfactual analysis combining predictive linear mixed models and N‐mixture models to characterize the multidimensional impacts of drought on 66 bird species. We analysed counts from the eBird participatory science data set between 2010 and 2019 and produced species‐ and habitat‐specific estimates of the impact of drought on relative abundance. Results: We found that while fewer than a quarter (16/66) of species experienced abundance declines during drought, nearly half of all species (27/66) changed their habitat associations during drought. Among species that shifted their habitat associations, the use of natural habitats declined during drought while use of developed habitat and perennial agricultural habitat increased. Main Conclusions: Our findings suggest that birds take advantage of agricultural and developed land with artificial irrigation and heat‐buffering microhabitat structure, such as in orchards or parks, to buffer drought impacts. 
A working lands approach that promotes biodiversity and mitigates stressors across a human‐induced water gradient will be critical for conserving birds during drought. [ABSTRACT FROM AUTHOR]
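The N‐mixture stage of an analysis like this estimates true abundance from repeated counts by marginalizing over the unobserved count N. A minimal single‐site likelihood, assuming N ~ Poisson(λ) with binomial detection probability p (both values illustrative, not from the study):

```python
import math

def nmixture_loglik(counts, lam, p, n_max=150):
    """Log-likelihood of repeated counts at one site under a basic
    N-mixture model: N ~ Poisson(lam), each count ~ Binomial(N, p).
    Marginalizes the latent abundance N up to n_max."""
    like = 0.0
    for N in range(max(counts), n_max + 1):
        # log Poisson(N; lam) via lgamma to avoid overflow
        log_term = -lam + N * math.log(lam) - math.lgamma(N + 1)
        for y in counts:
            # log Binomial(y; N, p)
            log_term += (math.lgamma(N + 1) - math.lgamma(y + 1)
                         - math.lgamma(N - y + 1)
                         + y * math.log(p) + (N - y) * math.log1p(-p))
        like += math.exp(log_term)
    return math.log(like)
```

A handy sanity check: with a single visit, the marginal count is Poisson(λp), the classic thinning identity, so the likelihood of one observation can be verified against the Poisson pmf directly.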
- Published
- 2024
- Full Text
- View/download PDF
7. Back Matter.
- Subjects
- ANNOUNCEMENTS, CONFERENCES & conventions, ELECTIONS, UNIVERSITIES & colleges
- Abstract
This article announces the 2009 Annual Meeting for the "Journal of Finance," to be held in San Francisco, California, January 3–5, 2009; reports the results of the 2008 election, whose winners include Jeremy Stein, Darrell Duffie, and John Cochrane; and notes that the AFA and the Department of Finance at Ohio State University have entered into a joint venture to maintain and enhance the finance faculty directory hosted on the OSU Web site.
- Published
- 2008
- Full Text
- View/download PDF
8. Optimizing Earthquake Nowcasting With Machine Learning: The Role of Strain Hardening in the Earthquake Cycle.
- Author
- Rundle, John B., Yazbeck, Joe, Donnellan, Andrea, Fox, Geoffrey, Ludwig, Lisa Grant, Heflin, Michael, and Crutchfield, James
- Subjects
- STRAIN hardening, SEISMIC waves, MACHINE learning, SUPERVISED learning, EARTHQUAKES, RECEIVER operating characteristic curves, TIME series analysis
- Abstract
Nowcasting is a term originating from economics, finance, and meteorology. It refers to the process of determining the uncertain state of the economy, markets or the weather at the current time by indirect means. In this paper, we describe a simple two‐parameter data analysis that reveals hidden order in otherwise seemingly chaotic earthquake seismicity. One of these parameters relates to a mechanism of seismic quiescence arising from the physics of strain‐hardening of the crust prior to major events. We observe an earthquake cycle associated with major earthquakes in California, similar to what has long been postulated. An estimate of the earthquake hazard revealed by this state variable time series can be optimized by the use of machine learning in the form of the Receiver Operating Characteristic (ROC) skill score. The ROC skill is used here as a loss function in a supervised learning mode. Our analysis is conducted in a region of 5° × 5° in latitude‐longitude centered on Los Angeles, a region that we used in previous papers to build similar time series using more involved methods (Rundle & Donnellan, 2020, https://doi.org/10.1029/2020EA001097; Rundle, Donnellan et al., 2021, https://doi.org/10.1029/2021EA001757; Rundle, Stein et al., 2021, https://doi.org/10.1088/1361-6633/abf893). Here we show that not only does the state variable time series have forecast skill, but the associated spatial probability densities have skill as well. In addition, use of the standard ROC and Precision (PPV) metrics allows probabilities of current earthquake hazard to be defined in a simple, straightforward, and rigorous way. Plain Language Summary: Earthquake nowcasting refers to the determination of hazard for major earthquakes at the present time, the recent past, and the near future. Nowcasting is an idea borrowed from economics, markets, and meteorology, where it has been frequently used.
In this paper, we show that there is order hidden within chaotic earthquake seismicity using a very simple transformation of the data. Small earthquakes appear to transition from unstable stick‐slip events that produce seismic waves, to stable sliding in which no seismic waves are produced. Our hypothesis is that this transition is due to a material phenomenon called strain hardening, which is frequently observed in laboratory rock mechanics experiments. The result is a state variable time series, computed over the last 51 years in California, that strongly resembles the long‐anticipated cycle of stress accumulation and release. Using supervised machine learning techniques, we can optimize the two‐parameter model. From that optimized model, we can rigorously calculate the probability of current hazard from major earthquakes. Extending these methods, we can also compute spatial hazard. The result is a new method for assessing earthquake hazard that may be useful for a variety of applications. Key Points:
- "Chaotic" seismicity contains hidden structure in the form of state variable time series
- Standard data science methods can be used to convert the time series to probabilities
- Both temporal and spatial probabilities can be computed [ABSTRACT FROM AUTHOR]
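The ROC skill used as a loss function can be sketched as the hit rate minus the false‐alarm rate, maximized over alarm thresholds during training. The toy forecast values below are illustrative and stand in for the paper's state variable time series.

```python
def true_skill(forecast, outcomes, thresh):
    """Peirce/ROC skill (hit rate minus false-alarm rate) for a
    threshold applied to a forecast time series."""
    tp = fp = fn = tn = 0
    for f, o in zip(forecast, outcomes):
        alarm = f >= thresh
        if alarm and o:
            tp += 1
        elif alarm:
            fp += 1
        elif o:
            fn += 1
        else:
            tn += 1
    hit = tp / (tp + fn) if tp + fn else 0.0
    false_alarm = fp / (fp + tn) if fp + tn else 0.0
    return hit - false_alarm

def best_threshold(forecast, outcomes):
    # Supervised "training": pick the threshold maximizing skill.
    return max(set(forecast), key=lambda t: true_skill(forecast, outcomes, t))
```

In the paper's setting the same skill score would be evaluated while searching over the two model parameters, not just the alarm threshold, but the loss function itself is this simple.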
- Published
- 2022
- Full Text
- View/download PDF
9. If you build it, they will come: Coastal amenities facilitate human engagement in marine protected areas.
- Author
- Free, Christopher M., Smith, Joshua G., Lopazanski, Cori J., Brun, Julien, Francis, Tessa B., Eurich, Jacob G., Claudet, Joachim, Dugan, Jenifer E., Gill, David A., Hamilton, Scott L., Kaschner, Kristin, Mouillot, David, Ziegler, Shelby L., Caselle, Jennifer E., and Nickols, Kerry J.
- Subjects
- MARINE parks & reserves, CHARISMA, FISH conservation, OUTREACH programs, TOURIST attractions
- Abstract
Calls for using marine protected areas (MPAs) to achieve goals for nature and people are increasing globally. While the conservation and fisheries impacts of MPAs have been comparatively well‐studied, impacts on other dimensions of human use have received less attention. Understanding how humans engage with MPAs and identifying traits of MPAs that promote engagement is critical to designing MPA networks that achieve multiple goals effectively, equitably and with minimal environmental impact.
In this paper, we characterize human engagement in California's MPA network, the world's largest MPA network scientifically designed to function as a coherent network (124 MPAs spanning 16% of state waters and 1300 km of coastline) and identify traits associated with higher human engagement. We assemble and compare diverse indicators of human engagement that capture recreational, educational and scientific activities across California's MPAs.
We find that human engagement is correlated with nearby population density and that site "charisma" can expand human engagement beyond what would be predicted based on population density alone. Charismatic MPAs tend to be located near tourist destinations, have long sandy beaches and be adjacent to state parks and associated amenities. In contrast, underutilized MPAs were often more remote and lacked both sandy beaches and parking lot access.
Synthesis and applications: These results suggest that achieving MPA goals associated with human engagement can be promoted by developing land‐based amenities that increase access to coastal MPAs or by locating new MPAs near existing amenities during the design phase. Alternatively, human engagement can be limited by locating MPAs in areas far from population centres, coastal amenities or sandy beaches. Furthermore, managers may want to prioritize monitoring, enforcement, education and outreach programmes in MPAs with traits that predict high human engagement.
Understanding the extent to which human engagement impacts the conservation performance of MPAs is a critical next step to designing MPAs that minimize tradeoffs among potentially competing objectives. Read the free Plain Language Summary for this article on the Journal blog. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
10. Nowcasting Earthquakes: Imaging the Earthquake Cycle in California With Machine Learning.
- Author
- Rundle, John B., Donnellan, Andrea, Fox, Geoffrey, Crutchfield, James P., and Granat, Robert
- Subjects
- SUPERVISED learning, MACHINE learning, RECEIVER operating characteristic curves, PRINCIPAL components analysis, SIGNAL detection, EARTHQUAKES
- Abstract
We propose a new machine learning‐based method for nowcasting earthquakes to image the time‐dependent earthquake cycle. The result is a timeseries that may correspond to the process of stress accumulation and release. The timeseries are constructed by using principal component analysis of regional seismicity. The patterns are found as eigenvectors of the cross‐correlation matrix of a collection of seismicity timeseries in a coarse grained regional spatial grid (pattern recognition via unsupervised machine learning). The eigenvalues of this matrix represent the relative importance of the various eigenpatterns. Using the eigenvectors and eigenvalues, we compute the weighted correlation timeseries of the regional seismicity. This timeseries has the property that the weighted correlation generally decreases prior to major earthquakes in the region, and increases suddenly just after a major earthquake occurs. As in a previous paper (Rundle & Donnellan, 2020, https://doi.org/10.1029/2020ea001097), we find that this method produces a nowcasting timeseries that resembles the hypothesized regional stress accumulation and release process characterizing the earthquake cycle. We then address the problem of whether the timeseries contain information regarding future large earthquakes. For this, we compute a receiver operating characteristic and determine the decision thresholds for several future time periods of interest (optimization via supervised machine learning). We find that signals can be detected that can be used to characterize the information content of the timeseries. These signals may be useful in assessing present and near‐future seismic hazards. Plain Language Summary: Major earthquakes on fault systems in a tectonically active region are thought to occur in approximately repetitive cycles as a result of the buildup and release of tectonic forces (stress). 
Nowcasting is a technique adopted from weather, finance, and other fields that uses readily observable proxy data to represent the unobservable stress accumulation process of interest. This paper presents a method that computes a timeseries representing the weighted correlation of small earthquake activity in the California region from 1950 to 2020. Prior to major magnitude M > 7 earthquakes, the timeseries trends toward lower values. Just after the earthquake occurs, the timeseries increases suddenly in association with the earthquake, before resuming its gradual trend toward lower values. Plotting the timeseries on an inverted scale, one sees a cyclic behavior that strongly resembles the hypothesized earthquake cycle. In principle, we can therefore use this timeseries for nowcasting, as a proxy for stress accumulation and release. Using methods of signal detection first developed for radar by the British in the 1940s, we find that the timeseries contain information about future large earthquakes that can be used for hazard assessment. Key Points:
- The current state of the earthquake cycle of tectonic stress accumulation and release is unobservable with existing methods
- We show that readily observable small earthquake correlations can be used to nowcast the current state of the earthquake cycle
- Machine learning techniques indicate that signals corresponding to future large earthquakes can be detected in a correlation time series [ABSTRACT FROM AUTHOR]
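A rough sketch of the eigenpattern construction, under simplifying assumptions (this is not the paper's exact recipe): correlate the coarse‐grained cell time series, eigendecompose, and weight the projections by eigenvalue importance. The toy Poisson data stand in for a real gridded catalog.

```python
import numpy as np

def eigenpattern_index(X):
    """Eigenvalue-weighted activity index from gridded seismicity.
    X: (n_times, n_cells) array of per-cell event counts per interval.
    A sketch of the PCA idea, not the paper's exact construction."""
    C = np.corrcoef(X, rowvar=False)        # cell-by-cell correlation matrix
    evals, evecs = np.linalg.eigh(C)        # eigenpatterns + their importance
    weights = evals / evals.sum()           # relative importance of patterns
    proj = (X - X.mean(axis=0)) @ evecs     # project activity onto patterns
    return (weights * proj**2).sum(axis=1)  # one state value per time step
```

The eigenvalues here play the role the abstract describes: they rank the eigenpatterns by how much of the regional correlation structure each one carries.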
- Published
- 2021
- Full Text
- View/download PDF
11. Residential Water Conservation and the Rebound Effect: A Temporal Decomposition and Investigation.
- Author
- Nemati, Mehdi, Tran, Dat, and Schwabe, Kurt
- Subjects
- WATER shortages, WATER conservation, DROUGHTS, WATER use, RESIDENTIAL mobility, ENERGY demand management, WATER utilities, WATER management, WATER consumption
- Abstract
Water conservation in California has been a major subject of concern for agencies in their efforts to satisfy their residential demand while coping with frequent shortfalls, especially in periods of drought. During the 2012–2016 severe drought in California, the state enacted a conservation mandate that imposed specific conservation targets of 4% to 36% for water utilities. While the utilities met those targets in 2015, water use, on average, has slowly crept up or rebounded subsequently, although not to pre‐drought levels. Understanding the manner and degree to which water use rebounds can be critically important for water utilities in their planning and investment decisions. Using a unique panel dataset on single‐family residential water use by nearly 20,000 customers of a Northern California water agency from 2013 to 2019, this paper explores the magnitude and character of the rebound effect that occurred after the cessation of a statewide conservation mandate that was imposed on water use in response to a severe drought enveloping California from 2014 through 2016. Our results suggest the presence of a significant rebound in water use—of approximately 9% on average—after the conservation mandate ended. Yet, and novel to our research, we find significant heterogeneity in the rebound effects across seasons and water users, with a greater rebound in the warm season months relative to other months and among lower water use households relative to higher water use households. Our results also suggest a significant shift in water use to earlier periods of the day once the mandate was lifted. Understanding the magnitude and variation in rebound effects both temporally and across different types of water users can be useful to water agencies in their efforts to make informed decisions surrounding investments, management, and messaging in response to drought, conservation, and water scarcity.
Plain Language Summary: Given the prominence of demand‐side management in the water utilities toolbox for addressing increasing water shortages driven by drought and climate change, understanding how water use changes during a drought along with how and to what extent it increases, or "rebounds," afterward can be useful if not necessary in making informed policy decisions and cost‐effective investments. This research develops a unique data set consisting of household‐level daily and hourly water use by nearly 19,500 residential accounts from a California water utility from 2013 through 2019. It explores how water use changed during and post‐drought relative to pre‐drought levels. During this period, California entered a significant drought. Its governor requested a voluntary cutback, imposed a conservation mandate, and implemented a self‐certification requirement lasting until the drought eased. We find that residential water use decreased by 26% during the conservation mandate relative to pre‐drought levels but rebounded by approximately 9% post‐mandate. The "rebound" effect varied across the season (greatest in summer months) and water user type (lower‐end water users showed a larger percentage rebound than higher‐end water users). Our results suggest that due to the mandate, the peak hour water consumption shifted to the earlier hours of the day. Key Points:
- Our results suggest that the 2015 California water mandate resulted in temporary and permanent reductions in water use
- Rebound effects are more prevalent in the warm season than in other months and for lower‐end water users than high‐end water users
- Our results also show that, due to the mandate, the peak hour water consumption shifted to the earlier hours of the day [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
12. The Role of Anthropogenic Forcing in Western United States Hydroclimate Extremes.
- Author
- Zhang, Wei and Gillies, Robert
- Subjects
- HYDROLOGIC cycle, ATMOSPHERIC models, CLIMATE change
- Abstract
Despite lower‐than‐average total precipitation in the western states of the U.S., the 2021 "precipitation roller coaster," defined as large precipitation swings, has pointed to a strong hydroclimatic intensity (HYINT). Here we examine the 2021 HYINT using an index—a product of the average precipitation intensity (INT) and dry spell length (DSL). HYINT exhibited an extremely high value in the western U.S. in 2021. INT and DSL contribute differently to the 2021 HYINT, with large spatial variability. Overall, the 2021 extreme HYINT in central California and Utah is tied more to large INT than to DSL. Meanwhile, the historical trends in INT and DSL may have contributed to the extreme 2021 HYINT event. The fraction of attributable risk framework reveals that the 2021 extreme HYINT is more likely to occur with anthropogenic forcing (e.g., 7.3 times more likely for HYINT exceeding 1.3) than with natural forcing alone. Plain Language Summary: The western U.S. is a hotspot for studying climate change impacts on the hydrological cycle. Despite lower‐than‐average total precipitation in 2021, the contrasting dryness and wetness in the western U.S. has been widely reported as a "precipitation roller coaster." In this paper we quantified the "precipitation roller coaster" using an index (hydroclimatic intensity [HYINT])—a product of average precipitation intensity during wet days and dry spell length (DSL). The study found that the 2021 extreme HYINT event was largely attributable to the combined impacts of precipitation intensity and DSL in California and Utah, with precipitation intensity playing a more important role. In contrast, the 2021 precipitation event in other western states exhibited divergent contributions from precipitation intensity and DSL. The southwestern U.S. has been identified as a hotspot for increasing HYINT, which is tied more to the increasing DSL than the precipitation intensity.
The trends in DSL and precipitation intensity may have played a key role in driving the 2021 extreme HYINT event. Climate model experiments with and without anthropogenic forcing show that an extreme HYINT event in the western U.S. is more likely to occur with anthropogenic forcing. Key Points:
- Hydroclimatic intensity (HYINT) exhibited extremely high values in parts of the western U.S. in 2021, mainly caused by average precipitation intensity
- HYINT shows a significant rising trend in most of the southwestern U.S., mainly tied to a rising dry spell length trend
- The extreme HYINT event is more likely to occur under anthropogenic forcing than natural forcing alone [ABSTRACT FROM AUTHOR]
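Both building blocks are simple to state: HYINT multiplies mean wet‐day intensity by mean dry‐spell length (shown unnormalized here; published versions of the index normalize against a reference period), and the fraction of attributable risk compares event probabilities with and without anthropogenic forcing. The wet‐day threshold and sample values below are illustrative.

```python
def hyint(daily_precip, wet_thresh=1.0):
    """Hydroclimatic intensity: mean wet-day precipitation intensity
    times mean dry-spell length (unnormalized sketch of the index)."""
    wet = [p for p in daily_precip if p >= wet_thresh]
    intensity = sum(wet) / len(wet) if wet else 0.0
    # Collect lengths of consecutive runs of dry days.
    spells, run = [], 0
    for p in daily_precip:
        if p < wet_thresh:
            run += 1
        elif run:
            spells.append(run)
            run = 0
    if run:
        spells.append(run)
    dsl = sum(spells) / len(spells) if spells else 0.0
    return intensity * dsl

def attributable_risk(p_anthro, p_natural):
    """Fraction of attributable risk: FAR = 1 - p_nat / p_anthro."""
    return 1.0 - p_natural / p_anthro
```

With an event 7.3 times more likely under anthropogenic forcing, FAR = 1 − 1/7.3 ≈ 0.86, consistent with the risk ratio quoted in the abstract.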
- Published
- 2022
- Full Text
- View/download PDF
13. Announcement.
- Author
- Pesaran, M. Hashem
- Subjects
- ECONOMETRICS, AWARDS, UNIVERSITY faculty
- Abstract
Announces Professors Daniel McFadden and Kenneth Train of the University of California as the winners of the sixth Richard Stone Prize in Applied Econometrics for the published paper 'Mixed MNL Models for Discrete Response'; also notes the prize criteria and previous winners.
- Published
- 2002
14. Making salient ethics arguments about vaccine mandates: A California case study.
- Author
- Navin, Mark C. and Attwell, Katie
- Subjects
- VACCINATION policies, HEALTH policy, IMMUNIZATION, HUMAN rights, INFORMED consent (Medical law), HARM reduction
- Abstract
Vaccine mandates can take many forms, and different kinds of mandates can implicate an array of values in diverse ways. It follows that good ethics arguments about particular vaccine mandates will attend to the details of individual policies. Furthermore, attention to particular mandate policies—and to attributes of the communities they aim to govern—can also illuminate which ethics arguments may be more salient in particular contexts. If ethicists want their arguments to make a difference in policy, they should attend to these kinds of empirical considerations. This paper focuses on the most common and contentious vaccine mandate reform in the contemporary United States: the elimination of nonmedical exemptions (NMEs) to school and daycare vaccine mandates. It highlights, in particular, debates about California's Senate Bill 277 (SB277), which was the first successful recent effort in the United States to eliminate NMEs. We use media, secondary sources, and original interviews with policymakers and activists to identify and evaluate three ethics arguments offered by critics of SB277: parental freedom, informed consent, and children's rights to care and education. We then turn to one ethics argument often offered by advocates of SB277: harm prevention. We note, however, that three arguments for mandates that are common in the immunization ethics literature—fairness/free‐riding, children's rights to vaccination, and utilitarianism—did not play a role in debates about SB277. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
15. Conjunctive Water Management for Agriculture With Groundwater Salinity.
- Author
-
Yao, Yiqing, Lund, Jay R., and Harter, Thomas
- Subjects
ARTIFICIAL groundwater recharge ,GROUNDWATER management ,WATER in agriculture ,GROUNDWATER recharge ,WATER management ,AGRICULTURAL water supply ,SALINITY - Abstract
Salt accumulations in aquifers significantly affect and transform the conjunctive use of surface water and groundwater supporting irrigated agriculture. Salt accumulates in aquifers under many semi‐arid irrigated lands where pumping has lowered water levels enough to prevent drainage of saline groundwater from the basin. This paper provides new insights into optimal conjunctive management of groundwater pumping, recharge, surface water, and cropping patterns with groundwater salinity and hydrologic variability in an irrigated semi‐arid region, such as California's western San Joaquin Valley, where groundwater salinity reduces agricultural crop yields and revenues. A two‐stage stochastic quadratic model explores this problem to prescribe economically optimal crop mix and conjunctive water operation policies over a 10‐year period with probabilistic annual surface water availability, considering groundwater salinity's harm to crop yields. At low groundwater salinity, agricultural conjunctive use usually pumps most groundwater in drier years, supplied by additional recharge in wetter years. In contrast, at higher groundwater salinity, optimal conjunctive use pumps less in drier years while pumping more in wetter years, when more surface water allows more dilution of saltier groundwater. Reduced pumping in drier years substantially reduces a region's ability to support higher‐value perennial crops and reduces or eliminates lower‐value annual crops in dry years. Artificial recharge with fresh surface water in wetter years can have economic value from slowing groundwater salination, which allows more groundwater use in drier years. 
Key Points: A two‐stage stochastic quadratic model estimates the economic impact of groundwater salinity on agriculture, especially with perennial crops. Economically optimized groundwater management fundamentally changes with groundwater salination. Artificial recharge for aquifers with severe salt accumulation can reduce both loss of agricultural production and groundwater salinity. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
16. Predicting the distribution of non-vagile taxa: a response to.
- Author
-
Stockman, Amy K., Beamer, David A., and Bond, Jason E.
- Subjects
SPIDERS ,ARACHNIDA ,ECOLOGICAL niche ,BIOTIC communities ,ECOLOGY ,SPECIES ,INTEGRATED software ,NATURAL history - Abstract
This paper addresses the issues raised by McNyset and Blackburn (2006) in their response to Stockman et al. (2006). Re-evaluation of our published GARP analyses by McNyset and Blackburn showed that a much improved ecological niche model is obtained for predicting the distribution of the trapdoor spider genus Promyrmekiaphila in central/northern California. The improved niche model results in a substantially reduced omission error rate and a predictive model comparable to models obtained using other methods (GLM and BIOCLIM). However, the improved GARP models have a high commission error rate (> 0.75); consequently, the inferences regarding difficulties in modelling non-vagile taxa drawn by Stockman et al. remain valid. Finally, we discuss other relatively minor criticisms of our study raised by McNyset and Blackburn and issues related to the peer review of our original paper. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
17. Data‐driven spatio‐temporal analysis of wildfire risk to power systems operation.
- Author
-
Umunnakwe, Amarachi, Parvania, Masood, Nguyen, Hieu, Horel, John D., and Davis, Katherine R.
- Subjects
WILDFIRE prevention ,WILDFIRE risk ,RISK assessment ,ARTIFICIAL neural networks ,ELECTRIC power distribution grids ,NATURAL disasters - Abstract
Wildfires are natural or man‐made disasters that continuously threaten portions of the transmission and distribution grid, and thus the stability of the electric grid. This paper presents a two‐stage framework for assessing power system‐wildfire risk using a data‐driven wildfire prediction model. The first stage of the framework estimates the spatio‐temporal probability of potential wildfire ignition and propagation using a deep neural network in combination with the wildfire physical spread model. Analysis reveals similar spatial and temporal patterns between the model‐predicted wildfire ignition potential and actual wildfire ignition. Motivated by these observations, the second stage assesses the wildfire risk in the power grid operation in terms of potential loss of load by de‐energisation, through combining geospatial information system data of the power grid topology and the stochastic spatio‐temporal wildfire model developed in the first stage. The electric power utility applications introduced by the proposed framework are twofold: 1) a spatio‐temporal risk model for proactive de‐energisation against potential power system failure‐induced wildfire, and 2) a spatio‐temporal spreading model for optimal grid operations against exogenous wildfire. The proposed model, based on a real‐world dataset, is demonstrated on the IEEE 24‐bus test system mapped to a study area in Northern California; the results show that the model achieves the best performance in potential wildfire ignition detection (AUC of 0.995) compared to other baselines and demonstrate the risk‐aware operation of the power system enabled by the proposed framework. [ABSTRACT FROM AUTHOR]
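The headline detection metric in this abstract, area under the ROC curve (AUC), equals the probability that a randomly chosen positive example outranks a randomly chosen negative one. A minimal rank-based sketch of that computation (illustrative only, not the paper's code):

```python
def auc(labels, scores):
    """Rank-based AUC: fraction of positive/negative pairs in which the
    positive example receives the higher score (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: two positives, two negatives; 3 of 4 pairs rank correctly.
print(auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # → 0.75
```

An AUC of 0.995, as reported, means virtually every ignition case was scored above virtually every non-ignition case.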
- Published
- 2022
- Full Text
- View/download PDF
18. DISCUSSION.
- Subjects
WATER pollution ,RUNOFF ,PAVEMENTS ,POLLUTANTS ,CHLORINE ,CHLORIDES - Abstract
The article focuses on the water environment research paper titled "Characteristics of Highway Stormwater Runoff," by Y. Han, S. Lau, M. Kayhanian, and M. K. Stenstrom. The paper examined a large amount of highway stormwater runoff from different traffic sites in Los Angeles, California. It mentioned that pavement degradation is one of the main causes of pollutants in highway stormwater runoff, and that pavement type affects the stormwater's pH, alkalinity, and organic and inorganic solids. It also showed the difference between chlorine and chloride through water analysis.
- Published
- 2008
- Full Text
- View/download PDF
19. News Editorial.
- Subjects
MEETINGS ,CYBERNETICS ,NONPROFIT organizations ,SOCIETIES ,CONFERENCES & conventions - Abstract
The article offers information on several conferences to be held in July, 2012 including the conference of American Society for Cybernetics and the Bateson Idea Group to be held in California, conference of System Dynamics Society to be held in Switzerland and conference on Management and Service Science to be held in Shanghai, China.
- Published
- 2012
- Full Text
- View/download PDF
20. Estimating the Economic Value of Interannual Reservoir Storage in Water Resource Systems.
- Author
-
Khadem, M., Rougé, C., Harou, J. J., Hansen, K. M., Medellin‐Azuara, J., and Lund, J. R.
- Subjects
WATER supply ,RESERVOIRS - Abstract
Reservoir operators face pressures on timing releases of water. Releasing too much water immediately can threaten future supplies and costs, but not releasing enough creates immediate economic hardship downstream. This paper examines how the economic valuation of end‐of‐year carryover storage can lead to optimal amounts of carryover storage in complex large water resource systems. Economic carryover storage value functions (COSVFs) are developed to represent the value of storage in the face of interannual inflow uncertainty and variability within water resource optimization models. The approach divides a perfect foresight optimization problem into year‐long (limited foresight) subproblems solved sequentially by a within‐year optimization engine to find optimal short‐term operations. The final storage state from the previous year provides the initial condition to each annual problem, and end‐of‐year COSVFs are the final condition. Here the COSVF parameters that maximize the interannual benefits from river basin operations are found by evolutionary search. This generalized approach can handle nonconvexity in large‐scale water resources systems. The approach is illustrated with a regional model of the California Central Valley water system including 30 reservoirs, 22 aquifers, and 51 urban and agricultural demand sites. Head‐dependent pumping costs make the optimization problem nonconvex. Optimized interannual reservoir operation improves over more cautious operation in the historical approximation, reducing the average annual scarcity volume and costs by 80% and 98%, respectively, with more realistic representation of hydrologic foresight for California's Mediterranean climate. The economic valuation of storage helps inform water storage decisions. 
Key Points: This paper introduces an approach to estimating the economic value of interannual reservoir storage. The approach uses an evolutionary search algorithm linked to a hydroeconomic optimization model. A regional model of the California Central Valley water resource system illustrates the approach. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
21. Piecemeal Farm Regulation and the U.S. Commerce Clause.
- Author
-
Carter, Colin A., Schaefer, K. Aleks, and Scheitrum, Daniel
- Subjects
ANIMAL housing ,AGRICULTURAL egg production ,DOMESTIC animals ,POULTRY farms ,PRICE increases ,FARMHOUSES ,PRICE regulation ,INVOICES - Abstract
Since January 2015, California has required that all shell eggs consumed in the state be produced cage free or by hens housed in enlarged cages defined under Assembly Bill 1437. This paper assesses the effects of California farm animal housing restrictions on egg prices and production practices inside and outside California, and on the volume of interstate trade. We find that the California regulation generated short‐ and long‐run egg price increases across the U.S. It has also bifurcated production methods outside California yielding more concentrated interstate trade. The largest share of the associated private costs was borne by out‐of‐state consumers. The balance between a state's power to regulate food production within its borders and the impacts on out‐of‐state producers and consumers has potential legal implications under the dormant Commerce Clause of the U.S. Constitution. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
22. Multi‐scale demographic analysis reveals range contraction via pseudo‐source and sink population structure.
- Author
-
Robinson, O. J., Ruiz‐Gutierrez, V., Meese, R. J., Graves, E. E., Holyoak, M., Wilson, C. R., Wyckoff, A. C., Merriell, B. D., Snyder, C., and Cooch, E. G.
- Subjects
CLIMATE change ,SURVIVAL rate ,BLACKBIRDS ,DEMOGRAPHIC change - Abstract
Naturally occurring populations of most species are distributed non‐uniformly across their ranges. Observed changes in range‐wide population size are attributed to local‐scale processes such as fecundity and survival and to regional scale demographic processes such as immigration. It is often infeasible to study a species across its entire range, and we frequently make inferences on populations as a whole based on the demographic rates of a more restricted area. Extrapolating inferences about demographic processes from study areas to the entire species' range can lead to erroneous estimations, particularly when permanent emigrants contribute significantly to individual population processes. In this paper, we evaluated demographic processes and population trends at multiple scales for tricolored blackbirds (Agelaius tricolor) in California using site‐specific banding data and range‐wide citizen‐science data. First, we found that a previous estimate of statewide decline of 34% is largely driven by an estimated decline of 51.7% for the southern population. Second, we found evidence of a pseudo‐source and sink system, with the northern region acting as a sink for individuals moving from the declining southern region. The southern region is a "pseudo‐source" since it has lower rates of adult survival and an annual growth rate of r = −0.099, while still acting as a source of immigrants for the northern region. In turn, the north fits the traditional definition of a sink by an annual growth rate near zero, in addition to declining at a rate of 2.5% even though it is estimated to receive immigrants from the south at rates ranging from 8.3% to 13.2% per year. Our results suggest that the loss of wetland habitats in Southern California, coupled with increasing severity of droughts driven by changing climatic conditions, has created this pseudo‐source and sink system. 
Long term, tricolored blackbirds are likely to experience range contraction in the south, and the northern region is likely to undergo declines due to a loss of immigrants from the south. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
23. Using CollaboRATE, a brief patient-reported measure of shared decision making: Results from three clinical settings in the United States.
- Author
-
Forcino, Rachel C., Barr, Paul J., O'Malley, A. James, Arend, Roger, Castaldo, Molly G., Ozanne, Elissa M., Percac‐Lima, Sanja, Stults, Cheryl D., Tai‐Seale, Ming, Thompson, Rachel, and Elwyn, Glyn
- Subjects
CONFIDENCE intervals ,MEDICAL cooperation ,PATIENT psychology ,SENSORY perception ,PRIMARY health care ,QUESTIONNAIRES ,RESEARCH ,RESEARCH funding ,SURVEYS ,LOGISTIC regression analysis ,RESEARCH methodology evaluation ,DESCRIPTIVE statistics ,ODDS ratio ,FIELD notes (Science) - Abstract
Introduction CollaboRATE is a brief patient survey focused on shared decision making. This paper aims to (i) provide insight on facilitators and challenges to implementing a real-time patient survey and (ii) evaluate CollaboRATE scores and response rates across multiple clinical settings with varied patient populations. Method All adult patients at three United States primary care practices were eligible to complete CollaboRATE post-visit. To inform key learnings, we aggregated all mentions of unanticipated decisions, problems and administration errors from field notes and email communications. Mixed-effects logistic regression evaluated the impact of site, clinician, patient age and patient gender on the CollaboRATE score. Results While CollaboRATE score increased only slightly with increasing patient age (OR 1.018, 95% CI 1.014-1.021), female patient gender was associated with significantly higher CollaboRATE scores (OR 1.224, 95% CI 1.073-1.397). Clinician also predicts CollaboRATE score (random effect variance 0.146). Site-specific factors such as clinical workflow and checkout procedures play a key role in successful in-clinic implementation and are significantly related to CollaboRATE scores, with Site 3 scoring significantly higher than Site 1 (OR 1.759, 95% CI 1.216 to 2.545) or Site 2 (z=−2.71, 95% CI −1.114 to −0.178). Discussion This study demonstrates that CollaboRATE can be used in diverse primary care settings. A clinic's workflow plays a crucial role in implementation. Patient experience measurement risks becoming a burden to both patients and administrators. Episodic use of short measurement tools could reduce this burden. [ABSTRACT FROM AUTHOR]
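The odds ratios quoted in this abstract come from exponentiating logistic-regression coefficients, with Wald confidence limits exponentiated on the same scale. A small worked check (the helper function is hypothetical, not from the paper), back-solving the reported per-year age effect of OR 1.018 (95% CI 1.014-1.021):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Turn a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald 95% confidence interval."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Back out the coefficient and standard error implied by the reported
# age effect (OR and CI values taken from the abstract).
beta = math.log(1.018)
se = (math.log(1.021) - math.log(1.014)) / (2 * 1.96)
or_, lo, hi = odds_ratio_ci(beta, se)  # ≈ (1.018, 1.0145, 1.0215)
```

The recovered interval matches the published one to rounding, which is the expected round trip for a Wald CI.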
- Published
- 2018
- Full Text
- View/download PDF
24. A Later Onset of the Rainy Season in California.
- Author
-
Luković, Jelena, Chiang, John C. H., Blagojević, Dragan, and Sekulić, Aleksandar
- Subjects
ATMOSPHERIC circulation ,WEATHER ,ATMOSPHERIC models ,CLIMATE change ,JET streams - Abstract
Californian hydroclimate is strongly seasonal and prone to severe water shortages. Recent changes in climate trends have induced shifts in seasonality, thus exacerbating droughts, wildfires, and adverse water shortage effects on the environment and economy. Previous studies have examined the timing of the seasonal cycle shifts mainly as temperature driven earlier onset of the spring season. In this paper, we address quantitative changes in the onset, amounts, and termination of the precipitation season over the past 6 decades, as well as the large‐scale atmospheric circulation underpinning the seasonal cycle changes. We discover that the onset of the rainy season has been progressively delayed since the 1960s, and as a result the precipitation season has become shorter and sharper in California. The progressively later onset of the rainy season is shown to be related to the summer circulation pattern extending into autumn across the North Pacific, in particular, a delay in the strengthening of the Aleutian Low and later southward displacement of the North Pacific westerlies. Plain Language Summary: The rainy season over California is projected to show a distinct sharpening of the mean seasonal cycle, with winter precipitation increasing, and both autumn and spring precipitation decreasing. Our analysis of the past 6 decades of data for California suggests autumn decrease is already underway. A delayed start of the rainy season of 27 days since the 1960s can exacerbate seasonal droughts and prolong the wildfire season. This delay occurs due to a number of conditions that control precipitation: the summer circulation pattern has been extending throughout November across the North Pacific, and the wintertime strengthening of the Aleutian Low is delayed. Accordingly, the southward migration of the North Pacific jet stream and extratropical storm tracks, which marks the start of the California rainy season, is delayed. 
More work, using climate models, will be needed to provide a better understanding of atmospheric conditions across North America and the North Pacific. However, our findings provide observational evidence for the projected rainfall change over California and inform ongoing discussion about the drying/wetting tendencies of the rainy season in California. Key Points: Rainfall data over the last 6 decades suggest a progressively delayed onset to the rainy season over California. A corresponding delay occurs in the transition from summer to wintertime circulation over the North Pacific. Observed autumn trends appear consistent with projected future shortening of the California rainy season. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
25. Annual biomass spatial data for southern California (2001–2021): Above‐ and belowground, standing dead, and litter.
- Author
-
Schrader‐Patton, Charlie C., Underwood, Emma C., and Sorenson, Quinn M.
- Subjects
MONTE Carlo method ,BIOMASS ,GEOGRAPHIC information systems ,ECOSYSTEM management ,TUNDRAS ,DATA libraries ,FOREST fire management ,FOREST restoration - Abstract
Biomass estimates for shrub‐dominated ecosystems in southern California have been generated at national and statewide extents. However, existing data tend to underestimate biomass in shrub vegetation types, are limited to one point in time, or estimate aboveground live biomass only. In this study, we extended our previously developed estimates of aboveground live biomass (AGLBM), based on the empirical relationship of plot‐based field biomass measurements to Landsat normalized difference vegetation index (NDVI) and multiple environmental factors, to include other vegetative pools of biomass. AGLBM estimates were made by extracting plot values from elevation, solar radiation, aspect, slope, soil type, landform, climatic water deficit, evapotranspiration, and precipitation rasters and then using a random forest model to estimate per‐pixel AGLBM across our southern California study area. By substituting year‐specific Landsat NDVI and precipitation data, we created a stack of annual AGLBM raster layers for each year from 2001 to 2021. Using these AGLBM data as a foundation, we developed decision rules to estimate belowground, standing dead, and litter biomass pools. These rules were based on relationships between AGLBM and the biomass of the other vegetative pools derived primarily from peer‐reviewed literature and an existing spatial data set. For shrub vegetation types (our primary focus), rules were based on literature estimates by the postfire regeneration strategy of each species (obligate seeder, facultative seeder, obligate resprouter). Similarly, for nonshrub vegetation types (grasslands, woodlands) we used literature and existing spatial data sets specific to each vegetation type to define rules to estimate the other pools from AGLBM. 
Using a Python language script that accessed Environmental Systems Research Institute raster geographic information system utilities, we applied decision rules to create raster layers for each of the non‐AGLBM pools for the years 2001–2021. The resulting spatial data archive contains a zipped file for each year; each of these files contains four 32‐bit tiff files for each of the four biomass pools (AGLBM, standing dead, litter, and belowground). The biomass units are grams per square meter (g/m2). We estimated the uncertainty of our biomass data by conducting a Monte Carlo analysis of the inputs used to generate the data. Our Monte Carlo technique used randomly generated values for each of the literature‐based and spatial inputs based on their expected distribution. We conducted 200 Monte Carlo iterations, which produced percentage uncertainty values for each of the biomass pools. Results showed, using 2010 as an example, mean biomass for the study area and percentage uncertainty for each of the pools as follows: AGLBM (905.4 g/m2, 14.4%); standing dead (644.9 g/m2, 1.3%); litter (731.2 g/m2, 1.2%); and belowground (776.2 g/m2, 17.2%). Because our methods are consistently applied across each year, the data produced can be used to inform changes in biomass pools due to disturbance and subsequent recovery. As such, these data provide an important contribution to supporting the management of shrub‐dominated ecosystems for monitoring trends in carbon storage and assessing the impacts of wildfire and management activities, such as fuel management and restoration. There are no copyright restrictions on the data set; please cite this paper and the data package when using these data. [ABSTRACT FROM AUTHOR]
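The Monte Carlo step described above, drawing each uncertain input from its expected distribution, re-running the estimate 200 times, and reporting the percentage spread per pool, can be sketched in a few lines. Everything here is illustrative: the normal distributions, the standard deviations, and the root:shoot ratio are assumptions for the sketch; only the 2010 AGLBM mean (905.4 g/m2) is taken from the abstract.

```python
import random
import statistics

def monte_carlo_uncertainty(estimate_fn, input_dists, n_iter=200, seed=42):
    """Propagate input uncertainty through a biomass estimate.

    input_dists maps input name -> (mean, sd); each iteration draws every
    input from a normal distribution (an assumption) and re-evaluates
    estimate_fn. Returns the output mean and its percentage uncertainty
    (coefficient of variation, in percent), mirroring the 200-iteration
    analysis described above.
    """
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_iter):
        sample = {name: rng.gauss(mu, sd) for name, (mu, sd) in input_dists.items()}
        outputs.append(estimate_fn(sample))
    mean = statistics.fmean(outputs)
    return mean, 100 * statistics.stdev(outputs) / mean

# Hypothetical example: belowground biomass modeled as AGLBM times an
# uncertain literature-based root:shoot ratio (both sd values invented).
mean, pct = monte_carlo_uncertainty(
    lambda s: s["aglbm"] * s["root_shoot"],
    {"aglbm": (905.4, 130.0), "root_shoot": (0.85, 0.10)},
)
```

In the real analysis each biomass pool has its own decision rules and input distributions; the pattern of sample, re-estimate, and summarize is the same.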
- Published
- 2023
- Full Text
- View/download PDF
26. Probabilistic Categorical Groundwater Salinity Mapping From Airborne Electromagnetic Data Adjacent to California's Lost Hills and Belridge Oil Fields.
- Author
-
Ball, L. B., Davis, T. A., Minsley, B. J., Gillespie, J. M., and Landon, M. K.
- Subjects
OIL fields ,SALINITY ,FRESH water ,WATER diversion ,WATER quality ,GROUNDWATER ,SALTWATER encroachment - Abstract
Growing water stress has led to emerging interest in protecting fresh and brackish groundwater as a potential supplement to water supplies and raised questions about factors that could affect the future quality of fresh and brackish aquifers. Limited well infrastructure, particularly in regions where elevated salinity has led to limited historical groundwater development, hinders traditional mapping of salinity distributions through groundwater sampling. This paper presents a quantitative salinity mapping approach of the upper 300 m using high‐resolution, regionally comprehensive resistivity models derived from Bayesian inversion of an airborne electromagnetic survey adjacent to the Lost Hills and Belridge oil fields in the southwestern San Joaquin Valley of California. Using local water quality observations as an interpretational foundation, a probabilistic approach yields maps of fresh, saline, and brackish groundwater while quantifying joint uncertainty inherited from the geophysical data and interpretational relations. Saline and fresh regions are mapped with relatively high confidence in many locations, while areas of lower confidence, particularly at depth, can be mapped as their most probable salinity category while reflecting the relative uncertainty in the interpretation. These maps identify a stratified salinity structure, where saline water commonly occurs in the surficial aquifer overlying fresher groundwater in the Tulare aquifer, separated by regional confining clay layers. Downgradient of unlined surface water diversions, recharge of imported surface water results in relatively fresh groundwater throughout the depth of investigation. Plain Language Summary: Salinity, a measure of salt concentration, can impact the potential use of groundwater. Salinity can be difficult to map using wells alone, especially where groundwater development has been limited in the past. 
Dissolved salts are excellent conductors of electricity, and resistivity measurements are commonly used to estimate salinity where wells are not available. Uncertainty in estimating salinity from resistivity data can be introduced into the interpretation by two sources: (1) how well we can estimate salinity from resistivity and (2) how accurately we can measure resistivity. We present a method for mapping salinity that accounts for both sources of uncertainty using regional‐scale, high‐resolution airborne geophysical data, leading to spatially comprehensive 3‐D maps of likely fresh (low salt) and saline (high salt) groundwater. The method is applied in the southwestern San Joaquin Valley of California adjacent to the Lost Hills and Belridge oil fields. Shallow groundwater is shown to have relatively high salinity in much of the study area, and a clay layer protects lower salinity water in the underlying Tulare aquifer. Leaking surface water canals have imported relatively fresh water into the area, resulting in fresher groundwater downgradient of the canals. Key Points: Fresh, brackish, and saline groundwater are mapped using airborne electromagnetic‐derived regional resistivity data. Categorical probabilities developed from Bayesian inversion and local wells reflect geophysical and interpretational uncertainty. Surface water diversions and clay units strongly influence groundwater quality in the southwestern San Joaquin Valley. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
27. Neurodevelopmental and health-related quality-of-life outcomes in adolescence after surgery for congenital heart disease in infancy.
- Author
-
Wotherspoon, Jane M, Eagleson, Karen J, Gilmore, Linda, Auld, Benjamin, Hirst, Anne, Johnson, Susan, Stocker, Christian, Heussler, Helen, and Justo, Robert N
- Subjects
CONGENITAL heart disease ,VERBAL behavior ,CARDIAC surgery ,WECHSLER Adult Intelligence Scale ,QUALITY of life ,BEHAVIORAL assessment ,CONTINUOUS performance test ,FAMILIES & psychology ,ADAPTABILITY (Personality) ,EXECUTIVE function ,COGNITION ,SURGICAL complications ,TREATMENT effectiveness ,SEVERITY of illness index ,NEUROPSYCHOLOGICAL tests ,QUESTIONNAIRES ,SOCIAL skills ,LONGITUDINAL method - Abstract
Aim: To assess outcomes in adolescence after surgery for congenital heart disease (CHD) in infancy. Domains analysed included cognition and executive function, social and emotional well-being, adaptive behaviour, academic achievement, and health-related quality of life (HRQoL). Method: Twenty-one participants (10 males, 11 females) ranged in age from 14 to 17 years (mean 15y 4.8mo, SD 8.4mo). Twenty had biventricular repairs. All were classified as New York Heart Association class I. Measures included: Wechsler Intelligence and Achievement scales; Wide Range Assessment of Memory and Learning, Second Edition; California Verbal Learning Test - Children's Version; Behaviour Rating Inventory of Executive Function; Conners, Third Edition; Adaptive Behavior Assessment System, Second Edition; Behavior Assessment System for Children, Second Edition; Rey-Osterrieth Complex Figure; and Pediatric Quality of Life Inventory. Results: Outcomes were significantly lower (p≤0.01) than population norms for processing speed, mathematical achievement, attention, and visual-spatial ability. Participants reported more frequent learning problems but more positive family relations. HRQoL was significantly lower across most domains by self- and parent-proxy report. Interpretation: Individuals with CHD may experience difficulties across a range of domains. These findings emphasize the importance of comprehensive screening, early intervention, and long-term follow-up, as deficits may extend into young adulthood. What This Paper Adds: Identified cognitive, learning, and attentional impairments in adolescents after congenital heart disease surgery in infancy. Combined self-report, caregiver report, and laboratory tasks in a comprehensive neurodevelopmental assessment protocol. Health-related quality of life was lower across most domains. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
28. THE EMERGENCE OF SUSTAINABLE INDUSTRIES: BUILDING ON NATURAL CAPITAL.
- Author
-
Russo, Michael V.
- Subjects
ENERGY industries ,SOCIOECONOMICS ,WIND power industry ,RENEWABLE energy sources ,CORPORATE growth ,BUSINESS forecasting ,STRATEGIC planning ,BUSINESS planning - Abstract
This paper focuses on the emergence and growth of sustainable industries, specifically analyzing the rise of the wind energy industry in California. Based on a favorable institutional environment and the presence of abundant natural capital, the wind energy industry took root and flourished in California during the last two decades. This paper analyzes this phenomenon by exploring the determinants of where and when wind energy projects would be established. Findings suggest that in locations where natural, social, and economic influences converged, greater wind energy activity followed. The paper advances a simple framework that uses natural capital, site specificity, and institutional environments to predict which sustainable industries will enjoy growth in coming decades. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
29. Genomics and 20 years of sampling reveal phenotypic differences between subpopulations of outmigrating Central Valley Chinook salmon.
- Author
-
Thompson, Tasha Q., O'Leary, Shannon, O'Rourke, Sean, Tarsa, Charlene, Baerwald, Melinda R., Goertler, Pascale, and Meek, Mariah H.
- Subjects
CHINOOK salmon ,NUCLEOTIDE sequencing ,GENOMICS ,BODY size ,PHENOTYPES ,DEER - Abstract
Intraspecific diversity plays a critical role in the resilience of Chinook salmon populations. California's Central Valley (CV) historically hosted one of the most diverse population complexes of Chinook salmon in the world. However, anthropogenic factors have dramatically decreased this diversity, with severe consequences for population resilience. Here we use next-generation sequencing and an archive of thousands of tissue samples collected across two decades during the juvenile outmigration to evaluate phenotypic diversity between and within populations of CV Chinook salmon. To account for highly heterogeneous sample qualities in the archive dataset, we develop and test an approach for population and subpopulation assignments of CV Chinook salmon that allows inclusion of relatively low‐quality samples while controlling error rates. We find significantly distinct outmigration timing and body size distributions for each population and subpopulation. Within the archive dataset, spring run individuals assigned to the Mill and Deer Creeks subpopulation exhibited an earlier and broader outmigration distribution as well as larger body sizes than individuals assigned to the Butte Creek subpopulation. Within the fall run population, individuals assigned to the late‐fall run subpopulation also exhibited an earlier and broader outmigration distribution and larger body sizes than other fall run fish in our dataset. These results highlight the importance of distinct subpopulations for maintaining remaining diversity in CV Chinook salmon, and demonstrate the power of genomics‐based population assignments to aid the study and management of intraspecific diversity. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. Climatically robust multiscale species distribution models to support pronghorn recovery in California.
- Author
-
Bean, William T., Butterfield, H. Scott, Howard, Jeanette K., and Batter, Thomas J.
- Subjects
SPECIES distribution ,RANDOM forest algorithms ,REGRESSION trees ,ATMOSPHERIC models ,HABITAT selection ,HOME range (Animal geography) ,HABITATS - Abstract
We combined two climate‐based distribution models with three finer‐scale suitability models to identify habitat for pronghorn recovery in California now and into the future. We used a consensus approach to identify areas of suitable climate now and in the future for pronghorn in California. We compared the results of climate models from two separate hypotheses about their historical ecology in the state. Under the migration hypothesis, pronghorn were expected to be limited climatically by extreme cold in winter and extreme heat in summer; under the niche reduction hypothesis, the historical distribution of pronghorn would have better represented the climatic limitations of the species. We combined occurrences from GPS collars distributed across three populations of pronghorn in the state to create three distinct habitat suitability models: (1) an ensemble model using random forests, Maxent, classification and regression trees, and a generalized linear model; (2) a step selection function; and (3) an expert‐driven model. We evaluated consensus among both the climate models and the suitability models to prioritize areas for, and evaluate the prospects of, pronghorn recovery. Climate suitability for pronghorn in the future depends heavily on model assumptions. Under the migration hypothesis, our model predicted that there will be no suitable climate in California in the future. Under the niche reduction hypothesis, by contrast, suitable climate will expand. Habitat suitability also depended on the methods used, but areas of consensus among all three models exist in large patches throughout the state. Identifying habitat for a species which has undergone extreme range collapse, and which has very fine scale habitat needs, presents novel challenges for spatial ecologists. Our multimethod, multihypothesis approach can allow habitat modelers to identify areas of consensus and, perhaps more importantly, fill critical knowledge gaps that could resolve disagreements among the models. 
For pronghorn, a better understanding of their upper thermal tolerances and whether historical populations migrated will be crucial to their potential recovery in California and throughout the arid Southwest. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. Size at maturity, reproductive cycle, and fecundity of the southern California brown box crab Lopholithodes foraminatus and implications for developing a new targeted fishery.
- Author
-
Stroud, Ashley, Culver, Carolynn S., and Page, Henry M.
- Subjects
SEXUAL cycle ,FISHERS ,FERTILITY ,CRABS ,FISHERIES ,SPRING - Abstract
Objective: The brown box crab Lopholithodes foraminatus is a member of the king and stone crab family (Lithodidae) that occurs in deepwater along the eastern Pacific coast. Historically, landings in California have been low for this species, but an increase in fishing pressure prompted the state to designate it as an emerging fishery and implement an experimental fishery program. With no known biological studies of California brown box crab, essential fisheries information is needed to evaluate the feasibility of a new targeted fishery. Methods: Using field sampling and observations, along with laboratory studies, we investigated elements of reproductive capacity of the brown box crab in southern California. Result: We found that females reach physiological maturity at a carapace width (CW) between 50.8 and 71.7 mm, and males do so at a CW between 43.3 and 66.3 mm. Morphometric maturity analysis showed a clear inflection point of abdomen width between immature and mature females. Females were 50% functionally mature at 75 mm CW. Morphometric and functional maturity were not detected for males, although samples of small male crabs were extremely limited, thus warranting further study. Females followed a biennial reproduction pattern: mating occurred in the fall, followed by an approximately 18‐month brooding period, with hatching in the second spring after mating. Fecundity was positively related to size and ranged from 8,352 eggs/brood for a 67.8‐mm‐CW female to 62,181 eggs/brood for a 130.5‐mm‐CW female. Conclusion: These findings can inform the evaluation of a fishery for the brown box crab, including potential management strategies and models for assessing stock condition. Impact statement: New information was generated about the reproduction of the deepwater brown box crab to help evaluate the potential for a new California commercial fishery. 
The results are informing discussions about ways to manage such a fishery, including limits on the size, number, and/or time of year crabs may be fished. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. Equine neuroaxonal dystrophy/degenerative myeloencephalopathy in Gypsy Vanner horses.
- Author
-
Powers, Alexis, Peek, Simon F., Reed, Steve, Donnelly, Callum G., Tinkler, Stacey, Gasper, David, Woolard, Kevin D., and Finno, Carrie J.
- Subjects
VITAMIN E ,HORSES ,DYSTROPHY ,DIETARY supplements ,NEUROLOGIC examination ,POSTMORTEM changes ,OSTEOCHONDROSIS - Abstract
Background: Equine neuroaxonal dystrophy/degenerative myeloencephalopathy (eNAD/EDM) is a neurodegenerative disease that primarily affects young, genetically predisposed horses that are deficient in vitamin E. Equine NAD/EDM has not previously been documented in Gypsy Vanner horses (GVs). Objectives: To evaluate: (1) the clinical phenotype, blood vitamin E concentrations before and after supplementation and pedigree in a cohort of GV horses with a high prevalence of neurologic disease suspicious for eNAD/EDM and (2) to confirm eNAD/EDM in GVs through postmortem evaluation. Animals: Twenty‐six GVs from 1 farm in California and 2 cases from the Midwestern U.S. Methods: Prospective observational study on Californian horses; all 26 GVs underwent neurologic examination. Pre‐supplementation blood vitamin E concentration was assessed in 17 GVs. Twenty‐three were supplemented orally with 10 IU/kg of liquid RRR‐alpha‐tocopherol once daily for 28 days. Vitamin E concentration was measured in 23 GVs after supplementation, of which 15 (65%) had pre‐supplementation measurements. Two clinically affected GVs from California and the 2 Midwestern cases had necropsy confirmation of eNAD/EDM. Results: Pre‐supplementation blood vitamin E concentration was ≤2.0 μg/mL in 16/17 (94%) of GVs from California. Post‐supplementation concentration varied, with a median of 3.39 μg/mL (range, 1.23‐13.87 μg/mL), but only 12/23 (52%) were normal (≥3.0 μg/mL). Normalization of vitamin E was significantly associated with increasing age (P = .02). Euthanized horses (n = 4) had eNAD/EDM confirmed at necropsy. Conclusions and Clinical Importance: GVs could have a genetic predisposition to eNAD/EDM. Vitamin E supplementation should be considered and monitored in young GVs. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. Mechanism of smoke-induced seed germination in a post-fire chaparral annual.
- Author
-
Keeley, J.E. and Fotheringham, C.J.
- Subjects
GERMINATION ,CHAPARRAL - Abstract
1 Smoke-stimulated germination in the post-fire flora of California chaparral does not appear to be triggered by nitrate. Application of freshly prepared unbuffered KNO3 solutions (pH c. 6.2) failed to enhance germination of five populations of Emmenanthe penduliflora or one Phacelia grandiflora population, regardless of light or stratification conditions. 2 KNO3 buffered at acidic pH (or unbuffered solutions equilibrated with atmospheric CO2) did induce germination, but KNO3 solutions at pH 7 failed to induce germination. Induction of germination is therefore not due to the nitrate ion per se, but rather to high [H+], although buffered controls gave weak germination at low pH, suggesting a role for H+ plus nitrate. However, other anions such as sulphate were equally as effective as nitrate at breaking dormancy. 3 The germination response to KNO3 was affected by the type of filter paper used and this may be linked to differences in pH. 4 NO2, at concentrations present in biomass smoke, was highly effective at inducing germination, and other oxidizing agents also induced germination. 5 Several growth regulators, including nitrite and gibberellin, were stimulatory only at acidic pH, but KCN was stimulatory across a broad pH range. 6 Germination decreased at smoke exposures longer than a few minutes. Also, smoked water samples effective at breaking dormancy were acidic and were less effective when buffered to pH >7. 7 Physical scarification of the seed coat induced germination but the effect was not due to penetration of a water barrier, or to enhanced oxygen uptake, or to wound responses such as CO2 or ethylene production. 8 Different effects of the gibberellin inhibitor CCC (chlorocholine chloride) suggested that the mechanisms of scarification-induced and smoke-induced germination may differ. 9 We conclude that either oxidizing gases in smoke and/or acids generated on burnt sites play a role in germination of post-fire annuals in chaparral. [ABSTRACT FROM AUTHOR]
- Published
- 1998
- Full Text
- View/download PDF
34. The Spread of Violent Crime from City to Countryside, 1955 to 1975.
- Author
-
Fischer, Claude S.
- Subjects
CRIMINAL behavior ,CRIME ,URBANIZATION ,CITIES & towns ,REGIONAL disparities ,CULTURE - Abstract
This paper addresses the issue of whether cultural differences between communities of varying degrees of urbanism are declining in modern society, taking as a case in point acts of violent crime. I will contend that, contrary to "massification" theories, between 1955 and 1975 differences in rates of criminal behavior between large and small communities actually increased, and furthermore, the pattern of changes is consistent with a specific alternative theory about urban-rural differences. This theory holds that cultural change is continually generated in major urban centers, diffuses to smaller cities and thence to the rural hinterland. Part 1 of this paper presents the empirical material on criminal behavior, largely consisting of national crime data aggregated to the level of categories of communities and of California crime data aggregated to the level of specific counties. Part 2 of the paper turns to more speculative concerns, discussing the extent to which crime is a cultural phenomenon and presenting more fully a theory of urban-to-rural diffusion, a theory suggesting cyclical patterns that are hinted at--but by no means proven--in the crime data. [ABSTRACT FROM AUTHOR]
- Published
- 1980
35. Prediction model for short‐term traffic flow based on a K‐means‐gated recurrent unit combination.
- Author
-
Sun, Zhaoyun, Hu, Yuanjiao, Li, Wei, Feng, Shaowei, and Pei, Lili
- Subjects
TRAFFIC flow ,TRAFFIC patterns ,K-means clustering ,PREDICTION models ,TRAFFIC estimation ,CLASSIFICATION algorithms - Abstract
Short‐term forecasting of traffic flow is an indispensable part of easing traffic pressure. Considering that different traffic flow patterns will affect the short‐term traffic flow prediction results, a combined method based on the K‐means clustering algorithm and gated recurrent unit (GRU) is proposed to build a short‐term traffic flow prediction model to overcome the above problems. The K‐means algorithm is used to cluster historical traffic flow data to establish different traffic flow pattern libraries. The K‐nearest neighbour (KNN) classification algorithm is used to determine the historical traffic flow pattern most similar to the traffic flow change trend of the date to be predicted. All historical traffic flow data in this category are used as training samples to make targeted predictions. The traffic flow data of the Performance Measurement System (PeMS) in California, USA, are used to verify the performance of the proposed model. Compared with the GRU network, stacked auto encoders (SAEs), random forest (RF), and support vector machine regression (SVR), the results show that the proposed combination model K‐means‐GRU considers the diversity of traffic flow patterns and improves the prediction accuracy; it can better solve the short‐term traffic flow prediction problem. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
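As a methodological aside on entry 35: the cluster-then-match step (K-means pattern libraries, then a nearest-pattern lookup for the day to be predicted) can be sketched in a few lines. This is not the authors' code; the daily traffic profiles are invented, and a simple nearest-centroid lookup stands in for their KNN classifier.

```python
# Sketch of the K-means pattern-library step from entry 35 (illustrative
# only): cluster historical daily traffic profiles, then pick the cluster
# most similar to a new day and use it as the targeted training set.
import math
import random

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans(profiles, k, iters=50, seed=0):
    random.seed(seed)
    centroids = random.sample(profiles, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in profiles:
            nearest = min(range(k), key=lambda i: dist(p, centroids[i]))
            clusters[nearest].append(p)
        # Recompute each centroid as the mean of its cluster (keep the old
        # centroid if a cluster goes empty).
        centroids = [
            [sum(col) / len(c) for col in zip(*c)] if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Toy daily profiles (4 time bins each): two "weekday" and two "weekend" shapes.
history = [[80, 300, 280, 120], [85, 310, 290, 110],
           [40, 90, 100, 60], [45, 95, 105, 65]]
centroids, clusters = kmeans(history, k=2)

# Nearest-pattern step (a stand-in for the paper's KNN classifier): only the
# matched pattern library would be used to train the GRU predictor.
new_day = [82, 305, 285, 115]
best = min(range(2), key=lambda i: dist(new_day, centroids[i]))
training_set = clusters[best]
```

With the toy data above, the weekday-shaped `new_day` is matched to the weekday cluster, so only those profiles would feed the downstream GRU.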
36. ASSESSING ORGANIZATIONAL FITNESS ON A DYNAMIC LANDSCAPE: AN EMPIRICAL TEST OF THE RELATIVE INERTIA THESIS.
- Author
-
Ruef, Martin
- Subjects
ORGANIZATIONAL structure ,STRATEGIC planning ,MANAGEMENT science ,COMPETITION ,ORGANIZATIONAL sociology ,QUALITATIVE research ,MANAGEMENT ,ORGANIZATIONAL behavior ,HOSPITALS - Abstract
This paper proposes an empirical framework for evaluating the relative structural inertia hypothesis, a central assumption of organizational ecology theories. In stark contrast to the tenets of strategic management, the relative inertia thesis claims that organizations are typically unable to match structural changes to their competitive environments in a timely fashion. The hypothesis is tested for the hospital industry in California during the 1980-90 time frame. Strategic movements in a competition 'landscape' are tracked using a variant of the Jaccard similarity coefficient, which has been applied in numerous studies of biological competition. Findings indicate that few hospitals are able to overcome inertial forces in adapting their service portfolios; furthermore, the ability of hospitals to strategically reposition themselves decreases markedly with provider density. Analyses also investigate the relation between organizational attributes (e.g., age, size, mission, and portfolio scope) and adaptability; implications for both ecological and strategic theory are pursued. [ABSTRACT FROM AUTHOR]
- Published
- 1997
- Full Text
- View/download PDF
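A note on the metric in entry 36: the Jaccard similarity coefficient it builds on is |A ∩ B| / |A ∪ B|. A minimal sketch, with invented hospital service portfolios (the paper uses a variant of this coefficient, not necessarily this exact form):

```python
# Jaccard similarity between two service portfolios: size of the
# intersection divided by the size of the union.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

# Invented portfolios for one hospital at two points in time.
portfolio_1980 = {"emergency", "obstetrics", "cardiology", "psychiatry"}
portfolio_1990 = {"emergency", "obstetrics", "cardiology", "oncology", "imaging"}

similarity = jaccard(portfolio_1980, portfolio_1990)  # 3 shared / 6 total = 0.5
movement = 1.0 - similarity  # larger movement = less structural inertia
```

Under this reading, a hospital whose portfolio barely changes between observations scores near 1.0 on similarity, i.e., high relative inertia.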
37. Plant community data collected by Robert H. Whittaker in the Siskiyou Mountains, Oregon and California, USA.
- Author
-
Whittaker, Robert H., Damschen, Ellen I., and Harrison, Susan
- Subjects
COMMUNITIES ,SPECIES diversity ,PLANT variation ,ACQUISITION of data ,CHEMICAL composition of plants ,SHRUBS ,PLANT communities ,PLANT diversity - Abstract
In 1949–1951, ecologist Robert H. Whittaker sampled plant community composition at 470 sites in the Siskiyou Mountains (Oregon and California; also known as Klamath or Klamath‐Siskiyou Mountains). His primary goal was to develop methods to quantify plant community variation across environmental gradients, following on his seminal work challenging communities as discrete entities. He selected the Siskiyous because of their diverse and endemic‐rich flora, which he attributed to geological complexity and an ancient stable climate. He chose sites to span gradients of topography, elevation, geologic substrate, and distance from the coast. He used the frequencies of indicator species in his data to assign sampling locations to positions on the topographic gradient, nested within the elevational and substrate gradients. He originated in this study the concept of diversity partitioning, in which gamma diversity (species richness of a community) equals alpha diversity (species richness in homogeneous sites) times beta diversity (species turnover among sites along gradients). Diversity partitioning subsequently became highly influential and new developments on it continue. Whittaker published his Siskiyou work covering paleohistory, biogeography, floristics, vegetation, gradient analysis, and diversity partitioning in Ecological Monographs in 1960. Discussed in 2 pages of his 60‐page monograph, diversity partitioning accounts for >95% of its current >4300 citations. In 2006, we retrieved Whittaker's Siskiyou data in hard copy from the Cornell University archives and entered them in a database. We used these data for multiple published analyses, including some based on (re)sampling the approximate locations of a subset of his sites. 
Because of the continued interest in diversity partitioning and in historic data sets, here we present his data, including 359 sampling locations and their descriptors and, for each sample, a list of species with their estimated percent cover (herbs and shrubs) and numbers by diameter at breast height (DBH) category (trees). Site descriptors include the approximate location (road, trail, or stream), elevation, topographic aspect, geologic substrate (serpentine, gabbro, or diorite), and dominant woody vegetation of each location. For 111 sites, including the small number chosen to represent the distance‐to‐coast gradient, we could not locate his data. There are no copyright restrictions and users of these data should cite this data paper in any publications that result from its use. The authors are available for consultations about and collaborations involving the data. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
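The diversity partitioning that entry 37 credits to Whittaker (gamma diversity equals alpha diversity times beta diversity) is easy to make concrete. A sketch with invented site-by-species lists (the species names are illustrative, not drawn from his Siskiyou data):

```python
# Whittaker's multiplicative diversity partitioning: gamma = alpha * beta,
# so beta (turnover among sites) = gamma / alpha.
sites = [
    {"Pinus jeffreyi", "Quercus vaccinifolia", "Arctostaphylos"},
    {"Pinus jeffreyi", "Darlingtonia californica", "Calocedrus"},
    {"Quercus vaccinifolia", "Calocedrus", "Arctostaphylos"},
]

gamma = len(set().union(*sites))                  # total richness across all sites
alpha = sum(len(s) for s in sites) / len(sites)   # mean within-site richness
beta = gamma / alpha                              # species turnover among sites
```

Here gamma is 5 species overall and alpha is 3 species per site, giving beta = 5/3: the landscape holds about 1.7 times as many species as a typical site, the "turnover" his partitioning quantifies.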
38. The Kimberlina synthetic multiphysics dataset for CO2 monitoring investigations.
- Author
-
Alumbaugh, David, Gasperikova, Erika, Crandall, Dustin, Commer, Michael, Feng, Shihang, Harbert, William, Li, Yaoguo, Lin, Youzuo, and Samarasinghe, Savini
- Subjects
PETROPHYSICS ,GEOPHYSICAL well logging ,SPEED of sound ,SEISMIC wave velocity ,INJECTION wells ,ELECTRICAL resistivity - Abstract
We present a synthetic multi‐scale, multi‐physics dataset constructed from the Kimberlina 1.2 CO2 reservoir model based on a potential CO2 storage site in the Southern San Joaquin Basin of California. Among 300 models, one selected reservoir‐simulation scenario produces hydrologic‐state models at the onset and after 20 years of CO2 injection. Subsequently, these models were transformed into geophysical properties, including P‐ and S‐wave seismic velocities, saturated density where the saturating fluid can be a combination of brine and supercritical CO2, and electrical resistivity using established empirical petrophysical relationships. From these 3D distributions of geophysical properties, we have generated synthetic time‐lapse seismic, gravity and electromagnetic responses with acquisition geometries that mimic realistic monitoring surveys and are achievable in actual field situations. We have also created a series of synthetic well logs of CO2 saturation, acoustic velocity, density and induction resistivity in the injection well and three monitoring wells. These were constructed by combining the low‐frequency trend of the geophysical models with the high‐frequency variations of actual well logs collected at the potential storage site. In addition, to better calibrate our datasets, measurements of permeability and pore connectivity have been made on cores of Vedder Sandstone, which forms the primary reservoir unit. These measurements provide the range of scales in the otherwise synthetic dataset to be as close to a real‐world situation as possible. This dataset consisting of the reservoir models, geophysical models, simulated time‐lapse geophysical responses and well logs forms a multi‐scale, multi‐physics testbed for designing and testing geophysical CO2 monitoring systems as well as for imaging and characterization algorithms. 
The suite of numerical models and data have been made publicly available for downloading on the National Energy Technology Laboratory's (NETL) Energy Data Exchange (EDX) website. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
39. Prescribed fire placement matters more than increasing frequency and extent in a simulated Pacific Northwest landscape.
- Author
-
Deak, Alison L., Lucash, Melissa S., Coughlan, Michael R., Weiss, Shelby, and Silva, Lucas C. R.
- Subjects
WILDFIRES ,WILDFIRE prevention ,PRESCRIBED burning ,FUEL reduction (Wildfire prevention) ,CARBON sequestration in forests ,FOREST succession ,CLIMATE change mitigation ,FOREST dynamics - Abstract
Prescribed fire has been increasingly promoted to reduce wildfire risk and restore fire‐adapted ecosystems. Yet, the complexities of forest ecosystem dynamics in response to disturbances, climate change, and drought stress, combined with myriad social and policy barriers, have inhibited widespread implementation. Using the forest succession model LANDIS‐II, we investigated the likely impacts of increasing prescribed fire frequency and extent on wildfire severity and forest carbon storage at local and landscape scales. Specifically, we ask how much prescribed fire is required to maintain carbon storage and reduce the severity and extent of wildfires under divergent climate change scenarios. We simulated four prescribed fire scenarios (no prescribed fire, business‐as‐usual, moderate increase, and large increase) in the Siskiyou Mountains of northwest California and southwest Oregon. At the local site scale, prescribed fires lowered the severity of projected wildfires and maintained approximately the same level of ecosystem carbon storage when reapplied at a ~15‐year return interval for 50‐year simulations. Increased frequency and extent of prescribed fire decreased the likelihood of aboveground carbon combustion during wildfire events. However, at the landscape scale, prescribed fire did not decrease the projected severity and extent of wildfire, even when large increases (up to 10× the current levels) of prescribed fire were simulated. Prescribed fire was most effective at reducing wildfire severity under a climate change scenario with increased temperature and precipitation and on sites with north‐facing aspects and slopes greater than 30°. Our findings suggest that placement matters more than frequency and extent to estimate the effects of prescribed fire, and that prescribed fire alone would not be sufficient to reduce the risk of wildfire and promote carbon sequestration at regional scales in the Siskiyou Mountains. 
To improve feasibility, we propose targeting areas of high concern or value to decrease the risk of high‐severity fire and contribute to meeting climate mitigation and adaptation goals. Our results support strategic and targeted landscape prioritization of fire treatments to reduce wildfire severity and increase the pace and scale of forest restoration in areas of social and ecological importance, highlighting the challenges of using prescribed fire to lower wildfire risk. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
40. Evaluating the Role of Titanomagnetite in Bubble Nucleation: Rock Magnetic Detection and Characterization of Nanolites and Ultra‐Nanolites in Rhyolite Pumice and Obsidian From Glass Mountain, California.
- Author
-
Brachfeld, Stefanie, McCartney, Kelly N., Hammer, Julia E., Shea, Thomas, and Giachetti, Thomas
- Subjects
PUMICE ,RHYOLITE ,OBSIDIAN ,MAGNETIC anisotropy ,SURFACE of the earth ,COSMIC abundances ,SUPERPARAMAGNETIC materials ,SUPERRADIANCE - Abstract
We document the presence, composition, and number density (TND) of titanomagnetite nanolites and ultra‐nanolites in aphyric rhyolitic pumice, obsidian, and vesicular obsidian from the 1060 CE Glass Mountain volcanic eruption of Medicine Lake Volcano, California, using magnetic methods. Curie temperatures indicate compositions of Fe2.40Ti0.60O4 to Fe3O4. Rock‐magnetic parameters sensitive to domain state, which is dependent on grain volume, indicate a range of particle sizes spanning superparamagnetic (<50–80 nm) to multidomain (>10 μm) particles. Cylindrical cores drilled from the centers of individual pumice clasts display anisotropy of magnetic susceptibility with prolate fabrics, with the highest degree of anisotropy coinciding with the highest vesicularity. Fabrics within a pumice clast require particle alignment within a fluid, and are interpreted to result from the upward transport of magma driven by vesiculation, ensuing bubble growth, and shearing in the conduit. Titanomagnetite number density (TND) is calculated from titanomagnetite volume fraction, which is determined from ferromagnetic susceptibility. TND estimates for monospecific assemblages of 1,000 nm–10 nm cubes predict 10^12 to 10^20 m^−3 of solid material, respectively. TND estimates derived using a power law distribution of grain sizes predict 10^18 to 10^19 m^−3. These ranges agree well with TND determinations of 10^18 to 10^20 m^−3 made by McCartney et al. (2024), and are several orders of magnitude larger than the number density of bubbles in these materials. These observations are consistent with the hypothesis that titanomagnetite crystals already existed in extremely high number‐abundance at the time of magma ascent and bubble nucleation. Plain Language Summary: We use magnetism experiments to prove that nanometer‐sized magnetic particles are present in volcanic rocks with low iron content and few visible crystals. 
Nanolites (particles between 30 and 1,000 nm) and ultra‐nanolites (particles smaller than 30 nm) are extremely difficult to detect in volcanic rocks composed mainly of glass using conventional methods such as optical and electron microscopy. Titanomagnetite nano‐particles may play a role in controlling the explosiveness of volcanic eruptions. The magnetic signatures of minerals can be used to determine their chemical composition, particle size range, and particle abundance. Pumice and obsidian contain the mineral titanomagnetite, with no evidence of prolonged crystallization at high oxygen levels at the Earth's surface. Observed magnetic behaviors are very similar to those of previously published studies of titanomagnetite in the 10–1,000 nm size range, and similar to mathematical models that simulate this size range. We find that pumice clasts have a magnetic fabric, suggesting that the nanolites and ultra‐nanolites were aligned in spatial patterns before the magma solidified, with stronger alignment coinciding with high degrees of vesicularity. Our results indicate that titanomagnetite crystals are highly abundant, and had crystallized in the magma chamber before the eruption. Key Points: Magnetic methods document titanomagnetite nanolites in rhyolitic materials from Glass Mountain, Medicine Lake Volcano, California. Titanomagnetite number densities for pumice, obsidian, and vesicular obsidian span 10^12 to 10^20 m^−3 of solid material. Titanomagnetite crystals already existed in extremely high number‐abundance at the time of magma ascent and bubble nucleation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
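The monospecific-assemblage TND calculation in entry 40 follows directly from geometry: for cubic crystals of edge length a occupying a volume fraction φ of the rock, the number density is φ / a³. A sketch of that arithmetic; the volume fraction used here is an assumed illustrative value, not one reported in the paper.

```python
# Number density of a monospecific assemblage of cubic crystals:
# TND = volume_fraction / edge_length**3 (crystals per cubic meter).
phi = 1e-4  # assumed titanomagnetite volume fraction (illustrative only)

def tnd(edge_nm, volume_fraction):
    edge_m = edge_nm * 1e-9          # convert nm to m
    return volume_fraction / edge_m ** 3

small = tnd(10, phi)     # 10 nm cubes   -> 1e20 per cubic meter
large = tnd(1000, phi)   # 1000 nm cubes -> 1e14 per cubic meter
```

This cube-root-of-volume scaling is why shrinking the assumed grain size from 1,000 nm to 10 nm raises the estimated number density by six orders of magnitude at fixed volume fraction, consistent with the wide 10^12 to 10^20 m^−3 range the abstract reports across grain-size assumptions.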
41. Evaluating the Role of Titanomagnetite in Bubble Nucleation: Novel Applications of Low Temperature Magnetic Analysis and Textural Characterization of Rhyolite Pumice and Obsidian From Glass Mountain, California.
- Author
-
McCartney, Kelly N., Hammer, Julia E., Shea, Thomas, Brachfeld, Stefanie, and Giachetti, Thomas
- Subjects
PUMICE ,RHYOLITE ,OBSIDIAN ,LOW temperatures ,HOMOGENEOUS nucleation ,BUBBLES - Abstract
Nucleation of H2O vapor bubbles in magma requires surpassing a chemical supersaturation threshold via decompression. The threshold is minimized in the presence of a nucleation substrate (heterogeneous nucleation, <50 MPa), and maximized when no nucleation substrate is present (homogeneous nucleation, >100 MPa). The existence of explosively erupted aphyric rhyolite magma staged from shallow (<100 MPa) depths represents an apparent paradox that hints at the presence of a cryptic nucleation substrate. In a pair of studies focusing on Glass Mountain eruptive units from Medicine Lake, California, we characterize titanomagnetite nanolites and ultrananolites in pumice, obsidian, and vesicular obsidian (Brachfeld et al., 2024, https://doi.org/10.1029/2023GC011336), calculate titanomagnetite crystal number densities, and compare titanomagnetite abundance with the physical properties of pumice to evaluate hypotheses on the timing of titanomagnetite crystallization. Titanomagnetite crystals with grain sizes of approximately 3–33 nm are identified in pumice samples from the thermal unblocking of low‐temperature thermoremanent magnetization. The titanomagnetite number densities for pumice are 10^18 to 10^20 m^−3, comparable to number densities in pumice and obsidian obtained from room temperature methods (Brachfeld et al., 2024, https://doi.org/10.1029/2023GC011336). This range exceeds reported bubble number densities (BND) within the pumice from the same eruptive units (average BND ∼4 × 10^14 m^−3). The similar abundances of nm‐scale titanomagnetite crystals in the effusive and explosive products of the same eruption, together with the lack of correlation between pumice permeability and titanomagnetite content, are consistent with titanomagnetite formation having preceded the bubble formation. 
Key Points: Aphyric rhyolite eruptions staged from shallow magma reservoirs lack the overpressure needed for homogeneous bubble nucleation. Heterogeneous bubble nucleation may occur on sub‐µm titanomagnetite crystals, which are undetectable using standard analytical techniques. Sub‐µm titanomagnetite crystals can be detected and quantified with low temperature magnetic analyses. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
42. Exploring the impacts of traffic flow states on freeway normal crashes, primary crashes, and secondary crashes.
- Author
-
Yang, Bo, Guo, Yanyong, Zhang, Weihua, Yao, Ying, and Wu, Yiping
- Subjects
FLOW theory (Psychology) ,ASSOCIATION rule mining ,PROPERTY damage ,EXPRESS highways ,TRAFFIC flow ,LANE changing - Abstract
This study aims to explore the relationship between traffic flow states and crash type/severity in the scenarios of normal crashes, primary crashes, and secondary crashes using the association rules mining approach. The crash data and real‐time traffic data were collected over five years from the I‐880 freeway in California, USA. The secondary crashes were identified using a speed contour plot approach. Traffic flow states were identified by the three‐phase flow theory. The results showed that the free flow is associated with the proportion of the sideswipe normal crash, the hit object primary crash, and the injury primary crash. The synchronized flow, the wide moving jams, and the transitional state from synchronized flow to wide moving jams are associated with the proportion of the rear‐end secondary crash. The transitional state from synchronized flow to free flow is associated with the proportion of the rear‐end primary crash and the property damage only primary crash. In addition, the unsafe speed behaviour can increase the proportion of the rear‐end normal, primary, and secondary crashes. The unsafe lane change behaviour can increase the proportion of the sideswipe normal, primary, and secondary crashes. These results have the potential to reduce the secondary crash probability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
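Entry 42's association rules mining rests on two standard quantities, support and confidence. A minimal sketch with invented crash records (the traffic-state and crash-type labels are illustrative, not the study's coding scheme):

```python
# Support and confidence for association rules of the form
# {traffic state} -> {crash type}, over invented crash records.
records = [
    {"free_flow", "sideswipe"},
    {"free_flow", "hit_object"},
    {"sync_flow", "rear_end"},
    {"sync_flow", "rear_end"},
    {"free_flow", "rear_end"},
]

def support(itemset):
    # Fraction of records that contain every item in the itemset.
    return sum(itemset <= r for r in records) / len(records)

def confidence(antecedent, consequent):
    # P(consequent | antecedent), estimated from the records.
    return support(antecedent | consequent) / support(antecedent)

# Example rule: synchronized flow -> rear-end crash.
supp = support({"sync_flow", "rear_end"})       # 2 of 5 records = 0.4
conf = confidence({"sync_flow"}, {"rear_end"})  # 2 of 2 sync-flow records = 1.0
```

Rules whose support and confidence clear chosen thresholds are the associations reported; in this toy data, every synchronized-flow crash is a rear-end crash.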
43. A Comprehensive Assessment of Submarine Landslides and Mass Wasting Processes Offshore Southern California.
- Author
-
Walton, Maureen A. L., Conrad, James E., Papesh, Antoinette G., Brothers, Daniel S., Kluesner, Jared W., McGann, Mary, and Dartnell, Peter
- Subjects
LANDSLIDES ,MARINE sediments ,GEOPHYSICAL surveys ,COASTAL sediments ,EARTHQUAKES ,SEDIMENTATION & deposition ,MARINE debris - Abstract
It is critical to characterize submarine landslide hazards near dense coastal populations, especially in areas with active faults, which can trigger slope failure, subsequent tsunamis, and damage seabed infrastructure during earthquake shaking. Offshore southern California, numerous marine geophysical surveys have been conducted over the past decade, and high‐resolution bathymetric and subsurface data now cover about 60 percent of the total region between Point Conception and the United States‐Mexico border from the California coast out to the base of Patton Escarpment ∼200 km offshore. In a comprehensive compilation and interpretive mapping effort, we find evidence of seafloor failure throughout offshore southern California with nearly 1,500 submarine landslide‐related features, including 63 discrete slide deposits with debris and >1,400 slide‐related scarps. In our analysis, we highlight new mapping of submarine landslides in Catalina Basin, the Del Mar slide, the San Gabriel slide complex, and the 232 km2 San Nicolas slide, the largest area of any known submarine landslide mass offshore southern California. Analysis of the spatial distribution of submarine landslide features suggests that most mapped slide features are located relatively near coastal sediment sources, particularly during sea‐level lowstand conditions, which underscores the importance of sediment supply and sediment accumulation on low‐gradient slopes as failure preconditioning processes. Tectonically driven uplift at shelf edges and along basin flanks is another key preconditioning factor, and our results also suggest that earthquakes along active faults trigger mass wasting, especially for repeated, small‐scale failures on tectonically steepened slopes. Plain Language Summary: Submarine landslides can damage seabed infrastructure such as cables and moorings, cause tsunamis, and be triggered by shaking from earthquakes. 
It is important to understand the risk of submarine landslides near dense coastal populations, particularly where earthquakes also pose hazards. Offshore southern California, we have new high‐resolution seafloor and subsurface imaging data that help us to identify submarine landslide deposits in the marine environment. In our study, we map and compile evidence for submarine landslides and find nearly 1,500 slide‐related features, 63 of which feature significant debris deposits. We describe some of the larger slides in this study for the first time, including submarine landslides in Catalina Basin, the Del Mar slide, the San Gabriel slide complex, and the 232 square kilometer San Nicolas slide, which is one of the largest known submarine landslide masses offshore southern California. Our work suggests that submarine landslide failure processes offshore southern California require a combination of (a) significant sediment supply, which is enhanced during low sea‐level conditions, (b) uplift and steepening along faults, and (c) earthquake shaking to trigger slide events. Key Points: Comprehensive analysis of submarine landslides in southern California provides new metrics on their size, distribution, timing, and geology. Submarine landslide failure processes are controlled by a combination of sediment deposition, tectonic uplift, and earthquake triggering. Small‐scale failures dominate steep areas near Quaternary faults; large slides tend to occur on lower slopes farther from faults. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
44. Satellite Remote Sensing: A Tool to Support Harmful Algal Bloom Monitoring and Recreational Health Advisories in a California Reservoir.
- Author
-
Lopez Barreto, Brittany N., Hestir, Erin L., Lee, Christine M., and Beutel, Marc W.
- Subjects
ALGAL blooms ,REMOTE sensing ,BODIES of water ,CYANOBACTERIAL toxins ,EVAPOTRANSPIRATION ,MICROCYSTIS ,TOXIC algae ,MICROCYSTINS - Abstract
Cyanobacterial harmful algal blooms (cyanoHABs) can harm people and animals and affect consumptive and recreational use of inland waters. Monitoring cyanoHABs is often limited. However, chlorophyll‐a (chl‐a) is a common water quality metric and has been shown to have a relationship with cyanobacteria. The World Health Organization (WHO) recently updated its previous 1999 cyanoHAB guidance values (GVs) to be more practical by basing the GVs on chl‐a concentration rather than cyanobacterial counts. This creates an opportunity for widespread cyanoHAB monitoring based on chl‐a proxies, with satellite remote sensing (SRS) being a potentially powerful tool. We used Sentinel‐2 (S2) and Sentinel‐3 (S3) to map chl‐a and cyanobacteria, respectively, classified chl‐a values according to WHO GVs, and then compared them to cyanotoxin advisories issued by the California Department of Water Resources (DWR) at San Luis Reservoir, key infrastructure in California's water system. We found reasonably high rates of total agreement between advisories by DWR and SRS; however, rates of agreement for S2 varied by algorithm. Total agreement was 83% for S3, and 52%–79% for S2. False positive and false negative rates for S3 were 12% and 23%, respectively. S2 had a 12%–80% false positive rate and a 0%–38% false negative rate, depending on algorithm. Using SRS‐based chl‐a GVs as an early indicator for possible exposure advisories and as a trigger for in situ sampling may be an effective way to improve public health warnings. Implementing SRS for cyanoHAB monitoring could fill temporal data gaps and provide greater spatial information not available from in situ measurements alone. Plain Language Summary: Lakes often have algal blooms that create a water quality concern, especially when they contain cyanobacteria, which can be toxic to both humans and animals. These harmful algal blooms are of great concern in areas with limited water supply in states such as California. 
While it is often difficult and costly to collect and monitor toxin concentrations, monitoring concentrations of chlorophyll‐a (chl‐a), a measure of how much algae are present, is relatively common and can even be accomplished using satellite remote sensing. Multiple studies have found a relationship between toxins produced by cyanobacteria and chl‐a. The World Health Organization (WHO) recently released (2021) an update to its previous 1999 guidance values for toxin monitoring based on chl‐a concentration. With satellite data, we were able to measure chl‐a concentration in a major reservoir in California, and then classify the chl‐a measurements into the WHO's guidance values for toxins. We compared the satellite‐based guidance values to the public advisory levels currently set by the California Department of Water Resources. Our results indicate that SRS of chl‐a is a reasonable substitute for cyanobacteria toxin advisories, and our framework can be applied to similar cyanobacteria‐dominated lakes. Key Points: The World Health Organization (WHO) updated cyanobacteria harmful algal blooms (cyanoHABs) guidelines for chlorophyll‐a (chl‐a) as a proxy. With satellite remote sensing (SRS), we estimated and classified chl‐a to compare cyanotoxin advisories used by California. This study provides a framework for evaluating the public health utility of SRS for enhancing cyanotoxin monitoring globally. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
45. Health Impacts of Future Prescribed Fire Smoke: Considerations From an Exposure Scenario in California.
- Author
-
Rosenberg, Andrew, Hoshiko, Sumi, Buckman, Joseph R., Yeomans, Kirstin R., Hayashi, Thomas, Kramer, Samantha J., Huang, ShihMing, French, Nancy H. F., and Rappold, Ana G.
- Subjects
PRESCRIBED burning ,FIRE management ,WILDFIRE prevention ,EMERGENCY room visits ,FOREST fire management ,PARTICULATE matter ,FOREST management - Abstract
In response to increasing wildfire risks, California plans to expand the use of prescribed fire. We characterized the anticipated change in health impacts from exposure to smoke under a future fire‐management scenario relative to a historical period (2008–2016). Using dispersion models, we estimated daily fine particulate matter (PM2.5) emissions from hypothetical future prescribed fires on 500,000 acres classified as high priority. To evaluate health impacts, we calculated excess daily cardiorespiratory emergency department visit rates attributed to all‐source PM2.5, distinguishing the portion of the burden attributed to prescribed fire. The total burden was differentiated by fire type and by smoke strata‐specific days to calculate strata‐specific burden rates, which were then applied to estimate the burden in the future scenario. This analysis suggests that exposure to prescribed fire smoke, measured as the number of persons exposed per year, would be 15 times greater in the future. However, these exposures were associated with lower concentrations compared to the historical period. The increased number of exposure days led to an overall increase in the future health burden. The northern, central, and southern regions experienced the largest burden increases. This study introduces an approach that integrates spatiotemporal exposure differences, baseline morbidity, and population size to assess the impacts of prescribed fire under a future scenario. The findings highlight the need to consider both the level and frequency of exposure to guide strategies to safeguard public health as well as aid forest management agencies in making informed decisions to protect communities while mitigating wildfire risks. Plain Language Summary: Prescribed fire is a forest management strategy for reducing the risks of wildfires. 
While some fires are ecologically beneficial, smoke from fires is a major source of airborne particle pollution, which is harmful to human health. This study examined the change in health impacts resulting from an expected increase in the use of prescribed fire within California's high‐priority wildfire risk areas. We used daily counts of cardiorespiratory emergency department visits attributed to air quality combined with model‐generated measures of smoke pollution to estimate health impacts. We compared exposures and the associated health burden on days impacted by wildfire or prescribed fire smoke in the past to the impacts in the hypothetical future scenario with increased prescribed fire. Projections of future prescribed burning in high priority areas suggest that more people would experience smoke more often, although exposures would occur at lower concentrations. With more frequent lower‐level exposure days near populated areas, the health burden would increase relative to past prescribed fire. Understanding the potential impact of prescribed fire may simultaneously help protect public health and increase safety from wildfires. Key Points: A California‐based model of future prescribed burning in high‐priority wildfire risk areas suggested more people will experience smoke. An increased number of exposure days in the future scenario led to an overall increase in the future health burden. The excess future health burden was due to the cumulative impact of lower exposure days and high population density in high‐priority areas. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
46. Evaluating immaturity risk in young stands of the serotinous knobcone pine (Pinus attenuata).
- Author
-
Marlin, Katherine F., Greene, David F., Kane, Jeffrey M., Reilly, Matthew, and Madurapperuma, Buddhika D.
- Subjects
PINUS radiata ,PINE ,SEED viability ,CONIFERS ,CALIFORNIA wildfires ,PLANT populations - Abstract
As wildfire becomes increasingly frequent, many serotinous plant populations risk local extirpation if fire recurs prior to sufficient seed accumulation in the canopy (i.e., "immaturity risk"). Following two 2018 wildfires in northwestern California, we studied seed viability, cone production, and postfire regeneration of a serotinous conifer, knobcone pine (Pinus attenuata), in stands with ages (time since fire) ranging from 6 to 79 years. Cone density per tree was more strongly associated with tree diameter than age, and cone density was positively related to postfire seedling regeneration. Most postfire knobcone pine regeneration established during the first postfire year, with high survivorship in the following year. Adjusting for survivorship, the estimated minimum age for knobcone pine to promote self‐replacement (one recruit per tree) was 9.5 years (or 4.6‐cm dbh), and the probability of reburning at the modern fire rotation of 43 years was 19.8%. We found that immaturity risk is currently low for knobcone pine. Our approach provides a quantitative method to assess immaturity risk in knobcone pine and other serotinous conifer species that can be used to evaluate future risk under rapidly changing climate and fire conditions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
47. Large‐scale, multidecade monitoring data from kelp forest ecosystems in California and Oregon (USA).
- Author
-
Malone, Daniel P., Davis, Kathryn, Lonhart, Steve I., Parsons‐Field, Avrey, Caselle, Jennifer E., and Carr, Mark H.
- Subjects
ECOSYSTEMS ,MACROCYSTIS ,ECOSYSTEM services ,MARINE parks & reserves ,KELPS ,OCEAN temperature ,BIOTIC communities ,FISHERIES - Abstract
Kelp forests are among the most productive ecosystems on Earth. In combination with their close proximity to the shore, the productivity and biodiversity of these ecosystems generate a wide range of ecosystem services including supporting (e.g., primary production, habitat), regulating (e.g., water flow, coastal erosion), provisioning (e.g., commercial and recreational fisheries), and cultural (e.g., recreational, artisanal) services. For these reasons, kelp forests have long been the target of ecological studies. However, with few exceptions, these studies have been localized and short term (<5 years). In 1999, recognizing the importance of large‐scale, long‐term studies for understanding the structure, functioning, and dynamics of coastal marine ecosystems, and for informing policy, the Partnership for Interdisciplinary Studies of Coastal Oceans (PISCO) designed and initiated a large‐scale, long‐term monitoring study of kelp forest ecosystems along 1400 km of coast stretching from southern California to southern Oregon, USA. The purpose of the study has been to characterize the spatial and temporal patterns of kelp forest ecosystem structure and evaluate the relative contributions of biological and environmental variables derived from external sources (e.g., sea otter density, Chl‐a concentration, sea surface temperature, wave energy) in explaining observed spatial and temporal patterns. For this purpose, the ecological community (i.e., density, percent cover, or biomass of conspicuous fishes, invertebrates, and macroalgae) and geomorphological attributes (bottom depth, substratum type, and vertical relief) of kelp forest ecosystems have been surveyed annually using SCUBA divers trained in both scientific diving and data collection techniques and the identification of kelp forest species. 
The study region spans distinct ecological and biogeographic provinces, which enables investigations of how variation in environmental drivers and distinctive species compositions influence community structure, and its response to climate‐related environmental change across a portion of the California Current Large Marine Ecosystem. These data have been used to inform fisheries management, design and evaluate California's state‐wide network of marine protected areas (MPAs), and assess the ecological consequences of climate change (e.g., marine heatwaves). Over time, the spatial and temporal design of the monitoring program was adapted to fill its role in evaluating the ecological responses to the establishment of MPAs. There are no copyright restrictions; please cite this paper when data are used. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
48. Advances in Sintering Science and Technology.
- Author
-
Bordia, Rajendra and Olevsky, Eugene
- Subjects
SINTERING ,METALLURGY ,CONFERENCES & conventions - Abstract
The article presents information on advances in the field of sintering science and technology. It reports on the International Conference Sintering 2008, held in San Diego, California, from November 16-20, reportedly the fifth in a series that began in 1995. The current issue of the journal focuses on papers presented at that conference. Similar conferences have reportedly been held in Eastern Europe, attended by professionals in the field.
- Published
- 2009
- Full Text
- View/download PDF
49. Ira Herskowitz, an Editor of Genes to Cells dies at 56.
- Author
-
Tomizawa, Jun-ichi
- Subjects
EDITORS ,LIFE sciences ,DEATH - Abstract
Ira Herskowitz died on 28 April 2003 of pancreatic cancer. He graduated from the California Institute of Technology and attended the Massachusetts Institute of Technology for his doctorate, studying the control of gene expression in phage lambda. As a young professor at the University of Oregon he started his seminal work on yeast molecular biology. Extending the pioneering work of Yasuji Oshima, he provided a molecular interpretation of the cassette theory of yeast mating type interconversion. Later, at the University of California, San Francisco, he continued to make key contributions on gene regulation and control of the cell cycle in the yeast system. I think it a natural development that, in later years, he turned to the pharmacogenetics of mammalian membrane transporters. Ira had a remarkable ability to untangle complex phenomena by clear reasoning and impressed us with his persuasive presentations. He was also an enthusiastic folk and blues singer. When I organized a biology meeting, I asked him to bring his guitar. He said, ‘I will bring my instrument made in Japan’. Real, or a joke? I present below some of the witty lines he sang. I feel very sad that I cannot reproduce his attractively deep voice. (Jun-ichi Tomizawa, ‘Tomi’). [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
50. ‘Power in place’: viticultural spatialities of globalization and community empowerment in the Languedoc.
- Author
-
Jones, Alun
- Subjects
ECONOMIC globalization ,WINE industry - Abstract
This paper explores the ways in which economic globalization processes produce new spatio-temporalities. It emphasizes how the exercise of different modes of power, in particular instrumental and associational powers, is critical to understanding the distinct formations that are produced by globalization dynamics. Using the empirical context of globalization in the wine industry, and the efforts made by one of the industry's leading wine corporations, Robert Mondavi of Napa valley California, to extend its production base to one of Europe's foremost wine-producing regions, the paper provides a crucial interpretative angle on spatio-temporal disruptions caused by globalization processes. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF