26 results for "Buhnerkempe MG"
Search Results
2. Eight challenges in modelling disease ecology in multi-host, multi-agent systems
- Author
-
Buhnerkempe, MG, Roberts, MG, Dobson, AP, Heesterbeek, H, Hudson, PJ, and Lloyd-Smith, JO
- Subjects
- Maintenance, Multiple pathogens, Food webs, Community ecology, Multiple hosts
- Abstract
Many disease systems exhibit complexities not captured by current theoretical and empirical work. In particular, systems with multiple host species and multiple infectious agents (i.e., multi-host, multi-agent systems) require novel methods to extend the wealth of knowledge acquired studying primarily single-host, single-agent systems. We outline eight challenges in multi-host, multi-agent systems that could substantively increase our knowledge of the drivers and broader ecosystem effects of infectious disease dynamics., (© 2014 The Authors.)
- Published
- 2014
3. Sacubitril/Valsartan as an Effective Hypertension Treatment Option in Those With Chronic Type B Aortic Dissection.
- Author
-
Buhnerkempe MG, Bitner S, and Flack JM
- Subjects
- Humans, Male, Treatment Outcome, Middle Aged, Chronic Disease, Antihypertensive Agents therapeutic use, Female, Aged, Angiotensin Receptor Antagonists therapeutic use, Tetrazoles therapeutic use, Blood Pressure drug effects, Aortic Aneurysm drug therapy, Angiotensin II Type 1 Receptor Blockers therapeutic use, Valsartan therapeutic use, Aminobutyrates therapeutic use, Biphenyl Compounds therapeutic use, Aortic Dissection drug therapy, Aortic Dissection diagnostic imaging, Drug Combinations, Hypertension drug therapy, Hypertension physiopathology
- Published
- 2024
- Full Text
- View/download PDF
4. Resistant Hypertension: Disease Burden and Emerging Treatment Options.
- Author
-
Flack JM, Buhnerkempe MG, and Moore KT
- Subjects
- Humans, Drug Resistance, Blood Pressure drug effects, Cost of Illness, Antihypertensive Agents therapeutic use, Hypertension drug therapy, Hypertension physiopathology
- Abstract
Purpose of Review: To define resistant hypertension (RHT), review its pathophysiology and disease burden, identify barriers to effective hypertension management, and highlight emerging treatment options., Recent Findings: RHT is defined as uncontrolled blood pressure (BP) ≥ 130/80 mm Hg despite concurrent prescription of ≥ 3 or ≥ 4 antihypertensive drugs in different classes or controlled BP despite prescription of ≥ 4 drugs, at maximally tolerated doses, including a diuretic. BP is regulated by a complex interplay between the renin-angiotensin-aldosterone system, the sympathetic nervous system, the endothelin system, natriuretic peptides, the arterial vasculature, and the immune system; disruption of any of these can increase BP. RHT is disproportionately manifest in African Americans, older patients, and those with diabetes and/or chronic kidney disease (CKD). Amongst drug-treated hypertensives, only one-quarter have been treated intensively enough (prescribed > 2 drugs) to be considered for this diagnosis. New treatment strategies aimed at novel therapeutic targets include inhibition of sodium-glucose cotransporter 2, aminopeptidase A, aldosterone synthesis, phosphodiesterase 5, xanthine oxidase, and dopamine beta-hydroxylase, as well as soluble guanylate cyclase stimulation, nonsteroidal mineralocorticoid receptor antagonism, and dual endothelin receptor antagonism. The burden of RHT remains high. Better use of currently approved therapies and integrating emerging therapies are welcome additions to the therapeutic armamentarium for addressing needs in high-risk patients with apparent treatment-resistant hypertension (aTRH)., (© 2024. The Author(s).)
- Published
- 2024
- Full Text
- View/download PDF
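The RHT definition quoted in the review abstract above reduces to a small decision rule. The sketch below is one reading of that definition (the text is ambiguous about whether the diuretic requirement applies to the uncontrolled arm; here it is applied only to the controlled arm), written as a hypothetical helper, not code from the paper:

```python
def is_resistant(sbp, dbp, n_drug_classes, includes_diuretic):
    """One reading of the RHT definition: BP >= 130/80 mm Hg despite >= 3
    antihypertensive classes, or controlled BP despite >= 4 classes at
    maximally tolerated doses including a diuretic."""
    uncontrolled = sbp >= 130 or dbp >= 80
    if uncontrolled:
        return n_drug_classes >= 3
    return n_drug_classes >= 4 and includes_diuretic
```

For example, a patient at 140/90 mm Hg on three drug classes would be flagged, as would a controlled patient on four classes including a diuretic.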
5. Race and Antihypertensive Drug Therapy: Edging Closer to a New Paradigm.
- Author
-
Flack JM and Buhnerkempe MG
- Subjects
- Antihypertensive Agents therapeutic use
- Published
- 2022
- Full Text
- View/download PDF
6. Adverse Health Outcomes Associated With Refractory and Treatment-Resistant Hypertension in the Chronic Renal Insufficiency Cohort.
- Author
-
Buhnerkempe MG, Prakash V, Botchway A, Adekola B, Cohen JB, Rahman M, Weir MR, Ricardo AC, and Flack JM
- Subjects
- Adult, Aged, Cohort Studies, Female, Humans, Hypertension complications, Hypertension epidemiology, Male, Middle Aged, Patient Outcome Assessment, Proportional Hazards Models, Antihypertensive Agents therapeutic use, Hypertension drug therapy, Renal Insufficiency, Chronic complications
- Abstract
Refractory hypertension (RfH) is a severe phenotype of antihypertensive treatment failure. Treatment-resistant hypertension (TRH), a less severe form of difficult-to-treat hypertension, has been associated with significantly worse health outcomes. However, no studies currently show how health outcomes may worsen upon progression to RfH. RfH and TRH were studied in 3147 hypertensive participants in the Chronic Renal Insufficiency Cohort (CRIC) study. The hypertensive phenotype (ie, no TRH or RfH, TRH, or RfH) was identified at the baseline visit, and health outcomes were monitored at subsequent visits. Outcome risk was compared using Cox proportional hazards models with time-varying covariates. A total of 136 (4.3%) individuals were identified with RfH at baseline. After adjusting for participant characteristics, individuals with RfH had increased risk for the composite renal outcome across all study years (50% decline in estimated glomerular filtration rate or end-stage renal disease; hazard ratio for study years 0-10=1.73 [95% CI, 1.42-2.11]) and the composite cardiovascular disease outcome during later study years (stroke, myocardial infarction, or congestive heart failure; hazard ratio for study years 0-3=1.25 [0.91-1.73], for study years 3-6=1.50 [0.97-2.32], and for study years 6-10=2.72 [1.47-5.01]) when compared with individuals with TRH. There was no significant difference in all-cause mortality between those with RfH versus TRH. We provide the first evidence that RfH is associated with worse long-term health outcomes compared with TRH.
- Published
- 2021
- Full Text
- View/download PDF
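The hazard ratios reported above come from Cox proportional hazards models, which are fit by maximizing the partial likelihood. A minimal sketch of that machinery for a single binary covariate, using invented toy data (not CRIC data) and a grid search in place of the usual Newton-Raphson fit:

```python
import math

# Toy survival records: (time, event_indicator, x), where x=1 marks the
# exposed group (e.g., RfH) and x=0 the comparison group (e.g., TRH).
# Purely illustrative values, not data from the study.
DATA = [
    (2, 1, 1), (3, 1, 1), (4, 0, 1), (5, 1, 1),
    (4, 1, 0), (6, 1, 0), (7, 0, 0), (8, 1, 0), (9, 1, 0),
]

def cox_partial_loglik(beta, data):
    """Breslow partial log-likelihood for one covariate."""
    ll = 0.0
    for t_i, event, x_i in data:
        if not event:
            continue  # censored records contribute only through risk sets
        # Risk set: everyone still under observation at the event time
        risk = [x for t, _, x in data if t >= t_i]
        ll += beta * x_i - math.log(sum(math.exp(beta * x) for x in risk))
    return ll

def fit_hazard_ratio(data, grid=None):
    """Grid-search the beta maximizing the partial likelihood."""
    grid = grid or [i / 100 for i in range(-300, 301)]
    beta = max(grid, key=lambda b: cox_partial_loglik(b, data))
    return math.exp(beta)  # hazard ratio for x=1 vs x=0

hr = fit_hazard_ratio(DATA)  # > 1: exposed group has earlier events
```

In practice one would use a library implementation (which also handles time-varying covariates, ties, and confidence intervals), but the estimate reduces to this likelihood.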
7. Linking longitudinal and cross-sectional biomarker data to understand host-pathogen dynamics: Leptospira in California sea lions (Zalophus californianus) as a case study.
- Author
-
Prager KC, Buhnerkempe MG, Greig DJ, Orr AJ, Jensen ED, Gomez F, Galloway RL, Wu Q, Gulland FMD, and Lloyd-Smith JO
- Subjects
- Animal Diseases diagnosis, Animal Diseases immunology, Animal Diseases microbiology, Animals, Antibodies, Bacterial blood, Bacterial Shedding, California, Cross-Sectional Studies, Host-Pathogen Interactions immunology, Immunity, Kinetics, Leptospira interrogans, Leptospirosis immunology, Survival Rate, Biomarkers blood, Host-Pathogen Interactions physiology, Leptospira pathogenicity, Leptospirosis diagnosis, Leptospirosis veterinary, Sea Lions microbiology
- Abstract
Confronted with the challenge of understanding population-level processes, disease ecologists and epidemiologists often simplify quantitative data into distinct physiological states (e.g. susceptible, exposed, infected, recovered). However, data defining these states often fall along a spectrum rather than into clear categories. Hence, the host-pathogen relationship is more accurately defined using quantitative data, often integrating multiple diagnostic measures, just as clinicians do to assess their patients. We use quantitative data on a major neglected tropical disease (Leptospira interrogans) in California sea lions (Zalophus californianus) to improve individual-level and population-level understanding of this Leptospira reservoir system. We create a "host-pathogen space" by mapping multiple biomarkers of infection (e.g. serum antibodies, pathogen DNA) and disease state (e.g. serum chemistry values) from 13 longitudinally sampled, severely ill individuals to characterize changes in these values through time. Data from these individuals describe a clear, unidirectional trajectory of disease and recovery within this host-pathogen space. Remarkably, this trajectory also captures the broad patterns in larger cross-sectional datasets of 1456 wild sea lions in all states of health but sampled only once. Our framework enables us to determine an individual's location in their time-course since initial infection, and to visualize the full range of clinical states and antibody responses induced by pathogen exposure. We identify predictive relationships between biomarkers and outcomes such as survival and pathogen shedding, and use these to impute values for missing data, thus increasing the size of the useable dataset. Mapping the host-pathogen space using quantitative biomarker data enables more nuanced understanding of an individual's time course of infection, duration of immunity, and probability of being infectious. 
Such maps also make efficient use of limited data for rare or poorly understood diseases, by providing a means to rapidly assess the range and extent of potential clinical and immunological profiles. These approaches yield benefits for clinicians needing to triage patients, prevent transmission, and assess immunity, and for disease ecologists or epidemiologists working to develop appropriate risk management strategies to reduce transmission risk on a population scale (e.g. model parameterization using more accurate estimates of duration of immunity and infectiousness) and to assess health impacts on a population scale., Competing Interests: The authors have declared that no competing interests exist.
- Published
- 2020
- Full Text
- View/download PDF
8. Estimating prevalence and test accuracy in disease ecology: How Bayesian latent class analysis can boost or bias imperfect test results.
- Author
-
Helman SK, Mummah RO, Gostic KM, Buhnerkempe MG, Prager KC, and Lloyd-Smith JO
- Abstract
Obtaining accurate estimates of disease prevalence is crucial for the monitoring and management of wildlife populations but can be difficult if different diagnostic tests yield conflicting results and if the accuracy of each diagnostic test is unknown. Bayesian latent class analysis (BLCA) modeling offers a potential solution, providing estimates of prevalence levels and diagnostic test accuracy under the realistic assumption that no diagnostic test is perfect. In typical applications of this approach, the specificity of one test is fixed at or close to 100%, allowing the model to simultaneously estimate the sensitivity and specificity of all other tests, in addition to infection prevalence. In wildlife systems, a test with near-perfect specificity is not always available, so we simulated data to investigate how decreasing this fixed specificity value affects the accuracy of model estimates. We used simulations to explore how the trade-off between diagnostic test specificity and sensitivity impacts prevalence estimates and found that directional biases depend on pathogen prevalence. Both the precision and accuracy of results depend on the sample size, the diagnostic tests used, and the true infection prevalence, so these factors should be considered when applying BLCA to estimate disease prevalence and diagnostic test accuracy in wildlife systems. A wildlife disease case study, focusing on leptospirosis in California sea lions, demonstrated the potential for Bayesian latent class methods to provide reliable estimates under real-world conditions. We delineate conditions under which BLCA improves upon the results from a single diagnostic across a range of prevalence levels and sample sizes, demonstrating when this method is preferable for disease ecologists working in a wide variety of pathogen systems., Competing Interests: The authors declare no competing interests., (© 2020 The Authors. Ecology and Evolution published by John Wiley & Sons Ltd.)
- Published
- 2020
- Full Text
- View/download PDF
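The core problem the BLCA paper addresses, apparent prevalence diverging from true prevalence under imperfect tests, has a simple classical illustration in the Rogan-Gladen estimator. This is standard epidemiology, not the paper's latent class model, but it shows why uncorrected test results mislead:

```python
def apparent_prevalence(true_prev, sensitivity, specificity):
    """Expected fraction of positive results from an imperfect test:
    true positives plus false positives."""
    return true_prev * sensitivity + (1 - true_prev) * (1 - specificity)

def rogan_gladen(apparent_prev, sensitivity, specificity):
    """Back-correct apparent prevalence for known test error rates."""
    return (apparent_prev + specificity - 1) / (sensitivity + specificity - 1)

# A test with 90% sensitivity and 95% specificity applied to a population
# with 10% true prevalence over-counts positives (13.5% apparent), and the
# correction recovers the true value.
ap = apparent_prevalence(0.10, 0.90, 0.95)   # 0.135
est = rogan_gladen(ap, 0.90, 0.95)           # 0.10
```

BLCA generalizes this idea to the case where sensitivity and specificity are themselves unknown and must be estimated jointly with prevalence.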
9. Serious Adverse Events Cluster in Participants Experiencing the Primary Composite Cardiovascular Endpoint: A Post Hoc Analysis of the SPRINT Trial.
- Author
-
Botchway A, Buhnerkempe MG, Prakash V, Al-Akchar M, Adekola B, and Flack JM
- Subjects
- Aged, Antihypertensive Agents adverse effects, Cardiovascular Diseases mortality, Cardiovascular Diseases physiopathology, Cluster Analysis, Female, Heart Disease Risk Factors, Humans, Hypertension mortality, Hypertension physiopathology, Male, Middle Aged, Randomized Controlled Trials as Topic, Risk Assessment, Time Factors, Treatment Outcome, Antihypertensive Agents therapeutic use, Blood Pressure drug effects, Cardiovascular Diseases prevention & control, Hypertension drug therapy
- Abstract
Background: Intensively treated participants in the SPRINT study experienced fewer primary cardiovascular composite study endpoints (CVD events) and lower mortality, although 38% of participants experienced a serious adverse event (SAE). The relationship of SAEs with CVD events is unknown., Methods: CVD events were defined as either myocardial infarction, acute coronary syndrome, decompensated heart failure, stroke, or death from cardiovascular causes. Cox models were utilized to understand the occurrence of SAEs with CVD events according to baseline atherosclerotic cardiovascular disease (ASCVD) risk., Results: SAEs occurred in 96% of those experiencing a CVD event but only in 34% (P < 0.001) of those not experiencing a CVD event. Occurrence of SAEs monotonically increased across the range of baseline ASCVD risk, being approximately twice as great in the highest compared with the lowest risk category. SAE occurrence was strongly associated with ASCVD risk but was similar within risk groups across treatment arms. In adjusted Cox models, experiencing a CVD event was the strongest predictor of SAEs in all risk groups. By the end of year 1, the hazard ratios for the low, middle, and high ASCVD risk tertiles, and baseline clinical CVD group were 2.56 (95% CI = 1.39-4.71); 2.52 (1.63-3.89); 3.61 (2.79-4.68); 1.86 (1.37-2.54), respectively, a trend observed in subsequent years until study end. Intensive treatment independently predicted SAEs only in the second ASCVD risk tertile., Conclusions: The occurrence of SAEs is multifactorial and mostly related to prerandomization patient characteristics, most prominently ASCVD risk, which, in turn, relates to in-study CVD events., (© American Journal of Hypertension, Ltd 2020. All rights reserved. For Permissions, please email: journals.permissions@oup.com.)
- Published
- 2020
- Full Text
- View/download PDF
10. Prevalence of refractory hypertension in the United States from 1999 to 2014.
- Author
-
Buhnerkempe MG, Botchway A, Prakash V, Al-Akchar M, Nolasco Morales CE, Calhoun DA, and Flack JM
- Subjects
- Aged, Aged, 80 and over, Albuminuria etiology, Antihypertensive Agents pharmacology, Blood Pressure drug effects, Blood Pressure Determination, Female, Humans, Hypertension complications, Hypertension drug therapy, Male, Middle Aged, Nutrition Surveys, Prevalence, Renal Insufficiency, Chronic etiology, Stroke etiology, United States epidemiology, Antihypertensive Agents therapeutic use, Diuretics therapeutic use, Hypertension epidemiology
- Abstract
Objectives: Refractory hypertension has been defined as uncontrolled blood pressure (at or above 140/90 mmHg) when on five or more classes of antihypertensive medication, inclusive of a diuretic. Because unbiased estimates of the prevalence of refractory hypertension in the United States are lacking, we aim to provide such estimates using data from the National Health and Nutrition Examination Surveys (NHANES)., Methods: Refractory hypertension was assessed across multiple NHANES cycles using the aforementioned definition. Eight cycles of NHANES surveys (1999-2014) representing 41,552 patients are the subject of this study. Prevalence of refractory hypertension across these surveys was estimated in the drug-treated hypertensive population after adjusting for the complex survey design and standardizing for age., Results: Across all surveys, refractory hypertension prevalence was 0.6% [95% confidence interval (CI) (0.5, 0.7)] amongst drug-treated hypertensive adults; 6.2% [95% CI (5.1, 7.6)] of individuals with treatment-resistant hypertension actually had refractory hypertension. Although the prevalence of refractory hypertension ranged from 0.3% [95% CI (0.1, 1.0)] to 0.9% [95% CI (0.6, 1.2)] over the eight cycles considered, there was no significant trend in prevalence over time. Refractory hypertension prevalence amongst those prescribed five or more drugs was 34.5% [95% CI (27.9, 41.9)]. Refractory hypertension was associated with advancing age, lower household income, black race, as well as chronic kidney disease, albuminuria, diabetes, prior stroke, and coronary heart disease., Conclusions: We provided the first nationally representative estimate of refractory hypertension prevalence in US adults.
- Published
- 2019
- Full Text
- View/download PDF
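The refractory hypertension definition used in the NHANES analysis above is mechanical enough to express directly. A minimal sketch (a hypothetical helper illustrating the stated definition, not the study's survey-weighted analysis code):

```python
def is_refractory(sbp, dbp, n_drug_classes, includes_diuretic):
    """Refractory hypertension per the definition above: BP at or above
    140/90 mmHg despite >= 5 antihypertensive classes, one of which is
    a diuretic."""
    uncontrolled = sbp >= 140 or dbp >= 90
    return uncontrolled and n_drug_classes >= 5 and includes_diuretic
```

Note the actual prevalence estimates additionally required adjusting for NHANES's complex survey design and age-standardizing, which this rule alone does not capture.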
11. Fixed-rate insulin for adult diabetic ketoacidosis is associated with more frequent hypoglycaemia than rate-reduction method: a retrospective cohort study.
- Author
-
Lorenson JL, Cusumano MC, Stewart AM, Buhnerkempe MG, and Sanghavi D
- Subjects
- Adult, Diabetic Ketoacidosis blood, Dose-Response Relationship, Drug, Female, Humans, Hypoglycemia blood, Hypoglycemia chemically induced, Hypoglycemic Agents administration & dosage, Incidence, Insulin administration & dosage, Length of Stay, Male, Middle Aged, Retrospective Studies, Young Adult, Blood Glucose analysis, Diabetic Ketoacidosis drug therapy, Hypoglycemia epidemiology, Hypoglycemic Agents adverse effects, Insulin adverse effects
- Abstract
Objective: To assess whether hypoglycaemia incidence during management of adult diabetic ketoacidosis (DKA) differed following transition from a fixed-rate insulin protocol to a protocol using an empiric insulin rate reduction after normoglycaemia., Methods: We retrospectively reviewed charts from adult patients managed with a DKA order set before and after order set revision. In cohort 1 (n = 77), insulin rate was 0.1 unit/kg/h with no adjustments and dextrose was infused at 12.5 g/h after glucose reached 250 mg/dl. In cohort 2 (n = 78), insulin was reduced to 0.05 unit/kg/h concurrent with dextrose initiation at 12.5 g/h after glucose reached 200 mg/dl. The primary outcome was hypoglycaemia (glucose < 70 mg/dl) within 24 h of the first order for insulin., Key Findings: The 24-h incidence of hypoglycaemia was 19.2% in cohort 2 versus 32.5% in cohort 1; the adjusted odds ratio was 0.46 (95% confidence interval (CI) [0.21, 0.98]; P = 0.047). The 24-h use of dextrose 50% in water (D50W) was also reduced in cohort 2. No differences were seen in anion gap or bicarbonate normalization, rebound hyperglycaemia or ICU length of stay. In most patients who became hypoglycaemic, the preceding glucose value was below 100 mg/dl., Conclusions: The insulin rate-reduction protocol was associated with less hypoglycaemia and no obvious disadvantage. Robust intervention for low-normal glucose values could plausibly achieve low hypoglycaemia rates with either approach., (© 2019 Royal Pharmaceutical Society.)
- Published
- 2019
- Full Text
- View/download PDF
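The two DKA order sets compared above differ only in whether the insulin rate drops after glucose falls. A toy sketch of the dosing rules as described (a hypothetical function; it ignores protocol details such as the reduction persisting once triggered and the concurrent dextrose infusion):

```python
def insulin_rate(weight_kg, glucose_mg_dl, cohort):
    """Insulin infusion rate (units/h) under the two order sets:
    cohort 1, fixed 0.1 unit/kg/h with no adjustment; cohort 2, reduced
    to 0.05 unit/kg/h once glucose reaches 200 mg/dl."""
    if cohort == 1:
        return 0.1 * weight_kg
    return (0.05 if glucose_mg_dl <= 200 else 0.1) * weight_kg
```

So an 80 kg patient receives 8 units/h throughout under cohort 1, but is halved to 4 units/h under cohort 2 once glucose reaches 200 mg/dl, the step associated with less hypoglycaemia.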
12. Predicting the risk of apparent treatment-resistant hypertension: a longitudinal, cohort study in an urban hypertension referral clinic.
- Author
-
Buhnerkempe MG, Botchway A, Nolasco Morales CE, Prakash V, Hedquist L, and Flack JM
- Abstract
Apparent treatment-resistant hypertension (aTRH) is associated with higher prevalence of secondary hypertension, greater risk for adverse pressure-related clinical outcomes, and influences diagnostic and therapeutic decision-making. We previously showed that cross-sectional prevalence estimates of aTRH are lower than its true prevalence as patients with uncontrolled hypertension undergoing intensification/optimization of therapy will, over time, increasingly satisfy diagnostic criteria for aTRH. aTRH was assessed in an urban referral hypertension clinic using a 140/90 mm Hg goal blood pressure target in 745 patients with uncontrolled blood pressure, who were predominantly African-American (86%) and female (65%). Analyses were stratified according to existing prescription of diuretic at initial visit. Risk for aTRH was estimated using logistic regression with patient characteristics at index visit as predictors. Among those prescribed diuretics, 84/363 developed aTRH; the risk score discriminated well (area under the receiver operating curve = 0.77, bootstrapped 95% CI [0.71, 0.81]). In patients not prescribed a diuretic, 44/382 developed aTRH, and the risk score showed a significantly better discriminative ability (area under the receiver operating curve = 0.82 [0.76, 0.87]; P < .001). In the diuretic and nondiuretic cohorts, 145/363 and 290/382 patients had estimated risks for development of aTRH <15%. Of these low-risk patients, 139/145 and 278/290 did not develop aTRH (negative predictive value: diuretics, 0.94 [0.91, 0.98]; no diuretics, 0.95 [0.93, 0.97]). We created a novel clinical score that discriminates well between those who will and will not develop aTRH, especially among those without existing diuretic prescriptions. Irrespective of baseline diuretic treatment status, a low-risk score had very high negative predictive value., (Copyright © 2018 American Heart Association. Published by Elsevier Inc. All rights reserved.)
- Published
- 2018
- Full Text
- View/download PDF
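The two headline metrics in the risk-score abstract above, area under the receiver operating curve and negative predictive value, have compact definitions worth making concrete. A self-contained sketch (the AUC example uses invented scores; the NPV call uses the counts reported in the abstract):

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve: the probability that a randomly chosen
    case outranks a randomly chosen non-case (ties count as 0.5)."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

def negative_predictive_value(true_neg, false_neg):
    """Fraction of low-risk (test-negative) calls that are truly negative."""
    return true_neg / (true_neg + false_neg)

# Invented risk scores for three cases and three non-cases.
a = auc([0.9, 0.8, 0.6], [0.7, 0.2, 0.1])  # 8/9

# Diuretic-cohort counts from the abstract: 139 of 145 low-risk patients
# did not develop aTRH.
npv = negative_predictive_value(139, 6)
```

The pairwise-comparison form of AUC is equivalent to the Mann-Whitney U statistic, which is how bootstrapped confidence intervals like those reported are typically obtained.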
13. Clay content and pH: soil characteristic associations with the persistent presence of chronic wasting disease in northern Illinois.
- Author
-
Dorak SJ, Green ML, Wander MM, Ruiz MO, Buhnerkempe MG, Tian T, Novakofski JE, and Mateus-Pinilla NE
- Subjects
- Animals, Animals, Wild, Deer, Environment, Hydrogen-Ion Concentration, Illinois, Clay chemistry, Models, Theoretical, Prions metabolism, Soil chemistry, Wasting Disease, Chronic metabolism
- Abstract
Environmental reservoirs are important to infectious disease transmission and persistence, but empirical analyses are relatively few. The natural environment is a reservoir for prions that cause chronic wasting disease (CWD) and influences the risk of transmission to susceptible cervids. Soil is one environmental component demonstrated to affect prion infectivity and persistence. Here we provide the first landscape predictive model for CWD based solely on soil characteristics. We built a boosted regression tree model to predict the probability of the persistent presence of CWD in a region of northern Illinois using CWD surveillance in deer and soils data. We evaluated the outcome for possible pathways by which soil characteristics may increase the probability of CWD transmission via environmental contamination. Soil clay content and pH were the most important predictive soil characteristics of the persistent presence of CWD. The results suggest that exposure to prions in the environment is greater where percent clay is less than 18% and soil pH is greater than 6.6. These characteristics could alter availability of prions immobilized in soil and contribute to the environmental risk factors involved in the epidemiological complexity of CWD infection in natural populations of white-tailed deer.
- Published
- 2017
- Full Text
- View/download PDF
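The CWD abstract above reports two soil thresholds associated with greater environmental prion exposure. A one-function sketch encoding those reported cut-points (this is a reading of the stated thresholds, not the fitted boosted regression tree model itself):

```python
def elevated_cwd_soil_risk(clay_pct, ph):
    """Flag soils matching the characteristics the study associates with
    greater prion exposure risk: clay content below 18% and pH above 6.6.
    Illustrative only; the paper's model is a boosted regression tree."""
    return clay_pct < 18.0 and ph > 6.6
```

A low-clay, slightly alkaline soil (e.g., 15% clay at pH 7.0) is flagged; higher clay content or more acidic soil is not.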
14. Detecting signals of chronic shedding to explain pathogen persistence: Leptospira interrogans in California sea lions.
- Author
-
Buhnerkempe MG, Prager KC, Strelioff CC, Greig DJ, Laake JL, Melin SR, DeLong RL, Gulland FM, and Lloyd-Smith JO
- Subjects
- Animals, California epidemiology, Female, Incidence, Leptospirosis epidemiology, Leptospirosis microbiology, Leptospirosis transmission, Male, Models, Theoretical, Prevalence, Seasons, Disease Outbreaks veterinary, Leptospira interrogans physiology, Leptospirosis veterinary, Sea Lions, Virus Shedding
- Abstract
Identifying mechanisms driving pathogen persistence is a vital component of wildlife disease ecology and control. Asymptomatic, chronically infected individuals are an oft-cited potential reservoir of infection, but demonstrations of the importance of chronic shedding to pathogen persistence at the population level remain scarce. Studying chronic shedding using commonly collected disease data is hampered by numerous challenges, including short-term surveillance that focuses on single epidemics and acutely ill individuals, the subtle dynamical influence of chronic shedding relative to more obvious epidemic drivers, and poor ability to differentiate between the effects of population prevalence of chronic shedding vs. intensity and duration of chronic shedding in individuals. We use chronic shedding of Leptospira interrogans serovar Pomona in California sea lions (Zalophus californianus) as a case study to illustrate how these challenges can be addressed. Using leptospirosis-induced strandings as a measure of disease incidence, we fit models with and without chronic shedding, and with different seasonal drivers, to determine the time-scale over which chronic shedding is detectable and the interactions between chronic shedding and seasonal drivers needed to explain persistence and outbreak patterns. Chronic shedding can enable persistence of L. interrogans within the sea lion population. However, the importance of chronic shedding was only apparent when surveillance data included at least two outbreaks and the intervening inter-epidemic trough during which fadeout of transmission was most likely. Seasonal transmission, as opposed to seasonal recruitment of susceptibles, was the dominant driver of seasonality in this system, and both seasonal factors had limited impact on long-term pathogen persistence. 
We show that the temporal extent of surveillance data can have a dramatic impact on inferences about population processes, where the failure to identify both short- and long-term ecological drivers can have cascading impacts on understanding higher order ecological phenomena, such as pathogen persistence., (© 2017 The Authors. Journal of Animal Ecology © 2017 British Ecological Society.)
- Published
- 2017
- Full Text
- View/download PDF
15. Inferring infection hazard in wildlife populations by linking data across individual and population scales.
- Author
-
Pepin KM, Kay SL, Golas BD, Shriner SS, Gilbert AT, Miller RS, Graham AL, Riley S, Cross PC, Samuel MD, Hooten MB, Hoeting JA, Lloyd-Smith JO, Webb CT, and Buhnerkempe MG
- Subjects
- Age Factors, Animals, Antibodies, Viral analysis, Computer Simulation, Cross-Sectional Studies, Influenza A virus physiology, Influenza in Birds virology, Longitudinal Studies, Northwest Territories epidemiology, Plague epidemiology, Plague microbiology, Poultry Diseases virology, Prevalence, Risk Assessment methods, Seroepidemiologic Studies, Yersinia pestis physiology, Coyotes, Ducks, Epidemiologic Methods veterinary, Geese, Influenza in Birds epidemiology, Plague veterinary, Poultry Diseases epidemiology
- Abstract
Our ability to infer unobservable disease-dynamic processes such as the force of infection (FOI; the infection hazard for susceptible hosts) has transformed our understanding of disease transmission mechanisms and capacity to predict disease dynamics. Conventional methods for inferring FOI estimate a time-averaged value and are based on population-level processes. Because many pathogens exhibit epidemic cycling and FOI is the result of processes acting across the scales of individuals and populations, a flexible framework that extends to epidemic dynamics and links within-host processes to FOI is needed. Specifically, within-host antibody kinetics in wildlife hosts can be short-lived and produce patterns that are repeatable across individuals, suggesting individual-level antibody concentrations could be used to infer time since infection and hence FOI. Using simulations and case studies (influenza A in lesser snow geese and Yersinia pestis in coyotes), we argue that with careful experimental and surveillance design, the population-level FOI signal can be recovered from individual-level antibody kinetics, despite substantial individual-level variation. In addition to improving inference, the cross-scale quantitative antibody approach we describe can reveal insights into drivers of individual-based variation in disease response, and the role of poorly understood processes such as secondary infections, in population-level dynamics of disease., (© 2017 John Wiley & Sons Ltd/CNRS.)
- Published
- 2017
- Full Text
- View/download PDF
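The key idea above, inverting antibody kinetics to recover time since infection, has a simple closed form under the toy assumption of exponential antibody decay from a known post-infection peak (an illustrative assumption; the paper's within-host models are richer than this):

```python
import math

def time_since_infection(titer_now, titer_peak, half_life_days):
    """Invert exponential antibody decay, A(t) = A_peak * 2**(-t / half_life),
    to estimate days elapsed since the post-infection peak. Assumes decay
    from a known peak titer at a known half-life; both are idealizations."""
    return half_life_days * math.log2(titer_peak / titer_now)

# An animal sampled at titer 200, assuming a peak of 1600 and a 30-day
# half-life, was at peak ~90 days ago (three half-lives).
t = time_since_infection(200, 1600, 30)  # 90.0
```

Aggregating such individual times since infection across a cross-sectional sample is what lets the population-level FOI signal be reconstructed.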
16. Epidemiological models to control the spread of information in marine mammals.
- Author
-
Schakner ZA, Buhnerkempe MG, Tennis MJ, Stansell RJ, van der Leeuw BK, Lloyd-Smith JO, and Blumstein DT
- Subjects
- Animal Communication, Animals, Animals, Wild, Conservation of Natural Resources, Feeding Behavior, Learning, Sea Lions physiology
- Abstract
Socially transmitted wildlife behaviours that create human-wildlife conflict are an emerging problem for conservation efforts, but also provide a unique opportunity to apply principles of infectious disease control to wildlife management. As an example, California sea lions (Zalophus californianus) have learned to exploit concentrations of migratory adult salmonids below the fish ladders at Bonneville Dam, impeding endangered salmonid recovery. Proliferation of this foraging behaviour in the sea lion population has resulted in a controversial culling programme of individual sea lions at the dam, but the impact of such culling remains unclear. To evaluate the effectiveness of current and alternative culling strategies, we used network-based diffusion analysis on a long-term dataset to demonstrate that social transmission is implicated in the increase in dam-foraging behaviour and then studied different culling strategies within an epidemiological model of the behavioural transmission data. We show that current levels of lethal control have substantially reduced the rate of social transmission, but failed to effectively reduce overall sea lion recruitment. Earlier implementation of culling could have substantially reduced the extent of behavioural transmission and, ultimately, resulted in fewer animals being culled. Epidemiological analyses offer a promising tool to understand and control socially transmissible behaviours., (© 2016 The Author(s).)
- Published
- 2016
- Full Text
- View/download PDF
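The epidemiological framing above, behaviour spreading like an infection while culling removes "infectious" animals, can be caricatured with a deterministic susceptible-informed model. All parameter values below are invented for illustration, not fitted to the Bonneville Dam data:

```python
def simulate_behaviour_spread(n_steps, n_total, i0, beta, cull_rate):
    """Discrete-time susceptible-informed model of a socially transmitted
    foraging behaviour. Each step, naive animals learn the behaviour via
    mass-action social transmission, then a fraction of informed animals
    is culled. Toy model with invented parameters."""
    s, i = float(n_total - i0), float(i0)
    for _ in range(n_steps):
        new_informed = beta * s * i / n_total  # social transmission
        s -= new_informed
        i += new_informed
        i -= cull_rate * i                     # lethal removal of informed
    return i

no_cull = simulate_behaviour_spread(50, 1000, 5, 0.3, 0.0)
culled = simulate_behaviour_spread(50, 1000, 5, 0.3, 0.1)
```

Comparing the two runs shows the paper's qualitative point: culling reduces the final number of informed animals, and, as in disease control, intervening earlier (while the informed class is still small) is far more effective per animal removed.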
17. Mapping U.S. cattle shipment networks: Spatial and temporal patterns of trade communities from 2009 to 2011.
- Author
-
Gorsich EE, Luis AD, Buhnerkempe MG, Grear DA, Portacci K, Miller RS, and Webb CT
- Subjects
- Animal Husbandry economics, Animals, Cattle, Female, Longitudinal Studies, Male, Models, Theoretical, Seasons, Spatial Analysis, United States, Animal Husbandry methods, Commerce, Transportation
- Abstract
The application of network analysis to cattle shipments broadens our understanding of shipment patterns beyond pairwise interactions to the network as a whole. Such a quantitative description of cattle shipments in the U.S. can identify trade communities, describe temporal shipment patterns, and inform the design of disease surveillance and control strategies. Here, we analyze a longitudinal dataset of beef and dairy cattle shipments from 2009 to 2011 in the United States to characterize communities within the broader cattle shipment network, which are groups of counties that ship mostly to each other. Because shipments occur over time, we aggregate the data at various temporal scales to examine the consistency of network and community structure over time. Our results identified nine large (>50 counties) communities based on shipments of beef cattle in 2009 aggregated into an annual network and nine large communities based on shipments of dairy cattle. The size and connectance of the shipment network was highly dynamic; monthly networks were smaller than yearly networks and revealed seasonal shipment patterns consistent across years. Comparison of the shipment network over time showed largely consistent shipping patterns, such that communities identified on annual networks of beef and dairy shipments from 2009 still represented 41-95% of shipments in monthly networks from 2009 and 41-66% of shipments from networks in 2010 and 2011. The temporal aspects of cattle shipments suggest that future applications of the U.S. cattle shipment network should consider seasonal shipment patterns. However, the consistent within-community shipping patterns indicate that yearly communities could provide a reasonable way to group regions for management., (Copyright © 2016 Elsevier B.V. All rights reserved.)
- Published
- 2016
- Full Text
- View/download PDF
18. Identification of migratory bird flyways in North America using community detection on biological networks.
- Author
-
Buhnerkempe MG, Webb CT, Merton AA, Buhnerkempe JE, Givens GH, Miller RS, and Hoeting JA
- Subjects
- Animals, Ducks classification, Environmental Monitoring, North America, Species Specificity, Time Factors, Animal Migration, Ducks physiology, Models, Biological
- Abstract
Migratory behavior of waterfowl populations in North America has traditionally been broadly characterized by four north-south flyways, and these flyways have been central to the management of waterfowl populations for more than 80 yr. However, previous flyway characterizations are not easily updated with current bird movement data and fail to provide assessments of the importance of specific geographical regions to the identification of flyways. Here, we developed a network model of migratory movement for four waterfowl species, Mallard (Anas platyrhynchos), Northern Pintail (A. acuta), American Green-winged Teal (A. carolinensis), and Canada Goose (Branta canadensis), in North America, using bird band and recovery data. We then identified migratory flyways using a community detection algorithm and characterized the importance of smaller geographic regions in identifying flyways using a novel metric, the consolidation factor. We identified four main flyways for Mallards, Northern Pintails, and American Green-winged Teal, with the flyway identification in Canada Geese exhibiting higher complexity. For Mallards, flyways were relatively consistent through time. However, consolidation factors revealed that for Mallards and Green-winged Teal, the presumptive Mississippi flyway was potentially a zone of high mixing between other flyways. Our results demonstrate that the network approach provides a robust method for flyway identification that is widely applicable given the relatively minimal data requirements and is easily updated with future movement data to reflect changes in flyway definitions and management goals.
- Published
- 2016
- Full Text
- View/download PDF
19. Mapping influenza transmission in the ferret model to transmission in humans.
- Author
-
Buhnerkempe MG, Gostic K, Park M, Ahsan P, Belser JA, and Lloyd-Smith JO
- Subjects
- Animals, Disease Models, Animal, Ferrets, Humans, Models, Theoretical, Influenza, Human transmission, Orthomyxoviridae Infections transmission
- Abstract
The controversy surrounding 'gain-of-function' experiments on high-consequence avian influenza viruses has highlighted the role of ferret transmission experiments in studying the transmission potential of novel influenza strains. However, the mapping between influenza transmission in ferrets and in humans is unsubstantiated. We address this gap by compiling and analyzing 240 estimates of influenza transmission in ferrets and humans. We demonstrate that estimates of ferret secondary attack rate (SAR) explain 66% of the variation in human SAR estimates at the subtype level. Further analysis shows that ferret transmission experiments have potential to identify influenza viruses of concern for epidemic spread in humans, though small sample sizes and biological uncertainties prevent definitive classification of human transmissibility. Thus, ferret transmission experiments provide valid predictions of pandemic potential of novel influenza strains, though results should continue to be corroborated by targeted virological and epidemiological research.
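The subtype-level mapping reported in this abstract (ferret SAR explaining 66% of the variation in human SAR) is a regression R². The sketch below shows how such an R² is computed; the data points are invented solely for illustration and are not the compiled estimates from the study.

```python
# Univariate OLS fit of human secondary attack rate (SAR) on ferret SAR,
# with R^2 computed from residual and total sums of squares.
# Both arrays are hypothetical.
import numpy as np

ferret_sar = np.array([0.10, 0.25, 0.40, 0.55, 0.75])  # hypothetical
human_sar  = np.array([0.08, 0.20, 0.45, 0.50, 0.70])  # hypothetical

# np.polyfit returns coefficients highest degree first: [slope, intercept]
b, a = np.polyfit(ferret_sar, human_sar, 1)
pred = a + b * ferret_sar
ss_res = np.sum((human_sar - pred) ** 2)
ss_tot = np.sum((human_sar - human_sar.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"slope={b:.2f}, R^2={r_squared:.2f}")
```

An R² of 0.66 on the real estimates means ferret SAR captures most, but not all, of the between-subtype variation in human SAR, which is why the abstract still calls for virological and epidemiological corroboration.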
- Published
- 2015
- Full Text
- View/download PDF
20. Eight challenges in modelling disease ecology in multi-host, multi-agent systems.
- Author
-
Buhnerkempe MG, Roberts MG, Dobson AP, Heesterbeek H, Hudson PJ, and Lloyd-Smith JO
- Subjects
- Communicable Diseases transmission, Ecology, Food Chain, Host-Pathogen Interactions, Humans, Life Cycle Stages, Models, Statistical, Population Dynamics, Communicable Diseases epidemiology
- Abstract
Many disease systems exhibit complexities not captured by current theoretical and empirical work. In particular, systems with multiple host species and multiple infectious agents (i.e., multi-host, multi-agent systems) require novel methods to extend the wealth of knowledge acquired studying primarily single-host, single-agent systems. We outline eight challenges in multi-host, multi-agent systems that could substantively increase our knowledge of the drivers and broader ecosystem effects of infectious disease dynamics., (Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.)
- Published
- 2015
- Full Text
- View/download PDF
21. Utility of mosquito surveillance data for spatial prioritization of vector control against dengue viruses in three Brazilian cities.
- Author
-
Pepin KM, Leach CB, Marques-Toledo C, Laass KH, Paixao KS, Luis AD, Hayman DT, Johnson NG, Buhnerkempe MG, Carver S, Grear DA, Tsao K, Eiras AE, and Webb CT
- Subjects
- Animals, Brazil epidemiology, Cities epidemiology, Dengue transmission, Humans, Models, Statistical, Spatio-Temporal Analysis, Dengue epidemiology, Dengue prevention & control, Disease Transmission, Infectious prevention & control, Epidemiological Monitoring, Health Care Rationing methods, Mosquito Control methods
- Abstract
Background: Vector control remains the primary defense against dengue fever. Its success relies on the assumption that vector density is related to disease transmission. Two operational issues include the amount by which mosquito density should be reduced to minimize transmission and the spatio-temporal allotment of resources needed to reduce mosquito density in a cost-effective manner. Recently, a novel technology, MI-Dengue, was implemented city-wide in several Brazilian cities to provide real-time mosquito surveillance data for spatial prioritization of vector control resources. We sought to understand the role of city-wide mosquito density data in predicting disease incidence in order to provide guidance for prioritization of vector control work., Methods: We used hierarchical Bayesian regression modeling to examine the role of city-wide vector surveillance data in predicting human cases of dengue fever in space and time. We used four years of weekly surveillance data from Vitoria city, Brazil, to identify the best model structure. We tested effects of vector density, lagged case data and spatial connectivity. We investigated the generality of the best model using an additional year of data from Vitoria and two years of data from other Brazilian cities: Governador Valadares and Sete Lagoas., Results: We found that city-wide, neighborhood-level averages of household vector density were a poor predictor of dengue-fever cases in the absence of accounting for interactions with human cases. Effects of city-wide spatial patterns were stronger than within-neighborhood or nearest-neighborhood effects. 
Readily available proxies of spatial relationships between human cases, such as economic status, population density, or between-neighborhood roadway distance, did not explain spatial patterns in cases better than unweighted global effects., Conclusions: For spatial prioritization of vector control, city-wide spatial effects should be given more weight than within-neighborhood or nearest-neighborhood connections, in order to minimize city-wide cases of dengue fever. More research is needed to determine which data could best inform city-wide connectivity. Once these data become available, MI-Dengue may be even more effective if vector control is spatially prioritized by considering city-wide connectivity between cases together with information on the location of mosquito density and infected mosquitoes.
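The core modeling finding above, that lagged case data predict current cases while vector density alone does not, can be illustrated with a deliberately simple comparison. This is an ordinary least-squares stand-in for the study's hierarchical Bayesian model; the weekly series are simulated, and `vector_density` is constructed to be uninformative by design.

```python
# Compare R^2 of two univariate predictors of this week's dengue cases:
# last week's cases (transmission signal) vs. a vector-density covariate.
import numpy as np

rng = np.random.default_rng(0)
weeks = 60
cases = np.zeros(weeks)
cases[0] = 5.0
for t in range(1, weeks):
    # this week's cases depend on last week's cases plus noise
    cases[t] = max(0.0, 0.9 * cases[t - 1] + rng.normal(2.0, 1.0))
vector_density = rng.normal(10.0, 2.0, weeks)  # independent of cases

def r2(x, y):
    """R^2 of a univariate OLS fit y ~ a + b*x."""
    b, a = np.polyfit(x, y, 1)
    resid = y - (a + b * x)
    return 1.0 - resid.var() / y.var()

lag_r2 = r2(cases[:-1], cases[1:])
vec_r2 = r2(vector_density[1:], cases[1:])
print("lagged-case R^2   :", round(lag_r2, 2))
print("vector-density R^2:", round(vec_r2, 2))
```

In this toy setup the lagged-case predictor dominates, echoing the abstract's conclusion that vector density is a poor predictor unless interactions with human cases are accounted for.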
- Published
- 2015
- Full Text
- View/download PDF
22. Antibiotic Efficacy in Eliminating Leptospiruria in California Sea Lions (Zalophus californianus) Stranding with Leptospirosis.
- Author
-
Prager KC, Alt DP, Buhnerkempe MG, Greig DJ, Galloway RL, Wu Q, Gulland FMD, and Lloyd-Smith JO
- Abstract
Stranded California sea lions (Zalophus californianus) along the California coast have been diagnosed with leptospirosis every year since at least the 1980s. Between September 2010 and November 2011, we followed 14 stranded California sea lions that survived to release and evaluated antibiotic efficacy in eliminating leptospiruria (urinary shedding of leptospires). Leptospiruria was assessed by real-time PCR of urine and urine culture, with persistence assessed using longitudinally collected samples. Serum chemistry was used to assess recovery of normal renal function. Microscopic agglutination testing (MAT) was performed to assess serum anti-Leptospira antibody titers, and the MAT reactivity patterns were consistent with L. interrogans serovar Pomona infection frequently observed in this population. Animals were initially treated for 6 to 16 d (median = 10.5; mean = 10.8) with antibiotics from the penicillin family, with some receiving additional antibiotics to treat other medical conditions. All urine cultures were negative; therefore, the presence of leptospiruria was assessed using PCR. Leptospiruria continued beyond the initial course of penicillin family antibiotics in 13 of the 14 sea lions, beyond the last antibiotic dose in 11 of the 14 sea lions, beyond recovery of renal function in 13 of the 14 sea lions, and persisted for at least 8 to 86 d (median = 45; mean = 46.8). Five animals were released with no negative urine PCR results detected; thus, their total shedding duration may have been longer. Cessation of leptospiruria was more likely in animals that received antibiotics for a greater duration, especially if coverage was uninterrupted. Real-time PCR results indicate that an antibiotic protocol commonly used to treat leptospirosis in rehabilitating California sea lions does not eliminate leptospiruria.
It is possible that antibiotic protocols given for a longer duration and/or including other antibiotics may be effective in eliminating leptospiruria. These results may have important human and animal health implications, especially in rehabilitation facilities, as Leptospira transmission may occur through contact with animals with persistent leptospiruria.
- Published
- 2015
- Full Text
- View/download PDF
23. The impact of movements and animal density on continental scale cattle disease outbreaks in the United States.
- Author
-
Buhnerkempe MG, Tildesley MJ, Lindström T, Grear DA, Portacci K, Miller RS, Lombard JE, Werkman M, Keeling MJ, Wennergren U, and Webb CT
- Subjects
- Animals, Cattle, Cattle Diseases prevention & control, Cattle Diseases transmission, Disease Outbreaks prevention & control, Geography, Models, Biological, Population Density, Principal Component Analysis, Risk Factors, United States epidemiology, Cattle Diseases epidemiology, Disease Outbreaks veterinary, Movement
- Abstract
Globalization has increased the potential for the introduction and spread of novel pathogens over large spatial scales necessitating continental-scale disease models to guide emergency preparedness. Livestock disease spread models, such as those for the 2001 foot-and-mouth disease (FMD) epidemic in the United Kingdom, represent some of the best case studies of large-scale disease spread. However, generalization of these models to explore disease outcomes in other systems, such as the United States' cattle industry, has been hampered by differences in system size and complexity and the absence of suitable livestock movement data. Here, a unique database of US cattle shipments allows estimation of synthetic movement networks that inform a near-continental scale disease model of a potential FMD-like (i.e., rapidly spreading) epidemic in US cattle. The largest epidemics may affect over one-third of the US and 120,000 cattle premises, but cattle movement restrictions from infected counties, as opposed to national movement moratoriums, are found to effectively contain outbreaks. Slow detection or weak compliance may necessitate more severe state-level bans for similar control. Such results highlight the role of large-scale disease models in emergency preparedness, particularly for systems lacking comprehensive movement and outbreak data, and the need to rapidly implement multi-scale contingency plans during a potential US outbreak.
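The qualitative result above, that movement restrictions from infected counties contain spread while delays weaken control, can be reproduced in a toy metapopulation simulation. Counties, contact structure, and rates below are invented; this is not the study's model, just a minimal sketch of the mechanism.

```python
# Toy outbreak on a random shipment network: compare unrestricted spread
# with banning outbound shipments shortly after a county is infected.
import random

def simulate(n_counties=300, contacts=3, p_transmit=0.4,
             detection_delay=None, steps=25, seed=2):
    """Outbound shipments from a county are banned `detection_delay`
    steps after it becomes infected (None = never banned)."""
    random.seed(seed)
    # each county ships to a fixed random set of partner counties
    partners = {i: random.sample(range(n_counties), contacts)
                for i in range(n_counties)}
    infected = {0: 0}  # county -> step at which it became infected
    for step in range(1, steps + 1):
        new = {}
        for county, t0 in infected.items():
            if detection_delay is not None and step - t0 > detection_delay:
                continue  # county detected; its outbound shipments banned
            for dest in partners[county]:
                if dest not in infected and random.random() < p_transmit:
                    new[dest] = step
        infected.update(new)
    return len(infected)

print("no restrictions:", simulate())
print("bans 1 step after infection:", simulate(detection_delay=1))
```

With no restrictions the outbreak sweeps most of the network; banning shipments one step after infection sharply limits the final size, while longer delays (larger `detection_delay`) erode that benefit.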
- Published
- 2014
- Full Text
- View/download PDF
24. A national-scale picture of U.S. cattle movements obtained from Interstate Certificate of Veterinary Inspection data.
- Author
-
Buhnerkempe MG, Grear DA, Portacci K, Miller RS, Lombard JE, and Webb CT
- Subjects
- Animals, Certification, Models, Theoretical, United States, Animal Husbandry, Cattle, Transportation
- Abstract
We present the first comprehensive description of how shipments of cattle connect the geographic extent and production diversity of the United States cattle industry. We built a network of cattle movement from a state-stratified 10% systematic sample of calendar year 2009 Interstate Certificates of Veterinary Inspection (ICVI) data. ICVIs are required to certify the apparent health of cattle moving across state borders and allow us to examine cattle movements at the county scale. The majority of the ICVI sample consisted of small shipments (<20 head) moved for feeding and beef production. Geographically, the central plains states had the most connections, which correlated with feeding infrastructure. The entire nation was closely connected when interstate movements were summarized at the state level. At the county level, the U.S. is still well connected geographically, but significant heterogeneities in the location and identity of counties central to the network emerge. Overall, the network of interstate movements is described by a hub structure, with a few counties sending or receiving extremely large numbers of shipments and many counties sending and receiving few shipments. The county-level network also has a very low proportion of reciprocal movements, indicating that high-order network properties may be better at describing a county's importance than simple summaries of the number of shipments or animals sent and received. We suggest that summarizing cattle movements at the state level homogenizes the network and a county-level approach is most appropriate for examining processes influenced by cattle shipments, such as economic analyses and disease outbreaks., (Published by Elsevier B.V.)
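The two county-level summaries highlighted in this abstract, hub structure and the proportion of reciprocal movements, are standard directed-network statistics. The sketch below computes both on an invented miniature network; real input would be county-to-county shipment records derived from ICVIs.

```python
# Hub structure (one county with very many outbound shipments) and
# reciprocity (fraction of edges that point both ways) on a toy network.
import networkx as nx

edges = [("hub", f"c{i}") for i in range(10)]        # one heavy shipper
edges += [("c1", "c2"), ("c2", "c1"), ("c3", "c4")]  # one reciprocal pair
G = nx.DiGraph(edges)

out_degrees = dict(G.out_degree())
print("max out-degree:", max(out_degrees.values()))   # the hub dominates
print("reciprocity   :", round(nx.reciprocity(G), 3)) # low, as in the data
```

Here only 2 of 13 edges are reciprocated, so reciprocity is about 0.154; a skewed out-degree distribution plus low reciprocity is the kind of signature the abstract argues simple shipment counts would miss.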
- Published
- 2013
- Full Text
- View/download PDF
25. Assessment of paper interstate certificates of veterinary inspection used to support disease tracing in cattle.
- Author
-
Portacci K, Miller RS, Riggs PD, Buhnerkempe MG, and Abrahamsen LM
- Subjects
- Animals, Cattle, Certification methods, Risk Management methods, United States epidemiology, Veterinary Medicine standards, Cattle Diseases epidemiology, Certification standards, Communicable Disease Control methods, Veterinary Medicine methods
- Abstract
Objective: To evaluate the differences among each state's Interstate Certificate of Veterinary Inspection (ICVI) form and the legibility of data on paper ICVIs used to support disease tracing in cattle., Design: Descriptive retrospective cross-sectional study., Sample: Examples of ICVIs from 50 states and 7,630 randomly sampled completed paper ICVIs for cattle from 48 states., Procedures: Differences among paper ICVI forms from all 50 states were determined. Sixteen data elements were selected for further evaluation of their value in tracing cattle. Completed paper ICVIs for interstate cattle exports in 2009 were collected from 48 states. Each of the 16 data elements was recorded as legible, absent, or illegible on forms completed by accredited veterinarians, and results were summarized by state. Mean values for legibility at the state level were used to estimate legibility of data at the national level., Results: ICVIs were inconsistent among states in regard to data elements requested and availability of legible records. A mean ± SD of 70.0 ± 22.1% of ICVIs in each state had legible origin address information. Legible destination address information was less common, with 55.0 ± 21.4% of records complete. Incomplete address information was most often a result of the field having been left blank. Official animal identification was present on 33.1% of ICVIs., Conclusions and Clinical Relevance: The inconsistency among state ICVI forms and quality of information provided on paper ICVIs could lead to delays and the need for additional resources to trace cattle, which could result in continued spread of disease. Standardized ICVIs among states and more thorough recording of information by accredited veterinarians or expanded usage of electronic ICVIs could enhance traceability of cattle during an outbreak.
- Published
- 2013
- Full Text
- View/download PDF
26. Transmission shifts underlie variability in population responses to Yersinia pestis infection.
- Author
-
Buhnerkempe MG, Eisen RJ, Goodell B, Gage KL, Antolin MF, and Webb CT
- Subjects
- Animals, Endemic Diseases, Plague epidemiology, Siphonaptera microbiology, Models, Theoretical, Plague transmission, Yersinia pestis pathogenicity
- Abstract
Host populations for the plague bacterium, Yersinia pestis, are highly variable in their response to plague, ranging from near-deterministic extinction (i.e., epizootic dynamics) to a low probability of extinction despite persistent infection (i.e., enzootic dynamics). Much of the work to understand this variability has focused on specific host characteristics, such as population size and resistance, and their role in determining plague dynamics. Here, however, we advance the idea that the relative importance of alternative transmission routes may vary, causing shifts from epizootic to enzootic dynamics. We present a model that incorporates host and flea ecology with multiple transmission hypotheses to study how transmission shifts determine population responses to plague. Our results suggest enzootic persistence relies on infection of an off-host flea reservoir and epizootics rely on transiently maintained flea infection loads through repeated infectious feeds by fleas. In either case, early-phase transmission by fleas (i.e., transmission immediately following an infected blood meal) has been observed in laboratory studies, and we show that it is capable of driving plague dynamics at the population level. Sensitivity analysis of model parameters revealed that host characteristics (e.g., population size and resistance) vary in importance depending on transmission dynamics, suggesting that host ecology may scale differently through different transmission routes enabling prediction of population responses in a more robust way than using either host characteristics or transmission shifts alone.
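The contrast between epizootic and enzootic outcomes under different transmission routes can be caricatured with a two-route toy model: strong direct (flea-mediated, early-phase) transmission burns through the host population, while weak direct transmission buffered by a slowly decaying off-host flea reservoir keeps infection smoldering with little host loss. All rates are invented for illustration; this is not the authors' model.

```python
# Discrete-time toy model with two transmission routes: direct
# host-to-host (via fleas) and an off-host flea reservoir.

def run(beta_direct, beta_reservoir, steps=200):
    S, I, R_env = 1000.0, 1.0, 0.0  # susceptible hosts, infected hosts, reservoir
    for _ in range(steps):
        # force of infection from infected hosts and from the reservoir
        new_inf = min(S, S * (beta_direct * I + beta_reservoir * R_env) / 1000.0)
        S -= new_inf
        R_env = 0.95 * R_env + 0.5 * I   # fleas enter reservoir; slow decay
        I = 0.5 * I + new_inf            # infected hosts die/recover quickly
    return S, I

S_epi, I_epi = run(beta_direct=3.0, beta_reservoir=0.0)    # epizootic-like
S_enz, I_enz = run(beta_direct=0.2, beta_reservoir=0.02)   # enzootic-like
print(f"epizootic: {S_epi:.0f} hosts remain")
print(f"enzootic : {S_enz:.0f} hosts remain, I={I_enz:.3f} still circulating")
```

In the first run, direct transmission alone drives a rapid crash of the host population; in the second, the reservoir term sustains low-level infection over the simulated period while most hosts survive, echoing the epizootic/enzootic dichotomy described above.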
- Published
- 2011
- Full Text
- View/download PDF