1,706 results for "Kaplan, David"
Search Results
2. Conditions for extrapolating differences in consumption to differences in welfare.
- Author
- Zhao, Wei and Kaplan, David M.
- Subjects
- STOCHASTIC dominance, EXPECTED utility, SOCIAL dominance, UTILITY functions, PHYSICAL distribution of goods
- Abstract
We characterize conditions under which a better consumption distribution implies higher welfare. Specifically, here "better consumption" means first‐order stochastic dominance, and "higher welfare" means higher expected utility for every subpopulation of individuals with the same utility function. Although this implication seems natural, we first provide a counterexample wherein better consumption risk allocation outweighs lower consumption. We then show that higher expected utility results from higher consumption in different settings, including fixed dependence (fixed copula) between consumption and individual risk preferences, or alternatively using the rank invariance assumption from the treatment effects literature. These are discussed in several real‐world examples. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. The Impact of Weight Loss Programs on Body Mass Index Trajectory in Patients With Metabolic Dysfunction-Associated Steatotic Liver Disease: A Veterans Health Administration Study.
- Author
- Tang, Helen, Kaplan, David E., and Mahmud, Nadim
- Subjects
- WEIGHT loss, VETERANS' health, BODY mass index, LIVER diseases
- Abstract
INTRODUCTION: Weight loss is the mainstay of management for patients with metabolic dysfunction-associated steatotic liver disease (MASLD). We studied the impact of referral to MOVE!, a nationally implemented behavioral weight loss program, on weight in patients with MASLD. METHODS: This retrospective cohort study included 102,294 patients with MASLD from 125 Veterans Health Administration centers from 2008 to 2022. RESULTS: Most patients lost no significant weight or gained weight. Increased engagement with MOVE! was associated with a greater hazard of significant weight loss compared with no engagement. DISCUSSION: A minority of patients experienced significant weight loss through 5 years using lifestyle interventions alone. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. Linking temperature sensitivity of mangrove communities, populations and individuals across a tropical‐temperate transitional zone.
- Author
- Kang, Yiyang, Kaplan, David A., and Osland, Michael J.
- Subjects
- MANGROVE plants, COASTAL wetlands, MANGROVE ecology, ATMOSPHERIC temperature, FIELD research, SALT marshes, RHIZOPHORA, CLIMATE change
- Abstract
Climate change is reshaping coastal wetlands worldwide, driving ecosystem shifts like mangrove poleward expansion into saltmarshes in tropical‐temperate transitional zones. Though warming is recognized as the primary driver, a lack of detailed field studies limits our ability to predict mangrove responses to rapid climate warming. Here, we characterized how mangroves vary across a temperature gradient at 18 sites along Florida's Gulf of Mexico coast (USA). We used minimum air temperature (Tmin) derived from daily data from 1989 to 2021 as the independent variable and applied plot‐based and synoptic approaches to quantify species‐specific mangrove variation at community, population, and individual levels. We then used these results to spatially project future mangrove ecosystem properties under multiple warming scenarios. Across the Tmin gradient from −10.8 to −1.4°C, mangrove canopy height and coverage ranged from 0.4 to 11.5 m and 15% to 98%, respectively, with both exhibiting sigmoidal increases with Tmin. Estimated mangrove aboveground biomass ranged from 0 to 496.7 Mg/ha and showed a positive linear relationship with Tmin due both to the tall tree stratum's increased biomass per tree and higher abundance. While the population abundance and coverage of Rhizophora mangle and Laguncularia racemosa had positive linear relationships with Tmin, Avicennia germinans exhibited a significant quadratic relationship, reflecting the higher freeze tolerance of this species. Such tolerance may stem from A. germinans' higher morphological plasticity observed at the individual level, adapting to cold stress by exhibiting a more shrub‐like architecture at colder sites. Based on these field‐derived quantitative relationships, we projected substantial increases in mangrove coverage and canopy height near current range limits, with tall A. germinans dominating in the north and R. mangle dominating the centre and south of the study region. Synthesis. To better predict the ecological consequences in coastal wetlands under future climate change, it is essential to understand how mangroves respond to winter temperature regimes across a temperature gradient. Collectively, these cross‐level and species‐specific results advance our understanding of mangrove temperature sensitivity and provide information about the future of coastal wetland structure and function in response to a changing climate. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. Phenotyping Hepatic Immune-Related Adverse Events in the Setting of Immune Checkpoint Inhibitor Therapy.
- Author
- Feldman, Theodore C., Kaplan, David E., Lin, Albert, La, Jennifer, Lee, Jerry S.H., Aljehani, Mayada, Tuck, David P., Brophy, Mary T., Fillmore, Nathanael R., and Do, Nhan V.
- Subjects
- DRUG side effects, IMMUNE checkpoint inhibitors, BIG data, HEPATOCELLULAR carcinoma
- Abstract
PURPOSE: We present and validate a rule-based algorithm for the detection of moderate to severe liver-related immune-related adverse events (irAEs) in a real-world patient cohort. The algorithm can be applied to studies of irAEs in large data sets. METHODS: We developed a set of criteria to define hepatic irAEs. The criteria include: the temporality of elevated laboratory measurements in the first 2-14 weeks of immune checkpoint inhibitor (ICI) treatment, steroid intervention within 2 weeks of the onset of elevated laboratory measurements, and intervention with a duration of at least 2 weeks. These criteria are based on the kinetics of patients who experienced moderate to severe hepatotoxicity (Common Terminology Criteria for Adverse Events grades 2-4). We applied these criteria to a retrospective cohort of 682 patients diagnosed with hepatocellular carcinoma and treated with ICI. All patients were required to have baseline laboratory measurements before and after the initiation of ICI. RESULTS: A set of 63 equally sampled patients were reviewed by two blinded, clinical adjudicators. Disagreements were reviewed and consensus was taken to be the ground truth. Of these, 25 patients with irAEs were identified, 16 were determined to be hepatic irAEs, 36 patients were nonadverse events, and two patients were of indeterminant status. Reviewers agreed in 44 of 63 patients, including 19 patients with irAEs (0.70 concordance, Fleiss' kappa: 0.43). By comparison, the algorithm achieved a sensitivity and specificity of identifying hepatic irAEs of 0.63 and 0.81, respectively, with a test efficiency (percent correctly classified) of 0.78 and outcome-weighted F1 score of 0.74. CONCLUSION: The algorithm achieves greater concordance with the ground truth than either individual clinical adjudicator for the detection of irAEs. In short, three (sometimes four) rules suffice to identify hepatic immune-related adverse events in the setting of ICI. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
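The three rule-based criteria summarized in this abstract (timing of lab elevation, prompt steroid intervention, sustained steroid course) can be sketched as a simple date check. This is a hypothetical reading of the published rules; the function name and example timeline are illustrative, and the full algorithm also uses laboratory magnitudes and administrative coding criteria not shown here:

```python
from datetime import date, timedelta

def flag_hepatic_irae(ici_start, elevation_onset, steroid_start, steroid_end):
    # 1. Lab elevation begins within the first 2-14 weeks of ICI treatment.
    in_window = timedelta(weeks=2) <= (elevation_onset - ici_start) <= timedelta(weeks=14)
    # 2. Steroid intervention starts within 2 weeks of elevation onset.
    prompt_steroids = timedelta(0) <= (steroid_start - elevation_onset) <= timedelta(weeks=2)
    # 3. The intervention lasts at least 2 weeks.
    sustained = (steroid_end - steroid_start) >= timedelta(weeks=2)
    return in_window and prompt_steroids and sustained

# Hypothetical patient timeline: elevation at day 31, steroids days 35-60.
print(flag_hepatic_irae(date(2024, 1, 1), date(2024, 2, 1),
                        date(2024, 2, 5), date(2024, 3, 1)))  # True
```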
6. Cell‐type specific molecular expression levels by restricted‐dimensional cytometry.
- Author
- Kaplan, David, Lazarus, Hillard M., and Christian, Eric
- Abstract
Background: Cytometric analysis has been commonly used to delineate distinct cell subpopulations among peripheral blood mononuclear cells by the differential expression of surface receptors. This capability has reached its apogee with high‐dimensional approaches such as mass cytometry and spectral cytometry that include simultaneous assessment of 20–50 analytes. Unfortunately, this approach also engenders significant complexity with analytical and interpretational pitfalls. Methods: Here, we demonstrate a complementary approach with restricted‐dimensionality to assess cell‐type specific intracellular molecular expression levels at exceptional levels of precision. The expression of five analytes was individually assessed in four mononuclear cell‐types from peripheral blood. Results: Distinctions in expression levels were seen between cell‐types and between samples from different donor groups. Mononuclear cell‐type specific molecular expression levels distinguished pregnant from nonpregnant women and G‐CSF‐treated from untreated persons. Additionally, the precision of our analysis was sufficient to quantify a novel relationship between two molecules—Rel A and translocator protein—by correlational analysis. Conclusions: Restricted‐dimensional cytometry can provide a complementary approach to define characteristics of cell‐type specific intracellular protein and phosphoantigen expression in mononuclear cells. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
7. Generative Modeling, Design, and Analysis of Spider Silk Protein Sequences for Enhanced Mechanical Properties.
- Author
- Lu, Wei, Kaplan, David L., and Buehler, Markus J.
- Subjects
- SPIDER silk, AMINO acid sequence, MOLECULAR structure, PROTEIN analysis, ELASTIC modulus, SEQUENCE analysis
- Abstract
Spider silks are remarkable materials characterized by superb mechanical properties such as strength, extensibility, and light weight. Yet, to date, limited models are available to fully explore sequence‐property relationships for analysis and design. Here a custom generative large‐language model is proposed to enable the design of novel spider silk protein sequences to meet complex combinations of target mechanical properties. The model, pretrained on a large set of protein sequences, is fine‐tuned on ≈1,000 major ampullate spidroin (MaSp) sequences for which associated fiber‐level mechanical properties exist, to yield an end‐to‐end forward and inverse generative approach that is applied in a multi‐agent strategy. Performance is assessed through: 1) a novelty analysis and protein type classification for generated spidroin sequences through Basic Local Alignment Search Tool (BLAST) searches, 2) property evaluation and comparison with similar sequences, 3) comparison of resulting molecular structures, and 4) detailed sequence motif analyses. This work generates silk sequences with property combinations that do not exist in nature and develops a deeper understanding of the mechanistic roles of sequence patterns in achieving overarching key mechanical properties (elastic modulus (E), strength, toughness, failure strain). The model provides an efficient approach to expand the silkome dataset, facilitating further sequence‐structure analyses of silks, and establishes a foundation for synthetic silk design and optimization. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
8. Political advantage? Considering the Political Skill of raters.
- Author
- Kaplan, David M. and Berkley, Robyn A.
- Subjects
- EMPLOYEE selection
- Abstract
One of the ways that organizations can improve the selection process is by identifying good judges who can more accurately assess applicants. With this goal in mind, the present study proposed that politically skilled individuals would make good judges. The study focuses on two antecedents of Political Skill, perceptiveness and control, which have corollaries in the Realistic Accuracy Model. In the present study, these are referenced as perceptiveness and self‐assuredness. The study found support for the proposition that a rater's level of Political Skill is related to selection decisions. Further, the study supports in some circumstances that Political Skill mediates the relationship between sources of knowledge and contextual factors with rater selection decisions. Suppression effects were also found indicating further research on the role of Political Skill in selection decisions is needed. Practitioner points: Political Skill (PS) is associated with someone being a good judge. PS should be factored into the selection process for raters. PS should be a focus of both employee selection and development. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
9. ForceGen: End-to-end de novo protein generation based on nonlinear mechanical unfolding responses using a language diffusion model.
- Author
- Bo Ni, Kaplan, David L., and Buehler, Markus J.
- Subjects
- LANGUAGE models, BIOSYNTHESIS, MECHANICAL behavior of materials, PROTEIN engineering, AMINO acid sequence, SEQUENCE spaces, KERATIN
- Abstract
Through evolution, nature has presented a set of remarkable protein materials, including elastins, silks, keratins and collagens with superior mechanical performances that play crucial roles in mechanobiology. However, going beyond natural designs to discover proteins that meet specified mechanical properties remains challenging. Here, we report a generative model that predicts protein designs to meet complex nonlinear mechanical property-design objectives. Our model leverages deep knowledge on protein sequences from a pretrained protein language model and maps mechanical unfolding responses to create proteins. Via full-atom molecular simulations for direct validation, we demonstrate that the designed proteins are de novo, and fulfill the targeted mechanical properties, including unfolding energy and mechanical strength, as well as the detailed unfolding force-separation curves. Our model offers rapid pathways to explore the enormous mechanobiological protein sequence space unconstrained by biological synthesis, using mechanical features as the target to enable the discovery of protein materials with superior mechanical properties. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. Confidence intervals for intentionally biased estimators.
- Author
- Kaplan, David M. and Liu, Xin
- Subjects
- CONFIDENCE intervals, PROBABILITY theory
- Abstract
We propose and study three confidence intervals (CIs) centered at an estimator that is intentionally biased to reduce mean squared error. The first CI simply uses an unbiased estimator's standard error; compared to centering at the unbiased estimator, this CI has higher coverage probability for confidence levels above 91.7%, even if the biased and unbiased estimators have equal mean squared error. The second CI trades some of this "excess" coverage for shorter length. The third CI is centered at a convex combination of the two estimators to further reduce length. Practically, these CIs apply broadly and are simple to compute. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
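The first CI described in this abstract can be illustrated with a quick Monte Carlo: center the interval at a shrunken (biased) sample mean but build it with the unbiased estimator's standard error. The shrinkage factor and settings below are illustrative, not the paper's construction:

```python
import random
import statistics

def coverage(mu=0.5, n=25, shrink=0.9, z=1.96, reps=20000, seed=0):
    # CI centered at a shrunken (biased) sample mean, but using the
    # unbiased estimator's standard error (known sigma = 1).
    rng = random.Random(seed)
    se = 1 / n ** 0.5
    hits = 0
    for _ in range(reps):
        xbar = statistics.fmean(rng.gauss(mu, 1) for _ in range(n))
        center = shrink * xbar  # intentional bias toward zero
        hits += abs(center - mu) <= z * se
    return hits / reps

print(coverage())  # roughly 0.96: above the nominal 95% in this setting
```

Here the shrunken center's lower variance more than offsets its small bias, so coverage exceeds the nominal level, consistent with the abstract's claim for confidence levels above 91.7%.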
11. Gauging Geography's Vitality through National and Regional Organizations.
- Author
- Kaplan, David
- Subjects
- GEOGRAPHY, REGIONALISM (International organization), GEOGRAPHERS, GAGING, GAGES
- Abstract
The establishment of formal disciplinary-based associations is vital to the growth of the discipline and in facilitating the activities of people who work within the discipline. This geographical note examines the development and role of geography associations in the United States. It begins by looking at the four major geography societies which originated between 1851 and 1915 and how they specialized in their outreach and functions. This is followed by examining the role of the American Association of Geographers' regional divisions, which have created more localized geography associations for AAG members. The attributes of the AAG's nine regional divisions are considered, and then members' perspectives of these regions, based on a wide-ranging survey, are discussed. The special place of the Southeastern Division of the AAG (SEDAAG) emerges in comparing attributes and perceptions of each of the nine AAG regions. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
12. Benefits, concerns, and solutions of fishing for tunas with drifting fish aggregation devices.
- Author
- Pons, Maite, Kaplan, David, Moreno, Gala, Escalle, Lauriane, Abascal, Francisco, Hall, Martin, Restrepo, Victor, and Hilborn, Ray
- Subjects
- TUNA, TUNA fishing, FISHERIES, ENVIRONMENTAL organizations, TUNA fisheries
- Abstract
Drifting fish aggregating devices (dFADs) are human‐made floating objects widely used by tropical tuna purse seine (PS) fisheries to increase catch of target species. However, dFAD use has several negative impacts, including increased potential for overfishing, higher juvenile tuna catch, higher bycatch compared to other PS fishing modes, ghost‐fishing, and generation of marine litter. Based on these impacts, some stakeholders, especially environmental non‐governmental organizations and other competing fishing industries, suggest that dFADs should be completely banned. We list the pros and cons of dFAD fishing; address how to improve current management; and suggest solutions for the sustainability of dFAD fishing in the long term. A dFAD ban would lead to major changes in the availability and sourcing of tuna for human consumption and decrease the licensing revenue received by many developing states. Most importantly, we argue that tools exist today to manage for, reduce or eliminate most of the negative impacts of dFADs (e.g., bans on discards, limits on active dFADs, biodegradable non‐entangling constructions, time‐area deployment closures, recovery programs, and full data transparency, among others). Management decisions based on sound scientific reasoning are needed to address the legitimate concerns surrounding dFAD use and ensure the sustainability of both pelagic and coastal ecosystems and tropical tuna PS fisheries. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
13. A Bayesian statistics tutorial for clinical research: Prior distributions and meaningful results for small clinical samples.
- Author
- Larson, Caroline, Kaplan, David, Girolamo, Teresa, Kover, Sara T., and Eigsti, Inge‐Marie
- Subjects
- MEDICAL research, AUTISM spectrum disorders, RESEARCH questions, STATISTICS
- Abstract
Objectives: Bayesian statistics provides an effective, reliable approach for research with small clinical samples and yields clinically meaningful results that can bridge research and practice. This tutorial demonstrates how Bayesian statistics can be effectively and reliably implemented with a small, heterogeneous participant sample to promote reproducible and clinically relevant research. Methods/Results: We tested example research questions pertaining to language and clinical features in autism spectrum disorder (ASD; n = 20), a condition characterized by significant heterogeneity. We provide step‐by‐step instructions and visualizations detailing how to (1) identify and develop prior distributions from the literature base, (2) evaluate model convergence and reliability, and (3) compare models with different prior distributions to select the best performing model. Moreover, in step three, we demonstrate how to determine whether a sample size is sufficient for reliably interpreting model results. We also provide instructions detailing how to examine results with varied bounds of clinical interest, such as the probability that an effect will reflect at least one standard deviation change in scores on a standardized assessment. This information facilitates generalization and application of Bayesian results to a variety of clinical research questions and settings. Conclusion: The tutorial concludes with suggestions for future clinical research, ensuring the utility of our step‐by‐step instructions for a broad clinical audience. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
14. Error-independent effect of sensory uncertainty on motor learning when both feedforward and feedback control processes are engaged.
- Author
- Hewitson, Christopher L., Kaplan, David M., and Crossley, Matthew J.
- Subjects
- MOTOR learning, LITERATURE
- Abstract
Integrating sensory information during movement and adapting motor plans over successive movements are both essential for accurate, flexible motor behaviour. When an ongoing movement is off target, feedback control mechanisms update the descending motor commands to counter the sensed error. Over longer timescales, errors induce adaptation in feedforward planning so that future movements become more accurate and require less online adjustment from feedback control processes. Both the degree to which sensory feedback is integrated into an ongoing movement and the degree to which movement errors drive adaptive changes in feedforward motor plans have been shown to scale inversely with sensory uncertainty. However, since these processes have only been studied in isolation from one another, little is known about how they are influenced by sensory uncertainty in real-world movement contexts where they co-occur. Here, we show that sensory uncertainty may impact feedforward adaptation of reaching movements differently when feedback integration is present versus when it is absent. In particular, participants gradually adjust their movements from trial-to-trial in a manner that is well characterised by a slow and consistent envelope of error reduction. Riding on top of this slow envelope, participants exhibit large and abrupt changes in their initial movement vectors that are strongly correlated with the degree of sensory uncertainty present on the previous trial. However, these abrupt changes are insensitive to the magnitude and direction of the sensed movement error. These results prompt important questions for current models of sensorimotor learning under uncertainty and open up new avenues for future exploration in the field. Author summary: A large body of literature shows that sensory uncertainty inversely scales the degree of error-driven corrections made to motor plans from one trial to the next. 
However, by limiting sensory feedback to the endpoint of movements, these studies prevent corrections from taking place during the movement. Here, we show that when such corrections are promoted, sensory uncertainty punctuates between-trial movement corrections with abrupt changes that closely track the degree of sensory uncertainty but are insensitive to the magnitude and direction of movement error. This result marks a significant departure from existing findings and opens up new paths for future exploration. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
15. Who Are We? Redefining the Academic Community.
- Author
- Kaplan, David H.
- Subjects
- GEOGRAPHY, COMMUNITIES, STUDENTS, HIGHER education, DISCIPLINE
- Abstract
This presidential address focuses on how community intertwines with geography. On one hand, geography is the discipline that studies the community as it exists in place. Geography itself is a community—an intellectual, social, and cultural community—that must be supported and expanded. Geography's strength has been in upholding this community. Today, the discipline of geography suffers from some real challenges. The number of majors has recently declined, and more programs have closed than have opened in recent years. Improving our geographic community can be accomplished through four efforts. First, we need to improve our institutional diversity. Second, we must increase our workforce diversity. Third, we need to attend to expanding our discipline to first-generation students. Finally, we must harness the growth of Advanced Placement Human Geography to improve the status of geography in higher education. These and other efforts will go a long way in expanding the community of geography and in reinvigorating geography as a discipline. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
16. Constraints on Undetected Long-period Binaries in the Known Pulsar Population.
- Author
- Jones, Megan L., Kaplan, David L., McLaughlin, Maura A., and Lorimer, Duncan R.
- Subjects
- BINARY pulsars, BINARY black holes, GRAVITATIONAL waves, PULSARS
- Abstract
Although neutron star–black hole binaries have been identified through mergers detected in gravitational waves, a pulsar–black hole binary has yet to be detected. While short-period binaries are detectable due to a clear signal in the pulsar's timing residuals, effects from a long-period binary could be masked by other timing effects, allowing them to go undetected. In particular, a long-period binary measured over a small subset of its orbital period could manifest via time derivatives of the spin frequency incompatible with isolated pulsar properties. We assess the possibility of pulsars having unknown companions in long-period binaries and put constraints on the range of binary properties that may remain undetected in current data, but that may be detectable with further observations. We find that for 35% of canonical pulsars with published higher-order derivatives, the precision of measurements is not enough to confidently reject binarity (period ≳2 kyr), and that a black hole binary companion could not be ruled out for a sample of pulsars without published constraints if the period is >1 kyr. While we find no convincing cases in the literature, we put more stringent limits on orbital period and longitude of periastron for the few pulsars with published higher-order frequency derivatives (n ≥ 3). We discuss the detectability of candidates and find that a sample pulsar in a 100 yr orbit could be detectable within 5–10 yr. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
17. The past, present, and potential future of phosphorus management in the Florida Everglades.
- Author
- Zacharias, Quinn and Kaplan, David
- Subjects
- WATER quality, CITY dwellers, PHOSPHORUS, RESTORATION ecology, AGRICULTURE, STREAM restoration, WETLAND restoration, URBAN growth
- Abstract
The Florida Everglades, the largest subtropical wetland in North America, is in the midst of one of the most comprehensive and expensive environmental restoration efforts in history. Over the past 150 years, the Everglades has suffered substantial degradation due to massive drainage projects, polluting agricultural practices, and urban population growth. Decades of scientific investigation have shown that phosphorus (P) pollution is a primary driver of this environmental decline. This paper reviews how and why specific P‐management goals and strategies have been adopted in support of Everglades restoration, focusing on the often‐contentious process for converting science into restoration policies and standards. We synthesize current P‐management successes, failures, and tradeoffs, including the challenge of balancing multiple hydrologic and water quality restoration goals with the priorities and values of a diverse group of stakeholders. We then highlight promising future directions for Everglades P policy and propose questions to help guide the discussion of future restoration priorities and research needs in this and other complex social–ecological systems. The overall goals of this review are thus twofold: (1) to support an in‐depth understanding of the past, present, and potential future of P management approaches in this globally unique social–ecological system; and (2) to provide a broader framework for understanding how the coevolution of science and policy can support or undermine large‐scale ecosystem restoration. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
18. The impact of conventional versus robust norming on cognitive characterization and clinical classification of MCI and dementia.
- Author
- Kaser, Alyssa N., Kaplan, David M., Goette, William, and Kiselica, Andrew M.
- Subjects
- ALZHEIMER'S disease, DEMENTIA, MILD cognitive impairment, RECEIVER operating characteristic curves, COGNITIVE testing, NEUROPSYCHOLOGICAL tests
- Abstract
We examined the impact of conventional versus robust normative approaches on cognitive characterization and clinical classification of MCI versus dementia. The sample included participants from the National Alzheimer's Coordinating Center Uniform Data Set. Separate demographically adjusted z‐scores for cognitive tests were derived from conventional (n = 4273) and robust (n = 602) normative groups. To assess the impact of deriving scores from a conventional versus robust normative group on cognitive characterization, we examined likelihood of having a low score on each neuropsychological test. Next, we created receiver operating characteristic (ROC) curves for the ability of normed scores derived from each normative group to differentiate between MCI (n = 3570) and dementia (n = 1564). We examined the impact of choice of normative group on classification accuracy by comparing sensitivity and specificity values and areas under the curves (AUC). Compared with using a conventional normative group, using a robust normative group resulted in a higher likelihood of low cognitive scores for individuals classified with MCI and dementia. Comparison of the classification accuracy for distinguishing MCI from dementia did not suggest a statistically significant advantage for either normative approach (Z = −0.29, p = .77; AUC = 0.86 for conventional and AUC = 0.86 for robust). In summary, these results indicate that using a robust normative group increases the likelihood of characterizing cognitive performance as low. However, there is not a clear advantage of using a robust over a conventional normative group when differentiating between MCI and dementia. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
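The mechanism behind this paper's headline result (robust norms make more scores look low) is easy to see numerically: a robust normative group screened to remain cognitively normal tends to have a higher mean and tighter spread, so the same raw score maps to a lower z-score. The groups and numbers below are hypothetical, purely to illustrate the direction of the effect:

```python
import statistics

def normative_z(score, normative_scores):
    # Demographically adjusted norming reduces, at its core, to a z-score
    # against the chosen normative group's mean and standard deviation.
    mu = statistics.fmean(normative_scores)
    sd = statistics.stdev(normative_scores)
    return (score - mu) / sd

conventional = [48, 50, 52, 54, 46, 50, 51, 49]  # hypothetical conventional norms
robust = [52, 54, 53, 55, 51, 56, 54, 53]        # hypothetical robust norms

print(round(normative_z(45, conventional), 2))  # -2.04
print(round(normative_z(45, robust), 2))        # -5.3 (more impaired-looking)
```

The same raw score of 45 looks mildly low against the conventional group but markedly low against the robust group, which is the characterization difference the abstract reports.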
19. Binary collisions of dark matter blobs.
- Author
- Diamond, Melissa D., Kaplan, David E., and Rajendran, Surjeet
- Abstract
We describe the model-independent mechanism by which dark matter and dark matter structures heavier than ~8 × 10¹¹ GeV form binary pairs in the early Universe that spin down and merge both in the present and throughout the Universe's history, producing potentially observable signals. Sufficiently dense dark objects will dominantly collide through binary mergers instead of random collisions. We detail how one would estimate the merger rate accounting for finite size effects, multibody interactions, and friction with the thermal bath. We predict how mergers of dark dense objects could be detected through gravitational and electromagnetic signals, noting that such mergers could be a unique source of high frequency gravitational waves. We rule out objects whose presence would contradict observations of the CMB and diffuse gamma-rays. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
20. Cultured meat: creative solutions for a cell biological problem.
- Author
- Stout, Andrew J., Kaplan, David L., and Flack, Joshua E.
- Subjects
- TECHNOLOGICAL innovations, INFORMATION sharing, IN vitro meat, BIOLOGISTS
- Abstract
Cultured meat is an emerging technology that could address environmental, health, and animal welfare concerns associated with meat production. Development of cultured meat represents an exciting challenge for cell biologists and engineers, but it requires effective, open approaches for knowledge sharing to establish a fertile scientific field alongside a competitive industry. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
21. knitrdata: A Tool for Creating Standalone Rmarkdown Source Documents.
- Author
- Kaplan, David M.
- Subjects
- TEXT files, DATA analysis
- Abstract
Though Rmarkdown is a powerful tool for integrating text with code for analyses in a single source document exportable to a variety of output formats, until now there has been no simple way to integrate the data behind analyses into Rmarkdown source documents. The knitrdata package makes it possible for arbitrary text and binary data to be integrated directly into Rmarkdown source documents via implementation of a new data chunk type. The package includes command-line and graphical tools that facilitate creating and inserting data chunks into Rmarkdown documents, and the treatment of data chunks is highly configurable via chunk options. These tools allow one to easily create fully standalone Rmarkdown source documents integrating data, ancillary formatting files, analysis code and text in a single file. Used properly, the package can facilitate open distribution of source documents that demonstrate computational reproducibility of scientific results. [ABSTRACT FROM AUTHOR]
- Published
- 2022
22. Capturing Multiple Sources of Change on Triannual Math Screeners in Elementary School.
- Author
-
Hall, Garret J., Kaplan, David, and Albers, Craig A.
- Subjects
- *
ELEMENTARY schools , *SPRING , *PACIFIC Islanders , *SPECIAL education , *MATHEMATICS , *HISPANIC Americans - Abstract
Bayesian latent change score modeling (LCSM) was used to compare models of triannual (fall, winter, spring) change on elementary math computation and concepts/applications curriculum‐based measures. Data were collected from elementary students in Grades 2–5, approximately 700 to 850 students in each grade (47%–54% female; 78%–79% White, 10%–11% Black, 2%–4% Hispanic/Latino, 2%–4% Asian, 2%–4% Native American or Pacific Islander; 13%–14% English learner; 10%–14% had special education individualized education plans). Results converged with common nonlinear growth patterns from the assessment norms and prior independent findings. However, Bayesian LCSMs captured practically relevant sources of change not observed in prior studies. Practical and methodological implications for screening and data‐based decision‐making in multitiered systems of support, limitations, and future directions are discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
23. Algorithms to Identify Alcoholic Hepatitis Hospitalizations in Patients with Cirrhosis.
- Author
-
Panchal, Sarjukumar A., Kaplan, David E., Goldberg, David S., and Mahmud, Nadim
- Abstract
Background: Alcoholic hepatitis (AH) is a clinically diagnosed syndrome with high short-term mortality for which liver transplantation may be curative. A lack of validated algorithms to identify AH hospitalizations has hindered clinical epidemiology research. Methods: This was a retrospective cohort study of patients with cirrhosis using Veterans Health Administration (VHA) data from 2008 to 2015. We randomly sampled hospitalizations based upon abnormal liver tests and administrative codes for acute hepatitis or alcohol-associated liver disease (ALD). Hospitalizations were manually adjudicated for AH per society guidelines. A priori algorithms were evaluated to compute positive predictive value (PPV) and positive likelihood ratio (LR+), and were tested in an external University of Pennsylvania Health System (UPHS) cohort. Results: Of 368 hospitalizations, 142 (38.6%) were adjudicated as AH. AH patients were younger (55 vs. 58 years, p < 0.001), less likely to have prior cirrhosis decompensation (57% vs. 73.9%, p < 0.001), and had higher AST-to-ALT ratios (median 2.9 vs. 1.9, p < 0.001) and higher bilirubin levels (median 2.9 vs. 1.9 mg/dL, p < 0.001). Algorithms combining clinical laboratory criteria (AST > 85 U/L but < 450 U/L, AST-to-ALT ratio > 2, total bilirubin > 5 mg/dL) and administrative coding criteria yielded the highest PPV (96.4%, 95% CI 87.7–99.6) and the highest LR+ (43.0, 95% CI 10.6–173.5). Several algorithms demonstrated 100% PPV for definite AH in the UPHS external cohort. Conclusion: We have identified algorithms for AH hospitalizations with excellent PPV and LR+. These high-specificity algorithms may be used in VHA datasets to identify patients with high likelihood of AH, but should not be used to study AH incidence. [ABSTRACT FROM AUTHOR]
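The laboratory portion of the best-performing algorithm translates directly into a screening rule. A minimal illustrative sketch (the published algorithms additionally require the administrative coding criteria, which are omitted here):

```python
def meets_ah_lab_criteria(ast, alt, bilirubin):
    """Laboratory portion of the highest-PPV algorithm from the abstract:
    AST > 85 U/L but < 450 U/L, AST-to-ALT ratio > 2, and total
    bilirubin > 5 mg/dL. Administrative coding criteria not modeled."""
    return 85 < ast < 450 and alt > 0 and ast / alt > 2 and bilirubin > 5

assert meets_ah_lab_criteria(ast=200, alt=80, bilirubin=7.5)       # consistent with AH
assert not meets_ah_lab_criteria(ast=500, alt=100, bilirubin=8.0)  # AST too high for AH
```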
- Published
- 2022
- Full Text
- View/download PDF
24. Real-world health outcomes in US adult patients with mild to moderate plaque psoriasis taking topical therapy.
- Author
-
Kaplan, David, Hetherington, James, Lucas, James, Khilfeh, Ibrahim, and Nazareth, Tara
- Subjects
- *
PSORIASIS , *BODY surface area , *ADULTS , *TREATMENT effectiveness , *ITCHING - Abstract
Limited health outcomes information exists for patients with mild to moderate plaque psoriasis (hereafter referred to as psoriasis) prescribed topical treatment(s). We evaluated clinical characteristics of patients with systemic-naïve mild to moderate psoriasis after topical use in the United States. Data were drawn from the 2017–2018 Adelphi Psoriasis Disease Specific Programme™, a point-in-time survey of physicians and adult psoriasis patients, capturing data on topical treatment at time of consultation prescribed to systemic-naïve patients with mild to moderate psoriasis (i.e. body surface area [BSA] ≤ 10%) at current treatment initiation. Patient clinical characteristics before/after topical use were evaluated descriptively. Among 304 patients (median age 43.0 years; 53.6% female), mean time since diagnosis was 60.9 months. After a mean 6.9 months on their current topical, 14.5% of patients achieved ≥75% BSA reduction, 38.9% ≥50% BSA reduction, and 50.2% no BSA reduction. Residual psoriasis symptoms included scaling (76.5%), inflamed skin (65.9%), and itching (60.4%). Most patients (71.2%) had residual psoriasis in special body areas: nails (92.3%), palmoplantar (78.9%), scalp (75.9%), and face (65.8%). We found unmet need in topical treatment effectiveness in mild to moderate psoriasis patients, in terms of BSA reduction, symptoms, and special body areas affected. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
25. SAT-116 Association between pre-operative TIPS and post-operative mortality in a veterans affairs cohort.
- Author
-
Mahmud, Nadim and Kaplan, David
- Published
- 2024
- Full Text
- View/download PDF
26. Smoothed instrumental variables quantile regression.
- Author
-
Kaplan, David M.
- Subjects
- *
QUANTILE regression , *STATISTICAL accuracy , *REGRESSION analysis , *TREATMENT effectiveness - Abstract
In this article, I introduce the sivqr command, which estimates the coefficients of the instrumental variables quantile regression model introduced by Chernozhukov and Hansen (2005, Econometrica 73: 245–261). The sivqr command offers several advantages over the existing ivqreg and ivqreg2 commands for estimating this instrumental variables quantile regression model, which complements the alternative "triangular model" behind cqiv and the "local quantile treatment effect" model of ivqte. Computationally, sivqr implements the smoothed estimator of Kaplan and Sun (2017, Econometric Theory 33: 105–157), who show that smoothing improves both computation time and statistical accuracy. Standard errors are computed analytically or by Bayesian bootstrap; for non-independent and identically distributed (non-i.i.d.) sampling, sivqr is compatible with bootstrap. I discuss syntax and the underlying methodology, and I compare sivqr with other commands in an example. [ABSTRACT FROM AUTHOR]
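sivqr itself is a Stata command, but the smoothing idea can be illustrated with a toy, intercept-only Python sketch: replacing the indicator function in the quantile estimating equation with a smooth CDF makes the equation differentiable and easy to solve numerically. The logistic kernel and bandwidth below are illustrative choices, not those of Kaplan and Sun (2017), and no instruments are involved in this simplified case.

```python
import math

def logistic_cdf(u):
    """Overflow-safe logistic CDF, used here as the smoothing kernel G."""
    if u >= 0:
        return 1.0 / (1.0 + math.exp(-u))
    eu = math.exp(u)
    return eu / (1.0 + eu)

def smoothed_quantile(y, tau, h=0.1, tol=1e-9):
    """Intercept-only illustration of a smoothed quantile estimating
    equation: solve (1/n) * sum_i G((q - y_i) / h) = tau for q.
    As h -> 0 this approaches the ordinary sample quantile; smoothing
    makes the equation differentiable, which is the computational point."""
    lo, hi = min(y) - 1.0, max(y) + 1.0
    while hi - lo > tol:  # bisection: the left-hand side is monotone in q
        mid = (lo + hi) / 2.0
        if sum(logistic_cdf((mid - yi) / h) for yi in y) / len(y) < tau:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

y = [1.0, 2.0, 3.0, 4.0, 5.0]
assert abs(smoothed_quantile(y, tau=0.5, h=0.01) - 3.0) < 0.05  # near the median
```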
- Published
- 2022
- Full Text
- View/download PDF
27. Inpatient Gastroenterology Consultation and Outcomes of Cirrhosis-Related Hospitalizations in Two Large National Cohorts.
- Author
-
Serper, Marina, Kaplan, David E., Lin, Menghan, Taddei, Tamar H., Parikh, Neehar D., Werner, Rachel M., and Tapper, Elliot B.
- Abstract
Background: Little is known about use of specialty care among patients admitted with cirrhosis complications. Aims: We sought to characterize the use and impact of gastroenterology/hepatology (GI/HEP) consultations in hospitalized patients with cirrhosis. We studied two national cohorts—the Veterans Affairs Costs and Outcomes in Liver Disease (VOCAL) and a nationally representative database of commercially insured patients (Optum Clinformatics™ DataMart). Methods: Cirrhosis-related admissions were classified by ICD9/10 codes for ascites, hepatic encephalopathy, alcohol-associated hepatitis, spontaneous bacterial peritonitis, or infection related. We included 20,287/222,166 index admissions from VOCAL/Optum from 2010 to 2016. Propensity-matched analyses were conducted to balance clinical characteristics. Mortality and readmission were evaluated using competing risk regression (subhazard ratios, sHR), and length of stay (LOS) was assessed using negative binomial regression. Results: GI/HEP consultations were completed among 37% and 42% patients in VOCAL and Optum, respectively. In propensity-matched analyses for VOCAL, GI/HEP consultation was associated with adjusted estimates of increased LOS (1.55 ± 1.03 additional days), 90-day mortality (sHR 1.23, 95% CI 1.14–1.36), and lower 30-day readmissions (sHR 0.82, 95% CI 0.75–0.89). In Optum, inpatient consultation was associated with higher LOS (1.13 ± 1.01 additional days), higher 90-day mortality (sHR 1.57, 95% CI 1.43–1.72), and higher 30-day readmission risk (sHR 1.04, 95% CI 1.02–1.05). Post-discharge primary and specialty care was higher among admissions receiving GI/HEP consultation in both cohorts. Conclusions: Use of GI/HEP consultation for cirrhosis-related admissions was low. Patients who received consultation had higher disease severity, and consultation was not associated with lower mortality but was associated with lower 30-day readmissions in the VA cohort only. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
28. Constraints on relic magnetic black holes.
- Author
-
Diamond, Melissa D. and Kaplan, David E.
- Abstract
We present current direct and astrophysical limits on the cosmological abundance of black holes with extremal magnetic charge. Such black holes do not Hawking radiate, allowing those normally too light to survive to the present to do so. The dominant constraints come from white dwarf destruction for low and intermediate masses (2 × 10⁻⁵ g – 4 × 10¹² g) and Galactic gas cloud heating for heavier masses (> 4 × 10¹² g). Extremal magnetic black holes may catalyze proton decay. We derive robust limits — independent of the catalysis cross section — from the effect this has on white dwarfs. We discuss other bounds from neutron star heating, solar neutrino production, binary formation and annihilation into gamma-rays, and magnetic field destruction. Stable magnetically charged black holes can assist in the formation of neutron star mass black holes. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
29. Construct validity evidence reporting practices for the Reading the Mind in the Eyes Test: A systematic scoping review.
- Author
-
Higgins, Wendy C., Kaplan, David M., Deschrijver, Eliane, and Ross, Robert M.
- Subjects
- *
TEST validity , *PSYCHOLOGICAL literature , *SOCIAL skills , *STATISTICAL reliability , *READING - Abstract
The Reading the Mind in the Eyes Test (RMET) is one of the most influential measures of social cognitive ability, and it has been used extensively in clinical populations. However, questions have been raised about the validity of RMET scores. We conducted a systematic scoping review of the validity evidence reported in studies that administered the RMET (n = 1461; of which 804 included at least one clinical sample) with a focus on six key dimensions: internal consistency, test-retest reliability, factor structure, convergent validity, discriminant validity, and known group validity. Strikingly, 63% of these studies failed to provide validity evidence from any of these six categories. Moreover, when evidence was reported, it frequently failed to meet widely accepted validity standards. Overall, our results suggest a troubling conclusion: the validity of RMET scores (and the research findings based on them) are largely unsubstantiated and uninterpretable. More broadly, this project demonstrates how unaddressed measurement issues can undermine a voluminous psychological literature. • The Reading the Mind in the Eyes Test (RMET) is widely used as a measure of social cognitive ability. • The RMET has been used in more than 75 different clinical populations across more than 800 studies. • Most studies reported little to no validity evidence for RMET test scores. • When reported, validity evidence often indicates inadequate validity. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. Neutrino interactions in the late universe.
- Author
-
Green, Daniel, Kaplan, David E., and Rajendran, Surjeet
- Abstract
The cosmic neutrino background is both a dramatic prediction of the hot Big Bang and a compelling target for current and future observations. The impact of relativistic neutrinos in the early universe has been observed at high significance in a number of cosmological probes. In addition, the non-zero mass of neutrinos alters the growth of structure at late times, and this signature is a target for a number of upcoming surveys. These measurements are sensitive to the physics of the neutrino and could be used to probe physics beyond the standard model in the neutrino sector. We explore an intriguing possibility where light right-handed neutrinos are coupled to all, or a fraction of, the dark matter through a mediator. In a wide range of parameter space, this interaction only becomes important at late times and is uniquely probed by late-time cosmological observables. Due to this coupling, the dark matter and neutrinos behave as a single fluid with a non-trivial sound speed, leading to a suppression of power on small scales. In current and near-term cosmological surveys, this signature is equivalent to an increase in the sum of the neutrino masses. Given current limits, we show that at most 0.5% of the dark matter could be coupled to neutrinos in this way. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
31. Geography's Position in Education Today.
- Author
-
Kaplan, David H.
- Subjects
- *
GEOGRAPHY , *HUMAN geography , *UNIVERSITIES & colleges , *EDUCATION associations , *GEOGRAPHERS - Abstract
The objective of this article is to assess the position of geography in the United States by examining the educational organizations that serve geography. Specifically, I detail four pieces of evidence: (1) the number and distribution of higher education institutions that offer geography; (2) trends in the number and diversity of geography majors; (3) state geography requirements for middle and high schools; and (4) the growth in Advanced Placement Human Geography and its distribution by states. Retaining a strong institutional presence in geography is vital given the precarious status of the discipline. I call for a more systematic approach to collecting and disseminating data on the health of our discipline to ensure its long-term survival. These steps are currently being taken by the American Association of Geographers and will result in a more robust data infrastructure. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
32. Impact of SGLT2 inhibitors in comparison with DPP4 inhibitors on ascites and death in veterans with cirrhosis on metformin.
- Author
-
Saffo, Saad, Kaplan, David E., Mahmud, Nadim, Serper, Marina, John, Binu V., Ross, Joseph S., and Taddei, Tamar
- Subjects
- *
NON-alcoholic fatty liver disease , *SODIUM-glucose cotransporter 2 inhibitors , *METFORMIN , *CD26 antigen , *ASCITES , *OVERALL survival - Abstract
Sodium‐glucose cotransporter 2 inhibitors (SGLT2i) may have favourable neurohumoral and metabolic effects in patients with chronic liver disease. However, studies examining SGLT2i in this population have been limited to patients with non‐alcoholic fatty liver disease and have focused on surrogate biomarkers. Our aim was to evaluate whether SGLT2i can reduce the incidence of ascites and death over a period of 36 months in patients with cirrhosis and diabetes mellitus. Using electronic health data from Veterans Affairs hospitals in the United States, we conducted a propensity score matched intention‐to‐treat analysis among veterans on metformin who subsequently received either SGLT2i or dipeptidyl peptidase‐4 inhibitors. Among 423 matched pairs (in total, 846 patients), we found no significant difference in the risk for ascites (hazard ratio 0.68 for SGLT2i, 95% confidence interval 0.37‐1.25; p =.22) but did find that SGLT2i users had a reduced risk for death (adjusted hazard ratio 0.33, 95% confidence interval 0.11‐0.99; p <.05). In comparison with dipeptidyl peptidase‐4 inhibitors, SGLT2i may improve survival for patients with cirrhosis who require additional pharmacotherapy for diabetes mellitus beyond metformin, but confirmatory studies are necessary. [ABSTRACT FROM AUTHOR]
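The pairing step behind a propensity score matched analysis like the one described here can be sketched as greedy 1:1 nearest-neighbour matching with a caliper. The scores and the caliper value below are hypothetical; the study's actual matching specification is not given in the abstract.

```python
def greedy_match(treated, control, caliper=0.05):
    """1:1 greedy nearest-neighbour matching on (already estimated)
    propensity scores; `treated` and `control` map subject id -> score.
    The 0.05 caliper is an illustrative choice."""
    pairs, available = [], dict(control)
    for t_id, t_ps in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_ps))
        if abs(available[c_id] - t_ps) <= caliper:   # enforce the caliper
            pairs.append((t_id, c_id))
            del available[c_id]                      # match without replacement
    return pairs

# Made-up propensity scores for two treated and three control subjects.
treated = {"t1": 0.30, "t2": 0.60}
control = {"c1": 0.32, "c2": 0.58, "c3": 0.90}
assert greedy_match(treated, control) == [("t1", "c1"), ("t2", "c2")]
```

Each matched pair then contributes to the intention-to-treat comparison of outcomes such as ascites and death.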
- Published
- 2021
- Full Text
- View/download PDF
33. On the prediction of neuronal microscale topology descriptors based on mesoscale recordings.
- Author
-
Bonzanni, Mattia and Kaplan, David L.
- Subjects
- *
NEURAL circuitry , *TOPOLOGY , *BIOLOGICAL networks , *FRAMES (Social sciences) , *POWER (Social sciences) - Abstract
The brain possesses structural and functional hierarchical architectures organized over multiple scales. Because functional recordings commonly focus on a single spatial level, and because multiple scales interact with one another, we explored the behaviour of in silico neuronal networks across different scales. We established ad hoc relations of several topological descriptors (average clustering coefficient, average path length, small‐world propensity, modularity, network degree, synchronizability and fraction of long‐term connections) between different scales upon application and empirical validation of a Euclidian renormalization approach. We tested a simple network (distance‐dependent model) as well as an artificial cortical network (Vertex; undirected and directed networks), finding the same qualitative power law relations of the parameters across levels: their quantitative nature is model dependent. Those findings were then organized in a workflow that can be used to predict, with approximation, microscale topologies from mesoscale recordings. The present manuscript not only presents a theoretical framework for the renormalization of biological neuronal networks and their study across scales in light of the spatial features of the recording method but also proposes an applicable workflow to compare real functional networks across scales. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
34. Erratum: Constraining properties of neutron star merger outflows with radio observations.
- Author
-
Dobie, Dougal, Kaplan, David L, Hotokezaka, Kenta, Murphy, Tara, Deller, Adam, Hallinan, Gregg, and Nissanke, Samaya
- Subjects
- *
STELLAR mergers , *GRAVITATIONAL wave detectors - Published
- 2021
- Full Text
- View/download PDF
35. Statins in Cirrhosis: Trial Data Are in but the Jury Is Still Out.
- Author
-
Kaplan, David E.
- Subjects
- *
CIRRHOSIS of the liver , *STATINS (Cardiovascular agents) , *JURY , *PORTAL hypertension - Abstract
In future trial design and in clinical practice, patients with CTP C cirrhosis or total bilirubin ≥ 3 mg/dL should not be initiated on statins and, if already taking them, should discontinue therapy. Two CTP C patients developed myonecrosis in this study; it was only patients with CTP C cirrhosis in the Bleeding Prevention With Simvastatin (BLEPS) trial that developed rhabdomyolysis [6]. [Extracted from the article]
- Published
- 2023
- Full Text
- View/download PDF
36. Watch and Learn: The Cognitive Neuroscience of Learning from Others' Actions.
- Author
-
Ramsey, Richard, Kaplan, David M., and Cross, Emily S.
- Subjects
- *
COGNITIVE neuroscience , *COGNITIVE learning , *OBSERVATIONAL learning , *MOTOR learning , *MIRROR neurons , *NEUROLINGUISTICS - Abstract
The mirror neuron system has dominated understanding of observational learning from a cognitive neuroscience perspective. Our review highlights the value of observational learning frameworks that integrate a more diverse and distributed set of cognitive and brain systems, including those implicated in sensorimotor transformations, as well as in more general processes such as executive control, reward, and social cognition. We argue that understanding how observational learning occurs in the real world will require neuroscientific frameworks that consider how visuomotor processes interface with more general aspects of cognition, as well as how learning context and action complexity shape mechanisms supporting learning from watching others. Understanding how the human brain translates visual information into skilled motor performance has been assisted and constrained by the discovery of mirror neurons. Emerging evidence highlights how observational motor learning involves a far more diffuse network of brain regions and cognitive processes, which are shaped by the context and complexity of the motor task to be learned. A greater emphasis on combining functional decomposition and functional integration approaches should facilitate paradigms and discoveries that move us closer toward understanding how we learn from watching others in complex, real-world scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
37. Expressions of American White Ethnonationalism in Support for "Blue Lives Matter".
- Author
-
Solomon, Johanna, Kaplan, David, and Hancock, Landon E.
- Subjects
- *
WHITE nationalism , *BLUE Lives Matter movement , *BLACK Lives Matter movement , *ETHNONATIONALISM - Abstract
This paper poses three related questions. What is white ethno-nationalism as it exists today within the United States, how is this sentiment expressed by particular organizations, and how is it expressed by ordinary people who belong to these organizations? We begin by examining what ethno-nationalism signifies and its relation to other forms of nationalism. We then examine how certain indicators are present among supporters of an organization, Blue Lives Matter, which emerged as a reaction to Black Lives Matter. While this movement has framed itself as supportive of police rights, its negative reaction to Black Lives Matter has become a vehicle to express white ethno-nationalism. These views are gauged as a means to understanding the contours of banal white ethnonationalism as opposed to those more strident forms registered by neo-Nazis and KKK members. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
38. Care quality and outcomes among US veterans with chronic hepatitis B in the hepatitis C direct‐acting antiviral era.
- Author
-
Kaplan, David E., Medvedeva, Elina, and Serper, Marina
- Subjects
- *
CHRONIC hepatitis B , *HEPATITIS B , *HEPATITIS C , *VIRAL hepatitis , *HEPATITIS B virus , *VETERANS , *LIVER diseases , *DATA warehousing - Abstract
Adherence to guideline‐recommended hepatitis B virus (HBV) care is suboptimal. We hypothesized that national hepatitis C eradication efforts during the era from 2015 to 2017 would improve the quality of care for chronic HBV (cHBV) given increased recognition and specialty referrals for liver disease. The study described herein is a retrospective cohort study of veterans with at least one positive HBsAg (HBsAg+) result from 1 January 2003 to 31 December 2017 using the VA Corporate Data Warehouse (CDW) analysed by era (2003‐2004, 2005‐2009, 2010‐2014, 2015‐2017). Relevant covariates such as HCV co‐infection, demographics, cirrhosis and baseline laboratory testing were obtained through previously validated approaches. We evaluated completion of process measures within 2 years of the index HBsAg + result: specialty care referral; testing of ALT, HBV‐DNA, HBeAg and anti‐HBe; testing for co‐infection and/or vaccination for HAV, HCV, HDV and HIV; and hepatocellular carcinoma (HCC) surveillance among those meeting criteria. We also measured use of antiviral therapy in appropriate candidates (ALT ≥ 2 × ULN, HBV‐DNA ≥ 2000 IU/mL). Of the 16,673 individuals with HBsAg + test results, 9,521 were confirmed as chronic HBV. Era‐related (Era 3:2010‐2014 vs Era 4:2015‐2017) increases in guideline‐recommended process measures included the following: outpatient visits with GI/ID specialists (78%‐89%), HBV‐DNA testing (73%‐79%), HDV testing (27%‐35%), appropriate HBV antiviral utilization (55%‐70%) and HCC surveillance (40%‐43%); all P <.0001. In the subset of HBV/HCV‐co‐infected patients, HCV DAA therapy was associated with a trend towards improved overall survival. In conclusion, the overall quality of care for HBV has significantly improved in the era of widespread HCV DAA therapy in an integrated health system, possibly due to increased recognition and referral for liver disease. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
39. Hepatitis C Virus.
- Author
-
Kaplan, David E.
- Subjects
- *
HEPATITIS C virus , *PRIMARY care , *HEPATITIS C diagnosis , *HEPATITIS C prevention , *HEPATITIS C transmission , *ANTIVIRAL agents , *COMPARATIVE studies , *HEPATITIS C , *HEPATITIS viruses , *RESEARCH methodology , *MEDICAL cooperation , *MEDICAL screening , *RESEARCH , *EVALUATION research - Published
- 2020
- Full Text
- View/download PDF
40. INTRODUCTION TO THE SPECIAL ISSUE ON COVID-19.
- Author
-
Keough, Sara Beth and Kaplan, David H.
- Subjects
- *
COVID-19 , *COVID-19 pandemic , *SARS-CoV-2 , *STAY-at-home orders , *ENVIRONMENTAL racism - Abstract
For most in the United States, the realities of the COVID-19 pandemic hit home in March 2020. This decentralized method of handling the pandemic resulted in reinterpretations of space, place, practices, policies, and spatial patterns. Qingfang Wang and Wei Kang analyze the impact of the pandemic on small business practices in 50 MSAs and the degree of vulnerability created by policy differences. [Extracted from the article]
- Published
- 2021
- Full Text
- View/download PDF
41. An Approach to Addressing Multiple Imputation Model Uncertainty Using Bayesian Model Averaging.
- Author
-
Kaplan, David and Yavuz, Sinan
- Subjects
- *
MULTIPLE imputation (Statistics) , *MISSING data (Statistics) , *UNCERTAINTY , *FORECASTING - Abstract
This paper considers the problem of imputation model uncertainty in the context of missing data problems. We argue that so-called "Bayesianly proper" approaches to multiple imputation, although correctly accounting for uncertainty in imputation model parameters, ignore the uncertainty in the imputation model itself. We address imputation model uncertainty by implementing Bayesian model averaging as part of the imputation process. Bayesian model averaging accounts for both model and parameter uncertainty, and thus we argue is fully Bayesianly proper. We apply Bayesian model averaging to multiple imputation under the fully conditional specification approach. An extensive simulation study is conducted comparing our Bayesian model averaging approach against normal theory-based Bayesian imputation not accounting for model uncertainty. Across almost all conditions of the simulation study, the results reveal the extent of model uncertainty in multiple imputation and a consistent advantage to our Bayesian model averaging approach over normal-theory multiple imputation under missing-at-random and missing-completely-at-random in terms of Kullback–Leibler divergence and mean squared prediction error. A small case study is also presented. Directions for future research are discussed. [ABSTRACT FROM AUTHOR]
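The averaging step of Bayesian model averaging is commonly operationalized with BIC-approximated posterior model probabilities. The sketch below uses that standard approximation with made-up numbers; it is a generic illustration of model averaging, not necessarily the authors' implementation within fully conditional specification.

```python
import math

def bma_weights(bics):
    """Approximate posterior model probabilities from BIC values:
    weight_k proportional to exp(-BIC_k / 2), normalized to sum to 1.
    Subtracting the best BIC first keeps the exponentials stable."""
    best = min(bics)
    raw = [math.exp(-(b - best) / 2.0) for b in bics]
    total = sum(raw)
    return [r / total for r in raw]

def bma_predict(predictions, bics):
    """Model-averaged prediction: the weighted mean of per-model predictions."""
    return sum(w * p for w, p in zip(bma_weights(bics), predictions))

# Three hypothetical imputation models with made-up BICs and imputed values.
bics, preds = [100.0, 102.0, 110.0], [5.0, 6.0, 9.0]
w = bma_weights(bics)
assert abs(sum(w) - 1.0) < 1e-12 and w[0] > w[1] > w[2]
assert 5.0 < bma_predict(preds, bics) < 9.0  # averaged value lies between models
```

Averaging over models in this way propagates model uncertainty into each imputed value, rather than conditioning on a single chosen imputation model.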
- Published
- 2020
- Full Text
- View/download PDF
42. (H)Elping nerve growth factor: Elp1 inhibits TrkA's phosphatase to maintain retrograde signaling.
- Author
-
Kaplan, David R. and Mobley, William C.
- Abstract
Nerve growth factor (NGF) regulates many aspects of neuronal biology by retrogradely propagating signals along axons to the targets of those axons. How this occurs when axons contain a plethora of proteins that can silence those signals has long perplexed the neurotrophin field. In this issue of the JCI, Li et al. suggest an answer to this vexing problem, while exploring why the Elp1 gene that is mutated in familial dysautonomia (FD) causes peripheral neuropathy. They describe a distinctive function of Elp1 as a protein that is required to sustain NGF signaling by blocking the activity of its phosphatase that shuts off those signals. This finding helps explain the innervation deficits prominent in FD and reveals a unique role for Elp1 in the regulation of NGF-dependent TrkA activity. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
43. Language and Inhibition: Predictive Relationships in Children With Language Impairment Relative to Typically Developing Peers.
- Author
-
Larson, Caroline, Kaplan, David, Kaushanskaya, Margarita, and Weismer, Susan Ellis
- Subjects
- *
INTERPERSONAL relations in children , *SPECIFIC language impairment in children , *CHILD development , *LANGUAGE disorders , *VOCABULARY , *AFFINITY groups , *PHONOLOGICAL awareness - Abstract
Background: This study examined predictive relationships between two indices of language—receptive vocabulary and morphological comprehension—and inhibition in children with specific language impairment (SLI) and typically developing (TD) children. Methods: Participants included 30 children with SLI and 41 TD age-matched peers (8–12 years). At two time points separated by 1 year, we assessed receptive vocabulary and morphological comprehension via standardized language measures and inhibition via a Flanker task. We used Bayesian model averaging and Bayesian regression analytical techniques. Results: Findings indicated predictive relationships between language indices and inhibition reaction time (RT), but not between language indices and inhibition accuracy. For the SLI group, Year 1 inhibition RT predicted Year 2 morphological comprehension. For the TD group, Year 1 morphological comprehension predicted Year 2 inhibition RT. Conclusions: This study provides preliminary evidence of a predictive relationship between language and inhibition, but this relationship differed between children with SLI and those with typical development. Findings suggest that inhibition RT played a larger predictive role in later morphological comprehension in children with SLI relative to the other relationships examined. Targeting inhibition skills as a part of language intervention may improve subsequent morphological comprehension. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
44. Are More Details Better? On the Norms of Completeness for Mechanistic Explanations.
- Author
-
Craver, Carl F and Kaplan, David M
- Subjects
- *
EXPLANATION , *SCIENTIFIC models - Abstract
Completeness is an important but misunderstood norm of explanation. It has recently been argued that mechanistic accounts of scientific explanation are committed to the thesis that models are complete only if they describe everything about a mechanism and, as a corollary, that incomplete models are always improved by adding more details. If so, mechanistic accounts are at odds with the obvious and important role of abstraction in scientific modelling. We respond to this characterization of the mechanist's views about abstraction and articulate norms of completeness for mechanistic explanations that have no such unwanted implications. 1 Introduction 2 A Balancing Act: When Do Details Matter? 3 The Norms of Causal Explanation 4 The Norms of Constitutive Explanation 5 Salmon-Completeness 6 From More Details to More Relevant Details 7 Non-explanatory Virtues of Abstraction 8 From Explanatory Models to Explanatory Knowledge 9 Mechanistic Completeness Reconsidered 10 Conclusion [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
45. distcomp: Comparing distributions.
- Author
-
Kaplan, David M.
- Subjects
- *
REGRESSION discontinuity design , *TREATMENT effectiveness , *ECONOMETRICS , *ERROR rates - Abstract
In this article, I introduce the distcomp command, which assesses whether two distributions differ at each possible value while controlling the probability of any false positive, even in finite samples. I discuss syntax and the underlying methodology (from Goldman and Kaplan [2018, Journal of Econometrics 206: 143–166]). Multiple examples illustrate the distcomp command, including revisiting the experimental data of Gneezy and List (2006, Econometrica 74: 1365–1384) and the regression discontinuity design of Cattaneo, Frandsen, and Titiunik (2015, Journal of Causal Inference 3: 1–24). [ABSTRACT FROM AUTHOR]
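distcomp is a Stata command implementing the finite-sample method of Goldman and Kaplan (2018). The Python sketch below only illustrates the goal it describes (flagging values where two distributions differ while controlling the probability of any false positive) using a generic permutation-calibrated critical value for the maximum ECDF gap; it is a stand-in, not their method.

```python
import random

def ecdf(sample, x):
    """Empirical CDF of `sample` evaluated at x."""
    return sum(v <= x for v in sample) / len(sample)

def flag_differences(x_sample, y_sample, n_perm=500, alpha=0.05, seed=0):
    """Flag grid values where the two ECDFs differ, calibrating against the
    permutation distribution of the *maximum* ECDF gap so that the chance
    of any false flag is controlled at roughly alpha (familywise control)."""
    rng = random.Random(seed)
    grid = sorted(set(x_sample) | set(y_sample))
    pooled, nx = list(x_sample) + list(y_sample), len(x_sample)
    max_gaps = []
    for _ in range(n_perm):
        rng.shuffle(pooled)
        px, py = pooled[:nx], pooled[nx:]
        max_gaps.append(max(abs(ecdf(px, g) - ecdf(py, g)) for g in grid))
    crit = sorted(max_gaps)[int((1 - alpha) * n_perm)]
    return [g for g in grid
            if abs(ecdf(x_sample, g) - ecdf(y_sample, g)) > crit]

assert flag_differences(list(range(10)), list(range(10))) == []   # identical: nothing flagged
assert flag_differences([0.0] * 15, [10.0] * 15) == [0.0]         # real gap is flagged at 0.0
```

Calibrating to the maximum gap is what gives "any false positive" control, as opposed to pointwise tests at each value.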
- Published
- 2019
- Full Text
- View/download PDF
46. Detecting outliers in species distribution data: Some caveats and clarifications on a virtual species study.
- Author
-
Meynard, Christine N., Kaplan, David M., Leroy, Boris, and Pearman, Peter B.
- Subjects
- *
SPECIES distribution , *DATA distribution , *SPECIES - Abstract
Liu et al. (2018) used a virtual species approach to test the effects of outliers on species distribution models. In their simulations, they applied a threshold value over the simulated suitabilities to generate the species distributions, suggesting that using a probabilistic simulation approach would have been more complex and yield the same results. Here, we argue that using a probabilistic approach is not necessarily more complex and may significantly change results. Although the threshold approach may be justified under limited circumstances, the probabilistic approach has multiple advantages. First, it is in line with ecological theory, which largely assumes non‐threshold responses. Second, it is more general, as it includes the threshold as a limiting case. Third, it allows a better separation of the relevant intervening factors that influence model performance. Therefore, we argue that the probabilistic simulation approach should be used as a general standard in virtual species studies. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
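The virtual-species abstract above contrasts thresholding simulated suitabilities against drawing presences probabilistically. A minimal sketch of the two approaches (variable names and the cutoff of 0.5 are illustrative assumptions, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(42)
# Simulated habitat suitability for 10,000 virtual sites
suitability = rng.uniform(0.0, 1.0, size=10_000)

# Threshold approach: the species is present wherever suitability exceeds a cutoff
threshold_presence = suitability > 0.5

# Probabilistic approach: presence is a Bernoulli draw with p = suitability;
# a step-shaped p(suitability) recovers the threshold case as a limiting case
probabilistic_presence = rng.random(10_000) < suitability
```

Under the probabilistic approach the realized prevalence tracks the mean suitability (about 0.5 here), while the threshold approach forces a deterministic presence/absence pattern.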
47. Quantifying the impacts of dams on riverine hydrology under non-stationary conditions using incomplete data and Gaussian copula models.
- Author
-
Valle, Denis and Kaplan, David
- Abstract
Across the world, the assessment of environmental impacts attributable to infrastructure and development projects often requires a comparison of observed post-impact outcomes with what "would have happened" in the absence of the impact (i.e., the counterfactual). Environmental impact assessment (EIA) methods traditionally determine the counterfactual based on strong assumptions of stationarity (e.g., using before and after comparisons) and can be particularly challenging to use in the context of substantial data gaps, a vexing problem when combining several time-series data from different sources. Here we propose and test a widely applicable statistical approach for quantifying environmental impacts that avoids the stationarity assumption and circumvents issues associated with data gaps. Specifically, we used a Gaussian Copula (GC) model to assess the hydrological impacts of the Tucuruí dam on the Tocantins River in the Brazilian Amazon. Using multi-source water level and climate data, GC predictions of pre-dam hydrology for the validation period were excellent (Nash-Sutcliffe coefficients of 0.83 to 0.98 and 93–96% of observations within the 95% predictive intervals). In the post-dam period, the river had higher dry-season water levels both upstream and downstream relative to the predicted counterfactual, and the timing and duration of wet-season drawdown was delayed and extended, substantially altering the flood pulse. These impacts were evident as far as 176 km away from the dam, highlighting widespread hydrological impacts. The GC model outperformed standard multiple regression models in representing predictive uncertainty while also avoiding the stationarity assumption and circumventing the issue of sparse and incomplete data. We thus believe the GC approach has wide utility for integrating disparate time-series data to quantify the impacts of dams and other anthropogenic phenomena on riverine hydrology globally.
• We propose and test Gaussian Copula (GC) models for environmental impact assessments.
• The GC model avoids the stationarity assumption and circumvents data gaps.
• This model yielded better predictions than more standard methods.
• We find substantial riverine hydrology alterations associated with the Tucuruí dam.
• Our method is useful to integrate multiple time-series data for impact assessments. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
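The core mechanic the abstract describes, predicting a counterfactual from dependence fitted on pre-impact data, can be sketched with a toy bivariate Gaussian copula. This is a minimal illustration under simplifying assumptions (one driver, one response, empirical margins, conditional-mean prediction); the paper's multi-source model is richer, and all function and variable names here are invented:

```python
import numpy as np
from scipy.stats import norm, rankdata

def gc_counterfactual(x_pre, y_pre, x_post):
    """Toy bivariate Gaussian-copula counterfactual: fit the copula on
    pre-impact data (x = driver, y = water level), then predict y for
    post-impact x. A sketch, not the paper's full model."""
    n = len(x_pre)
    # Copula transform: empirical ranks -> uniform scores -> normal scores
    u = rankdata(x_pre) / (n + 1)
    v = rankdata(y_pre) / (n + 1)
    zx, zy = norm.ppf(u), norm.ppf(v)
    rho = np.corrcoef(zx, zy)[0, 1]
    # Push new driver values through the pre-impact empirical CDF of x
    u_new = (np.searchsorted(np.sort(x_pre), x_post, side="right") + 0.5) / (n + 1)
    u_new = np.clip(u_new, 1 / (n + 1), n / (n + 1))
    z_pred = rho * norm.ppf(u_new)  # conditional mean in normal-score space
    # Back-transform via the empirical quantile function of y
    return np.quantile(y_pre, norm.cdf(z_pred))

rng = np.random.default_rng(1)
x = rng.normal(size=500)
y = 0.8 * x + rng.normal(scale=0.6, size=500)  # synthetic pre-impact data
pred = gc_counterfactual(x, y, np.array([0.0, 1.0]))
```

Deviations of observed post-impact levels from such predictions would then be attributed to the intervention, which is the comparison logic of the abstract.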
48. Simulations of drifting fish aggregating device (dFAD) trajectories in the Atlantic and Indian Oceans.
- Author
-
Imzilen, Taha, Kaplan, David M., Barrier, Nicolas, and Lett, Christophe
- Subjects
- *
OCEAN , *TUNA fisheries , *OCEAN currents , *MARINE pollution , *PREDICTION models - Abstract
Tropical tuna purse-seine fisheries deploy thousands of human-made drifting fish aggregating devices (dFADs) annually, raising a number of concerns regarding ecosystem impacts. In this study, we explored the use of a Lagrangian particle-tracking model to simulate the drift of dFADs in the Atlantic and Indian Oceans. We simulated more than 100,000 dFAD trajectories using the Lagrangian tool Ichthyop forced with velocity fields from an ocean model output (GLORYS12V1) and two satellite-derived ocean-current products (OSCAR and GEKCO). Importantly, through a collaborative agreement with the French frozen tuna producers' organization, we had access to the true locations of all dFADs along their drift and could therefore evaluate the accuracy of our simulations. Accuracy was assessed by comparing observed and simulated trajectories in terms of spatial distribution, separation distance, and a non-dimensional skill score (an index based on separation distances normalized by net displacements of dFADs). In both oceans, simulations forced with GLORYS12V1 were more accurate than those forced with OSCAR or GEKCO, probably because of differences in the spatio-temporal resolution of the forcing products. When we compared multiple depths for GLORYS12V1, the model performed better at 0 m in the Indian Ocean and at 5 m in the Atlantic Ocean, which could be related to the longer vertical structure of dFADs in the Atlantic Ocean. Including a windage factor did not improve the accuracy of modeled dFAD trajectories. Mean model-data separation distances were similar in both oceans, exceeding 100 km after 6–8 days of drift. While these separation distances show that model errors were similar in the two oceans, the generally longer distances traveled by dFADs in the Indian Ocean than in the Atlantic Ocean lead to considerably higher skill scores in the former. This explains the model's relatively good ability to represent mean dFAD densities at the basin scale in both oceans, while at the same time indicating higher prediction skill for the movements of individual dFADs in the Indian Ocean than in the Atlantic Ocean. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
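The non-dimensional skill score mentioned in the abstract (separation distance normalized by dFAD displacement) can be sketched as follows. This is an illustration of the kind of index described, not necessarily the paper's exact formula, and the function name is invented:

```python
import numpy as np

def trajectory_skill(obs, sim):
    """Non-dimensional trajectory skill score: 1 minus the mean separation
    distance normalized by cumulative observed along-track displacement,
    clipped at zero.

    obs, sim: arrays of shape (T, 2) holding positions at matched times.
    """
    sep = np.linalg.norm(obs - sim, axis=1)               # model-data separation
    steps = np.linalg.norm(np.diff(obs, axis=0), axis=1)  # per-step displacements
    cum_disp = np.cumsum(steps)                           # cumulative displacement
    s = np.mean(sep[1:] / cum_disp)                       # skip t=0 (zero displacement)
    return max(0.0, 1.0 - s)
```

An identical simulated trajectory scores 1, while separations large relative to the distance the buoy actually traveled drive the score to 0; normalizing by displacement is what makes scores comparable between the fast-drifting Indian Ocean dFADs and the slower Atlantic ones.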
49. Does John Roberts Still Matter?
- Author
-
Kaplan, David A.
- Subjects
- *
LAWYERS , *JUDGES , *DISSENTERS - Abstract
The article focuses on Chief Justice John Glover Roberts Jr. Topics include Roberts's career as an American lawyer and jurist serving as Chief Justice of the United States; his appointment as a judge of the U.S. Court of Appeals for the D.C. Circuit by George W. Bush; and his two-year tenure on the D.C. Circuit, during which he authored 49 opinions, drew two dissents from other judges, and wrote three dissents of his own.
- Published
- 2020
50. IT'S TIME TO PACK THE COURT—AGAIN.
- Author
-
Kaplan, David A.
- Subjects
- *
NOMINATIONS for public office - Abstract
In the article, the author discusses the Republican Party's dominance of the U.S. Supreme Court. Topics include the game-theory experiment Axelrod organized as a computer tournament; the death of Ruth Bader Ginsburg and her replacement by Judge Amy Coney Barrett before Election Day; and Republican Senate Majority Leader Mitch McConnell's pledge regarding the replacement.
- Published
- 2020