37 results for "Coifman RR"
Search Results
2. Rapid fluctuations in functional connectivity of cortical networks encode spontaneous behavior.
- Author
Benisty H, Barson D, Moberly AH, Lohani S, Tang L, Coifman RR, Crair MC, Mishne G, Cardin JA, and Higley MJ
- Subjects
- Mice, Animals, Magnetic Resonance Imaging, Wakefulness, Brain Mapping methods, Neural Pathways physiology, Neurons physiology, Neocortex diagnostic imaging
- Abstract
Experimental work across species has demonstrated that spontaneously generated behaviors are robustly coupled to variations in neural activity within the cerebral cortex. Functional magnetic resonance imaging data suggest that temporal correlations in cortical networks vary across distinct behavioral states, providing for the dynamic reorganization of patterned activity. However, these data generally lack the temporal resolution to establish links between cortical signals and the continuously varying fluctuations in spontaneous behavior observed in awake animals. Here, we used wide-field mesoscopic calcium imaging to monitor cortical dynamics in awake mice and developed an approach to quantify rapidly time-varying functional connectivity. We show that spontaneous behaviors are represented by fast changes in both the magnitude and correlational structure of cortical network activity. Combining mesoscopic imaging with simultaneous cellular-resolution two-photon microscopy demonstrated that correlations among neighboring neurons and between local and large-scale networks also encode behavior. Finally, the dynamic functional connectivity of mesoscale signals revealed subnetworks not predicted by traditional anatomical atlas-based parcellation of the cortex. These results provide new insights into how behavioral information is represented across the neocortex and demonstrate an analytical framework for investigating time-varying functional connectivity in neural networks., (© 2023. The Author(s), under exclusive licence to Springer Nature America, Inc.)
- Published
- 2024
- Full Text
- View/download PDF
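The rapidly time-varying functional connectivity quantified in this study can be illustrated in a much-simplified sliding-window form. The sketch below is a generic illustration of windowed correlation structure, not the authors' mesoscopic-imaging pipeline; the toy two-channel signal and all parameter choices are assumptions.

```python
import numpy as np

def sliding_window_fc(signals, win, step=1):
    """Correlation matrix of multichannel signals in each sliding window.

    signals : (n_channels, n_timepoints) array
    Returns an array of shape (n_windows, n_channels, n_channels).
    """
    n_ch, n_t = signals.shape
    starts = range(0, n_t - win + 1, step)
    return np.stack([np.corrcoef(signals[:, s:s + win]) for s in starts])

# toy example: two channels that become anticorrelated halfway through
rng = np.random.default_rng(0)
t = rng.standard_normal(200)
x = np.vstack([t, np.concatenate([t[:100], -t[100:]])])
x += 0.01 * rng.standard_normal((2, 200))
fc = sliding_window_fc(x, win=50, step=25)
```

The time-resolved matrices `fc` expose the change in correlational structure that a single full-recording correlation would average away.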
3. Robust Estimation of Position-Dependent Anisotropic Diffusivity Tensors from Molecular Dynamics Trajectories.
- Author
Domingues TS, Coifman RR, and Haji-Akbari A
- Abstract
Confinement breaks translational and rotational symmetry in materials and makes all physical properties functions of position. Such spatial variations are key to modulating material properties at the nanoscale, and characterizing them accurately is therefore an intense area of research in the molecular simulations community. This is relatively easy to accomplish for basic mechanical observables. Determining spatial profiles of transport properties, such as diffusivity, is, however, much more challenging, as it requires calculating position-dependent autocorrelations of mechanical observables. In our previous paper (Domingues, T. S.; Coifman, R.; Haji-Akbari, A. J. Phys. Chem. B 2023, 127, 5273; DOI: 10.1021/acs.jpcb.3c00670), we analytically derived and numerically validated a set of filtered covariance estimators (FCEs) for quantifying spatial variations of the diffusivity tensor from stochastic trajectories. In this work, we adapt these estimators to extract diffusivity profiles from MD trajectories and validate them by applying them to a Lennard-Jones fluid within a slit pore. We find our MD-adapted estimator to exhibit the same qualitative features as its stochastic counterpart, as it accurately estimates the lateral diffusivity across the pore while systematically underestimating the normal diffusivity close to hard boundaries. We introduce a conceptually simple and numerically efficient correction scheme based on simulated annealing and diffusion maps to resolve the latter artifact and obtain normal diffusivity profiles that are consistent with the self-part of the van Hove correlation functions. Our findings demonstrate the potential of this MD-adapted estimator in accurately characterizing spatial variations of diffusivity in confined materials.
- Published
- 2023
- Full Text
- View/download PDF
4. Robust Estimation of Position-Dependent Anisotropic Diffusivity Tensors from Stochastic Trajectories.
- Author
Domingues TS, Coifman RR, and Haji-Akbari A
- Abstract
Materials under confinement can possess properties that deviate considerably from their bulk counterparts. Indeed, confinement makes all physical properties position-dependent and possibly anisotropic, and characterizing such spatial variations and directionality has been an intense area of focus in experimental and computational studies of confined matter. While this task is fairly straightforward for simple mechanical observables, it is far more daunting for transport properties such as diffusivity that can only be estimated from autocorrelations of mechanical observables. For instance, there are well established methods for estimating diffusivity from experimentally observed or computationally generated trajectories in bulk systems. No rigorous generalizations of such methods, however, exist for confined systems. In this work, we present two filtered covariance estimators for computing anisotropic and position-dependent diffusivity tensors and validate them by applying them to stochastic trajectories generated according to known diffusivity profiles. These estimators can accurately capture spatial variations that span over several orders of magnitude and that assume different functional forms. Our kernel-based approach is also very robust to implementation details such as the localization function and time discretization and performs significantly better than estimators that are solely based on local covariance. Moreover, the kernel function does not have to be localized and can instead belong to a dictionary of orthogonal functions. Therefore, the proposed estimator can be readily used to obtain functional estimates of diffusivity rather than a tabulated collection of pointwise estimates. Nonetheless, the susceptibility of the proposed estimators to time discretization is higher at the immediate vicinity of hard boundaries. We demonstrate this heightened susceptibility to be common among all covariance-based estimators.
- Published
- 2023
- Full Text
- View/download PDF
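The core idea behind the estimators in these two abstracts, estimating a position-dependent diffusivity from kernel-localized covariances of trajectory increments, can be sketched in one dimension. This is a simplified local-covariance illustration under stated assumptions (Gaussian localization kernel, constant true diffusivity), not the paper's filtered covariance estimator.

```python
import numpy as np

def local_diffusivity(traj, dt, x0, h):
    """Kernel-weighted local estimate of diffusivity at x0 from a 1D
    trajectory: D(x0) ~ <dx^2>_local / (2 dt), with Gaussian weights
    of bandwidth h centred at x0."""
    x, dx = traj[:-1], np.diff(traj)
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * dx**2) / (2.0 * dt * np.sum(w))

# synthetic Brownian trajectory with known, constant diffusivity
rng = np.random.default_rng(1)
D_true, dt, n = 0.5, 1e-3, 200_000
traj = np.cumsum(np.sqrt(2 * D_true * dt) * rng.standard_normal(n))
D_hat = local_diffusivity(traj, dt, x0=0.0, h=5.0)
```

For a genuinely position-dependent D(x), the same weighted average evaluated at many x0 values yields a diffusivity profile, which is the setting the papers analyze rigorously.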
5. Doubly Stochastic Normalization of the Gaussian Kernel Is Robust to Heteroskedastic Noise.
- Author
Landa B, Coifman RR, and Kluger Y
- Abstract
A fundamental step in many data-analysis techniques is the construction of an affinity matrix describing similarities between data points. When the data points reside in Euclidean space, a widespread approach is to form an affinity matrix using the Gaussian kernel with pairwise distances, and to follow it with a certain normalization (e.g., the row-stochastic normalization or its symmetric variant). We demonstrate that the doubly stochastic normalization of the Gaussian kernel with zero main diagonal (i.e., no self-loops) is robust to heteroskedastic noise. That is, the doubly stochastic normalization is advantageous in that it automatically accounts for observations with different noise variances. Specifically, we prove that in a suitable high-dimensional setting where heteroskedastic noise does not concentrate too much in any particular direction in space, the resulting (doubly stochastic) noisy affinity matrix converges to its clean counterpart with rate m^{-1/2}, where m is the ambient dimension. We demonstrate this result numerically, and show that, in contrast, the popular row-stochastic and symmetric normalizations behave unfavorably under heteroskedastic noise. Furthermore, we provide examples of simulated and experimental single-cell RNA sequence data with intrinsic heteroskedasticity, where the advantage of the doubly stochastic normalization for exploratory analysis is evident.
- Published
- 2021
- Full Text
- View/download PDF
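The object this paper analyzes, a doubly stochastic normalization of a zero-diagonal Gaussian kernel, can be computed in practice by Sinkhorn scaling. The sketch below is a minimal generic implementation; the median-distance bandwidth and the fixed iteration count are assumptions, not choices from the paper.

```python
import numpy as np

def doubly_stochastic(X, n_iter=500):
    """Doubly stochastic normalization of a zero-diagonal Gaussian kernel
    via Sinkhorn scaling: find r, c so that diag(r) K diag(c) has unit
    row and column sums."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / np.median(d2))   # Gaussian kernel, median bandwidth
    np.fill_diagonal(K, 0.0)          # no self-loops
    r = np.ones(len(X))
    c = np.ones(len(X))
    for _ in range(n_iter):           # alternate column/row rescalings
        c = 1.0 / (K @ r)
        r = 1.0 / (K @ c)
    return r[:, None] * K * c[None, :]

rng = np.random.default_rng(0)
W = doubly_stochastic(rng.standard_normal((40, 3)))
```

After convergence every row and column of `W` sums to one, which is the normalization whose robustness to heteroskedastic noise the paper establishes.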
6. Local conformal autoencoder for standardized data coordinates.
- Author
Peterfreund E, Lindenbaum O, Dietrich F, Bertalan T, Gavish M, Kevrekidis IG, and Coifman RR
- Subjects
- Reference Standards, Algorithms, Data Analysis
- Abstract
We propose a local conformal autoencoder (LOCA) for standardized data coordinates. LOCA is a deep learning-based method for obtaining standardized data coordinates from scientific measurements. Data observations are modeled as samples from an unknown, nonlinear deformation of an underlying Riemannian manifold, which is parametrized by a few normalized, latent variables. We assume a repeated measurement sampling strategy, common in scientific measurements, and present a method for learning an embedding in [Formula: see text] that is isometric to the latent variables of the manifold. The coordinates recovered by our method are invariant to diffeomorphisms of the manifold, making it possible to match between different instrumental observations of the same phenomenon. Our embedding is obtained using LOCA, which is an algorithm that learns to rectify deformations by using a local z-scoring procedure, while preserving relevant geometric information. We demonstrate the isometric embedding properties of LOCA in various model settings and observe that it exhibits promising interpolation and extrapolation capabilities, superior to the current state of the art. Finally, we demonstrate LOCA's efficacy in single-site Wi-Fi localization data and for the reconstruction of three-dimensional curved surfaces from two-dimensional projections., Competing Interests: The authors declare no competing interest., (Copyright © 2020 the Author(s). Published by PNAS.)
- Published
- 2020
- Full Text
- View/download PDF
7. Two-sample statistics based on anisotropic kernels.
- Author
Cheng X, Cloninger A, and Coifman RR
- Abstract
The paper introduces a new kernel-based Maximum Mean Discrepancy (MMD) statistic for measuring the distance between two distributions given finitely many multivariate samples. When the distributions are locally low-dimensional, the proposed test can be made more powerful to distinguish certain alternatives by incorporating local covariance matrices and constructing an anisotropic kernel. The kernel matrix is asymmetric; it computes the affinity between [Formula: see text] data points and a set of [Formula: see text] reference points, where [Formula: see text] can be drastically smaller than [Formula: see text]. While the proposed statistic can be viewed as a special class of Reproducing Kernel Hilbert Space MMD, the consistency of the test is proved, under mild assumptions of the kernel, as long as [Formula: see text], and a finite-sample lower bound of the testing power is obtained. Applications to flow cytometry and diffusion MRI datasets are demonstrated, which motivate the proposed approach to compare distributions., (© The Author(s) 2019. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.)
- Published
- 2020
- Full Text
- View/download PDF
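For context, the standard (isotropic) kernel MMD that this paper generalizes can be written in a few lines. This is a baseline sketch with a plain Gaussian kernel and an arbitrary bandwidth, not the anisotropic, asymmetric-reference-set construction the paper proposes.

```python
import numpy as np

def mmd2(X, Y, sigma=1.0):
    """Unbiased estimate of the squared Maximum Mean Discrepancy between
    samples X and Y, using an isotropic Gaussian kernel of width sigma."""
    def gram(A, B):
        d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
        return np.exp(-d2 / (2.0 * sigma**2))
    n, m = len(X), len(Y)
    Kxx, Kyy, Kxy = gram(X, X), gram(Y, Y), gram(X, Y)
    return ((Kxx.sum() - np.trace(Kxx)) / (n * (n - 1))
            + (Kyy.sum() - np.trace(Kyy)) / (m * (m - 1))
            - 2.0 * Kxy.mean())

rng = np.random.default_rng(2)
same = mmd2(rng.standard_normal((300, 2)), rng.standard_normal((300, 2)))
diff = mmd2(rng.standard_normal((300, 2)), rng.standard_normal((300, 2)) + 1.0)
```

The statistic is near zero for two samples from the same distribution and clearly positive for shifted samples; the paper's contribution is sharpening this test with local covariance information when the distributions are locally low-dimensional.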
8. Author Correction: Visualizing structure and transitions in high-dimensional biological data.
- Author
Moon KR, van Dijk D, Wang Z, Gigante S, Burkhardt DB, Chen WS, Yim K, van den Elzen A, Hirn MJ, Coifman RR, Ivanova NB, Wolf G, and Krishnaswamy S
- Abstract
An amendment to this paper has been published and can be accessed via a link at the top of the paper.
- Published
- 2020
- Full Text
- View/download PDF
9. Visualizing structure and transitions in high-dimensional biological data.
- Author
Moon KR, van Dijk D, Wang Z, Gigante S, Burkhardt DB, Chen WS, Yim K, Elzen AVD, Hirn MJ, Coifman RR, Ivanova NB, Wolf G, and Krishnaswamy S
- Subjects
- Algorithms, Animals, Big Data, Cell Differentiation, Cells, Cultured, Computer Simulation, Databases, Genetic, Gastrointestinal Microbiome, Humans, Mice, Sequence Analysis, RNA, Single-Cell Analysis, Genomics methods, High-Throughput Screening Assays methods, Image Processing, Computer-Assisted methods
- Abstract
The high-dimensional data created by high-throughput technologies require visualization tools that reveal data structure and patterns in an intuitive form. We present PHATE, a visualization method that captures both local and global nonlinear structure using an information-geometric distance between data points. We compare PHATE to other tools on a variety of artificial and biological datasets, and find that it consistently preserves a range of patterns in data, including continual progressions, branches and clusters, better than other tools. We define a manifold preservation metric, which we call denoised embedding manifold preservation (DEMaP), and show that PHATE produces lower-dimensional embeddings that are quantitatively better denoised as compared to existing visualization methods. An analysis of a newly generated single-cell RNA sequencing dataset on human germ-layer differentiation demonstrates how PHATE reveals unique biological insight into the main developmental branches, including identification of three previously undescribed subpopulations. We also show that PHATE is applicable to a wide variety of data types, including mass cytometry, single-cell RNA sequencing, Hi-C and gut microbiome data.
- Published
- 2019
- Full Text
- View/download PDF
10. A REMARK ON THE ARCSINE DISTRIBUTION AND THE HILBERT TRANSFORM.
- Author
Coifman RR and Steinerberger S
- Abstract
It is known that if (p_n)_{n∈ℕ} is a sequence of orthogonal polynomials in L²([-1,1], w(x) dx), then the roots are distributed according to the arcsine distribution π⁻¹(1-x²)^{-1/2} dx for a wide variety of weights w(x). We connect this to a result on the Hilbert transform due to Tricomi: if f(x)(1-x²)^{1/4} ∈ L²(-1,1) and its Hilbert transform Hf vanishes on (-1,1), then the function f is a multiple of the arcsine distribution, f(x) = c(1-x²)^{-1/2} χ_{(-1,1)}, where c ∈ ℝ. We also prove a localized Parseval-type identity that seems to be new: if f(x)(1-x²)^{1/4} ∈ L²(-1,1) and f(x)(1-x²)^{1/2} has mean value 0 on (-1,1), then ∫₋₁¹ (Hf)(x)² (1-x²)^{1/2} dx = ∫₋₁¹ f(x)² (1-x²)^{1/2} dx.
- Published
- 2019
- Full Text
- View/download PDF
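The root-distribution statement in this abstract can be checked numerically in the simplest case w(x) = 1, where the orthogonal polynomials are the Legendre polynomials and their roots are the Gauss-Legendre nodes. The sketch compares the empirical CDF of the roots to the arcsine CDF, F(x) = 1/2 + arcsin(x)/π; the degree n = 400 is an arbitrary choice.

```python
import numpy as np

# roots of the degree-n Legendre polynomial (orthogonal for w(x) = 1)
n = 400
nodes, _ = np.polynomial.legendre.leggauss(n)
nodes.sort()

# empirical CDF of the roots vs. the arcsine CDF  1/2 + arcsin(x)/pi
ecdf = (np.arange(1, n + 1) - 0.5) / n
arcsine_cdf = 0.5 + np.arcsin(nodes) / np.pi
max_dev = np.max(np.abs(ecdf - arcsine_cdf))
```

The maximum deviation shrinks as the degree grows, consistent with the arcsine limit law for the root distribution.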
11. Data-Driven Tree Transforms and Metrics.
- Author
Mishne G, Talmon R, Cohen I, Coifman RR, and Kluger Y
- Abstract
We consider the analysis of high dimensional data given in the form of a matrix with columns consisting of observations and rows consisting of features. Often the data is such that the observations do not reside on a regular grid, and the given order of the features is arbitrary and does not convey a notion of locality. Therefore, traditional transforms and metrics cannot be used for data organization and analysis. In this paper, our goal is to organize the data by defining an appropriate representation and metric such that they respect the smoothness and structure underlying the data. We also aim to generalize the joint clustering of observations and features in the case the data does not fall into clear disjoint groups. For this purpose, we propose multiscale data-driven transforms and metrics based on trees. Their construction is implemented in an iterative refinement procedure that exploits the co-dependencies between features and observations. Beyond the organization of a single dataset, our approach enables us to transfer the organization learned from one dataset to another and to integrate several datasets together. We present an application to breast cancer gene expression analysis: learning metrics on the genes to cluster the tumor samples into cancer sub-types and validating the joint organization of both the genes and the samples. We demonstrate that using our approach to combine information from multiple gene expression cohorts, acquired by different profiling technologies, improves the clustering of tumor samples.
- Published
- 2018
- Full Text
- View/download PDF
12. Reconstruction of normal forms by learning informed observation geometries from data.
- Author
Yair O, Talmon R, Coifman RR, and Kevrekidis IG
- Abstract
The discovery of physical laws consistent with empirical observations is at the heart of (applied) science and engineering. These laws typically take the form of nonlinear differential equations depending on parameters; dynamical systems theory provides, through the appropriate normal forms, an "intrinsic" prototypical characterization of the types of dynamical regimes accessible to a given model. Using an implementation of data-informed geometry learning, we directly reconstruct the relevant "normal forms": a quantitative mapping from empirical observations to prototypical realizations of the underlying dynamics. Interestingly, the state variables and the parameters of these realizations are inferred from the empirical observations; without prior knowledge or understanding, they parametrize the dynamics intrinsically without explicit reference to fundamental physical quantities., Competing Interests: The authors declare no conflict of interest.
- Published
- 2017
- Full Text
- View/download PDF
13. Intrinsic map dynamics exploration for uncharted effective free-energy landscapes.
- Author
Chiavazzo E, Covino R, Coifman RR, Gear CW, Georgiou AS, Hummer G, and Kevrekidis IG
- Abstract
We describe and implement a computer-assisted approach for accelerating the exploration of uncharted effective free-energy surfaces (FESs). More generally, the aim is the extraction of coarse-grained, macroscopic information from stochastic or atomistic simulations, such as molecular dynamics (MD). The approach functionally links the MD simulator with nonlinear manifold learning techniques. The added value comes from biasing the simulator toward unexplored phase-space regions by exploiting the smoothness of the gradually revealed intrinsic low-dimensional geometry of the FES., Competing Interests: The authors declare no conflict of interest.
- Published
- 2017
- Full Text
- View/download PDF
14. Heterogeneity in Early Responses in ALLHAT (Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial).
- Author
Dhruva SS, Huang C, Spatz ES, Coppi AC, Warner F, Li SX, Lin H, Xu X, Furberg CD, Davis BR, Pressel SL, Coifman RR, and Krumholz HM
- Subjects
- Aged, Analysis of Variance, Antihypertensive Agents administration & dosage, Antihypertensive Agents adverse effects, Blood Pressure drug effects, Cardiovascular Diseases etiology, Drug Monitoring methods, Female, Humans, Hypolipidemic Agents therapeutic use, Male, Middle Aged, Treatment Outcome, Amlodipine administration & dosage, Amlodipine adverse effects, Cardiovascular Diseases prevention & control, Chlorthalidone administration & dosage, Chlorthalidone adverse effects, Doxazosin administration & dosage, Doxazosin adverse effects, Hyperlipidemias complications, Hyperlipidemias diagnosis, Hyperlipidemias drug therapy, Hypertension complications, Hypertension diagnosis, Hypertension drug therapy, Lisinopril administration & dosage, Lisinopril adverse effects
- Abstract
Randomized trials of hypertension have seldom examined heterogeneity in response to treatments over time and the implications for cardiovascular outcomes. Understanding this heterogeneity, however, is a necessary step toward personalizing antihypertensive therapy. We applied trajectory-based modeling to data on 39 763 study participants of the ALLHAT (Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial) to identify distinct patterns of systolic blood pressure (SBP) response to randomized medications during the first 6 months of the trial. Two trajectory patterns were identified: immediate responders (85.5%), on average, had a decreasing SBP, whereas nonimmediate responders (14.5%), on average, had an initially increasing SBP followed by a decrease. Compared with those randomized to chlorthalidone, participants randomized to amlodipine (odds ratio, 1.20; 95% confidence interval [CI], 1.10-1.31), lisinopril (odds ratio, 1.88; 95% CI, 1.73-2.03), and doxazosin (odds ratio, 1.65; 95% CI, 1.52-1.78) had higher adjusted odds ratios associated with being a nonimmediate responder (versus immediate responder). After multivariable adjustment, nonimmediate responders had a higher hazard ratio of stroke (hazard ratio, 1.49; 95% CI, 1.21-1.84), combined cardiovascular disease (hazard ratio, 1.21; 95% CI, 1.11-1.31), and heart failure (hazard ratio, 1.48; 95% CI, 1.24-1.78) during follow-up between 6 months and 2 years. The SBP response trajectories provided superior discrimination for predicting downstream adverse cardiovascular events than classification based on difference in SBP between the first 2 measurements, SBP at 6 months, and average SBP during the first 6 months. Our findings demonstrate heterogeneity in response to antihypertensive therapies and show that chlorthalidone is associated with more favorable initial response than the other medications., (© 2017 American Heart Association, Inc.)
- Published
- 2017
- Full Text
- View/download PDF
15. Describing the performance of U.S. hospitals by applying big data analytics.
- Author
Downing NS, Cloninger A, Venkatesh AK, Hsieh A, Drye EE, Coifman RR, and Krumholz HM
- Subjects
- Centers for Medicare and Medicaid Services, U.S., United States, Hospital Administration
- Abstract
Public reporting of measures of hospital performance is an important component of quality improvement efforts in many countries. However, it can be challenging to provide an overall characterization of hospital performance because there are many measures of quality. In the United States, the Centers for Medicare and Medicaid Services reports over 100 measures that describe various domains of hospital quality, such as outcomes, the patient experience and whether established processes of care are followed. Although individual quality measures provide important insight, it is challenging to understand hospital performance as characterized by multiple quality measures. Accordingly, we developed a novel approach for characterizing hospital performance that highlights the similarities and differences between hospitals and identifies common patterns of hospital performance. Specifically, we built a semi-supervised machine learning algorithm and applied it to the publicly-available quality measures for 1,614 U.S. hospitals to graphically and quantitatively characterize hospital performance. In the resulting visualization, the varying density of hospitals demonstrates that there are key clusters of hospitals that share specific performance profiles, while there are other performance profiles that are rare. Several popular hospital rating systems aggregate some of the quality measures included in our study to produce a composite score; however, hospitals that were top-ranked by such systems were scattered across our visualization, indicating that these top-ranked hospitals actually excel in many different ways. Our application of a novel graph analytics method to data describing U.S. hospitals revealed nuanced differences in performance that are obscured in existing hospital rating systems.
- Published
- 2017
- Full Text
- View/download PDF
16. Reply to about the electrophysiological basis of resting state networks.
- Author
Duncan D, Duckrow RB, Pincus SM, Goncharova I, Hirsch LJ, Spencer DD, Coifman RR, and Zaveri HP
- Subjects
- Female, Humans, Male, Electroencephalography methods, Epilepsy, Frontal Lobe diagnosis, Epilepsy, Frontal Lobe physiopathology, Gyrus Cinguli physiopathology, Nerve Net physiopathology, Parietal Lobe physiopathology
- Published
- 2014
- Full Text
- View/download PDF
17. Diffusion methods for aligning medical datasets: location prediction in CT scan images.
- Author
Fernández Á, Rabin N, Coifman RR, and Eckstein J
- Subjects
- Algorithms, Anisotropy, Diffusion, Humans, Patient Positioning, Principal Component Analysis, Radiographic Image Interpretation, Computer-Assisted, Reproducibility of Results, Tomography, X-Ray Computed methods
- Abstract
The purpose of this study is to introduce diffusion methods as a tool for labeling CT scan images according to their position in the human body. We carry out a comparative study of different methods based on a k-NN search and propose a new, simple, and efficient way of applying diffusion techniques that gives better location forecasts than current state-of-the-art methods., (Copyright © 2014 Elsevier B.V. All rights reserved.)
- Published
- 2014
- Full Text
- View/download PDF
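The k-NN-search baseline that this study compares against can be sketched as nearest-neighbour regression of slice position on slice features. Everything below is a toy stand-in: the smooth two-dimensional "slice features" and all parameters are assumptions, not the paper's CT descriptors.

```python
import numpy as np

def knn_position(train_feats, train_pos, query, k=5):
    """Predict an axial position for a query slice as the mean position
    of its k nearest neighbours in feature space."""
    d = np.linalg.norm(train_feats - query, axis=1)
    return train_pos[np.argsort(d)[:k]].mean()

# toy "slice features" that vary smoothly with position along the body
rng = np.random.default_rng(3)
pos = np.linspace(0.0, 1.0, 500)
feats = np.column_stack([np.sin(3 * pos), np.cos(3 * pos)])
feats += 0.01 * rng.standard_normal(feats.shape)

# query a slice whose true position is 0.42
pred = knn_position(feats, pos, np.array([np.sin(1.26), np.cos(1.26)]))
```

Diffusion methods replace the raw Euclidean distance used here with a diffusion distance on the feature graph, which is what improves the forecasts in the paper.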
18. Low dimensional manifold embedding for scattering coefficients of intrapartum fetal heart rate variability.
- Author
Chudacek V, Talmon R, Anden J, Mallat S, Coifman RR, Abry P, and Doret M
- Subjects
- Female, Humans, Pregnancy, Time Factors, Algorithms, Heart Rate, Fetal physiology
- Abstract
Intrapartum fetal surveillance in clinical practice aims to reduce neonatal morbidity through early detection of fetal acidosis. It is the subject of ongoing research studies attempting notably to improve detection performance by reducing the false-positive rate. In that context, the present contribution tailors a graph-based dimensionality reduction procedure, performed on scattering coefficients, to fetal heart rate variability analysis. Applied to a high-quality, well-documented database constituted by obstetricians from a French academic hospital, the low-dimensional embedding makes it possible to distinguish between the temporal dynamics of healthy and acidotic fetuses, as well as to achieve satisfactory detection performance compared to that obtained by the clinical-benchmark FIGO criteria.
- Published
- 2014
- Full Text
- View/download PDF
19. Nonlinear intrinsic variables and state reconstruction in multiscale simulations.
- Author
Dsilva CJ, Talmon R, Rabin N, Coifman RR, and Kevrekidis IG
- Abstract
Finding informative low-dimensional descriptions of high-dimensional simulation data (like the ones arising in molecular dynamics or kinetic Monte Carlo simulations of physical and chemical processes) is crucial to understanding physical phenomena, and can also dramatically assist in accelerating the simulations themselves. In this paper, we discuss and illustrate the use of nonlinear intrinsic variables (NIV) in the mining of high-dimensional multiscale simulation data. In particular, we focus on the way NIV allows us to functionally merge different simulation ensembles, and different partial observations of these ensembles, as well as to infer variables not explicitly measured. The approach relies on certain simple features of the underlying process variability to filter out measurement noise and systematically recover a unique reference coordinate frame. We illustrate the approach through two distinct sets of atomistic simulations: a stochastic simulation of an enzyme reaction network exhibiting both fast and slow time scales, and a molecular dynamics simulation of alanine dipeptide in explicit water.
- Published
- 2013
- Full Text
- View/download PDF
20. Intracranial EEG evaluation of relationship within a resting state network.
- Author
Duncan D, Duckrow RB, Pincus SM, Goncharova I, Hirsch LJ, Spencer DD, Coifman RR, and Zaveri HP
- Subjects
- Adolescent, Adult, Brain Mapping methods, Child, Female, Humans, Magnetic Resonance Imaging, Male, Young Adult, Electroencephalography methods, Epilepsy, Frontal Lobe diagnosis, Epilepsy, Frontal Lobe physiopathology, Gyrus Cinguli physiopathology, Nerve Net physiopathology, Parietal Lobe physiopathology
- Abstract
Objective: We tested if a relationship between distant parts of the default mode network (DMN), a resting state network defined by fMRI studies, can be observed with intracranial EEG recorded from patients with localization-related epilepsy., Methods: Magnitude squared coherence, mutual information, cross-approximate entropy, and the coherence of the gamma power time-series were estimated, for one hour intracranial EEG recordings of background activity from 9 patients, to evaluate the relationship between two test areas which were within the DMN (anterior cingulate and orbital frontal, denoted as T1 and posterior cingulate and mesial parietal, denoted as T2), and one control area (denoted as C), which was outside the DMN. We tested if the relationship between T1 and T2 was stronger than the relationship between each of these areas and C., Results: A low level of relationship was observed among the 3 areas tested. The relationships among T1, T2 and C did not demonstrate support for the DMN., Conclusions: This study suggests a lack of intracranial EEG support for the fMRI defined default mode network., Significance: The results obtained underscore the considerable difference between electrophysiological and hemodynamic measurements of brain activity and possibly suggest a lack of neuronal involvement in the DMN., (Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.)
- Published
- 2013
- Full Text
- View/download PDF
21. Empirical intrinsic geometry for nonlinear modeling and time series filtering.
- Author
Talmon R and Coifman RR
- Abstract
In this paper, we present a method for time series analysis based on empirical intrinsic geometry (EIG). EIG enables one to reveal the low-dimensional parametric manifold as well as to infer the underlying dynamics of high-dimensional time series. By incorporating concepts of information geometry, this method extends existing geometric analysis tools to support stochastic settings and parametrizes the geometry of empirical distributions. However, the statistical models are not required as priors; hence, EIG may be applied to a wide range of real signals without existing definitive models. We show that the inferred model is noise-resilient and invariant under different observation and instrumental modalities. In addition, we show that it can be extended efficiently to newly acquired measurements in a sequential manner. These two advantages enable us to revisit the Bayesian approach and incorporate empirical dynamics and intrinsic geometry into a nonlinear filtering framework. We show applications to nonlinear and non-Gaussian tracking problems as well as to acoustic signal localization.
- Published
- 2013
- Full Text
- View/download PDF
22. Identifying preseizure state in intracranial EEG data using diffusion kernels.
- Author
Duncan D, Talmon R, Zaveri HP, and Coifman RR
- Subjects
- Algorithms, Brain pathology, Brain physiopathology, Brain surgery, Brain Mapping methods, Brain Mapping statistics & numerical data, Electroencephalography methods, Epilepsy diagnosis, Epilepsy physiopathology, Epilepsy surgery, Humans, Mathematical Concepts, Models, Neurological, Population Dynamics, Seizures physiopathology, Systems Biology, Electroencephalography statistics & numerical data, Seizures diagnosis
- Abstract
The goal of this study is to identify preseizure changes in intracranial EEG (icEEG). A novel approach based on the recently developed diffusion map framework, which is considered to be one of the leading manifold learning methods, is proposed. Diffusion mapping provides dimensionality reduction of the data as well as pattern recognition that can be used to distinguish different states of the patient, for example, interictal and preseizure. A new algorithm, which is an extension of diffusion maps, is developed to construct coordinates that generate efficient geometric representations of the complex structures in the icEEG data. In addition, this method is adapted to the icEEG data and enables the extraction of the underlying brain activity. The algorithm is tested on icEEG data recorded from several electrode contacts from a patient being evaluated for possible epilepsy surgery at the Yale-New Haven Hospital. Numerical results show that the proposed approach provides a distinction between interictal and preseizure states.
- Published
- 2013
- Full Text
- View/download PDF
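The diffusion map framework underlying this study can be sketched in its basic form: build a Gaussian affinity, normalize it into a row-stochastic Markov matrix, and embed the data with the leading non-trivial eigenvectors. This is a generic textbook sketch (with an assumed median-distance bandwidth), not the extended algorithm the paper develops for icEEG.

```python
import numpy as np

def diffusion_map(X, n_coords=2):
    """Minimal diffusion-map embedding of the rows of X."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / np.median(d2))           # Gaussian affinity
    P = K / K.sum(axis=1, keepdims=True)      # row-stochastic Markov matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    # drop the trivial eigenvalue 1 / constant eigenvector
    idx = order[1:n_coords + 1]
    return vecs.real[:, idx] * vals.real[idx]

# two well-separated point clouds: the first diffusion coordinate
# should separate the two states
rng = np.random.default_rng(4)
X = np.vstack([rng.standard_normal((20, 2)) * 0.1,
               rng.standard_normal((20, 2)) * 0.1 + 5.0])
emb = diffusion_map(X)
```

In the same way, the embedding coordinates distinguish clusters of icEEG feature vectors, which is the mechanism the study uses to separate interictal from preseizure states.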
23. The automated malnutrition assessment.
- Author
-
David G, Bernstein LH, and Coifman RR
- Subjects
- Adult, Databases, Factual, Diagnosis, Differential, Humans, Prospective Studies, Risk Assessment, Algorithms, Malnutrition diagnosis, Nutrition Assessment
- Abstract
Objective: We propose an automated nutritional assessment algorithm that provides a method for malnutrition risk prediction with high accuracy and reliability., Methods: The database used for this study was a file of 432 patients, where each patient was described by 4 laboratory parameters and 11 clinical parameters. A malnutrition risk assessment of low (1), moderate (2), or high (3) was assigned by a dietitian for each patient. An algorithm for data organization and classification using characteristic metrics was developed. For each patient, the algorithm characterized a unique profile and built a characteristic metric to identify similar patients, who were then mapped into a classification., Results: The algorithm assigned a malnutrition risk level for different training sizes taken from the data. Our method resulted in average errors (distance between the automated score and the real score) of 0.386, 0.3507, 0.3454, 0.34, and 0.2907 for the 10%, 30%, 50%, 70%, and 90% training sizes, respectively. Our method outperformed the compared method even when it used a smaller training set. In addition, we showed that the laboratory parameters alone were sufficient for the automated risk prediction and organized the patients into clusters that corresponded to low-, low-moderate-, moderate-, moderate-high-, and high-risk areas. The organization and visualization methods provided a tool for the exploration and navigation of the data points., Conclusion: The problem of rapidly identifying the risk and severity of malnutrition is crucial for minimizing medical and surgical complications, yet such assessments are not easily performed or expedited. We characterized a unique profile for each patient and mapped similar patients into a classification. 
We also found that the laboratory parameters were sufficient for the automated risk prediction., (Copyright © 2013 Elsevier Inc. All rights reserved.)
- Published
- 2013
- Full Text
- View/download PDF
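A minimal sketch of the similarity-based scoring idea: find the patients most similar to a query and average their dietitian-assigned risk. This uses plain standardized nearest neighbors, not the paper's characteristic-metric construction, and the data in the usage example are synthetic.

```python
import numpy as np

def knn_risk(train_X, train_y, query, k=5):
    """Toy nearest-neighbor risk scorer: locate the k most similar
    patients and average their assigned risk level (1-3)."""
    # Standardize features so each laboratory parameter contributes
    # on a comparable scale to the distance.
    mu, sd = train_X.mean(axis=0), train_X.std(axis=0) + 1e-12
    Xs, qs = (train_X - mu) / sd, (query - mu) / sd
    d = np.linalg.norm(Xs - qs, axis=1)
    nearest = np.argsort(d)[:k]
    return float(train_y[nearest].mean())

# Synthetic illustration: two well-separated patient groups.
train_X = np.vstack([np.zeros((10, 4)), 10 * np.ones((10, 4))])
train_y = np.array([1] * 10 + [3] * 10)
```

A query near the first group is scored 1.0 and a query near the second 3.0; intermediate queries receive fractional scores, which mirrors the paper's observation that patients organize into low-, intermediate-, and high-risk regions.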
24. Reference Free Structure Determination through Eigenvectors of Center of Mass Operators.
- Author
-
Coifman RR, Shkolnisky Y, Sigworth FJ, and Singer A
- Abstract
Recovering the three-dimensional structure of molecules is important for understanding their functionality. We describe a spectral graph algorithm for reconstructing the three-dimensional structure of molecules from their cryo-electron microscopy images taken at random unknown orientations. We first identify a one-to-one correspondence between radial lines in three-dimensional Fourier space of the molecule and points on the unit sphere. The problem is then reduced to determining the coordinates of points on the sphere given a subset of their pairwise geodesic distances. To recover those coordinates, we exploit the special geometry of the problem, as rendered by the Fourier projection-slice theorem, to construct a weighted graph whose vertices are the radial Fourier lines and whose edges are linked using the common line property. The graph organizes the radial lines on the sphere in a global manner that reveals the acquisition direction of each image. This organization is derived from a global computation of a few eigenvectors of the graph's sparse adjacency matrix. Once the directions are obtained, the molecule can be reconstructed using classical tomography methods. The presented algorithm is direct (as opposed to iterative refinement schemes), does not require any prior model for the reconstructed object, and is shown to have favorable computational and numerical properties. Moreover, the algorithm does not impose any assumption on the distribution of the projection orientations. Physically, this means that the algorithm is applicable to molecules that have unknown spatial preference.
- Published
- 2010
- Full Text
- View/download PDF
25. Detecting consistent common lines in cryo-EM by voting.
- Author
-
Singer A, Coifman RR, Sigworth FJ, Chester DW, and Shkolnisky Y
- Subjects
- Algorithms, Bayes Theorem, Fourier Analysis, Cryoelectron Microscopy methods, Image Processing, Computer-Assisted methods
- Abstract
The single-particle reconstruction problem of electron cryo-microscopy (cryo-EM) is to find the three-dimensional structure of a macromolecule given its two-dimensional noisy projection images at unknown random directions. Ab initio estimates of the 3D structure are often obtained by the "Angular Reconstitution" method, in which a coordinate system is established from three projections, and the orientation of the particle giving rise to each image is deduced from common lines among the images. However, a reliable detection of common lines is difficult due to the low signal-to-noise ratio of the images. In this paper we describe a global self-correcting voting procedure in which all projection images participate to decide the identity of the consistent common lines. The algorithm determines which common line pairs were detected correctly and which are spurious. We show that the voting procedure succeeds at relatively low detection rates and that its performance improves as the number of projection images increases. We demonstrate the algorithm for both simulative and experimental images of the 50S ribosomal subunit., ((c) 2009 Elsevier Inc. All rights reserved.)
- Published
- 2010
- Full Text
- View/download PDF
26. Detecting intrinsic slow variables in stochastic dynamical systems by anisotropic diffusion maps.
- Author
-
Singer A, Erban R, Kevrekidis IG, and Coifman RR
- Subjects
- Anisotropy, Computer Simulation, Markov Chains, Models, Chemical, Nonlinear Dynamics, Time Factors, Algorithms, Principal Component Analysis, Stochastic Processes
- Abstract
Nonlinear independent component analysis is combined with diffusion-map data analysis techniques to detect good observables in high-dimensional dynamic data. This is achieved by integrating local principal component analysis of simulation bursts with the eigenvectors of a Markov matrix describing anisotropic diffusion. The widely applicable procedure, a crucial step in model reduction approaches, is illustrated on stochastic chemical reaction network simulations.
- Published
- 2009
- Full Text
- View/download PDF
27. Graph Laplacian tomography from unknown random projections.
- Author
-
Coifman RR, Shkolnisky Y, Sigworth FJ, and Singer A
- Subjects
- Computer Simulation, Models, Statistical, Reproducibility of Results, Sensitivity and Specificity, Algorithms, Image Enhancement methods, Image Interpretation, Computer-Assisted methods, Tomography, Optical methods
- Abstract
We introduce a graph Laplacian-based algorithm for the tomographic reconstruction of a planar object from its projections taken at random unknown directions. A Laplace-type operator is constructed on the data set of projections, and the eigenvectors of this operator reveal the projection orientations. The algorithm is shown to successfully reconstruct the Shepp-Logan phantom from its noisy projections. Such a reconstruction algorithm is desirable for the structuring of certain biological proteins using cryo-electron microscopy.
- Published
- 2008
- Full Text
- View/download PDF
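The mechanism described above, that eigenvectors of a Laplace-type operator on the projection set reveal the unknown orientations, can be sketched on synthetic data. The snippet is illustrative only: the "projections" here are a hypothetical smooth high-dimensional function of an unknown angle, not actual tomographic projections, and the construction is a plain unnormalized graph Laplacian.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for projection data: each row is a smooth curve whose
# only hidden parameter is an unknown angle theta_i on the circle.
theta = np.sort(rng.uniform(0, 2 * np.pi, 200))
s = np.linspace(0, 1, 30)
proj = np.cos(theta[:, None] + 2 * np.pi * s[None, :])  # 200 x 30

# Graph Laplacian L = D - W on the set of "projections".
d2 = ((proj[:, None, :] - proj[None, :, :]) ** 2).sum(-1)
W = np.exp(-d2 / d2.mean())
L = np.diag(W.sum(axis=1)) - W

# Eigenvectors of the symmetric L, in ascending eigenvalue order:
# the constant eigenvector comes first (eigenvalue ~ 0); the next
# two parametrize the circle of hidden orientations.
vals, vecs = np.linalg.eigh(L)
angle_est = np.arctan2(vecs[:, 2], vecs[:, 1])
```

Sorting the data by `angle_est` recovers the angular ordering of the hidden orientations up to a global rotation and reflection, which is the ambiguity the abstract's reconstruction step must tolerate.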
28. Data fusion and multicue data matching by diffusion maps.
- Author
-
Lafon S, Keller Y, and Coifman RR
- Subjects
- Cluster Analysis, Image Enhancement methods, Algorithms, Artificial Intelligence, Databases, Factual, Image Interpretation, Computer-Assisted methods, Information Storage and Retrieval methods, Pattern Recognition, Automated methods, Subtraction Technique
- Abstract
Data fusion and multicue data matching are fundamental tasks of high-dimensional data analysis. In this paper, we apply the recently introduced diffusion framework to address these tasks. Our contribution is three-fold: First, we present the Laplace-Beltrami approach for computing density-invariant embeddings, which are essential for integrating different sources of data. Second, we describe a refinement of the Nyström extension algorithm called "geometric harmonics." We also explain how to use this tool for data assimilation. Finally, we introduce a multicue data matching scheme based on nonlinear spectral graph alignment. The effectiveness of the presented schemes is validated by applying them to the problems of lipreading and image-sequence alignment.
- Published
- 2006
- Full Text
- View/download PDF
29. Geometric diffusions for the analysis of data from sensor networks.
- Author
-
Coifman RR, Maggioni M, Zucker SW, and Kevrekidis IG
- Subjects
- Algorithms, Models, Neurological, Models, Theoretical, Neural Networks, Computer
- Abstract
Harmonic analysis on manifolds and graphs has recently led to mathematical developments in the field of data analysis. The resulting new tools can be used to compress and analyze large and complex data sets, such as those derived from sensor networks or neuronal activity datasets, obtained in the laboratory or through computer modeling. The nature of the algorithms (based on diffusion maps and connectivity strengths on graphs) possesses a certain analogy with neural information processing, and has the potential to provide inspiration for modeling and understanding biological organization in perception and memory formation.
- Published
- 2005
- Full Text
- View/download PDF
30. Geometric diffusions as a tool for harmonic analysis and structure definition of data: multiscale methods.
- Author
-
Coifman RR, Lafon S, Lee AB, Maggioni M, Nadler B, Warner F, and Zucker SW
- Abstract
In the companion article, a framework for structural multiscale geometric organization of subsets of R^n and of graphs was introduced. Here, diffusion semigroups are used to generate multiscale analyses in order to organize and represent complex structures. We emphasize the multiscale nature of these problems and build scaling functions of Markov matrices (describing local transitions) that lead to macroscopic descriptions at different scales. The process of iterating or diffusing the Markov matrix is seen as a generalization of some aspects of the Newtonian paradigm, in which local infinitesimal transitions of a system lead to global macroscopic descriptions by integration. This article deals with the construction of fast order-N algorithms for data representation and for homogenization of heterogeneous structures.
- Published
- 2005
- Full Text
- View/download PDF
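The "iterating the Markov matrix" picture in this abstract can be made concrete with a toy graph. The example below is an invented six-node graph of two weakly coupled clusters: a few diffusion steps homogenize within a cluster (fine scale), while many steps approach the global stationary distribution (coarse scale).

```python
import numpy as np

# Two loosely coupled 3-node clusters; the 0.05 edge is the weak
# bridge between them. W is the symmetric affinity matrix.
W = np.array([[0.00, 1, 1, 0.05, 0, 0],
              [1.00, 0, 1, 0.00, 0, 0],
              [1.00, 1, 0, 0.00, 0, 0],
              [0.05, 0, 0, 0.00, 1, 1],
              [0.00, 0, 0, 1.00, 0, 1],
              [0.00, 0, 0, 1.00, 1, 0]])
P = W / W.sum(axis=1, keepdims=True)  # row-stochastic Markov matrix

delta = np.eye(6)[0]  # diffusion started at node 0
short = delta @ np.linalg.matrix_power(P, 3)    # few steps
long_ = delta @ np.linalg.matrix_power(P, 200)  # many steps
```

After 3 steps nearly all mass is still inside the first cluster; after 200 steps a substantial fraction has crossed the bridge, so the two time scales of the semigroup expose the two-cluster macrostructure.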
31. Geometric diffusions as a tool for harmonic analysis and structure definition of data: diffusion maps.
- Author
-
Coifman RR, Lafon S, Lee AB, Maggioni M, Nadler B, Warner F, and Zucker SW
- Abstract
We provide a framework for structural multiscale geometric organization of graphs and subsets of R^n. We use diffusion semigroups to generate multiscale geometries in order to organize and represent complex structures. We show that appropriately selected eigenfunctions or scaling functions of Markov matrices, which describe local transitions, lead to macroscopic descriptions at different scales. The process of iterating or diffusing the Markov matrix is seen as a generalization of some aspects of the Newtonian paradigm, in which local infinitesimal transitions of a system lead to global macroscopic descriptions by integration. We provide a unified view of ideas from data analysis, machine learning, and numerical analysis.
- Published
- 2005
- Full Text
- View/download PDF
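The eigenfunction construction this abstract describes (a local-transition Markov matrix whose leading eigenvectors give macroscopic coordinates) can be sketched in a few lines of numpy. This is a bare-bones illustration with a hypothetical bandwidth parameter `eps`, not a faithful reproduction of the paper's normalizations.

```python
import numpy as np

def diffusion_map(X, eps=1.0, n_coords=2, t=1):
    """Sketch of a diffusion-map embedding: Gaussian kernel ->
    row-stochastic Markov matrix -> leading nontrivial eigenvectors
    scaled by eigenvalue^t as coordinates."""
    # Pairwise squared distances and Gaussian affinity kernel.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / eps)
    # Row-normalize to obtain the Markov (local transition) matrix.
    P = K / K.sum(axis=1, keepdims=True)
    # P is similar to a symmetric matrix, so its spectrum is real;
    # the top eigenvalue is 1 with a constant eigenvector, which
    # carries no geometry and is discarded.
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    return vecs[:, 1:n_coords + 1] * vals[1:n_coords + 1] ** t
```

Increasing `t` (iterating the Markov matrix) damps the higher coordinates geometrically, which is the multiscale mechanism the abstract refers to.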
32. Adaptive wavelet packet basis selection for zerotree image coding.
- Author
-
Rajpoot NM, Wilson RG, Meyer FG, and Coifman RR
- Abstract
Image coding methods based on adaptive wavelet transforms and those employing zerotree quantization have been shown to be successful. We present a general zerotree structure for an arbitrary wavelet packet geometry in an image coding framework. A fast basis selection algorithm is developed; it uses a Markov chain based cost estimate of encoding the image using this structure. As a result, our adaptive wavelet zerotree image coder has a relatively low computational complexity, performs comparably to state-of-the-art image coders, and is capable of progressively encoding images.
- Published
- 2003
- Full Text
- View/download PDF
33. Multilayered image representation: application to image compression.
- Author
-
Meyer FG, Averbuch AZ, and Coifman RR
- Abstract
The main contribution of this work is a new paradigm for image representation and image compression. We describe a new multilayered representation technique for images. An image is parsed into a superposition of coherent layers: a piecewise smooth regions layer, a textures layer, etc. The multilayered decomposition algorithm consists of a cascade of compressions applied successively to the image itself and to the residuals that resulted from the previous compressions. During each iteration of the algorithm, we code the residual part in a lossy way: we only retain the most significant structures of the residual part, which results in a sparse representation. Each layer is encoded independently with a different transform, or basis, at a different bitrate, and the combination of the compressed layers can always be reconstructed in a meaningful way. The strength of the multilayer approach comes from the fact that different sets of basis functions complement each other: some of the basis functions will give a reasonable account of the large trend of the data, while others will catch the local transients or the oscillatory patterns. This multilayered representation has many applications in image understanding and in image and video coding. We have implemented the algorithm and we have studied its capabilities.
- Published
- 2002
- Full Text
- View/download PDF
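The cascade-of-lossy-compressions idea can be sketched with two toy layers. The transforms below (a Fourier layer for smooth trends, then the raw pixel basis for residual transients) are stand-ins chosen for self-containment, not the bases used in the paper, and the `keep` budgets are arbitrary.

```python
import numpy as np

def sparse_layer(img, transform, inverse, keep):
    """One lossy layer: transform, keep the `keep` largest-magnitude
    coefficients, invert. Returns (layer, residual)."""
    c = transform(img)
    thresh = np.sort(np.abs(c).ravel())[-keep]
    c_sparse = np.where(np.abs(c) >= thresh, c, 0)
    layer = inverse(c_sparse)
    return layer, img - layer

rng = np.random.default_rng(0)
img = rng.standard_normal((32, 32))  # toy "image"

# Layer 1: Fourier basis (stands in for a smooth-regions transform).
l1, r1 = sparse_layer(img, np.fft.fft2,
                      lambda c: np.fft.ifft2(c).real, keep=64)
# Layer 2: pixel basis on the residual (stands in for a transform
# adapted to the remaining local transients).
l2, r2 = sparse_layer(r1, lambda x: x, lambda x: x, keep=128)

recon = l1 + l2  # combination of the compressed layers
```

Each layer strictly reduces residual energy, and the layers plus the final residual always sum back to the image, which is the "reconstructed in a meaningful way" property of the cascade.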
34. Wavelets, adapted waveforms and de-noising.
- Author
-
Coifman RR and Wickerhauser MV
- Subjects
- Electricity, Humans, Electroencephalography, Signal Processing, Computer-Assisted
- Abstract
This is a short summary of a talk given at the Frontier Science in EEG Symposium, Continuous Waveform Analysis, held on 9 October 1993 in New Orleans. We describe some new libraries of waveforms well-adapted to various numerical analysis and signal processing tasks. The main point is that by expanding a signal in a library of waveforms which are well-localized in both time and frequency, one can achieve both understanding of structure and efficiency in computation. We briefly cover the properties of the new "wavelet packet" and "localized trigonometric" libraries. The main focus will be applications of such libraries to the analysis of complicated transient signals: a feature extraction and data compression algorithm for speech signals which uses best-adapted time and frequency decompositions, and an adapted waveform analysis algorithm for removing fish noises from hydrophone recordings. These signals share many of the same properties as EEG traces, but with distinct features that are easier to characterize and detect.
- Published
- 1996
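The "best-adapted decomposition" idea behind these wavelet packet libraries can be sketched with a pure-numpy Haar wavelet packet tree and an entropy cost: a node of the tree is split only when its children represent the signal more cheaply. This is a minimal illustration of the Coifman-Wickerhauser best-basis search, not the full library machinery.

```python
import numpy as np

def haar_split(x):
    """One orthonormal Haar analysis step: (approx, detail)."""
    e, o = x[0::2], x[1::2]
    return (e + o) / np.sqrt(2), (e - o) / np.sqrt(2)

def cost(x):
    """Entropy cost of a coefficient vector (lower = sparser)."""
    p = x ** 2 / max(float((x ** 2).sum()), 1e-300)
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def best_basis(x, depth):
    """Recursive best-basis search over the Haar wavelet-packet
    tree: keep a node only if its cost beats its children's."""
    if depth == 0 or len(x) < 2:
        return [x], cost(x)
    a, d = haar_split(x)
    la, ca = best_basis(a, depth - 1)
    ld, cd = best_basis(d, depth - 1)
    if ca + cd < cost(x):
        return la + ld, ca + cd
    return [x], cost(x)
```

For a constant signal the search descends to a single approximation coefficient (cost 0), while the total coefficient count and energy are preserved by the orthonormal splits; on less structured signals the search stops at whatever depth the entropy no longer improves.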
35. Characterization of Fourier transforms of Hardy spaces.
- Author
-
Coifman RR
- Abstract
Characterizations of Fourier transforms of boundary distributions of functions in H^p(R) or H^p(T), 0 < p ≤ 1, are given. These results are applied to obtain Fourier multiplier theorems on H^p.
- Published
- 1974
- Full Text
- View/download PDF
36. Maximal functions and H^p spaces defined by ergodic transformations.
- Author
-
Coifman RR and Weiss G
- Abstract
Suppose an ergodic flow acts on a probability space, enabling us to introduce the ergodic Hilbert transform f̃ of f in L^p, 1 ≤ p ≤ ∞. H^1 is the class of all functions of the form f + if̃ in L^1. We show that H^1 can be characterized in terms of a class of maximal functions; moreover, the dual space of H^1 is identified with a space of functions of bounded mean oscillation defined in terms of the flow.
- Published
- 1973
- Full Text
- View/download PDF
37. Distribution function inequalities for singular integrals.
- Author
-
Coifman RR
- Abstract
This paper describes some distribution function inequalities between maximal functions and singular integral operators.
- Published
- 1972
- Full Text
- View/download PDF