92 results for "McEwen, Jason D."
Search Results
2. Proximal nested sampling for high-dimensional Bayesian model selection
- Author
-
Cai, Xiaohao, McEwen, Jason D., and Pereyra, Marcelo
- Published
- 2022
- Full Text
- View/download PDF
3. Wavelet-based segmentation on the sphere
- Author
-
Cai, Xiaohao, Wallis, Christopher G.R., Chan, Jennifer Y.H., and McEwen, Jason D.
- Published
- 2020
- Full Text
- View/download PDF
4. A covariant formulation for cosmological radiative transfer of the 21-cm line.
- Author
-
Chan, Jennifer Y H, Han, Qin, Wu, Kinwah, and McEwen, Jason D
- Subjects
RADIATIVE transfer ,RADIATION trapping ,EXPANDING universe ,RADIATIVE transfer equation ,MIDDLE Ages ,RADIO lines - Abstract
The 21-cm hyperfine line of neutral hydrogen is a useful tool to probe the conditions of the Universe during the Dark Ages, Cosmic Dawn, and the Epoch of Reionization. In most of the current calculations, the 21-cm line signals at given frequencies are computed, using an integrated line-of-sight line opacity, with the correction for cosmological expansion. These calculations have not fully captured the line and continuum interactions in the radiative transfer, in response to evolution of the radiation field and the variations of thermal and dynamic properties of the line-of-sight medium. We construct a covariant formulation for the radiative transfer of the 21-cm line and derive the cosmological 21-cm line radiative transfer (C21LRT) equation. The formulation properly accounts for local emission and absorption processes and the interaction between the line and continuum when the radiation propagates across the expanding Universe to the present observer. Our C21LRT calculations show that methods simply summing the line optical depth could lead to errors of 5 per cent in the 21-cm signals for redshift z ∼ 12–35 and of >10 per cent for redshift z ≲ 8. Proper covariant radiative transfer is therefore necessary for producing correct theoretical templates for extracting information on the structural evolution of the Universe through the Epoch of Reionization from the 21-cm tomographic data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. Localisation of directional scale-discretised wavelets on the sphere
- Author
-
McEwen, Jason D., Durastanti, Claudio, and Wiaux, Yves
- Published
- 2018
- Full Text
- View/download PDF
6. Slepian spatial-spectral concentration on the ball
- Author
-
Khalid, Zubair, Kennedy, Rodney A., and McEwen, Jason D.
- Published
- 2016
- Full Text
- View/download PDF
7. Proximal nested sampling with data-driven priors for physical scientists
- Author
-
McEwen, Jason D., Liaudat, Tobías I., Price, Matthew A., Cai, Xiaohao, and Pereyra, Marcelo
- Subjects
Methodology (stat.ME) ,FOS: Computer and information sciences ,Statistics - Machine Learning ,FOS: Physical sciences ,Machine Learning (stat.ML) ,Astrophysics - Instrumentation and Methods for Astrophysics ,Instrumentation and Methods for Astrophysics (astro-ph.IM) ,Statistics - Methodology - Abstract
Proximal nested sampling was introduced recently to open up Bayesian model selection for high-dimensional problems such as computational imaging. The framework is suitable for models with a log-convex likelihood, which are ubiquitous in the imaging sciences. The purpose of this article is two-fold. First, we review proximal nested sampling in a pedagogical manner in an attempt to elucidate the framework for physical scientists. Second, we show how proximal nested sampling can be extended in an empirical Bayes setting to support data-driven priors, such as deep neural networks learned from training data., 9 pages, 4 figures
- Published
- 2023
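The proximal nested sampling framework described in entry 7 handles non-smooth, log-convex terms through proximal operators and projections, with samples constrained to the current likelihood level set. As a purely illustrative building block (not the authors' implementation), the sketch below shows the projection onto a Gaussian-likelihood level set {θ : ||θ − y|| ≤ r}, the kind of hard likelihood constraint nested sampling imposes; the function name and toy numbers are assumptions.

```python
import numpy as np

def project_to_likelihood_ball(theta, y, r):
    """Project theta onto {t : ||t - y|| <= r}, i.e. the set where an
    isotropic Gaussian likelihood exceeds the current nested-sampling threshold."""
    d = theta - y
    norm = np.linalg.norm(d)
    return theta if norm <= r else y + r * d / norm

# Toy check: a point outside the constraint set is pulled back onto its boundary.
y = np.array([0.0, 0.0])
theta = np.array([3.0, 4.0])          # ||theta - y|| = 5
print(project_to_likelihood_ball(theta, y, r=2.0))   # -> [1.2, 1.6]
```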
8. Learned harmonic mean estimation of the marginal likelihood with normalizing flows
- Author
-
Polanska, Alicja, Price, Matthew A., Mancini, Alessio Spurio, and McEwen, Jason D.
- Subjects
Methodology (stat.ME) ,FOS: Computer and information sciences ,Statistics - Machine Learning ,FOS: Physical sciences ,Machine Learning (stat.ML) ,Astrophysics - Instrumentation and Methods for Astrophysics ,Instrumentation and Methods for Astrophysics (astro-ph.IM) ,Statistics - Methodology - Abstract
Computing the marginal likelihood (also called the Bayesian model evidence) is an important task in Bayesian model selection, providing a principled quantitative way to compare models. The learned harmonic mean estimator solves the exploding variance problem of the original harmonic mean estimation of the marginal likelihood. The learned harmonic mean estimator learns an importance sampling target distribution that approximates the optimal distribution. While the approximation need not be highly accurate, it is critical that the probability mass of the learned distribution is contained within the posterior in order to avoid the exploding variance problem. In previous work, a bespoke optimization problem was introduced when training models in order to ensure this property is satisfied. In the current article we introduce the use of normalizing flows to represent the importance sampling target distribution. A flow-based model is trained on samples from the posterior by maximum likelihood estimation. Then, the probability density of the flow is concentrated by lowering the variance of the base distribution, i.e. by lowering its "temperature", ensuring its probability mass is contained within the posterior. This approach avoids the need for a bespoke optimization problem and careful fine-tuning of parameters, resulting in a more robust method. Moreover, the use of normalizing flows has the potential to scale to high-dimensional settings. We present preliminary experiments demonstrating the effectiveness of the use of flows for the learned harmonic mean estimator. The harmonic code implementing the learned harmonic mean, which is publicly available, has been updated to support normalizing flows., 9 pages, 6 figures. arXiv admin note: text overlap with arXiv:2111.12720
- Published
- 2023
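Entry 8 describes the learned harmonic mean estimator, which estimates the reciprocal evidence by averaging a normalized importance target φ over posterior samples, 1/Z ≈ (1/N) Σ_i φ(θ_i)/(L(θ_i)π(θ_i)). Below is a minimal sketch of this idea on a conjugate 1-D Gaussian toy problem, using a variance-reduced ("temperature-lowered") Gaussian in place of a trained normalizing flow; the toy model, the temperature value, and all names are assumptions for illustration, not the harmonic package's API.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy model: theta ~ N(0, 1) prior, single datum y ~ N(theta, 1).
y = 1.0
prior = stats.norm(0.0, 1.0)
lik = lambda theta: stats.norm(theta, 1.0).pdf(y)

# Analytic evidence and posterior for this conjugate toy problem.
Z_true = stats.norm(0.0, np.sqrt(2.0)).pdf(y)
post = stats.norm(y / 2.0, np.sqrt(0.5))

# Importance target: posterior-shaped density with "lowered temperature"
# (reduced variance) so its probability mass sits inside the posterior.
T = 0.7  # illustrative temperature < 1
target = stats.norm(post.mean(), T * post.std())

# Re-targeted harmonic mean estimate of 1/Z from posterior samples.
samples = post.rvs(size=100_000, random_state=rng)
rho = np.mean(target.pdf(samples) / (lik(samples) * prior.pdf(samples)))
print(f"estimated Z = {1.0 / rho:.5f}, analytic Z = {Z_true:.5f}")
```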
9. Learned Interferometric Imaging for the SPIDER Instrument
- Author
-
Mars, Matthijs, Betcke, Marta M., and McEwen, Jason D.
- Subjects
FOS: Computer and information sciences ,Computer Science - Machine Learning ,Image and Video Processing (eess.IV) ,FOS: Electrical engineering, electronic engineering, information engineering ,FOS: Physical sciences ,Electrical Engineering and Systems Science - Image and Video Processing ,Astrophysics - Instrumentation and Methods for Astrophysics ,Instrumentation and Methods for Astrophysics (astro-ph.IM) ,Machine Learning (cs.LG) - Abstract
The Segmented Planar Imaging Detector for Electro-Optical Reconnaissance (SPIDER) is an optical interferometric imaging device that aims to offer an alternative to the large space telescope designs of today with reduced size, weight and power consumption. This is achieved through interferometric imaging. State-of-the-art methods for reconstructing images from interferometric measurements adopt proximal optimization techniques, which are computationally expensive and require handcrafted priors. In this work we present two data-driven approaches for reconstructing images from measurements made by the SPIDER instrument. These approaches use deep learning to learn prior information from training data, increasing the reconstruction quality, and significantly reducing the computation time required to recover images by orders of magnitude. Reconstruction time is reduced to ~10 milliseconds, opening up the possibility of real-time imaging with SPIDER for the first time. Furthermore, we show that these methods can also be applied in domains where training data is scarce, such as astronomical imaging, by leveraging transfer learning from domains where plenty of training data are available., 21 pages, 14 figures
- Published
- 2023
10. Sparse Bayesian mass-mapping using trans-dimensional MCMC
- Author
-
Marignier, Augustin, Kitching, Thomas, McEwen, Jason D., and Ferreira, Ana M. G.
- Subjects
Cosmology and Nongalactic Astrophysics (astro-ph.CO) ,FOS: Physical sciences ,Astrophysics - Instrumentation and Methods for Astrophysics ,Instrumentation and Methods for Astrophysics (astro-ph.IM) ,Astrophysics - Cosmology and Nongalactic Astrophysics - Abstract
Uncertainty quantification is a crucial step of cosmological mass-mapping that is often ignored. Suggested methods are typically only approximate or make strong assumptions of Gaussianity of the shear field. Probabilistic sampling methods, such as Markov chain Monte Carlo (MCMC), draw samples from a probability distribution, allowing for full and flexible uncertainty quantification; however, these methods are notoriously slow and struggle in the high-dimensional parameter spaces of imaging problems. In this work we use, for the first time, a trans-dimensional MCMC sampler for mass-mapping, promoting sparsity in a wavelet basis. This sampler gradually grows the parameter space as required by the data, exploiting the extremely sparse nature of mass maps in wavelet space. The wavelet coefficients are arranged in a tree-like structure, which adds finer-scale detail as the parameter space grows. We demonstrate the trans-dimensional sampler on galaxy cluster-scale images where the planar modelling approximation is valid. In high-resolution experiments, this method produces naturally parsimonious solutions, requiring less than 1% of the potential maximum number of wavelet coefficients and still producing a good fit to the observed data. In the presence of noisy data, trans-dimensional MCMC produces a better reconstruction of mass maps than the standard smoothed Kaiser-Squires method, with the addition that uncertainties are fully quantified. This opens up the possibility for new mass maps and inferences about the nature of dark matter using the new high-resolution data from upcoming weak lensing surveys such as Euclid.
- Published
- 2022
11. Sparse Bayesian mass mapping with uncertainties: hypothesis testing of structure
- Author
-
Price, Matthew A, McEwen, Jason D, Cai, Xiaohao, Kitching, Thomas D, and Wallis, Christopher GR
- Subjects
Astrophysics::Cosmology and Extragalactic Astrophysics - Abstract
A crucial aspect of mass mapping, via weak lensing, is quantification of the uncertainty introduced during the reconstruction process. Properly accounting for these errors has been largely ignored to date. We present a new method to reconstruct maximum a posteriori (MAP) convergence maps by formulating an unconstrained Bayesian inference problem with Laplace-type l1-norm sparsity-promoting priors, which we solve via convex optimization. Approaching mass mapping in this manner allows us to exploit recent developments in probability concentration theory to infer theoretically conservative uncertainties for our MAP reconstructions, without relying on assumptions of Gaussianity. For the first time, these methods allow us to perform hypothesis testing of structure, from which it is possible to distinguish between physical objects and artefacts of the reconstruction. Here, we present this new formalism, and demonstrate the method on simulations, before applying the developed formalism to two observational data sets of the Abell 520 cluster. Initial reconstructions of the Abell 520 catalogues reported the detection of an anomalous 'dark core' - an overdense region with no optical counterpart - which was taken to be evidence for self-interacting dark matter. In our Bayesian framework, it is found that neither Abell 520 data set can conclusively determine the physicality of such dark cores at 99 per cent confidence. However, in both cases the recovered MAP estimators are consistent with both sets of data.
- Published
- 2021
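Entry 11 recovers MAP convergence maps by minimizing a convex objective with a Laplace-type ℓ1 sparsity prior. As a hedged illustration of that class of problem (not the authors' mass-mapping code), the sketch below runs plain ISTA, i.e. proximal-gradient iterations with soft thresholding, on a synthetic sparse-recovery problem; the operator, dimensions and regularization strength are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic sparse-recovery problem: y = A x_true + noise.
n, m, k = 100, 256, 10
A = rng.normal(size=(n, m)) / np.sqrt(n)
x_true = np.zeros(m)
x_true[rng.choice(m, k, replace=False)] = rng.normal(size=k)
y = A @ x_true + 0.01 * rng.normal(size=n)

# ISTA: proximal-gradient iterations for 0.5*||y - A x||^2 + lam*||x||_1.
lam = 0.01
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
step = 1.0 / L
x = np.zeros(m)
for _ in range(500):
    grad = A.T @ (A @ x - y)
    z = x - step * grad
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```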
12. Bayesian variational regularization on the ball
- Author
-
Price, Matthew A. and McEwen, Jason D.
- Subjects
FOS: Computer and information sciences ,Information Theory (cs.IT) ,Computer Science - Information Theory ,FOS: Physical sciences ,Astrophysics - Instrumentation and Methods for Astrophysics ,Instrumentation and Methods for Astrophysics (astro-ph.IM) - Abstract
We develop variational regularization methods which leverage sparsity-promoting priors to solve severely ill posed inverse problems defined on the 3D ball (i.e. the solid sphere). Our method solves the problem natively on the ball and thus does not suffer from discontinuities that plague alternate approaches where each spherical shell is considered independently. Additionally, we leverage advances in probability density theory to produce Bayesian variational methods which benefit from the computational efficiency of advanced convex optimization algorithms, whilst supporting principled uncertainty quantification. We showcase these variational regularization and uncertainty quantification techniques on an illustrative example. The C++ code discussed throughout is provided under a GNU general public license.
- Published
- 2021
13. Scattering Networks on the Sphere for Scalable and Rotationally Equivariant Spherical CNNs
- Author
-
McEwen, Jason D., Wallis, Christopher G. R., and Mavor-Parker, Augustine N.
- Subjects
FOS: Computer and information sciences ,Computer Science - Machine Learning ,Computer Vision and Pattern Recognition (cs.CV) ,Image and Video Processing (eess.IV) ,Computer Science::Neural and Evolutionary Computation ,FOS: Electrical engineering, electronic engineering, information engineering ,Computer Science - Computer Vision and Pattern Recognition ,FOS: Physical sciences ,Electrical Engineering and Systems Science - Image and Video Processing ,Astrophysics - Instrumentation and Methods for Astrophysics ,Instrumentation and Methods for Astrophysics (astro-ph.IM) ,Machine Learning (cs.LG) - Abstract
Convolutional neural networks (CNNs) constructed natively on the sphere have been developed recently and shown to be highly effective for the analysis of spherical data. While an efficient framework has been formulated, spherical CNNs are nevertheless highly computationally demanding; typically they cannot scale beyond spherical signals of thousands of pixels. We develop scattering networks constructed natively on the sphere that provide a powerful representational space for spherical data. Spherical scattering networks are computationally scalable and exhibit rotational equivariance, while their representational space is invariant to isometries and provides efficient and stable signal representations. By integrating scattering networks as an additional type of layer in the generalized spherical CNN framework, we show how they can be leveraged to scale spherical CNNs to the high-resolution data typical of many practical applications, with spherical signals of many tens of megapixels and beyond., 18 pages, 6 figures, accepted by ICLR, code at https://www.kagenova.com/products/fourpiAI/
- Published
- 2021
14. Optimal filters on the sphere
- Author
-
McEwen, Jason D., Hobson, Michael P., and Lasenby, Anthony N.
- Subjects
Signal processing -- Research ,Acoustic filters -- Design and construction ,Stochastic processes -- Analysis ,Digital signal processor ,Business ,Computers ,Electronics ,Electronics and electrical industries - Abstract
Optimal filters are derived on the sphere in order to detect compact objects embedded in a stochastic background process. A naive detection strategy is adopted to show the application of the new filter theory and it is shown to perform well, even at low signal-to-noise ratio.
- Published
- 2008
15. Sparse Bayesian mass-mapping with uncertainties: full sky observations on the celestial sphere
- Author
-
Price, Matthew A., McEwen, Jason D., Pratley, Luke, and Kitching, Thomas D.
- Subjects
Weak Lensing ,Spherical Analysis ,Image Processing ,Bayesian inference ,Astrophysics::Cosmology and Extragalactic Astrophysics - Abstract
To the best of our (the authors') knowledge, this is the first set of joint spherical reconstructions of all (excluding the now-released HSC observation patch) public weak lensing observational data. Maps included are: mask; number density map; e1/e2 shear maps; spherical Kaiser-Squires convergence estimate (both raw and with 25 arcmin smoothing); DarkMapper (hierarchical Bayesian sparse optimisation) convergence estimate. Reference: Price et al. 2020 (arXiv:2004.07855).
- Published
- 2020
- Full Text
- View/download PDF
16. Fast directional continuous spherical wavelet transform algorithms
- Author
-
McEwen, Jason D., Hobson, Michael P., Mortlock, Daniel J., and Lasenby, Anthony N.
- Subjects
Wavelet transforms -- Analysis ,Signal processing -- Research ,Euclidean geometry -- Usage ,Geometry, Plane -- Usage ,Geometry, Solid -- Usage ,Digital signal processor ,Business ,Computers ,Electronics ,Electronics and electrical industries - Abstract
The construction of a spherical wavelet analysis through the inverse stereographic projection of the Euclidean planar wavelet framework is described and fast algorithms are presented for performing the directional continuous wavelet analysis on the unit sphere. The extension of wavelet analysis to the sphere has enabled the detection of new physics in many areas, and is facilitated on large practical data sets by the fast directional CSWT (continuous spherical wavelet transform) algorithm.
- Published
- 2007
17. Reducing Cybersickness in 360-Degree Virtual Reality.
- Author
-
Arshad, Iqra, De Mello, Paulo, Ender, Martin, McEwen, Jason D., and Ferré, Elisa R.
- Subjects
SIMULATOR sickness ,VIRTUAL reality ,HEAD-mounted displays ,ARTIFICIAL intelligence ,MOTION sickness ,HEART beat ,VERTIGO - Abstract
Despite the technological advancements in Virtual Reality (VR), users are constantly combating feelings of nausea and disorientation, the so-called cybersickness. Cybersickness symptoms cause severe discomfort and hinder the immersive VR experience. Here we investigated cybersickness in 360-degree head-mounted display VR. In traditional 360-degree VR experiences, translational movement in the real world is not reflected in the virtual world, and therefore self-motion information is not corroborated by matching visual and vestibular cues, which may trigger symptoms of cybersickness. We evaluated whether a new Artificial Intelligence (AI) software designed to supplement the 360-degree VR experience with artificial six-degrees-of-freedom motion may reduce cybersickness. Explicit (simulator sickness questionnaire and Fast Motion Sickness (FMS) rating) and implicit (heart rate) measurements were used to evaluate cybersickness symptoms during and after 360-degree VR exposure. Simulator sickness scores showed a significant reduction in feelings of nausea during the AI-supplemented six-degrees-of-freedom motion VR compared to traditional 360-degree VR. However, six-degrees-of-freedom motion VR did not reduce oculomotor or disorientation measures of sickness. No changes were observed in FMS and heart rate measures. Improving the congruency between visual and vestibular cues in 360-degree VR, as provided by the AI-supplemented six-degrees-of-freedom motion system considered, is essential for a more engaging, immersive and safe VR experience, which is critical for educational, cultural and entertainment applications. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
18. Considerations for Optimizing the Photometric Classification of Supernovae from the Rubin Observatory.
- Author
-
Alves, Catarina S., Peiris, Hiranya V., Lochner, Michelle, McEwen, Jason D., Allam Jr., Tarek, and Biswas, Rahul
- Published
- 2022
- Full Text
- View/download PDF
19. Distributed and parallel sparse convex optimization for radio interferometry with PURIFY
- Author
-
Pratley, Luke, McEwen, Jason D., d'Avezac, Mayeul, Cai, Xiaohao, Perez-Suarez, David, Christidi, Ilektra, and Guichard, Roland
- Subjects
FOS: Physical sciences ,Astrophysics - Instrumentation and Methods for Astrophysics ,Instrumentation and Methods for Astrophysics (astro-ph.IM) - Abstract
Next-generation radio interferometric telescopes are entering an era of big data with extremely large data sets. While these telescopes can observe the sky with higher sensitivity and resolution than before, computational challenges in image reconstruction need to be overcome to realize the potential of forthcoming telescopes. New methods in sparse image reconstruction and convex optimization techniques (cf. compressive sensing) have been shown to produce higher-fidelity reconstructions of simulations and real observations than traditional methods. This article presents distributed and parallel algorithms and implementations to perform sparse image reconstruction, with significant practical considerations that are important for implementing these algorithms for big data. We benchmark the algorithms presented, showing that they are considerably faster than their serial equivalents. We then pre-sample gridding kernels to scale the distributed algorithms to larger data sizes, showing application times for 1 Gb to 2.4 Tb data sets over 25 to 100 nodes for up to 50 billion visibilities, and find that the run-times for the distributed algorithms range from 100 milliseconds to 3 minutes per iteration. This work presents an important step in working towards computationally scalable and efficient algorithms and implementations that are needed to image observations of both extended and compact sources from next-generation radio interferometers such as the SKA. The algorithms are implemented in the latest versions of the SOPT (https://github.com/astro-informatics/sopt) and PURIFY (https://github.com/astro-informatics/purify) software packages (version 3.1.0), which have been released alongside this article., 25 pages, 5 figures
- Published
- 2019
20. Optimizing the LSST Observing Strategy for Dark Energy Science: DESC Recommendations for the Wide-Fast-Deep Survey
- Author
-
Lochner, Michelle, Scolnic, Daniel M., Awan, Humna, Regnault, Nicolas, Gris, Philippe, Mandelbaum, Rachel, Gawiser, Eric, Almoubayyed, Husni, Setzer, Christian N., Huber, Simon, Graham, Melissa L., Hložek, Renée, Biswas, Rahul, Eifler, Tim, Rothchild, Daniel, Allam, Tarek, Blazek, Jonathan, Chang, Chihway, Collett, Thomas, Goobar, Ariel, Hook, Isobel M., Jarvis, Mike, Jha, Saurabh W., Kim, Alex G., Marshall, Phil, McEwen, Jason D., Moniez, Marc, Newman, Jeffrey A., Peiris, Hiranya, Petrushevska, Tanja, Rhodes, Jason, Sevilla-Noarbe, Ignacio, Slosar, Anže, Suyu, Sherry H., Tyson, J. Anthony, and Yoachim, Peter
- Subjects
FOS: Physical sciences ,Astrophysics::Cosmology and Extragalactic Astrophysics ,[PHYS.PHYS.PHYS-INS-DET]Physics [physics]/Physics [physics]/Instrumentation and Detectors [physics.ins-det] ,Astrophysics - Instrumentation and Methods for Astrophysics ,Instrumentation and Methods for Astrophysics (astro-ph.IM) ,astro-ph.IM - Abstract
Cosmology is one of the four science pillars of LSST, which promises to be transformative for our understanding of dark energy and dark matter. The LSST Dark Energy Science Collaboration (DESC) has been tasked with deriving constraints on cosmological parameters from LSST data. Each of the cosmological probes for LSST is heavily impacted by the choice of observing strategy. This white paper is written by the LSST DESC Observing Strategy Task Force (OSTF), which represents the entire collaboration, and aims to make recommendations on observing strategy that will benefit all cosmological analyses with LSST. It is accompanied by the DESC DDF (Deep Drilling Fields) white paper (Scolnic et al.). We use a variety of metrics to understand the effects of the observing strategy on measurements of weak lensing, large-scale structure, clusters, photometric redshifts, supernovae, strong lensing and kilonovae. In order to reduce systematic uncertainties, we conclude that the current baseline observing strategy needs to be significantly modified to result in the best possible cosmological constraints. We provide some key recommendations: moving the WFD (Wide-Fast-Deep) footprint to avoid regions of high extinction, taking visit pairs in different filters, changing the 2x15s snaps to a single exposure to improve efficiency, focusing on strategies that reduce long gaps (>15 days) between observations, and prioritizing spatial uniformity at several intervals during the 10-year survey., The LSST DESC response (WFD) to the Call for White Papers on LSST Cadence Optimization. Comments welcome
- Published
- 2018
21. Optimizing the LSST Observing Strategy for Dark Energy Science: DESC Recommendations for the Deep Drilling Fields and other Special Programs
- Author
-
Scolnic, Daniel M., Lochner, Michelle, Gris, Philippe, Regnault, Nicolas, Hložek, Renée, Aldering, Greg, Allam, Tarek, Awan, Humna, Biswas, Rahul, Blazek, Jonathan, Chang, Chihway, Gawiser, Eric, Goobar, Ariel, Hook, Isobel M., Jha, Saurabh W., McEwen, Jason D., Mandelbaum, Rachel, Marshall, Phil, Neilsen, Eric, Rhodes, Jason, Rothchild, Daniel, Sevilla-Noarbe, Ignacio, Slosar, Anže, and Yoachim, Peter
- Subjects
FOS: Physical sciences ,[PHYS.PHYS.PHYS-INS-DET]Physics [physics]/Physics [physics]/Instrumentation and Detectors [physics.ins-det] ,Astrophysics - Instrumentation and Methods for Astrophysics ,Instrumentation and Methods for Astrophysics (astro-ph.IM) - Abstract
We review the measurements of dark energy enabled by observations of the Deep Drilling Fields and the optimization of survey design for cosmological measurements. This white paper is the result of efforts by the LSST DESC Observing Strategy Task Force (OSTF), which represents the entire collaboration, and aims to make recommendations on observing strategy for the DDFs that will benefit all cosmological analyses with LSST. It is accompanied by the DESC-WFD white paper (Lochner et al.). We argue for altering the nominal deep drilling plan to have >6 month seasons, interweaving gri and zy observations every 3 days with 2, 4, 8, 25, 4 visits in grizy, respectively. These recommendations are guided by metrics optimizing constraints on dark energy and mitigation of systematic uncertainties, including specific requirements on total number of visits after Y1 and Y10 for photometric redshifts (photo-z) and weak lensing systematics. We specify the precise locations for the previously-chosen LSST deep fields (ELAIS-S1, XMM-LSS, CDF-S, and COSMOS) and recommend Akari Deep Field South as the planned fifth deep field in order to synergize with Euclid and WFIRST. Our recommended DDF strategy uses 6.2% of the LSST survey time. We briefly discuss synergy with white papers from other collaborations, as well as additional mini-surveys and Target-of-Opportunity programs that lead to better measurements of dark energy., The LSST DESC response (DDF) to the Call for White Papers on LSST Cadence Optimization. Comments welcome
- Published
- 2018
22. The Photometric LSST Astronomical Time-series Classification Challenge (PLAsTiCC): Data set
- Author
-
The PLAsTiCC Team, Allam, Tarek, Bahmanyar, Anita, Biswas, Rahul, Dai, Mi, Galbany, Lluís, Hložek, Renée, Ishida, Emille E. O., Jha, Saurabh W., Jones, David O., Kessler, Richard, Lochner, Michelle, Mahabal, Ashish A., Malz, Alex I., Mandel, Kaisey S., Martínez-Galarza, Juan Rafael, McEwen, Jason D., Muthukrishna, Daniel, Narayan, Gautham, Peiris, Hiranya, Peters, Christina M., Ponder, Kara, Setzer, Christian N., the LSST Dark Energy Science Collaboration, and the LSST Transients and Variable Stars Science Collaboration
- Subjects
Astrophysics - Solar and Stellar Astrophysics ,FOS: Physical sciences ,Astrophysics - Instrumentation and Methods for Astrophysics ,Instrumentation and Methods for Astrophysics (astro-ph.IM) ,Solar and Stellar Astrophysics (astro-ph.SR) - Abstract
The Photometric LSST Astronomical Time Series Classification Challenge (PLAsTiCC) is an open data challenge to classify simulated astronomical time-series data in preparation for observations from the Large Synoptic Survey Telescope (LSST), which will achieve first light in 2019 and commence its 10-year main survey in 2022. LSST will revolutionize our understanding of the changing sky, discovering and measuring millions of time-varying objects. In this challenge, we pose the question: how well can we classify objects in the sky that vary in brightness from simulated LSST time-series data, with all its challenges of non-representativity? In this note we explain the need for a data challenge to help classify such astronomical sources and describe the PLAsTiCC data set and Kaggle data challenge, noting that while the references are provided for context, they are not needed to participate in the challenge., Research note to accompany the https://www.kaggle.com/c/PLAsTiCC-2018 challenge
- Published
- 2018
23. Sifting Convolution on the Sphere.
- Author
-
Roddy, Patrick J. and McEwen, Jason D.
- Subjects
SPHERES ,TOPOGRAPHIC maps ,SPHERICAL harmonics ,HILBERT space ,HARMONIC analysis (Mathematics) - Abstract
A novel spherical convolution is defined through the sifting property of the Dirac delta on the sphere. The so-called sifting convolution is defined by the inner product of one function with a translated version of another, but with the adoption of an alternative translation operator on the sphere. This translation operator follows by analogy with the Euclidean translation when viewed in harmonic space. The sifting convolution satisfies a variety of desirable properties that are lacking in alternate definitions, namely: it supports directional kernels; it has an output which remains on the sphere; and is efficient to compute. An illustration of the sifting convolution on a topographic map of the Earth demonstrates that it supports directional kernels to perform anisotropic filtering, while its output remains on the sphere. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
24. PURIFYing real radio interferometric observations
- Author
-
Pratley, Luke, McEwen, Jason D., d'Avezac, Mayeul, Carrillo, Rafael E., Onose, Alexandru, and Wiaux, Yves
- Subjects
Astrophysics::Instrumentation and Methods for Astrophysics ,FOS: Physical sciences ,Astrophysics - Instrumentation and Methods for Astrophysics ,Instrumentation and Methods for Astrophysics (astro-ph.IM) - Abstract
Next-generation radio interferometers, such as the Square Kilometre Array (SKA), will revolutionise our understanding of the universe through their unprecedented sensitivity and resolution. However, standard methods in radio interferometry produce reconstructed interferometric images that are limited in quality and they are not scalable for big data. In this work we apply and evaluate alternative interferometric reconstruction methods that make use of state-of-the-art sparse image reconstruction algorithms motivated by compressive sensing, which have been implemented in the PURIFY software package. In particular, we implement and apply the proximal alternating direction method of multipliers (P-ADMM) algorithm presented in a recent article. We apply PURIFY to real interferometric observations. For all observations PURIFY outperforms the standard CLEAN, where in some cases PURIFY provides an improvement in dynamic range by over an order of magnitude. The latest version of PURIFY, which includes the developments presented in this work, is made publicly available., 1 page, Proceedings of International BASP Frontiers Workshop 2017
- Published
- 2017
25. Directional spin wavelets on the sphere
- Author
-
McEwen, Jason D., Leistedt, Boris, Büttner, Martin, Peiris, Hiranya, and Wiaux, Yves
- Subjects
FOS: Computer and information sciences ,Information Theory (cs.IT) ,Computer Science - Information Theory ,FOS: Physical sciences ,Astrophysics - Instrumentation and Methods for Astrophysics ,Instrumentation and Methods for Astrophysics (astro-ph.IM) - Abstract
We construct a directional spin wavelet framework on the sphere by generalising the scalar scale-discretised wavelet transform to signals of arbitrary spin. The resulting framework is the only wavelet framework defined natively on the sphere that is able to probe the directional intensity of spin signals. Furthermore, directional spin scale-discretised wavelets support the exact synthesis of a signal on the sphere from its wavelet coefficients and satisfy excellent localisation and uncorrelation properties. Consequently, directional spin scale-discretised wavelets are likely to be of use in a wide range of applications and in particular for the analysis of the polarisation of the cosmic microwave background (CMB). We develop new algorithms to compute (scalar and spin) forward and inverse wavelet transforms exactly and efficiently for very large data-sets containing tens of millions of samples on the sphere. By leveraging a novel sampling theorem on the rotation group developed in a companion article, only half as many wavelet coefficients as alternative approaches need be computed, while still capturing the full information content of the signal under analysis. Our implementation of these algorithms is made publicly available., 20 pages, 7 figures. Code available at http://www.s2let.org
- Published
- 2015
26. Online radio interferometric imaging: assimilating and discarding visibilities on arrival.
- Author
-
Cai, Xiaohao, Pratley, Luke, and McEwen, Jason D
- Subjects
RADIO astronomy ,OPTICAL tomography ,IMAGE reconstruction ,DATA warehousing ,VISIBILITY ,INVERSE problems - Abstract
The emerging generation of radio interferometric (RI) telescopes, such as the Square Kilometre Array (SKA), will acquire massive volumes of data and transition radio astronomy to a big-data era. The ill-posed inverse problem of imaging the raw visibilities acquired by RI telescopes will become significantly more computationally challenging, particularly in terms of data storage and computational cost. Current RI imaging methods, such as CLEAN, its variants, and compressive sensing approaches (i.e. sparse regularization), have yielded excellent reconstruction fidelity. However, scaling these methods to big-data remains difficult if not impossible in some cases. All state-of-the-art methods in RI imaging lack the ability to process data streams as they are acquired during the data observation stage. Such approaches are referred to as online processing methods. We present an online sparse regularization methodology for RI imaging. Image reconstruction is performed simultaneously with data acquisition, where observed visibilities are assimilated into the reconstructed image as they arrive and then discarded. Since visibilities are processed online, good reconstructions are recovered much faster than standard (offline) methods that cannot start until the data acquisition stage completes. Moreover, the online method provides additional computational savings and, most importantly, dramatically reduces data storage requirements. Theoretically, the reconstructed images are of the same fidelity as those recovered by the equivalent offline approach and, in practice, very similar reconstruction fidelity is achieved. We anticipate that online imaging techniques, as proposed here, will be critical in scaling RI imaging to the emerging big-data era of radio astronomy. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
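The online method in entry 26 assimilates each block of visibilities into the running image estimate as it arrives and then discards it. Below is a toy sketch of that streaming pattern for a generic linear measurement model, using one gradient step per incoming block; it is illustrative only and omits the sparse regularization and the interferometric measurement operator of the actual method.

```python
import numpy as np

rng = np.random.default_rng(2)

# Ground-truth "image" and a stream of measurement blocks y_i = A_i x + noise.
m = 64
x_true = rng.normal(size=m)
x_est = np.zeros(m)
step = 0.05

for block in range(200):
    A_i = rng.normal(size=(32, m)) / np.sqrt(32)       # this block's operator
    y_i = A_i @ x_true + 0.01 * rng.normal(size=32)    # incoming data block
    # Assimilate: one gradient step on 0.5*||y_i - A_i x||^2, then discard block.
    x_est -= step * A_i.T @ (A_i @ x_est - y_i)

print("relative error:", np.linalg.norm(x_est - x_true) / np.linalg.norm(x_true))
```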
27. Covariant polarized radiative transfer on cosmological scales for investigating large-scale magnetic field structures.
- Author
-
Chan, Jennifer Y H, Wu, Kinwah, On, Alvina Y L, Barnes, David J, McEwen, Jason D, and Kitching, Thomas D
- Subjects
RADIATIVE transfer equation ,FARADAY effect ,MAGNETIC fields ,MAGNETIC structure ,RADIATIVE transfer - Abstract
Polarization of radiation is a powerful tool to study cosmic magnetism and analysis of polarization can be used as a diagnostic tool for large-scale structures. In this work, we present a solid theoretical foundation for using polarized light to investigate large-scale magnetic field structures: the cosmological polarized radiative transfer (CPRT) formulation. The CPRT formulation is fully covariant. It accounts for cosmological and relativistic effects in a self-consistent manner and explicitly treats Faraday rotation, as well as Faraday conversion, emission, and absorption processes. The formulation is derived from the first principles of conservation of phase–space volume and photon number. Without loss of generality, we consider a flat Friedmann–Robertson–Walker (FRW) space–time metric and construct the corresponding polarized radiative transfer equations. We propose an all-sky CPRT calculation algorithm, based on a ray-tracing method, which allows cosmological simulation results to be incorporated and, thereby, model templates of polarization maps to be constructed. Such maps will be crucial in our interpretation of polarized data, such as those to be collected by the Square Kilometer Array (SKA). We describe several tests which are used for verifying the code and demonstrate applications in the study of the polarization signatures in different distributions of electron number density and magnetic fields. We present a pencil-beam CPRT calculation and an all-sky calculation, using a simulated galaxy cluster or a model magnetized universe obtained from GCMHD+ simulations as the respective input structures. The implications on large-scale magnetic field studies are discussed; remarks on the standard methods using rotation measure are highlighted. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
28. Uncertainty quantification for radio interferometric imaging: II. MAP estimation.
- Author
-
Cai, Xiaohao, Pereyra, Marcelo, and McEwen, Jason D
- Subjects
UNCERTAINTY ,REASONING ,RADIO interferometers ,ASTRONOMICAL instruments ,RADIO telescopes - Abstract
Uncertainty quantification is a critical missing component in radio interferometric imaging that will only become increasingly important as the big-data era of radio interferometry emerges. Statistical sampling approaches to perform Bayesian inference, like Markov Chain Monte Carlo (MCMC) sampling, can in principle recover the full posterior distribution of the image, from which uncertainties can then be quantified. However, for massive data sizes, like those anticipated from the Square Kilometre Array, it will be difficult if not impossible to apply any MCMC technique due to its inherent computational cost. We formulate Bayesian inference problems with sparsity-promoting priors (motivated by compressive sensing), for which we recover maximum a posteriori (MAP) point estimators of radio interferometric images by convex optimization. Exploiting recent developments in the theory of probability concentration, we quantify uncertainties by post-processing the recovered MAP estimate. Three strategies to quantify uncertainties are developed: (i) highest posterior density credible regions, (ii) local credible intervals (cf. error bars) for individual pixels and superpixels, and (iii) hypothesis testing of image structure. These forms of uncertainty quantification provide rich information for analysing radio interferometric observations in a statistically robust manner. Our MAP-based methods are approximately 10^5 times faster computationally than state-of-the-art MCMC methods and, in addition, support highly distributed and parallelized algorithmic structures. For the first time, our MAP-based techniques provide a means of quantifying uncertainties for radio interferometric imaging for realistic data volumes and practical use, and scale to the emerging big data era of radio astronomy. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
29. Uncertainty quantification for radio interferometric imaging – I. Proximal MCMC methods.
- Author
-
Cai, Xiaohao, Pereyra, Marcelo, and McEwen, Jason D
- Subjects
RADIO interferometers ,BAYESIAN analysis ,BAYES' estimation ,PROBABILITY theory ,MARKOV chain Monte Carlo - Abstract
Uncertainty quantification is a critical missing component in radio interferometric imaging that will only become increasingly important as the big-data era of radio interferometry emerges. Since radio interferometric imaging requires solving a high-dimensional, ill-posed inverse problem, uncertainty quantification is difficult but also critical to the accurate scientific interpretation of radio observations. Statistical sampling approaches to perform Bayesian inference, like Markov chain Monte Carlo (MCMC) sampling, can in principle recover the full posterior distribution of the image, from which uncertainties can then be quantified. However, traditional high-dimensional sampling methods are generally limited to smooth (e.g. Gaussian) priors and cannot be used with sparsity-promoting priors. Sparse priors, motivated by the theory of compressive sensing, have been shown to be highly effective for radio interferometric imaging. In this article proximal MCMC methods are developed for radio interferometric imaging, leveraging proximal calculus to support non-differential priors, such as sparse priors, in a Bayesian framework. Furthermore, three strategies to quantify uncertainties using the recovered posterior distribution are developed: (i) local (pixel-wise) credible intervals to provide error bars for each individual pixel; (ii) highest posterior density credible regions; and (iii) hypothesis testing of image structure. These forms of uncertainty quantification provide rich information for analysing radio interferometric observations in a statistically robust manner. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
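Strategy (i) of entry 29 reports local (pixel-wise) credible intervals computed from the recovered posterior distribution. Assuming one already has a chain of sampled images (here replaced by synthetic draws purely for illustration), such intervals reduce to per-pixel posterior quantiles, as in the hedged sketch below; it is not the authors' proximal MCMC code.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for an MCMC chain over images: n_samples posterior samples of an
# n_pix-pixel image (synthetic Gaussian draws for illustration only).
n_samples, n_pix = 5000, 100
chain = rng.normal(loc=1.0, scale=0.3, size=(n_samples, n_pix))

# Local (pixel-wise) 95% credible intervals: per-pixel posterior quantiles.
lower = np.percentile(chain, 2.5, axis=0)
upper = np.percentile(chain, 97.5, axis=0)
width = upper - lower          # per-pixel "error bar" width

print("mean credible-interval width:", width.mean())
```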
30. An Optimal-Dimensionality Sampling for Spin-s Functions on the Sphere.
- Author
-
Elahi, Usama, Khalid, Zubair, Kennedy, Rodney A., and McEwen, Jason D.
- Subjects
HARMONIC analysis (Mathematics) ,SAMPLING methods ,SPHERICAL harmonics ,SPIN-spin interactions ,SPHERE eversion - Abstract
For the representation of spin-s band-limited functions on the sphere, we propose a sampling scheme with optimal number of samples equal to the number of degrees of freedom of the function in harmonic space. In comparison to the existing sampling designs, which require ~2L^2 samples for the representation of spin-s functions band-limited at L, the proposed scheme requires N_o = L^2 - s^2 samples for the accurate computation of the spin-s spherical harmonic transform (s-SHT). For the proposed sampling scheme, we also develop a method to compute the s-SHT. We place the samples in our design scheme such that the matrices involved in the computation of the s-SHT are well-conditioned. We also present a multipass s-SHT to improve the accuracy of the transform. We also show the proposed sampling design exhibits superior geometrical properties compared to existing equiangular and Gauss–Legendre sampling schemes, and enables accurate computation of the s-SHT, corroborated through numerical experiments. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
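The sample count N_o = L^2 - s^2 quoted in entry 30 equals the number of spin-s harmonic coefficients up to band-limit L, since spin-s harmonics only exist for degrees ℓ ≥ |s|. A quick check of that count:

```python
# Number of spin-s spherical harmonic coefficients for band-limit L:
# the sum over degrees ell = |s|, ..., L-1 of (2*ell + 1) equals L^2 - s^2.
def n_coeffs(L, s):
    return sum(2 * ell + 1 for ell in range(abs(s), L))

for L, s in [(16, 0), (16, 2), (64, 2)]:
    assert n_coeffs(L, s) == L**2 - s**2
    print(L, s, n_coeffs(L, s))
```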
31. Robust sparse image reconstruction of radio interferometric observations with PURIFY.
- Author
-
Pratley, Luke, McEwen, Jason D., d'Avezac, Mayeul, Carrillo, Rafael E., Onose, Alexandru, and Wiaux, Yves
- Subjects
- *
RADIO interferometers , *INTERFEROMETRY , *IMAGE processing , *IMAGE reconstruction algorithms , *IMAGE reconstruction , *COMPRESSED sensing - Abstract
Next-generation radio interferometers, such as the Square Kilometre Array, will revolutionize our understanding of the Universe through their unprecedented sensitivity and resolution. However, to realize these goals significant challenges in image and data processing need to be overcome. The standard methods in radio interferometry for reconstructing images, such as CLEAN, have served the community well over the last few decades and have survived largely because they are pragmatic. However, they produce reconstructed interferometric images that are limited in quality and scalability for big data. In this work, we apply and evaluate alternative interferometric reconstruction methods that make use of state-of-the-art sparse image reconstruction algorithms motivated by compressive sensing, which have been implemented in the PURIFY software package. In particular, we implement and apply the proximal alternating direction method of multipliers algorithm presented in a recent article. First, we assess the impact of the interpolation kernel used to perform gridding and degridding on sparse image reconstruction. We find that the Kaiser-Bessel interpolation kernel performs as well as prolate spheroidal wave functions while providing a computational saving and an analytic form. Secondly, we apply PURIFY to real interferometric observations from the Very Large Array and the Australia Telescope Compact Array and find that images recovered by PURIFY are of higher quality than those recovered by CLEAN. Thirdly, we discuss how PURIFY reconstructions exhibit additional advantages over those recovered by CLEAN. The latest version of PURIFY, with developments presented in this work, is made publicly available. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
32. A randomised primal-dual algorithm for distributed radio-interferometric imaging.
- Author
-
Onose, Alexandru, Carrillo, Rafael E., McEwen, Jason D., and Wiaux, Yves
- Published
- 2016
- Full Text
- View/download PDF
33. Sparse Image Reconstruction on the Sphere: Analysis and Synthesis.
- Author
-
Wallis, Christopher G. R., Wiaux, Yves, and McEwen, Jason D.
- Subjects
IMAGE reconstruction ,IMAGE processing ,NP-complete problems ,INVERSE problems ,WAVELETS (Mathematics) ,STATISTICAL sampling - Abstract
We develop techniques to solve ill-posed inverse problems on the sphere by sparse regularization, exploiting sparsity in both axisymmetric and directional scale-discretized wavelet space. Denoising, inpainting, and deconvolution problems, and combinations thereof, are considered as examples. Inverse problems are solved in both the analysis and synthesis settings, with a number of different sampling schemes. The most effective approach is that with the most restricted solution-space, which depends on the interplay between the adopted sampling scheme, the selection of the analysis/synthesis problem, and any weighting of the ℓ1 norm appearing in the regularization problem. More efficient sampling schemes on the sphere improve reconstruction fidelity by restricting the solution-space and also by improving sparsity in wavelet space. We apply the technique to denoise Planck 353-GHz observations, improving the ability to extract the structure of Galactic dust emission, which is important for studying Galactic magnetism. [ABSTRACT FROM PUBLISHER]
- Published
- 2017
- Full Text
- View/download PDF
34. The limits of cosmic shear.
- Author
-
Kitching, Thomas D., Alsing, Justin, Heavens, Alan F., Jimenez, Raul, McEwen, Jason D., and Verde, Licia
- Subjects
HANKEL functions ,APPROXIMATION theory ,POWER spectra ,METAPHYSICAL cosmology ,UNIVERSE - Abstract
In this paper, we discuss the commonly used limiting cases, or approximations, for two-point cosmic-shear statistics. We discuss the most prominent assumptions in this statistic: the flat-sky (small angle limit), the Limber (Bessel-to-delta function limit) and the Hankel transform (large ℓ-mode limit) approximations; that the vast majority of cosmic-shear results to date have used simultaneously. We find that the combined effect of these approximations can suppress power by ≳ 1 per cent on scales of ℓ ≲ 40. A fully non-approximated cosmic-shear study should use a spherical-sky, non-Limber-approximated power spectrum analysis and a transform involving Wigner small-d matrices in place of the Hankel transform. These effects, unaccounted for, would constitute at least 11 per cent of the total budget for systematic effects for a power spectrum analysis of a Euclid-like experiment; but they are unnecessary. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
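One of the approximations examined in entry 34 is the Limber approximation, which collapses the exact spherical projection onto a single comoving-distance integral. For context, a commonly used (extended, first-order) Limber form of the lensing power spectrum is sketched below; the notation (lensing efficiency kernel q(χ), matter power spectrum P_δ, horizon distance χ_H) is assumed here rather than taken from the paper.

```latex
% Standard (extended) Limber approximation for the cosmic-shear power spectrum:
C_\ell \;\approx\; \int_0^{\chi_{\rm H}} \mathrm{d}\chi \,
\frac{q^2(\chi)}{\chi^2}\,
P_\delta\!\left(k = \frac{\ell + 1/2}{\chi},\, \chi\right)
```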
35. Wavelet reconstruction of E and B modes for CMB polarization and cosmic shear analyses.
- Author
-
Leistedt, Boris, McEwen, Jason D., Büttner, Martin, and Peiris, Hiranya V.
- Subjects
- *
SPECTRUM analysis , *TELESCOPES , *POLARIZATION (Nuclear physics) , *GALAXY spectra , *SPACE - Abstract
We present new methods for mapping the curl-free (E-mode) and divergence-free (B-mode) components of spin 2 signals using spin directional wavelets. Our methods are equally applicable to measurements of the polarization of the cosmic microwave background (CMB) and the shear of galaxy shapes due to weak gravitational lensing. We derive pseudo- and pure wavelet estimators, where E-B mixing arising due to incomplete sky coverage is suppressed in wavelet space using scale- and orientation-dependent masking and weighting schemes. In the case of the pure estimator, ambiguous modes (which have vanishing curl and divergence simultaneously on the incomplete sky) are also cancelled. On simulations, we demonstrate the improvement (i.e. reduction in leakage) provided by our wavelet space estimators over standard harmonic space approaches. Our new methods can be directly interfaced in a coherent and computationally efficient manner with component separation or feature extraction techniques that also exploit wavelets. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
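The E and B modes mapped in entry 35 are the curl-free and divergence-free parts of the spin-2 polarization or shear field. In harmonic space, one standard convention relates them to the spin-(±2) coefficients as below; this is quoted as textbook background (sign conventions vary between references), not from the paper itself.

```latex
% One standard harmonic-space E/B decomposition of a spin-2 field
% P = Q + iU, with spin-(\pm 2) coefficients {}_{\pm 2}a_{\ell m}:
E_{\ell m} = -\tfrac{1}{2}\left({}_{2}a_{\ell m} + {}_{-2}a_{\ell m}\right),
\qquad
B_{\ell m} = \tfrac{i}{2}\left({}_{2}a_{\ell m} - {}_{-2}a_{\ell m}\right)
```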
36. Second-Generation Curvelets on the Sphere.
- Author
-
Chan, Jennifer Y. H., Leistedt, Boris, Kitching, Thomas D., and McEwen, Jason D.
- Subjects
CURVELET transforms ,WAVELET transforms ,CURVILINEAR motion ,DISCRETIZATION methods ,SIGNAL processing - Abstract
Curvelets are efficient at representing highly anisotropic signal content, such as local linear and curvilinear structures. First-generation curvelets on the sphere, however, suffered from blocking artefacts. We present a new second-generation curvelet transform, where scale-discretized curvelets are constructed directly on the sphere. Scale-discretized curvelets exhibit a parabolic scaling relation, are well localized in both spatial and harmonic domains, support the exact analysis and synthesis of both scalar and spin signals, and are free of blocking artefacts. We present fast algorithms to compute the exact curvelet transform, reducing computational complexity from O(L^5) to O(L^3 log_2 L) for signals band-limited at L. The implementation of these algorithms is made publicly available. Finally, we present an illustrative application demonstrating the effectiveness of curvelets for representing directional curve-like features in natural spherical images. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
37. Spin-SILC: CMB polarization component separation with spin wavelets.
- Author
-
Rogers, Keir K., Peiris, Hiranya V., Leistedt, Boris, McEwen, Jason D., and Pontzen, Andrew
- Subjects
COSMIC background radiation ,POLARIZATION (Nuclear physics) ,MICROWAVES ,WAVELETS (Mathematics) ,ALGORITHMS - Abstract
We present Spin-SILC, a new foreground component separation method that accurately extracts the cosmic microwave background (CMB) polarization E and B modes from raw multifrequency Stokes Q and U measurements of the microwave sky. Spin-SILC is an internal linear combination method that uses spin wavelets to analyse the spin-2 polarization signal P = Q + iU. The wavelets are additionally directional (non-axisymmetric). This allows different morphologies of signals to be separated and therefore the cleaning algorithm is localized using an additional domain of information. The advantage of spin wavelets over standard scalar wavelets is to simultaneously and self-consistently probe scales and directions in the polarization signal P = Q + iU and in the underlying E and B modes, therefore providing the ability to perform component separation and E-B decomposition concurrently for the first time. We test Spin-SILC on full-mission Planck simulations and data and show the capacity to correctly recover the underlying cosmological E and B modes. We also demonstrate a strong consistency of our CMB maps with those derived from existing component separation methods. Spin-SILC can be combined with the pseudo- and pure E-B spin wavelet estimators presented in a companion paper to reliably extract the cosmological signal in the presence of complicated sky cuts and noise. Therefore, it will provide a computationally efficient method to accurately extract the CMB E and B modes for future polarization experiments. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
38. Scalable splitting algorithms for big-data interferometric imaging in the SKA era.
- Author
-
Onose, Alexandru, Carrillo, Rafael E., Repetti, Audrey, McEwen, Jason D., Thiran, Jean-Philippe, Pesquet, Jean-Christophe, and Wiaux, Yves
- Subjects
RADIO telescopes ,INTERFEROMETRY ,IMAGE reconstruction ,MATHEMATICAL regularization ,MATHEMATICAL optimization ,BIG data - Abstract
In the context of next-generation radio telescopes, like the Square Kilometre Array (SKA), the efficient processing of large-scale data sets is extremely important. Convex optimization tasks under the compressive sensing framework have recently emerged and provide both enhanced image reconstruction quality and scalability to increasingly larger data sets. We focus herein mainly on scalability and propose two new convex optimization algorithmic structures able to solve the convex optimization tasks arising in radio-interferometric imaging. They rely on proximal splitting and forward-backward iterations and can be seen, by analogy with the CLEAN major-minor cycle, as running sophisticated CLEAN-like iterations in parallel in multiple data, prior, and image spaces. Both methods support any convex regularization function, in particular the well-studied ℓ1 priors promoting image sparsity in an adequate domain. Tailored for big-data, they employ parallel and distributed computations to achieve scalability, in terms of memory and computational requirements. One of them also exploits randomization, over data blocks at each iteration, offering further flexibility. We present simulation results showing the feasibility of the proposed methods as well as their advantages compared to state-of-the-art algorithmic solvers. Our MATLAB code is available online on GitHub. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
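The algorithmic structures in entry 38 are built from forward-backward (proximal-splitting) iterations. For reference, the generic forward-backward update for an objective f(x) + g(x), with f smooth and g convex but possibly non-smooth, is written below in assumed notation.

```latex
% Generic forward-backward (proximal-gradient) iteration for minimizing f(x) + g(x):
x^{(k+1)} = \operatorname{prox}_{\gamma g}\!\bigl(x^{(k)} - \gamma \nabla f(x^{(k)})\bigr),
\qquad
\operatorname{prox}_{\gamma g}(z) = \arg\min_{u}\; \tfrac{1}{2}\|u - z\|_2^2 + \gamma\, g(u)
```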
39. A framework for testing isotropy with the cosmic microwave background.
- Author
-
Saadeh, Daniela, Feeney, Stephen M., Pontzen, Andrew, Peiris, Hiranya V., and McEwen, Jason D.
- Subjects
COSMIC background radiation ,ISOTROPY subgroups ,DEGREES of freedom ,BIANCHI groups - Abstract
We present a new framework for testing the isotropy of the Universe using cosmic microwave background data, building on the nested-sampling ANICOSMO code. Uniquely, we are able to constrain the scalar, vector and tensor degrees of freedom alike; previous studies only considered the vector mode (linked to vorticity). We employ Bianchi type VII_h cosmologies to model the anisotropic Universe, from which other types may be obtained by taking suitable limits. In a separate development, we improve the statistical analysis by including the effect of Bianchi power in the high-ℓ, as well as the low-ℓ, likelihood. To understand the effect of all these changes, we apply our new techniques to Wilkinson Microwave Anisotropy Probe data. We find no evidence for anisotropy, constraining shear in the vector mode to (σ_V/H)_0 < 1.7 × 10^-10 (95 per cent confidence level). For the first time, we place limits on the tensor mode; unlike other modes, the tensor shear can grow from a near-isotropic early Universe. The limit on this type of shear is (σ_T,reg/H)_0 < 2.4 × 10^-7 (95 per cent confidence level). [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
40. SILC: a new Planck internal linear combination CMB temperature map using directional wavelets.
- Author
-
Rogers, Keir K., Peiris, Hiranya V., Leistedt, Boris, McEwen, Jason D., and Pontzen, Andrew
- Subjects
WAVELETS (Mathematics) ,SIGNAL processing ,ANISOTROPY ,METAPHYSICAL cosmology ,ALGORITHMS - Abstract
We present new clean maps of the cosmic microwave background (CMB) temperature anisotropies (as measured by Planck) constructed with a novel internal linear combination (ILC) algorithm using directional, scale-discretised wavelets: the Scale-discretised, directional wavelet Internal Linear Combination (SILC). Directional wavelets, when convolved with signals on the sphere, can separate the anisotropic filamentary structures which are characteristic of both the CMB and foregrounds. Extending previous component separation methods, which use the frequency, spatial and harmonic signatures of foregrounds to separate them from the cosmological background signal, SILC can additionally use morphological information in the foregrounds and CMB to better localize the cleaning algorithm. We test the method on Planck data and simulations, demonstrating consistency with existing component separation algorithms, and discuss how to optimize the use of morphological information by varying the number of directional wavelets as a function of spatial scale. We find that combining the use of directional and axisymmetric wavelets depending on scale could yield higher quality CMB temperature maps. Our results set the stage for the application of SILC to polarization anisotropies through an extension to spin wavelets. [ABSTRACT FROM AUTHOR] (See the illustrative sketch following this entry.)
- Published
- 2016
- Full Text
- View/download PDF
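The cleaning step that SILC applies to each wavelet (scale, direction) coefficient map is a variance-minimising internal linear combination; the sketch below shows the textbook ILC weight formula applied to a single stack of per-frequency coefficient maps. The empirical covariance estimate, map shapes and foreground model are illustrative assumptions, not the SILC implementation.

```python
import numpy as np

def ilc_weights(cov):
    """Standard ILC weights w = C^{-1} a / (a^T C^{-1} a), where a is the CMB
    frequency response (unity in thermodynamic temperature units)."""
    a = np.ones(cov.shape[0])
    c_inv_a = np.linalg.solve(cov, a)
    return c_inv_a / (a @ c_inv_a)

def ilc_clean(freq_maps):
    """Combine per-frequency coefficient maps of shape (n_freq, n_pix) into a
    single cleaned map that preserves the CMB while minimising total variance.
    In SILC this is repeated for every wavelet scale and direction."""
    w = ilc_weights(np.cov(freq_maps))
    return w @ freq_maps

# Toy example: a common CMB component plus a frequency-scaled foreground plus noise.
rng = np.random.default_rng(2)
n_pix = 5000
cmb = rng.normal(size=n_pix)
foreground = rng.normal(size=n_pix)
scalings = np.array([3.0, 1.5, 0.5, 0.2])   # hypothetical foreground frequency scaling
maps = cmb + scalings[:, None] * foreground + 0.1 * rng.normal(size=(4, n_pix))
cleaned = ilc_clean(maps)
```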
41. Gauss-Legendre Sampling on the Rotation Group.
- Author
-
Khalid, Zubair, Durrani, Salman, Kennedy, Rodney A., Wiaux, Yves, and McEwen, Jason D.
- Subjects
FOURIER transforms ,DEGREES of freedom ,FOURIER analysis ,MATHEMATICAL transformations ,LEGENDRE'S functions - Abstract
We propose a Gauss-Legendre quadrature-based sampling on the rotation group for the representation of a band-limited signal such that the Fourier transform (FT) of the signal can be exactly computed from its samples. Our figure of merit is the sampling efficiency, defined as the ratio of the degrees of freedom required to represent a band-limited signal in the harmonic domain to the number of samples required to accurately compute the FT. The proposed sampling scheme is asymptotically as efficient as the most efficient scheme developed recently. For the computation of the FT and inverse FT, we also develop fast algorithms with complexity similar to that attained by the fast algorithms for existing sampling schemes. The developed algorithms are stable and accurate and do not have any pre-computation requirements. We also analyse the computation time and numerical accuracy of the proposed algorithms and show, through numerical experiments, that the proposed Fourier transforms are accurate with errors on the order of numerical precision. [ABSTRACT FROM PUBLISHER] (See the illustrative sketch following this entry.)
- Published
- 2016
- Full Text
- View/download PDF
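The figure of merit in this abstract, sampling efficiency, is simply the harmonic-space dimension divided by the number of samples. The sketch below computes that dimension for a band-limit L on the rotation group and evaluates the ratio for two placeholder sample counts; those counts are assumptions for illustration only, not the paper's scheme.

```python
def so3_harmonic_dimension(L):
    """Degrees of freedom of a signal band-limited at L on SO(3):
    sum_{l=0}^{L-1} (2l + 1)^2 = L(2L - 1)(2L + 1)/3."""
    return L * (2 * L - 1) * (2 * L + 1) // 3

def sampling_efficiency(L, n_samples):
    """Sampling efficiency as defined in the abstract: harmonic degrees of
    freedom divided by the number of samples used to compute the FT."""
    return so3_harmonic_dimension(L) / n_samples

# Hypothetical sample counts, for illustration only (not the paper's scheme).
for L in (16, 64, 256):
    print(L,
          round(sampling_efficiency(L, 2 * L ** 3), 3),   # a ~2L^3-sample scheme
          round(sampling_efficiency(L, 4 * L ** 3), 3))   # a ~4L^3-sample scheme
```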
42. Purify: A new algorithmic framework for next-generation radio-interferometric imaging.
- Author
-
Carrillo, Rafael E., McEwen, Jason D., and Wiaux, Yves
- Published
- 2014
- Full Text
- View/download PDF
43. A Novel Sampling Theorem on the Rotation Group.
- Author
-
McEwen, Jason D., Buttner, Martin, Leistedt, Boris, Peiris, Hiranya V., and Wiaux, Yves
- Subjects
HARMONIC analysis (Mathematics) ,ROTATION groups ,STATISTICAL sampling ,SPHERES ,WIGNER distribution - Abstract
We develop a novel sampling theorem for functions defined on the three-dimensional rotation group SO(3) by connecting the rotation group to the three-torus through a periodic extension. Our sampling theorem requires 4L^3 samples to capture all of the information content of a signal band-limited at L, reducing the number of required samples by a factor of two compared to other equiangular sampling theorems. We present fast algorithms to compute the associated Fourier transform on the rotation group, the so-called Wigner transform, which scale as O(L^4), compared to the naive scaling of O(L^6). For the common case of a low directional band-limit N, complexity is reduced to O(NL^3). Our fast algorithms will be of direct use in speeding up the computation of directional wavelet transforms on the sphere. We make our SO3 code implementing these algorithms publicly available. [ABSTRACT FROM PUBLISHER] (See the illustrative sketch following this entry.)
- Published
- 2015
- Full Text
- View/download PDF
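The sketch below only restates the figures quoted in this abstract as a quick back-of-the-envelope comparison: the 4L^3 sample count (roughly half that of other equiangular theorems) and the O(L^4), O(L^6) and O(NL^3) complexities. Constants are omitted and the "other equiangular" count is the approximate factor-of-two figure; nothing here implements the Wigner transform itself.

```python
def samples_proposed(L):
    """Sample count quoted for the proposed equiangular theorem on SO(3): 4L^3."""
    return 4 * L ** 3

def samples_other_equiangular(L):
    """Approximate count for other equiangular theorems, per the quoted
    factor-of-two reduction (illustrative figure only)."""
    return 8 * L ** 3

def speedup_fast_vs_naive(L, N=None):
    """Ratio of the quoted naive O(L^6) cost to the fast O(L^4) cost, or to
    O(N L^3) when a low directional band-limit N applies (constants omitted)."""
    fast = L ** 4 if N is None else N * L ** 3
    return L ** 6 // fast

for L in (64, 256, 1024):
    print(L, samples_proposed(L), samples_other_equiangular(L),
          speedup_fast_vs_naive(L), speedup_fast_vs_naive(L, N=5))
```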
44. Compressed sensing for radio interferometric imaging: Review and future direction.
- Author
-
McEwen, Jason D. and Wiaux, Yves
- Published
- 2011
- Full Text
- View/download PDF
45. Detecting dark energy with wavelets on the sphere.
- Author
-
McEwen, Jason D.
- Published
- 2007
- Full Text
- View/download PDF
46. An Optimal-Dimensionality Sampling Scheme on the Sphere With Fast Spherical Harmonic Transforms.
- Author
-
Khalid, Zubair, Kennedy, Rodney A., and McEwen, Jason D.
- Subjects
HARMONIC analysis (Mathematics) ,SAMPLING (Process) ,SPHERICAL harmonics ,MATHEMATICAL analysis ,MATHEMATICS research - Abstract
We develop a sampling scheme on the sphere that permits accurate computation of the spherical harmonic transform and its inverse for signals band-limited at L using only L^2 samples. We obtain the optimal number of samples, given by the degrees of freedom of the signal in harmonic space. The number of samples required in our scheme is a factor of two or four fewer than required by existing techniques, which use either 2L^2 or 4L^2 samples. We note, however, that we do not recover a sampling theorem on the sphere, where spherical harmonic transforms are theoretically exact. Nevertheless, we achieve high accuracy even for very large band-limits. For our optimal-dimensionality sampling scheme, we develop a fast and accurate algorithm to compute the spherical harmonic transform (and inverse), with computational complexity comparable to that of existing schemes in practice. We conduct numerical experiments to study in detail the stability, accuracy and computational complexity of the proposed transforms. We also highlight the advantages of the proposed sampling scheme and associated transforms in the context of potential applications. [ABSTRACT FROM PUBLISHER] (See the illustrative sketch following this entry.)
- Published
- 2014
- Full Text
- View/download PDF
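For context on the sample counts in this abstract, the sketch below tabulates the harmonic degrees of freedom on the sphere, L^2, against the quoted L^2, 2L^2 and 4L^2 sample counts. These are leading-order figures taken directly from the abstract and used only for a simple comparison; no transform is implemented.

```python
def sphere_harmonic_dimension(L):
    """Degrees of freedom of a signal band-limited at L on the sphere:
    sum_{l=0}^{L-1} (2l + 1) = L^2."""
    return L ** 2

# Leading-order sample counts quoted in the abstract.
def samples_optimal(L):
    return L ** 2        # proposed optimal-dimensionality scheme

def samples_existing_2L2(L):
    return 2 * L ** 2    # existing schemes requiring ~2L^2 samples

def samples_existing_4L2(L):
    return 4 * L ** 2    # existing schemes requiring ~4L^2 samples

for L in (128, 512, 2048):
    print(L, sphere_harmonic_dimension(L), samples_optimal(L),
          samples_existing_2L2(L), samples_existing_4L2(L))
```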
47. On spin scale-discretised wavelets on the sphere for the analysis of CMB polarisation.
- Author
-
McEwen, Jason D., Büttner, Martin, Leistedt, Boris, Peiris, Hiranya V., Vandergheynst, Pierre, Wiaux, Yves, Heavens, A. F., Starck, J.-L., and Krone-Martins, A.
- Abstract
A new spin wavelet transform on the sphere is proposed to analyse the polarisation of the cosmic microwave background (CMB), a spin ± 2 signal observed on the celestial sphere. The scalar directional scale-discretised wavelet transform on the sphere is extended to analyse signals of arbitrary spin. The resulting spin scale-discretised wavelet transform probes the directional intensity of spin signals. A procedure is presented using this new spin wavelet transform to recover E- and B-mode signals from partial-sky observations of CMB polarisation. [ABSTRACT FROM PUBLISHER]
- Published
- 2014
- Full Text
- View/download PDF
48. 3D weak lensing with spin wavelets on the ball.
- Author
-
Leistedt, Boris, McEwen, Jason D., Kitching, Thomas D., and Peiris, Hiranya V.
- Subjects
WAVELET transforms ,GRAVITATIONAL lenses - Abstract
We construct the spin flaglet transform, a wavelet transform to analyze spin signals in three dimensions. Spin flaglets can probe signal content localized simultaneously in space and frequency and, moreover, are separable so that their angular and radial properties can be controlled independently. They are particularly suited to analyzing cosmological observations such as the weak gravitational lensing of galaxies. Such observations have a unique 3D geometrical setting since they are natively made on the sky, have spin angular symmetries, and are extended in the radial direction by additional distance or redshift information. Flaglets are constructed in the harmonic space defined by the Fourier-Laguerre transform, previously defined for scalar functions and extended here to signals with spin symmetries. Thanks to various sampling theorems, both the Fourier-Laguerre and flaglet transforms are theoretically exact when applied to band-limited signals. In other words, in numerical computations the only loss of information is due to the finite representation of floating point numbers. We develop a 3D framework relating the weak lensing power spectrum to covariances of flaglet coefficients. We suggest that the resulting novel flaglet weak lensing estimator offers a powerful alternative to common 2D and 3D approaches to accurately capture cosmological information. While standard weak lensing analyses focus on either real- or harmonic-space representations (i.e., correlation functions or Fourier-Bessel power spectra, respectively), a wavelet approach inherits the advantages of both techniques, where both complicated sky coverage and uncertainties associated with the physical modeling of small scales can be handled effectively. Our codes to compute the Fourier-Laguerre and flaglet transforms are made publicly available. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
49. Fast Directional Spatially Localized Spherical Harmonic Transform.
- Author
-
Khalid, Zubair, Kennedy, Rodney A., Durrani, Salman, Sadeghi, Parastoo, Wiaux, Yves, and McEwen, Jason D.
- Subjects
SPHERICAL harmonics ,SPECTRUM analysis ,ARBITRARY constants ,BIG data ,NUMERICAL analysis - Abstract
We propose a transform for signals defined on the sphere that reveals their localized directional content in the spatio-spectral domain when used in conjunction with an asymmetric window function. We call this transform the directional spatially localized spherical harmonic transform (directional SLSHT), which extends the SLSHT from the literature whose usefulness is limited to symmetric windows. We present an inversion relation to synthesize the original signal from its directional-SLSHT distribution for an arbitrary window function. As an example of an asymmetric window, the most concentrated band-limited eigenfunction in an elliptical region on the sphere is proposed for directional spatio-spectral analysis, and its effectiveness is illustrated on synthetic and Mars topographic data-sets. Finally, since such typical data-sets on the sphere are of considerable size and the directional SLSHT is intrinsically computationally demanding depending on the band-limits of the signal and window, a fast algorithm for the efficient computation of the transform is developed. The floating-point precision numerical accuracy of the fast algorithm is demonstrated and a full numerical complexity analysis is presented. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
50. Exact Wavelets on the Ball.
- Author
-
Leistedt, Boris and McEwen, Jason D.
- Subjects
WAVELETS (Mathematics) ,LAGUERRE polynomials ,HARMONIC analysis (Mathematics) ,QUADRATURE domains ,ALGORITHMS - Abstract
We develop an exact wavelet transform on the three-dimensional ball (i.e. on the solid sphere), which we name the flaglet transform. For this purpose we first construct an exact transform on the radial half-line using damped Laguerre polynomials and develop a corresponding quadrature rule. Combined with the spherical harmonic transform, this approach leads to a sampling theorem on the ball and a novel three-dimensional decomposition which we call the Fourier-Laguerre transform. We relate this new transform to the well-known Fourier-Bessel decomposition and show that band-limitedness in the Fourier-Laguerre basis is a sufficient condition to compute the Fourier-Bessel decomposition exactly. We then construct the flaglet transform on the ball through a harmonic tiling, which is exact thanks to the exactness of the Fourier-Laguerre transform (from which the name flaglets is coined). The corresponding wavelet kernels are well localised in real and Fourier-Laguerre spaces and their angular aperture is invariant under radial translation. We introduce a multiresolution algorithm to perform the flaglet transform rapidly, while capturing all information at each wavelet scale in the minimal number of samples on the ball. Our implementation of these new tools achieves floating-point precision and is made publicly available. We perform numerical experiments demonstrating the speed and accuracy of these libraries and illustrate their capabilities on a simple denoising example. [ABSTRACT FROM PUBLISHER] (See the illustrative sketch following this entry.)
- Published
- 2012
- Full Text
- View/download PDF
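The central building block described in this abstract is an exact quadrature rule on the radial half-line. The sketch below illustrates the general principle with standard Gauss-Laguerre quadrature from SciPy, which integrates polynomials against exp(-r) on [0, ∞) exactly up to degree 2n - 1 for n nodes. This is only an analogue of the idea, not the paper's damped-Laguerre construction or its specific quadrature rule.

```python
import numpy as np
from math import factorial
from scipy.special import roots_laguerre

# Gauss-Laguerre nodes/weights: exact for polynomials of degree <= 2n - 1
# integrated against the weight exp(-r) on the half-line [0, inf).
n_nodes = 5
nodes, weights = roots_laguerre(n_nodes)

for degree in range(2 * n_nodes):
    quadrature = np.sum(weights * nodes ** degree)  # estimate of the integral of r^d * exp(-r)
    analytic = factorial(degree)                    # exact value of that integral
    assert np.isclose(quadrature, analytic, rtol=1e-8), (degree, quadrature, analytic)

print("Gauss-Laguerre quadrature exact (to float precision) up to degree", 2 * n_nodes - 1)
```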