DeepGraviLens: a Multi-Modal Architecture for Classifying Gravitational Lensing Data
- Source: Neural Comput & Applic (2023)
- Publication Year: 2022
Abstract
- Gravitational lensing is the relativistic effect generated by massive bodies, which bend the space-time surrounding them. It is a deeply investigated topic in astrophysics: it allows validating theoretical relativistic results and studying faint astrophysical objects that would not be visible otherwise. In recent years, Machine Learning methods have been applied to support the analysis of the gravitational lensing phenomenon by detecting lensing effects in data sets consisting of images associated with brightness variation time series. However, state-of-the-art approaches either consider only images and neglect time-series data or achieve relatively low accuracy on the most difficult data sets. This paper introduces DeepGraviLens, a novel multi-modal network that classifies spatio-temporal data belonging to one non-lensed system type and three lensed system types. It surpasses the current state-of-the-art accuracy results by $\approx 3\%$ to $\approx 11\%$, depending on the considered data set. Such an improvement will enable the acceleration of the analysis of lensed objects in upcoming astrophysical surveys, which will exploit the petabytes of data collected, e.g., from the Vera C. Rubin Observatory.
- Comment: This preprint has not undergone peer review or any post-submission improvements or corrections. The Version of Record of this article is published in Neural Computing and Applications, and is available online at https://doi.org/10.1007/s00521-023-08766-9
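- The abstract only sketches the architecture at a high level (images fused with brightness time series, four output classes). The snippet below is a minimal, hypothetical PyTorch sketch of that kind of multi-modal classifier, not the published DeepGraviLens configuration; the class name, band count, cutout size, and layer widths are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MultiModalLensClassifier(nn.Module):
    """Toy multi-modal classifier: a CNN branch for image cutouts and an
    LSTM branch for brightness (light-curve) time series, fused into a
    4-way head (one non-lensed class, three lensed classes)."""

    def __init__(self, n_bands: int = 4, n_classes: int = 4):
        super().__init__()
        # Image branch: small CNN over multi-band cutouts.
        self.cnn = nn.Sequential(
            nn.Conv2d(n_bands, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),  # -> (batch, 32)
        )
        # Time-series branch: LSTM over per-band brightness measurements.
        self.lstm = nn.LSTM(input_size=n_bands, hidden_size=32, batch_first=True)
        # Fusion head: concatenate both embeddings and classify.
        self.head = nn.Sequential(
            nn.Linear(32 + 32, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, images: torch.Tensor, series: torch.Tensor) -> torch.Tensor:
        # images: (batch, n_bands, H, W); series: (batch, T, n_bands)
        img_feat = self.cnn(images)
        _, (h_n, _) = self.lstm(series)
        ts_feat = h_n[-1]  # hidden state of the last LSTM layer
        return self.head(torch.cat([img_feat, ts_feat], dim=1))  # class logits

# Example with dummy shapes (45x45-pixel cutouts, 20 epochs of photometry).
model = MultiModalLensClassifier()
logits = model(torch.randn(8, 4, 45, 45), torch.randn(8, 20, 4))
print(logits.shape)  # torch.Size([8, 4])
```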
- Subjects:
  - Astrophysics - Instrumentation and Methods for Astrophysics
  - Astrophysics - Cosmology and Nongalactic Astrophysics
  - Computer Science - Artificial Intelligence
  - Computer Science - Computer Vision and Pattern Recognition
  - Computer Science - Machine Learning
  - General Relativity and Quantum Cosmology
Details
- Database: arXiv
- Journal: Neural Comput & Applic (2023)
- Publication Type: Report
- Accession number: edsarx.2205.00701
- Document Type: Working Paper
- Full Text: https://doi.org/10.1007/s00521-023-08766-9