13,153 results for "Malik, A"
Search Results
2. Pakistan's Coming Crisis
- Author
- Malik, Adeel and Tudor, Maya
- Published
- 2024
3. 5 Institutionalising epistocracy
- Author
- Malik, Ali
- Published
- 2024
4. References
- Author
- Malik, Ali
- Published
- 2024
5. Appendix A: List of official sources analysed for the doctoral study (2013-17)
- Author
- Malik, Ali
- Published
- 2024
6. Index
- Author
- Malik, Ali
- Published
- 2024
7. Appendix C: SIPR-SPA-Police Scotland Think Tanks Terms of Reference
- Author
- Malik, Ali
- Published
- 2024
8. 3 Scottish police reform and localism
- Author
- Malik, Ali
- Published
- 2024
9. 1 Introduction
- Author
- Malik, Ali
- Published
- 2024
10. 4 Paradoxes and dilemmas: operational independence and internal governance
- Author
- Malik, Ali
- Published
- 2024
11. Appendix B: Breakdown of interviewees
- Author
- Malik, Ali
- Published
- 2024
12. 6 Conclusion
- Author
- Malik, Ali
- Published
- 2024
13. 2 Developments in police governance: from democracy to epistocracy
- Author
- Malik, Ali
- Published
- 2024
14. Acknowledgements
- Author
- Malik, Ali
- Published
- 2024
15. Cover
- Author
- Malik, Ali
- Published
- 2024
16. Table of Contents
- Author
- Malik, Ali
- Published
- 2024
17. Title Page, Copyright
- Author
- Malik, Ali
- Published
- 2024
18. List of abbreviations and definitions
- Author
- Malik, Ali
- Published
- 2024
19. List of figures and table
- Author
- Malik, Ali
- Published
- 2024
20. Comparative performance of Chilli (Capsicum annum L.) genotypes for quality traits under temperate condition of Kashmir valley
- Author
- Rashid, Majid, Hussain, Khursheed, Malik, Ajaz Ahmad, Sofi, Najeeb-Ur-Rehman, Magray, M. Mudasir, Khan, Imran, Ara, Shoukat, Hussain, Syed Mazahir, Farwah, Syeda, Lone, Sameena, and Totawar, H.M.
- Published
- 2024
21. Allele mining for Granule bound starch synthase1 (GBSS1) gene governing amylose content in aromatic rice (Oryza sativa L.) germplasm
- Author
- Dixit, Deepshikha, Siddiqui, N., Bollinedi, Haritha, Krishnan, Gopala S., Malik, Ankit, Bhowmick, P. K., Ellur, R. K., Nagarajan, M., Vinod, K. K., and Singh, A. K.
- Published
- 2024
22. Soft Gripping System for Space Exploration Legged Robots
- Author
- Candalot, Arthur, Hashim, Malik-Manel, Hickey, Brigid, Laine, Mickael, Hunter-Scullion, Mitch, and Yoshida, Kazuya
- Subjects
- Computer Science - Robotics
- Abstract
Although wheeled robots have been predominant for planetary exploration, their geometry limits their capabilities when traveling over steep slopes, through rocky terrain, and in microgravity. Legged robots equipped with grippers are a viable alternative for overcoming these obstacles. This paper proposes a gripping system that can provide legged space-explorer robots with a reliable anchor on uneven rocky terrain. The gripper delivers the benefits of soft gripping technology, using segmented tendon-driven fingers to adapt to the target shape, and creates strong adhesion to rocky surfaces with the help of microspines. Its gripping performance is showcased, and multiple experiments demonstrate the impact of pulling angle, target shape, spine configuration, and actuation power on that performance. The results show that the proposed gripper can be a suitable solution for advanced space exploration, including climbing, lunar-cave exploration, or exploration of asteroid surfaces., Comment: The 27th edition of the International Conference Series on Climbing and Walking Robots and the Support Technologies for Mobile Machines (CLAWAR)
- Published
- 2024
23. Designing for Cooperative Grain Boundary Segregation in Multicomponent Alloys
- Author
- Wagih, Malik, Naunheim, Yannick, Lei, Tianjiao, and Schuh, Christopher A.
- Subjects
- Condensed Matter - Materials Science
- Abstract
Tailoring the nanoscale distribution of chemical species at grain boundaries is a powerful method to dramatically influence the properties of polycrystalline materials. However, classical approaches to the problem have tacitly assumed that only competition is possible between solute species. In this paper, we show that solute elements can cooperate in the way they segregate to grain boundaries: in properly targeted alloys, the different chemical species cooperate to each fill complementary grain boundary sites disfavored by the other. By developing a theoretical "spectral" approach to this problem based on quantum-accurate grain boundary site distributions, we show how grain boundaries can be cooperatively alloyed, whether by depletion or enrichment. We provide machine-learned co-segregation information for over 700 ternary aluminum-based alloys, and experimentally validate the concept in one ternary alloy where co-segregation is not expected by prior models, but is expected based on the cooperative model. [A toy numerical sketch of the coupled segregation isotherm follows this entry.]
- Published
- 2024
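A toy illustration (not from the paper) of the spectral, two-solute segregation picture described in the abstract above: each grain-boundary site gets its own segregation energy for solutes A and B, per-site occupancies follow a two-solute Langmuir-McLean isotherm, and anti-correlated site-energy spectra produce "cooperative" filling of complementary sites. All distributions, concentrations, and the temperature below are hypothetical placeholders.

    import numpy as np

    kB = 8.617e-5  # Boltzmann constant, eV/K

    def site_occupancies(dEa, dEb, xA, xB, T):
        """Two-solute Langmuir-McLean isotherm: per-site occupancies of
        solutes A and B competing for the same grain-boundary sites
        (energies in eV; negative energy = segregation favored)."""
        wa = xA * np.exp(-dEa / (kB * T))
        wb = xB * np.exp(-dEb / (kB * T))
        denom = 1.0 - xA - xB + wa + wb
        return wa / denom, wb / denom

    rng = np.random.default_rng(0)
    n = 10_000
    # Hypothetical anti-correlated site-energy spectra: sites disfavored by A
    # (dEa > 0) tend to be favored by B (dEb < 0), i.e. cooperative alloying.
    dEa = rng.normal(-0.20, 0.15, n)
    dEb = -0.5 * dEa + rng.normal(0.0, 0.05, n)

    cA, cB = site_occupancies(dEa, dEb, xA=0.01, xB=0.01, T=600.0)
    print(f"mean GB occupancy: A={cA.mean():.2f}, B={cB.mean():.2f}")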
24. Digitizing Touch with an Artificial Multimodal Fingertip
- Author
- Lambeta, Mike, Wu, Tingfan, Sengul, Ali, Most, Victoria Rose, Black, Nolan, Sawyer, Kevin, Mercado, Romeo, Qi, Haozhi, Sohn, Alexander, Taylor, Byron, Tydingco, Norb, Kammerer, Gregg, Stroud, Dave, Khatha, Jake, Jenkins, Kurt, Most, Kyle, Stein, Neal, Chavira, Ricardo, Craven-Bartle, Thomas, Sanchez, Eric, Ding, Yitian, Malik, Jitendra, and Calandra, Roberto
- Subjects
- Computer Science - Robotics, Computer Science - Artificial Intelligence, Computer Science - Machine Learning, I.2.0, I.2.9
- Abstract
Touch is a crucial sensing modality that provides rich information about object properties and interactions with the physical environment. Humans and robots both benefit from using touch to perceive and interact with the surrounding environment (Johansson and Flanagan, 2009; Li et al., 2020; Calandra et al., 2017). However, no existing systems provide rich, multi-modal digital touch-sensing capabilities through a hemispherical compliant embodiment. Here, we describe several conceptual and technological innovations to improve the digitization of touch. These advances are embodied in an artificial finger-shaped sensor with advanced sensing capabilities. Significantly, this fingertip contains high-resolution sensors (~8.3 million taxels) that respond to omnidirectional touch, capture multi-modal signals, and use on-device artificial intelligence to process the data in real time. Evaluations show that the artificial fingertip can resolve spatial features as small as 7 µm, sense normal and shear forces with a resolution of 1.01 mN and 1.27 mN, respectively, perceive vibrations up to 10 kHz, sense heat, and even sense odor. Furthermore, it embeds an on-device AI neural network accelerator that acts as a peripheral nervous system on a robot and mimics the reflex arc found in humans. These results demonstrate the possibility of digitizing touch with superhuman performance. The implications are profound, and we anticipate potential applications in robotics (industrial, medical, agricultural, and consumer-level), virtual reality and telepresence, prosthetics, and e-commerce. Toward digitizing touch at scale, we open-source a modular platform to facilitate future research on the nature of touch., Comment: 28 pages
- Published
- 2024
25. SFA-UNet: More Attention to Multi-Scale Contrast and Contextual Information in Infrared Small Object Segmentation
- Author
- Shah, Imad Ali, Malik, Fahad Mumtaz, and Ashraf, Muhammad Waqas
- Subjects
- Computer Science - Computer Vision and Pattern Recognition, Computer Science - Artificial Intelligence
- Abstract
Computer vision researchers have extensively worked on fundamental infrared visual recognition for the past few decades. Among various approaches, deep learning has emerged as the most promising candidate. However, Infrared Small Object Segmentation (ISOS) remains a major focus due to several challenges, including: 1) the lack of effective utilization of local contrast and global contextual information; 2) the potential loss of small objects in deep models; and 3) the struggle to capture fine-grained details while ignoring noise. To address these challenges, we propose a modified U-Net architecture, named SFA-UNet, by combining Scharr Convolution (SC) and Fast Fourier Convolution (FFC), in addition to vertical and horizontal Attention gates (AG), into UNet. SFA-UNet utilizes double convolution layers with the addition of SC and FFC in its encoder and decoder layers. SC helps to learn the foreground-to-background contrast information, whereas FFC provides multi-scale contextual information while mitigating the small-object vanishing problem. Additionally, the introduction of vertical AGs in encoder layers enhances the model's focus on the targeted object by ignoring irrelevant regions. We evaluated the proposed approach on the publicly available SIRST and IRSTD datasets and achieved superior performance, by an average of 0.75% with a variance of 0.025 across all combined metrics in multiple runs, compared to existing state-of-the-art methods., Comment: Accepted and Presented at PRIP 2023 [A minimal sketch of the Scharr-convolution building block follows this entry.]
- Published
- 2024
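A minimal PyTorch sketch of the Scharr Convolution (SC) idea named in the abstract above: a depthwise convolution whose kernels are fixed Scharr gradient filters, so the block emphasizes foreground-to-background contrast. The channel counts, ReLU, and learnable 1x1 fusion layer are my assumptions for illustration, not the published SFA-UNet architecture.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ScharrConv(nn.Module):
        """Depthwise conv with fixed Scharr kernels (one Gx and one Gy per channel)."""
        def __init__(self, channels: int):
            super().__init__()
            gx = torch.tensor([[3., 0., -3.],
                               [10., 0., -10.],
                               [3., 0., -3.]]) / 16.0   # scaled Scharr x-gradient
            gy = gx.t()                                  # y-gradient is the transpose
            w = torch.stack([gx, gy]).unsqueeze(1).repeat(channels, 1, 1, 1)  # (2C,1,3,3)
            self.grad = nn.Conv2d(channels, 2 * channels, 3, padding=1,
                                  groups=channels, bias=False)
            self.grad.weight.data.copy_(w)
            self.grad.weight.requires_grad_(False)       # keep the gradient filters fixed
            self.fuse = nn.Conv2d(2 * channels, channels, 1)  # learnable 1x1 fusion

        def forward(self, x):
            return self.fuse(F.relu(self.grad(x)))

    x = torch.randn(1, 8, 64, 64)
    print(ScharrConv(8)(x).shape)  # torch.Size([1, 8, 64, 64])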
26. Natural Language Processing for Analyzing Electronic Health Records and Clinical Notes in Cancer Research: A Review
- Author
- Bilal, Muhammad, Hamza, Ameer, and Malik, Nadia
- Subjects
- Computer Science - Computation and Language, Computer Science - Artificial Intelligence
- Abstract
Objective: This review aims to analyze the application of natural language processing (NLP) techniques in cancer research using electronic health records (EHRs) and clinical notes. This review addresses gaps in the existing literature by providing a broader perspective than previous studies focused on specific cancer types or applications. Methods: A comprehensive literature search was conducted using the Scopus database, identifying 94 relevant studies published between 2019 and 2024. Data extraction included study characteristics, cancer types, NLP methodologies, dataset information, performance metrics, challenges, and future directions. Studies were categorized based on cancer types and NLP applications. Results: The results showed a growing trend in NLP applications for cancer research, with breast, lung, and colorectal cancers being the most studied. Information extraction and text classification emerged as predominant NLP tasks. A shift from rule-based to advanced machine learning techniques, particularly transformer-based models, was observed. Dataset sizes used in existing studies varied widely. Key challenges included the limited generalizability of proposed solutions and the need for improved integration into clinical workflows. Conclusion: NLP techniques show significant potential in analyzing EHRs and clinical notes for cancer research. However, future work should focus on improving model generalizability, enhancing robustness in handling complex clinical language, and expanding applications to understudied cancer types. Integration of NLP tools into clinical practice and addressing ethical considerations remain crucial for realizing the full potential of NLP in enhancing cancer diagnosis, treatment, and patient outcomes.
- Published
- 2024
27. $\textit{Who Speaks Matters}$: Analysing the Influence of the Speaker's Ethnicity on Hate Classification
- Author
- Malik, Ananya, Sharma, Kartik, Ng, Lynnette Hui Xian, and Bhatt, Shaily
- Subjects
- Computer Science - Computation and Language, Computer Science - Artificial Intelligence
- Abstract
Large Language Models (LLMs) offer a lucrative promise for scalable content moderation, including hate speech detection. However, they are also known to be brittle and biased against marginalised communities and dialects. This requires their application to high-stakes tasks like hate speech detection to be critically scrutinized. In this work, we investigate the robustness of hate speech classification using LLMs, particularly when explicit and implicit markers of the speaker's ethnicity are injected into the input. For the explicit markers, we inject a phrase that mentions the speaker's identity. For the implicit markers, we inject dialectal features. By analysing how frequently model outputs flip in the presence of these markers, we reveal varying degrees of brittleness across 4 popular LLMs and 5 ethnicities. We find that the presence of implicit dialect markers in inputs causes model outputs to flip more than the presence of explicit markers. Further, the percentage of flips varies across ethnicities. Finally, we find that larger models are more robust. Our findings indicate the need for exercising caution in deploying LLMs for high-stakes tasks like hate speech detection., Comment: 9 pages, 3 figures, 3 tables. To appear in NeurIPS SafeGenAI 2024 Workshop [A minimal sketch of the flip-rate probe follows this entry.]
- Published
- 2024
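A minimal sketch of the flip-rate measurement the abstract above describes: prepend an explicit speaker-identity marker to each input and count how often the predicted label changes. The toy_classify stub and the marker string are hypothetical stand-ins; the paper prompts LLMs for the labels.

    def flip_rate(texts, marker, classify):
        """Fraction of inputs whose label flips when a speaker marker is prepended."""
        flips = sum(classify(f"{marker} {t}") != classify(t) for t in texts)
        return flips / len(texts)

    def toy_classify(text):
        # Hypothetical stand-in; real experiments query an LLM for hate/not-hate.
        return int("hate" in text.lower())

    posts = ["I hate mondays", "nice weather today"]
    print(flip_rate(posts, "As a member of group X,", toy_classify))  # 0.0 for this toy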
28. Search for gravitational waves emitted from SN 2023ixf
- Author
The LIGO Scientific Collaboration, the Virgo Collaboration, the KAGRA Collaboration, Abac, A. G., Abbott, R., Abouelfettouh, I., Acernese, F., Ackley, K., Adhicary, S., Adhikari, N., Adhikari, R. X., Adkins, V. K., Agarwal, D., Agathos, M., Abchouyeh, M. Aghaei, Aguiar, O. D., Aguilar, I., Aiello, L., Ain, A., Akutsu, T., Albanesi, S., Alfaidi, R. A., Al-Jodah, A., Alléné, C., Allocca, A., Al-Shammari, S., Altin, P. A., Alvarez-Lopez, S., Amato, A., Amez-Droz, L., Amorosi, A., Amra, C., Ananyeva, A., Anderson, S. B., Anderson, W. G., Andia, M., Ando, M., Andrade, T., Andres, N., Andrés-Carcasona, M., Andrić, T., Anglin, J., Ansoldi, S., Antelis, J. M., Antier, S., Aoumi, M., Appavuravther, E. Z., Appert, S., Apple, S. K., Arai, K., Araya, A., Araya, M. C., Areeda, J. S., Argianas, L., Aritomi, N., Armato, F., Arnaud, N., Arogeti, M., Aronson, S. M., Ashton, G., Aso, Y., Assiduo, M., Melo, S. Assis de Souza, Aston, S. M., Astone, P., Attadio, F., Aubin, F., AultONeal, K., Avallone, G., Babak, S., Badaracco, F., Badger, C., Bae, S., Bagnasco, S., Bagui, E., Baier, J. G., Baiotti, L., Bajpai, R., Baka, T., Ball, M., Ballardin, G., Ballmer, S. W., Banagiri, S., Banerjee, B., Bankar, D., Baral, P., Barayoga, J. C., Barish, B. C., Barker, D., Barneo, P., Barone, F., Barr, B., Barsotti, L., Barsuglia, M., Barta, D., Bartoletti, A. M., Barton, M. A., Bartos, I., Basak, S., Basalaev, A., Bassiri, R., Basti, A., Bates, D. E., Bawaj, M., Baxi, P., Bayley, J. C., Baylor, A. C., Baynard II, P. A., Bazzan, M., Bedakihale, V. M., Beirnaert, F., Bejger, M., Belardinelli, D., Bell, A. S., Benedetto, V., Benoit, W., Bentley, J. D., Yaala, M. Ben, Bera, S., Berbel, M., Bergamin, F., Berger, B. K., Bernuzzi, S., Beroiz, M., Bersanetti, D., Bertolini, A., Betzwieser, J., Beveridge, D., Bevins, N., Bhandare, R., Bhardwaj, U., Bhatt, R., Bhattacharjee, D., Bhaumik, S., Bhowmick, S., Bianchi, A., Bilenko, I. A., Billingsley, G., Binetti, A., Bini, S., Birnholtz, O., Biscoveanu, S., Bisht, A., Bitossi, M., Bizouard, M. -A., Blackburn, J. K., Blagg, L. A., Blair, C. D., Blair, D. G., Bobba, F., Bode, N., Boileau, G., Boldrini, M., Bolingbroke, G. N., Bolliand, A., Bonavena, L. D., Bondarescu, R., Bondu, F., Bonilla, E., Bonilla, M. S., Bonino, A., Bonnand, R., Booker, P., Borchers, A., Boschi, V., Bose, S., Bossilkov, V., Boudart, V., Boudon, A., Bozzi, A., Bradaschia, C., Brady, P. R., Braglia, M., Branch, A., Branchesi, M., Brandt, J., Braun, I., Breschi, M., Briant, T., Brillet, A., Brinkmann, M., Brockill, P., Brockmueller, E., Brooks, A. F., Brown, B. C., Brown, D. D., Brozzetti, M. L., Brunett, S., Bruno, G., Bruntz, R., Bryant, J., Bucci, F., Buchanan, J., Bulashenko, O., Bulik, T., Bulten, H. J., Buonanno, A., Burtnyk, K., Buscicchio, R., Buskulic, D., Buy, C., Byer, R. L., Davies, G. S. Cabourn, Cabras, G., Cabrita, R., Cáceres-Barbosa, V., Cadonati, L., Cagnoli, G., Cahillane, C., Bustillo, J. Calderón, Callister, T. A., Calloni, E., Camp, J. B., Canepa, M., Santoro, G. Caneva, Cannon, K. C., Cao, H., Capistran, L. A., Capocasa, E., Capote, E., Carapella, G., Carbognani, F., Carlassara, M., Carlin, J. B., Carpinelli, M., Carrillo, G., Carter, J. J., Carullo, G., Diaz, J. Casanueva, Casentini, C., Castro-Lucas, S. Y., Caudill, S., Cavaglià, M., Cavalieri, R., Cella, G., Cerdá-Durán, P., Cesarini, E., Chaibi, W., Chakraborty, P., Subrahmanya, S. Chalathadka, Chan, J. C. L., Chan, M., Chandra, K., Chang, R. -J., Chao, S., Charlton, E. 
L., Charlton, P., Chassande-Mottin, E., Chatterjee, C., Chatterjee, Debarati, Chatterjee, Deep, Chaturvedi, M., Chaty, S., Chen, A., Chen, A. H. -Y., Chen, D., Chen, H., Chen, H. Y., Chen, J., Chen, K. H., Chen, Y., Chen, Yanbei, Chen, Yitian, Cheng, H. P., Chessa, P., Cheung, H. T., Cheung, S. Y., Chiadini, F., Chiarini, G., Chierici, R., Chincarini, A., Chiofalo, M. L., Chiummo, A., Chou, C., Choudhary, S., Christensen, N., Chua, S. S. Y., Chugh, P., Ciani, G., Ciecielag, P., Cieślar, M., Cifaldi, M., Ciolfi, R., Clara, F., Clark, J. A., Clarke, J., Clarke, T. A., Clearwater, P., Clesse, S., Coccia, E., Codazzo, E., Cohadon, P. -F., Colace, S., Colleoni, M., Collette, C. G., Collins, J., Colloms, S., Colombo, A., Colpi, M., Compton, C. M., Connolly, G., Conti, L., Corbitt, T. R., Cordero-Carrión, I., Corezzi, S., Cornish, N. J., Corsi, A., Cortese, S., Costa, C. A., Cottingham, R., Coughlin, M. W., Couineaux, A., Coulon, J. -P., Countryman, S. T., Coupechoux, J. -F., Couvares, P., Coward, D. M., Cowart, M. J., Coyne, R., Craig, K., Creed, R., Creighton, J. D. E., Creighton, T. D., Cremonese, P., Criswell, A. W., Crockett-Gray, J. C. G., Crook, S., Crouch, R., Csizmazia, J., Cudell, J. R., Cullen, T. J., Cumming, A., Cuoco, E., Cusinato, M., Dabadie, P., Canton, T. Dal, Dall'Osso, S., Pra, S. Dal, Dálya, G., D'Angelo, B., Danilishin, S., D'Antonio, S., Danzmann, K., Darroch, K. E., Dartez, L. P., Dasgupta, A., Datta, S., Dattilo, V., Daumas, A., Davari, N., Dave, I., Davenport, A., Davier, M., Davies, T. F., Davis, D., Davis, L., Davis, M. C., Davis, P. J., Dax, M., De Bolle, J., Deenadayalan, M., Degallaix, J., De Laurentis, M., Deléglise, S., De Lillo, F., Dell'Aquila, D., Del Pozzo, W., De Marco, F., De Matteis, F., D'Emilio, V., Demos, N., Dent, T., Depasse, A., DePergola, N., De Pietri, R., De Rosa, R., De Rossi, C., DeSalvo, R., De Simone, R., Dhani, A., Diab, R., Díaz, M. C., Di Cesare, M., Dideron, G., Didio, N. A., Dietrich, T., Di Fiore, L., Di Fronzo, C., Di Giovanni, M., Di Girolamo, T., Diksha, D., Di Michele, A., Ding, J., Di Pace, S., Di Palma, I., Di Renzo, F., Divyajyoti, Dmitriev, A., Doctor, Z., Dohmen, E., Doleva, P. P., Dominguez, D., D'Onofrio, L., Donovan, F., Dooley, K. L., Dooney, T., Doravari, S., Dorosh, O., Drago, M., Driggers, J. C., Ducoin, J. -G., Dunn, L., Dupletsa, U., D'Urso, D., Duval, H., Duverne, P. -A., Dwyer, S. E., Eassa, C., Ebersold, M., Eckhardt, T., Eddolls, G., Edelman, B., Edo, T. B., Edy, O., Effler, A., Eichholz, J., Einsle, H., Eisenmann, M., Eisenstein, R. A., Ejlli, A., Eleveld, R. M., Emma, M., Endo, K., Engl, A. J., Enloe, E., Errico, L., Essick, R. C., Estellés, H., Estevez, D., Etzel, T., Evans, M., Evstafyeva, T., Ewing, B. E., Ezquiaga, J. M., Fabrizi, F., Faedi, F., Fafone, V., Fairhurst, S., Farah, A. M., Farr, B., Farr, W. M., Favaro, G., Favata, M., Fays, M., Fazio, M., Feicht, J., Fejer, M. M., Felicetti, R., Fenyvesi, E., Ferguson, D. L., Ferraiuolo, S., Ferrante, I., Ferreira, T. A., Fidecaro, F., Figura, P., Fiori, A., Fiori, I., Fishbach, M., Fisher, R. P., Fittipaldi, R., Fiumara, V., Flaminio, R., Fleischer, S. M., Fleming, L. S., Floden, E., Foley, E. M., Fong, H., Font, J. A., Fornal, B., Forsyth, P. W. F., Franceschetti, K., Franchini, N., Frasca, S., Frasconi, F., Mascioli, A. Frattale, Frei, Z., Freise, A., Freitas, O., Frey, R., Frischhertz, W., Fritschel, P., Frolov, V. V., Fronzé, G. G., Fuentes-Garcia, M., Fujii, S., Fujimori, T., Fulda, P., Fyffe, M., Gadre, B., Gair, J. 
R., Galaudage, S., Galdi, V., Gallagher, H., Gallardo, S., Gallego, B., Gamba, R., Gamboa, A., Ganapathy, D., Ganguly, A., Garaventa, B., García-Bellido, J., Núñez, C. García, García-Quirós, C., Gardner, J. W., Gardner, K. A., Gargiulo, J., Garron, A., Garufi, F., Gasbarra, C., Gateley, B., Gayathri, V., Gemme, G., Gennai, A., Gennari, V., George, J., George, R., Gerberding, O., Gergely, L., Ghosh, Archisman, Ghosh, Sayantan, Ghosh, Shaon, Ghosh, Shrobana, Ghosh, Suprovo, Ghosh, Tathagata, Giacoppo, L., Giaime, J. A., Giardina, K. D., Gibson, D. R., Gibson, D. T., Gier, C., Giri, P., Gissi, F., Gkaitatzis, S., Glanzer, J., Glotin, F., Godfrey, J., Godwin, P., Goebbels, N. L., Goetz, E., Golomb, J., Lopez, S. Gomez, Goncharov, B., Gong, Y., González, G., Goodarzi, P., Goode, S., Goodwin-Jones, A. W., Gosselin, M., Göttel, A. S., Gouaty, R., Gould, D. W., Govorkova, K., Goyal, S., Grace, B., Grado, A., Graham, V., Granados, A. E., Granata, M., Granata, V., Gras, S., Grassia, P., Gray, A., Gray, C., Gray, R., Greco, G., Green, A. C., Green, S. M., Green, S. R., Gretarsson, A. M., Gretarsson, E. M., Griffith, D., Griffiths, W. L., Griggs, H. L., Grignani, G., Grimaldi, A., Grimaud, C., Grote, H., Guerra, D., Guetta, D., Guidi, G. M., Guimaraes, A. R., Gulati, H. K., Gulminelli, F., Gunny, A. M., Guo, H., Guo, W., Guo, Y., Gupta, Anchal, Gupta, Anuradha, Gupta, Ish, Gupta, N. C., Gupta, P., Gupta, S. K., Gupta, T., Gupte, N., Gurs, J., Gutierrez, N., Guzman, F., H, H. -Y., Haba, D., Haberland, M., Haino, S., Hall, E. D., Hamilton, E. Z., Hammond, G., Han, W. -B., Haney, M., Hanks, J., Hanna, C., Hannam, M. D., Hannuksela, O. A., Hanselman, A. G., Hansen, H., Hanson, J., Harada, R., Hardison, A. R., Haris, K., Harmark, T., Harms, J., Harry, G. M., Harry, I. W., Hart, J., Haskell, B., Haster, C. -J., Hathaway, J. S., Haughian, K., Hayakawa, H., Hayama, K., Hayes, R., Heffernan, A., Heidmann, A., Heintze, M. C., Heinze, J., Heinzel, J., Heitmann, H., Hellman, F., Hello, P., Helmling-Cornell, A. F., Hemming, G., Henderson-Sapir, O., Hendry, M., Heng, I. S., Hennes, E., Henshaw, C., Hertog, T., Heurs, M., Hewitt, A. L., Heyns, J., Higginbotham, S., Hild, S., Hill, S., Himemoto, Y., Hirata, N., Hirose, C., Hoang, S., Hochheim, S., Hofman, D., Holland, N. A., Holley-Bockelmann, K., Holmes, Z. J., Holz, D. E., Honet, L., Hong, C., Hornung, J., Hoshino, S., Hough, J., Hourihane, S., Howell, E. J., Hoy, C. G., Hrishikesh, C. A., Hsieh, H. -F., Hsiung, C., Hsu, H. C., Hsu, W. -F., Hu, P., Hu, Q., Huang, H. Y., Huang, Y. -J., Huddart, A. D., Hughey, B., Hui, D. C. Y., Hui, V., Husa, S., Huxford, R., Huynh-Dinh, T., Iampieri, L., Iandolo, G. A., Ianni, M., Iess, A., Imafuku, H., Inayoshi, K., Inoue, Y., Iorio, G., Iqbal, M. H., Irwin, J., Ishikawa, R., Isi, M., Ismail, M. A., Itoh, Y., Iwanaga, H., Iwaya, M., Iyer, B. R., JaberianHamedan, V., Jacquet, C., Jacquet, P. -E., Jadhav, S. J., Jadhav, S. P., Jain, T., James, A. L., James, P. A., Jamshidi, R., Janquart, J., Janssens, K., Janthalur, N. N., Jaraba, S., Jaranowski, P., Jaume, R., Javed, W., Jennings, A., Jia, W., Jiang, J., Kubisz, J., Johanson, C., Johns, G. R., Johnson, N. A., Johnston, M. C., Johnston, R., Johny, N., Jones, D. H., Jones, D. I., Jones, R., Jose, S., Joshi, P., Ju, L., Jung, K., Junker, J., Juste, V., Kajita, T., Kaku, I., Kalaghatgi, C., Kalogera, V., Kamiizumi, M., Kanda, N., Kandhasamy, S., Kang, G., Kanner, J. B., Kapadia, S. J., Kapasi, D. 
P., Karat, S., Karathanasis, C., Kashyap, R., Kasprzack, M., Kastaun, W., Kato, T., Katsavounidis, E., Katzman, W., Kaushik, R., Kawabe, K., Kawamoto, R., Kazemi, A., Keitel, D., Kelley-Derzon, J., Kennington, J., Kesharwani, R., Key, J. S., Khadela, R., Khadka, S., Khalili, F. Y., Khan, F., Khan, I., Khanam, T., Khursheed, M., Khusid, N. M., Kiendrebeogo, W., Kijbunchoo, N., Kim, C., Kim, J. C., Kim, K., Kim, M. H., Kim, S., Kim, Y. -M., Kimball, C., Kinley-Hanlon, M., Kinnear, M., Kissel, J. S., Klimenko, S., Knee, A. M., Knust, N., Kobayashi, K., Obergaulinger, M., Koch, P., Koehlenbeck, S. M., Koekoek, G., Kohri, K., Kokeyama, K., Koley, S., Kolitsidou, P., Kolstein, M., Komori, K., Kong, A. K. H., Kontos, A., Korobko, M., Kossak, R. V., Kou, X., Koushik, A., Kouvatsos, N., Kovalam, M., Kozak, D. B., Kranzhoff, S. L., Kringel, V., Krishnendu, N. V., Królak, A., Kruska, K., Kuehn, G., Kuijer, P., Kulkarni, S., Ramamohan, A. Kulur, Kumar, A., Kumar, Praveen, Kumar, Prayush, Kumar, Rahul, Kumar, Rakesh, Kume, J., Kuns, K., Kuntimaddi, N., Kuroyanagi, S., Kurth, N. J., Kuwahara, S., Kwak, K., Kwan, K., Kwok, J., Lacaille, G., Lagabbe, P., Laghi, D., Lai, S., Laity, A. H., Lakkis, M. H., Lalande, E., Lalleman, M., Lalremruati, P. C., Landry, M., Lane, B. B., Lang, R. N., Lange, J., Lantz, B., La Rana, A., La Rosa, I., Lartaux-Vollard, A., Lasky, P. D., Lawrence, J., Lawrence, M. N., Laxen, M., Lazzarini, A., Lazzaro, C., Leaci, P., Lecoeuche, Y. K., Lee, H. M., Lee, H. W., Lee, K., Lee, R. -K., Lee, R., Lee, S., Lee, Y., Legred, I. N., Lehmann, J., Lehner, L., Jean, M. Le, Lemaître, A., Lenti, M., Leonardi, M., Lequime, M., Leroy, N., Lesovsky, M., Letendre, N., Lethuillier, M., Levin, S. E., Levin, Y., Leyde, K., Li, A. K. Y., Li, K. L., Li, T. G. F., Li, X., Li, Z., Lihos, A., Lin, C-Y., Lin, C. -Y., Lin, E. T., Lin, F., Lin, H., Lin, L. C. -C., Lin, Y. -C., Linde, F., Linker, S. D., Littenberg, T. B., Liu, A., Liu, G. C., Liu, Jian, Villarreal, F. Llamas, Llobera-Querol, J., Lo, R. K. L., Locquet, J. -P., London, L. T., Longo, A., Lopez, D., Portilla, M. Lopez, Lorenzini, M., Lorenzo-Medina, A., Loriette, V., Lormand, M., Losurdo, G., Lott IV, T. P., Lough, J. D., Loughlin, H. A., Lousto, C. O., Lowry, M. J., Lu, N., Lück, H., Lumaca, D., Lundgren, A. P., Lussier, A. W., Ma, L. -T., Ma, S., Ma'arif, M., Macas, R., Macedo, A., MacInnis, M., Maciy, R. R., Macleod, D. M., MacMillan, I. A. O., Macquet, A., Macri, D., Maeda, K., Maenaut, S., Hernandez, I. Magaña, Magare, S. S., Magazzù, C., Magee, R. M., Maggio, E., Maggiore, R., Magnozzi, M., Mahesh, M., Mahesh, S., Maini, M., Majhi, S., Majorana, E., Makarem, C. N., Makelele, E., Malaquias-Reis, J. A., Mali, U., Maliakal, S., Malik, A., Man, N., Mandic, V., Mangano, V., Mannix, B., Mansell, G. L., Mansingh, G., Manske, M., Mantovani, M., Mapelli, M., Marchesoni, F., Pina, D. Marín, Marion, F., Márka, S., Márka, Z., Markosyan, A. S., Markowitz, A., Maros, E., Marsat, S., Martelli, F., Martin, I. W., Martin, R. M., Martinez, B. B., Martinez, M., Martinez, V., Martini, A., Martinovic, K., Martins, J. C., Martynov, D. V., Marx, E. J., Massaro, L., Masserot, A., Masso-Reid, M., Mastrodicasa, M., Mastrogiovanni, S., Matcovich, T., Matiushechkina, M., Matsuyama, M., Mavalvala, N., Maxwell, N., McCarrol, G., McCarthy, R., McClelland, D. E., McCormick, S., McCuller, L., McEachin, S., McElhenny, C., McGhee, G. I., McGinn, J., McGowan, K. B. 
M., McIver, J., McLeod, A., McRae, T., Meacher, D., Meijer, Q., Melatos, A., Mellaerts, S., Menendez-Vazquez, A., Menoni, C. S., Mera, F., Mercer, R. A., Mereni, L., Merfeld, K., Merilh, E. L., Mérou, J. R., Merritt, J. D., Merzougui, M., Messenger, C., Messick, C., Meyer-Conde, M., Meylahn, F., Mhaske, A., Miani, A., Miao, H., Michaloliakos, I., Michel, C., Michimura, Y., Middleton, H., Miller, A. L., Miller, S., Millhouse, M., Milotti, E., Milotti, V., Minenkov, Y., Mio, N., Mir, Ll. M., Mirasola, L., Miravet-Tenés, M., Miritescu, C. -A., Mishra, A. K., Mishra, A., Mishra, C., Mishra, T., Mitchell, A. L., Mitchell, J. G., Mitra, S., Mitrofanov, V. P., Mittleman, R., Miyakawa, O., Miyamoto, S., Miyoki, S., Mo, G., Mobilia, L., Mohapatra, S. R. P., Mohite, S. R., Molina-Ruiz, M., Mondal, C., Mondin, M., Montani, M., Moore, C. J., Moraru, D., More, A., More, S., Moreno, G., Morgan, C., Morisaki, S., Moriwaki, Y., Morras, G., Moscatello, A., Mourier, P., Mours, B., Mow-Lowry, C. M., Muciaccia, F., Mukherjee, Arunava, Mukherjee, D., Mukherjee, Samanwaya, Mukherjee, Soma, Mukherjee, Subroto, Mukherjee, Suvodip, Mukund, N., Mullavey, A., Munch, J., Mundi, J., Mungioli, C. L., Oberg, W. R. Munn, Murakami, Y., Murakoshi, M., Murray, P. G., Muusse, S., Nabari, D., Nadji, S. L., Nagar, A., Nagarajan, N., Nagler, K. N., Nakagaki, K., Nakamura, K., Nakano, H., Nakano, M., Nandi, D., Napolano, V., Narayan, P., Nardecchia, I., Narikawa, T., Narola, H., Naticchioni, L., Nayak, R. K., Neilson, J., Nelson, A., Nelson, T. J. N., Nery, M., Neunzert, A., Ng, S., Quynh, L. Nguyen, Nichols, S. A., Nielsen, A. B., Nieradka, G., Niko, A., Nishino, Y., Nishizawa, A., Nissanke, S., Nitoglia, E., Niu, W., Nocera, F., Norman, M., North, C., Novak, J., Siles, J. F. Nuño, Nuttall, L. K., Obayashi, K., Oberling, J., O'Dell, J., Oertel, M., Offermans, A., Oganesyan, G., Oh, J. J., Oh, K., O'Hanlon, T., Ohashi, M., Ohkawa, M., Ohme, F., Oliveira, A. S., Oliveri, R., O'Neal, B., Oohara, K., O'Reilly, B., Ormsby, N. D., Orselli, M., O'Shaughnessy, R., O'Shea, S., Oshima, Y., Oshino, S., Ossokine, S., Osthelder, C., Ota, I., Ottaway, D. J., Ouzriat, A., Overmier, H., Owen, B. J., Pace, A. E., Pagano, R., Page, M. A., Pai, A., Pal, A., Pal, S., Palaia, M. A., Pálfi, M., Palma, P. P., Palomba, C., Palud, P., Pan, H., Pan, J., Pan, K. C., Panai, R., Panda, P. K., Pandey, S., Panebianco, L., Pang, P. T. H., Pannarale, F., Pannone, K. A., Pant, B. C., Panther, F. H., Paoletti, F., Paolone, A., Papalexakis, E. E., Papalini, L., Papigkiotis, G., Paquis, A., Parisi, A., Park, B. -J., Park, J., Parker, W., Pascale, G., Pascucci, D., Pasqualetti, A., Passaquieti, R., Passenger, L., Passuello, D., Patane, O., Pathak, D., Pathak, M., Patra, A., Patricelli, B., Patron, A. S., Paul, K., Paul, S., Payne, E., Pearce, T., Pedraza, M., Pegna, R., Pele, A., Arellano, F. E. Peña, Penn, S., Penuliar, M. D., Perego, A., Pereira, Z., Perez, J. J., Périgois, C., Perna, G., Perreca, A., Perret, J., Perriès, S., Perry, J. W., Pesios, D., Petracca, S., Petrillo, C., Pfeiffer, H. P., Pham, H., Pham, K. A., Phukon, K. S., Phurailatpam, H., Piarulli, M., Piccari, L., Piccinni, O. J., Pichot, M., Piendibene, M., Piergiovanni, F., Pierini, L., Pierra, G., Pierro, V., Pietrzak, M., Pillas, M., Pilo, F., Pinard, L., Pinto, I. M., Pinto, M., Piotrzkowski, B. J., Pirello, M., Pitkin, M. D., Placidi, A., Placidi, E., Planas, M. L., Plastino, W., Poggiani, R., Polini, E., Pompili, L., Poon, J., Porcelli, E., Porter, E. 
K., Posnansky, C., Poulton, R., Powell, J., Pracchia, M., Pradhan, B. K., Pradier, T., Prajapati, A. K., Prasai, K., Prasanna, R., Prasia, P., Pratten, G., Principe, G., Principe, M., Prodi, G. A., Prokhorov, L., Prosposito, P., Puecher, A., Pullin, J., Punturo, M., Puppo, P., Pürrer, M., Qi, H., Qin, J., Quéméner, G., Quetschke, V., Quigley, C., Quinonez, P. J., Raab, F. J., Raabith, S. S., Raaijmakers, G., Raja, S., Rajan, C., Rajbhandari, B., Ramirez, K. E., Vidal, F. A. Ramis, Ramos-Buades, A., Rana, D., Ranjan, S., Ransom, K., Rapagnani, P., Ratto, B., Rawat, S., Ray, A., Raymond, V., Razzano, M., Read, J., Payo, M. Recaman, Regimbau, T., Rei, L., Reid, S., Reitze, D. H., Relton, P., Renzini, A. I., Rettegno, P., Revenu, B., Reyes, R., Rezaei, A. S., Ricci, F., Ricci, M., Ricciardone, A., Richardson, J. W., Richardson, M., Rijal, A., Riles, K., Riley, H. K., Rinaldi, S., Rittmeyer, J., Robertson, C., Robinet, F., Robinson, M., Rocchi, A., Rolland, L., Rollins, J. G., Romano, A. E., Romano, R., Romero, A., Romero-Shaw, I. M., Romie, J. H., Ronchini, S., Roocke, T. J., Rosa, L., Rosauer, T. J., Rose, C. A., Rosińska, D., Ross, M. P., Rossello, M., Rowan, S., Roy, S. K., Roy, S., Rozza, D., Ruggi, P., Ruhama, N., Morales, E. Ruiz, Ruiz-Rocha, K., Sachdev, S., Sadecki, T., Sadiq, J., Saffarieh, P., Sah, M. R., Saha, S. S., Saha, S., Sainrat, T., Menon, S. Sajith, Sakai, K., Sakellariadou, M., Sakon, S., Salafia, O. S., Salces-Carcoba, F., Salconi, L., Saleem, M., Salemi, F., Sallé, M., Salvador, S., Sanchez, A., Sanchez, E. J., Sanchez, J. H., Sanchez, L. E., Sanchis-Gual, N., Sanders, J. R., Sänger, E. M., Santoliquido, F., Saravanan, T. R., Sarin, N., Sasaoka, S., Sasli, A., Sassi, P., Sassolas, B., Satari, H., Sato, R., Sato, Y., Sauter, O., Savage, R. L., Sawada, T., Sawant, H. L., Sayah, S., Scacco, V., Schaetzl, D., Scheel, M., Schiebelbein, A., Schiworski, M. G., Schmidt, P., Schmidt, S., Schnabel, R., Schneewind, M., Schofield, R. M. S., Schouteden, K., Schulte, B. W., Schutz, B. F., Schwartz, E., Scialpi, M., Scott, J., Scott, S. M., Seetharamu, T. C., Seglar-Arroyo, M., Sekiguchi, Y., Sellers, D., Sengupta, A. S., Sentenac, D., Seo, E. G., Seo, J. W., Sequino, V., Serra, M., Servignat, G., Sevrin, A., Shaffer, T., Shah, U. S., Shaikh, M. A., Shao, L., Sharma, A. K., Sharma, P., Sharma-Chaudhary, S., Shaw, M. R., Shawhan, P., Shcheblanov, N. S., Sheridan, E., Shikano, Y., Shikauchi, M., Shimode, K., Shinkai, H., Shiota, J., Shoemaker, D. H., Shoemaker, D. M., Short, R. W., ShyamSundar, S., Sider, A., Siegel, H., Sieniawska, M., Sigg, D., Silenzi, L., Simmonds, M., Singer, L. P., Singh, A., Singh, D., Singh, M. K., Singh, S., Singha, A., Sintes, A. M., Sipala, V., Skliris, V., Slagmolen, B. J. J., Slaven-Blair, T. J., Smetana, J., Smith, J. R., Smith, L., Smith, R. J. E., Smith, W. J., Soldateschi, J., Somiya, K., Song, I., Soni, K., Soni, S., Sordini, V., Sorrentino, F., Sorrentino, N., Sotani, H., Soulard, R., Southgate, A., Spagnuolo, V., Spencer, A. P., Spera, M., Spinicelli, P., Spoon, J. B., Sprague, C. A., Srivastava, A. K., Stachurski, F., Steer, D. A., Steinlechner, J., Steinlechner, S., Stergioulas, N., Stevens, P., StPierre, M., Stratta, G., Strong, M. D., Strunk, A., Sturani, R., Stuver, A. L., Suchenek, M., Sudhagar, S., Sueltmann, N., Suleiman, L., Sullivan, K. D., Sun, L., Sunil, S., Suresh, J., Sutton, P. J., Suzuki, T., Suzuki, Y., Swinkels, B. L., Syx, A., Szczepańczyk, M. J., Szewczyk, P., Tacca, M., Tagoshi, H., Tait, S. 
C., Takahashi, H., Takahashi, R., Takamori, A., Takase, T., Takatani, K., Takeda, H., Takeshita, K., Talbot, C., Tamaki, M., Tamanini, N., Tanabe, D., Tanaka, K., Tanaka, S. J., Tanaka, T., Tang, D., Tanioka, S., Tanner, D. B., Tao, L., Tapia, R. D., Martín, E. N. Tapia San, Tarafder, R., Taranto, C., Taruya, A., Tasson, J. D., Teloi, M., Tenorio, R., Themann, H., Theodoropoulos, A., Thirugnanasambandam, M. P., Thomas, L. M., Thomas, M., Thomas, P., Thompson, J. E., Thondapu, S. R., Thorne, K. A., Thrane, E., Tissino, J., Tiwari, A., Tiwari, P., Tiwari, S., Tiwari, V., Todd, M. R., Toivonen, A. M., Toland, K., Tolley, A. E., Tomaru, T., Tomita, K., Tomura, T., Tong-Yu, C., Toriyama, A., Toropov, N., Torres-Forné, A., Torrie, C. I., Toscani, M., Melo, I. Tosta e, Tournefier, E., Trapananti, A., Travasso, F., Traylor, G., Trevor, M., Tringali, M. C., Tripathee, A., Troian, G., Troiano, L., Trovato, A., Trozzo, L., Trudeau, R. J., Tsang, T. T. L., Tso, R., Tsuchida, S., Tsukada, L., Tsutsui, T., Turbang, K., Turconi, M., Turski, C., Ubach, H., Uchikata, N., Uchiyama, T., Udall, R. P., Uehara, T., Uematsu, M., Ueno, K., Ueno, S., Undheim, V., Ushiba, T., Vacatello, M., Vahlbruch, H., Vaidya, N., Vajente, G., Vajpeyi, A., Valdes, G., Valencia, J., Valentini, M., Vallejo-Peña, S. A., Vallero, S., Valsan, V., van Bakel, N., van Beuzekom, M., van Dael, M., Brand, J. F. J. van den, Broeck, C. Van Den, Vander-Hyde, D. C., van der Sluys, M., Van de Walle, A., van Dongen, J., Vandra, K., van Haevermaet, H., van Heijningen, J. V., Van Hove, P., VanKeuren, M., Vanosky, J., van Putten, M. H. P. M., van Ranst, Z., van Remortel, N., Vardaro, M., Vargas, A. F., Varghese, J. J., Varma, V., Vasúth, M., Vecchio, A., Vedovato, G., Veitch, J., Veitch, P. J., Venikoudis, S., Venneberg, J., Verdier, P., Verkindt, D., Verma, B., Verma, P., Verma, Y., Vermeulen, S. M., Vetrano, F., Veutro, A., Vibhute, A. M., Viceré, A., Vidyant, S., Viets, A. D., Vijaykumar, A., Vilkha, A., Villa-Ortega, V., Vincent, E. T., Vinet, J. -Y., Viret, S., Virtuoso, A., Vitale, S., Vives, A., Vocca, H., Voigt, D., von Reis, E. R. G., von Wrangel, J. S. A., Vyatchanin, S. P., Wade, L. E., Wade, M., Wagner, K. J., Wajid, A., Walker, M., Wallace, G. S., Wallace, L., Wang, H., Wang, J. Z., Wang, W. H., Wang, Z., Waratkar, G., Warner, J., Was, M., Washimi, T., Washington, N. Y., Watarai, D., Wayt, K. E., Weaver, B. R., Weaver, B., Weaving, C. R., Webster, S. A., Weinert, M., Weinstein, A. J., Weiss, R., Wellmann, F., Wen, L., Weßels, P., Wette, K., Whelan, J. T., Whiting, B. F., Whittle, C., Wildberger, J. B., Wilk, O. S., Wilken, D., Wilkin, A. T., Willadsen, D. J., Willetts, K., Williams, D., Williams, M. J., Williams, N. S., Willis, J. L., Willke, B., Wils, M., Winterflood, J., Wipf, C. C., Woan, G., Woehler, J., Wofford, J. K., Wolfe, N. E., Wong, H. T., Wong, H. W. Y., Wong, I. C. F., Wright, J. L., Wright, M., Wu, C., Wu, D. S., Wu, H., Wuchner, E., Wysocki, D. M., Xu, V. A., Xu, Y., Yadav, N., Yamamoto, H., Yamamoto, K., Yamamoto, T. S., Yamamoto, T., Yamamura, S., Yamazaki, R., Yan, S., Yan, T., Yang, F. W., Yang, F., Yang, K. Z., Yang, Y., Yarbrough, Z., Yasui, H., Yeh, S. -W., Yelikar, A. B., Yin, X., Yokoyama, J., Yokozawa, T., Yoo, J., Yu, H., Yuan, S., Yuzurihara, H., Zadrożny, A., Zanolin, M., Zeeshan, M., Zelenova, T., Zendri, J. -P., Zeoli, M., Zerrad, M., Zevin, M., Zhang, A. C., Zhang, L., Zhang, R., Zhang, T., Zhang, Y., Zhao, C., Zhao, Yue, Zhao, Yuhang, Zheng, Y., Zhong, H., Zhou, R., Zhu, X. -J., Zhu, Z. 
-H., Zimmerman, A. B., Zucker, M. E., and Zweizig, J.
- Subjects
- Astrophysics - High Energy Astrophysical Phenomena
- Abstract
We present the results of a search for gravitational-wave transients associated with core-collapse supernova SN 2023ixf, which was observed in the galaxy Messier 101 via optical emission on 2023 May 19th, during the LIGO-Virgo-KAGRA 15th Engineering Run. We define a five-day on-source window during which an accompanying gravitational-wave signal may have occurred. No gravitational waves have been identified in data when at least two gravitational-wave observatories were operating, which covered $\sim 14\%$ of this five-day window. We report the search detection efficiency for various possible gravitational-wave emission models. Considering the distance to M101 (6.7 Mpc), we derive constraints on the gravitational-wave emission mechanism of core-collapse supernovae across a broad frequency spectrum, ranging from 50 Hz to 2 kHz, where we assume the GW emission occurred when coincident data are available in the on-source window. Considering an ellipsoid model for a rotating proto-neutron star, our search is sensitive to gravitational-wave energy $1 \times 10^{-5} M_{\odot} c^2$ and luminosity $4 \times 10^{-5} M_{\odot} c^2/\text{s}$ for a source emitting at 50 Hz. These constraints are around an order of magnitude more stringent than those obtained so far with gravitational-wave data. The constraint on the ellipticity of the proto-neutron star that is formed is as low as $1.04$, at frequencies above $1200$ Hz, surpassing results from SN 2019ejj., Comment: Main paper: 6 pages, 4 figures and 1 table. Total with appendices: 20 pages, 4 figures, and 1 table [The standard strain-to-energy conversion behind such limits is sketched after this entry.]
- Published
- 2024
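For context on how limits like those above are constructed: burst searches conventionally convert a root-sum-squared strain upper limit $h_{\rm rss}$ into an emitted energy for a source at distance $d$ radiating near frequency $f_0$, under an isotropic-emission assumption (the order-unity prefactor varies with the assumed emission model). A commonly used form is

    $E_{\rm GW} \approx \frac{\pi^2 c^3}{G} \, d^2 \, f_0^2 \, h_{\rm rss}^2$

At fixed strain sensitivity the implied energy limit scales as $d^2 f_0^2$, which is why the quoted constraints are tightest for the 50 Hz emission case.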
29. A Fusion-Driven Approach of Attention-Based CNN-BiLSTM for Protein Family Classification -- ProFamNet
- Author
- Ali, Bahar, Shah, Anwar, Niaz, Malik, Mansoord, Musadaq, Ullah, Sami, and Adnan, Muhammad
- Subjects
- Quantitative Biology - Quantitative Methods, Computer Science - Machine Learning
- Abstract
Advanced automated AI techniques allow us to classify protein sequences and discern their biological families and functions. Conventional approaches for classifying these protein families often focus on extracting N-Gram features from the sequences while overlooking crucial motif information and the interplay between motifs and neighboring amino acids. Recently, convolutional neural networks have been applied to amino acid and motif data, even with a limited dataset of well-characterized proteins, resulting in improved performance. This study presents a model for classifying protein families using the fusion of 1D-CNN, BiLSTM, and an attention mechanism, which combines spatial feature extraction, long-term dependencies, and context-aware representations. The proposed model (ProFamNet) achieved superior model efficiency with 450,953 parameters and a compact size of 1.72 MB, outperforming the state-of-the-art model with 4,578,911 parameters and a size of 17.47 MB. Further, we achieved a higher F1 score (98.30% vs. 97.67%) with more instances (271,160 vs. 55,077) in fewer training epochs (25 vs. 30)., Comment: It is the authors' original work [A minimal sketch of the CNN-BiLSTM-attention fusion follows this entry.]
- Published
- 2024
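A minimal PyTorch sketch of the 1D-CNN + BiLSTM + attention fusion the abstract above describes. All dimensions (embedding size, channels, hidden units, number of families) are hypothetical placeholders, not ProFamNet's published 450,953-parameter configuration.

    import torch
    import torch.nn as nn

    class ProteinClassifier(nn.Module):
        def __init__(self, vocab=26, emb=64, conv=128, hidden=128, classes=1000):
            super().__init__()
            self.embed = nn.Embedding(vocab, emb)              # residue embeddings
            self.conv = nn.Conv1d(emb, conv, 5, padding=2)     # local motif features
            self.lstm = nn.LSTM(conv, hidden, batch_first=True, bidirectional=True)
            self.attn = nn.Linear(2 * hidden, 1)               # additive attention scores
            self.head = nn.Linear(2 * hidden, classes)

        def forward(self, x):                                  # x: (batch, length) residue ids
            h = self.embed(x).transpose(1, 2)                  # (B, emb, L)
            h = torch.relu(self.conv(h)).transpose(1, 2)       # (B, L, conv)
            h, _ = self.lstm(h)                                # (B, L, 2*hidden)
            w = torch.softmax(self.attn(h), dim=1)             # attention over positions
            return self.head((w * h).sum(dim=1))               # weighted context -> families

    seqs = torch.randint(0, 26, (4, 120))                      # 4 toy sequences of 120 residues
    print(ProteinClassifier()(seqs).shape)                     # torch.Size([4, 1000])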
30. ZK-DPPS: A Zero-Knowledge Decentralised Data Sharing and Processing Middleware
- Author
- Jabbari, Amir, Ramachandran, Gowri, Malik, Sidra, and Jurdak, Raja
- Subjects
- Computer Science - Cryptography and Security
- Abstract
In the current digital landscape, supply chains have transformed into complex networks driven by the Internet of Things (IoT), necessitating enhanced data sharing and processing capabilities to ensure traceability and transparency. Leveraging Blockchain technology in IoT applications advances reliability and transparency in near-real-time insight extraction processes. However, it raises significant concerns regarding data privacy. Existing privacy-preserving approaches often rely on Smart Contracts for automation and Zero Knowledge Proofs (ZKP) for privacy. However, besides being inflexible in accommodating system changes while effectively protecting data confidentiality, these approaches introduce significant computational expenses and overheads that make them impractical for dynamic supply chain environments. To address these challenges, we propose ZK-DPPS, a framework that ensures zero-knowledge communications without the need for traditional ZKPs. In ZK-DPPS, privacy is preserved through a combination of Fully Homomorphic Encryption (FHE) for computations and Secure Multi-Party Computation (SMPC) for key reconstruction. To ensure that the raw data remains private throughout the entire process, we use FHE to execute computations directly on encrypted data. The "zero-knowledge" aspect of ZK-DPPS refers to the system's ability to process and share data insights without exposing sensitive information, thus offering a practical and efficient alternative to ZKP-based methods. We demonstrate the efficacy of ZK-DPPS through a simulated supply chain scenario, showcasing its ability to tackle the dual challenges of privacy preservation and computational trust in decentralised environments. [A toy sketch of a secret-sharing building block follows this entry.]
- Published
- 2024
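A toy sketch of one SMPC building block of the kind the abstract above mentions (additive secret sharing, here used for a private sum). The FHE computation layer and ZK-DPPS's actual key-reconstruction protocol are not shown; the modulus and party count are illustrative only.

    import secrets

    P = 2**61 - 1  # prime modulus for share arithmetic (illustrative)

    def share(x, n=3):
        """Split x into n additive shares mod P; fewer than n shares reveal nothing."""
        parts = [secrets.randbelow(P) for _ in range(n - 1)]
        parts.append((x - sum(parts)) % P)
        return parts

    def reconstruct(parts):
        return sum(parts) % P

    a, b = 1200, 34
    sa, sb = share(a), share(b)
    # Each party adds its two shares locally; only the final sum is reconstructed,
    # so neither input is ever revealed in the clear.
    local = [x + y for x, y in zip(sa, sb)]
    assert reconstruct(local) == a + b == 1234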
31. Workflows Community Summit 2024: Future Trends and Challenges in Scientific Workflows
- Author
da Silva, Rafael Ferreira, Bard, Deborah, Chard, Kyle, de Witt, Shaun, Foster, Ian T., Gibbs, Tom, Goble, Carole, Godoy, William, Gustafsson, Johan, Haus, Utz-Uwe, Hudson, Stephen, Jha, Shantenu, Los, Laila, Paine, Drew, Suter, Frédéric, Ward, Logan, Wilkinson, Sean, Amaris, Marcos, Babuji, Yadu, Bader, Jonathan, Balin, Riccardo, Balouek, Daniel, Beecroft, Sarah, Belhajjame, Khalid, Bhattarai, Rajat, Brewer, Wes, Brunk, Paul, Caino-Lores, Silvina, Casanova, Henri, Cassol, Daniela, Coleman, Jared, Coleman, Taina, Colonnelli, Iacopo, Da Silva, Anderson Andrei, de Oliveira, Daniel, Elahi, Pascal, Elfaramawy, Nour, Elwasif, Wael, Etz, Brian, Fahringer, Thomas, Ferreira, Wesley, Filgueira, Rosa, Tande, Jacob Fosso, Gadelha, Luiz, Gallo, Andy, Garijo, Daniel, Georgiou, Yiannis, Gritsch, Philipp, Grubel, Patricia, Gueroudji, Amal, Guilloteau, Quentin, Hamalainen, Carlo, Enriquez, Rolando Hong, Huet, Lauren, Kesling, Kevin Hunter, Iborra, Paula, Jahangiri, Shiva, Janssen, Jan, Jordan, Joe, Kanwal, Sehrish, Kunstmann, Liliane, Lehmann, Fabian, Leser, Ulf, Li, Chen, Liu, Peini, Luettgau, Jakob, Lupat, Richard, Fernandez, Jose M., Maheshwari, Ketan, Malik, Tanu, Marquez, Jack, Matsuda, Motohiko, Medic, Doriana, Mohammadi, Somayeh, Mulone, Alberto, Navarro, John-Luke, Ng, Kin Wai, Noelp, Klaus, Kinoshita, Bruno P., Prout, Ryan, Crusoe, Michael R., Ristov, Sashko, Robila, Stefan, Rosendo, Daniel, Rowell, Billy, Rybicki, Jedrzej, Sanchez, Hector, Saurabh, Nishant, Saurav, Sumit Kumar, Scogland, Tom, Senanayake, Dinindu, Shin, Woong, Sirvent, Raul, Skluzacek, Tyler, Sly-Delgado, Barry, Soiland-Reyes, Stian, Souza, Abel, Souza, Renan, Talia, Domenico, Tallent, Nathan, Thamsen, Lauritz, Titov, Mikhail, Tovar, Benjamin, Vahi, Karan, Vardar-Irrgang, Eric, Vartina, Edite, Wang, Yuandou, Wouters, Merridee, Yu, Qi, Bkhetan, Ziad Al, and Zulfiqar, Mahnoor
- Subjects
- Computer Science - Distributed, Parallel, and Cluster Computing
- Abstract
The Workflows Community Summit gathered 111 participants from 18 countries to discuss emerging trends and challenges in scientific workflows, focusing on six key areas: time-sensitive workflows, AI-HPC convergence, multi-facility workflows, heterogeneous HPC environments, user experience, and FAIR computational workflows. The integration of AI and exascale computing has revolutionized scientific workflows, enabling higher-fidelity models and complex, time-sensitive processes, while introducing challenges in managing heterogeneous environments and multi-facility data dependencies. The rise of large language models is driving computational demands to zettaflop scales, necessitating modular, adaptable systems and cloud-service models to optimize resource utilization and ensure reproducibility. Multi-facility workflows present challenges in data movement, curation, and overcoming institutional silos, while diverse hardware architectures require integrating workflow considerations into early system design and developing standardized resource management tools. The summit emphasized improving user experience in workflow systems and ensuring FAIR workflows to enhance collaboration and accelerate scientific discovery. Key recommendations include developing standardized metrics for time-sensitive workflows, creating frameworks for cloud-HPC integration, implementing distributed-by-design workflow modeling, establishing multi-facility authentication protocols, and accelerating AI integration in HPC workflow management. The summit also called for comprehensive workflow benchmarks, workflow-specific UX principles, and a FAIR workflow maturity model, highlighting the need for continued collaboration in addressing the complex challenges posed by the convergence of AI, HPC, and multi-facility research environments.
- Published
- 2024
32. A search using GEO600 for gravitational waves coincident with fast radio bursts from SGR 1935+2154
- Author
The LIGO Scientific Collaboration, the Virgo Collaboration, the KAGRA Collaboration, Abac, A. G., Abbott, R., Abouelfettouh, I., Acernese, F., Ackley, K., Adhicary, S., Adhikari, N., Adhikari, R. X., Adkins, V. K., Agarwal, D., Agathos, M., Abchouyeh, M. Aghaei, Aguiar, O. D., Aguilar, I., Aiello, L., Ain, A., Ajith, P., Akutsu, T., Albanesi, S., Alfaidi, R. A., Al-Jodah, A., Alléné, C., Allocca, A., Al-Shammari, S., Altin, P. A., Alvarez-Lopez, S., Amato, A., Amez-Droz, L., Amorosi, A., Amra, C., Ananyeva, A., Anderson, S. B., Anderson, W. G., Andia, M., Ando, M., Andrade, T., Andres, N., Andrés-Carcasona, M., Andrić, T., Anglin, J., Ansoldi, S., Antelis, J. M., Antier, S., Aoumi, M., Appavuravther, E. Z., Appert, S., Apple, S. K., Arai, K., Araya, A., Araya, M. C., Areeda, J. S., Argianas, L., Aritomi, N., Armato, F., Arnaud, N., Arogeti, M., Aronson, S. M., Ashton, G., Aso, Y., Assiduo, M., Melo, S. Assis de Souza, Aston, S. M., Astone, P., Attadio, F., Aubin, F., AultONeal, K., Avallone, G., Azrad, D., Babak, S., Badaracco, F., Badger, C., Bae, S., Bagnasco, S., Bagui, E., Baier, J. G., Baiotti, L., Bajpai, R., Baka, T., Ball, M., Ballardin, G., Ballmer, S. W., Banagiri, S., Banerjee, B., Bankar, D., Baral, P., Barayoga, J. C., Barish, B. C., Barker, D., Barneo, P., Barone, F., Barr, B., Barsotti, L., Barsuglia, M., Barta, D., Bartoletti, A. M., Barton, M. A., Bartos, I., Basak, S., Basalaev, A., Bassiri, R., Basti, A., Bates, D. E., Bawaj, M., Baxi, P., Bayley, J. C., Baylor, A. C., Baynard II, P. A., Bazzan, M., Bedakihale, V. M., Beirnaert, F., Bejger, M., Belardinelli, D., Bell, A. S., Benedetto, V., Benoit, W., Bentley, J. D., Yaala, M. Ben, Bera, S., Berbel, M., Bergamin, F., Berger, B. K., Bernuzzi, S., Beroiz, M., Bersanetti, D., Bertolini, A., Betzwieser, J., Beveridge, D., Bevins, N., Bhandare, R., Bhardwaj, U., Bhatt, R., Bhattacharjee, D., Bhaumik, S., Bhowmick, S., Bianchi, A., Bilenko, I. A., Billingsley, G., Binetti, A., Bini, S., Birnholtz, O., Biscoveanu, S., Bisht, A., Bitossi, M., Bizouard, M. -A., Blackburn, J. K., Blagg, L. A., Blair, C. D., Blair, D. G., Bobba, F., Bode, N., Boileau, G., Boldrini, M., Bolingbroke, G. N., Bolliand, A., Bonavena, L. D., Bondarescu, R., Bondu, F., Bonilla, E., Bonilla, M. S., Bonino, A., Bonnand, R., Booker, P., Borchers, A., Boschi, V., Bose, S., Bossilkov, V., Boudart, V., Boudon, A., Bozzi, A., Bradaschia, C., Brady, P. R., Braglia, M., Branch, A., Branchesi, M., Brandt, J., Braun, I., Breschi, M., Briant, T., Brillet, A., Brinkmann, M., Brockill, P., Brockmueller, E., Brooks, A. F., Brown, B. C., Brown, D. D., Brozzetti, M. L., Brunett, S., Bruno, G., Bruntz, R., Bryant, J., Bucci, F., Buchanan, J., Bulashenko, O., Bulik, T., Bulten, H. J., Buonanno, A., Burtnyk, K., Buscicchio, R., Buskulic, D., Buy, C., Byer, R. L., Davies, G. S. Cabourn, Cabras, G., Cabrita, R., Cáceres-Barbosa, V., Cadonati, L., Cagnoli, G., Cahillane, C., Bustillo, J. Calderón, Callister, T. A., Calloni, E., Camp, J. B., Canepa, M., Santoro, G. Caneva, Cannon, K. C., Cao, H., Capistran, L. A., Capocasa, E., Capote, E., Carapella, G., Carbognani, F., Carlassara, M., Carlin, J. B., Carpinelli, M., Carrillo, G., Carter, J. J., Carullo, G., Diaz, J. Casanueva, Casentini, C., Castro-Lucas, S. Y., Caudill, S., Cavaglià, M., Cavalieri, R., Cella, G., Cerdá-Durán, P., Cesarini, E., Chaibi, W., Chakraborty, P., Subrahmanya, S. Chalathadka, Chan, J. C. L., Chan, M., Chandra, K., Chang, R. -J., Chao, S., Charlton, E. 
L., Charlton, P., Chassande-Mottin, E., Chatterjee, C., Chatterjee, Debarati, Chatterjee, Deep, Chaturvedi, M., Chaty, S., Chen, A., Chen, A. H. -Y., Chen, D., Chen, H., Chen, H. Y., Chen, J., Chen, K. H., Chen, Y., Chen, Yanbei, Chen, Yitian, Cheng, H. P., Chessa, P., Cheung, H. T., Cheung, S. Y., Chiadini, F., Chiarini, G., Chierici, R., Chincarini, A., Chiofalo, M. L., Chiummo, A., Chou, C., Choudhary, S., Christensen, N., Chua, S. S. Y., Chugh, P., Ciani, G., Ciecielag, P., Cieślar, M., Cifaldi, M., Ciolfi, R., Clara, F., Clark, J. A., Clarke, J., Clarke, T. A., Clearwater, P., Clesse, S., Coccia, E., Codazzo, E., Cohadon, P. -F., Colace, S., Colleoni, M., Collette, C. G., Collins, J., Colloms, S., Colombo, A., Colpi, M., Compton, C. M., Connolly, G., Conti, L., Corbitt, T. R., Cordero-Carrión, I., Corezzi, S., Cornish, N. J., Corsi, A., Cortese, S., Costa, C. A., Cottingham, R., Coughlin, M. W., Couineaux, A., Coulon, J. -P., Countryman, S. T., Coupechoux, J. -F., Couvares, P., Coward, D. M., Cowart, M. J., Coyne, R., Craig, K., Creed, R., Creighton, J. D. E., Creighton, T. D., Cremonese, P., Criswell, A. W., Crockett-Gray, J. C. G., Crook, S., Crouch, R., Csizmazia, J., Cudell, J. R., Cullen, T. J., Cumming, A., Cuoco, E., Cusinato, M., Dabadie, P., Canton, T. Dal, Dall'Osso, S., Pra, S. Dal, Dálya, G., D'Angelo, B., Danilishin, S., D'Antonio, S., Danzmann, K., Darroch, K. E., Dartez, L. P., Dasgupta, A., Datta, S., Dattilo, V., Daumas, A., Davari, N., Dave, I., Davenport, A., Davier, M., Davies, T. F., Davis, D., Davis, L., Davis, M. C., Davis, P. J., Dax, M., De Bolle, J., Deenadayalan, M., Degallaix, J., De Laurentis, M., Deléglise, S., De Lillo, F., Dell'Aquila, D., Del Pozzo, W., De Marco, F., De Matteis, F., D'Emilio, V., Demos, N., Dent, T., Depasse, A., DePergola, N., De Pietri, R., De Rosa, R., De Rossi, C., DeSalvo, R., De Simone, R., Dhani, A., Diab, R., Díaz, M. C., Di Cesare, M., Dideron, G., Didio, N. A., Dietrich, T., Di Fiore, L., Di Fronzo, C., Di Giovanni, M., Di Girolamo, T., Diksha, D., Di Michele, A., Ding, J., Di Pace, S., Di Palma, I., Di Renzo, F., Divyajyoti, Dmitriev, A., Doctor, Z., Dohmen, E., Doleva, P. P., Dominguez, D., D'Onofrio, L., Donovan, F., Dooley, K. L., Dooney, T., Doravari, S., Dorosh, O., Drago, M., Driggers, J. C., Ducoin, J. -G., Dunn, L., Dupletsa, U., D'Urso, D., Duval, H., Duverne, P. -A., Dwyer, S. E., Eassa, C., Ebersold, M., Eckhardt, T., Eddolls, G., Edelman, B., Edo, T. B., Edy, O., Effler, A., Eichholz, J., Einsle, H., Eisenmann, M., Eisenstein, R. A., Ejlli, A., Eleveld, R. M., Emma, M., Endo, K., Engl, A. J., Enloe, E., Errico, L., Essick, R. C., Estellés, H., Estevez, D., Etzel, T., Evans, M., Evstafyeva, T., Ewing, B. E., Ezquiaga, J. M., Fabrizi, F., Faedi, F., Fafone, V., Fairhurst, S., Farah, A. M., Farr, B., Farr, W. M., Favaro, G., Favata, M., Fays, M., Fazio, M., Feicht, J., Fejer, M. M., Felicetti, R. ., Fenyvesi, E., Ferguson, D. L., Ferraiuolo, S., Ferrante, I., Ferreira, T. A., Fidecaro, F., Figura, P., Fiori, A., Fiori, I., Fishbach, M., Fisher, R. P., Fittipaldi, R., Fiumara, V., Flaminio, R., Fleischer, S. M., Fleming, L. S., Floden, E., Foley, E. M., Fong, H., Font, J. A., Fornal, B., Forsyth, P. W. F., Franceschetti, K., Franchini, N., Frasca, S., Frasconi, F., Mascioli, A. Frattale, Frei, Z., Freise, A., Freitas, O., Frey, R., Frischhertz, W., Fritschel, P., Frolov, V. V., Fronzé, G. G., Fuentes-Garcia, M., Fujii, S., Fujimori, T., Fulda, P., Fyffe, M., Gadre, B., Gair, J. 
R., Galaudage, S., Galdi, V., Gallagher, H., Gallardo, S., Gallego, B., Gamba, R., Gamboa, A., Ganapathy, D., Ganguly, A., Garaventa, B., García-Bellido, J., Núñez, C. García, García-Quirós, C., Gardner, J. W., Gardner, K. A., Gargiulo, J., Garron, A., Garufi, F., Gasbarra, C., Gateley, B., Gayathri, V., Gemme, G., Gennai, A., Gennari, V., George, J., George, R., Gerberding, O., Gergely, L., Ghonge, S., Ghosh, Archisman, Ghosh, Sayantan, Ghosh, Shaon, Ghosh, Shrobana, Ghosh, Suprovo, Ghosh, Tathagata, Giacoppo, L., Giaime, J. A., Giardina, K. D., Gibson, D. R., Gibson, D. T., Gier, C., Giri, P., Gissi, F., Gkaitatzis, S., Glanzer, J., Glotin, F., Godfrey, J., Godwin, P., Goebbels, N. L., Goetz, E., Golomb, J., Lopez, S. Gomez, Goncharov, B., Gong, Y., González, G., Goodarzi, P., Goode, S., Goodwin-Jones, A. W., Gosselin, M., Göttel, A. S., Gouaty, R., Gould, D. W., Govorkova, K., Goyal, S., Grace, B., Grado, A., Graham, V., Granados, A. E., Granata, M., Granata, V., Gras, S., Grassia, P., Gray, A., Gray, C., Gray, R., Greco, G., Green, A. C., Green, S. M., Green, S. R., Gretarsson, A. M., Gretarsson, E. M., Griffith, D., Griffiths, W. L., Griggs, H. L., Grignani, G., Grimaldi, A., Grimaud, C., Grote, H., Guerra, D., Guetta, D., Guidi, G. M., Guimaraes, A. R., Gulati, H. K., Gulminelli, F., Gunny, A. M., Guo, H., Guo, W., Guo, Y., Gupta, Anchal, Gupta, Anuradha, Gupta, Ish, Gupta, N. C., Gupta, P., Gupta, S. K., Gupta, T., Gupte, N., Gurs, J., Gutierrez, N., Guzman, F., H, H. -Y., Haba, D., Haberland, M., Haino, S., Hall, E. D., Hamilton, E. Z., Hammond, G., Han, W. -B., Haney, M., Hanks, J., Hanna, C., Hannam, M. D., Hannuksela, O. A., Hanselman, A. G., Hansen, H., Hanson, J., Harada, R., Hardison, A. R., Haris, K., Harmark, T., Harms, J., Harry, G. M., Harry, I. W., Hart, J., Haskell, B., Haster, C. -J., Hathaway, J. S., Haughian, K., Hayakawa, H., Hayama, K., Hayes, R., Heffernan, A., Heidmann, A., Heintze, M. C., Heinze, J., Heinzel, J., Heitmann, H., Hellman, F., Hello, P., Helmling-Cornell, A. F., Hemming, G., Henderson-Sapir, O., Hendry, M., Heng, I. S., Hennes, E., Henshaw, C., Hertog, T., Heurs, M., Hewitt, A. L., Heyns, J., Higginbotham, S., Hild, S., Hill, S., Himemoto, Y., Hirata, N., Hirose, C., Ho, W. C. G., Hoang, S., Hochheim, S., Hofman, D., Holland, N. A., Holley-Bockelmann, K., Holmes, Z. J., Holz, D. E., Honet, L., Hong, C., Hornung, J., Hoshino, S., Hough, J., Hourihane, S., Howell, E. J., Hoy, C. G., Hrishikesh, C. A., Hsieh, H. -F., Hsiung, C., Hsu, H. C., Hsu, W. -F., Hu, P., Hu, Q., Huang, H. Y., Huang, Y. -J., Huddart, A. D., Hughey, B., Hui, D. C. Y., Hui, V., Husa, S., Huxford, R., Huynh-Dinh, T., Iampieri, L., Iandolo, G. A., Ianni, M., Iess, A., Imafuku, H., Inayoshi, K., Inoue, Y., Iorio, G., Iqbal, M. H., Irwin, J., Ishikawa, R., Isi, M., Ismail, M. A., Itoh, Y., Iwanaga, H., Iwaya, M., Iyer, B. R., JaberianHamedan, V., Jacquet, C., Jacquet, P. -E., Jadhav, S. J., Jadhav, S. P., Jain, T., James, A. L., James, P. A., Jamshidi, R., Janquart, J., Janssens, K., Janthalur, N. N., Jaraba, S., Jaranowski, P., Jaume, R., Javed, W., Jennings, A., Jia, W., Jiang, J., Kubisz, J., Johanson, C., Johns, G. R., Johnson, N. A., Johnston, M. C., Johnston, R., Johny, N., Jones, D. H., Jones, D. I., Jones, R., Jose, S., Joshi, P., Ju, L., Jung, K., Junker, J., Juste, V., Kajita, T., Kaku, I., Kalaghatgi, C., Kalogera, V., Kamiizumi, M., Kanda, N., Kandhasamy, S., Kang, G., Kanner, J. B., Kapadia, S. J., Kapasi, D. 
P., Karat, S., Karathanasis, C., Kashyap, R., Kasprzack, M., Kastaun, W., Kato, T., Katsavounidis, E., Katzman, W., Kaushik, R., Kawabe, K., Kawamoto, R., Kazemi, A., Keitel, D., Kelley-Derzon, J., Kennington, J., Kesharwani, R., Key, J. S., Khadela, R., Khadka, S., Khalili, F. Y., Khan, F., Khan, I., Khanam, T., Khursheed, M., Khusid, N. M., Kiendrebeogo, W., Kijbunchoo, N., Kim, C., Kim, J. C., Kim, K., Kim, M. H., Kim, S., Kim, Y. -M., Kimball, C., Kinley-Hanlon, M., Kinnear, M., Kissel, J. S., Klimenko, S., Knee, A. M., Knust, N., Kobayashi, K., Koch, P., Koehlenbeck, S. M., Koekoek, G., Kohri, K., Kokeyama, K., Koley, S., Kolitsidou, P., Kolstein, M., Komori, K., Kong, A. K. H., Kontos, A., Korobko, M., Kossak, R. V., Kou, X., Koushik, A., Kouvatsos, N., Kovalam, M., Kozak, D. B., Kranzhoff, S. L., Kringel, V., Krishnendu, N. V., Królak, A., Kruska, K., Kuehn, G., Kuijer, P., Kulkarni, S., Ramamohan, A. Kulur, Kumar, A., Kumar, Praveen, Kumar, Prayush, Kumar, Rahul, Kumar, Rakesh, Kume, J., Kuns, K., Kuntimaddi, N., Kuroyanagi, S., Kurth, N. J., Kuwahara, S., Kwak, K., Kwan, K., Kwok, J., Lacaille, G., Lagabbe, P., Laghi, D., Lai, S., Laity, A. H., Lakkis, M. H., Lalande, E., Lalleman, M., Lalremruati, P. C., Landry, M., Lane, B. B., Lang, R. N., Lange, J., Lantz, B., La Rana, A., La Rosa, I., Lartaux-Vollard, A., Lasky, P. D., Lawrence, J., Lawrence, M. N., Laxen, M., Lazzarini, A., Lazzaro, C., Leaci, P., Lecoeuche, Y. K., Lee, H. M., Lee, H. W., Lee, K., Lee, R. -K., Lee, R., Lee, S., Lee, Y., Legred, I. N., Lehmann, J., Lehner, L., Jean, M. Le, Lemaître, A., Lenti, M., Leonardi, M., Lequime, M., Leroy, N., Lesovsky, M., Letendre, N., Lethuillier, M., Levin, S. E., Levin, Y., Leyde, K., Li, A. K. Y., Li, K. L., Li, T. G. F., Li, X., Li, Z., Lihos, A., Lin, C-Y., Lin, C. -Y., Lin, E. T., Lin, F., Lin, H., Lin, L. C. -C., Lin, Y. -C., Linde, F., Linker, S. D., Littenberg, T. B., Liu, A., Liu, G. C., Liu, Jian, Villarreal, F. Llamas, Llobera-Querol, J., Lo, R. K. L., Locquet, J. -P., London, L. T., Longo, A., Lopez, D., Portilla, M. Lopez, Lorenzini, M., Lorenzo-Medina, A., Loriette, V., Lormand, M., Losurdo, G., Lott IV, T. P., Lough, J. D., Loughlin, H. A., Lousto, C. O., Lowry, M. J., Lu, N., Lück, H., Lumaca, D., Lundgren, A. P., Lussier, A. W., Ma, L. -T., Ma, S., Ma'arif, M., Macas, R., Macedo, A., MacInnis, M., Maciy, R. R., Macleod, D. M., MacMillan, I. A. O., Macquet, A., Macri, D., Maeda, K., Maenaut, S., Hernandez, I. Magaña, Magare, S. S., Magazzù, C., Magee, R. M., Maggio, E., Maggiore, R., Magnozzi, M., Mahesh, M., Mahesh, S., Maini, M., Majhi, S., Majorana, E., Makarem, C. N., Makelele, E., Malaquias-Reis, J. A., Mali, U., Maliakal, S., Malik, A., Man, N., Mandic, V., Mangano, V., Mannix, B., Mansell, G. L., Mansingh, G., Manske, M., Mantovani, M., Mapelli, M., Marchesoni, F., Pina, D. Marín, Marion, F., Márka, S., Márka, Z., Markosyan, A. S., Markowitz, A., Maros, E., Marsat, S., Martelli, F., Martin, I. W., Martin, R. M., Martinez, B. B., Martinez, M., Martinez, V., Martini, A., Martinovic, K., Martins, J. C., Martynov, D. V., Marx, E. J., Massaro, L., Masserot, A., Masso-Reid, M., Mastrodicasa, M., Mastrogiovanni, S., Matcovich, T., Matiushechkina, M., Matsuyama, M., Mavalvala, N., Maxwell, N., McCarrol, G., McCarthy, R., McCormick, S., McCuller, L., McEachin, S., McElhenny, C., McGhee, G. I., McGinn, J., McGowan, K. B. M., McIver, J., McLeod, A., McRae, T., Meacher, D., Meijer, Q., Melatos, A., Mellaerts, S., Menendez-Vazquez, A., Menoni, C. S., Mera, F., Mercer, R. 
A., Mereni, L., Merfeld, K., Merilh, E. L., Mérou, J. R., Merritt, J. D., Merzougui, M., Messenger, C., Messick, C., Meyer-Conde, M., Meylahn, F., Mhaske, A., Miani, A., Miao, H., Michaloliakos, I., Michel, C., Michimura, Y., Middleton, H., Miller, A. L., Miller, S., Millhouse, M., Milotti, E., Milotti, V., Minenkov, Y., Mio, N., Mir, Ll. M., Mirasola, L., Miravet-Tenés, M., Miritescu, C. -A., Mishra, A. K., Mishra, A., Mishra, C., Mishra, T., Mitchell, A. L., Mitchell, J. G., Mitra, S., Mitrofanov, V. P., Mittleman, R., Miyakawa, O., Miyamoto, S., Miyoki, S., Mo, G., Mobilia, L., Mohapatra, S. R. P., Mohite, S. R., Molina-Ruiz, M., Mondal, C., Mondin, M., Montani, M., Moore, C. J., Moraru, D., More, A., More, S., Moreno, G., Morgan, C., Morisaki, S., Moriwaki, Y., Morras, G., Moscatello, A., Mourier, P., Mours, B., Mow-Lowry, C. M., Muciaccia, F., Mukherjee, Arunava, Mukherjee, D., Mukherjee, Samanwaya, Mukherjee, Soma, Mukherjee, Subroto, Mukherjee, Suvodip, Mukund, N., Mullavey, A., Munch, J., Mundi, J., Mungioli, C. L., Oberg, W. R. Munn, Murakami, Y., Murakoshi, M., Murray, P. G., Muusse, S., Nabari, D., Nadji, S. L., Nagar, A., Nagarajan, N., Nagler, K. N., Nakagaki, K., Nakamura, K., Nakano, H., Nakano, M., Nandi, D., Napolano, V., Narayan, P., Nardecchia, I., Narola, H., Naticchioni, L., Nayak, R. K., Neilson, J., Nelson, A., Nelson, T. J. N., Nery, M., Neunzert, A., Ng, S., Quynh, L. Nguyen, Nichols, S. A., Nielsen, A. B., Nieradka, G., Niko, A., Nishino, Y., Nishizawa, A., Nissanke, S., Nitoglia, E., Niu, W., Nocera, F., Norman, M., North, C., Novak, J., Siles, J. F. Nuño, Nuttall, L. K., Obayashi, K., Oberling, J., O'Dell, J., Oertel, M., Offermans, A., Oganesyan, G., Oh, J. J., Oh, K., O'Hanlon, T., Ohashi, M., Ohkawa, M., Ohme, F., Oliveira, A. S., Oliveri, R., O'Neal, B., Oohara, K., O'Reilly, B., Ormsby, N. D., Orselli, M., O'Shaughnessy, R., O'Shea, S., Oshima, Y., Oshino, S., Ossokine, S., Osthelder, C., Ota, I., Ottaway, D. J., Ouzriat, A., Overmier, H., Owen, B. J., Pace, A. E., Pagano, R., Page, M. A., Pai, A., Pal, A., Pal, S., Palaia, M. A., Pálfi, M., Palma, P. P., Palomba, C., Palud, P., Pan, H., Pan, J., Pan, K. C., Panai, R., Panda, P. K., Pandey, S., Panebianco, L., Pang, P. T. H., Pannarale, F., Pannone, K. A., Pant, B. C., Panther, F. H., Paoletti, F., Paolone, A., Papalexakis, E. E., Papalini, L., Papigkiotis, G., Paquis, A., Parisi, A., Park, B. -J., Park, J., Parker, W., Pascale, G., Pascucci, D., Pasqualetti, A., Passaquieti, R., Passenger, L., Passuello, D., Patane, O., Pathak, D., Pathak, M., Patra, A., Patricelli, B., Patron, A. S., Paul, K., Paul, S., Payne, E., Pearce, T., Pedraza, M., Pegna, R., Pele, A., Arellano, F. E. Peña, Penn, S., Penuliar, M. D., Perego, A., Pereira, Z., Perez, J. J., Périgois, C., Perna, G., Perreca, A., Perret, J., Perriès, S., Perry, J. W., Pesios, D., Petracca, S., Petrillo, C., Pfeiffer, H. P., Pham, H., Pham, K. A., Phukon, K. S., Phurailatpam, H., Piarulli, M., Piccari, L., Piccinni, O. J., Pichot, M., Piendibene, M., Piergiovanni, F., Pierini, L., Pierra, G., Pierro, V., Pietrzak, M., Pillas, M., Pilo, F., Pinard, L., Pinto, I. M., Pinto, M., Piotrzkowski, B. J., Pirello, M., Pitkin, M. D., Placidi, A., Placidi, E., Planas, M. L., Plastino, W., Poggiani, R., Polini, E., Pompili, L., Poon, J., Porcelli, E., Porter, E. K., Posnansky, C., Poulton, R., Powell, J., Pracchia, M., Pradhan, B. K., Pradier, T., Prajapati, A. K., Prasai, K., Prasanna, R., Prasia, P., Pratten, G., Principe, G., Principe, M., Prodi, G. 
A., Prokhorov, L., Prosposito, P., Puecher, A., Pullin, J., Punturo, M., Puppo, P., Pürrer, M., Qi, H., Qin, J., Quéméner, G., Quetschke, V., Quigley, C., Quinonez, P. J., Quitzow-James, R., Raab, F. J., Raabith, S. S., Raaijmakers, G., Raja, S., Rajan, C., Rajbhandari, B., Ramirez, K. E., Vidal, F. A. Ramis, Ramos-Buades, A., Rana, D., Ranjan, S., Ransom, K., Rapagnani, P., Ratto, B., Rawat, S., Ray, A., Raymond, V., Razzano, M., Read, J., Payo, M. Recaman, Regimbau, T., Rei, L., Reid, S., Reitze, D. H., Relton, P., Renzini, A. I., Rettegno, P., Revenu, B., Reyes, R., Rezaei, A. S., Ricci, F., Ricci, M., Ricciardone, A., Richardson, J. W., Richardson, M., Rijal, A., Riles, K., Riley, H. K., Rinaldi, S., Rittmeyer, J., Robertson, C., Robinet, F., Robinson, M., Rocchi, A., Rolland, L., Rollins, J. G., Romano, A. E., Romano, R., Romero, A., Romero-Shaw, I. M., Romie, J. H., Ronchini, S., Roocke, T. J., Rosa, L., Rosauer, T. J., Rose, C. A., Rosińska, D., Ross, M. P., Rossello, M., Rowan, S., Roy, S. K., Roy, S., Rozza, D., Ruggi, P., Ruhama, N., Morales, E. Ruiz, Ruiz-Rocha, K., Sachdev, S., Sadecki, T., Sadiq, J., Saffarieh, P., Sah, M. R., Saha, S. S., Saha, S., Sainrat, T., Menon, S. Sajith, Sakai, K., Sakellariadou, M., Sakon, S., Salafia, O. S., Salces-Carcoba, F., Salconi, L., Saleem, M., Salemi, F., Sallé, M., Salvador, S., Sanchez, A., Sanchez, E. J., Sanchez, J. H., Sanchez, L. E., Sanchis-Gual, N., Sanders, J. R., Sänger, E. M., Santoliquido, F., Saravanan, T. R., Sarin, N., Sasaoka, S., Sasli, A., Sassi, P., Sassolas, B., Satari, H., Sato, R., Sato, Y., Sauter, O., Savage, R. L., Sawada, T., Sawant, H. L., Sayah, S., Scacco, V., Schaetzl, D., Scheel, M., Schiebelbein, A., Schiworski, M. G., Schmidt, P., Schmidt, S., Schnabel, R., Schneewind, M., Schofield, R. M. S., Schouteden, K., Schulte, B. W., Schutz, B. F., Schwartz, E., Scialpi, M., Scott, J., Scott, S. M., Seetharamu, T. C., Seglar-Arroyo, M., Sekiguchi, Y., Sellers, D., Sengupta, A. S., Sentenac, D., Seo, E. G., Seo, J. W., Sequino, V., Serra, M., Servignat, G., Sevrin, A., Shaffer, T., Shah, U. S., Shaikh, M. A., Shao, L., Sharma, A. K., Sharma, P., Sharma-Chaudhary, S., Shaw, M. R., Shawhan, P., Shcheblanov, N. S., Sheridan, E., Shikano, Y., Shikauchi, M., Shimode, K., Shinkai, H., Shiota, J., Shoemaker, D. H., Shoemaker, D. M., Short, R. W., ShyamSundar, S., Sider, A., Siegel, H., Sieniawska, M., Sigg, D., Silenzi, L., Simmonds, M., Singer, L. P., Singh, A., Singh, D., Singh, M. K., Singh, S., Singha, A., Sintes, A. M., Sipala, V., Skliris, V., Slagmolen, B. J. J., Slaven-Blair, T. J., Smetana, J., Smith, J. R., Smith, L., Smith, R. J. E., Smith, W. J., Soldateschi, J., Somiya, K., Song, I., Soni, K., Soni, S., Sordini, V., Sorrentino, F., Sorrentino, N., Sotani, H., Soulard, R., Southgate, A., Spagnuolo, V., Spencer, A. P., Spera, M., Spinicelli, P., Spoon, J. B., Sprague, C. A., Srivastava, A. K., Stachurski, F., Steer, D. A., Steinlechner, J., Steinlechner, S., Stergioulas, N., Stevens, P., StPierre, M., Stratta, G., Strong, M. D., Strunk, A., Sturani, R., Stuver, A. L., Suchenek, M., Sudhagar, S., Sueltmann, N., Suleiman, L., Sullivan, K. D., Sun, L., Sunil, S., Suresh, J., Sutton, P. J., Suzuki, T., Suzuki, Y., Swinkels, B. L., Syx, A., Szczepańczyk, M. J., Szewczyk, P., Tacca, M., Tagoshi, H., Tait, S. C., Takahashi, H., Takahashi, R., Takamori, A., Takase, T., Takatani, K., Takeda, H., Takeshita, K., Talbot, C., Tamaki, M., Tamanini, N., Tanabe, D., Tanaka, K., Tanaka, S. 
J., Tanaka, T., Tang, D., Tanioka, S., Tanner, D. B., Tao, L., Tapia, R. D., Martín, E. N. Tapia San, Tarafder, R., Taranto, C., Taruya, A., Tasson, J. D., Teloi, M., Tenorio, R., Themann, H., Theodoropoulos, A., Thirugnanasambandam, M. P., Thomas, L. M., Thomas, M., Thomas, P., Thompson, J. E., Thondapu, S. R., Thorne, K. A., Thrane, E., Tissino, J., Tiwari, A., Tiwari, P., Tiwari, S., Tiwari, V., Todd, M. R., Toivonen, A. M., Toland, K., Tolley, A. E., Tomaru, T., Tomita, K., Tomura, T., Tong-Yu, C., Toriyama, A., Toropov, N., Torres-Forné, A., Torrie, C. I., Toscani, M., Melo, I. Tosta e, Tournefier, E., Trapananti, A., Travasso, F., Traylor, G., Trevor, M., Tringali, M. C., Tripathee, A., Troian, G., Troiano, L., Trovato, A., Trozzo, L., Trudeau, R. J., Tsang, T. T. L., Tso, R., Tsuchida, S., Tsukada, L., Tsutsui, T., Turbang, K., Turconi, M., Turski, C., Ubach, H., Uchiyama, T., Udall, R. P., Uehara, T., Uematsu, M., Ueno, K., Ueno, S., Undheim, V., Ushiba, T., Vacatello, M., Vahlbruch, H., Vaidya, N., Vajente, G., Vajpeyi, A., Valdes, G., Valencia, J., Valentini, M., Vallejo-Peña, S. A., Vallero, S., Valsan, V., van Bakel, N., van Beuzekom, M., van Dael, M., Brand, J. F. J. van den, Broeck, C. Van Den, Vander-Hyde, D. C., van der Sluys, M., Van de Walle, A., van Dongen, J., Vandra, K., van Haevermaet, H., van Heijningen, J. V., Van Hove, P., VanKeuren, M., Vanosky, J., van Putten, M. H. P. M., van Ranst, Z., van Remortel, N., Vardaro, M., Vargas, A. F., Varghese, J. J., Varma, V., Vasúth, M., Vecchio, A., Vedovato, G., Veitch, J., Veitch, P. J., Venikoudis, S., Venneberg, J., Verdier, P., Verkindt, D., Verma, B., Verma, P., Verma, Y., Vermeulen, S. M., Vetrano, F., Veutro, A., Vibhute, A. M., Viceré, A., Vidyant, S., Viets, A. D., Vijaykumar, A., Vilkha, A., Villa-Ortega, V., Vincent, E. T., Vinet, J. -Y., Viret, S., Virtuoso, A., Vitale, S., Vives, A., Vocca, H., Voigt, D., von Reis, E. R. G., von Wrangel, J. S. A., Vyatchanin, S. P., Wade, L. E., Wade, M., Wagner, K. J., Wajid, A., Walker, M., Wallace, G. S., Wallace, L., Wang, H., Wang, J. Z., Wang, W. H., Wang, Z., Waratkar, G., Warner, J., Was, M., Washimi, T., Washington, N. Y., Watarai, D., Wayt, K. E., Weaver, B. R., Weaver, B., Weaving, C. R., Webster, S. A., Weinert, M., Weinstein, A. J., Weiss, R., Wellmann, F., Wen, L., Weßels, P., Wette, K., Whelan, J. T., Whiting, B. F., Whittle, C., Wildberger, J. B., Wilk, O. S., Wilken, D., Wilkin, A. T., Willadsen, D. J., Willetts, K., Williams, D., Williams, M. J., Williams, N. S., Willis, J. L., Willke, B., Wils, M., Winterflood, J., Wipf, C. C., Woan, G., Woehler, J., Wofford, J. K., Wolfe, N. E., Wong, H. T., Wong, H. W. Y., Wong, I. C. F., Wright, J. L., Wright, M., Wu, C., Wu, D. S., Wu, H., Wuchner, E., Wysocki, D. M., Xu, V. A., Xu, Y., Yadav, N., Yamamoto, H., Yamamoto, K., Yamamoto, T. S., Yamamoto, T., Yamamura, S., Yamazaki, R., Yan, S., Yan, T., Yang, F. W., Yang, F., Yang, K. Z., Yang, Y., Yarbrough, Z., Yasui, H., Yeh, S. -W., Yelikar, A. B., Yin, X., Yokoyama, J., Yokozawa, T., Yoo, J., Yu, H., Yuan, S., Yuzurihara, H., Zadrożny, A., Zanolin, M., Zeeshan, M., Zelenova, T., Zendri, J. -P., Zeoli, M., Zerrad, M., Zevin, M., Zhang, A. C., Zhang, L., Zhang, R., Zhang, T., Zhang, Y., Zhao, C., Zhao, Yue, Zhao, Yuhang, Zheng, Y., Zhong, H., Zhou, R., Zhu, X. -J., Zhu, Z. -H., Zucker, M. E., and Zweizig, J.
- Subjects
Astrophysics - High Energy Astrophysical Phenomena - Abstract
The magnetar SGR 1935+2154 is the only known Galactic source of fast radio bursts (FRBs). FRBs from SGR 1935+2154 were first detected by CHIME/FRB and STARE2 in 2020 April, after the conclusion of the LIGO, Virgo, and KAGRA Collaborations' O3 observing run. Here we analyze four periods of gravitational wave (GW) data from the GEO600 detector coincident with four periods of FRB activity detected by CHIME/FRB, as well as X-ray glitches and X-ray bursts detected by NICER and NuSTAR close to the time of one of the FRBs. We do not detect any significant GW emission from any of the events. Instead, using a short-duration GW search (for bursts $\leq$ 1 s) we derive 50% (90%) upper limits of $10^{48}$ ($10^{49}$) erg for GWs at 300 Hz and $10^{49}$ ($10^{50}$) erg at 2 kHz, and constrain the GW-to-radio energy ratio to $\leq 10^{14} - 10^{16}$. We also derive upper limits from a long-duration search for bursts with durations between 1 and 10 s. These represent the strictest upper limits on concurrent GW emission from FRBs., Comment: 15 pages of text including references, 4 figures, 5 tables
- Published
- 2024
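A quick arithmetic sketch of how the quoted GW-to-radio energy ratio limit in the entry above arises; the radio burst energy used here is an illustrative placeholder, not a value from the abstract.

    # E_GW upper limit from the abstract; E_radio is a hypothetical radio energy.
    E_gw_ul = 1e48   # erg, 50% upper limit at 300 Hz (quoted above)
    E_radio = 1e32   # erg, assumed isotropic radio burst energy (placeholder)
    print(f"E_GW/E_radio <= {E_gw_ul / E_radio:.0e}")  # 1e+16, within the quoted 1e14-1e16 range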
33. New developments on the Ingot WFS laboratory testing
- Author
- Machado, Tânia Gomes, Di Filippo, Simone, Santhakumari, Kalyan K. R., Bergomi, Maria, Greggio, Davide, Portaluri, Elisa, Malik, Dheeraj, Nesme, César, Arcidiacono, Carmelo, Ballone, Alessandro, Battaini, Federico, Viotto, Valentina, Ragazzoni, Roberto, Dima, Marco, Marafatto, Luca, Farinato, Jacopo, Magrin, Demetrio, Lessio, Luigi, and Umbriaco, Gabriele
- Subjects
Astrophysics - Instrumentation and Methods for Astrophysics - Abstract
The Ingot WFS was designed to overcome some of the challenges present in classical wavefront sensors when they deal with sodium LGSs. This innovative sensor works by sensing the full 3D volume of the elongated LGS and is suitable for use in very large telescopes. A test bench has been assembled at the INAF - Osservatorio Astronomico di Padova laboratories to test and characterize the functioning of the Ingot WFS. In this work, we summarize the main results of the tests performed on a new search algorithm. Then, we move towards a more accurate simulation of the sodium LGS by replicating real time-varying sodium layer profiles. The study of their impact on the ingot pupil signals is described in this work., Comment: 17 pages, 17 figures, Proceedings of SPIE 2024
- Published
- 2024
- Full Text
- View/download PDF
34. Block Induced Signature Generative Adversarial Network (BISGAN): Signature Spoofing Using GANs and Their Evaluation
- Author
- Amjad, Haadia, Goeller, Kilian, Seitz, Steffen, Knoll, Carsten, Bajwa, Naseer, Tetzlaff, Ronald, and Malik, Muhammad Imran
- Subjects
Computer Science - Computer Vision and Pattern Recognition ,Computer Science - Artificial Intelligence - Abstract
Deep learning is actively being used in biometrics to develop efficient identification and verification systems. Handwritten signatures are a common subset of biometric data for authentication purposes. Generative adversarial networks (GANs) learn from original and forged signatures to generate forged signatures. While most GAN techniques create a strong signature verifier, which is the discriminator, there is a need to focus more on the quality of forgeries generated by the generator model. This work focuses on creating a generator that produces forged samples that achieve a benchmark in spoofing signature verification systems. We use CycleGANs infused with Inception model-like blocks with attention heads as the generator and a variation of the SigCNN model as the base Discriminator. We train our model with a new technique that results in 80% to 100% success in signature spoofing. Additionally, we create a custom evaluation technique to act as a goodness measure of the generated forgeries. Our work advocates generator-focused GAN architectures for spoofing data quality that aid in a better understanding of biometric data generation and evaluation.
- Published
- 2024
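For the BISGAN entry above, a minimal PyTorch sketch of an Inception-like block with a channel-attention head of the kind the abstract describes for the generator; the branch widths and the squeeze-and-excitation style attention are assumptions, not the authors' architecture.

    import torch
    import torch.nn as nn

    class InceptionAttnBlock(nn.Module):
        def __init__(self, ch):
            super().__init__()
            # Parallel branches at different receptive fields, as in Inception.
            self.b1 = nn.Conv2d(ch, ch // 2, kernel_size=1)
            self.b3 = nn.Conv2d(ch, ch // 4, kernel_size=3, padding=1)
            self.b5 = nn.Conv2d(ch, ch // 4, kernel_size=5, padding=2)
            # Simple channel-attention head (squeeze-and-excitation style).
            self.attn = nn.Sequential(
                nn.AdaptiveAvgPool2d(1),
                nn.Conv2d(ch, ch // 8, 1), nn.ReLU(),
                nn.Conv2d(ch // 8, ch, 1), nn.Sigmoid(),
            )

        def forward(self, x):
            y = torch.cat([self.b1(x), self.b3(x), self.b5(x)], dim=1)  # back to ch channels
            return y * self.attn(y)  # reweight channels by attention

    x = torch.randn(1, 64, 32, 32)
    print(InceptionAttnBlock(64)(x).shape)  # torch.Size([1, 64, 32, 32])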
35. Analytical QNMs of fields of various spin in the Hayward spacetime
- Author
- Malik, Zainab
- Subjects
General Relativity and Quantum Cosmology - Abstract
By employing an expansion in terms of the inverse multipole number, we derive analytic expressions for the quasinormal modes (QNMs) of scalar, Dirac, and Maxwell perturbations in the Hayward black hole (BH) background. The metric has three interpretations: as a model for a radiating BH, as a quantum-corrected BH owing to the running gravitational coupling in the Asymptotically Safe Gravity, and as a BH solution in the Effective Field Theory. We show that the obtained compact analytical formulas approximate QNMs with remarkable accuracy for $\ell > 0$.
- Published
- 2024
- Full Text
- View/download PDF
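For the entry above, the generic shape of an expansion in the inverse multipole number, written schematically in LaTeX; the coefficients \omega_k are symbolic placeholders, not the paper's results.

    % With L = \ell + 1/2, a QNM frequency expanded in inverse powers of L:
    \omega \,=\, \omega_{-1} L \,+\, \omega_0 \,+\, \frac{\omega_1}{L} \,+\, \frac{\omega_2}{L^{2}} \,+\, \mathcal{O}\!\left(L^{-3}\right)
    % Each \omega_k depends on the overtone number n, the field spin, and the
    % metric parameters (mass and the Hayward regularization scale).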
36. Estimating Body and Hand Motion in an Ego-sensed World
- Author
- Yi, Brent, Ye, Vickie, Zheng, Maya, Müller, Lea, Pavlakos, Georgios, Ma, Yi, Malik, Jitendra, and Kanazawa, Angjoo
- Subjects
Computer Science - Computer Vision and Pattern Recognition ,Computer Science - Artificial Intelligence - Abstract
We present EgoAllo, a system for human motion estimation from a head-mounted device. Using only egocentric SLAM poses and images, EgoAllo guides sampling from a conditional diffusion model to estimate 3D body pose, height, and hand parameters that capture the wearer's actions in the allocentric coordinate frame of the scene. To achieve this, our key insight is in representation: we propose spatial and temporal invariance criteria for improving model performance, from which we derive a head motion conditioning parameterization that improves estimation by up to 18%. We also show how the bodies estimated by our system can improve the hands: the resulting kinematic and temporal constraints result in over 40% lower hand estimation errors compared to noisy monocular estimates. Project page: https://egoallo.github.io/, Comment: v2: fixed figures for Safari, typos
- Published
- 2024
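For the EgoAllo entry above, a schematic sketch of sampling a pose from a conditional diffusion model given head-motion conditioning, as the abstract describes; the denoiser, dimensions, and noise schedule are hypothetical stand-ins, not the paper's implementation.

    import torch

    @torch.no_grad()
    def sample_pose(denoiser, head_cond, steps=50, dim=135):
        """DDPM-style ancestral sampling of a pose vector, conditioned on head motion."""
        x = torch.randn(1, dim)                     # start from Gaussian noise
        betas = torch.linspace(1e-4, 0.02, steps)   # assumed noise schedule
        alphas = 1.0 - betas
        abar = torch.cumprod(alphas, dim=0)
        for t in reversed(range(steps)):
            eps = denoiser(x, t, head_cond)         # noise prediction given conditioning
            x = (x - betas[t] / torch.sqrt(1.0 - abar[t]) * eps) / torch.sqrt(alphas[t])
            if t > 0:
                x = x + torch.sqrt(betas[t]) * torch.randn_like(x)
        return x  # pose parameters in the scene (allocentric) frame

    dummy = lambda x, t, c: torch.zeros_like(x)     # stand-in for a trained network
    pose = sample_pose(dummy, head_cond=None)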
37. Learning Humanoid Locomotion over Challenging Terrain
- Author
- Radosavovic, Ilija, Kamat, Sarthak, Darrell, Trevor, and Malik, Jitendra
- Subjects
Computer Science - Robotics ,Computer Science - Machine Learning - Abstract
Humanoid robots can, in principle, use their legs to go almost anywhere. Developing controllers capable of traversing diverse terrains, however, remains a considerable challenge. Classical controllers are hard to generalize broadly while the learning-based methods have primarily focused on gentle terrains. Here, we present a learning-based approach for blind humanoid locomotion capable of traversing challenging natural and man-made terrain. Our method uses a transformer model to predict the next action based on the history of proprioceptive observations and actions. The model is first pre-trained on a dataset of flat-ground trajectories with sequence modeling, and then fine-tuned on uneven terrain using reinforcement learning. We evaluate our model on a real humanoid robot across a variety of terrains, including rough, deformable, and sloped surfaces. The model demonstrates robust performance, in-context adaptation, and emergent terrain representations. In real-world case studies, our humanoid robot successfully traversed over 4 miles of hiking trails in Berkeley and climbed some of the steepest streets in San Francisco., Comment: Project page: https://humanoid-challenging-terrain.github.io
- Published
- 2024
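For the entry above, a minimal causal-transformer sketch of next-action prediction from a history of proprioceptive observations and actions; the dimensions and layer sizes are assumptions, not the authors' model.

    import torch
    import torch.nn as nn

    class NextActionTransformer(nn.Module):
        def __init__(self, obs_dim=48, act_dim=19, d_model=128, n_layers=4):
            super().__init__()
            self.embed = nn.Linear(obs_dim + act_dim, d_model)  # one token per timestep
            layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
            self.head = nn.Linear(d_model, act_dim)

        def forward(self, obs, act):
            # obs: (B, T, obs_dim); act: (B, T, act_dim) -> next action (B, act_dim)
            tok = self.embed(torch.cat([obs, act], dim=-1))
            T = tok.shape[1]
            causal = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
            h = self.encoder(tok, mask=causal)       # attend only to the past
            return self.head(h[:, -1])               # decode the next-step action

    model = NextActionTransformer()
    a = model(torch.randn(2, 16, 48), torch.randn(2, 16, 19))  # -> (2, 19)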
38. Automated Music Therapy for Anxiety and Depression Management in Older People (AMITY)
- Author
- Faizan, Malik, White, P. J., and Dey, Indrakshi
- Subjects
Electrical Engineering and Systems Science - Systems and Control - Abstract
The onset of old age brings physiological and mental changes, with anxiety and depression being common mental disorders that can trigger other health issues and reduce lifespan. However, due to a global shortage of mental health professionals, combined with a growing population and limited awareness, these disorders often go undiagnosed. Music therapy offers a reliable method to address psychological, emotional, and cognitive needs. This paper presents an approach that monitors anxiety and depression symptoms in real time using low-complexity body sensors, followed by automated personalised music therapy, reducing the dependence on therapists and improving mental health care accessibility., Comment: 10 pages, 5 figures
- Published
- 2024
39. SurgeoNet: Realtime 3D Pose Estimation of Articulated Surgical Instruments from Stereo Images using a Synthetically-trained Network
- Author
- Aboukhadra, Ahmed Tawfik, Robertini, Nadia, Malik, Jameel, Elhayek, Ahmed, Reis, Gerd, and Stricker, Didier
- Subjects
Computer Science - Computer Vision and Pattern Recognition - Abstract
Surgery monitoring in Mixed Reality (MR) environments has recently received substantial focus due to its importance in image-based decisions, skill assessment, and robot-assisted surgery. Tracking hands and articulated surgical instruments is crucial for the success of these applications. Due to the lack of annotated datasets and the complexity of the task, only a few works have addressed this problem. In this work, we present SurgeoNet, a real-time neural network pipeline to accurately detect and track surgical instruments from a stereo VR view. Our multi-stage approach is inspired by state-of-the-art neural-network architectural design, like YOLO and Transformers. We demonstrate the generalization capabilities of SurgeoNet in challenging real-world scenarios, achieved solely through training on synthetic data. The approach can be easily extended to any new set of articulated surgical instruments. SurgeoNet's code and data are publicly available.
- Published
- 2024
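For the entry above, a sketch of the stereo geometry underlying 3D estimation from a calibrated stereo view: linear (DLT) triangulation of one matched keypoint. The toy camera matrices are placeholders; this is not SurgeoNet code.

    import numpy as np

    def triangulate(P1, P2, x1, x2):
        """Linear (DLT) triangulation of one point seen in two calibrated views."""
        A = np.stack([
            x1[0] * P1[2] - P1[0],
            x1[1] * P1[2] - P1[1],
            x2[0] * P2[2] - P2[0],
            x2[1] * P2[2] - P2[1],
        ])
        _, _, Vt = np.linalg.svd(A)
        X = Vt[-1]                     # null vector = homogeneous 3D point
        return X[:3] / X[3]

    # Toy stereo rig: identity left camera, right camera shifted along x.
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
    X = np.array([0.2, -0.1, 5.0, 1.0])
    x1, x2 = (P1 @ X)[:2] / (P1 @ X)[2], (P2 @ X)[:2] / (P2 @ X)[2]
    print(triangulate(P1, P2, x1, x2))   # ~ [0.2, -0.1, 5.0]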
40. Tracking objects that change in appearance with phase synchrony
- Author
- Muzellec, Sabine, Linsley, Drew, Ashok, Alekh K., Mingolla, Ennio, Malik, Girik, VanRullen, Rufin, and Serre, Thomas
- Subjects
Computer Science - Artificial Intelligence ,Computer Science - Computer Vision and Pattern Recognition ,Quantitative Biology - Neurons and Cognition - Abstract
Objects we encounter often change appearance as we interact with them. Changes in illumination (shadows), object pose, or movement of nonrigid objects can drastically alter available image features. How do biological visual systems track objects as they change? It may involve specific attentional mechanisms for reasoning about the locations of objects independently of their appearances -- a capability that prominent neuroscientific theories have associated with computing through neural synchrony. We computationally test the hypothesis that the implementation of visual attention through neural synchrony underlies the ability of biological visual systems to track objects that change in appearance over time. We first introduce a novel deep learning circuit that can learn to precisely control attention to features separately from their location in the world through neural synchrony: the complex-valued recurrent neural network (CV-RNN). Next, we compare object tracking in humans, the CV-RNN, and other deep neural networks (DNNs), using FeatureTracker: a large-scale challenge that asks observers to track objects as their locations and appearances change in precisely controlled ways. While humans effortlessly solved FeatureTracker, state-of-the-art DNNs did not. In contrast, our CV-RNN behaved similarly to humans on the challenge, providing a computational proof-of-concept for the role of phase synchronization as a neural substrate for tracking appearance-morphing objects as they move about.
- Published
- 2024
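For the entry above, a toy complex-valued recurrent cell illustrating the idea of computing with phases: the magnitude carries feature evidence while the phase can bind features to objects. A hypothetical minimal unit, not the paper's CV-RNN.

    import torch
    import torch.nn as nn

    class ComplexRNNCell(nn.Module):
        def __init__(self, in_dim, hid_dim):
            super().__init__()
            self.Wx = nn.Linear(in_dim, hid_dim).to(torch.cfloat)   # complex weights
            self.Wh = nn.Linear(hid_dim, hid_dim).to(torch.cfloat)

        def forward(self, x, h):
            z = self.Wx(x) + self.Wh(h)
            # Split nonlinearity: squash the magnitude, preserve the phase
            # (the phase acts as the binding/synchrony signal).
            mag, phase = torch.abs(z), torch.angle(z)
            return torch.tanh(mag) * torch.exp(1j * phase)

    cell = ComplexRNNCell(8, 16)
    h = torch.zeros(1, 16, dtype=torch.cfloat)
    x = torch.randn(1, 8, dtype=torch.cfloat)
    h = cell(x, h)   # entries whose phases synchronize tag the same object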
41. Radial Evolution of ICME-Associated Particle Acceleration Observed by Solar Orbiter and ACE
- Author
- Walker, Malik H., Allen, Robert C., Li, Gang, Ho, George C., Mason, Glenn M., Rodriguez-Pacheco, Javier, Wimmer-Schweingruber, Robert F., and Kouloumvakos, Athanasios
- Subjects
Astrophysics - Solar and Stellar Astrophysics ,Physics - Space Physics - Abstract
On 2022 March 10, a coronal mass ejection (CME) erupted from the Sun, resulting in Solar Orbiter observations at 0.45 au of both dispersive solar energetic particles arriving prior to the interplanetary CME (ICME) and locally accelerated particles near the ICME-associated shock structure as it passed the spacecraft on 2022 March 11. This shock was later detected on 2022 March 14 by the Advanced Composition Explorer (ACE), which was radially aligned with Solar Orbiter, at 1 au. Ion composition data from both spacecraft -- via the Solar Orbiter Energetic Particle Detector/ Suprathermal Ion Spectrograph (EPD/SIS) and the Ultra Low Energy Isotope Spectrometer (ULEIS) on ACE -- allows for in-depth analysis of the radial evolution of species-dependent ICME shock-associated acceleration processes for this event. We present a study of the ion spectra observed at 0.45 and 1 au during both the gradual solar energetic particle (SEP) and energetic storm particle (ESP) phases of the event. We find that the shapes of the spectra seen at each spacecraft have significant differences that were likely caused by varying shock geometry: Solar Orbiter spectra tend to lack spectral breaks, and the higher energy portions of the ACE spectra have comparable average flux to the Solar Orbiter spectra. Through an analysis of rigidity effects on the spectral breaks observed by ACE, we conclude that the 1 au observations were largely influenced by a suprathermal pool of $\mathrm{He}^{+}$ ions that were enhanced due to propagation along a stream interaction region (SIR) that was interacting with the ICME at times of observation., Comment: 15 pages, 7 figures, submitted to A&A
- Published
- 2024
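For the entry above, the standard form such spectral-break analyses use, sketched in LaTeX (illustrative of the SEP literature, not the paper's fitted parameters): a double power law in energy whose break scales with rigidity via the charge-to-mass ratio Q/A.

    % Double power law with a species-dependent spectral break:
    \frac{dJ}{dE} \;\propto\; E^{-\gamma_a}\left[1 + \left(\frac{E}{E_B}\right)^{(\gamma_b-\gamma_a)/\alpha}\right]^{-\alpha},
    \qquad E_B \;\propto\; \left(\frac{Q}{A}\right)^{\delta}
    % so species with smaller Q/A break at lower energies per nucleon.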
42. Augmentation through Laundering Attacks for Audio Spoof Detection
- Author
- Ali, Hashim, Subramani, Surya, and Malik, Hafiz
- Subjects
Electrical Engineering and Systems Science - Audio and Speech Processing ,Computer Science - Artificial Intelligence ,Computer Science - Sound - Abstract
Recent text-to-speech (TTS) developments have made voice cloning (VC) more realistic, affordable, and easily accessible. This has given rise to many potential abuses of the technology, including Joe Biden's New Hampshire deepfake robocall. Several methodologies have been proposed to detect such clones. However, these methodologies have been trained and evaluated on relatively clean databases. Recently, the ASVspoof 5 Challenge introduced a new crowd-sourced database of diverse acoustic conditions, including various spoofing attacks and codec conditions. This paper is our submission to the ASVspoof 5 Challenge and aims to investigate the performance of audio spoof detection, trained using data augmentation through laundering attacks, on the ASVspoof 5 database. The results demonstrate that our system performs worst on the A18, A19, A20, A26, and A30 spoofing attacks and in the codec and compression conditions of C08, C09, and C10.
- Published
- 2024
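For the entry above, a hedged sketch of laundering-style augmentation (illustrative transforms, not the paper's exact pipeline): degrading audio with additive noise and a telephone-band resampling round trip before training the detector.

    import numpy as np
    from scipy.signal import resample_poly

    def launder(wave, sr, snr_db=20.0):
        """Additive noise at a target SNR, then an 8 kHz round trip."""
        noise = np.random.randn(len(wave))
        noise *= np.sqrt(wave.var() / (noise.var() * 10 ** (snr_db / 10)))
        noisy = wave + noise
        down = resample_poly(noisy, 8000, sr)     # down to telephone band
        return resample_poly(down, sr, 8000).astype(np.float32)

    sr = 16000
    wave = np.random.randn(sr).astype(np.float32)   # stand-in for a 1 s utterance
    augmented = launder(wave, sr)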
43. Controlling sharpness, SNR and SAR for 3D FSE at 7T by end-to-end learning
- Author
- Dawood, Peter, Blaimer, Martin, Herrler, Jürgen, Liebig, Patrick, Weinmüller, Simon, Malik, Shaihan, Jakob, Peter M., and Zaiss, Moritz
- Subjects
Physics - Medical Physics ,Computer Science - Machine Learning ,Electrical Engineering and Systems Science - Image and Video Processing ,Electrical Engineering and Systems Science - Systems and Control - Abstract
Purpose: To non-heuristically identify dedicated variable flip angle (VFA) schemes optimized for the point-spread function (PSF) and signal-to-noise ratio (SNR) of multiple tissues in 3D FSE sequences with very long echo trains at 7T. Methods: The proposed optimization considers predefined SAR constraints and target contrast using an end-to-end learning framework. The cost function integrates components for contrast fidelity (SNR) and a penalty term to minimize image blurring (PSF) for multiple tissues. By adjusting the weights of the PSF/SNR cost-function components, PSF- and SNR-optimized VFAs were derived and tested in vivo using both the open-source Pulseq standard on two volunteers and vendor protocols on a 7T MRI system with parallel transmit extension on three volunteers. Results: PSF-optimized VFAs resulted in significantly reduced image blurring compared to standard VFAs for T2w while maintaining contrast fidelity. Small white and gray matter structures, as well as blood vessels, are more visible with PSF-optimized VFAs. Quantitative analysis shows that the optimized VFA yields 50% less deviation from a sinc-like reference PSF than the standard VFA. The SNR-optimized VFAs yielded images with significantly improved SNR in a white and gray matter region relative to standard (81.2±18.4 vs. 41.2±11.5) as a trade-off for elevated image blurring. Conclusion: This study demonstrates the potential of end-to-end learning frameworks to optimize VFA schemes in very long echo trains for 3D FSE acquisition at 7T in terms of PSF and SNR. It paves the way for fast and flexible adjustment of the trade-off between PSF and SNR for 3D FSE., Comment: Submitted to Magnetic Resonance in Medicine for peer-review
- Published
- 2024
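For the entry above, a schematic of a weighted cost of the kind the abstract describes; the tensors, reference-PSF placement, and weights are assumptions, not the paper's loss.

    import torch

    def vfa_cost(psf, signal, target_signal, w_psf=1.0, w_snr=1.0):
        """psf: (N,) sampled point-spread function; signal: (T,) per-tissue means.
        All inputs are hypothetical stand-ins for quantities simulated from a
        learnable flip-angle train."""
        ideal = torch.zeros_like(psf)
        ideal[0] = 1.0                                   # sharp reference PSF (schematic)
        blur_penalty = ((psf - ideal) ** 2).sum()        # penalize side lobes / blurring
        fidelity = ((signal - target_signal) ** 2).sum() # contrast/SNR term
        return w_psf * blur_penalty + w_snr * fidelity

    psf = torch.tensor([0.9, 0.05, 0.03, 0.02])          # hypothetical PSF samples
    sig = torch.tensor([0.4, 0.7])                       # hypothetical tissue signals
    print(vfa_cost(psf, sig, torch.tensor([0.5, 0.8])))
    # An SAR budget could be enforced by penalizing or projecting sum(alpha**2)
    # of the flip-angle train after each optimizer step (assumed handling).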
44. Optimal Infinite-Horizon Mixed $\mathit{H}_2/\mathit{H}_\infty$ Control
- Author
- Malik, Vikrant, Kargin, Taylan, Hajar, Joudi, and Hassibi, Babak
- Subjects
Mathematics - Optimization and Control ,Electrical Engineering and Systems Science - Systems and Control - Abstract
We study the problem of mixed $\mathit{H}_2/\mathit{H}_\infty$ control in the infinite-horizon setting. We identify the optimal causal controller that minimizes the $\mathit{H}_2$ cost of the closed-loop system subject to an $\mathit{H}_\infty$ constraint. Megretski proved that the optimal mixed $\mathit{H}_2/\mathit{H}_\infty$ controller is non-rational whenever the constraint is active without giving an explicit construction of the controller. In this work, we provide the first exact closed-form solution to the infinite-horizon mixed $\mathit{H}_2/\mathit{H}_\infty$ control in the frequency domain. While the optimal controller is non-rational, our formulation provides a finite-dimensional parameterization of the optimal controller. Leveraging this fact, we introduce an efficient iterative algorithm that finds the optimal causal controller in the frequency domain. We show that this algorithm is convergent when the system is scalar and present numerical evidence for exponential convergence of the proposed algorithm. Finally, we show how to find the best (in $\mathit{H}_\infty$ norm) fixed-order rational approximations of the optimal mixed $\mathit{H}_2/\mathit{H}_\infty$ controller and study its performance., Comment: Accepted for presentation at the 60th Annual Allerton Conference on Communication, Control, and Computing (Allerton) 2024
- Published
- 2024
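For the entry above, the underlying constrained problem stated schematically in LaTeX (consistent with the abstract, not the paper's notation):

    % Minimize the H2 cost over causal stabilizing controllers, subject to an
    % H-infinity bound on the same closed-loop operator T_zw:
    \min_{K\ \mathrm{causal}}\ \|T_{zw}(K)\|_{\mathcal{H}_2}^{2}
    \quad \text{s.t.} \quad \|T_{zw}(K)\|_{\mathcal{H}_\infty} \le \gamma
    % T_zw(K) is the closed-loop map from disturbance w to regulated output z.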
45. Randomness from Radiation: Evaluation and Analysis of Radiation-Based Random Number Generators
- Author
- Zafar, Roohi, Kamran, Muhammad, Malik, Tahir, Karera, Kashish, Tariq, Humayon, Mustafa, Ghulam, and Khan, Muhammad Mubashir
- Subjects
Quantum Physics - Abstract
Random numbers are central to applications such as secure communications, quantum key distribution (QKD), statistics, and other tasks. One of today's most popular classes of generator is the quantum random number generator (QRNG). The inherent randomness and true unpredictability of quantum mechanics make it possible to construct QRNGs that are more accurate and useful than traditional random number generators, and several QRNGs based on different quantum mechanical principles have already been designed. The primary focus of this paper is the generation and analysis of quantum random numbers based on radioactive decay. In the experimental setup, two beta-active radioactive sources, cobalt-60 (Co-60) and strontium-90 (Sr-90), and an ST-360 counter with a Geiger-Muller (GM) tube are used to record the counts. The recorded data were then self-tested with entropy and frequency measurements. Moreover, the popular National Institute of Standards and Technology (NIST) randomness test suite is used to ensure that the generated randomness meets security standards. The research examines the impact of the nature of the radioactive source, the distance between the counter and the sources, and the recording time of the counts on the quantum random numbers generated by radioactive QRNGs.
- Published
- 2024
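For the entry above, an illustrative sketch (an assumed extraction method, not necessarily the paper's) of turning decay timestamps into bits: compare successive inter-arrival intervals, then von Neumann debias the raw stream. Simulated exponential arrivals stand in for GM-tube data.

    import numpy as np

    rng = np.random.default_rng(0)                       # stand-in for GM-tube data
    arrivals = np.cumsum(rng.exponential(1.0, 10_000))   # simulated decay timestamps
    gaps = np.diff(arrivals)
    gaps = gaps[: len(gaps) // 2 * 2]                    # even number of intervals
    raw = (gaps[0::2] > gaps[1::2]).astype(int)          # one raw bit per interval pair

    def von_neumann(bits):
        """Debias: map 01 -> 0, 10 -> 1, discard 00 and 11."""
        return [a for a, b in zip(bits[0::2], bits[1::2]) if a != b]

    unbiased = von_neumann(list(raw))
    print(len(raw), len(unbiased), unbiased[:16])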
46. Beyond Fundamental Building Blocks: Plasticity in Structurally Complex Crystals
- Author
- Stollenwerk, Tobias, Huckfeldt, Pia Carlotta, Ulumuddin, Nisa Zakia Zahra, Schneider, Malik, Xie, Zhuocheng, and Korte-Kerzel, Sandra
- Subjects
Condensed Matter - Materials Science - Abstract
Intermetallics, which encompass a wide range of compounds, often exhibit similar or closely related crystal structures, resulting in various intermetallic systems with structurally derivative phases. This study examines the hypothesis that deformation behavior can be transferred from fundamental building blocks to structurally related phases using the binary samarium-cobalt system. We investigate SmCo$_2$ and SmCo$_5$ as fundamental building blocks and compare them to the structurally related SmCo$_3$ and Sm$_2$Co$_{17}$ phases. Nanoindentation and micropillar compression tests were performed to characterize the primary slip systems, complemented by generalized stacking fault energy calculations via atomic-scale modeling. Our results show that while elastic properties of the structurally complex phases follow a rule of mixtures, their plastic deformation mechanisms are more intricate, influenced by the stacking and bonding nature within the crystal's building blocks. These findings underscore the importance of local bonding environments in predicting the mechanical behavior of structurally related intermetallics, providing crucial insights for the development of high-performance intermetallic materials.
- Published
- 2024
47. EfficientCrackNet: A Lightweight Model for Crack Segmentation
- Author
- Zim, Abid Hasan, Iqbal, Aquib, Al-Huda, Zaid, Malik, Asad, and Kuribayash, Minoru
- Subjects
Computer Science - Computer Vision and Pattern Recognition ,Computer Science - Artificial Intelligence - Abstract
Crack detection, particularly from pavement images, presents a formidable challenge in the domain of computer vision due to several inherent complexities such as intensity inhomogeneity, intricate topologies, low contrast, and noisy backgrounds. Automated crack detection is crucial for maintaining the structural integrity of essential infrastructure, including buildings, pavements, and bridges. Existing lightweight methods often face challenges including computational inefficiency, complex crack patterns, and difficult backgrounds, leading to inaccurate detection and impracticality for real-world applications. To address these limitations, we propose EfficientCrackNet, a lightweight hybrid model combining Convolutional Neural Networks (CNNs) and transformers for precise crack segmentation. EfficientCrackNet integrates depthwise separable convolution (DSC) layers and a MobileViT block to capture both global and local features. The model employs an Edge Extraction Method (EEM) for efficient crack edge detection without pretraining, and an Ultra-Lightweight Subspace Attention Module (ULSAM) to enhance feature extraction. Extensive experiments on three benchmark datasets (Crack500, DeepCrack, and GAPs384) demonstrate that EfficientCrackNet achieves superior performance compared to existing lightweight models, while requiring only 0.26M parameters and 0.483 GFLOPs. The proposed model offers an optimal balance between accuracy and computational efficiency, outperforming state-of-the-art lightweight models and providing a robust and adaptable solution for real-world crack segmentation.
- Published
- 2024
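For the entry above, a generic depthwise separable convolution block of the kind the abstract says EfficientCrackNet uses (channel counts are assumptions): a per-channel 3x3 spatial filter followed by a 1x1 pointwise mix, which needs far fewer parameters than a dense 3x3 convolution.

    import torch
    import torch.nn as nn

    class DSCBlock(nn.Module):
        def __init__(self, in_ch, out_ch):
            super().__init__()
            # groups=in_ch makes the 3x3 convolution act per channel (depthwise).
            self.depthwise = nn.Conv2d(in_ch, in_ch, 3, padding=1, groups=in_ch)
            self.pointwise = nn.Conv2d(in_ch, out_ch, 1)   # 1x1 channel mixing
            self.act = nn.ReLU()

        def forward(self, x):
            return self.act(self.pointwise(self.depthwise(x)))

    x = torch.randn(1, 16, 64, 64)
    print(DSCBlock(16, 32)(x).shape)   # torch.Size([1, 32, 64, 64])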
48. Real-time fetAl brain and placental T2* mapping at 0.55T low-field MRI (RAT)
- Author
- Verdera, Jordina Aviles, Silva, Sara Neves, Tomi-Tricot, Raphael, Hall, Megan, Story, Lisa, Malik, Shaihan J, Hajnal, Joseph V, Rutherford, Mary A, and Hutter, Jana
- Subjects
Physics - Medical Physics - Abstract
Purpose: To provide real-time quantitative organ-specific information - specifically placental and brain T2* - to allow optimization of the MR examination for the individual patient. Methods: A FIRE-based real-time setup was implemented that segments the placenta and fetal brain, performs T2* fitting and analysis, and calculates the centile, all in real time. For automatic segmentation, an nnU-Net was trained and tested on 2989 datasets for the fetal brain, and a second one was trained on 210 datasets for the placenta. T2* normal curves were obtained from 106 cases, and prospective evaluation was performed on 10 cases between 35 and 39 weeks GA. Results: Quantitative brain and placental T2* maps and centiles were available in all prospective cases within 30 seconds. The robustness of the method was shown with intra-scan repeats (mean difference 1.04±12.39 ms for fetal brain and -3.15±8.88 ms for placenta) and direct validation against vendor-processed offline results (mean difference 1.62±4.33 ms for fetal brain and 0.16±6.19 ms for placenta). Discussion and Conclusion: Organ-specific quantitative information available in real time enables more personalized MR examinations and selection of the most pertinent sequences, promising reduced recalls and specific insights into tissue properties. Placental T2*, enabled here in real time and demonstrated in multiple recent studies to be a biomarker sensitive to a range of pregnancy complications, and fetal brain T2* will be explored in further studies of pregnancies with pre-eclampsia and growth restriction, as a way of enabling future MR-guided fetal interventions.
- Published
- 2024
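For the entry above, a sketch of the per-voxel T2* fitting step: a standard log-linear fit of the mono-exponential decay S(TE) = S0 * exp(-TE/T2*). The echo times and values are made up; this is not the paper's pipeline code.

    import numpy as np

    TE = np.array([20., 50., 80., 110., 140.])       # ms, assumed echo times
    true_T2s, S0 = 60.0, 1000.0
    S = S0 * np.exp(-TE / true_T2s)                  # noiseless synthetic decay

    # log S = log S0 - TE / T2*, so a degree-1 polyfit recovers both parameters.
    slope, intercept = np.polyfit(TE, np.log(S), 1)
    print(f"T2* = {-1.0 / slope:.1f} ms, S0 = {np.exp(intercept):.0f}")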
49. A Multi-Dataset Classification-Based Deep Learning Framework for Electronic Health Records and Predictive Analysis in Healthcare
- Author
- Malik, Syed Mohd Faisal, Nafis, Md Tabrez, Ahad, Mohd Abdul, and Tanweer, Safdar
- Subjects
Computer Science - Artificial Intelligence - Abstract
In contemporary healthcare, electronic health records have become invaluable repositories of protected patient data, creating vast opportunities to leverage deep learning techniques for predictive analysis. The integration of deep learning techniques for classifying diverse datasets has shown promising results for retinal fundus images, cirrhosis stages, and heart disease diagnostic predictions. This study proposes a novel deep learning predictive analysis framework for classifying multiple datasets by pre-processing data from three distinct sources. A hybrid deep learning model combining Residual Networks and Artificial Neural Networks is proposed to detect acute and chronic diseases such as heart diseases, cirrhosis, and retinal conditions, outperforming existing models. Dataset preparation involves aspects such as categorical data transformation, dimensionality reduction, and missing data synthesis. Feature extraction is effectively performed using scaler transformation for categorical datasets and the ResNet architecture for image datasets. The resulting features are integrated into a unified classification model. Rigorous experimentation and evaluation resulted in high accuracies of 93%, 99%, and 95% for retinal fundus images, cirrhosis stages, and heart disease diagnostic predictions, respectively. The efficacy of the proposed method is demonstrated through a detailed analysis of F1-score, precision, and recall metrics. This study offers a comprehensive exploration of methodologies and experiments, providing in-depth knowledge of deep learning predictive analysis in electronic health records.
- Published
- 2024
50. GSplatLoc: Grounding Keypoint Descriptors into 3D Gaussian Splatting for Improved Visual Localization
- Author
- Sidorov, Gennady, Mohrat, Malik, Lebedeva, Ksenia, Rakhimov, Ruslan, and Kolyubin, Sergey
- Subjects
Computer Science - Computer Vision and Pattern Recognition ,Computer Science - Artificial Intelligence ,Computer Science - Machine Learning ,Computer Science - Robotics - Abstract
Although various visual localization approaches exist, such as scene coordinate and pose regression, these methods often struggle with high memory consumption or extensive optimization requirements. To address these challenges, we utilize recent advancements in novel view synthesis, particularly 3D Gaussian Splatting (3DGS), to enhance localization. 3DGS allows for the compact encoding of both 3D geometry and scene appearance with its spatial features. Our method leverages the dense description maps produced by XFeat's lightweight keypoint detection and description model. We propose distilling these dense keypoint descriptors into 3DGS to improve the model's spatial understanding, leading to more accurate camera pose predictions through 2D-3D correspondences. After estimating an initial pose, we refine it using a photometric warping loss. Benchmarking on popular indoor and outdoor datasets shows that our approach surpasses state-of-the-art Neural Render Pose (NRP) methods, including NeRFMatch and PNeRFLoc., Comment: Project website at https://gsplatloc.github.io/
- Published
- 2024
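For the entry above, a sketch of the pose-from-correspondences step the abstract relies on, using standard PnP + RANSAC from OpenCV on synthetic 2D-3D matches; the data are placeholders, not GSplatLoc outputs.

    import numpy as np
    import cv2

    # Descriptor-matched 3D points (e.g., Gaussian centers) and a toy camera.
    pts3d = np.random.rand(100, 3).astype(np.float32)
    K = np.array([[500, 0, 320], [0, 500, 240], [0, 0, 1]], dtype=np.float32)
    rvec_gt = np.array([[0.1], [0.2], [0.05]], dtype=np.float32)
    tvec_gt = np.array([[0.0], [0.0], [4.0]], dtype=np.float32)
    pts2d, _ = cv2.projectPoints(pts3d, rvec_gt, tvec_gt, K, None)  # synthetic 2D matches

    ok, rvec, tvec, inliers = cv2.solvePnPRansac(pts3d, pts2d, K, None)
    print(ok, rvec.ravel(), tvec.ravel())   # recovers the ground-truth pose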