Cite
Leaky ReLUs That Differ in Forward and Backward Pass Facilitate Activation Maximization in Deep Neural Networks
MLA
Linse, Christoph, et al. Leaky ReLUs That Differ in Forward and Backward Pass Facilitate Activation Maximization in Deep Neural Networks. 2024. EBSCOhost, https://doi.org/10.1109/IJCNN60899.2024.10650881.
APA
Linse, C., Barth, E., & Martinetz, T. (2024). Leaky ReLUs That Differ in Forward and Backward Pass Facilitate Activation Maximization in Deep Neural Networks. https://doi.org/10.1109/IJCNN60899.2024.10650881
Chicago
Linse, Christoph, Erhardt Barth, and Thomas Martinetz. 2024. “Leaky ReLUs That Differ in Forward and Backward Pass Facilitate Activation Maximization in Deep Neural Networks.” https://doi.org/10.1109/IJCNN60899.2024.10650881.