1. Optimising network interactions through device agnostic models
- Author
Manneschi, Luca, Vidamour, Ian T., Stenning, Kilian D., Gartside, Jack C., Swindells, Charles, Venkat, Guru, Griffin, David, Stepney, Susan, Branford, Will R., Hayward, Thomas, Ellis, Matt O., and Vasilaki, Eleni
- Abstract
Physically implemented neural networks hold the potential to achieve the performance of deep learning models by exploiting the innate physical properties of devices as computational tools. This exploration of physical processes for computation also requires considering their intrinsic dynamics, which can serve as valuable resources for processing information. However, existing computational methods are unable to extend the success of deep learning techniques to parameters influencing device dynamics, which often lack a precise mathematical description. In this work, we formulate a universal framework to optimise interactions with dynamic physical systems in a fully data-driven fashion. The framework adopts neural stochastic differential equations as differentiable digital twins, effectively capturing both the deterministic and stochastic behaviours of devices. Differentiating through the trained models provides the mathematical estimates essential for optimising a physical neural network, harnessing the intrinsic temporal computation abilities of its physical nodes. To accurately model the behaviours of real devices, we formulate neural-SDE variants that can operate under a variety of experimental settings. Our work demonstrates the framework's applicability through simulations and physical implementations of interacting dynamic devices, while highlighting the importance of accurately capturing system stochasticity for the successful deployment of a physically defined neural network.
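The core idea of the abstract can be illustrated with a minimal sketch: a stochastic surrogate ("digital twin") of a device is integrated with the Euler-Maruyama scheme, and a control input is optimised through the surrogate. This is not the paper's implementation: the drift/diffusion networks here are tiny fixed random maps rather than networks trained on device data, and a central-difference estimate with backtracking stands in for automatic differentiation through the trained model. All names (`drift`, `diffusion`, `simulate`, `loss`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "trained" surrogate dx = f(x, u) dt + g(x) dW,
# with f and g small fixed random networks for illustration only.
W1 = rng.normal(size=(2, 8))
W2 = rng.normal(size=(8, 1))

def drift(x, u):
    # deterministic part of the device surrogate
    h = np.tanh(np.array([x, u]) @ W1)
    return float(h @ W2)

def diffusion(x):
    # state-dependent noise amplitude, capturing device stochasticity
    return 0.05 * (1.0 + np.tanh(x))

def simulate(u, steps=200, dt=0.01, seed=1):
    # Euler-Maruyama integration with common random numbers,
    # so the loss is a deterministic, smooth function of u
    r = np.random.default_rng(seed)
    x = 0.0
    for _ in range(steps):
        x += drift(x, u) * dt + diffusion(x) * np.sqrt(dt) * r.normal()
    return x

def loss(u, target=0.5):
    # mismatch between the mean simulated output and a target response,
    # averaged over a few fixed noise realisations
    outs = [simulate(u, seed=s) for s in range(8)]
    return (np.mean(outs) - target) ** 2

# Optimise the control input u through the surrogate. The paper
# differentiates through the trained neural SDE automatically; here a
# central-difference gradient with backtracking line search stands in.
u, eps = 0.0, 1e-3
for _ in range(20):
    g = (loss(u + eps) - loss(u - eps)) / (2 * eps)
    step = 0.5
    while step > 1e-6:
        cand = u - step * g
        if loss(cand) < loss(u):
            u = cand
            break
        step /= 2

print(f"optimised u = {u:.4f}, loss {loss(0.0):.4f} -> {loss(u):.4f}")
```

Because the noise realisations are held fixed (common random numbers), the optimisation is well posed; the backtracking step guarantees the loss never increases, mirroring how gradients through a differentiable twin let standard optimisers tune parameters of an otherwise non-differentiable physical system.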
- Published
- 2024