Article

Peering inside the black box by learning the relevance of many-body functions in neural network potentials
This paper extends tools recently proposed in the nascent field of explainable artificial intelligence, such as Layer-wise Relevance Propagation, to coarse-grained potentials based on graph neural networks.
Deterministic versus stochastic dynamical classifiers: opposing random adversarial attacks with noise
This article compares deterministic and stochastic dynamical classifiers in the context of random adversarial attacks, showing how injected noise can be leveraged to mitigate such threats.
Kernel shape renormalization explains output-output correlations in finite Bayesian one-hidden-layer networks
Finite-width one-hidden-layer networks display nontrivial output-output correlations that vanish in the lazy-training infinite-width limit. This manuscript rationalizes this evidence using kernel shape renormalization in the proportional limit of Bayesian deep learning.
Complex Recurrent Spectral Network
The Complex Recurrent Spectral Network (C-RSN) is a novel AI model that more accurately mimics biological neural processes using localized non-linearity, complex eigenvalues, and separated memory/input functionalities. It demonstrates dynamic, oscillatory behavior akin to biological cognition and effectively classifies data, as shown in tests with the MNIST dataset.