Theory
Machine learning in spectral domain
We introduce a new method for training deep neural networks that operates in spectral space rather than the traditional space of nodes. The method adjusts the eigenvalues and eigenvectors of the transfer operators linking consecutive layers, and delivers improved performance over standard training with an equivalent number of parameters.
Lorenzo Giambagli, Lorenzo Buffoni, Timoteo Carletti, Walter Nocentini, Duccio Fanelli
PDF
Cite
Source Document
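The spectral parametrization described in the abstract can be illustrated with a minimal numpy sketch. This is a simplified square-layer setting, not the paper's exact block-structured transfer operators: the layer matrix is built from its eigendecomposition, so training can act on the eigenvalues `lam` (and optionally the eigenvectors `Phi`) instead of the N×N weight entries directly. All variable names here are illustrative.

```python
import numpy as np

# Hedged sketch: a square "spectral" layer. The weight matrix is
# parametrized by its eigendecomposition W = Phi @ diag(lam) @ inv(Phi);
# optimization then targets the spectral quantities `lam` and `Phi`
# rather than the individual entries of W.

rng = np.random.default_rng(0)
N = 4

# Random invertible eigenvector matrix (an assumption; the paper uses a
# structured block form) and a set of trainable eigenvalues.
Phi = rng.standard_normal((N, N))
lam = rng.standard_normal(N)

# Reconstruct the layer matrix from its spectral parameters.
W = Phi @ np.diag(lam) @ np.linalg.inv(Phi)

x = rng.standard_normal(N)
y = W @ x  # forward pass of the spectrally parametrized layer

# By construction, the eigenvalues of W are exactly `lam`.
print(np.allclose(np.sort(np.linalg.eigvals(W).real), np.sort(lam)))
```

Gradient descent on `lam` and `Phi` (e.g. via any autodiff framework) then trains the network in the spectral domain while the forward pass stays an ordinary matrix product.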
Training of sparse and dense deep neural networks: Fewer parameters, same performance
This study presents a variant of spectral learning for deep neural networks in which adjusting two sets of eigenvalues for each layer mapping significantly enhances network performance with fewer trainable parameters. The method, inspired by homeostatic plasticity, offers a computationally efficient alternative to conventional training, achieving comparable results with a far smaller parameter budget. It also enables the construction of sparser networks that retain strong classification performance.
Lorenzo Chicchi, Lorenzo Giambagli, Lorenzo Buffoni, Timoteo Carletti, Marco Ciavarella, Duccio Fanelli
PDF
Cite
Source Document
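The parameter saving behind the "fewer parameters, same performance" claim can be sketched as follows. This is an illustrative back-of-the-envelope calculation, not the paper's exact accounting: for an N-node square mapping, a dense layer carries N² trainable weights, while the eigenvalue-only variant trains on the order of N parameters per eigenvalue set. The pruning step below shows how zeroing small eigenvalues yields a low-rank (sparse) operator.

```python
import numpy as np

# Hedged parameter count for one N-to-N layer mapping (illustrative):
# a dense layer trains every weight entry, whereas the spectral variant
# trains two sets of N eigenvalues with the eigenvectors held fixed.
N = 100
dense_params = N * N       # standard dense layer: N^2 trainable weights
spectral_params = 2 * N    # two eigenvalue sets per mapping (assumption)
print(dense_params, spectral_params)  # 10000 vs 200

# Sparsification: discarding small eigenvalues produces a low-rank
# operator without retraining the full weight matrix.
rng = np.random.default_rng(1)
Phi = np.linalg.qr(rng.standard_normal((N, N)))[0]  # fixed orthogonal eigenvectors
lam = rng.standard_normal(N)
lam_pruned = np.where(np.abs(lam) > 1.0, lam, 0.0)  # keep only the large modes
W_pruned = Phi @ np.diag(lam_pruned) @ Phi.T

# The rank of the pruned operator equals the number of surviving eigenvalues.
print(np.linalg.matrix_rank(W_pruned) == np.count_nonzero(lam_pruned))
```

The same idea underlies the sparse networks mentioned in the abstract: ranking layers' eigenvalues by magnitude gives a natural pruning criterion with a single threshold hyperparameter.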