Learning in Wilson-Cowan model for metapopulation
This research introduces a learning algorithm based on the Wilson-Cowan model for metapopulation, a neural mass network model that treats different subcortical regions of the brain as connected nodes. The model incorporates stable attractors into its dynamics, enabling it to solve various classification tasks. The algorithm is tested on datasets such as MNIST, Fashion MNIST, CIFAR-10, and TF-FLOWERS, as well as in combination with a transformer architecture (BERT) on IMDB, achieving high classification accuracy.
Raffaele Marino, Lorenzo Buffoni, Lorenzo Chicchi, Francesca Di Patti, Diego Febbe, Lorenzo Giambagli, Duccio Fanelli
Cite
Source Document
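The dynamical ingredient of the model can be illustrated with a minimal sketch of Wilson-Cowan-type rate dynamics on a network of coupled nodes. The coupling matrix, gain, and time constants below are illustrative placeholders, not the paper's trained model:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def wilson_cowan_step(r, W, dt=0.01, tau=1.0, mu=1.0):
    # One Euler step of dr/dt = -r + (1 - r) * S(mu * W r),
    # a Wilson-Cowan-type rate equation; activity stays in [0, 1].
    return r + dt / tau * (-r + (1.0 - r) * sigmoid(mu * W @ r))

rng = np.random.default_rng(0)
n = 5
W = rng.normal(size=(n, n))          # illustrative coupling between regions
r = rng.uniform(0.1, 0.9, size=n)    # initial firing rates
for _ in range(5000):
    r = wilson_cowan_step(r, W)
print(r.round(3))  # relaxed activity pattern (an attractor of the dynamics)
```

In the paper's setting, such attractors are shaped by training so that each class of inputs relaxes onto its own stable activity pattern.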
Topology shapes dynamics of higher-order networks
This research explores how higher-order interactions in complex systems influence the dynamics of topological signals, revealing new insights into the interplay between topology and dynamics.
Ana P. Millán, Hanlin Sun, Lorenzo Giambagli, Riccardo Muolo, Timoteo Carletti, Joaquín J. Torres, Filippo Radicchi, Jürgen Kurths, Ginestra Bianconi
PDF
Cite
Source Document
Global topological Dirac synchronization
This research introduces Global Topological Dirac Synchronization, a state where oscillators associated with simplices and cells of arbitrary dimension, coupled by the Topological Dirac operator, operate in unison. The study combines algebraic topology, non-linear dynamics, and machine learning to derive the conditions for the existence and stability of this synchronization state.
Timoteo Carletti, Lorenzo Giambagli, Riccardo Muolo, Ginestra Bianconi
Cite
Source Document
Turing patterns on discrete topologies
This research explores Turing patterns on discrete topologies, extending the classical theory of pattern formation to networks and higher-order structures. The study highlights the potential of this approach to transcend the conventional boundaries of PDE-based methods, offering insights into self-organization phenomena across various disciplines.
Riccardo Muolo, Lorenzo Giambagli, Hiroya Nakao, Duccio Fanelli, Timoteo Carletti
Cite
Source Document
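The network extension of Turing's analysis can be sketched by replacing the continuous wavenumber with the eigenvalues of a graph Laplacian: a mode grows when the Jacobian, corrected by diffusion along that mode, acquires a positive eigenvalue. The reaction Jacobian and diffusion coefficients below are illustrative, not taken from the paper:

```python
import numpy as np

# Jacobian of a generic activator-inhibitor reaction at the homogeneous
# fixed point (illustrative values: stable in the absence of diffusion)
J = np.array([[1.0, -2.0],
              [2.0, -3.0]])
Du, Dv = 0.05, 2.0   # diffusion coefficients (the inhibitor diffuses faster)

# Laplacian of a ring network with N nodes
N = 20
A = np.zeros((N, N))
for i in range(N):
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Dispersion relation: the growth rate of mode alpha is the largest real
# part of the eigenvalues of J - diag(Du, Dv) * Lambda_alpha
eigs = np.linalg.eigvalsh(L)
growth = [np.max(np.real(np.linalg.eigvals(J - np.diag([Du, Dv]) * lam)))
          for lam in eigs]
print(max(growth) > 0)  # True: a non-uniform mode grows, seeding a pattern
```

The homogeneous mode (Laplacian eigenvalue 0) remains stable, so the instability is genuinely diffusion-driven.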
Global topological synchronization on simplicial and cell complexes
This research explores the global synchronization of topological signals on higher-order networks, revealing that topological constraints impact synchronization differently across various network structures.
Timoteo Carletti, Lorenzo Giambagli, Ginestra Bianconi
PDF
Cite
Source Document
Non-parametric analysis of the Hubble Diagram with Neural Networks
This study introduces a neural network-based method for non-parametric analysis of the Hubble diagram, extended to high redshifts. Validated on simulated data, the method agrees with a flat ΛCDM (Λ cold dark matter) model (ΩM ≈ 0.3) up to z ≈ 1-1.5 but deviates at higher redshifts. It also finds ΩM increasing with redshift, hinting at a possible evolution of dark energy.
Lorenzo Giambagli, Duccio Fanelli, Guido Risaliti, Matilde Signorini
PDF
Cite
Project
Source Document
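The reference curve such a reconstruction is compared against can be computed directly. A minimal sketch of the flat ΛCDM distance modulus, with illustrative ΩM and H0 values (the exact fit values are in the paper):

```python
import numpy as np

def distance_modulus(z, Om=0.3, H0=70.0):
    """Distance modulus mu(z) in a flat LambdaCDM model (Omega_L = 1 - Omega_M)."""
    c = 299792.458  # speed of light, km/s
    zs = np.linspace(0.0, z, 2000)
    inv_E = 1.0 / np.sqrt(Om * (1.0 + zs) ** 3 + (1.0 - Om))
    # Comoving distance via trapezoidal integration of c/H0 * dz / E(z), in Mpc
    dc = c / H0 * np.sum((inv_E[1:] + inv_E[:-1]) / 2.0) * (zs[1] - zs[0])
    dl = (1.0 + z) * dc                 # luminosity distance, Mpc
    return 5.0 * np.log10(dl) + 25.0    # mu = 5 log10(d_L / 10 pc)

print(round(distance_modulus(1.0), 2))
```

A deviation of the reconstructed mu(z) from this curve at high redshift is what signals a departure from the fiducial model.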
Recurrent Spectral Network (RSN): Shaping a discrete map to reach automated classification
The Recurrent Spectral Network (RSN) is a new automated classification method that uses dynamical systems to direct data to specific targets, demonstrating effectiveness with both a simple model and a standard image processing dataset.
Lorenzo Chicchi, Duccio Fanelli, Lorenzo Giambagli, Lorenzo Buffoni, Timoteo Carletti
PDF
Cite
Source Document
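The idea of shaping a discrete map so that inputs flow to class-specific attractors can be sketched in one dimension: a map with two stable fixed points funnels each input to the attractor on its side. This toy map is an illustration of the mechanism only, not the paper's trained network:

```python
import numpy as np

def rsn_map(x, beta=3.0):
    # One step of a discrete map with two stable attractors at +/- x*;
    # the sign of the input decides which attractor (class) is reached.
    return np.tanh(beta * x)

x = np.array([-0.8, -0.1, 0.05, 0.9])  # toy inputs, "class" = sign
for _ in range(50):
    x = rsn_map(x)
print(np.sign(x))  # each input reaches the attractor matching its sign
```

In the actual method, the map's parameters are trained so that the basins of attraction coincide with the decision regions of a classifier.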
Diffusion-driven instability of topological signals coupled by the Dirac operator
This research examines reaction-diffusion processes on networks, particularly focusing on topological signals across nodes, links, and cells. It uses the Dirac operator to study interactions and reveals conditions for Turing pattern emergence, validating the findings on network models and square lattices.
Lorenzo Giambagli, Lucille Calmon, Riccardo Muolo, Timoteo Carletti, Ginestra Bianconi
PDF
Cite
Project
Source Document
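The Dirac operator that couples node and link signals can be built from a graph's incidence (boundary) matrix; its square decouples into the node and link Laplacians. The triangle graph below is a minimal illustrative example:

```python
import numpy as np

# Incidence (boundary) matrix B of a triangle graph: rows = nodes, cols = links
B = np.array([[-1,  0, -1],
              [ 1, -1,  0],
              [ 0,  1,  1]], dtype=float)

# Topological Dirac operator coupling node signals with link signals
n, m = B.shape
D = np.block([[np.zeros((n, n)), B],
              [B.T, np.zeros((m, m))]])

# D squared is block diagonal: the graph Laplacian L0 = B B^T acts on nodes
# and the link Laplacian L1 = B^T B acts on links
D2 = D @ D
assert np.allclose(D2[:n, :n], B @ B.T)
assert np.allclose(D2[n:, n:], B.T @ B)
print(np.allclose(D2[:n, n:], 0))  # True: the off-diagonal blocks vanish
```

It is the off-diagonal structure of D itself (absent in D²) that cross-couples signals of different dimension and enriches the Turing analysis.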
Spectral pruning of fully connected layers
Training neural networks in spectral space optimizes eigenvalues and eigenvectors instead of individual weights, imposing an effective implicit bias that enables node pruning without sacrificing performance.
Lorenzo Buffoni, Enrico Civitelli, Lorenzo Giambagli, Lorenzo Chicchi, Duccio Fanelli
PDF
Cite
Source Document
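The pruning idea can be sketched as rank truncation in a fixed eigenbasis: keep only the modes with the largest trained eigenvalues and drop the rest. The basis and eigenvalue values below are illustrative, and this is a simplification of the paper's node-pruning procedure:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
# Spectral parametrization of a square layer: a fixed orthogonal eigenvector
# basis and a set of eigenvalues (set by hand here, with a clear spectral gap)
phi = np.linalg.qr(rng.normal(size=(n, n)))[0]
lam = np.array([2.0, -1.8, 1.5, 1.2, 0.05, -0.03, 0.02, 0.01])
W = phi @ np.diag(lam) @ phi.T

# Prune: keep only the k modes with the largest |eigenvalue|
k = 4
keep = np.argsort(np.abs(lam))[-k:]
W_pruned = phi[:, keep] @ np.diag(lam[keep]) @ phi[:, keep].T

# Relative error of the pruned map (Frobenius norm, basis-independent)
err = np.linalg.norm(W - W_pruned) / np.linalg.norm(W)
print(round(err, 3))  # small: half the modes carry almost all of the map
```

Because training concentrates the relevant information in a few large eigenvalues, truncating the small ones barely changes the layer's action.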
Machine learning in spectral domain
We introduce a new method for training deep neural networks that operates in spectral space rather than the traditional space of nodes. The method adjusts the eigenvalues and eigenvectors of the network's transfer operators, offering improved performance over standard approaches with an equivalent number of parameters.
Lorenzo Giambagli, Lorenzo Buffoni, Timoteo Carletti, Walter Nocentini, Duccio Fanelli
PDF
Cite
Source Document
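Spectral-domain training can be sketched on a toy linear regression in which only the eigenvalues of a layer are updated while the eigenvector basis stays fixed. All names and values below are illustrative, and the toy target is chosen to share the layer's eigenbasis so that eigenvalue updates alone suffice:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
phi = np.linalg.qr(rng.normal(size=(n, n)))[0]  # fixed eigenvector basis
lam = np.zeros(n)                                # trainable eigenvalues

# Toy target map sharing the same eigenbasis
lam_true = rng.normal(size=n)
W_true = phi @ np.diag(lam_true) @ phi.T

X = rng.normal(size=(100, n))
Y = X @ W_true.T

# Gradient descent on the eigenvalues only (spectral-space training)
lr = 0.1
for _ in range(300):
    W = phi @ np.diag(lam) @ phi.T
    G = (X @ W.T - Y).T @ X / len(X)                  # dLoss/dW for 0.5*MSE
    grad_lam = np.einsum('ia,ij,ja->a', phi, G, phi)  # chain rule onto eigenvalues
    lam -= lr * grad_lam
print(np.allclose(lam, lam_true, atol=1e-2))  # True: the spectrum is recovered
```

Training n eigenvalues instead of n² weights is the source of the parameter economy the paper exploits; the full method also allows the eigenvectors themselves to be trained.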