If you like ResNets and Optimal Transport, you might enjoy this paper with Raphaël Barboni and F-X Vialard. We show that infinite-width/depth ResNets are ("conditional") Wasserstein flows. arxiv.org/abs/2403.12887
If there are "enough" neurons at initialization, the training loss satisfies a Polyak–Łojasiewicz (PL) inequality, so there are no spurious local minima and gradient flow converges whenever the initial loss is small enough. This is an infinite-dimensional extension of the results of @PierreMari0n @m_e_sander arxiv.org/abs/2309.01213
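For intuition on why a PL inequality rules out bad local minima and gives convergence: it lower-bounds the gradient norm by the loss gap, so gradient descent contracts that gap geometrically. A minimal sketch on a toy least-squares problem (hypothetical A, b; not the paper's ResNet setting), which satisfies PL even when the minimizer is not unique:

```python
import numpy as np

# Toy loss L(w) = 0.5*||A w - b||^2 (A, b are illustrative stand-ins).
# It satisfies the PL inequality 0.5*||grad L(w)||^2 >= mu*(L(w) - L*)
# with mu = smallest nonzero eigenvalue of A^T A.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def loss(w):
    r = A @ w - b
    return 0.5 * r @ r

def grad(w):
    return A.T @ (A @ w - b)

# Reference optimum via least squares, to measure the loss gap.
w_star, *_ = np.linalg.lstsq(A, b, rcond=None)
L_star = loss(w_star)

# Gradient descent with step 1/L_smooth; PL gives
# L(w_k) - L* <= (1 - mu/L_smooth)^k * (L(w_0) - L*).
L_smooth = np.linalg.eigvalsh(A.T @ A).max()
w = np.zeros(5)
gaps = []
for _ in range(200):
    gaps.append(loss(w) - L_star)
    w -= (1.0 / L_smooth) * grad(w)

# The loss gap shrinks geometrically (linear convergence).
print(gaps[0], gaps[-1])
```

No convexity of the loss in w is needed for this argument, only the PL bound, which is what makes it usable for (sufficiently wide) neural networks.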