Conference paper, Year: 2025

Growth strategies for arbitrary DAG neural architectures

Abstract

Deep learning has achieved impressive results, but at the cost of training huge neural networks. The larger the architecture, the higher the computational, financial, and environmental costs of both training and inference. We aim to reduce both training and inference time. We focus on Neural Architecture Growth, which increases the size of a small model when needed, directly during training, using information from backpropagation. We extend existing work to freely grow neural networks in the form of arbitrary Directed Acyclic Graphs by reducing expressivity bottlenecks in the architecture. We explore strategies to reduce excessive computations and steer network growth toward more parameter-efficient architectures.
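The abstract does not detail the growth rule, but the following minimal PyTorch sketch illustrates the general idea of growing a network during training using information from backpropagation. It is a simplified, hypothetical example restricted to a two-layer MLP rather than an arbitrary DAG; the bottleneck proxy (the gradient norm at the hidden pre-activations), the threshold, and the number of added neurons are illustrative assumptions, not the authors' method.

# Hypothetical sketch: grow a hidden layer during training when a crude
# bottleneck proxy (gradient norm at the hidden pre-activations) stays large.
# Threshold, proxy, and growth size are illustrative assumptions.
import torch
import torch.nn as nn

class GrowingMLP(nn.Module):
    """Two-layer MLP whose hidden width can be increased without changing its function."""
    def __init__(self, d_in: int, d_hidden: int, d_out: int):
        super().__init__()
        self.fc1 = nn.Linear(d_in, d_hidden)
        self.fc2 = nn.Linear(d_hidden, d_out)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

    def grow_hidden(self, extra: int):
        # Widen fc1 with `extra` new neurons (random init) and extend fc2 with
        # zero-initialised input columns, so the network's outputs are unchanged.
        old1, old2 = self.fc1, self.fc2
        self.fc1 = nn.Linear(old1.in_features, old1.out_features + extra)
        self.fc2 = nn.Linear(old1.out_features + extra, old2.out_features)
        with torch.no_grad():
            self.fc1.weight[: old1.out_features] = old1.weight
            self.fc1.bias[: old1.out_features] = old1.bias
            self.fc2.weight[:, : old1.out_features] = old2.weight
            self.fc2.weight[:, old1.out_features:] = 0.0
            self.fc2.bias.copy_(old2.bias)

# Toy training loop on random data.
model = GrowingMLP(d_in=10, d_hidden=4, d_out=1)
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
x, y = torch.randn(256, 10), torch.randn(256, 1)

for step in range(200):
    h = model.fc1(x)
    h.retain_grad()  # keep the gradient at the hidden pre-activations
    loss = nn.functional.mse_loss(model.fc2(torch.relu(h)), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 50 == 49 and h.grad.norm() > 1.0:  # illustrative threshold
        model.grow_hidden(extra=2)
        # Rebuild the optimiser so the newly added parameters are trained.
        opt = torch.optim.SGD(model.parameters(), lr=1e-2)

Growth is function-preserving in this sketch: the new outgoing weights are zero-initialised, so adding neurons does not change the network's outputs at the moment of growth.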
Main file
main.pdf (472.05 Ko)
Origin: Files produced by the author(s)

Dates and versions

hal-04902059, version 1 (21-01-2025)

Licence

Identifiers

  • HAL Id: hal-04902059, version 1

Cite

Stella Douka, Manon Verbockhaven, Théo Rudkiewicz, Stéphane Rivaud, François P. Landes, et al. Growth strategies for arbitrary DAG neural architectures. ESANN 2025 - 33rd European Symposium on Artificial Neural Networks, Apr 2025, Bruges, Belgium. ⟨hal-04902059⟩