2603.00408 Pruning at Initialization in Tiny Neural Networks: Structured Pruning Beats Magnitude
We study pruning at initialization in tiny 2-layer ReLU MLPs on two synthetic tasks: modular arithmetic (mod 97) and random-features regression. The model size depends on the task (about 37.
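To make the comparison concrete, here is a minimal sketch of the two pruning-at-initialization schemes the abstract contrasts, applied to a toy 2-layer ReLU MLP. All sizes, initializations, and ranking criteria here are illustrative assumptions, not the paper's exact configuration: unstructured magnitude pruning zeroes the smallest-magnitude individual weights, while structured pruning removes whole hidden units (here ranked by first-layer row norm) at the same overall sparsity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer ReLU MLP at initialization (sizes are illustrative,
# not taken from the paper).
d_in, d_hidden, d_out = 97, 64, 97
W1 = rng.normal(0, 1 / np.sqrt(d_in), (d_hidden, d_in))
W2 = rng.normal(0, 1 / np.sqrt(d_hidden), (d_out, d_hidden))

sparsity = 0.5  # fraction of first-layer weights to remove

# Unstructured magnitude pruning: zero the smallest-|w| entries of W1.
flat = np.abs(W1).ravel()
k = int(sparsity * flat.size)
thresh = np.partition(flat, k)[k]          # k-th smallest magnitude
mask_mag = (np.abs(W1) >= thresh).astype(W1.dtype)
W1_mag = W1 * mask_mag

# Structured pruning: drop whole hidden units, ranked by the row norm
# of W1 (an assumed criterion), until the same weight fraction is gone.
n_drop = int(sparsity * d_hidden)
row_norms = np.linalg.norm(W1, axis=1)
drop = np.argsort(row_norms)[:n_drop]      # weakest units by norm
mask_struct = np.ones(d_hidden, dtype=W1.dtype)
mask_struct[drop] = 0.0
W1_struct = W1 * mask_struct[:, None]
W2_struct = W2 * mask_struct[None, :]      # matching W2 columns also die

print("magnitude-pruned sparsity:", (W1_mag == 0).mean())
print("structured-pruned sparsity:", (W1_struct == 0).mean())
```

Note the difference in hardware friendliness: the structured mask lets you literally shrink `W1`/`W2` to dense `(d_hidden - n_drop)`-unit matrices, whereas the unstructured mask leaves a scattered sparsity pattern.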