Antonio Andrea Gargiulo

Total Citations
94
h-index
2
Papers
1

Publications

#1 2604.03420v1 Apr 03, 2026

Zero-Shot Quantization via Weight-Space Arithmetic

We show that robustness to post-training quantization (PTQ) is a transferable direction in weight space. We call this direction the quantization vector: extracted from a donor task by simple weight-space arithmetic, it can be used to patch a receiver model and improve robustness to PTQ-induced noise by as much as 60%, without receiver-side quantization-aware training (QAT). Because the method requires no receiver training data, it provides a zero-shot, low-cost alternative to QAT for extremely low-bit deployment. We demonstrate this on Vision Transformer (ViT) models. More broadly, our results suggest that quantization robustness is not merely a byproduct of task-specific training, but a reusable feature of weight-space geometry that can be transferred rather than retrained.
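The abstract describes extracting a "quantization vector" by weight-space arithmetic and adding it to a receiver model. A minimal sketch of that idea, assuming the task-arithmetic formulation (vector = robust donor weights minus base donor weights); all function names and the use of plain NumPy dicts as stand-ins for model state dicts are hypothetical, not the paper's implementation:

```python
import numpy as np

def quantization_vector(donor_robust, donor_base):
    # Hypothetical: direction in weight space associated with the
    # donor's robustness to post-training quantization (PTQ).
    return {k: donor_robust[k] - donor_base[k] for k in donor_base}

def patch_receiver(receiver, qvec, alpha=1.0):
    # Hypothetical zero-shot patch: add the scaled vector to the
    # receiver's weights, with no receiver-side training or data.
    return {k: receiver[k] + alpha * qvec[k] for k in receiver}

# Toy usage with random weights standing in for ViT parameters.
rng = np.random.default_rng(0)
donor_base = {"w": rng.normal(size=4)}
donor_robust = {"w": donor_base["w"] + 0.5}   # pretend QAT shifted weights
qv = quantization_vector(donor_robust, donor_base)
receiver = {"w": rng.normal(size=4)}
patched = patch_receiver(receiver, qv, alpha=1.0)
```

The scaling factor `alpha` is an assumed knob, analogous to the coefficients used in task-vector arithmetic; the paper may or may not expose such a parameter.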

Adrian Robert Minut, Emanuele Rodolà, Daniele Solombrino, Antonio Andrea Gargiulo, Luca Zhou, +1
0 Citations