- ShiftAddLLM: Accelerating Pretrained LLMs via Post-Training Multiplication-Less Reparameterization (arXiv:2406.05981, published Jun 10, 2024)
- ShiftAddViT: Mixture of Multiplication Primitives Towards Efficient Vision Transformer (arXiv:2306.06446, published Jun 10, 2023)
- NetDistiller: Empowering Tiny Deep Learning via In-Situ Distillation (arXiv:2310.19820, published Oct 24, 2023)