MATS Fellow:
Roy Rinberg
Authors:
Roy Rinberg, Annabelle Michael Carrell, Simon Henniger, Nicholas Carlini, Keri Warr
Abstract:
We study the compression of LLM-generated text across lossless and lossy regimes, characterizing a compression-compute frontier where more compression is possible at the cost of more compute. For lossless compression, domain-adapted LoRA adapters can improve LLM-based arithmetic coding by 2× over compression with the base LLM alone. For lossy compression, prompting a model for a succinct rewrite then applying arithmetic coding can achieve compression ratios of approximately 0.03, a 2× improvement over compressing the original response. We further introduce Question-Asking compression (QA), an interactive lossy protocol inspired by the game “Twenty Questions”. A small model iteratively refines its response by asking yes/no questions to a stronger model, transferring exactly one bit per answer. On 8 benchmarks spanning math, science, and code, 10 binary questions recover 23% to 72% of the capability gap between a small and large model on standard benchmarks and 7% to 38% on harder benchmarks, achieving compression ratios of 0.0006 to 0.004. This is over 100× smaller than prior LLM-based compression (Delétang et al., 2024), suggesting that interactive protocols can transfer knowledge far more efficiently than transmitting full responses.
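The QA protocol's core mechanism is that each yes/no answer from the strong model transfers exactly one bit. A minimal sketch of that idea, using toy stand-in functions (the model names, question format, and bit-recovery scheme below are illustrative assumptions, not the paper's implementation):

```python
# Toy sketch of the Question-Asking (QA) protocol from the abstract:
# a weak model refines its answer by asking yes/no questions of a
# strong model, receiving exactly one bit per question. Here the
# "knowledge" being transferred is modeled as a hidden integer; both
# functions are hypothetical stand-ins.

def strong_model_answer(question: str, secret: int) -> bool:
    """Oracle: answers a yes/no question about a hidden value (1 bit)."""
    _, bit_index = question.rsplit(" ", 1)
    return bool((secret >> int(bit_index)) & 1)

def weak_model_refine(num_questions: int, secret: int) -> tuple[int, int]:
    """Refine an estimate using num_questions one-bit answers.

    Returns (estimate, bits_transferred)."""
    estimate = 0
    for i in range(num_questions):
        bit = strong_model_answer(f"is bit set at index {i}", secret)
        estimate |= int(bit) << i
    return estimate, num_questions

estimate, bits_transferred = weak_model_refine(10, secret=0b1011001110)
```

Ten questions transfer exactly 10 bits regardless of how long the strong model's full response would have been, which is the source of the 0.0006 to 0.004 compression ratios reported above.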
Haiku to Opus in Just 10 bits: LLMs Unlock Massive Compression Gains
Date:
March 5, 2026
The MATS Program is an independent research and educational initiative connecting emerging researchers with mentors in AI alignment, governance, and security.
Each MATS cohort runs for 12 weeks in Berkeley, California, followed by an optional 6–12 month extension in London for selected scholars.