AI could help turbocharge your SSD's effective storage by compressing data even further, but don't delete 7zip just yet
DeepMind scientists have demonstrated a major upgrade to compression technology, using a large language model (LLM) to achieve astonishing lossless compression rates on image and audio data.
Using the company's Chinchilla 70B LLM to drive a compression algorithm, the researchers reduced images to 43.4% and audio files to 16.4% of their original size, as described in their paper – beating some of the best compression software out there.
In contrast, the standard PNG algorithm compresses images to 58.5% and FLAC compresses audio to 30.3% of their original size (lower is better). It means you could store far more on one of the best SSDs.
Although Chinchilla 70B was trained primarily on text, the researchers achieved these results by relying on the model's predictive capabilities and framing compression as a prediction problem. In other words, they repurposed the LLM's core strength, next-token prediction, and found that it doubles as a powerful file compressor.
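The intuition behind that framing can be sketched in a few lines. This is not DeepMind's actual pipeline, just a toy illustration of the underlying information-theoretic link: an ideal entropy coder spends about -log2(p) bits on each symbol, where p is the probability a predictive model assigns to it, so a better predictor means a smaller file. The predictor functions below are made up for the example.

```python
import math

def ideal_bits(sequence, predict):
    """Total bits an ideal entropy coder would need, given a next-symbol predictor."""
    total = 0.0
    for i, symbol in enumerate(sequence):
        # Probability the model assigns to the true next symbol, given the context.
        p = predict(sequence[:i], symbol)
        total += -math.log2(p)
    return total

text = "abababababab"

# A uniform model over the two symbols: exactly 1 bit per character.
uniform = lambda ctx, s: 0.5

# A model that has "learned" the alternating pattern: near-certain predictions.
def alternating(ctx, s):
    if not ctx:
        return 0.5
    expected = "b" if ctx[-1] == "a" else "a"
    return 0.99 if s == expected else 0.01

print(ideal_bits(text, uniform))      # 12.0 bits
print(ideal_bits(text, alternating))  # roughly 1.2 bits
```

The better predictor stores the same string in roughly a tenth of the bits, which is exactly why a strong language model makes a strong compressor.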
AI is great at compression – to a point
The DeepMind researchers showed that because of this equivalence between prediction and compression, any compressor can be used as a conditional generative model – and vice versa.
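The reverse direction can also be sketched with an off-the-shelf compressor. In this hedged toy example (not from the paper), zlib acts as a crude predictor: among several candidate continuations, the one that compresses best together with the context is treated as the likeliest. The `pick_next` helper and the sample text are assumptions made up for the demo.

```python
import zlib

def pick_next(context: bytes, candidates: list) -> bytes:
    """Use a generic compressor as a predictor: prefer the candidate
    continuation that yields the smallest compressed size with the context."""
    def cost(cand):
        return len(zlib.compress(context + cand, 9))
    return min(candidates, key=cost)

context = b"the cat sat on the mat. the cat sat on the " * 3
print(pick_next(context, [b"mat", b"xyz", b"qqq"]))  # b"mat"
```

The repeated phrase makes `b"mat"` cheap to encode as a back-reference, so the compressor "predicts" it, a small-scale echo of the compressor-as-model equivalence the paper formalizes.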
But, they added, these compression results hold only up to a certain file size, meaning generative AI may not be a practical compression solution for everyone.
“We compared large, pre-trained models used as compressors against several standard compressors and showed that they are competitive not only on text, but also on modalities for which they were never trained,” the researchers said.
“We showed that the compression perspective provides new insights into scaling laws because it takes model size into account, unlike the log-loss objective, which is standard in current language modeling research.”
Because of this scaling limitation, the models used in this study are no better than, say, 7zip for files above a certain threshold: once the model's own size is accounted for, the gains shrink, and they are unlikely to match the speed of conventional compression algorithms.