Compressing Images with Neural Networks

by skandium on 3/17/2024, 6:28 PM with 69 comments

by StiffFreeze9 on 3/18/2024, 1:28 AM

How badly will its lossiness change critical things? In 2013, there were Xerox copiers with aggressive compression that changed numbers: https://www.theregister.com/2013/08/06/xerox_copier_flaw_mea...

by Dwedit on 3/18/2024, 12:12 AM

There was an earlier article (Sep 20, 2022) about using the Stable Diffusion VAE to perform image compression. It uses the VAE to map from pixel space to latent space, dithers the latents down to 256 colors, and then de-noises the result at decompression time.

https://pub.towardsai.net/stable-diffusion-based-image-compr...

HN discussion: https://news.ycombinator.com/item?id=32907494
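The pipeline that article describes — encode to latents, quantize to 256 levels, decode — can be sketched with a toy linear "autoencoder" in NumPy. This is purely illustrative: the random projection stands in for the real Stable Diffusion VAE, and the diffusion de-noising step is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(img, W):
    # Stand-in "VAE encoder": project pixels into a smaller latent space.
    return img.reshape(-1) @ W

def quantize(latent, levels=256):
    # Round each latent down to one of 256 discrete values (8 bits each).
    lo, hi = latent.min(), latent.max()
    q = np.round((latent - lo) / (hi - lo) * (levels - 1)).astype(np.uint8)
    return q, lo, hi

def dequantize(q, lo, hi, levels=256):
    return q.astype(np.float32) / (levels - 1) * (hi - lo) + lo

def decode(latent, W):
    # Stand-in "VAE decoder": just the pseudo-inverse of the encoder.
    return latent @ np.linalg.pinv(W)

img = rng.random((8, 8)).astype(np.float32)            # stand-in image (256 bytes)
W = rng.standard_normal((64, 16)).astype(np.float32)   # 4x fewer latent dims

z = encode(img, W)
q, lo, hi = quantize(z)        # this uint8 array is what you would store (16 bytes)
recon = decode(dequantize(q, lo, hi), W).reshape(8, 8)

print(q.nbytes, img.nbytes)
```

The lossy step is the quantization: the stored latents are 16x smaller than the pixels, and a real system relies on the decoder (plus de-noising) to paper over the rounding error.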

by rottc0dd on 3/18/2024, 3:20 AM

Something similar by Fabrice Bellard:

https://bellard.org/nncp/

by mbtwl on 3/18/2024, 10:38 AM

A first NN-based image compression standard is currently being developed by JPEG. More information can be found here: https://jpeg.org/jpegai/documentation.html

The best overview is probably the “JPEG AI Overview Slides”.

by jfdi on 3/17/2024, 10:26 PM

Does anyone know of open models that are useful (and good quality) for going the other way? I.e., the input is an 800x600 JPG and the output is a 4K version.

by calebm on 3/18/2024, 2:49 PM

All learning is compression

by esafak on 3/17/2024, 7:13 PM

It is not going to take off unless it is significantly better and has browser support. WebP took off thanks to Chrome, while JPEG 2000 floundered. Failing native browser support, maybe the codec could be shipped via WASM or something?

The interesting diagram to me is the last one, for computational cost, which shows the 10x penalty of the ML-based codecs.

by holoduke on 3/17/2024, 10:36 PM

How much VRAM is needed? And how much computing power? To open a webpage you'll soon need 24 GB and two seconds at 1000 watts to decompress images, while bandwidth drops from 2 MB to only 20 kB.

by amelius on 3/17/2024, 10:17 PM

How do we know we don't get hands with 16 fingers?