Image Compression for Sup Forums

There was a suggestion in the captcha threads that if Sup Forums could cut its server costs, it could afford to implement a better captcha system. I imagine Sup Forums's biggest cost is the bandwidth for image hosting.
Phoneposters are uploading bad-quality photos that take up 4MB each. They could be compressed to a tenth of that size with no noticeable loss in quality.
I've seen people upload screenshots of text that are over 3MB. These recompress LOSSLESSLY over 70x, not even exaggerating.

Moot said in the past that he skipped recompression to keep server load down. But what about client-side compression? Just this year WebAssembly has become widely available for executing fast, low-level code client-side. Existing compression tools can be compiled into WebAssembly.
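To make it concrete, here's a rough sketch of what the upload hook could look like. The recompress.wasm module and its alloc()/encode() exports are placeholders for whatever tool actually gets compiled (mozjpeg, oxipng, whatever); this is not a real published API, just the shape of the idea:

```typescript
// Sketch: recompress an image in the browser before it hits the upload form.
// "recompress.wasm" and its alloc()/encode() exports are hypothetical stand-ins
// for a real encoder compiled to WebAssembly.
async function recompressBeforeUpload(file: File): Promise<Blob> {
  const wasm = await WebAssembly.instantiateStreaming(fetch("/recompress.wasm"));
  const { memory, alloc, encode } = wasm.instance.exports as {
    memory: WebAssembly.Memory;
    alloc: (len: number) => number;                 // reserve space in wasm memory
    encode: (ptr: number, len: number) => number;   // recompress in place, return new length
  };

  // Copy the original file bytes into wasm linear memory.
  const input = new Uint8Array(await file.arrayBuffer());
  const ptr = alloc(input.length);
  new Uint8Array(memory.buffer, ptr, input.length).set(input);

  // Let the compiled encoder rewrite the image.
  const outLen = encode(ptr, input.length);
  const output = new Uint8Array(memory.buffer, ptr, outLen).slice();

  // Only keep the result if it actually got smaller.
  return outLen < input.length ? new Blob([output], { type: file.type }) : file;
}
```

The whole thing runs on the poster's machine, so the server never even sees the original 4MB photo.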

Lossless techniques for recompression:
Deflate streams in PNGs can usually be compressed further using modern deflate encoders, and JPEG Huffman tables can be re-optimized the same way. Both are completely lossless.
PNG has many options for compression such as different filters and color modes. Tools like optipng exist to search these possibilities and reduce filesize quite a bit.
JPEGs can be converted to progressive JPEGs, which usually compress somewhat better (and appear to load quicker as a bonus).
GIFs can be converted to PNG for around a 30% reduction in size (minimal sketch after this list). Animated GIFs can similarly be converted to APNGs or lossless WebMs.
Huffman-coded JPEGs can be converted to arithmetic-coded JPEGs for around a 15% reduction in size. Unfortunately, hardly any browsers support this.
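Here's that GIF-to-PNG sketch. It only needs stock browser APIs; createImageBitmap only grabs a single frame, so animated GIFs would need a proper demuxer, which this doesn't do:

```typescript
// Sketch: losslessly re-save a still GIF as PNG using only standard browser APIs.
// The PNG holds the exact same pixels, usually in noticeably fewer bytes.
async function gifToPng(file: File): Promise<Blob> {
  const bitmap = await createImageBitmap(file);       // decode the GIF (first frame)
  const canvas = document.createElement("canvas");
  canvas.width = bitmap.width;
  canvas.height = bitmap.height;
  canvas.getContext("2d")!.drawImage(bitmap, 0, 0);   // pixels copied verbatim
  return new Promise((resolve, reject) =>
    canvas.toBlob(b => (b ? resolve(b) : reject(new Error("PNG encode failed"))), "image/png")
  );
}
```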

Lossy techniques for recompression:
Most JPEGs can be compressed to a fraction of their size with no noticeable loss in quality. Modern JPEG encoders are also much better than old ones.
JPEG is an old format and much better compression can be achieved with modern formats like WebP (sketch after this list).
VP8 webms can be converted to VP9.
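For the WebP point, Chromium-based browsers can already re-encode through the canvas API without any wasm; other browsers may silently fall back to PNG, so treat this as a sketch. The 0.8 quality value is an arbitrary number picked for illustration:

```typescript
// Sketch: lossy re-encode of an uploaded photo to WebP in the browser.
// Whether "image/webp" output is honored depends on the browser.
async function toWebP(file: File, quality = 0.8): Promise<Blob> {
  const bitmap = await createImageBitmap(file);
  const canvas = document.createElement("canvas");
  canvas.width = bitmap.width;
  canvas.height = bitmap.height;
  canvas.getContext("2d")!.drawImage(bitmap, 0, 0);
  return new Promise(resolve =>
    canvas.toBlob(
      b => resolve(b && b.size < file.size ? b : file),   // keep the original if there's no gain
      "image/webp",
      quality
    )
  );
}
```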

"But muh generation loss". These conversions only need to happen once. You can look at the quantization tables of a JPEG and see if it's already been compressed enough.

Bump because it's worth exploring

Someone said something about redirecting Sup Forums's data through TOR. How would that work?

I think you mean IPFS or some similar distributed solution. That could work to reduce bandwidth by having users host content instead of their servers. I think modern browsers can do some P2P like that now through WebRTC. I don't know much about it though.

This summer I was looking for a good image compression format for a project at my job. I tried JPEG, PNG and WebP. WebP had the best compression ratio, but it required tremendous processing power to compress and decompress. Google's implementation required several ms per decompression call on an i5 4570. PNG was lossless but didn't have a good ratio. In the end we used JPEG since it had a good balance of ratio and the resources needed to process it. I also tried gzip compression over JPEG, but bandwidth was reduced by about 5% at best.
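A quick way to reproduce the gzip-over-JPEG result is something like this (Node sketch, not the exact code from work); JPEG output is already high-entropy, so a byte-level compressor has almost nothing left to squeeze:

```typescript
// Sketch: measure how little gzip helps on an already-compressed JPEG.
import { readFileSync } from "node:fs";
import { gzipSync } from "node:zlib";

const original = readFileSync(process.argv[2]);       // path to a .jpg
const gzipped = gzipSync(original, { level: 9 });
const saved = 100 * (1 - gzipped.length / original.length);
console.log(`gzip saves ${saved.toFixed(1)}% of ${original.length} bytes`);
```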

what's a moot?

What's going to stop some asshole from disabling their client side compressor?

You mean like FLIF and BPG? Might be worth it, but the JS to gloss over absent browser support will also cost something.

What about buying a Sup Forums pass to get more money into Sup Forums?
I mean we spend a lot of time here and probably some of you paid for FOSS as well.
Why would Sup Forums be different?

>turning Sup Forums into a literal botnet
lol no thanks

A few ms extra to upload an image seems pretty acceptable to me.

Nothing, but most people won't.

rly?

BPG is patent-encumbered and FLIF is only a good replacement for PNG. Also, nothing supports them, though client-side decompression is possible.
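The shim would look roughly like this; decodeFlif() is a hypothetical stand-in for whatever wasm-compiled decoder you'd ship, since there is no real browser API for these formats. Only the canvas plumbing is standard:

```typescript
// Sketch: client-side decode of an unsupported format (FLIF/BPG) into a canvas.
// decodeFlif() is a hypothetical wasm-compiled decoder returning raw RGBA.
declare function decodeFlif(data: Uint8Array): {
  width: number;
  height: number;
  rgba: Uint8ClampedArray;
};

async function renderFlif(url: string, canvas: HTMLCanvasElement): Promise<void> {
  const bytes = new Uint8Array(await (await fetch(url)).arrayBuffer());
  const { width, height, rgba } = decodeFlif(bytes);   // hypothetical decoder call
  canvas.width = width;
  canvas.height = height;
  const image = new ImageData(rgba, width, height);    // wrap the raw pixels
  canvas.getContext("2d")!.putImageData(image, 0, 0);  // paint without re-encoding
}
```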

Reducing costs is orthogonal to increasing revenue.

But it would be /ourbotnet/

>A few ms extra to upload an image seems pretty acceptable to me.

The processing time is not the problem. It's the fact that the decompression algorithm uses 100% of the CPU for those few ms. Now imagine a mobile phone using 100% of its CPU for ~40ms for each image it wants to show. That would be disastrous for the battery.

Interesting. I believe most phones have hardware support for VP8 (WebM), which is basically what WebP is.

seems interesting
bump

...

Without lowering the maximum upload size, better compression won't effectively lower the storage costs. This needs to happen as a tradeoff.

I don't agree with client-side encoding; it would make more sense for people to start using dedicated tools, so that not just the image format spreads but the surrounding software ecosystem gets a boost too. Also, less botnet.

post this on

The maximum limit is just that, a limit. If you reduce the size of the average upload, then on average bandwidth will decrease.

Shove it up >>>/yourAss/

I don't understand the purpose of that board.

I'd be okay with lossless recompression. Lossy recompression can go die in a fire.