Hacker News

Wow, I didn't know that. A top Google result says:

> It is possible to losslessly transcode JPEG images into JPEG XL. Transcoding preserves the already-lossy compression data from the original JPEG image without any quality loss caused by re-encoding, while making the file size smaller than the original.

I wonder how it does that, and why JPEG didn't notice it could. I'd re-encode to JPEG XL once it's supported. So the situation isn't that WebP is so great, but rather that Chrome's support is not so great.



> I wonder how it does that

It's trivial to do: JPEG's last stage is compression via a Huffman code - a really ancient, not particularly effective compressor. You simply decompress that stage and recompress with something more modern, yielding better savings. Stuffit did it in 2005. PackJPG in 2006. Lepton in 2016. Brunsli (a Google project!) in 2019 - and it was one of the inputs to the JXL draft.
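To see why swapping out the Huffman stage helps at all, here's a small self-contained sketch (not any of the tools above - just an illustration): Huffman codes must assign a whole number of bits to every symbol, so on a skewed distribution (like quantized DCT coefficient magnitudes) they waste a few percent versus the Shannon entropy, which an arithmetic or ANS coder can approach. The frequencies below are made up for illustration.

```python
import heapq
import math

def huffman_code_lengths(freqs):
    """Build a Huffman code over the symbol frequencies and return
    the code length (in bits) assigned to each symbol."""
    # Heap items: (weight, tiebreak, {symbol: current_code_length})
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, a = heapq.heappop(heap)
        w2, _, b = heapq.heappop(heap)
        # Merging two subtrees pushes every contained symbol one level deeper.
        merged = {s: l + 1 for s, l in {**a, **b}.items()}
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

# A skewed distribution, made up here, but shaped like what an entropy
# coder sees in JPEG: most symbols are small or zero.
freqs = {"0": 70, "1": 15, "2": 8, "3": 4, "4": 2, "5": 1}
total = sum(freqs.values())
probs = {s: w / total for s, w in freqs.items()}

lengths = huffman_code_lengths(freqs)
huffman_bits = sum(probs[s] * lengths[s] for s in probs)       # avg bits/symbol, Huffman
entropy_bits = -sum(p * math.log2(p) for p in probs.values())  # the achievable lower bound

print(f"Huffman: {huffman_bits:.3f} bits/symbol")
print(f"Entropy: {entropy_bits:.3f} bits/symbol")
print(f"Overhead: {100 * (huffman_bits / entropy_bits - 1):.1f}%")
```

On this toy distribution Huffman pays roughly 8-9% over the entropy; the real transcoders additionally win by modeling the DCT coefficients with better context models, not just a better final coder.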

> and why JPEG didn't notice it could.

Oh, that's the best part - they did, all the way back in 1991. The JPEG standard lets you choose between Huffman and Arithmetic Coding for the last stage - and the latter is way more effective. Unfortunately it was patent-encumbered and support for it remained rare. It yielded 10%-ish space savings, which wasn't worth the compatibility headache (an arithmetic-coded JPEG has the same extension and MIME type as a Huffman-encoded one, so a web server can't tell whether your browser supports it). If only it had used a different file extension, it would probably be the dominant format today.



