13 comments

  • markdog12 (1 day ago)
    "Yes, re-opening."

    > Given these positive signals, we would welcome contributions to integrate a performant and memory-safe JPEG XL decoder in Chromium. In order to enable it by default in Chromium we would need a commitment to long-term maintenance. With those and our usual launch criteria met, we would ship it in Chrome.

    https://groups.google.com/a/chromium.org/g/blink-dev/c/WjCKcBw219k/m/NmOyvMCCBAAJ
    • concinds (17 hours ago)
      Context: Mozilla has had the same stance and many devs (including Googlers) are already working on a Rust decoder which has made good progress.
    • bigbuppo (19 hours ago)
      LOL. Google, the "yeah, that thing we bought six months ago? We're killing it off in 30 days, as of 4 weeks ago" company, demanding "long-term" anything.
      • rjh29 (6 hours ago)
        That conversation doesn't apply to their core products: Search, Mail, Maps, Chrome, Android. Their commitment to maintaining these services over decades has been amazing. It's everything else that sucks.
        • professor_v (2 hours ago)
          Mail is dropping features left and right, like Gmailify. I'm pretty sure they're trying to limit the maintenance costs as much as possible.
      • lonjil (18 hours ago)
        Long-term support is actually being provided by Google...

        ...just a different team in a different country :D

        Most JXL devs are at Google Research in Zurich, and have already pledged to handle long-term support.
        • malfist (16 hours ago)
          Just like Google pledges long-term support for everything, until the next new and shiny thing comes along.
        • tyre (12 hours ago)
          I think Chrome can safely be said to have a track record of long-term investment.
          • bigiain (8 hours ago)
              It is, after all, their primary ad delivery vector.
          • josefx (5 hours ago)
            Very good track record there: Native Client, FLoC, Manifest V2, ...
  • crazygringo (22 hours ago)
    Dupe. From yesterday (183 points, 82 comments):

    https://news.ycombinator.com/item?id=46021179
    • markdog12 (22 hours ago)
      Ah, I think I searched for "jpegxl", that's why there was no match.
  • wizee (23 hours ago)
    JPEG-XL provides the best migration path from JPEG, with lossless recompression. It also supports arbitrary HDR bit depths (up to 32 bits per channel), unlike AVIF, and its HDR support is generally much better than AVIF's. Other operating systems and applications were making strides toward adopting this format, but until now Google was stubbornly holding the web back with their refusal to support JPEG-XL in favour of AVIF, which they were pushing. I'm glad to hear they're finally reconsidering. Let's hope this leads to resources being dedicated to helping build and maintain a performant and memory-safe decoder (in Rust?).
    • homebrewer (21 hours ago)
      It's not just Google; Mozilla has no desire to introduce a barely supported, massive C++ decoder for marginal gains either:

      https://github.com/mozilla/standards-positions/pull/1064

      AVIF is just better for typical web image quality: it produces better-looking images, and its artifacts aren't as annoying (smoothing instead of blocking and ringing around sharp edges).

      You also get it basically for free, because it's just an AV1 key frame. Every browser needs an AV1 decoder already, unless it's willing to forgo users who would like to be able to watch Netflix and YouTube.
      • lonjil (20 hours ago)
        I don't understand what you're trying to say. Mozilla said over a year ago that they would support JXL as soon as there's a fast, memory-safe decoder that will be supported.

        Google, on the other hand, never expressed any desire to support JXL at all, regardless of the implementation. Only now, after the PDF Association announced that PDF would be using JXL, did they decide to support JXL on the web.

        > AVIF is just better for typical web image quality: it produces better-looking images, and its artifacts aren't as annoying (smoothing instead of blocking and ringing around sharp edges).

        AVIF is certainly better at the quality level Google wants you to use, but in reality, images on the web are much higher quality than that.

        And JXL is pretty good if you want smoothing; in fact, libjxl's defaults have gotten so overly smooth recently that it's considered a problem, which they're in the process of fixing.
        • bawolff (18 hours ago)
          > I don't understand what you're trying to say. Mozilla said over a year ago that they would support JXL as soon as there's a fast, memory-safe decoder that will be supported.

          Did they actually say that? All the statements I've seen from them have been much more guarded and vague. More of a "maybe we'll think about it if that happens."
          • lonjil (18 hours ago)
            > If they successfully contribute an implementation that satisfies these properties and meets our normal production requirements, we would ship it.

            That's what they said a year ago. And a couple of Mozilla devs have been in regular contact with the JXL devs ever since, helping with the integration. The patches to use jxl-rs with Firefox already exist, and will be merged as soon as a couple of prerequisite issues in Gecko are fixed.
            • magicalist (17 hours ago)
              Their standards position is still neutral[1]; what switched a year ago was that they said they would be *open* to shipping an implementation that met their requirements. The tracking bug hasn't been updated[2]. The patches you mention are still part of the intent to prototype (behind a flag), similar to the earlier implementation that was removed from Chrome.

              They're looking at the same signals as Chrome: a format that's actually getting use, has a memory-safe implementation, and will stick around for decades to justify adding it to the web platform, all of which seem more and more positive since 2022.

              [1] https://mozilla.github.io/standards-positions/#jpegxl

              [2] https://bugzilla.mozilla.org/show_bug.cgi?id=1539075
      • wizee (20 hours ago)
        I disagree about the image quality at typical sizes - I find JPEG-XL is generally similar to or better than AVIF at any reasonable compression ratio for web images. See this for example: https://tonisagrista.com/blog/2023/jpegxl-vs-avif/

        AVIF only comes out as superior at extreme compression ratios, at much lower bit rates than are typically used for web images, and images generally look like smothered messes at those extreme ratios.
      • bananalychee (20 hours ago)
        Even though AVIF decoding support is fairly widespread by now, it is still not ubiquitous like JPEG/PNG/GIF. So services will typically store or generate the same image in multiple formats, including AVIF for bandwidth optimization and JPEG for universal client support. Browser headers help to determine compatibility, but it's still fairly complicated to implement, and users also end up having to deal with different platforms supporting different formats when they are served WebP or AVIF and want to re-upload an image somewhere else that doesn't like those formats. As far as I can tell, JXL solves that issue for most websites, since it is backwards-compatible and can be decoded into JPEG when a client does not support JXL. I would happily give up a few percent of compression efficiency to get back to a single all-purpose lossy image format.
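        A minimal sketch of the server-side negotiation described above, assuming a Flask-style app and pre-generated variants on disk (all file and route names hypothetical):

        ```python
        # Hypothetical sketch: pick the best image variant a client accepts.
        # Assumes photo.avif / photo.webp / photo.jpg were generated ahead of time.
        from flask import Flask, request, send_file

        app = Flask(__name__)

        # Preference order: newest format first, JPEG as the universal fallback.
        FORMATS = [("image/avif", "avif"), ("image/webp", "webp"), ("image/jpeg", "jpg")]

        @app.route("/images/<name>")
        def serve_image(name):
            accept = request.headers.get("Accept", "")
            for mime, ext in FORMATS:
                if mime in accept or ext == "jpg":  # JPEG assumed acceptable everywhere
                    resp = send_file(f"images/{name}.{ext}", mimetype=mime)
                    resp.headers["Vary"] = "Accept"  # caches must key on the Accept header
                    return resp
        ```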
        • hirako2000 (19 hours ago)
          Even Google Photos does not support AVIF.

          It's almost as if Google had an interest in increased storage and bandwidth. Of course they don't, but as a paying Drive user I'm overcharged for the same thing.
          • magicalist (17 hours ago)
            > Even Google Photos does not support AVIF

            I have no previous first-hand knowledge of this, but I vaguely remember discussions of AVIF in Google Photos from Reddit a while back, so FWIW I just tried uploading some AVIF photos and it handled them just fine.

            Listed as AVIF in file info, downloads as the original file, though inspecting the network in the web frontend, it serves versions of it as JPG and WebP, so there's obviously still transcoding going on.

            I'm not sure when they added support; the consumer documentation seems to be more landing site than docs, unless I'm completely missing the right page, but the API docs list AVIF support[1], and according to the Wayback Machine, "AVIF" was added to that page some time between August and November 2023.

            [1] https://developers.google.com/photos/library/guides/upload-media#file-types-sizes
            • hirako2000 (16 hours ago)
              You are correct that it is possible to upload AVIF files into Google Photos. But you lose the view and, of course, the thumbnail, defeating the whole purpose of putting them into Photos.

              Given it's an app, they didn't even need Google Chrome to add support. AVIF is supported natively on Android.
              • magicalist (34 minutes ago)
                > You are correct that it is possible to upload AVIF files into Google Photos. But you lose the view and, of course, the thumbnail.

                I'm not sure what you mean. They appear to act like any other photo in the interface. You can view them and they're visible in the thumbnail view, but maybe I'm misinterpreting what you mean?
          • lonjil (18 hours ago)
            Some years ago, the Google Photos team asked the Chrome team to support JXL, so that they could use it for Photos. The request was ignored, of course.
            • hirako2000 (16 hours ago)
              They could have added support to the app themselves, as it doesn't use the WebView.
              • ascorbic (6 hours ago)
                Google Photos isn't just the app.
                • hirako2000 (39 minutes ago)
                  See the cousin comment: it accepts AVIF files. At least they would render in the app, which would be enough for many. As it stands, it accepts this format and renders nothing at all.
      • kps (21 hours ago)
        Not everything in the world is passive end-of-the-line presentation. JPEG-XL is the only one that tries to be a general-purpose image format.
        • asadotzler (21 hours ago)
          If that's the case, let it be a feature of image-editing packages that can output formats that are for the web. It's a web standard we're talking about here, not a general-purpose image format, so asking browsers to carry that big code load seems unreasonable when existing formats do most of what we need and want *for the web*.
          • crote (20 hours ago)
            People generally expect browsers to display general-purpose image formats. It's why they support formats like classical JPEG, instead of just GIF and PNG.

            Turns out people *really* like being able to just drag and drop an image from their camera into a website - being forced to re-encode it first isn't exactly popular.
            • robertoandred (20 hours ago)
              > Turns out people really like being able to just drag and drop an image from their camera into a website - being forced to re-encode it first isn't exactly popular.

              That's a function of the website, not the browser.
              • jyoung8607 (20 hours ago)
                > That's a function of the website, not the browser.

                That's hand-waving away quite a lot. The task changes from serving a copy of a file on disk, as with every other image format in common use, to needing a transcoding pipeline more akin to sites like YouTube. Technically possible, but lots of extra complexity in return for what gain?
      • danielheath (6 hours ago)
        The killer feature of JXL is that most websites already have a whole bunch of images in JPEG format, and converting those to JXL shrinks them by about 30% without introducing any new artifacts.
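        A quick way to check that claim, assuming libjxl's cjxl/djxl command-line tools are installed (file names hypothetical; for JPEG input, cjxl defaults to lossless recompression rather than re-encoding pixels):

        ```python
        # Sketch: recompress a JPEG to JXL, reconstruct it, and verify the round trip.
        import hashlib
        import subprocess

        def sha256(path):
            with open(path, "rb") as f:
                return hashlib.sha256(f.read()).hexdigest()

        subprocess.run(["cjxl", "photo.jpg", "photo.jxl"], check=True)     # lossless JPEG transcode
        subprocess.run(["djxl", "photo.jxl", "restored.jpg"], check=True)  # reconstruct the JPEG

        # The reconstruction should be byte-exact, so the hashes should match.
        assert sha256("photo.jpg") == sha256("restored.jpg")
        ```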
      • greenavocado (18 hours ago)
        "Marginal Gains"

        Generation Loss – JPEG, WebP, JPEG XL, AVIF: https://www.youtube.com/watch?v=w7UDJUCMTng
        • edflsafoiewq (14 hours ago)
          Marginal gains over AVIF.

          (Also, I am highly skeptical of the importance of these generation-loss tests.)
          • jlouis (13 hours ago)
            Very nice in video workflows, where it's common to write out image sequences to disk.
          • greenavocado (13 hours ago)
            Social media *exists*.
      • magicalhippo (19 hours ago)
        > Mozilla has no desire to introduce a barely supported massive C++ decoder for marginal gains

        On a slightly related note, I wanted to have an HDR background image in Windows 11. Should be a breeze in 2025, right?

        Well, Windows 11 only supports JPEG XR[1] for HDR background images. And my commonly used tools either did not support JPEG XR (GIMP, for example) or did not work correctly (ImageMagick).

        So I had a look at the JPEG XR reference implementation, which was hosted on CodePlex but has been mirrored on GitHub[2]. And boy, I sure hope that isn't the code that lives in Windows 11...

        OK, most of the gunk is in the encoder/decoder wrapper code, but still, for something that's supposedly still in active use by Microsoft... Though not even hosting their own copy of the reference implementation is telling enough, I suppose.

        [1]: https://en.wikipedia.org/wiki/JPEG_XR

        [2]: https://github.com/4creators/jxrlib
        • infinet (18 hours ago)
          Another JPEG XR user is Zeiss. It saves both grayscale and color microscope images with JPEG XR compression in a container format. Zeiss also released a C++ library (libczi) that uses the reference JPEG XR implementation to read/write these images. Somehow, Zeiss is moving away from JPEG XR - the newer version of its microscope control software saves with zstd compression by default.
      • xeeeeeeeeeeenu (21 hours ago)
        > AVIF is just better for typical web image quality

        What does "typical web image quality" even mean? I see lots of benchmarks with very low BPPs, like 0.5 or even lower, and that's where video-based image codecs shine.

        However, I just visited CNN.com, and these are the BPPs of the first 10 images my browser loaded: 1.40, 2.29, 1.88, 18.03 (PNG "CNN headlines" logo), 1.19, 2.01, 2.21, 2.32, 1.14, 2.45.

        I believe people are underestimating the BPP values actually used on the web. I'm not saying that low-BPP images don't exist, but clearly it isn't hard to find examples of higher-quality images in the wild.
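        For reference, bits per pixel is just the file size in bits divided by the pixel count; a minimal sketch using Pillow (file name hypothetical):

        ```python
        # BPP = (file size in bytes * 8) / (width * height)
        import os
        from PIL import Image

        def bits_per_pixel(path):
            with Image.open(path) as im:
                width, height = im.size
            return os.path.getsize(path) * 8 / (width * height)

        print(f"{bits_per_pixel('photo.jpg'):.2f} bpp")
        ```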
      • jnd-cz (21 hours ago)
        Can AVIF display 10-bit HDR with the larger color gamut that any modern phone nowadays is capable of capturing?
        • CharlesW (9 hours ago)
          > Can AVIF display 10-bit HDR with the larger color gamut that any modern phone nowadays is capable of capturing?

          Sure, 12-bit too, with HDR transfer functions (PQ and HLG), wide-gamut primaries (BT.2020, P3, etc.), and high-dynamic-range metadata (ITU/CTA mastering metadata, content light level metadata).

          JPEG XL matches or exceeds these capabilities on paper, but not in practice. The reality is that the world is going to support the JPEG XL capabilities that Apple supports, and probably not much more.
        • arccy (21 hours ago)
          If you actually read your parent comment: "typical web image quality".
          • ansgri (21 hours ago)
            Typical web image quality is what it is partly because of lack of support. It's literally more difficult to show a static HDR photo than a whole video!
            • zozbot234 (21 hours ago)
              PNG supports HDR with up to 16 bits per channel; see https://www.w3.org/TR/png-3/ and the cICP, mDCV and cLLI chunks.
              • lonjil (20 hours ago)
                With incredibly bad compression ratios.
            • mort96 (20 hours ago)
              HDR should not be "typical web" anything. It's insane that websites are allowed to override my system brightness setting through HDR media. There's so much stuff out there that literally hurts my eyes if I've set my brightness such that pure white (SDR FFFFFF) is a comfortable light level.

              I want JXL in web browsers, but without HDR support.
              • magicalhippo (19 hours ago)
                There's nothing stopping browsers from tone mapping[1] those HDR images using your tone-mapping preference.

                [1]: https://en.wikipedia.org/wiki/Tone_mapping
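                For illustration, a minimal extended-Reinhard tone-mapping sketch; the white_point value is an arbitrary assumption, not anything browsers are known to use:

                ```python
                # Compress linear HDR luminance (values may exceed 1.0) into the 0..1 SDR range.
                import numpy as np

                def reinhard(hdr_linear: np.ndarray, white_point: float = 4.0) -> np.ndarray:
                    # Extended Reinhard: luminance equal to `white_point` maps to 1.0.
                    l = hdr_linear
                    mapped = l * (1.0 + l / (white_point ** 2)) / (1.0 + l)
                    return np.clip(mapped, 0.0, 1.0)

                # An HDR highlight at 3x SDR white is compressed (~0.89) instead of clipped.
                print(reinhard(np.array([0.5, 1.0, 3.0])))
                ```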
                • mort96 (19 hours ago)
                  What does that achieve? Isn't it simpler to just not support HDR than to support HDR but tone map away the HDR effect?

                  Anyway, which web browsers have a setting to tone map HDR images such that they look like SDR images? (And why should "don't physically hurt my eyes" be an opt-in setting anyway, instead of just the default?)
                  • jlouis (56 minutes ago)
                    In a modern image chain, capture is more often than not HDR.

                    These images are then graded for HDR or SDR; i.e., sacrifices are made on the image data such that it is suitable for a display standard.

                    If you have an HDR image, it's relatively easy to tone map that into SDR space; see e.g. BT.2408 for an approach in video.

                    The underlying problem here is that the web isn't ready for HDR at all, and I'm almost 100% confident browsers don't do the right things yet. HDR displays have enormous variance, from "slightly above SDR" to experimental displays at Dolby Labs. So to display an image correctly, you need to render it properly to the display's capabilities. Likewise if you want to display an HDR image on an SDR monitor. I.e., tone mapping is a required part of the solution.

                    A correctly graded HDR image taken of the real world will have something like 95% of the pixel values falling within your typical SDR (Rec.709/sRGB) range. You only use the "physically hurt my eyes" values sparingly, and you take the room conditions into consideration when designing the peak value. As an example: cinemas using DCI-P3 peak at 48 nits, because the cinema is completely dark. 48 nits is more than enough for a pure white in that environment. But take that image and put it on a display sitting inside during the day, and it's not nearly enough for a white. Add HDR peaks into this, and it's easy to see that in a cinema, you probably shouldn't peak at 1000 nits (which is about 4.x stops of light above the DCI-P3 peak). In short: rendering to the display's capabilities requires that you probe the light conditions in the room.

                    It's also why you shouldn't be able to manipulate brightness on an HDR display. We need that to be part of the image rendering chain, such that the right decisions can be made.
                  • magicalhippo (16 hours ago)
                    > What does that achieve?

                    Because then a user who *wants* to see the HDR image in all its full glory can do so. If the base image is not HDR, then there is nothing they can do about it.

                    > And why should "don't physically hurt my eyes" be an opt-in setting anyway, instead of just the default?

                    While I very much support more HDR in the online world, I fully agree with you here.

                    However, I suspect the reason will boil down to what it usually does: almost no users ever change the default settings. And so any default which goes the other way will invariably lead to a ton of support cases of "why doesn't this work".

                    However, web browsers are dark-mode aware; they could be HDR aware and do what you prefer based on that.
                    • mort96 (4 hours ago)
                      What user *wants* the web to look like this? https://floss.social/@mort/115147174361502259
                      • magicalhippo (2 hours ago)
                        That video is clearly not encoded correctly. If it were, the levels would match the background, given there is no actual HDR content visible in that video frame.

                        Anyway, even if the video were of a lovely nature scene in proper HDR, you might still find it jarring compared to the surrounding non-HDR desktop elements. I might too, depending on the specifics.

                        However, like I said, it's up to the browser to handle this.

                        One suggestion I saw mentioned by some browser devs was to make the default to tone map HDR if the page is not viewed in fullscreen mode, and switch to the full HDR range if it is fullscreen.

                        Even if that doesn't become the default, it could be a behavior the browser lets the user select.
                    • Dylan16807 (9 hours ago)
                      If you want to avoid eye pain, then you want caps on how much brightness can be in what percentage of the image, not to throw the baby out with the bathwater and disable it entirely.

                      And if you're speaking from iPhone experience, my understanding is that the main problem there isn't extra-bright things in the image, it's the renderer *ignoring your brightness settings* when HDR shows up, which is obviously stupid and not a problem with HDR in general.
                      • mort96 (5 hours ago)
                        If the brightness cap of the HDR image is full SDR brightness, what value remains in HDR? As far as I can see, it's all bathwater, no baby.
                        • Dylan16807 (4 hours ago)
                          > If the brightness cap of the HDR image is full SDR brightness, what value remains in HDR?

                          If you set #ffffff to be a comfortable max, then that would be the brightness cap for HDR flares that *fill the entire screen*.

                          But filling the entire screen like that rarely happens. Smaller flares would have a higher cap.

                          For example, let's say an HDR scene has an average brightness that's 55% of #ffffff, but a tenth of the screen is up at 200% of #ffffff. That should give you a visually impressive boosted range without blinding you.
                          • mort96 (4 hours ago)
                            Oh.

                            I don't want the ability for 10% of the screen to be so bright it hurts my eyes. That's the exact thing I want to avoid. I don't understand why you think your suggestion would help. I want SDR FFFFFF to be the brightest *any* part of my screen goes to, because that's what I've configured to be at a comfortable value using my OS brightness controls.
                            • Dylan16807 (4 hours ago)
                              I *strongly* doubt that the brightness needed to hurt your eyes is the same for 10% of the screen and 100% of the screen.

                              I am not suggesting eye hurting. The opposite, really: I'm suggesting a curve that stays similarly comfortable at all sizes.
                              • mort96 (4 hours ago)
                                I don't want any one part of my screen to be a stupidly bright point light. It's not just the total amount of photons that matters.
                                • Dylan16807 (4 hours ago)
                                  It is not just the total amount.

                                  But it's not the brightest spot either.

                                  It's in between.
                                  • mort96 (4 hours ago)
                                    I just don't want your "in between", "only hurt my eyes a little" solution. I don't see how that's so hard to understand. I set my brightness so that SDR FFFFFF is a comfortable max brightness. I don't understand why web content should be allowed to go brighter than that.
                                    • Dylan16807 (4 hours ago)
                                      I'm suggesting something that WON'T hurt your eyes. I don't see how that's so hard to understand.

                                      You set a comfortable max brightness for the entire screen.

                                      The comfortable max brightness for small parts of the screen is *a different brightness*. Comfortable. NO eye hurting.
                                      • mort96 (4 hours ago)
                                        It's still uncomfortable to have 10% of the screen get ridiculously bright.
                                        • Dylan16807 (4 hours ago)
                                          Yes, it's uncomfortable to have it get "ridiculously" bright.

                                          But there's a level that is comfortable that is *higher* than what you set for FFFFFF.

                                          And the comfortable level for 1% of the screen is even higher.

                                          HDR could take advantage of that to make more realistic scenes without making you uncomfortable. If it was coded right to respect your limits. Which it probably isn't right now. But it *could be*.
                                          • mort96 (4 hours ago)
                                            I severely doubt that I could ever be comfortable with 10% of my screen getting much brighter than the value I set as max brightness.

                                            But say you're right. Now you've achieved images looking completely out of place. You've achieved making the surrounding GUI look grey instead of white. And the screen looks broken when it suddenly dims after switching tabs away from one with an HDR video. *What's the point*? Even ignoring the painful aspects (which is a big thing to ignore, since my laptop currently *physically hurts me at night, with no setting to make it not hurt me*, which I don't appreciate), you're just making the experience of browsing the web *worse*. Why?
                                            • Dylan16807 (4 hours ago)
                                              In general, people report that HDR content looks more realistic and pretty. That's the point, if it can be done without hurting you.
                                              • mort96 (4 hours ago)
                                                Do they? Do people report that an HDR image on a web page that takes up roughly 10% of the screen looks *more realistic*? Do they report that an HDR YouTube video, which mostly consists of a screen recording with the recorded SDR FFF being mapped to the brightness of the sun, looks *pretty*? Do people *like* it when their light-mode GUI suddenly turns grey as a part of it becomes 10x the brightness of what used to be white? (See e.g. https://floss.social/@mort/115147174361502259)

                                                Because that's what HDR web content is.

                                                HDR movies playing on a living-room TV? Sure, nothing against that. I mean, it's stupid that it tries to achieve some kind of absolute brightness, but in principle, *some* form of "brighter than SDR FFF" could make sense there. But for web content, surrounded by an SDR GUI?
                                                • Dylan16807 (3 hours ago)
                                                  > when their light-mode GUI suddenly turns grey as a part of it becomes 10x the brightness of what used to be white

                                                  I don't know why you're asking me about examples that violate the rules I proposed. No, I don't want that.

                                                  And obviously boosting the brightness of a screen capture is bad. It would look bad in SDR too. I don't know why you're even bringing it up. I am aware that HDR can be done wrong...

                                                  But for HDR videos where the HDR actually makes sense, yeah, it's fine for highlights in the video to be a little brighter than the GUI around them, or for tiny little blips to be significantly brighter. Not enough to make it look grey like the misbehavior you linked.
                                                  • mort96 (3 hours ago)
                                                    > I don't know why you're asking me about examples that violate the rules I proposed. No, I don't want that.

                                                    Other than the exaggerated 10x, I don't understand how it violates the rules you proposed. You proposed a scheme where part of the screen should be allowed to be significantly brighter than the surrounding SDR GUI's FFF. That makes the surrounding GUI look grey.

                                                    > And obviously boosting the brightness of a screen capture is bad. It would look bad in SDR too. I don't know why you're even bringing it up.

                                                    I'm bringing it up because *that's how HDR looks on the web*. Most web content isn't made by professional movie studios.

                                                    The example video I linked conforms with your suggested rules, FWIW: most of the image is near black, and only a relatively small part of it is white. The average brightness probably isn't over SDR FFF. *Yet it still hurts.*
                                                    • Dylan16807 (3 hours ago)
                                                      The whole chip in the middle is brighter than white. Half that video is super bright, making this example way more than I was suggesting in both area and average brightness.

                                                      > most of the image is near black, and only a relatively small part of it is white. The average brightness probably isn't over SDR FFF.

                                                      It's a lot more than I suggested, and I said average brightness *half* of FFF for my example.

                                                      Also, if I knew you were going to hammer on the loose example numbers, I would have said 2% or 1%.

                                                      > I'm bringing it up because that's how HDR looks on the web.

                                                      But I'm not defending how it looks. I'm defending how it *could* look, since you don't see why anyone would even *want* HDR on the web.
                                                      • mort96 (3 hours ago)
                                                        This is going on for too long. Maybe you can somehow find a way to process all HDR content so that it's reasonable (i.e. never makes the surrounding SDR GUI look grey, never makes bright spots which are bright enough to hurt) across all screens imaginable and all contexts. Maybe. I have my doubts, but go ahead.

                                                        Convince the web standards bodies and browser implementers, and transform the world into one where HDR on the web is perfect and never causes issues.

                                                        But until that's done, there's a simple solution: just don't support HDR. Until your hypothetical perfect solution is universally implemented, it does more harm than good on the web and should not be supported.

                                                        I don't see why anyone would want HDR on the web *in its current form*.
                                                        • Dylan16807 (3 hours ago)
                                                          Well, the reason I was talking about limits that way is that it's something screens already do when displaying HDR content. They can't go full power over much area, and the bigger you go, the dimmer your limit gets. So: repurposing those existing algorithms with some tweaking.

                                                          It's not very hard on a technical level.

                                                          And no, it doesn't have to be *universal* and *perfect* to reach the point that HDR is a benefit. There are some blatant flaws that need fixing, and just a few fixes would get us a lot closer.
                                                          • mort96 (3 hours ago)
                                                            It has to be universally not harmful.

                                                            Again, go convince standards bodies and browser implementers to implement those algorithms, after doing studies to demonstrate that they fix the issue. Until then, I just don't want it in my browser.
                    • adgjlsfhk1 (7 hours ago)
                      It actually is somewhat of an HDR problem, because the HDR standards made some dumb choices. SDR standardizes relative brightness, but HDR uses absolute brightness, even though that's an obviously dumb idea, and in practice no one with a brain actually implements it.
                  • spider-mario (16 hours ago)
                    How about a user stylesheet that uses https://www.w3.org/TR/css-color-hdr-1/#the-dynamic-range-limit-property ?
                    • mort96 (16 hours ago)
                      How about websites just straight up aren't allowed to physically hurt me, *by default*?
                      • NetMageSCW (12 hours ago)
                        Websites aren't made for just you. If images from your screen are causing you issues, that is a you / your-device problem, not a website problem.
                        • mort96 (5 hours ago)
                          I agree, it's not a website problem. It's a web standards problem that it's possible for websites to do that.
                      • spider-mario (15 hours ago)
                        You asked "which web browsers have a setting to tone map HDR images such that they look like SDR images?"; I answered. Were you not actually looking for a solution?
                        • mort96 (14 hours ago)
                          I was looking for a setting, not a hack.
    • twotwotwo (21 hours ago)
      Wanted to note https://issues.chromium.org/issues/40141863 on making the lossless JPEG recompression a Content-Encoding, which would let, say, a CDN deploy it in a way that's fully transparent to end users (if the user clicks Save, it would save a .jpg).

      (And: this is great! I think JPEG XL has a chance of being adopted with the recompression "bridge" and fast decoding options, and things like progressive decoding for its VarDCT mode are practical advantages too.)
    • kps (22 hours ago)
      > (in Rust?)

      Looks like that's the idea: https://issues.chromium.org/issues/462919304
    • kllrnohj (21 hours ago)
      > and its HDR support is generally much better than AVIF's

      Not anymore. JPEG, weirdly enough, had the best HDR support with ISO 21496-1, but AVIF also just recently got that capability with 1.2 (https://aomedia.org/blog%20posts/Libavif-Improves-Support-for-HDR-Imagery/).

      The last discussion in libjxl about this seemingly took the stance that it wasn't necessary since JXL has "native HDR", which completely fails to understand the problem space.
      • lonjil (20 hours ago)
        The JXL spec already has gainmaps...

        Also, just because there's a spec for using gainmaps with JPEG doesn't mean that it works well. With only 8 bits of precision, it really sucks for HDR, gainmap or no gainmap. You just get too much banding. JXL otoh is completely immune to banding, with or without gainmaps.
        • kllrnohj (20 hours ago)
          > With only 8 bits of precision, it really sucks for HDR, gainmap or no gainmap. You just get too much banding.

          This is simply not true. In fact, you get *less* banding than you do with 10-bit BT.2020 PQ.

          > JXL otoh is completely immune to banding

          Nonsense. It has a lossy mode (which is its primary mode, so to speak), so of course it has banding. Only lossless codecs can plausibly be claimed to be "immune to banding".

          > The JXL spec already has gainmaps...

          Ah, looks like they added that sometime last year, but decided to call it "JHGM", made almost no mention of it in the issue tracker, and didn't bother updating the previous feature requests asking for this that are still open.
          • spaceducks (17 hours ago)
            > Nonsense. It has a lossy mode (which is its primary mode, so to speak), so of course it has banding. Only lossless codecs can plausibly be claimed to be "immune to banding".

            Color banding is not a result of lossy compression*; it results from not having enough precision in the color channels to represent slow gradients. VarDCT, JPEG XL's lossy mode, encodes values as 32-bit floats. In fact, image bit depth in VarDCT is just a single value that tells the decoder what bit depth it should output to, *not* what bit depth the image is encoded as internally. Optionally, the decoder can even blue-noise dither it for you if your image wants to be displayed at a higher bit depth than your display or software supports.

            This is more than enough precision to prevent *any* color banding (assuming, of course, the source data that was encoded into a JXL didn't have any banding either). If you still want more precision for whatever reason, the spec just defines the values in the XYB color channels as real numbers between 0 and 1, and the header supports signaling an internal depth of up to 64 bits per channel.

            * Technically, color banding *could* result from "lossy compression" if high-bit-depth values are quantized to lower-bit-depth values. However, with sophisticated compression, higher bit depths often compress better, because transitions are less harsh and as such need fewer high-frequency coefficients to be represented. Even in lossless images, slow gradients can be compressed better if they're high bit depth, because frequent, consistent changes in pixel values can be predicted better than sudden, occasional changes (like suddenly transitioning from one color band to another).
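            The precision effect described above is easy to demonstrate by quantizing a slow gradient (a minimal sketch; the ramp values are arbitrary):

            ```python
            # Quantize a smooth ramp to 8 and 16 bits and count the distinct steps.
            import numpy as np

            gradient = np.linspace(0.40, 0.45, 3840)   # slow ramp across a 4K-wide row
            q8 = np.round(gradient * 255) / 255        # 8-bit-per-channel storage
            q16 = np.round(gradient * 65535) / 65535   # 16-bit storage

            print(len(np.unique(q8)))   # ~14 distinct levels -> visible bands
            print(len(np.unique(q16)))  # ~3300 levels -> smooth to the eye
            ```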
    • 12_throw_away (18 hours ago)
      > performant and memory-safe decoder (in Rust?).

      Isn't this exactly the case that Wuffs [1] is built for? I had the vague (and, looking into it now, probably incorrect) impression that Google was going to start building all their decoders with it.

      [1] https://github.com/google/wuffs
      • lonjil (18 hours ago)
        Wuffs only works for very simple codecs. It's basically useless for anything complex enough that memory bugs would be common.
  • jlouis (1 hour ago)
    This is welcome.

    AVIF is trying to be a distribution format for the web. JPEG XL is trying to be a complete package for working with image data. JPEG XL can replace OpenEXR in many workflows; AVIF simply cannot.

    There's a lot of power in not having to convert for distribution.
  • FerritMans (22 hours ago)
    Love this; I've been waiting for Google to integrate this. From my experience with AVIF and JPEG XL, JPEG XL is much more promising for the next 20 years.
  • h1fra (1 hour ago)
    WebP was a nice new format, now widely adopted in browsers, yet it's barely supported in websites (upload) and software. It's hard to see this being different.
    • spaceducks (1 hour ago)
      WebP is much more limiting than JPEG XL. In lossy mode, WebP has forced 4:2:0 chroma subsampling, supports only 8-bit-per-channel colors (really only about 7.8 bits, because thanks to WebP being TV-range, the values aren't in a 0-255 range but in a 16-235 range, causing even more color banding than 8 bits per channel already has), no HDR, and a maximum resolution of 16383 x 16383, making it unsuitable for larger images...

      JPEG XL, on the other hand, supports up to 4099 color channels, a bit depth of up to 32 bits per channel (technically up to 64 bits, but this currently isn't used), supports HDR natively, can use splines to compress elements like strands of hair, thin tree branches, or line art that are typically hard to compress with DCT, supports patches for compressing repeating image elements, supports thermal, depth, and alpha channels, supports losslessly recompressing existing JPEGs (saving about 20%), supports CMYK and spot colors for printing, supports layers and selection masks, supports storing raw camera sensor data in Bayer patterns, etc.

      WebP is just a web delivery format; JPEG XL was designed to support many use cases, like web delivery, medical imaging, raw camera sensor data, authoring, multi-spectral imaging... the list goes on. If we support JPEG XL now, chances are it'll be quite a while before we need a new general-purpose image format, because JPEG XL covers so many current use cases and was designed to accommodate potential future use cases as well.
  • masswerk (22 hours ago)
    A nice example of how a standard like PDF can even persuade/force one of the mighty to adopt a crucial bit of technology, so that it may become a common standard in its own right (i.e. "cascading standards").
  • gen2brain (19 hours ago)
    I like how even the bonus product (jpegli) is a significant improvement. I am in the process of converting my comic book collection. I save a lot of space and still use JPEG, which is universally supported.
  • gethly (6 hours ago)
    jpg -> png -> webp -> avif

    Why are we going backward?
  • Pxtl (22 hours ago)
    > Lossless JPEG recompression (byte-exact JPEG recompression, saving about 20%) for legacy images

    Lossless recompression is the main interesting thing on offer here compared to other new formats... and honestly, with only a 20% improvement, I can't say I'm super excited by this, compared to the pain of dealing with yet another new image format.

    For example, ask a normal social media user how they feel about .webp and expect to get an earful. The problem is that even if your browser supports the new format, there's no guarantee that every other tool you use supports it, from the OS to every site you want to re-upload to, etc.
    • BeFlatXIII (9 minutes ago)
      > For example, ask a normal social media user how they feel about .webp and expect to get an earful.

      I've seen enough software that gets petulant about not supporting WebP, to fight the Google monopoly or whatever, to understand their frustration.
    • F3nd0 (21 hours ago)
      If I remember correctly, WebP was single-handedly forced into adoption by Chrome, while offering only marginal improvements over existing formats. Mozilla even worked on an improved JPEG encoder, MozJPEG, to show it could compete with WebP very well. Then came HEIF and AVIF, which, like WebP, were just repurposed video codecs.

      JPEG XL is the first image format in a long while that's actually been designed for images, and it brings a substantial improvement to quality while *also* covering a wide range of uses and preserving features that video codecs don't have. It supports progressive decoding, seamless very large image sizes, a potentially large number of channels, is reasonably resilient against generation loss, and more. The fact that it has no major drawbacks alone gives it much more merit than WebP has ever had. Lossless recompression is in addition to all of that.

      The difference is that this time around, Google has single-handedly held back the adoption of JPEG XL, while a number of other parties have expressed interest.
      • Dwedit (21 hours ago)
        Having a PNG go from 164.5K to 127.1K as lossless WebP is not what I'd call "marginal". An improvement of over 20% is huge for lossless compression.

        Going from lossless WebP to lossless JXL is marginal, though, and not worth the big decode performance loss.
        • F3nd0 (18 hours ago)
          In the context of the parent comment, 'only a 20% improvement' is not super exciting 'compared to the pain of dealing with yet another new image format'.

          You raise a good point, though; WebP certainly did (and continues to do) well in some areas, but at the cost of lacking in others. Moreover, when considering a format for adoption, one should compare it with the other candidates for adoption, too. And years before WebP gained widespread support in browsers, it had competition from other interesting formats like FLIF, which addressed some of its flaws, and I have to wonder how it compares to the even older JPEG 2000.
        • lonjil (20 hours ago)
          Since the person you replied to mentioned MozJPEG, I have to assume they meant that WebP's lossy capabilities were a marginal improvement.
      • halapro (19 hours ago)
        You're not being fair. WebP has been the only choice for lossy image compression with an alpha channel. Give it some credit.
        • F3nd0 (18 hours ago)
          Fair point, though not entirely true: you can run an image through lossy compression and store the result in a PNG, using tools like pngquant [1]. Likely not as efficient for many kinds of images, but totally doable.

          [1] https://pngquant.org/
      • tempest_ (20 hours ago)
      20% is massive for those storing those social media images though.
        • Pxtl (19 hours ago)
          I get that there are people who *are* super excited by this for very good reasons, but for those of us downstream, this is just going to be a hassle.
      • spider-mario (18 hours ago)
        Since the recompression is lossless, you don't need every tool you use to support it, as long as one of them can do the decompression back to JPEG. This sounds a bit like complaining that you can't upload .7z everywhere.
        • Pxtl (17 hours ago)
          AFAIK, downconverting to JPEG is only an option for legacy JPEGs that have been upconverted to JPEG XL, though. Many JPEG XL images likely *won't* support downconverting if they were created as JXL from the get-go.

          Basically, jpeg -> jxl -> jpeg is a perfectly lossless conversion, but a newly made jxl -> jpeg is not, even if it doesn't use modern JXL-only features like alpha channels.

          With that in mind, I'd actually prefer if those were treated as separate file formats with distinct file extensions (backwards-compatible jpeg->jxl vs pure JXL). The former could be trivially handled with automated tools, but the latter can't.
          • spaceducks (17 hours ago)
            I'm not sure that will be an issue in practice. In any case, you need a JPEG XL decoder to perform the transition from a recompressed-JPEG JXL back to the original JPEG, so whatever tool is doing this can already handle native JXL too. It could be that the conversion happens on the server side and the client always sees JPEG, in which case a native JXL can *also* be decoded to a JPEG (or, if lossless, a PNG), though obviously with information loss, since JPEG is a subset of JXL (to put it lightly).
          • spider-mario (16 hours ago)
          Well, sure, but wasn’t that the use case we were discussing?
            • Pxtl (15 hours ago)
              Right. And that particular use case sounds nice, but realistically this new format will not be *exclusively* used in that particular case.

              Dealing with basically another .webp-like format in those cases (one that *might* be a backwards-compatible JPEG, or might not, and determining that can only be done by inspecting the file contents) doesn't sound super fun.

              So ideally, to make up names, I wish they'd used separate extensions, so that a ".jp3" is a file that can be downconverted to a .jpg (and you could get a browser extension to automate that for you if you wanted), and a ".jxl" is the new file format that's functionally another ".webp"-like thing to deal with, with all the pain points that implies.
              • spaceducks (4 hours ago)
                The names and extensions of JPEG XL files aren't specified, except that the IANA media type is `image/jxl`. I think an argument could be made for using the double-extension convention when the encoder performs lossless JPEG recompression, so image.jpg becomes image.jpg.jxl (while not entirely semantically correct, since it's not an additional layer of compression around the JPEG; it's a reimplementation of the image using coding features identical to JPEG's, in JXL).

                But like I said in my other comment (which got hidden for some reason), it should be noted that a recompressed JPEG *is also* a valid JXL on its own. If you have the means to turn a recompressed JPEG back into the original, you also have the means to view native JXLs.

                Hopefully adoption is widespread and we won't really have to worry about it. JPEG XL is a much more appealing format than WebP, and unlike WebP, there are great arguments for software to support it *other than* "Google started using them, so they're everywhere now."
      • 7jjjjjjj (19 hours ago)
        I think there's a difference here.

        If I right-click-save and get a WebP, it was probably converted from a JPG. Very, very few images are uploaded as WebP. So getting a WebP image means you've downloaded an inferior version.

        JXL doesn't have this issue, because conversion from JPEG is lossless. So you've still gotten the real, full-quality image.
        • Pxtl (17 hours ago)
          Let's be realistic - when most users are upset they got a .webp, they're not annoyed because of quality loss; they're annoyed because they can't immediately use it in many other services & software.
          • mubou2 (8 hours ago)
            This is still a problem with AVIF, too. Image viewers that support the format often don't support animated AVIFs, and even GitHub still, for some godforsaken reason, treats .avif files in a repo/PR as binary files instead of images. I think Discord just recently started supporting AVIFs, so that's progress.
  • ChrisArchitect (22 hours ago)
    [dupe] https://news.ycombinator.com/item?id=46021179
  • albert_e (21 hours ago)
    > Chrome Jpegxl Issue Reopened

    > (this is the tracking bug for this feature)

    Is it just me, or is it confusing to use the terms issue / bug / feature interchangeably?
    • mort96 (20 hours ago)
      It's not really used interchangeably: "bug" is used to mean "entry in the bug tracker database", while "feature" is used to mean what we colloquially think of as a feature of a computer program.

      It's arguably a slight abuse of a bug-tracking system to also track progress and discussion on features, but it's not exactly uncommon; it's just that many systems would call it an "issue" rather than a "bug".
    • mubou2 (8 hours ago)
      Google's internal issue tracker, Buganizer (which the Chromium Issue Tracker is based on), refers to everything as a "bug". It's confusing, yeah. You get used to it.
    • nandomrumber (20 hours ago)
      Maybe more like a heading bug:

      https://aviation.stackexchange.com/questions/23166/what-is-the-use-of-heading-bug
    • crazygringo (19 hours ago)
      Not really - they're all "potential todos" that need to be tracked and prioritized in the same place.

      And the difference between a bug and a feature is often in the eye of the beholder. I'll very often title a GitHub issue with "Bug/Feature Request:", since it's often debatable whether the existing behavior was by design or not, and I don't want to presume one way or the other.

      So I do consider them all pretty interchangeable at the end of the day, and therefore not really confusing.
  • claudiojulio (23 hours ago)
    [flagged]
    • fsflover (18 hours ago)
      This comment is of course breaking the HN guidelines as a shallow dismissal, but the parent is right: after Google killed uBlock Origin and turned Android into a nanny OS, I have no idea why anyone would stick with anything from them. Also, Firefox is better in almost every way.