GCC 16 considering changing default to C++20 (inbox.sourceware.org)
109 points by pjmlp 2 days ago | 115 comments
  • withzombies2 days ago

    Shouldn't the compilers be on the bleeding edge of the standards? What is the downside of switching to the newest standard when it's properly supported?

    It's the type of dogfooding they should be doing! It's one reason people care so much about self-hosted compilers: it's a demonstration of the maturity of the language/compiler.

    • cogman102 days ago |parent

      There's a bootstrapping process that has to happen to compile the compiler. Moving up the language-standard chain requires that the compilers compiling the compiler also migrate up the chain.

      So you can never be perfectly bleeding edge as it'd keep you from being able to build your compiler with an older compiler that doesn't support those bleeding edge features.

      Imagine, for example, that you are Debian and you want to prep for the next stable version. It's reasonable that for the next release you'd bootstrap with the prior release's toolset. That gives you a stable starting point.

      • stabbles2 days ago |parent

        This is not the case. They are discussing the default value of `g++ -std=...`. That does not complicate bootstrapping as long as the C++ sources of GCC are compatible with older and newer versions of the C++ standard.

        • cogman102 days ago |parent

          > as long as the C++ sources of GCC are compatible with older and newer versions of the C++ standard.

          I've worked on a number of pretty large projects. If the target for the source code changes, it can be really hard to keep C++20 features from creeping in. It means that you either need to explicitly build targeting 11, or whoever does code reviews needs encyclopedic knowledge of whether a change has let a newer-standard feature leak in.

          It is "doable" but why would you do it when you can simply keep the compiler targeting 11 and let it do the code review for you.

          • bluGill2 days ago |parent

            Compilers often allow things in 11 that technically are not there until some later standard. Or sometimes things they have always allowed finally got standardized in a later version. Setting your standard to 11, if that is what you want to target, is a good first step, but don't depend on it - the real test is whether all the compilers you care to support compile your code.

            Even if you only target 11, there may be advantages to setting a newer version anyway. Sometimes a newer standard finally allows an optimization that would work, or disallows something that was always error-prone anyway. I would recommend you set your standard to the latest the compiler supports and fix any bugs. Solve your "we have to support older standards" problem by having your CI system build with an older compiler (and also the newest one). C++ is very good at compatibility, so this will rarely be a problem.
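
            One way to make that dual-compiler CI setup workable is to gate newer constructs on feature-test macros. A minimal sketch (the function name is just illustrative); it builds under -std=c++11 as well as -std=c++17 and later:

              #include <cstdio>

              template <typename T>
              void print_width() {
              #if defined(__cpp_if_constexpr)   // defined from C++17 onwards
                  if constexpr (sizeof(T) > 4) std::puts("wide type");
                  else                         std::puts("narrow type");
              #else                             // C++11/14 fallback path
                  if (sizeof(T) > 4) std::puts("wide type");
                  else               std::puts("narrow type");
              #endif
              }

              int main() {
                  print_width<long long>();
                  print_width<char>();
                  return 0;
              }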

          • quietbritishjim2 days ago |parent

            > ... why would you do it when you can simply keep the compiler targeting 11 ...

            It doesn't appear to me that the parent comment was implying otherwise.

            The default is changing for any compilation that doesn't explicitly specify a standard version. I would have thought that the build process for a compiler is likely careful enough that it does explicitly specify a version.

            • cogman102 days ago |parent

              > It's the type of dogfooding they should be doing! It's one reason people care so much about self-hosted compilers: it's a demonstration of the maturity of the language/compiler.

              I could be misreading this, but unless they have a different understanding of dogfooding than I do, it seems like the proposal is to use C++20 features in the compiler bootstrapping.

              • ziotom782 days ago |parent

                I believe they are really referring to the default mode used by GCC when no standard is explicitly stated.

                The email mentions that the last time they changed it was 5 years ago in GCC 11, and the link <https://gcc.gnu.org/projects/cxx-status.html#cxx17> indeed says

                > C++17 mode is the default since GCC 11; it can be explicitly selected with the -std=c++17 command-line flag, or -std=gnu++17 to enable GNU extensions as well.

                which does not imply a change in an obscure feature (bootstrapping) that would only affect a few users.

          • eddd-ddde2 days ago |parent

            The answer is obvious, YES, specify your language version. Every single compiler invocation for production (for example ci builds) should explicitly select a version. Otherwise you are asking for trouble.

            • stabbles2 days ago |parent

              Yeah, developers should specify what language and dialect a project is written in. In practice though, support for that in build systems is cumbersome.

              For example, in CMake the natural variable is CMAKE_CXX_STANDARD, but it's implemented backwards: if you set it to 14 but your compiler supports only C++11, it will add -std=gnu++11. You also have to set CMAKE_CXX_STANDARD_REQUIRED to ON, which not many projects do. I don't think there's an easy way to say "this project requires C++14 or higher".
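
              A source-level fallback (a sketch, not a build-system feature; the header name is hypothetical) is to assert the minimum dialect in the code itself, so the requirement holds no matter what -std= flag or compiler default the build ends up with:

                // require_cxx14.hpp -- include first in every translation unit.
                // Note: MSVC reports the real value of __cplusplus only when
                // built with the /Zc:__cplusplus switch.
                #if __cplusplus < 201402L
                #error "This project requires C++14 or newer; build with -std=c++14 or later."
                #endif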

              • menaerus2 days ago |parent

                There is - you simply build your code with -std=c++XY, and if your toolchain doesn't support it (which one btw doesn't support at least c++17?), it will simply error out, no? That should be a pretty strong signal that your code requires XY standard without having to go into the CMake territory. Even if you want to, I see nothing wrong with the way they are implemented. Two simple variables doing two simple things.

      • rmu092 days ago |parent

        Aren't they talking about the C++ dialect the compiler expects without any further -std=... arguments? How does that affect the bootstrapping process? This https://gcc.gnu.org/codingconventions.html should define what C/C++ standard is acceptable in GCC itself.

        • cogman102 days ago |parent

          The way I read withzombies's comment (and it could be wrong) was that they were talking about the language version of the compiler's source. I assumed that from the "dogfooding" portion of the comment.

        • maxlybbert2 days ago |parent

          Correct, this is a discussion of which language version the compiler should follow if the programmer doesn’t specify one. It’s not about which features are acceptable when implementing the compiler.

      • kstrauser2 days ago |parent

        Counterpoint: you could write a C++ compiler in a non-C/C++ language such that the compiler’s implementation language doesn’t even have the notion of C++20.

        A compiler is perfectly capable of compiling programs which use features that its own source does not.

        • cxr2 days ago |parent

          That's not a counterpoint—at least not to anything in the comment that you're (nominally) "responding" to.

          So why has it been posted as a reply, and why label it a counterpoint?

          • kstrauser2 days ago |parent

            Read them again a couple more times and it may become clear.

            The prior post seemed to be claiming that this required any form of a bootstrapping process, when it does not.

            • wat100002 days ago |parent

              This particular compiler does require bootstrapping, and that's obviously what "the compiler" is referring to in that comment.

              Building your compiler in another language doesn't help at all. In fact, it just makes it worse. Dogfooding C++20 in your compiler that isn't even built in C++ is obviously impossible.

              • kstrauser2 days ago |parent

                It absolutely does not. There is no part of C++20 that requires the implementing compiler to be written in C++20.

                My original point is that you can write a compiler for any language in any language.

                • wat100002 days ago |parent

                  What is "It absolutely does not" responding to? I didn't say anything about a C++20 compiler needing to be written in C++20.

                  • kstrauser2 days ago |parent

                    You said:

                    > This particular compiler does require bootstrapping, and that's obviously what "the compiler" is referring to in that comment.

                    You have to pick an option: either it requires bootstrapping, or it doesn’t.

                    As it’s possible to write the C++20 compiler features in C++11 (or whatever GCC or Clang are written in these days), it factually does not require bootstrapping.

                    • wat100002 days ago |parent

                      Here, "requires bootstrapping" means "gcc needs to be able to build with gcc, including older versions of gcc."

                      • kstrauser2 days ago |parent

                        This is going in circles and this is my last comment on it, but here is what I originally replied to:

                        > So you can never be perfectly bleeding edge as it'd keep you from being able to build your compiler with an older compiler that doesn't support those bleeding edge features.

                        …as though building the new version of the compiler depended on the features it’s implementing already existing. This is clearly not the case.

                        • wat100002 days ago |parent

                          The sentence you've quoted is explaining why a new version of the compiler cannot depend on the new features it's implementing. I.e. the first gcc version that supports C++20 cannot be written in C++20.

                          Which, as you say, is clearly not the case.

                          I have no idea how you managed to misread the comment so badly, but there we are.

                        • cxr2 days ago |parent

                          You're hallucinating a non-existent premise to the actual conversation that occurred.

                          The person you responded to answered the question posed by the person that they responded to. And they answered it correctly. Your "counterpoints" are counterpoints to an imaginary argument/claim that no one has actually made. The reason why it's not part of the quote that you pulled out of the other comment is that there's no way to quote the other person saying what you're trying to frame them as having said, because it's not what they were saying. This entire subthread is the result of an unnecessary attempt at a correction that doesn't manage to correct anyone about anything.

                • cxr2 days ago |parent

                  > My original point is that you can write a compiler for any language in any language.

                  A perfectly fine observation on its own—but it's not on its own. It's situated in a conversational context. And the observation is in no way a counterpoint to the person you posted your ostensible reply to.

                  Aside from that, you keep saying "bootstrapping" as in whether or not this or that compiler implementation strategy "requires bootstrapping". But writing a compiler in different source language than the target language it's intended to compile and using that to build the final compiler doesn't eliminate bootstrapping. The compiler in that other language is just part of the bootstrapping process.

            • cxr2 days ago |parent

              You have lost the plot, and you are wrong.

    • unclad59682 days ago |parent

      Well, there are still some C++20 items that aren't fully supported, at least according to cppreference.

      https://en.cppreference.com/w/cpp/compiler_support/20.html

      • withzombies2 days ago |parent

        Yeah, I think it's because none of the compilers are obligated to support the standard and things get added that never get implemented.

        A good example is the C++11 standard garbage collection! It was explicitly optional, but afaik no one implemented it.

        https://isocpp.org/wiki/faq/cpp11-library#gc-abi
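
        For reference, this is the optional GC support API in question - it lives in <memory>, is a no-op on the major standard libraries, and was removed again in C++23. A sketch (builds as C++11 through C++20):

          #include <memory>

          int main() {
              int* p = new int(42);
              std::declare_reachable(p);           // hint: never collect this object
              p = std::undeclare_reachable(p);     // back under normal rules
              // Typical implementations report pointer_safety::relaxed,
              // i.e. "we don't actually track any of this".
              std::pointer_safety s = std::get_pointer_safety();
              (void)s;
              delete p;
              return 0;
          }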

    • andsoitis2 days ago |parent

      > Shouldn't the compilers be on the bleeding edge of the standards? What is the downside of switching to the newest standard when it's properly supported?

      C++ standards support and why C++23 and C++26 are not the default: https://gcc.gnu.org/projects/cxx-status.html

    • ajross2 days ago |parent

      > What is the downside of switching to the newest standard when it's properly supported?

      Backwards compatibility. Not all legal old syntax is necessarily legal new syntax[1], so there is the possibility that perfectly valid C++11 code exists in the wild that won't build with a new gcc.

      [1] The big one is obviously new keywords[2]. In older C++, it's legal to have a variable named "requires" or "consteval", and now it's not (see the snippet after this comment). Obviously these aren't huge problems, but compatibility is important for legacy code, and there is a lot of legacy C++.

      [2] Something where the C++ and C standards writers have diverged in philosophy. C++ makes breaking changes all the time, whereas C really doesn't (new keywords are added in an underscored namespace and you have to use new headers to expose them with the official syntax). You can build a 1978 K&R program with "cc" at the command line of a freshly installed Debian Unstable in 2025 and it works[3], which is pretty amazing.

      [3] Well, as long as it worked on a VAX. PDP-11 code is obviously likely to break due to word size issues.
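
      A minimal illustration of the keyword point from footnote [1] (hypothetical snippet):

        int main() {
            int requires = 1;   // fine under -std=c++11/14/17,
                                // error under -std=c++20: 'requires' is now a keyword
            return requires;
        }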

      • menaerus2 days ago |parent

        > C++ makes breaking changes all the time,

        Please don't spread misinformation. Breaking changes are actually almost nonexistent in C++. The last one was the COW std::string and std::list, ~15 years ago, with the major switch from C++03 to C++11. And heck, even then GCC wouldn't let your code break, because it supported dual ABIs - you could mix C++03 and C++11 code and link them together.

        So C++ actually tries really hard _not_ to break your code, and that is the philosophy behind a language adhering to something called backwards compatibility, you know? Something many, such as Google, opposed and left the committee/language over. I thank the C++ language for that.

        Introducing new features or new keywords, or making the implementation of existing ones stricter (such as narrowing integral conversions), is not a breaking change.
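
        For reference, the dual ABI mentioned above is selected per translation unit with a macro; a minimal sketch (the macro is specific to GCC's libstdc++):

          // Opt this translation unit into the old (pre-C++11, COW) std::string
          // and std::list ABI so it can link against legacy C++03 binaries.
          // Must be defined before any standard library header is included.
          #define _GLIBCXX_USE_CXX11_ABI 0
          #include <string>

          std::string legacy_greeting() { return "built against the old ABI"; }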

        • ajross a day ago |parent

          > Introducing [...] new keywords [...] is not a breaking change.

          This is some kind of semantic prestidigitation around a definition for "breaking" that I'm not following. Yes, obviously it is. New keywords were valid symbol names before they were keywords.

          Makes me wonder if the "don't spread misinformation" quip was made in good faith.

          • menaerus a day ago |parent

            It was. And no, a breaking change is not what you seem to imply it is. When talking about breaking changes, introducing new keywords is not what people usually think of. It's irrelevant.

            • ajross a day ago |parent

              Sigh. "Irrelevant" except in the sense that the code used to build and now it doesn't after a gcc upgrade, you mean. This is an attitude shared (c.f. "greenfield" comments elsewhere in this tree) among a bunch of people who do intensive development and maintenance on active products all the time.

              That use case is probably less than 20% of all the C++ development cycles out there. This is a 40-year-old language that the bulk of the industry has decided to abandon for new work. The large majority of people doing work on this code are doing minimal-change updates, and nonsense like this is how you end up with rules like "We have to deploy on Ubuntu 20.04 still because the AbandonWare 4.7 library doesn't work on later versions".

              And it's avoidable, but not if you run around lying to people and yourself about what a breaking change is. Again, look at how C does this. The C standard writers actually know that they're updating a legacy environment and care deeply about full backwards compatibility.

      • tcfhgj2 days ago |parent

        Well, shouldn't not-up-to-date code be the one to use the corresponding compiler flag, rather than someone starting a greenfield project, who might then write outdated code?

        • ajross2 days ago |parent

          No? The "corresponding compiler flag" is a new feature. I mean, who told folks at Bell Labs in 1978 how the GCC --std= arguments would work in the coming decades? Legacy code is legacy, it doesn't know it needs to use the correct flags. When it was a greenfield project, it was the default!

          Like, think about it: if you think the defaults should be good for greenfield projects, then greenfield projects won't be using the correct flags (because if they are, then the whole argument is specious anyway). And when C++34 shows up, they're going to be broken and we'll have this argument again.

          Compatibility is hard. But IMHO C++ and gcc are doing this wrong and C is doing it much better.

          • plorkyeran2 days ago |parent

            GCC's default has already changed before (most recently to C++17 in GCC 11). It did not cause any significant problems, and any software which is relying on the current value was created long after the flags to pick a standard version were added.

    • dagmx2 days ago |parent

      This is about changing the default.

      The issue with defaults is that people have projects that implicitly expect the default to be static.

      So when the default changes, many projects break. This is maybe fine if it’s your own project but when it’s a few dependencies deep, it becomes more of an issue to fix.

      • reactordev2 days ago |parent

        If you’re relying on defaults, and upgrade, that is entirely your fault. Don’t hold everyone in the world back because you didn’t want to codify your expectations.

      • dietr1ch2 days ago |parent

        So it has the added benefit of having people learn how to set up their projects properly? Great.

      • bluGill2 days ago |parent

        C++ is very good at compatibility. If your code breaks when the standard changes, odds are it was always broken and you just didn't know. C++ isn't perfect, but it is very good.

        • wat100002 days ago |parent

          On the other hand, if you didn't know your code was broken then it probably wasn't broken in a way that's catastrophic to whatever you use it for.

      • nomel2 days ago |parent

        Do you have an example? Adding the `--std=<whatever you're using now here>` flag should work, which you should already be using anyways. Is the issue that you don't want to use that argument?

      • MichaelZuo2 days ago |parent

        That sounds more like a problem of nonsensical assumptions… what possible expectation could there have been that GCC would never change this in the future?

        • jayd162 days ago |parent

          The assumption is along the lines of "this works so why should I ever think about it again if I don't have to?"

          It's not an end user problem, anyway. The issue is the language didn't change in a backwards compatible way and also didn't require setting a language version.

    • 17186274402 days ago |parent

      > What is the downside of switching to the newest standard when it's properly supported?

      They are discussing in this email thread whether it is already properly supported.

      > It's one reason why people care so much about self-hosted compilers

      For self-hosting and bootstrapping you want the compiler to be compilable with as old a version as possible.

    • BeetleB2 days ago |parent

      > What is the downside of switching to the newest standard when it's properly supported?

      "Properly supported" is the key here. Does GCC currently properly support C++23, for example? When I checked a few months ago, it didn't.

      • jcelerier2 days ago |parent

        Where do you draw the line for properly supported? I've been using g++ in C++23 mode for quite some time now - even if not every feature is entirely implemented, the ones that work, work well and are a huge improvement.

        • tlb2 days ago |parent

          I draw the line where I can't expect the default gcc on most Linux and Mac systems to compile my code. And I don't want to force them to install a particular compiler. -std=c++20 seems to work pretty reliably these days.

          We're starting to need caniuse.com for C++.

          • cemdervis2 days ago |parent

            https://cppstat.dev

            • tlb2 days ago |parent

              Aha, that's just what I wanted!

              • cemdervis a day ago |parent

                Glad I could help!

          • pjmlp2 days ago |parent

            It already exists, https://en.cppreference.com/w/cpp.html

    • binary1322 days ago |parent

      A lot of software, and thus build automation, will break due to certain features that become warnings or outright errors in new versions of C++. It may or may not be a lot of work to change that, and it may or may not even be possible in some cases. We would all like there to be unlimited developer time, but in real life software needs a maintainer.

      • withzombies2 days ago |parent

        I'm not talking about software compiled by the compiler having a higher default.

        Warnings becoming errors would be scoped to gcc itself only, and they can fix them as part of the upgrade.

    • hulitu2 days ago |parent

      > Shouldn't the compilers be on the bleeding edge of the standards? What is the downside of switching to the newest standard when it's properly supported?

      Cursing because the old program does not compile anymore. No.

      • withzombies2 days ago |parent

        No old programs would stop compiling under the proposed change.

    • burnt-resistor2 days ago |parent

      Compatibility. This value has been lost, apparently, and so nothing in the future will be able to run anything except modern things.

    • superkuh2 days ago |parent

      When a language changes significantly faster than release cycles (i.e., rust being a different compiler every 3 months), it means that distros cannot self-host if they use rust code in their software. I.e., with Debian's Apt now having rust code, and Debian's release cycle being 4 years for LTS, Debian's shipped rustc won't be able to compile Apt since nearly all rust devs are bleeding edge targeters. The entire language culture is built around this rapid improvement.

      I love that C++ has a long enough time between changing targets to actually be useful, and that its culture is about stability and usefulness for users trying to compile things rather than just dev-side improvements über alles.

      • surajrmal2 days ago |parent

        The problem you mention is perhaps a sign that the model Debian uses is ill-suited for development. Stable software is great, but it need not impede progress and evolution. It's also possible to support older rust compiler versions if it's important - apt developers can do the work necessary to support 4-year-old LTS compilers.

      • mustache_kimono2 days ago |parent

        > Debian's shipped rustc won't be able to compile Apt since nearly all rust devs are bleeding edge targeters.

        This is nonsense. Apt devs can target a rustc release and that release can be the same release that ships with Debian? Moreover, since those apt devs may have some say in the matter, they can choose to update the compiler in Debian!

        > The entire language culture is built around this rapid improvement.

        ... Because this is a cultural argument about how some people really enjoy having their codebase be 6 years behind the latest language standard, not about any actual practical problem.

        And I can understand how someone may not be eager to learn C++20's concepts or to add them immediately to a code base, but upgrades to your minimum Rust version don't really feel like that. It's much more like "Wow that's a nifty feature, I immediately understand and I'd like to use in the std lib. That's a great alternative to [much more complex thing...]" See, for example, OnceLock added at 1.70.0: https://doc.rust-lang.org/std/sync/struct.OnceLock.html

  • cardiffspaceman2 days ago

    In the last “big” shop I worked in, we were cross-compiling all production code. Each target device had an SDK that came with a GCC and a kernel tarball, inter alia. We had a standard way to set these up. We used C++03 for years. We decided to try C++11 for userland. All the compilers supported that and, after some validation, we changed permanently. Neither before the change nor after did we rely on the absence of a “-std=” command-line option as the means of choosing the standard for C++ or even C.

    Of course we were all ADHD pedantic nerds so take this with a grain of salt.

  • secondcoming2 days ago

    The coroutine convo is interesting. Does it mean that, for example, a GCC-built program may not run correctly when linked against a Clang-built binary if both use coroutines?

  • jjmarr2 days ago

    Good. Let me use modules!

    • 17186274402 days ago |parent

      You can always specify the language version in your compiler invocation.

    • albertzeyer2 days ago |parent

      > Presumably we still wouldn't enable Modules by default.

    • forrestthewoods2 days ago |parent

      Modules will never be commonly used in C++. It's a failed feature.

    • whobre2 days ago |parent

      Seriously, why? They are broken. https://vector-of-bool.github.io/2019/01/27/modules-doa.html

      • suby2 days ago |parent

        This is from 2019, prior to the finalization of modules in the standard. I'd be interested in how many of these issues were unaddressed in the final version shipped.

        • Maxatar2 days ago |parent

          There isn't much of a final version shipped. It's pretty well understood that modules are underspecified and their implementation across MSVC, clang, and GCC is mostly just ad-hoc based on an informal understanding among the people involved in their implementation. Even ignoring the usual complexity and ambiguity of the C++ standard, modules are on a whole different level in terms of lacking a suitable formal specification that could be used to come close to independently implementing the feature.

          And this is ignoring the fact that none of GCC, clang, or MSVC have a remotely good implementation of modules that would be worth using for anything outside of a hobby project.

          I agree with the other commenter who said modules are a failure of a feature, the only question left is whether the standards committee will learn from this mistake and refrain from ever standardizing a feature without a solid proof of concept and tangible use cases.

            • CyberDildonics a day ago |parent

            You should get in there and put all your expertise to work.

              • Maxatar a day ago |parent

              I did, prior to 2017. I realized the committee was 75% politics and people with a lot of time and devotion pushing their pet projects, and about 25% addressing actual issues faced by professional engineers, and decided it was no longer worth the time, effort, and money.

              The committee is full of very smart and talented people, no dispute about that, but it's also very siloed: people just work on one particular niche or another based on their personal interests, and then they trade support with each other. In discussions it's almost never the case that features are added with any consideration towards the broader C++ audience.

              • CyberDildonics a day ago |parent

                You implemented modules in 2017 and they didn't use it?

          • pjmlp2 days ago |parent

            Today I learnt that Office is a hobby project.

            • Maxatar2 days ago |parent

              You learned nothing because the extent of your knowledge tends to be rather superficial when it comes to C++.

              Office does not use C++ modules; what Office did was make use of a non-standard MSVC feature [1] which reinterprets #include preprocessor directives as header units. Absolutely no changes to source code are needed to make use of this compiler feature.

              This is not the same as using C++20 modules which would require an absolutely astronomical amount of effort to do.

              In the future, read more than just the headline of a blog post if you wish to actually understand a topic well enough to converse in it.

              [1] https://learn.microsoft.com/en-us/cpp/build/reference/transl...
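
              For readers following along, a minimal sketch of the distinction being drawn (file names and the module name are illustrative): a C++20 named module requires restructuring the source around module declarations, while a header unit consumes an existing header as-is, which is roughly what the MSVC feature linked in [1] automates for plain #include lines.

                // hello.cppm -- a named module: the library's own sources must be
                // reorganised around module/export declarations.
                export module hello;
                export int answer() { return 42; }

                // main.cpp -- the consumer.
                import hello;        // named-module import (needs the module above)
                import <vector>;     // header unit: an existing standard header
                                     // imported as a unit, no source changes required

                int main() {
                    std::vector<int> v{answer()};
                    return v.front() == 42 ? 0 : 1;
                }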

              • pjmlp2 days ago |parent

                My dear, I have written more C++20 modules code than you ever will.

                Feel free to roam around on my Github account.

                Also go read the C++ mailings regarding what is standard or not in modules.

  • dmix2 days ago

    That anime gating is very jarring; I thought I had clicked on the wrong link and clicked back.

    • f1refly2 days ago |parent

      Right? I hope it never goes away, we should make the web more fun instead of sad and clean!

      • suby2 days ago |parent

        I think if you were to poll people, a significant portion would be repulsed by this catgirl aesthetic, or (though this isn't the case for Anubis) by the cliché of inappropriately dressed, inappropriately young anime characters adopted as mascots in an ever-increasing number of projects. People can do whatever they want with their projects, but I feel like the people who like this crap perhaps don't understand how repulsive it is to a large number of people. Personally it creeps me out.

        • ikamm2 days ago |parent

          I'm not repulsed by it, but I do wish the people who force this stuff into their software/hardware realized how juvenile it makes their product look. There's a decent, cheap Chinese pair of Bluetooth earbuds on Amazon that's been very popular among audiophiles, but the feedback sounds are an anime girl making noises and there's no way to turn it off, so I lost interest in purchasing them.

          • makemake_kbo2 days ago |parent

            Well, for the Bluetooth headphones I don't think you were the target demographic.

            But open source generally isn't treated as a product. It's just a bunch of volunteers having fun writing code. It's natural that they will include their other interests in it in some way, because it makes working on a project more fun. First impressions matter a lot, but I don't think FOSS projects should optimize for that instead of having fun.

        • windward2 days ago |parent

          The internet was better when it repulsed a significant portion of people.

          • naIak2 days ago |parent

            What would happen if it changed in a way that repulsed you?

            • windward2 days ago |parent

              I'm still here

        • rossy2 days ago |parent

          > (though this isn't the case for Anubis) by the cliché of inappropriately dressed, inappropriately young anime characters adopted as mascots in an ever-increasing number of projects

          I think the fact that people bring up things that the Anubis mascot isn't when talking about Anubis is more telling of their own harmful (and potentially racist) biases against Japanese-styled media than it is about the idea of having anime-styled mascots for free software projects.

        • ndiddy2 days ago |parent

          This is intentional. The version with the fun art that expresses the creator's individuality is free and open source, but they sell a paid version with bland, corporate-friendly art that also supports custom art and CSS. This makes the project sustainable to work on without having to worry about corporations that care about professionalism/how people like you think/etc not supporting the project financially.

        • exe342 days ago |parent

          It sounds like something you might benefit from talking to a therapist about. It's not normal to have such a strong reaction. I hope you can get the help you need!

        • sacado22 days ago |parent

          What? She's wearing a hoodie and a tee-shirt; how is that inappropriate? And how is being young inappropriate?

        • secondcoming2 days ago |parent

          The whole Japanese cartoon schoolgirl thing is 100% creepy.

        • thoroughburro2 days ago |parent

          > inappropriately dressed

          How do you think Anubis should dress?

          • andsoitis2 days ago |parent

            Perhaps like he is depicted in temples, like this one from the tomb of Horemheb; 1323-1295 BC: https://commons.wikimedia.org/wiki/File:The_King_with_Anubis...

            • breppp2 days ago |parent

              a dog man wearing short skirts is also inappropriate in my opinion

              • andsoitis2 days ago |parent

                Other options would be: just the head, a black dog (a common depiction), or, perhaps most fitting to what Anubis does, the scales.

    • NegativeK2 days ago |parent

      Anubis has been around for almost a year now, but it's also not particularly relevant to the content of the email thread.

      • veltas2 days ago |parent

        It's particularly jarring on basically every site I've seen it on, which is usually some serious and professional-looking open source site.

        I wonder why nobody configures this. Is this not something that they can configure themselves to a more relevant image, like the GCC logo or something?

        • 38362936482 days ago |parent

          Because that's the difference between the paid and free versions

        • andsoitis2 days ago |parent

          Anubis asks that you don’t change the logo and if you want to, pay them: https://anubis.techaro.lol/docs/funding/

        • 17186274402 days ago |parent

          I think they might also want to bring attention to the problem and advertise for an open-source solution.

          • maleldil2 days ago |parent

            Anubis is open-source (MIT).

        • renewiltord2 days ago |parent

          I’m sure if you want you can offer to pay like $500/mo on their behalf and they’ll change it for everyone.

        • gessha2 days ago |parent

          Fire up your LLM of choice and make a web extension to make it more presentable. Remove the logo, generate one, do whatever you want. The world is your playground, don’t let it “jarr” you with stuff.

        • Tyr422 days ago |parent

          That's the paid upgrade for "enterprise" level quality.

      • 17186274402 days ago |parent

        Anubis is a bit annoying over crappy internet connections, especially in front of a webpage that would work quite well in this case otherwise, but it still performs way better than Cloudflare in this regard.

    • superkuh2 days ago |parent

      Anubis is significantly less jarring than Cloudflare blocks preventing any access at all. At least Anubis lets me read the content of pages. Cloudflare is so bleeding-edge and commercial that they do not care about broad browser support (because it doesn't matter for commercial/sales). But for websites where you actually want everyone to be able to load the page, Anubis is by far the best.

      That said, more on topic, I am really glad that C++ actually considers the implications of switching default targets and only does this every 5 years. That's a decent amount of time and longer than most distros' release cycles.

      When a language changes significantly faster than release cycles (i.e., rustc being a different compiler every 3 months), it means that distros cannot self-host if they use rust code in their software. I.e., with Apt now having rust code, and Debian's release cycle being 4 years for LTS, Debian's shipped rustc won't be able to compile Apt.

    • tr458722672 days ago |parent

      Many people have said they don't like it, and all that did is make its supporters even happier that it's there, because it makes them feel special in some strange way.

    • MangoToupe2 days ago |parent

      Who cares tbh

    • wyldfire2 days ago |parent

      Recently, on HN: https://news.ycombinator.com/item?id=44962529

    • 17186274402 days ago |parent

      I wouldn't have known that this is anime, if not for all the HN comments pointing that out.

    • falcor842 days ago |parent

      See also discussion on https://news.ycombinator.com/item?id=44962529

      • dmix2 days ago |parent

        So some sort of viral marketing by using weird images