Does showing seconds in the system tray actually use more power? (lttlabs.com)
215 points by LorenDB 4 days ago | 196 comments
  • DecentShoes 4 days ago

    I would have been happy with no seconds in the tray, but showing the seconds when you click on the clock - technology that existed a decade ago in Windows 10 - is obviously technologically impossible for hundreds of PhD-holding software engineers at the richest company in the world to figure out in 2025.

    • rayiner 4 days ago | parent

      All the smart engineers must have left to do AI stuff. The new Windows 11 "power savings" settings menu has this gamification angle. It tells you that you've enabled "5 of 7 power saving settings" or whatever, in a way that implies the goal is to get all 7. It triggers my OCD every time I see the screen, because it implies the quest is unfinished.

      • esperent 4 days ago | parent

        Performative environmentalism, standard operating procedure for every big company. Shift the blame onto the consumer, make them feel guilty because they set their screen timeout to more than 5 minutes. Gamification is a great tool for making people feel pressured and guilty, with plausible deniability for the company.

        Meanwhile, build database centers at incredible scale to run AI and force it into those same consumers in every way possible, but never tell them how much power that wastes.

        • aspenmayer 2 days ago | parent

          > Meanwhile, build database centers at incredible scale to run AI and force it into those same consumers in every way possible, but never tell them how much power that wastes.

          Why would they? Those expenditures and investments are already priced in, and that blood is coming from that stone one way or the other or it won't, damn the expense.

          Shame, on the other hand, is a renewable resource!

        • signalToNose 3 days ago | parent

          Coin mining and AI data centers consume huge amounts of power. I’m not really excited about either. Both enterprises seem like scams to centralize wealth and power among a few actors.

        • 3 days ago | parent
          [deleted]
      • shaky-carrousel 3 days ago | parent

        Then switch all the settings so it says "0 of 7 power saving settings". That way the quest isn't even started. I'd do that.

    • mrandish 3 days ago | parent

      I think the removal was originally about memory usage in Win 95 due to fitting the new OS into lower RAM systems. Then it was about battery usage. More recently the epidemic of feature and information removal from interfaces is primarily driven by the obsession of UX designers to dumb down everything to the lowest common denominator.

      By controlling how usage analytics are instrumented, UX designers can weaponize the data to support removing almost any feature or information they don't personally find essential. Of course, this entirely misses the fact that power users drive word-of-mouth and adoption >10x more than lowest-common-denominator users, and also have significantly higher lifetime value because they are engaged and loyal (until you finally remove too much advanced utility).

      I'm all for simplicity - what I don't understand is the insistence on removing features or capabilities entirely instead of just putting them as an option in advanced settings. Different users have different preferences, and good UX design can maintain surface simplicity without trashing depth, flexibility and personalization.

    • BuyMyBitcoins 4 days ago | parent

      >”PhD holding software engineers at the richest company in the world to figure out in 2025.”

      Let’s be honest, implementing this would be up to a bunch of offshore contractors because corporate can’t bring itself to pay software engineers to implement this feature thoughtfully and comprehensively.

    • raxxorraxor 3 days ago | parent

      The richest engineers at Microsoft are probably busy with trying to push Copilot into the a...ppendix of their users.

      Perhaps you would like to ask Copilot for the time?

    • carra 3 days ago | parent

      Then wait until you learn of useful features that existed in Windows 7 and were removed... I'm especially baffled at the worsening of the file copy dialog.

    • HumblyTossed 3 days ago | parent

      This is what we get when AI is writing the code. There are no examples on Github for it to copy.

    • 3 days ago | parent
      [deleted]
  • layer8 4 days ago

    Raymond Chen recently wrote about the history of seconds on the taskbar: https://devblogs.microsoft.com/oldnewthing/20250421-00/?p=11...

    • troupo 3 days ago | parent

      I call bull on every part of this story when it talks about "power team looking at windows performance as a whole, environment etc."

      Because it's enough to look at modern Windows and things like "CPU spikes 100s of percent when opening start menu because it's now written in React"

      • panta 3 days ago | parent

        At some point we'll have to admit that worse is actually worse.

      • naikrovek 3 days ago | parent

        > I call bull on every part of this story when it talks about "power team looking at windows performance as a whole, environment etc."

        I am sure that the Power team has the understanding that user-initiated activity isn’t really anything they can control.

        Once the user takes over, all tricks for efficiency go out the window. The user could run anything.

        I don’t work for MS but I’m sure the Power team cares about the automatic built-in stuff that runs on schedules and timers: the things they can control. The start button doesn’t click itself.

        I agree that using web tech to render things in the OS is silly, but Microsoft has been doing it in every Explorer window since 2001 with Windows XP. I don’t remember anyone complaining about that at the time except me.

        • troupo 3 days ago | parent

          > user-initiated activity isn’t really anything they can control.

          > Once the user takes over, all tricks for efficiency go out the window. The user could run anything.

          Erm. In the case of the start menu and task bar, where they are pretending that showing seconds is an issue, they do have control of a lot of things.

          Somehow they split hairs over showing seconds, but are perfectly okay with web views in the task bar and in the start menu, and are perfectly okay with components in React Native.

          > I agree that using web tech to render things in the OS is silly, but Microsoft has been doing it

          Yes, they have been doing it. Somehow the "power team" that supposedly had ultimate control over what gets displayed had no issues with that.

      • zamadatix 3 days ago | parent

        It's a bit hard to call bullshit when even the summary of the discussion is still measured to the microwatt.

        I wouldn't be surprised if there were some misalignments in goals though. E.g. if their team's goals are measured in results of typical battery life tests such as "how many hours the computer can sit idle playing a video" then they would be heavily weighted towards caring about these kinds of constantly recurring background draws instead of active usage draws.

        • troupo 3 days ago | parent

          > they would be heavily weighted towards caring about these kinds of constantly recurring background draws instead of active usage draws.

          Yeah, it's very likely "metrics-driven development", where optimizing certain metrics becomes its own isolated goal.

        • mrheosuper 3 days ago | parent

          it's milliwatt

          • zamadatix a day ago | parent

            The unit is mW but the measurement precision is to the microwatt: 0.417 mW = 417 uW. Same as how 0.001 km would be a measurement to meters of precision, not kilometers of precision.
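            The same arithmetic as a quick sanity check - a minimal Python sketch, using the 0.417 mW value under discussion:

            ```python
            # A value reported as 0.417 mW is resolved to the microwatt, even
            # though the displayed unit is milliwatts.
            milliwatts = 0.417
            microwatts = milliwatts * 1000        # 1 mW = 1000 uW
            print(f"{microwatts:.0f} uW")         # 417 uW: the last digit is 1 uW of precision

            # Same idea with distance: 0.001 km is a metre-precision measurement.
            kilometres = 0.001
            print(f"{kilometres * 1000:.0f} m")   # 1 m
            ```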

      • HumblyTossed 3 days ago | parent

        I thought it was because the new code for win11 called the old code for win10 which called the old code for winX which called the old code for WinY, which called an old DOS routine.

        • troupo 3 days ago | parent

          Yup. It's the org chart shipped through time: https://youtu.be/5IUj1EZwpJY?si=mIkDCkK7lG55ncUf

      • Valodim 3 days ago | parent

        Wait. Is it really written in react?

        • treesknees 3 days ago | parent

          No, there is a single component of the start menu (personal recommendations) that uses react native. It’s not accurate to say that the start menu is written in react.

    • ranger_danger 4 days ago | parent

      After reading only the title of the HN story I automatically assumed it was probably Raymond. Always a pleasure reading his posts.

  • Jaxan 4 days ago

    I hate these kinds of “saves power” things in Windows settings. The OS itself pings home constantly, sends network requests for everything you do, shows ads on the login screen, takes screenshots (for Recall), and Edge sends the contents of web forms off for “AI”. And now it is my responsibility to disable showing seconds in the taskbar??? If Microsoft really wants to be green, Windows shouldn’t do all these wasteful things!

    • ctoth 4 days ago | parent

      I had some very technical friends be incredibly surprised by the Edge form thing, I think that is not sufficiently called out!

      They send any text you type in a form to their AI cloud and hold on to it for 30 days.

      Any form.

      On any website.

      What the actual fuck?

      • smokel 4 days ago | parent

        This is only true if you enable extended spell checks, which makes some sense. By default, no form data is sent to Microsoft AFAIK. Note that the same holds for Google Chrome.

        • perching_aix 4 days ago | parent

          Reminds me of a video I saw on YouTube from the "PC Security Channel", who was utterly flabbergasted that the Start Menu would send all keypresses inputted into its search bar to MS.

          They had searching on the web enabled... Pretty hard to search the web using Bing without sending along a search term.

          • lucumo 4 days ago | parent

            Stuff like that and the one you replied to are why I stopped caring. The outrage is so often complete and utter nonsense that my default response is disbelief.

            • freeone3000 4 days ago | parent

              It came enabled by default. It is not as if this setting was searched for, then enabled, then had some unintended consequence - taskbar searches used to not search the internet, then they did.

              • perching_aix 4 days ago | parent

                Which would be a perfectly fine thing to take issue with. It just also wouldn't be quite as eye-catching as misleadingly portraying the thing as now being a keylogger.

                • troupo 3 days ago | parent

                  It is essentially a keylogger, enabled silently when it wasn't enabled (or didn't exist) previously.

                  The purpose of a system is what it does, after all.

                  • perching_aix 3 days ago | parent

                    I disagree. Being covert and having access to user input are necessary criteria for a keylogger, but not sufficient. They also have to, well, log. And since keyloggers are a kind of malware, using these logs for malicious purposes is also implied, and so is that the data would be tied to your identity. They also tend to operate all the time, rather than just in specific contexts.

                    But the criterion of "having access to user input" is also necessary for goofy unneeded features like showing web search results in the Start Menu though, which they shove down people's throat like they do with every other feature their product team thinks is a great idea (explaining the "being covert" bit), at which point you have a complete, non-malicious explanation for the entire thing.

                    The reasonable thing to do then is to apply Hanlon's razor, at which point no, it's no longer reasonable to believe or portray it to be a keylogger anymore. Not essentially, not otherwise. Not only that, but the YouTuber in question made this portrayal knowing full well that it's impossible for them to actually properly demonstrate this feature doubling as a keylogger, as they have no access to the server side. They relied on people being gullible enough to simply not grasp this, and leveraged people's preexisting privacy concerns to farm views.

                    Having the capability to engage in crime doesn't make a criminal. Imagine if I portrayed 107M (!) of the 340M residents of the U.S. as a criminal because they own a gun, despite knowing full well that gun ownership sensibilities are just fundamentally different over there.

                    • freeone3000 3 days ago | parent

                      “if you use the windows taskbar, by default Microsoft sees your keystrokes now. Here’s how to disable it” is a completely reasonable take. Every week there’s a new announcement of some multi-million-record leak of personal information. People’s privacy fears are well-founded.

                      • perching_aix 3 days ago | parent

                        Is appealing to those fears to deliver misinformation ethical? Does it help this issue or worsen it? Because I'd say poisoning the well is not a good thing. The road to hell being paved with good intentions and all. See the effect lies like this had on the person in this very thread above us: https://news.ycombinator.com/item?id=44552625 I share a fair amount of their disbelief by this point, too.

                        It's like making up a bunch of rubbish when there's a hate train going on against something or somebody, just to participate. Then having all of that backfire disproportionately when the tides turn. Why make things up when reality already has plenty of bad stuff going on that one can report on? Rhetorical question, of course.

                        • troupo 3 days ago | parent

                          > The road to hell being paved with good intentions and all.

                          Why are we assuming good intentions from a company who for years has increased places and amounts of data it collects and tracks, and removed more and more ways to opt-out of this?

                          The intention of "search web first before searching local computer even if the user never asked for it" didn't appear from the intent of "let's create a keylogger", but it never came from a good innocent intention either.

                          • perching_aix 3 days ago | parent

                            > Why are we assuming good intentions

                            I'm talking about the FOSS community.

                • 4 days ago | parent
                  [deleted]
            • Espressosaurus 4 days ago | parent

              They make it hard as hell to turn off searching the web.

              Users of especially the home version of the OS are kind of fucked here.

        • atq2119 4 days ago | parent

          In what world does holding the user's private data for 30 days make sense for a spell checker? Even sending the data at all is sad. We've had offline spell checking for decades.

          • Xorlev 4 days ago | parent

            This is often (though not always) a blanket statement.

            Logs are always generated, and logs include some amount of data about the user, if only environmental.

            It's quite plausible that the spellchecker does not store your actual user data, but information about the request, or error logging includes more UGC than intended.

            Note: I don't have any insider knowledge about their spellcheck API, but I've worked on similar systems which have similar language for little more than basic request logging.

            • HeavyStorm 4 days ago | parent

              PII is stored _at most_ for 30 days.

          • perching_aix 4 days ago | parent

            For the same reason Grammarly does it too, I'd assume.

          • justsomehnguy 4 days ago | parent

            To track when the user corrects it. Otherwise you can't adapt when the correction is not what the user wanted.

            If there are a bunch of these corrections, you know something is wrong there. IMO 30 days is quite modest, if this is properly anonymized.

            Edit: dear HN user who decided to silently downvote - you could do better by actually voicing your opinion

            • perching_aix 4 days ago | parent

              > dear HN user who decided to silently downvote - you could do better by actually voicing your opinion

              Sure, I'll bite. Let's address the obvious issue first: what you're saying is speculation. I can only provide my own speculation in return, and then you might or might not find it agreeable, or at least claim either way. And there will be nothing I can do about it. I generally don't find this valuable or productive, and I did disagree with yours, hence my silent downvote.

              But since you're explicitly asking for other people's speculation, here I go. Advanced "spellchecking" necessitates the usage of AI, as natural languages cannot ever be fully processed using just hard coded logic. This is not an opinion, you learn this when taking formal languages class at university. It arises from formal logic only being able to wrangle formal logic abiding things, which natural languages aren't (else they'd be called formal languages).

              What the opinion is, and the speculation is, is that this is what the feature kicks off when it sends over input data to MS's servers for advanced "spellchecking", much like what I speculate Grammarly does too. Either that, or these services have some proprietary language engine that they'd rather keep on their own premises, because why put your moat out there if you don't strictly have to.

              Technologically speaking, at this point it might be possible to do this locally, on-device now. This further didn't use to be the case I believe (although I do not have sources on this), and so this would be another reason why you'd send people's inputs to the shadow realm.

              • demarq 4 days ago | parent

                It’s hard to read writing packed with defensive clauses.

                Better to say what you need to say. Leave the defense for the occasion someone misunderstood what you meant to say.

                • perching_aix 4 days ago | parent

                  It's further pretty hard to write like this, but I still prefer it over getting trivially checkmated by ill meaning people, and over being misinterpreted silently and that causing issues downstream. It's at this point an instinctual defense mechanism, that I've grown to organically develop in the low-trust environments that are forums like this.

                • throw10920 4 days ago | parent

                  I 100% agree with the principle, but (regrettably) in practice you can't do this in a lot of places where the community is critical (which isn't a bad thing by itself) but doesn't call out/downvote/moderate bad criticism (which is bad).

                  I can't count the number of times on HN that I've seen responses to posts that took advantage of the poster not writing defensively to emotionally attack them in ways that absolutely break the HN guidelines, and weren't flagged or downvoted. And on other sites, like Reddit, it's just the norm.

                  The defensive writing will continue until morals improve.

            • 4 days ago | parent
              [deleted]
        • foolswisdom 4 days ago | parent

          What setting is this? I can only find "Enable machine learning powered autofill suggestions" which seems to have defaulted to on.

          • perching_aix 4 days ago | parent

            Here you go, from the horse's mouth: https://www.microsoft.com/en-us/edge/learning-center/improve...

            Note that this is from 2023. Their legal docs, last updated in 2024, claim a bit different: https://learn.microsoft.com/en-us/legal/microsoft-edge/priva...

            > By default, Microsoft Edge provides spelling and grammar checking using Microsoft Editor. When using Microsoft Editor, Microsoft Edge sends your typed text and a service token to a Microsoft cloud service over a secure HTTPS connection. The service token doesn't contain any user-identifiable information. A Microsoft cloud service then processes the text to detect spelling and grammar errors in your text. All your typed text that's sent to Microsoft is deleted immediately after processing occurs. No data is stored for any period of time.

        • 4 days ago | parent
          [deleted]
        • 4 days ago | parent
          [deleted]
      • IgorPartola 4 days ago | parent

        Whoa how is this not all over the news at all times?

        • NewJazz 4 days ago | parent

          People are tired of hearing about it. They don't feel like they can do anything about it.

        • vachina 4 days ago | parent

          We moved on and used alternatives. And stayed on alternatives.

        • saparaloot 4 days ago | parent

          The caring cohort has mortgages and kids

          • jbaber 4 days ago | parent

            And Linux for desktop is finally easy enough for those of us with both.

            Microsoft ordered me to buy a new computer for Win 11, so I took said kids to Microcenter, asked for a machine whose specs could play a particular steam game on Linux, returned to my mortgage, installed Ubuntu and haven't given Windows a second thought in months.

        • fkrkrkgkgk 4 days ago | parent

          Anyone remember when Ubuntu sent every keystroke to amazon?

          • aspenmayer 2 days ago | parent

            Richard Stallman probably. Hopefully EFF.

            https://www.omgubuntu.co.uk/2016/01/ubuntu-online-search-fea...

          • pas a day ago | parent

            yes, and the outrage was justified, and the solution was also nice and easy (remove this modular thing instead of switching to FreeBSD)

            and importantly it seems that Canonical/Ubuntu is not doing anything like that right now, whereas MSFT is all in on online-only mode.

      • Teever 4 days ago | parent

        Something I heard a while back but have never had confirmed is that the Nvidia driver sends the content of every window title to Nvidia.

        Does anyone know if that is true?

        • svnt 4 days ago | parent

          This was when you had to create an account for GeForce Experience and they started sending crash stats.

          Some people checked it with wireshark at the time and didn’t find anything other than what was stated. [0]

          0: https://gamersnexus.net/industry/2672-geforce-experience-dat...

        • morkalork 4 days ago | parent

          There was a smart TV that did that with the titles of any media played too, wasn't there?

          • heavyset_go 4 days ago | parent

            You can safely assume the same is happening with streaming sticks/boxes

      • tspivey 4 days ago | parent

        I was surprised by this. I don't use Edge much, and I don't remember being asked about it.

      • Lu2025 4 days ago | parent

        Any form meaning passwords too?

        • perching_aix 4 days ago | parent

          Looked into it, the answer seems like it can be both a yes or a no, depending on the website and user actions.

          By default, when you implement a form that takes a password, you (the developer) are going to be using the "input" HTML element with the type "password". This element is exempt from spellchecking, so no issues there.

          However, many websites also implement a temporary password reveal feature. To achieve this, one would typically change the type of the "input" element to "text" when clicking the reveal button, thereby unintentionally allowing spellchecking.

          You (the developer) can explicitly mark an element as ineligible for spellchecking by setting the "spellcheck" attribute to "false", remediating this quirk: https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/...

          You (the developer) can of course also just use a different approach for implementing a password reveal feature.

          As the MDN docs remark, this infoleak vector is known as "spelljacking".
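          A minimal sketch of the safe pattern (hypothetical markup): keeping spellcheck="false" on the element means a reveal toggle can flip the input type without ever making the password eligible for cloud spellchecking.

          ```html
          <!-- Hypothetical reveal-toggle markup. spellcheck="false" keeps the
               field exempt from spellchecking even after type becomes "text". -->
          <input id="pw" type="password" spellcheck="false" autocomplete="current-password">
          <button type="button"
                  onclick="const pw = document.getElementById('pw');
                           pw.type = pw.type === 'password' ? 'text' : 'password';">
            Reveal
          </button>
          ```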

      • IlikeKitties 4 days ago | parent

        [flagged]

    • ape4 4 days ago | parent

      I would check:

          - Don't show ads (saves power)
          - Don't call home (saves power)
      • bee_rider 4 days ago | parent

        This might be considered if they ever find out how shitty Windows can get before people actually stop buying computers with it.

        • mouse_ 4 days ago | parent

          As long as Red Hat keeps embracing and extending free desktop, and Apple keeps disallowing standard features like native Vulkan (Mac is not for games I get it but come on, please?), people will either keep using Windows or, more likely, switch to Android devices for their home and business needs.

    • lxgr 4 days ago | parent

      Both things can be true/desirable at the same time.

      If, as tested, this setting makes a double-digit percentage difference, I'm glad Microsoft exposes it in the UI. I'd also be glad if they didn't do as much weird stuff on their user's devices as they do.

      • pavel_lishin 4 days ago | parent

        > If, as tested, this setting makes a double-digit percentage difference, I'm glad Microsoft exposes it in the UI.

        I'd rather them write more performant code. This feels like your car having the option to burn motor oil to show a more precise clock on the dash; you don't get kudos for adding an off-switch for that.

        • minitech 4 days ago | parent

          > I'd rather them write more performant code.

          In keeping with the theme of the comment you're replying to, writing better-performing code and providing performance options are not mutually exclusive. Both are good ideas.

          > This feels like your car having the option to burn motor oil to show a more precise clock on the dash; you don't get kudos for adding an off-switch for that.

          (Sounds more like you're arguing that it should be forced off instead of being an option? Reasonable take in this case, but not the same argument.)

          • jjj123 4 days ago | parent

            No, I think they’re arguing that showing seconds in the system tray shouldn’t be so inefficient that turning it off gives back double-digit percentage energy savings.

            I think we all agree there needs to be some additional power draw for the seconds feature, but it’s unclear how much power is truly necessary vs this just being a poor implementation.

            • ants_everywhere 4 days ago | parent

              There's a dramatic increase in how frequently you interrupt the CPU to update the display. That is true at the OS level no matter how efficient you make the seconds-display code.

              • troupo 3 days ago | parent

                How about web views throughout the OS? The new start menu written in React?

                There's an ungodly amount of CPU and GPU spikes throughout the OS which make the "omg seconds" invisible in comparison

          • morganherlocker 4 days ago | parent

            It shouldn't take any noticeable power/cycles to accomplish this task. Having flags for "performance" littered through the codebase and UI is a classic failure mode that leads to janky, slow baseline performance. "Do always and inhibit when not needed".

        • aksss 4 days ago | parent

          A better analogy would be reducing your MPG (fuel efficiency) to show a more precise clock, and arguably we all make that sacrifice to get CarPlay.

          Energy isn’t free.

          Even if they wrote more performant code, it would just mean a smaller relative loss of energy to show seconds, but still a loss compared to not showing seconds.

          • pavel_lishin 4 days ago | parent

            Of course it's not free - TANSTAAFL - but it should certainly not increase energy consumption by 13%!

        • orangecat 4 days ago | parent

          > This feels like your car having the option to burn motor oil to show a more precise clock on the dash

          I actively don't want to see seconds; the constant updating is distracting. It should be an option even if there were no energy impact. (Ditto for terminal cursor blinking.)

          • GLdRH 4 days ago | parent

            Doesn't the blinking cursor tell you it's ready for input and not still running the previous command? Seems useful.

            • lucb1e 4 days ago | parent

              I have cursor blinking off anywhere I can. The prompt is what tells me I can type something, or in a GUI program, you can always type if there is a cursor no matter if it's solid or blinking. At least, that's my experience, perhaps you're familiar with another system or piece of software where the blinking is what tells you that you can enter something?

            • eviks 4 days ago | parent

              There are better shape/color alternatives for that

          • p_ing 4 days ago | parent

            ...Did you not see that it is an option, off by default?

        • criddell 4 days ago | parent

          > I'd rather them write more performant code.

          My expectations of Microsoft software aren't terribly high. I'd say Windows is performant (ie it works about as well as I expect).

        • daveoc64 4 days ago | parent

          It's really an on switch.

          The feature is off by default in Windows 11 and was not offered in any previous non-beta Windows version.

          • anonymars 4 days ago | parent

            But you could open the clock flyout and see it on demand. Now it's all-or-nothing (unless they changed it, again)

            (Have I mentioned how much I loathe Windows 11?)

      • Delk 4 days ago | parent

        Mentioning that some setting uses more power can be useful and desirable. I think Jaxan might be irked by "energy recommendations" Windows gives you in power & battery settings, though. It suggests applying "energy saving recommendations" to lower your carbon footprint, and while I absolutely support energy saving, I also find those "recommendations" obnoxious.

        The recommendations suggest, among other things, switching to power-saving mode, turning on dark mode, setting screen brightness for energy efficiency, and auto-suspending and turning the screen off after 3 minutes.

        Power-saving mode saves little at least on most laptops but has a significant performance impact, dark mode only saves power on LED displays (LCDs have a slight inverse effect), and both dark/light mode and screen brightness should be set based on ergonomics, not based on saving three watts.

        When these kinds of recommendations are given to the consumer for "lowering your carbon footprint", with a green leaf symbol for impact, while Microsoft's data centres keep spending enormous amounts of power on data analysis, I find it hard to see that as anything more than greenwashing.

      • Xylakant 4 days ago | parent

        The test setting is important here - the test is on an otherwise idle machine. This means that the update ensures that some thread wakes on a timer every second, which may explain the large drop. This test is interesting, but not very representative of a real-world usage scenario. It’ll be interesting to compare it to the results of the other test they're running, where they keep a video playing in the background.

        • Delk 4 days ago | parent

          I'm still a little curious about what's causing the increase in power use. A single additional wakeup per second should not have a two-digit percentage impact on power use when even an idle machine probably has dozens of wakeups per second anyway. I wonder if updating the seconds display somehow causes lots of extra wakeups instead.
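          A back-of-the-envelope sketch of that intuition in Python (the idle wakeup rate below is an assumed, illustrative figure, not a measurement):

          ```python
          # Rough model: if an idle system already services many timer wakeups per
          # second, one extra once-per-second wakeup for the clock should be a
          # small relative increase. The baseline below is hypothetical.
          idle_wakeups_per_sec = 50   # assumed idle baseline, for illustration only
          clock_wakeups_per_sec = 1   # one repaint per second for the seconds display

          relative_increase = clock_wakeups_per_sec / idle_wakeups_per_sec
          print(f"extra wakeups: {relative_increase:.0%}")  # extra wakeups: 2%
          ```

          Under that assumption the overhead is nowhere near double digits, which is why a double-digit measured impact suggests the display update triggers far more work than a single wakeup.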

    • HPsquared4 days ago |parent

      And Windows Update burns through an ungodly amount of CPU.

      • fkrkrkgkgk4 days ago |parent

        And it is only getting worse. I would consider Windows Update on its own reason enough not to use this shit OS at all. Be aware! https://youtu.be/4RQ6pek3JoM

        • imtringued4 days ago |parent

          I've stopped using Windows for the same reason. Windows Update is a steaming pile of garbage. It has bricked my Windows installation so many times.

      • TiredOfLife2 days ago |parent

        Tell me you last used Windows 7/8 without telling me.

    • IgorPartola4 days ago |parent

      Wonder how much an OS that focuses on battery life can extend working time on a laptop. Would be a killer marketing point I think.

    • ozgrakkurt4 days ago |parent

      Similar vibe to telling people not to flush the toilet twice while companies are pouring literal poison into the oceans and seas.

      Also, airlines asking for extra money to offset emissions: just absolute insanity.

      • dist-epoch4 days ago |parent

        While those same airlines fly empty planes just to avoid losing airport slots.

    • zozbot2344 days ago |parent

      This is not Windows-specific, it has been shown wrt. Linux systems also. It's why recent Linux desktop environments have gotten rid of the blinking cursor in command prompt windows (that also causes frequent wakeups and screen updates) and why it probably makes sense to disable most animations too.

      • userbinator4 days ago |parent

        It's why recent Linux desktop environments have gotten rid of the blinking cursor in command prompt windows

        This used to be done entirely in hardware (VGA text modes), and I believe some early GPUs had a feature to do that in graphics modes too.

      • JdeBP4 days ago |parent

        I was surprised that no-one mentioned this at https://news.ycombinator.com/item?id=44473048 , where someone is trying to put software-implemented text blinking into OpenBSD.

    • jasonthorsness4 days ago |parent

      There was a fight in Vista time frame about whether or not animated/video desktop backgrounds were a good idea. They were definitely cool, but AT WHAT COST. Ended up shipping as an "extra".

      • trinix9124 days ago |parent

        And nowadays we got people running Wallpaper Engine on their idling laptops in college classes ;)

    • cheema334 days ago |parent

      > And now it is my responsibility to disable showing seconds in the taskbar???

      It is not. This "feature" is disabled by default.

      Google "manufactured outrage".

    • TiredOfLife2 days ago |parent

      > makes screenshots (for Recall)

        It doesn't, because that feature only just released, only works on specific new laptops, and most important: YOU HAVE TO MANUALLY ENABLE IT

    • anonymars4 days ago |parent

      To be fair, does it do all those things every second? https://learn.microsoft.com/en-us/windows-hardware/drivers/k...

      (For the record, I abhor Windows 11)

    • Kwpolska4 days ago |parent

      This setting is disabled by default.

    • albert_e4 days ago |parent

      And forcefully overrides personal preferences to NOT show any Windows Spotlight images and trivia on the lock screen, and news and recommended content on the Edge home screen

    • Terr_4 days ago |parent

      > Edge sends contents from web forms for “AI”

      That reminds me of Chrom[e|ium]'s insanely bad form suggest/autofill logic: The browser creates some sort of fuzzy hash/fingerprint of the forms you visit, and uses that with some Google black box to "crowdsource" what kinds of field-data to suggest... even when both the user and the web-designer try to stop it.

      For example, imagine you're editing a list of Customers, and Chrome keeps trying to trick you into entering your own "first name" and "last name" whenever you add or edit an entry. For a while developers could stop that with autocomplete="off" and then Chromium deliberately put in code to ignore it.

      I'm not sure how much of a privacy leak those form-fingerprints are, but they are presumptively shady when the developers ignore countless detailed complaints over many years in order to keep the behavior.

      https://issues.chromium.org/issues/40093420

      • o11c4 days ago |parent

        > For a while developers could stop that with autocomplete="off" and then Chromium deliberately put in code to ignore it.

        To be fair, websites with a horrible misunderstanding of security kept on using that for "this password is important, better make sure the user is forced to enter it by hand!"

    • blibble4 days ago |parent

      > And now it is my responsibility to disable showing seconds in the taskbar??? If microsoft really wants to be green, windows shouldn’t do all these wasteful things!

      and building multiple gigawatt consuming data centres to produce AI slop no-one asked for and no-one wants

      powered by fossil fuels

      • netsharc4 days ago |parent

        "This is Windows 11, you'll need a new PC for it, throw away your old PC and wreck the planet some more, and by the way we'll stop supporting Windows 10 in October 2025, if your PC gets a malware and your bank account gets hacked and drained it's not our fault".

  • alok-g4 days ago

    Out of curiosity, what would it take for the system not to lose so much energy with this setting? My watch also updates the seconds every second. Someone in the comments on this thread has mentioned hardware-based blinking cursors. A computer that can do over a billion calculations per second should not need much to render a (say) 30x30 pixel area.

    • rajnathani3 days ago |parent

      3 reasons that I can think of:

      (1) For a mostly static screen, the GPU's frame buffer (and pipeline) could otherwise be empty, with nothing to process.

      (2) The font rendering algorithm has to run every second. These are not simple bitmap-type monospace fonts, so the output has to be computed every second, and the rendered digit combinations likely aren't cached.

      (3) Without a seconds display, the thread driving the system clock could sleep for a whole minute (after aligning to the start of the minute; in practice the OS's underlying timer service likely already provides that alignment), instead of sleeping for 1 second.
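      A minimal sketch of point (3), assuming a simple polling clock (the helper name is mine):

```python
import time

def seconds_until_next_minute(now=None):
    """How long a minutes-only clock may sleep before its next redraw."""
    if now is None:
        now = time.time()
    return 60.0 - (now % 60.0)

# A minutes-only clock wakes roughly once per minute:
#   time.sleep(seconds_until_next_minute()); redraw()
# A seconds display instead wakes every second: 60x the wakeups.
```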

      • alok-g3 days ago |parent

        Thanks.

        For #2, it needs to render around 66 characters per minute. The power loss mentioned is collectively 15% of battery life. When I open a Word doc, it shows around 1500 characters on the screen within two to three seconds without any noteworthy drop in battery level. Hence, I do not think it is font rendering.

  • gleenn4 days ago

    13% less battery time is pretty wild just from updating the screen once per second but interesting to understand why.

    • Calwestjobs4 days ago |parent

      (Linux + KDE) I save 24% by using black (#000000) everywhere: backgrounds, the full theme, the contrast control setting in Firefox, etc., on a notebook with an OLED screen. Also, where possible, not using Chrome's/Safari's YouTube video player but downloading the video gives huge energy savings (and use mpv on Linux or a properly configured PotPlayer on Windows; VLC and the default MS video apps are bad at energy saving).

      And we are talking about 15+ hours of actual office work in a web browser plus a little bit of Python math. So add 24% on top of that... that is literally a weekend's worth of work on one charge. The current generation of laptop CPUs is insane.

    • starburst4 days ago |parent

      Incompetence

      • userbinator4 days ago |parent

        You mean "Microsoft"... but that's really the same thing these days.

  • lxgr4 days ago

    That's quite surprising. I wouldn't have imagined Windows (or any other "desktop OS") to go to great lengths to optimize for static screen content in the way that e.g. smartphones or wearables do, which as I understand have dedicated hardware optimized for displaying a fully static screen while powering down large parts of the display pipeline.

    • pdw4 days ago |parent

      The decision to not show seconds dates back to Windows 95. Back then the motivation was not power saving, but rather allowing the code related to the clock and text rendering to be swapped out to disk on a 386 with 4 MB of RAM... Raymond Chen: https://devblogs.microsoft.com/oldnewthing/20031010-00/?p=42...

    • layer84 days ago |parent

      Desktop OSs idle most of the time, and the comparison is with respect to an idle desktop. Forcing context switches and propagating updates through the GUI stack every second isn’t free in that situation, it means that at least one CPU core can’t stay in a lower-power state. In contrast, you probably won’t see much of difference in battery life for the seconds display when simultaneously watching video or running computational tasks.

    • craftkiller3 days ago |parent

      Laptops also optimize for displaying a fully static screen. It's called "Panel Self Refresh" or "PSR". https://hardwaresecrets.com/introducing-the-panel-self-refre...

      On my laptop, I can check if it is enabled with:

        cat /sys/kernel/debug/dri/0000:c1:00.0/eDP-1/psr_state
        cat /sys/kernel/debug/dri/0000:c1:00.0/eDP-1/psr_capability
    • jayd164 days ago |parent

      Windows runs on laptops and tablets and such. At this point they probably do a fair bit of that sort of thing.

  • rwallace4 days ago

    I hope so, because I actively want seconds absent from the system tray. Attention is a scarce resource; the fewer things on the screen constantly changing and thereby consuming my attention, the better. If saving power means we remain free from that anti-feature, great.

    • aksss4 days ago |parent

      I, for one, love it for casual and incidental benchmarking. Of everything - not just a process I run, but also how long between bird chirps outside my window. But I also find it very easy to ignore, too. Glad it’s optional.

      • Asraelite4 days ago |parent

        Does nobody care about just being able to tell the time accurately? 59 seconds makes a big difference for joining online meetings and things.

        • rwallace4 days ago |parent

          Beware of concentrated benefit and diffuse cost. Sure, let a seconds clock be available to call up the 0.1% of the time when you want it. But it shouldn't be in the system tray presenting a small but ongoing attention drain the other 99.9% of the time.

          • technothrasher4 days ago |parent

            As a horologist, I want seconds. It annoys me not to have it. I wouldn't care if it isn't the default, as long as I can set it, similarly to how I currently have to set 24-hour time separately on all my machines because the US locale defaults to 12-hour time. That's fine, and understandable. But I'm constantly annoyed, for instance, by Apple's long running absolute refusal to allow the iOS clock to display seconds.

          • perching_aix4 days ago |parent

            The attention drain is sadly pretty much unmeasurable properly, as it's a subjective thing.

            I'm one of those freaks who have this on and I honestly like it a lot. It gives me a feeling of certainty, grounding, and precision.

            Primary driver for turning it on was their redesign of the clock flyout to be, uhh, nonexistent with Windows 11, which I'd previously use on demand for seconds information. I was also worried about this being a nonsolution and a distraction initially, but it ended up being fine.

          • nottorp4 days ago |parent

            Interesting, do you also turn off notification popups for all applications, or leave those on?

            • rwallace3 days ago |parent

              I leave a handful of actually important notifications on, like the one that says 'someone just made a purchase using your bank account, making sure it was you', but most of them, I do indeed turn off.

        • bigstrat20034 days ago |parent

          Approximately zero people in the world care if you join a meeting at 1:00, or 1:01. It's good to aim to be punctual, but if you're off by a minute there is no consequence.

          • crazygringo4 days ago |parent

            That is definitely not true. It's very dependent on the culture, the company, the specific group.

            I've met managers who literally lock the conference room door when it hits :00.

            That's a little crazy in my view, but there are definitely places where it's the norm.

            There are basically two ways of managing expectations around meeting times. The first is that it's acceptable for meetings to run late, so it's normal and tolerated for people to be late to their next meeting, and meetings often start something like 5 minutes late, and you try to make sure nothing really important gets discussed until 10 minutes in. The other is that it's unacceptable for meetings to start late, so people always leave the previous meeting early to make sure they have time for bathroom, emergency emails, etc. In which case important participants wind up leaving before a decision gets made, which is a whole problem of its own.

          • argomo4 days ago |parent

            I'm curious how you came to such a universally sweeping conclusion. At any rate, it's incorrect as I have personally observed counterexamples in my professional career.

            • yawaramin4 days ago |parent

              Companies where people care that you join a meeting just one minute late? Sounds kind of unbelievable tbh. Humans are humans.

        • GLdRH4 days ago |parent

          No and No, it doesn't.

      • erikpukinskis4 days ago |parent

        I just say “one one thousand two one thousand…” under my breath.

    • userbinator4 days ago |parent

      Ideally the clock display should be customisable to display whatever level of precision you want; I believe at least one Linux application lets you specify it via a strftime() format string.
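      For instance, with C-style strftime() codes (shown here via Python's time.strftime, with a fixed timestamp so the example is deterministic), the difference in precision is just a format string:

```python
import time

# Fields: year, mon, mday, hour, min, sec, wday, yday, isdst
t = time.struct_time((2025, 1, 1, 12, 34, 56, 2, 1, 0))

print(time.strftime("%H:%M", t))     # minutes-only clock: 12:34
print(time.strftime("%H:%M:%S", t))  # with seconds: 12:34:56
```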

      • pramodbiligiri4 days ago |parent

        KDE Plasma provides custom time formats: https://postimg.cc/sGXD8wqq. The time format documentation from that screenshot links to QT's formatDateTime function: https://doc.qt.io/archives/qt-5.15/qml-qtqml-qt.html#formatD...

        • userbinator3 days ago |parent

          That's already more customisation than most software will allow, but to paraphrase an old saying, "those who don't understand strftime() are doomed to reinvent it poorly":

          https://pubs.opengroup.org/onlinepubs/009695399/functions/st...

  • russellbeattie4 days ago

    > Some Reddit users on the same thread also pointed out that while the system is already doing plenty in the background, even small updates like this might prevent deeper power-saving states.

    This is undoubtedly the answer, and I suspect that if any actual effort were made by Microsoft, the problem might be eliminated entirely. Maybe.

    Most likely, the update is implemented by calling a standard stack of system calls that are completely benign in a normal application, which is already limiting power savings in various ways. But when run by itself, the call stack triggers a bunch of stuff that ends up using a bit more power.

    The big question is: can this actually be optimized with some dedicated programming time? Or is the display/taskbar/scheduling stack in Windows such a convoluted mess that updating the time every second without waking up a bunch of other stuff is impossible without a complete rewrite?

  • seanalltogether4 days ago

    > Test Type: Idle desktop only (no applications or media playback, unless otherwise stated)

    It's weird they didn't also include a simple web browser test that navigates a set of web links and scrolls the window occasionally. Just something very light at least, doesn't even have to be heavy like video playback.

    • rustyminnow4 days ago |parent

      Not that weird. Idle desktop isolates the effects of the change to get a worst case scenario. Would be interesting to see a light activity test too though - see if you still get a noticeable difference.

    • jasonthorsness4 days ago |parent

      Yeah this is not meaningful due to the unrealistic workload. Sad thing is, I bet a web browser test would still show the difference, as long as a page is kept static on the screen for more than a few seconds before moving on.

      Power consumption is incredibly difficult to benchmark in a meaningful way because it is extremely dependent on all the devices in the system, all the software running, and most power optimizations are workload dependent. Tons of time went into this in the windows fundamentals team at Microsoft.

    • viraptor4 days ago |parent

      > We’re currently running the same test again on all three laptops to account for variance, but this time with a video playing to simulate a more active usage scenario. Once those results are in, we’ll update the relevant section with the new data.

      • delusional4 days ago |parent

        What? "We are doing a second test to account for variance, but also changing the test setup" that doesn't make any sense.

        • viraptor4 days ago |parent

          They account for variance between different laptops. The test is changing the same way for all laptops. It makes sense.

          • delusional3 days ago |parent

            Does that imply they didn't run the same test on the three different laptops before? Why are they presented on the same graph then?

            • shmeeed3 days ago |parent

              No. Try parsing it like this:

              >We’re currently running the same test again (on all three laptops to account for variance), _but this time with a video playing to simulate a more active usage scenario_

              • delusional3 days ago |parent

                I think you'd normally include a comma before "on all" then. I'm not convinced that's what the author meant, but at least it makes more sense than what I read it as. Thanks for guiding me through that.

                • viraptor3 days ago |parent

                  There's more discussion about it on the WAN show https://www.youtube.com/live/tkYiqvA7pmU around 33:42. They're competent people, so if some interpretation doesn't make sense, maybe it's not a good interpretation.

    • shellfishgene4 days ago |parent

      Would it not be a much better test to measure power draw for, say, 10 minutes with the seconds on and then off, for 10 cycles or so, rather than waiting for the battery to run down? I guess it's more difficult for a laptop, as you have to measure between the battery and the laptop, but it should be easy for a desktop (maybe one with a laptop CPU).
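      A rough sketch of that alternating design (the power-reading and toggle callbacks are placeholders; on many Linux laptops the reading could come from something like /sys/class/power_supply/BAT0/power_now):

```python
import statistics

def ab_power_cycles(read_power_w, toggle_setting, cycles=10, samples=5):
    """Alternate the setting on/off and average power per phase, so slow
    drift (battery temperature, background tasks) hits both conditions.
    In a real measurement you'd also sleep between samples."""
    on, off = [], []
    for _ in range(cycles):
        toggle_setting(True)
        on.append(statistics.mean(read_power_w() for _ in range(samples)))
        toggle_setting(False)
        off.append(statistics.mean(read_power_w() for _ in range(samples)))
    return statistics.mean(on), statistics.mean(off)
```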

    • 0x_rs4 days ago |parent

      I agree. My guess is the way this may be implemented could keep the system from entering a lower energy state in some way or another, something which would be far less noticeable during normal usage.

  • renewiltord4 days ago

    42 minutes of battery life lost on 321 minutes of battery life is insane.

    • jeroenhd3 days ago |parent

      On an idle desktop that does nothing it makes sense. There's nothing else going on on the screen, so any difference becomes easier to notice.

      I doubt these results are as extreme when something else is happening on screen. Those results should be added to this article later:

      > We’re currently running the same test again on all three laptops to account for variance, but this time with a video playing to simulate a more active usage scenario. Once those results are in, we’ll update the relevant section with the new data.

  • __MatrixMan__4 days ago

    Landauer's principle (https://en.wikipedia.org/wiki/Landauer%27s_principle) tells us that you can't delete a bit without releasing some heat. As the new time digits come in and overwrite the old ones (in the framebuffer, in the LCD, likely other places too), this occurs as the previous digits are deleted. So the only case where showing the time would not take more power is one where other things are not held equal, e.g. some quirk of the software ends up doing more work to ignore the time than to show it (I'd call such a thing a bug).

    This effect is likely vanishingly small, definitely overshadowed by engineering considerations like the voltage used when walking pixels through changes and such. But still, it's a physics nudge towards "yes".
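    To put numbers on "vanishingly small" (my arithmetic, not the article's): at room temperature the Landauer bound per erased bit is

```latex
E_{\min} = k_B T \ln 2 \approx (1.38 \times 10^{-23}\,\mathrm{J/K})(300\,\mathrm{K})(0.693) \approx 2.9 \times 10^{-21}\,\mathrm{J},
```

    so even erasing a million framebuffer bits every second costs on the order of 3 x 10^-15 W, far below anything a battery-rundown test could resolve.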

    • bee_rider4 days ago |parent

      Landauer’s principle is an information-theoretic result about the fundamental cost of computations. With CMOS, every logic gate has multiple transistors, some of which just get charged and dumped to ground with every state transition anyway.

      It is like worry about Carnot’s limit… for a motor boat.

      • __MatrixMan__4 days ago |parent

        Well if you have two nearly identical motor boats and you're looking for differences, they might be quite small.

    • ramraj074 days ago |parent

      When the start menu is a React Native app that spikes the CPU, needing billions of FLOPs just to render itself, I doubt this number will make a difference.

      • __MatrixMan__4 days ago |parent

        Agreed, the dominant effect would likely be which ads are being served to the start menu, or which user data is being exfiltrated to Microsoft at the time.

      • dlcarrier4 days ago |parent

        Is that true? Was Active Desktop just a preview of what's to come?

      • internet20004 days ago |parent

        Wasn't that debunked already?

    • HPsquared4 days ago |parent

      What if it's an OLED screen and the clock is in a dark font on a light background, so adding seconds means less light is emitted? (Light mode only)

      • __MatrixMan__4 days ago |parent

        Yeah good point. With large enough pixels and pathological color choices you could almost certainly derive the opposite result.

        It would be interesting to test it over a remote desktop session where the screen on the device under test is off. That would eliminate a lot of factors related to the display. Presumably you'd see that the network traffic is either larger to begin with, or doesn't compress quite as well, giving you another reason to say "yes, but what if..."

    • burnt-resistor3 days ago |parent

      I'm kind of sad that reversible computing turned out to be mostly hype.

  • albert_e4 days ago

    Wonder if it make sense to architect computers with a small sidecar CPU that is not as powerful but it also runs at ultra low power ... so tasks like these can be delegated to it while allowing the main CPU to enter low power state when nothing else is placing demand on it.

    • phire4 days ago |parent

      The extra CPU usage isn't that bad. Updating a single number isn't that hard.

      The bigger problem is waking up the GPU and all the communication between components, which is why the computer with integrated graphics takes a smaller hit than the one with dedicated graphics. And why the ARM laptop did even better, because they were optimised for this usecase.

      • userbinator4 days ago |parent

        I'm not sure if the GPU is even needed for an operation like this. Basically modifying a few dozen bytes in the framebuffer every second. It would be interesting to disable all graphical acceleration, causing the CPU to do all the work, and see what difference that makes to the test.

        • jpc03 days ago |parent

          I would be much more interested in having the ability to inform both the gpu and cpu to stay in low power mode while performing this operation.

          It does not need to scale up to high performance to:

          - Read a piece of memory

          - Increment it by one, modulo 60

          - Display a section of a texture on a section of the screen.

          You will very likely spend more time passing memory around than otherwise, and to be honest if it happens every second I would hope it stays in cache so you wouldn’t ever even bother the memory.

          • phire3 days ago |parent

            Yeah, some mobile GPUs actually have hardware compositors that can do this. They can even support moving windows around with pretty low overhead.

            But the software support to take advantage of it isn't really there. There isn't a standard API to access such functionality, so the hardware compositors end up unused, and not much effort goes into improving them.

            But with proper software support and a hardware compositor with enough flexibility, you could easily put the clock in its own texture and update it with very low power consumption.

            Actually, desktop GPUs already have a single hardware sprite that gets used for moving the mouse cursor around with very little overhead (and lower latency).

        • o11c4 days ago |parent

          Windowing systems these days tend to spam the GPU in the name of performance, unfortunately.

    • jeroenhd3 days ago |parent

      Intel already has that in the form of E-cores, and ARM has had big.LITTLE for quite a while now.

      I don't think any Microsoft employee is going to spend a day writing CPU scheduler code to make sure the seconds display in the task bar is being sent to the right low-power cores. I'm surprised they even bothered to port that from Windows 10 to Windows 11 to be honest.

    • cwillu4 days ago |parent

      https://en.m.wikipedia.org/wiki/ARM_big.LITTLE

      • pbhjpbhj3 days ago |parent

        This link is to Arm's big.LITTLE, but other companies have efficiency cores (E-cores) too and may run fewer cores at lower clock to save power.

  • draebek4 days ago

    Dave Plummer, a former Microsoft Windows engineer, also did a video on this: https://www.youtube.com/watch?v=qe1ltXdKMow

  • endorphine4 days ago

    I was wondering the same when configuring Polybar w/ i3 to show seconds on my Linux system. Even if it's marginal, I think I'll disable it.

  • fkrkrkgkgk4 days ago

    Who cares about this, when a geforce 5090 is using so much power that it most often is turning power cables into a furnace.

    • opan3 days ago |parent

      People without a 5090, I guess. People using devices that run off battery rather than desktop computers.

    • mrheosuper3 days ago |parent

      people without 5090

  • DoctorOetker4 days ago

    Oh come on, the seconds are entirely predictable and run from 00 to 59. So basically: store 10 pre-rendered glyphs ("0" to "9") in the current font and foreground/background colors, plus the location on screen, in a small reserved buffer in whatever piece of code is responsible for the final screen frame handover. That buffer could be updated with minimal overhead and without a context switch.

    Yes, it would require a small API addition to the display server (Wayland, X11, ...) to register/transfer/update those 10 glyphs and their locations whenever the user initializes or changes the fonts, font size, etc., but the context switch could be totally eliminated.
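    A toy sketch of that caching idea (all names and the tiny "framebuffer" are made up for illustration): the digits are rasterized once, and the per-second work shrinks to two small memory copies.

```python
GLYPH_W, GLYPH_H = 8, 12  # invented glyph dimensions

def render_digit(d):
    """Stand-in for one-time font rasterization (here just a flat bitmap)."""
    return [[d] * GLYPH_W for _ in range(GLYPH_H)]

# Built once, whenever the user changes font/size/colors.
GLYPHS = [render_digit(d) for d in range(10)]

def blit_seconds(framebuffer, x, y, seconds):
    """Per-second work: copy two cached glyphs into the framebuffer."""
    for i, digit in enumerate((seconds // 10, seconds % 10)):
        glyph = GLYPHS[digit]
        for row in range(GLYPH_H):
            framebuffer[y + row][x + i * GLYPH_W : x + (i + 1) * GLYPH_W] = glyph[row]

fb = [[0] * 32 for _ in range(16)]  # toy 32x16 framebuffer
blit_seconds(fb, 0, 0, 42)          # draw "42" at the top-left
```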

    • fkrkrkgkgk4 days ago |parent

      It's a feature to exercise the whole display stack at every update, to make sure it works. For example, when running remote desktop (or any similar protocol/stack, like Citrix), you can observe whether the network can still send packets every second; otherwise you need to wait up to a minute before noticing connectivity issues. Both RDP and Citrix are extremely fragile, and a clock that updates every second is a very good visual indicator for monitoring the connection. https://xkcd.com/1172

  • jambutters4 days ago

    Does this happen on linux? Polybar with i3 has an option to show seconds by clicking the date and time

    • pdw4 days ago |parent

      It certainly does. There is for example a measurable energy cost for having a blinking cursor in a terminal, and there have been huge flame wars about efforts to move to non-blinking cursors.

      The compromise for GNOME Terminal is that the cursor will stop blinking after a terminal has been idle for ten seconds.

      • everybodyknows4 days ago |parent

        Why not similarly disable seconds display after some specifiable time has elapsed since the last key or mouse event? The decimal digits could be replaced with a greyed-out ":--".

    • dlcarrier4 days ago |parent

      It's going to take power, no matter the operating system. What matters is how much power it takes. On most desktop environments and widgets, it's probably negligible.

    • wirybeige4 days ago |parent

      It happens on GNOME at the very least, and I would expect every modern platform is the same way.

  • maxglute3 days ago

    What if taskbar autohides?

  • userbinator4 days ago

    There are two conclusions one can draw from this: either the idle power consumption of laptops is so low that something as trivial as updating the clock display on an otherwise idle system[1] is a significant amount, or their code is so shitty that it's taking an order of magnitude or more power than it should. Given this is Microsoft, I'm inclined to believe the latter, or that it was "deliberately" implemented in an inefficient way to "prove" their argument. It'd be trivial to write a tiny Win32 app that just has an incrementing seconds counter and use that to distinguish the latter two cases.

    [1] The caveat is that the majority of the time the system will not be idle but doing something else possibly even more energy-intensive.

  • nothrowaways4 days ago

    Tldr, yes.

  • 17186274404 days ago

    I don't get it. If the system is busy, it will update the screen more often than once a second anyway; if it is not, it will go to sleep after less than a minute. Does Windows not turn off the display when unused, and then go to sleep after a while?

    • viraptor4 days ago |parent

      This is not a test of normal usage. This is a scenario of sleep disabled and an idle system.