Ecosia: The greenest AI is here (blog.ecosia.org)
50 points by doener 3 hours ago | 30 comments
  • Insanity an hour ago

    People in the comments seem confused about this, leaving “greenest AI is no AI” style comments. And well, obviously that’s true, but it’s an apples-to-pears comparison.

    Clearly Ecosia’s position is “people want AI” _and_ “we want to make it more eco-friendly”. Taking away features from users altogether is not the right answer.

    It’s like saying “cheapest car is no car”. It doesn’t solve the fundamental problem of “wanting a car”.

    • nemomarx an hour ago | parent

      Isn't this why transit advocates try to reduce the need for owning a car though?

      I'm thinking a really good search engine would not make you reach for AI as often, and so could be eco-friendly that way.

      • tybit 32 minutes ago | parent

        Yes, the greenest browser is one that doesn’t use AI. They aren’t claiming they’ve built that though, just the greenest AI.

    • oytis an hour ago | parent

      We don't have a problem of wanting AI though

  • arnaudsm 2 hours ago

    Running an LLM by default when I open your site is the most energy-consuming thing a computer can do, and the thing consumers hate the most in 2025.

    • xzjis 2 hours ago | parent

      I love Kagi's implementation: by default it's disabled, you either have to add a question mark to the search, or click in the interface after searching to generate the summary.

      • alex1138 an hour ago | parent

        Yeah and you can set an option to disable the ? initiator too

    • observationist 10 minutes ago | parent

      This is absurd. Training an AI is energy intensive but highly efficient. Running inference for a few hundred tokens, doing a search, stuff like that is a triviality.

      Each generated token takes the energy equivalent of burning ~0.06 µL of gasoline: roughly 2 joules per token, including datacenter and hosting overhead. Massive million-token prompts can push that to 8-10 joules per output token. Training runs around 17-20 J per token.

      A liter of gasoline gets you about 16,800,000 tokens for normal use cases. Caching and the various scaled-up efficiency hacks and improvements get you into the thousands of tokens per joule for some use cases.

      For contrast, your desktop PC running idle uses around 350k joules per day. Your fridge uses around 3 million joules per day.
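
      As a rough sanity check, here is a minimal sketch of that arithmetic in Python. The ~34 MJ per liter of gasoline is a typical textbook figure; the 2 J/token, 350 kJ/day idle-PC, and 3 MJ/day fridge numbers are this comment's own estimates, not measurements.

        # Back-of-envelope check of the figures quoted above.
        # Inputs are the comment's estimates plus a standard gasoline energy density.
        GASOLINE_J_PER_LITER = 34e6   # ~34 MJ/L, typical figure for gasoline
        JOULES_PER_TOKEN = 2.0        # claimed cost per generated token, incl. overhead

        # Energy released by burning ~0.06 µL of gasoline (should come out near 2 J)
        per_token_j = 0.06e-6 * GASOLINE_J_PER_LITER
        tokens_per_liter = GASOLINE_J_PER_LITER / JOULES_PER_TOKEN  # ~17 million

        pc_idle_j_per_day = 350_000     # idle desktop PC, per the comment
        fridge_j_per_day = 3_000_000    # fridge, per the comment

        print(f"~{per_token_j:.2f} J per token via the gasoline comparison")
        print(f"~{tokens_per_liter:,.0f} tokens per liter of gasoline")
        print(f"idle PC per day ≈ {pc_idle_j_per_day / JOULES_PER_TOKEN:,.0f} tokens")
        print(f"fridge per day  ≈ {fridge_j_per_day / JOULES_PER_TOKEN:,.0f} tokens")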

      AI is such a relatively trivial use of resources that caring about nearly any other problem, in the entire expanse of available problems, would be a better use of your time.

      AI is also making the resources allocated to computation and data processing much more efficient, and year over year the intelligence per generated token rises while the absolute energy cost per token falls.

      Find something meaningful to be upset at. AI is a dumb thing to be angry at.

    • conradev 39 minutes ago | parent

      Your computer might use more energy displaying the results to you than the server does generating them. Especially in Chrome :)

      The server shares resources!

  • Vinnl 2 hours ago

    Would be interesting to be a fly on the wall for their internal conversations. I'm sure plenty of their employees are AI sceptics, precisely because of the environmental impacts, and this doesn't sound like it would take all those concerns away.

  • dom96 an hour ago

    I haven't kept up to date with this, but last I heard we still aren't certain how much energy AI training/inference actually takes at the big AI companies (OpenAI/Anthropic). Have any of these companies shared this since? If not, what are our closest estimates?

  • Barathkanna 3 hours ago

    Looks interesting. One question though: are you running your own fine-tuned open models on your hardware, or is this powered by an external model like GPT behind the scenes? Curious how independent the stack really is.

    • catlikesshrimp 2 hours ago | parent

      TFA "... which already powers AI Overviews and some of our search results. Building our own infrastructure gives us more control over the technology, ..."

      On their main page they fleetingly mention that they train their own small models.

      I agree it's not much info.

    • HelloUsername 2 hours ago | parent

      "Great question! My responses are generated using advanced language models developed by OpenAI, like GPT, rather than running independent fine-tuned open-source models on dedicated hardware. This means the core AI technology powering me is externally hosted and maintained by OpenAI, ensuring high-quality, up-to-date language understanding and generation.

      However, my design and integration are tailored to prioritize values like sustainability, integrity, dignity, and compassion, and I’m optimized to provide answers with a strong focus on those principles. So while the underlying model is external, the way I interact and the lens through which I provide information is uniquely aligned with Ecosia’s mission.

      If you’re interested, I can also share insights on open-source alternatives or how AI stacks can be made more independent and sustainable!"

  • LogicFailsMe 23 minutes ago

    I think it is more the greenwashiest AI

  • toonewbie an hour ago

    > As a not-for-profit company, we can afford to do things differently. AI Search uses smaller, more efficient models, and we avoid energy-heavy features like video generation altogether.

    I'm a bit confused -- do other search engines provide video generation? Mentioning that seems out of place to me. Am I missing something?

  • bcye an hour ago

    Reminder that LLMs only(?) consume energy on the order of a few seconds of Netflix[1].

    [1]: https://bsky.app/profile/simonwillison.net/post/3m6qdf5rffs2...

    • belval an hour ago | parent

      Netflix spending 240Wh for 1h of content just does not pass the smell test for me.

      Today I can have ~8 people streaming from my Jellyfin instance, a server that consumes about 35W measured at the wall. That's ~5Wh per hour of content, and I'm not even trying.
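
      For what it's worth, a minimal sketch of that per-stream arithmetic, using the 35W / 8-stream numbers from this comment and the 240Wh figure quoted from the parent:

        # Per-stream energy of a small self-hosted server vs. the quoted Netflix figure.
        server_watts = 35          # Jellyfin box, measured at the wall (per the comment)
        concurrent_streams = 8     # simultaneous viewers it can serve
        netflix_claim_wh = 240     # Wh per hour of content, the figure being questioned

        wh_per_stream_hour = server_watts / concurrent_streams  # ~4.4 Wh
        print(f"self-hosted: ~{wh_per_stream_hour:.1f} Wh per hour of content")
        print(f"quoted figure is ~{netflix_claim_wh / wh_per_stream_hour:.0f}x higher")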

    • add-sub-mul-div an hour ago | parent

      It's quickly pointed out that he's not counting the training of models, producing all the GPUs, energy spent on scraping, the increased storage needs from scraping the whole internet, etc.

      • DarmokJalad1701 an hour ago | parent

        The Netflix number is probably not counting all the energy spent producing the shows/movies, building all the cameras/specialized equipment, building their data centers etc. either.

        It is fair to compare inference to streaming. Both are done by the end user.

    • oytis an hour ago | parent

      That's inference only, otherwise people would not be building nuclear reactors to power AI.

      • DarmokJalad1701 an hour ago | parent

        Does the Netflix number include the energy cost of manufacturing all the cameras/equipment used for production? Energy for travel for all the crew involved to the location? Energy for building out the sets?

  • hkt an hour ago

    The greenest AI will be connected to district heat networks instead of being cooled with air or water. It isn't even faintly green when heat is treated as a byproduct instead of a co-benefit.

    • weikju an hour ago | parent

      Use AI as a power source to power the industry!!! Let them live in a virtual world

  • eulgro 2 hours ago

    The greenest AI is no AI though.

  • w4yai 2 hours ago

    Yeah no. Fuck this greenwashing. AI is not ecological, let's not pretend otherwise.

  • josefritzishere 2 hours ago

    Is there no browser I can use without this AI trash jammed into it?

    • sph 2 hours ago | parent

      Lynx

    • josefritzishere 20 minutes ago | parent

      The AI bros keep minusing. I want you to remember this when the AI bubble pops next year.

  • monegator 2 hours ago

    greenest "AI" = no AI.

    NEEEEEXT