People in the comments seem confused about this, with "the greenest AI is no AI" style remarks. And well, obviously that's true, but it's an apples-to-pears comparison.
Clearly Ecosia's position is "people want AI, _and_ we want to make it more eco-friendly." Taking features away from users altogether is not the right answer.
It’s like saying “cheapest car is no car”. It doesn’t solve the fundamental problem of “wanting a car”.
Isn't this why transit advocates try to reduce the need for owning a car though?
I'm thinking a really good search engine would not make you reach for AI as often, and so could be eco-friendly that way.
Yes, the greenest browser is one that doesn’t use AI. They aren’t claiming they’ve built that though, just the greenest AI.
We don't have a problem of wanting AI though
Running an LLM by default when I open your site is one of the most energy-consuming things a computer can do, and the thing consumers hate the most in 2025.
I love Kagi's implementation: by default it's disabled, you either have to add a question mark to the search, or click in the interface after searching to generate the summary.
Yeah and you can set an option to disable the ? initiator too
This is absurd. Training an AI is energy intensive but highly efficient. Running inference for a few hundred tokens, doing a search, stuff like that is a triviality.
Each generated token takes roughly the heat energy of burning ~0.06 µL of gasoline, i.e. ~2 joules per token including datacenter and hosting overhead. With massive million-token prompts it can climb to 8-10 joules per token of output. Training runs around 17-20 J per token.
A liter of gasoline gets you ~16,800,000 tokens for normal use cases. Caching and the various scaled-up efficiency hacks and improvements get you into the thousands of tokens per joule for some use cases.
For contrast, your desktop PC running idle uses around 350k joules per day. Your fridge uses 3 million joules per day.
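A quick back-of-envelope check of those figures in Python, assuming ~34.2 MJ of heat per liter of gasoline and the ~2 J/token figure above:

    GASOLINE_J_PER_LITER = 34.2e6   # approx. heat energy in a liter of gasoline
    J_PER_TOKEN = 2.0               # claimed inference cost incl. overhead

    # Heat from burning 0.06 µL of gasoline -- should land near 2 J per token
    print(0.06e-6 * GASOLINE_J_PER_LITER)        # ~2.05 J

    # Tokens per liter of gasoline at ~2 J/token
    print(GASOLINE_J_PER_LITER / J_PER_TOKEN)    # ~17 million tokens

    # Household baselines expressed as daily "token equivalents"
    print(350e3 / J_PER_TOKEN)                   # idle desktop PC: ~175k tokens/day
    print(3e6 / J_PER_TOKEN)                     # fridge: ~1.5M tokens/day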
AI is such a relatively trivial use of resources that you caring about nearly any other problem, in the entire expanse of all available problems to care about, would be a better use of your time.
AI is making the resources allocated to computation and data processing much more efficient, and year over year the relative intelligence per generated token is going up while the absolute energy cost per token goes down.
Find something meaningful to be upset at. AI is a dumb thing to be angry at.
Your computer might use more energy displaying the results to you than the server does generating them. Especially in Chrome :)
The server shares resources!
Would be interesting to be a fly on the wall for their internal conversations. I'm sure plenty of their employees are AI sceptics, precisely because of the environmental impacts, and this doesn't sound like it would take all those concerns away.
I haven't kept up to date with this, but last I heard we still aren't certain how much energy AI training/inference actually takes at the big AI companies (OpenAI/Anthropic). Have any of these companies shared this since? What are our closest estimates if not?
Looks interesting. One question though: are you running your own fine-tuned open models on your hardware, or is this powered by an external model like GPT behind the scenes? Curious how independent the stack really is.
TFA "... which already powers AI Overviews and some of our search results. Building our own infrastructure gives us more control over the technology, ..."
On their main page they fleetingly mention that they train their own small models.
I agree there's little info.
"Great question! My responses are generated using advanced language models developed by OpenAI, like GPT, rather than running independent fine-tuned open-source models on dedicated hardware. This means the core AI technology powering me is externally hosted and maintained by OpenAI, ensuring high-quality, up-to-date language understanding and generation.
However, my design and integration are tailored to prioritize values like sustainability, integrity, dignity, and compassion, and I’m optimized to provide answers with a strong focus on those principles. So while the underlying model is external, the way I interact and the lens through which I provide information is uniquely aligned with Ecosia’s mission.
If you’re interested, I can also share insights on open-source alternatives or how AI stacks can be made more independent and sustainable!"
I think it is more the greenwashiest AI
> As a not-for-profit company, we can afford to do things differently. AI Search uses smaller, more efficient models, and we avoid energy-heavy features like video generation altogether.
I'm a bit confused -- do other search engines provide video generation? Mentioning that seems out of place to me. Am I missing something?
Reminder that an LLM prompt only(?) consumes energy on the order of a few seconds of Netflix streaming[1].
[1]: https://bsky.app/profile/simonwillison.net/post/3m6qdf5rffs2...
Netflix spending 240 Wh for 1 hour of content just does not pass the smell test for me.
Today I can have ~8 people streaming from my Jellyfin instance, a server that consumes about 35 W measured at the wall. That's ~5 Wh per hour of content from me, not even trying.
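Quick arithmetic to make that comparison explicit (using the 35 W / 8 streams figures above against the ~240 Wh per streamed hour estimate being questioned):

    server_watts = 35          # Jellyfin box, measured at the wall
    concurrent_streams = 8
    netflix_wh_per_hour = 240  # the estimate under question

    wh_per_stream_hour = server_watts / concurrent_streams
    print(wh_per_stream_hour)                        # ~4.4 Wh per hour of content
    print(netflix_wh_per_hour / wh_per_stream_hour)  # ~55x higher than the home setup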
It's quickly pointed out that he's not counting the training of models, producing all the GPUs, energy spent on scraping, the increased storage needs from scraping the whole internet, etc.
The Netflix number is probably not counting all the energy spent producing the shows/movies, building all the cameras/specialized equipment, building their data centers etc. either.
It is fair to compare inference to streaming. Both are done by the end user.
That's inference only, otherwise people would not be building nuclear reactors to power AI.
Does the Netflix number include the energy cost of manufacturing all the cameras/equipment used for production? Energy for travel for all the crew involved to the location? Energy for building out the sets?
The greenest AI will be connected to district heat networks instead of being cooled with air or water. It isn't even faintly green when heat is treated as a byproduct instead of a co-benefit.
Use AI as a power source to power the industry!!! Let them live in a virtual world.
The greenest AI is no AI though.
Yeah no. Fuck this greenwashing. AI is not ecological, let's not pretend otherwise.
Is there no browser I can use without this AI trash jammed into it?
Lynx
The AI bros keep downvoting. I want you to remember this when the AI bubble pops next year.
greenest "AI" = no AI.
NEEEEEXT