Demand is internal and speculative. It’s not market driven and in some cases (AI features in existing products) is counter to clear market signals.
How is it internal or speculative? ChatGPT is the 5th most popular website. Gemini is 30th, but they have increasing demand and a ton of it isn't on the Gemini main site. And that isn't their only external demand, of course.
I think they are referring to the fact that Google has shimmied AI into every one of their products, so the demand surge is a byproduct of decisions made internally. They are themselves electing to send billions of calls daily to their models.
As opposed to external demand, where vastly more compute is needed just to keep up with users torching through Gemini tokens.
Here is the relevant part of the article:
"It’s unclear how much of this “demand” Google mentioned represents organic user interest in AI capabilities versus the company integrating AI features into existing services like Search, Gmail, and Workspace."
ChatGPT being the #5 website in the world is still indicative of consumer demand, as their only product is AI. Without commenting on the Google shims specifically, AI infrastructure buildouts are not speculative.
It's indicative of demand when it's free, yeah. Try charging every user enough to operate at cost and we'll see what the real demand is.
Google isn't ChatGPT. Normal people have no idea what Gemini is and are annoyed by the crappy AI summaries in their Google searches.
That's not true at all. People love things like Nano Banana and NotebookLM.
What you're quoting is Ars' pandering and its need to placate its peanut gallery. The clique of tech bros hating AI is the exception, not the norm.
You don't think it's plausible that Google's need to 1000x infrastructure has a lot to do with their very liberal incorporation of AI across the entire product suite?
I don't really care either way what the source of the demand is -- but it seems like an uncontroversial take.
These growth requirements are simply infeasible.
Semiconductor density, speed, and power efficiency grow much slower than doubling every six months. Creating custom silicon for this won't help - plenty of companies, including Nvidia, are already optimizing their hardware for these tasks and they are very good at it. Production capacity can't scale nearly that fast for myriad reasons, including access to materials, production capacity of inputs, the time and complexity of building new production facilities, the lack of experts available for all of this, and the long time frames for training new people.
This to me is the biggest sign that this is a bubble. Even if demand shrinks, it could still be huge. Even if many use cases are impractical, we will still find some where it's valuable. But the market is basing its valuations on forecasts of tremendous growth that simply can't be supported physically.
I agree it's infeasible.
Even if you can make the chips, the power supply to run them is very unlikely to grow at that kind of pace.
Most of the demand seems to be driven by a bubble strategy of selling dollar bills for $0 rather than $1.
Isn't the whole big thing with Chinese AI, being much more energy- and processor-efficient, the unspoken push here?
What happens when this bubble pops? I am trying to avoid any time or financial investment in AI as I look around the corner. I am also looking back at the last big bubble. I got my first house for $129k at 2800sqft when the housing bubble popped. There were just so many empty houses available.
So now I am wondering what will be available once the AI investment implodes. I am thinking about computer hardware, available cloud infrastructure employees for hire on the cheap, and more.
I am also looking at the consequences for the incumbents reliant on both AI and cloud that either cannot pay their bills or who fail because their service providers no longer exist. I can’t help but see lots of opportunity on the horizon.
I'm also interested in this question; I'm not interested in profiting, just curious about the future. Other HN users gave these answers:
"If the price of inference drops through the floor all the AI wrapper companies become instantly more valuable. Cursor's agents suck ...but their position would get much better with cheap inference." (edited)
"Buy the application layer near winners. When computing costs shrink, usage expands."
"massive Oracle Crypto Farm"
So the answer seems to be: 1. look at what is being priced out now, as this will be cheaper later. 2. Assume the big players will shrink and the smaller players will grow.
If crypto fills the gap I think we could also see a devaluation in coins?
If the data centres stop using so much electricity, will supply of electricity increase and so the costs will decrease?
Personally I'm looking forward to cheaper consumer GPU cards.
They're trying to increase their "compute capacity" over 5 years but aiming for the same cost and energy consumption. It's a drive for some buildout combined with hardware/algorithm efficiency improvements.
They're not trying to build double the data centers every 6 months. But Ars likes the "economic collapse" narrative because I guess their journalist spends too much time on social media.
If a recession comes, it likely won't come from AI, and in any case nothing will happen to these huge tech companies.
The quoted goal is a 1000x increase over 5 years. That works out to an average of a 1.9953x increase every 6 months.
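For anyone who wants to check that arithmetic, here's a minimal sketch (the only input is the 1000x-in-5-years figure quoted above):

```python
# 5 years = 10 six-month periods. A 1000x increase over that span
# implies a constant per-period growth factor of 1000^(1/10).
periods = 5 * 2
growth_per_period = 1000 ** (1 / periods)

print(growth_per_period)             # ~1.9953, i.e. near-exact doubling every 6 months
print(growth_per_period ** periods)  # ~1000.0, sanity check
```

So "we're not trying to double every 6 months" and "we're targeting 1000x in 5 years" are, mathematically, the same claim.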
> So now I am wondering what will be available once the AI investment implodes.
Memory/RAM. See also:
Probably some usable GPUs too.
> Probably some usable GPUs too.
They'll find some other bullshit to use them for, just like the crypto-to-AI transition.
What happens? Said executive leaves for "a more exciting project" at another company after prudently cashing in on his stocks before the pop.
If it collapses, lenders will own (in general) giant tilt-up buildings with two separate substations tied to two different utility circuits, a shitload of generators and UPSes, a shitload of chillers and cooling towers, and miles of cable tray. Plus whatever computing hardware is inside.
I assume the excess electrical and HVAC gear would be stripped out and sold before converting the tilt-up building into yet another Amazon distribution center.
It’s not particularly useful infrastructure, unlike all of the dark fiber laid during the dotcom boom.
From a quick search, it looks like some AI datacenters are not designed or built with hyperscaler-grade reliability in mind, so they would have to be upgraded for that purpose.
https://blogs.microsoft.com/blog/2025/11/12/infinite-scale-t...
> We are pushing the envelope in serving this compute with cost-efficient, reliable power. The Atlanta site was selected with resilient utility power in mind and is capable of achieving 4×9 availability at 3×9 cost. By securing highly available grid power, we can also forgo traditional resiliency approaches for the GPU fleet (such as on-site generation, UPS systems and dual-corded distribution), driving cost savings for customers and faster time-to-market for Microsoft.
“It won’t be easy but through collaboration and co-design, we’re going to get there.”
Notably, not through AI ;)
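For anyone not fluent in nines, here's a rough sketch of what "4×9 availability at 3×9 cost" means in downtime terms (standard availability arithmetic, nothing specific to Microsoft's Atlanta site):

```python
# Allowed downtime per year for "N nines" of availability.
MINUTES_PER_YEAR = 365 * 24 * 60

def downtime_minutes_per_year(nines: int) -> float:
    availability = 1 - 10 ** (-nines)
    return MINUTES_PER_YEAR * (1 - availability)

print(downtime_minutes_per_year(3))  # ~525.6 min/year (~8.8 hours) at 99.9%
print(downtime_minutes_per_year(4))  # ~52.6 min/year at 99.99%
```

Four nines allows about a tenth of the downtime of three nines, which is the gap they're claiming resilient grid power can close without on-site generation, UPSes, or dual-corded distribution.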
Is it just me, or is anyone else finding that when you search for documentation on Google it's becoming hard to find, especially documentation for new versions of old libraries?
It's amazing how up in arms people were about blockchain's power requirements and how laissez-faire everyone is about AI's.
What "AI"? The current search situation is:
- The forced "AI" summary is wrong.
- If you click on "Google AI", it gives a new summary that contradicts the initial one.
- If you check Wikipedia or the top real search result, they contradict both the above.
Should the board intervene and fire Pichai? Does the board know something we don't, e.g., are there massive surveillance contracts with the NSA and the "AI" demand is internal?
I'm wondering whether they're seeing "AI" as the next big thing for mass consumer markets.
IMHO, there are few "big things" for mass consumer markets, and we haven't had one recently.
In chronological order, I'd put the PC ('90s), the internet and online shopping (2000s), smartphones and "social" media (2010s), and that's it.
I consider something to have reached the "mass consumer market" when a 75-year-old grandma considers it normal.
Yes, there is plenty of new stuff (online services around phones, smartwatches, assistants like Alexa), but nothing that's used by almost everyone.
I strongly believe it is _the_ next big thing.
The easiest way to see why is that every high schooler and university student is using LLMs (whether that is a good thing or not is irrelevant). These people will enter the job market in a few years and carry the habit with them.
I know 75-year-old grandmas who consider ChatGPT normal and use it all the time. I bet that number gets even higher if you count the people trusting Google AI summaries, although that's more dubious since they didn't opt in.
"Demand"? Nobody asked for or wants 90% of Google's AI features. That stupid Clippy crap in Google Docs? Yeah, I fucking hate it and I'm not allowed to turn it off.
The CIA wants your questions. Remember: "Who controls the past controls the future" (George Orwell).
Because that's gonna scale. /S
The bubble thing: here's more concerning evidence that we're heavily committed to a wild unknown. High risk is high.
We are building a new species that will likely lead to the end of the human species.
This is not a bubble; ChatGPT is the number one app on Android. Many businesses are replacing most of their employees.
> We are building a new species that will likely lead to the end of the human species.
No, just to improved surveillance. This new species cannot exist and reproduce on its own.