If true, this is the deal of the century. Apple pays 1/14th of a Wang per year for a top-tier model, whereas Meta burns multiple Wangs a year in salary alone and gets garbage.
For those equally confused: Meta bought 49% of Scale AI for $14.3 billion, purportedly largely to bring Alexandr Wang on board.
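If you want the back-of-the-envelope arithmetic behind the joke spelled out, here is a tiny sketch using the figures quoted in this thread (reported numbers, not verified):

```python
# "Wang" arithmetic using the figures quoted above (reported, not verified).
one_wang = 14.3e9           # Meta's reported $14.3B for 49% of Scale AI, treated as 1 Wang
apple_annual_payment = 1e9  # Apple's reported ~$1B/year payment to Google for Gemini

print(f"Apple pays roughly 1/{one_wang / apple_annual_payment:.0f} of a Wang per year")
# -> roughly 1/14 of a Wang per year
```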
What is a "Wang" ?
When is this not true?
It is cheaper to buy GPUs than to develop the capability to build your own GPUs.
This is remarkable if you consider how much it must wound Apple's pride to make this deal with their main rival in the smartphone software space, especially after all the fuss they made about "Apple Intelligence". It's a tacit admission that Google is just better at this kind of thing.
I don't think it hurts their pride at all when they are taking tens of billions from Google so it can be the default search engine on iOS. So they give a little of that back to Google; it's still clear who is doing well in this arrangement between the two companies.
> wound Apple's pride
Do businesses really "think" in a personified manner like this? Isn't it just whatever the accounting resolves to as the optimal path?
Despite decades of efforts to reduce individual accountability in corporations to zero, companies (as social groupings) definitely still have some sense of identity that shines through in decisions.
The C-levels leading the companies might, and the tech CEOs in question have been at the helm long enough to build up some emotional attachment.
> tacit admission that Google is just better at this kind of thing
Yet at the same time Google has the worst offering of all the major players (all of which started up out of thin air) in this space.
It doesn't really matter anyway; the LLM is a commodity piece of tech. The interface is what matters, and Apple should focus on making that rather than worrying about scraping the entire internet for training data and spending a trillion on GPUs.
> Yet at the same time Google has the worst offering of all the major players (all of which started up out of thin air) in this space.
Is that so? Gemini Models (including Nano Banana), in my experience, are very good, and are kneecapped only by Google’s patronizing guardrails. (They will regularly refuse all kinds of things that GPT and Claude don’t bat a weight at, and I can often talk them out of the refusal eventually, which makes no sense at all.)
That’s not something Apple necessarily has to replicate in their implementation (although if there’s one company I’d trust to go above and beyond on that, it’s Apple).
I'm not sure. It could be a way to save a ton of money. Look at the investments non-Apple tech companies are making in data centers & compute.
Maybe paying Google a billion a year is still a lot cheaper?
Apple famously tries to focus on only a few things.
Still, they will continue working on their own LLM and plug it in when ready.
Edit: compare to another comment about Wang-units of currency
Well, they would still be running the Google models in Apple DCs, so I doubt this is a very cost-efficient deal for them.
> that Google is just better at this kind of thing
That might be true, but Siri sucks so bad it doesn't matter. It uses GPT, yet the quality is at the level of OSS models.
As of its fiscal quarter ending September 2025 Apple had $35.93 billion in cash and cash equivalents.
I think the answer here involves licensing and Apple's control of the infrastructure, but my first thought was, "I historically trust Apple with my data a bit more than I trust Google, so how is this not just trusting Google with my data?"
Apple previously pitched a vision of local-first AI for privacy, but seems to have badly miscalculated the kind of customer experience they could provide. My personal experience is that Siri has suffered greatly.
Case in point: I like to listen to music in the car, and Siri now confidently starts playing artists whose names sound nothing like what I requested. Also maddening: "Play [x] on Apple Music" gets you "You'll need to authorize me to use YouTube Music".
Still, I live with / pay for so much that is broken based on a kind of Apple privacy-vibes inertia. Siri being wired up to more of my personal information, plus Apple maybe shipping that data to Google, is going to make me reevaluate that.
Mind-blowing that they couldn't get this to work. It's struck me lately that the models don't seem to matter anymore; they're all equally good.
The UX and integration with regular phone features are what make the tool shine, and by now there should be plenty of open-source models and know-how to create their own.
What is Google offering that Apple can't figure out on their own?
Maybe people don't use personal-assistant AI enough to justify the investment? My phone has probably 6 or 7 AI tools with talking features that I never explore.
The LLM business is not a one-shot "figure it out and then collect easy money"; it's constant work and expense just to keep the LLM functionality running. So if Apple analyzed this and decided they would rather rent that capability, it seems quite logical. Also, Google already has ties to Apple; they may even strike a deal where search on iOS is bartered (maybe partially) for Gemini service. Win-win. And Google is not going out of business any time soon, so it's more reliable than any pure-LLM company.
Another, less likely possibility is that Apple may be reluctant to scrape enough data to train their own LLM to a competitive level and then keep doing so in perpetuity. They have this notion that they are the privacy-oriented FAANG company, and may want to keep up that image.
Maybe it is the sum total of a lot of factors, which in the end tilted the decision toward a rental model.
I don't know; Gemini 2.5 has been the only model that doesn't consistently make fundamental mistakes with my project as I've been working with it over the last year. Claude 3.7, 4.0, and 4.5 are not nearly as good. I gave up on ChatGPT a couple of years ago, so I have no idea how it performs now. It was bad when I quit using it.
I use all of them about equally, and I don't really want to argue the point, as I've had this conversation with friends, and it really feels like it is becoming more about brand affiliation and preference. At the end of the day, they're random text generators and asking the same question with different seeds gives different results, and they're all mostly good.
Do you find that Gemini results are slightly different when you ask the same question multiple times? I found it to have the least consistently reproducible results compared to others I was trying to use.
Sometimes it will alternate between different design patterns for implementing the same feature on different generations.
If it gets the answer wrong and I notice it, often just regenerating will get past it rather than having to reformulate my prompt.
So, I'd say yeah... it is consistent in the general direction and understanding, but not so much in the details. Adjusting the temperature does help with that, but I often just leave it at the default regardless.
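For what it's worth, the run-to-run variation can be reduced (though not eliminated) by pinning the sampling parameters when calling the API directly instead of the chat UI. A minimal sketch, assuming the google-generativeai Python client; the model name and prompt are just illustrative:

```python
# Minimal sketch: pin sampling parameters to reduce run-to-run variation.
# Assumes the google-generativeai client; model name and prompt are illustrative.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key
model = genai.GenerativeModel("gemini-2.5-pro")

response = model.generate_content(
    "Suggest a design pattern for implementing feature X.",
    generation_config={"temperature": 0.0, "top_p": 1.0},
)
print(response.text)
```

Even at temperature 0 you can still see occasional differences between runs, so this narrows the spread rather than making the output fully deterministic.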
Coincidentally, Google pays Apple over a billion dollars a year (estimated at $1.5B) to be the default search in iOS. Could be re-titled.
Google closes their trade deficit to half a billion dollars per year.
It's actually much more than that: $20 billion per year.
WHAT -- just had to Google and you're absolutely right! Makes sense that time has moved forward.
https://www.theverge.com/2023/10/26/23933206/google-apple-se...
I'm still using numbers from a post I wrote in 2016
https://techcrunch.com/2016/07/24/apple-lays-the-groundwork-...
Today I asked my HomePod:
> "Hey Siri, when's the next Formula 1 race in Montreal?"
and she responds with the same infuriating answer I typically get
> "Hmm, I found some interesting results on the web, I can show them to you if you ask again from your iPhone"
I don't care what pride Apple has to swallow, or if they have to lay off 10,000 people.
I just want my device ecosystem to be able to do what its competitors have been able to do for a decade, or what I've been able to build myself for the last 3 years: a working and useful voice assistant.
At this point I'm convinced Tim Cook could sit at a terminal himself and ship a better version of what Apple has in an afternoon.
This isn't good news.
It means that Apple's huge, expensive AI team has basically failed.
And it presumably means that Apple is willing to accept Google's practices for ML model training and use.
I really hope Apple is working hard on improving on-device models for their use case so they can get out of this.
Companies working hard to bring us closer to on-device models are Kingston, Hynix, Micron and the like, not Apple. If they succeed, we will get on-device LLMs sooner. If not, well, it may take a while.
On device models are already so good. It's so insane Siri doesn't just use them.
Or why HomePods don't get answers via iPhone.
When I first read the headline, I thought they’d licensed a customized Gemma 3n for an on-device model.
The question is if Apple will buy TPUs to run it too.
$1B for the software and $1B for the hardware, every few years.
Why Gemini? Just because of the existing closeness between the two companies, or is there a technical reason? I like Gemini the least because each query gives slightly different results that are hard to reproduce exactly... I find I like LibreChat the best and then just connect it to all the other LLMs (ChatGPT, Claude, etc.) from there.
Much like they pay the leaders in other specialties instead of becoming, e.g., an assembly company (Foxconn) or a search company (Google Search), they are not going to try to be a leader in large language models, at least.
Am I interpreting that correctly?
I can understand that to a degree, but that means the future for Apple is as a technology integrator, not a fundamental technology company.
As I type that out I guess I’m realizing that has always been true.
Or CPUs.
... and has widely been regarded as a bad move.
On device models please. My computer should work for me.
So now it does not matter what platform you choose for your smartphone; you cannot escape Google AI surveillance. Well, you can shut it off on the iPhone, I guess, but that means no more privacy-focused Apple Intelligence.
Next to all the money they poured into Liquid glAss, this will be the worst investment Apple has ever made.
OP's 9to5mac article states:
> Also under the agreement, Google's model will reportedly run on Apple's own servers, which in practice means that no user data will be shared with Google. Instead, they won't leave Apple's Private Cloud Compute structure.
Bloomberg states:
> The model will run on Apple's own Private Cloud Compute servers, ensuring that user data remains walled off from Google's infrastructure.
This assumes they'll make the data available to Google. With all the secure "Private Cloud Compute" stuff they advertised, there's a good chance it will not be shared.
Giving credit where it's due, I think Apple's Private Cloud Compute stuff is really interesting architecture-wise. If I remember correctly, it includes ARM CPUs with a special "realm" capability to prevent certain types of attacks and minimize the amount of trust required.
If iOS opened up the ability to implement your own assistant like VoiceInteractionService on Android, you wouldn't have to worry about it. On Android, if you don't like Google providing the service, you can switch to OpenAI, Alexa, or even your own service.
GrapheneOS.
Patiently waiting to see which Snapdragon will be supported. Hopefully something smallish.