# Apple Hires Google AI Executive and Gets Full Access to Gemini for Model Training
*By Kelly Wong • March 29, 2026*
Apple just pulled off a double play that should worry every AI competitor in Silicon Valley. The company hired Lilian Rincon, a Google veteran who spent nearly a decade running shopping and assistant products, as its new VP of product marketing for AI. And separately, Apple now has "complete access" to Google's Gemini model inside its own data centers, where it can distill smaller, device-optimized AI models for iPhones and Macs.
These two moves tell the same story. Apple isn't trying to build the biggest AI model. It's trying to build the most useful one, and it's willing to poach talent and cut deals to get there.
## Why Apple Hired Lilian Rincon From Google
Rincon reports directly to Greg "Joz" Joswiak, Apple's head of marketing. That reporting line matters. Apple didn't hire an AI researcher or an engineering lead. It hired someone who knows how to sell AI products to regular people.
At Google, Rincon oversaw products that millions of people used daily. Google Shopping and Google Assistant both required translating complex technology into experiences that felt simple and obvious. That's exactly what Apple needs right now.
Siri has been Apple's biggest embarrassment for years. While ChatGPT and Google's Gemini assistant captured public imagination, Siri stayed stuck in 2015. Apple's been working on a major Siri overhaul powered by large language models, but the marketing challenge is just as big as the technical one. How do you convince 1.5 billion iPhone users that Siri is worth trying again?
Rincon's hire suggests Apple is getting serious about that challenge. She knows the competitive landscape from the inside. She understands what Google's AI products do well and where they fall short. That intelligence is invaluable when you're positioning a rival product.
## The Gemini Distillation Deal Explained
The more consequential news is the Gemini deal. According to The Information, Apple has "complete access" to Google's Gemini model as part of an agreement announced in January. This isn't just API access. Apple can run Gemini inside its own data centers and use a technique called knowledge distillation to create smaller models.
Knowledge distillation works like this: you take a large, powerful model (the "teacher") and use it to train a smaller model (the "student"). The student learns to mimic the teacher's outputs without needing the teacher's massive size. The result is a compact model that runs efficiently on devices with limited processing power, like an iPhone.
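The teacher-student setup described above boils down to a loss that pushes the student's output distribution toward the teacher's. Here is a minimal, illustrative sketch of that loss in NumPy; real distillation pipelines add a hard-label term, scale to billions of examples, and nothing here reflects Apple's or Google's actual training code:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature > 1 softens the distribution, exposing the teacher's
    # relative preferences among its non-top answers.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between teacher and student soft distributions.

    The student is trained to minimize this, i.e. to mimic the
    teacher's full output distribution, not just its top answer.
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    kl = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)), axis=-1)
    return float(np.mean(kl))

# Toy check: the closer the student's logits track the teacher's,
# the smaller the loss.
teacher = np.array([[4.0, 1.0, 0.5]])
good_student = np.array([[3.8, 1.1, 0.4]])
bad_student = np.array([[0.5, 4.0, 1.0]])
assert distillation_loss(good_student, teacher) < distillation_loss(bad_student, teacher)
```

In practice the student's gradient flows through this loss over the teacher's outputs on huge amounts of data, which is why having Gemini inside Apple's own data centers, rather than behind an API, matters so much.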
This is a massive shortcut for Apple. Building a frontier-scale AI model from scratch requires billions of dollars in compute, years of research, and thousands of specialized engineers. Apple has the money but not the head start. Google's been training large language models since the Transformer paper in 2017. Apple's been catching up since roughly 2023.
By distilling Gemini, Apple skips the most expensive and time-consuming part of the process. It gets a high-quality starting point and can focus its engineering efforts on optimization, privacy features, and device-specific tuning. That's a much better use of Apple's strengths.
## What Apple's AI Strategy Actually Looks Like
Apple's AI approach differs fundamentally from every other major tech company. While Google, OpenAI, Anthropic, and Meta compete to build the largest and most capable cloud-based AI systems, Apple wants AI that runs directly on your device.
There are good reasons for this. On-device AI is faster because it doesn't need to send data to a remote server. It's more private because your information never leaves your phone. And it works offline, which matters when you're on a plane, in a subway, or somewhere with spotty cell service.
The tradeoff is capability. An iPhone can't run a model with 400 billion parameters. But a distilled model with 3 to 7 billion parameters? That's achievable on Apple's latest A-series and M-series chips. And if the distillation process works well, that smaller model can perform impressively close to its much larger teacher.
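A rough back-of-the-envelope memory estimate shows why those parameter counts matter; the model sizes and quantization levels below are illustrative assumptions, not figures from Apple or Google:

```python
def model_memory_gb(params_billions: float, bits_per_param: int) -> float:
    """Approximate weight-storage footprint of a model in decimal gigabytes."""
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 1e9

# A 400-billion-parameter model at 16-bit precision: far beyond any phone.
print(model_memory_gb(400, 16))  # 800.0 GB

# A 3-billion-parameter distilled model quantized to 4 bits per weight.
print(model_memory_gb(3, 4))     # 1.5 GB
```

Even before accounting for activations and the KV cache, hundreds of gigabytes of weights rule out frontier-scale models on a phone, while a quantized few-billion-parameter model fits comfortably in an iPhone's memory.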
Apple Intelligence, the company's AI platform introduced in 2024, already runs some models on-device. The Gemini distillation deal should produce significantly better models for the next generation of Apple products. Expect improvements to Siri, on-device text generation, image understanding, and app automation.
## Google's Motivation for Sharing Gemini
Why would Google give a competitor access to its crown jewels? Money and market position.
Google earns substantial revenue from its default search deal with Apple. That deal is worth an estimated $20 billion or more annually. Extending the partnership to include AI ensures Google stays embedded in Apple's ecosystem even as the industry shifts from traditional search to AI-powered interfaces.
There's also a defensive angle. If Apple built its own competitive AI model, it might eventually replace Google Search with an AI-powered alternative on iPhones and Macs. By providing Gemini as a foundation, Google ensures its technology stays at the core of Apple's AI features.
Microsoft tried a similar strategy with its OpenAI partnership, embedding AI throughout Windows, Office, and Bing. Google's Gemini deal with Apple serves the same purpose: make your AI indispensable to the platform that reaches the most users.
## How This Affects the Broader AI Market
The Apple-Google AI deal has implications that ripple across the industry. For [AI startups](/companies) trying to compete with tech giants, it demonstrates how quickly incumbents can form alliances that lock out smaller players. Building a frontier model is already expensive. Competing against a Gemini-trained Apple is even harder.
For consumers, it should mean better AI features on Apple products sooner than expected. Instead of waiting for Apple to independently develop world-class models, iPhone users will benefit from Google's years of AI research filtered through Apple's design sensibility.
For OpenAI and Microsoft, this is a competitive threat. The Microsoft-OpenAI partnership has been the most prominent AI alliance in tech. Apple-Google now forms a counterweight. If Apple's AI features outperform Microsoft Copilot on consumer devices, it could shift the balance of the AI race.
For Anthropic, the situation is more complex. The company has been building relationships with cloud providers and enterprise customers. Apple's choice to partner with Google rather than Anthropic for its foundational AI suggests that model scale and existing business relationships still matter more than safety-focused branding.
## The Talent War Intensifies
Rincon's departure from Google is part of a broader pattern. AI talent is the scarcest resource in technology, and companies are poaching aggressively. Google has lost senior AI researchers and executives to every major competitor. Apple, OpenAI, Anthropic, and Meta have all raided Google's ranks.
Apple has been particularly active in AI recruiting over the past year. The company hired several senior machine learning researchers from Google DeepMind, recruited engineers from Meta's AI research lab, and brought on specialists from smaller AI startups. Each hire fills a gap in Apple's AI capabilities.
The challenge for Apple isn't just hiring individuals. It's building a culture that attracts and retains top AI talent. Google and Meta offer AI researchers freedom to publish papers and pursue fundamental research. Apple's secretive culture clashes with that openness. Rincon's hire at the VP level signals Apple is trying to bridge that gap.
## Privacy as a Competitive Weapon
Apple has always positioned privacy as a core product feature. In the AI era, that positioning becomes even more important. Users are increasingly aware that AI assistants process their personal data, conversations, and habits.
Apple's on-device AI approach means your data stays on your iPhone. When cloud processing is necessary, Apple uses a system it calls Private Cloud Compute, which processes requests without retaining user data. This is a genuine technical achievement that Google and Microsoft haven't matched.
The Gemini distillation deal doesn't compromise this privacy advantage. Apple is using Gemini to train its own models, not routing user queries through Google's servers. The resulting distilled models run locally on Apple hardware with no data leaving the device.
For users who care about privacy, this combination is compelling: AI capabilities powered by one of the world's best models, delivered through the industry's most privacy-conscious platform. That's a hard package for competitors to match.
## What to Watch Next
Apple's WWDC developer conference in June will likely showcase the fruits of this Gemini partnership. Expect a dramatically improved Siri, better on-device AI features, and possibly new capabilities that weren't possible with Apple's previous models.
The real test comes with the iPhone 18 launch in the fall. If Apple's distilled Gemini models deliver noticeably better performance than the current Apple Intelligence features, it validates the entire strategy. If the improvements are incremental, questions about Apple's AI competitiveness will persist.
Either way, the Apple-Google AI alliance reshapes the competitive landscape. The AI race isn't just about who builds the biggest model anymore. It's about who delivers the best experience to the most users. And nobody reaches more users than Apple.
## Frequently Asked Questions
### What is knowledge distillation in AI?
Knowledge distillation is a training technique where a large, powerful AI model (the "teacher") trains a smaller model (the "student") to replicate its outputs. The student model can't match the teacher's full capabilities, but it gets surprisingly close while using a fraction of the computing resources. This makes it possible to run capable AI on devices like phones and laptops. Learn more in our [AI glossary](/glossary).
### Does Apple's Gemini deal mean Google controls Apple's AI?
No. Apple is using Gemini as a starting point to train its own smaller [models](/models). Once distilled, these models belong to Apple and run independently on Apple hardware. Google doesn't have access to how Apple uses or modifies the distilled models, and user data doesn't flow back to Google.
### How will this affect Siri?
Apple is expected to use Gemini-distilled models to significantly upgrade Siri's capabilities. The improvements should include better natural language understanding, more accurate responses, and the ability to handle multi-step tasks. The updated Siri will likely debut at WWDC in June 2026. Compare current AI assistants on our [comparison page](/compare).
### Is Apple behind in the AI race?
Apple took a different approach than competitors by focusing on on-device AI rather than cloud-based services. While this meant slower initial progress on chatbot-style features, it positions Apple well for privacy-conscious AI that works without an internet connection. The Gemini partnership accelerates Apple's capabilities significantly. Check our [learn page](/learn) for more on how different companies approach AI development.
## Key Terms Explained

**Anthropic**: An AI safety company founded in 2021 by former OpenAI researchers, including Dario and Daniela Amodei.

**Chatbot**: An AI system designed to have conversations with humans through text or voice.

**Compute**: The processing power needed to train and run AI models.

**DeepMind**: A leading AI research lab, now part of Google.