How Apple's AI plan could succeed on June 8

Welcome back. AI’s layoff narrative is getting louder than the evidence, giving companies a convenient excuse for job cuts that may still have more to do with pandemic overhiring than automation. Our latest podcast looks at why cloud-only AI may not scale, and why the future could run across phones, laptops, and other devices. And Apple gets its next big test on June 8, when its AI reboot could show whether Siri becomes a gateway to simpler, more trusted AI for billions of users. —Jason Hiner
1. How Apple's AI plan could succeed on June 8
2. The AI layoff panic is outrunning the data
3. Why AI is about to move from cloud to edge
BIG TECH
Apple's AI reboot at WWDC is bigger than Siri
Apple Intelligence is preparing for its big reboot, and there's a lot at stake for the two billion Apple users worldwide.
While Apple has taken it on the chin for failing to deliver a revamped Siri or the full suite of Apple Intelligence features it promised at WWDC 2024 two summers ago, it still has a clear opportunity to play a key part in the global spread of AI in the years ahead. And come June 8, at the company's WWDC keynote, we'll learn how big an AI leap it intends to take.
Only about 16% of the global population uses AI today, which leaves plenty of room for Apple, especially since it makes the devices that many people will use to access AI tools. But to get the next wave of people onboard, devices and services need to be easier to use and less expensive. Apple isn't likely to help on the affordability side of the equation, but it can certainly play a key role in making the technology easier to use. And once Apple figures out an easier path, the cheaper alternatives tend to copy it.
We'll learn the details of Apple's new vision for AI on the iPhone and other devices at the June 8 keynote of its annual WWDC event. The features will roll out in beta this summer and will officially land on the iPhone and other Apple gear in software updates this fall.
The two big moves to watch are:
Gemini-powered Siri: While most Apple users have written off using Siri for AI-related tasks, it's about to get a widely discussed brain transplant in iOS 27, powered by Google Gemini. The tersely worded statements from Apple and Google make it sound as though Siri will now use Gemini models under the hood, while the entire user experience will be crafted by Apple. That could include a new standalone Siri app and chatbot functionality to compete with ChatGPT, Claude, Perplexity, and others. All of those apps keep piling on features; we'll see whether Apple can simplify the AI experience instead.
A new AI section in the App Store: Apple is also reportedly planning to launch a dedicated AI "Extensions" feature inside the App Store. According to Bloomberg's Mark Gurman, this will allow Apple users to download features such as agents and other AI tools from frontier labs and software developers and integrate them into Siri. The motivation here could be to make Siri an AI superapp that can pull in your favorite AI features from any AI developer and integrate them more smoothly into your phone or laptop.

Even without a revamped Siri or the more advanced Apple Intelligence features, such as Personal Intelligence, Apple has not been completely left out of the AI race. Its greatest asset right now is its Apple Silicon chips, with neural processors purpose-built to handle AI workloads. That's why so many software developers at the frontier labs are creating their latest models on Macs. It's why they almost always release the latest features of their desktop apps on Mac first. And it's why Mac minis have been flying off the shelves to run personal AI agents such as OpenClaw. Apple is already winning in AI hardware. If it can pair that with an easier user experience to make AI accessible to more people with the next version of Siri, then it will play a key role in the next phase of AI. That's easier said than done. But let's also remember that Apple has user trust, privacy, and security on its side, if it can get the software right.
TOGETHER WITH ORACLE NETSUITE
Get your 2026 AI-Financial Handbook
AI in finance is moving from hype to hands-on.
Whether you’re a CFO modernizing finance operations or an analyst looking to work faster and more accurately, this handbook delivers actionable strategies you can use right away—no technical background required.
Download Nicolas Boucher’s handbook for ready-to-use prompts, NetSuite AI features, MCP guidance, security best practices, and a 30-day plan to drive measurable impact starting with one use case this week!
WORKFORCE
Why AI makes a convenient layoff scapegoat
It’s still unclear whether AI can do the work of white-collar employees. People are losing their jobs anyway.
A recent report from Challenger, Gray & Christmas found that more than a quarter of layoffs in April were attributable to AI, with more than 21,000 cuts announced. AI was cited as the leading rationale for job cuts for the second month in a row, according to the report.
“Regardless of whether individual jobs are being replaced by AI, the money for those roles is,” Andy Challenger, workplace expert and chief revenue officer for the company, said in the report.
However, one White House official is challenging that narrative: On Monday, National Economic Council Director Kevin Hassett told CNBC that there is “no sign in the data” that AI has cost anyone their job just yet. Hassett said that companies that adopt AI tend to see “rapid revenue growth” and a bump in employment.
“We are studying the future of AI and what it means for the workforce, so we've got a big task force on that," Hassett told CNBC.
These contradictory reports add to a mountain of warring data on the effects of AI on how people work. An MIT study suggests that more than 11% of work hours in the US can already be automated. A Gartner forecast finds that 50% of those laid off due to AI will be rehired. Meanwhile, a Harvard study found that AI actually increases the hours and scope of work, rather than reducing them.
At the same time, the layoff toll keeps climbing. Tech firms including Block, Atlassian, Meta, Oracle, and Amazon have slashed thousands of jobs in recent months as they ramp up AI spending and reorganize their workforces.
These cuts are likely to continue. An April survey of thousands of C-suite executives by Writer, an AI agent platform, found that 60% of enterprises intend to lay off employees who can't or won't use AI.

Despite the enticing promise of AI, the technology remains nascent. Given ongoing issues with accuracy, hallucination, and data security, it's unclear whether it is actually capable of taking over jobs entirely, or whether it just looks like it can. Either way, AI is likely not the sole reason for these cuts; it's a convenient excuse. By claiming to reorganize around AI, companies appear to be riding the innovation curve rather than admitting they overhired during the pandemic. In other words, these cuts look more like AI washing: using the tech as a scapegoat rather than owning up to a miscalculation.
TOGETHER WITH CRUSOE
Stop wrangling GPU clusters. Fine-tune open-source models in an afternoon with Crusoe Cloud
Fine-tuning shouldn't require a platform build. Crusoe Serverless Fine-Tuning is now in private preview — submit a job, get your weights back, ship your model. No cluster provisioning. No surprise bills. No infrastructure tax.
👉 From job submission to production-ready artifact — same day.
✅ Fine-tune top open-source models on your proprietary data.
✅ Your weights, your IP — portable .safetensors, zero lock-in.
✅ Built-in cost guardrails, automated recovery, and early stopping.
CONSUMER
Why the future of AI is hybrid, not cloud-only
What happens when AI moves from cloud-only to running everywhere, including on your laptop, your phone, and other devices around you?
In this episode of Deep View Conversations, senior reporter Sabrina Ortiz sits down with Olena Zhu, who leads AI for the client computing group at Intel, to explore one of the biggest shifts underway in AI: the move toward accessible, affordable, and privacy-first AI systems.
Zhu explains why the economics and infrastructure demands of cloud-only AI may not scale indefinitely, and why on-device AI could become a critical part of the industry's future. She also reflects on the evolution from traditional AI systems to LLMs and now to agentic AI, and why this wave feels fundamentally different from the hype cycles that came before it.
The conversation also dives into how AI is changing the way people work, learn, and experiment, including the surprising mindset Zhu believes helps people get the most value from AI tools today.
Topics covered include:
+ Why cloud-only AI has limits
+ The future of on-device and edge AI
+ AI affordability, energy use, and data sovereignty
+ How agentic AI changed Zhu’s workflow
+ Why experimentation matters more than expertise
+ Intel’s vision for privacy-first AI systems
+ The hidden infrastructure challenge behind AI growth
+ Why AI adoption may depend on trust and accessibility
If you're concerned about the affordability, accessibility, and privacy of AI, you don't want to miss this episode.
Subscribe to Deep View Conversations for interviews with the leaders shaping the future of AI, business, and technology.
LINKS

Google report finds it likely thwarted AI-powered “mass exploitation event”
OpenAI launches Deployment Company with $4 billion of initial investment
Thinking Machines unveils its interaction models in research preview
Softbank founder Masayoshi Son in talks for data center project in France
European Commission in talks with OpenAI, Anthropic on model use
OpenAI debuts Daybreak, its latest AI security initiative

Claude Platform on AWS: Customers of AWS now have access to all of Claude’s API features, with AWS authentication, billing, and commitment retirement
ChatGPT add-ons: OpenAI’s model is now available as an add-on for Excel and Google Sheets
Claude Code: Anthropic’s flagship coding platform now has “agent view,” allowing users to dispatch Claude agents into multiple sessions at once
Snapseed 4.0: Google has rolled out a major update to its photo editing tool for Android

Anthropic: Research Engineer, Universes
Berkeley Lab: AI-Readiness & Data Automation Postdoctoral Scholar
Microsoft: Research Intern - Applied Speech Research
Scale AI: Research Scientist, AI Controls and Monitoring
A QUICK POLL BEFORE YOU GO
Have you tried Claude Code or Claude Cowork yet?
The Deep View is written by Nat Rubio-Licht, Sabrina Ortiz, Jason Hiner, Faris Kojok and The Deep View crew. Please reply with any feedback.

Thanks for reading today’s edition of The Deep View! We’ll see you in the next one.

“The sea reflects what is in the perspective. The sun is still not quite risen but the burst is realistic whereas in [the other image] the sun is too round and not intensive; obviously put there for perfection. Nature is not perfect.”
“[This image] seems to add a more appealing contrast and highlights (i.e. such as the outline of the sun on the subjects). [The other image] seems to have dark spaces with poorly contrasted aspects to it. The trees also seem blotchy and not as appealing as the perfect trees in [This image].”


If you want to get in front of an audience of 750,000+ developers, business leaders and tech enthusiasts, get in touch with us here.
