AI’s next big chip bet may be biological

Hello, friends. The race to control AI’s future is shifting deeper into the stack. Anthropic’s chip ambitions signal a move toward extreme co-design, where controlling compute becomes as critical as model breakthroughs. Meanwhile, Snap is betting that true AR, not just overlays and notifications, will define the smartglasses market. And in a San Francisco lab, a startup is pushing a radical idea: living neurons as compute, promising faster learning with far less energy. Jason Hiner

IN TODAY’S NEWSLETTER

1. AI’s next big chip bet may be biological

2. Why Anthropic may bring AI chips in-house

3. Snap is building what Meta and Google left out

STARTUPS

Biological chips could solve AI's bottleneck

Tucked into an unassuming office building in San Francisco, one startup is betting on an unconventional way of alleviating the AI energy crisis: living human cells. 

The Biological Computing Company, or TBC, a startup that emerged from stealth in February with $25 million in seed funding, is pioneering an alternative to traditional silicon, using living neurons as the foundation to improve generative AI algorithms and infrastructure. Sitting at the helm of TBC are former neurosurgeons Dr. Alex Ksendsovsky and Dr. Jon Pomeraniec.

Last week, The Deep View was invited to tour TBC’s lab in Mission Bay, meeting the startup's team of 23, ranging from computer vision experts and AI developers to computational physicists and hands-on biologists, many of whom came from large tech firms like Meta, Apple and Amazon. Bundled in a lab coat, mask and latex gloves, I had the opportunity to get an up-close look at the chips themselves, each containing anywhere from 100,000 to 500,000 neurons. 

Here’s how it works: 

  • TBC takes real-world data, such as images and videos, and encodes it into the neurons on these chips. That information is then decoded into richer representations, which are used to bolster AI algorithms. As Ksendsovsky put it, “all of the data flows through the biology.”

  • Each of the company’s chips contains a “multi-electrode array” in which neurons, grown from stem cells that are “reprogrammed” into frontal cortex cells, are maintained. The company originally used neurons extracted from rats, but Ksendsovsky told me that TBC is “moving away from that.” The chips have a lifespan of a year and create waste that needs to be cleaned out every few days.

Though this might sound like science fiction, Ksendsovsky said that this technology harkens back to AI’s roots. Neural networks were originally named as such because they modeled the neurons in a human brain. But as AI developed, the technology “became very non-biological,” leading silicon to become “rigid” and forcing AI scaling to resort to brute-force, power-hungry methods, he said. 

This is where TBC’s chips may have the greatest advantage, he said: Brain cells generally require less energy than silicon chips to power intelligence. In its research, TBC found that models trained on “biological neural responses” reached peak performance three times faster, requiring fewer training iterations and implying a “threefold reduction in the computational and energy demands.”
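That threefold figure follows directly from the iteration count: if energy per training iteration stays roughly constant, cutting iterations by 3x cuts total training energy by about 3x. A minimal back-of-the-envelope sketch, with purely illustrative numbers (the iteration counts and per-iteration energy below are hypothetical, not TBC figures):

```python
# Back-of-the-envelope: fewer training iterations -> proportionally less energy.
# All numbers here are illustrative assumptions, not figures from TBC.

def training_energy_kwh(iterations: int, kwh_per_iteration: float) -> float:
    """Total training energy, assuming roughly constant energy per iteration."""
    return iterations * kwh_per_iteration

baseline_iters = 300_000          # hypothetical conventional training run
bio_iters = baseline_iters // 3   # "peak performance three times faster"
kwh_per_iter = 0.5                # hypothetical energy cost per iteration

baseline = training_energy_kwh(baseline_iters, kwh_per_iter)
bio = training_energy_kwh(bio_iters, kwh_per_iter)

print(f"baseline:     {baseline:,.0f} kWh")
print(f"bio-assisted: {bio:,.0f} kWh")
print(f"reduction:    {baseline / bio:.1f}x")  # 3.0x under these assumptions
```

Any per-iteration overhead from routing data through the biology would, of course, eat into that factor.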

For now, the company isn’t selling the chips themselves, but rather is using the technology to build and strengthen algorithms that use neurological signals, primarily for visual AI, including generative video, game rendering and computer vision. Though he wouldn’t name names, Ksendsovsky said the company is in talks with foundational model labs, cybersecurity firms and data curation companies. 

“I think the most common misconception is that this is just a research project,” Ksendsovsky told me. “We are building products now. We want to show people that there are tangible results that can be monetized now. What we're building is so unique and so far out, that I think it's just going to take the world a little bit to see it and understand it.”

Eventually, TBC hopes to move to “real-time compute,” in which the biological chips themselves are part of the “inference loop,” a milestone Ksendsovsky puts five to ten years out. Getting there, however, will require automating the collection of the waste these cells generate and figuring out how to “immortalize” their lifespans.

“The reality is, there's a lot of foundational things that need to be understood and built to get to that point,” he said.

TBC’s thesis is undoubtedly outlandish. But AI, once deemed science fiction itself, has scaled so quickly that it has left a litany of issues in its wake, the greatest of which may be energy. Because of this, the leading voices in AI are grasping for technologies that also once seemed out of reach, such as fusion energy and data centers in space that harness the sun's power. Crisis is the mother of invention, and AI is unquestionably creating crises. The solutions may live within us, figuratively and literally, if TBC has anything to say about it.

Nat Rubio-Licht

TOGETHER WITH ATTIO

Attio: The AI CRM Built for Modern Businesses

The landscape is changing fast, and every signal across your business needs to count.

Attio is the AI CRM that makes sure it does.

Connect your email, calls, calendar, and product data, and Attio instantly builds a complete picture of every deal and customer. No manual logging; zero missing context.

Then Ask Attio to plan your next move. Surface pipeline risks, prep for meetings, and run deep research on prospects without lifting a finger. Powered by Universal Context, Attio’s intelligence layer, every signal across your business becomes instantly actionable.

Ask more from CRM. Ask Attio.

HARDWARE

Why Anthropic may bring AI chips in-house

Anthropic may start making its own chips for the same reason Nvidia started making its own models. 

If you're not familiar with the term "extreme co-design" yet, it's time to start learning. Nvidia executives used the phrase repeatedly at the company’s GTC event in March, and I expect it's a big part of the strategy behind the new Reuters report that Anthropic is exploring designing its own AI chips.

Extreme co-design allows you to architect the data center, chips, software stack, and LLMs in tandem, resulting in far tighter optimization and integration: lower cost per token, lower latency, and reduced power consumption.

That's why Nvidia is now making its own open-source frontier models: By building its own models, Nvidia learns how to better optimize ALL models for its AI chips.

Kari Briski, VP of generative AI software at Nvidia, told The Deep View that the process allows the company "to break some rules, to see around the corner… When we say extreme co-design, it's literally about sitting down side by side, day by day, showing each other reports… because we're all racing at the speed of light to the next thing... [And] you have to be building it right now. It can't be a year from now. So we talk about extreme co-design literally in lockstep every single day, understanding each other's pains."

But beyond co-design, there are also a couple of other reasons why Anthropic is likely considering the move:

  • Remove the constraint: Now that Anthropic has proven it can consistently build some of the world's best and most powerful AI models, one of the company's biggest constraints is simply getting enough compute to run its models, serve its chatbot and coding tools to new and existing customers, and train future models. That's why it's been signing new deals with companies like Google/Broadcom and CoreWeave. But chips themselves are one of the biggest constraints for hyperscalers and neoclouds (which leads me to wonder whether Anthropic might also acquire a neocloud at some point).

  • Save money in the long run: Spinning up a chip program would be incredibly expensive, and it likely wouldn't cover the company’s entire demand; Anthropic would still buy chips from other vendors for the foreseeable future. Over a horizon of decades, though, Anthropic would control one of the most important drivers of its technology and could save money by no longer padding the profit margins of other companies (like Nvidia).

Let's keep in mind that this report indicates Anthropic is looking to design its own chips, not manufacture them. It would likely still rely on TSMC or Samsung to actually make the chips. Even so, designing in-house could drastically reduce the time it takes for Anthropic to get the chips it needs, because it wouldn't have to stand in line behind OpenAI, Microsoft, Meta, and basically everyone else building AI to get the brains it needs to run the operation. If you consider all the pieces of the extreme co-design process, you can quickly make a list of long-term acquisition targets for Anthropic, OpenAI, and other AI giants.

Jason Hiner, Editor-in-Chief

TOGETHER WITH TIGER DATA

ghost: free postgres for your agents

your agent needs a database. not tomorrow, not after a config file, not after a 12-step setup wizard. now.

ghost gives your agent unlimited, free postgres databases in seconds. one CLI command. no credit card. spin up, build, fork, tear down, repeat. databases at the rate of ideas, ephemeral and scalable.

stop wiring infrastructure. start shipping the thing you actually want to build.

CONSUMER

Snap is building what Meta and Google left out

The AI smart glasses market is booming, and Snap wants in.

Specs, Snap's eyewear subsidiary, announced Friday a partnership with Qualcomm to power all future Specs generations with Snapdragon system-on-a-chip technology, ahead of its first flagship consumer release later this year.

While the company has yet to release many details about Specs, it did disclose that the smart glasses will allow users to “see, hear, and interact with digital content just like it’s in your physical space.” This description suggests that the glasses will have AR capabilities, a feature that could help them stand out from competitors. 

The Meta Ray-Ban Display and the upcoming Google glasses have in-lens displays that can show users applications and notifications in color, but they lack the AR intelligence to place digital content in your environment and accurately map spatial elements.

Before the consumer version launched, Snap Spectacles were available only to developers in the Spectacles Developer Program. I had the opportunity to test these out and was incredibly impressed by the AR experiences, object recognition, environmental understanding, and accurate visual placement. My biggest qualm was their bulk, but the forthcoming consumer pair will likely have a much lighter form factor. 

Qualcomm’s Snapdragon platforms have powered multiple previous generations of the Snap Spectacles and will enable “intelligent, context‑aware experiences to run directly on-device,” according to the post. Qualcomm, of course, has abundant experience powering smart glasses, including the most popular on the market, the Meta Ray-Bans.

With an AR component, Snap's Specs could unlock a whole new market for smart glasses, pushing experiences beyond merely helpful notification overlays and navigation cues into something far more immersive. Imagine turn-by-turn directions perfectly overlaid on the road ahead, or a movie projected onto your wall, all from a pair of lightweight glasses rather than a bulky VR/AR headset. The key to making smart glasses worthwhile, though, remains the same: the everyday AI features have to be useful, and the glasses comfortable to wear.

LINKS

  • Claude for Word: A feature that allows Claude to assist from a sidebar is now in beta

  • Replit: In beta, Replit can now deploy directly to Databricks

  • Claude Code: New Ultraplan feature builds an implementation plan for you on the web

  • Gemini: Notebooks in Gemini infuse NotebookLM capabilities into the chatbot

  • Anthropic: Research Engineer/Research Scientist, Audio

  • Scale AI: Research Scientist, Agent Robustness

  • Nvidia: Senior Deep Learning Communication Architect 

  • Roblox: Senior Machine Learning Engineer, Safety Core Data

GAMES

Which image is real?


POLL RESULTS

Would you use ChatGPT or other AI tools to help you find and hire talent?

Yes (42%)
No (31%)
Maybe (24%)
Other (3%)

The Deep View is written by Nat Rubio-Licht, Sabrina Ortiz, Jason Hiner, Faris Kojok and The Deep View crew. Please reply with any feedback.

Thanks for reading today’s edition of The Deep View! We’ll see you in the next one.

“Reflection at bottom of the other image's window was strange and in this image, the extension sticks to open/close inner curtain maybe something AI would miss.”

“The bumps in the bedding made me believe [this image] was real, AI generally would have the bedding more perfect.”

“I thought the good exposure in the [other image] was too good to be true.”

“The view is too "Days of Our Lives" perfect in the other [image].”

“[The other image] has buildings stacked in ways that look unreal, especially the brownstone one.”

If you want to get in front of an audience of 750,000+ developers, business leaders and tech enthusiasts, get in touch with us here.