Thursday, December 18, 2025


Google pushes TPU adoption with PyTorch support to challenge Nvidia’s AI stronghold

A new software-focused push is shaping Google’s strategy to expand its presence in the fast-growing AI computing market.

Alphabet-owned Google is working on an internal initiative to make its artificial intelligence chips work better with PyTorch, the most widely used AI software framework. The move is aimed at reducing Nvidia’s long-standing advantage in AI computing, according to people familiar with the matter.

The effort is part of Google’s plan to position its Tensor Processing Units as a practical alternative to Nvidia’s GPUs. TPU sales have become an important driver of Google Cloud revenue as the company looks to show returns on its AI investments.

The initiative, known internally as “TorchTPU”, is designed to remove a major barrier to TPU adoption. Many companies already rely on PyTorch for AI development, and limited compatibility has slowed interest in Google’s chips. The project aims to make TPUs fully compatible with PyTorch and easier to adopt for developers already working in the framework. Google is also considering open-sourcing parts of the software to encourage wider use.

Compared with earlier attempts, Google has assigned TorchTPU greater focus, resources and strategic importance. Demand is growing from companies that want to adopt TPUs but see the software stack as a bottleneck.

PyTorch, first released in 2016 and strongly supported by Meta Platforms, is deeply embedded in the AI developer ecosystem. Nvidia has spent years optimizing PyTorch performance through its CUDA software, which many analysts see as its strongest competitive shield.

Google’s own AI systems have largely been built on JAX and optimized using the XLA compiler, creating a gap between how Google uses TPUs internally and how customers prefer to work. This mismatch has made it costly and time-consuming for developers to switch from Nvidia chips.

A Google Cloud spokesperson did not comment on TorchTPU specifics but said the move would give customers more choice. “We are seeing massive, accelerating demand for both our TPU and GPU infrastructure,” the spokesperson said. “Our focus is providing the flexibility and scale developers need, regardless of the hardware they choose to build on.”

To speed development, Google is working closely with Meta, the creator of PyTorch. The companies have discussed expanded TPU access for Meta, according to people familiar with the talks. Meta has an interest in lowering inference costs and reducing dependence on Nvidia GPUs. Meta declined to comment.

Google began selling TPUs directly into customer data centres this year. Amin Vahdat was appointed head of AI infrastructure this month and reports to CEO Sundar Pichai. Google uses this infrastructure for its own products, including Gemini and AI powered search, and for cloud customers such as Anthropic.


