AI News Today

    Google unveils two new TPUs designed for the “agentic era”

    TPU 8t chips on a board

The new chips allow for faster training, but Google also says you get more useful computation for every watt you pump into a TPU 8t. The company claims a “goodpute” rate of 97 percent, meaning less waiting and less wasted effort. With better handling of irregular memory access, automatic recovery from hardware faults, and real-time telemetry across all connected chips, TPU 8t spends more of its time actively advancing model training.
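Google hasn't published the exact formula behind its “goodpute” figure, but the common framing of the metric is the fraction of provisioned chip-hours that actually advance training, excluding time lost to faults, restarts, and stalls. A minimal sketch under that assumption:

```python
# Rough sketch: "goodpute" as the fraction of provisioned compute time
# spent doing useful training work. The precise metric Google uses for
# its 97 percent claim is not public; this is an illustrative model.

def goodpute(total_chip_hours: float, lost_chip_hours: float) -> float:
    """Fraction of chip-hours that actively advance training."""
    return (total_chip_hours - lost_chip_hours) / total_chip_hours

# A hypothetical 1,000-chip pod running for 24 hours provisions
# 24,000 chip-hours. At 97 percent goodpute, about 720 chip-hours
# would be lost to fault recovery and waiting:
total = 1000 * 24
print(goodpute(total, lost_chip_hours=720))  # → 0.97
```

At frontier-training scale, even a few percentage points of goodpute translate into thousands of chip-hours per day, which is why fault recovery and telemetry matter as much as raw FLOPS.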

    When training is done, AI models run in inference mode to generate tokens—that’s the process happening behind the scenes when you tell a model to do something. This doesn’t require as much horsepower, so using the same hardware for both parts of the AI lifecycle is inefficient. That’s why inference is the purview of TPU 8i, which is designed to be more efficient when running multiple specialized agents, with less waiting time. TPU 8i chips also run in larger pods of 1,152 chips versus just 256 for the last-gen Ironwood inference clusters. That works out to 11.6 EFlops per pod, much lower than TPU 8t pods.
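The article only gives pod-level figures, but a quick back-of-envelope division shows what they imply per chip. The precision behind the 11.6 EFlops number (FP8 versus BF16, for instance) isn't stated, so treat this strictly as a ratio:

```python
# Back-of-envelope: per-chip throughput implied by the pod-level figure
# above (11.6 EFlops across a 1,152-chip TPU 8i pod). The numeric
# precision behind the EFlops figure is not specified in the article.

POD_EFLOPS = 11.6           # exaflops per TPU 8i pod, per the article
CHIPS_PER_POD = 1152        # TPU 8i pod size
IRONWOOD_POD_CHIPS = 256    # last-gen inference pod size, per the article

# 1 EFlop/s = 1,000 PFlop/s
per_chip_pflops = POD_EFLOPS * 1000 / CHIPS_PER_POD
print(f"{per_chip_pflops:.2f} PFLOPS per chip")  # ≈ 10.07
print(f"{CHIPS_PER_POD / IRONWOOD_POD_CHIPS:.1f}x larger pods")  # 4.5x
```

So the pod-size jump, not a per-chip leap, is doing most of the work in the headline number, which fits the framing of TPU 8i as an efficiency-oriented inference part.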

    The TPU 8i has less raw power than TPU 8t.

    Credit:
    Google


Google has tripled the amount of on-chip SRAM for each TPU 8i to 384 MB. This allows the company’s new chips to keep a larger key-value cache on the chip, speeding up models with longer context windows. The eighth-gen AI accelerators are also the first from Google to rely solely on Google’s custom Axion ARM CPU host, featuring one CPU for every two TPUs. In Ironwood, each x86 CPU serviced four TPU chips. Google says this “full-stack” ARM-based approach allows for much greater efficiency.
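To see why 384 MB of SRAM matters for KV caching, note that a transformer's key-value cache grows linearly with context length. The model dimensions below are hypothetical, chosen only to make the arithmetic concrete; real deployments also shard the cache across chips:

```python
# Illustrative KV-cache sizing. Model dimensions are hypothetical;
# only the 384 MB SRAM figure comes from the article.

def kv_bytes_per_token(layers, kv_heads, head_dim, bytes_per_val=2):
    # 2x for keys AND values; bytes_per_val=2 assumes bf16 entries
    return 2 * layers * kv_heads * head_dim * bytes_per_val

SRAM = 384 * 2**20  # 384 MB of on-chip SRAM per TPU 8i

# Hypothetical model shard: 16 layers, 4 KV heads, head_dim 128
per_token = kv_bytes_per_token(layers=16, kv_heads=4, head_dim=128)
print(per_token)          # 32768 bytes, i.e. 32 KiB per cached token
print(SRAM // per_token)  # → 12288 tokens fit entirely in SRAM
```

Tripling SRAM triples the tokens that stay on-chip before the cache spills to slower HBM, which is exactly the long-context speedup the article describes.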

    An efficiency play

It makes sense that efficiency is a core part of Google’s new TPU setup. Training and running frontier AI models is expensive, and the return on investment remains unclear. Companies are still burning money on generative AI in the hope that the economics will turn the corner at some point. Google’s new TPUs may or may not help get there, but the company has made notable improvements.

© 2026 ainewstoday.co. All rights reserved.