    AI Pretraining Scaling Laws With Compute Are Still Working – XAI and Google Will Pull Away

By The Tech Guy, November 20, 2025


Multi-billionaire fund manager Gavin Baker gives an overview of the state of AI.


Nextbigfuture identified the AI data center scaling of Google and xAI over a year ago.

Key takeaways:
• AI scaling laws are intact.
• There are four leaders in AI: Google, xAI, OpenAI, and Anthropic.
• But Google and xAI will pull away with faster and larger compute scaling.
• Nvidia is still dominant.
• The best models are still coming in mid-2026, trained on B300 chips.

    AI Scaling Laws are Intact

    Gemini 3 is the strongest evidence since o1 that pre-training scaling laws still hold.

    Expect large jumps from Blackwell-trained models in Q2 2026.

    GPT-5 was intentionally a smaller, cheaper-to-infer model (router-based), not a max-performance push → not evidence of scaling slowdown.
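As a concrete illustration of what "pre-training scaling laws still hold" means, here is a minimal sketch of the Chinchilla-style loss curve. The constants are the fitted values published by Hoffmann et al. (2022); treating them as still representative of current frontier models is an assumption for illustration only:

```python
# Illustrative only: a Chinchilla-style pretraining scaling law,
#   L(N, D) = E + A / N**alpha + B / D**beta
# with the coefficients fitted by Hoffmann et al. (2022). "Scaling laws
# are intact" means loss keeps falling along this curve as parameter
# count N and training tokens D grow with compute.
def chinchilla_loss(n_params: float, n_tokens: float) -> float:
    E, A, B = 1.69, 406.4, 410.7      # fitted constants from the paper
    alpha, beta = 0.34, 0.28
    return E + A / n_params**alpha + B / n_tokens**beta

# A 10x jump in both parameters and training tokens still lowers loss,
# though with diminishing returns toward the irreducible term E:
small = chinchilla_loss(70e9, 1.4e12)    # roughly Chinchilla scale
large = chinchilla_loss(700e9, 14e12)    # 10x more params and tokens
```

The curve never flattens to zero improvement; it only approaches the irreducible loss E, which is why larger Blackwell-era training runs are still expected to produce measurable jumps.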

Frontier Model Landscape (Emerging Oligopoly)

Google – Currently pulling ahead. Lowest-cost high-quality tokens, massive coherent TPU fabric advantage.

xAI – Close second on cost/quality. Grok 4.1 leverages huge coherent GPU clusters. Google and xAI clearly have the best models right now.

    Anthropic & OpenAI – Both have strong unreleased checkpoints, but currently behind Google/xAI in public model quality (OpenAI in 3rd place for the first time).

    Meta – Small chance to stay competitive because Chinese open-source is only ~9 months behind Meta’s Llama series.

    China open-source – Falling further behind. DeepSeek hasn’t released a new frontier model in a year, Huawei H20 issues persist. Blackwell gap + US rare-earth ramp will widen this dramatically.

    Nvidia Dominance

    Nvidia remains overwhelmingly dominant due to Blackwell (B300 variant now ramping smoothly after a rocky start with B200 delays, mask changes, canceled variants, and extreme datacenter complexity).

    Blackwell delivers superior coherent FLOPs and tokens-per-watt, the two metrics that matter most going forward.

    Power shortages make tokens/watt the key decision driver → favors general-purpose GPUs over custom ASICs; most non-Google ASIC programs likely dead.

    Hopper rental prices are still rising and even A100s remain highly profitable → GPU useful life likely over 6 years, financing costs dropping further.
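The useful-life claim follows from rental economics: a card stays deployed while rental income beats operating cost, so slower rental-price decay pushes the retirement point out. A toy model with invented prices, not real market data:

```python
# Toy model, invented numbers: a GPU stays in service while its hourly
# rental price still covers operating cost (power, hosting). Slower
# rental-price decay -- i.e. Hopper/A100 prices holding up -- pushes
# the break-even retirement point further out.
def useful_life_years(rental_per_hour: float,
                      opex_per_hour: float,
                      annual_price_decay: float) -> int:
    years, price = 0, rental_per_hour
    while price > opex_per_hour and years < 15:   # cap at 15 years
        years += 1
        price *= 1 - annual_price_decay
    return years

slow_decay = useful_life_years(1.50, 0.40, 0.10)  # prices holding up
fast_decay = useful_life_years(1.50, 0.40, 0.40)  # prices collapsing
# With slow decay the card clears the six-year mark easily.
```

Longer useful life also spreads the purchase price over more revenue-earning hours, which is why financing costs can keep dropping.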

    Blackwell will dramatically widen the gap vs Chinese domestic silicon (much bigger lag vs Blackwell than vs Hopper), further entrenching Nvidia’s moat.

    Reasoning Flywheel & Rising Barriers

    Reasoning models (o1-style, Gemini 3, etc.) create the classic internet flywheel: users → high-quality interaction data → better models → more users.

    Pre-reasoning era had only pre-training data scaling. Now post-training/reasoning data is a new, proprietary, compounding moat.

    All four U.S. leaders (Google, xAI, Anthropic, OpenAI) have much better internal checkpoints than public releases → very hard for anyone to catch up.

    Other Key Themes

Power shortages are bullish: they act as a natural governor against overbuild, extending cycle length and smoothness, and they push buying decisions toward tokens/watt (where Blackwell wins).

Optics (optical interconnect) is becoming critical for multi-campus training and for moving workloads to cheap, available power. Ironically, it also helps China offset some of its compute deficit (at the cost of much higher power draw).

    Hyperscaler ROIC still higher than pre-AI capex ramp. Early signs of real enterprise productivity gains appearing in S&P 500 earnings.

    Overall token demand (driven by customer ROI) is what matters for the sector. Individual leader share battles (OpenAI losing ground) won’t kill the secular growth story.

    Bottom line from Gavin Baker

    We are still very early. Next decade will be steady progress driven by compute scaling, reasoning flywheels, and power/tokens-per-watt optimization, with Nvidia, Google, and xAI as the biggest structural winners.

Brian Wang is a Futurist Thought Leader and a popular Science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked the #1 Science News Blog. It covers many disruptive technologies and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.

    Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.

    A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts.  He is open to public speaking and advising engagements.
