• TurboQuant isn't the RAM crisis savior you're hoping for, analyst

    From TechnologyDaily@1337:1/100 to All on Sat Apr 11 11:15:27 2026
    TurboQuant isn't the RAM crisis savior you're hoping for, analysts say as memory prices continue to look bleak

    Date:
    Sat, 11 Apr 2026 10:00:00 +0000

    Description:
    Where are we now with the RAM crisis? It's still bleak, despite some positive glimmers of late, and I wouldn't rely on TurboQuant to save the day.

    FULL STORY ======================================================================

    Is TurboQuant a silicon bullet to solve the RAM crisis? No, it isn't, and if you were hoping that the compression algorithm that Google recently announced would be a major turning point for AI memory-related woes, I'm afraid you might have to think again.

    Sadly, for me, the RAM crisis remains a towering specter that'll likely continue to be a considerable blight on the PC landscape for a long time to come. Certainly, a good deal of online opinion has crystallized around the notion that Google hasn't got an ace up its sleeve with TurboQuant.

    Need a quick TurboQuant refresher? AI in the form of LLMs (Large Language Models) represents a huge RAM hoover, as we've seen, and that demand for memory has been a major driver of the current crisis. What Google's TurboQuant does is reduce the memory use of AI, and not just by a little, but by a huge amount in a specific area: key-value cache memory usage.

    That's reduced by a factor of six, in fact, and this cache is the LLM's short-term memory (storing the ongoing conversation and providing context for future replies), so it's an important breakthrough on the face of it. The compression used in the tech shouldn't noticeably degrade the quality of the output (answers to queries), either. In theory, then, with TurboQuant, an AI could keep performing at the same level while using just a sixth of the memory resources it did previously. That's why, when TurboQuant was unveiled on March 24, the stock price of memory makers tanked for a while, as investors saw it as a potential big hit to those companies' future profits.

    [Image: Turbo mode: efficiency or performance? (Image credit: Shutterstock)]

    Okay, that's all well and good, but as this report in the Korea Times makes clear, analysts feel differently to Google (and indeed those investors) about the impact that TurboQuant might have, and what it'll mean for AI and RAM supply more broadly.
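    To put that factor-of-six claim in concrete terms, here's a rough back-of-envelope sketch of how big an LLM's key-value cache can get. The article doesn't disclose TurboQuant's internals, and the model dimensions below (layers, heads, context length, FP16 storage) are purely illustrative assumptions, not figures from Google.

```python
# Back-of-envelope KV-cache sizing, illustrating why compressing this
# cache by 6x matters. All model dimensions are hypothetical examples;
# TurboQuant's actual mechanism isn't described in the article.

def kv_cache_bytes(layers, kv_heads, head_dim, seq_len, bytes_per_elem):
    """Size of the key-value cache for one sequence.

    Factor of 2 covers keys and values; one entry per layer,
    KV head, token position, and head dimension.
    """
    return 2 * layers * kv_heads * head_dim * seq_len * bytes_per_elem

# Hypothetical mid-size LLM: 32 layers, 8 KV heads, 128-dim heads,
# a 32k-token context, stored in FP16 (2 bytes per element).
baseline = kv_cache_bytes(32, 8, 128, 32_768, 2)
compressed = baseline / 6  # the factor-of-six reduction the article cites

print(f"FP16 KV cache:      {baseline / 2**30:.2f} GiB")   # 4.00 GiB
print(f"After 6x reduction: {compressed / 2**30:.2f} GiB")  # 0.67 GiB
```

    Per single conversation that's only a few gigabytes, but a data center serving many thousands of concurrent sessions multiplies this cost accordingly, which is why a 6x cut to this one component was taken seriously by investors.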

    An analyst for Samsung Securities, Lee Jong-wook, observed that: "There have been efforts to improve AI models to optimize chip usage, but more efficient models tend to lower overall costs and, in turn, drive greater demand for AI computing. Rather than reducing semiconductor demand, such optimized models are being used to deliver higher-performance AI services with the same chip resources."

    In other words, we won't see less RAM being used in AI data centers thanks to TurboQuant; memory will continue to be gobbled up at the same rate, with the LLMs getting better performance instead. That improved performance will be preferred over better model efficiency (and any potential cost savings therein).

    From there, better AI will draw more people to use it and make these LLMs more accessible, creating more demand to satisfy, and requiring yet more data center resources (including memory).

    Lee observes: "As long as AI companies compete on performance rather than cost, optimization will not weigh on semiconductor demand." And in the current climate, where the AI bubble is still expanding and competition between the giant LLMs out there is fierce, the prevailing opinion is that tech like TurboQuant is not going to ease the pace of RAM consumption by the big AI players.

    Another analyst, Kim Rok-ho of Hana Securities, adds some further thoughts: "Compression technologies are not new, and it remains uncertain whether they will be widely adopted across the [AI] industry. Even if such technologies become more widely used over the mid to long term, it will lower memory cost barriers, expanding overall AI use. There are limited chances of decline in demand for DRAM and storage."

    It's a good point that TurboQuant isn't alone, and that similar tricks have been tried with AI over the past couple of years. Yes, Google claims it has something very different here, in that the tech doesn't lessen the quality of the AI's output, but that remains to be seen in action.

    There's plenty of skepticism on Reddit about how Google has spun or hyped TurboQuant, and about the reality of the claims made for the tech. And of course, it's still just research at this point; whether it'll ever be deployed at scale, only time will tell.

    And again, Kim comes back to the theory that even if TurboQuant does become widely adopted, it's going to drive more AI usage, as opposed to driving down the amount of RAM used by LLMs.

    [Image: Memory misery loves company (Image credit: Shutterstock / Dean Drobot)]

    Even if you accept the arguments that TurboQuant is not the panacea for AI-driven RAM shortages (and I find them compelling and persuasive myself), you might still point hopefully to other recent developments that suggest the worst of the memory crisis might just be over. Unfortunately, I feel that these hints are red herrings, too.

    I'm mainly thinking of another analyst firm, this time TrendForce, which recently published a report on DDR5 RAM price drops in the US, Europe, and Asia. While prices do appear to be easing currently, this is more about price tags reaching such a ridiculous level that consumers simply fold their arms and flat-out refuse to buy than it is about any improvement in supply or bolstered stock.

    Granted, it's good to see some downward movement in retail prices (I'm not against that, obviously), but nothing's changing in the contract prices of RAM for the big memory manufacturers, which suggests the overall picture remains much the same. As TrendForce acknowledges, through the lens of that broader view, this positivity is a "consumer-driven, short-term adjustment" rather than the start of a full-on turnaround.

    Another optimistic nugget was laptop maker Framework managing to keep this month's memory-related cost increases to a minimum. However, the company observed that "all indications are that this is a temporary reprieve and that we'll continue to see volatility and cost increases through the rest of 2026".

    I don't doubt that, frankly, when we also see the prices of GPUs (which were already expensive) climbing again due to the rising cost of video RAM. Or nasty hikes for gaming laptops due to memory and storage price increases, or Mac computers with painfully long delivery lead times, reportedly thanks to the memory crisis. You get the idea.

    This just isn't going away, and predictions that we won't see any meaningful improvement in the badly off-kilter balance of RAM supply and demand until 2028 feel just as entrenched as they were a month or two ago, with Google's TurboQuant seemingly unlikely to ride in and save the day.




    ======================================================================
    Link to news story: https://www.techradar.com/computing/memory/turboquant-isnt-the-ram-crisis-savior-youre-hoping-for-analysts-say-as-memory-prices-continue-to-look-bleak


    --- Mystic BBS v1.12 A49 (Linux/64)
    * Origin: tqwNet Technology News (1337:1/100)