Sometimes someone walks around the corner and drops the proverbial anvil onto a house of cards. Pat Gelsinger, former CEO of Intel, has done just that with his claim that quantum computing will end the era of GPUs within a few years. While Nvidia is still mining silicon gold with its H100 and B200 cards, Gelsinger already sees the wrecking ball swinging in from the quantum world. For him, one thing is clear: the current AI era is short-lived hype, and the big bang could come sooner than many think.

The prophet from the sidelines
Gelsinger is now active at the venture firm Playground Global, which gives him a direct view into the heart of quantum research. And apparently he sees more there than cryo chambers and pretty PowerPoint slides. In an interview with the Financial Times, he speaks openly of a fundamental “holy trinity” in IT: classical computing, AI, and quantum. In his view, the latter is about to displace the rest. In plain language: GPUs, the very chips Nvidia is currently shipping at five-figure prices amid the AI fever, could become obsolete in less than ten years. A declaration of war that should make more than just Jensen Huang frown.
- Exclusive access: Gelsinger’s involvement with Playground Global brings him into direct contact with start-ups and laboratories working on real qubit implementations, not just pretty simulation projects.
- Hype radar on full blast: The former Intel boss sees the same symptoms in the current AI euphoria that marked earlier tech bubbles: astronomical valuations, exploding investment, and no clear path to monetization.
- Reference to history: He draws an analogy to the IBM and Bill Gates story of the 1980s, when Microsoft supplied the software and IBM the platform. Today he sees OpenAI in a similar role, dependent on Microsoft’s infrastructure. In his view, a shift in the balance of power threatens here too.
Nvidia is still laughing
Nvidia boss Jensen Huang recently took a far more relaxed view of the matter. He assumes quantum computing will not become truly relevant for at least another 20 years. Of course, he can afford to say that: the market is currently eating out of his hand, and Nvidia is not just a chip manufacturer but the de facto monopolist of an entire AI infrastructure. Gelsinger counters: two years, then a breakthrough is possible. Two. Not twenty.
It all sounds fancy, of course, but anyone who has spent more time with quantum computing than a LinkedIn guru knows we are still miles away from using quantum hardware meaningfully in production environments.
- Susceptibility to errors: Qubits are extremely unstable. Without a massive error-correction framework, nothing works reliably.
- Application bottlenecks: Not every AI algorithm can simply be “quantized”. For many tasks, classical computing architectures remain the more efficient choice.
- Missing ecosystem: Compilers, frameworks, developers, tooling: everything would have to be rethought and rebuilt.
In short: Gelsinger’s vision sounds sexy, but it assumes the quantum ecosystem will mature at a pace that would be historically unprecedented. And that for a technology that has barely progressed beyond feasibility studies.
Conclusion: visionary or storyteller?
It is remarkable that Gelsinger admits a few of his own failings in the same breath. During his tenure, Intel was, in his words, “technically rotten”: there was a lack of discipline, and no one delivered products on time. He himself had tried to save the 18A project in time, an attempt that ultimately failed. The new CEO, Lip-Bu Tan, pulled the plug before Gelsinger could deliver. A bitter anecdote from the heart of semiconductor hell.
Gelsinger wants to provoke, no question. He is casting doubt on a hype that is currently running too hot to be healthy. His theory of a quantum revolution is not far-fetched, but at best it is ambitious; at worst, it is a smokescreen thrown from the sidelines. He is right about one thing, though: technological progress does not adhere to PR roadmaps. Those who blindly bet on GPUs today could have a billion-dollar problem tomorrow if and when the quantum bang comes.
Until then: anyone talking that big had better have a few qubits up their sleeve.
Source: The Financial Times