Tag: Quantum computing

  • Creotech Quantum will make its debut on the WSE. The first such company in Europe

    Creotech Quantum’s debut on the main floor of the Warsaw Stock Exchange is a moment of significance beyond the local capital market. It signals that the European deep tech sector is no longer the domain of laboratories and venture capital funds alone, and is beginning to seek validation on the public market. As the first public company in Europe to focus on quantum technologies, Creotech is throwing down the gauntlet to US players who have been trading at valuations running into billions of dollars on the New York stock exchanges for years.

    CEO Anna Kaminska’s strategy is based on a pragmatic transition from the research phase to hard commercialisation. Central to this plan is the quantum key distribution (QKD) system, whose market debut the company has announced for later this year. In an era of growing concerns about cyber security and the potential ability of quantum computers to break classical ciphers, QKD offers a solution based on the laws of physics, not just the complexity of algorithms. For sectors such as finance, logistics and defence, it is no longer just a futuristic vision, but a viable tool for data protection.
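
    To make the physics angle concrete, the sketch below simulates the sifting stage of BB84, the textbook QKD protocol, in plain Python. It is an illustrative toy (no eavesdropper, no error reconciliation) and not a description of Creotech’s product.

    ```python
    import secrets

    def bb84_sift(n_bits: int = 256):
        """Toy BB84 run with no eavesdropper: Alice sends random bits in random
        bases, Bob measures in random bases, and both keep only the positions
        where their bases happened to match (the 'sifted key')."""
        alice_bits  = [secrets.randbelow(2) for _ in range(n_bits)]
        alice_bases = [secrets.randbelow(2) for _ in range(n_bits)]  # 0 = rectilinear, 1 = diagonal
        bob_bases   = [secrets.randbelow(2) for _ in range(n_bits)]

        sifted = []
        for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
            if a_basis == b_basis:      # matching basis -> Bob reads the bit correctly
                sifted.append(bit)
            # mismatched basis -> the result is random noise and is discarded during sifting
        return sifted

    key = bb84_sift()
    print(f"sifted key length: {len(key)} of 256 transmitted qubits")
    ```

    Roughly half the transmitted qubits survive sifting; any eavesdropper measuring in the wrong basis would introduce detectable errors, which is the physical guarantee the article refers to.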

    The company’s strength lies in its skilful diversification and synergies with its parent Creotech Instruments. Creotech Quantum is not limited to theory; it provides the infrastructure necessary to build a quantum ecosystem. A portfolio including precision White Rabbit timing systems and high-speed CMOS cameras for monitoring quantum processors positions the company as a key component supplier. What’s more, ambitions extend to orbit – collaboration on space-based QKD systems could give the company a unique competitive advantage on a global scale.

    Investors must remember, however, that the quantum market is a long-distance and extremely capital-intensive game. While the debut is an image success, the real test will be to prove the announced commercial deployments. If Creotech Quantum successfully commercialises its systems in the coming months, it could become a role model for other European technology players who have so far been cautious about the stock market. The stakes are high: what is at stake is not just returns, but whether Europe manages to build its own pillars in the most crucial technological area of the 21st century.

  • NVIDIA introduces Ising – AI as an operating system for quantum processors

    In the race for quantum supremacy, NVIDIA is making a move that could change the balance of power not only in the labs, but also in the data centres. The just-unveiled NVIDIA Ising family of models is the world’s first open attempt to harness artificial intelligence to solve the ‘Achilles’ heel’ of quantum computers: their extreme instability.

    Today’s quantum processors (QPUs) are technologically impressive but commercially unusable. They generate an error on average once per thousand operations. For the technology to realistically compete with traditional silicon in pharma or logistics, this rate needs to drop to one error per billion. Jensen Huang, NVIDIA’s chief executive, makes it clear: AI is not just an add-on here, but an essential ‘operating system’ to manage this fragile architecture.
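
    A quick back-of-the-envelope calculation shows why that gap matters. The sketch below uses the two error rates quoted above and an assumed circuit depth of 10,000 operations to estimate the probability of a computation finishing without a single error.

    ```python
    # Probability that a circuit of `depth` operations completes with no error at all,
    # for the error rates quoted above; the depth of 10,000 operations is an assumption.
    def success_probability(error_rate: float, depth: int) -> float:
        return (1.0 - error_rate) ** depth

    for rate in (1e-3, 1e-9):
        p = success_probability(rate, depth=10_000)
        print(f"error rate {rate:.0e}: P(no error in 10,000 ops) = {p:.6f}")
    # roughly 0.000045 at today's 1e-3 rate versus about 0.99999 at the 1e-9 target
    ```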

    Architecture instead of promises

    Instead of building its own quantum computer, NVIDIA is positioning itself as a critical layer provider. The Ising family consists of two specialised tools that hit the industry’s narrowest bottlenecks. The Ising Calibration model uses computer vision technology to automate processor settings. What previously took physicists days of painstaking work, AI can cut down to a few hours.

    Ising Decoding, on the other hand, is a 3D neural network designed for real-time error correction. The results are promising. Compared to the current market standard, PyMatching, NVIDIA’s solution shows three times the accuracy and 2.5 times the speed. In a world where milliseconds of delay decide whether a quantum state survives, such an advantage is fundamental.
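
    For readers curious about the baseline being beaten, the sketch below runs PyMatching, the open-source matching decoder named above as the comparison point, on a toy three-qubit repetition code. It illustrates syndrome decoding in general, not NVIDIA’s Ising model.

    ```python
    # Minimal syndrome-decoding example with PyMatching (pip install pymatching).
    import numpy as np
    import pymatching

    H = np.array([[1, 1, 0],          # parity checks of a 3-qubit repetition code
                  [0, 1, 1]])
    matching = pymatching.Matching(H)

    error = np.array([1, 0, 0])       # bit-flip on the first data qubit
    syndrome = H @ error % 2          # what the hardware would actually measure
    correction = matching.decode(syndrome)
    print(correction)                 # -> [1 0 0]: the decoder pinpoints the flipped qubit
    ```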

    Open door strategy

    The decision to make models available in an open source format is a smart business move. By integrating Ising with the existing CUDA-Q platform and NVQLink hardware link, the green giant is creating an ecosystem that will be difficult to disconnect from. Companies and universities can train these models on their own data while retaining full control of the infrastructure, which is crucial for sectors such as cyber security or finance.

  • SAP’s quantum revolution. How is the ERP giant gearing up for Q-Day?

    German giant SAP is making a move rarely seen in players of this scale: it is beginning to build professional structures around a technology that is still largely trapped in laboratories. The appointment of Carsten Polenz as Chief Quantum Officer and the formation of a team of experts numbering in the double digits is a signal that Walldorf does not intend to just watch the quantum revolution from the sidelines.

    SAP’s strategy is strikingly pragmatic. Instead of building its own hardware, the company is focusing on the application layer and optimisation problems that have been a bottleneck in supply chains for decades. Philipp Herzig, SAP’s CTO, openly admits that quantum programming requires ‘unlearning’ classical computer science. This confession from a technology leader highlights the depth of the barrier to entry – in the world of quantum, algorithms do not operate on zero-one logic, but on probabilities, allowing millions of scenarios for loading trucks or scheduling production to be analysed simultaneously.
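
    The truck-loading example can be made concrete with a few lines of classical code. The sketch below enumerates every load/leave decision for a toy set of pallets; a quantum optimiser (an annealer or a QAOA-style algorithm) would encode the same binary decision variables and objective, only exploring the scenario space differently. The figures are invented for illustration.

    ```python
    # Toy combinatorial loading problem: choose which pallets go on a truck so that
    # value is maximised without exceeding the weight limit.
    from itertools import product

    pallets = [("A", 400, 3000), ("B", 300, 2400), ("C", 500, 3500), ("D", 200, 1800)]  # (name, kg, value)
    capacity_kg = 900

    best_value, best_choice = 0, ()
    for choice in product((0, 1), repeat=len(pallets)):        # every load/leave scenario
        weight = sum(x * p[1] for x, p in zip(choice, pallets))
        value  = sum(x * p[2] for x, p in zip(choice, pallets))
        if weight <= capacity_kg and value > best_value:
            best_value, best_choice = value, choice

    loaded = [p[0] for x, p in zip(best_choice, pallets) if x]
    print(loaded, best_value)   # -> ['A', 'B', 'D'] with value 7200
    ```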

    For the average CFO or logistician, SAP’s ‘quantum revolution’ is supposed to be paradoxically dull and invisible. ERP systems are supposed to use quantum computers like specialised accelerators in the cloud – sending a complex mathematical task there and receiving the finished, optimised result in a fraction of a second. This ‘Quantum-as-a-Service’ approach is intended to protect customers from the complexity of the infrastructure itself, while giving them the tools to drastically reduce costs and CO2 emissions by eliminating empty runs in logistics.

    But behind the promise of performance lies an existential threat, referred to in the industry as ‘Q-Day’. The moment when quantum computers become powerful enough to break current encryption standards keeps security architects awake at night. SAP, which manages the data of thousands of global corporations, must play on two fields simultaneously: building the algorithms of the future and implementing post-quantum cryptography to secure the foundations of today’s business.

  • Quantum AI – the synergy of new technologies will transform the cyber security market

    Previous technological paradigms are no longer sufficient to describe reality. The dynamics of change, driven by the symbiosis of artificial intelligence and quantum computing, are becoming the foundation of new business strategy, and the line between what is technically unfeasible and what is becoming standard is blurring. This means abandoning the reactive model of infrastructure management in favour of a deep, visionary reconstruction of security systems.

    The foundation of today’s digital trust is asymmetric cryptography, based on standards such as RSA and ECC. For decades, these have provided an inviolable wall to protect corporate secrets, patient data and the critical infrastructure of states. However, this era is inevitably coming to an end. Analysts point to 2032 as the moment when legacy encryption methods will no longer be considered secure. This is because the computational potential of quantum computers will allow current security to be broken in times measured in seconds rather than centuries.

    The biggest, and most insidious, threat is the phenomenon known as ‘Harvest Now, Decrypt Later’. This strategy, which involves the mass accumulation of encrypted data streams with the intention of decrypting them once quantum processors become available, casts a shadow over today’s sense of security. Information that today seems like an unreadable string of characters could become an open book in a few years’ time. For business, this means that trade secrets, construction plans for strategic facilities or sensitive data stolen today could be laid bare on what experts refer to as ‘Q-Day’. So preparing for the post-quantum era is already a top-level risk management priority.

    A common misconception in the public debate is that post-quantum cryptography is just another software update, similar to patching holes in an operating system. The reality, however, requires much deeper reflection. The key to survival is becoming what is known as cryptographic agility. This is the ability of an IT architecture to instantly replace encryption algorithms without the need for costly and time-consuming halting of operational processes. Companies that build their systems in a rigid manner risk being paralysed in the face of new, as yet unknown security vulnerabilities that will surely be discovered as quantum physics develops.
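
    A minimal sketch of what cryptographic agility can look like in code, assuming a hypothetical in-house registry: the application requests an algorithm by role rather than hard-coding its name, so a broken scheme can be retired with a configuration change instead of a rewrite.

    ```python
    # Illustrative crypto-agility pattern; the registry layout and names are assumptions.
    from dataclasses import dataclass
    from typing import Callable
    import hashlib

    @dataclass
    class HashSuite:
        name: str
        digest: Callable[[bytes], bytes]

    REGISTRY = {
        "legacy":  HashSuite("sha256",   lambda d: hashlib.sha256(d).digest()),
        "current": HashSuite("sha3_512", lambda d: hashlib.sha3_512(d).digest()),
    }

    ACTIVE_SUITE = "current"          # one configuration switch, no code changes elsewhere

    def fingerprint(document: bytes) -> bytes:
        return REGISTRY[ACTIVE_SUITE].digest(document)

    print(fingerprint(b"contract v1").hex()[:16])
    ```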

    Algorithmic evolution is taking place in parallel to the hardware revolution. It is often mistakenly assumed that the worlds of artificial intelligence and quantum computers are separate ecosystems. In fact, quantum computers offer an exponential acceleration of AI model training, creating a new class of threat. Intelligent systems, backed by quantum computing power, will be able to identify infrastructure vulnerabilities with a precision and speed that goes beyond today’s detection capabilities. This will radically transform cyber threats, forcing system architects to design security barriers that can evolve on their own.

    In this arms race, however, there is a classic conflict between security and user comfort. In the business world, every second of delay and every extra click generated by security mechanisms is seen as friction that reduces efficiency, and this is where the ‘Security by Design’ concept comes into play. The real innovation lies in integrating highly sophisticated, automated protection mechanisms in such a way that they operate in the background, remaining invisible to the end user. Security cannot be a brake on progress; it must become an integral, hands-off part of it. Only such a symbiosis allows maximum protection to be maintained while keeping business processes running smoothly.

    Moving to hybrid solutions, combining proven classical methods with new post-quantum algorithms, seems to be the most sensible path at present for organisations concerned about their strategic resilience. It allows them to comply with current regulations while building resilience to future quantum attacks. However, this challenge requires overcoming a kind of cognitive paralysis. It is often easier to ignore threats far in the future than to make the difficult decision to rebuild the foundations of IT. The history of technology teaches, however, that those who are able to spot impending changes before they become a pressing crisis win.

  • Data gives you an edge, but requires control. 8 predictions for the enterprise market

    Just a decade ago, the definition of a ‘secure business’ was simple: a robust firewall, up-to-date anti-virus and regular backup. Today, in the age of hybrid environments and ubiquitous artificial intelligence, this approach sounds like an archaism. Data has given businesses superpowers in the form of a competitive advantage, but it has also brought unprecedented operational complexity to IT departments. Looking at technology predictions for 2026, it is clear that we are entering an era where ‘digital sovereignty’ is becoming the new currency and speed is the only acceptable security parameter.

    Technology has ceased to be magic and has become critical logistics. If we look at what lies ahead over the next two years, the conclusions are clear: traditional cyber security is not enough. The arms race has moved to the infrastructure level, and it will be won by those who understand that the geographical boundaries of data matter, and that response times count more than the height of defence walls.

    Speed is the new benchmark

    For years, we have lived in a paradigm of perimeter protection – building a fortress where no unauthorised person has access. Predictions for 2026 brutally expose the limits of this approach. Cyber threats have evolved. These are no longer isolated ransomware incidents involving ‘just’ disk encryption. We are dealing with complex operations in which data is not only locked, but above all quietly exfiltrated and then sold on the black market or used for blackmail.

    In such a reality, a company’s resilience is not measured by whether an attack can be avoided, but by how quickly an organisation is able to recover from an incident. Traditional data recovery from tapes or archive repositories becomes an unacceptable bottleneck.

    Speed is becoming the new standard. Anomaly detection must happen in real time and isolation of infected resources must happen automatically. Furthermore, the concept of ‘clean data recovery’ is becoming crucial. In the future, intelligent infrastructures will have to guarantee that the target state to which we return after a disaster is absolutely free of malicious code. This requires integrating security systems directly into the storage layer, rather than treating them as an external overlay.
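
    As a rough illustration of detection in the storage layer, the sketch below flags a backup snapshot whose daily change rate deviates sharply from its historical baseline, one simple heuristic for spotting mass encryption. The figures and the three-sigma threshold are assumptions, not a vendor’s method.

    ```python
    # Toy change-rate anomaly check on backup snapshots.
    from statistics import mean, stdev

    daily_changed_gb = [12, 15, 11, 14, 13, 12, 190]   # last value: suspicious spike

    baseline, spread = mean(daily_changed_gb[:-1]), stdev(daily_changed_gb[:-1])
    latest = daily_changed_gb[-1]

    if latest > baseline + 3 * spread:
        print(f"ALERT: {latest} GB changed vs baseline {baseline:.0f} GB - isolate and verify snapshot")
    else:
        print("change rate within normal range")
    ```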

    Geopolitics enters the server room

    Not so long ago, the cloud strategy of many companies was based on simple economic calculus and flexibility, often ignoring the physical location of bits and bytes. Those days are irrevocably passing. Governments around the world, concerned for national security and the privacy of citizens, are tightening regulations on where data can be stored and processed.

    Therefore, one of the key trends by 2026 will be data sovereignty. Companies and technology partners must respond by building environments that provide privacy without inhibiting innovation. Sovereign clouds and local hybrid environments are the market response. This is not about a complete retreat from global hyperscalers, but about managing risk wisely.

    Herein lies a huge opportunity for modern data platforms. They are designed to take the burden of bureaucracy off the shoulders of IT departments. Such platforms are expected to automate encryption, access policy management and regulatory compliance. This allows engineers to focus on creating business value, rather than wasting time manually aligning systems with regulatory requirements. Sovereignty ceases to be an obstacle and becomes part of the architecture.

    The race against time and quantum

    Looking to the future, it is impossible to ignore threats that seem distant today but could become standard in 2026. We are talking about post-quantum cryptography (PQC). Although quantum computers capable of breaking current security measures are still some way off, data that is stolen today could be decrypted in a few years (the so-called ‘harvest now, decrypt later’ attack).

    Therefore, the smart infrastructure of the future must integrate PQC standards now. Security cannot be a service tacked on at the end of the implementation process. It must be built into the DNA of data storage systems – from behavioural anomaly detection at the record level to advanced encryption. Only this approach will give companies peace of mind in the face of evolving threat models.

    Trust as a currency

    All of the above – speed, sovereignty, security – converge on one point: artificial intelligence. The year 2026 is when AI will cease to be just a content generator and will start to operate as Agentic AI – autonomous systems that make decisions.

    However, for AI to be effective and secure, it must be trustworthy. Most AI initiatives fail not because of poor language models, but because of poor quality databases and lack of control over them. If a company is unsure who has accessed the training data, whether it has been manipulated and whether it complies with regulations, implementing AI becomes Russian roulette.

    Therefore, comprehensive data management (Data Governance) comes to the fore. Access control, data lifecycle tracking (data lineage) and integrity are foundations without which even the most advanced algorithm will be useless.

    The end of silos

    The path to 2026 runs through the understanding that artificial intelligence, cloud, cyber resilience and modern infrastructure are no longer separate areas. They are deeply interdependent.

    Cloud strategies are shifting towards workload-optimised platforms. Instead of managing separate consoles, companies will rely on unified platforms that decide where a given task will perform best – whether in the public cloud, a sovereign cloud or a local data centre.

    In the coming years, those who bet on an intelligent data infrastructure will win. One that ensures speed of recovery from attack, guarantees sovereignty in the face of regulation and provides the fuel for trustworthy artificial intelligence. It is time to stop treating infrastructure as a cost and start seeing it as the foundation of modern business.

  • Pat Gelsinger announces the twilight of the GPU era in favour of a quantum revolution

    Although he no longer occupies the chief executive’s chair at Intel, Pat Gelsinger has no intention of remaining silent on the future of the world’s technological architecture. In a recent interview with the Financial Times, the former head of the Santa Clara giant presented a thesis that stands in stark opposition to the current market consensus. According to Gelsinger, the current ‘hype’ around artificial intelligence, although it will last a few more years, is merely a prelude to a much more profound change: the arrival of the era of quantum computing.

    While Nvidia CEO Jensen Huang is building an empire based on GPU dominance and predicts that quantum technology will not impact the market for the next two decades, Gelsinger drastically shortens this horizon. In his view, quantum computers will enter the mainstream in just two years. This is a bold prediction, suggesting that the current hegemony of GPU accelerators could end before the end of this decade. Gelsinger visualises the digital future as a ‘holy trinity’ of coexisting technologies: classical processors, AI accelerators and quantum units. In this scenario, the role of GPUs, currently crucial for training large language models, would be significantly marginalised in favour of qubit-based chips.

    Gelsinger’s perspective seems to be evolving as he moves away from Intel’s corporate structures towards the startup ecosystem. Working more closely with smaller, more agile quantum players, he sees a pace of innovation that may be invisible from the perspective of the incumbent silicon giants. He is also critical in his assessment of the past. He acknowledges that Intel under his leadership struggled with a “loss of fundamental discipline”, resulting in delays in key processes such as 18A lithography. This experience of struggling with the limitations of classic silicon may have reinforced his belief that Moore’s Law in the traditional sense is running out faster than the industry is willing to admit.

  • Palo Alto Networks warns: IT infrastructure replacement by 2029 is inevitable

    While the market is still debating the regulation of artificial intelligence, Palo Alto Networks CEO Nikesh Arora is setting a much tougher cut-off date for the cyber security industry. According to him, by 2029, hostile state actors will have operational quantum computers, forcing corporations to replace their infrastructure in an unprecedented way.

    Arora’s remarks during the recent quarterly results call were more than a standard forecast – they were a clear signal to shareholders and IT directors that ‘quantum security’ is becoming a critical new commercial pillar. Lee Klarich, the company’s CTO, is already seeing increasing pressure from customers who are beginning to treat the quantum threat as a ‘here and now’ problem rather than a distant abstraction. The replacement of cryptographic systems needed to defend against the computational potential of quantum could become a catalyst for growth for Palo Alto Networks comparable to the wave of cloud transformation.

    In parallel to the futuristic visions, the company is aggressively managing the present, with the web browser becoming the dominant attack vector. Arora estimates that as much as 90 per cent of office work is now done in a browser window, and the growing popularity of autonomous AI agents only compounds the risk. The company’s internal testing, which showed 167 infected instances on 5,000 devices at one of its customers, served as a proof of concept for the new strategy. Palo Alto Networks is openly targeting a base of 100 million installations of its own business browser, which would allow the company to take control of the ‘last mile’ of data security traditionally overlooked by classic firewalls.

    Completing this offensive is the strategic acquisition of the Chronosphere platform. The nearly $3.5 billion deal is a clear bet on the observability market in the age of AI. Arora argues that classic observability tools are too expensive and slow for artificial intelligence systems operating on petabytes of data. Chronosphere is expected to reduce operating costs by up to two-thirds, while offering Palo Alto engineers the technology needed to handle next-generation workloads. It is a move that positions the company not only as a ‘shield’ provider, but also as a fundamental partner in building a high-performance infrastructure for the AI era.

  • The end of ‘garage’ deployments. OCP standardises infrastructure for quantum computers

    The Open Compute Project (OCP) is opening a new chapter in data centre design, attempting to reconcile two technological elements: classical large-scale computing (HPC) and highly sensitive quantum mechanics. The organisation has begun work on formulating precise guidelines to enable these systems to coexist within a single server room. Although the vision of hybrid computing promises a leap in performance, the engineering reality presents facility operators with challenges that standard procedures do not anticipate.

    The integration of quantum systems is primarily a struggle with mass and thermodynamics. Although quantum processors themselves may impress with their energy efficiency, their associated infrastructure is demanding. A key element here is the cryostat – a device weighing up to 750 kilograms – which forces designers to ensure that the floor load capacity is at least 1,000 kg/m².

    Managing the temperature of the cooling fluid is proving to be even more challenging. While modern HPC cabinets can run on water temperatures as high as 45°C, quantum systems require a fluid supply in the 15-25°C range. This necessitates maintaining two separate cooling loops or using advanced heat exchangers. Added to this is the rigorous control of humidity, which must oscillate between 25 and 60 per cent to avoid condensation on refrigeration components, which would be disastrous in a precision electronics environment.

    However, it is environmental factors, often ignored in classical IT, that can determine the success of a deployment. Quantum hardware exhibits extreme sensitivity to electromagnetic interference. Even such mundane items as fluorescent lighting must be at least two metres away from the computing unit. Magnetic fields must be strictly limited, and the location of the data centre itself requires a new urban planning analysis. The presence of a tramline, railway traction or mobile phone masts within 100 metres can generate noise that prevents stable operation of the qubits.
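
    The constraints above lend themselves to a simple pre-installation checklist. The sketch below validates a candidate site against the thresholds quoted in the article; the parameter names and the example site data are illustrative assumptions, not the OCP specification itself.

    ```python
    # Illustrative site-readiness check using the figures quoted in this article.
    site = {
        "floor_load_kg_m2": 1200,
        "coolant_supply_c": 18,
        "humidity_pct": 45,
        "nearest_fluorescent_m": 2.5,
        "nearest_emi_source_m": 140,   # tram line, railway traction, phone mast ...
    }

    checks = [
        ("floor load >= 1000 kg/m2",          site["floor_load_kg_m2"] >= 1000),
        ("coolant supply within 15-25 C",     15 <= site["coolant_supply_c"] <= 25),
        ("humidity within 25-60 %",           25 <= site["humidity_pct"] <= 60),
        ("fluorescent lighting >= 2 m away",  site["nearest_fluorescent_m"] >= 2),
        ("EMI sources >= 100 m away",         site["nearest_emi_source_m"] >= 100),
    ]

    for label, ok in checks:
        print(("PASS " if ok else "FAIL ") + label)
    ```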

    OCP rightly points out that installing a quantum computer is no longer a standard ‘plug-and-play’ operation. It is an engineering process that takes a minimum of four weeks and requires the involvement of specialist electricians and refrigeration technicians, not just IT staff. The OCP initiative to create checklists and best practices is therefore not so much a convenience as a necessity if hybrid HPC environments are to move out of the experimental phase and become a market standard.

  • Post-quantum cryptography in banking and medicine. Challenges, regulation and implementation

    Most discussions about quantum computers still oscillate around futurology. We talk about machines that will one day, in the near future, change the face of science and medicine. Meanwhile, for security directors in banks, hospitals or government institutions, the quantum age is not a distant vision, but a pressing problem that started yesterday. In the shadow of media reports about the next quantum processors, a quiet data security drama is playing out, known in the industry as the ‘Harvest Now, Decrypt Later’ strategy. It is a simple but brutal premise whereby cyber criminals and hostile state actors are already stealing and archiving encrypted data en masse. For them, it is currently a useless string of characters, but their long-term goal is to store it until quantum computers achieve enough computing power to crack today’s algorithms in seconds.

    For industries handling sensitive data with a long lifecycle, this is a nightmare scenario. The financial sector, which relies on trust and bank secrecy, and the healthcare system, which protects patient data often for decades, are on the front line. If we assume that a stable quantum computer will be developed in ten years’ time and that medical or financial data must remain confidential for fifteen or twenty years, the maths is inexorable. The safeguards in place today are already insufficient, as the period of necessary data protection is beyond the time horizon of safe use of current cryptography. Research published by the French agency ANSSI shows that half of the organisations surveyed are already at risk of future quantum attacks, especially in the context of such common tools as VPNs or long-term certificates.
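
    The underlying arithmetic is often summarised as Mosca’s inequality, and it can be written down in a few lines; the three time spans below are assumptions taken from the scenario described above.

    ```python
    # Mosca-style risk check: if (years the data must stay secret) + (years a migration
    # takes) exceeds the years until a cryptographically relevant quantum computer
    # appears, data encrypted today is already exposed.
    shelf_life_years  = 20   # e.g. medical or financial records
    migration_years   = 5    # assumed duration of a full crypto migration
    quantum_eta_years = 10   # assumed arrival of a stable quantum computer

    if shelf_life_years + migration_years > quantum_eta_years:
        print("At risk: data encrypted today can be harvested now and decrypted later.")
    else:
        print("Within the safe window.")
    ```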

    The answer to this invisible threat, however, is not to build our own quantum computers to defend ourselves, but mathematics, specifically post-quantum cryptography (PQC). The term covers a new set of encryption algorithms specifically designed to resist the computing power of future machines. Crucially, these technologies are compatible with our current hardware. You can, and should, deploy them on today’s servers, cloud and network infrastructure without waiting for a hardware revolution. This deployment is based on two foundations that are increasingly emerging in security strategies: hybridisation and crypto agility.

    The hybrid approach is a kind of security bridge. It involves the simultaneous use of conventional, time-tested algorithms and new post-quantum solutions. It works like a double lock on a door – even if one is forced, the other still protects the assets. This strategy allows companies to test new technology and build resilience without risking abandoning current standards overnight. Crypto-agility, in turn, is the system’s ability to quickly replace an encryption algorithm when a vulnerability is discovered in it. In a dynamically changing world of threats, IT systems cannot be monolithic; they must allow their cryptographic foundations to be seamlessly updated without rebuilding the entire architecture or paralysing the operational performance of the enterprise.
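
    A minimal sketch of the ‘double lock’ in code, using the Python cryptography package for the classical X25519 exchange. Because no post-quantum KEM is assumed to ship with that package here, the PQC share is simulated with random bytes purely to show how both secrets feed a single key derivation.

    ```python
    # Hybrid key establishment sketch: classical X25519 plus a simulated post-quantum share.
    # Requires `pip install cryptography`; the PQC part is a stand-in, not a real KEM.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    alice, bob = X25519PrivateKey.generate(), X25519PrivateKey.generate()
    classical_secret = alice.exchange(bob.public_key())   # proven, pre-quantum scheme

    pq_secret = os.urandom(32)                             # placeholder for e.g. an ML-KEM decapsulation

    # Both secrets feed the KDF, so the session key holds as long as either lock holds.
    session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                       salt=None, info=b"hybrid-demo").derive(classical_secret + pq_secret)
    print(session_key.hex())
    ```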

    While technological solutions are already on the table, the impetus for change is increasingly coming not from IT departments, but from the offices of regulators. Europe is clearly accelerating in the race for digital sovereignty, and bodies such as the aforementioned ANSSI and EU institutions are no longer treating quantum resilience as an option, but as a necessity. Standardisation work, led globally by the US NIST, is being closely followed and adapted to European requirements. On the Old Continent, legislation, including the Cyber Resilience Act, is beginning to play a key role. The new legislation will gradually force software and hardware providers to comply with state-of-the-art cryptographic criteria. This means that soon even non-critical infrastructure companies will have to review their supply chains, making sure that their technology partners offer solutions that are ready for the post-quantum era.

    There is currently an intriguing divergence in the market. On the one hand, technology providers are showing great mobilisation, actively following recommendations and integrating new standards into their products to stay ahead of regulation. On the other hand, many end users, including large enterprises, are adopting a wait-and-see attitude. Some industries are holding off on decisions until rigid legal guidelines emerge. However, experts warn that this is a risky strategy. Crypto migration is an extremely complex, costly and time-consuming process. Organisations that only start it when hard regulations come into force may find themselves in a no-win situation, forced to make chaotic and costly upgrades under time pressure.

    For sectors such as banking or healthcare, anticipating the quantum threat has therefore become a strategic necessity, going far beyond the technical aspects of IT operations. It requires coordination at management level, inventorying resources and planning multi-year budgets. The first step for any conscious organisation should be to map out exactly where cryptography is used and assess how long the protected data must remain confidential. Time to prepare is running out, and in the world of cyber security, where customer trust and the stability of the financial system are at stake, the principle of ‘prevention is better than cure’ has never been more relevant. The move to post-quantum cryptography is not just a software update – it is a fundamental shift in thinking about information persistence and security in the 21st century.

  • OVHcloud makes Pasqal quantum processors available. New service launches

    French cloud provider OVHcloud has just made its most decisive move yet towards making the Old Continent independent of US computing technology. The launch of its new Quantum-as-a-Service platform is not only a technology launch, but above all a signal that Europe intends to build its own sovereign ecosystem in the nascent quantum computing sector.

    The Roubaix-based company has granted organisations access to the Orion Beta QPU processor, supplied by French startup Pasqal. This is the first step in an aggressive strategy to integrate at least eight quantum systems by the end of 2027. Significantly from a geopolitical perspective, up to seven of these are to come from European suppliers. Pasqal, as lead partner, sees this collaboration as the foundation for building a ‘digital autonomy’ in which both hardware and cloud infrastructure remain within EU jurisdiction.

    From a business perspective, OVHcloud’s offering stands out for its pragmatic approach to a still experimental technology. The provider combines access to a physical QPU with a set of nine quantum emulators, which are already used by nearly a thousand developers. This hybrid architecture allows companies to safely test algorithms and validate use cases in a cloud environment, without having to invest in expensive in-house lab infrastructure. Although ‘quantum supremacy’ – the moment when quantum computers will permanently surpass classical machines – may not yet be in sight, OVHcloud wants to be ready for that moment by offering an environment for iteration and learning now.

    The French move is a direct response to the dominance of US hyperscalers. IBM with its Quantum Cloud, Microsoft Azure with its Majorana 1 processor, AWS Braket and Google have been building a competitive advantage overseas for years. OVHcloud is entering this market late, but with a clear value proposition: it offers the first viable alternative to ensure that sensitive research and data does not leave the European economic area. In an era of increasing regulatory tensions and pressure on data sovereignty, this could be a key asset in the battle for public, financial and research customers.

    The choice between Pasqal’s technology and IBM’s solutions is not just a question of supplier, but a decision about the fundamental physical architecture. Although both companies are pursuing the same goal – stable quantum computing – they approach the problem from completely different sides of physics.

    IBM and the Superconducting Qubits approach

    IBM, like Google, is betting on superconducting qubits. These are essentially macroscopic electronic circuits that, when cooled to temperatures near absolute zero (in large dilution refrigerators), exhibit quantum properties. This approach is currently the most mature in engineering terms. Its greatest advantage is the speed of operations – quantum gates operate extremely fast here.

    However, it has the disadvantages of a short coherence time (the time for which a qubit ‘remembers’ its state) and difficulties with scaling. Each qubit must be physically connected to the control electronics, which, with thousands of qubits, creates a ‘wiring nightmare’ and generates heat, the enemy of the quantum state.

    Pasqal and the Neutral Atoms approach

    France’s Pasqal (and indirectly OVHcloud) uses rubidium atoms suspended in a vacuum and held by high-precision lasers called optical tweezers. In this system, the atoms themselves are the qubits. Since the atoms are identical by nature, this eliminates errors due to imperfections in chip manufacturing, a problem IBM has to contend with.

    A key advantage of Pasqal technology is scalability and connectivity. The lasers can arrange atoms into any three-dimensional shape, allowing complex chemical molecules or optimisation problems to be simulated in a way that is inaccessible to IBM’s rigid chip architecture. These systems can operate at room temperature (for the apparatus itself, although the atoms are laser-cooled), which drastically reduces energy costs. The disadvantage is slower execution times compared to superconductors.

    Feature               | Pasqal (OVHcloud)                                           | IBM (IBM Cloud)
    Architecture          | Neutral atoms (light/laser controlled)                      | Superconductors (electronic circuits on a chip)
    Stability (coherence) | High. Atoms maintain their quantum state longer (seconds).  | Low. Very short state lifetime (microseconds).
    Speed of operations   | Slower. Operations on atoms take longer.                    | Very fast. Near-instantaneous logic gates.
    Scalability           | High. Easier to add more atoms and lasers than cables.      | Moderate. Requires sophisticated cryogenic engineering.
    Main applications     | Materials simulations, logistics optimisation, chemistry.   | Cryptography, factorisation, universal algorithms.
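
    A back-of-the-envelope reading of the table: dividing coherence time by gate time gives the rough number of sequential operations each architecture can fit into one coherence window. The gate and coherence times below are illustrative orders of magnitude, not vendor specifications.

    ```python
    # Rough 'operations per coherence window' comparison; all figures are assumed orders of magnitude.
    platforms = {
        "superconducting (IBM-style)":  {"coherence_s": 100e-6, "gate_s": 50e-9},
        "neutral atoms (Pasqal-style)": {"coherence_s": 1.0,    "gate_s": 1e-6},
    }

    for name, p in platforms.items():
        ops = p["coherence_s"] / p["gate_s"]
        print(f"{name}: ~{ops:,.0f} operations per coherence window")
    # Superconductors win on raw gate speed; neutral atoms win on how long a computation can run.
    ```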

  • The race for quantum advantage: IBM bets on Nighthawk and accelerated manufacturing

    IBM has stepped up its efforts in the race to build a usable quantum computer with the unveiling of its new Quantum Nighthawk processor. The goal is clearly defined and strategically differentiated from the competition: the company wants to achieve a ‘measurable quantum advantage’ (Quantum Advantage) by the end of 2026. As opposed to purely theoretical ‘supremacy’, quantum advantage refers to the point at which quantum systems solve real scientific or business problems faster and more efficiently than the most powerful classical supercomputers.

    Featuring 120 qubits and 218 tunable couplers, Nighthawk is an evolution of the previous-generation Heron, which prioritised quality over quantity. IBM stresses that the improved architecture allows it to run circuits 30% more complex while maintaining a consistently low error rate. It is this rate, rather than the sheer number of qubits, that remains the biggest engineering challenge. Qubits are extremely sensitive to noise (decoherence), and errors that add up during computation render the results useless. Nighthawk is expected to be available to users by the end of 2025.

    The key to achieving this ambitious roadmap – which calls for, among other things, 15,000 two-qubit gates by 2028 – is scaling production. IBM has moved quantum processor manufacturing to a 300mm wafer fabrication facility at the Albany NanoTech Complex. This move, taken directly from the mature semiconductor industry, has already doubled the speed of development and increased the physical complexity of the chips tenfold, according to the company.

    In parallel, the company is working on the foundations of the future. The experimental Quantum Loon processor demonstrates the components necessary for fault-tolerant quantum computing, a goal for 2029. A breakthrough in error correction has also been reported, with a new decoding method operating ten times faster than existing methods, a year ahead of the original plan.

    To give credibility to its progress and set a market standard, IBM is launching an open, community-based Quantum Advantage Tracker with partners such as Algorithmiq. The initiative aims to transparently monitor and verify new demonstrations of the real-world benefits of quantum technology.

  • Quantum computers on a massive scale. Nobel laureate and HPE announce groundbreaking plan

    John M. Martinis, a recent winner of the Nobel Prize in Physics (2025) and one of the architects of Google’s breakthrough in ‘quantum supremacy’, is starting a new chapter. This time his goal is not a laboratory record, but the creation of a practical, mass-produced quantum supercomputer. On Monday, he announced the formation of the Quantum Scaling Alliance, bringing in the heavy artillery: supercomputing giant HPE and key players in the semiconductor supply chain.

    The initiative is a direct response to the industry’s biggest pain point. Quantum computers, promising a revolution in chemistry or medicine, remain largely one-off creations. As Martinis put it, since the 1980s quantum chips have been produced “in an artisanal way”. The Quantum Scaling Alliance aims to change this by moving the production of qubits from laboratories to factories.

    That’s why the presence in the alliance of Applied Materials, a supplier of chip-making machines, and Synopsys, a leader in chip-design software (EDA), is crucial. The idea is to use the same sophisticated tools that today produce millions of processors for smartphones and AI servers to build quantum systems. This signals the industry’s desire to move “to a more standardised, professional model”.

    However, building stable qubits at scale is only half the battle. The real challenge, the partners emphasise, lies in integration and scaling. Masoud Mohseni, head of the quantum team at HPE, tones down the enthusiasm, noting that moving from hundreds to thousands of qubits raises entirely new issues. “People naively think [scaling] is linear. This is simply not true,” Mohseni stated.

    HPE’s task will primarily be to integrate delicate quantum circuits into classical supercomputers. These classical systems are to manage the quantum hardware in real time and handle the crucial error correction process, without which qubits are useless. The consortium also includes specialised companies such as Riverlane and 1QBit (responsible for error correction) and Quantum Machines (control systems), which shows that the aim is to build a complete, commercial technology stack.

  • Investments in quantum will pay off faster. IBM proves the technology is ready to scale

    The race to build a functional quantum computer has entered a new stage. IBM has announced a major breakthrough in error correction that solves one of the fundamental problems of the technology. Crucially, the solution is based on widely available components, which could dramatically accelerate commercialisation.

    The problem with quantum computing is well known: qubits, the basis of its computing power, are extremely prone to errors that quickly build up and prevent useful results. IBM signalled back in June that it had developed an algorithm capable of solving this problem on the fly.

    The latest reports, to be published on Monday, confirm that this is no longer just a theory. IBM has demonstrated that its error correction algorithm works in real time. The real breakthrough, however, is the hardware platform. Instead of relying on exotic, custom-designed processors, IBM has run its algorithm on standard field-programmable gate array (FPGA) chips manufactured by AMD.

    Jay Gambetta, director of research at IBM, highlighted that the implementation not only works, but is also 10 times faster than current requirements. Using off-the-shelf AMD chips means the solution is not ‘ridiculously expensive’, removing a significant financial barrier to scaling the technology.

    This move strengthens IBM’s position in competition with technology giants such as Google (Alphabet) and Microsoft, who are also investing heavily in quantum research. For IBM, it also means a significant acceleration of its own schedule. Algorithm work, crucial to the ‘Starling’ quantum computer planned for 2029, was completed a year ahead of schedule.

    The financial market reacted with immediate optimism. Friday’s trading of IBM shares closed up 7.88%, while AMD shares gained 7.63%. This is a clear signal that investors see this alliance of quantum and classical technology as a viable step towards practical applications of next-generation supercomputing.

  • AI accelerator market: NVIDIA, AMD, Intel – the battle for supremacy

    At North Carolina State University, robotic arms precisely mix chemicals while streams of data flow through systems in real time. This ‘self-driving laboratory’, an AI-powered platform, discovers new materials for clean energy and electronics not in years, but days.

    Collecting data 10 times faster than traditional methods, it observes chemical reactions like a full-length film rather than a single snapshot. This is not science fiction; it is the new reality of scientific discovery.

    This incredible leap is being driven by a new kind of computing engine: specialised AI accelerator chips. These are the ‘silicon brains’ of the revolution. Moore’s law, the old paradigm of doubling computing power in general-purpose systems, has given way to a new law of exponential progress, driven by massive parallel processing.

    The crux of the story, however, is more complex. While AI algorithms are the software of a new scientific era, the physical hardware – the AI chips – has become the fundamental enabler of progress and, paradoxically, also its biggest bottleneck.

    The ability to discover a new life-saving drug or design a more efficient solar cell is today inextricably linked to a hyper-competitive, multi-billion dollar corporate arms race and a fragile geopolitical landscape in which access to these chips is a tool of global power.

    Anatomy of a boom: who is building silicon brains?

    The boom in generative artificial intelligence has created an insatiable demand for computing power. It’s not just chatbots, but foundational models that underpin a new wave of scientific research. This demand has transformed a niche market into a global battlefield for dominance.

    Reigning champion: NVIDIA

    NVIDIA has established itself as a key architect of the AI revolution, as evidenced by its stunning financial results. The data centre division, the heart of the company’s AI business, reported revenues of $41.1bn in a single quarter, up 56% year-on-year.

    This dominance is based on successive generations of powerful architectures such as Hopper and now Blackwell, which are core hardware for technology giants such as Microsoft, Meta and OpenAI.

    An energetic contender: AMD

    AMD is positioning itself not as a distant number two, but as a serious and fast-growing competitor. The company reported record data centre revenue of US$3.5bn in Q3 2024, a massive 122% year-on-year increase, driven by strong adoption of its Instinct series GPU accelerators.

    Significantly, major cloud service providers and companies such as Microsoft and Meta are actively deploying MI300X accelerators from AMD, signalling a desire to have a viable alternative to NVIDIA. The company forecasts that its data centre GPU revenue will exceed US$5bn in 2024.

    The gambit of the historical giant: Intel

    Intel’s situation presents a strategic challenge. Although the company claims that its Gaudi 3 accelerators offer a better price/performance ratio compared to NVIDIA’s H100, it is struggling to gain market share.

    Intel missed its $500m revenue target for Gaudi in 2024, citing slower-than-expected adoption due to issues with transitioning between product generations and, crucially, challenges with ‘ease of use of the software’.

    Analysis of this data reveals deeper trends. Firstly, the AI hardware market is not just a race for components, but a war of platforms. Intel’s difficulties with software point to the real battlefield: the ecosystem. NVIDIA’s CUDA platform has more than a decade’s head start, creating a deep ‘moat’ of developer tools, libraries and expertise.

    Competitors are not just selling silicon; they need to convince the whole world of science and development to learn a new programming language. Secondly, the AI boom is leading to vertical integration of the data centre.

    Not only does NVIDIA dominate the GPU market, but following its acquisition of networking company Mellanox in 2020, it has also become the leader in Ethernet switches, recording sales growth of 7.5x year-on-year.

    NVIDIA is no longer just selling chips; it is selling a complete, optimised ‘AI factory’ design, creating an even stronger lock-in effect.

    From lab to reality: scientific breakthroughs powered by silicon

    This unprecedented computing power is fuelling a revolution in the way we do research, leading to breakthroughs that seemed impossible just a few years ago.

    The medicine of tomorrow

    The traditional drug discovery process, which takes 10 to 15 years, is being dramatically shortened. DeepMind CEO Demis Hassabis predicts that AI will reduce this time to “a matter of months”.

    Isomorphic Labs, a subsidiary of DeepMind, is using AI to model complex biological systems and predict drug-protein interactions. Researchers at Virginia Tech have developed an AI tool called ProRNA3D-single that creates 3D models of protein-RNA interactions – key to understanding viruses and neurological diseases such as Alzheimer’s.

    Moreover, a new tool from Harvard, PDGrapher, goes beyond the ‘one target, one drug’ model. It uses a graph neural network to map the entire complex system of a diseased cell and predicts combinations of therapies that can restore it to health.

    High-resolution climate

    In the past, accurate climate modelling required a supercomputer. Today, AI models such as NeuralGCM from Google can run on a single laptop. This model, trained on decades of weather data, helped predict the arrival of the monsoon in India months in advance, providing key forecasts to 38 million farmers.

    A new AI model from the University of Washington is able to simulate 1,000 years of Earth’s climate in just one day on a single processor – a task that would take a supercomputer 90 days.

    Companies like Google DeepMind (WeatherNext), NVIDIA (Earth-2) and universities like Cambridge (Aardvark Weather) are building fully AI-driven systems that are faster, more efficient and often more accurate than traditional models.

    Alchemy of the 21st century

    As mentioned at the outset, AI is creating autonomous labs that accelerate materials discovery by a factor of ten or more. The paradigm shifts from searching existing materials to generating entirely new ones.

    AI models, such as MatterGen from Microsoft, can design new inorganic materials with desired properties from scratch. This ability to ‘reverse engineer’, where scientists identify a need and AI proposes a solution, has been the holy grail of materials science.

    These examples illustrate a fundamental change in the scientific method itself. The computer has ceased to be merely a tool for analysis; it has become an active participant in the generation of hypotheses. The role of the scientist is evolving into a curator of powerful generative systems.

    This accelerates the discovery cycle exponentially and allows scientists to explore a much larger ‘problem space’ than was ever possible for humans.

    Geopolitical storm and a new division of the world

    As the importance of these silicon brains grows, they are becoming the most valuable strategic resource of the 21st century – the new oil, crucial for economic competitiveness and scientific leadership.

    US strategy: “small yard, high fence”

    The US has implemented a ‘small yard, high fence’ strategy, introducing export controls aimed at slowing China’s ability to develop advanced AI. These restrictions apply not only to the chips themselves (such as NVIDIA’s H100), but also to the hardware required to manufacture them (from companies such as the Dutch ASML).

    This hit the Chinese semiconductor industry in the short term, causing equipment shortages and ‘crippling’ its production capacity.

    China’s determined response

    China’s response has been multi-pronged: massive investment in its domestic semiconductor industry and the use of its own economic leverage by restricting exports of key rare earth elements. The case study is Huawei.

    Despite being crippled by sanctions, the company has developed its own line of Ascend AI chips (910B/C/D), which are now seen as a viable alternative to NVIDIA products in China.

    In response, the US government has toughened its stance, declaring that the use of these chips anywhere in the world violates US export controls, escalating the technological divide.

    A study by Oxford University reveals a harsh reality: advanced GPUs are heavily concentrated in just a few countries, mainly in the US and China. The US leads the way in access to state-of-the-art chips, while much of the world is in ‘computing deserts’.

    This situation leads to unintended consequences. US export controls, designed to slow China down, have become an ‘inadvertent accelerator of innovation’ for China, forcing Beijing to build a completely independent technology stack.

    A decade from now, the world may have two completely separate, incompatible AI stacks, fundamentally dividing global research.

    The cloud as the great equaliser?

    There is a powerful counter-argument: cloud computing democratises access to elite AI. Platforms such as Amazon Web Services (AWS), Microsoft Azure and Google Cloud offer AI-as-a-Service (AIaaS), allowing a university or startup to rent the same powerful GPUs that OpenAI uses.

    The cloud giants offer rich ecosystems. AWS provides services such as SageMaker for building models and Bedrock for access to leading foundation models. Google Cloud promotes democratisation with tools such as Vertex AI, designed for minimal complexity.

    Microsoft Azure is tightly integrating AI into its ecosystem through Azure AI Foundry, offering access to more than 1,700 models and running dedicated ‘AI for Science’ research labs.

    However, the promise of access must be set against the harsh reality of cost. Training a state-of-the-art model is prohibitively expensive, with estimates as high as USD 78 million for GPT-4 and USD 191 million for Gemini Ultra. This leads to a ‘two-tier democracy’ in AI research.

    On the one hand, any researcher with a grant can access world-class AI tools. This is a democratisation of application. On the other hand, the ability to train a new large-scale foundational model from scratch remains the exclusive domain of a handful of actors: the cloud providers themselves and their key partners.

    This is the centralisation of creation. The cloud ‘democratises’ AI in the same way that a public library democratises access to books. Anyone can read them, but only a few have the resources to write and publish them.

    A future written in silicon

    The breathtaking pace of scientific discovery in medicine, climatology and materials science is a direct consequence of the massive industrial and geopolitical mobilisation around a single technology: the AI accelerator.

    Progress has become fragile and deeply interdependent. Scientific breakthrough is no longer just a function of a brilliant mind. It now also depends on the quarterly financial reports of NVIDIA and AMD, the trade policies enacted in Washington and Beijing, the stability of the supply chain passing through Taiwan and the pricing models of AWS, Google and Microsoft.

    We have entered an era where the future is literally written in silicon. The great challenges of our time – curing disease, fighting climate change, creating a sustainable future – will be solved with these new tools.

    But who will be able to wield them, and for what purpose, remains the most important and unresolved question of the 21st century. The next great scientific revolution will be televised live, but the rights to broadcast it are currently being negotiated in the boardrooms of corporations and the corridors of global power.

  • Quantum Game of Thrones: who will build the machine that will change the world?

    There are moments in the history of technology that redefine the limits of possibility. The mastery of fire, the invention of printing, the digital age – each of these eras was sparked by a fundamental discovery.

    Today we stand on the threshold of another such transformation, which is not simply the evolution of computing power, but the birth of an entirely new paradigm. We are talking about quantum computing.

    The race to build a functional quantum computer is the most important technological and geopolitical duel of the 21st century.

    At stake is the ability to solve problems that are today beyond the reach of the most powerful supercomputers – from designing drugs at the molecular level, to creating revolutionary materials, to breaking almost all modern encryption systems.

    At the heart of this revolution is quantum mechanics, with its principles of superposition and entanglement, which allows a qubit – the quantum equivalent of a bit – to exist in multiple states simultaneously. It is this fundamental difference that gives quantum computers their unimaginable potential.
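
    The two principles can be shown numerically with nothing but linear algebra. The sketch below builds a single qubit in an equal superposition and a two-qubit Bell state, and prints the measurement probabilities that make entanglement visible: the pair always agrees, yet each outcome alone is a coin flip.

    ```python
    # Toy numerical model of superposition and entanglement.
    import numpy as np

    ket0, ket1 = np.array([1, 0]), np.array([0, 1])

    # superposition: equal amplitudes on |0> and |1>, so each outcome has probability 1/2
    plus = (ket0 + ket1) / np.sqrt(2)
    print("single-qubit probabilities:", np.abs(plus) ** 2)      # [0.5 0.5]

    # entanglement: the Bell state (|00> + |11>)/sqrt(2) - the two qubits always agree
    bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
    print("two-qubit probabilities:", np.abs(bell) ** 2)         # [0.5 0 0 0.5]
    ```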

    The year 2025, declared by the UN as the International Year of Quantum Science and Technology, is a symbolic turning point. We are no longer in the realm of purely theoretical considerations. We have entered the NISQ (Noisy Intermediate-Scale Quantum) era – a time of imperfect, ‘noisy’ quantum computers that nevertheless grow more powerful every year.

    It is a nascent industry, estimated to be worth US$866 million in 2023 and projected to reach US$4.4 billion by 2028.

    Great quantum families: contenders for the crown

    Three powerful players have emerged on the battlefield for the quantum future: Google, IBM and Microsoft. Each has a different strategy to sit on the technological throne.

    Google: the alchemists of Mountain View

    Google’s strategy focuses on spectacular, breakthrough demonstrations of power. Their latest weapon is the ‘Willow’ processor, but the real breakthrough lies not in the number of qubits, but in the mastery of error correction.

    Google engineers have announced that they are able to maintain the stability of a logical qubit – that is, a set of physical qubits working together to correct errors – for up to an hour.

    This is a monumental leap compared to the microseconds that were the standard not so long ago. Their claim to the throne is based on being the first to push the boundaries of science, as in 2019 when they were the first to announce the achievement of ‘quantum supremacy’.

    IBM: kingdom builders for all

    IBM is playing a very different game. Instead of isolated breakthroughs, they are betting on consistent progress and democratising access to technology. Their roadmap is public and precise, and they plan to make the ‘Nighthawk’ processor available in 2025.

    A key element of their strategy is to integrate quantum computers with classical supercomputers (HPC), creating a hybrid future. The opening of Europe’s first quantum data centre in Germany is a strategic move, bringing quantum resources directly into European industry and academia.

    IBM is not just building a lab experiment; it is creating a business-ready platform accessible through the cloud.

    Microsoft: patient architects in Redmond

    Microsoft took the path of highest risk, but also potentially highest reward. For decades they have invested in research into the mythical ‘topological qubit’, which would be inherently fault-tolerant.

    While waiting for this technology to mature, they built the powerful Azure Quantum ecosystem, designed to be independent of any particular hardware architecture. Their latest breakthrough is a demonstration of 12 entangled logical qubits with an error rate 800 times lower than that of single physical qubits, achieved in collaboration with Quantinuum.

    The partnership with Atom Computing aims to build “the world’s most powerful quantum machine”, combining their error correction software with promising technology based on neutral atoms.

    The geopolitical great game: the dragon versus the eagle

    The rivalry is moving into the global arena, where it is becoming central to the confrontation between the United States and China. It is a battle for technological hegemony involving billions of dollars of public and private funds.

    The US leads the way when it comes to the dynamism of the startup ecosystem, with 77 quantum technology companies operating there. This innovation is driven by gigantic private investment and the research power of big tech families.

    The federal government also plays a key role by providing significant funding for basic research.

    However, China is playing a long-term, fully state-controlled game. They are catching up with shocking speed. According to a report by the Australian Strategic Policy Institute (ASPI), China is already leading in 57 out of 64 key technologies, including such vital areas as quantum sensors.

    The Middle Kingdom is pursuing a strategy based on gigantic investments in research infrastructure and aims to achieve dominance in the production of mature chips.

    Europe, although a significant player, lags behind the two superpowers in terms of the scale of investment.

    Nevertheless, initiatives such as EuroHPC and the strategic positioning of quantum computers in Poland and Germany are evidence of a coordinated effort to maintain competitiveness.

    Winners’ trophies: industries on the threshold of tomorrow

    Why are governments and corporations investing billions in this technology? The answer lies in the revolutionary applications that await the winners.

    Healthcare and pharmaceuticals: drug design

    One of the most difficult problems for classical computers is the precise simulation of complex molecules. Quantum computers are naturally predisposed to simulate such systems. Their use can reduce the time needed to discover and develop a new drug by up to 50-70%.

    Pharmaceutical giants such as Roche and Pfizer are actively working with technology companies to prepare for the coming of the quantum era.

    Pfizer’s collaboration with technology company XtalPi, using artificial intelligence as a bridge to full quantum computing, has already reduced the time it takes to calculate the crystal structure of molecules from months to just days.

    Finance: quantum hedge fund

    Financial markets are a world of complex optimisation and risk modelling problems. Quantum algorithms are able to analyse a much larger number of variables and scenarios simultaneously, leading to optimised portfolios and more accurate risk assessment.

    Financial institutions such as JPMorgan and BBVA are already running pilot projects in collaboration with IBM and D-Wave. However, this same power also poses an existential threat. A quantum computer of sufficient scale will be able to crack the encryption algorithms that underpin the security of the entire digital economy.

    This creates an urgent need to implement so-called post-quantum cryptography.

    Materials science and chemistry: engineering the impossible

    The creation of new materials today relies heavily on trial and error. Quantum computers are opening the way to ‘custom material design’, enabling the precise simulation of the quantum properties of substances before they are even produced in the laboratory. This could lead to breakthroughs such as superconductors that work at room temperature or catalysts that make industrial processes radically more energy efficient. Companies such as BASF are deeply involved in research, forming partnerships with startups and academic institutions.

    From supremacy to advantage: the true measure of victory

    Google’s announcement of ‘quantum supremacy’ in 2019 was a milestone, but not a commercial turning point. The problem their computer solved had no practical application. It is therefore crucial to distinguish the terms:

    • Quantum Supremacy: Proof that a quantum computer can beat a classical one at some task, even a useless one. It is a scientific benchmark, but without direct commercial relevance.
    • Quantum Advantage: The real goal. It means the ability to solve a useful, real-world business problem faster, cheaper or more accurately than any classical computer.
    • Quantum Utility: The pragmatic state we are currently in. It means using today’s imperfect NISQ computers to achieve tangible, though not yet revolutionary, results.

    The change in language itself, moving away from the confrontational term ‘supremacy’ to the more practical terms ‘advantage’ and ‘usability’, is symptomatic of the maturity of the industry as a whole. It marks a shift from pure science to commercial applications.

    The quantum revolution will not come with a bang. It will be a quiet, creeping transformation. The true measure of victory in this game of thrones will not be supremacy, but utility – the number of problems solved and the value created. The time to prepare is not when the throne is won, but now, when the great families are making their first strategic moves. The game has begun.

  • The quantum time bomb is ticking. The problem is the present, not the future

    The quantum time bomb is ticking. The problem is the present, not the future

    In the shadow of discussions about artificial intelligence and cloud computing, a quiet arms race is underway. It is not about present dominance, but about access to the greatest treasure of the future: today’s encrypted data.

    Attackers collect and store packets of encrypted information on a massive scale, from trade secrets to government data, in the full knowledge that the haul is unreadable to them today. However, they assume that in a few years they will have the key that will open all these locks – a quantum computer.

    This strategy, known as ‘Harvest Now, Decrypt Later’, is fundamentally changing the perception of cyber security. The problem is no longer just the day a quantum computer breaks the first security, but the fact that data with a long shelf life is being stolen now.

    The paradox of uncertainty

    The debate about the maturity of quantum technologies is fraught with contradictions. Optimists point to the first practical applications in the next few years. Sceptics speak of a time horizon of a decade or more.

    This divergence of forecasts creates a dangerous sense of distant threat that leads organisations to postpone action.

    However, from a risk management perspective, it does not matter whether that moment arrives in five years or fifteen. Sensitive data – strategic company plans, patient medical data, intelligence information or intellectual property – must remain confidential for decades.

    Meanwhile, such data, secured today with standard asymmetric algorithms such as RSA or ECC, is the main target of ongoing theft. For cybercriminals and hostile states, this is a low-cost investment with a potentially gigantic return.

    Global mobilisation and a concrete schedule

    Awareness of this threat is growing in standardisation and government institutions around the world. The US National Institute of Standards and Technology (NIST) has already completed the crucial step of selecting and standardising the first post-quantum cryptography (PQC) algorithms to be resistant to attacks using quantum computers. This has given the market a clear signal to start preparing for the migration.

    In Europe, momentum is being generated by the NIS Cooperation Group, which published a concrete action plan in June 2025. It leaves no illusions about the urgency of the task. Gartner analysts predict that the first serious threat to commonly used asymmetric processes could emerge as early as 2029. Time is therefore short.

    A challenge greater than algorithm replacement

    The transition to post-quantum cryptography is much more than a simple update to cryptographic libraries. New PQC algorithms often feature larger key and signature sizes, which can impact the performance and architecture of existing systems, especially in resource-constrained environments like IoT.

    Organisations face the need to conduct a detailed audit of their cryptographic assets – to understand where and what encryption is being used. This task in itself is complicated in distributed cloud and hybrid environments.

    Moreover, the migration process will have to take place in stages. Hybrid solutions, allowing the parallel use of classical algorithms and their post-quantum counterparts, will become necessary to ensure business continuity and backward compatibility.
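
    One common way such a hybrid mode is sketched is to derive the session key from both a classical and a post-quantum shared secret, so that breaking either scheme alone is not enough. The snippet below is only an illustration: it uses the widely available `cryptography` package for the classical X25519 part, while the post-quantum secret is a labelled placeholder (a real deployment would obtain it from an ML-KEM implementation).

    ```python
    # Hybrid key-establishment sketch: combine a classical ECDH secret with a
    # post-quantum KEM secret via HKDF. The PQC secret here is a PLACEHOLDER;
    # real systems would take it from an ML-KEM (FIPS 203) implementation.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Classical part: ordinary X25519 Diffie-Hellman between two parties
    alice, bob = X25519PrivateKey.generate(), X25519PrivateKey.generate()
    classical_secret = alice.exchange(bob.public_key())

    # Post-quantum part: placeholder bytes standing in for an ML-KEM shared secret
    pqc_secret = os.urandom(32)

    # Both secrets feed a single KDF, so the session key survives the failure of either scheme
    session_key = HKDF(
        algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid-kex-demo"
    ).derive(classical_secret + pqc_secret)
    print(len(session_key), "byte hybrid session key")
    ```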

    The quantum threat is no longer a theoretical scenario. It is a real and active data collection campaign that is happening before our eyes. Organisations that ignore the need to prepare a PQC migration strategy today may discover in a few years’ time that their most valuable secrets, secured to yesterday’s standards, have become publicly accessible.

    The time for action is now.

  • IBM and AMD join forces. Goal: Quantum supercomputers

    IBM and AMD join forces. Goal: Quantum supercomputers

    Technology giants IBM and AMD have announced a strategic partnership to integrate the power of quantum computing into classical supercomputers.

    The collaboration will focus on creating hybrid architectures that combine IBM’s leadership in quantum technology with AMD’s expertise in high performance computing (HPC) and AI accelerators.

    The partnership is expected to lead to open, scalable platforms that could redefine the future of advanced computing. The idea behind this project is to create so-called quantum-centric supercomputers.

    In such a model, quantum processors will operate in tandem with classical HPC infrastructure, driven by CPUs, GPUs and FPGAs from AMD.

    The hybrid concept assumes that complex computational problems will be broken down into parts, each solved by the technology best suited to it.

    For example, a quantum computer could tackle the simulation of the behaviour of atoms and molecules at the quantum level – a task not feasible for classical machines – while supercomputers based on the AMD architecture would analyse huge result data sets and support processes using artificial intelligence.
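
    The division of labour can be pictured with a toy sketch: a small quantum circuit produces measurement samples (the QPU’s job), and ordinary NumPy code aggregates them (the classical HPC side’s job). This is purely illustrative of the orchestration pattern, not the actual IBM-AMD stack, and assumes the qiskit and numpy packages are available.

    ```python
    # Toy quantum-centric workflow: sample a small circuit, then post-process classically.
    import numpy as np
    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    # "Quantum" step: a 3-qubit circuit sampled 1024 times (simulated locally here)
    qc = QuantumCircuit(3)
    qc.h(range(3))                                     # uniform superposition over 8 states
    counts = Statevector(qc).sample_counts(shots=1024)

    # "Classical" step: bulk analysis of the result data with NumPy
    bitstrings = np.array([[int(b) for b in key] for key in counts])
    weights = np.array([counts[key] for key in counts])
    avg_ones = float((bitstrings.sum(axis=1) * weights).sum() / weights.sum())
    print("average number of 1s per shot:", avg_ones)  # ~1.5 for this uniform state
    ```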

    Such synergies are expected to allow real-world problems to be solved in areas such as drug and material discovery, logistics or the optimisation of complex systems at a scale and speed not previously possible.

    The companies are exploring how to integrate AMD’s technologies with IBM’s quantum systems to accelerate a new class of algorithms.

    One of the key aspects of the collaboration is expected to be the use of AMD technology for real-time quantum error correction – a fundamental challenge on the road to building stable and fault-tolerant quantum computers.

    The first demonstration showing how IBM’s quantum systems work with AMD technology is planned for later this year. The partners also intend to develop open-source ecosystems, such as Qiskit, to facilitate the creation of algorithms for new hybrid supercomputers.

    These activities are part of IBM’s broader strategy, which already includes similar integrations with the Fugaku supercomputer in Japan and collaborations with the likes of Cleveland Clinic and Lockheed Martin.

  • New Fujitsu technology: Quantum computing improves robot precision by 43%

    New Fujitsu technology: Quantum computing improves robot precision by 43%

    The hybrid approach from Fujitsu and leading universities reduces posture calculation errors by 43%, paving the way for more complex humanoid machines.

    A consortium involving Fujitsu, Shibaura Institute of Technology and Waseda University has developed a novel hybrid method to control the posture of multi-jointed robots. By harnessing the power of quantum computing, the researchers have solved one of the classic problems of robotics – the high computational complexity of inverse kinematics.

    A key challenge in advanced robotics is inverse kinematics, i.e. the process of calculating the angles in the individual joints of a robot to bring its tip (e.g. the gripper) to a precisely defined point. In the case of machines with a large number of degrees of freedom that mimic the human body (e.g. 17 joints), the number of possible combinations becomes so enormous that classical computers cannot cope with real-time calculations. This leads to simplifications, limiting the fluidity and range of movement of the robot.

    The new approach involves representing the orientation and position of each part of the robot using qubits. Crucially, the technique uses quantum entanglement to recreate the physical relationships between joints – the movement of one segment immediately affects the segments connected to it. The calculation of the forward kinematics (the position of the tip based on the angles) takes place in the quantum circuit, while the task of inverse kinematics remains with the classical computer.
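
    For readers less familiar with the terminology, the sketch below shows the purely classical version of the problem for a toy two-joint planar arm: a forward-kinematics function (angles to tip position) and an iterative inverse-kinematics solve built on top of it. It is only a baseline illustration of the problem structure, not the Fujitsu quantum technique itself.

    ```python
    # Classical baseline: forward and inverse kinematics for a toy 2-joint planar arm.
    # Illustrates the problem structure only - NOT the quantum method described above.
    import numpy as np

    L1, L2 = 1.0, 0.8  # link lengths

    def forward(thetas):
        """Forward kinematics: joint angles -> position of the arm's tip."""
        t1, t2 = thetas
        return np.array([L1 * np.cos(t1) + L2 * np.cos(t1 + t2),
                         L1 * np.sin(t1) + L2 * np.sin(t1 + t2)])

    def inverse(target, thetas=np.zeros(2), iters=200, step=0.5):
        """Inverse kinematics: iteratively adjust the angles until the tip reaches 'target'."""
        for _ in range(iters):
            error = target - forward(thetas)
            J = np.zeros((2, 2))                      # numerical Jacobian of the forward map
            for j in range(2):
                d = np.zeros(2); d[j] = 1e-6
                J[:, j] = (forward(thetas + d) - forward(thetas)) / 1e-6
            thetas = thetas + step * np.linalg.pinv(J) @ error
        return thetas

    angles = inverse(np.array([1.2, 0.9]))
    print("reached:", forward(angles))                # close to the requested (1.2, 0.9)
    ```

    With 17 joints instead of two, the same search space grows so quickly that this kind of iteration becomes exactly the bottleneck the article describes.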

    Verification on a Fujitsu quantum simulator showed a reduction in positioning error of up to 43% with fewer calculations compared to conventional methods. The effectiveness of entanglement was also confirmed in an experiment on a 64-qubit quantum computer built by Fujitsu and the RIKEN institute. Trial calculations for a complex 17-joint model were completed in about 30 minutes.

    The method is efficient enough to run on existing noisy intermediate-scale quantum (NISQ) computers. In the future, the technology could find applications in real-time control of humanoid robots and manipulators, optimisation of their energy consumption or advanced obstacle avoidance.

  • Post Quantum Cryptography: Business implications of algorithm choice

    Post Quantum Cryptography: Business implications of algorithm choice

    Today’s digital economy rests on an invisible but fundamental pillar: public key cryptography. Algorithms such as RSA and ECC have become synonymous with digital trust, but this foundation faces an existential threat. The advent of quantum computers capable of breaking current encryption standards is not the next evolutionary step in cyber security; it is a revolution that is forcing technology leaders to fundamentally change their thinking about data protection.

    The quantum threat is not a distant, theoretical possibility. It materialises today through a strategy known as ‘Harvest Now, Decrypt Later’ (HNDL). Adversaries, including state actors, are actively capturing and storing vast amounts of encrypted data, patiently waiting for the moment when a cryptographically relevant quantum computer (CRQC) becomes operational. At that point, the data we consider secure today will be retrospectively breached. This fundamentally changes the risk model, shifting the responsibility from purely operational to strategic, related to protecting the long-term value of the company. Any information whose confidentiality lifecycle extends beyond the anticipated arrival of a CRQC is already at risk.

    The arguments for postponing migration to post-quantum cryptography (PQC) have run out of steam. In August 2024, the US National Institute of Standards and Technology (NIST) published the first finalised PQC standards: FIPS 203 (ML-KEM), FIPS 204 (ML-DSA) and FIPS 205 (SLH-DSA). This event sends a key signal to the global market: the research phase is over and the standards are awaiting implementation.

    End of the era of “one right encryption”

    For decades, public key cryptography decisions have been relatively straightforward, mostly boiling down to a choice between RSA and ECC. The post-quantum era puts a definitive end to this paradigm. We are entering a world where there is not and will not be a one-size-fits-all PQC algorithm. Choosing the right solution becomes a conscious strategic decision that must be precisely tailored to the specific use case.

    The world of PQC is inherently heterogeneous. NIST deliberately standardises algorithms from different mathematical families because each offers a unique set of trade-offs between computational efficiency, data size, resource usage and security level. Algorithm selection ceases to be solely the domain of cryptographers and becomes an architectural and product decision. Differences in the characteristics of individual algorithms lead to drastically different consequences depending on the deployment scenario.

    For high-performance TLS servers supporting, for example, an e-commerce front-end, minimal latency is a priority. Benchmark data clearly shows that CRYSTALS-Kyber (ML-KEM) leads the way, offering negligible performance overhead. In contrast, in VPNs where maximum throughput is crucial, algorithms with very large keys, such as BIKE, can become an issue, causing massive IP packet fragmentation and degrading network performance. In the world of IoT devices, where resources are extremely limited, algorithms with high memory requirements can be completely impractical, directly impacting component cost.

    The data speaks for itself

    Abstract discussions become real when hard data is analysed. Translating milliseconds of latency and kilobytes of data into concrete business implications is key to making informed decisions. Analysis based on comprehensive benchmarks shows that the choice of PQC algorithm is not only a question of security, but also of real operational costs.

    The performance leader is undisputedly CRYSTALS-Kyber, which shows negligible computational overhead in server tests. Its public keys and ciphertexts are compact, occupying just over 2 KB in total, and peak RAM consumption is minimal, making it an ideal candidate for a wide range of applications. At the other extreme is BIKE, a code-based algorithm. Despite its advantages, its implementation comes at a tangible cost: the sum of data exchanged during connection reconciliation exceeds 10 KB, which is more than four times that of Kyber. This size not only increases the cost of data transfer, but above all risks fragmenting network packets.
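
    To see why the size difference matters on the wire, a back-of-the-envelope calculation is enough: with a typical 1500-byte Ethernet MTU, the roughly 2 KB exchanged by Kyber fits in a couple of packets, while the more than 10 KB exchanged by BIKE spreads across many. The figures below are rough stand-ins for the benchmark numbers quoted above, not exact protocol measurements.

    ```python
    # Rough fragmentation estimate for the handshake sizes discussed above.
    # Sizes are approximations of the quoted benchmarks, not exact protocol figures.
    import math

    MTU = 1500  # bytes, typical Ethernet payload limit

    handshake_bytes = {
        "CRYSTALS-Kyber (ML-KEM)": 2_200,    # ~2 KB of public key + ciphertext
        "BIKE":                    10_500,   # >10 KB exchanged during key establishment
    }

    for algo, size in handshake_bytes.items():
        print(f"{algo:25s} ~{size / 1024:.1f} KB -> at least {math.ceil(size / MTU)} packets")
    ```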

    Even more striking is the example of the Falcon signature algorithm. It offers extremely small signatures, which is a huge advantage in IoT applications where bandwidth is at a premium. However, its peak memory consumption is more than 26 times that of the competing Dilithium. For an engineer designing a medical device, this means that he or she may have to choose a more expensive microcontroller, increasing the unit cost of the overall product.

    Hidden risks and recommendations for CISOs

    Achieving compliance with PQC standards is just the first step. The biggest risks lie not in the theory of the algorithms, but in their practical implementation. Mathematical robustness is worthless if the physical implementation on the processor ‘leaks’ information about the secret key through side-channel attacks. Research has already shown successful attacks on Kyber and Dilithium implementations using analysis of power consumption or electromagnetic emissions.

    A key implication for CISOs is the hidden cost of security. Securing an implementation requires specialised countermeasures, such as masking, which introduce a significant performance overhead. NIST representatives have admitted that a well-secured implementation of Kyber can be up to twice as slow as its baseline version. This means that infrastructure budgets based on benchmarks of unsecured implementations are fundamentally flawed.

    Another area of risk is the supply chain. The security of a PQC system is only as strong as its weakest link, and these are often outside the company’s control – in open-source libraries or cloud providers. The CISO needs to start asking vendors tough questions about their roadmap for implementing NIST standards, support for hybrid modes and documented resilience to physical attacks.

    The solution to these challenges is crypto-agility – the ability of an architecture to easily and quickly replace cryptographic algorithms without fundamental changes to the infrastructure. In the dynamic world of PQC, where new standards are already emerging, this approach is a recipe for avoiding costly and risky migrations in the future. Investing in a crypto-agile architecture now, as part of the first PQC project, is a strategic decision that will drastically reduce the total cost of security ownership over the coming decade.

    Migration to PQC is not a one-off project, but a strategic transformation programme. It should proceed in a methodical and phased manner. The first, fundamental phase is in-depth preparation and inventory. Crucial here is the creation of a detailed cryptographic inventory that maps all systems using vulnerable cryptography. Then, based on a risk analysis, the migration should be prioritised, focusing on assets with the longest required confidentiality lifecycle. The second phase is to build technical capacity, including designing systems for the aforementioned crypto agility and launching controlled pilot projects in hybrid modes. This approach allows the collection of real data on the impact of PQC on the company’s specific environment. The final phase is the actual phased migration of production systems and the establishment of continuous monitoring processes.
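
    A minimal sketch of that first phase might look like the snippet below: inventory the assets that still rely on vulnerable public-key cryptography and rank them by how long their data must stay confidential against an assumed CRQC horizon. Every entry and the ten-year horizon are illustrative assumptions, not recommendations.

    ```python
    # Toy cryptographic inventory and prioritisation by required confidentiality lifetime.
    # All entries and the CRQC horizon below are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class CryptoAsset:
        name: str
        algorithm: str               # e.g. "RSA-2048", "ECDSA-P256"
        confidentiality_years: int   # how long the protected data must remain secret

    ASSUMED_CRQC_HORIZON_YEARS = 10  # planning assumption, not a prediction

    inventory = [
        CryptoAsset("customer VPN gateway", "RSA-2048", 5),
        CryptoAsset("patient records archive", "ECDSA-P256", 30),
        CryptoAsset("firmware signing key", "RSA-4096", 15),
    ]

    # Assets whose secrecy must outlive the assumed CRQC horizon should migrate first
    for asset in sorted(inventory, key=lambda a: a.confidentiality_years, reverse=True):
        if asset.confidentiality_years > ASSUMED_CRQC_HORIZON_YEARS:
            print(f"migrate early: {asset.name} ({asset.algorithm}, {asset.confidentiality_years} years)")
    ```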

    The move to post-quantum cryptography is an evolution in technology risk management. The key long-term investment is not in implementing a specific algorithm, but in building a crypto agile architecture. It is this agility that will allow the company to adapt quickly and efficiently to the inevitable changes in the cyber security landscape, ensuring resilience and protecting its future.

  • Quantum computers create new jobs. Here is the most important role at the interface between IT and science

    Quantum computers create new jobs. Here is the most important role at the interface between IT and science

    Quantum computers, long seen as the domain of physics laboratories, are beginning to find practical application in business. With the increasing availability of hardware via the cloud, there is a demand for a new type of specialist – a hybrid of data scientist and physicist who can translate business problems into the language of qubits.

    For decades, quantum computers were a technological promise, a distant vision with computing power capable of cracking modern cryptography and simulating molecules with unimaginable precision. This vision is slowly becoming a reality, albeit in a more subdued and pragmatic form. The discussion in the IT industry is quietly shifting from ‘if’ to ‘how and when’ we can use these machines to solve real-world problems.

    The fundamental change that is driving this transformation is accessibility. Technology giants are making their, for now imperfect and ‘noisy’ (NISQ – Noisy Intermediate-Scale Quantum), quantum processors available via cloud platforms. In parallel, software libraries such as Qiskit or Cirq are being developed that abstract away much of the complexity of quantum physics. They allow programmers and analysts to focus on the logic of the algorithm rather than directly manipulating the states of individual particles.

    This opens the door for an evolution in the world of data analytics and artificial intelligence. And it creates a gap that needs to be filled by a new professional profile: the quantum data scientist.

    Who is a quantum data scientist?

    This is not a theoretical physicist locked in an academic ivory tower. Nor is it a classic data scientist who merely swaps the `scikit-learn` library for `qiskit-machine-learning`. The quantum data scientist is a bridge specialist who stands at the interface of three worlds:

    1. A deep understanding of business problems in sectors such as finance, pharmaceuticals, logistics or energy.

    2. Proficiency in data modelling and classical artificial intelligence techniques.

    3. A working knowledge of quantum architectures and the algorithms that can operate on them.

    Their key task is to identify problems that have the potential for ‘quantum advantage’ – that is, where even early quantum computers can offer better, faster or more accurate results than the most powerful classical supercomputers. They must then be able to translate such a problem into the language of quantum algorithms, integrate them with classical data flows and interpret the probabilistic results that qubits generate.

    Where does the potential lie?

    While a universal, fault-tolerant quantum computer is still a distant future, applications are already being experimented with in several key areas:

    • Optimisation: Logistics problems (e.g. optimising routes for a fleet of vehicles), financial problems (e.g. optimising an investment portfolio) or manufacturing problems are challenges in which the number of possible combinations grows exponentially. Quantum algorithms, such as QAOA (Quantum Approximate Optimisation Algorithm), are designed to search this huge solution space more efficiently (a classical sketch of that search space follows this list).
    • Simulations: The chemical and pharmaceutical industries stand to gain the most in the near term. Simulating the behaviour of molecules to design new drugs or materials (e.g. more efficient batteries) is extremely difficult for classical computers. Since nature at its core is quantum, simulating it on a quantum computer is more natural and potentially much more efficient.
    • Artificial intelligence: The area of quantum machine learning is also being explored. The idea here is to use quantum phenomena to improve predictive models, recommendation systems or inference engines, especially when working on complex, multidimensional data sets.
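
    The classical sketch referenced in the optimisation item above shows what the combinatorial explosion looks like in practice: brute-forcing MaxCut on a tiny graph means checking 2^n partitions, which is feasible for four nodes and hopeless at realistic sizes. It is not QAOA itself, merely the search space that QAOA-style algorithms are designed to explore more cleverly.

    ```python
    # Brute-force MaxCut on a toy 4-node graph: the search space doubles with every node,
    # which is exactly the explosion QAOA-style quantum algorithms aim to tame.
    from itertools import product

    edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # toy graph
    n = 4

    best_cut, best_assignment = -1, None
    for assignment in product([0, 1], repeat=n):        # 2**n candidate partitions
        cut = sum(assignment[u] != assignment[v] for u, v in edges)
        if cut > best_cut:
            best_cut, best_assignment = cut, assignment

    print(f"checked {2 ** n} partitions, best cut = {best_cut}, assignment = {best_assignment}")
    ```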

    A collaborative ecosystem

    Creating valuable quantum applications is not a one-man job. It is a team sport, requiring interdisciplinary collaboration. A quantum data scientist will work hand in hand with:

    • Domain experts (chemists, engineers, financial analysts) who understand the physical or business ground on which they operate.
    • Computer scientists and software engineers who can build robust, scalable data pipelines that integrate classical and quantum systems.
    • Physicists and mathematicians to help develop new algorithms and understand the limitations of current hardware.

    The dynamics are also fuelled by a global open source community that collectively develops algorithms, frameworks and platforms, creating an unprecedented pace of innovation.

    We stand at the threshold of a new era in technology. Like the early days of the internet or the Big Data revolution, the quantum phase will create new roles and require new skills. Companies that start exploring this area today and invest in developing talent capable of quantum thinking will gain a strategic advantage. The race for the hardware is on, but the real battleground in the coming years may be the battle for the people who can use it. The quantum data scientist will be one of the key protagonists in this change.

  • It operates at -273°C and can destroy global finance. This is what quantum AI can do

    It operates at -273°C and can destroy global finance. This is what quantum AI can do

    At the intersection of two of the most disruptive technologies of our time – artificial intelligence and quantum computing – a new field is emerging: quantum AI. It is a concept that promises to solve problems of a scale of complexity beyond the capabilities of even the most powerful supercomputers. While its large-scale practical applications are still a future prospect, it is already forcing a review of business and technology strategies in key economic sectors.

    Quantum AI is not simply a faster version of the machine learning models we are familiar with. It is a fundamental paradigm shift that relies on qubits instead of classical bits, which operate in a binary system (0 or 1). It is the qubit, the basic unit of quantum information, that accounts for its revolutionary potential.

    Foundations of a new era of computing

    At its core are two fundamental principles of quantum mechanics: superposition and entanglement. Superposition allows a qubit to exist in multiple states simultaneously – to be a combination of 0 and 1, rather than just one of these values. Entanglement, on the other hand, causes two or more qubits to become inextricably linked. A change in the state of one of them immediately affects the state of the other, regardless of the distance separating them.

    These two properties give quantum computers the ability to process an unimaginable number of combinations in parallel. Where a classical computer must sequentially analyse each possibility, a quantum machine can explore the entire space of potential solutions simultaneously. Combined with AI algorithms, this paves the way for solving optimisation, simulation and cryptographic problems that have hitherto remained beyond our reach.

    Algorithms that create and break security features

    While talk of quantum computing has been around since the 1980s, when David Deutsch presented a theoretical model of a ‘quantum Turing machine’, the 1990s brought two algorithms that defined the potential and dangers of the technology.

    The first is Shor’s algorithm, developed by Peter Shor in 1994. Its ability to efficiently decompose large numbers into prime factors is an existential threat to most modern cryptographic systems, such as RSA, which protect global communications and finance. An effective implementation of this algorithm on a suitably powerful quantum computer could break current security measures in a time incomparably shorter than any classical machine would need.
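
    A toy, textbook-sized example makes the threat tangible: once the modulus of an RSA key is factored (which Shor’s algorithm would make feasible at real key sizes), anyone can recompute the private exponent and read the ciphertext. The numbers below are the classic teaching example with tiny primes; real keys use primes hundreds of digits long, safe only against classical factoring.

    ```python
    # Why efficient factoring breaks RSA: knowing p and q lets an attacker derive the
    # private key. Toy textbook numbers only; real moduli are hundreds of digits long.
    p, q = 61, 53                      # secret primes (trivially small here)
    n, e = p * q, 17                   # public key: n = 3233, exponent e = 17

    message = 65
    ciphertext = pow(message, e, n)    # public-key encryption

    # An attacker who factors n recovers p and q, then the private exponent d:
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)                # modular inverse (Python 3.8+)
    recovered = pow(ciphertext, d, n)

    print(ciphertext, "->", recovered) # 2790 -> 65: the plaintext falls out
    ```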

    The second is Grover’s algorithm, which quadratically speeds up searches of unstructured databases. Although its implications are less dramatic than Shor’s algorithm, it is of great importance for optimisation, data analysis and cryptanalysis.
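
    Grover’s quadratic speed-up can also be shown in miniature. In the Qiskit sketch below (a simulator-level illustration, assuming the qiskit package is installed), a single oracle-plus-diffusion round finds the marked item ‘11’ among four possibilities with certainty, whereas a classical search would have to try the items one by one.

    ```python
    # Two-qubit Grover search for the marked state |11>: one iteration is enough here.
    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    grover = QuantumCircuit(2)
    grover.h([0, 1])                       # even superposition over all 4 items
    grover.cz(0, 1)                        # oracle: phase-flip the marked state |11>
    grover.h([0, 1]); grover.x([0, 1])     # diffusion: amplify the marked amplitude
    grover.cz(0, 1)
    grover.x([0, 1]); grover.h([0, 1])

    print(Statevector(grover).probabilities_dict())   # ~{'11': 1.0}
    ```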

    These two examples demonstrate the dual nature of quantum AI: on the one hand, it can create new, virtually unbreakable security systems (post-quantum cryptography); on the other, it can destroy those on which today’s digital world is based.

    From lab to business

    The potential of quantum AI goes far beyond cryptography. Already, companies and research institutions are exploring its applications in many fields:

    • Medicine and pharmaceuticals: Simulating molecular interactions to design new drugs and personalised therapies. The complexity of these systems is so high that modelling them accurately on classical computers is impossible. Quantum AI could drastically reduce the time and cost of researching new substances.
    • Finance: Creating much more sophisticated models for risk assessment, investment portfolio optimisation and market forecasting. The ability to analyse huge, multi-dimensional data sets in real time could revolutionise trading and asset management.
    • Logistics and optimisation: Solving extremely complex logistical problems, such as optimising global supply chains (the travelling salesman problem scaled up to thousands of cities), which would translate into gigantic savings in time and resources.
    • Materials science: Designing new materials with unique properties (e.g. superconductors operating at room temperature) by simulating their behaviour at the quantum level.

    Cold shower

    Despite the promising prospects, the road to practical and widespread quantum AI is fraught with challenges. The biggest of these is the very physical nature of quantum computers. Qubits are extremely sensitive to any disturbance from the environment, such as fluctuations in temperature, pressure or magnetic fields. This phenomenon, known as decoherence, causes a loss of the delicate quantum state and leads to errors in calculations.

    To prevent this, quantum processors must operate under extreme conditions: at temperatures close to absolute zero (-273°C) and in a near-perfect vacuum, isolated from external electromagnetic fields. Building and maintaining such machines is extremely expensive and complicated.

    Other challenges include the need to develop new fault-tolerant algorithms, as well as a shortage of skilled professionals capable of working at the interface between computer science, quantum physics and artificial intelligence.

    State of play and outlook

    The technology race is already on. Giants such as Google (TensorFlow Quantum), IBM (Qiskit) and Amazon (Braket) are providing cloud platforms that allow developers and researchers to experiment with quantum algorithms on real hardware or in advanced simulators. Initiatives such as the European Quantum Spain project show that governments also recognise the strategic importance of this technology.

    For the IT industry, this means that it needs to prepare for a hybrid future in which classical systems will collaborate with quantum systems, delegating tasks to them that they cannot handle on their own. Already, a key trend is becoming the development of post-quantum cryptography (PQC), i.e. algorithms that are resistant to attacks from both classical and quantum computers.

    Quantum AI will not replace traditional artificial intelligence in tasks such as image recognition or natural language processing. Rather, it is a specialised tool that promises dramatic gains on a specific but highly relevant class of problems.