Category: Startups and innovations

  • Managing innovation in the shadow of doubt. The art of turning cool criticism into market success

    Picture a typical board meeting or a pitch in front of an investment fund. A revolutionary idea for an innovative system architecture or a cutting-edge digital product lands on the table. After a moment of tense silence comes the familiar, coolly delivered verdict: the project is too risky and the business model simply will not scale. In a technology industry that has lived for years in a paradigm of continuous validation, agile methodologies and the search for immediate acceptance of every product iteration, a lack of enthusiasm from one’s surroundings is sometimes read as final failure. Yet a collision with a wall of scepticism can prove to be the most life-giving moment in the innovation lifecycle. This phenomenon, intuitively familiar to many business pioneers, has just found solid confirmation in rigorous scientific research.

    It is not uncommon for today’s technology ecosystems and startup environments to fall into the trap of so-called echo chambers. Being in an environment that uncritically applauds new initiatives builds a false sense of market security. Such dynamics can lull vigilance and make entire R&D departments complacent. Business psychology, however, points to a completely different, much more powerful driving mechanism. The rejection of an innovative concept in which its creators have invested time and intellect is often perceived, at a deep level, almost as a personal attack. Contrary to appearances, a confrontation of this kind rarely ends in final capitulation. Much more often it becomes the spark that ignites a perverse need to prove the world wrong.

    “Being in an environment that uncritically applauds new initiatives builds a false sense of market security.”

    The intuitive belief in the power of being underestimated has been put on solid ground by the work of a team from North Carolina State University. Researcher Tim Michaelis and his colleagues set out to investigate a mechanism aptly named the ‘underdog effect’, or underdog syndrome. Their analysis was based on three independent, complementary research phases. In the first, the researchers conducted in-depth interviews with a group of more than four hundred and twenty entrepreneurs; nearly three hundred and twenty of them had heard an explicit prediction of failure early in their business. The conclusions proved exceptionally clear: those whose failure had been predicted showed noticeably higher levels of commitment to their visions. The chilling scepticism acted on their determination like a defibrillator.

    The second stage of the research, involving a group of almost five hundred and eighty participants, offered a glimpse into the mechanics of motivation itself. The experiment showed that merely recalling a moment when someone had expressed doubt about a project’s success led to an immediate, measurable increase in willingness to work. This suggests that the memory of criticism is not just a temporary impulse but a long-term source of market determination. The final piece of evidence came from a longitudinal study in which the actions of more than four hundred company founders were monitored at regular intervals over three months. This approach made it possible to observe how the underdog effect evolves over time. The researchers’ conclusion leaves no illusions: the psychological mechanism translates directly into work intensity, exceptional focus on operational goals and a real, tangible grounding of the company in the market.

    The above findings resonate with the day-to-day reality of the IT industry, where great ambition constantly collides with merciless technological verification. The urge to be proven right can effectively eliminate procrastination among software architects, engineers and product managers. The frustration caused by others’ lack of faith pushes people into a state of deep concentration on delivering results. Project teams that have had to fight for survival and justify their vision’s raison d’être from the very start naturally build resilience. This early fortitude becomes an invaluable asset in later phases of development, for example when dealing with critical incidents in production environments or in the face of unexpected industry turbulence.

    It should be made clear, however, that glorifying the underdog syndrome carries certain analytical risks. There is an extremely fine line between bold visionary thinking and harmful blindness to macroeconomic realities. Every organisation faces the challenge of avoiding the trap in which absolutely every dissenting voice starts to be treated solely as an unfounded attack, and ignoring comments becomes an end in itself. The ability to calibrate the business compass proves to be a strategic competence in this context. On the one hand, it is worth drawing strength from general doubt, which is an unparalleled stimulant for hard work. On the other, one should under no circumstances close one’s ears to constructive criticism of errors in the business logic architecture, shortcomings in the user experience or gaps in the financial forecasts. The conscious implementation of substantive comments, manifested for example in an agile pivot, ultimately separates successful strategists from incorrigible fantasists. Professor Michaelis’s team is itself pointing in this direction for future research, planning to look for the ideal balance point between market wind in the sails and the invigorating resistance of reality.

    “Scepticism, properly balanced and devoid of personal envy, becomes high-octane fuel.”

    The conclusions of the analyses cited above shed refreshing new light on innovation management and on building mature technology teams. From the perspective of modern businesses, it is worth realising that a complete absence of opposition when designing newly launched solutions is rarely a reason for optimism. In healthy organisational structures, the presence of individuals acting as devil’s advocates – questioning the status quo and testing the logic of new concepts – is a downright desirable phenomenon. Scepticism, properly balanced and devoid of personal envy, becomes high-octane fuel. The lack of popular acclaim is not a signal to retreat but a free dose of the most concentrated business energy, capable of producing a new generation of market leaders.

  • Sisyphean work in Silicon Valley. Physics teaches humility about AI

    Cloud computing has for years effectively hidden the physical dimension of the technology, creating the illusion of infinite and seamlessly scalable resources. Generative AI is brutally tearing down this curtain. With the increasing complexity of models and the popularity of artificial intelligence, software development inevitably collides with the hard laws of physics and thermodynamics. Why do hardware engineers today resemble the mythical Sisyphus, and what does the looming technological token explosion mean for the operational strategies and cloud budgets of today’s enterprises?

    An end to the illusion of limitless computing space

    The early popularisation phase of generative artificial intelligence shaped a market image of a technology that was lightweight, ubiquitous and almost free. However, consumer chatbots, fluently generating lines of text or polishing email correspondence, were merely an impressive shop window. As analysis shows, the real business revolution, and the only way to generate a return on trillions of dollars of investment, lies in an entirely different area. The world of technology is moving inexorably towards a reality in which agent-based artificial intelligence becomes the operational foundation of businesses.

    The shift from simple text assistants to autonomous agents is a fundamental paradigm shift. It marks an evolution from single user queries to continuous multi-step inference and the execution of complex workflows in the background. Enterprises will soon be making tens of thousands of system calls to large language models every day. This phenomenon is no longer just a fascinating scientific experiment, but is becoming a process of scale and gravity typical of heavy industry, where process optimisation plays a central role.

    The brutal mathematics of floating point operations

    Understanding the challenges ahead requires looking under the hood of powerful language models. Each generated word, or more precisely each token, carries a measurable physical computational cost. The architecture of today’s systems typically requires roughly two floating-point operations per model parameter for every token produced. The scale of this is striking when you consider that the most advanced market models operate on one to two trillion parameters. Even with highly sophisticated optimisation techniques, generating a single token means engaging between one hundred and two hundred billion parameter values in real time.
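
    To make the arithmetic concrete, the sketch below applies the rule of thumb cited above, roughly two floating-point operations per (active) parameter per generated token; the parameter counts are illustrative assumptions, not figures published for any specific model.

    ```python
    # Back-of-the-envelope sketch of the arithmetic described above.
    # Assumption: roughly 2 floating-point operations per (active) parameter
    # are needed to generate one token -- a common rule of thumb, not a
    # vendor-published figure.

    def flops_per_token(active_params: float, flops_per_param: float = 2.0) -> float:
        """Estimate floating-point operations needed to generate a single token."""
        return active_params * flops_per_param

    # Illustrative scenarios (hypothetical parameter counts, not tied to any named model):
    dense_model = flops_per_token(1.5e12)   # ~1.5 trillion parameters, all active
    sparse_model = flops_per_token(100e9)   # ~100 billion parameters active after optimisation

    print(f"Dense model:  ~{dense_model:.1e} FLOPs per token")
    print(f"Sparse model: ~{sparse_model:.1e} FLOPs per token")
    ```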

    What’s more, the industry is shifting dynamically towards models based on deep reasoning, in which the context window is dramatically expanded. Agent-based artificial intelligence analyses problems along multiple threads, searching for optimal solution paths before formulating a final answer and executing an action. As a result, the number of tokens per query grows exponentially, often by a factor of ten or more. Calling this phenomenon a token explosion is not a literary exaggeration, but a chilling description of the digital reality to come.

    Energy consumption as a new unit of account in business

    The consequence of this growth in data is a return to the fundamentals of economics, where energy intensity becomes the main barrier. According to market analysts, energy consumption per query, measured in watt-hours, directly determines the profitability of the entire technology sector. The business model of generative artificial intelligence is unique in this respect: the target net margin depends as much on ingenious code as on the cooling costs of the server room and a stable power supply.

    Currently, these costs are largely absorbed by model developers, leading to a situation where it is not uncommon for technology giants to subsidise query processing, relying on capital from investors. This model is not likely to stand the test of time in a mature market. The real beneficiaries of the ongoing investment boom today are not the developers of intelligent algorithms, but infrastructure providers, advanced chip manufacturers and data centre builders. The owners of language models do not have a profit machine, but a powerful mechanism in which capital burns in anticipation of the moment when massive use at the corporate level will offset the astronomical cost of maintaining servers.

    The myth of Sisyphus in the modern server room

    The market situation is forcing an unprecedented effort on the part of hardware manufacturers. The semiconductor industry is operating in a state of constant mobilisation, striving to increase the cost efficiency of graphics processing units, developing ever higher bandwidth memories and optimising the network architecture of cluster systems. Despite these colossal efforts, engineers working on hardware development today resemble the mythical Sisyphus.

    This phenomenon can be likened to a kind of Jevons paradox transposed to the digital world. Whenever the technological boulder is rolled to the top of the mountain by creating a new, faster and more energy-efficient generation of processors, software developers immediately increase the complexity of their models. The boulder crashes back down to the foot of the mountain and the work begins again. As artificial intelligence continues to expand its analytical and operational capabilities, full cost optimisation looks like a horizon that keeps receding. Computational requirements are growing faster than the ability to serve them cheaply, an uncompromising clash between unlimited ambition and the limits imposed by semiconductor physics.

    Survival architecture, or cost engineering as an operational priority

    Awareness of the technological and physical constraints described above is crucial for planning long-term business strategy. The end of the era of free experimentation means that production deployments of artificial intelligence systems in the corporate environment will have to undergo rigorous financial and architectural evaluation. The implementation of agent-based systems will bring organisations leaps in productivity by automating complex workflows, but these benefits will evaporate if the bill for computing resources gets out of hand.

    Modern IT infrastructure management will be inextricably linked to the implementation of advanced cloud cost engineering. Instead of routing every trivial task to the most resource-intensive models with trillions of parameters, organisations will be forced to design agile hybrid architectures. Intelligent process routing will involve delegating simple operations to much smaller, highly specialised and energy-efficient models. The costly computing power of the largest market systems will in turn be precisely reserved exclusively for tasks requiring the highest level of abstract inference.
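
    A minimal sketch of what such intelligent process routing could look like is shown below; the model tiers, prices and the complexity heuristic are hypothetical placeholders rather than any provider’s real API.

    ```python
    # Hypothetical sketch of cost-aware routing between a small, cheap model
    # and a large, expensive one. Model names and prices are invented for
    # illustration only.

    from dataclasses import dataclass

    @dataclass
    class ModelTier:
        name: str
        cost_per_1k_tokens: float  # hypothetical price in EUR

    SMALL = ModelTier("small-specialist", 0.0002)
    LARGE = ModelTier("frontier-reasoner", 0.02)

    def estimate_complexity(task: str) -> int:
        """Crude heuristic: count keywords that suggest multi-step reasoning."""
        hard_markers = ("plan", "prove", "architecture", "multi-step", "trade-off")
        return sum(marker in task.lower() for marker in hard_markers)

    def route(task: str) -> ModelTier:
        """Send simple operations to the small model, abstract reasoning to the large one."""
        return LARGE if estimate_complexity(task) >= 2 else SMALL

    for task in ["Extract the invoice number from this e-mail",
                 "Plan a multi-step migration and justify each architecture trade-off"]:
        tier = route(task)
        print(f"{task!r} -> {tier.name} ({tier.cost_per_1k_tokens} EUR / 1k tokens)")
    ```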

    Understanding the physical, energy and economic limits of technology is becoming the new foundation for market advantage. Only those organisations that can harmoniously combine a bold vision of advanced automation with a cool, rigorous calculation of every watt consumed and token generated in the background will succeed in the target phase of artificial intelligence development.

  • Massmedica completes robot prototype. Clinical trials as early as 2026

    In the saturated but still rapidly growing medical robotics market, Polish company Massmedica has just taken a significant step towards commercialising its own solutions. The NewConnect-listed entity has announced the completion of work on the URSA Minor prototype, a key component of the proprietary Universal Robotic Surgical Assistant project being carried out by its subsidiary Massmedica Technologie. The news signals that Polish engineering aspires to play a greater role in a sector so far dominated by global giants.

    The current phase of the project focuses on the software layer, which in today’s precision surgery is the frontline of the battle for competitive advantage. Massmedica’s engineers are focusing on optimising treatment planning algorithms and advanced navigation integrated with image data. From a business perspective, it is the quality of the user interface and the precision of real-time data processing that determines the adoption of the technology by medical facilities. The company is currently awaiting delivery of the target robotic arm, which is expected to provide higher precision movement than the current test module.

    The deployment schedule envisages moving to near-clinical testing as early as April 2026. These tests, conducted in the anatomy department on unfixed human specimens, will be crucial to verify the ergonomics of the system and the accuracy of performance in surgical situations. The results of these tests will become the foundation of the certification process, which in the MedTech industry remains the most capital-intensive and rigorous stage of product launch.

    The completion of the URSA Minor prototype is evidence of the smooth implementation of the diversification strategy. The company, hitherto known mainly for its activities in the areas of reconstructive, regenerative and anti-aging medicine, faces the opportunity to enter the high-margin medical device segment. The success of the URSA project now depends on the company’s ability to move smoothly through the testing phase and ensure the scalability of production of a system that has the potential to become an attractive alternative to currently available robotic solutions on the market.

  • Genomtec tests in Asia: Wroclaw-based deep-tech on the final straight to merger

    Wrocław-based deep-tech company Genomtec is about to make one of the most challenging moves in its history to date. The company, which specialises in molecular diagnostics, has announced that it has begun preparing for operational testing of the Genomtec ID system in Asian markets. While the announcement sounds like a standard expansion step, it is in fact a critical part of an ongoing M&A process that could define the future valuation and ownership structure of the Polish startup.

    Asia, currently one of the most receptive markets for point-of-care testing (POCT) technology, is becoming a proving ground for Genomtec. Potential strategic partners with whom the company is in advanced talks have set a clear condition: the SNAAT® technology must prove its efficacy under local conditions. For investors, this signals that the process of selling the company or its key assets has entered the product due diligence phase, where promises of the analyser’s speed and mobility will be confronted with the operational reality of Far Eastern healthcare systems.

    The logistics of this project are as complex as the technology itself. Miron Tokarski’s team must now manage not only the production of reaction cards for validation, but more importantly the thicket of biosafety regulations. Each jurisdiction in the Asia-Pacific region has specific requirements, which, for a technology based on genetic diagnostics, can sometimes be an impenetrable barrier to entry for less prepared players.

    The success of these trials could position Genomtec as an attractive target for global medical corporations looking for an alternative to the dominant but often less mobile PCR systems. The flagship product from Wroclaw offers a unique combination: the precision of a laboratory test encapsulated in a device the size of a desktop printer, which in the densely populated metropolises of Asia is a value in itself.

    For technology market observers, the move is a lesson in pre-transaction value building. Genomtec is not waiting for a buyer with a ready-made product on the shelf, but is actively adapting its validation process to the expectations of specific partners. If the Wrocław-built devices pass the Asian trial by fire, the finalisation of the M&A process may come sooner than the market expects.

  • Creotech and ESA strike back at GNSS signal jamming with new SAFIR-PNT mission

    In a world dominated by the digital economy, the invisible infrastructure of satellite signals has become the backbone of everything from logistics to financial transactions. But this foundational technology, known as GNSS, is increasingly under attack. The answer to the growing chaos in the ether is to be found in a new mission from Polish company Creotech Instruments, which is launching the HyperSat SAFIR-PNT project in collaboration with the European Space Agency (ESA) and German giant OHB System AG.

    The phenomena of jamming and spoofing have ceased to be the domain of theoretical academic consideration and have become real operational risks for the transport and defence sectors. The SAFIR-PNT mission aims not only to passively monitor these incidents, but above all to precisely geolocate their sources from orbit. This SIGINT (signals intelligence) approach, until recently reserved for the major powers, is now moving down to the level of commercial microsatellites.

    For investors and market observers, the ‘dual-use’ model is key. Creotech is not building a tool exclusively for the military; it is building a protection system for critical infrastructure, the paralysis of which could cost modern economies billions of euros a day. The use of the company’s proprietary HyperSat platform as a base for three demonstration microsatellites suggests that the company is targeting a scalable product that could become a standard in Europe’s technological security arsenal.

    The strategic importance of the project goes beyond the technology itself. At a time of increasing geopolitical tension, Europe is desperate for autonomy in PNT (Positioning, Navigation and Timing). Creotech’s collaboration with OHB System on the Phase 0/A feasibility study is a signal that Polish engineering is being integrated into the inner circle of the European space sector’s supply chain.

    If the SAFIR-PNT mission proves successful, Creotech could move from being a component supplier to being the architect of orbital safety systems. In an industry where trust is the hardest currency, the interference detection contract with ESA is not just an engineering contract, but more importantly a mandate to build one of the most crucial early warning systems for the modern economy.

  • PLN 700 million for technological SMEs. PARP launches a call under the new FENG formula

    The Polish Agency for Enterprise Development (PARP) is opening another chapter in the history of the SMART Path programme, allocating PLN 700 million to support innovation in the SME sector. Although the amount is impressive, it is not the scale of the funding but the fundamental change in the structure of the programme that is attracting the attention of market analysts. In response to voices from the business community, the agency has decided to move away from complex modularity towards a transparent division of investment processes.

    Strategic evolution instead of bureaucracy

    So far, the SMART formula has sometimes been logistically challenging for entrepreneurs. The new edition, under the FENG 2021-2027 programme, relies on linearity: the current call focuses exclusively on the research and development (R&D) phase, while the commercialisation of results will be covered by separate competitions in the future. This strategic move allows companies to focus on the most difficult phase – the creation of new technological value – without having to immediately plan a full implementation line.

    The key criterion remains alignment with the National Smart Specialisations. PARP is looking for projects that do not merely digitise processes or automate production, but bring a new quality on a national scale. Funding covers a wide range of costs: from the salaries of research staff, through equipment depreciation, to intellectual property protection and team competence development.

    Entry barrier and risk premium

    The programme is aimed at entities with an established technological vision. The minimum threshold for eligible expenditure is set at PLN 3 million, which, with an upper limit of PLN 50 million, clearly defines the target group as ambitious technology and manufacturing companies. The level of co-financing is flexible – the base 50% for industrial research can rise to as much as 80% once certain conditions are met, providing an important capital shield for risky experimental projects.
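
    As a rough illustration of how the thresholds above interact, the sketch below estimates the maximum grant for a given level of eligible expenditure; the bounds and rates are taken from the figures quoted here, while the sample project values and the simple single-rate formula are assumptions.

    ```python
    # Illustrative calculation based on the thresholds quoted in the article:
    # eligible expenditure between PLN 3m and PLN 50m, co-financing from a base
    # of 50% up to 80% when additional conditions are met. Sample values are assumptions.

    MIN_ELIGIBLE_PLN = 3_000_000
    MAX_ELIGIBLE_PLN = 50_000_000

    def max_grant(eligible_costs: float, cofinancing_rate: float) -> float:
        """Return the grant ceiling for one co-financing rate applied to all costs."""
        if not MIN_ELIGIBLE_PLN <= eligible_costs <= MAX_ELIGIBLE_PLN:
            raise ValueError("project falls outside the eligible expenditure range")
        if not 0.5 <= cofinancing_rate <= 0.8:
            raise ValueError("rate outside the 50-80% band described for industrial research")
        return eligible_costs * cofinancing_rate

    print(max_grant(10_000_000, 0.5))   # base rate: PLN 5.0m
    print(max_grant(10_000_000, 0.8))   # maximum rate: PLN 8.0m
    ```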

    However, it is worth noting the strict definitions of research work. PARP clearly cuts off funding for routine IT work, such as developing simple applications or debugging. The new edition of the SMART path is a tool for those who push technological boundaries rather than replicate proven schemes.

    The time to prepare documentation is short. The application window opens on 26 February and closes as early as 31 March 2026.

  • The Council of the Future is formed – with leaders from ElevenLabs, Klarna and ICEYE

    In a high-tech world, where the pace of innovation often outpaces legislative processes, Poland is making a move to strengthen cooperation between the government and the technology elite. The establishment of the Council of the Future, announced on Tuesday in Warsaw’s Skyliner office building by Prime Minister Donald Tusk and Finance Minister Andrzej Domański, signals Warsaw’s desire to stop being merely a recipient of technology and become its active architect.

    The initiative is not just a prestige advisory body. The composition of the 18-member panel indicates a precise selection of experts from sectors that will define the global economy of the next decade: from artificial intelligence and biotechnology, to space technologies and the defence sector. The presence of figures such as Mati Staniszewski of ElevenLabs, Rafal Modrzewski of ICEYE and Sebastian Siemiątkowski of Klarna suggests that the government is looking for direct insight into the mechanisms of scaling businesses with global reach.

    Minister Andrzej Domański, who will head the Council, made it clear: Poland is in a global race in which human capital is the main asset. From a business point of view, the key challenge for this body will be to translate academic knowledge into real market value. The participation of Piotr Sankowski from IDEAS NCBR and Prof. Krzysztof Pyrć is expected to ensure that Polish innovations do not remain confined to laboratories, but find an outlet in commercial implementations.

    For investors and entrepreneurs, the creation of the Council of the Future is a signal that the state’s priorities are stabilising. The inclusion in the talks of leaders such as Piotr Wojciechowski of WB Group and Jarosław Królewski of Synerise suggests that economic policy will be more strongly oriented towards technological sovereignty and support for home-grown unicorns.

    Although the detailed competences of the Council are still in the process of being clarified, the very fact of creating such a high-level ‘think-tank’ at the Prime Minister’s Office changes the narrative about the Polish innovation ecosystem. Instead of scattered grants, what can be seen is an attempt to create a coherent strategy in which the state acts as a partner to the fastest growing industries. The success of this initiative will be measured not by the number of meetings, but by the government’s ability to quickly remove the regulatory barriers that today hamper the Polish DeepTech sector.

    “Poland is today the 20th economy in the world – this is a great success, but it is not given once and for all. Therefore, we need to build new advantages and win in the global race,” said Minister Andrzej Domański.

    The Council of the Future. Source: Ministry of Finance

    The council consists of:

    • Dominik Batorski, sociologist and data science expert, associated with the Interdisciplinary Centre for Mathematical and Computational Modelling at the University of Warsaw;
    • Grzegorz Brona, CEO of Creotech Instruments;
    • Sebastian Kondracki, Chief Innovation Officer at Devinity, one of the creators of the Polish AI model Bielik;
    • Tomasz Konik, CEO of Deloitte Central Europe;
    • Jarosław Królewski, CEO and co-founder of Synerise;
    • Rafal Modrzewski, President of ICEYE;
    • Aleksandra Pędraszewska, technology entrepreneur;
    • Pawel Przewieźlikowski, CEO of Ryvu Therapeutics;
    • Krzysztof Pyrć, Chairman of the Board of the Foundation for Polish Science, Professor of Biological Sciences and Virologist;
    • Mikołaj Raczyński, Vice President of the Polish Development Fund for Investment;
    • Piotr Sankowski, President of the IDEAS Research Institute and Professor at the Institute of Computer Science, University of Warsaw;
    • Mati Staniszewski, CEO of ElevenLabs;
    • Sławosz Uznański-Wiśniewski, European Space Agency astronaut;
    • Marta Winiarska, President of the Board of the Polish Association of Innovative Medical Biotechnology Companies BioInmed;
    • Piotr Wojciechowski, Chairman of the Board of WB Group, the largest Polish private technology and defence group;
    • Stefan Batory, co-founder of Booksy;
    • Aleksandra Przegalińska, Vice-Rector for Innovation at Kozminski University in Warsaw;
    • Sebastian Siemiątkowski, co-founder and CEO of Klarna Bank.
  • Not Starlink, but TeraWave. Blue Origin targets governments, not consumers

    Jeff Bezos no longer just wants to race Elon Musk for the individual customer; his ambitions extend to the critical infrastructure of the digital economy. Blue Origin’s newly announced TeraWave project is an attempt to capture the most lucrative and technically challenging segment of the space market: serving data centres, governments and large-scale artificial intelligence systems. While Starlink has dominated the consumer market, Bezos is targeting ‘fibre in orbit’.

    The new constellation is to consist of 5408 satellites deployed in low and medium Earth orbit. The key differentiator here, however, is not scale but performance. Using optical (laser) links between satellites, the network is expected to offer transfer speeds of up to 6 Tbps anywhere on Earth. Such parameters are unattainable for standard satellite services and are aimed at a specific audience. Dave Limp, CEO of Blue Origin, emphasises that TeraWave has been designed from the ground up for corporate customers, not households. The company estimates that the network will serve a maximum of around 100,000 entities, a radical departure from Starlink’s model of competing for millions of subscribers.

    The decision to build TeraWave fits perfectly with the ‘gold rush’ of the AI era. Training and operating large language models requires massive computing power and lightning-fast data transfer, which on Earth involves gigantic energy consumption. Moving some of this infrastructure into space, or providing it with ultra-fast orbital connectivity, is the game in which Bezos wants to deal the cards. It is also a strategic complement to another constellation linked to the billionaire, the Amazon Leo network (formerly Project Kuiper), which is still vying for the mass market.

    However, the implementation of these ambitious plans carries significant execution risk. Satellite deployment is scheduled to begin in the last quarter of 2027, which requires flawless operation of the New Glenn rocket. Blue Origin’s reusable launch system must achieve a launch cadence similar to that of SpaceX’s Falcon 9 to make the deployment of thousands of craft economically viable. Geopolitical competition also lurks in the background, with China dynamically developing its own rockets and constellations, making the battle for orbital dominance a race that is not only technological but also political.

  • Polish deep tech in Japanese supply chain. QNA Technology with sixth order from Japan

    In the deep tech industry, and particularly in the advanced semiconductor materials sector, repeat orders are a currency often more valuable than the one-off value of a contract. QNA Technology, a Wroclaw-based company that debuted on the WSE’s main floor last year, has just given the market proof that its blue quantum dot technology is holding its own in the laboratories of one of the world’s most demanding markets.

    The company has announced the completion of its sixth order for a Japanese chemical company supplying components to global display manufacturers. Although the contract amount of USD 10,000 net may seem modest from a stock market valuation perspective, in the context of the R&D process it is of key importance. It signals that Poland’s PureBlue.dots – quantum dots that emit blue light and are free of heavy metals – have successfully passed the initial testing phases and are being deployed to the increasingly advanced stages of customer validation.

    The display market has been struggling with the ‘blue pixel’ problem for years. While red and green quantum dots are already standard, efficient and stable blue emission without the use of toxic cadmium has remained the technological holy grail. QNA Technology’s focus precisely on this slice of the spectrum means that the company is targeting the bottleneck of the entire industry in an attempt to solve a problem faced by the major players.

    The Japanese partner’s regular repeat orders suggest that QNA Technology is moving from the experimental phase towards a viable alternative to current solutions. Artur Podhorodecki, the company’s CEO, makes it clear that each subsequent order from the same entity increases the likelihood of moving from pilot sales to full-scale commercialisation. For investors who have been watching the company’s transition from NewConnect to the main market in 2025, this signals the technology’s applicability in a supply chain that has historically tended to be airtight to non-Asian entities.

    The supply covers material in the form of a colloidal solution, the standard input for the printing processes used to manufacture modern display matrices. If the Japanese partner decides to cooperate more widely, the Polish company could become an important link in the production of next-generation displays, fighting for share in a market so far dominated by giants from Korea and China.

  • Insurtech investments 2025: Poland ‘above its weight’. Trasti with a massive injection of capital

    The July investment round of Polish insurtech Trasti, worth €21 million, has ranked the company among the five largest deals of its kind in Europe in 2025. These are the conclusions of the Astorya.vc fund report published in mid-January. The presence of the Polish entity in this list, in addition to support from strategic investors such as the EBRD and Triglav Group, signals a fundamental change in the strategy of venture capital funds.

    The European insurtech market has undergone a painful but necessary correction. After a record 2021, when investment reached €2.5 billion, capital volumes stabilised between 2023 and 2025 at €600-820 million per year. The number of transactions, however, has not collapsed, remaining at 60-70 per year. This means that capital is still available, but it is going to a narrower range of players. Investors have moved away from funding a vision of rapid growth alone towards models characterised by operational and technological resilience.

    It is into this new paradigm that Trasti, which is the only Polish representative in the report, fits. The company, whose leading business line is motor insurance, was recognised for combining technology with effective underwriting and risk control. Artur Olech, CEO of Trasti, emphasises that the current market selection rewards companies that build scale without losing profitability. This is a shift towards ‘hard’ business metrics, where innovation must translate into an insurer’s technical result.

    The Astorya.vc report also highlights the growing role of artificial intelligence, which has driven as much as 33 per cent of transactions in the sector in the past year. Significantly, AI-first solutions are now mainly focused on loss adjustment and distribution support, which directly translates into cost efficiency. Poland, alongside Spain and the Netherlands, was identified in the report as a market “playing above its weight category”. Despite a smaller share of European GDP, our insurtech ecosystem is showing maturity, becoming one of the key innovation hubs in the CEE region and ahead of markets such as Romania and Hungary.

  • The great Starlink migration. SpaceX lowers the orbit of 4400 satellites

    In response to rapidly increasing operational risks in low Earth orbit (LEO), SpaceX has decided to significantly reconfigure its infrastructure. Elon Musk’s company has announced a massive redeployment of part of its constellation to a lower altitude as a response to the problem of space debris and orbital congestion. The operation, planned for 2026, will involve around 4400 Starlink satellites.

    According to Michael Nicholls, director of engineering at SpaceX, craft currently orbiting at an altitude of 550 kilometres will be brought down to an orbit of 480 kilometres. While a difference of 70 kilometres may seem small, on the scale of orbital mechanics it is safety critical. Nicholls argues that a lower orbit drastically reduces the natural deorbit time of inactive satellites by more than 80 per cent. This means that in the event of a device failure, they will burn up much more quickly in the atmosphere, freeing up valuable space.

    The decision is a direct result of the deteriorating situation in space around the Earth. The European Space Agency (ESA) estimates that there are currently around 40,000 tracked objects orbiting below the 2,000 kilometre ceiling. Only 11,000 of these are functioning satellites, with SpaceX’s dominance in this sector unquestionable. According to astronomer Jonathan McDowell, the company owns more than 9300 active devices, making it the largest on-orbit operator.

    Moving satellites below the 500-kilometre limit is also a tactical escape from collisions. The area is currently less crowded, minimising the risk of collisions arising from uncoordinated manoeuvres by other operators and from small debris. ESA warns of more than 1.2 million objects larger than a centimetre that could cause catastrophic damage to infrastructure. SpaceX’s move may be read by the market not only as concern for its own assets, but also as an attempt to set new standards of risk management in the NewSpace sector, where ‘sustainability’ is beginning to apply to extraterrestrial space as well.

  • Billions of euros are slipping through their fingers. European SMEs are oversleeping the energy revolution

    Although the small and medium-sized enterprise (SME) sector is the backbone of the European economy, generating more than half of the EU’s GDP, its role in the green revolution remains surprisingly marginal. The latest report published by the Solar Impulse Foundation and Schneider Electric, entitled Unlocking SME Competitiveness in Europe, sheds light on an important paradox. Although these companies account for 99 per cent of all players in the market, only 11 per cent of them are making significant investments in sustainability. This sluggishness could cost the European economy billions of euros in potential value lost by 2030.

    The authors of the study point out that the key to unlocking this potential is the synergy of electrification and digitalisation. Estimates are promising: the implementation of appropriate technologies would reduce the sector’s energy consumption by 20 to 30 per cent and, in some industries, reduce CO2 emissions by up to 40 per cent. However, the problem lies in the approach to strategy. The report makes a clear distinction between ad hoc measures and long-term plans. While as many as 93 per cent of companies are taking single efficiency measures such as replacing equipment, only a quarter have developed a comprehensive decarbonisation strategy.

    The market needs systemic solutions, not just isolated point implementations. Digital integration is now a matter of hard competitiveness and cost control, not just image. Indeed, smaller players, due to their limited bargaining power, are extremely vulnerable to energy price shocks, making them ideal candidates for the adoption of models such as Energy-as-a-Service.

    However, this transformation faces infrastructural barriers. Europe, wanting to increase the electrification rate to 32 per cent by the end of the decade, has to face the fact that 40 per cent of electricity grids are over 40 years old. This modernisation requires an investment of €584 billion. Against this backdrop, the report calls on policymakers to take urgent action, including simplifying permitting procedures and revising tax directives, which could lower investment risks for the SME sector and accelerate the adoption of modern energy technologies.

  • Seagate rolls out 69TB drives. Key storage density record

    Seagate’s latest lab achievement could change the trajectory of the storage market faster than expected. Achieving a storage density capable of fitting 6.9TB of data on a single platter is technical proof that drives of nearly 70TB in the standard 3.5-inch format are feasible. Given that today’s enterprise-class drives can accommodate up to ten such platters, the vision of a capacity leap becomes tangible.
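
    A quick back-of-the-envelope check of that claim, using only the figures quoted in this article:

    ```python
    # Simple check of the capacity arithmetic quoted above.
    areal_density_per_platter_tb = 6.9   # TB per platter demonstrated in the lab
    platters_per_drive = 10              # upper bound quoted for today's 3.5-inch enterprise drives

    print(f"Theoretical drive capacity: {areal_density_per_platter_tb * platters_per_drive:.0f} TB")
    # -> 69 TB, which is where the "nearly 70TB" figure comes from
    ```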

    For data centre operators and cloud service providers, where every inch of rack space has a measurable financial value, this is a key signal. In this segment, the game is about maximising data density, which directly translates into lower total cost of ownership (TCO). The foundation for this success is the Mozaic 3+ platform, a market implementation of HAMR technology. Although the brand’s current flagship, the Exos 32TB model, already benefits from advanced laser-assisted magnetic recording, laboratory results confirm that the technology still has a huge amount of scalability.

    Previous announcements of 50TB or 100TB drives have been treated with some reserve by the industry. However, a result close to 7TB per platter shows that Seagate’s ambitious schedules are based on hard physics, not just marketing. While the market release of finished 69TB units is still some way off, the successful scaling of HAMR suggests that entering the era of 100-terabyte drives is now a matter of optimising manufacturing processes rather than a distant theory.

  • The Trump administration is betting on XLight. US$150m for former Intel CEO’s startup

    Donald Trump’s administration has made the unprecedented decision to make a direct capital entry into XLight, a deep-tech startup led by former Intel CEO Pat Gelsinger. As reported by the Wall Street Journal, citing the Department of Commerce, the US government will invest up to $150 million in the company. The move represents a major adjustment in US semiconductor strategy. Washington is no longer limiting itself to subsidising factories and is beginning to build an active portfolio of holdings in technologies that could revolutionise the supply chain.

    XLight is targeting the ‘holy grail’ of chip manufacturing: next-generation EUV lithography based on particle accelerators, which has the potential to break the technology monopoly and cost barriers of current solutions. For Pat Gelsinger, this is a fresh deal of the cards and a return to the highest-stakes game just moments after leaving Intel. The White House’s decision sends a clear signal to the IT channel: the stream of federal funding is shifting from bailing out ‘legacy’ giants to supporting risky but critical hardware innovation. In the long term, this investment could change the balance of power in global silicon production, making the US independent of external suppliers of lithography machines.

  • TFI PZU enters the game: PLN 15m for Scanway at valuation confirming space-tech aspirations

    The Polish space technology sector has received a strong signal of confidence from institutional capital. TFI PZU, one of the largest players in the market, has acquired a stake in Wrocław-based Scanway, investing PLN 15.3 million in the company. The transaction not only secures funding for the new development strategy for 2026-2028, but also sets a new valuation base for the company as it prepares to move to the main floor of the WSE.

    The key aspect of the transaction is the purchase price of the shares. TFI PZU acquired 100,000 shares, representing 6.45% of the share capital, paying PLN 153.00 apiece. Although this amount represents a 9 per cent discount to the market average over the last month, it is more than double the valuation of the financing round carried out only in June this year. Such an increase in a short period of time reflects the dynamic changes in the company’s fundamentals, which has increased the scale of its operations by leaps and bounds in recent quarters.

    The operation was carried out with the participation of the main shareholder, the Jedrzej Kowalewski Family Foundation. The mechanism involved the Foundation selling existing shares to the fund and then subscribing to a new series H issue at the same price. In this way the capital goes directly to the company, while the institutional investor immediately takes possession of liquid shares. Both CEO Kowalewski and TFI PZU have agreed to a lock-up until the end of 2025, which stabilises the shareholding during a key growth period.

    The cash injection has a specific strategic objective: to transform Scanway into one of Europe’s leading optical payload integrators. The funds will allow the company to implement its plan to transfer its listing from NewConnect to the WSE Main Market without another share issue, which is good news for existing shareholders worried about capital dilution.

    TFI PZU’s decision to exceed the 5% threshold in the shareholding of a technology company is part of a broader trend of seeking value in innovative sectors, especially in the context of growing investment in European security and Earth observation systems. Grounds for optimism are provided by the order book (backlog), which stood at PLN 58.3 million as at mid-November, up 254% year-on-year. It includes, among others, a record contract worth EUR 9 million for a client from Asia and participation in the Polish CAMILA satellite constellation project. With revenues of PLN 16 million after three quarters of 2025 and positive EBITDA, Scanway is proving that deep tech can be profitable while still scaling dynamically.

  • Vault open: Washington feeds AI with government data as part of Genesis Mission

    The White House is making a decisive move to integrate the nation’s digital assets with the computing capabilities of the National Laboratories. The executive order signed by President Donald Trump on Monday, initiating the ‘Genesis Mission’, signals a clear shift in strategy: The US intends to use its federal scientific archives as fuel for a new generation of artificial intelligence.

    At the centre of this initiative is the Department of Energy (DOE). Under the new guidelines, the department is tasked with creating an integrated collaborative system that connects the most powerful US supercomputers to the vast data sets held by the government. The plan is to build a closed-loop experimental platform. It is intended to be used not only to train fundamental scientific models, but also to directly control robotic laboratories, allowing research processes to be automated on an unprecedented scale.

    Energy Secretary Chris Wright points to an important shift in emphasis. While the private sector has invested billions of dollars in the commercial development of AI, the administration wants to redirect this technological machine towards hard science and engineering. The key to success is expected to be the unique datasets, available only in national laboratories, which have so far remained an untapped asset in training predictive models.

    The White House’s ambition is to radically accelerate the pace of innovation. Michael Kratsios, who heads the Office of Science and Technology Policy, points out that the aim is to reduce the discovery cycle from years to days or even hours. New AI agents are expected to automate the design of experiments and simulations in fields as complex as fusion plasma dynamics or protein folding.

    This initiative is inextricably linked to the broader geopolitical context and technological rivalry with China. Since taking office, the Trump administration has consistently sought to deregulate the sector, repealing previous AI security restrictions put in place by predecessors to remove barriers to rapid expansion. “Mission Genesis” precisely targets these efforts at sectors critical to national and economic security, including biotechnology, quantum computing and nuclear power, treating data science sovereignty as a key element of US competitive advantage.

  • The end of ‘garage’ deployments. OCP standardises infrastructure for quantum computers

    The Open Compute Project (OCP) is opening a new chapter in data centre design, attempting to reconcile two technological worlds: classical high-performance computing (HPC) and highly sensitive quantum hardware. The organisation has begun work on precise guidelines to enable these systems to coexist within a single server room. Although the vision of hybrid computing promises a leap in performance, the engineering reality presents facility operators with challenges that standard procedures do not anticipate.

    The integration of quantum systems is primarily a struggle with mass and thermodynamics. Although quantum processors themselves may impress with their energy efficiency, their associated infrastructure is demanding. A key element here is the cryostat – a device weighing up to 750 kilograms – which forces designers to ensure that the floor load capacity is at least 1,000 kg/m².

    Managing the temperature of the cooling fluid is proving to be even more challenging. While modern HPC cabinets can run on water temperatures as high as 45°C, quantum systems require a fluid supply in the 15-25°C range. This necessitates maintaining two separate cooling loops or using advanced heat exchangers. Added to this is the rigorous control of humidity, which must oscillate between 25 and 60 per cent to avoid condensation on refrigeration components, which would be disastrous in a precision electronics environment.

    However, it is environmental factors, often ignored in classical IT, that can determine the success of a deployment. Quantum hardware is extremely sensitive to electromagnetic interference. Even such mundane items as fluorescent lighting must be kept at least two metres away from the computing unit. Magnetic fields must be strictly limited, and the location of the data centre itself requires fresh site analysis. The presence of a tramline, railway traction or mobile phone masts within 100 metres can generate noise that prevents stable operation of the qubits.

    OCP rightly points out that installing a quantum computer is no longer a standard ‘plug-and-play’ operation. It is an engineering process that takes a minimum of four weeks and requires the involvement of specialist electricians and refrigeration technicians, not just IT staff. The OCP initiative to create checklists and best practices is therefore not so much a facilitator as a necessity for hybrid HPC environments to move out of the experimental phase and become a market standard.
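
    The closing point about checklists invites a concrete illustration. Below is a hypothetical sketch of a site-readiness check built only from the thresholds mentioned in this article; the field names, data structure and pass/fail logic are assumptions, not an OCP specification.

    ```python
    # Hypothetical site-readiness check assembled from the requirements described above.
    # The field names and structure are illustrative; they are not an OCP standard.

    from dataclasses import dataclass

    @dataclass
    class SiteSurvey:
        floor_load_kg_m2: float        # cryostat installations reportedly need >= 1000 kg/m2
        coolant_supply_c: float        # quantum cooling loop reportedly needs 15-25 C
        relative_humidity_pct: float   # reportedly 25-60% to avoid condensation
        nearest_em_source_m: float     # e.g. fluorescent lighting, reportedly >= 2 m away
        nearest_rail_or_mast_m: float  # tramlines, railways, phone masts, reportedly >= 100 m

    def quantum_ready(s: SiteSurvey) -> list[str]:
        """Return the list of requirements the site fails; an empty list means it passes."""
        issues = []
        if s.floor_load_kg_m2 < 1000:
            issues.append("floor load below 1000 kg/m2")
        if not 15 <= s.coolant_supply_c <= 25:
            issues.append("coolant supply outside 15-25 C")
        if not 25 <= s.relative_humidity_pct <= 60:
            issues.append("humidity outside 25-60%")
        if s.nearest_em_source_m < 2:
            issues.append("EM source (e.g. fluorescent light) closer than 2 m")
        if s.nearest_rail_or_mast_m < 100:
            issues.append("tram/rail/mast closer than 100 m")
        return issues

    survey = SiteSurvey(1200, 18, 45, 3, 250)
    print(quantum_ready(survey) or "site meets the listed criteria")
    ```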

  • Post-quantum cryptography in banking and medicine. Challenges, regulation and implementation

    Most discussions about quantum computers still oscillate around futurology. We talk about machines that will, at some point in the future, change the face of science and medicine. Meanwhile, for security directors in banks, hospitals or government institutions, the quantum age is not a distant vision, but a pressing problem that started yesterday. In the shadow of media reports about successive quantum processors, a quiet data security drama is playing out, known in the industry as the ‘Harvest Now, Decrypt Later’ strategy. Its premise is simple but brutal: cyber criminals and hostile state actors are already stealing and archiving encrypted data en masse. For them it is currently a useless string of characters, but their long-term goal is to store it until quantum computers achieve enough computing power to crack today’s algorithms in seconds.

    For industries handling sensitive data with a long lifecycle, this is a nightmare scenario. The financial sector, which relies on trust and bank secrecy, and the healthcare system, which protects patient data often for decades, are on the front line. If we assume that a stable quantum computer will be developed in ten years’ time and that medical or financial data must remain confidential for fifteen or twenty years, the maths is inexorable. The safeguards in place today are already insufficient, as the period of necessary data protection is beyond the time horizon of safe use of current cryptography. Research published by the French agency ANSSI shows that half of the organisations surveyed are already at risk of future quantum attacks, especially in the context of such common tools as VPNs or long-term certificates.
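
    The timeline argument above is, in essence, Mosca’s inequality: if the years of required confidentiality plus the years needed to migrate exceed the years until a cryptographically relevant quantum computer arrives, data encrypted today is already exposed. A minimal sketch, with all three horizons treated as assumptions rather than forecasts:

    ```python
    # Mosca-style timeline check for the "Harvest Now, Decrypt Later" risk.
    # All three horizons below are assumptions for illustration, not forecasts.

    def already_exposed(confidentiality_years: float,
                        migration_years: float,
                        years_to_quantum_computer: float) -> bool:
        """True if data encrypted today will still need protection after
        a cryptographically relevant quantum computer is expected to exist."""
        return confidentiality_years + migration_years > years_to_quantum_computer

    # Example from the article: medical/financial data confidential for 15-20 years,
    # a stable quantum computer assumed in ~10 years, migration taking several years.
    print(already_exposed(confidentiality_years=15, migration_years=5,
                          years_to_quantum_computer=10))   # -> True
    ```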

    The answer to this invisible threat is not to build quantum computers of our own in self-defence, but mathematics, and specifically post-quantum cryptography (PQC). The term covers a new set of encryption algorithms designed specifically to withstand the computing power of future machines. Crucially, these technologies are compatible with current hardware. They can, and should, be deployed on today’s servers, cloud and network infrastructure without waiting for a hardware revolution. Deployment rests on two foundations increasingly present in security strategies: hybridisation and crypto-agility.

    The hybrid approach is a kind of security bridge. It involves the simultaneous use of conventional, time-tested algorithms and new post-quantum ones. It works like a double lock on a door: even if one is forced, the other still protects the assets. This strategy allows companies to test the new technology and build resilience without abandoning current standards overnight. Crypto-agility, on the other hand, is the system’s ability to quickly swap out an encryption algorithm when a vulnerability is discovered in it. In a dynamically changing threat landscape, IT systems cannot be monolithic; they must allow their cryptographic foundations to be updated seamlessly without rebuilding the entire architecture or paralysing the operational performance of the enterprise.
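
    To make the ‘double lock’ idea concrete, here is a minimal sketch of hybrid key derivation: a classical shared secret and a post-quantum shared secret are concatenated and fed through a key derivation function, so the session key remains safe as long as either component holds. The two input secrets are random placeholders standing in for, say, an ECDH exchange and an ML-KEM encapsulation; no real KEM library is invoked.

    ```python
    # Conceptual sketch of hybrid key derivation. The two shared secrets below are
    # placeholders standing in for the outputs of a classical key exchange (e.g. ECDH)
    # and a post-quantum KEM (e.g. ML-KEM); no real KEM library is used here.

    import hashlib
    import hmac
    import os

    def hkdf(secret: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
        """Minimal HKDF (extract-and-expand, RFC 5869) using HMAC-SHA256."""
        prk = hmac.new(salt, secret, hashlib.sha256).digest()
        okm, block, counter = b"", b"", 1
        while len(okm) < length:
            block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
            okm += block
            counter += 1
        return okm[:length]

    classical_secret = os.urandom(32)      # stand-in for an ECDH shared secret
    post_quantum_secret = os.urandom(32)   # stand-in for an ML-KEM shared secret

    # Concatenate both secrets before key derivation: an attacker must break BOTH
    # components to recover the session key (the "double lock").
    session_key = hkdf(classical_secret + post_quantum_secret,
                       salt=b"hybrid-handshake",
                       info=b"session key v1")
    print(session_key.hex())
    ```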

    While technological solutions are already on the table, the impetus for change is increasingly coming not from IT departments, but from the offices of regulators. Europe is clearly accelerating in the race for digital sovereignty, and bodies such as the aforementioned ANSSI and EU institutions are no longer treating quantum resilience as an option, but as a necessity. Standardisation work, led globally by the US NIST, is being closely followed and adapted to European requirements. On the Old Continent, legislation, including the Cyber Resilience Act, is beginning to play a key role. The new legislation will gradually force software and hardware providers to comply with state-of-the-art cryptographic criteria. This means that soon even non-critical infrastructure companies will have to review their supply chains, making sure that their technology partners offer solutions that are ready for the post-quantum era.

    There is currently an intriguing divergence in the market. On the one hand, technology providers are showing great mobilisation, actively following recommendations and integrating new standards into their products to stay ahead of regulation. On the other hand, many end users, including large enterprises, are adopting a wait-and-see attitude. Some industries are holding off on decisions until rigid legal guidelines emerge. However, experts warn that this is a risky strategy. Crypto migration is an extremely complex, costly and time-consuming process. Organisations that only start it when hard regulations come into force may find themselves in a no-win situation, forced to make chaotic and costly upgrades under time pressure.

    For sectors such as banking or healthcare, anticipating the quantum threat has therefore become a strategic necessity, going far beyond the technical aspects of IT operations. It requires coordination at management level, inventorying resources and planning multi-year budgets. The first step for any conscious organisation should be to map out exactly where cryptography is used and assess how long the protected data must remain confidential. Time to prepare is running out, and in the world of cyber security, where customer trust and the stability of the financial system are at stake, the principle of ‘prevention is better than cure’ has never been more relevant. The move to post-quantum cryptography is not just a software update – it is a fundamental shift in thinking about information persistence and security in the 21st century.
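
    In practice, that first mapping step often takes the form of a simple cryptographic inventory ranked by how long the protected data must stay confidential. The sketch below is purely illustrative: the systems, fields and the ten-year horizon are assumptions carried over from the scenario above, and a real inventory would be fed by automated discovery tooling.

    ```python
    from dataclasses import dataclass

    @dataclass
    class CryptoAsset:
        """One entry in a cryptographic inventory (illustrative fields only)."""
        system: str
        algorithm: str
        data_shelf_life_years: int
        pqc_ready: bool

    # Hypothetical entries for a bank or hospital environment.
    inventory = [
        CryptoAsset("VPN gateway", "RSA-2048", 2, False),
        CryptoAsset("Patient archive", "RSA-2048", 25, False),
        CryptoAsset("Internal PKI", "hybrid ML-KEM", 10, True),
    ]

    YEARS_TO_QUANTUM = 10  # assumption used earlier in the article

    # Flag long-lived data still protected by non-post-quantum algorithms first.
    for asset in sorted(inventory, key=lambda a: a.data_shelf_life_years, reverse=True):
        if not asset.pqc_ready and asset.data_shelf_life_years > YEARS_TO_QUANTUM:
            print(f"Priority migration: {asset.system} ({asset.algorithm})")
    ```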

  • OVHcloud makes Pasqal quantum processors available. New service launches

    OVHcloud makes Pasqal quantum processors available. New service launches

    French cloud provider OVHcloud has just made a decisive move towards making the Old Continent independent of US computing technology. The launch of its new Quantum-as-a-Service platform is not only a product debut, but above all a signal that Europe intends to build its own sovereign ecosystem in the nascent quantum computing sector.

    The Roubaix-based company has granted organisations access to the Orion Beta QPU processor, supplied by French startup Pasqal. This is the first step in an aggressive strategy to integrate at least eight quantum systems by the end of 2027. Significantly, from a geopolitical perspective, up to seven of these are to come from European suppliers. Pasqal, as lead partner, sees the collaboration as the foundation for building ‘digital autonomy’, in which both hardware and cloud infrastructure remain within EU jurisdiction.

    From a business perspective, OVHcloud’s offering stands out for its pragmatic approach to a still experimental technology. The provider combines access to a physical QPU with a set of nine quantum emulators, already used by nearly a thousand developers. This hybrid architecture allows companies to safely test algorithms and validate use cases in a cloud environment, without having to invest in expensive in-house lab infrastructure. Although “quantum supremacy”, the moment when quantum computers permanently surpass classical machines, may not yet be in sight, OVHcloud wants to be ready for it by offering an environment for iteration and learning now.
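
    For development teams, this emulator-first workflow can be explored with Pasqal’s open-source Pulser framework. The snippet below is a minimal sketch that assumes the pulser and pulser-simulation packages; exact APIs may differ between versions, and it runs on the local QuTiP-based emulator rather than on OVHcloud’s hosted QPU, whose access mechanics are not described here.

    ```python
    import numpy as np
    from pulser import Pulse, Register, Sequence
    from pulser.devices import MockDevice
    from pulser_simulation import QutipEmulator

    # Four atoms (qubits) placed on a small square layout, coordinates in micrometres.
    reg = Register.from_coordinates([(0, 0), (0, 5), (5, 0), (5, 5)], prefix="q")

    # A global Rydberg pulse sequence on an idealised mock device.
    seq = Sequence(reg, MockDevice)
    seq.declare_channel("ryd", "rydberg_global")
    seq.add(Pulse.ConstantPulse(1000, 2 * np.pi, 0.0, 0.0), "ryd")

    # Run locally on the emulator; a real backend would dispatch the same sequence to a QPU.
    sim = QutipEmulator.from_sequence(seq)
    results = sim.run()
    print(results.sample_final_state())
    ```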

    The French move is a direct response to the dominance of US hyperscalers. IBM with its Quantum Cloud, Microsoft Azure with its Majorana 1 processor, AWS with Braket, and Google have been building a competitive advantage overseas for years. OVHcloud is entering this market late, but with a clear value proposition: it offers the first viable alternative that ensures sensitive research and data do not leave the European Economic Area. In an era of increasing regulatory tensions and pressure on data sovereignty, this could be a key asset in the battle for public, financial and research customers.

    The choice between Pasqal’s technology and IBM’s solutions is not just a question of supplier, but a decision about the fundamental physical architecture. Although both companies are pursuing the same goal, stable quantum computing, they approach the problem from completely different directions in physics.

    IBM and the Superconducting Qubits approach

    IBM, like Google, is betting on superconducting qubits. These are essentially macroscopic electronic circuits that, when cooled to temperatures near absolute zero (in large dilution refrigerators), exhibit quantum properties. This approach is currently the most mature in engineering terms. Its greatest advantage is the speed of operations: quantum gates execute extremely quickly here.

    However, it suffers from a short coherence time (the time for which a qubit ‘remembers’ its state) and from difficulties with scaling. Each qubit must be physically wired to the control electronics, which, with thousands of qubits, creates a ‘wiring nightmare’ and generates heat, the enemy of the quantum state.

    Pasqal and the Neutral Atoms approach

    France’s Pasqal (and, indirectly, OVHcloud) uses rubidium atoms suspended in a vacuum and held in place by high-precision laser beams known as optical tweezers. In this system, the atoms themselves are the qubits. Since atoms are identical by nature, this eliminates errors caused by imperfections in chip manufacturing, a problem IBM has had to contend with.

    A key advantage of Pasqal’s technology is scalability and connectivity. The lasers can arrange atoms into almost any three-dimensional geometry, allowing complex chemical molecules or optimisation problems to be simulated in a way that is inaccessible to IBM’s rigid chip architecture. These systems can operate at room temperature (for the apparatus itself; the atoms themselves are laser-cooled), which drastically reduces energy costs. The trade-off is slower gate operations compared with superconductors.

    | Feature | Pasqal (OVHcloud) | IBM (IBM Cloud) |
    | --- | --- | --- |
    | Architecture | Neutral atoms (light/laser controlled) | Superconductors (electronic circuits on a chip) |
    | Stability (coherence) | High. Atoms maintain their quantum state for longer (seconds). | Low. Very short state life (microseconds). |
    | Speed of operations | Slower. Operations on atoms take longer. | Very fast. Instantaneous logic gates. |
    | Scalability | High. Easier to add more atoms and lasers than cables. | Moderate. Requires sophisticated cryogenic engineering. |
    | Main applications | Materials simulations, logistics optimisation, chemistry. | Cryptography, factorisation, universal algorithms. |

  • HPE Cray brings AI and HPC together. New generation of supercomputers relies on liquid cooling and choice architecture

    HPE Cray brings AI and HPC together. New generation of supercomputers relies on liquid cooling and choice architecture

    Hewlett Packard Enterprise unveiled the next generation of its supercomputing solutions yesterday (13 November), making a clear strategic bet. In an era of resource-intensive AI models that redefine data centres, HPE is unifying its HPE Cray architecture to meet both new workloads and traditional scientific simulation (HPC). This is a direct response to growing demand from research labs, government agencies and enterprises that no longer want to maintain separate, costly silos for both worlds.

    The company has announced that the HPE Cray Supercomputing GX5000 platform, introduced in October, has already won key customers. The German supercomputing centres, HLRS in Stuttgart (the ‘Herder’ system) and LRZ in Bavaria (the ‘Blue Lion’ system), have chosen it for their next-generation machines. Their motivations are clear: they need a platform that seamlessly combines simulation with AI and is extremely energy-efficient at the same time. Prof. Dieter Kranzlmüller from the LRZ emphasises that direct liquid cooling (up to 40°C) will allow the campus to reuse waste heat.

    At the core of Thursday’s announcement are three new liquid-cooled compute modules. HPE is betting on flexibility and partnership here, offering configurations based both on the next generation of NVIDIA Rubin GPUs and Vera CPUs (in the GX440n module) and on competing AMD Instinct MI430X accelerators and ‘Venice’ EPYC processors (in the GX350a and GX250 modules). Compute density and full liquid cooling are key to addressing the growing energy challenges.

    The supercomputer is not just about computing power, however. HPE is upgrading the entire platform. The HPE Slingshot 400 network is expected to provide the 400 Gbps throughput needed to scale AI jobs across thousands of GPUs. New HPE Cray K3000 storage systems, based on ProLiant servers and open-source DAOS (Distributed Asynchronous Object Storage) software, are in turn expected to address data access bottlenecks, which is critical for AI models.

    The whole is tied together by updated HPE Supercomputing Management Software, emphasising management of multi-tenant environments, virtualisation and detailed control of energy consumption across the system.

    While the announcement is strategically significant and secures HPE’s position in future multi-year contracts, IT market analysts must be patient. Most of the unveiled compute modules (with Rubin and MI430X chips) and software updates will not be available until “early 2027”. The K3000 storage is expected to arrive slightly earlier, in “early 2026”. This is a clear indication that yesterday’s announcement is primarily a roadmap presentation and a response to competitors’ plans, rather than a launch of products that companies can order in the coming quarters.

  • The race for quantum advantage: IBM bets on Nighthawk and accelerated manufacturing

    The race for quantum advantage: IBM bets on Nighthawk and accelerated manufacturing

    IBM has stepped up its efforts in the race to build a usable quantum computer with the unveiling of its new Quantum Nighthawk processor. The goal is clearly defined and strategically differentiated from the competition: the company wants to achieve a ‘measurable quantum advantage’ by the end of 2026. As opposed to purely theoretical ‘supremacy’, quantum advantage refers to the point at which quantum systems solve real scientific or business problems faster and more efficiently than the most powerful classical supercomputers.

    Featuring 120 qubits and 218 tunable couplers, Nighthawk is an evolution of the previous-generation Heron, which prioritised quality over quantity. IBM stresses that the improved architecture allows it to run circuits 30% more complex while maintaining a consistently low error rate. It is this rate, rather than the sheer number of qubits, that remains the biggest engineering challenge. Qubits are extremely sensitive to noise (decoherence), and errors that accumulate during computation render the results useless. Nighthawk is expected to be available to users by the end of 2025.

    The key to achieving this ambitious roadmap, which among other things calls for 15,000 two-qubit gates by 2028, is scaling production. IBM has moved quantum processor manufacturing to a 300mm wafer fabrication facility at the Albany NanoTech Complex. This move, taken directly from the mature semiconductor industry, has already doubled the speed of development and increased the physical complexity of the chips tenfold, according to the company.

    In parallel, the company is working on the foundations of the future. The experimental Quantum Loon processor demonstrates the components necessary for fault-tolerant quantum computing, a goal for 2029. A breakthrough in error correction has also been reported, with a new decoding method operating ten times faster than existing methods, a year ahead of the original plan.

    To give credibility to its progress and set a market standard, IBM is launching an open, community-based Quantum Advantage Tracker with partners such as Algorithmiq. The initiative aims to transparently monitor and verify new demonstrations of the real-world benefits of quantum technology.