Category: Data

Data is a source of in-depth reports, visualisations and market analyses that show the real picture of the IT industry and the IT sales channel. These are unique breakdowns on vendors, distributors, integrators and trends.

  • The economics of open source: who pays for the code the world runs on?


    Every day, as we reach for our smartphone, launch our favourite TV series or send a business email, we participate in the quiet miracle of modern technology. Beneath the shiny surface of apps and services lies an invisible foundation – open source software.

    It is millions of lines of code, written, refined and shared with the world for free by a global community. This code is the bloodstream of the internet and the backbone of the AI revolution.

    But this digital world, raised on the idea of freedom and collaboration, conceals a profound paradox. The global economy relies on an infrastructure created largely by volunteers, often balancing on the brink of professional burnout.

    It is as if global trade routes were based on bridges built as a hobby after hours. How long can such a structure last? Who actually pays for the code we all rely on?

    The invisible foundation: our global dependence

    Open source software is no longer an alternative. It has become the default building block of the digital world. Hard data paints a picture of almost total dependence. An analysis by Synopsys in 2024 showed that as much as 96% of the commercial code bases examined contained open source components.

    What’s more, on average, 77% of all code in these applications came from open source. It’s no longer a question of using individual libraries – it’s about building entire systems on a foundation created by the community.

    The scale of this dependency becomes even more striking when looking at the dynamics of consumption. In 2024, it was forecast that the total number of downloads of open source packages would reach the unimaginable figure of 6.6 trillion.

    The npm (JavaScript) ecosystem alone was responsible for 4.5 trillion requests, recording 70% year-on-year growth, while the AI-powered Python ecosystem (PyPI) grew by 87% to reach 530 billion downloads.

    The average commercial application today is a complex mosaic of an average of 526 different open source components. Each has its own life cycle, its own maintainers and its own potential problems.

    Cracks in the foundation: zombie code and a wake-up call called Log4j

    The ubiquity of open source is a double-edged sword. The same ease with which developers can incorporate off-the-shelf components into their projects leads to systemic neglect. The data is alarming: as many as 91% of the commercial code bases surveyed contain components that are ten or more versions out of date.

    This problem leads to so-called ‘zombie code’ – components that have had no development activity for more than two years. This phenomenon affects almost half (49%) of the applications on the market.

    This means that companies are building their critical systems on abandoned projects, without active support and, most importantly, without security patches. The consequence is a ticking time bomb: in just one year, the percentage of code bases containing high-risk security vulnerabilities has increased from 48% to 74%.
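The two-year inactivity threshold that defines ‘zombie code’ is easy to operationalise. A minimal sketch (the component names and release dates below are invented for illustration):

```python
from datetime import date

# Hypothetical component inventory: name -> date of last upstream release.
# In practice this data would come from a dependency scanner or SBOM tool.
COMPONENTS = {
    "web-framework": date(2025, 3, 1),
    "json-parser": date(2021, 6, 15),
    "legacy-auth-lib": date(2019, 11, 2),
}

def find_zombies(components, today, max_idle_days=730):
    """Flag components with no release activity for over two years."""
    return sorted(
        name for name, last_release in components.items()
        if (today - last_release).days > max_idle_days
    )

print(find_zombies(COMPONENTS, today=date(2025, 6, 1)))
# -> ['json-parser', 'legacy-auth-lib']
```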

    Nothing illustrates this risk better than the December 2021 incident, when the world learned of the Log4j vulnerability. This small, free Java library for logging turned out to be embedded in millions of applications around the world.

    The vulnerability, named Log4Shell, received the maximum criticality rating of 10/10. An attacker could take full control of a server by sending a simple string of characters. US CISA director Jen Easterly described it as one of the most serious vulnerabilities she had seen in her entire career.

    The Log4j incident became a global wake-up call, making companies brutally aware of how much their security depends on the work of anonymous volunteers.

    Worse still, even three years after the discovery of Log4Shell, up to 13% of all Log4j library downloads are still vulnerable versions. This demonstrates the profound inertia of organisations that fail to update their dependencies even in the face of a well-known, critical threat.
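Checking whether a pinned dependency falls into that still-vulnerable 13% is a simple version comparison. A sketch, assuming 2.17.1 as the first 2.x release free of the whole Log4Shell CVE family:

```python
def parse_version(v):
    """Split a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in v.split("."))

# Log4Shell (CVE-2021-44228) itself was fixed in 2.15.0; follow-up CVEs in
# the same family were only fully addressed in 2.17.1 on the 2.x line.
SAFE_FROM = parse_version("2.17.1")

def is_vulnerable(version):
    return parse_version(version) < SAFE_FROM

print([v for v in ["2.14.1", "2.16.0", "2.17.1", "2.20.0"] if is_vulnerable(v)])
# -> ['2.14.1', '2.16.0']
```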

    The human cost of ‘free’ software: the burden of the maintainer

    There are people behind every line of code. A model that treats their work as a free resource generates a huge human cost. Salvatore Sanfilippo, the creator of the Redis database, described this phenomenon as the ‘flooding effect’.

    Over time, the stream of emails, GitHub submissions and questions turns into a never-ending flood that leads to guilt over not being able to help everyone.

    The scale of this pressure is illustrated by the example of Jeff Geerling, who maintains more than 200 projects. Each day he receives between 50 and 100 notifications, of which he is able to deal with only a fraction.

    Nolan Lawson, another well-known maintainer, aptly captured the emotional weight of this work: notifications on GitHub are “a constant stream of negativity”. No one opens an issue to praise working code. People only post when something is wrong.

    This chronic pressure leads to burnout, which, in the context of open source, has clearly defined causes: demanding users, low quality contributions, lack of time and, most acutely, lack of remuneration.

    Knowing that work that consumes huge amounts of energy is the foundation for commercial products that make real profits for others is extremely demotivating. As one maintainer put it:

    “My software is free, but my time and attention is not”. Maintainer burnout is not just a personal tragedy. It is a critical risk to the global infrastructure.

    ‘Zombie code’ is a direct, measurable symptom of this crisis at the human level.

    The New Economy of Code: Towards a Sustainable Future

    In the face of these risks, the open source ecosystem is slowly maturing, moving from a volunteer-based model to more sustainable forms of funding.

    1. Corporate patrons: strategy, not altruism

    At the forefront of this transformation are the technology giants. Companies such as Google, Microsoft and Red Hat have been the biggest contributors to the open source world for years. Their motivations, however, are not altruistic – they are cold, strategic calculations.

    Joint development of fundamental components (such as operating systems or containerisation) is simply more efficient. This allows them to compete at a higher level, in areas that directly differentiate their products.

    By becoming involved in key projects, corporations can also influence their direction, ensuring alignment with their own strategy.

    2. The power of institutions: the role of foundations

    The second pillar is non-profit foundations such as the Linux Foundation and the Apache Software Foundation. They act as neutral trustees for the most important projects, ensuring their stability and independence from a single corporation.

    They collect contributions from sponsors, creating a budget that allows them to fund key developers and safety audits.

    3. The maker revolution: the GitHub Sponsors model

    Alongside the big players, a new grassroots funding wave has been born. Platforms such as GitHub Sponsors allow direct, recurring contributions from users and companies, creating a revenue stream for maintainers.

    The story of Caleb Porzio, creator of Livewire and AlpineJS tools, is a prime example of the potential of this model. Standing on the brink of burnout, he decided to try his hand at the GitHub Sponsors programme.

    The real breakthrough came when he changed the paradigm: instead of asking for support, he decided to offer his sponsors additional, exclusive value. His secret turned out to be paid screencasts – a series of video tutorials.

    He reserved access to the full library exclusively for backers on GitHub. The effect was spectacular. His annual revenue grew by $80,000 in 90 days and crossed the $1 million threshold in the following years.

    This is a key lesson: a sustainable model does not have to be based on charity, but on building a viable business model around a free, open core.

    From stowaway to stakeholder

    ‘Free’ software has never been free. Its price, hitherto hidden, has been paid with the time, energy and mental health of a global army of volunteers. The model in which we treated their work as an inexhaustible resource is coming to an end.

    It is time for every participant in this ecosystem to undergo a transformation – from a passive ‘stowaway’ to an active stakeholder.

    This requires specific actions. Developers need to practice ‘software hygiene’ – regularly updating dependencies and consciously managing technical debt.

    Companies need to treat open source as a critical part of the supply chain, creating ‘software component inventories’ (SBOMs) and investing in business-critical projects. Investing in open source is not a cost, it is business continuity insurance.
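An SBOM is, at its core, a machine-readable inventory of exactly these components. A minimal illustration loosely modelled on the CycloneDX JSON layout (an invented field subset for illustration, not a complete, schema-valid document):

```python
import json

# A minimal SBOM fragment in the spirit of CycloneDX JSON.
sbom_json = """
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "components": [
    {"type": "library", "name": "log4j-core", "version": "2.14.1"},
    {"type": "library", "name": "jackson-databind", "version": "2.17.0"}
  ]
}
"""

# Turn the SBOM into a name -> version inventory that audits can query.
sbom = json.loads(sbom_json)
inventory = {c["name"]: c["version"] for c in sbom["components"]}
print(inventory)
```

With such an inventory in place, answering “are we running a vulnerable log4j-core anywhere?” becomes a lookup instead of a multi-week audit.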

    We stand at the threshold of a new era for open source – an era of professionalisation and sustainability. A future where creators are fairly remunerated and the global digital infrastructure is secure is within our reach. Building it, however, requires a conscious effort from each of us.

  • Challenges and priorities in the managed services market: Evolving from ‘handyman’ to business partner


    Imagine two scenarios. In the first, it’s 2003, and the owner of a small manufacturing company looks anxiously at a silent server that has paralysed the ordering system.

    In a panic, he calls his ‘IT man’, hoping that he will find time to come and diagnose the problem. Every minute of downtime is a measurable loss.

    In the second scenario, it is today. The CEO of a technology company receives a notification on his smartphone. It’s an automated report from his Managed Service Provider (MSP), informing him that a potential vulnerability in the company’s cloud security was discovered and patched overnight, before cybercriminals had time to exploit it.

    The company’s operations were not disrupted even for a second.

    This contrast perfectly illustrates the fundamental transformation that has taken place in the world of IT services. The evolution of managed service providers is not just a story of adaptation to new technologies.

    It is a story of a complete redefinition of the business model, driven by escalating cyber threats, the increasing complexity of cloud environments and the need for automation.

    The modern MSP has ceased to be just an external IT department called in to put out fires. It has become a key partner in risk management, an engine of digital transformation and a guardian of business continuity.

    Foundations of the past: the era of the “Break-Fix” model

    Before IT service providers became proactive partners, the dominant operating model was the so-called ‘break-fix’. Its logic was simple: when something breaks, a specialist is called in to fix it.

    The process was purely transactional: the customer experienced a breakdown, the technician arrived, repaired it and invoiced for his time and parts.

    The biggest drawback of this model was its fundamental economic structure, which created an inevitable conflict of interest. The IT service provider only made money when there were problems at the client.

    The more failures, the higher the provider’s profits. The customer sought maximum stability, while the provider’s business model depended on instability.

    This structural flaw prevented the building of relationships based on trust and had to give way as soon as companies understood that their survival depended on reliable technology.

    Proactive breakthrough: the birth of the modern MSP

    The twilight of the ‘break-fix’ era has been accelerated by technologies that have enabled fundamental change. Remote monitoring and management (RMM) and professional services automation (PSA) platforms have catalysed the revolution.

    RMM tools allowed suppliers to continuously monitor the health of customer systems in an automated manner, enabling issues to be identified and resolved before they led to downtime.

    The most important innovation, however, was a change in the business model. MSPs moved away from hourly rates to a fixed monthly subscription fee (Monthly Recurring Revenue, MRR).

    For the customer, this meant cost predictability; for the MSP, a stable revenue stream. The introduction of service level agreements (SLAs) gave customers contractual guarantees on response times and system availability.

    Most importantly, this model has united the interests of both parties. The MSP’s profitability became directly proportional to the stability of the client’s IT environment. Each failure was now a cost to the provider, rather than an opportunity to make money, motivating the provider to ensure maximum efficiency.
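The incentive flip can be shown with a toy revenue model (all figures are invented for illustration): under break-fix the provider’s income rises with each incident, while under a flat monthly fee each incident becomes a cost.

```python
def breakfix_revenue(incidents, hours_per_incident=4, hourly_rate=120):
    """Break-fix: the provider bills time and materials per failure."""
    return incidents * hours_per_incident * hourly_rate

def mrr_profit(monthly_fee, incidents, cost_per_incident=300):
    """MRR: a fixed fee, with each failure now a cost to the provider."""
    return monthly_fee - incidents * cost_per_incident

# More incidents: break-fix revenue climbs, MRR profit shrinks.
for incidents in (0, 2, 5):
    print(incidents, breakfix_revenue(incidents), mrr_profit(2000, incidents))
```

The sign of the incident term is the whole story: positive for break-fix, negative for MRR, which is why the subscription model aligned both sides around stability.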

    The cyber security imperative: from administrator to defender

    If proactivity was the spark that started the revolution, the explosion of cyber threats has become the fuel that drives further evolution. Small and medium-sized enterprises (SMEs) have become a prime target for cybercriminals, and the fear of attack has become one of the top business priorities.

    Research from 2024 revealed that as many as 78% of SMEs fear that a major cyber-attack could bankrupt them.

    In response, cyber security has ceased to be an add-on and has become central to the MSP’s offering and a key driver of revenue growth.

    Market analysis shows that 97% of the highest revenue MSPs offer a wide range of managed security services. Clients are no longer just looking for tools; 64% expect strategic guidance from their MSP.

    This has forced providers to evolve towards a managed security service provider (MSSP) model, offering advanced solutions such as managed detection and response (MDR), security information and event management (SIEM) and security awareness training.

    By taking responsibility for cyber security, the MSP has fundamentally changed its role – it no longer just manages the technology, but the customer’s business risk.

    The cloud revolution: managing hybrid complexity

    Contrary to early predictions, the growth of public clouds has not made MSPs redundant. On the contrary, the mass adoption of hybrid and multi-cloud strategies has created an intense new level of complexity that companies have been unable to cope with on their own.

    This has opened up a huge opportunity for mature MSPs. They have transformed themselves into cloud strategists and integrators, helping clients develop strategies, implement complex migrations and, crucially, optimise cloud costs (FinOps).

    In an era of increasing data privacy regulation, MSPs have also started to act as a ‘data sovereignty broker’, advising on where data can and should be stored to comply with regulations.

    The ability to design and manage a fully customised hybrid environment, combining on-premises resources with private and public cloud, has strengthened the MSP’s position as a central coordinator of the client’s entire IT ecosystem.

    Innovation horizon: AIOps and hyper-automation

    The most mature MSPs today stand on the threshold of the next evolutionary leap, whose horizon is marked by AIOps (AI for IT Operations) and hyper-automation. AIOps uses big data and machine learning to automate and streamline IT operations, moving management from proactive to predictive.

    Instead of reacting to known potential problems, AIOps predicts and prevents them before any symptoms become apparent.

    Practical applications include intelligent correlation of thousands of alerts into a single usable incident, predictive analytics that forecast future resource requirements and automated remediation that resolves repetitive problems without human intervention.
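Intelligent alert correlation, in its simplest form, is a grouping heuristic. A sketch that merges alerts from the same host arriving within a short window into one incident (real AIOps platforms use far richer signals, such as topology and ML-derived similarity):

```python
def correlate(alerts, window=300):
    """Group (timestamp, host, message) alerts into incidents: alerts from
    the same host within `window` seconds of the group's last alert merge."""
    incidents = []
    open_incident = {}  # host -> index of that host's most recent incident
    for ts, host, msg in sorted(alerts):
        idx = open_incident.get(host)
        if idx is not None and ts - incidents[idx][-1][0] <= window:
            incidents[idx].append((ts, host, msg))
        else:
            incidents.append([(ts, host, msg)])
            open_incident[host] = len(incidents) - 1
    return incidents

# Four raw alerts collapse into three actionable incidents.
alerts = [
    (0, "db1", "disk latency high"),
    (60, "db1", "io errors"),
    (30, "web1", "cpu spike"),
    (1000, "db1", "disk latency high"),
]
print(len(alerts), "alerts ->", len(correlate(alerts)), "incidents")
```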

    Combined with hyper-automation, which streamlines entire business processes (e.g. onboarding new customers), these technologies become a key competitive advantage.

    AIOps is becoming a prerequisite for managing modern, complex IT environments, and vendors who successfully implement these technologies will be able to serve more demanding customers with greater efficiency.

    An essential engine for digital transformation

    The evolution of managed service providers is a story of remarkable adaptation and continuous climb up the value chain. From a reactive technician whose success was measured by the speed of repair, to a predictive, strategic partner whose value is defined by its contribution to the innovation, resilience and profitability of the client’s business.

    The MSP of the future is not a technology vendor, but a consultancy with deep technical expertise. It thrives in an environment of complexity, actively manages risk and uses intelligent automation to deliver measurable results.

  • IT market in CEE: Poland vs. Czech Republic, Hungary, Romania. Analysis


    Central and Eastern Europe (CEE) has long ceased to be seen as an ‘emerging’ technology market. Today, it is a globally established, dynamic and competitive centre of innovation, whose IT services and R&D market is growing four to five times faster than the global average.

    At the heart of this technological renaissance are four key players, a kind of ‘Visegrad+ Technology’: Poland, the Czech Republic, Hungary and Romania. Each of these countries brings a unique profile to the regional jigsaw: Poland appears as a regional hegemon in terms of scale, the Czech Republic as a stable industrial and technological centre, Hungary as a magnet for foreign direct investment and specialised expertise, and Romania as a ‘digital contender’ with the highest growth rate.

    CEE technology arena

    To understand the dynamics of competition in the region, it is first necessary to assess the fundamental economic context, comparing the scale, structure and importance of IT markets in each of the four countries. It is these indicators that determine who are the biggest players and where the epicentre of growth lies.

    Scale and dynamics of the market: measuring the forces

    Market size is a fundamental indicator of strength. In this respect, Poland is the undisputed leader in the region, although different sources give slightly different estimates, reflecting the complexity and dynamics of the sector. According to PMR data, the value of the Polish IT market in 2023 was PLN 66.3 billion (approx. EUR 15.4 billion), with a forecast of growth to PLN 74 billion (approx. EUR 17.2 billion) by 2025. IDC Poland analysts, on the other hand, estimate this value even higher – at PLN 80.3 billion (approx. EUR 18.6 billion) in 2023. Regardless of the methodology adopted, the scale of the Polish market significantly exceeds its neighbours.

    The Czech ICT (information and communication technology) market presents the picture of a mature and stable powerhouse. Its revenues are forecast to reach EUR 24.3 billion by 2026, with a steady annual growth rate of 2.1%. This indicates a less volatile, well-established market.

    The Hungarian ICT market is more difficult to assess conclusively due to disparate data. Mordor Intelligence estimates its value at an impressive USD 35.17 billion in 2025, with a projected annual growth rate (CAGR) of 11.41% until 2030. Other sources quote a more conservative figure of EUR 5 billion for 2024. This discrepancy suggests that the higher estimate covers a wide range of telecoms services and hardware sales, driven by large corporations. The Hungarian e-commerce segment alone reached HUF 1,920 billion (approximately EUR 4.9 billion) in 2024.

    Romania presents the most dynamic picture. Its digital economy is expected to reach a value of EUR 52 billion by 2030. The IT services export market, valued at EUR 24.9 billion in 2023, is expected to grow to EUR 44.8 billion by 2028, representing an impressive CAGR of 9.1%. This is the fastest growth trajectory in the group analysed, positioning Romania as a top contender for regional momentum.
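The CAGR figures quoted throughout this comparison follow the standard compound-growth formula, which is worth having to hand when checking such forecasts:

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate: (end/start)^(1/years) - 1."""
    return (end_value / start_value) ** (1 / years) - 1

# Illustrative check: a market that doubles over eight years
# grows at roughly 9.1% per year.
print(round(cagr(100, 200, 8) * 100, 1))
# -> 9.1
```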

    This dichotomy between scale and speed of growth creates a strategic tension. Poland, as the largest market, offers stability, a mature and diverse ecosystem, which is attractive to large corporations looking for space for R&D centres. On the other hand, Romania, with its near double-digit growth, is a magnet for venture capital funds and companies looking for rapid expansion, willing to accept the risks associated with a less mature market. The choice between these countries is therefore not a simple decision, but depends on the investor’s appetite for risk and its growth strategy.

    The powerhouse that drives GDP: More than the service sector

    The importance of the IT sector for national economies is best reflected in its share of Gross Domestic Product. In Poland, it is an impressive 8%, reflecting the deep integration of technology into the overall economy and its key role as a driving force. Romania also boasts a high figure at 6.6%. Surprisingly, Hungary has the lowest share at 4.3%. Although precise IT-only data is lacking for the Czech Republic, the context is the powerful automotive industry, generating 10% of GDP, which points to strong links between the technology sector and industry.

    These figures, juxtaposed with overall wealth levels, show that technology is a key tool for convergence. Poland and Romania, with GDP per capita (in purchasing power parity) at 79% of the EU average, are chasing the Czech Republic (92%). The IT sector is undoubtedly one of the main accelerators of this process.

    Market Architecture: What’s hiding under the hood?

    The internal structure of the IT markets in each country reveals their unique specialisations and strategic directions.

    Poland: We are seeing a clear bifurcation of the market. The hardware segment is stabilising after the pandemic boom, while software and services are going from strength to strength, reaching a value of PLN 30.5 billion (approx. EUR 7.1 billion) in 2023. Cloud services are a key driver, with the market growing by 25% year-on-year to reach USD 2 billion.

    Czech Republic: The market is strongly determined by a powerful industrial base, especially the automotive and electrical engineering sectors. This generates a huge demand for embedded systems, industrial automation and advanced enterprise IT solutions. The country is also a hub for international R&D centres such as Microsoft, IBM and Oracle.

    Hungary: the market is characterised by an exceptionally high level of high-tech adoption by businesses. The cloud adoption rate is 37.1% (slightly below the EU average) and data analytics as high as 53.2%, which is well above the EU average (33.2%). This indicates a mature and demanding corporate customer base. The largest segment of the ICT market is telecommunications services, accounting for more than 41% of the total.

    Romania: the market is largely export-oriented, especially in the area of software development services. Despite the government’s strong emphasis on the digitalisation of small and medium-sized enterprises, its level (27%) still lags far behind the EU average (57.7%), which paradoxically creates a huge potential for growth in the internal market.

    An analysis of the structure of markets reveals an interesting phenomenon in Hungary. On the one hand, companies there show above-average maturity in the adoption of advanced technologies such as data analytics.

    On the other hand, the contribution of the overall IT sector to GDP is the lowest in the group. This apparent contradiction suggests that technological advancement is concentrated in a narrow group of large, often foreign corporations (e.g. from the automotive sector), rather than being a widespread phenomenon driven by a broad domestic IT industry.

    This indicates a ‘top-heavy’ market with potentially fewer opportunities for local SMEs compared to Poland, where the domestic IT sector is a much larger economic force.

    The human capital equation: talent, skills and remuneration

    In an industry dominated by a ‘war for talent’, it is human capital that is the most valuable asset and the ultimate determinant of competitiveness. The analysis moves from macroeconomic numbers to the practical realities of building and maintaining technology teams.

    Talent resource: A deep but challenging resource

    Poland: a giant with a skills gap: Poland has by far the largest talent pool, estimated at between 493,000 and over 586,000 professionals. This is a powerful asset, but the country is struggling with a significant skills gap. IT professionals account for 3.5% of the total workforce, which is lower than the EU average (4.5%). It is estimated that Poland lacks as many as 147,000 experts to reach the EU average.

    Czech Republic: Hub of specialists: the Czech Republic has a solid base of nearly 230,000 ICT experts, representing 4.3% of the workforce – a figure close to the EU average. Renowned technical universities provide a steady flow of graduates, although they have to compete for talent with the powerful industrial sector.

    Hungary: Stability and qualifications: In Hungary, the share of ICT professionals in employment is 4.2%, also close to the EU average. However, the annual growth rate of these professionals (2.4%) is slower than in the EU (4.3%), suggesting a stable but less rapidly growing talent pool.

    Romania: The density paradox: Romania has a large and highly valued talent pool of between 202,000 and 226,000 professionals. The country boasts the highest number of certified IT professionals per capita in Europe. Paradoxically, their share of the total workforce is the lowest in the group at just 2.8%. In addition, Romania faces a ‘brain drain’ problem, which poses a serious challenge to keeping top talent in the country.

    This talent flow dynamic is fundamental to long-term development. The phenomenon of ‘brain drain’ in Romania stands in contrast to the ‘brain inflow’ in Poland, which is becoming an attractive place to work for professionals from other countries, including Ukraine.

    An economy that loses talent often exports junior and mid-level professionals, which undermines its ability to create complex, high-margin products locally. In contrast, a country attracting talent can accelerate its march up the value chain by importing experienced experts.

    This indicates that the Polish ecosystem may mature faster, while the Romanian ecosystem, if not reversed, may remain more focused on the provision of outsourcing services.

    Map of Wages: Clash of the four capitals

    Salaries are a key competitive factor in the talent market. A comparison of rates in the region’s main technology hubs reveals significant differences.

    The Warsaw premium: Polish salaries are among the highest in the region. A senior developer on a B2B contract in Kraków or Warsaw can expect a salary in excess of PLN 26,000 net per month (around EUR 6,000). Even on an employment contract, senior salaries exceed PLN 12,000 net (around EUR 2,800).

    Prague competitiveness: Czech salaries are also very high. The typical range for IT professionals is between CZK 43,130 (approximately EUR 1,730) and CZK 122,874 (approximately EUR 4,930) per month. The best-paid roles, such as Data Scientist, can bring in an annual income of CZK 1.2 million (approximately EUR 48,150). The average annual salary for a software engineer is around EUR 55,600.

    Budapest’s value proposition: Hungarian salaries offer a better cost/quality ratio. The average salary for an IT specialist is around EUR 1,800 per month, while a software engineer in Budapest earns an average of EUR 40,400 per year. This makes Hungary much more affordable for building a team than Poland or the Czech Republic.

    Rising costs in Bucharest: Romanian wages are rising fast, but still offer a cost advantage. The average salary in the technology industry is EUR 3,402 net per month. The general range for IT is between RON 4,647 (approximately EUR 930) and RON 16,879 (approximately EUR 3,390) per month. Take-home pay is further boosted by a full exemption from income tax up to a certain threshold, which significantly increases net salaries.

    The prevalence and high rates of B2B contracts in Poland are not just a billing method, but symptomatic of a mature, highly competitive senior talent market. This model gives maximum flexibility and earning potential to the best professionals, but at the same time creates instability for employers and leads to a more transactional relationship with employees.

    In contrast, the dominance of traditional employment contracts in Hungary and the Czech Republic (83.5% and 67% in IT respectively) suggests a more stable, corporate labour market. This means that companies in Poland need to adopt a different HR strategy, focusing on offering attractive projects and top salaries, while in the Czech Republic and Hungary more emphasis can be placed on long-term career paths and company culture.

    A list of coveted expertise: Who’s on top?

    Across the region, there is a huge demand for specialists in areas such as artificial intelligence and machine learning (AI/ML), data analytics (Data & BI), cyber security and DevOps. It is these roles that are the highest paid.

    However, each country also has its niches in which it has achieved a leadership position. Poland is a global powerhouse in the production of computer games (gamedev), with giants such as CD Projekt RED at the forefront. The industry generates more than EUR 500 million in revenue, creating an ecosystem of talent in game design, programming and graphics that is unique in the region.

    Romania is rapidly developing its own gamedev scene, attracting global players such as Amazon Games, which has opened a new studio in Bucharest. The country is also strong in the Fintech sector, with the capital generating 77% of the industry’s turnover in the country.

    The Czech technology scene fits perfectly with the needs of its industry base, targeting areas such as cyber security (Avast originated from here) and enterprise software. Hungary, on the other hand, with its high adoption rate of cloud and data analytics by corporations, generates a strong demand for data architects, cloud engineers and enterprise systems specialists such as SAP.

    The innovation frontier: start-ups, outsourcing and investment

    The future of any technology market depends on its ability to innovate, attract capital and integrate into the global ecosystem. This section examines the dynamics that are shaping tomorrow’s technology scene in Central and Eastern Europe.

    The vibrant Venturelands: The startup race

    Poland: Leader in terms of volume: Poland boasts the largest startup ecosystem in the group, with more than 1,251 companies. Warsaw is the dominant hub. The ecosystem is mature enough to have released nearly a third of all unicorns (companies with a valuation of more than USD 1 billion) in the CEE region. Funding, however, remains a challenge, with as many as 56% of startups reporting difficulties in obtaining it.

    Czech Republic: An effective rival: Despite its smaller scale, the Czech ecosystem is highly rated, ranking 3rd in Eastern Europe, ahead of Poland. It is famous for startups in the areas of SaaS, Fintech and Healthtech and is the cradle of global success stories such as Avast and unicorns such as Rohlik and Productboard. A key challenge is the perceived lack of a sufficient number of high-quality projects by investors.

    Hungary: a stagnant giant? Hungary has established companies such as Prezi and LogMeIn, but has struggled to maintain momentum in recent years. Total investment has stagnated at around EUR 54 million. Recently, however, there has been an upturn in the segment of early-stage AI-based startups, which could herald a rebound.

    Romania: The unicorn factory: The Romanian ecosystem has been defined by the spectacular success of UiPath, the global leader in robotic process automation (RPA). This event has put the country on the map for international investors. The AI scene is particularly active, with large funding rounds for companies such as FintechOS. The ecosystem is heavily concentrated in Bucharest.

    The success of UiPath has had a profound secondary impact on the entire Romanian ecosystem. Not only has it created a generation of experienced, wealthy angel investors and serial entrepreneurs (the so-called ‘UiPath mafia’), but it has also acted as a global proof of principle, reducing the perceived risk of investing in Romania in the eyes of international VC funds. This explains the impressive funding rounds for companies such as FintechOS and the general revival around the Romanian scene, which might otherwise seem disproportionate to the size of the market. This ‘unicorn effect’ is a powerful accelerator that allows the ecosystem to perform well above its nominal weight.

    Global background: Strategic partner, not cheap labour

    The entire CEE region is a leading global destination for IT outsourcing. Clients are increasingly shifting their focus from cost optimisation to access to high-end skills, innovation and cultural proximity. The regional talent pool exceeds 1.75 million engineers.

    A stable business environment is a key asset. In the Doing Business 2020 ranking, Poland (40th), the Czech Republic (41st), Hungary (52nd) and Romania (55th) all offer predictable conditions – an advantage over other global outsourcing hubs.

    Poland is often recognised as a leader in IT competitiveness in the region thanks to its huge talent pool, business climate and strong exports. It is a major hub for the R&D centres of global giants such as Google and Microsoft.

    The Czech Republic ranks among the top five countries in terms of the attractiveness of outsourcing, renowned for its high quality services and data security.

    Hungary and Romania are attracting investors with, respectively, a low 9 per cent corporate income tax and income tax exemptions for programmers, which, combined with a large talent pool, creates a powerful value proposition.

    The strong presence of international R&D centres and outsourcing companies in Poland and the Czech Republic is not just a service industry; it is a key incubator for the country’s startup ecosystem. These centres train local talent to global standards, introduce them to global business practices and create a network of professionals who eventually leave to start their own product companies. A programmer working for five years in Google’s Warsaw office learns product management, scaling and international sales at a level not available in most local companies. Such a specialist, armed with unique skills, contacts and an understanding of the needs of the global market, is much more likely to succeed. In this way, the outsourcing sector is not a separate entity, but a fundamental pillar that feeds and accelerates the development of the domestic product and startup economy.

    The role of the state: Catalysts for growth

    The governments of all four countries actively support the technology sector through a variety of initiatives, including tax incentives, funding programmes and startup visas. Key policies, such as Romania’s income tax exemption for software developers or Hungary’s low CIT rate, are important competitive advantages. Poland and the Czech Republic are effectively using EU funds and national development agencies (such as PFR Ventures and CzechInvest) to fuel their innovation ecosystems.

    Verdict: Poland’s position and the way forward

    A synthesis of the data presented makes it possible to formulate a clear verdict on Poland’s position compared to regional rivals and to outline strategic perspectives for the entire region.

    Regional SWOT analysis: Comparative scorecard

    Poland:

    • Strengths: Largest market and talent pool, diverse ecosystem (gamedev, enterprise), strong startup scene.
    • Weaknesses: Significant talent gap, increasing wage pressure, high competition.
    • Opportunities: Inflow of talent from abroad, opportunity to move up the value chain to more complex products.
    • Threats: Loss of cost competitiveness to Romania/Hungary, market saturation in some areas.

    Czech Republic:

    • Strengths: Stable, mature market, highly qualified professionals, strong integration with industry, excellent business environment.
    • Weaknesses: Smaller talent pool, slower growth, higher costs than some neighbours.
    • Opportunities: Leverage the industrial base for innovation within Industry 4.0, become a hub for high-margin R&D centres.
    • Threats: Competition for talent with powerful manufacturing sector, risk of stagnation.

    Hungary:

    • Strengths: Favourable tax environment, high adoption of advanced technologies in companies, strong value proposition.
    • Weaknesses: Stagnation in startup funding, slower growth of talent pool, less dynamic ecosystem.
    • Opportunities: Potential to become a specialised hub for AI and data science solutions for corporations, attracting cost-oriented FDI.
    • Threats: Lagging behind regional leaders in startup innovation, political uncertainty affecting investor confidence.

    Romania:

    • Strengths: Top growth rate, high talent density, significant cost advantages, ‘unicorn effect’ after UiPath success.
    • Weaknesses: Brain drain, less developed domestic market, infrastructure gaps outside major hubs.
    • Opportunities: Huge potential in the digitalisation of the country’s SMEs, becoming the gamedev hub of South East Europe.
    • Threats: Talent retention, risk of overheating the economy, dependence on export markets.

    Poland’s position in the CEE arena: Heavyweight champion under pressure

    Poland remains the undisputed leader of the CEE technology scene in terms of scale, talent numbers and diversity of the ecosystem. The size of its market and depth of specialisation, especially in gamedev and enterprise software, are unrivalled.

    However, leadership comes at a price. Poland faces challenges typical of a mature market: intense wage competition that undermines its cost advantage, and a critical skills gap that could stifle future growth. Poland is no longer the ‘low-cost’ option; it is the ‘scale’ option.

    While Poland is leading the way, its rivals are not standing still. Romania challenges it in terms of growth and dynamism, the Czech Republic in terms of stability and specialised quality, and Hungary in terms of cost efficiency for corporate investments.

    Collective strength: The future is regional

    The future of the CEE technology scene will not depend on which country ‘wins’, but on how the region as a whole handles the transformation from a cost-driven outsourcing destination to a value-driven innovation partner. Poland, as the largest player, has a key role to play in leading this change, but its success is inextricably linked to the health and dynamism of its neighbours.

  • The great reallocation in IT: analysis of a $5.7 trillion market

    The great reallocation in IT: analysis of a $5.7 trillion market

    The global IT market is on the verge of an unprecedented boom. Leading analyst firms such as Gartner forecast that global IT spending will reach an astronomical $5.7 trillion in 2025, an impressive increase of more than 9% from 2024.

    Other forecasts, although differing in detail, agree on one thing: we are witnessing a historic influx of capital into the technology sector. However, to stop at this headline figure would be a mistake. The amount itself, while impressive, is merely a facade for a much deeper and more fundamental transformation.

    The story that this money tells is not about simple growth, but about a strategic and rapid reorientation of global business.

    The real story lies in the asymmetry of this growth. While the overall market is growing by around 9%, some segments are exploding. Spending on data centre systems is set to grow by a staggering 23.2% and on software by 14.2%.

    Communication services, on the other hand, will see a much more modest increase of just 3.8%. This disparity is no accident. It is evidence of a conscious, strategic business decision that can be called the ‘Great Reallocation’ of capital.

    Companies are not just spending more; they are actively shifting resources from one area to another, de-prioritising maintenance of the status quo in favour of aggressive investment in intelligence and services.

    IT budgets in 2025 are not just bigger – they are smarter, more focused and ruthlessly geared towards a future where software and artificial intelligence are no longer support tools, but the very heart of value creation.

    The AI gold rush: from grand experimentation to pragmatic integration

    The undisputed driver of spending in 2025 is generative artificial intelligence (GenAI). It is the epicentre of the ‘Great Reallocation’, attracting capital at a scale that is redefining investment priorities around the world.

    The physical manifestation of this gold rush is a monumental expansion of infrastructure. Spending on AI-optimised servers is forecast to reach $202 billion in 2025, double the spending on traditional servers.

    The entire data centre systems segment is expected to grow by the aforementioned 23.2% as a direct result of the demand for computing power required to train and deploy advanced AI models.

    At the forefront of this boom are the hyperscalers – cloud giants such as Amazon Web Services, Microsoft Azure and Google Cloud. These companies, along with IT service providers, will account for more than 70% of all IT spending in 2025. Their role is evolving.

    They are no longer just infrastructure-as-a-service (IaaS) providers; they are becoming the foundation of a new, oligopolistic market for AI models.

    At the same time, the market is maturing at an extremely fast pace. The phase of unrestricted, often chaotic experiments with AI inside companies is coming to an end. Many companies have bumped into a wall: the capital and operational costs of creating their own models have turned out to be much higher than expected, the skills gaps in the teams have been too large, and the return on investment (ROI) from pilot programmes has been disappointing.

    As a result, a key change in strategy is taking place: a shift from an expensive ‘build’ model to a pragmatic ‘buy’ model. IT directors are no longer creating GenAI tools from scratch; instead, they are buying off-the-shelf functionality that software providers build into existing platforms.

    The market is entering the phase that Gartner calls the ‘trough of disillusionment’. Paradoxically, this does not mean a decline in spending, but a decline in unrealistic expectations.

    Companies are moving away from chasing revolutionary breakthroughs to practical applications of AI that increase employee productivity, automate processes and give real competitive advantage.

    Software-defined economics: how your car explains the future of business

    The spectacular growth in spending on software (+14.2%) and IT services (+9%) is the strongest signal yet that we are witnessing the birth of a new economic paradigm. You don’t have to look far to understand its essence – just look at the transformation taking place in the automotive industry.

    The Software-Defined Vehicle (SDV) model is an excellent, tangible case study that illustrates how physical products are transformed into platforms for delivering high-margin, cyclical digital services.

    The SDV revolution is the fundamental separation of the hardware layer from the software layer in the vehicle. This allows carmakers to deploy new features and enhancements continuously, via Over-The-Air (OTA) wireless updates, without having to physically interfere with the car.

    This completely changes the nature of the product. The car ceases to be an asset whose value diminishes over time and becomes a dynamic platform capable of generating revenue throughout its life cycle.

    Manufacturers are already experimenting with new business models: BMW is testing subscriptions for heated seats and Volkswagen plans to offer autonomous driving features in a pay-as-you-go model.

    However, this trend is not limited to automotive. It is a leading indicator of the universal transformation of business models. The entire software market is moving towards subscription and Software-as-a-Service (SaaS) models.

    Software is the fastest growing technology sector and is predicted to account for 60% of global technology spending growth by 2029. This confirms that the SDV model heralds a broader shift in which the boundaries between product and service are blurring.

    In this new economy, the IT department, traditionally seen as a cost centre, is being promoted to the role of central value creator.

    The chief information officer (CIO) and chief technology officer (CTO) become key figures in the product strategy, and their expertise is essential to the creation of the company’s core product.

    Professional 2025: shaping a modern IT skill set

    Technological and business transformation is having a profound impact on the labour market, reshaping the demand for skills. To succeed in this dynamic environment, IT professionals need to develop a hybrid skill set, combining deep technical knowledge with sustainable ‘soft’ skills.

    The analysis of the labour market for 2025 leaves no doubt: the most sought-after professions are almost entirely technology-related. At the top of the lists are AI and machine learning specialists, data analysts and cyber security analysts.

    Demand for cyber security professionals alone is forecast to grow by 33% between 2023 and 2033. Among the key technical skills employers are looking for, artificial intelligence, data analytics, cloud computing and programming dominate, with a particular premium on the Python language.

    However, technical proficiency alone is no longer sufficient. As AI takes on more and more analytical tasks, the value of skills that machines cannot easily replicate increases.

    Employers are increasingly prioritising abilities such as analytical and creative thinking, complex problem solving, emotional intelligence and adaptability.

    Artificial intelligence will certainly cause displacement in the labour market. It is estimated that AI could automate up to a quarter of job tasks in the US and Europe, especially routine tasks such as basic programming or customer service.

    However, the dominant expert narrative does not focus on mass unemployment, but on the transformation of work. AI is not so much eliminating occupations as redefining them, creating new, often more strategic roles. In this new occupational landscape, the ‘half-life of technological skills’ is now less than five years.
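    Taken literally, the half-life figure implies a simple exponential decay. The sketch below is purely illustrative, assuming a flat five-year half-life and nothing beyond the figure quoted above:

    ```python
    # Illustrative decay curve: with a half-life of ~5 years, a skill set's
    # market relevance halves every 5 years unless it is refreshed.
    def remaining_relevance(years: float, half_life: float = 5.0) -> float:
        """Fraction of original relevance left after `years` without upskilling."""
        return 0.5 ** (years / half_life)

    for t in (0, 5, 10):
        print(f"after {t:>2} years: {remaining_relevance(t):.0%}")
    # after  0 years: 100%
    # after  5 years: 50%
    # after 10 years: 25%
    ```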

    This means that continuous learning agility is becoming the most important meta-skill. The future of work is not about competition between humans and AI, but about their symbiosis.

    The most effective professionals are those who master the art of using AI as a collaborative partner to enhance their own creativity and productivity.

    Navigating the next wave of IT transformation

    Analysis of global IT spending trends for 2025 clearly shows that we are witnessing profound, structural changes. We are seeing a shift from spending more to spending smarter, and the AI market is maturing, moving from building to integrating off-the-shelf solutions.

    At the same time, business models are evolving from selling products to selling services, forcing a transformation in the labour market – from static roles to dynamic skills.

  • The technology gap is widening: SMEs vs corporates in the race for AI

    The technology gap is widening: SMEs vs corporates in the race for AI

    Small and medium-sized enterprises (SMEs) are the backbone of the European economy. They account for 99.8 per cent of all companies, generate more than half of the added value and employ nearly two-thirds of the private sector workforce. In an era of global competition and rising customer expectations, digitalisation is no longer an option for them – it has become a condition for survival. However, the latest data from across the European Union paints a worrying picture: while large corporations have boarded the digital express, the SME sector is largely still waiting on the platform.

    An analysis of the adoption of the three pillars of modern business – cloud computing, artificial intelligence (AI) and cyber security – reveals a deep and widening gap. This ‘digital maturity gap’ threatens not only the competitiveness of individual companies, but also the achievement of the EU’s ambitious strategic goals, known as the ‘Road to the Digital Decade’.

    Two-speed Europe: who is the digital leader and who is being left behind?

    To understand the real level of digitalisation, it is not enough to see if a company has access to the internet. The key is how deeply technology is integrated into its business processes. This is measured by the EU’s Digital Intensity Index (DII), which assesses the use of 12 key technologies.

    Only 58% of SMEs in the EU have reached a ‘basic level’ of digitalisation, which means using at least four of these technologies. This is a far cry from the EU’s target that more than 90% of companies in the sector should reach this threshold by 2030.
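    The DII threshold rule itself is simple to state. Below is a minimal sketch of the logic, with hypothetical technology names standing in for Eurostat’s official 12-item list:

    ```python
    # Sketch of the Digital Intensity Index (DII) 'basic level' rule: a firm
    # qualifies when it uses at least 4 of the 12 technologies Eurostat tracks.
    # The technology names below are illustrative, not the official item list.
    BASIC_LEVEL_THRESHOLD = 4

    def reaches_basic_level(technologies_used: set) -> bool:
        """True if the firm uses at least 4 of the tracked technologies."""
        return len(technologies_used) >= BASIC_LEVEL_THRESHOLD

    firm_a = {"cloud", "erp", "crm", "e-invoicing", "social media"}  # 5 in use
    firm_b = {"website", "e-mail"}                                   # 2 in use

    print(reaches_basic_level(firm_a))  # True
    print(reaches_basic_level(firm_b))  # False
    ```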

    The map of Europe shows a clear division. The Nordic countries are at the head of the peloton, with as many as 86% of SMEs meeting the criteria for basic digitisation in Finland and 80% in Sweden. At the other extreme are Romania (27%) and Bulgaria (28%). Poland, with a score of 43% (data for 2022), is well below the EU average, which signals systemic barriers inhibiting the potential of Polish companies.

    The problem is the difference between ‘being online’ and ‘being digital’. Almost all companies in the EU have broadband, but they often use it passively – for email or social media profiles. The real transformation begins when technology becomes an integral part of the operating model, not just a facade.

    Cloud computing: a foundation that shows cracks

    Cloud computing is today the cornerstone of flexibility and scalability. In 2023, 45.2% of businesses in the EU were using it – steady but slow growth. However, the devil is in the detail.

    The biggest challenge is the ‘cloud gap’ between companies of different sizes. While 77.6% of large corporations are actively using the cloud, the figure for small businesses drops to just 41.7%. This is a gap of more than 35 points, showing that SMEs still face barriers to accessing this fundamental technology.

    Moreover, companies that are already in the cloud mainly use it for basic tasks: email handling (82.7%), file storage (68%) or office software (66.3%). They are much less likely to use advanced services such as developer platforms (PaaS) or computing power (IaaS), which are essential for building innovation.

    The conclusion for managers is simple: the cloud is not just a storage facility for data, but first and foremost a launch platform for AI. Companies that do not invest in a mature cloud infrastructure today will have a double barrier to overcome tomorrow to enter the world of artificial intelligence.

    Artificial intelligence: the technology that divides most

    If the cloud shows the cracks, artificial intelligence reveals the real divide. Despite the huge interest, AI adoption in European companies remains alarmingly low, at just 13.48% in 2024. This is a result that is dramatically far from the EU target of 75% for 2030.

    The AI implementation gap is gigantic. Artificial intelligence is used by as many as 41.17% of large corporations, but only 11.21% of small companies. This means that large companies implement AI almost four times as often. Poland, with a score of 5.9%, sits near the bottom of the European table, ahead of only Romania (3.07%).
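    The ‘almost four times’ claim follows directly from the two quoted percentages – a quick arithmetic check:

    ```python
    # Ratio and gap between large-corporation and small-company AI adoption,
    # using the Eurostat figures quoted above (41.17% vs 11.21%).
    large, small = 41.17, 11.21

    ratio = large / small
    gap_pp = large - small

    print(f"large firms adopt AI {ratio:.1f}x as often")   # 3.7x
    print(f"adoption gap: {gap_pp:.1f} percentage points")  # 30.0 pp
    ```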

    Why is the gap so deep? Cloud deployment is often a decision to optimise costs. AI implementation is a strategic investment with an uncertain return, requiring not only capital, but above all competence and a mature data management strategy – resources that SMEs often lack.

    If this trend continues, AI, rather than levelling the playing field, will become the ‘great divider’. This could lead to a ‘winner-take-all’ scenario, in which large, data-rich corporations, thanks to AI, will become even more powerful, marginalising smaller players.

    Cyber security: the paradox of risk in the SME sector

    On paper, the situation looks good: 92.76% of companies in the EU use at least one ICT security measure. However, these are mainly basics, such as strong passwords or data backup. The real picture of digital resilience emerges when we look at proactive measures.

    A regular ICT risk assessment – the cornerstone of any mature security strategy – is carried out by only 34.1% of companies in the EU. The difference between large (75.62%) and small (29.35%) companies here is colossal. This means that most SMEs are operating ‘blindly’ without fully understanding their attack surface.

    This leads to the ‘SME digital risk paradox’. On the one hand, small businesses are increasingly being targeted, seen as ‘easier prey’ and a gateway to the supply chains of larger partners. On the other hand, they invest the least in strategic defence, mistakenly believing that they are too small to attract the attention of cybercriminals. In the connected economy, SME security becomes a security issue for the entire ecosystem.

    How to bridge the digital divide?

    Passivity is no longer an option. To survive and compete in the digital decade, SME leaders must take decisive action.

    It makes sense to start with strategy, not technology. Before you invest in any tool, define the key business problem you want to solve. Is it increasing sales, reducing costs or perhaps improving customer service? Only then select the right solution.

    Use the cloud as a foundation. Migrate core systems (email, files, accounting) to the cloud. This will not only free up resources and increase security, but most importantly create a centralised database – a prerequisite for future AI implementations.

    Invest in people, not just platforms. The best technology is useless without a competent team. Take advantage of available EU and national programmes (e.g. Digital Skills and Jobs Coalition, SME4DD) to upskill staff in data analytics, digital marketing and cyber security.

    Think security from the outset. Treat cyber security as an integral part of any digital project, not an expensive add-on. A proactive approach is always cheaper and more effective than reacting to a crisis.

  • Central and Eastern Europe (CEE) vs. Western Europe: where does the heart of innovation really beat?

    Central and Eastern Europe (CEE) vs. Western Europe: where does the heart of innovation really beat?

    For decades, Europe’s technological landscape was based on a simple divide: innovative, capital-rich centres in the West and talented but mostly cheaper hinterlands in the East. Central and Eastern Europe (CEE) was mainly seen through the prism of cost arbitrage, ideal for nearshoring.

    Today, this stereotype is not only outdated, but actually inhibits an understanding of the real dynamics of the continent. We are witnessing a fundamental change – the CEE region, led by Poland, the Czech Republic and Romania, is undergoing a transformation from a peripheral service provider to a self-sufficient ‘technology tiger’.

    Its new competitive advantage is no longer based solely on lower costs, but on a unique combination of value, deep specialisation and unparalleled growth momentum.

    To verify this thesis, let us look at the hard data, comparing the key pillars of the innovation ecosystems in CEE and Western Europe.

    The talent equation: more than cost, unparalleled value

    Traditional analysis of IT markets is often reduced to a comparison of nominal salaries. However, a full picture of the value of the CEE region only emerges when three dimensions are examined: the total cost of employment, the purchasing power of the employee and the objective quality of their skills.

    The total cost of employing an experienced software engineer in Warsaw is still significantly lower than in western hubs. Taking into account the gross salary and contributions on the employer’s side, the annual cost of employing a Senior Developer in Warsaw is approximately EUR 88,568.

    This compares to €101,035 in Berlin and €106,704 in Dublin. This means that it is 12-17% cheaper to acquire a world-class specialist in Poland.
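    The quoted 12-17% saving can be reproduced directly from the three annual cost figures:

    ```python
    # Relative saving from hiring a Senior Developer in Warsaw rather than in
    # Berlin or Dublin, based on the annual employment costs quoted above.
    annual_cost_eur = {"Warsaw": 88_568, "Berlin": 101_035, "Dublin": 106_704}

    for city in ("Berlin", "Dublin"):
        saving = 1 - annual_cost_eur["Warsaw"] / annual_cost_eur[city]
        print(f"Warsaw vs {city}: {saving:.1%} cheaper")
    # Warsaw vs Berlin: 12.3% cheaper
    # Warsaw vs Dublin: 17.0% cheaper
    ```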

    However, the real advantage of CEE lies in purchasing power. The lower cost of living means that a salary here has a much higher real value. A key factor is the cost of renting a flat: a one-room flat in the centre of Warsaw costs between €740-990 per month, while in Berlin it is already €1,100-1,250 and in Dublin an astronomical €1,950.

    Similar disparities can be seen in the prices of public transport, catering or entertainment. As a result, the developer in Warsaw, while earning nominally less, enjoys a higher standard of living and greater financial freedom.

    The most important argument overturning the old paradigm, however, is quality. Data from global programming rankings proves that the CEE region is a breeding ground for talent of the absolute highest order. The HackerRank platform ranks Poland 3rd in the world in terms of programmer skills, ahead of countries such as Switzerland, Germany and France.

    Polish programmers are recognised as the best in the world in Java, and Czechs dominate in shell programming. Companies investing in CEE are therefore not making a trade-off between cost and quality – they are gaining access to world-class talent at a more sustainable price.

    Pulse of capital: ecosystem dynamics and resilience

    Venture capital (VC) flows are a seismograph for the innovation ecosystem. Analysis of the data shows that while Western Europe still dominates in terms of volume, it is CEE that shows greater dynamism and remarkable resilience to global slowdowns.

    The total value of businesses in the CEE startup ecosystem has increased 2.4 times since 2019, reaching €243 billion in the first quarter of 2025 – a growth rate almost double the average for Europe as a whole.

    What’s more, during the global slowdown in the VC market in 2023, when investments in Western Europe fell by 35%, the CEE region saw a decline of only 15%. In 2024 the market rebounded, recording growth of 56%. This ability to recover quickly suggests that the foundations of the CEE ecosystem are healthier and better adapted to changing conditions.
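    Compounding the 2023 dip with the 2024 rebound shows why the CEE trajectory looks so resilient. This is an illustrative calculation only, assuming the 56% rebound applies to the CEE market’s 2023 level:

    ```python
    # Net effect of the 2023 dip and 2024 rebound, indexed to 2022 = 100.
    # Assumes the 56% rebound applies to the CEE market's 2023 level.
    cee = 100 * (1 - 0.15) * (1 + 0.56)   # -15% in 2023, +56% in 2024
    west_2023 = 100 * (1 - 0.35)          # Western Europe after its 35% drop

    print(f"CEE index after 2024: {cee:.1f}")             # 132.6
    print(f"Western Europe index after 2023: {west_2023:.1f}")  # 65.0
    ```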

    The growing interest from global investors is due to a unique investment thesis for the region. CEE founders, in contrast to the ‘growth at all costs’ culture, are taking a more pragmatic approach, focusing on early revenue generation and capital efficiency.

    This, combined with a strong engineering background, fostering the emergence of deep tech companies, and a ‘global from day one’ mentality, makes for an extremely attractive model for investors looking for not only high returns, but also lower risk.

    Map of the giants: where global companies are locating their future

    Investment decisions by global technology giants are the strongest signal of the region’s strategic importance. Over the past decade, CEE has become an arena for spectacular investments.

    Google invested $2 billion to launch the Google Cloud region in Warsaw, followed by nearly $700 million in The Warsaw HUB office complex, which has become its largest cloud technology development centre in Europe.

    Microsoft has announced a $1 billion plan to create a ‘Polish Digital Valley’, with a cloud computing centre near Warsaw. Intel, in turn, has been developing its largest R&D centre in the EU in Gdansk, employing more than 3,000 engineers working on future technologies such as AI and machine learning.

    A key driver of this is access to world-class talent. Technology leaders know that in order to maintain an edge, they need to be present where engineers capable of delivering the most complex projects can be recruited.

    The presence of these giants creates a powerful flywheel effect: it raises standards in the labour market, creates ‘start-up mafias’ (experienced workers setting up their own companies) and acts as a global quality certificate for the entire region.

    Technological DNA: from monolith to specialisation

    As the CEE ecosystem matures, we are seeing the emergence of deep specialisations. Poland has made a name for itself as a global leader in video game production (Gamedev) and financial technologies (FinTech). With revenues in excess of €1.28 billion and an almost total export orientation (96-97%), Polish gamedev is a powerhouse driven by the success of companies such as CD Projekt. In parallel, with more than 300 startups, Poland has become one of the liveliest FinTech hubs in Europe.

    Romania, with a strong tradition in mathematics, has grown into a European cyber security powerhouse. It is where the globally recognised Bitdefender comes from, and the overall market is expected to grow at a rate of nearly 11% per year.

    The Czech Republic, on the other hand, with its rich history in engineering, has naturally become a leader in artificial intelligence (AI) and its applications in Industry 4.0. The country has world-class research institutions and already more than 11% of Czech companies are using AI technology. This diversification is a source of strength for the entire region and evidence of its growing maturity.

    Hunting unicorns: the ultimate measure of success

    The ability of the ecosystem to regularly generate ‘unicorns’ – companies with a valuation of more than $1 billion – is the ultimate proof of its maturity. Although Western Europe still leads in terms of absolute number (UK – 104, France – 34, Germany – 30), the CEE region has already generated a total of 52-57 unicorns, with Poland as the leader (18).

    However, the dynamics are key: more than half of all CEE unicorns were created in just the last two years (2022-2024), indicating a rapid acceleration. What’s more, the region’s unicorns often have their roots in deep technology (deep tech), such as Lithuania’s Nord Security or Poland’s ICEYE.

    They are also developing a ‘global hybrid’ model, as exemplified by ElevenLabs – a company founded by Poles, with a key R&D centre in Poland, but with offices in London and New York, allowing them to draw on the best talent in the country while having access to the largest capital markets.

    The verdict on the ‘technological tiger’

    The data clearly shows that the narrative of Central and Eastern Europe as merely a ‘cheaper hinterland’ is outdated. The region offers unparalleled talent value, its VC ecosystem exhibits anti-fragility characteristics, it has become a strategic R&D centre for global giants, it is developing deep specialisations and is an increasingly efficient unicorn factory.

    Although Western Europe still dominates in terms of scale and maturity, the heart of innovation – defined as the epicentre of dynamism, growth and resilience to crises – beats loudest and fastest today precisely in Central and Eastern Europe. ‘Europe’s technological tiger’ is no longer just a promise – it is a reality that can no longer be ignored.

  • Analysis of AI adoption in the EU: Why does the Scandinavian model win out over the Polish deployment rate?

    Analysis of AI adoption in the EU: Why does the Scandinavian model win out over the Polish deployment rate?

    In 2026, the global economy is in a phase of verifying its technological enthusiasm. After years of surging interest in generative language models, artificial intelligence has collided with an ‘implementation wall’. The use of AI in companies continues to grow, but the pace of real, deep implementations is much slower than markets and investors expected. 2026 is shaping up to be a year of disappointment, driven by the absence of immediate returns on investment, rising costs and mounting regulatory pressure.

    According to Eurostat data for 2025, the percentage of businesses in the European Union (with more than 10 employees) using AI has reached 19.95%. Although this represents an increase from 13.5% the year before, the headline figure masks massive stratification and the so-called ‘valley of death’ of AI projects. As the latest market analysis shows, only 33% of cognitive projects in large companies successfully move from pilot to full-scale production. What’s more, as many as 80% of companies that have deployed the new technology still see no measurable increase in productivity or impact on employment levels. Only 5% of pilot deployments currently generate business value in the millions.
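    The funnel implied by these figures can be made concrete with a hypothetical cohort of projects (a back-of-the-envelope sketch; it assumes the 33% and 5% shares apply to the same pool of pilots):

    ```python
    # Hypothetical cohort illustrating the AI pilot funnel cited above.
    # Assumption: the 33% and 5% shares refer to the same pool of pilots.
    pilots = 1_000
    in_production = int(pilots * 0.33)     # 33% cross from pilot to full-scale production
    generating_value = int(pilots * 0.05)  # only 5% deliver substantial business value
    stuck_in_valley = pilots - in_production
    print(in_production, generating_value, stuck_in_valley)  # 330 50 670
    ```

    In other words, out of every thousand pilots, roughly two-thirds never leave the ‘valley of death’.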

    In this complex context, the position of the Polish market appears highly worrying. Official structural adoption rates for Poland for 2025 stopped at 8.36% (Eurostat data) to 8.7% (CSO data), leaving the country at the tail end of Europe. The 2026 reports of the Polish Economic Institute (PIE) indicate unequivocally: as many as 77% of Polish entities not using AI declare that they do not intend to implement these technologies until they are absolutely forced to do so by the market or by law.

    AI adoption in Europe

    To fully illustrate the market phenomena described, it is necessary to analyse the market structure emerging from the hard Eurostat data. The gap between the countries at the top of the table and those at its bottom is eightfold. The Scandinavian countries have achieved success not by building their own capital-intensive foundation models, but through the agile integration of off-the-shelf external services. They create a highly receptive environment in which application innovation is a priority.

    With a rate of less than 9%, Poland is moving at a pace that cements its position on the periphery of Europe’s digital mainstream. This calls for reflection, as the consequences of this lag, combined with an ageing population, will directly affect the competitiveness of Polish exports.

    The Polish paradox

    A superficial comparison of nationwide indicators with data for the largest corporations may lead to misleading conclusions. While Poland performs extremely poorly in macroeconomic terms, there is a clear upturn in the segment of the largest companies. The research shows that 34% of Polish medium-sized and large companies have implemented their first AI-based solutions. As many as 75% of large entities declare the implementation of AI or advanced analytics projects, and 55% of them have or are currently building a formal strategy in this regard.

    So why are productivity rates stagnating? European business has fallen into the trap of the so-called ‘valley of death’. Initiatives most often end up in closed, secure Proof of Concept experiments. Companies face gigantic problems in scaling these solutions across the entire organisational architecture. The main reasons for this are: outdated internal data infrastructure, huge API costs when used en masse, and the difficulty of integrating algorithms into existing, archaic ERP or CRM systems. As a result, AI is often seen by boards as a curiosity driven by stock market investors’ expectations, rather than a tool to realistically reduce operational costs.

    BYOAI and Shadow AI

    The most fascinating and yet most dangerous trend of 2026 is the massive explosion of the BYOAI (Bring Your Own AI) phenomenon, closely related to the concept of ‘Shadow AI’. As boards debate strategies, compliance procedures and budgets, employees have taken matters into their own hands.

    Officially, many organisations prohibit or strongly restrict the use of public generative models for fear of data leakage. Unofficially – they are used by the majority of corporate employees. Analyses from 2026 show that nearly 78% of office workers who rely on artificial intelligence on a daily basis use their own private accounts and applications for this purpose, often without the knowledge of IT departments. Moreover, in Poland, as many as 80% of employees still do not have formal permission from their employer to use GenAI as part of their job duties.

    This trend has a twofold consequence. On the one hand, it demonstrates the gigantic need to optimise one’s own work. On the other, it generates a powerful risk of injecting sensitive financial data, source code or trade secrets into open neural networks. Boards of directors who, in 2026, continue to pretend that the Shadow AI phenomenon in their organisations does not exist, risk not only losing their digital sovereignty, but also severe penalties for breaches of confidentiality. The paradigm shift urgently requires a shift from categorical bans to building secure, corporate equivalents of popular tools (e.g. private instances of language models in the cloud).

    Where does AI actually work?

    An analysis of the types of artificial intelligence being deployed sheds light on a clear evolution of the market – from complex engineering algorithms to democratised, natural language-based consumer interfaces. Among European companies, the dominant applications are:

    • Written Language Analysis (Text Mining): The foundation of digitisation, used in the European Union for document categorisation, contract risk assessment and compliance automation.
    • Generative AI for multimedia and language: Text, image and code generation are the areas recording the highest growth rates. They mainly serve marketing, customer service and developers.
    • Back-office processes: This is where the real value is hidden. Companies see the greatest potential in process automation (AI-supported RPA) and improved predictive quality, which 48% of high-tech organisations are already using. Areas such as HR, logistics and finance are slowly catching up with front-office departments.

    Despite these applications, a staggering 56% of the companies surveyed said they had achieved only partial or no benefits from their implementations. This shows that simply subscribing to an AI assistant does not restructure work deeply enough to noticeably increase EBITDA ratios.

    Main barriers to development in 2026

    2026 precisely defines four key barriers holding back the European and Polish economies from entering the fully cognitive era.

    1. Legal Rubicon: Entry into force of the EU AI Act

    August 2026 marks a watershed for the European technology market. This is when the transparency rules and the provisions targeting high-risk artificial intelligence systems in the EU’s high-profile AI Act come into full force. Entrepreneurs are colliding en masse with the need for complex technology audits and risk classification of deployed systems. If a company’s algorithms (e.g. in credit scoring systems or automated CV selection) are classified as high-risk, the act mandates continuous human oversight, strict risk management procedures and rigorous record-keeping. Fear of enormous financial penalties and new management liability is paralysing investment decisions on many boards, which prefer to put innovation on hold in favour of so-called ‘legal compliance’.

    2. Competence gap

    The most serious operational bottleneck remains the talent deficit. As many as 69% of Polish organisations report serious difficulties in recruiting and retaining AI experts. The situation is dramatic even in the wealthy financial sector, where only 10% of entities report adequate staffing in this area. This is leading to a devastating war for talent. Salaries for experienced machine learning engineers in Poland in 2026 reach ceilings in the order of PLN 23,000 – 30,000 per month, recording annual increases of often 20% in the face of pressure from global corporations and remote working. For the traditional SME sector, these rates are completely prohibitive.

    3. Alarming demographic indicators and the generation gap

    Analysis of the structure of the use of cognitive tools exposes a painful truth about the Polish education system and young people’s entry into the labour market. European data from 2026 shows that only 49.3% of young Poles (aged 16-24) actively use generative artificial intelligence tools. Set against the EU average of 63.8% for this age group (and results of the order of 78-83% recorded in the Czech Republic, Estonia or Greece), Polish youth ranks among the last in Europe. This means that the Polish labour market will not be fed by a mass wave of “digital natives” proficient in automation, which, given an ageing society, heralds a deepening labour shortage.

    4. The illusion of immediate profit

    The final barrier is the aforementioned disappointment with financial performance. Businesses are discovering that the technology is not a magical remedy. The operational costs of the cloud, the processing of API queries and the need to restructure dirty historical data often outweigh the savings generated from reducing headcount or speeding up service.

    Given the market reality that the transformation has gone from a phase of joyful testing to a phase of hard cost-counting and regulatory adjustments, company boards should review their strategies. What is worth focusing on?

    Legalise and structure ‘Shadow AI’: Categorical bans on the use of external AI tools are a dead law, commonly broken by 80% of the workforce. It is worth considering buying access to secure, isolated corporate environments, making them available to staff with appropriate permissions and formally integrating them into office process flows.

    AI Act audit and compliance as a priority: With the enforcement requirements of the AI Act regulation (August 2026) looming large, every organisation needs to immediately catalogue the algorithms in its inventory. It is necessary to categorise them into appropriate risk profiles. Negligence in this area can result not only in sanctions, but also in the forced shutdown of key elements of e-commerce or HR infrastructure.

    Moving from point solutions to a data ecosystem: The success of Scandinavian companies proves that implementing AI ‘ad hoc’, without structuring the data architecture of the entire company, leads to burning through budgets in the ‘valley of death’. Data Lifecycle Management should be a priority for 2026-2027. An algorithm is only as smart as the clean, structured and integrated data on which it is based.

    Focusing on hard ROI rather than image innovation: The time of companies bragging about the mere implementation of any chatbot is irrevocably passing. New technology projects must have clearly defined financial KPIs even before the Proof of Concept phase begins. Investments should be channelled into back-office solutions (supply chain forecasting, audit automation and financial controlling), not just facade marketing.

    European business has reached a tipping point. The division into technological empires and digital provinces is becoming a reality. With demographics working against the labour market, cognitive automation, so far given a wide berth, is becoming the only survival strategy for thousands of companies in lagging markets.

  • Cyber-Paradox: Why does the exponential increase in cyber attacks not translate 1:1 into security industry revenues?

    Cyber-Paradox: Why does the exponential increase in cyber attacks not translate 1:1 into security industry revenues?

    Does fear really sell? Analysis of data from 2019-2024 debunks a popular industry myth. Although the number of cyber security incidents in Poland increased by more than 1,500% during this period, the market for security services and solutions grew by “only” around 120%. What explains this gap, and why does the SME sector still remain a ‘no-man’s land’ for integrators?

    For years, the commercial narrative of the IT industry has been dominated by a simple logic: the more threats, the more customers spend on protection. The market reality of the last five years, however, shows that this correlation is much weaker than it might seem. There is an unprecedented asymmetry – while the threat curve is climbing exponentially, the revenue curve of technology providers is growing at a linear rate, stable but far from explosive.

    Battle landscape: Escalation of 1,500 per cent

    To understand the scale of the disparity, we need to look at hard data on the ‘supply’ of threats. Statistics from CERT Polska (NASK) over the last five years paint a picture of a digital battlefield that has been completely transformed.

    Back in 2019, considered the last year of the ‘old era’, CERT Polska recorded 6,484 security incidents. Even then, there was talk of records. However, the real shock came in 2020 and the pandemic, when the number topped 10,000. This was just the beginning.

    The following years brought a snowball effect. In 2021, nearly 30,000 incidents were recorded, and 2023 closed with more than 80,000. Preliminary estimates and communications for 2024 show a further dramatic increase, with the number of incidents exceeding 100,000 (an average of 300 per day).

    The mathematics are inexorable: in five years, the volume of successful attacks and incidents has increased by nearly 1500%. If the market had reacted directly in proportion, the cybersecurity industry should be the largest sector of the digital economy today. However, this is not the case.

    Market: Solid growth, but no euphoria

    Juxtaposing these figures with the financial performance of the cybersecurity sector reveals a fundamental ‘divergence’ (decoupling). According to analyses by the research company PMR, the value of the cyber security market in Poland in 2018 was PLN 1.14 billion. Forecasts for 2024 oscillated around PLN 2.5-3 billion.

    This means that at the same time as the number of attacks has increased fifteenfold, the market has grown by around 120-140%. This is a very good result compared to other IT branches, but it clearly shows that the elasticity of demand for security is low. Each additional thousand attacks generates a relatively small increase in new budgets.
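    A quick back-of-the-envelope check of this decoupling, using the CERT Polska and PMR figures quoted above (the 2024 market value is taken as the midpoint of the PLN 2.5-3 billion forecast):

    ```python
    # Growth of incidents vs growth of the security market, per the figures above
    incidents_2019, incidents_2024 = 6_484, 100_000
    market_2018, market_2024 = 1.14, 2.75  # PLN bn; 2.75 = midpoint of the forecast range

    def growth_pct(old: float, new: float) -> float:
        return (new - old) / old * 100

    print(f"Incidents: +{growth_pct(incidents_2019, incidents_2024):.0f}%")  # +1442%
    print(f"Market:    +{growth_pct(market_2018, market_2024):.0f}%")        # +141%
    ```

    The threat curve grows by an order of magnitude more than the spending curve, which is the asymmetry the article describes.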

    Eurostat data (NACE classification 62.09) and the services price indices (PPI) confirm this trend – there is a steady increase in turnover, but there is no question of a jump to match the scale of the risks.

    Diagnosis: Why are SMEs not buying security?

    The key to solving this conundrum is the structure of the Polish economy. While the banking sector and large corporations (Enterprise) invest adequately for the risk, the SME sector – which is the backbone of the economy – lags behind. This phenomenon can be called the ‘investment gap’.

    1. Financial barrier and microbudgets

    The average annual expenditure on cyber security in SME companies is only PLN 24,000. When juxtaposed with the cost of modern SIEM, EDR or specialist salaries, this amount is a drop in the ocean of needs. It allows the purchase of basic licences, but not to build real resilience.

    2. Awareness vs. practice

    ESET and Dagma’s 2024 research is alarming: as many as 41% of Polish companies do not even use anti-virus software. Despite the fact that 87% of companies consider digitalisation to be crucial and 88% have experienced an incident in the last 5 years, a ‘somehow it will be done’ attitude still persists.

    3. Technology debt and Shadow IT

    Many companies migrate to the cloud (SaaS), mistakenly assuming that security is included in the price of an office suite subscription. These expenses are often not classified as ‘cybersecurity’, which understates the market statistics, but it also lulls businesses into complacency, discouraging investment in additional layers of protection (backup, training).

    What really drives the market?

    Analysis of the data leads to the conclusion that the number of attacks is not the main driver of sales. Polish companies are reactive rather than preventive. The real ‘engine’ of spending growth lies in two other factors:

    • Paralysing incident (ransomware): Only an attack that encrypts data and stops production opens the board’s wallet. Minor incidents (spam, phishing) are ignored.
    • Regulation (Compliance): The spike in reported incidents in 2020 coincided with the implementation of the KSC Act. The market is now waiting for the effect of the NIS2 directive. It is the threat of administrative penalties (up to 2% of turnover), not hackers, that will force thousands of entities to make real investments between 2025 and 2026.

    The future belongs to companies that will offer security as a scalable service (Managed Security Services), taking the burden of hiring expensive experts off the customer’s shoulders, and to those that combine technology with legal support for NIS2 requirements in their offerings. This is the only way to bridge the gap between the growing graph of attacks and the flat graph of spending.

  • West looks East: Is CEE becoming Europe’s new IT logistics hub?

    West looks East: Is CEE becoming Europe’s new IT logistics hub?

    The upheaval in global supply chains from 2020-2024 has forced the tech giants to redefine their strategies. In this new dispensation, Central and Eastern Europe (CEE) is no longer merely a ‘cheaper alternative’ to Germany. Thanks to a combination of advanced warehousing infrastructure, growing manufacturing competence and strategic location, the region is emerging as a key distribution hub for the IT and electronics sector. What does this mean for the Polish IT industry?

    End of the “Just-in-Time” era, beginning of the “Just-in-Case” era

    Until a decade ago, the model was simple: electronics produced in Asia sailed by container ship to the ports of Rotterdam or Hamburg, and from there they were sent to distribution centres in Western Europe. The CEE region played a peripheral role. Today, this model is becoming a thing of the past. The pandemic, the blockades of the Suez Canal and the war in Ukraine have exposed the fragility of long supply chains. The corporate response is nearshoring – moving production and logistics closer to markets.

    The data is clear: around half of the companies operating in Europe have started to decentralise their supply chain, and CEE is the main beneficiary of this trend. It is no longer just about cheap labour. It is about resilience. For an IT distributor, not having goods on the shelf during the ‘peak season’ (Q4) means measurable losses. Warehousing goods in Poland or the Czech Republic, from where it takes a few hours to get to Berlin, has become a safety net for Western European business.

    Hard data: Poland the warehouse of Europe

    Poland has grown into the undisputed logistics leader in the region, with 2024 figures confirming its dominance over many Western markets. In the first half of 2024, Poland recorded the highest warehouse rental volume in Europe, even overtaking the German market.

    Cost arbitrage remains a key factor for the IT industry, operating on low distribution margins. An analysis of prime rents in Q3 2024 shows a gap between CEE and the West:

    • Munich: approx. EUR 10.80/m²
    • Warsaw (surroundings): approx. EUR 5.25/m²
    • Bucharest: approx. EUR 4.70/m²

    For a company from the SME sector or a large electronics distributor (such as TD Synnex or Ingram Micro), maintaining a distribution centre near Wrocław or Poznań costs roughly half as much as an equivalent in Germany, while maintaining the A-class standard. Moreover, modern warehouses in CEE are often younger and better adapted to automation and ESG requirements (BREEAM/LEED certificates) than older facilities in the West.
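    The cost arbitrage can be read straight off the prime-rent figures above, taking Munich as the baseline:

    ```python
    # Prime warehouse rents, Q3 2024 (EUR per m², figures quoted in the text)
    rents = {"Munich": 10.80, "Warsaw (surroundings)": 5.25, "Bucharest": 4.70}
    base = rents["Munich"]
    for city, rent in rents.items():
        print(f"{city}: {rent / base:.0%} of the Munich rate")
    ```

    Warsaw comes out at roughly 49% of the Munich rate and Bucharest at about 44%, which is where the ‘half the cost’ claim comes from.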

    Not just a warehouse, but a factory

    The myth of CEE as exclusively an ‘assembler’ collides with the data on high-tech exports. Hungary, the Czech Republic and Poland have transformed themselves from technology importers to net exporters.

    • Hungary has become a regional powerhouse in electronics manufacturing, with exports of electrical and electronic equipment accounting for more than 23% of total exports there.
    • Lenovo in Hungary: The Üllő plant is an example of what nearshoring looks like in practice. This plant has already produced more than 1.5 million servers and workstations, serving the EMEA markets (Europe, Middle East, Africa). Reducing delivery times to customers in Europe from several weeks (shipping from China) to a few days is a competitive advantage that cannot be ignored.

    In the case of Poland, exports of high-tech products have approached USD 30 billion. Poland plays a key role in so-called fulfilment for e-commerce. Giants such as Amazon and Zalando have located their centres in Poland not only because of costs, but because of the possibility of servicing the entire German and Scandinavian market from here. More than 34% of demand for warehouses in Poland in 2024 was generated by the e-commerce and retail sector, which correlates directly with sales of electronics.

    Risks: Lessons from Intel and labour market challenges

    However, the image of the region as the ‘Promised Land’ has its cracks. The loudest echo in the industry was the suspension (and de facto cancellation in its current form) of Intel’s investment near Wrocław in the Semiconductor Integration and Testing Facility. This decision, resulting from the corporation’s global financial problems, shows that basing a development strategy on a single giant investor can sometimes be risky. Nonetheless, the logistics sector has ‘absorbed’ the news without breaking down. The land prepared for Intel’s investment remains an attractive asset and demand from other players is not waning.

    For Polish IT SMEs, the labour market is becoming a challenge. Although labour costs in Poland are still 2-3 times lower than in Germany, the availability of skilled workers is decreasing. The answer is automation. Implementing systems such as AutoStore or AMR robots in warehouses (as Zalando or Ingram Micro are doing, for example) is becoming a necessity rather than a gadget. This is an opportunity for Polish IT system integrators who can offer these solutions to logistics.

    Polish IT SMEs do not need to build their own warehouses. The CEE region offers the most modern fulfilment infrastructure in Europe. The use of logistics operators (3PLs) in Poland makes it possible to compete with Western companies on delivery times, with a lower cost base.

    • Supplier diversification: Proximity to factories in Hungary (Lenovo, Samsung) or the Czech Republic (Foxconn) means that Polish distributors and resellers can rely on shorter supply chains. It is worth reviewing contracts and looking for manufacturing partners within the CEE region, rather than relying solely on imports from Asia.
    • Investment in technology: As logistics in CEE is moving towards automation, IT companies should focus on providing solutions to support this trend: from WMS (Warehouse Management Systems), to IoT for shipment monitoring, to cyber security for industrial infrastructure.

    Is CEE the new IT logistics hub for Europe? The answer is yes, but in a new formula. We will not replace Rotterdam as a port of entry, but we have become a ‘buffer warehouse and last mile factory’ for Europe. Poland (distribution), Hungary (manufacturing) and the Czech Republic (high-tech) form a complementary ecosystem. For the IT industry, this means more stable supply and lower operating costs. The winners will be those who understand that logistics in 2025 is not just about pallet transport, but about data and time management – and in this, the CEE region is starting to win over Western competitors.

  • Two-speed digitalisation. Where does Polish business place billions of zlotys on IT?

    Two-speed digitalisation. Where does Polish business place billions of zlotys on IT?

    In 2023, investment expenditures in the Polish economy exceeded PLN 461 billion. Although nominally a record, a deeper analysis of the CSO data reveals a worrying trend of polarisation. We are facing a two-speed economy: while the financial sector is aggressively deploying AI and the cloud, industry is approaching digitalisation with great reserve and SMEs are still struggling with the basics. Where does the money for the IT industry lie in this landscape?

    Digital transformation in Poland is no longer a homogeneous process. Just three years ago, at the height of the pandemic, everyone and everything was digitalising. The figures for 2023 and the first half of 2024 show a paradigm shift: the time for simple hardware purchases is over and the stage of efficiency verification has begun. For technology providers and decision-makers in SME companies, the lessons from the hard data are clear – the market is maturing, but unevenly.

    Spending map: Finance races ahead, industry stabilises

    When analysing the flow of money into the market, it is important to distinguish between two key categories: growth rate (who is accelerating?) and investment volume (who is spending the most?).

    The financial and insurance sector is the clear leader of the transformation. According to the Central Statistical Office (CSO), the investment expenditure index in this section stood at 137.1 (real growth of more than 37% year-on-year). Banks and insurers, with the highest digital maturity index (6.2 points on a 10-point scale according to KPMG), are pulling ahead. Their investments are no longer just in infrastructure, but increasingly in ‘soft’ technologies: cyber security, hyper-personalisation of the offering and advanced data analytics.

    At the other extreme in terms of dynamics, but first in terms of volume, is Industrial Processing. It is here that more than PLN 98.5 billion in capital expenditure was located. However, an index of 103.9 means, once producer inflation is taken into account, de facto stagnation. The industry is investing heavily, but still mainly in ‘hard’ machinery and production lines. Digitalisation in this sector is insular – it is implemented where it directly reduces the cost of manufacturing a product (robotisation) and less frequently where it builds new value (AI, Data Driven Manufacturing).
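    For readers less familiar with CSO conventions: the office publishes dynamics as an index with the prior year equal to 100, so real year-on-year growth in per cent is simply the index minus 100. A minimal illustration:

    ```python
    # CSO dynamics index: prior year = 100, so growth (%) = index - 100
    def index_to_growth_pct(index: float) -> float:
        return index - 100.0

    print(f"{index_to_growth_pct(137.1):.1f}%")  # 37.1% - finance & insurance
    print(f"{index_to_growth_pct(103.9):.1f}%")  # 3.9%  - Industrial Processing, near-stagnation
    ```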

    ICT sector paradox: Why is investment falling?

    The most surprising data in the CSO reports is the decline in investment dynamics in the ‘Information and Communication’ section alone (a fall of around 15% year-on-year in real terms). Does this indicate a crisis in the technology industry?

    On the contrary. This signals a fundamental change in the business model that the IT industry needs to understand. Technology companies (software houses, service providers) are migrating fastest from the CapEx model (investment in fixed assets, e.g. own server rooms) to the OpEx model (operating costs, e.g. cloud). Instead of buying servers (which the CSO sees as an investment), companies are renting computing power from hyperscalers (which is not shown in this column). This is confirmed by the dynamic growth of the cloud market in Poland – by 34% year-on-year, to nearly PLN 4 billion. The decline in IT ‘investment’ is therefore in fact evidence of… growing technological maturity of this sector.

    Verifying the myths: AI and the cloud in practice

    If we look beyond finances, to the real use of technology, the picture of Polish digitalisation becomes more stark.

    1. Cloud computing: 45.2% of companies in Poland were using it in 2023. The result seems decent until we look deeper into the statistics. The dominant use is email (82.7% of cloud users) and file storage. Advanced uses, such as cloud databases (43%) or computing power for applications (25%), are still the domain of the big players.

    2. AI: Here the gap between declarations and reality is the largest. Although market reports suggest that as many as 83% of industrial companies are planning to invest in AI, hard data from Eurostat and the Central Statistical Office bring them down to earth. In 2023, only 5.9% of Polish companies actually used artificial intelligence technologies. This puts Poland among the last places in Europe (with an EU average of around 8% and leaders such as Denmark at 27.6%).

    SMEs in a technological debt trap

    A key problem in the Polish IT market is the barrier to entry for the SME sector. While large corporations (250+ employees) have cloud adoption rates of 78%, the SME sector is only 44%.

    The main barrier is no longer just a lack of capital, but a lack of strategy. As many as 51% of small businesses operate without a formalised digital transformation plan. As a result, IT investment in SMEs is often haphazard, forced by failures or regulations (e.g. KSeF), rather than a desire to build competitive advantage. An additional inhibitor is data waste – in industry, up to 56% of operational data is not used analytically in any way.

  • The end of the ‘sell and forget’ era. How the subscription economy is quietly redefining the IT channel

    The end of the ‘sell and forget’ era. How the subscription economy is quietly redefining the IT channel

    Nominal revenue growth in the IT industry is no longer the only determinant of success. Between 2023 and 2025, we are seeing a fundamental change in Poland and Europe: money in the distribution and partner channel is changing its nature. Instead of one-off cash injections (transactions), the market is shifting towards streams (subscriptions). For distributors this is an accounting challenge, for resellers a painful passage through the ‘valley of death’ in cash flow, but for those who survive, the reward is company valuations that are up to three times higher.

    The transformation from CapEx (capital expenditure) to OpEx (operating costs) is no longer just the domain of SaaS start-ups, but a hard reality for the traditional sales channel – from distribution giants to local MSP integrators (Managed Service Providers).

    Invisible turnover: lessons from the global giants

    To understand what is happening in Poland, you need to look at the performance of global leaders who are setting trends. The traditional ‘Revenue’ line in financial statements has become confusing in the cloud era. Why? Because subscription sales (e.g. Microsoft 365, AWS, Cyber-as-a-Service) are often accounted for on a ‘net’ agency model – the distributor only shows its margin, not the full invoice value.

    This is perfectly illustrated by TD SYNNEX’s Q3 results for fiscal 2025. Reported revenues amounted to USD 15.7 billion (+6.6% year-on-year), but so-called Gross Billings (the real value of technology sold) reached a staggering USD 22.7 billion, increasing by 12.1%.

    This difference – amounting to $7 billion in a single quarter – is the ‘invisible’ mass of turnover coming from cloud services and subscriptions in a standard P&L account. For the Polish market, the lesson here is clear: if your revenues are stagnant, but the number of supported licences and service contracts is growing, then your business is actually growing faster than the simple P&L calculation shows.
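    The arithmetic behind that ‘invisible’ mass of turnover is simple; the sketch below uses the TD SYNNEX figures quoted above:

    ```python
    # Gross billings vs reported revenue under the agency ("net") model
    gross_billings = 22.7  # USD bn: full value of technology sold in the quarter
    net_revenue = 15.7     # USD bn: reported revenue (margin only on subscription lines)

    invisible = gross_billings - net_revenue
    print(f"Turnover not visible in the P&L: USD {invisible:.1f} bn")  # USD 7.0 bn
    ```

    This is why reported revenue growth (+6.6%) understates the real growth in volume (+12.1%): for subscription lines only the distributor’s margin hits the revenue line.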

    The Polish specifics: infrastructure ready, people not

    The growth dynamics of the as-a-Service model in Poland have a unique, local driver: the gap between infrastructure and competence. On the one hand, Poland has fibre-optic coverage above the EU average – a technological highway for cloud services. On the other, Poland’s basic digital skills rate is only 44.3% (against an EU average of 55.6%).

    For the partner channel (Resellers/MSPs), this is an ideal situation. Polish SMEs have the connectivity to use the cloud, but do not have the people to manage it. This forces outsourcing. According to PMR forecasts, the cloud market in Poland was expected to grow by 24% in 2024, reaching a value of PLN 4.8 billion. Importantly, Polish companies are still at the ‘Cloud 1.0’ stage (mail, storage), while the adoption of advanced services (AI, analytics) is only 3.7% compared to 8% in the EU. This shows a gigantic potential for upselling in the coming years.

    AB S.A. and the flight forward

    On the Polish market, this transformation is perfectly illustrated by the AB S.A. Group. In the 2023/2024 financial year, the group’s revenues amounted to PLN 14.7 billion (a slight decrease, attributed to exchange rate differences), but profitability is key: the profit margin on sales reached a record 4.1%.

    This is evidence of a shift in product mix. AB S.A. is shifting the weight of the business from simple ‘box shifting’ to VAD (Value Added Distribution) – advanced server solutions, cyber security and cloud. The company has reduced its net financial debt by three quarters, building a powerful financial cushion. In a subscription model, where payments are staggered, it is the strong balance sheet and low-cost financing that become the distributor’s main competitive advantage over smaller players.

    | Year | Non-recurring revenues (hardware / perpetual licences) | Recurring revenues (subscriptions / MSP / cloud) | Trend commentary |
    | --- | --- | --- | --- |
    | 2021 | 80% | 20% | Dominance of the transactional model (traditional reseller). |
    | 2023 | 65% | 35% | Post-pandemic acceleration, cloud growth (PMR data). |
    | 2024 | 55% | 45% | ‘Gross Billings’ effect – growth in managed services and security. |
    | 2025 (forecast) | 45% | 55% | Crossover point (AI, DaaS, NIS2). |

    The holy grail: own IP and recurring revenue

    At the other end of the market are companies that have already completed the transformation. Asseco Poland reported in 2024 that as much as 79% of revenue (approximately PLN 13.5 billion) comes from its own software and services. Such a revenue structure, based on licences, maintenance and subscriptions, is the goal pursued by the modern IT channel. It ensures predictability and resilience to economic fluctuations, as can be seen from the steady growth of Asseco’s net profit (+8% year-on-year).

    MSP and the ‘Valley of Death’: why suffer?

    For smaller IT companies (integrators, resellers), moving to a subscription model involves the risk of the so-called ‘Valley of Death’ in cash flow (Cash Flow J-Curve). Instead of a one-off invoice for PLN 50,000 for the implementation, the company receives, for example, PLN 2,000 per month. This means a drastic drop in cash inflow in the first year, even though the customer value (LTV) is higher in a three-year perspective.
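    The J-curve described above is easy to quantify. The sketch below uses the article’s example figures (a PLN 50,000 one-off invoice versus PLN 2,000 per month) to find where cumulative subscription income catches up:

    ```python
    # Example figures from the text: one-off implementation vs subscription.
    ONE_OFF_PLN = 50_000      # single invoice in the transactional model
    MONTHLY_PLN = 2_000       # subscription fee in the recurring model

    def breakeven_month(one_off: int = ONE_OFF_PLN, monthly: int = MONTHLY_PLN) -> int:
        """First month in which cumulative subscription income
        matches the one-off invoice."""
        month = 0
        while month * monthly < one_off:
            month += 1
        return month

    print(breakeven_month())       # → 25: over two years spent in the 'valley'
    print(36 * MONTHLY_PLN)        # → 72000: the 3-year LTV beats the one-off deal
    ```

    Until month 25 the subscription model delivers strictly less cash than the one-off sale, and that shortfall is exactly the ‘valley of death’ a smaller integrator has to finance.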

    So why do companies decide to take this step? The answer lies in the valuation of the business.

    • A traditional IT company (break-fix/non-recurring sales model) is valued by the market at 2.6x–4.8x EBITDA.
    • A mature MSP with a high proportion of recurring revenue achieves multiples of 8x–10x EBITDA, and even more for large-scale platforms.

    Investors pay a ‘peace of mind premium’ – a predictable revenue stream is worth far more than a one-off sales shot, even a high one.

    Forecast: What will drive the market in 2025?

    The data points to three catalysts for service share growth in 2025:

    1. AI PCs and the end of Windows 10: PC fleet replacement (predicted by AB S.A. and IDC) is an opportunity to sell hardware in a DaaS (Device-as-a-Service) model with AI management services.
    2. Cyber Security (NIS2): The EU directive will force thousands of companies in Poland to professionalise security. SMEs cannot afford their own SOC (Security Operations Centre), so they will buy security as a subscription service from MSPs.
    3. Platformisation of distribution: Tools such as Ingram Micro Xvantage or AB S.A.’s cloud platforms automate the subscription renewal process, making revenue more ‘sticky’ and cheaper to handle.

    The Polish IT industry is at a turning point. Although, by volume, hardware sales still account for a large proportion of turnover (as can be seen from distributors’ results), margins and enterprise value are migrating towards recurring services. Analysis of the data shows unequivocally: whoever does not have a significant share of recurring revenues on their balance sheet in 2025 will lose relevance and market value, no matter how much nominal turnover they generate.

  • Professional burnout in IT. How to combat it? – Risk analysis for key specialisations

    Professional burnout in IT. How to combat it? – Risk analysis for key specialisations

    Professional burnout in the tech industry has ceased to be a taboo subject and has become a systemic crisis that threatens the foundations of innovation and the stability of the entire sector. It is no longer a matter of individual fatigue, but a silent epidemic of alarming proportions.

    The research is clear: almost three-quarters (73%) of developers have experienced burnout at some point in their career. Earlier studies, such as the 2021 Haystack Analytics report, indicated an even higher percentage, reaching 83%.

    The problem is just as acute in the Polish market, where symptoms of burnout affect as many as 70% of IT workers to varying degrees, with up to 42.1% of respondents at high risk.

    When such a vast majority of the population experiences a negative phenomenon, it ceases to be an individual problem and becomes a systemic challenge. This normalisation leads to a vicious circle: new employees come to see burnout as the price of admission to the industry, while companies under-invest in prevention.

    To add weight to this discussion, it should be noted that occupational burnout has been officially recognised by the World Health Organisation (WHO) and included in the ICD-11 International Classification of Diseases as an occupational syndrome.

    Anatomy of burnout: The 3 dimensions of crisis

    To effectively address burnout, it is essential to understand its nature. The WHO defines it as ‘a syndrome resulting from chronic workplace stress that has not been effectively managed’. It is a phenomenon closely related to the work context.

    The most influential model describing this syndrome, the Maslach Burnout Inventory (MBI), distinguishes three fundamental dimensions.

    1. Emotional exhaustion: This is the central element of the syndrome, characterised by a feeling of complete depletion of energy resources. In the IT context, it manifests itself with thoughts such as: “I’m exhausted at the mere thought of starting another sprint” or “The morning status meeting is sucking all the energy out of me for the rest of the day”. It’s a sense of emptiness that doesn’t go away after the weekend.
    2. Cynicism and depersonalisation: This dimension describes a growing mental distance from work, accompanied by a negative or cynical attitude. In the IT world, cynicism can sound like: “Why make the effort and write clean code when this feature will be removed in six months anyway?”. Depersonalisation is treating colleagues and customers as anonymous objects, leading to a loss of empathy.
    3. Reduced sense of professional efficacy: The third pillar is a sense of lack of competence and personal achievement. The employee begins to evaluate his or her effectiveness negatively and feels that his or her contribution does not matter. In the IT industry, where progress is crucial, this dimension is particularly painful and can take the form of imposter syndrome, where even experienced technical leaders feel they are underperforming.

    These three dimensions form a vicious circle. Usually it all starts with emotional exhaustion. To cope with it, the individual develops cynicism as a defence mechanism. However, this detachment leads to a decline in the quality of work, which becomes fuel for the third dimension: a reduced sense of efficacy.

    The employee now has objective evidence that they are ‘failing’, which in turn exacerbates their exhaustion, closing a destructive cycle.

    Battlefield analysis: risk factors in key IT roles

    Each speciality has a unique ‘risk profile’ that shapes the day-to-day experience of professionals.

    DevOps and SRE: engineers on constant call

    DevOps and SRE specialists are the backbone of modern systems, but the role comes with a huge workload. The main stressor is the 24/7 work culture and ubiquitous on-call duties that blur the boundaries between work and personal life.

    Another factor is the enormous complexity and fragmentation of tools – engineers manage an ‘endless jigsaw’ of technologies such as Terraform, Kubernetes and Jenkins. Added to this is the constant context switching, which can reduce productivity by up to 40%.

    The burnout mechanism here is driven by chronic stress resulting from hypervigilance and cognitive overload.

    Cyber security: watchdogs on constant alert

    Cyber security professionals operate in an environment with zero tolerance for error. A unique stressor is so-called ‘alert fatigue’ – exhaustion caused by a constant barrage of alerts, most of which are false alarms.

    The pressure is immense, as one mistake can lead to catastrophic losses. The situation is exacerbated by a global shortage of staff, meaning that existing teams are permanently overstretched.

    This creates a sense of an asymmetrical struggle: the defenders must secure everything, while the attackers need only one success. Burnout here is the result of operating in a long-term ‘fight or flight’ mode.

    Project manager: conductor of an orchestra of chaos

    PMs function at the intersection of different interests: the team, the client and management. The main source of stress is managing multiple projects at the same time, with more than half (52%) of PMs managing between two and five projects at once.

    Equally taxing is emotional labour, i.e. absorbing frustrations and managing expectations. The constant need to make decisions leads to decision fatigue.

    The key element here is the disconnect between the enormous responsibility and the limited control over budgets, resources or changing requirements.

    Developer (Frontend/Backend): between creativity and technological debt

    The role of a software developer is saturated with unique stressors. One is the constant learning curve in the face of rapidly evolving technologies. But the strongest demotivator is the sense of wasted effort when weeks of work on functionality are rejected by the product department.

    Added to this is the high cognitive load associated with working on complex legacy code. The burnout mechanism is fuelled by the frustration of feeling that the work is more about fighting the system than creating real value.

    QA / Tester: the quality guardian at the end of the chain

    The role of the quality specialist is often underestimated, yet under enormous pressure. The biggest stressor is the position at the very end of the software development cycle. Any delays accumulate, drastically reducing the time for testing.

    The role also suffers from a sense of undervaluation – the tester’s work remains ‘invisible’ as long as everything works, but when a bug makes its way into production, the QA department is most often blamed. Burnout in this role is the result of a toxic mix of high pressure, huge responsibility and low recognition.

    The common denominator for the roles most at risk of burnout is a fundamental imbalance between responsibility and control. The DevOps engineer is responsible for the stability of the system, but does not always have control over the quality of the code being deployed.

    The Project Manager is responsible for the delivery of the project, but does not fully control the resources. The tester is responsible for quality but has no control over the schedule. This mismatch between responsibility and control is one of the most powerful chronic stressors in the workplace.

    How do you build resilience against burnout?

    Combating burnout requires a two-pronged approach: strengthening individual resilience and, more importantly, fundamental organisational change.

    • Conscious boundary-setting: It’s more than ‘work-life balance’. It’s about actively managing your own energy, scheduling blocks for deep work and regular digital detoxes. In doing so, it is important to remember that although 70% of developers code at weekends for pleasure, this can lead to blurred boundaries and prevent full recovery.
    • Managing cognitive load: Methods such as the Pomodoro Technique or the conscious reduction of context switching can reduce daily overload.
    • Normalising seeking support: Open communication with a supervisor and the use of psychological support are signs of maturity, not weakness.
    • Build a culture of psychological safety: An environment must be created where employees can talk openly about problems without fear of punishment. Leaders need to model healthy behaviour – taking leave and respecting working hours.
    • Process and tool optimisation: Investments in simplifying the tool chain (DevOps), automating testing (QA) and systematically reducing technical debt (Developers) are direct attacks on sources of stress.
    • Ensuring a balance between responsibility and control: The level of responsibility must go hand in hand with an appropriate level of autonomy and control over one’s own work. Regular feedback helps to correct any imbalances.
    • Actively promote regeneration: Companies should realistically encourage the use of holidays (and not contact employees during them) and offer flexible working arrangements.

    Professional burnout is a signal that legacy working models in IT have reached their limits. It is time to stop treating employees’ mental health as a cost and start seeing it as a key investment in the most important resource the technology industry has: human talent, creativity and a passion to create.

  • Is the IT industry getting older? Age structure of ICT professionals in Poland and the EU

    Is the IT industry getting older? Age structure of ICT professionals in Poland and the EU

    In the business narrative, the technology industry is synonymous with youth, dynamism and a constant influx of new talent. The common perception of the IT sector as the domain of people under 30 is so strong that the question of its ageing sounds almost like heresy. However, the latest Eurostat data sheds a whole new light on this picture, revealing a fundamental paradox that has direct implications for Polish companies, especially in the SME sector.

    Analysis of data from the last decade (2014-2024) brings a surprising conclusion: the European IT sector as a whole is not ageing at all. On the contrary, it shows remarkable demographic stability. In 2024, the share of ICT (information and communication technology) professionals aged 35 and over across the European Union was 62.8%. A decade earlier, the figure was marginally higher at 62.9%. However, this almost zero change (-0.1 percentage points) is not a sign of stagnation, but of a dynamic equilibrium.

    This stability is the result of two powerful forces clashing. On the one hand, the ICT industry is growing at a rate unmatched by the rest of the economy. Between 2014 and 2024, the number of ICT professionals employed in the EU increased by as much as 62.2%, while total employment in the EU grew by only 10.6%. Such massive growth generates a constant demand for new staff, met mainly by graduates, which naturally rejuvenates the sector. On the other hand, the powerful base of professionals hired a decade or two ago is naturally maturing, moving into older age groups. At an EU-wide level, these two trends are almost perfectly balanced. The IT sector in Europe is therefore a mature market – where almost two-thirds of the workforce is over 35 – but thanks to expansion, as a whole, it is not ageing.

    However, this pan-European balance masks ‘contrasting national trends’, and Poland is one of the clearest examples of this divergence. While the share of ICT professionals aged 35+ was essentially flat in the EU, in Poland it increased by as much as 10.4 percentage points over the same period. This places Poland among the countries with the fastest maturing technology workforce, alongside Slovakia (+18.6 p.p.) and Romania (+13.4 p.p.).

    This does not mean that the Polish sector is ‘old’, but that it is maturing at a rapid pace. There are several factors behind this phenomenon. Firstly, it is the effect of a maturing market. The generations that fed into the industry en masse during the recruitment boom of the 2000s and 2010s are now collectively crossing the thresholds of 35 and 45 years of age. Secondly, this trend is reinforced by the country’s unfavourable demographics: Poland is a society that is ageing rapidly, which translates into a shrinking pool of young talent across the labour market. Finally, the market itself has started to favour experience – the observed slowdown in the recruitment of juniors goes hand in hand with growing demand for experts, including in the context of the development of AI.

    As a result, the Polish IT sector is losing its former ‘youth’ advantage. The age structure of the industry, visualised as a pyramid, would show a clear ‘bulge’ in the 25-34 and 35-44 age cohorts, which contrasts with the much older and flatter structure of the total workforce in the Polish economy.

    For Polish SME companies, this rapid demographic shift creates a fundamental conflict: a clash between cultural stereotypes and market reality. The organisational culture of many technology companies, both globally and in Poland, is steeped in ageism, or age discrimination. Stereotypes perceiving older workers as “less adaptable” or “unwilling to learn” are still common. Polish studies show glaring disparities: candidates aged 28 receive twice as many invitations to interviews as people over 50, with identical qualifications.

    With demographics forcing a reliance on older staff, this cultural ageism is becoming a strategic mistake. Companies that continue to use platitudes about a ‘young, dynamic team’ in their recruitment communications are actively scaring away the largest and most experienced talent pool in the market. In an era when more than half of EU businesses report difficulty filling ICT vacancies, this is a business barrier that raises recruitment costs and increases turnover.

    Adapting to this new reality requires SMEs to revise their strategy immediately. Competing for a growing pool of experienced experts cannot be based on salary alone. It becomes necessary to implement conscious Age Management strategies. This means focusing on areas critical for a mature workforce. Firstly, on the strategic transfer of knowledge. Experienced specialists (50+) are often the ‘first generation’ of Polish IT, possessing invaluable knowledge of key systems. Their departure without a succession plan, e.g. through mentoring or workshops, means an irrevocable loss of critical ‘know-how’. Secondly, SMEs can gain an advantage by offering values prized by seniors: employment stability, real flexibility and attention to work-life balance.

    Redefining development and upskilling is becoming equally important. Investing in the development of a senior looks different to that of a junior. It is no longer just about learning a new framework, but about developing competences that multiply the value of their experience. Companies need to create formal technical development paths (e.g. Staff/Principal Engineer) that allow for promotion and salary increases without moving to a managerial track. This is development towards technical leadership, systems architecture and taking full responsibility for a product area.

    Finally, this internal strategy must be reflected in the company’s external communication, i.e. in its Employer Branding. It is time to audit recruitment language and eliminate ageist language from it. Instead, communications should consciously showcase and celebrate experienced employees as mentors and innovation leaders.

    The question “is the IT industry ageing?” has a complex answer: in Europe – no, in Poland – yes, and rapidly. The Polish IT sector has ceased to be a “young industry” and has become a mature market. The competitiveness of SMEs in the coming decade will not depend on the ability to attract the youngest, but on the ability to strategically retain, effectively develop and fully utilise the potential of the growing number of professionals over 35. The greatest competitive advantage will be gained by those companies that are the first to abandon stereotypes and adapt their culture to the demographic reality.

  • How the dollar and euro exchange rates are affecting the prices of servers, laptops and components

    How the dollar and euro exchange rates are affecting the prices of servers, laptops and components

    For every IT director and owner of a small or medium-sized business in Poland, planning a budget for technology equipment is like playing on two fronts. With one eye, they monitor technological advances and the needs of the company, and with the other – with growing anxiety – they follow the exchange rate charts. This is no coincidence. Fluctuations in the forex markets, especially the US dollar (USD/PLN) exchange rate, have a direct and often brutal impact on the final prices of servers, laptops and components.

    When the zloty was at a record low in autumn 2022 and the dollar exchange rate reached 5 zlotys, Polish consumers and companies were in for a shock. Apple’s introduction of new products was associated with price increases of up to 30%. This extreme example exposed a fundamental truth about the Polish IT market: we are an importer of technology and the global supply chain is priced in hard currency.

    However, reducing this relationship solely to a simple USD/PLN conversion rate is a mistake that can cost companies tens of thousands of zlotys. Analysis of the market in recent years shows that the invoice price is the product of at least four forces: the dollar exchange rate, the stabilising role of the euro, the global supply of semiconductors and price wars between technology giants.

    For Polish SMEs, understanding this complex mechanics and proactively managing risk is no longer an option but is becoming a strategic necessity.

    Anatomy of a price: why the server speaks dollar and the laptop speaks euro

    To manage costs effectively, it is important to understand why different categories of equipment react differently to exchange rate fluctuations.

    Most of the global technology trade, from silicon wafers in Taiwan to finished microprocessors from Intel or AMD, is settled in US dollars (USD). A Polish distributor or integrator, when buying components or servers, pays for them in USD. This means that any increase in the USD/PLN exchange rate almost immediately raises the cost of the purchase. Distributors, wishing to protect their margins, must pass this cost on to the end customer.

    The server market is the most sensitive here. Configure-to-order (CTO) systems, ordered from manufacturers such as Dell or HPE, are often priced directly in USD, leaving the Polish company with almost 100 per cent of the exchange rate risk.

    The situation is different in the laptop segment. A significant proportion of them come to Poland via European distribution centres located in the euro zone (e.g. in Germany or the Netherlands). The Polish distributor settles accounts with its European supplier in euro (EUR). The EUR/PLN exchange rate becomes a “filter” or “shock absorber” for sudden jumps in the dollar in this model. Laptop prices are thus more stable, but it should be remembered that the price of the euro already includes the USD/EUR exchange rate set by the European headquarters.

    There is also the phenomenon of ‘price lag’. Distributors hold stock bought at the old, lower exchange rate, so changes do not always pass through to prices 1:1. This was perfectly demonstrated at the beginning of 2021: between December 2020 and March 2021, the USD/PLN exchange rate rose by more than 9%, but average smartphone and tablet prices rose by ‘only’ 4% in this period. The market temporarily absorbed some of the hit, giving companies a brief ‘window’ to buy before the new, more expensive supply arrived.
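    The degree of pass-through in that episode can be read straight off the numbers quoted above:

    ```python
    # Short-run FX pass-through implied by the Dec 2020 - Mar 2021 episode.
    fx_change = 0.09       # USD/PLN rose by just over 9%
    price_change = 0.04    # average smartphone/tablet prices rose by ~4%

    pass_through = price_change / fx_change
    print(f"Implied short-run pass-through: {pass_through:.0%}")
    # → Implied short-run pass-through: 44%
    ```

    In other words, distributors and their old stock initially absorbed more than half of the currency move before the full cost reached price lists.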

    Server market trap 2024/2025: a missed SME opportunity

    Analysis of the server market reveals a key and risky paradox into which many Polish companies have fallen. The year 2024, paradoxically, was theoretically the best time in years to upgrade infrastructure. Two key factors contributed to this:

    • Strong zloty: In 2024, a ‘weaker dollar’ was recorded, significantly reducing the cost of importing equipment priced in USD.
    • Global price war: At the same time, there was a brutal battle for market share between Intel and AMD. This led to gigantic price cuts on key server processors (Xeon and EPYC), reaching up to 35-50% below list prices in the US market.

    A strong currency and cheap underlying components – a textbook ‘buying window’. Despite this, market data shows that the Polish IT equipment market declined in 2024 (its value in USD fell from 10.03 billion to 9.39 billion). Companies, probably because of the general macroeconomic situation and high interest rates, put investments on hold.

    These companies may now fall into a trap. Those that waited out 2024 in the hope of further declines face a much worse situation in 2025. Forecasts for early 2025 point to an 18 per cent increase in average chip prices and a renewed extension of lead times to more than four months. Trying to ‘wait it out’ has proved a strategic mistake – these companies will be forced to buy equipment at higher prices and with longer lead times.

    Noise in the data: when the exchange rate takes a back seat

    Analysis of IT prices solely through the prism of currencies is incomplete. There are factors that periodically become more important.

    The first is the availability of semiconductors. The 2021-2022 crisis has shown that price is becoming secondary to the ability to buy. What’s more, this crisis has generated a massive implicit currency risk. If the average waiting time for a server is more than four months, a Polish company placing an order in January (at an exchange rate of PLN 4.00) with a payment deadline in May, may have to pay 10% more if the exchange rate rises to PLN 4.40 in the meantime.
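    The implicit risk in that example is easy to put into numbers. The USD 50,000 order value below is a hypothetical figure; the exchange rates are those from the paragraph above:

    ```python
    # FX exposure during a four-month lead time (order in January, payment in May).
    def pln_cost(usd_price: float, fx_rate: float) -> float:
        return usd_price * fx_rate

    ORDER_USD = 50_000                      # hypothetical invoice value
    at_order = pln_cost(ORDER_USD, 4.00)    # rate when the order is placed
    at_payment = pln_cost(ORDER_USD, 4.40)  # rate at the payment deadline

    extra = at_payment - at_order
    print(f"Extra cost: PLN {extra:,.0f} ({at_payment / at_order - 1:.0%})")
    # → Extra cost: PLN 20,000 (10%)
    ```

    The company budgeted PLN 200,000 in January but pays PLN 220,000 in May, without a single line of the contract changing.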

    The second factor is geopolitics. Customs decisions, such as those imposed by the US on Chinese imports, force manufacturers (Dell, HP, Lenovo) to costly relocate factories, for example to Vietnam. The costs of this global reorganisation of the supply chain are included in the base price of the product, raising it for everyone, regardless of local exchange rates.

    How can SMEs protect themselves?

    For Polish companies, passivity towards currency risk is a gamble. Instead of trying to predict the perfect ‘hole’ (which, as 2024 has shown, is almost impossible), companies need to implement conscious risk management strategies.

    1. Purchase planning based on cycles, not ‘timing’: Instead of guessing, IT and finance departments should monitor both key indicators: the local USD/PLN exchange rate and global component price trends (e.g. CPU price wars). The budget should be flexible enough to accelerate key purchases when both indicators are favourable.

    2. Active currency risk management (hedging): Hedging instruments, hitherto seen as the domain of large corporations, are now available to SMEs as well.

    • Forward contracts: This is the simplest tool. If a company knows that it needs to buy $50,000 worth of equipment in 3 months’ time, it can ‘freeze’ today’s rate in a contract with the bank. This eliminates the risk, although it also removes the benefit if the rate falls.
    • Currency options: They act as an ‘insurance policy’. The company pays a small premium for the right (but not the obligation) to buy the currency at a fixed rate. If the market rate is better – it benefits from the market. If worse – it exercises the option, protecting itself against loss.
    • Natural hedging: The simplest method for companies that have revenues in USD or EUR (e.g. from exporting IT services). It involves paying for imported equipment in the currency earned, bypassing currency conversion costs altogether.
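    A back-of-the-envelope comparison makes the trade-offs between these instruments concrete. Everything below is a hypothetical illustration: the USD 50,000 purchase, the 4.05 forward rate/strike and the PLN 4,000 option premium are assumed values, not market quotes:

    ```python
    # Hypothetical USD 50,000 equipment purchase payable in 3 months.
    USD_AMOUNT = 50_000

    def unhedged(spot_at_payment: float) -> float:
        # Full exposure: the PLN cost depends entirely on the future spot rate.
        return USD_AMOUNT * spot_at_payment

    def forward(forward_rate: float) -> float:
        # Rate frozen today; the future spot rate no longer matters.
        return USD_AMOUNT * forward_rate

    def option(spot_at_payment: float, strike: float, premium_pln: float) -> float:
        # Right, not obligation: pay the better of spot and strike, plus the premium.
        return USD_AMOUNT * min(spot_at_payment, strike) + premium_pln

    for spot in (3.80, 4.40):   # zloty strengthens vs zloty weakens
        print(f"spot {spot}: unhedged {unhedged(spot):,.0f}  "
              f"forward {forward(4.05):,.0f}  "
              f"option {option(spot, strike=4.05, premium_pln=4_000):,.0f}")
    ```

    If the zloty strengthens to 3.80, the forward locks the company into the worst outcome while the option costs only its premium over the unhedged price; if the zloty weakens to 4.40, both hedges cap the damage.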

    3. Building supply chain resilience: The risks for 2025 (more expensive chips, longer deliveries) show that SMEs need to think not only about their own risks, but also about those of their suppliers. It is worth actively talking to local IT integrators. The key question is: does the supplier have diversified sources of supply?

    The best strategy for SMEs may be to sign a framework agreement with a supplier for the cyclical delivery of equipment (e.g. 50 laptops per quarter) at a fixed PLN price for 12 months. In this way, it is the supplier, who is much better equipped for professional hedging, who assumes the currency risk (USD/PLN) and the component price risk (projected increase of 18%). Such an agreement provides invaluable predictability of operating costs.

    In a volatile economic environment, IT currency risk management is no longer the responsibility of the finance department. It is becoming a key element of a company’s technology strategy.

  • IT costs are freezing digitalisation? How high IT salaries are blocking innovation

    IT costs are freezing digitalisation? How high IT salaries are blocking innovation

    A paradox has become entrenched in the Polish business landscape. On the one hand, the media report a normalisation and cooling in the IT labour market, which could suggest an end to salary pressures. On the other hand, finance and HR leaders budgeting for digital transformation see a different reality: the cost of acquiring and retaining technology talent has stabilised at a level that is becoming an insurmountable barrier for many companies.

    The thesis is unequivocal – high IT personnel costs have become a hidden tax on innovation, which is measurably inhibiting key digitisation projects in Polish companies.

    A new, expensive balance

    Analysis of hard market data debunks the myth that the era of expensive professionals is over. Although wage growth has clearly slowed down, this does not mean a return to pre-boom levels. Instead, a new high equilibrium has taken shape.

    According to the Hays Poland salary report for 2025, only 16% of technology companies are planning raises of more than 10%, a drastic change compared to 2021-2022. However, the slowdown does not mean a reduction in costs, only their stabilisation at a very high level.

    The scale of the challenge is best illustrated by comparing salaries in IT with the rest of the economy. Data from the Central Statistical Office shows that in July 2025, the average salary in the ‘Information and Communication’ sector reached PLN 14,307 gross.

    At the same time, the average salary in the business sector was PLN 8,266. This means that IT specialists earn on average more than 70% more than the average employee. This gigantic salary premium, perpetuated by years of boom, is not diminishing, making digitalisation projects disproportionately expensive compared to other business initiatives.

    It is also crucial for budgeting to understand that the highest costs are generated by the most desirable, experienced professionals, often working on B2B contracts. In the first half of 2024, as many as 74% of senior offers included a proposal to work in this model.

    When the cost of talent becomes a barrier

    High staff costs have ceased to be a mere statistic and have become a major brake on transformation. Market research leaves no illusions. A report by Polcom indicates that for Polish companies in the SME sector, the two biggest barriers to digitalisation are the shortage of IT specialists (indicated by 66% of companies) and the high cost of implementations (46%). These two factors create a vicious circle: the limited supply of talent drives up their price, while high costs limit companies’ ability to compete for them.

    This means that every IT project carries a huge amount of risk. In a typical technology implementation, 70% to as much as 90% of the budget is human costs – the time of developers, analysts and managers. This structure means that even a small overrun or delay in a project can negate the projected return on investment.
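
A back-of-envelope sketch illustrates this sensitivity. All figures below are assumptions chosen for illustration, not data from the article; only the 70-90% labour share comes from the text:

```python
# Illustrative sketch (assumed figures): how an overrun in a labour-dominated
# project budget erodes the projected return on investment.
budget = 1_000_000            # assumed planned project cost
labour_share = 0.8            # 70-90% of budget is people, per the article
expected_benefit = 1_200_000  # assumed business value -> planned ROI of 20%

def roi(overrun_pct: float) -> float:
    """ROI after labour costs overrun by the given fraction."""
    cost = (budget * (1 - labour_share)
            + budget * labour_share * (1 + overrun_pct))
    return expected_benefit / cost - 1

print(f"planned ROI:       {roi(0.00):+.1%}")
print(f"after 15% overrun: {roi(0.15):+.1%}")  # the margin nearly halves
```

A 15% overrun on the labour portion alone cuts the assumed 20% return to roughly 7%, which is why cost discipline on people-time dominates IT project economics.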

    The most serious consequence, however, is strategic paralysis. The decision to postpone a project because personnel costs are too high is not neutral. In fact, it leads to an accumulation of technical debt.

    Outdated systems, on which half of Polish companies still work, are becoming more and more expensive to maintain and are blocking innovation. Apparent savings today therefore become a high-interest loan taken out for the future.

    Chart 3

    Strategic responses to bottlenecks

    Instead of engaging in a debilitating war for talent, business leaders can take a more sophisticated approach, based on three complementary pillars.

    The first is smart sourcing, i.e. diversifying sources of competence. Instead of relying solely on recruiting full-time employees, companies can build a flexible talent ecosystem. This includes strategic outsourcing and nearshoring, but also effective relationship management with B2B contractors.

    Data from the Central Statistical Office (CSO) and CEIDG show that as many as 70.3% of entities in the ‘Information and Communication’ sector are sole traders. This is a huge market of flexible experts that allows project teams to be dynamically scaled according to needs.

    The second pillar is the democratisation of technology through Low-Code/No-Code (LCNC) platforms. These tools enable business applications to be built using visual interfaces, allowing solutions to be created by business analysts or managers, referred to as ‘citizen developers’.

    This drastically reduces implementation time from months to weeks, reduces costs and relieves permanently overburdened IT departments. The growing popularity of this approach is a fact – according to the NoCode Poland report, as many as 77% of companies plan to implement LCNC technology in the next 12 months.

    The third key element is the use of automation and artificial intelligence (AI) as an efficiency lever. The aim here is to free up financial and human resources, which can then be reinvested in strategic projects. AI implementations can deliver operational cost reductions of 25-40%, and technologies such as Robotic Process Automation (RPA) can increase the productivity of teams by up to 40%.

    New game plan

    The high cost of IT competence is not a passing trend, but a new, structural feature of the market. Attempts to ignore it lead directly to a loss of competitiveness. Instead of seeing IT as a cost centre, CFOs should treat it as an investment portfolio in digital capabilities, diversifying it between strategic staffing, a flexible network of external partners and technologies that increase productivity across the organisation.

    The role of HR, in turn, is evolving from that of a recruiter to an architect of the talent ecosystem that builds and nurtures relationships with various competence providers. In the new market reality, the advantage will be built not by those companies that hire the most expensive developers, but by those that most cleverly reduce the need to hire them, achieving the same business goals faster, cheaper and with less risk. This is the essence of strategic leadership in the digital age.

  • AI patents: Europe is playing a different game

    AI patents: Europe is playing a different game

    Today’s technological revolution has an epicentre, and that is undoubtedly artificial intelligence. It is not just another innovation; it is a fundamental force transforming the global economy, military strategy and social fabric.

    In this new era, marked by an unprecedented pace of change, patents have become the equivalent of territorial claims during the gold rush era. They are a hard, measurable indicator of national strategy, innovative capacity and, most importantly, future economic power. Leadership in AI is seen as a prerequisite for competitiveness, security and prosperity in the 21st century, with the technology revolutionising every sector from healthcare to defence.

    The global scene is dominated by two hegemons: the United States, with its capital power and dominance in the creation of fundamental models, and China, which is pursuing a monumental state-led strategy to achieve quantitative superiority. In this bipolar balance of power, a key question arises about Europe’s position. Is the Old Continent merely a distant third player, condemned to watch the giants compete from the sidelines? Or is it, contrary to common narratives, building its own unique path to technological sovereignty and competitiveness? Is Europe realistically catching up?

    Drawing on hard data from the world’s leading intellectual property organisations – the European Patent Office (EPO) and the World Intellectual Property Organisation (WIPO) – and in-depth reports from leading research institutions such as the Stanford Institute for Human-Centered AI (HAI) and the OECD, this analysis aims to separate the facts from the media hype. We will trace the dynamics of patent applications, examine the quality and strategic focus of innovation, and place this data in a broader geopolitical context to provide a nuanced answer to the question of Europe’s future in the global AI race.

    The global AI patent arena: a numbers game and exponential growth

    An analysis of global patent trends in the field of artificial intelligence reveals a picture of unprecedented dynamism. The scale and pace of growth in this field eclipses previous innovation cycles, signalling a fundamental technological shift. In just over a decade, the world has witnessed an explosion of patent activity that has redefined the map of global innovation.

    An unprecedented wave of innovation

    The data is clear: the number of AI-related patents granted globally has increased more than 31-fold since 2010. In 2010, only 3,833 patents were granted worldwide in this field. By 2023, this number had risen to a staggering 122,511, an increase of 29.6% on the previous year alone. This exponential growth is testament to the intense and ever-accelerating investment in AI research and development worldwide.
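
The cited figures are internally consistent, as a quick computation of the growth multiple and the implied compound annual growth rate shows:

```python
# Check on the cited figures: growth multiple and implied average annual rate.
patents_2010, patents_2023 = 3_833, 122_511
years = 2023 - 2010

multiple = patents_2023 / patents_2010
cagr = multiple ** (1 / years) - 1
print(f"growth multiple: {multiple:.1f}x")  # ~32x, matching 'more than 31-fold'
print(f"implied CAGR:    {cagr:.1%}")       # roughly 30% per year for 13 years
```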

    The tri-polar balance of power

    When we look at the geographical distribution of these patents, a clear three-layered landscape emerges, with different players operating at very different scales.

    • China’s quantitative supremacy: China is the undisputed leader in terms of volume, accounting for an overwhelming 69.7% of all AI patents granted worldwide in 2023. This share has increased dramatically over the past decade, cementing China’s position as the most prolific innovator in terms of number of applications. Already in 2022, China was granted more AI patents (around 40,000) than the rest of the world combined, while the second-placed US was granted around 9,000.
    • Position of the US and Europe: the US ranks a distant second with a share of 14.16% in 2023, and Europe is third with a share of 13.00%. These figures indicate that, in terms of pure patent numbers, Europe is not catching up with the US, and both regions are dwarfed by the scale of Chinese activity.

    The scale of China’s quantitative dominance is so massive that it suggests more than just organic, market-driven innovation growth. Historically, in globally competitive technology fields, such a rapid and massive accumulation of intellectual property in one country is unusual. The data confirms that this is the result of a deliberate national strategy, supported by massive government funding and top-down directives. This means that China’s patent statistics cannot be interpreted solely as a measure of commercial innovation in the Western sense. They are also an indicator of the implementation of a state industrial policy aimed at declaring a technological presence and building a powerful national IP portfolio. This fundamentally changes the context of the comparison – it is no longer just a race for the best ideas, but for China, also a race for sheer volume as a strategic goal.

    The European innovation frontline: analysis of data from the EPO

    Global statistics, while impressive, only tell part of the story. To reliably assess Europe’s competitiveness, it is necessary to transfer the analysis to its ‘own backyard’ – the European Patent Office (EPO). This is because the EPO data reflects the real battle for the lucrative and technologically advanced European market, and these shed a whole new light on the continent’s position.

    AI as the new engine of European innovation

    The EPO Patent Index 2024 report delivers a milestone finding: for the first time ever, Computer Technology, a category that includes artificial intelligence, became the leading technology field by number of patent applications at the EPO, reaching 16,815 applications. This proves that AI is at the heart of the most advanced R&D aimed at the European market.

    The main drivers of this growth over the past five years have been inventions directly related to AI – such as machine learning, pattern recognition and neural networks. Applications in these subcategories have grown at an average annual rate of 28% since 2019. In 2024 alone, the number of AI-related applications at the EPO increased by 10.6% compared to the previous year, confirming the unrelenting momentum in this area.

    European innovators leading their own market

    Most importantly, the EPO data shows that in the key area of AI-related inventions, it is “applicants from EPO member states that have maintained a leading position throughout this period [of the last five years]”. Although in the broader category of ‘computer technology’, US entities lead the way (with a 34.4% share compared to 29.5% for EPO countries), in the narrower but strategically key area of AI, Europe is in the lead.

    The growth in filings from EPO countries in computer technology was very solid at +5.9% in 2024, driven by significant innovation jumps from Germany (+12.7%), Switzerland (+37.4%) and the UK (+12.4%). This indicates the existence of a thriving and dynamic innovation ecosystem in Europe.

    Analysis of the EPO data leads to an important conclusion. Filing a patent application, especially at an office as rigorous and expensive as the EPO, is a significant strategic investment. Companies do not take such steps without a firm belief in the commercial potential of their inventions. The high level of EPO filings from all regions – the US, Asia and Europe – is indicative of the widespread perception that the European market is a key and highly profitable one. The fact that it is European entities that lead the way in AI-specific filings at their own patent office suggests that they are not only conducting advanced research, but are strategically focused on protecting and commercialising these innovations in their home economic zone. Global statistics may reflect broad research activity and national strategies, but EPO data is a much better indicator of the real-world, commercial race for Europe. And in this race, Europe is not just a participant – it is a leader. This completely changes the ‘catching up’ narrative.

    Quality vs. quantity: a deeper look at patent strategies

    The number of patents alone is an imperfect measure of innovative power. To fully understand the global dynamics, it is necessary to introduce the dimensions of quality, impact and strategic purpose behind patent portfolios. An analysis of these factors reveals that the US, China and Europe are not in one, but in three separate races, each with different rules and objectives.

    US strategy: fundamental models and high-impact intellectual property

    The US has a clear focus on the quality and fundamental importance of its innovations. The best evidence of this is the citation rate – US AI patents are cited almost seven times more often than Chinese patents, indicating that they form the basis for later inventions around the world.

    What’s more, the US absolutely dominates the creation of ‘notable AI models’ – systems that represent key technological breakthroughs. In 2024, US institutions created 40 such models, compared to 15 by China and just three by Europe.

    This leadership is driven by corporate giants such as Google, Microsoft and IBM, who are investing billions in basic research.

    China strategy: Volume, applications and academic innovation

    The Chinese strategy is based on massive scale. However, the quality of this portfolio is subject to debate; the grant ratio for AI patents in China is estimated at only 32%. Furthermore, the vast majority of these patents are filed only domestically, suggesting a focus on the home market and the building of a defensive IP wall.

    A key differentiator is the source of innovation. In China, as many as 65 of the top 100 patenting organisations in computer vision are universities. In the US, the number is only three. This shows a model driven by the state and the academic sector, in contrast to the US model, which is commercially oriented and dominated by the private sector.

    The European model: Industrial integration and applied AI

    Europe’s leading patent applicants reflect the continent’s industrial strength. Companies such as Siemens and Bosch from Germany are key players, focusing on the application of AI in industrial automation, manufacturing and transportation.

    This suggests a strategy of integrating cutting-edge artificial intelligence into already established, highly developed sectors of the economy, rather than competing directly in the creation of foundational models.

    The patent data shows not one race, but three parallel, strategically distinct endeavours. The US ecosystem, driven by massive private capital, seeks to create and own the foundational platforms – the ‘shovels and picks’ of the AI era – on which others will build their solutions. Its patent strategy is selective and geared towards maximum impact. China’s state-controlled ecosystem is focused on rapid and widespread adoption, technological self-sufficiency and achieving dominance in specific application areas such as computer vision. Its patent strategy is a game of volume and control of the domestic market. Finally, the European ecosystem is a mature industrial economy that is adapting AI to strengthen its existing competitive advantages. Its patent strategy focuses on protecting highly specialised, high-value applications in key sectors. The question of whether Europe is ‘catching up’ with the US is therefore like asking whether a Formula One team is ‘catching up’ with an aerospace company. Both are competing in engineering, but in completely different disciplines and with different objectives.

    Behind the numbers: strategic and geopolitical context

    Patent trends do not exist in a vacuum. They are a direct reflection of deeper national strategies, investment realities and structural conditions. Analysis of this context is crucial to understanding Europe’s true position in the global AI race.

    Europe’s dilemma: A pioneer of regulation, a laggard in investment

    The European Union has adopted a proactive, regulation-oriented approach, culminating in the AI Act. The aim is to establish a global standard for ‘trustworthy’ artificial intelligence as a showcase for the European model. However, this ambitious regulatory vision is accompanied by a massive investment gap. In 2024, private investment in AI in the US reached USD 109.1 billion. This is almost 12 times more than in China (USD 9.3 billion) and significantly more than in the European Union, where estimates suggest around USD 8 billion. European strategic initiatives such as the ‘AI Factories’ programme (with a budget of EUR 20 billion) or the ‘Apply AI Strategy’ (EUR 1 billion) attempt to respond to this challenge. However, their scale pales in comparison not only to total US private investment, but even to individual corporate projects such as OpenAI’s Stargate.

    Infrastructure gap – Europe’s Achilles heel

    Europe’s greatest structural weakness is its critical dependence on foreign technology infrastructure. It is estimated that around 70% of Europe’s digital services run on the clouds of three US giants (so-called hyperscalers). At the same time, the European semiconductor manufacturing sector accounts for less than 10% of global production, making the continent dependent on chips designed in the US and manufactured in Asia. This fundamental weakness poses a direct threat to the EU’s strategic goal of achieving ‘digital sovereignty’.

    The table below synthesises the key differences between the three ecosystems, providing a summary of the AI strategic landscape.

    Indicator | Europe | USA | China
    Private investment in AI (2024) | approx. USD 8 billion | USD 109.1 billion | USD 9.3 billion
    Regulatory philosophy | Proactive, risk-based, horizontal (AI Act) | Reactive, free market, voluntary frameworks | State-controlled, information control, sectoral regulation
    Key players | Industrial companies (Siemens, Bosch), start-ups (Mistral) | Technology giants (Google, Microsoft, OpenAI), VC funds | Technology giants (Baidu, Alibaba, Tencent), universities, the state
    Patent strategy | Qualitative, focused on industrial applications | Qualitative, focused on fundamental models and high impact | Quantitative, focused on the domestic market and a broad spectrum of applications
    Main assets | Strong industrial base, leadership in applied AI, high regulatory standards | Dominance in capital, leadership in basic research, global platforms | Huge scale, rapid adoption, state support, dominance in data
    Main weaknesses | Investment gap, infrastructure dependency (cloud, chips), market fragmentation | Potential risk of under-regulation, concentration of power in a few companies | Patent quality and international recognition issues, political barriers

    What does the future hold for Europe in the AI race?

    After an in-depth analysis of the patent data and the strategic context, the answer to the question of Europe’s position in the global AI race becomes clear, albeit multidimensional.

    Firstly, in terms of pure patent numbers, Europe is not catching up and will probably never catch up with China. The scale of Beijing’s state strategy makes volume an inadequate and misleading measure of success for Europe.

    Secondly, Europe is significantly behind the US in developing fundamental models and raising venture capital. The financial strength and risk appetite of the US ecosystem are currently out of reach.

    However, the key conclusion of this analysis is that Europe’s success should not be measured by its ability to copy American or Chinese models. On the contrary, its future lies in the skilful exploitation of its unique strengths. Europe is not a laggard, but a leader in the race for high-value, industrial and applied artificial intelligence. Its strength lies in the deep integration of AI into the world-class manufacturing, automotive, medical and green technology sectors. Data from the European Patent Office clearly confirms that in its own core market, Europe is winning the innovation race.

    The future competitiveness of the continent will depend on two critical factors:

    • Bridging the infrastructure and investment gap: The success of initiatives such as AI Factories is absolutely crucial. Without sovereign computing power and a stronger venture capital ecosystem, Europe will always build its innovative applications on foundations owned by foreign powers.
    • Strategic use of regulation: Europe must effectively present its AI Act not as a barrier to innovation, but as a global competitive advantage. The aim is to create a premium market for ‘trustworthy’, secure and human-centred AI systems, which will increasingly be demanded by corporate customers and citizens around the world.

    Europe may not win the sprint for the number of patents, but it is perfectly placed to compete in the marathon for sustainable, valuable and trustworthy integration of AI into the real economy. The race is far from over, and Europe is running its own distinct and well-considered path in it.

  • Phishing 2.0: How artificial intelligence is changing the cyber threat landscape

    Phishing 2.0: How artificial intelligence is changing the cyber threat landscape

    In January 2024, an employee in the finance department at multinational engineering firm Arup received an email that appeared to be from the chief financial officer (CFO) at its UK headquarters. The email informed of a secret transaction and included an invitation to a video conference. The employee, although initially suspicious, joined the call. On the other side of the screen, he saw not only the CFO, but also several other board members he knew. Their appearance and voices were perfectly reproduced. Convinced of the authenticity of the meeting, over the next few days he authorised 15 transfers totalling $25.6 million. Only after the fact did he discover that he had been the victim of one of the most audacious frauds in history. All the participants in the video conference were digital clones, generated by artificial intelligence.

    This incident is not a sci-fi movie scenario, but the brutal reality of a new era of cyber threats. Welcome to the world of Phishing 2.0 – an evolution of phishing that, thanks to artificial intelligence, machine learning and deepfake technology, has become more sophisticated, personalised and dangerous than ever before. Traditional attacks, which we have learned over the years to recognise by grammatical errors and generic phrases, are becoming a thing of the past. In their place are campaigns that are almost indistinguishable from authentic communication, precisely targeting specific individuals and capable of bypassing traditional defences.

    Artificial intelligence is no longer just a tool that improves phishing; it is fundamentally redefining it. It democratises access to advanced attack techniques that were once the domain of only specialised hacking groups, and fuels an arms race in cyberspace. In this new reality, both attackers and defenders are engaged in a battle of algorithms, with data, finance and trust at stake as the foundation of the digital economy.

    Feature | Phishing 1.0 (before the AI era) | Phishing 2.0 (AI-supported)
    Language and grammar | Frequent errors, unnatural wording | Perfect grammar, imitating the writing style of specific individuals
    Personalisation | Generic phrases such as “Dear Customer” | Hyper-personalisation using social media data and public records
    Scale and speed | Manual, resource-limited campaigns | Automated generation of thousands of unique messages in minutes
    Attack vectors | Mainly email | Multichannel: email, SMS (smishing), voice calls (vishing), social media
    Evasion tactics | Simple domain impersonation | Dynamic page cloning, AI-driven code obfuscation, deepfake audio and video
    Required skills | Basic technical knowledge | Low entry barrier via AI tools and Phishing-as-a-Service (PhaaS) platforms

    Anatomy of a Phishing 2.0 attack: An AI-driven arsenal

    The modern phishing attack is a complex, multi-step process in which artificial intelligence plays a key role at every step. At the core of Phishing 2.0 are large language models (LLMs) such as GPT-4, as well as their uncensored, darknet-accessible counterparts such as WormGPT or FraudGPT. These tools have become an inexhaustible source of perfectly written, psychologically compelling content for cybercriminals. They eliminate grammatical errors, mimic the communication style of specific individuals and can create persuasive narratives from just a few simple commands.

    The effectiveness of Phishing 2.0 is based on hyper-personalisation, and this depends on the quality of the data collected. Artificial intelligence has automated the reconnaissance (OSINT) process, systematically searching the digital footprint of potential victims. AI algorithms aggregate information from social media, corporate websites and public records to learn about the victim’s interests, professional relationships and recent activities. The collected data – the name of a project, a supervisor’s name or a recent holiday – is woven into the content of the message, making the scam appear extremely authentic.

    Artificial intelligence has also enabled the mass production and distribution of attacks. ‘Phishing-as-a-Service’ (PhaaS) platforms have emerged, such as ‘SpamGPT’, which mimic the interface of legitimate marketing services but serve a criminal purpose. They offer an integrated AI assistant for generating templates, automating mailings and tracking analytics, allowing even those with few technical skills to conduct sophisticated large-scale operations.

    One of the biggest challenges is Phishing 2.0’s ability to bypass traditional security filters. AI is used here to create dynamic threats. AI tools can create perfect, real-time updated replicas of legitimate login pages. Analysts at Microsoft Threat Intelligence identified a campaign where AI was used to hide malicious code inside an image file, masking it using business terminology to confuse scanners. Criminals are also abusing trusted developer platforms to host fake sites with CAPTCHA verification, which blocks automated scanners but lets the victim through to the phishing site.

    The integration of AI with phishing is the industrialisation of cybercrime. We are seeing a shift from an ‘artisanal’ to an ‘industrial’ model. AI has become a production line that automates the entire attack process on a scale previously unattainable.

    The human element under siege: Deepfake and psychological manipulation

    The most worrying front in the evolution of phishing is the use of AI to create hyper-realistic voice and image imitations. Deepfake technology is striking a blow to trust in one’s senses. It only takes a few seconds of audio material to create a convincing voice imitation. Attackers use this technology in voice messages or in real-time phone calls (vishing).

    Analysis of actual incidents shows the devastating potential of this technology. In the case of Arup, an employee who initially suspected phishing was completely convinced after a video conference with digital clones of the board of directors. In another attack, the CEO of a UK energy company authorised a transfer of $243,000 after a phone call with a cloned voice of his superior.

    However, there are also examples of foiled attempts that provide valuable lessons. An attack on Ferrari was stopped when a manager asked the supposed CEO a follow-up question about a recent private conversation, which the AI was unable to answer. At Wiz, the deception attempt failed because employees noticed a subtle difference between the CEO’s voice from public appearances (on which the AI was trained) and his tone in everyday conversations. Meanwhile, a LastPass employee ignored an attempted contact from the supposed CEO because it came through an unusual channel (WhatsApp) and outside standard working hours.

    These cases reveal a fundamental weakness of deepfake technology: the ‘contextual gap’. AI can replicate patterns, but it cannot replicate authentic, shared human experience. It does not know the content of private conversations or the subtle nuances of interactions. This gap is a new battleground on which the ‘human firewall’ can claim victory.

    The data behind the threat: Quantification of impact

    The scale of the transformation is reflected in hard data. Reports indicate a 1,265% increase in phishing emails, directly linking it to the uptake of GenAI technology. Total phishing volume has increased by 4,151% since ChatGPT’s debut.

    The increase in the number of attacks translates into increasing financial losses. The average cost of a data breach whose vector was phishing reached $4.8 million in 2024. Losses from Business Email Compromise (BEC) attacks reached a record $2.9 billion.

    What’s more, an experiment conducted by Hoxhunt found that in March 2025, an AI agent became 24% more effective at creating phishing campaigns than an elite human team of experts. This proves that artificial intelligence is becoming objectively better at manipulating humans.

    Although the overall volume of attacks is increasing, a strategic shift is also being observed. Attackers are increasingly moving away from mass campaigns towards precisely targeted operations against high-value departments such as finance or HR. Invariably, Microsoft remains the most commonly impersonated brand, used in 51.7% of brand-impersonation scams.

    Fighting fire with fire: AI-driven defence

    In response to threats, the cyber security industry has also reached out to AI, creating a new generation of intelligent defences. Unlike traditional filters, defensive AI is adaptive and contextual. Its operation is based on behavioural analysis, creating a dynamic profile of normal communication patterns and detecting anomalies such as a sudden change of tone in an email from a known sender. Natural language processing (NLP) tools analyse the content of messages for subtle signals of manipulation.
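
The behavioural idea can be sketched in a few lines. This is an illustrative toy, not any vendor's implementation: it profiles a single crude feature (message length) per sender and flags outliers by z-score, whereas production systems learn many richer features (tone, timing, recipients, link patterns):

```python
# Minimal sketch of behavioural anomaly scoring (illustrative only): build a
# per-sender profile of a simple message feature, then flag messages whose
# z-score against that profile exceeds a threshold.
import statistics

class SenderProfile:
    def __init__(self):
        self.lengths: list[int] = []  # history of observed message lengths

    def observe(self, message: str) -> None:
        self.lengths.append(len(message))

    def is_anomalous(self, message: str, threshold: float = 3.0) -> bool:
        if len(self.lengths) < 5:
            return False  # not enough history to judge yet
        mean = statistics.mean(self.lengths)
        stdev = statistics.pstdev(self.lengths) or 1.0
        z = abs(len(message) - mean) / stdev
        return z > threshold

profile = SenderProfile()
for msg in ["status update", "weekly report", "status update", "minutes", "agenda"]:
    profile.observe(msg)  # learn this sender's normal pattern

# A sudden, uncharacteristically long, urgent request stands out sharply:
suspicious = "URGENT: wire funds to this new account immediately" * 3
print(profile.is_anomalous(suspicious))       # flagged as anomalous
print(profile.is_anomalous("status update"))  # matches the profile
```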

    Artificial intelligence is also revolutionising the work of security operations teams (SOCs) by automating log analysis and alert classification, allowing human analysts to focus on the most complex incidents. Interestingly, the same large language models used for phishing are also proving effective in detecting it.

    This evolution is forcing a fundamental change of philosophy in cyber security. We are seeing a shift from a ‘state’ based model (is this element known to be bad?) to a ‘behaviour’ based model (is this element behaving strangely?). The new model, driven by AI, is not so much interested in ‘what it is’ as in ‘how it works’.

    Building a resilient organisation: A multi-layered strategy

    Effective defence requires an integrated approach that combines technology, processes and informed people. Traditional training is no longer sufficient. The new programme must prepare employees to confront deepfakes. Implementing out-of-band verification protocols for every sensitive request – confirming an email with a phone call to a known number – becomes crucial. The Ferrari example also demonstrates the power of simple security questions based on a shared, private context.
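
The out-of-band verification step can be built directly into a workflow rather than left to individual judgement. The sketch below is a hypothetical illustration (all names and the threshold are assumptions, not a real API): a sensitive transfer is blocked until someone confirms it via a call-back on an independently known number:

```python
# Illustrative workflow sketch (assumed names, not a real API): sensitive
# requests must pass an out-of-band verification step before execution.
from dataclasses import dataclass

@dataclass
class TransferRequest:
    requester: str
    amount: float
    verified_out_of_band: bool = False  # set only after an independent call-back

def confirm_by_callback(request: TransferRequest, confirmed: bool) -> None:
    # In practice: phone the requester on a number from the company directory,
    # never on contact details supplied in the request itself.
    request.verified_out_of_band = confirmed

def execute_transfer(request: TransferRequest, threshold: float = 10_000) -> str:
    if request.amount >= threshold and not request.verified_out_of_band:
        return "BLOCKED: out-of-band verification required"
    return f"EXECUTED: {request.amount:,.2f} for {request.requester}"

req = TransferRequest(requester="CFO", amount=250_000)
print(execute_transfer(req))          # blocked until the call-back happens
confirm_by_callback(req, confirmed=True)
print(execute_transfer(req))          # executed only after verification
```

The point of the design is that the check lives in the process, so a perfectly convincing deepfake on the request channel still cannot bypass the independent channel.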

    Technology must provide a solid foundation. The Zero Trust philosophy (‘never trust, always verify’) is becoming a fundamental defence strategy. Also essential is phishing-resistant multi-factor authentication (MFA) based on FIDO2 standards (e.g. hardware security keys), which binds the authentication process to a physical token and renders a stolen password useless.

    Forecasts from analysts such as Gartner indicate a shift in the allocation of budgets. By 2030, more than half of spending will be on preventative measures rather than post-incident response. This is an acknowledgement that traditional models are too slow to combat attacks at the speed of AI.

    The most effective defence mechanisms are no longer purely technical; they are integrated into business processes. The failure at Arup was a process failure – the financial procedure lacked a mandatory, non-digital verification step. The Ferrari success, on the other hand, was a process success. The solution requires a change in the way work is done. IT leaders must become business process engineers, building verification steps directly into high-risk workflows.

    Navigating the future of digital deception

    Phishing 2.0, driven by AI, is not a hypothetical threat but a current reality. It is more personalised, persuasive and operates on an industrial scale. Deepfake technology has undermined our fundamental trust in sensory evidence.

    We are facing a new era where AI has democratised advanced attacks and defences must be equally intelligent. Looking to the future, experts predict a further escalation of this arms race. There is talk of the emergence of autonomous, multi-agent AI systems (‘swarms of agents’) that will conduct complex operations on both the attack and defence side. The UK’s National Cyber Security Centre (NCSC) predicts that AI will continue to reduce the time between vulnerability disclosure and exploitation.

    Resilience is a hybrid of smart technology and an equally smart, sceptical workforce. The ultimate defence is a holistic strategy that combines the predictive power of AI with the contextual wisdom of a well-trained human team that follows procedure. The fight against digital deception has reached a new level, and our ability to adapt will determine the outcome.

  • Anatomy of a zero-day attack: How hackers exploit unknown vulnerabilities and how to defend against it?

    Anatomy of a zero-day attack: How hackers exploit unknown vulnerabilities and how to defend against it?

    In the digital arms race, there is a moment of absolute advantage for the attacker – the moment when a previously unknown vulnerability in software is used for the first time to launch an attack. This is ‘zero-day’.

    For security teams, this is the worst possible scenario: they are faced with a threat they did not know existed, against which they have no ready defence, and the software vendor has not yet had time to prepare a ‘vaccine’ in the form of a security patch. During this critical window of time, which can last for days, weeks or even months, attackers operate with impunity, with an open path to the most valuable company assets.

    Zero-day attacks are not theoretical musings, but a brutal reality. Incidents such as the crippling attack on MOVEit Transfer software have shown that a single, unknown vulnerability can have a knock-on effect, leading to the theft of tens of millions of people’s data and exposing thousands of companies to financial and reputational damage. This proves that the stakes in this race against time are extremely high, and understanding the anatomy of this threat is crucial for every IT department today.

    Vulnerability lifecycle: from a bug in the code to a global incident

    To effectively defend against zero-day attacks, it is essential to understand their lifecycle. Although the terms below are often used interchangeably, each describes a different stage on the path from a bug in the code to a full-blown incident.

    • Zero-day vulnerability: A flaw in software code, an operating system or a device that is unknown to the manufacturer or, if known, has not yet been patched. The name refers to the developer’s perspective – it is the day they find out about a problem without having a solution ready.
    • Zero-day exploit: A specific tool – a piece of code or technique – created to actively exploit a vulnerability. An exploit is a ‘key’ that opens the ‘lock’ formed by the vulnerability.
    • Zero-day attack: The actual use of an exploit against a target. The name emphasises the perspective of the victim, who has exactly ‘zero days’ to prepare a defence.

    The process from gap creation to gap patching can be divided into several key phases:

    1. Emergence and release: Software containing a hidden flaw is made available to users. The vulnerability exists but remains undiscovered.
    2. Discovery: The existence of a vulnerability is identified. The discoverer may be an ethical researcher, the manufacturer itself or – the worst-case scenario – a cybercriminal.
    3. Creating an Exploit: A theoretical vulnerability is transformed into a ready-to-use attack tool.
    4. Disclosure: Information about the vulnerability becomes known. In a responsible model, it goes to the manufacturer; in a malicious scenario, it goes to the black market or is exploited in secret.
    5. Release of a fix: The manufacturer publishes an update that eliminates the flaw.
    6. Patch installation: The vulnerability lifecycle ends when users install the patch, closing the exploitation window.

    A critical factor is the time gap between the discovery of a vulnerability by a malicious actor and the widespread installation of a patch. The entire strategy of zero-day attacks focuses on maximising this window.

    Threat landscape 2023-2024: changing objectives and tactics

    Analysis of recent years’ data, in particular from Google Threat Analysis Group (TAG) and Mandiant reports, reveals a fundamental transformation in attackers’ strategy. After a record-breaking year in 2021 (106 exploits), there were 97 exploits in 2023 and 75 exploits in 2024. However, these figures hide a more important trend: a strategic shift in objectives.

    We have seen a dramatic decrease in the number of exploits targeting traditional targets such as web browsers and mobile operating systems. This is the result of years of investment in security by the technology giants, which have significantly increased the cost of creating effective exploits.

    This shift has forced attackers to shift their focus to the heart of the corporate infrastructure. The percentage of zero-day attacks targeting enterprise technology has increased from 37% in 2023 to as high as 44% in 2024.

    The targets were primarily edge devices and security software: firewalls, VPN gateways and load balancing systems. The compromise of one such device gives attackers a strategic entry point into the entire corporate network. This evolution is driven by simple economic logic – the ‘return on investment’ of an exploit is incomparably higher for an attack on a central piece of infrastructure than on a single user.

    Case study: the global MOVEit Transfer crisis (CVE-2023-34362)

    Nothing illustrates the new era of threats better than the global MOVEit Transfer software incident. It was a model example of a strategic hit to a key part of the digital supply chain.

    MOVEit Transfer is a popular solution for the secure transfer of sensitive files, used by thousands of companies, government agencies and hospitals. Its central role has made it an extremely valuable target. The attackers, identified as the ransomware group Clop (FIN11), exploited a critical SQL Injection vulnerability that allowed remote command execution.
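The flaw class named here – SQL injection – is worth seeing concretely. The snippet below is a generic, hypothetical illustration of that class (not MOVEit’s actual code), contrasting naive string concatenation with a parameterised query.

```python
import sqlite3

# A toy database standing in for any application backend.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE files (name TEXT, owner TEXT)")
conn.execute("INSERT INTO files VALUES ('report.pdf', 'alice')")

user_input = "x' OR '1'='1"  # attacker-controlled value

# Vulnerable: user input is concatenated into the SQL string, so the
# injected OR '1'='1' clause matches every row in the table.
vulnerable = conn.execute(
    "SELECT name FROM files WHERE owner = '" + user_input + "'"
).fetchall()
print(vulnerable)  # [('report.pdf',)] -- leaks rows the attacker should not see

# Safe: a parameterised query treats the input as a literal value.
safe = conn.execute(
    "SELECT name FROM files WHERE owner = ?", (user_input,)
).fetchall()
print(safe)  # [] -- no owner is literally named "x' OR '1'='1"
```

The lesson generalises: any code path that builds queries from untrusted input is a candidate zero-day, which is why the attack below could be fully automated.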

    The operation was fast and automated. At the end of May 2023, the Clop group started a massive scan of the internet for MOVEit servers, automatically exploiting the vulnerability to gain access. They then installed a custom web shell (LEMURLOOT) as a backdoor and conducted automated data theft for several days. By the time the manufacturer released a patch on 31 May, it was too late for thousands of companies.

    The impact was devastating. More than 2,700 organisations were affected by the attack, and personal data belonging to some 93 million people was stolen. The incident exposed a fundamental truth: the security of an organisation is inextricably linked to the security of its key software providers.

    Zero-day economics: black market versus bug bounty

    Behind every attack is a complex economic ecosystem. On the black market, knowledge of vulnerabilities is a valuable commodity. Prices for high-quality exploits are astronomical – a full chain of zero-click exploits for the iPhone can cost between $5 million and $7 million. Buyers are mainly government agencies, commercial spyware vendors (CSVs) and elite cybercrime groups.

    An ethical alternative is bug bounty programmes, where organisations offer financial rewards to ethical hackers for responsibly reporting vulnerabilities. Platforms such as HackerOne and Bugcrowd coordinate this process, creating a legitimate market for the skills of security researchers. While rewards are important, many hackers are motivated by a desire to learn and build a reputation.

    These programmes effectively limit the supply of less critical vulnerabilities on the black market. However, for the most powerful exploits, which are worth millions of dollars, bug bounties are economically uncompetitive. This reinforces the need to build defence strategies based on the ‘assume breach’ model (the assumption that an intrusion will occur).

    Defence strategies for IT departments

    In the face of zero-day threats that evade traditional defences, IT departments must adopt a multi-layered strategy based on prevention and effective detection and response.

    Pillar 1: Prevention and strengthening of immunity

    The aim is to make the IT environment as difficult to penetrate as possible.

    • Update Management: Traditional monthly update cycles are outdated. The average time from vulnerability disclosure to exploitation has shrunk to just five days in 2024. It is essential to implement automated patch management systems and prioritise vulnerabilities from the CISA Known Exploited Vulnerabilities (KEV) catalogue.
    • Zero Trust Architecture: The ‘never trust, always verify’ philosophy rejects the outdated ‘castle and moat’ model. Key elements are network microsegmentation, which limits attacker lateral movement, and the principle of least privilege, whereby each user and system has only the necessary privileges.
    • Virtual Patching: This is a key tactic during the period when there is not yet an official patch. It involves implementing rules on Web Application Firewall (WAF) or Intrusion Prevention System (IPS) devices that block network-level attempts to exploit a known attack technique, allowing valuable time for the patch to be deployed.
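The KEV-first prioritisation recommended above can be sketched in a few lines. This is a minimal illustration: the inline catalogue and scan results are stand-ins (CVE-2024-0001 is invented for the example), whereas in practice the KEV catalogue is downloaded from CISA as a JSON feed.

```python
# Stand-in for the downloaded CISA KEV catalogue: a set of CVE IDs
# known to be actively exploited in the wild.
kev_catalogue = {"CVE-2023-34362", "CVE-2021-44228"}

# Findings from a vulnerability scan: (CVE ID, CVSS severity score).
# CVE-2024-0001 is a hypothetical entry used only for illustration.
scan_results = [
    ("CVE-2023-34362", 9.8),
    ("CVE-2024-0001", 9.9),
    ("CVE-2021-44228", 10.0),
]

# Patch KEV-listed flaws first (proven exploitation beats raw severity),
# then fall back to CVSS score for the rest.
queue = sorted(scan_results, key=lambda v: (v[0] not in kev_catalogue, -v[1]))
for cve, score in queue:
    print(cve, score, "KEV" if cve in kev_catalogue else "")
```

Note the ordering: the hypothetical CVE-2024-0001 has the second-highest CVSS score, yet it lands last because it is not known to be exploited – exactly the prioritisation logic the KEV catalogue encodes.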

    Pillar 2: Detection and response to incidents

    As 100 per cent prevention is not possible, having the ability to detect an attack quickly is crucial.

    • Incident Response Plan (IRP): Having a formalised and rehearsed plan, based on a framework such as those developed by NIST, is the difference between a controlled response and chaos. The plan should include preparation, detection and analysis, containment and recovery, and post-incident phases.
    • State-of-the-art tools (EDR/XDR): Endpoint Detection and Response (EDR) and Extended Detection and Response (XDR) technologies are key. Instead of relying on signatures, they monitor and analyse the behaviour of processes across the infrastructure. Unusual activity, such as unauthorised privilege escalation, may indicate the use of an unknown exploit.
    • Human factor: The most common vector of exploit delivery is spear-phishing – personalised emails designed to persuade the victim to click on a malicious link. Employees also need to be aware of watering hole attacks, where attackers compromise a legitimate website frequently visited by company employees to infect their devices. Regular training is an essential part of defence.

    The future of fighting an invisible enemy

    The anatomy of the zero-day attack has undergone a profound transformation. Threats have become more strategic, precisely targeting the heart of corporate infrastructure and driven by a professionalised global ecosystem. A reactive approach based solely on patching is no longer sufficient.

    Effective defence must evolve at the same pace. It is necessary to move to a model oriented towards architectural resilience. The Zero Trust philosophy is no longer an option, but a necessity. Ultimately, there is no single recipe for success in the fight against zero-day threats. It is an ongoing process, requiring a synergy of advanced technology, robust procedures and, most importantly, constant vigilance and readiness to adapt in the face of a constantly evolving enemy.

  • Megawatts to teraflops – how energy shapes AI hardware replacement cycles in the data centre

    Megawatts to teraflops – how energy shapes AI hardware replacement cycles in the data centre

    The development of artificial intelligence is not just about computational advances. Training large language and generative models requires thousands of GPU/TPU accelerators that devour tens of megawatts of power. As a result, electricity consumption in data centres is rising – in Ireland, data centres accounted for as much as 22% of the country’s total electricity consumption in 2024. Such a share is a challenge for energy suppliers and DC operators, who must meet growing demand at ever higher prices while reducing CO₂ emissions.

    This article compares energy prices in three key European data hubs – Frankfurt, Dublin and Warsaw – with the energy efficiency of successive generations of AI accelerators. On this basis, we analyse how operational costs and technological advances shorten or lengthen the lifecycle of AI hardware.

    Energy prices in different hubs

    Frankfurt: high prices and environmental requirements

    Frankfurt is the second largest data centre market in Europe. Germany has some of the highest industrial energy prices; in 2024, companies paid an average of 16.77 cents/kWh, with the rate rising to 17.99 ct/kWh in January 2025. For companies with concessions (fixed consumption), the cost was 10.47 ct/kWh. These charges are made up of 29% taxes and charges and 27% network charges.

    A strong focus on renewable energy sources (RES) and heat recovery obliges data centre operators to invest in sustainable solutions. High energy costs motivate rapid deployment of more efficient systems to reduce consumption per teraflop.

    Dublin: the most expensive electricity in the EU and supply constraints

    In Ireland, energy prices for industrial consumers are among the highest in Europe – around €26 per 100 kWh in the first half of 2024. The SEAI report shows that in 2024 the weighted average price for business was 22.8 cents per kWh, with large consumers paying 16.3 c/kWh. The high rates are compounded by a shortage of power – Dublin’s data centres consume 22% of the country’s electricity, and EirGrid predicts this will rise to 30% by 2030. For this reason, new connections are only approved in exceptional cases, so operators must maximise the efficiency of existing infrastructure.

    Warsaw: lower prices but a growing market

    Poland stands out with lower prices – around €0.13 per kWh in 2024. According to GlobalPetrolPrices, in March 2025 businesses paid an average of PLN 1.023/kWh (US$0.28), which is still lower than in Germany or Ireland. While lower energy costs allow for a longer amortisation cycle, increasing competition and demand for cloud services are encouraging investment in new hardware to increase computing density.

    Generations of accelerators: performance per watt

    GPU – from Volta to Blackwell

    Nvidia’s V100 (Volta) introduced tensor cores in 2017, but its 300 W TDP and low TFLOPS/W ratio make it uncompetitive today. In 2020, the A100 (Ampere) came to market with a 400 W TDP and double the performance per watt, reaching up to 10 TFLOPS/W. The next breakthrough was the 2022 H100, built on the Hopper architecture: a 700 W chip delivering 20 TFLOPS/W and roughly three times the A100’s work per watt.

    In 2024, Nvidia announced the H200, a chip with a TDP of 700 W and featuring HBM3e memory with a bandwidth of 4.8 TB/s. This increased inference performance by 30-45% for the same power consumption. The DGX H200 system with eight such GPUs consumes 5.6 kW, but can do twice as much work per watt compared to its predecessor.

    The B200 (Blackwell), with a TDP of 1000 W and three times the computing power of the H100, is expected to debut in 2025. Although power consumption is increasing, the TFLOPS/W ratio continues to improve, pushing the frontier of computing density.

    TPU – an alternative with improved energy efficiency

    Google is developing Tensor Processing Units, dedicated AI accelerators. TPU v4 offers 1.2-1.7 times better performance per watt than the A100, and in general TPUs are 2-3 times more power efficient than GPUs. Upcoming generations, such as v6 ‘Trillium’ and v7 ‘Ironwood’, focus on maximising compute density while reducing power consumption.

    Equipment life cycle – flexibility instead of rigid amortisation

    In traditional data centres, hardware was replaced every five to seven years. However, decarbonisation research indicates that in AI environments, cycles of four years or longer are economically viable, although shortening the cycle can reduce emissions. When a new generation of GPUs provides several times the energy efficiency, early retirement of ageing chips is justified – the energy savings and emissions cost reductions outweigh the investment. Replacement every 4-5 years may become the norm in regions with high energy prices.
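As a rough illustration of how hub prices change the upgrade calculus, the sketch below compares the annual electricity bill of three A100-class cards against one H100-class card doing equivalent work, using the per-kWh figures quoted in this article. The 70% utilisation and the three-to-one consolidation ratio are simplifying assumptions, and cooling overhead and capital costs are ignored.

```python
def annual_energy_cost(tdp_w, price_eur_per_kwh, utilisation=0.7):
    """Electricity cost (EUR) of running one accelerator for a year
    at the given average fraction of its TDP."""
    kwh = tdp_w / 1000 * 24 * 365 * utilisation
    return kwh * price_eur_per_kwh

# Approximate industrial prices from the article, EUR/kWh.
hubs = {"Dublin": 0.23, "Frankfurt": 0.18, "Warsaw": 0.13}

for hub, price in hubs.items():
    old = 3 * annual_energy_cost(400, price)   # 3x A100 (400 W each)
    new = annual_energy_cost(700, price)       # 1x H100 (700 W)
    print(f"{hub}: 3x A100 EUR {old:.0f}/yr vs 1x H100 EUR {new:.0f}/yr "
          f"(saving EUR {old - new:.0f}/yr)")
```

Even this toy model shows why the same upgrade pays back fastest in Dublin and slowest in Warsaw: the absolute saving scales linearly with the local price per kWh, while the hardware premium is the same everywhere.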

    How does the price of electricity affect decisions to upgrade?

    Dublin – need for computing density

    With prices of 22-26 cents per kWh and limited network capacity, Irish data centres are being forced to maximise efficiency. An investment in an H100 or H200 pays for itself faster with twice the performance per watt. Replacing old A100s with H100/H200s reduces the amortisation cycle to three to four years, as the energy savings and lower emissions costs outweigh the capital expenditure. The introduction of even more energy-efficient chips (B200, TPU v6) can further accelerate the upgrade.

    Frankfurt – a trade-off between cost and investment

    German energy prices (17-20 ct/kWh) are lower than in Ireland, but still motivate optimisation. Companies are keen to replace equipment every 4-5 years, especially when the gap between generations is large. At the same time, larger systems can benefit from discounts and long-term contracts, which reduces the pressure for immediate replacement. Regulations requiring the use of RES and heat recovery encourage the choice of energy-efficient platforms.

    Warsaw – a longer breath, but growing ambitions

    The lower cost of energy (around 13 ct/kWh) allows Polish operators to extend the life cycle of their equipment. Replacing the V100 with the A100 or H100 still brings savings, but they are not as spectacular as in Ireland. However, the growing demand for AI services, the development of R&D offices in Poland and competition from international players may shorten replacement cycles to 4-5 years, especially when B200s and energy-efficient TPUs appear on the market.

    Trends of the future: HBM3e memory, Blackwell architecture and TPU Trillium

    Accelerator performance is not only increasing with more cores. New chips, such as the H200, increase memory bandwidth to 4.8 TB/s via HBM3e. Another leap is the Blackwell B200 with a TDP of 1000 W, which uses wider buses and improved Transformer Engine cores. Google, in turn, is developing the v6 ‘Trillium’ and v7 ‘Ironwood’ TPUs to improve power efficiency and compute density.

    Efficiency per watt is becoming the most important parameter as economic and regulatory pressures force operators to reduce emissions. High energy prices in Europe further exacerbate this trend.

    Differences in energy prices across Europe determine AI infrastructure modernisation strategies. Ireland and Germany, with the highest rates, are shortening equipment lifecycles to reduce operating costs. Poland, benefiting from lower prices, can afford to use existing systems for longer, although growing demand and competition will also accelerate change there.

    Technological advances – from the V100 GPU, to the A100 and H100, to the H200 and the upcoming B200 – mean that the TFLOPS/W ratio is growing exponentially. Alternative TPU accelerators are showing even greater energy efficiency, which could change GPU dominance in the future. Therefore, hardware replacement decisions cannot be rigid; they must take into account not only the cost of new hardware, but also energy prices, CO₂ emissions and customer requirements. Megawatts and teraflops will become increasingly intertwined in the strategies of data centre operators in the coming decade.

  • AI accelerator market in Europe: digital sovereignty vs. Nvidia’s dominance

    AI accelerator market in Europe: digital sovereignty vs. Nvidia’s dominance

    The generative artificial intelligence (GenAI) revolution has created an insatiable demand for computing power, fundamentally changing data centre architectures. Traditional processors (CPUs), for decades the heart of computing, have become the bottleneck for large language models (LLMs) and other GenAI systems. In response to this challenge, a new class of specialised hardware was born: AI accelerators.

    The end of the CPU era and the birth of a new paradigm

    The problem with CPUs in the context of AI lies not in their speed, but in a fundamental architectural mismatch. Optimised for sequential execution of complex tasks, they have only a few powerful cores. Meanwhile, deep learning algorithms require massive parallel processing – performing trillions of simple operations simultaneously. This is a task for which graphics processing units (GPUs), equipped with thousands of smaller cores, are ideally suited.

    Alongside GPUs, which have become the standard for model training, even more specialised units have emerged. Neural processing units (NPUs) are a broad category of chips designed from the ground up with AI in mind, prioritising energy efficiency, which makes them crucial for edge AI applications. Tensor processing units (TPUs), on the other hand, are Google’s proprietary ASICs, optimised for its software ecosystem and massive cloud computing.

    This paradigm shift is driving a market in Europe with huge potential. Valued at around €4.88 billion in 2024, the European AI accelerator market is expected to grow to nearly €43 billion by 2033, with an impressive compound annual growth rate (CAGR) of 27.4%.
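The cited figures are internally consistent, as a quick compound-growth check shows:

```python
# Does EUR 4.88bn growing at 27.4% a year from 2024 reach roughly
# EUR 43bn by 2033 (nine compounding years)?
value_2024 = 4.88           # EUR billions (figure from the article)
cagr = 0.274
years = 2033 - 2024
value_2033 = value_2024 * (1 + cagr) ** years
print(f"{value_2033:.1f}")  # ~43 -- matches the cited forecast
```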

    Unique European Drivers: Politics meets market demand

    The European accelerator market is shaped by a unique combination of bottom-up commercial demand and top-down strategic initiatives, which sets it apart from markets in the US or Asia.

    On the one hand, AI adoption is growing in key sectors such as healthcare, automotive and finance. Already 13.5% of businesses in the EU are using AI technologies, and the entire European AI market (software, hardware and services) is growing at a rate of more than 33% per year.

    On the other hand, the European Union is pursuing an ambitious programme to strengthen its digital and technological sovereignty. Geopolitical concerns and the desire for independence from non-EU suppliers have led to powerful investment mechanisms:

    • EU Chips Act: This initiative aims to mobilise more than €43 billion in public and private investment to double Europe’s share of global semiconductor production from 10% to 20% by 2030. Attracting investment to build advanced factories, such as Intel’s and TSMC’s plants in Germany, is crucial for future accelerator production in Europe.
    • AI Continent Action Plan: This €200 billion plan aims to create a sovereign, pan-European AI ecosystem. Its key element is the InvestAI initiative, which is expected to mobilise €20 billion to build 4-5 ‘AI Gigafactories’ – each equipped with more than 100,000 advanced AI chips.
    • EuroHPC and ‘AI Factories’: The European High Performance Computing Joint Undertaking (EuroHPC JU) is investing billions of euros to build a fleet of supercomputers. Around these, 13 ‘AI Factories’ are being built to democratise access to computing power for startups and SMEs, stimulating innovation and creating guaranteed demand for infrastructure.

    The competitive landscape: Nvidia’s dominance and the strategies of the contenders

    The data centre accelerator market is close to a monopoly. Nvidia controls around 98% of the global market in terms of units shipped, and its real advantage is its mature CUDA software ecosystem, used by 5 million developers. This creates a powerful lock-in effect, making it difficult for competitors to gain share.

    Nevertheless, the contenders are pursuing well thought-out strategies:

    • AMD: Positions itself as the main high-performance alternative. Its Instinct MI300 series of accelerators competes with Nvidia’s offerings, with the open ROCm software platform as a key selling point, aimed at breaking the CUDA monopoly.
    • Intel: Bets on price competition with its Gaudi accelerators (positioned at roughly 50% below Nvidia’s H100) and the open oneAPI ecosystem.
    • Google (TPU): Does not sell its chips directly, but uses them as a key differentiator for its cloud platform, offering a strong performance-to-cost ratio for specific AI workloads.

    Against this backdrop, European players such as the UK’s Graphcore and France’s Blaize are also emerging, focusing on niches such as novel architectures (IPUs) or energy-efficient chips for edge AI.

    The growth trilemma: Cost, energy and talent

    Despite the optimistic outlook, the European market faces three fundamental barriers that create a strategic trilemma for decision-makers.

    Cost and availability: The price of a single high-end accelerator, such as the Nvidia H100, is up to US$40,000, making building your own AI infrastructure prohibitive for most companies. Additionally, global supply chains are vulnerable to disruption and export controls, which threatens project continuity.

    Energy and ESG: Data centres dedicated to AI consume four to five times more energy than traditional ones. Data centre energy consumption in Europe is forecast to almost triple by 2030. This is in contrast to the EU’s ambitious sustainability goals, such as the Energy Efficiency Directive, which imposes an obligation to reduce energy consumption.

    Talent: Europe is facing a critical shortage of AI and HPC professionals. The skills gap is slowing down innovation and preventing companies from effectively using even the infrastructure they already have, empowering global cloud providers.

    Future trends: From possession to access, from monolith to module

    Looking ahead to 2030, the market will be shaped by three key trends:

    • The dominance of the ‘Compute-as-a-Service’ model: Due to the aforementioned trilemma, most companies will not buy accelerators, but rent access to them. This model, pursued by both public ‘AI Factories’ and commercial cloud providers, transforms huge capital expenditure (CAPEX) into predictable operating costs (OPEX).
    • Software battle: The long-term structure of the market will depend on the success of open standards, such as ROCm and oneAPI, in breaking the dominance of CUDA. Avoiding dependence on a single vendor is a powerful motivator for the industry as a whole.
    • New hardware architectures: To overcome physical limitations, the industry is moving towards chiplets – smaller, specialised silicon dies combined into a single system. This allows for greater modularity and lower costs. In the long term, the revolution could come from photonic computing, which uses light instead of electrons and promises orders-of-magnitude gains in throughput and energy efficiency.

    Strategic lessons for technology leaders

    The European AI accelerator market is an arena where global technology competition meets unique political and regulatory ambitions. For technology and innovation directors, this means navigating a complex ecosystem.

    The key strategic question is shifting from “which accelerator to buy?” to “how to strategically access computing power?”. The answer requires balancing performance, cost, sovereignty and sustainability. Success in the GenAI era will not depend on simply having the latest hardware, but on the ability to intelligently use both public initiatives and private innovation to build a sustainable competitive advantage in a unique European market.

  • AI PC: real revolution or the biggest marketing bubble of the decade?

    AI PC: real revolution or the biggest marketing bubble of the decade?

    The PC market, after a period of pandemic revival, stagnated. Innovation seemed to be only cosmetic and hardware replacement cycles were lengthening. In this landscape, however, a powerful new catalyst for change has emerged: AI PC. This is not another fashionable buzzword, but the announcement of a fundamental transformation in PC architecture that is set to redefine the role of the PC in our lives and initiate a massive hardware replacement cycle.

    But what exactly is an AI PC? It is not simply a computer with access to cloud-based AI services. The definition goes down to the silicon itself. A true AI PC is a device equipped with a specialised, three-element computing architecture: a traditional CPU for general tasks, a powerful GPU for parallel processing and, crucially, an NPU (Neural Processing Unit). It is the NPU, a dedicated and energy-efficient accelerator, that is at the heart of the revolution, enabling AI tasks to be processed efficiently directly on the device, without burdening other components.

    Performance measured in TOPS (trillions of operations per second) has become the key parameter. The turning point was Microsoft’s requirement of at least 40 TOPS from the NPU alone as a condition for ‘Copilot+ PC’ certification. This strategic move redefined the market, forcing the entire industry into a race to exceed the imposed threshold.

    This brings us to the main thesis: AI PC is not just a hardware evolution, but a fundamental paradigm shift. We are witnessing a shift from a fully cloud-dependent architecture to a hybrid model in which AI computing power is strategically dispersed between data centres and the end device. This shift carries profound implications for cost, privacy and the entire IT ecosystem.

    Market drivers: why now?

    The sudden emergence of the AI PC category is the result of a confluence of three powerful forces that made moving AI to the device not only possible, but necessary.

    A technological necessity: privacy, security and latency

    In an era of increasing awareness of data protection, cloud computing raises concerns. AI PC addresses these challenges by offering analysis of sensitive data directly on the device, enhancing privacy and security.

    What’s more, for real-time applications like live translation, the elimination of delays (latencies) associated with communication with the cloud is crucial to the quality of the user experience.

    Economic impetus: the hidden cost of cloud AI

    The boom in generative AI has revealed a brutal economic truth: while training models is a huge but one-off expense, the real budget ‘eater’ is the cost of inference, i.e. actually using the models.

    Every query to cloud-based AI generates a cost that, at scale, becomes difficult to predict and is a barrier to enterprise adoption of the technology. By moving some of the computing to the end-device, technology giants such as Microsoft are strategically passing on some of the rising operational costs to customers who are investing in new, more expensive hardware.
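A back-of-the-envelope breakeven makes this logic concrete. Every number below is an invented assumption, not vendor pricing: if local inference avoids an assumed €2 per thousand cloud queries and a worker issues 500 queries a day, an assumed €300 hardware premium pays for itself in under a year.

```python
# Illustrative only -- all figures are assumptions, not real pricing.
cloud_cost_per_1k_queries = 2.0    # EUR, assumed API cost per 1,000 queries
hardware_premium = 300.0           # EUR, assumed extra cost of an NPU-equipped AI PC
queries_per_day = 500              # assumed per-employee usage

daily_cloud_saving = queries_per_day / 1000 * cloud_cost_per_1k_queries
days_to_break_even = hardware_premium / daily_cloud_saving
print(f"{days_to_break_even:.0f} days")  # prints "300 days"
```

The point is not the specific numbers but the structure: per-query cloud costs scale with usage forever, while the on-device premium is paid once, which is exactly the trade-off driving the hybrid model.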

    Market maturity: the boom effect of generative AI

    The explosion in popularity of tools such as ChatGPT has fundamentally changed user expectations. Consumers and business employees alike now expect AI to be an integral part of their everyday tools. The timing coincides perfectly with the natural cycle of post-pandemic hardware replacement and the impending end of Windows 10 support in October 2025, creating the perfect ‘window’ for the introduction of a new product category.

    Battlefield: architects of the new PC era

    The entry of AI PC into the market has sparked the most intense rivalry in the industry for years, with traditional and new players facing off against each other.

    Chipmakers: the war of architectures

    The competition is no longer just between Intel (Core Ultra) and AMD (Ryzen AI) within the same x86 architecture. The real breakthrough is the entry of Qualcomm (Snapdragon X Elite), which brings the ARM architecture to mainstream Windows PCs, promising unprecedented energy efficiency. This is the biggest challenge to the ‘Wintel’ duopoly (Windows + Intel) in decades, initiating a fundamental war of architectures – x86 versus ARM – on the same system platform.

    Although Microsoft has created an advanced emulation layer, history teaches that emulation always involves performance compromises, especially in games and specialised software. It is also worth remembering that the pioneer in this field is Apple, which has been integrating dedicated neural engines into its processors since 2017, exploiting the advantage of full control over both hardware and software.

    Software giants: Microsoft as market conductor

    In this revolution, it is not the hardware manufacturers but a software giant that is calling the shots. Microsoft, through Windows and the new Copilot+ feature category, has become the market’s main conductor. By introducing exclusive tools such as Recall (a ‘photographic memory’ for the computer) or Cocreator (real-time image generation), the company has created real demand for hardware capable of running them locally. Microsoft’s strategy is clear: transform the operating system into a proactive, intelligent assistant.

    The market in figures: growth forecasts and potential

    Market analysts agree: we are standing at the threshold of an exponential increase in AI PC adoption. Although short-term forecasts are being adjusted due to macroeconomic uncertainty, the long-term trend is clear.

    • Canalys predicts that AI PC shipments will reach 48 million units in 2024 (18% of the market) and grow to 205 million by 2028, a compound annual growth rate (CAGR) of 44%.
    • Gartner forecasts that the AI PC market share will reach 31% in 2025 and exceed 54% in 2026.
    • IDC estimates that AI PCs will account for nearly 60% of the total market by 2027, with a CAGR of 42.1% between 2023 and 2028.
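    The Canalys figure is easy to sanity-check: growing from 48 million units in 2024 to 205 million in 2028 spans four years of growth, and the standard CAGR formula reproduces the cited ~44%:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate: (end/start)^(1/years) - 1."""
    return (end / start) ** (1 / years) - 1

# Canalys: 48M units in 2024 -> 205M in 2028 (four years of growth).
rate = cagr(48, 205, 4)
print(f"CAGR ≈ {rate:.0%}")  # ≈ 44%, matching the cited figure
```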

    Projected Share of AI PCs in Total PC Sales (2024-2028)

    • 2024: 18%
    • 2025: 35%
    • 2026: 55%
    • 2027: 60%
    • 2028: 75%

    This dynamic adoption curve shows that AI PC is not a fad, but a technological standard that will dominate the market before the end of this decade.

    Strategic implications: opportunities and threats

    The move to an AI PC architecture has fundamental implications for the entire IT ecosystem.

    For business: productivity versus security

    The promise of the AI PC for business is a leap in productivity through the automation of routine tasks. The revolution, however, comes at a price: AI PCs are expected to be 10-15% more expensive, requiring IT departments to analyse the total cost of ownership (TCO). The biggest challenge, though, is security.

    Case study: Microsoft Recall

    Nothing illustrates this better than the controversy surrounding the Microsoft Recall feature. Designed as a ‘photographic memory’ for the computer, the original version stored the user’s entire activity history in an unencrypted database. This meant that any malware could steal a victim’s entire digital life in seconds. Public criticism forced Microsoft to redesign the feature, making it disabled by default and adding advanced encryption. The Recall saga is a fundamental lesson: local processing creates powerful new attack vectors, and the promise of privacy is empty without a robust security architecture.

    For the software market and the risk of a “marketing bubble”

    For developers, the emergence of the NPU is an opportunity to create a new generation of ‘AI-native’ applications. On the other hand, the fragmentation of platforms (x86 vs. ARM) creates a risk of chaos and increased developer costs.

    At the same time, a question hovers over the market: are current applications revolutionary enough to justify mass replacement of hardware? The industry has searched for decades for a ‘killer app’ – an application so groundbreaking that people buy new hardware just to run it. For now, the AI PC market has no single, obvious ‘killer app’, which fuels fears of a marketing bubble in which promises outrun actual value. It is possible, however, that the strength of the AI PC will lie in the sum of hundreds of small enhancements running in the background, gradually improving the PC experience.

    Analysis of the AI PC market leads to a clear conclusion: we are witnessing more than just another hardware refresh cycle. Driven by the need for privacy, economic pressures and expectations shaped by generative AI, NPU integration is initiating a fundamental paradigm shift in PC architecture.

    This confirms our central thesis: AI PC is a revolution, not an evolution. It is a strategic shift to a hybrid AI architecture that will change not only how computers process information, but also how they interact with us. Predictions clearly point to exponential growth and the inevitable domination of this category in the market.

    The personal computer, for years seen as a mature tool, is on the threshold of reincarnation. It is being transformed from a passive window on the digital world into an intelligent, proactive partner. The biggest challenge for the industry as a whole now is not whether this transformation will happen, but how to manage it in a way that is safe, productive and of value to the user. Avoiding the trap where marketing promises trump real-world usability will determine whether AI PC becomes a true revolution or just an expensive bubble. This is not the end of the history of the personal computer – it is the beginning of a whole new chapter of it.