Category: Data

Data is a source of in-depth reports, visualisations and market analyses that show the real picture of the IT industry and the IT sales channel. These are unique breakdowns on vendors, distributors, integrators and trends.

  • The holy grail for IT partners: Which IT services offer the highest margins with the highest demand in 2025?

    The year 2024 has gone down in technology history as an era of bold experimentation, driven by a wave of generative artificial intelligence. Companies around the world immersed themselves in the possibilities of AI, launching countless pilot projects. However, 2025 brings a fundamental shift.

    Enthusiasm, which analysts at Channelnomics described as “unprecedented”, is giving way to the hard reality of business. The time for monetisation and return on investment has arrived. Customers are no longer asking “can AI do it?”, but “how will AI translate into our profit?”.

    For IT partners, this transformation opens up a new era of opportunity. They are becoming the key conduits for companies that must navigate an increasingly complex technology landscape. Success in 2025 will depend on the ability to respond to three powerful forces shaping the market: the shift towards profitable artificial intelligence, the imperative of cyber security as the foundation of every operation, and the maturity of the cloud, which is generating demand for new, high-margin optimisation services.

    Leading analyst firms agree on the scale of the coming growth. Forecasts of global IT spending in 2025 point to a market worth more than $5 trillion.

    This solid growth of a few per cent is almost entirely driven by two phenomena: the mass adoption of artificial intelligence and the need to upgrade existing systems.

    The IT services segment will remain the largest part of the market, with a projected value of $1.69 trillion, but data centre systems and software will see the greatest growth as a direct result of investment in AI.

    Against this background, Poland is emerging as a regional leader. The value of the Polish IT services market is forecast to reach USD 10.44 billion in 2025, with GDP growth expected to reach an impressive 4.1%. The strength of the Polish market is driven by a globally recognised pool of technology talent, making our country a leading location for nearshoring and outsourcing.

    However, this position is a double-edged sword for local IT partners. Huge demand creates a gigantic market, but it also means competition with global players.

    In this reality, competing on price alone is becoming a strategy without a future. Advantage must be based on talent, value and specialisation.

    The market analysis clearly identifies three areas where demand, the need for specialisation and margin potential converge, creating ideal conditions for building competitive advantage.

    1. The AI revolution – from experiment to profit

    The year 2025 marks the end of the era of free lunches in the AI world. Customers who have invested in pilot projects now expect tangible business results. This creates a huge opportunity for partners who can speak the language of ROI.

    The biggest barriers to AI adoption lie not in the technology, but in lack of competence (42% of companies), insufficient data quality (42%) and lack of a solid business case (42%). It is in these areas that an AI partner can offer the most value.

    Instead of generic implementations, the most profitable services focus on solving complex problems. AI governance and model explainability are becoming key for firms in regulated sectors, with 44% of organisations planning to invest in them. Instead of offering one-size-fits-all chatbots, partners should focus on creating industry-specific solutions that drive AI adoption in finance, retail or media.

    The ability to navigate costs also becomes crucial. The cost of an averagely complex AI project can range from $60,000 to more than $250,000, and a partner who can guide the client through the entire process, from concept to implementation, gains strategic advisor status.

    2. Cyber security: from reactive responder to proactive guardian

    The cyber security market is booming, with a projected annual growth rate of 14.6%. This growth is driven by the increasing complexity of threats, including AI-assisted attacks, and the increasingly severe financial consequences of successful intrusions, with the average cost already reaching $4.88 million.

    These services are high-margin, and consulting in this area can yield margins of 20-40%. This is because their value is directly linked to protection against catastrophic risk.

    The real growth and margin lies in the move away from standard, saturated services to advanced managed solutions. We are talking about the evolution from Endpoint Detection and Response (EDR) systems to Managed Detection and Response (MDR) and Extended Detection and Response (XDR).

    These solutions integrate signals from across the customer’s IT environment and provide 24/7 monitoring and expert response. It is no coincidence that 97% of the world’s top-earning managed service providers (MSPs) offer managed security services.

    The democratisation of advanced security operations centres (SOC) through AI is also a powerful new trend. AI assistants for analysts allow even smaller teams to operate with the efficiency of experienced experts, paving the way for offering a highly effective and profitable SOC-as-a-Service.

    3. The economics of the cloud, or the imperative of optimisation

    As companies move more and more operations to the cloud, a new fundamental challenge is emerging. For 82% of organisations, managing cloud spend is now the number one concern. It is estimated that up to a third of this spend is wasted due to ineffective configuration or lack of oversight.

    This is a huge, easily quantified business pain for which IT partners can offer an effective remedy.

    The immediate answer is FinOps, a new operational discipline that brings together finance, engineering and business to bring financial accountability to cloud consumption. The market for FinOps tools and services is growing at more than 11% per year.

    This is a high-value consulting practice that can help clients reduce cloud costs by 20-30%. Instead of one-off audits, partners can offer a continuous FinOps-as-a-Service, creating a recurring, high-margin revenue stream.
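To make the FinOps practice concrete, the sketch below shows the kind of rightsizing analysis such a service might start with: flagging resources whose utilisation falls below a threshold and estimating the monthly saving from removing or resizing them. The resource data, names and 10% threshold are illustrative assumptions, not any cloud provider's API.

```python
# Minimal FinOps-style waste report (illustrative; resource data is assumed,
# not pulled from any real cloud provider API).
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    monthly_cost: float         # USD per month
    avg_cpu_utilisation: float  # 0.0 - 1.0

def waste_report(resources, util_threshold=0.10):
    """Flag resources under the utilisation threshold and sum the potential saving."""
    flagged = [r for r in resources if r.avg_cpu_utilisation < util_threshold]
    saving = sum(r.monthly_cost for r in flagged)
    return flagged, saving

fleet = [
    Resource("web-frontend", 1200.0, 0.55),
    Resource("batch-runner", 800.0, 0.04),  # idle most of the month
    Resource("staging-db", 600.0, 0.07),
]

flagged, saving = waste_report(fleet)
print([r.name for r in flagged], f"potential monthly saving: ${saving:,.0f}")
```

Run continuously against billing and telemetry data rather than as a one-off audit, a report like this becomes the backbone of the recurring FinOps-as-a-Service model described above.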

    An additional opportunity is the complexity of multi-cloud environments, which up to 92% of companies are implementing. Partners that can offer a unified management plane for AWS, Azure and GCP environments become invaluable to customers.

    Global trends provide the map, but success depends on skilfully navigating the local context. For Polish IT partners, 2025 is a time of strategic decisions. It is necessary to move away from a model of competing mainly on costs to building deep specialisation.

    Rather than being an ‘everything provider’, aim to be the best in a selected, high-margin niche, such as implementing regulatory compliance (e.g. DORA), implementing GenAI in a specific industry or consulting FinOps.

    The biggest barrier for clients is the lack of talent, so a partner’s most valuable asset is its team of experts. Investment in competence, certification and retraining of talented developers towards AI/ML engineering is the most important investment in future profitability.

    Equally important is a change in the way business discussions are conducted. The discussion must shift from the level of technological features to that of business outcomes. It is necessary to learn to quantify the value of the services provided, speaking the language of risk reduction, operational savings and return on investment.

    The market analysis for 2025 leads to one conclusion: the holy grail is not a single service, but a strategically balanced portfolio. The partner of the future is one that is specialised, agile and obsessed with delivering tangible value to customers.

    A historic opportunity is opening up for Polish companies in the IT sector. The country’s geographical location, access to outstanding talent and dynamic economy create ideal conditions for a leap forward in global competitiveness.

    The key will be to have the courage to invest in the most difficult but also the most promising areas of the market and become an indispensable guide for their customers in the smart technology era.

  • What are companies spending money on in IT? IT spending in 2025

    The technology landscape in 2025 presents a fascinating paradox. On the one hand, global IT spending is set to soar. Analysts at Gartner forecast an increase of 7.9 per cent to $5.43 trillion, while other forecasts predict an increase of up to 9.3 per cent to $5.74 trillion.

    On the other hand, this impressive growth is taking place against a backdrop of global economic uncertainty, geopolitical risk and unprecedented pressure on CIOs to justify every penny spent.

    While artificial intelligence, cyber security and cloud modernisation are the undisputed drivers of growth, the real story of 2025 is the battle for efficiency. CIOs are being forced to reallocate resources from maintaining outdated systems to strategic innovation, making budget optimisation not just a financial exercise but a key competitive strategy.

    This is echoed by Gartner’s John-David Lovelock, who notes that although budgets are increasing, much of the increase will merely cover price rises, distorting the picture of nominal spending versus real IT investment.

    Anatomy of an IT budget for 2025: where does the money flow?

    In order to understand how companies allocate their resources, we need to break down the typical IT budget. Although every organisation is different, analysing market trends allows us to create a representative model that sheds light on key priorities.

    The lion’s share of any IT budget, at around 35%, is taken up by salaries and team-related costs. The human factor remains the largest and most important single investment, including salaries, benefits, recruitment and training.

    ISG’s research indicates that almost half of all IT spend is on staff. Forrester confirms this trend, reporting that 71% of leaders expect to increase investment in staff, and Metrics’ research points to salaries as one of the main areas of cost growth. The high costs reflect the fierce competition for talent, especially in high-demand areas such as cyber security, artificial intelligence and cloud.

    Another significant segment, occupying around 20% of the budget, is software, dominated by subscription models (SaaS) and licences. This includes key enterprise platforms (ERP, CRM), collaboration tools and specialised applications.

    Gartner predicts that software spending will increase by a solid 10-14%, driven by investments in AI-based features and the need to upgrade legacy applications. At the same time, CIOs are focusing on platform consolidation to combat the sprawl of SaaS applications and reduce unnecessary spending.

    Cloud and infrastructure, accounting for around 15 per cent of expenditure, form the foundational layer of modern IT. This category includes the costs of public cloud providers (IaaS, PaaS), private cloud environments and hardware in on-premises data centres.

    Spending in this area is growing rapidly, driven by digital transformation initiatives, support for hybrid working and, most importantly, the need for scalable computing power for AI workloads. IDC forecasts that spending on cloud services alone will reach $1.3 trillion by 2025.

    Cyber security, once a smaller budget line item, is now a priority at board level and consumes around 12% of resources. It is a non-negotiable component of the budget, driven by the explosion of AI-driven threats and a wave of stringent new regulations such as DORA and the Cyber Resilience Act.

    Industry benchmarks indicate that security spending accounts for between 10% and 15% of the total IT budget in large enterprises.

    Approximately 10% of the budget is allocated to new projects and innovation, i.e. activities aimed at business development.

    These are dedicated funds for strategic initiatives that generate new revenues or transform business processes, including digital product development and experimentation with new technologies such as GenAI.

    The last category, with a share of around 8%, is maintenance and external operations.

    This includes non-wage costs associated with day-to-day operations, such as managed service provider (MSP) fees and outsourcing contracts for routine tasks.
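The shares in the representative budget model above sum to exactly 100%, so the model can be expressed as a simple allocation. The sketch below applies it to a hypothetical $10M budget; the total and the exact category labels are illustrative.

```python
# Representative 2025 IT budget model, using the shares quoted above.
BUDGET_SHARES = {
    "Salaries and team costs": 0.35,
    "Software (SaaS and licences)": 0.20,
    "Cloud and infrastructure": 0.15,
    "Cyber security": 0.12,
    "New projects and innovation": 0.10,
    "Maintenance and external operations": 0.08,
}

# The quoted shares cover the whole budget (35+20+15+12+10+8 = 100%).
assert abs(sum(BUDGET_SHARES.values()) - 1.0) < 1e-9

def allocate(total_budget: float) -> dict:
    """Split a total IT budget across the model's categories."""
    return {cat: round(total_budget * share, 2) for cat, share in BUDGET_SHARES.items()}

for category, amount in allocate(10_000_000).items():  # hypothetical $10M budget
    print(f"{category}: ${amount:,.0f}")
```

A model like this is only a starting point; the benchmarks cited above show real shares varying by industry and company size.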

    Key drivers of growth: what drives investment?

    Three powerful forces are shaping IT spending priorities in 2025. The first is artificial intelligence, which has become a real catalyst for change. Paradoxically, although generative AI is entering what Gartner calls the ‘trough of disillusionment’, meaning more moderate expectations, spending on it is accelerating.

    This signifies a maturing of the market, where investments are shifting from experimentation to pragmatic deployments. CIOs are investing in foundational infrastructure, such as AI-optimised servers, and in software with AI features built in. According to the research, 63% of CIOs plan to spend on AI/ML in 2025, making this area the third most important priority.

    The second driver is cyber security, which has ceased to be seen as a cost centre and has become a prerequisite for business. This is a top priority for CIOs for the fourth year in a row.

    Spending is being driven by an increase in cyber attacks, now amplified by AI, and by stringent new regulations such as the EU DORA regulation and the Cyber Resilience Act, which are forcing a proactive security posture.

    Budgets are mainly flowing into security services, cloud native application protection platforms (CNAPP) and identity and access management (IAM).

    The third driving force is the cloud and modernisation, seen as the only way to escape technology debt. Older systems are a major burden, with Deloitte research suggesting that as much as 56% of IT spend goes on maintenance.

    The cloud offers the agility, scalability and cost efficiency needed to compete. The PwC report identifies the modernisation of cloud platforms as a key objective for CIOs in 2025. The strategic goal is to move from simply moving workloads to building native applications in the cloud, which is reflected in the huge growth of the IaaS market.

    Strategic dilemma: the balance between retention and innovation

    The biggest challenge for IT leaders remains the age-old dilemma: how to balance spending on maintaining existing systems (‘Run’) with investing in innovation to drive growth (‘Grow’).

    Many organisations are stuck in a cycle where 70 per cent or more of their budget is absorbed by simply ‘keeping the lights on’, leaving few resources for digital transformation. This operational burden is the biggest single barrier to growth.

    This problem is compounded by inflationary pressures. As Gartner’s John-David Lovelock points out, increases in budgets are often swallowed up by rising supplier prices and wages, meaning that even with larger budgets, the funds available for new initiatives can shrink.

    This creates the ‘illusion of increased spending’, while the real value derived from investments remains stagnant.

    The combination of operational burden and inflationary pressures is forcing a radical change in IT financial management. CIOs can no longer win by simply asking for bigger budgets. They must win by reallocating existing resources more efficiently. The only way to fund necessary innovation is to aggressively cut maintenance costs.

    This elevates cost optimisation from a tactical exercise to a core strategic function. Strategies such as consolidating SaaS platforms, automating routine tasks, implementing rigorous cloud financial management (FinOps) and upgrading legacy systems are becoming key to unlocking capital for innovation.

    Budget as a compass for innovation

    The budget landscape for 2025 is characterised by growing but constrained budgets, dominated by talent, software and cloud spending. These investments are driven by the imperatives of artificial intelligence, cyber security and modernisation. The main challenge remains the same: striking a balance between the cost of keeping the lights on and funding innovation.

    To meet these challenges, IT leaders need to adopt a new approach. Value-based budgeting needs to be introduced, where every investment is linked to a measurable business outcome.

    Artificial intelligence should be treated not as a technology project but as an organisational change, requiring a solid foundation of data management and competence. Cost optimisation must become the engine of innovation, creating a self-financing growth mechanism through the reallocation of savings.

    Finally, digital resilience, built by strategic investment in cyber security, should be seen not as a cost, but as a competitive advantage that builds trust and enables faster innovation. In 2025, the IT budget ceases to be just a financial statement and becomes a strategic compass that sets the company’s course in the present day.

  • The $18bn video conferencing market is just beginning. Who will win the war for the meeting rooms of the future?

    The global video conferencing market, growing by 5% a year despite economic uncertainty, is proving its strength. However, the real revolution is happening behind the scenes. While the market is becoming increasingly ‘commoditised’, the key to victory is no longer just image quality, but strategic alliances, artificial intelligence and the conquest of the gigantic, still undeveloped potential of meeting rooms.

    The paradox of growth – labour transformation instead of product revolution

    At first glance, the video conferencing market seems to defy logic. Despite a difficult geopolitical and economic environment, it reached $18 billion in 2024, recording 5% year-on-year growth. Analysts forecast that this pace will be maintained, with the prospect of reaching $21 billion by 2029. Other analyses are even more optimistic, pointing to the value of the market reaching between $24 billion and even $60 billion by the beginning of the next decade.

    At the same time, the same market is becoming a victim of its own success. The market is becoming increasingly commoditised, with very little product differentiation between vendor offerings. Basic features such as HD video, screen sharing and chat have become standard. So if the driver of growth is not a product revolution, what is? The answer is clear: a fundamental and permanent transformation of the global working model. Companies are no longer buying a virtual meeting tool; they are investing in a strategic infrastructure to connect dispersed teams, increase productivity and ensure business continuity. Video conferencing has become the communications backbone of the modern organisation, and the value has shifted from product features to its role in the wider business ecosystem.

    Growth is also being driven by expansion into new markets. While North America (42% subscription market share) appears saturated, Asia and Oceania (25% share) are showing ‘promising growth’. At the same time, the market is deepening its penetration in sectors such as healthcare, education and finance, demonstrating that video conferencing has moved far beyond traditional corporate applications.

    Microsoft: architect of the new ecosystem

    In a market where victory depends on control of the entire ecosystem, no one plays the game more effectively than Microsoft. Its strategy is a textbook example of ‘platformisation’, where the goal is not to sell Teams applications, but to dominate the entire value chain.

    The starting point is the dominant 49% market share of paid virtual meeting services. The source of this power is the deep integration of Teams into the Microsoft 365 ecosystem, making it an integrated functionality rather than an additional expense for millions of companies. The second pillar of the strategy is the Device Certification Programme for Microsoft Teams. On the surface it is a quality control mechanism, but in reality it is a powerful strategic control tool. By setting rigorous standards, Microsoft de facto dictates the development roadmap for equipment manufacturers (OEMs) such as Logitech, Poly and Yealink, who must align their products with the giant’s vision in order to gain access to its huge customer base.

    The strategy is complemented by a global network of implementation partners, motivated by the “Meetings and Meeting Rooms for Microsoft Teams” specialisation, and a smart licensing model. By offering a free Teams Rooms Basic licence and a paid Teams Rooms Pro licence, Microsoft is encouraging deeper integration. It is the Pro version that unlocks the full capabilities of the ecosystem, including advanced AI features, dual-screen support and, crucially for IT, remote management and analytics, closing the monetisation cycle.

    Hidden treasure: the battle for the conference room

    The real battle for the future of the video conferencing market is not about personal licences, but about physical meeting rooms. This is where the gigantic untapped potential lies. Omdia’s data is clear: only 28% of all meeting rooms worldwide have any form of video conferencing equipment, and only 6.25% of these are fully standardised, native rooms such as Microsoft Teams Rooms or Zoom Rooms. This means that around 72% of the rooms are a technology desert.

    In response to this market opportunity, two philosophies of room fit-out have crystallised:

    • Native systems (e.g. Microsoft Teams Rooms, Zoom Rooms): They rely on standardisation, simplicity and central management by IT. They offer a consistent user experience (‘one-touch join’) and high security, but at the expense of flexibility and the risk of dependence on a single provider.
    • BYOD/BYOM (Bring Your Own Device/Meeting) systems: Prioritise flexibility, allowing users to use their own laptops and any software. They are cheaper to implement, but generate potential security risks and compatibility issues, putting a strain on support departments.

    The market quickly realised that the ‘either-or’ choice was suboptimal. In response, a third way is emerging: all-in-one video bars, often based on Android. These devices, offered by companies such as AVer, operate in dual mode. They can function as a native, centrally managed Teams or Zoom room and, at the same time, when connected to a laptop via USB, switch to BYOD mode, becoming a high-quality peripheral for any application. This solution reconciles the conflicting interests of IT and users, making it the likely dominant architecture for meeting rooms in the future.

    Alliances in the shadow of giants

    In a market reality dominated by platform giants, a ‘lone wolf’ strategy is a recipe for failure. The value for the customer is shifting to the holistic, integrated experience that results from the synergy between software and hardware. Strategic alliances are becoming an absolute necessity.

    Integrated ecosystem model: Logitech and Microsoft

    The partnership between Logitech and Microsoft is a model of deep vertical integration. Logitech offers a wide range of ‘Certified for Teams’ devices, covering every conceivable need. Key here is joint innovation in AI. Advanced software functions, such as IntelliFrame and Copilot in Teams, require high-quality data from the hardware; Logitech’s hardware innovations, such as smart cameras or RightSound 2 audio technology, are therefore designed to enhance Microsoft’s AI services, creating a true symbiosis.

    The open ecosystem model: Poly (HP) and Zoom

    The strategy of Zoom and its key hardware partner, Poly, is an alternative to Microsoft’s closed garden. It focuses on openness and hardware innovation. Poly’s DirectorAI smart camera technology becomes a powerful hardware argument for choosing the Zoom Rooms ecosystem. Interoperability is also a key element of the strategy.

    A future defined by AI and ecosystems

    Analysis of the video conferencing market leads to a clear conclusion: the era of video conferencing as a standalone application is over. The future belongs to intelligent, integrated collaboration platforms, and success will be determined by the strength and cohesion of an ecosystem in which the battle for the meeting room becomes a strategic objective.

    The new frontier of differentiation and a key driver of value is artificial intelligence. As core functions become commoditised, it is AI-driven capabilities – such as automated meeting summaries, intelligent speaker tracking and predictive analytics – that will drive competitive advantage. The future of meetings is not just about being able to see and hear each other, but about having an intelligent assistant that understands the context of the conversation and actively supports team productivity. The $18 billion market is just the tip of the iceberg. The real prize is the creation of a technology infrastructure that will support trillions of hours of virtual and hybrid collaboration in the coming decade. That prize will be won – or lost – in the boardrooms of the future.

    The strategic comparison of the two ecosystems can be summarised as follows:

    • Main challenge – Microsoft Teams: providing flexibility in a multi-platform world. Zoom: competing with Microsoft’s distribution strength and bundled offerings.
    • Dominant business model – Microsoft Teams: selling an integrated platform (Microsoft 365) in which Teams is a key element. Zoom: selling communication services as ‘best-of-breed’, with an increasing emphasis on the platform (Zoom Workplace).
    • Hardware strategy – Microsoft Teams: ecosystem control through a rigorous ‘Certified for Teams’ certification programme. Zoom: building alliances with leading equipment manufacturers (e.g. Poly, Neat, DTEN) and promoting hardware innovation.
    • Approach to the conference room – Microsoft Teams: priority for native, standardised ‘Microsoft Teams Rooms’ for consistency and central IT management. Zoom: strong support for ‘Zoom Rooms’, combined with a strong emphasis on flexibility and interoperability with other platforms.
    • Key partners – Microsoft Teams: hardware – Logitech, Poly, Crestron, Yealink; channel – a global network of partners with the ‘Meetings and Meeting Rooms’ specialisation. Zoom: hardware – Poly (HP), Logitech, Neat, DTEN; interoperability – Pexip.
    • AI strategy – Microsoft Teams: deep integration with Copilot, leveraging data from across the M365 ecosystem; hardware-based AI (IntelliFrame) enhances AI in the cloud. Zoom: development of its own AI Companion and intelligent functions (e.g. Zoom Intelligent Director) in close cooperation with hardware partners.
    • Core strength – Microsoft Teams: the network effect and customer attachment to the integrated Microsoft 365 package. Zoom: perceived ease of use, flexibility and strong branding in video communication.
  • AI in cyber security, or how to save $1.76m and cut attack response times by 108 days

    Every IT leader and security director knows this scenario all too well: the endless stream of security alerts, the growing fatigue of the analyst team (so-called alert fatigue), the acute shortage of specialists in the market and the relentless pressure to optimise budgets. In such a demanding environment, relying solely on manual threat analysis is becoming an inefficient and dangerous model. Cybercriminals are already using artificial intelligence and automation on a massive scale to select targets, create highly personalised phishing attacks and rapidly bypass traditional defences. A defence based solely on human response is doomed from the start: it is too slow, too costly and too error-prone.

    However, recent data shows that there is a measurable and highly effective answer to this challenge. A key finding from IBM’s ‘Cost of a Data Breach 2023’ report is clear: the use of AI and automation is the #1 factor that most strongly reduces the financial impact of a security breach. In this article, we’ll explore where the $1.76 million saving comes from and how reducing the attack lifecycle by 108 days translates into real benefits for your organisation.

    The investment with the highest ROI

    When analysing the factors that have the greatest impact on minimising financial losses after an incident, the data leaves no illusions. Investing in intelligent automation is simply the most cost-effective strategy for building cyber resilience. IBM’s 2023 report, compiling the factors that mitigate the costs of a breach, makes the point clear. Implementing AI and automation is at the top of the list, offering the greatest reduction in potential losses. Significantly, this factor even surpasses such fundamental elements as having a dedicated Incident Response Team or using well-established DevSecOps practices.

    The figures speak for themselves. The average saving of $1.76 million is the difference in breach cost between organisations with fully implemented AI systems and those that do not use them at all. In the former group, the average cost was $3.93 million, while in the latter it was as high as $5.69 million. This is the largest single saving identified in the entire study, making AI and automation the investment with the highest documented return.

    Although these amounts seem huge, the principle of proportion is universal. For a Polish e-commerce company, software house or manufacturing plant, a reduction in potential losses of more than 30% can mean the difference between surviving the crisis and bankruptcy. This is not an abstract statistic for global corporations, but a hard indicator of strategic maturity. This is particularly relevant in the context of the Polish market, which is facing an alarming shortage of cyber-security specialists, estimated at between 10,000 and 17,500 people, while demand for these competences is growing at the highest rate in Europe (36% year-on-year). Moreover, as many as 39% of Polish companies do not employ a single employee responsible for cyber security, creating ideal conditions for attackers.
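The figures quoted above tie together arithmetically: the $1.76M saving is the difference between the two average breach costs, and dividing it by the higher cost gives the "more than 30%" reduction mentioned above.

```python
# Cross-checking the IBM breach-cost figures quoted above.
cost_without_ai = 5.69  # average breach cost, no AI/automation (USD millions)
cost_with_ai = 3.93     # average breach cost, fully deployed AI/automation

saving = cost_without_ai - cost_with_ai
reduction_pct = saving / cost_without_ai * 100

print(f"saving: ${saving:.2f}M, reduction: {reduction_pct:.1f}%")
# saving: $1.76M, reduction: 30.9%
```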

    Time is money: how AI shortens the attack lifecycle by more than 3 months

    A second, equally key dimension of benefit is the drastic reduction in exposure time to an active threat. Organisations that make full use of AI and automation identify and stop attacks on average 108 days faster than companies that do not (214 days vs 322 days). That’s a difference of more than three months that has a direct cost impact.

    Why such a powerful difference? The mechanism is simple and is based on the synergy of two key technologies:

    • Detection at the speed of the machine (UEBA): AI does what humans will never be able to do – analyse billions of events from system logs, network traffic and endpoints in real time. Advanced User and Entity Behaviour Analytics (UEBA) modules build patterns of normal behaviour for every user and device on the network. This allows them to detect subtle anomalies – such as logging in from an unusual location at an unusual time, accessing rarely used files or attempting to escalate permissions – that would easily escape an overloaded analyst . It is these deviations from the norm that are often the first sign of an advanced attack.
    • Real-time response (SOAR): Once an anomaly is detected, automation comes into play, usually in the form of SOAR (Security Orchestration, Automation, and Response) platforms. It allows an immediate, predefined response – e.g. automatically isolating an infected workstation from the network, blocking a suspicious user account or running a detailed scan – without waiting for human intervention. This eliminates the most costly element in the response chain: delay.
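The detect-and-respond loop above can be sketched in a few lines of Python. This is a deliberately minimal illustration, not a product implementation: the login-hour baseline, the z-score threshold and the `isolate_host` action are all invented for the example.

```python
from statistics import mean, stdev

# Baseline login hours observed for one user over recent sessions.
baseline_hours = [8, 9, 9, 10, 8, 9, 17, 16, 9, 10]

def is_anomalous(hour: int, history: list[int], threshold: float = 2.0) -> bool:
    """Flag a login hour that deviates strongly from the user's baseline."""
    mu, sigma = mean(history), stdev(history)
    return abs(hour - mu) / sigma > threshold

def isolate_host(host: str) -> str:
    """Stand-in for a SOAR playbook action (e.g. an EDR quarantine call)."""
    return f"host {host} isolated from network"

# A 3 a.m. login stands out against a daytime baseline, so the
# predefined response fires without waiting for an analyst.
if is_anomalous(3, baseline_hours):
    print(isolate_host("WS-0042"))
```

A real UEBA engine models far richer features than a single login hour, but the principle is the same: a statistical deviation from the learned norm triggers a predefined action with no human in the loop.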

    It is important to remember that each additional day of active breach represents an increasing cost: more stolen or encrypted data, longer and more severe operational downtime, greater reputational damage and rising legal and crisis communications costs. The IBM report confirms this correlation, showing that breaches lasting more than 200 days cost, on average, more than US$1 million more than those contained below this threshold.

    Which AI technologies to implement in the company?

    The question that naturally arises is “OK, what specifically should I invest in?”. Implementing AI and automation is not a monolithic project, but a process that can be pursued by integrating specific classes of solutions.

    Here are the key technologies driving this revolution:

    • SIEM / XDR with behavioural analytics (UEBA): The brain of the operation. Modern SIEM (Security Information and Event Management) and XDR (Extended Detection and Response) systems enhanced with UEBA modules become the focal point for analysing and correlating events and detecting advanced, hidden threats.
    • SOAR (Security Orchestration, Automation, and Response): This is the nervous system that turns detection into action. SOAR platforms automate repetitive tasks and entire response procedures (playbooks) by integrating different security tools and orchestrating their joint response.
    • Modern EDRs (Endpoint Detection and Response): These are automatic gatekeepers on workstations and servers. EDR systems use AI to block malware, analyse attack techniques and automatically roll back malicious changes.
    • AI in data protection (DLP): Traditional Data Loss Prevention (DLP) systems based on simple rules often fail. Modern solutions use AI to understand the context and content of data, allowing for much more precise identification and protection of sensitive information, drastically reducing false positives.
    • Identity Management (IAM): Intelligent IAM systems implement so-called adaptive multi-factor authentication (MFA). AI continuously assesses the risk of a given session and dynamically adjusts the authentication requirements, requesting additional verification only when necessary.
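As an illustration of the last item, adaptive MFA boils down to scoring each session and mapping the score to an authentication step. The sketch below is hypothetical: the signals, weights and thresholds are invented for the example and not taken from any product.

```python
def session_risk(new_device: bool, new_location: bool, off_hours: bool,
                 failed_attempts: int) -> int:
    """Toy risk score for a login session; the weights are illustrative."""
    score = 40 * new_device + 30 * new_location + 10 * off_hours
    score += min(failed_attempts, 3) * 10
    return score

def mfa_requirement(score: int) -> str:
    """Map risk to an authentication step, as adaptive MFA policies do."""
    if score >= 80:
        return "block"
    if score >= 40:
        return "require MFA"
    return "allow"

# Known device, known location, business hours: no extra friction.
print(mfa_requirement(session_risk(False, False, False, 0)))   # allow
# New device in a new location: step-up verification is requested.
print(mfa_requirement(session_risk(True, True, False, 0)))     # require MFA
```

The point of the adaptive model is visible in the two calls: extra verification is requested only when the risk signals justify it, instead of on every login.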

    It is worth noting that many of these advanced solutions are today available in a flexible cloud model (SaaS). This significantly lowers the entry threshold, eliminating the need for large, one-off capital investments (CAPEX) in favour of predictable, monthly operating costs (OPEX). For Polish companies, especially those in the SME sector, this is a historic opportunity to access technologies that only a few years ago were reserved for the largest global players.

    Strategic context: Evolution of threats and defences

    Analysis of the 2023 data provides a powerful argument, but the picture becomes even clearer when looking at the latest trends. Successive editions of the IBM report show that the role of AI is becoming even more critical. Evolving from response to prevention: the 2024 report (analysing data from 2023-2024) reveals that the value of AI is shifting towards prevention. Organisations that used AI extensively in areas such as attack surface management or proactive security testing reduced the average cost of a breach by US$2.2 million compared to companies not using these technologies. This is proof that the technology is maturing, allowing it not only to put out fires faster, but to prevent them from starting in the first place.

    New Risk – ‘Shadow AI’: The latest research introduces a worrying new term into the lexicon: “Shadow AI”. It refers to the uncontrolled use of AI tools by employees, without the knowledge or oversight of IT departments. Incidents in which ‘Shadow AI’ played a role cost companies an average of US$670,000 more. As many as 97% of companies affected by an AI incident did not have appropriate access control mechanisms in place for these technologies.

    This data leads to a fundamental conclusion. The risk is not AI technology itself, but its uncontrolled, ‘wild’ application. This is a powerful argument for implementing a centralised AI governance model. The security team needs to evolve from a ‘braking’ role into a partner that enables AI-based innovation to be implemented safely.

    The era when cyber security was based on manual analysis and reactive firefighting is irrevocably passing. The data clearly shows that the future – indeed the present – of a secure and resilient organisation lies in intelligent automation.

    Conclusion 1 (Financial): Investment in AI and automation in cyber security delivers the greatest measurable return, reducing the average cost of a breach by US$1.76m.

    Conclusion 2 (Operational): These technologies reduce risk exposure time by as much as 108 days, which directly translates into lower operational, financial and image losses.

    Conclusion 3 (Strategic): AI is no longer a futuristic add-on, but a strategic necessity and the most effective way to build real digital resilience in the face of automated attacks and a critical shortage of specialists in the market.

    The call to action for any IT leader is therefore simple: conduct an audit of your current technology stack. Evaluate in which areas – from detection to response to identity protection – the implementation of intelligent automation will yield the quickest and greatest benefits. It is no longer a question of ‘if’, but ‘when and how’ to use these technologies to defend your organisation. The time to act is now.

  • Post Quantum Cryptography: Business implications of algorithm choice

    Post Quantum Cryptography: Business implications of algorithm choice

    Today’s digital economy rests on an invisible but fundamental pillar: public key cryptography. Algorithms such as RSA and ECC have become synonymous with digital trust, but this foundation faces an existential threat. The advent of quantum computers capable of breaking current encryption standards is not the next evolutionary step in cyber security; it is a revolution that is forcing technology leaders to fundamentally change their thinking about data protection.

    The quantum threat is not a distant, theoretical possibility. It materialises today through a strategy known as ‘Harvest Now, Decrypt Later’ (HNDL). Adversaries, including state actors, are actively capturing and storing vast amounts of encrypted data, patiently waiting for the moment when a cryptographically relevant quantum computer (CRQC) becomes operational. At that point, the data we consider secure today will be retrospectively breached. This fundamentally changes the risk model, shifting the responsibility from purely operational to strategic, related to protecting the long-term value of the company. Any information whose required confidentiality lifetime extends beyond the anticipated arrival of a CRQC is already at risk.

    The arguments for postponing migration to post-quantum cryptography (PQC) have run out of steam. In August 2024, the US National Institute of Standards and Technology (NIST) published the first finalised PQC standards: FIPS 203 (ML-KEM), FIPS 204 (ML-DSA) and FIPS 205 (SLH-DSA). This event sends a key signal to the global market: the research phase is over and the standards are awaiting implementation.

    End of the era of “one right encryption”

    For the past decades, public key cryptography decisions have been relatively straightforward, mostly boiling down to a choice between RSA and ECC. The post-quantum era puts a definitive end to this paradigm. We are entering a world where there is not and will not be a one-size-fits-all PQC algorithm. Choosing the right solution becomes a conscious strategic decision that must be precisely tailored to the specific use case.

    The world of PQC is inherently heterogeneous. NIST deliberately standardises algorithms from different mathematical families because each offers a unique set of trade-offs between computational efficiency, data size, resource usage and security level. Algorithm selection ceases to be solely the domain of cryptographers and becomes an architectural and product decision. Differences in the characteristics of individual algorithms lead to drastically different consequences depending on the deployment scenario.

    For high-performance TLS servers supporting, for example, an e-commerce front-end, minimal latency is a priority. Benchmark data clearly shows that CRYSTALS-Kyber (ML-KEM) leads the way, offering negligible performance overhead. In contrast, in VPNs where maximum throughput is crucial, algorithms with very large keys, such as BIKE, can become an issue, causing massive IP packet fragmentation and degrading network performance. In the world of IoT devices, where resources are extremely limited, algorithms with high memory requirements can be completely impractical, directly impacting component cost.

    The data speaks for itself

    Abstract discussions become real when hard data is analysed. Translating milliseconds of latency and kilobytes of data into concrete business implications is key to making informed decisions. Analysis based on comprehensive benchmarks shows that the choice of PQC algorithm is not only a question of security, but also of real operational costs.

    The performance leader is undisputedly CRYSTALS-Kyber, which shows negligible computational overhead in server tests. Its public keys and ciphertexts are compact, occupying just over 2 KB in total, and peak RAM consumption is minimal, making it an ideal candidate for a wide range of applications. At the other extreme is BIKE, a code-based algorithm. Despite its advantages, its implementation comes at a tangible cost: the total data exchanged during the handshake exceeds 10 KB, more than four times that of Kyber. This size not only increases data transfer costs, but above all risks fragmenting network packets.

    Even more striking is the example of the Falcon signature algorithm. It offers extremely small signatures, which is a huge advantage in IoT applications where bandwidth is at a premium. However, its peak memory consumption is more than 26 times that of the competing Dilithium. For an engineer designing a medical device, this may mean having to choose a more expensive microcontroller, increasing the unit cost of the product.
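A quick calculation shows why these sizes matter on the wire. The sketch below estimates IP fragmentation for a single UDP datagram; the handshake sizes are rough figures based on the benchmarks quoted above, and real values depend on the parameter set and protocol framing.

```python
import math

MTU = 1500             # typical Ethernet MTU, bytes
IP_UDP_OVERHEAD = 28   # IPv4 + UDP headers, bytes

def datagram_fragments(payload_bytes: int) -> int:
    """Number of IP fragments one UDP datagram of this size produces."""
    return math.ceil(payload_bytes / (MTU - IP_UDP_OVERHEAD))

# Rough handshake payloads based on the figures quoted above (illustrative).
handshakes = {
    "ML-KEM (Kyber)": 2_300,   # ~2.3 KB of public key plus ciphertext
    "BIKE":           10_500,  # >10 KB exchanged, per the benchmark data
}

for name, size in handshakes.items():
    print(f"{name}: {size} B -> {datagram_fragments(size)} fragment(s)")
```

Two fragments versus eight per handshake is exactly the kind of difference that degrades VPN throughput on lossy links, since losing any single fragment forces retransmission of the whole datagram.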

    Hidden risks and recommendations for CISOs

    Achieving compliance with PQC standards is just the first step. The biggest risks lie not in the theory of the algorithms, but in their practical implementation. Mathematical robustness is worthless if the physical implementation on the processor ‘leaks’ information about the secret key through side-channel attacks. Research has already shown successful attacks on Kyber and Dilithium implementations using analysis of power consumption or electromagnetic emissions.

    A key implication for CISOs is the hidden cost of security. Securing an implementation requires specialised countermeasures, such as masking, which introduce a significant performance overhead. NIST representatives have admitted that a well-secured implementation of Kyber can be up to twice as slow as its baseline version. This means that infrastructure budgets based on benchmarks of unsecured implementations are fundamentally flawed.

    Another area of risk is the supply chain. The security of a PQC system is only as strong as its weakest link, and these are often outside the company’s control – in open-source libraries or cloud providers. The CISO needs to start asking vendors tough questions about their roadmap for implementing NIST standards, support for hybrid modes and documented resilience to physical attacks.

    The solution to these challenges is crypto-agility – the ability of an architecture to easily and quickly replace cryptographic algorithms without fundamental changes to the infrastructure. In the dynamic world of PQC, where new standards are already emerging, this approach is a recipe for avoiding costly and risky migrations in the future. Investing in a crypto-switching architecture now, as part of the first PQC project, is a strategic decision that will drastically reduce the total cost of security ownership over the coming decade.
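The idea of crypto-agility can be made concrete with a small abstraction layer: algorithms register behind one interface and callers select them by policy. The sketch below is hypothetical (the `KemAlgorithm` and `negotiate` names are invented); a real deployment would wrap a PQC library rather than stub lambdas.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class KemAlgorithm:
    """A key-encapsulation mechanism behind one uniform interface."""
    name: str
    keygen: Callable[[], tuple[bytes, bytes]]            # -> (public, secret)
    encapsulate: Callable[[bytes], tuple[bytes, bytes]]  # pub -> (ct, shared)
    decapsulate: Callable[[bytes, bytes], bytes]         # (secret, ct) -> shared

REGISTRY: dict[str, KemAlgorithm] = {}

def register(alg: KemAlgorithm) -> None:
    REGISTRY[alg.name] = alg

def negotiate(policy: list[str]) -> KemAlgorithm:
    """Pick the first algorithm on the policy list that is available.
    Replacing an algorithm later means editing the policy, not the callers."""
    for name in policy:
        if name in REGISTRY:
            return REGISTRY[name]
    raise ValueError("no supported algorithm in policy")

# Stub entry so the registry can be exercised without a real PQC library.
register(KemAlgorithm("x25519",
                      keygen=lambda: (b"pk", b"sk"),
                      encapsulate=lambda pk: (b"ct", b"ss"),
                      decapsulate=lambda sk, ct: b"ss"))

print(negotiate(["ml-kem-768", "x25519"]).name)  # falls back to x25519
```

The design point is that swapping RSA for ML-KEM, or ML-KEM for a successor, becomes a configuration change at the policy layer rather than a rewrite of every call site.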

    Migration to PQC is not a one-off project, but a strategic transformation programme. It should proceed in a methodical and phased manner. The first, fundamental phase is in-depth preparation and inventory. Crucial here is the creation of a detailed cryptographic inventory that maps all systems using vulnerable cryptography. Then, based on a risk analysis, the migration should be prioritised, focusing on assets with the longest required confidentiality lifecycle. The second phase is to build technical capacity, including designing systems for the aforementioned crypto agility and launching controlled pilot projects in hybrid modes. This approach allows the collection of real data on the impact of PQC on the company’s specific environment. The final phase is the actual phased migration of production systems and the establishment of continuous monitoring processes.

    The move to post-quantum cryptography is an evolution in technology risk management. The key long-term investment is not in implementing a specific algorithm, but in building a crypto agile architecture. It is this agility that will allow the company to adapt quickly and efficiently to the inevitable changes in the cyber security landscape, ensuring resilience and protecting its future.

  • Business laptops in H1 2025: Leaders, strategies and forecasts in the age of AI

    Business laptops in H1 2025: Leaders, strategies and forecasts in the age of AI

    2025 is not another year of evolution in the business laptop market; it is a year of revolutionary upheaval. The confluence of three powerful forces – the massive deployment of on-device AI (artificial intelligence), the imminent end of Windows 10 support and the permanent entrenchment of hybrid working models – is creating a perfect storm that redefines what a PC is and can do for the professional. It is a fundamental paradigm shift, shifting the centre of gravity from raw computing power to intelligent, integrated and secure ecosystems.

    This transformation is based on three pillars. The first is the AI revolution, which shifts the burden of computing from the cloud directly to the device. With dedicated neural processing units (NPUs), laptops gain an unprecedented ability to personalise, automate tasks and deliver real-time productivity, all while maintaining the highest standards of data privacy. The second pillar is a powerful, unavoidable market catalyst: the 14 October 2025 deadline marking the end of support for Windows 10. It forces businesses around the world to upgrade their PC fleets en masse, perfectly coinciding with the release of the next generation of hardware. The third pillar is the new working paradigm. The hybrid model has become the norm, forcing manufacturers to create devices that are not only powerful, but also ultra-secure, highly mobile and optimised for continuous video collaboration.

    In this new landscape, competitive advantage is no longer defined solely by processor power, measured in gigahertz, or the number of cores. The winners in 2025 will be those vendors who offer the most cohesive, intelligent and secure ecosystem, combining hardware, software and services in a way that realistically addresses the challenges of a decentralised, AI-driven working environment. This analysis delves into the market data, technology strategies and positioning of key players to identify the trends that will define the future of business laptops.

    The market landscape in 2025 – a rebound in the shadow of uncertainty

    The first half of 2025 has brought a long-awaited recovery to the PC market. After a period of stagnation, global PC shipments recorded solid growth, increasing by 9% year-on-year in the first quarter and by 7% in the second quarter. At first glance, these figures signal the return of healthy demand and the start of a new buying cycle.

    An analysis of the leading manufacturers shows varying dynamics. Lenovo maintained its position as the undisputed leader, recording impressive shipment growth of 11% in Q1 and 15.2% in Q2 2025, demonstrating the effectiveness of its product and operational strategy. HP Inc. steadily took second place, maintaining a solid market share. The real star of the first half of the year, however, turned out to be Apple, which demonstrated a spectacular growth rate, with shipments increasing by 22% in Q1 and 21.3% in Q2. Such results point to the growing acceptance of the Mac platform in commercial and professional environments, which have traditionally been a bastion of Windows systems. In contrast, Dell, the third largest player, reported a 3.0% decline in shipments in Q2, which may signal some challenges in aligning its portfolio to the current product cycle or stronger competition from rivals.

    Although the figures point to recovery, a deeper analysis reveals a more complex picture. The seemingly strong growth figures for the first half of 2025 do not fully reflect organic demand from end users. They are largely the result of strategic stockpiling by manufacturers and distribution channels in response to growing geopolitical uncertainty and the threat of new trade tariffs, particularly in the US.[4, 5] Manufacturers have been deliberately accelerating orders to get ahead of potential tariff increases on electronics, meaning that high delivery numbers may be significantly ahead of actual end demand. This creates the risk of excess inventory in the distribution channel, which could lead to price pressure and a potential market correction in the second half of the year.

    The Polish IT market reflects global recovery trends, but with its own unique characteristics. In the first quarter of 2025, PC sales in Polish distribution grew by an impressive 18.1% year-on-year, but this growth was driven primarily by the consumer segment. The business sector showed more restraint, revealing the existence of a ‘value proof gap’. In the Polish market, there is a clear discrepancy between the global marketing hype around AI PCs and the actual purchasing decisions of companies. While global forecasts predict that AI laptops will account for nearly 60% of the market as early as 2025, first-quarter data from the Polish market shows that sales of the latest AI-optimised chips were “lower than expected”. Business customers were more likely to opt for tried-and-tested and often cheaper older-generation processors.

    The AI PC revolution – the birth of a new category of device

    The year 2025 marks the point at which the term ‘AI PC’ moves from the marketing buzzword phase to a real, defined product category. The de facto market benchmark that has defined what a modern AI PC is has become the Microsoft Copilot+ PC standard. This is a new class of Windows-based device designed from the ground up for local processing of AI tasks. In order for a laptop to carry this designation, it must meet stringent technical requirements: a neural processing unit (NPU) performance of at least 40 trillion operations per second (TOPS), a minimum of 16 GB of RAM and a fast SSD of at least 256 GB.
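Those three thresholds are easy to encode as a procurement check. A minimal sketch, assuming only the baseline figures quoted above:

```python
def meets_copilot_plus(npu_tops: float, ram_gb: int, ssd_gb: int) -> bool:
    """Check a spec sheet against the Copilot+ PC baseline quoted above:
    NPU >= 40 TOPS, RAM >= 16 GB, SSD >= 256 GB."""
    return npu_tops >= 40 and ram_gb >= 16 and ssd_gb >= 256

# 47 TOPS with 16 GB RAM and a 512 GB SSD clears the bar; a 38 TOPS NPU
# does not, regardless of the other components.
print(meets_copilot_plus(47, 16, 512))   # True
print(meets_copilot_plus(38, 32, 1024))  # False
```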

    The role of the NPU is fundamental. Unlike traditional architectures, where AI tasks overloaded the main processor (CPU) or graphics card (GPU), the NPU is specifically designed to efficiently perform these calculations with significantly lower power consumption. This translates into longer battery life and quieter operation – key attributes in a mobile business environment.

    Practical business applications are becoming more and more tangible. Video conferencing is becoming smarter with features such as real-time noise cancellation and automatic speaker framing. Office applications such as Microsoft Teams and Outlook offer automatic transcriptions and summaries of meetings and email threads, freeing up staff time. In Excel, it becomes possible to generate complex reports from simple natural language commands.

    The sudden rise of AI PCs is being driven by two powerful, coinciding developments. First, the end of Windows 10 support on 14 October 2025 acts as a powerful commercial catalyst, forcing companies to upgrade their hardware fleets. At this point, Copilot+ PCs are positioned as the logical, standard choice for any hardware replacement. Secondly, we are seeing a fundamental architectural shift: the re-decentralisation of computing. Moving AI computing from the cloud to the end-device addresses growing business concerns about privacy, speed, reliability and cost control associated with the cloud model.

    The war for silicon – who is driving the intelligence on board?

    At the heart of the AI PC revolution is a new generation of processors, and competition between processor makers has become a key battleground. In the AI PC era, the key performance indicator has become NPU power, expressed in TOPS (Trillions of Operations Per Second).

    Each of the four major players in the silicon market has adopted a different strategy. Intel, with its new generation of Core Ultra (‘Lunar Lake’) processors delivering 47 TOPS, is focused on defending its dominant position in the commercial segment by leveraging its mature x86 ecosystem. AMD, with its Ryzen AI 300 series (‘Strix Point’) offering 50 TOPS, is positioning itself as an aggressive contender, targeting the crown of performance leader, including in integrated graphics. Qualcomm, with its Snapdragon X Elite (45 TOPS), is making a breakthrough, bringing the highly energy-efficient ARM architecture to the Windows world and promising revolutionary battery life. Finally, Apple, with its M4 chip (38 TOPS), continues its strategy of full vertical integration, focusing on optimising the synergy between hardware and software rather than the race for the highest nominal TOPS value alone.

    Behind the façade of numbers lie deeper dimensions of competition. The real battle is to create the most efficient development platform, as it is the software, not the hardware, that will ultimately deliver value to the user. Qualcomm’s entry into the Windows market with ARM architecture also creates a fundamental dilemma for IT decision-makers: whether to bet on the guaranteed compatibility of the x86 platform, or risk potential problems with emulating legacy applications in favour of the ARM platform’s significantly longer battery life.

    The battle for the ecosystem, not just the hardware

    Faced with such fundamental changes, the major laptop manufacturers are each striving to build a coherent ecosystem of their own.

    Lenovo is pursuing a two-pronged strategy. “AI Engine+” in the Legion series is targeted at gamers and optimises performance in real time. “AI Now” in the ThinkPad series, on the other hand, is a personal assistant for professionals, running locally on the device and emphasising privacy. This strategy is supported by the solid financial performance of the Intelligent Devices Group (IDG) division, which reported strong double-digit revenue growth.

    HP is focusing on an integrated ‘HP AI Companion’ platform to create ‘intelligent workflows’ and personalise the user experience, with a strong focus on security and on-device computing. The company has taken a flexible approach, offering processors from Intel, AMD and Qualcomm in its AI PC laptops, with the HP Wolf Security suite as a key component of the offering. The success of this strategy is evidenced by 9% revenue growth in the commercial segment in the second quarter of fiscal 2025.

    Dell is taking a more evolutionary approach to the AI revolution, gradually integrating intelligent features into its proven business lines such as Latitude and Precision. The company is focusing on practical benefits: optimising performance, enhancing security and improving user interaction. This conservative strategy, reflected in flat revenue in the commercial segment in Q2 2025, seems primarily aimed at maintaining the trust of existing large enterprise customers.

    Apple bases its strategy on deep ecosystem integration and uncompromised privacy. The ‘Apple Intelligence’ suite is woven into the core of the operating systems, with most operations taking place locally on the device. The company’s focus is on the fluidity and real-world usability of AI features that draw strength from awareness of the user’s personal context. Impressive double-digit growth in Mac shipments in the first half of 2025 is evidence that this strategy is resonating with the market.

    The year 2025 marks a turning point that requires IT decision-makers to rethink their purchasing strategies. AI is becoming standard, the platform market is fragmenting (x86 vs. ARM) and security is becoming an integral part of hardware.

    Choosing the right business laptop requires a new, holistic approach. The traditional purchasing model, based mainly on technical specifications, is no longer sufficient. IT decision-makers need to think like strategists, basing the decision-making process on a three-dimensional matrix that takes into account the specific needs of the organisation: the performance axis (including NPU and energy efficiency), the ecosystem axis (the trade-off between x86 compatibility and ARM battery performance) and the security axis (the depth of integration of hardware solutions).
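One way to operationalise this three-axis matrix is a simple weighted score. The sketch below is illustrative only: the weights, candidate names and ratings are invented, and each organisation would set its own.

```python
# Axis weights express the organisation's priorities (illustrative values).
WEIGHTS = {"performance": 0.40, "ecosystem": 0.35, "security": 0.25}

def score(device: dict[str, float]) -> float:
    """Weighted score across the three axes, each rated 0-10."""
    return sum(WEIGHTS[axis] * device[axis] for axis in WEIGHTS)

# Hypothetical candidates rated by an evaluation team.
candidates = {
    "x86 ultrabook": {"performance": 8, "ecosystem": 9, "security": 7},
    "ARM ultrabook": {"performance": 7, "ecosystem": 6, "security": 8},
}

best = max(candidates, key=lambda name: score(candidates[name]))
print(best, round(score(candidates[best]), 2))
```

Shifting the weights is where the strategy lives: an organisation that prizes battery life and ARM-native tooling would rate the ecosystem axis differently and could reach the opposite conclusion from the same data.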

    The traditional total cost of ownership (TCO) metric also needs to be redefined. The new TCO needs to take into account the return on investment (ROI) resulting from increased employee productivity through AI features and the risk-avoidance value of on-device computing.

    Looking to the future, current trends can be expected to deepen further. Artificial intelligence will become even more personalised and sustainability will play an increasingly important role in decision-making.

  • More than Zoom. The video conferencing market is growing by 10% a year. Who is making money from it?

    More than Zoom. The video conferencing market is growing by 10% a year. Who is making money from it?

    Video conferencing, once a niche tool for global corporations, has become a cornerstone of business communication over the past few years. The rapid transformation towards remote and hybrid working, accelerated by global events, has permanently changed the working landscape, making video platforms the lifeblood of modern organisations. Today, the video conferencing market, valued at tens of billions of dollars and forecast to grow steadily by double digits in the coming years, is entering a new phase of maturity. It is ceasing to be a battle arena for basic functions and is becoming a strategic battleground for the future of work, where integrated ecosystems, artificial intelligence, security and the psychology of human interaction are key.

    The war on ecosystems: software as a command centre

    At the heart of the videoconferencing revolution is software. This is where the fiercest battle is being fought and where innovation is redefining the rules of the game the fastest. The market is dominated by two titans: Zoom, which controls more than half the market, and Microsoft Teams, hard on its heels with a share of more than 30%. But their rivalry has long since ceased to be about single features such as screen sharing or recording, which have become standard. The real battle is over which platform will become the integrated ‘operating system’ for collaborative working.

    Microsoft Teams draws its strength from its deep, native integration with the ubiquitous Microsoft 365 suite. For companies already in the Microsoft ecosystem, Teams is a natural, often no-cost extension, combining communication, document collaboration and task management in a single, consistent environment. Zoom, on the other hand, has built its position on legendary ease of use, reliability and, crucially, platform agnosticity. It positions itself as a neutral communication hub that can be integrated into almost any tool, as evidenced by the huge market of nearly 2,800 apps in the Zoom App Marketplace. This flexibility makes it the preferred choice for communicating with external partners who do not necessarily work in a Microsoft environment. Google Meet, in turn, competes with the simplicity and seamless integration of the Google Workspace suite, running directly in the browser, which is a huge advantage in environments with limited IT privileges.

    The decision to choose a platform thus becomes a strategic choice for the company’s entire technology stack. Vendors aim to ‘lock’ the customer into their ecosystem, because the more processes that are integrated into one platform, the more difficult and costly the eventual migration becomes.

    AI: escaping commoditisation

    With the standardisation of core functions, artificial intelligence has become a major battleground and a key differentiator of the offering. AI has ceased to be a marketing buzzword and has become the core of the value proposition, offering real productivity improvements. All market leaders are investing heavily in the development of their ‘intelligent assistants’: AI Companion in Zoom, Copilot in Microsoft Teams and Gemini in Google Meet. Their capabilities are revolutionising meetings by offering automatic summaries with to-do lists, transcription and real-time translation that break down language barriers in international teams. Tools such as Read.ai go a step further, analysing not only the content but also the context of the meeting, identifying moments of highest engagement or the sentiment of the speech. For suppliers, AI is a way to create unique added value and justify higher prices in premium plans.

    Equipment and meaning of certification

    Software is the brains of the operation, but it is the hardware that is the physical foundation of the professional videoconferencing experience. This market, although revenue-dominated by expensive systems, is inextricably linked to the software world. Manufacturers such as Logitech, Poly (HP) and Cisco are no longer competing in a vacuum – they are designing their products and strategies around the Microsoft and Zoom ecosystems, offering dedicated ‘kits for Teams’ or ‘solutions for Zoom Rooms’.

    In this context, certification programmes such as ‘Certified for Microsoft Teams’ and ‘Zoom Certified’ have become crucial. Certification is a guarantee to the customer that the device has passed rigorous quality tests, ensuring seamless integration, the highest audio and video quality and full support for the platform’s advanced features. It simplifies the selection and implementation process, giving IT departments confidence that the equipment they purchase will work reliably. It is certification that has become the de facto industry standard and a key selection criterion for businesses.

    Main challenges: security, interoperability and the human factor

    Despite rapid growth, the market faces serious challenges. Cyber security has come to the fore. Incidents such as ‘Zoombombing’ (unauthorised access to meetings) or data leaks, such as that of the Royal Mail Group, have shown how attractive a target video platforms have become. In response, end-to-end encryption (E2EE), strong authentication mechanisms and regular software updates to protect against phishing and malware have become standard.

    Another fundamental problem is interoperability, or rather the lack of it. Leading vendors deliberately create ‘walled gardens’ in which their software and certified hardware work perfectly, but communication with other ecosystems is hampered. An organisation with meeting rooms based on Cisco hardware may encounter problems when trying to join a meeting in Microsoft Teams. The solution is intermediary services, known as Cloud Video Interop (CVI), offered by specialised companies like Pexip that ‘translate’ protocols between platforms. However, this is an additional cost that companies have to bear in the name of flexibility.

    The human factor cannot be ignored either. The intensive use of video conferencing leads to a phenomenon known as ‘Zoom Fatigue’ – mental and physical exhaustion. Stanford University research has identified its causes: excessive and unnatural eye contact, increased cognitive load from interpreting limited non-verbal cues, stress from constantly seeing one’s own reflection, and reduced physical mobility. The problem has real business implications, leading to lost productivity and burnout. Organisations and platform providers alike are beginning to respond by making changes to work culture (e.g. days without meetings) and software interfaces (e.g. the option to hide the view of one’s own camera).

    The future is immersive

    The evolution of the video conferencing market is accelerating and the technologies that will define the next generation of communication are already on the horizon. Artificial intelligence will evolve from an assistant to a proactive partner that will not only summarise the meeting, but also analyse its dynamics and the sentiment of the participants. The ultimate goal, however, is to break down the screen barrier. Virtual (VR) and augmented (AR) reality technologies will enable the creation of fully immersive, 3D collaborative spaces. A breakthrough could come from Project Starline, being developed by Google and HP, which uses advanced technologies to create realistic, 3D holograms of the speaker in real time. Early tests have shown that the technology dramatically increases non-verbal communication and improves recall of conversational content.

    The video conferencing market has matured. The winners in this new era will not be those who offer the most features, but those who best understand that technology is just a tool. The ultimate goal is to enable more productive, secure and satisfying human interaction, regardless of the physical distance that separates callers.

  • An investment that pays off – implementing Zero Trust architecture

    An investment that pays off – implementing Zero Trust architecture

    Traditional security models are no longer effective. The ‘castle and moat’ concept – assuming that everything inside the corporate network is secure and that threats lurk only outside – has become outdated and inadequate to modern realities. In a world where data and applications leave the physical boundaries of an organisation and users connect to resources from anywhere in the world, the role of the firewall as the main line of defence is becoming irrelevant.

    This is why the Zero Trust concept was born – not so much a new technology, but a complete change in the way we think about security. Its foundation is the principle of ‘never trust, always verify’. In practice, this means that any attempt to access resources, whether from an internal or external network, is treated as a potential threat and requires thorough verification.

    From concept to standardised architecture

    The term ‘Zero Trust’ was first used by John Kindervag of Forrester Research in 2010. In his view, building trust solely on the basis of network location is flawed by design. However, it was not until the publication of NIST document SP 800-207 that the concept was formalised and turned into a precisely defined security architecture.

    Zero Trust according to NIST is based on several key assumptions. All data and services are treated as resources requiring protection, regardless of their location. All communications must be properly secured and access is granted on a session-by-session basis. Decisions to grant access are dynamic and dependent on a number of factors – including the identity of the user, the state of the device, and the context of the operation. Furthermore, organisations must monitor the integrity of resources, apply continuous identity verification and seek to automate and improve security mechanisms.
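    The NIST assumptions described above can be illustrated with a minimal, hypothetical policy-decision function. The names (`AccessRequest`, `decide`) and the scoring threshold are purely illustrative – SP 800-207 defines an architecture, not a concrete algorithm:

```python
from dataclasses import dataclass

# A minimal sketch of a per-session Zero Trust policy decision point.
# All names and thresholds are illustrative assumptions, not part of
# NIST SP 800-207 itself.

@dataclass
class AccessRequest:
    user_authenticated: bool   # identity verified (e.g. via MFA)
    device_compliant: bool     # endpoint meets security policy
    risk_score: float          # 0.0 (safe) .. 1.0 (risky), from context signals

def decide(req: AccessRequest, risk_threshold: float = 0.5) -> str:
    # "Never trust, always verify": every request is evaluated the same
    # way, regardless of whether it originates inside or outside the network.
    if not req.user_authenticated:
        return "deny"
    if not req.device_compliant:
        return "deny"
    # Access is granted per session, and only while contextual risk stays low.
    return "grant-session" if req.risk_score < risk_threshold else "step-up-auth"

print(decide(AccessRequest(True, True, 0.2)))   # low-risk request
print(decide(AccessRequest(True, True, 0.8)))   # risky context: demand re-verification
print(decide(AccessRequest(True, False, 0.1)))  # non-compliant device
```

    The key point of the sketch is that network location appears nowhere in the decision: identity, device state and context are evaluated on every request.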

    At the heart of this concept is the assumption that security breaches are inevitable – they can come from either outside or inside the organisation. This model completely rejects the idea of a trusted internal network.

    Business value: safety, efficiency and savings

    Zero Trust is not just a response to changing threats – it is also a sound business decision. The value of the Zero Trust solutions market is growing rapidly and is forecast to reach more than $85 billion by 2030, almost triple its 2023 value.

    The real benefits of implementing this strategy are confirmed by studies such as the Total Economic Impact™ report prepared by Forrester for Microsoft. It shows that implementing Zero Trust can reduce the risk of a data breach by up to half, deliver a 92% return on investment over three years and significantly increase the productivity of security teams. Additionally, organisations save millions of dollars by consolidating tools and simplifying their IT infrastructure. At the same time, companies implementing Zero Trust are less likely to fall victim to successful cyber attacks, and can more easily comply with regulatory requirements under frameworks such as the GDPR (known in Poland as RODO), NIS2 and DORA.
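    To make the 92% ROI figure concrete, here is a worked example of the standard formula ROI = (benefits − costs) / costs. The dollar amounts are invented for illustration and do not come from the Forrester study:

```python
# Illustrative arithmetic for a 92% three-year ROI. The amounts below are
# hypothetical; only the formula ROI = (benefits - costs) / costs is standard.
costs = 1_000_000      # assumed 3-year cost of the Zero Trust programme
benefits = 1_920_000   # assumed quantified benefits over the same period

roi = (benefits - costs) / costs
print(f"ROI over three years: {roi:.0%}")  # prints "ROI over three years: 92%"
```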

    Zero Trust’s technological foundations

    For a Zero Trust strategy to work effectively, it needs to be based on several closely interlinked technological pillars. The first is **identity** – in the modern IT environment, it is the user and their identity that is becoming the new security perimeter. Solutions such as multi-factor authentication, single sign-on and context-sensitive access mechanisms to dynamically assess the level of risk are key here.

    Another element is end devices. Any laptop, smartphone or server can become an entry point for threats, so their status must be constantly monitored. Technologies such as unified endpoint management (UEM) and endpoint detection and response (EDR) systems make it possible to keep devices in line with security policies.

    The network, in turn, should be segmented in a way that prevents unauthorised users from moving freely between resources. The modern approach to connectivity is based on Zero Trust Network Access (ZTNA) solutions, which allow secure, individual user connections to a specific application while hiding the rest of the infrastructure.

    Applications and data deserve just as much attention – they are the ultimate target of cyber attacks. Protecting them effectively requires data classification, the use of leakage prevention mechanisms (DLP) and encryption of information both at rest and during transmission.

    Last but not least, visibility and automation form a key pillar. Integrated analytics systems, such as SIEM and SOAR, collect data from across the IT environment and enable the rapid detection of threats. The use of machine learning in user and entity behaviour analytics (UEBA), in turn, identifies unusual activities that may indicate a security breach.
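    The core idea behind behaviour analytics can be sketched in a few lines: flag activity that deviates sharply from a user's own historical baseline. Real UEBA products use far richer models; the z-score heuristic, the threshold and the sample data below are illustrative assumptions:

```python
import statistics

# A toy sketch of the UEBA idea: compare current activity against the
# user's own baseline and flag large deviations. The z-score approach
# and the threshold of 3 standard deviations are illustrative only.

def is_anomalous(history: list[float], current: float, threshold: float = 3.0) -> bool:
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:                 # flat baseline: any change is suspicious
        return current != mean
    z = abs(current - mean) / stdev
    return z > threshold

# Hypothetical daily file-download counts for one user over two weeks.
baseline = [12, 9, 14, 11, 10, 13, 12, 9, 11, 14, 10, 12, 13, 11]
print(is_anomalous(baseline, 12))    # a typical day
print(is_anomalous(baseline, 480))   # a spike worth investigating
```

    In production, the same principle is applied across many signals at once (logins, data volumes, access times), which is why machine learning models replace the simple statistics used here.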

    Implementation as an evolution, not a revolution

    Zero Trust is not a project that can be completed by a specific date. It is an ongoing, strategic journey that requires consistency, commitment and an adaptive approach. The key to success is to start with areas that can deliver quick and measurable results, such as implementing SSO and MFA or replacing outdated VPN solutions with ZTNA. It is also necessary to take an inventory of assets and identify key data and systems – the so-called ‘crown jewels’.

    Artificial intelligence and convergence with SASE

    The Zero Trust architecture is constantly evolving. Artificial intelligence and machine learning are playing an increasingly important role, supporting anomaly detection, risk analysis and incident response automation. The integrated approach to security is also reflected in the concept of SASE – an architecture that combines network functions with security services and delivers them as a single, cloud-based service.

    Zero Trust defines ‘what’ and ‘why’ we secure; SASE answers the question of ‘how’ we implement it. The market shows a clear trend towards consolidation: analysts predict that by 2029 most companies will be using SASE solutions from a single vendor, significantly simplifying security management.

  • The crisis no one is talking about. 82% increase in AI gap is just the beginning

    The crisis no one is talking about. 82% increase in AI gap is just the beginning

    The Polish artificial intelligence scene in 2025 is a tale of two extremes. On the one hand, we are witnessing unprecedented successes. Polish AI startups such as ElevenLabs are raising hundreds of millions of zlotys from the world’s leading investors, and the country is actively participating in the global technology race by developing advanced concepts such as agent-based AI. On the other hand, hard data paints a picture of a deep crisis in the domestic market: the largest skills gap in 15 years, alarmingly low maturity of implementations in companies and a lack of a strategic approach to talent development.

    This discrepancy – let’s call it the Polish AI paradox – is not just a statistical curiosity. It is a fundamental strategic challenge. If it is not resolved, Poland risks consolidating its position as a nation of skilled users of off-the-shelf AI tools instead of becoming a nation of foundational technology developers. In the long term, this undermines our competitiveness and risks squandering the historic opportunity presented by the AI revolution.

    Poland as an exporter of innovation

    There is no doubt that the Polish technology sector is fully integrated into the global circulation of ideas. Key trends for 2025, identified by leading analyst firms such as Gartner and Forrester – including agentic AI, AI governance and reasoning AI – are not only being discussed, but actively developed in Polish companies.

    More importantly, these ambitions are translating into tangible financial successes. In 2024, Polish startups related to artificial intelligence attracted more than PLN 1 billion in venture capital. Spectacular funding rounds, such as the raising of over PLN 700 million by ElevenLabs or PLN 121 million by Wordware, prove that Polish companies are able to compete at the highest global level and attract capital from the most prestigious funds, such as Andreessen Horowitz. These successes reinforce investors’ belief that Poland has real potential to become a major AI hub in the CEE region.

    Crisis of competence and adoption

    However, when we look at the internal market, optimism gives way to concern. The demand for AI talent has created the largest skills gap in Polish IT in over 15 years. The shortage of specialists has increased by an alarming 82% in one year. This crisis is being felt at board level. According to the Microsoft Work Trend Index report, as many as 84% of Polish business leaders plan to deploy AI agents, but at the same time admit that there is a deep ‘competency gap’ in their organisations.

    This gap between ambition and action is striking. Despite declarations to the contrary, as many as 52% of organisations do not provide any generative AI training to their teams. The implementations that do take place are often superficial. According to McKinsey data, only 1% of managers in Poland describe the implementation of AI in their organisation as “mature”. Although 78% of respondents declared the use of this technology in 2024, it seems to be mainly experimental and not deeply integrated into the company’s strategy.

    Systemic response and strategic trap

    The Polish educational system and market have responded to these challenges. Leading universities such as Jagiellonian University, Warsaw University of Technology and Poznan University of Technology already offer specialised degrees and postgraduate programmes in artificial intelligence. These activities are complemented by private sector initiatives, such as Microsoft’s programme to train one million Poles, and government strategies to develop digital competences.

    Nonetheless, we are facing a fundamental challenge that concerns the quality, not just the quantity, of these competences. We are seeing explosive growth in a phenomenon that can be called ‘AI literacy’ – that is, the ability to use off-the-shelf tools. Seven out of ten Poles use AI regularly, often without formal training or their employer’s knowledge. Furthermore, 44% of employees prefer to delegate tasks to AI rather than to a human colleague, valuing its accessibility and speed.

    However, this is evidence of a growing ability to consume technology rather than create it. The real long-term economic value and competitive advantage lies in something else – in ‘AI fluency’, i.e. the ability to build, integrate and modify complex AI-based systems.

    ElevenLabs’ success came not from the skilful use of other people’s models, but from the creation of its own ground-breaking technology solution. It was this ability to create fundamental intellectual property that attracted global capital. If the Polish economy is to fully capitalise on its ambitions, we need to make a strategic turnaround.

    From user to creator

    The Polish AI paradox puts us at a crossroads. We can follow the path of mass adoption of off-the-shelf tools, becoming an efficient but still dependent market on external suppliers. This is a safe path, but one that limits our potential to a subcontractor role in the global AI value chain.

    The alternative is to consciously and strategically build competence in technology creation. This requires a fundamental shift in thinking – both at the level of state strategy, corporate programmes and curricula. We need to aggressively shift the focus from teaching how to use AI to training engineering and scientific talent capable of building with AI.

    It is necessary to support not only startups, but the entire ecosystem that creates them: from basic education, to specialised studies, to incentives for R&D in Poland. Companies need to stop seeing AI training as a cost and start treating it as a key investment in building competitive advantage.

    Solving the Polish AI paradox will determine our economic position for decades. We have global talent and ambition. Now we need to build a domestic foundation that allows them to fully flourish. It is time to move from fascination with the possibilities of AI to strategically building our own capabilities in this field.

  • Poland’s alarming skills gap. Why are we such an easy target for hackers?

    Poland’s alarming skills gap. Why are we such an easy target for hackers?

    Poland’s economy is at the heart of a digital paradox. On the one hand, the country is dynamically consolidating its position as a key technology hub in Europe, attracting global investment and aspiring to become a regional leader. On the other hand, hard data paints a picture of a country on the digital front line. A Microsoft report ranks Poland third in Europe in terms of exposure to foreign-sponsored attacks, mainly from Russia. This contrast reveals a deep, structural weakness that could undermine the foundations of the country’s digital development: a widening competence gap in cyber security. This is not just a recruitment problem, but a strategic threat to the entire economy.

    Poland on target

    The scale of the threat facing Poland is unprecedented. In 2024, national incident response teams (CSIRTs) recorded a record 60 per cent increase in the number of breach reports, which translated into a 23 per cent increase in the number of confirmed incidents. Each day, the CERT Polska team handled an average of nearly 300 attacks. Eurostat data is even more alarming: as many as 32% of Polish companies experienced cyber security incidents last year, which places Poland in an inglorious second place in the entire European Union.

    Poland ranks third in Europe for exposure to state-sponsored cyber attacks (mainly from Russia)

    These external threats are hitting particularly fertile ground. Grant Thornton’s 2024 study, entitled ‘Castle of Paper’, reveals an alarmingly low level of cyber maturity among Polish companies. Although 40 per cent of companies have experienced an attack, almost half (48 per cent) do not regularly scan their systems for security vulnerabilities or log changes to IT systems. Most base their defence on basic tools such as antivirus and firewalls, while only 10-15% of companies have advanced monitoring systems (SIEM, XDR). There is thus a dangerous illusion of security: boards declare that cyber security is a priority, but a lack of competent staff prevents this awareness from being translated into real action.

    Deficit of defenders

    The problem of the shortage of specialists is a global phenomenon, with a worldwide shortage of more than 4 million skilled workers in the field. However, in Poland, the pressure is being felt particularly acutely. Demand for cyber security skills here has increased by 36% over the past year – the highest rate in Europe.

    In Poland, demand for cyber security competences has grown by 36% – the most in Europe

    Estimates of the scale of the deficit in Poland range from 10,000 (according to the Polish Chamber of Information Technology and Telecommunications) to 17,500 specialists. However, these figures, based mainly on open vacancies, do not give the full picture. Data from the EU Cyber Security Agency (ENISA) shows that as many as 39% of Polish companies do not employ a single employee responsible for cyber security, and another 45% have only one such person. This means there is a huge latent demand. Many companies, especially in the SME sector, do not create vacancies because they are not yet fully aware of the need for such competence. As new regulations, such as the NIS2 directive, take effect, this latent demand will rapidly reveal itself, causing a shock to the labour market.

    High cost of vulnerability

    The skills gap translates into very tangible losses. There is a direct link between the skills shortage and the increasing number of successful hacks. As many as 87% of managers admitted that their company had experienced at least one hack, which can be partly attributed precisely to a lack of appropriate skills in the team.

    The consequences are severe. More than half of companies that have been victims of an attack have suffered financial losses in excess of US$1 million. What’s more, there is a growing trend of executives being held personally accountable – in 51% of cases after an attack, executives were fined, lost their position or even imprisoned.

    In addition to the direct costs, the skills gap acts as a silent saboteur, inhibiting innovation. Companies are afraid to implement new technologies, such as cloud computing or AI, because they do not have the human resources to secure them effectively. This innovation paralysis has real consequences: 20% of IT companies have had to refuse new projects due to a lack of specialists. In the long term, this weakens the competitive position of the entire Polish economy.

    Labour market under siege

    The imbalance between demand and supply raises salaries to levels not seen in other IT segments. Experienced professionals can count on salaries ranging from PLN 20,000 to even PLN 42,000 per month. Such high rates create a bipolar market. Large, international corporations can afford to compete for the best talent, effectively ‘draining’ them from the market. As a result, the backbone of the Polish economy – the SME sector – remains virtually defenceless, creating a systemic risk for the entire country.

    The greatest demand is for analysts, engineers and security architects, with a particular focus on cloud security and incident response specialists.

    Large corporations are ‘draining’ the market, leaving SMEs without defences – systemic risk is growing

    In search of a digital army

    So how do we bridge the gap? Action is being taken on many fronts, but its effectiveness is limited. The formal education system is not keeping up with market needs. Employers point out that university graduates, while possessing sound theoretical knowledge, often lack key practical skills. As a result, the market ‘hacks’ the education system – companies and candidates create their own parallel education systems based on industry certifications, bootcamps and reskilling and upskilling programmes.

    More and more companies are investing in the development of internal talent, which is often cheaper and faster than external recruitment. At the same time, the state is stepping up its efforts by creating specialised units such as Cyber Defence Forces and sectoral response teams, such as the CSIRT of the FSA for the financial industry. This is a step in the right direction, but carries the risk of further draining talent from the general market.

  • The new elite of the job market. Here’s why Big Data and AI specialists are at a premium

    The new elite of the job market. Here’s why Big Data and AI specialists are at a premium

    Polish industry finds itself at the epicentre of a paradox. On the one hand, data shows an alarmingly low level of adoption of advanced technologies such as artificial intelligence (AI) and Big Data. On the other, it is these niche implementations that are triggering the most profound changes in the labour market, creating demand for new, elite competences. In the face of a demographic crisis and increasing competitive pressure from Europe, this divergence ceases to be a statistical curiosity and becomes a strategic challenge for the entire economy.

    Two-speed transformation

    Data for 2023 paints a picture of a two-speed digital Poland. Technologies perceived as mature and supporting day-to-day operations, such as cloud computing (used by 38% of industrial companies) and the Internet of Things (25%), have gained moderate popularity. However, they are often mainly used to optimise existing infrastructure, such as mail hosting or file storage.

    The real divide is revealed in the area of technologies with strategic, transformative potential. Only 3% of industrial companies have implemented artificial intelligence and only 2% have used Big Data analytics. These figures become even more alarming in the European context. With an overall AI adoption rate of 5.9% in 2024, Polish companies rank penultimate in the European Union, ahead of only Romania. The EU average is 13.48%, with leaders such as Denmark at 27.58%. This is no longer just a lag – it is a fundamental problem of competitiveness.

    Labour market: transformation instead of reduction

    Contrary to widespread fears of mass redundancies, the data shows that the impact of technology on the labour market is much more complex. It is the most avant-garde technologies, AI and Big Data, that are acting as a double-edged sword, driving deep restructuring rather than simple downsizing.

    Companies that dared to implement Big Data reported the highest percentage of new job creation (14%), but at the same time a high percentage of reductions (6%). Likewise for AI – 6% job cuts and 4% job growth. This means that these technologies are not so much eliminating work as fundamentally recomposing it. They are automating routine and analytical tasks while creating demand for roles that require data interpretation, strategic thinking and management of new systems.

    Crucially, the biggest beneficiaries of this change are highly skilled professionals. Big Data and AI implementations were the most likely to lead to the hiring of new experts (8% and 6% of companies respectively). A new professional elite is emerging in the market: data scientists, software developers, and AI and machine learning engineers.

    This picture is further complicated by demographics. Projections by the Polish Economic Institute are clear: by 2035, 2.1 million people will have disappeared from the Polish labour market, including 400,000 from the industrial sector alone. In this reality, automation ceases to be a threat and becomes a strategic necessity for maintaining productivity and economic growth in a shrinking workforce. The real risk for the worker is not the replacement by a robot, but the collapse of his or her company due to a lack of competitiveness.

    The glass ceiling of innovation: why are we standing still?

    If the need for digitalisation is so pressing, why is its rate so low? The answer lies in a vicious circle formed by three mutually reinforcing barriers.

    • Finance: for 67% of industrial companies, the main obstacle is the high cost of implementation. The problem is compounded by the difficulty of estimating return on investment (ROI), cited by 44% of companies.
    • Competence: The second most frequently cited barrier is the lack of qualified staff, signalled by 52% of companies. The gap concerns both technical specialists and managers who understand the potential of technology.
    • Strategy: as many as 44% of companies admit that digital transformation is simply not a priority for them. Nearly half of companies do not even know in which areas they could apply AI.

    These barriers create an impasse: companies do not invest because they lack the staff to estimate profitability and implement solutions. The education market is not training specialists because there is a lack of mass demand. Breaking this cycle requires bold decisions at board level.

    Strategies for the digital decade

    Polish industry is facing the confluence of two powerful trends: global technological acceleration and a national demographic slowdown. The coming years will be decisive. Continued inaction risks a permanent loss of competitiveness, while bold and strategic action may open the way to a leap in development.

    A synthesis of the analysis leads to one conclusion: Polish industry is at a critical point. It is caught between increasing pressure for innovation, dictated by global leaders, and decreasing availability of human capital. The persistent skills gap and strategic inertia at board level are the axis around which the future of the Polish economy will revolve. The window of opportunity to bridge this gap is closing – the next 5-7 years will determine whether Poland will join the ranks of digital innovators or fall behind.

    By 2030, the labour market will be further polarised. Market forecasts clearly indicate a growing demand for specialists in fields that are still in their infancy in Poland today. Roles related to AI and machine learning engineering, data analytics (Data Science), cyber security and cloud architecture will be key. At the same time, soft competencies that are complementary to technology will gain in importance: analytical and creative thinking, flexibility, mental toughness and readiness for continuous learning (lifelong learning). These are the ones that will enable effective management of technology-enabled teams and processes.

    Polish industry is at a crossroads. Continued inactivity risks a permanent loss of competitiveness. To meet the challenges, business leaders must take decisive action:

    • Invest in People, Not Just Technology: move from a mentality of ‘buying’ talent to ‘building’ it internally. Reskilling and upskilling programmes need to become a key part of business strategy, not just a task for the HR department.
    • Strategy First: Digital transformation must be a boardroom priority, not an IT department experiment. Leaders must clearly communicate the business objectives behind investments and promote a data-driven culture.
    • Build Foundations: Before companies throw themselves into the pursuit of AI, they need to ensure a solid foundation in the form of data governance and a modern infrastructure. Many failed AI projects are due not to flaws in the technology, but to poor data quality.

    The coming years will decide whether Poland will join the ranks of digital innovators or remain in the technological tail of Europe. The time to act is now.

  • The Open Source Paradox – How innovation became the biggest threat to business

    The Open Source Paradox – How innovation became the biggest threat to business

    Open source software (OSS) is the silent, invisible engine driving today’s digital economy. From banking systems to medical applications to the dashboard in your car, its code is everywhere. The scale of this dependency is difficult to overstate. Synopsys’ Open Source Security and Risk Analysis (OSSRA) report for 2024 showed that 96% of commercial applications contain open source components, and that on average 77% of all code in these applications comes from open source. In Poland, according to data from the end of 2023, open source software was identified on more than 670,000 websites, ranking our country 13th in the world in terms of its use.

    This ubiquity has given rise to a fundamental paradox. On the one hand, OSS is an unparalleled accelerator of innovation, allowing companies to build and deploy advanced technologies faster than ever. On the other hand, it has become the largest, often ignored, attack surface for modern businesses. This dichotomy is brutally exposed by the data: in 2023, as many as 74% of the code bases examined contained high-risk security vulnerabilities. This is a dramatic increase of 54% from the previous year, when the percentage was 48%.

    The problem not only exists, but is dynamically worsening, spiralling out of control. Organisations are consuming open source software at a rate that far exceeds their ability to manage the associated risks, leading to a tacit acceptance of huge and growing levels of risk.

    Hidden dangers in your code

    Understanding today’s OSS threats requires deconstructing the problem into several key, often interrelated categories.

    1. ‘Zombie code’ and security debt

    One of the most fundamental risks is reliance on old, outdated components that have often been abandoned by their authors. The 2024 Synopsys report bluntly describes this condition as a ‘zombie code apocalypse’. The scale of the problem is alarming:

    • 91% of the code bases analysed contained components that were outdated by 10 or more versions.
    • 49% of the code base contained components where no development activity had been recorded for more than two years.

    These omissions create a so-called ‘security debt’ – a concept popularised in Veracode’s ‘State of Software Security 2024’ report. Each unpatched vulnerability is like a loan taken out, which builds up over time, generating ‘interest’ in the form of increasing risk. To make matters worse, it takes 50% more time to fix vulnerabilities in third-party code (mainly OSS) than in proprietary code, showing how difficult it is to manage this type of debt.
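    The audit criteria quoted above (10+ versions behind, no upstream activity for two years) can be sketched as a simple inventory check. The component names and the inventory format are invented for illustration; real tooling (software composition analysis) works from a generated bill of materials:

```python
# A minimal sketch of a "security debt" audit over a component inventory.
# The 10-version and 2-year thresholds mirror the criteria quoted in the
# text; the components themselves are hypothetical examples.
from datetime import date

components = [
    # (name, versions behind latest, date of last upstream activity)
    ("web-framework", 14, date(2021, 3, 1)),
    ("json-parser",    1, date(2024, 11, 5)),
    ("legacy-crypto",  2, date(2020, 6, 30)),
]

def audit(components, today=date(2025, 6, 1)):
    findings = []
    for name, behind, last_activity in components:
        if behind >= 10:
            findings.append((name, "outdated by 10+ versions"))
        if (today - last_activity).days > 730:  # ~2 years with no activity
            findings.append((name, "possibly abandoned ('zombie code')"))
    return findings

for name, issue in audit(components):
    print(f"{name}: {issue}")
```

    Each finding left unaddressed is exactly the ‘loan’ the security-debt metaphor describes: cheap to ignore today, increasingly expensive to repay later.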

    2. The trap of transitive dependencies

    The problem of code obsolescence is compounded by the existence of transitive (indirect) dependencies. These are libraries that the project does not use directly, but are required by direct dependencies. They represent ‘hidden’ code that finds its way into the application, often without the knowledge and verification of the development teams.

    It is in this hidden layer that the greatest danger lurks. Transitive dependencies can make up as much as 80-90% of all application code, and it is estimated that as much as 95% of all vulnerabilities in open source software come from these intermediate dependencies. A textbook example is the Log4Shell vulnerability in the Log4j library. Many organisations had no idea they were using it at all until the global security crisis hit. Even three years after it was discovered, 13% of all Log4j library downloads are still vulnerable versions.
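    How the hidden layer dwarfs the visible one is easy to show with a toy dependency graph. The packages below are invented; in real projects the resolved tree routinely runs to hundreds of nodes:

```python
# A toy illustration of direct vs transitive dependencies.
# The graph is hypothetical; real dependency trees are far larger.
deps = {
    "my-app":      ["http-client", "logger"],   # direct dependencies
    "http-client": ["tls-lib", "url-parser"],
    "logger":      ["formatter"],
    "tls-lib":     ["crypto-core"],
    "url-parser":  [],
    "formatter":   [],
    "crypto-core": [],
}

def resolve(pkg, graph, seen=None):
    """Collect every package reachable from pkg, direct or transitive."""
    if seen is None:
        seen = set()
    for dep in graph.get(pkg, []):
        if dep not in seen:
            seen.add(dep)
            resolve(dep, graph, seen)
    return seen

direct = set(deps["my-app"])
transitive = resolve("my-app", deps) - direct
print(f"direct: {len(direct)}, transitive: {len(transitive)}")  # direct: 2, transitive: 4
```

    Even in this tiny example the transitive layer is twice the size of the declared one, and none of it was explicitly chosen or reviewed by the developer who wrote `my-app`.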

    3. Active sabotage: attacks on the supply chain

    In recent years, the threat landscape has undergone a fundamental transformation – from passive vulnerabilities to actively hostile actions. Attackers have realised that compromising one popular component allows thousands of organisations to be hit simultaneously. The figures from the Sonatype report for 2024 are alarming: there has been a 156% year-on-year increase in the number of malicious packages. These attacks take a variety of forms:

    • Typosquatting: Publishing a malicious package with a name confusingly similar to a popular one (e.g. crossenv instead of cross-env), hoping that a developer will install it by mistake.
    • Dependency Confusion: Publishing in a public repository a malicious package with a name identical to a company’s internal private package, but with a higher version number. The build system can ‘confuse’ the sources and download the public version, as happened in the attack on the PyTorch framework, among others.
    • Account takeover and code injection: The most sophisticated method, of which the incident with the event-stream package was a frightening example. The attacker first gained the trust of the author and, after taking over publishing rights, added a malicious transitive dependency (flatmap-stream) to the project with the aim of stealing private keys from bitcoin wallets.
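    The name-similarity trick behind typosquatting can be illustrated in a few lines of Python. The list of ‘popular’ packages below is purely illustrative – real registries use far richer heuristics and download statistics:

```python
import difflib

# Illustrative allow-list; a real check would draw on registry popularity data.
POPULAR = ["cross-env", "lodash", "requests", "express", "numpy"]

def suspicious(name: str, cutoff: float = 0.85) -> bool:
    """Flag names that nearly match a popular package without being one."""
    if name in POPULAR:
        return False  # exact match: this is the legitimate package
    # get_close_matches returns popular names whose similarity ratio >= cutoff
    return bool(difflib.get_close_matches(name, POPULAR, n=1, cutoff=cutoff))

print(suspicious("crossenv"))   # the classic typosquat of cross-env
print(suspicious("cross-env"))  # the genuine package name
```

    A check like this can run in a pre-install hook, rejecting near-miss names before they ever reach the build.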

    Why do we lose? Systemic failures in organisations

    The scale and persistence of these risks are largely the result of deep-seated cultural and process issues within the organisation.

    Firstly, there is a culture of complacency and lack of strategy. The Sonatype report speaks of ‘developer overconfidence’. As many as 80% of dependencies in applications go without an update for more than a year, even though a secure newer version is already available for 95% of them. This means that almost all risk is avoidable. This state of affairs is due to the lack of a formal framework – only 47% of IT companies have a defined strategy for using open source solutions.

    Secondly, we are seeing an immaturity of security processes and tools. The 2024 Snyk report reveals a worrying trend: despite growing threats, investment in defence mechanisms is declining. There has been an 11.3 per cent decline in the adoption of security tools. Key technologies such as Software Composition Analysis (SCA) and Static Application Security Testing (SAST) are used by just over 60% of organisations, and container scanning by just 35%.

    Finally, the idea of ‘shift left’, i.e. moving security to the early stages of the development cycle, remains largely an illusion. Security tools are most often integrated into build systems (around 65%), but only 40% of organisations have implemented them where they are most effective – directly in the developer’s integrated development environment (IDE). This means that control gates are being moved, rather than the actual responsibility for security. The result? Overstretched security teams, half of whom admit that they are unable to meet their goals, and 52% of companies regularly fail to meet their own SLAs for fixing critical vulnerabilities.

    The road to cyber resilience: A 3-step defence model

    Mitigating such complex risks requires moving away from reactive measures to implementing a comprehensive, strategic management model.

    Step 1: Achieve full visibility with SBOM

    A fundamental principle of security is that ‘you cannot protect something you do not know exists’. In the context of software, the tool that provides this visibility is the Software Bill of Materials (SBOM). This is a formalised, machine-readable inventory of all the components – including transitive dependencies – that make up an application. Having an accurate SBOM is becoming a global standard, driven by regulations such as the US Executive Order 14028 and the EU Cyber Resilience Act (CRA), which make supply chain transparency a condition for market access.
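    As a rough illustration of what an SBOM gives a security team, the sketch below parses a minimal CycloneDX-style JSON fragment (hand-written sample data, not output from a real scanner) into a flat component inventory:

```python
import json

# A minimal CycloneDX-style SBOM fragment; the component data is illustrative.
SBOM = json.loads("""
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "components": [
    {"name": "log4j-core", "version": "2.14.1", "scope": "required"},
    {"name": "flatmap-stream", "version": "0.1.1", "scope": "required"}
  ]
}
""")

def inventory(sbom: dict) -> list[str]:
    """Flatten the SBOM into 'name@version' strings for downstream tooling."""
    return [f'{c["name"]}@{c["version"]}' for c in sbom.get("components", [])]

print(inventory(SBOM))
```

    Even this trivial flattening is useful: it is exactly the list that answered the question ‘are we running Log4j anywhere?’ during the Log4Shell crisis.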

    Step 2: Automate protection with SCA

    Visibility alone is not enough. A list of thousands of components is just raw data. To turn it into useful knowledge, Software Composition Analysis (SCA) is needed. SCA is an automated process that analyses SBOM for risk: it scans components for known vulnerabilities (CVEs), verifies licence compliance and assesses the overall quality of dependencies. Modern SCA tools not only find problems, but also help prioritise them and often offer automated remediation suggestions.
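    The core SCA step – cross-referencing the component inventory against a vulnerability feed and prioritising by severity – can be sketched as follows. The vulnerability ‘database’ here is a hand-made stand-in for a real feed such as the NVD:

```python
# Toy vulnerability feed keyed by (name, version); values are (id, CVSS score).
KNOWN_VULNS = {
    ("log4j-core", "2.14.1"): [("CVE-2021-44228", 10.0)],  # Log4Shell
    ("flatmap-stream", "0.1.1"): [("malicious-package", 9.8)],
}

def analyse(components):
    """Match components against the feed and sort findings by severity."""
    findings = []
    for name, version in components:
        for vuln_id, cvss in KNOWN_VULNS.get((name, version), []):
            findings.append((cvss, name, version, vuln_id))
    # Highest severity first, so remediation effort goes where it matters most.
    return sorted(findings, reverse=True)

report = analyse([("log4j-core", "2.14.1"), ("requests", "2.31.0")])
for cvss, name, version, vuln_id in report:
    print(f"{name}@{version}: {vuln_id} (CVSS {cvss})")
```

    Production SCA tools add licence checks, reachability analysis and fix suggestions on top of this basic matching loop.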

    Step 3: Build a DevSecOps culture

    Technology is essential, but it is not enough without a cultural change. DevSecOps is an evolution of the DevOps philosophy that integrates security as a shared responsibility at every stage of the software development lifecycle. Rather than treating security as an isolated stage at the end of the process, it is ‘embedded’ from the very beginning. In practice, this means integrating SCA tools directly into the developer’s IDE, automatic scanning in the CI/CD pipeline and continuous monitoring of the application in the production environment. This is a real ‘shift of responsibility to the left’ that gives developers the tools and knowledge to make safe decisions.
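    A typical building block of such a pipeline is a quality gate that blocks a build when scan findings cross a severity threshold. A minimal sketch, with hypothetical field names and illustrative finding data:

```python
# Hypothetical CI/CD quality gate; the finding format is illustrative,
# not the output schema of any particular scanner.
def gate(findings, max_cvss: float = 7.0) -> bool:
    """Return True (pass) only if no finding reaches the blocking threshold."""
    blocking = [f for f in findings if f["cvss"] >= max_cvss]
    for f in blocking:
        print(f"BLOCKED: {f['component']} ({f['id']}, CVSS {f['cvss']})")
    return not blocking

findings = [{"component": "log4j-core", "id": "CVE-2021-44228", "cvss": 10.0}]
ok = gate(findings)
print("pipeline:", "pass" if ok else "fail")
# In a real pipeline step, a non-zero exit code does the blocking:
# sys.exit(0 if ok else 1)
```

    The same function, wired into the IDE rather than the build server, is what turns ‘shift left’ from a slogan into developer-facing feedback.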

    From risk to advantage

    The open source landscape is full of contradictions. We are witnessing an explosion of risk, driven by the ubiquity of OSS and systemic mismanagement. However, this alarming picture need not lead to paralysis. The path to cyber resilience is well defined and leads through the implementation of an integrated defence model based on visibility (SBOM), automated intelligence (SCA) and a culture of shared responsibility (DevSecOps). In the new reality defined by advanced threats and increasing regulatory requirements, proactive risk management in the software supply chain is no longer just a technical responsibility. It is becoming a key element of business strategy. Organisations that master this area will not only avoid disaster, but gain the ability to innovate faster and more securely, transforming the biggest source of risk into a sustainable competitive advantage.

  • Global network infrastructure under acceleration: Market grows despite challenges

    Global network infrastructure under acceleration: Market grows despite challenges

    The network infrastructure market is the foundation of the digital economy – encompassing hardware, software and services that provide connectivity and data exchange on a global scale. The dynamic development of mobile technology, cloud computing and the Internet of Things (IoT) is driving investment in modern networks. As a result, network infrastructure is undergoing intense modernisation: telecom operators are deploying 5G, companies are migrating to the cloud and the hybrid working model, and organisations are betting on network automation. This article analyses the current value of the global network infrastructure market, its growth forecasts, regional differentiation, key technology trends (5G, edge computing, SD-WAN, AI) and key market players. It also provides an expert assessment of the outlook for the next 5-10 years and the key challenges – from costs to security to problems with outdated systems.

    Market value and growth forecasts

    The global network infrastructure market is worth hundreds of billions of dollars and is showing steady growth. Its value reached around USD 248.8 billion in 2024 and is forecast to grow to USD 463.9 billion in 2033, implying a compound annual growth rate (CAGR) of around 7.2 per cent between 2025 and 2033. Such growth reflects the increasing demand for advanced networking technologies around the world – from data centre modernisation, cloud and 5G integration to the development of smart cities. Demand is driven by both the private sector (digital transformation of businesses) and public investment in broadband and mobile infrastructure.

    In 2024, the market was valued at just under USD 250 billion and is expected to almost double to around USD 464 billion by 2033. The growth trend is relatively uniform and stable – reflecting the maturity of the market and the continued organic growth in demand for network capacity, security and new functionality. Importantly, the market comprises network hardware (around 48% market share) and network software and services (together the remaining 52%), which means that, alongside investment in physical equipment, the role of software solutions that define network operations is growing.
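    The arithmetic behind these figures is easy to verify: growing USD 248.8 billion (2024) to USD 463.9 billion (2033) over nine annual compounding periods implies a CAGR of roughly 7.2 per cent, matching the forecast cited above:

```python
# Verifying the article's arithmetic: USD 248.8bn (2024) growing to
# USD 463.9bn (2033) over nine annual compounding periods.
start, end, years = 248.8, 463.9, 2033 - 2024

cagr = (end / start) ** (1 / years) - 1
print(f"CAGR: {cagr * 100:.1f}%")

# Cross-check: compounding the 2024 value forward reproduces the 2033 figure.
projected = start * (1 + cagr) ** years
print(f"2033 projection: USD {projected:.1f}bn")
```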

    Regional market structure

    Network infrastructure is growing in all regions of the world, but the dynamics and scale of investment vary by area. The Asia-Pacific region currently represents the largest segment, accounting for around 34% of the global market and showing the fastest growth rate. The main driver here is the expansion of next-generation mobile networks and urbanisation: it is estimated that more than half (51%) of all 5G base stations worldwide are located in Asia-Pacific. Countries such as China, Japan, South Korea and India are leading the way with investments in 5G and smart city projects. For example, some 68% of enterprises in the APAC region are betting on migrating to the cloud, and 59% are deploying advanced industrial networks for smart manufacturing and urban infrastructure.

    North America accounts for approximately 31% of the value of the global market and remains at the forefront of deploying the latest network solutions. The US accounts for the lion’s share of this market, with approximately 84% of North American infrastructure investment occurring in the US. The region has the highest density of data centres, widespread fibre availability and a rapid pace of 5G deployment. Already 48% of organisations in North America are using 5G connectivity in their operations. In addition, companies are placing a strong emphasis on security, with nearly 69% of US companies prioritising the integration of cyber security into their network infrastructure. Government programmes supporting network expansion and the widespread digitisation of business are also contributing to the growth.

    Europe accounts for approximately 27% of the global network infrastructure market. With key markets in Germany, the UK and France, the region is focusing on modernising corporate and telecom networks based on software-defined networking architectures. Already, some 61% of European enterprises are deploying SDN (Software-Defined Networking) solutions and 58% are investing in multi-cloud strategies – integrating multiple clouds for greater flexibility. European operators and companies are also intensively developing data centre infrastructure and fibre networks, preparing the ground for 5G rollout and future 6G deployments around the end of the decade. Despite a slightly smaller share of the global market, Europe maintains high standards of security and interoperability and is paying increasing attention to the energy efficiency of infrastructure – some 31% of new network investments in Europe and Asia are already directed towards green, energy-efficient technologies.

    Other regions are also seeing growth: The Middle East and Africa together account for around 8% of the market, catching up through digital infrastructure projects often funded by government funding and public-private partnerships. Latin America, meanwhile, is investing in the expansion of 4G/5G and fibre networks, although the scale of spending there is smaller compared to the three main regions.

    Leading technological trends

    Several key technology trends are clearly emerging in the network infrastructure that are shaping the development of the market:

    • 5G networks: 5G technology is being deployed globally in mobile operators’ networks, offering many times higher speeds and minimal latency. Investments related to 5G already account for approx. 21% of the total capital expenditure of telcos, and spending on 5G equipment accounts for 24% of telecoms network infrastructure budgets. Fifth-generation networks not only serve the growing mobile traffic of consumers, but also enable the development of new applications – from industrial IoT to autonomous vehicles. Of increasing importance are private 5G networks deployed by industrial and logistics companies that need reliable, dedicated connectivity with ultra-low latency for their own production needs. 5G will continue to be a catalyst for investment in the coming years, with more than 50% of global mobile connections expected to be 5G-enabled by 2029.
    • Edge computing: Edge computing – processing data closer to its source and the user, at the edge of the network – is gaining popularity in response to the demands of real-time applications. Some 47% of organisations plan to deploy edge solutions to support critical systems that require minimal latency. Moving computing power closer to users improves the responsiveness of services such as video streaming, online gaming, telemedicine or autonomous vehicle systems. The proliferation of IoT also forces the local processing of huge streams of sensor data. The edge trend goes hand in hand with 5G – an estimated 39% of new network infrastructure projects combine 5G deployments with edge components to provide ultra-low latency and local data analytics for industry, smart cities or energy grids.
    • SD-WAN: Software-Defined Wide Area Networks (SD-WAN) are solutions that manage wide-area enterprise networks through the software layer, ensuring traffic is optimised between branch offices and the cloud. Demand for SD-WAN is growing rapidly in the age of remote and hybrid working – businesses need flexible and secure access to corporate applications in the cloud regardless of location. More than 68% of companies are shifting to a more flexible working model, stimulating SD-WAN deployments to ensure consistent connectivity and security policies across all departments. SD-WAN solutions, often offered by vendors as a managed service, also reduce data costs through the intelligent use of Internet connections and MPLS networks. A broader context is network virtualisation – SDN and NFV (Network Function Virtualisation) – where network functions, such as routing or firewall, are implemented programmatically. In Europe, the aforementioned ~61% of companies are already using SDN architecture. Global vendors are intensively developing SD-WAN/SDN offerings – an example is Cisco, which in 2024 introduced a new generation of SD-WAN solutions adapted to the hybrid operating model, providing, among other things, 34% faster access to cloud applications while maintaining centralised security control.
    • Artificial intelligence (AI) in networks: AI is playing an increasingly important role in both the management of network infrastructure and in new security functions. Network operators and administrators are deploying machine learning algorithms to automate network configuration, monitoring and optimisation. More than 39% of enterprises are using AI-based analytics tools to optimise network performance and detect problems faster. At the same time, equipment manufacturers are integrating AI elements into their products – around 37% of new network devices have AI-based features for threat detection and automated response to security incidents. An example is the next-generation data centre switches from Huawei, equipped with AI mechanisms to increase performance by ~40% and improve traffic management. In the coming years, AI is expected to enable the implementation of so-called self-optimising networks (self-driving networks), which automatically adjust parameters to changing conditions and can predict failures through data analysis (predictive maintenance).

    Major market players

    The network infrastructure market is dominated by a handful of large global vendors who compete in both the telecoms equipment and corporate network solutions segments. These include Cisco, Huawei, Nokia and Ericsson, among others:

    • Cisco Systems (USA): The world’s largest provider of enterprise networking solutions. Cisco leads in the area of network equipment (switches, routers, Wi-Fi access points) and develops advanced software for network management and security. The company is adapting its offering to new trends – investing in SDN/SD-WAN solutions (the aforementioned latest products for software-defined WAN are an example) as well as cloud and data centre solutions. Many corporations base their infrastructure on Cisco hardware as standard, which gives the company a strong market position.
    • Huawei (China): A global telecommunications and networking giant that is one of the leaders in the deployment of 5G technology. Huawei offers a full infrastructure portfolio – from access equipment (5G base stations, fibre equipment) to backbone routers and cloud solutions. Together with Cisco, it is among the largest players – the two companies jointly control nearly 29% of the network infrastructure market. Huawei has enjoyed tremendous success in the Asian, African and Latin American markets, although in recent years it has struggled with restrictions in some Western markets for geopolitical reasons. Despite this, the company continues to invest in research (including the development of AI-enabled switches for data centres) and maintains a strong position in global vendor rankings.
    • Nokia (Finland): One of the two European leaders in telecommunications infrastructure. Nokia (alongside Ericsson) is a leading supplier of equipment for mobile networks – especially for 4G/5G infrastructure (RAN, core network) – as well as optical and IP transport solutions. The company is using its telecoms know-how to enter new areas, such as private 5G networks for industry. In 2023, Nokia announced a number of deployments of private 5G wireless networks for industrial sectors and smart city projects, responding to business customers’ demand for dedicated, high-performance communication networks. Globally, Nokia is competing for 5G contracts with Huawei and Ericsson, with a strong presence in markets where alternatives to Chinese vendors are required.
    • Ericsson (Sweden): The second European network infrastructure giant next to Nokia, with more than a century of history in telecommunications. Ericsson specialises in equipment for mobile and radio networks – it is one of the main suppliers of 5G base stations to operators worldwide. The company is also investing in the development of core network solutions, managed services and IoT. With a strong position in North America and Europe, Ericsson is benefiting from operator demand for 5G equipment amid restrictions imposed on Huawei in these regions. In addition, Ericsson is engaged in standardisation work on future technologies (6G) and is working with partners (e.g. cloud providers) to virtualise network functions. As a company focused on the operator segment, Ericsson – like Nokia – complements the offerings of Cisco and Huawei, particularly dominating global mobile access network deployments.

    In addition to those mentioned, there are other major players in the network infrastructure market specialising in selected areas – including ZTE, Juniper Networks, Arista Networks, Dell EMC, HPE (Aruba), Extreme Networks and CommScope. These companies compete in segments such as data centre switches, campus LAN/WLAN equipment, cabling and cloud solutions, rounding out the global network infrastructure ecosystem.

    Development prospects for the coming years

    According to experts, the growth prospects for the network infrastructure market over the next 5-10 years remain very promising. A projected average annual growth rate of 7% means that the sector will grow faster than many traditional industries, although slightly slower than the most dynamic segments of the IT market. The key technology trends described above will continue to drive investment: the global roll-out of 5G (and, looking ahead to the end of the decade, the first 6G deployments) will ensure continued demand for equipment and operator network upgrades. Edge computing will become an integral part of the network architecture – more and more data will be processed locally, creating a demand for distributed network nodes close to the user. Cloud and multicloud solutions will force the construction of networks capable of handling dynamic, distributed workloads, fostering the development of intelligent, software-defined networks. Automation using AI is likely to transform the way networks are managed – we are already seeing a trend towards autonomous networks, able to optimise traffic and respond to incidents autonomously. In the long term, this could result in significant operational savings and improved security.

    Regionally, the current balance of power is expected to continue, with Asia-Pacific remaining the largest and fastest-growing market due to investment in China and developing countries, North America maintaining high levels of innovation and corporate spending (especially in the US), and Europe consistently upgrading infrastructure with a focus on security and efficiency. However, the disparity between regions may narrow as network technologies become ubiquitous and deployment costs fall.

    Experts also highlight new areas of growth that may become increasingly important: private 5G networks for enterprises (e.g. in factories, ports or university campuses), networks for IoT supporting billions of devices (including narrowband LPWAN networks for sensors) or the development of the satellite internet (e.g. constellations in low orbit providing global connectivity). The digital transformation of sectors such as energy (smart grid), automotive (connected vehicles) or medicine (telemedicine, wearable devices) will generate demand for a reliable communication infrastructure. A further increase in research and development (R&D) spending in the network area can be expected – both by market giants and new players (startups), which will result in further innovations and even more efficient network technologies in the future.

    Market challenges

    Despite the positive outlook, the global network infrastructure market faces several significant challenges. The biggest barrier is high cost – network upgrades require huge capital expenditure. More than 44% of enterprises cite budget constraints as a factor inhibiting infrastructure upgrades. Next-generation hardware (e.g. 5G devices, backbone routers, edge nodes) and associated software and integration are costly investments that not all organisations can afford immediately. At the same time, obsolete (legacy) systems are still common – it is estimated that around 27% of the network infrastructure in use globally is made up of older, previous-generation equipment. Migrating from these legacy systems is difficult: nearly 38% of companies struggle to replace old hardware with newer equipment. Maintaining such systems entails not only opportunity costs (lower performance, lack of new features), but also security risks – almost 33% of security breaches are related to vulnerabilities in outdated infrastructure.

    Cyber security is itself another challenge. The increasing complexity of networks (especially those distributed across multiple clouds and locations) means that 59% of organisations find it difficult to manage security in multicloud and hybrid environments. Attacks on network infrastructure are increasing in sophistication and the attack surface is widening with the connection of more IoT devices and the proliferation of 5G networks. Ensuring consistent security policies, network segmentation and data protection in such a heterogeneous environment is a heavy burden for IT departments. Many companies also face staff shortages, with some 29% of enterprises citing a shortage of qualified advanced network professionals as a limiting factor to progress.

    Another challenge is interoperability and integration of new technologies with existing infrastructure. Companies often use multi-vendor solutions, which raises compatibility issues. More than 34% of organisations experience integration issues when deploying disparate platforms and services. Standardisation of protocols and openness of ecosystems are therefore becoming crucial to avoid technology silos. Additionally, regulators are imposing requirements on the industry (e.g. on cyber security, data privacy or spectrum allocation), which can slow down deployments, especially in the telecoms sector.

  • Operational transformation: How the OT market is growing in Polish industry

    Operational transformation: How the OT market is growing in Polish industry

    Operational technology (OT) – i.e. industrial control, automation and critical infrastructure systems – has become a pillar of digital transformation in industry. The convergence of OT with classic IT systems is gaining momentum with the development of Industry 4.0. As a result, Polish companies are increasingly investing in automation, Industrial IoT (IIoT) and OT cyber security to increase the efficiency, competitiveness and operational resilience of their plants.

    Value of the OT market in Poland (2019-2024)

    The Polish OT market has been growing rapidly in recent years, almost doubling in value from 2019 to 2024. It is estimated that in 2019, the value of the domestic OT market was around PLN 6-7 billion, while in 2024 it is already reaching over PLN 13 billion, despite a periodic slowdown during the pandemic. Such significant growth – averaging around 15% per year – reflects the increasing demand for industrial automation solutions, SCADA/PLC systems, IT/OT integration and security services in industrial environments. For example, the Industrial Internet of Things (IIoT) segment in Poland reached a value of around PLN 10 billion in 2023, growing by 20% year-on-year. The domestic industrial automation and robotics industry has also grown strongly, with more than 210 companies already operating in it, generating a total of more than PLN 9 billion in revenue annually. The growth of the market is driven by the need to increase productivity and reduce costs: companies are investing in OT to increase productivity, precision and process safety, reduce human error and cope with skills shortages.

    Market value forecasts for 2025 and 2030

    The outlook for the OT market in Poland remains very optimistic. Maintaining a double-digit growth rate suggests that this market will exceed PLN 15-16 billion in 2025, and could reach PLN 30 billion by 2030 (i.e. another doubling of scale). These forecasts correspond with global trends – the global IT/OT convergence market (comprising OT and IT software and OT hardware) was valued at USD 720 billion in 2023, and IoT Analytics predicts it will grow by 8.5% per year to over USD 1 trillion in 2027. The OT cyber security segment, on the other hand, is expected to grow globally at up to 15-25% per year, reaching USD 80-120 billion in 2030. Poland – as an economy catching up in terms of automation – may maintain a double-digit OT growth rate in the current decade as well. It is worth noting that these forecasts depend on a number of factors, including the industrial business cycle, the availability of engineering staff and the pace of regulatory implementation (e.g. the NIS2 directive on critical infrastructure). Nevertheless, the trend is clear: operational technologies will become an increasingly important part of the ICT market in Poland, and investments in this area will accelerate as the concept of Industry 4.0 matures.
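    The forecast follows directly from compounding the historical growth rate: taking the article’s ~PLN 13 billion 2024 base and ~15% annual growth gives roughly PLN 15 billion in 2025 and about PLN 30 billion by 2030, within rounding of the figures quoted above. A sketch, using those assumed inputs:

```python
# Hypothetical projection helper compounding the article's figures:
# a ~PLN 13bn base (2024) at the historical ~15% annual growth rate.
def project(base_bn: float, rate: float, years: int) -> float:
    """Compound `base_bn` forward by `rate` for `years` periods."""
    return base_bn * (1 + rate) ** years

print(f"2025: PLN {project(13.0, 0.15, 1):.1f}bn")  # roughly 15
print(f"2030: PLN {project(13.0, 0.15, 6):.1f}bn")  # roughly 30, a doubling
```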

    Key technological trends in OT

    The development of the OT market is being driven by technological changes that are transforming traditional factories into modern, digital factories. Key trends include:

    • Automation and robotisation of production: Polish manufacturing companies are intensively implementing automation, industrial robots and control systems to compete. Automation improves efficiency and quality – “companies looking for ways to minimise human error and increase productivity are implementing solutions such as robotics, AI or IoT-based systems”. Despite the progress, the level of robotisation in Poland is still low compared to the world (only 52 robots per 10,000 industrial workers vs. 397 in Germany), which means there is huge potential for further growth. The automation trend will therefore maintain a high momentum for a long time, especially as rising labour costs and staff shortages force companies to invest in robots and automated production lines.
    • Industrial Internet of Things (IIoT): More and more equipment and machines in factories are being equipped with sensors and communication modules, creating an IIoT network. Real-time collection and analysis of production data enables process optimisation, predictive maintenance and improved asset management. IIoT is the fastest-growing segment of Industry 4.0 – it already accounts for around 20% of the global Industry 4.0 market. In Poland, spending on IoT is growing by double digits (despite a slowdown to 14% growth in 2024) and increasing IIoT applications can be seen in the energy sector (smart grid, meters), food industry or logistics, among others. According to Allied Market Research, the global IIoT market could reach an astronomical US$1.5 trillion by 2030, showing the scale of the trend.
    • Artificial intelligence and data analytics: AI and machine learning are making ever bolder inroads into OT environments. AI algorithms analyse sensor data streams so that factories can realise advanced analytics, automatic parameter adjustments and predictive maintenance. Integrating AI/ML with automation systems provides new opportunities for diagnostics, predictive maintenance and decision support – for example, learning models predict machine failures, optimise energy consumption or even control robots adaptively. As a result, factories are becoming smarter and more autonomous, fitting in with the Smart Factory concept. This trend is set to grow – suffice it to mention that the global AI market in the area of security (including OT) is expected to grow from USD 15 billion in 2021 to USD 135 billion in 2030. Poland is not lagging behind – domestic companies are experimenting with ‘digital twins’ of production lines and using AI to optimise processes.
    • Integration of IT/OT environments: The disappearance of the separation between IT and OT systems is another megatrend. Traditionally, operational systems (e.g. production control) were separated from IT, but now – in the age of fast information needs – they are increasingly being connected. “Connecting OT systems to IT networks increases their efficiency on the one hand, but on the other increases the risk of cyber attacks.” IT/OT convergence enables the integration of data from the shop floor with business systems (ERP, cloud), providing holistic insights into company operations and new opportunities for optimisation. According to the IoT Analytics report, the integration of these worlds is already a strategic necessity for companies pursuing digital transformation. Globally, the cumulative market for IT/OT solutions is huge – more than USD 720 billion in 2023, as cited earlier. In Poland, integration is progressing for the time being in selected sectors (e.g. smart grid in energy, where SCADA systems are connected to data centres – energy is the main driver of SCADA demand). However, it remains a challenge to combine the different working cultures of IT teams vs. OT engineers and to upgrade older industrial installations, which often operate in isolation from the network. Integration therefore requires specialised expertise and a well thought-out strategy, but offers tangible benefits – consistent infrastructure management and real-time access to operational data. For integrators and technology providers, it is also an opportunity to expand their offerings – IT solution manufacturers are intensively adapting their systems to the specifics of OT.
    • OT cyber security: The increasing digitalisation of industry makes protecting OT systems from cyber attacks a priority. Production environments differ in character from classic IT – here, a failure caused by an attack can bring a production line to a halt or endanger people’s safety. Unfortunately, attacks are growing exponentially: in 2022, the number of cyber incidents against OT devices increased from 20 million to 120 million, and in Poland, ransomware incidents targeting industry have recently increased sixfold. The key trends in OT security are the Zero Trust model (58% of companies were already implementing Zero Trust in OT in 2024, up from 21% in 2022) and network segmentation (locking critical devices into micro-zones to make it harder for attacks to spread). Dedicated ICS/OT monitoring solutions are emerging that detect anomalies in industrial networks without disrupting the process. Unfortunately, many plants still rely on outdated control systems with known vulnerabilities and a lack of updates. Incomplete visibility of OT assets and the use of proprietary, unencrypted communication protocols are further problems. In response, companies are increasing investment in security: according to estimates, the Polish market for cyber security solutions (including OT) is already worth approx. PLN 2.5-3 billion in 2024. Regulation is also forcing progress – the NIS2 directive in the EU, the ISA/IEC 62443 standards and national laws oblige infrastructure operators to implement advanced security measures. All this means the OT security segment will be one of the fastest-growing areas of the market (by double digits per year), although there is still a shortage of specialists – only 9% of IT/automation professionals focus exclusively on ICS/OT security, which means intensive staff training is needed.
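    The anomaly-detection idea behind the predictive maintenance and ICS/OT monitoring solutions mentioned above can be illustrated with a deliberately simple sketch: learn a ‘normal’ baseline from a sliding window of sensor readings and flag values that deviate sharply from it. This is a toy illustration, not any specific vendor’s product; the window size and threshold are hypothetical tuning parameters.

```python
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Flag sensor readings that deviate sharply from the recent baseline.

    A toy illustration of the idea behind predictive maintenance:
    learn 'normal' from a sliding window, flag statistical outliers.
    `window` and `threshold` are hypothetical tuning parameters.
    """
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Steady vibration signal with one sudden spike (a simulated bearing fault)
signal = [1.0 + 0.01 * (i % 5) for i in range(100)]
signal[60] = 5.0
print(detect_anomalies(signal))  # → [60]
```

    Real ICS monitoring products work on industrial protocols and far richer models, but the principle – baseline first, deviation second – is the same.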

    Growth dynamics and main sectors of demand

    The average annual growth rate of the Polish OT market between 2019 and 2024 was estimated at around 15 per cent (CAGR) – a rate higher than that of the overall ICT market. Growth was primarily driven by demand in three vertical sectors: manufacturing, energy, and transport and logistics.
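    For reference, the ~15 per cent CAGR quoted above follows from the standard compound-growth formula. A minimal sketch (the input values below are hypothetical, chosen only to show that a market that roughly doubles over five years implies a CAGR of about 15%):

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate over `years` periods."""
    return (end_value / start_value) ** (1 / years) - 1

# Hypothetical illustration: roughly doubling over five years
# corresponds to ~15% compound annual growth.
print(f"{cagr(100, 201, 5):.1%}")  # → 15.0%
```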

    • Industry (manufacturing): has accounted for the largest share of OT spending for years – the manufacturing sector is estimated to generate around 18-20% of demand for IIoT solutions. It is in factories that most robots and control systems are installed. Poland has strong automotive, food and chemical industries – all of which are investing in line automation, IoT sensors and MES/SCADA systems for production supervision. 2021 brought an investment boom (more than 3,300 robots installed, +56% y-o-y), but the following years (2022-2023) saw a temporary slowdown in robotisation (declines of several per cent in robot deliveries) due to supply chain disruptions and economic uncertainty. Despite this, industry will continue to invest in the long term – Polish factories need to automate faster to catch up with the competition, as staying in the global market requires automation and robotisation of production. It can be expected that, with an improving economy and an influx of funds (e.g. KPO, EU funds for industrial transformation), industry spending on OT will return to an upward trajectory.
    • Energy and utilities: the energy sector is a key driver of demand for OT solutions, especially SCADA systems, industrial networks and critical infrastructure protection. The automation of electricity grids (smart grid) and the modernisation of gas and water infrastructure require advanced OT systems. According to analysts, the energy industry will be the main catalyst of demand for SCADA systems through 2024 – growing investments in smart grids and substations translate into the need to monitor thousands of devices in the field. In Poland, the challenges of the energy transition (RES, distributed energy) will further increase demand for OT – automation systems for managing the grid, energy storage, and PV and wind farms. In addition, the power industry, as critical infrastructure, is investing heavily in OT cyber security – required by NIS regulations and crucial in view of increased cyber attacks on the energy sector in the region.
    • Transport and logistics: automation and OT are playing a growing role in transport – from rail and urban traffic control to airport systems to smart warehouses and logistics centres. Transport and logistics are among the fastest growing consumers of IIoT, with a projected increase in spending of up to +26% y/y globally. In Poland, we see this in the modernisation of railways (new train control systems, sensors on tracks), the development of road infrastructure (traffic management systems, ITS) or the automation of warehouses in e-commerce. Logistics companies are deploying IoT sensors for supply chain and vehicle tracking, while ports are investing in OT systems to increase throughput. The digitalisation of supply chains (autonomous warehouses, AGVs, sorting robots) is further driving the demand for OT in logistics. This sector is growing rapidly with the boom in online shopping and the need for efficient distribution of goods.

    It is worth noting that, in addition to those mentioned, significant adopters of OT technologies include the food sector (many food production lines are already highly automated), the mining sector (mines are implementing OT systems for safety and control) and the healthcare sector (HVAC and power automation in hospitals, medical IoT devices). Even local governments are investing in urban IoT systems (smart lighting, environmental monitoring), which also falls into the broad area of OT.

    Overall, demand for OT in Poland is multi-sectoral, but industry, energy and transport are the cornerstones and will account for the largest share of market growth in the coming years. Each of these sectors is driven by slightly different motivations – industry by productivity and lack of manpower, energy by grid reliability and ecological transformation, transport by efficiency and safety – but the common denominator is a shift towards OT as a solution to strategic challenges.

    The future of the OT market – conclusions and recommendations for IT staff

    The Polish operational technology market is entering a mature growth phase. As shown, we can expect further dynamic growth until 2030, with OT becoming a permanent item on the investment agenda of many companies. For IT and business executives, this means a number of strategic actions need to be taken:

    1. Accelerating IT/OT convergence: today’s companies can no longer afford silos – IT systems (e.g. ERP, analytics) must work seamlessly with OT systems in production. Integration of these environments should become part of a company’s digital strategy. A plan should be in place to connect operational data with office systems, enabling new business benefits from analysing production data in a broader context. However, IT managers must remember that IT/OT integration requires trade-offs and specialised expertise – it is worth investing in team training and working with integrators with experience in both areas. Companies that are first to successfully integrate OT with IT will gain the advantage of more efficient operations and faster access to information from the ‘factory floor’.

    2. Investment in OT security and resilience: any industrial transformation initiative must go hand in hand with strengthening cyber security. The risk of cyber attacks on operational infrastructure is real and growing – high-profile incidents (such as the ransomware attack on Colonial Pipeline) have demonstrated the consequences for critical infrastructure. Managers should assume that their OT environments will sooner or later become the target of an attack, especially if they are connected to the IT network. It is recommended to implement a Zero Trust model in the OT area – treating every access and device as potentially hostile, network segmentation and strong access control. Visibility of OT assets is also necessary – investment in industrial network monitoring systems that can detect abnormal device behaviour before failure or sabotage occurs. Management should also take care to develop business continuity plans (BCP/DR) that take into account OT technology downtime, as well as meeting regulatory requirements (NIS2 compliance audit, implementation of security standards). OT cyber-security budgets will have to grow – but it is a necessary investment to protect production continuity and the company’s reputation.
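    The default-deny stance behind Zero Trust and network segmentation can be sketched in a few lines: traffic between zones is blocked unless a flow is explicitly allowed. The zone names and policy below are hypothetical, purely for illustration of the principle.

```python
# Hypothetical micro-segmentation policy: only explicitly allowed
# zone-to-zone flows are permitted (default deny - the Zero Trust stance).
ALLOWED_FLOWS = {
    ("it-office", "dmz"),
    ("dmz", "ot-supervision"),       # e.g. a historian pulling SCADA data
    ("ot-supervision", "ot-control"),
}

def is_permitted(src_zone, dst_zone):
    """Default-deny check: a flow is legal only if explicitly allowed."""
    return (src_zone, dst_zone) in ALLOWED_FLOWS

print(is_permitted("ot-supervision", "ot-control"))  # True
print(is_permitted("it-office", "ot-control"))       # False: no direct path
```

    The point of the design is that office IT can never reach the control layer directly – every path crosses an inspectable intermediate zone.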

    3. Development of team competencies and a culture of collaboration: The shortage of OT security experts and automation engineers familiar with IT is a barrier that companies already face today. Managers should therefore invest in upgrading the skills of existing staff – e.g. by training automation engineers in cyber security and networking, and IT specialists in the basics of industrial systems. It is worth considering the creation of joint IT/OT teams or at least mechanisms for ongoing collaboration between these departments to break down ‘traditional organisational silos’. Building a culture where IT specialists understand production priorities (continuity, physical security) and OT engineers understand cyber hygiene principles and IT procedures is key. The staffing gap can be partially bridged by working with external managed service providers (MSSPs) that offer, for example, SOC monitoring for OT or industrial network management – but this is no substitute for internal awareness and competence. IT leaders should therefore include dedicated OT-related roles (e.g. OT Security Officer or OT Systems Engineer) in their staffing plans.

    4. Long-term planning and innovation: When thinking about the future, executives should already be considering new technology waves on the horizon, such as Industry 5.0, augmented reality (AR) in factories, or the use of digital twin technology for process simulation. Although these concepts are only in their infancy, investments in OT are inherently long-term – the life cycle of industrial systems is often 10-15 years. Therefore, decisions made today (e.g. the choice of automation platform or communication standard) will affect a company’s ability to implement further innovations a decade from now. Managers should design an OT architecture that is open to future expansion, compliant with standards (this will facilitate the integration of new AI/IIoT modules in the future). It is equally important to pilot innovations – such as launching proof-of-concept programmes for AI in maintenance or 5G in the factory – so that the organisation learns new technologies before they become mainstream.

    The development of the OT market in Poland is accelerating, reflecting the industry’s transformation towards modern, automated and intelligent operations. Historical data shows strong growth in the value of the market, and forecasts suggest a continuation of this trend in the coming years. Key trends – from automation, to IIoT and AI, to IT/OT integration and cyber-security – are setting the stage for businesses seeking greater efficiency and resilience. For IT managers and technology decision-makers, this means they need to actively engage in OT projects: combining IT and OT competencies, investing in security, training and innovation to take full advantage of the opportunities of Industrial Revolution 4.0 (and in the future 5.0). Polish industry faces a huge opportunity – the right decisions taken today will determine whether domestic companies will be in the vanguard of the new era of operational technology or remain merely passive recipients of it.

  • Exports of IT services from Poland are breaking records. Will the boom last?

    Exports of IT services from Poland are breaking records. Will the boom last?

    The export of IT services has become one of the key elements of Poland’s foreign trade, significantly supporting the country’s current account surplus. Poland has maintained a positive balance in services trade since the mid-1990s, and in recent years services have become the main driver of the foreign trade surplus. Although the largest shares of services exports have traditionally been transport and tourism, IT (information technology) services are dynamically increasing in importance. Since 2010, exports of IT services have grown at an average annual rate of over 20% – more than twice as fast as exports of services in general. As a result, the share of IT services in Polish services exports has increased from around 5% a decade ago to nearly 13% in 2022. The Polish IT sector has become an important engine of the economy – in the last decade its revenues have almost quadrupled, the industry’s share of GDP has doubled and the value of IT services exports has increased 7.5 times. The importance of the industry can be seen even more clearly on the international stage: in 2023 Poland sold more IT services abroad than such technological powers as Japan or South Korea.

    Both National Bank of Poland and CSO data show a clear and uninterrupted increase in the value of IT services exports in recent years. As the chart above illustrates, in 2019, exports of telecommunications, IT and information services (a combined category in the balance of payments statistics) amounted to PLN 33.1 billion, rising to PLN 36.7 billion in 2020. Despite the global shock of the COVID-19 pandemic, there was thus an increase of around 10.9% y/y, although this was a slightly lower rate than in earlier years. Already in 2021, however, there was an acceleration, with the value of IT services exports reaching PLN 44.8 billion, or +22.1% y/y. The year 2022 brought a real leap: exports of this category increased to PLN 60.2 billion, as much as 34.4% more than the year before. This was the highest level ever, confirming that the Polish ICT sector made excellent use of the global boom in digital services during the pandemic. In 2023, the dynamics slowed down slightly, but were still impressive, with exports of telecommunications, IT and information services reaching PLN 70.7 billion, growing by 17.5% compared to 2022. In 2023, IT services alone (excluding telecommunications) accounted for the bulk of this amount – PLN 64.7 billion – underlining that it is strictly IT services that are driving this category. Overall, in just four years, Poland’s IT services exports have more than doubled (from ~PLN 33bn in 2019 to ~PLN 71bn in 2023). It is worth noting that these increases occurred in parallel to the overall growth of services exports, so that the share of the ICT segment in total services exports was steadily increasing. According to PFR data based on Eurostat, at the end of 2022, Poland’s exports of IT services amounted to EUR 11.66 billion (about PLN 54 billion) and imports to EUR 7.04 billion, a significant surplus. Importantly, there has not been a single year of decline since 2010 – Polish IT services are enjoying growing demand abroad.
Although transport and tourism generate higher volumes, IT has become the fastest-growing segment – the average growth rate of IT services exports between 2011 and 2021 (22.4% per annum) was more than double the growth rate of all services exports (11.2%). IT services thus play an increasingly important role in the balance of payments. For context, in 2022, the services trade surplus reached a record PLN 171.1 billion (5.6% of GDP) and services alone were the largest positive component of the current account balance – dynamic IT exports contributed significantly to this.
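    The year-on-year growth rates quoted above follow directly from the reported PLN values and are easy to verify; small discrepancies (e.g. +17.4% computed for 2023 versus the quoted 17.5%) stem from the billion-złoty figures themselves being rounded:

```python
# Exports of telecommunications, IT and information services, PLN billion
# (figures as reported in the article)
exports_pln_bn = {2019: 33.1, 2020: 36.7, 2021: 44.8, 2022: 60.2, 2023: 70.7}

years = sorted(exports_pln_bn)
for prev, curr in zip(years, years[1:]):
    growth = exports_pln_bn[curr] / exports_pln_bn[prev] - 1
    print(f"{curr}: {growth:+.1%}")  # prints +10.9%, +22.1%, +34.4%, +17.4%
```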

    Export geography: key markets

    The geographic structure of IT services exports from Poland is concentrated around highly developed economies, especially in Western Europe and North America. Germany has remained the most important single customer for years – in total, Polish residents sold services (of all types) worth PLN 100.9 billion to Germany in 2023, of which a significant proportion were business and IT services. In the IT services segment, Germany also ranks first. The United States and the United Kingdom are other key markets, while Switzerland and the Netherlands also stand out in continental Europe as significant buyers of Polish digital services. According to PFR and ABSL data, Polish IT companies sell their services mainly to Germany, the US, Austria and the UK. Germany’s high position is due to both its geographical proximity and links within the EU, as well as the presence of many IT centres serving the German market. In turn, the significant share of the USA testifies to the global competitiveness of the Polish IT sector – American corporations are willing to outsource services to Polish entities or locate their technology centres in Poland. On a regional basis, EU countries remain the main recipients of Polish ICT services. In 2022, approximately 55% of the value of exports of services from the ‘other services’ category (including IT and business services) went to EU countries. In addition to Germany, Switzerland (8.7% of this category) and the UK were important partners in this area. The US accounted for about 15-16% of sales in the other services group in 2022, indicating its significant share also in the demand for IT services. It is worth noting that Polish technology companies have gained a good reputation worldwide – Polish programmers have been at the top of global quality and innovation rankings for years, which facilitates expansion into foreign markets. 
This helps win customers in Austria, the Nordic countries or Canada, among others, although the share of these markets is smaller. Nevertheless, geographical diversification is noticeable – Polish IT services reach customers in dozens of countries on several continents, which reduces dependence on a single market.

    Macroeconomic conditions: pandemic, exchange rates, economic situation

    The upward trend in IT services exports during the period under review was reinforced by important macroeconomic factors. The COVID-19 pandemic initially caused global economic uncertainty, but the IT sector proved relatively resilient and even benefited in the medium term from the acceleration of digitalisation. In 2020, many companies worldwide moved their operations to remote mode and invested heavily in IT solutions, which sustained demand for IT services from Poland. Indeed, while exports of travel services, for example, collapsed in 2020, exports of IT services still grew by around 11%. This was slower than before the pandemic, but a testament to the resilience of the industry. A year later came the rebound – the global IT market was buoyant in 2021-2022 thanks to massive deployments of cloud solutions, e-commerce, remote working and automation. Polish IT companies, often specialising in these areas, were able to increase sales abroad by the aforementioned 22% in 2021 and as much as 34% in 2022. Currency exchange rates also played an important role. The weakening of the zloty against the dollar and the euro in 2022 made services provided in Poland more price-competitive while significantly increasing the value of revenues when expressed in zloty. For example, according to the National Bank of Poland (NBP), exports of IT and telecommunications services in 2022 amounted to USD 13.4 billion (an increase of more than 15% y/y in USD terms), but converted into PLN the growth exceeded 30% y/y. The weaker złoty meant that for every dollar earned, companies received more PLN, which partly explains the record jump in the value of exports that year. In turn, in 2023, the Polish currency strengthened slightly and global demand for IT services no longer grew as rapidly, hence the growth rate of exports in PLN stabilised at a lower level (~17.5% y/y instead of 34% the year before).

    It is worth mentioning that, globally, 2023 was a tougher year for the technology industry – rising interest rates and fears of a recession led many companies to cut IT spending, while tech giants saw their valuations fall and announced job cuts. This downturn also affected Polish services companies. As the PKO BP report notes, 2023 (and for many companies also 2024) brought a slowdown in the growth of sales, exports and employment in the IT sector in Poland. Rising costs, especially wages, further reduced companies’ profitability. Despite these challenges, Polish IT exports still recorded solid growth in 2023, outpacing most other industries – evidence of sustained, structural demand for digital services. By comparison, neighbouring Ukraine’s IT services exports fell by 8.4 per cent in 2023 due to the war and economic turmoil. Poland avoided a decline, thanks partly to a safer business environment, but also to customer diversification and specialisation in high-quality services.

    The role of outsourcing and foreign capital

    The export success of the Polish IT industry is closely linked to the role of outsourcing centres and foreign investment. Poland has for years been perceived as an attractive location for IT and business services outsourcing – it offers a skilled engineering workforce, labour costs lower than in Western Europe and membership of the EU (which simplifies legal and logistical issues). In practice, this means that a significant proportion of IT services exports are generated by companies linked to foreign capital, including shared service centres and branches of global technology corporations located in Poland. NBP analyses confirm that companies with foreign capital participation dominate the exports of most service categories – in 2022 it was entities with foreign investors that generated the largest surplus in services trade. Few industries are exceptions (e.g. construction or cultural services) – in the ICT sector, the predominance of foreign-owned companies is clear. According to ABSL data, almost all business service centres operating in Poland serve foreign clients. For example, Capgemini Poland – one of the largest IT employers – derives 94% of its revenue from serving foreign clients (providing services to customers in 31 countries), and Sii Poland derives around 30% of its revenue from exporting services. Such examples show that foreign investors recognise Poland’s potential and locate advanced processes here. Importantly, the nature of these investments is evolving.

    An increasing proportion of export growth is being driven by the phenomenon of reshoring, i.e. the transfer of operations back to Europe from more distant locations. In recent years, we have observed that new centres opening in Poland are carrying out advanced, highly specialised tasks from the outset, while simpler processes are being automated or redirected to cheaper countries (e.g. India). This demonstrates the growing added value of Polish services – these are no longer just simple software services, but often complex R&D projects, IT consulting, cloud services, cyber security, etc. The Polish IT sector is therefore becoming an integral part of global value chains. On the one hand, the presence of foreign capital guarantees an inflow of projects, know-how and stable financing; on the other hand, it can be a challenge to keep as much of the profits and intellectual property in the country as possible. Nevertheless, the current model seems to be beneficial for the Polish economy: it provides jobs (IT centres employ tens of thousands of specialists in Poland) and stimulates the transfer of knowledge and best practices. Thanks to this, Poland is strengthening its position on the global IT services market, as evidenced by high positions in rankings of attractive outsourcing locations and a growing number of international contracts executed by Polish companies.

    Polish IT services exports in 2019-2023 were characterised by impressive growth and resilience to crises, confirming the maturity and competitiveness of the industry. The trend is unequivocally upward – the sector is benefiting from the global digital transformation, while at the same time driving the domestic economy itself, providing growing foreign exchange inflows and a positive services trade balance. The outlook for the coming years is cautiously optimistic. After a temporary slowdown in 2023-2024, many analysts expect a gradual recovery. Macroeconomic forecasts indicate that in 2025, GDP growth rates in our key economies (Germany, USA, UK) may be higher than in 2023, which will translate into higher demand for IT services. At the same time, wage pressure in Poland is expected to ease somewhat, and the exchange rate of the zloty should remain at a level favourable to exporters. All this creates the conditions for maintaining a solid growth rate, although probably in the teens rather than the tens of per cent a year. The potential of the industry remains high – the growing role of artificial intelligence, cloud computing, automation and cyber security generates new niches in which Polish companies are already gaining competence.

    According to the estimates of the Polish Development Fund, the value of the Polish ICT sector (including IT, telecommunications and IT security services) may increase to EUR 25.2 billion by 2026, indicating a continuation of the positive trend. However, risk factors need to be borne in mind. A possible worsening of the downturn in Europe or the US, increased competition from other outsourcing centres (e.g. in Asia or Latin America), or difficulties in accessing IT talent at home – all of these could limit the rate of growth. It will therefore be crucial to invest in human capital (training and attracting specialists) and to support innovation and home-grown IT products in order to gradually move from a purely service-based model to one also based on proprietary technology. For the time being, moderate forecasts assume that growth trends will continue, although no longer as spectacular as at the peak of the post-pandemic boom. Nevertheless, given the achievements to date, the Polish IT services sector remains one of the pillars of exports of strategic importance to the economy. If it manages to remain competitive and continue to gain the trust of foreign clients, IT services exports should continue to grow, strengthening Poland’s position as a European leader in the technology industry. This will certainly be an area worth watching closely in the years to come.

  • SOC report: Why modern security centres are becoming a pillar of business continuity

    SOC report: Why modern security centres are becoming a pillar of business continuity

    In a digital ecosystem where the lines between innovation and threat are blurring by the second, the Security Operations Centre (SOC) is emerging as the strategic nerve centre of every modern organisation. It is no longer just a technical bastion monitoring alerts, but a dynamically evolving organism that must stay ahead of the movements of increasingly sophisticated adversaries. In an age of AI-driven attacks and ubiquitous cloud, the effectiveness of the SOC determines the survival, reputation and continuity of the business.

    Pillars of cyber defence: the synergy of people, processes and technology

    The effectiveness of any SOC rests on the harmonious cooperation of three fundamental pillars: skilled people, standardised processes and advanced technology. Neglecting any of these creates gaps that become open gates for attackers.

    People – the irreplaceable heart of the operation

    It is the people – their knowledge, intuition and ability to think analytically – that are the most important element of cyber defence. The SOC team is usually a multi-level structure, where each line has clearly defined tasks. L1 analysts are the front line, monitoring the constant flow of alerts 24/7, filtering out information noise and escalating potential threats. L2 analysts, incident specialists, perform in-depth analysis, determine the scale of the attack and implement countermeasures. At the top are L3 analysts – elite ‘threat hunters’ who proactively look for signs of advanced, hidden adversaries. The whole is supported by security engineers, responsible for the architecture and maintenance of the tools, and a SOC manager, who manages the strategy and the team.

    Processes – the backbone of effectiveness

    Standardised processes ensure consistency and repeatability of actions, regardless of time pressures or shift composition. The basis here is the NIST incident lifecycle, which defines four key phases: preparation; detection and analysis; containment, eradication and recovery; and post-incident activity, or lessons learned. Mature SOCs complement this reactive model with proactive threat hunting – the iterative process of searching the network for signs of adversaries that have evaded automated defences.

    Technology – the arsenal of the modern defender

    A modern SOC is based on an integrated ecosystem of tools. At its heart is the SIEM (Security Information and Event Management) system, which aggregates and correlates logs from across the infrastructure, acting as the analytical brain of operations. SOAR (Security Orchestration, Automation, and Response) platforms act as a ‘force multiplier’, automating repetitive tasks and orchestrating incident response, allowing analysts to focus on more complex problems. A key role is also played by EDR (Endpoint Detection and Response) technologies, which provide deep visibility into workstations and servers, and by their evolution, XDR (Extended Detection and Response), which integrates data from multiple layers (network, cloud, email) to offer a holistic view of an attack.
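    The log correlation a SIEM performs can be illustrated with a deliberately simplified sketch of one classic detection rule: several failed logins from the same source within a short window, followed by a success, suggests a brute-force attempt that paid off. Real SIEM rules are written in the platform’s own query language; the threshold and window here are hypothetical.

```python
from collections import defaultdict

def brute_force_alerts(events, threshold=5, window=60):
    """Correlate auth logs: flag a source that fails `threshold`+ times
    within `window` seconds and then succeeds - a classic SIEM rule.
    Events are (timestamp_seconds, source_ip, outcome) tuples.
    """
    failures = defaultdict(list)
    alerts = []
    for ts, ip, outcome in sorted(events):
        if outcome == "fail":
            failures[ip].append(ts)
        elif outcome == "success":
            recent = [t for t in failures[ip] if ts - t <= window]
            if len(recent) >= threshold:
                alerts.append((ip, ts))
            failures[ip].clear()
    return alerts

# Five rapid failures from one address, then a successful login
log = [(i, "10.0.0.7", "fail") for i in range(5)] + [(6, "10.0.0.7", "success")]
print(brute_force_alerts(log))  # → [('10.0.0.7', 6)]
```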

    A vicious circle of challenges: fatigue, skills gap and complexity of risks

    Despite advanced tools, SOC teams around the world are facing an operational crisis driven by three interrelated challenges.

    The first is alert fatigue. Large organisations can receive tens of thousands of alerts a day, more than half of which are false alarms. This constant ‘noise’ leads to desensitisation and burnout – more than 70% of analysts report symptoms of it. Paradoxically, the more alerts, the less security, as the risk of overlooking the one critical signal of a real attack increases.

    The second challenge is the global skills gap. It is estimated that there is a global shortage of around 4 million cyber security professionals. In Europe, 61% of teams are understaffed and nearly half of companies have serious recruitment difficulties. In Poland, 27% of companies report problems finding IT experts, including cyber security experts. Talent shortages lead to overloading existing teams, which in turn fuels burnout and turnover, creating a vicious circle.

    The third element is the increasing complexity of the threat landscape. The migration to the cloud has opened up new attack vectors, where as many as 99% of incidents are due to client-side configuration errors. At the same time, cybercriminals are increasingly bold in using artificial intelligence to automate and personalise attacks – from generating linguistically flawless phishing emails, to creating polymorphic malware, to deepfakes used in financial fraud.

    Next-generation SOC: intelligent automation and proactive defence

    In response to these challenges, the SOC is undergoing a fundamental transformation in which artificial intelligence is playing a key role – not as a replacement for humans, but as their most powerful ally.

    AI and machine learning (ML) algorithms are becoming the first line of defence against alert fatigue. They can analyse huge datasets in real time, automatically reject false alerts, and enrich the relevant ones with context and prioritisation. Technologies such as User and Entity Behaviour Analytics (UEBA) learn the ‘normal’ functioning of the environment and flag any anomalies that might escape human attention.
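    The core UEBA idea – learning each entity’s own baseline and flagging deviations from it – can be reduced to a toy sketch. Production UEBA systems use far richer behavioural models; the entities, counts and threshold below are hypothetical.

```python
from statistics import mean, stdev

def ueba_flags(history, today, z=3.0):
    """Flag entities whose activity today deviates from their own baseline.

    `history` maps entity -> list of past daily event counts; `today`
    maps entity -> today's count. The z-score threshold is hypothetical.
    """
    flagged = []
    for entity, counts in history.items():
        mu, sigma = mean(counts), stdev(counts)
        observed = today.get(entity, 0)
        if sigma > 0 and abs(observed - mu) / sigma > z:
            flagged.append(entity)
    return flagged

history = {
    "alice": [40, 42, 38, 41, 39, 40, 43],            # stable office-hours user
    "svc-backup": [500, 510, 495, 505, 498, 502, 500],
}
today = {"alice": 41, "svc-backup": 5000}             # service account suddenly 10x
print(ueba_flags(history, today))  # → ['svc-backup']
```

    The same daily count is normal for one entity and anomalous for another – which is exactly why per-entity baselines catch what a single global threshold would miss.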

    The latest wave, generative AI, is revolutionising the work of analysts. Tools such as Microsoft Security Copilot and Splunk AI Assist can translate natural language queries into complex code, automatically generate concise summaries of complex incidents and recommend next investigative steps. This frees analysts from routine tasks, allowing them to focus on strategic thinking and verification of AI results. The role of the analyst is evolving from ‘tool operator’ to ‘AI supervisor and partner’.

    This technological evolution is also driving a strategic paradigm shift – from reactive to proactive defence. Instead of waiting for an alert, the next-generation SOC proactively manages risk. It implements strategies such as Continuous Threat Exposure Management (CTEM), systematically identifying and prioritising vulnerabilities from a business perspective. AI here supports advanced threat hunting, automating hypothesis generation and scouring data for subtle signs of compromise.
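    The CTEM idea of prioritising vulnerabilities ‘from a business perspective’ can be sketched as weighting technical severity by asset criticality and exposure. The scoring scheme below is a hypothetical illustration, not a standard formula: note how a medium-severity flaw on a critical, internet-facing asset outranks a critical CVE on a low-value internal box.

```python
# Hypothetical findings and weights for illustration only.
findings = [
    {"id": "CVE-A", "cvss": 9.8, "asset_criticality": 0.3, "internet_facing": False},
    {"id": "CVE-B", "cvss": 7.5, "asset_criticality": 1.0, "internet_facing": True},
    {"id": "CVE-C", "cvss": 6.1, "asset_criticality": 0.8, "internet_facing": True},
]

def risk_score(f):
    """Technical severity weighted by business context: criticality
    scales the score, internet exposure adds a multiplier."""
    exposure = 1.5 if f["internet_facing"] else 1.0
    return f["cvss"] * f["asset_criticality"] * exposure

for f in sorted(findings, key=risk_score, reverse=True):
    print(f["id"], round(risk_score(f), 2))
# CVE-B (11.25) and CVE-C (7.32) outrank the 9.8-CVSS CVE-A (2.94)
```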

    Catalysts for expansion: Key factors shaping the SOC market

    Behind the impressive growth figures for the Security Operations Centre market is a set of powerful and interrelated driving forces. Understanding these drivers is key to assessing the sustainability of current trends and predicting the future evolution of the industry. This section explores the fundamental reasons why organisations around the world are investing with increasing urgency and scale in advanced defence capabilities.

    The evolving threat landscape: A new era of adversaries

    The primary driver of the SOC market is the constant and accelerating evolution of the threat itself. Modern cyber attacks have long ceased to be the domain of individual hackers; they have become a highly organised, automated and profitable branch of crime. We are seeing a fundamental shift in the characteristics of threats, which are becoming increasingly complex, persistent and destructive. Organisations no longer face just simple malware, but advanced, multi-vector campaigns such as Advanced Persistent Threats (APTs) and ransomware attacks that can cripple entire enterprises.

    This technological escalation on the attackers’ side forces a corresponding response on the defenders’ side. Traditional, reactive and signature-based security mechanisms are becoming insufficient. The need to detect subtle anomalies, analyse complex behaviour patterns and respond in machine time is directly driving the demand for modern SOCs equipped with AI and machine learning-based analytical tools.

    Digital transformation and the expanding attack surface

    In parallel to the evolution of threats, the very structure of corporate IT environments is undergoing a fundamental transformation. Digital transformation initiatives, while crucial to competitiveness, inevitably lead to a broadening and dispersal of the attack surface, generating new and complex challenges for security teams.

    Migration to the cloud: The mass movement of resources and applications to public clouds (AWS, Azure, GCP) and hybrid environments is one of the most important technology trends of the decade. However, this process is leading to the blurring of the traditional, well-defined network ‘perimeter’. Cloud security is based on a shared responsibility model, in which the cloud provider is responsible for the security of the platform itself, but the customer is fully responsible for the secure configuration of services, identity and access management and data protection. Unfortunately, misconfigurations of cloud services have become one of the main causes of data leaks. As Gartner analysis shows, as many as 99% of cloud security incidents are the result of customer-side error. This complexity and risk requires specialised tools and competencies in cloud configuration monitoring (CSPM) and cloud workload protection (CWPP), which are an integral part of a modern SOC. Case studies, such as the Capital One data leak caused by a misconfigured web application firewall in AWS, vividly illustrate the scale of this risk.

    The proliferation of IoT and OT: Another factor is the convergence of information technology (IT) with operational technology (OT) and the explosive growth of Internet of Things (IoT) devices. OT systems, such as industrial control systems (ICS) and SCADA, which manage processes in factories, power plants or critical infrastructure, have historically been isolated from corporate networks. Their connection to the internet to optimise operations creates new critical attack vectors. Similarly, millions of IoT devices – from smart sensors to cameras – are introducing new endpoints into the corporate network that are often poorly secured, lack the ability to install security agents and are difficult to manage. Monitoring such a diverse and vast ecosystem of devices is impossible without the centralised analytics platform that SOC offers.

    Increasing burden of compliance and risk management

    The third pillar driving the growth of the SOC market is the increasingly complex and demanding regulatory environment and the increasing demands from business partners, including insurers. Having a documented capability to monitor and respond to incidents is no longer merely good practice; it is becoming a hard business requirement.

    It can be seen that these three key drivers – threat evolution, digital transformation and regulatory pressure – do not operate in isolation. They form a mutually reinforcing loop. Business initiatives such as migration to the cloud are creating new, complex attack surfaces. These new vectors are immediately exploited by adversaries armed with increasingly sophisticated tools such as AI. Increasing risks and spectacular incidents are attracting the attention of regulators and insurers, who are responding with stringent new requirements. For organisations, the only rational way to manage these complex risks, meet regulatory requirements and satisfy the expectations of business partners is to implement or contract a professional Security Operations Centre. In this way, business and technology decisions directly and indirectly drive demand in the SOC market.


  • US could absorb almost all global AI chip supply by 2030

    US could absorb almost all global AI chip supply by 2030

    The artificial intelligence (AI) revolution is driving unprecedented demand for two key resources: electricity and specialised semiconductors. Technology giants and financial markets paint a picture of almost unlimited growth, but beneath the surface of this narrative lies a fundamental conflict. On the one hand, we are faced with exponentially increasing projections of energy demand from data centres. On the other, we face hard physical and geopolitical constraints on the global capacity to produce the advanced chips needed to power this revolution. These two trends, inextricably linked, appear to be on a collision course.

    A landmark report published by London Economics International (LEI) was the first to quantify this contradiction. LEI’s analysis shows that projections of energy demand in the US alone, when translated into chip demand, are impossible to reconcile with realistic global supply scenarios.

    AI gold rush: a tsunami of energy demand

    The current technological era is dominated by a phenomenon that can aptly be described as the ‘AI gold rush’. It is driving investment on an unprecedented scale, which can be seen most clearly in the rapid growth of the data centre market – the physical infrastructure that is the backbone of artificial intelligence. The global data centre market is expected to grow from around USD 347.6 billion in 2024 to more than USD 652 billion by 2030, driven by growing demand for cloud services, Internet of Things (IoT) technologies and, above all, AI applications.

    Artificial intelligence, particularly its generative variety, is a major catalyst for the exponential increase in energy demand. Training large AI models is an extremely energy-intensive process, and their subsequent use (inference) also consumes significantly more energy than traditional computational tasks. It is estimated that a single query to a model such as ChatGPT uses around 10 times more energy than a traditional Google search. Furthermore, dedicated AI accelerators such as graphics processing units (GPUs) consume significantly more energy than traditional processors (CPUs).

    The scale of this energy tsunami is reflected in forecasts from leading institutions. The International Energy Agency (IEA) warns that global data centre power consumption could double by 2026. Goldman Sachs estimates that by 2030, global power demand from data centres will increase by 165% compared to 2023, and that AI’s share of this demand will rise from 14% to 27% as early as 2027. In the US, the projections are even more dramatic – by 2030, data centres could account for 9% to 12% of total national energy consumption, compared to around 4.4% in 2023. Behind this boom are mainly the so-called hyperscalers – Amazon, Google, Microsoft and Meta – who plan to spend USD 217 billion on AI infrastructure in 2024 alone.

    The great contradiction: the mathematics of the impossible

    In the face of such astronomical forecasts, the LEI report is a sobering voice of reason. Its authors cast doubt on the reliability of forecasts of energy needs, exposing the fundamental contradiction between declared demand and the physical capacity to meet it.

    The report’s key argument is simple but striking: energy demand forecasts for data centres in the US are systematically overestimated and consequently unreliable. To prove this, LEI used an innovative methodology, comparing aggregate energy demand forecasts with the physical global capacity to produce a key component – advanced AI chips.

    The results of this analysis are clear. LEI aggregated power demand forecasts from US grid operators, covering 77% of the US power market. These showed that an additional 57 GW of power demand would be created between 2025 and 2030. Subsequently, the analysts estimated that, even with very optimistic assumptions about the growth of global AI chip production, the total new supply over this period would be able to meet a demand equivalent to 63 GW of new data centre capacity worldwide.

    The juxtaposition of these two figures leads to a shocking conclusion: in order to meet the US operators’ projections, the US would have to absorb more than 90% of the entire new global supply of AI chips produced worldwide between 2025 and 2030. Such a scenario is clearly unrealistic. The US currently accounts for less than 50% of global chip demand, and other regions such as Europe and Asia are also rapidly developing their AI capabilities. The conclusion is inescapable: the forecasts on which US energy expansion plans are based are fundamentally unreliable and disconnected from the physical realities of global production.
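    The report’s central arithmetic can be checked directly from the two figures quoted above:

```python
us_forecast_gw = 57    # new US data-centre demand, 2025-2030 (operators covering 77% of the market)
global_supply_gw = 63  # demand the projected global AI chip supply could serve, worldwide

# Share of all new global chip supply the US alone would have to absorb
us_share_needed = us_forecast_gw / global_supply_gw
print(f"{us_share_needed:.0%}")  # ~90% - versus the <50% of global demand the US holds today
```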

    Fragile foundations: geopolitics, technology and talent

    The contradiction exposed by the LEI report is deeply rooted in the extremely complex and disruption-prone structure of the global semiconductor supply chain.

    Firstly, geopolitics. Global manufacturing is dominated by just a few countries and companies. At the centre of this ecosystem is Taiwan, specifically TSMC, which controls more than 90% of the market for the most advanced chips. The entire digital economy is de facto hostage to political stability across the Taiwan Strait. The second pillar is South Korea, with Samsung and SK Hynix dominating the memory market, including HBM memory, which is crucial for AI. This concentrated landscape has become an arena for a technology war between the US and China, leading to market fragmentation and supply chain disruption.

    Secondly, technology. A key bottleneck is advanced packaging. Chip-on-Wafer-on-Substrate (CoWoS) technology, dominated by TSMC, is essential for integrating multiple chiplets into a single AI super processor. CoWoS production capacity, despite dynamic expansion, has not kept pace with exploding demand, which is a major factor limiting the supply of the most powerful accelerators.

    Thirdly, people. Even if geopolitical and technological barriers could be overcome, a lack of skilled human resources stands in the way. It is estimated that by 2030, the industry will be short of around 67,000 professionals in the US alone. Globally, this gap could reach hundreds of thousands. New factories being built under initiatives such as the CHIPS Act could be left empty due to a lack of engineers and technicians to operate them.

    Avoiding collision: efficiency and innovation

    Faced with an inevitable collision, the technology industry must make a strategic turnaround. The way out of the impasse is through two paths: radical efficiency improvements and the search for new computing paradigms.

    In the short term, efficiency is key. The concept of ‘Green AI’ involves designing models with a view to minimising their energy footprint. Techniques such as model optimisation (e.g. pruning, quantisation) can significantly reduce computing power requirements. Infrastructure efficiency is equally important. Switching from air cooling to much more efficient liquid cooling can reduce energy consumption in a data centre by up to 40%. However, it is important to bear in mind the so-called rebound effect (Jevons paradox), where improvements in efficiency can lead to even higher resource consumption.
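    To make the model-optimisation point concrete, the sketch below shows symmetric int8 quantisation, one of the techniques mentioned above: weights are stored as 8-bit integers plus a single float scale, cutting memory roughly fourfold versus float32 at a small accuracy cost. This is a minimal NumPy illustration, not any framework’s production quantiser (real deployments use per-channel scales, calibration and quantisation-aware training).

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor int8 quantisation: map floats onto
    [-127, 127] using one shared scale factor."""
    scale = np.max(np.abs(w)) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.random.randn(1000).astype(np.float32)  # stand-in for model weights
q, s = quantize_int8(w)
err = np.mean(np.abs(w - dequantize(q, s)))
print(f"memory: {w.nbytes} -> {q.nbytes} bytes, mean abs error {err:.4f}")
```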

    In the long term, a technological breakthrough is needed. The two most promising technologies are photonics and neuromorphic computing. Photonic computing, using light instead of electrons, offers theoretically orders of magnitude higher throughput and lower energy consumption. Neuromorphic computing, on the other hand, inspired by the architecture of the human brain, is inherently more energy efficient for AI tasks. Realistically, however, these technologies will not be ready for mass deployment before the end of the decade.

    Moving beyond the impossible race

    The analysis leads to a clear conclusion: the ‘impossible chip race’ is not a distant threat, but a present reality. The current growth trajectory is unsustainable. The constraints are systemic – geopolitical tensions, technological bottlenecks and a global talent gap create a web of interconnected barriers.

    A change in thinking is required in this situation. The real victory in this race will not be to build the biggest and most energy-intensive supercomputer. The victory will be to develop a fundamentally new, sustainable development path. The future belongs not to those who will be able to power the most monstrous AI models, but to those who will figure out how to achieve the same or better results with radically lower energy and resource consumption. This is no longer a race for raw computing power. It’s a race for intelligence – both the artificial intelligence we create and the human intelligence we must use to manage this transformation responsibly. Moving beyond the ‘impossible race’ requires moving from a paradigm of ‘more’ to a paradigm of ‘smarter’. This is the greatest challenge and also the greatest opportunity facing the technology industry in the coming decade.

  • AI infrastructure in 2025: Ready for a revolution or doomed to fail? Findings from the Flexential report

    AI infrastructure in 2025: Ready for a revolution or doomed to fail? Findings from the Flexential report

    The year 2025 marks the point at which artificial intelligence ceases to be a technological curiosity and becomes a business imperative. Already 83% of companies say AI is a top priority in their plans, and 88% of leaders see accelerating its adoption as a key objective. Investment is growing exponentially, driven by the promise of unprecedented returns.

    However, behind the facade of this enthusiasm lies a disturbing paradox. Organisations, rushing towards revolution, are colliding with the wall of their own technological foundations. Flexential’s ‘2025 State of AI Infrastructure’ report mercilessly exposes this truth: as many as 44% of companies cite IT infrastructure limitations as a major barrier to the expansion of AI initiatives. This is a central strategic challenge that can derail the most ambitious plans. Infrastructure, once seen as a back-office function, has become the bottleneck that determines the pace of innovation.

    ROI-driven arms race

    The global AI boom in 2025 is no longer driven by curiosity, but by hard, tangible benefits. We have moved from a phase of experimentation to a phase of strategic, company-wide scaling. The driving force is return on investment (ROI). Companies that have implemented generative AI report an average return of 3.7 times their investment. What’s more, 74% of organisations report that their most advanced GenAI initiatives meet or exceed ROI expectations.

    These results mean that business leaders are no longer asking “does AI work?”, but “how do we implement it in key processes to gain a sustainable advantage?”. We are seeing a clear shift in focus from distributed testing to deep deployments in business model-critical areas such as IT, operations, marketing and customer service.

    The scale of financial commitment is impressive. As many as 70% of organisations are allocating at least 10% of their IT budgets to AI initiatives. Enterprise use of generative AI has jumped from 55% to 75% in the past year alone. However, this rush towards innovation creates a dangerous feedback loop. Success in ROI motivates boards to allocate ever-increasing budgets to further projects, each of which generates huge, often underestimated demands for computing power, network bandwidth and data centre space. As a result, initial success becomes a driving force that collides with the physical limitations of the infrastructure, creating a strategic dilemma: how do you keep up the pace when the technological foundations start to crack under the pressure of your own success?

    Critical point: AI infrastructure crisis

    Enthusiasm and investment collide with hard, physical reality. IT infrastructure, for years treated as a support for the business, today becomes its main constraint. It is no longer just an IT issue, but a fundamental business risk.

    Energy and cooling – physical limits to growth

    AI workloads generate an insatiable appetite for energy. The data from the Uptime Institute study is alarming: 27% of server racks dedicated to AI training exceed a power consumption of 50 kW. To put this appetite in context, answering a single ChatGPT query can require around ten times more energy than a traditional Google search. This ‘power fever’ is forcing data centre operators into immediate and costly action: as many as 52% are urgently upgrading their power infrastructure and 51% are investing in new cooling systems. Underestimating these needs leads to project delays, increased operational costs and a physical ‘ceiling’ on a company’s ability to scale AI.

    Data centres and the network – new bottlenecks

    The problem is not just one of capacity, but also of space availability. Demand for specialist data centres is dramatically outstripping supply. Vacancy rates in key markets have fallen to a record low of 1.9%, and more than 70% of new facilities are leased even before completion. For companies seeking significant capacity, waiting times for new infrastructure now exceed 24 months. This forces organisations to anticipate their needs 1-3 years ahead.

    Even if a company manages to secure power and space, its efforts can be thwarted by another bottleneck: the network. Flexential’s report shows a dramatic increase in problems in this area. The percentage of organisations reporting network capacity issues rose from 43% to 59% in a year, and latency challenges rose from 32% to 53%. Investing millions in the most expensive AI accelerators without at the same time upgrading the network to standards such as RoCE or InfiniBand is like putting a Formula 1 race car engine into a city car – the powerful computing power will be continually throttled.

    The infrastructure crisis is creating a new division between market winners and losers. The dividing line no longer runs along the wealth of the portfolio, but along the capacity for strategic planning. Companies that anticipated this crisis and planned their needs years ahead now have a fundamental advantage that cannot be quickly made up. The new competitive divide is therefore not between the rich and the poor, but between the foresighted and the reactive.

    Competence gap

    Even organisations that manage to overcome infrastructure barriers face another challenge: a shortage of skilled people. The AI skills gap is reaching a critical level, becoming as serious a brake as restrictions on access to hardware.

    Scale of the problem in figures

    Statistics show that talent has become the scarcest resource. As many as 86% of leaders are concerned about their ability to attract or develop specialist talent. For 38% of C-suite executives, the skills gap is the main factor inhibiting their organisation’s performance. The problem is growing: in one year, the percentage of companies reporting a shortage of AI management skills rose from 53% to 61%. This translates into dramatically low confidence among leaders, with only 14% believing they have the right talent within their ranks to execute an AI strategy.

    The new technological elite and strategic responses

    The talent shortage is not uniform. Demand for some roles has exploded, creating a new, highly paid elite. The best example is the MLOps Engineer, a role that combines machine learning, software engineering and operations (DevOps). LinkedIn has seen this role grow 9.8-fold in popularity over five years. Demand for AI infrastructure specialists, low-latency networking experts and GPU cluster management is also growing rapidly.

    Faced with such a competitive market, companies need to act strategically. The first option is to ‘build’ talent from within through upskilling and reskilling programmes. As many as 69% of global CEOs predict that implementing AI will require most of their employees to learn new skills. The second option is to ‘buy’ or ‘borrow’ talent through recruitment or outsourcing. The labour market for AI professionals is extremely heated, making outsourcing a powerful tool that provides immediate access to the global talent pool and eliminates lengthy recruitment processes.

    The AI skills gap is not a problem separate from the infrastructure crisis – it is a direct result of it. The increasing complexity of technologies such as InfiniBand and advanced cooling systems is creating a need for a whole new set of skills that the market has not been able to develop in such a short time. You can’t solve the infrastructure problem by just buying hardware. At the same time, you need to invest in people who can operate it effectively.

    Trust deficit: Security and governance in the age of AI

    Even organisations with state-of-the-art infrastructure and the best talent can fail if they ignore the third pillar: trust. The rapid and often uncontrolled adoption of AI is creating new and unprecedented risks.

    New risk frontier

    The deployment of AI on a massive scale is fundamentally changing the cyber threat landscape. As many as 55% of companies admit that AI adoption has increased their vulnerability to attacks, a dramatic increase from 39% the year before. New attack vectors are emerging, such as data poisoning and adversarial attacks to deceive the model.

    One of the most insidious threats is ‘Shadow AI’ – the unauthorised use of external AI tools by employees. In a well-intentioned attempt to increase productivity, employees enter sensitive company data into publicly available models such as ChatGPT, leading to millions of information leakage incidents.

    Governance imperative

    The speed of AI deployments is outpacing the development of internal regulations. A third (33%) of companies admit that their corporate governance systems for AI lack defined security protocols, and nearly half (48%) report gaps in bias-detection policies. In response to this chaos, the US National Institute of Standards and Technology (NIST) has published the AI Risk Management Framework (AI RMF), a comprehensive guide to help manage risk in a structured way.

    The ‘Shadow AI’ phenomenon is a paradoxical side-effect of upskilling programmes. Companies encourage employees to learn, but by failing to provide them with safe, in-house alternatives, they push them into the arms of public tools. Upskilling initiatives, if not coupled with clear policies and the provision of secure tools, become a major source of risk for the entire organisation.

    Verdict for 2025

    The AI revolution is inevitable, but failure is an equally real scenario for those who ignore the cracks appearing in their technological and human foundations. Success will not depend on the scale of ambition, but on the ability to synchronise vision with operational readiness.

    The question posed in the title – ready for revolution or doomed to failure? – has no single answer. It will be answered individually, within each organisation, through the decisions taken in the coming months. The companies that approach these challenges with due seriousness have the opportunity to become the true leaders of this transformation. The others risk their AI investments becoming costly monuments to a missed opportunity.

  • The ‘cloud at any price’ paradigm is becoming history. The time of cloud at ‘good price’ is coming

    The ‘cloud at any price’ paradigm is becoming history. The time of cloud at ‘good price’ is coming

    After a decade of almost unconditional dominance of the ‘cloud first’ paradigm, the cloud market is entering a new, much more mature phase. Initial enthusiasm, driven by the promise of scalability and flexibility, is giving way to hard cost analysis and real business value. The industry is increasingly talking about ‘cloud fatigue’ – a phenomenon that signals not the end of the cloud era, but the end of its uncritical adoption.

    The paradox of growth and discontent

    Despite growing scepticism, the cloud market is experiencing an unprecedented boom. Global spending on public cloud services is expected to reach US$723.4 billion in 2025, an increase of 21.5% on the previous year. However, this impressive growth stands in stark contrast to widespread problems. According to Flexera’s ‘State of the Cloud 2025’ report, as many as 84% of organisations are struggling to manage their cloud spend, with budgets exceeded by an average of 17% and estimated wastage at an alarming 27%.

    The key to understanding this paradox is artificial intelligence. Investment in AI and machine learning (ML) in the cloud is literally exploding. Gartner predicts that AI will consume 50% of all cloud computing resources by 2029, compared to less than 10% today. Companies are spending more because they need to invest in innovation to stay competitive. At the same time, they are struggling with the increasing cost and complexity of the very platforms on which they are building their future.

    The true price of the cloud and the birth of FinOps

    “Cloud fatigue” comes from the realisation that its true cost goes well beyond the price of a virtual machine. Hidden charges, such as those for outbound data transfer from the cloud (egress), become a painful trap. Hyperscalers charge around $0.09 to $0.12 for every gigabyte of data sent to the internet, effectively discouraging switching providers or building hybrid architectures.
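    The egress trap is easy to quantify. The sketch below uses the $0.09-0.12/GB range cited above as an illustrative list price; real providers apply tiered pricing and free allowances, so treat this as a back-of-the-envelope estimate, not a billing calculator.

```python
def egress_cost_usd(gb, price_per_gb=0.09):
    """Rough egress bill at a flat per-GB list price
    (illustrative; actual hyperscaler pricing is tiered)."""
    return gb * price_per_gb

# Moving a 50 TB dataset out of the cloud once:
low = egress_cost_usd(50_000, 0.09)
high = egress_cost_usd(50_000, 0.12)
print(f"${low:,.0f} - ${high:,.0f}")  # $4,500 - $6,000 for a single exit
```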

    In response to this complexity, a new management discipline – FinOps – was born. It is an organisational culture and set of practices that brings together finance, technology and business to make informed decisions about cloud spending. The scale of adoption of this methodology is impressive: already 59% of companies have dedicated FinOps teams. FinOps is evolving from a simple cost control tool to a strategic analytics engine that helps decide which workload should run in which environment – public cloud, private cloud or on-premise.


    Repatriation as strategic recalibration

    As companies gain a better understanding of costs, there is growing interest in repatriation – moving applications and data from the public cloud back to their own data centres. While still a niche phenomenon (21% of workloads have been repatriated to date), up to 86% of IT directors are planning to move at least some of their resources back to private environments.

    A flagship example is 37signals (creators of Basecamp), which, after spending US$3.2m on public cloud services in 2022, has invested in hardware and estimates annual savings of US$1.3-2m. Similarly, Dropbox has saved US$75m over two years by building its own infrastructure. Repatriation is not evidence of the failure of the cloud, but of the maturity of companies, which are choosing more cost-effective models for stable and predictable workloads. The ‘cloud-first’ dogma is being replaced by a pragmatic ‘workload-first’ strategy.
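    A simple payback calculation shows why such moves pencil out for stable workloads. The annual savings range below comes from the 37signals figures reported above; the hardware outlay is an illustrative assumption, as the article does not state it.

```python
# Illustrative payback sketch. Only the savings range is from the
# article; the capex figure is a hypothetical assumption.
capex_usd = 700_000              # assumed one-off hardware purchase
annual_savings_low = 1_300_000   # reported savings estimate, low end
annual_savings_high = 2_000_000  # reported savings estimate, high end

payback_worst = capex_usd / annual_savings_low   # years, conservative
payback_best = capex_usd / annual_savings_high   # years, optimistic
print(f"payback: {payback_best:.1f}-{payback_worst:.1f} years")
```

    The point of the exercise is less the exact figure than the shape of the result: when a workload is predictable, a one-off capex can be recovered from cloud savings within roughly a year, after which the savings compound.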

    New players challenge the giants

    In response to the weaknesses of the hyperscaler model, a new wave of specialised providers is emerging in the market. They are not competing with the giants on their terms, but solving specific problems. Wasabi, for example, offers a simple and predictable pricing model for data storage, completely eliminating fees for egress and API requests, making it ideal for backup and archiving. Akamai Connected Cloud, on the other hand, combines traditional data centres with a massively distributed edge network, placing computing power closer to users, which is crucial for latency-sensitive applications such as gaming and streaming.

    These new players are ‘disaggregating’ the cloud, each optimising one dimension of its value proposition – cost, security or performance – that is compromised by the hyperscalers’ one-size-fits-all model.

    Entering the era of cloud maturity

    “Cloud fatigue” is not a sign of crisis, but a healthy sign of market maturity. Companies are not turning away from the cloud, but are learning to use it in a more informed and strategic way. The future is not a binary choice between cloud and on-premise, but an intelligent hybrid that combines hyper-scale services for innovative AI projects, in-house infrastructure for stable workloads and specialised services from niche providers for specific tasks.

    For IT leaders in Poland, this means building FinOps competencies, auditing the application portfolio for the optimal environment and strategically diversifying to avoid dependence on a single provider. In the new cloud phase, the goal is no longer to be in the cloud at all costs, but to optimise the return on investment for the entire organisation by consciously managing a diversified portfolio of infrastructure options.

  • Digital signage – transient trend or permanent fixture?

    Digital signage – transient trend or permanent fixture?

    Every day, millions of people around the world interact with digital screens, which have become an almost invisible yet ubiquitous part of the urban and commercial fabric. We stare at them at bus stops, checking the actual time of a vehicle’s arrival. They guide us through the mazes of shopping centres, pointing the way to the shop of our choice and tempting us with dynamic promotions. They welcome us at the reception desks of modern office buildings, streamline hotel check-in and allow us to order a meal in a fast-food restaurant without a single word spoken to the staff. This digital revolution, known collectively as digital signage, has evolved from a quiet background into an active participant in our everyday lives.

    This ubiquity, however, raises a fundamental analytical question that goes beyond the simple observation of the phenomenon. Are we witnessing a permanent, profound transformation in the ways in which we communicate, deliver services and organise space, which would make these technologies a fundamental need of the modern world? Or is it merely a passing fad, driven by marketing hype, a fascination with novelty and a desire to demonstrate innovation, which in time may prove to be an unsustainable, costly experiment? The answer requires a multidimensional analysis, from hard market data, to measurable benefits, to technological innovation and real-world challenges.

    Market fundamentals: evidence of technology maturity

    To reliably assess whether we are dealing with a fad or a need, it is first necessary to ground the discussion in hard economic data. By analysing global and local markets, their structure and growth dynamics, it is possible to understand that we are not talking about a niche phenomenon, but a powerful global economic sector.

    According to a report by Grand View Research, the global digital signage market was valued at approximately US$28.83 billion in 2024, with a forecast to grow to US$45.94 billion by 2030, representing a stable compound annual growth rate (CAGR) of 8.1%. Other sources, such as MarketsandMarkets, report similar single-digit growth forecasts. In technology lifecycle models such as Gartner’s ‘Hype Cycle’, phenomena referred to as ‘fads’ are characterised by rapid, speculative growth followed by a painful correction. Stable forecasts for digital signage indicate that the technology has long since passed this stage and is on a ‘productivity plateau’, where it is deployed not for novelty’s sake, but for delivering real value.

    In this global landscape, Poland represents an interesting and rapidly growing segment. The Polish market is estimated to be worth US$494.1 million in 2024, with a forecast to grow to US$753.1 million by 2030 (a CAGR of 7.3%). Significantly, kiosks were the largest revenue segment in the Polish market in 2024, with a 27.22% share. This structure suggests that the Polish market is largely driven by the need to automate processes and implement self-service solutions – from ticketing machines, to queuing systems in offices, to checkouts in retail. This is a strong argument in favour of the ‘need’ thesis, driven by specific socio-economic conditions such as rising labour costs and changing consumer expectations.
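    For reference, both quoted growth rates follow from the standard compound annual growth rate formula; a quick Python check reproduces them from the report figures:

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate: the constant yearly rate that takes
    start_value to end_value over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Global market (Grand View Research): US$28.83bn in 2024 -> US$45.94bn in 2030
global_rate = cagr(28.83, 45.94, 2030 - 2024)   # ~8.1%

# Polish market: US$494.1m in 2024 -> US$753.1m in 2030
poland_rate = cagr(494.1, 753.1, 2030 - 2024)   # ~7.3%

print(f"Global CAGR: {global_rate:.1%}")
print(f"Poland CAGR: {poland_rate:.1%}")
```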

    The argument for ‘need’: measurable benefits and evidence from implementations

    The ultimate test for any technology is its ability to solve real-world problems. Analysis of digital signage deployments in key sectors provides strong evidence of a fundamental need.

    Smart City and Public Sector: In government offices, post offices or courts, information screens and kiosks are becoming the digital front-end of administration, improving service for visitors and reducing chaos. In public transport, dynamic boards showing real-time departure times (Passenger Information Systems) are no longer a luxury but a standard. An excellent example is the system implemented in Poznań, where screens at bus stops, integrated with open ZTM data, allow vehicle locations to be tracked in real time. As the developers emphasise, the key benefit is to “reduce the barrier of accessibility of digital services to people who do not use the app on a daily basis”, making the technology a tool for digital inclusion.

    Retail and Catering: In the commercial sector, the case for ‘need’ is even more pronounced. Data shows that dynamic messages capture the attention of 63% of people, and almost half of customers (47-48%) make a purchase decision under their influence. The implementation of digital signage can lead to an average increase in sales of 31.8%. The most spectacular example is self-service kiosks in quick service restaurants (QSR). After their installation in the McDonald’s chain, the average order value increased by up to 30%. Customers, feeling less pressure, explore the menu more freely and the kiosk interface is an excellent upselling tool – 20% of customers who would not normally order a drink will do so if it is suggested to them by the kiosk.

    Healthcare: In hospitals and clinics, where stress and misinformation can have serious consequences, digital signage is becoming a tool to improve the quality of care. Interactive wayfinding kiosks and screens in waiting rooms, displaying queue status, reduce perceived waiting times by up to 35%. A study conducted in the emergency department of Brigham and Women’s Hospital (affiliated with Harvard Medical School), published in 2023 in JMIR Formative Research, proved groundbreaking. It showed that patients in rooms equipped with digital dashboards (displaying real-time information about the treatment team and care plan) reported a statistically significant increase in satisfaction with communication and were more likely to recommend the facility. This is scientific confirmation that intelligent digital signage directly translates into key quality indicators in healthcare.

    Corporate Communications: In a world of hybrid working and information overload, screens in shared office spaces are an effective alternative to overflowing email inboxes. Integrated with office tools such as calendars or analytical systems (e.g. Power BI), they become dynamic dashboards to support daily work. Given that effective internal communication can increase productivity by up to 25%, investment in this channel is becoming a strategic necessity.

    The argument for ‘fashion’: innovations that shape the future

    While the evidence for the usefulness of digital signage is strong, one cannot ignore the fact that much of the market dynamics are driven by a constant technology race. It is these disruptive technologies that create the impression of ‘fashion’.

    Artificial Intelligence and Personalisation: AI is revolutionising digital signage, transforming static displays into intelligent platforms. The biggest change is the use of Computer Vision to analyse audiences anonymously in real time. Embedded cameras can estimate demographic characteristics (age, gender) and automatically adapt content based on this – for example, showing toy advertising when they detect a family with children.

    The New Era of Outdoor Advertising (DOOH): Digital Out-of-Home (DOOH) advertising is experiencing a renaissance. Programmatic DOOH brings the logic known from online advertising into the physical world, allowing automated, auction-based purchase of advertising space and precise targeting. In Poland, this market is professionalising rapidly, with players such as Screen Network integrating their resources with global platforms, including Google Display & Video 360. The most spectacular manifestation of the ‘fashion’ is anamorphic 3D advertising, which creates the illusion of objects extending beyond the frame of the screen. Installations such as the one at the Promenada shopping centre in Warsaw become viral sensations, generating huge reach on social media.

    What at first glance appears to be a technological ‘fad’ is in fact often a response to fundamental challenges. One of the main reasons for the failure of digital signage projects is unengaging content. Innovations such as AI or Programmatic DOOH are tools to address this very issue, maximising the return on investment and effectiveness of communications. In this view, ‘fashion’ and ‘need’ become two sides of the same coin.

    The dark side of the screen: challenges, risks and controversies

    Despite impressive growth, the road to successful digital signage implementation is fraught with pitfalls. A critical analysis of the challenges is essential to understand why many projects fail.

    Project Economics and Failures: The total cost of ownership (TCO) goes far beyond the screen price, covering software (often on a subscription model), installation, ongoing content creation and maintenance. Studies show that up to 80% of implementations fail to achieve their business objectives. The main reasons for failure rarely lie in the technology, but more often in a lack of strategy, irrelevant content and a lack of performance measurement.

    Ethical and Legal Dilemmas: The use of facial recognition technology is the most controversial. Under the GDPR (known in Poland as RODO), biometric data processed to uniquely identify a person is a special category of data, and its processing requires explicit, voluntary consent – practically impossible to obtain in public space. In response, the market is developing Anonymous Video Analytics (AVA) technologies that estimate general demographic characteristics without identifying individuals or recording data that could link the analysis to a specific person, following a ‘privacy by design’ model. Another risk is cyber security – every networked screen is a potential target for attack, which in Poland, with its rising number of incidents, is a real threat.

    Spatial Order and Visual Pollution: Many Polish cities, out of concern for aesthetics, are introducing so-called landscape resolutions – local bylaws that strictly regulate the placement of advertising media, often drastically restricting or even banning digital advertising, as in Kraków or Gdańsk. This is a reaction to the visual chaos and light pollution that degrade urban space, negatively affect residents’ well-being and even property values.

    From fashion to digital infrastructure

    After analysing all the arguments, the verdict is clear: information screens and digital kiosks have moved beyond the phase of being a mere ‘fad’ and have become a response to the real ‘needs’ of modern society and business. They are part of a wider digital transformation that is blurring the boundaries between the online and offline worlds.

    However, whether a particular implementation turns out to be a useful ‘need’ or merely an expensive ‘fad’ depends critically on how it is implemented. Technology in itself is neutral; it is only the strategy, content and integration that give it real value.

    The future of digital signage will not be about simply increasing the number of screens, but about deeper and smarter integration. Key trends include hyper-personalisation through AI, full integration with Smart City platforms and business systems, sustainability enforced by regulation and the democratisation of technology, which will also become available to smaller players.

    Digital signage has ceased to be a technology for displaying images only. It is becoming a digital interface, a bridge between the vast data resources of the virtual world and our physical experience. As such, it is becoming a part of the digital infrastructure, almost as important as internet access. The question we should be asking ourselves today is no longer ‘whether’ to implement digital signage, but ‘how’ to do it wisely, ethically and in a way that brings real value to all.

  • Public cloud: safe or colander? 97% of confidential data within reach of hackers

    Public cloud: safe or colander? 97% of confidential data within reach of hackers

    Modern business is migrating en masse to the public cloud, often citing enhanced security as one of the key arguments for this transformation. However, behind the facade of this digital revolution, a quiet and disastrous crisis is unfolding. It stems not from the inherent flaws of cloud platforms, but from the way they are being used. This brings us to a fundamental question: for the average company today, is the cloud a digital safe or rather a leaky colander?

    The answer offered by the latest Tenable 2025 Cloud Security Risk Report is alarming. The analysis found that while 9% of publicly accessible cloud resources contain sensitive data, as much as 97% of that exposed information is classified as proprietary or confidential. This is not random, irrelevant data. These are strategic assets, intellectual property and customer data that become easy targets for cybercriminals.

    Quantifying the “colander effect” in financial and operational terms

    Understanding the scale of the problem is crucial. Data from a range of independent, reputable sources paints a picture of risk that no business decision-maker can ignore. This is not a theoretical threat; it is a measurable, growing and extremely costly reality.

    The starting point of our analysis is the shocking data from the Tenable report. The key finding that 9% of publicly available cloud storage contains sensitive data is alarming in itself. But the real alarm rings out when we realise that 97% of this data is proprietary or confidential information. This means that configuration errors do not lead to the leakage of insignificant files, but to the exposure of a company’s most guarded secrets. The data is hard to question: the report is based on an analysis of actual telemetry from the Tenable Cloud Security platform, collected from a variety of public and enterprise cloud environments between October 2024 and March 2025. This is not survey data, but hard evidence from production systems in operation.

    Data exposure is not just a technical problem – it is, above all, a huge financial risk. To translate these risks into concrete figures, let’s look at the IBM Cost of a Data Breach Report 2024. According to this study, the average global cost of a data breach reached a record $4.88 million, a 10% increase on the previous year. IBM’s analysis goes a step further, however, providing data that is key from the perspective of this article. Breaches where data was stored in the public cloud proved to be the most costly type of incident, generating an average cost of US$5.17 million. This directly links the location of the problem (the cloud) to its financial consequences. Furthermore, the report introduces the concept of ‘shadow data’ – information stored in unmanaged repositories, often unknown to IT departments. As many as 35% of breaches involved such data; their cost was 16% higher, reaching US$5.27 million, and they took almost 25% longer to detect and contain.

    The problem is not static; it is escalating at an alarming rate. Data from other leading reports confirms this dangerous trend. The CrowdStrike 2024 Global Threat Report indicates a 75% year-on-year increase in cloud intrusions.

    The Palo Alto Networks 2024 State of Cloud-Native Security Report finds that 64% of organisations have seen an increase in data breaches in the last 12 months. The Verizon 2024 Data Breach Investigations Report (DBIR), meanwhile, reveals that the number of breaches resulting from the exploitation of vulnerabilities has almost tripled (up 180%) in one year. This surge has been driven primarily by zero-day attacks on internet-accessible systems, demonstrating that attackers are actively and effectively exploiting vulnerabilities.

    The combination of this data reveals a phenomenon that can be described as a ‘digital perfect storm’. On the one hand, the cost of a single incident is rising steadily, especially in the cloud, as the IBM report confirms. This is the ‘impact’ variable in the risk equation. On the other hand, reports from CrowdStrike, Verizon and Palo Alto Networks clearly show that the frequency and speed of attacks on cloud environments is increasing dramatically. This, in turn, is the ‘probability’ variable. In basic risk calculus, Risk = Probability × Impact. When both variables rise at the same time, overall business risk grows multiplicatively, far outpacing either factor alone. For the IT sales channel, the lesson is clear: cloud security is not just another problem to add to the list. It is probably the fastest growing area of critical risk for their customers, which justifies positioning services in this area as an absolute priority, not just an add-on. This is a conversation about business continuity, not just IT hygiene.
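    To make that compounding concrete, here is a back-of-the-envelope calculation. Mapping the reports’ headline growth figures onto the two variables of the risk equation is our illustrative assumption, not a claim from the reports themselves:

```python
# Illustrative assumption: use the reports' year-on-year growth figures
# as rough proxies for the two variables in Risk = Probability x Impact.
probability_growth = 1.75  # CrowdStrike: +75% cloud intrusions YoY
impact_growth = 1.10       # IBM: +10% average breach cost YoY

# The two growth rates multiply rather than add, so combined risk
# grows faster than either factor suggests on its own.
risk_growth = probability_growth * impact_growth  # 1.925, i.e. +92.5%
print(f"Combined risk growth: {risk_growth:.3f}x")
```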

    Anatomy of a spill: Deconstructing the four horsemen of the cloud apocalypse

    Having understood the scale of the problem (‘what’), it is now time to diagnose its causes (‘why’). This section looks at the fundamental errors that lead to massive data exposure. As the reports show, we are not dealing with sophisticated, unavoidable attacks, but with failures in the area of basic security hygiene.

    The first and main culprit is misconfigurations. They can be likened to leaving a bank vault door open. The problem lies not in the quality of the lock, i.e. the cloud provider’s infrastructure, but in the failure to use it. The Palo Alto Networks report points to a key reason: 71% of organisations have vulnerabilities resulting from rushed deployments. The ‘just get it up and running faster’ mentality, prevalent in many development teams, leads directly to leaving unsafe defaults and configurations open to the world. Complexity compounds the problem: Gartner analysts point out that the “relentless growth of cloud adoption” is leading to sprawling, difficult-to-manage digital ecosystems. It is in such a complex environment that configuration errors arise most easily. The IBM report confirms that cloud misconfiguration is a common attack vector, accounting for 15% of breaches.

    The second horseman is exposed secrets. In security jargon, ‘secrets’ are passwords, API keys, certificates and authentication tokens – digital keys that give direct access to systems and data. The Tenable report provides devastating statistics showing how widespread the problem of their improper storage is. For example, 54% of organisations using task definitions in AWS ECS store at least one secret in them, as do 52% of companies using GCP Cloud Run and 31% using workflows in Microsoft Azure Logic Apps. The epidemic of exposed secrets is the main fuel for the rising tide of ‘malware-less’ and identity-based attacks. Attackers no longer need to bother creating malware when they can simply find the keys left in publicly available code. The CrowdStrike report indicates that 79% of intrusions are now malware-free, and Verizon DBIR confirms that the use of stolen credentials is the most common initial action in breaches.
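    The kind of secret that ends up in a task definition is usually trivial to find with pattern matching. A minimal sketch of such a scan, with deliberately simplified patterns (real scanners such as gitleaks or trufflehog ship hundreds of rules; the sample config below is invented):

```python
import re

# Two simplified credential patterns: AWS access key IDs and
# generic quoted API keys/tokens assigned in a config file.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "generic_api_key": re.compile(
        r"(?i)\b(api[_-]?key|secret|token)\s*[:=]\s*['\"][^'\"]{16,}['\"]"
    ),
}

def find_secrets(text: str) -> list[str]:
    """Return the names of secret patterns detected in a config blob."""
    return [name for name, pattern in SECRET_PATTERNS.items()
            if pattern.search(text)]

# Hypothetical environment file embedded in a deployment artefact.
config = """
DB_PASSWORD=hunter2
AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
API_KEY="sk_live_0123456789abcdef0123"
"""
print(find_secrets(config))  # ['aws_access_key_id', 'generic_api_key']
```

    Note what the simplified rules miss: the plain `DB_PASSWORD` line is not flagged, which is exactly why production scanners combine large rule sets with entropy analysis.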

    The third element is a phenomenon that Tenable illustratively calls the ‘toxic cloud trilogy’. It describes a single cloud resource that is simultaneously publicly accessible from the internet, critically vulnerable to attack due to unpatched vulnerabilities and highly privileged due to excessive IAM privileges. Although the percentage of companies with at least one such trilogy has fallen, it still affects an alarmingly high 29% of organisations. This configuration is a ready recipe for disaster, giving an attacker a direct attack vector on a vulnerable system that, once compromised, provides powerful privileges for further action.
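    Detecting the trilogy is, at its core, a correlation of three attributes per resource. A rough illustration over hypothetical inventory records (a sketch of the idea, not Tenable’s methodology):

```python
def find_toxic_trilogy(resources: list[dict]) -> list[str]:
    """Return names of resources that are simultaneously publicly
    accessible, critically vulnerable and highly privileged."""
    return [
        r["name"]
        for r in resources
        if r.get("publicly_accessible")
        and r.get("critically_vulnerable")
        and r.get("highly_privileged")
    ]

# Hypothetical inventory: only the first resource combines all three risks.
inventory = [
    {"name": "vm-payments", "publicly_accessible": True,
     "critically_vulnerable": True, "highly_privileged": True},
    {"name": "vm-batch", "publicly_accessible": False,
     "critically_vulnerable": True, "highly_privileged": True},
]
print(find_toxic_trilogy(inventory))  # ['vm-payments']
```

    The point of the correlation is prioritisation: any one attribute on its own is common noise; all three together mark the handful of resources an attacker would reach first.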

    The fourth and final horseman is identity and access management (IAM) failures. Many organisations live under the misconception that implementing a modern identity provider (IdP) solves the problem. The Tenable report debunks this myth, showing that 83% of organisations on AWS use IdP services yet remain at risk from overly permissive defaults and excessive permissions. This fits perfectly with the trends identified by Gartner, which lists ‘managing machine identities’ as one of the key challenges. Palo Alto Networks’ finding of a 116% increase in ‘impossible travel’ alerts shows that identity abuse is the order of the day. Excessive privileges are a force multiplier for any successful intrusion.
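    A least-privilege audit can start with something as simple as flagging wildcard grants. The sketch below is our own illustration over an AWS-style JSON policy document, not any vendor’s tooling:

```python
def overly_permissive(policy: dict) -> list[str]:
    """Flag Allow statements that grant wildcard actions or resources -
    the 'excessive privileges' pattern."""
    findings = []
    for i, stmt in enumerate(policy.get("Statement", [])):
        if stmt.get("Effect") != "Allow":
            continue
        actions = stmt.get("Action", [])
        resources = stmt.get("Resource", [])
        # Both fields may be a single string or a list in policy JSON.
        actions = [actions] if isinstance(actions, str) else actions
        resources = [resources] if isinstance(resources, str) else resources
        if any(a == "*" or a.endswith(":*") for a in actions):
            findings.append(f"statement {i}: wildcard action")
        if "*" in resources:
            findings.append(f"statement {i}: wildcard resource")
    return findings

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Action": "s3:GetObject",
         "Resource": "arn:aws:s3:::app-logs/*"},
        {"Effect": "Allow", "Action": "s3:*", "Resource": "*"},
    ],
}
print(overly_permissive(policy))
```

    A real audit would go much further (unused permissions, privilege escalation paths, machine identities), but wildcard `Action`/`Resource` pairs are the classic first finding.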

    How to turn customer risk into a service revenue strategy

    This section is the strategic heart of the article. It translates the diagnosed problems into concrete, practical service portfolios for the IT sales channel. A perfect starting point is a comment from Ari Eitan, director of cloud security research at Tenable: “Despite the security incidents we have witnessed … organisations continue to leave critical cloud resources … vulnerable to attacks through avoidable configuration errors”. This quote perfectly frames the market opportunity: customers cannot manage these risks on their own and urgently need expert help. This need is compounded by the skills gap. An IBM report indicates that 53% of organisations report staff shortages in security teams, which directly translates into higher breach costs. This is the ultimate argument for clients to outsource these tasks to a competent managed service provider (MSP).

    Sales channel partners can build a modern security portfolio that directly addresses the problems of the ‘four horsemen’. In response to misconfigurations, a natural solution is to offer managed cloud security posture management (Managed CSPM). This service consists of continuous, automated monitoring of cloud environments against security benchmarks to detect and fix errors, such as publicly accessible S3 buckets. It is a direct answer to a major problem, and its market rationale is found in a Palo Alto Networks report, which indicates that 92% of security professionals want better, ready-to-use visibility and risk prioritisation.
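    In miniature, a CSPM check is a policy evaluated continuously against configuration data. A hedged sketch, with static records standing in for what a real service would pull from cloud provider APIs and score against benchmarks such as CIS:

```python
def audit_buckets(buckets: list[dict]) -> list[str]:
    """Flag storage buckets that are publicly readable or unencrypted -
    two of the most common misconfiguration findings."""
    findings = []
    for b in buckets:
        if b.get("public_read"):
            findings.append(f"{b['name']}: publicly readable")
        if not b.get("encryption_at_rest"):
            findings.append(f"{b['name']}: no encryption at rest")
    return findings

# Hypothetical inventory; a managed CSPM service would refresh this
# continuously from the customer's live cloud configuration.
inventory = [
    {"name": "customer-exports", "public_read": True,
     "encryption_at_rest": False},
    {"name": "internal-backups", "public_read": False,
     "encryption_at_rest": True},
]
for finding in audit_buckets(inventory):
    print("FINDING:", finding)
```

    The managed-service value is not the check itself but running it continuously, prioritising the findings and remediating them on the customer’s behalf.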

    To address the problem of exposed secrets and the ‘toxic trilogy’, the sales channel should offer a managed cloud-native application protection platform (Managed CNAPP). This is a unified platform that combines CSPM, cloud workload protection and other functions to identify the ‘toxic trilogy’ by correlating network exposure data, vulnerabilities and IAM permissions – something point tools cannot do. Investment in CNAPP expertise is in line with the direction of the market, as confirmed by Forrester, which identifies a trend towards security platform convergence.

    In the face of IAM governance failures, partners can provide IAM health auditing and Zero Trust consulting services. This includes auditing IAM roles for excessive privileges, implementing the principle of least privilege and introducing modern access control mechanisms such as just-in-time (JIT) access, recommended by Tenable. This service directly addresses the ‘excessive privileges’ problem and the ‘malware-free’ attack vector identified by CrowdStrike.

    Finally, to address the lack of data visibility, the channel can offer data security posture management (DSPM as a Service). This newer discipline focuses on discovering, classifying and tracking sensitive data across the cloud ecosystem, including ‘shadow data’. It is a direct response to a key finding of the Tenable report (97% of exposed data is sensitive) and the high cost of breaches involving ‘shadow data’ identified by IBM.

    The most successful sales channel partners will not offer these services as separate products. Instead, they will combine them into a holistic ‘Cloud Resilience’ offering. Such a move allows them to move up the value chain – from tactical tool reseller to strategic security partner. The problems identified are inextricably linked. Selling separate point solutions reproduces the problem the customer is trying to solve – as the Palo Alto Networks report points out, 91% of organisations say point tools create visibility blind spots. Gartner also highlights the paradigm shift from prevention to cyber resilience, which requires an integrated rather than fragmented approach.

    From reactive patching to proactive partnerships

    In summary, the picture emerging from the analysis of leading industry reports is clear: data exposure in the cloud is widespread, costly and rapidly growing. Most importantly, it is not a technology problem, but a problem of processes, competence and operational hygiene. Its origins lie in avoidable configuration errors, careless management of secrets and negligence in the area of identity and access.

    The current state of affairs is a direct result of a skills gap that the IT sales channel is uniquely positioned to fill. This is a historic opportunity to evolve from the role of technology reseller to that of indispensable security partner, delivering proactive, continuous risk management rather than, as the Tenable expert put it, ‘reactive patching’.

    The choice facing sales channel partners is clear. They can continue to sell point solutions to their customers’ increasingly fragmented and inefficient technology stacks. Alternatively, they can use this moment to build a strategic, high-margin managed services practice around cloud security. The ultimate message, then, is a call to action: the data and tools are available. The opportunity is here and now. Seizing it will not only generate significant revenue, but also fundamentally strengthen customer resilience in an era of digital transformation.