Tag: Open Source

  • The economics of open source: who pays for the code the world runs on?

    Every day, as we reach for our smartphone, launch our favourite TV series or send a business email, we participate in the quiet miracle of modern technology. Beneath the shiny surface of apps and services lies an invisible foundation – open source software.

    It is millions of lines of code, written, refined and shared with the world for free by a global community. This code is the bloodstream of the internet and the backbone of the AI revolution.

    But this digital world, raised on the idea of freedom and collaboration, conceals a profound paradox. The global economy relies on an infrastructure created largely by volunteers, often balancing on the brink of professional burnout.

    It is as if global trade routes were based on bridges built as a hobby after hours. How long can such a structure last? Who actually pays for the code we all rely on?

    The invisible foundation: our global dependence

    Open source software is no longer an alternative. It has become the default building block of the digital world. Hard data paints a picture of almost total dependence. An analysis by Synopsys in 2024 showed that as much as 96% of the commercial code bases examined contained open source components.

    What’s more, on average, 77% of all code in these applications came from open source. It’s no longer a question of using individual libraries – it’s about building entire systems on a foundation created by the community.

    The scale of this dependency becomes even more striking when looking at the dynamics of consumption. In 2024, it was forecast that the total number of downloads of open source packages would reach the unimaginable figure of 6.6 trillion.

    The npm (JavaScript) ecosystem alone was responsible for 4.5 trillion requests, recording 70% year-on-year growth, while the AI-powered Python ecosystem (PyPI) grew by 87% to reach 530 billion downloads.

    The average commercial application today is a complex mosaic of an average of 526 different open source components. Each has its own life cycle, its own maintainers and its own potential problems.

    Cracks in the foundation: zombie code and a wake-up call called Log4j

    The ubiquity of open source is a double-edged sword. The same ease with which developers can incorporate off-the-shelf components into their projects leads to systemic neglect. The data is alarming: as many as 91% of the commercial code bases surveyed contain components that are ten or more versions out of date.

    This problem leads to so-called ‘zombie code’ – components that have had no development activity for more than two years. This phenomenon affects almost half (49%) of the applications on the market.

    This means that companies are building their critical systems on abandoned projects, without active support and, most importantly, without security patches. The consequence is a ticking time bomb: in just one year, the percentage of code bases containing high-risk security vulnerabilities has increased from 48% to 74%.

    Nothing illustrates this risk better than the December 2021 incident, when the world learned of the Log4j vulnerability. This small, free Java library for logging turned out to be embedded in millions of applications around the world.

    The vulnerability, named Log4Shell, received the maximum criticality rating of 10/10. An attacker could take full control of a server by sending a simple string of characters. US CISA director Jen Easterly called it one of the most serious vulnerabilities she had seen in her entire career.

    The Log4j incident became a global wake-up call, making companies brutally aware of how much their security depends on the work of anonymous volunteers.

    Worse still, even three years after the discovery of Log4Shell, up to 13% of all Log4j library downloads are still vulnerable versions. This demonstrates the profound inertia of organisations that fail to update their dependencies even in the face of a well-known, critical threat.

    The human cost of ‘free’ software: the maintainer’s burden

    There are people behind every line of code. A model that treats their work as a free resource generates a huge human cost. Salvatore Sanfilippo, the creator of the Redis database, described this phenomenon as the ‘flooding effect’.

    Over time, the stream of emails, GitHub issues and questions turns into a never-ending flood that leads to guilt over not being able to help everyone.

    The scale of this pressure is illustrated by the example of Jeff Geerling, who looks after more than 200 projects. Each day he receives between 50 and 100 notifications, of which he is only able to deal with a fraction.

    Nolan Lawson, another well-known maintainer, aptly captured the emotional weight of this work: notifications on GitHub are “a constant stream of negativity”. No one opens an issue to praise working code; people only write when something is wrong.

    This chronic pressure leads to burnout, which, in the context of open source, has clearly defined causes: demanding users, low-quality contributions, lack of time and, most acutely, lack of remuneration.

    Knowing that work that consumes huge amounts of energy is the foundation for commercial products that make real profits for others is extremely demotivating. As one maintainer put it:

    “My software is free, but my time and attention are not.” Maintainer burnout is not just a personal tragedy. It is a critical risk to the global infrastructure.

    ‘Zombie code’ is a direct, measurable symptom of this crisis at the human level.

    The New Economy of Code: Towards a Sustainable Future

    In the face of these risks, the open source ecosystem is slowly maturing, moving from a volunteer-based model to more sustainable forms of funding.

    1. Corporate patrons: strategy, not altruism

    At the forefront of this transformation are the technology giants. Companies such as Google, Microsoft and Red Hat have been the biggest contributors to the open source world for years. Their motivations, however, are not altruistic – they are cold, strategic calculations.

    Joint development of fundamental components (such as operating systems or containerisation) is simply more efficient. This allows them to compete at a higher level, in areas that directly differentiate their products.

    By becoming involved in key projects, corporations can also influence their direction, ensuring alignment with their own strategy.

    2. The power of institutions: the role of foundations

    The second pillar is non-profit foundations such as the Linux Foundation and the Apache Software Foundation. They act as neutral trustees for the most important projects, ensuring their stability and independence from a single corporation.

    They collect contributions from sponsors, creating a budget that allows them to fund key developers and security audits.

    3. The maker revolution: the GitHub Sponsors model

    Alongside the big players, a new grassroots funding wave has been born. Platforms such as GitHub Sponsors allow direct, recurring contributions from users and companies, creating a revenue stream for maintainers.

    The story of Caleb Porzio, creator of Livewire and AlpineJS tools, is a prime example of the potential of this model. Standing on the brink of burnout, he decided to try his hand at the GitHub Sponsors programme.

    The real breakthrough came when he changed the paradigm: instead of asking for support, he decided to offer his sponsors additional, exclusive value. His secret turned out to be paid screencasts – a series of video tutorials.

    He reserved access to the full library exclusively for backers on GitHub. The effect was spectacular. His annual revenue grew by $80,000 in 90 days and crossed the $1 million threshold in the following years.

    This is a key lesson: a sustainable model does not have to be based on charity, but on building a viable business model around a free, open core.

    From free rider to stakeholder

    ‘Free’ software has never been free. Its price, hitherto hidden, has been paid with the time, energy and mental health of a global army of volunteers. The model in which we treated their work as an inexhaustible resource is coming to an end.

    It is time for every participant in this ecosystem to undergo a transformation – from a passive ‘free rider’ to an active stakeholder.

    This requires specific actions. Developers need to practice ‘software hygiene’ – regularly updating dependencies and consciously managing technical debt.
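    This ‘hygiene’ habit can be sketched in a few lines of code. The sketch below is purely illustrative: the package names and the `LATEST` map are made-up stand-ins for data that would in practice come from a registry such as PyPI or npm.

    ```python
    # Minimal sketch of a dependency "hygiene check": compare pinned versions
    # against the latest known releases and flag anything that has fallen behind.
    # The LATEST map is a hypothetical stand-in for live registry data.

    LATEST = {"requests": (2, 32, 3), "flask": (3, 0, 2), "django": (4, 2, 11)}

    def parse_pin(line: str) -> tuple[str, tuple[int, ...]]:
        """Parse a 'name==x.y.z' requirement pin into (name, version tuple)."""
        name, _, version = line.strip().partition("==")
        return name, tuple(int(p) for p in version.split("."))

    def outdated(requirements: list[str]) -> list[str]:
        """Return the names of pinned packages that are behind the latest release."""
        stale = []
        for line in requirements:
            name, pinned = parse_pin(line)
            latest = LATEST.get(name)
            if latest is not None and pinned < latest:
                stale.append(name)
        return stale

    pins = ["requests==2.25.0", "flask==3.0.2", "django==3.2.1"]
    print(outdated(pins))  # requests and django have newer releases available
    ```

    Running a check like this on every build, rather than once a year, is what keeps technical debt from silently compounding.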

    Companies need to treat open source as a critical part of the supply chain, creating software bills of materials (SBOMs) and investing in business-critical projects. Investing in open source is not a cost – it is business continuity insurance.

    We stand at the threshold of a new era for open source – an era of professionalisation and sustainability. A future where creators are fairly remunerated and the global digital infrastructure is secure is within our reach. Building it, however, requires a conscious effort from each of us.

  • Technology debt is on the rise. Why is 278 days of delay a risk to business?

    Today’s software development dynamics resemble a race in which the event horizon moves faster than the navigation systems can process it. In a culture focused on instant market gratification, the term time-to-market has become one of the main markers of success. However, beneath the shiny façade of innovation, in the foundations of digital ecosystems, there is a growing phenomenon that, in financial terms, could be described as a toxic variable-rate loan. The latest data from Datadog’s ‘State of DevSecOps’ report casts a harsh light on this reality: not only is the tech industry failing to close the security gap, it is actually allowing it to expand freely.

    The illusion of speed in the digital arms race

    A common cognitive error in strategic management is to equate the speed of implementation of new functionality with the overall agility of the organisation. Meanwhile, modern software is rarely a work of authorship in the full sense of the word. Rather, it is an intricate construction erected from prefabricated components – libraries, modules and external services. This modularity, while providing unprecedented speed of work, introduces elements into the company’s bloodstream over which control is often illusory.

    Today, almost nine out of ten companies operate in a production environment that has at least one known and actively exploited security vulnerability. This is a statistic that should be a cause for concern not only in technical departments, but especially in boardrooms. For it means that the majority of the digital assets of a modern business are operating in a state of permanent exposure to risk, which is not a fault of the system, but a structural feature of it.

    A new unit of risk measurement: the anatomy of 278 days

    A key indicator of the health of digital infrastructure has become the ‘backlog’ of dependencies, which has extended to an alarming 278 days in the last year. That’s almost ten months during which an organisation is using solutions with known flaws, while their safer alternatives are already available on the market. The increase in this delay by more than two months in just one year is indicative of the progressive inefficiency of upgrade processes.

    From a business perspective, these 278 days are when technology debt becomes a real burden on the balance sheet. Every out-of-date library is an ‘open door’ through which an uninvited visitor can pass at any time. Such a long delay in systems maintenance is a form of gambling in which the operational continuity of the company is at stake.
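    As a rough illustration of how a backlog figure like this can be measured, the sketch below computes per-dependency ‘lag’ as the number of days between the release date of the version in use and that of the newest available release. All package names and dates are hypothetical.

    ```python
    # Illustrative sketch of measuring "dependency lag": for each component, the
    # days between the release of the version in use and the newest release.
    from datetime import date

    # Hypothetical data: (release date of version in use, release date of latest)
    deps = {
        "lib-a": (date(2023, 1, 10), date(2024, 2, 1)),
        "lib-b": (date(2023, 6, 5), date(2023, 6, 5)),   # up to date
        "lib-c": (date(2022, 11, 20), date(2024, 1, 15)),
    }

    def lag_days(used: date, latest: date) -> int:
        """Days the in-use version lags behind the newest release."""
        return (latest - used).days

    lags = {name: lag_days(u, l) for name, (u, l) in deps.items()}
    average_lag = sum(lags.values()) / len(lags)
    print(lags, round(average_lag))
    ```

    Averaged across a real portfolio of hundreds of components, a figure of this kind is exactly the 278-day metric the report describes.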

    The trap of ‘free’ components and trust architecture

    The open source model and ready-made components shared on platforms such as GitHub have revolutionised programming efficiency. They allow small teams to build systems at a scale that a decade ago required armies of engineers. However, what is free in the licensing sense is rarely free in the accountability sense. Half of today’s enterprises deploy new versions of external libraries almost as soon as they are published, often without in-depth analysis of the code changes.

    This approach sets a dangerous precedent. CI/CD pipelines, the digital arteries through which code flows from the developer to the customer, are becoming a critical hotspot. The lack of rigorous control over the versioning of external components means that changes made by third parties, not necessarily with pure intentions, can seep into the organisation. In this way, the software supply chain ceases to be a secure tunnel and becomes an exposed trade route.

    The transparency paradox and the role of artificial intelligence

    Contrary to popular belief, the main obstacle to building secure systems is not the speed of development per se, but the lack of clarity in the maze of technological interconnections. Cloud environments have reached a level of complexity that is beyond the perceptual capabilities of a single individual or even entire expert teams. Herein lies the tension between the need for automation and the need to maintain critical judgement.

    The phenomenon of alert overload, where security systems generate thousands of ‘critical’ alerts, has led to a kind of decision-making desensitisation. When everything is on fire, the focus is on extinguishing the nearest flames, not necessarily the most dangerous ones. The data shows that only a small fraction of theoretical vulnerabilities have a real bearing on the ability to take control of a production service. The key, therefore, becomes analytics backed by artificial intelligence that can sift the signal from the noise, pinpointing those few truly significant risks. This shift from quantitative to qualitative security management is currently the biggest challenge for technology leaders.

    Exit strategy

    A modern security strategy must evolve towards processes that are an immanent part of value creation and not just a cumbersome add-on at the end of the production cycle. This requires a redefinition of the concept of software quality. A product that is functional but based on outdated foundations should be considered defective in today’s market reality.

    A key element of this transformation is the implementation of a strict component inventory, known as the Software Bill of Materials (SBOM). Knowing exactly what the company’s technology stack consists of allows for a rapid response in moments of crisis. Furthermore, it becomes essential to prioritise so-called contextual security. Instead of blindly following the recommendations of tool vendors, organisations must learn to assess risks through the prism of their own architecture and business specifics.

  • Sovereignty or a bottomless pit? Government, Microsoft and billions for licences

    In February 2026, the Polish debate on digitalisation entered a new, hot phase. The Central Informatics Centre (COI) announced an ambitious plan: to build a national office suite for public administration to replace the Redmond giant’s solutions. This move is part of the broader European trend of ‘digital sovereignty’, but the Polish version raises as much hope as justified scepticism. For business and IT leaders, it is a signal that the hitherto “licence first” model is no longer the only path to state development. But is Poland ready for its own ‘Office’, or are we in for another billion-dollar project of dubious quality?

    The dictate of one supplier: Diagnosing the monopoly

    The starting point for the COI initiative was the Instrat Foundation report ‘Seemingly open procurement’, published at the end of 2025. The data are merciless: as much as 99% of the analysed public procurement of office software in Poland directly or indirectly favours Microsoft products. In one in five tenders, competitors are excluded outright, and in the rest, through specific technical requirements that only one ecosystem meets.

    The vendor lock-in phenomenon has ceased to be a theoretical academic problem and has become a real threat to government budgets. As licence costs rise, the administration has nowhere to run, because the entire infrastructure, from mail to advanced spreadsheets, is based on closed standards. The head of the COI, Radosław Maćkiewicz, makes it clear: Poland spends too much money on Microsoft software, and these funds could support an indigenous IT ecosystem.

    The other side of the coin: the spectre of ‘billion-zloty systems’

    When the administration talks about ‘building its own solutions’, a red light goes on in the private sector. The history of Polish public IT is full of projects whose costs ran into billions and whose quality left much to be desired. A symbol of these concerns is the ZUS Comprehensive Information System (KSI ZUS). According to figures from recent years, the maintenance and development of this system over a six-year cycle (2015-2020) cost the state nearly PLN 2.8 billion. What is more, current contracts for the maintenance of KSI ZUS alone amount to hundreds of millions of zlotys (e.g. Asseco’s offer for nearly 350 million zlotys).

    Critics rightly ask: should a state that struggles to manage such behemoths efficiently embark on building an ecosystem from scratch to compete with Microsoft 365, which has been fine-tuned over decades? Building a modern office suite involves far more than a word processor: it means hundreds of thousands of hours of developer work, security testing and cloud integration. There is a real risk that the ‘national alternative’ will become another bottomless pit, with billions of public money disappearing into it and the end product lagging behind market standards.

    At the same time, it is worth bearing in mind expert opinion on the quality of software built by the COI, as seen with the mCitizen app. Although the application is popular, reports by the Defence CSIRT have pointed to security gaps, such as ‘dead code’ or vulnerabilities in the library supply chain. Scaling these problems up to a system on which every official in the country will depend for their work raises understandable fears.

    Lesson from The Hague: Why sovereignty is not just about austerity

    Arguments about costs are only part of the equation, however. The geopolitical turning point came in May 2025, when the Chief Prosecutor of the International Criminal Court (ICC) in The Hague reportedly lost access to his Microsoft-provided email account. The official reason was said to be the sanctions imposed by the Donald Trump administration.

    Regardless of the corporation’s subsequent denials, the incident horrified European decision-makers. It became clear that relying on the US SaaS model was not only a matter of convenience, but also an exposure to US extraterritorial law (e.g. CLOUD Act) and the political decisions of a foreign power. It is this fear that is driving migrations today in the German state of Schleswig-Holstein (60,000 posts switching to Linux and LibreOffice) or the Danish Ministry of Digitalisation. In these cases, sovereignty over data is valued more highly than the polished interface of a US-based giant.

    Transparency of savings: The myth of ‘free’ Open Source

    The COI initiative is to be based on open (open source) solutions. This is key, because it lowers the barrier to entry – we don’t have to write everything from scratch, we can draw on projects such as LibreOffice, Collabora Online or Nextcloud. However, in the IT business, ‘open’ rarely means ‘cheap in the short term’.

    Transparency of savings requires an honest look at TCO (total cost of ownership). While the cost of Microsoft’s licences (around PLN 38–49 million per year for ZUS alone in 2024/25) is easily measurable, the hidden costs of migration are huge. Germany’s Schleswig-Holstein estimates that it will save €15 million per year on licences, but at the same time will invest €9 million in 2026 alone in the transformation and training process.

    The real cost of the national office suite will be:

    1. Training and adaptation: officials accustomed to Outlook can lose productivity drastically in their first year with the new tool.

    2. Maintenance and SLAs: open source requires strong local support teams. Instead of paying Microsoft, we will pay Polish companies (e.g. in a PPP model), which supports the economy but does not necessarily mean a drastic decrease in budget expenditure.

    3. Compatibility: millions of historical .docx documents and advanced .xlsx worksheets must work flawlessly. The cost of fixing formatting errors can run into the millions.

    European trend: Lyon and Schleswig-Holstein lead the way

    Poland is not alone. Lyon, France’s third-largest city, is already deploying OnlyOffice and Linux on 10,000 workstations, arguing not only for sovereignty but also for ecology – open software allows older hardware to be used for longer. Meanwhile, the German openDesk project, developed by ZenDiS under the aegis of the German Interior Ministry, is becoming a ready-made standard for the whole of Europe.

    This is where the opportunity for COI lies: not to reinvent the wheel, but to become a Polish integrator of European sovereign solutions. Using Collabora Online in conjunction with the Polish government cloud would avoid the fate of ZUS’s KSI, while guaranteeing that citizens’ data never leaves the country.

    Value for business

    The COI initiative should be read as a call for diversification. A complete abandonment of Microsoft in commercial enterprises is unlikely today, but building ‘hybrid resilience’ is slowly becoming a necessity.

    This is a huge opportunity for the Polish IT sector. The shift from a model of selling licences (where most of the margin goes to the US) to a model of high-margin implementation and maintenance services around open source can be a ‘flywheel’ for domestic integrators. However, business needs to keep a close eye on the state. If the ‘national package’ is locked within the walls of a single institution, it will become an expensive monument. If, on the other hand, it is built on transparent public-private partnerships and open standards, it can become the foundation of a modern state that invests in its own intellect instead of paying a ‘digital tax’ to giants.

    The key to success will not be the technology – because LibreOffice or OnlyOffice are already ready – but the transparency of how these ‘saved’ millions are spent. True sovereignty is the ability to freely choose technology, not being forced to use software just because it is ‘national’. Poland must prove that it can build systems that are not only expensive, but above all effective.

  • The Open Source Paradox – How innovation became the biggest threat to business

    Open source software (OSS) is the silent, invisible engine driving today’s digital economy. From banking systems to medical applications to the dashboard in your car, its code is everywhere. The scale of this dependency is difficult to overstate. Synopsys’ Open Source Security and Risk Analysis (OSSRA) report for 2024 showed that 96% of commercial applications contain open source components, and that on average 77% of all code in these applications comes from open source. In Poland, according to data from the end of 2023, open source software was identified on more than 670,000 websites, ranking our country 13th in the world in terms of its use.

    This ubiquity has given rise to a fundamental paradox. On the one hand, OSS is an unparalleled accelerator of innovation, allowing companies to build and deploy advanced technologies faster than ever. On the other hand, it has become the largest, often ignored, attack surface for modern businesses. This dichotomy is brutally exposed by the data: in 2023, as many as 74% of the code bases examined contained high-risk security vulnerabilities. This is a dramatic increase of 54% from the previous year, when the percentage was 48%.

    The problem not only exists, but is dynamically worsening, spiralling out of control. Organisations are consuming open source software at a rate that far exceeds their ability to manage the associated risks, leading to a tacit acceptance of huge and growing levels of risk.

    Hidden dangers in your code

    Understanding today’s OSS threats requires deconstructing the problem into several key, often interrelated categories.

    1. ‘Zombie code’ and security debt

    One of the most fundamental risks is the reliance on old, out-of-date components, often abandoned by their authors. The 2024 Synopsys report bluntly describes this condition as a ‘zombie code apocalypse’. The scale of the problem is alarming:

    • 91% of the code bases analysed contained components that were outdated by 10 or more versions.
    • 49% of code bases contained components where no development activity had been recorded for more than two years.

    These omissions create a so-called ‘security debt’ – a concept popularised in Veracode’s ‘State of Software Security 2024’ report. Each unpatched vulnerability is like a loan taken out, which builds up over time, generating ‘interest’ in the form of increasing risk. To make matters worse, it takes 50% more time to fix vulnerabilities in third-party code (mainly OSS) than in proprietary code, showing how difficult it is to manage this type of debt.

    2. The trap of transitive dependencies

    The problem of code obsolescence is compounded by the existence of transitive (indirect) dependencies. These are libraries that the project does not use directly, but are required by direct dependencies. They represent ‘hidden’ code that finds its way into the application, often without the knowledge and verification of the development teams.

    It is in this hidden layer that the greatest danger lurks. Transitive dependencies can make up as much as 80–90% of all application code, and it is estimated that as much as 95% of all vulnerabilities in open source software come from these indirect dependencies. A textbook example is the Log4Shell vulnerability in the Log4j library. Many organisations had no idea they were using it at all until the global security crisis hit. Even three years after it was discovered, 13% of all Log4j library downloads are still vulnerable versions.
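    The mechanics of this hidden layer are easy to demonstrate: given only each package’s direct dependencies, a short graph walk reveals everything that actually ships with the application. The dependency graph below is invented purely for illustration.

    ```python
    # Sketch: compute the full transitive closure of an application's dependency
    # graph. The package names and edges are hypothetical.
    DIRECT = {
        "my-app": ["web-framework", "logging-lib"],
        "web-framework": ["http-client", "template-engine"],
        "http-client": ["tls-lib"],
        "template-engine": [],
        "logging-lib": ["tls-lib"],
        "tls-lib": [],
    }

    def transitive_closure(root: str, graph: dict[str, list[str]]) -> set[str]:
        """Walk the graph and collect every dependency reachable from `root`."""
        seen: set[str] = set()
        stack = list(graph.get(root, []))
        while stack:
            pkg = stack.pop()
            if pkg not in seen:
                seen.add(pkg)
                stack.extend(graph.get(pkg, []))
        return seen

    all_deps = transitive_closure("my-app", DIRECT)
    direct = set(DIRECT["my-app"])
    print(sorted(all_deps - direct))  # dependencies pulled in indirectly
    ```

    Even in this toy graph, more packages arrive indirectly than were declared directly – which is precisely how a library like Log4j ends up in systems whose owners never chose it.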

    3. Active sabotage: attacks on the supply chain

    In recent years, the threat landscape has undergone a fundamental transformation – from passive vulnerabilities to actively hostile actions. Attackers have realised that compromising one popular component allows thousands of organisations to be hit simultaneously. The figures from the Sonatype report for 2024 are alarming: there has been a 156% year-on-year increase in the number of malicious packages. These attacks take a variety of forms:

    • Typosquatting: Publishing a malicious package with a name confusingly similar to a popular one (e.g. crossenv instead of cross-env), hoping that a developer will install it by mistake.
    • Dependency confusion: Publishing in a public repository a malicious package with a name identical to a company’s internal private package, but with a higher version number. The build system can ‘confuse’ the sources and download the public version, as happened in the attack on the PyTorch framework, among others.
    • Account takeover and code injection: The most sophisticated method, of which the incident with the event-stream package was a frightening example. The attacker first gained the author’s trust and, after taking over maintainer privileges, added a malicious transitive dependency (flatmap-stream) to the project with the aim of stealing private keys from bitcoin wallets.
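    A simple first line of defence against the typosquatting pattern is to compare any new dependency name against a list of well-known packages using edit distance. The sketch below is a naive heuristic, not a production scanner, and the allowlist is illustrative.

    ```python
    # Sketch: flag dependency names suspiciously close to well-known packages –
    # a simple typosquatting heuristic based on Levenshtein edit distance.
    KNOWN = {"cross-env", "requests", "lodash"}

    def edit_distance(a: str, b: str) -> int:
        """Classic Levenshtein distance via dynamic programming."""
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
            prev = cur
        return prev[-1]

    def suspicious(name: str, known: set[str], max_dist: int = 2) -> str | None:
        """Return the known package `name` is confusingly close to, if any."""
        if name in known:
            return None  # exact match is the legitimate package
        for pkg in known:
            if edit_distance(name, pkg) <= max_dist:
                return pkg
        return None

    print(suspicious("crossenv", KNOWN))   # one character away from cross-env
    print(suspicious("numpy", KNOWN))      # no near match
    ```

    Registries and commercial scanners use far richer signals (download spikes, publisher age, install scripts), but the underlying idea is the same name-similarity check.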

    Why do we lose? Systemic failures in organisations

    The scale and persistence of these risks are largely the result of deep-seated cultural and process issues within the organisation.

    Firstly, there is a culture of complacency and lack of strategy. The Sonatype report speaks of ‘developer overconfidence’. As many as 80% of dependencies in applications go without updates for more than a year, even though a secure newer version is already available for 95% of them. This means that almost all of this risk is avoidable. This state of affairs is due to the lack of a formal framework – only 47% of IT companies have a defined strategy for using open source solutions.

    Secondly, we are seeing an immaturity of security processes and tools. The 2024 Snyk report reveals a worrying trend: despite growing threats, investment in defence mechanisms is declining. There has been an 11.3% decline in the adoption of security tools. Key technologies such as Software Composition Analysis (SCA) and Static Application Security Testing (SAST) are used by just over 60% of organisations, and container scanning by just 35%.

    Finally, the idea of ‘shift left’, i.e. moving security to the early stages of the development cycle, remains largely an illusion. Security tools are most often integrated into build systems (around 65%), but only 40% of organisations have implemented them where they are most effective – directly in the developer’s integrated development environment (IDE). This means that control gates are being moved, rather than the actual responsibility for security. The result? Overstretched security teams, half of whom admit that they are unable to meet their goals, and 52% of companies regularly fail to meet their own SLAs for fixing critical vulnerabilities.

    The road to cyber resilience: A 3-step defence model

    Mitigating such complex risks requires moving away from reactive measures to implementing a comprehensive, strategic management model.

    Step 1: Achieve full visibility with SBOM

    A fundamental principle of security is that ‘you cannot protect something you do not know exists’. In the context of software, the tool that provides this visibility is the Software Bill of Materials (SBOM). This is a formalised, machine-readable inventory of all the components – including transitive dependencies – that make up an application. Having an accurate SBOM is becoming a global standard, driven by regulations such as the US Executive Order 14028 and the EU Cyber Resilience Act (CRA), which make supply chain transparency a condition for market access.
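    To make the idea concrete, the sketch below assembles a minimal SBOM document in the spirit of the CycloneDX JSON format. The fields shown are a simplified subset of the real schema, and the component list is illustrative.

    ```python
    # Sketch: build a minimal, CycloneDX-style SBOM document from a component
    # list. The fields are a simplified subset of the real CycloneDX schema.
    import json

    components = [
        {"name": "log4j-core", "version": "2.14.1",
         "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.14.1"},
        {"name": "requests", "version": "2.32.3",
         "purl": "pkg:pypi/requests@2.32.3"},
    ]

    sbom = {
        "bomFormat": "CycloneDX",
        "specVersion": "1.5",
        "components": [{"type": "library", **c} for c in components],
    }

    print(json.dumps(sbom, indent=2))
    ```

    In practice the component list is generated automatically by build tooling rather than written by hand; the value lies in having a machine-readable artefact that can be diffed, audited and queried the moment the next Log4Shell lands.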

    Step 2: Automate protection with SCA

    Visibility alone is not enough. A list of thousands of components is just raw data. To turn it into useful knowledge, Software Composition Analysis (SCA) is needed. SCA is an automated process that analyses SBOM for risk: it scans components for known vulnerabilities (CVEs), verifies licence compliance and assesses the overall quality of dependencies. Modern SCA tools not only find problems, but also help prioritise them and often offer automated remediation suggestions.
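    At its core, an SCA pass is a join between the SBOM and a vulnerability database, followed by prioritisation. The sketch below illustrates that matching step with a toy in-memory database; real tools query live feeds such as the NVD or OSV.

    ```python
    # Sketch: the core of an SCA check – match SBOM components against a known
    # vulnerability database and rank findings by severity. Data illustrative,
    # except CVE-2021-44228, which is the real Log4Shell identifier.
    VULN_DB = {
        ("log4j-core", "2.14.1"): [("CVE-2021-44228", 10.0)],  # Log4Shell
        ("old-http-lib", "1.0.0"): [("CVE-2020-0001", 5.3)],   # hypothetical
    }

    def scan(sbom_components: list[dict]) -> list[tuple[str, str, float]]:
        """Return (component, CVE id, CVSS score) findings, most severe first."""
        findings = []
        for c in sbom_components:
            for cve, score in VULN_DB.get((c["name"], c["version"]), []):
                findings.append((c["name"], cve, score))
        return sorted(findings, key=lambda f: f[2], reverse=True)

    sbom = [
        {"name": "log4j-core", "version": "2.14.1"},
        {"name": "requests", "version": "2.32.3"},
    ]
    for name, cve, score in scan(sbom):
        print(f"{name}: {cve} (CVSS {score})")
    ```

    The sorting step is where the ‘signal from the noise’ battle is won: with thousands of findings, severity and exploitability context decide what gets fixed first.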

    Step 3: Build a DevSecOps culture

    Technology is essential, but it is not enough without a cultural change. DevSecOps is an evolution of the DevOps philosophy that integrates security as a shared responsibility at every stage of the software development lifecycle. Rather than treating security as an isolated stage at the end of the process, it is ‘embedded’ from the very beginning. In practice, this means integrating SCA tools directly into the developer’s IDE, automatic scanning in the CI/CD pipeline and continuous monitoring of the application in the production environment. This is a real ‘shift of responsibility to the left’ that gives developers the tools and knowledge to make safe decisions.
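    The automatic scanning step in a CI/CD pipeline typically ends in a policy gate that blocks the build when findings cross a severity threshold. A minimal sketch of such a gate, with an illustrative CVSS threshold:

    ```python
    # Sketch: a CI/CD "security gate" policy – fail the build when any finding
    # reaches the critical threshold. Threshold and findings are illustrative.
    CRITICAL_THRESHOLD = 9.0

    def gate(findings: list[tuple[str, float]],
             threshold: float = CRITICAL_THRESHOLD) -> bool:
        """Return True if the build may proceed (nothing at/above threshold)."""
        blockers = [(cve, score) for cve, score in findings if score >= threshold]
        for cve, score in blockers:
            print(f"BLOCKED by {cve} (CVSS {score})")
        return not blockers

    assert gate([("CVE-2020-0001", 5.3)]) is True        # low severity passes
    assert gate([("CVE-2021-44228", 10.0)]) is False     # a Log4Shell-class flaw blocks
    ```

    The threshold itself is the contextual-security decision discussed earlier: it should reflect the organisation’s own architecture and risk appetite, not a vendor default.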

    From risk to advantage

    The open source landscape is full of contradictions. We are witnessing an explosion of risk, driven by the ubiquity of OSS and systemic mismanagement. However, this alarming picture need not lead to paralysis. The path to cyber resilience is well defined and leads through the implementation of an integrated defence model based on visibility (SBOM), automated intelligence (SCA) and a culture of shared responsibility (DevSecOps). In the new reality defined by advanced threats and increasing regulatory requirements, proactive risk management in the software supply chain is no longer just a technical responsibility. It is becoming a key element of business strategy. Organisations that master this area will not only avoid disaster, but gain the ability to innovate faster and more securely, transforming the biggest source of risk into a sustainable competitive advantage.