Tag: Dell

  • Wojciech Janusz, Dell Technologies: 2026 is the time to settle the effects of AI, not to buy promises

    Artificial intelligence is ceasing to be merely a conversational tool and is becoming a technology that can actually do our work and complete business processes. We talk about how to invest wisely in AI infrastructure, cut costs and calculate the real return on deployments with Wojciech Janusz, EMEA Data Science & AI Horizontal Lead at Dell Technologies.

    Klaudia Ciesielska, Brandsit: Over the past year, the market has been wowed by Generative AI, and now Dell is starting to talk about Agentic AI – autonomous agents performing tasks. Yet many Polish companies are still at the stage of testing simple chatbots. Aren't you rushing ahead to the next technology too quickly? Why do you think this is the moment to invest in infrastructure for autonomous agents, when companies often have yet to see a return on their investment in simpler GenAI models?

    Wojciech Janusz, Dell Technologies: Agentic AI is not a new technology. Rather, it is a natural transition from chatbots to agents that can perform specific actions for us. While large language models allow us to unleash the knowledge we have in the company, the real revolution starts when we turn knowledge into skills and actions.

    I get the feeling that we're all a little over-saturated with chatbots. At the end of the day, we don't want the advice of a wise assistant who tells us what to do; we want someone who will do that thing for us – or at least relieve us of most of the task.

    “Agentic AI is not a new technology. Rather, it’s a natural transition from chatbots to agents that can perform specific actions for us.”

    We always consider the implementation of new technology in three aspects: people, processes and technology. Unfortunately, in the last two years we have too often focused on technology instead of the first two categories. AI agents are a way to bring it all together. It’s about integration with processes, it’s about human-machine collaboration and leveraging existing technology.

    To answer the question: this is a very good moment, because only when AI starts to perform specific tasks for us will we be able to measure the actual returns, calculate efficiency and better plan the next steps and deployments.

    K.C.: There is a lot of talk about Sovereign AI and local models, but where does the point of profitability lie? At what scale of operations does it realistically pay for a Polish company to withdraw data from a hyperscaler and invest in its own AI Factory? Is this a solution only for corporates or a viable financial alternative for the SME sector?

    W.J.: The break-even point lies much lower than we think. Few people realise how big a technological leap we have made in the last two years. That goes for both the hardware and the AI models themselves.

    Firstly, the AI market has split. ‘Open’ models have emerged, giving us the opportunity to download and run them on our hardware in a secure controlled environment, but also to further customise them to fit our use case even better.

    Simply downloading a model and running it won't do much if it doesn't meet expectations – and here too we have made a big leap. Open models are catching up with the best closed cloud models in terms of capability and correctness. Of course, a model 1,000 times smaller will not have the same capabilities as one in the cloud. But that is not its purpose: instead of universal models that speak all the languages of the world and solve every problem regardless of domain, we can use smaller, specialised models focused on specific activities. This gives us more flexibility and control over what is happening.

    Instead of a single 'universal genius', we choose a team of expert AIs working together in a controlled and efficient manner, summoning the necessary resources when required to solve a specific problem.

    Such models with a developed ability to reason and break down problems into smaller tasks form the basis of AI agents.

    Driven by high computing power and energy costs, models optimised to run on simpler hardware have also emerged. The biggest contributions here come from new architectures, such as Mixture of Experts (MoE), new training methods, including the use of Reinforcement Learning, and advanced ways of optimising the model itself.
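
    To make the Mixture of Experts idea concrete, here is a minimal, hypothetical sketch (plain NumPy, with illustrative class and parameter names that are our assumptions, not any vendor's design): a router scores a set of expert networks and only the top-k of them run for a given input, which is why an MoE model's compute per token is far below its total parameter count.

    ```python
    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    class TinyMoELayer:
        """Illustrative Mixture-of-Experts layer: only top_k of n_experts
        feed-forward blocks are evaluated for a given input."""

        def __init__(self, dim, n_experts=8, top_k=2, seed=0):
            rng = np.random.default_rng(seed)
            self.router = rng.normal(size=(dim, n_experts))      # gating weights
            self.experts = [rng.normal(size=(dim, dim)) * 0.02   # expert FFN weights
                            for _ in range(n_experts)]
            self.top_k = top_k

        def __call__(self, x):
            scores = softmax(x @ self.router)                # routing probabilities
            chosen = np.argsort(scores)[-self.top_k:]        # indices of top-k experts
            weights = scores[chosen] / scores[chosen].sum()  # renormalise over top-k
            # Only the chosen experts do any work:
            return sum(w * (x @ self.experts[i]) for i, w in zip(chosen, weights))

    layer = TinyMoELayer(dim=16)
    out = layer(np.random.default_rng(1).normal(size=16))
    print(out.shape)  # (16,) – produced by just 2 of the 8 experts
    ```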

    The final element is the development of hardware platforms. Here too, new developments are emerging. We have a whole new category of hardware designed to use AI rather than train it.

    It is estimated that the per-token cost of running a model is decreasing by a factor of ten each year, and this trend has held ever since the advent of GPT-3.5.

    Tasks that only 2-3 years ago required powerful servers are today easily performed on an AI PC; the Dell Pro Max with GB10, for example, lets you successfully work with models of up to 200 billion parameters.
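
    As a rough sanity check of that 200-billion-parameter figure – a hedged sketch, since the bit-widths and overhead factor below are assumptions for illustration, not device specifications – local inference at that scale generally relies on aggressive weight quantisation:

    ```python
    def model_memory_gb(params_billions: float, bits_per_weight: int,
                        overhead: float = 1.2) -> float:
        """Approximate inference footprint: weights plus ~20% assumed
        overhead for KV-cache and activations (workload-dependent)."""
        weight_bytes = params_billions * 1e9 * bits_per_weight / 8
        return weight_bytes * overhead / 1e9

    for bits in (16, 8, 4):
        print(f"200B parameters @ {bits}-bit: ~{model_memory_gb(200, bits):.0f} GB")
    # 16-bit: ~480 GB (server territory); 8-bit: ~240 GB;
    # 4-bit: ~120 GB – within reach of a large unified-memory desktop.
    ```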

    Of course, the appetite is growing and the list of tasks we want to do with AI is growing too, but it is becoming increasingly clear that the technology is no longer blocking us. The main question now is what we actually want to do with AI, not how to run it on our infrastructure.

    K.C.: Poland has some of the highest energy prices in Europe and AI servers are extremely power hungry. Does the implementation of efficient AI solutions in Polish conditions force companies to overhaul their server rooms and switch to liquid cooling? Is it not the case that the main barrier to AI adoption in our region will not be the price of the server itself, but precisely the cost of electricity and the need to upgrade the cooling infrastructure?

    W.J.: It will depend on the scale. Earlier we talked about changes in AI technology itself. We have new models, new uses, but also new architectures to enable AI to run on even modest resources.

    “On a large scale, there will be no escaping energy costs and changes to the Data Centre infrastructure, but I am optimistic.”

    My impression is that quite a few companies assume a massive cost of entry. Meanwhile, we can start AI projects with single applications. In fact, this is a very sensible and recommended approach: limit yourself to a few use cases, well grounded in the realities of the company, with a clear budget and projected profit and, most importantly, close together in terms of the technology and integration they need. This way we can start modestly, with single devices such as the aforementioned Dell Pro Max with GB10, without a huge revolution in our data centre. When these projects succeed, they become the basis for further steps while providing a solid foundation.

    Start small, think big, scale fast. This is the basis of our AI strategy.

    Of course, on a large scale there will be no escaping energy costs and changes to the Data Centre infrastructure, but I am optimistic. I think for most companies it will be a gradual evolution rather than a revolution requiring drastic changes.

    Investments can also yield very rewarding results. One new Dell PowerEdge server can replace up to seven older servers, which translates into a reduction in energy costs of 65-80 per cent. Dell customer Wirth Research has cut energy consumption in its HPC environments by up to 70 per cent at Verne Global's data centres thanks to liquid-cooled PowerEdge servers.

    K.C.: The great hardware replacement is underway, but does it make economic sense to buy PCs with NPUs (AI PCs) today when there are still few business applications that make real use of this chip? Aren’t companies today paying a ‘novelty tax’ for hardware whose potential will only be realised in 2-3 years, i.e. at the end of its life cycle?

    W.J.: We are seeing a lot of interest in AI PCs among business customers, with organisations looking to enhance the AI capabilities they use locally.

    Every computer we presented at CES 2026 is a computer with an AI processor and an NPU. This chip is not just for new applications yet to be developed – it is actively used during the user’s day-to-day work, providing benefits such as extended battery life – up to 27 hours of video streaming in the case of the XPS 14.

    K.C.: Finally, a request for an honest forecast. Looking to 2026 and your experience of working with companies: in which area will Polish business (regardless of industry) "overspend" – pour in too much money without a quick return – and which area will it drastically underestimate, in a way that may hurt companies' performance?

    W.J.: I think in 2026 companies will be preparing more thoroughly for AI projects. We no longer want to have AI for the sake of having an AI project. There will be more cost-effectiveness analysis and looking for those applications that actually bring real benefits. We will also focus on the efficiency of using AI, not just the cost of purchase.

    We have new metrics and tools to better choose the right approach to AI.

    “A model that achieves 80% on a test with 8 billion parameters is considered much more impressive (and efficient) than one that achieves 82% but requires 70 billion parameters.”

    Until recently, we have only focused on maximum quality and speed.

    Currently, we are increasingly looking for a reasonable compromise between quality and efficiency. An example is the Pareto-frontier methodology: instead of looking only at the top of the leaderboard, we look for models on the 'Pareto front', i.e. those that offer the best ratio of quality (e.g. MMLU score) to model size (number of parameters) or inference cost. A model that achieves 80% on a test with 8 billion parameters is considered much more impressive (and efficient) than one that achieves 82% but requires 70 billion parameters.
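
    A minimal sketch of that Pareto-front selection (the model names, sizes and scores below are made up for illustration): a model stays on the front if no other model is both no larger and no worse on quality, with a strict improvement on at least one axis.

    ```python
    # Hypothetical (name, parameters in billions, benchmark score) triples:
    models = [("small-8b", 8, 80.0), ("big-70b", 70, 82.0),
              ("mid-30b", 30, 79.0), ("tiny-3b", 3, 71.0)]

    def dominates(a, b):
        """a dominates b: no larger and no worse, strictly better somewhere."""
        return a[1] <= b[1] and a[2] >= b[2] and (a[1] < b[1] or a[2] > b[2])

    front = [m for m in models if not any(dominates(o, m) for o in models)]
    print([name for name, _, _ in front])
    # ['small-8b', 'big-70b', 'tiny-3b'] – mid-30b is dominated by small-8b,
    # which is both smaller and higher-scoring.
    ```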

    Another example is a metric showing the real cost of an AI decision or action: Tokens per Decision / Tokens per Action. A more efficient model will make an accurate decision using a few hundred reasoning tokens, while a weaker one may need several times as many. Choosing the former significantly reduces TCO and allows for a faster return on investment.

    A final but very effective way of showing which way we are heading is the Cost per Resolved Task (or Cost per Resolution) metric: how much it realistically costs us to perform a specific activity using AI or, more commonly, an AI Agent.
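
    A hedged sketch of how these metrics might be computed from usage logs (the record fields and token price are assumptions for illustration, not any vendor's billing scheme):

    ```python
    from dataclasses import dataclass

    @dataclass
    class TaskRun:
        tokens_used: int   # reasoning + output tokens for this attempt
        resolved: bool     # did the agent actually close the task?

    PRICE_PER_1K_TOKENS = 0.002  # assumed inference price, USD

    def tokens_per_decision(runs):
        """Average tokens spent per attempted task."""
        return sum(r.tokens_used for r in runs) / len(runs)

    def cost_per_resolved_task(runs):
        """Total spend divided by tasks actually resolved –
        failed attempts still burn tokens, which this captures."""
        total_cost = sum(r.tokens_used for r in runs) / 1000 * PRICE_PER_1K_TOKENS
        resolved = sum(r.resolved for r in runs)
        return total_cost / resolved if resolved else float("inf")

    runs = [TaskRun(800, True), TaskRun(2500, False), TaskRun(600, True)]
    print(f"{tokens_per_decision(runs):.0f} tokens/decision")       # 1300
    print(f"${cost_per_resolved_task(runs):.4f} per resolved task")  # $0.0039
    ```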

    In my opinion, 2026 will be the year of prudent AI project building, well-founded, grounded in reality and backed up by numbers.

  • Pragmatism over politics: Why are PC giants choosing Chinese DRAM?

    For years, US and Taiwanese PC manufacturers have kept their distance from Chinese critical components, driven by both policy and supply chain stability. However, a drastic shortage of DRAM chips and projected price increases are forcing a change in strategy. According to recent reports, market leaders HP, Dell, Acer and Asus have begun qualifying memory from China's ChangXin Memory Technologies (CXMT). This is a pragmatic, albeit risky, step that could redefine the balance of power in the semiconductor sector.

    For HP and Dell, the decision to test CXMT components is first and foremost an insurance policy. The global supply crisis, which is affecting almost every segment of electronics, from smartphones to data centre infrastructure, has left traditional suppliers unable to guarantee stable prices or delivery times. HP plans to monitor the situation until mid-2026. If tensions in the market do not subside, Chinese memory dies will find their way into devices destined for markets outside the US, avoiding direct regulatory friction with Washington.

    The move is significant in that CXMT is emerging as a viable alternative to the hegemony of Samsung, SK Hynix and Micron. Dell, concerned about the continuing upward trend in memory prices, is also looking to diversify its sources to protect its margins in the enterprise segment. Acer and Asus, on the other hand, are taking a more reactive stance, allowing the use of Chinese chips if their manufacturing partners deem it necessary to maintain assembly continuity.

    For the technology industry, this signals that operational pragmatism is beginning to prevail over geopolitical caution in the face of the economic crisis. The integration of Chinese DRAM chips into the products of global brands can not only alleviate shortages, but also permanently lower the barriers to entry for new players from the Middle Kingdom in Western markets. In a world where hardware availability determines quarterly results, stability of supply becomes more important than the origin of silicon.

  • Red Tuesday for Dell and HPE. Analysts prophesy the end of the IT buying eldorado

    The IT hardware sector, which had been riding the wave of artificial intelligence promises for the past few quarters, collided with hard market reality on Tuesday. Morgan Stanley analysts, in a note that immediately cooled sentiment on Wall Street, downgraded the entire industry to ‘cautious’. This is a clear signal to CFOs and investors that the period of easy gains is over, and 2026 could bring a painful review of sales plans.

    Investment bank experts warn of the formation of what they have termed a 'perfect storm', made up of three critical factors: a marked slowdown in demand, recurring component cost inflation and over-inflated valuations of technology companies. The market reaction was instantaneous. The shares of infrastructure giants such as Hewlett Packard Enterprise, Dell Technologies and NetApp dived by around 5 per cent, dragging the entire industry index down with them. Logitech also came under pressure, with its recommendation downgraded to 'underweight'.

    For business decision-makers, however, the most relevant information comes not from the stock prices themselves, but from the hard data underlying this discount. A recent Morgan Stanley survey indicates that corporate IT leaders plan to increase hardware budgets in 2026 by just a token 1 per cent, the weakest reading in 15 years, excluding the anomaly of the pandemic period. What’s more, surveys of sales intermediaries (VARs) suggest that between 30 and 60 per cent of business customers are prepared to drastically reduce planned purchases of servers, PCs and storage if manufacturers continue to pass on rising component costs.

    While investment in AI-dedicated infrastructure remains a bright spot on the spending map, it cannot fully offset broader macroeconomic concerns. Uncertainty is compounded by the Donald Trump administration’s tariff announcements and rising memory prices, as highlighted by Citigroup in its separate analysis. Analysts conclude that with such elastic demand and rigid production costs, the risk of a downward earnings revision for 2026 is now higher than ever. For companies, this means they need to revise their purchasing strategies and prepare for tougher negotiations with technology suppliers, who will struggle to maintain margins.

  • Dell updates PowerStore and reduces power consumption by 23 per cent.

    Faced with rising energy costs and a chronic shortage of space in server rooms, Dell Technologies is updating its key storage platform. The release of PowerStore 4.3 is not an architectural revolution, but a precision strike on enterprise operating costs (OpEx), offering a dramatic increase in storage density while reducing power draw.

    The most significant change from an infrastructure perspective is support for 30TB QLC drives. This move allows Dell engineers to pack up to 2 petabytes of data into an enclosure just 2U high. For IT managers, this effectively means doubling the capacity per rack unit, which directly translates into saving valuable data centre space. Dell estimates that the new configuration reduces power consumption by up to 23 per cent and the total cost of ownership (TCO) can drop by 15 per cent. In the current macroeconomic climate, where energy efficiency is becoming a key KPI, these are arguments that can tip the scales in B2B tenders.
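
    For orientation, a back-of-the-envelope sketch of where a figure like '2 PB in 2U' can come from; the drive count and data-reduction ratio below are illustrative assumptions, not Dell's published configuration:

    ```python
    DRIVE_TB = 30          # 30TB QLC drives, per the announcement
    DRIVES_PER_2U = 24     # assumed slot count for a 2U all-flash enclosure
    REDUCTION_RATIO = 3.0  # assumed dedupe/compression ratio

    raw_tb = DRIVE_TB * DRIVES_PER_2U
    effective_pb = raw_tb * REDUCTION_RATIO / 1000
    print(f"{raw_tb} TB raw -> ~{effective_pb:.1f} PB effective in 2U")
    # 720 TB raw -> ~2.2 PB effective, in line with the 'up to 2 PB' claim
    ```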

    The software layer has also seen significant improvements, with a focus on business continuity and security. The introduction of synchronous replication over Fibre Channel and the Metro Sync function with automatic failover aims to eliminate the risk of data loss at critical times. The system now also offers asynchronous replication over longer distances with a recovery point objective (RPO) of five minutes. Significantly, Dell is implementing artificial intelligence mechanisms to detect anomalies in the data environment and multi-factor administrative authentication, in response to growing ransomware and insider threats.

    For system administrators, new analytical tools will be key. The ‘Top Talkers’ function provides insight into the processes that are putting the most strain on the network, enabling more precise Quality of Service (QoS) policy management. In addition, full support for the NFSv4.2 standard improves management of large file volumes. The upgrade is available immediately and free of charge to existing customers, a clear signal that Dell intends to aggressively defend its user base against the competition.

  • The big comeback of the Dell XPS. The company is phasing out the Dell Premium brand at CES 2026

    Just one year after the controversial decision to retire the iconic XPS line in favour of the generic 'Dell Premium' naming, the US tech giant is making a strategic U-turn. At CES 2026, Jeff Clarke, the company's COO, showed a rare bit of corporate candour, admitting that last year's branding restructuring was a mistake.

    The attempt to simplify the portfolio by introducing the Dell, Dell Pro and Dell Pro Max naming scheme – clearly inspired by the strategy of competitors from the smartphone world – crashed into market reality. Business customers and enthusiasts proved too attached to the XPS brand, which over the years has become synonymous with performance in the Windows segment.

    The response to this lesson in humility is the launch of two new models, the XPS 14 and XPS 16. The hardware represents not only a return to the old nomenclature, but also a correction in design philosophy. Dell is phasing out the much-criticised touch-sensitive LED bar, bringing back a physical row of function keys. This is a clear signal that workplace ergonomics is once again taking precedence over futuristic aesthetics.

    Despite the robust construction, the engineers managed to keep the weight of the devices low: around 1.36 kg for the 14-inch model and 1.6 kg for the 16-inch model.

    The interior of the new laptops reflects the growing importance of local artificial intelligence processing. Based on Intel Core Ultra Series 3 chips and integrated Intel Arc graphics, the units are said to offer as much as a 78 per cent increase in performance on AI tasks compared to previous generations.

    This is a key selling point at a time when the market is expecting Copilot-ready devices and advanced language models directly on the device. This is rounded off by a 70 Wh battery and a generous set of ports, including Thunderbolt 4 and DisplayPort 2.1, which is not standard in this chassis thickness class.

    Dell is also emphasising the service aspect, in line with the ‘Right to Repair’ trend. The modular design of USB-C ports, interchangeable keyboards and the use of recycled materials are expected to extend the product lifecycle in corporate fleets.

    Prices in the US market start at $1,650 for the smaller model, suggesting that the legend’s return will involve increases in Europe too. Cheaper configurations will hit distribution in February, with a refreshed XPS 13 model expected later in the year.

  • From the Big Bang to the speed of light: the AI revolution is underway

    In 2023, we witnessed the Big Bang of technology – a year in which artificial intelligence ushered in a new era of innovation and transformation. In 2025, generative AI went mainstream, and agent-based AI took the stage. Most importantly, real returns on investment began to emerge for large companies such as Dell Technologies.

    In 2026, the story of artificial intelligence is accelerating. AI will redesign the entire structure of businesses and industries. It will drive new ways of doing things, building and innovating at a scale and pace previously unimaginable.

    Understanding these changes is essential, as those who invest today in a robust, flexible technology base and benefit from a network of partner ecosystems will be ready to manage the rapid changes to come.

    1. Time to act: principles governing a dynamic ecosystem

    With the acceleration of artificial intelligence comes a degree of volatility. While we anticipate that the governance framework will eventually stabilise the ecosystem, today’s reality is a call to action.

    Governance is currently causing the most delays; it is a critical problem on which no progress is being made. The industry has rushed to bring valuable artificial intelligence tools such as chatbots and agents into production, but it has done so without sufficient governance.

    This is not only risky, but unsustainable. By next year, robust frameworks and private environments will be needed to ensure stability and control. Running models locally, on one's own servers or in controlled AI factories, will become the norm, providing a stable foundation and insulating organisations from external disruption.

    But this is more than a forecast. It is an urgent appeal. We need to focus more on governance. Without this, we will end up with uncertainty that will slow down the implementation of practical and valuable artificial intelligence for businesses.

    Our concrete demand to the public and private sector is to create rules for the governance of the enterprise market in collaboration with the real players in this market – enterprises and business technology providers.

    We cannot assume that managing public AI or AGI chatbots is the same as helping businesses shape the actual application of artificial intelligence in their companies and processes.

    Governance is not about slowing down innovation. It is about building a protective framework that allows us all to accelerate in a safe and sustainable way.

    2. Data management: the real foundation of innovation in artificial intelligence

    The next big leap in artificial intelligence will not just come from more powerful algorithms. It will come from the way we manage, enrich and use our data. As artificial intelligence systems become more complex, the quality and availability of the data they use is paramount.

    In 2026, AI-based data management and storage will become the undisputed foundation of all AI innovations.

    AI infrastructure is different from classic IT systems. It focuses on accelerated computing, advanced networking adapted to AI, new user interfaces and, most importantly, a new layer of knowledge from data that drives its results.

    Purpose-built AI data platforms, designed to integrate disparate data sources, protect new artefacts and provide the efficient storage needed to support them, will become essential. Partner ecosystems can help unlock the potential of these purpose-built platforms, with partners using their expertise to integrate and optimise data management solutions for enterprise AI.

    The ability to effectively feed clean, structured and relevant data into artificial intelligence models is crucial. However, as we enter the era of agent-based AI, this data will no longer be used solely to train large models. Instead, it will be a dynamic resource during inference, enabling the generation of evolving knowledge and intelligence in real time. This foundational layer of data is the starting point for everything that comes next.

    3. Agent AI: the new business continuity manager

    What is coming is agent-based artificial intelligence: an evolution that transforms artificial intelligence from a helpful assistant into an integral manager of long-term, complex processes.

    In areas such as manufacturing and logistics, artificial intelligence agents will not just assist workers, they will assist in coordinating their activities. Using rich, dynamic data streams, they will ensure continuity between shifts, optimise real-time workflows and create new levels of operational efficiency.

    Imagine an artificial intelligence agent scaling the capabilities of process managers on the shop floor, adjusting production schedules based on supply chain disruptions or guiding a new employee through a complex task. By positioning AI agents as intermediaries between a team’s goals and its employees, we are elevating team coordination across all sectors to unprecedented levels.

    These intelligent agents will become the nervous system of modern operations, ensuring resilience and progress. Like any other AI capability, they rely on enterprise data to create a unique store of knowledge and intelligence that must be properly stored and protected.

    4. Artificial intelligence factories redefine resilience and disaster recovery

    The more AI integrates with a company's core functions, the more business continuity becomes non-negotiable.

    Artificial intelligence infrastructure will evolve to prioritise operational resilience, redefining the meaning of disaster recovery in an AI-driven world. The focus is not just on backing up systems, but on ensuring AI functionality, even if the underlying systems go offline. This includes protecting vectorised data and other unique artefacts, so that system intelligence can survive any disruption.

    Achieving this requires innovation across the AI value chain, from data protection and cyber security companies to key AI technology providers. Collaborative ecosystems include governments, partners and large-scale AI innovators. They must work together to build resilient factories that bring together the tools and expertise needed to ensure continuity and secure critical functions in hybrid cloud environments.

    5. Sovereign artificial intelligence accelerates development of national enterprise infrastructure

    Artificial intelligence is central to national interests, which is why we are seeing the rapid development of sovereign artificial intelligence ecosystems. Countries are no longer just consumers of AI technology, they are actively building their own frameworks to drive local innovation and maintain digital autonomy.

    This is changing the way artificial intelligence infrastructure is planned, with computing, data storage and management playing a key role in protecting and locating sensitive information.

    Businesses will increasingly adapt to this framework, scaling their operations within regional boundaries. By storing data locally, governments can shape public services such as healthcare, and businesses can use national infrastructure while aligning business objectives with national industrial policy.

    This creates innovations with a direct impact on citizens and economies, and represents a fundamental shift that moves artificial intelligence from a global concept to a concrete, local reality.

    Setting the course for 2026

    In 2026, the artificial intelligence revolution is not slowing down, but accelerating. What started with the Big Bang has reached the speed of light, and leading organisations are evolving and adapting to change just as fast.

    To succeed, you don’t need to chase every breakthrough. It’s better to build an infrastructure that can keep up with these changes: resilient AI factories, sovereign frameworks, agent systems that manage complex operations and collaborative ecosystems that turn innovation into real business impact. The tools and information are available. It is the readiness to act that already sets leaders apart from the rest.

    Leadership and concrete action will determine who reaps the real rewards. The future is rushing by at the speed of light. The question is: are we ready?

    By John Roese, Global Chief Technology Officer and Chief AI Officer at Dell Technologies

  • Pragmatism versus hype: How ‘agent washing’ and hallucinations brought AI down to earth

    The technology industry, after two years of fascination with Generative AI, is entering the 'prove it' stage. Enthusiasm is colliding with hard reality, and statistics indicating low levels of AI adoption in many economies are bringing us down to earth.

    This year’s Dell Technologies Forum in Warsaw was a good example of this. As Dariusz Piotrowski aptly summarised it, the key dogma nowadays is: ‘AI follows the data, not the other way around’. It is no longer the algorithms that are the bottleneck. The real challenge for business is access to clean, secure and well-structured data. The discussion has definitely moved from the lab to the operational back office.

    AI follows the data

    We have been living under the belief that the key to the revolution is an ever more perfect algorithm. That myth is now collapsing. Internal case studies at major technology companies show that implementing an internal AI tool is often not a problem of the model itself, but of months of painstaking work on… organising and providing access to distributed data.

    This raises an immediate consequence: computing power must move to where the data originates. Instead of sending terabytes of information to a central cloud, AI must start operating ‘at the edge’ (Edge AI).

    The most visible manifestation of this trend is the birth of the AI PC era. With dedicated processors (NPUs), PCs are expected to locally handle AI tasks. This is not a marketing gimmick, but a fundamental change in architecture. It’s all about security and privacy – key data no longer needs to leave the desk to be processed. Of course, this puzzle won’t work without hard foundations. Since data is so critical, the cyber security landscape is changing. The number one target of attack is no longer production systems, but backup. This is why the concepts of ‘digital bunkers’ (restore vaults) – guaranteeing access to ‘uncontaminated’ data – are becoming the absolute foundation of any serious AI strategy.

    Pragmatism versus “Agent Washing”

    In this red-hot market, how do you distinguish real value from marketing illusion? After the wave of fascination with ‘GenAI’, the new ‘holy grail’ of the industry is becoming ‘AI Agents’. However, we must beware of the phenomenon of “Agent Washing” – the packaging of old algorithms into a shiny new box with a trendy label.

    Business is beginning to understand that the chaotic ‘bottom-up’ approach leads nowhere. As Said Akar of Dell Technologies frankly admitted, the company initially put together ‘1,800 use cases’ of AI, which could have become a simple path to paralysis. Therefore, the strategy was changed to a hard ‘top-down’ approach: finding a real business problem, defining a measurable return on investment (ROI) and only then selecting tools.

    This leads directly to a global trend: a shift away from the pursuit of a single, giant overall model (AGI) to ‘Narrow AI’. This trend combines with the growing need for digital sovereignty. States and key sectors (such as finance or administration) cannot afford to be dependent on a few global providers. Hence the growing trend of investing in local models that allow for greater control.

    Hype versus hallucination

    When the dust settles, it turns out that the great technological race is no longer just about making models know more. It’s about making them… make up less often. The biggest technical and business problem remains hallucinations.

    The dominant and only viable business model is becoming ‘human-in-the-loop’, i.e. the human at the centre of the process. In regulated industries, no one in their right mind will allow a machine to ‘pull the lever’ on its own. As mBank’s Agnieszka Słomka-Gołębiowska aptly pointed out, financial institutions are in the ‘business of trust’ and the biggest risk of AI is ‘bias’, which cannot be fully controlled in the model itself.

    Artificial intelligence is set to become a powerful collaborator that takes over the 'thankless tasks'. But the final, strategic decision is up to humans. The real revolution is pragmatic, happens 'at the edge' and is meant to help with our work, not take it away.

  • The end of Shadow AI, the beginning of measurable ROI. Key findings from the Dell Technologies Forum 2025

    This year's Dell Technologies Forum once again attracted crowds to Warsaw – over 2,700 guests is a turnout that shows the scale of this recurring event. The keynote slogan “AI. Simply.” sounded like a provocation, but it soon became clear that the organisers did not mean to trivialise AI, but to signal a change. The forum loudly proclaimed that the time of experimentation and Shadow AI should be over. Now is the time for deployments that can be used effectively in business and whose real return on investment can be measured.

    Dell's prescription for this new phase is the AI Factory, a blueprint implementation plan for companies. Crucially, this plan is based on the thesis that the success of AI today is determined not only by the algorithm itself, but equally by having the right infrastructure, capable of processing data close to where it is created (Edge Computing).

    I decided to step away from the official agenda to see what was real behind the slogan “AI. Simply.”. The key question became why the transition from uncontrollable Shadow AI to quantifiable, secure projects is so difficult. The issue was explored at the Forum from a variety of perspectives: from strategic dilemmas in boardrooms, to the realities of cyber security on the frontline, to frank conversations about the technical pain points of AI.

    Global vision – “AI follows the data”

    The energy on the main stage could be shared by several smaller conferences. Dariusz Piotrowski, VP and Managing Director of Dell Technologies Poland, opening this year’s forum, did not hide his pride. “We have been preparing for this day all year. This is our holiday,” he said, welcoming the record-breaking audience of almost 3,000.

    [Photo: Dariusz Piotrowski at Dell Technologies Forum 2025, source: Dell Technologies]

    After this energetic introduction, the lights dimmed and Michael Dell appeared on the screen. His virtual speech was like a bird's-eye view: global, calm and drawn with a broad brush. The message was clear: the revolution has already happened. “Soon, more than 75 per cent of corporate data will be processed at the network edge,” he pointed out, adding that this is no longer a vision, but a fact: “Today, more than 3,000 customers are using Dell AI factories.”

    The discussion took a practical turn when Dariusz Piotrowski returned to the stage to relate the global vision to the local context. He decided to take a quick poll of the audience: “How many people in the room use AI solutions on a daily basis (…) and make decisions based on information from them?”. The significant number of raised hands served as a starting point for a key diagnosis. Although Piotrowski commented on the turnout in jest, he immediately moved on to the substance, pointing out that in most cases this widespread activity is not official company projects, but a phenomenon that can be called Shadow AI.

    And this is where we got to the point. “We used to say that technology changes the world. Today we say AI is changing the world,” he began. “Whereas the truth is that AI only changes what it has access to. And it has access to data.”

    Moments later, the key phrase that has become the mantra of the entire forum was uttered: “AI follows the data, not the other way around”.

    Piotrowski did not sweep the problems under the carpet. He bluntly cited data (McKinsey, IDC) showing that only 10 per cent of AI projects are realistically generating profits, and that many will fail. The diagnosis was brutal but honest: the problem is not a lack of talent or algorithms. The problem is chaos. “Success has come to those companies that have been able to organise their data and their processes,” he said.

    “How?” – a pragmatic action plan

    While the vision sets the strategic horizon, real success is determined by pragmatism and effective implementation. This key executive perspective was presented by Said Akar, Senior Vice President at Dell Technologies. His presentation answered the fundamental question of “how?” – how to turn ambitious ideas into a working, profitable process?

    However, before presenting his concept of the 'AI Factory', Said Akar began with the most convincing proof of its effectiveness: he used as his key case study of a successful AI transformation… Dell itself.

    The highlight of this analysis was the hard financial data. “Last year we added $10 billion in revenue. And we did that while reducing costs by 4%,” he announced. To drive the point home, Akar added a key clarification: this was a historic moment for the company, the first time it had been able to “decouple revenue growth from cost growth”. For decades, revenue growth inevitably meant operating cost growth too. Implementing AI, he argued, allowed the company to handle this additional volume of business while reducing costs. For a room full of executives, this was the strongest evidence yet that AI transformation is not a cost, but a quantifiable investment.

    What made this success possible is the Dell AI Factory. Akar described it as a 'blueprint' for organisations that don't know where to start.

    Interestingly, this plan does not start with the question “which server should we buy?”. It starts with a use case. Akar revealed that when Dell started its journey, it had 1,800 grassroots ideas for AI, which could potentially have led to 'total chaos'. So the approach was changed to a top-down one, with an ironclad rule: “We will not invest our time and effort if we cannot measure the return on investment (ROI).”

    He cited the internal 'Dell Sales Chat' tool as an example. The team implemented it in just four months. Sounds great? This is where the catch lies, as Piotrowski had already hinted. The biggest challenge was not the algorithm. “It took us, the data company, four months to put it all together,” Akar admitted, stressing that sorting out the data was the biggest challenge of the project.

    ‘AI factory’ is also about dispelling a few myths. Firstly, it’s a team game – Dell heavily emphasises its ecosystem of partners (Nvidia, Intel, Microsoft, Meta). Secondly, the public cloud is not the only answer. Akar cited data that 62% of respondents find it more cost-effective to run AI locally (on-prem). AI is ‘performance hungry’ and needs to be close to the data, which dovetails perfectly with the vision of Edge Computing from the first presentation.

    Finally, an important point was made for those who fear the cost. AI infrastructure does not have to mean building a supercomputer from scratch. Akar stressed the importance of 'right-sizing': “You can start small and begin to grow gradually.” A revolution, in other words, can be started with small steps.

    AI in Polish – debate on trust and competition

    The session that most firmly grounded the discussion in Polish realities was the panel 'AI. Simply. Practice and real-world transformation in Poland'. The room was packed, which was not surprising given the line-up of guests. In addition to moderator Robert Domagała (Dell) and Dariusz Piotrowski (Dell), the stage hosted Radosław Maćkiewicz (CEO of the Central Informatics Centre), Zbigniew Jagiełło (innovator and creator of Blik) and Sebastian Kondracki (Chief Innovation Officer of Deviniti and godfather of the SpeakLeash project).

    [Photo: Dell Technologies Forum 2025, source: Dell Technologies]

    Radosław Maćkiewicz talked about the gigantic capital at the disposal of the state – the trust of 11 million users of the mObywatel (mCitizen) application. On this foundation, the COI is now building a virtual assistant to help citizens. And here a key declaration was made: the heart of the system will be PLLuM, a Polish language model trained specifically for the administration. As Maćkiewicz vividly put it, it will not be a model that 'will give you the recipe for apple pie', but one that will precisely guide you through the maze of official matters.

    The initially quiet discussion was enlivened by Zbigniew Jagiełło, who asked the panellists a direct question about the point of having two separate, large Polish language models: “Why don’t you join forces? Is Poland so strong (…) that both PLLuM and Bielik can develop separately?”.

    The question sparked a discussion that clearly outlined the differences in approach to AI development in Poland. Sebastian Kondracki, representing the open-source Bielik, defended the value of competition, citing a concept coined by Marek Magryś of Cyfronet, an institution that supports both models – 'coopetition', a combination of cooperation and competition. “We cooperate, we exchange information, but a small element of rivalry won't hurt.” He stressed at the same time that Bielik's goal is not to race OpenAI, but “for Poland to be a specialist in specialised models or niche models”.

    In a similar vein was Radosław Maćkiewicz, who pointed out that the state-owned PLLuM is trained for precisely such specialised tasks – it is not supposed to “give the recipe for apple pie”, but to precisely support the citizen in official matters.

    Dariusz Piotrowski also spoke in the discussion, suggesting that perhaps Poland does not need a single ‘ChatGPT’, but precisely many specialised models, dedicated to specific industries, such as energy or medicine. The debate thus clearly showed that different visions are clashing on the Polish AI scene – from strategic state projects, to competitive open-source models, to the idea of distributed, industry-specific specialisation.

    AI on the front line and in the boardroom

    Security was another key topic raised at the forum, following an intense discussion of Polish AI models. This fundamental issue was discussed in two contexts: cyber security in the military dimension and risk management at the strategic level in companies.

    Of particular interest was the fireside chat featuring Major General Karol Molenda, Commander of the Cyber Defence Forces Component. Dariusz Piotrowski, who moderated the chat, noted that the general combines the military and business perspectives with great ease, which was confirmed in the presentation of the army’s innovative approach to cyber security. General Molenda detailed the ‘Cyber Legion’ project, an initiative to integrate civilian specialists with military experts. The success of this project is evidenced by the fact that it has so far attracted more than 2,500 applicants.

    [Photo: Dell Technologies Forum 2025, source: Dell Technologies]

    General Molenda pointed to a fundamental paradigm shift in army operations as a key element of modern cyber defence. He explained that the unit has moved away from the restrictive ‘need to know’ principle to an open ‘need to share’ strategy. He stressed that in the cyber security domain, the ability to share information quickly is critical to the effectiveness of operations.

    In this context, he discussed the role of artificial intelligence, which has evolved from a passive monitoring tool into an active support tool. The general described AI as a disruptive technology (a 'game changer') that enables the analysis of reconnaissance data on an unprecedented scale and supports decision-making processes. As an example, he cited the systems' ability to calculate the probability of success of an operation in real time, based on specific orders and changing conditions.

    Equally strategic was the discussion during the panel “AI in management: revolution or controlled experiment?”, held as part of the Executive Business Lounge. Leaders from business (Orange, mBank) and science (Cambridge) grappled with key challenges: How to manage innovation and risk when employees are already using unregulated tools en masse (so-called ‘Shadow AI’)?

    Dr Paul Calleja, director of the Cambridge Open Zettascale Lab, gave the perspective of the science and R&D world. He diagnosed that corporations are 'playing catch-up' in trying to regulate technology that is evolving too fast. He pointed to the phenomenon of “two IT lives”: the corporate one, locked down, and the private one, where employees freely use ChatGPT. In his view, blocking tools is ineffective; education becomes the key solution: 'We need to teach people to think critically and analytically', because, as he aptly pointed out, AI models are often trained to 'sound convincing' and not necessarily to tell the truth.

    A practical response to this demand was presented by Bożena Leśniewska (Orange Polska). She described how her company, instead of locking down tools, opted for ‘controlled democratisation’ – a structured process was created for bottom-up initiatives. The result was “250 different use cases that were translated into real business cases”. She emphasised that the foundation of this success was precisely the massive education, with more than 5,000 employees trained in Responsible AI (among other things).

    The perspective of the banking sector, focused on risk management, was brought by Agnieszka Słomka-Gołębiowska (mBank). She acknowledged that banks are moving from analytical uses (e.g. complaints handling) to predictive ones (proactive fraud detection). However, she pointed out that algorithmic bias remains a key challenge. Here a fundamental warning was given for the entire industry: “The biggest risk would be bias. (…) Banks are in the business of trust. This is what we provide to our customers.” As she stressed, trust is more valuable than any optimisation, and AI must not become a tool for manipulation.

    The whole strategic and highly valuable discussion was accurately summarised by Dell's Mohammed Amin, who gave managers a simple piece of advice: “Think big, start small”.

    From ‘Edge’ to the fight against hallucinations

    A technical complement to the strategic discussions was a presentation by Wojtek Janusz, Data Science and AI Lead at Dell Technologies. Acting as the company’s ‘translator’ between business and engineering, Janusz outlined the key technology challenges currently facing the industry.

    He confirmed the growing importance of the ‘AI on the Edge’ trend that Piotrowski and Akar spoke about earlier. This refers to the ability to run powerful AI models locally, on laptops, which Janusz admitted “was completely unworkable just two years ago” but is now becoming an everyday occurrence.

    [Photo: Dell Technologies Forum 2025, source: Dell Technologies]

    Janusz then directly addressed the biggest pain point of current models, calling hallucinations 'the current No. 1 problem of the entire industry'. He explained that all the giants of the large language model market are today focusing their efforts precisely on combating algorithmic confabulation. The key is a fundamental change in training philosophy: instead of rewarding a model for an answer that 'satisfies a human' (even if it is made up), the new paradigm punishes improvisation. “We are now changing that paradigm and saying: if you don't know, say you don't know,” explained Janusz.
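
    As a toy illustration of that scoring change – entirely hypothetical numbers, not any vendor's actual training objective – once wrong answers are penalised more heavily than abstentions, guessing only pays off above a confidence threshold:

    ```python
    def expected_score(p_correct: float, reward=1.0, wrong_penalty=-1.0) -> float:
        """Expected score of answering vs abstaining (score 0) under a
        scheme that punishes confident fabrication (assumed values)."""
        return p_correct * reward + (1 - p_correct) * wrong_penalty

    for p in (0.3, 0.5, 0.7):
        score = expected_score(p)
        print(f"confidence {p:.0%}: answer={score:+.2f}, abstain=+0.00 ->",
              "answer" if score > 0.0 else "say 'I don't know'")
    # Under the old scheme (wrong_penalty=0) guessing always beats silence,
    # which is exactly the incentive that encourages hallucination.
    ```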

    This frank take on the matter summed up the entire Dell Technologies Forum perfectly. The days of ‘magical AI’ replacing humans seem to be a thing of the past (‘It hasn’t happened and won’t happen,’ Janusz quipped). It is being replaced by the concept of an ‘AI Factory’, where technology is supposed to take over the ‘thankless tasks’ of things we don’t like to do, but it is the human, within the ‘human in the loop’, who still makes the final decision. The forum showed how to build such a factory. But it is up to us to decide who pulls the levers in it.

    I left this year's Dell Technologies Forum with one dominant feeling: this is the end of 'magic' and the beginning of an era of mature engineering. After years dominated by hype and fears of 'Shadow AI', the discussion has finally shifted onto the right track. The real protagonists of the conference were not algorithms, but hard data (like Dell's $10 billion in added revenue), pragmatism (fighting hallucinations) and accountability (from trust at mBank to General Molenda's cyber front).

    As editors of Brandsit, we were media patrons of the event.

  • AI servers drive Dell. Company revises financial targets upwards

    Dell Technologies is sharply raising its long-term financial outlook, signalling that the boom in artificial intelligence servers is not a temporary trend but the foundation for future growth. The company, which supplies infrastructure to Elon Musk’s xAI, among others, has almost doubled its earnings per share growth target for the next four years.

    The conglomerate now expects annual adjusted earnings per share (EPS) growth of at least 15 per cent, up from around 8 per cent previously. The forecast compound annual revenue growth rate (CAGR) has also more than doubled, from 3-4% to 7-9%, through fiscal 2029.
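
    For scale, a quick sketch of what that revision means cumulatively (the revenue base below is an index, a placeholder rather than Dell's actual figure):

    ```python
    def compound(base: float, rate: float, years: int) -> float:
        """Value after compounding `rate` annually for `years` years."""
        return base * (1 + rate) ** years

    BASE = 100.0  # indexed revenue, not an actual dollar figure
    for rate in (0.035, 0.08):  # midpoints of the old 3-4% and new 7-9% ranges
        print(f"{rate:.1%} CAGR over 4 years: {compound(BASE, rate, 4):.1f}")
    # ~3.5% -> ~114.8 vs ~8% -> ~136.0: the revision roughly doubles
    # the cumulative growth implied through fiscal 2029.
    ```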

    The main driver is demand for AI-optimised servers. The infrastructure solutions division, which includes servers and storage, is now expected to grow at a CAGR of 11-14% per year, rather than 6-8% as previously assumed. This is in response to the market’s insatiable appetite for the computing power needed to train and deploy language models.

    The optimistic outlook is expected to allay investor concerns about margin pressure in the highly competitive AI hardware segment. Analysts point out that Dell’s advantage over rivals such as Super Micro is its scale of operations, mature supply chain and established relationships with key corporate customers.

    Despite the euphoria around AI, the company is maintaining a conservative approach to its other pillar, the PC business. The Client Solutions Group division is expected to record stable but low growth of 2-3%, reflecting saturation and strong competition in the PC market. The new forecasts show where Dell sees its future – in the server room, not on the desktop.

  • Storage for data centres – 5 leaders in 2025

    Today’s data centre has ceased to be just a back-office IT facility; it has become an engine that drives innovation. The explosion of data, driven largely by generative artificial intelligence, is forcing a fundamental re-evaluation of the storage infrastructure.

    In 2025, choosing a vendor is a strategic decision that goes beyond a hardware purchase. Market leaders are no longer competing on petabytes alone; they are delivering platforms that offer a cloud operating model, guaranteed cyber resilience and architecture built with AI requirements in mind.

    New evaluation criteria: megatrends shaping the market

    To understand which vendors will dominate in 2025, it is necessary to identify the key market forces. Four megatrends define new criteria for evaluating storage platforms:

    • Ubiquitous artificial intelligence: AI workloads, especially training large language models (LLMs), require powerful GPU-based infrastructure and high-performance, low-latency storage. Demand is focused on all-flash NVMe architectures and scale-out designs that can power GPUs without creating bottlenecks. At the same time, vendors are integrating AIOps mechanisms directly into their platforms, automating management and enabling natural language administration.
    • Sustainability mandate: Data centres are forecast to consume more than 1,000 TWh of energy by 2026, leading to energy supply constraints and increased operating costs. Energy efficiency is becoming a key factor in total cost of ownership (TCO). These pressures are driving innovations such as the adoption of QLC flash memory, business models that reduce e-waste and the standard use of liquid cooling.
    • The cyber resilience imperative: The rise of ransomware attacks is making the storage layer the last line of defence. The focus is shifting from simple backup to end-to-end resilience, including real-time threat detection, immutable snapshots and guaranteed, fast recovery.
    • Cloud operating model: the market is increasingly clamouring for a cloud-like experience for on-premises infrastructure, as seen in the rise in popularity of Storage-as-a-Service (STaaS) offerings. IT teams want to manage infrastructure from a single, centralised console in the cloud, automate resource allocation and consume resources on demand, regardless of physical location.

    Analysis of the top 5 suppliers

    Based on the new criteria, five suppliers stand out as leaders ready to meet the challenges of 2025.

    Dell Technologies: market leader with a comprehensive portfolio

    Dell maintains its leadership position in terms of external storage market share. Its strategy is to leverage scale and a comprehensive portfolio to support a wide range of traditional and modern workloads.

    Key products:

    • PowerStore: The flagship mid-range all-flash platform, which recently gained support for the Nutanix Cloud Platform, offering customers an alternative to VMware.
    • PowerFlex: a software-defined infrastructure (SDS) platform designed for performance and scalability, ideal for consolidating diverse corporate workloads.
    • PowerScale: a scale-out NAS solution built to handle unstructured data, making it a key component in AI/ML workflows.

    Differentiator for 2025: Breadth of offering and an established market position. Dell is the choice for large enterprises that need a single supplier to support diverse workloads, from the network edge to the cloud.

    Pure Storage: an innovator in simplicity and sustainability

    Pure Storage’s strategy is based on operational simplicity, customer experience and a subscription model that eliminates the traditional storage lifecycle.

    Key elements:

    • Evergreen® model: A subscription that provides uninterrupted software and hardware updates, directly impacting TCO and reducing e-waste.
    • Energy efficiency: Pure architecture delivers up to 85% lower power consumption compared to competitive all-flash arrays.
    • Pure1 AI Copilot: AI assistant for storage management, allowing administrators to use natural language for troubleshooting and planning.

    Differentiator for 2025: A focus on customer experience, delivered through the Evergreen model and simple management. Pure is the choice for organisations that prioritise TCO, operational simplicity and sustainability.

    NetApp: champion of multi-vendor hybrid cloud

    NetApp’s strategy is to dominate the hybrid, multi-cloud data fabric through a software-first approach.

    Key technologies:

    • ONTAP software: the heart of the NetApp ecosystem, delivering unified data services that run consistently locally and natively across the three largest public clouds (AWS, Azure, Google Cloud).
    • Cyber resilience: NetApp offers autonomous ransomware protection with guaranteed recovery from snapshots, using AI/ML to detect anomalies in real time.
    • BlueXP: A unified management console based on AIOps that allows the entire data infrastructure to be managed from a single interface.

    Differentiator for 2025: Software. No other vendor provides a more seamless and consistent data management experience across a hybrid, multi-cloud landscape. NetApp is the choice for organisations strategically committed to a hybrid architecture.

    Hewlett Packard Enterprise (HPE): On-premises cloud architect

    HPE’s strategy is to deliver its entire IT infrastructure as a service through the HPE GreenLake platform, providing a cloud operational experience in a local environment.

    Key technologies:

    • HPE GreenLake: A platform that underpins HPE’s strategy, providing a unified, cloud-based console for managing and consuming IT resources in a pay-per-use model.
    • HPE Alletra Storage MP: A hardware platform with a disaggregated scale-out architecture, meaning that compute resources and capacity can be scaled independently, providing flexibility and cost efficiency.
    • Availability guarantee: HPE guarantees 100% data availability for its critical Alletra systems.

    Differentiator for 2025: HPE’s vision is a radical departure from traditional infrastructure sales. It is the choice for enterprises that are fully committed to an on-premises cloud strategy and want to manage their entire infrastructure through a single as-a-service platform.

    IBM: a bastion of corporate resilience and security

    IBM’s storage strategy focuses on unparalleled cyber resilience, performance and integration in complex, often highly regulated IT environments.

    Key technologies:

    • FlashCore modules (FCMs): Unlike competitors using standard SSDs, IBM designs its own modules that handle tasks such as compression, encryption and real-time threat detection at the drive level, without affecting performance.
    • Ransomware detection: FlashSystem uses machine learning models running on FCMs to detect anomalies indicative of ransomware in less than a minute.
    • Safeguarded Copy: A function that creates immutable, isolated snapshots that cannot be modified or deleted during an attack.

    Differentiator for 2025: In an era of escalating cyber threats, IBM’s deep focus on security engineering is key. It is the choice for large enterprises in regulated industries where data integrity and recoverability are non-negotiable.

    Choosing a partner, not just a product

    The storage decision in 2025 is not about which device has the best specifications, but which platform best fits an organisation’s strategic goals for AI, cloud and resilience.

    The market has evolved from selling hardware to providing end-to-end intelligent platforms. The right choice is a supplier that acts as a partner, offering a platform that reduces complexity, mitigates risk and provides a basis for future innovation.

  • Changing of the guard in Dell’s finances. CFO Yvonne McGill is leaving after 30 years

    After nearly three decades at Dell Technologies, chief financial officer Yvonne McGill will step down on 9 September. Her duties will be taken over on an interim basis by David Kennedy, a veteran of 27 years with the company, currently serving as senior vice president of global business operations and finance.

    McGill will remain in an advisory role until the end of October to ensure a smooth handover.

    The sudden departure of such a key figure could have unsettled investors, but Dell acted quickly to calm the market.

    In an official release, the company stressed that McGill’s resignation is not the result of any disagreements over financial reporting, internal controls or operational policies.

    What’s more, Dell confirmed its financial forecasts for the third quarter and full year, signalling stability and no disruption to ongoing operations.

    Despite these assurances, the company’s shares saw a slight decline of 1.8% shortly after the change was announced. The market will now keep a close eye on the process of finding a permanent successor to McGill.

    Given his in-depth knowledge of Dell’s financial operations, Kennedy seems a natural and safe choice as interim CFO for the transition period.

    This change, although significant, for the time being does not herald a strategic shake-up in the tech giant’s finances.

  • AI is not just about the cloud. The success of Dell and HPE is evidence of the renaissance of private server rooms

    The initial phase of the AI revolution solidified a simple and clear picture of the market. At the top was Nvidia, providing the technological ‘shovel’ in the form of GPUs, with cloud giants Amazon, Microsoft and Google just below offering access to ‘goldfields’ of computing power.

    However, the latest financial results from traditional hardware vendors such as Dell and HPE show that this picture was incomplete. The centre of gravity in the key enterprise segment is beginning to shift towards private infrastructure, signalling that the market is entering a new, more mature phase in which control, security and cost are becoming the most valuable currency.

    The hard financial figures leave no room for doubt. Dell’s servers and networking segment grew by an impressive 69% in the last quarter, an absolutely exceptional result in such a mature sector.

    This jump translated into record revenues across the company of $29.8bn. At the same time, Hewlett Packard Enterprise reports that AI-dedicated systems generated $1.6bn in revenue, and its entire server segment grew solidly by 16%.

    We’re not talking about selling standard machines. We’re talking about advanced, high-margin systems, saturated with the latest GPUs, ultra-fast interconnects and huge amounts of memory.

    These systems are the driving force behind the increases and a clear indication of where companies are now placing their largest technology budgets.

    Behind this fundamental market shift is primarily a pragmatic calculation and strategic course correction. The ‘cloud-first’ model that has dominated IT thinking for the past decade is evolving towards a more sustainable ‘cloud-smart’ approach or, to put it simply, towards a hybrid architecture.

    While the public cloud remains an indispensable environment for rapid prototyping, experimentation and scaling of variable workloads, large-scale production AI deployments have highlighted its structural limitations.

    The motivation to invest in one’s own equipment is based on three pillars that have become critical.

    Firstly, the issue of data security and sovereignty has come to the fore. In an era of regulations such as GDPR in Europe, the processing of sensitive corporate data – be it intellectual property, financial data or customer information – on external, shared infrastructure raises legitimate and often unacceptable risks.

    For many industries, from finance to healthcare, the ability to physically control data is not an option, but a legal requirement. The concept of ‘data gravity’ is becoming a reality: it is easier to attract computing power to massive corporate datasets than to transfer petabytes of information to the cloud.

    Secondly, businesses have begun to look closely at total cost of ownership (TCO). While the initial capital outlay to purchase their own servers is high, the operational cost of renting cloud resources to support sustained, intensive AI workloads can be astronomical and unpredictable in the long term.

    For companies that train and operate models continuously, having their own infrastructure offers much better financial predictability and a lower cost over a 3-5 year cycle.
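
    To make this calculation concrete, here is a minimal, purely illustrative Python sketch. Every figure in it is an assumption invented for the example ($2.50 per GPU-hour, a $400,000 server purchase, $60,000 a year for power, cooling and maintenance), not vendor pricing, but it shows how sustained utilisation tips the balance toward owned hardware over the 3-5 year horizon described above.

        # Back-of-the-envelope TCO comparison; every figure is a
        # hypothetical assumption for illustration, not real pricing.
        HOURS_PER_YEAR = 24 * 365

        def cloud_cost(gpus: int, hourly_rate: float, years: int) -> float:
            """Cumulative cost of renting GPUs for a sustained workload."""
            return gpus * hourly_rate * HOURS_PER_YEAR * years

        def on_prem_cost(capex: float, annual_opex: float, years: int) -> float:
            """Up-front purchase plus yearly running costs."""
            return capex + annual_opex * years

        for years in (1, 3, 5):
            cloud = cloud_cost(gpus=8, hourly_rate=2.50, years=years)
            local = on_prem_cost(capex=400_000, annual_opex=60_000, years=years)
            print(f"{years} years  cloud: ${cloud:,.0f}  on-prem: ${local:,.0f}")

    With these assumed numbers the cloud is cheaper in year one, roughly at parity around year three and clearly more expensive by year five – exactly the pattern the TCO argument describes.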

    Thirdly, performance and personalisation requirements cannot be ignored. Latency-sensitive AI applications, crucial in industrial automation, autonomous vehicle systems or banking, require millisecond processing.

    Even the minimal latency associated with transferring data to and from the cloud can be unacceptable in such scenarios. Proprietary hardware also allows for deep optimisation and customisation of the entire architecture – from hardware to software – to the specific needs of the model, which is often impossible in standardised cloud environments.

    In this new landscape, Dell and HPE find themselves perfectly placed. Their advantage is not just in the technology, but in the deep understanding of the enterprise market that they have built up over decades.

    It is not just commercial relationships, but knowledge of procurement cycles, the ability to provide global technical support under strict SLAs, and experience in integrating new solutions with existing complex IT systems. What’s more, they hit exceptionally fertile ground.

    It is estimated that up to 70% of servers in companies are older-generation hardware, which is not only insufficient for AI tasks, but also extremely energy inefficient. The pressure to upgrade is therefore twofold: on the one hand, the need for power; on the other, rising energy costs and sustainability goals (ESG).

    The artificial intelligence market is entering a new, more sustainable phase. This does not mean the end of the cloud, but a redefinition of its role as one of the key elements in a broader, hybrid strategy. The experimentation phase is coming to an end and the time for strategic, long-term deployments is beginning.

    In this game, it is the providers that can offer security, performance and cost predictability in their own data centre that are taking the lead.

  • Is the hype for AI PC dimming? The real revolution is just beginning

    Could it be that the great revolution in the world of personal computing, driven by artificial intelligence, has lost momentum before it has begun in earnest? Recent data from analysts at Gartner, suggesting a revision of growth forecasts, may dampen enthusiasm and provoke questions about the reality of the announced breakthrough. However, nothing could be further from the truth. What we are seeing is not an emergency brake, but a natural course correction in the face of global economic challenges. The reality is that the technological wave is gathering strength just over the horizon, and the current calm is just the calm before the inevitable storm.

    Understanding the slowdown – or why the numbers dipped for a while

    There is no denying that the original forecasts were extremely optimistic. Gartner has revised its prediction for the share of AI PCs for this year from 43% to less than a third of the market. Similarly, instead of 114 million units, closer to 78 million will hit the market. Why the change? The reasons lie not in the technology itself, but in the macroeconomic environment. Global uncertainty, fluctuating markets and a complex trade policy situation are making companies around the world more cautious about large investments and equipment replacement cycles.

    This is a natural phenomenon – at a time when it is harder to predict the future, budgets become less flexible. However, it is important to understand that this is a delay in purchasing decisions, not a rejection of the AI PC idea itself. The problem lies not in the potential of the new machines, but in external factors that have pressed pause for a while.

    The train has already left the station – the giants are all in

    While analysts are correcting short-term forecasts, the revolution continues apace in the labs and on the production lines. The truth is that the entire technological ecosystem is already fully engaged in the transformation. This is no longer an experiment – it is a strategic market redesign from which there is no turning back.

    The hardware foundations have long been in place. Intel, with its Core Ultra processors, has integrated NPUs (Neural Processing Units) directly into the silicon, making on-device AI processing a standard. Nvidia, the giant in graphics and AI computing, has meanwhile formed strategic alliances with all the key players in the market – Acer, ASUS, Dell, HP and Lenovo – to bring powerful, AI-ready machines to market.

    What’s more, let’s look at the scale of the increase. Although the figure of 78 million units shipped this year is lower than the original target, it still represents an almost fourfold increase on the just over 20 million units that hit the market in 2023. This is a growth rate that most industries can only dream of. This is not deceleration, this is the unleashing of a powerful machine.

    A tsunami of software is coming – the real reason AI PCs will win

    Ultimately, the success of any hardware platform is determined by the software that can harness its power. And herein lies the strongest argument for the inevitability of the AI PC revolution. The hardware itself is only a promise – the real value will come from the applications.

    According to Gartner, by the end of next year as many as 40% of software vendors will be investing in AI features running locally on the PC. This is a seismic shift, given that just last year the percentage was only 2%.

    What does this mean for us users? It marks the end of an era where advanced AI features were only available through the cloud. This means:

    • Greater privacy and security: Data will be processed locally on our device, without being sent to external servers.
    • Instant responsiveness: No more internet connection delays. Editing video, generating graphics or working with the intelligent assistant will be instantaneous.
    • Powerful new possibilities: Imagine a real-time video call interpreter working offline. Or an office suite that intelligently summarises documents and prepares replies to emails without network access. It is these ‘killer apps’ that will ultimately convince the market.

    Time for preparation, not hesitation

    Momentary turbulence on the sales charts cannot stop a revolution whose foundations are already built of silicon, steel and code. The engine of change is in full swing and long-term forecasts leave no illusions: AI PCs will capture more than half of the market as early as next year, and by 2029 they will become the absolute norm.

    That is why it is worth treating the current moment of market calm not as a cause for concern, but as a strategic opportunity. It is the perfect time for companies and technology enthusiasts to plan their strategy calmly, update their knowledge and prepare for the moment when the storm of innovation will hit full force. Because the fact that it will hit is already certain.

  • Dell’s paradox: How is growing demand for AI eroding profitability?

    Dell Technologies, one of the main beneficiaries of the AI revolution, is experiencing a growth paradox. Despite raising its annual revenue forecasts, the company’s shares fell as investors took note of increasing pressure on profitability.

    This signals that in the gold rush that is the AI server market today, simply winning contracts is not enough – the ability to monetise them is becoming crucial.

    At the heart of the problem is the strategic choice facing Dell. The company is deliberately prioritising fulfilling its huge order book for AI-optimised servers over maintaining high margins.

    In practice, this means an aggressive battle for market share with competitors such as Hewlett Packard Enterprise and Super Micro Computer. At stake is dominance in a segment that drives the entire technology industry today.

    However, the price of this strategy is evident in the results. The adjusted gross margin in the second quarter came in below analysts’ expectations, and the profit forecast for the third quarter also fell short of consensus.

    High component costs, supply chain disruptions and pricing pressures mean that each AI server delivered brings the company less profit than the market would like.

    Nevertheless, the long-term picture remains optimistic. Dell has significantly raised its full-year revenue forecast – from an expected $101-105 billion to $105-109 billion. It also raised its forecast for adjusted earnings per share.

    This shows that demand for AI infrastructure is powerful and provides a solid foundation for future growth, offsetting weaker dynamics in the traditional PC segment.

    Investors and the market are now watching closely to see if Dell manages to strike a balance between expansion and profitability.

    Gaining a leadership position in the AI era is crucial, but the ultimate test for the company will be to turn this dominance into a sustainable and profitable business model.

  • ReVault: Five serious security vulnerabilities in over 100 Dell laptop models

    Cisco Talos has disclosed five serious vulnerabilities in the Dell ControlVault3 firmware and associated Windows APIs. Collectively named ReVault, the vulnerabilities pose a real threat to more than 100 models of Dell laptops – often used in environments with heightened security requirements.

    What is Dell ControlVault?

    Dell ControlVault3 (and the newer ControlVault3+ version) is the hardware solution responsible for storing sensitive login data: passwords, fingerprints, biometric templates or security codes. Instead of being stored in the operating system, this data goes into a dedicated chip on a board called the Unified Security Hub (USH), connected to a fingerprint reader, smart card or NFC module.

    The solution is used in more than 100 models of Dell laptops – mainly in the Latitude, Precision and Rugged series – often used in sectors such as public administration, the financial sector or critical infrastructure.

    How does the ReVault attack work?

    ReVault is not just a collection of software vulnerabilities. It’s also a potential way to take full control of a computer – whether the attacker acts remotely or has physical access to the device. Here are two possible scenarios that show the scale of the risk:

    1. Actions after taking control of the hardware
      On Windows, even a user without administrator rights can – using the available APIs – establish communication with the ControlVault firmware and run arbitrary code in it. This makes it possible, among other things, to steal cryptographic keys and permanently modify the firmware. As a result, it is possible to install a so-called implant – malicious code hidden in the firmware, which can remain invisible and can be used later to attack again, even after reinstalling the system.
    2. Physical attack on hardware
      Physical access to a laptop is all that is needed to connect to a Unified Security Hub (USH) board via USB and launch an attack – without the need to know the password, PIN or disk encryption key. Additionally, if the system is configured for fingerprint login, the modified firmware can accept any fingerprint, allowing unauthorised access to the system.

    How to mitigate risk?

    • Software update:
      The most effective protection is to install the latest firmware versions. Dell has already released the fixes, which are available first on its website and, over time, also via Windows Update; installing them resolves the problem.
    • Disabling unused components:
      If biometric devices (fingerprint reader, NFC card, smart card) are not used, it is a good idea to disable the relevant services in Windows Services and Device Manager (see the sketch after this list).
    • Changing login settings:
      In high-risk environments such as hotels, shared spaces or business travel, it is recommended to temporarily disable biometric logins. You may also consider activating the Enhanced Sign-in Security (ESS) feature available in Windows.
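
    For illustration, a minimal Python sketch of the service check mentioned above is shown below. It assumes the standard Windows service name WbioSrvc for the Windows Biometric Service and shells out to the built-in sc tool; it is a sketch of the idea, not Dell’s official tooling, and stopping services requires administrator rights.

        # Minimal sketch: inspect (and optionally stop) the Windows
        # Biometric Service when biometric hardware is not in use.
        # "WbioSrvc" is the standard Windows Biometric Service name.
        import subprocess

        def service_state(name: str) -> str:
            out = subprocess.run(["sc", "query", name],
                                 capture_output=True, text=True).stdout
            for line in out.splitlines():
                if "STATE" in line:
                    return line.strip()
            return "unknown"

        print("WbioSrvc:", service_state("WbioSrvc"))
        # To stop the service, run from an elevated prompt:
        # subprocess.run(["sc", "stop", "WbioSrvc"], check=False)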

    How to detect a potential security breach?

    • Physical tampering detection:
      On many Dell laptop models, case opening detection can be enabled in the BIOS. The system will inform the user of a possible physical intrusion.
    • System log monitoring:
      Unexpected failures of services such as Windows Biometric Service or Credential Vault may suggest an attack attempt or firmware malfunction (see the sketch below).
    • Signatures in security software:
      Cisco Secure Endpoint users should note the alert: ‘bcmbipdll.dll Loaded by Abnormal Process’, which may indicate unauthorised activity.
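
    As a starting point for the log monitoring described above, the hedged Python sketch below queries the Windows System event log for Service Control Manager event ID 7034 (“service terminated unexpectedly”) using the built-in wevtutil tool, then filters for biometric- and credential-related entries. The query and the number of events retrieved will need adjusting to your environment.

        # Sketch: look for unexpected service terminations (event ID 7034)
        # that mention biometric or credential services.
        import subprocess

        QUERY = ("*[System[Provider[@Name='Service Control Manager'] "
                 "and (EventID=7034)]]")
        result = subprocess.run(
            ["wevtutil", "qe", "System", f"/q:{QUERY}", "/c:20", "/f:text"],
            capture_output=True, text=True)
        # wevtutil's text output starts each entry with "Event[n]:".
        for block in result.stdout.split("Event["):
            if "Biometric" in block or "Credential" in block:
                print(block.strip()[:400])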

  • Business laptops in H1 2025: Leaders, strategies and forecasts in the age of AI

    2025 is not another year of evolution in the business laptop market; it is a year of revolutionary upheaval. The confluence of three powerful forces – the massive deployment of on-device AI, the imminent end of Windows 10 support and the permanent entrenchment of hybrid working models – is creating a perfect storm that redefines what a PC is and can do for the professional. It is a fundamental paradigm shift, moving the centre of gravity from raw computing power to intelligent, integrated and secure ecosystems.

    This transformation is based on three pillars. The first is the AI revolution, which shifts the burden of computing from the cloud directly to the device. With dedicated neural processing units (NPUs), laptops gain an unprecedented ability to personalise, automate tasks and deliver real-time productivity, all while maintaining the highest standards of data privacy. The second pillar is a powerful, unavoidable market catalyst: the 14 October 2025 deadline marking the end of support for Windows 10. It forces businesses around the world to upgrade their PC fleets en masse, perfectly coinciding with the release of the next generation of hardware. The third pillar is the new working paradigm. The hybrid model has become the norm, forcing manufacturers to create devices that are not only powerful, but also ultra-secure, highly mobile and optimised for continuous video collaboration.

    In this new landscape, competitive advantage is no longer defined solely by processor power, measured in gigahertz, or the number of cores. The winners in 2025 will be those vendors who offer the most cohesive, intelligent and secure ecosystem, combining hardware, software and services in a way that realistically addresses the challenges of a decentralised, AI-driven working environment. This analysis delves into the market data, technology strategies and positioning of key players to identify the trends that will define the future of business laptops.

    The market landscape in 2025 – a rebound in the shadow of uncertainty

    The first half of 2025 has brought a long-awaited recovery to the PC market. After a period of stagnation, global PC shipments recorded solid growth, increasing by 9% year-on-year in the first quarter and by 7% in the second quarter. At first glance, these figures signal the return of healthy demand and the start of a new buying cycle.

    An analysis of the leading manufacturers shows varying dynamics. Lenovo maintained its position as the undisputed leader, recording impressive shipment growth of 11% in Q1 and 15.2% in Q2 2025, demonstrating the effectiveness of its product and operational strategy. HP Inc. steadily took second place, maintaining a solid market share. The real star of the first half of the year, however, turned out to be Apple, which demonstrated a spectacular growth rate, with shipments increasing by 22% in Q1 and 21.3% in Q2. Such results point to the growing acceptance of the Mac platform in commercial and professional environments, which have traditionally been a bastion of Windows systems. In contrast, Dell, the third largest player, reported a 3.0% decline in shipments in Q2, which may signal some challenges in aligning its portfolio to the current product cycle or stronger competition from rivals.

    Although the figures point to recovery, a deeper analysis reveals a more complex picture. The seemingly strong growth figures for the first half of 2025 do not fully reflect organic demand from end users. They are largely the result of strategic stockpiling by manufacturers and distribution channels in response to growing geopolitical uncertainty and the threat of new trade tariffs, particularly in the US. Manufacturers have been deliberately accelerating orders to get ahead of potential tariff increases on electronics, meaning that high delivery numbers may be significantly ahead of actual end demand. This creates the risk of excess inventory in the distribution channel, which could lead to price pressure and a potential market correction in the second half of the year.

    The Polish IT market reflects global recovery trends, but with its own unique characteristics. In the first quarter of 2025, PC sales in Polish distribution grew by an impressive 18.1% year-on-year, but this growth was driven primarily by the consumer segment. The business sector showed more restraint, revealing the existence of a ‘value proof gap’. In the Polish market, there is a clear discrepancy between the global marketing hype around AI PCs and the actual purchasing decisions of companies. While global forecasts predict that AI laptops will account for nearly 60% of the market as early as 2025, first-quarter data from the Polish market shows that sales of the latest AI-optimised chips were “lower than expected”. Business customers were more likely to opt for tried-and-tested and often cheaper older-generation processors.

    The AI PC revolution – the birth of a new category of device

    The year 2025 marks the point at which the term ‘AI PC’ moves from the marketing buzzword phase to a real, defined product category. The de facto market benchmark that has defined what a modern AI PC is has become the Microsoft Copilot+ PC standard. This is a new class of Windows-based device designed from the ground up for local processing of AI tasks. In order for a laptop to carry this designation, it must meet stringent technical requirements: a neural processing unit (NPU) performance of at least 40 trillion operations per second (TOPS), a minimum of 16 GB of RAM and a fast SSD of at least 256 GB.
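
    The threshold logic of that specification is simple enough to express directly. The short Python sketch below encodes the three Copilot+ PC minimums from the paragraph above; the candidate devices are made-up examples for illustration, not real products.

        # Copilot+ PC minimums: NPU >= 40 TOPS, >= 16 GB RAM, >= 256 GB SSD.
        MIN_TOPS, MIN_RAM_GB, MIN_SSD_GB = 40, 16, 256

        def is_copilot_plus(npu_tops: float, ram_gb: int, ssd_gb: int) -> bool:
            return (npu_tops >= MIN_TOPS
                    and ram_gb >= MIN_RAM_GB
                    and ssd_gb >= MIN_SSD_GB)

        # Hypothetical devices for illustration only.
        candidates = {
            "ultrabook A": (45, 16, 512),
            "ultrabook B": (11, 32, 1024),  # older-generation NPU
        }
        for name, spec in candidates.items():
            verdict = "Copilot+ class" if is_copilot_plus(*spec) else "below spec"
            print(f"{name}: {verdict}")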

    The role of the NPU is fundamental. Unlike traditional architectures, where AI tasks overloaded the main processor (CPU) or graphics card (GPU), the NPU is specifically designed to efficiently perform these calculations with significantly lower power consumption. This translates into longer battery life and quieter operation – key attributes in a mobile business environment.

    Practical business applications are becoming more and more tangible. Video conferencing is becoming smarter with features such as real-time noise cancellation and automatic speaker framing. Office applications such as Microsoft Teams and Outlook offer automatic transcriptions and summaries of meetings and email threads, freeing up staff time. In Excel, it becomes possible to generate complex reports from simple natural language commands.

    The sudden rise of AI PCs is being driven by two powerful, coinciding developments. First, the end of Windows 10 support on 14 October 2025 acts as a powerful commercial catalyst, forcing companies to upgrade their hardware fleets. At this point, Copilot+ PCs are positioned as the logical, standard choice for any hardware replacement. Secondly, we are seeing a fundamental architectural shift: the re-decentralisation of computing. Moving AI computing from the cloud to the end-device addresses growing business concerns about privacy, speed, reliability and cost control associated with the cloud model.

    The war for silicon – who is driving the intelligence on board?

    At the heart of the AI PC revolution is a new generation of processors, and competition between processor makers has become a key battleground. In the AI PC era, the key performance indicator has become NPU power, expressed in TOPS (Trillions of Operations Per Second).

    Each of the four major players in the silicon market has adopted a different strategy. Intel, with its new generation of Core Ultra (‘Lunar Lake’) processors delivering 47 TOPS, is focused on defending its dominant position in the commercial segment by leveraging its mature x86 ecosystem. AMD, with its Ryzen AI 300 series (‘Strix Point’) offering 50 TOPS, is positioning itself as an aggressive contender, targeting the crown of performance leader, including in integrated graphics. Qualcomm, with its Snapdragon X Elite (45 TOPS), is making a breakthrough, bringing the highly energy-efficient ARM architecture to the Windows world and promising revolutionary battery life. Finally, Apple, with its M4 chip (38 TOPS), continues its strategy of full vertical integration, focusing on optimising the synergy between hardware and software rather than the race for the highest nominal TOPS value alone.

    Behind the façade of numbers lie deeper dimensions of competition. The real battle is to create the most efficient development platform, as it is the software, not the hardware, that will ultimately deliver value to the user. Qualcomm’s entry into the Windows market with the ARM architecture also creates a fundamental dilemma for IT decision-makers: whether to bet on the guaranteed compatibility of the x86 platform, or to risk potential problems with emulating legacy applications in exchange for the ARM platform’s significantly longer battery life.

    The battle for the ecosystem, not just the hardware

    Faced with such fundamental changes, the major laptop manufacturers are racing to build a coherent ecosystem.

    Lenovo is pursuing a two-pronged strategy. “AI Engine+” in the Legion series is targeted at gamers and optimises performance in real time. “AI Now” in the ThinkPad series, on the other hand, is a personal assistant for professionals, running locally on the device and emphasising privacy. This strategy is supported by the solid financial performance of the Intelligent Devices Group (IDG) division, which reported strong double-digit revenue growth.

    HP is focusing on an integrated ‘HP AI Companion’ platform to create ‘intelligent workflows’ and personalise the user experience, with a strong focus on security and on-device computing. The company has taken a flexible approach, offering processors from Intel, AMD and Qualcomm in its AI PC laptops, with the HP Wolf Security suite as a key component of the offering. The success of this strategy is evidenced by 9% revenue growth in the commercial segment in the second quarter of fiscal 2025.

    Dell is taking a more evolutionary approach to the AI revolution, gradually integrating intelligent features into its proven business lines such as Latitude and Precision. The company is focusing on practical benefits: optimising performance, enhancing security and improving user interaction. This conservative strategy, reflected in flat revenue in the commercial segment in Q2 2025, seems primarily aimed at maintaining the trust of existing large enterprise customers.

    Apple bases its strategy on deep ecosystem integration and uncompromising privacy. The ‘Apple Intelligence’ suite is woven into the core of the operating systems, with most operations taking place locally on the device. The company’s focus is on the fluidity and real-world usability of AI features that draw strength from awareness of the user’s personal context. Impressive double-digit growth in Mac shipments in the first half of 2025 is evidence that this strategy is resonating with the market.

    The year 2025 marks a turning point that requires IT decision-makers to rethink their purchasing strategies. AI is becoming standard, the platform market is fragmenting (x86 vs. ARM) and security is becoming an integral part of hardware.

    Choosing the right business laptop requires a new, holistic approach. The traditional purchasing model, based mainly on technical specifications, is no longer sufficient. IT decision-makers need to think like strategists, basing the decision-making process on a three-dimensional matrix that takes into account the specific needs of the organisation: the performance axis (including NPU and energy efficiency), the ecosystem axis (the trade-off between x86 compatibility and ARM battery performance) and the security axis (the depth of integration of hardware solutions).

    The traditional total cost of ownership (TCO) metric also needs to be redefined. The new TCO needs to take into account the return on investment (ROI) resulting from increased employee productivity through AI features and the risk-avoidance value of on-device computing.

    Looking to the future, current trends can be expected to deepen further. Artificial intelligence will become even more personalised and sustainability will play an increasingly important role in decision-making.

  • Singapore is investigating the illegal export of Nvidia’s AI chips. In the background, Dell and Super Micro

    At the centre of a high-profile investigation in Singapore were advanced servers supplied by US companies – Dell Technologies and Super Micro Computer – that may have contained Nvidia chips banned for export. The case sheds new light on the growing tensions around AI technology export controls and international trade loopholes.

    Singaporean authorities have arrested three men, including a Chinese national, in connection with fraud and suspected illegal export of servers containing restricted components. The servers were allegedly given to companies in Singapore and then shipped to Malaysia. Investigators are looking into whether China may have been the final destination.

    According to the authorities, the equipment may have contained Nvidia chips, in potential violation of US export regulations. Singapore has begun working with the US government, which has for months been tightening controls to restrict China’s access to advanced semiconductors used to train artificial intelligence systems.

    The trail leads to DeepSeek, a Chinese AI startup that made headlines in January with a model matching its Western competitors. Unofficial reports suggest the company may have obtained up to 50,000 of Nvidia’s A100 and H100 series chips, which are banned from export to China, although the company maintains that it only used legally sourced H800 chips.

    The unclear role of intermediaries and logistics hubs such as Singapore in the global supply chain for advanced hardware raises questions about the effectiveness of current regulations. Singapore alone accounted for 18% of Nvidia’s revenue in the past fiscal year, but physical shipments to the country accounted for less than 2% of revenue – indicating that the city-state acts as a hub for invoicing and redistribution.

    Both Dell and Super Micro declare that they comply with US export laws and are prepared to respond to violations by their customers. In both cases, investigations are ongoing.

    The case is part of a broader trend of increasing surveillance of high-tech flows, particularly in the context of US-China tensions. On the one hand, we have export restrictions, on the other, sophisticated attempts to circumvent them. As a result, centres such as Singapore are becoming the intersection of eastern and western technological interests.

    Whether the investigation leads to concrete consequences for technology companies will depend on the evidence – and the willingness of authorities on both sides of the Pacific to enforce their laws in the era of the chip war.