Tag: TSMC

  • Apple is looking for an alternative to TSMC. Talks with Intel and Samsung

    Apple has entered into preliminary talks with Intel and Samsung Electronics over the potential production of its core processors. According to reports from Bloomberg, executives from the Cupertino giant have already visited Samsung’s Texas factory and held independent consultations with Intel. Although negotiations are at an early stage and have not translated into concrete orders, the move is aimed at creating an alternative to Taiwan’s TSMC. The decision comes in the shadow of Tim Cook’s warnings about supply constraints on advanced chips, which have negatively impacted iPhone sales. The situation is compounded by the fact that Apple’s upcoming smartphone processors use technology shared with its most coveted AI chips.

    Apple’s actions lead to a clear conclusion. The market’s deep dependence on a single supplier, which is what TSMC has become, poses serious operational risks, especially in an era of massive demand for artificial-intelligence architectures that is drastically shrinking available capacity. At the same time, Apple’s scepticism about the reliability standards and scale of alternative suppliers exposes a brutal truth: TSMC’s technological and logistical advantage creates a barrier that competitors cannot quickly overcome.

    The strategic need to review purchasing processes in the high-tech sector is worth noting. Business leaders should factor in long-term capacity shortfalls at state-of-the-art lithography nodes and treat diversification not as a fallback option but as a permanent part of strategy. It is advisable to develop closer collaboration with alternative manufacturing partners early in the design and R&D phase. Such an approach will minimise technological risk and make the hardware architecture more flexible, effectively securing business continuity in the face of further supply crises.

  • AI investments under question: Why is the stock market losing billions?

    In just a few weeks, an astronomical sum of $1.3 trillion evaporated from the valuations of the major tech giants. The phenomenon, although rapid, was not the work of unfortunate coincidence or a temporary panic among trading algorithms. Rather, it was a harsh market verdict delivered on a business model based largely on narrative rather than on the foundation of cash-flow generation. Unconditional faith in the promises of artificial intelligence has come to an end, giving way to an era of rigorous viability verification.

    For the past few years, the technology sector has been fed visions of an almost eschatological nature, in which artificial intelligence was to become the panacea for all efficiency ills. January 2026, however, brought a radical shift in perspective. Investors, hitherto inclined to favour far-reaching goals, turned their attention to current financial transparency. The gap between the enthusiastic declarations made at international economic forums and hard operational reality became too wide to ignore any longer. Microsoft became the symbol of this tumble: its capitalisation shrank by $613 billion, and set against historic daily declines of 12%, the drop showed just how fragile modern valuations have become.

    The reason lies in the deep disconnect between how useful the technology’s creators believe it to be and its real-world adoption at the consumer and corporate level. While leaders in Redmond or Seattle talk about changing the world, the average enterprise is still searching for answers on how generative models will realistically translate into operating margins. This gap in understanding fed a speculative bubble that burst when the market began to demand evidence of returns on gigantic capital expenditure.

    It is worth noting an interesting paradox that sheds new light on the structure of the current crisis. While developers of software and language models are losing value, the beneficiaries of the situation are those operating in the realm of physical infrastructure. Companies such as Taiwan Semiconductor Manufacturing Co or Samsung Electronics are experiencing growth, suggesting that capital is not fleeing the technology sector altogether but making a strategic rotation. Investors have begun to favour component suppliers whose returns are tangible and immediate, at the expense of visionaries whose success depends on the future, still uncertain monetisation of services. In this context, the success of a traditional giant like Walmart, which has significantly increased its market value through the targeted deployment of AI solutions in logistics, becomes a signpost for modern business strategy. Success is not achieved by whoever has the most powerful technology, but by whoever can harness it most effectively to generate savings in the real value chain.

    One of the most worrying aspects of the current situation is the phenomenon of the so-called capital expenditure trap. Projects of almost cyclopean scale, such as the OpenAI Stargate initiative valued at $500 billion, have turned into mechanisms for consuming capital on an industrial scale. A fundamental strategic dilemma is emerging here for technology and finance executives. Expenditure on AI infrastructure is growing exponentially, while the lifecycle of purchased hardware is shortening dramatically. There is a real risk that state-of-the-art accelerators and memory systems will depreciate and become technically obsolete faster than they can generate a profit covering their purchase cost. The situation is exacerbated by a component availability crisis, with delayed deliveries arriving in data centres at a time when the next, more efficient generation of silicon is already on the horizon.
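    The capital-expenditure trap described above boils down to a simple payback calculation: if the hardware takes longer to pay for itself than its competitive lifetime, the investment never recovers. The sketch below illustrates that logic with purely hypothetical figures (the prices, revenues and lifetimes are assumptions for illustration, not vendor data).

    ```python
    # Illustrative sketch (hypothetical numbers): does an AI accelerator pay for
    # itself before the next silicon generation makes it uncompetitive?

    def payback_months(capex_usd: float, monthly_net_revenue_usd: float) -> float:
        """Months needed for cumulative net revenue to cover the purchase price."""
        return capex_usd / monthly_net_revenue_usd

    # Assumed figures for a single accelerator (illustrative, not real data):
    capex = 30_000            # purchase price in USD
    monthly_revenue = 1_100   # net revenue it generates per month
    useful_life = 24          # months until a faster generation erodes its value

    months = payback_months(capex, monthly_revenue)
    print(f"Payback: {months:.1f} months, useful life: {useful_life} months")
    # If payback exceeds useful life, the purchase is a capital-expenditure trap.
    print("Capex trap" if months > useful_life else "Investment recoverable")
    ```

    With these assumptions the payback period (about 27 months) exceeds the useful life, which is exactly the scenario the article warns about: shrinking hardware lifecycles shift the break-even point beyond the point of obsolescence.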

    The current market correction is a lesson in humility towards the laws of economics. It demonstrates that, in the long term, technology cannot escape the need to demonstrate its cost-effectiveness. The strategic tension IT decision-makers now find themselves in requires a move away from aggressive, often unreflective pursuit of novelty towards sustainability. Rather than building monumental, untested systems, it makes sense to focus on financial transparency and precise definition of operational goals. The market no longer rewards mere presence in the AI arms race; it now rewards the ability to win individual performance battles.

    A key lesson from recent developments is the understanding that artificial intelligence is undergoing a process of normalisation. It is ceasing to be treated as a magical tool with infinite potential, and is beginning to be seen as a costly asset that must be managed with the same discipline as a fleet of machinery or vehicles. This is not the end of the revolution, but the moment of its transition into a mature phase, where the advantage is determined not by the amount of capital invested, but by the precision of its allocation.

    Big Tech’s trillion-dollar losses in 2026 do not necessarily signify the twilight of innovation; they are rather a necessary course correction. For the business world, they signal that the time of speculative euphoria is over and that the future belongs to those entities that can combine technological savvy with iron business logic. The biggest challenge of the coming months will therefore not only be the fight for access to the fastest processors, but above all the fight to regain the confidence of investors by showing real, tangible results from the transformations underway. In a world where trillions of dollars can disappear in a few trading sessions, the most valuable currency becomes credibility and the ability to generate profit here and now.

  • Physics versus marketing. What do you really gain by investing in 1.8nm and 3nm processors?

    Intel is bringing out the heavy guns in the form of third-generation Core Ultra processors, known as Panther Lake, which are based on 18A, or 1.8 nanometre, technology. On the other side of the market barricade is AMD with its Ryzen chips, baked in TSMC’s Taiwanese factories using a 3 nm process. On paper, Intel’s advantage seems crushing, suggesting a technology almost half the size and more modern. In the CFO’s portfolio, however, this difference may prove little more than a rounding error. In a world where ‘nanometre’ has become a brand rather than a measurement, business must learn to look at what really drives performance, ignoring the labels on the boxes.

    When IT managers look at the specifications of new laptops or servers, their gaze naturally goes to the numbers, because in the technology industry, smaller usually means better, faster and more economical. Manufacturers are well aware of this, which is why the arms race in the semiconductor sector has moved from the physics labs to the marketing departments. To make an informed purchasing decision for 2025-2026, you need to understand where the engineering ends and the wordplay begins.

    The grand illusion of the nanometre

    For decades, the IT industry has operated with a simple and understandable currency. Back in 1995, when we talked about the 350 nm technology process, it meant that the gate of a transistor on a silicon wafer was actually 350 nanometres long. The engineer and the salesman spoke the same language, and the node name was a direct reflection of physical reality. However, this order broke down in the late 1990s with the introduction of new technologies for building microtransistors, which broke the direct link between the node name and the physical dimension of the components.

    Today, names such as ‘Intel 4’, ‘18A’ (meaning 18 ångströms) or ‘TSMC N3’ are predominantly trade names. Treating them as a technical measure of length is a mistake that can lead to misleading business conclusions. The situation is analogous to the automotive market, where a model designation such as BMW 330 no longer necessarily denotes a three-litre engine. The number now serves to position the product in the range rather than to describe its technical parameters precisely.

    For business, this means that the approach to analysing offerings needs to change. The fact that one processor is labelled ‘1.8 nm’ and another ‘3 nm’ does not automatically mean that the former is physically much smaller. In fact, the differences may be minimal and, in extreme cases, the packing density relationship may even be the opposite of what the numbers suggest.

    The hard currency of silicon

    Since nanometres are conventional, an informed investor or IT manager should look at other metrics. If we look under the hood of Panther Lake processors or the latest Ryzen processors, we find objective parameters that PR departments are reluctant to talk about, but which are crucial for engineers. These are, first and foremost, Gate Pitch, which is the minimum distance between individual transistors, and Metal Pitch, denoting the minimum distance between the copper paths connecting these components.

    Analysis of this hard data leads to surprising conclusions. Comparing the current generation of processes, Intel 4 and the competing TSMC N4 turn out to have almost identical physical characteristics, with a gate pitch of roughly 50 to 51 nanometres. Despite the different trade names, the packing density of the two technologies is very similar. The future looks even more interesting: Intel is promoting an 18A process that suggests 1.8 nm, while TSMC is preparing to introduce a 2 nm process. Paradoxically, according to many technical analyses, it is the Taiwanese ‘2 nm’ that may offer higher transistor density than the American solution. Intel compensates with marketing that suggests leadership, but in practice the two giants are running neck and neck, and their nodes will deliver broadly comparable real-world characteristics.
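    To a first approximation, logic density scales with the inverse of the product of gate pitch and metal pitch, which is why those two figures matter more than the node name. The sketch below applies that rule of thumb; the gate pitches come from the comparison above, while the metal pitches are illustrative assumptions in the ballpark of published analyses, not official foundry specifications.

    ```python
    # First-order sketch: relative logic density estimated from contacted gate
    # pitch (CPP) and minimum metal pitch (MP). Density scales roughly with
    # 1 / (CPP * MP). Metal-pitch values below are assumed for illustration.

    def relative_density(cpp_nm: float, mp_nm: float) -> float:
        """Arbitrary-unit density proxy: transistors per area ~ 1/(CPP * MP)."""
        return 1.0 / (cpp_nm * mp_nm)

    nodes = {
        "Intel 4": (50, 30),   # gate pitch ~50 nm despite the '4' in the name
        "TSMC N4": (51, 28),   # gate pitch ~51 nm; metal pitch assumed
    }

    base = relative_density(*nodes["TSMC N4"])
    for name, (cpp, mp) in nodes.items():
        print(f"{name}: {relative_density(cpp, mp) / base:.2f}x vs TSMC N4")
    ```

    The point of the exercise is not the exact figures but the method: two nodes whose trade names differ by a factor of two can land within a few percent of each other once you compare the pitches that actually set packing density.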

    Physics translates into costs

    Although the labels are confusing, the technological advances are real and central to the cost of doing business, or TCO. Regardless of the nomenclature, the drive towards denser transistor packing is driven by the inexorable laws of physics, as a smaller transistor with a shorter path between source and drain requires a lower voltage to switch its logic state. For the company, this translates directly into energy efficiency and thermal performance.

    The chip, made using a newer, denser process, uses less power for the same load. On the scale of a single laptop, this means an extra hour of battery life during a business trip, while on the scale of a data centre, it translates into thousands of zlotys of savings on electricity bills. The thermal aspect is equally important, as less power consumption means less heat generated. This allows the processors to run at higher frequencies without the risk of thermal throttling, ensuring more stable operation of demanding applications. Therefore, Intel Panther Lake will be inherently better than its predecessor not because of the name ‘18A’, but because the engineers have actually improved the physical structure of the chip, which is also true for AMD using TSMC improvements.
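    The voltage argument above follows from the standard rule of thumb that dynamic CPU power scales as P ≈ C·V²·f, so even a modest drop in switching voltage cuts power quadratically. The sketch below works through one example with assumed figures (the capacitance, voltages and clock are illustrative, not measurements of any specific chip).

    ```python
    # Back-of-the-envelope sketch of why lower switching voltage matters:
    # dynamic power scales roughly as P ~ C * V^2 * f.
    # All numbers are illustrative assumptions, not chip measurements.

    def dynamic_power(c_eff: float, voltage: float, freq_hz: float) -> float:
        """Dynamic power in watts: effective capacitance * V^2 * frequency."""
        return c_eff * voltage ** 2 * freq_hz

    c_eff = 1.0e-9    # effective switched capacitance in farads (assumed)
    freq = 3.0e9      # 3 GHz clock (assumed)

    old = dynamic_power(c_eff, 1.00, freq)   # older node at 1.00 V
    new = dynamic_power(c_eff, 0.85, freq)   # denser node at 0.85 V

    print(f"Power drops by {(1 - new / old) * 100:.0f}%")  # ~28% from voltage alone
    ```

    A 15% voltage reduction yielding a roughly 28% power saving at the same frequency is exactly the mechanism that turns a denser process into longer battery life and lower data-centre electricity bills.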

    The strategic trap of the single supplier

    There is another element of business risk in this technological jigsaw puzzle, related to incompatibility. Intel’s, TSMC’s and Samsung’s manufacturing processes have diverged dramatically, with each giant using different chip production methods, deploying technologies such as FinFET or RibbonFET at different times. This means that chip designers such as AMD and NVIDIA are firmly tied to their chosen factory and cannot move production to a competitor overnight. Adapting a design to another factory is a process that takes up to a year and incurs huge costs. When choosing a hardware platform for a company, decision makers are therefore choosing not just a processor, but the entire supply chain, where the stability of the manufacturing partner becomes a strategic factor, more important than the marketing name of a nanometre.

    We are approaching the point where comparing processors solely on the basis of lithography becomes pointless. Intel Panther Lake and the upcoming Ryzen generations will be powerful chips, but their value to business is not based on the labels on the box. When planning infrastructure purchases, the key indicator should be the performance-per-watt ratio. It is this parameter that determines whether an investment in new hardware will translate into real productivity gains and reduced operating costs for the business.
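    Comparing by performance per watt, as recommended above, is straightforward to operationalise: divide a benchmark score by measured package power and rank the candidates. The chip names, scores and power figures in this sketch are entirely hypothetical and stand in for whatever internal benchmark data a purchasing team would collect.

    ```python
    # Minimal sketch: rank platforms by performance per watt rather than by the
    # nanometre label. Benchmark scores and power figures are hypothetical.

    def perf_per_watt(score: float, watts: float) -> float:
        """Benchmark points delivered per watt of package power."""
        return score / watts

    candidates = {
        "Chip A ('1.8 nm' label)": (12_000, 28),  # (benchmark score, watts)
        "Chip B ('3 nm' label)":   (11_500, 24),
    }

    for name, (score, watts) in candidates.items():
        print(f"{name}: {perf_per_watt(score, watts):.0f} points/W")

    best = max(candidates, key=lambda k: perf_per_watt(*candidates[k]))
    print(f"Best perf/W: {best}")
    ```

    In this made-up example the chip with the ‘bigger’ nanometre label wins on efficiency, which is precisely why the label alone is a poor basis for an infrastructure purchase.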

  • TSMC raises prices. Tech giants and consumers will pay for geopolitics

    Taiwan Semiconductor Manufacturing Co. (TSMC), a key chipmaker for companies such as Apple, Nvidia and AMD, plans to significantly increase the price of its most advanced semiconductors.

    According to industry reports, the increase could be as high as 10%, a signal of rising costs at the heart of the global technology supply chain.

    The main reason for this decision is the financial pressure from US trade policy and the increasing costs of global expansion. Although TSMC does most of its manufacturing in Taiwan, the company is facing costs related to import duties imposed by the US.

    Until now, the Taiwanese giant has largely absorbed these burdens to remain competitive. However, ongoing geopolitical tensions and pressure on margins have prompted management to change its strategy.

    The decision to pass on costs to customers is also linked to TSMC’s massive investments in the US. The construction of state-of-the-art factories in Arizona, partly financed by CHIPS Act subsidies, is aimed at geographically diversifying production and mitigating future customs risks.

    These strategic but capital-intensive projects generate additional costs, which the company intends to compensate for by adjusting price lists.

    As the undisputed market leader in the production of the most advanced chips (below 5 nm), TSMC is in a position to dictate terms. Price increases will almost certainly be passed down the value chain.

    Manufacturers such as Apple and Nvidia, facing higher component costs, are likely to revise the prices of their flagship products – from iPhones to GeForce graphics cards.

    As a result, the rising costs and geopolitical reshuffling of the semiconductor industry will ultimately be paid for by the consumer.

    TSMC’s decision clearly shows that the era of low-cost, globally produced chips may be coming to an end, with higher prices driven by national security strategies and regionalisation of production becoming the new norm.

  • TSMC beats predictions – artificial intelligence drives chip sales

    Taiwanese semiconductor giant TSMC has again surprised the market by releasing its preliminary financial results for the second quarter. The company achieved revenues of 933.8 billion Taiwan dollars (US$31.9 billion), an increase of almost 39% year-on-year. This result exceeded both analysts’ forecasts and the company’s own earlier estimates.

    Behind the impressive result are the growing demands of the artificial intelligence market. TSMC, as a key manufacturer of advanced chips for players such as Nvidia, is directly benefiting from the global race for computing power. The latest generations of chips, produced in the most advanced lithographies (N3, N4), are now making their way into AI-oriented data centres.

    TSMC remains the undisputed leader in contract semiconductor manufacturing, with more than 50% market share and a technological lead over competitors such as Samsung Foundry and Intel Foundry Services. The record-breaking quarter indicates that the company is not only maintaining this lead, but consolidating it in the era of generative artificial intelligence.

    Full financial results and forecasts for the rest of the year will be unveiled on 17 July. Given the continued demand for AI chips and increasing pressure on infrastructure manufacturers, the market is expecting another strong quarter. TSMC is already becoming an informal barometer of the health of the entire technology sector in the era of the computing boom.