Today, let’s talk about Intel. On October 16, Bloomberg reported that Qualcomm plans to decide in early November, after the U.S. election, whether to pursue an acquisition of Intel. Based on earlier rumors, Qualcomm may want only Intel’s CPU design division. Either way, a potential Qualcomm acquisition of Intel is one of the most closely watched events in Silicon Valley right now. If the deal goes through, Qualcomm would become a mega-giant of the industry. But Qualcomm doesn’t yet know who the next U.S. president will be, nor how the new administration will approach antitrust, and that uncertainty has made it hesitant to act.
As for who might be feeling the most pressure right now, it’s probably Intel itself. Intel has weathered several crises in its history, but the data suggests none has been this severe. In September, word came that the Dow Jones Industrial Average was considering removing Intel. To be clear, the Dow consists of 30 of the most prominent blue-chip companies in the U.S. stock market, and a company is generally removed only after a significant and lasting decline. Intel is also carrying out the largest layoffs since its founding, planning to cut 15,000 employees, with all cuts completed by November 15.
Intel is like a dying elephant, with Qualcomm circling like a vulture, already calculating where to take the first bite.
A company in crisis isn’t exactly news. However, Intel’s case is worth discussing separately because of its unique position in the industry.
Everyone knows Intel makes CPUs, but the significance of Intel extends far beyond CPUs. To understand how important Intel is in tech, consider its founders.
First, there’s Gordon Moore, who proposed Moore’s Law; he passed away in 2023. Moore’s Law, as commonly stated, holds that the number of components on an integrated circuit doubles roughly every 18 to 24 months while the cost stays the same, effectively doubling performance per dollar as well.
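To make the claim concrete, the prediction can be written as simple compounding. This is only a rough sketch, assuming the popularly quoted doubling period $T$ (the figure Moore himself used drifted between one and two years):

$$N(t) = N_0 \cdot 2^{\,t/T}, \qquad T \approx 18 \text{ to } 24 \text{ months}$$

Here $N_0$ is today’s component count and $N(t)$ the count after $t$ months. With $T = 24$ months, one decade gives $N(120) = N_0 \cdot 2^{5} = 32\,N_0$: roughly thirty-two times the components at the same cost.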
However, Moore’s Law is not an actual physical or mathematical law; it was simply Moore’s forecast for the semiconductor industry’s growth. Yet this forecast became the goal for the entire industry: semiconductor companies all strove to keep pace with it, which directly set the tempo of global technological progress.
In other words, Gordon Moore singlehandedly set a KPI for the entire semiconductor industry. Today’s smartphones, tablets, and fast, affordable computers all owe something to this KPI.
Then there’s Robert Noyce, co-inventor of the integrated circuit and often described as a gentler version of Steve Jobs. Jobs did regard Noyce as a mentor. Noyce had talent comparable to Jobs’s, but without Jobs’s abrasive temperament: peers admired Jobs without necessarily liking him, while Noyce was well liked across Silicon Valley. When Japanese semiconductor makers pushed into the U.S. market and Silicon Valley needed a leader to press the U.S. government for action, it chose Noyce as its “chief.” He even earned the nickname “Mayor of Silicon Valley.”
The third founder, Andy Grove, is widely regarded as one of the greatest CEOs in Silicon Valley history. Grove almost singlehandedly set the template for running a tech company and provided a cultural model for the entire industry. He famously said, “Only the paranoid survive,” meaning that tech companies must keep reinventing themselves. In Intel’s early years, its main business was not CPUs but semiconductor memory, and when that business began to fail, many inside the company were reluctant to abandon it, since memory was Intel’s founding specialty. Grove famously asked: if a new CEO were brought in right now, what would he do? The answer was obvious: get out of the memory business. So why wait? Grove was already paving the way for modern tech management, urging companies to let go of their attachment to past achievements.
In today’s terms, Grove’s approach was: don’t cling to the market you already have. It echoes Zeng Guofan’s maxim of “don’t dwell on the past, don’t clutter the present, don’t anticipate the future,” which later became a mantra for many tech companies.
Intel provided a template for founding teams, too: an ideal team should have three members, a doer, a thinker, and an external communicator. At Intel, Noyce was the communicator, Moore the thinker, and Grove the doer. The idea was originally proposed by Peter Drucker, but it was Intel that perfected it in practice.
Intel also pioneered Silicon Valley’s emphasis on talent and technology.
Furthermore, Intel helped the venture capital industry mature. When Intel was founded in 1968, American investment money was still scattered, and raising funds meant knocking on many doors. Yet Intel made fundraising history: within 48 hours of opening its round, it was bombarded with calls from eager investors. That success emboldened engineers who had been hesitant to strike out on their own, triggering a wave of startups and a flood of venture money into the market, and marking the beginning of the modern venture capital industry.
In short, Intel has left many significant legacies.
At this point, you may wonder: if Intel was so powerful, why is it struggling today? When explaining the decline of tech giants, many people reach for the term “disruptive innovation”: a more advanced technology overtakes an older one. Smartphones and AI, the story goes, posed challenges Intel couldn’t answer and pushed it into decline.
However, on closer examination, this theory looks shaky. A giant like Intel doesn’t operate with blinders on; it maintains a broad technology portfolio. If a superior technology appeared, Intel could acquire it or develop its own version. It wouldn’t just sit back and wait to be disrupted.
Here lies a common misconception: many people believe disruptive innovation means advanced technology replacing outdated technology. In fact, Clayton Christensen, who coined the term, had a more nuanced view.
Christensen argued that technological transitions are often driven not by high-end technology replacing low-end technology, but rather the opposite: low-end technology overtakes high-end technology.
Take hard drives, for example. Before personal computers, the consensus was that bigger was better for hard drives—large hard drives had superior performance, especially as early computers were primarily used by companies without size constraints. This led top companies to focus on large drives, with smaller drives left to second-tier players. However, when personal computers gained popularity, demand for small drives skyrocketed, allowing companies that previously focused on small drives to leapfrog the big-drive manufacturers.
Intel faced a similar dynamic. Its CPUs have long been the core of the computer, the central component for computation and control. NVIDIA’s GPUs were originally designed for graphics rendering, a far less central role. But demand from 3D gaming, visual effects, and later Bitcoin mining kept surging, and each wave strengthened NVIDIA. Intel initially dismissed these markets, but together they paved NVIDIA’s path right up to the AI boom. NVIDIA’s CEO Jensen Huang even said the company targets the “zero-billion-dollar market,” a market that doesn’t exist yet but, once it does, will be worth billions.
In Christensen’s view, tech giants are overtaken not because they lack technical prowess but because they dismiss certain opportunities. When Jobs approached Intel about chips for the phone, Intel turned him down: its chips seemed too expensive, integrating them into a phone posed problems, and Intel doubted the smartphone market’s potential. What Intel didn’t anticipate was that this once-“niche” market would become the mainstream, and Intel was gradually displaced from it, not by any sudden technological breakthrough, but by a prolonged shift in market demand.