Competition between tech companies is usually most noticeable when it spills over to the consumer end. Samsung, Google and Apple all battle to be your smartphone provider. Sony, Microsoft and Nintendo each want to make your gaming console of choice. Car manufacturers are ramping up electric vehicle production to compete with Tesla’s offerings.

But one of the biggest races in tech right now is happening on a small scale — and by small, I mean nanometers. Chipmakers like AMD, Intel and Nvidia, along with companies previously focused on consumer tech, like Apple, are all competing to create the leading computer chip.

A computer chip, or more precisely a central processing unit (CPU), is essentially the brains of a computer. But unless you’re building a computer yourself, it’s something the average consumer will probably never see.

But chips aren’t only used to power computers: they can more accurately be thought of as a technology that powers nearly every aspect of modern life. All the examples listed above — smartphones, gaming consoles and electric vehicles — make use of computer chips. There are non-consumer applications, too. The factories that make those electric vehicles, for example, also use them to automate mass production.

The wide range of applications and potential customers for chips has companies racing to develop the be-all, end-all processor. Nowhere is that push clearer than in job listing data for three of the leading chip manufacturers: AMD, Intel, and Nvidia.

During a year in which all of FAANG and other major tech companies saw massive layoffs, AMD, Intel and Nvidia have all drastically increased their hiring. As of last week, Intel’s, AMD’s and Nvidia’s job listings were up 98%, 52% and 27% year-over-year, respectively, totaling thousands of open positions across the three companies.

Despite its prominence now, Nvidia wasn’t always a direct rival to the other two chipmakers. Though Intel was an early leader in the space, AMD’s Ryzen processors have been competing with Intel’s i-series since 2017, and the companies’ feud stretches back even further. Nvidia, meanwhile, was previously known primarily for its GPUs, or graphics processing units, which are often used in conjunction with Intel or AMD chips. If a CPU is a computer’s brain, the GPU is its muscle, used to display images on a screen as well as to process large amounts of data for tasks like AI and cryptocurrency mining.

Intel’s long-time dominance is still felt today — odds are, your laptop uses an Intel processor. I am writing this article on my work-provided MacBook Air, which contains one. Just to my right sits a desktop I built, which pairs an AMD Ryzen processor with an Nvidia GPU.

However, Nvidia moved beyond GPUs last year when it agreed to acquire chip designer Arm Holdings from SoftBank for $40 billion, a deal that would further cement its place as one of the leading computer part manufacturers. Even before the acquisition closes, Nvidia’s reach is vast: the company has secured deals in auto manufacturing, self-driving vehicles, robotics and more.

Nvidia, Intel, and AMD don’t often sell products directly to general consumers, but they’re starting to face competition from companies that do. In November, Apple announced its brand-new M1 chip, a proprietary processor that brings many of the functions previously performed by separate computer parts, like the GPU, under the purview of a single chip.

“Until now, a Mac needed multiple chips to deliver all of its features — including the processor, I/O, security, and memory,” the M1 announcement page reads. “With M1, these technologies are combined into a single system on a chip (SoC), delivering a new level of integration for more simplicity, more efficiency, and amazing performance.”

Since July 2019, Apple’s job listings for engineers or other positions that specifically mention “chips” have increased 33%, totaling 251 jobs. The development of the M1 means that Apple can cut ties with Intel, whose processors previously powered its computers. In fact, the new MacBook Air, MacBook Pro and Mac mini are all powered by the M1, and Apple officially announced the end of its 15-year partnership with Intel last summer.

The race is on hold, however, thanks to a global semiconductor shortage that has made it difficult to produce new chips and has caused supply crunches across multiple industries. The shortage was initially triggered by factory shutdowns at the onset of the pandemic, but it has worsened due to increased consumer demand for devices like gaming consoles and computers, as well as recent efforts by vehicle manufacturers like Ford and GM to ramp up electric vehicle production. Ford was recently forced to cut factory hours and vehicle production due to the shortage, and Microsoft and Sony have been unable to produce enough of their new Xbox Series X and PlayStation 5 consoles to keep up with demand.

Three weeks ago, President Biden signed an executive order launching a 100-day initiative to prevent future semiconductor shortages, describing the current shortage as both a labor issue and one that prevents American companies from competing on the global stage.

“Recently we’ve seen how a shortage of computer chips… has caused delays in the production of automobiles and has resulted in reduced hours for American workers,” President Biden said. “I’m directing senior officials in my administration to work with industrial leaders to find resolutions to this semiconductor shortfall.”

Until then, chip manufacturers are crouched at the starting line, ready to break into a sprint the moment the shortage lifts and demand for computer chips reaches a fever pitch.


About the Data:

Thinknum tracks companies using the information they post online, including job listings, social and web traffic, product sales, and app ratings, and creates data sets that measure factors like hiring, revenue, and foot traffic. Data sets may not be fully comprehensive (they only account for what is available on the web), but they can be used to gauge performance factors like staffing and sales.