In an era of rising prices, computers have gotten cheaper. (And why that may end)

Computing has been one of the few areas where prices have decreased over time while many other things have seen large increases. Technological advances have underpinned a consistent drop in the cost of computing, but experts say that this may be reaching the end of the road.
Getty Images / Emily Bogle/NPR

NPR's series Cost of Living: The Price We Pay is examining what's driving price increases and how people are coping after years of stubborn inflation. How are higher prices changing the way you live? Fill out this form to share your story with NPR.

What's the item?

MacBook Pro laptop

How has the price changed since before the pandemic?

It has dropped $200. Today's entry-level MacBook Pro starts at $1,599. It has a 14-inch screen, 16 gigabytes of memory and a 512-gigabyte internal solid-state drive. The comparable MacBook Pro from five years ago, with the same memory and storage (but only a 13-inch screen), cost $1,799.

Why has the price fallen?

Pricing is an art form, and price tags can depend on a wide range of factors beyond the cost of labor and materials — market positioning, competition, company culture, consumer psychology and so forth. Apple and others often maintain steady price points for key products as a strategic choice. (Fun fact: Apple also tends to set prices that end with the number 9 — $999 for a MacBook Air, $6,999 for a Mac Pro, $549 for AirPods Max, etc.)

But there's a technical reason why computers as a whole have become cheaper over time: Moore's law.

Gordon Moore, a chip expert and co-founder of Intel, postulated that the number of transistors on microchips would double every 24 months or so thanks to advances in miniaturization technology. Transistors are the little switches that make digital processing happen. They control the flow of electricity — the ones and zeros of computing.

As transistors have shrunk, the price per transistor — and thus the price of computing — has plummeted. Being able to reliably double how many of them could fit onto a chip allowed computers to become smaller and more powerful without driving up their cost. It has given us computing power that would have been inaccessible or even unimaginable in the past.
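To make that arithmetic concrete, here is a minimal sketch in Python. The starting transistor count and the flat chip price are purely illustrative assumptions, not actual chip data; the point is only that doubling the transistor count at a roughly constant price halves the cost per transistor each generation.

```python
# Illustrative sketch of Moore's law economics (hypothetical numbers, not
# actual chip data): if transistor counts double roughly every 24 months
# while a chip's price stays about flat, the cost per transistor halves
# with each generation.

chip_price = 300.0       # assumed constant chip price, in dollars
transistors = 2_300      # assumed starting count, on the order of an early-1970s chip
year = 1971

for _ in range(10):      # ten doublings, about 20 years
    cost_per_transistor = chip_price / transistors
    print(f"{year}: {transistors:>10,} transistors, "
          f"${cost_per_transistor:.8f} per transistor")
    transistors *= 2     # Moore's law: double every ~24 months
    year += 2
```

After ten doublings the count is roughly a thousand times larger, so the cost per transistor has fallen by roughly a factor of a thousand, which is why the same dollars buy vastly more computing a couple of decades later.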

It's the main reason that there are mass-market smartwatches today that have more power than the computers on the Apollo 11 lunar mission. And it's why computers, which were once behemoths so expensive that only businesses and universities could afford them, are now small enough to fit onto a desktop or into your pocket.

At the Computer History Museum in Mountain View, Calif., docent Scott Stauter demonstrates an IBM 1401, a mainframe computer from the early 1960s. It fills a room the size of a classroom, and it runs on punch cards and reel-to-reel tapes. It once cost hundreds of thousands of dollars. And it had only the equivalent of 16 kilobytes of memory.

"At home, my laptop has 16 gigabytes of memory. That's 16 billion bytes," says Stauter. "That's a million times more than the maximum that a 1401 could have."

Computer buyers can now get more bang for fewer bucks even over the span of a few years, as the MacBook Pro shows. And because chips are now in everything, that means other kinds of electronics have also become cheaper over time.

Take 55-inch OLED flat-screen TVs, for example. The first one hit the market in 2013 for over $10,000. Today, you can pick one up for under $1,000. Smartphones are another example. Samsung's newest model in 2020 started at $999.99. This year, the newest version was $799.

Moore, who died in 2023, knew his law had as much to do with economics as it did with physics. "I was just trying to get across the idea that integrated circuits were going to be the route to cheap electronics, something that was not clear at the time," Moore said in a 2008 oral history interview in the Computer History Museum's archives.

What are people doing about it?

They got used to it.

"Miniaturization was something that happened very regularly, and people could kind of count on it," says Neil Thompson, an innovation scholar at the Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Lab and the university's Initiative on the Digital Economy.

Moore's law enabled generations to believe that computers would always become better — and to buy more of them. People may now own several computers — in the form of laptops, tablets or smartwatches — as well as other devices with computers embedded within them, everything from cars to refrigerators.

But Moore's law may be hitting its limit. Transistors are getting so small — tens of billions can fit on a chip now — that experts say the laws of physics are slowing the reliable pace of progress.

"During the heyday of Moore's law, miniaturization gave us chips with more transistors, and it also meant that each transistor used less power," Thompson says. "Today, miniaturization is giving us much smaller reductions in power, and so trying to cram in too many transistors produces a lot of heat and can melt a chip."

He says that the predictability that Moore's law provided will wane in the coming decade and that it will take other technological breakthroughs to create new gains in efficiency and drops in price.

One example is software. Thompson says the steady march of progress underpinned by Moore's law meant that computer system designers could get away with code that was sometimes inefficient. He says there are significant computing gains to be mined by improving software.
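A minimal sketch of that idea in Python; the task and the numbers here are my own illustration, not an example from Thompson. The same membership-counting job is written wastefully, with repeated scans of a list, and then efficiently, with a set, producing the same answer with far less work.

```python
# Illustration of software-efficiency gains (hypothetical example): when
# hardware kept getting faster on its own, the slow version below was often
# "good enough." Restructuring the code recovers the lost performance.

import time

items = list(range(50_000))
targets = list(range(0, 50_000, 7))

# Wasteful: scans the whole list for every lookup.
start = time.perf_counter()
hits_slow = sum(1 for t in targets if t in items)
slow = time.perf_counter() - start

# Efficient: build a set once, then each lookup is near-constant time.
lookup = set(items)
start = time.perf_counter()
hits_fast = sum(1 for t in targets if t in lookup)
fast = time.perf_counter() - start

print(f"slow: {slow:.3f}s, fast: {fast:.3f}s, same result: {hits_slow == hits_fast}")
```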

Chip designers and manufacturers say chip packaging is another way to squeeze more out of the technology. Packaging refers to the ways in which individual chips are hooked up to others to form powerful sets.

Apple is a financial supporter of NPR.

Copyright 2025 NPR

John Ruwitch
John Ruwitch is a correspondent with NPR's international desk. He covers Chinese affairs.