
Was Intel’s decision to (almost) kill Core M deceptive, or was it inevitable?

Two years ago, Intel introduced the Core M brand. The company hoped the new name would provide a rallying banner for its less powerful Core hardware, yet also mark a difference between Core chips built for the budget market and those built for extremely slim systems. The crossroads between low-voltage and low-cost has always caused confusion. All of Intel’s budget processors draw little power, but not all miserly chips are built for budget systems.


Core M had problems from the start. Despite Intel’s best efforts, people caught on to the performance gap between Core M and Core. Core M became synonymous with slow, even if that characterization was a bit extreme. That might have been okay had it inspired the dramatic 2-in-1 designs Intel intended — but few came, and those that did were often underwhelming.

The result? An unloved line of processors.

Now, Intel has decided to all but kill Core M. The brand will continue only with certain “m3” grade components. Otherwise, the low-power chips will be sold as Core i5 and i7, albeit with slightly different model names than their full-fat brethren.

Is Core M’s death about deception?

This move has caught some flak. But I don’t think that’s warranted.

Why my lack of outrage? Mostly, it’s because this is unlikely to leave anyone with a bum PC. Core M isn’t the fastest, but it’s still plenty fast for anything short of gaming or workstation-grade productivity. The problem isn’t the hardware, but the fact that it’ll probably never be popular enough to justify its existence as a separate brand.

And if you want to do either of those things, whether you buy Core M or not is irrelevant.

Here’s an example. According to our benchmarks, the Core m3-powered Asus Zenbook UX305CA needs about an hour to encode a four-minute, 20-second 4K trailer to H.265 in Handbrake. The Zenbook UX305UA, with a Core i5 processor, needs about 27 minutes.


Yes, that’s a big difference – but both of these devices are just flat-out bad at this task. You shouldn’t buy either for video encoding or any seriously demanding software. It’s a similar story with games. Asking yourself “Should I buy Core M, or Core i5?” often meant you were asking the wrong question.

To put it differently, I’m not convinced that knowledge of a specific processor is what folks need to worry about. I think they do a lot better by thinking broadly about what they want. Want incredible CPU performance? Well, you’re buying a quad-core. Want to play games? Well, you’re buying discrete graphics. Want battery life? Well, you’re buying a dual-core, and you can chuck graphics out the window.

The “average” person does a lot better when focusing on these broad strokes than when trying to dive into the minutiae of model numbers.

Reversal of fortune

Besides, Intel isn’t treading new ground here. Core M was not an entirely new concept when it came into existence, but rather a fork of Intel’s earlier “Y”-series components, which had existed for years. By killing off (most) Core M chips, Intel is simply reverting to its earlier strategy.

I don’t see Intel making its lineup more confusing. I see Intel correcting a screw-up. The company thought Core M would attract attention to the thinnest, lightest PCs. Instead, it caused more confusion than it was worth. It made manufacturers design the wrong products, and it made consumers ask the wrong questions.

Imagine that Intel hadn’t started to kill Core M. Would buying a laptop be simpler? Not at all. Intel would merely have another brand to choose from, making the process more complex. If you want proof, just ask yourself which is better — Intel’s Core i5-6200U, or the Core m7-6Y75? I’ll see you in a few hours.

So, Core M? I say good riddance. No one wanted it, and it never offered clarity.

Matthew S. Smith
Matthew S. Smith is the former Lead Editor, Reviews at Digital Trends. He previously guided the Products Team, which dives…