Intel had to learn the hard way that they couldn't keep up the status quo in producing their CPUs. The way Intel competed against AMD in the past was by producing chips with higher frequencies than AMD's. They kept this up well into the Pentium 4 32-bit days and the single-core 64-bit days of the 2000s.
The Intel Extreme Edition chips were the hottest-running of them all. These were also the chips that spawned the early days of janky DIY water-cooling systems that leaked, all just trying to cool those CPUs! People were becoming avid overclockers by then too, pushing the limits until they fried their chips.
So what did Intel learn to do better? Pushing frequency just makes chips hotter, so the better way was to add cores. First it was dual core, then quad core, then hexa core, then octa core. And now there are chips with 12 cores, 16 cores, 24 cores, even 32 cores; it's crazy!
Adding more cores lets the chip run tasks in parallel instead of forcing everything through a single core at an ever-higher clock. The work gets spread out, so each core can run at a lower frequency and voltage. That keeps temperatures down, which in turn extends the life of the chip.
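The idea of spreading work across cores can be sketched in a few lines of Python. This is a minimal illustration, not a benchmark: the function names and the chunking scheme here are made up for the example, and the actual speedup you see depends on your core count and workload.

```python
# Sketch: split one CPU-bound job into chunks and run each chunk in its
# own process, roughly one per core, then combine the partial results.
from concurrent.futures import ProcessPoolExecutor
import os

def sum_of_squares(bounds):
    # Partial job: sum i*i over the half-open range [lo, hi).
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=None):
    # Default to one worker process per available core.
    workers = workers or os.cpu_count() or 1
    step = n // workers
    # Carve [0, n) into one chunk per worker; the last chunk absorbs
    # any remainder so the whole range is covered exactly once.
    chunks = [(w * step, (w + 1) * step if w < workers - 1 else n)
              for w in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum_of_squares, chunks))

if __name__ == "__main__":
    n = 100_000
    # The parallel split must give the same answer as the serial run.
    assert parallel_sum_of_squares(n) == sum_of_squares((0, n))
    print("parallel result matches serial result")
```

The catch, as the next paragraph gets into, is that the software has to be written to split its work up like this; a program that funnels everything through one thread gains nothing from the extra cores.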
Gaming is generally what really pushes the frequencies. But for the longest time, games wouldn't utilize more than one core. It literally took six years before games started using dual cores, and another couple of years before they started using four to six cores.
This is why, these days, people say the sweet spot for gaming notebooks is a 6-core CPU. For the hardcore gamer, though, you gotta go for the 8-core.