Never-ish. We've been in the ~4GHz holding pattern for more than a decade. Maybe a different architecture and/or a change in chipmaking materials could address that more fully, but CPUs that can actually run stably at close to 5GHz across all 4 or 8 cores are apparently extreme outliers; otherwise Intel would be putting them in boxes with red flames and skulls on the side and telling every enthusiast site on the internet how great it would be to skip a mortgage payment to get one.
There's a whole bunch of reasons for that. We're approaching the limits of lithography, which means we can't easily make smaller, faster transistors. On top of that, signals propagating through silicon face issues of their own as speeds increase. If you think about it, in one clock tick of a 4GHz CPU light travels only 7.5 cm. Try keeping clock edges in sync at multiple places on a die given that fact. Forget about running PCB traces with signals that fast, at least for traces longer than a few millimeters; that's why CPUs generate their clocks internally from a much slower external clock. Multicores face scaling issues of their own, notably the fact that for most problems throwing more than 8 cores at them doesn't buy you more speed.
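If you want to check that 7.5 cm figure yourself, it's just the speed of light times one clock period; here's a quick back-of-the-envelope sketch (vacuum speed of light, so real on-chip and on-board signals have even less margin since they propagate well below c):

```python
# Back-of-the-envelope: how far light travels in one clock period.
# Real signals in silicon and copper move well below c, so margins are tighter still.
c = 299_792_458  # speed of light in vacuum, m/s

for f_ghz in (1, 4, 5):
    period_s = 1 / (f_ghz * 1e9)       # one clock tick, in seconds
    distance_cm = c * period_s * 100   # distance light covers in that tick
    print(f"{f_ghz} GHz: period = {period_s * 1e12:.0f} ps, light travels ~{distance_cm:.1f} cm")
```

At 4 GHz that works out to a 250 ps period and roughly 7.5 cm, which is why clock distribution across a die, let alone a board, gets so hairy.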
In my opinion, most of the work needs to happen on the software front. Break problems down into multiple tasks that parallel CPU cores can chew on at the same time. More importantly, end software bloat. Many of the hardware advances of the last decade have gone to waste simply because we're not optimizing the code we ship. Sometimes it's so bad a multi-GHz CPU can't keep up with a person typing. In other words, it's back to basics. Coming from the microcontroller world, I know the advantages of optimizing code well. If we bothered to do the same on our computers, most tasks would execute nearly instantaneously.
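To make the "split the work across cores" point concrete, here's a minimal sketch using Python's standard multiprocessing module; crunch() is just a made-up stand-in for whatever per-item work your actual problem needs:

```python
# Minimal sketch: farm an embarrassingly parallel workload out across CPU cores.
from multiprocessing import Pool, cpu_count

def crunch(n):
    # Placeholder CPU-bound work; swap in your real per-task computation.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 32                # independent tasks, no shared state
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(crunch, jobs)   # each core chews on its own slice
    print(len(results), "tasks done on", cpu_count(), "cores")
```

Of course this only pays off when the tasks really are independent; once they have to talk to each other, you run straight into the 8-core scaling wall mentioned above.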
In all honesty, I'm actually not too bummed that things aren't advancing all that fast. It should mean any new system people build may not show its age for a decade. Remember, I was using an Athlon XP3200 until 2012; outside of a few tasks, that decade-old CPU was still viable for most of my work. My A10-5800K should probably be good enough to take me close to my senior years. The urge to upgrade certainly isn't there anymore the way it was for many people until 2010 or so.
Of course, some breakthrough may yet make it out of the labs and change all this.