Hacker News

When did CPUs break 5-6GHz? It feels surprising to see such a high GHz number after being conditioned that "clock speed can't go up anymore, so the number of cores goes up" and not seeing anything much higher than 4GHz.


Intel got 5GHz back in 2019 with the i9-9900KS. The X900-suffix lines have been Intel's "screw it, we need the highest single-core performance" product slot for a long time, and what that SKU can do is really not a real reflection on any other part of the product line or what is even "reasonable", just what is "possible" (barely).

On the other hand, Intel did get the 12700k to 5GHz in 2021. That's an actual flagship part that actual people might actually buy for an actual purpose.

When Intel ran into their 14nm woes and started actually getting pushed by AMD, Intel responded by upping their core counts (part way) to erode the AMD core count advantage, while really leaning into their single core performance advantage (which was truly a real thing in Zen1/2) by really starting to up their clock speeds.


> not a real reflection on any other part of the product line or what is even "reasonable", just what is "possible" (barely)

As an example: to hit these speeds, Intel and AMD are pushing 1.4V into chips built on processes that work best at and below 1V. And the result isn't a stable, long-lasting chip: https://news.ycombinator.com/item?id=39478551


This is also why servers and chips that run for a long, long time (think servers that aren't decommissioned for years) don't run at such high voltages and clock speeds.

What I would want to know is what happens to these good binned chips when you lower their voltages and frequencies and run them at a more reasonable 3.5GHz or something for a long time. Is the price/power/performance ratio better than a server chip's, or worse? I would love some data on that. But no one is buying these for that purpose, and server chips have other things going for them, like replacement parts being available for years. Still, it would be interesting to find out whether these are the best 'quality' chips out there.


> What I would want to know is what happens to these good binned chips when you lower their voltages and frequencies and run them at a more reasonable 3.5GHz or something for a long time

I'm not sure about Intel chips, but AMD's last couple of generations have an "eco mode" which underclocks the chip for you. You get about 85-100% of the performance while consuming 60% as much power. The chips probably last way longer like that too. Nvidia's GPUs have similar options: 3090/4090 cards can have a totally reasonable power budget if you're happy to lose ~10% of your framerate.

Ars Technica added eco mode to their benchmarks:

https://arstechnica.com/gadgets/2023/03/ryzen-7950x3d-review...
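A quick back-of-envelope on what those eco-mode numbers imply for efficiency (the performance and wattage figures below are assumed from the rough numbers quoted above, not measured):

```python
# Perf-per-watt comparison for AMD's "eco mode", using assumed figures:
# ~95% of stock performance at a 105 W limit vs 100% at the 170 W stock limit.
stock_perf, stock_watts = 1.00, 170
eco_perf, eco_watts = 0.95, 105

ppw_stock = stock_perf / stock_watts
ppw_eco = eco_perf / eco_watts
print(f"eco mode perf/W advantage: {ppw_eco / ppw_stock:.2f}x")  # ~1.54x
```

So even giving up a few percent of performance, perf-per-watt improves by roughly half again, which matches the "huge drop in power, effectively no performance loss" readings below.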


Where are you getting that ECO mode causes a 15% performance drop?

In my pretty unscientific testing, I saw ECO mode have zero impact on single core workloads and ~4% on multicore with a big drop in cpu temperature.

If I am reading the “Gaming CPU” chart correctly from your link, the AMD 170w performance was 114.8 FPS, but the 105w ECO performance was 111.4 FPS. Effectively nothing for a huge drop in power.


> Where are you getting that ECO mode causes a 15% performance drop?

It depends on the test. In that link I posted, 3dmark’s cpu test shows eco mode drop the score from 19000 to 14000. Which I guess is more like a 25% perf drop. In other tests, performance increased in eco mode.

It seems like it depends a lot on the workload.


Was this on a laptop or desktop? If a huge drop in power results in no meaningful drop in performance I would say that is a sign that previously you were hitting a point where heat was more of a problem than power.


My testing was on a desktop with a way over specced cooler. Regardless, the above link shows that they were getting huge power drops for basically no loss in performance.

Which makes sense to me. We have seemingly run out of cheap architectural wins. CPUs and GPUs keep cheating by amping up the power draw to improve the performance numbers for vanishingly small returns. Getting an extra 5% performance out of these chips can require tens of watts.


Thanks. The 7950X doesn't have a non-X variant. So it seems the non-X chips are basically doing the same thing: giving you lower clocks for lower wattage, with an option to tinker at your own risk to push the chip, since those bins are not as good as the X variants.


I run a 12500 undervolted by 50mV, which gives me about 20% lower consumption.
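For scale: dynamic CPU power goes roughly as V²f, so the voltage term alone only accounts for part of a 20% saving; the rest presumably comes from leakage falling as the chip runs cooler. A sketch (the 1.25 V stock voltage is an assumption for illustration, not the 12500's actual VID):

```python
# Dynamic power scales roughly with V^2 * f at a fixed clock.
v_stock = 1.25                  # assumed stock core voltage (illustrative)
v_undervolt = v_stock - 0.050   # the 50 mV undervolt from the comment
scale = (v_undervolt / v_stock) ** 2
print(f"dynamic power scale from V^2 alone: {scale:.3f}")  # ~0.92
```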


I design chips in similar technologies. These binned parts are fast-fast parts, meaning both the PMOS and NMOS transistors have lower threshold voltages than typical, due either to the stochastic nature of production or to intentional skewing of the process during fabrication. You can run them at faster clock speeds. If you stack higher voltage on top, you can run quite a bit faster. For a while. The transistors age. Aging is an exponential function of voltage level and temperature, and a linear function of clock frequency. With aging, the device threshold voltages increase rapidly, so the devices get slower. On top of this there is electromigration (EM), which increases interconnect resistance. With these effects combined, you have a horrible product lifetime at those conditions. After failing to work at 5-6GHz at 1.4V, the part would most likely still work at a lower clock frequency, because in essence it becomes a slower chip.

Now to answer your question: a fast-binned part operating at nominal conditions will perform exactly the same as a typical chip of the same series. The power consumption would be 10% or so higher due to higher dynamic power and leakage. The lifetime would be much longer than a typical device's, though. So, in short, it wouldn't be outperforming anything.
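The aging relationship described above (exponential in voltage and temperature, linear in frequency) can be sketched as a toy model; every constant here is invented for illustration and is not from any real process:

```python
import math

# Toy transistor-aging model following the comment above: aging rate is
# exponential in voltage and temperature, linear in clock frequency.
# Exponents and baseline operating point are assumptions, not real data.
def relative_aging(v, temp_c, freq_ghz, v0=1.0, t0=60.0, f0=4.0):
    """Aging rate relative to a baseline operating point (v0, t0, f0)."""
    voltage_term = math.exp(5.0 * (v - v0))        # assumed exponent
    thermal_term = math.exp(0.05 * (temp_c - t0))  # assumed exponent
    freq_term = freq_ghz / f0                      # linear in frequency
    return voltage_term * thermal_term * freq_term

# Pushing an assumed 1.4 V / 95 C / 6 GHz vs a nominal 1.0 V / 60 C / 4 GHz:
print(f"{relative_aging(1.4, 95, 6.0):.0f}x faster aging")
```

Even with made-up exponents, the shape of the curve shows why a modest overvolt can cost an order of magnitude or more in lifetime while the frequency term barely matters.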


I would suggest not spreading FUD unless you can cite hard evidence.

Intel warrants their processors for 3 years if they are operated according to spec, which includes supplying 1.3~1.4V as far as I'm aware.

From what I can tell, the linked thread and article concern CPUs that were either defective out of the box (and thus under warranty, regardless of usage) or were driven out of spec by mobos overclocking by default (which would void the warranty).


Wasn't there an article here on HN more recently? Anyway, check out https://semiengineering.com/the-rising-price-of-power-in-chi... which also describes a few issues with actual chip aging. That may explain why Intel, for example, gives only 3 years of warranty, despite the fact that people still believe "that is solid state electronics, it will last a lifetime!"

Also, the overall cost of design and production has been going up for years (>10 years already or so), which is why only these few fabs are left that run these advanced processes, and they need massive scale to stay profitable. This increase is, by the way, non-linear, more like exponential, and that also applies to the power consumption of the resulting chips if you want to eke out "a bit more".


I thought the 3 year thing was standard for many years, pretty much as long as there have been retail-boxed CPUs.

I think the logistics of a long term CPU warranty might be difficult too; aside from a few perennial, typically embedded market, products, manufacturers want to close out product lines and keeping warranty spares for decades becomes a burden.

Imagine the fracas at the RMA office: "He says he doesn't WANT a new 14900KS, he wants a 50MHz 486SX2 to replace the one that blew out, and he has the state attorney general on line 2."


Most PCs are operated for way longer than 3 years, but still: are there any reports of dead CPUs that were operated in spec? With millions of devices out there, you'd expect some to pop up, but I only ever hear of dead HDDs or defective RAM.


I recently had a motherboard die on me, I think at the age of 4 years and a bit... Everything else is still working fine. So that might be in the pile of things that could act up in some cases.


Had a 6900K bail on me after 4-ish years. Got a new one through the extended warranty.


It's not FUD to point to real problems that are occurring in the field. A processor that's unstable is still unstable even if it's covered by warranty. And to the extent that these real issues are being caused by aggressive motherboard defaults, Intel is not absolved: they have huge leverage over their motherboard partners, and are seemingly failing to use it to ensure users get a good, stable experience out of the box (likely because Intel cares more about high scores on popular benchmarks than about subtle stability issues in this market segment).


Yes, real problems stemming from factory-defective products or products that were driven out of spec. The article linked even admits they (author or publisher) don't have enough evidence to pinpoint what the problem is; the best they could do was the aforementioned. Nowhere do they say "Driving 1.4V is killing CPUs." or something similar; just potential workarounds like reducing clock multipliers below spec and configuring mobos to enforce Intel's power limits.

Drive known-good products according to published specifications at load for statistically significant durations. If the results are that the majority of products fail to perform as warranted, then we can talk about how Intel (and I guess AMD) are driving their products to the point of failure.

Otherwise in the absence of such data, I'm going to look at the silent majority satisfied with their purchases and infer that the products concerned are working fine.


> If the results are that the majority of products fail to perform as warranted, then we can talk about how Intel (and I guess AMD) are driving their products to the point of failure.

That's a stupidly high bar. Recalls and class-action lawsuits don't need to be justified by failure rates as high as 50%, and I'm merely discussing that there are signs of trouble, not demanding a recall or other serious action from Intel. Intel's recent top of the line desktop chips are misbehaving in a way that is genuinely noteworthy, even if we don't have the impact solidly quantified and don't have a smoking gun. It's worth discussing, and worth keeping an eye out for similar issues from other chips that are being pushed to similar extremes.


>Intel's recent top of the line desktop chips are misbehaving in a way that is genuinely noteworthy, even if we don't have the impact solidly quantified and don't have a smoking gun.

And all I am asking is for you to cite proper evidence for your claim. The article you linked does not say driving 1.4V is damaging the CPUs, it's actually explicit that the cause is unknown. Speaking more broadly, most people who have bought the CPUs concerned have had no problems (or at least do not voice such concerns).

To reiterate, I am asking you to cite evidence for your claim that "Intel and AMD are pushing 1.4V into chips built on processes that work best at and below 1V. And the result isn't a stable, long-lasting chip." If you can't or won't, this is just FUD.


If the instability is something that develops over time as a genuine change in behavior of the chip, and not merely an artifact due to the evidence of instability taking time to pile up, then the extreme voltages are by far the most plausible culprit. And if on the other hand these chips are slightly unstable out of the box, despite the high voltages required to hit these peak frequencies and record-setting benchmark scores, it suggests that the clock speeds are being pushed too far.

Either way, the high operating voltages compared to what we see in laptop and server CPUs (and GPUs for any market segment) are worth raising an eyebrow at. At a minimum, they're a symptom of the desperation Intel and AMD have for perennially leapfrogging each other in ways that are increasingly irrelevant to the average customer and the rest of their product stack.


Some of this is modeled over expected lifetime / usage. In general, things such as electromigration, self-heating, bit-cell degradation, etc. are modeled for either 3, 5 or 10 years, depending on the CPU, SKU, and target market. Now, whether the process corner(s), voltage, and frequency that were picked to perform this analysis are a good reflection of how the CPU is actually being used/pushed is a different matter.


This explains a lot about my computing experiences over the last few years. I've been buying absolute top-end hardware, and have consistently been running into a plethora of weird technical issues unlike anything in my previous experience. I even had to RMA a 12900KS directly with Intel after it couldn't run bug-free at stock settings.


>If the instability is something that develops over time as a genuine change in behavior of the chip, and not merely an artifact due to the evidence of instability taking time to pile up, then the extreme voltages are by far the most plausible culprit.

That's all fine and dandy, but can you please cite some evidence to support those claims?

This should not be such a farfetched request.


It would be nice to refer to something that does not say the reason is unknown, anyway.


Exactly, a warranty is fine in theory, but I care about machine uptime and where my own time is going.

Often the time to deal with warranty exceeds the cost of a new component, in which case I don’t bother and just try a different brand.

Case in point, had some trouble with WD NVME drives - just tossed them, bought Samsung, and moved on.


There was no FUD there at all. Grow up.


It is FUD to state that 1.4V damages CPUs without citing evidence to support that statement.

Personally, I have a 14700K in my desktop and my laptop has a 12700H. Both routinely push 1.3~1.4V under load operating according to spec. If that is causing damage, I certainly would like to know and I asked for citation. I see none after prodding, so as far as I'm concerned it's FUD.


People say a lot of things that are not technically true. CPU speeds will probably keep going up for years or decades. But when CPU clocks were rising, the routine was things like 4MHz -> 8MHz every other year or so. Going from 4 to 6GHz over around 10-20 years is, effectively, flat compared to the rate of change people were dealing with before the year 2000.
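Putting annualized numbers on that (the year spans are rough assumptions, not precise release dates):

```python
# Annualized clock-speed growth: doubling every ~2 years (e.g. 4 -> 8 MHz)
# vs going from ~4 GHz to ~6 GHz over roughly 15 years.
golden_age_rate = 2 ** (1 / 2) - 1     # doubling every 2 years: ~41%/yr
modern_rate = (6 / 4) ** (1 / 15) - 1  # 1.5x over ~15 years: ~2.7%/yr
print(f"pre-2000: {golden_age_rate:.1%}/yr, since: {modern_rate:.1%}/yr")
```

Roughly a 15x difference in annual growth rate, which is why 6GHz feels like a rounding error to anyone who lived through the MHz race.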


https://www.extremetech.com/computing/158178-amd-unveils-wor...

Officially, 10-11 years ago. (This first 5GHz CPU wasn't that impressive, and was part of the reason "chasing MHz" started to fade into irrelevance.)

For modern AMD, it wasn't until 2022 that they ventured back across 5GHz, but this time it's much more impressive.

https://www.theverge.com/2022/5/23/23137217/amd-ryzen-7000-c...


Before that, IBM POWER6 reached 5 GHz in 2008: https://en.wikipedia.org/wiki/POWER6


Just today there was news of a 9GHz overclocking record. It was cooled with liquid helium though.



The 6.2 GHz is just the boost frequency; the base frequency is still well below 4 GHz, at 3.2 GHz. So your CPU can essentially go into berserk mode for a few seconds and nearly double its frequency, but cannot maintain it. This is similar to a marathon runner who can sprint intermittently.


The 14900K(S) has an unlimited boost duration. It'll run those clocks until your cooler can't handle it anymore, they never give up on their own. Intel added this not that long ago to differentiate the xx900k from the xx700k.


"The 14900K(S) has an unlimited boost duration."

Okay, but unless you have cooling with liquid nitrogen you cannot maintain 6.2 GHz for long due to thermals in practice, hence after a short sprint it will fall back to a much lower frequency.
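To make the cooler-vs-boost tradeoff in this subthread concrete, here's a toy simulation; every constant in it (package power, cooler capacity, thermal mass, clocks) is invented for illustration, not measured from any real 14900KS:

```python
# Toy model of "boost until the cooler gives out": the chip holds its
# boost clock until die temperature hits a limit, then steps down.
def final_clock_ghz(cooler_watts, steps=600):
    boost_w, base_w = 320.0, 125.0   # assumed package power at boost/base
    t_limit, t_ambient = 100.0, 25.0
    heat_capacity = 50.0             # J per degree C, invented
    temp, clock = t_ambient, 6.2     # start at full single-core boost
    for _ in range(steps):           # one second per step
        power = boost_w if clock > 5.0 else base_w
        cooling = cooler_watts * (temp - t_ambient) / (t_limit - t_ambient)
        temp += (power - cooling) / heat_capacity
        if temp >= t_limit:
            clock = 4.5              # throttle; stays throttled in this toy
    return clock

print(final_clock_ghz(200))  # modest cooler: hits the limit, drops to 4.5
print(final_clock_ghz(400))  # oversized loop: equilibrium below the limit
```

With these made-up numbers, the smaller cooler hits the thermal limit and falls back to the throttle clock, while the bigger one settles at an equilibrium temperature below the limit and holds boost indefinitely, matching the "unlimited boost duration, if your cooler can take it" behavior described above.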


Very recently:

12900K: 5.2 GHz peak

13900K: 5.8 GHz peak

14900K: 6.0 GHz peak

14900KS: 6.2 GHz peak


For how many milliseconds ?


Single core only, for a good few seconds at full whack - if you have the enthusiast-grade cooling


Which sort of defeats the whole purpose for >95% of the target market: super loud machines are extremely annoying to everybody, and all non-IT (and most IT) folks look at them as a failed engineering/PC building effort.


Custom-loop watercooling (not the all-in-one kind), and especially direct-die (delidded) cooling, can just about tame the beast. Someone has linked the der8auer vid where he does it.

Of course, the market for enthusiast stuff like direct die watercooling and KS-series purchasers have a significant overlap <:o)


I have yet to read a how-to-build-your-own-PC review that recommends water coolers (or more exotic stuff) as risk-free and maintenance-free. I don't mean for just a year or two.

I've been running my desktop since... 2018? and have never had to think about noise, power consumption, or worries about short-circuiting everything. There is no amount of computing power in this world worth losing that (for me; I know I am pretty far from the enthusiasts on this).


I don't think air cooling is going to do it for these, though even watercooling can become loud if stressed. I guess phase-change heat pumps are the logical next step. Peltier cooling is of course totally silent, and enthusiasts dabbled in it already back in the 90s; I'm not sure how common it is these days outside astrophoto imaging rigs. But you can fairly easily get temperature deltas of around 100 kelvin with a Peltier heat pump. You'd still have to dissipate the heat somehow, of course.


I'd be really worried about condensation with peltier.


Phase change enthusiasts already have to deal with condensation.


If you tweak BIOS settings you can easily run one core at those speeds, or two cores 100MHz lower, indefinitely without thermal worries.


I have a 12900K and was transcoding video on it yesterday. It stays stable at ~4.5GHz indefinitely when working on all cores.


2010, IBM z196 was 5.2GHz, then in 2012 the zEC12 was 5.5GHz. As always, the consumer world follows with a bit of lag.


I was surprised too. Then I checked my Ryzen laptop and... yep, max speed 6GHz.

I hadn't noticed before either.

But given that my laptop pulls this off at a way lower TDP, it makes me wonder how competitive that Intel chip is, especially as the article's title already says it has a huge power consumption...


If you have the ability to cool the cpu then it's not a hard limit; just one that's not practical for mainstream consumers.


A long time ago. AMD was famously first with 5GHz 10 years ago, and then Intel not too long after.


AMD got 5GHz with Raphael in 2022; Intel hit it a few years earlier with Coffee Lake in 2018.

(no, fx-9590 does not count)


Maybe never? Bram Nauta's ISSCC 2024 talk covered this and is worth watching.



Sustained or multicore clock speeds cannot really go up. But it's easier to force a single core up to 6GHz for 30s or so, which is really what this is talking about.


Technically AMD had CPUs boosting to 5GHz in 2013, but it was the horrible Bulldozer FX series with abysmal IPC.



