
Intel is still the king in single-threaded performance. However, single-threaded performance will end up being the bottleneck for fewer and fewer workloads. I have to wonder, with a $690 price tag and a 320 watt power budget, who is this chip for? Team red wins most of the benchmarks with cheaper silicon.


>I have to wonder, with a $690 price tag and a 320 watt power budget who is this chip for?

There used to be a business in selling these ultra-performant ST chips to high-frequency traders, who'd overclock the absolute crap out of them. Luckily, this is no longer a business. But these kinds of products used to be the "waste stream" from that, leftovers.

But now, the KS-series Intel chips are for slightly insane gamers and overclocking enthusiasts with more money than sense. That's okay; there's a real market segment there. At work, we buy the 14700K for dev machines like sane people.


Case in point: the Xeon X5698, a 2-core 4.4GHz base-frequency Westmere made just for HFT. The regular ones were 3.6GHz at 4 cores and 3.467GHz at 6 cores, so it was quite a boost.


>There used to be a business in selling these ultra performant ST chips to high frequency traders, who'd overclock the absolute crap out of them. Luckily, this is no longer a business.

HF trading is no longer a business?


FPGAs have even lower latency than general purpose processors.


>single threaded performance will end up being the bottle neck for fewer and fewer workloads

The bottleneck will always be the fact that software is either unable or unwilling to be parallelised.


I think they're in a similar segment as sports cars or supercars, just a slightly lower bracket: those with disposable income who just want the top-tier thing.

You can get performant-enough parts for half the price and reasonable-enough parts for even less than that. But some people have a hobby where they just want the fastest thing.


Not quite: even single threaded, the workload matters.


"Single threaded" IS the workload, right? I'm confused.


Different types of single threaded work stress different parts of the Core. Some might be throughput heavy, some might require a lot of cache, ...


Aren't most games still dominated by single-thread performance?


Largely yes, but the 7800X3D ($370->$300 on sale, 120W TDP) still beats the 14900KS since the extra L3 cache is more important than clock for most games: https://www.techpowerup.com/review/intel-core-i9-14900ks/18....

(As mentioned in another post here, for even more CPU-intensive games like Factorio, the difference is even starker.)

For general mixed workloads (assuming you need more threads), I think the 7950X3D is the way to go - 8 3D VCache cores for similar gaming, 20% cheaper than the 14900KS, generally neck and neck on development workloads but also has AVX512 support, and of course all while using significantly less power (also 120W TDP, 160W PPT - you can do even better w/ PBO+UV or Eco mode). Here's the OOTB power consumption comparison, it's a bit bonkers: https://www.techpowerup.com/review/intel-core-i9-14900ks/23....


None of these benchmarks are useful unless they're maxing out memory speeds. If you're buying the top-of-the-line CPU, you're also buying the fastest RAM.

7950x3d is limited to ddr5 6000 while 14900 can do ddr5 8000.

The benchmarks you shared are using ddr5 6000. So by upgrading the ram, the 14900 should come out on top. The memory controller is a key part of the cpu. It makes sense to test them with the best components they are able to use. I know it probably doesn’t matter, but if you’re chasing those last 5% of performance fast ram is a better value than custom water cooling.


I would love to see some data to back that up. I expect L3 cache to have a much more drastic difference than memory bandwidth.


AMD CPUs idle at high power, which kind of negates their efficiency when loaded. Their G-series desktop CPUs are the opposite: they idle very, very efficiently.


While the G series (basically mobile chips) do idle a lot lower, here's a chart showing that Ryzen chips idle at about the same power as their Intel high-end desktop counterparts (a few watts less, actually, but negligible): https://www.guru3d.com/review/amd-ryzen-7-8700g-processor-re...

One thing to keep in mind is that while slightly higher idle power might cost you a bit more, high power when processing means that you will also need a much beefier cooling solution and have higher noise levels and exhaust more heat to boot. Again, from the TechPowerup review https://www.techpowerup.com/review/intel-core-i9-14900ks/22.... they show that for a multi-threaded blender workload, the 7950X3D sits at 140W, while the 14900KS hits 374W. I don't believe there's a single consumer air cooler that can handle that kind of load, so you'll be forced to liquid cool (or go even more exotic).


I find a lot of contradicting info about what the idle consumption of AMD chips is; different graphs show different numbers (so I don't trust the source you've linked, as my own 12500 machine idles at about 25W). But the overall consensus on Reddit and other sources is that a typical AMD system idles at about 10 watts more than a comparable Intel one. Some posts claim that idle dissipation for AMD (CPU only) reaches 55W. My own experience with Ryzens (a 3600, last time) is that they indeed idle (and run lightly loaded) at higher power. For my typical usage, where the CPU does nothing 80-85% of the time, it matters.


For anyone that doesn't need the highest performance and where efficiency is super important, I can recommend the current generation of mini PCs/NUCs that run mobile chips. Last summer I picked up Ryzen 7 7940HS-based mini PC that idles at about 10W (from the wall), has a decent iGPU (the Radeon 780M is about on par with a mobile GTX 1650), and its 65W-max Zen 4 cores (8x) actually manages to be very competitive with my old custom 5950X workstation: https://github.com/lhl/linuxlaptops/wiki/Minisforum-UM790-Pr...

Intel Meteor Lake NUCs should perform similarly, but they tend to be a few hundred dollars more for basically the same performance (the Minisforum EliteMini UM780 XTX barebones is currently $440, the cheapest Core Ultra 7 155H minipc I could find was the ASRock Industrial NUC BOX-155H at $700). At this point though, personally, I'd wait for the upcoming Zen5/RDNA3.5 Strix Point APUs, which should be the next big jump up in terms of performance.


Definitely agree on the all-over-the-place metrics for AMD. This is somewhat complicated by the chipset: the X570 actually used 7-8 extra watts over the X470 by itself, because it was the idle-power-hungry I/O die of the CPU repurposed into a chipset.

Different motherboards and settings are sort of a hidden factor in this in general it seems.


Don't ignore that "idle" isn't a real thing. Most Reddit users complaining about high idle consumption have a program causing the problem. For me, shutting down the Steam program took my Ryzen 3600 from 22 watts "idle" to 2 watts idle.

There is no such thing as idle in a modern desktop.
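On Linux, a couple of standard tools make it easy to see what "idle" actually is and what's keeping the machine awake (a sketch; needs root, and RAPL support for the power readings):

```shell
# Print average package power every 5 seconds; watch it drop as you
# close background programs:
sudo turbostat --quiet --Summary --show PkgWatt --interval 5

# powertop attributes wakeups to the processes causing them, which is
# how you find the Steam-style culprits:
sudo powertop
```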


This problem also shows up in Intel's latest Meteor Lake laptop processors, which are supposed to be able to power off the CPU chiplet and idle with the two low-power cores on the SoC chiplet. In practice, most OEMs ship too much crapware on their laptops for that capability to kick in, and it's really hard to get Windows cleaned up enough to get the battery life the chip ought to provide.


LOL


Do the X3D chips with mixed 3D and normal cache still have scheduling issues where games can land on the cores without the extra cache? Back when they were new, I heard you want the 8-core version so there's no chance of that happening.


There have been updates on core parking ("Game mode") in Windows so it's probably fine, but I think for max perf, people are still using Process Lasso (I don't use a mixed X3D chip myself, so I haven't paid super close attention to it).


I do have a 7950X3D.

It’s improved a lot, I still use Lasso but strictly speaking I don’t really need to.


That is still a problem; it's the reason the 7800X3D is as good as the 7950X3D. But if you do other things too, you can go with the 7950X3D. It's more expensive, though.


.. because the cache gives you better single threaded performance?


In many cases yes. Some single-threaded workloads are very sensitive to e.g. memory latency. They end up spending most of their time with the CPU waiting on a cache-missed memory load to arrive.

Typically, those would be sequential algorithms with large memory needs and very random (think: hash table) memory accesses.

Examples: SAT solvers, anything relying on sparse linear algebra
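A toy illustration of that effect (a Python sketch; in a compiled language the gap between the two walks is much larger, since interpreter overhead masks part of it here):

```python
import random
import time

N = 1_000_000
data = list(range(N))

seq_order = list(range(N))   # prefetcher-friendly: streaming loads
rand_order = seq_order[:]
random.shuffle(rand_order)   # hash-table-like: mostly cache misses

def walk(order):
    # Same work either way; only the memory access pattern differs.
    total = 0
    for i in order:
        total += data[i]
    return total

for name, order in [("sequential", seq_order), ("random", rand_order)]:
    t0 = time.perf_counter()
    total = walk(order)
    print(f"{name}: {time.perf_counter() - t0:.3f}s (sum={total})")
```

The two sums are identical; only the access pattern differs, and that's exactly the kind of workload where more L3 or lower memory latency buys more than clock speed.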


Obscure personal conspiracy theory: The CPU vendors, notably Intel, deliberately avoid adding the telemetry that would make it trivial for the OS to report a % spent in memory wait.

Users might realize how many of their cores and cycles are being effectively wasted by limits of the memory / cache hierarchy, and stop thinking of their workloads as “CPU bound”.


Arm v8.4 onwards has exactly this (https://docs.kernel.org/arch/arm64/amu.html). It counts the number of (active) cycles where instructions can't be dispatched while waiting for data. There can be a very high percentage of idle cycles. Lots of improvements to be found with faster memory (latency and throughput).


The performance counters for that have been in the chips for a long time. You can argue that perf(1) has unfriendly UX of course.
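For example (event names vary by CPU; `perf list` shows what your chip actually exposes, and `sleep 5` here is just a stand-in for the real workload):

```shell
# Rough stall accounting with stock perf:
perf stat -e cycles,instructions,stalled-cycles-backend -- sleep 5

# Newer perf builds on recent Intel cores can break cycles down further
# into frontend/backend/memory-bound buckets:
perf stat --topdown -a -- sleep 5
```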


I think AMD has a tool to check something somewhat related (Cache misses) in AMD uProf


Right, so does Intel in at least their high end chips. But a count of last-level misses is just one factor in the cost formula for memory access.

I appreciate it’s a complicated and subjective measurement: Hyperthreading, superscalar, out-of-order all mean that a core can be operating at some fraction of its peak (and what does that mean, exactly?) due to memory stalls, vs. being completely idle. And reads meet the instruction pipeline in a totally different way than writes do.

But a synthesized approximation that could serve as the memory stall equivalent of -e cycles for perf would be a huge boon to performance analysis & optimization.


In the end, in many real-time situations, yes. Just look at what's recommended month after month for top gaming PC builds. Intel stuff is there as "only if you really have to go the Intel way, here is a worse, louder and more power-hungry alternative".


Most games love cache. AMD's X3D CPUs tend to be either better, or at worst extremely competitive with Intel's top chips for games, at a far lower power budget.


To a degree, but it's typically more like 'this workload uses 2-4 cores and wants peak performance on all of them', not really 'this workload uses a single core' on modern games. And then some games will happily use 8+ cores now

The user-mode video driver and kernel mode driver both use other cores as well


But with all the background OS activity, will this chip ever sustain a 6.2GHz turbo for noticeable periods? A pure benchmarking win, IMO.


You think 16 efficiency cores won't cover that?


They're clocked lower.


Not anymore. Modern games do multi-threading quite well.


> However, single threaded performance will end up being the bottle neck for fewer and fewer workloads.

True, the bottleneck will shrink - but according to Amdahl's law [0], it will never really go away.

Also, the more cores you have, the more a single-threaded performance increase multiplies. Imagine a million-core CPU in the future: even a tiny increase in single-threaded performance will be multiplied a millionfold.

[0] https://en.wikipedia.org/wiki/Amdahl's_law
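The law itself is a one-liner; with a 95%-parallel workload, even a million cores top out around a 20x speedup (sketch):

```python
def amdahl_speedup(p, n):
    """Overall speedup for parallel fraction p running on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

# Even at 95% parallel, a million cores barely beat 64:
for n in (8, 64, 1_000_000):
    print(n, round(amdahl_speedup(0.95, n), 2))
```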


I use 13900k (5.8 GHz) and will say this: Chrome/Brave with 50+ tabs opens instantly : - ) And I mean it. You click the icon and it is opened.

(Work paid for it for ML prototyping.)


Isn't that mostly IO-limited anyway?


If you can afford a 13900k you most likely have a ludicrous amount of ram and a decent % of your HD is cached in RAM anyway.


It depends on the algorithm that I am designing or working with. Having said that, I do have 128 GB of DDR5, which helps with IO.


This has been exactly my experience, and why I love single thread perf.

I've had 12900k, 12900ks,13900k processors, I'm going to build a new one with either a 14900k or ks. I own a P5800x optane ssd to match.


How many windows are the tabs in? That's a much bigger factor since they added lazy loading years ago.


Intel beats AMD on idle consumption though


As a home consumer, though, why would I care about power consumption? What is that, like a few extra dollars in power per month?


Power costs vary wildly from country to country, let alone state to state in the US. [1]

Also, many more people are installing solar power and residential batteries, so there's that.

[1] https://www.statista.com/statistics/263492/electricity-price...


PC power consumption is an important metric for those with backup power systems, especially in countries with unreliable electricity delivery (e.g. South Africa).

If you can afford a 320W CPU in the first place, you can probably afford the batteries to power the thing for a few hours, but it does still add a considerable amount to your backup costs.


FWIW, a 320W CPU isn't running at 320W all of the time. If you're powering something off batteries in a consumer situation, a 240W vs 320W CPU isn't going to move the needle unless you're really running it hard (like a game).


The baseline draw is still much higher than lower spec alternatives, and it means you would need to cater for the high end scenario in your battery estimations if you intend to actually use it during power outages.

I switched from a 5950X + 3080, to a 5700G APU, to finally an M1 MBP + Steam Deck last year for this exact reason. Far cheaper to have a 250wh battery that can handle those two for the ±2h outages every day.
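Back-of-the-envelope version, with illustrative numbers (250Wh battery, an assumed ~90% inverter efficiency, and guessed wall draws):

```python
def runtime_hours(battery_wh, load_w, inverter_eff=0.9):
    """Rough battery runtime estimate (ignores aging and DC losses)."""
    return battery_wh * inverter_eff / load_w

# Assumed loads: ~80W for a desktop idling at the wall vs ~500W gaming
# (320W CPU package plus GPU and the rest of the system):
for label, watts in [("idle", 80), ("gaming", 500)]:
    print(f"{label}: {runtime_hours(250, watts):.2f} h")
```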


For an ordinary consumer UPS that's not trying to keep the system up for hours but just a few minutes, the peak power consumption probably matters more than the average: a 750VA UPS might simply trip its overcurrent protection if you're unlucky enough to have a power outage coincide with a spike in system load. With a big enough GPU, even a 1000VA unit might not be safe.

And it might be hard to get your system configured to throttle itself quickly enough when the power goes out: having the UPS connected over USB to a userspace application that monitors it and changes Windows power plans in response to going on battery could be too slow to react. It's a similar problem to what gaming laptops face, where they commonly lose more than half of their GPU performance and sometimes a sizeable chunk of CPU performance when not plugged in, because their batteries cannot deliver enough current to handle peak load.


I am in a western country with good power. Again why would I care?


It's not always about you.




Power delivery is fine where I live, mostly decarbonised, too. So, power draw in and of itself is not an issue for me personally.

The reason I care is that a CPU (or any component) drawing this much power will turn it to a lot of heat. These components don't like actually being hot, so require some kind of contraption to move that heat elsewhere. Which usually means noise. I hate noise.

Also, air conditioning isn't widespread where I live, and since it's an apartment, I need permission from the HOA to set up an external unit (internal units are noisy, so I don't care for them, because I hate noise). So having a space heater grilling my legs in the summer is a pain. I also hate heat.

So, I don't see this as buying a "subpar product". I see it as trying to figure the correct compromise between competing characteristics for a product given my constraints.


This is a forum, fyi


Yes and I am allowed to post, thank you


Then get a low power processor. There’s no reason power consumption should really come up as a prime selling point in any discussion regarding home usage. Obviously when you do things at scale in a data center it makes sense to talk about.


Your original post was to ask why should anyone care about power draw, to which I provided an answer.

Now you shift goalposts to saying you should only care in data centre contexts, in a thread discussing a desktop processor.

It's okay if power consumption doesn't matter to you, but that doesn't mean it doesn't matter to everyone. That's why it's important to have these metrics in the first place and ideally to try and optimise them.


Don't forget cooling. The 14900KS almost requires good water cooling to reach its full potential.


I care because I don't like noisy computers and I don't like to run a space heater during summer.

Lower power consumption means less heat, which means less noise from cooling fans and lower temperature increase in my room.

YMMV


Maybe you don’t. But most other people do and so does Intel. It’s not good business to have a poor perf per watt chip in 2024. Everything from phones, laptops, to servers care very much about perf per watt except the very hardcore DIY niche that you might belong to.


Why does it matter? It’s always plugged in and the cost difference is negligible.


Normal people don't know what perf per watt means; don't be like that.


They might not know the term "perf per watt" but they feel the heat, fan, noise, and speed on devices they use.


None of those are a problem on a desktop PC only on shitty laptops.


Do you have experience with a modern high-wattage CPU during summer? Yes, a good cooler can make it work. But where does that heat end up? It gets blown out of the case and first heats your legs (depending on desk, position relative to the wall, etc.), and then the entire room. It can be very noticeable, and not in a good way.


I have 2 saved profiles in my BIOS: one where the CPU is allowed to consume as much current as it wants, which I use from mid-October to mid-May, and one where the CPU is capped at 65W for the rest of the year.

I do something similar with my GPU, 75% cap in the summer, 105% cap in the winter.
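For what it's worth, on Linux the same kind of cap can be applied at runtime without a reboot, via the powercap sysfs interface (a sketch; paths and supported limits vary by platform, and it needs root):

```shell
# Cap the CPU package to a 65W sustained limit (value is in microwatts):
echo 65000000 | sudo tee /sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw

# amdgpu exposes a similar knob for the GPU via hwmon (also microwatts);
# the hwmonN index differs per machine:
echo 190000000 | sudo tee /sys/class/hwmon/hwmon0/power1_cap
```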


>Do you have experience with a modern high-wattage CPU during summer?

i7 4790k w noctua cooler is fine :)


Do you have A/C in that room?

I actually have a similar desktop next to my legs, only it has the Xeon version with twice the cores. It's an absolute PITA in the summer with no A/C. It's also quite noisy when the temperature in the room reaches 27 ºC.


Same CPU; haven't bothered to replace it until maybe these days.


> None of those are a problem on a desktop PC only on shitty laptops.

My post specifically states the perf-per-watt advantage on laptops, phones, any small device, servers. I also mentioned this advantage being less on hardcore DIY computers.


Do you realise how big the PC gaming sector is these days? High-performing desktop chips are not for the hardcore DIY enthusiast market anymore. There are now millions of gamers buying off-the-shelf PCs with the highest-spec components as standard.


The reverse of what you said is true.

DIY desktop market is smaller than ever. You can see this in the number of discrete GPU sales which has drastically declined over the last 20 years[0] save for a few crypto booms.

Gaming laptops are now more popular than gaming desktops.[1]

If you disagree, I'd like to see your sources.

[0]https://cdn.mos.cms.futurecdn.net/9hGBfdHQBWtrbYQKAfFZWD-120...

[1]https://web.archive.org/web/20220628032503/https://www.idc.c...


Looks like gaming laptop vs gaming desktop sales are roughly the same:

"In terms of market share, according to a 2020 report by Statista, Notebooks / laptops accounted for 46.8% of the global personal computer market, while desktop PC made up 40.6% of the market. The remaining market share was made up of other devices such as tablets and workstations."

https://www.statista.com/statistics/1119850/gaming-pc-market...

https://www.tech-bazaar.com/laptop-market-as-compared-to-des...


I'm upvoting this to counter the downvotes because, unfortunately, normal people don't know.

Specifically, normal people don't know what "watt" is. Seriously. There is a reason electrician is a skilled profession. Most of us here do know watts and the like, so it's easy to forget that normal people aren't like us.


Normal people understand watts. Because they know what an electric heater is.

And using 2x to 3x more electricity means more heat in their room.

Also many countries have smart electricity meters with in home units which tell them exactly how many watts are currently being consumed and how much that costs them.


I’m going to push back on this with a simple example. Go to your local hardware store and check out the electric space heater section. There will be a wide variety of units rated for small rooms, medium rooms, and large rooms, based on square footage. The heaters will have a variety of form factors and physical dimensions. Many of them will have mentions of “eco” and “efficiency”. Every single one of them, and I mean literally, will be a 1500W heater (or whatever your region’s equivalent maximum load per plug is which may vary in 240v countries). Exact same wattage, all 100% efficient because their only job is to produce heat, with wildly different dimensions and text on the box. Customers will swear up and down about the difference between these units.


I had to do this after my gas costs went well above my electric costs. Maybe you are in a country/area where your hardware store doesn’t supply a variety of heaters, but at my local store, no two models were the same wattage.


It grinds my gears that electric lawn mowers are marketed by voltage even though that has no relation to grass cut per time.


Normal people know that a 60W bulb can burn you.


Do they? Even I have a problem with this nowadays, because they write 60W but it's an LED: it acts like a 60W bulb but it's not 60W. "60W" is more like branding.


A 60W bulb can literally blind you (temporarily) these days.


So you're just going to waste power for no reason? Do you also leave the faucet running while you brush your teeth? It's a few cents per month at most, after all.


It’s not for no reason, you’re getting more performance. It’s like saying everyone should buy a 4 cylinder Corolla for every situation, including tractors used in farms.


The context is precisely that there are other, power-efficient options that are at least as performant.


For me it's about having a silent PC and not making the room feel like a sauna.

I have a Ryzen 7950X in ECO mode 105W that's very fast in every workload I can throw at it, trivial to run whisper quiet with a basic Noctua cooler, and barely warms the room.


Maybe your morals guide you to act to mitigate the climate crisis, or maybe you don't have air conditioning and get hot summers.


Get off your high horse. This is negligible power usage on any scale of things.


You should of course prioritize low hanging fruit and eliminate flying, car use, meat, low efficiency housing, etc. But thinking that it's only worth doing these one at a time serially is a fallacy, as is getting stuck on the low impact of individual actions. Dividing the emissions pie to insignificant slices and then arguing that nothing significant can be done is just fooling yourself into inaction.

Regarding horses, pointing out the emissions angle in response to "why would I care about power consumption" is basic table stakes in the world's current situation, no need to get offended.


In the US, not known for its frugality, "Residential daily consumption of electricity is 12 kilowatt-hours (kWh) per person."


Freedom is consuming as much electricity as we desire. So long as we're happy to pay for it, of course. (I am.)


Freedom is also choosing to respect our environment and making a conscious effort not to waste our resources just for the hell of it.


Keyword there being choose. Freedom is the right to choose whether I buy more power or not, which I'm happy to pay.

I have no interest in living in your dictatorial world where everyone must hug trees.


Emitting CO2 does harm to others, not just yourself, so there's good grounds for regulating it.


More power equals more heat. Water cooling, large heatsinks and loud fans are not desirable.


Noctua coolers and fans m8


Where do you think those fans blow the heat?


I am in Sweden; summer is max 30°C here... so not really an issue... for now...


I mentioned power just to round out three major complaints about the chip. Intel chips are more expensive, consume more electricity, and benchmark lower.

In general, as a home user, you should care about power consumption in desktop computers for the following reasons.

* Higher power requirements mean more expense in other parts of the build: a bigger PSU and stronger cooling.

* If you ever plan on running the PC on battery backup (UPS or Tesla Powerwall) or if you plan on installing solar energy then power consumption becomes a bigger expense.
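To put a rough number on the raw electricity cost (illustrative figures only: a 200W delta under load, 4 hours a day of heavy use):

```python
def monthly_cost(delta_watts, hours_per_day, price_per_kwh, days=30):
    """Extra monthly running cost of a higher-draw part."""
    return delta_watts / 1000 * hours_per_day * days * price_per_kwh

# Assumed rates spanning cheap-US to expensive-EU electricity:
for price in (0.15, 0.40):
    print(f"${monthly_cost(200, 4, price):.2f}/month at ${price:.2f}/kWh")
```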


> why would I care about power consumption

Environmental impact maybe? Higher power usage = more heat = more noise?

There’s a bunch of other cases too.


You are absolutely right, most home users don't care about power consumption when it comes to desktop computers.

All this negative shilling about the power consumption of Intel and AMD desktop CPUs started after Apple ARM processors appeared on the desktop. Apple sells soldered SSDs and CPUs with un-upgradeable RAM, and the proprietary SoC is no longer a true general-purpose CPU, as only macOS runs properly on it. Performance-wise, they also don't truly beat AMD (and in some cases Intel) processors. This is a huge negative factor for Apple Silicon based hardware. Thus, the only negative marketing they can do about Intel and AMD processors is based on higher power consumption.

That said, Intel and AMD will (and do seem) to care about power consumption for their laptop and server segments.


Apple’s chips are well regarded simply because they’re laptop chips - which they excel at because that’s where power consumption matters.


You can run Linux and OpenBSD on Apple M-series chips; what do you mean, only macOS?


Linux and *BSD run crippled on M-series chips, as Apple doesn't provide hardware specifications or device APIs for system programmers to utilise the GPU etc. on its SoC. Linux developers are forced to reverse engineer everything for Apple Silicon because Apple is hostile to system developers. This is in sharp contrast to Intel and AMD processors, where these OSes can fully utilise the chip hardware to deliver maximal performance, because Intel and AMD are more open with their hardware documentation.



