
M2 Ultra is ~250W (averaging various reports, since Apple doesn't publish official figures) for the entire SoC.

The 5090 is 575W for the GPU alone, without a CPU.

You’d have to cut the Nvidia card to roughly a quarter of its power, and then find a comparable CPU, to normalize the wattage for an actual comparison.

I agree that Apple GPUs aren’t putting the dedicated GPU companies in danger on the benchmarks, but they’re also not really targeting that market. The two are in completely different zones on too many fronts to really compare.



Well, select your hardware of choice and see for yourself then: https://browser.geekbench.com/opencl-benchmarks

> but they’re also not really targeting it?

That's fine, but it's not an excuse to ignore the power/performance ratio.


But I’m not ignoring the power/performance ratio. If anything, you are, by handwaving away the wattage difference.

Give me a comparable system build where the NVIDIA GPU + any CPU of your choice is running at the same wattage as an M2 Ultra, and outperforms it on average. You’d get 150W for the GPU and 150W for the CPU.
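For concreteness, the wattage normalization being argued over can be written out. This is only a sketch of the arithmetic; the figures all come from the comments above (the ~250W SoC estimate, the 575W GPU rating, and the proposed 150W/150W split), and no benchmark scores are assumed since the thread cites none:

```python
# Hedged sketch of the power-budget comparison from this thread.
# All wattages are as quoted by the commenters, not official measurements.

M2_ULTRA_SOC_WATTS = 250   # CPU + GPU combined (unofficial estimate)
RTX_5090_WATTS = 575       # GPU only (rated power, no CPU included)

# "Cut the Nvidia to a quarter" -- the GPU power implied upthread:
nvidia_quarter = RTX_5090_WATTS / 4          # 143.75 W

# The 150W GPU / 150W CPU split proposed for a comparable build:
gpu_budget, cpu_budget = 150, 150
total_budget = gpu_budget + cpu_budget       # 300 W vs the SoC's ~250 W

print(f"5090 at one quarter power: {nvidia_quarter:.2f} W")
print(f"Proposed build budget: {total_budget} W vs M2 Ultra ~{M2_ULTRA_SOC_WATTS} W")
```

Note that the proposed 300W build budget is still somewhat above the ~250W SoC estimate, so the split is generous to the discrete-GPU side.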

Again, you can’t really compare the two. They’re inherently different systems unless you only care about a single metric.



