
That was my first thought as well. I wonder if Apple has limited the refresh rate to something like 30Hz, or if they're banking on most apps simply using the downsized resolution instead of the full 5K. I can't imagine that card playing any modern game at anything close to that resolution, for example.


Of course not. You don't even run recent graphics-intensive games at full resolution on the highest-end MBP, which has a third as many pixels. We're still a ways from any consumer system taking full advantage of retina displays for high-end gaming.


Are you sure about that? I have a 4K monitor; when I first tested it with 3D games at full resolution, they played fine at 30Hz. After switching to an nVidia card I was able to play at 60Hz just fine.

Going to 5K only about doubles the number of pixels, so it sounds like a solvable engineering problem.
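A quick sanity check on the pixel math (standard 4K UHD and 5K panel dimensions):

    # 4K UHD vs. 5K pixel counts
    uhd = 3840 * 2160        # 8,294,400 px
    five_k = 5120 * 2880     # 14,745,600 px
    print(round(five_k / uhd, 2))  # -> 1.78, i.e. "about doubles"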

Panels are ready and GPUs are ready; the link layer is why you can't go to Best Buy and buy a 5K monitor. DisplayPort and HDMI both suck.
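To put numbers on the link-layer problem, here's a rough sketch. The payload rates are the published effective figures for HDMI 1.4 and DisplayPort 1.2 after 8b/10b encoding; the ~20% blanking overhead is an assumption:

    # Raw pixel bandwidth per mode vs. single-cable link capacity
    def raw_gbps(w, h, hz, bpp=24):
        return w * h * hz * bpp / 1e9

    modes = {"4K@30": (3840, 2160, 30),
             "4K@60": (3840, 2160, 60),
             "5K@60": (5120, 2880, 60)}
    links = {"HDMI 1.4": 8.16, "DP 1.2": 17.28}  # effective Gbps

    for name, (w, h, hz) in modes.items():
        need = raw_gbps(w, h, hz) * 1.2  # assumed ~20% blanking overhead
        fits = [l for l, cap in links.items() if cap >= need]
        print(name, round(need, 1), "Gbps ->", fits or "no single cable")

5K@60 comes out around 25 Gbps, more than a single DP 1.2 link can carry, which is presumably why the first 5K machine drives its panel internally rather than over a standard connector.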


4K gaming is currently reserved for the most high-end desktop GPUs if you want to play recent/modern games; to get >60fps you'd need two in SLI/CrossFire. An M290X is not even close to that. But even on a 4K display you can easily play at 1080p because it scales down nicely; for 5K you'd probably have to play at 1440p, which is a bit much for the M290X.
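The "scales down nicely" part is just integer scaling: 1080p maps to an exact 2x2 block of physical pixels on a 4K panel, and 1440p does the same on a 5K panel. Quick check:

    # Integer vs. fractional scaling factors (panel / render resolution)
    panels = {"4K": (3840, 2160), "5K": (5120, 2880)}
    renders = {"1080p": (1920, 1080), "1440p": (2560, 1440)}
    for p, (pw, ph) in panels.items():
        for r, (rw, rh) in renders.items():
            s = pw / rw  # same factor vertically for these 16:9 modes
            kind = "integer (sharp)" if s.is_integer() else "fractional (blurry)"
            print(f"{r} on {p}: {s:g}x {kind}")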


Good luck getting >60fps worth of 4K pixels over DisplayPort or HDMI.

I will give my firstborn for an 8K* monitor with a 120Hz refresh rate. He or she will probably have kids by the time I get that, though.

(4K is not that compelling. At 32" it's only 140ppi, which is nothing approaching "retina" levels.)


> (4K is not that compelling. At 32" it's only 140ppi, which is nothing approaching "retina" levels.)

...one would hope that you're not viewing the 32" 4K monitor at the same distance you would view your phone from. Comparing the PPI of a 32" screen to a 6" one is rather meaningless, as the viewing distances are going to be vastly different.
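One way to make that concrete is to compare angular resolution (pixels per degree of visual angle) rather than raw PPI. A small sketch; the 12in/24in viewing distances are assumptions, and the phone is just a representative ~326ppi panel:

    import math

    def ppi(w_px, h_px, diag_in):
        return math.hypot(w_px, h_px) / diag_in

    def px_per_degree(density, dist_in):
        # pixels subtended by one degree of visual angle at that distance
        return density * dist_in * math.tan(math.radians(1))

    phone = ppi(1334, 750, 4.7)   # ~326 ppi
    mon = ppi(3840, 2160, 32)     # ~138 ppi, the 32" 4K case above
    print(round(px_per_degree(phone, 12)))  # ~68 px/deg at 12 inches
    print(round(px_per_degree(mon, 24)))    # ~58 px/deg at 24 inches

At typical desk distances the two land in the same ballpark, which is exactly the point about viewing distance.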


But your phone, at retina distances, does not fill your field of view, and putting your face close to a large screen does.

The ultimate high-water mark will be retina-level resolution for devices like the Oculus Rift, allowing you to move your eyeballs all around and see real-life-quality graphics.
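Back-of-envelope for what retina-level VR would take, assuming ~60 pixels/degree (roughly the eye's resolving limit) across a ~100-degree headset field of view:

    # Horizontal pixels needed per eye for "retina" VR (rough sketch)
    px_per_degree = 60   # approximate human acuity limit
    fov_degrees = 100    # roughly the Rift's FOV
    print(px_per_degree * fov_degrees)  # 6000 px per eye, vs ~960 on a DK2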


Yes, 1080p in the DK2 is worse than I had imagined. But Oculus apparently demoed a device with a 1440p screen, which is already a lot better from what I heard. Still, 4K and above have a totally valid use case for VR.


I sit a little bit more than an arm's length away. It's not enough resolution to turn off anti-aliasing on fonts and still have them look nice. (And I use big fonts.) Circles still look blocky.

The Chromebook Pixel has a nice resolution. That's 240ppi.


I agree. While 32" makes native 4K usable in terms of desktop real estate, 8K in retina mode (basically a viewable 4K) would be the endgame for usable monitors, I guess. But since we're already starting at 5K, I don't think we'll have to wait as long as you imagine.


> I switched to an nVidia card

Which is where you demonstrated that you are a power user. Most consumer systems don't even come with discrete graphics cards anymore.

Consumer systems took a big leap backwards when integrated graphics became the norm again. They're running several years behind the discrete cards someone like you or I might buy.


All I'm saying is that I think Apple might be able to make it work.


I have a hard time believing this. What games were you playing, and what graphics card were you using? And by 30Hz/60Hz do you mean your actual frame rate or the monitor's refresh rate?


I play Minecraft, WoW, TF2, and L4D2 at 4K@60Hz on the D700 Mac Pro. I set everything to ultra and limit the framerate to 60fps--it never drops below 50fps.


Alright, I can believe that. The Mac Pro is not a machine that an average user owns, let alone can afford. You have two graphics cards that are most likely more powerful than the single card an average PC gamer has.

Also, those games aren't too graphically demanding, tbh; L4D2 is the newest one and it's about 5 years old.


I don't think we're that far away from consumer systems being able to take advantage of high-res displays. This build (http://pcpartpicker.com/p/HZQm23) can already drive 4K in almost all games currently out, and it's only $1600. Large manufacturers like Dell won't be far behind with desktops sporting similar hardware. I'd expect consumer (as opposed to enthusiast) 4K gaming on the desktop to start happening by the middle of next year.


That would require a new GPU generation, while the current 4K-capable one (albeit only really smooth in SLI) has just been refreshed.


If you mean extremely demanding games, sure, the rMBP is not a gaming machine. But it plays many games quite well. Examples: Borderlands 2 @ 1920x1200 with high settings (Windows), Fallout 3 @ full res and high settings (Windows), Diablo III, Minecraft (more CPU-bound than GPU-bound), and Left 4 Dead 2 @ full res with high settings.


OS X is weaker than Windows at gaming performance. For example, Valve officially supports Mac and Windows in-house, and running the same game (e.g. TF2) on the same hardware under Windows will see framerate increases of 20%-100%.


Why is this? Any technical reasons?


Trying to play any game on the rMBP is a nightmare.



