They put the "Retina" display in the iMac. This means people will buy it. Higher volume means whoever (LG, I think?) is manufacturing the screens will have to produce more, driving the cost down. That means they will sell variants. Then their competition will also sell competitive options because nobody will want 1080p on a computer screen anymore.
Monitor technology has been stalled for years. This is going to be a gigantic kick in the pants to the industry!
The main obstacle to consumer adoption of separate 4K/5K+ screens will be the link standards (dual-link DVI and HDMI 2.0), since standard HDMI can only drive 4K at a 30 Hz refresh rate.
Wait for DisplayPort 1.3 for this. Current 4K over DP 1.2 uses multi-stream transport to feed the display as if it were two half screens daisy-chained together. Sometimes it works, other times it's a trainwreck. At least know what you're getting into.
Actually, it's not intrinsic to DisplayPort 1.2 that 4k displays must be driven with MST, it's just that for a long time the electronics to decode the full signal were not available, hence the hack.
Starting with the Samsung U28D590D[1], many 4k displays use single stream transport on DP 1.2.
Also, for 5k, you actually have to wait for DisplayPort 1.3 (if you want 60hz), because there's not enough bandwidth, multi stream transport or not. There's a neat bandwidth calculator at [2].
4K (3840x2160, 24-bit color at 60 Hz) is 11.94 Gbit/s without overhead; 5K (5120x2880) is 21.23 Gbit/s. DisplayPort 1.2 and 1.3 carry 17.28 Gbit/s and 25.92 Gbit/s respectively.
This unfortunately means no 120 Hz 5K displays even on DP 1.3 without compression (which is supported).
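If anyone wants to sanity-check those numbers without the linked calculator, here's a rough Python sketch (the helper name is just for illustration; it assumes 24-bit color and ignores blanking/protocol overhead, so these are lower bounds):

    # Payload bandwidth = active pixels x bits per pixel x refresh rate.
    def payload_gbps(width, height, bpp=24, hz=60):
        return width * height * bpp * hz / 1e9

    print(payload_gbps(3840, 2160))           # ~11.94  (4K @ 60 Hz)
    print(payload_gbps(5120, 2880))           # ~21.23  (5K @ 60 Hz)
    print(payload_gbps(5120, 2880, hz=120))   # ~42.47  (5K @ 120 Hz)
    # vs. DisplayPort 1.2's 17.28 Gbit/s of payload and DP 1.3's 25.92 Gbit/s.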
Oh, good to know. I'd assumed the MST screens were because they had to use it, but I guess I just jumped on the 4K train a few months before I should have...
Correct. Thunderbolt 2 includes support for DP 1.2[1], and in any case it only has 20 Gbit/s of bandwidth in total, which isn't enough for 5k even without taking overhead into account.
Which makes this announcement quite strange. If this iMac doesn't support Thunderbolt 3 (or whatever standard Apple is planning), it'll never be usable as an external monitor.
IMO, it would have been more Apple-y to launch this and refreshed Mac Pros simultaneously with TB 3, allowing Mac Pros to drive two (or more) 5k displays while the rest of the PC world is still tripping over 4k. But I guess this segment isn't a priority for them any more.
The bandwidth would certainly be there, and there's precedent with monitors requiring dual HDMI to get to 60hz, but I'm not aware of the same thing being done with DisplayPort.
With DisplayPort 1.3 now released, I'd be pretty surprised if manufacturers messed around with dual DP 1.2 inputs for displays like Dell's upcoming 5k. Such things are tolerated when there is no alternative, but when it's just a matter of requiring a new video card for a tenth the price of the high end display, that's the only sensible option. This is extreme early adopter tech after all. 5k is yesterday's 4k.
And needless to say, if I'm speculating it's too messy for Dell, Apple won't touch it with a ten foot pole :)
Still, stranger things have happened. If the demand turns out to be there, the products will follow - it's certainly technically possible.
The 30" Cinema Display had 2 DVI connectors, which was nasty but necessary. So maybe Apple wouldn't be too disgusted by the thought of dual TB.
But if DP 1.3 is around the corner, and dual TB would be an ugly hack that is mutually exclusive with the new standard, I could see them steering clear from that.
In principle, Apple could do something similar again, with an adaptor for two Thunderbolt 2 connectors (from separate buses!). In practice, they'll probably just wait for DisplayPort 1.3, though.
Patently false claim. The DVI standard always included dual-link, even before Apple added USB and power connections, changed the connector, and passed it off as a unique invention of theirs.
You don't have to use MST; it's just that most 4k panels are intentionally built to mimic two panels, so two HDMI 1.4 signals can be composited into the whole screen.
I imagine HDMI 2.0 4k screens won't do this, and even the ones still on DP 1.2 will start presenting themselves as a single display.
Sadly, in practice there appear to be many things wrong with DisplayPort. I’m writing this on a machine using a workstation-class graphics card to drive two high-end Dell monitors over DisplayPort. In some respects — starting with basics like what happens when you turn things on and off — it is the least satisfactory video set-up I’ve used in many years, despite being more expensive than the last several put together.
In my experience, for a 2560x1600 samsung monitor, DisplayPort works well. Admittedly, the cable seems to stop working after about 6 months, since I plug and unplug from my laptop twice every day, but a cable replacement works.
In my experience DP doesn't work reliably @ 4k, at least on AMD & Intel GPUs. If you do a bit of Googling you'll see that there are many problems with DP at super high resolutions. I personally went through 5 different brands of DP cables, a DP MST hub and all sorts of other remedies and fixes, and nothing works reliably. I have 3 ASUS PB278Q and the only input that works reliably is DVI.
I just bought a GTX 980 because it supports HDMI 2.0, I am now running my HTPC at 4096x2160 @ 60hz over HDMI. As of right now the only graphics cards you can buy that support HDMI 2.0 are the GTX 970 & 980.
Just built myself a rig for the first time in a decade, with a pair of 980s SLI'd... it kicks like a mule... although unfortunately my AV receiver seems to support "HDMI 2.0 standard", rather than HDMI 2.0, which is a kick in the nuts.
Now I just need to wait for the consumer version of the rift.
Yeah, that's what I plan on doing this weekend. Wasn't planning to upgrade my GTX680 for a while as it was plenty for GPU acceleration in Adobe apps and for gaming at 2560x1440 but the Rift pushes it to its limit in several applications. Thought about just getting another 680 to put in SLI but my stupid past self didn't buy an SLI motherboard and it's not worth the hassle to buy another mobo, another 680, and rebuild the thing now that the 980's are out.
This was my first thought too... considering I had to hack my $6000 D700 Mac Pro into displaying 4k at 56hz instead of 60hz to avoid ridiculous noise/tearing on half the screen... I personally wouldn't touch 5k for another year at least. No way of knowing because it's not mentioned in the specs, but I'm guessing this new Mac is limited to 30hz.
I'm still waiting for a good non-retina ~30" 3k monitor. I'm currently using a 2560x1440 (HP ZR2740w), and horizontal space is a little tight for dual windows. Retina doesn't really matter for me (I got the retina MBP, but I'm using it at 1680).
The only options seem to be the ones linked below, but there aren't any reviews of the 2880 ones, and the 3440 is too wide and only 1440 high, not ideal for dual windows.
Bear in mind you may have set the display's virtual resolution to 1680, which affects apparent font and icon sizes etc., but the physical resolution it's being rendered at is still the full retina fidelity. You shouldn't be seeing jaggies on curves the way you would on a real 1680 display.
I'm in the market for a retina iMac but expect to turn down the virtual resolution a few notches to make the UI a bit more readable. I find the system text on the non retina 27" machines a bit small, but scaling it up on those panels makes it look crap due to scaling artefacts that just shouldn't happen on a retina panel.
Like you, I'd prefer to turn down the virtual resolution a few notches to make system text and UI elements bigger. I cannot help but notice that 1706 by 960 would be enough virtual resolution for me[1], which on the new iMac would put 9 actual pixels into every virtual pixel. It would be nice if Apple included system fonts and other UI elements adapted specifically for such a 1:3 ratio of virtual size to actual size, which we might call "pixel tripling" in contrast to the "pixel doubling" that Apple has already implemented (a.k.a., HiDPI mode) in which each virtual pixel is made up of a square of 4 actual pixels.
Apple is very unlikely to go through the trouble of implementing "pixel tripling" just to accommodate people like you and me, but wouldn't it be sweet? :)
[1]: A rectangle 1706 pixels by 960 pixels contains .93 times as many pixels as one 1680 by 1050 contains.
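For anyone checking that arithmetic, a throwaway Python snippet (just the numbers from the comment above):

    virtual = (1706, 960)
    print(virtual[0] * 3, virtual[1] * 3)            # 5118 x 2880: tripling lands almost exactly on the 5120x2880 panel
    print(virtual[0] * virtual[1] / (1680 * 1050))   # ~0.93 times the pixels of a 1680x1050 desktop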
You mean Windows. KDE is iffy (but I use it just fine and was able to scale my fonts / frames up enough on my 150 DPI notebook), but Gnome in its latest release has fantastic high dpi support.
Windows being a piece of shit with regard to DPI is an unmentioned factor holding back the desktop display industry, whereas Android's inherent scalability let smartphones race ahead on pixel density.
Ah, Windows* is catching up. But the ecosystem (third-party and some first-party applications) has admittedly been quite slow in making their applications HiDPI friendly. Metro/Modern apps for use on smartphones and tablets are already quite good; desktop apps are catching up.
*disclaimer: MS employee who is just enthusiastic about "viva la resolution."
I've been using the LG 34UM95 for a month or so attached to a rMBP, and find the ultra-wide screen (3440) delightful. The 1440 height is the same as the current Apple T'bolt Cinema Display.
I really don't need Retina-level resolution, though it'd be pleasant. Maybe in a year or two.
But having a single monitor this wide is really a treat, as I never could stand a dual-monitor setup. One of them is going to be your main monitor, facing you, and one of them will be on one side or the other, somewhat ruining the effect of the combined width.
Actually, having a second monitor to the side at an angle does wonders for neck strain. A single flat monitor, in contrast, means that the edges are going to be pretty far away from your head. Of course, that could be fixed with a curved monitor, which are coming, though the curves aren't very dramatic yet.
Well, having a single wide monitor means I move my neck very little from side to side (maybe 15 degrees each side?). Even my old (60) eyes have no trouble viewing the edges.
We can visualize some geometry here using some highly appropriate ASCII art (-, /, and \ are screens, * is the user):
-------
*
vs.
---
/ \
*
The first one requires you to move your neck much more than the last option; it also provides a better peripheral view. I have a corner desk, so my setup looks like:
---
/
*
More or less; I also have a chair that pivots, so I can change my focus quickly without moving laterally. What is really painful is when, to reduce horizontal span, the monitors are thrown into portrait; then you get up-down neck strain instead. I actually decided against another 24" as my second monitor for this reason (I have a 21" widescreen in portrait as my secondary, nice for reading PDFs and email).
I would prefer to have one wide monitor, but with a curve.
I'm not sure. I have a 30" widescreen at work I primarily use in isolation (the laptop next to it is pretty puny by comparison), and dualies in a laptop-monitor-monitor arrangement as in Example 2 at home.
The single monitor is wide and requires much scanning, but I think I prefer the experience to multi-monitor window management. I just end up not using the far right screen except in the most necessary situations (design + code + browser)
I really don't see how the second option changes anything for your neck; on the contrary, it's pretty clear from the picture that it won't.
However, the second setup keeps the screen surface closer to perpendicular to your line of sight across a wider area. It might be more comfortable for some people from a perspective point of view, but not for the neck.
I'd like to second that - I've been using the curved version (LG 34UC97-S, €1,099) with a mid-2014 rMBP and it is awesome. I'm not sure if the curving is necessary, but it certainly is nice.
Not so sure about this. We still mostly have laptops with shit 1366x768 monitors despite "retina" or at least "full hd" screens being around for quite a while...
I bet it's a bigger deal for external monitors. A ton of people buy laptops just saying "Oh, 15 inch screen for cheap, that sounds perfect!" But the sort of consumer who buys an external monitor tends to know what "resolution" means. There's demand for better ones because nobody buys a 27" 1366x768 display.
Given that Dell's 5k 27" screen is priced similarly to these iMacs, I think the economies of scale could help push some prices down.
I feel like laptop resolution progress stalled in the early 2000s. Maybe mass knowledge of 1080p HD set us back. I guess consumers started thinking "1920 x 1080 is really good resolution" so they were willing to accept lower resolutions as "good enough" for a long time. I had a 1400 x 1050 14" Dell laptop in 2005 but it wasn't until the Retina Macbook Pro in 2012 that laptop resolutions got significantly better. 768 is an absurdly bad vertical resolution. Many serious content creation apps do not work well at all at such a low resolution.
Blame Intel for that. When Intel's chips represent a whopping 40 percent of the machine's BOM, there really isn't that much room left for a high-resolution IPS display.
In the cheapest LCD monitors, the brightness control just changes pixel values and doesn't affect the backlight. Hope this one's better, since the claimed brightness of 350 cd/m^2 is far too bright (you should have less than 120 cd/m^2 in a dark room) and is probably bad for your health.
I'd guess they've fixed it, but there's no way you could tell from their "tech specs", which just list the same huge numbers (1.07 billion colors!) as every other manufacturer's LCD.
It's a little worrying to see a Contrast control, too, since that makes no physical sense.
Brightness is not even close to being enough on modern displays. Looking at a 700nit display outside is still at most only half as bright as looking at a sheet of paper in the same light conditions. 350nit is nowhere near being bad for your health. I can only recommend to try out a high luminance display.
I've been using them for years for inexpensive yet high quality cables, I thought their reputation was pretty well established. Maybe I missed a scandal out there somewhere.
The problem is, most GPUs can't drive that many pixels, which wasn't the case with the 1440p ones. So there is now a new set of issues with displays like these being used standalone.
I hope some of those displays will be as matte as the old TFTs. I won't buy even an 8K display if it is glossy. For the same reason I won't buy the 5k imac.
My dual 2412Ms (noticeably less resolution than yours, but still) sit beside a 15" Retina MBP. It is painful to switch between the screens. First world problems, but the good kind.
And I assume you also mean content providers will also be forced to shoot their videos in 4k/5k resolution? (I don't think so)
I understand this is a big + for monitors. I have used Dell ultra-sharp 32" @ 3840x2160 with display port and it's fantastic. Larger workspace, better gaming, better multitasking!
I don't care that much about video. Full HD is plenty fine with me.
I care a lot about text, because that's what I consume almost exclusively day in day out at work and in my spare time when I sit in front of a screen. And nothing profits more from high contrast and high pixel density than text rendering.
I just inherited a Dell XPS 13 with a really nice Full HD screen from a coworker who quit. I've got running Ubuntu on it with a scaling factor of 1.5. Not quite Retina, but pretty sweet nonetheless. I was going to use it with an external screen, but after getting used to this laptop I couldn't stand looking at the external screen anymore and got rid of it. Work (programming and writing) is a lot more fun with this setup.
My spouse just got a 13 inch Retina MBP. She's pretty stoked as well. Even her half blind mom can see the difference.
High resolution screens are great, and the sooner they become standard the better.
I am really curious about the technology behind the 5k iMac. I am not sure if there is any off the shelf GPU out there that can drive that display using retina type rendering.
It's interesting they did a custom controller for the display timing (Timing Controller, or TCON). They must have had to do deep customizations to use the AMD R9 M290X (comparable to the Radeon HD 7870) to drive it.
If this is not innovative I am not sure what is, in terms of an engineering standpoint.
It's very impressive Apple made the effort to do this now instead of waiting for 'off the shelf' parts, but the technology behind this is probably nothing to write home about. The biggest limitation on display bandwidth is currently the external HDMI/DP standards, which are slowly ratified and adopted. For all-in-ones and laptops, additional channels can be added and/or clocked higher for additional bandwidth. Apple's TCON is probably a fairly standard part souped up a bit for higher performance to handle the additional channels / clock speeds. I'd be surprised if they are doing anything fancier than this, because the jump from 4K to 5K isn't big enough to require it; it's close enough that the 'old/current technology' can be stretched.

On the video card side, as long as the 7870 has enough VRAM to store frames, desktop 2D / light 3D compositing isn't that demanding. The trade-off here is that heavy 3D (modern gaming) at native 5K, or even down at 4K, is mostly not happening on this machine. Probably a good choice, since 2560x1440 at 120Hz+ is a better choice for gaming than higher resolutions at 60Hz.
The jump from 3840x2160 to 5120x2880 is nearly 2x, in pixel count. That is not just a 'tweak existing technology' kind of jump.
TCONs that can handle a single logical 4K display are just hitting the market, and Apple has one shipping today that handles 5K. A timing controller may be simple tech, but it's cool that they're so far ahead of the curve.
It's a big jump in pixel count but a 2X increase in bandwidth can be brute forced / tweaked out of existing technology similar to DL DVI or DP MST. IMO that doesn't diminish it at all because lots of technical problems have somewhat obvious solutions but actually making it happen is still difficult and expensive. Even more so if you want to avoid making any ugly trade offs in the process.
Not sure, though I really don't mind OSX, and with homebrew, it's pretty close to a 1:1 for most of my use on OSX...
The only thing that irks me a little is the muscle memory for using the option key for cut-copy-paste in the UI vs the actual ctrl key (ctrl-c etc) in a terminal...
That's easy to fix. Use iTerm2, and tell it to remap modifier keys (to unflip them). If you use mission-control shortcuts (like move workspace), configure those keys in iTerm2 to "don't remap modifiers"
@leephillips - the good news is that Linux installs fairly painlessly on the iMac 5K. I'm running Fedora 20 and it's going really well. Some slight funkiness with sound but I'm sorting that out now.
GNOME Shell needs hardware acceleration, so this workaround for the ATI drivers was required: https://bugzilla.redhat.com/show_bug.cgi?id=1054435
It may not be innovative, but it's something good that makes Apple consistently leapfrog everyone else on everything other than raw processing power and value. I mean, who else would have done this? Dell? Nope.
Exactly. Apple tries to do a lot of firsts when they release a new product category, but most of their updates are about putting the effort into the smallest details that matter to users, even if the users don't know about it.
They are better than anyone else at that simply because nobody else really sees the whole product as their own problem. I.e. each hardware/software/service maker is just optimizing their part.
My favorite example of this: a MacBook's headphone socket has full support for iPhone earbuds. The little headset microphone is used and the volume/pause buttons operate exactly as on an iPhone.
Obvious in retrospect, but just another example of how Apple treats their product range as a complete ecosystem.
I think the issue is less that only Apple is capable of making these technological leaps, and more that only Apple is capable of pushing them to a wider audience.
Few companies can get away with such a relatively narrow selection of hardware products. Dell for example has dozens of laptop models. By only having a few models, Apple making a big change to one results in a big chunk of the overall market including that feature.
I'm not sure. Why doesn't everyone produce them? The PC industry could drive a lot more innovation because there are so many more Windows PCs. Unfortunately, the race to the bottom means margins are too thin to add an extra $10 port, for example.
I'm most impressed that they can manufacture 15 million pixels on a single panel without a single dead pixel. They must be wasting/repurposing a ton of panel square footage when cutting.
So you mean people buy a Mac that has dead pixels and Apple tells them to go to hell? There's a no-questions-asked return window (16 days, or a bit longer), and I don't think they'd question a return on an already broken one. You're thinking of people trying to get a warranty repair after a long time of usage. The OP was referring to the fact that when you produce LCD panels you have limitations on panel size, because you get defects every so often and have to throw away some produced panels.
Right - but I'm questioning whether Apple (or any monitor vendor) would throw away a panel that was ISO 13406-2 Class I levels. I.E. a few dead pixels, depending on their nature, and where they are, don't mean you throw away a panel.
Perhaps what happens is they just get resold as Non-Apple Displays. So, the no dead pixel displays go into the premium Apple monitors, but the Class I devices get resold in the white-label market. (This is where you see these great deals on eBay).
I hope you have seen the below from jcheng: "I literally just came from my local Apple Store where I got my Retina MBP serviced for two clusters of hot pixels. The Genius had to look up the policy to see if it was covered, and came back saying that Retina MBP screens will be replaced for even a single defective pixel. We were both surprised."
It's not dead pixels that you notice; I have a hot pixel on my Retina MBP, and it's very noticeable. But Apple doesn't warranty a single hot/dead pixel.
I literally just came from my local Apple Store where I got my Retina MBP serviced for two clusters of hot pixels. The Genius had to look up the policy to see if it was covered, and came back saying that Retina MBP screens will be replaced for even a single defective pixel. We were both surprised.
I had an issue with a single dead pixel on a 3-month-old rMBP earlier this year and Apple replaced it (the whole machine!) without any fuss. The Apple tech guy said (and this is pretty much the exact quote) "the rMBP is all about the screen, a dead pixel is not acceptable". Say what you will about Apple but their support is the best I have ever experienced. If I were you I would book an appointment at an Apple Store and get it fixed. I doubt you will have any problems. Back up everything first and be prepared to do a wipe and get a new machine if that isn't too inconvenient.
As others have said, Apple will absolutely replace the screen for any dead pixels. I had my first crappy LG screen replaced because of two dead pixels, and they told me they'd have done it even if it was only one.
There are 14.7 million pixels on the iMac's screen. The definition of a retina display is you can't discern individual pixels at normal viewing distances, so unless you're sitting with your nose against the screen or are using a loupe, you won't be able to see a single dead pixel.
> The definition of a retina display is you can't discern individual pixels at normal viewing distances, so unless you're sitting with your nose against the screen or are using a loupe, you won't be able to see a single dead pixel.
Your conclusion does not follow from your premise.
Just because we can't identify individual pixels does not mean a single pixel has no influence. It shapes the overall picture.
If a single dead pixel is not visible at all, why have that pixel there in the first place -- working or not?
I just tested with my iPhone 6 (a 1334x750 px white picture with a single black pixel in the middle of the picture), and the single black pixel is definitely visible. A screenshot of the iPhone showing the picture in question confirms it's a single pixel on the screen.
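In case anyone wants to reproduce that test image, here's a minimal Python sketch (assuming Pillow is installed; the output filename is arbitrary):

    # 1334x750 white image (iPhone 6 screen resolution) with one black pixel in the middle.
    from PIL import Image

    img = Image.new("RGB", (1334, 750), "white")
    img.putpixel((1334 // 2, 750 // 2), (0, 0, 0))
    img.save("single_black_pixel.png")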
For one it was never about "more pixels than photosensitive cells".
Second, the number of rods and cones has nothing to do with it.
The "retina" argument was about having smaller than discernible angular pixel sizes, which is true for the majority of people and normal viewing distances.
That's true on conventional panels, but the whole point of retina is that the pixels individually are smaller than the human eye can resolve. I'd be interested to see it in practice. I suppose it depends how far your eye needs to be from the screen for it to count as retina in that way.
The inability to discern an individual pixel with the same color as its neighbors doesn't mean that the same pixel with a vastly different color won't be seen. A single bright green pixel in a white window will still be visible, even if the edges of the pixel itself can't be seen otherwise.
That was my first thought as well. I wonder if Apple has limited the refresh rate to something like 30hz or if they are banking on most apps simply using the downsized resolution instead of the full 5k. I can't imagine that card trying to play any modern game at anything close to that resolution for example.
Of course not. You don't even run recent graphics-intensive games at full resolution on the highest-end MBP with 1/3rd the pixels. We're still some ways from any consumer system taking full advantage of retina displays for high-end gaming.
Are you sure about that? I have a 4K monitor and initially tested it with 3D games at full resolution and they played fine at 30Hz. I switched to an nVidia card and was able to play at 60Hz just fine.
Going to 5K only about doubles the number of pixels, so it sounds like a solvable engineering problem.
Panels are ready and GPUs are ready; the link layer is why you can't go to Best Buy and buy a 5k monitor. DisplayPort and HDMI both suck.
4K gaming is currently reserved for the most high-end desktop GPUs if you want to play any recent/modern games. To get >60fps you'd need two in SLI/Crossfire. An M290X is not even close to that.
But even on a 4K display you can easily play at 1080p because it scales down nicely; for 5K you'd probably have to play at 1440p, which is a bit much for the M290X.
> (4k is not that compelling. At 32" it's only 140ppi, which is nothing approaching "retina" levels.)
...one would hope that you're not viewing the 32" 4K monitor at the same distance you would view your phone from. Comparing PPI of a 32" screen to a 6" one is rather meaningless as the viewing distance is going to be vastly different.
But your phone, at retina distances, does not fill your field of view, and putting your face close to a large screen does.
The ultimate high-water mark will be retina-level resolution for devices like the Oculus Rift, allowing you to move your eyeballs all around and see real-life-quality graphics.
Yes, 1080p in the DK2 is worse than I had imagined. But Oculus apparently demoed a device with a 1440p screen, which is already a lot better from what I heard. Still, 4K and above have a totally valid use case for VR.
I sit a little bit more than an arm's length away. It's not enough resolution to turn off anti-aliasing on fonts and still have them look nice. (And I use big fonts.) Circles still look blocky.
The Chromebook Pixel has a nice resolution. That's 240ppi.
I agree; while 32" makes native 4K usable in terms of desktop real estate, 8K in retina mode (basically viewable 4K) would be the endgame for usable monitors, I guess. However, since we are already starting with 5K, I don't think we have to wait as long as you imagine.
Which is where you demonstrated that you are a power user. Most consumer systems don't even come with discrete graphics cards anymore.
Consumer systems took a big backwards leap when integrated graphics became the norm again. They're running several years behind the discrete cards someone like you or I might buy.
I have a hard time believing this, what games were you playing and what graphics cards are you using? And by 30hz/60hz do you mean your actual frame rate or the monitors refresh rate?
Alright, I can believe that; the Mac Pro is not a machine that an average user owns, let alone can afford. You have two graphics cards that are most likely more powerful than the single card an average PC gamer has.
Also those games aren't too graphically demanding tbh, l4d2 is the newest one and it's over 5 years old.
I don't think we're that far away from consumer systems being able to take advantage of high res displays. This build here (http://pcpartpicker.com/p/HZQm23) can already drive 4k on almost all games that are currently out, and it's only $1600. Large manufacturers like Dell won't be far behind with desktops sporting similar hardware. I'd expect consumer (as opposed to enthusiast) 4k gaming on the desktop to start happening mid next year.
If you mean extremely demanding games sure, the rMBP is not a gaming machine. But it plays many games quite well. Examples: Borderlands 2 @ 1920x1200 with high settings (Windows), Fallout 3 @ full res and high settings (Windows), Diablo III, Minecraft (more CPU demanding than GPU), Left 4 Dead 2 @ full res with high settings.
OS X as an OS is weaker than Windows at gaming performance. For example, Valve officially supports Mac & Windows in-house, and running the same game (e.g. TF2) on the same hardware in Windows will see framerate increases of 20%-100%.
The problem is that the Apple ecosystem tends to be quite limiting in what we can do for gaming, compared to other systems.
So I'd rather have the choice among powerful GPUs, with drivers kept up to date with the latest OpenGL versions (separately from OS versions), than have retina support.
Most 4k displays do that due to bandwidth restrictions of current display cables. There is unlikely to be any GPU benefit to treating it as two separate displays since you still have to push the same number of pixels.
Serious question here. At what point is the human eye unable to tell the difference between such high res displays?
I'm thinking in the same terms I used to think in when I was doing home audio. My boss had a set of $10K speakers. Sure, they sounded great, but then I listened to a client's $5K speakers and couldn't tell the difference between those and my boss's. It was as if my ears weren't finely tuned enough to tell.
Of course, if you asked my boss about the difference, he'd take an hour listing them all. To me, a layperson at the time, they sounded the same.
That chart should really be zoomed to about 1/5 its size... it goes up to 150 inch? Who has displays that big?
The X axis should go from 10 to 60, the Y axis should go from 1ft to 30ft, max.
As it is, it's not all that useful for figuring out what the density limits of a desktop monitor are. (Average desktop monitor sizes occupy maybe 1/12 of the X axis!)
Well, with a 4K display you're able to sit closer to larger ones because you won't detect the pixels as easily. I know many people think you want more distance with a larger set, but with 4K they are just fine, as long as you don't move your eyes too much to see the entire screen.
I'm not entirely sure I follow why you feel this is significant. At least in the US, we tend to measure viewing distance in feet and diagonal screen size in inches, and those are both measures of length. It's not like the two axes are "feet" and "fluid ounces."
Even if it was fluid ounces, that's still not a problem - A line chart of my account balance would have Euros on the y-axis and months on the x-axis. It's still a meaningful chart though...
For a normal viewing distance when using a desktop PC, 4K on 24" or 5K on 27" is pretty much the sweet spot, and I am pretty sure Apple won't go above that anytime soon because the benefits are diminishing.
A typical generous estimate of human visual acuity is 0.3 arcminutes; any detail smaller than that angular size is probably not visible to us. It's not an exact measurement since there's disagreement over how to properly measure it, but regardless, most estimates are in the 0.3-0.4 range.
Then simple trig tells us that we are maxed out when:
(distance to screen) * sin(0.3 arcminutes) = pixel size
The new iMac has a 27 inch diagonal, and 5874 pixel diagonal, so its pixel size is 0.004596 inches. Plug that in above, and you can see that you would have to be 53 inches from the screen to be maxing out your visual abilities. Anything closer (like normal) and you actually could still benefit from higher resolution.
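The same trig in a few lines of Python, for anyone who wants to plug in their own screen size or acuity figure (this just reuses the 0.3 arcminute estimate from above):

    import math

    acuity = math.radians(0.3 / 60)        # 0.3 arcminutes, in radians
    diag_px = math.hypot(5120, 2880)       # ~5874 pixels along the diagonal
    pixel_size = 27 / diag_px              # ~0.0046 inches per pixel on a 27" panel
    print(pixel_size / math.sin(acuity))   # ~53 inches: any closer and more resolution still helps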
>did a custom controller for the display timing (Timing Controller (TCON))
Apple doesn't produce displays and they don't design TCONs. A TCON is a PCB sitting directly behind the glass driving the individual crystals, and it is designed and manufactured by the same company making the display. Also, every TCON is custom, built to drive a particular type of screen.
Yes, when Dan Riccio says "we manufactured this screen" he is LYING TO YOU. They bought the whole thing from LG or Sharp.
AMD doesn't support DP 1.3 yet, so the only way to get 5K resolution is bonding two 3K screens together; this is so innovative that IBM did it 15 years ago in the T220.
> retina type rendering
You mean scaling down? This is what GPUs do best, out of the box.
Yes, this screen is amazing. But don't act like it's something revolutionary touched by the Noodly Appendage.
Apple has had its own display engineers since the '80s. Yes, they use several off the shelf components, but they're also known to create their own chips and oversee specialized manufacturing for particular models/challenges. Case in point was the 21" Apple CRT from the late '90s that featured a Sony Trinitron but also had custom Apple chips that auto calibrated as the display aged based on Apple's ColorSync technology. This looks like a similar collaboration.
You really think the largest company in the world (by market cap) doesn't actively collaborate with its source companies and do custom designs created by its internal engineers together with the supplier's engineers?
> Yes, when Dan Riccio says "we manufactured this screen" he is LYING TO YOU. They bought the whole thing from LG or Sharp.
It's not a lie if "we" refers to Apple together with its partners or "manufacture this screen" refers to the combination of LG/Sharp's screen with Apple technology.
Is it a lie when Google says it created Android without mentioning the creators of Linux or Java? Is it a lie if Intel says it manufactures a CPU without mentioning the manufacturers of the silicon wafers?
Dude, it's marketing, which I'm not a fan of. But be fair.
Okay, nobody seriously looks at Apple as a panel manufacturer, unless you (and Apple) are referring to the fact that the company has a stake in Sharp. The plant and expertise required to build panels is clean-room stuff that AAPL just does not do. It does industrial design and software, not "manufacturing of screens", so the point is clear. Android's skin is distant enough from Java and Linux that it can credibly be called a creation of Google, just as Audi can say it built the car without having to credit the inventor of the internal combustion engine. But instructing a supplier to build a part to its specifications is hardly "manufacturing" it; it is "specifying" it. As every Apple product owner knows: "Designed in California, MADE in China". In this case, made in Japan/Korea. As such, "we manufacture" is skirting close to the edge of a lie if "we" is spoken at an Apple-branded press conference with no mention whatsoever of partners.
They didn't manufacture it, they designed it. How many tech companies design and manufacture all of their own stuff? Any? They didn't "buy the whole thing from LG or Sharp"; they designed and developed it and then used them as a manufacturing partner to produce it.
You have really swallowed the Apple hateraide it seems.
Auto-correlated downvote trend on a perfectly reasonable point: AAPL is not very innovative here, as Dell is launching the same monitor. Sure, the driver logic might have been the subject of some consulting, but if anybody thinks it will not appear within weeks on competitor brands they're kidding themselves. Nothing about this display is proprietary AAPL tech.
The Dell monitor is apparently four 2560x1440 panels (a fairly common approach to early-stage 4k monitors/TVs as well). Anandtech were told that the iMac is controlled as a single unit, so it's probably rather different to the Dell, and yes, they probably _do_ have a proprietary controller.
As is discussed above, though, even if such displays appear, who will have anything to plug them into? Yes, some people will be able to go find a specific kind of video card to plug into their homebuilt rig to drive it, but who will be offering something like this that normies (95%+ of the market) will comfortably and confidently be able to just go and buy? That counts for something.
Perfectly reasonable speculation/opinions, but there are no technical details available at this point for the internals of Dell's upcoming display, which uses the same or a similar panel, or for the iMac Retina. You might be getting downvoted for your reasonable speculation/opinions because you presented them as fact. Online discussions can be of higher quality when everyone agrees on the difference between fact and opinion.
The 5k display would be more versatile, however. Well, I guess that’s relative to …
I can understand why Apple is not in a hurry to sell this just as a display. This display likely could not be driven at all by any of their most popular Macs: all the MacBooks and the Mac mini. Maybe it would work with the Mac Pro, but that's it. And the Mac Pro is probably not the best-selling Mac, by a large margin. So they would basically be selling a display that only works with a Mac Pro.
I can understand how that might seem like a wasted effort until at least a couple more widely selling Macs support the display. Maybe when the MacBook Pros can do it they will start selling just the display.
With the iMac the advantage is that Apple is delivering it with a computer that definitely can push those pixels, so you dodge all the compatibility woes that will plague these high-res displays for a couple more years. It seems like they still had to get in there and do some weird custom stuff, and maybe that's just hard or impossible to do in a display that is hooked up to some random GPU from Apple's past Macs.
Prior 27" iMacs have had the ability to run as external displays in Target Display Mode.
This was very nice when the video connection was DVI, but the current Thunderbolt connections limit it to other systems with Thunderbolt and the appropriate drivers. This limits it to pretty much using the display with other Macs.
My quick review of Apple's tech specs for the new iMac doesn't say if it has this feature.
I had the worst time getting Target Display Mode to function sanely. I had thought that it was some kind of hidden systems feature or something, but at least at the time it required an Apple keyboard be attached and a specific key sequence be pressed during startup to make it happen.
On my mid 2011 iMac ⌘ + F2 enables target display mode once something is plugged in to the mini display port. Nothing needed at startup.
Not sure about it having to be an Apple keyboard; it probably works with any keyboard as long as you know which key is mapped to Command (usually the "OS" key).
While you can use a DisplayPort display with a Thunderbolt iMac, you cannot use Target Display Mode on a Thunderbolt iMac with a DisplayPort 1.2 PC (or pre-Thunderbolt Mac).
Can you buy the monitor separately? It looks like a dream display to me, assuming that it's glossy, which is usual for Apple. It's very difficult to find glossy monitors these days. I just can't go back to matte after getting spoiled by glossy.
Honest question: Why do you prefer a reflective surface over a matte on a screen? In my experience it only reflects the room and makes the content harder to see.
You get more saturation on glossy displays. Matte surfaces work by diffusing the light that is incident on the screen but that diffuser also attenuates the light coming out of the display. Less light, less contrast, and less saturation.
An artist friend of mine has what is essentially a white cotton muslin curtain that goes around their work area which keeps reflections down to near zero (without being in the dark) and only uses glossy displays. It would drive me nuts but I see the attraction.
Glossy requires you to exercise better control of room lighting, but the matte coating causes white backgrounds to sparkle and text to be hazy. Especially on a retina-level display where the text is otherwise so sharp. I used a 2414Q in a Microsoft store, and text looks nowhere near as sharp as it does on my Macbook Pro Retina, despite similar pixel density.
Microsoft uses different text rendering algorithms than Apple. You should compare against a picture of text generated on OS X, not text as rendered in the applications themselves.
In my experience, colors just look better on glossy displays. Since glossy screens are ubiquitous on phones and tablets, I would figure most people are used to the reflection issue by now. Personally, I don't notice reflections unless the screen is off, but it is obviously going to depend on your room.
I have a different question: why hasn't Apple been using optical anti-reflective coatings on all their non-touch glossy displays -- especially the Thunderbolt display? And: is this 5K iMac AR coated?
AR coatings on touch screens are problematic because skin oils make visible marks, but on non-touch glossy screens they should be standard. Indeed, back in the CRT era, it was routine for high-end CRTs to be coated.
I have noticed that Apple is using a coated screen for at least one model of MacBook Air, but for some reason they don't seem to have fully embraced coated screens.
I've used a Mac with a glass screen since 2011 and it really hasn't given me any issues. Sometimes at the office I may have an issue with glare, or if I sit with a window behind me; if I adjust the angle of my monitor the issue is immediately resolved.
I was really against buying the glossy screen, but the matte looks washed out compared to the glossy. If your display is at around 60% brightness or above then you really don't have a glare issue.
Of course, if you work in an open-plan office -- and let's face it, despite the evidence that it's an unforgivably stupid way to set things up, most of us do -- you can't control the light.
And if you're using a laptop outside of a home/personal office environment (which laptops are presumably built for), you don't have control over lighting either.
For tablets and mobile devices which generally aren't used for working at long stretches, I guess I can see the appeal. But it's really strange to me that glossy is popular on machines people do use for long-stretch work -- and often in high-glare overhead lighting open-plan offices.
(And not only popular, but increasingly the only available option.)
The color and contrast is way better than any matte screen I've seen. Reflections are not a big problem for me, but of course this depends on your computer setup and where your windows are located etc.
The annoying thing is that it's nearly impossible to find a glossy screen on the market.
Different strokes for different folks, but I find it funny that you think your preference is the one the market is ignoring lately. Every display Apple makes is glossy only, as is nearly every TV these days. I really wish I could get a matte rMBP and a matte 60" HDTV, but they just don't make them anymore...
I don't see how you can reach that conclusion. I want to buy a glossy monitor, not a glossy television screen. The only well known option is a monitor manufactured by Apple, but they're really expensive compared to typical quality matte screens. It's really hard to find even a single alternative. I found a single HP monitor which maybe has a glossy screen, but they don't state that clearly.
I agree that it is harder to see, especially if you have a backlight. If you have a well-lit room, with light coming from the top in a non-reflective angle, the display is vastly better than the matte one.
(Source: I have both Apple Thunderbolt Display and old-school Apple Cinema Display with matte finish)
Not yet it seems. They haven't released an update to their Thunderbolt Display in years (since 2011), but it should be right around the corner with the release of this 5K iMac.
They would have announced it today if that were the case. Currently there isn't a display transport capable of pushing that many pixels; DisplayPort 1.2 can't do it on a single port, and that's what Apple has on its MacBooks and the Mac Pro.
Yes, and that's already being used by the first batch of 4K monitors and is called MST. This models the monitor as 2 lower-resolution displays and drives them with two separate signals.
It has a lot of compatibility problems, and is hacky as hell. I doubt Apple would adopt this as an official/first-class solution to anything.
Solving "compatibility problems" is what Apple does best: require an Apple Mac Pro and an Apple Cinema Display, and don't support any other configuration.
It depends on the application. For games, photos, and videos glossy may be fine. But if you work a lot with text (as a developer etc.) then matte displays are far better for the eyes.
When I made an exception and bought a notebook with glossy display I regretted that a short time later. Glossy display? Never again!
I think it's a supply issue. Once that's worked out, I'm sure they will release just the 5k monitor. I don't think they like telling customers who ask for a 5k monitor to go buy Dell for too long.
Shitty 4k displays have been cheap for a while. It's not a reasonable comparison. (My no-name brand Korean 27" IPS display was great until it frizzed out after 9 months.)
Dell's 24" UP2414Q is an IPS 4K monitor with 99% AdobeRGB and 100% sRGB coverage, and it has a 185 PPI panel that I'd argue is already in the "retina" range. Although with its ~€555 price it's not exactly in the $300-$500 range, but it's a close call, so I wouldn't call all 4K monitors around this price range "shitty" at all.
Interestingly done - the huge image is broken up into 15 separate canvas elements and then the containing DIV is scaled with CSS transforms as you scroll. Not immensely clever, but a solution which performs pretty well in Chrome on my Mac, at least.
This is because your eyes never get wider, so it makes sense that the content would never get wider either. I feel like web apps are the only real use case (aside from experimentation) where a fluid-width layout (the content grows with the screen) is appropriate.
It actually has meaning, unlike most scroll captures.
I also like how (about 3/4 down the page) they capture the background iMac image on the left and keep it "sticky" during scroll, but not completely stationary -- it still responds and gives slight feedback, which makes it 99% less annoying. Nice.
I was curious what the exact resolutions were. A quick google search claims:
"4k monitor": 3824 x 2160
"5k monitor": 5120 x 2880
That's a lot of pixels.
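To put "a lot" in numbers (a quick sketch, same resolutions as above):

    print(3840 * 2160)                    #  8,294,400  (~8.3 megapixels for 4k)
    print(5120 * 2880)                    # 14,745,600  (~14.7 megapixels for 5k)
    print(5120 * 2880 / (3840 * 2160))    # ~1.78, so 5k is nearly double 4k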
It might be a good idea to be skeptical about spending >$1,500 on a 27-inch monitor in Q4 2014. It's difficult to notice any pixelation on a 27" screen at a resolution of 2560x1440, so clearly the reason to upgrade to 5120x2880 is for the extra screen workspace. But unless you have very good vision, you're probably not going to be able to read text at 5120x2880 without zooming. What's the advantage?
For $1,000 you can buy two 27" 2560x1440 monitors, which is a huge amount of workspace. Also, a single $300 midrange GPU can drive both monitors at full resolution. A couple years ago, that was cutting-edge tech, but it cost ~$2600. Also, two monitors offer a better user experience than one monitor, since window management is a bit easier.
Would anyone mind explaining whether the pros of a 5k monitor outweigh the hefty pricetag?
> It's difficult to notice any pixelation on a 27" screen at a resolution of 2560x1440
Not once you are used to a retina display. I have both a retina Macbook Pro, and a 27" iMac. It is actually really obvious, especially when looking at text.
> you're probably not going to be able to read text at 5120x2880 without zooming
This is correct, unless you use the standard retina resolution which is like 2560x1440 except doubled, so everything is the same size just much nicer looking.
I would say the pros are not there just yet; it would certainly look nicer, but not a whole lot nicer. Moving from 1366x768 on laptops to 2560x1560 (or whatever it is, can't remember off the top of my head) was a great shift. This isn't quite as big a deal.
It would be very nice to be able to display full 4K detail or side-by-side 1080p videos for editing purposes. Photo editing will be great too, though you will have to resort to 'physical zoom' (i.e. moving your head closer) to do serious pixel peeping, rather than the old and probably superior technique of blowing up the pixels to larger than you'd see them normally.
For gaming and such.... I'm not really sure. Downsampling is now the name of the game, i.e. rendering above the display rez and dropping it down, which has nice effects on IQ. Rendering 14.7 million pixels is a hell of a task to start with, and then you start getting all manner of masks, per-pixel effects, etc... the demands really multiply. Plus there's also a focus on high framerates, like 120, which we're not going to see at 4K or 5K for a gooood long while. It'll be years before things catch up at the mid-tier level of processors and GPUs. And 8 GB of RAM, I hardly need add, is entirely insufficient.
tl;dr: for today's flat desktop purposes, probably not necessary but could be nice. for gaming and other purposes, probably reaching too far. but it's still a nice trend.
Although I'm not sure how the display quality compares, that'll be for e.g. Anandtech to test. And other 5k monitors are in the same price range as the imac (except without the computer part)
>It's difficult to notice any pixelation on a 27" screen at a resolution of 2560x1440, so clearly the reason to upgrade to 5120x2880 is for the extra screen workspace.
On the new 5K iMac you won't perceive a bigger workspace^ than you had on the previous (2560x1440) model, because all UI elements will be scaled up at the factor of two, practically making everything look twice as sharp.
^: Actually, just like with the retina MacBooks, you'll have the option to use alternative resolutions up to 5K with and without scaling, effectively creating a larger virtual workspace.
"It's difficult to notice any pixelation on a 27" screen at a resolution of 2560x1440" - wouldn't say that's true - they screens are good, but when you switch between any phone, any tablet, a rMBP and the current iMacs you really do notice the low resolution.
I never had this problem before getting the rMBP however, so maybe I'm just getting picky!
As I mentioned in the other thread, there's just not enough bandwidth to drive at 60fps over a single bus / single cable with DisplayPort 1.2 without compression (DP v1.2 gives about 17.3Gbps usable bandwidth which is just not enough). The iMac allows them to bypass that bottleneck by interfacing directly with the graphics card.
My prediction is that once we see Apple deploy Thunderbolt 3 / DisplayPort 1.3 (capable of 25.9Gbps usable bandwidth [1]), we'll see the Apple Cinema Display Retina.
That's why many people were expecting the iMac to come first. Since it's an integrated machine the panel can be connected directly to the graphics card via LVDS (or whatever the current version is) instead of the limitations of the external cabling the Mac Pro or (for an external monitor) MacBook Pro have.
Not 60fps over a single cable / single bus without compression.
DisplayPort v1.2 can carry around 17.3Gbps (this is minus overhead), so you would need DP v1.3, with an effective rate of about 26Gbps, to support native 5k without compression [1].
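For reference, those effective rates fall out of the lane count, per-lane bit rate and 8b/10b line coding. A quick sketch (the helper name is just illustrative; it assumes the usual 4-lane link, with HBR2 lane rates for DP 1.2 and HBR3 for DP 1.3):

    # DisplayPort payload = lanes x per-lane rate x 8b/10b coding efficiency.
    def dp_payload_gbps(lane_rate_gbps, lanes=4):
        return lanes * lane_rate_gbps * 8 / 10

    print(dp_payload_gbps(5.4))   # DP 1.2 (HBR2): ~17.28 Gbit/s
    print(dp_payload_gbps(8.1))   # DP 1.3 (HBR3): ~25.92 Gbit/s
    # 5K at 60 Hz needs ~21.2 Gbit/s of payload, so only DP 1.3 clears the bar.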
They have to update their notebooks with Thunderbolt 3 before they can release 5K Thunderbolt Display. Thunderbolt 2 doesn't have enough bandwidth to push the pixels.
With the amount of custom chips they needed for the display, it wouldn't surprise me if there are issues getting it to work as an external display that they haven't solved yet.
I can't believe you can't believe it. Have you heard of Apple before? If anything I'm a bit more perturbed that they max out at 32GB of RAM. My aging Mac Pro has more than that in it.
I imagine that they preferred shipping with a default 8GB and selling a $200/600 RAM upgrade than having a higher headline price.
(Incidentally, their 1GB phones still outperform phones that are one year newer with 2GB of RAM, double the cores and nearly double the GHz, so maybe they're onto something.)
32GB is a limitation of iX Haswell, Ivy Bridge could use 64GB - Apple can't do anything about it. Let's hope Intel allows it for Broadwell-K (-Y,-U,-H variants will be restricted to at most 16GB)...
It's not a Haswell vs. Ivy Bridge distinction, it's a consumer vs. server distinction. Consumer CPUs only have dual-channel memory controllers and Intel's DDR3 controllers can't use 16GB UDIMMs or have more than 2 UDIMMs per channel. The server chips (a handful of which get some features cut off and sold under the i7 brand) have 4 memory controllers and also support RDIMMs and LRDIMMs.
That is a software not a hardware limitation (in particular, I think it is in Intel's MRC). ASUS has firmware with modified MRC code that can support it.
That is in fact incorrect. Consumer desktop Sandy Bridge, Ivy Bridge and Haswell all support 32 GB of RAM via 2 DDR3 channels. Enthusiast desktop processors (Sandy-E, Ivy-E, Haswell-E) support 64 GB, because they feature 4 channel memory controllers.
It seems to me that a top-of-the-line consumer machine should be able to compete with a 3 year old workstation (that hadn't received a significant update in two years -- effectively 5yo hardware).
For what do you ever need more than 32GB of RAM on a desktop (excluding time travelling forwards)? If you are asking yourself that question, you need to get a proper server or rendering farm.
You'd be surprised how fast you can run out of memory if you are mastering your own, non-professional 5k/4k video and applying some advanced non-linear operations such as video stabilization. It's not GPU accelerated, parallelism is limited by the nature of the algorithm which makes rendering farms useless. Do you need a power hungry and noisy server with slower single thread performance than a top end i7 just for additional RAM?
On the other hand, beyond 32GB, ECC seems to be very important so you probably won't have any choice but to buy a proper workstation unless AMD makes a miraculous processor on par with i7 (as they do support ECC in all models).
> non-linear operations such as video stabilization
What do you mean by non-linear? Are you referring to an equation containing a linear combination of the dependent variable and its derivatives, or something else?
Virtual machines behave best with as much RAM as a normal machine. Assuming you have 3 machines running (an OS X host plus Windows and Linux VMs) and you want to give them 12GB each, you are already at 30+. It might sound like a crazy setup, but sometimes a compiler is only supported on one platform, and then you want a replica of your client's server in one VM, etc.
No, not really. I work in a games studio and our usage on individual workstations goes above 32GB when compiling our project. Due to technical and licencing issues it's also not something that can be relegated to a remote server farm for compilation.
If it's that gigantic, then a distributed build on local servers makes a lot of sense. Much faster than each developer doing it on their own single workstation.
We do distributed builds on local workstations across the studio with Incredibuild. That's the only way we could get away with licencing for consoles. That reduces the build times (from over 40 minutes to less than 5), but the RAM usage is still very, very high.
I'm guessing this is C++ with a lot of template abuse and header interdependencies? C++ has a powerful compiler, but it's also easy to shoot yourself in the foot with it.
If you didn't need the 5k display but you did need more memory and rendering capacity, the Mac Pro has you covered, maxing out at 64 GB of RAM, or 128 GB if you use non-Apple RAM upgrades. So there's that.
Target Display Mode (TDM) is likely to break with this new machine. Thunderbolt 2 only has a theoretical max throughput of 20Gbps... Driving a 5K display @ 60Hz would require 98.88% of that throughput, which is unlikely to be sustainable in real-world usage.
I'd be all over this if Apple can confirm that the monitor is usable by external machines.
I don't think that will work here - TB2 is derived from DP1.2, which only goes up to 4K@60hz. You'll have to wait for TB3 to drive an external 5K monitor from a Mac.
You should divide 20Gbps by 9 bits (2275 MB/s) or even by 10 (2048 MB/s), not by 8, because in addition to useful payload, there are also packet headers and control packets being transferred. So it actually doesn't.
"Packet headers and control packets" are not the reason you should divide by 10. The reason is that Thunderbolt uses 8b-10b encoding so 1 byte of data is transferred as 10 physical bits. Therefore the maximum theoretical usable bandwidth is 20e9/10/1024/1024 = 1907 MiB/sec. Then on top of that you have to deduct the overhead from packet headers and control packets so the real-world usable bandwidth is even less than 1970 MiB/sec...
Doesn't TB2 have two interleaved streams, one carrying DP1.2 and the other carrying 20Gbps PCI-E? Even if the PCI-E stream has enough bandwidth for 5K in theory, the ecosystem isn't in place to use it as a display output. That's what the DP part is for.
For other people to play with this, just google "(5120 * 2880 * 3 * 60) bytes/s in Gib/s" to get the answer (just short of TB2's 20 Gbit/s, assuming no overhead).
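Spelling that calculation out (payload only; blanking intervals and packet overhead would add a bit more on top):

    # Raw 5K @ 60 Hz pixel payload, in decimal and binary units, versus
    # Thunderbolt 2's nominal 20 Gbit/s link rate (no overhead counted).
    width, height, bytes_per_pixel, hz = 5120, 2880, 3, 60
    bits_per_second = width * height * bytes_per_pixel * 8 * hz

    print("%.2f Gbit/s" % (bits_per_second / 1e9))    # ~21.23 Gbit/s (decimal)
    print("%.2f Gib/s" % (bits_per_second / 2**30))   # ~19.78 Gib/s (binary)
    # Either reading sits right at (or over) TB2's 20 Gbit/s before overhead,
    # which is where the 98.88% figure upthread comes from (19.78 / 20).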
I would imagine that as long as the video card has similar specs to the new iMac's and the computer has Thunderbolt 2, then yes (at 5k). Otherwise it would be scaled somehow, or just not work at all.
A computer for free? With a Mac everything costs money; there is hardly any way to enjoy this piece of hardware without buying into the Mac way of money sucking. I enjoy my Arch Linux system every single day, and it never cost me a cent; there are no backdoors, no money-sucking going on at all. But for those who believe the marketing slogan that everything with a Mac is better, well, grab your wallet and join the idiots.
I have 8 GB of memory on my 2013 13" MacBook Air - I run 23 separate applications simultaneously, including the full Office suite (Outlook/PowerPoint/Word/Excel), VMware Fusion with Windows XP + OpenBSD running, the Dynamips routing simulator with 10 Cisco 7200 routers, Google Earth, Aperture, Pixelmator, etc.... No swapping. Everything runs fine.
OS X (in my case, Mountain Lion, so an old version) - does really, really well with shared libraries. 95%+ of consumer users (not pros, whose needs are obviously more rarified, and for whom the sky is the limit) are probably fine with 8GB of memory, and it's not something I would recommend them increasing unless they plan to hold onto the computer for more than 5 years.
Yosemite beta 6, freshly booted with only Firefox and this HN page open, already takes 2.8GB. Yosemite is really memory hungry compared to Mavericks in my experience, and future versions are going to get hungrier. Those 8GB are going to fall short within the next 2 iterations of OS X, and after paying $2500 that feels ridiculous to me.
> Yosemite is really memory hungry compared to Mavericks in my experience.
Which numbers are you reading for this? The "memory use" total includes lots of irrelevant things like files read once during boot and optimistically cached for later, so you can expect it to be nearly 100% of installed memory at all times.
You're not in a memory pressure situation until you have an actual performance problem or the swap space starts growing (a quick way to check this is sketched below).
But if you are having performance problems, I'm pretty sure someone cares deeply about them and would like to know the per-process memory use.
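A minimal sketch of that check on OS X, looking at swap usage and free pages via sysctl and vm_stat (the exact output formatting varies between releases, so treat the parsing as approximate):

    # Check the signals that actually indicate memory pressure on OS X:
    # swap usage and free pages, not the headline "memory used" total.
    import re
    import subprocess

    def swap_used_mb():
        # sysctl prints e.g. "vm.swapusage: total = 1024.00M  used = 51.75M ..."
        out = subprocess.check_output(["sysctl", "vm.swapusage"]).decode()
        m = re.search(r"used = ([\d.]+)M", out)
        return float(m.group(1)) if m else 0.0

    def free_pages():
        # vm_stat prints e.g. "Pages free:                 123456."
        out = subprocess.check_output(["vm_stat"]).decode()
        m = re.search(r"Pages free:\s+(\d+)\.", out)
        return int(m.group(1)) if m else 0

    if __name__ == "__main__":
        print("swap used: %.1f MB" % swap_used_mb())
        print("free pages: %d (typically 4 KiB each)" % free_pages())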
Mountain Lion 10.8.5 - fresh boot with only Chrome and this HN page open - Used: 2.83 GB.
Once the shared libraries are in memory, memory usage grows really, really slowly for my application set (listed above) on OS X.
I'm not saying that 8 GB will last forever - I'm just saying, as a power user who has zero need for more than 8GB right now, that 8 GB on OS X will be just fine for about 95% of average consumer use. Absolutely would not recommend upgrading an iMac's memory unless you are really certain you are going to hang onto it for more than 5 years - not the best use of $200.
I'm just talking about your average family use; for vertical niches (Video Work, Graphics Design, Heavy Industrial Programs, Databases) - obviously those people will assess their situation and make a decision that is optimized for their circumstances.
That resolution will take a lot of VRAM to push. Start gaming or editing on it and it will get crushed. Most 290Xs start at 4GB; why Apple chose 2GB I don't know.
Only 3,090 on Steam, which is about a third of the whole Windows back catalog. And a large chunk of the ported games are older titles, so Linux and Mac are both making steady progress.
This looks like a perfect 4K video editing rig: the 5K display is big enough to play 4K at 100% with lots of extra space for editor controls. Pair it with the Panasonic GH4 and you can shoot and edit cinema quality 4K footage for under $5k, which is amazing.
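For a sense of the editing real estate involved, here's the simple arithmetic for a pixel-for-pixel 4K (3840x2160) preview on the 5120x2880 panel (no assumptions beyond the stated resolutions):

    # How much of the panel a 1:1 4K preview occupies, and what's left over
    # for timelines, bins and other editor controls.
    panel_w, panel_h = 5120, 2880
    clip_w, clip_h = 3840, 2160

    print("horizontal margin:", panel_w - clip_w, "px")   # 1280 px
    print("vertical margin:  ", panel_h - clip_h, "px")   # 720 px
    print("clip covers %.0f%% of the screen" % (100 * clip_w * clip_h / (panel_w * panel_h)))  # ~56%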
That price is cheaper than I expected, since Dell announced that they would charge $2500 for their 5K display. It is about time that 'retina' comes to the desktop; hopefully this will put more pressure on Microsoft and third-party devs to actually make high-DPI displays work with Windows. As it stands now, Apple has a monopoly on a usable high-DPI environment (assuming the usual quick adaptation to this display by the OS X dev community). That may be enough for me to switch to OS X as my daily driver - just think about how nice coding on this thing could be.
I am far from an Apple fan boy, but I am so glad they are pushing the move for resolutions higher than 1920x1080, where monitors were stuck for far too long.
When Dell announced their 5K monitor (which almost certainly has the same panel, albeit probably not the same circuitry) about a month ago, barely anyone registered it. This shows the brand power and marketing might of Apple, I guess.
ATP talked about the Dell 5K for several episodes - but they couldn't figure out how they could drive that many pixels from any of their systems (including Marco's Mac Pro) - Apple just solved that problem today, and is the first company to do so - which is what's getting so much attention. It's a real technological/integration leap forward. Credit where it's due.
Except that this is an all-in-one computer, not just a monitor. There's a minuscule number of people who have the hardware to power that Dell monitor.
Plus the iMac costs the same and is an actual computer.
Typing this from my Lenovo W540. High DPI is good and all, but holy shit do programs vary between regular, tiny font, and large font. VMWare's remote desktop client (vmrc.exe) triggered via VCloud in browser renders at native, which requires me to squint and put my face 2 inches away from the screen.
I love the turn of phrase, but given this new display manages to improve on energy efficiency, I'm suspecting another, more mundane consideration: the Dread Marketeer. As is, the Pro line has the retina displays, whilst the Air doesn't, thereby setting out a clear product line delineation. (Not the sole one, to be sure. But, if Apple wishes to continue the Pro/Air distinction, this would seem one quite noticeable parameter)
For Air-like sizes, the GPU matters very much - even if the screen draws fewer watts, it takes much more GPU power to feed it, and that would hurt battery life.
Two years. MacBook Airs are mostly ordinary consumer devices, where more pixels would just mean elements on the screen look nicer. MacBook Pros are used by a crowd that effectively needs more pixels.
I've owned and used both a 4k monitor and a 144 Hz refresh rate monitor. I personally value the higher refresh rate over the higher resolution. My 4k monitor is actually difficult for me to use, because it feels like the software has not caught up to the resolution yet. Very little is optimized for 4k desktops, yet.
5K is great and all, if your GPU and connection interface can handle it.
DisplayPort 1.2 offers roughly 17.28 Gbps of usable bandwidth, which is just enough to drive 4K @ 60 Hz as a single stream (Single Stream Transport), but 5K is far too large for the standard. This iMac comes stock with an R9 M290X (a 2012-era GPU) which supports up to DisplayPort 1.2. To get the bandwidth needed for 5K @ 60 Hz, Apple would have to push the DisplayPort signal well past its rated single-stream rate (see the arithmetic below).
It seems like the M295X upgrade is a necessity for this thing to render well.
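For the curious, a rough comparison of the raw 5K pixel rate against a single DP 1.2 link's usable bandwidth (payload only; real timings with blanking intervals need a few percent more):

    # Raw 5K @ 60 Hz payload vs. a single DisplayPort 1.2 link.
    dp12_usable_gbit = 17.28                          # DP 1.2 payload after 8b/10b
    payload_5k_gbit = 5120 * 2880 * 3 * 8 * 60 / 1e9  # ~21.23 Gbit/s

    print("5K@60 needs %.2f Gbit/s" % payload_5k_gbit)
    print("over budget by %.0f%%" % (100 * (payload_5k_gbit / dp12_usable_gbit - 1)))
    # -> roughly 23% more than a single DP 1.2 link can carry, before overhead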
I wonder if Apple will prevent rendering at full native resolution, like on the Retina MacBook Pros, because fonts and icons in many apps would be too small.
This would mean you are paying for a screen that can't render at the new fancy resolution because the resulting info would be too small to resolve by most people.
Many computer users suffer from eye strain because they have to stare to resolve the information; their eyes dry out, and then each blink causes tiny scratches which over time cause serious damage.
Ctrl+F'd in this thread for "eye strain" to see if anyone else reports oddly having more eye strain from using a retina screen.
I used a 15" macbook pro retina for 2 years pretty much 8 hours a day and noticed my eye sight deteriorating rapidly. I thought it was just the hours I was logging in front of a computer screen and thought 'well atleast I'm using a retina screen'. But now that I've stopped using the macbook and have moved to an old 20" lcd screen and working on Windows 7, I find that my eyesight has recovered significantly. I can read road signs and my ability to change focus has improved.
I'm not sure what the explanation could be but I wouldn't be surprised to hear that your eyes are actually more stressed when they try to resolve the denser retina pixels.
The point of a retina display is that the eye perceives it as continuous shapes and gradations of colour and tone.
It's more likely to be down to the size of fonts and visual elements. Would you say the font sizes you viewed on the MBP were smaller or larger than those on the 20" display? (As perceived by you, not necessarily as displayed on the screen, since viewing distance is also a factor.) It may be that the smaller screen size caused you to zoom out of content to fit more on the screen. That would have a far greater impact on eye strain.
The fact that smaller text would be crisp and readable on a retina display may have encouraged you to zoom out more than you might do on a lower resolution display though, so the retina display isn't totally off the hook as a contributing factor.
> The fact that smaller text would be crisp and readable on a retina display may have encouraged you to zoom out more than you might do on a lower resolution display though, so the retina display isn't totally off the hook as a contributing factor.
Thanks for the reply, yeah, I think this is the most likely culprit. Though, looking at the font size of Xcode right now, it doesn't seem to be any smaller than this text I'm typing on my 20" monitor. At one point, a year into working on the MacBook, I thought it could have been the brightness, so I changed my Xcode background to black and used a program called f.lux to reduce my screen's brightness, but it really didn't offer much relief.
Does your old 20" lcd screen have a CCFL backlight?
For some reason unknown to me, the light emitted by the LED backlight in my life (in an iPad 2) tends to cause a little unpleasant tension in me, but the CCFL backlights in my life (in an HP w2007 monitor and in a Samsung LN32A450 television set) do not.
One size that Apple does not offer is the actual native resolution of each display, no doubt because icons, text and other interface elements are too small for just about anyone to read.
In any event, the issue is customers should check if this product will actually render the full 5k resolution without hacking and if it is usable at the resolution. It is abusive to sell a product with a spec that you can't actually use. But Apple is all about form over function.
Thank golly space and pixels are not the same thing; you can always render more pixels in the same amount of space, keeping your UI quite usable and even improved, since rendering the same vector with 400 pixels will look much better than rendering it with 100 pixels (see the sketch at the end of this comment).
> It is abusive to sell a product with a spec that you can't actually use.
Ya, if pixels and space were the same thing, Apple should be sued! Thankfully they aren't.
> But Apple is all about form over function.
Because good looking rendering of vectors and fonts is only about form and doesn't improve function. For sure. </sarcasm>
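To make the pixels-versus-points distinction above concrete, here's the arithmetic, assuming the desktop defaults to a "looks like 2560x1440" setting (an assumption about the shipping configuration, not something confirmed here):

    # At a 2x "Retina" scale factor the UI keeps the same physical size,
    # but every element is drawn with 4x as many pixels.
    logical_w, logical_h, scale = 2560, 1440, 2

    print("%d x %d physical pixels" % (logical_w * scale, logical_h * scale))  # 5120 x 2880
    print("%d pixels per logical point" % (scale * scale))                     # 4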
It is great for media professionals to have such a display, but at the consumer level, isn't this a drawback? Isn't anyone concerned about how horrible all our existing media will look at this resolution? By horrible I mean: windowed it would of course look excellent, but scaled beyond a certain size, maybe not so much.
Most media consumption is done at 1080p or below; not everyone is fortunate enough to stream 2K or 4K content yet, and we are already pushing to 5K.
Wow. I've been waiting for a screen like that for a decade now. My first Mac purchase (retina display Macbook) was made only on the basis of the screen, and I don't regret it at all. I hope they come out with a standalone display soon, but I'll have to upgrade my laptop I guess to power it. I don't think the current MBPR's can even drive a 5K screen?
Very impressive. I notice that the new Mac Mini has dropped in price to £300 here in the UK, although of a significantly lower (rubbish) spec than the lowest model we've seen before. If this runs fine, it looks like an attempt to make inroads into the "PC" market, as £300 + a screen is really in the economy PC market area, no?
I have a Lenovo Y50 with a 4K screen, and a 50-inch Seiki 4K TV which I sometimes use as a monitor for my machine. 4K resolution is amazing, but there are definitely diminishing returns. I am sure 5K would be amazing too, but do not expect a lot if you are a power user or a programmer. Designers and other artistic folks may have more to mine here.
The 5K figure is a result of the iMac's existing 2560x1440 desktop resolution (rather than the 1920x1080 base that 4K and HD scale from) combined with a simple 2x "retina" multiplier: 2 x 2560 = 5120. Kicking 4K's butt when it's barely out of the gate probably helped too.
Hmmm, impressive. But (honest question) I wonder how this will handle Blu-rays or other full-HD content at full screen? All those pixels need to be interpolated at 60 (50) Hz...
Highly personal: I hate all the "Retina HD" marketing bla bla shit from Apple, but I love these beautiful iMacs. I want one :-).
I like the specs, but am wondering whether it is as quiet as a 2008 iMac 24" (I have one now). The parts are probably rated a bit higher TDP-wise. Any experiences yet, is there a noisy fan?
It's interesting that they have dropped Nvidia graphics; hopefully they won't for the rMBP line in the spring, seeing as I'm hoping to buy one then for CUDA development.
It's been a while since I played with GPU stuff but it seems like OpenCL just hasn't gotten as much love/development as CUDA.
Lest people think OpenCL is somehow at odds with NVIDIA:
"The OpenCL standard was developed on NVIDIA GPUs and NVIDIA was the first company to demonstrate OpenCL code running on a GPU," [1]
The Vice President of NVIDIA is (was?) the chair of the OpenCL working group (and president of the Khronos Group).
NVIDIA are easily the ones holding it back at this point. Both AMD and Intel have OpenCL 2.0 support on windows, and Nvidia hasn't even shipped support for OpenCL 1.2. Nvidia obviously want OpenCL to die off and CUDA to do well.
Well, there's also Intel, who decided to ship their accelerator product (MIC / Xeon Phi) without OpenCL support initially. Their big marketing angle was that HPC programmers could just use their existing OpenMP x86 code. This shows how Intel had no idea what they were getting into, because accelerator programming has to deal with different problems than CPU programming, regardless of whether you have a simple CUDA core or a MIC x86 core [1]. Results have so far been rather poor because of that. Kepler GPUs with CUDA support are for that reason still the best-tooled accelerators for HPC programming.
[1] First of all, when you need on the order of thousands to tens of thousands of threads to saturate your cores, memory becomes a big issue again. Your threads have maybe 400KB of usable memory each. Memory bandwidth is usually an even more important limitation - these systems have 2-3 times less bandwidth per physical thread than CPUs. This can be mitigated using a programming model that uses bandwidth very efficiently (e.g. stream programming as in CUDA or OpenCL) - an area where GPUs are traditionally more advanced, since they have dealt with this limitation for a long time. Don't get me wrong though, Intel will probably catch up at some point.
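The per-thread bandwidth gap in that footnote is easy to see with ballpark numbers (illustrative figures only, not vendor specs for any particular product):

    # Rough bandwidth-per-hardware-thread comparison: a quad-channel server
    # CPU vs. a MIC-style accelerator. Numbers are approximate.
    cpu_bw_gbs, cpu_threads = 51.2, 16     # e.g. an 8-core/16-thread Xeon
    mic_bw_gbs, mic_threads = 320.0, 240   # e.g. ~60 cores x 4 threads

    cpu_per_thread = cpu_bw_gbs / cpu_threads   # ~3.2 GB/s per thread
    mic_per_thread = mic_bw_gbs / mic_threads   # ~1.3 GB/s per thread
    print("CPU has %.1fx more bandwidth per thread" % (cpu_per_thread / mic_per_thread))  # ~2.4x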
Master Xtrem top most over ultimate power.
Penta HD 3D dolby monster hugely powerfull master.
Chef sergeant ultra hydra peta wonderfull ultimately bestest.
...
To the max !
...
When were you looking at this? Chrome used to behave really badly on retina macs; it still behaves somewhat badly, but it's a lot better. Safari and Firefox don't have this problem, however.
Most likely, your 2013 Mac Pro won't be able to do that because a single cable / single bus DisplayPort v1.2 cannot provide the bandwidth for 5k (DP v1.2 provides about 17.3Gbps which is insufficient).
Unless Apple provide a multi-cable solution for the 2013 Mac Pro when the 5k Thunderbolt display comes out, you won't be able to take advantage of 5k displays on it.
My prediction is that we'll see an updated Mac Pro that also uses the Haswell Xeons (2013 Mac Pro uses last gen Ivy Bridge) and also DP v1.3 (which can support 5k displays).
Well, because I need high-ppi displays:) That's the main reason I changed my 2011 iMac to the 2013 Mac Pro. Now it's weird: either a single 5k-screen with the new iMac, or two worse 4k-screens on Mac Pro.
5K? Seriously? This shouldn't make me anywhere near as furious as it has done. Fucking Apple, yet again, does whatever they can to make sure their users aren't compatible with the rest of the world.
What the hell are you on about? 5K is a very real resolution - it's 1440p doubled in each dimension. It means you can have a 1440p-looking display, but with retina scaling.
I've got a 4k display, with scaling I have a 1080p looking display and as a result wouldn't want to go any bigger than 24".
This is exactly the display I have been waiting for (only it has a computer inside it).
Of course it's a "very real resolution" are you an idiot? Any display made using current technologies uses a "very real resolution" because that's how displays work.
What I was talking about, however, is that every other company under the Sun that has a relation to displays/media/TVs/whatever is pushing 4K and 8K as the next standards after Full HD. Yet, Apple in their infinite wisdom, go and pull a number out of their asses that isn't evenly divisible into those emerging standards - even if it is easily scalable into them.
Why? What could they possibly gain besides consumer dependence on yet-another piece of proprietary Apple nonsense?
Sure, the point being made about it pushing other companies to develop comparable tech for cheaper stands - Dell selling a bare display for the same price as an entire Mac is ridiculous. But so what? Apple diehards will get 5K while the rest of the world goes with 4K and 8K, creating unnecessary disparity.
In what way are 5K displays incompatible with anything else? What actual problems does this cause anyone?
As to your question of why, one advantage is that these monitors can display true pixel-for-pixel 4K video in an editing suite while still leaving screen real estate free for the application user interface, film libraries, etc. Another is that the even higher fidelity means less pixelation and smoother color gradations than on a similarly sized 4K display. There are plenty of advantages.
Finally, I don't remember anybody complaining when Dell released their 5k display a few months ago.
No chance. The I/O is too slow - you'd need a mess of cables. As others pointed out, putting the computer inside the monitor bypasses this and lets the display driver interact directly with the display rather than sending the signal through a port and cable.
Funny how in their keynote they went on and on about how having a high resolution display is great... a month after they released the sub 1080p iPhone 6.
They put the "Retina" display in the iMac. This means people will buy it. Higher volume means whoever (LG, I think?) is manufacturing the screens will have to produce more, driving the cost down. That means they will sell variants. Then their competition will also sell competitive options because nobody will want 1080p on a computer screen anymore.
Monitor technology has been stalled for years. This is going to be a gigantic kick in the pants to the industry!