I apologize if this is an incredibly stupid question, but why do these monitors cost so much? As I understand it, you can buy a good 4k TV for considerably less, so what features of this monitor make it a better deal?
I've been using a Samsung 40" 4k TV as my main monitor for development work for about 8 months and it's been great for me.
A couple caveats -- Because I don't game or do graphics work, I don't know or care about color reproduction. I made sure to get a TV, graphics card, and cable that supported 4k@60hz with 4:4:4 chroma so the text is perfectly sharp and the refresh is not obtrusive. It's not hard to find a Samsung TV around $350 that supports that, but as far as I could tell, I had to go up to a GTX1050 graphics card to get that. I sit between 24 and 28" away from the monitor so the text is readable to me at full resolution. At that distance, I do have to turn my head to comfortably read text in the corners of the TV, especially the upper corners. In practice, that means I keep monitoring-type applications such as CPU charts, Datadog graphs, etc., in one of those corners for quick reference. While I still have two 1920x1080 monitors on either side of the TV, it's quite nice to be able to open up on my main monitor a huge IDE window and, when necessary, put 3-4 normal-sized windows next to each other for comparison work.
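To put rough numbers on why 4k@60hz with 4:4:4 needs a newer card and HDMI 2.0: here's a back-of-the-envelope sketch in Python (assumes 8 bits per channel and ignores blanking intervals, which add some overhead on top):

    # Uncompressed video bandwidth estimate (ignores blanking overhead).
    def bandwidth_gbps(width, height, fps, bits_per_channel=8, chroma="4:4:4"):
        # Average bits per pixel for common chroma subsampling schemes.
        bits_per_pixel = {
            "4:4:4": 3.0,  # full-resolution color
            "4:2:2": 2.0,  # chroma halved horizontally
            "4:2:0": 1.5,  # chroma halved in both directions
        }[chroma] * bits_per_channel
        return width * height * fps * bits_per_pixel / 1e9

    print(bandwidth_gbps(3840, 2160, 60, chroma="4:4:4"))  # ~11.9 Gbit/s
    print(bandwidth_gbps(3840, 2160, 60, chroma="4:2:0"))  # ~6.0 Gbit/s
    print(bandwidth_gbps(3840, 2160, 30, chroma="4:4:4"))  # ~6.0 Gbit/s

HDMI 1.4 carries roughly 8 Gbit/s of video data, which is why older outputs fall back to 30Hz or 4:2:0 at 4K; HDMI 2.0 and DisplayPort 1.2+ have the headroom for 4K@60 with full 4:4:4 chroma.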
Same here. I bought a Philips BDM4065UC monitor in 2015 (40 inch, 3840x2160 resolution) for 740 € (in Germany) and I'm loving it. The only caveat with that model is that the response time is a bit slow (I would guess 5 ms between full black and full white).
When I replace it, the replacement will be the same size and resolution, but I'll probably go for OLED and HDR-10 or better.
I happened to buy the same one, and imported it from Germany to the UK.
I was very excited for it, as it has a very similar DPI to a 27" 1440p Korean import; coming from 3+ 1080p monitors originally, the lack of bezels was incredible.
It's not built for gaming at all though: extreme screen tearing and poor response times have me now looking to replace the 27" 1440p Korean import with something >144Hz and G-Sync, before replacing the Philips BDM40.
Yeah, I got that Philips too. For games, 40 inches at 4K resolution is really a lot more practical than 38 inches with an odd resolution, and it's cheaper too.
I think the fact that the 27" monitors next to it are closer to the camera in that photo makes the TV look artificially smaller. It absolutely dwarfs the 27" monitors. If you're close to any store that carries 40" 4k TVs, though, just go there and stand 28" away from one. I think you'll see what I'm saying about having to turn your head to comfortably read text in the corners, something that has never been the case for me with smaller monitors.
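To put rough numbers on the head-turning, here's a quick sketch (assumes flat 16:9 panels viewed from straight-on):

    import math

    def field_of_view_deg(diagonal_in, distance_in, aspect=(16, 9)):
        """Horizontal and vertical angles a flat screen covers at a given distance."""
        w, h = aspect
        width = diagonal_in * w / math.hypot(w, h)
        height = diagonal_in * h / math.hypot(w, h)
        horizontal = 2 * math.degrees(math.atan(width / 2 / distance_in))
        vertical = 2 * math.degrees(math.atan(height / 2 / distance_in))
        return horizontal, vertical

    print(field_of_view_deg(40, 28))  # ~64 x 39 degrees
    print(field_of_view_deg(27, 28))  # ~46 x 27 degrees

So at 28" the 40" panel spans roughly 40% more of your field of view than a 27" does, which lines up with having to turn your head for the corners.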
I just bought (today) a 32" 1080p monitor and I'm lovin' it.
I think it's more productive to use a single big screen than a two-monitor setup.
My old setup consisted of two monitors, one 23" and one 21". One day I realized that my 23" monitor could not reproduce colors properly. I needed to Photoshop a picture; I sent the retouched version to my phone, and because I couldn't see the full color depth on the 23", the image looked ugly on the better screen. That's when I learned not to buy the cheapest monitor based only on its size and resolution. So I sold that monitor and bought a 32" 1080p monitor at a bargain price.
Of course it would be better if I could buy a 1440p monitor at this size (or even 4K, OMG :)). But considering my budget this was all I could purchase.
I had serious doubts about the pixel density, but there was nothing to be afraid of. In usual tasks (e.g. browsing, writing code in Emacs, the terminal) nothing disturbs my eye; it's beautiful. However, if I open a 1080p YouTube video fullscreen, it looks as if it were a high-quality SD video, because of the low pixel density. I sit close to the monitor, and although I cannot make out individual pixels, I can notice the difference.
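For reference, the pixel densities involved are easy to compute (a quick sketch; diagonal sizes, standard 16:9 panels):

    import math

    def ppi(width_px, height_px, diagonal_in):
        """Pixels per inch along the diagonal."""
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(1920, 1080, 32)))  # ~69 PPI (32" 1080p)
    print(round(ppi(2560, 1440, 27)))  # ~109 PPI (27" 1440p)
    print(round(ppi(3840, 2160, 40)))  # ~110 PPI (40" 4K)

~69 PPI is low by today's standards, which is probably why full-screen video is where you notice it first.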
I bought my monitor for 900 Turkish lira, equivalent to 254 dollars, and I think it is a great investment.
I detect no lag when moving windows around. It might not (or might -- I simply don't know) be good enough for gamers, but it's definitely smooth enough for development, document, or web browsing (aka productivity) tasks.
144Hz monitors do produce a very noticeable difference for gaming if your in-game FPS can keep up with the refresh rate; outside of gaming I've only found it noticeable when moving windows about.
For productivity I'd go for the 40" 2160p over a lower resolution higher refresh panel.
Mine is a 40" Samsung UN40KU6300 though Samsung appears to rev its model lineup frequently so they may have already released a successor to this model. It was a little difficult to confirm that it supported 4k@60hz with 4:4:4 chroma, but I relied on the experiences of some folks on an audio/video forum who confirmed that they had been able to run with those settings. Unfortunately, I don't remember the name of the forum or have it in my browsing history. I do wish manufacturers would make that information easier to come by though it's certainly a tiny minority of consumers that know or care about those kinds of specs.
rtings.com is a TV review site that is superb for finding such details; they test for 4:4:4 chroma specifically, along with input lag in each mode, and one of their sub-scores is "use as a PC monitor". A valuable resource.
It's designed for graphic design and other display-critical tasks, so it is calibrated to 99% sRGB color space. What it looks like on this monitor will be what it should look like in print and on the best of every other display. Plus, it's the UltraSharp top-end model, so all the mechanical construction will be top notch.
TVs, on the other hand, are designed to show the most oversaturated, "vibrant" colors on the demo loop on the show-room wall. And mechanically, they're designed to hang on the wall and never be touched.
> so it is calibrated to 99% sRGB color space. What it looks like on this monitor will be what it should look like in print and on the best of every other display.
That’s completely useless for graphic design. sRGB is defined as the lowest common denominator of CRTs on the market in the mid-90s.
Actual monitors for graphic design purposes, every TV released in 2017, and most new gaming monitors instead use AdobeRGB, DCI-P3 (the colorspace also used in digital cinema), or even Rec.2020.
I have an UltraSharp and it's fucking rubbish. Half of the screen has a yellow tint, and it has dark corners. Obviously support says it's "within spec".
It's not odd. The size is what makes this expensive. It's 10 inches more screen, and curved to boot (which you may not care about, but it does affect cost).
This screen is much bigger (37.5" vs 27"), has a better refresh rate (75Hz vs 60Hz), and is curved. All of those things correspond with higher price for both the panel itself and the resulting product that contains it.
* removed mention of FreeSync as this display lacks it
I have a 5k iMac myself, but for my work, which is Linux based, I would rather have that wide-screen. The iMac certainly has far better font rendering, but the real estate is that of a 27" screen. With Linux, HiDPI support isn't quite there yet, so for now the 38" screen gives you the larger desktop.
This is a good display, and price has gone down.
But is there any way to connect it to a Windows 10 / nVidia PC and get the full 5K resolution? No GPU that I’m familiar with has support for USB-C.
I cannot speak for 4K TVs, but as I was doing research for my own purchase I found the following differences between monitors and TVs.
TVs are said to be configured to display the best-looking colors, while monitors try to stay true to the real colors.
TVs have an immensely higher refresh rate: 50Hz/60Hz usually suffices for a TV, but on monitors we speak of milliseconds. The refresh rate makes a lot of difference if you intend to game on your computer/monitor. A professional e-sports Redditor claims he improved his playing performance severalfold after switching to a monitor instead of a TV. (https://www.reddit.com/r/FIFA/comments/5whb9v/did_the_change...)
Other than these differences and the occasional overscan/underscan problems on older televisions, I personally see no reason to prefer a monitor over a TV if there is a huge price difference. If the price difference is minor, I'd opt for a monitor.
Refresh rate and response time are different. Refresh rate is how often a new frame is displayed, i.e. 50/60/144Hz. Response time is the amount of time it takes a pixel to change from one color to another, i.e. 1/2/4/16ms.
>>Over a decade has passed since the LCD monitor unceremoniously ousted the boxy CRT monitor into obsolescence, but with that ousting came a small problem: CRT monitors redrew every frame from scratch, and this was baked into the fundamentals of how PCs sent information to the screen. Monitors redrew the screen with a refresh rate of 100 Hz, (100 times a second), and they were silky smooth.
>>LCD monitors don’t have this problem, because the pixels don’t need to be refreshed.
>>Instead, they update individual pixels with new colors, and each of those updates takes a certain amount of time depending on what the change is.
>>The response time refers to how long this change takes. LCDs started with a 60-Hz refresh rate but a response time of about 15 milliseconds (ms).
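To make the distinction concrete, here's a small sketch (the response-time figures are just illustrative):

    # Refresh rate: how often a new frame arrives. Response time: how long a
    # pixel takes to finish changing color once the new frame has arrived.
    for hz in (60, 144, 240):
        frame_time_ms = 1000 / hz
        print(f"{hz:3d} Hz -> a new frame every {frame_time_ms:.1f} ms")

    #  60 Hz -> a new frame every 16.7 ms
    # 144 Hz -> a new frame every 6.9 ms
    # 240 Hz -> a new frame every 4.2 ms
    #
    # A panel with a ~15 ms response time is still mid-transition when the next
    # 60 Hz frame arrives (ghosting/smearing); a ~1-4 ms panel finishes well
    # within a single frame even at 144 Hz.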
There are also factors like input lag, separate from display refresh, that can be a problem for people playing action games. Even casual players can notice high input delay from time to time (once we're into the 100ms+ range), so it's a good idea to check the display's input lag if games are anywhere near a possibility. While I like my 34UM95P for professional purposes, it's really not that great for games that are much more than interactive movies. In contrast, nothing stops me from using an Acer x34 monitor for coding 95% of the time.
And it double-sucks, because here in Italy we have to pay a "TV tax" even if the TV service isn't hooked up. Modern LCD/OLED TVs are perfectly fine for most tasks, except maybe high-end FPS gaming. The colors are good and the refresh rates are decent.
That's an ownership tax on TVs, just as there's an ownership tax on cars. The fact that the proceeds of the TV ownership tax pay for the public (national) TV service is just a side factor. As very few people were actually paying it, the TV ownership tax has been embedded into the electricity bill. _Italians, good people_
I was just saying that above: it's an ownership tax. A lot more people are paying it now that it's embedded in the electricity bill, so it's pointless to recommend paying it; at this point it's basically enforced.
I believe you should comment in English if you wish to be understood.
It's like how megapixels are a bad measure of camera quality. This is a pro monitor with accurate color reproduction, low fading on the edges, minimal dead pixels, etc.
I know someone who uses a 1080p TV as a monitor for his gaming PC. The picture quality is poor; it's too blurry to be tolerable for work. That's okay in this particular case, since he uses a different PC for work, but that one needs a proper monitor.
Granted that's 1080p, but it wouldn't surprise me if the same thing is true of 4k.
Related musing: why is it seemingly easier to make very high resolution small screens (phones) than large screens of the same resolution? Instinctively I would think smaller LEDs are harder to make, but it doesn't seem to be the case.
There is a greater failure rate the larger you make a panel; a great many are discarded from production lines. If the screen is smaller, it is less likely to contain a defect.
Also, that cost is hidden in a phone; I'm not sure what the screen component alone would be worth.
A large defective panel can be cut down into smaller panels most of which will have no defects. You can sell most of your faulty large panels as small panels, but there is no extra source of large panels.
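A toy yield model shows how strongly this favors small panels (a sketch assuming randomly scattered defects, i.e. a simple Poisson model; the defect density is made up for illustration):

    import math

    DEFECTS_PER_SQ_INCH = 0.002  # made-up defect density, for illustration only

    def zero_defect_yield(area_sq_in, defect_density=DEFECTS_PER_SQ_INCH):
        """Poisson model: probability that a panel of this area has no defects."""
        return math.exp(-defect_density * area_sq_in)

    phone_area = 2.7 * 5.4      # roughly a 6" 18:9 phone panel, in square inches
    monitor_area = 34.9 * 19.6  # roughly a 40" 16:9 panel

    print(f"phone panel yield:   {zero_defect_yield(phone_area):.0%}")    # ~97%
    print(f"40-inch panel yield: {zero_defect_yield(monitor_area):.0%}")  # ~25%

And as noted above, many of the defective large panels can still be cut down into defect-free smaller ones, which tilts the economics even further toward small displays.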
Right, and a phone could spend $50 to build that screen on a $700 device (not that they do). Just a raw 75 * $50 is near $4,000, which would explain a high price.
That is, if it is the pixels per unit of area that drive the cost, vs the size of the display itself.
Obviously it is harder to make a 1 inch x 1 inch display with 1 million pixels vs 1000. But maybe the difference from 600ppi to 200ppi is not enough to matter...
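For what it's worth, a factor of ~75 is roughly what you get comparing raw panel areas (a quick sketch; the 4.6" phone size is an assumption):

    import math

    def panel_area_sq_in(diagonal_in, aspect=(16, 9)):
        w, h = aspect
        width = diagonal_in * w / math.hypot(w, h)
        height = diagonal_in * h / math.hypot(w, h)
        return width * height

    tv = panel_area_sq_in(40)      # ~684 sq in
    phone = panel_area_sq_in(4.6)  # ~9 sq in
    print(round(tv / phone))       # ~76x the area

So if cost scaled purely with area, the 75 * $50 figure above would be in the right ballpark; in practice yield (see the sketch above) and volume matter at least as much as area or pixel count.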
Also, incidentally, I'd happily pay a bit of a premium for a TV / monitor that has good image quality but no other features. Perhaps not 2x as much, but 20% might be doable.
High-end monitors are niche products. 40" TVs aren't.
HDMI 2.0 is a pain compared to DP; other than that, there's almost no reason not to buy a TV if you just care about office use. I've been using a Samsung UHD TV for almost two years.
Monitors are designed to be viewed at close range, from multiple near angles. TVs are designed to be viewed several meters away, at roughly the same angle.
TVs play content at a fixed, modest frame rate, so there's no reason for them to refresh any faster. Monitors can run at very high refresh rates because the content can be very fast paced, especially on gaming monitors.
FreeSync/G-Sync also integrates with the graphics card to reduce tearing and lag. This is the most expensive part of these monitors.
- TVs are generally low framerate; as much as they'd like to claim 240 FPS, it's mostly all 30 FPS, with software interpolation to increase the frames.
- Bulk. TVs are sold in higher numbers, justifying the lower price.
- Distance from face. Your 60" TV can be two smaller panels "glued" together. That's not noticeable from watching distance, but with a monitor so close to your face, you're more likely to notice the momentary tearing.
"- TV's are generally low framerate, as much as they'd like to claim 240FPS, it's mostly all 30FPS, with software interpolation to increase the frames."
That makes absolutely no sense. If the panel is incapable of 240 refreshes per second, how does software interpolation "increase the frames"? You are confusing content and panel.
Mainly it is that for more than 60fps you need dual-link DVI, DisplayPort, or HDMI 2.0, and you won't find many TVs with any of those. So even though the screen can do more than 60fps, the input and the processor can't take it.
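Rough numbers (same kind of back-of-the-envelope as earlier in the thread: 8 bits per channel, full 4:4:4 chroma, blanking overhead ignored):

    def video_gbps(width, height, fps, bits_per_pixel=24):
        return width * height * fps * bits_per_pixel / 1e9

    print(video_gbps(3840, 2160, 60))   # ~11.9 Gbit/s: needs HDMI 2.0 or DP 1.2+
    print(video_gbps(3840, 2160, 120))  # ~23.9 Gbit/s: beyond HDMI 2.0, needs DP 1.3+ / HDMI 2.1
    print(video_gbps(2560, 1440, 144))  # ~12.7 Gbit/s: fine over DP 1.2, too much for HDMI 1.4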
That’s not what the comment said. Obviously, the point of interpolation is to take a signal with few samples and increase them. What the comment said was that the hardware (panel) was not capable, yet the software was somehow able to do it. That makes no sense.
More like: the panel supports 240Hz, but you can't get 240Hz content into the TV (such sources barely exist and would require high-end interfaces), so you interpolate the 30Hz content (which you do have) up to 240Hz. Why go through all this trouble for fake 240Hz? Probably because it sounds good as marketing.
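For the curious, that "smoothing" is conceptually just synthesizing in-between frames. Here's a minimal and deliberately naive sketch that just cross-fades neighboring frames; real TVs use motion-compensated interpolation, not a plain blend:

    import numpy as np

    def interpolate_frames(frames, factor):
        """Naively synthesize in-between frames by linearly blending neighbors.

        frames: list of HxWx3 float arrays at the source rate (e.g. 30 fps).
        factor: output multiplier (e.g. 8 to fill a 240 Hz panel from 30 fps).
        """
        out = []
        for a, b in zip(frames, frames[1:]):
            for i in range(factor):
                t = i / factor
                out.append((1 - t) * a + t * b)
        out.append(frames[-1])
        return out

    # Two tiny stand-in frames at "30 fps" become nine frames for a "240 Hz" panel.
    src = [np.zeros((216, 384, 3)), np.ones((216, 384, 3))]
    print(len(interpolate_frames(src, factor=8)))  # 9

The panel really does refresh 240 times a second; the interpolation just manufactures frames to fill those slots. That helps perceived smoothness for film and sports, but it adds processing delay rather than removing it, which is why it's usually switched off in game mode.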