n0x1ous - Friday, March 15, 2019 - link
Move over Nvidia? Really? These are 60 Hz with no adaptive refresh. Someone looking to buy a BFGD wouldn't even consider this.
rocky12345 - Friday, March 15, 2019 - link
I get what you are saying here, but most people won't care about things like that; all they see is 86" and go "OMFG I want I want I want... it." Personally I would be OK with it, mainly because I currently have a 60" Samsung 4K TV and also game on it, and guess what, it is 60 Hz and works just fine. It has a PC mode that pretty much gets rid of any sort of lag. Yes, it is only 60 Hz, but even so I still kick butt in most online shooters, probably against people with 144 Hz monitors. All I need is skill and a good gaming mouse set to 1000 Hz with DPI at 1600-2400, and my reflexes take care of the rest.
edzieba - Friday, March 15, 2019 - link
No HDR either. 1200:1 advertised contrast is well down in the SDR range, and if they actually had FALD backlights that would be plastered in the marketing copy.
jajig - Saturday, March 16, 2019 - link
I agree, the lack of adaptive refresh is an instant ignore.
hMunster - Saturday, March 16, 2019 - link
Wouldn't that depend on the game? Not everything needs high refresh rates. These would be great for a flight simulator.
stanleyipkiss - Friday, March 15, 2019 - link
Damn. I just bought a $3000 Samsung 75 inch-er. Would've gone for 86 inch for an extra $447...
ToTTenTranz - Friday, March 15, 2019 - link
At 400nits in that 86", you're probably better off with your Samsung. At least when watching proper HDR content.DanNeely - Friday, March 15, 2019 - link
"What Separates TVs from Monitors, Anyhow?"Monitors don't default to resizing the input by 5-10% because Joe Sixpack thinks making the parts he sees bigger makes it look better than the equivalent panel showing a 100% scaled image?
lmcd - Friday, March 15, 2019 - link
Monitors don't come with integrated spyware :)
jabber - Sunday, March 17, 2019 - link
Also remote controls, optical outs, Dolby sound etc. etc. etc.
vanilla_gorilla - Friday, March 15, 2019 - link
The last couple of TVs I've set up have done 1:1 pixel mapping. I think default overscan on LCD displays is going the way of the dodo.
ayunatsume - Saturday, March 16, 2019 - link
Include the signal processing. Monitors are supposed to display the data 1:1, as faithfully as possible. High refresh rates need to be attained without sacrificing display quality and faithfulness. TVs are meant for entertainment use, so sources are displayed with enhancements (added saturation, surreal sharpening, Smoothmotion or whatever the brand adds on) or may have reductions in data (e.g. 4:2:2) as compensation for performance.
This is like saying Audio-Technica and Sennheiser monitor and reference headphones are for studio and production use (having the sound as faithful to the source) while Beats headphones skew the audio to what their target audience likes.
On the subject of overscan, it's typically not done anymore, but sometimes it's still kinda there, as they do some post-processing to make the video 'smoother' aside from deinterlacing and whatnot. Think Premiere's Warp Stabilizer. But plain overscan now? lol
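To put the 4:2:2 point above in rough numbers, here is a back-of-envelope sketch assuming uncompressed 8-bit-per-component video at 4K60:

```python
# Average bits per pixel: 4:4:4 keeps all three components (24), 4:2:2
# halves chroma horizontally (16), 4:2:0 halves it both ways (12).

def data_rate_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

for name, bpp in [("4:4:4", 24), ("4:2:2", 16), ("4:2:0", 12)]:
    print(f"4K60 {name}: {data_rate_gbps(3840, 2160, 60, bpp):.1f} Gbit/s")
# -> 11.9 / 8.0 / 6.0 Gbit/s: subsampling is how a too-slim link still
#    carries "4K60", at the cost of chroma resolution.
```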
nevcairiel - Saturday, March 16, 2019 - link
Unfortunately you would be surprised how many recent TVs still default to overscan in certain preset settings, and sometimes it's not even obvious how to defeat it.
Death666Angel - Saturday, March 16, 2019 - link
I haven't run into this in the last 10 years, setting up my own TVs or those of my family (5 in total). Not saying it doesn't exist, I just haven't seen it. As an option, yes. As a hidden feature enabled by default and devious to disable, no. :D
Calabros - Friday, March 15, 2019 - link
360 W? I thought we wouldn't see numbers like these after the good old days of plasma.
Death666Angel - Saturday, March 16, 2019 - link
I wasn't earning any money when plasma TVs were a thing, so I don't know much about them beyond the generalities. What would be the size and brightness of a plasma TV consuming 360 W?
Death666Angel - Saturday, March 16, 2019 - link
And just for reference, considering my 42" CCFL LCD TV (2009 era) warmed up my living room considerably, I think it generated a few hundred watts as well.
Azethoth - Monday, March 18, 2019 - link
All you need to know about plasma is that it looked good, but not for long. They broke down, frequently.
Makaveli - Sunday, March 17, 2019 - link
For first-gen plasmas, sure. My 2012 plasma doesn't use 300+ watts.
Standard power: 133.58 watts
Calibrated power: 264.46 watts
ElFenix - Friday, March 15, 2019 - link
TVs have tuners. If it doesn't have a tuner, it's a monitor.
wrkingclass_hero - Saturday, March 16, 2019 - link
Correct!
nevcairiel - Saturday, March 16, 2019 - link
Well, I would generalize that a bit more. A tuner is nice and all, but I know so many people that don't use those parts at all anymore.
So what I would say: the ability to display its own content, no matter if it's from a tuner or from apps that get it from the Internet, be it simple IPTV or Netflix etc.
p1esk - Friday, March 15, 2019 - link
p1esk - Friday, March 15, 2019 - link
$3,447 for 86" TV? Wow, that's pretty good!zodiacfml - Friday, March 15, 2019 - link
It has been like that since 1080p. I bought a 4K TV as a monitor for cheap and couldn't be happier.
Pinn - Saturday, March 16, 2019 - link
DP. Duh.
imaheadcase - Saturday, March 16, 2019 - link
Pixel density: 58.7 ppi vs. 51.2 ppi.
That answers the question you posed. Case closed.
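For reference, those two figures fall straight out of the geometry, assuming 3840x2160 panels at the 75" and 86" diagonals discussed in this thread:

```python
# ppi = diagonal pixel count / diagonal length in inches
import math

def ppi(w_px, h_px, diagonal_in):
    return math.hypot(w_px, h_px) / diagonal_in

for diag in (75, 86):
    print(f'{diag}": {ppi(3840, 2160, diag):.1f} ppi')
# -> 75": 58.7 ppi, 86": 51.2 ppi -- both far below even a 27" 1440p
#    desktop monitor (~109 ppi), which is the density gap at issue.
```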
eva02langley - Saturday, March 16, 2019 - link
At this point, Samsung is starting to roll out FreeSync on their TVs. Monitors will become more and more of a niche.
HStewart - Saturday, March 16, 2019 - link
I don't believe FreeSync is that big of a factor - both TVs and monitors were out before FreeSync.
HStewart - Saturday, March 16, 2019 - link
Maybe for the niche market of gamers FreeSync is a requirement.
arashi - Saturday, March 16, 2019 - link
Again, IntelStewart misses the point.
808Hilo - Saturday, March 16, 2019 - link
Not enough quality, just sized-up yesteryear's tech.
SanX - Saturday, March 16, 2019 - link
Exactly. Almost all new Samsung models have FreeSync, and some older ones after a firmware update too. See the best site reviewing TVs as gaming and work monitors: Rtings dot com.
The 4 ms is pretty impressive, but many 4K TVs also got close to 10 ms, which fits 99.9% of people.
Excluding those 0.1% of pros who need 120 Hz at 4K or special gamut requirements: if you people don't use a 4K TV as your PC and game monitor, consider yourselves dumbos and retarded technophobes, period.
nevcairiel - Sunday, March 17, 2019 - link
I have no interest in a 50" or even a 30"+ TV/Monitor for working or gaming. If you like that sort of format, all the power to you, but calling people retarded for not wanting to stare on a way too huge screen on their desk is a bit over the top, don't you think?At the 27" sweet spot for me, they just don't make TVs. Nevermind wanting a high refresh rate screen, which is not a feature i'll ever want to give up again for gaming.
HStewart - Saturday, March 16, 2019 - link
"What separates TV from Monitors, Anyhow?"Well I think the difference use to be important but as long as it has some kind of sound device on it, I think it can be used as a TV - one could think it has a TV tuner - but not a days with device like a Amazon Fire TV combine with and Fire TV recast - who cares - of course as long as you have audio - I think Satellite of Cable will work anyway.
Hul8 - Monday, March 18, 2019 - link
A display without a tuner can be used as a TV (with a set-top box etc.), but the crucial difference is that it can't be sold as one.
ajp_anton - Saturday, March 16, 2019 - link
"they feature an extremely robust set of connectors to attach multiple devices"I wouldn't call it exremely robust, with only two connectors that support the monitor's native resolution at 60Hz.
twtech - Saturday, March 16, 2019 - link
That's a very competitive price for an 86" screen - if I didn't already have an 82", it would be tempting.
It doesn't really matter whether they call it a TV or a monitor unless you aren't planning to connect a receiver to it.
Kevin G - Sunday, March 17, 2019 - link
Two things have traditionally separated TVs from monitors: built-in speakers and built-in over-the-air tuners. Built-in speakers in monitors are certainly not standard today, but they are common enough, with their inclusion being a side effect of display interfaces carrying audio.
Built-in tuners still define a line between monitors and TVs, though the importance of that line is fading as streaming takes over the same role.
There does appear to be a new line emerging in that TVs rally around the HDMI connector while monitors follow VESA standards. There was a brief moment where DP 1.2 was found on the first generation of 4K displays, as HDMI 1.3 didn't officially support 4K (it had the bandwidth for 4k30, as that figure didn't change with HDMI 1.4), but for the most part DP has been a monitor connector. I suspect that TVs will continue to use HDMI as the primary interface while monitors transition from DP to USB-C, especially as USB 4 becomes commonplace.
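As a rough sanity check of that 4k30 claim, using commonly quoted figures (HDMI 1.3 and 1.4 share a 340 MHz max TMDS clock over three data lanes with 8b/10b coding; 3840x2160@30 with standard blanking needs about a 297 MHz pixel clock at 8 bits per component):

```python
MAX_TMDS_MHZ = 340          # per-lane clock ceiling for HDMI 1.3/1.4
PIXEL_CLOCK_4K30_MHZ = 297  # 4K30 with standard blanking

capacity = MAX_TMDS_MHZ * 3 * 10 / 1000   # Gbit/s on the wire, 3 lanes
required = PIXEL_CLOCK_4K30_MHZ * 3 * 10 / 1000

print(f"Link: {capacity:.1f} Gbit/s, 4K30 needs {required:.2f} Gbit/s")
# -> 10.2 vs 8.91: the raw bandwidth was already there in 1.3; HDMI 1.4
#    just added 4k30 as an official video format.
```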
eastcoast_pete - Sunday, March 17, 2019 - link
No HDR, no FreeSync, no quantum dots or similar color range for that money? No. One note of caution about OLED TVs as an alternative: beware of the potential for burn-in if using your nice OLED TV as a monitor, especially at high brightness during the day. That's the main reason why I have forgone that route for now.
Other points:
1. Yes, there are fully articulated VESA mounts even for 60kg plus TVs or monitors. Installing them is a different story. You will need at least 2 strong or 3 average people to make sure it doesn't end up on the floor, and really good fasteners/wall anchors.
2. What these JN displays show is mainly that mainstream TV manufacturers are not good at covering the full range of their market. I for one would love to have at least 1, better 2, DisplayPort connectors on my TV. Comes in handy for some situations.
3. Ditto that for FreeSync. It's open and free now, so why not? It would help differentiate TVs for not too much effort.
Lolimaster - Sunday, March 17, 2019 - link
IPS and piss-poor contrast; once you've experienced OLED, this feels like a pirate cannon in an ICBM war.
Xajel - Monday, March 18, 2019 - link
I guess two main differences...
1. Monitors are meant to be viewed closer than TVs, so monitors must have stricter requirements to help with eye strain.
2. TVs have a lot of post-processing to do, which adds a lot of latency. While monitors have this also, they tend to do it either faster or do less of it. TVs now have a gaming mode (even some monitors do) which helps lessen the post-processing for much less latency.
But there's also other stuff. TVs are usually tailored for low refresh rates; after all, broadcasts and movies have used 24 and 30 fps for a long time. When faster refresh rates came, it was just a post-processing feature; the input was still limited to 24 and 30 fps. Later, newer TVs supported faster input like 60 fps and 120 fps. My TV can do 120 Hz internally, but while it has HDMI 1.4, it's limited to 1080p 60 Hz input only.
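A toy sketch of the cadence being described: 24 fps content on a 60 Hz panel via classic 3:2 pulldown, which is the uneven judder that motion-smoothing post-processing tries to interpolate away (illustrative only):

```python
# Each film frame is held for alternately 3 and 2 panel refreshes,
# averaging 2.5 refreshes per frame: 24 fps * 2.5 = 60 Hz exactly.

def pulldown_32(film_frames):
    refreshes = []
    for i, frame in enumerate(film_frames):
        hold = 3 if i % 2 == 0 else 2   # 3,2,3,2,...
        refreshes.extend([frame] * hold)
    return refreshes

print(pulldown_32(["A", "B", "C", "D"]))
# -> ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
#    4 film frames fill exactly 10 panel refreshes.
```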
Gunbuster - Monday, March 18, 2019 - link
Another swing and a miss. Wish the TV input board/scaler designers would crawl out from under the rock they have been living under, producing out-of-touch designs. Hey, at least they got the memo about DisplayPort from 5 years ago.
mr_tawan - Monday, March 18, 2019 - link
Here in Thailand, every monitor that has an HDMI input is a TV (by the customs standard).
Altagon - Monday, March 18, 2019 - link
The major thing separating a "monitor" from a TV is the ability to work 24x7 over a significant amount of time in industrial configurations (including multi-monitor setups). A TV is not designed for that, so there is no warranty.
Beaver M. - Tuesday, March 19, 2019 - link
If you want high-quality overdrive and low response time/input lag, you go for G-Sync anyway.
Might as well just take a normal one if you're looking for a FreeSync one.
I also like how they don't mention if it's edge-lit or direct-lit. Just like with monitors (which turn out to be mostly edge-lit, which is far inferior and causes other problems like BLB)...
Beaver M. - Tuesday, March 19, 2019 - link
Never mind, they actually say it's direct-lit. But no dimming whatsoever?