edzieba - Wednesday, August 29, 2018 - link
Display'HDR' 400 and no FALD = not actually HDR in any useful sense. Like sending 1080p to an EDTV panel is not HD in any useful sense.
Death666Angel - Wednesday, August 29, 2018 - link
What's and EDTV panel? And HDR is what VESA agreed upon. Time to move on. It's not like they are tricking anyone. Consumers are still obligated to do their homework.
Death666Angel - Wednesday, August 29, 2018 - link
*an EDTV
Inteli - Wednesday, August 29, 2018 - link
According to Wikipedia, EDTV is "Enhanced-Definition TV". Above SDTV, below HDTV (as in 720p). EDTV typically refers to 480p or 576p - progressive scan versions of SDTV resolutions.
edzieba - Wednesday, August 29, 2018 - link
"It's not like they are tricking anyone."
Panel technology has not appreciably advanced in contrast ratio for years. Without a FALD backlight, the contrast ratio available is identical to an SDR monitor's. That means the dynamic range is also identical for the same peak backlight level (400 cd/m² is hardly uncommon). You're getting an SDR monitor that happens to accept HDR inputs, but you're not getting an HDR monitor.
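[Editor's note: the claim above can be sketched numerically. The 1000:1 figure is the nominal static contrast assumed in this thread, not a measurement of any specific panel.]

```python
import math

# Dynamic range in stops is log2(peak / black level), which for a
# single-zone backlight reduces to log2(static contrast ratio).
def dynamic_range_stops(peak_nits: float, contrast_ratio: float) -> float:
    black_level = peak_nits / contrast_ratio
    return math.log2(peak_nits / black_level)  # equals log2(contrast_ratio)

# Same native contrast => same dynamic range, HDR input support or not:
print(round(dynamic_range_stops(400, 1000), 2))  # ~9.97 stops
```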
zodiacfml - Thursday, August 30, 2018 - link
For this monitor, you do have a point. It is too expensive for its specs, just a 4K monitor with a higher refresh rate. Basic FALD would improve its value.
However, edge-lit HDR is still usable. It improves contrast ratio a bit and saves some power when the average picture level is not high.
Alistair - Wednesday, August 29, 2018 - link
It's all about that high refresh rate and 4K. I was hoping for $1000 without the FALD, so I'm still on the fence with this one.
Lolimaster - Wednesday, August 29, 2018 - link
What high refresh rate at 4K? It's 4:2:2, CRAPPY image quality. Might as well stream the games over the internet.
Cooe - Wednesday, August 29, 2018 - link
4K 100Hz @ 4:4:4 isn't fast enough for you??? Spoiled rotten then.
inighthawki - Wednesday, August 29, 2018 - link
There's a fairly noticeable difference between 100Hz and 144Hz.
Kamus - Thursday, August 30, 2018 - link
I'd probably settle for 100Hz myself instead of having to deal with chroma subsampling in some types of video games. But for any twitch shooter, I'd sacrifice the color to get 144 (and honestly, I wish I could have more).
100 vs 144 is noticeable; I can even tell the difference from 120 to 144 in any FPS game.
But for desktop use, I'd probably just do 100Hz, since the chroma subsampling is actually annoying there.
SirPerro - Wednesday, August 29, 2018 - link
That's not entirely true. The panel in dimly lit rooms will be perfectly fine with HDR.
Not the most accurate nor impressive HDR, but definitely noticeable.
imaheadcase - Wednesday, August 29, 2018 - link
HDR has never been noticeable on any monitor. I mean, come on, it's just higher brightness. It's just marketing buzz.
FreckledTrout - Wednesday, August 29, 2018 - link
Have you seen it on an OLED monitor?
willis936 - Wednesday, August 29, 2018 - link
Is it really, or are you just talking out of your ass? 10 bits per channel is 2^10 (1024) intensity levels per channel. That runs right at the limit of a nominal 1000:1 static contrast ratio (you'll find most panels are 10% below this when calibrated and 30% below it when running ULMB). You want contrast ratio overhead to help mitigate issues like not being in a pitch-black room. The benefit over 8 bits per channel here is not significant. More contrast ratio would be much better.
Lolimaster - Wednesday, August 29, 2018 - link
Especially on a crappy low-contrast IPS display.
Kamus - Thursday, August 30, 2018 - link
Actually, that's a very common misconception.
To understand why 1000 nits was pushed so hard as the "bare minimum" acceptable brightness, you have to look at Samsung vs LG.
Samsung knew that they needed an edge over LG, so they tried to make 1000 nits the bare minimum acceptable brightness for a set to be called HDR capable.
They did this because they knew OLED (back then) could, at best, do about 500-600 nits of brightness (which is still 6x the 100 nits that all SDR content is mastered at). Their plan kind of backfired, as the UHD Alliance declared that OLED could also reach the certification, even if it peaked at lower brightness.
You see, the problem with SDR is that it's still being mastered to compensate for the shortcomings of CRT displays, which couldn't get much brighter than 100 nits. However, LCDs and OLEDs have far exceeded the peak brightness of CRTs for years.
The fact is, just about any LCD out there is capable of doing HDR if the content is tone mapped correctly. If you have a PC, you can actually try it yourself using MadVR: just enter the max brightness of your monitor (and crank it up to max brightness). It will also convert the EOTF to a standard gamma curve so the image is displayed correctly.
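[Editor's note: as a rough illustration of what that tone-mapping step does, here is a simple Reinhard-style curve. It is a sketch only; MadVR's actual algorithm is considerably more sophisticated.]

```python
# Display-side tone mapping in miniature: compress luminance mastered
# for up to 10,000 nits into what a real monitor can show, instead of
# clipping everything above the panel's peak.
def tone_map(scene_nits: float, display_peak: float) -> float:
    x = scene_nits / display_peak
    return display_peak * (x / (1.0 + x))  # approaches display_peak, never exceeds it

# A 1000-nit highlight shown on a 400-nit monitor:
print(round(tone_map(1000, 400), 1))  # ~285.7 nits: compressed, not clipped
```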
So even if you're missing the wide color gamut, you'll still get very vivid colors because colors actually look more saturated the brighter they are, just using Rec.709. You'll still get very different colors from an SDR grade (but you still won't get colors that are outside the gamut, like say a Ferrari red or most neon tones) and it will still look a lot better than the SDR grade.
For video games it's even better. If the HDR is done right, the game will include a peak brightness slider and let you choose anywhere from 300 to 10,000 nits of peak brightness, with the engine taking care of the tone mapping. (Battlefield 1 does this, and it looks amazing even on my 430-nit 4K TV.)
Another thing to consider is that the first couple of doublings of brightness are much more noticeable than the ones that follow. (Going from 500 to 1000 nits doesn't result in 2x perceived brightness; it's a lot less than that.)
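[Editor's note: a quick sketch of this point. The 0.33 exponent is a rough Stevens' power-law figure for brightness; real perception also depends on adaptation and surround, so treat this as an illustration, not a model.]

```python
# Perceived brightness grows roughly as luminance^0.33, so doubling
# the nits gives far less than double the perceived brightness.
def perceived_brightness(nits: float, exponent: float = 0.33) -> float:
    return nits ** exponent

ratio = perceived_brightness(1000) / perceived_brightness(500)
print(round(ratio, 2))  # ~1.26x perceived, despite 2x the nits
```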
As far as your FALD comment goes, it will help low-APL scenes a lot, but with the usual trade-offs that just make me wish I could buy an OLED monitor and forget about LCD forever.
TL;DR: HDR at <1000 nits works just fine, even with Rec.709. And I would take a 600-nit OLED over a 4,000-nit LCD any day of the week for HDR content. (It would blow it out of the water.)
FullmetalTitan - Friday, August 31, 2018 - link
Your last comment is exactly how I decided on an OLED TV over a Samsung QLED. The brightness just didn't matter as much as the massive difference in contrast.
JessW - Wednesday, October 3, 2018 - link
Hi Kamus,
You seem to be very knowledgeable on this topic.
I have been waiting for cheaper versions of the Acer Predator X27 or ASUS PG27UQ (based on the M270QAN02.3?).
I was thinking this monitor you speak of here, the Acer Predator XB273K, or the AOC AGON AG273UG would be very decent options. Or even the ASUS equivalent, whose model I do not know yet, unless anyone else does?
If you do not like the XB273K, would you be able to recommend another computer monitor with specs similar to the ASUS PG27UQ?
FYI - my usage is gaming a fair bit, but I dabble in photography and really appreciate the contrast, colour range, and colour accuracy features.
Looking at the ASUS PG27UQ specs, I have broken them down into groups by how important they are to me.
1. Ideally - has these specs
Size: 27"
Brightness: 600 cd/m²
Contrast Ratio (Max.): 1000:1
Display Colors: 1.07 billion
2. Nice to have - these specs
Display Bits: 10-bit
Resolution: 3840 x 2160
Refresh Rate: 120 Hz
Sync: G-SYNC
Panel Type: IPS
Features: EyeCare, HDR, Quantum-dot
Response Time: 4 ms
3. Not required
UHD / HDR: 4K UHD
Price USD$: $2340
Thank you in advance,
zodiacfml - Thursday, August 30, 2018 - link
I got a TV with basic HDR connected to a PC. The HDR is noticeable. It is not blindingly great, but it's usable; I leave it on all the time.
Kamus - Thursday, August 30, 2018 - link
Ummm, I thought I replied to this comment. Anyway, see my post below to understand why that's not true.
milkod2001 - Wednesday, August 29, 2018 - link
Anybody know if a 32" version is also coming?
Alistair - Wednesday, August 29, 2018 - link
That's what I'm waiting for too! Sometime next year.
DanNeely - Wednesday, August 29, 2018 - link
According to TFT Central, panel production at AUO starts early next year, with the first shipping products available mid-2019.
Unfortunately, neither LG nor Samsung is currently planning on making similar panels. (Both companies are known for significantly better quality control than AUO.)
http://www.tftcentral.co.uk/articles/high_refresh_...
imaheadcase - Wednesday, August 29, 2018 - link
Acer and Asus pretty much abandoned it. The reason is pretty simple: it's already hard to keep up with the smaller versions, and making the panel proved too expensive. You think the $2000 one was expensive? That would easily put it close to the $3k mark. Of course they still might make it, but from all the people I've talked to (because I was very interested in it as well), they are not making it at this time.
They do have some out in the wild, but I think those are sample models.
Alistair - Wednesday, August 29, 2018 - link
Maybe the FALD version, but I didn't hear they abandoned the normal 32-inch 4K 120Hz ones.
milkod2001 - Wednesday, August 29, 2018 - link
Going through a Newegg review, this monitor has a fan. Did not expect that:
Pros: 4K HDR panel, high refresh rate, 1000 nits
Cons: Poor quality control. 8+ dead pixels on my panel. Speakers distort at moderate volume. Fan never turns off even when the monitor is power off. Sent it for repairs 3 weeks ago, still waiting on a fix.
Other Thoughts: I would avoid the panel given the quality issues and if you need a repair prepare to be without your monitor for 21+ days.
StevenD - Wednesday, August 29, 2018 - link
1 nit = 1 cd/m², so it's not 1000 nits, it's 400.
DanNeely - Wednesday, August 29, 2018 - link
I'm guessing it's using the same huge FPGA to do G-Sync as its FALD sibling, and that chip sucks too much power to passively cool in a reasonably sized package.
Lolimaster - Wednesday, August 29, 2018 - link
Those fake 144Hz 4K monitors, why not simply advertise them as 4K 98Hz?
Moronic gamers seem to like playing YouTube-quality real-time games xD.
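[Editor's note: the 98 Hz figure traces back to DisplayPort 1.4 bandwidth. A back-of-envelope sketch; blanking overhead is ignored, so the real cutoffs are somewhat tighter than shown.]

```python
# Which 4K modes fit in DisplayPort 1.4's ~25.92 Gbit/s effective
# (HBR3, after 8b/10b coding) payload bandwidth, ignoring blanking.
DP14_EFFECTIVE_GBPS = 25.92

def data_rate_gbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
    return width * height * hz * bits_per_pixel / 1e9

modes = {
    "10-bit RGB @ 144 Hz":   data_rate_gbps(3840, 2160, 144, 30),  # full HDR signal
    "10-bit RGB @ 98 Hz":    data_rate_gbps(3840, 2160, 98, 30),   # drop refresh
    "10-bit 4:2:2 @ 144 Hz": data_rate_gbps(3840, 2160, 144, 20),  # subsample chroma
    "8-bit RGB @ 120 Hz":    data_rate_gbps(3840, 2160, 120, 24),  # drop bit depth
}
for name, gbps in modes.items():
    verdict = "fits" if gbps <= DP14_EFFECTIVE_GBPS else "too much"
    print(f"{name}: {gbps:.1f} Gbit/s ({verdict})")
```

Only the full 10-bit RGB 144 Hz mode overshoots the link, which is why the monitor offers the three fallbacks discussed in the next comment.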
Alistair - Wednesday, August 29, 2018 - link
Set to 8-bit color, use 4:2:2, or 98Hz, then it works as advertised... it's the combination of HDR and high refresh rate that gets a bit much for DisplayPort, so you have 3 different options on how to use it.
FreckledTrout - Wednesday, August 29, 2018 - link
At these costs I'll be waiting for a Mini LED backlit monitor. That is probably good enough FALD to get close to OLED.
zodiacfml - Thursday, August 30, 2018 - link
I got a TV with basic HDR connected to a PC. The HDR is noticeable. It is not blindingly great, but it's usable; I leave it on all the time.