18 Comments
DanNeely - Wednesday, March 14, 2018 - link
Do we know who's making the panel for this monitor? I know AUO had one that would fit in the works; but historically AUO panels have had too much variance (among other QA issues) to be suitable as pro-grade displays. If it's theirs, I'd suggest any pros hold off until third parties can start testing the displays to see how consistent they are.

AUO also kept slipping their 32" 4K model at the same rate as the much-delayed 27" 144Hz 4K panel. Does this one being available indicate that the 27" gaming one has finally escaped production hell as well, or have they swapped the release order? Since they were slipping both panels, I'd always assumed the 384-zone backlight was the blocking problem; but if this one is out while the gaming display is still stuck, it suggests the latter might have a different limiting factor.
Alistair - Wednesday, March 14, 2018 - link
I think it is a bit of a tragedy, honestly. TVs are cheaper than monitors even though they are four times the size.

Where is the OLED monitor with all the HDR goodies? Make it happen, LG. It would cost a tiny fraction of this monitor. Split that 4K OLED TV into four pieces!
mukiex - Wednesday, March 14, 2018 - link
384 local dimming zones vs. 8 million pixels really does sound outright tragic. I would love a 120Hz 4K OLED monitor, price be damned.

norazi - Wednesday, March 14, 2018 - link
OLED still has a ways to go for desktop use due to burn-in. OLED TVs and smartphones use various tricks to prevent static images, like shifting pixels and sleep dimming. Some OLED phones still exhibit permanent burn-in on the status and navigation bars (e.g. read the Pixel 2 XL and LG V30 reviews). A desktop computer will show a static image MUCH more than a TV or smartphone... things like a Photoshop UI or taskbar will be up for hours at a time.

Alistair - Wednesday, March 14, 2018 - link
Burn-in is only a minor problem right now. It happens if you use the display all day long, but not with entertainment use; it's mainly all-day work or signage, that type of thing.

If it was $1000, I'd buy a 120Hz OLED regardless of burn-in.
bug77 - Thursday, March 15, 2018 - link
Oh good. When this monitor costs $2k, you'd buy a much better one for $1k.

I'm sure manufacturers will oblige, now that they've been made aware.
Valantar - Thursday, March 15, 2018 - link
That's rather naive. "Not with entertainment use"? Even PCs used "mainly" for gaming often spend as much time (or more!) in desktop applications as they do in-game. And nobody uses a monitor for long-term content consumption, seeing how TVs are cheaper and bigger. As such, you'd have the choice of whether you'd want to burn in your taskbar icons, your favourite game's UI, or both. It might take a while, but with 4-5 hours of use a day, I'd be surprised if burn-in wasn't apparent within a year.

Mitigation tech such as pixel shifting and the like is also wholly ineffective for PC UIs. Take taskbar icons: all you'd achieve is the burned-in Chrome icon being slightly blurrier around the edges, as it would need to shift by a very significant amount to actually avoid burn-in.
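To put rough numbers on the taskbar-icon point, here is a small back-of-the-envelope sketch. The icon size and shift distance are assumed, illustrative figures, not measurements from any real panel:

```python
def always_lit_fraction(icon_px: int, max_shift_px: int) -> float:
    """Fraction of a square icon's footprint that stays lit under every
    shift of up to max_shift_px pixels per axis (the worst case for burn-in).
    Only a border of max_shift_px on each side ever gets relief."""
    core = max(icon_px - 2 * max_shift_px, 0)
    return (core * core) / (icon_px * icon_px)

# A 32x32 taskbar icon orbited by a typical +/-3 px shift:
print(f"{always_lit_fraction(32, 3):.0%}")  # 66% of the area never gets a rest
```

Under these assumptions, two thirds of the icon's area is lit in every shifted position, which is the "blurry edges, burned-in middle" outcome described above.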
Tams80 - Friday, March 16, 2018 - link
^ this.

"Oh, but there's pixel shifting" is of almost no use. My old Nokia with an OLED display had pixel shifting, but the UI was frequently changing anyway, and on the sleep screen the whole display changed frequently.
Valantar - Thursday, March 15, 2018 - link
Is that surprising? Bigger = lower pixel density = easier to produce = cheaper. 4K at 32" is twice the pixel density of 4K at 64", meaning you'd need to shrink both the pixels and the pixel pitch (or at least the sum of the two) by half on both axes. That's no small feat. Material costs are the only cost drawback for TVs, and as with all modern manufacturing, that's not really an issue.

LG's smallest OLED TV is 55". Quartering that gives you a 27.5" 1080p panel. Would you really want that? While 4K is largely useless at >10' viewing distances (whether the difference between 4K and 1080p can even be seen is highly arguable), the same pixel density at 3' would not look good at all. Not to mention that TVs tend to have significantly larger pixel pitch than monitors (the black areas between pixels), which could give a 1st-gen-VR-like grid effect up close. Not good.
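The density claims above are simple geometry; a quick sketch using the sizes mentioned in the comment:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 4K UHD at monitor vs. TV sizes:
for size in (32, 55, 64):
    print(f'4K at {size}": {ppi(3840, 2160, size):.0f} PPI')  # 138, 80, 69

# Quartering a 55" 4K TV gives 27.5" at 1080p, i.e. the same density:
print(f'1080p at 27.5": {ppi(1920, 1080, 27.5):.0f} PPI')  # 80
```

So 32" 4K is exactly twice the density of 64" 4K, and a quartered 55" OLED would sit at roughly 80 PPI, well below what's usual at desktop viewing distances.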
CheapSushi - Thursday, March 15, 2018 - link
What about MicroLED? I think it might be better overall than OLED.

HStewart - Wednesday, March 14, 2018 - link
This would be an excellent monitor for a 3D graphics artist, especially for 3D movie production. My combination, Lightwave 3D plus Photoshop, was the same one used for a lot of sci-fi productions, notably Babylon 5 and Star Trek. I can also see why Asus would be interested in making this monitor: I'm sure many game developers and artists would love to use it.

One curiosity: with 4K content in both movies and games, I would think the monitors should be 5K, so full 4K content fits on screen with enough resolution left over for controls. Of course, one could always use dual screens.
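For what it's worth, the 5K-for-4K argument is just pixel arithmetic, assuming the common 5120x2880 5K panel resolution:

```python
uhd = (3840, 2160)       # 4K UHD content
panel_5k = (5120, 2880)  # a typical 5K panel

# Showing 4K content pixel-for-pixel leaves this much room for controls:
spare = (panel_5k[0] - uhd[0], panel_5k[1] - uhd[1])
print(spare)  # (1280, 720): a 1280 px side column or a 720 px bottom strip
```

That spare strip is itself a 720p-class region, which is why 5K is popular for editing 4K footage at 1:1 with the tool palettes still visible.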
DanNeely - Wednesday, March 14, 2018 - link
No personal experience, but when the issue was raised elsewhere, the response I've seen from people claiming to be pros was that it's always one monitor for full-screen playback with all controls/etc. on the second.

HStewart - Wednesday, March 14, 2018 - link
Yes, that is typical on the internet. I am no professional, but I have used the professional packages before; some of that equipment is just way too expensive. Most people are probably just gamers who look at the price and wonder why it is so expensive. The only thing I noticed is that DCI-P3 is used for theaters.

My version of Lightwave 3D is really old: 9.6, released in 2009, while it is now up to a 2018 version. It looks like it was used on some of the Star Wars movies, per the following PDF.
http://static.lightwave3d.com/marketing/lightwave_...
Gothmoth - Wednesday, March 14, 2018 - link
Lightwave is maybe used by 100 people in the biz... it's practically dead. Try to find a job offer that asks for Lightwave experience... good luck.

mr_tawan - Wednesday, March 14, 2018 - link
For amateur work like mine, dual Full HD monitors are sufficient for Full HD video cutting. One monitor stays on the preview at all times, while the other screen is where I make the edits.

StrangerGuy - Thursday, March 15, 2018 - link
I have zero interest in HDR at the moment, but I must say this is an infinitely better product than that fake "HDR" crap BenQ tried to pull a while ago with the SW271.

umano - Friday, March 16, 2018 - link
I've read a review comment on B&H and it's not promising. At that price, in Europe the NEC PA is a better choice: contrast is the same, and it will surely suffer less from color and luminosity bleeding.

The good thing about the Asus is the backlight calibration for different areas of the screen, so even if the panel is not perfect we can adjust luminosity later. But it seems the Asus software doesn't work. So it is just an Adobe RGB panel with HDR content support, but not HDR editing, of course. Eizo just released its own 32" for HDR editing: 1000 nits, 1,000,000:1 contrast ratio, and a 28k-plus-taxes price tag. So until they fix their software this is way overpriced, and the guy on B&H wrote he had to recalibrate the day after due to serious white balance drifting. I wanted to buy this monitor, but damn! :)
chriscalvert - Monday, May 28, 2018 - link
I am most unhappy to find that this monitor is considerably dearer in every other country than in the US! This means that we are basically subsidising the US, which of all countries on this planet is the least deserving of a cheaper price. They have money coming out of their ears, and until there is somehow a reshuffle of the true cost of money and they have to pay back some of this excess, we end up helping them get further into debt. Maybe with Trump there this might happen sooner rather than later.