Edit / Grading Monitor

GeeBax wrote on 2/19/2014, 12:43 AM
Having recently had to sell body organs to finance a new computer for editing, I wanted to get a moderately good monitor to replace the one I am using. A good quality professional monitor is out of my price range, so I was looking for something a little more affordable, and I came up with this one from the Asus range: https://www.asus.com/au/Monitors_Projectors/PA279Q/

I would be grateful for the opinion of members here, and their thoughts on this monitor.

Geoff

PS, if you are shopping for body parts on eBay, take care, you might get mine, and I don't recommend them. Too much mileage, and the kidneys have seen far too much red wine.

Comments

VidMus wrote on 2/19/2014, 1:12 AM
I read the reviews on the Egg and Amazon, and most are quite favorable. According to some of the reviews, the full resolution is only available over the DisplayPort input.

Best to shop around and read the reviews both pro and con to get the best idea if this monitor will meet your needs or not.
GeeBax wrote on 2/19/2014, 2:13 AM
Yes, I did read a number of reviews, and the ones from the gaming community are all favourable, but I suspect their requirements are different to ours.

Fortunately, my graphics card has a DisplayPort output, so I should be able to access the full resolution.
Grazie wrote on 2/19/2014, 2:55 AM
Stoopid Alert here: Am I right to think that my HD 1920x1080 will appear in the middle of this ASUS, with a substantial black border around it? And that if I were to select a 1920x1080 resolution (if it can do that through drivers) I'd lose out on definition? In which case, should I even bother? I've been looking around too......

Grazie



farss wrote on 2/19/2014, 3:37 AM
[I]"Am I right to think that my HD 1920x1080 will appear in the middle of this ASUS, with a substantial border "black" around it?"[/I]

Yes, Vegas does this for you, the same as when you display SD on a 1920x1080 monitor. On this 2560x1440 panel, a 1920x1080 image shown pixel-for-pixel would leave 320-pixel bars left and right and 180-pixel bars top and bottom.
From memory you can tell Vegas to scale the image to fit, however, as you've suggested, this could cause a loss of resolution.

[I]" In which should I even bother?"[/I]

I cannot think of a reason, unless you'll also use the monitor for other duties where it would help. My Asus is 1920x1200, which means smallish black bars at the top and bottom that I find convenient as a source of reference black, but I don't think I'd like black the whole way around the image.

Bob.
GeeBax wrote on 2/19/2014, 4:06 AM
I currently have an Asus VG236H, an older 23" monitor with CCFL illumination, and my main reason for updating is to obtain a more colour-accurate monitor. I have looked around and all the 'proper' ones are out of my reach price-wise, hence my looking at this one.

Is your Asus one of the professional series Bob?
farss wrote on 2/19/2014, 5:28 AM
[I]"
Is your Asus one of the professional series Bob? "[/I]

One of the ProArt series, and I do like it; many others here have said the same in previous threads. There are better, such as the HP Dreamcolor, but the price goes up and up very steeply as you get into the more "pro" monitors, and you also start to pay for features, such as an HD-SDI input, that incur additional expense to make use of.

Bob.
OldSmoke wrote on 2/19/2014, 6:46 AM
GeeBax
This would be a 10-bit monitor and, if I remember right, there was a thread here about the 10-bit capability of Vegas. Was the consensus that Vegas actually doesn't preview 10-bit via the Windows display output? Did I get that wrong?

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A
Ram: G.Skill 4x8GB DDR4 2666 XMP
CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner
PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

videoITguy wrote on 2/19/2014, 8:08 AM
Oldsmoke, yes, it appears that Vegas does not allow more than 8 bits through the preview output. It has also been true in the past that the Blackmagic hook-up, when supported (up to Vegas Pro 9 officially), offered pass-through of a 10-bit signal within its own parameters. That does not appear to help with the choke that the preview is placing.
OldSmoke wrote on 2/19/2014, 9:44 AM
So is there any benefit in having a 10-bit monitor that is "only" connected to a graphics card using the standard Windows interface?

rmack350 wrote on 2/19/2014, 11:14 AM
Several Adobe apps appear to use the 10-bit output, so if you're using one of those then perhaps there's a benefit. It's certainly conceivable that Vegas might someday support 10-bit preview through a graphics card, but it doesn't up through VP12. I wouldn't buy such a monitor if VP12 were my only consideration.

Rob
JohnnyRoy wrote on 2/19/2014, 12:22 PM
I have two ASUS ProArt PA246Q (1920×1200) 24" monitors, which are also 10-bit, and I love them. They are pre-calibrated and come with the calibration report in the box. You'll really like this monitor.

~jr
farss wrote on 2/19/2014, 1:48 PM
[I]" So is there any benefit in having a 10bit monitor that is "only" connected to a graphic card using the standard windows interface?"[/I]

Yes, consider this.

Most video cameras use 14 bits for their image processing. It stands to reason that a display device, which is doing the reverse of what a video camera does, should be just as capable. It too has to do image processing as it maps one set of 8-bit values to what the actual pixels of the LCD panel present to the viewer.

Bob.

GeeBax wrote on 2/19/2014, 2:50 PM
I will also be using the monitor in conjunction with Resolve, and it is being driven by a GTX470 card, which appears to be capable of higher image bit depth.
videoITguy wrote on 2/19/2014, 3:06 PM
What has Bob been drinking? The mapping and processing of a camera sensor goes on internally; the external signal would most often be 8 bits on the transmission side, be it recording to a memory card or even the direct analog connectors on the side of the camera.
Mapping and processing are internal, just like 8-bit sources mapped to 32-bit video processing in Vegas Pro on a 32-bit project timeline... it still gets to an 8-bit codec for transmission.
Now if you could hook up processing so that the video pipeline to the monitor is a processing conduit, that would be the form you are looking for.
The 8-bit preview output of Vegas Pro seems to be clamped before it gets on the wire.
OldSmoke wrote on 2/19/2014, 3:10 PM
Bob

I doubt a computer monitor works this way, but I can see how a TV would, or a DVD player that does upscaling. I also believe the image processing is done in the graphics card rather than the PC monitor; the monitor displays what the card presents... pixel for pixel.

farss wrote on 2/19/2014, 3:39 PM
[I]" I also believe the image processing is done in the graphics card rather then the PC monitor; the monitor displays what the card presents... pixel per pixel. "[/I]

No, high-end monitors use look-up tables (LUTs) for calibration.

The ProArt series lets you change colour space, white point, etc. in the monitor.

[edit]
From the PA246Q specifications:
SPLENDID Video Preset Modes : 6 Modes (Adobe RGB Mode/sRGB Mode/Scenery Mode/Theater Mode/Standard Mode/User Mode)
Color Temperature Selection : 3 Modes
Color Accuracy : ΔE ≤ 5
Gamma adjustment : Yes (supports Gamma 2.2/1.8)
Color Adjustment : 6-axis adjustment (R, G, B, C, M, Y)

[edit2]

From discussions elsewhere, in "User Mode" you can recalibrate the monitor using a Spyder etc.

As I said previously, there is a LOT of image processing going on inside these monitors, and doing it at more than 8 bits makes a lot of sense. Traditionally, monitor, printer and scanner calibration has been done in the computer using ICC profiles. This has changed.
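To put a number on that, here's a quick toy sketch in Python (the gain value is made up, and this isn't any monitor's actual firmware): even a simple white point correction is just a per-channel gain, and if the result is rounded straight back to 8 bits, input levels merge together, whereas 10 bits of internal headroom keeps every step distinct.

[code]
# Toy sketch (hypothetical values, not any real monitor's firmware): a white
# point correction is just a per-channel gain. Applied at 8-bit precision it
# collapses input levels; with 10-bit precision the steps stay distinct.
blue_gain = 0.92  # e.g. warming the white point by attenuating blue

levels_8bit = set()
levels_10bit = set()
for code in range(256):                      # every possible 8-bit blue input
    corrected = code * blue_gain
    levels_8bit.add(round(corrected))        # quantise to 8-bit output
    levels_10bit.add(round(corrected * 4))   # quantise to 10-bit output

print(f"distinct 8-bit output levels:  {len(levels_8bit)} of 256 inputs")
print(f"distinct 10-bit output levels: {len(levels_10bit)} of 256 inputs")
# At 8 bits, roughly 8% of the input codes merge (visible as banding in
# gradients); at 10 bits every input code keeps its own output level.
[/code]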



Bob.
R0cky wrote on 2/19/2014, 3:42 PM
I have this one and love it:

http://www.bhphotovideo.com/c/search?Ntt=LMD2110W&N=0&InitialSearch=yes&sts=ma&Top+Nav-Search=

It's close in price to what you were looking at, but worth it for an entry-level pro monitor.


rocky

OldSmoke wrote on 2/19/2014, 3:59 PM
A look-up table doesn't define how an 8-bit output from a graphics card is converted to a 10-bit display... or does it?

farss wrote on 2/19/2014, 4:01 PM
[I]"A look up table doesn't define how a 8-bit output from a graphic card is converted to 10bit display... or does it?"[/I]

A LUT is simply a way of implementing a transfer function. It can be at any bit depth however to keep the size of the LUT reasonable interpolation is often used.
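As a toy illustration of how that can work (my own sketch in Python, not pulled from any monitor's firmware): store a handful of nodes of the transfer function instead of one entry per input code, and interpolate between them at lookup time; the output can be at whatever bit depth the panel wants. I've used gamma 2.2 as the transfer function since that's on the PA246Q's spec sheet.

[code]
# Toy sketch of a sparse 1D LUT with linear interpolation (illustrative only).
# Instead of storing all 256 entries for an 8-bit input, store a few nodes of
# the transfer function and interpolate between them at lookup time.

def make_lut_nodes(transfer, node_count=17, out_max=1023):
    """Sample the transfer function at evenly spaced inputs (10-bit output)."""
    return [round(transfer(i / (node_count - 1)) * out_max) for i in range(node_count)]

def lut_lookup(nodes, code, in_max=255):
    """Map an 8-bit input code through the sparse LUT, interpolating linearly."""
    pos = code / in_max * (len(nodes) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(nodes) - 1)
    frac = pos - lo
    return round(nodes[lo] * (1 - frac) + nodes[hi] * frac)

nodes = make_lut_nodes(lambda x: x ** 2.2)   # gamma 2.2 transfer function
print(lut_lookup(nodes, 0), lut_lookup(nodes, 128), lut_lookup(nodes, 255))
# -> roughly 0, 225, 1023: a 17-entry table standing in for a 256-entry one.
[/code]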

Bob.
videoITguy wrote on 2/19/2014, 4:13 PM
Oh yeah, and just as we said: if the transmission chain is 8-bit, the LUT is interpolating from 8-bit values - good grief.

I would bet that calibrating with a Spyder, which I do all the time on a good 8-bit monitor that costs less, is not going to coax 10-bit improvements out of a monitor fed an 8-bit transmission. Let's be real-world and deal with what most people can deal with.
John_Cline wrote on 2/19/2014, 4:16 PM
Getting any new monitor isn't going to guarantee any more accurate grading unless you have a way to calibrate it. I use the Spyder4 from Datacolor.

http://spyder.datacolor.com/display-calibration/
OldSmoke wrote on 2/19/2014, 4:29 PM
That is exactly what I thought. If I feed the monitor 8-bit, I can't expect to see 10-bit.

farss wrote on 2/19/2014, 5:05 PM
[I]"That is exactly what I thought. If I feed the monitor 8bit I cant expect to see 10bit."[/I]

Obviously not.
What's missing, though, is that the phosphors used in LCD monitors are not a good match to the dyes used to provide the RGB filtering in the camera, i.e. the native colour spaces don't match, unlike with CRTs. More precision at the mapping certainly doesn't hurt to reduce the errors.
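To put a rough number on that (the matrix below is invented for illustration, not the ASUS's actual primaries): remapping camera RGB to the panel's native primaries is a 3x3 matrix multiply per pixel, and the rounding error the monitor adds on top of that mapping drops by roughly a factor of four going from 8-bit to 10-bit precision.

[code]
# Rough illustration (the matrix is invented, not any real panel's primaries):
# remapping camera RGB to the panel's native primaries is a 3x3 matrix multiply,
# and quantising the result coarsely adds error on top of the colour math.
M = [[0.95, 0.04, 0.01],   # hypothetical camera-to-panel colour matrix
     [0.02, 0.96, 0.02],
     [0.00, 0.05, 0.95]]

def remap(rgb, out_levels):
    """Apply the matrix to normalised RGB, quantised to out_levels steps."""
    out = []
    for row in M:
        v = sum(c * x for c, x in zip(row, rgb))
        out.append(round(v * (out_levels - 1)) / (out_levels - 1))
    return out

rgb = (100 / 255, 150 / 255, 200 / 255)      # an arbitrary 8-bit pixel
exact = [sum(c * x for c, x in zip(row, rgb)) for row in M]

for bits in (8, 10):
    approx = remap(rgb, 2 ** bits)
    err = max(abs(a - e) for a, e in zip(approx, exact))
    print(f"{bits}-bit mapping: worst channel error = {err:.6f}")
# Same 8-bit input either way, but the 10-bit mapping adds about 4x less
# rounding error from the monitor's own processing.
[/code]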

In any case, we're hardly talking about spending significantly more money, and the ProArt series has other features that I've used many times. For me, the on-screen alignment grid that the monitor provides, and that Vegas doesn't send to the external preview monitor, was worth the couple of extra hundred dollars alone.


Bob.
GeeBax wrote on 2/19/2014, 9:58 PM
[I]"Getting any new monitor isn't going to guarantee any more accurate grading unless you have a way to calibrate it. I use the Spyder4 from Datacolor."[/I]

Actually, Asus give away a Spyder with one of their monitors as a package deal, and I would buy one anyway.

In any event, the colour balance of one of these monitors is going to be closer out of the box than a non-professional monitor's. Also, not all monitors are able to adjust their colour rendition anyway; many simply do not have the controls.