I'm interested in purchasing this monitor for use in post. However, looking at the connections, I don't immediately see how this monitor would be connected to a computer running Vegas.
Any ideas or suggestions would be greatly appreciated.
It's hard to see the inputs in the photos, but it says it has HDMI. That should work, depending on the video card in the PC.
I took a look at Sony's website, but I didn't see where to download manuals. Hmmm.
1- You have to figure out how you want your monitor hooked up to Vegas.
If you need to deal with VTRs, then you'll have an SDI card anyway, and you'd get a monitor with SDI input. (I don't think you fall into this category.)
Otherwise, you generally want to view over a setup similar to how the final product will be displayed. So if it's a Blu-ray, then ultimately you'll want to monitor off your Blu-ray player to check that everything looks good... so that'll probably be HDMI. I'm not up on how you might set up Vegas to output over HDMI (or whether the material would get flagged as interlaced, if applicable, in which case the monitor will try to deinterlace the signal).
For now, I'd look at getting a 1920x1200 computer monitor, hooking it up via DVI, and using it for monitoring in Vegas under Windows Secondary Display. A lot of monitors also have HDMI, so you can hook them up to an HD source that way. I'm not sure whether any of these monitors are good as both computer monitors and as TV sets (which requires deinterlacing, etc.), as I haven't played around with them.
If your budget is bigger, JVC has a 24" broadcast monitor with DVI and HDMI input. It's under $2500 (the price keeps dropping).
I'd respectfully disagree. Just because it is marketed as a professional monitor does not mean that it is one (e.g. no offence to Boland Comm, but their products fall into that category IMO). LCD monitors have gotten a lot better and more affordable. By today's standards, 1920x1080 monitors are much more affordable... even a Decklink + Apple Cinema Display will do it (though it doesn't deinterlace).
2- Jay, that should be the one. I saw the version of that monitor with SDI input at NAB and it looked good (but NAB is not a great place to look at monitors).
We just bought a couple of Panasonic monitors with HD-SDI inputs. They work a treat for the EX1/3 in the field. Much goodness: they run off batteries, do pixel-to-pixel viewing, and have built-in, vaguely useful scopes.
HD-SDI is the best way to feed monitors. You're using an industry-standard connection, the same as what goes to VCRs, and you can monitor and meter it with calibrated hardware tools. Then there's no confusion over levels, etc. This avoids some of the most common screwups. Yes, it's expensive, but so are screwups.
I think you need to monitor your final deliverable with as few distortions as possible.
If you need to deal with VTRs, then absolutely go with HD-SDI (and monitor the HD-SDI output off the deck to see what the confidence heads see).
If you are outputting for computer formats, then I would monitor via DVI... there's the whole silliness in Vegas of having to manually wrangle levels (studio RGB, with black at 16 and white at 235, versus computer RGB's full 0-255 range); monitoring via HD-SDI won't catch that. See the sketch below.
Of course, you would need to watch out for other things in that scenario, like video overlays (get rid of them!).
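To make the levels issue concrete, here's a minimal sketch of the studio-to-computer RGB mapping people end up applying by hand (e.g. with a Levels filter). Purely illustrative Python; these helpers are mine, not any Vegas API:

# Minimal sketch of the 8-bit studio <-> computer RGB level mapping.
# Illustrative only -- these helpers are not part of Vegas.

def studio_to_computer(v):
    """Expand a studio-range code value (black 16, white 235) to full range 0-255."""
    out = round((v - 16) * 255.0 / (235 - 16))
    return max(0, min(255, out))

def computer_to_studio(v):
    """Compress a full-range code value (0-255) into studio range (16-235)."""
    return round(16 + v * (235 - 16) / 255.0)

assert studio_to_computer(16) == 0 and studio_to_computer(235) == 255
assert computer_to_studio(0) == 16 and computer_to_studio(255) == 235

Get the direction of that mapping wrong (or apply it twice) and your blacks and whites are off, which is exactly the kind of screwup a DVI monitoring chain will show you and an HD-SDI chain won't.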
2- Anybody seen the HP DreamColor displays?
I've heard some controversy over whether the highest-end model is true 10-bit... I don't know the answer there.
The cheaper DreamColor model (the $700-or-less one) is an 8-bit panel, and it might offer accurate-ish color for an LCD. Though, like nearly all LCDs, it can't do black, and black is a color where we notice color inaccuracy very easily.
According to remarks in a posting at the following link, claimed to be from an HP engineer working on this monitor model, it doesn't do proper color management for YUV or interlaced input; it just applies its full gamut.
[http://www.justechn.com/2008/06/10/hp-dreamcolor-lp2480zx-lcd-display]
as of 24 Nov 2008:
<<<
The DreamColor Engine is only enabled if the input is:
Progressive (not interlaced) AND
RGB (not YUV)
If the input signal is Interlaced OR YUV, then the DreamColor Engine is disabled and there is no color management (the colors are unmanaged, reverts to Full (Native) gamut and the color presets menu is grayed out).
With an SDI input, you must use an adapter that converts to progressive RGB at the output.
The only adapter we have found that does this is the Gefen EXTHDSDI2DVISP HDSDI to HDMI adapter.
>>>
The engineer provides what look (to naive me) like detailed instructions on how to set up the Gefen adapter's menu settings, etc.
Also at [http://www.hardforum.com/showthread.php?t=1366545], as of 23 Nov 08:
<<<
The DreamColor Engine is disabled for the analog inputs and interlaced modes over HDMI, which means you're stuck with the excessively wide gamut in those cases. There is a saturation control, but there's only so much you can do with that.
DVI and HDMI are always full range, regardless of the color space.
This monitor is a bust for anything other than 1080p and 720p full range RGB over DVI and HDMI. This monitor was obviously not designed for video.
>>>
The review at [http://www.hardforum.com/showthread.php?t=1366545] (24 Nov 2008) criticises both the presence of dithering in the DreamColor Engine and the quality of that dithering:
<<<
Not only is there dithering, the dithering is not that good. I can clearly see two different types of dithering being used: spatial dithering and temporal dithering using frame rate control. The dithering is most noticeable on dark colors, like on this page: http://www.lagom.nl/lcd-test/black.php
The firmware can be updated by the user, but only in Windows. Out of the box with the original firmware, the spatial dithering looked like a fixed pattern of colored noise. After updating to the latest firmware, the spatial dithering looked like simple 2x2 ordered dithering. With both firmware revisions, the temporal dithering looked like faint noise scrolling up or down, and some shades had rolling diagonal lines.
>>>
...the article continues...
<<<
I really hate to trash this monitor based on the dithering because other than that, it is a nice monitor, but a $3500 color-critical monitor should not have visible dithering, especially when the main selling point is 30-bit color with no dithering or frame rate control: "A full 30-bit pixel is sent from the DreamColor Engine to be displayed on the HP 30-bit LCD panel with no dithering or frame rate control." Source: http://h20202.www2.hp.com/Hpsub/down...AQ_June08a.pdf
The dithering is definitely coming from the DreamColor Engine. 1080i over HDMI disables the DreamColor Engine, and there's no dithering there. The dithering is present in all other modes on both DVI and HDMI. I did not have a way to test DisplayPort, but that shouldn't matter. A 24-bit source on a 30-bit monitor should not have dithering either.
>>>
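For reference, this is roughly what the "simple 2x2 ordered dithering" the reviewer mentions looks like in code. It's the generic textbook technique, sketched in Python; not HP's actual implementation:

# Generic 2x2 ordered (Bayer) dithering -- a textbook sketch, not HP's firmware.
# The threshold matrix decides, per pixel position, whether the fractional
# part of a quantized value rounds up or down, producing a fixed 2x2 texture.

BAYER_2X2 = [[0.0, 0.5],
             [0.75, 0.25]]

def dither(value, x, y, levels):
    """Quantize a 0.0-1.0 value to `levels` steps using the 2x2 threshold tile."""
    scaled = value * (levels - 1)
    base = int(scaled)
    frac = scaled - base
    bump = 1 if frac > BAYER_2X2[y % 2][x % 2] else 0
    return min(levels - 1, base + bump)

# A flat 40% gray quantized to 4 levels shows the repeating pattern:
for y in range(4):
    print([dither(0.4, x, y, 4) for x in range(4)])

That fixed repeating tile is why this kind of dithering reads as a visible texture on flat dark areas, as the reviewer describes.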
The following gives the impression that the dithering is in fact essential to the perception of greater-than-8-bit colour:
[http://archives.bengrosser.com/avid/2008-09/msg00539.htm] as of 24 Nov 2008:
<<<
The 24-inch Dreamcolor monitor is NOT a 10-bit display. The panel in question is manufactured by LG. It is an 8-bit panel with FRC.

FRC stands for "Frame Rate Control". It is an industry term that describes the use of a dithering algorithm that allows the simulation of greater bit depth. There are many implementations of such algorithms. A Google search for "FRC" or "LCD FRC" should yield plenty of information. The basic idea is that you take a cluster of pixels and quickly "flicker" them between two or more levels in order to fake the eye into seeing something in between.

FRC might be OK for consumer-land, but has no place (at least in my not-so-humble opinion) in professional imaging. The implications go from the most obvious (the colors you are seeing are the result of your imagination; they are not really on the screen) to some not so obvious: the FRC algorithm can "beat" or interact with motion elements, de-interlacing algorithms, etc. and produce artifacts that are not in the signal coming into the monitor. I would avoid anything with FRC or dithering as a reference/evaluation monitor. OK for preview, but not much else.

BTW, there are 8-bit panels out there (low cost) that are actually 6-bit panels with FRC. Same issues.

The information about the Dreamcolor panel was provided to us by the LG distributor when we inquired about being able to use the panel. It won't be available to OEMs for a few months, as HP has a contractual lock on it. However, the fact that it is an FRC panel takes it out of the running as far as I am concerned. I prefer real color.
>>>
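To illustrate the mechanism the post describes: here's a minimal sketch of how FRC lets an 8-bit panel fake a 10-bit value by flickering between the two nearest 8-bit levels across frames. Purely illustrative; real FRC uses far more elaborate spatio-temporal patterns, and that frame-to-frame flicker is exactly where the interactions with motion and deinterlacing the post warns about come from.

# Minimal FRC sketch: an 8-bit panel approximates a 10-bit target by showing
# the higher of the two nearest 8-bit levels on some fraction of frames.
# Illustrative only -- not any real panel's algorithm.

def frc_level(value_10bit, frame):
    """8-bit level to display on this frame for a 10-bit target value."""
    base = value_10bit >> 2         # nearest-below 8-bit level
    remainder = value_10bit & 0b11  # low 2 bits: frames out of every 4 to bump up
    return min(255, base + (1 if frame % 4 < remainder else 0))

# 10-bit 513 sits between 8-bit 128 and 129; the time-average is 128.25 = 513/4:
print([frc_level(513, f) for f in range(8)])
# -> [129, 128, 128, 128, 129, 128, 128, 128]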
Better reviews
I think you should take them with a grain of salt, especially if the reviewer doesn't have both monitors side by side. I don't think I've run across any review where the reviewer had both the JVC and the eCinema FX24 side by side. (There is a review on dv.com where the JVC is compared to the DCM23, which is an older-generation LCD; LCDs have improved very rapidly.) The FX24 is probably the better monitor (e.g. 10-bit panel instead of 8-bit, and I believe a slightly better viewing angle; I've not seen the two side by side).
And really, if the purchase is important to you, why not demo both units if you can?
2- I have a feeling that you would receive good support from eCinema. Martin Euredjian (the founder), while he is very opinionated, is an expert in LCD technology and an honest person.
<<<
...the LCD module in the LP2480zx monitor provides a 10 bits/color (30 bits/pixel) input, with true 10-bit drivers within the LCD itself.
...
Increasing the bit depth of the display drivers achieves this without the possibility of undesirable image artifacts which may result from temporal or spatial dithering as may sometimes be used with an LCD of lower inherent accuracy. (The LP2480zx’s “front-end” electronics are also, however, capable of providing temporal dithering, if needed, to increase the delivered accuracy beyond the 10 bits/color level.
By default, this is used only between the pre-LUT and the 3x3 matrix multiplier stage; temporal dithering is possible but normally disabled at the 30-bit connection between the post-LUT and the LCD module itself.)
>>>
So the exact nature of it is perhaps more interesting than the earlier posts suggest.
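To make the quoted signal path concrete, here's a rough sketch of the classic pre-LUT -> 3x3 matrix -> post-LUT display pipeline HP describes, with comments marking where the quote says temporal dithering is (and isn't) applied. The tables, values, and structure are illustrative placeholders, not HP's actual firmware:

# Sketch of the pipeline HP describes: pre-LUT -> 3x3 matrix -> post-LUT -> panel.
# Illustrative placeholders only -- not HP's firmware.

IDENTITY_LUT = list(range(1024))  # 10-bit in, 10-bit out
IDENTITY_MTX = [[1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0]]  # no gamut remap

def pipeline(rgb10, pre_lut=IDENTITY_LUT, mtx=IDENTITY_MTX, post_lut=IDENTITY_LUT):
    """Push one 10-bit-per-channel pixel through the three stages."""
    linear = [pre_lut[c] for c in rgb10]  # stage 1: per-channel pre-LUT
    # Stage 2: 3x3 gamut-remap matrix, then requantize back to 10 bits.
    # Per the quote, this requantization is where temporal dithering is
    # used by default to preserve precision.
    mixed = [max(0, min(1023, round(sum(mtx[i][j] * linear[j] for j in range(3)))))
             for i in range(3)]
    # Stage 3: per-channel post-LUT, sent to the panel at a true 10 bits
    # per channel (dithering possible here but normally disabled).
    return tuple(post_lut[c] for c in mixed)

print(pipeline((512, 256, 128)))  # -> (512, 256, 128) with identity tables

If that description is accurate, the dithering the hardforum reviewer saw would be coming from the intermediate requantization stage rather than the panel link itself, which would square with HP's "no dithering on the 30-bit connection" claim while still producing visible artifacts.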