Filmscanners mailing list archive (email@example.com)
[filmscanners] RE: Digital Darkroom Computer Builders?
Sorry it has taken so long to respond, but I wanted to do some
investigation of the product before replying.
>> does it mean that it can do 3 displays using independent color management
>Yes - and yes.
My research says that the first yes is correct, but I could not find any
verification for the second yes. Nowhere that I could see on the Matrox web
site does it say that each of the three monitors is independently color
manageable from the card. It says that you can set the color depth and
grayscale gamma for the three heads and produce the same accurate color
across the three monitors (implied but not stated is that one is using the
same size, make, model, and age of monitor for all three displays); however,
it does not say that they are independently adjustable or profilable, so that
one could color manage monitors of different sizes, makes, models, and ages
on the different heads in a way that matches them to each other as well as to
other components in the system. The implication is that you can choose
settings establishing one characterization that controls all the heads and
the attached displays the same, even if the monitors are different.
I did see on their web page something about a software bundle that includes
software enabling color management of each monitor separately (or at least
that is how I read it), but it is unclear (a) whether this bundle is included
with the card or is an optional purchase, (b) whether the software in the
bundle is dedicated to their three-head card or will also work on their
dual-head cards, and (c) how it differs from other monitor calibration
packages like the Spyder and its software.
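As an illustration of what "independent" would mean in practice: per-head correction is conceptually just a separate lookup table for each attached display. Below is a minimal Python/NumPy sketch; the per-head gamma values are hypothetical stand-ins for what a profiling package like the Spyder would actually measure, not anything Matrox documents.

```python
import numpy as np

def build_gamma_lut(native_gamma, target_gamma=2.2, size=256):
    """Build an 8-bit LUT remapping a display's native gamma to a target.

    Hypothetical illustration only - real profiling measures the display's
    response rather than assuming a pure power curve.
    """
    x = np.linspace(0.0, 1.0, size)
    corrected = x ** (target_gamma / native_gamma)
    return np.round(corrected * (size - 1)).astype(np.uint8)

# Three heads with different (made-up) native gammas each get their own LUT;
# "independent color management" means each head keeps its own table.
luts = {name: build_gamma_lut(g) for name, g in
        [("head0", 2.5), ("head1", 2.2), ("head2", 1.8)]}

def correct(pixels, head):
    """Apply the per-head LUT to an array of 8-bit pixel values."""
    return luts[head][pixels]
```

A head whose native gamma already matches the target gets an identity table, while the other two are pulled toward the common response - which is exactly what a single shared characterization across mismatched monitors cannot do.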
>Funny, but I remember that same point being put about dualhead.
>I have a desktop with 3 displays, 2 on a G400, and one on another
First, I do not think you are the norm - especially for those working with
still photographs and graphics. Second, the reasons why I see it as
remaining rare, in contrast to what was thought about dual monitors, are
the following:
1. The practical reason that three sizable displays (even flat-screen LCDs)
take up more desk real estate than most people can afford - dual large
displays typically press the limits of the desktop space available to most
photoimaging and graphic arts workers. A triple display with one 21" and
two 17" or smaller monitors does not make a lot of sense to most people,
from both a cost-effectiveness and a practical-use point of view, as
compared to two monitors of similar size (one 21" and one 19", or two 21"
displays).
2. With the advent of widescreen monitors like the Apple Cinema or
Formac Gallery LCD displays, the same amount of screen space may be acquired
for about the same price as, or even slightly less than, three high-quality
large-screen CRTs plus the Matrox card you mention. In fact, if one has the
floor or desk space, one might even be able to get two of these widescreens
for the price of a high-quality large-screen triple-monitor setup -
especially as time goes on and the prices of these widescreens come down.
(I will admit that the same might be said about the costs coming down and
the quality going up of 21" LCD flat-screen displays in the future, which
might make the three-monitor setup more attractive at that time, but not
now.)
>More importantly, Matrox went to great lengths to make sure the
>DACs were well filtered for colour work.
Funny, they said the same thing about their dual-head cards when those were
introduced. I would say that any improvements they may have made in that
area for the new card, they probably also made to their older dual-head
cards still in production and on the market.
This feature - DACs with filtering - did not seem to me to make any real
difference on the dual-head cards, as far as I can see, with respect to
color management within a monitor display or across monitors. Even if it
did make a difference, color management of the display from the card would
be limited in its effect by the other elements in the system through which
the data has to travel and be transmitted - i.e., the monitors' hardware
limitations, the OS's limitations, the application's limitations, and the
hardware and other limitations of input devices like scanners and digital
cameras and of output devices like inkjet printers, laser printers, dye-sub
printers, offset presses, and film recorders. The system can only be
optimized to its weakest link, not its strongest link.
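The weakest-link point is easy to make concrete: push a smooth gradient through a chain of quantization stages and count how many distinct tonal levels survive. A small sketch, where the stage bit depths are purely illustrative rather than the specs of any actual scanner, card, or printer:

```python
import numpy as np

def quantize(signal, bits):
    """Quantize a [0, 1] signal to 2**bits levels, staying in [0, 1]."""
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

# A smooth gradient pushed through, say, scanner -> card DAC -> printer path.
gradient = np.linspace(0.0, 1.0, 4096)
chain_depths = [12, 10, 8]   # made-up depths for the three stages

signal = gradient
for bits in chain_depths:
    signal = quantize(signal, bits)

# However precise the upstream stages, the 8-bit stage caps the chain
# at 2**8 = 256 distinguishable levels.
surviving_levels = len(np.unique(signal))
```

Reordering the stages changes nothing: the lowest-precision stage always sets the ceiling.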
My knowledge of DACs and their workings and uses is limited, I admit. So if
you know something about them and how they work with respect to still
imaging and graphics that I do not know or have gotten wrong, please by all
means enlighten me. I would think that DACs would be most appropriate and
useful in systems that do not support or utilize color management or CM
engines at either the system level or the application level.
>Uhu. But as pointed out, it's a better card with features dedicated
>to image work! 2 crappy cards ain't better than one good one if it's
>quality that's required.
Obviously, we have some different evaluations of the quality of cards. I am
not talking about cheap, bargain-value video cards but about cards that were
considered top quality for their class and cutting edge when they were
introduced, like Matrox's own Millennium G450 and G550. I would hardly call
them crappy cards. Whether the new card is in fact and in use a better card
is still open to question and will have to wait until a bunch have been out
in the field a while and tested under real working conditions by graphics
workers; and that they have features dedicated to image work leaves open the
question of what type of image work we are talking about. Animation and
video imaging is different from still imaging and may require different
features; the same can be said of fine arts photography or commercial
photography versus medical or scientific imaging. Even graphic arts
illustration may require different dedicated features than any of the other
types of imaging.
>Well, don't doubt it. Adobe has provided a plugin, and
>will be pushing this more because chipsets from ATI and Nvidia
>are going to have 10+ bit depth per channel also. It's just the next
>step! Like when we all mucked about with 256 colour, then 15-bit,
>then 24-bit ...
All I can say is that only time will tell. I am tired of being on the
bleeding edge, caught up in getting what is touted as the next step only to
find it turn into vaporware and a science-fiction dead end in which the
buyers were used free of cost by developers as test subjects - at the
expense of the buyer of the "next step".
>Yes, currently, and for functions where it's very useful in correcting
>an image before close editing. Being able to actually see these colours
>on the monitor, rather than an 8-bit representation, makes the process
>even better. Less guessing. And PShop 8 may introduce more features for
>higher depths later. Again, the card will support this better than
But it really does not make a whole hell of a lot of difference if the file
has to be converted back down to 8 bits per channel because that is what
the printer will support. In such a case, is WYSIWYG really WYSIWYG in
anything but name only? The only time what you say has import may be with
imaging work dedicated to monitor display - like video, animation, and
games - or if inkjet printers and other output devices start supporting
more than 8 bits per channel; but that is in the future and another "next
step" dream for now.
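For what it is worth, the size of the claimed difference is at least measurable: apply the same strong tone curve once while confined to 8 bits, and once at 16 bits followed by a final reduction to 8 bits, then count the distinct output levels that survive. A Python/NumPy sketch, where the gamma-0.7 curve is an arbitrary example edit rather than any particular Photoshop operation:

```python
import numpy as np

def edit_at_depth(values, bits, gamma=0.7):
    """Quantize a [0, 1] ramp to the given bit depth, apply a tone curve,
    and re-quantize - i.e. perform the 'edit' while stuck at that depth."""
    levels = 2 ** bits - 1
    q = np.round(values * levels) / levels
    return np.round((q ** gamma) * levels) / levels

ramp = np.linspace(0.0, 1.0, 65536)

# Edit entirely in 8-bit, versus edit in 16-bit and only then drop to 8-bit.
edited_8 = edit_at_depth(ramp, 8)
edited_16_then_8 = np.round(edit_at_depth(ramp, 16) * 255) / 255

# The 8-bit route loses output levels to rounding collisions; the 16-bit
# route preserves noticeably more distinct 8-bit output levels.
levels_8 = len(np.unique(edited_8))
levels_16_then_8 = len(np.unique(edited_16_then_8))
```

Whether that numeric difference is actually visible in an 8-bit print is, of course, the separate question raised above.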
>We can say the same of our 8-bit (24-bit total) cards at the moment.
>When scanning the image, we want ALL the detail, to facilitate editing
>and correcting. Once we've done this we can factor it down far better
>to the target colour space
Yes, that is the current hope and orthodoxy; but we really do not know if
it is true and works - or, if it does work, whether the differences are
really significant and under what conditions. Sometimes I think we go
overboard and get caught up in trivial details that only the gods know and
see, while we take them as articles of faith. It is sort of like the
high-fidelity systems of old whose specs got so good they could capture
sounds that only a few dogs could hear and most people could not. But
people bought them just for the prestige of having the latest and greatest
equipment and specs. I do have to admit that I have also fallen into that
practice in the past and am not exempt from it.
>As an aside, I wonder how many of us here have chosen our graphics
>card for its colour rendition, and where do we find info on this.
Most of us (professional and advanced amateur image workers) probably do
the best research we can on all the components that go into our systems,
both for the initial purchase and for each upgrade - unfortunately, we are
all too often dependent on manufacturers' hype and advertising, trade
magazine reviews by reviewers of dubious credentials, or gossip on mailing
lists and forums. However, we are all pragmatic and use the older versions
as backups or as system extensions. Old monitors become the second or third
monitor in a multi-monitor system or the primary monitor in a second or
third computer system; the same may very well be true of graphics cards,
and it is true of such components as RAM, hard drives, CD drives, and even
sound cards or modems.
[mailto:firstname.lastname@example.org] On Behalf Of Robert Logan
Sent: Monday, October 21, 2002 12:14 AM
Subject: [filmscanners] Re: Digital Darkroom Computer Builders?
Laurie Solomon wrote:
>>RE DAC quality and can do 3 displays (excellent for Photoshop work).
> I do not have the foggiest idea what this means. Does "can do 3 displays"
> mean that it is a three-head card versus a two-head card, or does it mean
> that it can do 3 displays using independent color management for each?
Yes - and yes.
> I think the use of three monitors is rare for most;
Funny, but I remember that same point being put about dualhead.
I have a desktop with 3 displays, 2 on a G400, and one on another
PC. When I can drum up the money, I will go 3-head precisely
because of the benefits more real estate provides. Coming from
a Unix background, this is quite normal. And like I
said, the heads are independently colour manageable. More
importantly, Matrox went to great lengths to make sure the
DACs were well filtered for colour work.
> but more importantly if it
> [chop real estate/colour points] A three-display head on a single card with
> independent color management for the displays may be nice and even
> efficient; but it can be accomplished by adding a second card, which would
> allow for 4 displays and the possibility of maybe even two of them
> having color management.
Uhu. But as pointed out, it's a better card with features dedicated
to image work! 2 crappy cards ain't better than one good one if it's
quality that's required.
>>it can do 'gigacolor' - 10 bits per RGB channel
>>at the DAC. Again, Photoshop supports this. A big improvement
>>for working on digital images.
> I doubt if Photoshop can really support completely 10 bits per RGB channel
> with respect to all its functions. Photoshop only supports high bit
> with regard to certain functions.
Well, don't doubt it. Adobe has provided a plugin, and
will be pushing this more because chipsets from ATI and Nvidia
are going to have 10+ bit depth per channel also. It's just the next
step! Like when we all mucked about with 256 colour, then 15-bit,
then 24-bit ...
Yes, currently, and for functions where it's very useful in correcting
an image before close editing. Being able to actually see these colours
on the monitor, rather than an 8-bit representation, makes the process
even better. Less guessing. And PShop 8 may introduce more features for
higher depths later. Again, the card will support this better than
> Moreover, any high-bit file will need to
> be or will be converted to 8 bits per RGB channel if it is to be printed on
> most laser or inkjet printers, so with respect to WYSIWYG a 10-bit per
> channel video card is not of much benefit when it comes to seeing what is
> to be printed. However, I could be missing the point that is being made,
We can say the same of our 8-bit (24-bit total) cards at the moment.
When scanning the image, we want ALL the detail, to facilitate editing
and correcting. Once we've done this we can factor it down far better
to the target colour space. Of course, some people here might be
content in scanning straight into their printer's colour space ...
As an aside, I wonder how many of us here have chosen our graphics
card for its colour rendition, and where do we find info on this.
I sit with 3 identical monitors in front of me, and 2 are driven
by my G400, the other by an old ATI card. Even at 24 bit, and well
calibrated - the difference is stark.
Unsubscribe by mail to email@example.com, with 'unsubscribe filmscanners'
or 'unsubscribe filmscanners_digest' (as appropriate) in the message title