Filmscanners mailing list archive (filmscanners@halftone.co.uk)


RE: filmscanners: Sharpening scanned images for printing



Harvey,

> What I was trying to say was that a scan of a negative (let's say
> B&W) *is* a scan of its grain.  If the
> scanner can't get the grain sharply rendered then it can't make a
> sharp scan.

You can get sharp scans and NOT scan "down to" the film grain.  In fact,
most scanners do not resolve anywhere near the grain (or dye clouds).
Sharpness is also a matter of enlargement.  What appears unsharp on the
monitor at a billion times magnification may print, at a particular
magnification, as sharp as it can possibly get, and even if the original
scan were sharper, it would not make for a sharper print.
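
To put rough numbers on the enlargement point, here is a quick Python
sketch (the 300 ppi figure is an assumed rule-of-thumb limit for the eye at
normal viewing distance, and the scan resolutions are just example values):

    # Rough check: how much of a scanner's resolution survives on the print?
    EYE_LIMIT_PPI = 300  # assumed eye limit at normal viewing distance

    def effective_print_ppi(scan_ppi: float, enlargement: float) -> float:
        """Pixels per inch actually delivered on the paper."""
        return scan_ppi / enlargement

    for scan_ppi in (2700, 4000, 8000):
        ppi = effective_print_ppi(scan_ppi, enlargement=8)  # ~35mm to 8x10"
        beyond = ppi > EYE_LIMIT_PPI
        print(f"{scan_ppi} ppi scan at 8x: {ppi:.0f} ppi on paper "
              f"({'more than the eye resolves' if beyond else 'limits the print'})")

Past an enlargement-dependent point, extra scan resolution simply never
reaches the viewer's eye.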

> I don't care if you have the world's best tripod and the world's sharpest
> film.  If the scanner can't render the *film* sharply, it can't make a
> sharp scan.

Of course if the scanner can't FOCUS ON the film as well as it can resolve,
it won't make as sharp a scan as it could, but that's entirely different
from being able to resolve to the film grain.  And, as I said above,
sharpness is a function of enlargement too.

> The Holga was used as an example of where one wants to look at the film
> itself and not necessarily at the image on that film.  Obviously, it went
> over Austin's head, or he ignored the concept.

No, Harvey, it wasn't over my head, and I did not ignore "the concept".  I
said a number of times that I thought it wasn't relevant (to the general
case and to the issue being discussed).

> Actually I was referring to drum scans (which tend to be inherently
> sharper than CCD scans), so the CCD red herring is of no concern to me.

For color that is true, since there is no smear with a PMT scanner (it only
scans one pixel at a time, not a line at a time like a CCD does), but that
is not necessarily true for B&W, if the CCD scanner scans only using a
single ND filter.

> > To Harvey, who wrote:
> >
> > >> Then why do (real) hi bit scans require less sharpening than low
> > >> bit scans?
> >
> > Harvey, is it possible that by and large (certainly more so in the past
> > than today) the higher bit scanners have been the higher quality
> > scanners?  I mean highbit used to come at a steep price, and from
> > quality components.  Still does for "real" bit depth as you put it, by
> > which I think you mean extended dynamic range.
>
> The scanners I was referring to are the very top end drum scanners (in
> the $100,000 range).  The true 48 bit scanners vs. the true 36 bit are
> supposed to need less sharpening.  Pure and simple.  Go to NancyScans and
> talk with them.  Yes, the 48 bit scanners are newer, but I think it's
> more of a software change than a hardware breakthrough.

"Supposed to" can be for many different reasons.  I have designed quite a
few digital imaging systems.  It makes no sense that higher bit depth would
require less sharpening for only the reason of higher bit depth.  It sounds
to me that NancyScans is probably right, THEIR particular 48 bit scanner
requires less sharpening as THEIR 36 bit scanner, but that does NOT mean it
is because of the bit depth.  A lot of PMT scanners have sharpening inherent
in the hardware.
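
Here is a small synthetic demonstration of that point in Python (numpy
only, all numbers made up): the same soft edge is quantized at 8, 12, and
16 bits, and its 10-90% transition width, a common stand-in for sharpness,
comes out the same every time.

    # Quantizing one edge at several bit depths: tonal resolution changes,
    # spatial sharpness does not.
    import numpy as np

    x = np.linspace(-1, 1, 2001)
    edge = 1 / (1 + np.exp(-x / 0.05))  # a soft edge ramping from 0 to 1

    def transition_width(signal):
        """Distance between the 10% and 90% crossings of a rising edge."""
        i10 = np.argmax(signal >= 0.1)
        i90 = np.argmax(signal >= 0.9)
        return x[i90] - x[i10]

    for bits in (8, 12, 16):
        levels = 2 ** bits - 1
        quantized = np.round(edge * levels) / levels
        print(f"{bits:2d}-bit edge width: {transition_width(quantized):.4f}")
    # All three widths print identically: more bits buy finer tonal steps,
    # not a steeper edge.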

> my statement (on the conceptual level)
> still stands:
> Higher bit depth scans need less inherent sharpening than lower
> bit depth scans

It's technically and conceptually wrong, and I've given very simple examples
that show it is wrong.  A larger tonal range does not give you a sharper
image.  Make a print of a solid black box occupying 50% or so of the center
of a white piece of paper.  Then make another print with a gradient going
from 0-255, extending from the center of the paper out the same distance as
the box...tape them both to a wall next to each other, stand back some
40'...and tell me which one looks sharper.
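
If you'd rather mock the two test prints up digitally, here is a quick
sketch using numpy and Pillow (the canvas size and file names are arbitrary
choices):

    # Two test images: a hard-edged black box vs. a smooth radial gradient.
    import numpy as np
    from PIL import Image

    SIZE = 1000
    page = np.full((SIZE, SIZE), 255, dtype=np.uint8)  # white "paper"

    # Print 1: solid black box centered on the page.
    box = page.copy()
    q = SIZE // 4
    box[q:SIZE - q, q:SIZE - q] = 0

    # Print 2: radial gradient, black at the center ramping to white at the
    # same distance from the center as the box edge.
    yy, xx = np.mgrid[0:SIZE, 0:SIZE]
    r = np.hypot(yy - SIZE / 2, xx - SIZE / 2)
    grad = (np.clip(r / (SIZE / 4), 0, 1) * 255).astype(np.uint8)

    Image.fromarray(box).save("black_box.png")
    Image.fromarray(grad).save("gradient.png")

Both images use the full 0-255 tonal range, yet from across the room the
hard-edged box reads as "sharp" and the gradient does not.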

> and the sharpness of a scan
> has *nothing* to do with the sharpness of the original image.

No, but as I said, that's not the issue.  The issue is what one sees on the
final output, and whether or not it is sharp.  You agreed that if your
original image was fuzzy, then your scan would *appear* fuzzy (which was my
original point, and one you now agree with).  Yes, you may be in exceptional
focus on the film, but how on earth do you tell the difference on the final
print (if you are not resolving to grain, which most people don't), and what
difference does it make (again, unless you are resolving to film grain)?
Your resultant print WILL be fuzzy either way.

> > > Most people don't sharpen grain, they sharpen the image.
>
> And that image was originally made up of the aforementioned grain.

Of course film is made up of grain, at least B&W film is.  I don't
understand the relevance...unless you are resolving to the film grain, which
most people don't, except on grainy film.

I understand exactly what you are saying, but I disagree with its relevance,
except in the circumstance (the Holga/artsy-fartsy example) you mention...and
that circumstance does not describe most people who scan and get fuzzy
scans, nor, as I said, do I believe it is the reason they get fuzzy scans
and believe they have to sharpen.

I have pointed out to a number of people who thought they were having
scanner problems that it was not their scanner, but their
lens/development/technique etc., and in every case, when they fixed the
problem BEFORE the scan (new lens, more care when developing, etc.), their
scans improved drastically.

Austin




 



