Filmscanners mailing list archive (email@example.com)
Re: filmscanners: Pixels per inch vs DPI
Whatever works for each of us, I guess. I was trying to point out that
printer dots are not relevant to anything that I actually deal with: I don't
have to decide what dpi to set, allow for it, or even know what it is to get
'proper' results, apart from as a specification on the day I make my
purchase decision (and assuming that integer relationships are not important
with recent printers). I understand that a group of dots makes a pixel via
dithering etc., but my point is that it is not something that you need to,
or should, wrestle with when scanning and printing.
Samples per inch at scan time IMHO only confuses the issue: even if I do
oversample, the result is still a pixel, so ppi is still the correct
description. From that point on, through image processing and printing, it
is still a pixel; so for me, call it a pixel at scan time, call it a pixel
all the time.
The other point I was making has been made by many others: the only
important thing to *track* is the pixel dimensions of the image. Trying to
track ppi as you work from scanning (2700 ppi) to screen (96/72/100 ppi) to
printing (300 ppi) only makes things unnecessarily complex.
So I scan at 2700 ppi because that is my scanner's native resolution,
without worrying about any output parameters or sizes. I process in PS
without thinking about ppi. When I come to print, I resample in PS using
the Image Size box and set an image dimension to suit. Of course I have to
check that the resulting ppi is a sensible one, but apart from that I don't
think it serves any purpose to even think about ppi at other times. I
believe most people actually do more or less the same, but lots of complex
suggestions pop out when people try to help others on the dreaded dpi/ppi
subject, and I don't find them useful myself.
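The "track pixels, derive ppi at print time" arithmetic above can be
sketched in a few lines. This is a hypothetical illustration, not anyone's
actual workflow; the function name and the example figures (a 35mm frame of
roughly 1.4 inches scanned at 2700 ppi, printed 10 inches wide) are mine.

```python
# Hypothetical sketch: track only the pixel dimensions of the scan, and
# derive the effective ppi only at the moment you choose a print size.

def print_ppi(pixels_wide, print_width_inches):
    """Effective pixels per inch when a fixed-pixel image is printed at a given width."""
    return pixels_wide / print_width_inches

# A 35mm frame (~1.4 in across) scanned at 2700 ppi gives about 3780 pixels.
scan_width_px = round(2700 * 1.4)       # 3780 pixels - the only number worth tracking

# Printed 10 inches wide, the effective resolution works out to:
ppi = print_ppi(scan_width_px, 10)      # 378 ppi - check it is sensible, then print
print(ppi)
```

The point the sketch makes is that ppi is not a property you carry through
editing; it falls out of pixel dimensions and print size at the end.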
At 11:05 26/10/01, you wrote:
>I like Maris' terms.
>Differentiation is important at least because a 1440 dpi printer doesn't
>print 1440 pixels per inch. It prints dots per inch and a mosaic of dots is
>required to render an image pixel.
>With scanners, saying samples per inch tends to suggest samples within the
>optical resolution of the scanner, although 'over sampling' is a term known
>in the science of digital signal processing that relates to creating
>artificial samples using interpolation of actual samples.
>Raster displays have always been described in terms of pixels, as have
>raster imaging applications, such as Photoshop.
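The quoted distinction between printer dots and image pixels can be put in
numbers. A minimal sketch, assuming the common case where the printer's dpi
is an integer multiple of the image's ppi (e.g. 1440 dpi against 360 ppi);
the function name is mine, not an established term.

```python
# Hypothetical illustration of the quoted point: a "1440 dpi" printer lays
# down ink dots, not pixels, and a mosaic of dots renders each image pixel.

def dots_per_pixel_axis(printer_dpi, image_ppi):
    """Ink dots available per image pixel along one axis, when dpi divides evenly by ppi."""
    return printer_dpi // image_ppi

per_axis = dots_per_pixel_axis(1440, 360)   # 4 dots per pixel along each axis
mosaic = per_axis ** 2                      # a 4x4 mosaic of 16 dots renders one pixel
print(per_axis, mosaic)
```

So a 1440 dpi printer fed a 360 ppi image has a 16-dot mosaic available to
dither each pixel's tone, which is why dpi and ppi are not interchangeable.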