Filmscanners mailing list archive (filmscanners@halftone.co.uk)


Re: filmscanners: Pixels per inch vs DPI



Everyone has their own points of confusion and moments of comparative 
clarity, but this is one discussion about which I have never understood the 
confusion.

I use pixels for everything. Everything that is relevant to me, I 
mean.  The pixels I get out of the scanner become the same number of 
pixels when I work in PS, are the same number of pixels on screen, and 
(unless I resample) will be the same number of pixels when I print.  The 
pixels per inch is only of interest at those moments when I want to 
transfer from my digital image to a physically sized image or vice versa, and 
its calculation is straightforward.
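
To spell out just how straightforward, here it is as a couple of lines of 
Python (a rough sketch of the arithmetic only - the function names are mine, 
not anything from PS or any scanner software):

    # transfer from digital to physical: how big does it print at a given ppi?
    def print_inches(pixels, ppi):
        return pixels / ppi

    # or the other way round: what ppi do I get at a given print size?
    def print_ppi(pixels, inches):
        return pixels / inches

    print(print_inches(2700, 300))   # 2700 px at 300 ppi -> 9.0 inches
    print(print_ppi(2700, 9))        # 2700 px over 9 inches -> 300.0 ppi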

It seems that thinking in pixels rather than in ppi is much more 
efficient.  I have seen people totally tied in knots trying to fathom how to 
print their 36x24mm 2700ppi image onto 7"x5" paper at 300ppi, but thinking 
of it as 3800x2500 pixels makes the whole thing straightforward (worked 
through below).  The tagging of images with ppi figures in PS and other 
software is an unnecessary confusion - I think ppi should never be mentioned 
unless the context at that time is one of transfer to a specific physically 
sized medium.  Even then the ppi should only be mentioned with a kind of 
flashing red-arrow link to the image size that is implied by that ppi.
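
Worked through as a sketch in the same vein (the only outside fact is 
25.4mm to the inch; the variable names are mine):

    MM_PER_INCH = 25.4

    # pixels delivered by scanning a 36x24mm frame at 2700 ppi
    w = round(36 / MM_PER_INCH * 2700)   # 3827 - the "3800" above
    h = round(24 / MM_PER_INCH * 2700)   # 2551 - the "2500" above

    # ppi those pixels actually give on a 7"x5" print
    print(w / 7, h / 5)                  # ~547 and ~510 - comfortably above 300

    # and the pixels a 7"x5" print at 300 ppi really needs
    print(7 * 300, 5 * 300)              # 2100 x 1500

So the scan has more pixels than the print needs; at worst you resample 
down, and the ppi figure never has to enter into it until the print size is 
chosen.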

The fact that the printer happens to separate colours and dither and 
re-present the image as a greater number of 4- or 6-colour dots is of no 
significance to me, so I ignore it.  I suppose it would be different if I 
needed to understand the printing process, but even then the concept of 
printer dots does not seem confusing, because it is such a different thing 
from the pixels that the image is stored as.  1440 dpi is an internal 
printer spec that has no relevance to me other than to define - once - the 
likely resolution performance of the printer.  It is not something I have 
to work with or calculate with, so I ignore it.
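
(If anyone does want to relate the two numbers, the arithmetic is again 
trivial - a sketch, assuming a 1440 dpi printer fed a 300 ppi image:)

    # printer dots laid down per image pixel, along one axis
    print(1440 / 300)   # 4.8 dots per pixel, i.e. roughly 23 dots per pixel of area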

And I don't understand the advantage in differentiating between scanner 
pixels and screen pixels or any other pixels - it just makes things more complex.

Julian

At 15:37 23/10/01, you wrote:
>I use these terms:
>
>Scanner - spi - (scan) samples per inch
>
>Monitor - ppi - pixels per inch
>
>Printer - dpi - dots (of ink) per inch
>
>I think this came from Dan Margulis's "Professional Photoshop"
>
>Maris
>
>----- Original Message -----
>From: "Rob Geraghty" <harper@wordweb.com>
>To: <filmscanners@halftone.co.uk>
>Sent: Monday, October 22, 2001 8:45 PM
>Subject: filmscanners: Pixels per inch vs DPI




 



