Filmscanners mailing list archive (filmscanners@halftone.co.uk)


[filmscanners] Re: Digital Darkroom Computer Builders?



Andras writes:

> As I said: 2^64 = 18446744073709551616. That's
> how much RAM can be directly addressed using
> 64-bit address registers.

You incorrectly assume that every byte allocated will be used--another error
that is all too common among engineers.

It is not unusual for software to allocate memory far in excess of what will
actually be used.  For example, a program might allocate 16-megabyte chunks
for tables that rarely ever have more than a thousand entries, just to be on
the safe side.  With virtual memory systems, this is fine and dandy ...
until you run out of address space, and that's when the problems begin.
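The allocate-big-use-little pattern can be sketched in a few lines of Python
(a hypothetical illustration, not anyone's actual code; the 16-megabyte
chunk and thousand entries are the figures from the example above):

```python
import mmap

# Reserve a 16 MB anonymous mapping "just to be safe" -- on a
# virtual-memory system, physical pages are generally committed
# only when they are actually touched.
table = mmap.mmap(-1, 16 * 1024 * 1024)

# Actually store only a thousand 16-byte entries: 16 KB of the 16 MB.
ENTRY_SIZE = 16
for i in range(1000):
    table[i * ENTRY_SIZE:(i + 1) * ENTRY_SIZE] = i.to_bytes(ENTRY_SIZE, "big")

used = 1000 * ENTRY_SIZE
# Fraction of the reserved address space ever used: about 0.001
print(used / len(table))
```

Physical memory mostly escapes unharmed, but the full 16 MB of address
space is gone either way, which is exactly the problem once address space
is the scarce resource.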

Just look at IPv4:  If IP addresses were assigned sequentially to each
computer on the Internet, 32-bit addressing would provide plenty to go
around.  But addresses are always assigned in blocks, and blocks are
almost never completely used, so huge expanses of the address space are
never used, and the address space is exhausted long before 4 billion
computers actually are connected to the Net.  The original designers of IP
were egregiously wasteful in this regard, assigning blocks of 16 million
addresses to companies that probably never used more than a few thousand
(the old Class A addresses).  Rethinking the IPv4 allocation has ameliorated
the situation a bit, but it still suffers from inefficient allocation.
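The scale of that Class A waste is easy to check with Python's standard
`ipaddress` module (the "few thousand" figure is the assumption from the
text, not data about any particular company):

```python
import ipaddress

# An old Class A block is a /8 network: 2^24 addresses.
class_a = ipaddress.ip_network("10.0.0.0/8")
print(class_a.num_addresses)  # 16777216

# Suppose the holder ever uses a few thousand of them.
used = 4000
unused_fraction = 1 - used / class_a.num_addresses
print(unused_fraction)  # more than 99.97% of the block unused
```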

And, unfortunately, IPv6 suffers from the same defect, only worse.
Engineers just never quite understand that "forever" is a really long
time--long enough to exhaust any amount of capacity, especially when
capacity is wasted.

The year-2000 problem was another example of this.  People designed systems
that were never intended to work for more than a few years, figuring that
"nobody will ever need this for more than ten years."  As the year 2000
actually approached, COBOL applications written in 1963 turned out to still
be in daily use, and people had to scramble to keep dating routines from
failing at the turn of the century.

I make a point of watching for this error in everything I design and code.
The last dating routine I wrote was designed to work without error for the
next 10,000,000 years or so (the most I could fit into 32 bits).  I worried
about the year 2000 thirty years ago, and I've worried about exhausting even
64-bit address spaces while everyone else was still wasting space in 16
bits.  But my stuff doesn't break after ten years of use.
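The 10,000,000-year figure is plausible if the routine counts days rather
than seconds in an unsigned 32-bit field -- a back-of-the-envelope sketch,
not the author's actual code:

```python
# An unsigned 32-bit day counter (one plausible design) spans:
days = 2 ** 32
years = days / 365.2425      # mean Gregorian year length
print(f"{years:,.0f} years")  # roughly 11.8 million years
```

A 32-bit count of *seconds*, by contrast, wraps after about 136 years --
which is why so many Unix-style timestamps have their own year-2038 version
of the same mistake.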

----------------------------------------------------------------------------------------
Unsubscribe by mail to listserver@halftone.co.uk, with 'unsubscribe
filmscanners' or 'unsubscribe filmscanners_digest' (as appropriate)
in the message title or body



 



