
Re: resolutions, etc..



Hi Jude,

First off, I'd like to point out that resolution does not mean exactly the
same thing in film, analog video, and digital imaging.

"Lines of resolution" means that if I were to draw a number of equally spaced
parallel lines on a piece of paper, how many would be able to fit (per inch
or per millimeter) before they started blurring together?  This is the basic
explanation; an actual resolution test would be complicated by such factors
as proper exposure and lenses, which must be taken into account before
meaningful measurements on a piece of film or a video imager could be
obtained.
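
To make the units concrete, here is a quick back-of-the-envelope sketch in
Python (the 0.01 mm spacing is a made-up figure, purely for illustration):

    MM_PER_INCH = 25.4

    def lines_per_inch(lines_per_mm):
        """Convert a resolution figure from lines/mm to lines/inch."""
        return lines_per_mm * MM_PER_INCH

    # Suppose the finest spacing at which adjacent lines stay distinct
    # is 0.01 mm; the corresponding resolution is 1/0.01 = 100 lines/mm.
    finest_spacing_mm = 0.01
    resolution_lpmm = 1 / finest_spacing_mm
    print(resolution_lpmm, "lines/mm =",
          lines_per_inch(resolution_lpmm), "lines/inch")
    # -> 100.0 lines/mm = 2540.0 lines/inch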

In film, resolution is limited largely by the size of the grain.  Coarse-
grained film stocks have lower resolution than fine-grained ones.  Since
the grain pattern is random in most types of film, the resolution is usually
the same whether you measure it horizontally or vertically.

In analog video, two resolutions are typically mentioned.  Horizontal
resolution is usually the larger of the two numbers, since it is limited
primarily by the television standard in use and by the bandwidth of the video
electronics driving the tube.  Vertical resolution cannot be greater than
the number of active scan lines in one field (in interlaced systems), and may
be lower in cheap equipment.
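
To put rough numbers on the horizontal side, here is a small Python sketch of
the textbook bandwidth-to-resolution relationship; the NTSC figures below are
commonly quoted approximations, not exact standard values:

    bandwidth_hz    = 4.2e6     # approximate NTSC luminance bandwidth
    active_line_sec = 52.7e-6   # active (visible) portion of one scan line
    aspect_ratio    = 4 / 3     # picture width divided by picture height

    # Each cycle of video signal can represent two "lines" (one light, one
    # dark); dividing by the aspect ratio normalizes to picture height.
    tvl = 2 * bandwidth_hz * active_line_sec / aspect_ratio
    print(f"about {tvl:.0f} TV lines of horizontal resolution")  # about 332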

In the world of digital images, pictures are made of discrete pixels (picture
elements), and every one of them has a unique address.  The resolution is
fixed by the amount and speed of the memory "behind the screen," and by the
speed of the processing that writes and reads it.  To increase resolution in a
digital imaging system, the number of pixels has to increase, because the
maximum number of lines of resolution that can be displayed--in theory--is
half the number of pixels available horizontally and vertically (every other
pixel).  The horizontal and vertical numbers are not equal, however, because
most displays are wider than they are tall, and it is not worth wasting memory
and processing time on pixels that aren't even on the screen.
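
A minimal sketch of that "half the pixels" limit--the best case is one line
per two pixels, alternating light and dark (the raster size below is just an
example figure):

    def max_lines(pixels):
        """Theoretical maximum number of resolvable lines along one axis:
        one line per two pixels (alternating light and dark)."""
        return pixels // 2

    width, height = 640, 480    # an example display raster
    print("horizontal:", max_lines(width), "lines")    # 320
    print("vertical:  ", max_lines(height), "lines")   # 240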

<< 1. what is the minimum required resolution when scanning 35mm film
for digital effects? (or is there an agreed resolution anyway?) >>

This depends on the effects you are doing and how much money you want to
spend.  2K by 2K scans appear to be fine for most purposes where the images
will ultimately be used for video, print, or computer graphics; 4K by 4K is
often used if the material will be shot back out to film.  There are de facto
standards based on the fact that most current hardware and software works
with particular data file formats, but it is hardly likely that these will
remain static as new systems emerge.
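
For a feel of the data volumes involved, here is a quick comparison in
Python; I'm taking "2K by 2K" and "4K by 4K" literally here, though actual
scanner rasters vary by machine:

    def pixel_count(side):
        """Pixels in a square raster of the given linear size."""
        return side * side

    for side in (2048, 4096):
        print(f"{side} x {side}: {pixel_count(side):,} pixels")

    # Doubling the linear resolution quadruples the pixel count, and with
    # it the storage and processing load.
    print("4K/2K ratio:", pixel_count(4096) // pixel_count(2048))  # 4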

<< 2. are you referring to horizontal or vertical resolution? >>

Film scanners usually don't let you mix and match resolutions; either you
get the full resolution the system was designed for, or some submultiple of
it (such as half or quarter resolution).  The actual numbers depend on whose
machine we're talking about.

<< 3. how does color bit depth affect the scan? >>

Color bit depth doesn't appreciably affect the scan itself, but it does
affect the amount of data that has to be stored and manipulated later on.
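
For example, here is how bit depth scales the raw size of a single frame; the
2048 x 1556 raster is one commonly cited full-aperture 2K size, used here
purely for illustration:

    def frame_bytes(width, height, channels=3, bits_per_channel=8):
        """Raw (uncompressed) size of one frame in bytes."""
        return width * height * channels * bits_per_channel // 8

    w, h = 2048, 1556    # an illustrative 2K scanning raster
    for bits in (8, 10, 16):
        mb = frame_bytes(w, h, bits_per_channel=bits) / 1e6
        print(f"{bits} bits/channel: about {mb:.0f} MB per frame")
    # 8 -> ~10 MB, 10 -> ~12 MB, 16 -> ~19 MB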

<< 4. a bonus:  since the scanned 35mm goes into the digital domain (i.e. no
loss in picture quality), is there still a need to shoot in a higher film
format (such as Vistavision)? >>

The short answer is yes, if the quality (and budget) of the production
warrant it.  Nobody would bother with IMAX or 70mm if 35mm were perfectly
adequate for all purposes.  The bigger the negative, the more picture
information it can capture, with better results later on.  A frame of big
film yields a big data file when scanned, and we can always throw some of it
away if there's more than needed.  On the other hand, a small film frame only
contains so much data to be scanned, and if that's not enough, you've got
problems!
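
As a toy illustration of "throwing some away," here is the simplest possible
decimation--keeping every other pixel in each direction.  (A real pipeline
would filter before downsampling to avoid aliasing; this is only a sketch.)

    def decimate(frame):
        """Halve resolution by keeping every other row and column."""
        return [row[::2] for row in frame[::2]]

    # A stand-in 8x8 "frame" of (x, y) coordinates instead of real pixels.
    big = [[(x, y) for x in range(8)] for y in range(8)]
    small = decimate(big)
    print(len(big[0]), "x", len(big), "->", len(small[0]), "x", len(small))
    # 8 x 8 -> 4 x 4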

Christopher Bacon