When buying a digital camera or camera phone, most people treat the pixel count, and thus the price per pixel, as the main quality factor. If you are in the market for a new camera or phone, this book can help you find the best camera for your application and your budget.
Wait a minute, why am I telling you to read a lens book when what you want to buy is a camera? Well, the lens is the device that creates the image. The sensor just samples it, and the image processing chip corrects the lens's and sensor's shortcomings. Three elements determine the image quality, and in order of impact they are the lens, the image processing software, and the sensor.
Gregory Hallock Smith is an optical engineer and lens designer. He designed all of the camera lenses for JPL's Mars Exploration Rovers that have been taking pictures on Mars since January 2004. He knows best how to design an imaging system given a set of constraints, and this is why I recommend this book for the informed buyer of a digital camera.
When I was a young lad, I used to shoot black and white and make my own enlargements. In my first phase I was shooting mostly with a 24 mm lens on HP4 film; I used to push it to the extreme by heavily underexposing it and then putting it through a soft physical development in Emofin. This gave me nice big grain while I still had a full tonal range. Then a friend showed me what he could get out of my enlarger from his Pan-F negatives. I was amazed and switched to Technical Pan, which resolves 320 line pairs per mm, and was stupefied by the quality of my enlarger and my lab skills. However, in this second phase I had to completely change the way I see, because my 24 mm lens had become a clunker. I was now limited to photographing with a 55 mm micro lens and a 500 mm catadioptric lens, both of which are diffraction-limited.
Because the blur disk from the point spread function (PSF) has a fixed size for a given lens and aperture, increasing the sensor resolution makes the pixels smaller, but you just get more pixels inside the same disk; they are not capturing more image information.
In section 16.4 Smith gives you a numerical example. His 4 megapixel point-and-shoot camera has a 1/1.8" sensor, hence a pixel pitch of 3.16 μm. The lens's PSF and its ƒ/8 Airy disk (the diffraction limit) are 10.7 μm in size, so the blur disk is already 3.4 pixels across: he already has more pixels than he can use.
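You can reproduce Smith's numbers yourself. Here is a small sketch; the sensor width and the 550 nm wavelength are my assumptions (a typical green wavelength and the usual 1/1.8" sensor dimensions), not figures taken from the book.

```python
# Verify the section 16.4 arithmetic (assumed inputs, see lead-in).
wavelength_um = 0.55   # green light, ~550 nm (assumption)
f_number = 8.0

# Airy disk diameter to the first dark ring: 2.44 * lambda * N
airy_um = 2.44 * wavelength_um * f_number
print(f"Airy disk: {airy_um:.1f} um")        # ~10.7 um

# A 1/1.8" sensor is roughly 7.18 mm wide; 4 MP is ~2272 pixels across
pitch_um = 7180 / 2272
print(f"Pixel pitch: {pitch_um:.2f} um")     # ~3.16 um

print(f"Blur disk spans {airy_um / pitch_um:.1f} pixels")  # ~3.4
```

The three printed values match Smith's 10.7 μm, 3.16 μm, and 3.4 pixels.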
You really want fewer pixels, because for a fixed sensor size you then get larger pixels, which are more sensitive and give you more bits per pixel. If your sensor can, for example, give you 12 bits per pixel, then a camera with good imaging software will do some retinex magic to map those 12 bits into the 8 bits of a JPEG file and give you an image with much better tone reproduction.
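To see why the extra bits matter, here is a deliberately minimal sketch. Real cameras use far more elaborate tone mapping than this (the "retinex magic" above); a plain gamma curve is shown only to illustrate how 12 linear bits can be compressed into 8 output bits while preserving shadow detail. The function name and the 2.2 gamma are my choices, not anything from the book.

```python
def tone_map(value12, gamma=2.2):
    """Map a 12-bit linear sensor value (0..4095) to 8 bits (0..255)
    through a simple gamma curve (illustrative only, not retinex)."""
    normalized = value12 / 4095.0
    return round(255 * normalized ** (1.0 / gamma))

# The gamma lift spreads the deep shadows over many output codes,
# where a straight linear rescale to 8 bits would crush them.
for v in (0, 64, 512, 2048, 4095):
    print(v, "->", tone_map(v))
```

A linear rescale would map the 12-bit value 64 to output code 4; the gamma curve keeps it well separated from black, which is where the better tone reproduction comes from.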
This book will help you understand the optical limits of a lens and lets you look up the values typical of your lenses in a compilation of the most important lens designs. With this data you can then find the pixel size required to match your lenses and, for a given sensor size, the number of pixels required.
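That matching calculation can be sketched in a few lines. The helper below is hypothetical, and the rule of thumb of two pixels per blur disk (a Nyquist-style sampling argument) is my assumption, not Smith's prescription.

```python
def max_useful_megapixels(sensor_w_mm, sensor_h_mm, blur_um, samples_per_blur=2):
    """Estimate the pixel count beyond which extra pixels add no detail,
    given the sensor dimensions and the lens's blur disk diameter."""
    pitch_um = blur_um / samples_per_blur          # matched pixel pitch
    cols = sensor_w_mm * 1000 / pitch_um
    rows = sensor_h_mm * 1000 / pitch_um
    return cols * rows / 1e6

# Smith's 1/1.8" sensor (~7.18 x 5.32 mm) behind a 10.7 um blur disk:
print(f"{max_useful_megapixels(7.18, 5.32, 10.7):.1f} MP")
```

On those numbers the answer comes out well under 2 MP, which is the point of the whole argument: at ƒ/8 the 4 MP sensor is oversampling the lens.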
Gregory Hallock Smith, Camera Lenses: From Box Camera to Digital, SPIE Press, 2006 http://bookstore.spie.org/index.cfm?fuseaction=DetailVolume&productid=660181