Zoom-wise: I believe the eye’s field of view is roughly equivalent to somewhere between a 35mm and a 50mm lens.
Light-wise: I would think the eye is MUCH better at dealing with different light levels than a camera. We can see in really dark rooms that just appear black in standard photos without a flash (how much light a camera records is a combination of how much light the lens lets in, how long the sensor is exposed for, and how sensitive the sensor is).
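The three camera factors in brackets are what photographers fold into an "exposure value". A minimal sketch of that arithmetic, with illustrative settings that are not from the answer above:

```python
import math

# A camera's exposure depends on the three things mentioned above:
# how wide the aperture opens (f-number N), how long the shutter stays
# open (t seconds), and how sensitive the sensor is (ISO).
# The first two combine into an exposure value, defined at ISO 100:
#   EV = log2(N^2 / t)
# Each step of 1 EV means half as much light reaches the sensor.
def exposure_value(f_number, shutter_s):
    return math.log2(f_number ** 2 / shutter_s)

# Illustrative settings (hypothetical, for comparison only):
sunny = exposure_value(16, 1 / 125)      # the classic "sunny 16" rule
dark_room = exposure_value(2.8, 1 / 15)  # wide aperture, slow shutter
print(round(sunny - dark_room), "stops more light needed indoors")
```

This is why a dark room comes out black: even with the aperture wide open and a slow shutter, the camera is collecting hundreds of times less light than outdoors.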
Resolution: Not sure how many receptor cells there are on the retina (anyone else?) but I’d guess tens of millions. This is probably broadly equivalent to a cutting-edge professional digital camera.
Of course the brain outperforms computers most of the time by a long shot when it comes to interpreting the images 🙂
Hiya, I agree with Andrew’s response. One thing that is different about our eyes and a camera is that a camera processes everything it ‘sees’ equally. It doesn’t matter if it is at the centre or the edge of what the camera can pick up, so all parts of the image can be in focus at once. Our eyes have many more cone receptors near the centre, which means we see best when the information falls right at the centre of our eyes (the ‘fovea’). That’s why you move your eyes towards things at the side if you want to see them!
There are about 120 million rod cells and about 6 million cones in the retina; most of the cones are in the macula, the region concerned with detailed vision. The central 125 microns of the retina is rod-free. The receptors are arranged in a mosaic. The eye has a dynamic range of around 10,000,000 times; in other words, we can see by moonlight (about 1 lux) and in bright sunlight (10^8 lux). Our eyes can also ignore the effect of coloured ambient light; this is still a big area of research in colour vision.
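A quick back-of-envelope check of those figures, using only the numbers quoted in the answer above (the lux values are the poster’s, not independent measurements):

```python
import math

# Receptor counts from the answer above: ~120 million rods + ~6 million cones.
rods, cones = 120e6, 6e6
receptors_millions = (rods + cones) / 1e6
print(f"~{receptors_millions:.0f} million receptors (cf. camera megapixels)")

# Dynamic range: the quoted span from moonlight (~1 lux) to bright
# sunlight (~10^8 lux), expressed in the "stops" photographers use
# (each stop is a doubling of light).
ratio = 1e8 / 1
stops = math.log2(ratio)
print(f"~{stops:.0f} stops of range; a good camera sensor manages ~12-14")
```

Note the quoted figures aren’t quite self-consistent: 1 lux to 10^8 lux is a factor of 10^8, a bit more than the "10,000,000 times" stated, but either way the eye’s range dwarfs a single camera exposure.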
To paraphrase Andrew, computers are rubbish at seeing. Computers can do parts of seeing really well but overall they are mostly great at adding up.
🙂