So, when an image is out of focus, the light from any given point of the object does not refocus at a single point on the screen but is spread over a region, so that the light recorded at any given point of the image is a superposition of light from the surrounding points.
If this is the case, then it seems that no information about which light corresponds to which point has been lost; it has only been dispersed spatially.
Could a program retrieve that information without any loss?
Answer
No, because you have no information about how far away objects are, and distance affects how much the light spreads. If you were photographing a 2D plane (a painting, for example), you could probably recover more information, but there would still be considerable quality loss: near the borders you do not capture the entire spread, since some of the light falls outside the photograph, and because every point in the reconstruction depends on every other, that missing light affects the whole image after processing.
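To make the 2D-plane caveat concrete, here is a minimal sketch (my own illustration, not part of the original answer) under the assumption that the defocus blur acts as convolution with a known, depth-independent disk-shaped point-spread function; recovery is attempted with a simple regularized (Wiener-style) inverse filter in plain numpy.

```python
# Sketch: blur a flat 2D "painting" with a disk PSF (a crude model of defocus)
# and try to undo it with a regularized inverse filter. Even in this ideal case
# the recovery is imperfect, and in a real photo the light spilling past the
# frame would make the boundary errors worse.
import numpy as np

def disk_psf(radius, size):
    """Uniform disk point-spread function, a simple model of defocus blur."""
    y, x = np.ogrid[-size // 2:size // 2, -size // 2:size // 2]
    psf = (x**2 + y**2 <= radius**2).astype(float)
    return psf / psf.sum()

def wiener_deconvolve(blurred, psf, noise_to_signal=1e-3):
    """Frequency-domain inverse filter with regularization.
    Division by near-zero frequencies of the PSF is what amplifies
    noise and boundary artifacts, hence the quality loss."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    F_hat = np.conj(H) * G / (np.abs(H) ** 2 + noise_to_signal)
    return np.real(np.fft.ifft2(F_hat))

# Synthetic scene: everything lies at a single depth, so one PSF describes
# the whole blur. This is exactly the assumption that fails for a 3D scene.
rng = np.random.default_rng(0)
scene = rng.random((128, 128))

psf = disk_psf(radius=4, size=128)
# Circular convolution stands in for the optical blur.
blurred = np.real(np.fft.ifft2(np.fft.fft2(scene) * np.fft.fft2(psf)))

restored = wiener_deconvolve(blurred, psf)
print("RMS error after restoration:", np.sqrt(np.mean((restored - scene) ** 2)))
```

The residual error never goes to zero: the regularization needed to keep the division stable throws away the frequencies the blur suppressed, and for a real 3D scene there is no single PSF to divide by in the first place.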