When you buy a new digital camera, vendors try to convince you to choose the model with the most megapixels (the highest resolution). But is this always necessary?
Do people even notice the difference between images with different resolutions?
David Pogue, a writer for the New York Times, conducted multiple tests on this subject.
These tests showed that only 6% of the people could distinguish between different resolutions of the same image printed on 16 x 24″ (40 x 60 cm) paper.
He chose this size because, according to him, most amateur photographers rarely print images larger than that.
The resolutions shown were 5, 8 and 13 megapixels.
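To see why the differences were so hard to spot, we can work out the print density each resolution gives on that paper size. The helper below is our own illustration (not part of Pogue's test), and it assumes the image aspect ratio matches the 16 x 24″ print:

```python
import math

# Hypothetical helper: pixels per inch (ppi) when an image of
# `megapixels` million pixels is printed to fill a 16 x 24 inch sheet.
# Assumes the image aspect ratio matches the print, so the pixels
# spread evenly over the full area.
def print_ppi(megapixels, width_in=24, height_in=16):
    pixels = megapixels * 1e6
    return math.sqrt(pixels / (width_in * height_in))

for mp in (5, 8, 13):
    print(f"{mp} MP -> {print_ppi(mp):.0f} ppi")
```

This gives roughly 114, 144 and 184 ppi for 5, 8 and 13 megapixels: all below the 300 ppi often quoted for close-up viewing, which may help explain why so few viewers could tell them apart.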
So, for amateur use, do you think digital cameras with high resolution are really useful?
Nowadays, when people buy TVs, they always want the highest resolution possible for their viewing pleasure.
But is a higher resolution always better? Is the difference between 1080p and 720p always noticeable?
A higher resolution delivers more detail and makes the picture sharper.
There are a few factors that determine whether a viewer can distinguish between image resolutions:
the resolution of the screen, the size of the screen, and the viewing distance. To detect differences between resolutions, the screen must be large enough and the viewer must sit close enough.
The human eye (for people with perfect vision) can resolve about 1/60 of a degree of arc, i.e. one arcminute. With this knowledge, it is possible to estimate when the differences between resolutions become apparent. The following picture shows at which distance a certain resolution becomes noticeably better than a lower one.
For example, take a 50 inch (127 cm) screen.
The benefits of 720p over 480p only start to become noticeable at viewing distances closer than 14.6 feet (4.45 m) and become fully apparent at 9.8 feet (3 m). For the same screen, the benefits of 1080p over 720p start to become noticeable at distances closer than 9.8 feet (3 m) and are fully apparent at 6.5 feet (2 m).
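These "fully apparent" distances follow directly from the one-arcminute limit: closer than the distance at which a single pixel subtends one arcminute, the pixel grid itself becomes resolvable. A small sketch of that calculation, assuming a 16:9 screen:

```python
import math

# Distance (in feet) at which one pixel of a 16:9 screen with the given
# diagonal subtends exactly one arcminute (1/60 of a degree) -- the
# resolving limit of perfect (20/20) vision. Any closer, and a viewer
# with perfect vision can resolve individual pixels.
def full_benefit_distance_ft(diagonal_in, horizontal_pixels):
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # screen width
    pixel_in = width_in / horizontal_pixels          # one pixel's width
    one_arcmin = math.radians(1 / 60)
    return pixel_in / math.tan(one_arcmin) / 12      # inches -> feet

d720 = full_benefit_distance_ft(50, 1280)
d1080 = full_benefit_distance_ft(50, 1920)
print(f"720p: {d720:.1f} ft, 1080p: {d1080:.1f} ft")
```

For a 50 inch screen this reproduces the figures above: about 9.8 feet for 720p and about 6.5 feet for 1080p.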
So we can conclude that on smaller screens there isn't much difference between 1080p and 720p, but on bigger screens, like in a cinema, the improvement is massive.
Alexander & Tom
The amount of information in the world is growing at an enormous speed.
The amount of image and video data in particular keeps increasing as resolutions keep rising.
Nowadays, standard-definition television, at 768 by 576 pixels,
is being replaced by HDTV at 1920 by 1080 pixels. But there is already a new standard in the making: UHDTV, or ultra-high-definition television, with a resolution of 7680 by 4320 pixels.
The following example illustrates the increase in data size between these standards,
assuming a frame rate of 25 frames per second and 24 bits per pixel.
Figure 1: amount of raw data per second for multiple video standards
If you compare UHDTV to standard PAL, the amount of data increases by a factor of 75. And both the frame rate and the bits per pixel will still increase in the UHDTV standard.
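The raw data rates behind the figure are easy to recompute. The short sketch below uses the resolutions and assumptions from the text (25 frames per second, 24 bits = 3 bytes per pixel):

```python
# Raw (uncompressed) data rate for each video standard, using the
# figures from the text: 25 frames per second, 24 bits (3 bytes)
# per pixel, and no compression.
STANDARDS = {
    "PAL":   (768, 576),
    "HDTV":  (1920, 1080),
    "UHDTV": (7680, 4320),
}

def raw_bytes_per_second(width, height, fps=25, bytes_per_pixel=3):
    return width * height * bytes_per_pixel * fps

for name, (w, h) in STANDARDS.items():
    print(f"{name}: {raw_bytes_per_second(w, h) / 1e6:.1f} MB/s")

# UHDTV has 75 times as many pixels per frame as 768 x 576 PAL:
ratio = (7680 * 4320) / (768 * 576)
```

PAL comes out at roughly 33 MB/s of raw data, UHDTV at roughly 2.5 GB/s, which makes the factor of 75 concrete.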
Where will this increase in size end?
Will we still be able to process all this data?
Alexander & Tom
We are going to develop parallel implementations of image processing algorithms for one of the researchers at Group T, Kim Kiekens. In her research she has to process many high-resolution CT-scan images, which takes a lot of time. The processing is currently done in Matlab, where she has to enter commands manually to execute the algorithms. To make this easier, we are going to develop a graphical user interface using the Qt libraries.
Next, we are going to research the processing time and efficiency of different implementations of the data-processing algorithms on a GPU. Testing the GPU's multiple kinds of memory is also one of our goals.