What's the Display Resolution?
Some time ago I searched for explanations of the concept of resolution.
Reading several articles, I saw that there is some confusion about
various concepts such as:
pixel size, native resolution of an LCD monitor, resolution of a
CRT display (CRT: cathode-ray tube), and dot pitch. I decided to distil
the contents of these articles in order to shed light on the subject.
The PC screen consists of a rectangular grid of points called pixels
(Picture Elements). Each pixel can be individually set to a specific
color chosen from among those available. On a display, all of these
points make up an image in the same way as the pieces of a mosaic make
up a design.
Common dimensions for this rectangular grid of pixels are as follows:
640 x 480 (standard VGA);
800 x 600 (standard Super VGA);
1024 x 768 (standard XGA);
1280 x 1024 (standard SXGA).
The minimum size of a pixel that the video device can support is the
dot pitch. Most video devices measure the dot pitch in millimeters.
Generally, PC video devices support dot pitches between 0.39
and 0.22 mm.
Reducing the size of the pixels increases the definition of the image,
just as in a mosaic, decreasing the size of the tiles increases the
level of detail in the design.
In a CRT video device (cathode-ray tube), the smallest possible pixel
is actually composed of three points of light (one red, one green and
one blue). Depending on the intensity with which the electron beam lights
up the phosphors, the three points generate a range of colors created
by mixing the three primary colors mentioned above.
For an LCD (liquid crystal display), the situation is similar. The
three colors in this case are not generated by an electron beam, but
by electrically polarizing, in an appropriate way, a special material
called, precisely, liquid crystal.
The number of pixels across and the number of pixels down in the
rectangular grid above give the size in pixels of the monitor (for
example 1024 x 768). Older monitors have a fixed, non-modifiable size,
for example 640 x 480 pixels. A multiscan monitor can be adjusted to
any of the pixel sizes that it supports (and which are also supported
by the graphics card).
Since the physical dimensions of the monitor obviously do not vary,
changing the monitor's size in pixels really means changing the size
of the pixels: going, for example, from 640 x 480 pixels to 800 x 600
pixels, the side of each pixel decreases (because in the same space
where before there were 640 pixels, now there are 800).
The monitor resolution is the number of pixels per unit of measure
(inch or centimeter). You can have:
Pixels per inch (PPI);
Pixels per centimeter (PPC).
Clearly, on the same monitor, changing the size in pixels means changing
the density of the points and therefore the resolution. When the size
in pixels of a monitor is changed, we say that the graphics mode has
been changed.
We can make all of this clear with a practical example. Take a multiscan
monitor adjusted to 1280 pixels horizontally and 1024 pixels vertically
(for a total of 1,310,720 pixels).
We measure the width in inches (1 inch = 2.54 cm) of the viewing area
(i.e. of the area in which there are pixels). Suppose it is 13.3 inches.
The current monitor resolution is 1280 pixels / 13.3 inches = 96.2 pixels
per inch.
By dividing the length of an inch in millimeters by the number of pixels
contained in it, it is possible to find the size of the pixels:
25.4 mm / 96.2 = 0.26 mm
Attention: this is not the dot pitch. As mentioned above, the dot pitch
is the minimum size of pixel that the video device supports!
If we change graphics mode, switching to 1024 x 768, the resolution
becomes 1024 pixels / 13.3 inches = 77 pixels per inch. You do not need
to do the same calculation vertically, because the pixels are square.
In this case the pixel size is 25.4 mm / 77 = 0.33 mm.
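The arithmetic of this example can be sketched in a few lines of Python; the 13.3-inch width and the two pixel grids are the example values from the text, and the function names are mine.

```python
MM_PER_INCH = 25.4  # one inch is exactly 25.4 mm

def monitor_resolution_ppi(pixels_across, width_inches):
    """Resolution in pixels per inch: horizontal pixel count / physical width."""
    return pixels_across / width_inches

def pixel_size_mm(ppi):
    """Physical side of one (square) pixel, in millimetres."""
    return MM_PER_INCH / ppi

# The two graphics modes discussed above, on the same 13.3-inch-wide area.
for pixels in (1280, 1024):
    ppi = monitor_resolution_ppi(pixels, 13.3)
    print(f"{pixels} px / 13.3 in = {ppi:.1f} ppi, "
          f"pixel side = {pixel_size_mm(ppi):.2f} mm")
```

Running it reproduces the figures above: about 96.2 ppi with 0.26 mm pixels, and 77 ppi with 0.33 mm pixels.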
The resolution of a monitor, therefore, depends on its size in pixels
(which on multiscan monitors is adjustable) and on the physical
dimensions of the display.
On the web the term resolution is often used improperly. You can often
hear: "I have a monitor with 800 x 600 resolution."
In fact, as explained above, what is mentioned is not really the
resolution but rather the number of pixels that the monitor displays.
If you know the dot pitch and the physical dimensions (width and height)
of the display, you can calculate the maximum number of horizontal and
vertical pixels that the monitor can show. In principle the operation
is simple: divide 1 inch (25.4 mm) by the dot pitch to obtain the
maximum resolution; multiplying this resolution by the number of
horizontal and vertical inches of the monitor then gives the
corresponding size in pixels.
Typically, in a CRT, because of the geometry of the screen and of the
pixels, and because the device is analog in nature, the size in pixels
calculated this way does not match the actual measurements. The
procedure can be considered valid, with reasonable accuracy, for LCD
monitors.
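This divide-and-multiply procedure can be sketched as follows; the 13.3 x 10.6 inch viewing area and the 0.264 mm dot pitch are hypothetical example values, not taken from a real panel.

```python
MM_PER_INCH = 25.4

def max_ppi(dot_pitch_mm):
    """Maximum resolution the device supports: pixels per inch when
    each pixel is exactly one dot pitch wide."""
    return MM_PER_INCH / dot_pitch_mm

def max_pixel_grid(width_in, height_in, dot_pitch_mm):
    """Largest pixel grid (width x height) the display can show,
    obtained by multiplying the maximum resolution by the inches."""
    ppi = max_ppi(dot_pitch_mm)
    return round(width_in * ppi), round(height_in * ppi)

# Hypothetical LCD: 13.3 x 10.6 in viewing area, 0.264 mm dot pitch.
print(max_pixel_grid(13.3, 10.6, 0.264))  # -> (1280, 1020)
```

As the text warns, on a CRT the real figures would not match this calculation exactly; even here the vertical count comes out 1020 rather than a standard 1024.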
The physical dimensions (width and height) of the screen are provided
by two parameters: the screen diagonal and the form ratio (equal to
the ratio between the two sides). Typical form ratios are 0.75 for
computer displays and 0.8 for television screens.
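Recovering width and height from these two parameters is a small Pythagoras exercise; the 17-inch diagonal below is a hypothetical example.

```python
import math

def screen_sides(diagonal_in, form_ratio):
    """Width and height from the diagonal and the form ratio
    r = height / width, using d^2 = w^2 + h^2 = w^2 * (1 + r^2)."""
    width = diagonal_in / math.sqrt(1 + form_ratio ** 2)
    return width, width * form_ratio

# Hypothetical 17-inch computer display with form ratio 0.75 (4:3).
w, h = screen_sides(17, 0.75)
print(f"{w:.1f} x {h:.1f} inches")  # -> 13.6 x 10.2 inches
```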
To view 80-character columns with characters about 8 pixels wide, a
horizontal resolution of 640 pixels is required. The minimum resolution
supported by Windows is 640 x 480 pixels. Thus, the ability to use a
specific resolution on a particular monitor depends primarily on the
dot pitch and, secondly, on the text size desired by the user. For
example, a display with a small dot pitch and a small physical size
could show the image properly, but you may have difficulty reading
the text on the screen.
Why has the idea circulated that the Monitor
Resolution is 72 pixels per inch?
Because on a monitor with 72 pixels per inch, the pixels have the same
size as a typographic point (both 1/72 of an inch), and therefore the
dimensions on the display are exactly the print dimensions (for example,
a piece of A4 paper that is 21 cm wide is also 21 cm wide on the
display).
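The equivalence only holds at exactly 72 ppi, as this small sketch shows (the 96.2 ppi figure is the example monitor from earlier in the article):

```python
def displayed_width_cm(real_width_cm, monitor_ppi):
    """On-screen width of a document laid out at 72 points (= 72 pixels)
    per inch, when shown on a monitor of the given resolution."""
    return real_width_cm * 72 / monitor_ppi

# An A4 sheet is 21 cm wide.
print(displayed_width_cm(21, 72))              # -> 21.0 (true size)
print(round(displayed_width_cm(21, 96.2), 1))  # -> 15.7 (smaller than life)
```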
Apple's first color monitors had a resolution fixed at 72 pixels per
inch, so they showed images at exactly 100% of their original size,
and this fact was widely publicized. Later Apple no longer cared about
this, because monitors had by then become multiscan and could be set
to various resolutions.
Today an A4 sheet displayed on a monitor hardly ever has the same size
as the A4 original, which is another way of saying that the resolution
of your monitor is almost never 72 pixels per inch.
The popular imagination, however, continues to believe that the
resolution has remained at 72 pixels per inch.
The LCD resolution
Unlike cathode-ray tube monitors, which maintain the same quality of
vision at different resolutions, LCDs give their best only at the
native resolution. This resolution is the one you get when the size
of the pixels equals the monitor's dot pitch. These monitors can also
be used at different resolutions, but the loss of quality is dramatic.
If you are not a resolution expert, it is worth evaluating the
behaviour of the display at its native resolution, to see whether it
fits everyday use; you can then try other modes and verify that the
result is not entirely bad.
Let's say that screens with 1,024 x 768 pixels are fit for office work,
while for comfortable web browsing you can use a monitor with
1,280 x 1,024 pixels.
Pixel dimensions of an image or picture
When people talk about a scanned image, in my opinion, they make an
improper use of the term resolution.
Consider a digital camera. The image to be acquired is focused (through
lenses) onto a matrix of elements. These elements play a role
complementary to the pixels of a monitor. Each of these elements
produces electrical signals that depend on the three main chromatic
components (RGB) of the light striking the element itself. Without
going into technical details, let's say that, in fact, the picture
quality depends on the number of pixels of this matrix.
For a digital image, however, it makes little sense to talk about
resolution, since it has no physical dimensions. Its only dimensions
are the pixels that compose it. Clearly, the greater the number of
pixels in the acquisition matrix, the more details will be captured
and the higher the quality of the image.
Take an image scanned at 1024 x 768 pixels. If the monitor it is
displayed on has the same grid as the image, the image will be shown
completely, and this does not depend on the number of inches of your
monitor. What changes, as the number of inches of the screen varies,
is precisely the resolution, that is, the ratio between the number of
pixels and the number of inches.
In practice the resolution depends on the display device, the printer
or the digitization device (scanner or camera), but not on the digital
image itself.
Let's have a look at what happens when we want to see on the screen
an image whose size in pixels differs from that of the monitor.
Suppose that we have an image of 2048 x 768 pixels and a monitor set
to 1024 x 768. The monitor in this case is not sufficient to show all
the pixels. Indeed, the number of horizontal pixels available is half
of those needed.
In this case, the operating system samples the image at a 1:2 ratio
(it keeps one pixel out of every two), losing detail. In this way the
image is displayed on the screen in full, with its size in pixels
halved. In order to maintain the ratio between width and height (thus
avoiding graphic distortion), the operating system samples the vertical
dimension at the same 1:2 ratio. We say that a "zoom" to 50% was made,
because the image size has been reduced by half. The resulting image
will have a size of 1024 x 384 pixels.
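The 1:n sampling described above can be sketched with plain list slicing; the 4 x 4 grid of grey levels is a made-up toy image.

```python
def downsample(image, n):
    """Keep one pixel out of every n, in both directions (1:n sampling).
    `image` is a list of rows, each row a list of pixel values."""
    return [row[::n] for row in image[::n]]

# Hypothetical 4 x 4 image of grey levels, sampled 1:2 into a 2 x 2 one.
img = [[ 0, 10, 20, 30],
       [40, 50, 60, 70],
       [80, 90, 99, 98],
       [97, 96, 95, 94]]
print(downsample(img, 2))  # -> [[0, 20], [80, 99]]
```

Note that the dropped values (10, 30, 40, ...) are simply gone, which is exactly the loss of detail the text describes.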
With appropriate graphics programs, an image can be scaled down by any
desired ratio 1:n. You can also scale up by a ratio n:1.
In this last case, for each pixel, n pixels with the same color appear
in width and in height. The visual effect you get is clearly a
magnification of the image that makes its details more obvious. In
this case it may happen that the image is not entirely contained on
the screen. The graphics tool will then display a part (or detail) of
it on the screen (or in a subwindow), and the user can bring the rest
of the image into view through the scroll bars.
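The n:1 case, where each pixel is replicated n times in width and height, can be sketched like this (again on a made-up toy image):

```python
def upscale(image, n):
    """Replicate each pixel n times horizontally and n times vertically
    (an n:1 zoom by pixel replication)."""
    out = []
    for row in image:
        wide = [p for p in row for _ in range(n)]  # repeat across the row
        out.extend([wide[:] for _ in range(n)])    # repeat the row itself
    return out

img = [[1, 2],
       [3, 4]]
print(upscale(img, 2))
# -> [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

No new information is created: the magnified image just shows each original pixel as an n x n block.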
Let's now look, with an example, at the assumptions under which you
can talk about the resolution of an image, and at what happens when
you change it.
Take a picture printed on a sheet and measure its size. Suppose the
image is that of Fig. 1, with size 2 x 1.8 inches, and consider the
number of pixels that compose it: suppose 200 x 180 pixels.
Under these conditions we can talk about resolution. The image in
question has 200 pixels / 2 inches = 100 dpi (dots per inch). The image
of Fig. 2 is apparently the same as Fig. 1; in reality its resolution
is half that of the first. Indeed, try to enlarge the details in the
boxes of the two red figures up to 800%: we get Fig. 3 and Fig. 4
below. Observing them, it is easy to see how at 100 dpi the number of
pixels is definitely higher than at 50 dpi. After all, the pixels of
the first figure are much smaller than those of the second, and this
fact, as explained, gives a higher quality.
Moving from a higher resolution to a lower one, the detail present in
the higher-resolution image is lost definitively, and it is virtually
impossible to recover it.
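The print-resolution arithmetic from the Fig. 1 example can be written down in the same style as the monitor calculation earlier; the 4-inch second print is a hypothetical variant of the text's example.

```python
def print_resolution_dpi(pixels, printed_inches):
    """Resolution of a printed image: pixel count / physical print size."""
    return pixels / printed_inches

# The text's example: 200 x 180 pixels printed at 2 x 1.8 inches.
print(print_resolution_dpi(200, 2))  # -> 100.0 dpi
# The same pixels printed twice as wide (hypothetical): half the resolution.
print(print_resolution_dpi(200, 4))  # -> 50.0 dpi
```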
Some photo programs use mathematical functions such as bicubic
interpolation to "increase the resolution". Actually what they do is
"recreate" the missing pixels by interpolating the colors of the
existing pixels.
With interpolation, between two pixels a number of intermediate pixels
are added, with shades intermediate between the two colors.
The result is often a larger and apparently more detailed picture,
which in reality is blurred.
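A one-dimensional sketch of the idea, using simple linear interpolation rather than the bicubic formula real photo programs use: the new values lie between the old ones, but no genuine detail is added.

```python
def interpolate_row(row, n):
    """Insert n-1 linearly interpolated values between each pair of
    neighbouring pixels in a row (a 1-D, linear stand-in for what
    bicubic resampling does in 2-D)."""
    out = []
    for a, b in zip(row, row[1:]):
        out.extend(a + (b - a) * k / n for k in range(n))
    out.append(row[-1])  # keep the final original pixel
    return out

# Two pixels, black (0) and white (100), stretched to five pixels.
print(interpolate_row([0, 100], 4))  # -> [0.0, 25.0, 50.0, 75.0, 100]
```

The inserted values are only blends of the originals, which is why the enlarged image looks smooth but blurred.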
By Fabio Pacioni