
Dots per inch


A close-up of the dots produced by an inkjet printer at draft quality. Actual size is approximately 1/4 by 1/4 inch (6 by 6 mm). Individual coloured droplets of ink are visible; this sample is about 150 DPI.

Dots per inch (DPI, or dpi[1]) is a measure of spatial printing, video or image scanner dot density, in particular the number of individual dots that can be placed in a line within the span of 1 inch (2.54 cm). Similarly, dots per centimetre (d/cm or dpcm) refers to the number of individual dots that can be placed within a line of 1 centimetre (0.394 in).[2]

DPI measurement in printing

Dots on printed paper

DPI is used to describe the resolution, in number of dots per inch, of a digital print and the printing resolution of a hard-copy print. The apparent resolution is affected by dot gain, the increase in the size of the halftone dots during printing, which is caused by the spreading of ink on the surface of the media.

Up to a point, printers with higher DPI produce clearer and more detailed output. A printer does not necessarily have a single DPI measurement; it is dependent on print mode, which is usually influenced by driver settings. The range of DPI supported by a printer is most dependent on the print head technology it uses. A dot matrix printer, for example, applies ink via tiny rods striking an ink ribbon, and has a relatively low resolution, typically in the range of 60 to 90 DPI (420 to 280 μm). An inkjet printer sprays ink through tiny nozzles, and is typically capable of 300–720 DPI.[3] A laser printer applies toner through a controlled electrostatic charge, and may be in the range of 600 to 2,400 DPI.

The DPI measurement of a printer often needs to be considerably higher than the pixels per inch (PPI) measurement of a video display in order to produce similar-quality output. This is due to the limited range of colours for each dot typically available on a printer. At each dot position, the simplest type of color printer can either print no dot or print a dot consisting of a fixed volume of ink in each of four colour channels (typically CMYK: cyan, magenta, yellow and black ink), giving 2⁴ = 16 colours on laser, wax and most inkjet printers, of which only 14 or 15 (or as few as 8 or 9) may be actually discernible, depending on the strength of the black component, the strategy used for overlaying and combining it with the other colours, and whether it is in "color" mode.

Higher-end inkjet printers can offer 5, 6 or 7 ink colours giving 32, 64 or 128 possible tones per dot location (and again, it can be that not all combinations will produce a unique result). Contrast this to a standard sRGB monitor where each pixel produces 256 intensities of light in each of three channels (RGB).
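
The counts above follow directly from treating each ink channel as binary. A minimal sketch of that arithmetic (the function name is illustrative, and it deliberately ignores combinations that are not visually distinguishable):

```python
# Nominal number of ink combinations at one dot position, assuming each
# ink channel is strictly binary (a fixed drop of ink, or none at all).
def combinations_per_dot(ink_channels: int) -> int:
    return 2 ** ink_channels

for channels in (4, 5, 6, 7):  # CMYK, plus higher-end 5-, 6- and 7-ink sets
    print(channels, "inks ->", combinations_per_dot(channels), "nominal tones per dot")
# 4 -> 16, 5 -> 32, 6 -> 64, 7 -> 128; fewer may be visually distinct in practice.
```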

While some color printers can produce variable drop volumes at each dot position, and may use additional ink-color channels, the number of colours is still typically less than on a monitor. Most printers must therefore produce additional colours through a halftone or dithering process, and rely on their base resolution being high enough to "fool" the human observer's eye into perceiving a patch of a single smooth colour.

The exception to this rule is dye-sublimation printers, which can apply a much more variable amount of dye—close to or exceeding the number of the 256 levels per channel available on a typical monitor—to each "pixel" on the page without dithering, but with other limitations:

  • lower spatial resolution (typically 200 to 300 dpi), which can make text and lines look somewhat rough
  • lower output speed (a single page requiring three or four complete passes, one for each dye colour, each of which may take more than fifteen seconds—generally quicker, however, than most inkjet printers' "photo" modes)
  • a wasteful (and, for confidential documents, insecure) dye-film roll cartridge system
  • occasional color registration errors (mainly along the long axis of the page), which necessitate recalibrating the printer to account for slippage and drift in the paper feed system.

These disadvantages mean that, despite their marked superiority in producing good photographic and non-linear diagrammatic output, dye-sublimation printers remain niche products, and thus other devices using higher resolution, lower color depth, and dither patterns remain the norm.

This dithered printing process could require a region of four to six dots (measured across each side) to accurately reproduce the color in a single pixel. An image that is 100 pixels wide may need to be 400 to 600 dots in width in the printed output; if a 100 × 100-pixel image is to be printed in a one-inch square, the printer must be capable of 400 to 600 dots per inch to reproduce the image. As such, 600 dpi (sometimes 720) is now the typical output resolution of entry-level laser printers and some utility inkjet printers, with 1,200–1,440 and 2,400–2,880 being common "high" resolutions. This contrasts with the 300–360 (or 240) dpi of early models, and the approximate 200 dpi of dot-matrix printers and fax machines, which gave faxed and computer-printed documents—especially those that made heavy use of graphics or coloured block text—a characteristic "digitized" appearance, because of their coarse, obvious dither patterns, inaccurate colours, loss of clarity in photographs, and jagged ("aliased") edges on some text and line art.
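
The relationship described above amounts to a simple calculation: the printer resolution needed to render a given image resolution is roughly the image's pixels per inch multiplied by the width of the dither cell in dots. A minimal sketch, assuming the four-to-six-dot cell sizes mentioned in the paragraph above (names are illustrative):

```python
# Rough printer resolution needed to reproduce an image through dithering,
# assuming each image pixel becomes a square cell of printer dots.
def required_printer_dpi(image_ppi: float, dither_cell_dots: int) -> float:
    return image_ppi * dither_cell_dots

# A 100 x 100 pixel image printed in a one-inch square is 100 PPI:
for cell in (4, 5, 6):
    print(f"{cell}-dot dither cell -> {required_printer_dpi(100, cell):.0f} DPI needed")
# 4-dot cell -> 400 DPI, 6-dot cell -> 600 DPI, matching the figures above.
```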

A 10 × 10-pixel computer display image usually requires many more than 10 × 10 printer dots to reproduce it accurately, due to the limited colours of ink available from the printer; here, a 60 × 60 grid is used, providing 36 times the original density, compensating for the printer's fewer colours. The fully blue pixels making up the sphere are reproduced by the printer using different overlaid combinations of cyan, magenta, and black ink, and the light aqua by cyan and yellow with some "white" (ink-free) print pixels within the actual image pixel. When viewed at a more normal distance, the primary coloured stippled dots appear to merge into a smoother, more richly coloured image.

DPI or PPI in digital image files


In printing, DPI (dots per inch) refers to the output resolution of a printer or imagesetter, and PPI (pixels per inch) refers to the input resolution of a photograph or image. DPI refers to the physical dot density of an image when it is reproduced as a real physical entity, for example printed onto paper. A digitally stored image has no inherent physical dimensions, measured in inches or centimetres. Some digital file formats record a DPI value, or more commonly a PPI value, which is to be used when printing the image. This number lets the printer or software know the intended size of the image, or in the case of scanned images, the size of the original scanned object. For example, a bitmap image may measure 1,000 × 1,000 pixels, a resolution of 1 megapixel. If it is labelled as 250 PPI, that is an instruction to the printer to print it at a size of 4 × 4 inches. Changing the PPI to 100 in an image editing program would tell the printer to print it at a size of 10 × 10 inches. However, changing the PPI value would not change the size of the image in pixels, which would still be 1,000 × 1,000. An image may also be resampled to change the number of pixels and therefore the size or resolution of the image, but this is quite different from simply setting a new PPI for the file.
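
The arithmetic in the example above is simply pixel dimensions divided by the PPI tag. A minimal sketch (the function name is illustrative):

```python
# Intended print size is pixel dimensions divided by the PPI tag;
# changing the tag changes the print size, not the stored pixel data.
def print_size_inches(width_px: int, height_px: int, ppi: float) -> tuple[float, float]:
    return width_px / ppi, height_px / ppi

print(print_size_inches(1000, 1000, 250))  # (4.0, 4.0)   -> printed 4 x 4 inches
print(print_size_inches(1000, 1000, 100))  # (10.0, 10.0) -> printed 10 x 10 inches
```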

For vector images, since the file is resolution independent, there is no need to resample the image before resizing it as it prints equally well at all sizes. However, there is still a target printing size. Some image formats, such as Photoshop format, can contain both bitmap and vector data in the same file. Adjusting the PPI in a Photoshop file will change the intended printing size of the bitmap portion of the data and also change the intended printing size of the vector data to match. This way the vector and bitmap data maintain a consistent size relationship when the target printing size is changed. Text stored as outline fonts in bitmap image formats is handled in the same way. Other formats, such as PDF, are primarily vector formats that can contain images, potentially at a mixture of resolutions. In these formats the target PPI of the bitmaps is adjusted to match when the target print size of the file is changed. This is the converse of how it works in a primarily bitmap format like Photoshop, but has exactly the same result of maintaining the relationship between the vector and bitmap portions of the data.[citation needed]

Computer monitor DPI standards


Since the 1980s, Macs have set the default display "DPI" to 72 PPI, while the Microsoft Windows operating system has used a default of 96 PPI.[4] These default specifications arose out of the problems rendering standard fonts in the early display systems of the 1980s, including the IBM-based CGA, EGA, VGA and 8514 displays as well as the Macintosh displays featured in the 128K computer and its successors. The choice of 72 PPI by Macintosh for their displays arose from existing convention: the official 72 points per inch mirrored the 72 pixels per inch that appeared on their display screens. (Points are a physical unit of measure in typography, dating from the days of printing presses, where 1 point by the modern definition is 1/72 of the international inch (25.4 mm), which therefore makes 1 point approximately 0.0139 in or 352.8 μm.) Thus, the 72 pixels per inch seen on the display had exactly the same physical dimensions as the 72 points per inch later seen on a printout, with 1 pt in printed text equal to 1 px on the display screen. As it is, the Macintosh 128K featured a screen measuring 512 pixels in width by 342 pixels in height, and this corresponded to the width of standard office paper (512 px ÷ 72 px/in ≈ 7.1 in, with a 0.7 in margin down each side when assuming 8 1/2 in × 11 in North American paper size; in the rest of the world, paper is 210 mm × 297 mm, called A4, while B5 is 176 mm × 250 mm).[citation needed]
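
A quick check of the figures above (one point as 1/72 inch, and the Macintosh 128K's 512-pixel-wide screen at 72 PPI), as a minimal sketch:

```python
INCH_MM = 25.4

point_um = INCH_MM / 72 * 1000              # one typographic point in micrometres
screen_width_in = 512 / 72                  # Macintosh 128K screen width at 72 PPI
margin_in = (8.5 - screen_width_in) / 2     # margin left over on US Letter paper

print(f"1 pt = {point_um:.1f} um")                                       # ~352.8 um
print(f"width = {screen_width_in:.1f} in, margin = {margin_in:.1f} in")  # ~7.1 in, ~0.7 in
```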

A consequence of Apple's decision was that the widely used 10-point fonts from the typewriter era had to be allotted 10 display pixels in em height, and 5 display pixels in x-height. This is technically described as 10 pixels per em (PPEm). As a result, 10-point fonts were rendered crudely and were difficult to read on the display screen, particularly the lowercase characters. Furthermore, there was the consideration that computer screens are typically viewed (at a desk) at a distance 30% greater than printed materials, causing a mismatch between the perceived sizes seen on the computer screen and those on the printouts.[citation needed]

Microsoft tried to solve both problems with a hack that has had long-term consequences for the understanding of what DPI and PPI mean.[5] Microsoft began writing its software to treat the screen as though it provided a PPI characteristic that is 4/3 of what the screen actually displayed. Because most screens at the time provided around 72 PPI, Microsoft essentially wrote its software to assume that every screen provides 96 PPI (because 72 × 4/3 = 96). The short-term gain of this trickery was twofold:

  • It would seem to the software that one-third more pixels were available for rendering an image, thereby allowing for bitmap fonts to be created with greater detail.
  • On every screen that actually provided 72 PPI, each graphical element (such as a character of text) would be rendered at a size one third larger than it "should" be, thereby allowing a person to sit a comfortable distance from the screen. However, larger graphical elements meant less screen space was available for programs to draw. Indeed, the default 720-pixel wide mode of a Hercules mono graphics adaptor (the one-time gold standard for high resolution PC graphics) – or a "tweaked" VGA adaptor – provided an apparent 7 1/2-inch page width at this resolution. However, the more common and colour-capable display adaptors of the time all provided a 640-pixel wide image in their high resolution modes, enough for a bare 6 2/3 inches at 100% zoom, with barely any greater visible page height – a maximum of 5 inches, versus 4 3/4. Consequently, the default margins in Microsoft Word were set, and still remain, at 1 full inch on all sides of the page, keeping the "text width" for standard size printer paper within visible limits; despite most computer monitors now being both larger and finer-pitched, and printer paper transports having become more sophisticated, the Mac-standard half-inch borders remain listed in Word 2010's page layout presets as the "narrow" option (versus the 1-inch default).[citation needed]
  • Without using supplemental, software-provided zoom levels, the 1:1 relationship between display and print size was (deliberately) lost; the availability of different-sized, user-adjustable monitors and display adaptors with varying output resolutions exacerbated this, as it was not possible to rely on a properly-adjusted "standard" monitor and adaptor having a known PPI. For example, a 12-inch Hercules monitor and adaptor with a thick bezel and a little underscan may offer 90 "physical" PPI, with the displayed image appearing nearly identical to hardcopy (assuming the H-scan density was properly adjusted to give square pixels), but a thin-bezel 14-inch VGA monitor adjusted to give a borderless display may be closer to 60, with the same bitmap image thus appearing 50% larger; yet, someone with an 8514 ("XGA") adaptor and the same monitor could achieve 100 DPI using its 1024-pixel wide mode and adjusting the image to be underscanned. A user who wanted to directly compare on-screen elements against those on an existing printed page by holding it up against the monitor would therefore first need to determine the correct zoom level to use, largely by trial and error, and often could not obtain an exact match in programs that only allowed integer per cent settings, or even fixed pre-programmed zoom levels. For the examples above, they may need to use respectively 94% (precisely 93.75, i.e. 90/96), 63% (62.5, or 60/96), and 104% (104.167, or 100/96), with the more commonly accessible 110% actually being a less precise match (see the sketch after this list).[citation needed]
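
The zoom factors quoted in the last item follow from dividing the monitor's physical PPI by the 96 PPI that the software assumes. A minimal sketch, using the example monitor PPI values from that item (the function name is illustrative):

```python
LOGICAL_PPI = 96  # the PPI that Windows-era software assumed the screen provided

def zoom_for_true_size(physical_ppi: float) -> float:
    """Zoom percentage at which on-screen size matches printed size."""
    return physical_ppi / LOGICAL_PPI * 100

for ppi in (90, 60, 100):  # Hercules, thin-bezel VGA, and underscanned 8514 examples
    print(f"{ppi} physical PPI -> {zoom_for_true_size(ppi):.3f}% zoom")
# 90 -> 93.750%, 60 -> 62.500%, 100 -> 104.167%
```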

Thus, for example, a 10-point font on a Macintosh (at 72 PPI) was represented with 10 pixels (i.e., 10 PPEm), whereas a 10-point font on a Windows platform (at 96 PPI) at the same zoom level is represented with 13 pixels (i.e., Microsoft rounded 13 1/3 to 13 pixels, or 13 PPEm) – and, on a typical consumer-grade monitor, would have physically appeared around 15/72 to 16/72 inch high instead of 10/72. Likewise, a 12-point font was represented with 12 pixels on a Macintosh, and 16 pixels (or a physical display height of maybe 19/72 inch) on a Windows platform at the same zoom, and so on.[6] The negative consequence of this standard is that with 96 PPI displays, there is no longer a one-to-one relationship between the font size in pixels and the printout size in points. This difference is accentuated on more recent displays that feature higher pixel densities. This has been less of a problem with the advent of vector graphics and fonts being used in place of bitmap graphics and fonts. Moreover, many Windows software programs have been written since the 1980s which assume that the screen provides 96 PPI. Accordingly, these programs do not display properly at common alternative resolutions such as 72 PPI or 120 PPI. The solution has been to introduce two concepts:[5]
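
The pixel counts above follow from the standard conversion pixels = points × PPI ÷ 72. A minimal sketch (the function name is illustrative):

```python
# Pixels per em (PPEm) for a font size in points on a display with a given
# (logical) PPI: pixels = points * ppi / 72.
def pixels_per_em(point_size: float, ppi: float) -> float:
    return point_size * ppi / 72

for pts in (10, 12):
    print(f"{pts} pt: Mac 72 PPI -> {pixels_per_em(pts, 72):.0f} px, "
          f"Windows 96 PPI -> {pixels_per_em(pts, 96):.2f} px")
# 10 pt -> 10 px vs 13.33 px (rounded to 13); 12 pt -> 12 px vs 16 px
```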

  • logical PPI: The PPI that software claims a screen provides. This can be thought of as the PPI provided by a virtual screen created by the operating system.
  • physical PPI: The PPI that a physical screen actually provides.

Software programs render images to the virtual screen and then the operating system renders the virtual screen onto the physical screen. With a logical PPI of 96 PPI, older programs can still run properly regardless of the actual physical PPI of the display screen, although they may exhibit some visual distortion thanks to the effective 133.3% pixel zoom level (requiring either that every third pixel be doubled in width/height, or heavy-handed smoothing be employed).[citation needed]
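
Conceptually, this virtualization is a linear scaling from the virtual (logical-PPI) screen to the physical screen. A minimal sketch under that assumption (names and the example 128-PPI panel are illustrative):

```python
# Map a length rendered on the virtual (logical-PPI) screen onto the physical screen.
def virtual_to_physical(px_on_virtual: float, logical_ppi: float, physical_ppi: float) -> float:
    return px_on_virtual * physical_ppi / logical_ppi

# 96 virtual pixels (one logical inch) shown on a 128-PPI panel:
print(virtual_to_physical(96, 96, 128))  # 128.0 physical pixels, a 4/3 (133.3%) zoom
```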

How Microsoft Windows handles DPI scaling

Windows XP DPI scaling at 200%
Windows 2000 DPI scaling at 200%

Displays with high pixel densities were not common up to the Windows XP era; they became mainstream around the time Windows 8 was released. Display scaling by entering a custom DPI, irrespective of the display resolution, has been a feature of Microsoft Windows since Windows 95.[7] Windows XP introduced the GDI+ library, which allows resolution-independent text scaling.[8]

Windows Vista introduced support for programs to declare themselves to the OS as high-DPI aware, either via a manifest file or by using an API.[9][10] For programs that do not declare themselves DPI-aware, Windows Vista supports a compatibility feature called DPI virtualization: system metrics and UI elements are presented to applications as if they were running at 96 DPI, and the Desktop Window Manager then scales the resulting application window to match the DPI setting. Windows Vista retains the Windows XP-style scaling option, which, when enabled, turns off DPI virtualization for all applications globally. DPI virtualization is intended as a compatibility option, as application developers are expected to update their apps to support high DPI without relying on it.
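
One such API is Win32's SetProcessDPIAware function. A minimal, Windows-only sketch using Python's ctypes, purely for illustration; a shipping application would more typically declare awareness in its manifest:

```python
import ctypes
import sys

# Windows-only sketch: mark this process as DPI-aware so the Desktop Window
# Manager does not apply DPI virtualization (bitmap scaling) to its windows.
if sys.platform == "win32":
    ctypes.windll.user32.SetProcessDPIAware()  # Win32 API available since Windows Vista
```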

Windows Vista also introduced Windows Presentation Foundation (WPF). WPF .NET applications are vector-based rather than pixel-based and are designed to be resolution-independent. Developers using the older GDI API and Windows Forms on the .NET Framework runtime need to update their applications to be DPI-aware and flag them as such.

Windows 7 adds the ability to change the DPI by logging off and back on, without a full reboot, and makes it a per-user setting. Additionally, Windows 7 reads the monitor DPI from the EDID and automatically sets the system DPI value to match the monitor's physical pixel density, unless the effective resolution is less than 1024 × 768.

In Windows 8, only the DPI scaling percentage is shown in the DPI-changing dialog; the display of the raw DPI value has been removed.[11] In Windows 8.1, the global setting to disable DPI virtualization (i.e. use only XP-style scaling) is removed, and a per-application setting is added that lets the user disable DPI virtualization from the Compatibility tab.[11] When the DPI scaling setting is higher than 120 PPI (125%), DPI virtualization is enabled for all applications unless an application opts out by setting the DPI-aware flag to "true" in its embedded manifest. Windows 8.1 retains a per-application option to disable DPI virtualization of an app.[11] Windows 8.1 also adds the ability for different displays to use independent DPI scaling factors, although it calculates these automatically for each display and turns on DPI virtualization for all monitors at any scaling level.

Windows 10 adds manual control over DPI scaling for individual monitors.

Proposed metrication


There are some ongoing efforts to abandon DPI as an image resolution unit in favour of a metric unit, giving the dot density in dots per centimetre (px/cm or dpcm), as used in CSS3 media queries,[12] or the inter-dot spacing in micrometres (μm).[13] A resolution of 72 DPI, for example, equals a resolution of about 28 dpcm or an inter-dot spacing of about 353 μm.

Conversion table (approximate)

DPI (dot/in)    dpcm (dot/cm)    Pitch (μm)
72              28               353
96              38               265
150             59               169
203             80               125
300             118              85
2,540           1,000            10
4,000           1,575            6
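
The conversions in the table follow from 1 inch = 2.54 cm = 25,400 μm. A minimal sketch that reproduces the rounded values above (function names are illustrative):

```python
# DPI to dots per centimetre, and to dot pitch in micrometres.
def dpi_to_dpcm(dpi: float) -> float:
    return dpi / 2.54

def dpi_to_pitch_um(dpi: float) -> float:
    return 25_400 / dpi

for dpi in (72, 96, 150, 203, 300, 2540, 4000):
    print(f"{dpi:>5} DPI = {dpi_to_dpcm(dpi):7.0f} dpcm, pitch {dpi_to_pitch_um(dpi):6.0f} um")
```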


References

  1. ^ The acronym appears in sources as either "DPI" or lowercase "dpi". See: "Print Resolution Understanding 4-bit depth – Xerox" Archived 2017-11-12 at the Wayback Machine (PDF). Xerox.com. September 2012.
  2. ^ CSS3 Media Queries Recommendation
  3. ^ "OKI's Technology Guide to Inkjet Printing". www.askoki.co.uk. Archived from the original on 2009-08-15.
  4. ^ Hitchcock, Greg (2005-10-08). "Where does 96 DPI come from in Windows?". Microsoft Developer Network Blog. Microsoft. Retrieved 2009-11-07.
  5. ^ a b Hitchcock, Greg (2005-09-08). "Where does 96 DPI come from in Windows?". blogs.msdn.com. Retrieved 2010-05-09.
  6. ^ Connare, Vincent (1998-04-06). "Microsoft Typography – Making TrueType bitmap fonts". Microsoft. Retrieved 2009-11-07.
  7. ^ fbcontrb (2005-11-08). "Where does 96 DPI come from in Windows?". Blogs.msdn.com. Retrieved 2018-04-03.
  8. ^ "Why text appears different when drawn with GDIPlus versus GDI". Support.microsoft.com. 2018-02-04. Retrieved 2018-04-03.
  9. ^ "Win32 SetProcessDPIAware Function". 22 February 2024.
  10. ^ "Windows Vista DPI scaling: my Vista is bigger than your Vista". December 11, 2006.
  11. ^ a b c Christoph Nahr (19 May 2011). "High DPI Settings in Windows". Kynosarges.org. Retrieved 2018-04-03.
  12. ^ "Media Queries".
  13. ^ "Class ResolutionSyntax". Sun Microsystems. Retrieved 2007-10-12.