Graphics card outputs

by HEXUS Staff on 8 October 2010, 00:00

Tags: AMD (NYSE:AMD), NVIDIA (NASDAQ:NVDA)

Quick Link: HEXUS.net/qa5gn


Overview

Modern desktop computers and notebooks comprise a CPU, motherboard, graphics, storage and, usually, an optical drive. Computers have a number of ports and sockets that enable the user to plug in various peripherals such as a printer, a USB mouse or, perhaps most importantly of all, an Internet connection.

Looking at it from the other side, desktop computers need to be attached to a monitor or television so that one can view the graphics output - the text on the display, for example - and to this end have numerous graphics-related outputs. Notebooks have built-in screens, so they do not need these additional outputs unless transmitting to a larger screen.

Desktop computers' graphics stem from two sources. Firstly, the machine may have basic graphics capabilities that are integrated into the motherboard. These are called integrated graphics and are found on low-cost computers. More-expensive PCs have what are known as discrete graphics, whereby a dedicated graphics card is placed into the system and used to run games at high-quality settings.

Both sets of graphics can be output to a display in various ways. These include connections known as VGA, DVI, HDMI and DisplayPort. This TekSpek looks at the common interfaces and explains how they work. A GeForce GTS 450 graphics card, released in September 2010, is shown with various outputs in the picture below, and we'll refer to it throughout.

VGA

The oldest type of video connector still used on modern PCs is Video Graphics Array (VGA). It is the blue-coloured port in the middle of the graphics card. First introduced in 1987 by IBM and also referred to as HD15 - representing the number of pins in the connector - the VGA video connection flourished for a good 15 years after its inception. Indeed, the very fact that it's still specified on new graphics cards is a testament to its longevity.

VGA is an analogue connection that is primarily used to connect the graphics card to older CRT monitors that process data in analogue form. This means the CRT receives electrical signals in a continuous form, rather than the pulsing found on newer digital displays. Because CRTs have been around for more than 30 years and still represent a reasonable proportion of displays used in businesses, graphics-card manufacturers have kept this legacy connector alive.

The VGA connector requires the graphics card to convert its native digital output into analogue form before sending it out over the VGA cable and on to the monitor. The quality of the digital-to-analogue converter (DAC) largely dictates the quality of the picture on the CRT. VGA has a maximum supported resolution of 2,048x1,536 pixels, though most monitors top out at 1,600x1,200.
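As a rough illustration of why the DAC matters, the pixel clock a display mode demands can be estimated from its resolution and refresh rate. The Python sketch below is a back-of-the-envelope calculation rather than a spec-accurate one: the ~1.3x blanking overhead and the 400MHz RAMDAC figure are assumptions typical of cards of this era.

    # Rough pixel-clock estimate for an analogue VGA link.
    def pixel_clock_mhz(h, v, refresh_hz, blanking_overhead=1.32):
        """Approximate pixel clock in MHz for a given mode.

        blanking_overhead covers the horizontal/vertical blanking a CRT
        needs; ~1.3x is typical of GTF-style timings (an assumption).
        """
        return h * v * refresh_hz * blanking_overhead / 1e6

    for mode in [(1600, 1200, 85), (2048, 1536, 75)]:
        print(mode, f"~{pixel_clock_mhz(*mode):.0f}MHz")

    # A 400MHz RAMDAC comfortably covers 1,600x1,200 at 85Hz (~215MHz)
    # and just reaches 2,048x1,536 at 75Hz (~311MHz) - hence the
    # practical ceiling mentioned above.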

DVI

The emergence of flat-panel displays over the last 10 years has led to the decline of VGA and the rise of Digital Visual Interface (DVI) as the de facto display standard. Flat-panel screens - think of LCD monitors and TVs here - use a digital format for the display. This means that the graphics card can send its digital output to the display without the need to convert it into an analogue signal first.

The white-coloured DVI connector, shown on the right of the picture, is driven by what is known as a Transition-Minimised Differential Signalling (TMDS) transmitter on the graphics card. The TMDS transmitter encodes and serialises the signal from the video card and passes it over the cable to the display at the appropriate resolution and refresh rate. You will find a DVI connection on practically all graphics cards and flat-panel monitors nowadays.

There are two forms of DVI connection for the consumer market. Most monitors have what is known as a single-link connection that uses a single TMDS transmitter to provide the output. The resolution of this link, for the most part, is limited to 1,920x1,200 pixels and is therefore fine for the majority of displays. High-resolution monitors - 30in models, in the main - have a 2,560x1,600px resolution and need two TMDS transmitters and a special cable for what is known as a dual-link connection.
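The limit falls out of the numbers: a single TMDS link is specified up to a 165MHz pixel clock. The sketch below checks which modes fit under that ceiling, assuming the reduced-blanking (~1.08x) overhead flat panels typically use; the overhead figure is illustrative.

    SINGLE_LINK_MHZ = 165  # single-link TMDS ceiling per the DVI spec

    def fits_single_link(h, v, refresh_hz, blanking=1.08):
        clock = h * v * refresh_hz * blanking / 1e6
        return clock, clock <= SINGLE_LINK_MHZ

    for mode in [(1920, 1200, 60), (2560, 1600, 60)]:
        clock, ok = fits_single_link(*mode)
        print(mode, f"~{clock:.0f}MHz",
              "single-link OK" if ok else "needs dual-link")

    # 1,920x1,200@60Hz (~149MHz) squeezes under the ceiling with reduced
    # blanking; 2,560x1,600@60Hz (~265MHz) does not, hence the second
    # TMDS transmitter on dual-link cards.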

DVI is further split into DVI-I and DVI-D, where the former also supports an analogue connection by having four additional pins on the right-hand side of the port - as you can see on the pictured graphics card. DVI can also carry the optional High-Bandwidth Digital Content Protection (HDCP) feature that inhibits the copying of digital data by encrypting it before it's sent over the cable to the monitor.

HDMI

Whereas DVI is aimed squarely at the PC market, High-Definition Multimedia Interface (HDMI) provides the same digital data transfer for consumer-electronics devices such as flat-panel TVs, modern consoles, DVD and Blu-ray players, home-cinema equipment and, increasingly, graphics cards (the left-hand port in the picture). Both DVI and HDMI transmit uncompressed digital data from source to display, but HDMI's one big advantage is that it can also send up to eight channels of compressed or uncompressed sound over the same cable.

HDMI is electrically very similar to DVI, insofar as a basic DVI-to-HDMI adapter can be used without a loss of quality. HDMI, however, treats HDCP encryption as mandatory rather than optional. The digital connection has been updated numerous times since its launch in 2002. The latest version is HDMI 1.4, which supports advanced features such as an ultra-high resolution of 4,096x2,160px, an Ethernet channel and 3D compatibility.
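Bandwidth explains why that 4K mode is limited to cinema-style frame rates: HDMI 1.3 and 1.4 cap the single-link TMDS pixel clock at 340MHz. The sketch below reuses the earlier clock estimate with an assumed ~1.1x blanking overhead; as before, the overhead is illustrative rather than spec-exact.

    HDMI_14_MHZ = 340  # TMDS pixel-clock ceiling from HDMI 1.3 onwards

    def tmds_clock_mhz(h, v, refresh_hz, blanking=1.1):
        return h * v * refresh_hz * blanking / 1e6

    for mode in [(1920, 1080, 60), (4096, 2160, 24), (4096, 2160, 60)]:
        clock = tmds_clock_mhz(*mode)
        print(mode, f"~{clock:.0f}MHz",
              "fits" if clock <= HDMI_14_MHZ else "too fast for HDMI 1.4")

    # 1080p60 (~137MHz) and 4K at 24Hz (~234MHz) fit; 4K at 60Hz
    # (~584MHz) would need a faster link than HDMI 1.4 provides.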

The wide proliferation of HDMI in consumer-electronics devices means that it has now overtaken DVI as the digital connection of choice. From a computing perspective, AMD and NVIDIA's newest graphics cards are able to take the audio from the computer and ferry it through the HDMI port, so an HDMI-equipped monitor or TV receives both the video and audio streams down one cable. HDMI implementation currently costs four cents (US) per device, plus a $10,000 annual fee for manufacturers shipping a high volume of HDMI-equipped parts.

DisplayPort

The newest mass-market digital interface is known as DisplayPort. Released in 2007, it is designed to replace both DVI and VGA as a single-cable connection to computer monitors. The 20-pin connection isn't directly compatible with DVI unless used in a dual-mode DisplayPort configuration, which itself is limited to single-link DVI transmissions. DisplayPort uses a system of what are termed differential data pairs within a main link to transmit data.

Up to four differential pairs are teamed up to provide the bandwidth, which can be used for both video and audio signalling. At its peak, a four-pair DisplayPort connection can transmit 17.28Gbit/s of data. It offers optional HDCP and advanced DisplayPort Content Protection, as well.
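That 17.28Gbit/s figure can be reconstructed from the lane arithmetic: DisplayPort 1.2 runs each of the four lanes at 5.4Gbit/s (the HBR2 rate), and 8b/10b line coding spends two of every ten bits on signalling. A minimal sketch:

    LANES = 4
    HBR2_GBIT_PER_LANE = 5.4   # raw symbol rate per lane (DisplayPort 1.2)
    CODING_EFFICIENCY = 0.8    # 8b/10b: 8 data bits per 10 bits on the wire

    effective = LANES * HBR2_GBIT_PER_LANE * CODING_EFFICIENCY
    print(f"{effective:.2f}Gbit/s")  # 17.28Gbit/s of usable bandwidth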

Chip-maker AMD has already taken DisplayPort on board with its Eyefinity multi-monitor feature, where up to six displays can be connected to a single graphics card for increased productivity and large-area gaming. Laptop-makers are also looking at DisplayPort to replace the many-wire LVDS system that currently connects the motherboard's display electronics to the screen itself. The standard is gaining momentum, too: Intel is providing DisplayPort as standard on its next-generation Sandy Bridge architecture.

Summary

There are four principal methods of connecting devices to displays. Analogue VGA is now outdated and is added to graphics cards merely for legacy reasons. DVI remains the digital output of choice for graphics cards; widespread adoption and a large flat-panel installed base mean that it will feature on graphics cards for many years to come.

HDMI is similar to DVI but also carries the audio component over a single cable, making it the de facto standard for consumer-electronics devices. The latest entrant is DisplayPort, mooted as the successor to DVI and VGA; it now features on a select number of AMD graphics cards and laptops, and is due to be included in Intel's next-generation CPU-and-GPU microarchitecture.

Sponsored by SCAN