DVI & VGA – What is the difference?
If you have ever looked at all the cables that plug into your computer, you have probably noticed the one that connects the monitor to the computer itself. On older machines this cable is most likely a VGA cable; these days it is more commonly a DVI cable.
The question is: what is the difference between them?
VGA stands for Video Graphics Array and has been the standard cable for connecting monitors to PCs since the early 1990s. The standard was first introduced by IBM on its PS/2 line of computers in 1987. VGA cables carry RGBHV (Red, Green, Blue, Horizontal and Vertical Sync) analog video signals. Sending analog signals to CRT monitors was fine, as they drew the picture as the signals arrived by sweeping an electron beam across the back of the phosphor-coated glass screen.
These days most monitors are LCD flat screens. On a liquid crystal display each pixel is an individual entity that can be addressed individually: each pixel must be told whether it should be on or off and what colour it should display. Analog signals do not carry the right information to address individual pixels on a digital monitor.
The signal starts out digital inside the computer's video card, is converted to analog before being sent across the VGA cable, and is then converted back to digital once it reaches the LCD monitor. This works, but it is not ideal: neither conversion is ever 100% accurate, so the round trip results in reduced picture quality.
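The round-trip loss described above can be illustrated with a small simulation. This is a minimal sketch, not a model of real VGA electronics: the 0.7 V signal level is the nominal VGA video swing, but the noise figure and the simple DAC/ADC rounding are illustrative assumptions.

```python
import random

VMAX = 0.7  # nominal VGA analog video level in volts (assumption for illustration)

def dac(value):
    """Video card side: map an 8-bit pixel value to an analog voltage."""
    return value / 255 * VMAX

def adc(voltage):
    """Monitor side: quantize the received voltage back to an 8-bit value."""
    return max(0, min(255, round(voltage / VMAX * 255)))

def transmit(pixels, noise_mv, seed=42):
    """Send pixel values digital -> analog -> digital, adding cable noise."""
    rng = random.Random(seed)
    return [adc(dac(p) + rng.gauss(0, noise_mv / 1000)) for p in pixels]

pixels = list(range(256))                 # one of every 8-bit brightness level
clean = transmit(pixels, noise_mv=0.0)    # perfect cable: values survive intact
noisy = transmit(pixels, noise_mv=2.0)    # a couple of millivolts of noise
errors = sum(1 for a, b in zip(pixels, noisy) if a != b)
print(f"errors with noise: {errors} of {len(pixels)}")
```

With one 8-bit step spanning only about 2.7 mV of the 0.7 V swing, even a small amount of noise picked up on the cable flips many pixel values, which is exactly the degradation an all-digital link avoids.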
Meet DVI (Digital Visual Interface), the standard designed by the Digital Display Working Group to supersede the aging VGA standard. As the name suggests, this type of cable carries digital signals, which coupled with a digital monitor is a match made in heaven. The DVI standard was introduced in 1999 and is still in use to this day. DVI is also mostly compatible with HDMI, except that it does not carry audio signals. Because the cable carries pure digital signals, no conversion is needed at either the computer or the monitor end, and the computer's video card can address individual pixels on the LCD monitor. As a result you get a much sharper picture on your screen. If required, a DVI cable is also capable of transmitting analog signals.
There is much more to VGA and DVI than covered in this article, but I have covered the primary differences and the reasons why DVI is the preferred method of displaying images on digital monitors. If you wish to learn more about either standard, take a look on Wikipedia.
So if you have ever wondered which cable you should be using on your computer then hopefully this answers your question.
Thanks for reading.