Video Cables Explained: Difference Between VGA, DVI, and HDMI Ports
Aug 07, 2017


VGA

The Video Graphics Array (VGA) is one of the oldest connection standards still found on large swaths of computing equipment. Developed by IBM and introduced in 1987, it was widely used for video cards, TV sets, computer monitors, and laptops.


VGA can support resolutions up to 640 x 480 in 16 colors, although you can increase the colors to 256 by lowering the resolution to 320 x 200. The 320 x 200 mode is known as Mode 13h, and it was the mode used by most computer games in the late 1980s and early 1990s, while 640 x 480 in 16 colors is the fallback mode you see when booting a computer into Safe Mode.
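A quick bit of arithmetic (illustrative, not from the original article) shows why Mode 13h was so popular with game developers: at one byte per pixel, an entire frame fits in a single 64 KB memory segment, so the whole screen could be addressed without segment switching.

```python
# Back-of-the-envelope framebuffer sizes for the two classic VGA modes.
def framebuffer_bytes(width, height, bits_per_pixel):
    """Raw framebuffer size in bytes for a given video mode."""
    return width * height * bits_per_pixel // 8

# Standard VGA: 640 x 480 with 16 colors (4 bits per pixel).
vga_16 = framebuffer_bytes(640, 480, 4)    # 153,600 bytes
# Mode 13h: 320 x 200 with 256 colors (one byte per pixel).
mode_13h = framebuffer_bytes(320, 200, 8)  # 64,000 bytes

# 64,000 bytes is just under the 65,536-byte limit of a 64 KB segment.
print(vga_16, mode_13h)
```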


The VGA cable can carry RGBHV video signals: Red, Green, Blue, Horizontal Sync, and Vertical Sync. The VGA socket is made up of 15 pins in three rows of five pins, and typically colored blue. The cable socket is securely attached to the device using two screws, one on each side of the socket.

It is rarely used today and is mostly found on older hardware, having been largely replaced by the digital DVI and HDMI connections, which we explore later in this article.


RCA

The famous red, white, and yellow setup of the RCA connector was once the most popular and widespread connection type for audio/visual devices.


While they are often referred to as RCA cables, RCA actually refers to the metal connectors at the end of the cables, named after the Radio Corporation of America which popularized the connection type.


The red and white cables carry the two channels of stereo audio, while the yellow cable carries single-channel composite video. Together, the three cables can transmit stereo audio along with video at up to 480i or 576i resolution, where the i stands for interlaced video.

Just as with VGA, the once-popular RCA connector has been superseded by the digital DVI and HDMI connections.


DVI

More than a decade after IBM introduced VGA, the Digital Display Working Group launched its digital successor, DVI, in 1999. DVI, which stands for Digital Visual Interface, can transmit uncompressed digital video in one of three different modes:

  • DVI-I (Integrated) combines digital and analog in the same connector.

  • DVI-D (Digital) supports digital signals only.

  • DVI-A (Analog) supports analog only.


DVI-I and DVI-D can come in single- or dual-link varieties. Single-link can support 1920 x 1200 at 60 Hz, while adding a second digital transmitter for dual-link means the resolution can be increased to 2560 x 1600 at 60 Hz.
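The single- versus dual-link limits come down to pixel-clock arithmetic: single-link DVI tops out at a 165 MHz pixel clock, and dual-link adds a second set of data pairs to double that ceiling. A rough sketch (for illustration; it ignores blanking intervals, so real pixel clocks run somewhat higher than these figures):

```python
# Rough pixel-rate arithmetic for DVI single- vs dual-link limits.
# Counts only active pixels; blanking intervals are ignored.
def pixel_rate_mhz(width, height, refresh_hz):
    """Approximate pixel rate in MHz for a given resolution and refresh."""
    return width * height * refresh_hz / 1e6

single_link = pixel_rate_mhz(1920, 1200, 60)  # ~138 MHz: fits under 165 MHz
dual_link = pixel_rate_mhz(2560, 1600, 60)    # ~246 MHz: needs the second link

print(single_link, dual_link)
```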

In order to prevent forced obsolescence of VGA devices, DVI was developed to support analog connections using the DVI-A mode. This meant that DVI connections and devices could be backwards-compatible with VGA connections.


Mini-DVI

Apple is famous for always trying to make its devices thinner and lighter. Most people think of this as a relatively recent development, but it has been going on since at least 2006, when Apple moved on from the mini-VGA port and developed the mini-DVI connector in its place.


It was included in the PowerBook G4, Intel-based iMacs and MacBooks, and the Xserve. Mini-DVI does not support dual-link connections and so was always limited to resolutions of 1920 x 1200 or less. This limitation, along with advancing technology, led Apple to discontinue mini-DVI in 2008 and replace it with the Mini DisplayPort connector.


HDMI

HDMI, or High-Definition Multimedia Interface, is a proprietary but wildly successful digital audio and video interface. HDMI was created by a group of electronics manufacturers, including Sony, Sanyo, and Toshiba, to transfer video (uncompressed) and audio (either uncompressed or compressed eight-channel) to computer monitors, digital TVs, and DVD or Blu-ray players.


As of HDMI 1.4, it can support 24-bit uncompressed audio at 192 kHz and video resolutions up to 4096 x 2160, commonly referred to as 4K. Because HDMI uses the same video format standards as DVI, the two are compatible through a simple adapter, and since no signal conversion is necessary, there is no loss of quality either (although unlike HDMI, DVI does not carry audio).
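Those figures can be sanity-checked with some rough data-rate arithmetic (a sketch for illustration; note that HDMI 1.4 carries 4096 x 2160 only at up to 24 Hz, and the video figure below counts raw pixel data before link encoding):

```python
# Rough data-rate arithmetic for the HDMI 1.4 audio and video figures.
def audio_mbps(bit_depth, sample_rate_hz, channels):
    """Uncompressed PCM audio bitrate in Mbps."""
    return bit_depth * sample_rate_hz * channels / 1e6

def video_gbps(width, height, bits_per_pixel, refresh_hz):
    """Raw (pre-encoding) video data rate in Gbps."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

# 24-bit / 192 kHz stereo audio is a tiny fraction of the link:
print(audio_mbps(24, 192_000, 2))      # ~9.2 Mbps
# 4096 x 2160 at 24-bit color and 24 Hz:
print(video_gbps(4096, 2160, 24, 24))  # ~5.1 Gbps of raw pixel data
```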


There are three commonly used HDMI connectors. Type A is the full-sized HDMI connection for use on TVs and home theater equipment. Mini-HDMI (Type C) is commonly used on laptops and tablets, while Micro-HDMI (Type D) is mostly used on mobile devices.


DisplayPort

DisplayPort is a digital display interface developed by the Video Electronics Standards Association (VESA). DisplayPort can carry digital video and audio, making it functionally similar to HDMI. As of DisplayPort 1.4, there is support for resolutions up to 8K (7680 x 4320) at 60 Hz, which surpasses even HDMI 1.4.


However, HDMI and DisplayPort were designed for different markets. While HDMI was made primarily for home entertainment, DisplayPort was designed for connecting computing devices to monitors. Due to their similar functionality, it is possible to connect DisplayPort and HDMI devices together using a Dual-Mode DisplayPort adapter.

DisplayPort operates using packetized data transmission, the same approach used in Ethernet and USB connections, which is part of what makes it better suited to computing than to home entertainment.

Mini DisplayPort


Apple replaced the mini-DVI connection on its MacBooks and iMacs with its own take on DisplayPort, known as Mini DisplayPort (MDP). Although Apple created MDP, it was incorporated into VESA's DisplayPort 1.2 specification in 2009.


Thunderbolt

Thunderbolt is an interface used to connect peripheral devices to a computer. It was originally developed by Intel under the name Light Peak and used optical cables for data transfer (hence the name). After extensive testing, Intel found it could be made cheaper with copper cables without impacting performance.

Apple was the first manufacturer to bring Thunderbolt to the market in their MacBook Pro models in 2011. When Thunderbolt debuted, it was able to use the MDP connector that Apple had created, meaning the new connection would be backwards-compatible.


One of the biggest advantages of Thunderbolt over DisplayPort is its high data transfer speed. Thunderbolt 1 and 2 could transfer at 10 Gbps and 20 Gbps respectively using the MDP connector, and Thunderbolt 3 doubles this to 40 Gbps while adopting the USB Type-C connector in place of MDP.
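To put those line rates in perspective, here is a rough sketch (illustrative only) of how long moving a 50 GB file would take at each generation's nominal rate of 10, 20, and 40 Gbps, ignoring protocol overhead:

```python
# Idealized transfer time at each Thunderbolt generation's nominal line rate.
def transfer_seconds(size_gb, rate_gbps):
    """Time in seconds to move size_gb gigabytes at rate_gbps gigabits/sec."""
    return size_gb * 8 / rate_gbps  # gigabytes -> gigabits, then divide

for name, rate in [("Thunderbolt 1", 10), ("Thunderbolt 2", 20), ("Thunderbolt 3", 40)]:
    print(f"{name}: {transfer_seconds(50, rate):.0f} s")
```

Real-world throughput is lower once encoding and protocol overhead are accounted for, but the generational doubling is the point.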

Using one Thunderbolt port, it is possible to connect up to six peripherals through a technique called daisy-chaining, which reduces the number of ports needed on a device.

Thunderbolt is able to carry audio either using the DisplayPort protocol or by using USB audio cards. As of Thunderbolt 2, there was support for DisplayPort 1.2 specifications, which allowed for 4K streaming to a 4K monitor.

The Simplification Factor

If you’re still confused, the good news is that the world is moving towards a unified standard. HDMI Licensing recently announced that in order to connect to an HDMI device, all you will need is a compatible USB Type-C to HDMI cable which can run in “Alt Mode”.

This is one part of the industry’s larger push to simplify cabling. Standardization is possible — remember all of the proprietary mobile device chargers that existed before the micro-USB standard? It’s like that, except this time the industry is pushing USB Type-C as the next all-in-one standard for audio, video, power, and all other connections.

Apple is so certain of this future that at its Fall 2016 event it launched a MacBook Pro that uses USB Type-C ports exclusively for all connections and power. While it may take a while for USB Type-C to become the standard connection, a future is taking shape in which you won't need a hundred different cables just to watch video content.
