If you have an old (as in pre-digital) television, the answer is that you probably can’t, and if you can, you probably shouldn’t. The results will be … well, I’ll just say less than ideal. Older TVs just weren’t made for the kind of display that our computers expect.
However, if your TV is relatively new — almost any “flat” TV will do — and your computer is also relatively current, you’ll probably be able to do exactly what you have in mind: use the same TV you watch shows on as a computer display.
If your computer has an HDMI output — as many do these days, especially laptops — and your TV has an HDMI input — once again, as many do — then you’re good to go. Get an HDMI cable to connect them, make sure the correct input is selected on the TV, and select the correct display output on the computer. It should just work.
HDMI is preferable for a variety of reasons.
- The display device (the TV, in our example) can inform the computer of the maximal, or optimal, resolution to use.
- HDMI includes both video and audio, so your TV’s speakers can be used if you like.
- It’s a single cable.
If either your computer or your television doesn’t support HDMI, you’ll need to look at alternatives. These include, in decreasing order of popularity and/or video quality:
- DisplayPort – a digital video and data connection that arguably exceeds HDMI in overall capability (it can carry data other than audio and video), but isn’t nearly as ubiquitous. HDMI/DisplayPort converters do exist if one side of your intended connection supports HDMI.
- DVI – (Digital Visual Interface) is a video-only interface.
- VGA – (Video Graphics Array) is an old analog interface that you might recall from older computers and computer monitors.
- S-Video – (Separate Video) is an analog interface that carries brightness and color as separate signals, giving better quality than basic composite video. It was common for some time.
- Component video – an analog interface that uses a separate connection for each component of the video signal (typically one brightness signal and two color-difference signals).
Conversions between the various alternative connections are often possible as well. While DisplayPort to HDMI might be a simple cable, other conversions may require an actual device of some sort to perform the signal conversion. You’ll generally get better results if you can avoid this type of conversion.
Resolution is the number of pixels or dots displayed on a computer screen. It’s measured as a count of the number of pixels across (horizontal) by the number down (vertical).
Computers and computer screens can often be set to a wide variety of different resolutions, depending on the graphics hardware used. Televisions, on the other hand, have a fairly fixed set of resolutions:
- SD: Standard Definition, 640 × 480.
- 720HD: High Definition, 1280 × 720.
- 1080HD: High Definition, 1920 × 1080.
- 4K: Ultra High Definition, 3840 × 2160.
Older, analog television was roughly equivalent to a digital resolution of 440 × 486 (about 440 pixels across by 486 visible scan lines).
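To get a feel for what these numbers mean, here’s a short Python sketch (the resolution table is taken from the list above) that computes the total pixel count and aspect ratio of each standard:

```python
from math import gcd

# Common digital TV resolutions (width x height), as listed above.
resolutions = {
    "SD":     (640, 480),
    "720HD":  (1280, 720),
    "1080HD": (1920, 1080),
    "4K":     (3840, 2160),
}

for name, (w, h) in resolutions.items():
    d = gcd(w, h)  # reduce width:height to the simplest aspect ratio
    print(f"{name}: {w} x {h} = {w * h:,} pixels, aspect ratio {w // d}:{h // d}")
```

Running it shows why the jump from SD to HD and 4K is so dramatic: SD is about 307,000 pixels, while 4K is over eight million, and everything from 720HD up shares the same 16:9 shape.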
When outputting to a digital television, you’ll want to set your computer’s output resolution to one of those — preferably the highest-quality resolution supported.
A note about “overscan”
One of the artifacts of analog broadcast television is the concept of “overscan”. Analog signals actually included information that was technically off the edges of the television screen. Most TV video was encoded with this knowledge, so as not to try to display something off the edge of the screen. Most analog TVs had both vertical and horizontal adjustments to control how much of the image was actually included in the visible area.
Then came digital.
On computers, it’s very simple: an image, be it video or still, is characterized by its resolution, as discussed above. A 1080p high-definition video, for example, is exactly 1920 pixels wide and 1080 pixels high. Computer monitors display a fixed number of pixels as well; the display I’m using right now has exactly 3840 by 1600 pixels.
Digital televisions, however, occasionally carry the concept of overscan forward. The result is that, depending on many factors, you might only be seeing 1900 × 1060 pixels of your 1920 × 1080 high-definition video: 10 pixels might be “lost” off of each edge. When watching a television show this might not matter, but when using that TV as a computer display, it could mean missing a portion of items, such as your taskbar, off the edge of the screen.
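The arithmetic behind that example is easy to sketch in Python. The 10-pixel crop is just the illustrative figure from the paragraph above, not a fixed property of any particular TV:

```python
# Rough illustration of overscan cropping. The crop value is an
# assumption matching the example in the text, not a standard amount.
full_w, full_h = 1920, 1080
crop = 10  # pixels hidden off each edge by overscan

visible_w = full_w - 2 * crop
visible_h = full_h - 2 * crop
lost = full_w * full_h - visible_w * visible_h

print(f"Visible area: {visible_w} x {visible_h}")
print(f"Pixels hidden by overscan: {lost:,}")
```

Even this modest crop hides nearly 60,000 pixels — and since they come off the edges, that’s exactly where things like the taskbar live.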
If this happens, there are really only two alternatives:
- If your TV has a width and height adjustment, see if you can get it to display the full image of whatever it is you’re looking at.
- Adjust Windows to display at a smaller resolution. This doesn’t always work, and is highly dependent on the video card used in your computer.
The good news is that most current televisions assume everything is digital and display every pixel without reverting to overscan.
Selecting the display
Since Windows 7, you can press Windows Key + P for “Presentation Mode”, which brings up a menu of options for what gets displayed where. Options include:
- PC screen only: Your second, “external” display is not used.
- Duplicate: Your primary and secondary displays show the same thing. This is referred to as mirroring. When this is selected, the resolution of one or the other will be adjusted so that both display the same number of pixels.
- Extend: Your primary and secondary displays are independent, and together form a single contiguous virtual desktop. This way, you can display different items on each screen, such as your desktop on one and a video on the other.
- Second screen only: Your primary display is turned off and only the secondary one is used.
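If you’d rather script this than press Windows Key + P, Windows (since Windows 7) also ships a small utility, DisplaySwitch.exe, whose command-line switches map to the same four options:

```shell
rem DisplaySwitch.exe ships with Windows 7 and later; each switch below
rem selects one of the Win+P presentation modes (run only the one you want).

DisplaySwitch.exe /internal   & rem PC screen only
DisplaySwitch.exe /clone      & rem Duplicate (mirror)
DisplaySwitch.exe /extend     & rem Extend
DisplaySwitch.exe /external   & rem Second screen only
```

This can be handy in a shortcut or batch file — for example, one double-click to switch the TV on as a second screen before movie night.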
Once you have your TV successfully connected as a second screen on your computer, this setting gives you all the flexibility you need to control what you see where.