In 1998, High Definition resolution arrived, along with sleek, streamlined monitors that made the boulder-like CRT models of yore resemble relics from the Stone Age. One such newcomer was the plasma screen TV, which remains popular for its quality relative to its price. But the question remains: more than a decade later, can plasma TVs keep up with modern resolution demands, such as playing 4K content?

Consumer plasma TVs cannot play 4K video. Plasma TVs are generally considered obsolete, as the market's interest turned to LCD and OLED largely before the demand for Ultra High Definition TVs ever surfaced. The highest resolution format available on plasma TVs is Full HD, or 1080p.

In this article, we will define 4K and go into detail about plasma TVs, review the benefits of plasma monitors, and discuss whether or not 4K-compatibility is a current necessity.

What Is 4K?

4K is the highest resolution video format currently available, though 8K has been shown off at consumer showcases, and 12K is rumored to be in the works. Prior to 4K, the highest screen resolution available was Full HD, or 1080p. Resolution formats have historically been named for their vertical resolution, assuming a widescreen aspect ratio of 16:9. (Full HD, or 1080p, for example, is 1920 x 1080 pixels.)

This convention is reversed in 4K, which is named instead for its horizontal resolution: 3840 x 2160 pixels (the width is rounded up to 4,000, hence "4K"). 4K is also referred to as Ultra HD and contains roughly 8.3 million pixels, four times as many as a Full HD display.
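The arithmetic behind these figures is easy to verify. A quick sketch in Python (the dimensions are the standard ones quoted above; nothing here is library-specific):

```python
# Standard display formats and their pixel dimensions (width x height)
formats = {
    "Full HD (1080p)": (1920, 1080),
    "4K Ultra HD": (3840, 2160),
}

for name, (w, h) in formats.items():
    print(f"{name}: {w * h:,} pixels")

full_hd = 1920 * 1080   # 2,073,600 pixels
ultra_hd = 3840 * 2160  # 8,294,400 pixels (the "roughly 8.3 million")

# Doubling both dimensions quadruples the pixel count
print(ultra_hd / full_hd)  # 4.0
```

Because 4K doubles both the width and the height of 1080p, the total pixel count goes up by a factor of exactly four.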

The result is a clearer picture with a better color resolution, richer contrasts, and more realistic-looking image quality. With 4K, larger TVs or monitors provide a fuller viewing experience without sacrificing resolution or detail.

The Age of Ultra HD

Though 4K’s history dates as far back as the 1980s, this resolution format became popular in 2010 when many manufacturers began developing video cameras and smartphones capable of recording 4K video. Although technological advances have made the Ultra HD format more accessible to the public in terms of production on a personal level, most of the content available out there—whether movies, TV shows, or early 2000s internet clips—was not shot in 4K.

4K is still a relatively new technology, and 1080p remains the industry standard—for now, anyway. Although 4K video is still something of a luxury, the amount of content engineered in ultra HD resolution is growing by the day thanks to streaming platforms like Netflix and YouTube.

Optimal Viewing Experience for 4K Content

For 4K TV sets to perform at their best, content should ideally be native 4K and streamed over a high-speed internet connection. That said, 4K content will still look exceptionally sharp when downscaled on screens that were not made for Ultra HD. Note, however, that many 4K TVs released before 2015 are not compatible with 4K streaming, as they cannot decode high-efficiency video coding (HEVC).

What Are Plasma Screens?

Plasma TVs were among the first flat-panel TVs released to the public in the early 2000s, offering a more ergonomic and functional design than their bulky CRT predecessors.

The technology takes its name from a layer of plasma sandwiched between two glass sheets, composed of over a million tiny cells filled with neon and xenon gases. When this gaseous layer is stimulated by electricity, the excited atoms release energy as ultraviolet light, which causes phosphors in each cell to glow red, green, or blue, forming the images on the TV screen.

The first rudimentary plasma screen was invented at the University of Illinois in 1964, though consumer models did not reach the commercial market until decades later. Plasma TVs enjoyed a heyday from the late 90s into the mid-2000s, but as LCD models became slimmer and more power-efficient, plasma screens' popularity declined. Higher-quality OLEDs then entered the market and, though not as budget-friendly, accelerated plasma's demise further.

Though later plasma models continued improving, the damage was done, with the production of plasma screen models ceasing in 2014.

How Do Plasma TVs Stack Up vs. Newer Models?

Fans of plasma TVs and monitors remain loyal even six years after their commercial extinction. Plasma screens are touted as having better picture quality than LEDs at a similarly appealing price point; fans praise their rich contrast, color, lack of motion blur, and wide viewing angles, especially in contrast with most LCDs, where picture quality is known to suffer when viewed from any angle other than head-on.

Though modern TV and monitor options offer higher resolution, some assert that even the most up-to-date models do not exceed plasmas’ value as much as one would expect.

Many assert that plasmas are a more worthwhile investment than current LEDs, though OLEDs still reign supreme in terms of what is available on the market. While OLED screen quality surpasses that of retired plasmas, the competition remains close, especially considering the hefty price tag that comes attached to OLEDs.

One Reddit user vented his dismay at the current predicament as far as quality screen options that are actively available to consumers: “What we’re left with is mediocre offerings at the low to mid-range (LCDs), and great offerings at only the very high end.”

How Much Does Resolution Matter?

Is the difference between 1080p and 4K video extremely noticeable? Resolution notwithstanding, the appearance of the screen’s image depends on three different factors: the size of the monitor/screen, the viewer’s distance in relation to it, and the quality of the viewer’s vision.

TV Size

With a larger screen, you can position yourself further away from the TV and still expect impressive image quality. If the screen is big enough, the difference in quality will be more readily apparent.

When positioned closer to the screen, differences in quality will also be more noticeable. Positioning and relative size naturally depend on the screen's intended use; people tend to sit farther away from television sets while getting much more 'up close and personal' with their computers.

Distance From the TV

“From a distance, it is virtually impossible for someone to tell the difference in quality between a 1080p and 4K screen,” ExperienceUHD reports. An article from Forbes arrived at the same conclusion, stating that the overall effect of 4K is lost if the viewer is sitting far enough away from the screen. It seems that the optimal distance for appreciating the finer details of Ultra HD screens requires sitting closer to the screen than the average viewer prefers.
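This distance effect can be estimated with simple trigonometry. A common rule of thumb assumes an eye with 20/20 vision resolves detail down to about one arcminute (1/60 of a degree); beyond the distance where a single pixel subtends less than that angle, extra resolution is invisible. The sketch below is a rough back-of-the-envelope calculation, not a vision-science result, and the one-arcminute figure and 55-inch example screen are assumptions for illustration:

```python
import math

def max_useful_distance(diagonal_in, horizontal_px, aspect=16 / 9):
    """Distance (in inches) beyond which individual pixels can no
    longer be resolved by an eye with ~1 arcminute of acuity."""
    # Derive the screen width from its diagonal and aspect ratio
    width_in = diagonal_in * aspect / math.sqrt(aspect**2 + 1)
    pixel_in = width_in / horizontal_px  # width of a single pixel
    one_arcmin = math.radians(1 / 60)    # acuity limit in radians
    # Small-angle approximation: pixel_in / distance ~= one_arcmin
    return pixel_in / one_arcmin

# For a hypothetical 55-inch screen:
for px, label in [(1920, "1080p"), (3840, "4K")]:
    d = max_useful_distance(55, px)
    print(f"{label}: pixels resolvable within ~{d / 12:.1f} ft")
```

For a 55-inch screen this works out to roughly 7 feet for 1080p and about half that for 4K, which is consistent with the point above: at typical living-room distances, 1080p pixels are already below the eye's acuity limit, so the extra detail of 4K goes unseen.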

Is a 4K-Enabled TV Really a Necessity?

Ultra HD streaming is the way of the future but is still a work in progress. The human eye can only process so many pixels at a time, and the difference between 4K and 1080p can be negligible in the context of a living room or basement TV set, where viewing is generally more spread out. 

Taking all of this into account, along with the comparative paucity of 4K content and the faster internet speeds Ultra HD streaming requires, one may conclude that it is not entirely necessary to upgrade to 4K-enabled products, depending on your setup and connectivity.

Unfortunately, plasma TVs will never be able to stream 4K content in all its glory—but depending on what the use and placement of the monitor will be, it may not matter all that much. Considering the lack of 4K content currently available and the agreeable pricing and quality of plasmas, it is not a terrible idea to delay the Ultra HD upgrade until a later date.