Electronics360

Internet Enabled Consumer Devices

8K TVs: An Evolution of Resolution

30 January 2018

Sony and Panasonic have partnered with broadcaster NHK to develop technology capable of handling ultra-high definition (Ultra HD) 8K video. Full-scale 8K broadcasting in Japan is expected by the end of this year and Sony is aiming to roll out 8K compatible TVs by 2020. Panasonic is also expected to have an 8K compatible TV in time for the 2020 Tokyo Olympics, which would offer the perfect showcase for the new technology. This aggressive all-Japanese alliance is meant to help the country reclaim market share lost over the past decade to China and South Korea.

Samsung curved TVs. Source: Karlis Dambrans / CC BY 2.0

Ultra HD 8K TVs are called 8K because they have almost 8,000 horizontal pixels; plus, the name is catchy. The 7,680 horizontal pixels and 4,320 vertical pixels (7,680 x 4,320) give a TV roughly 33 million pixels. That’s about 16 times the pixels found in a 1,920 x 1,080 high definition TV (1080p HDTV). Current Ultra HD TVs have 3,840 x 2,160 pixels and are marketed as Ultra HD 4K TVs. These have roughly 8 million pixels, only one-fourth the pixel count of an 8K TV. Around 20 years ago most TVs were 720 x 480, which is roughly 350,000 pixels, nearly 100 times fewer than an 8K TV.
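The resolution comparisons above are straightforward arithmetic; a short script (my own illustration, using the resolution names from the article) reproduces them:

```python
# Pixel-count arithmetic for the TV resolutions discussed in the article.
resolutions = {
    "SD (720 x 480)": (720, 480),
    "HDTV 1080p (1,920 x 1,080)": (1920, 1080),
    "Ultra HD 4K (3,840 x 2,160)": (3840, 2160),
    "Ultra HD 8K (7,680 x 4,320)": (7680, 4320),
}

pixels_8k = 7680 * 4320  # roughly 33 million pixels

for name, (w, h) in resolutions.items():
    total = w * h
    if total == pixels_8k:
        print(f"{name}: {total:,} pixels")
    else:
        # How many times fewer pixels than an 8K panel
        print(f"{name}: {total:,} pixels ({pixels_8k / total:.0f}x fewer than 8K)")
```

Running it confirms the article's figures: 1080p has one-sixteenth the pixels of 8K, 4K has one-fourth, and a 720 x 480 set has nearly one-hundredth.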

The speed at which TV resolution has evolved over the past 20 years is really a story of technology and marketing. The changeover from analog to digital television technology was a slow process, and it wasn’t until the late 1990s that HDTV broadcasting began in the United States. At that time, most HDTVs were plasma TVs, which were expensive, so HDTV adoption was slow. Liquid crystal display (LCD) TVs existed, but because of material costs they were hard to scale up to traditional TV sizes of 40-plus inches diagonal. By the mid-2000s, however, affordable 40-inch LCDs started to be sold. By the end of the decade LCD HDTVs were outselling traditional cathode ray tube (CRT) TVs as well as plasma TVs.

The conversion from standard CRTs to HDTV LCDs was a huge economic boon for everyone involved. TV manufacturers had products that were basically selling themselves. CRTs came to be viewed as antiquated technology, and households rushed to replace perfectly good CRTs with newer LCD HDTVs. DVD players, which were designed for standard-definition video, had to be replaced by Blu-ray players, which output HD-quality video. Cable television companies used the conversion to HDTV signals as an opportunity to increase equipment rental costs and subscription prices.

The rapid increase in resolution over the past decade.

Once the HDTV conversion was complete earlier this decade, TV companies started looking for the next technology shift that would make current TVs obsolete and generate the sales frenzy of the late 2000s. The success of the movie "Avatar" in 2009 prompted a shift towards 3D technology. 3D TVs were produced and heavily marketed, but adoption was slow. After a few years 3D technology was mostly abandoned and LED backlighting was promoted. Although it mustered some new sales, LED backlighting was not the paradigm shift needed to induce wholesale household conversion.

The last few years have seen further attempts, including curved screens, high dynamic range (HDR), higher refresh rates and smart TVs. Although curved screens didn’t catch on, the rest have been relatively successful incremental improvements. At the same time, there has been a steady shift towards 4K Ultra HDTVs, but adoption has lagged because the format offers little perceptible benefit. The increased resolution is not readily detectable to the eye unless it is a 55-inch or larger screen viewed from within five feet. It would therefore take an 80-inch TV for viewers further than five feet away to notice the difference between HDTV and 4K.
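These viewing-distance figures follow from the common rule of thumb that normal 20/20 vision resolves detail of about one arcminute. A rough sketch (my own illustration, not from the article, and only an approximation of human acuity) computes the farthest distance at which individual pixels can still be distinguished:

```python
import math

def max_resolvable_distance_ft(diagonal_in, horiz_pixels, aspect=(16, 9)):
    """Farthest viewing distance (feet) at which one pixel still subtends
    the ~1 arcminute limit of normal (20/20) visual acuity. Beyond this,
    extra resolution is not visible to the eye."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width from diagonal
    pixel_pitch_in = width_in / horiz_pixels         # physical size of one pixel
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch_in / math.tan(one_arcmin) / 12

for diag in (55, 80):
    for name, px in (("1080p", 1920), ("4K", 3840)):
        d = max_resolvable_distance_ft(diag, px)
        print(f'{diag}-inch {name}: pixels blend together beyond ~{d:.1f} ft')
```

Under this model, a 55-inch 1080p screen's pixels blend together beyond roughly seven feet, so 4K only pays off closer than that; on an 80-inch screen the 4K advantage extends out to roughly ten feet, broadly consistent with the distances cited above.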

With so small a difference in perceived quality outside those conditions, upgrading a perfectly good HDTV to 4K is hard to justify. TVs have grown in size and this is starting to drive demand, but the transition has been gradual. LCD TVs are difficult to build beyond 75 inches because they get very heavy. Organic light-emitting diode (OLED) TVs offer a lighter-weight alternative that may eventually lead to 100-plus-inch TVs. At that point, the quality of 4K would be easier to perceive and a large upgrade cycle would likely follow.

Will there be a similar technology that drives 8K TV adoption in the next decade? Without some sort of technology feature it will be hard to justify upgrading from 4K to 8K, even for wall-sized TVs. Interactive TVs that can replay and zoom sections of the screen might justify 8K, but there has not been any indication that technology is on its way yet. Still, Japanese manufacturers are pushing forward to get an edge on the competition. By 2020 the first of the 8K TVs will arrive, whether needed or not, in search of that next big wave of adoption.


Powered by CR4, the Engineering Community

Discussion – 2 comments

Re: 8K TVs: An Evolution of Resolution
2018-Jan-31 9:38 AM

I wanted to chime in on the visual perception difference between 1080p and 4K. The math that says at distance X the human eye cannot perceive the difference is totally correct, but it only refers to pixelation: no, the eye cannot pick out one single pixel next to another. But I have yet to see any mathematician, or biologist for that matter, chime in on whether the human eye can pick up on the difference in sharpness or, more importantly, the difference in color detail. Let me explain.

First, sharpness. Let's make it simple. Say we want to draw a circle one pixel wide. If we have 1,000 pixels to do it, it will probably look pretty good, depending on the size. But if we have twice as many pixels and draw a circle of the same size, the curve of that circle will be that much closer to an analog circle. The more pixels, the closer you can approximate a true analog circle, and the closer you come to an analog circle, the more natural it will look. Can the eye see that? I am not sure, as I believe what I call color detail factors in as well.

Now let's take those same 1,000 pixels and make a straight line. This straight line will be a color gradient from white to some other color, taking even steps in color tone from one end of the line to the other. With a 1,000-pixel line you have 1,000 even steps in color. Now make a line of 2,000 pixels of the same length and apply the same concept: now we have 2,000 even steps in color in the same space. Again, the more pixels you can cram in there, the closer you can approximate a true analog gradient, and the closer to analog you get, the more natural it will look.

I have a 1080p TV and a 4K TV of the same size. When I put a 4K image on them side by side (adjusting the TVs to be as close to equal as I can get them in color tone, brightness, etc.), the 4K image definitely looks more natural, closer to real if you will. The difference is totally noticeable.

Now sure, is it noticeable enough to run out and replace your TV? Probably not for most people. Most people I know can't even tell when a 4:3 image has been side-stretched to fill a 16:9 screen. But the difference is there, and I argue that my eyes can see it. I look forward to someone who can prove me right or wrong with math and biology, but until then my position stands.
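The commenter's gradient argument can be sketched numerically (my own illustration of the geometry of the claim, not part of the original comment):

```python
# Across the same physical width, a higher-resolution line fits more
# distinct tone steps, so each step is smaller and the ramp is closer
# to a continuous (analog) gradient.

def gradient_steps(pixels, start=0.0, end=1.0):
    """Evenly spaced tone values for a 1-pixel-tall line of given width."""
    return [start + (end - start) * i / (pixels - 1) for i in range(pixels)]

low = gradient_steps(1000)   # the commenter's 1,000-pixel line
high = gradient_steps(2000)  # same physical length at double the resolution

step_low = low[1] - low[0]
step_high = high[1] - high[0]
print(f"tone step at 1,000 px: {step_low:.6f}")
print(f"tone step at 2,000 px: {step_high:.6f}")  # roughly half as large
```

Note that on real displays the panel's bit depth (typically 8 or 10 bits per channel) caps the number of distinct tones regardless of pixel count, so this is a geometric sketch of the argument rather than a claim about actual hardware.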

Re: 8K TVs: An Evolution of Resolution
In reply to #1
2018-Feb-16 9:52 AM

You are right, but that does not mean it is important enough to make a difference to the average TV consumer. If millennials are willing to watch a movie on a 5.5" screen and listen with earbuds, the nuance you are describing is missed by most. Resolution changes, along with the inevitable bells and whistles added to TVs each year, are there to attract attention and separate one brand or model from another. This is not a science business, it's a marketing one, perhaps sadly, and has been since TV's early days. While a higher pixel count or sub-pixel arrangement means a great deal to those who develop or measure displays, if the marketing guys can't come up with a way to use it to sell more TVs, it goes back to the lab. If it's something they can give a catchy name to, regardless of the incremental improvement in quality, it's good for a year, or until the next trend comes along, whether it is a legitimate change that will proliferate across the industry or one that will piss consumers off when they realize the set they just bought with the latest and greatest technology will not be compatible with next year's iteration.

I can see the difference, but I still won't pay more for it because I know that if it is a legitimate technology change, competition in the TV space is so great that the premium on the change will drop quickly. I can wait, and most others wouldn't even notice the difference.

