

January 12, 2014

More 4k video contemplations

All of a sudden, everyone is talking about 4k video. Also known as Ultra-HD video, four times the resolution of the 1920 x 1080 pixel 1080p standard, 4k was everywhere at the Consumer Electronics Show in Las Vegas. Now, obviously, 4k video isn't the most important thing on rugged mobile computer manufacturers' minds, but 4k video is nonetheless a sign of changing times. And with some consumer smartphones already offering full 1080p resolution on their small screens, and consumer tablets going well beyond that, it's only a matter of time until rugged and industrial market customers, too, will demand much higher resolution in their professional computing gear. So keeping track of what's happening out there with ultra-high resolution makes sense.
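For the pixel counters, the arithmetic behind that "four times" claim is straightforward; here's a quick Python check (the language choice is mine, purely for illustration), using the 3840 x 2160 Ultra-HD consumer variant of 4k:

```python
# Ultra-HD "4k" doubles 1080p in each direction, quadrupling the pixel count
uhd_w, uhd_h = 3840, 2160      # Ultra-HD ("4k") resolution
hd_w, hd_h = 1920, 1080        # 1080p HD resolution

uhd_pixels = uhd_w * uhd_h     # 8,294,400 pixels
hd_pixels = hd_w * hd_h        # 2,073,600 pixels

print(uhd_pixels / hd_pixels)  # 4.0 -- four times the total pixels
```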

As I stated in an earlier essay on high resolution (Thoughts about display resolutions, December 2013), I recently purchased a 39-inch Seiki 4k flatscreen display that can be used as a monitor or as a TV. It was an impulse buy, and I justified the (remarkably reasonable) price by deciding the Seiki was a research investment that would help us here at RuggedPCReview.com learn more about how 4k video worked, what was possible, and what wasn't.

On the surface, 4k video makes an awful lot of sense. 1080p HD video was just perfect six or seven years ago for the emerging flood of 40-50 inch flatscreens. But flatscreens have grown since then, and 65, 70 and even 80 inches are now the norm. As you can imagine, the same old 1080p video doesn't look nearly as sharp on screens with two, three, or four times the real estate, or more. So doubling the resolution in both directions makes perfect sense.

And it's a great opportunity to infuse new life and excitement into the flatscreen TV market. Three years ago, everyone offered 3D TVs. An amazing number were sold, amazing given that there was virtually no 3D content. And amazing considering one had to wear those annoying 3D glasses. So 3D quietly became just a built-in feature in most new TVs, but it's no longer a selling point. 4k video, on the other hand, IS now a selling point. And it'll become even bigger as time moves on.

The problem, though, is the same as it initially was with HD, and then with 3D: no content. There is no native 4k video standard for storage or playback. There are no 4k players and only a very few recorders, none of which are mature at this point.

So we did some testing to see what's possible, and what's not possible. The goal was to see whether it's actually possible to get 4k video without breaking the bank. So here's what we did, and how it worked out so far.

What can you do today with a 4k display such as our 39-inch Seiki? Well, you can watch regular HD TV on it via your satellite or cable setup. You can hook it up to video game consoles. You can connect it to streaming video gizmos like Apple TV, Google Chromecast, or the various Roku type of devices. Or you can connect a tablet, notebook or desktop to it. Sadly, almost none of these support resolutions higher than 1080p. Which means that on a 4k display you get video that may or may not look better than on a regular 1080p display. May or may not, because some devices do a decent job at "upscaling" the lower res. For the most part, though, displaying 1080p content on a 4k screen is a bit like running early iOS apps in "2X" mode, where each pixel was doubled in each direction. That's not impressive.
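To illustrate just how little that "2X" approach buys you, here's a minimal Python/numpy sketch of nearest-neighbor pixel doubling (the library choice is mine, purely for illustration; it's not what any of these devices actually run):

```python
import numpy as np

def pixel_double(frame: np.ndarray) -> np.ndarray:
    """Naive 2X upscale: repeat every pixel twice along both axes.

    This is the crudest way to put 1080p content on a 4k panel:
    four identical pixels where there was one, and no new detail.
    """
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

# A stand-in 1080p frame (height x width x RGB)
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_4k = pixel_double(frame_1080p)
print(frame_4k.shape)  # (2160, 3840, 3): 4k-sized, but still 1080p detail
```

Better upscalers interpolate between neighboring pixels, which looks smoother, but they can't invent detail that was never in the source.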

What about hooking up a notebook or desktop to the 4k screen? Well, none of the various computers around our offices supported more than 1080p. And the one Windows desktop I use most often for testing is actually a rather old HP with just a 2.2GHz Core 2 Duo processor. That's still good enough to run Windows 8.1 at a decent clip, but the video from the HP's Asus motherboard maxed out at 1680 x 1050 pixels. So it was time to look around for a video card that could actually drive 4k video.

That, my friends, was a sobering experience for me as I realized how little I knew of current video card technology. Sure, we cover whatever Intel bakes into its latest generation of Core processors, and have a degree of familiarity with some of the discrete graphics subsystems available for various rugged notebooks. But beyond that there's an incredibly complex world of dedicated graphics chips, interface standards, different connectors, as well as an endless array of very specialized graphics features and standards they may or may not support.

I am, I must admit, a bit of a gamer and love spending some relaxing hours playing video games on the Sony and Microsoft consoles. A particular favorite of mine is Skyrim, and so I bought a copy for the PC, to see what it would look like on the Seiki 4k screen. Well, initially I couldn't get the game to work at all on the old HP desktop as its motherboard video didn't support one feature or another. Now it was definitely time to invest in a graphics card.

Several hours of Googling and reading up on things yielded only a vague idea on what might be a feasible solution to our video issue. You can, you see, easily pay more for a sophisticated video card than you pay for an entire computer. And some of those cards need two expansion slots, have large fans, and require massive power supplies to even run in a system. That was out. Was it even possible to get a decent video card that would actually drive 4k video AND work in an old PC like our HP?

The next pitfall was that on Amazon and eBay you really never know if something is the latest technology, or old stuff from a few years ago. Vendors happily peddle old stuff at the full old list price and it's all too easy to get sucked in if you are not totally up to speed. So always check the date of the oldest review.

What eventually worked best was checking some of the tech sites for recent video chip introductions. nVidia and AMD have practically the entire market between them and are locked in fierce competition. The actual cards may come from other sources, but they will use nVidia or AMD chips. A bit more research showed that AMD had recently introduced its Radeon R7 chips for reasonably priced graphics cards, that those cards actually appeared to support 4k video, and that they used the PCI Express x16 slot my old desktop had. I truly did not know whether that was the same connector and standard (almost every new Intel chip uses a different socket), but it looked the same, and so I ordered a Gigabyte Radeon R7 250 card with a gigabyte of GDDR5 memory on Amazon, for the amazingly affordable price of US$89 with no-cost 2-day shipping via Amazon Prime.

The card promptly arrived. And it fit into the x16 slot in the old HP. And the HP booted right up, recognized the card, installed the AMD drivers from the supplied DVD, and did nice 1680 x 1050 video via the card's DVI port on the vintage 22-inch HP flatscreen that had come with the PC. Skyrim now ran just fine.

So it was time to see if the Radeon card would play ball with the Seiki 4k screen. Using a standard HDMI cable, I connected the old HP to the Seiki and, bingo, 4k video came right up: 3840 x 2160 pixels. Wow. It worked.
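For what it's worth, a quick way to confirm what resolution Windows is actually driving is to query it programmatically. A minimal Python sketch using the standard Win32 GetSystemMetrics call via ctypes (my tooling choice, not part of the original setup):

```python
import ctypes

user32 = ctypes.windll.user32    # Windows-only: the Win32 user32.dll
user32.SetProcessDPIAware()      # ask for real pixels, not DPI-scaled ones

SM_CXSCREEN, SM_CYSCREEN = 0, 1  # metric indices for the primary display

width = user32.GetSystemMetrics(SM_CXSCREEN)
height = user32.GetSystemMetrics(SM_CYSCREEN)
print(f"Primary display: {width} x {height}")  # expect 3840 x 2160 here
```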

Windows, of course, even Windows 8.1, isn't exactly a champion at adapting to different screen resolutions, and so it took some messing around with control panels and settings to get the fonts and icons looking reasonably good. And while I have been using a 27-inch iMac for years as my main workstation, 39 inches seems weirdly large. You'd watch a TV this size from a good distance, but for PC work you sit up close, and it doesn't feel quite right.

So now that we had an actual 4k video connection to a 4k display, it was time to look for some 4k content. YouTube has some (search "4k video demos"), and so we tried that. The problem there was that running it requires substantial bandwidth, and our setup (a LAN cable into a powerline adapter that runs the signal through the building wiring, and from there to our AT&T broadband) apparently wasn't up to snuff. We saw some impressive demo video, very sharp, but more often than not it stopped or bogged down to very low frame rates. Bandwidth, then, will be an issue.
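The numbers explain why. A rough back-of-the-envelope calculation in Python (the 200:1 compression ratio is an illustrative assumption, not a measured figure):

```python
# Uncompressed 4k at 30 fps, and what's left after typical streaming compression
width, height = 3840, 2160
fps = 30
bits_per_pixel = 24  # 8-bit RGB

raw_bps = width * height * bits_per_pixel * fps
print(f"Uncompressed: {raw_bps / 1e9:.1f} Gbit/s")  # ~6.0 Gbit/s

compression_ratio = 200  # assumed, roughly H.264-class streaming compression
print(f"Compressed:   {raw_bps / compression_ratio / 1e6:.0f} Mbit/s")  # ~30 Mbit/s
```

Even at around 30 Mbit/s, that's several times what a typical 1080p stream uses, and more than a powerline-networked broadband connection can reliably sustain.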

We also perused pictures in full resolution. 4k is over 8 megapixels in camera speak, and so you can view 8mp pictures at full resolution. The result, while quite stunning from a distance, is actually a little disappointing up close. The JPEG compression that is usually hardly noticeable on smaller screens becomes obvious, and then there's the fact that even 4k resolution on a 39-inch screen isn't all that much. It's really just in the pixel density range of an older XGA (1024 x 768) 10-inch tablet, and those have long been left behind by "retina" and other much higher resolution screens.
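That pixel density comparison is easy to verify with the standard diagonal pixels-per-inch formula; a quick Python sketch:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f"{ppi(3840, 2160, 39):.0f} ppi")  # 39-inch 4k Seiki: ~113 ppi
print(f"{ppi(1024, 768, 10):.0f} ppi")   # 10-inch XGA tablet: ~128 ppi
```

So the big 4k screen actually packs slightly fewer pixels per inch than that old XGA tablet.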

Then we cranked up the Skyrim game and ran it full-screen. It looked rather spectacular, though it probably ran in 1080p mode; true 4k would require a good deal more video RAM and higher resolution textures. It did look good, but it also bogged down.

After an hour of playing with the 4k video setup I began feeling a bit nauseous and had to stop. The reason, apart from not being used to viewing such a large screen from so close up, was almost certainly a serious limitation of the 39-inch Seiki: it runs 4k video at a refresh rate of only 30 Hertz. That is very low. Most modern computer displays run at 60 Hertz or more. You don't actually see flickering, but I am convinced the brain has to do some extra work to make sense of such a low refresh rate. And that can make you nauseous.

One of the original triggers of my impulse decision to get the 4k Seiki screen was to watch 4k video from our GoPro Hero3 Black cameras. The GoPro's 4k mode, unfortunately, is really also just a technology demonstration for now, as it records at just 15 frames per second. Normal video viewing is at 30 fps, and games and other video may run at much higher frame rates still. So until we can view 4k video at 30 fps and more, it's just an experiment.

So that's where we stand with 4k video. There's a vast discrepancy between the marketing rhetoric that's pushing 4k now and the fact that there's almost no content. And there are significant technical barriers in terms of frame rates, bandwidth, standards, and existing hardware and software that just can't handle it. It's a bit like Nokia trying to tell people they need a 41mp camera in a phone when the phone itself can display less than a single megapixel, and it would take more than four 4k screens in a matrix to display a single picture in full resolution.

In summary, I did not go into great detail in my investigations, and there will be many out there who have much more detailed knowledge of all those standards and display technologies. But we did take a common sense look at what 4k can and cannot offer today. The following roughly describes the situation:

- 4k displays will soon be common. Dell already offers a range of affordable models.
- 4k video support is rapidly becoming available as well, and 4k video cards start at well under $100.
- Some Intel Haswell chips offer integrated 4k video support.
- There's virtually no 4k content available.
- 4k uses more bandwidth than many current linkups can supply.
- The 4k experience is in its infancy, with insufficient refresh rates and frame rates.
- BUT it's clearly the future.
- 4k rugged displays and signage systems are rapidly becoming available.
- Do spend the time learning what it all means, and how it fits together.

Posted by conradb212 at January 12, 2014 1:02 AM