
December 26, 2013

Does your Pentium have an Atom engine?

There was a time in the distant computing past when the only decision you needed to make when buying a computer was whether to go with the Intel 386/33 or save a few bucks and get the slightly slower 386/25. Today, if you use Intel's handy ARK app that lists every product available from Intel, there are a staggering 1,874 different processors listed. That includes processors targeted at desktop, server, mobile and embedded computers, but even if you leave out servers and desktops, there are still 949 processors to choose from for mobile and embedded applications. Not all of them are state-of-the-art, but even chips designated as "legacy" or "previous generation" are still in widespread use, and available in products still being sold.

The mind-blowing number of Intel processors available raises the question of how many different Intel chips the world really needs. As of the end of 2013, Apple has sold about 700 million iPhones, iPads and iPod touch devices that made do with a mere dozen "A-Series" chips. Not too long ago, tens of millions of netbooks all used the same Intel Atom chip (the N270). So why does Intel make so many different chips? Even though many are based on the same microarchitectures, it can't be simple or cost-efficient to offer THAT wide a product lineup.

On the customer side, this proliferation serves no real purpose. End users make almost all purchasing decisions based on price; figure in desired screen and disk sizes, and whether there's an Atom, Celeron, Pentium, Core i3, Core i5, or Core i7 chip inside is confusing at best. For hardware manufacturers it's worse, as they must deal with very rapid product cycles, with customers both demanding legacy support AND availability of the latest Intel products. THEY must explain why this year's Intel crop is so much better than last year's, which was so much better than what Intel offered the year before. Or which of a dozen roughly identical Intel chips makes the most sense.

As is, Intel has managed to bewilder just about anyone with their baffling proliferation of processors, and without the benefit of having established true brand identities. What Intel might have had in mind was kind of a "good, better, best" thing with their Core i3, i5 and i7 processors, where i3 was bare-bones, i5 added Turbo mode and some goodies, and i7 was top-of-the-line. But that never really worked, and the disastrous idea to then come up with a generation-based system that automatically made last year's "generation" obsolete only adds to the confusion. And let's not even get into Intel "code names."

Atom processors were supposed to provide a less expensive alternative to the increasingly pricey Core chips—increasingly pricey at a time when the overall cost of computers kept dropping. Unfortunately, Intel took the same approach with Atom as Microsoft had taken with Windows CE: keep the line so wingclipped and unattractive that it would not threaten sales of the far more profitable mainline products, Windows proper in Microsoft's case and mainstream Core chips in Intel's. At RuggedPCReview we deeply feel for the many vertical and industrial market hardware and systems manufacturers who drank Intel's Atom Kool-Aid only to see those Atom processors underperform and quickly need replacement with whatever Intel cooked up next.

But that unfortunate duality between attractively priced but mostly inadequate entry-level Atom chips and the much more lucrative mainline Core chips wasn't, and isn't, all. Much to almost everyone's surprise, the Celeron and Pentium brands also continued to be used. Pentium goes back to the early 1990s, when Intel needed a trademarkable term for its new processors, having gotten tired of everyone else also making "386" and "486" processors. "Celeron" came about a few years later, in 1998, when Intel realized it was losing the lower end of the market to generic x86 chipmakers. So along came Celeron, mostly wingclipped versions of Pentiums.

Confusingly, the Celerons and Pentiums continued to hang around when Intel introduced its "Core" processors. The message then became that Pentiums and Celerons were for those who wouldn't spring for a real Core Duo or Core 2 Duo processor. Even more curiously, Pentiums and Celerons still continued when the first generation of the "modern era" Core processors arrived, and then the second, third and fourth generations. Study of spec sheets suggested that some of those Pentiums and Celerons were what one might have called Core i1 and i2 chips, solutions for when costs really needed to be contained to the max. In some cases it seemed that Intel secretly continued its long-standing tradition of simply turning off features that were already part of those dies and chips. A weird outgrowth of that strategy was the last-ditch life-support effort of the dying netbook market to answer the calls for better performance by replacing Atom processors with Celerons that were really slightly throttled Core i3 processors. That actually worked (we have one of those final netbooks in the RuggedPCReview office, an Acer Aspire One 756, and it's a very good performer), but it was too little, too late for netbooks, especially against the incoming tide of tablets.

Given that the choice of mass storage, the quality of drivers, keeping one's computer clean of performance-sapping gunk and, most of all, the speed of one's internet connection (what with all the back and forth with The Cloud) seem to have a far greater impact on perceived system performance than whatever Intel chip sits in the machine, it's more than curious that Celeron and Pentium are not only hanging around, but have even been given yet another lease on life, and this one more confusing than ever.

That's because Intel's latest Atom chips, depending on what they are targeted at, may now also be called Celerons and Pentiums. It's true. "Bay Trail" chips with the new Silvermont microarchitecture will be sold under the Atom, Celeron and Pentium brand names, depending on markets and chip configurations. Pontiac once took a heavy hit when the public discovered that some of their cars had Chevy engines in them. Pontiac is long gone now, and General Motors has ditched other brands as well, realizing that confusing consumers with too many choices made little sense. Even GM, however, never had anywhere near the dominance of its market that Intel has of the processor market.

Where will it all lead? No one knows. Intel still enjoys record profits, and other than the growing competition from ARM there seems little reason to change. On the other hand, if the current product strategy continues, four years from now we may have 8th generation Core processors and 4,000 different Intel chips, which cannot possibly be sustainable. And we really feel for the rugged hardware companies we cover. They are practically forced to use all those chips, even though everyone knows that some are inadequate and most will quickly be replaced.

PS: Interestingly, I do all of my production work at RuggedPCReview.com on a 27-inch Apple iMac powered by a lowly Core 2 Duo processor. It's plenty fast enough to handle heavy workloads and extensive multitasking.

Posted by conradb212 at 6:35 PM

December 13, 2013

Michael Dell's keynote at Dell World 2013: reaching for the cloud

One big problem with being a public company is that every three months it's imperative not to disappoint analysts and investors. Dell won't have to worry about that anymore because it has returned to being a private company. That means Dell can now take the longer view, pursue the bigger picture, and no longer suffer from the affliction of short-term thinking, as Michael Dell so eloquently put it in his keynote address at the 2013 Dell World conference in Austin, Texas.

And that was the core of Michael Dell's message: as a private company, Dell now has the freedom to make the bold moves that are necessary, invest in emerging markets, and take a long-term view of its investments. Dell said he senses a new vibe of excitement at the company that bears his name, and that he feels like he is part of the world's largest startup.

He did, of course, also mention that sales were up in the double digits, that 98% of the Fortune 500 are Dell customers, and that Dell has established a new Research Division as well as the Dell Venture Fund. Dell reminisced about taking an IBM PC apart in his dorm room, finding that many of the components weren't actually made by IBM but were steeply marked-up third-party parts, and feeling that he could make that technology available cheaper and better. He expounded on a recurring theme in his keynote: that the proliferation of computer technology between roughly 1985 and 2005 also saw poverty in the world cut by half.

Dell showed an Apple-esque commercial, slated to run in 2014, that pays homage to all the little companies that got their start with Dell, including the one in dorm room #2714 (Dell itself, founded on a thousand bucks). Nicely done and charming. He spoke of how the future lies in the move beyond products to end-to-end scalable solutions. He spoke of $13 billion invested in research, highlighted the Dell PowerEdge servers that rank among the top two in the world, and demonstrated, right on stage, how integrated server, storage and network functionality with fluid cache technology clocked in at over 5 million IOPS (input/output operations per second).

Dell spoke of their new mantra, to transform, connect, inform, and protect. Transform as in modernize, migrate and transition to the future. Connect as in connecting all the various computing devices out there, including the Dell Wyse virtual clients, because, still and for a good time to come, "the PC, for a lot of organizations, is how business is done." Inform, as in turning data into useful, productivity enhancing results, making companies run better with data analytics. And protect as in offering next gen firewalls and connected security to keep data safe and ward off attacks before they even happen.

Dell also reminded the audience that the company has been the leader in displays for over a decade, and touched on 4k2k video resolution, available from Dell now, as another example of Dell making disruptive technology available at accessible prices.

Dell then introduced Elon Musk, who had arrived in a red Tesla Model S, his company's groundbreaking electric car. Along came David Kirkpatrick, who took on the job of engaging Musk interview-style. Musk, of course, also pioneered PayPal and, in addition to Tesla, runs SpaceX, sending rockets into space. Musk was there, however, not so much to discuss technology as to illustrate the impact of pursuing the big picture, big ideas, things that simply need to get done. As if on cue, Dell rejoined the conversation, congratulating Musk and bemoaning the affliction of short-term thinking that hamstrings progress, big ideas and bold bets.

Musk said this must be the best time in human history, a time when "information equality" breaks down barriers and makes the world a safer, better place, which meshes with Dell's clear belief that technology is a boon for mankind. Today, Musk said, anyone with internet access has more information available than the President of the United States had just 30 short years ago. Musk also expressed regret over a pessimistic media that always seems to seek out the negative.

The address concluded with an observation by Kirkpatrick about Tesla cars' constant connection with the company's computers, and how that back-and-forth of information is used to make the cars better. Just like, Dell said, his company's approach with customers: a constant back and forth, and constant feedback, in the quest to improve and innovate.

This 75 minute keynote was a bold move with a big picture and a big message, with Dell, Musk and Kirkpatrick almost like actors in a play. Quite a task to pull this off, but the message was loud and clear and finely tuned: Dell is now free to pursue big ideas. Like Elon Musk with his electric car and rockets shooting into space, Dell can now reach for the cloud(s), and beyond.

Posted by conradb212 at 2:57 PM

December 5, 2013

Thoughts about display resolutions

The resolution of computer displays is an interesting thing. There are now handhelds with the same number of pixels as large flatscreen TVs, Apple claims its "retina" displays are so sharp that the human eye can no longer see individual pixels, and the very term "high definition" is in the process of being redefined. Let's see what happened with display resolution and where things are headed, both in handhelds and in larger systems, and what 4k2k is all about.

Color graphics on the PC more or less started with the original IBM PC's 320 x 200 pixel CGA resolution. The monochrome Hercules video card of 1982 provided 720 x 348 pixels for IBM compatibles. For a long time IBM's 640 x 480 VGA, introduced in 1987 with the PS/2 computers, was considered a high-resolution standard for the desktop and for notebooks. Then came 800 x 600 pixel SVGA and 1024 x 768 pixel XGA, and those two hung around for a good decade and a half in everything from desktops to notebooks to Tablet PCs. Occasionally there were higher resolutions or displays with aspect ratios different from the 4:3 format that had been used since the first IBM PC, but those often suffered from a lack of driver and software support, and so pretty much everyone stayed with the mainstream formats.

It was really HDTV that drove the next advance in computer displays. During a 1998 factory tour at Sharp in Japan I had my first experience with a wide-format TV. It looked rather odd to me in my hotel room, sort of too wide and not tall enough and not really a TV, but, of course, that turned out to be where things were going. In the US, it would take a few more years until the advent of HDTV brought on the wide format, and terms such as 720p and 1080p entered our tech vocabulary. For a good while, smaller and less expensive flatscreen TVs used the 1280 x 720, or "720p," format, while the larger and higher-end models used full 1920 x 1080 pixel "1080p" resolution. That meant a wide 16:9 aspect ratio.

Desktop and notebook displays quickly followed suit. The venerable 1024 x 768 XGA became 1366 x 768 WXGA, and full 1920 x 1080 pixel displays became fairly common as well, albeit more on the desktop than in mobile systems. Professional desktops such as the 27-inch Apple iMac adopted 2560 x 1440 resolution. On the PC side of things, standards proliferated in various aspect ratios, resulting in unwieldy terminology such as WSXGA+ (1680 x 1050) or WQXGA (2560 x 1600).

An interesting thing happened. Whereas in the past, TVs and computer displays had very different ways to measure resolution, they're now more and more the same, what with flatscreen TVs really being nothing more than very large monitors. The 1920 x 1080 pixel 1080p format, in particular, is everywhere. Amazingly, that's becoming a bit of a problem.

Why? Because as TVs become ever larger, the same old 1080p resolution no longer looks as high-definition as it did on smaller screens. Put a 42-inch and a 70-inch TV next to each other and you can plainly see the degradation in sharpness. The situation isn't as drastic on notebooks because, after growing ever larger for many years, notebook displays have leveled off in size, and have even shrunk again (Apple no longer makes the 17-inch MacBook Pro, for example). Desktop monitors, however, keep getting larger (I use two 27-inch monitors side by side), and that means even "high definition" 1920 x 1080 doesn't look so good anymore, at least not for office-type work with lots of small text. While I was excited to get a reasonably priced HP Pavilion 27-inch 1080p IPS monitor for the PC sitting next to my Mac, I find it almost unusable for detail work because the resolution just isn't high enough to cleanly display small text.
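The effect is easy to put into numbers. Here is a quick back-of-the-envelope sketch (in Python) of the standard pixels-per-inch calculation; the ppi helper is just an illustration, and it assumes square pixels and that the panel fills the quoted diagonal:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density of a display: diagonal pixel count divided by diagonal size in inches."""
    return math.sqrt(width_px ** 2 + height_px ** 2) / diagonal_in

# The same 1920 x 1080 grid spread across different screen sizes:
for name, diag in [("42-inch TV", 42), ("70-inch TV", 70), ("27-inch monitor", 27)]:
    print(f"{name}: {ppi(1920, 1080, diag):.0f} ppi")

# Roughly: 42-inch TV: 52 ppi, 70-inch TV: 31 ppi, 27-inch monitor: 82 ppi
```

At barely over 80 ppi, a 27-inch 1080p monitor viewed at arm's length simply doesn't have enough pixels to render small text smoothly.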

While current resolution standards are running out of steam on larger displays, the situation is quite different in the small screens used on handhelds and smartphones. There we have a somewhat baffling dichotomy where many industrial handhelds still use the same low-res 320 x 240 pixel QVGA format that's been in use since the dawn of (computer) time, whereas the latest smartphones have long since moved to 1280 x 800 and even full 1920 x 1080 resolution. Tablets, likewise, pack a lot of pixels onto the rather small 7-inch and 10-inch formats that make up the great majority of the tablet market. Apple led the way with the "retina" 2048 x 1536 pixel resolution on the 3rd generation iPad. That's like a 2x2 matrix of 1024 x 768 pixel XGA displays all in one small 9.7-inch screen. Trumping even that, the latest Kindle Fire HDX tablet packs an astounding 2560 x 1600 pixels onto its 8.9-inch screen. So for now, smartphones and tablets are at the front of the high-resolution revolution.

Somehow, we quickly get used to higher resolution. Older displays that we remember looking great now look coarse and pixelated. With technology you can never go back; the state-of-the-art almost instantly becomes the acceptable minimum. Whereas our eyes used to expect a degree of blurriness and the ability to see individual pixels on a screen, that's less and less acceptable as time goes on. And it really does not make much sense to declare 1080p as "high definition" when by now that resolution is used on anything between a smartphone and an 80-inch TV.

Fortunately, the next thing is on the horizon for TVs and monitors, and it's called 4k2k, which stands for roughly 4,000 x 2,000 pixels. 2160p would be a better name for this likely standard, as it is simply a 2x2 matrix of four current 1080p displays, or 3,840 x 2,160 pixels. That still means that a giant new 80-inch screen will have no more than the pixel density of a 40-inch 1080p display, but it's certainly a logical next step.
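The 2x2 relationship is easy to verify with the same sort of back-of-the-envelope math as above; again, the tiny ppi helper is only an illustration that assumes square pixels and the quoted diagonal:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    # Diagonal pixel count divided by diagonal size in inches
    return math.sqrt(width_px ** 2 + height_px ** 2) / diagonal_in

# A 3840 x 2160 panel carries exactly four times the pixels of a 1920 x 1080 panel...
print((3840 * 2160) // (1920 * 1080))   # 4
# ...so doubling the diagonal brings the density right back to where 1080p started:
print(round(ppi(3840, 2160, 80)))       # ~55 ppi on an 80-inch 4k screen
print(round(ppi(1920, 1080, 40)))       # ~55 ppi on a 40-inch 1080p screen
```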

I had all of this on my mind when I received an email offer from one of my favorite electronics places. It was for a 39-inch Seiki TV/monitor with 4k resolution for a very attractive price, with free shipping. I impulse-ordered it on the spot, telling myself that I needed to know where 4k technology stands and what, at this point, it can and cannot do. And this would finally be a monitor where I could watch the 4k video my GoPro 3 Black Edition can produce.

So I got the Seiki, and it's a great deal and a bargain. Or it would be if I actually had anything that could drive a 4k display in its native mode, which I don't. In fact, at this point there is virtually nothing that can drive a 4k display in its full 3840 x 2160 pixel resolution. Yes, the 4k videos from my GoPro 3 Black Edition would probably look great on it, but that would require me to copy the video footage to a PC that can drive an external 4k monitor, which virtually no stock PC can do today. DVD or Blu-ray players certainly can't display in 4k, and even brand-new gear like the Sony PS4 game console can't. I COULD, of course, get a low-end 4k-capable video card from AMD, but I am not sure any of the PCs in the RuggedPCReview office could actually even accommodate such a card.

The unfortunate truth is that as of late 2013, there's very little gear that can send a true 4k video signal to a 4k TV or monitor. Which means that most content will be viewed in up-sampled mode, which may or may not look great. This will undoubtedly become a marketing issue in the consumer space—there will be great interest and great expectations in 4k TVs, but just as was the case with 3D TVs a couple of years ago, there will be virtually no 4k sources and content. And that can make for a customer backlash. There is some very detailed information on Amazon (see here) that provides an idea of where things stand.

What does all that mean for rugged mobile technology? Not all that much for now, but I am certain that the ready availability of super-high resolution on smartphones and consumer tablets will change customer expectations for rugged device displays just as capacitive touch changed touch screen expectations. Once the (technology) cat's out of the bag, that's it. It won't go back in.

And just as I finished this entry, I see that Dell announced 24-inch and 32-inch UltraSharp monitors with 4k 3840 x 2160 resolution, and a 28-inch version will soon follow (see Dell news). Given that Dell is the leading flat-panel vendor in the US and #2 in the world, that likely means that we'll soon see a lot more systems capable of supporting 4k resolution.

Posted by conradb212 at 4:53 PM